OpenAI – The Stranglehold

All about Nvidia

  • Sam Altman is once again capitalising on the intense popularity of generative AI by looking to raise money for a new chip venture, which can only be aimed at breaking Nvidia’s current stranglehold on the AI ecosystem.
  • It looks like the discussion revolves around the creation of a fabless chip company that would design the AI chips and then send them to TSMC for manufacturing.
  • The fact that only TSMC seems to be in these discussions implies that whatever chips the new venture comes out with will be based on a leading-edge process in order to compete head-to-head with Nvidia.
  • This is the second leg of OpenAI’s verticalisation strategy which, combined with its launch of a GPT SDK and an app store where developers can sell their GPT-based services, aims to take control of the AI ecosystem.
  • If OpenAI succeeds in ensuring that GPT becomes the go-to foundation model upon which to base one’s generative AI services, then developers will care less about the silicon platform on which their services are trained.
  • This effect will be amplified if there is a chip or series of chips optimised for GPT that can train GPT-based services at least as well as Nvidia’s, if not better.
  • Nvidia’s chips are extremely good, but they are not what makes the offering compelling to AI developers.
  • Instead, it is the CUDA platform, which Nvidia has been developing for general-purpose computing on GPUs since 2007 and for AI specifically for over a decade.
  • This means that CUDA has the best range of tools, the best support and is by far the most mature, which is what gives Nvidia its edge.
  • The speed with which Nvidia releases its products is also a major competitive advantage as its competitors are struggling to keep pace with it.
  • This is why Nvidia currently has an 85%+ market share in the AI training market and can earn 70%+ gross margins on the chips that it ships.
  • It is this stranglehold that everyone is trying to break, which explains why there are so many start-ups in this field and why all of the major cloud providers are also working on silicon of their own.
  • The problem that they have is that as long as developers continue to demand Nvidia, the CEOs of the large cloud providers will have to keep inviting Jensen Huang to their corporate events to showcase how his offering is better than their in-house efforts.
  • This is the lock-in that OpenAI is trying to break, but it also wants to lock up developers itself, meaning that from a developer’s perspective not very much would change.
  • However, OpenAI has spectacularly exposed its own weakness: substantial risks remain in its corporate governance structure, which will greatly reduce the confidence of developers looking to base their services on GPT.
  • This gives competitors a large opening, and I am sure that Meta, with its position in the open-source community, and Google will both be releasing development kits and toolsets of their own in an attempt to lure developers onto their AI platforms.
  • These are much more attractive propositions than they would have been a few months ago as their governance structures are much less flawed than OpenAI’s.
  • Hence, as a developer, I would have far more confidence that Google and Meta will not blow up in the way that OpenAI did, although they still make plenty of silly mistakes just like everyone else.
  • 2023 was the year of training, but I think this will begin to give way to inference in 2024 as algorithms begin to be deployed and the battle for the AI ecosystem heats up.

RICHARD WINDSOR

Richard is the founder and owner of the research company Radio Free Mobile. He has 16 years of experience working in sell-side equity research. During his 11-year tenure at Nomura Securities, he focused on equity coverage of the Global Technology sector.