OpenAI & Microsoft – Ecosystem game

OpenAI makes its play for the AI ecosystem

  • OpenAI is seeking to capitalise on its early lead in AI by turning a chatbot into a digital environment where developers go to build AI models and services that they can then sell to their customers through OpenAI’s store.
  • This has all the hallmarks of the moves made 10 years ago by Apple, Google, Tencent and Alibaba to capture the digital ecosystem.
  • OpenAI held a developer event on 6th November 2023 where it launched a platform for creating custom generative AI models based on its foundation models.
  • These “AI apps” can be created with no coding, and the offering looks very much like the software development kits (SDKs) that Apple, Google, Meta and many others make available for software developers to create the apps and services that they sell within the ecosystem.
  • Third-party apps are one of the foundations of a successful digital ecosystem and play a very significant role in ensuring that users stay loyal to one platform or another.
  • All of the digital ecosystems have used this consumer preference to generate hundreds of billions of dollars in economic profit, which is another indication of just how important and valuable developers can be.
  • Although the AI ecosystem is at a very early stage of development, two control points in the value chain are already emerging.
    • First, the AI training platform: this remains utterly dominated by Nvidia, which has better than 85% market share.
    • The CUDA platform has been in existence since 2007 and, as a result, Nvidia has roughly a 10-year lead over everyone else when it comes to creating tools for training AI algorithms.
    • This means that its tools are the best and the most wide-ranging, and everyone already knows how to use them.
    • Consequently, everyone wants to train their AIs using CUDA, and so far no one else has been able to make a dent.
    • Second, the foundation model: here the two leaders are Meta Platforms and OpenAI.
    • When a service like ChatGPT is created, there are two stages of training.
    • The first is the building and training of a foundation model that is not particularly good at anything.
    • These foundation models are then fine-tuned or subjected to reinforcement learning, which allows the creation of specific services such as ChatGPT or DALL-E.
    • Foundation models are difficult and expensive to make, and they are also almost impossible to swap out if one decides that one wants to use something else.
    • In effect, the creator of the service would need to redo the fine-tuning from scratch in order to switch from one foundation model to another (see the sketch after this list).
    • This is why OpenAI is making a play for the ecosystem, and it is using GPT-3, GPT-3.5 and GPT-4 as its control points.
    • If developers start building on GPT foundation models as OpenAI suggests, it will be very difficult for them to switch to something else, which I think is precisely what OpenAI and Microsoft have in mind.
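To make the switching cost concrete, below is a minimal sketch of what building a service on a GPT foundation model looks like, assuming the OpenAI Python SDK (v1.x) and a hypothetical JSONL file of training examples; the fine-tuned model that comes out the other end only exists inside OpenAI’s ecosystem, so moving to a different foundation model means repeating the whole process elsewhere.

```python
# Minimal sketch, assuming the OpenAI Python SDK v1.x, an OPENAI_API_KEY in the
# environment, and a hypothetical "support_examples.jsonl" file of example dialogues.
from openai import OpenAI

client = OpenAI()

# 1. Upload the service-specific training examples.
training_file = client.files.create(
    file=open("support_examples.jsonl", "rb"),  # hypothetical training data
    purpose="fine-tune",
)

# 2. Fine-tune a GPT foundation model on those examples.
job = client.fine_tuning.jobs.create(
    training_file=training_file.id,
    model="gpt-3.5-turbo",  # the foundation model the service is now tied to
)
print("Fine-tuning job started:", job.id)

# The resulting model ID (an "ft:gpt-3.5-turbo:..." identifier) is only usable
# inside OpenAI's ecosystem: switching to another foundation model (e.g. Llama 2)
# would mean redoing data preparation, fine-tuning and evaluation from scratch.
```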
  • OpenAI is also providing a store, called the GPT Store, through which users can access the GPTs created by developers in a faithful reproduction of the hugely successful app store model on smartphones.
  • The trick here is to become the go-to place for developers to create their GPTs and for buyers to find the best generative AI services available.
  • This is the classic network effect and is how I think OpenAI (doubtless guided by Microsoft) intends to become the dominant AI ecosystem.
  • With 100m users and global brand recognition, it has a huge head start, but history has shown that the first mover is not always the eventual winner.
  • I suspect that the timing of these announcements also has a lot to do with Meta Platforms, whose Llama 2 foundation model is fully available to anyone in the open-source community.
  • The result of this is that Llama 2 has become the standard for open source, which Meta could leverage to become an AI ecosystem in its own right.
  • This is precisely what Meta is doing with the metaverse, as the fact that it does not control the platforms upon which its current revenue-generating assets depend has caused it a lot of problems and lost revenue.
  • Meta is determined that this is not going to happen again, which is why I think it deliberately leaked Llama to the open-source community, and this announcement is OpenAI’s response, aimed at preventing Llama from becoming the dominant foundation model.
  • OpenAI’s move also threatens to change the control point in the AI ecosystem, which is currently dominated by Nvidia.
  • OpenAI has abstracted away the technical requirements of developing generative AI, meaning that almost anyone can create a model or service based on GPT foundation models (see the sketch at the end of this piece).
  • In the long term, it is in OpenAI’s interest not to be dependent on Nvidia but to ensure that its models run just as well on the other options available in the market.
  • If OpenAI is the go-to ecosystem for generative AI, then this is how Nvidia’s grip on AI could be weakened.
  • Nvidia remains priced for perfection, but while it continues to blow expectations out of the water, its valuation is likely to hold.
  • I continue to think that there are much better places to look in the technology sector.
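As an illustration of how far the technical requirements have been abstracted away, the sketch below, assuming the OpenAI Python SDK v1.x and a hypothetical system prompt, is roughly all the code a developer needs to turn a GPT foundation model into a sellable service: no GPUs, no training pipeline and no CUDA knowledge required.

```python
# Minimal sketch, assuming the OpenAI Python SDK v1.x with OPENAI_API_KEY set.
from openai import OpenAI

client = OpenAI()

def travel_planner(question: str) -> str:
    """A hypothetical 'GPT'-style service built purely by prompting a foundation model."""
    response = client.chat.completions.create(
        model="gpt-4",
        messages=[
            # The system prompt is essentially the only "development" work involved.
            {"role": "system", "content": "You are a concise travel-planning assistant."},
            {"role": "user", "content": question},
        ],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    print(travel_planner("Plan a two-day trip to Lisbon."))
```

The developer’s effort goes into the prompt and the data rather than the model itself, which is exactly the dynamic that ties the resulting service to OpenAI’s foundation models.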

RICHARD WINDSOR

Richard is the founder and owner of the research company Radio Free Mobile. He has 16 years of experience working in sell-side equity research. During his 11-year tenure at Nomura Securities, he focused on equity coverage of the global technology sector.