Microsoft – Advertisements for all

Turns out this stuff is expensive.

  • Microsoft is moving to add advertising to its GPT-4-powered Bing chat service, a clear indication that large language models (LLMs) are very expensive to create and run, which I think is a positive signal for Google.
  • Bing has certainly seen an increase in usage, as it now lays claim to 100m daily users, most of whom are playing around with its GPT-4-powered chat feature rather than switching away from Google when they want to find something.
  • OpenAI has made some pretty bold claims about the capabilities of GPT-4, but Microsoft has largely lobotomized these functions following the blowback it received when Sydney went sociopathic.
  • Its chat service is now pretty much GPT-4 to understand the request, Bing to search the Internet for an answer, and then GPT-4 to formulate that answer into a response together with the links that the Bing search unearths; a rough sketch of this flow is shown after this list.
  • This would be great if Bing search were not so average in its capabilities; in fact, in many of the searches that I have tried, Google does a better job of getting me the answers that I am looking for.
  • This is because deep learning models cannot easily cope with changes to the dataset they were trained on, and the dataset that is the Internet changes constantly, which is why ChatGPT and GPT-4 are frozen in time at their training cut-off so that they can function properly.
  • Microsoft is using GPT-4 to formulate the answers, not to do the search, which is why the service is able to function, but Google is still doing a better job.
  • This is because Google remains much better at covering the long tail of search, as well as at understanding exactly what is being searched for even when the user does not articulate it very well.
  • However, it appears that even using GPT-4 to turn search results into the kind of answer that a human would give is pretty expensive to provide, which is why Microsoft is looking to add advertising to the service.
  • This is one of the big problems with the way the applications of LLMs are being discussed: everyone is excitedly asking what they can do with them, but nobody is asking how much it will cost.
  • Nvidia’s new DGX server box has 8 H100 or A100 chips and 640GB of GPU memory, which is probably just about enough to run an instance of ChatGPT, and costs $37,000 per month to rent.
  • GPT-3 needs around 800GB of memory while GPT-4 is probably at least 3-4x that, which means that the short answer is that one instance of GPT is going to cost at least $500,000 per year to run and could easily cost much more if it gets high usage; the back-of-the-envelope arithmetic is sketched after this list.
  • Furthermore, to support 100m users, I suspect that Microsoft is running many copies of GPT-4, meaning that this quickly becomes a significant cash drain without a monetization mechanism.
  • I suspect that the bills are already pretty big, which, combined with Microsoft’s need to cut costs in a weak economy, is why it has moved to implement advertising so quickly.
  • This will be implemented using the reference system that Microsoft has created with Bing chat.
  • The results of the search are written up as a human might answer the question, but the sources are then highlighted as links below the text.
  • Some of these could easily be sponsored links to relevant products, and so the monetization system remains pretty similar to the way that search is monetized today.
  • If Bard usage also starts to draw traffic away from Google search (and in my testing there are some use cases where it is better than a Google search), then I expect that Google will quickly implement the same.
  • However, the economics of using LLMs for search are not clear-cut, as for many types of search they do not provide optimal results.
  • Hence, I continue to think that the threat to Google’s search business is overdone, and I could get interested in the shares if the market marks Google down significantly.
  • I remain pretty ambivalent about Microsoft’s valuation and would have sold it ages ago had I owned it at the time.
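
As promised above, here is a minimal sketch of the retrieve-then-generate flow described in the Bing chat bullet, assuming a generic `llm` completion callable and a `search_engine` callable; the function names and prompts are hypothetical placeholders, not Microsoft's actual implementation.

```python
# Hypothetical sketch of the Bing chat flow described above:
# an LLM interprets the request, a conventional search engine
# fetches results, and the LLM writes the answer with citations.
# All function names and prompts are illustrative placeholders.

def interpret_request(llm, user_question: str) -> str:
    """Ask the LLM to rewrite the user's question as a web search query."""
    return llm(f"Rewrite this as a concise web search query: {user_question}")

def search_web(search_engine, query: str, top_k: int = 5) -> list[dict]:
    """Run a conventional index search; each result has title, url, snippet."""
    return search_engine(query)[:top_k]

def compose_answer(llm, user_question: str, results: list[dict]) -> str:
    """Ask the LLM to write a human-style answer grounded in the results,
    citing each source so the links (or sponsored links) can be shown below."""
    context = "\n".join(f"[{i + 1}] {r['title']} ({r['url']}): {r['snippet']}"
                        for i, r in enumerate(results))
    prompt = (f"Answer the question using only these sources and cite them "
              f"by number.\nQuestion: {user_question}\nSources:\n{context}")
    return llm(prompt)

def bing_chat_style_answer(llm, search_engine, user_question: str) -> str:
    # GPT-4 understands the request, Bing searches, GPT-4 writes the answer.
    query = interpret_request(llm, user_question)
    results = search_web(search_engine, query)
    return compose_answer(llm, user_question, results)
```

The point of this structure is that the LLM never searches the live Internet itself; the quality of the final answer is still capped by the quality of the underlying search index, which is where Google retains its advantage.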
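To make the cost claim concrete, here is the back-of-the-envelope arithmetic implied by the figures quoted above; the rental price, memory figures and GPT-4 multiple are the estimates from the bullets, not confirmed numbers.

```python
import math

# Back-of-the-envelope cost arithmetic using the figures quoted above.
# These are rough estimates, not confirmed pricing or model sizes.
DGX_RENT_PER_MONTH = 37_000      # USD per month for one 8-GPU DGX box
DGX_MEMORY_GB = 640              # 8 GPUs x 80GB each
GPT3_MEMORY_GB = 800             # estimate quoted above
GPT4_MULTIPLE = 3                # "probably at least 3-4x that"

dgx_per_year = DGX_RENT_PER_MONTH * 12                                      # ~$444,000
boxes_for_gpt3 = math.ceil(GPT3_MEMORY_GB / DGX_MEMORY_GB)                  # 2 boxes
boxes_for_gpt4 = math.ceil(GPT3_MEMORY_GB * GPT4_MULTIPLE / DGX_MEMORY_GB)  # 4 boxes

print(f"One DGX box per year:      ${dgx_per_year:,.0f}")
print(f"One GPT-3-sized instance:  ${boxes_for_gpt3 * dgx_per_year:,.0f}")
print(f"One GPT-4-sized instance:  ${boxes_for_gpt4 * dgx_per_year:,.0f}")
```

On these assumptions, even a single box comes to roughly $0.44m per year, a GPT-3-sized instance to around $0.9m, and a GPT-4-sized instance to well over $1.5m, before the many parallel instances needed to serve 100m users are factored in.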

RICHARD WINDSOR

Richard is the founder and owner of the research company Radio Free Mobile. He has 16 years of experience working in sell-side equity research. During his 11-year tenure at Nomura Securities, he focused on equity coverage of the Global Technology sector.