Google I/O – Fightback Begins

Google moves to defend its turf.

  • Although Google has yet to see any impact on its Search business from OpenAI, Perplexity and the like, it is moving to future-proof its legacy business, and compared to Nokia in 2007, it has a much better chance of success.
  • As expected, Google I/O is all about Gemini, which beat “AI” as the most-used word during the keynote and will be the foundation of all of Google’s businesses going forward.
  • The problem for Google is that generative AI bots are better at complex search queries than regular search algorithms, and as Google has observed, the complexity of search queries is rapidly beginning to increase as users realise what these bots are capable of.
  • This creates a huge problem for Google’s $200bn search business as foundation models are all roughly the same these days, meaning that the barriers to entry for complex search have evaporated.
  • Google must stay ahead of the pack at all costs, and with the assets it has and the ecosystem it has built, it has every chance of doing so, even if it does make horrible messes of its marketing.
  • This is why the headline launch at Google I/O this year is “AI Mode”, which is a new feature within Search that it hopes will see off the insurgents.
  • AI Mode is an end-to-end generative AI search experience that takes the AI Overviews feature many users already see and raises it to a level on par with ChatGPT & Co.
  • It works by breaking the user’s query down into a large number of separate search queries, which it then executes, assembling an answer from the combined results.
  • Google calls this “query fanout”, a clever way to lean on what it does best while presenting the output with a level of detail similar to a regular LLM response (see the sketch after this list).
  • It is also a good way of keeping costs under control, as regular search queries are far cheaper to run than LLM inference, and so the more it can do via search, the better.
  • Given that Google is much better at search (especially the long tail) than anyone else, this should give AI Mode an edge over the competition, but as always, the proof will be in the pudding.
  • The idea is that if the results are good enough, users will stop jumping ship to ask complicated questions elsewhere and will continue to spend their digital time with Google.
  • My own usage already shows signs of this: complicated requests naturally go to ChatGPT, while finding out where to buy something or looking up a simple fact stays with Google.
  • If all of this were in one place, I would probably stay where I am, but for the moment, the competition is better than Google at the complicated requests.
  • This is what AI Mode is designed to prevent, but it has a couple of challenges out of the box.
    • First, cost: LLM-based requests are far more expensive to execute than regular search requests.
    • By way of example, Google claims that it is processing 480tn tokens a month today, up roughly 50x from the 9.7tn tokens it processed 12 months ago (see the worked numbers after this list).
    • If we assume that efficiencies have improved tokens per dollar by 5x over the last 12 months, then Google’s compute cost is still up a staggering 10x.
    • Compare this to the 15% YoY increase in revenues that the company experienced in Q1 2025, and the problem immediately becomes obvious.
    • This is why Google is not putting everything into AI Mode immediately but will keep it as an option that the user selects when he or she has an inquiry of this nature.
    • This explains why OpenAI is burning billions of dollars every year to support its free users, which hands Google a large economic advantage.
    • It also highlights the monetisation problem that AI Mode presents, although the “query fanout” method, which maximises the use of the existing search algorithm, has clearly been designed to manage this problem.
    • Second, monetisation: both the method and the degree of monetisation remain uncertain.
    • If all of search were to move to AI Mode, Google would have to raise its pricing by several orders of magnitude to prevent a fall in profitability, as the same number of searches would require far more compute to answer.
    • Furthermore, how a single answer can be monetised, as opposed to a series of sponsored links, remains very unclear, and Google would not be drawn on how it intends to achieve this.
    • Consequently, Google has to find a way for advertisers to be present in generative answers to queries and also charge them much more for being there in the first place.
    • This is a very difficult problem to solve, and I suspect it will require a lot of trial and error to get right.
  • The net result is that if Google were to suddenly take all of OpenAI’s traffic, this would cause a profitability problem due to the large increase in compute cost that would be required to handle all of these requests.
  • This is why AI Mode is staying as a manually selectable feature in search and not the default.
  • I suspect that in time, Google will be able to work out which request needs to go where, and the mode selection will become automatic.
  • However, the real question is whether this is going to be enough to stop the loss of search queries to OpenAI & Co., which has to be the immediate priority.
  • Despite Google’s tendency to shoot itself in the foot, I still think that its AI products are world-class.
  • This means that it should be able to compete effectively with OpenAI & Co. and maintain its grip on the search market even as it evolves.
  • This was not the case in 2007, when Nokia had no good answer to the iPhone or Android and as a result lost all of its market share to the current dominant digital ecosystems.
  • Google is about to go through a tricky time as doubters call into question its search business, and there is a possibility that its shares get hit hard if the numbers wobble before getting back on track.
  • This would represent a great opportunity in Google for anyone brave enough to take the other side of the permanently skittish market, which still has the ability to remain irrational for longer than the average investor can remain solvent.
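Google has not disclosed how query fanout is implemented, but the pattern described above maps onto a familiar retrieval pipeline: one model call to decompose the question, many cheap conventional searches, and one model call to write up the answer. The minimal Python sketch below illustrates that pattern only; every function name and data source in it is a hypothetical stub, not Google’s actual implementation.

```python
# Hypothetical sketch of a "query fanout" pipeline: decompose a complex
# question into simple searches, run them against a conventional (cheap)
# search index, and use the LLM only to decompose and to synthesise.
# None of these components are Google's; they are illustrative stubs.

from typing import Dict, List


def decompose_query(question: str) -> List[str]:
    """Stand-in for an LLM call that splits a complex question into
    simple, independently searchable sub-queries."""
    # A real system would prompt a model here; we hard-code an example.
    return [
        "best lightweight hiking boots 2025",
        "waterproof hiking boots reviews",
        "hiking boot prices under $150",
    ]


def run_search(sub_query: str) -> List[Dict[str, str]]:
    """Stand-in for the conventional search index, which is far cheaper
    per request than an LLM forward pass."""
    return [{"title": f"Result for '{sub_query}'", "snippet": "..."}]


def synthesise(question: str, results: List[List[Dict[str, str]]]) -> str:
    """Stand-in for the final LLM call that writes one answer from the
    gathered snippets."""
    n = sum(len(r) for r in results)
    return f"Answer to '{question}' composed from {n} retrieved snippets."


def ai_mode(question: str) -> str:
    sub_queries = decompose_query(question)          # one small LLM call
    results = [run_search(q) for q in sub_queries]   # many cheap searches
    return synthesise(question, results)             # one LLM call to write up


if __name__ == "__main__":
    print(ai_mode("Which hiking boots should I buy for a wet autumn trip?"))
```

The cost argument is visible in the structure: the expensive model runs only twice per question, while the fanout itself runs on the existing, much cheaper search index.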
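The compute-cost arithmetic quoted above is easy to check; note that the 5x tokens-per-dollar improvement is the assumption stated in the bullet, not a figure Google has disclosed.

```python
# Back-of-the-envelope check of the token figures quoted above.
# The 5x tokens-per-dollar efficiency gain is an assumption, not a disclosed figure.

tokens_now = 480e12        # ~480tn tokens processed per month today (Google's claim)
tokens_year_ago = 9.7e12   # ~9.7tn tokens per month 12 months ago (Google's claim)
efficiency_gain = 5        # assumed improvement in tokens per dollar over the year

volume_growth = tokens_now / tokens_year_ago     # ~49x, i.e. roughly 50x
cost_growth = volume_growth / efficiency_gain    # ~10x increase in compute cost

print(f"Token volume growth: ~{volume_growth:.0f}x")
print(f"Implied compute-cost growth: ~{cost_growth:.0f}x")
```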

RICHARD WINDSOR

Richard is the founder and owner of the research company Radio Free Mobile. He has 16 years of experience in sell-side equity research. During his 11-year tenure at Nomura Securities, he focused on equity coverage of the global technology sector.
