Amazon – Alexa me too

Alexa is the obvious place for generative AI at Amazon

  • Amazon is jumping on the generative AI bandwagon with the intention of using it to power searches on its site, but it seems to be ignoring, or is ignorant of, some fundamental issues that will make this very difficult.
  • Amazon is planning to replace the product search function on its site with a generative chatbot.
  • At a high level, this looks like a good idea, as Amazon’s search is not renowned for its excellence and could certainly do with an upgrade, but I think that Alexa is the place where generative AI would make the most sense, despite the cost.
  • I have long held the opinion that Amazon is not very good at AI despite a general market opinion to the contrary.
  • Its weak search function, substandard Alexa performance, poor anti-counterfeiting record and its habit of constantly advertising products to me that I have already bought on its site are evidence for this view.
  • All of the current chatbot devices that litter our kitchens, offices and living rooms predate the invention of the transformer, which RFM research (see here) has concluded had a big hand in the recent improvement in the language performance of artificial systems.
  • This means that they can switch on lights, play music and set timers but struggle with anything that is not explained to them in the correct way using the correct words.
  • The devices are in our homes, but the algorithms have been running in the cloud, which makes them very simple to update and improve.
  • Hence, I see no reason why Amazon could not simply replace the Alexa entity that exists today with a large language model (LLM) and have users interact with it using the existing hardware.
  • I suspect that Amazon is afraid of two things: the new Alexa going crazy, as all LLMs have a strong tendency to do, and the cost of supporting the service.
  • All systems that are based on deep learning are only as good as the dataset with which they have been trained, and when they come across something they have not seen before, they fail.
  • I am pretty sure that replacing Alexa with an LLM would result in competition to make it say something crazy with the results being posted on social media.
  • The cost is also likely to be an issue even for a company of Amazon’s size.
  • RFM research (see here) has concluded that LLMs are expensive to support and that, for services with millions of users, the cost of inference dwarfs the cost of training.
  • Charging for such a service could make it very profitable in its own right, but Alexa is currently free, meaning that Amazon would have to absorb all of that extra cost itself at a time when it is trying to save money.
  • This is why I think that Amazon is dipping its toe into the unknown waters of generative AI with a search function on its site rather than the much more obvious Alexa.
  • The problem with using generative AI for search is that the dataset is constantly changing as new things happen or new products are added, which is precisely what systems based on deep learning cannot deal with.
  • This is why ChatGPT is frozen in time, and why Bing and Bard also use generative AI only for the conversational part, with the search itself being carried out using traditional methods.
  • Hence, I think Amazon will have difficulty getting this to work in practice and I expect that it might be a very long time before something appears live.
  • This is also the reason why I do not see any imminent threat to Google Search, and should the stock have a large lurch downward as everyone panics, then it would be a good opportunity.
  • However, with valuations where they are today neither Google nor Amazon represent an opportunity so I am staying clear.
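The inference-versus-training point above is easy to see with some back-of-envelope arithmetic. The figures below are purely illustrative assumptions of mine, not Amazon or RFM numbers; the point is only that a one-off training cost is quickly dwarfed by a recurring per-query cost multiplied across a consumer-scale user base.

```python
# Back-of-envelope comparison of one-off training cost vs ongoing
# inference cost for a free, consumer-scale LLM assistant.
# ALL numbers below are illustrative assumptions, not real figures.

TRAINING_COST_USD = 10_000_000    # assumed one-off cost to train the model
COST_PER_QUERY_USD = 0.002        # assumed compute cost per inference query
USERS = 100_000_000               # assumed active users of the service
QUERIES_PER_USER_PER_DAY = 5      # assumed daily usage per user

# Recurring inference cost scales with users and usage, not with the model.
daily_inference_cost = USERS * QUERIES_PER_USER_PER_DAY * COST_PER_QUERY_USD
annual_inference_cost = daily_inference_cost * 365

print(f"Daily inference cost:  ${daily_inference_cost:,.0f}")
print(f"Annual inference cost: ${annual_inference_cost:,.0f}")
print(f"Annual inference vs training: {annual_inference_cost / TRAINING_COST_USD:.1f}x")
```

Under these assumptions the annual inference bill is tens of times the training bill, and with Alexa free to use, every query is pure cost to Amazon.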
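The Bing/Bard architecture described above, where the retrieval step stays traditional and the language model only handles the conversational part, can be sketched very simply. Everything here is hypothetical (the catalogue, the function names, the string-template "LLM" stand-in); the point is the division of labour: the index can change every minute without the model ever needing retraining.

```python
# Minimal sketch of "traditional search + generative conversational layer".
# The catalogue and all names here are hypothetical illustrations.

CATALOGUE = {  # stands in for a live, constantly updated product index
    "usb-c cable 2m": 7.99,
    "usb-c charger 30w": 19.99,
    "hdmi cable 1m": 5.49,
}

def traditional_search(query: str) -> list[tuple[str, float]]:
    """Plain keyword match against the live index -- no ML involved,
    so newly added products are searchable immediately."""
    terms = query.lower().split()
    return [(name, price) for name, price in CATALOGUE.items()
            if all(t in name for t in terms)]

def conversational_layer(query: str, hits: list[tuple[str, float]]) -> str:
    """Stand-in for the LLM: it only phrases the results it is handed,
    so it never needs to 'know' the catalogue itself."""
    if not hits:
        return f"Sorry, nothing matched '{query}'."
    listing = ", ".join(f"{name} (${price:.2f})" for name, price in hits)
    return f"For '{query}' I found: {listing}."

print(conversational_layer("usb-c", traditional_search("usb-c")))
```

The frozen-dataset problem only bites if the model itself is asked to do the retrieval; keeping retrieval outside the model is how Bing and Bard sidestep it, and it is what Amazon would also have to build.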

RICHARD WINDSOR

Richard is the founder and owner of the research company Radio Free Mobile. He has 16 years of experience working in sell-side equity research. During his 11-year tenure at Nomura Securities, he focused on the equity coverage of the Global Technology sector.