Tesla / Mobileye – Double vision

Robotaxis will not be appearing anytime soon.

  • McAfee has exposed a corner case that triggered dangerous behaviour by a Tesla vehicle in another example of how deep learning is not well suited to the chaotic road environment.
  • The result is that robotaxis remain as far away as ever and that investors should be wary of pricing robotaxis into Tesla’s valuation.
  • McAfee researchers (see here) stuck a piece of tape onto the 3 of a 35mph sign that elongated the middle stroke of the 3.
  • To any human, this clearly still reads 35, but the Mobileye EyeQ3 machine vision system read it as 85, leading the Tesla vehicle to accelerate to 50mph over the speed limit.
  • This example once again exposes the core weakness of using deep learning systems to teach machines to interpret reality.
  • These systems have no causal understanding of what they are doing and, as a result, they cannot adjust to inputs that deviate from their training data, which leads to these sorts of failures (a toy illustration of this fragility is sketched after this list).
  • A Mobileye spokesman suggested that the modified sign would also fool a human but this looks like a very poor excuse to me.
  • A human seeing that sign would either read it as 35, as I (and almost everyone else) did, or question whether the sign was correct, given that it was in a built-up area where the limit is always much lower.
  • This is the critical piece that deep learning systems lack.
  • They are unable to take what they have learned in one scenario and apply it to another.
  • Until this problem is either solved or mitigated by using other techniques, autonomous driving is going to remain in the realm of hype.
  • To be completely fair to Mobileye, I don’t think that its autonomous driving system would have fallen for this because it uses a combination of machine vision, lidar, HD maps and software to cross-check and remove errors such as these (a simple version of this kind of cross-check is sketched after this list).
  • This was the key to Mobileye’s demonstration, in which it was able to drive the highly chaotic streets of Jerusalem with apparently little problem.
  • Tesla, on the other hand, has simply said “we only need machine vision”, which would be true if the machine vision were good enough, which it clearly is not.
  • If Tesla can solve the machine vision problem, then I think it can have a leading solution, but I think its approach of relying on deep learning is not the right one.
  • In theory, it is possible to have one huge black box with an end-to-end deep learning system, and there are others outside of Tesla working on this, but I have yet to see a reliable demonstration of such a system driving vehicles safely.
  • This is why I continue to rank Tesla in the bottom half of autonomous driving solutions meaning that it is not going to get to market first with a robotaxi solution.
  • This supports my view that autonomous driving will not be a commercial reality before 2028 outside of rigidly geofenced offerings in very simple suburban or rural areas.
  • Furthermore, this also casts doubt on Tesla’s assertion that it can stably earn 66% gross margins from a robotaxi service, because there will be so much competition by the time Tesla makes it to market that prices will have fallen materially.
  • This is part of the valuation that the market is ascribing to Tesla, and while Tesla is in much better shape financially, I still can’t justify the valuation using any method based on fundamentals.
  • If I owned Tesla shares, I would sell them, but going short is much too dangerous a business while the hype bubble remains intact.
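
To make the fragility point concrete, here is a minimal sketch of how a small, deliberately chosen perturbation can flip a deep learning classifier’s decision. This is not the McAfee experiment itself (that used a physical strip of tape and Mobileye’s EyeQ3 camera); it is a standard FGSM-style gradient perturbation applied to a toy PyTorch model on synthetic data, and every model, number and label in it is illustrative only.

```python
# Sketch only: a small input perturbation flips a trained classifier's output.
# Toy model and synthetic data; NOT the McAfee attack or the EyeQ3 system.
import torch
import torch.nn as nn

torch.manual_seed(0)

# Toy two-class problem standing in for "35 sign" vs "85 sign" images.
x = torch.cat([torch.randn(200, 20) + 1.0, torch.randn(200, 20) - 1.0])
y = torch.cat([torch.zeros(200, dtype=torch.long), torch.ones(200, dtype=torch.long)])

model = nn.Sequential(nn.Linear(20, 32), nn.ReLU(), nn.Linear(32, 2))
optimizer = torch.optim.Adam(model.parameters(), lr=1e-2)
loss_fn = nn.CrossEntropyLoss()

# Train briefly so that clean inputs are classified correctly.
for _ in range(100):
    optimizer.zero_grad()
    loss_fn(model(x), y).backward()
    optimizer.step()

# Take one clean example and compute the gradient of the loss w.r.t. the INPUT.
sample = x[0:1].clone().requires_grad_(True)
output = model(sample)
clean_pred = output.argmax(dim=1).item()
loss_fn(output, y[0:1]).backward()

# FGSM: nudge the input in the sign of the gradient and see how little it takes
# to flip the prediction -- the digital analogue of a strip of tape on the "3".
with torch.no_grad():
    for step in range(1, 41):
        epsilon = 0.05 * step
        adversarial = sample + epsilon * sample.grad.sign()
        if model(adversarial).argmax(dim=1).item() != clean_pred:
            print(f"clean prediction {clean_pred} flipped at epsilon = {epsilon:.2f}")
            break
    else:
        print("prediction did not flip within the tested perturbation budget")
```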
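
And here is an equally simplified sketch of the cross-checking idea: accept a camera reading of a speed limit only when an independent source, such as an HD map, agrees that it is plausible. Mobileye’s actual fusion logic is proprietary, so the function, field names and threshold below are assumptions for illustration.

```python
# Sketch only: reject a vision reading that an independent source says is implausible.
# The HD-map lookup, threshold and names are hypothetical.
from dataclasses import dataclass


@dataclass
class SpeedLimitEstimate:
    value_mph: int
    source: str


def fuse_speed_limit(camera_mph: int, map_mph: int, max_jump_mph: int = 15) -> SpeedLimitEstimate:
    """Cross-check the camera's speed-limit reading against the HD map's stored limit."""
    if abs(camera_mph - map_mph) <= max_jump_mph:
        # Camera and map broadly agree: trust the (possibly more up-to-date) camera reading.
        return SpeedLimitEstimate(camera_mph, "camera")
    # Large disagreement (e.g. camera reads 85 in a mapped 35 zone): fall back to the map
    # and treat the camera reading as suspect rather than accelerating on it.
    return SpeedLimitEstimate(map_mph, "map (camera reading rejected)")


# The doctored-sign scenario: camera reads 85, the map says the road is a 35 zone.
print(fuse_speed_limit(camera_mph=85, map_mph=35))  # limit stays 35, camera reading rejected
print(fuse_speed_limit(camera_mph=40, map_mph=35))  # plausible update, camera reading accepted
```

In the doctored-sign case, a reading of 85 on a mapped 35 road fails the plausibility test, so the vehicle falls back to the map rather than accelerating, which is the kind of redundancy a pure machine-vision approach gives up.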

RICHARD WINDSOR

Richard is the founder and owner of the research company Radio Free Mobile. He has 16 years of experience working in sell-side equity research. During his 11-year tenure at Nomura Securities, he focused on equity coverage of the global technology sector.