Autonomous Autos – Double vision

Humans can run red lights, machines can’t.

  • Mobileye has a great excuse as to why its autonomous system ran a red light, but when lives are at stake, excuses are not good enough.
  • Safety is a major driver of the automotive industry, which is why components are tested to breaking point, why specifications are set so far in advance and why there is very little margin for error.
  • For vehicles to be truly autonomous and deliver all of the benefits that this offers, the systems need to be held to the same standard, and at the moment, they all fall very far short.
  • The latest example is Mobileye, which is working on an autonomous driving system that uses cameras rather than the lidar sensors that everyone else relies on.
  • The benefits of this are obvious, as lidars are expensive and collect huge amounts of data (up to 700,000 points per second), all of which has to be processed in real time (see the rough throughput sketch after this list).
  • Consequently, if cameras can be used reliably, then this would represent a large cost saving and a big advantage for Mobileye and its owner, Intel.
  • However, one of its weaknesses came to light recently when Mobileye was demonstrating its technology to the Israeli TV station, Channel 10.
  • In this instance, the wireless transmitters from the TV cameras interfered with signals being picked up from the traffic light transponder.
  • In the current setup, Mobileye’s decisions are based on the transponder signal, meaning that the vehicle continued even though the camera had identified the light as red (a simplified sketch of this kind of arbitration follows the list below).
  • This problem has now been corrected, but it demonstrates that the system is very far from being ready for market.
  • The problem is that the situations in which an autonomous system will find itself are so numerous that it is impossible to program for or predict each one.
  • Radio interference from a TV camera is a great example.
  • This is why autonomous driving is predominantly an AI problem.
  • Mobileye is planning to include lidar as a back-up once it has shown that its system is good enough, but in doing so I think that it obviates the point of having a camera-only system.
  • Furthermore, as Waymo has shown, having good AI to make driving decisions requires a vast amount of experience in different driving conditions from which to learn.
  • The importance of AI is highlighted by the fatality that Uber recently suffered (see here), where I am pretty certain that the sensor data picked up the movement of the pedestrian but that the software was unable to accurately interpret the sub-optimal data as a potential hazard.
  • Hence, while Mobileye’s system appears to work in good conditions, it’s the outlier scenarios and the obvious weakness in the interpretation of sub-optimal data that are likely to continue causing it problems.
  • This is why I think Waymo comfortably leads the pack with the best performing autonomous driving solution today, followed by Cruise from GM (see here).
  • I continue to think that the race to get to a workable solution remains largely academic as I still see the technology being ready long before the market is ready to receive it.
  • This gives the stragglers time to catch up, and if Mobileye can perfect its system and offer commercial-grade autonomy using only cameras, then it will have an advantage.
  • However, achieving this goal is still far away, which leaves the fact that it is not Google as one of Mobileye’s most appealing traits.
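
As a rough illustration of why the lidar data rate matters, the sketch below estimates the raw throughput an autonomous stack would have to handle in real time. The 700,000 points-per-second figure comes from the bullet above; the bytes-per-point and scan-rate figures are purely assumptions made for the arithmetic and do not come from Mobileye or any lidar vendor.

```python
# Back-of-envelope estimate of raw lidar throughput (assumed figures noted inline).
POINTS_PER_SECOND = 700_000   # upper bound cited above
BYTES_PER_POINT = 16          # assumption: x, y, z and intensity stored as 4-byte floats
FRAMES_PER_SECOND = 10        # assumption: a typical 10 Hz scan rate

bytes_per_second = POINTS_PER_SECOND * BYTES_PER_POINT
points_per_frame = POINTS_PER_SECOND // FRAMES_PER_SECOND

print(f"~{bytes_per_second / 1e6:.1f} MB/s of raw point data")       # ~11.2 MB/s
print(f"~{points_per_frame:,} points to segment, classify and fuse "
      f"within each ~{1000 // FRAMES_PER_SECOND} ms frame")          # ~70,000 per 100 ms
```

Even under these assumptions, each frame leaves roughly 100 ms to turn tens of thousands of points into objects and decisions, which is the processing burden a camera-only system avoids.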

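To make the red-light failure mode concrete, below is a minimal, hypothetical sketch of the kind of arbitration that can go wrong. Mobileye’s actual software is not public, so the structure, the names (LightState, decide_naive, decide_with_camera_veto) and the behaviour are assumptions used purely to illustrate why trusting the transponder over the camera is fragile, and why letting either sensor veto a “proceed” decision fails safe when one channel is degraded.

```python
from enum import Enum

class LightState(Enum):
    RED = "red"
    GREEN = "green"
    UNKNOWN = "unknown"   # e.g. the transponder signal is lost or garbled by interference

def decide_naive(transponder: LightState, camera: LightState) -> bool:
    """Hypothetical arbitration that trusts the transponder first: if the
    transponder channel is corrupted, the camera's red-light detection is ignored."""
    return transponder is not LightState.RED

def decide_with_camera_veto(transponder: LightState, camera: LightState) -> bool:
    """Hypothetical fix: either sensor reporting red blocks the vehicle, and an
    unreadable transponder only allows progress if the camera clearly sees green."""
    if LightState.RED in (transponder, camera):
        return False
    if transponder is LightState.UNKNOWN and camera is not LightState.GREEN:
        return False
    return True

# The Channel 10 scenario as described above: the camera sees red while the
# TV crew's wireless transmitters garble the transponder reading.
camera, transponder = LightState.RED, LightState.UNKNOWN
print(decide_naive(transponder, camera))             # True  -> the car runs the light
print(decide_with_camera_veto(transponder, camera))  # False -> the car stops
```

The point is not the specific code but the design choice: when two sources disagree or one is degraded, a safety-critical system should fail towards stopping rather than towards whichever source happens to be trusted.
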
RICHARD WINDSOR

Richard is the founder and owner of the research company Radio Free Mobile. He has 16 years of experience working in sell-side equity research. During his 11-year tenure at Nomura Securities, he focused on equity coverage of the Global Technology sector.