The technology and trade media are abuzz with timelines for the arrival of autonomous cars, but pundits and technology observers are missing the point: it's evolution, not revolution. The fact that the National Highway Traffic Safety Administration (NHTSA) has defined five levels of vehicle automation, ranging from complete driver control to full autonomy, is a testament to this evolutionary approach.
Autonomous cars embody a hugely complex proposition whose basic building blocks come from multiple technology realms. The labyrinth of underlying technologies — sensors, cameras, LiDAR, mapping, algorithms, deep learning, artificial intelligence — makes self-driving vehicles a gigantic undertaking, one that will most likely keep evolving in the coming years.
BMW’s semi-autonomous cars aim to put safety first.
The autonomous driving technologies seem within reach, but at the same time, there are a lot of ifs and buts. One thing, however, is for sure: the technology drive that the notion of autonomous cars has kick-started is irreversible. Moreover, the convergence of technologies such as deep learning, sensor fusion, and advanced mapping will continue to make vehicles safer and more efficient.
Mobileye case study
Mobileye is a classic case study of the evolutionary approach in the driverless car's technology journey. During the mid-2000s, the firm employed cheap CMOS image sensors — the same kind used in mobile phones — to create a camera system that allowed cars to detect pedestrians, roadway markings, and other vehicles.
Eventually, Mobileye’s vision recognition system became one of the first practical manifestations of the advanced driver assistance system (ADAS). At the heart of this camera system, first launched in 2006 for the aftermarket, was the EyeQ system-on-chip (SoC), which ran an image-processing algorithm.
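Mobileye's actual EyeQ algorithms are proprietary, but the general idea behind camera-based detection in ADAS can be sketched with a toy example. The sliding-window approach below — scanning a fixed-size window across an image and flagging windows whose score exceeds a threshold — is a hedged, generic illustration, not Mobileye's method; the grid values and threshold are invented for the sketch.

```python
# Toy illustration of camera-based object detection in an ADAS vision system.
# A real system runs a trained classifier per window; here each "pixel" is a
# precomputed score, and a window is a hit if its summed score clears a bar.

def sliding_window_detect(image, window=3, threshold=6):
    """Return (row, col) top-left corners of windows whose sum exceeds threshold.

    `image` is a 2-D list of per-pixel scores (e.g. the output of an edge
    or texture filter in a real pipeline).
    """
    hits = []
    rows, cols = len(image), len(image[0])
    for r in range(rows - window + 1):
        for c in range(cols - window + 1):
            score = sum(image[rr][cc]
                        for rr in range(r, r + window)
                        for cc in range(c, c + window))
            if score > threshold:
                hits.append((r, c))
    return hits

# A 5x5 "frame" with a bright 3x3 blob (a candidate object) in the corner.
frame = [
    [0, 0, 0, 0, 0],
    [0, 0, 0, 0, 0],
    [0, 0, 1, 1, 1],
    [0, 0, 1, 1, 1],
    [0, 0, 1, 1, 1],
]
print(sliding_window_detect(frame))  # → [(2, 2)]
```

Production detectors replace the summed score with a learned classifier and scan at multiple scales, but the scan-score-threshold structure is the same.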
Mobileye developed and manufactured this vision processor in collaboration with STMicroelectronics, a partnership that has surpassed a decade and is still intact. Mobileye contributes the algorithms and software, while ST is responsible for processor design, manufacturing, and automotive certification.
The journey that began with the launch of the EyeQ vision chipset back in 2004 has now reached the fifth version, EyeQ5, which the two companies jointly announced in May 2016. The EyeQ5 automotive chipset is going to be a lot more than a vision processor.
Block diagram of the EyeQ5, which supports 16 cameras.
The EyeQ5 aims to become a central processing platform for cameras, LiDARs, radars, and other sensors while consuming less than 5 watts. The chipset will be built on a FinFET process at a 10 nm or smaller node, with sampling slated for 2018 and production for 2020.
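A central platform that pools cameras, radar, and LiDAR implies sensor fusion. As a minimal sketch of that idea — not Mobileye's actual method, and with all sensor values invented — one common textbook approach is inverse-variance weighting: each sensor's range estimate is weighted by its reliability, so a precise LiDAR reading pulls the fused estimate harder than a noisy camera reading.

```python
# Minimal sensor-fusion sketch: fuse independent range estimates of the same
# obstacle using inverse-variance weighting. Lower variance = more trusted.

def fuse_estimates(readings):
    """readings: list of (measured_range_m, variance) tuples.

    Returns (fused_range_m, fused_variance). The fused variance is always
    smaller than the best single sensor's variance.
    """
    weights = [1.0 / var for _, var in readings]
    total = sum(weights)
    fused = sum(w * r for w, (r, _) in zip(weights, readings)) / total
    return fused, 1.0 / total

# Hypothetical camera, radar, and LiDAR estimates of distance to one obstacle.
sensors = [(25.0, 4.0),    # camera: noisiest
           (24.0, 1.0),    # radar
           (24.5, 0.25)]   # LiDAR: most precise
distance, variance = fuse_estimates(sensors)
print(round(distance, 2), round(variance, 2))  # → 24.43 0.19
```

Note that the fused variance (0.19) is below even the LiDAR's (0.25): combining independent sensors yields an estimate better than any one of them, which is the core argument for a single chip that sees all sensor streams at once.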
No wonder the ADAS underdog is now rubbing shoulders with giants like BMW and Intel in the race toward autonomous cars, claiming self-driving vehicles will be on the streets by 2021. Meanwhile, other automotive chipmakers such as NXP and Nvidia are vying to create super-chips for next-generation autonomous car platforms.
Autonomous cars embody an evolution, not a revolution. Tesla’s announcement that it will stop using Mobileye’s EyeQ chips in its Autopilot system is a case in point. There will inevitably be hits and misses before fully autonomous cars reach the streets.