The summer of 2016 truly belongs to autonomous car platforms and the sophisticated electronics that will eventually power them. First, on June 29, BMW announced that it is joining forces with vision processing specialist Mobileye and chipmaker Intel to create a central computing architecture for self-driving cars.
At around the same time, news broke of a Tesla driver dying in a crash while using the semi-autonomous "Autopilot" feature. According to Tesla, the system failed to recognize the white side of a tractor trailer against a brightly lit sky, so the automatic emergency brakes were never applied, leading to the fatal collision.
Industry reports suggest that this fatal crash eventually led to Tesla’s decision to stop using Mobileye’s EyeQ vision chips in its Autopilot driver-assist systems. On August 16, however, Mobileye made waves again by inking a strategic deal with Tier 1 automotive supplier Delphi to create off-the-shelf autonomous systems.
[Mobileye will add new hardware and software into Delphi's existing automated driving platform.]
Delphi will integrate Mobileye's powerful automotive system-on-chips (SoCs), the EyeQ4 and EyeQ5, into its multi-domain controller (MDC), which fuses data from a suite of sensors (cameras, radars, and LiDARs) mounted at the vehicle's four corners. Mobileye specializes in computer vision, deep learning, and mapping technologies, while Delphi focuses on sensors, software, and system integration.
Also last month, Mobileye's automotive nemesis Nvidia teamed up with the Chinese search-engine giant Baidu to use artificial intelligence (AI) in creating a cloud-based mapping platform for autonomous cars. Baidu will combine its mapping and cloud technologies with Nvidia's Drive PX 2 automotive platform, which comprises cameras, sensors, an AI-centric operating system, and high-definition 3D mapping technology.
Self-driving technology labyrinth
The technology collaborations chronicled here speak for themselves. At the same time, however, the target dates for these autonomous car undertakings seem to converge, and almost all of them extend beyond 2020. A lot could change during this period, including the ultimate shape and form of autonomous cars.
A recent story from The Wall Street Journal affirms the July 2016 Embedded Computing Design column "The long road to autonomous cars." The story calls for caution on the part of automakers who claim self-driving capabilities while advertising features like adaptive cruise control, automatic emergency braking, and assisted steering.
[Autonomous cars require a new level of intelligence to handle a variety of complex situations.]
The technology labyrinth behind the marvel of autonomous cars is a classic example of breakthroughs tempered by limitations. As the Journal story points out, carmakers and automotive technology outfits should acknowledge that self-driving cars are a work in progress, still years away from commercial realization.
Jing Wang, senior vice president and general manager of Baidu’s autonomous driving unit, hit the nail on the head while describing his company’s autonomous car ambitions. He said that Baidu is developing self-driving cars with the intention of increasing passenger safety and reducing traffic congestion and pollution.
Intel's acquisition of vision-processing specialist Movidius is another stark reminder that the drive behind autonomous cars is for real. What shape and form the technology will take over the coming years, however, isn't clear yet.