Artificial intelligence and automotive: Happy together

By Majeed Ahmad

Editor-in-Chief

AspenCore

June 28, 2018

Blog

ADAS and autonomous car designs are unleashing levels of innovation unprecedented in the technology business. At the same time, however, AI is turning the industry upside down.

The tech industry, accustomed to one gold rush at a time, is now at a crossroads. Advanced driver assistance system (ADAS) and autonomous car designs are unleashing levels of innovation unprecedented in the technology business. At the same time, the artificial intelligence (AI) gold rush is turning the industry upside down, and, remarkably, these two technology juggernauts seem to be advancing in peaceful coexistence. In other words, they are happy together.

Here, it's worth mentioning that machine learning and deep learning are AI's technology offshoots. Machine learning, a subset of AI, relies on experiential learning rather than explicitly programmed rules to make decisions. Deep learning is a branch of machine learning that layers algorithms in a bid to gain a greater understanding of the data. It can take raw information without any inherent meaning and construct hierarchical representations that generate deep insights.
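The layering idea can be illustrated with a minimal sketch: each layer transforms the previous layer's output into a more abstract representation. This is an illustrative toy in NumPy, not an automotive-grade model; the layer sizes and the sensor-patch framing are assumptions for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def layer(x, w, b):
    """One dense layer with a ReLU non-linearity."""
    return np.maximum(0.0, x @ w + b)

# Raw input: e.g. a flattened 8x8 sensor patch -- 64 values with
# no built-in meaning.
x = rng.standard_normal((1, 64))

# Two stacked layers build a hierarchy of representations:
# low-level features (64 -> 32), then higher-level features (32 -> 8).
w1, b1 = rng.standard_normal((64, 32)), np.zeros(32)
w2, b2 = rng.standard_normal((32, 8)), np.zeros(8)

h1 = layer(x, w1, b1)   # first-level representation
h2 = layer(h1, w2, b2)  # deeper, more abstract representation

print(h1.shape, h2.shape)  # (1, 32) (1, 8)
```

In a trained network the weights would be learned rather than random, but the structural point stands: depth is what lets the model build meaning out of raw data.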

[Figure 1 | Artificial intelligence is an amalgam of several different technologies. Image via Arteris IP.]

Artificial intelligence and its technology offshoots can run specialized algorithms to perform complex tasks like assisted and automated driving that are otherwise impossible with rule-based programming. In supercomputer-like automotive systems-on-chip (SoCs), the AI accelerators are mostly implemented as IP blocks supplied by firms like Cadence, CEVA, and Synopsys.

These hardware accelerators are increasingly being used to run machine-learning algorithms in automotive chips supporting the camera, lidar, and radar subsystems. In fact, automotive chip designers are now slicing large algorithms, such as the object detection and recognition performed by fish-eye cameras with surround view, into smaller parts while adding more hardware accelerators. Likewise, more AI accelerators are being added to sense the car's surroundings, localize the vehicle on an HD map, and predict the behavior and position of other objects such as pedestrians, traffic lights, and other cars.
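The slicing approach can be sketched as a pipeline of stages, each assigned to its own accelerator block. All stage, accelerator, and function names below are hypothetical; a real design would dispatch work through the SoC vendor's SDK rather than plain Python calls.

```python
# Hedged sketch: a large vision algorithm sliced into smaller stages,
# each mapped to a dedicated accelerator block.

def dewarp(frame):
    """Fish-eye correction stage: normalize raw pixel values."""
    return [v / 255.0 for v in frame]

def detect(tensor):
    """Object-detection stage: keep indices above a toy threshold."""
    return [i for i, v in enumerate(tensor) if v > 0.5]

def recognize(detections):
    """Object-recognition stage: label each detection."""
    return {d: "object" for d in detections}

# Each slice of the algorithm is assigned to its own accelerator.
PIPELINE = [
    ("isp_accel", dewarp),     # image signal processor
    ("npu_0", detect),         # first neural accelerator
    ("npu_1", recognize),      # second neural accelerator
]

def run(frame):
    data = frame
    for accel, stage in PIPELINE:
        data = stage(data)     # in hardware, this slice runs on `accel`
    return data

print(run([0, 128, 255]))      # {1: 'object', 2: 'object'}
```

The payoff of slicing is that stages can run concurrently on different blocks: while one accelerator recognizes objects in frame N, another can already be detecting objects in frame N+1.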

[Figure 2 | The role of AI is going to be central in sensing the environment around vehicles. Image via Mentor Graphics.]

Cars, dubbed smartphones on wheels during the mid-2010s in a nod to the smartphone revolution, are now being called AI machines on wheels. However, caution is warranted: unlike smartphones, where low-end AI chips mostly cater to inputs like face-recognition sensors, the stakes are far higher in the cars of the future. The AI chips have to be spot on, and for that, they need abundant compute power, high bandwidth, and low latency.

AI's love affair with ADAS and autonomous cars is still a work in progress. On one hand, new AI algorithms are quickly coming to market; on the other, automotive chips continue to boost compute power in the form of hardware accelerators to accommodate these powerful new algorithms.

Majeed Ahmad is the former Editor-in-Chief of EE Times Asia. He is a journalist with an engineering background and two decades of experience in writing and editing technical content. He is also the author of six books on electronics: Smartphone, Nokia’s Smartphone Problem, The Next Web of 50 Billion Devices, Mobile Commerce 2.0, Age of Mobile Data, and Essential 4G Guide.
