The future of vehicle connectivity

By Jean Pilon-Bignell

Vice President of Business Development, Government & Smart Cities

Geotab

July 30, 2018

As automotive technologies continue to mature, new vehicle sensors are now capable of predicting impending collisions, identifying pedestrians and cyclists, and more.

Since GM and Ford introduced the first automotive electronic control units (ECUs) in the 1970s, vehicles have gradually moved away from their mechanical roots in favor of digital architectures. Today, the average vehicle contains more than 100 digital sensors, with high-end vehicles containing significantly more. Traditionally, these sensors were used to measure and regulate basic vehicle operations, such as engine timing and fuel injection. However, as automotive technologies continue to mature, new vehicle sensors are now capable of predicting impending collisions, identifying pedestrians and cyclists, and even controlling certain vehicle functions. The deployment of these kinds of sensors is expected to accelerate in the near future, generating more than 4 TB of data per vehicle per day.

Sensor technology in vehicles

Most new vehicles with driver-assist safety technologies rely on a combination of sensor types, including radar, LIDAR, and cameras. Each of these technologies has its own strengths and weaknesses, which is why all of them continue to be deployed in modern vehicles.

Radar plays a key role in two advanced driver assistance system (ADAS) technologies: the automatic emergency braking system (AEBS) and adaptive cruise control (ACC). Unlike most optical sensing technologies currently available, radar is largely unaffected by environmental and atmospheric disturbances such as fog, rain, snow, darkness, and glare, making it well suited to most driving conditions.
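
To make the radar use case concrete, the sketch below shows how a range and closing-speed measurement could feed a simple following-distance rule (ACC) and a time-to-collision check (AEBS). All names, thresholds, and gains are hypothetical illustrations, not any production system's logic, which also involves sensor fusion, filtering, and actuator control.

```python
from dataclasses import dataclass

@dataclass
class RadarTrack:
    range_m: float            # distance to the lead vehicle, metres
    closing_speed_mps: float  # positive when the gap is shrinking

def acc_speed_adjustment(track: RadarTrack, ego_speed_mps: float,
                         time_gap_s: float = 2.0) -> float:
    """Return a crude speed correction (m/s) to hold a target time gap."""
    desired_gap_m = ego_speed_mps * time_gap_s
    gap_error_m = track.range_m - desired_gap_m
    # Proportional correction on the gap error, damped by the closing speed.
    return 0.2 * gap_error_m - 0.5 * track.closing_speed_mps

def aeb_should_brake(track: RadarTrack, min_ttc_s: float = 1.5) -> bool:
    """Trigger emergency braking when time-to-collision drops below a threshold."""
    if track.closing_speed_mps <= 0:
        return False  # gap is steady or opening; no collision risk
    return (track.range_m / track.closing_speed_mps) < min_ttc_s
```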

LIDAR sensors work much like radar, but instead of transmitting radio waves they use a scanning laser to generate a complete 3D image of the vehicle’s surroundings. LIDAR is arguably one of the most important technologies in the race toward autonomous vehicles (AVs). In many existing AV deployments, LIDAR provides a 360° view of surrounding obstacles, which the vehicle’s control systems use to steer around them.
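
As a rough illustration of what "a 360° view of surrounding obstacles" means in data terms, the sketch below scans a LIDAR point cloud for nearby above-ground returns and reports their bearings. It is a simplified, assumed representation; real AV perception pipelines cluster, track, and classify obstacles rather than applying a single distance test.

```python
import math
from typing import List, Tuple

# A LIDAR frame as a list of (x, y, z) points in metres, vehicle at the origin.
PointCloud = List[Tuple[float, float, float]]

def nearby_obstacle_bearings(cloud: PointCloud, max_range_m: float = 5.0,
                             min_height_m: float = 0.3) -> List[float]:
    """Return bearings (degrees) of returns that are close and above the ground plane."""
    bearings = []
    for x, y, z in cloud:
        if z < min_height_m:
            continue  # ignore ground returns
        if math.hypot(x, y) <= max_range_m:
            bearings.append(math.degrees(math.atan2(y, x)))
    return sorted(bearings)
```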

Camera and other vision-based systems have primarily been used for visual driver-assist features and for high-definition (HD) mapping applications. As camera-equipped vehicles continue to drive our roads, they are building a library of HD maps that other vehicles can use to help identify stop signs, yield signs, lane markings, and other road features.
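
One way such an HD map library could be consumed is sketched below: a camera detection is cross-checked against nearby map features before downstream logic acts on it. The schema and tolerance are hypothetical examples, not any specific mapping product's format.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class MapFeature:
    kind: str   # e.g. "stop_sign", "yield_sign", "lane_marking"
    lat: float
    lon: float

def confirm_detection(kind: str, lat: float, lon: float,
                      hd_map: List[MapFeature], tol_deg: float = 0.0002) -> bool:
    """True if the HD map contains a matching feature near the camera detection."""
    return any(f.kind == kind
               and abs(f.lat - lat) <= tol_deg
               and abs(f.lon - lon) <= tol_deg
               for f in hd_map)
```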

Driver and pedestrian safety initiatives have been the major driver of the rapid growth of these ADAS and AV technologies. This is reflected in a number of government initiatives around the globe, including the recent European New Car Assessment Programme (Euro NCAP) and NHTSA’s proposed ADAS requirements for OEMs. Beyond these, municipal agencies are placing increasing emphasis on improving traffic and pedestrian safety through public awareness campaigns, legislation, and private partnerships. In light of this, vehicle data is becoming a critical input into these initiatives.

Urban analytics for smart cities

Vehicle sensors collect large volumes of data that remain largely underutilized: the data generated by a vehicle is mostly consumed only by that vehicle itself. However, as cities across the globe invest heavily in smart city initiatives, vehicle data will become instrumental to their success. For instance, connected-vehicle solutions such as Geotab’s open telematics platform are already collecting, normalizing, and securely aggregating more than 2.5 billion data points per day from vehicles across North America. By leveraging machine learning and big data analytics, this data can be transformed into actionable smart city insights and urban analytics, which can in turn be used to predict traffic patterns, identify high-risk traffic areas, reduce greenhouse-gas emissions, and justify investments in digital infrastructure.
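
To make the "identify high-risk traffic areas" use case tangible, here is a minimal sketch that bins harsh-braking events from many vehicles into a coarse geographic grid and surfaces the busiest cells. The event schema, cell size, and threshold are assumptions for illustration only, not Geotab's actual data model or analytics pipeline, which would use far richer features and proper clustering.

```python
from collections import Counter
from typing import Iterable, List, Tuple

def high_risk_cells(events: Iterable[Tuple[float, float]],
                    cell_deg: float = 0.001, min_events: int = 25
                    ) -> List[Tuple[Tuple[float, float], int]]:
    """Return grid cells (lat, lon of cell origin) with many harsh-braking events."""
    counts = Counter(
        # Snap each (lat, lon) event to the origin of its grid cell.
        (round(lat // cell_deg * cell_deg, 6), round(lon // cell_deg * cell_deg, 6))
        for lat, lon in events
    )
    return [(cell, n) for cell, n in counts.most_common() if n >= min_events]
```

Cells that repeatedly accumulate harsh-braking events are candidate hot spots a city could review for signal timing, signage, or road-design changes.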

By equipping public- and private-sector fleets with telematics technologies, vehicles become smart sensors on wheels, and entire fleets become mobile smart city sensor networks. This trend will only accelerate as connected vehicles move beyond vehicle-to-network (V2N) applications and into the realm of vehicle-to-everything (V2X), including infrastructure, pedestrians, and other vehicles.
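
The sketch below shows the kind of small, normalized record a "sensor on wheels" might publish over a vehicle-to-network (V2N) link. The field names and structure are purely illustrative assumptions, not any vendor's actual message schema.

```python
import json
import time

def build_v2n_record(vehicle_id: str, lat: float, lon: float,
                     speed_kph: float, harsh_braking: bool) -> str:
    """Serialize one hypothetical telematics observation for upload."""
    record = {
        "vehicle_id": vehicle_id,
        "timestamp": int(time.time()),
        "position": {"lat": lat, "lon": lon},
        "speed_kph": speed_kph,
        "events": ["harsh_braking"] if harsh_braking else [],
    }
    return json.dumps(record)

# Example: one observation from a fleet vehicle in downtown Toronto.
print(build_v2n_record("demo-001", 43.6532, -79.3832, 42.0, True))
```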

V2X: Extending what cars can “see”

Beyond onboard radar, LIDAR, and camera technologies, vehicles of the future will communicate with numerous external third-party systems in real time. This is the core of vehicle-to-everything (V2X) communications. With V2X, cars can communicate with other vehicles (including emergency service vehicles), traffic lights, digital road signs, pedestrians, and more. This will unlock new methods of revenue generation (or cost recovery) for transport departments, help urban planning departments develop safer and more efficient transportation systems, and provide valuable inputs to a municipality’s environmental and sustainability initiatives. The catalyst for all of this is an open and secure connected-vehicle platform.
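
As a loose illustration of a vehicle-to-infrastructure exchange, the sketch below is inspired by the signal phase and timing (SPaT) idea: a traffic signal advertises its current phase and time remaining, and an approaching vehicle checks whether it can clear the stop line in time. The message fields and decision rule are simplified assumptions, not the actual V2X standard messages.

```python
from dataclasses import dataclass

@dataclass
class SignalPhase:
    intersection_id: str
    phase: str              # "green", "yellow", or "red"
    seconds_remaining: float

def can_clear_intersection(msg: SignalPhase, distance_m: float,
                           speed_mps: float) -> bool:
    """True if the vehicle reaches the stop line before the green phase ends."""
    if msg.phase != "green" or speed_mps <= 0:
        return False
    return (distance_m / speed_mps) <= msg.seconds_remaining
```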

Jean Pilon-Bignell is Associate Vice President of Government and Smart City at Geotab. Leveraging a Master of Engineering and an M.B.A., Jean uses a unique combination of technical expertise and business acumen to develop innovative connected-vehicle, IoT, and smart-city strategies that maximize operating efficiency and profits.
