The automotive industry is inching closer to putting autonomous driving on the road. There have been impressive demonstrations, such as Delphi’s Roadrunner car completing a 3,400-mile autonomous test drive in April, and Tesla Motors will soon add autonomous driving features to cars that are already on the road today. It’s becoming clear that autonomous driving along the lines of Tesla’s model and many of the other demonstrations is where cars are heading.
“I think that autonomous vehicles and autopilots are going to enable a whole new world of business models and have a substantial impact on how we allocate real estate, where we live, how we drive, and we’ll end up converting vehicles from being cars to living rooms, restaurants, bars, night clubs, meeting rooms, and home theaters that just happen to move while we continue to work and play inside of them,” says Michael Worry, CEO and Founder at electronic product design and manufacturing company Nuvation Engineering.
Worry is working with a team of researchers at the University of Waterloo in Ontario, Canada led by Steve Waslander, Assistant Professor, Department of Mechanical and Mechatronics Engineering, who are three years into a four-year partnership on autonomous vehicle development. Funding from Canada’s federal and provincial governments supports five faculty and five or six students working simultaneously on different autonomous driving projects. Through this partnership, Nuvation and the University of Waterloo bring together the perspectives of what’s possible in function and what’s viable in business.
“Steve and his team work on what I call the string theory of autonomous driving – how do we figure out a design that can handle every situation regardless of weather or conditions that come up on the road or anything else?” Worry says. “Whereas myself, as an entrepreneur and a capitalist, I look at it from a perspective of what’s the minimum viable product? What can be done here that can be carved off that can cover enough situations that it’s a product that someone will pay money for? And somewhere in between those two perspectives represents this project work that we do.”
Instead of focusing on building features already in OEMs’ pipelines, Waslander says they’re working on what can be available in vehicles in a few years.
“We’re looking at the collaborative driving future where we have multiple autonomous vehicles communicating with each other,” Waslander says. “We’re trying to simultaneously exploit the fact that they can share information and they can autonomously make decisions and sense things around them.”
The research is split into four topic areas: two focus on multi-vehicle coordination and two on perception systems, like those seen in the Google autonomous cars and the Mercedes-Benz prototypes. Autonomous platooning involves teams of vehicles coordinating on the highway, telling each other how fast they want to go, which lanes they want to travel in, and how far along the highway they want to get, then working together to maximize efficiency and driver comfort. Evasive or emergency maneuvering, the other multi-vehicle coordination topic, lets a platoon rapidly communicate emergency situations down the chain of vehicles behind so each can take appropriate action, such as slamming on the brakes, switching lanes, or moving onto the shoulder. One perception topic involves generalizing vision-based lane-detection algorithms to every condition a vehicle might encounter, such as intersection detection, merging-lane detection, and any other markings on the road; the cars need to figure out how to get from the entry point of an intersection to the correct lane when exiting it. Starting this year the team is researching computer vision, and stereovision in particular, for robust tracking of other vehicles, pedestrians, and bicycles in all sorts of weather.
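The payoff of relaying an emergency down the chain, rather than waiting for each car to sense the one ahead, comes down to reaction time. A minimal sketch with illustrative numbers (the speeds, delays, and deceleration here are invented, not from the Waterloo project):

```python
def stopping_distance(speed_mps, reaction_s, decel_mps2=8.0):
    """Distance covered during the reaction delay plus hard braking to a stop."""
    return speed_mps * reaction_s + speed_mps ** 2 / (2.0 * decel_mps2)

def extra_gap_needed(speed_mps, follower_reaction_s):
    """Extra spacing a follower needs because it starts braking later
    than the car ahead (which brakes at time zero)."""
    return (stopping_distance(speed_mps, follower_reaction_s)
            - stopping_distance(speed_mps, 0.0))

# At 30 m/s (~108 km/h): a human reacting in 1.5 s needs ~45 m of extra
# gap, while a follower warned over a 0.05 s vehicle-to-vehicle link
# needs only ~1.5 m, which is what makes tight platoons feasible.
human_gap = extra_gap_needed(30.0, 1.5)   # 45.0 m
v2v_gap = extra_gap_needed(30.0, 0.05)    # 1.5 m
```

The braking terms cancel because both cars decelerate equally hard; all that remains is the distance covered before the follower reacts, which is why cutting the reaction path from human perception to a radio link shrinks the required gap so dramatically.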
In Waslander’s platooning project, “MPC Based Collaborative Adaptive Cruise Control with Rear End Collision Avoidance,” presented last year at the IEEE Intelligent Vehicles Symposium in Michigan, Waslander, Feyyaz Emre Sancar, Baris Fidan, and Jan P. Huissoon studied the benefits of communicating additional acceleration information between nearby cars in a platoon. The test car can sense the cars immediately in front of and behind it, and can also communicate with cars two or three positions ahead to get their expected accelerations and other data. It then uses this data to make the whole sequence more stable and to better handle changes in the situation.
“The added piece that gets ignored that I always love talking about is that all these cars have hard actuation limits,” Waslander says. “They can only accelerate so quickly and that acceleration depends on how fast they’re already driving and they can only hit the brakes so quickly, which also depends on how quickly they’re driving and the road surface they’re on. All those constraints are things that we can include when we use the model predictive control (MPC) formulation.”
The MPC method the team used to develop improved collaborative adaptive cruise control (CACC) schemes predicts car behavior to maximize “string stability” in the platoon, avoid rear-end collisions with the preceding vehicle, and maintain safe conditions for the following vehicle.
“It tries to predict how all the cars are going to move over the next 5-10 seconds instead of just looking instantaneously at the errors in spacing between the vehicles,” Waslander says. “It’s coming up with a plan, not just a control input.”
While the first step of the plan is being implemented, the model updates its problem definition to determine a new plan for the next step.
“You have a lot more foresight with the MPC method and you can solve it numerically and quickly enough now that we can do this much larger optimization problem inside the control loop,” Waslander says.
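That receding-horizon loop can be sketched in a few lines. The toy version below is not the team’s formulation (the cost weights, acceleration limits, and time step are invented for illustration), but it shows the MPC pattern the article describes: search bounded acceleration plans over a short horizon, score each predicted trajectory, and apply only the first step before replanning.

```python
import itertools

DT = 0.5            # control interval in seconds (illustrative)
HORIZON = 6         # plan 3 s ahead
ACCELS = [-4.0, -2.0, 0.0, 1.0, 2.0]  # m/s^2; hard actuation limits baked in
DESIRED_GAP = 20.0  # target spacing to the leader, m

def plan_cost(gap, rel_speed, plan, leader_accels):
    """Roll the spacing dynamics forward under one candidate plan.
    gap = leader position minus our position; rel_speed = leader speed
    minus our speed. Returns infinity if the plan would rear-end the leader."""
    cost = 0.0
    for a_self, a_lead in zip(plan, leader_accels):
        rel_speed += (a_lead - a_self) * DT
        gap += rel_speed * DT
        if gap <= 0.0:                 # predicted rear-end collision: forbid
            return float("inf")
        cost += (gap - DESIRED_GAP) ** 2 + 0.1 * a_self ** 2
    return cost

def mpc_step(gap, rel_speed, leader_accels):
    """Brute-force the best bounded plan, but apply only its FIRST
    acceleration; the rest is discarded and recomputed next tick."""
    best = min(itertools.product(ACCELS, repeat=HORIZON),
               key=lambda plan: plan_cost(gap, rel_speed, plan, leader_accels))
    return best[0]
```

The communicated leader accelerations enter as `leader_accels`, which is the extra information the scheme shares between vehicles: with it, a follower can anticipate a slowdown several steps before the spacing error alone would reveal it.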
This and the other projects Waslander’s team is working on make up the research sphere, where researchers demonstrate the feasibility of a feature. Worry is concerned with the production sphere, where companies must show they’re willing to sell the feature and guarantee it’ll work. There’s still a huge divide between the two and a lot of work left to do, but Waslander and Worry are working to close that gap.
Dealing with less-than-ideal conditions
One of the biggest challenges of autonomous driving development is dealing with every possible condition a car and driver can face. When the lighting, the weather, and the road conditions are all perfect, autonomous driving is easy. But outside lab-controlled settings, that’s rarely what these systems will face.
“The approach that most car companies are taking is, essentially, excessively redundant sensing,” Waslander says. “They have three or four radars in the front bumper and in the back, sonar, cameras pointing out the side view mirrors, cameras pointing forward, and laser scanners in the front and sometimes in the back. What they’re looking for is complementary sensing methods that work in different conditions.”
Using all of these methods helps cover each one’s shortcomings, Waslander says. For example, sonar is bad at long range, and its slow speed makes it good for parking but not much else. Camera systems and lasers work well when the lighting is good, but glare from sunlight or other interfering light sources can throw them off. Radar is immune to many of these problems, so it’s good for features like adaptive cruise control, but it’s hindered by low resolution. The combination of systems allows most situations to be addressed, and when the system can’t handle something, it reverts to human control.
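The complementary-coverage idea can be made concrete with a toy availability check. The capability and failure sets below are invented for illustration (real systems model sensor degradation far more carefully), but the structure shows where the reversion to human control comes from:

```python
# Illustrative capability/failure model, not any production system's logic.
SENSORS = {
    "camera": {"gives": {"long_range", "classification"},
               "fails_in": {"glare", "darkness"}},
    "lidar":  {"gives": {"long_range", "fine_detail"},
               "fails_in": {"glare", "heavy_rain", "snow"}},
    "radar":  {"gives": {"long_range"}, "fails_in": set()},
    "sonar":  {"gives": {"short_range"}, "fails_in": set()},
}

NEEDED = {"long_range", "short_range", "classification"}

def autonomy_available(conditions):
    """True if every needed capability is covered by at least one sensor
    that the current conditions do not disable; otherwise the system
    must hand control back to the human driver."""
    covered = set()
    for spec in SENSORS.values():
        if not (spec["fails_in"] & conditions):
            covered |= spec["gives"]
    return NEEDED <= covered
```

In this toy model, clear conditions leave every capability covered, while glare disables both camera and lidar at once, so object classification is lost and control must be handed back.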
Worry notes that while it’s a difficult problem to solve, once we solve it we have the solution forever.
“This is a tough problem, and there are a lot of people working on it, but what’s fascinating is that it’s the same as the light bulb,” Worry says. “Today a light bulb is a trivial thing and there are all kinds of versions of it. Before Edison figured out tungsten there was tons of experimentation done until we figured that out. Once we solved the problem, that problem was solved for all time. We forever had the ability to take electricity and make light. Autonomous vehicles are the same way; once it’s solved then we’ll have this autonomous capability for all time, every vehicle will then have the ability to have that level of autonomous control.”
Computer vision versus lasers and radars
Today the competing technologies vying to become the best solution for autonomous driving are computer vision and laser and radar sensing, Waslander says. Computer vision has been around for a long time, and many believe it will eventually be able to interpret visual data the way humans can; development keeps getting closer to that goal. Camera technology has become very low power and portable thanks to mobile phones, and it’s so inexpensive that it could provide ubiquitous autonomy capabilities once it’s able to accurately interpret visual data.
Lasers, on the other hand, require rigorous engineering, heavy processing, and heavy, expensive sensing hardware to collect the incredible amount of data they’re capable of gathering about the environment. This type of system is what Google has been using in its self-driving car demonstrations; the spinning sensor on top of the car can measure about 700,000 data points per second in a 360-degree, 100 m radius around the car.
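A back-of-the-envelope calculation shows what those numbers mean at the edge of the sensing radius. The beam count and spin rate below are assumptions for illustration (a 64-beam scanner spinning at 10 Hz, the class of sensor used in such demonstrations), not a published spec:

```python
import math

POINTS_PER_SECOND = 700_000   # figure quoted for the roof-mounted scanner
BEAMS = 64                    # assumed beam count (illustrative)
ROTATION_HZ = 10              # assumed spin rate (illustrative)

# Points one beam lays down per full rotation, the angle between them,
# and the resulting gap between neighboring returns at 100 m range.
points_per_beam_per_rev = POINTS_PER_SECOND / (BEAMS * ROTATION_HZ)  # ~1094
angular_step_deg = 360.0 / points_per_beam_per_rev                   # ~0.33 deg
spacing_at_100_m = 100.0 * math.radians(angular_step_deg)            # ~0.57 m
```

Even at 700,000 points per second, neighboring returns on a single beam land roughly half a meter apart at the 100 m edge, one reason all that data still needs heavy processing to become reliable object detections.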
“Laser systems are what have allowed these Google cars to do such amazing demonstrations so quickly,” Waslander says. “They have really good information about the roads they’re driving on, both before and while they’re driving on them. If they can bring the cost of those sensors down [for laser-based systems] – they require fancy optics and all kinds of things that make reducing cost challenging – then that might be a technology that wins versus computer vision, which is already cheap.”
Waslander and Worry agree that cost may be the deciding factor, and that computer vision is winning; their research emphasizes computer vision methods. From a business standpoint, cost is a big factor, but so is what people will buy: consumer preference will probably favor discreet computer vision cameras over the large spinning cylinder on top of Google’s car.
Getting people on board with self-driving cars
Though solving the technical problem of creating a system that works accurately in all conditions is challenge enough, humans may prove the more difficult obstacle in the autonomous vehicle space.
“We occasionally encounter ‘muggles’ that have this view that autonomous vehicles will never happen because it’s too hard of a problem,” Worry says. “They like to itemize out difficult situations that would be difficult even for a human to handle. The truth is that there is a lot of autonomous capability that can be added to vehicles and enhance the driving experience or free that human up to do something else without needing the full solution.”
Many of these difficult challenges are caused by human drivers themselves.
“As long as there’s still one human driver on the road, you can’t rely only on autonomous vehicles talking with each other,” Worry says. “This is one of the biggest challenges that Google encountered. The rules of the road are very well defined. There are very specific ways as to how you’re supposed to drive, and there are very specific laws. If everyone followed them, then the autonomous driving problem would actually be very easy. The majority of the effort is around people who don’t follow the rules and don’t follow the laws. We have to handle that guy that cuts across lanes without signaling and goes through a stop sign or runs a red light, and we’ve all seen it.”
Worry very strongly believes that humans shouldn’t drive. He didn’t think many people would agree with him at first; when he gave a presentation last year called “Why it should be illegal for humans to drive” he expected people to argue with him about it, but it turned out no one disagreed.
“What’s fascinating is because in general as a society we have a much higher standard for computers than we do for humans,” Worry says. “Every day over 3,000 humans are murdered by other humans using vehicles, and it’s accidental and unfortunate, but that’s what goes on and somehow as a society we’re not enraged by that, and yet we seem to believe that we need to get autonomous computers in cars to a level that they can handle every situation and it’s just not a realistic comparison.”
Dealing with regulations and certification
Fortunately, the people in charge aren’t making autonomous driving progress too difficult. Waslander says government regulations haven’t been much of a hindrance so far.
“Because you can sit a driver in the car who can take over at any point in time, researchers just started driving and then they ask for permission later, and sure enough Nevada and California jumped on board relatively quickly,” Waslander says.
Worry agrees that because people are still responsible for their vehicles, autonomous capabilities or not, we should be seeing partially autonomous vehicles on the road very soon. He doesn’t see drivers having problems signing a document that says they agree to be personally responsible even while their car is in autopilot mode.
Safety certification may be another story as we get closer to deploying more autonomous systems, but with progress being made through incremental extensions of existing technology, we’ll only have to prove the new additions are safe and reliable. Tesla’s implement-now-and-see-what-happens model will be interesting, and the first big battle in this area, Waslander says. When the software update goes out over the air this summer and the cars become somewhat autonomous, will those cars then be illegal?
Fully autonomous vehicles
For fully autonomous vehicles, Waslander and Worry disagree on when we’ll see them. Worry thinks we’re five years out from fully autonomous driving being legal, though people will probably start using the technology a couple of years earlier if the capability is there.
Waslander thinks we’re about ten years out from full autonomy, because there are a lot of hurdles left to clear before reaching 100 percent reliability and the ability to drive in any situation. For example, what if you send your car out somewhere, but it starts to snow and the car has to park itself, leaving you to retrieve it potentially miles away? There are a lot of unforeseen circumstances to deal with.
There are also some big-picture issues to consider. The engineering community talks about building smart cities that are power efficient, but how will autonomous vehicles affect that? The effect could be positive: the roughly 30 percent of urban real estate allocated to parking could be freed for better use because cars no longer need to be parked nearby; drivers can send their cars off to do something else while they’re at work. But autonomy could also worsen urban sprawl. Traffic congestion currently limits how far from work we live, but if we could reclaim the time we now spend focused on driving, people might start living farther and farther away, spending that time being productive and not minding a long commute. Or, instead of parking downtown, drivers could send their cars out to circle the streets and pick them up when ready, because that’s cheaper than parking in a tightly packed downtown, maybe doubling the car’s driving time or more. If we’re still on fossil fuels, that’s trouble for pollution, though a switch to electric cars powered by “green” sources could reduce the environmental impact.
Taking it one step at a time
The industry isn’t trying to build Rome in a day, so there’s plenty of time to think about and address big-picture problems. For now it’s focused on taking things one step at a time, which is often how technology develops. GPS, now a ubiquitous and, to many, essential tool, wasn’t perfected at launch, Worry argues.
“GPS wasn’t anywhere near as accurate as it is now, but it has been incrementally improved,” Worry says. “We’re seeing that same sort of natural progression going on with autonomous vehicle technologies.”
“We’re going to see a continuous evolution of autonomy,” Waslander says. “You can already argue that the new features coming on-board vehicles in the next two to three years that have been announced officially as arriving in 2016-2017 already constitute autonomous driving. You won’t have to touch the steering wheel, the brakes, or the gas pedal, and you’ll be able to go long distances on the highway in a much more relaxed state while you’re still monitoring the progress of the vehicle. You’re still responsible for the vehicle, but you don’t have to do the minutia, the chore of driving. So from there cars are going to handle more complicated situations, more unpredictable environments, more complex scenes with lots of things going on. They’ll creep in from both the high-speed but controlled environments on the highway and from low-speed, gradually moving environments like parking lots, to chaotic 60-70 kph (35-45 mph) driving on urban streets with intersections and cars coming from different directions and pedestrians right beside you. Those are probably the most challenging environments.”
The next steps in research and production
Much progress has been made in the autonomous driving space, but there’s still a long way to go if we want to fully reclaim our daily commutes, “drive” home after a few drinks, send the car to pick the kids up from soccer practice, and never worry about parking downtown again. Many smart minds are working on these problems, and we’re seeing new capabilities all the time.
For Waslander, Worry, and the University of Waterloo and Nuvation partnership, the four research projects are still keeping them plenty busy. This summer they hope to finalize their test bed of four fully autonomous 1/5-scale vehicles and have them demonstrate all of their progress so far. After that, Waslander hopes to scale up to real vehicles and find partners to help make that happen. They also hope to expand their collaborative driving models from a local network of two or three nearby vehicles to a much larger area, since 8- and 10-lane highways can have hundreds of vehicles in communication range that may affect a given car’s driving. Worry is looking at how these technologies can meet current market demands and be marketed and applied in the real world.
Real-world applications in the commercial industry
Some existing technologies can already help with autonomous vehicle control tasks that companies are asking for. For example, Worry talked to a company with a fleet of garbage trucks about the possibility of autonomous capabilities freeing drivers from the mundane tasks of their jobs. The company explained its autonomous driving need was to get a system that would prevent collisions with mailboxes.
“Now that’s a very specific, niche problem, and avoiding low-speed collisions is absolutely within the capabilities of current technologies,” Worry says. “There are certainly bite-sized chunks that can be carved off and contribute to consumer lifestyle enhancements and business efficiency without having to solve the whole problem.”
In another example, Nuvation took an existing autopilot developed for its after-hours, for-fun autonomous driving project, DiscoFish, and commercialized it for medium- and heavy-duty truck manufacturer Peterbilt Motors Company, automating time-consuming and mundane tasks so drivers can spend time on more important things. They demonstrated the autopilot at ITS World Congress, where it drove a course around a parking lot and maintained accuracy to within a couple of inches over multiple repeated loops.
Commercial spaces like trucking may be where more sophisticated automation goes first, as those buyers are far less price-sensitive than consumers shopping for a car.
“With the automotive industry we’re talking about a few thousand dollars and you have to be done or people won’t pay for it,” Waslander says, “but in the trucking industry the cost sensitivities are far less, so we may see those really high-precision, high-quality sensors go into that sort of industry first. We may not start with autonomous driving in cars – we may start somewhere else.”
One mundane, time-consuming task that drivers hate is breaking in new trucks, driving for tens of hours around a loop to burn in the engine. This is something an autopilot could easily do, Worry says.
Marshalling trucks – shuffling trucks around to fit them in the correct location in a parking lot or shipping yard after arriving at their destination – is another low-speed, logistical task that is annoying for a human to do, Worry says. With so many big trucks in close quarters with poor visibility, autonomous systems could also help avoid property damage and small accidents. Human drivers could go do much higher value tasks while an advanced driver assistance system finished these mundane chores.
“You can put in a differential GPS that is amazingly accurate and install it at a local location in a marshalling yard,” Worry says. “It could then drive these trucks around really slowly and do the Tetris match to figure out exactly where the trucks are supposed to be.”
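The “Tetris match” Worry describes is, at its core, an assignment problem: which truck goes to which slot to minimize total low-speed driving. A toy brute-force version (the positions and Manhattan-distance metric are invented for illustration; a real yard system would use an optimized solver and actual path lengths):

```python
from itertools import permutations

def assign_slots(truck_pos, slot_pos):
    """Return the slot index for each truck, minimizing the total
    Manhattan distance driven. Brute force over all assignments is
    fine for a handful of trucks."""
    def cost(perm):
        return sum(abs(tx - slot_pos[s][0]) + abs(ty - slot_pos[s][1])
                   for (tx, ty), s in zip(truck_pos, perm))
    return list(min(permutations(range(len(slot_pos))), key=cost))

# Two trucks sitting on each other's target slots: the solver swaps them.
assignment = assign_slots([(0, 0), (50, 0)], [(50, 0), (0, 0)])  # [1, 0]
```

With centimeter-level differential GPS fixing each truck’s position, the hard part is exactly this kind of slow, deliberate logistics, not high-speed perception, which is why marshalling yards are such an attractive early target.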
There are many possible real-world applications of autonomous driving systems, and it’ll be interesting to see where they get implemented first, what they’ll allow us to do, and how people react to them.