The Science of Self-Driving Cars


A wireframe of a sleek modern car.

Soon, your car will be able to drive itself faster and more safely while you relax in your seat.

What’s the future of personal transportation? Well, you’ll likely be spending a lot less time behind the wheel, for one. The rise of self-driving cars means that some scenes out of science-fiction flicks (think Total Recall or I, Robot) are now reality—and even more will be available soon.

Cars today already include many semi-autonomous features, like assisted parking and self-braking systems. And completely autonomous vehicles—able to operate without human control—are rapidly becoming more of a reality. You’re probably familiar with Google’s version, which has made headlines with its Google Chauffeur software; the company hopes to bring it to market by 2020.

The pros of autonomous cars are many. “The sensors in a self-driving car are always observing, are not affected by the state of the driver (sleepy, angry, etc.), and can scan in multiple directions simultaneously,” says Dr. Dominique Freckmann, an automotive engineering manager at TE Connectivity, a global connectivity and sensor company. “Autonomous driving is a key aspect of the industry’s drive toward safer roadways.”

“Recent NHTSA research shows that approximately 94 percent of accidents are caused by human error,” adds Alan Amici, a vice president of automotive engineering at TE. “Cars with advanced safety features and, eventually, self-driving cars can significantly reduce the number of collisions. The impact of this innovation can be far-reaching, including reduced demand on emergency response systems and reduced auto insurance and health care costs.”

What technology makes self-driving cars possible? It’s really three technologies, Amici says: sensors, connectivity, and software/control algorithms.

“Most of the sensors required for autonomous driving are available today and are used in advanced safety features such as blind-spot monitoring, lane-keep assistance, and forward collision warning,” he says. “Sensors such as radar, ultrasonics, and cameras provide the input necessary to navigate the car safely.”
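To make that concrete, here is a toy sketch of how readings from those three sensor types might be fused into a forward-collision warning. Every name, signature, and threshold here is illustrative—this is not code from TE, Google, Tesla, or any real vehicle platform:

```python
# Hypothetical sketch: fusing radar, ultrasonic, and camera readings
# into a forward-collision warning. All names and thresholds are
# illustrative, not taken from any real vehicle system.

from dataclasses import dataclass

@dataclass
class SensorFrame:
    radar_range_m: float        # distance to the lead vehicle (radar)
    closing_speed_mps: float    # relative speed toward the obstacle
    camera_sees_vehicle: bool   # camera classifier confirms a vehicle ahead
    ultrasonic_range_m: float   # short-range distance (parking, blind spot)

def forward_collision_warning(frame: SensorFrame) -> bool:
    """Warn when time-to-collision drops below a 2-second threshold
    and the camera corroborates the radar return."""
    if frame.closing_speed_mps <= 0:
        return False  # not closing on anything; no warning needed
    time_to_collision = frame.radar_range_m / frame.closing_speed_mps
    return time_to_collision < 2.0 and frame.camera_sees_vehicle

frame = SensorFrame(radar_range_m=18.0, closing_speed_mps=12.0,
                    camera_sees_vehicle=True, ultrasonic_range_m=5.0)
print(forward_collision_warning(frame))  # 18 m / 12 m/s = 1.5 s -> True
```

Requiring agreement between two independent sensors (radar plus camera) before warning is one common way such systems reduce false alarms.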

Connectivity means cars have access to the latest traffic, weather, surface conditions, construction, maps, adjacent cars, and road infrastructure, he says. This data is used to monitor a car’s surrounding operating environment to anticipate braking or avoid hazardous conditions.

Finally, software/control algorithms are needed to reliably capture the data from sensors and connectivity and make decisions on steering, braking, speed, and route guidance. “By far the most complex part of self-driving cars, the decision-making of the algorithms, must be able to handle a multitude of simple and complex driving situations flawlessly,” Amici says. “The software used to implement these algorithms must be robust and fault-tolerant.”
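The decision-making loop Amici describes can be sketched in miniature: fused sensor and connectivity inputs go in, a driving command comes out, and implausible data is handled conservatively rather than ignored. This is a hypothetical illustration under simplified assumptions, not a production control algorithm:

```python
# Hypothetical sketch of the sense -> decide -> act step described above.
# Functions, rules, and thresholds are illustrative; a real system would
# be redundant, certified, and vastly more complex.

def decide(gap_to_lead_m, road_is_icy, speed_mps):
    """Return a (steering, braking) command from fused inputs.

    Connectivity data (here, an icy-road report) lengthens the
    following distance the planner insists on."""
    # Fault tolerance: treat missing or implausible sensor data
    # conservatively instead of crashing or guessing.
    if gap_to_lead_m is None or gap_to_lead_m < 0:
        return ("hold_lane", "gentle_brake")

    # Headway rule: keep 2 seconds of gap normally, 4 seconds on ice.
    safe_gap_m = speed_mps * (4.0 if road_is_icy else 2.0)
    if gap_to_lead_m < safe_gap_m:
        return ("hold_lane", "brake")
    return ("hold_lane", "coast")

print(decide(30.0, road_is_icy=True, speed_mps=10.0))   # 30 m < 40 m gap -> brake
print(decide(30.0, road_is_icy=False, speed_mps=10.0))  # 30 m > 20 m gap -> coast
```

Even this toy version shows why Amici calls the algorithms the hardest part: the same 30-meter gap demands different actions depending on context the car learned from connectivity, and the code must behave sensibly even when its inputs fail.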

Two of the most talked-about self-driving advancements come from Google and Tesla, and they take different approaches. Google is using lidar (a radar-like technology that uses light instead of radio waves) sensor technology and going straight to cars without steering wheels or foot pedals. Tesla has rolled out Autopilot, a software system that employs high-tech camera sensors as a car’s “eyes,” to some of its cars already on the market.

While technologies and capabilities continue to evolve toward making autonomous vehicles a reality, there are some hurdles. Right now, autonomous cars are legal only in a few U.S. states, as regulators weigh how to best ensure their safe interaction with standard human-driven vehicles.

“Self-driving capability will add benefits to our whole society, such as providing transportation for people who are otherwise not able to drive because of age or physical impairment,” Freckmann says. “That is both exciting and meaningful.” 

August 1, 2016. Written by Nancy Gupton.