Driverless cars need eyes and ears

You need just two eyes and two ears to drive. Those remarkable sensors provide all the information you need to, say, realize a fire engine is coming up fast behind you and get out of its way. Autonomous vehicles need a whole lot more than that. They use half a dozen cameras to see everything around them, radars to measure how far away it all is, and at least one lidar laser scanner to map the world. Yet even that may not be enough.

To understand why, think about that fire engine. Your ears hear it approaching from behind, and because each ear receives the sound slightly differently, your brain can work out where it is, where it's headed, and how fast it's moving. Hearing plays an essential role in how you navigate the world, and, so far, most autonomous cars can't hear. Engineers at the outfits developing robocars are trying to figure out how to give them that skill, and any other human traits they'll need to hit the roads.
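To make that idea concrete, here is a minimal sketch of the same trick done with two microphones: estimate the time difference of arrival between the channels, then convert it into a bearing. The mic spacing, sample rate, and function names are illustrative assumptions, not Waymo's actual pipeline.

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 °C

def estimate_delay(left: np.ndarray, right: np.ndarray, sample_rate: int) -> float:
    """Return how much earlier the sound hit the left mic, in seconds
    (negative if it hit the right mic first), via cross-correlation."""
    corr = np.correlate(left, right, mode="full")
    lag = np.argmax(corr) - (len(right) - 1)  # positive lag: left channel is delayed
    return -lag / sample_rate

def bearing_from_delay(delay_s: float, mic_spacing_m: float) -> float:
    """Convert a time difference of arrival into a bearing, in degrees.
    0 = straight ahead; positive = toward the left mic. Assumes the
    source is far enough away that its wavefront is roughly planar."""
    ratio = np.clip(SPEED_OF_SOUND * delay_s / mic_spacing_m, -1.0, 1.0)
    return float(np.degrees(np.arcsin(ratio)))

# A siren arriving 0.2 ms earlier at the left mic of a 15 cm array
# puts the source roughly 27 degrees to the left.
print(bearing_from_delay(0.0002, 0.15))
```

Two ears, or two microphones, give you direction; tracking how that direction and the siren's loudness change over time is what reveals heading and speed.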

“Since the technology is relatively new, we still don’t have all the answers as to what is best,” says Jeff Miller, who studies driverless vehicle systems at USC.

Waymo, which is testing a fleet of autonomous minivans in the Phoenix area, has developed microphones that let its robocars hear sounds twice as far away as its previous sensors could, while also letting them discern which direction a sound is coming from.

It recently spent a day testing the system with emergency vehicles from the Chandler, Arizona, police and fire departments. Police cars, ambulances, fire trucks, and even unmarked cop cars chased, passed, and led the Waymo vans through the day and into the night. Sensors aboard the vans recorded vast quantities of data that will help create a database of all the sounds emergency vehicles make, so in the future, Waymo's driverless cars will know how to respond. If a fire truck is coming up behind, the car will pull over. If the car is at a green light and an ambulance approaches from the left, it will yield.
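Once a siren has been detected and localized, the last step is a decision rule. Here is a toy sketch of what that mapping from bearing to behavior might look like; the bearing convention, the 135-degree threshold, and the action names are assumptions for illustration, not Waymo's implementation.

```python
from enum import Enum

class Action(Enum):
    PULL_OVER = "pull over"
    YIELD = "yield"
    CONTINUE = "continue"

def respond_to_siren(bearing_deg: float, approaching: bool) -> Action:
    """Map a localized siren to a driving response.
    bearing_deg: 0 = dead ahead, +/-180 = directly behind."""
    if not approaching:
        return Action.CONTINUE       # siren is moving away: carry on
    if abs(bearing_deg) > 135:
        return Action.PULL_OVER      # closing from behind: clear the lane
    return Action.YIELD              # crossing or oncoming: give way

# Fire truck closing from directly behind -> pull over.
print(respond_to_siren(180.0, approaching=True))
# Ambulance approaching from the left at an intersection -> yield.
print(respond_to_siren(90.0, approaching=True))
```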

Read the article on Wired