You need just two eyes and two ears to drive. Those remarkable sensors provide all the information you need to, say, recognize that a fire engine is coming up fast behind you and get out of its way. Autonomous vehicles need a whole lot more than that. They use half a dozen cameras to see everything around them, radars to judge how far away it all is, and at least one lidar laser scanner to map the world. Yet even that may not be enough.
To understand why, think about that fire engine. Your ears hear it approaching from behind, and your stereoscopic hearing can determine where it is, where it's headed, and how fast it's moving. Hearing plays an essential role in how you navigate the world, and, so far, most autonomous cars can't hear. Engineers at the outfits developing robocars are trying to figure out how to give them that skill, along with any other human traits they'll need to hit the roads.
“Since the technology is relatively new, we still don’t have all the answers as to what is best,” says Jeff Miller, who studies driverless vehicle systems at USC.
Waymo, which is testing a fleet of autonomous minivans in the Phoenix area, has developed microphones that let its robocars hear sounds twice as far away as its previous sensors could, while also letting them discern which direction a sound is coming from.
It recently spent a day testing the system with emergency vehicles from the Chandler, Arizona, police and fire departments. Police cars, ambulances, fire trucks, and even unmarked cop cars chased, passed, and led the Waymo vans through the day and into the night. Sensors aboard the vans recorded vast quantities of data that will help create a database of all the sounds emergency vehicles make, so in the future, Waymo's driverless cars will know how to respond. If it’s a fire truck coming up behind, the car will pull over. If you’re at a green light and an ambulance is approaching from the left, the car will yield.
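Waymo hasn't published how its microphone array localizes sirens, but the two-ear trick the article describes, inferring direction from the difference in arrival time between two sensors, can be sketched in a few lines. The sketch below is purely illustrative: the microphone spacing, sample rate, and function names are assumptions, not details of Waymo's system. It cross-correlates two microphone channels to find the sample delay between them, then converts that delay into a bearing using the mic geometry.

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 °C
MIC_SPACING = 0.5       # metres between the two mics (assumed)
SAMPLE_RATE = 48_000    # Hz (assumed)

def bearing_from_tdoa(left: np.ndarray, right: np.ndarray) -> float:
    """Estimate a sound source's bearing from the time-difference of
    arrival between two microphone channels.

    Returns degrees from broadside: 0 means straight ahead, with the
    sign indicating which microphone heard the sound first.
    """
    # Cross-correlate the channels; the lag of the peak is the
    # sample delay of `left` relative to `right`.
    corr = np.correlate(left, right, mode="full")
    lag = np.argmax(corr) - (len(right) - 1)

    # Convert the lag to seconds, then to an angle via the geometry.
    delay = lag / SAMPLE_RATE
    # Clamp to the physically possible range before taking arcsin.
    sin_theta = np.clip(delay * SPEED_OF_SOUND / MIC_SPACING, -1.0, 1.0)
    return float(np.degrees(np.arcsin(sin_theta)))

# Synthetic check: delay a noise burst by 20 samples on one channel,
# as if the source sat off to one side of the array.
rng = np.random.default_rng(0)
signal = rng.standard_normal(4800)
delayed = np.roll(signal, 20)
print(bearing_from_tdoa(signal, delayed))
```

A real system would do this across many microphone pairs, at many frequencies, and feed the result into a classifier trained on a sound database like the one described above, but the core geometry is the same.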