How Google's self-driving cars see the world
Google's self-driving cars are traveling more naturally than they ever have before.
Decked out with GPS, cameras, radar, and lasers, the cars from Alphabet (Google's new parent company) can gather enormous amounts of data about their surroundings from a 360-degree perspective, allowing them to operate smoothly in a constantly changing environment.
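To make the idea of a 360-degree view concrete, here is a minimal, hypothetical sketch in Python of how detections from different sensors might be merged into a single picture of the car's surroundings. None of these names, thresholds, or numbers come from Google's actual software; they are purely illustrative.

```python
# Conceptual sketch only (not Google's software): merge readings from
# several sensors into one 360-degree list of nearby objects.
from dataclasses import dataclass


@dataclass
class Detection:
    sensor: str          # which sensor reported the object
    kind: str            # e.g. "pedestrian", "cyclist", "vehicle"
    bearing_deg: float   # direction relative to the car, 0-360 degrees
    distance_m: float    # how far away the object is, in meters


def merge_detections(*sensor_feeds):
    """Combine per-sensor detections into one 360-degree view,
    keeping the closest report for each direction/kind pair."""
    merged = {}
    for feed in sensor_feeds:
        for d in feed:
            key = (round(d.bearing_deg), d.kind)
            if key not in merged or d.distance_m < merged[key].distance_m:
                merged[key] = d
    return sorted(merged.values(), key=lambda d: d.bearing_deg)


# Made-up example readings from three sensor types.
lidar = [Detection("lidar", "cyclist", 45.2, 30.0)]
radar = [Detection("radar", "vehicle", 180.0, 120.0),
         Detection("radar", "cyclist", 45.0, 31.5)]
camera = [Detection("camera", "pedestrian", 270.0, 15.0)]

for obj in merge_detections(lidar, radar, camera):
    print(f"{obj.kind:10s} at {obj.bearing_deg:5.1f} deg, "
          f"{obj.distance_m:5.1f} m (via {obj.sensor})")
```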
According to Google's Self-Driving Car Project website, sensors on the car can detect objects up to two football fields away, including people, vehicles, construction zones, birds, cyclists, and more.
But the data each vehicle collects does more than help it respond in the moment. Everything the cars record feeds back into the shared software, so every car can learn from any one vehicle's experience.
Given that Google's self-driving cars have driven more than 1.2 million miles in autonomous mode since 2009, the software knows how to react in a lot of different situations.
Chris Urmson, the project's head of technical development, gave a thorough look at how the cars handle real-life scenarios during a TED Talk in June.
"We can take all of the data that the vehicles see overtime, the hundreds and thousands of pedestrians, cyclists, and vehicles that have been out there and understand what they look like and use that to understand infer what other vehicles should look like and other pedestrians should look like," Urmson said.
Here's a breakdown of how Google's self-driving cars see the world around them and how they use real-time data to respond to a wide range of scenarios.