
How Driverless Cars See the World Around Them


Even in situations where lidar works well, these companies want backup systems in place. So most driverless cars are also equipped with a variety of other sensors.

Like what?

Cameras, radar and global positioning system antennas: the kind of GPS hardware that tells your smartphone where it is.

With the GPS antennas, companies like Uber and Waymo are providing cars with even more information about where they are in the world. With cameras and radar sensors, they can gather additional information about nearby pedestrians, bicyclists, cars and other objects.

Cameras also provide a way to recognize traffic lights, street signs, road markings and other signals that cars need to take into account.
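
One way to picture how those different readings fit together is as a single, shared model of the car's surroundings. The sketch below is purely illustrative; the names WorldModel and Detection are invented for this example and do not describe any company's actual software.

```python
from dataclasses import dataclass, field

@dataclass
class Detection:
    kind: str          # e.g. "pedestrian", "bicyclist", "traffic light: red"
    distance_m: float  # estimated distance from the vehicle
    sensor: str        # which sensor produced the reading

@dataclass
class WorldModel:
    gps_position: tuple                              # (latitude, longitude) from the GPS antenna
    detections: list = field(default_factory=list)

    def add(self, detection: Detection) -> None:
        """Fold one sensor reading into the shared picture of the surroundings."""
        self.detections.append(detection)

# Readings from different sensors all feed the same model.
model = WorldModel(gps_position=(33.4255, -111.94))
model.add(Detection("pedestrian", 12.5, "camera"))
model.add(Detection("pedestrian", 12.7, "radar"))
print(len(model.detections))  # 2
```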

Photo: Elaine Chao, right, the secretary of transportation, looking at a lidar system used by self-driving cars, at the University of Michigan in September. Credit: Max Ortiz/Detroit News, via Associated Press

How do the cars use all that information?

That is the hard part. Sifting through all that data and responding to it require a system of immense complexity.

In some cases, engineers will write specific rules that define how a car should respond in a particular situation. If a Waymo car detects a red light, for example, it is programmed to stop.
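
As a rough illustration, a hand-written rule of that kind might look like the minimal Python sketch below. The function name and return values are invented for this example, not taken from Waymo's software.

```python
# Illustrative hand-written rule: map a detected traffic-light color to an action.
def respond_to_traffic_light(light_color: str) -> str:
    if light_color == "red":
        return "stop"
    if light_color == "yellow":
        return "slow down"
    return "proceed"

print(respond_to_traffic_light("red"))  # stop
```

The limitation of this approach is already visible here: every situation has to be anticipated and spelled out by hand.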

But a team of engineers could never write rules for every situation a car could encounter. So companies like Waymo and Uber are beginning to rely on “machine learning” systems that can learn behavior by analyzing vast amounts of data describing the country’s roadways.

Waymo now uses a system that learns to identify pedestrians by analyzing thousands of photos that contain people walking or running across or near roads.
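
In broad strokes, learning from labeled examples can be sketched as below. This uses a simple scikit-learn classifier on made-up data purely for illustration; a production system would train far larger models on real camera images.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Stand-in data for the labeled road photos described above:
# each row is a flattened fake image, each label marks whether it shows a pedestrian.
rng = np.random.default_rng(0)
images = rng.random((200, 32 * 32))        # 200 fake 32x32 grayscale images
labels = rng.integers(0, 2, size=200)      # 1 = pedestrian, 0 = no pedestrian

# Instead of hand-written rules, the model infers the pattern from the examples.
classifier = LogisticRegression(max_iter=1000)
classifier.fit(images, labels)

new_image = rng.random((1, 32 * 32))
print("pedestrian" if classifier.predict(new_image)[0] == 1 else "no pedestrian")
```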

Is that the kind of thing that broke down in Tempe?

It is unclear what happened in Tempe. But these cars are designed so that if one system fails, another will kick in. In all likelihood, the Uber cars used lidar and radar as well as cameras to detect and respond to nearby objects, including pedestrians.
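
A simplified way to express that "if one system fails, another kicks in" design is to combine whatever detections each sensor manages to return. The structure below is hypothetical, not a description of Uber's actual code.

```python
def detect_nearby_objects(sensor_readings: dict) -> set:
    """Merge detections from every sensor that returned data, so losing
    one sensor does not blind the car to an object the others can see."""
    detected = set()
    for sensor, objects in sensor_readings.items():
        if objects is None:          # this sensor failed or produced nothing
            continue
        detected.update(objects)
    return detected

# Even with the camera feed missing, lidar and radar still report the pedestrian.
readings = {"lidar": {"pedestrian"}, "radar": {"pedestrian", "car"}, "camera": None}
print(detect_nearby_objects(readings))
```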

Self-driving cars can have difficulty duplicating the subtle, nonverbal communication that goes on between pedestrians and drivers. An autonomous vehicle, after all, can’t make eye contact with someone at a crosswalk.

“It is still important to realize how hard these problems are,” said Ken Goldberg, a professor at the University of California, Berkeley, who specializes in robotics. “That is the thing that many don’t understand, just because these are things humans do so effortlessly.”

The crash occurred at night. Is that a problem?

These cars are designed to work at night, and some sensors operate just as well in the dark as they do in the daytime. Some companies even argue that it is easier for these cars to operate at night.

But there are conditions that these cars are still struggling to master. They do not work as well in heavy precipitation. They can have trouble in tunnels and on bridges. And they may have difficulty dealing with heavy traffic.
