Pedestrians’ non-compliance with traffic rules is one of the biggest problems for self-driving cars
by Ready For AI
It is undeniable that autonomous driving technology has brought great convenience to our lives, but the problems with self-driving cars deserve equal attention.
Are self-driving car accidents inevitable?
“You’re crossing the road wrong.” That, in effect, has been the argument of self-driving car advocates in the months since such a car first caused the death of a pedestrian.
Self-driving cars are one of the most important applications of artificial intelligence, but they are now a source of growing concern: AI’s ability to drive in the real world is far worse than many people predicted a few years ago.
The situation recalls Steve Jobs’s famous response during the iPhone 4 “Antennagate” incident: “You shouldn’t hold a mobile phone like this.” Technical experts say the problem is not that autonomous vehicles don’t work, but that people’s behavior is unpredictable.
In March of this year, an Uber self-driving car struck and killed a woman pushing a bicycle across a road in Arizona, USA. It was night, and she was not on a designated crosswalk. After the incident, whether autonomous vehicles can correctly identify and avoid pedestrians crossing the road became a hot issue. The crash is still under investigation, but a preliminary report from the federal safety regulator said the car’s sensors did detect the woman; its decision-making software, however, classified the detection as a likely false positive, so the data was discarded.
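The failure mode the report describes, a genuine detection discarded as a likely false positive, can be illustrated with a minimal sketch. The `Detection` type, the class labels, and the confidence threshold are hypothetical illustrations, not Uber’s actual software:

```python
from dataclasses import dataclass

# Hypothetical detection record: a class label plus classifier confidence.
@dataclass
class Detection:
    label: str         # e.g. "pedestrian", "bicycle", "unknown"
    confidence: float  # 0.0 .. 1.0

# Illustrative threshold: anything below it is treated as a false positive.
FALSE_POSITIVE_THRESHOLD = 0.8

def should_brake(detections: list) -> bool:
    """Brake only for detections the filter trusts.

    The danger this sketch illustrates: a real pedestrian whose
    confidence score falls below the threshold is silently discarded.
    """
    trusted = [d for d in detections if d.confidence >= FALSE_POSITIVE_THRESHOLD]
    return any(d.label == "pedestrian" for d in trusted)

# A real pedestrian detected at low confidence is filtered out - no braking.
print(should_brake([Detection("pedestrian", 0.55)]))  # False
print(should_brake([Detection("pedestrian", 0.92)]))  # True
```

Tuning such a threshold is a trade-off: set it too low and the car brakes for phantom obstacles; set it too high and it can ignore a real pedestrian, as the preliminary report suggests happened here.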
How long until real self-driving cars formally go on the road?
Google’s Waymo promises to launch a self-driving taxi service in Phoenix, Arizona later this year; General Motors has likewise promised a similar service sometime in 2019, using a car without a steering wheel or pedals. However, it is unclear whether either company can operate outside its designated areas, or respond to emergencies without a safety driver. Meanwhile, other initiatives appear to be slowing down.
Elon Musk has shelved plans for a Tesla self-driving car to drive itself across the United States. Uber has cancelled its autonomous truck project to focus on self-driving cars. Daimler Trucks, a subsidiary of the Daimler Group, recently said that commercial driverless trucks will take at least five years to appear; others, including Musk, had previously predicted that such vehicles would be on the road by 2020.
Some opinions from Andrew Ng and others
As these timelines slip, supporters of driverless cars such as Andrew Ng argue there is a shortcut that would let autonomous vehicles reach the road faster: persuading pedestrians not to behave unpredictably. If people cross at crosswalks, surrounded by cues such as sidewalk markings and traffic lights, the software is more likely to identify them.
But for others, Andrew Ng’s advice points to a different problem: today’s technology cannot deliver autonomous cars as originally envisioned. “The artificial intelligence we really need is not here yet,” says Gary Marcus, a psychology professor at New York University whose main research interests are humans and artificial intelligence. He said that Andrew Ng is “just redefining the goal to make the job easier,” and that if we could achieve safe self-driving simply by isolating cars from human drivers and pedestrians, we would already have that technology: the train.
Rodney Brooks, a well-known robotics researcher and emeritus professor at the Massachusetts Institute of Technology, criticized Andrew Ng’s view in a blog post: one of the great promises of self-driving cars is that “they will eliminate the deaths caused by traffic accidents.” Now Ng says they can eliminate those deaths as long as people are trained to change their behavior. So what is the point of the promise?
Andrew Ng believes that humans constantly change their behavior to cope with new technologies, especially in transportation. He said: “If you look at the development of the railway, in most cases, people have learned not to stand on the rails.” He also pointed out that people already know school buses stop frequently, and that small children may cross the road in front of a stopped bus, so they drive more carefully around them.
Is the fault in pedestrians’ own habits?
In fact, according to Peter Norton, a professor of history at the University of Virginia, jaywalking was made a crime in most parts of the United States in the early 1920s because automakers lobbied for it, mainly to head off strict speed limits and other regulations that might hurt car sales. Norton wrote a book on the topic. There is, in other words, precedent for regulating pedestrian behavior to make way for a new technology.
Although Andrew Ng is probably the most famous autonomous-driving supporter who advocates training both humans and vehicles, he is not the only one who holds this view. Shuchisnigdha Deb, a researcher at the Center for Advanced Vehicular Systems at Mississippi State University, said, “There should be appropriate educational programs to familiarize people with these vehicles, how to interact with them, and how to use them.” The US Department of Transportation also emphasized the need for such consumer education in its latest automated-vehicle guidance.
Maya Pindeus, co-founder and CEO of Humanising Autonomy, compared this education to public awareness campaigns in Germany and Austria in the 1960s that followed a wave of pedestrian deaths. Humanising Autonomy is a London-based startup that builds models of pedestrian behavior and gestures for autonomous car companies to use. Those campaigns helped reduce pedestrian road deaths in Germany from more than 6,000 in 1970 to fewer than 500 in 2016, the last year with data available.
Understandably, however, the autonomous driving industry does not want to be seen shifting the burden onto pedestrians. Both Uber and Waymo said in emailed statements that their goal is to develop autonomous vehicles that can handle the real world without relying on changes in human behavior.
One challenge these companies face is that self-driving cars are still relatively new, so pedestrians don’t always treat them like ordinary cars. Some people simply can’t suppress the urge to test the technology themselves. Waymo, a subsidiary of Alphabet, often encounters pedestrians who deliberately “prank” its cars, stepping in front of a vehicle, moving away, then stepping back again to obstruct it.
People seem to assume by default that self-driving cars are extraordinarily sophisticated and careful, so they are willing to take risks to prank them. “Although our system does have superhuman perception, sometimes people seem to think that Newton’s laws no longer apply,” said Paul Newman, co-founder of the British startup Oxbotica, who recalled a pedestrian running after an autonomous car and then suddenly jumping in front of it.
As self-driving cars become commonplace over time, people will likely be less inclined to prank them. Meanwhile, the industry is discussing what companies should do to help people understand autonomous vehicles and anticipate their movements.
The latest developments in autonomous driving technology
Andrew Ng’s wife, Carol Reiley, is a co-founder of Drive.AI, which has made a series of modifications to the self-driving cars it is road-testing in Frisco, Texas. The cars have a bright orange appearance so that pedestrians can more easily notice and recognize them. Drive.AI was also the first to use an external LED display, similar to the screens many city buses use to show destinations or route numbers, to signal the car’s intentions to pedestrians. For example, if a self-driving car stops in front of a crosswalk, its display might read: “Waiting for you to cross.”
Uber has gone further, patenting a system that would include flashing external signs and holograms projected in front of the car to communicate with human drivers and pedestrians. Google has also applied for a patent on external signaling. Oxbotica’s Newman says he likes the idea of external messaging and sounds, like the beeping of a large vehicle in reverse, as devices to help make interactions between humans and self-driving cars safer.
Deb said her research shows that people want these external features, as well as voice communication or warnings. But so far, apart from Drive.AI, the cars these companies use in road tests do not include such modifications. It is also unclear how pedestrians or other human drivers should communicate their own intentions to autonomous vehicles, something Deb says may be necessary to avoid future accidents.
Humanising Autonomy wants the makers of autonomous cars to focus more on understanding the nonverbal cues and gestures people use to communicate. Pindeus said the problem with most computer vision systems used in self-driving cars is that they simply draw a bounding box around an object and apply a label, such as “parked car,” “bicycle,” or “pedestrian,” without analyzing how the thing inside the box is moving.
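Pindeus’s point, that a box and a label alone say nothing about movement, can be sketched by tracking a box’s center across frames to estimate whether a pedestrian is heading toward the road. The 30 fps frame rate, the speed threshold, and the `heading_into_road` test are illustrative assumptions, not Humanising Autonomy’s actual method:

```python
# Sketch: augment a per-frame bounding box with a simple motion estimate.
# Boxes are (x_min, y_min, x_max, y_max) in image pixel coordinates.

def center(box):
    """Center point of a bounding box."""
    x0, y0, x1, y1 = box
    return ((x0 + x1) / 2.0, (y0 + y1) / 2.0)

def velocity(prev_box, curr_box, dt=1 / 30):
    """Pixels/second moved by the box center between two frames (30 fps assumed)."""
    (px, py), (cx, cy) = center(prev_box), center(curr_box)
    return ((cx - px) / dt, (cy - py) / dt)

def heading_into_road(prev_box, curr_box, road_is_right=True, min_speed=30.0):
    """Crude intent cue: is the pedestrian moving toward the road fast enough?"""
    vx, _ = velocity(prev_box, curr_box)
    return vx > min_speed if road_is_right else vx < -min_speed

# A pedestrian box drifting right across two consecutive frames:
prev = (100, 200, 140, 300)
curr = (104, 200, 144, 300)   # moved 4 px right in one frame -> 120 px/s
print(heading_into_road(prev, curr))  # True
```

A static label (“pedestrian”) would treat both a person waiting at the curb and a person stepping off it identically; even this crude velocity estimate distinguishes the two, which is the kind of signal Pindeus argues current systems lack.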
Conclusion
Ultimately, better computer vision systems and better artificial intelligence may solve this problem. Over time, cities may be reshaped through “geofencing,” a new term for separate driving zones and designated pick-up locations for self-driving cars and taxis. In the meantime, the old parental advice still applies: don’t jaywalk, and look both ways before crossing the road.