Drivers in autonomous vehicles must be alert

Hsinyen Huang, Managing Editor

Self-driving cars have long been a dream associated with the future. Thanks to advancements in artificial intelligence and sensor technology, that dream is now being realized through the efforts of companies such as Waymo and Tesla. Recent fatal accidents involving self-driving car technology, however, are reason enough to reevaluate the safety of allowing computer programs and sensors to take complete control of vehicles. Self-driving cars are not yet developed enough to steer and navigate safely, let alone handle unforeseen problems on the road that require human judgment to resolve. Drivers must stay alert while operating self-driving vehicles to make up for what artificial intelligence lacks.

Self-driving cars have been in development by tech and automobile companies since 2009, when Google announced its self-driving car project. Since then, Google’s project has evolved into a separate company, Waymo, and automobile companies such as Tesla and Mercedes-Benz have manufactured vehicles with some self-driving features available for public purchase. Ridesharing companies like Uber and Lyft have also been testing self-driving vehicles to potentially replace human drivers.

Recently, however, self-driving vehicles and the inattentiveness of the drivers operating them have been behind fatal accidents. On March 18, pedestrian Elaine Herzberg was killed after being struck during a test run of one of Uber’s self-driving cars. Just five days later, Wei Huang died of injuries sustained when his Tesla Model X crashed into a highway barrier while Tesla’s Autopilot self-driving system was engaged. In both cases, the drivers operating the vehicles had been distracted before the accidents, underscoring that drivers need to acknowledge the limits of current driverless car technology by staying constantly alert.

“It is dangerous to rely on a program in terms of operation of a vehicle,” said junior Tanvi Narvekar. “Recent accidents prove the unreliability of these programs, and their relative newness may result in software errors.”

Owners of cars with self-driving capabilities should note that even though these cars are marketed as driverless, current self-driving technology is driver-assistance technology at best. Even Waymo, the first company to start developing self-driving cars for commercial purposes, has stated that the technology needs at least another decade to become fully autonomous. In an effort to paint their products in a favorable light, however, these companies have marketed their vehicles in ways that indirectly lead consumers to become overconfident in the cars’ ability to drive safely without human intervention, which has led to accidents caused by inattentive drivers.

“We still need some manual control,” said freshman Keshav Dandu, a member of one of the robotics teams. “If someone is in a rush, they are more likely to ignore warnings from a self-driving car’s programming.”

Since conditions on the road are always changing, drivers need to remain aware of new developments. The sensors that self-driving cars use to detect their surroundings can be degraded by severe weather, and low visibility in such conditions reduces the accuracy of the cars’ cameras. Autonomous vehicles also come equipped with maps of roads, but when real-world conditions diverge from those maps, a self-driving car’s programming can easily become confused. To lower the number of accidents caused by unexpected circumstances, drivers must always be attuned to their surroundings.

“Driving on a mountain is completely different from driving on a highway, and driving in Europe is different from driving in America,” said Narvekar. “Programming can be used to handle these obstacles, but ultimately, finding calibrations for all of these settings will be extremely difficult.”

No matter how advanced artificial intelligence becomes, computers will never be able to handle social interactions the way humans can. It would therefore be unreasonable to expect self-driving cars to interpret the body language of pedestrians or react appropriately to sudden moves by cyclists. That task remains up to drivers, because artificial intelligence cannot understand humans the way other humans can.

“Even biking signals can confuse humans, and the way that people communicate meaning could always mean something different to another person,” said Dandu. “With a computer, this problem would be even worse, and it would be awkward and potentially dangerous for pedestrians to constantly pay attention to a camera mounted on a car.”

Even if self-driving cars were programmed to avoid any risk of an accident at all costs, such caution would mean major slowdowns on roadways. The only way to balance efficiency with safety is for drivers to employ their natural intelligence.

“The most dangerous aspect of self-driving cars is the complete reliance on the system,” said Narvekar. “Glitches are still possible, and humans need to be there during glitches, or people might be injured.”

While the idea of being able to drive anywhere without lifting a finger may appeal to those who lead busy lives, the reality is that autonomous vehicle technology still requires the assistance of human drivers to operate safely. Owners of cars with autopilot capabilities must understand the risks that come with that convenience and always stay alert to dangers on the road.