The idea of self-driving vehicles might seem like a recent technology, but it has been around since the early 1920s. According to engineer.com, the first known prototype of the driverless car dates to 1925, when Houdina Radio Control used radio technology to control a car with a remote. The concept was brought back to light when Google announced in 2009 that it had been developing and testing its own driverless car. After this announcement, numerous companies began developing their own versions of driverless vehicles. Two of the most prominent companies that have already introduced self-driving cars on the road are Tesla and Uber.
As of late 2015 and early 2016, Tesla has introduced a fleet of cars available for purchase that offer an Autopilot mode capable of driving on its own under driver oversight. Uber has also introduced self-driving transportation services to the public in California and Arizona. Electrek reported in November 2016 that Tesla had already recorded 1.3 billion miles on its Autopilot technology, and Uber has tested its vehicles on public roads for millions of miles. Since Tesla's introduction of its self-driving cars, there has been only one report of a fatal accident, which happened in the United States in May 2016. Uber has also been involved in a single non-fatal accident in the United States in 2016.
In Tesla's Autopilot accident in the United States, Joshua Brown was involved in a high-speed crash with an 18-wheel truck and trailer when the car continuously sped up and hit the rear end of the trailer, ripping the roof off the Tesla. After the Florida Police Department finished its investigation, it found Brown at fault for the accident because he did not have his hands on the wheel when the accident occurred and was also watching a Harry Potter movie on his car's touchscreen. This accident happened at the beginning of Tesla's launch of its self-driving vehicles. In Uber's accident, its self-driving car was crossing an intersection on a yellow light when another vehicle not associated with the self-driving program hit Uber's vehicle, causing it to turn on its side. After the investigation, it was concluded that the human driver was at fault because he or she failed to yield at the intersection. In both of these situations, the crashes were due to human error. Different psychological processes interfere with moral reasoning, and the case of self-driving vehicles is surrounded by a number of controversies that raise questions about the moral reasoning behind the presence of driverless vehicles. There are issues over whether driverless vehicles should be allowed on the roads with civilian drivers, whether they should be tested more before being let onto the road, and whether accidents involving them should be blamed on civilians. Psychological theories of moral reasoning can best explain the introduction of driverless vehicles onto the roads alongside civilian drivers. According to the philosopher David Hume, morals excite passions that cause or prevent actions undertaken by human beings.
The passions driving actions should not impede moral decisions. Driverless vehicles should be allowed on the roads with civilian drivers. They are fitted with a Sure-Brake system that monitors wheel speed, relays commands to the hydraulic modulator, and analyzes the relayed data to detect skidding. This automation model has countless applications, including autonomous driving, and the NHTSA has classified automated vehicles into levels of automation. Governments have invested in new technology that would render driverless vehicles safer and roadways more lawful. Google's vehicles have attempted to attain full autonomy, yet they are never without a human driver, in order to meet safety requirements. Although there are many fears over the accidents in which driverless vehicles have been involved, both the companies and the governments involved in the manufacture of driverless vehicles have adopted safety procedures. The safety comparison between driverless vehicles and human drivers is notable: Elon Musk, Tesla's chief executive, argues that accident rates are 50% lower (Boudette).
Driverless vehicles should be tested more before they are released for consumers to use. A number of states have passed regulations regarding the use and testing of driverless vehicles. States like California, Nevada, and Michigan have passed legislation regarding the testing of AVs on public roads. The California law requires a driver in the driver's seat who can take control during testing (Chafkin). Testing and development are important for ensuring the safety of driverless vehicles, and more tests are needed before the vehicles are put on the road to ensure that accidents and risks are avoided.
The accidents involving self-driving vehicles and civilian vehicles should not automatically be blamed on the civilian drivers; the law should take its course in investigating who was in the wrong. In the case of the Uber accident, an investigation concluded that the human driver was in the wrong and should therefore take the blame: the accident occurred because the civilian driver failed to yield while the Uber vehicle had two safety drivers in the front seats. The Tesla Autopilot accident that killed Joshua Brown is likewise blamed on the human driver. Although Autopilot's cameras and radar failed to recognize the white truck, the investigation found that Brown was at fault because he did not have his hands on the wheel and was watching a movie during the accident. Such accidents should be attributed to human error, and the law should take its course in finding who was at fault. Conclusively, moral reasoning should be considered when coming up with decisions and policies relating to self-driving vehicles.
Boudette, Neal E. "Autopilot Cited in Death of Chinese Tesla Driver." The New York Times, 2016.
Chafkin, Max. "Uber's First Self-Driving Fleet Arrives in Pittsburgh This Month." Bloomberg Businessweek, 2016.