Technology is constantly evolving, with an emphasis on making life more convenient and everyday chores less time-consuming. Self-driving cars are one of these technologies, and it’s easy to see the ways in which you could benefit from them: you could be dropped off at work while the car parks itself, you would no longer need a designated driver for late-night celebrations, there would be no need to get angry at someone who didn’t indicate, and you could spend your time productively as you were chauffeured to your destination. These are simple changes that would provide a whole new way of living, and yet there are still a few hurdles to cross before their adoption – not least the fact that the technology has not been perfected.
With both self-driving cars and human drivers on the roads, the legal question of who is responsible for a crash arises – should the passenger be blamed, or the company behind the software, and who pays for the damages? Just recently, Google’s self-driving car caused an accident for the first time; all previous incidents occurred when a human driver had taken over from the automated system (Kantrowitz, 2016). According to a report by Google, the car “believed the bus would stop or slow down”, but the bus driver did neither of these things (which is perfectly acceptable), and the car therefore made contact with the vehicle. In the future, new laws will have to be implemented to ensure the safety of both drivers and passengers and to eliminate any grey areas; something similar is happening now with drones because of their increased use and popularity. To bring this back to my project, I’ll look into the legislation that is currently in place and what other laws may have to be taken into account when autonomous cars are more widespread.
Another aspect of self-driving cars that needs to be addressed is the potential for cyber-terrorism. Anyone who has seen I, Robot may have a well-founded reason to be afraid of autonomous technology, but what happens when an autonomous vehicle is hacked not by an artificial intelligence but by a terrorist? Cars could be forced off the road, driven into crowds of people or made to crash into a building. Hackers have a knack for adapting to new technologies, and those developing these vehicles must keep such possibilities in mind as they design their cars.
Google has used a series of car models for its road testing, including the Toyota Prius and the Audi TT, but its fleet now consists mainly of purpose-built prototypes and modified Lexus SUVs. Google is not the only company looking into the self-driving market, however; Volvo plans to release 100 autonomous cars by 2017 that will be used by actual customers in Sweden (ONE News, 2016). As part of my research on self-driving cars, I will look at our car culture and compare it to how it may look in the future. If no one is driving the car, high-powered engines become unnecessary for everyday trips and driving itself becomes a hobby rather than an expectation. So will certain brands still grant passengers a level of status? Will a driver’s licence no longer be a rite of passage? Will there be groups opposed to autonomous driving? It will be interesting to see how our culture’s attitude to this technology changes over time, and whether it is possible that one day driving will be limited to racing as a sport.
– Kantrowitz, A. (2016). Watch This Sad Bus Driver Get Hit By A Driverless Car And Realize He Can Only Blame Technology. BuzzFeed. Accessed at: http://www.buzzfeed.com/alexkantrowitz/watch-googles-self-driving-car-hit-a-bus#.hfGOxpQK3q
– ONE News (2016). ‘Eyes OFF the road’: Driverless car experiment to send a whole new message. TVNZ. Accessed at: https://www.tvnz.co.nz/one-news/world/eyes-off-road-driverless-car-experiment-send-whole-new-message