CLOSED: [2018-03-23 Fri 18:35]
:PROPERTIES:
:CREATED: [2018-03-23 Fri 10:49]
:ID: 2018-03-23-self-driving-cars
:END:
:LOGBOOK:
- State "DONE"       from "DONE"       [2018-03-29 Thu 13:07]
- State "DONE"       from "NEXT"       [2018-03-23 Fri 18:35]
:END:

Update 2018-03-29: Settlement and deactivated anti-collision mechanism

In Tempe/USA, [[https://www.theguardian.com/technology/2018/mar/22/video-released-of-uber-self-driving-crash-that-killed-woman-in-arizona][a self-driving car killed a woman]]. The police published videos of the accident:

#+CAPTION: Tweet by TempePolice on an incident with a self-driving car.
#+ATTR_HTML: :alt Tempe Police Vehicular Crimes Unit is actively investigating the details of this incident that occurred on March 18th. We will provide updated information regarding the investigation once it is available.
#+ATTR_HTML: :align center :width 590
[[tsfile:2018-03-21T23.23 Twitter.com - TempePolice - Tempe Police Vehicular Crimes Unit is actively investigating the details of this incident with an Uber -- transportation screenshots publicvoit.png][https://twitter.com/TempePolice/status/976585098542833664]]

This is my opinion on this accident and on self-driving cars killing more people in general.

*** The Tempe Accident

After watching the video, I have to say that probably no human driver could have avoided this accident. The poor woman crossed the street at night without any light (on herself or from street lamps) and did so without even watching the traffic. It could almost be considered a suicide.

However, there was no human piloting the car. It was a [[https://www.uber.com/blog/our-road-to-self-driving-vehicles/][self-driving car by Uber]]. These cars are equipped with "360 degree cameras, lasers, and radars". In this case, the car should have recognized the woman as some kind of object that was clearly on a collision course with the path of the car. For lasers and radars, there is no such thing as darkness caused by missing street lighting. They see as clearly as in daylight.

Update 2018-03-29: The car Uber adapted was a Volvo XC90. According to [[https://www.bloomberg.com/news/articles/2018-03-26/uber-disabled-volvo-suv-s-standard-safety-system-before-fatality][recent news, Uber disabled a built-in safety feature that could have prevented the collision or at least initiated an emergency brake]]. What a bummer. Considering this, a very basic software or hardware failure caused the death of this woman. Testing a malfunctioning car like that on public ground within a city is unjustifiable. More people will get killed.

Update 2018-03-29: meanwhile, [[https://www.cnbc.com/2018/03/28/uber-reaches-settlement-with-family-of-victim-killed-by-self-driving-car.html][Uber has reached a settlement with the family of the victim]]. Both parties keep silent about the details. I guess there was enough money for a decent compensation. And Uber was fast enough to settle with the family before more details went public, such as the deliberate disabling of safety features.

*** Where To Test Self-Driving Cars

European self-driving car researchers use a different approach for testing self-driving cars. This technology is clearly not in a state where it can handle typical everyday situations. Not yet. And I, personally, am very skeptical that self-driving cars will be there within the next five to ten years. Therefore, self-driving cars are tested only on certain European highways.
The situation there is quite homogeneous compared to the situation in cities, where there are too many different players: pedestrians, bicyclists, traffic lights, public transportation, and so forth.

*** Self-Driving Cars Will Kill More People

As a matter of fact, it is easy to derive that unless all cars are self-driving ones, we are going to see more traffic accidents similar to this one. Humans cannot take over critical situations as fast as needed, especially when they are bored while the car is driving. And self-driving cars cannot be programmed for every possible situation there is. Therefore, there will be a huge number of situations the computer cannot handle properly.

Uber probably did include a routine to detect bicyclists like the poor woman. They probably did not think of recognizing a bicycle loaded with lots of bags as a bicycle. This might have caused the car not to recognize the object at all. We'll see. I am curious whether or not we are going to read a detailed error report on this situation.

Computers need to recognize and handle situations with airplanes doing an emergency landing on a street. Or people spontaneously changing direction. Or falling advertisements or road signs. Or all kinds of animals on the street. Or huge amounts of water blocking the view of the street. Or wrong-way drivers. Or [[https://www.autoblog.com/2017/08/04/self-driving-car-sign-hack-stickers/?hcid=ab-around-ab-tile-3][maliciously hacked street signs]]. Or outdated map data of all kinds. Or getting into a car chase. Or driving in a city during a power outage. And this list can be continued forever.

Human actors and self-driving cars in combination are much worse than humans alone without self-driving cars. Only when the human factor is completely eliminated from a traffic situation can self-driving cars work properly and therefore reduce traffic accidents by a large degree. This situation is hard to get to. We would have to ban non-self-driving cars and forbid bicyclists and pedestrians where self-driving cars are operated. A trade-off could be that self-driving cars are only allowed on highways but not within inhabited areas.

If we ever face situations where only self-driving cars are operated, we could remove all those ugly traffic signs, traffic lights, and even road markings. Most parking lots could be used by humans again instead of being occupied by waiting cars. At least a few positive aspects of this topic.

*** Comments
:PROPERTIES:
:END:

Mario wrote on Twitter:

#+CAPTION: Tweet by sirlanda where he disagrees with my previous tweet on that.
#+ATTR_HTML: :alt Small disagreement: a driver would have avoided this collision by either turning on hi-beams *or* going slower. Otherwise the driver would be responsible for the collision in that situation. The pedestrian didn’t teleport onto the street, rather walked with a rather low pace.
#+ATTR_HTML: :align center :width 589
[[tsfile:2018-03-23T20.55 Twitter.com - sirlanda - Small disagreement. a driver would have avoided this collision by either turning on hi-beams or going slower. -- screenshots publicvoit.png][https://twitter.com/sirlanda/status/977272580251049986]]

Fair point regarding my comment that a human driver could not have prevented the accident. I don't know why the car did not use its full or upper beam.