Tuesday, March 20, 2018

First Pedestrian Killed by Uber Self-Driving Car


Description of the incident in Tempe, AZ, on Sunday night at 10:00 p.m. (NY Times)

A woman in Tempe, Arizona, was struck and killed Sunday night as she stepped into the street in front of one of Uber's self-driving vehicles. It is believed to be the first pedestrian death caused by an autonomous vehicle. The woman was walking her bike across the road when she was struck by a Volvo XC90 sport utility vehicle traveling 40 mph in a 45 mph zone. Investigators found that the vehicle made no attempt to slow down.

Uber immediately ceased all autonomous car testing after the incident. 

Unfortunately, this incident is not the first fatality related to autonomous cars. In 2016, an Ohio resident died in Florida when his Tesla Model S, operating with the Autopilot feature enabled, traveled underneath a semi-trailer. Tesla noted that it had warned consumers that current software versions are not intended to make the car hands-free. Federal investigators later ruled that Tesla was not at fault in the crash.

Personally, I believe that fatalities of all types, although often preventable, will inevitably occur whether or not a human is in control of the vehicle. Hopefully, in the near future we reach a point at which most people either ride in an autonomous car or use another form of transportation, mitigating the possibility of fatalities on the roadways. However, I believe it is dangerous that tech firms and their programmers have the final say on the code, as the decisions the car makes could run contrary to our own morals.

With the rise of autonomous cars throughout the United States and the enormous leaps and bounds we have made toward a technologically connected world, do you believe that pedestrian fatalities will occur more or less often in the future?

How do you feel about computer programmers making these decisions based on their own ethics (e.g., the trolley dilemma)? Should we regulate what is coded, or not?

Will the roads be safer in the long run with more autonomous cars?

https://www.bloomberg.com/news/articles/2018-03-20/video-shows-woman-stepped-suddenly-in-front-of-self-driving-uber 


15 comments:

Anonymous said...

Although autonomous cars are still new, I'm not gonna say they should be stopped, because it's inevitable that there will be self-driven cars. The coding of the cars should be made more secure before being released to the public. If we give it more time, we'll eventually have self-automated cars in the future.

Anonymous said...

I think that roads will definitely be safer with autonomous cars in the long run. When the vast majority of cars on the road are autonomous, the roads will be safer, but right now and in the near future, I think they will be more dangerous. Instances like this show that a computer cannot predict the unpredictability of humans, whether in cars or on foot. People are much better at judging other people.

Anonymous said...

The accidents caused by human-operated cars still outnumber those of autonomous cars, so it would be reasonable to assume that the roads would be safer with autonomous cars. But that comes with a factor of risk, since all machines can malfunction when put into practice. I do believe that will improve over time, and autonomous cars will gain more safety features through trial and error.

Anonymous said...

I agree with most of the previous comments about the risk involved with self-driving cars, but to provide another opinion: firstly, real-world testing is the best way to improve the technology and code in these cars, so shutting down the programs is a bad move in my opinion. Also, to counter a point that Yuki made, I would argue that people are more likely to "malfunction" than a machine is. This is because people can be distracted by other people, their phones, alcohol, tiredness, and numerous other factors that a machine does not experience. Because of this, I would argue humans are riskier than machines. Building off of this, since there are barely any recorded accidents involving self-driving cars, while there are tons involving human-driven cars, self-driving cars are still safer even though the tech is in a testing phase.

Anonymous said...

I think that the rise of autonomous cars will increase the risk, regardless of whether programmers claim their code is perfect. We can't completely rely on self-driving cars to act entirely on their own; it is more prudent to have a driver behind the wheel to take preventative measures. Of course, in the near future, as society modernizes, the margin of error will decrease. It is tricky to regulate what can and cannot be coded when you don't know the intent of coders and what they seek to put out. Furthermore, limiting coders restricts them from using their intelligence and skills to help modernize and advance our society. However, safety regulations can be implemented to ensure that it isn't blown out of proportion.

Anonymous said...

I agree with Lucas; this event shows that the cars aren't programmed to respond to all of the unpredictable actions humans take. I don't know the details of the crash other than that the woman who was struck stepped into the street while crossing, and that there was a driver in the car at the time even though it was in self-driving mode. Based on these few details, I have a lot of questions about whether the driver was able to or tried to take control of the car, whether the woman checked for cars before crossing, etc. Ultimately, though, as long as we have humans with unpredictable actions, we need to program our cars to respond to those actions.

Anonymous said...

I would have to agree with Matt. While this malfunction does make me skeptical about self-driving cars, it is best to test the cars in the real world. Testing them in the real world will best show how effective they are if we are to implement them on a much larger scale. Also, as Yuki said, this is only a single accident. The number of deaths caused by humans driving cars far outnumbers the number of deaths caused by autonomous cars, though you have to take into account that there are far fewer autonomous cars on the road than normal ones. That being said, I still believe it is unreasonable to stop testing just because of a single accident. We can't eliminate traffic-related deaths simply by putting autonomous cars on the road. There will always be accidents, but I do think that autonomous cars will reduce how often they occur.

Anonymous said...

I definitely agree that these automated cars will probably continue to have malfunctions, because that is just the way technological advancements work. Something makes a mistake, and we learn from the mistake and try to make sure it does not happen again. Testing should not be stopped because of a single instance, and I do understand that automated car incidents have happened before, but these cases are still rare exceptions. A lot of people above made the important point that the only way automated cars are going to become effective and safer than current cars in the long run is through testing them in real-life environments. Although, for safety, we should probably be working on automated cars that still have a human driver present.

Anonymous said...

I think Uber shouldn't stop all their autonomous car testing because of this death. While it is tragic that this death occurred, the technology makes mistakes, and mistakes can be fixed. The only way we will be able to make these cars as close to perfect as possible is through trial and error. I think Uber should continue its autonomous car testing once they fix whatever the program's problem was.

Anonymous said...

I do think that these autonomous cars are a great invention, but I don't know how I would feel if I had no control at all over the car. No matter how good the technology gets, I still think we should be able to take control of the cars in case we see something that they don't. This incident is tragic, but I don't think they should stop the cars. They are still progressing, and it will take some time before they can be perfected. Maybe they could find a way to test them while still keeping everyone as safe as possible. These cars are partly intended to decrease accidents, and it would be a shame if they caused more because they are still being perfected. I think they should keep working on and perfecting them, because I believe they will save a lot of lives in the long run.

Anonymous said...

Autonomous cars are a remarkable invention; they are simply a new wave. However, if I were in a car that I had no control of, I would not feel very safe. The whole point of buying a car is for you to drive it and get places. Autonomous driving takes away much of the purpose of even going to the DMV to get your driver's license. At this time, I would rather stick with regular cars that do not have an autonomous driving feature. I like to drive, and I don't want an AI to take my spot.

Unknown said...

Why was this woman jaywalking in the first place?

Anonymous said...

She trynna get a pay day @frankliu

Anonymous said...

She said time to run up a fat check, but instead, played her life.
The dangers of jaywalking are real. Don't do it. You might end up like her. This is a lesson for everyone.

Arjun Bhattal said...

I believe that autonomous cars will be safer in the long run. This was just a tragic accident. We still have much more experimenting to do with self-driving cars. I believe they will be safer in the future because there will be fewer drunk-driving accidents, fatigue-related accidents, etc. This could greatly reduce death and injury rates.