Does new technology make traffic safer or more dangerous?

There are more transportation options in urban areas than ever before, from ride-hailing services like Uber and Lyft to the shared bikes and scooters filling the streets. A new wave of transportation technology is supposed to make travel easier and safer. But along the way, as futuristic cars, e-bikes and scooters hit the road for the first time, travel could temporarily become more dangerous.

The future of mobility is starting to look like a public health hazard. Researchers from the University of California and Stanford University ran a year-long survey of emergency departments at two California hospitals and found 249 scooter-related injuries. And amid reports of sexual assaults and of people impersonating its drivers, Uber has had to keep rolling out new tools to demonstrate its commitment to safety.

Last year, when people talked about autonomous driving, they mostly thought of crashes, thanks to heavily publicized accidents and one pedestrian fatality.

But it turns out that some features on even ordinary new cars are not necessarily reliable. According to a new study by the American Automobile Association, reported in the Wall Street Journal, pedestrian detection is one of several new driver-assistance tools, alongside automatic emergency braking, lane keeping, adaptive cruise control, limited autonomous driving and collision warning, and it often fails to work when it is needed most.

Normally, the logic goes, that's fine because there's a driver behind the wheel. But assistive features have created a new problem: drivers become too dependent on them and stop paying attention to the road.

No company has felt the pain as deeply as Tesla, which sells cars with a semi-autonomous driving technology it calls Autopilot. Tesla vehicles have been involved in several highly publicized crashes so far. CEO Elon Musk has said all of these accidents were the fault of careless drivers. That may be true, but it doesn't capture the complexity of the situation. In 2018, a Tesla crashed into a fire truck parked on the highway. The car alerted the driver less than a second before the crash, but did not brake hard. Is 0.49 seconds enough time for a person to react to a collision he isn't even aware of? And what happened to the emergency braking?

It was once believed that self-driving cars would be our savior. In 2015, we had reason to believe that by 2050, self-driving cars would reduce car crashes by 90% and save thousands of lives each year. Some researchers are so convinced of the ultimate benefits of self-driving machines that one of the few safety concerns they raise revolves around the Trolley Problem: in an unavoidable crash, should a self-driving car act to save as many people as possible, or should it prioritize its own passengers?

"In some cases, a car knows it's about to crash and is planning how it will crash," said Andrew Moore, dean of Carnegie Mellon University's School of Computer Science, in a conversation with Adrienne LaFrance three years ago. The engineers who write the code that handles collisions will face incredibly detailed scrutiny. Is the car trying to save its owner, or to save others?
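To make the dilemma concrete, here is a minimal, purely hypothetical sketch of what such crash-planning logic might look like if it were reduced to a cost function. The maneuvers, the harm estimates, and the occupant_weight parameter are illustrative assumptions, not anything any manufacturer has disclosed.

```python
from dataclasses import dataclass

@dataclass
class Maneuver:
    """One option the planner could take in an unavoidable crash (hypothetical)."""
    name: str
    expected_occupant_harm: float   # 0.0 (none) .. 1.0 (severe)
    expected_bystander_harm: float  # 0.0 (none) .. 1.0 (severe)

def choose_maneuver(options: list[Maneuver], occupant_weight: float = 1.0) -> Maneuver:
    """Pick the maneuver with the lowest weighted expected harm.

    occupant_weight encodes the ethical choice discussed above: values above
    1.0 favor the car's own passengers, values below 1.0 favor people outside
    the car. The number itself is the controversy.
    """
    def cost(m: Maneuver) -> float:
        return occupant_weight * m.expected_occupant_harm + m.expected_bystander_harm

    return min(options, key=cost)

if __name__ == "__main__":
    options = [
        Maneuver("brake in lane", 0.6, 0.3),
        Maneuver("swerve toward barrier", 0.7, 0.1),
        Maneuver("swerve toward sidewalk", 0.2, 0.9),
    ]
    # The "right" answer flips with the weight, which is exactly the Trolley Problem.
    print(choose_maneuver(options, occupant_weight=1.0).name)  # swerve toward barrier
    print(choose_maneuver(options, occupant_weight=3.0).name)  # swerve toward sidewalk
```

With equal weighting the sketch sacrifices the car's occupants against the barrier; when occupants are weighted more heavily, it protects them at the expense of people on the sidewalk. Nothing about the code is hard; what draws scrutiny is the single number that tips the decision.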

But so far, accidents involving autonomous driving have had nothing to do with trying to save others. They are simply system failures. In one case, one of Uber's self-driving cars failed to recognize a pedestrian crossing a two-lane street at night. Of course, humans are also part of the problem. A look at California's self-driving accident records shows that people like to rear-end self-driving cars, and even scooters have hit the rear of self-driving cars. We just don't seem to get along with them. But short of pulling every human-driven car off the road and replacing it with a self-driving one, self-driving car companies will have to account for humans along the way.

Early self-driving pilots have been worrying enough that the companies building these cars have pushed back their timelines. In April, Ford CEO Jim Hackett said in a talk at the Detroit Economic Club that the industry had overestimated the arrival of self-driving cars.

The gradual increase of self-driving cars on the road is likely to increase crashes. A 2015 University of Michigan paper made this point: during the transition period, when conventional and autonomous vehicles share the road, safety may actually worsen, at least for the conventional vehicles. That conclusion may sound predictable, but the paper also found that self-driving cars are not necessarily better drivers than experienced middle-aged humans.

That's not to say we shouldn't try to build great technology that ultimately makes the world safer. On college campuses and in select cities like Washington, D.C., Las Vegas, and Detroit, pilot programs are running self-driving shuttles on limited, low-speed routes. Whether it's a scooter or a self-driving bus, experimentation gives these technologies room to fail, learn, and improve until they finally succeed.

But how much should people be asked to sacrifice in the name of progress? And how much trial and error do we really need before the payoff arrives? Legislation could help here, by guiding how new modes of transportation reach the streets. Because giants and startups alike will keep doing what they do: launch, scale, iterate, and ultimately change the way we live.
