Self-Driving Vehicle Claims Its First Pedestrian in a Fatal Crash in Arizona

An autonomous Uber vehicle has killed a pedestrian in Arizona while in self-driving mode; the woman later died at a hospital


An Arizona woman was hit by a self-driving Uber-owned Volvo in what appears to be the first fatal crash involving an autonomous vehicle and a pedestrian.

Tempe police have confirmed that the vehicle was in autonomous mode when it struck the woman, although there was a human safety driver behind the wheel.

“Our hearts go out to the victim’s family. We are fully cooperating with local authorities in their investigation of this incident,” Uber said in a tweeted statement.

The San Francisco-based ride-hailing company said it was suspending all operations involving its self-driving vehicles in Phoenix, Pittsburgh, San Francisco and Toronto with immediate effect.

Uber CEO Dara Khosrowshahi expressed regret over the loss of life, saying that the company’s thoughts were with the victim’s family and that Uber was cooperating with the Tempe authorities in their investigation.

“Some incredibly sad news out of Arizona. We’re thinking of the victim’s family as we work with local law enforcement to understand what happened,” Khosrowshahi said on his Twitter page.

Police said that 49-year-old Elaine Herzberg was crossing the road outside of a crosswalk when she was hit by the Volvo in Tempe, Arizona, at around 10 p.m. on Sunday.

“The vehicle was traveling northbound … when a female walking outside of the crosswalk crossed the road from west to east when she was struck by the Uber vehicle,” said the police statement.

Sgt. Ronald Elcock, who had viewed footage of the crash that has not yet been released to the public, said at a news conference on Monday that the Uber Volvo was traveling at about 40 mph when it hit Herzberg.

Rafael Vasquez, 44, the safety driver behind the wheel at the time, was unhurt and was cooperating with law enforcement officials.

While Elaine Herzberg may be the first pedestrian killed by an autonomous vehicle, she is not the first person to die in a crash involving one.

Back in 2016, a man was killed at the wheel of a Tesla when he failed to heed the vehicle’s alerts to take back control of the car, which was in Autopilot mode when it crashed into a truck that had cut across its path.

Sunday’s accident will, and not for the first time, set skeptical tongues wagging over the practicality of self-driving ventures, which their makers all but guarantee would save many lives on the roads.

Their argument, and a valid one at that, is that self-driving vehicles would take human error, such as drunk driving or dozing off at the wheel, out of the equation altogether.

It remains to be seen, however, how soon the makers can achieve the level of reliability needed to make the technology a viable and safe option.

“There will no doubt be an exhaustive investigation,” noted Akshay Anand, an analyst at Kelley Blue Book.

“It’s clear that this has the potential to severely impact public perceptions of autonomous technology, and should be handled with utmost prudence by regulators, authorities and the industry alike,” he said.

John M Simpson, privacy and technology project director at Consumer Watchdog, said the collision highlighted the need for tighter regulations governing the still-maturing technology.

“The robot cars cannot accurately predict human behavior, and the real problem comes in the interaction between humans and the robot vehicles,” Simpson said.

The non-profit consumer and taxpayer advocacy group has demanded a national moratorium on autonomous vehicle testing on public roads, at least until all investigations are complete and conclusions have been reached.

“There should be a national moratorium on all robot car testing on public roads until the complete details of this tragedy are made public and are analyzed by outside experts so we understand what went so terribly wrong,” said Simpson.

“Arizona has been the wild west of robot car testing with virtually no regulations in place.  That’s why Uber and Waymo test there. When there’s no sheriff in town, people get killed,” he added.

According to Consumer Watchdog, the Arizona accident highlights areas of concern in autonomous vehicle technology, at least insofar as the vehicles’ ability to safely deal with pedestrians, cyclists and other self-driving or human-driven vehicles is concerned.

“If robot cars are already killing people even with the presence of a human driver in the car, how lethal are these technologies going to be next month when they will roam public roads without a human onboard ready to take control?” asked Sahiba Sindhu, a consumer advocate at Consumer Watchdog.

“Disengagement reports” released by twenty companies this year reveal that, at best, robot cars need human intervention once every 5,596 miles, and most cannot make it beyond a few hundred miles without a human having to take over the driving, says Consumer Watchdog.

Despite these reports, the California DMV has gone ahead and given its nod to robot cars hitting the roads without a human safety driver behind the wheel as early as next month.

An email conversation between former Uber CEO Travis Kalanick and robot car developer Anthony Levandowski, revealed in the recent Uber-Waymo court case, clearly shows how little regard the parties involved had for safety standards.

“I just see this as a race and we need to win, second place is first loser,” reads one email sent by Levandowski in March 2016. In another message sent the same day, he writes, “We do need to think through the strategy to take all the shortcuts we can find.”

Simpson says that public roads are not Uber’s private laboratories and that the company cannot go on endangering people’s lives; safety standards and regulations need to be put in place and met before such tests are allowed.

“Uber simply cannot be trusted to use public roads as private laboratories without meaningful safety standards and regulations,” Simpson said.

One is forced to conclude that the two fatalities indicate that a great deal of fine-tuning of autonomous vehicle technology is still required before we can trust it with our lives.

Mr. Khosrowshahi, Mr. Musk and others have some serious work ahead of them before autonomous vehicles can come close to becoming a full-fledged reality on our streets and highways.
