On April 6, an autonomously driven truck fitted with technology by TuSimple Holdings Inc. suddenly veered left, cut across the I-10 highway in Tucson, Ariz., and slammed into a cement barricade.

The accident, which regulators disclosed to the public in June after TuSimple filed a report on the incident, underscores concerns that the autonomous-trucking company is risking safety on public roads in a rush to deliver driverless trucks to market, according to independent analysts and more than a dozen of the company's former employees.

A TuSimple spokesman said safety is a top priority for the company and that nobody was injured in the accident.
An internal TuSimple report on the mishap, viewed by The Wall Street Journal, said the semi-tractor truck abruptly veered left because a person in the cab hadn't properly rebooted the autonomous driving system before engaging it, causing it to execute an outdated command. The left-turn command was 2 1/2 minutes old—an eternity in autonomous driving—and should have been erased from the system but wasn't, the internal account said.
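TuSimple hasn't released the software involved, but the internal account describes a classic stale-state bug: a queued command that should have been erased survives an improper restart and is replayed. The following minimal Python sketch, with entirely hypothetical names and timings, illustrates that failure mode:

    import time
    from collections import deque

    class AutonomyComputer:
        # Hypothetical controller, not TuSimple's code; it only shows how
        # a command queue that survives an improper restart can replay a
        # long-outdated instruction.
        def __init__(self):
            self.command_queue = deque()  # entries: (issued_at, steering_angle_deg)

        def plan_turn(self, steering_angle_deg, issued_at=None):
            # The planner enqueues a timestamped steering command.
            self.command_queue.append((issued_at or time.time(), steering_angle_deg))

        def improper_reboot(self):
            # BUG: a restart path that re-engages autonomy without clearing
            # state. A correct reboot would call self.command_queue.clear().
            pass

        def engage(self):
            # Executes whatever is queued, with no freshness check, so a
            # minutes-old left-turn command runs as if it were new.
            while self.command_queue:
                issued_at, angle = self.command_queue.popleft()
                age = time.time() - issued_at
                print(f"steering to {angle:+.1f} deg (command is {age:.0f}s old)")

    # Sequence loosely reconstructed from the report (timings illustrative):
    truck = AutonomyComputer()
    truck.plan_turn(-30.0, issued_at=time.time() - 150.0)  # issued 2.5 min ago
    truck.improper_reboot()  # person re-engages without a proper reboot
    truck.engage()           # replays the stale hard-left command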
But researchers at Carnegie Mellon University said it was the autonomous-driving system that turned the wheel and that blaming the entire accident on human error is misleading. Common safeguards would have prevented the crash had they been in place, said the researchers, who have spent decades studying autonomous-driving systems.
For example, a safety driver—a person who sits in the truck to backstop the artificial intelligence—should never be able to engage a self-driving system that isn't properly functioning, they said. The truck also shouldn't respond to commands that are even a couple of hundredths of a second old, they said. And the system should never permit an autonomously driven truck to turn so sharply while traveling at 65 miles an hour.
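Those three safeguards map onto straightforward pre-execution checks. Here is a hedged sketch in Python, with thresholds that are purely illustrative rather than industry standards:

    import time

    # Illustrative thresholds; real systems derive these from vehicle
    # dynamics and formal safety cases.
    MAX_COMMAND_AGE_S = 0.02  # reject commands more than ~20 ms old

    def max_steering_angle_deg(speed_mph):
        # Speed-dependent steering limit: the faster the truck travels,
        # the smaller the permitted wheel angle (illustrative curve).
        if speed_mph <= 5.0:
            return 45.0
        return max(1.5, 45.0 * 5.0 / speed_mph)

    def safe_to_execute(system_healthy, command_issued_at,
                        requested_angle_deg, speed_mph):
        # Safeguard 1: never engage a system that isn't functioning properly.
        if not system_healthy:
            return False
        # Safeguard 2: reject stale commands, even hundredths of a second old.
        if time.time() - command_issued_at > MAX_COMMAND_AGE_S:
            return False
        # Safeguard 3: never allow a sharp turn at highway speed.
        if abs(requested_angle_deg) > max_steering_angle_deg(speed_mph):
            return False
        return True

    # A 2.5-minute-old hard-left command at 65 mph fails two of the checks:
    print(safe_to_execute(True, time.time() - 150.0, -30.0, 65.0))  # False

In a real vehicle, a failed check would presumably also trigger a safe fallback, such as alerting the safety driver or disengaging to manual control, rather than simply discarding the command.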
Comments:
Yet again, a company has vehicles driving around with an insufficient safety program, or none at all.
Carnegie Mellon is the gold standard for self-driving vehicles; they've had them for over 30 years. They rarely drive on public roads and don't feel their vehicles are at a high enough standard to sell. NO ONE else has the experience they do, yet others are selling and driving on public roads...
As I've said before, rushing into this with little to no planning for problems and exceptions WILL end badly and set back the whole concept by years.
I agree, there MUST be a quick way to override the system. I think self-driving vehicles should follow the well-established design requirements for aircraft autopilots at a minimum, and probably stricter ones for the more complicated road environment. They need to be designed by people and companies with a history in life-critical systems, not by those who have never built more than websites, where an error is a minor annoyance rather than deadly.
You know, every single industrial machine I've ever seen has a big fat red button right next to where the operator stands. That is the "OFF!!!" button that stops the machine in its tracks. Most complex machines have a variety of safety switches and interlocks, in addition to the big fat button, to prevent the operators from getting injured.
But an 18-wheeler on auto-drive can take an abrupt hard left at 65 miles per hour. I mean, wouldn't you limit the possible maneuvers the truck can make to ones that can be safely accomplished at the speed it's going? Doesn't that seem reasonable? But they don't.
I'll tell you what, though: I don't think any accident is going to be bad enough or stupid enough to take this crazy train off the rails. They're going to implement this no matter how bad it is and no matter how many people die. For sure. They WANT it, and they don't care that it barely works.