Monday, August 01, 2022

Newest scary thing: Self-driving TRUCKS!

Yes, your concerns about self-driving cars crashing are now last week's news. The new sweetness? Self-driving 18-wheelers wiping out on the highway at 65 mph.

On April 6, an autonomously driven truck fitted with technology by TuSimple Holdings Inc. suddenly veered left, cut across the I-10 highway in Tucson, Ariz., and slammed into a cement barricade. The accident, which regulators disclosed to the public in June after TuSimple filed a report on the incident, underscores concerns that the autonomous-trucking company is risking safety on public roads in a rush to deliver driverless trucks to market, according to independent analysts and more than a dozen of the company's former employees. A TuSimple spokesman said safety is a top priority for the company and that nobody was injured in the accident.

What's interesting here is that this truck crashed due to an unforced error that happened at highway speed. The details:

An internal TuSimple report on the mishap, viewed by The Wall Street Journal, said the semi-tractor truck abruptly veered left because a person in the cab hadn't properly rebooted the autonomous driving system before engaging it, causing it to execute an outdated command. The left-turn command was 2 1/2 minutes old—an eternity in autonomous driving—and should have been erased from the system but wasn't, the internal account said.

But researchers at Carnegie Mellon University said it was the autonomous-driving system that turned the wheel and that blaming the entire accident on human error is misleading. Common safeguards would have prevented the crash had they been in place, said the researchers, who have spent decades studying autonomous-driving systems.

For example, a safety driver—a person who sits in the truck to backstop the artificial intelligence—should never be able to engage a self-driving system that isn't properly functioning, they said. The truck also shouldn't respond to commands that are even a couple hundredths of a second old, they said. And the system should never permit an autonomously driven truck to turn so sharply while traveling at 65 miles an hour.
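
None of those safeguards are exotic, either. Here's a rough sketch of what they amount to in code. It's Python, purely illustrative, not TuSimple's actual system, and every name and threshold in it is made up:

import time
from dataclasses import dataclass

@dataclass
class SteeringCommand:
    steering_angle_deg: float   # requested wheel angle
    issued_at: float            # time.monotonic() timestamp when the planner produced it

MAX_COMMAND_AGE_S = 0.02        # anything older than a couple hundredths of a second is stale

def max_steering_angle_deg(speed_mph: float) -> float:
    # Hypothetical speed-dependent limit: the faster the truck is going,
    # the less the system is allowed to turn the wheel.
    if speed_mph >= 65:
        return 2.0
    if speed_mph >= 40:
        return 5.0
    return 30.0

def safe_to_execute(cmd: SteeringCommand, system_healthy: bool, speed_mph: float) -> bool:
    if not system_healthy:
        return False            # never act on a system that didn't come up cleanly
    if time.monotonic() - cmd.issued_at > MAX_COMMAND_AGE_S:
        return False            # never act on a leftover command from before the restart
    if abs(cmd.steering_angle_deg) > max_steering_angle_deg(speed_mph):
        return False            # never yank the wheel hard at highway speed
    return True

Three if-statements. That's the level of sanity checking we're talking about, and by the researchers' account the truck didn't have it.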


Also mentioned is something that has bothered me about all the self-driving shit out there: shouldn't the robotic system instantly release the controls to the human driver? If it was me designing the thing, I'd have a -mechanical- system to make the computer control system stop functioning no matter what it is doing. You grab the wheel and the self-drive actuators are physically disengaged, like a standard transmission is disengaged by putting your foot on the clutch.

They don't have that. Like Tesla, everything is drive-by-wire. The computer decides if it will accept driver input, the driver doesn't decide shit. You can turn that wheel all you want, and if the computer doesn't like it, the car will not turn.
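
If you wanted the software version of my clutch idea, it isn't even hard. Here's a sketch, again Python and again hypothetical (not how Tesla or TuSimple or anyone else actually does it): the instant the sensors feel a hand on the wheel or a foot on a pedal, the system lets go, and it stays let go until a human deliberately switches it back on.

DRIVER_TORQUE_THRESHOLD_NM = 1.5   # made-up number: any real grip on the wheel exceeds this

class AutopilotSupervisor:
    # Hypothetical watchdog: the human always wins, instantly, and it latches off.

    def __init__(self):
        self.engaged = False

    def engage(self, system_healthy: bool) -> None:
        # Turning it back on is a deliberate human action, and only if the stack is healthy.
        if system_healthy:
            self.engaged = True

    def update(self, wheel_torque_nm: float, brake_pressed: bool, throttle_pressed: bool) -> bool:
        # Called every control cycle. Returns whether the computer is allowed to steer.
        driver_is_acting = (
            abs(wheel_torque_nm) > DRIVER_TORQUE_THRESHOLD_NM
            or brake_pressed
            or throttle_pressed
        )
        if driver_is_acting:
            self.engaged = False    # drop out immediately and stay out, like a clutch pedal
        return self.engaged

The exact numbers don't matter. What matters is that the override path exists and the human is always on the winning side of it. Nobody ships it that way.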

Because, according to the thinking of the engineers, the computer is the smart one. The driver is just some schmuck.

This is what the ancient Greeks called hubris. They had a whole goddess to take care of that; her name was Nemesis. She's juuuuust waiting for these guys to put a foot wrong.

Update: Welcome Small Dead Animals! Thanks for the linkage Kate!

2 comments:

Jonathan H said...

Yet again, a company has vehicles driving around with an insufficient safety program, or none at all.
Carnegie Mellon is the gold standard for self-driving vehicles; they've had them for over 30 years. They rarely drive on public roads and don't feel their vehicles are at a high enough standard to sell - NO ONE else has the experience they do, yet these other companies sell and drive publicly...

As I've said before, the rush into this with little to no planning for problems and exceptions WILL end badly and set back the whole concept by years.
I agree - there MUST be a quick way to override the system. I think self-driving vehicles should follow the well-established design requirements for aircraft autopilots at a minimum - and probably more, given the more complicated environment. They need to be designed by people and companies with a history in life-critical systems, not those who have never done more than websites, where an error is a minor annoyance rather than deadly.

The Phantom said...

You know, every single industrial machine I've ever seen has a big fat red button right next to where the operator stands. That is the "OFF!!!" button that stops the machine in its tracks. Most complex machines have a variety of safety switches and interlocks, in addition to the big fat button, to prevent the operators from getting injured.

But an 18 wheeler on auto-drive can take an abrupt hard left at 65 miles per hour. I mean, wouldn't you limit the possible maneuvers the truck can make to ones that can be accomplished at the speed it's going? Doesn't that seem realistic? But they don't.

I'll tell you what, though: I don't think any accident is going to be bad enough or stupid enough to take this crazy train off the rails. They're going to implement this no matter how bad it is and no matter how many people die. For sure. They WANT it and they don't care that it barely works.