this post was submitted on 18 Aug 2023
209 points (94.8% liked)

A driverless car in San Francisco drove right into wet concrete and got stuck after seemingly mistaking it for a regular road: 'It ain't got a brain'. The site had been marked off with construction cones and workers stood with flags at each end of the block, according to city officials.

[–] [email protected] 39 points 1 year ago* (last edited 1 year ago) (26 children)

Every time one of these things happens, there's always comments here about how humans do these things too. Two responses to that:

First, human drivers are actually really good at driving. Here's Cory Doctorow explaining this point:

Take the much-vaunted terribleness of human drivers, which the AV industry likes to tout. It's true that the other dumdums on the road cutting you off and changing lanes without their turn-signals are pretty bad drivers, but actual, professional drivers are amazing. The average school-bus driver clocks up 500 million miles without a fatal crash (but of course, bus drivers are part of the public transit system).

Even dopes like you and me are better than you may think – while cars do kill the shit out of Americans, it's because Americans drive so goddamned much. US traffic deaths are a mere one per 100 million miles driven, and most of those deaths are due to recklessness, not inability. Drunks, speeders, texters and sleepy drivers cause traffic fatalities – they may be skilled drivers, but they are also reckless.

There are only a few hundred of these robot taxis, driving relatively few miles, and the problems are constant. I don't know of anyone who has plugged in the numbers yet, but I suspect they look pretty bad by comparison.
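
For what it's worth, here's a minimal sketch of what plugging in the numbers could look like, as a Python back-of-the-envelope. The only figure taken from the quote above is the roughly 1 death per 100 million miles for human drivers; the fleet size, annual mileage, and incident count are placeholder assumptions, not real data.

```python
# Back-of-the-envelope comparison of per-mile rates. The human baseline
# (~1 traffic death per 100 million miles) comes from the Doctorow excerpt
# above; every robot-taxi figure below is a made-up placeholder, since the
# real fleet mileage and incident counts aren't given in this thread.

HUMAN_DEATHS_PER_MILE = 1 / 100_000_000   # quoted: ~1 per 100M miles driven

# Hypothetical robot-taxi inputs (assumptions, not data):
FLEET_SIZE = 300                   # "a few hundred robot taxis"
MILES_PER_TAXI_PER_YEAR = 30_000   # placeholder annual mileage per vehicle
INCIDENTS_PER_YEAR = 50            # placeholder count of reported incidents

fleet_miles = FLEET_SIZE * MILES_PER_TAXI_PER_YEAR
incidents_per_mile = INCIDENTS_PER_YEAR / fleet_miles

print(f"Assumed fleet miles per year:   {fleet_miles:,}")
print(f"Incidents per mile (assumed):   {incidents_per_mile:.2e}")
print(f"Human deaths per mile (quoted): {HUMAN_DEATHS_PER_MILE:.2e}")

# Note: incidents and fatalities are different categories, so this only
# shows the shape of the comparison, not an apples-to-apples rate.
```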

Second, when self-driving cars fuck up, they become everyone else's problem. Emergency service personnel, paid for by the taxpayer, are suddenly stuck having to call corporate customer service or whatever. When a human fucks up, there's also a human on the scene to take responsibility for the situation and figure out how to remedy it (unless it's a terrible accident and they're disabled or something, but that's an edge case). When one of these robot taxis fucks up, it becomes the problem of whoever they're inconveniencing, be it construction workers, firefighters, police, whatever.

This second point is classic corporate behavior. Companies look for ways to convert their internal costs (in this case, the labor of taxi drivers) into externalities, pushing down their costs but leaving the rest of us to deal with their mess. For example, plastic packaging is much, much cheaper for companies than collecting and reusing glass bottles or whatever, but the trash now becomes everyone else's problem, and at this point, there is microplastic in literally every place on Earth.

[–] [email protected] 4 points 1 year ago (3 children)

It's a software update away from getting better. Humans will forever be a risk to other humans when driving. I'm not saying it's good yet, but people in 2020 thought "driverless cars will forever be 5 years away."

Yet here we are, talking about how bad they are. That's an improvement over the limited testing we had only a few short years ago.

[–] [email protected] 6 points 1 year ago* (last edited 1 year ago) (1 children)

No, people in 2014 kept saying driverless cars would be 5 years away. And they kept pushing it back: "no, wait, 5 more years." It's 2023, and they still say it's "5 more years", or they pretend it's already here and these cars have no problems whatsoever.

Actual, real Level 5 automated driving is not here and will take at least 20 years to arrive; probably 50 years, realistically. What Cruise is doing is the same thing Elon Musk does with Teslas: call it a "self-driving car" when it's anything but.

These cars can follow lines and pretend to drive. They can't actually drive, and they can't handle any of the edge cases. Their handlers completely ignore the accidents and mistakes they make on a daily basis. They brush aside the fatalities and blame them on everything except themselves, practicing a healthy dose of whataboutism when they compare their mistakes to those of humans.

[–] [email protected] 1 points 1 year ago (1 children)

What do you mean? If they're still 5 years away, then what was it that got stuck in the concrete?

[–] [email protected] 1 points 1 year ago

The 80/20 rule. That last 20% of driving is the part that actually fucking matters.
