this post was submitted on 24 Oct 2023
156 points (98.1% liked)

News

23296 readers
3360 users here now

Welcome to the News community!

Rules:

1. Be civil


Attack the argument, not the person. No racism/sexism/bigotry. Good faith argumentation only; this includes not accusing another user of being a bot or paid actor. Trolling is uncivil and is grounds for removal and/or a community ban. Do not respond to rule-breaking content; report it and move on.


2. All posts should contain a source (url) that is as reliable and unbiased as possible and must only contain one link.


Obvious right- or left-wing sources will be removed at the mods' discretion. We have an actively updated blocklist, which you can see here: https://lemmy.world/post/2246130. If you feel any website is missing, contact the mods. Supporting links can be added in comments or posted separately, but not in the post body.


3. No bots, spam or self-promotion.


Only approved bots, which follow the guidelines for bots set by the instance, are allowed.


4. Post titles should match the title of the source article.


Posts whose titles don't match the source won't be removed automatically, but the autoMod will notify you, and if your title misrepresents the original article, the post will be deleted. If the site changed its headline, the bot might still contact you; just ignore it, and we won't delete your post.


5. Only recent news is allowed.


Posts must be news from the most recent 30 days.


6. All posts must be news articles.


No opinion pieces, listicles, editorials, or celebrity gossip. All posts will be judged on a case-by-case basis.


7. No duplicate posts.


If a source you used was already posted by someone else, the autoMod will leave a message. Please remove your post if the autoMod is correct. If the post that matches your post is very old, we refer you to rule 5.


8. Misinformation is prohibited.


Misinformation / propaganda is strictly prohibited. Any comment or post containing or linking to misinformation will be removed. If you feel that your post has been removed in error, you must provide credible sources.


9. No link shorteners.


The autoMod will contact you if a link shortener is detected; please delete your post if it is right.


10. Don't copy an entire article into your post body


For copyright reasons, you are not allowed to copy an entire article into your post body. This is an instance-wide rule that is strictly enforced in this community.

founded 1 year ago
MODERATORS
 

General Motors' driverless Cruise taxis can no longer operate on California roads without a safety driver, effective immediately.

top 23 comments
[–] [email protected] 13 points 1 year ago (1 children)

My only concern with self-driving vehicles is their inability to respond in emergencies. Ambulances, fire trucks, etc. have all had serious issues with them in California.

[–] [email protected] 13 points 1 year ago (4 children)

Self driving cars have been such a perfect example of something not living up to what it was touted to be. We are way behind in tech, and way ahead in what we’re being sold/told. Meaningful self driving (fully fleshed out level 4 or 5) reality is 10-20 years away.

[–] [email protected] 5 points 1 year ago

Have you ridden in one? I took a few trips in SF and they went very well.

[–] [email protected] 2 points 1 year ago

Nothing is ever what it's touted to be in the beginning. All the movies you see with this stuff working would have gone through what we're going through now in their worlds.

[–] [email protected] 1 points 1 year ago* (last edited 1 year ago) (1 children)

I've concluded either self driving vehicles need to be completely isolated from all the humans on/near public roads doing a bunch of unexpected things, or it would take an actual artificial intelligence to really share the roads with human drivers. And once you have actual artificial intelligence that opens up a whole new set of logistical, ethical, and possibly existential problems.

[–] [email protected] 6 points 1 year ago (1 children)

You mean like put them underground and maybe on rails and then make them bigger to support more people? Like like like a Subway?

[–] [email protected] 1 points 1 year ago* (last edited 1 year ago)

Yeah like a train or subway. Still, trains and subways don't have to deal with traffic conditions but they still are expected to attempt an emergency stop if there is a hazard ahead. Self-driving cars don't even do that reliably.

[–] [email protected] 0 points 1 year ago

There are a lot of benefits for them, but you're right, they aren't all there.

What I could agree to in the meantime is dedicated lanes on freeways and shit. They'd have to have their own infrastructure, and California does that really quick.

[–] [email protected] 3 points 1 year ago* (last edited 1 year ago) (1 children)

Cruise to San Francisco: If you don't stop criticizing us, we're threatening to pull out of your city.

California to Cruise: Actually, you're leaving the state.

[–] [email protected] 3 points 1 year ago (1 children)
[–] [email protected] 1 points 1 year ago

Oof yeah I don't know why I wrote Waymo (twice).

[–] [email protected] 3 points 1 year ago

Why the fuck would anyone sign up for the "safety driver" job?

It would be boring- you wouldn't have much control but also couldn't multitask or do anything else during your shift. Presumably it wouldn't pay all that much because the entire business model of this company is to create a market efficiency on the labor side. Finally there is the outstanding question of how much liability you assume as the "safety driver". If the self-driving features start bugging out, but you don't stop it in time, are you on the hook for damage it causes? I would want a ton of legal liability assurances before I took that job.

[–] [email protected] 2 points 1 year ago* (last edited 1 year ago)

That seemed very obvious from the start. Even if computers don't get distracted or tired, they don't have the intelligence to match even a moderately engaged human driver.

[–] [email protected] 1 points 1 year ago (2 children)

The barrier to entry here should be very, very high. These cars can't just work most of the time or only under typical conditions. Human lives are on the line. When a typical car accident happens, sometimes we may blame the driver or assume that it was a perfect storm and that another driver may have fared differently. But these systems all operate identically, meaning any single failure is indicative of a widespread issue across the entire design in a way that just isn't true of humans.

When things get frenetic, I think most people don't want to be the one sitting in the back of a confused self-driving car. And let's not forget that the only ones really benefitting are companies that no longer need to employ drivers. Personally, I am not willing to jeopardize public safety for a little bit of novelty, nor for companies to save money.

[–] [email protected] 2 points 1 year ago

The barrier to entry here should be very, very high.

No, the barrier should be higher than the average human driver.

[–] [email protected] 1 points 1 year ago

These companies need to stop beta testing on public streets and build their own Hollywood set to go drive on.

[–] [email protected] -1 points 1 year ago (3 children)

Honest question: aren't they already (despite and including their problems) safer than human drivers? The expectation shouldn't be perfection (especially at this early stage), but simply better than us... which shouldn't be hard.

[–] [email protected] 3 points 1 year ago* (last edited 1 year ago) (1 children)

In ideal conditions they are. In less than ideal conditions, they lack the flexibility required to adapt to the situation. Cruise in particular got caught blocking traffic for no good reason, preventing emergency vehicles from getting through.

The best an automated vehicle will do when unsure is stop. A human could at least listen to directions from a person of authority, even if those directions run counter to the rules (e.g. turning around on a one-way street). It's like a reverse of Asimov's laws of robotics.

[–] [email protected] 2 points 1 year ago

I am just remembering when I was still on a learners permit. There was an accident in front of me and a cop instructed me to make what would normally be an illegal turn. I was 16 and remembered the rule "instructions from a police officer overrule any road rule". So I made the turn.

[–] [email protected] 1 points 1 year ago

part of humans learning to drive safely is knowing that flouting traffic laws increases your chance of being stopped, fined, or if you're not the right demographic, worse things. we calibrate our behavior to maximize speed and minimize cops, and to avoid being at-fault in an accident, which is a major hit to insurance rates.

autonomous vehicles can't be cited for moving violations. they're learning to maximize speed without the governor of traffic laws. in the absence of speed and citation data, it's hard to measure how safe they are. there is no systemic incentive for them to care about safety, except for bad press.

[–] [email protected] 0 points 1 year ago

We don't know, because the data isn't being released to the public.