this post was submitted on 27 Apr 2024
38 points (95.2% liked)

Technology


[–] [email protected] 16 points 6 months ago (2 children)

According to the math in this video:

  • 150,000,000 miles have been driven with Tesla's "FSD", which works out to
  • 375 miles per Tesla purchased with FSD capability
  • 736 known FSD crashes with 17 fatalities
  • roughly 11.3 deaths per 100M miles of Tesla's FSD

Doesn't sound too bad, until you hear that human drivers produce about 1.35 deaths per 100M miles driven...

It's rough math, but holy moly, that is already a completely different class of deadly than a non-FSD car.
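The rough math above can be checked with a quick script. All the input numbers are the commenter's claims from the video, not independently verified figures:

```python
# Sanity-check the comment's rates (inputs are claims from the video, unverified).
FSD_MILES = 150_000_000   # claimed total miles driven on Tesla's FSD
FSD_DEATHS = 17           # claimed fatalities across known FSD crashes
HUMAN_RATE = 1.35         # cited human-driver deaths per 100M miles

fsd_rate = FSD_DEATHS / FSD_MILES * 100_000_000
print(f"FSD: {fsd_rate:.1f} deaths per 100M miles")      # -> FSD: 11.3 deaths per 100M miles
print(f"Ratio vs human: {fsd_rate / HUMAN_RATE:.1f}x")   # -> Ratio vs human: 8.4x
```

Note the 11.3 figure is a rate per 100M miles, not a multiple of the human rate; the actual multiple under these numbers comes out around 8.4x.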

[–] [email protected] 3 points 6 months ago (1 children)

a human produces 1.35 deaths per 100M miles driven

My car has been driven around 100k miles by a human, i.e. it has produced 0.00135 deaths. Is that like a third of a pinky toe?

[–] [email protected] 4 points 6 months ago

Yeah, another 900k, and you'll be ded.

[–] [email protected] 6 points 6 months ago (1 children)

I just read a LinkedIn post from a laid-off Tesla engineer.

He said "I checked my email while auto piloting to work".

The employees know its capabilities better than anyone, and they still take the same stupid risk.

[–] [email protected] 5 points 6 months ago* (last edited 6 months ago) (1 children)

Obviously the time to react to the problem was before the system told you about it, that's the whole point, THE SYSTEM IS NOT READY. Cars are not ready to drive themselves, and obviously the legal system is too slow and backwards to deal with it so it's not ready either. But fuck it let's do it anyway, sure, and while we're at it we can do away with the concept of the driver's license in the first place because nothing matters any more and who gives a shit we're all obviously fucking retarded.

[–] [email protected] 1 points 6 months ago (1 children)
[–] [email protected] 2 points 6 months ago

Yep, and even then it is very limited in when and where you can use it at this point.

Level 4 is the general use “high autonomy” vehicle, and while a few robotaxis and shuttles are able to do it, no regular car has it yet.

[–] [email protected] 2 points 6 months ago (23 children)

I’ve often wondered why the FTC allows it to be marketed as “Full Self-Driving”. That’s blatant false advertising.

[–] [email protected] 1 points 6 months ago* (last edited 6 months ago) (1 children)

You can literally type in an address and the car will take you there with zero input on the driver's part. If that's not full self-driving then I don't know what is. What FSD was capable of a year ago and how it performs today is completely different.

Not only do these statistics include the far less capable older versions of it, they also include accidents caused by Autopilot, which is a different system from FSD. They also fail to mention how the accident rate compares to human drivers.

If we replaced every single car in the US with a self-driving one that's a 10x safer driver than your average human, you'd still get over 3,000 deaths a year from traffic accidents. That's 10 people a day. If someone wants to ban these systems because they're not perfect, that means they'd rather have 100 people die every day instead of 10.
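The arithmetic in that hypothetical can be sketched as below. The ~36,500 annual baseline is an assumption chosen to match the comment's "10 people a day" figure, not an official statistic:

```python
# Sketch of the "10x safer" hypothetical from the comment above.
US_ANNUAL_TRAFFIC_DEATHS = 36_500  # assumed human-driver baseline (illustrative)
SAFETY_FACTOR = 10                 # the hypothetical "10x safer" fleet

self_driving_deaths = US_ANNUAL_TRAFFIC_DEATHS / SAFETY_FACTOR
print(self_driving_deaths / 365)            # -> 10.0 deaths per day with the safer fleet
print(US_ANNUAL_TRAFFIC_DEATHS / 365)       # -> 100.0 deaths per day at the human baseline
```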

[–] [email protected] 1 points 6 months ago

It also fails to mention how the accident rate compares to human drivers.

That may be because Tesla refuses to publish proper data on this, lol.

Yeah, they claim it's ten times better than a human driver, but none of their analysis methods or data points are available to independent researchers. It's just marketing.

[–] [email protected] 2 points 6 months ago

Move fast, break shit. Fake it till you sell it, then move the goal posts down. Shift human casualties onto individual responsibility, a core libertarian theme. Profit off the lies because it's too late, money already in the bank.

[–] [email protected] 1 points 6 months ago (5 children)

I love to hate on musky boi as much as the next guy, but how does this actually compare to vehicular accidents and deaths overall? CGP Grey had the right idea when he said they didn't need to be perfect, just as good as or better than humans.

[–] [email protected] 1 points 6 months ago* (last edited 6 months ago) (1 children)

Grey had the right idea when he said they didn't need to be perfect, just as good as or better than humans.

The better question - is Tesla's FSD causing drivers to have more accidents than other driving assist technologies? It seems like a yes from this article and other data I've linked elsewhere in this thread.

[–] [email protected] 1 points 6 months ago

I appreciate this response amongst all the malding! My understanding of the difference in assistive technologies across different companies is lacking, so I'll definitely look more into this.

[–] [email protected] 1 points 6 months ago (4 children)

CGP Grey also seems to believe that self-driving cars without traffic lights are the solution to traffic, as opposed to something like trains.

[–] [email protected] 1 points 6 months ago (1 children)

A comment above points to nearly an 11x increase over human-caused fatalities.

[–] [email protected] 1 points 6 months ago (1 children)

And the pedestrian emergency brake on Tesla cars, and on many other cars with that feature, will sometimes malfunction, causing people behind you to rear-end you.

[–] [email protected] 4 points 6 months ago

Yeah but that's usually the fault of the driver behind you. They're too close, should've left more distance for emergency braking.

[–] [email protected] 1 points 6 months ago (1 children)

They just recalled all the Cybertrucks, because their 'smort' technology is too stupid to realize when an accelerator sensor is stuck...

[–] [email protected] 1 points 6 months ago* (last edited 6 months ago)

The accelerator sensor doesn't get stuck; the pedal does. The face of the accelerator pedal falls off and wedges the pedal in the down position.

[–] [email protected] 1 points 6 months ago (3 children)

Fuck cars, those ones specifically
