this post was submitted on 11 Jul 2023
544 points (96.4% liked)
Technology
you are viewing a single comment's thread
That's one way of strawmanning your way out of a discussion.
It's not a strawman argument, it is a fact. Without the ability to audit the entire codebase of a self-driving car, there's no way to know whether the manufacturer has knowingly hidden something in the code that could cause accidents. The fatalities already linked to faults in self-driving technology are too numerous to recount, and too important to ignore.
I was actually trying to find an article I'd read about Tesla's self-driving software reverting to manual control moments before impact, but my search was flooded with fatality reports.
We can't audit the code for humans, but we still let them drive.
If the accident rate for computer drivers is lower than for humans, and the computer's designers are held as financially liable for crashes as human drivers are, why shouldn't we let computers drive?
I'm not fully in either camp in this debate, but FWIW, the humans we let drive generally suffer consequences if they cause an accident through their own negligence.
Also, we do audit them: it's called a license. I know it's super easy to get one in the US, but in other countries the requirements can be quite stringent.
And I'm not denying it. However, the bar for convicting someone of vehicular manslaughter is very high, and it usually requires evidence that the driver was grossly negligent.
If you can show that a computer can drive as well as a sober human, where is the gross negligence?