this post was submitted on 20 Dec 2023
754 points (98.0% liked)
Technology
you are viewing a single comment's thread
The technology will never be ready if you don't test it.
And I would argue we DON'T need warning lights since, while imperfect, most self-driving tech is already vastly better than your average driver. We should have warning lights for cars that DON'T have self-driving.
This is ultimately why we will NEVER have self-driving cars en masse, because society isn't willing to take the necessary risks to improve the safety of everyone on the road.
How about we:
Why is there this constant false dichotomy implying that the only way to test self driving cars is a wild west of no regulation?
And also, who said that self-driving cars are safer than humans? Tesla's numbers are all statistical lies (in fact, Teslas were recently shown to have the most accidents), Cruise just shut down in SF because they were a liability, and Waymo is heavily limited in the times, weather, and areas it can drive in.
At some point you need to test it on a large scale. Cruise was even running small-scale and was shut down in short order.
We do.
There isn't.
...everyone?
https://arstechnica.com/cars/2023/12/human-drivers-crash-a-lot-more-than-waymos-software-data-shows/
[Citation needed]
This is actually a great example of exactly what I'm talking about: GM will shut down Cruise permanently because they've discovered what I just said. Society has zero tolerance for literally anyone getting hurt by autonomous vehicles, whereas the tens of thousands of people killed on our roads every year by individual drivers are considered acceptable.
I did see the story about Teslas having the most crashes pass by on my news feed too. But the fact that Teslas have self-driving and Teslas crash the most doesn't mean the self-driving tech is the reason for it. Correlation does not imply causation.
You literally just presented that false dichotomy in a previous comment. Don't try to gaslight us.
I literally presented zero dichotomies of any kind; don't try to strawman us.
The refrain of the tech CEO demanding we allow it free rein as a test.
Funny how you quote me and then immediately misquote me.
You said that in response to an article about jumping past the testing phase. Go read the article.
Yeah no I didn't say that either. Keep trying though.
I guess you just never said anything huh?
Sure. But we're jumping into the deep end by legally allowing the driver to be exempt from distracted driving laws. There's a big difference between testing the technology and relying on the technology.
Can you cite the legislation that exempts drivers using driver assistance systems from paying attention while driving?
No one should be relying on the technology.
California, Nevada, and Germany all have laws for it. The article this comment section is based on specifically mentions California and Nevada.