For the 1000th time, Tesla: don't call it "autopilot" when it's nothing more than cruise control that needs constant attention.
It is autopilot (a poor one, but still one) that legally calls itself cruise control so Tesla wouldn't have to take responsibility when it inevitably breaks the law.
It doesn't have to not kill people to be an improvement; it just has to kill fewer people than people do.
True in a purely logical sense, but assigning liability is a huge issue for self-driving vehicles.
As long as there are manual controls, the driver is responsible, since they're supposed to be ready to take over.
The autopilot knows deer can't sue
What if it kills the deer out of season?
Right, most animals can only zoo!
I guess that's the big game ...
Driving is full of edge cases. Humans are also bad drivers who get edge cases wrong all the time.
The real question isn't whether Tesla is better or worse than anyone in particular, but how Tesla compares overall. If a Tesla is better in some situations, worse in others, and so overall just as bad as a human, I can accept it. If Tesla is overall worse, then they shouldn't be driving at all (if they can identify those situations, they can stop and make a human take over). If a Tesla is overall better, then I'll accept a few edge cases where they are worse.
Tesla claims they are better overall, but they may not be telling the truth. One would think regulators have the data to settle this, but they are not talking about it.
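A toy sketch of the overall-versus-edge-case comparison that comment is making, with entirely invented numbers (the scenario mix, both rate tables, and the `overall` helper are all hypothetical; only regulators' real per-scenario data could settle the actual question):

```python
# Hypothetical crashes per million miles, by driving scenario.
human_rate = {"highway": 0.8, "city": 2.0, "deer_at_night": 5.0}
tesla_rate = {"highway": 0.4, "city": 1.5, "deer_at_night": 9.0}  # worse on the edge case

# Hypothetical share of total miles driven in each scenario.
mileage_share = {"highway": 0.60, "city": 0.39, "deer_at_night": 0.01}

def overall(rates: dict[str, float]) -> float:
    """Mileage-weighted crashes per million miles."""
    return sum(rates[s] * mileage_share[s] for s in rates)

print(f"human: {overall(human_rate):.3f} crashes per million miles")
print(f"tesla: {overall(tesla_rate):.3f} crashes per million miles")
```

With these made-up inputs the weighted rate comes out around 1.31 for humans versus about 0.92 for the Tesla: much worse on the rare deer case, still better overall, which is exactly the trade-off being argued about.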
Humans are also bad drivers who get edge cases wrong all the time.
It would be so awesome if humans only got the edge cases wrong.
Tesla claims they are better overall, but they may not be telling the truth. One would think regulators have the data to settle this, but they are not talking about it.
The agency is asking if other similar FSD crashes have occurred in reduced roadway visibility conditions, and if Tesla has updated or modified the FSD system in a way that may affect it in such conditions.
It sure seems like they aren't being very forthcoming with their data, between this and being threatened with fines last year for not providing it. That makes me suspect they still aren't telling the truth.
I think their silence is very telling, just like their alleged crash test data on Cybertrucks. If your vehicles are that safe, why wouldn't you be shoving that into every single selling point you have? Why wouldn't that fact be plastered across every Gigafactory and blaring from every Tesla that drives past on the road? If Tesla's FSD is that good, and Cybertrucks are that safe, why are they hiding those facts?
If the Cybertruck were so safe in crashes, they would be begging third parties to test it so they could smugly lord their third-party-verified crash test data over everyone else.
But they don't, because they know it would be a repeat of smashing the bulletproof window on stage.
Given that they market it as “supervised”, the question only has to be “are humans safer when using this tool than when not using it?”
One of the cool things I’ve noticed since recent updates is the car giving a nudge to help me keep centered, even when I’m not using autopilot.
Tesla’s approach to automotive autonomy is a unique one: Rather than using pesky sensors, which cost money, the company has instead decided to rely only on the output from the car’s cameras. Its computers analyze every pixel, crunch through tons of data, and then apparently decide to just plow into deer and keep on trucking.
I mean, to be honest... if you are about to hit a deer on the road anyway, speed up. Higher chance the scrawny fucker will get yeeted over you after meeting your car, rather than getting juuuuust perfectly booped into the air to crash through the windshield and into your face.
Official advice I heard many times. Prolly doesn't apply if you are going slow.
Edit: Read further down. This advice is effing outdated, disregard. -_- God, I am happy I've never had to put it to the test.
The poster, who pays Tesla CEO Elon Musk for a subscription to the increasingly far-right social media site, claimed that the FSD software “works awesome” and that a deer in the road is an “edge case.” One might argue that edge cases are actually very important parts of any claimed autonomy suite, given how drivers check out when they feel the car is doing the work, but this owner remains “insanely grateful” to Tesla regardless.
How are these people always such pathetic suckers?
I grew up in Maine. Deer in the road isn’t an edge case there. It’s more like a nightly occurrence.
Same in Kansas. I was in a car that hit one in the '80s, and I see them often enough that I had to avoid one crossing a busy interstate highway last week.
Deer are the opposite of an edge case in the majority of the US.
Being a run-of-the-mill fascist (rather than one of those in power) is actually an incredibly submissive position: they just want strong daddies to take care of them and make the bad people go away. It takes courage to be a "snowflake liberal" by comparison.
Only keeping the regular cameras was a genius move to hold back their full autonomy plans
The day he said that "ReGULAr CAmErAs aRe ALl YoU NeEd" was the day I lost all trust in their implementation. And I'm someone who's completely ready to turn over all my driving to an autopilot lol
I roll my eyes at the dishonest, bad-faith takes in the comments about how people do the same thing behind the wheel. Like that excuses self-driving cars doing it too. At least a person can react, slow down, or do something an unthinking, going-by-the-pixels computer can't do on a whim.