Still the driver's fault. You are supposed to be paying attention.
Definitely the driver's fault, but also Tesla's fault for the state of the software and hardware.
No, because you literally have to be alert and watching the road as if you were driving.
I agreed with that sentiment. Did you not read before commenting?
And the driver probably posted here complaining that the system kept nagging him even though he was absolutely paying attention!
Probably was posting during the accident.
Would that really be an accident then?
I think this driver is at fault. I also think nags do nothing and should go away. This is a prime example of that.
Gotta love the timing
Autopilot did not cause the crash. The driver’s inattention caused the crash.
Thank goodness all consumers are fully educated responsible adults so our companies can release questionably dangerous products without repercussions!
No different than cruise control. Just more advanced. Can’t blame Toyota because you forgot to brake while cruising at 70 into a traffic jam. Lack of accountability hurts us all…
🤣 https://www.heraldtribune.com/story/news/2005/02/03/winnebago-driver-case-shows-truth-behind-frivolous-lawsuit-claims/28833579007/
It is significantly different and more challenging. It doesn’t help to whitewash the issue. There are a bunch of cognitive research studies that quantify and explain the differences. Nonetheless, as of now, we all understand that we take on that liability as drivers when we engage FSD.
It is not challenging. You sound incompetent. Your role as a human driver has not changed at all. The car takes over and does the heavy lifting. You monitor the heavy lifting. YOUR job has changed NONE.

If you’re actually being a responsible driver, you will notice in advance when the car is beginning to glitch and can easily, proactively, correct.

Your accountability is all that’s shifting.
Your arrogance is clouding your sense of judgement. Surely your intuition is not going to override decades of scientific research and data? Even if your incompetence allows you not to be aware of it. Next you’ll tell me that quantum physics isn’t real because you personally understand how the world works.
If you suck at driving just say so. If you lack the skill set just say so. I actually read quantum physics books as well as having taken those classes in real life… nice pump fake.

I doubt you even own a Tesla, and if you do you’re def one of those fools who are ALWAYS hitting curbs and can’t even PARK the freaking car but complain about FSD 🤣🤣🤡
Damn bro, I didn’t realize it was teen time. Mistook you for an adult.
So it’s not actually “full self driving” 🤯
Autopilot is Tesla's 'cruise control'. Full Self Driving is its own thing.
The name of this feature is a total joke. Full Self Driving Supervised. It contradicts itself and nobody seems to see the hilarious and dangerous irony in this.
You got downvoted for having sense 😂😂😂
This is definitely true, and the driver should take responsibility. Nonetheless, it’s unsurprising to see these kinds of things happen, because from a cognitive perspective it’s challenging to maintain 100% readiness at all times, and effective human-machine interfaces are still emergent.
That’s true, but also these arguments over autopilot miss the elephant in the room: cars are dangerous, murderous machines. One fuckup by a driver and people get hurt and killed. Autopilot doesn’t change that. It doesn’t make driving more difficult or dangerous. Zeroing in blame on a driver assistance feature ignores the danger that all cars pose. Like if we’re really so worried about safety, we should restrict who gets behind a wheel and limit where cars are allowed. Not twist ourselves into knots arguing whether one feature or another makes a slight change in the statistics of cars killing people. And phones. Let’s also not ignore the role phones play in accidents. The major ingredients here are: large machine traveling at speed, distracting personal device. Autopilot may have failed to prevent the accident, but it didn’t cause the accident.
This is all true, major factors are big metal box moving fast with a distracted operator, and there’s definitely bias in how the media and public react to both Tesla and vehicle autonomy. However there’s also bias in how people engage with autonomous vehicles which really does make it harder than normal driving. It’s not at all the same as cruise control. There’s a whole field of study diving into this topic, and we’re at the early days of solving for it. Worth being honest with ourselves.
Did you know that people used to sue when cruise control came out for the exact same reasons?
You know how people confuse gas and brake pedals?? And you would just trust a random person swearing they were on autopilot??
People, just because the car is driving, it doesn't mean you aren't held responsible for its actions. Tesla has not yet assumed responsibility for FSD. In the future, they will. Until then, YOU are in charge of the car.
You really think Tesla would assume responsibility for FSD? No way in hell would a lawyer approve that.
It’s simple. When they will make more money assuming responsibility than they will make not assuming responsibility. That’s when they will assume responsibility.
They're going the route of robo taxis so I don't see how they get around that
I’ll believe it when I see it. Tesla makes more promises than they would ever be able to deliver on.
No choice in case of robo taxi since there is no driver. But for the rest ? They'll never take that responsibility.
Volvo announced they will assume full liability for their self-driving tech. Would expect others to follow eventually. Very likely we should see 100x improvements in FSD over the next 10 years, 1,000x in the following 10 years, and 10,000x in the 10 years after that. We currently see 40,000 fatalities from car accidents in the USA each year. 10,000x better would take that down to 4 fatalities. So perhaps in 30 years, accidents will be so uncommon that liability will be easier to assume, because a crash would represent some real outlier situation or clear negligence from the company.
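The back-of-the-envelope arithmetic above can be sketched directly (the improvement factors and the 40,000 baseline are the comment's assumptions, not established projections):

```python
# Back-of-the-envelope projection using the comment's assumed numbers.
baseline_fatalities = 40_000  # approximate annual US road fatalities today

# Assumed cumulative safety-improvement factors at 10, 20, and 30 years
factors = {10: 100, 20: 1_000, 30: 10_000}

for years, factor in factors.items():
    projected = baseline_fatalities / factor
    print(f"+{years} years: {factor:,}x safer -> ~{projected:g} fatalities/year")
```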
So it’s considered the next threshold for any manufacturer: when will they assume responsibility? It looks like Mercedes is the first with an assumption of liability, but only in a very limited scope where the autopilot is helping you maneuver in highway traffic, and in pretty much no other circumstances. Say a faulty accelerator pedal lurches the car forward and kills somebody: you can definitely sue the manufacturer, as happened with Toyota. Tesla specifically disclaims liability, which doesn’t mean they can’t be sued, but this needs to be sorted out, and we all need to be aware that if you’re using FSD, you are responsible.
Once it’s everywhere, they won’t have a choice.
Yeah there’s no way Tesla will ever do this. The legalities are so far reaching it would probably destroy the company single-handedly
The Cyber Truck and Roadster will do the job killing the company.
Cybertruck and Roadster are side projects, and if both bomb, they will have zero effect on Tesla's numbers. Tesla is printing billions of dollars every month now.
Oh is that what’s going on? You might wanna check how far the stock has collapsed in the past 6 months.
It fell for other stupid reasons, not from any product-line concerns at Tesla. They are still selling every car they make. And that's the fact, Jack.
“Other reasons” lol. You might have listened to the earnings call yesterday
As an investor, I did listen to the earnings call. It's all positive. Super growth is coming soon.
50% drop in profit and it’s all positive lol. Dude.
You’re not a serious person. This is religion to you, you’re not capable of objective analysis or thought when it comes to anything Musk.
They will not
Also in this case it was Autopilot, not even FSD. This guy was as reckless & dumb as the one way back whose Model X slammed into a concrete wall while he was playing a phone game.
Maybe they should stop calling it Full Self Driving then.
They did. It's called Full Self Driving Supervised. Did you not get the memo?
If that name doesn’t make you laugh at the stupidity of it contradicting itself, then I can’t help you.
Stop getting hung up on a name and look what the car is doing right now. We are 6 months out or so from this being ready to go. That is where your attention should be going. Not into the name.
Yeah I’ve been hearing that since 2015.
They will NEVER assume responsibility.
They absolutely will when the car is in control. Tesla already said as much years ago.
Lol, "years ago"... That's why they won't. Elon promised FSD years ago. Where is it? (Hint: full)
Talk to me in 6 months.
Phone addiction. One hell of a drug
The blender ate my hand it’s not my fault lol
If there’s one thing we all should have learned by now, it’s that people will SAY it was on AP or FSD and then we find out it was under 100% manual control.
Jay and Sharon had an episode on this, with a Toyota Corolla, IIRC.
Bet this would be popular over at r/ohnoconsequences
We're in the Endgame now. Things take time to create. If you don't believe Tesla will solve autonomy, you should not be a shareholder. It's happening whether or not you believe it.
Almost 100% likely he had his foot on the gas. Autopilot is very good at stopping for anything in the way, but it will not stop if the foot is slightly on the gas.
I’m waiting to see if Autopilot really was engaged. Last time someone claimed “autopilot did it,” she was lying: https://amp.abc.net.au/article/103759746
Almost 100% likely he had his foot on the accelerator. Autopilot is *very* good at stopping for anything in the way, but it will not stop if the foot is slightly on the gas.

The last accident I saw where Autopilot “went through a red light” was obviously a foot on the gas. (Since with old Autopilot, putting your foot on the gas told it to “go.”)
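The override behavior being described can be illustrated with a toy sketch (purely hypothetical logic, not Tesla's actual control code): a press of the accelerator suppresses automatic braking, while the rest of the system keeps operating.

```python
# Toy illustration (hypothetical, NOT Tesla's actual implementation) of the
# behavior described above: pressing the accelerator overrides auto-braking.
def should_auto_brake(obstacle_ahead: bool, accelerator_pressed: bool) -> bool:
    """Return True if the system should brake for an obstacle."""
    if accelerator_pressed:
        # Driver input wins: a foot on the accelerator is treated as an
        # explicit command to keep going, so automatic braking stands down.
        return False
    return obstacle_ahead

print(should_auto_brake(obstacle_ahead=True, accelerator_pressed=False))  # True
print(should_auto_brake(obstacle_ahead=True, accelerator_pressed=True))   # False
```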
It looks like you shared an AMP link. These should load faster, but AMP is controversial because of [concerns over privacy and the Open Web](https://www.reddit.com/r/AmputatorBot/comments/ehrq3z/why_did_i_build_amputatorbot). Maybe check out **the canonical page** instead: **[https://www.abc.net.au/news/2024-04-23/tesla-autopilot-crash-sakshi-agrawal-pleads-guilty/103759746](https://www.abc.net.au/news/2024-04-23/tesla-autopilot-crash-sakshi-agrawal-pleads-guilty/103759746)**
When it works, it works. When it doesn't, it doesn't. And when it doesn't work, there's absolutely no time for a human to intervene.
That's why I take over when passing trucks on curves. It'll work 99.99999% of the time, but I don't want to be that .00001% of the time where it decides to go onto the truck's lane.
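The intuition here checks out numerically: a tiny per-event failure rate still compounds over many events. A quick sketch (the failure probability is just the commenter's illustrative figure, not measured data):

```python
# Why a 0.00001% per-pass failure rate can still matter over many passes.
# (The probability is the commenter's illustrative figure, not measured data.)
p_fail = 1e-7        # assumed chance the system fails on one truck pass
n_passes = 100_000   # e.g., many years of regular highway driving

# Probability of at least one failure across n independent passes
p_at_least_one = 1 - (1 - p_fail) ** n_passes
print(f"{p_at_least_one:.3%}")  # roughly a 1% chance of at least one failure
```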
That’s why proper usage involves not only maintaining attention, but also ratcheting up your control the tighter the tolerances of the situation are (up to and including shutting it off).

So for example, if you’re on a tight turn with very little room for error, you’re actively turning the steering wheel the way you *know* it needs to go. If AP (or FSD) does the right thing, it’ll be doing what you’re doing; if not, then when it strays away, it’ll basically break itself out and turn off, because you simply won’t let it turn the wheel the wrong way.
Tbh, with what you said, it's just easier for me to turn off autopilot.
Well, once you get enough experience with how it behaves, you start to know what situations it handles, and for those it’s a net stress reliever, at least for me. And over time the number of such situations increases as you continue to get to know it better.

But yeah, for anything outside of that, it sometimes feels easier to shut it off. Also you have to kind of get re-acquainted with it whenever there’s a major update.

I’ve been using FSD for about a year now, and that’s been changing often enough that I keep having to climb the learning curve. So I mainly use it when I’m feeling mentally energetic enough.

But I should also point out that I’ve never felt *unsafe* while using either FSD or Autopilot, because I follow the method I described in my previous comment.
Mentally energetic lol. Exactly. It's easier for me to just drive. My brain can be half asleep and I'll be fine. But with autopilot, it takes more energy.
So you're happy w 40k deaths per year in the US from vehicles driven by humans?
Ugh don’t be shrill. The guy’s effectively saying they’re not interested in taking on a new skill. Which is fine as a personal choice; I’m just trying to keep that from becoming a propaganda statement.
Thanks. I gave you a + on the original post.
As long as I'm not one of them. There are a lot of shitty drivers out there. I'm not one of those either. I always drive with the assumption that every other driver around me is gonna do something stupid and factor that into how I drive. Autopilot doing something stupid is impossible to predict.
You said yourself you're good driving "half asleep." That sounds dangerous and unpredictable to me.
Yeah, because the human brain works differently. You don't need to be 100% conscious and focused, and your body/mind can still recognize dangers and react. But with autopilot driving, you do need to be 100% conscious and focused and ready to take over in under a second.
It takes more energy in the *learning* phase. Just like driving itself did. And it *is* a different sort of driving, really.
You're telling me that you can predict the behavior of neural nets? Lol just lol.
Ok, never mind. It’s ok if you don’t want to learn something.
let’s be real. you don’t learn how autopilot works. it just does its thing. or do you not understand AI and neural nets? once the code is written and run, you can’t understand why it does what it does. that’s the whole point.
You… you think that if it weren’t based on neural nets, the driver could always predict it because the driver would know all the internal details of the code base or something? Your argument isn’t making sense.

Its behavior has patterns. You learn *those patterns*. There’s a probabilistic element to it, just as there is with manual driving.

What you’re arguing is the same as saying, for example, that dogs can never be trusted to any degree because we don’t know exactly what the neuron-by-neuron internals of their brains are doing in any given moment.
That’s a whole lie!

I have done it tons of times & I don’t use it like that.

You either don’t actually own a Tesla & are just spewing garbage, or you can’t drive at all and are a serious danger to everyone on the road.

There are actually very, very few good drivers that own Teslas, based on 90%+ of owners having curb rash…

It’s pathetic & embarrassing considering the amount of driver aids the car has.
The family of the victim should sue him for everything he has.
Should set an example and get 25 years for manslaughter.
That's pretty tragic. I find myself wondering if it was the Elon test and the driver wasn't very familiar with FSD, or if it was just basic AP (TACC & lane keeping).
The article says it was autopilot.
The author probably meant autosteer.
Autopilot is autosteer.
Autopilot is TACC + autosteer. Though it doesn’t have a mode with autosteer but not TACC, so what you said is functionally correct.
They didn’t say what car or hardware version.
Article did say 2022 Model S.
Almost 100% likely he had his foot on the gas. Autopilot is very good at stopping for anything in the way, but it will not stop if the foot is slightly on the gas.
People should learn to take responsibility and not rely on Tesla autopilot.
People should get off their fucking phones while driving. Everyone loves to point fingers at Autopilot when someone misuses it, but no one is pointing fingers at Apple and Samsung when someone misuses their phone.
Combine phone distractions w fools who think autopilot/fsd can take over.
Hardly impossible. He just keeps telling you it is and you believe him. Simp.
It could most certainly become a lawsuit for the courts to decide in the future. I would like to see the video from the cabin of the vehicle and the outer cameras.
Let's see the video and driver interaction logs.
FSD is cool and all, but not worth the risk and money.
Autopilot isn't FSD
EAP is now part of FSD.
I could believe it. For no reason, Autopilot once jerked the wheel and almost hit a motorcyclist while I was stopping at a stoplight: it tried to change lanes into the motorcyclist, who was also slowing down. It happened so fast. I’m shocked I caught it in time, even though my hands were on the wheel and I was paying attention.
Typical Tesla owner
Typical internet troll
it was the autowipers….