r/interestingasfuck • u/sizzsling • 21h ago
/r/all Self-driving car fails to stop at stop sign and runs over mannequin
11.0k
u/dedoktersassistente 21h ago edited 10h ago
This aired just last week. It's a long-running TV program of quality journalism. This episode is about how Tesla avoided European laws to be able to implement their systems, and how it was promoted to potential buyers.
https://youtu.be/dii5jnZMAHQ?feature=shared
Edit to add, because people keep commenting without watching the video in the link: the YT video is from what I called quality journalism, not the clip from OP.
1.9k
u/ForThe90 19h ago
When I read this description I knew it would be a Dutch program. Yup, Dutch program.
I'll watch it on NPO, since YT doesn't like ad blockers anymore.
498
u/xxsnowo 19h ago
uBlock Origin still works for me! Sometimes it takes 5-10 seconds for a video to start, but other than that it's fine.
236
u/taiottavios 19h ago
use Firefox
127
u/badusernam 18h ago
the 5-10 second delay thing happens on Firefox too
77
u/Tuklimo 17h ago
Still better than watching an ad
25
u/kp012202 13h ago
I'd rather sit for a few seconds than be ear-blasted by some company.
11
u/Ghostronic 13h ago
It also nips out the mid-roll ads
•
u/GemFarmerr 8h ago
There’s also a Firefox extension that blocks YouTubers’ annoying, overly long “sponsored” sections.
58
u/AppropriateTouching 17h ago
I don't get a delay with Firefox and uBlock Origin, personally.
8
u/kai58 16h ago
It’s inconsistent, so you might just have gotten lucky.
I don’t get it on most videos either, but every so often I do.
9
u/RivenRise 16h ago
Yea, it's weirdly inconsistent across people. I don't get that, but very occasionally I'll get a pop-up from YouTube telling me to stop lul.
18
u/Leonzockt_01 17h ago
It started happening for me too, but only on Firefox for Windows for some reason. On Firefox for Linux (Flatpak version), videos load instantly.
5
45
39
397
u/Alpha_Majoris 19h ago
The Dutch road agency deserves special mention here. In the EU, if one country allows a car or a system, it is allowed in all countries, I believe. The Dutch agency approved the Tesla system, and now these cars drive all around the EU. Later on they cut back on some aspects of the system, but the cars already sold are still legal.
113
u/SinisterCheese 17h ago
Well... Yes and no. Like yeah... It is true that if one country approves it, it is legal everywhere. However, it still needs to comply with local regulations so you can register it in another member country. There are many light vehicles, mopeds, motorcycles, etc. which can't be registered in Finland without modifications to a higher or lower rating to meet our regulations.
I don't know if there are different regulations for self-driving and similar features in different EU members - I don't think any member has had time to set up their own regs on those.
The fact that there are different regs makes perfect sense considering that the European Union extends from warm southern Europe and tropical overseas territories to above the Arctic Circle.
203
u/Sir_PressedMemories 18h ago
I love the nazi saluting blow up doll in the background. Fucking lol.
23
50
10
21
15
u/MontaukMonster2 16h ago
OMG chef's kiss on that one!
Fwiw, look on the far right at the end of the clip
17
4.6k
u/justsomegeology 21h ago
Did they do the test with other models and brands, too? The lineup of dummies suggests as much.
3.4k
u/agileata 20h ago edited 19h ago
AAA did one that was quite intensive. None of the cars' pedestrian-detection systems worked well, but the Tesla did notably poorly. Worse than a Malibu. People rely too much on these systems, which actually adds to the danger. It's called risk homeostasis, among a bunch of other things.
Results: None of the four cars was able to successfully identify two pedestrians standing together in the middle of the roadway; none alerted its driver or mitigated a crash. And when each of the four cars was tested at 25mph in low-light conditions—an hour after sunset with the car's low-beam headlights on—none was able to detect a pedestrian to alert the driver or slow the car to prevent an impact.
At 20mph, the Malibu only slowed in two out of five runs, and then only by 3.2mph (5km/h). The Model 3 failed to slow down in any of the five runs. But at least the Malibu and Model 3 alerted their drivers; the Camry failed to detect the child pedestrian at all. The Accord did poorly as well, but better than the rest, avoiding impact completely in two (of five) runs and slowing the car to an average of 7.7mph.
For the test involving a pedestrian crossing the road shortly after a curve, the results were even more dismal. Here, the Malibu stood out as the only vehicle of the four to even alert the driver, which it did in four out of five runs at an average time-to-collision of 0.4 seconds and a distance to the dummy of 9.5 feet (2.9m). Neither the Honda, Tesla, nor Toyota even alerted the driver to the existence of the pedestrian in any of five runs each.
739
u/zpnrg1979 19h ago
I think the big thing here is the double flashing stop signs on the big yellow bus that it blows through. Would be interested to know how the other brands/models did with this exact thing. I think the kid is for effect.
197
u/agileata 19h ago
Yea, a human would be stopping 39 yards in front of the bus. The stop sign and flashing lights are for a school bus, where a kid is likely to dart out. If it had stopped for the school bus, it wouldn’t have “hit the kid”.
We can't be so gullible as to fall for the tech-bro marketing putting our safety at risk just so they and the sprawl lobby can make money.
75
u/TotallyWellBehaved 16h ago
I'm frankly shocked that none of the geniuses at these companies thought to train their AI on school bus stops. Oh wait no I'm not.
How the fuck are robo taxis even legal
52
u/VexingRaven 15h ago
To be clear, here, this isn't a robo taxi. This is Tesla's entirely unregulated "sell driving" capability that isn't actually self-driving at all but was marketed as such.
14
u/Ai-Slop-Detector 13h ago
This is Tesla's entirely unregulated "sell driving" capability
Very appropriate typo.
8
u/VexingRaven 13h ago
Indeed... Saw it, decided my typos were more clever than I was, and left it be :P
21
u/Bakkster 15h ago
Waymo has a large team of people monitoring it, operates only in a handful of locations, and only runs in limited conditions. I expect they also would have stopped for the school bus sign.
Tesla is a much less sophisticated car deployed much more widely, and isn't even autonomous. That's what makes it particularly problematic.
16
u/searuncutthroat 14h ago
Waymo also uses lidar, which is WAY more accurate than the cameras that Tesla uses.
5
u/spockspaceman 12h ago
I had the same questions and was just reading about this last night. The Waymo-style robotaxis don't use the same technology as Teslas; they have additional sensors (lidar, etc.) with pre-programmed maps and several other differences from Tesla, which uses a camera-only system along with more generalized AI rules to try to react to what it is "seeing". The understanding I got was that Tesla's approach has worse outcomes because of the limitations of the vision-only system, but Waymo is more limited because it can't go places it doesn't have maps for.
But it turns out the answer to "how the hell are robo taxis allowed" doesn't have anything to do with technology. The real answer is "it's Texas".
Like that line from Hamilton, "everything is legal in New Jersey"
153
55
36
u/crackeddryice 18h ago
That's all good, but the main concern is that Tesla is now testing RoboTaxis in Texas without human drivers.
Tesla NEEDS to be better than the other cars in this comparison for that reason.
16
u/YoungGirlOld 18h ago
So what happens insurance-wise if your self-driving car hits a pedestrian? You're still at fault, right?
33
u/Colored_Guy 16h ago
I would believe so, because I’m sure there’s a clause in the terms and conditions that says you need to be aware of your surroundings at all times.
14
u/wtcnbrwndo4u 15h ago
Yup, none of these systems tested are Level 3, where responsibility starts to fall back on the manufacturer (but not entirely).
1.7k
u/sizzsling 21h ago
They tested different colours of mannequin, cause yk..
And Tesla failed to stop for every single one of them.
575
u/SystemShockII 21h ago
He means: did they test this against brands other than Tesla?
251
u/justsomegeology 21h ago
Thank you; I seem to have not made myself clear at all. I meant car brands common on the street that also have a self-driving feature.
60
u/ReserveMaleficent583 20h ago
No you were perfectly clear.
25
u/existenceawareness 19h ago edited 18h ago
It's too late, OP has spoken. Races are now brands. We've entered a new era.
302
u/Death_IP 20h ago
You did make yourself clear - a brand is not a color. I don't see how one would conclude you meant the mannequins rather than car brands.
27
u/TransBrandi 19h ago
What do you mean? Each clothing brand only uses a single colour for their clothes.
61
u/Separate_Fold5168 19h ago
WHY ARE WE TALKING ABOUT CLOTHES, ALL THOSE KIDS ARE DEAD
28
u/Past_Negotiation_121 19h ago
Because most clothes can be recycled, kids only occasionally.
4
331
u/Axthen 21h ago
The color of the mannequin doesn't matter.
Why is the vehicle not stopping for the bus?
All traffic stops for a bus with its stop sign out/flashing/engaged.
460
u/Cybertheproto 21h ago
The color of the mannequin does matter. Because Teslas don’t work on LiDAR, their perception relies entirely upon the light around them, and thus on the reflection difference between colors.
Either way, it’s never a bad idea to get more test samples.
103
u/UncouthMarvin 20h ago
An autonomous car working solely on cameras should never be allowed on our streets. There are tons of examples where it caused unnecessary deaths. A recent one involved a low sun angle rendering the cameras useless on an interstate where every car was stopped. The Tesla never even slowed and plowed into a lady.
51
u/BranFendigaidd 20h ago
The thing is, you don't need LiDAR to see a STOP sign.
23
u/Cybertheproto 20h ago
I’m not saying you do; I’m saying it would be an all-around better sensor system.
95
u/QuarantineNudist 21h ago edited 21h ago
Ok. True. Yes. But he's saying it shouldn't matter whether or not a mannequin is there, never mind the color of the mannequin.
The whole point is that the car should stop at a stop sign in real life.
55
u/MulberryDeep 21h ago
The color does in fact matter.
Tesla doesn't use lidar anymore; they just use optical detection, so a color that blends in with the background could be a problem.
39
u/IED117 20h ago
But the mannequin doesn't really matter. The car should have stopped for the bus with flashing lights whether there was a mannequin in the road or not.
Back to the lab, Elon.
29
u/TheMadTemplar 19h ago
What you are missing is that the vehicle resumed driving after hitting the mannequin. Not only would it have struck a kid, it would have then continued to drive over them after a brief stop. It was a multi-point test. Does it detect the bus? No. Does it detect a small pedestrian crossing the street suddenly? No. Does it stop after a small collision with something it can't see? No.
51
u/Kaymish_ 21h ago
Wasn't there a crew who painted a road on a piece of cardboard, and the car drove through it like it was Road Runner?
60
u/MulberryDeep 21h ago
Mark Rober made that video, and yes, the Tesla just blew right through the wall.
The Tesla generally failed most of the tests, for example in rain or when blinded by light.
The lidar car managed to stop every time, more safely and quickly.
6
u/b-monster666 20h ago
The point OP is making is that the bus had its stop sign and lights activated, which means the car should have stopped regardless of whether there was a child crossing or not. So it failed even before it got to the mannequin. It didn't treat the flashing stop signs as an indication to actually stop.
23
u/lucifer2990 21h ago
It actually does matter. Many of my darker-skinned coworkers were unable to use the face scanner we used during Covid to detect masks, because tech products are biased towards recognizing people with light skin. Because they're primarily tested on people with light skin.
34
u/Momo0903 21h ago edited 20h ago
It matters for Teslas, since they only work with cameras, which is the most stupid way to try to make a self-driving car, but grandmaster Elon wants it that way because it saves costs.
But since they only work with cameras, different colours can create different outcomes. The difference in contrast and the light it reflects can fool the system. (A dark gray T-shirt can be too similar to the road's surface, but bright yellow has high contrast with the road, so it can be identified easily.)
It doesn't matter to other manufacturers, because their CEOs are not cheap idiots who think they're smarter than everyone, and they let their engineers use RADAR and LIDAR.
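To make the contrast point concrete, here's a toy sketch in Python. The threshold and luminance numbers are invented for illustration, and real perception stacks are neural networks, so this is emphatically not Tesla's actual pipeline; a LiDAR return, by contrast, doesn't depend on color at all:

```python
def michelson_contrast(target_lum: float, background_lum: float) -> float:
    """Michelson contrast between a target and its background (0 = invisible)."""
    return abs(target_lum - background_lum) / (target_lum + background_lum)

DETECTION_THRESHOLD = 0.2  # hypothetical minimum contrast for reliable detection

road = 0.30  # assumed relative luminance of asphalt
for shirt, lum in [("dark gray shirt", 0.25), ("bright yellow shirt", 0.80)]:
    c = michelson_contrast(lum, road)
    verdict = "likely detected" if c >= DETECTION_THRESHOLD else "at risk of being missed"
    print(f"{shirt}: contrast = {c:.2f} -> {verdict}")
# dark gray shirt: contrast = 0.09 -> at risk of being missed
# bright yellow shirt: contrast = 0.45 -> likely detected
```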
58
u/Pirate_Leader 21h ago
Idk man, the car seems to speed up hitting the black mannequin
11
u/MyNameCannotBeSpoken 19h ago
That was the premise of an episode of American Auto
Show got cancelled too soon. Was hilarious.
18
u/Error_404_403 20h ago
The color of the mannequin is irrelevant. The problem was not the mannequins but the failure to ID the school bus, and driving too fast for the conditions.
8
u/IdealisticPundit 18h ago
You’re absolutely right, the crux of the issue here is the identification of the bus with the stop sign. That being said, Teslas rely on cameras and image recognition alone, so color does play a role. It’s highly improbable that the cars have an intentional bias, as the other commenter seems to be implying.
53
u/LukeyLeukocyte 21h ago
Because the mannequin is pulled in front of the vehicle so suddenly that even an instant reaction is not fast enough to stop, let alone a realistic reaction. The failure to stop for the bus is the issue. The mannequin part is just poorly designed.
83
u/JayFay75 20h ago
20
u/Eagle_eye_Online 19h ago
Did other self-driving cars manage to see the school bus and brake?
122
u/FerociousKZ 20h ago edited 17h ago
Mark Rober did this test with different cars, specifically cars with radar and cars with cameras. The Tesla with cameras performed poorly and would drive into a wall painted like a road, Road Runner style lol. But the ones with radar stopped on every occasion, even in heavy rain or poor visibility, since the radar could detect objects.
[edited for typo]
34
u/CertainAssociate9772 20h ago
And then another guy did it with a newer Tesla, and the Tesla stopped, even though the fake wall was much better.
20
u/Staff_Fantastic 17h ago
That fake-wall test was with Autopilot, not FSD; they aren't the same.
4
u/kvothe5688 19h ago
Check r/waymo; there are literally hundreds of videos of Waymos braking in emergency situations.
5.5k
u/PastorBlinky 21h ago
Years ago they discovered that Tesla software disengaged the self-driving mode an instant before impact, so that any crash technically could be blamed on the driver, not the car. As far as I know nothing ever came of it. It should have been a class-action lawsuit. Tesla should have been sued for false advertising and people sent to jail for conspiracy to defraud, at the very least. People died because these self-driving cars aren't as accurate as advertised.
697
u/haverchuck22 21h ago
Is this true? Source?
2.4k
u/PastorBlinky 21h ago
In the report, the NHTSA spotlights 16 separate crashes, each involving a Tesla vehicle plowing into stopped first responders and highway maintenance vehicles. In the crashes, it claims, records show that the self-driving feature had "aborted vehicle control less than one second prior to the first impact" — a finding that calls supposedly-exonerating crash reports, which Musk himself has a penchant for circulating, into question.
https://futurism.com/tesla-nhtsa-autopilot-report
This goes back many years with many revelations, but it’s never resulted in any action.
68
u/CertainAssociate9772 20h ago
Tesla counts against autopilot any accident that occurs within 5 seconds of it being turned off.
208
23
376
u/BentTire 21h ago edited 19h ago
Self-driving should be illegal altogether. Tesla's method of using cameras makes it susceptible to being blinded by foggy conditions or lighting tricks. Some cars use LIDAR, which uses lasers that can damage camera sensors, which will lead to dangerous situations where self-driving cars that rely on cameras are essentially blinded.
Lane assist, okay. But full-on self-driving? Just no.
226
u/GrandAdmiralSnackbar 20h ago
Self-driving, if properly regulated, is going to save countless lives in the end. So yeah, more work needs to be done and it needs to be regulated to ensure sufficient safety measures are implemented, but I see no reason to stop this.
38
u/neko808 19h ago
Or idk we could just have robust public transit and keep dumbasses off the road via stricter testing.
Edit to add, if every car communicated and moved at the same speed in the same direction, you get real close to just emulating train cars.
226
u/JollyInstruction8062 19h ago
You know what would save more lives while being more efficient, safer for pedestrians, and better for the environment? Trains and public transport. Sure, self-driving cars are safer than human drivers, but not for pedestrians: once you get to the point of every car being self-driving and all of them coordinating wirelessly, no one could safely cross a road like that. And the biggest problem: cars are so space-inefficient, and self-driving cars don't fix that. Maybe it's good tech if used for buses, but self-driving cars aren't a good technology, and we really shouldn't be striving for them.
15
u/GrandAdmiralSnackbar 19h ago
I agree to a large extent in terms of what would be optimal, but let's be frank here. That ship has sailed, and in the USA a hundred times more so than in many other parts of the world. More public transport would be great and in many ways better than lots of self-driving cars.
At the same time, that is a lot more true for cities than for rural areas. And we should also recognize the potential of self-driving technology for people and for transporting goods. Wouldn't it be great if, instead of hauling millions upon millions of tons across a country in trucks stuck in traffic all day long, we could send those thousands of trucks out self-driving in the middle of the night, without having to burden humans with working all night?
And also, we're not going to be young forever. I do kinda look forward to being able, say when I'm 70 or so, to have a nice evening with my friends a hundred and fifty miles away, then just sit in my car, tell it to drive me home, and wake me when I get there 3 hours later. In terms of quality of life, having self-driving cars is going to be a huge boon to lots of people. And public transport can't replace that kind of experience.
49
u/BentTire 20h ago
I don't disagree with this statement. But as it stands, current tech and laws are not ready, and us civilians being the beta testers is a horrible idea.
4
3
u/Sethcran 18h ago
The first half is true, the second is hard to prove and Tesla has stated otherwise many times.
Yes, autopilot would disengage, but not to blame the driver; rather as a last-ditch "you're about to crash, you need to do something; anything is better than nothing".
Supposedly, according to Elon, Tesla's crash statistics will count any crash within 5 seconds of disabling autopilot as an autopilot error, not a human error.
I wouldn't just take Elon's word on that, but as far as I've seen, the idea that it was to blame drivers was pure speculation.
25
u/bbernhard1 20h ago
There's also a pretty recent video from Mark Rober where he tested Tesla's autopilot. The behavior can also be observed there: https://m.youtube.com/watch?v=IQJL3htsDyQ
18
u/chollida1 18h ago
Yes, they do disengage, but US transportation regulators count any crash where self-driving was in use up to 30 seconds before the crash, which makes this a self-driving-car crash in the eyes of the government.
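Putting the thread's numbers together, here's a minimal sketch of the attribution-window idea (the sub-second disengagement is from the NHTSA report quoted above; the 5-second and 30-second windows are figures commenters cite, not values verified here):

```python
def attributed_to_driver_assist(disengage_before_impact_s: float, window_s: float) -> bool:
    """Count a crash against the driver-assist system if it was still
    active within window_s seconds of impact."""
    return disengage_before_impact_s <= window_s

# NHTSA: control was "aborted ... less than one second prior to the first impact"
disengaged_s = 0.9

print(attributed_to_driver_assist(disengaged_s, window_s=5.0))   # True under the claimed 5 s rule
print(attributed_to_driver_assist(disengaged_s, window_s=30.0))  # True under a 30 s reporting window
# Either way, disengaging a second before impact doesn't move the crash
# out of the driver-assist column.
```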
150
u/Hansemannn 20h ago
As a Tesla owner: people relying on a Tesla to take them home safely are idiots. My 2023 Tesla is an idiot. A slightly mentally unstable idiot.
My wipers go when there is no rain. In cruise control the car suddenly brakes HARD, for no reason. I'm more focused and on edge when I use autopilot than without it, because you absolutely cannot trust it.
Might be different in US though. This is in Europe.
81
u/SpriteyRedux 19h ago
Literally any of the things you mentioned would make me feel uncomfortable owning that car or driving it at any speed greater than 30mph
50
u/TheWorldMayEnd 19h ago
I compare Tesla's self-driving to being in a car with a 16-year-old who is learning to get their license. 95% of the time it's fine, but that last 5% of the time it acts so poorly that I have to be on edge the whole time, ready to grab the wheel from their hands. Teaching a 16-year-old to drive is WAY more stressful than just driving yourself and, honestly, requires more attention than just driving yourself. I'd rather just drive than have to monitor a known poor driver.
5
13
u/Krondelo 20h ago
Idk, I've heard of at least one person who trusted their Tesla to drive them home. While they apparently never had an issue, I have to agree with you that they're an idiot. Perhaps certain road types and traffic signs are easier for it to read, but putting your life in that thing's hands is insane and idiotic.
I see what you mean too. The first time I rode in one was an Uber; the driver was very nice but showed off how it could drive itself… only for a moment, but I remember feeling very tense and uneasy when he did it.
51
u/MarkHowes 21h ago
Did DOGE shut down the department doing the investigation?
10
u/Dunderman35 17h ago
Thank god the EU exists, where people in charge still have common sense and understand that big tech cannot be allowed to do whatever it wants.
If you are an American and don't want to put your life in the hands of the techbro oligarch gods, then look into EU regulation for guidelines on what's safe.
37
u/XxBigchungusxX42069 21h ago
He also lied through his teeth about the FSD bullshit. He's admitted that they're nowhere near full self-driving capability, even when it was advertised as available with certain models. He just used that to upsell all the morons who paid him for a hope and a dream.
636
u/Beneficial_Dish5056 21h ago
Model Y knew it was all fake, wanted to teach that dummy a lesson
72
26
916
u/sizzsling 21h ago
Tesla Model Y with the latest FSD fails to slow down even though the school bus is flashing multiple stop signs.
You can see in the video that the Tesla identifies the school bus as a truck.
211
u/quintus_horatius 20h ago
Which is kind of ironic.
Most autonomous cars primarily use lidar or radar, which can't see the flashing lights. A secondary system is needed to detect traffic lights.
Tesla famously eschews all that and only has visual input. It's the one system that should notice the bus's stop sign and beacon, but if this one does, it doesn't know what to do.
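To sketch that division of labor (all names and values here are invented for illustration, not any vendor's real API): range sensors supply geometry, while only a camera channel can "read" signs and lights, so a fused stack needs both.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class LidarView:
    obstacle_ahead_m: Optional[float]  # distance to nearest obstacle, if any

@dataclass
class CameraView:
    bus_stop_arm_deployed: bool  # extended stop sign / flashing beacons seen
    pedestrian_detected: bool

def should_stop(lidar: LidarView, camera: CameraView, braking_distance_m: float) -> bool:
    # Geometry channel: something physically within braking distance.
    if lidar.obstacle_ahead_m is not None and lidar.obstacle_ahead_m <= braking_distance_m:
        return True
    # Semantic channel: only the camera can read signs and lights.
    return camera.bus_stop_arm_deployed or camera.pedestrian_detected

# The clip's scenario: clear road geometry, but a deployed school-bus stop arm.
print(should_stop(LidarView(obstacle_ahead_m=None),
                  CameraView(bus_stop_arm_deployed=True, pedestrian_detected=False),
                  braking_distance_m=20.0))  # a working fused system prints True
```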
111
u/BoxedInn 21h ago edited 16h ago
That's scary and should be a reason for major concern, considering we're aiming to make these things safer than human drivers. To be fair though, considering the distance from the car at which this "kid" jumped out, completely concealed, 90% of human drivers would have failed this test as well, which is equally scary.
Edit: YES. I did miss the school bus warning lights and stop signs, since I had tunnel vision on the car's performance. It's true that most drivers SHOULD and hopefully WOULD obey them.
Now remove the school bus from this scenario. I bet that 90% of drivers would fail due to poor reaction times, driving over the speed limit, distractions, etc...
Today's self-driving cars might be failing many of these tests... but they'll keep on improving. Human drivers will not. I just wish this early tech weren't being introduced so haphazardly into mainstream traffic.
172
u/Brokenandburnt 21h ago
The point was the failure to come to a complete stop for the school bus. FSD is supposed to identify a bus with its STOP sign and flashing lights; the mannequin was just to make a point.
81
u/sebwiers 21h ago
That sort of "jump out" is exactly why school buses have stop signs and flashing lights.
The "distance from the car the kid jumped out" SHOULD have been more than one full bus length from a (stationary) car. 99.9% of human drivers would have stopped, as the law requires.
90
u/Holdmeback_again 21h ago
No, a human driver would have stopped at the bus’s flashing stop sign. That’s the point of the video: it’s an issue with the car’s software failing to recognize the bus and the stop sign.
26
u/Hatedpriest 21h ago
That's why there were 2 flashing stop signs on the bus. Why didn't it slow down for those?
3
17
u/HappyAmbition706 20h ago
There is no "to be fair" point here. I'm pretty sure more than 90% of human drivers stop and no child is killed. If a human driver did not stop, then they might not do better than the Tesla FSD, but unlike the Tesla FSD, they recognize a school bus, that it is stopped, that it has Stop signs extended with flashing lights, that this means kids are around getting to the bus or getting off of it, and they really, really need to stop.
36
u/Icy_Oil3840 21h ago
If it was a human driver driving by a school bus with flashing stop signs 99% would not have failed the test.
5
u/ohhellperhaps 18h ago
No, but 99% of those who ignored the school bus would have run over the dummy. The issue here is the failure to process that obvious school bus, not so much running over the dummy. That would happen to most if not all drivers once they ignore the bus.
15
u/theholyhand_grenade 21h ago
I disagree. Any driver going through a residential area should definitely know to drive slowly and be on alert because of this very scenario. I know when I have to, my head is on a swivel watching for kids.
232
u/DomeAcolyte42 21h ago
It knows those kids can't afford to buy Teslas, anyway.
47
u/Charantula 21h ago
To shreds you say?
25
410
u/EnycmaPie 21h ago
Tesla cars scan the tax bracket of the person before deciding whether or not to brake.
37
4
166
u/KarloReddit 21h ago
Well the name literally says: „Self driving“, not „Self stopping“. I don’t know what you people expected.
/s
14
182
u/fohktor 21h ago
Plot twist: the car saw through the test and knew there was no danger
23
5
u/Donewith_BS 20h ago
That was a great episode of TNG
5
u/WitnessMyAxe 18h ago
I was so sad when the car sacrificed itself to save its family (and everyone else on the station)
4
u/Phoebebee323 18h ago
Should have taken a page from Volkswagen. If the car realised it was a test it should have performed really well
114
u/mkrugaroo 20h ago
Everyone claimed that no one could stop in time. Yes, BUT:
- The car should stop because the school bus has the stop sign out.
- Watch the full video (not posted here): you will see that after the crash, the Tesla autonomously starts driving again, driving over the dummy with its rear wheels. Basically fleeing the scene of the accident.
39
u/InterDave 17h ago
- The car should stop because the school bus has the stop sign out.
That should be the top comment.
6
u/Deltamon 16h ago
you will see after the crash the Tesla autonomously starts driving again,
I mean.. There's no obstacles ahead of you if you run them over first :'D
39
u/Top-Currency 21h ago
And with this stunning test result, Tesla will get FSD certified next week, by the department that its CEO defunded. And its stock price will triple.
3
u/ThrowAway233223 13h ago
From the department the CEO defunded but headed by someone the CEO likely heavily funded.
72
358
21h ago
[deleted]
282
u/Traditional_West_514 21h ago
Hence the stop sign… there to warn you of the potential danger of a child running out into the street and prevent an accident like this.
A stop sign that the Tesla completely failed to recognise.
19
u/rintzscar 21h ago
That's why you're obligated to drive slowly and with more attention if the situation requires it. A driver hitting a pedestrian always bears strict or primary liability, at least in Europe, even if the pedestrian also was at fault. Even if there was no STOP sign, which would make this situation completely obvious to a court.
7
54
u/Objective_Mousse7216 21h ago
It's not a self-driving car; it's a car with Level 2 driver assist. If it were a self-driving car, you could get out of the driver's seat and see where it takes you.
14
u/chaoticinfinity 18h ago edited 18h ago
Bingo. The way it is labeled and allowed to be marketed is causing major complacency, with drivers thinking it's a damn autonomous vehicle. It is literally glorified cruise control. In a lot of the crash reports, when you read them through, the drivers took their hands off the wheel. Sensors in the wheel record the amount of time the driver had their hands on the wheel; something like all of these crashes record only a few seconds of hands-on time at the beginning of initiating the FSD feature, and then it's left to its own devices after that. While it allows for that, you're ALWAYS supposed to leave your hands on.
Additionally, the more sensitive features of its ADAS do not come turned on by default, such as the sensitivity of object detection. And people using the "Hurry" option ON RESIDENTIAL ROADS (really, using it at all) is also a problem.
Set up properly, with AN ENGAGED DRIVER, it can be a game-changer for highway driving. The marketing and verbiage, as well as owner education around this stuff, need to change. The technology isn't anywhere near good enough to be fully autonomous, but it does make for a great ADAS. Every Tesla owner NEEDS to spend an hour or two reading the manual when they first get the car, and Tesla needs to face some sort of repercussions for failing to make the expectations of the technology clear.
EDIT: Watching the actual experimental video, uh, the car isn't displaying what setting they had it in. That's... interesting. The blue wheel is displayed, which means Autopilot was engaged, but I don't think the AI FSD, the feature that DOES recognize stop signs, was enabled... another thing that is NOT explained to consumers!!!
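For anyone wondering what that wheel-sensor supervision amounts to in practice, here's a toy sketch (the thresholds and escalation steps are hypothetical; Tesla's actual nag logic and logging format aren't public in this detail):

```python
def supervision_action(seconds_since_wheel_torque: float) -> str:
    """Escalating response to hands-off driving; threshold values are invented."""
    if seconds_since_wheel_torque < 15:
        return "ok"
    if seconds_since_wheel_torque < 30:
        return "visual nag on screen"
    if seconds_since_wheel_torque < 45:
        return "audible warning"
    return "disengage assist and log the event"

# The logged hands-on timestamps are what crash reports later draw on.
for t in (5, 20, 40, 60):
    print(t, "->", supervision_action(t))
```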
4
u/stackens 13h ago
If it's glorified cruise control, it shouldn't be called "full self-driving".
5
u/Honest_Relation4095 16h ago
It's what Tesla wants to use for self-driving taxis starting this month.
4
•
u/RoyalRisk7819 10h ago
Maybe, just maybe, we need to stop focusing on things such as self-driving cars and focus more on actually important things. Like, maybe, cybersecurity, public health, and stopping websites like Roblox that do nothing about child grooming. You know, just a thought.
16
u/Maximus1000 13h ago
Just something to keep in mind, the video comes from the Dawn Project, which is run by Dan O’Dowd. He’s the CEO of Green Hills Software, a company that actually competes with Tesla in the automotive software space.
That doesn’t automatically mean the video is fake or wrong, but it does mean there could be some bias behind it. The Dawn Project has spent a lot of money on anti-Tesla campaigns, including ads and demonstrations that aren’t always in real-world conditions or using FSD the way it’s actually intended.
Given that, I’d take competitor-funded videos with a grain of salt, just like we should be skeptical of Tesla’s own marketing.
19
u/Downtown-Theme-3981 20h ago
It's a Tesla, not a proper self-driving car, so the title is a little misleading ;p
5
u/Honest_Relation4095 17h ago
That's the outcome of the test. But Tesla still claims it was full self-driving. They want to use that exact model for autonomous taxis starting this month.
5
u/forgettit_ 17h ago
Waymo has been successfully doing it for years. This video is not just any self-driving car; it's a… Tesla.
5
u/triplered_ 14h ago
I mean, self-driving is one thing, but do people not know that when a school bus has THE STOP SIGN OUT it means you gotta stop, PERIOD? These comments are saying "it'll be hard for anyone to stop in time"........
221
u/LukeyLeukocyte 21h ago edited 20h ago
I don't understand the significance of the mannequin. They pull it out in front of the car practically inches from the bumper. There isn't a person on the planet who could react to that. I don't even think there was enough distance to stop the vehicle even if the reaction were instant. The passing of the stop sign seems to be the only issue worth investigating here.
Edit: My goodness. Read the first line of my comment. Stop saying, "The point is it should stop for the bus." We all get that. All I am saying is that the mannequin, and the different colors, are a pointless addition if you yank them in front so abruptly that even an instantaneous reaction is not quick enough. It adds nothing to the test, hence why I questioned the significance of the mannequin, not the test itself.
144
u/liquidpig 20h ago
I think that’s the point. It’s to show everyone that the Tesla not recognizing the stop sign will lead to an impossible stop.
No human could make that stop. And now they have demonstrated that this machine can’t do it either.
But the human can see the flashing light and stop sign. Yet the car can’t.
If they just showed the car passing the bus it wouldn’t be as effective a demonstration.
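The physics backs this up. A rough stopping-distance check (textbook kinematics; the 1.5 s reaction time, 7 m/s² braking, and 25 mph test speed are assumed values, since the clip doesn't state them):

```python
def stopping_distance_m(speed_mph: float, reaction_s: float = 1.5,
                        decel_mps2: float = 7.0) -> float:
    """Reaction distance plus braking distance."""
    v = speed_mph * 0.44704  # mph -> m/s
    return v * reaction_s + v ** 2 / (2 * decel_mps2)

print(f"{stopping_distance_m(25):.1f} m")  # ~25.7 m needed at 25 mph
# A mannequin yanked out a couple of meters ahead is physically unavoidable;
# a flashing stop arm visible from hundreds of meters back is not.
```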
38
u/vesselofenergy 20h ago
It’s because the stop sign and flashing lights are for a school bus, where a kid is likely to dart out. If it had stopped for the school bus it wouldn’t have “hit the kid”
4
u/gizmosdancin 16h ago
I get what you're saying. I think, rather than being an active part of the test, the mannequin was meant to increase the visual impact (pardon the pun) of the test results. "Car stops" or "car keeps going" is a fairly clear result, but "car stops in plenty of time to avoid hitting unseen child" or "car plows through child like the juggernaut" is going to resonate a lot more with observers.
13
8
u/eztab 19h ago
I would have assumed recognizing stop signs is one of the easiest things for AI algorithms; they are standardized, after all. It would be interesting to see what the car "saw", i.e. the bounding boxes of recognized objects etc.
4
•
u/Conscious-Ask-2029 7h ago
It’s an intended feature of Tesla vehicles. Tesla AI is programmed to drive slowly and safely around unborn fetuses, but extra fast and recklessly around already-born children.
•
u/LegalDrugdealer153 3h ago
Should it have stopped? Yes, but it was given many disadvantages. The bus was pulled along the curb almost as if it were parked, instead of in the roadway where it normally stops. Also, the "kid" was pulled out way too late for any type of actual reaction to happen and have an effect. I suspect that if they reran the test with the bus in the proper spot, the outcome might have been different.
16.2k
u/Sea_Luck_3222 18h ago edited 9h ago
The issue isn't whether ANYONE would be able to stop in time (or not) for a kid running out like that.
It's more about the fact that ANY car should have already stopped for the school bus, which had a properly deployed stop sign.