A Tesla owner reported that he crashed his Cybertruck into a pole after hitting a curb while using Full Self-Driving, Tesla’s advanced driving assist system that Elon Musk claims will be unsupervised this year.
The post is going viral.
Jonathan Challinger, a Florida-based software developer working for Kraus Hamdani Aerospace, reported in a viral post on X that he crashed his Cybertruck into a pole.
He reported that he was driving using Tesla’s Full Self-Driving system, a suite of advanced driver-assistance system (ADAS) features that requires driver supervision at all times. However, Tesla claims that it will soon work without driver supervision—hence the name.
Challinger said that he was driving with FSD v13.2.4 in the right lane, which was ending and merging into the left lane, but the car failed to merge and hit the curb.
He said that he failed to react in time and take control of the Cybertruck:
It failed to merge out of a lane that was ending (there was no one on my left) and made no attempt to slow down or turn until it had already hit the curb.
The Cybertruck then crashed into a light post. He was lucky to walk away without a scratch.
To be fair, it was a strange location for a post, but there’s no reason why Tesla FSD shouldn’t have changed lanes, and even if it hadn’t, it should have braked before hitting the curb or the post (pictures via TroyTeslike):
![](https://electrek.co/wp-content/uploads/sites/3/2025/02/Screenshot-2025-02-09-at-4.13.42%E2%80%AFPM.png?w=1024)
![](https://electrek.co/wp-content/uploads/sites/3/2025/02/Screenshot-2025-02-09-at-4.13.46%E2%80%AFPM.png?w=1024)
Challinger said that he shared the story as a “public service announcement” to tell people to remain attentive when using Tesla’s Full Self-Driving system and not become complacent:
Big fail on my part, obviously. Don’t make the same mistake I did. Pay attention. It can happen. I follow Tesla and FSD pretty closely and haven’t heard of any accident on V13 at all before this happened. It is easy to get complacent now – don’t.
It might be the first crash on Tesla’s latest FSD v13, which CEO Elon Musk has presented as “mind-blowing” and an important step toward achieving “unsupervised self-driving” by the end of the year.
Musk and his Tesla influencers often share FSD videos claiming that the technology is “on the verge of becoming truly self-driving,” despite data pointing to Tesla still being years away from achieving this.
Electrek’s Take
This guy is lucky to be alive, and he is right. There’s a problem with people becoming complacent with FSD, and Tesla, and especially Elon Musk, are not doing enough to prevent that from happening.
On the contrary, Musk continues to hype every update as if Tesla is on the verge of solving self-driving, and claims that its quarterly safety report proves that FSD is safer than human driving, which is misleading.
If Tesla were developing FSD in a vacuum—without Elon claiming every year for the last five years that it would be solved, and without Tesla selling the software package to customers with no clear idea of when full autonomy can be achieved or on what hardware—it would be a celebrated product.
Instead, it’s a product that is making Tesla lose credibility and that is potentially dangerous, as we saw today.
I myself have had the exact same problem that Challinger described, where a lane ends but FSD doesn’t detect it. It’s tricky because the system works most of the time, so you can develop a sense of complacency and give it a chance to correct itself. In this case, he evidently let it go too far.
Be careful out there and stop believing Elon Musk when he talks about self-driving.