Tesla FSD Beta tried to kill me last night


I was testing Tesla’s latest Full Self-Driving (FSD) Beta update (v11.4.7) last night, and a new, aggressive bug nearly made me crash at highway speed twice.

Please take this as a public service announcement.

I received the new FSD Beta v11.4.7 update on my Model 3 this week.

Tesla has been known to have a “two steps forward, one step back” process with FSD Beta.

Last night, I decided to test it on my way to Montreal, and I found one of those “back steps” – one that nearly got me into a potentially deadly crash.

I was on FSD Beta with the speed set at 118 km/h (73 mph) on Highway 20 toward Montreal, and the system automatically moved to the left lane to pass a car.

As I was passing the car, I felt FSD Beta veer aggressively to the left, toward the median strip. Fortunately, I use FSD Beta as Tesla recommends: with my hands on the wheel and my eyes on the road.

I was able to steer back onto the road, which disengaged FSD Beta. It was super scary: I almost lost control while correcting FSD Beta, and since I was mid-pass, I could have crashed into the other vehicle if I had overcorrected.

When you disengage Autopilot/FSD Beta, Tesla encourages you to send a message about why you disengaged the system.

I did that, but I wasn’t sure exactly what had happened, so my message was something like: “Autopilot just tried to kill me, so please fix it.”

Despite having a storage device connected, I didn’t see the camera button to record what happened – something I only noticed was missing after this update.

A few moments later, I gave FSD Beta another shot, and I was able to reproduce the problem.

As I moved to the left lane again, I was much more alert, and when FSD Beta again veered left toward the median strip, this time I spotted one of those U-turn sections reserved for emergency vehicles:

FSD Beta tried to enter it at full speed. I was again able to correct it in time, and I sent Tesla another bug report, though the system cut me off before I could explain what happened. It should be clear enough if they can pull the video.

This is a brand new behavior for FSD Beta – for me, at least.

In the early days, Tesla Autopilot used to try to take exit ramps it wasn’t supposed to, but Tesla fixed that a while ago. It hasn’t happened to me in years.

And this is actually a lot more dangerous than a surprise exit ramp, because those median-strip U-turn areas have no ramp to slow down in. FSD Beta basically tried to take a sharp left turn at 118 km/h. It could have been deadly.

Considering it happened twice within two minutes, this is likely a new bug that crept into FSD Beta.

Hopefully, Tesla can quickly fix this before anything bad happens.

In the meantime, please always use Autopilot and FSD Beta as intended, which means with your hands on the steering wheel and your eyes on the road.
