A Florida DUI case is getting attention because it highlights one of the biggest misunderstandings about modern driving technology. It is not just about one driver; it is about what people think these systems can do versus what they are actually designed to do.
We have had a bit of fun with Florida DUI arrests lately. From the driver trying to hand off a Barnes & Noble card during a stop to the one blaming excessive speed on “it’s a Supra”, some of these cases come with a built-in punch line.
That is part of the reason we cover them. The humor might get the click, but the goal is simple: educate people about what can happen, and get them to slow down, not drive impaired, and not drive distracted. If the more ridiculous cases stick in someone's head long enough to change a decision, that is a win.
This one really doesn’t have that punch line.
Instead, it taps into something else entirely: a growing confusion about what modern driver-assist systems can and cannot do. According to authorities, Kimberly Brown, 37, was found around 2 a.m. on Friday, April 24, asleep behind the wheel of a Tesla stopped in the middle lane of I-75 southbound near mile marker 217.
Troopers said the vehicle had stopped after the driver failed to provide input. What makes this case stand out is the assumption that apparently put the car there in the first place: that the system could handle the rest on its own.
What Troopers Say Happened

Authorities said Brown was using the Tesla’s Autopilot feature when the vehicle came to a stop. That detail matters because Autopilot is not the same thing as a sober driver. It is a driver assistance system, not a legal loophole, and not a replacement for the person behind the wheel.
According to the report, Brown was booked into the Manatee County Jail on a DUI charge. Troopers also used the case to make the larger point plainly: to sit behind the wheel, a driver must be sober, awake, and alert, with hands on the wheel and eyes on the road.

Police Have Warned About This Before
This is not the first time law enforcement has had to spell this out. In a Facebook post, the Sansom Park Police Department put it as directly as possible, warning that using Autopilot while intoxicated does not make a Tesla a designated driver.
The department added that the car might steer, but the person behind the wheel is still there, both literally and legally. It also leaned on a pop culture reference that makes the point stick, noting that the system is not KITT from Knight Rider.
The Comment Section Says a Lot About the Problem
The reaction online shows why police keep having to repeat this message. A lot of commenters focused on the same idea, questioning whether she was actually driving if the car was on Autopilot. Some argued the DUI should not apply, others treated the Tesla like a designated driver, and a few suggested sitting in a different seat might change the outcome.
That repetition is the real story. People are treating driver assistance like full automation, and full automation like a personal taxi. That is not how these systems work, and it is not how responsibility works once someone gets behind the wheel.
Autopilot, Full Self-Driving, and the Reality Gap
Some commenters correctly pointed out that Tesla Autopilot and Full Self-Driving are not the same thing. That distinction matters, but it does not change the bigger point.
Whether a driver is using Autopilot, Full Self-Driving, adaptive cruise control, or lane keeping, the person behind the wheel is still responsible for the vehicle. The technology can assist with steering and speed, but it cannot make someone sober or alert, nor can it turn a DUI into a rideshare.
Tesla Is Not Your DD
The system stopped the car when the driver stopped responding. That likely prevented something worse. But a Tesla sitting in the middle lane of I-75 at 2 a.m. is not a safe outcome. It is just where the system runs out of options.
A Tesla is not your DD. It is not a backup plan, and it is not an excuse.
If you have been drinking, do not get behind the wheel. Call an Uber. Get a ride. Hand over the keys.
The system did what it could. The driver's decision is what put it there.
