Another viral video involving a Waymo vehicle is making the rounds out of Miami, and it is already hitting the same nerve these clips always do. Not because of what it shows, but because of what it leaves out.
The video captures a driverless taxi being pulled over by police, with lights flashing and a passenger in the back seat trying to figure out what to do. It is the kind of moment that should come with context. Instead, it comes with loud background music and just enough footage to spark confusion.
You can barely hear what is being said, and just as the situation starts to take shape, the clip cuts off. No resolution, no follow-up, no explanation of what actually happened after the officer made contact.
That gap is exactly why this is going viral. People are not reacting to a clear failure. They are reacting to a situation they do not fully understand, and a video that does not bother to explain it.
What Actually Happened
(Embedded TikTok video from @joe_setaro, captioned: "Someone has to be out to get me because this wasn't on my bingo card.")
According to the initial report from KTLA, the stop occurred after an officer observed the Waymo vehicle intermittently stopping in the roadway, even halting in the middle of traffic.
Inside the car, the passenger questions what to do as the officer approaches. Moments later, a remote Waymo support operator comes through the vehicle’s system, checking in and acting as the point of contact.
Then the video ends. That is all the viewer gets.
The Comments Are Predictable—And Not Entirely Wrong
With no real ending, the internet has stepped in to fill the gap.
Some reactions lean into the obvious comedy. People joke about officers asking for a car's license and registration. Others imagine a DUI test for a robotaxi. A few wonder whether the passenger was pulled over by association.
It is funny, but it is also telling, because beneath the jokes are real questions people do not have answers to. Who gets the ticket? What is the passenger supposed to do? Who is the officer even talking to?
So, How Is This Supposed to Work?
Waymo vehicles are designed to detect emergency lights and sirens, pull over safely, and connect anyone involved to a remote human support team through the vehicle’s built-in system.
That is why, in this video, you hear someone from Waymo come over the speaker. There is no driver to speak to the police, so the system routes communication through a remote operator.
The vehicles are also designed to respond to basic traffic control, including officer direction and hand signals, and can allow access if needed. What looks unusual from the outside is, at least on paper, the system doing exactly what it is supposed to do.
Can Waymo Actually Get a Ticket in Miami? Here Is the Real Answer
One of the biggest questions coming out of this video is simple: if a driverless car gets pulled over in Miami, who actually gets the ticket?
According to a detailed explainer from KJZZ, Florida is one of the states where the answer is already defined.
Under Florida law, the autonomous system itself is considered the vehicle’s operator. That means if Waymo commits a violation, the citation is issued to the company, not the passenger.
So no, the person in the back seat is not getting a ticket. The company is.
That same KJZZ report explains how this works elsewhere. In Arizona, police treat Waymo vehicles like any other car during a stop, but citations still go to the company. In California, enforcement has historically been more complicated because traffic laws were written for a human driver. That gap is now being addressed with updated legislation.
Florida avoids much of that confusion by placing responsibility directly on the operator from the start.
This Is Not the First Time This Has Happened
In a separate case reported by ABC7 News, police in San Bruno, California, pulled over a driverless Waymo during a DUI crackdown after it made an illegal U-turn. No citation was issued at the time because there was no driver to cite.
That incident raised the same questions that are now resurfacing, and this will not be the last time they come up.
What This Video Actually Shows
Strip away the music, the cut-off ending, and the comment section, and what is left is not a clear failure.
Based on what is visible, the vehicle appears to have done exactly what it is designed to do. It recognized law enforcement, pulled over, and connected the situation to a remote support operator. That is the system working, even if it looks unfamiliar. That unfamiliarity is the real reason this is going viral.
A traffic stop is one of the most predictable interactions on the road, and driverless cars break that pattern. There is no driver, the communication is indirect, and the responsibility sits somewhere most people cannot see. That disconnect creates confusion, and confusion drives engagement. Autonomous vehicles will make mistakes, just as human drivers do, and arguably fewer in some cases. But not every viral clip is evidence of something going wrong.
Sometimes it is just people watching a system they do not yet fully understand, trying to make sense of it in real time.
