Houston, We Have a Problem: Waymo’s Latest Misstep Raises Questions

Image Credit: KHOU / YouTube

Another day, another problem for Waymo.

We have previously covered some of Waymo’s trouble in Texas and beyond, from a robotaxi that appeared to block emergency responders to another that narrowly avoided being hit by a train. The rollout has not exactly been quiet, and each new incident adds a little more scrutiny.

Now, Houston has its own entry, and at this point, it may be time to ask some harder questions before surrendering more human autonomy to machines.

Wrong Way in Houston

Safety concerns are growing after a Waymo driverless vehicle was caught attempting to drive the wrong way down a Houston street.

The incident, captured on dashcam and shared with local outlet KHOU, reportedly took place near the HOV lane off St. Joseph Parkway. According to the driver who recorded the footage, the Waymo vehicle began turning into oncoming traffic, creating confusion and forcing surrounding drivers to carefully maneuver around it. “It was scary because you didn’t know what the thing was going to do,” the driver said. “If you’ve got a person there you can wave at them… you can’t do that with this.”

Waymo later stated that after the vehicle stopped, a remote team assisted in backing it up and clearing the intersection. The company also pointed to safety data from Austin, claiming its vehicles have been involved in 84% fewer airbag-deployment crashes than human drivers.


The “Edge Case” Argument

As usual, the internet has thoughts. Some people point out that with thousands of autonomous vehicles on the road, odd behavior is inevitable. In that view, these incidents are simply the edge cases that bubble up because they are unusual enough to go viral.

Others compare it to navigation apps that occasionally give bad directions. The system works most of the time, until it doesn't. There are also blunter takes: sometimes computers glitch. That is part of the deal.

All of that may be true. It does not make moments like this feel any less unsettling when you are the one sitting in traffic next to it.

Where This Gets Complicated

Just yesterday, my family went to Target. A young woman was flying through the parking lot like it was a lane of traffic, completely unaware of pedestrians walking behind vehicles, earning herself a one-finger peace sign from just about everyone who witnessed it. In that moment, I had a brief thought: I cannot wait for self-driving cars to get here fast enough to take the wheel away from people like that.

I see something like this Waymo incident, and I’m conflicted all over again. Because here is the truth. Statistically, self-driving cars will likely be safer in the long run than human drivers. They are not going to be blowing through residential areas at 70 mph. They are not going to be drunk, distracted, or showing off.

Lower speeds alone change outcomes. Fewer high-speed impacts mean more survivable crashes, fewer catastrophic ones, and fewer lives permanently altered. So yes, on paper, it makes sense to trade human error for machine error.

Accountability Still Matters

We need to start treating 4,000-pound vehicles as loaded weapons when misused. If you charge at law enforcement with a vehicle, it is considered assault with a deadly weapon. When someone plows through a family of pedestrians, it is often treated as negligence.

So why do we treat clearly preventable, reckless behavior behind the wheel as a simple mistake instead of the deliberate choice it often is? If we actually want to close the gap between human drivers and machines, it starts here, with real accountability, not outcomes that feel disconnected from the damage done.

The Data Doesn’t Let Us off the Hook

Technology is improving, but it is not a silver bullet. Recent AAA testing shows that pedestrian automatic emergency braking is improving, with nighttime impact avoidance increasing from 0% in 2019 to 60% in 2025. That is real progress. The same study found major inconsistencies, especially at night, where high-visibility clothing sometimes improved detection and sometimes caused a complete failure. That matters when more than 75% of pedestrian fatalities happen after dark.

The broader numbers are not encouraging either. According to NHTSA, an estimated 7,314 pedestrians were killed in 2023, with more than 68,000 injured. Roadside workers are still being struck and killed every year. So yes, technology can reduce mistakes and step in when a driver fails, but it cannot replace accountability.

Enforce speed limits. Take away licenses. Driving is a privilege, not a right. Hold drivers civilly liable for the damage they cause, and put repeat offenders in jail when necessary. Mandate safety systems like automatic emergency braking, but do not pretend they replace responsible driving. Stop letting drunk drivers back on the road like nothing happened, and hold both drivers and establishments accountable.

If someone drives in a way that kills a family, that should not end in probation. The standard has to be higher than that. Until it is, we are not fixing the problem; we are just working around it. Maybe most importantly, demand that prosecutors and judges actually enforce the laws that already exist.

Author: Michael Andrew

Michael is one of the founders of Guessing Headlights, a longtime car enthusiast whose childhood habit of guessing cars by their headlights with friends became the inspiration behind the site.

He has a soft spot for Jeeps, Corvettes, and street and rat rods. His daily driver is a Wrangler 4xe, and his current fun vehicle is a 1954 International R100. His taste leans toward the odd and overlooked, with a particular appreciation for pop-up headlights and T-tops, practicality be damned.

Michael currently works out of an undisclosed location, not for safety, but so he can keep his automotive opinions unfiltered and unapologetic.

He also maintains, loudly and proudly, that the so-called Malaise Era gets a bad rap. It produced some of the coolest cars ever, and he will die on that hill, probably while arguing about pop-up headlights.
