What if a cheap mirror could bring your self-driving car to a screeching halt—or, worse, make it miss a real obstacle entirely? Sound like sci-fi? Think again.

In a jaw-dropping experiment straight from a European university campus, a team of French and German scientists unleashed a new threat on the world of autonomous vehicles: the humble mirror. Their findings show that self-driving cars, those futuristic rides cruising on LIDAR guidance, can be tricked into seeing ghosts or going blind, all thanks to simple reflective surfaces.

LIDAR, if you haven’t heard, is the high-tech laser scanning system that’s the backbone of most autonomous vehicles (with the notable exception of Tesla, which relies on cameras instead). These sensors send out pulses of light and time the echoes to map the world around them. But there’s a catch: reflective surfaces, like mirrors, can completely scramble that sense of reality. This isn’t just a theoretical vulnerability, either: researchers have previously tripped up LIDAR with tinfoil and colored cards. But this new study exposes a next-level hack that’s almost too simple to believe.
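To see why mirrors are such a problem, consider how LIDAR ranging actually works. The sensor times each pulse's round trip and converts it to distance. Here's a minimal sketch (all distances are hypothetical illustration values, not from the study) of that calculation, and of how a mirror redirecting the beam along a longer path makes a nearby object effectively disappear:

```python
# Speed of light in m/s.
C = 299_792_458.0

def distance_from_echo(round_trip_s: float) -> float:
    """LIDAR infers range from a pulse's round-trip time: d = c * t / 2."""
    return C * round_trip_s / 2

# A real obstacle 10 m away echoes back after roughly 67 nanoseconds.
t_direct = 2 * 10.0 / C
print(round(distance_from_echo(t_direct), 6))  # -> 10.0

# A mirror bounces the beam past the obstacle: say 10 m to the mirror,
# then 15 m to whatever it reflects next. The sensor dutifully reports
# a point 25 m out, and the object at 10 m vanishes from the point cloud.
t_mirrored = 2 * (10.0 + 15.0) / C
print(round(distance_from_echo(t_mirrored), 6))  # -> 25.0
```

The key point: the sensor has no way to know the light took a detour, so a redirected echo looks exactly like a more distant surface.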

For their real-world test, the researchers ran a car using the popular Autoware navigation software in a university parking lot. Through a method they dubbed the “Object Removal Attack” (ORA), they carefully positioned mirrors to cover a traffic cone. With the right mirror size and angle, the car’s LIDAR system basically erased the cone from its digital brain—so much so that the car was ready to blithely drive right through it.

But it gets spookier. With the “Object Addition Attack” (OAA), the team used tiny mirror tiles to create phantom obstacles out of thin air. The car, which was under manual control for safety, detected this fake object some 20 meters away and tried to avoid it, slamming on the brakes for something that simply wasn’t there. Even more alarming, when multiple mirrors were arranged in a grid, the success rate for fooling the car’s system shot up to 74%. That’s a recipe for chaos in real-world traffic.
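The grid effect has a simple intuition: each mirror tile injects a few spurious returns at the phantom location, and a detector that clusters points will declare an obstacle once enough returns pile up. Here's a toy model of that logic; the points-per-tile and detection-threshold numbers are made-up assumptions for illustration, not values from the paper:

```python
def phantom_detected(num_mirror_tiles: int,
                     points_per_tile: int = 4,
                     detection_threshold: int = 10) -> bool:
    """Assume each mirror tile reflects a few laser beams toward the same
    phantom spot, and a naive detector flags an obstacle once the point
    count there crosses a threshold."""
    spurious_points = num_mirror_tiles * points_per_tile
    return spurious_points >= detection_threshold

print(phantom_detected(1))  # -> False : one tile yields too few points
print(phantom_detected(9))  # -> True  : a 3x3 grid easily clears the bar
```

Real perception stacks are far more sophisticated than a point-count threshold, but the principle scales the same way: more tiles, more fake returns, higher odds the planner brakes for nothing.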

As the researchers wrote in their paper (currently under review for publication), these “practical threats” can trigger critical safety failures—think emergency braking or refusing to yield—by exploiting basic physics. The kicker? All it takes is a handful of cheap mirrors, not a Hollywood-level hacker setup.

Can self-driving cars fight back? The team experimented with possible defenses, like adding thermal imaging to spot real, warm-blooded obstacles. But there’s no silver bullet here, especially for small objects or in hot environments where thermal signatures blur.

Luckily, these experiments happened at low speeds in a parking lot, not on the highway. So, while your next robotic taxi ride probably isn’t in grave danger just yet, this research is a wake-up call: sometimes, the simplest tricks are the most dangerous.