
What if Your Brain Deletes Reality Without You Noticing?
On December 3, 1990, Northwest Airlines Flight 1482, a McDonnell Douglas DC-9, was taxiing at Detroit Metropolitan Airport when it mistakenly entered an active runway. At the same time, Northwest Airlines Flight 299, a Boeing 727, was accelerating for takeoff.
Within seconds, the Boeing 727 collided with the DC-9, shearing off its left side. Smoke and flames engulfed the aircraft as passengers scrambled to evacuate. Eight people lost their lives.
In the investigation that followed, one detail stood out: the Boeing 727 pilots, despite looking directly ahead, did not see the DC-9 until the last moment. Conditions were poor, with thick fog reducing visibility, but fog alone didn't explain how an entire aircraft could seem to vanish from view.
What if their brains had deleted the DC-9 from their perception?
This incident may be an example of Motion-Induced Blindness (MIB), a striking but well-documented phenomenon in which the brain erases stationary objects from conscious awareness when they are surrounded by motion. And it's not just pilots who are at risk.
What Is MIB?
In 2001, Yoram Bonneh, a vision scientist at the Weizmann Institute of Science, was investigating how the brain processes motion when an anomaly caught his attention. Along with his colleagues Alexander Cooperman and Dov Sagi, experts in visual perception, Bonneh designed an experiment that required participants to fixate on a stationary object while a patterned background moved.

They weren’t studying illusions for entertainment. They were probing the limits of visual processing, searching for the moments when the brain, rather than the eye, fails to register reality.
What they uncovered was startling. Though physically unchanged, the stationary object kept vanishing from sight, only to reappear moments later. This phenomenon, now called Motion-Induced Blindness (MIB), revealed a hidden flaw in human vision: the brain, under certain conditions, can erase objects from conscious awareness.
Try It Yourself
Fix your gaze on the central point in the image below. A small dot in your peripheral vision, set against a rotating background, will vanish intermittently even though it never actually moves. This isn't a problem with your eyes. Your brain is actively deleting information, and not randomly: it is deciding what's important enough to see and what can be ignored.
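For readers who want to generate a version of the stimulus themselves, the classic demonstration is just a grid of crosses rotating around a fixation mark, with one static peripheral dot. Here is a minimal sketch of the geometry; the grid size, rotation speed, and dot position are illustrative choices, not parameters from Bonneh's study:

```python
import numpy as np

def mib_stimulus(n_frames=240, grid=7, spacing=1.0, omega=0.03,
                 dot=(1.8, 1.8)):
    """Generate frames of a Motion-Induced Blindness stimulus.

    Returns a list of (grid*grid, 2) arrays holding the rotating cross
    positions for each frame, plus the static dot position -- which
    never changes. The disappearance happens only in the observer's brain.
    """
    xs = np.arange(grid) - (grid - 1) / 2
    centers = spacing * np.array([(x, y) for x in xs for y in xs])
    frames = []
    for k in range(n_frames):
        a = k * omega  # rotation angle for this frame
        rot = np.array([[np.cos(a), -np.sin(a)],
                        [np.sin(a),  np.cos(a)]])
        frames.append(centers @ rot.T)
    return frames, np.array(dot)

frames, dot = mib_stimulus()
```

Drawing each frame at roughly 60 fps, with the crosses in blue, the dot in yellow, and a fixation mark at the origin, reproduces the standard demonstration: after a few seconds of steady fixation, the dot starts to blink out of awareness.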

According to Bonneh, this isn’t a simple illusion but a feature of human vision that may have deep evolutionary roots. “Our brain prioritizes motion over still objects,” Bonneh explains. “It’s a survival strategy. If something moves, it could be a threat. If it’s stationary, it’s less urgent, even if it’s still important.”
This system once helped early humans detect movement in dangerous environments. But in modern settings, where dangers can be stationary, such as a pedestrian at a crosswalk, a motorcycle at an intersection, or an aircraft on a runway, this built-in filtering system can become a deadly flaw.
Further research by András Sárközy, Jonathan E. Robinson, and Gyula Kovács has shown that some regions of our visual field are more prone to MIB than others, suggesting that perception gaps may vary from person to person.
And the consequences go far beyond a scientific curiosity. Drivers have failed to see motorcycles at intersections. Pilots have lost sight of aircraft on runways. AI-powered vision systems have struggled to detect stationary objects in motion-heavy environments. If our brain can erase objects from reality, how much of what we see is truly real?
How Your Brain Edits Reality
On a cold February night in 2025, a 32-year-old woman was walking along West Street in Southington, Connecticut, when a car struck her. The driver, who had traveled that road for years, continued home, completely unaware of what had happened. It wasn’t until later, after seeing police reports and noticing damage to his vehicle, that he realized he had hit someone.
A similar case occurred in 2014 in Eugene, Oregon. A driver struck a pedestrian at an intersection but only realized it hours later. Neither driver was intoxicated. Neither was distracted. They were looking straight ahead. Yet they never saw the pedestrian at all.
While human error is often blamed for such incidents, neuroscientists studying Motion-Induced Blindness (MIB) suggest that perception itself may be at fault. Sometimes, our brains erase objects that are right in front of us.
In 2001, researchers Yoram Bonneh, Alexander Cooperman, and Dov Sagi uncovered Motion-Induced Blindness; later studies sought to explain why the brain does this, and how often. When does the brain choose to erase something? Why does it prioritize motion over stillness? And could this explain why drivers and pilots fail to see objects in plain view?
Using functional magnetic resonance imaging scans (fMRI), neuroscientists found that when objects “disappear” due to MIB, the brain’s motion-processing region (V5) becomes hyperactive, while the object-processing region (V4) is suppressed. The brain chooses movement over stillness, sometimes at the cost of accuracy.
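This push-and-pull can be caricatured with a standard competition model from the perceptual-rivalry literature: two channels, one driven by the moving background and one by the static target, inhibit each other while slowly adapting. This is a toy sketch only, not a model of V5 or V4 themselves, and every parameter below is an illustrative choice:

```python
import numpy as np

def simulate(i_motion, i_object, steps=20000, dt=1.0):
    """Two mutually inhibiting channels with slow adaptation.

    Returns the mean activity of the 'object' channel over the run --
    a stand-in for how visible the static target stays.
    """
    relu = lambda x: max(0.0, x)
    w, g = 2.0, 3.0            # inhibition and adaptation strength
    tau, tau_a = 20.0, 2000.0  # fast rate dynamics, slow adaptation
    rm = ro = am = ao = 0.0
    trace = np.empty(steps)
    for t in range(steps):
        rm += dt * (-rm + relu(i_motion - w * ro - g * am)) / tau
        ro += dt * (-ro + relu(i_object - w * rm - g * ao)) / tau
        am += dt * (-am + rm) / tau_a  # adaptation tracks each rate
        ao += dt * (-ao + ro) / tau_a
        trace[t] = ro
    return trace.mean()

still = simulate(i_motion=0.0, i_object=1.0)   # static background
moving = simulate(i_motion=1.5, i_object=1.0)  # swirling background
```

With a moving background, the object channel's average activity drops sharply even though its own input never changed, the model analogue of the target fading from awareness.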
This adaptation once helped early humans survive, allowing them to track predators in dense landscapes while ignoring static backgrounds. But in today's world, where dangers can be motionless, such as a pedestrian at a crosswalk, a motorcycle at an intersection, or an aircraft on a runway, this filtering system creates dangerous blind spots.
The same effect has been reported in aviation. Pilots have lost sight of stationary aircraft while taxiing, especially when focusing on moving objects ahead. In some near-miss incidents, pilots claimed they had checked for obstacles but “never saw” the other plane, only realizing it was there moments before impact.
If this phenomenon extends beyond controlled lab experiments, it raises an urgent question: How often do we look directly at objects that our brains have already erased?
When Seeing Fails
Motion-Induced Blindness isn't just a strange quirk of perception; it's a factor in real-world accidents. From drivers overlooking motorcycles to pilots missing aircraft, the consequences of erased perception can be deadly.
In 1981, the Hurt Report, a landmark motorcycle safety study, found that in two-thirds of multiple-vehicle motorcycle accidents, the driver failed to see the motorcycle before the collision, even though it was visible.
In heavy traffic, a driver's brain prioritizes movement, sometimes causing motorcycles to vanish from awareness. Smaller, less visually dominant objects can be filtered out, making intersections especially risky. Scanning isn't enough; drivers must shift their gaze deliberately to counteract the effect. Motorcyclists can improve their visibility with bright gear, reflective elements, and slight lateral movements. While MIB hasn't been proven to cause crashes, its characteristics mirror real-world perception failures, pointing to a serious safety risk.
Pilots train rigorously to prevent visual lapses, yet runway incursions and mid-air collisions persist. Military pilots learn frequent gaze shifts to counteract perception failures, a strategy now adopted in commercial aviation. Modern cockpit displays use blinking and color-shifting markers to keep stationary aircraft visible.
NASA and Aircraft Owners and Pilots Association (AOPA) studies show that systematic visual scanning reduces missed hazards. Pilots who don’t shift focus regularly are more likely to overlook stationary objects, especially in dynamic environments. Though the 1990 Detroit collision is often linked to low visibility, other incidents show that pilots can miss objects even in clear conditions. This raises a key question: Do AI systems share similar perceptual blind spots?
AI-powered vision systems are revealing perception gaps similar to Motion-Induced Blindness. In 2018, a Tesla Model S on Autopilot crashed into a stationary fire truck because its system, designed to track movement, failed to detect the immobile vehicle. AI models trained for moving hazards often struggle with static objects. A recent LiDAR study found that continuous obstruction from dynamic elements can render stationary objects "invisible" (arXiv).

To close this gap, developers are refining sensor fusion, combining radar, LiDAR, and cameras to reduce blind spots. The GM Safety Report highlights how cross-referencing multiple inputs improves object detection, while advances in computer vision help AI distinguish static from moving hazards, much as pilots shift focus to counteract MIB.
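The cross-referencing idea is simple to sketch: keep any object confirmed by either sensor, so a static vehicle dropped by a motion-biased camera pipeline still survives via radar. A minimal illustration, where the `Detection` type, the distance gate, and the sensor labels are invented for this example and don't reflect any vendor's actual API:

```python
from dataclasses import dataclass

@dataclass
class Detection:
    x: float      # position ahead of the vehicle, metres
    y: float      # lateral offset, metres
    source: str   # which sensor produced it

def fuse(camera, radar, gate=2.0):
    """Union two detection lists, merging pairs closer than `gate` metres.

    Crucially, radar-only detections are kept: this is how a stationary
    obstacle missed by a motion-tuned camera model stays on the map.
    """
    fused, used = [], set()
    for c in camera:
        match = next((i for i, r in enumerate(radar)
                      if i not in used
                      and (c.x - r.x) ** 2 + (c.y - r.y) ** 2 <= gate ** 2),
                     None)
        if match is None:
            fused.append(c)  # camera-only object
        else:
            used.add(match)  # confirmed by both sensors: merge positions
            r = radar[match]
            fused.append(Detection((c.x + r.x) / 2, (c.y + r.y) / 2, "fused"))
    fused += [r for i, r in enumerate(radar) if i not in used]
    return fused

camera = [Detection(30.0, 0.5, "camera")]   # moving car ahead
radar = [Detection(30.4, 0.2, "radar"),     # the same car
         Detection(55.0, -1.0, "radar")]    # a parked fire truck
tracks = fuse(camera, radar)
```

Here `tracks` contains two objects: the moving car as a fused track, and the parked truck carried through on radar alone, the fusion-level analogue of a second pair of eyes that doesn't share the first one's bias.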
But the question remains: Can AI truly “see,” or does it share our blind spots? If both humans and machines overlook what’s right in front of them, how do we stop the unseen from leading to disaster?
Can We Stop It?
Motion-induced blindness operates beneath conscious awareness, making it difficult to detect in real time. However, research suggests our brains and technology can be trained to compensate. In high-risk environments, pilots are taught to shift focus constantly, preventing static objects from fading from perception. Studies in aviation psychology show that those who regularly scan their surroundings are far less likely to experience perceptual blind spots.
Drivers can apply similar techniques. Checking an intersection twice isn’t enough if the brain erases an object between glances. Experts recommend actively shifting focus and expanding peripheral awareness, which helps prevent motorcycles or pedestrians from vanishing from perception.
Technology is evolving to detect what humans miss. Collision-detection systems in cars and aircraft now integrate LiDAR, radar, and AI-driven pattern recognition, helping flag stationary objects that might otherwise escape notice. Autonomous vehicle developers are refining AI vision models to prevent MIB-like perception failures, ensuring that static hazards are no longer deprioritized.
ADHD and MIB: A Different Perspective?
Not everyone experiences MIB the same way. Research suggests individuals with ADHD, known for attention shifts and hyperfocus, may perceive it differently. Some theories suggest they resist MIB due to heightened sensitivity to unexpected stimuli, while others propose their distractibility makes objects vanish and reappear more erratically. Though research is ongoing, it raises an intriguing question: Could those with ADHD see the world differently, perhaps even more completely, than others?
But even as we refine these solutions, a deeper question remains. MIB isn't just a flaw; it's a window into how our brains construct reality. Cognitive scientists have uncovered other perception failures, from inattentional blindness to false memories, all revealing the same unsettling truth:
What we see isn't a direct reflection of reality; it's a filtered version shaped by our brains.
Pilots, drivers, and AI systems rely on vision to navigate the world. But whether biological or artificial, perception isn’t a flawless recording; it’s an interpretation. MIB shows us that our brains actively erase objects from awareness, prioritizing what they believe matters most. AI, too, struggles with similar blind spots, missing stationary hazards in fast-moving environments.
The result? Collisions, accidents, and near misses caused not by what was absent, but by what was there all along, unseen. So, if the world you see isn't the full picture, how much of reality is slipping away unnoticed?