If you missed the class or need to hear it again (lots of information on our upcoming semester), view the recording here!
Introduction: Perception vs. Reality: How the Brain Decides What Is “Real”
(Dijkstra et al., 2025)
Overview
One of the most remarkable and often overlooked jobs of the human brain is deciding what counts as reality. At any given moment, the brain is flooded with sensory input from the external world while simultaneously generating internal experiences such as memories, expectations, emotions, and imagination. From a neuroscience perspective, perception is not a passive recording of the world “as it is.” Instead, perception is an active construction, an ongoing decision process in which the brain evaluates sensory evidence and determines whether an experience should be treated as real, imagined, or uncertain. Research in cognitive neuroscience shows that perception and imagination are deeply intertwined at the neural level. Rather than being processed in separate brain systems, imagined and perceived experiences rely on overlapping neural circuits, particularly within the sensory cortex. This neural overlap is efficient and adaptive; it allows humans to mentally simulate future events, rehearse actions, and learn without direct experience. However, it also introduces ambiguity: if imagination and perception use similar brain machinery, how does the brain know the difference?
The Perception Loop
(Dijkstra et al., 2025)
Perception is created through a continuous loop:
1. Your brain is always guessing. Before you even look at something or hear a sound, your brain has already made a prediction about what you'll experience, based on what's normal.
2. Your senses deliver the facts. Your eyes, ears, and other senses send raw data about what's happening in the world right now.
3. Your focus acts as a filter. Your brain compares its guess to the facts and asks, "What's different or important here?" That important difference grabs your attention.
4. Your past fills in the blanks. To understand what you're sensing, your brain instantly searches your memories for similar experiences, giving the raw data meaning and context.
5. Your brain makes a final call. It decides if this experience is real and worth responding to, then updates its predictions for the future, starting the loop all over again.
In a nutshell: Perception isn't just receiving information; it's your brain's continuous, active process of guessing, checking, and interpreting to build your moment-to-moment reality.
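To make the loop concrete, here is a minimal sketch of the guess-check-update cycle in Python. It is a toy model rather than a description of real neurons: the world, the prediction, and the sensory data are single numbers, and the learning rate is an illustrative assumption.

```python
import random

def perception_loop(true_world, steps=10, learning_rate=0.4, noise_sd=0.5):
    """Toy predictive-coding loop: predict, sense, compare, update."""
    prediction = 0.0                          # step 1: the brain's starting guess
    for _ in range(steps):
        sensed = true_world + random.gauss(0, noise_sd)  # step 2: noisy sensory data
        error = sensed - prediction           # step 3: "what's different here?"
        prediction += learning_rate * error   # steps 4-5: update the internal model
        print(f"sensed={sensed:5.2f}  prediction={prediction:5.2f}  error={error:+.2f}")
    return prediction

perception_loop(true_world=3.0)  # the guess converges toward the world
```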
Visual perception is how your brain takes in what your eyes see and makes sense of it. It helps you notice, remember, and understand visual details so you can interact with the world.
1) Shared Neural Pathways: Why Imagination Can Feel Real
When we imagine a visual scene—such as a face, an object, or a pattern—the brain does not simply “think about it” in an abstract way. Instead, it partially reactivates the same visual regions that are engaged when we actually see something. Functional brain imaging studies show that imagined images can activate the visual cortex in a graded manner, with stronger imagery producing stronger sensory-like signals.
This reuse of sensory circuits offers clear advantages:
• It allows the brain to predict outcomes, plan actions, and simulate possibilities.
• It supports creativity, problem-solving, and memory recall.
• It enables learning from mental rehearsal rather than direct trial and error.
At the same time, this overlap creates a fundamental challenge: vivid imagination can sometimes be mistaken for perception, especially when external sensory input is weak, ambiguous, or noisy.
Predictive coding, step by step: the brain first forms a prediction, based on past experience and beliefs, about what it expects to see, hear, or feel. Sensory information from the eyes, ears, and body is then compared to that prediction. If the input matches the prediction, the brain keeps its current belief; if it doesn’t, a prediction error is generated. That error is sent back through the brain to update the internal model. Over time, this constant loop of predicting, comparing, and correcting is how perception becomes more accurate and how learning happens.
2) A Shared Sensory Signal and the “Reality Threshold”
Rather than labeling experiences as “real” or “imagined” at the sensory level, the brain appears to compute what researchers call a reality signal—a continuous measure of sensory strength that reflects both external input and internal imagery. Perceived and imagined signals are combined within the visual system, producing a graded sensory experience that varies in intensity.
Crucially, the brain does not automatically tag this signal as real or imagined. Instead, it compares the strength of the combined signal against an internal decision threshold:
• If the signal exceeds the threshold, the experience is more likely to be judged as real.
• If the signal remains below the threshold, the experience is more likely interpreted as imagination.
This helps explain why:
• Daydreams can feel unusually vivid.
• People may experience false alarms (“I thought I saw something”).
• Imagination can blur into perception during fatigue, stress, or low visibility.
These experiences are not errors in the system—they are the natural outcome of how sensory evidence is evaluated in the brain.
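A minimal sketch of this threshold idea, assuming (purely for illustration) that external input and imagery add into one noisy shared signal that is compared against a fixed internal threshold; every parameter below is invented for the demo:

```python
import numpy as np

rng = np.random.default_rng(1)

def judged_real(external=0.0, imagery=0.0, noise_sd=0.5, threshold=1.0, trials=10_000):
    """Fraction of trials judged 'real': perception and imagery feed one
    shared sensory signal, and the experience counts as real whenever the
    combined (noisy) signal exceeds the internal reality threshold."""
    signal = external + imagery + rng.normal(0.0, noise_sd, trials)
    return (signal > threshold).mean()

print("weak stimulus alone:    ", judged_real(external=0.6))
print("vivid imagery alone:    ", judged_real(imagery=0.8))             # false alarms
print("weak stimulus + imagery:", judged_real(external=0.6, imagery=0.8))
```

Note how imagery alone already produces some "I thought I saw something" trials, and how imagery plus a weak stimulus pushes the combined signal over the threshold far more often.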
Did you know that how the brain creates reality is similar to the way engineers build AI? Through processes like Bayesian inference and predictive coding, our minds generate reality through 'best guesses' based on limited sensory input. Learn more with Dr. Shamil Chandaria of Oxford in our latest video.
3) The Visual Brain as an Integrator: The Fusiform Gyrus
A key brain region involved in integrating imagined and perceived information is the fusiform gyrus, a mid-level visual area known for processing complex visual features such as shapes, patterns, and faces. Importantly, this region does not simply register raw visual input. Instead, it integrates information from both internal imagery and external perception.
Neuroscience research shows that activity in the fusiform gyrus closely tracks:
• how vivid an experience feels,
• whether a person ultimately judges an experience as real, and
• even false perceptions—cases where no external stimulus is present, but the experience is judged as real.
When imagination and perception align—such as imagining the same visual pattern one is trying to detect—the combined sensory signal becomes stronger. This increases the likelihood that imagination will cross the reality threshold and be mistaken for perception. Importantly, these confusions are not due to inattention or carelessness; they arise naturally from how sensory signals are summed in the brain.
4) From Sensation to Conscious Decision: The Role of Prefrontal Cortex
While sensory regions produce a graded signal, the brain must eventually make a binary decision: Was this real or not? This final step involves higher-order brain regions, particularly in the prefrontal cortex, which are responsible for evaluation, monitoring, and conscious awareness.
• Attention amplifies sensory signals (raises them closer to the “reality threshold”)
• What we attend to feels:
o more vivid
o more real
o more important
• Attention is limited → reality is selective
We don’t see reality as it is—we see what our brain decides to highlight.
Tie to daily life:
• Driving and “not seeing” a pedestrian
• Phone notifications hijacking perception
• Anxiety narrowing attention (threat bias)
Research on perceptual decision-making shows that frontal brain regions transform continuous sensory information into discrete judgments. These regions interact with visual areas to convert “how strong it felt” into a conscious report of reality.
Binocular Rivalry: Evidence That Brain Rhythms “Gate” Awareness
(Max Planck Society, 2023)
Additional evidence for the role of the prefrontal cortex comes from studies of binocular rivalry, a phenomenon in which each eye receives a different image. Instead of perceiving both images at once, awareness alternates spontaneously between them—without any change in the external stimulus. Neuroscientists have found that these perceptual switches can be predicted by characteristic patterns of brain waves in the prefrontal cortex, particularly low-frequency oscillations (1–9 Hz) and beta waves (20–40 Hz), which appear just before a shift in conscious perception occurs. These findings challenge older theories suggesting that perception is determined solely by competition between neurons in early visual areas. Instead, large-scale brain oscillations in frontal regions act as gatekeepers, determining which sensory information gains access to conscious awareness.
1) Perception, Consciousness, and the Global Workspace
The discovery that prefrontal brain activity predicts perceptual switches supports a refinement of the global workspace theory of consciousness, which proposes that conscious experience arises when information is broadcast across a widespread network of brain regions. In this framework, sensory information must be amplified and globally shared—often through coordinated brain rhythms—before it becomes part of conscious awareness. Importantly, researchers emphasize that brain waves do not encode what we perceive, but rather whether a percept reaches consciousness. In other words, the brain’s oscillatory patterns control access to awareness, not the content itself.
2) Why This Matters in Everyday Life
Understanding how the brain distinguishes imagination from reality helps explain many everyday experiences, including:
• vivid daydreams and mental imagery,
• misperceptions and false alarms,
• visual distortions under stress, fatigue, or emotional arousal, and
• the influence of expectations and prior beliefs on what we “see.”
Rather than relying on a perfect separation between internal and external signals, the brain uses a practical and efficient strategy: it evaluates sensory strength and applies a threshold to decide what counts as real. This system works remarkably well most of the time—but under certain conditions, the boundary between perception and imagination naturally blurs.
Perception → Decision: How Constructed Reality Drives Behavior
• Perception happens before reasoning.
• Decisions feel rational but are built on:
o predicted reality
o attention bias
o memory weighting
Everyday examples:
• First impressions
• Medical decision-making
• Financial choices
• Relationship misunderstandings
Key takeaway: We don’t decide based on facts—we decide based on perceived reality.
The perceptual cycle, step by step:
1. Sensory input: raw data from the world (vision, sound, motion). Example: you see traffic while driving.
2. Cognition: the “thinking engine,” made of:
o mental representation – your internal model of what’s happening
o situation awareness – “Where am I? What matters right now?”
o decision making – “What should I do?”
o action planning – “How exactly do I do it?”
This is where predictions live.
3. Action: executive functions turn the plan into movement. Hands steer, feet press pedals, eyes track.
4. Feedback: your actions change the world, the world sends new sensory info back, and the brain updates its model.
That’s the perceptual cycle — nonstop feedback.
• You don’t see reality — you interpret it.
• Your actions are based on your mental model, not raw input.
• Errors happen when the mental model is wrong, outdated, or overloaded.
Predicting Reality in Time: How the Brain Anticipates What Will Happen Next
(Max Planck Society, 2026; Grabenhorst et al., 2026)
Overview
Perception is not only about determining what is real—it is also about predicting when reality will unfold. The brain is constantly preparing for the immediate future, continuously estimating how likely it is that an event will occur within the next few seconds. Recent neuroscience research shows that the brain calculates these probabilities moment by moment, using them to guide attention, perception, and rapid action. Rather than passively waiting for events to happen, the brain actively anticipates them. This predictive process allows humans to respond effectively to environments that change at very different speeds. A video game player reacts to events unfolding in milliseconds, while a boxer anticipates an opponent’s punch over a span of seconds. In both cases, the brain is doing the same fundamental thing: estimating when something is likely to happen and preparing the body and mind to respond.
1) A Three-Second Prediction Window
Neuroscience research suggests that the brain continuously evaluates the probability that an event will occur within a short, rolling time window of approximately three seconds. Within this window, the brain adjusts perception, attention, and motor readiness based on how likely an event is to occur at any given moment.
Importantly, this prediction process is scale-free, meaning:
• The brain uses the same basic probability calculation whether an event is expected in a few hundred milliseconds or several seconds.
• There is no separate timing system for “fast” versus “slow” events.
• Prediction operates consistently across different time scales, at least up to three seconds.
This unified strategy helps explain why humans can adapt so flexibly to new situations and changing environments, even when timing patterns are unfamiliar.
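One way to picture this scale-free idea is sketched below: the same conditional-probability computation ("given the event has not happened yet, how likely is it inside my window?") is applied at a fast and a slow time scale. The event-time distributions and all numbers are hypothetical, chosen only to show that rescaling time leaves the calculation unchanged.

```python
from scipy import stats

def prob_in_window(t_now, window, event_time_dist):
    """P(event occurs in (t_now, t_now + window] | it has not occurred by t_now)."""
    survive_now = event_time_dist.sf(t_now)             # P(event later than t_now)
    survive_later = event_time_dist.sf(t_now + window)
    return (survive_now - survive_later) / survive_now

# A gamer's ~300 ms world and a boxer's ~3 s world: same formula, rescaled.
fast = stats.norm(loc=0.3, scale=0.05)   # event expected around 300 ms
slow = stats.norm(loc=3.0, scale=0.5)    # event expected around 3 s
print(prob_in_window(0.25, 0.10, fast))  # ~0.81
print(prob_in_window(2.50, 1.00, slow))  # ~0.81: identical once time is rescaled
```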
2) Probability Sharpens Perception—and the Sense of Time
One of the most striking findings from this research is that the accuracy of our sense of time depends on probability. When an event is highly likely to occur at a particular moment, the brain tracks time with greater precision. When an event is less likely, temporal precision decreases, and our sense of timing becomes more diffuse.
This finding challenges a long-standing principle in psychology known as Weber’s law, under which the relative precision of time estimates should stay constant rather than varying with factors such as event probability. Instead, the brain appears to allocate timing accuracy strategically, sharpening temporal perception when it matters most.
In practical terms:
• High probability → sharper timing, faster reactions
• Low probability → fuzzier timing, slower or less precise responses
This adaptive mechanism allows the brain to conserve resources while remaining ready for likely events.
3) Linking Prediction to Perception and Reality Judgments
These timing predictions interact directly with perception and reality testing. When the brain strongly expects an event to occur at a particular moment, sensory signals arriving at that time are more likely to be amplified and interpreted as meaningful—or even real. Conversely, when timing expectations are weak, sensory input may be ignored, misjudged, or fail to reach conscious awareness.
This timing-based prediction system helps explain:
• why we are more likely to notice stimuli we are expecting,
• why sudden, unexpected events can feel disorienting or unreal, and
• why fatigue or uncertainty can distort both perception and timing.
In this way, perception vs. reality is not only about signal strength (as described in sensory threshold models) but also about temporal expectation. What feels real is shaped not just by how strong a sensory signal is, but by whether it arrives when the brain predicts something should happen.
These predictive loops run through the whole brain, not a single region:
• Distributed cortical networks: perception, thinking, memory, meaning. The brain isn’t linear; everything is talking to everything.
• Thalamus: main relay + filter. Decides what sensory info gets broadcast to the cortex; think of it as the brain’s router.
• Amygdala: emotional salience & threat. Flags things as important or dangerous.
• Hippocampus: memory + context. Answers: “Have I seen this before?” “What does this mean here?”
• Basal ganglia: action selection & habits. Chooses which action to run.
• Cerebellum: prediction + error correction. Makes actions smooth and efficient (physical and cognitive). Hot take: the cerebellum is massively underrated — it’s a prediction engine.
And the loop extends beyond the skull:
• Your body sends constant internal signals (heart rate, tension, pain).
• The environment feeds sensory data.
• Social interactions shape meaning and emotion.
• All of it loops back into the brain.
4) Why This Matters
Together with research on imagination, sensory thresholds, and prefrontal decision-making, these findings reinforce a central idea: the brain is a prediction machine. It constantly estimates what is likely to happen next, when it will happen, and whether incoming information matches those expectations. Perception, consciousness, and reality judgments emerge from this ongoing comparison between prediction and sensory evidence. Understanding this process deepens our insight into attention, decision-making, learning, and even clinical conditions in which timing and prediction break down. The brain does not simply react to the world—it prepares for it, second by second, shaping what we perceive as real.
How the Brain Constructs Reality
(Buzsáki, 2019, 2022)
1) Perception Is an Active Process, Not a Recording
We often assume that our brains work like cameras—taking in information from the outside world and showing us reality exactly as it is. Neuroscience tells a very different story. The brain does not passively receive reality; it actively constructs it. Rather than simply reacting to sights, sounds, and sensations, the brain is constantly generating internal activity. Sensory information from the eyes, ears, and body does not arrive with built-in meaning. Instead, the brain compares incoming signals with its own internal patterns, expectations, and past experiences to decide what those signals mean. In other words, perception is not something that happens to us—it is something the brain does.
2) Why Two People Can Experience the Same Thing Differently
Because the brain relies on internal predictions and prior experience, two people can look at the same situation and perceive it in completely different ways. What we “see” depends on:
• past memories,
• expectations,
• emotional state,
• attention, and
• movement and interaction with the environment.
This explains why optical illusions work, why memories change over time, and why eyewitness accounts can conflict. The brain is always making its best guess about the world, not delivering a perfect copy of it.
Memory: Why the Past Shapes What Feels Real
• Memory is reconstructive, not playback
• The brain uses memory as prior probability
• Strong emotional memories bias perception
Everyday examples:
• Trauma and threat perception
• Nostalgia coloring reality
• Why eyewitnesses disagree
Aging tie-in:
• Older adults rely more on prior knowledge → sometimes better judgment, sometimes stronger bias
• This is adaptation, not decline
3) Action Comes Before Understanding
One of the most important discoveries in modern neuroscience is that movement and action help create perception. The brain learns what something is by interacting with it—looking closer, touching it, moving around it, or changing perspective. When we move our eyes, turn our heads, or reach for an object, the brain sends signals not only to our muscles but also to sensory areas of the brain. These signals help the brain distinguish between changes caused by our own actions and changes happening in the environment. This internal communication allows the brain to stabilize perception and assign meaning to what we experience. This is why perception improves through exploration and why learning is deeper when we actively engage rather than passively observe.
4) The Brain Is Never Truly “At Rest”
Even when we are sitting quietly, daydreaming, or sleeping, the brain remains highly active. During these moments, it replays past experiences, simulates future possibilities, and strengthens memories. These internally generated brain patterns help us plan, imagine, problem-solve, and make decisions. From this perspective, thinking itself can be understood as internalized action—the brain rehearsing possibilities without physically acting them out. This ability allows humans to reflect, plan ahead, and adapt to complex environments.
5) Why This Matters for Brain Health
Understanding that perception is constructed—not fixed—empowers us. It means that:
• our thoughts are not always facts,
• our perceptions can be trained and refined,
• curiosity, movement, and learning strengthen brain flexibility, and
• aging brains remain capable of adaptation and growth.
Key takeaway for students: Reality is not simply what happens around us—it is shaped by how our brain predicts, explores, and interprets experience. The more actively we engage with the world, the sharper and more flexible our perception becomes.
How the Brain Makes Sense of What We See
(Oude Lohuis et al., 2022; Pennartz, 2022)
1) Vision Is a Team Effort Across the Brain
Vision often feels effortless, as if our eyes simply send images to the brain and the brain instantly understands them. In reality, seeing is a complex, collaborative process that involves multiple brain systems working together. What we experience as “visual reality” is the brain’s best interpretation—not a direct recording of the outside world.
Modern neuroscience strongly supports this idea: perception depends not only on what enters our eyes, but also on memory, attention, and information from other senses.
2) Vision Does Not Work Alone: Multisensory Demands
Recent neuroscience research shows that the visual system does not operate in isolation. When the brain is processing what we see, it is simultaneously integrating information from hearing, touch, and body position. Studies demonstrate that the time it takes the brain to interpret a visual scene changes depending on whether other senses are involved. In particular, the causal time window during which primary visual cortex contributes to detection is extended when task demands grow, for example when an additional sensory modality must be monitored. This supports the idea that perception is an active construction rather than a passive response to visual input.
3) The Brain Builds “Best-Guess” Representations
Neuroscientists describe perception as the brain creating a best-guess model of the environment. Rather than accessing reality directly, the brain builds an internal representation that feels complete and immediate, even though it is shaped by prediction and prior experience.
This framework—neurorepresentationalism—proposes that conscious experience is a multimodal, situation-like internal model of body and environment, while we nevertheless have the impression of experiencing external reality directly. This helps explain why illusions feel real, why attention alters perception, and why our experience of the world can shift depending on context or emotional state.
How the Brain Recognizes What the Eye Sees
(Rowekamp & Sharpee, 2017; Hubel & Wiesel, 1968; Bar et al., 2006; Desimone & Duncan, 1995)
1) From Sensory Input to Meaningful Understanding
Seeing may feel immediate and effortless, but the neuroscience behind visual recognition is anything but simple. What begins as light entering the eyes must be transformed through multiple stages of brain processing before it becomes something meaningful—like recognizing a face, reading a word, or identifying an object in motion.
Visual signals travel from the retina to the primary visual cortex (V1), where the brain extracts basic features such as edges and contrast. From there, processing continues into higher visual regions including V2, where the brain begins combining these building blocks into more complex representations. Research analyzing V2 responses to natural scenes shows that V2 neurons respond to combinations of edges and other multi-feature patterns, reflecting organizing principles for how the brain begins assembling meaningful visual structure from complex inputs (see also the accessible summary by the Salk Institute, 2017).
What V1 Does — and Does Not Do
The primary visual cortex (V1) is often mistakenly described as a “camera” or “image screen.” In reality, V1 does not encode objects or scenes. Instead, it detects local features such as:
• edges,
• orientation,
• spatial frequency,
• contrast, and
• motion direction.
These features are fragmented and meaningless on their own.
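To make these local features concrete, here is a small sketch of the textbook model of a V1 simple cell, an oriented Gabor filter; the sizes and tuning values below are illustrative choices, not biological measurements.

```python
import numpy as np

def gabor(size=15, wavelength=6.0, theta=0.0, sigma=3.0):
    """Oriented Gabor patch, the textbook model of a V1 simple-cell
    receptive field: a sinusoidal grating at angle theta under a
    Gaussian envelope (odd phase, so it acts as an edge detector)."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)        # rotated coordinate
    envelope = np.exp(-(x**2 + y**2) / (2 * sigma**2))
    return envelope * np.sin(2 * np.pi * xr / wavelength)

# A vertical light/dark edge drives the vertically tuned filter strongly
# and the horizontally tuned one not at all: orientation selectivity.
image = np.zeros((15, 15))
image[:, 8:] = 1.0                                    # vertical edge
for theta, label in [(0.0, "vertical  "), (np.pi / 2, "horizontal")]:
    print(label, "filter response:", round(float(np.sum(image * gabor(theta=theta))), 2))
```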
Why V2 Matters
Visual area V2 plays a crucial role in integrating these fragmented signals. Research shows that V2 neurons respond to combinations of features, not isolated lines or edges. This makes V2 a critical transition zone where raw sensory data begins to acquire structure and meaning.
Importantly:
• V2 activity reflects contextual interpretation
• It is sensitive to both bottom-up sensory input and top-down feedback from higher cortical areas
• It contributes to filling-in, contour completion, and perceptual stability
This explains why perception feels continuous even when sensory input is incomplete or noisy.
Motion perception is thought to be mediated by the dorsal visual pathway. However, Anna Roe and colleagues have reported data demonstrating that V2 is also involved in motion processing, suggesting that the ventral visual pathway plays a role in motion perception as well (Lu et al., Neuron 68(5)).
Sensory Cortex Does Not Work Alone
Sensory regions such as V1, V2, and the fusiform gyrus receive massive feedback projections from frontal and parietal regions. In fact, in the adult brain, top-down connections often outnumber bottom-up ones.
The prefrontal cortex (PFC):
• biases perception toward task-relevant features,
• amplifies expected signals,
• suppresses irrelevant noise, and
• helps resolve ambiguity.
This is why attention and expectation can dramatically alter what we consciously perceive.
Attention as Gain Control
Neuroscientific studies show that attention functions like a gain control system, increasing the signal-to-noise ratio of selected sensory inputs. When attention is directed toward a stimulus:
• sensory responses become stronger,
• perception becomes more vivid,
• and the stimulus is more likely to reach conscious awareness.
This explains why:
• inattentional blindness occurs,
• anxiety narrows perception toward threat,
• multitasking degrades reality testing.
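A sketch of the gain-control idea, under the simplifying assumption that attention multiplies the sensory signal before it is compared, in noise, against a fixed decision threshold (all numbers invented):

```python
import numpy as np

rng = np.random.default_rng(0)

def detection_rate(signal, gain=1.0, noise_sd=1.0, threshold=2.0, trials=100_000):
    """Fraction of trials on which a noisy sensory response crosses the
    decision threshold. Attention is modeled as multiplicative gain on
    the signal, which raises its signal-to-noise ratio."""
    responses = gain * signal + rng.normal(0.0, noise_sd, trials)
    return (responses > threshold).mean()

weak_stimulus = 1.5
print("unattended:", detection_rate(weak_stimulus, gain=1.0))  # often missed
print("attended:  ", detection_rate(weak_stimulus, gain=1.6))  # same stimulus, far more often seen
```

The same weak stimulus crosses the threshold far more often once amplified, which is one way to read inattentional blindness: without attentional gain, the signal simply stays sub-threshold.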
2) Why Visual Perception Is Strong but Not Perfect
The brain’s visual system is designed to be efficient and robust, not flawless. By stabilizing likely patterns and suppressing noise, the brain supports fast recognition under imperfect conditions. At the same time, these shortcuts help explain why optical illusions work and why perception can vary from person to person.
3) Why This Matters for Brain Health and Aging
Understanding how the brain recognizes what we see helps reinforce a central message for lifelong learning: perception and interpretation remain adaptable throughout life.
Visual perception, like learning, benefits from:
• exploration rather than passivity,
• multiple perspectives,
• practice and repetition, and
• meaningful real-world application.
Key takeaway for students: Seeing is not just about eyesight—it is about how the brain organizes patterns and actively creates meaning.
The Brain’s Prediction Engine (Predictive Processing Framework)
(Friston, 2010; Clark, 2013)
Perception as Bayesian Inference
Modern neuroscience increasingly describes perception as a form of Bayesian inference, in which the brain continuously combines prior knowledge (expectations, memories) with incoming sensory evidence to generate the most likely interpretation of the world.
From this perspective:
• Sensory input = evidence
• Memory & expectation = prior probability
• Perception = the brain’s best statistical guess
Neural activity does not simply reflect what is “out there,” but what the brain expects to be there, corrected by incoming signals. When sensory input is weak or ambiguous, prior expectations carry more weight. When sensory input is strong and precise, it can override prior beliefs.
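As a worked sketch of that weighting, assuming a Gaussian expectation and Gaussian evidence with illustrative numbers:

```python
def bayes_percept(prior_mean, prior_sd, sense_mean, sense_sd):
    """Precision-weighted fusion of expectation and evidence (both Gaussian).
    The posterior mean is the 'best statistical guess'; whichever source
    is more precise (smaller sd) pulls the percept harder."""
    w_prior = 1 / prior_sd**2             # precision of the prior
    w_sense = 1 / sense_sd**2             # precision of the sensory evidence
    mean = (w_prior * prior_mean + w_sense * sense_mean) / (w_prior + w_sense)
    sd = (1 / (w_prior + w_sense)) ** 0.5
    return round(mean, 2), round(sd, 2)

# Noisy, ambiguous input: the prior dominates the percept.
print(bayes_percept(prior_mean=0.0, prior_sd=1.0, sense_mean=5.0, sense_sd=4.0))
# Crisp input: the evidence overrides the prior.
print(bayes_percept(prior_mean=0.0, prior_sd=1.0, sense_mean=5.0, sense_sd=0.5))
```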
This framework explains why:
• illusions persist even when we “know better,”
• expectations bias perception,
• imagination can feel perceptually real, and
• perception changes under uncertainty, stress, or fatigue.
Consider a concrete case: seeing a spider. Your brain already has predictions:
• “Spiders are dangerous”
• “I’m physically intimidated”
• “This is a threat”
Those predictions come before the sensory input. You then see the spider, and your body sends signals back (heart rate, tension, adrenaline). If the signals match the prediction, fear stays high. If they don’t, a prediction error occurs: the brain registering, “Wait… reality isn’t matching what I expected.”
The same logic applies to pain:
• Brain predicts: high pain (7/10)
• Body actually sends: mild pain (5/10)
• Prediction error: 5 − 7 = −2
• Your experience of pain gets adjusted toward the prediction (sketched in code below).
This is why:
• anxiety makes pain worse,
• expecting pain increases pain, and
• calm expectations can reduce pain.
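The pain example as arithmetic, with a hypothetical prior_weight standing in for how strongly expectations dominate (higher under anxiety, lower when calm):

```python
def perceived_intensity(predicted, actual, prior_weight=0.65):
    """Percept as a weighted blend of prediction and bodily input.
    prior_weight is a made-up parameter: 1.0 means pure expectation,
    0.0 means pure raw input."""
    error = actual - predicted                     # the prediction error
    return predicted + (1 - prior_weight) * error

print(perceived_intensity(predicted=7, actual=5))  # 6.3: pulled up toward the feared 7
print(perceived_intensity(predicted=3, actual=5))  # 3.7: calm expectation dampens the same input
```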
Memory as a Perceptual Prior (Hippocampus & Medial Temporal Lobe)
(Shohamy & Wagner, 2008; Park & Reuter-Lorenz, 2009)
Memory Is Not Separate from Perception
Memory does not simply store past experiences—it actively shapes present perception. The hippocampus and medial temporal lobe structures provide contextual predictions that influence how sensory input is interpreted.
Strong memories:
• bias interpretation,
• increase confidence in perception,
• and can override weak sensory input.
This explains why emotionally charged memories (e.g., trauma, nostalgia) exert disproportionate influence on what feels real.
Aging and Perceptual Bias
Research shows that older adults rely more heavily on prior knowledge during perception. This is not a deficit, but an adaptive shift toward accumulated experience. However, it can also increase susceptibility to expectation-driven misperceptions under uncertainty.
Brain Rhythms and Conscious Access
(Engel & Fries, 2010; Max Planck Society, 2023)
Oscillations as Timing Gates
Neural oscillations coordinate information flow across distant brain regions. Studies of binocular rivalry show that:
• low-frequency oscillations (theta/alpha) regulate large-scale coordination,
• beta oscillations reflect top-down control and expectation,
• gamma activity is associated with local sensory processing.
The prefrontal cortex does not decide what we see—it decides when sensory information gains access to awareness. This supports the idea that consciousness is not localized but emerges from coordinated timing across networks.
Conclusion:
Perception is not a direct recording of the world but an active decision the brain makes by combining sensory input with memory, attention, emotion, and prediction. Imagination and perception share neural pathways, differing mainly in signal strength, timing, and context, which means what feels “real” depends on whether sensory information crosses internal thresholds and gains access to conscious awareness. Because the brain is constantly predicting what will happen next, reality is shaped not only by what we sense, but by what we expect and attend to. This explains why people can experience the same situation differently and why stress, fatigue, or strong beliefs can distort perception. Understanding this process reminds us that our experiences feel real because they are useful—not because they are perfect—and that perception remains flexible and adaptable throughout life.
Dreams, Nightmares, and Déjà Vu: How the Brain Constructs Reality Beyond Wakefulness
(Hobson & Friston, 2012; Dijkstra et al., 2025)
From a neuroscience perspective, “reality” is not limited to waking life. Dreams, nightmares, and déjà vu reveal that the brain continuously constructs experiences using the same neural systems that support perception, memory, and emotion during waking consciousness. Rather than reflecting supernatural events or faulty cognition, these phenomena highlight how the brain generates felt reality by integrating memory signals, emotional salience, and prediction—sometimes without external sensory input.
Dreams: Reality Without External Input
The Neuroscience of Dreaming
(Hobson et al., 2000; Hobson & Friston, 2012; Stickgold & Walker, 2013)
Dreams primarily occur during rapid eye movement (REM) sleep, a state characterized by heightened activity in limbic and memory-related brain regions alongside reduced activity in executive control areas of the prefrontal cortex (PFC). During REM sleep, the hippocampus, amygdala, parahippocampal cortex, basal ganglia, and visual association cortices are highly active, while regions responsible for reality monitoring, logical reasoning, and error correction are relatively quiet. This neural pattern explains why dreams often feel vivid, emotional, familiar, and real—even when they are bizarre or internally inconsistent. The brain interprets internally generated neural activity as sensory experience, producing a convincing reality without external input.
How the dopamine reward system shapes memory during sleep: during NREM sleep, the hippocampus communicates with the ventral striatum and VTA, allowing important or novel experiences to be replayed and tagged as meaningful. As the brain transitions into REM sleep, the VTA becomes more active and releases bursts of dopamine, which strengthens learning and highlights emotionally or motivationally important memories. In REM sleep, dopamine signals spread to the hippocampus, amygdala, prefrontal cortex, and nucleus accumbens, helping form strong associations and supporting creativity, emotional processing, and memory consolidation. In short, sleep doesn’t just store memories—it selects and enhances the memories that matter most by using the brain’s reward system.
Dreams and Memory Consolidation
(Diekelmann & Born, 2010; Payne et al., 2009)
A core function of sleep—and dreaming in particular—is memory consolidation. During sleep, recently acquired memories are reactivated and integrated with existing knowledge networks in the neocortex. This process can blur boundaries between real experiences, imagined scenarios, and emotional “gist” memories, contributing to the familiar quality of dreams. Research shows that sleep can even promote false memories by strengthening generalized meaning rather than exact details. This mechanism helps explain why dreams often feel familiar without being accurate reproductions of real events—and why dream imagery can later influence waking perception.
Nightmares: When Emotion Overrides Reality Monitoring
(Nielsen & Levin, 2007; Revonsuo, 2000)
Nightmares represent an exaggerated form of dream perception in which emotional processing dominates. Neuroimaging studies show heightened amygdala activation during frightening dreams, coupled with reduced top-down regulation from the PFC. As a result, threat signals feel urgent and real, even though the sleeper lacks conscious awareness that the experience is internally generated. From an evolutionary perspective, nightmares may function as threat simulation, allowing the brain to rehearse responses to danger in a safe, offline state. However, when stress, trauma, or sleep disruption alters these systems, nightmares can become frequent and emotionally overwhelming, blurring the line between dream reality and waking emotional memory.
Déjà Vu: When Familiarity Conflicts With Reality
(O’Connor & Moulin, 2013; Cleary, 2008)
Déjà Vu as Metacognitive Awareness
Déjà vu is not simply the feeling that something has happened before—it is the recognition that this feeling of familiarity is incorrect. Neuroscientists describe déjà vu as a conflict between memory signals and reality monitoring, making it a uniquely metacognitive experience.
Rather than reflecting a memory failure, déjà vu appears to occur when familiarity signals generated in the medial temporal lobe (particularly the parahippocampal cortex) reach consciousness without matching episodic memories from the hippocampus. The prefrontal cortex then evaluates the mismatch and flags the experience as “false familiarity.”
In healthy individuals, this fact-checking system remains intact—allowing people to notice that the feeling is wrong without believing it.
Neural Mechanisms of Déjà Vu
(Brown, 2003; Cleary et al., 2021)
Although no single model fully explains déjà vu, converging evidence supports a dual-processing framework in which familiarity and recollection normally operate together but occasionally become desynchronized. When familiarity is activated without successful memory retrieval, the brain generates the sensation of déjà vu. Functional imaging and electrical stimulation studies implicate the temporal lobe, parahippocampal cortex, and frontal monitoring regions in this process, reinforcing the idea that déjà vu reflects active error detection rather than cognitive decline.
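A toy decision rule for this dual-process account appears below; the familiarity and recollection values and the thresholds are invented for illustration, since the real signals are graded and interacting.

```python
def reality_check(familiarity, recollection, fam_threshold=0.7, rec_threshold=0.5):
    """Toy dual-process account: familiarity and recollection normally rise
    together; déjà vu is modeled as high familiarity without a matching
    episodic retrieval, which frontal monitoring then flags as false."""
    if familiarity > fam_threshold and recollection < rec_threshold:
        return "déjà vu: familiarity flagged as false"
    if familiarity > fam_threshold:
        return "recognized: familiarity backed by recollection"
    return "experienced as novel"

print(reality_check(familiarity=0.9, recollection=0.2))  # déjà vu
print(reality_check(familiarity=0.9, recollection=0.8))  # genuine recognition
```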
Dreams, Déjà Vu, and the Continuum of Perceived Reality
(Vignal et al., 2007; Friston, 2010; Dijkstra et al., 2025)
Dreams and déjà vu are often experienced as strangely similar—and neuroscience offers a compelling explanation. Both phenomena rely on overlapping memory and familiarity networks, particularly when reality monitoring is weakened or temporarily disengaged. During dreaming, the brain suspends external fact-checking altogether. During déjà vu, fact-checking remains active but is challenged by misleading familiarity signals.
Related phenomena such as déjà-rêvé (“already dreamed”) further illustrate this overlap. In these cases, individuals experience the sense that a present event was previously encountered in a dream, highlighting how dream-generated memory traces can later influence waking perception.
Together, these experiences demonstrate that reality is not simply perceived—it is inferred. The brain constantly predicts, evaluates, and updates its model of the world, drawing on memory, emotion, and expectation to decide what feels real.
Dreams, Sleep Perception, and the Illusion of Wakefulness
(CNS, 2024)
Recent research presented at the Cognitive Neuroscience Society (CNS) 2024 annual meeting reinforces a core principle of neuroscience: perception and reality are not the same thing. This applies not only to waking life, but also to sleep and dreaming. Researchers are increasingly showing that how people perceive their sleep often differs dramatically from what objective brain measures show, highlighting the brain’s role as an interpreter rather than a passive recorder of reality.
Dream Activity Shapes How We Perceive Sleep Quality
Claudia Picard-Deland and colleagues at the University of Montréal examined how dream experience influences perceived sleep depth. In their study, participants were awakened repeatedly across all sleep stages and asked whether they believed they had been awake or asleep, how deeply they felt they were sleeping, and whether they had been dreaming.
Key finding:
Participants frequently believed they were awake even when brain recordings showed they were asleep — a phenomenon known as sleep misperception or paradoxical insomnia.
Importantly, when participants recalled dreams — especially vivid, immersive dreams — they reported feeling that their sleep was deeper and more restorative, regardless of objective sleep stage. This suggests that dream experience itself strongly shapes the brain’s judgment of sleep quality, reinforcing the idea that perception, not physiology alone, defines subjective reality.
Key Takeaway for Brain Fitness Students
Dreams, nightmares, and déjà vu are not signs of a malfunctioning brain. They are windows into how the brain constructs reality using memory-based predictions. When these systems operate slightly out of sync—during sleep, stress, fatigue, or emotional overload—the boundary between imagination and perception becomes temporarily blurred. Importantly, the ability to notice these distortions is itself a marker of healthy cognitive function.
Brain prediction and sensory processing change across wakefulness, REM dreaming, and lucid dreaming. In wakefulness, sensory input from the outside world strongly drives the visual system, and predictions from higher brain areas are balanced by real sensory feedback. In REM dreaming, external input is mostly shut off, so higher brain areas generate strong top-down predictions, creating vivid dream imagery without real sensory data. During REM, cholinergic activity is high while aminergic activity is low, which supports imagination but weakens reality checking. In lucid dreaming, parts of the prefrontal cortex become active again, allowing awareness and control within the dream. In short, dreams happen when the brain’s prediction system runs without external sensory correction.
References
Bar, M., Kassam, K. S., Ghuman, A. S., Boshyan, J., Schmidt, A. M., Dale, A. M., Hämäläinen, M. S., Marinkovic, K., Schacter, D. L., Rosen, B. R., & Halgren, E. (2006). Top-down facilitation of visual recognition. Proceedings of the National Academy of Sciences, 103(2), 449–454.
Brown, A. S. (2003). A review of the déjà vu experience. Psychological Bulletin, 129(3), 394–413.
Buzsáki, G. (2019). The brain from inside out. Oxford University Press.
Buzsáki, G. (2022, June 1). How the brain “constructs” the outside world. Scientific American. https://www.scientificamerican.com
Clark, A. (2013). Whatever next? Predictive brains, situated agents, and the future of cognitive science. Behavioral and Brain Sciences, 36(3), 181–204.
Cleary, A. M. (2008). Recognition memory, familiarity, and déjà vu experiences. Current Directions in Psychological Science, 17(5), 353–357.
Cleary, A. M., Ryals, A. J., & Nomi, J. S. (2021). Déjà vu and the brain’s error-detection system. Trends in Cognitive Sciences, 25(2), 95–106.
Desimone, R., & Duncan, J. (1995). Neural mechanisms of selective visual attention. Annual Review of Neuroscience, 18, 193–222.
Diekelmann, S., & Born, J. (2010). The memory function of sleep. Nature Reviews Neuroscience, 11(2), 114–126.
Dijkstra, N., von Rein, T., Kok, P., & Fleming, S. M. (2025). A neural basis for distinguishing imagination from reality. Neuron, 113(15), 2536–2542.e4. https://doi.org/10.1016/j.neuron.2025.05.015
Engel, A. K., & Fries, P. (2010). Beta-band oscillations—signalling the status quo? Current Opinion in Neurobiology, 20(2), 156–165.
Fletcher, P. C., & Frith, C. D. (2009). Perceiving is believing: A Bayesian approach to explaining the positive symptoms of schizophrenia. Nature Reviews Neuroscience, 10, 48–58.
Friston, K. (2010). The free-energy principle: A unified brain theory? Nature Reviews Neuroscience, 11(2), 127–138.
Grabenhorst, M., et al. (2026). The anticipation of imminent events is time-scale invariant. Proceedings of the National Academy of Sciences. https://doi.org/10.1073/pnas.2518982123
Hobson, J. A., & Friston, K. J. (2012). Waking and dreaming consciousness. Trends in Cognitive Sciences, 16(12), 618–625.
Hobson, J. A., Pace-Schott, E. F., & Stickgold, R. (2000). Dreaming and the brain. Behavioral and Brain Sciences, 23(6), 793–842.
Hubel, D. H., & Wiesel, T. N. (1968). Receptive fields and functional architecture of monkey striate cortex. The Journal of Physiology, 195, 215–243.
Max Planck Society. (2023, April 13). How the brain decides what we perceive: Patterns of brainwaves in the prefrontal cortex gate access to consciousness. https://www.mpg.de
Max Planck Society. (2026, January 9). The brain predicts events with varying degrees of accuracy. https://www.mpg.de
Nielsen, T., & Levin, R. (2007). Nightmares: A new neurocognitive model. Sleep Medicine Reviews, 11(4), 295–310.
O’Connor, A. R., & Moulin, C. J. A. (2013). Déjà vu experiences in healthy individuals. Memory, 21(4), 377–389.
Oude Lohuis, M. N., et al. (2022). Multisensory task demands temporally extend the causal contribution of primary visual cortex to perception. Nature Communications, 13, Article 3351. https://doi.org/10.1038/s41467-022-31045-9
Park, D. C., & Reuter-Lorenz, P. (2009). The adaptive brain: Aging and neurocognitive scaffolding. Annual Review of Psychology, 60, 173–196.
Payne, J. D., Schacter, D. L., Propper, R. E., Huang, L. W., Wamsley, E. J., Tucker, M. A., & Stickgold, R. (2009). The role of sleep in false memory formation. Neurobiology of Learning and Memory, 92(3), 327–334.
Pennartz, C. M. A. (2022). What is neurorepresentationalism? From neural activity and predictive processing to multi-level representations and consciousness. Behavioural Brain Research, 432, Article 113969. https://doi.org/10.1016/j.bbr.2022.113969
Revonsuo, A. (2000). The reinterpretation of dreams. Behavioral and Brain Sciences, 23(6), 877–901.
Rowekamp, R. J., & Sharpee, T. O. (2017). Cross-orientation suppression in visual area V2. Nature Communications, 8, Article 15739. https://doi.org/10.1038/ncomms15739
Salk Institute. (2017, June 8). How the brain recognizes what the eye sees. https://www.salk.edu
Shohamy, D., & Wagner, A. D. (2008). Integrating memories in the human brain: Hippocampal–midbrain encoding of overlapping events. Neuron, 60, 378–389.
Stickgold, R., & Walker, M. P. (2013). Sleep-dependent memory triage: Evolving generalization through selective processing. Nature Neuroscience, 16(2), 139–145.
Vignal, J.-P., Maillard, L., McGonigal, A., & Chauvel, P. (2007). The dreamy state. Brain, 130(1), 88–99.