How Your Thoughts Are Built & How You Can Shape Them | Dr. Jennifer Groh
Summary
Dr. Jennifer Groh, professor of psychology and neuroscience at Duke University, explains how the brain encodes and integrates sensory information — particularly sight and sound — to create a coherent perception of the world. She details the neural mechanisms behind sound localization, multisensory binding, and the surprising discovery that eye movements directly modulate how the ears process sound. The conversation culminates in a compelling theory of what thoughts actually are at the neural level: simulations run across the brain’s sensory-motor infrastructure.
Key Takeaways
- Thoughts are sensory simulations: When you think of a concept (e.g., a cat), your brain runs a mini-simulation across visual, auditory, olfactory, and other sensory cortices simultaneously.
- Eye movements physically move your eardrums: Every saccadic eye movement causes a precisely timed, coordinated movement in both eardrums — the right and left eardrums move in opposite directions — suggesting the brain integrates vision and hearing at the earliest possible stage.
- Sound localization is an extraordinary computational feat: The brain resolves interaural time differences that are at most about half a millisecond — shorter than a single action potential — to determine where a sound originates.
- Your brain actively turns down your hearing when you speak: A top-down mechanism reduces auditory transduction just before speech so that your own voice, generated so close to your ears, does not overwhelm the auditory system.
- Sensory integration must be continuously learned: Because a baby’s head is roughly half the width of an adult’s, the timing cues for sound localization change throughout development and must be constantly recalibrated.
- Hearing loss is linked to dementia: Reduced sensory input may cause the brain to downregulate circuits, contributing to memory and attention decline.
- Protect your hearing: If someone nearby can detect that any sound is coming from your headphones, the volume is likely causing permanent hearing damage.
- Rhythm is universal across all human cultures: Even cultures without melody or harmony have rhythm, supporting theories that rhythmic coordination evolved to enable group cooperation and competition.
- The ventriloquist effect reveals how the brain assigns sound sources: The brain continuously calculates the “most likely candidate” for a sound’s origin and can override raw auditory input with visual information.
- Low-frequency sound travels farther and bends around objects better than high-frequency sound, which is why warning signals (drums, bass horns) favor lower frequencies.
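The diffraction point above follows from wavelength: sound bends around an obstacle when its wavelength is comparable to or larger than the obstacle. A minimal sketch of the arithmetic (the frequencies chosen here are illustrative, not from the episode):

```python
# Wavelength of a pure tone: lambda = speed of sound / frequency.
# Diffraction around an obstacle is strong when the wavelength is comparable
# to or larger than the obstacle's size.

SPEED_OF_SOUND = 343.0  # m/s, dry air at ~20 °C


def wavelength_m(frequency_hz: float) -> float:
    """Wavelength in meters for a pure tone of the given frequency."""
    return SPEED_OF_SOUND / frequency_hz


for f in (60, 250, 1000, 4000):
    print(f"{f:>5} Hz -> wavelength {wavelength_m(f):5.2f} m")

# A 60 Hz bass tone has a ~5.7 m wavelength, far larger than a person or a
# tree trunk, so it wraps around them; a 4 kHz tone (~0.09 m) is easily blocked.
```

This is why a drum or bass horn carries a warning signal through forest or crowd far better than a whistle at the same loudness.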
Detailed Notes
The Nature of Thought: Sensory Simulation Theory
- The leading theory discussed is that thinking is the brain running sensory-motor simulations.
- When you think of a concept like “cat,” the brain simultaneously activates:
  - Visual cortex (what the cat looks like)
  - Auditory cortex (what a cat sounds like)
  - Olfactory and other sensory regions (e.g., smell of kitty litter)
- This explains why cognitive load in one modality impairs another: the urge to tell a passenger to be quiet while merging into traffic reflects the need to redirect shared sensory-motor resources toward the immediate task.
- This framework offers perhaps the most mechanistically grounded definition of thought in neuroscience: thoughts are internally generated sensory simulations, not abstract symbolic processes.
The Superior Colliculus: Where Senses First Converge
- The superior colliculus is a midbrain structure responsive to both visual and auditory stimuli.
- Key discovery: auditory neurons in the superior colliculus shift their receptive fields based on where the eyes are pointing.
- This was the founding observation of Dr. Groh’s career and set the framework for understanding dynamic, eye-position-dependent auditory maps.
- The brain must continuously compute: “Where is this sound relative to where my eyes are looking?” — a calculation that involves reference frame transformation.
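The reference frame transformation described above can be sketched in one line of arithmetic: the ears report sound direction relative to the head, and subtracting the current eye position yields the eye-centered direction the superior colliculus needs. A minimal sketch, with illustrative function names (azimuths in degrees, positive = rightward; this is my simplification, not the lab's model):

```python
# Sketch of a head-centered -> eye-centered reference frame transformation.
# The ears report where a sound is relative to the head; the superior
# colliculus needs it relative to where the eyes are currently pointing.


def head_to_eye_centered(sound_azimuth_head: float, eye_azimuth: float) -> float:
    """Convert a head-centered sound azimuth (degrees, positive = rightward)
    into eye-centered coordinates by subtracting the current eye position."""
    return sound_azimuth_head - eye_azimuth


# A sound 20° right of the nose, while the eyes look 15° right,
# lies only 5° right of the current gaze direction:
print(head_to_eye_centered(20.0, 15.0))  # -> 5.0
```

Because eye position changes with every saccade, this subtraction must be recomputed continuously, which is why the auditory receptive fields in the superior colliculus shift with gaze.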
Sound Localization: The Mechanics
- The brain determines sound location using:
  - Interaural time difference (ITD): Sound reaches one ear before the other; the maximum possible delay is ~0.5 milliseconds (less than the duration of a single action potential).
  - Interaural level difference (ILD): The head creates an acoustic shadow, making sound slightly quieter in the far ear.
  - Spectral filtering by the pinna: The outer ear’s folds create a unique frequency “fingerprint” depending on the direction of the sound — and this fingerprint is unique to each individual.
- People with cauliflower ears (e.g., wrestlers) likely have altered spectral filtering and may initially struggle with sound localization but can adapt over time.
- Distance perception for sound relies on:
  - Loudness relative to known source intensity (e.g., thunder)
  - Room acoustics and echo delays: The brain computes the difference in arrival time between direct-path sound and copies bouncing off nearby surfaces to estimate distance.
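Two of the computations above reduce to simple distance-over-speed arithmetic. A minimal sketch, assuming a simplified spherical-head model and a reflecting surface directly behind the listener (the specific head widths are illustrative, not from the episode):

```python
# (1) Worst-case interaural time difference (ITD) for a given ear separation.
# (2) Distance to a reflecting surface from the delay between the direct
#     sound and its echo (echo travels to the surface and back).

SPEED_OF_SOUND = 343.0  # m/s


def max_itd_seconds(ear_separation_m: float) -> float:
    """Worst-case ITD: sound arriving from directly beside the head travels
    the full ear-to-ear distance farther to reach the far ear."""
    return ear_separation_m / SPEED_OF_SOUND


def wall_distance_m(echo_delay_s: float) -> float:
    """One-way distance to a reflecting surface: the echo's extra path is
    twice that distance, so halve speed-of-sound times delay."""
    return SPEED_OF_SOUND * echo_delay_s / 2.0


print(f"adult (~0.18 m between ears): max ITD ≈ {max_itd_seconds(0.18) * 1000:.2f} ms")
print(f"infant (~0.09 m):             max ITD ≈ {max_itd_seconds(0.09) * 1000:.2f} ms")
print(f"10 ms echo delay -> surface ≈ {wall_distance_m(0.010):.2f} m away")
```

The halved head width of an infant halves the maximum ITD, which illustrates why sound-localization cues must be continuously recalibrated as the head grows.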
Eye Movements and the Eardrum: A Groundbreaking Finding
- Dr. Groh’s lab discovered that every saccadic eye movement causes the eardrums to move, even in total silence.
- The movement is:
  - Precisely time-locked to the onset of the eye movement
  - Differential: If eyes move left, the right eardrum bulges inward while the left bulges outward (and vice versa) — moving in a coordinated wave-like pattern
  - Graded: The magnitude encodes how far the eyes moved; some vertical movement information is also encoded
- This was detected non-invasively with a sensitive microphone placed in the ear canal (the same setup used to measure otoacoustic emissions), which picks up the faint sounds produced by the eardrum’s own motion.
- The mechanism is thought to involve top-down signals from the brain activating the middle ear muscles and outer hair cells (which can contract like muscles), which then tug on the ossicles and eardrum.
- This may represent the earliest stage of visual-auditory integration — happening at the level of the ear itself, not just in the brain.
Multisensory Binding and the Ventriloquist Effect
- The brain continuously evaluates which sensory sources are likely to belong together and merges them accordingly.
- Temporal synchrony is critical: even tiny offsets between lip movements and sound cause perception to break down (as seen in poorly synced videos).
- The ventriloquist effect demonstrates the brain’s willingness to override auditory localization with visual information — the brain attributes sound to the puppet because its mouth movements correlate with the audio.
- When watching video or movies, the brain remaps sound sources based on what is seen on screen, even though the physical source of the sound (speakers, earbuds) is constant.
Why Your Voice Sounds Strange in Recordings
Three reasons:
- Recordings don’t capture the full frequency spectrum of your voice.
- The brain suppresses auditory input just before and during speech — a precisely timed volume-reduction mechanism to prevent self-generated sound from being overwhelming (given how close the mouth is to the ears).
- Bone conduction: Much of how you hear your own voice live is transmitted through bone (skull vibrations), not air — recordings capture only the air-conducted component.
Hearing Health and Hearing Loss
- 80% of people will experience hearing loss if they live long enough.
- Young people are accumulating hearing damage earlier than previous generations due to constant earbud use at high volumes.
- Safe listening guideline: If anyone near you can detect that sound is coming from your headphones — not even the specific content, just the presence of sound — the volume is likely causing permanent hearing damage.
- Noise-canceling headphones are preferable to turning up volume to overcome ambient noise.
- Hearing loss is correlated with dementia: less sensory input may cause the brain to downregulate circuits, leading to memory and attention decline.
- Bluetooth radiation: According to neurosurgeons familiar with the topic, Bluetooth headphone radiation is considerably lower than everyday environmental EMF exposure and is not considered a significant concern.
Music, Rhythm, and Evolution
- Rhythm is the only musical element universal across all human cultures — melody and harmony vary, but rhythm does not.
- A prominent theory: rhythm and music evolved to enable coordinated group action (stomping, shouting together) to compete with predators or rival groups — being collectively louder than any individual.
- Music may also have evolved via sexual selection: musically skilled individuals may have had more offspring (similar to peacock plumage).
- Music organizes language for memory: knowing the first two or three words of a verse is often enough to cue recall of the rest.