Closing Your Eyes May Not Improve Your Hearing After All

In noisy environments, many of us instinctively close our eyes to concentrate on faint sounds, assuming that removing visual input frees cognitive resources for hearing. By this logic, shutting the eyes should sharpen auditory sensitivity and make weak signals easier to detect. However, research conducted by a team at Shanghai Jiao Tong University challenges this long-held assumption, revealing that closing one’s eyes in a noisy setting can actually hinder sound detection rather than help it.

Published recently in the Journal of the Acoustical Society of America, the study, led by Yu Huang and colleagues, systematically examined how different visual conditions affect auditory perception amid background noise. Volunteers listened to a series of target sounds played through headphones while distracting noise persisted in the environment. Participants then adjusted the volume of the target sounds to the lowest level at which they could still reliably perceive them, allowing the researchers to establish individual auditory detection thresholds under each visual condition.
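The paper's exact psychophysical procedure is not detailed in this article, but the described method of adjustment can be illustrated with a small simulation. The sketch below assumes a hypothetical listener whose detection probability follows a logistic psychometric function; the 30 dB "true" threshold, the slope, and the two-misses stopping rule are illustrative assumptions, not values from the study:

```python
import math
import random

def detects(level_db, threshold_db=30.0, slope=1.0):
    """Hypothetical listener: probability of detecting a sound at a given
    level follows a logistic psychometric function (assumed, not from the paper)."""
    p = 1.0 / (1.0 + math.exp(-slope * (level_db - threshold_db)))
    return random.random() < p

def adjustment_run(start_db=60.0, step_db=1.0):
    """One descending method-of-adjustment run: lower the level until the
    sound is missed twice in a row, then report the last detected level."""
    level = start_db
    misses = 0
    last_heard = level
    while misses < 2 and level > 0:
        if detects(level):
            last_heard = level
            misses = 0
        else:
            misses += 1
        level -= step_db
    return last_heard

def estimate_threshold(n_runs=20):
    """Average the endpoints of several runs to estimate the detection threshold."""
    return sum(adjustment_run() for _ in range(n_runs)) / n_runs
```

In a real experiment the listener, not a simulated function, controls the level; comparing such threshold estimates across visual conditions is what reveals whether eye closure raises or lowers them.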

The experiment progressed through four distinct visual scenarios: eyes closed, eyes open facing a blank screen, viewing a still image corresponding to the sound, and watching a dynamic video synchronized with the audio. Surprisingly, the data revealed that eye closure consistently raised auditory detection thresholds, meaning participants needed the target sounds to be louder before they could detect them, compared to when their eyes were open and visually engaged. Conversely, watching a related video improved auditory sensitivity the most, significantly lowering the threshold for detecting faint sounds amid noise.

To understand the neural mechanics underpinning these findings, the research team incorporated electroencephalography (EEG) to monitor brain activity during the trials. EEG recordings indicated that closing the eyes induced a state referred to as neural criticality—a heightened, finely balanced state of network activity—within the participants’ cortex. While such a state can enhance focus on internal stimuli by intensifying sensory gating, it also causes the brain to over-filter incoming auditory signals, suppressing not only background noise but also the very target sounds being sought. This over-filtering effect ultimately degrades auditory perception in noisy settings.
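Neural criticality in EEG is commonly quantified with measures such as detrended fluctuation analysis (DFA), whose scaling exponent distinguishes uncorrelated activity (α ≈ 0.5) from the long-range-correlated, critical-like dynamics (α approaching 1) the article describes. The study's own analysis pipeline is not specified here; the following is a minimal pure-Python sketch of DFA on a one-dimensional signal, offered only as an illustration of the kind of metric involved:

```python
import math
import random

def dfa_exponent(signal, scales):
    """Detrended fluctuation analysis: returns the scaling exponent alpha.
    alpha ~ 0.5 for white noise; values near 1 indicate critical-like dynamics."""
    # Step 1: integrate the mean-centred signal to form the "profile".
    mean = sum(signal) / len(signal)
    profile, total = [], 0.0
    for x in signal:
        total += x - mean
        profile.append(total)

    log_scales, log_flucts = [], []
    for n in scales:
        ms_sum, n_windows = 0.0, 0
        # Step 2: split the profile into non-overlapping windows of length n.
        for start in range(0, len(profile) - n + 1, n):
            window = profile[start:start + n]
            # Step 3: least-squares linear detrend within each window.
            x_mean = (n - 1) / 2.0
            y_mean = sum(window) / n
            cov = sum((i - x_mean) * (y - y_mean) for i, y in enumerate(window))
            var = sum((i - x_mean) ** 2 for i in range(n))
            slope = cov / var
            resid = sum((y - (y_mean + slope * (i - x_mean))) ** 2
                        for i, y in enumerate(window))
            ms_sum += resid / n
            n_windows += 1
        # Step 4: RMS fluctuation at this scale, in log-log coordinates.
        log_scales.append(math.log(n))
        log_flucts.append(0.5 * math.log(ms_sum / n_windows))

    # Step 5: alpha is the slope of log F(n) versus log n.
    k = len(log_scales)
    sx, sy = sum(log_scales) / k, sum(log_flucts) / k
    num = sum((a - sx) * (b - sy) for a, b in zip(log_scales, log_flucts))
    den = sum((a - sx) ** 2 for a in log_scales)
    return num / den
```

Applied to EEG amplitude envelopes, a shift in such an exponent between eyes-closed and eyes-open recordings is one way a move toward or away from criticality can show up in the data.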

Yu Huang articulates the paradoxical nature of these results: “When immersed in a loud soundscape, your brain’s challenge is to separate the meaningful audio signals from the overlapping noise. Closing the eyes shifts processing towards internal focus, but this internal orientation leads to excessive filtering, which ironically blocks out both irrelevant and relevant sounds. On the other hand, engaging visually with corresponding content helps the auditory system anchor itself externally, enhancing signal detection.” These insights demonstrate that visual engagement does more than distract; it actively facilitates auditory processing.

The implications of this research are profound for everyday listening scenarios, whether in busy urban environments, crowded social settings, or noisy workplaces. The common practice of shutting one’s eyes to hear better may only be effective in quieter contexts where ambient noise is minimal. In contrast, maintaining open eyes and seeking congruent visual input can serve as a practical strategy to sharpen hearing in real-world auditory scenes flooded with competing sounds.

Moreover, the findings add an important dimension to our understanding of multisensory integration, the brain’s ability to synthesize information from various senses for more precise perception. The enhanced auditory detection observed when participants watched related videos suggests that congruent visual-auditory stimuli interact synergistically, potentially engaging cross-modal neural circuits that boost sensory clarity. This aligns with broader neuroscientific frameworks emphasizing the interdependence of sensory systems in complex environments.

Future research, as the authors propose, will delve deeper into the nuances of this cross-modal relationship. Specific questions remain about whether the observed benefits arise from a general state of visual attention or require precise matching between visual and auditory content. For instance, presenting incongruent pairings—such as a bird’s image accompanying drum sounds—could differentiate whether the brain relies on simple visual engagement or the semantic alignment of sensory inputs to enhance auditory perception.

In addition to broadening theoretical frameworks of sensory processing, these discoveries have practical applications. Fields such as audiology, acoustic engineering, and rehabilitation might incorporate visual strategies to assist individuals with hearing difficulties, especially in noisy environments. Moreover, consumer electronics or hearing aid technologies could be designed to leverage synchronized visual cues, improving user experience and communication efficacy in everyday noisy spaces.

This research challenges the entrenched notion that sensory focus requires the exclusion of extraneous inputs. Instead, it highlights the brain’s sophisticated capacity to integrate multisensory information, where appropriate visual engagement can serve as a powerful enhancer of auditory function. By reorienting advice regarding listening strategies in noisy settings, this work encourages individuals to face the world with eyes open, tapping into the brain’s full potential for sound detection and cognitive processing.

The study titled “Visual engagement modulates cortical criticality and auditory target detection thresholds in noisy soundscapes” is set to appear in the March 17, 2026 issue of the Journal of the Acoustical Society of America. This landmark investigation establishes a new paradigm in auditory neuroscience, urging us to reconsider intuitive listening habits and embrace multisensory integration as a pathway to improved hearing in complex acoustic environments.

Readers and researchers interested in the detailed methodology and data analysis can access the full article via DOI: 10.1121/10.0042380. The accompanying EEG findings and behavioral data offer pivotal insights into the neural underpinnings of sensory interaction, demonstrating how open eyes and congruent visual cues together optimize neural states for heightened hearing sensitivity, fundamentally transforming our approach to auditory attention in noisy settings.

Subject of Research: Auditory perception and the neural mechanisms of multisensory integration in noisy environments.

Article Title: Visual engagement modulates cortical criticality and auditory target detection thresholds in noisy soundscapes

News Publication Date: March 17, 2026

Web References:
https://doi.org/10.1121/10.0042380

Image Credits: Yu Huang

Keywords

Auditory perception, Speech perception, Acoustics, Audiology

Tags: auditory detection thresholds research, auditory experiments with background noise, auditory perception in noisy environments, cognitive resources and auditory sensitivity, effects of closing eyes on hearing, impact of visual stimuli on auditory processing, interaction between vision and hearing, Journal of the Acoustical Society of America study, multitasking sensory perception, role of attention in sound detection, Shanghai Jiao Tong University hearing study, visual influence on sound detection