In a groundbreaking discovery, scientists at Duke University found in 2018 that our ears produce a subtle noise each time our eyes move. Now, these researchers have developed a method to determine where a person is looking by analyzing these imperceptible sounds. Using a highly sensitive microphone placed in the ear canal, the team can pick up the noises even though we are not consciously aware of them.
Lead scientist Prof. Jennifer Groh suggests that these noises arise when eye movements trigger the contraction of either the middle ear muscles or the hair cells in the ear. Contracting the middle ear muscles helps dampen loud noises, while contracting the hair cells amplifies quiet sounds. This mechanism lets us make sense of our surroundings by automatically adjusting the sensitivity of our hearing to match our visual focus.
To test the method, Prof. Groh’s team, working with Prof. Christopher Shera of the University of Southern California, ran an experiment with 16 adults who had good vision and hearing. The volunteers visually tracked a green dot moving across a computer screen without moving their heads. Simultaneously, an eye-tracking camera recorded the direction of their gaze while a microphone in each ear canal captured the accompanying ear sounds.
By cross-referencing the eye-tracking and ear-audio recordings, the researchers found that specific types of ear noise corresponded to eye movements in specific directions. As a result, they could accurately determine where a participant’s gaze was focused from the noise signature the ears were producing, and, conversely, predict the sounds a participant’s ears were making from the direction of their gaze.
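To make that decoding step concrete, here is a minimal, hypothetical sketch of how one might regress gaze displacement onto per-saccade microphone snippets. This is not the Duke team’s actual analysis pipeline: the sampling rate, the ~30 Hz burst standing in for the ear signal, and all of the data below are synthetic assumptions chosen purely for illustration.

```python
# Illustrative sketch only: decoding gaze direction from ear-canal audio.
# All data are synthetic; this does NOT reproduce the study's real pipeline.

import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

FS = 48_000                              # assumed microphone sampling rate (Hz)
SNIPPET_MS = 40                          # audio window around each eye movement
N_SAMPLES = int(FS * SNIPPET_MS / 1000)
N_TRIALS = 400

# Synthetic "ground truth": horizontal saccade sizes in degrees.
gaze_deg = rng.uniform(-20, 20, size=N_TRIALS)

# Synthetic ear signal: a brief low-frequency oscillation whose amplitude
# scales with saccade size, buried in microphone noise.
t = np.arange(N_SAMPLES) / FS
template = np.sin(2 * np.pi * 30 * t) * np.exp(-t / 0.01)
X = gaze_deg[:, None] * template[None, :] + rng.normal(0, 0.5, (N_TRIALS, N_SAMPLES))

# Train a linear decoder: ear-audio snippet -> gaze displacement.
X_train, X_test, y_train, y_test = train_test_split(X, gaze_deg, random_state=0)
decoder = LinearRegression().fit(X_train, y_train)

pred = decoder.predict(X_test)
r = np.corrcoef(pred, y_test)[0, 1]
print(f"Decoded vs. true gaze correlation: r = {r:.2f}")
```

In the actual study, such a mapping would have to be learned from real microphone recordings aligned to measured saccades, but the decoder-style logic, pairing each eye movement with its ear-sound snippet and fitting a predictive model, is the same in spirit.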
The impact of these findings extends beyond determining a person’s visual focus. They could deepen our understanding of human perception and lead to more accurate and informative hearing tests: existing assessments could be refined by incorporating visual stimuli, giving a more complete picture of a person’s hearing abilities.
The research also opens avenues for future applications in fields such as virtual reality and human-computer interaction. By monitoring ear sounds alongside eye movements, virtual reality systems could adapt their audio to the user’s visual attention, making experiences more immersive and interactive. In human-computer interaction, the same signals could help devices discern a user’s intended target on the screen more accurately.
While this research is still in its early stages, its implications for understanding human perception and building new technologies are significant. The ability to determine a person’s visual focus from the sounds their ears produce could reshape hearing tests and feed into the applications above. As the work continues, we can expect further insight into how interconnected our senses are and how our bodies adapt to sharpen our perception of the world around us.