Thursday, October 21, 2010

Human Brain Can 'See' Shapes With Sound: See No Shape, Touch No Shape, Hear a Shape? New Way of 'Seeing' the World


Scientists at the Montreal Neurological Institute and Hospital (The Neuro), McGill University, have discovered that our brains can determine the shape of an object simply by processing specially-coded sounds, without any visual or tactile input. Not only does this new research tell us about the plasticity of the brain and how it perceives the world around us, it also opens important new possibilities for aiding people who are blind or visually impaired.
Image: New research shows that the human brain is able to determine the shape of an object simply by processing specially-coded sounds, without any visual or tactile input. (Credit: iStockphoto/Sergey Chushkin)

Shape is an inherent property of objects in both vision and touch, but not in sound. Researchers at The Neuro therefore asked: can shape be represented artificially by sound? "The fact that a property of sound such as frequency can be used to convey shape information suggests that as long as the spatial relation is coded in a systematic way, shape can be preserved and made accessible -- even if the medium via which space is coded is not spatial in its physical nature," says Jung-Kyong Kim, a PhD student in Dr. Robert Zatorre's lab at The Neuro and lead investigator in the study.

In other words, much as echolocating dolphins explore their surroundings with sound, our brains can be trained to recognize shapes represented by sound, and the hope is that people with impaired vision could learn to use this as a tool. In the study, blindfolded sighted participants were trained to recognize tactile spatial information using sounds mapped from abstract shapes. After training, they were able to match auditory input to tactually discerned shapes, and they generalized to new auditory-tactile pairings.
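To give a rough sense of what "coding spatial relations in sound" can look like in practice, here is a minimal Python sketch of a generic sensory-substitution mapping (row of an image becomes pitch, column becomes time). This is not the encoding used in the study; the grid, the row frequencies, the column duration, and the shape_to_sound function are illustrative assumptions only.

```python
# A minimal sketch (not the study's actual encoding) of mapping a 2D shape onto
# sound: each row of a small binary image is assigned a frequency, and the
# columns are swept left-to-right in time. All values below are assumptions.
import numpy as np

SAMPLE_RATE = 16_000          # samples per second
COLUMN_DURATION = 0.25        # seconds of audio per image column

# A 5x5 binary "shape" (here, a plus sign); 1 = part of the shape.
shape = np.array([
    [0, 0, 1, 0, 0],
    [0, 0, 1, 0, 0],
    [1, 1, 1, 1, 1],
    [0, 0, 1, 0, 0],
    [0, 0, 1, 0, 0],
])

# Assign each row a frequency: top rows get higher pitch (an arbitrary choice).
row_freqs = np.linspace(880.0, 220.0, shape.shape[0])

def shape_to_sound(grid: np.ndarray) -> np.ndarray:
    """Sweep the grid column by column, summing one sine wave per active row."""
    t = np.arange(int(SAMPLE_RATE * COLUMN_DURATION)) / SAMPLE_RATE
    columns = []
    for col in grid.T:                       # left-to-right sweep in time
        tone = np.zeros_like(t)
        for freq, active in zip(row_freqs, col):
            if active:
                tone += np.sin(2 * np.pi * freq * t)
        if tone.max() > 0:
            tone /= tone.max()               # normalize to avoid clipping
        columns.append(tone)
    return np.concatenate(columns)

waveform = shape_to_sound(shape)             # 1-D float array, ready to play or save
```

In a scheme like this the spatial layout of the shape is preserved in the systematic frequency-and-time structure of the sound, which is the general idea the researchers describe, even though their actual stimuli and mapping may differ.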

"We live in a world where we perceive objects using information available from multiple sensory inputs," says Dr. Zatorre, neuroscientist at The Neuro and co-director of the International Laboratory for Brain Music and Sound Research. "On one hand, this organization leads to unique sense-specific percepts, such as colour in vision or pitch in hearing. On the other hand our perceptual system can integrate information present across different senses and generate a unified representation of an object. We can perceive a multisensory object as a single entity because we can detect equivalent attributes or patterns across different senses." Neuroimaging studies have identified brain areas that integrate information coming from different senses -- combining input from across the senses to create a complete and comprehensive picture.

The results from The Neuro study strengthen the hypothesis that our perception of a coherent object or event ultimately occurs at an abstract level beyond the sensory input modes in which it is presented. This research provides important new insight into how our brains process the world as well as new possibilities for those with impaired senses.

The study was published in the journal Experimental Brain Research. The research was supported by grants from the Canadian Institutes of Health Research and the Natural Sciences and Engineering Research Council of Canada.

Editor's Note: This article is not intended to provide medical advice, diagnosis or treatment.
