Feel the Beat

20.07.2000


How do the eyes 'speak' with the brain to convey visual information? And how does the brain reconstruct tactile sensations formed at one's fingertips?


Scientists have long known that sensory information such as touch, vision, and hearing is converted into electrochemical impulses and channeled by neuronal pathways to respective sensory areas in the brain. The data, organized in the same spatial pattern as that obtained by the sensory organs, is then processed to decode the messages it carries.


However, a recent study by Dr. Ehud Ahissar of the Weizmann Institute of Science suggests that in processing tactile information, the brain requires more than spatial patterning. Put simply - it needs to feel the beat.


Due to appear in the July 20 issue of Nature, the study shows that tactile information is encoded by both spatial and temporal data. In a series of experiments in animal models, Ahissar and colleagues at the Neurobiology Department found that soon after sensory input is translated into electrical impulses, these impulses split into two parallel pathways: the first is probably responsible for spatial data, while the second processes temporal data. 'The brain compares the timing of cells activated by external stimuli to the internal timing of other brain cells, which operate much like a metronome,' says Ahissar.
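To make the metronome analogy concrete, the sketch below is a minimal toy model in Python of comparing the arrival time of an external event against an internal oscillator. The oscillator period, the contact times, and the phase-based readout are illustrative assumptions for this sketch only; they are not parameters or methods reported in the study.

```python
import math

def decode_position_by_timing(stimulus_times, oscillator_period=0.1):
    """Toy phase comparison: for each stimulus event, measure its latency
    relative to the most recent 'tick' of an internal metronome-like
    oscillator. The resulting phase is a purely temporal code for when,
    and hence roughly where along a sweep, the stimulus occurred.
    (Illustrative assumption, not the study's actual model.)"""
    phases = []
    for t in stimulus_times:
        last_tick = math.floor(t / oscillator_period) * oscillator_period
        phase = (t - last_tick) / oscillator_period  # normalized 0..1
        phases.append(phase)
    return phases

# Example: a fingertip sweeping at constant speed meets three ridges.
# Later contacts fall later in the oscillator cycle, so their relative
# timing alone distinguishes their positions along the sweep.
ridge_contact_times = [0.012, 0.047, 0.081]  # seconds (made-up values)
print(decode_position_by_timing(ridge_contact_times))
```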


Previous attempts to uncover the functional significance of the temporal pathway were thwarted by technical difficulties. Ahissar's team overcame this obstacle by designing an improved experimental strategy, which included using stimuli similar to those found in the natural environment.


The findings also suggest a 'where' and 'what' scheme, encoded by the frequency of incoming stimuli. The temporal pathway primarily processes low-frequency stimuli, sufficient for determining an object's location. In contrast, the spatial pathway focuses on high-frequency stimuli, making it better suited for representing an object's features. 'Most textures in nature are highly detailed,' explains Ahissar. 'Imagine running your finger across a fabric. The more complex the texture, the more 'bumps' of tactile information encountered, meaning a higher frequency of incoming data to be processed.'
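The link between texture detail and input frequency can be put in rough numbers. The short Python sketch below assumes a fixed scanning speed and spacing between texture 'bumps', and uses an arbitrary placeholder cutoff to route low-frequency input to the 'where' pathway and high-frequency input to the 'what' pathway; the specific values and the cutoff are assumptions made for illustration, not figures from the research.

```python
def tactile_event_rate(scan_speed_mm_s, bump_spacing_mm):
    """Rate at which the fingertip encounters texture 'bumps':
    finer spacing at the same scanning speed -> higher input frequency."""
    return scan_speed_mm_s / bump_spacing_mm

def route_channel(frequency_hz, cutoff_hz=50.0):
    """Illustrative routing rule: low-frequency input suffices for the
    'where' (temporal) pathway, while high-frequency input is better
    handled by the 'what' (spatial) pathway. The 50 Hz cutoff is an
    arbitrary placeholder, not a value from the study."""
    return "temporal ('where')" if frequency_hz < cutoff_hz else "spatial ('what')"

for spacing in (5.0, 0.5):  # coarse vs. fine texture, spacing in mm
    f = tactile_event_rate(scan_speed_mm_s=100.0, bump_spacing_mm=spacing)
    print(f"bump spacing {spacing} mm -> {f:.0f} Hz -> {route_channel(f)}")
```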


The current study may contribute to improving tactile aids for the blind. According to Ahissar, these aids were originally designed to accommodate the known physiology of the fingertip without regard to brain processing. For instance, Braille letters are too long, so users often have to scan the information twice to retrieve all the data and then 'stitch' it together. Since the new evidence suggests that efficient tactile processing must be continuous, this may explain the relatively low speed of most Braille readers. As with much in life, it's all about timing.


This research is funded by the Abramson Family Foundation, North Bethesda, MD, the Nella and Leon Benoziyo Center for Neurosciences and the Murray H. and Meyer Grodetsky Center for Higher Brain Functions.

 

The Weizmann Institute of Science is a major center of scientific research and graduate study located in Rehovot, Israel. Its 2,500 scientists, students and support staff are engaged in more than 1,000 research projects across the spectrum of contemporary science.

 

Keeping Time: Soon after sensory input is translated into electrical impulses, these divide into parallel pathways, processing spatial and temporal data respectively.
