Wednesday, January 7, 2015

Neuroscientists Get Better Feeling About Touch

from BIOENGINEER.ORG: http://bioengineer.org/neuroscientists-get-better-feeling-about-touch/



Our sense of touch is one we often take for granted, until a leg falls asleep and we can't stand, or we experience acute pain. The sense of touch has also been taken for granted in neuroscience, where it is the sense scientists know the least about.





Photo Credit: Chepko Danil



An international group of researchers, including Carnegie Mellon University’s Alison Barth, is changing that. For the first time, researchers have linked a group of neurons to a specific type of somatosensation, a finding that opens the door to a deeper understanding of our sense of touch. The research is published in the Dec. 3 issue of Neuron.


“Somatosensation is critical. You can somewhat overcome losing your sense of smell, sight, taste, or hearing. But if you lose your sense of touch, you wouldn’t be able to sit up or walk. You wouldn’t be able to feel pain,” said Barth, a professor of biological sciences and a member of Carnegie Mellon’s BrainHub℠ research initiative. “We know less about the features that make up our rich tactile experience than we do about any other sense, yet it’s such a critical sense.”


Somatosensation, which is another word for our sense of touch, occurs in a number of forms, like feeling texture, temperature, pressure, pain or vibration. It’s responsible for proprioception, which helps us know where we are within our environment. It tells us if our feet are firmly planted on the floor, or if we’re holding a paper cup tightly enough that it won’t slip out of our hand, but loosely enough that we don’t crush the cup. Scientists know a good deal about the molecular receptors that mediate the different types of somatosensation, but they know little about how touch is represented in the brain.


“When someone gets pricked by a pin, we know how information about that sensation travels from the skin to the spinal cord. But what happens in the brain has been much less clear — it seems like all different sorts of touch information get jumbled together,” said Barth, who also is a member of the joint Carnegie Mellon/University of Pittsburgh Center for the Neural Basis of Cognition (CNBC).


It was a jumble — until now.


In previous studies, Barth had discovered that certain groups of neurons in the brain’s neocortex were reliably more active than others. Using the fos-GFP mouse, a transgenic mouse model Barth created to study activity in live neurons, she and her colleagues set out to see if these neurons were generally more excitable, or if they responded specifically to one tactile stimulus. They found that these neurons reacted much more quickly and strongly when a puff of air was directed at the mouse’s whiskers, while other neurons had little or no response.


“This is the first time we’ve been able to visualize neurons in the somatosensory cortex that ‘like’ a specific tactile stimulus,” Barth said. “It shows that neurons are individuals. They have different jobs to do in the cortex. In this case these neurons had a special feature: they responded when all of the mouse’s whiskers moved at once.”


They also found that the neurons in question received direct synaptic input from the posteromedial nucleus of the brain’s thalamus. This shows that the neurons that react to the puff-of-air stimulus have a dedicated, unique sub-network of connections that enable them to communicate with one another and amplify the information they are receiving from the stimulus.


“Now that we have isolated the neural underpinnings of a certain feature, we can try to manipulate and change the interactions between cells. Can we train the mouse and strengthen the connections between neurons? What happens to perception if we remove the connections? It’s really the frontier of truly understanding somatosensory function,” Barth said.


This research could also lead to work identifying how somatosensory information is coded, which could be used to incorporate sensory information into brain-machine interfaces. This could allow robotic limbs and prosthetics to actively sense and receive tactile input.


Story Source:


The above story is based on materials provided by Carnegie Mellon University.

