Grant Details
| Project Lead | Michael E. Goldberg, M.D. |
| --- | --- |
| Amount | $450,000 |
| Year Awarded | |
| Duration | 3 years |
| DOI | https://doi.org/10.37717/220020047 |

Summary
Any creature that moves must be able to perceive where it is, where it is going, and where the objects and creatures around it are. Loss of accurate spatial perception is devastating to patients who have had damage to the parietal lobe of the cerebral cortex, the part of the human (and monkey) brain that analyzes space. Unlike other aspects of sensation, spatial perception is supramodal: we know where things are not only by looking at them, but also by hearing them, feeling them, or, as even Shakespeare knew, smelling them. In King Lear, Regan tells the newly blinded Gloucester to "smell his way to Dover." It was apparent to 19th-century neuroscientists like Hermann von Helmholtz and John Hughlings Jackson that the visual perception of space required two things: knowledge of where an object lies on the retina, and knowledge of where the retina is in space.

The first step in understanding where the eye is in space is knowing where it lies in the orbit, and that means that the brain must get a position signal of some kind from the eye. The source of this eye position signal is not known; it could arise in two ways. One source is outflow: an 'efference copy' or 'corollary discharge' of the eye position command ultimately used to drive the oculomotor neurons, which control the muscles that rotate the eye. Helmholtz proposed this theory in the 19th century. The second source is inflow: a direct signal from specialized cells within the muscles themselves that report the length of the muscle back to the brain. These cells are called proprioceptors. Since muscles move the eye by changing their own length, anything that senses muscle length will automatically sense the position of the eye in the orbit. In the skeletal muscles, proprioceptors are well understood: they are called muscle spindles, and they have a specialized structure that includes a separate muscle fiber that controls their length, allowing them to work continuously through the range of action of a muscle. One section of the parietal lobe, the primary somatosensory cortex, has a representation of the length of every muscle, an arrangement that lets humans know where their arms and legs are. The most common proprioceptive structure in the primate eye muscles is the palisade ending, which, like the skeletal muscle spindle, has its own muscle fiber. The palisade ending fibers travel in the ophthalmic branch of the trigeminal nerve, the major nerve carrying sensation from the eye and the skin above it to the brain.

There is abundant evidence for eye position signals in various areas of the visual cortex. Neurons in the posterior parietal cortex report eye position in the dark. More commonly, the responses of visual neurons themselves are affected by the position of the eye in the orbit. This observation, first made in the posterior parietal cortex, has been replicated in a number of visual areas, and even in the lateral geniculate nucleus, the earliest visual nucleus in the brain. Neuroscientists have assumed that this phenomenon is important in the generation of spatially accurate behavior. The source of this eye position signal is unknown, although conventional wisdom suggests that it comes from an outflow signal. It is possible, however, that the eye position signal modulating the visual response arises instead from a proprioceptive signal. Neurons in the lateral intraparietal area, whose responses are modulated by eye position, also have their visual responses modulated by the position of the head on the body. It is clear that neck proprioception is responsible for this head-position-induced modulation of visual responses. By extension, it is possible that the eye-position modulation of visual responses also arises from a proprioceptive input, in this case from oculomotor muscle proprioception.

We know that oculomotor proprioceptors project to the spinal nucleus of the trigeminal nerve, the first way station in the somatosensory pathway. It is not a great leap to assume that they join the rest of the trigeminal system, projecting to the ventral posteromedial thalamus and thence to the primary somatosensory cortex, just like the skeletal muscle proprioceptors. I propose to find and describe the representation of eye position in the primary somatosensory cortex of the rhesus monkey, whose basic visual, oculomotor, and somatosensory processes are very similar to those of humans. The cortical representation of oculomotor proprioception has never been described, nor has it ever been postulated in the literature. Even the speculations of Büttner-Ennever and Horn (2002) on the function of oculomotor proprioception stop at the superior colliculus, a brain center far more primitive than the cerebral cortex. Nonetheless, it is difficult to imagine that the oculomotor proprioceptive system would be the only proprioceptive system without a cortical representation. The representation must be there.

The proposed experiments are, first, to find the representation of eye position in somatosensory cortex; then to establish that it indeed comes from proprioception and not from a motor command; and finally to determine the role of this representation in the generation of the other cortical eye position signals. If oculomotor proprioception is the primary mechanism by which eye position modulates the responses of visual neurons, then current hypotheses of how the brain analyzes space will have to be totally revised. Understanding the nature of spatial perception will give insight into the processes lacking in humans who have deficits in spatial perception. Such deficits can arise developmentally, as in Williams syndrome, or as the result of strokes or tumors affecting the parietal lobe. Understanding these processes may ultimately lead to the design of better rehabilitative strategies for these devastating conditions.
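The eye-position modulation of visual responses described in the summary is commonly modeled as a multiplicative "gain field." The following minimal sketch in Python is illustrative only, not part of the proposal; the function name and all parameter values are assumptions chosen for illustration. It shows how an identical retinal stimulus can evoke different firing rates at different eye positions, and how a head-centered location can in principle be recovered as retinal position plus eye position.

```python
import math

def gain_field_response(retinal_pos_deg, eye_pos_deg,
                        pref_retinal_pos_deg=0.0, rf_width_deg=10.0,
                        baseline_gain=1.0, gain_slope=0.02):
    """Firing rate = Gaussian retinal receptive field x linear eye-position gain.

    A hypothetical single-neuron model of a planar gain field; parameters are
    illustrative, not fitted to any data.
    """
    visual_drive = math.exp(
        -0.5 * ((retinal_pos_deg - pref_retinal_pos_deg) / rf_width_deg) ** 2
    )
    eye_gain = baseline_gain + gain_slope * eye_pos_deg  # linear eye-position gain
    return max(0.0, visual_drive * eye_gain)             # firing rates are non-negative

# Same retinal stimulus (0 deg from the receptive-field center), three eye positions:
# the retinal image is identical, but the modeled firing rate differs.
for eye_pos in (-20.0, 0.0, 20.0):
    rate = gain_field_response(retinal_pos_deg=0.0, eye_pos_deg=eye_pos)
    head_centered = 0.0 + eye_pos  # head-centered location = retinal position + eye position
    print(f"eye position {eye_pos:+.0f} deg -> relative rate {rate:.2f}, "
          f"head-centered location {head_centered:+.0f} deg")
```

A population of such neurons with different preferred retinal positions and gain slopes could, in principle, encode stimulus location in head-centered coordinates regardless of where the eye happens to be pointing; the source of the eye-position term in that computation is exactly what the proposed experiments address.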