Texture Perception

Our sense of touch endows us with an exquisite sensitivity to surface texture. We can discern surfaces whose elements are tens of nanometers in size and hundreds of nanometers apart. The perception of texture not only allows us to make fine discriminations – like telling real silk from fake silk – but also guides object manipulation. For example, our perception of the surface properties of objects informs how much grip force we apply to them: more force is required for slippery objects. One of the remarkable aspects of tactile texture processing is that it operates over six orders of magnitude in element size, from the smallest discernible elements (on the order of tens of nanometers) to the largest elements that can fit on a fingertip (tens of millimeters). We have shown that this wide range of scales is accommodated by distributing information across three types of nerve fibers, each sensitive to surface elements at a different spatial scale. Importantly, these afferent populations convey texture information in different ways. Coarse textural features, on the order of millimeters, are conveyed in the spatial pattern of activation of one afferent population, analogous to the representation of visual texture on the retina.

In contrast, fine textural features – with sizes down to tens of nanometers – are conveyed in the temporal spiking patterns of two other afferent populations, driven by the skin vibrations elicited when a textured surface moves across the skin, a representation analogous to audition. How these two types of representations are integrated to achieve a unitary sensory experience of texture is a mystery. Furthermore, while afferent responses depend strongly on exploratory parameters, such as contact force and scanning speed, the perception of texture is largely invariant with respect to these parameters. Thus, neural signals must be interpreted in the context of how they are acquired. Nothing is known about how this is achieved.
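The speed dependence of these vibration-driven signals can be made concrete with a simple relation: scanning a periodic texture elicits skin vibrations whose fundamental frequency is the scanning speed divided by the spatial period of the surface elements. A minimal sketch (the specific speeds and periods are illustrative, not measurements):

```python
# Toy illustration (not lab code): the fundamental frequency of the skin
# vibration elicited by scanning a periodic texture equals the scanning
# speed divided by the spatial period of the surface elements.

def vibration_frequency_hz(scan_speed_mm_s: float, spatial_period_mm: float) -> float:
    """Temporal frequency (Hz) of texture-elicited skin vibrations."""
    return scan_speed_mm_s / spatial_period_mm

# A 1-mm grating scanned at 80 mm/s drives an 80-Hz vibration; doubling
# the speed doubles the frequency. Speed-invariant texture perception
# therefore requires interpreting spike timing in light of scanning speed.
print(vibration_frequency_hz(80.0, 1.0))   # 80.0
print(vibration_frequency_hz(160.0, 1.0))  # 160.0
```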

Proprioception

Sensory signals about the posture and movements of our hands are critical to our ability to dexterously interact with objects. Despite its importance, little is known about how hand proprioception is encoded in the responses of neurons in primary somatosensory cortex (S1). To fill this gap, we use state-of-the-art motion tracking to measure the movements of the hand as animals perform different motor tasks, while recording the evoked activity of S1 neurons with chronically implanted multielectrode arrays. We then use a variety of mathematical techniques to reveal how information about hand movements and posture is encoded in the responses of populations of S1 neurons.

Neural Coding in the Brainstem

The responses of individual peripheral afferents convey ambiguous information about tactile stimuli, while those of cortical neurons explicitly carry information about behaviorally relevant stimulus features. Little is known about the response properties of neurons in the two intervening structures in the medial lemniscal pathway, namely the dorsal column nuclei and the ventroposterior nucleus of the thalamus. In collaboration with Lee Miller and Joshua Rosenow at Northwestern University, we have developed an approach to chronically implant electrode arrays in the cuneate nucleus (CN) of primates (the dorsal column nucleus that relays input from the upper limb) so that we are able, for the first time, to record single-unit activity from this structure in awake, behaving animals. From these recordings, we hope to compare stimulus representations in CN to their counterparts at the periphery and in cortex, to glean a better understanding of the role of this neural structure in sensory information processing.

Neuroprosthetics

One approach to restoring sensorimotor function in amputees or tetraplegic patients consists of equipping them with anthropomorphic robotic arms that are interfaced directly with the nervous system. To control these arms, not only must motor intention be translated into movements of the limb, but sensory signals must also be transmitted from the limb to the patient. Indeed, without these signals, controlling the arm is slow, clumsy, and effortful. With this in mind, we develop approaches to convey meaningful and naturalistic sensations through stimulation of peripheral or cortical neurons, attempting to reproduce, to the extent possible, the patterns of neuronal activation that are relevant for basic object interactions. We anticipate that these studies will constitute an important step towards restoring the sense of touch to those who have lost it.
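As one concrete flavor of such an encoding scheme, contact force measured on a prosthetic fingertip can be mapped onto stimulation amplitude, bounded below by a detection threshold and above by a safety ceiling. This is a hedged sketch; every parameter value below is an illustrative assumption, not the lab's actual stimulation parameters:

```python
# Hedged sketch (all numbers are illustrative assumptions): map contact
# force from a prosthetic fingertip sensor onto a stimulation current
# amplitude, between a detection threshold and a safety ceiling.

def force_to_stim_amplitude_ua(force_n: float,
                               max_force_n: float = 10.0,
                               threshold_ua: float = 20.0,
                               ceiling_ua: float = 100.0) -> float:
    """Map sensor force (N) to stimulation amplitude (microamps)."""
    if force_n <= 0.0:
        return 0.0  # no contact -> no stimulation
    frac = min(force_n / max_force_n, 1.0)  # clip at the safety ceiling
    return threshold_ua + frac * (ceiling_ua - threshold_ua)

print(force_to_stim_amplitude_ua(0.0))   # 0.0 (no contact)
print(force_to_stim_amplitude_ua(5.0))   # 60.0
print(force_to_stim_amplitude_ua(20.0))  # 100.0 (clipped at ceiling)
```

A linear force-to-amplitude mapping is only the simplest possibility; biomimetic schemes instead aim to reproduce the patterns of neuronal activation that natural contact would evoke, as described above.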