Research Project

Neural encoding of human-to-human touch expressions and emotion

Tracking contact interactions

August 12, 2020

Steven Hauser, Saad Nagi, Sarah McIntyre, Shan Xu, Hakan Olausson, and Gregory J. Gerling

Naturalistic human-to-human expressions and emotions

Social and emotional sentiment is often conveyed through physical expressions of touch. Rich spatio-temporal details underlie how a toucher's fingers and palm make contact with a receiver's forearm in these interactions. Prior efforts have relied mostly on qualitative, human visual observation to distinguish an emotion by certain gestures tied to the toucher's hand contact, velocity, and position. The work herein describes an automated approach to quantitatively identify the essential features of these gestures that convey an emotion's meaning. Although we are now beginning to understand and quantify the physical contact underlying these gestures (e.g., force, velocity, contact duration), it is unclear how these touches are perceived as unique emotions, or how they are encoded by first-order afferents.

Human-to-human touch as a stimulus

Historically, human perceptual and microneurography experiments have relied on well-controlled mechanical stimuli, delivered via force- or displacement-controlled indenters, brushing robots, or hand-held von Frey hairs. We have developed a tracking system to monitor human-to-human contact while a toucher attempts to convey an emotional sentiment, such as love, calming, or happiness, to a partner. In so doing, the toucher might use motions that involve tapping, stroking, holding, shaking, squeezing, etc. Because force mats or any sort of thin obstruction can completely alter the interaction, we have built a non-contact observation method that combines stereo infrared cameras, visual cameras, and magnetic tracking.
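
As a rough illustration of how contact might be inferred without instrumenting the receiver's skin, the sketch below flags a tracked fingertip as touching the forearm when its distance to a locally fit surface patch falls below a small threshold. The function name, threshold, and plane-based surface model are assumptions for illustration, not the system's actual geometry pipeline.

```python
import numpy as np

def infer_contact(fingertip_positions, arm_point, arm_normal, threshold_mm=5.0):
    """Flag which tracked fingertips are in contact with the forearm.

    Contact is inferred without instrumenting the skin: a fingertip counts as
    "in contact" when its signed distance to a local planar fit of the forearm
    surface drops below a small threshold (illustrative assumption).

    fingertip_positions : (N, 3) tracked fingertip coordinates, mm
    arm_point           : (3,) point on the forearm surface patch, mm
    arm_normal          : (3,) outward unit normal of the surface patch
    """
    offsets = fingertip_positions - arm_point   # vectors from the surface patch to each digit
    distances = offsets @ arm_normal            # signed distance along the outward normal
    return distances < threshold_mm             # True where a digit touches the arm


# Example: five digits, two of which lie within 5 mm of the forearm surface
digits = np.array([[0, 0, 3], [10, 5, 4], [20, 0, 30], [30, 5, 40], [40, 0, 50]], float)
print(infer_contact(digits, arm_point=np.zeros(3), arm_normal=np.array([0.0, 0.0, 1.0])))
```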

We need to move toward naturalistic human-to-human interactions, as opposed to relying solely on highly controlled stimuli. Different classes of neural afferents evolved to encode specific ecological interactions, and some of the most important involve personal interactions between couples.
Tracking human-to-human contact interactions from multiple observation points, to account for the unique setup and angles when conducting microneurography. In particular, the digit positions are corrected based on visual tracking of the fingertips.
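
A minimal sketch of the correction step described above, assuming the magnetic tracker propagates approximate digit positions from a hand-mounted sensor and the cameras triangulate fingertip positions when they are visible; the blending weight, array layout, and function name are illustrative, not the actual implementation.

```python
import numpy as np

def correct_digit_positions(magnetic_xyz, visual_xyz, visual_valid, blend=0.8):
    """Correct magnetic-tracker digit positions with visually tracked fingertips.

    magnetic_xyz : (5, 3) digit positions propagated from the magnetic sensor, mm
    visual_xyz   : (5, 3) fingertip positions triangulated from the cameras, mm
    visual_valid : (5,) bool, True where the cameras could see the fingertip
    blend        : weight given to the visual estimate when it is available
    """
    corrected = magnetic_xyz.copy()
    corrected[visual_valid] = (blend * visual_xyz[visual_valid]
                               + (1.0 - blend) * magnetic_xyz[visual_valid])
    return corrected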

Microneurography recordings from single afferents in humans

A case study with one rapidly adapting (Pacinian) afferent and one C-tactile afferent examines the temporal ties between gestures and the elicited action potentials. The results indicate this method holds promise for determining the roles of unique afferent types in encoding social and emotional touch attributes in their naturalistic delivery.

Our efforts are among the first to report methods that allow natural human-to-human gestures to be delivered and tracked as stimulus input simultaneously with single-afferent microneurography.
Responses of neural afferents to the physical quantities of human touch. During a stroking gesture, the PC (Pacinian corpuscle) afferent fired in short bursts as the receptive field was crossed (left); during tapping, the PC fired synchronously with each tap (right). The position of the receptive field on the touch receiver’s arm/hand is illustrated.

Neural responses of a C-tactile afferent. During a stroking gesture, the afferent responded to the high shear velocity (left); during tapping, it fired infrequently (right).
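
One way to examine these temporal ties offline is to smooth the recorded spike times into an instantaneous firing rate and compare it against the tracked contact kinematics, such as the tangential (shear) velocity. The sketch below is illustrative only; the kernel width, sampling grid, and example data are assumptions rather than the recorded signals.

```python
import numpy as np

def instantaneous_rate(spike_times_s, t_grid_s, sigma_s=0.05):
    """Estimate instantaneous firing rate (spikes/s) on a time grid by
    smoothing the spike train with a Gaussian kernel."""
    rate = np.zeros_like(t_grid_s)
    for t_spike in spike_times_s:
        rate += np.exp(-0.5 * ((t_grid_s - t_spike) / sigma_s) ** 2)
    return rate / (sigma_s * np.sqrt(2 * np.pi))


# Illustrative data: spikes bursting while tangential velocity is high
t = np.arange(0.0, 2.0, 0.01)                           # 100 Hz analysis grid, s
velocity = np.where((t > 0.5) & (t < 1.0), 80.0, 5.0)   # mm/s, a single stroke
spikes = np.array([0.55, 0.58, 0.62, 0.66, 0.71, 0.80, 0.90, 1.5])
rate = instantaneous_rate(spikes, t)
print("rate-velocity correlation:", np.corrcoef(rate, velocity)[0, 1])
```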

Physical quantities distinguish touch expressions

Human-subjects experiments with five romantic couples evaluated their ability to convey social touch expressions and measured six contact metrics as dependent variables (contact area, tangential velocity, normal velocity, percent of time with the palm in contact, number of fingers in contact, and contact duration). The six social touch expressions were happiness and sadness (Ekman's emotions), love and gratitude (prosocial emotions), and attention and calming.
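
As a sketch of how such per-gesture metrics might be summarized from frame-by-frame tracking output, the function below averages each quantity over the frames in which the hand touches the arm. The field names and the exact definitions (e.g., averaging over contact frames only) are assumptions, not the study's precise formulas.

```python
import numpy as np

def contact_metrics(frames, dt_s):
    """Summarize one gesture as six contact metrics (field names are illustrative).

    `frames` maps names to per-sample numpy arrays:
      contact_area_cm2, tangential_vel_mms, normal_vel_mms (floats per frame),
      palm_in_contact (bool per frame), fingers_in_contact (int per frame).
    `dt_s` is the sampling interval in seconds.
    """
    touching = (frames["fingers_in_contact"] > 0) | frames["palm_in_contact"]
    if not touching.any():
        return None                                   # hand never reached the arm
    return {
        "contact_area_cm2":   float(frames["contact_area_cm2"][touching].mean()),
        "tangential_vel_mms": float(frames["tangential_vel_mms"][touching].mean()),
        "normal_vel_mms":     float(frames["normal_vel_mms"][touching].mean()),
        "pct_palm_contact":   100.0 * float(frames["palm_in_contact"][touching].mean()),
        "n_fingers":          float(frames["fingers_in_contact"][touching].mean()),
        "contact_duration_s": float(touching.sum()) * dt_s,
    }
```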

Our findings indicated, as illustrated in the figure below, that a typical sadness expression involves more contact, evolves more slowly, and impresses less deeply into the forearm than a typical attention expression.

Furthermore, we found that each expression tends to be conveyed via 2-5 strategies with variable recognition rates, per the figure below.

We developed six quantitative contact characteristics capable of describing and differentiating the gestures used to communicate our word cues, which varied in emotional content.
Comparison of median contact characteristics per gesture for two of the emotions. Each contact characteristic is normalized by the average interquartile range (IQR) across gestures. Normal and tangential velocities, in particular, separate the six gestures well.
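
A minimal sketch of the IQR-based normalization described in the caption above, applied to one contact characteristic at a time; the exact normalization used in the analysis may differ in detail, and the function name is a placeholder.

```python
import numpy as np

def iqr_normalize(values_by_gesture):
    """Scale one contact characteristic by its average IQR across gestures.

    values_by_gesture : dict mapping gesture name -> 1-D array of raw values.
    Returns a dict of the same shape with every value divided by the mean IQR.
    """
    iqrs = [np.subtract(*np.percentile(v, [75, 25])) for v in values_by_gesture.values()]
    scale = np.mean(iqrs) or 1.0                      # guard against an all-zero IQR
    return {g: np.asarray(v) / scale for g, v in values_by_gesture.items()}
```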

Common expression strategies for each gesture. Multiple "expression strategies" were determined via k-means clustering analysis. For each gesture, the strategies are listed along with 1) the percent of the gestures clustered into each strategy, 2) the number of unique participants (out of 10) who used each strategy, 3) a description and a representative "snapshot" of each gesture, 4) the relative magnitudes of our six contact characteristics, and 5) the percent of the time that the receiver correctly identified the strategy. Percent totals are floored to the nearest whole percent.
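
The strategy discovery could be reproduced in outline with an off-the-shelf k-means implementation applied to the normalized contact characteristics, as sketched below; the number of clusters, feature layout, and stand-in data are placeholders rather than the study's actual settings.

```python
import numpy as np
from sklearn.cluster import KMeans

def find_strategies(feature_matrix, n_strategies=3, seed=0):
    """Cluster one gesture's trials into expression strategies.

    feature_matrix : (n_trials, 6) array of the normalized contact characteristics.
    Returns each trial's strategy label and each strategy's share of the trials.
    """
    km = KMeans(n_clusters=n_strategies, n_init=10, random_state=seed)
    labels = km.fit_predict(feature_matrix)
    shares = np.bincount(labels, minlength=n_strategies) / len(labels)
    return labels, 100.0 * shares                     # percent of trials per strategy


# Illustrative use with random stand-in data for one gesture
rng = np.random.default_rng(0)
labels, pct = find_strategies(rng.normal(size=(30, 6)))
print(labels, np.floor(pct))                          # % totals floored, as in the table
```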

References

  • Hauser, S.C., McIntyre, S., Israr, A., Olausson, H., and Gerling, G.J. Uncovering Human-to-Human Physical Interactions that Underlie Emotional and Affective Touch Communication [DOI][PDF]. IEEE World Haptics Conference, 2019.
  • Hauser, S.C., Nagi, S.S., McIntyre, S., Israr, A., Olausson, H., and Gerling, G.J. From Human-to-Human Touch to Peripheral Nerve Responses [DOI][PDF]. IEEE World Haptics Conference, 2019.
  • Boehme, R., Hauser, S.C., Gerling, G.J., Heilig, M., and Olausson, H. Distinction of self-produced touch and social touch at cortical and spinal cord levels [DOI][PDF]. Proceedings of the National Academy of Sciences, 2019.