Steven Hauser, Saad Nagi, Sarah McIntyre, Shan Xu, Hakan Olausson, and Gregory J. Gerling
Social and emotional sentiment is often conveyed through physical expressions of touch. Rich spatio-temporal details underlie how a toucher's fingers and palm make contact with a receiver's forearm. Prior efforts have relied mostly on qualitative, human visual observation to associate an emotion with particular gestures, characterized by hand contact, velocity, and position. The work herein describes an automated approach to quantitatively extracting the essential features of the gestures that convey an emotion's meaning. Although we are now beginning to understand and quantify the physical contact underlying these gestures (e.g., force, velocity, contact duration), it remains unclear how these touches are perceived as distinct emotions, or how they are encoded by first-order afferents.
Historically, human perceptual and microneurography experiments have relied on well-controlled mechanical stimuli, whether delivered via force- or displacement-controlled indenters, brushing robots, or hand-held von Frey hairs. We have developed a tracking system to monitor human-to-human contact while a toucher attempts to convey an emotional sentiment, such as love, calming, or happiness, to a partner. In doing so, touchers tend to use motions that involve tapping, stroking, holding, shaking, squeezing, etc. Because force mats or any sort of thin obstruction can completely alter the interaction, we have built a non-contact observation method that combines stereo infrared cameras, visual cameras, and magnetic tracking.
We need to move toward naturalistic human-to-human interactions, as opposed to relying solely on highly controlled stimuli. Neural afferents of different types evolved to encode specific ecological interactions, and some of the most important of these involve personal interactions between couples.
A case study with one rapidly-adapting (Pacinian) and one C-tactile afferent examines temporal ties between gestures and elicited action potentials. The results indicate this method holds promise in determining the roles of unique afferent types in encoding social and emotional touch attributes in their naturalistic delivery.
Our efforts are among the first to report methods that allow natural human-to-human gestures to be delivered and tracked as stimulus input simultaneously with single-afferent microneurography.
Human subjects experiments with five romantic couples evaluated their ability to convey social touch gestures, measuring six contact metrics as dependent variables: contact area, tangential velocity, normal velocity, percent of time with the palm in contact, number of fingers in contact, and contact duration. The six social touch expressions were happiness and sadness (Ekman's emotions), love and gratitude (prosocial emotions), and attention and calming.
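To illustrate how velocity metrics of this kind can be derived from tracked positions, the sketch below decomposes sampled fingertip velocity into components normal and tangential to the forearm surface. This is a hypothetical helper, not the paper's actual pipeline; the function name, sampling scheme, and use of NumPy are all assumptions.

```python
import numpy as np

def decompose_velocity(positions, dt, surface_normal):
    """Split sampled fingertip velocity into components normal and
    tangential to the forearm surface (hypothetical sketch; the
    actual processing pipeline is not specified in the text).

    positions      : (T, 3) array of tracked 3D positions (e.g., mm)
    dt             : sampling interval in seconds
    surface_normal : 3-vector approximating the local forearm normal
    """
    n = np.asarray(surface_normal, dtype=float)
    n = n / np.linalg.norm(n)  # unit normal

    # Finite-difference velocity between consecutive samples
    v = np.diff(np.asarray(positions, dtype=float), axis=0) / dt

    # Signed speed along the surface normal (into/out of the arm)
    v_normal = v @ n

    # Speed of the residual, in-surface motion (stroking component)
    v_tangential = np.linalg.norm(v - np.outer(v_normal, n), axis=1)
    return v_normal, v_tangential
```

For example, a fingertip sliding along the arm at constant speed, with the normal taken as the z-axis, would yield zero normal velocity and a constant tangential velocity equal to the sliding speed.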
Our findings indicated, for example in the figure below, that a typical sadness expression involves more contact, evolves more slowly, and presses less deeply into the forearm than a typical attention expression.
Furthermore, we found that each expression tends to be delivered via two to five strategies with variable recognition rates, per the figure below.
We developed six quantitative contact characteristics capable of describing and differentiating the gestures used to communicate our word cues, which varied in emotional content.