Horses differentiate human expressions of sadness and joy

15.08.2023

A new study shows that horses can differentiate between expressions of joy and sadness displayed by humans through facial movements or voice tones. Horses were more attracted to facial expressions of joy than of sadness and seemed more excited by joyful voices.

Emotions are present in all human-to-human interactions and communication, and they could also play a role in interspecific communication. Diverse animal species, from orangutans to pigeons, are already known to perceive human emotions, and domestic mammals have been the focus of several studies in recent years. Dogs, cats, horses, and even goats have been shown to distinguish different human facial expressions of emotions. However, this area of research has mainly focused on two emotions: joy and anger. What about other emotions, such as sadness?

The international research team from the National Research Institute for Agriculture, Food and Environment (INRAE) and the University of Tours in France, and the University of Turku in Finland, observed and analysed horses’ behaviour when presented with human faces and voices expressing joy or sadness. The heart rate of the horses was also recorded during the experiment.

“Sadness is a particularly interesting emotion, because it is not only of negative valence – contrary to joy, which is positive – but it is also of low arousal. Previous studies have shown that horses react to high-arousal emotions like anger or joy. Could they also detect signals of sadness, a low-arousal emotion? We wanted to study whether horses can associate the vocal and facial signals of human sadness, as they can for joy and anger,” says the lead author of the study, Doctoral Researcher Plotine Jardat from the French National Institute for Agriculture, Food and the Environment and the University of Tours, France.

Horses were surprised by incompatibility of sad look and cheerful voice 

During the study, the horses were placed in front of two screens displaying the face of the same person expressing joy on one screen and sadness on the other. Simultaneously, a voice was broadcast, expressing either joy or sadness.

The horses’ first looks indicated that they matched the face and voice expressing the same emotion, whether sadness or joy. The researchers observed that during the horses’ first glances at each image, more of the horses spent longer looking at the image that did not match the sound than at the image that did.

In other words, when the horses first looked at the images, they were, as expected, surprised by the incompatibility between the sad face and the joyful voice, and vice versa. This suggests that horses can associate a human face and voice expressing the same emotion, be it sadness or joy. 

“This is interesting because it would mean that when horses observe our faces and hear our voices, they don’t just see and hear separate things, but they are able to match them across different modalities. You could imagine that they have a particular box in their mind labelled ‘human sadness’ containing the characteristics of both a human sad face and a human sad voice,” says Doctoral Researcher Océane Liehrmann from the University of Turku.

A similar setup has been used in several previous experiments by the team. Its purpose is to explore the animals’ mental processing of the images and sounds and their congruence. In the previous experiments on anger and joy as well as on the perception of adults and children, the horses reacted to the setup by looking more at the image that did not match the sound. The researchers believe that horses look more at the incongruent image because they are intrigued by the lack of correspondence between this image and what they hear.

Horses looked more at joyful images and seemed more excited by joyful voices

The researchers also observed that after the initial look, the horses focused on the screen showing the joyful face, looking at it longer and more often. Moreover, their heart rates seemed to increase more when the broadcast voice expressed joy rather than sadness, suggesting the horses were in a higher arousal state when hearing joyful voices.

According to the researchers, three hypotheses could explain these observations. First, the horses could have been more attracted to the joyful images because of greater movement, and more excited by the joyful voices because of acoustic characteristics such as pitch variations. Second, horses could have associated joyful human faces with positive situations, so they would prefer to look at these expressions linked to positive memories. Third, horses could feel more positively when looking at the images of joy, and more aroused when hearing joyful voices, because of a phenomenon called “emotional contagion”. Emotional contagion is the correspondence of the emotional state of an observer with the emotional state of the individual they observe. It has been described in humans and primates, and several studies have suggested that it could also occur between humans and other animals, such as horses. This phenomenon is often regarded as a precursor to empathy.

“Overall, our study shows that horses can differentiate auditory and visual signals of human joy and sadness, and associate the corresponding vocal and facial expressions. Horses were also more attracted to, and seemed more animated by, joyful expressions, so people who interact with horses could benefit from expressing joy during these interactions,” Plotine Jardat concludes.

The researchers point out that further studies are needed to better understand horses’ perception of human sadness. In the future, researchers aim to find out, for example, whether horses can also differentiate sadness from other negative emotions, or whether sad expressions from humans can influence horses’ behaviour, especially during human-horse interactions.


Contact information:

Océane Liehrmann, oceane.liehrmann@utu.fi, https://oceane-liehrmann.webador.fr/
Plotine Jardat, French National Institute for Agriculture, Food and the Environment, plotine.jardat@inrae.fr 
