How primate eye tracking reveals new insights into the evolution of language


The human environment is a very social one. Family, friends, colleagues, strangers – they all provide a continuous stream of information that we need to track and make sense of. Who is dating whom? Who is in a fight with whom? While our capacity for dealing with such a large social network is impressive, it’s not something especially unique to humans. Other primates do it too.

We – humans and other primates such as monkeys and apes – have something called social knowledge that allows us to keep track of the social dynamics of our friends, neighbours and even enemies.

What is perhaps different about humans, though, is the way in which we communicate about these dynamics. If I see my neighbours saying hello, I can easily express this in a sentence: “David is greeting Iris.” As far as research has shown, other primates can’t do this.

They can communicate about individual entities, such as alarm calling when there’s danger, or producing food calls when they find a food they like. But they don’t seem to express how an action is linked to the individuals involved.

And this is exactly what happens when I make a sentence like, “David greets Iris.” First, I say who is doing the action (David – the agent), then I express what he is doing (the action), and finally, to whom he is doing the action (the patient).

This structuring of the event is not only the case in English. The majority of languages prioritise agents through grammar, suggesting that this is something that is universal among humans.

Cross-linguistic studies have revealed similar biases when people view images of events. In tasks where people have to describe an image depicting an action, they rapidly identify the agent and spend more time looking at the agent than at the patient.

This points to the possibility that our ability to “deconstruct” events such as these, and our apparent bias for agents, might have its roots in an era before language evolved.

Eye tracking

To test this, alongside colleagues from Switzerland, I conducted an eye-tracking study with human adults, six-month-old infants, chimpanzees, gorillas and orangutans in a zoo.

We showed participants videos of social interactions, such as one orangutan embracing another, and non-social interactions, such as a person pushing a shelf, using a technique called infra-red eye tracking. This technique allows researchers to determine remotely where on a screen a viewer is looking. This meant that we could work with apes who watched the videos voluntarily, through a designated window.

A chimpanzee watching a video of an agent (left) brushing the hair of a patient (right). Red circles show her gaze switching over time. CC BY-SA
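For readers curious how gaze data of this kind are commonly handled, here is a minimal, purely illustrative sketch of assigning raw gaze samples from an eye tracker to rectangular "areas of interest" (AOIs), such as the screen regions occupied by the agent and the patient. It is not the study's actual pipeline, and all names, coordinates and sample values are hypothetical.

```python
# Illustrative sketch only: label raw gaze samples (x, y) by which
# area of interest (AOI) they fall in. All values are hypothetical.

from dataclasses import dataclass


@dataclass
class AOI:
    label: str
    x_min: float
    y_min: float
    x_max: float
    y_max: float

    def contains(self, x: float, y: float) -> bool:
        return self.x_min <= x <= self.x_max and self.y_min <= y <= self.y_max


# Example AOIs for a 1920x1080 screen: agent on the left, patient on the right.
AOIS = [
    AOI("agent", 100, 200, 800, 900),
    AOI("patient", 1100, 200, 1800, 900),
]


def label_sample(x: float, y: float) -> str:
    """Return which AOI a single gaze sample falls in, or 'background'."""
    for aoi in AOIS:
        if aoi.contains(x, y):
            return aoi.label
    return "background"


# gaze_samples: (timestamp_ms, x, y) tuples as an eye tracker might report them.
gaze_samples = [(0, 400, 500), (33, 420, 510), (66, 1300, 480)]
labels = [label_sample(x, y) for _, x, y in gaze_samples]
print(labels)  # ['agent', 'agent', 'patient']
```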

Our results revealed that both adults and apes were quick to identify agents, but only in scenes where the patients were objects.

In social interactions, figuring out who was the agent and who was the patient seemed to take longer. Unexpectedly, only in scenes depicting food did participants look mostly at the agent (who was eating or carrying food).

This lack of prioritisation of the agent in other scenes is probably because we showed videos, in which the action has to be tracked as it unfolds, rather than asking participants to make decisions from still images.

Why food scenes trigger such strong attention for agents is unclear, but may be because paying attention to who has food is important for survival. Intriguingly, our results showed very similar gaze patterns between the adult humans and the apes. As each scene unfolded, their gaze alternated between agent and patient.
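Building on the illustrative AOI sketch above, one simple way to quantify this kind of alternation is to count transitions between agent and patient looks in the labelled gaze stream. Again, this is a hypothetical example rather than the analysis used in the study.

```python
# Illustrative sketch only: count how often gaze alternates between the
# agent and patient AOIs, ignoring samples that land on the background.

def count_switches(labels: list[str]) -> int:
    """Count transitions between 'agent' and 'patient' looks."""
    on_actor = [label for label in labels if label in ("agent", "patient")]
    return sum(1 for a, b in zip(on_actor, on_actor[1:]) if a != b)


print(count_switches(["agent", "agent", "background", "patient", "agent"]))  # 2
```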

This suggests that apes make sense of such events in similar ways to people. What about infants? The infants showed very different gaze patterns. They appeared to mostly look at the background of each scene, suggesting that they were unable to identify information in the same way as adults.

This may be because, at this age, they cannot “compute” information at the same speed as adults, and probably also need to gain visual experience to help to quickly identify agents and patients.

Our findings, then, suggest that when presented with the kinds of scenes from which people can easily identify cause and effect, apes appear to be able to identify agents and patients – just like humans. This supports the idea that our propensity for “deconstructing” information about events is not something unique to language, but is an ability that we share with our closest living cousins.

Perhaps it provided a scaffold onto which we later built language. The question, then, is why other primates don’t communicate about events in the way that we do. This is a question to which we don’t yet have an answer.

However, it seems very possible that the social world in which humans and other apes evolved may well have helped to drive this disposition for identifying agents and patients, through keeping track of all those love-hate relationships.

So next time you see your neighbours saying hello, let it be a reminder that apes seem to view the world in almost the same way as we do.


This research was supported by funding from: the National Center for Competence in Research "Evolving Language" (SNSF agreement number 51NF40_180888); the Swiss National Science Foundation (project grant numbers 310030_185324, 100015_182845 and PZ00P1_208915); the National Center for Competence in Research "Evolving Language" Top-Up Grant (grant number N603-18-01); the Foundation for Research in Science and the Humanities at the University of Zurich (grant number 20-014); a seed money grant from the University Research Priority Program "Evolution in Action", University of Zurich; and the Jacobs Foundation.
