At a very young age, babies learn to smile back. Soon after, they realize that if they themselves initiate the smiling, they will get a smile in return. As adults, too, we automatically correlate our own facial expressions with those of others: Being greeted with a scowl by someone at whom we have smiled feels completely different from a scowl prompted by our own hostile stare. In other words, the ability to identify the facial expressions of all participants, including ourselves, is crucial for interpreting complex social situations. In people with autism, this ability is markedly impaired. Yet despite a large body of research into the brain mechanisms dealing with the facial representation of emotions, virtually no studies have focused on the ways in which the brain monitors one’s own behavior.
A study conducted in the laboratory of Dr. Rony Paz of the Weizmann Institute’s Neurobiology Department has for the first time revealed how the brain integrates self-monitoring with observations of others. These findings, reported in the Proceedings of the National Academy of Sciences (PNAS), also open new avenues of research into the faulty processing of emotional and social information, such as that which occurs in people with autism.
Research student Uri Livneh, together with Jennifer Resnik and Yosef Shohat, created a unique experimental system in which two monkeys, separated by an opaque shutter, faced one another. Periodically, the shutter turned clear for a few seconds, enabling the monkeys to interact spontaneously, through nonvocal facial expressions, as they do in nature. The scientists identified three major expressions: a positive one, characterized by smacking the lips and contracting the surrounding muscles; a threatening one, characterized mainly by raised eyebrows and increased eye movements; and a neutral one, observed when the shutter was closed and no interaction took place. In the course of the experiment, the scientists measured the electrical activity of individual neurons in the monkeys’ brains.
The scientists focused on two brain areas involved in processing emotional information and responding to positive and negative facial expressions: the amygdala and a certain portion of the cortex. They found that neurons in both areas performed two distinct functions: they processed information about the individual’s own facial expressions, and they decoded the expressions of others.
The scientists were able to differentiate the electrical signals involved in these two types of decoding only by conducting extremely precise measurements, at a resolution of a few dozen milliseconds. These measurements revealed that the amygdala actually knows about a smile before it occurs: it receives a neural signal directly from the part of the brain that controls the facial muscles as the smile is being formed. This signal then provides the proper context for interpreting the facial expressions of others. Thanks to this close overlap of the neural networks charged with interpreting one’s own and others’ behavior, the amygdala and cortex can quickly receive all the relevant information and create a complete and accurate picture of a social situation.
In follow-up research, the scientists plan to examine whether the same networks are involved in social learning – that is, learning through observing others. For example, if one monkey learns something through classical conditioning, can another monkey learn the same thing just by watching? Does the monkey brain have a neural network for learning by observation? And if so, what can this network teach us about human social learning?
Yet another potential future study concerns neuropsychiatric disorders characterized by defective social communication, as in autism. People with autism have difficulty engaging in and interpreting emotional and social situations. Previous research has shown that the neural networks examined in the current study don’t function properly in these people, but appropriate models for studying these networks in depth were not available. “The unique experimental system we have developed can provide a basis for creating a natural model of autism,” says Paz. “The system enables researchers to observe natural and versatile social behavior unique to primates while at the same time analyzing the complex neural networks underlying this behavior.”
A connection with animals
Yosef Shohat, lab technician and manager of Dr. Rony Paz’s laboratory, is an animal person. As a child, he used to go to the beach after storms to collect wounded seagulls and take care of them at home. Birds, dogs, insects – all found shelter on his balcony in Kiryat Haim, much to the displeasure of his mother. At age 25, his dedication to animals led to a tragic incident: Yosef tried to restrain a camel that had gone wild while carrying children on its back, and he was attacked and nearly fatally wounded by the animal. He remained paralyzed for a long time, but following prolonged rehabilitation – and contrary to the prognosis of all his doctors – he regained control of his body. Yosef credits animals with this medical miracle: “I needed the connection to animals to overcome not only the physical difficulties but also my depressed mental state.” In the wake of his injury, he gave up his studies of animal behavior at Oranim College, but not his connection with four-legged and flying creatures. The injury also caused him to forget the foreign languages he once knew, English and Arabic (he is now learning them anew), but not his mother tongue, Hebrew, nor the nonverbal language in which he communicates with animals.
Yosef lives in Rehovot with his wife, Stavit, who works in the Hebrew University’s Faculty of Agriculture. They have two sons: Omer, 12, and Segev, 10.
Dr. Rony Paz’s research is supported by the Sylvia Schaefer Alzheimer’s Research Fund; the Ruth and Herman Albert Scholars Program for New Scientists; Pascal and Ilana Mantoux, Israel/France; Katy and Gary Leff, Calabasas, CA; and the European Research Council. Dr. Paz is the incumbent of the Beracha Foundation Career Development Chair.