The perceptual world of a person with autism spectrum disorder (ASD) is unique. Beginning in infancy, people who have ASD observe and interpret images and social cues differently than others do. Caltech researchers now have new insight into just how this occurs, and their findings may eventually help doctors diagnose, and more effectively treat, the various forms of the disorder. The work is detailed in a study published in the October 22 issue of the journal Neuron.
Symptoms of ASD include impaired social interaction, compromised communication skills, restricted interests, and repetitive behaviors. Research suggests that some of these behaviors are influenced by how an individual with ASD senses, attends to, and perceives the world.
The new study investigated how visual input is interpreted in the brain of someone with ASD. In particular, it examined the validity of long-standing assumptions about the condition, including the belief that those with ASD often miss facial cues, contributing to their inability to respond appropriately in social situations.
"Among other findings, our work shows that the story is not as simple as saying 'people with ASD don't look normally at faces.' They don't look at most things in a typical way," says Ralph Adolphs, the Bren Professor of Psychology and Neuroscience and professor of biology, in whose lab the study was done. Indeed, the researchers found that people with ASD attend more to nonsocial images, to simple edges and patterns in those images, than to the faces of people.
To reach these determinations, Adolphs and his lab teamed up with Qi Zhao, an assistant professor of electrical and computer engineering at the National University of Singapore and the senior author on the paper, who had developed a detailed, model-based method for analyzing visual attention. The researchers showed 700 images to 39 subjects. Twenty of the subjects were high-functioning individuals with ASD, and 19 were control, or "neurotypical," subjects without ASD. The two groups were matched for age, race, gender, educational level, and IQ. Each subject viewed each image for three seconds while an eye-tracking device recorded the pattern of their attention across the objects depicted in the images.
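As a concrete illustration of this kind of analysis, here is a minimal Python sketch, not the authors' code, of how viewing time on a region of interest such as a face might be computed from eye-tracker output. The fixation format of (x, y, duration) tuples and all function names are assumptions made for this example.

from statistics import mean

def time_on_roi(fixations, roi):
    """Total fixation duration (seconds) landing inside one bounding box."""
    x0, y0, x1, y1 = roi
    return sum(d for (x, y, d) in fixations if x0 <= x <= x1 and y0 <= y <= y1)

def fraction_on_roi(fixations, roi):
    """Share of a trial's total viewing time spent on the region."""
    total = sum(d for (_, _, d) in fixations)
    return time_on_roi(fixations, roi) / total if total else 0.0

def group_mean(trials):
    # Hypothetical usage: each trial is a (fixations, face_roi) pair,
    # one per image viewed; averaging per group allows an ASD-versus-control
    # comparison of attention to faces.
    return mean(fraction_on_roi(f, roi) for f, roi in trials)

Computed separately for the ASD and control groups, a statistic like this would capture the kind of group difference in attention to faces that the study describes.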
Unlike the abstract representations of single objects or faces that have been commonly used in such studies, the images that Adolphs and his team presented contained combinations of more than 5,500 real-world elements—common objects like people, trees, and furniture as well as less common items like knives and flames—in natural settings, mimicking the scenes that a person might observe in day-to-day life.
"Complex images of natural scenes were a big part of this unique approach," says first-author Shuo Wang (PhD '14), a postdoctoral fellow at Caltech. The images were shown to subjects in a rich semantic context, "which simply means showing a scene that makes sense," he explains. "I could make an equally complex scene with Photoshop by combining some random objects such as a beach ball, a hamburger, a Frisbee, a forest, and a plane, but that grouping of objects doesn't have a meaning—there is no story demonstrated. Having objects that are related in a natural way and that show something meaningful provides the semantic context. It is a real-world approach."
In addition to validating previous studies that showed, for example, that individuals with ASD are less drawn to faces than control subjects, the new study found that subjects with ASD were strongly attracted to the center of images, regardless of the content placed there. Similarly, they tended to focus their gaze on objects that stood out—for example, due to differences in color and contrast—rather than on faces. Take, for example, one image from the study showing two people talking, one facing the camera and the other facing away so that only the back of the head is visible. Control subjects concentrated on the visible face, whereas ASD subjects attended equally to the face and the back of the other person's head.
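The center-of-image tendency can likewise be expressed as a simple statistic. The sketch below, again hypothetical rather than the study's analysis code, computes a duration-weighted mean distance of fixations from the image center, normalized by the image half-diagonal so that 0 means all gaze at the exact center and 1 means gaze at a corner; lower values indicate a stronger center bias.

import math

def center_bias(fixations, width, height):
    """Duration-weighted mean fixation distance from the image center,
    normalized to the range [0, 1] by the image half-diagonal."""
    cx, cy = width / 2, height / 2
    half_diag = math.hypot(cx, cy)  # distance from center to a corner
    total = sum(d for (_, _, d) in fixations)
    if total == 0:
        return 0.0  # no fixations recorded for this trial
    weighted = sum(d * math.hypot(x - cx, y - cy) for (x, y, d) in fixations)
    return (weighted / total) / half_diag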
"The study is probably most useful for informing diagnosis," Adolphs says. "Autism is many things. Our study is one initial step in trying to discover what kinds of different autisms there actually are. The next step is to see if all people with ASD show the kind of pattern we found. There are probably differences between individual people with ASD, and those differences could relate to differences in diagnosis, for instance, revealing subtypes of autism. Once we have identified those subtypes, we can begin to ask if different kinds of treatment might be best for each kind of subtype."
Adolphs plans to continue this line of research using functional magnetic resonance imaging scans to track the brain activity of people with ASD while they view images in laboratory settings similar to those used in this study.
The paper, "Atypical Visual Saliency in Autism Spectrum Disorder Quantified through Model-Based Eye Tracking," was coauthored by Shuo Wang and Ralph Adolphs at Caltech; Ming Jiang and Qi Zhao from the National University of Singapore; Xavier Morin Duchesne and Daniel P. Kennedy of Indiana University, Bloomington; and Elizabeth A. Laugeson from UCLA.
The research was supported by a postdoctoral fellowship from the Autism Science Foundation, a Fonds de Recherche du Québec en Nature et Technologies predoctoral fellowship, a National Institutes of Health Grant and National Alliance for Research on Schizophrenia and Depression Young Investigator Grant, a grant from the National Institute of Mental Health to the Caltech Conte Center for the Neurobiology of Social Decision Making, a grant from the Simons Foundation Autism Research Initiative, and Singapore's Defense Innovative Research Program and the Singapore Ministry of Education's Academic Research Fund Tier 2.