
The Revolution Will Be… Digitized

 
 

In the three years since its inception, the Technology and Innovation Lab at the Center for Autism Research (CAR) has developed tools with the potential to revolutionize the way psychologists assess a child for Autism Spectrum Disorder (ASD), develop new therapies, and predict and measure the effectiveness of a behavioral therapy or medication. These developments required CAR to assemble its very own Ocean’s Eleven-style team of highly skilled clinicians and scientists from a variety of disciplines. “CAR’s Technology and Innovation Lab marries a unique blend of traditional clinical psychology research with state-of-the-art engineering, technology, and machine learning to innovate new approaches to longstanding questions about ASD and other related conditions,” says CAR’s Director, Robert Schultz, PhD. It’s this blend that provides the foundation for CAR’s ultimate goal: developing a digital phenotype of ASD.

 

Measuring Behavior: The Digital Revolution

 

CAR computer engineer Keith Bartley, M.S., developed CAR’s biometric sensor camera, nicknamed the “SensorTree,” enabling researchers to record, digitize, and measure outward expressions of human behavior, including gesture, eye movement, tone of voice, and fleeting facial movements. Early study results showed the SensorTree was able to diagnose ASD in study participants with almost 90% accuracy over the course of a 3-minute conversation, even without incorporating speech. By comparison, highly trained ASD experts viewing the same conversations achieved 82% accuracy, suggesting that machine learning can outperform even expert raters on some perceptual tasks.
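The article does not describe the SensorTree’s classification pipeline in detail, but at a high level a result like this comes from training a machine-learning classifier on behavioral features extracted from the recordings. Below is a minimal, hypothetical sketch using scikit-learn; the feature set, sample size, and random data are illustrative stand-ins, not CAR’s actual model or results.

```python
# Hedged sketch of the kind of classification pipeline the article describes:
# behavioral features from a short conversation are fed to an ML classifier.
# All feature names, labels, and data below are hypothetical.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Hypothetical feature matrix: one row per participant, columns standing in
# for measures such as head-motion energy, gaze-shift rate, and statistics
# of facial movement over the 3-minute conversation.
n_participants, n_features = 120, 24
X = rng.normal(size=(n_participants, n_features))
y = rng.integers(0, 2, size=n_participants)  # 1 = ASD, 0 = non-ASD (toy labels)

clf = make_pipeline(StandardScaler(),
                    RandomForestClassifier(n_estimators=200, random_state=0))

# Cross-validated accuracy: the standard way to estimate how a classifier
# like this would perform on participants it has never seen.
scores = cross_val_score(clf, X, y, cv=5, scoring="accuracy")
print(f"mean CV accuracy: {scores.mean():.2f}")
```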

 

The SensorTree can record and analyze everything that a human clinician can observe, and can do it more accurately than a highly trained expert. This ability to seamlessly turn behaviors into data will transform the team’s ability to do research. However, cautions Bartley, “The SensorTree is not meant to replace clinicians in the diagnosis or assessment of autism, but to be a diagnostic aid.” For example, it can triage families waiting to see an expert clinician by matching clinician specialization to a more accurate understanding of the presenting problem. It can also start the clinical assessment process so that the evaluation done in the clinic is briefer, saving both the family and the doctor valuable time without impacting the quality of care. “The SensorTree might also be used in a telemedicine context, fostering the ability of clinicians to diagnose ASD in children who are unable to meet with an expert clinician due to geographical or transportation constraints,” notes Bartley.

 

Casey Zampella, PhD, a postdoctoral fellow at CAR studying social movement coordination, invites participants with ASD to engage in an unscripted, natural conversation with a research staff member. She then analyzes the video footage of the conversation using complex algorithms created by fellow Technology and Innovation Lab team members Birkan Tunç, PhD, and Evangelos Sariyanidi, PhD, in order to uncover the subtle differences in the give-and-take of nonverbal communication during a conversation.
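The article doesn’t publish those algorithms, but a common way to quantify this kind of interpersonal give-and-take is windowed cross-correlation between the two partners’ movement time series: how strongly, and at what lead or lag, one person’s motion tracks the other’s. The sketch below is a simplified, hypothetical version of that idea, assuming per-frame motion magnitudes have already been extracted from the video.

```python
# Hedged sketch: windowed cross-correlation between two motion time series
# (e.g., per-frame head-motion magnitude for each conversation partner) as
# one plausible coordination measure. Not CAR's actual algorithm.
import numpy as np

def windowed_coordination(a, b, win=90, max_lag=15):
    """Mean peak cross-correlation between series a and b over sliding windows.

    win: window length in frames (90 frames = 3 s at 30 fps).
    max_lag: maximum lead/lag, in frames, searched on each side.
    """
    peaks = []
    for start in range(max_lag, len(a) - win - max_lag, win):
        wa = a[start:start + win]
        corrs = []
        for lag in range(-max_lag, max_lag + 1):
            wb = b[start + lag:start + lag + win]
            if wa.std() > 0 and wb.std() > 0:
                corrs.append(np.corrcoef(wa, wb)[0, 1])
        if corrs:
            peaks.append(max(corrs))
    return float(np.mean(peaks))

# Toy example: partner B loosely follows partner A with a short delay.
rng = np.random.default_rng(1)
a = rng.normal(size=2000).cumsum()
b = np.roll(a, 10) + rng.normal(scale=5.0, size=2000)
print(f"coordination score: {windowed_coordination(a, b):.2f}")
```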

 

“Autism is fundamentally a disorder of social interaction; however, most research to date has focused exclusively on the behaviors of individuals with ASD, rather than on bidirectional social processes,” says Dr. Zampella. “The novelty of our approach is its focus on behaviors that naturally unfold between social partners as they interact, allowing us to capture much more information about the core nature of ASD.”

 

In addition to tracking conversational behavior, researchers in CAR’s Technology and Innovation Lab are tracking other biometric data from these social interactions. CAR scientists Elizabeth S. Kim, PhD, and John Herrington, PhD, use Bluetooth technology to wirelessly pair the SensorTree with accelerometers to measure gesture and movement in three dimensions. They can also link to heart rate monitors, and will soon be able to link to other devices, such as thermometers and even EEG sensors, in order to collect a more complete digital picture of what is happening physiologically during social interactions. Combining this data allows the team to measure nuanced differences in social processes between people with and without ASD.
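A practical step implied here is aligning streams that arrive at different rates (video frames, accelerometer samples, heart-rate readings) onto one timeline before they can be combined. The sketch below shows one conventional way to do that with pandas; the device streams, sampling rates, and tolerance are hypothetical, not CAR’s actual configuration.

```python
# Hedged sketch: aligning a fast wearable stream (accelerometer) with a slow
# one (heart rate) onto a common timeline, so each moment has a multimodal
# record. All devices, rates, and data here are illustrative.
import numpy as np
import pandas as pd

rng = np.random.default_rng(2)

# Accelerometer stream at ~50 Hz: timestamp plus x/y/z acceleration.
acc = pd.DataFrame({
    "time": np.arange(0, 10, 0.02),
    "acc_x": rng.normal(size=500),
    "acc_y": rng.normal(size=500),
    "acc_z": rng.normal(size=500),
})

# Heart-rate stream at ~1 Hz.
hr = pd.DataFrame({"time": np.arange(0, 10, 1.0),
                   "bpm": 70 + rng.normal(scale=3, size=10)})

# Match each accelerometer sample to the nearest heart-rate reading,
# within a half-second tolerance.
merged = pd.merge_asof(acc.sort_values("time"), hr.sort_values("time"),
                       on="time", direction="nearest", tolerance=0.5)
print(merged.head())
```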

 

“Wearable measures of movement behavior are a nice complement to computer vision-based measures of movement, because wearables can collect data in almost any situation, even those where it may be impractical to use cameras,” explains Dr. Kim. “They give us the opportunity to collect finer-grained data and more of it. This means we may be able to tease apart which changes in behavior are due to a person’s ‘state’ (hunger, satiety, fatigue, alertness, boredom, excitement, etc.) and which behavior changes may be related to broader traits that are more stable in each person.” Being able to tell those apart, says Dr. Kim, will improve our ability to specify what is different between autism, other psychiatric or developmental conditions, and typical development.
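One simple way to make that state-versus-trait distinction concrete: with repeated measurements of the same behavior, a person’s average across sessions approximates the stable, trait-like component, while session-to-session deviations reflect the more transient, state-like component. The sketch below illustrates this on simulated data; it is a toy decomposition, not the lab’s analysis.

```python
# Hedged sketch of the state-vs-trait idea: split a repeatedly measured
# behavior into a stable per-person component (trait-like) and
# session-to-session deviations (state-like). Data are simulated.
import numpy as np
import pandas as pd

rng = np.random.default_rng(3)
people = np.repeat(np.arange(8), 12)                   # 8 people, 12 sessions each
trait = np.repeat(rng.normal(scale=2.0, size=8), 12)   # stable individual level
state = rng.normal(scale=1.0, size=people.size)        # session-specific variation

df = pd.DataFrame({"person": people, "movement_score": trait + state})

# Trait-like component: each person's mean across sessions.
df["trait_est"] = df.groupby("person")["movement_score"].transform("mean")
# State-like component: the residual within each session.
df["state_est"] = df["movement_score"] - df["trait_est"]

print(df.groupby("person")[["trait_est"]].first().round(2))
```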

 

“We believe that the deeper level of understanding that we can gain from these new tools will help us better recognize individual differences in social communication across people,” says Dr. Zampella. As for the future of the team’s work, she says, “Understanding these communication differences will help us to better match each person's needs with specific treatment approaches. By getting away from a ‘one size fits all’ approach to treatment, we should be able to improve outcomes for people with ASD, including their success in their everyday social interactions in the community, at work, and within their family."

 

Mining for Meaning

 

A common challenge facing many fields of research is that scientists’ ability to capture and collect data has in many ways outpaced their ability to store, transfer, sort, and analyze it in a way that is both accurate and meaningful.

 

Exhibit A: Having refined the technology to rapidly collect volumes of highly detailed behavioral and physiological data, CAR’s team was faced with the challenge of devising methods to mine those data, to turn raw data into meaningful information. Enter Birkan Tunç, PhD, and Evangelos Sariyanidi, PhD, two researchers whose combined expertise formed the key to unlocking the potential of the SensorTree. Both possess rare expertise in applying machine learning and advanced computational statistics to human behavior and human development.

 

“Machine learning and computer vision are the backbone of CAR’s Technology and Innovation Lab because the algorithms produced allow researchers to predict how a child with autism may progress through development with a given set of symptoms,” explains Dr. Tunç, a computational scientist at CAR and research assistant professor in the Department of Psychiatry at the University of Pennsylvania.

 

CAR researchers are collecting social-behavioral data from a wide variety of people across the autism spectrum. With enough data, researchers will be able to leverage these measurements in order to anticipate an individual’s social development and to develop more effective social support tools and interventions.

 

“Because Children’s Hospital of Philadelphia (CHOP) provides care for such large numbers of youth with ASD, we are in a unique position to learn not only from each individual, but also from the larger group,” explains Dr. Schultz.

 

So, how does CAR’s team measure social communication, exactly? Dr. Sariyanidi, a postdoctoral fellow at CAR, is a computational scientist who develops algorithms that can quantify non-emotive facial expressions as well as emotional expressions. He’s using the SensorTree’s computer vision technology to study how a diagnosis of autism affects facial expressions. “To understand autism, and emotions in general, it is critical that we quantify and distinguish between expression and emotion if we hope to gain a deeper understanding of how autism affects individuals,” he says of his facial expression research.
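As an illustration of what “quantifying” an expression can mean in computer-vision terms: once facial landmarks have been tracked in each video frame, their displacement from a neutral-face template gives a simple per-frame expression-intensity signal. The sketch below is a deliberately simplified, hypothetical measure; the article does not describe Dr. Sariyanidi’s actual representation.

```python
# Hedged sketch: quantifying facial expression intensity, assuming a
# computer-vision front end (not shown) has already extracted 2-D facial
# landmarks per video frame. The feature is illustrative only.
import numpy as np

def expression_features(landmarks, neutral):
    """Per-frame displacement of each landmark from a neutral-face template.

    landmarks: (n_frames, n_points, 2) array of landmark coordinates.
    neutral:   (n_points, 2) template of the same landmarks at rest.
    Returns per-frame mean displacement, a crude expression-intensity signal.
    """
    disp = np.linalg.norm(landmarks - neutral, axis=2)  # (n_frames, n_points)
    return disp.mean(axis=1)

# Toy data: 300 frames of 68 landmarks jittering around a neutral template.
rng = np.random.default_rng(4)
neutral = rng.uniform(0, 1, size=(68, 2))
frames = neutral + rng.normal(scale=0.01, size=(300, 68, 2))
intensity = expression_features(frames, neutral)
print(f"peak expression intensity: {intensity.max():.3f}")
```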

 

This is not CAR’s first foray into the quantification of facial expressions. Nearly twelve years ago, long before the Technology and Innovation Lab existed, Dr. John Herrington used electromyography (EMG) to investigate how individuals with ASD understand and make facial expressions. “At the time, the analysis of the data was very complicated and subject to numerous limitations; however, the advances made through machine learning and computer vision provide a solution,” says Herrington.

 

While CAR’s SensorTree revolutionized video recording, video collection is only part of the lab’s efforts to digitize social communication. Julia Parish-Morris, PhD, director of CAR’s Quantitative Linguistics Lab, is analyzing speech samples collected during the same social interactions that provide the computer vision scientists with videos for analysis. Dr. Parish-Morris’ research focuses on the language individuals with ASD use, as well as the acoustic characteristics that make up a person’s vocal signature. Language consists not only of words, but of any type of vocalization an individual makes, including babbling, cooing, or crying. “Not all children with ASD have fluid language, and we want to better understand all of the vocal communications made by children with ASD. We have actually found that babies who are later diagnosed with ASD cry and babble differently,” says Dr. Parish-Morris.
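To make “acoustic characteristics” concrete: a typical starting point in speech analysis is extracting a pitch contour and spectral features such as MFCCs from each recording. The sketch below uses the open-source librosa library on a synthetic tone standing in for a speech sample; it is a generic illustration, not the Quantitative Linguistics Lab’s actual pipeline.

```python
# Hedged sketch: extracting simple acoustic features (pitch contour, MFCCs)
# of the kind that contribute to a "vocal signature." The input signal is a
# synthetic sweep, not real speech data.
import numpy as np
import librosa

sr = 22050
# Toy "vocalization": a 2-second tone sweep standing in for a speech sample.
t = np.linspace(0, 2.0, 2 * sr, endpoint=False)
y = (0.3 * np.sin(2 * np.pi * (180 + 40 * t) * t)).astype(np.float32)

# Fundamental-frequency (pitch) contour via probabilistic YIN.
f0, voiced_flag, _ = librosa.pyin(y, fmin=80, fmax=400, sr=sr)

# Mel-frequency cepstral coefficients, a standard description of vocal timbre.
mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)

print(f"median pitch: {np.nanmedian(f0):.1f} Hz, MFCC shape: {mfcc.shape}")
```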

 

One goal of Dr. Parish-Morris’ research is to determine how verbal interactions change over developmental time, how these interactions differ in individuals with autism, and what kinds of interventions can help individuals reach their maximum potential. “Ultimately, we hope to correlate linguistic markers with social and behavioral presentations of autism, as well as with biological data, such as genetics and brain imaging,” she says. The goal is to develop a “digital phenotype” of ASD that can map the intricate links between symptoms and their underlying brain and genetic mechanisms. This will open the door for treatments designed to target these mechanisms at an individual level.

 

Learn how CAR utilized video technology to revolutionize behavioral assessments.