Georgia Tech's Child Study Lab Sees Computer Science as New 'Microscope' for Autism Research

Autism and Computing Research at Georgia Tech

What if behavior could be mapped and analyzed in much the same way an MRI provides images of the brain or a microscope provides an up-close look at cells? Both were paradigm shifts in detecting developmental anomalies and diseases like cancer, and Georgia Tech research at the intersection of computing and early childhood behavior could do the same for autism.

Building upon nearly a decade of research, Georgia Tech’s Child Study Lab, established in 2010 by a $10 million grant from the National Science Foundation, and collaborators at Weill Cornell Medical College were awarded a $1.7 million grant from the National Institutes of Health last year. The grant will help researchers collect new data and, together with the datasets created over the past decade, develop automated tools that more accurately and efficiently characterize behaviors that are present and important in typical child development but whose absence is often considered a core, early-emerging marker of autism spectrum disorder (ASD).

[RELATED: Using Computer Science to Augment Autism Research at Georgia Tech (VIDEO)]

Psychologists have long understood that there are links between early childhood development and the likelihood of typical language and behavior outcomes throughout life. What they have not been able to do, however, is study childhood behavior at a level of granularity comparable to what a microscope offers. Given the importance of early detection in informing proper interventions, the tedium of human coding and analysis poses a significant challenge.

“That process is manual and driven by humans specifying what happens in a frame of a video,” said Jim Rehg, a professor in the School of Interactive Computing and the principal investigator on the NIH award. “It takes hours upon hours of data collection and analysis.”

Computing could alter that reality, and the work being done at Georgia Tech is a significant reason why.

“Given enough video, we can model the details of behavior,” Rehg said. “Deep learning, married with the ability to collect the data, allows us to build out how our algorithms work in much the same way computer science has been applied to genetics and imaging to make those more powerful and scalable.”
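To make the contrast with manual frame-by-frame coding concrete, here is a minimal sketch of one step an automated pipeline might include: once a model has assigned a label to each video frame, those labels can be collapsed into timed behavior episodes, with very short runs discarded as likely noise. The function name, label values, and thresholds below are illustrative assumptions, not the lab's actual software.

```python
def frames_to_episodes(frame_labels, fps=30, min_frames=5):
    """Collapse per-frame behavior labels into (label, start_s, end_s)
    episodes, dropping unlabeled frames (None) and runs shorter than
    min_frames, which are likely classifier noise."""
    episodes = []
    start = 0
    for i in range(1, len(frame_labels) + 1):
        # A run ends at the last frame or when the label changes.
        if i == len(frame_labels) or frame_labels[i] != frame_labels[start]:
            if frame_labels[start] is not None and i - start >= min_frames:
                episodes.append((frame_labels[start], start / fps, i / fps))
            start = i
    return episodes

# Ten frames of "gaze", a noisy gap, a too-short run, then "point":
labels = ["gaze"] * 10 + [None] * 3 + ["gaze"] * 2 + ["point"] * 6
print(frames_to_episodes(labels))
```

The point of the sketch is only that hours of frame labels reduce to a short, human-readable list of episodes, which is the kind of summary a clinician could actually review.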

That has long been the mission of the Child Study Lab, and the latest grant will continue to move the needle forward in autism research at Georgia Tech and beyond. Unlike many other conditions, autism spectrum disorder can’t be detected with a blood test or brain imaging. Doctors must analyze behavior through developmental screenings and comprehensive diagnostic evaluations.

During screenings, doctors might talk or play with a child to see how they learn, speak, or behave. Do they exhibit typical communicative skills like joint attention, in which two people use gestures or gaze to share their attention with respect to other objects or events? The skills a child demonstrates in these areas are known to be strong indicators of how they will develop throughout childhood and adolescence.
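As a toy illustration of how a behavior like joint attention could be checked computationally, consider a simplified 2D version: estimate each person's gaze direction, then test whether both gaze rays point at the same object within an angular tolerance. The geometry, function names, and tolerance here are hypothetical simplifications, not the lab's method.

```python
import math

def looking_at(person_xy, gaze_dir, object_xy, tol_deg=15.0):
    """True if the gaze ray from person_xy points at object_xy
    within tol_deg degrees (2D toy geometry)."""
    ox, oy = object_xy[0] - person_xy[0], object_xy[1] - person_xy[1]
    gx, gy = gaze_dir
    norm = math.hypot(ox, oy) * math.hypot(gx, gy)
    if norm == 0:
        return False
    cos_a = max(-1.0, min(1.0, (ox * gx + oy * gy) / norm))
    return math.degrees(math.acos(cos_a)) <= tol_deg

def joint_attention(child, adult, obj, tol_deg=15.0):
    """True if both people (each a (position, gaze_dir) pair)
    are looking at the same object."""
    return (looking_at(child[0], child[1], obj, tol_deg)
            and looking_at(adult[0], adult[1], obj, tol_deg))

# Child at the origin looking right, adult above the toy looking down:
print(joint_attention(((0, 0), (1, 0)), ((2, 2), (0, -1)), (2, 0)))
```

In practice, of course, estimating gaze from video is the hard part; the sketch only shows how, once gaze is estimated, a shared focus of attention becomes a checkable geometric condition.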

The challenge is that, given how important it is to detect ASD at an early age so that interventions and education can be tailored to the child’s specific needs, the manual labor involved in these screenings and evaluations makes ASD detection far less efficient than detection of other developmental challenges. Autism spectrum disorder affects one in 59 children in the United States alone, and not every child who is screened ultimately receives a diagnosis.

The need for objective, automated measurements of behavior is clear, and Rehg, along with IC Research Scientist Agata Rozga, Child Study Lab coordinator Audrey Southerland, collaborators at Weill Cornell, and others, is taking steps in that direction.

“For us, the goal is to use these computational capabilities to extract the important key moments and information to give clinicians or psychologists the ability to more easily examine a child’s behavior,” Southerland said. “If we can provide additional details through technology about the quality or coordination of important social and communicative behaviors, we can hopefully provide behavioral experts with the capability of exploring these behaviors in much greater detail than currently possible.”

The first grant from the NSF funded the creation of the Child Study Lab, which has over the years developed an extensive dataset of behaviors in typically developing children. At the time, it was the first large-scale investment in technology to assist in modeling and sensing the behaviors that underlie developmental conditions like autism spectrum disorder. Additional grants have supported studies that use computer vision to detect and measure gaze shifts, or wearable technology and machine learning to detect and differentiate between types of problem behaviors.
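A gaze-shift measure can be sketched in a few lines under a strong simplifying assumption: that a vision system has already produced an estimated gaze angle for each frame. A shift then reduces to a frame-to-frame change larger than some threshold; the threshold value below is an arbitrary placeholder, not a number from the research.

```python
def count_gaze_shifts(gaze_angles_deg, threshold_deg=20.0):
    """Count frames where the estimated gaze angle jumps by more than
    threshold_deg relative to the previous frame."""
    return sum(
        1 for prev, cur in zip(gaze_angles_deg, gaze_angles_deg[1:])
        if abs(cur - prev) > threshold_deg
    )

# Two large jumps (2 -> 30 and 31 -> 60) among small tracking jitter:
print(count_gaze_shifts([0, 1, 2, 30, 31, 60]))
```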

The NIH grant brings all of this past research together to compare what the sensor data says against human coding, and to determine how that comparison might ultimately serve to develop reliable, objective, automated tools for measuring early, nonverbal communication behaviors.

“The important thing is for us to make sure that whatever we produce is good enough so that we can actually push it out into the field to people who are specializing in this area,” Southerland said. “We never want to get rid of the human expert in this field, but we want to build technology they can use to augment and streamline their analysis of behavior.”

In addition to the National Institutes of Health and the National Science Foundation, the Child Study Lab has also received funding from the Simons Foundation and has partnered with external entities like the Marcus Autism Center.

Southerland and the Child Study Lab are actively seeking families with young children to participate in this study to further develop their automated tools. Anyone interested in playing a part in this exciting work can visit the lab’s website to learn more.