Why Words Matter: CHOP Researchers Study Verbal Patterns in Children With Autism
By Emily Shafer
You can learn subtle things about people from the way they talk, and researchers at Children’s Hospital of Philadelphia are working to show that the speech of children with autism carries similarly revealing patterns.
Julia Parish-Morris, PhD, a principal investigator at CHOP’s Center for Autism Research and an Assistant Professor of Psychology in Psychiatry at the University of Pennsylvania’s Perelman School of Medicine, will be using computational linguistics and machine learning to analyze verbal patterns among children with autism. A new grant from the National Institute on Deafness and Other Communication Disorders provides funding support for the project.
“If you have ever met a child with autism, you will quickly realize that they have unique verbal and nonverbal communication patterns,” Dr. Parish-Morris said. “These differences may include unusual word choice, atypical syntax, or differences in the use of pitch to express ideas or put emotions into words. These features are subtle and can be hard to pick up with the human ear.”
Dr. Parish-Morris will collect data using the “SensorTree,” a biometric camera designed to measure social interaction. The “Digitizing Human Vocal Interaction to Understand and Diagnose Autism” study will include about 750 children over five years: 250 with autism, 250 with other psychiatric conditions such as depression, anxiety, and attention-deficit hyperactivity disorder, and 250 typically developing children. The children will have multiple five-minute conversations, which the SensorTree will record.
Next, the study team will use machine-learning techniques to analyze the verbal features the children produce. The data could help with differential diagnosis for children with various psychiatric conditions whose phenotypes may resemble that of autism, Dr. Parish-Morris said. More importantly, a deeper understanding of social communication in ASD could form the foundation for developing personalized supports to help children achieve their communicative goals.
A separate computer vision study is underway using the SensorTree to analyze the nonverbal behaviors of children with autism, such as facial expressions and eye contact. Dr. Parish-Morris’ study complements that work, and her team expects to combine the data from both studies to understand what happens as social interaction unfolds across various clinical groups.
“Ultimately, we’d like to use machine learning techniques to not only predict whether a person has autism, but also to understand more about each person’s unique patterns of communication,” Dr. Parish-Morris said. “Autism is very heterogeneous, and not all kids need the same supports. Using computational linguistics, computer vision, and machine learning, we could apply these findings clinically to develop personalized supports for children and their families.”
In addition to this study, Dr. Parish-Morris is the principal investigator on another study, funded by Hoffmann-La Roche, in which she and her team are focused on identifying vocal biomarkers of autism, such as differences in prosody and enunciation. That study will include 48 children, of whom 24 have autism and 24 do not. The team will analyze seven voice samples for each child, with the goal of identifying voice features that distinguish children with autism from those without.