Duke researchers are working to create an app that can bring the trained eye of a psychiatrist to an iPad camera lens.

The app will screen children for autism spectrum disorder and is being developed by a research team led by Guillermo Sapiro, professor of electrical and computer engineering, computer science and biomedical engineering. By recording and processing facial movements of participants using an iPad, the program can identify whether a user shows characteristics of ASD and needs further testing. Sapiro noted, however, that the system cannot diagnose ASD or provide treatment options.

“This is not a diagnostic tool, but a screening tool to help people get access faster or encourage them to seek access,” Sapiro said. “It’s similar to how kids go to school and get eyesight screening. [School nurses] might tell you to go see a doctor, but they don’t give you the glasses.”

He explained that the app first displays a series of short videos meant to evoke emotional responses. Based on the user’s facial movements in response to the videos, the app suggests whether the child requires further observation.

The team’s ultimate goal is to release an app that is easily accessible to schools and parents, instead of only to pediatric clinics with specialized resources, said graduate student Jordan Hashemi. For now, however, he said the team cannot estimate when the application will be available in the App Store.

Although the current work is being done at Duke, the project began at the University of Minnesota, where Sapiro led a computer vision research team. In a clinical setting at the University of Minnesota Medical Center, he recorded children’s reactions to videos using a GoPro camera. Hashemi—then a master’s student under Sapiro—implemented a set of preliminary algorithms that analyzed the GoPro images to identify abnormalities in the children’s emotional responses.

In 2012, Sapiro moved to Duke and started a computer vision lab as part of the Information Initiative at Duke. He has been collaborating on the project with a team of psychiatrists.

Hashemi said the team is currently testing the system’s various components in different environments. The Sapiro Lab has already begun to explore new uses for its image analysis algorithms, however. Of particular interest is a project experimenting with analyzing the facial movements of car drivers, Hashemi said.

The project has not been without its challenges, Hashemi said. Translating the extensive experience of medical doctors into software was a key difficulty for the team. But Sapiro noted that weekly meetings allowed each team member to gain a strong, mutual understanding of the framework behind the app.

“We trust each other’s expertise. We might have a lot of crazy ideas, but sometimes they are not implementable in a clinical environment,” he said. “The team has been outstanding in helping each other understand our abilities and our constraints.”