Abstract
Autism spectrum disorder (ASD) is a developmental disorder that affects more than 1.6% of children aged 8 across the United States. It is characterized by impairments in social interaction and communication, as well as by a restricted repertoire of activities and interests. The current standardized clinical diagnosis of ASD remains subjective, relying mainly on behavior-based tests. Moreover, the diagnostic process is not only time-consuming but also costly, placing a tremendous financial burden on patients' families. Automated diagnosis approaches are therefore an attractive route to earlier identification of ASD. In this work, we set out to develop a deep learning model for automated diagnosis of ASD. Specifically, we propose a multichannel deep attention neural network (DANN) that integrates multiple layers of neural networks, an attention mechanism, and feature fusion to capture the interrelationships in multimodal data. We evaluated the proposed multichannel DANN model on the Autism Brain Imaging Data Exchange (ABIDE) repository with 809 subjects (408 ASD patients and 401 typically developing controls). By integrating three scales of brain functional connectomes and personal characteristic data, our model achieved a state-of-the-art accuracy of 0.732 on ASD classification, outperforming multiple peer machine learning models in a k-fold cross-validation experiment. Additional k-fold and leave-one-site-out cross-validation experiments were conducted to test the generalizability and robustness of the proposed multichannel DANN model. The results show the promise of deep learning models for aiding future automated clinical diagnosis of ASD.
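The abstract does not spell out the network configuration, so the following is only a minimal, hypothetical PyTorch sketch of the general idea of a multichannel network with attention-weighted feature fusion. The number of channels, layer sizes, feature dimensions, and all class and variable names here are illustrative assumptions, not the authors' actual architecture.

```python
# Minimal sketch (not the authors' implementation) of a multichannel
# attention-based fusion classifier. All dimensions are placeholders.
import torch
import torch.nn as nn


class ChannelEncoder(nn.Module):
    """Encodes one input modality (e.g., one connectome scale) into a fixed-size embedding."""
    def __init__(self, in_dim, hidden_dim=256, embed_dim=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, hidden_dim),
            nn.ReLU(),
            nn.Dropout(0.3),
            nn.Linear(hidden_dim, embed_dim),
            nn.ReLU(),
        )

    def forward(self, x):
        return self.net(x)


class MultichannelAttentionNet(nn.Module):
    """Fuses per-channel embeddings with learned attention weights before classification."""
    def __init__(self, channel_dims, embed_dim=64):
        super().__init__()
        self.encoders = nn.ModuleList(
            [ChannelEncoder(d, embed_dim=embed_dim) for d in channel_dims]
        )
        # One attention score per channel, computed from its embedding.
        self.attn = nn.Linear(embed_dim, 1)
        self.classifier = nn.Sequential(
            nn.Linear(embed_dim, 32),
            nn.ReLU(),
            nn.Linear(32, 2),  # ASD vs. typically developing control
        )

    def forward(self, inputs):
        # inputs: list of tensors, one per channel, each of shape (batch, channel_dim)
        embeddings = torch.stack(
            [enc(x) for enc, x in zip(self.encoders, inputs)], dim=1
        )  # (batch, n_channels, embed_dim)
        weights = torch.softmax(self.attn(embeddings), dim=1)  # (batch, n_channels, 1)
        fused = (weights * embeddings).sum(dim=1)  # attention-weighted feature fusion
        return self.classifier(fused)


# Example: three connectome scales plus personal characteristic data
# (feature dimensions below are arbitrary placeholders).
channel_dims = [5000, 20000, 80000, 6]
model = MultichannelAttentionNet(channel_dims)
batch = [torch.randn(8, d) for d in channel_dims]
logits = model(batch)  # shape: (8, 2)
```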
| Original language | English (US) |
| --- | --- |
| Pages (from-to) | 1-9 |
| Number of pages | 9 |
| Journal | Complexity |
| Volume | 2020 |
| DOIs | |
| State | Published - Jan 31 2020 |
Bibliographical note
KAUST Repository Item: Exported on 2020-10-01. Acknowledgements: This work was supported in part by the Beijing Education Commission Research Project of China under grant no. KM201911232004, the National Natural Science Foundation of China under grant no. 61672105, and the National Key Research and Development Program of China under grant no. 2018YFB1004100.