Recent advances in neuroscience indicate that analysis of bio-signals such as resting-state electroencephalogram (EEG) and eye-tracking data can provide a more reliable evaluation of autism spectrum disorder (ASD) in children than traditional behavioral measurements based on rating scales. However, the effectiveness of these new approaches still lags behind the growing demands of clinical and educational practice, as the "biomarker" information carried by a single-modality bio-signal is likely insufficient or distorted. This study proposes an approach that jointly analyzes EEG and eye-tracking data for ASD evaluation in children. The approach focuses on deep fusion of features from the two modalities, since no explicit correlations between the original bio-signals are available, a limitation that also constrains the performance of existing methods in this direction. First, synchronization measures, information entropy, and time-frequency features are derived from the multi-channel EEG. A random forest is then applied to the eye-tracking recordings of the same subjects to select the most significant features. Finally, a graph convolutional network (GCN) model naturally fuses the two groups of features to differentiate children with ASD from typically developing (TD) subjects. Experiments were carried out on the two types of bio-signals collected from 42 children (21 ASD and 21 TD subjects, aged 3-6 years). The results indicate that (1) the proposed approach achieves an accuracy of 95% in ASD detection, and (2) strong correlations exist between the two bio-signals even when collected asynchronously; in particular, EEG synchronization correlates with face-related/joint attention in terms of covariance.
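The GCN-based fusion described above can be sketched as a single graph-convolution layer applied to a node set that combines EEG-derived and eye-tracking-derived features. This is a minimal NumPy illustration of the standard GCN propagation rule (symmetrically normalized adjacency with self-loops, followed by ReLU); the node counts, feature dimensions, graph structure, and random data here are illustrative assumptions, not the paper's actual configuration.

```python
import numpy as np

def gcn_layer(adj, feats, weight):
    """One graph-convolution step: ReLU(D^-1/2 (A+I) D^-1/2 X W)."""
    a_hat = adj + np.eye(adj.shape[0])          # add self-loops
    deg = a_hat.sum(axis=1)                      # node degrees (>= 1)
    d_inv_sqrt = np.diag(deg ** -0.5)
    a_norm = d_inv_sqrt @ a_hat @ d_inv_sqrt     # symmetric normalization
    return np.maximum(a_norm @ feats @ weight, 0.0)  # ReLU activation

# Hypothetical setup: 20 graph nodes (14 carrying EEG-derived features,
# 6 carrying selected eye-tracking features), 8 input features per node,
# projected to 4 hidden units.
rng = np.random.default_rng(0)
adj = (rng.random((20, 20)) > 0.7).astype(float)
adj = np.maximum(adj, adj.T)                     # make the graph undirected
x = np.concatenate([rng.normal(size=(14, 8)),    # EEG feature nodes
                    rng.normal(size=(6, 8))])    # eye-tracking feature nodes
w = rng.normal(size=(8, 4))
h = gcn_layer(adj, x, w)
print(h.shape)  # (20, 4)
```

In a full classifier, several such layers would be stacked and followed by a readout and a binary (ASD vs. TD) output head; the fused node features let information from both modalities propagate along the graph edges.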
Keywords: EEG; autism spectrum disorder; eye-tracking; graph convolution network; multi-modality fusion.
Copyright © 2021 Zhang, Chen, Tang and Zhang.