Project Participants


Dr. Martin Schulte-Rüther
(Principal Investigator)
University Medical Center Göttingen
martin.schulte-ruether@med.uni-goettingen.de
Martin’s research focuses on social interaction and emotional processing in typical development as well as in psychiatric and neurodevelopmental disorders, in particular autism. He employs a broad spectrum of neuroscientific and behavioral methods, including MRI, fNIRS, physiological recordings, eye-tracking, and video-based behavioral analysis. Martin studied Psychology at the Ruhr-University Bochum and received his PhD at the University of Bielefeld, working in cooperation with the Research Center Jülich on the cognitive neuroscience of empathy. He continued as a postdoctoral researcher in Child and Adolescent Psychiatry at RWTH Aachen, focusing on neuroimaging methods in children and adolescents with autism. He then headed a research group for Translational Brain Research at RWTH Aachen within the Jülich-Aachen Research Alliance. Currently, he is based at the University Medical Center Göttingen, where he is a senior researcher and group leader of the Social Interaction and Developmental Neuroscience Lab at the Department of Child and Adolescent Psychiatry in Göttingen.
Selected publications
- Schulte-Rüther M, Kulvicius T, Stroth S, Wolff N, Roessner V, Marschik PB, Kamp-Becker I, Poustka L (2022). Using machine learning to improve diagnostic assessment of ASD in the light of specific differential and co-occurring diagnoses. Journal of Child Psychology and Psychiatry. https://doi.org/10.1111/jcpp.13650
- Hartz A, Guth B, Jording M, Vogeley K, Schulte-Rüther M (2021). Temporal Behavioral Parameters of on-going Gaze Encounters in a Virtual Environment. Frontiers in Psychology. https://doi.org/10.3389/fpsyg.2021.673982
- Kruppa J, Reindl V, Gerloff C, Oberwelland E, Prinz J, Herpertz-Dahlmann B, Konrad K, Schulte-Rüther M (2020). Brain-to-Brain synchrony in children and adolescents with autism spectrum disorder in parent-child dyads. Social Cognitive and Affective Neuroscience. https://doi.org/10.1093/scan/nsaa092
- Kruppa JA, Gossen A, Großheinrich N, Oberwelland E, Cholemkery H, Freitag C, Kohls G, Fink GR, Herpertz-Dahlmann B, Konrad K, Schulte-Rüther M (2019). Social Reinforcement Learning and its Neural Modulation by Oxytocin in Autism Spectrum Disorder. Neuropsychopharmacology, 44:749-756. https://doi.org/10.1038/s41386-018-0258-7
- Oberwelland E, Schilbach L, Barisic I, Krall SC, Vogeley K, Fink GR, Herpertz-Dahlmann B, Konrad K, Schulte-Rüther M (2017). Young adolescents with autism show abnormal joint attention network: A gaze contingent fMRI study. NeuroImage Clinical, 14:112-121. https://doi.org/10.1016/j.nicl.2017.01.006
Project Description
Autism Spectrum Disorder (ASD) is a prototypic disorder for the impairment of multimodal aspects of visual and verbal communication. Observational instruments such as the Autism Diagnostic Observation Schedule (ADOS-2) assess behavioral symptoms via a structured social encounter in which an experienced clinician performs several social interactive tasks with the individual. Diagnostic decisions based on this instrument typically rely on qualitative clinical ratings of the clinician’s impression of visual communicative behavior (such as eye gaze, facial expressions, and gestures) and its integration with verbal communication; quantitative indices, however, are lacking.
In human interaction, verbal and visual channels of communication are embedded in a social reference frame combining multimodal aspects. For example, joint attention emerges from using deictic hand gestures (e.g., pointing) in coordination with facial expression and eye gaze to navigate a shared attentional space of other people and objects. Eye gaze and pointing can clarify which object or person an utterance refers to; likewise, gestures, head movement, and facial expression may visualize spatial and social relationships when talking about objects or persons that are not currently visible. Systematic research that explicitly tackles the interplay and temporal dynamics of such multimodal visual communicative behavior is scarce to date and would benefit from a fine-grained computerized assessment of dyadic interaction. Motion capture, mobile eye-tracking, and automatic facial expression analysis are established techniques in this respect and have demonstrated potential for the diagnostic assessment of disorders such as ASD. What is missing in previous research, however, is the multimodal combination of available techniques during a standardized assessment, resulting in a rich, annotated dataset.
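A prerequisite for such a combined dataset is bringing the separately recorded streams onto a common timeline. The following is a minimal sketch of this alignment step, not the project's actual pipeline: the sampling rates, column names, and synthetic data are illustrative assumptions, and pandas is used as one possible toolkit.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)

def make_stream(rate_hz, seconds, columns):
    """Synthesize a toy recording stream with its own sampling rate.

    In a real setup, each DataFrame would come from a device export
    (eye-tracker, motion capture, facial expression analysis).
    """
    t = pd.to_timedelta(np.arange(0.0, seconds, 1.0 / rate_hz), unit="s")
    data = rng.standard_normal((len(t), len(columns)))
    return pd.DataFrame(data, index=t, columns=columns)

# Hypothetical modalities at assumed, device-typical sampling rates.
gaze = make_stream(120, 10, ["gaze_x", "gaze_y"])          # mobile eye-tracking
mocap = make_stream(60, 10, ["head_yaw", "wrist_height"])  # motion capture
face = make_stream(30, 10, ["au12_smile"])                 # facial action units

# Resample everything to a common ~30 Hz timeline: for each target
# timestamp, take the nearest sample of each stream, but never one
# further away than 50 ms (larger gaps stay NaN instead of being bridged).
common = pd.timedelta_range(start="0s", end="10s", freq="33ms")
aligned = pd.concat(
    [s.reindex(common, method="nearest", tolerance=pd.Timedelta("50ms"))
     for s in (gaze, mocap, face)],
    axis=1,
)
print(aligned.head())
```

On this shared timeline, events annotated in one modality (e.g., a pointing gesture) can be related directly to concurrent behavior in the others (e.g., a gaze shift of the interaction partner).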
In this proposal, we aim to provide a multimodal assessment of typical and atypical social behavior, focusing on the integration of multiple visual and verbal communication channels and their relation to disorders of social interaction in children. We will annotate specific behavioral events during the ongoing reciprocal interaction of a child and an investigator, in particular events related to joint attention and reciprocity. Subsequently, we will apply machine learning (ML) methods to time series of automatically extracted features (e.g., saccades towards faces, facial expressions, body pose from motion capture) to train models for the automatic identification of these events. Furthermore, we seek to use ML to support the clinical characterization of non-verbal behavior, in particular in relation to ASD. At the same time, we will generate a rich dataset, i.e., a corpus of multimodal communication during social interaction, which will allow for numerous further analyses within the Priority Program ViCom and for the wider research community.
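To illustrate the intended ML step, the sketch below shows the window-and-classify pattern on synthetic data, with scikit-learn as one possible toolkit. All signal names, window parameters, and the random "annotation" are hypothetical placeholders; the actual models and features will be determined by the project.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)

# Toy stand-in for an aligned multimodal recording: rows are frames
# (e.g., 30 Hz), columns are extracted signals such as gaze position,
# head pose, and facial action-unit intensities.
n_frames, n_signals = 3000, 6
signals = rng.standard_normal((n_frames, n_signals))

# Frame-level annotation: 1 inside a labeled event (e.g., a joint
# attention episode), 0 elsewhere. Random here, for illustration only.
frame_labels = (rng.random(n_frames) < 0.1).astype(int)

# Slice the recording into overlapping windows and summarize each
# window with simple per-signal statistics; a window inherits the
# label of its center frame.
win, hop = 30, 15  # 1 s windows with 0.5 s hop at 30 Hz (assumed)
starts = np.arange(0, n_frames - win, hop)
X = np.array([
    np.concatenate([signals[s:s + win].mean(axis=0),
                    signals[s:s + win].std(axis=0)])
    for s in starts
])
y = frame_labels[starts + win // 2]

# Note: with overlapping windows, a real evaluation must split by
# child/session rather than by window to avoid leakage; plain CV is
# used here only to keep the sketch short.
clf = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(clf, X, y, cv=5, scoring="roc_auc")
print(f"window-level ROC AUC: {scores.mean():.2f} ± {scores.std():.2f}")
```

The same windowed representation can later feed richer models (e.g., sequence models over window features) once the annotated corpus is available.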