Project Participants


Prof. Dr. Stefan R. Schweinberger
(Principal Investigator)
University of Jena
stefan.schweinberger@uni-jena.de
Stefan Schweinberger is interested in cognition as well as cognitive and social neuroscience. His research areas include person perception and human interaction, with a particular focus on communication via the face and the voice. He received his PhD from the University of Konstanz in 1991 and has held professorships at the University of Glasgow (2000-2005) and the University of Jena (2005-present). He and his team use research methods that link brain and cognition/emotion, such as event-related brain potentials (ERPs), eye tracking, and investigations of patients with focal brain lesions. His research also covers individual differences and constraints on person perception and communication, whether of sensory (e.g., hearing loss) or central (e.g., autism, prosopagnosia) origin.
Selected publications
- Frühholz, S., & Schweinberger, S.R. (2021). Nonverbal auditory communication – Evidence for Integrated Neural Systems for Voice Signal Production and Perception. Progress in Neurobiology, 199, 101948. doi: 10.1016/j.pneurobio.2020.101948.
- Schweinberger, S.R., & von Eiff, C.I. (2022). Enhancing Socio-emotional Communication and Quality of Life in Young Cochlear Implant Recipients: Perspectives from Parameter-specific Morphing and Caricaturing. Frontiers in Neuroscience, 16, 956917. doi: 10.3389/fnins.2022.956917.
- Skuk, V.G., Kirchen, L., Oberhoffner, T., Guntinas-Lichius, O., Dobel, C., & Schweinberger, S.R. (2020). Parameter-specific Morphing Reveals Contributions of Timbre and F0 Cues to the Perception of Voice Gender and Age in Cochlear Implant Users. Journal of Speech, Language, and Hearing Research, 63(9), 3155-3175. doi: 10.1044/2020_JSLHR-20-00026


Prof. Dr. Christian Dobel
(Principal Investigator)
Jena University Hospital
christian.dobel@med.uni-jena.de
Christian Dobel is a Professor of Experimental Otorhinolaryngology at the Jena University Hospital.
Selected publications
- Dobel C, Diesendruck G, Bölte J. How writing system and age influence spatial representations of actions: a developmental, cross-linguistic study. Psychol Sci. 2007 Jun;18(6):487-91. doi: 10.1111/j.1467-9280.2007.01926.x. PMID: 17576259.
- Dobel C, Enriquez-Geppert S, Hummert M, Zwitserlood P, Bölte J. Conceptual representation of actions in sign language. J Deaf Stud Deaf Educ. 2011 Summer;16(3):392-400. doi: 10.1093/deafed/enq070. Epub 2011 Feb 21. PMID: 21339342.
- Dobel C, Enriquez-Geppert S, Zwitserlood P, Bölte J. Literacy shapes thought: the case of event representation in different cultures. Front Psychol. 2014 Apr 16;5:290. doi: 10.3389/fpsyg.2014.00290. PMID: 24795665; PMCID: PMC3997043.
- Dobel C, Nestler-Collatz B, Guntinas-Lichius O, Schweinberger SR, Zäske R. Deaf signers outperform hearing non-signers in recognizing happy facial expressions. Psychol Res. 2020 Sep;84(6):1485-1494. doi: 10.1007/s00426-019-01160-y. Epub 2019 Mar 13. PMID: 30864002.
- Roesmann K, Dellert T, Junghoefer M, Kissler J, Zwitserlood P, Zwanzger P, Dobel C. The causal role of prefrontal hemispheric asymmetry in valence processing of words – Insights from a combined cTBS-MEG study. Neuroimage. 2019 May 1;191:367-379. doi: 10.1016/j.neuroimage.2019.01.057. Epub 2019 Feb 1. PMID: 30716460.


Prof. Dr. Volker Gast
(Principal Investigator)
University of Jena
volker.gast@uni-jena.de
(Photo, a short introduction, and selected publications are to be updated.)


Celina Isabelle von Eiff
(Postdoctoral Researcher)
University of Jena
celina.isabelle.von.eiff@uni-jena.de
Celina I. von Eiff is interested in cognitive and social neuroscience, particularly in voice and face perception in individuals with hearing prostheses (i.e., cochlear implants). She uses behavioural measures and EEG to investigate how the human brain integrates sensory information after cochlear implantation. She currently works at the Department of Psychology of the Friedrich Schiller University Jena.
Selected publications
- von Eiff, C. I., Kauk, J., & Schweinberger, S. R. (2024). The Jena Audiovisual Stimuli of Morphed Emotional Pseudospeech (JAVMEPS): A database for emotional auditory-only, visual-only, and congruent and incongruent audiovisual voice and dynamic face stimuli with varying voice intensities. Behavior Research Methods, 56(5), 5103-5115.
- von Eiff, C. I., Frühholz, S., Korth, D., Guntinas-Lichius, O., & Schweinberger, S. R. (2022). Crossmodal Benefits to Vocal Emotion Perception in Cochlear Implant Users. iScience, 25(12), 105711.
- Schweinberger, S. R. & von Eiff, C. I. (2022). Enhancing socio-emotional communication and quality of life in young cochlear implant recipients: Perspectives from parameter-specific morphing and caricaturing. Frontiers in Neuroscience, 16, 956917.
- von Eiff, C. I., Skuk, V. G., Zäske, R., Nussbaum, C., Frühholz, S., Feuer, U., Guntinas-Lichius, O., & Schweinberger, S. R. (2022). Parameter-Specific Morphing Reveals Contributions of Timbre to the Perception of Vocal Emotions in Cochlear Implant Users. Ear and Hearing, 43(4), 1178-1188.
- Nussbaum, C., von Eiff, C. I., Skuk, V. G., & Schweinberger, S. R. (2022). Vocal emotion adaptation aftereffects within and across speaker genders: Roles of timbre and fundamental frequency. Cognition, 219.
- Schweinberger, S. R., von Eiff, C. I., Kirchen, L., Oberhoffner, T., Guntinas-Lichius, O., Dobel, C., Nussbaum, C., Zäske, R., & Skuk, V. G. (2020). The Role of Stimulus Type and Social Signal for Voice Perception in Cochlear Implant Users: Response to the Letter by Meister et al. Journal of Speech, Language, and Hearing Research, 63(12), 4327-4328.
Project Description
The ability to communicate via auditory spoken language is taken as the benchmark for the success of cochlear implants (CIs), but this disregards the important role that visual cues play in communication. The importance of socio-emotional signals for quality of life with a CI (Luo, Kern, & Pulling, 2018; Schorr, Roth, & Fox, 2009) calls for research on visual benefits to communication. Drawing on models of communication via the face and voice (Young, Frühholz, & Schweinberger, 2020), we consider that deafness can elicit crossmodal cortical plasticity, such that visual stimuli can activate auditory cortex areas. Initial findings suggest that, even after adaptation to a CI, visual information contributes particularly strongly to the perception of speech and speaker gender. A better understanding of these phenomena at the functional and neural level is required to develop efficient interventions that improve communication and, ultimately, quality of life. Here we focus on postlingually deaf adult CI users and propose four studies (S1-S4).
In S1, we conduct a systematic review to determine the current state of knowledge regarding the role of visual information (face or manual gesture) for emotion recognition and speech perception from voices, in hearing adults and CI users. In S2, we use a behavioral experiment with dynamic, time-synchronized audiovisual stimuli to explore whether CI users benefit more from congruent facial expressions than hearing adults do when recognizing vocal emotions, and whether this holds even when overall auditory-only performance levels are controlled for. Importantly, we use voice morphing technology, rather than noise, to equate performance levels. In S3, we study brain correlates of audiovisual integration (AVI) in event-related potentials (ERPs) to audiovisual (AV) emotional stimuli. We focus on the ability of congruent AV stimuli to speed up neural processing and investigate relationships between individual neural markers of AVI and behavioral performance in emotion recognition. In S4, we study the degree to which perceptual training with caricatured vocal emotions can improve auditory and audiovisual emotion recognition in adult CI users. We assess relationships between emotion recognition abilities and reported quality of life in all studies. The project builds on successful previous DFG-funded research on Voice Perception (Schw 511/10-1, -2) and Audiovisual integration in the identification of speaker and speech (Schw 511/6-1, -2), and on our long-standing collaboration with the Cochlear Implant Rehabilitation Centre in Thuringia.
We hope this work will contribute to models of the cognitive and brain mechanisms underlying multimodal perception in human communication. We propose that a better understanding of how visual facial signals support CI users will provide information that can be used to optimize both linguistic and socio-emotional communication and, ultimately, quality of life.
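
To illustrate the logic behind the parameter-specific morphing and caricaturing approach described above, the following minimal Python sketch interpolates (and extrapolates) a single acoustic parameter, the fundamental-frequency (F0) contour, between a neutral and an emotional utterance. This is a conceptual illustration only, not the voice-morphing software used in the project; the function name, synthetic contours, and morph weights are hypothetical.

# Conceptual sketch of parameter-specific morphing (illustration only, not the
# project's actual morphing pipeline): acoustic parameters such as the F0
# contour are interpolated between two time-aligned utterances.
import numpy as np

def morph_f0(f0_neutral: np.ndarray, f0_emotional: np.ndarray, weight: float) -> np.ndarray:
    """Interpolate/extrapolate two time-aligned F0 contours on a log-Hz scale.

    weight = 0.0 reproduces the neutral contour, 1.0 the emotional contour,
    and values > 1.0 produce a caricatured (exaggerated) contour.
    """
    log_neutral = np.log(f0_neutral)
    log_emotional = np.log(f0_emotional)
    return np.exp((1.0 - weight) * log_neutral + weight * log_emotional)

# Hypothetical, synthetic F0 contours (in Hz) standing in for extracted parameters.
t = np.linspace(0.0, 1.0, 200)
f0_neutral = 120.0 + 10.0 * np.sin(2.0 * np.pi * t)     # relatively flat neutral prosody
f0_emotional = 150.0 + 40.0 * np.sin(2.0 * np.pi * t)   # more variable emotional prosody

morph_50 = morph_f0(f0_neutral, f0_emotional, weight=0.5)    # 50% morph
caricature = morph_f0(f0_neutral, f0_emotional, weight=1.5)  # 150% caricature

print(f"Mean F0 (Hz): neutral={f0_neutral.mean():.1f}, 50% morph={morph_50.mean():.1f}, "
      f"emotional={f0_emotional.mean():.1f}, caricature={caricature.mean():.1f}")

In this scheme, weights below 1 correspond to the graded morphs used to equate auditory-only performance levels (S2), whereas weights above 1 correspond to the caricatured vocal emotions used for perceptual training (S4).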
Project Activities
Publications
von Eiff, C. I., Frühholz, S., Korth, D., Guntinas-Lichius, O., & Schweinberger, S. R. (2022). Crossmodal Benefits to Vocal Emotion Perception in Cochlear Implant Users. iScience, 25, 105711. https://doi.org/10.1016/j.isci.2022.105711
von Eiff, C. I., Kauk, J., & Schweinberger, S. R. (2024). The Jena Audiovisual Stimuli of Morphed Emotional Pseudospeech (JAVMEPS): A database for emotional auditory-only, visual-only, and congruent and incongruent audiovisual voice and dynamic face stimuli with varying voice intensities. Behavior Research Methods, 56, 5103-5115. https://doi.org/10.3758/s13428-023-02249-4
von Eiff, C. I., Erchinger, L., Ruttloff, J. M., & Schweinberger, S. R. (in prep.). Improving vocal emotion perception for cochlear implant users: An online training approach using vocal caricatures.
Schweinberger, S.R., & von Eiff, C.I. (2022). Enhancing Socio-emotional Communication and Quality of Life in Young CI Recipients: Perspectives from Parameter-specific Morphing and Caricaturing. Frontiers in Neuroscience, 16, 956917. https://doi.org/10.3389/fnins.2022.956917
Schirmer, A., Croy, I., Liebal, K. & Schweinberger, S.R. (2025). Non-verbal effecting – animal research sheds light on human emotion communication. Biological Reviews, 100(1), 245-257. https://doi.org/10.1111/brv.13140
Henlein, A., Bauer, A., Bhattacharjee, R., Ćwiek, A., Gregori, A., Kügler, F., Lemanski, J., Lücking, A., Mehler, A., Prieto, P., Sánchez-Ramón, P. G., Schepens, J., Schulte-Rüther, M., Schweinberger, S. R., & von Eiff, C. I. (2024). An Outlook for AI Innovation in Multimodal Communication Research. In V. G. Duffy (Ed.), Digital Human Modeling and Applications in Health, Safety, Ergonomics and Risk Management (pp. 182–234). Springer Nature Switzerland. https://doi.org/10.1007/978-3-031-61066-0_13
Gregori, A., Amici, F., Brilmayer, I., Ćwiek, A., Fritzsche, L., Fuchs, S., Henlein, A., Herbort, O., Kügler, F., Lemanski, J., Liebal, K., Lücking, A., Mehler, A., Nguyen, K. T., Pouw, W., Prieto, P., Rohrer, P. L., Sánchez-Ramón, P. G., Schulte-Rüther, M., Schumacher, P., Schweinberger, S., Struckmeier, V., Trettenbein, P., & von Eiff, C. I. (2023). A Roadmap for Technological Innovation in Multimodal Communication Research. In V. G. Duffy (Ed.), Digital Human Modeling and Applications in Health, Safety, Ergonomics and Risk Management. Springer. https://doi.org/10.1007/978-3-031-35748-0_30
Conference contributions
von Eiff, C.I., Skuk, V.G., Zäske, R., Nussbaum, C., Frühholz, S., Feuer, U., Guntinas-Lichius, O., & Schweinberger, S. R. (2022). Parameter-Specific Morphing Reveals Contributions of Timbre to the Perception of Vocal Emotions in Cochlear Implant Users. 5th International Congress on Family Centred Early Intervention for Children Who are Deaf or Hard of Hearing (FCEI), Bad Ischl, Austria, June 8-10, 2022.
Schweinberger, S.R., von Eiff, C.I., & Skuk, V.G. (2022). Parameter-specific voice morphing: Perspectives for application. 5th International Congress on Family Centred Early Intervention for Children Who are Deaf or Hard of Hearing (FCEI), Bad Ischl, Austria, June 8-10, 2022.
Schweinberger, S.R., & von Eiff, C.I. (2022). Training socio-emotional skills: Perspectives from parameter-specific morphing and caricaturing. Talk in the Symposium “Multi-modal socio-emotional communication: Basic mechanisms and functioning in altered sensory and central conditions”. 62nd Meeting of the Society for Psychophysiological Research (SPR), Vancouver, September 28-October 2, 2022.
Schweinberger, S.R., von Eiff, C.I., & Skuk, V.G. (2023). Parameter-specific morphing and caricaturing: Perspectives for assessment and intervention. 1st DZPG Retreat of the German Center for Mental Health, Ulm, September 12-15, 2023.
von Eiff, C.I., Skuk, V.G., Zäske, R., Nussbaum, C., Frühholz, S., Feuer, U., Guntinas-Lichius, O., & Schweinberger, S. R. (2023). Parameter-Specific Morphing Reveals Contributions of Timbre to the Perception of Vocal Emotions in Cochlear Implant Users. 28th Erfurt Days 2023, Erfurt, December 1-2, 2023. (Winner of the Poster Prize)
von Eiff, C.I., Sawada, N., Nussbaum, C., Ruttloff, J.M., & Schweinberger, S.R. (2024). The Jena Screening Test for Speech Comprehension in Sentences (JESSCom): Development and Preliminary Validation. 26th Annual Meeting of the German Society for Audiology (DGA), Aalen, March 6-8, 2024.
von Eiff, C. I., Erchinger, L., Ruttloff, J. M., & Schweinberger, S. R. (2024). Improving vocal emotion perception for cochlear implant users: An online training approach using vocal caricatures. 6th International Congress on Family‐Centred Early Intervention for Children who are Deaf or Hard of Hearing (FCEI), Bad Ischl (Austria), May 15-17, 2024.
Schweinberger, S.R., Kaufmann, J.M., Kowallik, A.E., Limbach, K., Skuk, V.G., & von Eiff, C.I. (2024). Vocal and facial communication: Tools for basic science, assessment and intervention. Experimental Psychology Society (EPS), York Meeting, York (UK), July 3-5, 2024.
von Eiff, C. I., Erchinger, L., Ruttloff, J. M., & Schweinberger, S. R. (2024). Improving vocal emotion perception for cochlear implant users: An online training approach using vocal caricatures. 2nd Interdisciplinary Conference on Voice Identity: Perception, Production and Computational Approaches (VoiceID), Marburg, August 28-30, 2024.
Schweinberger, S.R., Skuk, V.G., Zäske, R., & von Eiff, C.I. (2024). Identifying Individual Profiles of Voice Perception Abilities in Cochlear Implant (CI) Users and their Relationship to Quality of Life (QoL). 2nd Interdisciplinary Conference on Voice Identity: Perception, Production and Computational Approaches (VoiceID), Marburg, August 28-30, 2024.
von Eiff, C. I. (2025). Perception of Emotional Expression in Cochlear Implant Users. 27th Annual Meeting of the German Society for Audiology (DGA), Göttingen, March 19-21, 2025.
Invited talks and guest lectures
von Eiff, C.I. (10/2024). Perception of Emotional Expression in Cochlear Implant Users. Seminar at the Institute of Cognitive Neuroscience, University College London, UK.
Schweinberger, S.R. (6/2024). Social Interaction by Face and Voice: Tools for Basic Science, Assessment, and Intervention. Invited Talk at the University College London, Institute of Cognitive Neuroscience Seminar Series, London, UK, June 3, 2024.
Schweinberger, S.R. (6/2024). Facial and Vocal Interaction: Tools for Basic Science, Assessment, and Intervention. Invited Talk at Durham University, Department of Psychology Seminar Series, Durham, UK, June 17, 2024.
Schweinberger, S.R. (7/2024). Tools for Basic Science, Assessment, and Intervention in Face and Voice Communication Science. Invited Talk at the Symposium on Voice Perception, University of Oslo, Oslo, Norway, July 22, 2024.
Schweinberger, S.R. (11/2024). Multimodal Social Interaction: Tools for Basic Science, Assessment, and Intervention in Face and Voice Communication Science. Invited Talk in the Seminar Series at Uppsala University, Sweden (organizer: Prof. Petri Laukka), delivered via Zoom, November 29, 2024.
Theses
von Eiff, C.I. (2024). Perception of Emotional Expression in Cochlear Implant Users. Dissertation, Friedrich Schiller University Jena.
Ruttloff, J.M. (2024). Perception and Imitation of Voice Caricatures: Effects of an Online Training on Vocal Emotion Perception in Cochlear Implant Users. Master's thesis, Leipzig University.
