Gestures or Signs? Comparing Manual and Non-manual Constructions Sharing the Same Form in Co-speech Gesture and Sign Language: A Corpus-driven Approach. (GeSi)

Project Participants

Project Description

A number of manual and non-manual constructions are not confined to signed language; they are also observed in co-speech gesture. The following forms will be systematically treated in this project: palm-up, throw-away, pointing, list buoys, eyebrow raise, and sideways body leans. While some of these constructions have been researched, others remain un(der)studied, and there is as yet no fine-grained comparison of all of them between signers and speakers. This project fills that gap. It provides a detailed corpus-based analysis of the selected constructions in both co-speech gesture and sign, with the goal of determining how, and to what degree, these constructions in sign language differ from comparable forms in gesture on both functional and formational grounds. Because these constructions share the same modality, they will receive the same theoretical treatment and will be investigated along a number of dimensions, blurring the strict gesture-sign binary and supporting an understanding of these manual and non-manual activities as forming a cross-modal continuum along which functional conventionalization and lexicalization take place. By distinguishing signs and gestures, the two prime examples of visual communication, the project also provides further insights into the interaction of different channels and the grammatical system(s) underlying this interaction, contributing to a new, modality-free, comprehensive theoretical model of language and communication.

Project Activities

Publications

Bauer A, Kuder A, Schulder M, Schepens J (2025) Correction: Phonetic differences between affirmative and feedback head nods in German Sign Language (DGS): A pose estimation study. PLOS ONE 20(3): e0321229. https://doi.org/10.1371/journal.pone.0321229

Bauer, A., Trettenbrein, P. C., Amici, F., Ćwiek, A., Fuchs, S., Krause, L., Kuder, A., Ladewig, S., Schulder, M., Schumacher, P., Spruijt, D., Zulberti, C. & Schulte-Rüther, M. (2025). Data Collection in Multimodal Language and Communication Research: A Flexible Decision Framework. https://doi.org/10.31234/osf.io/42tud_v1

Henlein, A., A. Bauer, R. Bhattacharjee, A. Ćwiek, A. Gregori, F. Kügler, J. Lemanski, A. Lücking, A. Mehler, P. Prieto, P. G. Sánchez-Ramón, J. Schepens, M. Schulte-Rüther, S. R. Schweinberger & C. I. von Eiff. 2024. An Outlook for AI Innovation in Multimodal Communication Research. Lecture Notes in Computer Science, Springer Cham.

Kuder, Anna & Anastasia Bauer. (submitted). When do you smile back: Exploring the alignment of smiling behavior in spontaneous dyadic signed interactions. To be published in Expressing Emotions in Sign Languages, eds. Sarah Schwarzenberg, Simon Kollien, Nina-Kristin Meister, Thomas Finkbeiner & Annika Herrmann. Sign Languages and Deaf Communities, de Gruyter Mouton & Ishara Press.

Events
Date: 09.–10.10.2024
Event: PALM UP in Gesture and Sign Symposium: Theoretical, Typological and Methodological Perspectives for Future Research
Organization: Sandra Debreslioska, Anastasia Bauer, Anna Kuder & Pamela Perniss
Short-term Collaborations

Comparing a recurrent non-manual movement in spoken and signed languages (2024) – Anastasia Bauer & Silva Ladewig

Examining mouthings with Virtual Reality (VR) glasses (Meta Quest Pro) (2024) – Anastasia Bauer, Alexander Mehler, Alexander Henlein & Andy Lücking

Studying head nods with the computer vision tool OpenPose (2024) – Anastasia Bauer, Anna Kuder & Marc Schulder

The Analysis of Mouthings in the Online DGS Corpus (2024) – Anastasia Bauer, Nina K. Meister, Patrick C. Trettenbrein & Liona Paulus