Representation of Pointing Uncertainty for the Integration of Pointing Gestures and Speech

Project Participants

Project Description

Pointing gestures are ubiquitous in everyday life. Typically, pointing is used to direct the attention of others to distant objects in the environment, especially when the objects are difficult to describe verbally (e.g., a star in the night sky or a specific chocolate at a pastry shop). Helpful as pointing may be, pointers do not expect the gesture alone to establish a joint focus of attention. Hence, pointing is typically complemented by speech. To choose a meaningful verbal description, a pointer needs some notion of the uncertainty associated with his gesture: if a pointer assumes that his gesture directs another person’s attention exactly to the referent, fewer verbal descriptions (if any) are given than when he expects the point to draw attention only to the rough vicinity of the referent. Likewise, observers of pointing gestures represent their own perceptual uncertainty in order to know in which area to look for the referent and to judge the confidence of their guess. In summary, effective communication with pointing gestures and speech requires representations of perceptual uncertainty on the part of both pointers and observers. However, little is known about the specific information pointers assume to convey with pointing, the information observers assume can be inferred from a point, and how both relate to the information that observers actually infer from pointing gestures.

The objective of the present proposal is to examine such representations of uncertainty for the typical case of pointing to small and distant objects. To this end, we plan to use visual search tasks in which the identity of a target is known only to a pointer, who communicates it to an observer with pointing and speech. Eye-tracking data are used to derive representations of uncertainty, as pointers and observers need to visually process an area of uncertainty around the (assumed) referent to provide adequate verbal descriptions or to search for the referent. Additionally, representations of uncertainty are derived from the specificity of the pointer’s verbal descriptions and the level of specificity the observer needs to be confident about their guess.

The work programme includes experiments in real and virtual settings. WP 1 aims at establishing the basic paradigm. WP 2 addresses whether representations of uncertainty reflect factors that are known to affect pointing perception. WP 3 addresses how representations of uncertainty may be adapted to perceptual performance. The results of WPs 1–3 are used to extend our formal model of pointing perception to include representations of uncertainty. WP 4 evaluates methods to improve pointing-based communication by making misconceptions about uncertainty explicit in real and virtual settings. In summary, we expect the project to provide new insights into representations of the uncertainty of pointing perception and their effect on the integration of gestures and speech.
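
To make the notion of an “area of uncertainty” concrete, the following sketch illustrates one simple way such a representation could be formalised. It is a minimal illustration under our own assumptions (a roughly Gaussian angular error of the extrapolated pointing direction and a planar referent surface), not the formal model to be developed in the project.

```python
# Minimal sketch, not the project's actual model: assume the perceived
# pointing direction deviates from the true direction by a roughly Gaussian
# angular error, so the area of uncertainty on a surface at a given distance
# can be approximated by a circle whose radius grows with distance.
import math

def uncertainty_radius(distance_m: float, angular_sd_deg: float,
                       coverage_sd: float = 2.0) -> float:
    """Radius (in metres) of the region expected to contain most
    interpretations of the point, assuming a Gaussian angular error with
    standard deviation `angular_sd_deg`; `coverage_sd` = 2 covers ~95%."""
    return distance_m * math.tan(math.radians(coverage_sd * angular_sd_deg))

# Hypothetical example: a referent 5 m away and an assumed angular error of
# 3 degrees give an uncertainty radius of about 0.5 m.
print(round(uncertainty_radius(5.0, 3.0), 2))  # -> 0.53
```

Under these assumptions, a pointer who represents a large angular error should search (and describe) a correspondingly larger region around the referent, which is the kind of relation the eye-tracking and verbal-description measures are meant to capture.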