Semiotica 2021 (241): 159-183
Abstract
This study investigates whether there is a relation between the semantics of linguistic expressions that indicate temporal distance and the spatial properties of their co-speech gestures. To date, research on time gestures has focused on features such as gesture axis, direction, and shape. Here we focus on a gesture property that has been overlooked so far: the distance of the gesture in relation to the body. To this end, we investigate two types of temporal linguistic expressions: proximal (e.g., near future, near past) and distal (e.g., distant past, distant future). Data were obtained through the NewsScape library, a multimodal corpus of television news. A total of 121 co-speech gestures were collected and divided into the two categories. The gestures were then annotated in terms of gesture space and classified into three categories: (i) center, (ii) periphery, and (iii) extreme periphery. Our results suggest that gesture and language are coherent in the expression of temporal distance: when speakers locate an event far from them, they tend to gesture farther from their body; similarly, when locating an event close to them, they gesture closer to their body. These results thus reveal how co-speech gestures also reflect a space-time mapping in the dimension of distance.