Communicative Functions of Co-Speech Gestures During Conversation in Adults with ASD


Zhang, Y., Bagdasarov, A., Kim, E., Dravis, Z., Cola, M., Maddox, B., Ferguson, E., Adeoye, L., Fergusson, F., Pallathra, A., Minyanou, N., Bateman, L., Pomykacz, A., Bartley, K., Brodkin, E., Pandey, J., Parish-Morris, J., Schultz, R.T., & de Marchena, A.

Abstract Text: 

Background: Co-speech hand gestures serve many communicative functions, often denoted by gesture “types.” For example, gestures can be interactive (signaling pragmatic functions, e.g., turn taking), or representational (depicting physical properties of referents, e.g., shape/movement). Several studies have reported that people with ASD use the same types of gestures as controls, while others report differential proportions of certain gesture types (e.g., increased iconic/representational gestures).

A further function of gesture, across types, is to present information that supplements speech (e.g., saying "the big one" while drawing a circle in the air to indicate roundness, or saying "I don't know" while gesturing toward an interlocutor to indicate their turn to speak). Some studies have shown reduced supplementary gestures in ASD, suggesting less integration across verbal and nonverbal communicative modalities.


Co-speech gestures in ASD have primarily been studied via elicited narratives, which offer limited opportunities for back-and-forth interaction. Because gestures serve different functions depending on social context, the objective of the current study was to examine the communicative functions of co-speech gestures during back-and-forth interaction.


Adults with ASD (n=24) and age-, gender-, and IQ-matched typically developing controls (TDC; n=10) completed a five-trial collaborative referential communication task designed to elicit spontaneous back-and-forth conversation in a controlled setting (data on the remaining 14 TDCs will be available by May 2017). We examined whether gestures fulfill different communicative functions in ASD in two ways: (1) coding gesture types, and (2) examining how often gestures present information that supplements speech. Gestures were coded as interactive, representational, beat (i.e., moving hands in time to speech), or other (less frequent types, including deictic/pointing).


Participants in both groups used far more representational gestures than interactives and beats (p<.001, Cohen's d=2.21), with no group-by-type interaction, suggesting that, on this type of task, gestures produced by adults with ASD and TDC fulfill the same communicative functions. Surprisingly, adults with ASD were more likely to include supplementary information in their gestures (p=.02, Cohen's d=0.91; Figure 1). Finally, we investigated whether participants in both groups modulated the supplementary information presented in gestures based on gesture type (interactive vs. representational). All participants included more supplementary information when using interactives relative to representationals (p<.001, Cohen's d=1.80; Figure 2), with no interaction, suggesting that adults with and without ASD are equally likely to modulate supplementary information based on gesture type.


Adults with ASD use co-speech gestures to serve communicative functions similar to those of controls, as evidenced by comparable proportions of gesture types and equivalent patterns of modulating supplementary information based on gesture type. Surprisingly, adults with ASD were more likely to include supplementary information in their gestures, which is inconsistent with previously published work on youth with ASD (e.g., Morett et al., 2016; So et al., 2014) and may reflect differences in task demands or age. Further work with the current dataset will explore how people with ASD combine informational content from gestures and speech, and what may drive them to include different content across communicative modalities.

Presentation Date and Time: 
Thursday, May 11, 2017 - 12:30pm to 1:45pm
Presentation Location: 
Golden Gate Ballroom (Marriott Marquis Hotel)