“Language is inseparable from imagery,” argues David McNeill (quoting Antonio Damasio) in his revealing book on the relationship between gesture and language. Moreover, gestures actually help us to think, to speak and to articulate our feelings. In this interesting (but quite technical) book, McNeill also points out that gesture and language occupy the same ‘time frames’ and have the same relationships to context. In his view, it is a mistake to consider gesture (or ‘body language’) as separate from spoken language, as the two work together as connected parts of human language.
Taking this further, he discusses how language and thinking have two parts, static and dynamic, which combine in any piece of communication (any ‘speech event’). Although they are bound together, they often reflect different aspects of our ‘memory’ of events, residing in different parts of the brain. As TapestryWorks have often argued, memory is much more physical and experiential than it is verbal, and the work on gesture supports this.
Interestingly, much of the work on gesture is very ‘semiotic’ in feel, for example in distinguishing iconic from metaphoric gestures (concrete versus abstract). Another classification of gesture (from Ekman and Friesen) proposes five types: illustrators, adaptors, emblems, affect displays and regulators.
Illustrators are generally tied directly to speech with the same intentionality, beating the tempo of what is spoken, pointing out items that are referred to or exploiting imagery from speech content. Adaptors are fragments of previously learnt hand movements, often habitual, such as smoothing your hair or pushing your glasses up. These movements happen with very little awareness and no intention to communicate.
Emblems are more directly communicative and often come before speech, such as the ‘thumbs up’ or ‘shush’ (needing no speech at all). Affect displays and regulators often involve no hand movement, coming from the face in the case of the former and from slight head movements and changes in body position in the case of the latter.
Differences in the structure of language can lead to differences in the nature and timing of gestures. For example, if you “hit someone with a stick”, a typical English speaker will time the action gesture to coincide with the verb, while a Mandarin speaker will time it to coincide with the object (‘stick’), which comes earlier in the sentence.
But gesture doesn’t always coincide with language. On some occasions it can convey information that is not transmitted (or not understood) via language, and in some cases it may even contradict what is said (classically when someone is withholding some aspect of the truth or simply doesn’t know something).
Typically, what the hands say is more likely to be true than what is spoken. More tellingly, we can almost always gesture what we say, but there are many occasions when we cannot say what we can gesture.
Work on gesture and language reveals much about the origin of languages, but I think it reveals even more about the unreliability of relying only on what people say rather than what they do.
For me, the hands (and body) win over speech almost every time.
Gesture and Thought by David McNeill
Hearing Gesture: How our hands help us think by Susan Goldin-Meadow
The Feeling of What Happens: Body and emotion in the making of consciousness by Antonio Damasio