Emblems: gestures with conventional properties and speech-like functions. For example, the OK sign has a culturally agreed-upon meaning, and it must be formed by bringing the thumb and index finger together. Conflicting results on the relationship between gesture and language control have been reported in the literature. Jacobs and Garnham (2007) found that speakers rarely used gestures to retrieve words; instead, gestures were produced for listeners, to improve comprehension of the spoken content. On the other hand, Goldin-Meadow (1999, 2003) suggested that gesturing can lighten a speaker's linguistic demands and/or support the formulation of thoughts when conveying complicated ideas, such as explaining a mathematical problem, in typical speakers. In other words, while gesturing, more cognitive resources can be devoted to the task of speaking, freeing up capacity for retrieving words from memory (Rauscher, Krauss, & Chen, 1996). Alibali and DiRusso (1999) reported consistent results: typical speakers could remember more words in a word-recall task when gesturing was allowed, suggesting that gestures may help reduce the working-memory demands of a linguistic task. Similarly, Hostetter and Alibali (2007) reported that people with low phonemic fluency (that is, a weaker ability to generate individual words beginning with a particular sound) but high spatial-visualization skills gestured most often. However, the rate of producing representational gestures did not vary solely with phonemic-fluency competence, suggesting that individual differences in gesture use are associated with individual differences in cognitive abilities.
Gestures are processed in the same brain areas as speech and sign language, including the left inferior frontal gyrus (Broca's area) and the posterior middle temporal gyrus, posterior superior temporal sulcus, and superior temporal gyrus (Wernicke's area). It has been suggested that these parts of the brain originally supported the pairing of gesture and meaning and, over the course of human evolution, "were adapted for the comparable pairing of sound and meaning, as voluntary control of the vocal apparatus was established and spoken language was developed." As a result, this system underlies both symbolic gesture and spoken language in the modern human brain. Their common neurological basis also supports the idea that symbolic gesture and spoken language are two parts of a single fundamental semiotic system that underlies human discourse.
The tight coupling of hand and body gestures with speech is also reflected in the way blind people gesture during conversation. This phenomenon reveals a function of gesture that goes beyond representing the communicative content of language and extends David McNeill's view of the gesture-speech system. It suggests that gesture and speech work closely together, and that a disruption of one (speech or gesture) causes a problem in the other. Studies have found strong evidence that speech and gesture are intrinsically linked in the brain and operate as an efficiently synchronized and choreographed system. McNeill's view of this linkage in the brain is only one of three currently under discussion; the others describe gesture as a "support system" for spoken language or as a physical mechanism of lexical retrieval. Modern computer technology could accelerate this (re)shift towards gesture.