World Journal of Neuroscience
Vol.2 No.3(2012), Article ID:21852,4 pages DOI:10.4236/wjns.2012.23024

The general motor programmer: Its specialization for speech perception & movement

Anne Maria Keane

Psychology, University of Bolton, Bolton, UK


Received 15 May 2012; revised 16 June 2012; accepted 30 June 2012

Keywords: General Motor Programmer; Common Mechanism; Speech Perception; Hand Movements


A common neural mechanism—the General Motor Programmer—was proposed by Keane (1999) to underlie both the perception of speech and the initiation of hand movement. A proposal is outlined to investigate the specific aspect of cognitive functioning for which this mechanism is specialized, namely timing or place of articulation.


A common aspect of cognitive processing is shared by the perception of speech and the initiation of hand movement. This was shown in a study by Keane [1], which, although designed to ascertain the relative capacity of each hemisphere for verbal processing, instead found interference between a verbal perception task and the hand used to respond. Specifically, right and left handers with varying degrees of hand preference, both strong and weak as measured by the Edinburgh Handedness Inventory (EHI) [2], responded to monaurally presented verbal stimuli (consonant vowels) in a choice reaction time paradigm. A right ear or right visual field advantage (indicating a left hemisphere advantage) is usually found using tachistoscopic presentation, dichotic listening, or monaural listening. However, such an advantage informs us only about the left hemisphere's ability; it tells us nothing about the capability of the right hemisphere. The Keane study therefore used a reaction time procedure that assessed the capacity of both hemispheres by requiring a response from each hand following presentation of the verbal stimuli to each ear. This yielded four response conditions (right ear-right hand, right ear-left hand, left ear-right hand, left ear-left hand) instead of the usual two (right ear-preferred hand; left ear-preferred hand). Like previous studies [3-5], Keane [1] found a right ear advantage for processing of the verbal stimuli, but only in the strong right and left handers, not in the weak right and left handers.
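The four ear-by-hand pairings described above can be sketched as a simple Cartesian product (a minimal illustration; the condition names are mine, not the study's):

```python
from itertools import product

# Illustrative labels for the two ears and two responding hands
# (names are assumptions for this sketch, not taken from the study).
EARS = ("right_ear", "left_ear")
HANDS = ("right_hand", "left_hand")

def response_conditions():
    """Return every ear x hand pairing tested in the choice reaction
    time paradigm, rather than the usual two ear x preferred-hand
    conditions."""
    return list(product(EARS, HANDS))

conditions = response_conditions()
# Yields four conditions: right ear-right hand, right ear-left hand,
# left ear-right hand, left ear-left hand.
```

The point of the full factorial is visible in the enumeration: each ear's input is paired with each hand's output, so the capacity of both hemispheres, not just the left, can be assessed.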
However, this right ear advantage in the strong handers was related to degree of hand preference: only those strong handers who used their preferred hand for writing and most other activities on the EHI, but not for all of them (the inconsistent strong handers), showed the right ear advantage. Strong handers with a consistent hand preference, that is, those who used their preferred hand for all the activities on the EHI, did not display the right ear advantage and instead showed more of a left ear advantage. The absence of the right ear advantage in these even more strongly handed participants, who would be expected to show it even more clearly than the inconsistent strong handers [6], suggested that the necessity of making a manual response was interfering with verbal processing in the left hemisphere.

The amount of interference was the same for both hands, indicating that the interference occurred at a level of motor programming prior to right and left hand control, or even hand preference, since interference at either of those levels would affect one hand more than the other. Equal interference to both hands therefore suggests that the common mechanism between speech perception and the programming of hand movements is specialized for some aspect of an early, initiation-of-movement stage of hand control. Familial sinistrality (the presence of left handed family members) had also been taken into account, in addition to the direction and degree of hand preference; while familial sinistrality has been related to other levels of hand control [7,8], it was not found to be related to lateralization of the general motor programmer [1]. The general motor programmer, or common mechanism, is general in the sense that it is not specific to hand movements: it was also required for some aspect of verbal perception, and it is most likely used to initiate other movements as well (e.g., movements of the mouth). Also, while the general motor programmer was presumed (mainly from studies of brain-damaged patients) to be based in the left hemisphere [9-11], the results of the Keane study suggest that this is the case only for those with a consistent hand preference.


While the Keane study [1] shows that speech perception and hand movements share the same general motor programming mechanism, what has not yet been determined is what specific aspect is actually shared. It is proposed, therefore, to re-investigate the data from the original Keane study [1] to try to pinpoint exactly what the general motor programmer is specialized for. Previous research on possible left hemisphere specializations on which both speech and hand movements could be based has centered on either some type of timing mechanism or the spatial positioning of the musculature [9,12-14]. Evidence for a left hemisphere timing advantage, particularly for rapidly changing stimuli, was shown by Schwartz and Tallal [15] in a study in which the presentation rate of acoustic change was varied. They found a left hemisphere advantage when stop consonants were presented at a rapid rate, but this advantage was significantly reduced when the rate of presentation was slower. Similarly, Boemio, Fromm, Braun and Poeppel [16] found activation of both hemispheres for processing temporal information in nonspeech stimuli, but greater right hemisphere activation for the perception of stimuli comprised of longer-duration, and thus slower, segments. More recent lesion and neuroimaging studies have also shown a left hemisphere advantage for timing [17,18]. In addition, a correlation has been found between nonverbal temporal analysis deficits and the degree of receptive language deficit in both children with developmental aphasia and adults with acquired aphasia: the greater the impairment in temporal analysis, the more impaired the person's receptive language abilities [19,20]. This too suggests that the common mechanism between motor production and identification of speech may be based on timing.
However, a left hemisphere specialization for timing has been disputed [21], with the suggestion instead that the left hemisphere is specialized for the rapid positioning and repositioning of the musculature [9,22,23]. Studies of brain-damaged patients support a place of articulation specialization: patients with left hemisphere damage had little difficulty performing a repetitive task but experienced difficulty when the motor task required several changes in arm, hand and finger positions [9]. This left hemisphere muscular repositioning motor control system would appear to mediate speech movements as well as manual movements. In a study by Mateer and Kimura [24], similar findings were obtained for oral movements: while fluent aphasics showed no deficit in producing single oral movements (ba, ba, ba), they were significantly impaired relative to non-aphasic groups when producing multiple oral movements (ba, da, ga). Evidence for a common left hemisphere specialized muscular repositioning center is further supported by Kimura's [9] finding that aphasic patients with left hemisphere damage were more impaired on the manual sequencing task than non-aphasic patients with left hemisphere damage.


The verbal task used in the Keane study [1] that caused interference with hand movement initiation specifically required a decision on whether two consonant vowels differed with respect to voicing or place of articulation. The voicing aspect of speech (the perception of the temporal order of vocal cord vibration relative to consonant release) is taken as representing the timing feature of speech [25], while place of articulation refers to the place or point along the vocal tract that is closed, thus restricting the flow of air [26]. It is proposed, therefore, to determine whether the general motor programmer is specialized for timing or for place of articulation by ascertaining whether the interference found for the consistent handers between the hand response and perception of the consonant vowels is evident when the consonant vowels differ in voicing, or instead when they differ in place of articulation. The verbal stimuli consisted of the consonant vowels /ba/, /pa/, /ga/ and /ka/, presented in pairs; right and left handers had to decide whether the two consonant vowels presented were the same or different. The difference in interference between the consistent and inconsistent strong handers was found only when the two consonant vowels differed, not when they were the same [1]. Therefore, both perception of speech and initiation of the hand response must be based on some aspect of cognitive functioning that is used to determine whether two presented consonant vowels are different. The consonant vowels had been presented in the pairs /ba/ vs. /pa/, /ba/ vs. /ga/, and /ba/ vs. /ka/. The /ba/ vs. /pa/ contrast differs in voicing (i.e., timing) but not place, the /ba/ vs. /ga/ contrast differs in place of articulation but not voicing, and the /ba/ vs. /ka/ contrast differs in both place and voicing.
Thus the Keane data provide an opportunity to disentangle timing and place of articulation, and so determine which of these is the basis of the general motor programmer.
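How the three contrasting pairs separate the two phonetic features can be made explicit in a small sketch (the feature values follow standard phonetic descriptions of these stop consonants; the helper function is illustrative, not part of the original analysis):

```python
# Standard phonetic features of the four consonant vowels used as stimuli:
# /b/ and /g/ are voiced, /p/ and /k/ are voiceless;
# /b/ and /p/ are bilabial, /g/ and /k/ are velar.
FEATURES = {
    "ba": {"voicing": "voiced",    "place": "bilabial"},
    "pa": {"voicing": "voiceless", "place": "bilabial"},
    "ga": {"voicing": "voiced",    "place": "velar"},
    "ka": {"voicing": "voiceless", "place": "velar"},
}

def contrast(cv1, cv2):
    """Return the set of features on which two consonant vowels differ."""
    return {feature for feature in ("voicing", "place")
            if FEATURES[cv1][feature] != FEATURES[cv2][feature]}

# /ba/ vs. /pa/ differs only in voicing, /ba/ vs. /ga/ only in place,
# and /ba/ vs. /ka/ in both.
```

Because each pair isolates a different feature (or combines both), comparing interference across the three pair types is what allows timing and place of articulation to be teased apart.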

Perception of speech is related to bilateral hand control at the level of the general motor programmer because they share some common aspect of cognitive processing. Determining whether the interference between perception of differing consonant vowels and initiation of the hand response is evident for the voicing feature of speech or for place of articulation will therefore establish whether the general motor programmer is specialized for timing or for place of articulation, and should throw some light on one of the links between language and handedness.


  1. Keane, A.M. (1999) Cerebral organization of motor programming and verbal processing as a function of degree of hand preference and familial sinistrality. Brain and Cognition, 40, 500-515. doi:10.1006/brcg.1999.1115
  2. Oldfield, R. (1971) The assessment and analysis of handedness: The Edinburgh Inventory. Neuropsychologia, 9, 97-113. doi:10.1016/0028-3932(71)90067-4
  3. Morais, J. and Darwin, C.J. (1974) Ear differences for same-different reaction times to monaurally presented speech. Brain and Language, 1, 383-390. doi:10.1016/0093-934X(74)90015-7
  4. Rastatter, M.P. and Gallaher, A.J. (1982) Reaction-times of normal subjects to monaurally presented verbal and tonal stimuli. Neuropsychologia, 20, 465-473. doi:10.1016/0028-3932(82)90045-8
  5. Studdert-Kennedy, M. and Shankweiler, D. (1970) Hemispheric specialization for speech perception. Journal of the Acoustical Society of America, 48, 579-594. doi:10.1121/1.1912174
  6. Sheehan, E.P. and Smith, H.V. (1986) Cerebral lateralization and handedness and their effects on verbal and spatial reasoning. Neuropsychologia, 24, 531-540. doi:10.1016/0028-3932(86)90097-7
  7. Keane, A.M. (2001) Motor control of the hands: The effect of familial sinistrality. International Journal of Neuroscience, 110, 25-41. doi:10.3109/00207450108994219
  8. Keane, A.M. (2002) Direction of hand preference: The connection with speech and the influence of familial handedness. International Journal of Neuroscience, 112, 1287-1303. doi:10.1080/00207450290158205
  9. Kimura, D. (1977) Acquisition of a motor skill after left-hemisphere damage. Brain, 100, 527-542. doi:10.1093/brain/100.3.527
  10. Kimura, D. (1982) Left-hemisphere control of oral and brachial movements and their relation to communication. Philosophical Transactions of the Royal Society of London: B: Biological Sciences, 298, 135-149. doi:10.1098/rstb.1982.0077
  11. Liepmann, H. (1980) The left hemisphere and action. Research Bulletin, 506, 17-50.
  12. Bradshaw, J.L. and Nettleton, N.C. (1981) The nature of hemispheric specialization in man. Behavioral and Brain Sciences, 4, 51-91. doi:10.1017/S0140525X00007548
  13. Calvin, W.H. (1983) Timing sequencers as a foundation for language. Behavioral and Brain Sciences, 6, 210-211. doi:10.1017/S0140525X00015533
  14. Ojemann, G.A. (1984) Common cortical and thalamic mechanisms for language and motor functions. American Journal of Physiology, 246, R901-R903.
  15. Schwartz, J. and Tallal, P. (1980) Rate of acoustic change may underlie hemispheric specialization for speech perception. Science, 207, 1380-1381.
  16. Boemio, A., Fromm, S., Braun, A. and Poeppel, D. (2005) Hierarchical and asymmetric temporal sensitivity in human auditory cortices. Nature Neuroscience, 8, 389-395.
  17. Brancucci, A., D’Anselmo, A., Martello, F. and Tommasi, L. (2008) Left hemisphere specialization for duration discrimination of musical and speech sounds. Neuropsychologia, 46, 2013-2019. doi:10.1016/j.neuropsychologia.2008.01.019
  18. Schirmer, A. (2004) Timing speech: A review of lesion and neuroimaging findings. Cognitive Brain Research, 21, 269-287. doi:10.1016/j.cogbrainres.2004.04.003
  19. Tallal, P. (1983) A precise timing mechanism may underlie a common speech perception and production area in the peri-Sylvian cortex of the dominant hemisphere. Behavioral and Brain Sciences, 6, 219-220. doi:10.1017/S0140525X00015636
  20. Tallal, P. and Newcombe, F. (1978) Impairment of auditory perception and language comprehension in dysphasia. Brain and Language, 5, 13-24. doi:10.1016/0093-934X(78)90003-2
  21. Watson, N.V. and Kimura, D. (1989) Right-hand superiority for throwing but not for intercepting. Neuropsychologia, 27, 1399-1414. doi:10.1016/0028-3932(89)90133-4
  22. Lomas, J. and Kimura, D. (1976) Intrahemispheric interaction between speaking and sequential manual activity. Neuropsychologia, 14, 23-33.
  23. McFarland, K., Ashton, R. and Jeffery, C.K. (1989) Lateralized dual-task performance: The effects of spatial and muscular repositioning. Neuropsychologia, 27, 1267-1276. doi:10.1016/0028-3932(89)90039-0
  24. Mateer, C. and Kimura, D. (1977) Impairment of nonverbal oral movements in aphasia. Brain and Language, 4, 262-276. doi:10.1016/0093-934X(77)90022-0
  25. Howell, P. and Rosen, S. (1983) Natural auditory sensitivities as universal determiners of phonemic contrasts. Linguistics, 21, 205-235. doi:10.1515/ling.1983.21.1.205
  26. Wallwork, J.F. (1985) Language and linguistics. 2nd Edition, Heinemann, Oxford.