An article on this study by Dr. Weber-Fox also appeared in the New York Times

By Christine Weber-Fox, Ph.D., CCC-SLP
Purdue University

In the last decade, accumulating evidence from laboratories in the U.S. and Europe, as well as our own, has led to the development of a multi-factorial model of stuttering. The model hypothesizes that stuttering emerges from complex interactions among factors including genetics, language processing, emotional/social aspects, and speech motor control. Ultimately, stuttering occurs when the neural signals that produce the coordinated movements in the respiratory, vocal, and articulatory systems become disrupted. The underlying notion is that the functions of the brain areas for speech motor control are affected by complex interactions with other neural systems. One important assumption of this model is that these factors may not play the same role in different individuals who stutter and very likely vary in significance across different stages of development.

In our recent series of complementary experiments, we studied how neural systems for language processing may contribute to disruptions in speech motor control in people who stutter. We have examined this question using two approaches. First, using a very sensitive movement tracking system, we have been able to analyze speech movements with an accuracy of less than 0.1 mm to determine how consistently and smoothly these movements are produced across different language tasks. The main finding from these studies is that when language demands are relatively low, the speech movements of adults who stutter are similar in consistency to those of normally fluent adults. However, when the linguistic demands of an utterance become more complex, the additional processing demands affect the speech motor control systems of adults who stutter to a greater extent than those of normally fluent adults. This is direct evidence of how task demands in other neural systems, such as those involved in language processing, can disrupt the neural signals for the speech movements themselves.
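For readers who want a concrete sense of how trial-to-trial movement consistency can be quantified, the sketch below shows one common style of analysis: each repetition of an utterance is time- and amplitude-normalized, and the variability across repetitions is summed. This is only an illustrative example in Python; the function names, array sizes, and simulated data are assumptions for the sketch, not the exact analysis used in the studies described above.

```python
# Illustrative sketch: quantifying how consistent a repeated speech movement
# trajectory is across repetitions. Each trial is resampled onto a common time
# base and amplitude-normalized; the across-trial standard deviations are then
# summed. Lower values indicate more consistent movements across repetitions.

import numpy as np


def trajectory_consistency(trials, n_points=50):
    """trials: list of 1-D arrays, each a displacement record (e.g., lip
    position sampled over one repetition of the utterance)."""
    normalized = []
    for trial in trials:
        trial = np.asarray(trial, dtype=float)
        # Time-normalize: resample every repetition onto a common time base.
        common_time = np.linspace(0, 1, n_points)
        original_time = np.linspace(0, 1, trial.size)
        resampled = np.interp(common_time, original_time, trial)
        # Amplitude-normalize: remove differences in overall movement size.
        resampled = (resampled - resampled.mean()) / resampled.std()
        normalized.append(resampled)
    normalized = np.vstack(normalized)           # shape: (n_trials, n_points)
    # Sum the across-trial standard deviations over the normalized time base.
    return np.sum(np.std(normalized, axis=0))


# Hypothetical usage with simulated repetitions of the same utterance:
rng = np.random.default_rng(0)
base = np.sin(np.linspace(0, np.pi, 120))         # idealized movement shape
trials = [base + rng.normal(0, 0.05, 120) for _ in range(10)]
print(round(trajectory_consistency(trials), 2))   # smaller = more consistent
```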

A second approach we have taken to better understand how language processing demands may affect stuttering is to examine brain responses to language tasks when participants are not required to speak. To do this, we recorded the electrical activity generated by groups of brain cells (electroencephalography) with 32 electrodes embedded in an elastic cap. Using this technique, we can measure changes in brain activity on a millisecond-by-millisecond basis. The participants in these studies were asked to read words that were flashed on a computer monitor one word at a time. We then averaged the brain activity elicited by specific aspects of the language stimuli, such as whether the word is a content word (e.g., "cow") or a function word (e.g., "into"). This averaging of the brain waves results in a measure known as an event-related brain potential (ERP).

When we examined the averaged brain waves of the adults who stutter, we found they were reduced in amplitude compared to those of a matched group of normally fluent adults. We found reduced amplitudes of averaged brain responses in adults who stutter for function and content words, for words read in an unexpected context (semantic anomalies, e.g., "The boy hung his coat in the peanut"), and for violations of verb agreement (e.g., "Every day they travels this road"). Our most recent study, which will appear in the Journal of Speech, Language, and Hearing Research in December 2004, examined averaged brain responses elicited by words that rhymed or did not rhyme with a preceding word. This experiment allowed us to look at how phonological processing (without the grammatical or semantic demands) may differ in adults who stutter. For the most part, the averaged brain waves and behavioral responses of the adults who stutter were very similar to those of normally fluent speakers. It was only in the most difficult rhyme decision, when the two words looked alike but did not rhyme (e.g., "gown" and "own"), that reaction times were slowed in the adults who stutter. The averaged brain waves for the rhyme decision were larger over the right hemisphere than over the left in the adults who stutter, but equal across hemispheres in the normally fluent speakers.

Taken together, these findings indicate that the neural systems for some aspects of language processing may operate differently in adults who stutter even when there are no overt speaking demands. Overall, in both the movement tracking and brain response studies, we found that increased complexity, or greater demands on the language processing system, enhanced differences between adults who stutter and normally fluent adults.
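To make the averaging step behind an ERP concrete, the sketch below shows in Python how continuous EEG can be cut into epochs time-locked to each word, sorted by condition (e.g., content versus function words), and averaged within each condition so that activity unrelated to the stimulus cancels out. The channel count, epoch window, sampling rate, and function names here are assumptions chosen for illustration, not parameters of the published experiments.

```python
# Illustrative sketch of event-related potential (ERP) averaging: segment
# continuous EEG into epochs time-locked to word onsets, baseline-correct each
# epoch, and average the epochs within each stimulus condition.

import numpy as np


def extract_erps(eeg, event_samples, event_labels, pre=100, post=600):
    """eeg: (n_channels, n_samples) array of continuous EEG (1 sample = 1 ms).
    event_samples: sample index of each word onset.
    event_labels: condition label for each event (e.g., 'content', 'function').
    Returns {condition: (n_channels, pre + post) averaged waveform}."""
    epochs = {}
    for onset, label in zip(event_samples, event_labels):
        segment = eeg[:, onset - pre:onset + post]
        # Baseline-correct each epoch using the pre-stimulus interval.
        segment = segment - segment[:, :pre].mean(axis=1, keepdims=True)
        epochs.setdefault(label, []).append(segment)
    return {label: np.mean(segs, axis=0) for label, segs in epochs.items()}


# Hypothetical usage with simulated data: 32 channels, 1000 Hz, 10 s of EEG.
rng = np.random.default_rng(1)
eeg = rng.normal(0, 10, (32, 10_000))
onsets = np.arange(1_000, 9_000, 800)                  # word onsets (in ms)
labels = ['content', 'function'] * (len(onsets) // 2)
erps = extract_erps(eeg, onsets, labels)
print(erps['content'].shape)                           # (32, 700)
```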

The studies described above have focused on language and motor interactions in adults who stutter. This body of work lays the groundwork for the next phase of our research, which is to look at similar types of interactions in young children who stutter. Our research group (Anne Smith and Christine Weber-Fox from the Department of Audiology and Speech Sciences and Howard Zelaznik from the Department of Health and Kinesiology at Purdue University) very much appreciates the support from the National Institute on Deafness and Other Communication Disorders of the National Institutes of Health (Physiological Correlates of Stuttering, R01 00559).