He added that “the work promises to guide the design of new therapies for the 7.5 million Americans who have trouble using their voices, including those with apraxia (trouble planning speech movements) and aphasia (difficulty processing language), which can accompany conditions like autism or result from trauma caused by a stroke.”
For years, researchers tried to link speech functions to brain circuits using electroencephalography, or EEG, which places electrodes on the scalp. Such devices capture the rapid swings in electrical activity that occur as large groups of nerve cells “fire” to transmit signals.
But EEG could not pinpoint the location of nerve circuits with enough resolution, and functional magnetic resonance imaging (fMRI), another commonly used technology, was not fast enough to capture the activity patterns behind the conversational planning of replies, said the study authors.
These non-invasive methods leave a critical blind spot in the field’s ability to track what the brain does during everyday conversation, say the authors.
Another technology, electrocorticography (ECoG), overcomes these barriers by placing electrodes not on the scalp but directly on the surface of the brain. Fast, precise ECoG measurements revealed that the brain achieves natural conversation by combining the perception of what is heard, the planning of a reply, and the production, or articulation, of the sounds that make up words.
While other ECoG studies have identified the networks behind perception and production, the current study is the first to capture brain activity during the reply-planning phase between them, which has been the hardest to study, said the authors.
“Researchers can talk to patients and watch the activity of brain circuits as they talk or listen, but planning has no physical correlate,” said Long.
“When we combined ECoG measures with a technique that asks patients structured questions, we exposed an underlying planning network,” Long added.
To conduct the study, the research team placed electrodes on the brain surfaces of patients during surgeries to remove either a tumour or brain tissue causing epileptic seizures. In both cases, surgeons initially place patients under only local anaesthesia so they can identify the brain regions that become active as patients talk, thus averting damage to the patients’ speech centres.
Researchers placed arrays of ECoG electrodes on the language-dominant left brain hemispheres of eight patient volunteers. Next, they measured planning responses using a paradigm developed by another lab, the critical information (CI) task, which was designed to control the timing of planning. In each block of questions, a changing keyword, the CI, determines when reply planning starts so that brain activity can be tracked in that time window, as in these examples:
1. The opposite of soft is what common word?
2. The opposite of hot is what common word?
By changing each question’s wording to present the key information needed to start planning an answer earlier or later, the researchers were able to distinguish brain activity related to planning from activity related to perception and production. Importantly, the majority of cortical responses were related to only one of these three speech processes, showing that the networks for each function were largely separate.
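To make the logic of the task concrete, here is a minimal sketch, not taken from the study itself, of how activity on each electrode might be aligned to the task’s three event markers and attributed to the phase that drives it. The sampling rate, event times, window length, `phase_response` helper, and the synthetic data are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 100  # assumed sampling rate (samples per second)

# Synthetic activity for a few electrodes over one 10-second trial.
# In the study this would be ECoG recordings; here it is random data.
n_electrodes, n_samples = 4, 10 * fs
power = rng.normal(0.0, 1.0, size=(n_electrodes, n_samples))

# Hypothetical event times (seconds) for one trial of a CI-style task:
# the question begins, the critical information (CI) is heard, and the
# patient starts to speak. Planning is assumed to occupy the CI-to-speech gap.
events = {"perception": 0.5, "planning": 3.0, "production": 6.0}
window = 1.0  # seconds of activity averaged after each event

def phase_response(signal, onset_s, fs, window_s):
    """Mean activity in a window following an event onset."""
    start = int(onset_s * fs)
    stop = start + int(window_s * fs)
    return signal[start:stop].mean()

# Label each electrode by the phase with the strongest mean response.
for i in range(n_electrodes):
    scores = {phase: phase_response(power[i], t, fs, window)
              for phase, t in events.items()}
    best = max(scores, key=scores.get)
    print(f"electrode {i}: strongest response during {best!r} ({scores[best]:.2f})")
```

The point the CI task exploits is that moving the keyword earlier or later shifts the onset of the planning window independently of question onset and speech onset, so electrodes that track the CI, rather than the stimulus or the articulation, can be attributed to planning.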
In addition, the researchers found that 95.5 per cent of planning electrodes were clustered in a spatially distinct region of the brain, with most centred in the caudal inferior frontal gyrus (cIFG) and the caudal middle frontal gyrus (cMFG). While the cIFG, commonly known as “Broca’s region,” has long been known to be important for language, a role for the cMFG had not previously been established.
Furthermore, the team found that the planning network identified with the CI task is also active when patients prepare to speak during natural, unscripted conversations. After patients finished answering the structured questions, researchers engaged them in several minutes of casual back-and-forth conversation, during which the same patterns related to perception, planning, and speaking appeared in the patients’ brain activity.
“This study provides a first description of the specific brain mechanisms that generate language as we speak in natural, everyday contexts,” said Gregg Castellucci, PhD, a postdoctoral fellow in Long’s lab.
“Crucially, the brain mapping we found using simple, controlled tasks held up in tests of natural human behaviour,” he added.
Along with Long and Castellucci, study authors were Christopher Kovach, Matthew Howard III, and Jeremy Greenlee of the Department of Neurosurgery at the University of Iowa. Funding was provided by National Institutes of Health grants R01 NS113071 and R01 DC015260, as well as by the Simons Collaboration on the Global Brain.