September 4, 2007 Features

Neuroimaging and Cochlear Implants

A Look at How the Brain Hears

Hearing scientists use neuroimaging techniques, which examine how the brain responds to external acoustic stimuli, to evaluate auditory processing. Investigators may probe how the brain responds to simple manipulations of frequency, intensity, or duration of the stimuli, or they may investigate how the brain responds to more complex listening conditions such as the appreciation of music, the sounds of familiar or unfamiliar languages, or the perception of speech signals in noise.

A growing literature focuses on learning how individuals with normal hearing or hearing loss process the sounds in their environment and the sounds needed to communicate. Neuroimaging techniques are used, for example, to examine processes contributing to tinnitus and other troublesome auditory conditions, and to examine conditions in which an acoustic signal is transposed into an electrical stimulus, as with a cochlear implant (CI). Several studies of individuals with cochlear implants concentrate on how the brain uses cortical resources to perceive speech or music. This information helps determine ways to improve the integrity of an electrical signal representing speech and to train people with cochlear implants to improve their listening performance.

Speech-perception performance in individuals using cochlear implants varies greatly. Some individuals receive significant benefit when listening in both quiet and noisy situations, but others receive little benefit even when listening in quiet environments. Individuals with severe-to-profound sensorineural hearing losses typically have a reduced number of intact hair cells within the cochlea, which affects the status of the spiral ganglion cells and the transmission of information through the brainstem to the cortex. As a result, many individuals with sensorineural hearing loss experience "bottom-up" difficulties processing sound that are directly related to the anatomical and physiological condition of the peripheral auditory system.

Individuals with long-standing auditory deprivation related to sensorineural hearing loss also may experience "top-down" difficulties processing sounds. "Top-down" processing, in its most general form, refers to cognitive strategies used by listeners to decipher messages. For example, a listener might use the context of a sentence to fill in or guess the meaning of an unintelligible word in order to understand the complete sentence. For individuals using CIs, "top-down" processing may be evaluated through electrophysiological techniques or functional brain-imaging techniques.

Neuroimaging studies of individuals who have cochlear implants use two major techniques, positron emission tomography (PET) and single photon emission computed tomography (SPECT), to examine cortical responses to speech perception tasks. SPECT imaging of cortical responses to auditory signals in candidates for cochlear implantation demonstrates drastically reduced responses that reflect limitations in the peripheral auditory system.

Studies indicate that the overall resting metabolism of persons with long-standing deafness differs from that of normal-hearing individuals. Elevated metabolic rates are associated with longer periods of auditory deprivation, as in the case of individuals with prelingual deafness, or with individuals who have less residual hearing. Individuals who are deaf demonstrate cortical responses to acoustic inputs that are characterized by a reduction in the extent and amount of activation in the primary and association auditory cortices. In determining candidacy for cochlear implantation, SPECT imaging may assist in determining the better ear for implantation by revealing the strongest, most intact pathway activated during auditory stimulation when all routine audiometric findings are equal across the two ears.

Following cochlear implantation, cortical metabolism appears to be responsive to electrical stimulation, and alterations in cortical activations are observed in response to speech signals, pure tones, unfamiliar languages (such as a foreign language), and nonsense sounds. Our work and that of others demonstrates a reduction in the amount and extent of cortical activations, both before and after cochlear implantation, relative to individuals with normal hearing. These reductions appear to reflect our inability to completely restore the peripheral auditory system to a level comparable to that of individuals with normal hearing. Individuals with a cochlear implant demonstrate greater responses in the primary and association cortices contralateral to the ear of implantation than they did before implantation.

The research suggests a strong relationship between the level of speech perception performance achieved with cochlear implantation and the activation patterns associated with primary auditory and auditory association cortices. High levels of speech perception after cochlear implantation are associated with bilateral activation of the primary auditory cortex (Brodmann Areas 41 and 42) and the auditory association cortex (Brodmann Areas 21, 22, and 38). However, low levels of speech perception after cochlear implantation are associated only with primary auditory cortex activation contralateral to the ear of implantation. These findings are consistent with the results of animal studies indicating that the auditory periphery, brainstem, and cortex are responsive to electrical stimulation from cochlear implants.

Enhancing Auditory Plasticity

A few studies use neuroimaging techniques to evaluate the "treatment" of the brain of individuals with a cochlear implant in order to maximize the potential of "top-down" processing. We were interested in determining how "plastic," or flexible, the brain remains in individuals who perform at low levels and show minimal activation patterns following cochlear implantation. One approach to examining how the minimal activation patterns could be enhanced is to provide a pharmacological stimulant known to increase cortical activations during cognitive tasks in combination with clinical intervention. Our initial focus on improving "top-down" processing paired the administration of a pharmacological stimulant (d-amphetamine) with intensive audiologic/aural habilitation (AR). Adult cochlear implant users were randomly assigned to either a treatment group (d-amphetamine plus AR) or a control group (placebo plus AR). The AR program was held twice a week for 1.5 hours; sessions featured similar content, although the level of sophistication of the activities was designed to meet individual needs.

We examined auditory-only speech tracking before and after the eight-week intervention program. SPECT scans were collected over a similar time frame, contrasting cortical activations produced when watching and listening to a reading versus watching only. Contrasting these conditions involves subtracting the watching-only image from the watching-and-listening image, leaving only the listening responses.

Intensive AR with the placebo resulted in a gain of approximately 14% in auditory-only speech tracking scores. When AR was paired with d-amphetamine, however, auditory-only speech tracking scores showed gains of 43%. Prior to intervention, similar cortical activations were observed for both groups, but following the treatment program, the placebo group increased activations bilaterally in the transverse and superior temporal gyri. The treatment group had substantially greater increases in activation in the superior temporal gyrus ipsilateral to the implanted ear and more widespread activations contralateral to the ear of implantation, including in primary auditory and association-area cortices.

These initial observations are encouraging in two respects. First, they confirm that the brains of older individuals remain responsive to change guided by traditional AR or AR paired with pharmacological enhancement. Second, intervention outcomes are observable both in behavioral and objective measures of cortical activations. While promising, these observations also should be viewed with caution. We are only at the very initial stages of exploring how AR interfaces with changes in cortical activity. Many meticulous studies that carefully manipulate variables will be needed before these techniques can be considered for clinical use.  

Emily A. Tobey is professor and Nelle C. Johnston Chair in Early Childhood Communication at the Callier Advanced Hearing Research Center of the University of Texas at Dallas and adjunct professor in the departments of Radiology and Otolaryngology—Head and Neck Surgery, University of Texas Southwestern Medical Center. Contact her at etobey@utdallas.edu.

Michael D. Devous, Sr., is professor and associate director of the Nuclear Medicine Center, Department of Radiology, University of Texas Southwestern Medical Center, Dallas. Contact him at Michael.Devous@utsouthwestern.edu.

cite as: Tobey, E. A., & Devous, M. D., Sr. (2007, September 4). Neuroimaging and Cochlear Implants: A Look at How the Brain Hears. The ASHA Leader.

Are There Risks for Cochlear Implant Patients with Magnetic Resonance Imaging?

Brain imaging with fMRI cannot be used with individuals who have a cochlear implant. The electromagnetic fields generated in a structural or functional MRI investigation (static magnetic fields, static gradients, switched gradients, and radio frequency pulses) interfere with cochlear implants. These fields may induce heat, damaging the tissue surrounding the internal device; evoke voltages that may damage the internal portions of the device; or dislodge the surgically implanted device through force and torque on its magnetic parts. The internal magnet of the implant causes artifacts in the structural MR image (up to 6 cm in circumference), and the magnet may be partially demagnetized, resulting in reduced functionality.

Although cochlear implant manufacturers continue to develop equipment upgrades that allow cochlear implant users to obtain structural MRIs for medical emergencies and diagnostic needs, functional MRI still cannot be used to assess the response of the central auditory system to electrical stimulation. Guidelines for magnetic resonance imaging safety are available at the Web sites of cochlear implant manufacturers. SPECT and PET, however, offer several advantages for assessing cortical activations in cochlear implant users because they do not pose the hazards associated with functional MRI.



Neuroimaging Techniques: How They Work and What They Measure

Functional brain-imaging technology includes six major techniques: functional magnetic resonance imaging (fMRI), positron emission tomography (PET), single photon emission computed tomography (SPECT), magnetoencephalography (MEG), magnetic resonance spectroscopy (MRS), and topographic electroencephalography (TEEG).

These imaging techniques measure the following:

  • Glucose metabolism during PET scans, conducted with a form of glucose detectable by the PET scanner
  • Regional cerebral blood flow (rCBF) during PET scans, conducted with a detectable form of water, or during SPECT scans with various tracers
  • The blood-oxygen-level-dependent (BOLD) signal, which reflects both regional cerebral blood flow and regional cerebral blood volume, during fMRI scans. (See the sidebar for more information about the risks of MRI for cochlear implant users.)

When designing studies using these techniques, clinicians and researchers consider the spatial and temporal resolutions associated with the images. Spatial resolutions may be as small as 1–2 mm when using fMRI scans or as great as 8–12 mm when using PET scans conducted with labeled water.

Temporal resolution refers to the length of stimulus presentation needed to sample cortical responses to a task. Image acquisition may be very quick, as in the case of fMRI (less than 100 milliseconds), or long—30 minutes—for PET scans of glucose metabolism.

The test environment also is an important factor. PET and fMRI scans typically are run in scanners, and fMRI scans produce very loud noises and require heavy acoustic shielding. SPECT scans, on the other hand, may be run in a quiet, acoustically controlled environment.

For studies with individuals who use cochlear implants, fMRI is not an option. Therefore, the spatial and temporal resolution of SPECT or PET most influence study-design issues. SPECT provides the greatest control of the study environment but limits investigators to only one study (experimental condition) per day. PET scans with labeled water provide the greatest experimental flexibility in that six or more study conditions (cognitive conditions) can be examined in one session in the scanner. The temporal resolution is similar for both techniques, about 60 sec for rCBF with PET, and about 30 sec for rCBF with SPECT. Also, the spatial resolution is similar, though generally a little better for SPECT—about 6–8 mm for SPECT and 8–12 mm for PET.
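The trade-offs above can be summarized in a small data structure. This Python sketch is purely illustrative: the dictionary layout and selection logic are our own, and the numbers are the approximate values quoted in the text.

```python
# Approximate modality properties quoted in the text; the dictionary
# encoding and the selection step are illustrative assumptions.
modalities = {
    "fMRI":  {"spatial_mm": (1, 2),  "temporal_s": 0.1, "ci_safe": False},
    "PET":   {"spatial_mm": (8, 12), "temporal_s": 60,  "ci_safe": True},
    "SPECT": {"spatial_mm": (6, 8),  "temporal_s": 30,  "ci_safe": True},
}

# For cochlear implant users, fMRI is excluded; among the remaining
# options, choose the modality with the finest spatial resolution.
ci_options = {name: m for name, m in modalities.items() if m["ci_safe"]}
best = min(ci_options, key=lambda name: ci_options[name]["spatial_mm"][0])
print(best)  # SPECT
```

The same structure could just as easily select for experimental flexibility (PET's six or more conditions per session) rather than resolution.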

Functional brain-imaging studies require a contrast between cortical function measured under at least two test conditions. The conditions may be simple—eyes opened or eyes closed—or may be more complex manipulations that tease apart the subtleties of auditory perceptions. For example, many of our studies require a contrast between watching and listening to a recording of a person reading a story versus only watching a person reading a story. The listening portion of the test may be varied to present signals to two ears, one ear, or to present manipulated (filtered, etc.) sounds in some fashion.

The colored blobs typically observed in functional brain images may represent simple mathematical operations (e.g., subtracting images acquired in two conditions to obtain a difference score) or more sophisticated statistical constructs regulating the power or reliability of the observations by setting the probability levels and size of the effects.
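A minimal sketch of the simple subtraction operation described above, using synthetic arrays in place of real scan data; the grid size, signal values, and the two-standard-deviation threshold are all assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic rCBF images (64 x 64 x 32 voxels) standing in for scans
# averaged within each condition: watch-and-listen vs. watch-only.
watch_listen = rng.normal(50.0, 5.0, size=(64, 64, 32))
watch_only = rng.normal(50.0, 5.0, size=(64, 64, 32))

# Subtract the watch-only image from the watch-and-listen image to get
# a difference map reflecting responses attributable to listening.
difference = watch_listen - watch_only

# Keep only voxels whose difference exceeds a criterion (here, two
# standard deviations of the map): a crude stand-in for the statistical
# thresholding that regulates the power and reliability of the result.
z = (difference - difference.mean()) / difference.std()
active = np.abs(z) > 2.0
print("voxels surviving threshold:", int(active.sum()))
```

In real studies the thresholding is far more sophisticated, controlling probability levels and effect sizes across many scans, but the underlying contrast is this voxel-by-voxel difference.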

PET and SPECT studies also involve the injection of a radioisotope that is taken up by the brain either fairly rapidly (SPECT scans integrate activity over about 20 seconds) or more slowly (PET scans with labeled water integrate activity over about two minutes). The physical half-life of the radioisotopes also varies, from fairly long (>6 hours) to relatively short (<1 minute). Thus, the temporal and spatial resolution of neuroimaging studies is based on the careful control of the stimulus to be studied, how an individual responds to the stimulus, the statistical manipulation of the data, and the test environment.




Are There Radiation Risks Associated with SPECT or PET Brain Imaging?

Many people believe radiation is harmful at any level and that any medical imaging procedure involving radiation exposure will increase risk. Medical practice and research expend great effort (and cost) to minimize radiation exposure for patients, research volunteers, and workers. Some individuals argue that fMRI imaging sequences should be used instead of nuclear medicine techniques, such as SPECT or PET, in order to avoid the perceived radiation risk of radioactive tracer techniques. Often this fear of radiation is fostered by the widespread regulatory use of the "Linear No-Threshold Theory" (described below, in lay language) in assessing radiation risk. In fact, SPECT and PET brain imaging procedures have no more known risk than MRI-based techniques. Below, we briefly review the data regarding low-level radiation and risk.

The "Linear No-Threshold Theory" is a working hypothesis that originated in the 1950s as a prudent operational guideline for describing radiation effects. Unfortunately, many people take it as fact, even though there is no scientific evidence to support it. The theory makes a large assumption: that measures of mortality, disease, or tissue injury caused by the very high radiation exposures associated with nuclear accidents, atomic bomb exposures, or intentional radiation therapy can be extrapolated down by many orders of magnitude to the much lower exposures incurred in diagnostic imaging. In other words, the hypothesis assumes that risk declines in direct proportion to dose, so that "zero risk" is reached only at "zero exposure."

This interpretation is equivalent to assuming that if a car hitting a wall 10,000 times in a row at 200 miles per hour kills the driver 100% of the time (and, thus, produces 10,000 deaths), then a car hitting the wall 10,000 times in a row at 2 miles per hour (1/100th the original velocity) will kill 100 drivers (1/100th the original number of deaths). The "Linear No-Threshold Theory" assumes that even a single gamma ray stopping in the human body carries some risk. However, most published, peer-reviewed data suggest that risk goes to zero at radiation exposure levels well above those incurred in diagnostic procedures for both adults and children.
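The linear extrapolation in the car analogy can be written out as arithmetic. This Python function is purely illustrative; the numbers come from the analogy, not from any radiation data.

```python
def lnt_deaths(speed_mph: float, crashes: int = 10_000,
               reference_speed_mph: float = 200.0,
               deaths_at_reference: int = 10_000) -> float:
    """Scale predicted deaths linearly with 'dose' (speed), exactly as
    the linear no-threshold hypothesis scales risk with exposure."""
    per_crash_risk = (deaths_at_reference / 10_000) * (speed_mph / reference_speed_mph)
    return per_crash_risk * crashes

# At 200 mph, 10,000 crashes kill the driver every time: 10,000 deaths.
print(lnt_deaths(200.0))  # 10000.0

# Extrapolating to 1/100th the speed predicts 1/100th the deaths,
# even though a 2-mph bump is plainly survivable every time.
print(lnt_deaths(2.0))  # 100.0
```

The point of the sketch is the proportionality itself: halving the exposure halves the predicted harm, with no threshold below which the predicted risk drops to zero.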

In 1996, the Health Physics Society issued a policy statement indicating that "health risks are either too small to be observed or are nonexistent" for radiation exposures below 10 roentgen equivalent man (rem), although there is substantial and convincing scientific evidence for health risks at high doses. The whole-body dose of a typical SPECT brain imaging procedure is about 0.01 rem—a thousand times lower than the dose discussed by the Health Physics Society. The Office of the Clinical Director of the National Institutes of Health (NIH) states, "The risk of increased rates of cancer after low-level radiation exposure is not supported by population studies of health hazards from exposure to background radiation, radon in homes, radiation in the workplace or radiotherapy. Compared to the frequency of daily spontaneous genetic mutations, the biologic effect of low-level radiation at the cellular level seems extremely low. Furthermore, the potentiation of cellular repair mechanisms by low-level radiation may result in a protective effect from subsequent high-level radiation" [italics are our emphasis]. The NIH concludes its risk review by saying: "Health risks from low-level radiation could not be detected above the 'noise' of adverse events of everyday life. In addition, no data were found that demonstrated higher risks with younger ages at low-level radiation exposure." Indeed, there are no data that demonstrate harm to humans from the radiation exposures used in diagnostic imaging.

We italicize the portion of the NIH statement above that refers to a potential protective effect from low levels of radiation. This potential protective effect is based on a concept known as radiation hormesis. In contrast to the "all radiation is bad" perspective associated with the "Linear No-Threshold Theory," current data on the effects of low-level radiation exposure support the presence of radiation hormesis, which suggests that low levels of radiation exposure in humans appear to induce beneficial effects of cellular repair and immune system enhancement. This concept builds on the fact that humans evolved in a background radiation environment historically many times higher than is currently present. Current background radiation levels vary by an order of magnitude across geographic regions, without any indication that individuals living in lower-background regions have lower cancer prevalence than individuals in higher-background areas. Data from the United States, China, India, Austria, and the United Kingdom show that populations living in areas with higher background radiation have increased longevity and decreased cancer death rates.


References

Azzam, E. I., de Toledo, S. M., Raaphorst, G. P., & Mitchel, R. E. (1996). Low-dose ionizing radiation decreases the frequency of neoplastic transformation to a level below the spontaneous rate in C3H 10T1/2 cells. Radiation Research, 146, 369-373.

Bogen, K. T., & Layton, D. W. (1998). Risk management for plausibly hormetic environmental carcinogens: the case of radon. Human and Experimental Toxicology, 17, 463-467.

Bond, V. P., Wielopolski, L., & Shani, G. (1996). Current misinterpretations of the linear no-threshold hypothesis. Health Physics, 70, 877-882.

Cohen, B. L. (1995). How dangerous is low level radiation? Risk Analysis, 15, 645-653.

Cohen, B. L. (1995). Test of the linear-no threshold theory of radiation carcinogenesis for inhaled radon decay products. Health Physics, 68, 157-174.

Fry, R. J., Grosovsky, A., Hanawalt, P. C., Jostes, R. F., Little, J. B., Morgan, W. F., et al. (1998). "The Impact of Biology on Risk Assessment" workshop of the National Research Council's Board on Radiation Effects Research. July 21–22, 1997, National Academy of Sciences, Washington, DC. Radiation Research, 150, 695-705.

Jaworowski, Z. (1997). Beneficial effects of radiation and regulatory policy. Australian Physical & Engineering Sciences in Medicine, 20(3), 125-138.

Little, M. P. & Muirhead, C. R. (1998). Curvature in the cancer mortality dose response in Japanese atomic bomb survivors: absence of evidence of threshold. International Journal of Radiation Biology, 74, 471-480.

Mossman, K. L. (1998). The linear no-threshold debate: where do we go from here? Medical Physics, 25, 279-284.

Nussbaum, R. H. (1998). The linear no-threshold dose-effect relation: is it relevant to radiation protection regulation? Medical Physics, 25, 291-299.

Pollycove, M. (1995). The issue of the decade: hormesis. European Journal of Nuclear Medicine, 22, 399-401.

Pollycove, M. (1998). Nonlinearity of radiation health effects. Environmental Health Perspectives, 106 Suppl 1, 363-368.

Roland, P. S., Tobey, E. A., & Devous, M. D., Sr. (2001). Preoperative functional assessment of auditory cortex in adult cochlear implant users. Laryngoscope, 111, 77-83.

Ron, E. (1998). Ionizing radiation and cancer risk: evidence from epidemiology. Radiation Research, 150, S30-S41.

Tobey, E. A., Devous, M. D., Sr., Buckley, K., Cooper, W. B., Harris, T. S., Ringe, W., et al. (2004). Functional brain imaging as an objective measure of speech perception performance in adult cochlear implant users. International Journal of Audiology, 43 Suppl 1, S52-S56.

Tobey, E. A., Devous, M. D., Sr., Buckley, K., Overson, G., Harris, T., Ringe, W., et al. (2005). Pharmacological enhancement of aural habilitation in adult cochlear implant users. Ear and Hearing, 26, 45S-56S.

Walker-Batson, D., Wendt, J. S., Devous, M. D., Sr., Barton, M. M., & Bonte, F. J. (1988). A long-term follow-up case study of crossed aphasia assessed by single-photon emission tomography (SPECT), language, and neuropsychological testing. Brain and Language, 33, 311-322.



  
