December 26, 2007 Feature

Digital Hearing Aids

Wheelbarrows to Ear Inserts


Hearing aids are no longer devices for simply amplifying sound. The introduction of digital technology in hearing aids has brought about substantial changes, not only in design, but also in basic function. The first all-digital hearing aid developed 25 years ago ushered in major advances.

In The Beginning

The first electric hearing aids in the 19th century were large, awkward instruments that sat on a table. Introduction of miniature electron tubes permitted development of wearable hearing aids. Invention of the transistor led to development of hearing aids small enough to fit behind the ear, and ongoing miniaturization of solid-state electronics resulted in hearing aids small enough to fit in the ear canal.

The transformation of hearing aids began with digital computers. In the early 1960s researchers at the Bell Telephone Laboratories developed methods for analyzing and processing speech and other audio signals on a large mainframe computer. This research produced convenient methods of simulating complex speech transmission systems, such as the voice coders (vocoders) for use on the transatlantic cable.

Simulating hearing aids on a digital computer was difficult because the machines were slow—processing an audio speech signal took much longer than the signal's duration. The mainframe computer needed was very large; it seemed inconceivable then that digital sound processing would be possible in a microcomputer small enough to wear on the ear. Nevertheless, the research was valuable in providing new insights about how to process sounds for people with hearing loss.

Introduction of minicomputers in the mid- to late 1960s opened the door to studies of real-time signal processing for people with hearing loss. Although not fast enough to process audio signals digitally in real time, the machines were used to control conventional analog equipment for processing audio signals in real time. Computer-controlled analog systems were widely used in the 1970s for studying new forms of amplification for people with hearing loss.

Amplitude Compression

Edgar Villchur, who invented the Acoustic Research bookshelf loudspeaker, introduced multi-channel amplitude compression for hearing aids. The incoming audio signal is subdivided into a series of contiguous frequency bands, and the gain in each band is adjusted so that intense sounds receive less amplification while weak sounds receive more. Much research in the 1970s focused on this configuration, which would later become the basic architecture of the first hearing aids to employ digital technology.
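
For readers who want a concrete picture, the sketch below illustrates the idea in a few lines of Python. The band edges, compression threshold, ratio, and gain values are purely illustrative assumptions and do not correspond to any particular instrument.

```python
# Illustrative multi-band amplitude compression; band edges, threshold, and
# ratio are made-up values, not those of any particular hearing aid.
import numpy as np
from scipy import signal

FS = 16_000                      # sample rate (Hz)
BAND_EDGES = [250, 1000, 4000]   # hypothetical crossover frequencies (Hz)
THRESH_DB = -40.0                # compression threshold (dB re full scale)
RATIO = 3.0                      # compression ratio above threshold
BASE_GAIN_DB = 20.0              # linear gain applied to weak sounds

def band_filters(edges, fs):
    """Contiguous filter bank: low-pass, band-pass(es), high-pass."""
    sections, lo = [], None
    for hi in edges:
        if lo is None:
            sections.append(signal.butter(4, hi, "lowpass", fs=fs, output="sos"))
        else:
            sections.append(signal.butter(4, [lo, hi], "bandpass", fs=fs, output="sos"))
        lo = hi
    sections.append(signal.butter(4, lo, "highpass", fs=fs, output="sos"))
    return sections

def compress(x, fs=FS):
    """Less gain for intense sounds, more gain for weak sounds, in each band."""
    out = np.zeros_like(x, dtype=float)
    smoother = signal.butter(2, 50, "lowpass", fs=fs, output="sos")
    for sos in band_filters(BAND_EDGES, fs):
        band = signal.sosfilt(sos, x)
        # Slowly varying level estimate (smoothed RMS) in dB.
        env = np.sqrt(np.maximum(signal.sosfilt(smoother, band ** 2), 1e-12))
        level_db = 20 * np.log10(env)
        gain_db = np.full_like(level_db, BASE_GAIN_DB)
        above = level_db > THRESH_DB
        # Above threshold, each extra dB of input yields only 1/RATIO dB of output.
        gain_db[above] -= (level_db[above] - THRESH_DB) * (1 - 1 / RATIO)
        out += band * 10 ** (gain_db / 20)
    return out
```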

In 1975, Daniel Graupe reported development of a six-channel hearing aid with digital control of the gain in each channel. Graupe also developed a digital chip that automatically adjusted the gain in frequency channels containing high noise levels. This chip, the Zeta Noise Blocker, was incorporated in several conventional hearing aids during the 1980s. In 1979 Mangold and Leijon, using digital control of analog components, developed a programmable multi-channel compression hearing aid that had multiple memories. With the touch of a button, the electroacoustic characteristics of the hearing aid could be altered to provide appropriate amplification for different acoustic environments.
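
The general idea behind noise-adaptive channel gain can be sketched as follows. This is only a conceptual illustration, not the actual algorithm of the Zeta Noise Blocker; the percentile-based noise-floor estimate and the attenuation limits are assumptions made for the example.

```python
# Conceptual per-channel noise suppression: attenuate bands dominated by steady
# noise. An illustrative sketch, not the Zeta Noise Blocker's actual algorithm.
import numpy as np

def channel_gains(band_levels_db, floor_percentile=10, max_atten_db=12.0):
    """band_levels_db: array of shape (n_frames, n_channels) of short-term levels.

    The noise floor in each channel is approximated by a low percentile of its
    level history; channels whose current level sits close to that floor are
    assumed to carry mostly noise and are attenuated (up to max_atten_db)."""
    noise_floor = np.percentile(band_levels_db, floor_percentile, axis=0)
    snr_db = band_levels_db[-1] - noise_floor          # latest frame vs. floor
    # Full attenuation near 0 dB SNR, none at or above 12 dB SNR (linear ramp).
    atten = np.clip(1.0 - snr_db / 12.0, 0.0, 1.0) * max_atten_db
    return -atten                                      # gain adjustment in dB
```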

Friends With Wheelbarrows

High-speed digital-array processors suitable for use with minicomputers were developed in the 1970s. They were fast enough to process audio signals digitally in real time. In 1982, an all-digital hearing aid using array processing and operating in real time was developed at the City University of New York.

The equipment was relatively large (see photo, above left). The box-like units on the right consist of a minicomputer and a digital array processor. The small units on top are an FM radio transmitter and receiver. These units provide a radio link with a body-worn transmitter and receiver that, in turn, are linked by wire to an ear-worn microphone and hearing aid receiver or loudspeaker. As one wag put it, "It may be a good hearing aid, but you'll need a friend with a wheelbarrow to carry the instrument."

The array processor digital hearing aid was designed as a research tool for exploring the potential of digital signal processing. It could be used to simulate experimental hearing aids, without constructing an actual instrument, saving time and money. The array processor digital hearing aid served its design purpose well and was used as a research tool for more than a decade.

The early 1980s saw development of digital chips dedicated to high-speed digital signal processing. These "DSP" chips were fast enough to process speech and other audio signals in real time, but they were too large and consumed too much power to be used in wearable hearing aids. It was only a matter of time before their size and power consumption could be reduced to levels practical for a digital hearing aid.

Engebretson, Morley, and Popelka at the Central Institute for the Deaf (CID) began working on a digital aid in the early 1980s and were the first to develop a practical, wearable digital hearing aid with DSP chips. The CID group later joined with the 3M Company in a consortium to develop digital hearing aids. An experimental digital hearing aid using DSP chips was also developed by Nunley et al. in 1983.

In 1987, the Nicolet Corporation introduced the first commercial digital hearing aid. It consisted of a body-worn processor with a hardwire connection to ear-mounted transducers. A behind-the-ear (BTE) digital hearing aid was introduced two years later, shortly before the Nicolet Corporation withdrew from the hearing aid market. Although the Nicolet hearing aid was not commercially successful, it demonstrated the feasibility of an all-digital instrument and triggered a race among other companies to develop viable digital hearing aids.

Bell Laboratories developed a hybrid digital/analog hearing aid in which digital circuits controlled a two-channel compression amplifier. Although field trials showed the instrument to be superior to state-of-the-art hearing aids, AT&T, the parent company of Bell Laboratories, withdrew from the hearing aid industry. The Resound Corporation acquired rights to the AT&T hearing aid in 1987; after further refinement, the instrument was marketed and became an immediate success.

These developments brought radical change in the hearing aid industry. Within a few years every major hearing aid company introduced hybrid instruments in which analog amplifiers, filters, and limiters were controlled digitally. Although the audio signal was not digitized, these instruments provided many benefits, such as memory for storing parameter settings, the capability for paired-comparison testing, and convenient selection of appropriate parameter settings for different acoustic environments. These hearing aids also embodied much more sophisticated methods of signal processing, such as multi-channel compression.

The major hearing aid companies raced toward developing a commercially viable all-digital hearing aid. Widex introduced the Senso, the first commercially successful digital hearing aid, in 1996. The Oticon Company had already developed a digital hearing aid the year before and had distributed it to audiological research centers worldwide to foster independent research on digital technology in acoustic amplification. Oticon began marketing its digital hearing aid, the DigiFocus, immediately after introduction of the Senso.

New Prospects

The first generation of digital hearing aids used a fixed multi-channel architecture, similar in concept to that of the previous generation of hybrid digital/analog hearing aids. Early comparisons between digital and analog hearing aids showed only small improvements in performance for the digital instruments. The recent introduction of digital hearing aids with an open architecture allows innovative new designs that take greater advantage of the unique capabilities of digital processing.

Digital hearing aids are much more effective than analog hearing aids in reducing unstable acoustic feedback (such as loud whistling). Digital feedback cancellation also reduces constraints on the overall gain of a hearing aid, allowing greater use of open, more comfortable ear molds and more accurate control of the overall frequency-gain characteristic.
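
A common way to achieve this digitally is an adaptive filter that models the acoustic path from the receiver back to the microphone and subtracts its estimate of the feedback from the input. The sketch below shows that general idea with a basic normalized-LMS update; the tap count and step size are illustrative assumptions, and this is not the algorithm of any particular product.

```python
# Illustrative adaptive feedback canceller (NLMS), not any manufacturer's algorithm.
import numpy as np

def feedback_canceller(mic, out_history, n_taps=64, mu=0.05, eps=1e-8):
    """Estimate the receiver-to-microphone feedback path and remove it.

    mic:         microphone samples (wanted signal plus feedback), float array
    out_history: samples previously sent to the hearing aid receiver, float array
    Returns the 'cleaned' signal that would be amplified further."""
    w = np.zeros(n_taps)                 # current estimate of the feedback path
    cleaned = np.zeros_like(mic)
    for n in range(len(mic)):
        # Most recent n_taps output samples, newest first (zero-padded at start).
        x = out_history[max(0, n - n_taps + 1): n + 1][::-1]
        x = np.pad(x, (0, n_taps - len(x)))
        est_feedback = w @ x             # predicted feedback reaching the mic
        e = mic[n] - est_feedback        # what is left after cancellation
        w += mu * e * x / (x @ x + eps)  # normalized LMS adaptation step
        cleaned[n] = e
    return cleaned
```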

Difficulty understanding speech in noise is the most common complaint of hearing aid users, and considerable effort has focused on improving noise-reduction capabilities. The most successful development is hearing aids with adjustable directional characteristics, including instruments that recognize the direction of an interfering noise and attenuate the interference. This approach to noise reduction holds much promise.
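
One widely used arrangement, sketched below under the assumption of a simple two-microphone delay-and-subtract design, forms forward- and rearward-facing pickup patterns and adapts their mix so that the spatial null falls on the interfering noise. The microphone spacing and adaptation step are illustrative values, not those of a specific instrument.

```python
# Illustrative two-microphone adaptive directional processing: a rear-facing
# null is steered onto the interference by minimizing output power.
import numpy as np

def adaptive_directional(front, rear, fs=16_000, mic_spacing_m=0.012, mu=0.01):
    """front, rear: float arrays from the front and rear microphones (speech
    assumed to arrive from the front, interference from somewhere behind)."""
    delay = int(round(mic_spacing_m / 343.0 * fs))      # acoustic travel time (samples)
    pad = np.zeros(delay)
    # Forward- and backward-facing cardioid patterns from delay-and-subtract.
    c_front = front - np.concatenate((pad, rear))[:len(front)]
    c_back = rear - np.concatenate((pad, front))[:len(front)]
    beta, out = 0.0, np.zeros_like(front)
    for n in range(len(front)):
        out[n] = c_front[n] - beta * c_back[n]          # null position depends on beta
        # Adapt beta to minimize output power, i.e., steer the null onto the noise,
        # while keeping the null in the rear half-plane (beta between 0 and 1).
        beta = float(np.clip(beta + mu * out[n] * c_back[n], 0.0, 1.0))
    return out
```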

An intriguing application of automatic sound recognition is identifying and amplifying the important information-bearing components of the speech signal. This idea is not new, but digital signal processing provides the means for implementing it in a practical hearing aid.

Real-World Testing

The results of field evaluations using a wearable prototype digital hearing aid were found to be substantially different from those obtained in the clinic. These differences were far greater than initially expected and led to reconsideration of how hearing aids should be evaluated. It was clear that data obtained using traditional methods of hearing aid evaluation in a clinic do not predict real-world performance. Differences in lifestyles affect hearing aid usage and the conditions under which the hearing aid is used.

Despite advancements in methods of digital signal processing, the most effective way to deal with background noise and room reverberation is a remote microphone located close to the speaker's mouth. However, a hard-wire or wireless link between the remote microphone and the hearing aid is not always convenient.

The impact of digital technology on wireless links in acoustic amplification is only beginning to be felt. The recent introduction of novel methods of linking hearing aids to digital wireless telephones and to the Internet using Bluetooth holds considerable promise. Because data and control signals, as well as audio, can be transmitted over these links, they allow convenient monitoring of the acoustic environment, efficient data logging, and remote, automatic adjustment of a hearing aid by a powerful central processor.

The development of digital hearing aids began slowly, but increased rapidly as advances in digital technology gained momentum. Each new advance brought new ways of processing audio signals and improved methods of acoustic amplification. More importantly, digital technology introduced new perspectives on the fundamentals of acoustic amplification. Today, the hearing aid is far more complex, combining amplification with advanced signal processing for speech enhancement, noise reduction, self-adapting directional inputs, feedback cancellation, data monitoring, and acoustic scene analysis, as well as a wireless link with other communication systems.  

Preparation of this paper was supported by Grant H133G050228 from the National Institute on Disability and Rehabilitation Research. The opinions expressed in this paper are those of the author and do not necessarily reflect those of the Department of Education. The author does not endorse any of the products mentioned in this paper.

Portions of this paper are based on material presented at the Fourth Joint Meeting of the Acoustical Society of America and the Acoustical Society of Japan, Honolulu, Hawaii, Nov. 30, 2006. 

Harry Levitt is distinguished professor emeritus, The City University of New York. He developed methods of adaptive testing that are widely used in audiology, speech perception, and psychophysics; methods and instrumentation for teaching speech to deaf children; and new methods of signal processing for hearing aids and other assistive devices. Contact him at harrylevitt@earthlink.net.

cite as: Levitt, H. (2007, December 26). Digital Hearing Aids: Wheelbarrows to Ear Inserts. The ASHA Leader.

References

Bernard Becker Medical Library, Washington University School of Medicine. (2002). Deafness in disguise: Timeline of hearing devices and early deaf education. Accessible at: http://beckerexhibits.wustl.edu/.

Graupe, D., & Causey, G. D. (1975). Development of a hearing aid system with independently adjustable subranges of its spectrum using microprocessor hardware. Bulletin of Prosthetics Research, 12, 241-242.

Graupe, D., Grosspitch, J., & Taylor, R. A. (1986). A self-adaptive noise filtering system. Hearing Instruments, 37(9), 29-34; 37(10), 46-50.

Heide, V. H. (1994). Nicolet Project Phoenix, Inc. 1984-1989. The development of a wearable digital signal processing hearing aid. In R. E. Sandlin (Ed.), Understanding digitally programmable hearing aids. Needham Heights, MA: Allyn and Bacon.

Levitt, H. (2007). A historical perspective on digital hearing aids: How digital technology has changed modern hearing aids. Trends in Amplification, 11(1), 7-24.

Mangold, S., & Leijon, A. (1979). Programmable hearing aid with multi-channel compression. Scandinavian Audiology, 8, 121-126.

Pluvinage, V. (1994). Rationale and development of the Resound system. In R. E. Sandlin (Ed.), Understanding digitally programmable hearing aids (pp. 15-39). Needham Heights, MA: Allyn and Bacon.

Sammeth, C. A. (1990). Current availability of digital and hybrid hearing aids. Seminars in Hearing, 11(1), 91-99.

Sandlin, R. E. (Ed.). (1994). Understanding digitally programmable hearing aids (pp. 275-313). Needham Heights, MA: Allyn and Bacon.

Villchur, E. (1973). Signal processing to improve speech intelligibility in perceptive deafness. Journal of the Acoustical Society of America, 53(6), 1646-1657.

Waldhauer, F. (1988). History of the AT&T Bell Laboratories hearing enhancement venture: 1983-1987. Audecibel. 



  
