The recently released WIDEX MOMENT uses AI to learn how users prefer to hear their surroundings, comparing their choices against millions of settings stored in the cloud to personalize the listening experience. It’s one of a growing number of hearing aids built around new technologies. “WIDEX MOMENT with PureSound addresses one of the great unsolved challenges for hearing aid users: no matter how good the sound, it still sounds artificial, like you’re listening to a recording of your voice rather than how it sounded before your hearing was impaired,” Kerrie Coughlin, vice president of marketing at Widex, said in an email interview. “In other words, regardless of technological capability, hearing aids have always sounded like hearing aids.”
We Can’t Hear You
The MOMENT and other high-tech hearing aids address an issue that many Americans face. About one in three people in the U.S. between the ages of 65 and 74 have hearing loss, and nearly half of those older than 75 have difficulty hearing, according to the National Institute on Deafness and Other Communication Disorders.

Traditional hearing aids can be life-changing, but they have their limits. Because sound is processed inside the hearing aid, it reaches the eardrum slightly later than the sound that travels directly through the ear itself. When these two out-of-sync signals mix, the result sounds artificial. To prevent this, the MOMENT uses a parallel processing path that the company claims handles sound 10 times faster than other devices, cutting processing latency to 0.5 milliseconds.

“Many hearing aids produce a signal that is unfamiliar to the brain, forcing you to re-learn how to hear,” Coughlin said. “Much of this is due to the delay that happens during the processing of sound. WIDEX MOMENT with PureSound delivers a truer signal with minimal delay, so your brain recognizes the signal, and you recognize the sound.”

AI also boosts the MOMENT’s performance, Coughlin said. The software learns from each user’s settings how they prefer to hear their surroundings, then searches millions of user settings stored in the cloud to further personalize the listening experience.
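The “out of sync” mixing described above is a classic comb-filtering effect: a delayed copy summed with the direct sound cancels at regularly spaced frequencies. A minimal sketch of where those cancellation notches land, assuming a simple equal-amplitude delay-and-mix model (not Widex’s actual signal path; the function name is illustrative):

```python
def comb_notch_freqs(delay_ms, n_notches=6):
    """For y(t) = x(t) + x(t - d), the magnitude response is
    |1 + exp(-j*2*pi*f*d)| = 2*|cos(pi*f*d)|, which falls to zero
    (a comb-filter notch) at f = (2k + 1) / (2d) hertz."""
    delay_s = delay_ms / 1000.0
    return [(2 * k + 1) / (2 * delay_s) for k in range(n_notches)]

# A typical ~5 ms digital hearing aid delay puts notches right in
# the middle of the speech band:
print([round(f) for f in comb_notch_freqs(5.0)])
# → [100, 300, 500, 700, 900, 1100]

# At the 0.5 ms latency Widex claims, the first notch sits ten
# times higher and the notches are spaced ten times farther apart:
print([round(f) for f in comb_notch_freqs(0.5)])
# → [1000, 3000, 5000, 7000, 9000, 11000]
```

A shorter delay pushes the first notch up in frequency and widens the spacing, so far less of the speech spectrum gets carved up — one reason low latency makes the mixed signal sound less artificial.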
Manufacturers Jump on AI
The MOMENT isn’t the only hearing aid to use AI. Last week at the Consumer Electronics Show, Oticon launched the Oticon More, which carries an onboard deep neural network (DNN). The network is trained on 12 million sounds, so it processes speech in noise more like the human brain does, the company claims.

“The DNN in Oticon More has learned the way the brain learns, naturally over time,” Donald Schum, vice president of audiology at Oticon, said in a news release. “Every sound that passes through the hearing aid is compared to the results discovered in the learning phase. This enables Oticon More to provide a more natural, full, and precisely balanced sound scene, making it easier for the brain to perform optimally.”

There’s also the Orka One, another new hearing aid announced at CES, which claims to use AI to reduce background noise. The Orka One runs an AI neural network on a chip in the hearing aid’s case, the company says. The network identifies and suppresses distracting background sounds while enhancing human voices.

“AI denoise technology hearing aids can distinguish the background noise and enable users to hear low-frequency sounds clearly,” the company says on its website. “Thus, users have better hearing, and you can effortlessly converse with your family and friends in noisy areas.”

Artificial intelligence is quickly changing all kinds of technologies, including medical devices. If these gadgets improve users’ experience, they could be a massive boon to people with hearing loss.

Want more? See all our coverage of CES 2021 right here.
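The AI denoising these companies describe is proprietary, but the underlying idea — estimate a per-frequency noise floor, then attenuate the parts of the signal that don’t rise above it — can be sketched with classic spectral gating. This is a toy illustration of the general technique, not any vendor’s actual algorithm; the function name and parameters are invented:

```python
import numpy as np

def spectral_gate(noisy, noise_sample, frame=256, floor_gain=0.1):
    """Toy spectral gating: estimate a per-bin noise floor from a
    noise-only sample, then heavily attenuate any frequency bin in
    each frame that does not rise above that floor."""
    noise_floor = np.abs(np.fft.rfft(noise_sample[:frame]))
    cleaned = np.zeros_like(noisy)
    for start in range(0, len(noisy) - frame + 1, frame):
        spectrum = np.fft.rfft(noisy[start:start + frame])
        # Keep bins that beat the noise floor; squash the rest.
        gains = np.where(np.abs(spectrum) > noise_floor, 1.0, floor_gain)
        cleaned[start:start + frame] = np.fft.irfft(spectrum * gains, n=frame)
    return cleaned
```

A DNN denoiser effectively learns a far more sophisticated version of those per-bin gains from millions of training examples, which is how it can suppress noise while preserving speech better than a fixed threshold can.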