
This article originally appeared on Digital Trends on 13th March 2018.

One common complaint from people who wear a hearing aid is that, while they can hear speech, they are unable to make out its meaning. That is because, while the two things are related, hearing and understanding aren’t the same thing. A new technique developed by researchers from KU Leuven in Belgium, in collaboration with the University of Maryland, may offer an alternative solution, however.

They have developed an automatic test involving an EEG brain cap that lets scientists look at a person's brainwaves to see not only whether they have heard a particular sound, but whether they have actually understood it. The test uses 64 electrodes to measure a patient's brainwaves while they listen to a sentence; the brain's waveform response then reveals whether or not the patient has understood what was said to them.

“Our method is independent of the listener’s state of mind,” Jonas Vanthornhout, one of the researchers on the project, told Digital Trends. “Even if the listener doesn’t pay attention, we can measure speech understanding. We can do this because we directly measure speech understanding from the brain. We reconstruct the speech signals from your brainwaves. When the reconstruction succeeds, this means that you have understood the message. When the reconstruction fails, you didn’t understand the message. Our method will allow for a more accurate diagnosis of patients who cannot actively participate in a speech understanding test because they’re too young, for instance, or because they’re in a coma.”
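For readers curious about the mechanics, "reconstructing the speech signal from brainwaves" is typically done in auditory EEG research with a linear backward model: a decoder is trained to map the multichannel EEG (plus time-lagged copies of it) onto the slow amplitude envelope of the speech, and the correlation between the reconstructed and actual envelope serves as the score. The sketch below is only a minimal illustration of that general idea, not the researchers' exact pipeline; the channel count, lag window, ridge penalty, and the toy data are assumptions made for the example.

```python
# Minimal illustrative sketch of envelope reconstruction from EEG.
# All parameters here (64 channels, 32 lags, ridge penalty, toy data)
# are assumptions for demonstration, not the study's settings.
import numpy as np

def build_lagged_eeg(eeg, n_lags):
    """Stack time-lagged copies of each EEG channel: (samples, channels * lags)."""
    n_samples, n_channels = eeg.shape
    lagged = np.zeros((n_samples, n_channels * n_lags))
    for lag in range(n_lags):
        lagged[lag:, lag * n_channels:(lag + 1) * n_channels] = eeg[:n_samples - lag]
    return lagged

def train_decoder(eeg, envelope, n_lags=32, ridge=1e3):
    """Fit a ridge-regularised linear backward model from lagged EEG to the envelope."""
    X = build_lagged_eeg(eeg, n_lags)
    XtX = X.T @ X + ridge * np.eye(X.shape[1])
    return np.linalg.solve(XtX, X.T @ envelope)

def reconstruction_score(eeg, envelope, weights, n_lags=32):
    """Correlate the reconstructed envelope with the real one; higher means better."""
    reconstructed = build_lagged_eeg(eeg, n_lags) @ weights
    return np.corrcoef(reconstructed, envelope)[0, 1]

# Toy usage: random data standing in for a 64-channel recording and a speech envelope.
rng = np.random.default_rng(0)
envelope = rng.standard_normal(5000)
eeg = np.outer(envelope, rng.standard_normal(64)) + rng.standard_normal((5000, 64))
w = train_decoder(eeg, envelope)
print(f"reconstruction correlation: {reconstruction_score(eeg, envelope, w):.2f}")
```

In this framing, a high correlation indicates the listener's brain is tracking the speech, while a low correlation suggests the message was not followed, which is the intuition behind "when the reconstruction fails, you didn't understand the message."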

At present, this is just a neat tech demo. However, as Vanthornhout suggests, long-term, this could be used to create some fascinating consumer-facing products. Imagine, for example, smart hearing aids or cochlear implants which adjust their signal based on how well you are understanding a particular speaker. The technology could also decrease the screening time for hearing loss. In addition, we would be fascinated to see whether this has any application within education, to discover whether learners are understanding the words they hear in class. (Although apparently it cannot be used to test a deep understanding of course material.)

“We are currently mapping the effects of attention on speech understanding,” Vanthornhout continued. “In order to make our method more clinically relevant, we are also validating it in a population of hearing aid users, cochlear implant users, and in multiple age groups. We are also investigating how the different parts of the brain contribute to speech understanding.”

A paper describing the work was recently published in the Journal of the Association for Research in Otolaryngology.
