Cognitive Hearing Science 

My research group has developed the research area of cognitive hearing science, which describes and models how the brain works in communication situations that impose a high cognitive load. The research has contributed to applications for both the hearing aid industry and hearing care.

Cognitive abilities for communication

To get a comprehensive picture of communicative disability, one needs to relate cognitive abilities to communicative potential and circumstances. Prediction studies and experimental manipulations have proven to be fruitful. In our division, we have studied the communicative potential of persons with cerebral palsy, traumatic brain injury, epilepsy, and intellectual disability, using core cognitive concepts that typically draw on memory functions. More recently, I have shown that working memory capacity plays a key role for deafened speech-readers, for sign language users, for users of cochlear implants, and for hearing aid users, especially when signal processing in the hearing aid is more advanced and/or when listening conditions are challenging.

We listen and understand with the brain 

Imagine yourself at a party, chatting one on one with another guest. You struggle to hear everything the person is saying amid the din of surrounding conversation. And even though you miss a word here and there, you have enough context to understand the gist of what the person is saying. In this type of situation, you’re relying on different kinds of knowledge stored in your brain, including contextual cues and relevant memories, to process the information you’re receiving. This use of stored knowledge is critical not only for people trying to converse in a noisy ballroom, but especially for people with hearing impairment. Imagine repeating the above mental processes in just about every conversation you have. We have named this research area Cognitive Hearing Science (CHS). CHS acknowledges the integral role cognition plays in communication. Specifically, CHS examines the way our minds process the auditory signals being sent to the brain, factoring in the complexity of what we are listening to and the quality of the listening conditions.

Modelling the signal-cognition interface in communication

In a model that my colleagues and I have developed, Ease-of-Language Understanding (ELU) under adverse or distracting listening conditions is assumed to depend on complex interactions between working memory capacity (WMC), attention and executive functions, and episodic and semantic long-term memory. This approach has already proven fruitful for a more comprehensive grasp of the subtle interplay between bottom-up (sensory) and top-down (cognitive) processes, including both online processes and long-term changes (negative or positive) due to hearing impairment/deafness and aging. We test the model by means of both behavioral and brain imaging data, including data from functional magnetic resonance imaging (fMRI), magnetoencephalography (MEG), and pupil size. The model has played an important role in the development of the new area of Cognitive Hearing Science.

Long-term effects of hearing impairment 

One prediction of the ELU model is that when the input to the ear is sufficiently poor, or when the person has a hearing impairment, there is an increased probability that phonological-lexical activation in long-term memory will fail, i.e., there is a mismatch between the input to the brain and the representations stored in the brain. This mismatch triggers working memory activity that attempts to reconstruct and fill in what has been misheard. Working memory will therefore be constantly exercised, while long-term memory will be relatively disused: when items are not encoded, they cannot be stored or retrieved as frequently, whereas working memory continues to be exercised. The general consequence is that long-term memory will decline, while working memory capacities will remain. We have shown this to be true with different types of memory tasks and for purely visuospatial materials in a large database, the UK Biobank.
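
To make the mismatch idea concrete, the following is a minimal, purely schematic sketch in Python. It is not the formal ELU model: the toy lexicon, the string-based similarity measure, and the threshold value are all invented for illustration. It only captures the qualitative claim that input matching a stored representation is handled implicitly, whereas degraded input triggers explicit, effortful working-memory processing.

    # Purely schematic sketch of the ELU mismatch idea (not the formal model).
    # The lexicon, the similarity measure, and the threshold are invented here.
    from difflib import SequenceMatcher

    LEXICON = ["party", "person", "context", "listen"]  # toy long-term memory store

    def similarity(heard, stored):
        # Crude stand-in for phonological-lexical matching.
        return SequenceMatcher(None, heard, stored).ratio()

    def understand(heard_words, threshold=0.75):
        """Label each heard word as an implicit match (rapid lexical access)
        or as requiring explicit working-memory reconstruction (mismatch)."""
        outcomes = []
        for word in heard_words:
            best = max(LEXICON, key=lambda entry: similarity(word, entry))
            if similarity(word, best) >= threshold:
                outcomes.append((word, best, "implicit match"))
            else:
                # Degraded input: explicit, effortful processing is engaged.
                outcomes.append((word, best, "explicit reconstruction"))
        return outcomes

    # Example: clearly heard words versus a degraded one.
    print(understand(["party", "pahty", "k-ntxt"]))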

Videos

Optimal hearing aid fitting

In this interview, Jerker Rönnberg describes the new research field of cognitive hearing science and explains that he would like to see more cognitive tests, such as memory tests, used for optimal hearing aid fitting.

Video

We hear with our ears, but listen and understand with our brains.

The video is about our research on hearing loss and deafness, the brain's role in hearing, and how everyday life can be made easier for people with hearing loss.

Research programs, commissions etc.

Research projects / programs

  • The Linnaeus HEAD research program  
  • The n200 longitudinal project on cognitive aging, hearing loss, and dementia 
  • The sensory-cognitive communication factor: preserving social interaction and health in aging



Commissions / Awards

  • National Committee for Psychology, Academy of Sciences
  • The Stuart Gatehouse Memorial Lecture
  • Invited participant in the international project: Towards a comprehensive model of human memory

Publications

2018

Adriana A Zekveld, Marieke Pronk, Henrik Danielsson, Jerker Rönnberg

Reading Behind the Lines: The Factors Affecting the Text Reception Threshold in Hearing Aid Users.

In Journal of Speech, Language and Hearing Research

Article in journal

Velia Cardin, Mary Rudner, Rita F. De Oliveira, Josefine Andin, Merina T. Su, Lilli Beese, Bencie Woll, Jerker Rönnberg

The Organization of Working Memory Networks is Shaped by Early Sensory Experience

In Cerebral Cortex

Article in journal

2017

Örjan Dahlström, Carine Signoret, Jakob Dahl, Gerhard Andersson, Jerker Rönnberg

Effects of cognitive load on neurophysiological activity among persons with tinnitus

Conference paper

Rachel Ellis, Patrik Sörqvist, Adriana Zekveld, Jerker Rönnberg

Cognitive Hearing Mechanisms of Language Understanding: Short- and Long-Term Perspectives

In Frontiers in Psychology

Article in journal

Carine Signoret, Rina Blomberg, Örjan Dahlström, Mary Rudner, Jerker Rönnberg

Phonological expectations override semantic mismatch during speech in noise perception

Conference paper

Shahram Moradi, Björn Lidestam, Henrik Danielsson, Elaine Hoi Ning Ng, Jerker Rönnberg

Visual Cues Contribute Differentially to Audiovisual Perception of Consonants and Vowels in Improving Recognition and Reducing Cognitive Demands in Listeners With Hearing Impairment Using Hearing Aids

In Journal of Speech, Language and Hearing Research

Article in journal

Shahram Moradi, Anna Wahlin, Mathias Hällgren, Jerker Rönnberg, Björn Lidestam

The Efficacy of Short-term Gated Audiovisual Speech Training for Improving Auditory Sentence Identification in Noise in Elderly Hearing Aid Users

In Frontiers in Psychology

Article in journal

2016

Rachel Ellis, Peter Molander, Jerker Rönnberg, Björn Lyxell, Gerhard Andersson, Thomas Lunner

Predicting Speech-in-Noise Recognition from Performance on the Trail Making Test: Results from a Large-Scale Internet Study

In Ear and Hearing

Article in journal

Jerker Rönnberg, Thomas Lunner, Elaine Hoi Ning Ng, Björn Lidestam, Adriana Zekveld, Patrik Sörqvist, Björn Lyxell, Ulf Träff, Wycliffe Yumba, Elisabet Classon, Mathias Hällgren, Birgitta Larsby, Carine Signoret, Kathleen Pichora-Fuller, Mary Rudner, Henrik Danielsson, Stefan Stenfelt

Hearing impairment, cognition and speech understanding: exploratory factor analyses of a comprehensive test battery for a group of hearing aid users, the n200 study

In International Journal of Audiology

Article in journal

Mary Rudner, Gitte Keidser, Staffan Hygge, Jerker Rönnberg

Better visuospatial working memory in adults who report profound deafness compared to those with normal or poor hearing: data from the UK Biobank resource

In Ear and Hearing

Article in journal
