App That Listens to Coughing Developed to Tell if People Have COVID-19

If this app works effectively, it could play a major role in allowing people to gather freely again.

As millions of people worldwide battle the symptoms of COVID-19, a group of “silent patients” may not even know they’re sick and spreading the virus. Asymptomatic people, by definition, have no physical symptoms of the illnesses they carry.

Researchers at the Massachusetts Institute of Technology (MIT), however, say these patients may be showing a symptom after all: the sound of their cough. Their study describes an artificial intelligence program that can identify whether someone has coronavirus by the way their cough sounds. The researchers trained their AI model with thousands of recorded coughs from both healthy and sick volunteers. When they fed in recordings of new patients, the system accurately detected 98.5 percent of coughs coming from people with a confirmed case of COVID-19. The AI also picked out 100 percent of coughs from volunteers who reported having no symptoms but tested positive for the virus.
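The pipeline described above — train a classifier on labeled cough recordings, then score previously unseen ones — can be sketched in miniature. This is a stand-in, not the MIT team's actual system: the synthetic feature vectors and the from-scratch logistic regression below are assumptions for illustration, in place of real spectrogram features and the study's ResNet-based model.

```python
import math
import random

random.seed(0)

def synth_cough(is_covid):
    """Stand-in for acoustic features of one recorded cough.
    A real system would extract spectrogram features; here, positive
    cases simply get a slight shift in each synthetic feature."""
    shift = 0.8 if is_covid else 0.0
    return [random.gauss(shift, 1.0) for _ in range(4)]

# Labeled training set: healthy (0) and confirmed-COVID (1) coughs.
train = [(synth_cough(y), y) for y in [0, 1] * 500]

# Minimal logistic regression trained by gradient descent.
w = [0.0] * 4
b = 0.0
lr = 0.1
for _ in range(100):
    for x, y in train:
        z = sum(wi * xi for wi, xi in zip(w, x)) + b
        p = 1 / (1 + math.exp(-z))
        err = p - y
        w = [wi - lr * err * xi for wi, xi in zip(w, x)]
        b -= lr * err

def screen(x):
    """Return 1 if the cough is flagged as likely COVID-positive."""
    z = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1 if z > 0 else 0

# Score held-out coughs, as the study did with new patients.
test = [(synth_cough(y), y) for y in [0, 1] * 200]
acc = sum(screen(x) == y for x, y in test) / len(test)
print(f"held-out accuracy: {acc:.2f}")
```

The essential idea is the same at any scale: the model never sees the test recordings during training, so its accuracy on them estimates how well it would screen new people.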

The team is now working on turning their model into a user-friendly app. If approved by the Food and Drug Administration, the app would give people a quick, non-invasive way to screen themselves daily during the pandemic.

“The effective implementation of this group diagnostic tool could diminish the spread of the pandemic if everyone uses it before going to a classroom, a factory, or a restaurant,” says co-author Brian Subirana in a university release.

The MIT team notes researchers have been working on audio-based medical screenings since before the coronavirus emergency began. Their group in particular originally created this AI model to screen for Alzheimer’s disease.

Although the degenerative neurological condition is mostly associated with memory loss, it also affects the muscles and vocal cords. With this knowledge, researchers trained a general machine-learning algorithm called ResNet50 to detect changes in vocal cord strength. Subirana taught the neural network using an audiobook collection with over 1,000 hours of speech files. The AI model could eventually tell the difference between similar-sounding words like “them,” “the,” and “then.”

The system can also read the emotions of the speaker based on the tone of their voice. The team says this is key in Alzheimer’s detection because patients tend to display more frustration when trying to get words out. The program learned to assess these moods and sort them into categories including neutral, calm, happy, and sad.

Finally, the team turned to coughing. Using recordings of patients coughing, the AI model could analyze the cougher’s lung and respiratory performance. An algorithm to detect muscular degradation was also added to help the AI distinguish strong coughs from weaker ones.

With all of this data, study authors discovered that the technology could effectively screen for Alzheimer’s based on a patient’s vocal cord strength, sentiment, lung performance, and muscular degradation.

Once the pandemic began, the team at MIT changed gears and began testing whether their model could also detect COVID-19. Researchers say there is growing evidence that coronavirus patients suffer from neurological symptoms and temporary muscular impairment as well.

“The sounds of talking and coughing are both influenced by the vocal cords and surrounding organs. This means that when you talk, part of your talking is like coughing, and vice versa. It also means that things we easily derive from fluent speech, AI can pick up simply from coughs, including things like the person’s gender, mother tongue, or even emotional state. There’s in fact sentiment embedded in how you cough,” Subirana explains. “So we thought, why don’t we try these Alzheimer’s biomarkers [to see if they’re relevant] for COVID.”

The team created a website to collect audio samples from volunteers, including many with coronavirus. From nearly 200,000 forced-cough audio samples, the group was able to find 2,500 recordings that came from confirmed COVID-19 patients. Many of these patients were also asymptomatic. After adding more random samples to act as a control, the team chose 4,000 coughing samples to train their AI model to screen for the virus.
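The curation step above — keep every confirmed-positive recording, then add randomly drawn healthy samples as a control — can be sketched as follows. The pool below is hypothetical and scaled down from the study's roughly 200,000 submissions; the exact counts are illustrative assumptions, not the study's figures.

```python
import random

random.seed(0)

# Hypothetical pool standing in for the crowdsourced submissions:
# each entry is (sample_id, covid_confirmed). Counts are illustrative.
pool = [(i, i < 250) for i in range(20_000)]

# Keep every confirmed-positive recording ...
positives = [s for s in pool if s[1]]

# ... then draw an equal number of random negatives as a control,
# so the model can't succeed by assuming almost every cough is healthy.
negatives = random.sample([s for s in pool if not s[1]], k=len(positives))

training_set = positives + negatives
random.shuffle(training_set)

print(len(training_set))  # prints 500: a balanced set, half positive
```

Balancing the classes like this is a standard precaution when positives are rare; a classifier trained on the raw pool could score over 98 percent "accuracy" simply by labeling everything healthy.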

Along with amazing accuracy in detecting coronavirus patients, researchers say the tests reveal “a striking similarity between Alzheimer’s and COVID discrimination.” They add that the same four biomarkers used to detect Alzheimer’s effectively screen for the virus as well.

“We think this shows that the way you produce sound changes when you have Covid, even if you’re asymptomatic,” adds Subirana, a research scientist in MIT’s Auto-ID Laboratory.

Subirana and his team stress that their AI system is not meant to diagnose which illness a person has, whether it be the flu, asthma, or COVID-19. Instead, the tool works by distinguishing healthy people from those who are asymptomatic but carrying an illness.

The MIT team is now partnering with several hospitals to collect more coughing samples and refine the system’s accuracy. Their hope is to release a free pre-screening app to the public that could cut down on clinical testing delays.

“Pandemics could be a thing of the past if pre-screening tools are always on in the background and constantly improved,” the study authors contend.

The study appears in the IEEE Journal of Engineering in Medicine and Biology.