A joint effort between researchers at Massachusetts General Hospital, health science company ZOE, King’s School of Biomedical Engineering & Imaging Sciences, the Department of Twin Research & Genetic Epidemiology, and the London Medical Imaging and AI Centre for Value Based Healthcare has produced an artificial intelligence diagnostic that predicts whether someone is likely to have COVID-19 based on their symptoms.
Their findings were published this week in Nature Medicine. Researchers from the London AI Centre helped develop the AI model, which uses data from the COVID Symptom Study app to predict COVID-19 infection by comparing people’s reported symptoms with the results of traditional COVID-19 tests.
Researchers say the approach could help populations where access to testing is limited. In this study, the researchers analysed data from just under 2.5 million people in the UK and US who had been regularly logging their health status in the app, around a third of whom had logged symptoms associated with COVID-19.
Of these, 18,374 reported having had a test for coronavirus, with 7,178 people testing positive.
The research team investigated which symptoms known to be associated with COVID-19 were most likely to be associated with a positive test.
While they found that COVID-19 produces a wider range of symptoms than cold and flu, they warn against focusing only on fever and cough. Particularly striking was loss of smell and taste, known as anosmia.
Dr Carole Sudre, Research Fellow at the School of Biomedical Engineering & Imaging Sciences, focused on applying the model, trained on participants who had been tested, to everyone who had reported symptoms in the app. She also carried out demographic analysis of the more than 800,000 people to whom the model was applied.
"From our data, we were able to identify anosmia as a telling symptom of COVID. This equips doctors with more information about a disease that we still know very little about," Dr Sudre said.
"The fact that we were able to extract this information from a healthcare app is a powerful proof of concept that such reporting can provide very valuable insights."– Dr Carole Sudre, Research Fellow, School of Biomedical Engineering & Imaging Sciences
Two-thirds of users who tested positive for coronavirus infection reported experiencing anosmia, compared with just over a fifth of the participants who tested negative.
The findings suggest that anosmia is a stronger predictor of COVID-19 than fever, supporting anecdotal reports of loss of smell and taste as a common symptom of the disease.
The researchers then created a machine learning model that predicted with nearly 80% accuracy whether an individual is likely to have COVID-19 based on their age, sex and a combination of four key symptoms: loss of smell or taste, severe or persistent cough, fatigue and skipping meals.
Applying this model to the entire group of over 800,000 app users experiencing symptoms predicted that just under a fifth of those who were unwell (17.42%) were likely to have COVID-19 at that time.
Researchers suggest that combining this AI prediction with widespread adoption of the app could help to identify those who are likely to be infectious as soon as the earliest symptoms start to appear, focusing testing efforts where they are most needed.
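A model of this kind, combining age, sex and four binary symptom indicators, can be sketched as a simple logistic regression score. The coefficients below are illustrative placeholders, not the values fitted in the published study, and the function name is ours:

```python
import math

# Illustrative coefficients only -- NOT the values fitted in the
# Nature Medicine study. Symptom inputs are 1 (present) or 0 (absent);
# sex is encoded 1 for male, 0 for female.
COEFFS = {
    "intercept": -1.3,
    "age": -0.01,                 # per year of age
    "sex": 0.44,
    "loss_of_smell_taste": 1.75,  # anosmia, the strongest predictor
    "persistent_cough": 0.31,
    "fatigue": 0.49,
    "skipped_meals": 0.39,
}

def predict_covid_probability(age, sex, loss_of_smell_taste,
                              persistent_cough, fatigue, skipped_meals):
    """Return an estimated probability of a positive COVID-19 test."""
    # Linear combination of the six predictors plus an intercept.
    score = (COEFFS["intercept"]
             + COEFFS["age"] * age
             + COEFFS["sex"] * sex
             + COEFFS["loss_of_smell_taste"] * loss_of_smell_taste
             + COEFFS["persistent_cough"] * persistent_cough
             + COEFFS["fatigue"] * fatigue
             + COEFFS["skipped_meals"] * skipped_meals)
    # Logistic (sigmoid) link maps the score to a probability in (0, 1).
    return 1 / (1 + math.exp(-score))

# A user reporting anosmia scores higher than an otherwise identical
# user without it, reflecting the symptom's large weight.
with_anosmia = predict_covid_probability(40, 1, 1, 1, 1, 1)
without_anosmia = predict_covid_probability(40, 1, 0, 1, 1, 1)
```

With a chosen probability threshold, such a score could flag symptomatic app users most likely to be infected, which is how the researchers estimated the 17.42% figure across the symptomatic population.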
Head of School Professor Seb Ourselin said the data and analyses coming from the COVID-19 symptom app provide critical information for tracking and monitoring the infection.
"Coronavirus tests are costly and many people still cannot be tested. This predictive model gives a broader and more precise picture of who is likely to be infected. The finding that anosmia is a key early warning symptom of COVID-19 also suggests it would be beneficial to include it in routine screening for the disease."– Professor Sebastien Ourselin, Head of School, School of Biomedical Engineering & Imaging Sciences