Scientists react to an independent review into equity in medical devices.
Dr Peter Charlton, Chair of the Institute of Physics and Engineering in Medicine’s Physiological Measurement Special Interest Group, said:
“I welcome this review, and the report’s conclusion that ‘there is extensive evidence of poorer performance of pulse oximeters for patients with darker skin tones’. I do hope that the recommendation ‘to produce devices that are not biased by skin tone’ leads to improved devices. I would also highlight the potential role for independent assessment of different pulse oximeters to determine whether some provide better performance across different skin tones than others.”
Prof Michel Valstar, CEO of BlueSkeye AI, and Honorary Professor in Computer Science, University of Nottingham, said:
“Data-based systems can be a double-edged sword. If your dataset is not inclusive and diverse enough, it can result in a biased system without you even realising. But with a representative dataset, a machine learning system can demonstrably adjust for differences in the absorption coefficients and refractive index of different skin tones, and be truly and robustly reflective of the communities about which your model is designed to draw conclusions or make recommendations. This is what BlueSkeye AI has done with its medically relevant face and voice analysis products: statistical analysis on a representative dataset of faces has shown them to be unbiased with respect to age, gender, and ethnicity. This measurement of, and mitigation against, bias is now compulsory for AI-based medical devices under the EU AI Act. AI systems are held to this high standard, and it should be compulsory for any medical device, whether or not it uses AI.”
Dr Andrew King, Reader in Medical Image Analysis at King’s College London, said:
“I am very supportive of the findings and recommendations of the independent review into equity in medical devices. My particular area of interest is in the use of AI to assist medical decision making. As the report authors point out, AI models can exhibit biased behaviour (i.e. different levels of performance for different groups) when trained with data which are not representative of all patient groups, such as those defined by race and/or sex. The report rightly highlights skin cancer diagnosis as one such example but there is increasing evidence of similar biases being found in other areas of medicine, where they might be less expected.
“In my opinion, the report is timely and important as AI is starting to impact clinical workflows in medicine and there is a real danger that it could maintain or even exacerbate existing healthcare inequalities. I fully support the authors in calling for the government to appoint an expert panel into these potential unintended consequences of AI in healthcare.”
Dr Xiaoxuan Liu and Prof Alastair Denniston, University Hospitals Birmingham NHSFT and University of Birmingham, Co-leads for the STANDING Together initiative, said:
“We welcome this review as a significant first step in recognising and tackling health inequalities arising from medical devices. Medical devices are an essential part of modern medicine – from diagnostic tests to life-saving implants. We need to ensure they are ‘inclusive by design’ and work reliably for everybody. We know from our own work that this is even more important in the context of artificial intelligence (AI), since AI-enabled medical devices are prone to performing differently between groups of people, and this effect may not be obvious unless explicitly tested for. We welcome the review and its practical recommendations as we seek to improve the lives of patients through healthcare technologies that are effective, safe, and equitable.”
Prof Martyn Thomas FREng, Emeritus Professor of IT, Gresham College, said:
“The press release is an accurate description of the study and its recommendations, although its brevity means that it is no substitute for reading the full report or at least the summary.
“This is an important review based on substantial peer-reviewed science and on the experiences of medical staff and patients.
“The review gives many examples of biases in the way medical devices are designed and used. The report recommends actions to reduce these biases and the risks that they present to patients. Its recommendations should be taken seriously by the Government, by the regulators and by the medical profession.”
Dr Sara Khalid, Associate Professor of Health Informatics and Biomedical Data Science, University of Oxford, said:
“This important review reinforces what experts have long known about the role of health data poverty and resulting biased AI in worsening underlying systemic health inequities. It will be important to monitor if and how these practical recommendations influence real clinical practice.”
Sarah Norcross, Director of the Progress Educational Trust (PET), said:
“This Independent Review correctly draws attention to the confusion that surrounds polygenic risk scores when these are mistaken for more traditional forms of genomic data.
“The Review is also correct to raise concern over the use of polygenic scores in embryo selection, a practice that is – and should remain – prohibited in the UK. The problem with using polygenic scores in this way is not so much about the ethics of embryo selection in general, but rather about the fact that there is no scientific justification for thinking that polygenic scores are meaningful in relation to selecting embryos at all.
“In other words, the problem here is false claims that polygenic scores can be useful in assisted conception. The falsity of these claims has been pointed out by bodies including the European Society of Human Genetics, the American College of Medical Genetics and Genomics, the International Society of Psychiatric Genetics, and the International Common Disease Alliance.
“To the extent that polygenic risk scores are useful in certain contexts, at present they are most likely to be useful for people of European ancestry. This is because European ancestry is disproportionately represented in the underlying genomic data. This has been the case ever since whole-genome sequencing first started at the beginning of this century.
“As this Review observes, commendable efforts are being made to rectify the situation. Unfortunately, though, even when people with non-European ancestry are present in widely available datasets, studies will often exclude the data of these minorities in order to ensure statistical power.”
Professor Peter Bannister, Healthcare Spokesperson for the Institution of Engineering and Technology, said:
“This report focuses on the role that medical devices can play in perpetuating existing racial and ethnic inequities in healthcare and provides a series of practical recommendations for manufacturers of these devices to follow to ensure they function acceptably regardless of the ethnicity of the patient.
“While it is disheartening to learn that factors as obvious as the skin tone of the patient can still undermine the effectiveness of these technologies, the report highlights how prioritising collaboration between engineers and end-users (known as ‘co-design’ or ‘design thinking’) is critical going forwards for all technologies to ensure that any sources of bias, however subtle, can be mitigated.
“The report considers a representative range of technologies, including optical measuring devices such as pulse oximeters, as well as artificial intelligence and the use of genomic data. In the case of technologies such as artificial intelligence, even using diverse data is not enough to ensure that the biases already ‘baked into’ society are not simply amplified via these new solutions. Understanding how inequalities appear in today’s healthcare data (not just who is contained in it) is a critical first step to designing solutions that can close the gap in healthcare equity.
“In reality, technology is just one component needed to deliver equitable healthcare and while the report recognises that wider socio-economic factors also need to be addressed at a whole system level, more needs to be done urgently at a government, infrastructure and delivery level to ensure that those who stand to gain the most in terms of improved health outcomes can actually access these innovations.”
‘Equity in Medical Devices: Independent Review’ was published at 13:00 UK time on Monday 11 March 2024, when the embargo lifted.
Declared interests
Dr Peter Charlton: No interests to declare.
Dr Andrew King: No conflicts of interest.
Professor Peter Bannister is Managing Director of Romilly Life Sciences Ltd, Healthcare Spokesperson for the Institution of Engineering and Technology (IET) Policy Oversight Panel, Honorary Chair at the University of Birmingham and Non-executive Director of the Life Sciences Hub Wales.
Prof Martyn Thomas: I have no interests that might be considered a conflict with any aspect of this study or its recommendations.
For all other experts, no reply to our request for declarations of interests was received.
This Roundup was accompanied by an SMC Briefing.