UK report reveals bias within medical tools and devices

Minority ethnic people, women and people from deprived communities are at risk of poorer healthcare because of biases within medical tools and devices, a report has revealed.

Among other findings, the Equity in Medical Devices: Independent Review has raised concerns over devices that use artificial intelligence (AI), as well as those that measure oxygen levels. The team behind the review said urgent action was needed.

Prof Frank Kee, the director of the Centre for Public Health at Queen’s University Belfast and a co-author of the review, said: “We’d like an equity lens on the entire lifecycle of medical devices, from the initial testing, to recruitment of patients either in hospital or in the community, into the early phase studies and the implementation in the field after they are licensed.”

The junior health minister Andrew Stephenson said: “Making sure the healthcare system works for everyone, regardless of ethnicity, is paramount to our values as a nation. It supports our wider work to create a fairer and simpler NHS.”

The government-commissioned review was set up in 2022 by Sajid Javid, then the health secretary, after concerns were raised over the accuracy of pulse oximeter readings in Black and minority ethnic people.

The widely used devices were thrown into the spotlight by their importance during the Covid pandemic, when low oxygen levels were a key sign of serious illness.

The report has confirmed concerns that pulse oximeters overestimate the amount of oxygen in the blood of people with dark skin. While there was no evidence of this affecting care in the NHS, harm has been found in the US, where such biases have led to delayed diagnosis and treatment, as well as worse organ function and death, in Black patients.

The team members stress they are not calling for the devices to be avoided. Instead the review puts forward a number of measures to improve the use of pulse oximeters in people of different skin tones, including looking at changes in readings rather than relying on single readings. It also provides advice on how to develop and test new devices to ensure they work well for patients of all ethnicities.

Concerns over AI-based devices were also highlighted by the report, including the potential for such technology to exacerbate the under-diagnosis of cardiac conditions in women, lead to discrimination based on patients’ socioeconomic status, and result in under-diagnosis of skin cancers in people with darker skin tones. The latter concern, the authors say, is down to the fact that AI devices are largely trained on images of lighter skin tones.

The report also noted problems with polygenic risk scores – which are often used to provide a measure of an individual’s genetic risk of disease.

“Major genetic datasets that polygenic risk scores use are overwhelmingly on people of European ancestry, which means that they may not be applicable to people of other ancestries,” said Enitan Carrol, professor of paediatric infection at the University of Liverpool and a co-author of the review.

However, attempts to correct biases can also be problematic. One example highlighted by the report is the race-based corrections applied to measurements from spirometers, devices used to assess lung function and diagnose respiratory conditions; these corrections have themselves been found to contain biases.

Prof Habib Naqvi, the chief executive of the NHS Race and Health Observatory, welcomed the findings, adding that the review acknowledged the need for immediate modifications, equity assessments and tighter guidance and regulation around pulse oximeters and other medical devices.

“Access to better health should not be determined by your ethnicity nor by the colour of your skin; medical devices therefore need to be fit-for-purpose for all communities,” he said.

“It’s clear the lack of diverse representation in health research, the absence of robust equity considerations and the scarcity of co-production approaches have led to racial bias in medical devices, clinical assessments and other healthcare interventions.”
