
The Dark Side of Innovation: The Hidden Racial Biases in Medical AI and Devices


Global use of medical technology has substantially increased in recent years. This phenomenon has an immense capacity to improve healthcare delivery; however, the lack of diversity in creating medical algorithms and innovations contributes to glaring racial biases.

With the rise of innovative medical technologies, certain tools have transformed healthcare delivery and offered patients access to essential health information. Pulse oximeters, for example, conveniently measure blood oxygen saturation with a small sensor clipped to the finger. Since the onset of the COVID-19 pandemic, the pulse oximeter has become a pivotal instrument in patient care decisions, and the global pulse oximeter market is expected to reach $5.4 billion by 2033. However, the oximeter's lesser-known and largely unaddressed shortcomings can have detrimental effects for people of color.


Pulse oximeters calculate oxygen saturation by emitting red and infrared light through the skin and measuring how much of each is absorbed by tissue and blood. Higher concentrations of melanin in darker skin can interfere with the transmission of this light, resulting in inaccurate saturation readings. Pulse oximeters are nearly three times more likely to miss dangerously low blood oxygen levels in Black patients than in their White counterparts. Yet the Food and Drug Administration (FDA) has historically taken little action to address these discrepancies. As a result, healthcare professionals are often unaware of the problem, while the devices remain readily available to the public with few precautionary measures.
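To make the mechanism concrete, here is a minimal sketch of the "ratio of ratios" principle that pulse oximeters rely on. The linear calibration SpO2 ≈ 110 − 25·R used below is a simplified textbook approximation, not any manufacturer's actual formula; commercial devices use proprietary calibration curves fit to clinical data.

```python
# Minimal sketch of the "ratio of ratios" principle behind pulse oximetry.
# The linear calibration SpO2 ~ 110 - 25*R is a commonly cited textbook
# approximation; real devices use proprietary, empirically derived curves.

def estimate_spo2(ac_red, dc_red, ac_ir, dc_ir):
    """Estimate oxygen saturation from red/infrared light signals.

    ac_*: pulsatile (arterial) component of absorbed light at each wavelength
    dc_*: steady (tissue, venous blood, skin) component at each wavelength
    """
    # Normalizing each pulsatile signal by its steady baseline is meant to
    # cancel out constant absorbers such as skin pigment, but in practice
    # melanin still attenuates the signals unevenly, skewing R for darker skin.
    r = (ac_red / dc_red) / (ac_ir / dc_ir)
    return 110.0 - 25.0 * r

# Example: a ratio near 0.5 corresponds to roughly 97-98% saturation.
print(f"Estimated SpO2: {estimate_spo2(0.02, 1.0, 0.04, 1.0):.1f}%")
```

Because the calibration curve is fit to empirical data, a study population skewed toward lighter skin bakes that skew directly into the device's readings.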


While the FDA has begun taking gradual steps to increase diversity in pulse oximeter testing, the device's racial disparities point to a larger problem with medical technology. Forehead thermometers are 26% less likely to detect fevers in Black patients than oral thermometers, and vein visualization techniques are less effective in patients of African or Asian descent. These imprecise tools are the product of clinical testing conducted primarily on White populations, which fundamentally ignores patients of color and ultimately yields technologies poorly calibrated for minority populations. With racial disparities in medical technology so pervasive, it is imperative that they be further investigated.


The use of medical devices will likely increase and become more complex with the development of artificial intelligence (AI) in the healthcare industry. Through machine learning models and algorithms, medical professionals can leverage AI for disease detection, diagnosis, medical imaging, drug development, and therapeutics. Medical AI has already yielded positive outcomes, with one AI algorithm in radiography producing a 9.4% increase in breast cancer detection and a 5.7% decrease in false positive diagnoses. While AI could be revolutionary in advancing personalized patient care, the implementation of clinical algorithms presents ethical issues analogous to the racial disparities seen in traditional medical devices. There is considerable evidence that Black patients assigned the same level of risk as White patients by commercial algorithms were notably sicker than those White patients. These racial biases stem from algorithms being derived from data that is not representative of diverse populations.
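The finding about commercial risk algorithms came from comparing patients' actual illness burden at equal algorithm-assigned risk scores. The sketch below illustrates that style of audit; the field names, groups, and data are hypothetical illustrations, not the original study's dataset or code.

```python
# A minimal sketch of the kind of audit that can reveal this disparity:
# at each risk-score level, compare how sick patients actually are across
# groups. All field names and data here are hypothetical illustrations.

from collections import defaultdict

def audit_by_risk_bucket(records, n_buckets=10):
    """Group patients into risk-score deciles, then report the mean number
    of active chronic conditions per racial group within each decile.
    If the score were unbiased, groups in the same decile should show
    similar levels of actual illness."""
    buckets = defaultdict(lambda: defaultdict(list))
    for r in records:
        decile = min(int(r["risk_score"] * n_buckets), n_buckets - 1)
        buckets[decile][r["group"]].append(r["chronic_conditions"])
    for decile in sorted(buckets):
        means = {g: sum(v) / len(v) for g, v in buckets[decile].items()}
        print(f"risk decile {decile}: " +
              ", ".join(f"{g}={m:.2f} conditions"
                        for g, m in sorted(means.items())))

# Hypothetical records: equal scores masking unequal illness burden.
records = [
    {"risk_score": 0.55, "group": "Black", "chronic_conditions": 5},
    {"risk_score": 0.55, "group": "White", "chronic_conditions": 3},
    {"risk_score": 0.85, "group": "Black", "chronic_conditions": 7},
    {"risk_score": 0.85, "group": "White", "chronic_conditions": 4},
]
audit_by_risk_bucket(records)
```

When the audit shows one group consistently sicker than another at the same score, the score is understating that group's need, which is exactly the pattern documented in commercial algorithms.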


Consequently, it is imperative that medical technologies and AI be effectively regulated so that racial health disparities are not perpetuated. The solution to these inequities cannot simply be forcing patients and physicians to adapt to poorly constructed devices and algorithms; the technology itself must change. The underlying clinical trials and datasets must reflect a broad spectrum of people. Currently, there are no regulations on the data used to train algorithms, making bias more likely. Clinical trial testing and AI training should be overseen by stakeholders like the FDA to ensure that standards promoting health equity are met. By designing and testing medical technology with minority populations in mind, the medical community can help alleviate some groups' historic mistrust of the healthcare system. Furthermore, greater transparency in design methods would allow medical institutions to identify racial biases more easily and make modifications that improve health outcomes.
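One concrete form such oversight could take is a representation check run on training data before a model is built. The following sketch is a hypothetical illustration; the group labels, benchmark shares, and tolerance threshold are assumptions, not any regulator's actual standard.

```python
# Hypothetical sketch: flag demographic groups that are under-represented
# in a training set relative to their share of the target population.

from collections import Counter

def representation_report(train_groups, population_shares, tolerance=0.05):
    """Compare each group's share of the training data against a
    population benchmark and flag meaningful shortfalls."""
    counts = Counter(train_groups)
    total = len(train_groups)
    for group, expected in population_shares.items():
        observed = counts.get(group, 0) / total
        flag = "UNDER-REPRESENTED" if observed < expected - tolerance else "ok"
        print(f"{group}: {observed:.1%} of training data vs "
              f"{expected:.1%} of population -> {flag}")

# Hypothetical cohort skewed toward one group.
train_groups = (["White"] * 850 + ["Black"] * 60 +
                ["Asian"] * 50 + ["Hispanic"] * 40)
population_shares = {"White": 0.60, "Black": 0.13,
                     "Asian": 0.06, "Hispanic": 0.19}
representation_report(train_groups, population_shares)
```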


While more action is necessary at the institutional level, steps are being taken to address racial biases in medical technology. In October 2023, the Biden administration issued an executive order requiring the U.S. Department of Health and Human Services to develop policies and regulations for the administration of AI in the healthcare sector. In New York, legislative efforts such as State Assembly Bill A9149, introduced in February 2024, aim to bring greater transparency to the use of AI in utilization management.


From a personal perspective, part of the solution lies in acknowledging the problem. Those equipped to remedy racial biases in medical technologies have consistently overlooked or ignored them. In an increasingly diverse world, it is essential to consider race as a factor affecting health outcomes and to understand that bias routinely affects patient care. When carefully created, medical technologies can be pivotal in diminishing biases and upholding equity. Their potential should make us hopeful about improving the quality of healthcare, provided that we are diligent, cautious, and equitable in their creation.
