Headline

Can AI tackle Racial Inequalities in Healthcare? (BBC)

Chandan Sekhon
November 28, 2021
Summary

Many studies have highlighted how black patients often have their pain underestimated by healthcare professionals compared with white patients, leading to poorer health outcomes in these groups. Researchers developed new technology involving artificial intelligence (AI) to help reduce this discrimination. An algorithm was trained on knee X-rays from patients with arthritis, paired with the pain score each patient reported. Through this training, it learned to identify features of the X-rays associated with greater pain, and when presented with new X-rays it could predict the level of pain the patient was experiencing.
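For readers curious about the mechanics, the sketch below shows in very rough outline how an image-to-pain-score model of this kind could be trained. It is not the researchers' actual system: the architecture, image size, and data are placeholders (random tensors stand in for real X-rays and reported pain scores), written in PyTorch purely for illustration.

```python
# Illustrative sketch only: training a small CNN to predict a patient-reported
# pain score (a regression task) from knee X-ray images. Real radiographs and
# pain scores are replaced here with random tensors.

import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

# Placeholder data: 64 grayscale "X-rays" (1 x 128 x 128) with pain scores 0-100.
images = torch.rand(64, 1, 128, 128)
pain_scores = torch.rand(64, 1) * 100

loader = DataLoader(TensorDataset(images, pain_scores), batch_size=16, shuffle=True)

# A deliberately tiny convolutional network; real work would use a much larger
# architecture trained on thousands of labelled radiographs.
model = nn.Sequential(
    nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Flatten(),
    nn.Linear(32 * 32 * 32, 1),  # single output: predicted pain score
)

optimiser = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()  # regression loss: predicted vs reported pain

for epoch in range(3):
    for x, y in loader:
        optimiser.zero_grad()
        loss = loss_fn(model(x), y)
        loss.backward()
        optimiser.step()
    print(f"epoch {epoch}: loss {loss.item():.2f}")

# Prediction on a new, unseen X-ray (here another random tensor).
with torch.no_grad():
    new_xray = torch.rand(1, 1, 128, 128)
    print("predicted pain score:", model(new_xray).item())
```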

Food for Thought

Results of the study

The study found that seemingly similar arthritis cases were not so similar, with black patients reporting more pain than white patients. The algorithm took into account features that traditional diagnostic systems would normally overlook; in effect, it was seeing something the radiologists were not. How could this technology be useful? Does this suggest the current diagnostic criteria need to be altered?

Problems with the algorithm?

An issue that has been highlighted is that this algorithm was developed using datasets containing unintentional bias: for example, minority populations made up a smaller proportion of the dataset. What problems could this lead to? Does this make the algorithm itself racially biased in any way? (A simple way to check for this kind of imbalance is sketched below.)
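To make the representation problem concrete, the short sketch below shows one simple way a dataset's demographic make-up could be audited before training a model. The records and ethnicity labels are entirely hypothetical.

```python
# Illustrative sketch only: counting how well each demographic group is
# represented in a training dataset. The records below are hypothetical.

from collections import Counter

# Hypothetical training records: (patient_id, self_reported_ethnicity)
training_records = [
    ("p01", "white"), ("p02", "white"), ("p03", "white"), ("p04", "white"),
    ("p05", "white"), ("p06", "white"), ("p07", "black"), ("p08", "black"),
    ("p09", "asian"), ("p10", "white"),
]

counts = Counter(ethnicity for _, ethnicity in training_records)
total = len(training_records)

for group, n in counts.items():
    print(f"{group}: {n} patients ({n / total:.0%} of dataset)")

# A group that makes up only a small share of the data gives the model fewer
# examples to learn from, so its predictions may be less reliable for that group.
```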

Could AI replace doctors?

Since this technology can predict pain without relying on subjective rating scales, could such an algorithm replace the need for doctors? Why/why not?

Practice Interview Questions
  1. What role does AI play in healthcare?
  2. Does technology have the potential to replace healthcare professionals, and should it?
  3. Why might ethnic minority groups experience a lower standard of healthcare compared to their white counterparts?

Extra Reading (optional)