Proposed Machine Learning Framework for Ethical Clinical Note-Taking
DOI: https://doi.org/10.58445/rars.2034

Keywords: Computer Science, Algorithmic Bias, Bias Mitigation, Healthcare, Data Science

Abstract
Ethnic and gender bias can creep into the notes physicians write in clinical settings. Physician bias, reflected in the language of clinical notes and electronic health records (EHRs) describing certain demographic groups (e.g., African-Americans, Hispanic/Latinx patients, and women), can degrade the quality of care those patients receive. This matters for future disease diagnosis and treatment decisions for these groups, especially when human bias stands in the way. This study first reviews the literature on the extent, types, and impacts of such bias in the clinical setting. Drawing on these findings, we propose an ML question framework that mitigates bias at the point of note-taking, i.e., in the pre-processing phase of the data pipeline. We then assess the relevance of this framework on the MIMIC-III dataset by comparing the frequency of negative patient descriptors across demographic groups in the post-processing phase. Finally, we provide a set of conditions, informed by a patient's background, to guide physicians in recording and evaluating medical needs holistically so that all patients are treated fairly.
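To make the post-processing comparison concrete, the sketch below shows one way to measure how often negative patient descriptors appear in clinical notes for different demographic groups. This is a minimal illustration, not the study's actual pipeline: the descriptor list is a small subset of the terms examined in prior work on negative descriptors in the EHR (Sun et al., 2022), and the input file name and column names (TEXT and ETHNICITY, following MIMIC-III's NOTEEVENTS and ADMISSIONS schemas) are assumptions for illustration.

```python
import re

import pandas as pd

# Illustrative subset of negative descriptors, after Sun et al. (2022);
# the study's full descriptor list and matching rules may differ.
NEGATIVE_DESCRIPTORS = [
    "aggressive", "agitated", "angry", "combative", "defensive",
    "exaggerate", "hysterical", "noncompliant", "refused", "uncooperative",
]
# Match any descriptor as a word prefix (e.g., "refused", "refuses").
PATTERN = re.compile(
    r"\b(?:" + "|".join(NEGATIVE_DESCRIPTORS) + r")\w*", re.IGNORECASE
)


def descriptor_rate(notes: pd.DataFrame) -> pd.Series:
    """Fraction of notes in each ethnicity group that contain at least one
    negative descriptor."""
    flagged = notes["TEXT"].str.contains(PATTERN, na=False)
    return flagged.groupby(notes["ETHNICITY"]).mean()


if __name__ == "__main__":
    # Assumed input: MIMIC-III NOTEEVENTS text pre-joined with ADMISSIONS
    # demographics; the file name here is a placeholder.
    notes = pd.read_csv("mimic_notes_with_demographics.csv")
    print(descriptor_rate(notes).sort_values(ascending=False))
```

In practice, such a comparison would also control for note type, service, and clinical context before attributing group differences to biased language.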
License
Copyright (c) 2024 Khyati Singh
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.