BACKGROUND
Maternal harm is a major crisis that disproportionately affects Black women. Unstructured clinical notes in electronic health record (EHR) data may contain insights into unsafe maternal care delivery. Studies that have analyzed unstructured EHR clinical notes with natural language processing for tone and sentiment have found that negative tone and sentiment contribute to preventable patient safety events.
OBJECTIVE
We analyzed potentially negative keywords in EHR clinical notes recorded for women who experienced severe maternal morbidity or postpartum readmission within 42 days of delivery at one of two large birthing hospitals in Washington, DC.
METHODS
Design: Retrospective cohort study.
Measurements: Using a predefined list of negative keywords (i.e., nonadherent, aggressive, agitated, angry, challenging, combative, noncompliant, confront, noncooperative/uncooperative, defensive, exaggerate, hysterical, unpleasant, refuse, and resist), we applied natural language processing and machine learning techniques to detect these keywords in unstructured EHR clinical notes.
Participants: The cohort was defined as female patients who had a delivery encounter at one of two large birthing hospitals in Washington, DC from January 1, 2016, to March 31, 2020.
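The keyword-detection step described above can be illustrated with a minimal sketch. This is not the authors' pipeline; it assumes a simple case-insensitive, word-boundary match that also catches inflected forms (e.g., "refused", "agitated") via a trailing wildcard:

```python
import re

# Predefined negative keywords from the study's list.
NEGATIVE_KEYWORDS = [
    "nonadherent", "aggressive", "agitated", "angry", "challenging",
    "combative", "noncompliant", "confront", "noncooperative",
    "uncooperative", "defensive", "exaggerate", "hysterical",
    "unpleasant", "refuse", "resist",
]

# Word boundary before each keyword; \w* allows inflected endings
# such as "refused" or "confrontational".
_PATTERN = re.compile(
    r"\b(" + "|".join(NEGATIVE_KEYWORDS) + r")\w*", re.IGNORECASE
)

def find_negative_keywords(note: str) -> set[str]:
    """Return the set of matched base keywords in a clinical note."""
    return {m.group(1).lower() for m in _PATTERN.finditer(note)}
```

A note reading "Patient refused medication and was agitated." would flag the base keywords "refuse" and "agitated", while neutral wording such as "cooperative and pleasant" would match nothing, since the word boundary prevents "uncooperative" or "unpleasant" from matching inside those words.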
RESULTS
Negative keywords were used more frequently for patients aged 30-44 years. The adjusted odds of having negative keywords in EHR clinical notes during delivery encounters were 1.08 for Black patients, 1 (reference) for White patients, and 0.77 for Other patients. Patients with commercial insurance had lower adjusted odds of having a negative keyword (commercial = 1 [reference] vs. self-pay = 1.13 and Medicare/Medicaid = 1.14).
CONCLUSIONS
Our findings suggest potential implicit racial bias among healthcare providers in documenting EHR clinical notes for Black patients during delivery encounters.