Understanding Biases in Clinical Notes Based on Gender and SMI Diagnosis
Artificial Intelligence (AI) models may unintentionally amplify biases (i.e., misleading relationships in the data) present in clinical notes, which can lead to unfair or inaccurate predictions for certain patient groups. This project will examine clinical notes from the CRIS database to identify and measure biases related to gender and severe mental illness (SMI) diagnosis that could affect the fairness of mental health AI models.