
The cost of being invisible in medical research | Lily Crawford | TEDxLondonBusinessSchool
AI Summary
AI in healthcare promises a revolution, but the headlines often overlook the biases medicine already carries. These biases predate AI and shape who medicine works for and who it ignores. The speaker's own experience with a long-undiagnosed autoimmune condition illustrates the pattern: what medicine has not codified goes unstudied, and what medicine has not studied, AI cannot learn from.
Women's health, despite representing a significant opportunity, receives only 6% of private healthcare investment, and 90% of that goes to cancer, reproductive, and maternal care. That leaves many high-burden conditions, such as autoimmune diseases, underfunded. Investment is rising, but money alone is not the issue; what matters is how problems are defined and measured.
Drug discovery using AI shows the same disparity. A study of 71 AI-assisted drugs in clinical trials found only four (about 5%) targeting conditions that disproportionately affect women, and even those were primarily in oncology. This mirrors the historical underrepresentation of women and people of color in medical data: clinical trials were only required to include women starting in 1993, and they still often exclude pregnant women and people with chronic conditions.
To address this, three paths are proposed:
1. Collect more data and mandate diverse clinical trials. However, policy doesn't always translate to practice, and data collection takes time.
2. Utilize synthetic data. This holds promise but risks amplifying existing biases if based on flawed original data.
3. Employ subgroup analysis or stratified modeling. This means training and evaluating AI on specific groups separately, even when data is limited, so that biases become visible instead of being averaged away.
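The third path can be sketched in a few lines of code. This is a minimal, illustrative example of subgroup (stratified) evaluation, not anything from the talk: instead of reporting one aggregate metric, the same metric is computed per group, so a gap hidden in the average becomes visible. The groups, labels, and numbers below are synthetic.

```python
# Minimal sketch of subgroup (stratified) evaluation.
# All data is synthetic and purely illustrative.
from collections import defaultdict

def accuracy(pairs):
    """Fraction of (true, predicted) pairs that match."""
    return sum(1 for y, p in pairs if y == p) / len(pairs)

def stratified_accuracy(records):
    """records: list of (group, true_label, predicted_label) tuples.
    Returns accuracy computed separately for each group."""
    by_group = defaultdict(list)
    for group, y, p in records:
        by_group[group].append((y, p))
    return {g: accuracy(pairs) for g, pairs in by_group.items()}

# Toy data: the model looks acceptable overall
# but underperforms for group "F".
records = [
    ("M", 1, 1), ("M", 0, 0), ("M", 1, 1), ("M", 0, 0),
    ("F", 1, 0), ("F", 0, 0), ("F", 1, 0), ("F", 1, 1),
]
overall = accuracy([(y, p) for _, y, p in records])
per_group = stratified_accuracy(records)
print(overall)    # 0.75
print(per_group)  # {'M': 1.0, 'F': 0.5}
```

The design point is that the aggregate number (0.75) hides a large disparity (1.0 vs. 0.5); the stratified view surfaces it even with only a handful of samples per group.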
AI acts as a mirror: it reflects the gaps and biases in the medical data we give it. By deliberately teaching AI about historically ignored areas, we can stop repeating past mistakes and build fairer, better healthcare for everyone.