Opinion
When the algorithm doesn't listen: AI and justice in Bangladesh's clinics
When you build a model, don't just ask if it is accurate. Ask if it listens. When you collect data, don't just ask if it is a large sample. Ask whose voice is missing. The success of AI in our healthcare system shouldn't just be measured by the lives it saves, but by the people it finally hears.
Growing up in Satkhira, I spent a lot of time watching families wait in rural clinics. Their voices were often quiet, filled with the kind of anxiety that comes from a long day of uncertainty. Now that I work as a researcher, I find myself thinking about those voices in a new way. If one of those patients speaks to an AI today, will the system actually hear them?
In Bangladesh, we are rapidly building AI tools to help with dengue fever, maternal care, and disaster relief.
While these tools are exciting, they carry a hidden risk called epistemic injustice. In plain language, this happens when a person is treated as if their own knowledge doesn’t matter. In a medical setting, it means an algorithm might systematically ignore what a patient knows about their own health.
How AI can silence a patient
Philosopher Miranda Fricker identified two ways this kind of unfairness happens.
1. Testimonial Injustice: This is a lack of trust based on prejudice. If an AI is trained mostly on data from wealthy city-dwellers, it might label a rural woman’s symptoms as "low risk." The problem isn't that her symptoms are mild. The problem is that the AI never learned how she describes pain or fatigue. It doesn't know how to trust her.
2. Hermeneutical Injustice: This happens when we lack the right words to describe our reality. If a community is suffering from a new kind of environmental illness, they might not have a term that fits into a pre-set digital menu. The AI doesn't just get the answer wrong; it prevents the patient from even asking the right question.
Why this matters for Bangladesh
These aren't just academic ideas. In my own research on heart disease prediction, I have seen how models can fail when they encounter groups that weren't represented in the original data. Our national health records are often incomplete, our languages are diverse, and rural voices are frequently left out of the digital world.
If we put these systems into village health centers without fixing these gaps, we might just be automating old inequalities. A patient who is already ignored by society might find that the machine ignores them, too.
Building AI that truly listens
We don't need to give up on AI. Instead, we need to build systems that prioritise fairness. Based on my work in trustworthy machine learning, I suggest five steps for our country:
• Diverse Data: We have to include rural, low-income, and indigenous communities in our datasets. This is a scientific requirement, not a favor.
• Clear Explanations: An AI should explain its reasoning in a language the patient actually understands.
• Local Standards: We cannot just copy-paste fairness rules from the West. Bangladesh needs its own definitions of what is "fair" in our specific clinics.
• Keeping Doctors Involved: AI should be a tool for human judgment, not a replacement for it. A doctor should always have the final word after hearing a patient’s story.
• Personal Responsibility: Developers and policymakers must constantly ask themselves whose knowledge is being included and who is being left out.
A message to my colleagues
To the next generation of engineers in Bangladesh: your skills are a form of power. But power without humility can lead to a new kind of silence.
When you build a model, don't just ask if it is accurate. Ask if it listens. When you collect data, don't just ask if it is a large sample. Ask whose voice is missing. The success of AI in our healthcare system shouldn't just be measured by the lives it saves, but by the people it finally hears.
* Farjana Yesmin is an independent researcher & machine learning scientist specialising in Trustworthy AI, Federated Learning, and Healthcare Applications
* The views expressed here are the writer's own