
Context is not optional in the future of mental healthcare

Mental healthcare fails when it ignores culture and lived experience.
Image: Oluwabukunmi Victor Babatunde, Founder, Sane

Across the world, nearly 80 percent of people living with mental health conditions receive little or no care, according to the World Health Organization. In low- and middle-income countries, the treatment gap for severe mental health conditions ranges between 76 and 85 percent. Even in high-income countries, a significant proportion of people remain untreated.

But the more uncomfortable truth is this: even when care exists, it often fails because it is not designed for the people it is meant to serve.

Research consistently shows that people in low- and middle-income countries and minority populations in high-income countries are far less likely to receive appropriate or sustained mental health treatment. This gap is not explained solely by access. It is driven by misalignment between how mental health is understood by systems and how it is experienced by people.

Mental health systems frequently assume that distress looks the same everywhere, that emotional language is universal, and that therapeutic models developed in Western contexts will translate seamlessly across cultures. In reality, this assumption has quietly undermined mental healthcare outcomes across continents.

As digital solutions scale globally, the risk is growing. Without context, technology doesn’t democratise care. It reproduces the blind spots of the systems it is built on.

Context is not optional in mental healthcare

Digital mental health will never succeed at a global scale unless it is culturally, economically, linguistically, and socially contextual. This is not a philosophical position. It is a practical one.

In many African communities, emotional distress is expressed through physical symptoms rather than psychological language. People speak of body pain, fatigue, or persistent discomfort rather than sadness or anxiety. In diaspora communities, mental health challenges are often filtered through migration stress, identity tension, financial pressure, and intergenerational expectations that shape how distress is understood and whether help is sought at all.

In the United Kingdom, one of the most culturally diverse societies in the world, a single mental health framework is expected to serve populations with radically different lived realities. Evidence shows that people from Black, Asian, and minority ethnic backgrounds are more likely to disengage from mental health services and less likely to receive timely intervention, often due to cultural misunderstanding, lack of trust, and previous negative experiences with care systems.

Yet most digital mental health tools are built using datasets, language models, and clinical assumptions rooted in Western norms. The result is care that is technically available but functionally misaligned. Services exist, but they do not always resonate. Tools are deployed, but they do not always understand.

How culture shapes mental health and diagnosis in practice

Context shapes how people experience and express mental health in ways that are often invisible to standard diagnostic models. Anxiety, for example, may present as restlessness and worry in one culture, but as persistent fatigue or body pain in another. Depression might be verbalised as sadness in some settings and as spiritual disconnection or loss of purpose in others. Parenting norms differ widely, altering what is perceived as stress, failure, or emotional neglect. Topics considered taboo in one society may be openly discussed in another.

Language itself creates barriers. Emotional vocabulary does not translate cleanly across cultures. A word like “burnout” may resonate in corporate Western environments but carry little meaning elsewhere, despite similar lived experiences. Research has shown that standard diagnostic tools often under-detect depression and anxiety in non-Western populations because symptoms are expressed differently, frequently through somatic complaints rather than emotional language. This is where artificial intelligence enters the conversation, as both an opportunity and a risk.

AI systems used in mental health are only as effective as the data and assumptions that shape them. Multiple studies on machine learning in healthcare have demonstrated that models trained on homogenous datasets perform significantly worse when applied to diverse populations. In mental health, this can mean lower detection accuracy, higher false negatives, and delayed intervention for already underserved groups.
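One way to make that risk concrete is to audit a screening model’s error rates by population group rather than in aggregate. The sketch below is illustrative only: the predictions and group labels are made up, not drawn from any study cited here. It simply shows how an overall false-negative rate that looks tolerable can hide a much higher miss rate for one group.

```python
# Illustrative only: hypothetical screening outcomes and group labels,
# showing how an aggregate false-negative rate can hide a subgroup disparity.
import numpy as np

def false_negative_rate(y_true, y_pred):
    """Share of genuinely at-risk cases that the screen failed to flag."""
    at_risk = y_true == 1
    if at_risk.sum() == 0:
        return float("nan")
    return float(((y_pred == 0) & at_risk).sum() / at_risk.sum())

# 1 = at risk, 0 = not at risk (entirely made-up values).
y_true = np.array([1, 1, 1, 1, 0, 0, 1, 1, 1, 1, 0, 0])
y_pred = np.array([1, 1, 1, 0, 0, 0, 1, 0, 0, 0, 0, 0])
group  = np.array(["A"] * 6 + ["B"] * 6)

print(f"overall FNR: {false_negative_rate(y_true, y_pred):.2f}")
for g in np.unique(group):
    m = group == g
    print(f"group {g} FNR: {false_negative_rate(y_true[m], y_pred[m]):.2f}")
```

In this toy example the overall miss rate is 0.50, but it is 0.25 for one group and 0.75 for the other, which is exactly the kind of disparity that disappears when performance is only reported in aggregate.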

Models trained predominantly on Western populations struggle to recognise distress patterns in non-Western users. Cultural bias becomes embedded, then amplified. Misinterpretation turns into misdiagnosis. Early warning systems fail the very people they are meant to protect.

Context-aware AI offers a different path. When behavioural models are trained on localised datasets, they become better at detecting subtle signals of distress. When cultural norms are embedded into system design, AI can distinguish between healthy variation and genuine risk. When multiple contexts inform model development, global understanding improves for everyone. African behavioural data, for example, does not only serve African users. It expands the collective intelligence of mental health systems worldwide.
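What embedding context into system design looks like will differ from product to product; the sketch below is one simple, hypothetical illustration, not a description of any specific system. It calibrates a screening model’s decision threshold separately per locale, using locally labelled validation data, so that a comparable share of genuinely at-risk users is flagged everywhere instead of applying one global cut-off tuned on a single population. The locale names, score distributions, and recall target are all invented for the example.

```python
# Illustrative only: per-locale threshold calibration on hypothetical scores.
# If at-risk users in one locale receive systematically lower scores (because
# their distress surfaces through signals the base model underweights), a
# single global cut-off misses them; calibrating per locale restores recall.
import numpy as np

def threshold_for_recall(scores, labels, target_recall=0.9):
    """Lowest threshold that still flags ~target_recall of at-risk users."""
    at_risk_scores = np.sort(scores[labels == 1])
    idx = int(np.floor((1.0 - target_recall) * len(at_risk_scores)))
    return at_risk_scores[idx]

rng = np.random.default_rng(0)

def make_locale(at_risk_mean):
    # 100 at-risk users and 300 others, with made-up score distributions.
    scores = np.concatenate([rng.normal(at_risk_mean, 0.08, 100),
                             rng.normal(0.25, 0.08, 300)])
    labels = np.concatenate([np.ones(100, int), np.zeros(300, int)])
    return scores, labels

global_cutoff = 0.60  # a single threshold tuned on locale A alone
for name, mean in [("locale A", 0.70), ("locale B", 0.45)]:
    scores, labels = make_locale(mean)
    recall_global = ((scores >= global_cutoff) & (labels == 1)).sum() / 100
    local_cutoff = threshold_for_recall(scores, labels)
    recall_local = ((scores >= local_cutoff) & (labels == 1)).sum() / 100
    print(f"{name}: recall {recall_global:.2f} with global cut-off, "
          f"{recall_local:.2f} after per-locale calibration")
```

In the toy run, the global cut-off catches almost all at-risk users in locale A but only a few percent in locale B; calibrating on local data brings both back to the target. Threshold calibration is only the simplest lever, but it shows why locally grounded data has to inform the design rather than being bolted on afterwards.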

Why the challenge is urgent for the United Kingdom

This issue is particularly urgent for the United Kingdom.

The UK faces record-high demand for mental health services, with the NHS under sustained pressure. Mental ill health is estimated to cost the UK economy hundreds of billions of pounds annually through healthcare expenditure, lost productivity, and long-term economic inactivity. NHS waiting lists for mental health services continue to grow, while employers report rising levels of burnout, absenteeism, and presenteeism across sectors. Employers alone lose tens of billions each year due to stress-related productivity loss.

At the same time, the UK’s demographic reality is increasingly multicultural. Migrant and minority communities often experience higher barriers to care and poorer outcomes, not because services are unavailable, but because they are culturally misaligned. If the UK is serious about building inclusive, AI-enabled mental health systems, it cannot rely solely on domestically generated perspectives. It needs global insight, contextual intelligence, and models that reflect real diversity.

Designing a mental health future that reflects real lives

Global mental health needs a reset. Care must move upstream toward prevention and early detection. AI must be designed with context at its core, not retrofitted after harm appears. And digital systems must reflect how people actually live, speak, cope, and connect.

The future of mental healthcare is not just more technology. It is better understanding. As digital mental health continues to evolve, the most effective systems will be those that listen first, learn locally, and scale responsibly.

Context is not a limitation on innovation. It is the key to making innovation work.


Oluwabukunmi Victor Babatunde is a technology builder and the founder of Sane, where he develops AI-driven systems for early behavioural risk detection. His work explores the critical role cultural context and social norms play in the effectiveness of mental health interventions, focusing on how AI models must adapt to local environments.

As a dedicated advocate for AI literacy across Africa, Victor builds ecosystems rooted in trust and community relevance, grounded in the belief that mental healthcare must be designed collaboratively to reflect the specific lived realities of its users.
