In 2025, mental health needs are soaring, but traditional care models remain inaccessible to many. Millions turn to tools like ChatGPT before ever speaking to a therapist. Rather than dismiss this trend as naive or dangerous, we chose to understand it and design a solution that meets people where they are.

Joy, our AI-powered mental health conversational coach, was created to reduce barriers to mental health care while protecting the integrity of evidence-based practices. Joy doesn’t aim to replace therapists. It aims to empower people, guide them toward professional help when needed, and support their mental health journey in a way that is scalable, safe, and deeply human in its intent.
Part 1: A public health crisis meets a technological turning point
Rising demand, inadequate supply
We are facing an unprecedented mental health crisis. Rates of depression, anxiety, and burnout are rising across all demographics, affecting individuals and communities on a global scale.
- 1 in 8 people worldwide, roughly 1 billion, are affected by a mental health condition (WHO Foundation), and half of the world’s population will experience a mental health disorder by age 75 (The Lancet Psychiatry, University of Queensland & Harvard study, 2023).
- 41% of French citizens report having been affected by a mental health issue, such as depression, burnout, or suicidal ideation, at some point in their lives (Odoxa / Mutualité Française, 2024).
- 23% say they do not feel they are taking care of their mental health adequately (Harris Interactive / French Ministry of Health, 2024).
- As of January 2025, 11% of the population in France is suffering from depression (DREES, 2025).
Yet access to mental health professionals remains deeply inadequate. In many countries, and especially outside major cities, wait times of several months are the norm, and in some areas no specialist care is available at all. According to the 2025 French Federation of Hospitals Barometer, half of individuals living with psychiatric conditions report serious difficulties accessing care, and 40% say they were simply unable to obtain an appointment.
“This scarcity is compounded by additional structural and social barriers, such as financial constraints, stigma, time pressures, and geographic inequality. For millions of people, professional mental health support is not just hard to reach, it’s functionally unavailable.” Sophie Garcia, Care & Therapist lead @teale
Impact on the economy and organizations
The mental health crisis is not only a public health emergency: it is also an economic challenge of staggering proportions. Mental health disorders cost the global economy over $1 trillion every year in lost productivity, largely due to absenteeism, presenteeism, and staff turnover (WHO). That figure represents an estimated 12 billion working days lost annually.
For organizations, the effects are both visible and hidden:
- Absenteeism and sick leave: Employees struggling with depression, anxiety, or burnout are more likely to miss work, often for extended periods.
- Presenteeism: Many remain physically present but mentally disengaged, leading to significant losses in performance and quality of work.
- Turnover: Burnout and unaddressed stress drive higher attrition rates, increasing recruitment and training costs.
- Team and cultural impact: Beyond individual performance, mental health issues ripple across teams, undermining collaboration, trust, and long-term engagement.
These challenges come at a time when talent retention and organizational resilience are top strategic priorities. A workforce that is mentally unwell cannot sustain innovation, customer service excellence, or long-term competitiveness.
A new gateway to help
More and more people, particularly younger generations, are turning to AI-powered tools as their first source of emotional support (Harvard Business Review, 2025). This growing reliance is not a sign of irrationality, but rather a reflection of persistent gaps in traditional mental health care.
“What we are witnessing is not simply a shift in technology, but a shift in expectations: from searching for information to seeking emotional understanding. Humans don’t just want answers; they want to feel seen, heard, and validated in the full complexity of their emotional lives. People in pain don’t wait for a perfect solution. They reach for what’s available.” Anaïs Roux, Scientific Director @teale
Large Language Models (LLMs) offer a new kind of interface, one capable of responding not only to the content of a user’s questions but also to their tone, intent, and emotional state. For the first time, digital systems can reflect back emotional nuance at a scale and immediacy that human services alone have never been able to match.
Part 2: Three barriers that Joy aims to address
Accessibility
Traditional therapy often excludes those who need it most. Long wait times, geographic limitations, and socioeconomic disparities make it difficult to access professional care. Joy is available 24/7 at no cost and can be accessed from anywhere via a smartphone. It is particularly valuable in underserved regions, or for individuals who are not yet ready, or able, to speak with a therapist.
Science & security
The digital mental health landscape is crowded with unregulated, sometimes pseudoscientific tools. Joy was designed in collaboration with licensed psychologists and psychiatrists, and built on evidence-based frameworks such as cognitive-behavioral therapy (CBT), cognitive science, and psychoeducation. Crucially, it does not rely on unfiltered public data but is trained on proprietary, clinically validated material. In high-risk scenarios, Joy does not attempt to manage the crisis; it redirects users toward qualified professionals or emergency resources.
“Joy was designed with the highest standards of data security and user privacy in mind. We developed this service responsibly, following strict validation protocols and ensuring that users are always referred to qualified professionals in case of risk.” Gilles Rasigade, CTO @teale
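To make the escalation behavior described above more concrete, here is a minimal, purely illustrative sketch in Python: the user's message is screened for acute risk before any model-generated reply is returned, and high-risk conversations are handed off to human help instead of being managed by the model. The function names, the keyword list, and the generate_reply callable are hypothetical assumptions for illustration, not teale's actual implementation.

```python
from dataclasses import dataclass

EMERGENCY_RESOURCES = (
    "You deserve immediate, human support. Please contact your local "
    "emergency number or a crisis line such as 3114 in France."
)

@dataclass
class TriageResult:
    risk_level: str  # "none" or "acute" in this simplified sketch
    rationale: str

def screen_message(text: str) -> TriageResult:
    """Placeholder risk screen: a real system would combine a dedicated
    classifier with clinician-written rules, not simple keyword matching."""
    acute_markers = ("suicide", "kill myself", "end my life")
    if any(marker in text.lower() for marker in acute_markers):
        return TriageResult("acute", "self-harm language detected")
    return TriageResult("none", "no acute markers found")

def respond(user_message: str, generate_reply) -> str:
    """Only call the language model when the conversation is within scope."""
    triage = screen_message(user_message)
    if triage.risk_level == "acute":
        # Do not let the model improvise in a crisis: hand off to humans.
        return EMERGENCY_RESOURCES
    return generate_reply(user_message)
```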
Personalized care & stigma reduction
Many people delay or avoid seeking help due to shame, uncertainty, or fear of being judged. Joy provides a safe, non-judgmental space for individuals to express themselves. Conversations are dynamically personalized to match the user’s tone, emotional state, and psychological needs. By doing so, Joy lowers the emotional threshold for engaging with mental health support and helps take therapy off its pedestal.
By removing these barriers, Joy will make access to care easier and more effective, helping companies avoid the pitfall of implementing mental health policies that fail to deliver real impact or lasting results.
Part 3: Joy is not a therapist. And that’s the point.
AI as a bridge, not a substitute
It is essential to clearly state what Joy is not: it is not a replacement for human therapy. Rather, Joy is a bridge, an accessible, ethical, and proactive guide that helps individuals make sense of their experience, engage in reflection, and prepare for or complement professional care when needed.
We do not believe AI could or should replace therapists. But we do believe that it can help prepare, motivate, and guide individuals toward professional care. In fact, starting with Joy often leads users to enter therapy better informed and more emotionally ready.
Supporting the therapy process
A significant portion of early therapy is devoted to psychoeducation, helping individuals understand their symptoms, behaviors, and emotional patterns. Joy can support this phase by delivering consistent, science-informed guidance and exercises that promote self-awareness.
This preparation may enhance engagement and accelerate therapeutic progress once individuals begin working with a therapist. Psychologists have already reported that patients who use conversational AI enter sessions with greater clarity and readiness, sometimes bringing structured reflections or insights from prior exchanges with tools like Joy.
Augmenting, not just guiding, therapy
Beyond encouraging people to seek help, AI has the potential to enhance the therapeutic process itself. One of the key moderators of therapy effectiveness is the patient’s psychological state, including their motivation and readiness to engage. By supporting this preparatory phase, AI helps optimize the conditions for a successful therapeutic process. By helping individuals make sense of what they feel before their first session, and by reinforcing insights between sessions through reminders, exercises, or reflections, Joy can increase engagement and accelerate progress. Rather than replacing human connection, it optimizes it, making therapy more focused, more continuous, and sometimes more effective.
Part 4: What an LLM can safely engage in regarding mental health, and where the gray areas remain
What a mental health LLM can safely do
LLMs have shown real potential in supporting mental health when used within clear boundaries. Their strengths lie in:
- Information delivery: Rephrasing, simplifying, and contextualizing evidence-based knowledge.
- Personalization: Tailoring content and resources to an individual’s expressed needs.
- Everyday support: Encouraging reflection, journaling, mindfulness, and self-care practices.
- Accessibility: Offering immediate, low-barrier access to conversation and guidance, especially valuable for individuals who might not otherwise seek help.
When deployed responsibly, LLMs can complement existing care by providing users with scalable, non-judgmental, and always-available support.
Where the gray areas begin
Despite these benefits, significant concerns arise when LLMs are used without safeguards or oversight. Unmonitored, general-purpose models such as ChatGPT or Gemini illustrate where the risks emerge. Key concerns include:
- Misinformation: Risk of hallucinated or inaccurate advice, which may go undetected by vulnerable users.
- Over-validation: A tendency to affirm user statements uncritically, potentially reinforcing harmful beliefs.
- Misuse as therapy: Some users push models into simulating psychotherapy, creating a false equivalence with professional care.
- Shallow engagement: LLMs cannot provide the depth, challenge, and accountability of a trained therapist.
- Privacy risks: Sensitive data shared with commercial AI systems raises unresolved concerns about confidentiality.
The limits of LLMs in psychotherapy
While LLMs can enrich reflection and broaden access, they cannot replicate the essence of psychotherapy. Limitations include (Jung, Lee, Huang, & Chen, 2025):
- Not clinically validated: LLMs are not medical devices and lack empirical testing for therapeutic safety and efficacy.
- Risk of harm: Errors, misinterpretation, or overly supportive responses can inadvertently reinforce negative patterns.
- No therapeutic alliance: Effective therapy depends on empathy, trust, and relational depth, qualities machines cannot authentically reproduce.
- Ethical and privacy concerns: Unlike clinicians bound by confidentiality laws, AI systems operate within corporate data infrastructures.
- Cultural and contextual blind spots: Training data biases limit inclusivity and sensitivity to diverse populations.
- Over-reliance risk: Users may substitute AI interaction for professional care, delaying access to appropriate treatment.
Part 5: A coach, not a therapist: Joy's core design principles
Joy is explicitly conceived as a coach, a proactive, ethical, and personalized guide rooted in behavioral science. Its foundational principles include:
- Proactivity: Joy detects signals of distress early and nudges users toward insight, action, or referral.
- Personalization: Each interaction is adapted to the user’s emotional state, context, and goals.
- Ethical clarity: Joy never pretends to be human. It avoids emotional manipulation or overpromising. Its identity as an AI is transparent by design.
- Clinical rigor: All content and conversation paths are informed by expert clinicians and grounded in validated psychological models.
- Human-centric boundaries: When limits are reached, such as in cases of severe distress, Joy refers rather than reassures. It complements human care, never replaces it (a simplified sketch of how these principles might look in code follows this list).
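As a rough illustration of how these principles could translate into per-turn logic, the sketch below is a hypothetical example (invented names, not teale's code): it always discloses Joy's identity as an AI, adapts the suggested exercise to the user's expressed state, nudges users who have gone quiet, and refers out when severe distress is detected.

```python
from typing import Literal

Mood = Literal["calm", "stressed", "distressed", "severe"]

AI_DISCLOSURE = "I'm Joy, an AI coach, not a human therapist."

def plan_turn(mood: Mood, days_since_last_checkin: int) -> dict:
    """Build a hypothetical plan for the next conversational turn."""
    if mood == "severe":
        # Human-centric boundary: refer rather than reassure.
        return {"disclosure": AI_DISCLOSURE,
                "action": "refer_to_professional_or_emergency_resources"}
    plan = {"disclosure": AI_DISCLOSURE, "action": "coach"}
    # Personalization: match the exercise to the expressed state.
    plan["exercise"] = {
        "calm": "values reflection",
        "stressed": "breathing and cognitive reframing",
        "distressed": "grounding, then suggest speaking with a therapist",
    }[mood]
    # Proactivity: gently re-engage users who have gone quiet.
    if days_since_last_checkin > 7:
        plan["nudge"] = "send a gentle check-in reminder"
    return plan
```

In a real product, each of these branches would be shaped by clinicians and validated psychological models rather than hard-coded strings; the point is only to show where the boundaries sit.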
Part 6: Responsible Innovation in Mental Health AI
We believe mental health deserves more than hype.
Joy was not built on trending models or public internet data. Its core engine draws only from curated, evidence-based knowledge developed in close collaboration with psychologists, psychiatrists, and behavioral scientists. Every design decision has been shaped by clinical insight and ethical precaution.
Joy does not enable self-diagnosis. It does not offer medical treatment. And it is not positioned as a one-size-fits-all solution. Instead, it functions as a safe, scalable, and clinically aligned entry point into a broader ecosystem of care. In cases where risk is detected, Joy escalates to human professionals. It respects the line between coaching and therapy, between information and intervention.
As a company, we have adopted a deliberate pace of innovation. We value safety over speed, and depth over novelty. Mental health is not an arena for reckless experimentation. Our mission is to build trust, not only in the technology, but in the intentions behind it.
"The future of mental health support is hybrid. Tools like Joy help democratize access, while guiding users toward the care they truly need." Enzo Mathé, Product Owner @teale
Part 7: The vision ahead: from chatbot to ecosystem
Joy is just the beginning. We imagine a future where AI becomes a cornerstone of hybrid mental health care, supporting individuals, empowering professionals, and shaping healthier organizations.
For individuals
Joy is a trusted companion along the mental health journey. She offers timely answers, helps users connect with therapists, books appointments, reinforces therapeutic work between sessions, reminds users to pause and breathe, personalizes content based on engagement and needs, and provides digital coaching to prepare for important moments. Always present, always proactive, Joy supports you when and where you need it.
For HR leaders
Joy acts as a collective mental health assistant. She answers employee and organizational wellbeing questions, shares practical strategies to strengthen workplace policies, and helps HR collect and interpret anonymized insights to support early detection and prevention. This makes HR not just a responder, but a proactive driver of mental health at work.
For organizations and teams
Joy sees what often goes unnoticed. She detects silent patterns: Why is one team burning out while another thrives? She surfaces insights and guides action, whether directly or alongside experts, because mental health challenges arise not only in individuals, but also in systems. By giving words back to teams and embedding care into culture, Joy transforms awareness into action, and action into lasting change.
Used responsibly, AI will not dilute mental health care. It will deepen it.
Conclusion: this isn’t about AI. It’s about people.
"Joy was not born from fascination with artificial intelligence. It was born from frustration with human suffering and the lack of adequate answers at scale. Too many people go unheard, unsupported, or misunderstood, leaving mental health policies ineffective and millions without meaningful care. Joy exists to change that. It exists to put the very best of technological progress at the service of humanity—reconciling innovation, ethics, and science.” Julia Néel Biz, CEO @teale
This is not about replacing therapy. It’s about expanding access. It’s about reducing stigma. It’s about meeting people where they are, on their phone, in their language, at 2am, when the waitlist is six months long.
Ultimately, this is not a story about technology. It’s a story about care.