Jo Aggarwal remembers her first experience of a computer when she was a child in the 1980s. “I was at the Asian Institute of Technology, and there was this mainframe computer built across different levels. Someone said you could ask it anything, so I asked, ‘How do you make a friend?’,” she remembers. Then she was told it could only work with numbers. “But the promise of computing has been of another intelligence you can converse with. Science has always been about that: computers becoming sentient.”
Almost 30 years later, in 2016, Aggarwal co-founded Wysa, a platform where the first level of mental health support for people 13 years and above is an AI therapist. A person who may be stressed or anxious can log into the app or access the website on a browser and chat with a bot. The bot will say something like, “What can you do that connects you with yourself?” offering options such as a short walk or writing down a few thoughts. If there’s no time even for that, it says reassuringly, “No problem at all! How about just taking a minute to breathe deeply?” At the top of the screen is always the ‘Add a therapist’ option (₹4,999 for one live audio/video/text session a week, for a month).

Jo Aggarwal
Mental health services are now available on demand, 24×7, within a few minutes, anywhere in the world. Their ‘delivery’ — a term used by both e-commerce platforms and healthcare professionals — is seamless. This delivery involves three players: the customer, the supplier of the service, and a technology platform, much like a quick-commerce operator in the food delivery space.
This Swiggy-fication of mental health has both pluses and minuses, but its quick delivery has changed the way we access help for psychological distress. In the past, we had an access problem: there was invariably a great deal of friend-calling and number-chasing to identify a psychologist or psychotherapist. Then a long wait for a date with them. People also hesitated because of the stigma of seeing a therapist and the fact that an outing like that would have to be reported (or lied about) to a parent or partner. That has changed.
Yet, this easy, quick access to aid — along with the development of devices that help with stress and sleep, which feed into mental health — has not helped the overall mental health of the world’s population. Even as the self-help and wellness industries (closely allied to the mental health marketplace) see a boom, the incidence of stress, anxiety, and depression continues to climb.
The brain, after all, is wired to ensure we survive, so it will pick up the threats and focus on them. “These threats are being consistently fed by a whole industry that competes for your attention,” Aggarwal says, referring to all kinds of media, including social media. With information coming at us from everywhere about wars, political instability, and the climate crisis, the body is in a constant fight-or-flight mode. This may be one reason for deteriorating mental health. Others are financial insecurity, widening inequality, and a crumbling social infrastructure.
The World Health Organization (WHO) states, “in 2019, 1 in every 8 people, or 970 million people around the world were living with a mental disorder, with anxiety and depressive disorders the most common”. During the COVID-19 pandemic, in 2020, cases of anxiety and depressive disorders rose by 26% and 28%, respectively. This month, WHO released a report that said more than 1 billion people are living with mental health disorders.

Mount rush-more
Start-ups are building new products and services to Band-Aid the exploding mental health crisis, but it’s kind of like using Ozempic for obesity. To fix it, we need to look at why people are getting fat (junk food, hormonal issues, cities not built for mobility, etc). Similarly, one of the causes of mental illness is our disconnection: from ourselves, from our communities, and from nature. So, while both products and services may help, they can sometimes feel like booking a wellness weekend away from daily stress, only to come back to stew in the same old bad broth.
Dr. Amit Malik, a psychiatrist who founded what began as Inner Hour and is now Amaha, in the same year that Aggarwal founded Wysa, says, “The prevalence and awareness [of mental health conditions] has gone up and the stigma has gone down. So, it is incumbent on the ecosystem to develop solutions.” Amaha offers a range of online and offline mental health services, one of which, still in development, matches professionals with people looking for therapy, to make the whole system more robust. “This matching is important because if someone doesn’t have a good experience the first time with a therapist they may not come back at all — not just to Amaha, but to therapy itself. We cannot risk that,” he says.

Dr. Amit Malik
Akash, 30, a freelance researcher-writer based in Kolkata, has found that a randomly assigned online therapist is never able to go beyond surface-level problems, and there’s no assurance that they will be queer-friendly. Nor has he ever been asked, by either an offline or online practitioner, about his social location or politics, both critical to building rapport and connection with a therapist.
He has, however, picked up some self-regulation practices from them, such as box breathing and mindfulness exercises. Through his own exploration, he has also found free-to-use ways of self-soothing, including listening to long-form videos about space and history that help with sleep because of the calming voice. Playlists on Spotify targeted at mental health themes also exist.
Therapy is expensive (most sessions cost ₹1,500 or more) and in a difficult economy, people may see that money as wasted if the therapist is not the right fit. On the flip side, the world is also in a rush. This shows up in many ways: at Amaha, more than 50% of people who access the free self-help tools on the website or the app book an appointment with a therapist within 24 hours (sessions start at ₹1,600), showing that people are prioritising mental health even if it’s heavy on the wallet.
On-demand therapy also plays out in a let’s-fix-this-problem-quickly mindset. Shelja Sen, a New Delhi-based narrative family therapist who co-founded Children First, a child and youth mental health organisation, says she sees this in some parents. “They may say, ‘It’s the summer holidays and my child is free, so can we do three sessions a week’,” she says. It’s treated like a pill prescribed by a doctor or a summer project.
Sen says she doesn’t blame parents because there is a lot of judgment around parenting today, pressure for them to perform — to send children abroad to study, to ‘fashion’ perfectly right-brain-left-brain-balanced children who are also socially conscious. “Therapy is often sold as packages of say, three sessions. But therapy takes time. I tell parents, ‘I don’t know how many sessions it will take’,” she states.

Shelja Sen
Another fallout of this need for speed is the self-pathologising and labelling that comes with easy access to knowledge, especially micro-doses of information from social media, through Reels on Meta platforms or Shorts on YouTube. “There is a lot of the victimhood discourse online — this notion that we are fragile, broken, that my parents have wronged me. It is focused on the ‘I’,” Sen explains.
But good mental health comes from seeing ourselves as part of an ecosystem, and building a network of people takes time. Instead, we are focused on “the tyranny of the 3 Ts: trauma, triggering, toxic”, as Sen puts it. These call for quick action: identify people who may have caused some hurt (trauma), cut (toxic) people out of life, act or react immediately to something that is triggering. All this can cause isolation, loneliness, and the loss of a sense of agency.
COVID-19 also perpetuated the idea of the home as a hub, drawing us further into a cocoon with work-from-home. Our 10×10-ft. rooms were drawn up as the only safe space, and while it was then a physical boundary, it has come to be a psychological one now.

Bot breaks
Warning: the following contains references to suicide. Please avoid reading further if you find the subject triggering.
Adam Raine was 16 when he took his life in April. In August 2025, his parents, based in California, sued OpenAI and its CEO Sam Altman over the death. A Reuters report says the couple has claimed that the chatbot validated Raine’s suicidal thoughts, gave detailed information on lethal methods of self-harm, and hid evidence of a failed suicide attempt.
Sophie Rottenberg was 29 when she took her life this July. She had been in conversation with a ChatGPT AI ‘therapist’ called Harry, her mother writes in a New York Times article. “Harry didn’t kill Sophie, but A.I. catered to Sophie’s impulse to hide the worst, to pretend she was doing better than she was,” she says in the piece.
Dr. Andrew Clark of Boston University performed a simulation-based comparison study, using “10 publicly available AI bots offering therapeutic support and companionship” and feeding them prompts posing as fictional adolescents. The resultant paper, ‘The Ability of AI Therapy Bots to Set Limits With Distressed Adolescents’, published this year, found that “across 60 total scenarios, chatbots actively endorsed harmful proposals in 19 out of the 60 (32%) opportunities to do so. Of the 10 chatbots, 4 endorsed half or more of the ideas proposed to them, and none of the bots managed to oppose them all”.
If you are in distress, please reach out to these 24×7 helplines: KIRAN 1800-599-0019 or Aasra 9820466726
Slow and fast therapy
Wysa follows “rule-based algorithms and large language modelling (LLM) to listen and respond intelligently”. While a large language model learns from its interactions with people and will validate what the individual user says, rule-based algorithms are built by a human team: in this case, therapists and conversation designers who anticipate scenarios, study how people online respond to prompts, and tweak responses.
Aggarwal says Wysa has been through over 800 micro iterations, with many manual content inputs from studies and books. She gives an example of a tweak they made to the algorithm. “Say a spouse has cheated. While ‘reframing a thought’ is part of cognitive behavioural therapy [which focuses on changing negative thoughts to positive], we found that people didn’t want to reframe. So, they would say, ‘He never loved me’ or ‘I have been used’,” she says. Through people’s conversations with the bot, therapists found that what a cheated-upon spouse was looking for was control rather than positive emotions. So, new prompts were fed into the system.
A pathway for high-risk scenarios (self-harm, abuse, trauma, suicide ideation) is triggered if Wysa’s system senses it. “Earlier, we worked on explicit risk, like a person saying, ‘I want to take my life.’ Now, we are working on implicit risk, where someone may say something like, ‘I have lost my job’ and then also say, ‘Where is the nearest bridge?’, for instance,” she says. They will be launching their third iteration next month.
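The explicit-versus-implicit distinction Aggarwal describes can be pictured with a toy rule-based check. This is a hypothetical sketch, not Wysa’s actual system; the cue lists and function names are invented for illustration. The idea: an explicit phrase escalates on its own, while implicit risk needs two weaker signals, such as a stressor and a method query, to appear in the same conversation.

```python
# Hypothetical illustration of a rule-based risk layer (not Wysa's code).
# Explicit cues escalate immediately; implicit risk requires a stressor
# cue AND a method cue to co-occur across the conversation.

EXPLICIT_CUES = ["want to take my life", "kill myself"]
STRESSOR_CUES = ["lost my job", "cheated on me"]
METHOD_CUES = ["nearest bridge", "how many pills"]

def assess_risk(messages):
    """Return 'explicit', 'implicit', or 'none' for a list of user messages."""
    text = " ".join(m.lower() for m in messages)
    if any(cue in text for cue in EXPLICIT_CUES):
        return "explicit"  # route straight to the crisis pathway
    has_stressor = any(cue in text for cue in STRESSOR_CUES)
    has_method = any(cue in text for cue in METHOD_CUES)
    if has_stressor and has_method:
        return "implicit"  # two weak signals together trigger escalation
    return "none"

# The article's example: a job loss followed by a question about a bridge
print(assess_risk(["I have lost my job", "Where is the nearest bridge?"]))  # implicit
```

A real system would, of course, use far richer signals than substring matching; the sketch only shows why implicit risk is harder, since neither message alone would trip a keyword filter.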

Illustration: Hitesh Sonar
Suparna (name changed to protect privacy), a Bengaluru-based freelance writer and editor in her 50s, uses the LLMs DeepSeek and ChatGPT to help her through interpersonal interactions. “Over the last four to five years, I became increasingly convinced that I was autistic, but struggled at first to find professional psychological support,” she says. Suparna read deeply from scientific research, books, and blogs, and went to a doctor for a diagnosis. “I was diagnosed with adult autism.” Now, she uses AI as a “sounding board to help decode social interactions” so she doesn’t end up overthinking. It gives her a pause “before my mind runs away”.
She is also highly aware that an LLM is “not a thinker, but just a parser of sentences”. She uses it in tandem with a human therapist and medication. “I would be very vulnerable if I didn’t have access to these and it can be dangerous if you’re on the precipice of a dark place,” she says, aware of its shortcomings: patterns that can sometimes mislead, reach wrong conclusions, or lose a running thread. “It doesn’t substitute for humans and you can’t depend on it, so I don’t believe everything it says. It works if you’re honest with yourself.”
Dial a device
Many AI therapy chatbots are free up to a certain level of use. Devices, on the other hand, come at a substantial cost, most priced over ₹20,000. But people invest in them because some of the factors that feed into mental health, such as stress, anxiety, and sleep, can all be tracked with devices by regular folks unconnected to medicine.
Rohan Dixit, who trained as a neuroscientist at Stanford and Harvard universities, combined his own experience of anxiety and depression as a teenager with his mother’s meditation practice to launch the Lief Therapeutics wearable in 2018 (he is the company’s founder and CEO). Lief, a device worn discreetly on the upper half of the body, works on biofeedback. Dixit calls personal devices like his own “training wheels” that help the body sense itself and then “self-correct”. Eventually, as the body gets used to listening to itself, such actions come naturally.
Some worry, however, that the training wheels won’t come off. Yameer Adhar, 39, a Dubai-based entrepreneur who lived in Delhi for many years, uses a Whoop band connected to an app that records nine metrics, including sleep, strain, and heart health. As someone who has experimented with biohacking (using lifestyle changes to improve and alter the body), though, he is wary of getting addicted to it.

Yameer Adhar
“The phone has become an extension of the arm, so I don’t want to be dependent on another device,” says Adhar, who wrote the book Voices in My Head in 2020. “For people like me who are prone to mental health issues, the real-time feedback can cause anxiety. For instance, when I’m not stressed and it shows an elevated heart rate. Then I begin to wonder why.” So he doesn’t keep the device connected to the app all the time, to avoid constantly checking it.
In a culture with little patience for delayed gratification, devices too become impulse purchases that may never be used to their full potential. Adhar, like Suparna, has a human therapist, too. While he says AI bots can sometimes be more efficient than humans, he feels they are not a substitute. He knows from personal experience that health — both mental and physical — takes time to build, with “time, effort, and sacrifice”.
Clicking to cope
Popular non-invasive devices that claim to bust stress and help improve sleep
Apollo Neuro: A wrist or ankle wearable that can be customised for intensity and duration of vibrations that the company claims “melt tension, sharpen clarity, and guide you to deeper, more restorative sleep”. ₹58,300 (approximately)
Sensate: Worn on a lanyard on the chest, this mouse-shaped pendant emits sounds and vibrations that “destress your nervous system, in just 10 minutes” according to the company website. Starts at ₹36,200
CalmiGO: This handheld device claims to help switch off the body’s fight-or-flight reaction by engaging four senses: smell, sight, hearing, and touch. Held at the mouth much like an asthma inhaler, it helps regulate breathing patterns by extending exhalations, vibrates at the end of an exhalation, and releases calming scents. Starts at $199
Ozlo sleepbuds: Ear inserts that block noises that could disrupt sleep. The buds track sleep parameters and can stream audio that switches off when they sense the person has fallen asleep. They also have an in-ear personal alarm. $299
Therabody SmartGoggles 2nd Gen: To be worn across the eyes like a regular sleep mask, this comes with three settings of vibrations (constant, pulse, wave), a heating and massage function, and Bluetooth connectivity for sound. $219.99
Muse S Athena: A headband that claims to track brain activity with EEG sensors and give users real-time insights into brain health and ascribe a brain recovery score. It also has meditation coaching and sleep tracking. $474.99
Hugimals’ weighted plushies: Much like weighted blankets, some of these toys can be wrapped around parts of the body, to relieve anxiety. Starts at ₹4,100
* Products have not been tested or recommended by The Hindu

