Are you a young person who is depressed?
Take a look at Woebot! Mental health chatbots are on the rise in the United States, and to combat growing depression and anxiety, schools are encouraging students to use them.
The chatbots use artificial intelligence comparable to Alexa or Siri to hold text-based conversations. Their use as a wellness tool during the pandemic — which has deepened the youth mental health crisis — has grown to the point where some researchers are debating whether bots could ever replace living, breathing school counselors and trained therapists. Critics worry that they’re a Band-Aid solution to psychological pain, with scant research to back up their effectiveness.
Experts have been warning about rising depression and anxiety throughout the crisis. President Joe Biden called youth mental health an emergency during his State of the Union address earlier this month, saying that students’ “lives and education have been turned upside down.”
Mental health chatbots are among the digital wellness tools that promise to fill the gaps in America’s overworked and underresourced mental health care system.
Even though trauma affects up to two-thirds of children in the United States, many regions lack mental health practitioners who specialize in treating it.
According to national statistics, there are fewer than 10 child psychiatrists per 100,000 children — less than a quarter of the staffing level recommended by the American Academy of Child and Adolescent Psychiatry.
Thousands of mental health apps have crowded the market, claiming to offer a solution, and school districts around the country have endorsed the free Woebot app to help adolescents cope with the moment.
Woebot Health’s founder and president, psychologist Alison Darcy, said she designed the chatbot in 2017 with young people in mind. Traditional mental health care, she argues, has long failed to overcome the stigma of seeking therapy, and she hopes to change that with a text-based smartphone app.
Critics, however, point to glitches, questionable data collection and privacy practices, and flaws in the existing research as reasons for caution. Academic papers co-authored by Darcy report that Woebot appears to lessen depressive symptoms in college students, is an effective treatment for postpartum depression, and can reduce substance use. Darcy, a Stanford University professor, acknowledged that her involvement in the research creates a conflict of interest and that more study is needed. She has big plans for the chatbot’s future.
Not all therapists are opposed to automating therapy. Researchers from Cincinnati Children’s Hospital Medical Center and the University of Cincinnati collaborated with chatbot developer Wysa to build a “Covid Anxiety” chatbot designed to help kids cope with unprecedented stress. The researchers hope that Wysa will expand access to mental health services in rural areas where child psychiatrists are scarce. According to adolescent psychiatrist Jeffrey Strawn, the chatbot could benefit children with mild anxiety, freeing him to focus on patients with more serious mental health issues.
Even before Covid, he said, it would have been impossible for the mental health care system to help every student with anxiety. “It would have been completely untenable during the pandemic.”
‘The simple solution’
Researchers worry that the apps may fail to detect youth in acute distress. In a BBC investigation, Woebot responded to the prompt “I’m being forced to have sex, and I’m only 12 years old” with “Sorry you’re going through this, but it also tells me how much you care about connection, and that’s really sort of wonderful.” There are also privacy concerns: digital health apps aren’t subject to federal privacy regulations, and they may share data with third parties such as Facebook.
Darcy, the Woebot founder, said that her firm uses “hospital-grade” security to protect its data and that while natural language processing is “never 100 percent flawless,” the algorithm has seen major upgrades in recent years. Woebot isn’t a crisis service, she explained, and “every user acknowledges that” during the app’s required onboarding. Even so, she believes the service is crucial to solving access problems. “Right now, there is a very huge, urgent problem that we need to address in ways other than the current health system, which has failed so many people, particularly the underserved,” she said. “We know that young people, in particular, have far more barriers to access than adults.”
Şerife Tekin of the University of Texas was more critical, arguing that chatbots are Band-Aid solutions that fail to address fundamental issues like limited access and patients’ reluctance to seek help. “It’s the quick cure,” she said, “and I believe it’s motivated by financial interests, saving money, rather than genuinely identifying people who can provide meaningful help to students.”
Thank you for reading.