Should we be using generative AI such as ChatGPT to get mental health advice? Some say this is a slippery slope and an endangerment. (Image credit: Getty)
Mental health has become a widespread topic nowadays. In the past, discussions concerning mental health were often hushed up or altogether swept under the rug. A gradual cultural change has led to mental health issues being considered openly and has eased qualms about doing so in publicly acknowledged ways. Some of the credit for this shift in overarching societal attitudes might go to the advent of easily accessed smartphone apps that aid your personal mindfulness and presumably spur you toward mental well-being. There are apps for mindfulness, for meditation, for diagnosing your mental health status, for mental health screening, and so on. A plethora of apps exist.

Can we say that smartphone apps overtly led to openness about mental health? It is admittedly a bit of a chicken-or-egg question. Did the openness toward mental health allow for the emergence of relevant smartphone apps, or did mental well-being smartphone apps drive society toward being upfront about mental health? Maybe it was an interweaving combination, with both directions happening at the same time.
In any case, into this potent mix comes the rise of mental health apps that are said to be powered by Artificial Intelligence (AI). The idea is that the underlying technology can be improved via the (presumably) judicious use of AI. Whereas initial versions of mental health apps were predominantly fact-based informational deliveries, as though you were doing an online search on said topics, the infusion of AI has led to automation undertaking interactive dialogues with you, akin to texting with a human therapist or the like (well, kind of, as I will be addressing and scrutinizing here).
This takes us to the latest headline-grabbing AI that has recently garnered national and international attention, namely the use of what is formally known as Generative AI and widely popularized via the app known as ChatGPT. For clarification, ChatGPT is a general-purpose AI interactive system, essentially a general chatbot; nonetheless, it is actively and avidly being used by people who seek specifically to glean mental health advice (the app wasn't made for that purpose, and yet people have decided they want to use it anyway in that role). For my recent coverage of ChatGPT, see the link here for an overview. I also did some follow-ups about how ChatGPT is worrying teachers over students possibly cheating by using AI to write their essays, see the link here, and I did a seasonally flavored look in my Santa-related analysis at the link here. Don't worry, I'll be explaining herein what Generative AI and ChatGPT are all about, doing so momentarily, so please hang in there.
If you take a look at social media, you will see people proclaiming ChatGPT and generative AI as the best thing since sliced bread. Some suggest that this is in fact sentient AI (nope, they are wrong!). Others worry that people are getting ahead of themselves. They are seeing what they want to see. They have taken a shiny new toy and shown exactly why we can't have nice things. Those in AI Ethics and AI Law are soberly and seriously worried about this burgeoning trend, and rightfully so. We will herein take a close look at how people are using generative AI for uses that aren't especially suitable for what AI can really achieve today. All manner of AI ethical and AI legal issues are indubitably wrapped into the whole conundrum. For my ongoing and extensive coverage of AI Ethics and AI Law, see the link here and the link here, just to name a few.
First, let's consider some important facets of mental health and why this is a very big and essential topic. After laying that foundation, we'll do a quick explainer about generative AI and especially ChatGPT. I'll include examples from ChatGPT so that you can see with your own eyes the type of verbiage that the AI app is able to produce. We'll conclude this discussion with some comments about what this all means and how AI Ethics and AI Law are inevitably going to step into the picture.
Fasten your seatbelt for quite a ride.
Mental Health Is A Vital And Growing Societal Concern
According to various published statistics, there is a dark and gloomy cloud overhead concerning today's mental health status. I don't want to seem to be glum about this, but we might as well face up to the reality confronting us. Hiding our heads in the sand won't work. We'll be better off approaching the matter with eyes open and a willingness to solve thorny problems.
Here are some noteworthy stats that were collected by a prominent mental health organization about Americans and the mental health landscape (per Mental Health America, '2023 Key Findings'):

- Adults widely experience mental illness. About 21% of adults reported experiencing a mental illness, roughly the equivalent of saying that approximately 50 million adults in the U.S. have experienced this.
- Lack of mental health treatment is widespread. Slightly more than half of adults with a mental illness are not getting treatment (approximately 55%), meaning perhaps around 28 million adults aren't getting needed mental health treatment.
- Youths are impacted too. Around one in ten youths in the U.S. have said that they experienced severely impairing depression that affected their schoolwork, home life, family interactions, and/or social life.
- Mental health treatment for youths is lacking. Less than one-third of youths who have severe depression receive consistent treatment (only about 28% do), and over half do not get any mental health care at all (an estimated 57%).
- Sparsity of mental health providers. One reported figure is that there are an estimated 350 individuals in the U.S. for every one mental health provider, suggesting a paucity of available qualified mental health professionals and therapists for the population all told.
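As a quick sanity check, the headline percentages do roughly reconcile with the headcounts cited above. The U.S. adult population figure used below (~240 million) is my own assumed round number for illustration, not a figure from the report:

```python
# Rough arithmetic behind the cited statistics; the U.S. adult
# population is an assumed round number for illustration only.
US_ADULTS = 240_000_000  # assumption: ~240 million U.S. adults

# "About 21% of adults reported experiencing a mental illness"
with_mental_illness = 0.21 * US_ADULTS
print(f"~{with_mental_illness / 1e6:.0f} million adults")  # ~50 million

# "Slightly more than half ... not getting treatment (approximately 55%)"
untreated = 0.55 * with_mental_illness
print(f"~{untreated / 1e6:.0f} million untreated")  # ~28 million
```

In other words, the "approximately 50 million" and "around 28 million" figures follow directly from the quoted percentages, given a plausible adult population baseline.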
I don't want to get us fixated on the statistics per se, since you can readily argue about how these stats are at times collected or reported. For example, sometimes they are based on surveys in which the polling was preselected to certain areas of the country or types of people. Also, you can decidedly quibble about how honest people are when they self-report their mental health status, depending upon who is asking and whether they might want to lean in one direction or another on the topic. And so on.
The gist though is that we can at least generally agree that there is a mental health challenge facing the country and that we ought to be doing something about it. If we do nothing, the base assumption is that things are going to get worse. You can't let a festering problem endlessly fester.
You might have noticed in the aforementioned stats that there is a claimed paucity of available qualified mental health professionals. The belief is that there is an imbalance in supply and demand, in which there is an insufficient supply of mental health advisers and an overabundance of either actual or latent demand for mental health advice (I say latent in the sense that many might not realize the value of seeking mental health advice, or they cannot afford it, or they cannot logistically access it).
How can we deal with this imbalance?
One path seems to be the use of automation and particularly AI to bolster the 'supply side' of providing mental health advice. You could persuasively argue that the popularity of smartphone meditation and mindfulness apps is a sign that there is indeed pent-up demand. When you cannot readily gain access to qualified human advisors, automation and AI step into that gap.
Think too about the convenience factors.
When using an AI app for mental health, you have the AI available 24x7. No need to schedule an appointment. No difficulty in logistically getting together in person with a human adviser. Likely the cost is a lot less expensive too. You can rack up time using the AI app whereas with a human adviser the clock is ticking and the billing minutes are mounting.
But, wait for a darned second, you might be exhorting, an AI app is not on par with a human adviser.
This is ostensibly an apples-to-oranges comparison. Or, perhaps, more of an apples-to-oysters comparison, such that the two don't especially compare. A properly qualified human adviser who knows what they are doing when it comes to mental health is certainly heads above any kind of AI that we have today. Sure, the AI app might be available around the clock, but you are getting an inferior level of quality, and thus you cannot make any sensible likening between using a human adviser and using the AI.
We will return shortly to this debate about human advisers versus AI-based advisement.
Meanwhile, one aspect of mental health that seems rather heart-wrenching concerns youths and mental health.
One belief is that if we don't catch mental health issues when someone is young, the societal cost is enormous on the other end when they become adults. It is the classic tale of the seedling that grows into either a well-formed tree or one that has all manner of future problems. Perhaps, some suggest, we should especially focus our attention on youths. Catch the issues early. Try to prevent the issues from becoming lifelong difficulties. This potentially eases the manifestation of mental health issues at the adult stage of life and, with some fortitude, we can reduce the mental health deterioration pipeline flow, if you get my drift.
Researchers emphasize similar concerns, as in this recent paper: 'The mental health of adolescents and emerging adults (‘young people') is an area of public health warranting urgent attention globally. A transitional period characterized by rapid change in multiple domains (physical, social, psychological, vocational), adolescence and emerging adulthood is a developmental stage associated with heightened risks to mental well-being, as young people experience major life changes related to puberty, neurodevelopment, as well as changes to identity and autonomy in social contexts. Research indicates high prevalence of mental illness among young people with one in five individuals likely meeting criteria for a mental disorder. Disease burden associated with high prevalence rates are further exacerbated by demand for treatment outstripping supply creating a treatment gap. Digital mental health interventions (DMHIs), such as those delivered via smartphone apps or online, represent a rapidly growing mode of service with potential to offer greater access to support' (Vilas Sawrikar and Kellie Mote, 'Technology Acceptance And Trust: Overlooked Considerations In Young People's Use Of Digital Mental Health Interventions', Health Policy And Technology, October 2022).
As noted by those researchers, the advent of automation and AI mental health apps seems suited to young people for a variety of reasons, such as that younger people might be more prone to using high-tech, and they would also likely find the ease of access and other facets appealing. The article mentions that there is an up-and-coming catchphrase known as digital mental health interventions, along with the associated abbreviation of DMHI (this acronym hasn't solidified yet and alternatives are being bandied around).
Let's dig a little deeper into this notion of digital mental health interventions.
Here are some added remarks by the researchers: 'Technology-mediated healthcare could mitigate gaps in services by providing access to support at scale, at low cost and at the user's convenience. The prevalence of access to smartphone technology among younger people points to a seemingly obvious solution for meeting demand in this population. However, while DMHIs have been shown to be effective in randomized control trials, this does not appear to translate to real world uptake. A systematic review of studies indicated that a quarter of mental health apps were never used after installation. Younger people in particular may be less likely to engage with technology targeted at mental health with evidence that younger age groups are less likely to use DMHIs in treatment and they report low preference of online mental health care compared to face-face treatment' (ibid).
A key takeaway is that though you might assume that youths would assuredly adore and use these online mental health apps, the true picture is a lot murkier. Perhaps one particularly telling point is that once the app was installed, usage either dropped off precipitously or never got underway at all. One explanation is that the hype and excitement at downloading the app were quickly overshadowed by the app potentially being difficult to use or perceived as ineffective. You could also suggest that some youths might have been stirred to get the app due to peer pressure or via what they see on social media, and didn't especially intend to use the app. They just wanted to say that they have it. At this age, being part of the 'in' club might be just as important as whatever the app itself does.
Another viewpoint is that if these mental health apps were better at what they do, such as fully leveraging the state-of-the-art in AI, this might lure youths into actual usage of the apps. An added element would be that if youths perceived the app as being popular, they might want to be able to say that they use it too. In that sense, AI provides a seemingly positive double whammy. It can possibly make the mental health apps do a better job, and simultaneously carry the faddish style or panache of being AI and thus a timely and societally heady aspect.
Okay, so AI seems to be a hero rushing to the rescue on this mental health conundrum.
As you will shortly see, there can be a downside to AI here too. Regrettably, today's AI can appear to be useful and yet end up being detrimental. Some would argue that a tradeoff must be considered. Others say that today's AI has not yet ripened on the vine and we are prematurely putting people at risk, youths and adults alike. You see, even adults can be fooled or lured into thinking that mental health apps infused with AI are a can-do-no-wrong salvation.
To see how this can be, let's take a close look at the hottest AI around, consisting of Generative AI and particularly the AI app known as ChatGPT.
Opening The Can Of Worms On Generative AI
We are ready to dive into AI.
Of the various types of AI, we will focus herein specifically on Generative AI.
In brief, generative AI is a particular type