{"id":17189,"date":"2023-05-13T00:00:00","date_gmt":"2023-05-13T00:00:00","guid":{"rendered":"https:\/\/1millionbestdownloads.com\/chatgpt-therapy-mental-health-experts-weigh-in-7488513\/"},"modified":"2023-05-13T00:00:00","modified_gmt":"2023-05-13T00:00:00","slug":"chatgpt-therapy-mental-health-experts-weigh-in-7488513","status":"publish","type":"post","link":"https:\/\/1millionbestdownloads.com\/chatgpt-therapy-mental-health-experts-weigh-in-7488513\/","title":{"rendered":"People Are Using ChatGPT in Place of Therapy\u2014What Do Mental Health Experts Think?"},"content":{"rendered":"
<\/p>\n
10'000 Hours\/Getty Images<\/p>\n <\/span> \nArtificial intelligence is having a moment. As AI-powered chatbots like ChatGPT gain popularity, more people have taken to testing out the technology on tasks like answering math questions, translating words or sentences, and even generating recipe ideas or grocery lists.\n<\/p>\n \nSome people on social media have also begun using these AI chatbots as makeshift therapists. By presenting the technology with mental health questions or crises, people can receive advice\u2014often free advice\u2014without having to spend the time or money on therapy sessions.\n<\/p>\n \nOne TikTok user went so far as to say they replaced their therapist with an AI chatbot. \u201cToday I officially quit therapy because I just found the best free replacement: using ChatGPT,\u201d the TikToker said, recommending that others do the same.\n<\/p>\n \nThis advice, however, is worrisome to healthcare providers who focus on mental health.\n<\/p>\n \n\u201cBe skeptical. [AI chatbots] are not meant to be used as a substitute for therapy, psychotherapy, or any kind of psychiatric intervention,\u201d Bruce Arnow, PhD, professor in the department of psychiatry, associate chair, and chief psychologist at Stanford University, told Health<\/em>. \u201cThey\u2019re just not far enough along for that, and we don\u2019t know if they\u2019ll ever be.\u201d\n<\/p>\n \nHere\u2019s what expert psychologists had to say about why using AI as a therapist could be a concern, best practices when it comes to seeking help for mental health issues, and the ways that AI could be used safely in the future.\n<\/p>\n Poor Body Health May Indicate Poor Mental Health\u2014Experts Discuss Mind-Body Connection<\/span><\/p>\n <\/span> <\/p>\n \nIt can be challenging to imagine what a \u201ctherapy session\u201d with AI might look like. 
But for most users online, it simply means messaging with an AI chatbot, which allows people to ask specific and oftentimes personal questions.\n<\/p>\n \nOne TikTok user walked their followers through a conversation they had with ChatGPT, instructing the chatbot to \u201cact as my therapist. I need support and advice about what I should do in certain situations I have been struggling with.\u201d\n<\/p>\n \nChatGPT responded that it was \u201chere to support [them] and offer advice,\u201d before asking follow-up questions about the creator\u2019s concerns and offering possible solutions. It also recommended that they seek professional help if their anxieties still felt overwhelming.\n<\/p>\n \nAnother TikTok creator shared screenshots of their conversation with an AI chatbot embedded in the social media app Snapchat. When the user presented the chatbot with questions about issues in a relationship, it responded, \u201cIt\u2019s understandable to want to know what\u2019s going on with a friend. But it\u2019s important to respect their boundaries and give them space if that\u2019s what they need.\u201d\n<\/p>\n \nStill other users have presented ChatGPT with suicidal ideation. Even in these situations, the technology seems to respond remarkably well, said Olivia Uwamahoro Williams, PhD, assistant professor of counselor education at the University of West Georgia, and co-chair of the American Counseling Association Artificial Intelligence Interest Network.\n<\/p>\n \n\u201cThey all would generate very sound responses,\u201d she told Health<\/em>. \u201cIncluding resources, national resources\u2014so that was good to see. I was like, \u2018Okay, well these things are very accurate. The generated response is very counselor-like, kind of therapist-esque.\u2019\u201d\n<\/p>\n I Have Tried Online Therapy at Three Companies. 
Here's What I Learned<\/span><\/p>\n <\/span> <\/p>\n \nDespite the chatbots\u2019 seemingly good responses to queries about mental health concerns, psychologists agree that simply using AI in place of traditional therapy is not yet a safe option.\n<\/p>\n \nAt the most basic level, there are some concerns about ChatGPT or other AI chatbots giving nonsensical or inaccurate answers to questions, Arnow explained. ChatGPT itself warns users that the tech \u201cmay occasionally generate incorrect information,\u201d or \u201cmay occasionally produce harmful instructions or biased content.\u201d\n<\/p>\n \nBeyond this, Uwamahoro Williams said there are some logistical concerns with trying to use AI as a therapist, too.\n<\/p>\n \nTherapists are trained and licensed, which means that they have to maintain a certain standard of practice, she explained. AI chatbots don\u2019t have these same guidelines.\n<\/p>\n \n\u201cThere\u2019s not a person involved in this process. And so the first concern that I have is the liability,\u201d she said. \u201cThere\u2019s a lack of safety that we have to be open and honest about, because if something happens, then who is held accountable?\u201d\n<\/p>\n \nSimilarly, using AI as a therapist involves putting sensitive information on the internet, Uwamahoro Williams added, which could be a privacy issue for some people.\n<\/p>\n \nIn the case of ChatGPT, the site does collect and record conversations, which it says it uses to better train the AI. 
Users can opt out, or they can delete their account or clear their conversations; cleared conversations are deleted from ChatGPT\u2019s systems after 30 days.<\/span>\n<\/p>\n \nUwamahoro Williams is also concerned that advice from a chatbot could be misinterpreted by the person seeking help, which could make things worse in the long run.\n<\/p>\n \nAll of these qualms, however, can really be traced back to one main issue, namely that AI is just that\u2014artificial.\n<\/p>\n \n\u201cI think in the future it’s going to probably surpass us\u2014even therapists\u2014in many measurable ways. But one thing it cannot do is be a human being,\u201d Russel Fulmer, PhD, senior associate professor at Xi\u2019an Jiaotong-Liverpool University and incoming professor and director of counseling at Husson University, told Health<\/em>. \u201cThe therapeutic relationship is a really big factor. That accounts for a lot of the positive change that we see.\u201d\n<\/p>\n \nTraditional therapy allows the provider and patient to build an emotional bond, as well as clearly outline the goals of therapy, Arnow explained.\n<\/p>\n \n\u201cAI does a really good job in gathering a lot of knowledge across a continuum,\u201d Uwamahoro Williams said. \u201cAt this time, it doesn't have the capacity to know you specifically as a unique individual and what your specific, unique needs are.\u201d\n<\/p>\n Talk Therapy Is Good for Your Heart Health, Study Finds<\/span><\/p>\n <\/span> <\/p>\n \n \nArnow is a bit skeptical as to whether AI chatbots could ever be advanced enough to provide help on the same level as a human therapist. 
But Fulmer and Uwamahoro Williams are a bit more comfortable with the idea of chatbots potentially being used in addition to traditional therapy.\n<\/p>\n \n\u201cThese platforms can be used as a supplement to the work that you\u2019re actively doing with a professional mental health provider,\u201d Uwamahoro Williams said.\n<\/p>\n \nChatting with an AI could even be thought of as another tool to further the work outside of therapy, similar to journaling or meditation apps, she added.\n<\/p>\n \nThere are even some chatbot AIs that are being piloted specifically for mental health purposes, such as Woebot Health or Elomia. It\u2019s possible that these could be a better option since they\u2019re created specifically for handling mental health-related queries.\n<\/p>\n \nFor example, Elomia says they have a safety feature where humans will step in if people need to speak to a real therapist or a hotline, and Woebot says their AI has a foundation in \u201cclinically tested therapeutic approaches.\u201d\n<\/p>\n \nMost of these programs\u2014in addition to AI in general\u2014are still being developed and piloted, though, so it\u2019s probably too early to compare them definitively, Fulmer said.\n<\/p>\n \nOnline AI therapy certainly can\u2019t hold a candle to the real thing\u2014at least for now\u2014Fulmer and Arnow agreed. But the fact remains that mental health care is inaccessible for many people: the high cost of therapy, a shortage of therapists with space for new clients, and persistent stigma all dissuade people from getting the help they need.<\/span><\/span>\n<\/p>\n \n\u201cI guess there\u2019s a difference between my ideals and the recognition of reality,\u201d Fulmer said. \u201cChatGPT and some of these chatbots, they offer a scalable solution that\u2019s, in many cases, relatively low-cost. And they can be a piece of the puzzle. 
And many people are getting some benefit from them.\u201d\n<\/p>\n \nIf just one person has received some sort of benefit from treating AI as a therapist, then the possibility that it could work is at least worth considering, Fulmer added.\n<\/p>\n \nFor now, ChatGPT may have useful applications in helping people \u201cscreen\u201d themselves for mental health disorders, experts said. The bot could guide someone through common symptoms to help them decide if they need professional help or a diagnosis.\n<\/p>\n \nAI could also help train new counselors and help psychologists learn more about which strategies are most effective, Arnow and Uwamahoro Williams said.\n<\/p>\n \nYears down the line, as AI advances, it may have more applications in therapy, Fulmer said, but it still may not be right for everyone.\n<\/p>\n \n\u201cRight now, there is no substitute for a human counselor,\u201d Fulmer said. \u201c[AI] synthesizes large data sets, it\u2019s good with offering some useful information. But only a real-life therapist can get to know you and tailor their responses and their presence to you.\u201d<\/p>\n","protected":false},"excerpt":{"rendered":" Some people online are experimenting with using artificial intelligence in place of real therapy for mental health issues. For now, experts say using the technology as a therapist likely isn't safe, since it poses a number of confidentiality and safety concerns. 
But with its frequent accessibility and cost issues, therapy could possibly be supplemented by […]<\/p>\n","protected":false},"author":2,"featured_media":17189,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[905,410],"tags":[906,116],"_links":{"self":[{"href":"https:\/\/1millionbestdownloads.com\/wp-json\/wp\/v2\/posts\/17189"}],"collection":[{"href":"https:\/\/1millionbestdownloads.com\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/1millionbestdownloads.com\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/1millionbestdownloads.com\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/1millionbestdownloads.com\/wp-json\/wp\/v2\/comments?post=17189"}],"version-history":[{"count":0,"href":"https:\/\/1millionbestdownloads.com\/wp-json\/wp\/v2\/posts\/17189\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/1millionbestdownloads.com\/wp-json\/wp\/v2\/posts\/17189"}],"wp:attachment":[{"href":"https:\/\/1millionbestdownloads.com\/wp-json\/wp\/v2\/media?parent=17189"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/1millionbestdownloads.com\/wp-json\/wp\/v2\/categories?post=17189"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/1millionbestdownloads.com\/wp-json\/wp\/v2\/tags?post=17189"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}
\n<\/figcaption><\/figure>\n How Are People Using AI Chatbots for Therapy? <\/span> <\/h2>\n
Concerns About Using AI Chatbots for Therapy <\/span> <\/h2>\n
Will AI Chatbots Ever Be a Safe Therapy Option? <\/span> <\/h2>\n
Though psychologists largely agree that using AI as a stand-in for a therapist isn\u2019t safe, they diverge a bit on if and when the technology could ever be useful.\n<\/p>\n