— Read on www.psychtechpodcast.com/chatbot-therapy/
As a person who regularly deals with anxiety, I found this post interesting. Honestly, I did not listen to the podcast, but I did read through some of the supporting materials. I had never thought about using a chatbot to help mental health patients work through their issues. I think there is real benefit in having a person present to help someone who is dealing with psychiatric issues, but I also know that many people find value in writing journals or diaries, which are even less interactive than a chatbot would be. So I do think there is some merit to the practice of using specially designed chatbots for mental health. I have met with psychologists and psychiatrists for my anxiety in the past, and I have to admit that the most therapeutic aspect of my time with these professionals has been sharing experiences and having simple conversations. Through those conversations, the therapists were able to identify possible causes of my issues and offer solutions that may or may not correct the underlying problem.
Since I have not dealt with depression to date, I cannot speak to that particular health concern, but I can imagine that this sort of therapy might be beneficial, at least as part of a larger treatment plan, even for something like that. Sometimes we just need someone to listen. With a bot designed for this use case, one might derive some of the same benefits as sitting on a therapist’s couch, as it were. In recent years, telemedicine services have started to offer mental health coverage alongside their standard medical services. This is not much different, except that there is no actual person on the other end. Or is there? Certainly it was humans who programmed these bots to produce the expected results. The programmers, together with medical professionals, designed the service to operate as if there actually were a human on the other end. One might imagine that a great deal of thought has gone into these services so that the bot can communicate the appropriate message for a particular medical situation.
What time must go into something like this: medical professionals anticipating the types of issues a patient might have and crafting appropriate responses through pre-defined scripts. I would imagine there is some pretty amazing machine learning behind the scenes of such projects, providing the most lifelike experience possible for patients. However, listening and speaking are not the only tasks for these bots. They must also be able to assess the state of one’s well-being in order to provide any relevant therapy. Through sentiment gathered from textual conversation, these systems might be able to determine how happy or sad someone is, but could they capture a dark or sinister vibe from a patient? I assume that would be an almost impossible task. To that end, these services must be better suited to some mental health issues than others. I wonder how many people would be willing to trust their mental health care to a machine. We seem willing to turn over a lot of functions to machines these days, but would we trust them with our minds?
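To make the idea of "sentiment gathered from textual conversation" concrete, here is a toy sketch of the simplest possible approach: lexicon-based scoring, where words from hand-picked lists nudge a score up or down. This is purely illustrative; the word lists are hypothetical examples, not a clinical vocabulary, and production systems would use trained models rather than anything this crude.

```python
# Toy lexicon-based sentiment scoring. The word sets below are
# hypothetical examples for illustration, not a real clinical lexicon.
POSITIVE = {"happy", "calm", "hopeful", "better", "relieved"}
NEGATIVE = {"sad", "anxious", "worried", "hopeless", "worse"}

def sentiment_score(message: str) -> int:
    """Crude score: each positive word counts +1, each negative word -1."""
    words = message.lower().split()
    return sum((w in POSITIVE) - (w in NEGATIVE) for w in words)

print(sentiment_score("I feel anxious and worried today"))    # -2
print(sentiment_score("Feeling hopeful and a little better")) # 2
```

Even this trivial sketch hints at the limitation raised above: it can tally obviously happy or sad words, but a "dark or sinister vibe" expressed in neutral language would score as zero.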
I would be interested in hearing from our readers about how many of them would use such a service. For anxiety, I might be willing to try it out and see if there is any real benefit. Would you? Please provide your thoughts in the comments below.