Artificial Intelligence and Mental Health


Do you remember AOL chat rooms? Remember chat bots?

Artificial Intelligence (AI) has come a long way in recent years and seems to be the talk of the technology world. Frankly, it’s a bit disconcerting to me but it’s out of my control so I shall try to come to grips with it. I think there are pros and cons to any new advances we make, so as long as the robots don’t start overthrowing humans I’m good.

Has anyone else heard of Woebot? According to their website, Woebot is an “artificial conversational agent that helps you monitor mood and learn about yourself” that runs through Facebook Messenger. The website touts scientific evidence, citing this study, but I’m a bit hesitant about the strength of the research: the sample size was only 70 people, with just 34 of them assigned to use Woebot.


Woebot, again according to their website, was designed to help college students experiencing anxiety and depression symptoms; it works by providing information, teaching cognitive-behavioral techniques, and collecting data on your mood and thoughts. It also appears to be financially feasible for many people who would not be able to afford psychotherapy or medication management services.

I have not used Woebot, so I’m not going to provide a review or endorsement of it in any way; I’d simply caution anyone who does use it to be aware. I have no idea what privacy protections Woebot provides or whether your information would remain confidential. That said, I have seen a surge in advertising for online-based programs, and people appear to be using them, so I think it’s an important topic to discuss.

Here are some benefits I see to technology being incorporated into mental health services:

  • Affordability – many people don’t seek counseling because they are unable to afford it.


  • Data collection – Tracking your mood and activities is a key component of behavioral treatments and can provide significant insight into patterns. I’ve often had clients set daily alarms on their cellphones (the one thing we always seem to have with us) to remind them to record their mood and activity ratings throughout the day. I strongly advocate for data collection in the moment because we tend to forget the nuances of the day. Apps and programs that remind us and check in with us could potentially increase adherence and follow-through. I think web-based or downloadable apps have the potential to synchronize counseling homework assignments and follow-through between sessions.


  • 24/7 availability – Counselors, in most cases, are not available 24/7. Crisis lines exist and can be lifesaving, but as we become more and more focused on texting and online communication, using these channels to reach more people is important (in my opinion). People who are afraid to call a crisis line may be more apt to use an online chat program.


  • Educational Resources – If these bots really are providing tailored, evidence-based educational tools (videos, pamphlets, handouts) based on mood ratings, then this may help reduce the feeling of being overwhelmed by the vast amount of information a search engine would turn up.

Potential pitfalls that I foresee would be:

  • Privacy and confidentiality


  • Understanding of educational material – A human counselor can check that a client understands the concepts being taught. I’m not sure how a bot would ensure understanding of the materials.


  • Mismatched problem and treatment – Different treatments are recommended for different problems. Programs such as Woebot provide only certain services, which may not be best suited to the individual. Without human involvement, I’m not sure how the right match would be determined.


  • No interpersonal practice – Therapeutic alliance is a large part of why therapy works. Having someone you trust, can confide in, and who is supportive of your efforts can be a completely novel experience for people. While bots may be able to mimic this through text conversations, that mimicry doesn’t help people practice new relationship-building skills with an actual person.

With regard to Woebot, I strongly believe that more scientific backing is needed before I would advocate for its use. But I’m very interested in others’ thoughts on this.

What are your thoughts on the use of artificial intelligence in the delivery of mental health services? Have you had any experience with these services, or do you know someone who has?


Leave me a comment, give me a like (I find them very reinforcing)! Don’t forget to follow this blog to receive notifications of new postings exploring our behaviors and wellbeing.

via the Daily Prompt: Synchronize
