Evaluating Virtual Reality and Artificial Intelligence as Emerging Digital Tools for Mental Health Care

Abstract

This thesis evaluates the potential of Virtual Reality (VR) and Artificial Intelligence (AI), specifically Natural Language Processing (NLP) and Large Language Models (LLMs), as emerging tools for mental health care. VR technology can be used in therapeutic games, while NLP and LLMs, through text-based chatbots or voice-driven digital human avatars, offer potential therapeutic benefits in mental health contexts. Through a series of studies, this thesis investigates the clinical relevance of these tools from different perspectives. Study I analyzed VR games on commercial platforms. Of 565 games reviewed, 383 were excluded for violence, horror, adult content, or excessive movement that could cause nausea; the remaining 182 games met the inclusion criteria. While promising, these games lack clinical testing, highlighting the need for better evaluation and oversight of VR tools for mental health. In Studies II and III, the BETSY mental health chatbot prototype, aimed at mild to moderate anxiety, was co-designed with potential end-users and healthcare professionals. A mixed-methods approach was used, with feedback from members of the public, patients, nurses, doctors, and psychologists helping to refine two interfaces (text-based and voice-driven digital human). The interfaces were then tested with 45 healthy volunteers in a randomized controlled trial, which indicated that BETSY was a promising tool for therapeutic conversations, with above-average usability, although participants found it somewhat limiting and repetitive. Study IV explored whether LLMs could facilitate more dynamic therapeutic conversations, focusing on an LLM therapist's ability to detect and respond to suicidal ideation and plans. The pre-clinical evaluation showed that the system could provide helpful and safe suicide support but could also be "prompt-hacked" into providing inappropriate recommendations.
This highlights the dual nature of LLM-based tools and emphasizes the need for careful design and rigorous safety checks. This research contributes to the growing body of evidence supporting the integration of emerging technologies in mental health care, while underscoring the importance of thorough evaluation and co-design processes to ensure these tools are effective and safe for clinical use.

Keywords

virtual reality, artificial intelligence, large language models, mental health care, emerging technologies

ISBN

978-91-8115-070-4 (PRINT)
978-91-8115-071-1 (PDF)

Articles

I: Thunström, A. O., Vukovic, I. S., Ali, L., Larson, T., & Steingrimsson, S. (2022). Prevalence of virtual reality (VR) games found through mental health categories on STEAM: A first look at VR on commercial platforms as tools for therapy. Nordic Journal of Psychiatry, 76(7), 474-485. http://doi.org/10.1080/08039488.2021.2003859

II: Thunström, A. O., Ali, L., Carlsen, H. K., Bohm, M., Wesén, L., Wrede, O., Vukovic, I. S., Larson, T., Hellström, A., & Steingrimsson, S. (n.d.). Behavior Emotion Therapy System and You (BETSY): Co-design and evaluation among healthy participants of a mental health chatbot and digital human for mild to moderate anxiety [Submitted manuscript].

III: Thunström, A. O., Carlsen, H. K., Ali, L., Larson, T., Hellström, A., & Steingrimsson, S. (2024). Usability comparison of an anthropomorphic digital human and a text-based chatbot as a responder to questions on mental health: A randomized, controlled trial among healthy participants. JMIR Human Factors, 11, e54581. http://doi.org/10.2196/54581

IV: Thunström, A. O., Ali, L., Weineland, S., Falk, Ö., Ioannou, M., Liljedahl, N., Johansson, V., Hellström, A., Larson, T., & Steingrimsson, S. (n.d.). Evaluating an LLM-driven immersive digital human therapist: Safety, effectiveness, and vulnerability in detecting suicidal ideation and resisting prompt hacking. [Submitted manuscript].

Department

Institute of Neuroscience and Physiology, Department of Psychiatry and Neurochemistry

Defence location

Friday, 7 February 2025, 1:00 PM, Hörsal Arvid Carlsson, Medicinaregatan 3, Gothenburg
