Study Reveals Emotional Bonds Enhance AI Therapy Effectiveness

A recent study from the University of Sussex has found that mental health chatbots are most effective when users establish an emotional connection with their AI therapist. The research, published in the journal Social Science & Medicine, highlights the dual nature of these interactions: both their benefits and their potential psychological risks.

With more than one in three UK residents now using AI for mental health support, understanding the dynamics of chatbot therapy is increasingly important. The analysis drew on feedback from 4,000 users of the mental health app Wysa and found that therapy outcomes improve significantly when users feel a sense of emotional intimacy with the AI.

Exploring Synthetic Intimacy

The concept of “synthetic intimacy”—where individuals develop emotional or social bonds with artificial intelligence—emerges as a central theme in the study. According to Dr. Runyu Shi, an assistant professor at the University of Sussex, “Forming an emotional bond with an AI sparks the healing process of self-disclosure.” While many users report positive experiences, the study underscores the risks associated with this phenomenon.

Dr. Shi warns that some users may fall into a “self-fulfilling loop” in which the chatbot fails to challenge harmful perceptions, pushing vulnerable individuals further from the clinical care they need. The research also points to reports of individuals forming relationships, even marriages, with AI, indicating that synthetic intimacy is not merely an anomaly but a growing trend.

The study identifies a cyclical process in the development of intimacy between users and AI. Initially, users disclose personal information, which elicits emotional responses such as gratitude and feelings of safety. These interactions can foster positive psychological changes, including increased self-confidence and energy levels. Over time, this loop contributes to a deeper emotional connection, leading some users to refer to the app as a friend, companion, or even partner.

Implications for Mental Health Support

The research findings are particularly relevant as NHS Trusts in the UK have started using Wysa as part of their Talking Therapies program, which assists patients on waiting lists. The app aims to facilitate self-referral and provide immediate support. Users frequently describe their interactions with Wysa in affectionate terms, further illustrating the depth of these connections.

Professor Dimitra Petrakaki, also from the University of Sussex, asserts, “Synthetic intimacy is a fact of modern life now. Policymakers and app designers would be wise to accept this reality and consider how to ensure cases are escalated when an AI witnesses users in serious need of clinical intervention.”

As AI chatbots increasingly fill gaps in mental health services, organizations like Mental Health UK are advocating for urgent safeguards to ensure that individuals receive safe and appropriate information. With the rise of AI in therapeutic roles, the need for clear guidelines and oversight becomes critical.

This study not only sheds light on the effectiveness of AI therapy but also raises important questions about the evolving relationship between technology and emotional health. The implications of forming emotional bonds with AI continue to unfold, suggesting that while these tools can assist in mental health care, careful consideration is necessary to mitigate potential risks.

For further details, see Runyu Shi et al., “User-AI intimacy in digital health,” Social Science & Medicine (2025).