Emotional Support from AI Chatbots : Should a Supportive Partner Self-Disclose or Not?

Research output: Journal Publications and Reviews › Publication in refereed journal › peer-reviewed

Detail(s)

Original language: English
Pages (from-to): 207–222
Journal / Publication: Journal of Computer-Mediated Communication
Volume: 26
Issue number: 4
Online published: 19 May 2021
Publication status: Published - Jul 2021

Abstract

This study examined how and when a chatbot’s emotional support is effective in reducing people’s stress and worry. It compared emotional support from a chatbot versus a human partner in terms of its process and conditional effects on stress and worry reduction. In an online experiment, participants discussed a personal stressor with a chatbot or a human partner who provided emotional support, reciprocal self-disclosure, both, or neither. The results showed that the effect of a conversational partner’s emotional support on stress and worry reduction was mediated by the partner’s perceived supportiveness, and that the link from emotional support to perceived supportiveness was stronger for a human partner than for a chatbot. A conversational partner’s reciprocal self-disclosure enhanced the positive effect of emotional support on worry reduction. However, when emotional support was absent, a chatbot that only self-disclosed reduced stress even less than a chatbot that gave no response to participants’ stressors.

Research Area(s)

  • Artificial Intelligence, Chatbot, Emotional Support, Disclosure, Stress, Mental Health, Human–AI Communication
