Abstract
Objectives
In this study, I aimed to unpack the dynamics of users’ conversations with chatbots through user narratives, exploring the experiences of Hong Kong adults who use chatbots for health-related discussions through the lenses of patient-centered communication and conversational dialectics.
Methods
A total of twelve young adults from Hong Kong participated in structured interviews exploring their experiences with chatbots across various dimensions. Participants described their preferences and aversions regarding design features, as well as the positive and negative influences these chatbots had on their health-related attitudes and behaviors. They also discussed the effective tactics and strategies employed by chatbots, lessons learned from their interactions, and the similarities and differences they observed compared with human interactions. Two coders analyzed the interviews independently and subsequently reconciled their findings. The results were categorized into two primary themes: 1) negative experiences and concerns, and 2) positive experiences and suggestions for improvement. A framework matrix was used to summarize the key information by category, and NVivo 11 software facilitated the organization and storage of all textual data.
Results
Users expressed a preference for the user-friendly interfaces of chatbots such as Poe and Replika, appreciating interactive cues and context-relevant responses. However, they were dissatisfied with the lack of visual aids in some chatbots, including Poe, and with generic, computer-like responses that detract from the user experience. Chatbots positively impacted users by increasing health awareness, motivating behavior change, and boosting confidence. Nevertheless, there are negative aspects, such as over-reliance on chatbots for advice, which can create a false sense of security: users may mistakenly believe that the information provided by these automated systems is comprehensive or entirely accurate, leading them to make uninformed decisions based on limited data. Furthermore, some chatbots provide limited emotional support, lacking the empathy and understanding that human interactions naturally offer. Although they may be programmed to respond in a soothing manner or provide encouragement, they cannot genuinely comprehend the complexities of human emotions, which can leave individuals seeking deeper connection feeling frustrated or isolated.
Tactically, chatbots are able to employ patient-centered communication, collaborative agenda-setting, and motivational interviewing, yet they still need to enhance their ability to elicit user concerns, provide conclusive information, and address complex questions. Additionally, users experience tensions between autonomy and connection, openness and privacy, efficiency and depth, and empowerment and dependency. These dialectics underscore the complexity of human-AI relationships in healthcare, where chatbots address practical needs but cannot fully replicate the emotional and contextual richness of human interactions. The narratives suggest that users navigate these tensions by viewing chatbots as complementary tools rather than substitutes for human care.
Conclusions
Chatbots have both positive and negative aspects in health-related interactions. Their design features and strategies have varying impacts on users' attitudes and behaviors. While they offer convenience and certain types of support, they cannot fully replace human interactions in healthcare. Recognizing these aspects is crucial for optimizing chatbot use in the health field, ensuring they complement rather than substitute for human-centered care.
Practical Implications
Chatbot developers should focus on improving design features, such as adding visual aids and enhancing personalization. They also need to address the lack of emotional intelligence and better handle complex questions. Healthcare professionals can use chatbots as supplementary tools, but should also guide patients on when to seek professional advice. This balanced approach can enhance the overall healthcare experience, leveraging chatbots' advantages while mitigating their limitations.
Understanding users' perspectives on chatbots' design features, their impacts on attitudes and behaviors, the tactics and strategies they employ, and how these interactions compare with human interactions helps identify the strengths and weaknesses of chatbot use for health, guiding future improvements and more effective implementation.
| Original language | English |
|---|---|
| Publication status | Published - 15 Jul 2025 |
| Event | IAMCR Singapore 2025 Communicating Environmental Justice: Many Voices, One Planet - Singapore, Singapore |
| Duration | 13 Jul 2025 → 17 Jul 2025 |
| Internet address | https://iamcr.org/singapore2025 |
Conference
| Conference | IAMCR Singapore 2025 Communicating Environmental Justice |
|---|---|
| Abbreviated title | IAMCR 2025 |
| Place | Singapore |
| City | Singapore |
| Period | 13/07/25 → 17/07/25 |
| Internet address | https://iamcr.org/singapore2025 |
Bibliographical note
Research Unit(s) information for this publication is provided by the author(s) concerned.
Research Keywords
- health chatbots
- patient-centered communication
- user experience
- Hong Kong
Fingerprint
Dive into the research topics of 'Exploring User Narratives: A Qualitative Analysis of Hong Kong Adults' Experiences with Health Chatbots'. Together they form a unique fingerprint.