Navigating Ambiguities: A Systematic Review and Comparative Analysis of Social Bot Detection Methods in Communication Research

Xiao Meng, Xiaohui Wang*, Tai-Quan Peng

*Corresponding author for this work

Research output: Journal Publications and Reviews › RGC 21 - Publication in refereed journal › peer-review

Abstract

The proliferation of social bots poses significant challenges for authentic online communication, motivating a growing body of research in communication studies. Yet, conceptual ambiguities and methodological inconsistencies continue to undermine the reliability and replicability of social bot research. This study reviews recent work on social bots, revealing a frequent mismatch between detection methods and the types of bots being investigated. To address these issues, we propose a typology based on three dimensions: intention (benign vs. malicious), coordination (independent vs. coordinated), and operation (rule-based vs. generative). We further build a multiclass dataset of 4,071 bots and 6,386 humans from Bluesky and evaluate four major detection methods: rule-based approaches, supervised machine learning, unsupervised approaches, and large language model-based techniques. By highlighting the strengths and limitations of each, we advocate for multimethod strategies to better respond to evolving bot behaviors. We conclude with recommendations for standardizing detection practices and enhancing methodological rigor in social bot research. © 2026 The Author(s). Published with license by Taylor & Francis Group, LLC.
Original language: English
Journal: Communication Methods and Measures
DOIs
Publication status: Online published - 18 Jan 2026

Funding

The work was supported by the City University of Hong Kong [Grant No: 9610730].

