Exploring the influence of multimodal social media data on stock performance: an empirical perspective and analysis
Research output: Journal Publications and Reviews › RGC 21 - Publication in refereed journal › peer-review
Detail(s)
Original language | English |
---|---|
Pages (from-to) | 871-891 |
Journal / Publication | Internet Research |
Volume | 31 |
Issue number | 3 |
Online published | 12 Jan 2021 |
Publication status | Published - 2021 |
Abstract
Purpose - Despite extensive academic interest in social media sentiment within financial research, multimodal data in the stock market context have been largely neglected. The purpose of this paper is to explore the influence of multimodal social media data on stock performance and to investigate the underlying mechanism of two forms of social media data, i.e. text and pictures.
Design/methodology/approach - This research employs panel vector autoregressive (VAR) models to quantify the effect of sentiment derived from two modalities of social media data, i.e. text information and picture information. Using these models, the authors examine the short-term and long-term associations between social media sentiment and stock performance, measured by three metrics. Specifically, the authors first design an enhanced sentiment analysis method, integrating random walk and word embeddings through Global Vectors for Word Representation (GloVe), to construct a domain-specific lexicon and apply it to textual sentiment analysis. Second, the authors exploit a deep learning framework based on convolutional neural networks to analyze the sentiment in picture data.
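The abstract does not specify how the random walk and the GloVe embeddings are combined, so the following is only a minimal sketch, assuming a label-propagation-style random walk over a cosine-similarity neighbour graph built from pretrained GloVe vectors; the file path, seed words, vocabulary size and polarity threshold are illustrative placeholders rather than the authors' choices.

```python
# Hypothetical sketch: propagate the polarity of a few seed words over a
# GloVe cosine-similarity graph with a damped random walk, then keep the
# strongly polarised words as a domain-specific sentiment lexicon.
import numpy as np

def load_glove(path, max_words=2000):
    """Read a raw GloVe text file ('word v1 ... vd' per line)."""
    words, vecs = [], []
    with open(path, encoding="utf-8") as fh:
        for line in fh:
            parts = line.rstrip().split(" ")
            words.append(parts[0])
            vecs.append(np.asarray(parts[1:], dtype=np.float32))
            if len(words) >= max_words:
                break
    return words, np.stack(vecs)

# Placeholder path and illustrative finance-flavoured seeds -- not from the paper.
words, vectors = load_glove("glove.6B.100d.txt", max_words=2000)
POSITIVE_SEEDS = ["gain", "growth", "profit"]
NEGATIVE_SEEDS = ["loss", "debt", "crash"]

# Cosine-similarity k-nearest-neighbour graph over the vocabulary.
vectors /= np.linalg.norm(vectors, axis=1, keepdims=True)
sims = vectors @ vectors.T
np.fill_diagonal(sims, -np.inf)                      # no self-loops
k = 25
neighbours = np.argsort(-sims, axis=1)[:, :k]
adj = np.zeros_like(sims)
rows = np.repeat(np.arange(len(words)), k)
adj[rows, neighbours.ravel()] = np.clip(sims[rows, neighbours.ravel()], 0.0, None)
transition = adj / np.maximum(adj.sum(axis=1, keepdims=True), 1e-12)

# Random-walk (PageRank-style) propagation of seed polarity over the graph.
index = {w: i for i, w in enumerate(words)}
seed = np.zeros(len(words), dtype=np.float32)
for w in POSITIVE_SEEDS:
    if w in index:                                   # seeds outside the truncated vocab are skipped
        seed[index[w]] = 1.0
for w in NEGATIVE_SEEDS:
    if w in index:
        seed[index[w]] = -1.0
alpha, score = 0.85, seed.copy()
for _ in range(50):
    score = alpha * (transition @ score) + (1 - alpha) * seed

# Words with strong propagated polarity form the domain-specific lexicon.
lexicon = {w: float(score[i]) for w, i in index.items() if abs(score[i]) > 0.05}
```

Words whose propagated polarity exceeds the threshold enter the lexicon and could then be used to score the text of each post, analogous to the textual sentiment measure described above.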
Findings - The empirical results derived from the vector autoregressive models reveal that the sentiment measures extracted from both textual and pictorial information in social media are significant leading indicators of stock performance. Moreover, pictorial information and textual information exhibit similar relationships with stock performance.
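The leading-indicator finding suggests Granger-type tests within the VAR framework. As a hedged illustration only: statsmodels does not ship a panel VAR estimator, so the sketch below fits an ordinary VAR per stock and tests whether lagged sentiment helps predict returns; the long-format DataFrame layout, column names and maximum lag are assumptions, not the paper's specification.

```python
# Hedged sketch of a per-stock VAR with Granger-causality tests of sentiment
# on returns (a stand-in for the authors' panel VAR specification).
import pandas as pd
from statsmodels.tsa.api import VAR

def sentiment_leads_returns(panel: pd.DataFrame, max_lags: int = 5) -> pd.DataFrame:
    """panel: daily observations with columns
    ['stock', 'return', 'text_sentiment', 'picture_sentiment']."""
    rows = []
    for stock, grp in panel.groupby("stock"):
        data = grp[["return", "text_sentiment", "picture_sentiment"]].dropna()
        if len(data) <= 3 * max_lags:          # skip series too short to fit
            continue
        results = VAR(data).fit(maxlags=max_lags, ic="aic")
        if results.k_ar == 0:                  # information criterion chose no lags
            continue
        for driver in ("text_sentiment", "picture_sentiment"):
            test = results.test_causality("return", [driver], kind="f")
            rows.append({"stock": stock, "driver": driver, "lags": results.k_ar,
                         "f_stat": test.test_statistic, "p_value": test.pvalue})
    return pd.DataFrame(rows)
```

A small p-value for a given driver would indicate that its lagged sentiment helps predict the stock's return, i.e. that the sentiment series leads stock performance in the Granger sense.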
Originality/value - To the best of the authors’ knowledge, this is the first study that incorporates multimodal social media data into sentiment analysis, which is valuable for understanding the pictorial content of social media data. The study offers significant implications for researchers and practitioners: it directs researchers’ attention to multimodal social media data, and its findings yield managerial recommendations, e.g. watching not only the words but also the pictures posted on social media.
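For the picture modality, the abstract only states that a CNN-based deep learning framework is used, so the following minimal PyTorch classifier is an assumed stand-in for illustration, not the authors' architecture; the input size, depth and binary positive/negative labelling are all hypothetical.

```python
# Hypothetical picture-sentiment CNN: a small convolutional classifier that
# scores an image batch as positive vs. negative sentiment.
import torch
from torch import nn

class PictureSentimentCNN(nn.Module):
    def __init__(self, num_classes: int = 2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                      # 224 -> 112
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                      # 112 -> 56
            nn.Conv2d(64, 128, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),              # global average pooling
        )
        self.classifier = nn.Linear(128, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        feats = self.features(x).flatten(1)
        return self.classifier(feats)

# Example: score a placeholder batch of 224x224 RGB images.
model = PictureSentimentCNN()
images = torch.randn(4, 3, 224, 224)
probs = torch.softmax(model(images), dim=1)       # column 1 = positive probability
```

Averaging such image-level scores per stock and day would give a daily pictorial sentiment series comparable to the textual one, which is the form of input the VAR analysis above assumes.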
Research Area(s)
- Deep learning, Multimodal data, Sentiment analysis, Stock performance, Vector autoregression
Citation Format(s)
Exploring the influence of multimodal social media data on stock performance: an empirical perspective and analysis. / Yuan, Hui; Tang, Yuanyuan; Xu, Wei et al.
In: Internet Research, Vol. 31, No. 3, 2021, p. 871-891.