How Reference Points and Negativity Bias Affect Citizens’ Performance Evaluations: A Replication

Research output: Conference Papers (RGC: 31A, 31B, 32, 33), 33_Other conference paper, peer-review




Original language: English
Publication status: Presented - 20 Apr 2022


Title: International Research Society for Public Management Conference 2022 (IRSPM2022)
Period: 19 - 22 April 2022


Performance information (PI) has been widely used as a key mechanism for communicating public service achievements (OECD, 2005). Existing public management research has explored how different stakeholders, including politicians, managers, and citizens, interpret performance information (James & Van Ryzin, 2016; Marvel, 2015; Nielsen & Moynihan, 2017). Scholars have used experimental studies to examine the role of cognitive biases, such as reference points (social and historical) (George et al., 2018; Holm, 2017; Hong, 2018; Olsen, 2017) and negativity bias (Charbonneau & Van Ryzin, 2015; James & Moseley, 2014; Jilke, 2018; Olsen, 2017), in evaluating government performance.

Olsen (2017) conducted experimental studies in Denmark to test how reference points alter citizens' evaluations of organizational performance in the education and unemployment policy areas. Olsen found that while citizens draw extensively on both social and historical comparisons when they evaluate public sector performance, they rely more on social reference points in making their assessments. Olsen also found that the role of negativity bias is more mixed and depends more on the type of performance measure than on the actual reference point.

Despite these insights, most of these studies were conducted in the U.S. and Europe, with few conducted in Asian contexts. In addition, scholars have not reached a consensus on the relative importance of social and historical comparison (Charbonneau & Van Ryzin, 2015; Holm, 2017) or on the effect of negativity bias (George et al., 2017; Holm, 2017). In this study, we conducted an empirical replication of Olsen's (2017) study, extending it to the Hong Kong context and to two different policy areas—secondary education and air quality.

In the replication, we recruited 3,643 participants through the Dynata online panel in Hong Kong, which provides a representative sample of Hong Kong citizens aged over 18. Participants were randomly assigned to one of three conditions in the vignette experiments: a control group (absolute performance information), a social comparison group, and a historical comparison group. In each group, we provided two randomly ordered vignettes describing school performance and air quality in a district. After the vignettes, the dependent variables—citizens' satisfaction ratings of organizational performance in the two policy areas—were measured. We also included several covariates in the survey: gender, age, education, and political orientation.
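The assignment procedure described above can be sketched as follows. This is a minimal illustration only, not the study's actual implementation; the condition and vignette labels are our own shorthand.

```python
# Sketch of the random-assignment step: each participant is assigned to one
# of three experimental conditions, and the two vignettes (school performance,
# air quality) are shown in random order. Illustrative only.
import random

CONDITIONS = ["control", "social_comparison", "historical_comparison"]
VIGNETTES = ["school_performance", "air_quality"]

def assign(participant_id, rng=random):
    """Randomly assign a condition and a vignette presentation order."""
    condition = rng.choice(CONDITIONS)
    order = VIGNETTES[:]          # copy so the shared list is not mutated
    rng.shuffle(order)            # randomize vignette presentation order
    return {"id": participant_id, "condition": condition, "order": order}

# One assignment per recruited participant (n = 3,643 in the study)
sample = [assign(i) for i in range(3643)]
```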

Following Olsen's (2017) study, we used regressions and difference-in-means tests to analyse the data. The results showed that in both the education and air quality policy areas, social and historical reference points negatively influenced citizens' evaluations of performance, which is consistent with Olsen's original finding. We found mixed evidence on the relative importance of social comparison. Specifically, in the air quality area, the effect of the social reference point was slightly stronger than that of the historical reference point, consistent with the original finding. However, in the education policy area, the historical comparison effect was slightly stronger than the social comparison effect, contrary to Olsen's finding. In addition, we found limited support for the effect of negativity bias in either education or air quality policy.
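A difference-in-means test of the kind used in the analysis can be sketched as follows. The satisfaction ratings below are hypothetical placeholders for illustration only, not the study's data, and the function is a bare-bones Welch-style statistic rather than the authors' actual analysis code.

```python
# Minimal sketch of a difference-in-means test between a control group
# (absolute information) and a comparison group. Ratings are hypothetical.
from statistics import mean, variance
from math import sqrt

def diff_in_means(a, b):
    """Return the mean difference (b - a) and a Welch-style t statistic."""
    na, nb = len(a), len(b)
    va, vb = variance(a), variance(b)   # sample variances
    diff = mean(b) - mean(a)
    se = sqrt(va / na + vb / nb)        # standard error of the difference
    return diff, diff / se

control = [7, 6, 8, 7, 6, 7, 8, 6]   # hypothetical ratings (0-10 scale)
social  = [5, 6, 5, 4, 6, 5, 5, 6]   # hypothetical ratings

diff, t = diff_in_means(control, social)
print(f"mean difference = {diff:.2f}, t = {t:.2f}")
```

In practice one would use a full statistical package (e.g. `scipy.stats.ttest_ind` with `equal_var=False`) to obtain p-values alongside the statistic.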

Bibliographic Note

Research Unit(s) information for this publication is provided by the author(s) concerned.

Citation Format(s)

How Reference Points and Negativity Bias Affect Citizens’ Performance Evaluations: A Replication. / Chen, Wenna; George, Bert; Walker, Richard et al.
2022. International Research Society for Public Management Conference 2022 (IRSPM2022).
