
14 July 2025

Uncovering the highly skilled emotional work of content moderators

Healthcare platform moderators use strategies to manage distressing material while staying engaged enough to protect vulnerable users, finds a new study.


The researchers call for regulators and platform providers to take steps to reduce the toll of this highly skilled emotional work.

The study, published in the Journal of Management Studies, looks at the work of the moderation team at Care Opinion, a UK-based non-profit online health platform. The platform serves as a ‘TripAdvisor for healthcare’, but with a strong social service ethos, given its role in shaping public perceptions of healthcare providers and their staff. Moderators on the platform often encounter deeply personal or distressing stories and operate with limited but important discretion: to alter a story’s content, sometimes following discussion with the author; to withdraw it from publication; or to undertake a duty of care to safeguard its author.

The study by researchers at King’s Business School and the University of Sussex found that moderators engage in five main practices as part of their work:

  • Application of rules: by ensuring submissions meet explicit guidelines, staff can moderate with confidence, though some moderators reported finding this desensitising over time
  • Quantification: content is given a score or category, with more experienced moderators handling the more complex or distressing stories
  • Objectification: separating the objective facts of a story about the quality of care from its subjective and emotional elements
  • Verification: relying on gut feeling to go back to an author when an account feels disjointed or suspiciously similar to stories submitted under different names
  • Care: going beyond the platform’s stated remit and using their discretion to offer care; this might include breaking confidentiality to intervene in cases where they feel a user’s life may be at risk.

Our findings show that far from being emotionless cogs, moderators have to manage their emotions about highly charged information in order to conform to the platform’s neutral stance. Yet they still engage emotionally and retain feelings of empathy or distress. This is what enables them to bend the rules and occasionally to offer care – even in a tightly controlled system.

study co-author Dimitra Petrakaki, Professor of Technology and Organisation, University of Sussex Business School

The authors’ findings have important implications for many of the stakeholders involved in healthcare and other platforms where sensitive or distressing content is shared.

Platform operators and designers should embed support systems such as well-being check-ins and debrief tools into moderator dashboards, and ensure staff receive training in emotion regulation and empathetic communication.

Policy makers and regulators should develop digital occupational safety and health standards that mandate safeguards such as rotation policies and access to counselling.

Content moderators themselves should understand the skilled, emotional work they perform, rather than seeing themselves as mere “rule-enforcers”. They should develop mechanisms for collective support, such as peer communities of practice.

Moderators told us about the challenges of their role: that there is some content they just can’t moderate, and the weight of responsibility they feel when editing someone’s story. It’s important that, as a society, we better value the emotional work of content moderators and build a greater understanding of what it means to perform this new role across different sectors, countries and cultures.

co-author Andreas Kornelakis, Reader in Comparative Management at King’s Business School
