The Upside Down of Social Media: Understanding the Invisible Digital Work of Content Moderators

Activity: Talk or presentation › Contributed talk (science-to-science)

Description

Content moderation, a form of invisible digital work, involves the sorting, labeling, and removal of graphic and potentially traumatizing content from social media platforms (Gray & Suri, 2019; Roberts, 2019). Without content moderators, these platforms would become dire digital spaces, exposing users to distressing content such as sexual exploitation, hate speech, suicide, and online harassment. While prior research has identified different types of platform work in the gig economy (e.g., Vallas & Schor, 2020), our understanding of the platform work essential to maintaining platforms' operational functionality remains limited. Some scholars have begun to highlight the precarious working conditions and invisibility of content moderation, which is often pushed to the periphery of the organizational core (Justesen & Plesner, 2024; Gillespie, 2020). Drawing on 50 interviews with content moderators from Kenya, Morocco, and Germany, this paper examines how digital platform organizations exploit content moderators while simultaneously evading responsibility for the harmful consequences associated with this type of platform work.
Period: 23 Oct 2024
Event title: Reshaping Work 2024 Conference
Event type: Conference
Location: Netherlands

Fields of science

  • 502029 Product management
  • 502 Economics
  • 506009 Organisation theory
  • 502043 Business consultancy
  • 502044 Business management
  • 502030 Project management
  • 502014 Innovation research
  • 502036 Risk management
  • 502026 Human resource management
  • 502015 Innovation management

JKU Focus areas

  • Digital Transformation
  • Sustainable Development: Responsible Technologies and Management