Porn, dog poo and social media snaps: the ‘taskers’ scraping the internet for Meta-owned AI firm
Tens of thousands of people have been employed by Scale AI, a company partially owned by Meta, to train artificial intelligence (AI) systems. These workers, known as “taskers,” have been engaged in a range of activities, including scraping social media accounts, harvesting copyrighted content, and transcribing pornographic soundtracks.
Overview of Scale AI
Scale AI, in which Meta holds a 49% stake, has recruited experts from fields as varied as medicine, physics, and economics. The company aims to refine top-tier AI systems through a platform known as Outlier, whose website promotes flexible work opportunities for people with strong academic and professional credentials. However, many taskers have expressed discomfort with the nature of their work, which often strays far from the high-level system refinement that was initially advertised.
The Role of Outlier
Outlier is managed by Scale AI, which has secured contracts with the Pentagon and various U.S. defense companies. The former CEO of Scale AI, Alexandr Wang, currently serves as Meta’s chief AI officer and has been recognized by Forbes as the “world’s youngest self-made billionaire.” The company’s former managing director, Michael Kratsios, is now a science adviser to the U.S. president.
Data Collection Practices
Taskers have reported that users of Meta platforms, such as Facebook and Instagram, would be shocked to learn how their data is collected. One contractor based in the U.S. shared, “I don’t think people understood quite that there’d be somebody on a desk in a random state, looking at your profile, using it to generate AI data.”
The Gig Economy and Taskers’ Experiences
The Guardian spoke to ten individuals who have worked for Outlier, some for over a year. Many of these taskers held other jobs as journalists, graduate students, teachers, and librarians, but sought additional work amid a struggling economy. One worker noted, “A lot of us were really desperate. Many people really needed this job, myself included, and really tried to make the best of a bad situation.”
Ethical Concerns and Internal Struggles
Like many in the growing class of AI gig workers, most taskers felt they were training their own replacements. An artist expressed feelings of “internalized shame and guilt” for contributing to the automation of their own aspirations. “As an aspiring artist, it makes me angry at the system,” they lamented.
Legal Representation and Worker Rights
Glenn Danas, a partner at Clarkson, a law firm representing AI gig workers in lawsuits against Scale AI and similar platforms, estimates that hundreds of thousands of individuals worldwide are now engaged in work for platforms like Outlier. The Guardian’s interviews with taskers from the UK, US, and Australia revealed the common humiliations associated with AI gig work, including constant monitoring and unstable employment.
Recruitment Practices and Monitoring
Scale AI has faced accusations of employing “bait-and-switch” tactics to attract workers, promising high pay up front and later offering significantly less. Although the company declined to comment on ongoing litigation, a source indicated that pay rates change only if workers opt into different, lower-paid projects. Taskers were often required to undergo repeated, unpaid AI-conducted interviews to qualify for specific assignments, and many believed these interviews were themselves recycled to train AI.
Content Sensitivity and Task Descriptions
Taskers described being asked to transcribe pornographic soundtracks and label disturbing images, including those of dead animals and dog feces. One doctoral student recounted being assigned to label a diagram of baby genitalia, despite prior assurances that there would be no nudity or gore involved in their tasks. “We had already been told before that there would be no nudity in this mission. Appropriate behavior, no gore, like no blood,” they said. “But then I would get an audio transcript for porn or random clips of people throwing up.”
Social Media Scraping and Ethical Dilemmas
Many taskers said social media scraping was an expected part of their assignments. Seven workers described scouring Instagram and Facebook accounts, tagging individuals by name, location, and their friends. Some tasks involved training AI on accounts belonging to people under the age of 18. One task required workers to select photos from individuals’ Facebook accounts and order them by the age of the user in each photo.
Personal Ethics and Task Completion
Several taskers expressed discomfort with assignments that involved personal data, particularly those including children. One worker stated, “I didn’t use any friends or family to submit tasks to the AI. I do understand that I don’t like it ethically.” The Scale AI source maintained that taskers were not obligated to continue with tasks that made them uncomfortable and that inappropriate content would be flagged and shut down.
Conclusion
The practices employed by Scale AI and its Outlier platform raise significant ethical questions regarding data privacy, the treatment of gig workers, and the implications of AI training. As the demand for AI systems continues to grow, the experiences of taskers highlight the often overlooked human cost behind the technology.

