Sarah Roberts, doctoral candidate at GSLIS and assistant professor at Western University in Ontario, was recently interviewed by NPR's All Things Considered regarding new safeguards used by Internet search engines to eliminate images of child pornography online. In order for these automated programs to capture and remove these images, according to NPR’s report, workers must first scan “known images of child pornography and giv[e] them a unique signature that goes into a database. If that image appears on another site, it is instantly flagged and removed.”
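The mechanism NPR describes — signatures of known images stored in a database, with matches flagged automatically — can be sketched roughly as follows. This is a hypothetical illustration only: real systems (such as Microsoft's PhotoDNA) use robust perceptual hashes that survive resizing and re-encoding, whereas the plain SHA-256 used here matches only byte-identical files.

```python
import hashlib

# Hypothetical sketch of signature-based flagging. Function names and
# the use of SHA-256 are illustrative assumptions, not the actual
# industry implementation.

def signature(image_bytes: bytes) -> str:
    """Compute a unique signature for an image's raw bytes."""
    return hashlib.sha256(image_bytes).hexdigest()

# The database of signatures for known images.
known_signatures: set[str] = set()

def register_known_image(image_bytes: bytes) -> None:
    """A reviewed image's signature goes into the database."""
    known_signatures.add(signature(image_bytes))

def should_remove(image_bytes: bytes) -> bool:
    """If an image appearing on another site matches a known
    signature, it is instantly flagged for removal."""
    return signature(image_bytes) in known_signatures
```

The key property is that once a worker has reviewed an image a single time, every subsequent byte-identical copy can be removed without further human review.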
Roberts researches commercial content moderation, and her dissertation, “Behind the Screen: The Hidden Digital Labor of Online Content Moderation,” investigates in part the impact such work has on the men and women who are paid to ensure these images are eradicated from the Web.
Roberts studies the workers who are part of the new content moderation industry, and she explains that one reason so little is known about them is that most companies require their employees to sign nondisclosure agreements.
"They're precluded from speaking to the media, and it is difficult to reach out and find them," Roberts says. "I think there's an aspect of trauma that can often go along with this work, and many workers would rather go home and tune out, not talk about it. So I think the unknown aspect of this is by design. It's no mistake that it's difficult to find workers who will talk to you about this."
Many of the workers Roberts has spoken to anonymously have said they feel stigmatized because of the content they come in contact with through their jobs.
"It's exacting a toll on these workers, and because this industry is so new and the need for this work is so new, I think the jury is out as to what the real implications are going to be for these people later on in their lives," she says.