by Ken Urban May 3, 2021
In 2020, Artistic Director Steve Cosson and I were awarded a commission from Ensemble Studio Theatre and the Sloan Foundation to develop my new play The Moderate. This project is about the world of internet content moderators. Inspired by the work of scholars like Sarah T. Roberts and documentaries like The Cleaners, the play follows Frank during his lockdown year. Recently unemployed and separated from his wife and son during a global pandemic, Frank accepts a job as a content moderator for a large social media company. He must watch and read at least 2,000 flagged videos and posts a day, “accepting” or “rejecting” them. Watching this material day in and day out forces Frank to make a choice about his own life and take a chance on saving a teenage girl he has never met.
Steve suggested that I join this year’s R&D Group in order to structure our work together on the play. In the fall, I conducted interviews with scientists, researchers, and policymakers working in the fields of AI and internet moderation. Many were incredibly generous, one or two were a little suspicious, and a few flat-out refused any request. I was also fortunate to speak with people currently working as internet content moderators. They spoke to me at great risk, since they are made to sign NDAs (non-disclosure agreements) as part of their jobs. These conversations changed me. They were not only invaluable for writing the play, but they also altered my relationship to the internet.
Here are some selected highlights from those conversations:
Interview with Sarah T. Roberts
Sarah is an assistant professor of information studies in the Graduate School of Education and Information Studies at UCLA. She is an expert in the areas of internet culture, social media, digital labor, and the intersections of media and technology. She coined the term “commercial content moderation” (CCM) to describe the work paid content moderators do to enforce platform guidelines and legal standards. Roberts wrote the book Behind the Screen: Content Moderation in the Shadows of Social Media.
Ken: From the interviews you’ve done with content moderators, what did the contractors look for when hiring them? Why did the people that you interviewed end up getting these jobs?
Sarah: I will preface what I say by saying that I think, at this point, the need for bodies has superseded the desire to be picky. So I think it’s probably not quite as discriminating anymore for these kinds of cattle-call jobs in call centers.
But when I talked to the workers, particularly the workers at MegaTech [euphemism for a company we cannot name], there was a bit of a vetting process for them. Each one of them came through a different contracting agency, which already was funny and weird, and strategic on some level, I imagine. So, they were being hired. Their whole hiring process was handled by the third-party contractors, but MegaTech of course ultimately sets the terms of what kind of employee they want. And so they had said things like, we need four-year university graduates. That was part of the mandate. MegaTech’s preference is for graduates of schools like where we teach now [Ken teaches at MIT]. So, graduates from more elite institutions, that’s what they were looking for. The people that I talked to had graduated from Berkeley, USC. One person graduated from a small liberal arts college with a good reputation.
All of them graduated with significant debt. So I think that’s a huge piece. They have to pay their student loans.
Another person I know who worked down in Austin has a PhD in English literature. And the job market was so bad. So bad that it’s actually… We might question the ethics of continuing to produce those students. Knowing something about the inner workings of that department, how can they completely fail to support their PhD students and turn them out with debt? They are now overeducated for just about every other job.
At that time, content moderators were highly competent and highly educated, but the propensity of Silicon Valley is to devalue anything that isn’t in the STEM fields. I don’t need to tell you that, working at MIT. So there was no way that they really saw bringing in a four-year grad from Cal State with a degree in economics as having any value in any other domain within their company.
Of course, now these companies hire vast swaths of people. When I have interacted with them, Facebook, for example, or other big companies, I’m usually interacting with mid-level people who are like in their late twenties. They have master’s degrees from places like Georgetown, in international relations or peace studies. They’re real do-gooder types. They love to hire people like that for the policy arm, but for the implementation of the actual moderation, it was a lot of new grads, for whom there was no clear path in the world, outside of what was being touted as America’s new economic hope, which was the tech industry.
And of course for them, Facebook and the companies like it, the internet content moderators they hire… some of them graduated with English degrees or history degrees, so it was all the wrong degree. The right schools and the wrong degrees. As if the humanities and social science understandings of the world are not key to doing a good job at content moderation. Of course, they are. Because you know something, you’re not an ignoramus. You know what you’re looking at when you’re looking at symbols. I mean, you’re looking at signifiers, right? You’re looking at things that have meaning, and you are called on to use your cultural capital and your intellect to decide. And so, in fact, they knew on some level that that education was of value, but they devalue it at the cultural level, at the ideological level, of Silicon Valley. They wanted that, but they knew they could get it at fire-sale prices.
This is my interpretation. If it were this thought out, I’d be surprised, but this is how this kind of economy got generated. And so you have this cauldron of people who are highly educated, who are very smart, who are achievers, and you put them in a job that is rote, boring, no upside, no room for growth. Instead of a trajectory up, it’s a revolving door that’s going to spit you out.
It’s almost like factory work. Of course, we know it doesn’t do the physical things to one’s body that being on an assembly line or being in manufacturing might do. You’re not going to lose a finger. You’re not doing repetitive motion stuff, but it has these other damages, these other harms that, frankly, are invisible. It’s even worse in a way, because you can’t even make an informed decision when you go into the job. Maybe a little bit more so now, but certainly not at that time.