
In 2017, Facebook ever so slightly adjusted its mission statement.
Out went a pledge to make the world 'more open and connected', and in its place came an intention to 'give people the power to build community and bring the world closer together'.

You could view this as an admission that 'open' had failed.
'Open' means open to hate speech, child abuse, violence, sex and the kind of illegal acts Facebook would rather have nothing to do with.
And yet the company now finds itself having to clean up such messes every hour of every day. Or rather, it employs outsiders to do said dirty work.
In The Cleaners, a documentary by Hans Block and Moritz Riesewieck, contractors from the Philippines candidly discuss the steady stream of sex, violence and hate speech they have to sift through every day.

Former Facebook moderator Chris Gray and filmmaker Moritz Riesewieck at Mozfest 2019 (Image credit: Connor Ballard-Bateman)

They have to make each decision in eight to 10 seconds, they say, and 'don't overthink' is a direct quote from the training materials, such as they are.
'Don't doubt too much whether your decision is right or wrong, because otherwise you will overthink it, and then you won't be able to take a decision,' Riesewieck summarises to TheIndianSubcontinent at Mozilla's Mozfest, where he and his co-director have just been on a panel discussing internet moderation.

If ever there were a company to stress-test the idea that any problem can be solved with enough money, it's Facebook.
And yet, so far, the problem just continues to grow.
In 2009, Facebook had just 12 (yes, that's twelve) content moderators looking out for the welfare of 120 million users.
There are now over two billion people on the platform and around 15,000 moderators.
While that means the ratio of moderators to users has improved from paltry to merely feeble, it's worth reflecting that Facebook in 2019 is very different to what it was a decade ago, when the Like button was the latest innovation and Facebook Live was still years away.

"The worst of the worst of the internet's trash"

'Estimates say that there are about 100,000 professionals that work in this field,' says Clara Tsao, a Mozilla fellow and expert in countering online disinformation.
'They deal with the worst of the worst of the internet's trash,' she adds, noting that on 4chan they're literally called 'janitors'.

Unlike real-world janitors, though, the internet's cleaners aren't always given the right equipment for the enormous task at hand.
Facebook's Filipino contingent would occasionally encounter exchanges in languages they didn't speak, relying on Google Translate to follow the meaning.
That inevitably takes a sledgehammer to nuance, before you even get onto the cultural differences inevitable between countries separated by an eight-hour time-zone gap.

Facebook moderators have to monitor vast amounts of content from around the world, and may be required to assess conversations in a language they don't speak (Image credit: Shutterstock)

Facebook moderators aren't only located in the Philippines.
There are offices around the world, and it was in Dublin where Chris Gray found himself after a spell teaching in Asia.
Now he's the lead plaintiff representing moderators in High Court proceedings against Facebook.
Over a nine-month spell at the company (in Ireland, most workers are on 11-month contracts, he says, but most leave early), Gray was dealing with 500-600 pieces of content a night, usually in the 6pm to 2am slot.
It was only a year after he left that he was officially diagnosed with PTSD.

'It took me a year before I realised that this job had knocked me on my arse,' he says as part of the panel discussion.
This delayed reaction, Riesewieck tells us, isn't wholly uncommon.
'In some cases they told us it's mostly their friends telling them that they changed,' he explains.

In any case, many of Gray's former colleagues are privately pleased that he is breaking NDA and leading the charge to legal action, even if they're not prepared to say so publicly just yet.
'People are just coming out of the woodwork and saying, "Oh, thank God, somebody has spoken out and said this,"' he tells TheIndianSubcontinent later.

To be clear, despite having personally been affected by the work, Gray feels it's misleading to assume the job is non-stop gore, child exploitation and sex.
'To be honest, most of the work is tedious,' he says.
'It's just people reporting each other because they're having an argument and they want to use some process to get back at the other person.'

Tedious, but high pressure.
In the Irish office, Gray had 30 seconds to pass verdict on a piece of content, whether it was a one-line insult or a 30-minute video.
'If your auditor clicked in [on a video] two seconds later than you and he saw something different, he heard a different slur, or he saw something higher up the priority ladder, then bang, you've made a wrong decision.'
Wrong decisions affect your quality score, and your quality score affects your employment.
Despite this, the target for the office was a nigh-on impossible 98% accuracy.

Superheroes

Finding people to talk about their moderation experience is tough, as Block and Riesewieck found when looking for subjects.
NDAs are universal, and the work comes under a codename; at the time of filming it was 'Project Honey Badger'.

Despite this, Facebook, or rather the subcontractors that deal with moderation, hire quite openly, even if they're often grossly misleading about what the job actually entails.
'They use superheroes in costumes: come be a superhero, clean up the internet,' explains Gabi Ivens, another Mozilla fellow on the panel.
One advert in Germany for content moderators asked questions like 'do you love social media?' and 'do you want to be up to date with what's happening in the world?'

But despite the general tediousness of the day-to-day, there's a surprising element to Block and Riesewieck's documentary: many of their interview subjects took real pride in the role, seeing it as less of a job and more of a duty.

Filipino Facebook moderators told filmmakers Hans Block and Moritz Riesewieck they felt it was their ethical duty to clean up the internet (Image credit: Shutterstock)

'They told us they feel like superheroes of the internet, like policemen guarding the internet,' says Block.
The directors credit this in part to the Philippines' roughly 90% Christian population.
'They told us they feel like Jesus freeing the world from it,' Block adds.
This, in turn, might make people reluctant to walk away, seeing the work as an ethical duty rather than just a job.

But there are limits to this, especially as moderators aren't making the final calls themselves.
Here, the sacred text is Facebook's labyrinthine set of rules and instructions: thousands of words accumulated over many years.
In some cases, people are having to protect speech they think should be banned, or ban speech they think should be protected, something that Ivens sees as an obvious problem for wellbeing.
'Keeping content online that you don't think should be online is extremely damaging, even before you think about what people are seeing.'

The irony of treating the rules as sacred is that Facebook's rules have never been an infallible set text: they're the result of years of iterative changes, gradually responding to crises as they emerge, and trying to make the subjective more objective.

Remember the 'free the nipple' campaign? In short, Facebook's guidelines originally said that any photograph featuring breasts should be banned as pornographic, which meant proud mothers couldn't share breastfeeding photos on the platform.
Facebook gradually shifted its rules and accepted that context matters.
In the same way, it's had to accept that although there's nothing illegal about people eating Tide Pods or spreading anti-vaccination conspiracy theories, if something becomes a public health epidemic then it has a duty to step up.

'Some platforms say certain content might not be unlawful, but is unacceptable,' explains Tsao.
'But other people feel like the internet should have broader freedoms to say whatever you want.'
For Facebook, this dichotomy produces absurd levels of granularity. 'Now we've got some guidance on if you threaten to push somebody off a roof,' Gray says.
'Pushing is not a violent action.'
'The fact that you're on a roof is important, but then how high is the roof?' So much for that 'don't overthink' guidance.

This kind of inertia in moderation guidelines lets internet trolls thrive.
You don't have to look very hard to come up with examples of internet rabble-rousers who step right up to the line without ever quite overstepping it.
Instead, they leave that to their followers, and sometimes, catastrophically, that spills over into the real world.

Morality doesn't cross borders

Facebook's global status makes the problem even more complex, because morality isn't shared across borders.
'It's complicated because it surpasses the local policies of countries and borders, right into a wild west,' Tsao says.

Gray gives the example of people's sexuality: gay pride is very much a thing in most of the West, but less so elsewhere in the world.
You might tag a friend as gay in a post, and they're comfortable enough with their sexuality to share it.
So in that instance, it feels reasonable not to take the post down, even if a curmudgeonly homophobe complains about it.

Morality isn't a global concept, which makes moderating international content a huge challenge (Image credit: Shutterstock)

'But then if you're in Nigeria, that post could get you beaten or killed because somebody sees it,' he explains. 'That mistake could cost somebody their life.'
'I mean, this is the reality of it: you are sometimes looking at life-and-death situations.'

Objective acts of violence should be more clear-cut, but they aren't.
Video of a child getting shot might seem like an obvious candidate for deletion, but what if it's citizen journalism uncovering unreported war crimes? If Facebook takes that down, isn't it just the unwitting propaganda wing of the world's worst despots?

This is complex, easily muddled, and it doesn't help the workers who are being judged on their objective responses to subjective posts.
'People are protesting and it's appearing on my desk,' Gray says during the panel.
'And I've got to make the call: is that baby dead? And then I've got to press the right button, and if I press the wrong button because my auditor thinks the baby's not dead, then I've made a mistake and it goes towards my quality score and I get fired.

'So I'm lying awake in bed at night, seeing that image again and trying to formulate an argument to keep my job.'

Can it be fixed?

It should be pretty obvious at this point that this isn't entirely Facebook's fault, even if the company hasn't exactly helped itself along the way.
But what can it do? It's pretty clear that throwing people at the problem won't work, nor is AI moderation ready for prime time.
(And there are legitimate doubts that it ever will be: for starters, you need humans to train the AI, which just moves the trauma one step back.
'I think it'll be really hard to completely remove humans from the loop,' says Tsao.)

'Facebook don't have a clear strategy for this,' says Gray.
'It's all reactive. Something happens, so they make a new rule and hire more people.'
He believes a lack of leadership is the root of the problem.
'You need to know where you're going with this and what your strategy is, and they don't.'
'Everything stems from that.'

Psychology professor Roderick Ordens says it's crucially important that nobody does this type of work alone, so responsibility doesn't lie entirely with an individual (Image credit: Shutterstock)

That, Tsao believes, is in part because the decision-makers haven't had to do the job themselves.
'I've interviewed a bunch of heads of trust and safety at companies, and every one of them has said: if you're going to be in a management role in this professional field, you have to understand what it's like on the bottom,' she says.
'You have to understand the trauma, you have to understand what kind of support system is needed.'

Roderick Ordens, a psychology professor at the University of Lincoln, offers his own perspective when we reach out to him.
'There is a duty of care. This doesn't in any way guarantee that there aren't going to be casualties amongst people who view this kind of material, but the company must be seen to have done everything reasonable to reduce risks to staff.

'First of all, nobody should be doing this kind of work alone. And if it's done by a group, then the thing that's really important is strong group cohesion.'
'It's very important to arrange this in such a way that responsibility is not seen to lie with the individual.'

Any company hiring for such dangerous work, Ordens says, should provide training so that employees can recognise the warning signs: a general sense of unease, not being able to relax after work, maybe being unduly preoccupied with certain images.
Employers should be particularly watchful of whether sleep is adversely affected: with an accumulation of poor sleep, everything else feels much worse.

"What's on your mind?"

Whether Facebook is interested in such insights is another matter.
'We don't claim that all the fault is on the side of the companies; that's not true,' says Block.
'The fault, we consider at least, is that they don't make it transparent, they don't open the discussion, and they don't accept that they alone can't decide about all that.'
Block and Riesewieck know that some Facebook employees have seen their film at a screening in San Francisco, and there was even talk of showing it at Facebook's offices, only for follow-up emails to go mysteriously unanswered.

Certainly the NDA treatment isn't helping, though with the sheer quantity of ex- and current employees bound by them, the effect will inevitably lessen, as there's a certain safety in numbers.
Gray hasn't had any word from Facebook over breaking his, at least not directly.

'I had a call a couple of weeks ago from a former colleague, and they said: "Hey, I hear you're being sued by Facebook."'
'No. Who told you I was being sued?'

'My team leader.'

'Your team leader is trying to manipulate you into silence.'

In other words, the carrot-to-stick balance feels as comically off as Facebook's moderator-to-user ratio.
Given that people want to make the internet a better place, perhaps Facebook could tap into that sense of meaning?

Even Gray remembers feeling positive, recalling a text message he sent in the early days.
'I said, "I have personally escalated 13 cases of child abuse to the rapid response team, and I feel really good."'
But it didn't last.
'I never heard back from those people. I don't know if anything was ever done. You know, it just goes off into the void, it seems.'

Could acknowledgement of making a difference boost morale? Maybe, but only if moderators really are having an impact, Gray fairly interjects.
'Some kid in Afghanistan is tied to the bed naked and he's being beaten. I escalate that because it's child sexual abuse, but what can anybody do? I'm just following the policy; the person next level up is just deleting it.'





