Can we keep Facebook safe without harming staff?


In 2017, Facebook quietly changed its mission statement. A commitment to "make the world more open and connected" was replaced with a pledge to "give people the power to build community and bring the world closer together."

You could view this as an admission that "open" has failed. "Open" means open to hate speech, child abuse, violence, sex, and the kinds of illegality Facebook would rather have nothing to do with. And yet someone now has to clean up that mess, every hour of every day.

Or rather, strangers are employed to do the dirty work. In The Cleaners, a documentary by Hans Block and Moritz Riesewieck, Filipino contract moderators frankly discuss the constant stream of sex, violence and hate speech they have to sift through every day.

Mozfest 2019

Former Facebook moderator Chris Gray and filmmaker Moritz Riesewieck at Mozfest 2019

(Image credit: Connor Ballard-Bateman)

Each decision has to be made in eight to ten seconds, and "don't think too much" is a direct quote from the training material. "Don't worry too much about whether your decision is right or wrong, otherwise you'll overthink it and won't be able to decide," Riesewieck tells LaComparacion at Mozilla's Mozfest, where he and his co-director have come to take part in a panel discussion on internet moderation.

If ever there were a company that could afford to throw money at a problem, it is Facebook. And yet, so far, the problem only keeps growing. In 2009, Facebook had just 12 (yes, twelve) content moderators looking after the wellbeing of 120 million users. There are now more than two billion people on the platform and around 15,000 moderators. While that means the moderator-to-user ratio has improved enormously, it should be noted that Facebook in 2019 is a very different beast from what it was a decade ago, when the "Like" button was the latest innovation and Facebook Live was still years away.

"The worst of the worst of Internet waste"

"An estimated 100,000 professionals are working in this field," says Clara Tsao, a Mozilla colleague and an expert in combating misinformation online. "They deal with the worst of the worst waste on the Internet," he adds, noting that on January 4, they are literally called "janitors."

However, unlike real-world janitors, the internet's cleaners aren't always equipped with the right tools for the enormous task at hand. The Filipino Facebook contingent sometimes encounters exchanges in languages they do not speak, relying on Google Translate to work out the meaning. That inevitably strips out nuance, even before you get to the cultural differences between countries separated by eight time zones.

Social media images

Facebook moderators have to monitor huge amounts of content from around the world, and may have to assess conversations in languages they don't speak.

(Image credit: Shutterstock)

Facebook moderators are not only found in the Philippines. There are offices all over the world, and it was in Dublin that Chris Gray found himself after a spell teaching in Asia. He is now the lead plaintiff among the moderators bringing High Court proceedings against Facebook. During a nine-month stint at the company (in Ireland, most workers are on 11-month contracts, though many leave early), Gray assessed 500 to 600 pieces of content a night, usually working from 6pm to 2am. A year after leaving, he was formally diagnosed with post-traumatic stress disorder.

"It took me a year to realize that this job had knocked me out," he said in the group discussion. This late reaction, Riesewieck tells us, is not entirely unusual. "In some cases, they told us that it was mainly their friends who told them they had changed," he says.

It took me a year to realize this job had kicked my ass

Chris Gray

In any event, many of Gray's former colleagues are delighted to see him break his NDA and bring the lawsuit, even if they are not yet ready to say so publicly. "People just come out of the woodwork and say, 'Oh thank God, someone spoke up and said that,'" he told LaComparacion later.

To be clear, despite being personally affected by the job, Gray feels it is misleading to assume the work is endless gore and sexual exploitation. "To be honest, most of the work is tedious," he says. "It's just people reporting each other because they're having an argument and want to use whatever process they can to get at the other person."

Tedious, but high pressure. At the Irish office, Gray had 30 seconds to deliver a verdict on each piece of content, whether it was a one-line insult or a 30-minute video. "If your auditor clicked two seconds further into a video than you did and saw something different, heard a different slur, or spotted something higher up the priority hierarchy, then tough luck: you made a wrong decision." Wrong decisions affect your quality score, and your quality score affects your job. Despite this, the office's target was a near-impossible 98% accuracy.

Superheroes

It's hard to find people willing to talk about their experience of moderation, as Block and Riesewieck discovered when searching for subjects. NDAs are universal, and the work is hidden behind a code name: at the time of filming it was the "Honey Badger Project."

Despite this, Facebook, or rather the subcontractors that handle its moderation, recruit openly, even if the adverts are often wildly misleading about what the job actually entails. "They dress it up as superheroes in capes: come and be a superhero, clean up the internet," says Gabi Ivens, another Mozilla Fellow on the panel. "One ad in Germany for content moderators asked questions like 'Do you like social media, and do you want to keep up with what's happening in the world?'"

Despite the day-to-day tedium, Block and Riesewieck's documentary reveals something surprising: many of their interview subjects take real pride in the role, seeing it less as a job than as a duty.

Facebook

Filipino Facebook moderators told directors Hans Block and Moritz Riesewieck that they felt it was their ethical duty to clean up the Internet.

(Image credit: Shutterstock)

"They told us that they felt like internet superheroes, like police officers policing the internet," Block said. Administrators attribute this in part to 90 percent of the Christian population of the Philippines. "They told us they felt that Jesus was setting the world free," adds Block. This in turn could make people reluctant to leave, viewing it as an ethical duty and not just a job.

But there are limits to this, especially since the moderators don't make the final calls themselves. Here, the sacred text is Facebook's labyrinthine set of rules and guidelines: thousands of words accumulated over many years. In some cases, people must protect speech they think should be banned, or ban speech they feel should be protected, which, according to Ivens, is a clear welfare issue in itself. "Keeping content online that you don't think should be online is extremely damaging, before you even think about what people are watching."

The irony of treating the rules as sacred is that they have never been a fixed, infallible text: they are the product of years of iterative change, gradually responding to crises as they arise and attempting to make the subjective more objective.

Keeping content online that you don't think should be online is extremely damaging before you even think about what people are watching.

Gabi Ivens

Do you remember the "Free Nipple" campaign? In short, Facebook's guidelines originally stated that any breast photography should be banned as pornography, meaning the internet was deprived of proud mothers breastfeeding on the platform. Facebook has gradually changed its rules and accepted that context mattered. Similarly, we must accept that even if there is nothing illegal about eating tide pods or spreading conspiracy theories against vaccination, if something does turn into a public health epidemic, therefore, it is your duty to act.

"Some platforms say that some of the content may not be illegal, but that it is unacceptable," says Tsao. But "others think that the Internet should have greater freedom to say what it wants." For Facebook, this dichotomy produces absurd levels of granularity: "Now we have some clues about threatening to shove someone off the ceiling," says Gray. "Pushing is not a violent action. Being on the roof is important, but how high is it?" Too bad you "don't think too much."

It is this kind of hair-splitting in the moderation guidelines that allows internet trolls to thrive. You don't have to look far to find examples of online rabble-rousers who go right up to the line without crossing it. Instead, they leave that to their followers, and sometimes, catastrophically, it spills over into the real world.

Morality does not cross borders

Facebook's global reach makes the problem even more complex, because morality is not shared across borders. "It's complicated because it goes beyond countries' local politics and becomes a bit of a wild west," says Tsao.

Gray gives the example of people's sexuality: gay pride is highly visible in most Western countries, but far less so elsewhere in the world. You might tag a friend as gay in a post, and they might be comfortable enough with their sexuality to share it. In that case, it is reasonable to leave the post up, even if a homophobic user complains about it.

Facebook

Morality is not a global concept, which makes international content moderation a major challenge

(Image credit: Shutterstock)

"But if you are in Nigeria, you could be beaten or killed because someone sees this message," he says. "This mistake could cost someone their life. I mean, it's the reality: sometimes you look at life and death situations."

Objective acts of violence ought to be more clear-cut, but they are not. A video showing a child being shot may seem an obvious candidate for removal, but what if it is citizen journalism exposing unreported war crimes? If Facebook takes it down, doesn't it become the unwitting propaganda wing of the world's worst despots?

That's the reality: sometimes you're looking at life and death situations

Chris Gray

It is complex, confusing territory, and it does not help that workers are judged on their supposedly objective responses to subjective posts. "A post turns up on my desk," Gray said on the panel, "and I have to make the call: is this baby dead? Then I have to press the right button. If I press the wrong button because my auditor thinks the baby isn't dead, then I'm wrong, my quality score drops, and I'm a step closer to being fired.

"So I'm up at night in bed, I see that picture again and I try to argue to keep my job."

Can it be fixed?

It should be pretty obvious by now that this is not a problem entirely of Facebook's making, even if the company has not helped itself along the way. But what can it do? Throwing money at the problem clearly isn't working, and AI moderation isn't ready for prime time; there are legitimate doubts it ever will be. For a start, you need humans to train the AI, which just moves the trauma one step back. ("It will be very difficult to completely remove humans from the loop," says Tsao.)

"Facebook doesn't have a clear strategy for this," says Gray. "Everything is reactive. Something happens, so they create a new rule and hire more people. He thinks that lack of leadership is the root of the problem." You have to know where you are going with that and what your strategy is, and they're not doing it. "It all follows from that."

Stressed worker

Roderick Ordens, professor of psychology, believes it is extremely important that no one does this kind of work alone, and that responsibility is not felt to rest on any one individual.

(Image credit: Shutterstock)

This, Tsao believes, is partly because the decision-makers have never had to do the job themselves. "I've interviewed people responsible for trust and safety at companies, and one of them always said that if you want to take a leadership role in this field, you have to understand how it works at the bottom," she says. "You have to understand the trauma, you have to understand what kind of support system is needed."

Roderick Ordens, a professor of psychology at the University of Lincoln, offered his own perspective when we contacted him. "There is a duty of care. That in no way guarantees there will be no casualties among those who see this kind of material, but the company must be seen to have done everything reasonable to reduce the risk to its staff.

No one should do this kind of work alone. And if it is done by a group, what really matters is strong group cohesion.

Roderick Ordens

"First of all, no one should do this kind of work alone. And if it was done by a group, then what really matters is the strong cohesion of the group. It is very important to organize this so that responsibility is not perceived as the responsibility of the group. individual.

According to Ordens, any company hiring for "such a hazardous job" should provide training that helps employees recognize the warning signs: "a general feeling of unease, not being able to wind down after work, perhaps being preoccupied with certain images. And be particularly vigilant about whether sleep is affected: with an accumulation of bad sleep, everything else gets much worse."

"What is your mind worried about?"

Whether Facebook wants that kind of insight is another matter. "We don't claim that all the blame lies with the company; that wouldn't be true," says Block. "The fault, at least in our opinion, is that they don't make the process transparent, they don't open up the discussion, and they don't accept that they can't decide all of this on their own." Block and Riesewieck know that some Facebook employees saw their film at a screening in San Francisco, and there was even talk of showing it at Facebook's offices, but that was as far as it went: follow-up emails went mysteriously unanswered.

The culture of NDAs certainly doesn't help, although the sheer number of current and former employees bound by them means their effect will inevitably fade: there is safety in numbers. Gray has heard nothing from Facebook about breaking his, at least not directly.

"I got a call from a former colleague... a few weeks ago. They said, 'Hey, they told me Facebook is suing you.' No. Who did you say I got sued? My team leader.' your team tries to manipulate you into shutting up."

I don't know if anything was ever done. You know, it all just seems to go into a vacuum

Chris Gray

In other words, the balance between carrot and stick is as skewed as Facebook's moderator-to-user ratio. Given that people take this work on because they want to make the internet better, perhaps Facebook could tap into that sense of purpose?

Even Gray brightens when recalling a text message he sent early on. "I said: 'I have personally escalated 13 cases of child abuse to the rapid response team, and I feel really good.'" But it didn't last. "I never heard anything more about those cases. I don't know if anything was ever done. You know, it all just seems to go into a vacuum.

Could a little recognition boost morale? Maybe, but only if the reports actually have an impact, and Gray has seen enough to doubt it. "A child in Afghanistan is tied naked to a bed and beaten. I escalate it because it's sexual abuse of a child, but what can actually be done?

"I'm just following the policy, the person at the top level is just eliminating it."