Willner’s then-boss, Jud Hoffman, who has since left Facebook, said that the rules were based on Facebook’s mission of “making the world more open and connected.” Openness implies a bias toward allowing people to write or post what they want, he said.
But Hoffman said the team also relied on the principle of harm articulated by John Stuart Mill, a 19th-century English political philosopher. It states “that the only purpose for which power can be rightfully exercised over any member of a civilized community, against his will, is to prevent harm to others.” That led to the development of Facebook’s “credible threat” standard, which bans posts that describe specific actions that could threaten others, but allows threats that are not likely to be carried out.
Eventually, however, Hoffman said “we found that limiting it to physical harm wasn’t sufficient, so we started exploring how free expression societies deal with this.”
The rules developed considerable nuance. There is a ban against pictures of Pepe the Frog, a cartoon character often used by “alt-right” white supremacists to perpetrate racist memes, but swastikas are allowed under a rule that permits the “display [of] hate symbols for political messaging.” In the documents examined by ProPublica, which are used to train content reviewers, this rule is illustrated with a picture of Facebook founder Mark Zuckerberg that has been manipulated to apply a swastika to his sleeve.
The documents state that Facebook relies, in part, on the U.S. State Department’s list of designated terrorist organizations, which includes groups such as al-Qaida, the Taliban and Boko Haram. But not all groups deemed terrorist by one country or another are included: A recent investigation by the Pakistani newspaper Dawn found that 41 of the 64 terrorist groups banned in Pakistan were operational on Facebook.
There is also a secret list, referred to but not included in the documents, of groups designated as hate organizations that are banned from Facebook. That list apparently doesn’t include many Holocaust denial and white supremacist sites that are up on Facebook to this day, such as a group called “Alt-Reich Nation.” A member of that group was recently charged with murdering a black college student in Maryland.
As the rules have multiplied, so have exceptions to them. Facebook’s decision not to protect subsets of protected groups arose because some subgroups such as “female drivers” didn’t seem especially sensitive. The default position was to allow free speech, according to a person familiar with the decision-making.
After the wave of Syrian immigrants began arriving in Europe, Facebook added a special “quasi-protected” category for migrants, according to the documents. They are only protected against calls for violence and dehumanizing generalizations, but not against calls for exclusion and degrading generalizations that are not dehumanizing. So, according to one document, migrants can be referred to as “filthy” but not called “filth.” They cannot be likened to filth or disease “when the comparison is in the noun form,” the document explains.
Facebook also added an exception to its ban against advocating for anyone to be sent to a concentration camp. “Nazis should be sent to a concentration camp,” is allowed, the documents state, because Nazis themselves are a hate group.
The rule against posts that support violent resistance against a foreign occupier was developed because “we didn’t want to be in a position of deciding who is a freedom fighter,” Willner said. Facebook has since dropped the provision and revised its definition of terrorism to include nongovernmental organizations that carry out premeditated violence “to achieve a political, religious or ideological aim,” according to a person familiar with the rules.
The Facebook policy appears to have had repercussions in many of the at least two dozen disputed territories around the world. When Russia occupied Crimea in March 2014, many Ukrainians experienced a surge of Facebook post removals and profile suspensions. Facebook’s director of policy for the region, Thomas Myrup Kristensen, acknowledged at the time that it “found a small number of accounts where we had incorrectly removed content. In each case, this was due to language that appeared to be hate speech but was being used in an ironic way. In these cases, we have restored the content.”
Katerina Zolotareva, 34, a Kiev-based Ukrainian working in communications, has been blocked so often that she runs four accounts under her name. Although she supported the “Euromaidan” protests in February 2014 that antagonized Russia, spurring its military intervention in Crimea, she doesn’t believe that Facebook took sides in the conflict. “There is war in almost every field of Ukrainian life,” she says, “and when war starts, it also starts on Facebook.”
In Western Sahara, a disputed territory occupied by Morocco, a group of journalists called Equipe Media say Facebook disabled their account, their primary means of reaching the outside world. They had to open a new account, which remains active.
“We feel we have never posted anything against any law,” said Mohammed Mayarah, the group’s general coordinator. “We are a group of media activists. We have the aim to break the Moroccan media blockade imposed since it invaded and occupied Western Sahara.”
In Israel, which captured territory from its neighbors in a 1967 war and has occupied it since, Palestinian groups are blocked so often that they have their own hashtag for it: #FbCensorsPalestine. Last year, for instance, Facebook blocked the accounts of several editors at two leading Palestinian media outlets from the West Bank, Quds News Network and Sheebab News Agency. After a couple of days, Facebook apologized and unblocked the journalists’ accounts. Earlier this year, Facebook blocked the account of Fatah, the Palestinian Authority’s ruling party, then unblocked it and apologized.
Last year India cracked down on protesters in Kashmir, shooting pellet guns at them and shutting off cellphone service. Local insurgents are seeking autonomy for Kashmir, which is also caught in a territorial tussle between India and Pakistan. Posts by Kashmir activists were being deleted, and members of a group called the Kashmir Solidarity Network found that all of their Facebook accounts had been blocked on the same day.
Ather Zia, a member of the network and a professor of anthropology at the University of Northern Colorado, said that Facebook restored her account without explanation after two weeks. “We do not trust Facebook any more,” she said. “I use Facebook, but it’s almost this idea that we will be able to create awareness but then we might not be on it for long.”
The rules are one thing. How they’re applied is another. Bickert said Facebook conducts weekly audits of every single content reviewer’s work to ensure that its rules are being followed consistently. But critics say that reviewers, who have to decide on each post within seconds, may vary in both interpretation and vigilance.
Facebook users who don’t mince words in criticizing racism and police killings of racial minorities say that their posts are often taken down. Two years ago, Stacey Patton, a journalism professor at historically black Morgan State University in Baltimore, posed a provocative question on her Facebook page. She asked why “it’s not a crime when White freelance vigilantes and agents of ‘the state’ are serial killers of unarmed Black people, but when Black people kill each other then we are ‘animals’ or ‘criminals.’”
Although it doesn’t appear to violate Facebook’s policies against hate speech, her post was immediately removed, and her account was disabled for three days. Facebook didn’t tell her why. “My posts get deleted about once a month,” said Patton, who often writes about racial issues. She said she also is frequently put in Facebook “jail” — locked out of her account for a period of time after a posting that breaks the rules.
“It’s such emotional violence,” Patton said. “Particularly as a black person, we’re always having these discussions about mass incarceration, and then here’s this fiber-optic space where you can express yourself. Then you say something that some anonymous person doesn’t like and then you’re in ‘jail.’”
Didi Delgado, whose post stating that “white people are racist” was deleted, has been banned from Facebook so often that she has set up an account on another service called Patreon, where she posts the content that Facebook suppressed. In May, she deplored the increasingly common Facebook censorship of black activists in an article for Medium titled “Mark Zuckerberg Hates Black People.”
Facebook also locked out Leslie Mac, a Michigan resident who runs a service called SafetyPinBox, where subscribers contribute financially to “the fight for black liberation,” according to her site. Her offense was writing a post stating “White folks. When racism happens in public — YOUR SILENCE IS VIOLENCE.”
The post does not appear to violate Facebook’s policies. Facebook apologized and restored her account after TechCrunch wrote an article about Mac’s punishment. Since then, Mac has written many other outspoken posts. But, “I have not had a single peep from Facebook,” she said, while “not a single one of my black female friends who write about race or social justice have not been banned.”
“My takeaway from the whole thing is: If you get publicity, they clean it right up,” Mac said. Even so, like most of her friends, she maintains a separate Facebook account in case her main account gets blocked again.
Negative publicity has spurred other Facebook turnabouts as well. Consider the example of the iconic news photograph of a young naked girl running from a napalm bomb during the Vietnam War. Kate Klonick, a Ph.D. candidate at Yale Law School who has spent two years studying censorship operations at tech companies, said the photo had likely been deleted by Facebook thousands of times for violating its ban on nudity.
But last year, Facebook reversed itself after Norway’s leading newspaper published a front-page open letter to Zuckerberg accusing him of “abusing his power” by deleting the photo from the newspaper’s Facebook account.
Klonick said that while she admires Facebook’s dedication to policing content on its website, she fears it is evolving into a place where celebrities, world leaders and other important people “are disproportionately the people who have the power to update the rules.”