Inside Two Years of Turmoil at Big Tech's Anti-Terrorism Group
X has left the board of GIFCT, an anti-terrorism organization through which tech companies exchange information to keep violent content off the web. It's the latest in a series of episodes driving tension within the ranks.
By Paresh Dave
Sep 30, 2024 5:00 AM
Photo-illustration: Jacqui VanLiew; Getty Images
Vice presidents from Meta, YouTube, Twitter, and Microsoft gathered over Zoom in March 2023 to discuss whether to allow TikTok, one of their companies’ most fearsome competitors, into their exclusive club.
The four executives comprised the board of the Global Internet Forum to Counter Terrorism (GIFCT)—where companies share tips intended to prevent their platforms from becoming hotbeds for terrorists—and they knew that TikTok needed help keeping extremist propaganda off its platform. TikTok had passed a training program they required and had addressed their questions about its ties to China. But people briefed on the discussions say the board still worried about the possibility of TikTok abusing its membership in some way that benefited the Chinese government and undermined free expression. On top of that, at the time US lawmakers were considering a ban of the app, and more content moderation mishaps for TikTok likely would add to the heat. The board ultimately didn’t approve TikTok.
A WIRED investigation into GIFCT reveals that TikTok’s bid to join the consortium failed because two of the four executives on the board abstained from voting on its application. A week later, on the fourth anniversary of a deadly terrorist attack in Christchurch, New Zealand, researchers blasted TikTok for hosting footage celebrating the rampage. These were the very videos that would have been easily flagged and removed had TikTok’s rivals granted it access to their group’s threat-spotting technology.
Around the same time, the board members declined to admit the parent company of PornHub, citing concerns over whether its content policies met the bar for membership. By comparison, the board last year quickly approved the unproblematic French social app Yubo. GIFCT’s bespoke advice enabled the startup to identify 50 suspicious accounts that it reported to law enforcement, according to Marc-Antoine Durand, Yubo’s chief operating officer.
More recently, despite having the authority to do so, Meta, Microsoft, and YouTube declined to expel Twitter (now X) from the board, even as the platform's relaxed content moderation practices under Elon Musk threatened reputational harm to GIFCT and its member companies more broadly. This month, X quietly left the board voluntarily.
These secretive membership decisions, revealed for the first time by WIRED, show how Microsoft, Meta, YouTube, and X are gatekeeping access to anti-terrorism guidance and influencing the content users encounter across the web. Our investigation also uncovers contentious fundraising choices by the four companies and the consequences of a lack of quality control checks in their heralded system for flagging violent extremism.
To understand the consortium’s inner workings, WIRED reviewed records of GIFCT’s internal deliberations and spoke with 26 people directly connected to the organization, including leadership, staff, advisers, and partners. Several of the people believe Meta, Microsoft, YouTube, and X have steered the consortium in a way that has undermined its potential. “The result is almost certainly more users are being radicalized,” claims one of the sources, who has been in consistent contact with GIFCT since its inception. Many of the people sought anonymity because their employers or GIFCT hadn’t authorized speaking with WIRED.
The four tech giants have presided over the consortium since they announced it in 2016, when Western governments were berating them for allowing Islamic State to post gruesome videos of journalists and humanitarians being beheaded. Now with a staff of eight, GIFCT—which the board organized as a US nonprofit in 2019 after the Christchurch massacre—is one of the groups through which tech competitors are meant to work together to address discrete online harms, including child abuse and the illicit trade of intimate images.
The efforts have helped bring down some unwelcome content, and pointing to the work can help companies stave off onerous regulations. But the politics involved in managing the consortia generally stay secret.
Just eight of GIFCT’s 25 member companies answered WIRED’s requests for comment. The respondents, which included Meta, Microsoft, and YouTube, all say they are proud to be part of what they view as a valuable group. The consortium’s executive director, Naureen Chowdhury Fink, didn’t dispute WIRED’s reporting. She says TikTok remains in the process to attain membership.
GIFCT has relied on voluntary contributions from its members to fund the roughly $4 million it spends annually, which covers salaries, research, and travel. From 2020 through 2022, Microsoft, Google, and Meta each donated at least $4 million, and Twitter gave $600,000, according to the available public filings. Some other companies contributed tens of thousands or hundreds of thousands of dollars, but most paid nothing.
By last year, at least two board members were enraged at companies they perceived as freeloaders, and fears spread among the nonprofit’s staff over whether their jobs were in jeopardy. It didn’t help that as Musk turned Twitter into X about a year ago, he kept slashing costs, including suspending the company’s optional checks to GIFCT, according to two people with direct knowledge.
To diversify funding, the board has signed off on soliciting foundations and even exploring government grants for non-core projects. “We'd really have to carefully consider if it makes sense,” Chowdhury Fink says. “But sometimes working with multiple stakeholders is helpful.”
Rights activists the group privately consulted questioned whether such grants would amount to subsidies for tech giants, siphoning resources from potentially more potent anti-extremism projects. Records show staff nonetheless considered seeking a grant worth tens of thousands of dollars from the pro-Israel philanthropy Newton and Rochelle Becker Charitable Trust. Chowdhury Fink says GIFCT didn't end up applying.
This year, Meta, YouTube, Microsoft, and X amended GIFCT’s bylaws to require minimum annual contributions from every member starting in 2025, though Chowdhury Fink says exemptions are possible.
Paying members will be able to vote for two board seats, she says. Eligibility for the board is contingent on making a more sizable donation. X had signaled it wouldn’t pay up and would therefore forfeit its seat, two sources say—a development that ended up happening this month. It had been scheduled to hold tiebreaking power among the four-company board in 2025. (Under the bylaws, Meta, YouTube, and Microsoft could have ejected Twitter from the board as soon as Musk acquired the company. But they chose not to exercise the power.)
Many of the people close to GIFCT who spoke with WIRED contend that Meta, YouTube, and Microsoft must banish X altogether because it has allegedly been allowing extremists to openly engage in illicit behavior on its platform, such as reportedly selling weapons. The Times last month reported on calls to oust X.
The public copy of GIFCT’s “tech solutions” code of conduct is largely redacted for “operational security,” but it does state that a company can be banned for “sustained inappropriate behavior.” By the consortium’s own telling, “membership should be recognized and appreciated as a strong indication of good stewardship for the internet and its users.” X reported this month that it suspended over 57,000 accounts in the first half of this year for violating its violent and hateful entities policy. It didn’t respond to requests for comment.