
bnew


Social media users’ actions, rather than biased policies, could drive differences in platform enforcement​

Published on
2 Oct 2024

Written by
Mohsen Mosleh

New research from a team of US and UK researchers has found that politically conservative users tend to share misinformation at a greater volume than politically liberal users.


New research from a team of US and UK researchers has found that politically conservative users tend to share misinformation at a greater volume than politically liberal users, and that this difference could explain why conservatives were suspended more frequently. An asymmetry in sanctions is therefore not, by itself, evidence of biased policies on the part of social media companies.

In their new paper, “Differences in misinformation sharing can lead to politically asymmetric sanctions,” published today in Nature, the researchers suggest that the higher rate of policy enforcement (such as account suspensions) against conservative users could be explained by the higher volume of misinformation those users share, and so does not constitute evidence of inherent bias in social media companies’ policies or in the definition of what counts as misinformation.

The paper was written by researchers from the MIT Sloan School of Management, the University of Oxford, Cornell University, and Yale University; its co-authors are Mohsen Mosleh, Qi Yang, Tauhid Zaman, Gordon Pennycook, and David G. Rand.

The spread of misinformation has become an increasing concern, especially as the 2024 presidential election in the United States approaches. Many Americans who disagree on political issues agree that the sharing of false information is a substantial problem; sixty-five percent of Americans say that technology companies should take action to restrict the spread of false information. However, there is considerable disagreement over whether tech companies are actually moderating their platforms fairly.

“Accusations of political bias are often based largely on anecdotes or noteworthy cases, such as the suspension from Twitter and Facebook of former President Trump,” said MIT Sloan professor Rand. “This study allows us to systematically evaluate the data and better understand the differential rates of policy enforcement.”

The asymmetry of conservative sanctions versus liberal sanctions should not be attributed to partisan bias on the part of social media companies and those determining what counts as misinformation, Rand and the co-authors noted.

The research began by looking at Twitter’s suspension of users following the 2020 U.S. presidential election. Researchers identified 100,000 Twitter users from October 2020 who shared hashtags related to the election, and randomly sampled 9,000 — half of whom shared at least one #VoteBidenHarris2020 hashtag and half of whom shared at least one #Trump2020 hashtag. Researchers analysed each user’s data from the month before the election to quantify their tendency to share news from low-quality domains (as well as other potentially relevant characteristics), and then checked nine months later to determine which users were suspended by Twitter.

Accounts that had shared #Trump2020 before the election were 4.4 times more likely to have been subsequently suspended than those who shared #VoteBidenHarris2020. Only 4.5% of the users who shared Biden hashtags had been suspended as of July 2021, while 19.6% of the users who shared Trump hashtags had been suspended.
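
As a rough illustration of the comparison described above (not the authors' actual code), the sketch below builds a hypothetical table of sampled accounts, with made-up `camp` and `suspended` columns chosen to reproduce the reported rates, and computes the roughly 4.4x asymmetry (19.6% / 4.5% ≈ 4.4):

```python
import pandas as pd

# Hypothetical sample: 4,500 accounts per camp, with suspension flags chosen
# to match the reported rates (19.6% for Trump hashtags, 4.5% for Biden hashtags).
users = pd.DataFrame({
    "camp":      ["trump"] * 4500 + ["biden"] * 4500,
    "suspended": [True] * 882 + [False] * 3618 + [True] * 203 + [False] * 4297,
})

# Suspension rate per camp, and the ratio between the two camps.
rates = users.groupby("camp")["suspended"].mean()
print(rates)                            # biden ~0.045, trump ~0.196
print(rates["trump"] / rates["biden"])  # ~4.3-4.4, the reported asymmetry
```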

“We found that there were political differences in behaviour, in addition to the political differences in enforcement,” said Rand. “The fact that the social media accounts of conservatives are suspended more than those of liberals is therefore not evidence of bias on the part of tech companies, and shouldn’t be used to pressure tech companies to abandon policies meant to reduce the sharing of misinformation.”

To better understand this difference, the researchers examined what content was shared by these politically active Twitter users in terms of the reliability of the sources through two different methods. They used a set of 60 news domains (the 20 highest volume sites within the categories of mainstream, hyper-partisan and fake news), and collected trustworthiness ratings for each domain from eight professional fact-checkers. In an effort to eliminate concern about potential bias on the part of journalists and fact-checkers, the researchers also collected ratings from politically-balanced groups of laypeople. Both approaches indicated that people who used Trump hashtags shared four times more links to low-quality news outlets than those who used Biden hashtags.
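
A minimal sketch of how domain-level trust ratings like these might be joined to users' shared links is shown below. The domain names, rating values, and column names are hypothetical, and this is not the paper's actual pipeline; the study used 60 real domains rated by eight professional fact-checkers and by politically balanced groups of laypeople.

```python
import pandas as pd

# Hypothetical trust ratings per domain (0 = untrustworthy, 1 = trustworthy).
trust = pd.DataFrame({
    "domain":      ["mainstream.example", "hyperpartisan.example", "fake.example"],
    "factchecker": [0.90, 0.40, 0.10],
    "layperson":   [0.85, 0.45, 0.15],
})

# Hypothetical link shares: one row per link a user shared before the election.
shares = pd.DataFrame({
    "user_id": [1, 1, 2, 2, 2],
    "camp":    ["biden", "biden", "trump", "trump", "trump"],
    "domain":  ["mainstream.example", "mainstream.example",
                "fake.example", "hyperpartisan.example", "fake.example"],
})

# Join each shared link to its domain's ratings, then compare the average
# quality of shared links per camp under both rating sources.
scored = shares.merge(trust, on="domain")
print(scored.groupby("camp")[["factchecker", "layperson"]].mean())
```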

“Prior work identifying political differences in misinformation sharing has been criticized for relying on the judgment of professional fact-checkers. But we show that conservative Twitter users shared much lower quality news, even when relying on ratings from politically-balanced groups of laypeople,” said co-author Dr Mohsen Mosleh, Associate Professor, Oxford Internet Institute, part of the University of Oxford. “This can’t be written off as the result of political bias in the ratings, and means that preferential suspension of conservative users is not necessarily the result of political bias on the part of social media companies.”

The study also found that similar associations between conservatism and low-quality news sharing (based on both expert and politically-balanced layperson ratings) were present in seven other datasets from Twitter, Facebook, and survey experiments, spanning 2016 to 2023 and including data from 16 different countries. For example, the researchers found cross-cultural evidence of conservatives sharing more unambiguously false claims about COVID-19 than liberals, with conservative political elites in the U.K. and Germany also sharing links to lower-quality news sources than liberal political elites.

“The social media users analyzed in this research are not representative of Americans more broadly, so these findings do not necessarily mean that conservatives in general are more likely to spread misinformation than liberals. Also, we’re just looking at this particular period in time,” said Rand. “Our basic point would be the same if it was found that liberal users shared more misinformation and were getting suspended more. Such a pattern of suspension would not be enough to show bias on the part of the companies, because of the differences in users’ behaviour.”

Even under politically neutral anti-misinformation policies, the researchers expect that there would be political asymmetries in enforcement. While the analyses do not rule out the possibility of any bias on the part of platforms, the inequality of sanctions is not diagnostic of bias one way or the other. Policy-makers need to be aware that even if social media companies are working in an unbiased way to manage misinformation on their platforms, there will still be some level of differential treatment across groups.
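
A toy simulation (my own illustration, not taken from the paper) makes this point concrete: if one group simply shares more low-quality links, a single politically neutral threshold rule still suspends that group at a much higher rate. All parameters below are made up.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Assumed behavior: group A shares low-quality links at a higher average
# rate than group B (Poisson means of 3 vs 1 links per month).
low_quality_a = rng.poisson(3.0, n)
low_quality_b = rng.poisson(1.0, n)

# The same neutral rule for everyone: suspend any account that shares
# more than five low-quality links.
threshold = 5
print("group A suspension rate:", (low_quality_a > threshold).mean())
print("group B suspension rate:", (low_quality_b > threshold).mean())
```

The rule never looks at politics, yet the group with the higher rate of rule-breaking behavior ends up suspended far more often, which is the asymmetry the authors describe.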
 
The people who really need to read and understand this are the same ones who are probably going to ignore it or twist it to fit their narrative. Instead of acknowledging the findings that conservatives tend to share more misinformation, leading to more suspensions, they will just double down on the idea that they are being persecuted for their beliefs. It's easier for people like that to believe they're being targeted than to admit they might be wrong or spreading false information. We see it every day on here. :(
 

bnew


used AI to simplify it.



LLM served by Perplexity Labs

Simplified Explanation​

A recent study by researchers from the US and UK looked into why social media platforms like Twitter might suspend accounts of conservative users more often than those of liberal users. Here’s what they found:

Key Points:​

  • Misinformation Sharing: The study discovered that conservative users tend to share false or misleading information (misinformation) more frequently than liberal users.
  • Account Suspensions: Because conservative users share more misinformation, they are more likely to have their accounts suspended by social media platforms.
  • No Bias in Policies: The researchers concluded that the higher rate of suspensions for conservative users is not because social media companies have biased policies against conservatives. Instead, it's because conservative users are sharing more content that violates the platform's rules against misinformation.

How They Did It:​

  • The researchers analyzed data from Twitter users who shared hashtags related to the 2020 U.S. presidential election.
  • They found that users who shared Trump-related hashtags were 4.4 times more likely to be suspended than those who shared Biden-related hashtags.
  • They also checked the quality of news sources shared by these users and found that conservatives shared links from lower-quality news outlets much more often.

Broader Implications:​

  • This study suggests that even if social media companies are enforcing their policies fairly and without bias, there will still be differences in how often different groups get sanctioned because of their behavior.
  • It emphasizes that just because one group gets suspended more often, it doesn't mean the platform is biased against them.

Conclusion:​

The main point is that differences in how often conservative versus liberal accounts get suspended can be explained by differences in user behavior (sharing misinformation), rather than any inherent bias in the policies of social media companies. This research helps clarify why there might seem to be an imbalance in enforcement and encourages policymakers and the public to understand these dynamics better when discussing issues around misinformation and platform moderation.
 