Facebook admitted that data it shared with researchers in February 2020 was severely flawed. The flaw discredits any study of so-called “misinformation,” alleged radicalization or political polarization that relied on the Facebook data.

The New York Times reported that Facebook’s attempt to be more forthcoming with the user behavior data it provides to researchers went badly wrong. In a call with Social Science One, a group of researchers organized to work with Facebook on new research projects using its data, Facebook admitted that it had left approximately half of its U.S. user data out of the data set. Worse, according to The Times, people on the call claimed that Facebook representatives “said that they still did not know whether other aspects of the data set were affected.”

However, it was not just a random half of the data that went missing. The missing data appears to have covered users who were apolitical or politically neutral. That means the data set, and any results drawn from it, skewed toward more politicized users.

According to The Times, Facebook admitted the half that was provided was for U.S. users “who engaged with political pages enough to make their political leanings clear.” 

Studies that used the flawed data to evaluate how many users were being radicalized or polarized by so-called political misinformation, then, in all likelihood reached faulty conclusions.

The data was provided to “at least 110 researchers” and was used in “dozens of papers,” according to The Times. It is unclear which or how many papers relied on the flawed data. A complete list of every paper that used the data would help minimize skepticism of research in the affected subjects. And because research papers typically cite other research papers, that list would also need to include any papers that cited the ones built on the flawed data.

Facebook reportedly called the omission a “technical error.” The Times also noted that a researcher on the call where Facebook admitted the mistake “expressed concern that Facebook was either negligent, or, worse, actively undermining the research.”

Any and all studies since February 2020 that looked into so-called “misinformation” on Facebook, or the alleged radicalization or polarization of Facebook users, can’t be trusted. And many of those studies were already suspect because they were used to push pro-censorship agendas.

Any conclusions from such studies should be considered invalid unless researchers show they did not use the faulty Facebook data set. Every study that used the flawed data should be retracted until a new analysis has been completed.


Conservatives are under attack. Contact Facebook headquarters at 1-650-308-7300 and demand that the platform provide transparency: Companies need to design open systems so that they can be held accountable, while giving weight to privacy concerns. If you have been censored, contact us using CensorTrack’s contact form, and help us hold Big Tech accountable.