A new report is urging Congress to mandate greater transparency from Big Tech rather than overhaul Section 230.

But will that be enough to protect conservative free speech online?

Ranking Digital Rights (RDR) released a report Tuesday headlined “It’s Not Just the Content, It’s the Business Model: Democracy’s Online Speech Challenge,” as federal lawmakers look into reining in Big Tech over concerns about free speech, online child exploitation, privacy and misinformation.

The non-profit, housed at Soros-funded New America and affiliated with the Open Technology Institute, called on tech companies “to be much more transparent about how they work.” It also cautioned against making changes to Section 230 of the Communications Decency Act of 1996.

“[I]nstead of seeking to hold digital platforms liable for content posted by their users, regulators and advocates should instead focus on holding companies accountable for how content is amplified and targeted,” RDR argued in its report. “It’s not just the content, but tech companies’ surveillance-based business models that are distorting the public sphere and threatening democracy.”

In a recent speech, however, Deputy Attorney General Jeffrey A. Rosen asked whether Big Tech should get full statutory immunity for removing lawful speech and be “given carte blanche as a censor” when the content is not “obscene, lewd, lascivious, filthy, excessively violent, or harassing” under Section 230 of the Communications Decency Act.

Rosen briefly summarized Section 230 as applied to a social media site as follows:

“Under Section 230, the social media site is not liable for what the user says, although the user themselves may still be liable. Section 230 also immunizes a website from some liability for ‘in good faith’ removing illicit user-generated content that is ‘obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable.’”

The RDR report recommended that federal legislators consider requiring online platforms to do the following, according to Axios:

“[P]ublish their rules for what content and targeted advertising they allow; issue reports on the content and ads they take down for breaking the rules; and explain the algorithms that determine what ends up on someone’s screen.”

“This kind of transparency is not the end goal,” said Nathalie Maréchal, one of the report’s authors, according to Axios. “This kind of transparency is a necessary first step toward accountability.”

U.S. Attorney General William Barr took a different tack during his introductory statement to the recent Department of Justice Section 230 Workshop.

“No longer are tech companies the underdog upstarts,” said Barr on Feb. 19. “They have become titans of U.S. industry.” This boom has left consumers with fewer options, the attorney general said.

“The lack of feasible alternatives is relevant in the Section 230 discussion -- both for those citizens who want safer online spaces and for those whose speech has been banned or restricted by these platforms,” he said. Barr also said that due to the rise of algorithms, content moderation, and recommendations, the lines between “passively hosting third-party speech and actively curating or promoting speech” were blurred.

Indeed, Rosen pointed out the Good Samaritan provision of Section 230 in his recent speech, noting that “platforms have the ability to remove content that they have a ‘good faith’ belief is ‘obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable.’”

But he also made one distinction clear: Big Tech lacks oversight and has been given “a blank check, ignoring the ‘good faith’ requirement and relying on the broad term ‘otherwise objectionable’ as carte blanche to selectively remove anything from websites for other reasons and still claim immunity.”

MRC TechWatch Senior Analyst Corinne Weaver and MRC TechWatch Contributing Writer Alexander Hall contributed to this report.