Report: Facebook Execs Knew Their Algorithms Were Sowing Division, But Looked the Other Way


A report from the Wall Street Journal indicates that Facebook's own internal research showed the tech giant's algorithms were sowing division, but executives at the company did nothing to change them, keeping the status quo intact.

Facebook conducted a great deal of research following the 2018 Cambridge Analytica scandal and concluded that its algorithms "exploit the human brain's attraction to divisiveness" and foster polarization. An internal presentation from 2016 also found that "64% of all extremist group joins are due to our recommendation tools."

"Our recommendation systems grow the problem," the researchers wrote. They proposed fixes, such as limiting the reach of content from hyper-partisan and overactive contributors, diversifying the groups suggested to users, and creating sub-groups within larger groups to keep discussions from being derailed into divisive banter.

The fixes were ultimately rejected or diluted by Facebook CEO Mark Zuckerberg and the company's policy chief, Joel Kaplan, according to the report. Kaplan worried that the changes would disproportionately affect conservatives, who are already known to be targeted by the platform with egregious censorship. Kaplan also halted a "Common Ground" project meant to foster less heated political debate on the platform.

Facebook claims it has made many changes since 2016-17 to clean up the platform.

“We’ve learned a lot since 2016 and are not the same company today,” a Facebook spokeswoman told Business Insider. “We’ve built a robust integrity team, strengthened our policies and practices to limit harmful content, and used research to understand our platform’s impact on society so we continue to improve.”

Facebook's vice president of integrity, Guy Rosen, also addressed the report in a blog post published on Wednesday.

“We’ve taken a number of important steps to reduce the amount of content that could drive polarization on our platform, sometimes at the expense of revenues. This job won’t ever be complete because at the end of the day, online discourse is an extension of society and ours is highly polarized. But it is our job to reduce polarization’s impact on how people experience our products. We are committed to doing just that,” Rosen said.

However, Facebook's changes have done little to stop polarization and have only increased draconian censorship of information the political establishment deems wrongthink.

Big League Politics has reported on how Facebook has stopped truthful information about deceased pedophile Jeffrey Epstein and the coronavirus pandemic from circulating on their platform in recent months.

Facebook has become one of the world's most reviled corporations for its Big Brother machinations, and government regulation may be the only way to bring this Orwellian monstrosity to heel.
