
Facebook’s Evidence that Its Content was Divisive was Ignored by Company Execs


With all the negative press that social media giants are receiving these days for suppressing free speech and curtailing the expression of thoughts that do not conform to a politically left-wing agenda, it appears that at least one of these companies was complicit in rigging its algorithms to “exploit the human brain’s attraction to divisiveness,” according to a report on Tuesday in the Wall Street Journal.

The WSJ reported that Facebook had evidence that its algorithms encourage polarization, but that solutions to rectify this egregious problem were dismissed or weakened by executives of the company, including CEO Mark Zuckerberg.

Facebook decided to look into the behavior of its users as a result of the infamous Cambridge Analytica scandal, the WSJ reported. The report also indicated that rather than having the desired effect of “connecting the world,” Facebook was serving to divide people and cause irreparable dissension.

A 2016 report revealed that 64% of the people who joined extremist or hate groups on Facebook predicated on racial and religious theories found out about those groups through Facebook recommendation tools such as “Groups You Should Join” and “Discover.”

The researchers told the WSJ that the recommendation system algorithms foster the problem of hate groups expanding their memberships on Facebook.
The folks at Facebook reportedly brainstormed fixes to this burgeoning problem and came up with multiple options.

According to the WSJ, those solutions include limiting the spread of information from groups’ most hyperactive and hyperpartisan users, suggesting a wider variety of groups than users might normally encounter, and creating subgroups for heated debates to prevent them from derailing entire groups.
While the aforementioned options may sound workable, the WSJ reported that the proposals for the fixes were jettisoned or presented in a weakened format by Zuckerberg and Joel Kaplan, the chief of policy at Facebook.

Addressing head-on the problem of users becoming increasingly divided and polarized was not a top priority for Zuckerberg, according to the WSJ report, as he expressed great concern about the potential of these fixes to limit user growth.

When approached by his team to consider the idea of limiting the dissemination of posts from the most incessant users, Zuckerberg offered a diluted rendition of the pitch and, according to the WSJ report, then asked his associates not to raise the matter with him again.

The researchers at Facebook also concluded that because the social media platform is top-heavy with “far-right” content, even modest changes such as the reduction of clickbait would have disproportionately affected conservatives, according to the WSJ report.

Not wanting to limit user growth or turn anyone off, Facebook policy chief Kaplan nixed the “Common Ground” project, whose objective was to “encourage healthier political discourse” on the platform.

In the end, none of the fixes proffered by Facebook team members made it into the company’s products.

The WSJ reported that Facebook managers told employees in September 2018 that the company was pivoting “away from societal good to individual value.”

Speaking to the Business Insider website, a Facebook spokeswoman said, “We’ve learned a lot since 2016 and are not the same company today. We’ve built a robust integrity team, strengthened our policies and practices to limit harmful content, and used research to understand our platform’s impact on society so we continue to improve.”
Guy Rosen, the Vice President of Integrity at Facebook, also reacted to the WSJ’s eye-opening reporting. In a blog post that appeared on Wednesday, Rosen wrote that Facebook has “taken a number of steps to fight polarization, such as prioritizing content from family and friends in users’ newsfeeds, not recommending groups that violate its terms, prohibiting hate speech and content that could cause real-world harm, and partnering with fact check groups.”

For many years, Facebook has faced the wrath of its staunchest critics, who feel that the social media platform has not implemented an effective plan to stop the maelstrom of questionable content.
