Facebook uses violence but does not want you to know it

People rarely commented on, liked, or shared posts. Instead, a “zombie” mode of consumption set in: users scrolled through content apathetically.

On top of this, a series of studies showed that such passive browsing of the social network was detrimental to mental health. In response, Facebook began rolling out changes meant to promote “meaningful social interactions”, or MSI, always under the banner of strengthening ties between users and improving their quality of life.

Mark Zuckerberg presented Facebook’s new direction as a sacrifice on his part and the company’s. On his own account, he wrote:

Now, I want to be clear: by making these changes, I expect the time people spend on Facebook and some measures of engagement will go down. But I also expect the time you do spend on Facebook will be more valuable. And if we do the right thing, I believe that will be good for our community and our business over the long term too.

The modifications were based on assigning “points” to each type of interaction. In short, the more comments, reactions, likes and shares a post received, the more reach it would get.

The changes worked. But at what cost?

According to internal memos exchanged among the company’s employees, the platform was becoming an increasingly violent space.

The algorithm change and its effect on politics


In order to reach more people organically – that is, without paying for promotion – the pages of political parties began publishing content that traded on sensationalism and outrage.

The math is simple: if a post generates outrage and anger, people argue in the comments. Those arguments produce long justifying replies, angry shares, and reactions to the comments other users find most compelling. All of this makes a post’s reach skyrocket.
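The dynamic described above can be sketched in a few lines of code. This is an illustrative model only: the interaction types and point values below are assumptions for demonstration, not Facebook’s actual weights.

```python
# Hypothetical sketch of MSI-style weighted scoring.
# The weights are illustrative assumptions, not Facebook's real values;
# the point is only that effortful interactions (comments, reshares)
# dominate the score, so outrage-bait outranks quietly liked posts.

def msi_score(post, weights=None):
    """Score a post by summing weighted interaction counts."""
    if weights is None:
        weights = {"like": 1, "reaction": 5, "reshare": 15, "comment": 30}
    return sum(w * post.get(kind, 0) for kind, w in weights.items())

# A calm post with many likes vs. an outrage post with a long comment thread.
calm = {"like": 200, "reaction": 10, "reshare": 2, "comment": 5}
angry = {"like": 50, "reaction": 80, "reshare": 40, "comment": 120}

feed = sorted([("calm", calm), ("angry", angry)],
              key=lambda item: msi_score(item[1]), reverse=True)
# The angry post scores far higher, so the feed amplifies it first.
```

Under any weighting of this shape, a post that provokes hundreds of comments and shares will outrank one that is merely liked, which is the feedback loop the leaked documents describe.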

“Misinformation, toxicity and violent content are inordinately prevalent among reshares”, read one of the internal documents, written by Facebook’s data researchers.

The leaked documents also record that political parties in different parts of the world adapted their communication strategies to provoke strong reactions from their followers, commonly fury and anger, and thus make their posts reach more people.

They even admitted that this model was worrisome as it endangered democracy.

Years later, this was made plain by the storming of the Capitol in the United States. Groups sympathetic to Donald Trump – who had lost the election – entered the building armed while Congress was certifying Joe Biden’s victory. After the incident, in which lawmakers’ lives were put at risk, Facebook found itself at the center of criticism for doing nothing to stop extremist political organizations.


A blow to the media

Political parties were not the only ones affected. The algorithm change also dealt a heavy blow to the media.

According to Comscore, an online data firm, during the first half of 2018, BuzzFeed dropped 13% in traffic compared to the previous six months, Breitbart lost 46% and ABC News lost 12%.

The solution many outlets found was to lower the quality of their content to make their posts more “shareable”.

“MSI ranking isn’t actually rewarding content that drives meaningful social interactions”, said BuzzFeed CEO Jonah Peretti in an email addressed to a Facebook official.

Peretti went on to explain that the new algorithm favored content about racial violence (a deep-seated conflict in the United States), “fad/junk science,” “extremely disturbing news” and unpleasant images.

Mark Zuckerberg’s position


Facebook knows this is happening, and so does Mark Zuckerberg. The company’s integrity team – whose objective is to maintain the quality and credibility of content – presented modifications that could be made to the algorithm to reverse these negative effects.

The suggested changes included ways to reduce the proliferation of fake news and divisive content. But the CEO rejected many of these options because they would cut directly into the MSI metrics that had boosted Facebook’s performance so much.

In 2019, the company began weighing changes to curb the spread of fake content with high viral potential, but it was not until the spring of 2020 that Facebook applied modifications of this kind, and only to health-related content.

The objective was to reduce fake news around the pandemic. Later, a group of data scientists proposed extending these measures to other types of content to begin to reverse the damage.

Once again, the leaked documents show Mark Zuckerberg making it clear that he would not approve any change that might reduce users’ presence on his platform.
