Zuckerberg should focus on election misinformation, not the metaverse

Bloomberg Opinion — Give Mark Zuckerberg credit. Faced with criticism of the radical strategic shift he has chosen for Facebook, he remains steadfast in turning it into a metaverse company. Other tech billionaires may lash out at dissent, but Zuckerberg remains stoic, setting aside the noise to give serious interviews and presentations about his vision for virtual reality.

However, while he can ignore the criticism, the CEO of Facebook’s parent company, Meta Platforms Inc. (META), should rethink his priorities in the coming months as the United States heads into a potentially tumultuous midterm election. He needs to refocus his attention on Facebook, or he risks letting misleading videos about voter fraud proliferate, potentially disrupting the democratic process once again.

Zuckerberg could start by doing what thousands of managers have done before him and reordering his priorities.

The metaverse project is still in its infancy: While Facebook has some 3 billion active users, Horizon Worlds, the VR platform that underlies the metaverse experience, only has 200,000, according to internal documents revealed by the Wall Street Journal.

Zuckerberg has been candid in stating that the Meta metaverse will not be fully realized for another five years or more. All the more reason, then, that his passion project can afford to lose his attention for a few months, or at least during critical moments for democracy.

So far he has shown no signs of changing his approach. Facebook’s main election team no longer reports directly to Zuckerberg as it did in 2020, according to the New York Times, when Zuckerberg made that year’s US election his top priority.

Meta has also loosened the reins on key executives tasked with managing electoral disinformation. Head of global affairs Nick Clegg now splits his time between the UK and Silicon Valley, and Guy Rosen, the company’s chief information security officer, has relocated to Israel, a company spokesperson confirmed by email.

Researchers who track disinformation on social media say there’s little evidence that Facebook is any better at stopping conspiracy theories now than it was in 2020. Melanie Smith, who leads disinformation research at the Institute for Strategic Dialogue, a London-based nonprofit, says the company has not improved access to data for outside researchers trying to quantify the spread of misleading messages. Anecdotally, they continue to proliferate, she said. Smith said she found Facebook groups recruiting poll watchers, ostensibly for the purpose of intimidating voters on Election Day.

She also pointed to a video posted by Florida Congressman Matt Gaetz on his Facebook page, in which he said the 2020 election had been stolen. The video has been viewed over 40,000 times as of this writing. Although it was published a month ago, it carries no fact-check warning label.

Smith also cited recent Facebook posts, shared hundreds of times, inviting people to events to discuss how “Chinese Communists” are running local elections in the US, or posters stating that certain politicians should “go to jail for their role in the stolen election.” Posts made by candidates tend to spread especially widely, Smith said.

Meta has said that its main approach to handling such content through the 2022 midterm elections will be warning labels. But warning labels are not very effective. For more than 70% of disinformation posts on Facebook, those labels are applied two days or more after publication, long after the post has had a chance to spread, according to a study by the Integrity Institute, a nonprofit research organization run by former employees of large technology companies. Studies have shown that misinformation gets 90% of its total social media engagement in less than a day.

The problem, ultimately, is the way Facebook shows people the content that is most likely to keep them on the site, what whistleblower Frances Haugen has called engagement-based ranking. A better approach would be “quality-based ranking,” similar to Google’s page ranking system, which favors consistently reliable sources of information, according to Jeff Allen, former Meta data scientist and co-founder of the Integrity Institute.

Facebook’s growing emphasis on video may compound the problem. In September 2022, misinformation was being shared much more often through videos than through regular Facebook posts, Allen said, citing a recent Integrity Institute study. False content tends to get more engagement than truthful content, he added, so it tends to be favored by an engagement-based system.

In 2020, Facebook rolled out “break-glass” measures to counter a wave of posts claiming the election had been stolen from then-President Donald Trump, posts that ended up contributing to the storming of the US Capitol on January 6.

Meta should never have to resort to such drastic measures again. If Zuckerberg is serious about connecting people, and doing so responsibly, he should come out of his virtual-reality bubble and reexamine the ranking system that keeps eyes glued to Facebook content. At a minimum, he could communicate to his employees, and to the public, that he is again making election integrity a priority. The metaverse can wait.

This note does not necessarily reflect the opinion of the editorial board or of Bloomberg LP and its owners.
