
Facebook froze as anti-vaccine comments swarmed users

In this April 10, 2018, file photo, Facebook CEO Mark Zuckerberg testifies before a joint hearing of the Commerce and Judiciary Committees on Capitol Hill in Washington. Last spring, as false claims about vaccine safety threatened to undermine the world’s response to COVID-19, researchers at Facebook wrote that they could reduce vaccine misinformation by tweaking how vaccine posts show up on users’ newsfeeds, or by turning off comments entirely. Yet despite internal documents showing these changes worked, Facebook was slow to take action. Credit: AP Photo/Alex Brandon, File

In March, as claims about the dangers and ineffectiveness of coronavirus vaccines spun across social media and undermined attempts to stop the spread of the virus, some Facebook employees thought they had found a way to help.

By altering how posts about vaccines are ranked in people’s newsfeeds, researchers at the company realized they could curtail the misleading information individuals saw about COVID-19 vaccines and offer users posts from legitimate sources like the World Health Organization.

“Given these results, I’m assuming we’re hoping to launch ASAP,” one Facebook employee wrote, responding to the internal memo about the study.

Instead, Facebook shelved some suggestions from the study. Other changes weren’t made until April.

When another Facebook researcher suggested disabling some comments on vaccine posts in March until the platform could do a better job of tackling anti-vaccine messages lurking in them, that proposal was ignored at the time.

Critics say the reason Facebook was slow to take action on the ideas is simple: The tech giant worried it might impact the company’s profits.

“Why would you not remove comments? Because engagement is the only thing that matters,” said Imran Ahmed, the CEO of the Center for Countering Digital Hate, an internet watchdog group. “It drives attention and attention equals eyeballs and eyeballs equal ad revenue.”

In an emailed statement, Facebook said it has made “considerable progress” this year with downgrading vaccine misinformation in users’ feeds.

Facebook’s internal discussions were revealed in disclosures made to the Securities and Exchange Commission and provided to Congress in redacted form by former Facebook employee-turned-whistleblower Frances Haugen’s legal counsel. The redacted versions received by Congress were obtained by a consortium of news organizations, including The Associated Press.

The trove of documents shows that in the midst of the COVID-19 pandemic, Facebook carefully investigated how its platforms spread misinformation about life-saving vaccines. They also reveal rank-and-file employees regularly suggested solutions for countering anti-vaccine content on the site, to no avail. The Wall Street Journal reported on some of Facebook’s efforts to deal with anti-vaccine comments last month.

In this Aug. 20, 2021, file photo, protesters against vaccine and mask mandates demonstrate near the state capitol in Santa Fe, New Mexico. Credit: AP Photo/Cedar Attanasio, File

Facebook’s response raises questions about whether the company prioritized controversy and division over the health of its users.

“These people are selling fear and outrage,” said Roger McNamee, a Silicon Valley venture capitalist and early investor in Facebook who is now a vocal critic. “It is not a fluke. It is a business model.”

Typically, Facebook ranks posts by engagement—the total number of likes, dislikes, comments, and reshares. That ranking scheme may work well for innocuous subjects like recipes, dog photos, or the latest viral singalong. But Facebook’s own documents show that when it comes to divisive public health issues like vaccines, engagement-based ranking only emphasizes polarization, disagreement, and doubt.
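
As a rough illustration of the two approaches described here, the following is a minimal, hypothetical sketch; the post fields, trust scores, and sample numbers are assumptions for the example and do not represent Facebook's actual ranking code.

```python
# Hypothetical illustration only; not Facebook's actual ranking system.
# Field names, trust scores, and sample figures are assumptions.
from dataclasses import dataclass

@dataclass
class Post:
    source: str
    likes: int
    comments: int
    reshares: int

# Assumed trust scores for illustration (higher = more authoritative source).
TRUST = {"who.int": 1.0, "cdc.gov": 1.0, "randomblog.example": 0.1}

def engagement_score(p: Post) -> float:
    # Rank purely by interaction volume, regardless of who posted it.
    return p.likes + p.comments + p.reshares

def trust_score(p: Post) -> float:
    # Rank by how authoritative the source is, not by how much reaction it drew.
    return TRUST.get(p.source, 0.5)

posts = [
    Post("randomblog.example", likes=900, comments=400, reshares=300),
    Post("who.int", likes=120, comments=30, reshares=15),
]

print([p.source for p in sorted(posts, key=engagement_score, reverse=True)])
# ['randomblog.example', 'who.int']  <- engagement ranking favors the high-reaction post
print([p.source for p in sorted(posts, key=trust_score, reverse=True)])
# ['who.int', 'randomblog.example']  <- trust ranking favors the authoritative source
```

In this toy example, the engagement-based ordering surfaces the post that drew the most reactions, while the trust-based ordering surfaces the authoritative source first, regardless of how much reaction it drew.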

To study ways to reduce vaccine misinformation, Facebook researchers changed how posts are ranked for more than 6,000 users in the U.S., Mexico, Brazil, and the Philippines. Instead of seeing posts about vaccines that were chosen based on their popularity, these users saw posts selected for their trustworthiness.

The results were striking: a nearly 12% decrease in content that made claims debunked by fact-checkers and an 8% increase in content from authoritative public health organizations such as the WHO or U.S. Centers for Disease Control. Those users also had a 7% decrease in negative interactions on the site.

Employees at the company reacted to the study with exuberance, according to internal exchanges included in the whistleblower’s documents.

“Is there any reason we wouldn’t do this?” one Facebook employee wrote in response to an internal memo outlining how the platform could rein in anti-vaccine content.

Facebook said it did implement many of the study’s findings—but not for another month, a delay that came at a pivotal stage of the global vaccine rollout.

In a statement, company spokeswoman Dani Lever said the internal documents “don’t represent the considerable progress we have made since that time in promoting reliable information about COVID-19 and expanding our policies to remove more harmful COVID and vaccine misinformation.”

The company also said it took time to consider and implement the changes.

In this Feb. 23, 2018, file photo, Dr. Roberto Ieraci vaccinates a woman at a vaccination center in Rome. Credit: AP Photo/Alessandra Tarantino, File

Yet the need to act urgently couldn’t have been clearer: At that time, states across the U.S. were rolling out vaccines to their most vulnerable—the elderly and sick. And public health officials were worried. Only 10% of the population had received their first dose of a COVID-19 vaccine. And a third of Americans were thinking about skipping the shot entirely, according to a poll from The Associated Press-NORC Center for Public Affairs Research.

Despite this, Facebook employees acknowledged they had “no idea” just how bad anti-vaccine sentiment was in the comments sections on Facebook posts. But company research in February found that as much as 60% of the comments on vaccine posts were anti-vaccine or vaccine reluctant.

“That’s a huge problem and we need to fix it,” the presentation on March 9 read.

Even worse, company employees admitted they didn’t have a handle on catching those comments. And if they did, Facebook didn’t have a policy in place to take the comments down. The free-for-all was allowing users to swarm vaccine posts from news outlets or humanitarian organizations with negative comments about vaccines.

“Our ability to detect (vaccine hesitancy) in comments is bad in English—and basically non-existent elsewhere,” another internal memo posted on March 2 said.

Los Angeles resident Derek Beres, an author and fitness instructor, sees anti-vaccine content thrive in the comments every time he promotes immunizations on his accounts on Instagram, which is owned by Facebook. Last year, Beres began hosting a podcast with friends after they noticed conspiracy theories about COVID-19 and vaccines were swirling on the social media feeds of popular health and wellness influencers.

Earlier this year, when Beres posted a picture of himself receiving the COVID-19 shot, some on social media told him he would likely drop dead in six months’ time.

“The comments section is a dumpster fire for so many people,” Beres said.

Anti-vaccine comments on Facebook grew so bad that even as prominent public health agencies like UNICEF and the World Health Organization were urging people to take the vaccine, the organizations refused to use free advertising that Facebook had given them to promote inoculation, according to the documents.

Some Facebook employees had an idea. While the company worked to hammer out a plan to curb all the anti-vaccine sentiment in the comments, why not disable commenting on posts altogether?

In this Sept. 23, 2021, file photo, Oumie Nyassi shows a video circulating on the internet, confirmed to be false, of a woman claiming she was magnetized after receiving the COVID-19 vaccine, in a doctor’s office at a hospital in Serrekunda, Gambia. Credit: AP Photo/Leo Correa, File

“Very interested in your proposal to remove ALL in-line comments for vaccine posts as a stopgap solution until we can sufficiently detect vaccine hesitancy in comments to refine our removal,” one Facebook employee wrote on March 2.

The suggestion went nowhere until mid-April, when Lever said the company stopped showing previews of popular comments on vaccine posts.

Instead, Facebook CEO Mark Zuckerberg announced on March 15 that the company would start adding labels to posts about vaccines noting that the shots are safe.

The move allowed Facebook to continue to get high engagement—and ultimately profit—off anti-vaccine comments, said Ahmed of the Center for Countering Digital Hate.

“They were trying to find ways to not reduce engagement but at the same time make it look like they were trying to make some moves toward cleaning up the problems that they caused,” he said.

It’s unrealistic to expect a multi-billion-dollar company like Facebook to voluntarily change a system that has proven to be so lucrative, said Dan Brahmy, CEO of Cyabra, an Israeli tech firm that analyzes social media networks and disinformation. Brahmy said government regulations may be the only thing that could force Facebook to act.

“The reason they didn’t do it is because they didn’t have to,” Brahmy said. “If it hurts the bottom line, it’s undoable.”

Bipartisan legislation in the U.S. Senate would require social media platforms to give users the option of turning off algorithms tech companies use to organize individuals’ newsfeeds.

Sen. John Thune, R-South Dakota, a sponsor of the bill, asked Facebook whistleblower Haugen to describe the dangers of engagement-based ranking during her testimony before Congress earlier this month.

She said there are other ways of ranking content—for instance, by the quality of the source, or chronologically—that would serve users better. The reason Facebook won’t consider them, she said, is that they would reduce engagement.

In this Oct. 5, 2021, file photo, former Facebook data scientist Frances Haugen speaks during a hearing of the Senate Commerce, Science, and Transportation Subcommittee on Consumer Protection, Product Safety, and Data Security on Capitol Hill in Washington. Credit: AP Photo/Alex Brandon, File

“Facebook knows that when they pick out the content … we spend more time on their platform, they make more money,” Haugen said.

Haugen’s leaked documents also reveal that a relatively small number of Facebook’s anti-vaccine users are rewarded with big pageviews under the tech platform’s current ranking system.

Internal Facebook research presented on March 24 warned that most of the “problematic vaccine content” was coming from a handful of areas on the platform. In Facebook communities where vaccine distrust was highest, the report pegged 50% of anti-vaccine pageviews on just 111—or 0.016%—of Facebook accounts.

“Top producers are mostly users serially posting (vaccine hesitancy) content to feed,” the research found.

On that same day, the Center for Countering Digital Hate published an analysis of social media posts estimating that just a dozen Facebook users were responsible for 73% of anti-vaccine posts on the site between February and March. In August, Facebook’s leaders publicly dismissed that study as “faulty,” even though the company’s own internal research months earlier had confirmed that a small number of accounts drive anti-vaccine sentiment.

Earlier this month, an AP-NORC poll found that most Americans blame social media companies, like Facebook, and their users for misinformation.

But Ahmed said Facebook shouldn’t just shoulder blame for that problem.

“Facebook has taken decisions which have led to people receiving misinformation which caused them to die,” Ahmed said. “At this point, there should be a murder investigation.”




© 2021 The Associated Press. All rights reserved. This material may not be published, broadcast, rewritten or redistributed without permission.


