No, Fauci is not profiting from a coming book on lessons he’s learned from his public service.

“Expect the Unexpected,” compiled from Dr. Anthony S. Fauci’s speeches and interviews, was prematurely listed for presale, a spokeswoman for the publisher said.
Credit: Anna Moneymaker for The New York Times

In the past few days, after the listing for a coming book by Dr. Anthony S. Fauci, the Biden administration’s top adviser on Covid-19, was taken down from Amazon’s and Barnes & Noble’s websites, right-wing outlets and social media commentators spread the rumor that it had been removed because of public backlash to the idea of Dr. Fauci’s “profiteering” from the pandemic.

In truth, Dr. Fauci is not making any money from the book, which is about lessons he has learned during his decades in public service, and the listing was pulled for a simple reason: the publisher had posted it too early.

Dr. Fauci “will not earn any royalties from its publication and was not paid” for the book, “Expect the Unexpected,” said Ann Day, a spokeswoman for National Geographic Books, its publisher. She said Dr. Fauci also would not earn anything for a related documentary. (Dr. Fauci did not respond to a request for comment.)

The book, which compiles interviews and speeches given by Dr. Fauci during his 37 years as the director of the National Institute of Allergy and Infectious Diseases, was taken off the websites because “it was prematurely posted for presale,” Ms. Day said. She added that proceeds would “go back to the National Geographic Society to fund work in the areas of science, exploration, conservation and education and to reinvest in content.”

In a statement, the institute noted that the book had not been written by Dr. Fauci himself. It also confirmed that he would not earn any royalties from its publication.

The falsehood about the book and Dr. Fauci spread widely online. On May 31, the right-wing outlet The Daily Caller published an article about the book’s appearing for presale online. Some conservative Republicans, including Representatives Andy Biggs of Arizona and Dan Bishop of North Carolina, seized on the article and claimed without evidence that Dr. Fauci would be profiting from the book.

“His lockdown mandates destroyed livelihoods and threatened our children’s futures,” Mr. Bishop posted on Twitter on June 1. “Now he’ll be profiting nicely off it.” The post was liked and shared more than 2,700 times.

That same day, Newsweek and Fox News published articles highlighting the “backlash” that Dr. Fauci faced from right-wing commentators “for profiting from pandemic” after the announcement of his book. The articles did not mention that he would not make money from the book. They reached as many as 20.1 million people on Facebook, according to data from CrowdTangle, a social media analytics tool owned by the social network.

On June 2, a conservative outlet, Just the News, posted an article asserting that Dr. Fauci’s book had been “scrubbed” from Amazon and Barnes & Noble because of the backlash. The founder of the site, John Solomon — a Washington media personality who was instrumental in pushing falsehoods about the Bidens and Ukraine — tweeted the misleading article. So did the pro-Trump activist Jack Posobiec, who once promoted the false Pizzagate conspiracy.

“Books are removed from bn.com from time to time if the details are loaded incorrectly,” a Barnes & Noble spokeswoman said in a statement to The Times. “This book was not removed proactively by Barnes & Noble. We expect it will be available again shortly for purchase as soon as the publisher decides to list it.” Amazon did not comment.

Some articles on June 2, including on Fox News and The Daily Mail, included similar comments from National Geographic Books. But many outlets on the far right continued to push the version of events that the book had been “scrubbed” from online listings because of the backlash, without the updated information. The articles collected more than 32,000 likes and shares on Facebook and reached as many as six million people on Facebook, according to CrowdTangle data.

Days later, people like the Fox News host Sean Hannity and Representative Ronny Jackson, a Republican from Texas and former President Donald J. Trump’s onetime doctor, continued to push the false idea on Twitter.

“Anthony Fauci is set to make a fortune on his upcoming book; meanwhile our country continues to SUFFER from his ENDLESS non-scientific policies,” Mr. Jackson said on Twitter. His post collected nearly 4,000 likes, comments and shares.

Jacob Silver contributed research.

Michael T. Flynn, center, at a Dec. 12 rally in Washington to protest the presidential election results.
Credit: Jonathan Ernst/Reuters

Michael T. Flynn, a former national security adviser, suggested on Sunday at a conference organized by followers of the QAnon conspiracy theory that a Myanmar-style military coup was needed in the United States.

A day later, despite videos of his comments circulating on TV and online, Mr. Flynn denied ever promoting the idea. “I am no stranger to media manipulating my words,” he posted on Monday to the messaging app Telegram.

Since then, something interesting has happened: His claims of media distortion have not taken off among his conservative supporters online, while the left has widely circulated and criticized his comments.

News stories and videos covering Mr. Flynn’s call for a coup gathered 675,000 likes and shares on Facebook and Twitter, according to a New York Times analysis. His denial, in comparison, collected only around 61,000 likes and shares on Facebook and Twitter.

Only a few big accounts on the right shared his denial in earnest, including Sid Miller, Texas’ agriculture commissioner and an outspoken supporter of Mr. Trump, whose post collected 68 likes and shares. Other shares came from right-wing partisan Facebook pages with names like Apostolic Conservatives Show and A Little to the Right.

By Wednesday, the chatter from right-wing accounts had died out, while many more left-leaning accounts kept up the discussion — but only to share their incredulity at Mr. Flynn’s original comments and his attempt to deny and reframe the call for a coup.

For example, the left-leaning Facebook pages Occupy Democrats, Being Liberal and Ridin’ With Biden were among the top sharers of Mr. Flynn’s comments.

“Should Mike Flynn get sent to prison for calling for a military coup against American democracy to violently reinstate Trump?” said one meme posted by Occupy Democrats on Tuesday. That post alone collected more than 11,500 likes and shares.

Jacob Silver contributed reporting.

Videos by Associated Press and Reuters

For months, popular social media posts have cited an unverified national health database to falsely suggest that Covid-19 vaccines have caused thousands of deaths, possibly even more than the virus itself.

These claims have been repeatedly debunked. But they remain in circulation as prominent public figures like the Fox News host Tucker Carlson continue to promote them.

“Between late December of 2020 and last month, a total of 3,362 people apparently died after getting the Covid vaccine in the United States,” Mr. Carlson said on his show on Wednesday, citing the Centers for Disease Control and Prevention’s Vaccine Adverse Event Reporting System, or VAERS. “That’s an average of roughly 30 people every day. The actual number is almost certainly higher than that, perhaps vastly higher than that.”

But, as the federal Department of Health and Human Services notes in a disclaimer on its website, the database relies on self-reporting, and its reports may include unverified information.

“VAERS reports alone cannot be used to determine if a vaccine caused or contributed to an adverse event or illness,” the disclaimer reads. “The reports may contain information that is incomplete, inaccurate, coincidental or unverifiable. In large part, reports to VAERS are voluntary, which means they are subject to biases.”

When the C.D.C. examined VAERS reports on Covid-19 vaccines administered from Dec. 14 to May 3, it found 4,178 reports of deaths among people who had received one. The agency noted, however, that “a review of available clinical information, including death certificates, autopsy and medical records, has not established a causal link to Covid-19 vaccines.”

Reports have indicated a “plausible causal relationship” between Johnson & Johnson’s vaccine and a rare blood clotting disorder, according to the C.D.C. Three people who had received that vaccine and developed the blood clot illness died, according to a separate C.D.C. study.

Experts emphasized that the database was a useful tool to flag early warning signs for vaccine safety, but that it was not a replacement for studies on the effects of vaccines or for active monitoring of side effects.

“It’s a big net to catch everything, not a way of evaluating what problems are actually caused by vaccines,” said Anna Kirkland, a professor at the University of Michigan and the author of a recent book on vaccine injury claims. “‘Died after getting a vaccine’ could mean you died in a car accident, you died of another disease you already had or anything else.”

Professor Kirkland also warned that lawyers and activists who wanted to make vaccines look more dangerous filed reports to the database and then cited those reports as evidence of danger.

Laura Scherer, a professor at the University of Colorado School of Medicine and the author of a study on the database and the HPV vaccine, called Mr. Carlson’s claim “a gross misuse of VAERS” and “fundamentally misleading.”

“VAERS reports accept a lot of noise in order to have a chance of being able to pick up on potentially important effects,” she said. “The key is that it is always necessary to follow up on those reported events with high-quality research.”

As an example of unsubstantiated suspicions captured in the database, Dr. Scherer cited a report she came across attributing a sudden death to the HPV vaccine three months after the vaccine was administered — an assertion, she said, that was extremely unlikely.

Mr. Carlson responded to criticisms on Thursday night by acknowledging that the database was unverified, but he maintained his suspicions over the vaccines, saying that “more deaths have been connected to the new Covid vaccines over the past four months than to all previous vaccines combined.”

That might be because of the enormous scale of the Covid-19 vaccination drive, an effort not seen in many decades.

“If you have millions of people getting a vaccine, and a lot of suspicion circulating about that vaccine, then you would expect to see more VAERS reports,” Dr. Scherer said. “But this does not mean that the vaccine caused any of these events, and an increase in reporting does not necessarily mean that this vaccine is more dangerous than other vaccines.”

A nurse administered a vaccine in Los Angeles earlier this month.
Credit: Allison Zaucha for The New York Times

In recent weeks, people who oppose Covid vaccinations have spread a claim that is not only false but defies the rules of biology: that being near someone who has received a vaccine can disrupt a woman’s menstrual cycle or cause a miscarriage.

The idea, promoted on social media by accounts with hundreds of thousands of followers, is that vaccinated people might shed vaccine material, affecting people around them as though it were secondhand smoke. This month, a private school in Florida told employees that if they got vaccinated, they could not interact with students because “we have at least three women with menstrual cycles impacted after having spent time with a vaccinated person.”

In reality, it is impossible to experience any effects from being near a vaccinated person, because none of the vaccine ingredients are capable of leaving the body they were injected into.

The vaccines currently authorized for use in the United States instruct your cells to make a version of the spike protein found on the coronavirus, so your immune system can learn to recognize it. Different vaccines use different vehicles to deliver the instructions — for Moderna and Pfizer, messenger RNA, or mRNA; for Johnson & Johnson, an adenovirus genetically modified to be inactive and harmless — but the instructions are similar.

“It’s not like it’s a piece of the virus or it does things that the virus does — it’s just a protein that’s the same shape,” said Emily Martin, an infectious disease epidemiologist at the University of Michigan School of Public Health. “Transferring anything from the vaccine from one person to another is not possible. It’s just not biologically possible.”

Microorganisms spread from person to person by replicating. The vaccine ingredients and the protein can’t replicate, which means they can’t spread. They don’t even spread through your own body, much less to anybody else’s.

“They’re injected into your arm, and that’s where they stay,” Jennifer Nuzzo, an epidemiologist at Johns Hopkins, said of the vaccines. “mRNA is taken up by your muscle cells near the site of injection, the cells use it to make that protein, the immune system learns about the spike protein and gets rid of those cells. It’s not something that circulates.”

It’s also not something that sticks around. Messenger RNA is extremely fragile, which is one reason we’ve never had an mRNA-based vaccine before: It took a long time for scientists to figure out how to keep it intact for even the brief period needed to deliver its instructions. It disintegrates within a couple of days of vaccination.

Vaccinated people can’t shed anything because “there’s nothing to be shedding,” said Dr. Céline Gounder, an infectious disease specialist at Bellevue Hospital Center and a member of President Biden’s transition advisory team on the coronavirus. “The people who shed virus are people who have Covid. So if you want to prevent yourself or others from shedding virus, the best way to do that is to get vaccinated so you don’t get Covid.”

This brings us to the reports of women having abnormal periods after being near vaccinated people. Because one person’s vaccine can’t affect anybody else, it is impossible for these two events to be connected. Many things, like stress and infections, can disrupt menstrual cycles.

The shedding claims are “a conspiracy that has been created to weaken trust in a series of vaccines that have been demonstrated in clinical trials to be safe and effective,” Dr. Christopher M. Zahn, vice president of practice activities at the American College of Obstetricians and Gynecologists, said in a statement. “Such conspiracies and false narratives are dangerous and have nothing to do with science.”

Some women have expressed a related concern that getting vaccinated themselves could affect their menstrual cycles. Unlike secondhand effects, this is theoretically possible, and research is ongoing — but anecdotal reports could be explained by other factors, and no study has found a connection between the vaccine and menstrual changes.

“There’s no evidence that the vaccine affects your menstrual cycle in any way,” Dr. Gounder said. “That’s like saying just because I got vaccinated today, we’re going to have a full moon tonight.”

A memorial to George Floyd outside Cup Foods in Minneapolis, near the site of Mr. Floyd’s fatal encounter with the police.
Credit: Joshua Rashaad McFadden for The New York Times

Facebook on Monday said it planned to limit posts that contain misinformation and hate speech related to the trial of Derek Chauvin, the former Minneapolis police officer charged with the murder of George Floyd, to keep them from spilling over into real-world harm.

As closing arguments began in the trial and Minneapolis braced for a verdict, Facebook said it would identify and remove posts on the social network that urged people to bring arms to the city. It also said it would protect members of Mr. Floyd’s family from harassment and take down content that praised, celebrated or mocked his death.

“We know this trial has been painful for many people,” Monika Bickert, Facebook’s vice president of content policy, wrote in a blog post. “We want to strike the right balance between allowing people to speak about the trial and what the verdict means, while still doing our part to protect everyone’s safety.”

Facebook, which has long positioned itself as a site for free speech, has become increasingly proactive in policing content that might lead to real-world violence. The Silicon Valley company has been under fire for years over the way it has handled sensitive news events. That includes last year’s presidential election, when online misinformation about voter fraud galvanized supporters of former President Donald J. Trump. Believing the election to have been stolen from Mr. Trump, some supporters stormed the Capitol building on Jan. 6.

Leading up to the election, Facebook took steps to fight misinformation, foreign interference and voter suppression. The company displayed warnings on more than 150 million posts with election misinformation, removed more than 120,000 posts for violating its voter interference policies and took down 30 networks that posted false messages about the election.

But critics said Facebook and other social media platforms did not do enough. After the storming of the Capitol, the social network barred Mr. Trump from posting on the site. The company’s independent oversight board is now debating whether the former president will be allowed back on Facebook and has said it plans to issue its decision “in the coming weeks,” without giving a definite date.

The death of Mr. Floyd, who was Black, led to a wave of Black Lives Matter protests across the nation last year. Mr. Chauvin, a former Minneapolis police officer who is white, faces charges of manslaughter, second-degree murder and third-degree murder for Mr. Floyd’s death. The trial began in late March. Mr. Chauvin did not testify.

Facebook said on Monday that it had determined that Minneapolis was, at least temporarily, “a high-risk location.” It said it would remove pages, groups, events and Instagram accounts that violated its violence and incitement policy; take down attacks against Mr. Chauvin and Mr. Floyd; and label misinformation and graphic content as sensitive.

The company did not have any further comment.

“As the trial comes to a close, we will continue doing our part to help people safely connect and share what they are experiencing,” Ms. Bickert said in the blog post.

An early voter in Marietta, Ga., last year. While Georgia and Colorado have similar early-voting periods, their voting laws aren’t comparable over all.
Credit: Audra Melton for The New York Times

After Major League Baseball announced recently that it would move the All-Star Game from Atlanta to Denver in protest of new voting restrictions in Georgia, numerous prominent Republicans accused it of hypocrisy.

“Georgia has 17 days of in-person early voting, including two optional Sundays; Colorado has 15,” Gov. Brian Kemp of Georgia told Fox News. “So what I’m being told, they also have a photo ID requirement. So it doesn’t make a whole lot of sense to me.”

Senator Tim Scott of South Carolina made a similar argument in a widely circulated post on Twitter.

But while the 15-day and 17-day numbers are accurate, the overall comparison is not. Here are four key differences between Colorado’s and Georgia’s systems.

  • In Colorado, every registered voter receives a mail ballot by default.

    In Georgia, people who want to vote by mail must apply, and the new law more than halves the time they have to do that: Previously, they could apply as much as 180 days before an election, but now no more than 78 days before. Georgia also forbids officials to send voters an absentee ballot application unless they request it.

  • In Colorado, eligible voters can register anytime, including on Election Day.

    In Georgia, the deadline to register to vote is a month before Election Day, and under the new law, the same deadline applies to any runoff — meaning if a Georgian is not registered by the deadline for the first election, they cannot subsequently register to vote in the runoff.

  • In Colorado, only newly registered voters have to provide identification with their mail-in ballot; for subsequent elections, all that’s required is their signature. And contrary to Mr. Kemp’s statement, there is no photo requirement: Voters can use a birth certificate, a naturalization document, a Medicare or Medicaid card, a utility bill, a bank statement, a paycheck or another government document that shows their name and address.

    In Georgia, only photo identification is acceptable for regular mail-in ballots, and it has to be one of six specific types. The requirement will apply to everyone who votes by mail, not just to newly registered voters as in Colorado.

  • In Colorado, there were 368 ballot drop boxes last year across the state’s 64 counties, not just in government buildings but also at schools, parks, libraries, businesses and more. Boxes were open 24 hours a day.

    In Georgia, the new law requires at least one drop box in each of the 159 counties. (Mr. Kemp and other officials note that before the pandemic, Georgia didn’t have drop boxes at all.) The boxes will be only at registrars’ and absentee ballot clerks’ offices or inside early-voting sites, and open during limited hours.

In 2020, Colorado had the second-highest turnout rate in the country: 76.4 percent of eligible voters, behind only Minnesota, according to data compiled by the United States Elections Project. Georgia was 26th, with a turnout rate of 67.7 percent of eligible voters.

Correction: 

An earlier version of this article incorrectly described Georgia’s voter registration process. Like Colorado, Georgia registers voters automatically when they get a driver’s license; it is not the case that every resident has to fill out a voter registration form.

Out of every 10,000 views on YouTube, 16 to 18 were for videos that broke its rules before removal, the company said on Tuesday.
Credit: Jim Wilson/The New York Times

It is the never-ending battle for YouTube.

Every minute, YouTube is bombarded with videos that run afoul of its many guidelines, whether pornography or copyrighted material or violent extremism or dangerous misinformation. The company has refined its artificially intelligent computer systems in recent years to prevent most of these so-called violative videos from being uploaded to the site, but continues to come under scrutiny for its failure to curb the spread of dangerous content.

In an effort to demonstrate its effectiveness in finding and removing rule-breaking videos, YouTube on Tuesday disclosed a new metric: the Violative View Rate. It measures the percentage of total views on YouTube that go to videos violating its guidelines, counted before those videos are removed.

In a blog post, YouTube said violative videos had accounted for 0.16 percent to 0.18 percent of all views on the platform in the fourth quarter of 2020. Or, put another way, out of every 10,000 views on YouTube, 16 to 18 were for content that broke YouTube’s rules and was eventually removed.
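To make the arithmetic concrete, here is a minimal sketch of how a rate like this converts between a percentage and the per-10,000 figure YouTube cites. The function and variable names are ours for illustration; this is not YouTube’s actual sampling methodology, which the company has not detailed.

    def violative_view_rate(violative_views: int, total_views: int) -> float:
        """Share of all views, in percent, that landed on rule-breaking videos
        that were later removed (illustrative definition based on the blog post)."""
        return violative_views / total_views * 100

    # The reported range of 0.16% to 0.18% works out to roughly
    # 16 to 18 rule-breaking views out of every 10,000 total views.
    for rate_percent in (0.16, 0.18):
        per_ten_thousand = rate_percent / 100 * 10_000
        print(f"{rate_percent}% of views -> about {per_ten_thousand:.0f} per 10,000")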

“We’ve made a ton of progress, and it’s a very, very low number, but of course we want it to be lower,” said Jennifer O’Connor, a director at YouTube’s trust and safety team.

The company said its violative view rate had improved from three years earlier: 0.63 percent to 0.72 percent in the fourth quarter of 2017.

YouTube said it was not disclosing the total number of times that problematic videos had been watched before they were removed. That reluctance highlights the challenges facing platforms, like YouTube and Facebook, that rely on user-generated content. Even if YouTube makes progress in catching and removing banned content — computers detect 94 percent of problematic videos before they are even viewed, the company said — total views remain an eye-popping figure because the platform is so big.

YouTube decided to disclose a percentage instead of a total number because it helps contextualize how meaningful the problematic content is to the overall platform, Ms. O’Connor said.

YouTube released the metric, which the company has tracked for years and expects to fluctuate over time, as part of a quarterly report that outlines how it is enforcing its guidelines. In the report, YouTube did offer totals for the number of objectionable videos (83 million) and comments (seven billion) that it had removed since 2018.

While YouTube points to such reports as a form of accountability, the underlying data is based on YouTube’s own rulings for which videos violate its guidelines. If YouTube finds fewer videos to be violative — and therefore removes fewer of them — the percentage of violative video views may decrease. And none of the data is subject to an independent audit, although the company did not rule that out in the future.

“We’re starting by simply publishing these numbers, and we make a lot of data available,” Ms. O’Connor said. “But I wouldn’t take that off the table just yet.”

YouTube also said it was counting views liberally. For example, a view counts even if the user stopped watching before reaching the objectionable part of the video, the company said.

National Guard troops near the U.S. Capitol on Thursday.
Credit: Alyssa Schukar for The New York Times

QAnon, the right-wing conspiracy theory community, had another bad day on Thursday.

Following the letdown of Jan. 20 — when, contrary to QAnon belief, former President Donald J. Trump did not declare martial law, announce mass arrests of satanic pedophiles and stop President Biden from taking office — some QAnon believers revised their predictions.

They told themselves that “the storm” — the day of reckoning, in QAnon lore, when the global cabal would be brought to justice — would take place on March 4. That is the day that U.S. presidents were inaugurated until 1933, when the 20th Amendment was ratified and the date was moved to January. Some QAnon believers thought that it would be the day that Mr. Trump would make a triumphal return as the nation’s legitimate president, based on their false interpretation of an obscure 19th century law.

Law enforcement agencies, worried about a repeat of the Jan. 6 riot at the Capitol, took note of QAnon’s revised deadline and prepared for the worst. The Department of Homeland Security and the F.B.I. sent intelligence bulletins to local police departments warning that domestic extremist groups had “discussed plans to take control of the U.S. Capitol and remove Democratic lawmakers.” And the House of Representatives canceled plans to be in session on Thursday, after the Capitol Police warned of a possible QAnon-inspired plot to stage a second assault on the Capitol.

But the Capitol was quiet on Thursday, and QAnon supporters did not erupt in violence. Mr. Trump remains a former president, and no mass arrests of pedophiles have been made.

Even before their latest prophecy failed, QAnon believers were divided about the movement’s future. Some movement influencers who originally promoted the March 4 conspiracy theory had walked back their support for it in recent days, insisting it was a “false flag” operation staged by antifa or other left-wing extremists in order to make QAnon look bad.

On Thursday, as it became clear that no storm was underway, some QAnon believers defiantly maintained that there was still time for Mr. Trump to stage a coup and take office. One Telegram channel devoted to QAnon chatter lit up with false claims that Bill Gates, Dr. Anthony S. Fauci, Representative Alexandria Ocasio-Cortez and other prominent officials had been arrested or executed for treason already, and that “doubles and A.I. clones” had been activated to preserve the illusion that they were still alive.

But other believers contested those claims and appeared resigned to postponing their day of reckoning yet again.

“It may not happen today,” one poster on a QAnon message board wrote. “But when it happens, everyone will see it! As Q predicted. And yes, it will be much much sooner than in four years. We are talking about days (weeks max).”

Tweets that contain Covid-19 vaccine information will be labeled with links to public health sources or Twitter’s policies, the company said on Monday.
Credit: Jim Wilson/The New York Times

Twitter said on Monday that it would start applying labels to tweets that contained misleading information about Covid-19 vaccines, and would enforce its coronavirus misinformation policies with a new five-tier “strike” system.

Tweets that violate the policy will get labels with links to official public health information or the Twitter Rules, the company said in a blog post. Twitter said these labels would increase its ability to deploy automated tools to identify and label similar content across the platform. The company’s goal is to eventually use both automated and human review to address Covid-19 misinformation, the post said, but it added that it would take time for the system to be effective.

Twitter will notify people when it applies a label to one of their tweets, and repeated violations of the Covid-19 policy will result in stricter enforcement, the company said. Two or three strikes result in a 12-hour account lock, while a fourth strike brings a seven-day account lock. After five strikes, Twitter said, the company will permanently suspend the account. (Twitter allows users to submit appeals if accounts are locked or suspended in error.)
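As an illustration, the tiered enforcement described above can be expressed as a simple lookup. This is a sketch based only on the thresholds reported here; the function name and the wording of the actions are ours, not Twitter’s.

    def enforcement_for_strikes(strikes: int) -> str:
        """Map a running count of Covid-19 policy strikes to the action
        described in Twitter's announcement (illustrative only)."""
        if strikes <= 1:
            return "label the tweet and notify the account"
        if strikes <= 3:   # two or three strikes
            return "12-hour account lock"
        if strikes == 4:
            return "7-day account lock"
        return "permanent suspension"  # five or more strikes

    print(enforcement_for_strikes(3))  # 12-hour account lock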

The company said it was making these changes to encourage healthy conversation on the platform and help people find reliable information. Since introducing its Covid-19 guidance last March, Twitter said, it had removed more than 8,400 tweets and notified 11.5 million accounts of possible violations worldwide.

A QAnon flag at a demonstration in Los Angeles in August. Last year, views of videos on pro-QAnon channels rose 38 percent, a new report says.
Credit: Kyle Grillot/Agence France-Presse — Getty Images

Two years ago, YouTube changed its recommendation algorithm to reduce the visibility of so-called borderline content — videos that brush up against its rules but do not explicitly violate them — in an effort to curb the spread of misinformation and conspiracy theories on the site.

But those changes did not stop the rapid spread of videos about QAnon, a debunked internet conspiracy theory, according to a research report on Tuesday from Pendulum, a company that tracks misinformation on YouTube.

Online video channels with QAnon content generated more than one billion views in 2020, with 910 million on YouTube alone, up 38 percent from 2019, the report said. When YouTube began to directly crack down on people posting the QAnon conspiracy theories in October, the largest channels moved to smaller platforms, BitChute and Rumble.

Sam Clark, a co-founder of Pendulum, said the research “indicates that moderation done by YouTube has not been enough to stop the growth of overall viewership of this content.”

The report demonstrated the critical role that YouTube, a subsidiary of Google, played in helping to move QAnon from a fringe phenomenon into the mainstream with violent offline consequences.

In a recent national poll, 17 percent of respondents said they believed in one of the core tenets of QAnon — that a group of devil-worshiping elites who run a child sex ring are trying to control politics and the media. And QAnon believers were involved in the deadly Capitol riot in January as well as other offline violence.

“While we welcome more peer-reviewed research, our data contradicts Pendulum’s findings, and just over the past months alone, we have terminated many prominent QAnon channels and removed thousands of videos for violating our policies,” Farshad Shadloo, a YouTube spokesman, said in a statement.

Mr. Shadloo said Pendulum’s sampling was not comprehensive and did not accurately reflect what was popular or what was watched on YouTube. He added that a number of factors could drive an increase in views, including a sudden increase in media coverage, attention from public figures and sharing outside YouTube.

After YouTube changed its algorithm in January 2019, it said views from recommendations among a set of pro-QAnon channels fell more than 80 percent. The updated policy in October said YouTube would no longer allow “content that targets an individual or group with conspiracy theories that have been used to justify real-world violence.”

Pendulum said YouTube had removed 91,000 videos from 285 of the largest QAnon channels and removed about half of those channels altogether. YouTube has not disclosed the full impact of its policy change, but said the majority of its prominent QAnon channels had been terminated.

But YouTube’s actions did not stop the biggest creators of QAnon content. They simply moved to smaller video platforms with less restrictive moderation policies, such as BitChute and Rumble.

When YouTube took action in October, the number of daily views of QAnon channels on all three platforms fell to 1.3 million from 2.7 million. As followers of those top creators moved to the smaller platforms, daily views rose again, to 2.2 million in December.

And after the attack on the Capitol, QAnon channels had their highest-viewed month ever — topping their previous record by 30 percent, with most of the views on BitChute and Rumble.

Pendulum labeled a channel as a QAnon channel when 30 percent or more of its most-viewed videos discussed the conspiracy theory in a supportive way or indicated that the content creator was a believer.
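To make that threshold concrete, the labeling rule as Pendulum describes it can be sketched in a few lines. The function and parameter names are ours for illustration; how Pendulum judged a video to be “supportive” is not spelled out in the report.

    def is_qanon_channel(supportive_top_videos: int, total_top_videos: int) -> bool:
        """Flag a channel when 30 percent or more of its most-viewed videos
        discuss QAnon supportively or suggest the creator is a believer."""
        if total_top_videos == 0:
            return False
        return supportive_top_videos / total_top_videos >= 0.30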

On Monday, Facebook announced that it was banning vaccine misinformation. It followed up on Wednesday by removing the Instagram account of Robert F. Kennedy Jr., one of the most prominent anti-vaccine activists on social media.

Facebook has become increasingly aggressive in recent months at combating a deluge of false health claims, conspiracy theories and rumors. The company is acting at a critical moment, as vaccinations against the coronavirus roll out across the globe. Facebook has said it consulted with the World Health Organization and other leading health institutes to determine a list of false or misleading claims around Covid-19 and vaccines in general.

Even so, dozens of prominent anti-vaccine activists remained active on Facebook and Instagram on Thursday, according to an analysis by The New York Times. Some of the accounts had large followings, including the Instagram account for Children’s Health Defense, the nonprofit organization that Mr. Kennedy runs, which has over 172,000 followers.

A search for the word “vaccine” on Instagram on Thursday showed that four of the top 10 accounts took strong anti-vaccine positions. A search for the hashtag #vaccine got three results, one of which was #vaccinetruthadvocate, a term that anti-vaccine activists often use to spread their message. The hashtag was appended to more than 12,000 posts.

“This is going to take some time, however, but we are working to address what you raise,” a Facebook spokeswoman said in a statement.

Researchers who study misinformation said Facebook continued to struggle to contain Covid-19 falsehoods.

“Months after they promised to crack down on Covid misinformation, we reported hundreds of posts containing dangerous misinformation to Facebook, but just one in 10 of those posts were removed,” said Imran Ahmed, chief executive of the nonprofit Center for Countering Digital Hate. “Millions of people are being fed dangerous lies which lead them to doubt government guidance on Covid and on vaccines, prolonging the pandemic. These lies cost lives.”

Here’s a look at some of the prominent accounts still spreading anti-vaccine misinformation on Instagram.

Children’s Health Defense, the nonprofit organization that Mr. Kennedy runs, regularly promotes seminars and webinars with vaccine skeptics through its Instagram account, and posts misleading accounts of death and injury associated with the Covid vaccine. Many of its posts receive tens of thousands of likes. The organization did not return a request for comment.

Erin Elizabeth, an author and public speaker who has campaigned for years against vaccines, has over 122,000 Instagram followers on her Health Nut News page and 23,700 on another page she runs. She regularly shares content that argues against “mandatory vaccination.” She did not return a request for comment.

Shiva Ayyadurai, an Indian-American politician, has over 299,000 followers on Instagram. He has spread the false claim that Covid-19 can be treated with vitamin C. He has also accused the “deep state,” a supposed secret cabal that conspiracy theorists claim runs the government, of spreading Covid-19. He did not return a request for comment.

Armed National Guard members walk around the grounds of the U.S. Capitol in January.
Credit: T.J. Kirkpatrick for The New York Times

Misinformation about the second impeachment trial of former President Donald J. Trump is swirling online at a much slower clip than it did during his first impeachment trial — at least so far.

The media insights company Zignal Labs collected misinformation narratives around the impeachment proceedings from Jan. 25 to Feb. 9, and found three emerging falsehoods that had gotten thousands of mentions on social media and cable television and in print and online news outlets.

The falsehoods, though, had not gained as much traction as misinformation about Mr. Trump’s first impeachment trial or the outcome of the 2020 election. Still, the data shows how virtually any news event is an opportunity to spread lies and push divisive rumors, helped along by social media algorithms, eager audiences and a broken fact-checking system.

Here are the three most popular misinformation narratives about the impeachment proceedings.

The falsehood that Speaker Nancy Pelosi somehow knew that a mob would storm the Capitol and was using the impeachment trial as a “diversion” effort was amplified by Senator Ron Johnson on Fox News on Feb. 7.

“We now know that 45 Republican senators believe it’s unconstitutional,” Mr. Johnson said on Fox News, referring to the impeachment proceedings. “Is this another diversion operation? Is this meant to deflect away from what the speaker knew and when she knew it? I don’t know, but I’m suspicious.”

A video clip of the interview was viewed at least 2.1 million times on Twitter.

The falsehood that the Capitol attack was preplanned and “undercuts Trump impeachment premise” gained traction on Feb. 8 when a conservative outlet called Just the News published an article detailing the claim. The article was shared 7,400 times on Twitter and at least 3,000 times on Facebook.

The founder of Just the News, John Solomon — a Washington-based media personality who was instrumental in pushing falsehoods about the Bidens and Ukraine — shared the falsehood from his own Twitter account, collecting thousands of likes and retweets. Other Twitter users then picked up the rumor, further amplifying the false narrative.

Focusing on what was planned in advance should have no bearing on the impeachment trial itself, according to 144 constitutional law scholars who submitted a written analysis of the case against Mr. Trump. They said many of them believe that “President Trump can be convicted and disqualified because he is accused of violating his oath through an ‘extraordinary, unprecedented repudiation of the president’s duties to protect the government’ through his ‘further acts and omissions after he incited the crowd to attack the Capitol.’”

The narrative that it is not too late to impeach former President Barack Obama started to gain traction on Jan. 26 on Twitter. Thousands of Twitter users shared an old suggestion from Representative Matt Gaetz, a Florida Republican, that if a former president can be impeached, Mr. Obama should be tried for spying on Mr. Trump.

The false narrative was a revival of “Spygate” — a labyrinthine conspiracy theory involving unproven allegations about a clandestine Democratic plot to spy on Mr. Trump’s 2016 campaign. But the theory fizzled as the past four years saw none of Mr. Trump’s political enemies charged with crimes. And in 2019, a highly anticipated Justice Department inspector general’s report found no evidence of a politicized plot to spy on the Trump campaign.

Anti-vaccine protesters shouted and waved signs at health care workers in Tampa, Fla., on Sunday as they entered Raymond James Stadium to watch the Super Bowl.
Credit: Zack Wittman for The New York Times

Facebook said on Monday that it plans to remove posts with erroneous claims about vaccines from across its platform, including taking down assertions that vaccines cause autism or that it is safer for people to contract the coronavirus than to receive a vaccine against it.

The social network has increasingly changed its content policies over the past year as the coronavirus has surged. In October, it prohibited people and companies from purchasing advertising that included false or misleading information about vaccines. In December, Facebook said it would remove posts with claims that had been debunked by the World Health Organization or government agencies.

Monday’s move goes further by targeting unpaid posts to the site and particularly Facebook pages and groups. Instead of targeting only misinformation around Covid-19 vaccines, the update encompasses false claims around all vaccines. Facebook said it had consulted with the World Health Organization and other leading health institutes to determine a list of false or misleading claims around Covid-19 and vaccines in general.

In the past, Facebook had said it would only “downrank,” or push lower down in people’s news feeds, misleading or false claims about vaccines, making it more difficult to find such groups or posts. Now posts, pages and groups containing such falsehoods will be removed from the platform entirely.

“Building trust and confidence in these vaccines is critical, so we’re launching the largest worldwide campaign to help public health organizations share accurate information about Covid-19 vaccines and encourage people to get vaccinated as vaccines become available to them,” Kang-Xing Jin, head of health at Facebook, said in a company blog post.

The company said the changes were in response to a recent ruling from the Facebook Oversight Board, an independent body that reviews decisions made by the company’s policy team and rules on whether they were just. In one ruling, the board said that Facebook needed to create a new standard for health-related misinformation because its current rules were “inappropriately vague.”

Facebook also said it would give $120 million in advertising credits to health ministries, nongovernmental organizations and United Nations agencies to aid in spreading reliable Covid-19 vaccine and preventive health information. As vaccination centers roll out more widely, Facebook said it would help point people to locations where they can receive the vaccine.

Mark Zuckerberg, Facebook’s founder and chief executive, has been proactive against false information related to the coronavirus. He has frequently hosted Dr. Anthony Fauci, the nation’s top infectious disease expert, on Facebook to give live video updates on the American response to the coronavirus. In his private philanthropy, Mr. Zuckerberg has also vowed to “eradicate all disease,” pledging billions to fighting viruses and other diseases.

Yet Mr. Zuckerberg has also been a staunch proponent of free speech across Facebook and was previously reluctant to rein in most falsehoods, even if they were potentially dangerous. The exception has been Facebook’s policy to not tolerate statements that could lead to “immediate, direct physical harm” to people on or off the platform.

Facebook has been criticized for that stance, including for allowing President Donald J. Trump to remain on the platform until after the Jan. 6 riot at the U.S. Capitol.

For years, public health advocates and outside critics took issue with Facebook’s refusal to remove false or misleading claims about vaccines. That led to a surge in false vaccine information, often from people or groups who spread other harmful misinformation across the site. Even when Facebook tried updating its policies, it often left loopholes that were exploited by misinformation spreaders.

Facebook on Monday said it would also change its search tools to promote relevant, authoritative results on the coronavirus and vaccine-related information, while making it more difficult to find accounts that discourage people from getting vaccinated.

Representative Alexandria Ocasio-Cortez at the Capitol on Thursday.
Credit: Anna Moneymaker for The New York Times

Since Representative Alexandria Ocasio-Cortez, the New York Democrat, took to Instagram Live on Monday to describe what the Jan. 6 riot was like from inside the Capitol complex, critics have claimed that she wasn’t where she said she was, or that she couldn’t have experienced what she described from her location.

These claims are false.

While Ms. Ocasio-Cortez was not in the main, domed Capitol building when the rioters breached it, she never said she was. She accurately described being in the Cannon House Office Building, which is part of the Capitol complex and is connected to the main building by tunnels.

In her livestream, Ms. Ocasio-Cortez recalled hiding in a bathroom and thinking she was going to die as unknown people entered her office and shouted, “Where is she?” They turned out to be Capitol Police officers who had not clearly identified themselves, and Ms. Ocasio-Cortez said so on Instagram. She did not claim that they were rioters — only that, from her hiding spot, she initially thought they were.

During the riot, reporters wrote on Twitter that the Cannon building was being evacuated because of credible threats, and that Capitol Police officers were running through the hallways and entering offices just as Ms. Ocasio-Cortez described.

The false claims about her statements have spread widely online, much of the backlash stemming from an article on the conservative RedState blog and a livestream from the right-wing commentator Steven Crowder. On Thursday, Representative Nancy Mace, Republican of South Carolina, tweeted, “I’m two doors down from @aoc and no insurrectionists stormed our hallway.”

But Ms. Ocasio-Cortez never said insurrectionists had stormed that hallway, and Ms. Mace herself has described being frightened enough to barricade her own door. A spokeswoman for Ms. Mace said on Friday that the congresswoman’s tweet had been intended as “an indictment of the media for reporting there were insurrectionists in our hallway when in fact there were not,” and that it “was not at all directed at Ocasio-Cortez.”

“As the Capitol complex was stormed and people were being killed, none of us knew in the moment what areas were compromised,” Ms. Ocasio-Cortez tweeted in response to Ms. Mace’s post. (A spokeswoman for Ms. Ocasio-Cortez said the lawmaker had no additional comment.)

Others have corroborated Ms. Ocasio-Cortez’s account and confirmed that the Cannon building was threatened, even though the rioters did not ultimately breach it.

Ari Rabin-Havt, a deputy manager for Senator Bernie Sanders’s 2020 presidential campaign, tweeted that he was in the Capitol tunnels during the attack. As Mr. Rabin-Havt moved toward the Cannon building, he wrote, members of a SWAT team yelled at him to find a hiding place.

And Representative Katie Porter, Democrat of California, said on MSNBC that after the Cannon building was evacuated, she and Ms. Ocasio-Cortez sheltered in Ms. Porter’s office in another building. She said Ms. Ocasio-Cortez was clearly terrified, opening closets to try to find hiding places and wishing aloud that she had worn flats instead of heels in case she had to run.

Jacob Silver contributed reporting.

Rudolph W. Giuliani worked for weeks after the November election in an attempt to subvert its outcome.
Credit: Erin Schaff/The New York Times

Dominion Voting Systems, one of the largest voting machine vendors in the United States, filed a defamation lawsuit against Rudolph W. Giuliani on Monday, accusing him of spreading a litany of falsehoods about the company in his efforts on behalf of former President Donald J. Trump to subvert the election.

The lawsuit chronicles more than 50 inaccurate statements made by Mr. Giuliani in the weeks after the election, and issues a point-by-point rebuttal of each falsehood. Here are four of the most common false statements Mr. Giuliani made about Dominion Voting Systems.

Mr. Giuliani regularly stated, falsely, that Dominion “really is a Venezuelan company” and that it “depends completely on the software of Smartmatic,” a company “developed in about 2004, 2005 to help Chavez steal elections.”

As Dominion writes in its lawsuit: “Dominion was not founded in Venezuela to fix elections for Hugo Chávez. It was founded in 2002 in John Poulos’s basement in Toronto to help blind people vote on paper ballots.” The suit later adds that the headquarters for the company’s United States subsidiary are in Denver.

Another often-repeated claim was that Dominion had programmed its machines to flip votes: “In other words when you pressed down Biden, you got Trump, and when you pressed down Trump you got Biden.”

This has been proved false by numerous government and law enforcement officials, including former Attorney General William P. Barr, who said in December: “There’s been one assertion that would be systemic fraud, and that would be the claim that machines were programmed essentially to skew the election results. And the D.H.S. and D.O.J. have looked into that, and so far, we haven’t seen anything to substantiate that.”

Similarly, a joint statement by numerous government and elections officials and agencies, including the National Association of State Election Directors, the National Association of Secretaries of State, and the Cybersecurity and Infrastructure Security Agency, stated that there was “no evidence that any voting system deleted or lost votes, changed votes, or was in any way compromised.”

The hand recount in Georgia also affirmed that the machine recounts were accurate in that state.

Mr. Giuliani zeroed in on Antrim County, Mich., falsely claiming that a “Dominion machine flipped 6,000 votes from Trump to Biden” there, and that machines in the county were “62 percent inaccurate,” had a “68 percent error rate” and had an “81.9 percent rejection rate.”

Mr. Giuliani’s focus on Antrim County stems from human errors made by the county clerk on election night. According to the lawsuit, the clerk “mistakenly failed to update all of the voting machines’ tabulator memory cards.” But the suit says that “her mistakes were promptly caught as part of the normal canvass process before the election result was made official.” The Michigan secretary of state’s office also conducted a hand audit of all presidential votes in Antrim County that found the machines were accurate.

Mr. Giuliani claimed that his accusations, particularly in Antrim County, were backed up by experts. But he largely relied on one man, Russell Ramsland Jr., a former Republican congressional candidate from Texas, who, according to the lawsuit filed by Dominion, had also publicly favored false conspiracy theories.

Dominion spent more than five pages on Mr. Ramsland’s lack of credentials to properly examine equipment, noting that he had a “fundamental misunderstanding of election software.” The suit also quotes the former acting director of the U.S. Election Assistance Commission Voting System Testing and Certification program, saying the report produced by Mr. Ramsland “showed a ‘grave misunderstanding’ of Antrim County’s voting system and ‘a lack of knowledge of election technology and process.’”
