In the midst of the heated U.S. presidential race last summer, with hypercharged scrutiny of partisan propaganda on social media, Facebook Inc. Chief Executive Officer Mark Zuckerberg received a letter from a group of U.S. senators led by Massachusetts Democrat Elizabeth Warren that had nothing to do with elections. They were angry about a year-old piece of climate news.
A Washington Examiner article shared on Facebook in 2019 had denounced climate models, which are widely used by scientists around the world to measure and predict the impacts of warmer temperatures. Science Feedback, an outside organization Facebook works with on fact-checking, had labeled the story false. A review by five scientists found the story “highly misleading” because of “false factual assertions” and accused the authors of “cherry-picking datasets.” The conclusion meant Facebook posts linking to the story would now be saddled with a label saying it had been disputed.
But Facebook then said that because the article was designated as an op-ed, it was exempt from fact checks under the company’s policies. The “false” label was removed. Warren’s letter called the op-ed policy a “massive loophole” that “represents another unfortunate example of Facebook’s refusal to fully combat the deliberate spread of misinformation.”
Climate change has emerged as a key priority in Facebook’s quest to stomp out misinformation, a complicated effort that involves policing user posts while simultaneously defending free speech. In the past few months the company has started fighting climate misinformation with some of the same strategies used to battle COVID‑19 myths and election falsehoods — a sign of the topic’s growing importance internally. But Facebook’s misinformation policies have also left climate activists frustrated.
Zuckerberg got another letter last month, this time from 13 environmental groups including the Union of Concerned Scientists and Greenpeace, asking the company to commit to monitoring climate disinformation and releasing reports, among other things. “Climate change disinformation is spreading rapidly across Facebook’s social media platform, threatening the ability of citizens and policymakers to fight the climate crisis,” the groups wrote.
Unlike elections, which have shorter timelines, climate change is a long-term problem with no definitive ending. That means Facebook doesn’t consider lies about the climate an “imminent” threat of real-world harm, which is the threshold the company uses to determine whether a post containing misinformation should be removed from the service entirely.
“It is an immediate threat, and the fact that they don’t see that makes me mad,” says Naomi Oreskes, a professor of the history of science at Harvard and the co-author of Merchants of Doubt, a book about climate science being deliberately obscured. “We have enormous evidence now that many storms, floods, hurricanes, cyclones have been made worse by climate change.”
At a congressional hearing last month, Zuckerberg admitted that climate misinformation is “a big issue.” A Facebook spokesman says climate change misinformation accounts for a very low percentage of total misinformation on the service but declined to share figures. Experts who follow climate misinformation have had a tough time quantifying its presence on Facebook given a lack of access to private groups and messages, but they’ve seen how small lies can spread quickly.
When millions of Texans lost power in February, Facebook failed to label a number of posts falsely claiming that “wind turbine failures were a leading cause of blackouts,” according to Avaaz, a nonprofit group that studied the issue. The top 10 posts promoting those claims, none of which received fact check labels, garnered an estimated 15.8 million views, Avaaz found.
“Even if it seems like it’s a small number of actors sometimes, the scale of Facebook’s algorithm and its reach as the world’s information highway right now means that millions of people will be impacted by what they see on that platform,” says Fadi Quran, a campaign director at Avaaz.
An outside climate watchdog called InfluenceMap found in October 2020 that dozens of climate denial ads slipped through the social network’s filters and garnered 8 million views. The ads were targeted to older users in more rural, Republican-leaning U.S. states, making them more impactful, says Dylan Tanner, executive director of InfluenceMap. “The power of that targeting makes that 8 million even more powerful,” he says. A Facebook search in March by Bloomberg Green found a handful of groups, with thousands of members, with names such as “Man Made Global Warming is a HOAX” and “Climate Crisis? There is NO Climate Crisis!”
A Facebook spokesman says “we take action against pages, groups, and accounts that repeatedly share false claims.” When fact-checkers rate content as false, he says, “we add a warning label and reduce its distribution.” Groups that repeatedly share misinformation aren’t removed, but are no longer recommended by Facebook’s algorithm, he adds.
Climate misinformation has been around for decades, with lobbying efforts and marketing by Big Oil as far back as the 1980s, but Oreskes says social media may be making it worse. A regular public speaker on climate issues, she’s noticed in recent years that attendees’ questions often mirror online memes. “When it spreads to social media it’s even more pernicious,” Oreskes says, “because now you’re hearing it from a friend or relative.”
Just as Facebook can exacerbate a false narrative, it also has the ability to change it entirely. With almost 2.8 billion users globally, the social network has an opportunity with climate change that it hasn’t had with other topics — a chance to educate millions of people without the pressure of a looming election or public health crisis.
Facebook’s efforts to fight climate-related misinformation picked up last year when it called a handful of experts to ask for help. That led the company to create the Climate Science Information Center, a dedicated space on the service with scientist-approved information. The center appears as the top search result when users query climate-related terms, like “global warming” or “greenhouse gases.” Users can see charts mapping the average annual temperature of their state, or click to read facts about declining polar bear populations or excess carbon dioxide.
The data are meant to counter common misconceptions that people share on Facebook, including claims that the concept of climate change is widely disputed among scientists (not true), or that climbing temperatures are not the result of human activity (also false). Facebook is already adding links to the center on user posts about climate change for some users in the U.K., and could soon expand those labels to other countries.
Widespread labeling of posts is the same approach Facebook used for the 2020 U.S. election and for COVID‑19 vaccine information. It’s a strategy that aligns with one of Zuckerberg’s key beliefs: The best way to fight misinformation is with more information.
“There has long been this kind of purity default that we should treat all of this as ‘we are just a platform for the free expression of ideas,’” says Anthony Leiserowitz, the founder and director of the Yale Program on Climate Change Communication, who advised Facebook on its climate hub and was not paid by the company. “Maybe through force, maybe through argument inside, they are increasingly recognizing that’s just not an adequate answer to the challenges they are facing.”
Experts say the info center is a good effort but only a start. Oreskes still thinks Facebook should remove climate misinformation entirely. Quran wants Facebook to retroactively notify people if they’ve previously seen misinformation by putting alerts in their feed. “If the goal is addressing misinformation, just providing facts is not an adequate solution,” says John Cook, a research fellow at the Climate Change Communication Research Hub at Monash University in Melbourne. He also consulted on Facebook’s effort without pay. “But I think they recognize that themselves.”
There’s reason to suspect Facebook’s recent efforts may signal more changes to come. Climate change is an issue of great importance to the company’s employees, according to internal surveys, and top executives agree. Zuckerberg once said stopping climate change was “one of the most important challenges of our generation,” and Donald Trump’s decision to pull the U.S. out of the Paris Agreement marked one of the few times Zuckerberg publicly criticized the former president.
Chris Cox, the social network’s longtime head of product and one of its most powerful executives, is also a climate advocate. He advised and invested in multiple climate-related startups during a year away from the company. Cox’s return in May 2020 added “a huge amount of momentum” to some of the company’s climate-related products, says Edward Palmieri, Facebook’s director of sustainability. “The Climate Science Information Center is something that we had been thinking about, but Chris was able to help us move even faster and get even more resources.”
If the company manages to effectively combat climate misinformation, it could be a major step toward educating millions of people. A failure to stop the lies from spreading, on the other hand, could be devastating. “[It’s] not just an important opportunity, but responsibility to give their users access to quality information,” Leiserowitz says. “To go to an old saying: You never get a second chance to make a first impression.”