A megastudy across 63 countries finds 11 promising interventions in the fight against climate misinformation

Jigsaw · Mar 5, 2024

Designing effective global behavioral interventions is a critical component of addressing climate change, but differences across cultures and geographies have limited scalable solutions to date. While some behavioral interventions have shown promise in isolated experimental settings, comparing interventions between studies is complex, which has left longstanding open debates about their relative effectiveness. Interventions are often tested independently, with different samples, on different outcomes, and with a focus on western, educated samples from industrialized, rich, and developed countries.

Which of these approaches works best to fight misinformation? Which, if any, work at scale across regions? Researchers needed a way to test these different approaches head-to-head and with unprecedented global reach. Given our interest in building and understanding information interventions at scale, we joined an international team of researchers to produce a global megastudy investigating 11 information interventions for climate science.

The Jigsaw research team has invested significantly over the last decade in understanding and developing ways to better equip people with the cognitive tools to protect themselves as they navigate information online. This megastudy project, led by researchers at New York University, is a first-of-its-kind manylabs megastudy to test climate information interventions across the globe. It is a mammoth collaboration: over 200 academics from around the world collected data from 59,439 participants in 63 countries to test the relative effectiveness of interventions to tackle climate misinformation (see a full list of collaborators in the paper).

The goal of the study was to test the most promising known approaches to strengthening climate science support head-to-head to understand which were most effective. We began by crowdsourcing those interventions from behavioral science experts. Through this process, we settled on 11 evidence-based behavioral interventions (Fig 1).

“Success” in the fight against climate change and climate misinformation comes in a few forms, so we wanted to measure how these interventions impacted different outcomes. We once again crowdsourced from experts to identify the four outcomes we thought were most relevant to addressing climate misinformation:

  • Belief that climate change is a real and serious problem
  • Support for climate change mitigation policy
  • Willingness to share climate mitigation information on social media
  • Willingness to contribute to a real tree-planting initiative by engaging in a cognitively demanding task

A full explanation of the methods used to measure outcomes can be found on page four of the paper.

Participants each saw one of the 11 interventions, or a control condition in which they read a passage from a literary text. They were then evaluated on each of the four outcomes of interest in a randomized order. One of the real innovations of this study is that we were able to run the exact same protocol across all 63 countries.
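To make the design concrete, here is a minimal sketch (in Python) of the assignment logic described above. The condition and outcome labels are our own placeholders for illustration, not the labels used in the study.

```python
# Illustrative sketch of the between-subjects design: each participant is
# randomly assigned to one of the 11 interventions or the control condition,
# and then completes the four outcome measures in a randomized order.
# Condition and outcome names below are placeholders, not the study's labels.
import random

CONDITIONS = [f"intervention_{i}" for i in range(1, 12)] + ["control"]
OUTCOMES = ["belief", "policy_support", "social_sharing", "tree_planting_task"]

def assign_participant(rng: random.Random) -> dict:
    """Return one participant's condition and the order of outcome measures."""
    return {
        "condition": rng.choice(CONDITIONS),
        "outcome_order": rng.sample(OUTCOMES, k=len(OUTCOMES)),
    }

if __name__ == "__main__":
    rng = random.Random(42)  # fixed seed so the example is reproducible
    print(assign_participant(rng))
```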

Reducing psychological distance increased support for climate science

We found that reducing the psychological distance of climate change (i.e., framing the climate crisis as a proximal risk to the participant by, for example, showing them images of climate-related natural disasters that had occurred near them) was an effective strategy, increasing climate belief by 2.3%, support for climate policies by 1.1%, and social media sharing by 9.9%.

While this specific intervention aimed to reduce geographic psychological distance, other interventions that aimed to reduce temporal psychological distance (such as writing a letter to a socially close child in the future, or writing a letter to your future self) were also effective, increasing climate belief by 1.2%, support for climate policies by 2.6%, and social media sharing by 10.8%.

Another highly effective intervention was sharing stories about successful collective action that has had a meaningful impact, such as the recovery of the ozone layer. Showing a collective action example increased climate belief by 1.5%, support for climate policies by 2.4%, and social media sharing by 10.5% on average, making it one of the top three most effective interventions for all outcomes.

Climate messaging evoking negative emotions elevated sharing online — but little else

Negatively charged emotional language is commonly used to frame climate information as a means to elicit responses and actions online, and the practice can be highly contentious. Some argue that negative emotions are necessary to motivate action, while others worry that they may depress or demoralize people into inaction. Our study helps to reconcile some of the debates in the literature by shedding light on both of these effects.

Negative emotion was highly effective at stimulating sharing of climate science posts on social media — increasing sharing by 10.8% on average. However, it did not increase climate belief or policy support, and deterred people from taking individual climate action. Moreover, for participants who started with a low baseline of initial climate beliefs, negative messaging appeared to backfire.

Given this insight, practitioners should carefully consider the behaviors they seek to stimulate when deploying different messaging styles, rather than choosing a single messaging style for all interventions.

The negative-emotion intervention illustrates why the megastudy approach is so powerful: it enables direct head-to-head comparisons that reconcile open debates and provide clear guidance for future climate science messaging.

Next steps: Further understanding of information interventions

Interestingly, almost all of the interventions were effective at producing one outcome: increased social media sharing of climate science. On the other hand, none of the interventions were effective at motivating more tree-planting behavior. In fact, some of the interventions actually induced a backfire effect, whereby participants were less interested in tree planting than the control group. Upon further investigation into this effect, we found that all interventions longer than two minutes appeared to induce backfire, while interventions shorter than two minutes appeared to have no effect on tree planting. It is possible that the time invested in the intervention itself reduced the time participants were willing to subsequently spend on a time- and cognition-intensive task, as the other outcomes were very low effort by comparison. This may underscore the importance of keeping interventions short. Further research is required to understand this result more clearly.
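As a rough illustration of how one might probe this duration effect in the open dataset, here is a short pandas sketch. The file path and column names (condition, duration_minutes, trees_planted) are hypothetical placeholders, not the published analysis or the actual data schema.

```python
# Hypothetical sketch: compare the tree-planting outcome for "long" (>2 min)
# vs. "short" interventions against the control group. The file path and
# column names are placeholders, not the study's actual schema.
import pandas as pd

df = pd.read_csv("climate_megastudy.csv")  # placeholder path to the open data

control_mean = df.loc[df["condition"] == "control", "trees_planted"].mean()

summary = (
    df[df["condition"] != "control"]
    .assign(long_intervention=lambda d: d["duration_minutes"] > 2)
    .groupby("long_intervention")["trees_planted"]
    .mean()
    .sub(control_mean)  # negative values would suggest a backfire relative to control
)
print(summary)
```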

These results have important implications for educators, policymakers, practitioners, and technologists fighting climate misinformation. This megastudy provides a clear apples-to-apples comparison of common interventions, helping practitioners be more deliberate about the levers they can pull to achieve intended outcomes. It also provides empirical evidence of how these interventions perform across different geographies and contexts, which is especially important for tackling a global issue like climate change.

We will continue to analyze the data and publish insights in the coming months, including investigations of relative effectiveness across countries and cultures, age groups, and political divides. The data we collected is also open-sourced, and we encourage researchers to use it for new research. Please reach out to Prof. Madalina Vlasceanu (vlasceanu@nyu.edu) with any questions.

Addressing an issue like climate change requires not only massive structural change, but also huge shifts in collective attitudes and behaviors. Novel methods, like this manylabs megastudy approach, are critical steps towards better testing, validating, and understanding human behavior across the globe. At Jigsaw, we’re excited about the potential for technology to build on these findings to continue to move the needle on positive climate outcomes.

By Rachel Xu, Jigsaw Research Manager, and Madalina Vlasceanu, NYU Assistant Professor of Psychology

Jigsaw is a unit within Google that explores threats to open societies, and builds technology that inspires scalable solutions.