By Institute of Atmospheric Physics, Chinese Academy of Sciences
The optimal fingerprinting method (OFM) has been used extensively to detect and attribute the effects of climate change. A recent analysis of the various assumptions used by OFM has revealed several ways the method could be improved to enhance its usefulness and accuracy, particularly in the assessment of climate tipping points.
Earth’s climate system is complex, and understanding the dynamics and causes of climate change is equally complex. Climate scientists have leveraged the fingerprinting method to determine which climate signals are natural and which are man-made. In general terms, the OFM searches existing climate records for climate change patterns, or fingerprints, based on computer modeling. The OFM works under the assumption that different variables that influence climate change have different fingerprints, helping climatologists establish whether a particular climate phenomenon is due to natural causes or human influence.
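At its core, optimal fingerprinting is a regression of observations onto model-derived fingerprint patterns: a signal is "detected" when its estimated scaling factor is significantly greater than zero, and "attributed" when the factor is consistent with one. The toy sketch below illustrates that idea with synthetic data; the fingerprint patterns, scaling values, and noise level are all invented for illustration, and the full method uses generalized least squares weighted by an internal-variability covariance estimate rather than the ordinary least squares shown here.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100  # number of spatial/temporal data points (illustrative)

# Hypothetical model-simulated fingerprints, e.g. a greenhouse-gas
# warming pattern and an aerosol pattern (shapes are made up).
x_ghg = np.linspace(0.0, 1.0, n)
x_aer = np.sin(np.linspace(0.0, np.pi, n))

# Synthetic "observations": true scaling factors 1.0 and 0.5,
# plus noise standing in for internal climate variability.
y = 1.0 * x_ghg + 0.5 * x_aer + rng.normal(0.0, 0.05, n)

# Estimate the scaling factors by least squares; the real OFM
# additionally weights by the internal-variability covariance.
X = np.column_stack([x_ghg, x_aer])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(beta)  # estimates should land near the true values [1.0, 0.5]
```

The point of the sketch is only that the fingerprints themselves come from model simulations, so the quality of the regression can never exceed the quality of the modeled patterns, which is exactly the dependence Lu highlights.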
Fingerprinting methods depend on assumptions and statistics. Unfortunately, not all assumptions are correct all of the time, which affects the overall accuracy of a particular method. To address this issue, Jianhua Lu, a professor at Sun Yat-sen University (SYSU) and the Southern Marine Science and Engineering Guangdong Laboratory in Zhuhai, China, carefully analyzed the linearity, non-interaction, and stationary-variability assumptions made by the OFM.
Lu published his study on July 3 in Advances in Atmospheric Sciences.
“I set out to defend the Optimal Fingerprinting Method (OFM) as one of the major methods in the detection and attribution (D&A) of anthropogenic climate change… while pointing out its potential limitations. I intend in my paper to bring the importance of physical reasoning to the attention of researchers when they apply OFM to the D&A of extreme weather and climate events,” said Lu.
While the OFM is based on existing climate data, the fingerprints generated by the method can’t be directly observed and instead exist only in computer simulations. Additionally, the OFM depends on how well the modeling approximates the Earth’s climate system. Lu argues that the accuracy of this method can’t and shouldn’t be determined on a statistical basis alone. Rather, model efficacy should be considered from the perspective of climate system dynamics and physics.
Specifically, the linearity assumption of the OFM generally works well to approximate the global climate response to external forcing, including the effect of greenhouse gases. The combined effect of greenhouse gases and aerosols, however, and the feedbacks from clouds, water vapor, and sea ice on the balance of incoming and outgoing radiation, may not be linear.
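The linearity assumption can be stated as a simple additivity check: the modeled response to all forcings combined should equal the sum of the single-forcing responses. The sketch below uses made-up numbers, not results from Lu's paper, purely to show what a violation of additivity looks like.

```python
import numpy as np

# Hypothetical single-forcing and all-forcing temperature responses (K)
# at three illustrative grid points -- values are invented.
resp_ghg = np.array([0.8, 1.0, 1.2])     # greenhouse gases only
resp_aer = np.array([-0.3, -0.4, -0.5])  # aerosols only
resp_all = np.array([0.5, 0.7, 0.6])     # all forcings together

# Under perfect linearity this residual would be zero everywhere;
# nonzero values indicate nonlinear interaction between forcings,
# e.g. through cloud, water-vapor, or sea-ice feedbacks.
residual = resp_all - (resp_ghg + resp_aer)
print(residual)
```

In this invented example the residual vanishes at the first point but not at the other two, which is the kind of regional nonlinearity that would undermine a purely linear fingerprint decomposition.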
Additionally, the non-interaction assumption of OFM assumes that internal variability, or the natural climate variations that occur year-to-year or over centuries, does not interact with climate change. While this assumption may be reasonable on a global scale, the non-interaction assumption does not hold up on a regional or even continental scale. Lu urges caution when applying OFM on smaller, rather than global, scales.
Lastly, the stationary-variability assumption of the OFM holds that the Earth’s climate is in an equilibrium state. While this assumption can help simplify theoretical explanations, the climate system is in fact a chaotic, complex system whose internal variability can occur irregularly on the scale of years, decades, and centuries. Understanding and properly accounting for this variability is particularly important for the assessment or prediction of various climate change tipping points that influence global climate policy.
Experts in climate science agree that current assessment methods have limitations. “As the focus has shifted to assessing impacts of climate change at regional scales and on climate extremes, there is a need to develop appropriate techniques to robustly detect the impact of climate change. Jianhua Lu’s piece nicely summarizes the key issues and suggests that theoretical and dynamical approaches can be used to extend the classical optimal fingerprint methods for this purpose,” said Noel Keenlyside, professor at the Geophysical Institute at the University of Bergen and the Bjerknes Centre for Climate Research in Bergen, Norway.
Lu is confident that an improved understanding of the physics underlying anthropogenic climate change and internal variability will improve the detection and attribution of specific variables contributing to climate change. “I hope to witness the emergence of novel D&A methods that are able to combine the merits of OFM and dynamical/physical reasoning. Furthermore, these new D&A methods will effectively guide the global community and their climate actions,” said Lu.
More information: Jianhua Lu, ‘Improving Optimal Fingerprinting Methods Requires a Viewpoint beyond Statistical Science’, Advances in Atmospheric Sciences (2024). DOI: 10.1007/s00376-024-4175-x. Featured image credit: NASA | Unsplash