The world is striving to reach net-zero emissions as we try to ward off dangerous global warming. But will getting to net-zero actually avert climate instability, as many assume?
Our new study examined that question. Alarmingly, we found reaching net-zero in the next few decades will not bring an immediate end to the global heating problem. Earth’s climate will change for many centuries to come.
And this continuing climate change will not be evenly spread. Australia would keep warming more than almost any other land area. For example, if net-zero emissions are reached by 2060, the Australian city of Melbourne is still predicted to warm by 1°C after that point.
But that’s not to say the world shouldn’t push to reach net-zero emissions as quickly as possible. The sooner we get there, the less damaging change the planet will experience in the long run.
Analysis suggests emissions may peak in the next couple of years then start to fall. But as long as emissions remain substantial, the planet will keep warming.
Most of the world’s nations, including Australia, have signed up to the Paris climate agreement. The deal aims to keep global warming well below 2°C, and requires major emitters to reach net-zero as soon as possible. Australia, along with many other nations, is aiming to reach the goal by 2050.
Getting to net-zero essentially means nations must reduce human-caused greenhouse gas emissions as much as possible, and compensate for remaining emissions by removing greenhouse gases from the atmosphere elsewhere. Methods for doing this include planting additional vegetation to draw down and store carbon, or using technology to suck carbon out of the air.
Getting to net-zero is widely considered the point at which global warming will stop. But is that assumption correct? And does it mean warming would stop everywhere across the planet? Our research sought to find out.
To answer these questions, we ran experiments with a climate model. Such models are like lab experiments for climate scientists to test ideas. Models are fed with information about greenhouse gas emissions. They then use equations to predict how those emissions would affect the movement of air and the ocean, and the transfer of carbon and heat, across Earth over time.
We wanted to see what would happen once the world hit net-zero carbon dioxide at various points in time, and maintained it for 1,000 years.
We ran seven simulations from different start points in the 21st century, at five-year increments from 2030 to 2060. These staggered simulations allowed us to measure the effect of various delays in reaching net-zero.
We found Earth’s climate would continue to evolve under all simulations, even if net-zero emissions was maintained for 1,000 years. But importantly, the later net-zero is reached, the larger the climate changes Earth would experience.
We found the global average temperature would continue to rise slowly under net-zero emissions – albeit at a much slower rate than we see today. Most warming would take place on the ocean surface; average temperature on land would only change a little.
We also looked at temperatures below the ocean surface. There, the ocean would warm strongly even under net-zero emissions – and this continues for many centuries. This is because seawater absorbs a lot of energy before warming up, which means some ocean warming is inevitable even after emissions fall.
Over the last few decades of high greenhouse gas emissions, sea ice extent fell in the Arctic – and more recently, around Antarctica. Under net-zero emissions, we anticipate Arctic sea ice extent would stabilise but not recover.
In contrast, Antarctic sea ice extent is projected to fall under net-zero emissions for many centuries. This is associated with continued slow warming of the Southern Ocean around Antarctica.
Importantly, we found long-term impacts on the climate worsen the later we reach net-zero emissions. Even just a five-year delay would affect the projected climate 1,000 years later.
Delaying net-zero by five years results in a higher global average surface temperature, a much warmer ocean and reduced sea ice extent for many centuries.
Australia’s evolving climate
The effect on the climate of reaching net-zero emissions differs across the world.
For example, Australia is close to the Southern Ocean, which is projected to continue warming for many centuries even under net-zero emissions. This warming to Australia’s south means even under a net-zero emissions pathway, we expect the continent to continue to warm more than almost all other land areas on Earth.
For example, the models predict Melbourne would experience 1°C of warming over centuries if net-zero was reached in 2060.
Net-zero would also lead to changes in rainfall in Australia. Winter rainfall across the continent would increase – a trend in contrast to drying currently underway in parts of Australia, particularly in the southwest and southeast.
Knowns and unknowns
There is much more to discover about how the climate might behave under net-zero.
But our analysis provides some clues about what climate changes to expect if humanity struggles to achieve large-scale “net-negative” emissions – that is, removing carbon from the atmosphere at a greater rate than it is emitted.
Experiments with more models will help improve scientists’ understanding of climate change beyond net-zero emissions. These simulations may include scenarios in which carbon removal methods are so successful, Earth actually cools and some climate changes are reversed.
Despite the unknowns, one thing is very clear: there is a pressing need to push for net-zero emissions as fast as possible.
Atmospheric rivers – those long, narrow bands of water vapor in the sky that bring heavy rain and storms to the U.S. West Coast and many other regions – are shifting toward higher latitudes, and that’s changing weather patterns around the world.
The shift is worsening droughts in some regions, intensifying flooding in others, and putting water resources that many communities rely on at risk. When atmospheric rivers reach far northward into the Arctic, they can also melt sea ice, affecting the global climate.
In a new study published in Science Advances, University of California, Santa Barbara, climate scientist Qinghua Ding and I show that atmospheric rivers have shifted about 6 to 10 degrees toward the two poles over the past four decades.
California relies on atmospheric rivers for up to 50% of its yearly rainfall. A series of winter atmospheric rivers there can bring enough rain and snow to end a drought, as parts of the region saw in 2023.
While atmospheric rivers share a similar origin – moisture supply from the tropics – atmospheric instability of the jet stream allows them to curve poleward in different ways. No two atmospheric rivers are exactly alike.
What particularly interests climate scientists, including us, is the collective behavior of atmospheric rivers. Atmospheric rivers are commonly seen in the extratropics, a region between the latitudes of 30 and 50 degrees in both hemispheres that includes most of the continental U.S., southern Australia and Chile.
Our study shows that atmospheric rivers have been shifting poleward over the past four decades. In both hemispheres, activity has increased along 50 degrees north and 50 degrees south, while it has decreased along 30 degrees north and 30 degrees south since 1979. In North America, that means more atmospheric rivers drenching British Columbia and Alaska.
The poleward movement of atmospheric rivers can be explained as a chain of interconnected processes.
During La Niña conditions, when sea surface temperatures cool in the eastern tropical Pacific, the Walker circulation – giant loops of air that affect precipitation as they rise and fall over different parts of the tropics – strengthens over the western Pacific. This stronger circulation causes the tropical rainfall belt to expand. The expanded tropical rainfall, combined with changes in atmospheric eddy patterns, results in high-pressure anomalies and wind patterns that steer atmospheric rivers farther poleward.
Conversely, during El Niño conditions, with warmer sea surface temperatures, the mechanism operates in the opposite direction, shifting atmospheric rivers so they don’t travel as far from the equator.
The shifts raise important questions about how climate models predict future changes in atmospheric rivers. Current models might underestimate natural variability, such as changes in the tropical Pacific, which can significantly affect atmospheric rivers. Understanding this connection can help forecasters make better predictions about future rainfall patterns and water availability.
Why does this poleward shift matter?
A shift in atmospheric rivers can have big effects on local climates.
In the subtropics, where atmospheric rivers are becoming less common, the result could be longer droughts and less water. Many areas, such as California and southern Brazil, depend on atmospheric rivers for rainfall to fill reservoirs and support farming. Without this moisture, these areas could face more water shortages, putting stress on communities, farms and ecosystems.
In higher latitudes, atmospheric rivers moving poleward could lead to more extreme rainfall, flooding and landslides in places such as the U.S. Pacific Northwest, Europe, and even in polar regions.
In the Arctic, more atmospheric rivers could speed up sea ice melting, adding to global warming and affecting animals that rely on the ice. An earlier study I was involved in found that the trend in summertime atmospheric river activity may contribute 36% of the increasing trend in summer moisture over the entire Arctic since 1979.
What it means for the future
So far, the shifts we have seen still mainly reflect changes due to natural processes, but human-induced global warming also plays a role. Global warming is expected to increase the overall frequency and intensity of atmospheric rivers because a warmer atmosphere can hold more moisture.
How that might change as the planet continues to warm is less clear. Predicting future changes remains uncertain due largely to the difficulty in predicting the natural swings between El Niño and La Niña, which play an important role in atmospheric river shifts.
As the world gets warmer, atmospheric rivers – and the critical rains they bring – will keep changing course. We need to understand and adapt to these changes so communities can keep thriving in a changing climate.
Record-breaking fossil fuel production, all-time-high greenhouse gas emissions and extreme temperatures. Like the proverbial frog in the heating pan of water, we refuse to respond to the climate and ecological crisis with any sense of urgency. Under such circumstances, claims from some that global warming can still be limited to no more than 1.5°C take on a surreal quality.
For example, at the start of 2023’s international climate negotiations in Dubai, conference president, Sultan Al Jaber, boldly stated that 1.5°C was his goal and that his presidency would be guided by a “deep sense of urgency” to limit global temperatures to 1.5°C. He made such lofty promises while planning a massive increase in oil and gas production as CEO of the Abu Dhabi National Oil Company.
We should not be surprised to see such behaviour from the head of a fossil fuel company. But Al Jaber is not an outlier. Scratch at the surface of almost any net zero pledge or policy that claims to be aligned with the 1.5°C goal of the landmark 2015 Paris agreement and you will reveal the same sort of reasoning: we can avoid dangerous climate change without actually doing what this demands – which is to rapidly reduce greenhouse gas emissions from industry, transport, energy (70% of total) and food systems (30% of total), while ramping up energy efficiency.
This is also not surprising given that net zero and even the Paris agreement have been built around the perceived need to keep burning fossil fuels, at least in the short term. Not doing so would threaten economic growth, given that fossil fuels still supply over 80% of total global energy. The trillions of dollars of fossil fuel assets at risk with rapid decarbonisation have also served as powerful brakes on climate action.
Overshoot
The way to understand this doublethink – that we can avoid dangerous climate change while continuing to burn fossil fuels – is that it relies on the concept of overshoot. The promise is that we can overshoot past any amount of warming, with the deployment of planetary-scale carbon dioxide removal dragging temperatures back down by the end of the century.
This not only cripples any attempt to limit warming to 1.5°C, but risks catastrophic levels of climate change as it locks us in to energy and material-intensive solutions which for the most part exist only on paper.
To argue that we can safely overshoot 1.5°C, or any amount of warming, is saying the quiet bit out loud: we simply don’t care about the increasing amount of suffering and deaths that will be caused while the recovery is worked on.
A key element of overshoot is carbon dioxide removal. This is essentially a time machine – we are told we can turn back the clock of decades of delay by sucking carbon dioxide directly out of the atmosphere. We don’t need rapid decarbonisation now, because in the future we will be able to take back those carbon emissions. If or when that doesn’t work, we are led to believe that even more outlandish geoengineering approaches such as spraying sulphurous compounds into the high atmosphere in an attempt to block out sunlight – which amounts to planetary refrigeration – will save us.
The 2015 Paris agreement was an astonishing accomplishment. The establishment of 1.5°C as being the internationally agreed ceiling for warming was a success for those people and nations most exposed to climate change hazards. We know that every fraction of a degree matters. But at the time, believing warming could really be limited to well below 2°C required a leap of faith when it came to nations and companies putting their shoulder to the wheel of decarbonisation. What has happened instead is that the net zero approach of Paris is becoming detached from reality as it is increasingly relying on science fiction levels of speculative technology.
There is arguably an even bigger problem with the Paris agreement. By framing climate change in terms of temperature, it focuses on the symptoms, not the cause. 1.5°C or any amount of warming is the result of humans changing the energy balance of the climate by increasing the amount of carbon dioxide in the atmosphere. This traps more heat. Changes in the global average temperature is the established way of measuring this increase in heat, but no one experiences this average.
Climate change is dangerous because of weather that affects particular places at particular times. Simply put, this extra heat is making weather more unstable. Unfortunately, having temperature targets makes solar geoengineering seem like a sensible approach because it may lower temperatures. But it does this by not reducing, but increasing our interference in the climate system. Trying to block out the sun in response to increasing carbon emissions is like turning on the air conditioning in response to a house fire.
In 2021 we argued that net zero was a dangerous trap. Three years on and we can see the jaws of this trap beginning to close, with climate policy being increasingly framed in terms of overshoot. The resulting impacts on food and water security, poverty, human health, the destruction of biodiversity and ecosystems will produce intolerable suffering.
The situation demands honesty, and a change of course. If this does not materialise then things are likely to deteriorate, potentially rapidly and in ways that may be impossible to control.
Au revoir Paris
The time has come to accept that climate policy has failed, and that the 2015 landmark Paris agreement is dead. We let it die by pretending that we could both continue to burn fossil fuels and avoid dangerous climate change at the same time. Rather than demand the immediate phase out of fossil fuels, the Paris agreement proposed 22nd-century temperature targets which could be met by balancing the sources and sinks of carbon. Within that ambiguity net zero flourished. And yet apart from the COVID economic shock in 2020, emissions have increased every year since 2015, reaching an all time high in 2023.
Despite there being abundant evidence that climate action makes good economic sense (the cost of inaction vastly exceeds the cost of action), no country strengthened their pledges at the last three COPs (the annual UN international meetings) even though it was clear that the world was on course to sail past 2°C, let alone 1.5°C. The Paris agreement should be producing a 50% reduction in greenhouse gas emissions by 2030, but current policies mean that they are on track to be higher than they are today.
We do not deny that significant progress has been made with renewable technologies. Rates of deployment of wind and solar have increased each year for the past 22 years and carbon emissions are going down in some of the richest nations, including the UK and the US. But this is not happening fast enough. A central element of the Paris agreement is that richer nations need to lead decarbonisation efforts to give lower income nations more time to transition away from fossil fuels. Despite some claims to the contrary, the global energy transition is not in full swing. In fact, it hasn’t actually begun because the transition demands a reduction in fossil fuel use. Instead it continues to increase year-on-year.
And so policymakers are turning to overshoot in an attempt to claim that they have a plan to avoid dangerous climate change. A central plank of this approach is that the climate system in the future will continue to function as it does today. This is a reckless assumption.
2023’s warning signs
At the start of 2023, Berkeley Earth, NASA, the UK Met Office, and Carbon Brief predicted that 2023 would be slightly warmer than the previous year but unlikely to set any records. Twelve months later and all four organisations concluded that 2023 was by some distance the warmest year ever recorded. In fact, between February 2023 and February 2024 the global average warming exceeded the Paris target of 1.5°C.
Currently we cannot fully explain why global temperatures have been so high for the past 18 months. Changes in dust, soot and other aerosols are important, and there are natural processes such as El Niño that will be having an effect.
But it appears that there is still something missing in our current understanding of how the climate is responding to human impacts. This includes changes in the Earth’s vital natural carbon cycle.
Around half of all the carbon dioxide humans have put into the atmosphere over the whole of human history has gone into “carbon sinks” on land and the oceans. We get this carbon removal “for free”, and without it, warming would be much higher. Carbon dioxide from the air dissolves in the oceans (making them more acidic, which threatens marine ecosystems). At the same time, increasing carbon dioxide promotes the growth of plants and trees, which locks up carbon in their leaves, roots and trunks.
All climate policies and scenarios assume that these natural carbon sinks will continue to remove tens of billions of tons of carbon from the atmosphere each year. There is evidence that land-based carbon sinks, such as forests, removed significantly less carbon in 2023. If natural sinks begin to fail – something they may well do in a warmer world – then the task of lowering global temperatures becomes even harder. The only credible way of limiting warming to any amount, is to stop putting greenhouse gasses into the atmosphere in the first place.
Science fiction solutions
It’s clear that the commitments countries have made to date as part of the Paris agreement will not keep humanity safe while carbon emissions and temperatures continue to break records. Indeed, proposing to spend trillions of dollars over this century to suck carbon dioxide out of the air, or the myriad other ways to hack the climate is an acknowledgement that the world’s largest polluters are not going to curb the burning of fossil fuels.
Over the coming years we are going to see climate impacts increase. Lethal heatwaves are going to become more common. Storms and floods are going to become increasingly destructive. More people are going to be displaced from their homes. National and regional harvests will fail. Vast sums of money will need to be spent on efforts to adapt to climate change, and perhaps even more on compensating those who are most affected. We are expected to believe that while all this and more unfolds, new technologies that will directly modify the Earth’s atmosphere and energy balance will be successfully deployed.
What’s more, some of these technologies may need to operate for three hundred years in order for the consequences of overshoot to be avoided. Rather than quickly slowing down carbon-polluting activities and increasing the chances that the Earth system will recover, we are instead going all in on net zero and overshoot in an increasingly desperate hope that untested science fiction solutions will save us from climate breakdown.
We can see the cliff edge rapidly approaching. Rather than slam on the brakes, some people are instead pushing their foot down harder on the accelerator. Their justification for this insanity is that we need to go faster in order to be able to make the jump and land safely on the other side.
We believe that many who advocate for carbon dioxide removal and geoengineering do so in good faith. But their proposals include refreezing the Arctic by pumping sea water onto the ice to form new layers of ice and snow. These are interesting ideas to research, but there is very little evidence they would have any effect on the Arctic, let alone the global climate. These are the sorts of knots that people tie themselves up in when they acknowledge the failure of climate policy, but refuse to challenge the fundamental forces behind such failure. They are unwittingly slowing down the only effective action: rapidly phasing out fossil fuels.
That’s because proposals to remove carbon dioxide from the air or geoengineer the climate promise a recovery from overshoot, a recovery that will be delivered by innovation, driven by growth. That this growth is powered by the same fossil fuels that are causing the problem in the first place doesn’t feature in their analysis.
The bottom line here is that the climate system is utterly indifferent to our pledges and promises. It doesn’t care about economic growth. And if we carry on burning fossil fuels then it will not stop changing until the energy balance is restored. By which time millions of people could be dead, with many more facing intolerable suffering.
Major climate tipping points
Even if we assume that carbon removal and even geoengineering technologies can be deployed in time, there is a very large problem with the plan to overshoot 1.5°C and then lower temperatures later: tipping points.
The science of tipping points is rapidly advancing. Late last year one of us (James Dyke) along with over 200 academics from around the world was involved in the production of the Global Tipping Points Report. This was a review of the latest science about where tipping points in the climate system may be, as well as exploring how social systems can undertake rapid change (in the direction that we want) thereby producing positive tipping points. Within the report’s 350 pages is abundant evidence that the overshoot approach is an extraordinarily dangerous gamble with the future of humanity. Some tipping points have the potential to cause global havoc.
The melt of permafrost could release billions of tons of greenhouse gasses into the atmosphere and supercharge human-caused climate change. Fortunately, this seems unlikely under the current warming. Unfortunately, the chance that ocean currents in the North Atlantic could collapse may be much higher than previously thought. If that were to materialise, weather systems across the world, but in particular in Europe and North America, would be thrown into chaos. Beyond 1.5°C, warm water coral reefs are heading towards annihilation. The latest science concludes that by 2°C global reefs would be reduced by 99%. The devastating bleaching event unfolding across the Great Barrier Reef follows multiple mass mortality events. To say we are witnessing one of the world’s greatest biological wonders die is insufficient. We are knowingly killing it.
We may have even already passed some major climate tipping points. The Earth has two great ice sheets: Antarctica and Greenland. Both are disappearing as a consequence of climate change. Between 2016 and 2020, the Greenland ice sheet lost on average 372 billion tons of ice a year. The current best assessment of when a tipping point could be reached for the Greenland ice sheet is around 1.5°C.
This does not mean that the Greenland ice sheet will suddenly collapse if warming exceeds that level. There is so much ice (some 2,800 trillion tons) that it would take centuries for all of it to melt over which time sea levels would rise seven metres. If global temperatures could be brought back down after a tipping point, then maybe the ice sheet could be stabilised. We just cannot say with any certainty that such a recovery would be possible. While we struggle with the science, 30 million tons of ice is melting across Greenland every hour on average.
The take home message from research on these and other tipping points is that further warming accelerates us towards catastrophe. Important science, but is anyone listening?
It’s five minutes to midnight…again
We know we must urgently act on climate change because we are repeatedly told that time is running out. In 2015, Professor Jeffrey Sachs, the UN special adviser and director of The Earth Institute, declared:
The time has finally arrived – we’ve been talking about these six months for many years but we’re now here. This is certainly our generation’s best chance to get on track.
In 2019 (then) Prince Charles gave a speech in which he said: “I am firmly of the view that the next 18 months will decide our ability to keep climate change to survivable levels and to restore nature to the equilibrium we need for our survival.”
“We have six months to save the planet,” exhorted International Energy Agency head Fatih Birol – one year later in 2020. In April 2024, Simon Stiell, executive secretary of the United Nations Framework Convention on Climate Change said the next two years are “essential in saving our planet”.
Either the climate crisis has a very fortunate feature that allows the countdown to catastrophe to be continually reset, or we are deluding ourselves with endless declarations that time has not quite run out. If you can repeatedly hit snooze on your alarm clock and roll over back to sleep, then your alarm clock is not working.
Or there is another possibility. Stressing that we have very little time to act is intended to focus attention on climate negotiations. It’s part of a wider attempt to not just wake people up to the impending crisis, but generate effective action. This is sometimes used to explain how the 1.5°C threshold of warming came to be agreed. Rather than a specific target, it should be understood as a stretch goal. We may very well fail, but in reaching for it we move much faster than we would have done with a higher target, such as 2°C. For example, consider this statement made in 2018:
Stretching the goal to 1.5 degrees celsius isn’t simply about speeding up. Rather, something else must happen and society needs to find another lever to pull on a global scale.
What could this lever be? New thinking about economics that goes beyond GDP? Serious consideration of how rich industrialised nations could financially and materially help poorer nations to leapfrog fossil fuel infrastructure? Participatory democracy approaches that could help birth the radical new politics needed for the restructuring of our fossil fuel powered societies? None of these.
The lever in question is Carbon Capture and Storage (CCS) because the above quote comes from an article written by Shell in 2018. In this advertorial Shell argues that we will need fossil fuels for many decades to come. CCS allows the promise that we can continue to burn fossil fuels and avoid carbon dioxide pollution by trapping the gas before it leaves the chimney. Back in 2018, Shell was promoting its carbon removal and offsets heavy Sky Scenario, an approach described as “a dangerous fantasy” by leading climate change academics as it assumed massive carbon emissions could be offset by tree planting.
Shell is far from alone in waving carbon capture magic wands. Exxon is making great claims for CCS as a way to produce net zero hydrogen from fossil gas – claims that have been subject to pointed criticism from academics with recent reporting exposing industry wide greenwashing around CCS.
But the rot goes much deeper. All climate policy scenarios that propose to limit warming to near 1.5°C rely on the largely unproven technologies of CCS and bioenergy with carbon capture and storage (BECCS). BECCS sounds like a good idea in theory. Rather than burn coal in a power station, burn biomass such as wood chips. This would initially be a carbon neutral way of generating electricity if you grew as many trees as you cut down and burnt. If you then add scrubbers to the power station chimneys to capture the carbon dioxide, and then bury that carbon deep underground, then you would be able to generate power at the same time as reducing concentrations of carbon dioxide in the atmosphere.
Unfortunately, there is now clear evidence that in practice, large-scale BECCS would have very adverse effects on biodiversity, and food and water security, given the large amounts of land that would be given over to fast-growing monoculture tree plantations. The burning of biomass may even be increasing carbon dioxide emissions. Drax, the UK’s largest biomass power station, now produces four times as much carbon dioxide as the UK’s largest coal-fired power station.
Five minutes to midnight messages may be motivated to try to galvanise action, to stress the urgency of the situation and that we still (just) have time. But time for what? Climate policy only ever offers gradual change, certainly nothing that would threaten economic growth, or the redistribution of wealth and resources.
Despite the mounting evidence that globalised, industrialised capitalism is propelling humanity towards disaster, five minutes to midnight does not allow time and space to seriously consider alternatives. Instead, the solutions on offer are techno fixes that prop up the status quo and insist that fossil fuel companies such as Shell must be part of the solution.
That is not to say there are no good faith arguments for 1.5°C. But being well motivated does not alter reality. And the reality is that warming will soon pass 1.5°C, and that the Paris agreement has failed. In the light of that, repeatedly asking people to not give up hope, that we can avoid a now unavoidable outcome risks becoming counterproductive. Because if you insist on the impossible (burning fossil fuels and avoiding dangerous climate change), then you must invoke miracles. And there is an entire fossil fuel industry quite desperate to sell such miracles in the form of CCS.
Four suggestions
Humanity has enough problems right now; what we need are solutions. This is the response we sometimes get when we argue that there are fundamental problems with the net zero concept and the Paris agreement. It can be summed up with the simple question: so what’s your suggestion? Below we offer four.
1. Leave fossil fuels in the ground
The unavoidable reality is that we need to rapidly stop burning fossil fuels. The only way we can be sure of that is by leaving them in the ground. We have to stop exploring for new fossil fuel reserves and the exploitation of existing ones. That could be done by stopping fossil fuel financing.
At the same time we must transform the food system, especially the livestock sector, given that it is responsible for nearly two thirds of agricultural emissions. Start there and then work out how best the goods and services of economies can be distributed. Let’s have arguments about that based on reality not wishful thinking.
2. Ditch net zero crystal ball gazing targets
The entire framing of mid and end-century net zero targets should be binned. We are already in the danger zone. The situation demands immediate action, not promises of balancing carbon budgets decades into the future. The Science Based Targets initiative (SBTi) should focus on near-term emissions reductions. By 2030, global emissions need to be half of what they are today for any chance of limiting warming to no more than 2°C.
It is the responsibility of those who hold most power – politicians and business leaders – to act now. To that end we must demand twin targets – all net zero plans should include a separate target for actual reductions in greenhouse gas emissions. We must stop hiding inaction behind promises of future removals. It’s our children and future generations that will need to pay back the overshoot debt.
3. Base policy on credible science and engineering
All climate policies must be based on what can be done in the real world now, or in the very near future. If it is established that a credible amount of carbon can be removed by a proposed approach – which includes capture and its safe permanent storage – then and only then can this be included in net zero plans. The same applies to solar geoengineering.
Speculative technologies must be removed from all policies, pledges and scenarios until we are sure of how they will work, how they will be monitored, reported and validated, and what they will do to not just the climate but the Earth system as a whole. This would probably require a very large increase in research. As academics we like doing research. But academics need to be wary that concluding “needs more research” is not interpreted as “with a bit more funding this could work”.
4. Get real
Finally, around the world there are thousands of groups, projects, initiatives, and collectives that are working towards climate justice. But while there is a Climate Majority Project, and a Climate Reality Project, there is no Climate Honesty Project (although People Get Real does come close). In 2018 Extinction Rebellion was formed and demanded that governments tell the truth about the climate crisis and act accordingly. We can now see that when politicians were making their net zero promises they were also crossing their fingers behind their backs.
We need to acknowledge that net zero and now overshoot are being used to argue that nothing fundamental needs to change in our energy-intensive societies. We must be honest about our current situation, and where we are heading. Difficult truths need to be told. This includes highlighting the vast inequalities of wealth, carbon emissions, and vulnerability to climate change.
The time for action is now
We rightly blame politicians for failing to act. But in some respects we get the politicians we deserve. Most people, even those that care about climate change, continue to demand cheap energy and food, and a constant supply of consumer products. Reducing demand by just making things more expensive risks plunging people into food and energy poverty and so policies to reduce emissions from consumption need to go beyond market-based approaches. The cost of living crisis is not separate from the climate and ecological crisis. They demand that we radically rethink how our economies and societies function, and whose interests they serve.
To return to the boiling frog predicament at the start, it’s high time for us to jump out of the pot. You have to wonder why we did not start decades ago. It’s here that the analogy offers valuable insights into net zero and the Paris agreement. Because the boiling frog story as typically told misses out a crucial fact. Regular frogs are not stupid. While they will happily sit in slowly warming water, they will attempt to escape once it becomes uncomfortable. The parable as told today is based on experiments at the end of the 19th century that involved frogs that had been “pithed” – a metal rod had been inserted into their skulls that destroyed their higher brain functioning. These radically lobotomised frogs would indeed float inert in water that was cooking them alive.
Promises of net zero and recovery from overshoot are keeping us from struggling to safety. They assure us nothing too drastic needs to happen just yet. Be patient, relax. Meanwhile the planet burns and we see any sort of sustainable future go up in smoke.
Owning up to the failures of climate change policy doesn’t mean giving up. It means accepting the consequences of getting things wrong, and not making the same mistakes. We must plan routes to safe and just futures from where we are, rather than where we would wish to be. The time has come to leap.
After a series of natural disasters – from the Canterbury earthquakes to Cyclone Gabrielle – real doubt hangs over the insurance options available to some New Zealand homeowners.
Increasingly, homes in certain areas are becoming uninsurable – or difficult to insure, at least. Insurers have decided the risk is too high to make covering it financially viable, leaving affected homeowners vulnerable.
The question of how insurers can continue to offer policies – all the while managing the growing risk from natural disasters – is becoming hard to ignore.
Insurers will have to explore alternative models and innovate if New Zealand is to adapt to future change.
Cautious insurers
There’s no general requirement in New Zealand that insurers cover anyone’s home, or that anyone’s home actually be insured.
Body-corporate groups are one exception. They must insure the units they manage. Mortgage lenders can also require borrowers to take out home insurance as part of their lending conditions.
When homeowners do get insurance, the risk of certain losses from natural disasters is automatically covered by the Natural Hazards Commission (previously known as the Earthquake Commission).
Even if a home insurance policy were to contain wording that, on the face of it, excluded this public natural-disaster cover, the law would treat the cover as included. At the same time, payouts are only managed by insurers, not financed by them.
The Canterbury earthquakes cost insurers NZ$21 billion and the Natural Hazards Commission $10 billion. And the risk of natural disasters more generally may be making insurers too cautious. They’re increasingly pulling out of areas they consider “high risk”.
That said, there are changes on the horizon. From mid-2025, insurers will have a general duty to “treat consumers fairly”. The Financial Markets Authority – the body responsible for enforcing financial-markets law – may potentially regard refusing home insurance to any consumer as a breach of the duty.
In other words, the Financial Markets Authority may end up forcing insurers to cover most of the country’s homes.
New insurance options
Future-proofing home insurance options will depend on the public and private sectors working together.
Many of the potential solutions are specific to how insurers take risk on. An insurer may decrease your premiums as an incentive for you to “disaster-proof” your home. If you don’t, the insurer may increase your premiums and limit its payouts to you, with individualised excesses or caps.
The insurer may even offer “parametric” insurance, which pays out less than traditional insurance, but faster.
For example, imagine a home insurance policy that covers any earthquake having its epicentre within 500 kilometres of your home, and measuring magnitude six or higher.
A traditional policy would pay out based on how much loss was caused (according to a loss adjuster). A parametric policy would simply pay out a small, pre‑agreed sum, based on the fact the earthquake occurred at all.
A parametric policy wouldn’t require you to prove any actual “loss” – beyond the inconvenience of having your home in the disaster zone.
While parametric insurance is relatively new worldwide, it’s an efficient solution for managing the risk of natural-disaster damage.
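To make the contrast concrete, here is a minimal sketch in Python of how a parametric trigger like the hypothetical earthquake policy above could be evaluated. All names and figures are illustrative assumptions, not any insurer’s actual product; the point is simply that the payout depends on whether the pre-agreed conditions are met, not on an assessed loss.

```python
from math import radians, sin, cos, asin, sqrt

def distance_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points in kilometres (haversine formula)."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))

def parametric_payout(quake_magnitude, quake_lat, quake_lon,
                      home_lat, home_lon,
                      trigger_magnitude=6.0, trigger_radius_km=500,
                      agreed_payout=20_000):
    """Hypothetical parametric policy: pay a fixed, pre-agreed sum if an earthquake
    of at least the trigger magnitude has its epicentre within the trigger radius
    of the insured home. No loss adjustment is involved."""
    close_enough = distance_km(quake_lat, quake_lon, home_lat, home_lon) <= trigger_radius_km
    strong_enough = quake_magnitude >= trigger_magnitude
    return agreed_payout if (close_enough and strong_enough) else 0

# Example: a magnitude 6.3 quake with an epicentre a few kilometres from the insured
# home meets both conditions, so the pre-agreed sum is paid.
print(parametric_payout(6.3, -43.58, 172.68, -43.53, 172.63))
```

Real parametric products define their triggers much more carefully (official magnitude and location sources, tiered payouts), but the defining feature is the same: no loss adjuster is needed, which is why claims can be settled quickly.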
Reinsurance, co-insurance and ‘cat bonds’
An insurer may also transfer risk to one or more other insurance businesses – such as a “reinsurer”. If the insurer has to make a payout to you for a claim, the reinsurer then has to make a payout to the insurer for a portion of it.
The insurer may even “co‑insure” the risk. Co‑insurance is where two or more insurers cover different portions of the same risk. So, if you have your home co‑insured, you will have two or more insurers, each responsible for a portion of any claim.
Then there is the potential to transfer insured risk to entities that aren’t even insurance businesses. In some countries (such as Bermuda, the Cayman Islands and Ireland), the insurer can turn the risk into a “catastrophe bond” (also known as a “cat bond”).
Under a cat bond, the insurer arranges for expert investors to lend it capital in return for interest on the loans. The insurer eventually repays the capital, unless there is a specific natural disaster. In that case, the insurer keeps the capital, enabling it to pay out to the affected customers.
The insurer may even use the cat bond to create a “virtuous cycle”. More specifically, the insurer may reinvest the capital in “a project that reduces or prevents loss from the insured climate-related risk” (such as flooding).
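As an illustration of the cat bond mechanics described above, here is a simplified sketch with hypothetical figures: a single bond, annual coupons and one possible trigger event.

```python
def cat_bond_cash_flows(principal, annual_coupon_rate, term_years, disaster_year=None):
    """Simplified, hypothetical catastrophe bond.

    Investors lend `principal` and receive a coupon each year. If a qualifying
    disaster occurs, coupons stop and the insurer keeps the remaining capital to
    pay claims; otherwise the principal is repaid at maturity.
    Returns the list of yearly payments made to investors."""
    payments = []
    for year in range(1, term_years + 1):
        if disaster_year is not None and year >= disaster_year:
            break  # trigger event: insurer retains the capital to pay affected customers
        payments.append(principal * annual_coupon_rate)
    else:
        payments[-1] += principal  # no disaster: principal returned with the final coupon
    return payments

# No disaster over a 3-year term: three coupons, principal returned at the end.
print(cat_bond_cash_flows(1_000_000, 0.08, 3))                    # [80000.0, 80000.0, 1080000.0]
# A qualifying disaster in year 2: only the first coupon is paid.
print(cat_bond_cash_flows(1_000_000, 0.08, 3, disaster_year=2))   # [80000.0]
```

In practice cat bonds are far more complex, with modelled triggers, tranches and collateral accounts, but the core trade is the same: investors earn interest in exchange for bearing disaster risk.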
Disaster-proofing the insurance industry
Key to improving the situation will be the public and private sectors working together to make climate-related disasters less frequent – and less serious when they occur.
The United Nations’ Intergovernmental Panel on Climate Change has advised on how the sectors could minimise climate-related risk. But they also have similar progress to make to minimise the risk of natural-disaster damage more generally, particularly from earthquakes.
It is important to build homes that are better disaster-proofed. And it is also important to address a major problem that many people don’t necessarily view as related to insurance – the cost of housing.
If New Zealanders wishing to own their homes didn’t have to invest as much of their money in housing as they do, the risk of damage to housing might be of less concern. Natural disaster wouldn’t have to mean financial disaster as much as it does today.
In the meantime, innovative insurance options will become more and more necessary.
It feels like we are getting used to the Earth being on fire. Recently, more than 70 wildfires burned simultaneously in Greece. In early 2024, Chile suffered its worst wildfire season in history, with more than 130 people killed. Last year, Canada’s record-breaking wildfires burned from March to November and, in August, flames devastated the island of Maui, in Hawaii. And the list goes on and on.
Watching the news, it certainly feels like catastrophic extreme wildfires are happening more often, and unfortunately this feeling has now been confirmed as correct. A new study published in Nature Ecology & Evolution shows that the number and intensity of the most extreme wildfires on Earth have doubled over the past two decades.
The authors of the new study, researchers at the University of Tasmania, first calculated the energy released by different fires over 21 years from 2003 to 2023. They did this by using a satellite-based sensor which can identify heat from fires, measuring the energy released as “fire radiative power”.
The researchers identified a total of 30 million fires (technically 30 million “fire events”, which can include some clusters of fires grouped together). They then selected the top 2,913 with the most energy released, that is, the 0.01% “most extreme” wildfires. Their work shows that these extreme wildfires are becoming more frequent, with their number doubling over the past two decades. Since 2017, the Earth has experienced the six years with the highest number of extreme wildfires (all years except 2022).
Importantly, these extreme wildfires are also becoming even more intense. Those classified as extreme in recent years released twice the energy of those classified as extreme at the start of the studied period.
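As a rough sketch of that selection step (using made-up numbers and column names, not the study’s actual data), ranking fire events by their summed fire radiative power and keeping only the most energetic ones might look like this in Python:

```python
import pandas as pd

# Hypothetical fire-event table: one row per event, with the year it occurred and
# its total fire radiative power (FRP, in megawatts) summed from satellite detections.
fires = pd.DataFrame({
    "year":   [2003, 2003, 2010, 2017, 2020, 2021, 2023, 2023],
    "frp_mw": [120, 95_000, 310, 150_000, 230_000, 180_000, 410_000, 75],
})

# The study keeps the top 0.01% of ~30 million events (quantile 0.9999);
# with only eight toy events we keep the top 25% so the example returns something.
threshold = fires["frp_mw"].quantile(0.75)
extreme = fires[fires["frp_mw"] >= threshold]

# Count how many "extreme" events fall in each year to examine the trend over time.
print(extreme.groupby("year").size())
```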
These findings align with other recent evidence that wildfires are worsening. For instance, the area of forest burned every year is slightly increasing, leading to a corresponding rise in forest carbon emissions. (The total land area burned each year is actually decreasing, due to a decrease in grassland and cropland fires, but these fires are lower intensity and emit less carbon than forest fires).
Burn severity – an indicator of how badly a fire damages the ecosystem – is also worsening in many regions, and the percentage of burned land affected by high severity burning is increasing globally as well.
Although the global outlook is overall not good, there are striking differences among regions. The new study identifies boreal forests of the far north and temperate conifer forests as the critical ecosystem types driving the global increase in extreme wildfires. They have the highest number of extreme fires relative to their extent and show the most dramatic worsening over time, while also seeing an increase in total burned area and in the percentage burned at high severity. The confluence of these three trends is particularly pronounced in eastern Siberia and in the western US and Canada.
What turns a fire into a catastrophe
Nonetheless, many other regions are also susceptible to fires becoming more consequential, as what turns a fire into a catastrophe depends not only on fire trends but also on the environmental, social and economic context.
For instance, in temperate broadleaf forests around the Mediterranean, there has not been a big change in fire activity and behaviour. But the growing number of houses built in and around wild vegetation in fire-prone areas is a clear example of an action that increases human risk and can lead to catastrophe.
The doubling in extreme wildfires adds to a complex picture of fire patterns and trends. This new evidence underscores the urgency of addressing the root causes behind worsening wildfire activity, such as land cover changes, forest policies and management, and, of course, climate change. This will better prepare us for these extreme fires, which are near-impossible to combat using traditional firefighting methods.
Top image: Under a microscope, a tiny elongate poppy seed, small tan spikemoss megaspores and black soil fungus spheres found in soil recovered from under 2 miles of Greenland’s ice. Halley Mastro/University of Vermont, CC BY-ND. By Paul Bierman, University of Vermont, and Halley Mastro, University of Vermont.
As we focused our microscope on the soil sample for the first time, bits of organic material came into view: a tiny poppy seed, the compound eye of an insect, broken willow twigs and spikemoss spores. Dark-colored spheres produced by soil fungi dominated our view.
These were unmistakably the remains of an arctic tundra ecosystem – and proof that Greenland’s entire ice sheet disappeared more recently than people realize.
These tiny hints of past life came from a most unlikely place – a handful of soil that had been buried under 2 miles of ice below the summit of the Greenland ice sheet. Projections of future melting of the ice sheet are unambiguous: When the ice is gone at the summit, at least 90% of Greenland’s ice will have melted.
In 1993, drillers at the summit completed the Greenland Ice Sheet Project 2 ice core, or GISP2, nicknamed the two-mile time machine. The seeds, twigs and spores we found came from a few inches of soil at the bottom of that core — soil that had been tucked away dry, untouched for three decades in a windowless Colorado storage facility.
Our new analysis builds on the work of others who, over the past decade, have chipped away at the belief that Greenland’s ice sheet was present continuously since at least 2.6 million years ago when the Pleistocene ice ages began. In 2016, scientists measuring rare isotopes in rock from above and below the GISP2 soil sample used models to suggest that the ice had vanished at least once within the past 1.1 million years.
Now, by finding well-preserved tundra remains, we have confirmed that Greenland’s ice sheet had indeed melted before and exposed the land below the summit long enough for soil to form and for tundra to grow there. That tells us that the ice sheet is fragile and could melt again.
A landscape with Arctic poppies and spikemosses
To the naked eye, the tiny bits of past life are unremarkable – dark flecks, floating between shiny grains of silt and sand. But, under the microscope, the story they tell is astounding. Together, the seeds, megaspores and insect parts paint a picture of a cold, dry and rocky environment that existed sometime in the past million years.
Above ground, Arctic poppies grew among the rocks. Atop each stalk of this small but tenacious herb, a single cupped flower tracked the Sun across the sky to make the most of each day’s light.
Tiny insects buzzed above mats of diminutive rock spikemoss, creeping across the gravelly surface and bearing spores in summer.
In the rocky soil were dark spheres called sclerotia, produced by fungi that team up with plants’ roots in soil to help both get the nutrients they need. Nearby, willow shrubs adapted to life in the harsh tundra with their small size and fuzzy hair covering their stems.
Each of these living things left clues behind in that handful of soil – evidence that told us Greenland’s ice was once replaced by a hardy tundra ecosystem.
Greenland’s ice is fragile
Our discoveries, published on Aug. 5, 2024, in the Proceedings of the National Academy of Sciences, demonstrate that Greenland’s ice is vulnerable to melting at atmospheric carbon dioxide concentrations lower than today. Concerns about this vulnerability have driven scientists to study the ice sheet since the 1950s.
In the 1960s, a team of engineers extracted the world’s first deep ice core at Camp Century, a nuclear-powered Army base built into the ice sheet over 100 miles from the northwest Greenland coast. They studied the ice, but they had little use for the chunks of rock and soil brought up with the bottom of the core. Those were stored and then lost until 2019, when they were rediscovered in a lab freezer. Our team was among the scientists called in to analyze them.
Another ice core, DYE-3 from south Greenland, contained DNA showing that spruce forests covered that part of the island at some point in the past million years.
The biological evidence makes a convincing case for the fragility of Greenland’s ice sheet. Together, the findings from three ice cores can only mean one thing: With the possible exception of a few mountainous areas to the east, ice must have melted off the entire island in the past million years.
Losing the ice sheet
When Greenland’s ice is gone, world geography changes – and that’s a problem for humanity.
As the ice sheet melts, sea level will eventually rise more than 23 feet, and coastal cities will flood. Most of Miami will be underwater, and so will much of Boston, New York, Mumbai and Jakarta.
Today, sea level is rising at more than an inch each decade, and in some places, several times faster. By 2100, when today’s kids are grandparents, sea level around the globe is likely to be several feet higher.
Using the past to understand the future
The rapid loss of ice is changing the Arctic. Data about past ecosystems, like we have collected from under Greenland’s ice, helps scientists understand how the ecology of the Arctic will change as the climate warms.
When temperatures rise, bright white snow melts and ice shrinks, exposing dark rock and soil that soaks up heat from the Sun. The Arctic is becoming greener with every passing year, thawing underlying permafrost and releasing more carbon that will further warm the planet.
Human-caused climate change is on pace to warm the Arctic and Greenland beyond temperatures they have experienced for millions of years. To save Greenland’s ice, studies show the world will need to stop greenhouse gas emissions from its energy systems and reduce carbon dioxide levels in the atmosphere.
Understanding the environmental conditions that triggered the ice sheet’s last disappearance, and how life on Greenland responded, will be crucial for gauging the future risks facing the ice sheet and coastal communities around the world.
Paul Bierman, Fellow of the Gund Institute for Environment and Professor of Natural Resources and Environmental Science, University of Vermont, and Halley Mastro, Graduate Fellow of the Gund Institute for Environment and Graduate Research Assistant in Natural Resources and Environmental Science, University of Vermont
This 5 August webinar shares information about the Climate Change Commission’s first annual emissions reduction monitoring report, released in July 2024. The report provides an evidence-based, impartial view of whether the country is on course to reach its goals of reducing and removing greenhouse gas emissions. It provides insight into the progress made, challenges experienced, and opportunities and risks that need to be considered.
The following quote from Dr Rod Carr, towards the end of the webinar, paints a realistic picture of what Aotearoa can expect in terms of the economy, our global standing and the risks. (Pages on this website explain Nationally Determined Contributions, the Paris Accord, and the Emissions Trading Scheme.)
Webinar question: What would happen if New Zealand wasn’t able or didn’t comply with our Nationally Determined Contributions (NDCs)? What are the implications for us?
Answers:
As time goes on, meeting our NDCs is getting increasingly difficult and expensive because of delay.
Not meeting the NDCs: we would certainly expect to see greater scrutiny of our actions from our trading partners, particularly where we have free trade agreements (FTAs), and particularly those with strong climate elements within them, like the EU FTA.
Not meeting them is also likely to come with a loss of influence on the global scale in relation to climate change, which may mean we are in a worse position to advocate for a response that takes into account our national circumstances.
The final thing is that global consumers and customers are increasingly scrutinising their supply chains and looking for products that are reducing emissions, and so we do increase risks around loss of global markets.
– Jo Hendy CE: Video, The Climate Change Commission 2024 emissions reduction monitoring report, August 2024
When the rest of the world looks at New Zealand, if we haven’t met our nationally determined contributions (we won’t know on the 31st of December 2030, as it takes a couple of years for inventories to be counted up), the partners that we care about will look at our behaviour and go: ‘Did you do all that you said you would? And did you do all the things you could have done?’ That’s going to inform whether it’s ‘you tried hard but missed’ or ‘you didn’t try’.
So foreign countries who are incurring very real economic costs to reduce their emissions today (and that includes the Europeans, the Brits and the Americans: there’s half a trillion U.S. dollars of taxpayers’ money being made available to reduce their emissions, so the idea that they’re not doing anything is just wrong), when those countries look at New Zealand in the early 2030s and look back to 2020, they’ll go: ‘Well, you could have made a better effort to, for example, decarbonise ground transport; there were known technologies that were available, but you just chose to buy cheap, high-polluting cars. You could have chosen to stop burning as much coal and fossil gas to make electricity by investing more, sooner, in renewables, but you chose not to.’ I think that’s going to influence what the world thinks about New Zealand’s behaviour more than whether we did or didn’t hit the exact number of tonnes for this decade.
If the rest of the world looks at New Zealand and says, ‘You didn’t try. You didn’t take up the known technologies. You are short-sighted, selfish and reckless in your use of the climate for profit’, I think their attitudes to us will be very different than if we had tried hard and done all we could but things didn’t turn out well.
– Dr Rod Carr, Video, The Climate Change Commission 2024 emissions reduction monitoring report, August 2024
New Zealand is one of the worst countries in the world in terms of meeting its commitments to keep temperatures under 1.5°C. (Image: Climate Action Tracker)
New Zealand is also subsidising high greenhouse gas emissions industries by giving the agricultural sector a 100% discount. (Image: Nature journal)
Extreme weather is by definition rare on our planet. Ferocious storms, searing heatwaves and biting cold snaps illustrate what the climate is capable of at its worst. However, since Earth’s climate is rapidly warming, predominantly due to fossil fuel burning, the range of possible weather conditions, including extremes, is changing.
Scientists define “climate” as the distribution of possible weather events observed over a length of time, such as the range of temperatures, rainfall totals or hours of sunshine. From this they construct statistical measures, such as the average (or normal) temperature. Weather varies on several timescales – from seconds to decades – so the longer the period over which the climate is analysed, the more accurately these analyses capture the infinite range of possible configurations of the atmosphere.
Typically, meteorologists and climate scientists use a 30-year period to represent the climate, which is updated every ten years. The most recent climate period is 1991-2020. The difference between each successive 30-year climate period serves as a very literal record of climate change.
This way of thinking about the climate falls short when the climate itself is rapidly changing. Global average temperatures have increased at around 0.2°C per decade over the past 30 years, meaning that the global climate of 1991 was around 0.6°C cooler than that in 2020 (when accounting for other year-to-year fluctuations), and even more so than the present day.
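To make this concrete, here is a minimal Python sketch of how a 30-year climate normal is computed and how far apart two successive normals sit under a steady warming trend of roughly 0.2°C per decade. The temperature series is synthetic and purely illustrative, not observational data.

```python
# Minimal sketch: a 30-year "climate normal" and how it shifts under a steady trend.
# The temperature series below is synthetic and purely illustrative.
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1961, 2021)                  # 1961-2020
trend = 0.02 * (years - years[0])              # ~0.2 degC of warming per decade
noise = rng.normal(0.0, 0.15, years.size)      # year-to-year variability
annual_mean_temp = 14.0 + trend + noise        # hypothetical global-mean series (degC)

def climate_normal(years, temps, start, end):
    """Average temperature over an inclusive 30-year window."""
    mask = (years >= start) & (years <= end)
    return temps[mask].mean()

normal_1961_1990 = climate_normal(years, annual_mean_temp, 1961, 1990)
normal_1991_2020 = climate_normal(years, annual_mean_temp, 1991, 2020)

# At ~0.2 degC per decade, normals 30 years apart differ by roughly 0.6 degC,
# and the start and end of a single 30-year window differ by a similar amount.
print(f"1961-1990 normal: {normal_1961_1990:.2f} degC")
print(f"1991-2020 normal: {normal_1991_2020:.2f} degC")
print(f"shift between normals: {normal_1991_2020 - normal_1961_1990:.2f} degC")
```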
A moving target for climate modellers
If the climate is a range of possible weather events, then this rapid change has two implications. First, it means that part of the distribution of weather events comprising a 30-year climate period occurred in a very different background global climate: for example, northerly winds in the 1990s were much colder than those in the 2020s in north-west Europe, thanks to the Arctic warming nearly four times faster than the global average. Statistics from three decades ago no longer represent what is possible in the present day.
Second, the rapidly changing climate means we have not necessarily experienced the extremes that modern-day atmospheric and oceanic warmth can produce. In a stable climate, scientists would have multiple decades for the atmosphere to get into its various configurations and drive extreme events, such as heatwaves, floods or droughts. We could then use these observations to build up an understanding of what the climate is capable of. But in our rapidly changing climate, we effectively have only a few years – not enough to experience everything the climate has to offer.
Extreme weather events require what meteorologists might call a “perfect storm”. For example, extreme heat in the UK typically requires the northward movement of an air mass from Africa combined with clear skies, dry soils and a stable atmosphere to prevent thunderstorms forming which tend to dissipate heat.
Such “perfect” conditions are intrinsically unlikely, and many years can pass without them occurring – all while the climate continues to change in the background. Based on an understanding of observations alone, this can leave us woefully underprepared for what the climate can now do, should the right weather conditions all come together at once.
Startling recent examples include the extreme heatwave in the Pacific north-west of North America in 2021, in which temperatures exceeded the previous Canadian record maximum by 4.6°C. Another is the occurrence of 40°C in the UK in summer 2022, which exceeded the previous UK record maximum set only three years earlier by 1.6°C. This is part of the reason why the true impact of a given amount of global warming only becomes evident after several decades of observations – but since the climate is changing so rapidly, we can no longer simply wait and watch.
Playing with fire
To better understand these extremes, scientists can use ensembles: many runs of the same weather or climate model, each slightly different, to show a range of plausible outcomes. Ensembles are routinely used in weather prediction, but can also be used to assess extreme events that could plausibly occur even if they do not actually happen at the time.
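As a rough illustration of the ensemble idea – many runs of the same model from slightly different starting points – here is a toy Python sketch that uses the simple chaotic Lorenz-63 system as a stand-in for a weather model. It is not an operational forecast system; it only shows how a spread of plausible outcomes, including extremes, emerges from tiny initial differences.

```python
# Toy ensemble: run the same simple chaotic model (Lorenz-63) many times from
# slightly perturbed starting points, then look at the spread of outcomes.
# This illustrates the ensemble idea only; it is not a weather forecast model.
import numpy as np

def lorenz63_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = state
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return state + dt * np.array([dx, dy, dz])

rng = np.random.default_rng(42)
n_members, n_steps = 50, 2000
base_state = np.array([1.0, 1.0, 1.05])

finals = []
for _ in range(n_members):
    state = base_state + rng.normal(0.0, 1e-3, 3)  # tiny initial-condition perturbation
    for _ in range(n_steps):
        state = lorenz63_step(state)
    finals.append(state[0])                        # track one variable per member

finals = np.array(finals)
# The spread across members shows the range of plausible outcomes,
# including extremes that any single run might miss.
print(f"ensemble mean: {finals.mean():.2f}, min: {finals.min():.2f}, max: {finals.max():.2f}")
```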
When 40°C first appeared in ensemble forecasts for the UK before the July 2022 heatwave, it revealed the kind of extreme weather that is possible in the current climate. Even if it had not come to fruition, its mere appearance in the models showed that the previously unthinkable was now possible. In the event, several naturally occurring atmospheric factors combined with background climate warming to generate the record-shattering heat on July 19 that year.
The highest observed temperature each year in the UK, from 1900 to 2023
Later in summer 2022, after the first occurrence of 40°C, some ensemble weather forecasts for the UK showed a situation in which 40°C could be reached on multiple consecutive days. This would have posed an unprecedented threat to public health and infrastructure in the UK. Unlike the previous month, this event did not come to pass, and was quickly forgotten – but it shouldn’t have been.
It is not certain whether these model simulations correctly represent the processes involved in producing extreme heat. Even so, we must heed the warning signs.
Despite a record-warm planet, summer 2024 in the UK has been relatively cool so far. The past two years have seen global temperatures far above anything previously observed, and so potential extremes have probably shifted even further from what we have so far experienced.
Just as was the case in August 2022, we’ve got away with it for now – but we might not be so lucky next time.
A new system for forecasting weather and predicting future climate uses artificial intelligence (AI) to achieve results comparable with the best existing models while using much less computer power, according to its creators.
In a paper published in Nature today, a team of researchers from Google, MIT, Harvard and the European Centre for Medium-Range Weather Forecasts say their model offers enormous “computational savings” and can “enhance the large-scale physical simulations that are essential for understanding and predicting the Earth system”.
The NeuralGCM model is the latest in a steady stream of research models that use advances in machine learning to make weather and climate predictions faster and cheaper.
What is NeuralGCM?
The NeuralGCM model aims to combine the best features of traditional models with a machine-learning approach.
At its core, NeuralGCM is what is called a “general circulation model”. It contains a mathematical description of the physical state of Earth’s atmosphere, and it solves complicated equations to predict what will happen in the future.
However, NeuralGCM also uses machine learning – a process of searching out patterns and regularities in vast troves of data – for some less well-understood physical processes, such as cloud formation. The hybrid approach makes sure that the output of the machine learning modules will be consistent with the laws of physics.
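To illustrate the general shape of such a hybrid – and only the shape, since this is not NeuralGCM's actual code or API – the following Python fragment combines a placeholder physics step with a placeholder learned correction. All function names and numbers here are hypothetical.

```python
# Conceptual sketch of a hybrid "physics + machine learning" model step.
# NOT NeuralGCM's code or API; names and structure are hypothetical, and the
# functions are placeholders for a dynamical core and a trained neural network.
import numpy as np

def dynamical_core_step(state, dt):
    """Stand-in for the physics: advance the resolved state with known equations."""
    # A real GCM solves the equations of atmospheric motion here;
    # this placeholder just applies a simple linear tendency.
    return state + dt * (-0.1 * state)

def learned_physics(state, weights):
    """Stand-in for a neural network representing less well-understood processes
    (e.g. cloud formation), trained on historical data."""
    return np.tanh(state @ weights)  # placeholder for a trained network

def hybrid_step(state, weights, dt=0.1):
    physics_tendency = dynamical_core_step(state, dt) - state
    ml_tendency = dt * learned_physics(state, weights)
    new_state = state + physics_tendency + ml_tendency
    # A real hybrid model also constrains the result so conserved quantities
    # (mass, energy) still obey the laws of physics.
    return new_state

state = np.random.default_rng(0).normal(size=(8,))          # toy "atmospheric state"
weights = np.random.default_rng(1).normal(size=(8, 8)) * 0.1
for _ in range(10):
    state = hybrid_step(state, weights)
print(state.round(3))
```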
The resulting model can then be used for making forecasts of weather days and weeks in advance, as well as looking months and years ahead for climate predictions.
The researchers compared NeuralGCM against other models using a standardised set of forecasting tests called WeatherBench 2. For three- and five-day forecasts, NeuralGCM did about as well as other machine-learning weather models such as Pangu and GraphCast. For longer-range forecasts, out to ten and 15 days, NeuralGCM was about as accurate as the best existing traditional models.
NeuralGCM was also quite successful in forecasting less-common weather phenomena, such as tropical cyclones and atmospheric rivers.
Why machine learning?
Machine learning models are based on algorithms that learn patterns in the data they are fed with, then use this learning to make predictions. Because climate and weather systems are highly complex, machine learning models require vast amounts of historical observations and satellite data for training.
The training process is very expensive and requires a lot of computer power. However, after a model is trained, using it to make predictions is fast and cheap. This is a large part of their appeal for weather forecasting.
The high cost of training and low cost of use is similar to other kinds of machine learning models. GPT-4, for example, reportedly took several months to train at a cost of more than US$100 million, but can respond to a query in moments.
A weakness of machine learning models is that they often struggle in unfamiliar situations – in this case, extreme or unprecedented weather conditions. To handle these, a model needs to be able to generalise, or extrapolate beyond the data it was trained on.
NeuralGCM appears to be better at this than other machine learning models, because its physics-based core provides some grounding in reality. As Earth’s climate changes, unprecedented weather conditions will become more common, and we don’t know how well machine learning models will keep up.
Nobody is actually using machine learning-based weather models for day-to-day forecasting yet. However, it is a very active area of research – and one way or another, we can be confident that the forecasts of the future will involve machine learning.
In media articles about unprecedented flooding, you’ll often come across the statement that for every 1°C of warming, the atmosphere can hold about 7% more moisture.
This figure traces back to the Clausius-Clapeyron relation in thermodynamics, which builds on research by the French engineer Sadi Carnot published 200 years ago this year.
We now know there’s more to the story. Yes, a hotter atmosphere has the capacity to hold more moisture. But the condensation of water vapour to make rain droplets releases heat. This, in turn, can fuel stronger convection in thunderstorms, which can then dump substantially more rain.
This means that the intensity of extreme rainfall could increase by much more than 7% per degree of warming. What we’re seeing is that thunderstorms can likely dump about double or triple that rate – around 14–21% more rain for each degree of warming.
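For readers who want to see where the roughly 7% per degree figure comes from, here is a back-of-envelope check in Python using the Clausius-Clapeyron relation with standard approximate constants.

```python
# Where the ~7% per degree figure comes from: the Clausius-Clapeyron relation,
# d(ln e_s)/dT = L_v / (R_v * T^2), for saturation vapour pressure e_s.
# Constants are standard approximate values; this is a back-of-envelope check.
L_v = 2.5e6      # latent heat of vaporisation of water, J/kg
R_v = 461.5      # gas constant for water vapour, J/(kg K)

for T in (273.0, 288.0, 303.0):                  # 0, 15 and 30 degC, in kelvin
    fractional_increase_per_K = L_v / (R_v * T**2)
    print(f"T = {T - 273.0:>4.0f} degC: "
          f"~{100 * fractional_increase_per_K:.1f}% more moisture per degree")
# Near typical surface temperatures this works out to roughly 6-7% per degree.
# Thunderstorm rain can increase at 2-3 times this rate because condensation
# releases latent heat that strengthens convection, as discussed above.
```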
For Australia, we helped develop a comprehensive review of the latest climate science to guide preparedness for future floods. This showed the increase per degree of global warming was about 7–28% for hourly or shorter duration extreme rain, and 2–15% for daily or longer extreme rain. This is much higher than figures in the existing flood planning standards recommending a general increase of 5% per degree of warming.
Why are thunderstorms important for extreme rain?
For thunderstorms to form, you need ingredients such as moisture in the air and a large temperature difference between lower and higher air masses to create instability.
We typically associate thunderstorms with intense localised rain over a short period. What we’re seeing now, though, is a shift towards more intense thunderstorm downpours, particularly for short periods.
Extreme rain events are also more likely when thunderstorms form in combination with other weather systems, such as east coast lows – intense low-pressure systems near eastern Australia. The record floods which hit Lismore in February 2022 and claimed many lives came from extreme rain over many days, driven in part by severe thunderstorms in combination with an east coast low.
Climate change pumps up extreme flood risk factors
The latest report from the Intergovernmental Panel on Climate Change (IPCC) states that:
frequency and intensity of heavy precipitation events have increased since the 1950s over most land areas for which observational data are sufficient for trend analysis (high confidence), and human-induced climate change is likely the main driver
This increase is particularly clear in short-duration extreme rains, such as those caused by thunderstorms.
Why? In part, it’s because of the 7% figure – warmer air is able to hold more water vapour.
But that doesn’t explain everything. There’s something else going on. Condensation produces heat. So as water vapour turns into droplets, more heat becomes available, and hot air rises by convection. In thunderstorms, more heat fuels stronger convection, where warm, moisture-laden air is driven up high.
This explains why thunderstorms can now drive such extreme rainfall in our warming world. As water vapour condenses to make rain, it also makes heat, supercharging storms.
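A rough calculation shows why this latent heating matters so much: condensing even a few grams of water vapour per kilogram of air releases enough heat to warm that air by several degrees. The parcel moisture amount in the sketch below is illustrative; the constants are standard approximate values.

```python
# Back-of-envelope: how much condensation can warm a rising air parcel.
# Constants are standard approximations; the condensed amount is illustrative.
L_v = 2.5e6          # latent heat of condensation, J per kg of water
c_p = 1005.0         # specific heat of air at constant pressure, J/(kg K)

condensed_water = 0.005                         # kg of water condensed per kg of air (5 g/kg)
heat_released = L_v * condensed_water           # J per kg of air
parcel_warming = heat_released / c_p            # resulting temperature rise, K

print(f"heat released: {heat_released:.0f} J per kg of air")
print(f"parcel warming: {parcel_warming:.1f} K")
# Roughly 12 K of extra warmth from condensing just 5 g/kg of vapour,
# which is why latent heating can strongly invigorate thunderstorm updrafts.
```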
We are seeing these very rapid rates of rainfall increase in recent decades in Australia.
Daily rainfall associated with thunderstorms has increased much more than the 7% figure would suggest – about 2-3 times more.
What about very sudden, extreme rains? Here, the rate of increase could potentially be even larger. One recent study examined extreme rain for periods shorter than one hour near Sydney, suggesting about a 40% increase or more over the past 20 years.
Rapid trends in extreme rainfall intensity are also clear in other lines of evidence, such as fine-resolution modelling.
To model complex climate systems, we need the grunt of supercomputers. But even so, many of our models for climate projections don’t drill down to grid resolutions smaller than about 100 kilometres.
While this can work well for large-scale climate modelling, it’s not suitable for directly simulating thunderstorms. That’s because the convection processes needed to make thunderstorms form happen on much smaller scales than this.
There’s now a concerted effort underway to perform more model simulations at very fine scales, so we can improve the modelling of convection.
Recent results from these very fine-scale models for Europe suggest convection will play a more important role in triggering extreme rainfall, including in combined storms such as thunderstorms mingling with low-pressure systems.
This matches Australian observations, with a trend towards increased rain from thunderstorms combining with other storm types such as cold fronts and cyclones (including low-pressure systems in southern Australia).
Does this change how we plan for floods?
The evidence for supercharged thunderstorm rainfall has grown in recent years.
Australia’s current flood guidance recommendations, which influence how infrastructure projects have been built, are based on extreme rain increasing by just 5% for each degree of warming.
Our research review has shown the real figure is substantially higher.
This means roads, bridges and tunnels built to the 5% figure may not be ready to deal with the extreme rain we are already seeing from supercharged thunderstorms.
This will have to change. We still face some uncertainties in precisely linking climate change to a single extreme rain event. But the bigger picture is now very clear: a hotter world is likely one with higher risk of extreme floods, often driven by extreme rain from supercharged thunderstorms.
So what should we do? The first step is to take climate change influences on storms and flood risk as seriously as we now do for bushfires.
The next is to embed the best available evidence in how we plan for these future storms and floods.
We have already loaded the dice for more extreme floods through existing human-caused climate change, and there is more to come unless we can quickly reduce our greenhouse gas emissions.