Climate Summaries


Roy Clark


Here is a series of short climate summaries that address various climate topics. To go to a Climate Summary, please click on the title below.


To download a .pdf file with all of the summaries, please click here.


THE DOUBLE FANTASY OF NET ZERO


Roy Clark


Ventura Photonics Climate Summary 01.1

VPCS 01.1 November 15, 2023



Net Zero is a double fantasy. The idea that we can eliminate the use of fossil fuels is a fantasy. The idea that this will save us from the global warming apocalypse is also a fantasy. The climate models used to create the illusion of CO2 induced warming are a classic example of GIGO – Garbage In, Gospel Out. The climate modelers have been playing computer games in an equilibrium climate fantasy land for over 50 years.


There are three parts to this climate fantasy. First, climate energy transfer was oversimplified using the equilibrium climate assumption. The time dependent energy transfer processes that determine the surface temperature were replaced by average values. This created global warming as a mathematical artifact when the CO2 concentration was increased in the early ‘steady state air column’ climate models. Later models were empirically ‘tuned’ to match the global mean temperature record and create a contrived set of radiative forcings. Second, there was ‘mission creep’. As funding was reduced for NASA space exploration and US Department of Energy (DOE) nuclear programs, climate modeling became an alternative source of revenue. The simplified climate models were accepted without question. Third, various outside interests, including environmentalists and politicians, decided to exploit the fictional climate apocalypse to further their own causes.


There are seven key papers that created the equilibrium climate fantasy land. Arrhenius started the fantasy with his ‘steady state’ air column model in 1896. In 1967, Manabe and Wetherald (M&W) copied Arrhenius and added a 9 or 18 layer radiative transfer model and a fixed relative humidity (RH) distribution to the basic climate fantasy. This created a ‘water vapor feedback’ that amplified the initial CO2 warming artifact. M&W then spent the next 8 years incorporating the 1967 model artifacts into every unit cell of a ‘highly simplified’ general circulation model (GCM). Mission creep started at NASA in the early 1970s. In 1976, Hansen’s group copied the M&W 1967 model and added the warming artifacts from 10 ‘minor greenhouse species’. Later, in 1981, Hansen’s group completed the basic climate model fraud by adding a slab ocean model, the CO2 doubling ritual and the calculation of the global temperature record with a contrived set of ‘radiative forcings’ to the 1967 M&W model. This provided the foundation for the pseudoscience of radiative forcing, feedbacks and climate sensitivity used by the UN Intergovernmental Panel on Climate Change (IPCC). Later, in the Third IPCC Climate Assessment Report (TAR), the contrived set of radiative forcings was split into ‘natural’ and ‘anthropogenic’. This was used to blame ‘extreme weather events’ on human causes. The initial work was done in 2000 by Stott et al. and Tett et al. at the UK Hadley Climate Centre.



A radiative forcing is a change in the average energy flow at the top of the atmosphere. When the atmospheric concentration of a greenhouse gas such as CO2 is increased, there is a slight decrease in the long wave IR (LWIR) flux emitted to space within the spectral region of its absorption bands. However, the small amount of heat added to the troposphere (lower atmosphere) by the extra CO2 is simply reradiated back to space as wideband LWIR emission. It does not heat the surface. There is no ‘CO2 signal’ in the global mean temperature record. The main warming signal is produced by the coupling of ocean oscillations, notably the Atlantic Multidecadal Oscillation (AMO), to the weather station record. The more recent warming also includes urban heat island effects, changes to the rural/urban station mix and data ‘adjustments’ disguised as ‘homogenization’.


The model used by Hansen’s group in 1981 had only three radiative forcing agents: an increase in CO2 concentration, changes in the solar flux and aerosols. As computer technology improved, more forcing agents were added. Starting with the third IPCC assessment in 2001, the 15 radiative forcings used in the CMIP3 model ensemble were split into ‘anthropogenic’ or ‘human caused’ and ‘natural’ forcings. The models were run for three separate cases with ‘natural’, ‘human caused’ and ‘human + natural’ forcing agents. This approach was used to blame increases in ‘extreme weather’ on human caused radiative forcings using a very dubious statistical argument based on changes to the ‘tails’ of the Gaussian temperature distribution. The original work was done at the UK Hadley Centre and published in two papers in 2000 by Stott, Tett and others. This provided the foundation for Net Zero.


The exploitation of the climate modeling fraud by outside groups started in the 1970s. However, nature did not cooperate and the warming phase of the AMO was not detected in the climate record until 1985. The UN Intergovernmental Panel on Climate Change (IPCC) was established in 1988 and the US Global Change Research Program (USGCRP) was established by presidential initiative in 1989 and mandated by Congress in 1990. In the UK, the Hadley Climate Centre was established in 1990.


The mission of the IPCC is to assess “the scientific, technical and socioeconomic information relevant for the understanding of the risk of human-induced climate change.” This is based on the a priori assumption that human activities are causing CO2 induced global warming. The mission of the USGCRP is ‘to coordinate federal research and investments in understanding the forces shaping the global environment, both human and natural, and their impacts on society’. Here, the USGCRP has failed in its mission to find the ‘natural forces’, including ocean oscillations, downslope winds and high pressure domes, that are responsible for climate change and extreme weather events such as fires, floods and droughts. The USGCRP has blindly copied the IPCC climate assessment reports and accepted the climate model results as real without any attempt at validation. Few, if any, of the analysts associated with the USGCRP have any expertise in climate energy transfer and many are not scientists at all.


Net Zero is a double fantasy. Net Zero is impossible to achieve. It is also impossible for a ‘CO2 doubling’ to cause a measurable change in surface temperature. There is no ‘climate sensitivity’ to CO2. There is no climate crisis.


The full article, including references, is available here.


Roy Clark is a retired engineer. Surface energy transfer is considered in more detail in the book he coauthored: Finding simplicity in a complex world – The role of the diurnal temperature cycle in climate energy transfer and climate change, Roy Clark and Arthur Rörsch, 2023. Further information is available at ClarkRorschPublications.com.





DOWN THE RABBIT HOLE:

THE EQUILIBRIUM CLIMATE FANTASY LAND OF THE USGCRP


Roy Clark


Ventura Photonics Climate Summary 02.1

VPCS 02.1 November 15, 2023





‘We are all mad here’

The Cheshire Cat



Since it was first established by Congress in 1990, the US Global Change Research Program (USGCRP) has used the results of fraudulent ‘equilibrium’ climate models to perpetuate a massive Ponzi or pyramid scheme based on exaggerated claims of anthropogenic global warming. It takes the fraudulent climate model output generated by agencies such as NASA and DOE and, without question, cycles the fake climate warming through the 13 US agencies to establish a US climate policy that mitigates a nonexistent problem. The pigs have been filling their own trough at taxpayer expense for over 30 years. There has been no significant oversight. None of the agencies using the climate model results have performed any independent validation (‘due diligence’) of the modeling data provided. The climate modelers are no longer scientists; they have become prophets of the Imperial Cult of the Global Warming Apocalypse. Irrational belief in ‘equilibrium’ climate model results has replaced scientific logic and reason. The same group of climate modelers also provides the same fraudulent climate warming data for use by the IPCC in their assessment reports.


There are two different types of technical fraud in the climate models. First, the climate energy transfer processes were oversimplified, and this created global warming as a mathematical artifact of the assumptions used. Second, the general circulation models (GCMs) require the solution of a very large number of coupled non-linear equations. This means that the model solutions are unstable and the errors increase over time. The GCMs have no predictive capabilities over the time scales required for climate analysis. They are simply ‘tuned’ to give the desired result.


The oversimplification of climate science started in the nineteenth century with the introduction of the equilibrium average climate assumption and speculation that changes in the atmospheric CO2 concentration could cycle the earth through an Ice Age. The first person to try to calculate the changes in surface temperature produced by such changes in CO2 concentration was Arrhenius in 1896. However, he used an oversimplified ‘equilibrium air column’. His results were just mathematical artifacts of his modeling assumptions. Unfortunately, the idea that an increase in atmospheric CO2 concentration could warm the earth became accepted scientific dogma. The concern shifted from an Ice Age to climate warming caused by fossil fuel combustion. The first generally accepted computer climate model was published by Manabe and Wetherald (M&W) in 1967. It was a modified version of the ‘equilibrium air column’, with radiative transfer and a prescribed relative humidity distribution. Not only did this model create a global warming artifact as the CO2 concentration was increased, but there was also a ‘water vapor feedback’ that amplified the initial mathematical artifact. The model contained six fundamental scientific errors. These were ignored, and M&W spent the next 8 years building a ‘primitive’ GCM. The 1967 artifacts were incorporated into every unit cell of the larger model.



As resources dwindled for space exploration and nuclear programs, other agencies jumped on the climate bandwagon and formed their own climate modeling groups. They started by copying the M&W ‘equilibrium’ approach. Melodramatic prophecies of the global warming apocalypse became such a good source of research funding that the scientific process of hypothesis and discovery collapsed. Continued employment was more important. The climate modelers became trapped in a web of lies of their own making. In 1976, a NASA Goddard group that included James Hansen copied the M&W 1967 model and added the mathematical warming artifacts from ten ‘minor species’. Then, in 1981, they added several ‘improvements’ to the basic M&W model including a slab ocean model, the CO2 doubling ritual and the use of radiative forcing agents to simulate a global mean temperature record. These added another three fundamental scientific errors to the ‘one dimensional radiative convective’ equilibrium climate model. Little has changed since then. The climate modelers have been playing computer games in an Equilibrium Climate Fantasy Land since 1967.


Various outside interests, including environmentalists and political groups, also began to exploit the climate apocalypse to further their own causes. The IPCC was formed in 1988. The USGCRP followed in 1989/1990. In 1990, the Hadley Climate Centre was established in the UK to feed climate propaganda to Margaret Thatcher.


When the USGCRP was formed, the rest of the 13 government agencies involved simply followed the climate modelers down the rabbit hole into the Equilibrium Climate Fantasy Land. No independent validation or ‘due diligence’ was performed to check the climate model results. The climate Apocalypse was a good source of funding and employment. Very few people in these agencies understood anything about climate physics and most of the USGCRP soon became disciples of the Imperial Cult of the Global Warming Apocalypse.


As computer technology improved, climate models became more complex, but the underlying assumptions remained the same. An increase in the atmospheric concentration of ‘greenhouse gases’ produces a decrease in the LWIR flux at the top of the atmosphere (TOA). This perturbs the ‘radiation balance of the earth’. The climate system then ‘adjusts’ to increase the surface temperature and restore the ‘radiation balance’ at TOA. An elaborate pseudoscientific modeling ritual has been created using radiative forcings, feedbacks and a climate sensitivity to CO2 to give the illusion that the observed increase in atmospheric CO2 concentration is causing ‘climate change’. The surface temperature is warming and this must be caused by CO2. The Sacred Spaghetti Plots from the computer models have to be correct. The models have been ‘tuned’ to match the change in ‘global mean temperature’ and create a contrived set of ‘radiative forcings’. The Apocalypse is coming. Pay for your sins. The US taxpayer is paying for the USGCRP. The world has to be saved from a non-existent problem. Eisenhower’s warning about the corruption of science by government funding has come true.


CO2 is a good plant fertilizer, so there is a major agricultural benefit to an increase in CO2 concentration: enhanced agricultural production. There is no climate emergency. There is no need for utility scale solar or wind energy. There is no need for the large scale deployment of electric vehicles. It is time to dismantle the entire climate fraud, including the USGCRP, and rebuild the energy infrastructure of the US based on reliable, fossil fueled and nuclear electrical power.


The full article, including references, is available here.


Roy Clark is a retired engineer. Surface energy transfer is considered in more detail in the book he coauthored: Finding simplicity in a complex world – The role of the diurnal temperature cycle in climate energy transfer and climate change, Roy Clark and Arthur Rörsch, 2023. Further information is available at ClarkRorschPublications.com.







THE FIFTH CLIMATE ASSESSMENT REPORT


Roy Clark


Ventura Photonics Climate Summary 03.1

VPCS 03.1 November 15, 2023



The US Global Change Research Program (USGCRP) has now released its Fifth National Climate Assessment Report. Instead of wasting taxpayer money on a lengthy and fraudulent report, this single page summary should suffice.


Since the start of the Industrial Revolution over 200 years ago, the atmospheric concentration of CO2 has increased by approximately 140 parts per million (ppm), from 280 to 420 ppm. This has produced a decrease of about 2 W m-2 in the longwave IR (LWIR) flux emitted to space at the top of the atmosphere (TOA) within the spectral range of the CO2 emission bands. There has also been a similar increase in the downward LWIR flux from the lower troposphere to the surface. At present, the annual average increase in CO2 concentration is about 2.4 ppm. This produces an annual increase in the downward LWIR flux to the surface of approximately 0.034 W m-2.


1) The additional absorption of 2 W m-2 by the CO2 bands (‘radiative forcing’) has not changed the temperature of the troposphere. Nor has it changed the energy balance of the earth.

2) The 2 W m-2 increase in downward LWIR flux to the surface has not changed the land or ocean surface temperatures.

3) The annual average increase of 0.034 W m-2 in downward LWIR flux to the surface cannot increase the ‘frequency and intensity’ of ‘extreme weather events’.


Any temperature increases produced by these changes in LWIR flux are ‘too small to measure’. In addition, CO2 is a good plant fertilizer, so there is a major agricultural benefit to an increase in CO2 concentration – enhanced agricultural production.
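

For scale, the flux changes quoted above are consistent with the widely used logarithmic approximation for the CO2 band flux change, ΔF ≈ 5.35 ln(C/C0) W m-2 (Myhre et al., 1998). The short script below is an illustrative sketch based on that approximation, not the line-by-line radiative transfer calculation behind the quoted numbers; it gives values of similar magnitude.

```python
import math

# Logarithmic approximation for the change in LWIR flux within the CO2 bands
# (Myhre et al., 1998): dF = 5.35 * ln(C / C0) in W m-2. Illustrative only.

C0 = 280.0          # pre-industrial CO2 concentration, ppm
C = 420.0           # present CO2 concentration, ppm
dC_per_year = 2.4   # current annual increase, ppm per year

total_change = 5.35 * math.log(C / C0)   # flux change since pre-industrial
annual_change = 5.35 * dC_per_year / C   # derivative of the log term

print(f"Flux change since pre-industrial: {total_change:.2f} W m-2")   # ~2.2
print(f"Annual flux change at present:    {annual_change:.3f} W m-2")  # ~0.031
```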


There is no climate emergency. There is no need for utility scale solar or wind energy. There is no need for the large scale deployment of electric vehicles. It is time to dismantle the entire climate fraud, including the USGCRP, and rebuild the energy infrastructure of the US based on inexpensive, reliable fossil fueled and nuclear electrical power.


The full article, including references, is available here.


Roy Clark is a retired engineer. Surface energy transfer is considered in more detail in the book he coauthored: Finding simplicity in a complex world – The role of the diurnal temperature cycle in climate energy transfer and climate change, Roy Clark and Arthur Rörsch, 2023. Further information is available at ClarkRorschPublications.com.





NET ZERO? ULEZ CAMERAS?

BLAME THE HADLEY CLIMATE CENTRE!


Roy Clark


Ventura Photonics Climate Summary 04.1

VPCS 04.1 November 15, 2023



The evidence for CO2 induced climate change (aka global warming) is based on nothing more than the results from fraudulent ‘equilibrium’ climate models that rely on the pseudoscience of radiative forcings, feedbacks and a climate sensitivity to CO2. It is claimed that small changes in the energy flow at the top of the atmosphere (TOA), called radiative forcings, can change the energy balance of the earth. A doubling of the CO2 concentration from 300 to 600 parts per million (ppm) produces a small decrease in the energy flow at TOA. The surface temperature is then supposed to magically warm up and restore the energy balance at TOA. Various feedbacks can change this temperature response. The CO2 warming somehow creates a water vapor feedback that amplifies the initial temperature increase. The hypothetical climate model warming produced by a CO2 doubling is called the climate sensitivity.
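

For scale, the widely quoted logarithmic approximation for the CO2 band flux change (Myhre et al., 1998; an outside reference point, not taken from this summary) puts the TOA flux decrease from such a doubling at a few watts per square meter:

```latex
\Delta F \approx 5.35\,\ln\!\left(\frac{600\ \mathrm{ppm}}{300\ \mathrm{ppm}}\right)
\approx 3.7\ \mathrm{W\,m^{-2}}
```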


In the real world, such CO2 induced temperature changes are too small to measure. The warming observed in the global mean temperature record is produced by a combination of natural ocean oscillations, dominated by the Atlantic Multidecadal Oscillation (AMO), urban heat island effects, and a lot of other ‘adjustments’. The number and rural/urban mix of the weather stations have changed and the raw temperatures are altered using a process called ‘homogenization’.


There are three parts to the climate modeling fraud. First, the technical fraud started in the nineteenth century when the climate energy transfer processes were oversimplified using the equilibrium assumption. Second, there was ‘mission creep’. As funding was reduced for NASA space exploration and US Department of Energy (DOE) nuclear programs, climate modeling became an alternative source of revenue. The simplified climate models were accepted without question. Third, various outside interests, including environmentalists and politicians, decided to exploit the fictional climate apocalypse to further their own causes.



The Hadley Climate Centre in the UK was established in 1990 to feed climate propaganda to Margaret Thatcher. The foundation of Net Zero was established in the Third Climate Assessment Report (TAR) published by the UN Intergovernmental Panel on Climate Change (IPCC) in 2001. The radiative forcing agents were divided into ‘natural’ and ‘anthropogenic’ and the warming in the global mean temperature record was attributed to ‘human causes’. This was used to blame a contrived rise in the frequency and intensity of extreme weather events on increases in the atmospheric concentration of CO2 and other greenhouse gases. The original work was performed by the Hadley Centre and associated groups. However, the basic climate fraud was established earlier in the US by a small cadre of mathematicians and computer programmers working at NOAA and NASA between 1967 and 1981. The UK Met Office started work on climate models in the 1970s.


The first steady state air column climate model was published by Arrhenius [1896]. When the CO2 concentration was increased, his model created climate warming as a mathematical artifact of the oversimplified calculation. Arrhenius was motivated by the speculation that changes in CO2 concentration could cause the earth to cycle through an Ice Age. This gradually morphed into the scientific dogma that fossil fuel combustion could cause global warming. Starting in the early 1960s, Manabe’s group at the US Weather Bureau (later part of NOAA) decided to adapt an early weather forecasting computer model to predict ‘climate’. To start, they copied the Arrhenius model and added a 9 or 18 layer radiative transfer model with a fixed relative humidity distribution. This added a ‘water vapor feedback’ amplification to the initial warming artifact. When they doubled the CO2 concentration in their one dimensional radiative convective (1-D RC) model, they created a fictional 2.9 °C warming. M&W spent the next 8 years incorporating their 1967 model artifacts into every unit cell of a ‘highly simplified’ general circulation model.


As computer technology improved, more radiative forcing agents were added to the models. Starting with the third IPCC assessment in 2001, the radiative forcing agents used in the CMIP3 model ensemble were split into ‘human caused’ and ‘natural’ forcings. The models were run for three separate cases with ‘natural’, ‘human caused’ and ‘human + natural’ forcings. A vague statistical argument was used to blame increases in ‘extreme weather’ on human caused radiative forcings. The original work was done by the UK Hadley Centre and associated groups [Stott et al., 2000; Tett et al., 2000]. This is illustrated in Figure 1. All three of the later IPCC Climate Assessments have used this approach. It has also been copied without question by the US Global Change Research Program in their National Climate Assessments.


All of the climate modeling work performed at the Hadley Centre that is based on radiative forcings, feedbacks and climate sensitivity is fraudulent. Long wave IR radiative forcings produced by increases in greenhouse gas concentration do not change the energy balance of the earth, nor can they produce a measurable increase in surface temperature. It is time to put an end to this massive fraud and shut down all climate modeling activities at Hadley and other climate modeling centers. There is no cost or technical justification for net zero policies, including the use of cameras to enforce ULEZs. There is no need to save the world from a non-existent problem.


The full article, including references, is available here.


Roy Clark is a retired engineer. Surface energy transfer is considered in more detail in the book he coauthored: Finding simplicity in a complex world – The role of the diurnal temperature cycle in climate energy transfer and climate change, Roy Clark and Arthur Rörsch, 2023. Further information is available at ClarkRorschPublications.com.





Figure 1: The source of ‘Net Zero’ - the fraudulent ‘attribution’ of warming in the global mean temperature record to ‘anthropogenic’ causes. The contrived set of pseudoscientific forcings created by the climate models to simulate the global mean temperature record shown in a) are separated into natural and anthropogenic sources. The climate models are rerun using the natural forcings to create a fraudulent ‘natural’ baseline b) and the anthropogenic forcings c) to show the ‘human caused’ warming. A vague statistical argument e) is used to claim that the anthropogenic warming caused an increase in the frequency and intensity of ‘extreme weather events’.





CLIMATE PSEUDOSCIENCE


Roy Clark


Ventura Photonics Climate Summary 05.1

VPCS 05.1 November 15, 2023



The foundation of the modern computer based climate modeling fraud was established between 1967 and 1981 by the work of Manabe and Wetherald (M&W) at NOAA and Hansen’s group at NASA Goddard. The energy transfer processes that determine the surface temperature of the earth were oversimplified and replaced by an equilibrium average air column. When the CO2 concentration was increased in this model, the surface temperature increased as a mathematical artifact of the simplified calculation. The initial temperature increase was then amplified by a ‘water vapor feedback’ produced by the fixed relative humidity distribution assumed by the model. Hansen’s group added more ‘greenhouse gases’ to the initial M&W model. Then they went on to add a slab ocean model, the CO2 doubling ritual and the calculation of a global temperature record using a contrived set of radiative forcings. Meanwhile, M&W spent the next eight years incorporating the mathematical warming artifacts produced by their 1967 model into each unit cell of a highly simplified general circulation model.


As computer technology improved, a contrived set of pseudoscientific radiative forcing agents was used by the climate models to simulate the global mean temperature record. The forcings were then divided into anthropogenic and natural forcings. This was used to create the illusion that the observed warming in the global average temperature record is ‘human caused’ and that this in turn has led to an increase in the intensity and frequency of ‘extreme weather events’. The increase in surface temperature calculated by the climate models for a doubling of the CO2 concentration is called the equilibrium climate sensitivity (ECS). In the real atmosphere, this is too small to measure.


There are five fundamental scientific errors in the radiative forcing argument. First, a greenhouse gas forcing is a decrease in LWIR flux at the top of the atmosphere (TOA) that changes the rate of cooling in the atmosphere. When the radiative transfer calculations are extended to include this change in cooling rate, the effects of a CO2 doubling in the turbulent troposphere are too small to measure. Second, the LWIR flux at TOA is decoupled from the surface by molecular line broadening effects. Third, over the oceans, the penetration depth of the LWIR flux into the surface is less than 100 microns (0.004 inches). The increase in downward LWIR flux from the lower troposphere to the surface produced by a greenhouse gas radiative forcing is fully coupled to the much larger and more variable wind driven latent heat flux and cannot heat the oceans. Fourth, over land, any increase in surface temperature produced by a greenhouse gas radiative forcing is too small to measure in the day to day variations of the surface temperature. Fifth, there can be no ‘CO2 signature’ in the global average temperature record. The dominant term is the Atlantic Multidecadal Oscillation (AMO), augmented by urban heat island effects, changes to the number and urban/rural mix of the weather stations used in the averaging process and homogenization adjustments.
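

The ocean penetration depth quoted in the third point can be checked from the optical constants of water. The snippet below is a rough illustrative estimate; the 10 micron wavelength and the imaginary refractive index k ≈ 0.05 are representative values from published optical-constant tables, not figures taken from this summary.

```python
import math

# Rough check of the LWIR penetration depth in water. Near a wavelength of
# 10 micron the imaginary refractive index of water is roughly k ~ 0.05
# (published tables; representative value only). The absorption coefficient
# is alpha = 4*pi*k/lambda and the e-folding (penetration) depth is 1/alpha.

wavelength = 10e-6   # m, mid-LWIR
k = 0.05             # imaginary refractive index of water (approximate)

alpha = 4 * math.pi * k / wavelength   # absorption coefficient, 1/m
depth_micron = 1.0 / alpha * 1e6       # e-folding depth in micron

print(f"Penetration depth ~ {depth_micron:.0f} micron")  # ~16, well under 100
```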


It is time to shut down the ‘equilibrium’ climate models and dismantle the multi-trillion dollar climate fraud.


The full article, including references, is available here.


Roy Clark is a retired engineer. Surface energy transfer is considered in more detail in the book he coauthored: Finding simplicity in a complex world – The role of the diurnal temperature cycle in climate energy transfer and climate change, Roy Clark and Arthur Rörsch, 2023. Further information is available at ClarkRorschPublications.com.





CRITICAL THINKING IN CLIMATE SCIENCE


Roy Clark


Ventura Photonics Climate Summary 06.1

VPCS 06.1, June 22, 2024



Critical thinking in climate science started with Joseph Fourier in the 1820s. He successfully explained the seasonal changes in land subsurface temperatures using his theory of heat. This included both the temperature response and the time delay or phase shift between the peak solar flux at summer solstice and the temperature response.


At a moderate depth, as three or four meters, the temperature observed does not vary during each day, but the change is very perceptible in the course of a year, it rises and falls alternately. The extent of these variations, that is, the difference between the maximum and minimum of temperature, is not the same at all depths, it is inversely as the distance from the surface. The different points of the same vertical line do not arrive at the same time at the extreme temperatures. [...] The results observed are in accordance with those furnished by the theory, no phenomenon is more completely explained. (Fourier, 1824)


Such phase shifts are indisputable evidence for a non-equilibrium thermal response to the solar flux. There is a time delay as heat flows in and out of the surface thermal reservoir. This important work has been ignored for 200 years.
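

The behavior Fourier described follows from the one-dimensional heat equation with a periodic surface temperature. In modern textbook notation (a standard result, not Fourier's own formulation), the subsurface temperature at depth z is:

```latex
T(z,t) = \bar{T} + \Delta T\, e^{-z/\delta}\cos\!\left(\omega t - \frac{z}{\delta}\right),
\qquad \delta = \sqrt{\frac{2\alpha}{\omega}}
```

Here α is the thermal diffusivity of the ground and ω is 2π divided by the period (one year). The exponential factor reproduces the decrease in the temperature swing with depth, and the z/δ term is the phase shift: deeper layers reach their extremes later, exactly as Fourier reported.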


The equilibrium climate assumption was introduced by Pouillet in 1838. The earth was treated as an isolated planet that was heated by shortwave radiation from the sun and cooled by the emission of longwave IR (LWIR) radiation back to space. Therefore, he assumed that an average surface temperature could be determined using average values for just the solar and IR flux terms. As a hypothesis, this had already been disproved by Fourier in 1824. Physical reality was abandoned in favor of mathematical simplicity.


In 1840, critical thinking enabled Agassiz to propose an Ice Age based on his observations of the glaciers in the Alps. There was clear evidence of glacier retreat. In the early 1860s, Tyndall speculated that this Ice Age cycle could be explained by changes in the atmospheric concentration of CO2. In 1896, Arrhenius set out to calculate the effect of changes in the atmospheric CO2 concentration on the surface temperature. Unfortunately, he accepted the equilibrium assumption and used a steady state air column. This approach created warming as a mathematical artifact in his oversimplified model. Gradually, the idea that CO2 could cause an Ice Age was replaced by concern over fossil fuel combustion. CO2 induced global warming became scientific dogma.


One of the early uses of computers was for weather forecasting, pioneered by a group led by John von Neumann. Critical thinking by Lorenz in 1963 demonstrated that the solutions to the coupled non-linear equations used in the general circulation models (GCMs) could become unstable. Weather forecasting using numerical models was limited to about 12 days ahead.
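

Lorenz's point is easy to reproduce with his original three-variable system. The sketch below is an illustrative demonstration using the standard textbook parameters and a simple fixed-step integrator, not Lorenz's 1963 code: two trajectories whose starting points differ by one part in a billion diverge until the two 'forecasts' are completely different.

```python
import math

# The Lorenz (1963) system with the standard parameters. Two almost
# identical starting points are integrated side by side; the tiny initial
# difference grows exponentially, which is the instability that limits
# numerical weather forecasts.

SIGMA, RHO, BETA = 10.0, 28.0, 8.0 / 3.0

def lorenz(s):
    x, y, z = s
    return (SIGMA * (y - x), x * (RHO - z) - y, x * y - BETA * z)

def rk4_step(s, dt):
    """One fixed-step fourth-order Runge-Kutta step."""
    def shift(state, k, f):  # state + f * k, componentwise
        return tuple(si + f * ki for si, ki in zip(state, k))
    k1 = lorenz(s)
    k2 = lorenz(shift(s, k1, dt / 2))
    k3 = lorenz(shift(s, k2, dt / 2))
    k4 = lorenz(shift(s, k3, dt))
    return tuple(si + dt / 6 * (a + 2 * b + 2 * c + d)
                 for si, a, b, c, d in zip(s, k1, k2, k3, k4))

a = (1.0, 1.0, 1.0)
b = (1.0 + 1e-9, 1.0, 1.0)  # perturbed by one part in a billion
dt = 0.01
for step in range(1, 4001):
    a, b = rk4_step(a, dt), rk4_step(b, dt)
    if step % 1000 == 0:
        print(f"t = {step * dt:5.1f}  separation = {math.dist(a, b):.3e}")
```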


Starting in the early 1960s, Manabe’s group at the US Weather Bureau began to adapt a weather forecasting model to ‘predict’ climate change in spite of the limitations imposed by the Lorenz instabilities. Part of their motivation was increased funding to support the computer facilities and staff needed for both weather forecasting and climate modeling. To start, they added a 9 or 18 layer radiative transfer algorithm to the Arrhenius steady state air column model and constrained the magnitude of the lapse rate (vertical temperature profile) to a maximum of 6.5 °C per kilometer. This was known as a one dimensional radiative convective (1-D RC) model. In the 1967 version of this model by Manabe and Wetherald (MW67), they imposed a fixed relative humidity (RH) distribution. When the CO2 concentration was doubled from 300 to 600 parts per million (ppm), the surface temperature increased by 2.9 °C for clear sky conditions. The initial Arrhenius warming artifact was amplified by a ‘water vapor feedback’ created by the fixed RH assumption. In addition, the time integration procedure used in the model required about a year to reach a new steady state. In the real atmosphere, the daily variations in temperature and humidity are sufficiently large that the small temperature changes calculated for each model step cannot accumulate over time.
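

The lapse rate constraint in these 1-D RC models is applied through a 'convective adjustment' step. The toy function below is a minimal sketch of that idea, not the MW67 scheme itself: it relaxes any layer that exceeds the 6.5 °C per kilometer limit back to the critical profile, sweeping once from the surface upward, and unlike the real scheme it does not conserve the column energy.

```python
# Toy sketch of convective adjustment: wherever the temperature falls off
# with height faster than the critical lapse rate, the upper layer is
# warmed back to the critical profile. Illustrative only.

CRITICAL_LAPSE = 6.5e-3  # K per meter (6.5 degrees C per km)

def convective_adjustment(temps, heights):
    """temps: layer temperatures in K, heights: layer altitudes in m,
    both ordered from the surface upward."""
    t = list(temps)
    for i in range(1, len(t)):
        dz = heights[i] - heights[i - 1]
        lapse = (t[i - 1] - t[i]) / dz
        if lapse > CRITICAL_LAPSE:  # super-critical layer: relax it
            t[i] = t[i - 1] - CRITICAL_LAPSE * dz
    return t

# A 10 K/km profile is relaxed back to 6.5 K/km:
heights = [0.0, 1000.0, 2000.0, 3000.0]
temps = [288.0, 278.0, 268.0, 258.0]
print(convective_adjustment(temps, heights))
# -> [288.0, 281.5, 275.0, 268.5]
```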


Manabe’s group then spent the next 8 years incorporating the MW67 algorithms into each unit cell of a ‘highly simplified’ GCM. When the CO2 concentration was doubled in this 1975 model, the surface temperature increase was also 2.9 °C. This was just a mathematical artifact created by the MW67 algorithms. However, it provided an invalid benchmark for later climate models and was used in the Charney Report. The temperature increase produced by a CO2 doubling is now known as the equilibrium climate sensitivity (ECS).


As the Apollo (moon landing) program ended, there was a major reduction in funding at NASA. The planetary atmospheres group, including a young James Hansen, was told to switch to ‘earth studies’. In 1976, this group simply copied the MW67 model and created warming artifacts for 10 ‘minor species’: N2O, CH4, NH3, HNO3, C2H4, SO2, CCl2F2, CFCl3, CH3Cl and CCl4. Then in 1981, they added a slab ocean, the CO2 doubling ritual and the calculation of a global mean temperature record to their 1-D RC model. This provided the foundation for the pseudoscience of radiative forcings, feedbacks and climate sensitivity still used by the climate models today. There was no critical thinking. They simply copied the modeling errors created by Manabe’s group. A paycheck was more important.


As computer technology improved, the 1-D RC model was replaced by atmospheric GCMs and then by coupled ocean-atmosphere GCMs. The steady state air column was replaced by a fictional planetary average energy balance. The GCMs were simply ‘tuned’ to match the global temperature record. As funding was reduced for nuclear programs, especially after the accident at Three Mile Island, the National Labs, part of the US Department of Energy since 1977, also jumped on the climate bandwagon. This led to the Coupled Model Intercomparison Project (CMIP) that has been a major source of the climate model results used by the Intergovernmental Panel on Climate Change (IPCC). There was no critical thinking. The existing models were simply ‘improved’. The underlying assumptions were never questioned.


There was a significant change in the climate models that started with the Third IPCC Climate Assessment Report (TAR) in 2001. The radiative forcings were split into ‘natural’ and ‘anthropogenic’ contributions and a dubious statistical argument was used to claim that the anthropogenic warming artifacts led to an increase in the frequency and intensity of ‘extreme weather events’. This provided the foundation for the disastrous ‘Net Zero’ policy of today.


Little has changed since 2001. The introduction to Chapter 7 of AR6 WG1, ‘The Earth’s energy budget, climate feedbacks, and climate sensitivity’ [IPCC, 2021], starts:


This chapter assesses the present state of knowledge of Earth’s energy budget, that is, the main flows of energy into and out of the Earth system, and how these energy flows govern the climate response to a radiative forcing. Changes in atmospheric composition and land use, like those caused by anthropogenic greenhouse gas emissions and emissions of aerosols and their precursors, affect climate through perturbations to Earth’s top-of-atmosphere energy budget. The effective radiative forcings (ERFs) quantify these perturbations, including any consequent adjustment to the climate system (but excluding surface temperature response). How the climate system responds to a given forcing is determined by climate feedbacks associated with physical, biogeophysical and biogeochemical processes. These feedback processes are assessed, as are useful measures of global climate response, namely equilibrium climate sensitivity (ECS) and the transient climate response (TCR).


In order to encourage some more critical thinking in climate science, everyone involved should be required to answer the following question:


At present, the average increase in atmospheric CO2 concentration is about 2.4 ppm per year. This produces an increase in the downward LWIR flux from the lower troposphere to the surface of approximately 0.034 Watts per square meter per year. How does this change the surface temperature of the earth and impact ‘extreme weather’ events?


The short answer is that such an increase in the atmospheric CO2 concentration can have no measurable effect on surface temperature, nor can it alter extreme weather events.


Further Reading:


R. Clark (2024), “A Nobel Prize for Climate Modeling Errors”, Science of Climate Change 4(1), pp. 1-73. [https://doi.org/10.53234/scc202404/17]

R. Clark (2023), “Time Dependent Energy Transfer: The Forgotten Legacy of Joseph Fourier”, Science of Climate Change 3(5), pp. 421-444. [https://doi.org/10.53234/scc202310/25]

Roy Clark and Arthur Rörsch (2023), Finding Simplicity in a Complex World – The Role of the Diurnal Cycle in Climate Energy Transfer and Climate Change, Clark Rörsch Publications, Thousand Oaks, CA. Website: [https://clarkrorschpublication.com]