Ventura Photonics
Non Imaging Optics
Greenhouse Effect
Global Warming
Note: This is part of a longer article that was submitted as a comment to the US Department of Fish and Game in
response to proposed regulations to restrict access to US west coast beaches to protect the western snowy plover.  The
restriction of additional beach area was proposed to compensate for loss of beach due to ‘sea level rise caused by a
warming trend associated with climate change’.  The full article can be found at:
(Click on the red .pdf button, lower right to access the full .pdf file)
Climate Astrology
There is no evidence of any rise in sea level that can be attributed to the observed increase of 70 ppm
in atmospheric CO2 concentration that has occurred over the last 50 years.  Furthermore, no rise in sea
level can be expected from any additional increase in atmospheric CO2 concentration, including the
much discussed ‘CO2 doubling’ to 560 ppm.  In fact, it is simply impossible for the observed increase in CO2 concentration to have caused any kind of climate change.  The dire ocean flooding and other
global warming disasters predicted by the United Nations (UN) Intergovernmental Panel on Climate
Change (IPCC) reports and others are based on nothing more than invalid computer simulations.  All
such predictions have been demonstrated to be incorrect.

There are two fundamental errors that have been made in the computer simulations of global warming.  
The first is the assumption that there is some form of climate equilibrium that can be analyzed using
perturbation theory.  This approach is known as radiative forcing.  The second is the substitution of the
meteorological surface air temperature (MSAT) as a surrogate for the real surface temperature.  The
MSAT is the temperature of the air measured in an enclosure placed at eye level above the ground.  
There is no simple or obvious relationship between the MSAT and the surface temperature of the
ground underneath the enclosure.  The whole global warming argument is based on the empirical
speculation that the observed increase in atmospheric CO2 concentration has caused a rise in the long
term global average ‘equilibrium surface temperature’.  The two unrelated graphs of the increase in
atmospheric CO2 concentration and the long term MSAT trend were scaled and made to overlap.  This
created the so called ‘hockey stick’ curve that has been used to justify the global warming argument.  A
false empirical relationship between the rise in atmospheric CO2 concentration and the MSAT was
created by using the small increase in the downward long wave infrared (LWIR) flux from the increase in CO2 concentration as a ‘calibration factor’ for the global warming simulations.  The computer climate
models have been ‘hard wired’ using a circular argument to create global warming from an increase in
‘greenhouse gases’.  If the concentration of the ‘greenhouse gases’ increases then by definition, the
surface temperature must increase.  The hockey stick is just propagating itself.  This is empirical
pseudoscience that can only be described as climate astrology or computational science fiction.

When the real energy transfer physics that determines the surface temperature is examined in detail, it
becomes clear that the observed increase in CO2 concentration cannot cause any measurable rise in
surface temperature.  The greenhouse effect cannot be explained using climate equilibrium arguments.  
Instead, it has to be described in terms of the dynamics of the surface energy transfer.  There are six
different time dependent energy transfer processes that have to be considered.  These are discussed in
detail in this review.  Once the dynamic properties of the greenhouse effect are understood, the
observed changes in the MSAT record can be explained in terms of variations in ocean surface
temperatures coupled with observational bias in the MSAT station record caused by urban heat island
effects.  Temperatures in urban areas have increased compared to the surrounding rural areas
because of the additional heat stored in the urban infrastructure.  The MSAT record has also been
‘adjusted’ or ‘homogenized’ in various ways to produce the climate record and these have resulted in
additional temperature increases that were not part of the original MSAT data.  
In order to understand the relationship between sea level and climate change it is necessary to consider
the effects of at least four different climate cycles that occur on different time scales.  Over recent
geological time, the Earth has cycled through an Ice Age with a period of approximately 100,000 years
[Augustin et al; 2004; Barbante et al. 2006].  This is linked to changes in the ellipticity of the Earth’s orbit
around the sun caused by planetary perturbations, mainly by Jupiter and Saturn [Varadi et al, 2003].  At
the last glacial maximum, about 20,000 years ago, sea level was approximately 120 m lower than it is
today [Lambeck, 2004].  During recent recorded history, the Earth’s climate has fluctuated with a period
of a few hundred years [Loehle & Huston, 2008].  The last climate minimum was the Maunder Minimum
in the seventeenth century.  This was preceded by the Medieval Warming Period when the Vikings
settled in Greenland and along the eastern coast of N. America.  It now appears that the Earth has
started to cool again after another ‘Modern Maximum’ warm period.  This warming cycle is linked to long
term changes in the sunspot cycle.  During the Maunder Minimum, very few sunspots were observed for
70 years from 1645 to 1715 [Harvey, 1997].  As the Earth has warmed from the Maunder Minimum, sea
levels have risen at a rate of approximately 8 inches per century [Akasofu, 2010].  This rate now
appears to be slowing down [Houston & Dean, 2011].  

The Earth’s climate is also influenced by periodic fluctuations in ocean surface temperatures.  The N.
Pacific and N. Atlantic Oceans have well established variations in surface temperature with periods of
approximately sixty years known as the Pacific Decadal Oscillation (PDO) and the Atlantic Multidecadal
Oscillation (AMO) [Cheetham, 2011].  These have been linked to the dust bowl droughts in the 1930’s
and are the underlying cause of the ‘global warming’ that has been erroneously attributed to CO2.  The
PDO is the dominant trend found in the California climate record [Clark, 2010a].  In addition there is the
well known short term El Nino Southern Oscillation (ENSO) in the Equatorial Pacific Ocean that
fluctuates with a period between 3 and 7 years [NOAA, 2011].  The changes in air pressure and wind
patterns produced during the ENSO cycle influence the sea levels that are measured by tide gauges
and satellite altimetry.  Such effects have been misinterpreted as an increase in sea level caused by
rising levels of atmospheric CO2 [Morner, 2010].

The idea that infrared active gases in the atmosphere can trap IR radiation and warm the Earth was first
proposed by Joseph Fourier in 1827 [Fourier, 1827].  Speculation that changes in atmospheric CO2 concentration can cause Ice Age fluctuations started in the middle of the nineteenth century.  John Tyndall began his studies of the infrared (IR) absorption of gases in 1859 and correctly identified water vapor, followed by carbon dioxide, as the most important IR absorbing gases in the atmosphere [Tyndall,
1863].  He was also interested in the study of glaciers and accepted the Ice Age glaciation theories of
Louis Agassiz [Agassiz, 1840].  This led him to propose that changes in CO2 concentration might be
responsible for climate change.  These empirical speculations have continued unabated for 150 years
[Weart, 1997].  However, when the effect of the observed increase in atmospheric CO2 concentration on the surface energy transfer is examined in detail, it is simply impossible for this increase to have caused any kind of climate change [Clark, 2010b, 2010c].  

The conventional greenhouse effect is explained in terms of incorrect equilibrium assumptions.  The
average solar flux reaching the Earth’s surface is 240 W.m^-2, which corresponds to an emission
temperature of 255 K.  The Earth’s surface is at an average temperature of 288 K.  This 33 K difference
is attributed to ‘greenhouse gas trapping of IR radiation’ that warms the surface [Taylor, 2006].  Since it
is assumed, incorrectly, that there is some form of climate equilibrium, an increase in ‘greenhouse gas concentration’ must therefore increase this ‘IR trapping’ and cause an increase in surface temperature.  
This argument has no basis in climate reality.  There is no equilibrium on any time scale.  The local
surface temperature varies on a diurnal and seasonal time scale.  The long term average is a
mathematical construct that has little connection to the observed local surface temperature.  The
starting point for any realistic analysis of the greenhouse effect is the simple observation that the dry
sand on the beaches of areas such as Southern California is almost too hot to walk on once it has been
heated by the summer sun.  The ground warms up during the day as it is heated by the sun and cools
off at night.  The full summer flux reaching the surface is approximately 1000 W.m^-2.  If the sun were to
shine long enough to reach thermal equilibrium, the equilibrium surface temperature would be 93 C.  
Similarly, the Earth is always cooling by long wave infrared (LWIR) emission to space.  If the sun stopped
shining, the Earth would continue to cool until it reached the temperature of outer space.  We are
fortunate that the Earth is warmed by the sun during the day and cools at night in such a way that
extremes of temperature are avoided.  The surface temperature is maintained by a dynamic balance
between the solar heating flux, moist convection and LWIR emission.  There are six different energy
transfer processes that interact to maintain the Earth’s surface temperature: the surface energy transfer
at the air-ocean and the air-land interfaces, the downward LWIR flux from the atmosphere, the direct
surface emission to space, the convective transport through the atmosphere and the LWIR emission to
space from the atmosphere.  
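The 255 K emission temperature and the ~93 C equilibrium surface temperature quoted above both follow from the Stefan-Boltzmann law.  A minimal sketch (the emissivity parameter is an illustrative assumption, not taken from this text):

```python
# Equilibrium temperature from the Stefan-Boltzmann law: F = e * sigma * T^4
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W.m^-2.K^-4

def equilibrium_temperature(flux_w_m2, emissivity=1.0):
    """Temperature of a surface that radiates the given flux at equilibrium."""
    return (flux_w_m2 / (emissivity * SIGMA)) ** 0.25

t_emission = equilibrium_temperature(240.0)   # ~255 K for the 240 W.m^-2 average solar flux
t_summer = equilibrium_temperature(1000.0)    # ~364 K (~91 C) for the full 1000 W.m^-2 summer flux
```

With unit emissivity the 1000 W.m^-2 case gives about 91 C; an emissivity slightly below 1 pushes the result toward the 93 C figure quoted in the text.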

During the 1960’s, a mathematical concept known as radiative forcing was introduced into climate
science and climate model simulations [Manabe & Wetherald, 1967].  This made some incorrect
assumptions about climate energy transfer and long term ‘climate equilibrium states’ that allowed the
effect of an increase in CO2 concentration on the Earth’s ‘equilibrium surface temperature’ to be
calculated using the rather limited computational capabilities available at the time.  The ‘equilibrium
surface temperature’ so defined is not a valid measurable climate variable.  However, this was
conveniently ignored and the meteorological surface air temperature (MSAT) was substituted for the
real surface temperature [Hansen, 2005a; Jones et al, 1999].  The MSAT is the air temperature
measured in an enclosure placed at eye level 1.5 to 2 m above the ground [Quayle et al, 1991].  There
is no simple or obvious connection between the real surface temperature and the MSAT.  The
calculated increases in ‘surface temperature’ produced by the increase in CO2 flux alone were too low
to match the MSAT record, so additional ‘water vapor feedback’ effects were created to increase the calculated rise in surface temperature [Held & Soden, 2000].  These have now been shown to be incorrect
[Lindzen & Choi, 2009].  It was claimed that a 1 C rise in ‘average equilibrium surface temperature’ found
in the long term MSAT record was produced by CO2.  This claim is based on nothing more than the
overlap of two unrelated curves: the increase in atmospheric CO2 concentration and the long term
increase in the MSAT.  A purely empirical pseudoscientific reasoning was then applied to give an aura of
quantitative analysis [Clark, 2010b].  

It was determined that the observed 100 ppm increase in atmospheric CO2 concentration over the last
200 years had produced an increase in downward long wave infrared (LWIR) flux of 1.7 W.m^-2.  This is
derived from independent radiative transfer calculations using the spectroscopic data from the HITRAN
database [Clark, 2010b; Rothman et al, 2005].  The number is correct for ‘clear sky’ conditions at a
surface temperature near 288 K.  This change in flux is too small to cause any measurable change in
surface temperature based on actual engineering calculations of the dynamically varying surface heat
transfer [Clark, 2010c].  Instead an empirical ‘radiative forcing constant’ was created by dividing the
observed ‘average’ increase in the MSAT temperature record by the increase in LWIR flux from CO2 over the last century: 1/1.5 = 2/3 C/(W.m^-2).  This ‘magic recipe’ can then be applied as a ‘calibration
constant’ for other IR active gases.  The HITRAN database is used to determine the increase in
downward atmospheric LWIR flux produced by an estimated increase in ‘greenhouse gas
concentration’.  Multiply the increase in flux by the 2/3 ‘calibration constant’ for CO
2 and the warming of
every ‘greenhouse gas’ can be calculated.  This is nothing more than climate astrology, but it is the
basis of all of the IPCC climate change predictions and sea level rise claims [Alley et al, 2007; Hansen,
2005a; Hansen et al, 2005b; Knutti et al, 2008; Solomon et al, 2009].  Using this method increases in
‘greenhouse gas’ concentrations can only produce an increase in surface temperature. In order to
‘adjust’ the temperature rise, aerosol effects and other ‘natural forcing constants’ have been introduced
to provide empirical cooling effects to offset or modulate the empirical greenhouse gas warming
constants.  These are based for example on manipulations of the estimated cooling produced by
volcanic aerosols from eruptions such as Mount Pinatubo.  This allows the climate models to be
empirically ‘tuned’ to match measured temperatures for ‘hindcasting’ [Eschenbach, 2010].  
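The ‘magic recipe’ described above reduces to two lines of arithmetic.  The logarithmic expression below is the commonly used simplified CO2 forcing formula (the Myhre et al. form); it is not taken from this text, but it reproduces a flux increase close to the 1.7 W.m^-2 figure quoted:

```python
import math

def co2_forcing_w_m2(c_ppm, c0_ppm):
    # Simplified logarithmic CO2 forcing expression (Myhre et al. form)
    return 5.35 * math.log(c_ppm / c0_ppm)

CALIBRATION = 2.0 / 3.0  # the empirical 'radiative forcing constant', C per W.m^-2

# ~100 ppm rise over 200 years (280 -> 380 ppm, illustrative values)
delta_f = co2_forcing_w_m2(380.0, 280.0)  # ~1.6 W.m^-2, close to the quoted 1.7
delta_t = CALIBRATION * delta_f           # ~1.1 C of 'predicted' warming
```

The same two lines, with a different `delta_f` from the HITRAN-based flux calculation, are all that is needed to generate a ‘predicted’ warming for any IR active gas.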

In November of 2009, a large archive of e-mails and other files from the Climatic Research Unit of the University of East Anglia was released on the Internet [Monckton, 2009; Montford, 2010; Mosher
& Fuller, 2010].  This revealed a pattern of egregious scientific misconduct that extended back over
several decades.  Climate data had been manipulated to create warming where none existed,
particularly for tree ring data.  Legitimate requests for information made under the Freedom of
Information Act were routinely circumvented or denied.  The entire publication and grant awarding peer
review process in climate science had been corrupted.  Friends and associates reviewed each other’s
papers to make sure only articles that agreed with their global warming position were published
regardless of scientific merit.  Pressure was applied to journal editors to reject papers that presented
opposing views.  Even now, journals such as Nature and Science show a strong editorial bias towards
global warming.  Similarly, climate scientists and other self interested parties have tried to influence the
world’s major scientific societies to support global warming.  Such statements of support should be
discounted.  The Royal Society has recently reviewed and revised its policies in this area [Royal Society,
2010].  A small group of climate scientists has also controlled the content of the IPCC reports.  The IPCC is a political body tasked with the job of identifying anthropogenic global warming whether it really exists or not [McLean, 2010a; Cheetham, 2009; Laframboise, 2011].  The four major IPCC reports and the computer models used to predict
climate change in the ‘IPCC scenarios’ should not be introduced as scientific evidence of climate change.

As discussed above, the climate simulation computer models are based on nothing more than a circular
empirical argument.  Carbon dioxide must cause global warming therefore more carbon dioxide must
cause more global warming.  This is the pseudoscience built into the radiative forcing constants used in
the climate simulation models.  Once the erroneous equilibrium assumptions are removed, there can be
no CO2 induced global warming.  This means that all of the predictions of catastrophic rises in sea level,
increases in hurricane intensities, polar ice melting, extreme weather events, receding glaciers and
other global warming related disasters are invalid.  There is no evidence to support any of the global
warming disaster claims.  
The large scale climate simulation models used by the IPCC researchers to predict global warming, sea
level increases and other ‘catastrophes’ are based on the concept of radiative forcing.  This was
introduced by Manabe and Wetherald in 1967, although the basic idea predates this publication.  In
order for the Earth’s climate to be stable, the First Law of Thermodynamics, conservation of energy,
requires that the long term LWIR emission from the Earth balance the incoming solar radiation.  As
discussed above, this is a dynamic balance, not a formal equilibrium requirement.  Radiative forcing
assumes, without justification or validation that long term averages of transient, non-equilibrium climate
variables can be analyzed as a system that is in equilibrium.  This is, quite simply, wrong.  The upward LWIR flux from the surface and the LWIR flux emitted to space at the ‘tropopause’ are then assumed to be equal and equivalent.  Spectroscopic considerations of the molecular emission linewidth show that these fluxes are not equivalent.  This is discussed in Section 3.7
and illustrated in Figure 3-31 of the full Sea Level Rise Comment [Clark 2011].  A change in CO2
concentration is introduced to ‘perturb’ this ‘equilibrium’ and the change in flux is used to calculate a new
‘equilibrium surface temperature’.  This calculated ‘equilibrium surface temperature’ produced by such
models is not a measurable climate variable.  It assumes that the sun is shining all the time and that the
unperturbed surface is a mathematically defined blackbody surface that is initially receiving and emitting
a flux of 390 W.m^-2.  Small, 1 to 4 W.m^-2 changes in flux in a stratospheric layer of air at 217 K and
0.22 atm. are assumed to be capable of warming a surface at 288 K through 11 km of warmer, higher
density air.  This requires a flagrant violation of the Second Law of Thermodynamics.  The increase in
‘equilibrium surface temperature’ calculated by such models using just the increase in the CO2 flux is too low, so additional ‘water vapor feedback’ was created to explain away the inadequacies.  The increase in LWIR flux from CO2 produces additional water evaporation, which in turn produces more heating.  This is
just mathematical fiction. The cooling of the surface by convection, the conversion of IR radiation into
other forms of energy and the heat capacity/thermal storage properties of the surface are also
conveniently ignored.  
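The 390 W.m^-2 surface flux and the relative magnitude of the stratospheric fluxes quoted above can be checked directly from the Stefan-Boltzmann law:

```python
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W.m^-2.K^-4

def blackbody_flux(t_kelvin):
    """Flux emitted by an ideal blackbody surface, W.m^-2."""
    return SIGMA * t_kelvin ** 4

f_surface = blackbody_flux(288.0)       # ~390 W.m^-2 for the assumed 288 K blackbody surface
f_stratosphere = blackbody_flux(217.0)  # ~126 W.m^-2 for a stratospheric layer at 217 K
# The 1 to 4 W.m^-2 'forcings' are small perturbations on both of these fluxes.
```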

However, the idea that an increase in CO2 concentration must lead to an increase in surface
temperature was initially accepted almost without question.  Careful analysis of the meteorological
surface temperature record starting in the 1980s revealed a small increasing trend that was immediately
correlated by empirical speculation to the rise in CO2 concentration [Jones et al, 1999].  This
conveniently ignores a very similar increase that occurred during the dust bowl droughts in the 1930s
before there was any significant increase in CO2 concentration.  The US MSAT anomaly record is shown in Figure 4-6 [NASA, GHCNM, 2011; D’Aleo, 2008].  It should also be noted that the GISS GHCNM climate record has been periodically ‘adjusted’ to reduce the dust bowl peak [D’Aleo, 2010].   The
dependence of meteorological surface air temperature on weather patterns, ocean surface
temperatures, solar illumination and surface absorption was ignored, and empirical correlation is not proof of causation.  Two unrelated plots of meteorological surface temperature and CO2 concentration were overlaid
and made to coincide to produce the so called ‘hockey stick’ graph.  Concern over ozone depletion then
led to the inclusion of other greenhouse gases into the radiative forcing models.  An elaborate set of
radiative forcing constants that related small changes in IR emission to surface temperatures was
constructed [Hansen, 2005a].  This was ‘calibrated’ using the change in the meteorological surface
temperature (MSAT) that was empirically assumed to be caused by CO2. This ‘global warming’ has no
relationship to the true ground surface temperature that is needed to calculate the IR surface flux and
no demonstrated causal relationship to the change in CO2 concentration. This whole approach is
pseudoscience.  The same technique is used in astrology.  While the positions of the planets can be
calculated quite accurately, they have no relationship to human behavior.  The only ‘proof’ ever
provided for radiative forcing is that the results from one invalid model can be made to agree with those
from another.  No experimental verification is apparently required, nor can any measurement of
‘equilibrium surface temperature’ be performed.  When realistic values for the surface flux terms are
used in an engineering calculation of the surface temperature, an increase of 1.7 W.m^-2 in LWIR flux
from a 100 ppm increase in atmospheric CO2 concentration cannot change the surface temperature.  
This is shown above in Section 3.3 of the full Sea Level Rise Comment [Clark 2011].  However, instead
of rejecting the concept of radiative forcing as a failed hypothesis, it was argued that the increase in
LWIR flux produced an increase in surface evaporation which introduced a ‘positive feedback’ that
amplified the effect of CO2 on the surface temperature.  An analysis of ERBE satellite data has shown
this to be incorrect [Lindzen & Choi, 2009].

The radiative forcing assumptions used in the IPCC climate simulation models have no basis in physical
reality.  The equilibrium average assumptions and the use of perturbation theory are invalid.  There is
no justification for the use of the meteorological surface air temperature (MSAT) as a surrogate for the
measured local surface temperature.  The models are ‘hard wired’ using empirical ‘radiative forcing
constants’ and ‘water vapor feedback’ to produce global warming from CO2 and other ‘greenhouse
gases’ with a complete and arrogant disregard of the basic laws of physics including the First and
Second Laws of Thermodynamics.  The observed changes in MSAT can be explained as a
consequence of the influence of ocean surface temperatures such as the AMO and PDO on the bulk air
temperature of the Earth’s weather systems.  This can be clearly seen in the weather station record for
the State of California as shown above in Section 2.3 of the full comment [Clark, 2011].  Superimposed
on the ocean temperature fluctuations are urban heat island effects.  These may be clearly identified for the State of California using the PDO as a reference [Clark, 2010b].  

The ‘hockey stick’ temperature increase for CO2 from 1958 may be calculated by multiplying the increase in LWIR flux from CO2 by the empirical ‘radiative forcing calibration constant’, 0.67 C/(W.m^-2).  
The resulting increase in ‘predicted average surface temperature’ is 0.8 C.  If the linear slopes of the
PDO and AMO are averaged and an offset of 0.267 C is added, then the resulting line is an almost
exact linear fit to the hockey stick curve for CO2 forcing.  This shows quite clearly the influence of ocean
temperatures, urban heat island effects and other ‘adjustments’ on the long term weather station trends
that were manipulated to derive the ‘hockey stick’ curve.  Recent work by Wyatt et al [2011] has shown
that the long term temperature fluctuations measured in the northern hemisphere can be matched by a
simple combination of the AMO and PDO indices.  This is shown in Figure 4-2.  This is similar to the
work of D’Aleo [2008] on the US continental temperature record.  
Figure 4-1:  AMO and PDO and trend lines plotted from 1960.  The hockey stick surface
temperature prediction is also shown.  When the average AMO+PDO trend line is offset
by 0.267 C it almost overlaps the hockey stick prediction.
Figure 4-2: Relationship between northern hemisphere temperature and the combined
AMO and PDO indices.  The fit is almost exact [Wyatt et al, 2011].
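The 0.8 C ‘hockey stick’ prediction described above can be reproduced with the same recipe.  The logarithmic forcing expression and the ~315 ppm Mauna Loa baseline for 1958 are assumptions for illustration, not figures from this text:

```python
import math

def co2_forcing_w_m2(c_ppm, c0_ppm):
    # Simplified logarithmic CO2 forcing expression (assumed Myhre et al. form)
    return 5.35 * math.log(c_ppm / c0_ppm)

# 1958 baseline of ~315 ppm rising by roughly the 70 ppm noted in this article
delta_f = co2_forcing_w_m2(385.0, 315.0)  # ~1.1 W.m^-2 increase in LWIR flux since 1958
delta_t = 0.67 * delta_f                  # ~0.7-0.8 C 'hockey stick' prediction
```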
Scientific Misconduct by the Climate Community and the Collapse of the Peer Review Process
Claims of catastrophic climate change caused by carbon dioxide induced global warming have proved
very effective as a means for obtaining research funding and a whole generation of scientists has
become accustomed to this source of income.  A global warming industry of scientists, engineers,
economists and politicians has benefited significantly from the invalid climate predictions.  It is now clear
that all of these claims are false and a large and influential group of people have been trapped in a web
of lies.  The IPCC has been shown to be a corrupt political body [Laframboise, 2011].  The release of a
large archive of e-mails and other files from the Climatic Research Unit of the University of East Anglia has provided abundant
evidence of scientific misconduct.  These matters are under investigation and will be dealt with in due
course by the appropriate authorities.  However, radiative forcing has been in use since 1967 and the
whole peer review process in climate science has collapsed.  Numerous papers have found their way
into ‘respected’ scientific journals such as Nature, Science and Proceedings of the National Academy of
Science (PNAS) that were based on research that should never have been funded and results that should have been rejected as invalid and never published.  Every single result and conclusion that has been based on the use of radiative forcing is invalid and should not be used in any kind of policy making.

By way of example, all of the climate change papers published by Hansen et al that are listed on the
NASA GISS website are invalid and the related NASA GISS discussion of radiative forcing is nothing
more than climate astrology.  Two papers illustrate the issue quite clearly.  These are:

J. Hansen, M. Sato, R. Ruedy, L. Nazarenko, A. Lacis, G. A. Schmidt, G. Russell, I. Aleinov, M. Bauer, S. Bauer, N. Bell, B.
Cairns, V. Canuto, M. Chandler, Y. Cheng, A. D. Genio, G. Faluvegi, E. Fleming, A. Friend, T. Hall, C. Jackman, M. Kelley,
N. Kiang, D. Koch, J. Lean, J. Lerner, K. Lo, S. Menon, R. Miller, P. Minnis, T. Novakov, V. Oinas, Ja. Perlwitz, Ju. Perlwitz,
D. Rind, A. Romanou, D. Shindell, P. Stone, S. Sun, N. Tausnev, D. Thresher, B. Wielicki, T. Wong, M. Yao, and S. Zhang,
J. Geophys. Research, 110 D18104 pp1-45 (2005) ‘Efficacy of climate forcings’

J. Hansen, L. Nazarenko, R. Ruedy, M. Sato, J. Willis, A. D. Genio, D. Koch, A. Lacis, K. Lo, S. Menon, T. Novakov, J. Perlwitz, G. Russell, G. A. Schmidt and N. Tausnev,
Science 308 1431-1435 (2005), ‘Earth's energy imbalance: confirmation and implications’

The first paper starts from the a-priori assumption that CO2 has been the cause of the observed change
in the meteorological surface temperature record as described in the hockey stick plot.  This presumed,
invalid relationship between CO2 and the MSAT record is then used to construct an elaborate set of
radiative forcing constants for other greenhouse gases.  The models are therefore empirically hard
wired to produce global warming as the concentration of the various greenhouse gases increases.  The
‘equilibrium surface temperature’ can only increase as the greenhouse gas concentration increases.  
The principal way to produce cooling in such a model is to add aerosol effects, so these are empirically
adjusted to make the model output appear to match the observed MSAT record.  Aerosol emissions
from volcanic eruptions are used for ‘fine tuning’ [Eschenbach, 2011].  The model is run to simulate a
time period of up to 300 years for various ‘forcing’ conditions.  It takes a simulated period of approximately 100 years for the model to settle down and achieve some form of computational stability.  
This does not mean that the ‘stable’ results have any relationship to physical reality.  It is also important
to note that some of the authors of this paper also control one of the principal climate records
maintained by NASA GISS.  Neither the model code, nor the data processing used to produce the GISS
climate records have been published, so there are fundamental conflict of interest issues that need to
be resolved.  Mysterious ‘adjustments’ to the GISS climate data sets have already been documented [D’Aleo, 2010].  

The authors also state in the paper ‘Principal model shortcomings include ~25% regional deficiency of
summer stratus cloud cover off the west coast of the continents with resulting excessive absorption of
solar radiation by as much as 50 W/m^2, deficiency in absorbed solar radiation and net radiation over
other tropical regions by typically 20 W/m^2, sea level pressure too high by 4–8 hPa in the winter in the
Arctic and 2–4 hPa too low in all seasons in the tropics, deficiency of rainfall over the Amazon basin by
about 20%, deficiency in summer cloud cover in the western United States and central Asia by ~25%
with a corresponding ~5°C excessive summer warmth in these regions’.  In spite of 50 W/m^2 deficiencies in flux and 5 C temperature errors, such model results were allowed to be published by the journal.

In the second paper, the a-priori assumption is made that an observed increase in ocean heat content
(temperature) has been caused by the observed increase in CO2 concentration.  As discussed above in Section 3.2, this is simply impossible.  The daily solar flux into the oceans can easily exceed 20 MJ.m^-2.day^-1.  The observed increase in downward ‘clear sky’ LWIR flux from CO2 has been 0.15 MJ.m^-2.day^-1 over 200 years, coupled into the first 100 microns of the ocean surface.  Here it produces an
insignificant change in the wind driven fluctuations in surface evaporation.  The model used in the ocean
heat content simulations did not include any ocean oscillations, and used empirical ‘radiative forcing’
techniques to simulate changes in ocean temperatures.  The model was unable to reproduce the tropical ocean heat content because it did not include any of the relevant energy transfer physics, yet the paper still
passed through peer review.
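The daily energy comparison above is simple unit arithmetic (W.m^-2 multiplied by 86,400 seconds per day), using only the figures quoted in this text:

```python
SECONDS_PER_DAY = 86_400

# Increase in downward 'clear sky' LWIR flux from CO2, as quoted earlier in the text
lwir_increase_w_m2 = 1.7
lwir_daily_mj = lwir_increase_w_m2 * SECONDS_PER_DAY / 1.0e6  # ~0.15 MJ.m^-2.day^-1

solar_daily_mj = 20.0  # typical daily solar flux into the oceans, from the text
fraction = lwir_daily_mj / solar_daily_mj  # well under 1% of the daily solar input
```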

Other examples of papers that should never have been published include:

R. Knutti, M. R. Allen, P. Friedlingstein, J. M. Gregory, G. C. Hegerl, G. A. Meehl, M. Meinshausen, J. M. Murphy, G. K. Plattner, S. C. B. Raper, T. F. Stocker, P. A. Stott, H. Teng and T. M. L. Wigley,
Journal of Climate 21(11) 2651-2663 (2008), ‘A Review of Uncertainties in Global Temperature Projections over the Twenty-First Century’

G. A. Meehl, C. Covey, T. Delworth, M. Latif, B. McAvaney, J. F. B. Mitchell and R. J. Stouffer,
Bulletin of the American Meteorological Society 88(9) 1383-1394 (2007), ‘The WCRP CMIP3 multimodel dataset’

S. Solomon, G-K. Plattner, R. Knutti and P. Friedlingstein,
Proc Natl Acad Sci USA 106 1704-1709 (2009), ‘Irreversible climate change due to carbon dioxide emissions’

In all of these papers, it is assumed that models are capable of simulating ‘equilibrium surface
temperatures’ and that these temperatures somehow are mysteriously related to the MSAT record.  
Most of the error discussion is related to the uncertainties in the CO2 emission ‘scenarios’.  The fact that
these models have no basis in physical reality is ignored.  ‘Experiments’ consist of nothing more than
comparing the results of one invalid model with another.  

It should be clear from this discussion that the authors of these papers have completely lost contact with
the realities of climate physics.  The peer review process has collapsed and the authors have acted
using a ‘buddy system’ to review each other’s papers and grant proposals.  The e-mail correspondence
revealed in the ‘climategate’ archive makes it clear that this is a closed community of climate cronies that
shares a common religious belief in global warming.  No other ideas outside of invalid radiative forcing
concepts are accepted and any work that does not support global warming is suppressed.  These
authors have tried to control the content of the papers published in Nature, Science, PNAS, J. Climate
and other journals.  It is only recently that this dominance has been challenged and the egregious level
of scientific misconduct has become apparent [Mosher & Fuller, 2010].  All of the scientific papers
published by this network of authors should be discounted as scientifically invalid.  Other issues, such
as the fraudulent use of research funds will not be considered in this discussion, but these matters
clearly require independent investigation.  
The use of invalid empirical radiative forcing models to predict global warming has resulted in a large number of fraudulent claims of global warming disasters.  As discussed above, these are based on nothing more than empirical assumptions that can only be described as climate astrology.  The primary claim is that the increase in atmospheric CO2 concentration has led to global warming and that further increases in ‘greenhouse gases’ will lead to further global warming.  The second major claim is that sea levels will rise because of melting polar ice and glaciers.  The third major claim is that there will be increases in climate extremes: more floods, droughts and hurricanes.  None of these claims has any basis in reality, but they have been widely reported in the mainstream news media.  The claims of sea level rise are examined in detail in Section 5.0 in the full comment [Clark, 2011], but it is worthwhile to provide a brief overview of the alarmist claims related to global warming and compare the predictions to measured climate variables.  Further details may be found in the references provided.  Most of this information has been published online because the collapse of the peer review process discussed above has restricted access to various scientific publications.  
False and Alarmist Claims of Global Warming Disasters
Global Warming and the IPCC ‘Scenarios’
The whole global warming argument is based on a misinterpretation of the climate record.   An invalid
empirical relationship between the increase in atmospheric CO2 concentration and the meteorological
surface air temperature (MSAT) record has been assumed and used in a circular fashion to create the
global warming scare.  This is a classic example of the extrapolation of a pseudo-linear increasing trend
from the upward part of a ‘bell’ or Gaussian type of curve.  The IPCC ‘scenario’ projections are
illustrated in Figure 4-3 [Schreuder, 2011].  These are from Figure 10.4, p 767 of the IPCC 2007 report
[Alley et al, 2007].  The various scenarios refer to projected increases in atmospheric CO2 concentration.  The temperature record from 1998 to November 2010 has been added to the original
IPCC figure.  The projections begin in 1998 and there has been no increase in the observed global
average MSAT or lower tropospheric satellite temperature since then.  Figure 4-4 shows the first part of
Figure 4-3 with the NASA GHCNM [2011] and the RSS Satellite records superimposed [REMSS, 2011].  
The NASA record has been offset downwards 0.3 C to overlap the satellite record.  The important point is that both curves show no increase in average temperature since 1998.  This clearly shows that the IPCC ‘Scenario’ projections, based on climate simulations using radiative forcing assumptions, have no basis in physical reality.  There are even more discrepancies between the temperature record and projections made 10 years earlier by Hansen in 1988 [Hansen et al, 1988].  Similar observations have been made by other
authors.  Figure 4-5 shows a similar comparison to Figure 4-3 based on a sinusoidal projection of the
decadal temperature oscillations overlaid on a linear temperature increase representing the recovery
from the Little Ice Age or Maunder Minimum [Akasofu, 2010].  If the recent decrease in sunspot activity
continues, then the projected linear increase may not be sustained and further decreases in
temperature may occur.  
Figure 4-3:  IPCC ‘Scenario’ projections of climate temperature increases based on various levels of CO2 emissions.  The predictions began in 1998.  There has been no increase in global temperature since then.
Figure 4-4: Expanded 1900 to 2010 temperature curve from Figure 4.3 with the recent
temperature record superimposed.  There has been no increase in observed average
temperature since 1998.
Figure 4-5: Temperature record considered as a linear recovery from the Little Ice Age
(Maunder Minimum) with decadal oscillations superimposed.  The IPCC extrapolation
error is clearly shown [Akasofu, 2010]
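The decomposition attributed to Akasofu above, a linear recovery from the Little Ice Age with a multidecadal oscillation superimposed, can be sketched numerically.  This is an illustrative toy model with made-up parameter values, not Akasofu’s fitted results; it shows how a straight line fitted only to the rising half-cycle of the oscillation exaggerates the underlying trend, which is the extrapolation error described in the text:

```python
import math

def akasofu_style_temperature(year, t0=1800.0, rate=0.005,
                              amplitude=0.1, period=60.0, phase=1985.0):
    """Toy decomposition: a linear recovery trend (deg C per year since t0)
    plus a multidecadal sinusoidal oscillation.  All parameter values are
    illustrative guesses, not fitted values from Akasofu [2010]."""
    trend = rate * (year - t0)
    oscillation = amplitude * math.sin(2.0 * math.pi * (year - phase) / period)
    return trend + oscillation

# A straight line fitted only to the rising half-cycle (here 1970-2000)
# overstates the underlying 0.005 C/yr trend:
apparent = (akasofu_style_temperature(2000) - akasofu_style_temperature(1970)) / 30.0
print(apparent > 0.005)  # True: apparent slope ~0.0117 C/yr
```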
Climate Record ‘Homogenization’ and ‘Adjustment’
Versions of the climate record are maintained by three groups: the Hadley Climate Centre in the UK
(HadCRU), the NOAA Global Historic Climate Network (GHCN) and the NASA Goddard Institute for
Space Studies (GISTEMP).  These are derived in different ways mainly from the GHCN data.  The raw
climate data is processed to ‘homogenize’ the data.  This was originally intended to account for station
bias and the change in the number and location of the weather monitoring stations with time.  Instead it
has become a means of ‘fixing’ the data so that it supports the global warming predictions.  This has
been discussed in various articles and is summarized in some detail by Cheetham [2011b].  Figure 4-6
compares the 1999 and 2001 NASA GISTEMP records following the NASA 2001 ‘adjustments’.  The
black dots are the annual mean and the black line is the 5 year average of the 1999 data.  The blue
dots and the red line are the 2001 data.
Figure 4-6: Comparison of US NASA GISTEMP data for 1999 (black) and
2001 (red/blue) following the NASA 2000 ‘adjustments’.
There are also issues with the extrapolation of station data over large distances where there is no
recorded data, particularly at high latitudes.  The number of weather stations used in the climate record
has also decreased and this has resulted in an increase in the average global temperature record.  This
is illustrated in Figure 4-7 [Cheetham, 2011b].  The important point to note is that considerable caution is
needed in using the published climate record to justify global warming.  An even more blatant example
was the manipulation of tree ring data to ‘remove’ the Maunder Minimum from the earlier proxy based
climate record [Dawson, 2010; Wegman et al, 2010].
Figure 4-7 Changes in the number of reporting stations and average
temperatures from 1950 to 2000.
Since 1979, air temperature data has been available from satellite sensors.  This record is compiled by
two groups, the University of Alabama, Huntsville (UAH) and Remote Sensing Systems (RSS).  There is
little difference between these two data sets and they are free of weather station ‘homogenization’.  One
of the most important results from the satellite data is the demonstration of the importance of the ENSO
El Nino events in changing the air temperatures in the lower troposphere.  Figure 4-8 shows the RSS
lower troposphere global average temperature.  There was no trend in the data from 1979 to 1997
[REMSS, 2011].  After the major El Nino event of 1997/8 there was a step increase in temperature
followed by a stable record with no trend.  This is the recent temperature record that should be used in the interpretation of tidal data.
Figure 4-8: RSS lower troposphere global average satellite temperature data
Sea Ice Extent
The Earth’s climate has been warming since the end of the Little Ice Age in the early eighteenth century.  One
long term record that is available is the summer limit of the ice edge in the Norwegian Sea.  This is
shown in Figure 4-9, along with part of the more recent satellite record [Akasofu, 2010].  The Norwegian
Sea ice edge retreat shows an approximately linear decrease in ice extent from 1800.  Superimposed on
this are periodic fluctuations from ocean oscillations.  Minima occur near 1860, 1940 and 2010.  Maxima
occur near 1920 and 1950.  Since atmospheric CO2 levels did not begin to increase significantly until the 1960’s, there is no reason to attribute any of the observed changes to CO2 induced global warming.
Figure 4-9: August ice edge of the Norwegian Sea relative to the 1961-1990 mean, 79.1°N
and satellite ice extent data from 1970 to 1998 [Akasofu, 2010]
Figure 4-10 and Figure 4-11 show the ice area extent anomaly for the Northern and Southern
Hemispheres [Cryosphere Today, 2011].  This is the deviation in the area from the 1979-2008 mean.  
There was a decrease in Arctic ice area from about 1995 that has leveled off after 2007; the 2007 minimum itself was caused by unusual weather conditions.  However the record is still only half of the duration of
the typical 60 year ocean cycle.  The Antarctic ice area has been stable or has increased slightly over
the period of observation.  The total global sea ice area and anomaly are presented in Figure 4-12.  
This shows a slight decrease over the period of observation that is consistent with a continued recovery
from the Little Ice Age.  There is no obvious trend in the data to indicate any effect from increased
atmospheric CO2 levels, nor should any be expected.
Figure 4-10: Northern hemisphere sea ice area anomaly (deviation from 1979-2008 mean).
Figure 4-11: Southern hemisphere sea ice area anomaly (deviation from 1979-2008 mean).
Figure 4-12: Global sea ice area and anomaly 1979 to present (March 2011).
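The anomaly plotted in the figures above is simply each year’s deviation from the 1979-2008 mean.  A minimal Python sketch, using hypothetical illustrative values rather than actual Cryosphere Today data:

```python
def ice_area_anomaly(series, baseline=(1979, 2008)):
    """Deviation of each year's ice area from the mean over the baseline
    period (inclusive), matching the definition used for Figures 4-10 to
    4-12.  `series` maps year -> ice area in million km^2."""
    base = [area for year, area in series.items()
            if baseline[0] <= year <= baseline[1]]
    mean = sum(base) / len(base)
    return {year: area - mean for year, area in series.items()}

# Hypothetical illustrative values, not real observations:
areas = {1979: 12.5, 1994: 12.1, 2007: 10.8, 2011: 11.4}
anomalies = ice_area_anomaly(areas)
print(round(anomalies[2007], 2))  # -1.0
```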
Glacier Retreat
As the Earth’s climate has warmed since the end of the Little Ice Age, glaciers have generally decreased
in length [Oerlemans, 2005].  Glaciers are flowing rivers of ice and the length depends on the glacier
mass balance which is related to the precipitation, solar radiation and air temperature.  Figure 4-13
shows the observed changes in length for 5 glaciers from different parts of the world.  These glaciers
have clearly been retreating since the end of the Little Ice Age.  Figure 4-14 shows the temperature
changes derived from glacier length/mass balance analysis.  There is nothing to indicate any effects on
glacier length that could be attributed to an increase in atmospheric CO2 concentration.  It should also
be noted that recent claims of Himalayan Glacier melting by the IPCC have been shown to be incorrect
[McLean, 2010b].  In addition, loss of ice cover on Mount Kilimanjaro has been attributed to long term
climate changes that have resulted in increased sublimation of the ice [Fairman et al, 2011].  Again, no CO2 induced global warming is involved.  
Figure 4-13:  Glacier length data for 5 glaciers in different parts of the world [Oerlemans, 2005].
Figure 4-14: Temperature changes derived from glacier length data [Oerlemans, 2005].
Extreme Climate Events
One of the most egregious claims used in the global warming argument is that the observed increase in
atmospheric CO2 concentration is causing an increase in ‘extreme weather events’.  This has allowed
the IPCC and its ‘climate team’ to make numerous sensational claims about global warming as the cause
of local temperature records, floods, droughts, hurricanes and other natural disasters including a
decline in polar bear population and coral bleaching.  None of these claims have any basis in reality.  
Consider for example the heat wave and forest fires that occurred in Russia during the summer of
2010.  These were caused by a persistent ‘blocking high’ over Western Europe.  This is part of the
normal fluctuation in weather patterns over Western Russia.  There has been no change in the long
term trend of July monthly temperatures since 1880.  Similarly, the 2010/2011 cold winter along the
Pacific Coast of S. America is part of a natural weather sequence that has been documented since
Aztec times [Ambler, 2010].  The long term temperature record for the continental US shown in Figure 3-
39 of the full Sea Level Rise Comment [Clark 2011] and the Northern Hemisphere temperature record
shown in Figure 4-2 demonstrate the relationship between ocean surface temperatures and the long
term climate temperature records.  
There have also been numerous fraudulent claims of relationships between global warming and
droughts.  Various regions of the world, including parts of North America and the Sahel (sub Sahara)
region in Africa have experienced extended droughts.  All of these have been related to changes in
ocean surface temperatures [Hagos & Cook, 2008; McCabe et al, 2008].  Extended periods of low
rainfall in parts of the US for periods of 30 years or longer should be considered as normal climate
variation consistent with ocean cycle variations.   Figures 4-15 and 4-16 both show the average annual
rainfall for the US since 1895 [NOAA, Rainfall, 2011].  The totals have remained within the 29±5 inch
range for over 100 years.  There has been a slight increase in average rainfall of 2 inches over this
period as determined using a simple linear fit to the data.  However, a more careful examination reveals
a climate shift in the 1970’s related to the El Nino event in 1977-78.  When the rainfall data is separated into two data series with a split at 1970, the climate shift becomes apparent.
Figure 4-15: Average annual rainfall for the continental US from 1895 showing a simple linear fit.
Figure 4-16: Data from Figure 4-15 showing the step increase in rainfall from the 1977
climate shift.
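The two treatments of the rainfall record contrasted above, a single linear fit versus a split into two series at 1970, can be sketched as follows.  The data here are synthetic and purely illustrative; the NOAA series itself is not reproduced:

```python
def linear_fit(xs, ys):
    # Ordinary least-squares slope and intercept.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

def split_means(years, rainfall, split=1970):
    # A step change shows up as a difference between the two period means
    # rather than as a steady slope.
    before = [r for y, r in zip(years, rainfall) if y < split]
    after = [r for y, r in zip(years, rainfall) if y >= split]
    return sum(before) / len(before), sum(after) / len(after)
```

Applied to a series with a step near 1970, `linear_fit` reports a small positive slope, while `split_means` resolves the record into two flat levels.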
In addition to droughts, numerous fraudulent claims of increases in cyclone (hurricane) intensity have
also been made.  These also have no basis in reality.  Figure 4-17 shows the estimated global and
Northern Hemisphere cyclone intensities from 1972 onwards.  The intensities are plotted as accumulated cyclone energy or ACE, which is a combination of the square of the wind speed and the duration of the
event.  Current cyclone activity is at rather low levels [COAPS, 2011].  In addition, there have been
fewer hurricanes making landfalls in the US.  It is also important to separate hurricane damage from
hurricane intensity.  In the US, the increase in population in areas such as Florida means that the
amount of damage that even a modest hurricane can cause has increased significantly in recent years.  
A detailed discussion of tropical cyclone activity and the fraudulent IPCC claims has recently been
published by William Gray [2011].
Figure 4-17:  Global and Northern Hemisphere cyclone intensities from 1972
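The ACE index described above can be computed directly.  The conventional NOAA definition, which I assume is the one underlying Figure 4-17, sums the squares of the 6-hourly maximum sustained winds (in knots) while a system is at tropical-storm strength or above, scaled by 10^-4:

```python
def accumulated_cyclone_energy(six_hourly_winds_kt):
    """ACE for one storm: sum of squared 6-hourly maximum sustained wind
    speeds (knots) while at tropical-storm strength (>= 35 kt), times 1e-4."""
    return 1.0e-4 * sum(v ** 2 for v in six_hourly_winds_kt if v >= 35.0)

# Hypothetical five-observation storm track (knots):
print(round(accumulated_cyclone_energy([30.0, 40.0, 65.0, 90.0, 50.0]), 4))  # 1.6425
```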
From this brief discussion of the occurrence of extreme weather events it should be clear that the
observed increase of 70 ppm in atmospheric CO2 concentration over the last 50 years has had no effect whatsoever on the Earth’s climate.  This can be seen by examining the temperature, rainfall and cyclone
intensity records.  In addition, the Arctic Ice area has recovered from the 2007 minimum.  However, even
agencies such as NOAA and NASA continue to try and perpetuate the myth of extreme weather events
related to global warming [D’Aleo, 2011].  This was discussed in testimony by Dr. John Christy to the
House Subcommittee on Energy and Power, Committee on Energy and Commerce, March 8th 2011.  
Further details are provided in the Appendix.  
When the recent data on global warming are reviewed, it is clear that there is no evidence whatsoever of
any carbon dioxide induced global warming.  It is highly unlikely that there will be any acceleration of the
rise in sea level above that observed over the last 100 years.  In fact, the rise in sea level has been
slowing along the U. S. west coast since 1990.

The whole global warming argument is based on the false, empirical assumption that the long term rise
in globally averaged meteorological surface air temperature (MSAT) record has been caused by an
increase in atmospheric carbon dioxide concentration of approximately 100 ppm.  The analysis
presented in Section 3.0 of the full comment [Clark, 2011] clearly shows that it is impossible for this to
have occurred.  Instead, the increase in MSAT can be explained in terms of increases in ocean surface
temperatures caused by natural ocean cycles.  The PDO is now in its negative phase.  There has been
no increase in global tropospheric air temperatures since the El Nino event in 1997/8.  The MSAT
record also includes local weather station biases caused by urban heat island effects and climate record
compensation ‘adjustments’ that have been used to create additional warming.  The argument that
global warming has induced more ‘climate extremes’ is also incorrect.

All of the computer models that use the concept of radiative forcing to ‘predict’ global warming are
invalid.  None of the publications that rely on these results should be considered as valid evidence of
global warming.  Unfortunately, the peer review process in climate science has collapsed and many such
studies based on radiative forcing have been published, even in ‘respected’ journals.  These articles
need to be discounted, regardless of the journal in which they were published.  

The fact that CO2 induced global warming is impossible is based on an analysis of the dynamic energy transfer processes involved and the available empirical evidence.  Unfortunately, the global warming
argument has become detached from its foundations in physics and degenerated into a quasi-religious
cult.  Belief in global warming is more important than physical reality.  This concept extends beyond just
global warming into many areas of environmental science and energy policy.  A review of the scientific
evidence does not support any of the claims of CO
2 induced global warming and belief in climate
astrology should not be allowed to intrude into Government policy decisions.  
It is simply impossible for the observed increase in atmospheric CO2 concentration of 70 ppm over the last 50 years to have caused any kind of climate change.  This follows from a straightforward analysis of
the dynamic energy transfer processes involved.  This is discussed in detail in Section 3.0 in the full
comment [Clark, 2011].  The global warming argument starts from the incorrect assumption that long
term averages of dynamic, non-equilibrium climate variables such as surface temperature somehow
form an ‘equilibrium climate state’ that can be analyzed using perturbation theory.  This equilibrium
assumption may be regarded as a failed hypothesis.  However, instead of rejecting this equilibrium
hypothesis and replacing it with climate models based on dynamic energy transfer to simulate real
climate physics, the equilibrium assumption was retained and augmented using empirical ‘radiative forcing’ constants.

It was decreed that a 1 W.m^-2 increase in the downward LWIR flux from CO2 had produced an increase in ‘equilibrium surface temperature’ of 2/3 C (0.67 C).  This ‘equilibrium surface temperature’ was not even the surface temperature defined using incorrect black body equilibrium arguments, but the meteorological surface air temperature (MSAT) measured in an enclosure placed at eye level above the ground.  A 1 W.m^-2 increase in blackbody flux at 288 K requires a temperature rise of only 0.18 C.  The other 0.49 C had to be created using a ‘water vapor feedback’ mechanism to mysteriously amplify the effect of CO2.  An elaborate façade of ‘radiative forcing constants’ was created for other greenhouse gases
using the ‘radiative forcing constant’ for CO2 as a ‘calibration’.  Additional ‘forcing constants’ were
created for various aerosols and other factors that could change the downward atmospheric LWIR flux.  
No physics is required.  The total change in ‘forcing flux’ magically changes the surface temperature to
predict global warming.  This is the underlying basis of the global warming fraud.  To further enhance
the pseudoscience of global warming a second fraudulent claim was added.  Global warming would now
lead to an increase in ‘climate extremes’.  Every record temperature, forest fire, flood, glacier retreat, ice
melt, hurricane and other disaster, limited only by the imagination of those creating this propaganda,
could be blamed on CO2 induced global warming.  All of this is just plain fraud.  However, this approach
was very effective at generating research funds so many researchers jumped on the funding
bandwagon and the whole peer review process collapsed.  CO2 had to produce global warming because
my research funds required it to do so.  Global warming became the best science that politics could
buy.  Sadly, this is only just beginning to change.  In this Appendix, some of the information on the global
warming fraud that has recently become available will be briefly reviewed and references will be
provided to more detailed information.  The physical reality, that CO2 cannot cause any kind of climate change, has already been established.
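The 0.18 C figure used earlier in this Appendix follows from linearizing the Stefan-Boltzmann law F = σT^4, which gives dT ≈ dF / (4σT^3).  A minimal Python check:

```python
SIGMA = 5.67e-8  # Stefan-Boltzmann constant, W.m^-2.K^-4

def blackbody_dt_for_flux_increase(d_flux_w_m2, temperature_k):
    # Linearize F = sigma*T^4: dF = 4*sigma*T^3*dT, so dT = dF/(4*sigma*T^3)
    return d_flux_w_m2 / (4.0 * SIGMA * temperature_k ** 3)

# A 1 W.m^-2 increase at 288 K:
print(round(blackbody_dt_for_flux_increase(1.0, 288.0), 2))  # 0.18 C
```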

In November of 2009, a large archive of e-mails and other files from the Climatic Research Unit of the University of East Anglia was released on the Internet.  This revealed to many people outside of
the close knit climate community that there had been an ongoing fraud for many years to promote the
global warming agenda and prevent the publication of material that did not support the prevailing global
warming dogma.  Climate science had become detached from its foundation in physical science and
degenerated into a quasi-religious cult.  Belief in global warming was a prerequisite for funding in climate
science.  The release of this climate archive became known as ‘Climategate’.  The information provided
has been analyzed in detail by several authors [Monckton, 2009; Montford 2010; Mosher & Fuller,
2010].  The actual archive is available at E. Anglia Confirmed [2011].  The following example is an e-mail
from Kevin Trenberth (bold emphasis added):

From: Kevin Trenberth
To: Michael Mann
Subject: Re: BBC U-turn on climate
Date: Mon, 12 Oct 2009 08:57:37 -0600
Cc: Stephen H Schneider <>, Myles Allen <>, peter stott <>,
"Philip D. Jones" <>, Benjamin Santer <>, Tom Wigley <>,
Thomas R Karl <>, Gavin Schmidt <>, James Hansen
<>, Michael Oppenheimer

Hi all
Well I have my own article on where the heck is global warming? We are asking that here in Boulder where we have
broken records the past two days for the coldest days on record. We had 4 inches of snow. The high the last 2 days was
below 30F and the normal is 69F, and it smashed the previous records for these days by 10F. The low was about 18F
and also a record low, well below the previous record low. This is January weather (see the Rockies baseball playoff
game was canceled on saturday and then played last night in below freezing weather). Trenberth, K. E., 2009: An
imperative for climate change planning: tracking Earth's global energy. Current Opinion in Environmental Sustainability,
1, 19-27, doi:10.1016/j.cosust.2009.06.001. [1][PDF] (A PDF of the published version can be obtained from the author.)
The fact is that we can't account for the lack of warming at the moment and it is a travesty that we can't. The CERES
data published in the August BAMS 09 supplement on 2008 shows there should be even more warming: but the data are
surely wrong. Our observing system is inadequate. That said there is a LOT of nonsense about the PDO. People like
CPC are tracking PDO on a monthly basis but it is highly correlated with ENSO. Most of what they are seeing is the
change in ENSO not real PDO. It surely isn't decadal. The PDO is already reversing with the switch to El Nino. The PDO
index became positive in September for first time since Sept 2007. see

Michael Mann wrote:
extremely disappointing to see something like this appear on BBC. its particularly odd, since climate is usually Richard
Black's beat at BBC (and he does a great job). From what I can tell, this guy was formerly a weather person at the Met
Office. We may do something about this on Real Climate, but meanwhile it might be appropriate for the Met Office to have
a say about this, I might ask Richard Black what's up here?

The release of the Climategate archive provided the incentive to look much more closely at the workings
of the IPCC and the climate data on which the IPCC reports were based.  This has revealed a pattern of
systematic fraud and distortion.  Some of these include ‘Glaciergate’, ‘Kiwigate’ and ‘Amazongate’
[McLean 2010a; ICSC 2010; Eschenbach, 2010b]. The first involved unfounded claims of melting of the
Himalayan glaciers and related disasters that were based on magazine articles that had never been
peer reviewed, contrary to IPCC claims that they referenced only peer reviewed articles.  The
second involved ‘adjustments’ to the climate record of New Zealand that were just plain fraudulent.  The
third involved unsubstantiated claims of droughts in the Amazon basin.  

It must also be emphasized that the IPCC is a political body, not a scientific one [McLean, 2010a;
Cheetham 2009].  The IPCC was formed in 1988 with the purpose of assessing “the scientific, technical
and socioeconomic information relevant for the understanding of the risk of human-induced climate
change.”  Its main goal is based on the assumption of “human-induced climate change” – there was
never an attempt to evaluate the scientific evidence of the cause.  The IPCC reports are edited by a
small number of carefully selected reviewers that are all believers in the global warming religion.  The
predictions of global warming published by the IPCC have no basis in physical reality.  The IPCC climate
models are hard wired using radiative forcing constants to create global warming.  

Various scientific societies, including the Royal Society, the American Physical Society and the American
Chemical Society have published strong statements supporting global warming.  These are based on
little more than the IPCC reports and reflect the vested interests of the influential members of these
societies that wrote and supported these reports.  Attempts by other members of these societies to
change or retract these statements of support have not been very successful.  Professor Hal Lewis, a very senior and respected scientist at the University of California, Santa Barbara, recently resigned from the APS
because of its position on global warming.  His letter of resignation reads in part [Lewis, 2011]:

It is of course, the global warming scam, with the (literally) trillions of dollars driving it, that has corrupted so many
scientists, and has carried APS before it like a rogue wave. It is the greatest and most successful pseudoscientific fraud I
have seen in my long life as a physicist. Anyone who has the faintest doubt that this is so should force himself to read
the ClimateGate documents, which lay it bare. (Montford's book organizes the facts very well.) I don't believe that any real
physicist, nay scientist, can read that stuff without revulsion. I would almost make that revulsion a definition of the word scientist.

So what has the APS, as an organization, done in the face of this challenge? It has accepted the corruption as the norm,
and gone along with it.

In the interim the ClimateGate scandal broke into the news, and the machinations of the principal alarmists were
revealed to the world. It was a fraud on a scale I have never seen, and I lack the words to describe its enormity. Effect on
the APS position: none. None at all. This is not science; other forces are at work.

It should also be noted that the 2007 Nobel Peace Prize was awarded to Al Gore and other IPCC
committee members based on results from totally fraudulent computer simulations.  The US Supreme
Court decision on CO2 pollution was based on the same fraudulent climate model results.  The US
Environmental Protection Agency has already elected to ignore the scientific evidence from the public
comments it received on its CO2 endangerment finding and finalized regulations that have no foundation
in physical reality.  Prof. Richard Lindzen and Prof. John Christy, both respected climate scientists, have recently testified before Congress on global warming [Christy, 2011; Lindzen, 2010].  Their testimony clearly explains many aspects of the global warming fraud.  Selected text from their testimony is given below.

Global Warming: How to approach the science.
Richard S. Lindzen
Program in Atmospheres, Oceans, and Climate
Massachusetts Institute of Technology
Testimony: House Subcommittee on Science and Technology hearing on A Rational Discussion of Climate Change: the
Science, the Evidence, the Response
November 17, 2010

I wish to thank the House Committee on Science and Technology for the opportunity to present my views on the issue of
climate change –or as it was once referred to: global warming. The written testimony is, of course, far more detailed than
my oral summary will be. In the summary, I will simply try to clarify what the debate over climate change is really about. It
most certainly is not about whether climate is changing: it always is. It is not about whether CO2 is increasing: it clearly
is. It is not about whether the increase in CO2, by itself, will lead to some warming: it should. The debate is simply over
the matter of how much warming the increase in CO2 can lead to, and the connection of such warming to the
innumerable claimed catastrophes. The evidence is that the increase in CO2 will lead to very little warming, and that the
connection of this minimal warming (or even significant warming) to the purported catastrophes is also minimal. The
arguments on which the catastrophic claims are made are extremely weak –and commonly acknowledged as such.

In my long experience with the issue of global warming, I’ve come to realize that the vast majority of laymen --including
policymakers –do not actually know what the scientific debate is about. In this testimony, I will try to clarify this. Some of
you may, for example, be surprised to hear that the debate is not about whether it is warming or not or even about
whether man is contributing some portion of whatever is happening. I’ll explain this in this testimony. Unfortunately,
some part of the confusion is explicitly due to members of the scientific community whose role as partisans has
dominated any other role they may be playing.

Here are two statements that are completely agreed on by the IPCC. It is crucial to be aware of their implications.

1. A doubling of CO2, by itself, contributes only about 1C to greenhouse warming. All models project more warming,
because, within models, there are positive feedbacks from water vapor and clouds, and these feedbacks are considered
by the IPCC to be uncertain.

2. If one assumes all warming over the past century is due to anthropogenic greenhouse forcing, then the derived
sensitivity of the climate to a doubling of CO2 is less than 1C. The higher sensitivity of existing models is made
consistent with observed warming by invoking unknown additional negative forcings from aerosols and solar variability
as arbitrary adjustments.

Given the above, the notion that alarming warming is ‘settled science’ should be offensive to any sentient individual,
though to be sure, the above is hardly emphasized by the IPCC.

The usual rationale for alarm comes from models. The notion that models are our only tool, even if it were true, depends
on models being objective and not arbitrarily adjusted (unfortunately unwarranted assumptions).

However, models are hardly our only tool, though they are sometimes useful. Models can show why they get the results
they get. The reasons involve physical processes that can be independently assessed by both observations and basic
theory. This has, in fact, been done, and the results suggest that all models are exaggerating warming.

The details of some such studies will be shown later in this testimony.

Quite apart from the science itself, there are numerous reasons why an intelligent observer should be suspicious of the
presentation of alarm.

1. The claim of ‘incontrovertibility.’

2. Arguing from ‘authority’ in lieu of scientific reasoning and data or even elementary logic.

3. Use of the term ‘global warming’ without either definition or quantification.

4. Identification of complex phenomena with multiple causes with global warming, and even as ‘proof’ of global warming.

5. Conflation of the existence of climate change with anthropogenic climate change.

Some Salient Points:

1. Virtually by definition, nothing in science is ‘incontrovertible’ – especially in a field as primitive and complex as climate. ‘Incontrovertibility’ belongs to religion, where it is referred to as dogma.

2. As noted, ‘authority’ in a primitive and politicized field like climate is of dubious value – it is essential to deal with the science itself. This may present less challenge to the layman than is commonly supposed.

3. ‘Global warming’ refers to an obscure statistical quantity, the globally averaged temperature anomaly, the small residue of far larger and mostly uncorrelated local anomalies. This quantity is highly uncertain, but may be on the order of 0.7°C over the past 150 years. This quantity is always varying at this level, and there have been periods of both warming and cooling on virtually all time scales. On time scales from 1 year to 100 years, there is no need for any externally specified forcing. The climate system is never in equilibrium because, among other things, the ocean transports heat between the surface and the depths. To be sure, however, there are other sources of internal variability as well. Because the quantity we are speaking of is so small, and the error bars are so large, it is easy to abuse in a variety of ways.

Some current problems with science

1. Questionable data. (Climategate and the involvement of all three centers tracking the global average temperature anomaly.) This is a complicated ethical issue for several reasons. Small temperature changes are not abnormal, and even the claimed changes are consistent with low climate sensitivity. However, the public has been misled to believe that whether it is warming or cooling – no matter how little – is of vital importance. Tilting the record slightly is thus of little consequence to the science but of great importance to public perception.

2. More sophisticated data are being analyzed with the aim of supporting rather than testing models (validation rather than testing). That has certainly been my experience during service with both the IPCC and the National Climate Assessment Program. It is also evident in the recent scandal concerning Himalayan glaciers.

(Note that in both cases, we are not dealing with simple measurements, but rather with huge collections of sometimes
dubious measurements that are subject to often subjective analysis –sometimes referred to as ‘massaging.’)

3. Sensitivity is a crucial issue. This refers to how much warming one expects from a given change in CO2 (usually a doubling). It cannot be determined by assuming that one knows the cause of change. If the cause is not what one assumes, it yields infinite sensitivity. This problem infects most attempts to infer climate sensitivity from paleoclimate data.

4. Models cannot be tested by comparing models with models. Attribution cannot be based on the ability or lack thereof
of faulty models to simulate a small portion of the record. Models are simply not basic physics.

All the above and more are, nonetheless, central to the IPCC reports that supposedly are ‘authoritative’ and have been
endorsed by National Academies and numerous professional societies.

Where do we go from here?

Given that this has become a quasi-religious issue, it is hard to tell. However, my personal hope is that we will return to normative science and try to understand how the climate actually behaves. Our present approach of dealing with climate as completely specified by a single number, the globally averaged surface temperature anomaly, that is forced by another single number, the atmospheric CO2 level, clearly limits real understanding; so does the replacement of theory by model simulation. In point of fact, there has been progress along these lines, and none of it demonstrates a prominent role for CO2. It has been possible to account for the cycle of ice ages simply with orbital variations (as was thought to be the case before global warming mania); tests of sensitivity independent of the assumption that warming is due to CO2 (a circular assumption) show sensitivities lower than models show; and the early faint sun paradox, which could not be resolved by greenhouse gases, is readily resolved by clouds acting as negative feedbacks.

Testimony of Dr. John Christy at House Subcommittee on Energy and Power Hearing
Written Statement of John R. Christy
The University of Alabama in Huntsville
Subcommittee on Energy and Power Committee on Energy and Commerce
8 March 2011

I am John R. Christy, Distinguished Professor of Atmospheric Science, Alabama’s State Climatologist and Director of the
Earth System Science Center at The University of Alabama in Huntsville. I have served as a Lead Author and Contributing
Author of IPCC assessments. It is a privilege for me to offer my view of climate change based on my experience as a
climate scientist. My research area might be best described as building climate datasets from scratch to advance our
understanding of what the climate is doing and why. This often involves weeks and months of tedious examination of
paper records and digitization of data for use in computational analysis. I have used traditional surface observations as
well as measurements from balloons and satellites to document the climate story. Many of my datasets are used to test
hypotheses of climate variability and change. In the following I will address six issues that are part of the discussion of
climate change today, some of which will be assisted by the datasets I have built and published.


Recently it has become popular to try to attribute certain extreme events to human causation. The Earth, however, is very large and the weather is very dynamic, especially at local scales, so extreme events of one type or another will occur somewhere on the planet in every year. Since there are innumerable ways to define an extreme event (i.e. record high/low temperatures, number of days of a certain quantity, precipitation over 1, 2, 10 … days, snowfall amounts, etc.), this essentially guarantees numerous “extreme events” in every year. The following assesses some of the recent “extreme events” and the explanations that have been offered as to their cause.

The tragic flooding in the second half of 2010 in NE Australia was examined in two ways: (1) in terms of financial costs and (2) in terms of climate history. First, when one normalizes the flood costs year by year, meaning one imagines that the infrastructure now in place was unchanging during the entire study period, the analysis shows there are no long-term trends in damages. In an update of Crompton and McAneney (2008) on normalized disaster losses in Australia, which includes an estimate for 2010, they show absolutely no trend since 1966. Second, regarding the recent Australian flooding as a physical event in the context of climate history (with the estimated 2010 maximum river height added to the chart below), one sees a relative lull in flooding events after 1900. Only four events reached the moderate category in the past 110 years, while 14 such events were recorded in the 60 years before 1900. Indeed, the recent flood magnitude had been exceeded six times in the last 170 years, twice by almost double the level of flooding observed in 2010. Such history charts indicate that severe flooding is an extreme event that has occurred from natural, unforced variability. There is also a suggestion that emergency releases of water from the Wivenhoe Dam upstream of Brisbane caused “more than 80 per cent of the flood in the Brisbane River. … Without this unprecedented and massive release ... the flooding in Brisbane would have been minimal.” (The Australian, 18 Jan 2011.) (See http://rogerpielkejr.blogspot.com/2011/02/flood-disasters-and-human-caused.html where Roger Pielke Jr. discusses extreme events and supplies some of the information used here.)

England Floods
Svensson et al. 2006 discuss the possibility of detecting trends in river floods, noting that much of the findings relate to “changes in atmospheric circulation patterns” such as the North Atlantic Oscillation (i.e. natural, unforced variability), which affects England. For the Thames River, there has been no trend in floods since records began in 1880 (their Fig. 5), though multi-decadal variability indicates a lull in flooding events from 1965 to 1990. The authors caution that analyzing flooding events that start during this lull will create a false positive trend with respect to the full climate record. Flooding events on the Thames since 1990 are similar to, but generally slightly less than, those experienced prior to 1940. One wonders, if there are no long-term increases in flood events in England, how a single event (Fall 2000) could be pinned on human causation, as in Pall et al. 2011, while previous, similar events obviously could not. Indeed, on a remarkable point of fact, Pall et al. 2011 did not even examine the actual history of flood data in England to understand where the 2000 event might have fit. As best I can tell, this study compared models with models. Indeed, studies that use climate models to make claims about precipitation events might benefit from the study by Stephens et al. 2010, whose title sums up the issue: “The dreary state of precipitation in global models.” In mainland Europe as well, there is a similar lack of increased flooding (Barredo 2009). Looking at a large, global sample, Svensson et al. found: “A recent study of trends in long time series of annual maximum river flows at 195 gauging stations worldwide suggests that the majority of these flow records (70%) do not exhibit any statistically significant trends. Trends in the remaining records are almost evenly split between having a positive and a negative direction.”
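The screening described above (fit a linear trend to each gauging record and ask whether it is statistically significant) can be sketched as follows. The records here are synthetic noise, not actual river-flow data, and the two-standard-error cutoff is only a common approximation to a 5% significance test.

```python
import numpy as np

rng = np.random.default_rng(0)

def significant_trend(series):
    """Rough OLS slope test: flag the record if |slope| > 2 standard errors."""
    n = len(series)
    x = np.arange(n)
    slope, intercept = np.polyfit(x, series, 1)
    resid = series - (slope * x + intercept)
    s2 = resid @ resid / (n - 2)                      # residual variance
    se = np.sqrt(s2 / ((x - x.mean()) @ (x - x.mean())))
    return abs(slope) > 2 * se

# Synthetic stand-ins for 195 annual-maximum flow records (pure noise, no trend).
records = [rng.normal(100, 15, size=60) for _ in range(195)]
n_flagged = sum(significant_trend(r) for r in records)
print(f"{n_flagged} of 195 trendless records flagged at the ~5% level")
```

Note that even with no trend at all, roughly 5% of records will be flagged by chance, which is the baseline against which the 30% figure above has to be judged.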

Russia and Pakistan
An unusual weather situation developed in the summer of 2010 in which Russia experienced a very long stretch of high
temperatures while a basin in Pakistan was inundated with flooding rains. NOAA examined the weather pattern and
issued this statement indicating this extreme event was a part of the natural cycle of variability (i.e. natural, unforced
variability) and unrelated to greenhouse gas forcing. "...greenhouse gas forcing fails to explain the 2010 heat wave over
western Russia. The natural process of atmospheric blocking, and the climate impacts induced by such blocking, are
the principal cause for this heat wave. It is not known whether, or to what extent, greenhouse gas emissions may affect
the frequency or intensity of blocking during summer. It is important to note that observations reveal no trend in a daily
frequency of July blocking over the period since 1948, nor is there an appreciable trend in the absolute values of upper
tropospheric summertime heights over western Russia for the period since 1900. The indications are that the current
blocking event is intrinsic to the natural variability of summer climate in this region, a region which has a climatological
vulnerability to blocking and associated heat waves (e.g., 1960, 1972, 1988)."

Snowfall in the United States
Snowfall in the eastern US reached record levels in 2009-10 and 2010-11 in some locations. NOAA’s Climate Scene
Investigators committee issued the following statement regarding this, indicating again that natural, unforced variability
explains the events. Specifically, they wanted to know if human-induced global warming could have caused the
snowstorms due to the fact that a warmer atmosphere holds more water vapor. The CSI Team’s analysis indicates that’s not likely. They found no evidence — no human “fingerprints” — to implicate our involvement in the snowstorms. If global warming were the culprit, the team would have expected to find a gradual increase in heavy snowstorms in the mid-Atlantic region as temperatures rose during the past century. But historical analysis revealed no such increase in snowfall.  In some of my own studies I have looked closely at the snowfall records of the Sierra Nevada mountains, which include data not part of the national archive. Long-term trends in snowfall (and thus water resources) in this part of California are essentially zero, indicating no change in this valuable resource for the state (Christy and Hnilo, 2010).

Looking at a long record of weather patterns

A project which seeks to generate consistent and systematic weather maps back to 1871 (the 20th Century Reanalysis Project) has taken a look at the three major indices which are often related to extreme events. Dr. Gil Compo of the University of Colorado, leader of the study, noted to the Wall Street Journal (10 Feb 2011) that “… we were surprised that none of the three major indices of climate variability that we used show a trend of increased circulation going back to 1871.” (The three indices were the Pacific Walker Circulation, the North Atlantic Oscillation and the Pacific-North America Oscillation; Compo et al. 2011.) In other words, there appears to
be no supporting evidence over this period that human factors have influenced the major circulation patterns which drive
the larger-scale extreme events. Again we point to natural, unforced variability as the dominant feature of events that have
transpired in the past 140 years. What this means today should be considered a warning – that the climate system has
always had within itself the capability of causing devastating events and these will certainly continue with or without
human influence. Thus, societies should plan for their infrastructure projects to be able to withstand the worst that we
already know has occurred, and to recognize, in such a dynamical system, that even worse events should be expected.
In other words, the set of the measured extreme events of the small climate history we have, since about 1880, does not
represent the full range of extreme events that the climate system can actually generate. The most recent 130 years is
simply our current era’s small sample of the long history of climate. There will certainly be events in this coming century
that exceed the magnitude of extremes measured in the past 130 years in many locations. To put it another way, a large
percentage of the worst extremes over the period 1880 to 2100 will occur after 2011 simply by statistical probability
without any appeal to human forcing at all. Going further, one would assume that about 10 percent of the record
extremes that occur over a thousand-year period ending in 2100 should occur in the 21st century. Are we prepared to
deal with events even worse than we’ve seen so far? Spending resources on creating resiliency to these sure-to-come
extremes, particularly drought/flood extremes, seems rather prudent to me.
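The statistical claims above (that a large share of the 1880-2100 record extremes lie in the future, and that roughly 10 percent of a thousand-year span’s records should fall in the 21st century) follow from elementary order statistics: for year-to-year variability with no trend, the all-time maximum of n years is equally likely to fall in any year, so the probability it lands in the last k years is k/n. A quick sketch, using invented i.i.d. annual values:

```python
import numpy as np

rng = np.random.default_rng(42)

# Probability that the 1880-2100 record maximum falls after 2011, by chance alone:
n_years = 2100 - 1880 + 1      # 221 years in the window
k_last = 2100 - 2011           # 89 years remaining (2012-2100)
print(k_last / n_years)        # about 0.40: much of the window's record is yet to come

# Monte Carlo check with i.i.d. annual extremes (no forcing of any kind).
trials = 20_000
samples = rng.standard_normal((trials, n_years))
record_in_future = (samples.argmax(axis=1) >= n_years - k_last).mean()
print(record_in_future)        # agrees with k/n to Monte Carlo error

# The 10 percent figure: the record of a 1000-year span ending in 2100 has a
# 100/1000 chance of falling within the 21st century.
print(100 / 1000)
```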

A sample study of why extreme events are poor metrics for global changes

In the examples above, we don’t see alarming increases in extreme events, but we must certainly be ready for more to
come as part of nature’s variability. I want to illustrate how one might use extreme events to conclude (improperly I
believe) that the weather in the USA is becoming less extreme and/or colder. For each of the 50 states, there are records
kept for the extreme high and low temperatures back to the late 19th century. In examining the years in which these
extremes occurred (and depending on how one deals with “repeats” of events) we find about 80 percent of the states
recorded their hottest temperature prior to 1955. And, about 60 percent of the states experienced their record cold
temperatures prior to that date too. One could conclude, if they were so inclined, that the climate of the US is becoming
less extreme because the occurrence of state extremes of hot and cold has diminished dramatically since 1955. Since
100 of anything is a fairly large sample (2 values for each of 50 states), this on the surface seems a reasonable
conclusion. Then, one might look at the more recent record of extremes and learn that no state has achieved a record
high temperature in the last 15 years (though one state has tied theirs.) However, five states have observed their all-time
record low temperature in these past 15 years (plus one tie.) This includes last month’s record low of 31°F below zero in
Oklahoma, breaking their previous record by a rather remarkable 4°F. If one were so inclined, one could conclude that
the weather that people worry about (extreme cold) is getting worse in the US. (Note: this lowering of absolute cold
temperature records is nowhere forecast in climate model projections, nor is a significant drop in the occurrence of
extreme high temperature records.) I am not using these statistics to prove the weather in the US is becoming less
extreme and/or colder. My point is that extreme events are poor metrics to use for detecting climate change. Indeed,
because of their rarity (by definition) using extreme events to bolster a claim about any type of climate change (warming
or cooling) runs the risk of setting up the classic “non-falsifiable hypothesis.” For example, we were told by the IPCC that “milder winter temperatures will decrease heavy snowstorms” (TAR WG2). After the winters of 2009-10 and 2010-11, we are told the opposite by advocates of the IPCC position: “Climate Change Makes Major Snowstorms More Likely.”

The non-falsifiable hypothesis works this way: “whatever happens is consistent with my hypothesis.” In other words, there is no event that would “falsify” the hypothesis. As such, these assertions cannot be considered science or in any way informative, since the hypothesis’ fundamental prediction is “anything may happen.” In the example above, if winters become milder or they become snowier, the hypothesis stands. This is not science. As noted above, there are innumerable types of events that can be defined as extreme events – so for the enterprising individual (unencumbered by the scientific method), weather statistics can supply an almost unlimited set of targets in which to discover a “useful” extreme event. Thus, when such an individual observes an unusual event, it may be tempting to define it as a once-for-all extreme metric to “prove” a point about climate change. This works both ways with extremes. If one were prescient enough to have predicted in 1996 that over the next 15 years five states would break record cold temperatures while zero states would break record high temperatures, as evidence for cooling, would that prove CO2 emissions have no impact on climate? No. Extreme events happen, and their causes are intricately tied to semi-unstable dynamical situations that
can occur out of an environment of natural, unforced variability. Science checks hypotheses (assertions) by testing
specific, falsifiable predictions implied by those hypotheses. The predictions are to be made in a manner that, as much
as possible, is blind to the data against which the prediction is evaluated. It is the testable predictions from hypotheses,
derived from climate model output, that run into trouble. Before going on, the main point here is that extreme events do
not lend themselves as being rigorous metrics for convicting human emissions of being guilty of causing them.


As noted earlier, my main research projects deal with building climate datasets from scratch to document what the
climate has done and to test assertions and hypotheses about climate change.  In 1994, Nature magazine published a
study of mine in which we estimated the underlying rate at which the world was warming by removing the impacts of
volcanoes and El Niños (Christy and McNider 1994). This was important to do because in that particular 15-year period (1979-1993) there were significant volcanic cooling episodes and strong El Niños that obscured what would have been the underlying trend. The result of that study indicated the underlying trend for 1979-1993 was +0.09 °C/decade, which at the time was one third the rate of warming that should have been occurring according to estimates from climate model simulations.

[Figure: update of Christy and McNider 1994. Top curve: monthly global atmospheric temperature anomalies 1979-2010 (TLT). 2nd: (SST) the influence of tropical sea surface temperature variations on the global temperature. 3rd: (TLT-SST) global temperature anomalies without the SST influence. 4th: (VOL) the effect of volcanic cooling on global temperatures (El Chichón 1982 and Mt. Pinatubo 1991). Bottom: (TLT-SST-VOL) the underlying trend once SST and VOL effects are removed. The average underlying trend of TLT-SST-VOL generated from several parametric variations of the criteria used in these experiments was +0.09 °C/decade. Lines are separated by 1 °C.]

I have repeated that study for this testimony with data which now cover 32 years (1979-2010), as shown above. In an interesting result, the new underlying trend remains a modest +0.09 °C/decade for the global tropospheric temperature, which is still only one third of the average rate the climate models project for the current era (+0.26 °C/decade). There is no evidence of acceleration in this trend. This evidence strongly suggests that climate model simulations on average are simply too sensitive to increasing greenhouse gases and thus overstate the warming of the climate system (see below under climate sensitivity). This is an example of a model simulation (i.e. a hypothesis) which provides a “prediction” to test: that “prediction” being the rate at which the Earth’s atmosphere should be warming in the current era. In this case, the model-average rate of warming fails the test (see next).


Through the years there have been a number of publications which have specifically targeted two aspects of temperature
change in which observations and models can be compared. The results of both comparisons suggest there are
significant problems with the way climate models represent the processes which govern the atmospheric temperature.
In the first aspect of temperature change, we have shown that the pattern of change at the surface does indeed show
warming over land. However, in very detailed analyses of localized areas in the US and Africa we found that this warming
is dominated by increases in nighttime temperatures, with little change in daytime temperatures. This pattern of warming
is a classic signature of surface development (land cover and land use change) by human activities. The facts that (a)
the daytime temperatures do not show significant warming in these studies and (b) the daytime temperature is much
more representative of the deep atmospheric temperature where the warming due to the enhanced greenhouse effect
should be evident, lead us to conclude that much of the surface temperature warming is related to surface development
around the thermometer sites. This type of surface development interacts with complexities of the nighttime boundary
layer which leads to warming not related to greenhouse warming (Christy et al. 2006, 2009, see also Walters et al. 2007,
Pielke, Sr. 2008.)
The second set of studies investigates one of the clearest signatures, or fingerprints, of greenhouse gas warming as depicted in climate models. This signature consists of a region of the tropical upper atmosphere which in models is shown to warm at least twice as fast as the surface rate of warming. We, and others, have tested this specific signature, i.e. this hypothesis, against several observational datasets and conclude that this pervasive result from climate models has not been detected in the real atmosphere. In addition, the global upper atmosphere is also depicted in models to warm at a rate faster than the surface. Again, we did not find this to be true in observations (Klotzbach et al. 2010).
The following are quotes from three recent papers which come to essentially the same conclusion as earlier work published in Christy et al. 2007 and Douglass et al. 2007:

“Table 2 displays the new per decade linear trend calculations [of the difference between global surface and troposphere using the model amplification factor] … over land and ocean. All trends are significant[ly different] at the 95% level.” (Klotzbach et al. 2010.)

“[Our] result is inconsistent with model projections which show that significant amplification of the modeled surface trends occurs in the modeled tropospheric trends.” (Christy et al. 2010.)

“Over the interval 1979-2009, model-projected temperature trends are two to four times larger than observed trends in both the lower and mid-troposphere and the differences are statistically significant at the 99% level.” (McKitrick et al. 2010.)
Again we note that these (and other) studies have taken “predictions” from climate model simulations (model outputs
are simply hypotheses), have tested these predictions against observations, and found significant differences.


One of the most misunderstood and contentious issues in climate science surrounds the notion of climate sensitivity. Climate sensitivity is a basic variable that seeks to quantify the temperature response of the Earth to a particular forcing, for example answering the question: how much warming can be expected if the warming effect of doubling CO2 acts on the planet? The temperature used in this formulation is nearly always the surface temperature, which is a rather poor proxy for the total heat content of the climate system, but that is the convention in use today. In any case, it is fairly well agreed that the surface temperature will rise about 1°C as a modest response to a doubling of atmospheric CO2 if the rest of the component processes of the climate system remain independent of this response. This is where the issue becomes uncertain: the complexity and interrelatedness of the various components of the climate system (e.g. clouds) mean they will not sit by independently while CO2 warms the planet a little, but will get into the act too. The fundamental issue in this debate is whether the net response of these interrelated actors will add to the basic CO2 warming (i.e. positive feedbacks) or subtract from it (i.e. negative feedbacks).
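The arithmetic connecting the no-feedback response to the amplified response is the standard feedback-gain relation ΔT = ΔT0 / (1 − f), where ΔT0 is the no-feedback warming and f is the net feedback fraction; the numbers below simply restate the roughly 1 °C and 3 °C figures quoted in the text.

```python
# No-feedback warming for doubled CO2, as stated in the text (about 1 °C).
dT0 = 1.0

def amplified(dT0, f):
    """Feedback-gain relation: net warming given a net feedback fraction f < 1."""
    return dT0 / (1.0 - f)

# A net positive feedback fraction of about 2/3 turns 1 °C into the
# roughly 3 °C that models project.
print(amplified(dT0, 2.0 / 3.0))   # ≈ 3

# A net negative feedback instead shrinks the response below 1 °C.
print(amplified(dT0, -0.5))        # ≈ 0.67
```

This is why the sign of the net feedback is the crux of the debate: the same 1 °C baseline yields either an amplified or a damped response depending on whether f is positive or negative.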
Since climate models project a temperature rise on the order of 3°C for a doubling of CO2, it is clear that in the models positive feedbacks come into play to increase the temperature over and above the warming effect of CO2 alone, which is only about 1°C. However, given such observational results as noted earlier (i.e. warming rates of models being about three times those of observations), one can hypothesize that there must be negative feedbacks in the real world that counteract the positive feedbacks which dominate model processes.

My colleague at UA Huntsville, Dr. Roy Spencer, has searched tirelessly for a way to calculate climate sensitivity from satellite observations which at the same time would reveal the net response of the feedbacks, which is so uncertain today. NASA and NOAA have placed in orbit some terrific assets to answer questions like this. Unfortunately, the best observations to address this issue are only about 10 years in length, which prevents us from directly calculating the sensitivity to 100 years of increasing CO2. However, the climate sensitivity over shorter periods to natural, unforced variability can be assessed, and this is what Dr. Spencer has done. To put it simply, Spencer tracks large global temperature changes over periods of several weeks. It turns out the global temperature rises and falls by many tenths of a degree over such periods. Spencer is able to measure the amount of heat that accumulates in (or departs from) the climate system as the temperature rises (or falls). When all of the math is done, he finds the real climate system is dominated by negative feedbacks (probably related to cloud variations) that work against changes in temperature once a temperature change has occurred. When this same analysis is applied to climate model output (i.e. an apples-to-apples comparison), the result is very different, with all models showing positive feedbacks, i.e. helping a warming impulse to warm the atmosphere even more (see figure below). Thus, the observations and models are again inconsistent. On this time scale in which feedbacks can be assessed, Spencer sees a significant difference between the way the real Earth processes heat and the way models do. This difference is very likely found in the way models treat cloudiness, precipitation and/or heat deposition into the ocean. This appears to offer a strong clue as to why climate models tend to overstate the warming rate of the global atmosphere.

[Figure: climate feedback parameter from observations (blue, top line) and IPCC AR4 model simulations (other lines), derived from results in Spencer and Braswell 2010. Model parameters cluster in a grouping that indicates considerably more sensitivity to forcing than indicated by observations.]

The bottom line of this ongoing research is that over time periods for which we are able to determine climate sensitivity, the evidence suggests that all models are characterized by feedback processes that are more positive than the feedback processes measured in nature.
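The kind of analysis described here (regress short-term changes in outgoing radiation against short-term temperature changes to recover a feedback parameter) can be caricatured on synthetic data. The feedback value, sample length, and noise levels below are invented for illustration, not Spencer and Braswell’s published values.

```python
import numpy as np

rng = np.random.default_rng(7)

true_lambda = 3.0   # W/m^2 per °C: assumed net feedback parameter (illustrative)
n = 520             # roughly 10 years of weekly samples

# Synthetic short-term global temperature anomalies (many tenths of a degree).
dT = rng.normal(0.0, 0.3, n)
# Outgoing radiation anomalies respond to temperature through the feedback
# parameter, plus radiative noise (e.g. cloud fluctuations).
dR = true_lambda * dT + rng.normal(0.0, 0.4, n)

# Estimate the feedback parameter by least-squares regression of dR on dT.
est_lambda = np.polyfit(dT, dR, 1)[0]
print(f"estimated feedback parameter: {est_lambda:.2f} W/m^2/°C")
```

In this convention a larger recovered parameter means more radiation is shed per degree of warming, i.e. a more strongly stabilizing, less sensitive system; the comparison in the figure is between the parameter recovered from observations and the same quantity recovered from model output.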


The term “consensus science” will often be appealed to in arguments about climate change. This is a form of “argument from authority.” Consensus, however, is a political notion, not a scientific notion. As I testified to the Inter-Academy Council last June, the IPCC and other similar assessments do not represent for me a consensus of much more than the consensus of those who already agree with a particular consensus. The content of these reports is actually under the control of a relatively small number of individuals – I often refer to them as the “climate establishment” – who through the years, in my opinion, came to act as gatekeepers of scientific opinion and information, rather than brokers. The voices of those of us who object to various statements and emphases in these assessments are by and large dismissed rather than acknowledged.

I’ve often stated that climate science is a “murky science.” We do not have laboratory methods of testing our hypotheses as many other sciences do. As a result, opinion, arguments from authority, dramatic press releases, and notions of consensus tend to pass for science in our field when they should not.

I noticed the House has passed an amendment to de-fund the Intergovernmental Panel on Climate Change (IPCC). I have a proposal here. If the IPCC activity is ultimately funded by US taxpayers, then I propose that ten percent of the funds be allocated to a group of well-credentialed scientists, with help from individuals experienced in creating verifiable reports, to produce an assessment that expresses alternative hypotheses that have been (in their view) marginalized, misrepresented or minimized in previous IPCC reports. We know from Climategate emails and many other sources of
information that the IPCC has had problems with those who take different positions on climate change. Topics to be
addresses in this assessment, for example, would include (a) evidence for a low climate sensitivity to increasing
greenhouse gases, (b) the role and importance of natural, unforced variability, (c) a rigorous evaluation of climate model
output, (d) a thorough discussion of uncertainty, (e) a focus on metrics that most directly relate to the rate of accumulation
of heat in the climate system (which, for example, the problematic surface temperature record does not represent), (f)
analysis of the many consequences, including benefits, that result from CO2 increases, and (g) the importance that
accessible energy has to human health and welfare. What this proposal seeks to accomplish is to provide to the
congress and other policymakers a parallel, scientifically-based assessment regarding the state of climate science
which addresses issues that have heretofore been un- or under-represented by previous taxpayer-funded,
government-directed climate reports.


The evidence above suggests that climate models overestimate the response of temperature to greenhouse gas
increases. Even so, using these climate model simulations we calculate that the impact of legislative actions being
considered on the global temperature is essentially imperceptible. These actions will not result in a measurable climate
effect that can be attributed or predicted with any level of confidence, especially at the regional level. When I testified
before the Energy and Commerce Oversight and Investigations subcommittee in 2006, I provided information on an
imaginary world in which 1,000 1.4 GW nuclear power plants would be built and operated by 2020. This, of course, will
not happen. Even so, this Herculean effort would result in at most a 10 percent reduction in global CO2 emissions, and
thus exert a tiny impact on whatever the climate is going to do. Indeed, with these most recent estimates of climate
sensitivity, the impact of these emission control measures will be even tinier since the climate system doesn’t seem to
be very sensitive to CO2 emissions. (Note: we have not considered the many positive benefits of higher concentrations of
CO2 in the atmosphere, especially for the biological world, nor the tremendous boost to human health, welfare, and
security provided by affordable, carbon-based energy. As someone who has lived in a developing country, I can assure
the subcommittee that without energy, life is brutal and short.) Coal use, which generates a major portion of CO2
emissions, will continue to rise as indicated by the Energy Information Administration’s chart below. Developing
countries in Asia already burn more than twice the coal that North America does, and that discrepancy will continue to
expand. The fact that our legislative actions will be inconsequential in the grand scheme of things can be seen by noting
that these actions attempt to bend the blue, North American curve, which is already fairly flat, down a little. So, downward
adjustments to North American coal use will have virtually no effect on global CO2 emissions (or the climate), no matter
how sensitive one thinks the climate system might be to the extra CO2 we are putting back into the atmosphere. Thus, if
the country deems it necessary to de-carbonize civilization’s main energy sources, sound and indeed compelling
reasons beyond human-induced climate change need to be offered. Climate change alone is a weak leg on which to
stand for such a massive undertaking. (I'll not address the fact that there is really no demonstrated technology except nuclear
that can replace large portions of the carbon-based energy production.)
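The scale argument in the testimony above (1,000 hypothetical 1.4 GW nuclear plants displacing on the order of ten percent of global CO2 emissions) can be checked with back-of-envelope arithmetic. The sketch below is illustrative only: the capacity factor, the CO2 intensity of the displaced generation, and the global emissions total are round-number assumptions, not figures taken from the testimony.

```python
# Back-of-envelope check of the scale argument: how much global CO2 output
# could 1,000 hypothetical 1.4 GW nuclear plants displace per year?
# All values other than N_PLANTS and POWER_GW are assumed round numbers.

N_PLANTS = 1000          # hypothetical plants from the testimony
POWER_GW = 1.4           # nameplate capacity per plant, GW
CAPACITY_FACTOR = 0.9    # assumed: typical for nuclear baseload
HOURS_PER_YEAR = 8760

# Annual carbon-free generation, TWh (GW * hours / 1000 = TWh)
generation_twh = N_PLANTS * POWER_GW * CAPACITY_FACTOR * HOURS_PER_YEAR / 1000

# Assumed CO2 intensity of the displaced generation mix, tonnes per MWh.
# A coal-heavy grid would be nearer 0.9; a mixed grid nearer 0.4.
DISPLACED_T_PER_MWH = 0.4

# Avoided emissions in Gt CO2 per year (TWh -> MWh -> tonnes -> Gt)
avoided_gt = generation_twh * 1e6 * DISPLACED_T_PER_MWH / 1e9

GLOBAL_EMISSIONS_GT = 35.0   # assumed round figure for annual global CO2

fraction = avoided_gt / GLOBAL_EMISSIONS_GT
print(f"Generation: {generation_twh:,.0f} TWh/yr")
print(f"Avoided: {avoided_gt:.1f} Gt CO2/yr ({fraction:.0%} of global)")
```

Under these assumptions the avoided emissions come out in the low tens of percent range at most, i.e. of the same order as the "at most a 10 percent reduction" cited in the testimony; the exact fraction depends strongly on the assumed displaced-grid intensity.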

Thank you for this opportunity to offer my views on climate change.


Barredo, J.I., 2009: Normalized flood losses in Europe: 1970-2006. Nat. Hazards Earth Syst. Sci., 9, 97-104.
Christy, J.R. and J.J. Hnilo, 2010: Changes in snowfall in the southern Sierra Nevada of California since 1916. Energy &
Env., 21, 223-234.
Christy, J.R., W.B. Norris and R.T. McNider, 2009: Surface temperature variations in East Africa and possible causes. J.
Clim. 22, DOI: 10.1175/2008JCLI2726.1.
Christy, J. R., W. B. Norris, R. W. Spencer, and J. J. Hnilo, 2007: Tropospheric temperature change since 1979 from
tropical radiosonde and satellite measurements, J. Geophys. Res., 112, D06102, doi:10.1029/2005JD006881.
Christy, J.R., W.B. Norris, K. Redmond and K. Gallo, 2006: Methodology and results of calculating central California
surface temperature trends: Evidence of human- induced climate change? J. Climate, 19, 548-563.
Christy, J.R. and R.T. McNider, 1994: Satellite greenhouse signal? Nature, 367, 325.
Compo, G.P. et al., 2011: Review article: The Twentieth Century Reanalysis Project. Q. J. R. Meteorol. Soc., 137, 1-28.
Crompton, R. and J. McAneney, 2008: The cost of natural disasters in Australia: the case for disaster risk reduction. Australian J. Emerg. Manag., 23, 43-46.
Douglass, D.H., J.R. Christy, B.D. Pearson and S.F. Singer, 2007: A comparison of tropical temperature trends with model predictions. International J. Climatology, DOI: 10.1002/joc.1651.
Klotzbach, P. J., R. A. Pielke Sr., R. A. Pielke Jr., J. R. Christy, and R. T. McNider (2009), An alternative explanation for
differential temperature trends at the surface and in the lower troposphere, J. Geophys. Res., 114, D21102, doi:10.1029
Pall, P., T. Aina, D. A. Stone, P. A. Stott, T. Nozawa, A. G. J. Hilberts, D. Lohmann and M. R. Allen, 2011: , Nature.
Spencer, R.W. and W.D. Braswell, 2010:
Stephens, G. et al., 2010: The dreary state of precipitation in global models. J. Geophys. Res., 115, doi:10.1029/2010JD014532.
Svensson, C., J. Hannaford, Z. Kundzewicz and T. Marsh, 2006: Trends in river floods: why is there no clear signal in the observations? In: Frontiers in Flood Research (eds. Tchiguirinskaia, I., Thein, K. and Hubert, P.), International Association of Hydrological Sciences, International Hydrological Programme, Publ. 305, 1-18.
Walters, J.T., R.T. McNider, X. Shi, W.B. Norris and J.R. Christy, 2007: Positive surface temperature feedback in the stable
nocturnal boundary layer. Geophys. Res. Lett. doi:10.1029/2007GL029505.
Agassiz, L. Etudes sur les Glaciers, Neuchatel, Paris, (1840)
Akasofu, S-I., Natural Science 2(11) 1211-1224 (2010), ‘On the recovery from the Little Ice Age’
Alley, R. B., et al, (51 authors), IPCC, Summary for Policymakers, Climate Change 2007: The Physical Science Basis. Contribution of Working Group I to the Fourth Assessment Report of the Intergovernmental Panel on Climate Change, eds: Solomon, S., Qin, D., Manning, M., Chen, Z., Marquis, M., Averyt, K. B., Tignor, M. and Miller, H. L., Cambridge University Press, Cambridge, United Kingdom and New York, NY, USA, 2007
Ambler, D., Extreme weather, extreme claims, SPPI, 2010 [Accessed 5/14/2011]
AmeriFlux, AmeriFlux Network Data [Accessed 5/14/2011]
ARGO Float Data, Argo Profiling CTD Floats, NOAA Pacific Marine
Environmental Laboratory. [Accessed 5/14/11]
ASTM G173 [Accessed 5/14/2011].
Augustin, L. et al, EPICA community members, (56 Authors),
Nature 429 623-628 (2004), ‘Eight glacial
cycles from an Antarctic ice core’
AVISO, Mean Sea level products and images, [Accessed 5/14/2011]
Baldocchi, D. D., Global Change Biology 9 1-14 (2003), ‘Assessing the eddy covariance technique for
evaluating carbon dioxide exchange rates of ecosystems: past, present and future’
Barbante, C. et al, EPICA community members, (84 Authors), 2006,
Nature 444 195-198 (2006), ‘One to
one coupling of glacial climate variability in Greenland and Antarctica’
Cheetham, A., 2011a, Ocean Oscillations,
[Accessed 5/10/11]
Cheetham, A., 2011b, Hansen’s climate model predictions, [Accessed 5/14/11]
Cheetham, A., A history of the global warming scare, SPPI, 2009, [Accessed 5/11/2011]
Christy, J. Congressional Testimony [Accessed 5/18/2011]
Clark, R., 2011, ‘There is no carbon dioxide induced global warming and there can be no increase in sea level above the present long term trend’ !documentDetail;D=FWS-R8-ES-2010-0070-0127 (click on the .pdf button, lower right, to access this file; it is ~100 pages long)
Clark, R., 2010a, ‘CA Climate Change is Caused by the Pacific Decadal Oscillation, Not by Carbon
Dioxide’, SPPI Sept 16 th 2010,
[Accessed 5/10/11]
Clark, R., 2010b,
Energy and Environment 21(4) 171-200 (2010), ‘A null hypothesis for CO2’
Clark, R., 2010c, What surface temperature is your model really predicting?,
eu/pages/posts/what-surface-temperature-is-your-model-really-predicting-190.php  [Accessed
Clark, R., 2010d, Gravity rules over the photons in the greenhouse effect. [Accessed 5/10/2011]
COAPS, Florida State University, Global tropical cyclone activity [Accessed 5/14/2011]  http://www.coaps.
Cryosphere Today, Polar Research, University of  Illinois at Urbana-Champaign http://arctic.atmos.uiuc.
edu/cryosphere/  [Accessed 5/14/2011]
D’Aleo, J., Why the NOAA and NASA proclamations should be ignored, SPPI, 2011 [Accessed 5/14/11]
D’Aleo, J Effects of AMO and PDO on temperatures Intellicast, May 2008.  http://www.intellicast.
com/Community/Content.aspx?a=127 [Accessed 5/14/2011]
D’Aleo, J. ‘Progressive Enhancement of Global Temperature Trends’, Science and Public Policy
Institute, July 2010. [Accessed 5/14/2011]
Dawson, J., The tree ring circus [Accessed 5/14/2010]
E. Anglia Confirmed, The Climategate Archive [Accessed 5/17/2011] http://www.eastangliaemails.
Eschenbach, W., 12/19/2010, Model Charged with Excessive Use of Forcing http://wattsupwiththat.
Eschenbach, W. [2010b] Out in the Ama-Zone, SPPI 4/7/2010 [Accessed 5/17/11].
Fairman, J. G., Jr., U. S. Nair, S. A. Christopher, and T. Mölg (2011), J. Geophys. Res., 116, D03110,
‘Land use change impacts on regional climate over Kilimanjaro’
Follows, M. J.; T. Ito and S. Dutkiewicz,
Ocean Modeling 12 290-301 (2006), ‘On the solution of the
carbonate chemistry system in ocean biogeochemistry models’
Fourier, B. J. B.,
Mem. R. Sci. Inst. 7 527-604 (1827), ‘Memoire sur les temperatures du globe terrestre
et des espaces planetaires’ [Translation available at: ]
Gilbert, W.C.,
Energy and Environment 21(4) 263-276 (2010) ‘The thermodynamic relationship between
surface temperature and water vapor concentration in the troposphere’
Gray, W. M. Gross Errors in the IPCC AR4 Report Regarding past and Future Changes in Tropical
Cyclone Activity - A Nobel Disgrace
html [Accessed 10/26/11]
Gray, V. R., South Pacific sea level: a reassessment, SSPI, Aug 16, 2010 [Accessed 5/18/2011]
Hagos, S. M. and K. H. Cook, Journal of Climate 21(15) 3797-3814 (2008), ‘Ocean warming and late-
twentieth-century Sahel drought and recovery’
Hale, G. M. and Querry, M. R.,
Applied Optics, 12(3) 555-563 (1973), ‘Optical constants of water in the
200 nm to 200 µm region’
Hansen, J. et al, (45 authors), J. Geophys. Research 110 D18104 1-45 (2005), ‘Efficacy of climate forcings’
Hansen, J., (2005b) Nazarenko, L., Ruedy, R., Sato, M., Willis, J., Genio, A. D., Koch, D., Lacis, A., Lo,
K., Menon, S., Novakov, T., Perlwitz, J., Russell, G., Schmidt, G. A. and Tausnev, N.,
Science 308 1431-
1435 (2005) ‘Earth's energy imbalance: confirmation and implications’
Hansen, J., I. Fung, A. Lacis, D. Rind, S. Lebedeff, R. Ruedy, G. Russell, and P. Stone,
J. Geophys. Res.,
93 9341-9364 (1988), ‘Global climate changes as forecast by Goddard Institute for Space Studies three-
dimensional model’
Harvey, K. L.,
Proc. 2nd Annual Lowell Observatory Fall Workshop, Oct 5-7, (1997), Hall, J. C., ed.,
Solar Analogs: Characteristics and Optimum Candidates, ‘The solar activity cycle and sun-as-a-star
variability in the visible and IR’
Held, I. M. and B. J. Soden,
Annual Review Energy and Environ. 25 441-475 (2000), ‘Water vapor
feedback and global warming’
Helliker, B. R. and S. L. Richter,
Nature 454 511-514 (2008), ‘Subtropical to boreal convergence of tree
leaf temperatures’
Houston J. R. and R.G. Dean,
Journal of Coastal Research (2011) Online:  http://www.jcronline.
org/doi/abs/10.2112/JCOASTRES-D-10-00157.1, ‘Sea-level acceleration based on U.S. tide gauges and
extensions of previous global-gauge analyses’ [Accessed 5/10/11]
ICSC, ‘KIWIGATE’ NZ Crown Agency taken to court over temp records, 9/7/2010 [Accessed 5/17/11]
Indermuhle, A., Monnin, E., Stauffer, B. and Stocker, T.F., Geophysical Research Letters 27: 735-738 (2000), ‘Atmospheric CO2 concentration from 60 to 20 kyr BP from the Taylor Dome ice core, Antarctica’
Jenkins, F. A. and H. E. White,
Fundamentals of Optics, McGraw Hill, NY, NY, 4th ed. 1976 Chapter 25
JISAO, PDO Data, [Accessed 5/11/2011]
JMA, ENSO Data, [Accessed
Jones, P. D., New, M., Parker, D. E., Martin, S. and Rigor, I. G.,
Rev. Geophysics 37(2) 173-199 (1999),
‘Surface air temperature and its changes over the past 150 years’
Keeling, Atmospheric CO2 Data [Accessed 5/14/2011]
Kiehl, J.T. and K. E. Trenberth, Bull. Amer. Meteor. Soc., 78(2) 197-208 (1997), ‘Earth's annual global mean energy budget’ (FAQ 1.1, Figure 1, p. 96, IPCC Fourth Assessment Report, 2007)
Knutti, R., Allen, M. R., Friedlingstein, P., Gregory, J. M., Hegerl, G. C., Meehl, G. A., Meinshausen, M., Murphy, J. M., Plattner, G-K., Raper, S. C. B., Stocker, T. F., Stott, P. A., Teng, H. and Wigley, T. M. L., Journal of Climate 21(11) 2651-2663 (2008), ‘A review of uncertainties in global temperature projections over the twenty-first century’
Laframboise, D., The Delinquent Teenager Who Was Mistaken for the World’s Top Climate Expert, Amazon
Lambeck, K.,
Comptes Rendus Geoscience 336 667-689 (2004), ‘Sea level change through the last
glacial cycle: geophysical, glaciological and paleogeographic consequences’
Levitus, S., J. I. Antonov, T. P. Boyer, R. A. Locarnini, H. E. Garcia and A. V. Mishonov, Geophysical Research Letters 36 L07608 1-5 (2009), ‘Global ocean heat content 1955-2008 in light of recently revealed instrumentation problems’ Data available at: gov/OC5/3M_HEAT_CONTENT/ [Accessed 5/14/2011]
Lewis, H., Letter of Resignation from the American Physical Society [Accessed 5/18/2011]
Lindsay, R. W.; J. Zhang, A. Schweiger, M. Steele and H. Stern, Journal of Climate 22(1) 165-176
(2009), ‘Arctic sea ice retreat in 2007 follows thinning trend’
Lindzen, R. S., Congressional Testimony 11/17/2010, ‘Global warming: how to approach the science’
SPPI Reprint 2/28/11, [Accessed 5/19/2011]
Lindzen, R. S. and Y-S. Choi, Geophys Res. Letts. 36 L16705 1-6 (2009), ‘On the determination of
climate feedbacks from ERBE data’
Loehle, C. and J. Huston,
Energy and Environment 19(1) 93-100 (2008), ‘Correction to: A 2000 year
global temperature reconstruction based on non-tree ring proxies’
Manabe, S. and Wetherald, R. T.,
J. Atmos. Sci., 24 241-249 (1967), ‘Thermal equilibrium of the
atmosphere with a given distribution of relative humidity’
McCabe, G. J.; J. L. Betancourt, S. T. Gray, M. A. Palecki and H. G. Hidalgo,
Quaternary International
31-40 (2008), ‘Associations of multi-decadal sea-surface temperature variability with US drought’
McLean, J., 2010a, ‘We have been conned – an independent review of the IPCC’, SPPI 2010 [Accessed
McLean, J., 2010b, GlacierGate highlights IPCC's flaws, SPPI 2010, [Accessed 5/14/2011]  
Milankovitch, M.,
Théorie Mathématique des Phénomenes Thermiques Produits par la Radiation Solaire,
Gauthier-Villars, Paris (1920); Canon of Insolation and the Ice-Age Problem,
Royal Serbian Acad. Sp.
132 (1941), Israel Program Sci. Trans., Jerusalem (1969)
Monckton, C., SPPI, 2009: Climategate: caught green-handed,
org/monckton/climategate.html [Accessed 5/18/2011]
Montford, A. W., The Hockey Stick Illusion, Stacey International, 2010
Morner, N-A.,
21st Century Science and Technology, pp.7-17 (Fall 2010), ‘There is no alarming sea
level rise’
Mosher, S. and T. W. Fuller,
Climategate: The Crutape Letters, Create Space, 2010
U. S. Standard Atmosphere, NASA-TM-X-74335, 1976.
NASA, AMO Data, [Accessed
90N&month=2&beg_trend_year=1880&end_trend_year=2010&submitted=Submit [Accessed 5/11/2011]
NASA Sunspot Cycle, [Accessed 5/11/2011]
NOAA, ENSO [Accessed 5/10/2011]
NOAA, Rainfall Data, [Accessed 5/14/2011]
NOAA, Sunspot Index, [Accessed 5/14/2011].   
NOAA, Tides and Currents, Monthly tide data, [Accessed 5/14/2011]
Oerlemans, J., Science 308 675-677 (2005), ‘Extracting a climate signal from 169 glacier records’
Quayle, R. G., Easterling, D. R., Karl, T. R. and Hughes, P. Y., Bull. Amer. Met. Soc. 72(11) 1718-1723 (1991), ‘Effects of recent thermometer changes in the cooperative station network’
REMSS, [Accessed 5/14/2011]
Rothman, L. S. et al, (30 authors),
J. Quant. Spectrosc. Rad. Trans. 96 139-204 (2005), ‘The HITRAN
2004 molecular spectroscopic database’
Royal Society, Climate change: a summary of the science, September 2010
Schreuder, H. Ten physics facts – setting the record straight [Accessed 5/14/2011]
Seidel, D. J.; M. Free and J. Wang, J. Geophys Res. 110 D090102 1-13 (2005), ‘Diurnal cycle of upper
air temperature estimated from radiosondes’
Solomon, S., Plattner, G-K, Knutti, R. and Freidlingstein, P.,
Proc Natl Acad Sci USA 106(6) 1704-1709
(2009) ‘Irreversible climate change due to carbon dioxide emissions’
Taylor, F. W.,
Elementary Climate Physics, Oxford University Press, Oxford, 2006, chapter 7
Tsonis, A. A., An Introduction to Atmospheric Thermodynamics, 2nd edn., Cambridge University Press,
Cambridge, UK, 2007, p. 127.
Tyndall, J., Proc.
Roy Inst. Jan 23 pp 200-206 (1863), ‘On radiation through the Earth's atmosphere’
Varadi, F., B. Runnegar and M. Ghil, Astrophys. J., 562 620-630 (2003), ‘Successive refinements in long term integrations of planetary orbits’
U. Hawaii, Sea level Center, [Accessed 5/12/2011]
USGS, The San Andreas Fault,, [Accessed 5/14/2011]
VIRGO, SOHO Satellite VIRO Radiometer Data [Accessed 5/11/2011]
Weart, S. R., Physics Today 50(1) 34-40 (1997), ‘The discovery of the risk of global warming’
Wyatt, M. G., S. Kravtsov and A. A. Tsonis, AMO and N. Hemisphere's climate variability (Climate Dynamics, to be published) [Accessed 5/14/2011]
Yu, L., J. Climate,
20(21) 5376-5390 (2007), ‘Global variations in oceanic evaporation (1958-2005): The
role of the changing wind speed’
Yu, L., Jin, X. and Weller R. A.,
OAFlux Project Technical Report (OA-2008-01) Jan 2008, ‘Multidecade
Global Flux Datasets from the Objectively Analyzed Air-sea Fluxes (OAFlux) Project: Latent and
Sensible Heat Fluxes, Ocean Evaporation, and Related Surface Meteorological Variables’ (Available at: )
Dr. Roy Clark
Ventura Photonics
The Dynamic Greenhouse Effect and Climate Averaging Paradox  