Pro: Surface Temperature Measurements are Accurate
A new assessment of NASA’s record of global temperatures revealed that the agency’s estimate of Earth’s long-term temperature rise in recent decades is accurate to within less than a tenth of a degree Fahrenheit, providing confidence that past and future research is correctly capturing rising surface temperatures.
Another recent study evaluated NASA Goddard’s Global Surface Temperature Analysis (GISTEMP) in a different way, also adding confidence to its estimate of long-term warming. A paper published in March 2019, led by Joel Susskind of NASA’s Goddard Space Flight Center, compared GISTEMP data with that of the Atmospheric Infrared Sounder (AIRS) onboard NASA’s Aqua satellite.
GISTEMP uses air temperature recorded with thermometers slightly above the ground or sea, while AIRS uses infrared sensing to measure the temperature right at the Earth’s surface (or “skin temperature”) from space. The AIRS record of temperature change since 2003 (which begins when Aqua launched) closely matched the GISTEMP record.
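The kind of cross-check described above can be sketched numerically: fit a least-squares trend to each anomaly series and compare the slopes. The series below are invented placeholders for illustration, not real GISTEMP or AIRS data.

```python
# Hypothetical sketch of comparing two independent temperature records:
# fit an ordinary least-squares trend to each and compare the slopes.
# All numbers below are made up for illustration.

def linear_trend(years, anomalies):
    """Ordinary least-squares slope, in degrees C per year."""
    n = len(years)
    mean_x = sum(years) / n
    mean_y = sum(anomalies) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(years, anomalies))
    den = sum((x - mean_x) ** 2 for x in years)
    return num / den

years = list(range(2003, 2018))
station_like   = [0.60 + 0.020 * (y - 2003) for y in years]  # ground-based record
satellite_like = [0.58 + 0.021 * (y - 2003) for y in years]  # skin-temperature record

print(round(linear_trend(years, station_like), 4))    # 0.02
print(round(linear_trend(years, satellite_like), 4))  # 0.021
```

Two records built from different instruments yielding nearly identical trends is the sense in which the article calls them mutually confirming.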
Comparing two measurements that were similar but recorded in very different ways ensured that they were independent of each other, said Gavin Schmidt, director of NASA’s Goddard Institute for Space Studies. One difference was that AIRS showed more warming in the northernmost latitudes.
“The Arctic is one of the places we already detected was warming the most. The AIRS data suggests that it’s warming even faster than we thought,” said Schmidt, who was also a co-author on the Susskind paper.
Taken together, Schmidt said, the two studies help establish GISTEMP as a reliable index for current and future climate research.
“Each of those is a way in which you can try and provide evidence that what you’re doing is real,” Schmidt said. “We’re testing the robustness of the method itself, the robustness of the assumptions, and of the final result against a totally independent data set.”
https://climate.nasa.gov/news/2876/new-studies-increase-confidence-in-nasas-measure-of-earths-temperature/
Con: Surface Temperature Records are Distorted
The global warming record is made artificially warmer by manufacturing climate data where there isn’t any.
The following quotes are from the peer-reviewed paper A Critical Review of Global Surface Temperature Data, published in the Social Science Research Network (SSRN) by Ross McKitrick, Ph.D., Professor of Economics at the University of Guelph, Guelph, Ontario, Canada.
“There are three main global temperature histories: the United Kingdom’s University of East Anglia Climate Research Unit CRU-Hadley record (HADCRU), the US NASA Goddard’s Global Surface Temperature Analysis (GISTEMP) record, and the US National Oceanic and Atmospheric Administration (NOAA) record. All three global averages depend on the same underlying land data archive, the US Global Historical Climatology Network (GHCN). CRU and GISS supplement it with a small amount of additional data. Because of this reliance on GHCN, its quality deficiencies will constrain the quality of all derived products.”
As you can imagine, there were very few air temperature monitoring stations around the world in 1880. In fact, prior to 1950, the US had by far the most comprehensive set of temperature stations. Europe, Southern Canada, the coast of China, and the coast of Australia had a considerable number of stations prior to 1950, but vast land regions of the world had virtually no air temperature stations. To this day, Antarctica, Greenland, Siberia, the Sahara, the Amazon, Northern Canada, and the Himalayas have extremely sparse, if not virtually non-existent, air temperature stations and records.
“While GHCN v2 has at least some data from most places in the world, continuous coverage for the whole of the 20th century is largely limited to the US, southern Canada, Europe and a few other locations.”
With respect to the oceans, seas, and lakes of the world, covering 71% of the surface area of the globe, there are only inconsistent and poor-quality air temperature and sea surface temperature (SST) data, collected as ships plied mostly established sea lanes. These temperature readings were made at differing times of day, using disparate equipment and methods. Air temperature measurements were taken at inconsistent altitudes above sea level, and SSTs were taken at varying depths. GHCN uses SSTs to extrapolate air temperatures. Scientists must make literally millions of adjustments to calibrate all of these records so that they can be combined into the GHCN data set. These records and adjustments cannot possibly provide the quality of measurements needed to determine an accurate historical record of average global temperature: the potential errors in interpreting the data far exceed the temperature variance being measured.
“Oceanic data are based on sea surface temperature (SST) rather than marine air temperature (MAT). All three global products rely on SST series derived from the International Comprehensive Ocean-Atmosphere Data Set (ICOADS) archive, though the Hadley Centre switched to a real time network source after 1998, which may have caused a jump in that series. ICOADS observations were primarily obtained from ships that voluntarily monitored sea surface temperatures (SST). Prior to the post-war era, coverage of the southern oceans and polar regions was very thin.”
“The shipping data upon which ICOADS relied exclusively until the late 1970s, and continues to use for about 10 percent of its observations, are bedeviled by the fact that two different types of data are mixed together. The older method for measuring SST was to draw a bucket of water from the sea surface to the deck of the ship and insert a thermometer. Different kinds of buckets (wooden or Met Office-issued canvas buckets, for instance) could generate different readings, and were often biased cool relative to the actual temperature (Thompson et al. 2008).”
“Beginning in the 20th century, as wind-propulsion gave way to engines, readings began to come from sensors monitoring the temperature of water drawn into the engine cooling system. These readings typically have a warm bias compared to the actual SST (Thompson et al. 2008). US vessels are believed to have switched to engine intake readings fairly quickly, whereas UK ships retained the bucket approach much longer. More recently some ships have reported temperatures using hull sensors. In addition, changing ship size introduced artificial trends into ICOADS data (Kent et al. 2007).”
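The quotes above describe cool-biased bucket readings being mixed with warm-biased engine-intake readings. A minimal sketch of homogenizing such mixed records, with hypothetical bias magnitudes (these are placeholders, not the values used by ICOADS or any real adjustment procedure), might look like this:

```python
# Illustrative only: remove an assumed per-method bias from mixed SST
# readings so they can be combined. Bias values here are invented
# placeholders, not real ICOADS adjustment figures.

METHOD_BIAS_C = {
    "canvas_bucket": -0.3,  # bucket readings assumed to read cool
    "engine_intake": +0.2,  # intake readings assumed to read warm
    "hull_sensor":    0.0,
}

def adjust_sst(reading_c, method):
    """Estimate the true SST by subtracting the assumed method bias."""
    return reading_c - METHOD_BIAS_C[method]

# Two ships measuring the same water report different numbers,
# but converge once the assumed biases are removed:
print(round(adjust_sst(14.7, "canvas_bucket"), 2))  # 15.0
print(round(adjust_sst(15.2, "engine_intake"), 2))  # 15.0
```

The critique in the text is precisely that the true biases are uncertain, so the adjusted record depends heavily on assumptions like the ones hard-coded here.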
More recently, the set of stations providing measurements to the GHCN has undergone dramatic changes.
“The number of weather stations providing data to GHCN plunged in 1990 and again in 2005. The sample size has fallen by over 75% from its peak in the early 1970s, and is now smaller than at any time since 1919. The collapse in sample size has not been spatially uniform. It has increased the relative fraction of data coming from airports to about 50 percent (up from about 30 percent in the 1970s). It has also reduced the average latitude of source data and removed relatively more high-altitude monitoring sites. GHCN applies adjustments to try and correct for sampling discontinuities. These have tended to increase the warming trend over the 20th century. After 1990 the magnitude of the adjustments (positive and negative) gets implausibly large. CRU has stated that about 98 percent of its input data are from GHCN. GISS also relies on GHCN with some additional US data from the USHCN network, and some additional Antarctic data sources. NOAA relies entirely on the GHCN network.”
To compensate for this tremendous lack of air temperature data, scientists interpolate data from surrounding areas that do have measurements in order to produce a global temperature average. When such interpolation is done, the computed global average temperature increases.
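The general idea behind such gap-filling can be sketched with inverse-distance weighting, one common interpolation scheme (the station positions and anomaly values below are invented for illustration):

```python
# A minimal sketch of gap-filling by inverse-distance weighting (IDW),
# one common way to estimate a value for a grid cell with no stations.
# All positions and anomalies are invented.

def idw(target, stations, power=2):
    """Inverse-distance-weighted estimate at `target` from (pos, value) pairs."""
    num = den = 0.0
    for (x, y), value in stations:
        d2 = (x - target[0]) ** 2 + (y - target[1]) ** 2
        if d2 == 0:
            return value  # exactly at a station: use its reading
        w = 1.0 / d2 ** (power / 2)
        num += w * value
        den += w
    return num / den

# Two distant stations, both reporting warm anomalies (degrees C):
stations = [((0.0, 0.0), 1.2), ((10.0, 0.0), 0.8)]

# The empty cell halfway between them inherits a warm value,
# even though nothing was ever measured there:
print(round(idw((5.0, 0.0), stations), 2))  # 1.0
```

Any bias in the source stations is carried directly into every cell estimated from them, which is the mechanism the next paragraphs object to.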
NASA’s Goddard Institute for Space Studies (GISS) is the world’s authority on climate change data. Yet much of its warming signal is manufactured by statistical methods visible on its own website, which illustrate how data smoothing creates a warming signal where there is no temperature data at all.
When station data is used to extrapolate over distance, any errors in the source data are magnified and spread over a large area. For example, Africa has very little climate data. Suppose the nearest active station to the center of the African savannah is 400 miles (644 km) away, at an airport in a city. To cover the area without data, that city’s temperature record is extrapolated across the savannah. Doing so spreads the city’s urban heat island effect over a wide area of the savannah through the interpolation process, and in turn raises the global temperature average.
As an illustration, NASA GISS published a July 2019 temperature map with a 250 km “smoothing radius” and another with a 1200 km “smoothing radius.” The first map does not extrapolate temperature data over the savannah (where no real data exists) and yields a global temperature anomaly of 0.88 °C. The second, which extends estimates over the savannah, yields a warmer global anomaly of 0.92 °C.
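A toy version of that arithmetic shows how filling empty cells with a nearby warm value shifts the global mean upward. The anomaly values below are invented and do not reproduce the actual GISS maps:

```python
# Toy illustration only: how including extrapolated cells can raise a
# global mean anomaly. All values are invented, not real GISS data.

covered = [0.9, 0.8, 1.0, 0.9]  # cells with real station data (degrees C)
mean_covered = sum(covered) / len(covered)
print(round(mean_covered, 3))  # 0.9

# With a larger smoothing radius, two previously empty cells are filled
# with a nearby city's warm anomaly (1.1 C) instead of being left out:
filled = covered + [1.1, 1.1]
mean_filled = sum(filled) / len(filled)
print(round(mean_filled, 3))  # 0.967
```

Nothing new was measured between the two averages; only the treatment of empty cells changed, which is the point of the 250 km versus 1200 km comparison.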
This kind of statistically induced warming is not real.
- Pat Frank, “Systematic Error in Climate Measurements: The surface air temperature record,” April 19, 2016. https://wattsupwiththat.com/2016/04/19/systematic-error-in-climate-measurements-the-surface-air-temperature-record/
- Ross McKitrick, “A Critical Review of Global Surface Temperature Data,” Social Science Research Network (SSRN). https://papers.ssrn.com/sol3/papers.cfm?abstract_id=1653928
- NASA GISS Surface Temperature Analysis (v4), Global Maps. https://data.giss.nasa.gov/gistemp/maps/index_v4.html