Among the issues most commonly discussed are individuality, the rights of the individual, the limits of legitimate government, morality, history, economics, government policy, science, business, education, health care, energy, and man-made global warming evaluations. My posts are aimed at thinking, intelligent individuals, whose comments are very welcome.

"No matter how vast your knowledge or how modest, it is your own mind that has to acquire it." Ayn Rand

15 November 2017

Avalanches of global warming alarmism by Dr. Tim Ball and Tom Harris

UN climate cataclysm predictions have no basis in fact and should not be taken seriously
Throughout the United Nations Climate Change Conference wrapping up in Bonn, Germany this week, the world has been inundated with the usual avalanche of manmade global warming alarmism. The UN expects us to believe that extreme weather, shrinking sea ice, and sea level rise will soon become much worse if we do not quickly phase out our use of fossil fuels that provide over 80% of the world’s energy.
There is essentially nothing to support these alarms, of course. We simply do not have the observational data required to know or understand what has happened over the past century and a half. Meaningful forecasts of future climate conditions are therefore impossible.
Nevertheless, this year’s session has been especially intense, since the meeting is being chaired by the island nation of Fiji, a government that has taken climate change fears to extremes.
COP23 (the 23rd meeting of the Conference of the Parties on climate change) conference president, Fijian Prime Minister Frank Bainimarama, has called for “an absolute dedication to meet the 1.5-degree target.” This is the arbitrary and most stringent goal suggested by the Paris Agreement. In support of Bainimarama’s position, the COP23/Fiji Website repeatedly cites frightening forecasts made by the UN Intergovernmental Panel on Climate Change (IPCC).
One prediction stated: “The IPCC recently reported that temperatures will significantly increase in the Sahel and Southern African regions, rainfall will significantly decrease, and tropical storms will become more frequent and intense, with a projected 20 per cent increase in cyclone activity.” 
To make such dire forecasts, the IPCC relies on computerized models built on data and formulas to represent atmospheric conditions, and reflect the hypothesis that carbon dioxide is the principal factor driving planetary warming and climate change.
However, we still do not have a comprehensive, workable “theory of climate,” and thus do not have valid formulas to properly represent how the atmosphere functions. We also lack data to properly understand what weather was like over most of the planet even in the recent past. Without a good understanding of past weather conditions, we have no way to know the history, or the future, of average weather conditions – what we call the climate.
An important data set used by the computer models cited by the IPCC is the “HadCRUT4” global average temperature history for the past 167 years. This was produced by the Hadley Centre and the University of East Anglia’s Climatic Research Unit, both based in the United Kingdom.
Until the 1960s, HadCRUT4 temperature data were collected using mercury thermometers located at weather stations situated mostly in the United States, Japan, the UK, and eastern Australia. Most of the rest of the planet had very few temperature sensing stations, and none of the Earth’s oceans (which cover 70% of the planet) had more than occasional stations, each separated from the next by thousands of kilometers with no data in between. Temperatures over these vast empty areas were simply “guesstimated.”
Making matters even worse, data collected at weather stations in this sparse grid had, at best, an accuracy of +/-0.5 degrees Celsius (0.9 degrees F), and oftentimes no better than +/-1.0 degree C. Averaging such poor data in an attempt to determine past or future global conditions cannot yield anything meaningful – and certainly nothing accurate or valid enough to use in making critical energy policy decisions.
Modern weather station surface temperature data are now collected using precision thermocouples. But, starting in the 1970s, less and less ground surface temperature data was used for plots such as HadCRUT4. Initially, this was done because governments believed satellite monitoring could take over from most of the ground surface data collection.
However, the satellites did not show the warming that climate activists and computer models had forecast. So, bureaucrats closed many of the colder rural surface temperature sensing stations, while many stations in the vast frigid area of Siberia were closed for economic and other reasons. The net result was that cold temperature data disappeared from more recent records – thereby creating artificial warming trends, the very warming that alarmists predicted, desired and needed for political purposes.
Today, we have virtually no data for approximately 85% of the Earth’s surface. Indeed, there are fewer weather stations in operation now than there were in 1960.
That means HadCRUT4 and other surface temperature computations after about 1980 are meaningless. Combine this with the sensitivity (accuracy) problems in the early data, and with the fact that we have almost no long-term data above Earth’s surface, and the conclusion is unavoidable:
It is not possible to know how or whether Earth’s climate has varied over the past century and a half. The data are therefore useless for input to the computer models that form the basis of the IPCC’s conclusions.
But the lack of adequate surface data is only the start of the problem. The computer models on which the climate scare is based are mathematical constructions that require the input of data above Earth’s surface as well. The models divide the atmosphere into cubes piled on top of each other, ideally with wind, humidity, cloud cover and temperature conditions known for different altitudes. But we currently have even less data above the surface than on it, and there is essentially no historical data at altitude.
Many people think the planet is adequately covered by satellite observations, data that provide almost global 24/7 coverage and are far more accurate than anything determined at weather stations. But the satellites are unable to collect data from the north and south poles, regions that are touted as critical to understanding global warming.
Moreover, space-based temperature data collection did not start until 1979, and 30 years of weather data is required to generate a single data point on a climate graph. The satellite record is far too short to allow us to come to any useful conclusions about climate change.
In fact, there is insufficient data of any kind – temperature, land and sea ice, glaciers, sea level, extreme weather, ocean pH, et cetera – to be able to determine how today’s climate differs from the past, much less predict the future. The IPCC’s climate forecasts have no connection with the real world.
Sherlock Holmes warned that “It is a capital mistake to theorize before one has data. Insensibly one begins to twist facts to suit theories, instead of theories to suit facts.”
Sir Arthur Conan Doyle wrote this famous quote for fiction, of course. But it applies perfectly to today’s global warming debate, especially where the IPCC’s scary conclusions and forecasts are involved. Of course, this will not stop Bainimarama and other conference leaders from citing IPCC “science” in support of their warnings of future climate catastrophe.
We should use these facts to spotlight and embarrass them every time.
___________
Dr. Tim Ball is an environmental consultant and former climatology professor at the University of Winnipeg in Manitoba. Tom Harris is executive director of the Ottawa, Canada-based International Climate Science Coalition.

02 November 2017

Solving the Parallel Plane Black Body Radiator Problem and Why the Consensus Science Is Wrong

I will present the Consensus, Settled Science solution to the parallel plane black body radiator problem and demonstrate that it is wrong.  I will show that it exaggerates the energy of electromagnetic radiation between the two planes by as much as a factor of two as their temperatures approach one another.  In the case in which the two black body radiators have the same temperature, the settled science treatment clearly violates the principal characteristic of a black body cavity, namely its constant energy density, e, given by Stefan’s Law as

e = aT⁴,

where T is the temperature in Kelvin, a is Stefan’s constant, 7.57 × 10⁻¹⁶ J/(m³·K⁴), and e is in Joules per cubic meter.
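
For concreteness, here is a minimal Python sketch of the energy densities Stefan’s Law gives; the temperatures are just sample values:

    # Black body cavity energy density via Stefan's Law, e = a*T^4.
    a = 7.57e-16  # Stefan's constant, J/(m^3 K^4)

    for T in (242.2, 288.0):  # sample temperatures, Kelvin
        e = a * T**4
        print(f"T = {T:5.1f} K  ->  e = {e:.3e} J/m^3")
    # At T = 288 K, about Earth's mean surface temperature, e is roughly 5.2e-6 J/m^3.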

Why does the wrong treatment of thermal radiation matter with respect to the claims of the catastrophic man-made global warming hypothesis?  Because it greatly exaggerates the transport of energy by radiation, it also greatly exaggerates the absorption of that energy by infrared-active gases, commonly called greenhouse gases.  It also causes violations of the Conservation of Energy and implies a much higher heat content in the Earth’s surface and atmosphere than really exists there due to radiation.  These are important errors with very important consequences.

I have already demonstrated the failure of the settled science treatment of thermal radiation from black body radiators in the form of concentric spherical shells in a paper posted on 23 October 2017, entitled Thermal Radiation Basics and Their Violation by the Settled Science of the Catastrophic Man-Made Global Warming Hypothesis, but I want to post this solution with its simpler geometry so that the reason the consensus treatment is wrong will be even more apparent to thinking readers.

Numerous critics of the consensus science on catastrophic man-made global warming have argued that thermodynamics requires energy to flow only from the warmer body to the colder body, while the consensus scientists have countered that thermodynamics applies only to the net flow of energy.  I have long argued that the reason radiant energy flows only from the warmer to the cooler body is that the flow is controlled by an electric field and an energy gradient in that field.  I will offer that proof in this paper.

In a black body cavity, the electromagnetic radiation is in equilibrium with the walls of the cavity at a temperature T.  The energy density e is the mean value of 

½ E·D + ½ H·B,

where E is the impressed electric field, D is the displacement, which differs from E when the medium is polarized, H is the impressed magnetic field, and B is the magnetic induction, which differs from H when the medium is magnetized.  The mean value of the energy density of the electromagnetic field depends on the temperature and is independent of the volume of the cavity.  The radiation pressure on the cavity walls is proportional to the energy density.

Inside a black body cavity radiator at equilibrium at a temperature T, the energy density, or energy per unit volume of the vacuum in the cavity, is constant in accordance with Stefan’s Law.  If one opens a small peephole in the wall of the black body cavity, the energy density just inside that peephole is the energy density of the black body cavity, and that energy density is proportional to the square of the electric field magnitude there.  The Stefan-Boltzmann Law states that the flow rate of energy out of the peephole, when the black body cavity is surrounded by vacuum and an environment at T = 0 K, is given as the power P per unit area by

P = σT⁴.

Note that P = (σ/a)·e and that e in the T = 0 K sink is equal to zero.
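
As a quick consistency check (a sketch using standard values of the constants), the ratio σ/a has units of velocity and works out to c/4, the familiar factor relating the flux through an opening to the energy density behind it:

    # Verify that sigma/a = c/4, so P = (sigma/a)*e is the usual effusion relation.
    sigma = 5.670e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)
    a = 7.57e-16      # Stefan's constant, J/(m^3 K^4)
    c = 2.998e8       # speed of light, m/s

    print(sigma / a)  # ~7.49e7 m/s
    print(c / 4.0)    # ~7.50e7 m/s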

Why might a surface that is not a peephole into a black body cavity nevertheless act like a black body radiator?  It has to be that the energy density very, very close to that surface has the characteristic of the energy density in a black body cavity radiator, namely that

e = aT⁴.

Any flow of energy out of the surface due to its temperature T must be caused by this electromagnetic field energy density at the surface generated by the vibrational motion of electric charges in the material of the surface.  Such flow of energy only occurs to regions with an energy density that is lower.  There is no flow of energy from the inside wall of a black body radiator because the energy density everywhere inside the cavity at equilibrium is equal.  P from the interior walls is everywhere zero.

In fact, while it is commonly claimed that photons inside the cavity are 100% absorbed at the walls and an equal amount of radiant energy is emitted from the absorbing wall, the actual case is that the radiant energy incident upon the walls can be entirely reflected from them.  Planck derived the frequency spectrum of a black body cavity from an assumption of complete reflection from the walls.

Here is the problem of the parallel plane black body radiators diagrammed, where T_C is the cooler temperature and T_H is the warmer temperature:

[Diagram: two parallel black body planes at T_C and T_H; P_CO and P_HO denote the power per unit area radiated from the outer surfaces, P_CI and P_HI from the inner surfaces facing one another]

Let us first consider the case in which each plane is alone, surrounded by an environment of space at T = 0 K.  Each plane has a power input that causes the plane to have its given temperature.  Each plane radiates electromagnetic energy at a rate per unit surface area of

P = σT⁴.

Consequently, if neither plane were in the presence of the other and each plane has a surface area of A on each side, we have

P_C = A·P_CO + A·P_CI = 2AσT_C⁴

and

P_H = A·P_HO + A·P_HI = 2AσT_H⁴,

since in equilibrium the power input is equal to the power output by radiation.

In the consensus viewpoint, shared by many physicists and by almost every climate scientist, the parallel plane black body radiators above are believed to emit photons from every surface of each plane even in the presence of the other plane with a power per unit area of

P_HO = P_HI = σT_H⁴ and P_CO = P_CI = σT_C⁴,

just as they would if they were not near one another and they only cast off photons into a sink at T = 0 K.  Thus, when in one another's presence

P_C = A·P_CO + A·P_CI − A·P_HI = 2AσT_C⁴ − AσT_H⁴

P_H = A·P_HO + A·P_HI − A·P_CI = 2AσT_H⁴ − AσT_C⁴

Note that P_C becomes zero at T_C = 0.8409 T_H, and below that temperature P_C is negative, a cooling power in addition to radiative cooling.  If T_H = 288 K, then T_C = 242.2 K.  This is the effective radiative temperature of the cooler plane to space if the cooler plane is thought of as the atmosphere, the warmer plane as the surface of the Earth, both act like black body radiators, and the atmosphere receives only radiant energy from the surface.  This is in agreement with calculations I have presented in the past and is a result which I believe to be correct under the assumptions, even though in some critical respects this consensus viewpoint is wrong.
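
That crossover is quick to verify numerically; a minimal sketch, with T_H = 288 K as the sample value used above:

    # Setting P_C = 2*A*sigma*T_C**4 - A*sigma*T_H**4 to zero gives T_C = (1/2)**0.25 * T_H.
    T_H = 288.0          # sample warm-plane temperature, K
    ratio = 0.5 ** 0.25
    print(ratio)         # 0.8409...
    print(ratio * T_H)   # 242.2 K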

If these planes were isolated from one another and each plane faced only that T = 0 K vacuum, then one would have

e_H = aT_H⁴ and e_C = aT_C⁴,

because these are black body radiator surfaces.

Unlike the case of concentric spherical shells, there is no divergence or convergence of the photons emitted from either surface.  The relationship of the radiative power P to the energy density due to that electromagnetic radiation is always e = (a/σ)·P as one traverses the distance between the planes.  Consequently, seeing this from the consensus viewpoint,

e = (a/σ)·P_HI + (a/σ)·P_CI = aT_H⁴ + aT_C⁴

anywhere between the two planes, because photons have energy no matter which direction they are traveling.

Now, let us imagine that these planes are very close together and the ends are far away and nearly closed.  Let T_C → T_H; then

e → 2aT_H⁴,

but this space between the planes is now a black body cavity in the limit that T_C → T_H, and we know by Stefan’s Law that

e = aT_H⁴

in this case.  In addition, we have created a black body cavity radiator here, and P inside the cavity is actually zero because the interior is in a state of equilibrium and constant energy density.  It is only P = σT⁴ just outside a peephole facing an environment at T = 0 K.  In the above consensus viewpoint, each plane surface is emitting real photons, but these cannot annihilate the photons of the opposite plane.  There are no negative energy photons.  These respective photon streams simply add to the total energy density.

The consensus treatment of black body thermal radiation doubles the energy density in a black body cavity, in clear violation of the principal characteristic of a black body cavity upon which its treatment must be based.  That treatment greatly increases the energy density between the planes whenever T_C is anywhere near T_H, as is the case for temperatures in the lower troposphere compared to the Earth’s surface temperature.  Consequently, the sum of P_HI and P_CI must be much smaller than it is thought to be in the consensus treatment of this problem or in the similar concentric spherical shell problem.
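
The doubling is easy to exhibit numerically.  This minimal sketch, with sample temperatures, compares the consensus sum between the planes to the cavity value required by Stefan’s Law:

    # Consensus energy density between the planes, a*T_H**4 + a*T_C**4, versus the
    # black body cavity value a*T_H**4 that Stefan's Law requires as T_C -> T_H.
    a = 7.57e-16  # Stefan's constant, J/(m^3 K^4)
    T_H = 288.0   # sample warm-plane temperature, K

    for T_C in (242.2, 270.0, 288.0):
        ratio = (a * T_H**4 + a * T_C**4) / (a * T_H**4)
        print(f"T_C = {T_C:5.1f} K: consensus/cavity = {ratio:.3f}")
    # The ratio reaches 2.000 at T_C = T_H, twice the cavity energy density.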

Let us now examine the correct solution to this parallel plane black body radiator problem.  It is the electromagnetic field between the two planes that governs the flow of electromagnetic energy between them.  Or one can say it is the energy density at each plane surface that drives the exchange of energy between the planes, due to the energy density gradient between the two planes.  The critical and driving parameter here is

Δe = e_H − e_C = aT_H⁴ − aT_C⁴,

where each black body radiator surface maintains its black body radiator requirement that the energy density at the surface is given by Stefan’s Law.

Electromagnetic energy flows from the high energy surface to the low energy surface, as is the case in energy flows generally.

P_HI = (σ/a)·Δe = (σ/a)(aT_H⁴ − aT_C⁴) = σT_H⁴ − σT_C⁴

P_CI = 0,

which is consistent with experimental measurements of the rate of radiant heat flow between two black bodies.  Note that as T_C approaches T_H, P_HI approaches zero, as should be the case inside a black body cavity.  Note also that when T_C = 0 K, P_HI is given by the Stefan-Boltzmann Law

P_HI = σT_H⁴.

Let us recalculate P_H and P_C in this correct formulation of the problem:

P_C = A·P_CO + A·P_CI − A·P_HI = AσT_C⁴ + 0 − [AσT_H⁴ − AσT_C⁴] = 2AσT_C⁴ − AσT_H⁴

P_H = A·P_HO + A·P_HI − A·P_CI = AσT_H⁴ + [AσT_H⁴ − AσT_C⁴] − 0 = 2AσT_H⁴ − AσT_C⁴

And we see that the power inputs to each plane needed to maintain their respective temperatures, as the planes cool themselves by thermal radiation, are unchanged in this correct, energy density or electromagnetic field centered, viewpoint.  Experimentally, the power inputs to the thermally radiating planes at given temperatures are exactly the same in either treatment.  This fact causes the proponents of the consensus viewpoint to believe they are right, but they nonetheless violate the energy density requirements of electromagnetic fields and of black body radiation itself.
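
That agreement of the required power inputs is easy to verify; a minimal sketch with sample temperatures:

    # Power inputs needed to hold the planes at T_H and T_C in both treatments.
    sigma = 5.670e-8         # Stefan-Boltzmann constant, W/(m^2 K^4)
    A = 1.0                  # sample area of each side, m^2
    T_H, T_C = 288.0, 270.0  # sample temperatures, K

    # Consensus: every surface emits sigma*T**4, and facing surfaces absorb each other's stream.
    P_C_consensus = 2*A*sigma*T_C**4 - A*sigma*T_H**4
    P_H_consensus = 2*A*sigma*T_H**4 - A*sigma*T_C**4

    # Energy density viewpoint: only P_HI = sigma*(T_H**4 - T_C**4) flows between the planes.
    P_HI = sigma * (T_H**4 - T_C**4)
    P_C_field = A*sigma*T_C**4 - A*P_HI
    P_H_field = A*sigma*T_H**4 + A*P_HI

    print(P_C_consensus, P_C_field)  # identical, ~213 W
    print(P_H_consensus, P_H_field)  # identical, ~479 W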

Because P_CI = 0, back radiation from a cooler atmosphere to the surface is also zero, not 100% of the top-of-the-atmosphere solar insolation as in the current NASA Earth Energy Budget.  Because P_HI = σT_H⁴ − σT_C⁴, the Earth's surface does not radiate 117% of the top-of-the-atmosphere insolation either.  These radiation flows are hugely exaggerated by NASA and in similar Earth Energy Budgets presented in the UN IPCC reports.  See the NASA Earth Energy Budget below:

[Image: NASA Earth Energy Budget diagram]

This is very important because reducing these two radiant energy flows of infrared photons reduces the effect of infrared-active gases, the so-called greenhouse gases, drastically.  Many fewer photons are actually available to be absorbed or emitted by greenhouse gases than they imagine.  This is a principal error that should cause the global climate computer models to exaggerate the effects of the greenhouse gases, just as they have.

It is not at all surprising that physics adheres to a minimum total energy in the system: it does not generate a superfluous stream of photons from the colder body to the warmer body, and it does not have any more photons flowing to the colder body from the warmer body than necessary.  By means of the electromagnetic field between these two planes, the photon emission of each plane is coupled to and affected by the presence of the other plane.  This is in no way surprising for an electromagnetic field problem.  One needs to remember that photons are creatures of electromagnetic fields.  Opposing streams of photons do not annihilate one another to cancel out energy; they simply add their energies.  Treating them as though one stream has a negative energy and the other a positive energy simply throws the Conservation of Energy out the window.  That is too critical a principle of physics to be discarded.



Extending the Solution to Gray Body Thermal Radiators and Other Real Materials:

Many real materials do not behave like black body radiators of thermal radiation; they radiate less than a black body radiator would.  Why do they radiate less?  Because they do not create as high an electromagnetic field energy density at their surfaces as a black body radiator does.  From Stefan’s Law for a black body radiator, the energy density at the surface is

e = aT⁴,

but for a gray body radiator the energy density at each wavelength λ is

e(λ) = ε·aT⁴.

This means the energy density at any given frequency is a constant fraction 0 < ε < 1 of that of a black body radiator.  In the general case, a material may not behave like either a black body or a gray body radiator.  Then

e(λ) = ε(λ)·aT⁴,

where the fraction of the black body output at wavelength λ is variable.

An isolated material surrounded by vacuum and a T = 0 K environment then has a power per unit area output of

P = ε·σT⁴ for a gray body, where ε is seen to be the emissivity, and

P(λ) = ε(λ)·σT⁴ for a general material, such as carbon dioxide or water vapor.

For our two parallel plates above, if both are gray bodies, then between the plates

Δe = e_H − e_C = ε_H·aT_H⁴ − ε_C·aT_C⁴

P_HI = (σ/a)·Δe = ε_H·σT_H⁴ − ε_C·σT_C⁴

P_CI = 0.

Here we see that the emissivity, which determines the electromagnetic field energy density at the surface, is also playing the role of the absorptivity at the absorbing colder surface.  So, of course, Kirchhoff’s Law of thermal radiation, that the emissivity of a material in thermal equilibrium equals its absorptivity, applies.  There is really nothing at all to prove if one starts with the primary fact and boundary condition that the energy density is the fundamental driver of the thermal radiation of materials.
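
A minimal sketch of the gray body result above, with hypothetical emissivities chosen only for illustration:

    # Gray body version: the surface energy densities carry the emissivities, so
    # P_HI = eps_H*sigma*T_H**4 - eps_C*sigma*T_C**4 and P_CI = 0.
    sigma = 5.670e-8           # Stefan-Boltzmann constant, W/(m^2 K^4)
    T_H, T_C = 288.0, 270.0    # sample temperatures, K
    eps_H, eps_C = 0.95, 0.60  # hypothetical emissivities

    P_HI = eps_H * sigma * T_H**4 - eps_C * sigma * T_C**4
    print(P_HI)  # ~189.8 W/m^2; recovers the black body result when eps_H = eps_C = 1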


Update:  Considerable additions were made on 8 November 2017.
Update:  Additions made on 12 November 2017.
Update:  The extension to gray bodies and other bodies was made on 14 November 2017.



01 November 2017

Politicized sustainability threatens planet and people by Paul Driessen

It drives anti-fossil fuel agendas and threatens wildlife, jobs, and human health and welfare
Sustainability (sustainable development) is one of the hottest trends on college campuses, in the news media, in corporate boardrooms and with regulators. There are three different versions.
Real Sustainability involves thoughtful, caring, responsible, economical stewardship and conservation of land, water, energy, metallic, forest, wildlife and other natural resources. Responsible businesses, families and communities practice this kind of sustainability every day: polluting less, recycling where it makes sense, and using less energy, water and raw materials to manufacture the products we need.
Public Relations Sustainability mostly involves meaningless, superficial, unverifiable, image-enhancing assertions that a company is devoted to renewable fuels, corporate responsibility, environmental justice, reducing its carbon footprint – or sustainability. Its primary goal is garnering favorable press or appeasing radical environmental groups.
Politicized Sustainability is the untenable, even dangerous variety. It relies on ideological assertions and theoretical models as an alternative to actual outside-our-windows reality and evidence. Like “dangerous manmade climate change,” its real purpose is gaining greater agitator and government control over people’s energy use, lives, livelihoods, liberties and living standards. It reflects an abysmal understanding of basic energy, economic, resource extraction, manufacturing and human rights realities.
The most common definition is that “we may meet the needs of current generations” only to the extent that doing so “will not compromise the ability of future generations to meet their needs.”
Among other alleged human wrongdoing, Politicized Sustainability thus reflects the assertion that we are rapidly depleting finite resources. Therefore, we must reduce our current needs and wants in order to save those resources for future generations. At first blush, it sounds logical, and even ethical.
However, under sustainability precepts, we are supposed to predict future technologies – and ensure that today’s resource demands will not compromise the completely unpredictable energy and raw material requirements that those completely unpredictable future technologies will introduce. We are supposed to safeguard the assumed needs of future generations, even if it means ignoring or compromising the undeniable needs of current generations – including the needs, aspirations, health and welfare of the most impoverished, malnourished, disease-ravaged, energy-deprived, politically powerless people on Earth.
For thousands of years, mankind advanced at a snail’s pace. Then, as the modern fossil-fuel industrial era found its footing, progress picked up rapidly, until the speed of change became almost exponential. How today is anyone supposed to predict what might be in store ten, fifty or a hundred years from now?
Moreover, as we moved from flint to copper, to bronze, iron, steel and beyond, we didn’t do so because mankind had exhausted Earth’s supplies of flint, copper, tin and so forth. We did it because we innovated. We invented something better, more efficient, more practical. Each advance required different materials. 
Who today can foresee what future technologies we will have … and what raw materials those future technologies will require? How are we supposed to ensure that future families can meet their needs, if we cannot possibly know what those needs will be?
Why then would we even think of empowering activists and governments to regulate today’s activities – based on wholly unpredictable future technologies, lifestyles, needs and resource demands? Why would we ignore or compromise the pressing needs of current generations, to meet those totally unpredictable future needs?
“Resource depletion” claims also fail to account for new technologies that increase energy and mineral reserves, reduce their costs – or decrease the need for certain raw materials: copper, for instance, because lightweight fiber optic cables made from silica (one of Earth’s most abundant minerals) can carry thousands of times more information than a huge bundle of copper wires that weigh 800 times more.
In 1887, when Wisconsin’s Hearthstone House became the world’s first home lit by hydroelectric power, no one could foresee how electricity would come to dominate, enhance and safeguard our lives in the myriad ways it does today. No one could envision the many ways we generate electricity today.
120 years later, no one predicted tiny cellular phones with superb digital cameras and more computing and networking power than a big 1990 desktop computer. No one expected that we would need so much cadmium, lithium, rare earth metals and other raw materials to manufacture thousands of wind turbines.
No one anticipated that new 4-D seismic, deepwater drilling and hydraulic fracturing technologies would find and produce so much oil and natural gas that today we still have at least a century’s worth of these vital energy resources – which “experts” had just told us we would run out of in only a few more years.
And yet, we are still supposed to predict the future 50 or 100 years from now, safeguard the assumed needs of future generations, and ignore the clear needs of current generations. We are also supposed to presume that today’s essential natural resources have to last forever. In reality, they only have to last long enough for our creative intellects to discover real, actually workable replacements: new deposits, production techniques, raw material substitutes or technologies.
Of course, all of this is irrelevant to Politicized Sustainability dogma. That doctrine focuses on ridding the world of fossil fuels, regardless of any social, economic, environmental or human costs of doing so. And regardless of whether supposed alternatives really are eco-friendly and sustainable.
For example, mandated U.S. ethanol quotas eat up 40% of this nation’s corn, grown on over 36 million acres of cropland, to replace 10% of America’s gasoline. Corn ethanol also requires billions of gallons of water, and vast quantities of pesticides, fertilizers, tractor fuel and natural gas … to produce energy that drives up food prices, damages small engines, gets one-third fewer miles per gallon than gasoline – and during its entire production and use cycle emits just as much carbon dioxide as gasoline.
Imagine replacing 100% of US gasoline with corn ethanol. How would that in any way be sustainable?
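
A back-of-envelope scaling, using only the figures quoted above and ignoring second-order effects, shows why not:

    # Scale the quoted ethanol figures from 10% of gasoline to 100%.
    corn_fraction_now = 0.40  # share of U.S. corn used for ethanol today
    gasoline_replaced = 0.10  # share of gasoline that corn ethanol replaces
    acres_now = 36e6          # cropland acres used today

    scale = 1.0 / gasoline_replaced
    print(corn_fraction_now * scale)  # 4.0 -> four times the entire current corn crop
    print(acres_now * scale / 1e6)    # 360.0 -> 360 million acres
    # If the 10% figure is by volume, ethanol's one-third lower mileage would push
    # the requirement higher still.
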
Mandated, subsidized wind energy requires millions of acres for turbines and ultra-long transmission lines … and billions of tons of concrete, steel, copper, rare earth metals and fiberglass. The turbines’ subsonic noise and light flicker create chronic health problems for susceptible people living near them, and kill millions of birds and bats annually – to produce expensive, intermittent, unreliable electricity that must be backed up by dozens of fossil fuel generators or billions of (nonexistent) land- and resource-intensive battery arrays. 
Meanwhile, American and Canadian companies are cutting down thousands of acres of forests and turning millions of trees into wood pellets that they truck to coastal ports and transport on oil-fueled cargo ships to England. There the pellets are hauled by truck and burned in place of coal, to generate electricity … so that England can meet its renewable fuel targets. How is this sustainable – or “climate friendly”?
Why not just build the fossil fuel power plants … mine for coal and frack for natural gas to fuel them – or build more nuclear power plants – and forget about the ethanol, wind turbines, wood pellets and other pseudo-renewable, pseudo-sustainable false alternatives … until something truly better comes along?
Meanwhile, more than 1.2 billion people still do not have electricity. Another 2 billion have electrical power only sporadically and unpredictably. Hundreds of millions get horribly sick, and five million die every year from lung and intestinal diseases that are due to breathing smoke from open fires … and not having refrigeration, clean water and safe, bacteria-free food.
As Steven Lyazi has noted, these people simply want to take their rightful, God-given places among Earth’s healthy and prosperous people. Instead, they’re being told “that wouldn’t be sustainable.” They’re being told they must be content with a few wind turbines near their villages and little solar panels on their huts – to charge cell phones, pump a little water, power a few light bulbs and operate tiny refrigerators.
Politicized Sustainability is irrational, unjust, inhumane, eco-imperialistic and environmentally destructive. It is especially harmful to the world’s poor. It’s time to rethink and overhaul this insanity.

Paul Driessen is senior policy analyst for the Committee For A Constructive Tomorrow (www.CFACT.org), and author of Eco-Imperialism: Green power - Black death and other books on public policy. 

30 October 2017

EPA endangerment finding endangers USA by Dennis T. Avery

Trump must reverse EPA’s climate change “Endangerment Finding”

In 2009, the Obama Environmental Protection Agency issued an “Endangerment Finding.” It claimed that methane leaks from natural gas production and pipelines, and manmade carbon dioxide emissions from burning fossil fuels, cause dangerous global warming that poses an imminent danger to the health and wellbeing of Americans. However, the Finding was based on computerized climate models that couldn’t even successfully hindcast the weather we’d had over the past century – much less forecast Earth’s climate 100 years into the future. In fact, Earth’s climate has changed frequently, often abruptly.

EPA essentially asserted that the 80% of our energy that comes from coal, oil and natural gas caused all our planet’s recent warming, and that any more warming is a long-term threat. Obama’s team thus bet in 2009 that Earth’s warming from 1976–98 would continue. But it didn’t. Never mind all those recent NOAA and NASA claims that 2016 was our “hottest year” ever. Satellites are our most honest indicator, and they say our planet’s temperature has risen an insignificant 0.02 degrees C (0.04 degrees F) since 1998.
That 20-year non-warming clearly shows that the models are worthless for prediction. But the Federal Appeals Court in Washington nevertheless recently cited methane emissions to block regulatory approval for a new natural gas pipeline. The ruling will encourage radical greens to keep thinking they can regulate gas and oil production and transport into oblivion. Alarmists across the country are already citing the new precedent in other cases, in effect demanding re-hearings on Trump’s entire energy plan.
If the courts decree that pipelines cause dangerous methane emissions, the U.S. will be forced to generate electricity increasingly via the infamous whimsies of wind and sunshine. But the models’ predictions of dangerously rising temperatures have proven wrong. The disparity between the models’ predictions and the thermometer readings is growing wider by the day. We should not base regulations on them.
In science, if your theory doesn’t take account of all the relevant data, you need a new theory.
Meanwhile, thousands of new coal-fired power plants are being built around the world – even in Europe. (Many Third World power plants are being built with Chinese financing.) The CO2 from this new coal-fired power will dwarf whatever emissions the judges hope to prevent in America.
The President now risks losing the economic growth and millions of new jobs that abundant, affordable energy could and should create. Without new pipelines, our “miraculous” fracked gas will be trapped in the semideserts and mountains where the gas is found.
What danger can today’s EPA find in earth’s current 20-year non-warming? What ice-melt will that trigger? What sea level rise? World food production has just set a new record, in large part because higher CO2 levels in the atmosphere act like fertilizer for crop plants (as well as for forests and grasslands).
Justice Neil Gorsuch’s confirmation to the Supreme Court should strongly encourage a Trump Endangerment reversal. Gorsuch stated in a 2016 opinion that the so-called Chevron Precedent is “difficult to square with the Constitution.” Chevron says courts should defer to federal judges on laws that are ambiguous. He believes it shifts too much power from Congress to unelected bureaucrats.
EPA Administrator Scott Pruitt will need to build a strong case for the reversal, however, because the Supreme Court still does not have a reliable 5–4 conservative majority. Pruitt’s current approach of setting up competing red teams vs. blue teams must help convince Justice Kennedy that the world today looks much different from when the EPA rubberstamped the IPCC and its failed climate models.
The science was not settled in 2009; and, fortunately, the weight of evidence has since shifted importantly toward the skeptics. It starts with the still-continuing 20-year non-warming. The best “answer” the alarmists can find is that “extra” CO2 heat is hiding in the deep ocean depths. But cold water is heavier than warm water, so the warm water would have warmed the depths on its way down. NASA’s newer and more-accurate data come from Argo floats that periodically dive to sample water temperatures down to about 6,500 feet (2,000 meters) below the surface. They find no hidden heat.
Moreover, Earth has been warming, erratically but persistently, since 1715. How much of this warming was due to natural cycles, and how much was man-made? Of any manmade portion, how much was due to CO2, and how much to expanding Urban Heat Islands and cutting down forests? Climate realists say CO2 added barely one degree C; alarmists claim it will increase temperatures by up to 12 degrees C!
How did hurricanes Harvey, Irma and Maria destroy so much property with only 0.02 degrees C of warming? Britain’s wooden-ship logbooks from 1700 to 1850 confirm that there were twice as many major landfalling Caribbean hurricanes per decade during the cold Little Ice Age as during the far warmer years from 1950 to 2000. Nor has the post-1998 weather produced more frequent or intense storms, longer droughts, or any of the other climate impacts that Obama’s EPA insisted would happen.
The simple truth is that the Pacific Decadal Oscillation has given the world a climate scare every 25 to 30 years since we got thermometers around 1850 (even though the PDO wasn’t even recognized until 1996). In 1845, the ships of Sir John Franklin’s Arctic expedition were crushed by ice. Six decades later, in 1903–1906, Roald Amundsen sailed through a relatively warm, largely ice-free Northwest Passage. In the 1970s, we were warned urgently of a new Ice Age. And then came the “overheated” Al Gore years, 1976–1998.
The huge Pacific Ocean’s 60-year oscillation raises ocean temperatures – and thus the world’s – by 1 to 2 degrees C (1.8 to 3.6 degrees F) for about 30 years, then shifts back again for another 30 years. Every time it shifted in the past, alarmists extended the latest reading in a straight line for five or 20 years ahead and screamed: “Global Disaster!” This time, the alarmists claim the non-warming isn’t real!
Today, there’s no doubt the models have predicted more than twice as much warming as we’ve observed. Given the high number of official thermometers that are located in urban areas and near airport tarmac, the models may be overpredicting by three-fold!
Another major new scientific finding also goes against the alarmists. Last year CERN (the multi-billion-dollar European Organization for Nuclear Research) told CERN Courier subscribers that all the climate models must be re-done. CERN reported that its CLOUD experiment had used its huge particle accelerator and a giant cloud chamber to demonstrate that the sun and cosmic rays are the real “mystery factors” in Earth’s climate. The research supports the contention that CO2 is only a bit player.
CERN says the sun was weak during the Little Ice Age (indeed, during all the “little ice ages”). This allowed far more cosmic rays to hit our atmosphere. Those extra hits shattered millions more molecules into zillions of tiny “cloud seeds.” Each cloud seed carried an electric charge that attracted other molecules to form clumps – and gave us up to ten times as many low clouds. Earth cooled for centuries under overcast skies, as if under a giant awning. Then the sun became more active, there were fewer cosmic rays, the skies got sunnier, and Earth warmed – for centuries.
History says the Modern Warming is likely to last at least another two centuries. The Medieval Warming (350 years long) was the shortest past warming we can find. But first, CERN says, we will have to go through a Solar Sunspot Minimum that will drop Earth’s temperatures even lower than today for the next 60 years. The Minimums are another recently recognized cycle: up to 200 years long.
How will a century of non-warming possibly endanger Americans? Trump should be eager to take on Obama’s outdated and ill-informed Endangerment Finding.
Dennis Avery is a former U.S. State Department senior analyst and co-author with astrophysicist Fred Singer of Unstoppable Global Warming: Every 1,500 Years.

My note:  The claim that CO2 has caused significant warming has not been proven empirically.  It rests on greatly exaggerating its effects as an infrared-active gas by violating energy conservation and the physics of thermal radiation to create phantom photon streams, on minimizing the many effects of water vapor, and on ignoring the temperature gradient induced by gravity.  The reason the computer climate models are so wrong is mostly that so much bad physics was input into them.  The failures to input cloud cover changes and sunspot cycles just add to the many problems in the computer models.