Dennis T. Avery
Aside from protests by Al Gore, Leonardo DiCaprio and friends, the public didn't seem to raise its CO2 anguish much above its Russia-election frenzy when Trump exited the Paris Climate Accord.
Statistician Bjorn Lomborg had already pointed out that the Paris CO2 emission promises would cost $100 trillion that no one has, and make only a 0.05 degree C difference in Earth's temperature in 2100. Others say perhaps a 0.2 degree C (about 0.4 degree F) difference, and even that would hold only in the highly unlikely event that all parties actually kept their voluntary pledges.
What few realize, however, is that during the last Ice Age, too little CO2 in the air almost eradicated mankind. That's when much-colder oceans (whose levels stood some 400 feet lower than today's) sucked most of the carbon dioxide out of the air; half of North America, Europe and Asia lay buried under mile-high glaciers that obliterated everything in their paths; and bitterly cold temperatures further retarded plant growth.
In fact, Earth's atmosphere held only about 180 parts per million (ppm) of CO2, compared to today's 400 ppm: 0.018% then versus 0.040% now.
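For readers checking the arithmetic, the conversion is straightforward: one ppm is one part in a million, so

$$180\ \text{ppm} = \frac{180}{10^{6}} \times 100\% = 0.018\%, \qquad 400\ \text{ppm} = \frac{400}{10^{6}} \times 100\% = 0.040\%.$$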
The Ice Age's combined horrors – intense cold, permanent drought and CO2 starvation – killed most of the plants on Earth. Only a few trees survived, in the mildest climates. Much of the planet's grassland turned to tundra, which is far less nourishing to the herbivores prehistoric humans depended on for food and fur. Recent Cambridge University studies conclude that only about 100,000 humans were left alive worldwide when the current interglacial warming mercifully began.
The few surviving prey animals had to keep migrating to find enough food, and that forced our ancestors to migrate with them, in temperatures that routinely fell to 40 degrees below zero (both Fahrenheit and Celsius). The Neanderthals had been living in relatively warm caves, protected from predators by fires at the cave mouths, and had hunted their prey by sneaking through trees that no longer existed. They apparently couldn't adapt, and starved; Cambridge found no evidence of genocidal warfare.
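(That "both Fahrenheit and Celsius" aside is not a slip: minus 40 is the one temperature at which the two scales coincide, as the standard conversion shows.)

$$F = \tfrac{9}{5}C + 32, \quad F = C \;\Rightarrow\; C = \tfrac{9}{5}C + 32 \;\Rightarrow\; -\tfrac{4}{5}C = 32 \;\Rightarrow\; C = F = -40.$$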
The most successful human survivors – who provided most of the DNA for modern Europeans – were the Gravettians, nomads from the Black Sea region. Having never had trees, they invented mammoth-skin tents, held up by salvaged mammoth ribs. They also developed spear-throwers, to kill the huge beasts from a safe distance.
Equally important, the Gravettians domesticated and bred wolves, to protect their tents from marauders, locate game animals on the broad tundra, and harry the prey into defensive clusters for easier killing. The scarcity of food at the Glacial Maximum made the bones and bone marrow at the human camps all the more attractive to the dogs.
When that Ice Age ended, moreover, CO2 changes didn't lead the warming: atmospheric CO2 began to recover only about 800 years after the warming started.
Carbon dioxide truly is “the gas of life.” The plants that feed us and wildlife can’t live without inhaling CO2, and then they exhale the oxygen that lets humans and animals keep breathing.
Our crop plants evolved about 400 million years ago, when CO2 in the atmosphere was about 5,000 parts per million! Our evergreen trees and shrubs evolved about 360 million years ago, with CO2 levels at about 4,000 ppm. When our deciduous trees evolved about 160 million years ago, the CO2 level was about 2,200 ppm – still five times the current level.
There's little danger to humans from too much CO2 in the air they breathe. Even the Environmental Protection Agency says 1,000 ppm is the safe limit for lifetime human exposure. Space shuttle CO2 alarms are set at 5,000 ppm, and the alarm in nuclear submarines is set at 8,000 ppm!
If there's little danger of humans having too much CO2 in their air, and a real danger to civilization from having too little, what's the ideal level of atmospheric CO2? The answer: there's a broad safe range, with far more risk from too little than from too much. At very low levels, with few or no plants, there'd be no people or animals, let alone civilization.
Human numbers, moreover, expanded strongly during the Holocene Optimum, when temperatures were as much as 4 degrees C higher than today's! Even now, residents of the tropics keep demonstrating that humans can tolerate much higher temperatures than most of us experience. (As the new malaria vaccine comes into use, the tropics should prosper even more.) And far more people die from "too cold" than from "too warm."
Crops continue to produce record yields amid our "unprecedented" warming – and the extra CO2 in the air is credited with as much as 15% of that yield gain!
The question is not whether more CO2 in the air raises Earth's temperature; we know it does, by some small but still hotly debated amount. Both sides agree that a doubling of the CO2 in the air – by itself – would raise Earth's temperature by only about 1 degree C.
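That roughly 1 degree figure can be checked with a back-of-the-envelope calculation, using the widely cited logarithmic forcing approximation of Myhre et al. (1998) and a no-feedback climate response of roughly 0.3 degrees C per W/m² (both standard textbook values, not figures taken from this article):

$$\Delta F = 5.35 \ln\!\left(\frac{C}{C_0}\right)\ \text{W/m}^2 \;\Rightarrow\; \Delta F_{2\times} = 5.35 \ln 2 \approx 3.7\ \text{W/m}^2, \qquad \Delta T \approx 0.3 \times 3.7 \approx 1.1\ ^{\circ}\text{C}.$$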
That is hardly noticeable or measurable amid all the local temperature variations, the myriad natural forces that govern planetary climate, the discrepancies among the various measuring systems, and the errors, biases and missing or revised data that have crept in.
Moreover, 1 degree C of warming was obviously not enough to frighten the public.
So the computerized models cited by the Intergovernmental Panel on Climate Change made another assumption: that a hotter world would hold more moisture in its atmosphere. Since water vapor is the most powerful greenhouse gas, the climate modelers claimed Earth might heat by 5 or even 10 degrees C. One scientist (who reportedly advises Pope Francis) recently claimed 12 degrees C (21.6 degrees F) of overheating!
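The arithmetic behind those large numbers is the standard feedback relation: a net feedback fraction f amplifies the no-feedback warming ΔT₀ by a factor of 1/(1 − f), so the big headline figures require assuming f close to 1. For illustration only (this f is hypothetical, not a value taken from any particular model):

$$\Delta T = \frac{\Delta T_0}{1 - f}, \qquad \Delta T_0 = 1\ ^{\circ}\text{C},\ f = 0.8 \;\Rightarrow\; \Delta T = 5\ ^{\circ}\text{C}.$$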
The awkward truth, however, is that NASA has monitored moisture in the atmosphere since 1980 – and water vapor has not increased despite the higher levels of CO2 in the air. Is that why the IPCC models have predicted more than twice as much warming as we’ve actually seen?
The year 1936 produced the hottest thermometer readings of the last 5,000 years. These days, however, NOAA reports only its "adjusted" temperatures, which always seem to be revised upward. In fact, the first big surge of human-emitted carbon dioxide, after World War II, should have produced the biggest surge of warming – if CO2 is the controlling factor. Instead, temperatures fell from 1940 to 1975.
Why did the computer models fail to predict (or even factor in) either the Pacific Oscillation's current 20-year pause in warming or the coming solar sunspot minimum? Only one model has verified itself by back-casting the temperatures and weather of the past century: Nicola Scafetta's model at Duke University, based on solar, lunar and planetary cycles. The latest data from the CERN particle physics lab have also supported a cycle-based model – and it foresees no runaway warming. Instead, it sees an impending cold solar minimum.
Is the long, wrong-headed war against carbon dioxide finally fading? Science certainly says it should. But perhaps there is still too much money, prestige and power in climate alarmism for that to happen.
___________
Dennis T. Avery is an agricultural and environmental economist and a senior fellow for the Center for Global Food Issues in Virginia. He was formerly a senior analyst for the U.S. Department of State and is co-author, with S. Fred Singer, of Unstoppable Global Warming: Every 1,500 Years.