© J. Dirk Nies, Ph.D.
Do we possess the wherewithal to alter the earth’s climate? Does our management of the natural world influence weather on a global scale? Could human-influenced climate change have happened before the Industrial Revolution set in motion our voracious use of coal, oil and natural gas that has dramatically increased carbon dioxide levels in our atmosphere? New research and scholarship are prompting some scientists to think so. Here is a glimpse of a fascinating and intriguing new theory: people were responsible for the Little Ice Age, the extraordinarily cold weather pattern in the Northern Hemisphere that lasted for several hundred years.
Some date the beginning of the Little Ice Age as early as 1300, immediately following the Medieval Warm Period. At the height of the Little Ice Age—between 1600 and 1850—salty and brackish water froze over in winter. Swedes could walk across Baltic Sea ice to Denmark. New Yorkers could walk from Manhattan to Staten Island. Glaciers expanded and farms were lost to encroaching ice. The growing season was reduced by several weeks. These unforgiving climatic conditions often led to widespread crop failure and famine.
Most climatologists attribute the Little Ice Age to one or more non-biological causes: a prolonged decrease in solar output, persistent changes in ocean currents, or heightened volcanic activity. A new picture is emerging, however, that points toward human impact on the ecosystems of the New World as the principal culprit. I will explain.
When Europeans first arrived in the Americas, they did not encounter a largely untouched, virgin wilderness as is often portrayed in our history textbooks. Scientists and historians increasingly recognize that many forest ecosystems in the New World were profoundly shaped and altered by indigenous peoples. In some parts of North America, more forested land exists today than existed 500 years ago.
Using fire every one to three years, Native Americans cleared substantial areas of North America’s eastern forests from Nova Scotia to Florida. Indians set fires to make the land more amenable for hunting, agriculture and building villages. English colonist Edward Johnson wrote that the forests were as open and spacious as “our Parkes in England.” Here in Virginia, the Monacan and Powhatan Indians thinned and cleared much of the coastal woodlands. Burning and clearing of jungle forests by indigenous peoples occurred on a large scale in Central and South America as well. Continuous human intervention was required to maintain these fire-induced, non-natural grasslands, savannahs, thinned forests and arable spaces.
When Hernán Cortés arrived in the Valley of Mexico in 1519, he and his fellow Spaniards encountered one of the world’s greatest and most populous capitals, Tenochtitlan, located on an island in Lake Texcoco. Of the estimated 50 million or more people living throughout the Americas at that time—reliable population data are scant and estimates vary widely—roughly half lived within the 200,000 square miles of the Aztec realm that greeted Cortés. By the time the Pilgrims anchored the Mayflower at Provincetown Harbor a century later, on November 11, 1620, the Aztec and surrounding Indian population had fallen by a staggering 97 percent, from 25 million to 730,000! This loss of human life was so great that three centuries elapsed before the human population in this region returned to pre-Columbian levels.
Depopulation was not a phenomenon restricted to colonial Mexico. Many indigenous populations and cultures collapsed throughout the Americas. Why?
Contagious diseases, inadvertently imported by European immigrants, were the principal culprits in this catastrophe. Smallpox, measles, influenza, pneumonic and bubonic plagues, and malaria—a disease whose name derives from the Italian mala aria, “bad air”—introduced by the slave trade from Africa, killed nine out of every ten Native Americans. Having no acquired immunities to these Eurasian and African pathogens, indigenous peoples succumbed to these foreign bacterial, viral and parasitic diseases in horrific, pandemic proportions.
This human tragedy, in turn, affected the earth’s climate. A century after the Italian explorer Cristoforo Colombo set sail from the Spanish port of Palos de la Frontera on the evening of August 3, 1492, the world was about to experience the brunt of the Little Ice Age.
What possible connection is there between the huge loss of human life and the earth’s climate?
In 2003, Dr. William F. Ruddiman, a paleoclimatologist and now Professor Emeritus in the Department of Environmental Sciences at the University of Virginia, published a novel connection. He suggested that as Native Americans rapidly died off, Indian societies could no longer manage the land as they had for centuries. Widespread burning of forest, underbrush and grasslands ceased. This greatly curtailed the flow of carbon dioxide into the atmosphere from routine, human-induced forest and brush fires in North, Central and South America. Concurrently, as open areas returned to forest, vigorously growing trees and shrubs removed large quantities of carbon dioxide from the air. The combined effects of decreased burning and increased biomass production reduced atmospheric carbon dioxide on a global scale, and the loss of this greenhouse gas contributed to the severity of the Little Ice Age.
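A back-of-envelope box model gives a sense of the scale involved. The area and carbon-density figures below are hypothetical round numbers chosen purely for illustration, not Ruddiman’s published values; the only standard constant is the conversion of roughly 2.13 billion tonnes of atmospheric carbon per ppm of CO2:

```python
# Illustrative box model: how much could New World reforestation lower CO2?
# The regrowth area and carbon density are assumed round numbers for scale;
# only the Gt-carbon-per-ppm conversion factor is a standard value.
GT_C_PER_PPM = 2.13            # Gt of carbon per ppm of atmospheric CO2

area_regrown_ha = 5e7          # assume ~50 million hectares returned to forest
carbon_per_ha_t = 100          # assume ~100 tonnes of carbon stored per hectare

carbon_gt = area_regrown_ha * carbon_per_ha_t / 1e9   # tonnes -> gigatonnes
ppm_drop = carbon_gt / GT_C_PER_PPM
print(f"{carbon_gt:.0f} Gt C sequestered ~ {ppm_drop:.1f} ppm CO2 drawdown")
```

With these assumptions the drawdown comes to roughly 2 ppm. Antarctic ice cores record a dip of several ppm in atmospheric CO2 around 1600, so a drawdown of this order is at least plausible, though ocean buffering and other feedbacks complicate the accounting.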
I highlight this novel explanation for climate change for two reasons.
First, climate science is extremely complex and incompletely understood. Despite our wealth of knowledge, theories and computer models, no solid consensus exists regarding what made the planet cooler and then warmer again during the Little Ice Age. Even greater debate and uncertainty surrounds the role agriculture and forest management played in exacerbating or ameliorating climate change in the preindustrial world. If we don’t understand the recent past, with what assurance can we predict the future?
Second, cutting-edge research is showing ever more clearly the significant connections between plant life and climate change. A study published in the January 2015 Proceedings of the National Academy of Sciences is a case in point. Researchers from NASA’s Jet Propulsion Laboratory and the National Center for Atmospheric Research report that tropical forests around the world are growing faster and absorbing carbon dioxide from the atmosphere at a rate far higher than previously thought.
Prior to this study, most scientists believed that rainforests were poor absorbers of excess carbon dioxide from the air. What this comprehensive, multi-faceted study shows is that tropical rainforests are responding strongly to the CO2 fertilization effect. They are absorbing 1.5 billion tons of carbon dioxide from the air each year, an amount greater than that absorbed by boreal and temperate forests combined. Lead author David Schimel says: “This is good news, because uptake in northern forests may already be slowing, while tropical forests may continue to take up carbon for many years.”
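To put that uptake on the same scale as atmospheric concentrations, it can be converted into a ppm-per-year offset. The conversion factor below (roughly 7.8 billion tonnes of CO2 per ppm, derived from an atmospheric mass of about 5.15 × 10^18 kg) is a standard rule of thumb rather than a figure from the study, and the sketch takes the 1.5 billion tons at face value as tonnes of CO2:

```python
# Translate the reported tropical-forest uptake into a ppm-per-year offset.
# Conversion derived from first principles; uptake figure as cited above.
ATM_MASS_KG = 5.15e18                  # total mass of the atmosphere, kg
M_CO2, M_AIR = 44.01, 28.97            # molar masses, g/mol
GT_CO2_PER_PPM = ATM_MASS_KG * (M_CO2 / M_AIR) * 1e-6 / 1e12   # ~7.8 Gt/ppm

uptake_gt = 1.5                        # Gt CO2 per year, as cited
offset_ppm = uptake_gt / GT_CO2_PER_PPM
print(f"1 ppm of CO2 ~ {GT_CO2_PER_PPM:.1f} Gt; "
      f"tropical uptake offsets ~ {offset_ppm:.2f} ppm/yr")
```

Taken at face value, then, the reported uptake offsets roughly 0.2 ppm of the annual rise in atmospheric CO2, which is currently about 2 ppm per year.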
In conclusion, carbon dioxide occupies an essential position in the chemistry of life. To label CO2 only as a hazardous air pollutant is like saying water only causes floods. Both water and carbon dioxide are much more vital and versatile than that. As we devise economic, regulatory and technological fixes to atmospheric carbon pollution, we would be well advised to factor in the deep connections between agriculture, land management, CO2 and the climate.
Why do we continue to focus on CO2 as the primary AGW contributor? Energy consumption increased 20-fold during the 20th century, with 80% coming from fossil fuels. Fossil fuels are burned solely for their heat production, and CO2 is a by-product. CO2 is a minor component among the greenhouse gases that capture radiative energy. How much of the increase in global warming can be attributed to an increase in the efficiency of the greenhouse blanket, and how much to the 20-fold increase beneath this blanket? The heat emissions from energy consumption were more than four times the amount attributable to the actual measured rise in air temperature. Why didn’t the scientists studying this in the latter part of the 20th century note and acknowledge this fact? Let’s reduce our “carbon footprint” by increasing efficiencies, but quit touting CCS. It is counter-productive, expensive, and useless in actually reducing the temperature. It requires the removal of 8.8 billion tons of CO2 to reduce the concentration by 1 ppm. What level should we set as a target? During the gradual rise from 320 to 370 ppm, at what point did CO2 become THE CAUSE of global warming? AGW will not stop until fossil and nuclear plants are replaced with renewable energy. (Nuclear power emits twice as much waste heat as it converts to electricity, but because nuclear plants emit no CO2, more new ones are being permitted and built worldwide.)
Philip, thank you for your thoughtful and insightful comments.
Regarding your observation: “Why do we continue to focus on CO2 as the primary AGW contributor”, I am writing this series of articles precisely to address your concern and to expand our focus, vision and understanding of carbon dioxide beyond its role in anthropogenic global warming (AGW).
Here is a comparison to consider regarding your question: “How much of the increase in global warming can be attributed to an increase in the efficiency of the greenhouse blanket and how much to the 20-fold increase beneath this blanket?”
The total solar energy absorbed by Earth’s atmosphere, oceans and land masses is approximately 3,850,000 exajoules per year (http://en.wikipedia.org/wiki/Solar_energy). This is equivalent to roughly 3.85 × 10^24 joules per year (an exajoule is 10^18 joules).
Worldwide, our energy consumption of all fuels is about 9,000 Mtoe (million tonnes of crude oil equivalent) each year (http://www.iea.org/publications/freepublications/publication/keyworld2014.pdf). This is equivalent to about 3.8 × 10^20 joules per year (1 Mtoe = 4.1868 × 10^16 joules).
Comparing these two numbers (natural vs. anthropogenic), we find that the amount of solar energy heating the earth each year is roughly 10,000 times larger (four orders of magnitude) than the amount of heating derived from all anthropogenic forms of energy (fossil fuel, nuclear, biorenewable, etc.).
In other words, even after a 20-fold increase over the past century, the waste heat generated each year by human activities still remains vanishingly small (about 0.01 percent) compared with the amount of heat the earth receives from the sun. Human energy use is an insignificant quantity (less than a rounding error) in the earth’s energy budget.
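The comparison is easy to check numerically. The sketch below uses the Wikipedia and IEA figures (solar absorption of about 3,850,000 EJ per year; world energy consumption of about 9,000 Mtoe per year, with 1 Mtoe = 4.1868 × 10^16 joules):

```python
# Back-of-envelope comparison: solar input vs. anthropogenic heat release.
EJ = 1e18                        # joules per exajoule
MTOE = 4.1868e16                 # joules per million tonnes of oil equivalent

solar_in = 3_850_000 * EJ        # ~3.85e24 J/yr absorbed from the sun
human_use = 9_000 * MTOE         # ~3.8e20 J/yr from all fuels combined

ratio = solar_in / human_use
print(f"solar/anthropogenic ratio: {ratio:,.0f}")
print(f"human share of energy budget: {human_use / solar_in:.4%}")
```

The ratio comes out near 10,000; human energy use is on the order of one-hundredth of one percent of the solar input.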
I hope this is helpful.
Good to see this being recognised
Firestick hunting was the norm in primitive nomadic hunter cultures, such as the Australian Aborigines displayed until recent times. In fact, approximately 6,000 years ago there was a massive migration from the once-fertile savannah of North Africa (firestick hunting had wiped out the rainforests).
The defining skill that established civilisation and permanent villages was pottery; from that came the kiln, the smelting of metals, and farming, cultivation and herding. Denser energy sources were needed, especially to transition to iron, hence coal and charcoal; there are coal mines thousands of years old.
Charcoal burning was a major peasant industry in the Middle Ages; in fact, the Plague and the LIA saved the forests of Europe, owing to the massive population decrease.
The LIA also created demand for more heating and charcoal.
However, the volcanic cooling cannot be ignored; the 1278 Indonesian super-volcano left its signature worldwide.
Thank you for your fascinating comments.
Yes, very large volcanic eruptions in Indonesia, such as that of Mount Tambora in 1815, which led to the “Year Without a Summer,” affect the earth’s weather on a global scale. Generally, however, the impact of a single eruption lasts less than a few years.
Regarding the existence of the Little Ice Age (LIA), two professors of economics, Morgan Kelly and Cormac Ó Gráda, of University College Dublin, write this month in Eurasia Review: “It appears instead that the European Little Ice Age is a statistical artefact, where the standard climatological practice of smoothing what turn out to be white noise data prior to analysis gives the spurious appearance of irregular oscillation – a Slutsky Effect.”
They further say that “While the idea that Europe experienced a Little Ice Age is widespread, its statistical basis is at best exiguous, and appears to stem from inappropriate efforts to smooth data that are actually random.”
If the Little Ice Age truly turns out to be merely a phantom statistical artefact spanning four centuries, what can we say with certainty about weather patterns and trends today? A phrase popularized by Mark Twain – lies, damned lies, and statistics – comes to mind.