If you follow the online climate discussion, you’ve undoubtedly run across mention of something called the “Little Ice Age,” a period in Earth’s history (generally considered to run from the 16th through the 19th centuries) when much of the world cooled by 1-2°C, enough to trigger a wave of glacial advance across the Northern Hemisphere. The Thames regularly froze over (the painting, at right, is of the frozen Thames in 1677), crops dependent on warm weather in China were abandoned, and in 1780 New York Harbor froze, allowing people to walk from Manhattan to Staten Island. There was significant evidence of cooling around the world, lasting for at least a couple of hundred years.
The Little Ice Age gets pulled into climate disruption arguments as an example of how “natural cycles” (solar cycles, volcanoes, ocean changes, etc.) can cause major disruption; therefore, the argument goes, anthropogenic climate change isn’t real. Even setting aside the illogic of that leap, the Little Ice Age is often held up as the leading example of the climate being naturally weird.
Or maybe not. Stanford geochemist Richard Nevle, at this month’s Geological Society of America annual meeting, argued forcefully that the die-off of the indigenous population of North America (where the spread of European diseases killed off 90% of the 40-80 million Native Americans within roughly a century) led directly to a massive reforestation event, as trees regrew in areas that had been cleared for crops. Those areas had generally been cleared by slash-and-burn (“swidden”) agriculture, leaving evidence in the form of charcoal in the soil.
About 500 years ago, this charcoal accumulation plummeted as the people themselves disappeared. […] Trees returned, reforesting an area at least the size of California, Nevle estimated. This new growth could have soaked up between 2 billion and 17 billion tons of carbon dioxide from the air.
Ice cores from Antarctica contain air bubbles that show a drop in carbon dioxide around this time. These bubbles suggest that levels of the greenhouse gas decreased by 6 to 10 parts per million between 1525 and the early 1600s.
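As a rough sanity check on those figures, we can convert the quoted tonnage into ppm. The conversion factor is a standard modern value, not something from the article: 1 ppm of atmospheric CO2 corresponds to roughly 2.13 gigatonnes of carbon, or about 7.81 Gt of CO2.

```python
# Rough sanity check on the numbers quoted above.
# Standard conversion (not from the article): 1 ppm of atmospheric CO2
# corresponds to about 2.13 Gt of carbon, i.e. about 7.81 Gt of CO2
# (multiply by 44/12 to go from carbon mass to CO2 mass).
GT_CO2_PER_PPM = 2.13 * (44.0 / 12.0)  # about 7.81

def ppm_equivalent(gt_co2):
    """Atmospheric ppm equivalent of a given mass of CO2, in gigatonnes."""
    return gt_co2 / GT_CO2_PER_PPM

low, high = ppm_equivalent(2.0), ppm_equivalent(17.0)
print(f"2-17 Gt CO2 is equivalent to roughly {low:.2f}-{high:.2f} ppm")
```

Taken at face value, the direct uptake accounts for only part of the observed 6-10 ppm decline; carbon-cycle feedbacks (notably ocean-atmosphere exchange) mean carbon removed and ppm observed never map one-to-one, so this is an order-of-magnitude check, not a budget.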
Nevle argues that the composition of the CO2 in the trapped bubbles shows that the balance of carbon-13 and carbon-12 isotopes changed in this period: plants preferentially take up C-12, so a surge in plant growth would remove relatively more C-12 from the air, lowering the atmospheric C-12/C-13 ratio, which is just what Nevle found. Natural variations, according to Nevle, could account for only about 1.3ppm of the 6-10ppm drop in atmospheric CO2 concentration.
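The direction of that isotope shift can be illustrated with a simple mass balance. The numbers below are standard textbook values, not figures from the article: roughly 600 GtC in the pre-industrial atmosphere, an atmospheric δ13C near -6.5‰, and C3 plant tissue near -25‰.

```python
# Illustrative isotope mass balance (textbook values, not from the article).
# Plants discriminate against C-13, so plant tissue is isotopically "light"
# (delta-13C around -25 permil for C3 plants) compared with atmospheric CO2
# (around -6.5 permil pre-industrially). Removing plant-fixed carbon
# therefore nudges the remaining atmospheric CO2 toward heavier values.
ATM_CARBON_GT = 600.0  # approx. pre-industrial atmospheric carbon, GtC
DELTA_ATM = -6.5       # approx. pre-industrial atmospheric delta-13C, permil
DELTA_PLANT = -25.0    # typical C3 plant delta-13C, permil

def delta_after_uptake(removed_gtc):
    """Atmospheric delta-13C after plants remove `removed_gtc` GtC of carbon."""
    remaining = ATM_CARBON_GT - removed_gtc
    return (ATM_CARBON_GT * DELTA_ATM - removed_gtc * DELTA_PLANT) / remaining

for gtc in (1.0, 3.0, 5.0):
    print(f"remove {gtc} GtC -> atmospheric delta-13C = "
          f"{delta_after_uptake(gtc):+.2f} permil")
```

The computed shift runs in the direction the text describes: as plant uptake grows, the remaining atmospheric CO2 gets isotopically heavier, i.e. its C-12/C-13 ratio falls.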
Fascinating stuff, this. It has a few particularly interesting implications:
It shows how a relatively small change in average global temperatures can have a dramatic impact. If a drop of 1-2°C can put us on the edge of an ice age, imagine what an increase of 1-2°C… or more… could do to the planet.
It also shows how big a challenge any scheme to pull carbon out of the atmosphere to fight global warming faces. A reforestation event covering an area the size of California managed to pull all of 10ppm of carbon dioxide out of the atmosphere; we’re looking at an increase of 50-100ppm by the midpoint of this century.
Finally (and apropos of the title of this piece), it’s a real indicator of just how long we’ve been living in the Anthropocene. Human activity — and the sudden lack of same — makes a big difference to the environment.
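The scale mismatch in the second point can be put in numbers using only the figures quoted above (treating the ice-core estimate of a 6-10 ppm drawdown per California-sized reforestation event as linearly scalable, which is generous; California's land area of roughly 424,000 km² is the one figure added here):

```python
# Back-of-the-envelope scaling, using only figures from the text above:
# one California-sized reforestation event drew down roughly 6-10 ppm of CO2;
# how many such events would offsetting a 50-100 ppm increase require?
CALIFORNIA_KM2 = 424_000          # approximate land area of California
drawdown_per_event_ppm = (6, 10)  # ppm removed, per the ice-core estimate
future_increase_ppm = (50, 100)   # projected mid-century increase, per the text

best_case = future_increase_ppm[0] / drawdown_per_event_ppm[1]   # 50/10
worst_case = future_increase_ppm[1] / drawdown_per_event_ppm[0]  # 100/6

print(f"Offsetting that rise would take roughly {best_case:.0f} to "
      f"{worst_case:.0f} California-sized reforestation events, "
      f"i.e. {best_case * CALIFORNIA_KM2 / 1e6:.1f}-"
      f"{worst_case * CALIFORNIA_KM2 / 1e6:.1f} million km² of new forest.")
```

Even under that optimistic linear assumption, the land requirement is on the order of several million square kilometers, which underlines how hard drawdown-by-reforestation would be at the scale of the coming increase.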