Why I Stopped Believing in Man-made Global Warming and Became a Climate Skeptic
When I was in my late 20s, I was living in San Francisco and playing in a reggae band. At the time, I fervently believed that global warming was a real, man-made problem. And I was vocal about it. Among other things, my friends and I would slap bumper stickers on SUVs that said: “I’m Changing The Climate, Ask Me How.” I didn’t like yuppies, and I resented their big SUVs struggling to climb the city’s steep hills, needlessly warming our planet.
It seemed obvious to me that there was a climate change problem. I heard about it—and read about it—every day in the news. Eventually, I started to study the issue, thinking that I needed to understand it better to write informed articles on the subject.
The first time I experienced a twinge of climate change doubt was when I learned that carbon dioxide (CO2) comprised less than 0.04 percent of the atmosphere. In truth, such a seemingly small amount shouldn’t be underestimated since CO2 starts trapping heat quite effectively at far more minuscule concentrations. But at the time, I thought, “That’s strange. I would have thought it was a lot more.”
The point is that I had been barraged with so much global warming hysteria that I figured CO2 must comprise 1 percent, 5 percent, or even 10 percent of the atmosphere. But since it was only 0.04 percent, it seemed to me that the people making the case for global warming should be more careful and not exaggerate their claims, lest they lose credibility.
Regardless, I continued to believe that man was changing the climate. And I was unhappy with friends who seemed like stubborn, cranky holdouts when they disputed this indisputable fact—that humankind was indeed warming the planet.
The years rolled along and I still assumed global warming was a man-made catastrophe—until I finally started to study the science of the issue. At that point, two small pieces of information sparked a real curiosity in me—and led me to realize that the issue was far more complicated than I had always believed.
The first was learning that man produces only a tiny portion of all CO2 released into the atmosphere each year. Termites alone, for example, are estimated to release more CO2 annually than all the burning of fossil fuels. The second was learning that there had been a global cooling scare in the 1970s.
After stumbling across these two seemingly random nuggets of information, I started to really read—and with an open mind. And it was then that I began to profoundly change my opinion. Simply put, I went from wholeheartedly believing in man-made “climate change” to viewing the science undergirding the case as very questionable. Overall, I became resentful that I’d been naively indoctrinated by a daily, one-sided media barrage. And I started to look at myself as something of a freedom fighter—someone who was pushing back against misinformation—and making people aware that they were being manipulated.
If asked why I don’t believe in the theory of “anthropogenic” (man-made) warming, I try to list some very simple points. I explain that, yes, the Earth has warmed by roughly 0.8 degrees Celsius over the past 150 years—and I agree that surface temperatures have warmed, that the oceans have warmed, and that glaciers have retreated. But I disagree with the root cause of this warming.
For starters, CO2 is actually a rather flawed “greenhouse gas.” When CO2 is first introduced into the atmosphere it rapidly absorbs as much heat (in the form of infrared radiation) as possible. But it doesn’t take long for CO2 to become “optically saturated.” This means that after reaching roughly 0.0020 percent (20 parts per million) of the atmosphere, CO2 starts fading. From then on, it takes ever-doubling amounts of CO2 to trap the same amount of heat. By the present concentration of 0.04 percent (400 parts per million), CO2 is essentially saturated—and can’t meaningfully trap much additional heat.
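The diminishing-return effect described above can be illustrated with the standard simplified expression for CO2 radiative forcing, ΔF = 5.35 × ln(C/C0) watts per square meter. The coefficient and the pre-industrial baseline used here are the commonly cited values; this is only a sketch showing that each doubling of CO2 adds the same fixed increment of forcing, not a growing one.

```python
import math

# Simplified CO2 radiative forcing: delta_F = 5.35 * ln(C / C0), in W/m^2.
# The logarithm means each successive doubling of concentration adds the
# same fixed increment of forcing -- the "ever-doubling" behavior above.

def co2_forcing(c_ppm, c0_ppm=280.0):
    """Forcing (W/m^2) relative to a pre-industrial baseline of 280 ppm."""
    return 5.35 * math.log(c_ppm / c0_ppm)

for ppm in (280, 560, 1120):  # baseline, one doubling, two doublings
    print(f"{ppm} ppm -> {co2_forcing(ppm):.2f} W/m^2")
```

Going from 280 to 560 ppm adds about 3.7 W/m², and the next doubling (to 1,120 ppm) adds only the same 3.7 W/m² again, not more.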
This limitation of CO2 actually runs completely counter to the prevailing notion that adding more CO2 to the atmosphere will continue to trap ever greater amounts of heat. In truth, basic science demonstrates exactly the opposite, which is why climate scientists actually base most of their projected warming on “positive feedback” from water vapor.
Significantly, water vapor functions as the predominant heat-trapping gas in the atmosphere. And so, when climate scientists use computer models to predict future warming due to man-made climate change, they are essentially saying the following: We know that CO2 rapidly fades as a greenhouse gas. We think that before CO2 becomes saturated, it will raise global temperatures enough to add more water vapor to the atmosphere. This added water vapor will trap more heat, which will raise temperatures more, which will add more water vapor. The result will be a feedback loop that keeps driving a rise in temperatures.
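The feedback loop just described amounts to a geometric series: if a direct warming ΔT triggers an additional f·ΔT of warming through added water vapor, and that in turn triggers f²·ΔT more, the total converges to ΔT/(1−f) as long as the gain f stays below 1. Here is a minimal sketch of that arithmetic; the direct-warming and gain values are purely hypothetical placeholders, not measured quantities.

```python
# Minimal sketch of the water-vapor feedback loop: each round of warming
# adds vapor that traps a fraction `gain` of the previous round's warming.
# For gain < 1 the series converges to direct / (1 - gain).

def feedback_warming(direct_c, gain, rounds=60):
    """Total warming (deg C) after iterating the feedback `rounds` times."""
    total, increment = 0.0, direct_c
    for _ in range(rounds):
        total += increment
        increment *= gain  # the next round traps `gain` times as much heat
    return total

direct = 1.0  # hypothetical direct warming from CO2 alone, in deg C
gain = 0.5    # hypothetical feedback gain
print(f"total warming: {feedback_warming(direct, gain):.2f} deg C")  # ~2.00
```

With a gain of 0.5, the feedback doubles the direct warming; the whole dispute over climate sensitivity is, in effect, a dispute over the size of that gain.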
There’s a major flaw in this theory, however, and one that climate scientists have never been able to solve. Simply put, water vapor in the atmosphere inevitably condenses into clouds. And cumulus clouds not only reflect solar radiation back into space but also produce rain—which both lowers surface temperatures and scrubs CO2 from the atmosphere.
Setting aside the cloud problem, this presumed climate “sensitivity” to CO2 is the overall engine of man-made warming and continues to be programmed into computer models. But it remains a tenuous argument. So the real question should be: Well, if CO2 isn’t driving global warming, what is?
And that brings us to the second point that I try to make—the issue of solar variability. Over the past 150 years, the sun’s output has increased quite significantly—to levels not seen in as much as 2,000 years. Many people assume that the sun’s output is constant, but in fact, it ebbs and flows on various short- and long-term cycles. And so, not only did solar activity increase sharply during the 20th century, but this same increase in output corresponds quite closely with other warm periods recorded over the past few thousand years.
Sadly, advocates of man-made climate change essentially discount solar activity as a meaningful contributor to global warming. Their reasoning is that changes in solar irradiance (“brightness”) are quite small compared to the overall, observed warming of the 20th century. However, this view overlooks the related—and larger—impacts of solar variability, including atmospheric ionization and cloud formation. And so, when considered together, these associated factors demonstrate a more complete picture of solar variability’s relevance.
Compounding the impact of increased solar activity in the 20th century is the issue of ozone depletion. From the late 1950s until the mid-1990s, ozone concentrations in the stratosphere steadily diminished. As ozone levels declined, the amount of ultraviolet (UV) radiation reaching the Earth’s surface steadily increased. UV radiation is a significant source of atmospheric warmth, which helps to explain why increased UV penetration during the latter portion of the 20th century contributed to a rise in temperatures. Notably, the ozone layer began to stabilize after 1996, when the Montreal Protocol (which banned chlorofluorocarbons) was fully implemented. In fact, since the late 1990s, both satellite data and weather balloon measurements show a net flatlining of global temperatures.
This “pause” in the rise of global temperatures since the late 1990s has become the focus of serious debate within the climate community. Simply put, the theory of man-made warming cannot account for a halt in the overall rise in temperatures, or why computer models (programmed to emphasize a high climate sensitivity to CO2) are continuing to diverge from actual, observed temperature measurements.
The flatlining of net temperatures observed by satellites is also at odds with recent adjustments to surface measurements issued by the National Oceanic and Atmospheric Administration (NOAA) and adopted by NASA. By revising its temperature collection methods to include measurements taken from the engine cooling-water intakes of oceangoing vessels, and by estimating Arctic Ocean readings based on neighboring land masses, NOAA was able to announce in 2015 that there has been no “pause” in temperatures and that recent years are the “warmest on record.” The fact that NOAA’s findings continue to conflict with satellite readings has not received much attention.
What’s really interesting to study about climate change is the pattern of warming and cooling seen over the last few thousand years of the current interglacial period. It’s somewhat ironic that claims of a “warmest year on record” only include the “modern” instrumental era of 1880 to the present. But studying 100 or 150 years yields only a partial glimpse of recent temperature trends. If one actually looks back over the past 2,000 years, for example, a far more complete—and complex—landscape emerges.
Thanks to proxy measurements of carbon, beryllium, and oxygen isotopes, geologists can estimate both temperatures and solar activity going back many thousands of years. And so, we know that from roughly 250 BC to 400 AD, the global climate was warmer than today, in what is called the “Roman Warm Period.” Solar output at the time was high, which helps to explain the temperate conditions that aided the expansion of the Roman Empire. However, solar activity plummeted during the middle of the first millennium, leading to several centuries of brutal cold during the Dark Ages, culminating in the Nile River actually freezing in 829 AD.
One hundred years after the Nile River froze, however, solar activity started climbing back toward more comfortable levels, leading to a “Medieval Warm Period” that spanned roughly 950-1250 AD. Historical and geologic records indicate that, globally, the Medieval Warm Period was warmer than today, with balmy weather allowing the Vikings to farm on Greenland (hence its name), and the British to cultivate vineyards and produce wine.
From 1350-1850, though, solar activity plummeted several times, leading to a colder era nicknamed the “Little Ice Age.” Millions of people died of famine and disease during this inhospitable time, and it was only with the rebound of solar activity in the latter part of the 1800s that temperatures climbed back to more comfortable levels.
And now we’re in the “Modern Warm Period,” the latest in a succession of warm/cold episodes that track closely with solar variability. Something to consider when reviewing these alternating periods of climate is that it is during warm phases that civilizations flourish. While millions of people starved during the Dark Ages and the Little Ice Age, global agriculture thrived during the Medieval Warm Period.
Overall, and as noted above, there are valid reasons to conclude that CO2 is not the primary forcing agent of global climate. Thus, it’s unfortunate that the media often ridicules climate critics rather than spending more time investigating their views. Clearly, there’s a media bias involved, and one that reinforces notions such as “the science is settled,” or that “97 percent of scientists agree.”
However, the idea of an overwhelming consensus on global warming not only conflicts with the more than 30,000 scientists who have signed the “Petition Project” rejecting man-made warming but relies on spurious assumptions. The 97 percent figure stems from a May 2013 tweet by President Obama stating that “97 percent of scientists agree: climate change is real, man-made, and dangerous.” As his source, the president linked to a newly published study by John Cook at the University of Queensland. However, a closer look at Cook’s study reveals that of the 11,944 research papers examined, only one-third actually took a position regarding the causes of global warming.
There are various instances of this exaggeration and hype in the climate change debate. But nothing could be as troubling as the prescriptions currently being formulated by climate alarmists. In the service of preventing a future catastrophe based solely on flawed computer modeling, public figures are calling for an 80 percent (or even 100 percent) reduction in the future use of fossil fuels. While seemingly well-intentioned, these policies overlook two real-world problems.
First, a mass reduction in fossil fuel use would forfeit the lives of hundreds of millions of people in developing nations. Currently limited to the barest of medical aid and technical infrastructure, these populations already live at the poverty line, barely supported by sporadic power generation. It is only through the introduction of water and sewage treatment (powered by natural gas and coal plants) that some areas of the Third World are attempting to raise living standards.
Second, a presumed switch to wind and solar generation overlooks the glaring deficiencies of both forms of power: Wind and solar are inherently intermittent means of electricity production (because the wind doesn’t always blow and the sun doesn’t always shine). And both are also low-yield and expensive—in part because their unreliability still requires ample backup generation from “spinning reserves” of either natural gas or coal.
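The yield gap can be put in rough numbers using capacity factors—the fraction of a plant’s nameplate capacity it actually delivers on average over a year. The figures below are assumed, illustrative values only, and the arithmetic ignores the timing problem entirely (which is exactly why spinning reserves are still needed even when average output matches average demand).

```python
# Back-of-the-envelope: nameplate capacity needed to cover an average
# demand, given an assumed capacity factor. Illustrative numbers only.

def nameplate_needed(avg_demand_mw, capacity_factor):
    """Nameplate MW required so that average output equals average demand."""
    return avg_demand_mw / capacity_factor

demand = 1000.0  # assumed average demand, in MW
for name, cf in [("baseload gas/coal", 0.90),
                 ("wind", 0.35),
                 ("solar", 0.25)]:  # capacity factors are assumptions
    print(f"{name}: {nameplate_needed(demand, cf):.0f} MW of nameplate")
```

On these assumed numbers, covering the same average demand takes roughly three to four times as much wind or solar nameplate capacity as conventional generation—before accounting for when that output actually arrives.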
If the United States were to actually make the transition to a partially or fully wind- and solar-based power infrastructure, the failures of Europe and Australia’s green energy experiments suggest that the nation would experience an ongoing series of power shortages and blackouts. The result would be a consequential loss of health and safety measures. Hospitals would struggle to ration available power. Water treatment and waste systems could fail. Food would spoil for lack of refrigeration.
The overall point is that there are valid reasons to question both the assumptions and policies advocated by climate change activists. It would be helpful if those who take a stand against presumed man-made warming were given a chance to expand on their reasoning, rather than face criticism and scorn. And so, at a time of political uncertainty and economic difficulty, it would make sense to fully study all the potential ramifications of such a complicated issue.