Climate 101 is a Mashable series that answers provocative and salient questions about Earth's warming climate.
Yes, the sun is a profoundly important factor in Earth's climate. It always will be.
But scientists, like those at NASA, know the sun isn't causing the current, rapid rise in global temperatures. Here's why:
1. Solar activity
Some 26,000 global weather stations, in addition to observations taken by ships, buoys, and satellites, show Earth's continued temperature rise, including accelerated warming over the last four decades. The last decade was easily the warmest decade on record.
But during the last four decades or so, solar irradiance, or the sun's energy output, hasn't changed much (it's measured with satellites). In fact, it has slightly decreased. Herein lies a foundational problem for anyone arguing that the sun's recent activity or energy output is responsible for today's warming climate.
"You can’t decrease the amount of energy you’re receiving from the sun, and then expect that to heat up the Earth. That's a basic violation of physics," explained Peter Jacobs, a climate scientist working in the NASA Goddard Space Flight Center's Office of Communications.
The sun, thankfully, is an extremely stable star. It still has natural swings in energy output, but they're really small. For example, there are approximately 11-year periods of activity called solar cycles, in which the sun's activity increases and then decreases. These changes in energy output are on the order of 0.1 percent, explained Geoff Reeves, who researches space weather at Los Alamos National Laboratory. "The sun has small variations in the amount of light and heat that comes out," said Reeves, noting the last two solar cycles have been below average in energy output.
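For a sense of just how small a 0.1 percent swing is, here's a rough back-of-envelope sketch: a toy blackbody energy balance, not a climate model, using standard approximate values for irradiance and albedo. Earth's equilibrium temperature scales with the fourth root of absorbed sunlight, so a 0.1 percent change in irradiance only shifts it by a few hundredths of a degree.

```python
# Back-of-envelope sketch (a toy blackbody energy balance, not a climate
# model): how much does a 0.1 percent solar-cycle swing move Earth's
# equilibrium temperature?

SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W / (m^2 * K^4)
S0 = 1361.0       # average solar irradiance at Earth, W / m^2
ALBEDO = 0.30     # approximate fraction of sunlight reflected to space

def equilibrium_temp(irradiance: float) -> float:
    """Blackbody equilibrium temperature for a given solar irradiance."""
    absorbed = irradiance * (1 - ALBEDO) / 4  # spread over the whole sphere
    return (absorbed / SIGMA) ** 0.25

t_quiet = equilibrium_temp(S0)
t_active = equilibrium_temp(S0 * 1.001)  # 0.1 percent solar-cycle peak

print(f"Quiet sun: {t_quiet:.2f} K, active sun: {t_active:.2f} K")
print(f"Difference: {t_active - t_quiet:.3f} K")  # roughly 0.06 K
```

That works out to roughly 0.06 degrees, and it swings back down every 11 years or so: far too small, and too cyclical, to explain the roughly 1 degree Celsius of sustained warming observed since the late 1800s.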
If one looks at a longer timescale, the sun can have other relatively small trends in energy output, too, such as the overall "slight extra warming from the sun" noted by NASA since around 1750, but not nearly enough to account for global climate change (warming from human activity is 50 times greater, according to the space agency). Historically, Earth has also experienced cooler periods like the "Little Ice Age," which largely impacted Europe and the Northern Hemisphere between around A.D. 1300 and 1850. However, research has shown this cooling was most likely due to repeated volcanism and other environmental factors, not a big swing in solar activity. (Big natural changes in climate, like ice ages, are usually caused by relatively small, though impactful, variations in Earth's orbit.)
The major driver of modern climate change, according to scientists at top U.S. research agencies and universities, is the alteration of the planet's atmosphere. Certain gases in the atmosphere trap heat, and two particularly potent atmospheric gases — carbon dioxide and methane — are surging as a consequence of fossil fuel burning and other human activities.
"There are big changes in our atmosphere," said Reeves. "That's a simple and straightforward explanation that we understand the physics of."
Earth is currently reacting to the highest atmospheric levels of heat-trapping carbon dioxide in at least 800,000 years, but more likely millions of years.
2. The stratosphere
Over 50 years ago, atmospheric scientists predicted that as CO2 increased in the lower atmosphere (and warmed Earth), a lofty layer of the atmosphere called the stratosphere would cool.
They were right.
The stratosphere, which sits roughly 10 to 30 miles up in the sky, has cooled largely because the CO2 accumulating in the lower atmosphere (where we live, have weather, and are experiencing global warming) absorbs much of the heat radiating up from the surface. That means little of this energy can reach the stratosphere. "The stratosphere’s temperature is determined by the difference between how much energy it emits and how much it receives, and increasing CO2 is really reducing the energy received from below," explained NASA's Jacobs. Consequently, the stratosphere cools.
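To make Jacobs' energy-balance point concrete, here's a minimal illustrative sketch: a single gray layer in radiative equilibrium, with made-up flux numbers, not an actual stratospheric model. A thin layer that absorbs part of the infrared flux rising from below and re-emits from both its top and bottom must cool when that upwelling flux shrinks.

```python
# Minimal illustrative sketch of the balance Jacobs describes (a single gray
# layer in radiative equilibrium; the flux values below are made up for
# illustration). The layer absorbs a fraction `a` of the infrared flux rising
# from below and emits from both its faces, so in equilibrium:
#   a * F_up = 2 * a * SIGMA * T**4   (the absorptivity `a` cancels out)

SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W / (m^2 * K^4)

def layer_temp(upwelling_flux: float) -> float:
    """Equilibrium temperature of a thin layer heated only from below."""
    return (upwelling_flux / (2 * SIGMA)) ** 0.25

# More CO2 in the lower atmosphere intercepts more upwelling infrared
# before it reaches the stratosphere, reducing the flux arriving from below.
t_before = layer_temp(240.0)  # illustrative flux before the CO2 increase
t_after = layer_temp(235.0)   # illustrative, slightly reduced flux

print(f"Before: {t_before:.1f} K, after: {t_after:.1f} K")  # the layer cools
```

Less energy arriving from below means a cooler layer aloft, even as the surface and lower atmosphere warm.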
Confirming stratospheric cooling was a triumph in climate prediction, and is a hallmark of modern climate change. "This was predicted way before it was observed," said Jacobs.
Conversely, if increased solar energy were responsible for rising global temperatures, climate scientists would expect the entire atmosphere to heat up (as that extra solar heat blanketed the planet), not just the lower atmosphere (where greenhouse gases trap heat today).
Today's climate change is happening rapidly compared to previous climatic changes, like warming after an ice age. As NASA notes, based on old climate records (such as from deep ice cores or tree rings):
"This ancient, or paleoclimate, evidence reveals that current warming is occurring roughly ten times faster than the average rate of ice-age-recovery warming. Carbon dioxide from human activity is increasing more than 250 times faster than it did from natural sources after the last Ice Age."
The consequences are global, and already serious:
Major Antarctic ice sheets have destabilized, with the potential for many feet of sea level rise.
Heat waves are becoming longer and more frequent, while smashing records.
Storms are intensifying, leading to more billion-dollar floods.
Human-created greenhouse gases, not the sun, are driving these changes. The evidence is clear.
"We know CO2 has been changing a lot since the Industrial Revolution and we know the energy output from the sun hasn't," said Los Alamos' Reeves. "This makes it highly unlikely the sun is responsible for the recent global warming."