
Research: Volcanic Aerosols Largely Responsible for Recent Warming Slowdown

10:20 am in Uncategorized by WeatherDem

Climate change skeptics used the recent slowdown in observed surface warming to claim that 20th-century warming was temporary and that the Earth would return to lower average annual temperatures.  They offered many potential explanations for the slowdown, none of which made physical sense.  The Sun’s 11-year cycle (often used to explain away warming), a primary argument brought forth, is not the reason: this cycle’s solar maximum is near at hand, yet warming has slowed recently.

Recently accepted research points to a viable physical explanation.  In addition to oceanic transport of heat to the deep ocean and recent La Niña events, sulfur dioxide emissions from small and mid-sized volcanoes entered the lower stratosphere and reflected more incoming solar radiation than normal.  Using a model, this research separated the effect of natural sulfur emissions from anthropogenic emissions and determined that the former had a much larger influence than previously thought.  Aerosol optical depth (AOD) is a calculated metric that represents how opaque or transparent the atmosphere is to different radiation wavelengths.  AOD in the layer between 20 and 30 km increased 4-10% per year since 2000, which is a significant change from normal conditions – significant enough to affect Earth’s climate.
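To get a feel for what a 4-10% annual increase means once it compounds, here is a minimal arithmetic sketch.  The rates come from the study; the 2000-2012 span is my assumption for illustration:

```python
# Cumulative change in stratospheric AOD implied by a sustained
# 4-10% per-year increase, compounded over an assumed 2000-2012 span.
def cumulative_increase(rate_per_year, years):
    """Total fractional increase after compounding a yearly rate."""
    return (1 + rate_per_year) ** years - 1

years = 12  # 2000-2012, an assumed span for illustration
low = cumulative_increase(0.04, years)
high = cumulative_increase(0.10, years)
print(f"4%/yr over {years} years:  +{low:.0%}")   # roughly +60%
print(f"10%/yr over {years} years: +{high:.0%}")  # roughly +214%
```

Even the low end implies the 20-30 km layer became more than half again as opaque to incoming sunlight as it was in 2000.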

Here is one of the paper’s graphical results:


Figure 1. Observed and modeled time series of stratospheric AOD from three latitude bands.  Satellite observations are represented by the black line.  Base-line model runs are in green. Model runs with the increase in anthropogenic emissions from China and India are in blue. The dashed blue line depicts a model run with 10x the actual increase in anthropogenic emissions. The model run with volcanic emissions is in red. The black diamonds and initials along the bottom of the plot represent the volcanic eruptions that were included in the model run. (Source: Neely paper; subs. req’d.)

As the caption says, satellite measurements are denoted by the thick black curve.  Note the large increase in AOD (higher opacity) over the tropics in the mid-2000s (b) and the large AOD increase over the northern mid-latitudes in the late 2000s (a).  While not a perfect fit to the observations, the model run with volcanic eruptions (red curve) does the best job of explaining the origin of the SO2.  Individual eruptions are indicated by black diamonds at the bottom of each sub-plot.  The effects of volcanic eruptions on climate are, in a general sense, well known.  Injections of SO2 into the stratosphere reflect sunlight, which reduces the amount of energy entering the Earth’s climate system.  As far as the climate is concerned, there is little difference between one large-scale eruption (e.g., Pinatubo in 1991) and many mid-sized eruptions in a short time period (see above).

This could be good news for the climate, at least in the short term.  If incoming energy was reflected back into space rather than stored in the system, then the observed slowdown in the temperature trend (see Figure 2) has a physical explanation and can be treated as real, instead of as energy waiting to transfer from the oceans to the atmosphere.

There is also bad news, however.  From the study (emphasis mine):

Read the rest of this entry →

Can Carbon Emissions Be Reduced In Electricity Generation While Including Variable Renewables? A California Case Study

9:09 am in Uncategorized by WeatherDem

This is a class paper I wrote this week that might be of interest to readers here.  I can provide more information if desired.  The point of the paper was to write concisely for a policy audience about a decision-support planning method in a subject that interests me.  The teacher instructed us to use Strunk and White’s The Elements of Style.  Note that this draws on only one journal paper among the many I read every week between class and research.  I will let readers know how I did after I get feedback.  As always, comments (on substance and style) are welcome.

Forty percent of the United States’ total carbon dioxide emissions come from electricity generation.  The electric power sector portfolio can shift toward generation technologies that emit less, but their variability poses integration challenges.  Variable renewables can displace carbon-based generation and reduce associated carbon emissions.  Two Stanford University researchers demonstrated this by developing a generator portfolio planning method to assess California variable renewable energy penetration and carbon emissions (Hart and Jacobson 2011).  Other organizations should adopt this approach to determine renewable deployment feasibility in different markets.

The researchers utilized historical and modeled meteorological and load data from 2005 in Monte Carlo system simulations to determine the least-cost generating mix, required reserve capacity, and hourly system-wide carbon emissions.  Projected 2050 cost functions and load data comprised a future scenario, which assumed a carbon cost of $100 per ton of CO2.  They integrated the simulations with a deterministic renewable portfolio planning optimization module in least-cost and least-carbon (produced by minimizing the estimated annual carbon emissions) cases.  In the simulations, carbon-free generation met 2005 (99.8 ± 0.2%) and 2050 (95.9 ± 0.4%) demand loads in their respective low-carbon portfolios.
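The Monte Carlo step can be pictured with a toy sketch: sample renewable availability for each hour, dispatch gas to cover the residual load, and collect the resulting emissions across runs.  This is a deliberately crude stand-in, not the Hart and Jacobson model; the capacities, availability draws, and gas emissions factor below are all my assumptions:

```python
import random

GAS_T_CO2_PER_MWH = 0.4  # assumed emissions factor for gas generation

def simulate_year(load_mw, wind_cap_mw, solar_cap_mw, hours=8760, seed=None):
    """One Monte Carlo draw: annual CO2 emissions when gas fills the
    gap left by randomly varying wind and solar output."""
    rng = random.Random(seed)
    emissions_t = 0.0
    for _ in range(hours):
        wind = wind_cap_mw * rng.uniform(0.0, 0.6)   # crude availability draw
        solar = solar_cap_mw * rng.uniform(0.0, 0.5)
        residual = max(load_mw - wind - solar, 0.0)  # gas covers the rest
        emissions_t += residual * GAS_T_CO2_PER_MWH
    return emissions_t

# Repeat across many sampled years to estimate the mean and spread
runs = [simulate_year(30_000, 20_000, 15_000, seed=i) for i in range(50)]
mean_mt = sum(runs) / len(runs) / 1e6
print(f"mean annual emissions: {mean_mt:.1f} Mt CO2")
```

The real method replaces the uniform draws with modeled meteorology and load data, and wraps an optimization around the portfolio itself rather than fixing the capacities in advance.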

System inputs for the 2005 portfolio included hourly forecasted and actual load data, wind speed data generated by the Weather Research and Forecasting model, National Climatic Data Center solar irradiance data, estimated solar thermal generation, hourly calculated state-wide aggregated solar photovoltaic values, hourly temperature and geothermal data, and approximated daily hydroelectric generation and imported generation.  The authors calculated 2050 load data using an assumed annual growth rate of 1.12% in peak demand and 0.82% in annual generation.
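As a sketch of that projection step, compounding those growth rates over 2005-2050 looks like this (the 2005 base values below are placeholders for illustration, not figures from the study):

```python
# Project a 2005 base value to 2050 at a fixed annual growth rate,
# as with the paper's 1.12%/yr peak-demand and 0.82%/yr generation rates.
def project(base, annual_rate, start_year=2005, end_year=2050):
    return base * (1 + annual_rate) ** (end_year - start_year)

peak_2005_mw = 60_000    # hypothetical peak load, illustration only
gen_2005_gwh = 290_000   # hypothetical annual generation, illustration only
print(f"2050 peak demand: {project(peak_2005_mw, 0.0112):,.0f} MW")
print(f"2050 annual generation: {project(gen_2005_gwh, 0.0082):,.0f} GWh")
```

Over 45 years, even these sub-2% rates compound to roughly 65% more peak demand and 44% more annual generation.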

The Monte Carlo simulations estimated the uncertainty of different system states.  As an example, the authors presented renewables’ percent generation share and capacity-factor standard deviations across all Monte Carlo representations.  The method’s outputs were the portfolio mix (e.g., solar, wind, natural gas, geothermal, and hydroelectric), installed capacities and capacity factors of renewable and conventional energy sources, annual CO2 emissions, expected levelized cost of generation, and electric load.

A range of results for different goals (i.e., low-cost vs. low-carbon), the capability to run sensitivity studies, and identification of system vulnerabilities comprise this method’s advantages.  Conversely, its disadvantages include low model transparency, a subjective definition and threshold of risk, and a requirement for modeling and interpretation expertise.

This method demonstrates that renewable technologies can significantly displace carbon-based generation and reduce associated carbon emissions in large-scale energy grids.  This capability faces financial, technological, and political impediments, however.  Absent effective pricing mechanisms, carbon-based generation will remain cheaper than low-carbon sources.  The $100 per ton of CO2 assumption in the study’s 2050 portfolio is important, considering California’s current carbon market limits, its initial credit auction price of $10.09 per metric tonne (Carroll 2012), and its $50/ton price ceiling.  To meet the projected 2050 load with renewable sources while reducing emissions, technological innovation deserves prioritization.  More efficient and reliable renewable generators will deliver faster investment returns and replace more carbon-based generators.  Improved interaction with all stakeholders during the planning phase will likely reduce political opposition.

Carroll, Rory. 2012. “California Carbon Market Launches, Permits Priced Below Expectations.” Reuters, November 19. http://www.reuters.com/article/2012/11/19/us-california-carbonmarket-idUSBRE8AI13X20121119.

Hart, E. K., and M. Z. Jacobson. 2011. “A Monte Carlo Approach to Generator Portfolio Planning and Carbon Emissions Assessments of Systems with Large Penetrations of Variable Renewables.” Renewable Energy 36 (8): 2278–2286.

NASA & NOAA: 2012 Was In Top-10 Warmest Years For Globe On Record

10:12 am in Uncategorized by WeatherDem

According to data released by NASA and NOAA this week, 2012 was the 9th or 10th warmest year (respectively) globally on record.  NASA’s analysis produced the 9th warmest year in its dataset; NOAA recorded the 10th warmest year in its dataset.  The two agencies have slightly different analysis techniques, which in this case resulted in not only different temperature anomaly values but somewhat different rankings as well.

The details:

2012’s global average temperature was +0.56°C (1°F) warmer than the 1951-1980 base period average, according to NASA, as the following graphic shows.  The warmest regions on Earth (by anomaly) were the Arctic and central North America.  The fall months had a +0.68°C temperature anomaly, the highest three-month anomaly in 2012, due to the absence of La Niña.  In contrast, Dec-Jan-Feb produced the lowest temperature anomaly of the year because of the preceding La Niña, which was moderate in strength.  The latest 12-month period (Nov 2011 – Oct 2012) had a +0.53°C temperature anomaly.  This anomaly is likely to grow larger in the first part of 2013 as the early months of 2012 (influenced by La Niña) slide off.  The time series graph in the lower-right quadrant shows NASA’s 12-month running mean temperature index.  The recent downturn (2010 to 2012) shows the effect of the latest La Niña event (see below for more) that ended in early 2012.  During the summer of 2012, ENSO conditions returned to a neutral state.  Therefore, the temperature trace (12-month running mean) should track upward again as we proceed through 2013.


Figure 1. Global mean surface temperature anomaly maps and 12-month running mean time series through December 2012 from NASA.

According to NOAA, 2012’s global average temperatures were 0.57°C (1.03°F) above the 20th century mean of 13.9°C (57.0°F).  NOAA’s global temperature anomaly map for 2012 (duplicated below) reinforces the message: high latitudes continue to warm at a faster rate than the mid- or low-latitudes.


Figure 2. Global temperature anomaly map for 2012 from NOAA.

Read the rest of this entry →

2012: Hottest Year On Record For United States

3:13 pm in Uncategorized by WeatherDem

After a brief hiatus (10 graduate school credits & TA-ing leaves no time for blogging), I’m back posting on FDL.  I expect to post much more regularly in 2013 as school activities ramp down.  More of my writing will also include a policy angle.  I want to do more to bridge the science and policy worlds in my blogging as well as in my future career.

It’s official: 2012 was indeed the hottest year in 100+ years of record keeping for the contiguous U.S. (lower 48 states).  The record-breaking heat in March certainly set the table for the record, and the heat just kept coming through the summer.  The previous record holder is very noteworthy: 2012 broke 1998′s record by more than 1°F!  Does that sound small?  Let’s put it in perspective: that’s the average temperature across thousands of weather stations covering a country over 3,000,000 sq. mi. in area for an entire year.  Prior to 2012, temperature records were broken by tenths of a degree or so.  Additionally, 1998 was the year a high-magnitude El Niño occurred, which caused global temperatures to spike to then-record values.  The latest La Niña event, by contrast, wrapped up during 2012, and La Niñas typically keep global temperatures cooler than they otherwise would be.  So this new record is truly astounding!

The official national annual mean temperature: 55.3°F, which was 3.3°F above the 20th century mean value of 52°F.


Figure 1 – NOAA Graph showing year-to-date average US temperatures from 1895-2012.

This first graph shows that January and February started out warmer than usual (top-5), but it was March that separated 2012 from any other year on record.  The heat of July further separated 2012’s year-to-date average temperature from other years.  Note the separation between 2012 and the previous five warmest years on record from March through December.  Note also that four of the six warmest years on record occurred since 1999; only 1921 and 1934 made the top five before 2012, and now 1921 will drop off that list.


Figure 2 – Contiguous US map showing state-based ranks of 2012 average temperature.

Nineteen states set all-time annual average temperature records.  This makes sense since dozens of individual stations set all-time monthly and annual temperature records.  Another nine states witnessed their 2nd warmest year on record.  Nine more states had top-five warmest years.  Only one state (Washington) wasn’t classified as “Much Above Normal” for the entire year.  The 2012 heat wave was extensive in space and severe in magnitude.

Dryness tends to accompany La Niña events across the western and central US.  This condition was present again in 2012, as the next figure shows:

Read the rest of this entry →