As part of efforts to develop a Climate Action Plan, a Los Angeles area team produced the first study assessing the effects of climate change at the scale of a metropolitan region.

Sept. 27, 2010: The hottest day on record in Los Angeles. The official weather station thermometer broke when it reached 113° F. The electrical load from the Los Angeles Department of Water and Power peaked at 6,177 megawatts, the highest in its history. As days go, was it a freak sizzler, or a harbinger of the new normal for 21st-century LA?

PHOTO: Alex Hall

Alex Hall, University of California, Los Angeles. A “very valuable resource,” Hall says of Blacklight, PSC’s system that supported about half the computations for the LA study. “Because you have a grid where computations are impacting each other, it’s very helpful to have shared-memory capability.” Hall credits PSC scientist and XSEDE consultant David O’Neal, who supported the LA team. “He was extremely knowledgeable and professional. We have problems sometimes, and to have someone like him easily accessible is very helpful.”

A coalition of municipalities, academic institutions and businesses in the Los Angeles region is facing this very tough question. Through a ground-breaking initiative in regional planning, they are working to develop a Climate Action Plan that accounts for the local effects of global climate change. To lay a credible foundation for this work, Alex Hall, a professor in UCLA’s Department of Atmospheric and Oceanic Sciences, is leading an effort in computational modeling. In June 2012 he and his colleagues released “Mid-Century Warming in the Los Angeles Region,” the first published study assessing effects of global climate change at the scale of a metropolitan region.

“This is the most sophisticated climate science ever done for a city,” said UCLA professor Paul Bunje, who directs the Los Angeles Regional Collaborative for Climate Action and Sustainability.

For computational resources, Hall relied on XSEDE, in particular PSC’s Blacklight (and, initially, Ember at NCSA), along with the National Energy Research Scientific Computing Center in Berkeley, California, and UCLA in-house computing. The project takes the results of global climate models (GCMs) and applies them at the much smaller scale of the greater Los Angeles region. Because GCM results lack the detail needed to give a clear picture at regional scale, Hall and colleagues did extensive calculations to downscale the GCM results to the local features of the Los Angeles area.

The study predicts that for the years 2041 to 2060 temperatures in the greater Los Angeles area will be higher, compared to the last 20 years of the 20th century, by an average of 4-5° F. The number of extremely hot days — temperature above 95° F — will triple in the downtown area, says the study, and quadruple in the valleys and at high elevations. “Every season of the year in every part of the county will be warmer,” says Hall. “This study lays a quantitative foundation for policy-making to confront climate change in this region. Now that we have real numbers, we can talk about adaptation.”


Zooming In: Dynamical Downscaling

The greater Los Angeles “combined statistical area,” which includes Orange County and parts of Ventura, San Bernardino and Riverside counties, is home to nearly 18 million people. Together they account for nearly $750 billion a year in economic activity. For most of the 20th century, LA was the fastest-growing region in the country, due in large part to its Mediterranean-like climate: warm-to-hot, dry summers and mild winters.

The area’s geography, a coastal basin framed by the Pacific Ocean and the surrounding mountain ranges, is part of the challenge, says Hall, of understanding the local effects of global climate change. Among other factors, the modeling must account for how these topographical features shape the winds known as the Santa Anas, which fuel some of the most furious wildfires that occur in densely populated areas.

The central challenge of the modeling is the contrast in scale and resolution of GCMs compared to the LA region. GCMs solve the equations of the atmosphere (wind, clouds, surface temperature, topography and many other factors) over a computational grid that covers the world, with cells roughly 100 kilometers on a side. Computing power, even at current petascale levels, isn’t enough for GCMs to include detailed topography of each urban region. “Even in global climate models with the highest resolution,” says Hall, “the Los Angeles region is merely a pixel.”

To bridge from the scale of GCMs to a metropolitan area, capturing information at detail fine enough to base planning on, Hall and colleagues applied an innovative two-stage approach that drew on the archived results of 19 GCMs. “These global models are done on supercomputers around the world,” says Hall, “and the output is publicly accessible.”
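That archived output is typically distributed as netCDF files. As a rough illustration of what “merely a pixel” means in practice, the sketch below pulls the single grid cell covering Los Angeles out of one such file; the filename and variable name follow common CMIP conventions but are assumptions here, not details from the study.

```python
# Minimal sketch: extract the single grid cell covering Los Angeles from one
# archived GCM output file. The filename and the variable name "tas"
# (near-surface air temperature) follow common CMIP conventions but are
# assumptions, not details taken from the study.
import xarray as xr

ds = xr.open_dataset("tas_Amon_example-gcm_historical_r1i1p1_198101-200012.nc")

# GCM longitudes often run 0-360 degrees east; downtown LA (34.05 N, 118.25 W)
# then sits near 241.75 E. "nearest" picks the enclosing grid cell.
la_cell = ds["tas"].sel(lat=34.05, lon=241.75, method="nearest")

# One value per time step for the whole cell: the entire metro area is one pixel.
print(la_cell.sizes)          # e.g. {'time': 240} for 20 years of monthly data
print(float(la_cell.mean()))  # mean temperature (K) of that single grid cell
```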



Zooming In on LA
Even in global climate models with the highest resolution, the Los Angeles region is merely a pixel (left). The mountain ranges across the region, and the local topography that defines much of the Los Angeles climate, are wiped away entirely in the global models. The grid is laid out by degrees of latitude (vertical axis) and longitude (horizontal axis), with surface altitude represented by the color scale. Downscaling brings the view in closer (center). In the third map, the inner domain of the modeling (with Los Angeles County outlined), what was only a pixel in the GCMs has the clear topographic detail that defines the region’s varieties of climate.

The first stage was a demanding computation called “dynamical downscaling.” For this step, the researchers used a regional model from the National Center for Atmospheric Research (NCAR) as their software framework. They overlaid the LA region with a fine grid, with cells two kilometers on a side, and initialized the model with regional topography and actual atmospheric data (called “reanalysis data”), including ocean-surface data, at coarse resolution from archives.
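To get a feel for the scale contrast this step bridges, the back-of-the-envelope comparison below counts grid cells over an inner domain of assumed, illustrative extent; only the grid spacings come from the article, and the actual domain dimensions used in the study may differ.

```python
# Back-of-the-envelope contrast between the GCM grid and the regional grid
# used for dynamical downscaling. The inner-domain extent is an assumed,
# illustrative value; only the grid spacings come from the article.
GCM_CELL_KM = 100       # typical GCM grid spacing
REGIONAL_CELL_KM = 2    # inner-domain grid spacing

DOMAIN_EW_KM = 300      # assumed east-west extent of the inner domain
DOMAIN_NS_KM = 200      # assumed north-south extent of the inner domain

gcm_cells = (DOMAIN_EW_KM / GCM_CELL_KM) * (DOMAIN_NS_KM / GCM_CELL_KM)
regional_cells = (DOMAIN_EW_KM / REGIONAL_CELL_KM) * (DOMAIN_NS_KM / REGIONAL_CELL_KM)

print(f"GCM cells covering the domain:      {gcm_cells:.0f}")        # 6
print(f"Regional cells covering the domain: {regional_cells:.0f}")   # 15000
print(f"Area of one GCM cell vs one regional cell: {(GCM_CELL_KM // REGIONAL_CELL_KM) ** 2}x")  # 2500x
```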

The first dynamically downscaled simulation established the “baseline,” reconstructing LA regional weather (rainfall, surface temperatures, wind directions and speed, etc.) at fine detail for 1981-2000. Results from this simulation compared well with historical data from 23 weather observation sites in the LA region, lending confidence to the approach. “We can reproduce rain events,” says Hall, “when they actually occurred going back as far as data is available.”
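A station-by-station check of this kind can be sketched roughly as follows; the files and column names are hypothetical stand-ins for the stations’ records and the co-located model output, not the study’s actual data layout.

```python
# Rough sketch of a baseline check against station observations. The CSV files
# and column names are hypothetical stand-ins for the stations' records and
# the co-located model output.
import numpy as np
import pandas as pd

obs = pd.read_csv("station_obs_1981_2000.csv", parse_dates=["date"])  # date, station, t_obs_f
sim = pd.read_csv("model_at_stations.csv", parse_dates=["date"])      # date, station, t_sim_f

merged = obs.merge(sim, on=["date", "station"])
error = merged["t_sim_f"] - merged["t_obs_f"]

print(f"Mean bias: {error.mean():+.2f} F")
print(f"RMSE:      {np.sqrt((error ** 2).mean()):.2f} F")

# Per-station skill, e.g. to see whether coastal and mountain sites are
# captured equally well.
print(merged.assign(err=error).groupby("station")["err"].agg(["mean", "std"]))
```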

With 1981-2000 as their validated baseline, the researchers then ran the model again. Drawing on output from GCMs, including an NCAR GCM for North America, this second large simulation generated a forecast of the LA regional climate for the 20-year mid-century span of 2041-2060. Each of the two dynamical-downscaling simulations used 96 of Blacklight’s processors (six blades), along with UCLA in-house computing, and each of these demanding computations took about four months to complete.

Even if the world succeeds beyond expectations in cutting back greenhouse emissions, says the model, the LA area will still experience about 70 percent of the projected warming.

“The main advantage of dynamical downscaling,” says Hall, “is that the regional numerical model produces a climate change response driven purely by its own internal dynamics. It is in no way predetermined by any assumptions about the relationship between regional climate and climate at larger scales.”

In the second stage of their strategy, the researchers applied statistical techniques, far less computationally intense than dynamical downscaling, to incorporate results from other GCMs into their mid-century forecast. This technique used parameters, derived from the dynamically downscaled baseline modeling, to represent patterns in the relationship between the fine-scale LA results and the NCAR GCM, making it possible to fold the variations of a wide range of GCMs into the LA projections. The overall outcome, presented in June as a white paper, “Mid-Century Warming in the Los Angeles Region,” is available online: http://c-change.la/pdf/LARC-web.pdf.
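As a rough illustration of the idea, and not the study’s actual statistical model, the pattern-scaling sketch below rescales the dynamically downscaled warming pattern by each GCM’s regional-mean warming; all values and array shapes are invented for illustration.

```python
# Simplified pattern-scaling illustration of the second stage, not the study's
# actual statistical model. Each GCM's fine-scale warming is approximated by
# rescaling the dynamically downscaled pattern by that GCM's regional-mean
# warming. All numbers are invented for illustration.
import numpy as np

# Fine-scale warming pattern (deg F) from the dynamical downscaling driven by
# the NCAR GCM; a toy 4x4 array stands in for the full 2-km grid.
ncar_pattern = np.array([
    [4.2, 4.4, 4.8, 5.1],
    [4.0, 4.3, 4.7, 5.0],
    [3.6, 4.1, 4.6, 4.9],
    [3.3, 3.9, 4.5, 4.8],
])
ncar_regional_mean = 4.0  # the NCAR GCM's coarse-scale warming over the region

# Coarse-scale regional-mean warming from other archived GCMs (illustrative).
other_gcm_means = {"gcm_a": 3.2, "gcm_b": 4.6, "gcm_c": 5.3}

# Rescale the fine-scale pattern for each GCM.
downscaled = {
    name: ncar_pattern * (mean / ncar_regional_mean)
    for name, mean in other_gcm_means.items()
}

# Ensemble-average fine-scale projection across all GCMs.
ensemble = np.mean([ncar_pattern] + list(downscaled.values()), axis=0)
print(ensemble.round(2))
```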


Hotter Days & More of Them

The 2041-2060 modeling presented two scenarios for LA conditions at mid-century: “business-as-usual,” with greenhouse gas emissions continuing, and a scenario with reduced emissions. The model shows that even if the world succeeds beyond expectations in drastically cutting back greenhouse emissions, the greater LA area will still experience about 70 percent of the warming in the business-as-usual scenario. “I was a little taken aback,” says Hall, “by how much warming remains, no matter how aggressively we cut back.”



Surface Warming
This graphic shows the projected warming, the difference in annual mean surface air temperature (in °F) between the 1981-2000 baseline and the 2041-2060 projection, increasing from green to red. “Note the contrast,” says Hall, “between inland and coastal warming, and stronger warming at higher elevations.”
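A map like this boils down to averaging surface air temperature over each 20-year period and subtracting; a minimal sketch follows, with hypothetical file and variable names standing in for the downscaled output.

```python
# Minimal sketch of building a warming map: average surface air temperature
# over each 20-year period and subtract. File and variable names are
# hypothetical placeholders for the downscaled model output.
import xarray as xr

baseline = xr.open_dataset("downscaled_1981_2000.nc")["t2m"]  # 2-m air temperature, K
future = xr.open_dataset("downscaled_2041_2060.nc")["t2m"]

# Annual-mean warming in deg F; a temperature difference in K equals the
# difference in deg C, so multiply by 9/5 to convert.
warming_f = (future.mean("time") - baseline.mean("time")) * 9.0 / 5.0

print(float(warming_f.min()), float(warming_f.max()))  # coastal vs. mountain extremes
warming_f.plot()  # quick look; matching the green-to-red scale needs extra styling
```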

The white paper reports that the number of hot days, when the temperature climbs above 95° F, will increase two to four times, depending on location. Temperatures that now occur only on the seven hottest days of the year will happen two to six times as often. The model’s detail supports neighborhood-by-neighborhood findings, showing, for instance, that coastal areas like Santa Monica warm less than inland neighborhoods.
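The underlying count is a simple threshold exceedance per calendar year. The sketch below illustrates it with synthetic daily-maximum temperatures standing in for one location’s baseline series, and a crude uniform shift as a placeholder for mid-century conditions; the study’s own counts come from the downscaled model output, not from a shift like this.

```python
# Sketch of an extreme-heat-day count: days per year with a daily maximum
# above 95 F. The synthetic series below is a hypothetical stand-in for one
# location; the +4.5 F shift is a crude placeholder for mid-century warming.
import numpy as np
import pandas as pd

THRESHOLD_F = 95.0

def hot_days_per_year(daily_max_f: pd.Series) -> pd.Series:
    """Count days per calendar year with a daily maximum above the threshold."""
    exceed = daily_max_f > THRESHOLD_F
    return exceed.groupby(daily_max_f.index.year).sum()

# Synthetic 20-year daily-maximum series (seasonal cycle plus noise).
idx = pd.date_range("1981-01-01", "2000-12-31", freq="D")
rng = np.random.default_rng(0)
baseline = pd.Series(
    75 + 20 * np.sin(2 * np.pi * idx.dayofyear / 365) + rng.normal(0, 5, len(idx)),
    index=idx,
)

base_counts = hot_days_per_year(baseline)
future_counts = hot_days_per_year(baseline + 4.5)

print("Baseline mean hot days/yr:   ", round(base_counts.mean(), 1))
print("Mid-century mean hot days/yr:", round(future_counts.mean(), 1))
print("Ratio:", round(future_counts.mean() / base_counts.mean(), 1))
```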

While the yearly average temperature increases, the most intense effect is in the hot months, which get hotter, with less warming in spring and winter. The modeling also shows that the inland mountains experience increases in average temperature similar to those in the desert areas, an average increase of about 5° F above the baseline.

“We’ve provided some matter-of-fact information about future conditions,” says Hall. “It’s not meant to be alarming, but to turn this into a problem to be addressed.” Along with a California statewide effort to reduce greenhouse gas emissions, several regional adaptation programs are already in place, observes Hall, such as increasing tree canopy — including an active “green roof” movement — and programs to build more parks and open space. The neighborhood-scale projections of Hall’s “Mid-Century Warming” study have also given impetus to a plan for a network of air-conditioned heat trauma centers.

In further work, the LA modelers are collaborating with fire experts to look in detail at the implications of the Santa Ana winds for wildfire. They are also investigating the LA “June gloom” phenomenon, when, in May and June, notes Hall, the marine layer spreads inland and cloudiness takes over the region. “We see that this is impacted by changing climate.” The modeling team is also delving into key questions that affect the availability of water, such as changes in snowpack and low clouds. “With supercomputing,” says Hall, “we can simulate these phenomena in detail and see why they change and assess the credibility of these changes.”
