In Progress, 2011

Twisted Ropes of Solar Wind

With PSC’s Blacklight, a team of physicists is visualizing a fundamental phenomenon involved in space weather that can disrupt satellites, spacecraft and power grids on Earth

A shimmering curtain of iridescence in the night sky, the northern lights, or aurora borealis (“northern dawn”), and their southern sister, the aurora australis, rank among Mother Nature’s more awe-inspiring spectacles. Scientists now know that the auroras result from the solar wind, ionized particles blasted into space by the sun, crashing into Earth’s magnetic field.

Some of these same events that create beautiful light shows have a dark side — the ability to play havoc with electronics. They’ve knocked out satellites and, on occasion, caused power blackouts on Earth. At the detailed level of physics, all these events, including the auroras, occur due to a phenomenon called “magnetic reconnection” — what happens when Earth’s magnetic field lines break and reconnect in ways that allow solar particles to penetrate deeply enough to cause trouble.

“Magnetic reconnection is a physical process that is prevalent throughout the universe,” says physicist Homa Karimabadi at the University of California, San Diego. “It’s the predominant mechanism that fractures the Earth’s protective magnetic shield exposing us to the effects of solar activity.”

Karimabadi works with a team of physicists who have used petascale supercomputing to carry out, for the first time, realistic 3D simulations of magnetic reconnection. With their very large-scale simulations using Kraken, the powerful XSEDE resource at NICS in Tennessee, and another system, Karimabadi and his collaborators have been able to characterize, with much greater realism than was previously possible, how turbulence within sheets of electrons generates helical magnetic structures called “flux ropes” — which physicists believe play a large role in magnetic reconnection.

Karimabadi is using PSC’s Blacklight to visualize their recent simulations. “Our simulations produce a huge amount of data,” says Karimabadi. “One run can generate over 200 terabytes. Blacklight’s shared-memory architecture is critical for analysis of these massive data sets.” The results of their study are important for NASA’s upcoming Magnetospheric Multiscale (MMS) mission to observe and measure magnetic reconnection.



Flux Rope in a Current Sheet
This 3D graphic, from a visualization on Blacklight, shows magnetic-field lines (intensity coded by color, blue through red, negative to positive) and associated tornado-like streamlines (white) of a large flux rope formed, notes Karimabadi, due to “tearing instability in thin electron layers.” Karimabadi and colleagues reported their findings in Nature Physics (April 2011).

Shutting the Door on HIV

With PSC resources, an NIH-funded team is making progress toward finding a therapeutic drug that can deliver a knockout punch to AIDS

AIDS-related research has led to powerful antiviral drugs that increase life expectancy for people infected with the human immunodeficiency virus (HIV). Even at their most successful, however, these drugs aren’t a cure-all. The United Nations estimates that 33 million people worldwide were living with HIV at the end of 2009, up from 26 million in 1999, and that AIDS claimed 1.8 million lives in 2009. For researchers, the quest remains not only to find therapeutic agents that manage the disease, but to stop HIV infection before it begins.

To that end, computational chemist Judith LaLonde of Bryn Mawr College works with a multi-faceted team of researchers to develop “inhibitors” — therapeutic drug compounds — that can prevent HIV from gaining entry to cells. “Most HIV therapeutics target replication,” says LaLonde, “or integration of the genetic material, but there’s very few inhibitors that target the first stage, which is recognition and entry of the virus into the human cell.”


Stopping HIV Entry
This image represents the structure of gp120 (purple), a three-part (trimer) HIV protein, with a bridging sheet (left), an outer domain (right), and an inner domain that forms a cavity during binding with host-cell CD4 receptors (not shown). Here the protein is shown bound with an inhibitor molecule discovered through LaLonde’s virtual screening.

The NIH-funded team includes virologists, chemists and crystallographers, with LaLonde as its computational specialist. Their focus is a protein called gp120 (“gp” for glycoprotein) on the surface of HIV-1. As part of the initial encounter of HIV with a host cell, gp120 binds to a receptor protein, called CD4, on the host cell’s surface. Studies show that a large cavity forms inside gp120 as it binds to CD4. LaLonde’s objective is to computationally identify compounds that can bind in that gp120 cavity and prevent it from binding with CD4, shutting the door on infection.

Using PSC’s Warhol system, a 64-core Hewlett-Packard cluster, LaLonde runs a “virtual screening” program called ROCS, which uses shape-based similarity matching of small-molecule compounds. Her starting point in recent work was a compound, called NBD-556, shown to have potential as an inhibitor of gp120-CD4 binding. With ROCS and Warhol, she searched for compounds that matched structural features with NBD-556. “With the cluster at PSC,” says LaLonde, “I was able to screen eight million commercially purchasable compounds from the ZINC database in 24 hours.”
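ROCS is a proprietary program, so its interface isn’t reproduced here. The minimal Python sketch below only illustrates the general shape of such a screen: splitting a large compound library into chunks and scoring them in parallel across the cores of a machine like Warhol. The file name, the 0.7 cutoff and the similarity_to_reference() scorer are hypothetical placeholders, not part of LaLonde’s actual workflow.

    # Minimal sketch of a parallelized similarity screen; NOT the ROCS program.
    # similarity_to_reference(), the 0.7 cutoff and "zinc_subset.smi" are placeholders.
    from multiprocessing import Pool

    def similarity_to_reference(compound):
        """Placeholder scorer: a real screen would compute a 3D shape-overlap
        (Tanimoto-style) score against the query compound, e.g. NBD-556."""
        return 0.0  # stub

    def score_chunk(chunk):
        # Score one block of candidates and keep only the strong matches.
        hits = []
        for compound in chunk:
            score = similarity_to_reference(compound)
            if score > 0.7:
                hits.append((compound, score))
        return hits

    def screen(library, n_workers=64, chunk_size=10_000):
        # Split the library into chunks and farm them out to worker processes,
        # one per core of a 64-core machine.
        chunks = [library[i:i + chunk_size] for i in range(0, len(library), chunk_size)]
        with Pool(n_workers) as pool:
            results = pool.map(score_chunk, chunks)
        hits = [hit for part in results for hit in part]
        return sorted(hits, key=lambda h: h[1], reverse=True)

    if __name__ == "__main__":
        with open("zinc_subset.smi") as f:          # hypothetical file of SMILES strings
            library = [line.strip() for line in f]
        top_hits = screen(library)[:100]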

Her work identified a subset of close matches to NBD-556. With further analysis, the research team narrowed this group of candidates and synthesized those with the most potential. “We discovered new analogues,” says LaLonde, “with biological profiles that are improved from where we started. With this larger repertoire, we can ask ‘Which compounds work better and why? And which analogues are better platforms for continued synthetic optimization in blocking gp120-CD4 binding and viral entry?’” To further advance this work, LaLonde plans to move her virtual-screening approach to Blacklight, PSC’s large shared-memory system.



Jukebox with a Brain

With PSC’s Blacklight on their side, a team of machine learning scientists came in near the front of the pack in a prestigious international competition

Algorithms that allow computers to learn, that is, to change their behavior based on information they encounter, define the branch of artificial intelligence called “machine learning.” The world’s first Department of Machine Learning is at Carnegie Mellon University, where post-doctoral researcher Danny Bickson focuses his work on developing machine-learning software called GraphLab. A project initiated by Carnegie Mellon professor Carlos Guestrin, GraphLab is open-source software for solving large machine-learning problems with parallel computing. Recently released to the scientific community, it has about 1,600 installations around the world.

During the past year, with help from PSC scientist and XSEDE consultant Joel Welling, Bickson customized GraphLab to run efficiently on PSC’s Blacklight system (p. 4), which provided computational firepower for an annual, worldwide machine-learning competition, sponsored by the Association for Computing Machinery, called the KDD Cup. In collaboration with a team from the Chinese Academy of Sciences — named LeBuSiShu — and graduate student Yucheng Low from Carnegie Mellon, Bickson and GraphLab came in fifth among more than 1,000 teams, ahead of IBM, AT&T and many other well-known computer companies and universities.

The competition involved predicting how much people will like songs, based on how they rated other songs. Part of the challenge was the huge dataset: more than 260 million ratings from the Yahoo! Music service, covering 625,000 songs rated, on a scale of 1 to 100, by a million different listeners. Each rated song has associated information: album and artist, and one or more genres. The LeBuSiShu team’s approach involved running 12 different predictive algorithms, with a total of 53 tunable parameters, and merging the results from all 12 into a final prediction.
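As a rough illustration of that merging step, one simple approach is to fit blend weights for the individual algorithms on a held-out validation split. The NumPy sketch below assumes a plain least-squares blend, which may differ from the weighting scheme the LeBuSiShu team actually used.

    import numpy as np

    def fit_blend_weights(preds_val, y_val):
        """preds_val: (n_ratings, n_models) array, one column per algorithm's
        predictions on a validation split; y_val: the true ratings."""
        # Plain least-squares blend; a contest-grade blend might add constraints
        # or regularization, which this sketch omits.
        w, *_ = np.linalg.lstsq(preds_val, y_val, rcond=None)
        return w

    def blend(preds_test, w):
        # Weighted combination of the individual models' test-set predictions.
        return preds_test @ w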

Another challenge involved the taxonomic relationships among the songs, based on their artist, album and genre connections. Bickson’s approach employed a novel method called Matrix Factorization Item Taxonomy Regularization (MFITR). Predictions were tested against a portion of the Yahoo! dataset that was held out from the competition, and MFITR by itself produced the second-best prediction result among the dozen algorithms Bickson’s team ran via the GraphLab framework.
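The full MFITR method isn’t spelled out here. As a simplified stand-in, the sketch below trains ordinary matrix factorization by stochastic gradient descent and adds a crude taxonomy-style penalty that pulls each song’s latent factors toward its artist’s factors. The data layout, variable names and form of the regularizer are illustrative assumptions, not the published algorithm.

    import numpy as np

    def train_mf(ratings, song_artist, n_users, n_songs, n_artists, k=20,
                 lr=0.005, reg=0.02, tax_reg=0.01, epochs=10):
        """ratings: iterable of (user, song, value) triples.
        song_artist: maps each song index to its artist index, a crude stand-in
        for the full artist/album/genre taxonomy."""
        rng = np.random.default_rng(0)
        U = 0.1 * rng.standard_normal((n_users, k))    # user factors
        V = 0.1 * rng.standard_normal((n_songs, k))    # song factors
        P = 0.1 * rng.standard_normal((n_artists, k))  # artist ("parent") factors
        for _ in range(epochs):
            for u, i, r in ratings:
                a = song_artist[i]
                err = r - U[u] @ V[i]
                u_row, v_row = U[u].copy(), V[i].copy()
                U[u] += lr * (err * v_row - reg * u_row)
                # ordinary SGD update plus a pull toward the artist's factors
                V[i] += lr * (err * u_row - reg * v_row - tax_reg * (v_row - P[a]))
                P[a] += lr * tax_reg * (v_row - P[a])
        return U, V

    def predict(U, V, u, i):
        return float(U[u] @ V[i])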

“For each algorithm,” says Bickson, “there can be many, many runs — as many as you can — to fine tune different parameters and find what works best. When you have more computing power, you can tune your algorithms faster, and you get better results in the short time frame of the contest.” Most of the competing teams, Bickson notes, relied on serial processing on several different machines with large groups of researchers doing the computing. With Blacklight on his team, Bickson and GraphLab more than held their own.

What Shapes the Wind?

A team of scientists is using computational modeling to help develop a Climate Action Plan for the Los Angeles region

“Los Angeles weather is the weather of catastrophe, of apocalypse, and, just as the reliably long and bitter winters of New England determine the way life is lived there, so the violence and the unpredictability of the Santa Ana affect the entire quality of life in Los Angeles, accentuate its impermanence, its unreliability. The wind shows us how close to the edge we are.”
Joan Didion, Los Angeles Notebook

Alex Hall, a professor in UCLA’s Institute of the Environment and Sustainability, is interested in the uncertainties of global climate change and, more specifically, in how climate change may have regional effects. This means thinking about factors not well accounted for in large-scale global climate models. In the Los Angeles area, where Hall lives and works, for instance, it means taking into account how the coastal Pacific Ocean affects conditions over the rugged Sierra mountains not far inland — natural features that, among other effects, shape winds, known as the Santa Anas, that have fueled some of the most furious wildfires to occur in densely populated areas.

Hall works with a coalition of government, universities and private concerns whose aim is to develop a Climate Action Plan that considers some of the “What if?” questions involved with climate change in the Los Angeles region. “The Santa Ana phenomenon, among other factors, isn’t represented at all in the coarse resolution global models,” says Hall. “To recover that phenomenon you have to regionalize the simulation at high resolution. Studying the dynamics of these phenomena scientifically, understanding the climate system at these smaller scales, is a very important objective.”


Southern California Wind in the Summer of 2002
This graphic compares mean wind direction (arrows) and speed (magnitude decreasing from red to blue) over Southern California as measured by satellite (left) with the results of the team’s coupled ocean-atmosphere model (right).

Recent results from modeling by Hall and colleagues show that the connection between sea-surface temperature and winds at high resolution must be taken into account in considering the regional effects of climate change. They have used PSC’s Blacklight, among other computing resources, and solved a series of challenges in running computational experiments that integrate the effects of the coastal ocean with those of inland topography. At a resolution of two kilometers (compared to the roughly 100-kilometer resolution of global models), their work captures the interplay between surface temperatures of the ocean and inland mountains in creating wind and rain conditions. “In the case of California,” says Hall, “we can reproduce rain events when they actually occurred going back as far as data is available.” In ongoing work, they expect to investigate some of the uncertainties associated with Los Angeles regional climate change scenarios, including not only the Santa Ana winds, but also critical questions associated with availability of water resources.

© Pittsburgh Supercomputing Center, Carnegie Mellon University, University of Pittsburgh
300 S. Craig Street, Pittsburgh, PA 15213 Phone: 412.268.4960 Fax: 412.268.5832
