PSC News Center

PDF File: This article is available in PDF format as it originally appeared in Projects in Scientific Computing, 2001.

Networking the Future

As a resource for networking know-how, PSC's team of engineers has few peers. They provide engineering consulting for advanced networking nationally, and they conduct seminars that disseminate knowledge to engineers around the country. In projects such as Web100, they're actively involved in technology development. They are, in short, one of the leading groups in the world shaping the networks of the future.

Through the Pittsburgh GigaPoP, a high-speed network crossroads that serves Carnegie Mellon, Penn State, the University of Pittsburgh and West Virginia University, PSC provides advanced network resources for higher education and research. There are more than 20 GigaPoPs in the United States, and Pittsburgh's is among a select group. "In terms of the number of bits we can push, our installed infrastructure is matched only at a few other places," says Gwendolyn Huntoon, who directs the PSC team.

The GigaPoP connects all four universities to Abilene, a high-performance network linking more than 170 U.S. universities and research organizations. Data zooms along the Abilene backbone at 2.4 billion bits per second, fast enough to download the complete works of Shakespeare 436 times per second. With upgrades earlier this year, the GigaPoP link to Abilene improved fourfold to 622 million bits per second. "Demand for bandwidth is constantly increasing," says Huntoon. "What makes the GigaPoP unique is that we add capacity on a regular basis so it's there before it's needed."

Many research applications depend on high-performance networks. PSC staff are collaborating with scientists at the University of Pittsburgh Health System to develop technologies for "telemedicine," such as matching patient tissue samples against a database of already diagnosed tissue. At Carnegie Mellon, research on 3-D modeling of dynamic events, similar to but more sophisticated than the instant-replay technology at this year's Super Bowl, also requires high-speed networks.

Getting in Tune with Web100

Most high-performance networks can transfer data at 100 million bits per second (Mbps) or faster. Why then do researchers who use them seldom realize rates above a few Mbps?

Good question, say network engineers at PSC, the National Center for Atmospheric Research (NCAR) and the National Center for Supercomputing Applications, who are doing something about the problem. With support from the National Science Foundation, they've mounted a research program, called Web100, to "tune" computer operating systems to better exploit available network bandwidth.

Most current operating systems have default configurations suited for low-bandwidth use, such as a home PC. But these settings often limit performance on high-bandwidth networks. And scientists shipping visualization data or interacting via a video-conferencing camera, for instance, need every available bit of performance.

The key to overcoming this limit is the Transmission Control Protocol (TCP), a "language" computers use to communicate across networks. With adjustments to TCP settings, network experts can "tune" the operating system to the network and optimize performance. Web100's goal is to eliminate the need for a human expert: the team has refined TCP software in the Linux operating system to automatically achieve the highest possible transfer rate.
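
The arithmetic behind that tuning is the bandwidth-delay product: to keep a fast, long-distance path full, TCP must have roughly one round trip's worth of data in flight, and the default socket buffers on a typical desktop hold far less than that. The sketch below, in Python, illustrates the calculation only; the 100 Mbps path, 70 millisecond round-trip time, and 64 KB default buffer are illustrative assumptions, not Web100 code or measurements.

# A minimal sketch (not Web100 software) of the arithmetic behind TCP tuning.
# Assumed, illustrative numbers: a 100 Mbps path, a 70 ms round-trip time,
# and a 64 KB default socket buffer typical of desktop systems of the era.

def max_throughput_bps(buffer_bytes, rtt_seconds):
    """TCP can keep at most one buffer's worth of data in flight per
    round trip, so throughput is capped at buffer / RTT."""
    return buffer_bytes * 8 / rtt_seconds

def required_buffer_bytes(bandwidth_bps, rtt_seconds):
    """The bandwidth-delay product: how many bytes must be in flight
    to keep the path full."""
    return bandwidth_bps * rtt_seconds / 8

if __name__ == "__main__":
    path_bandwidth = 100e6        # assumed 100 Mbps path
    rtt = 0.070                   # assumed 70 ms round trip
    default_buffer = 64 * 1024    # assumed 64 KB default buffer

    capped = max_throughput_bps(default_buffer, rtt)
    needed = required_buffer_bytes(path_bandwidth, rtt)
    print("Throughput with the default buffer: %.1f Mbps" % (capped / 1e6))
    print("Buffer needed to fill the path:     %.0f KB" % (needed / 1024))

Under those assumptions, a stock configuration tops out near 7 Mbps, in line with the "few Mbps" researchers actually see, while filling the path would take a buffer of roughly 850 KB. Making that adjustment automatically, rather than by hand, is the kind of tuning Web100 aims to make routine.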

"Our goal is to make it easier for everyone to move data across networks at 100 Mbps or higher," says Matt Mathis, PSC network research coordinator. The Web100 team distributed the initial version of their software in March 2001. Forty-nine researchers at 26 institutions - including Stanford Linear Accelerator Center, Oak Ridge National Laboratory, Lawrence Berkeley Laboratory and Argonne National Laboratory - are testing this release.

In a related project called Net100, funded by the U.S. Department of Energy, PSC collaborates with NCAR, Lawrence Berkeley National Laboratory and Oak Ridge National Laboratory. The operating system auto-tuning capability for Net100 derives from Web100.

Sidebars

  • Tuning the Visible Human
  • Robot Cameras for Super Sunday


A workshop in session at PSC.

Pittsburgh Supercomputing Center Workshops (2000-2001)

  • Parallel Programming Techniques
  • Nucleic Acid and Protein Sequence Analysis
  • Building Computing Clusters for Biomedical Research
  • Structure Determination Using Nuclear Magnetic Resonance
  • Realistic Microphysical Simulations Using MCell
  • Methods and Applications of Molecular Dynamics to Biopolymers

Biomedical Supercomputing

"For over 12 years, the Pittsburgh Supercomputing Center has provided national leadership in applying advanced computational resources to biomedical research," said Michael Marron, associate director for biomedical technology at NIH's National Center for Research Resources, last year when NCRR awarded $8.6 million to renew PSC's program in biomedical supercomputing. Since the program's inception in 1987, when it become the first extramural biomedical supercomputing programs in the country funded by NIH, PSC biomedical scientists have brought leading-edge computational resources and expertise together with experts in biology and medicine to solve important problems in the life sciences.

The program has provided access to computing resources for more than 800 biomedical research projects involving nearly 1,800 researchers in 43 states and the District of Columbia. Among these are several projects featured on this website (see Touchy Proteins, Fishy Proteins and The Road to La-La Land).

PSC's workshops on computational biology have trained more than 2,000 researchers in the use of high-performance computing for biomedical research, in such areas as sequence analysis in genome research, the structure of proteins and DNA, and biological fluid dynamics. "Our training activities reach hundreds of biomedical scientists each year," says biochemist David Deerfield, who directs the PSC program. "Techniques we've developed are helping scientists nationwide cope with the explosion of genome data."

In addition to training and access to computational resources, the biomedical group carries out research in structural biology, protein and nucleic-acid sequence analysis, computational neuroscience and microphysiology. Its researchers collaborate with scientists at many other institutions, including the University of Pittsburgh Medical School, Carnegie Mellon University, Scripps Research Institute, University of California at San Francisco, and Whitehead Institute.

More information: http://www.psc.edu/biomed/biomed.html

Sidebars

  • A Model of Nerve and Muscle
  • The Protein Family Tree
  • Brainy Simulations