“We’ve found that a lot of network users either have unrealistically high or unrealistically low expectations.”
—Chris Rapier, PSC network applications engineer

Chris probably didn’t notice the slight pause as I took that in, composed myself for the next interview question.

You see, Chris was talking about me.

There’s a truism about PR writers that we come to think we can do anything our employers can. Case in point: while working for a Large Hospital, many moons ago, I helped interview a nicely experienced job candidate who promised us, “I never forget I’m not a doctor.” Walking out to lunch with my boss, I observed, “I forget I’m not a doctor all the time.”

Not so much at PSC. I know I could never do the jobs the people I write about do, because I have absolutely no patience with machines. (My co-workers can vouch for the occasional outbursts of sailor talk emerging from my office.)  I think that, as a network user, I somehow simultaneously meet both of Chris’ characterizations.

Yesterday brought the third of a trilogy of press releases we’ve recently issued concerning PSC’s second-to-none networking group and its leadership role in creating and managing XSEDE’s high-speed, research-only Internet2 network (as well as the commodity connections that help us trade email and other ancillary Web traffic). The news is about DANCES, which will help create virtual “HOV lanes” that carry the really Big Data Internet2 transfers (we’re talking entire-Library-of-Congress-scale amounts of data) so they’re not slowed or disrupted by the ebb and flow of other Internet2 traffic.

PSC's Big Networking Projects, at a Glance

The DANCES release follows one on Web10G, the ongoing effort to jailbreak TCP/IP’s cloistered—if egalitarian—inner workings so that users can monitor their own connections and spot network glitches. The idea is that they can then notify their administrators, and develop more realistic expectations along the way. (Web10G was the subject of my talk with Chris.) Earlier still, we reported on our role in migrating XSEDE’s networking to Internet2.
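
A quick aside for the technically inclined: what does a “realistic expectation” even look like? A classic rule of thumb, the macroscopic TCP model from Mathis and colleagues, says sustained TCP throughput is roughly capped at MSS / (RTT * sqrt(loss)). Here’s a back-of-the-envelope Python calculation of my own, not Web10G code, showing why even a whisper of packet loss matters on a long-haul path:

    # Rough ceiling on steady-state TCP throughput (Mathis et al., 1997):
    #   throughput <= (MSS / RTT) * (C / sqrt(loss_rate))
    # My own illustration of the idea; it is not part of Web10G.
    from math import sqrt

    def tcp_throughput_ceiling_bps(mss_bytes=1460, rtt_s=0.070, loss_rate=1e-4, c=1.0):
        """Approximate upper bound on TCP throughput, in bits per second."""
        return (mss_bytes * 8 / rtt_s) * (c / sqrt(loss_rate))

    # A cross-country path (70 ms round trip) losing just 1 packet in 10,000
    # tops out around 17 Mbps, no matter how fat the link underneath is.
    print(f"{tcp_throughput_ceiling_bps() / 1e6:.1f} Mbps")

That gap between what the wire could carry and what a lossy connection actually delivers is exactly the sort of thing Web10G’s per-connection statistics are meant to make visible.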

The Bigger Picture, of course, is all about Big Data. Yeah, yeah, the term has been bandied about to near-meaninglessness.  But the fact remains that researchers are now generating volumes of data that challenge networking hardware and software as never before. And that PSC is leading the way in creating the networking we’ll need to keep … well, working.

Posted in Bioinformatics, HPC Research, Networking, People, XSEDE

News that the 2016 Summer Olympics will take place under the shadow of Dengue Fever brings to mind a story from my summer of 1975.

I spent much of that season backpacking, horseback riding and climbing across Colorado with Boy Scout Troop 176 of Montvale, N.J. Our scoutmaster, Tom, called these trips “superadventures,” and they were—I learned lessons in woodcraft, leadership and plain ol’ self-reliance that have served me well to this day.

One of those lessons was about the fragility of the human body. Tom had picked up malaria during his tour in Vietnam with the Special Forces. During our pack-horse trip in the mountains, the malaria parasites in his system became active, felling him like a tree under the axe. He spent much of that part of the trip in his tent, fever-dreaming and being tended by the adults and the senior scouts.

This made no small impression on us: We loved Tom and were terrified of him in equal measure. Tough, just and virtuous to the point of near-impossibility, he seemed to stand seven feet tall, and while not a loud man, he could employ a certain warning tenor in his voice that made you listen. To see him laid low by a microbe was a life lesson indeed.

Kids have an incredible ability to shrug off things that scare the bejeezus out of us adults; I don’t recall being particularly worried. But today, I’m frustrated that, over a hundred years after the Spanish-American War taught the U.S. military a lesson in tropical disease it has never forgotten, these ancient enemies still threaten a large part of the human race.

Somebody ought to do something, as they say. Well, PSC’s Public Health Group is doing something about it.

  • Our CLARA agent-based model is simulating the interactions between humans and mosquitoes in Brazil (site of the 2016 Summer Games), Thailand and Australia. The idea is to test methods for breaking the mosquito-host chain that enables such “vector-borne” diseases to spread.
  • In Benin, India, Mozambique, Niger, Senegal, Thailand and Vietnam, the HERMES collaboration between PSC, Johns Hopkins University and the University of Pittsburgh is modeling vaccine distribution networks, identifying under-appreciated bottlenecks in getting shots to the people who need them. Once identified, these bottlenecks are sometimes relatively easy to fix.
  • In collaboration with the University of Notre Dame, PSC is creating the electronic infrastructure supporting VecNet, a Web-based clearinghouse of ideas and methods that will enable researchers, clinicians, aid agency personnel and government decision makers to share data and test tools for fighting malaria in the Solomon Islands, Kenya, and elsewhere.
  • Nor are we neglecting the home front. When deployed, Apollo will create a Web-based electronic bridge that allows otherwise incompatible public health models to communicate with each other through a common language known as an ontology. Grist for this mill includes PSC’s FRED and FluTe, programs that model disease spread through a separate electronic “agent” for every human in a population (a toy sketch of that agent-based idea follows this list), and GAIA, which can take that information and display it geographically. The common display of multiple types of information should offer public health officials new insights into halting disease spread.
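
To make “agent-based” concrete, here’s a deliberately tiny sketch I wrote for illustration, one software agent per person. It is nothing like the real FRED, FluTe or CLARA code; it only shows the general shape of the idea:

    # Toy agent-based epidemic: one object per person. Parameters are invented
    # and the model is deliberately crude; it only illustrates the approach.
    import random

    random.seed(42)

    class Person:
        def __init__(self):
            self.state = "S"        # S = susceptible, I = infectious, R = recovered
            self.days_sick = 0

    people = [Person() for _ in range(1000)]
    for patient_zero in random.sample(people, 5):
        patient_zero.state = "I"

    CONTACTS_PER_DAY = 10
    TRANSMISSION_PROB = 0.03
    INFECTIOUS_DAYS = 7

    for day in range(60):
        infectious = [p for p in people if p.state == "I"]
        for p in infectious:
            # each sick agent bumps into a few random others today
            for other in random.sample(people, CONTACTS_PER_DAY):
                if other.state == "S" and random.random() < TRANSMISSION_PROB:
                    other.state = "I"
            p.days_sick += 1
            if p.days_sick >= INFECTIOUS_DAYS:
                p.state = "R"
        print(day, sum(p.state == "I" for p in people))

Scale that loop up to every resident of a region, give each agent a household, a school or workplace and a daily routine, and you have the flavor of what the real models do.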

The common factor in all these projects is that the sheer volume of epidemiological and public health data is too massive to understand without some serious technological help, and that different actors in the public health sector may be holding pieces that others could use to great effect. Seeing the problems and the opportunities, and sharing that information with someone who has a tool you may not have, represents one of our best hopes for eradicating a number of deadly diseases—not just controlling them, but making them no more than a scary story from our past.

Posted in Bioinformatics, General, HPC Research

I come neither to bury the Google search engine nor to praise it — it needs neither, and to shift from misquoting the Bard to misquoting the Emancipator, my words can have little effect on it either way. It’s a simple fact of life, though, one I think everyone can agree on: Web-based searching has made the way we look for information unrecognizable to earlier generations.

That last bit I had to say a little under my breath: I am that earlier generation. I remember this thing called Biological Abstracts, which was a primitive artifact that lived at this place called a library. BA, as strange as it may seem, was made of dead trees, with letters of carbon black stamped on it. It helped you find research papers. You’d look up a topic by thumbing through these things called “sheets” and, for a given span of dates, it would present a list of references to the papers that had appeared on that topic. Then you’d have to look through the library for other dead-tree-carbon-black artifacts if you wanted to read the papers.

The precision of this instrument in some hands (mine, anyway) can be seen in the fact that, in the Spring of 1984, I managed to write an immunology class paper on AIDS without discovering the existence of HIV.*

Not that anyone would call modern search engines perfect. When we’re trying to get accurate information, they often throw marketing, propaganda or even purposeful misinformation at us. And if our words aren’t perfectly tuned to the metadata that define web copy, we can wind up very far afield. (To clean up a rude joke I heard, “On the Web you’re never more than three clicks away from something you don’t want your children to see.”)

So. Call us greedy. We want the vast power of Web searching, with far more specificity. Well, considering the venue, you won’t be surprised to hear me suggest that HPC may be part of the answer.

On Sept. 4, David Woolls, president of CFL Software, will come to PSC to talk about the work he’s doing using his new program, CFL Discover, on our Sherlock machine.

Sherlock, as we’ve said before, is a YarcData Urika “data appliance” designed specifically to search unstructured, arbitrary networks for information. Yarc calls it by that unusual term to stress that the base model isn’t a supercomputer in the usual sense, because you don’t program it. It’s already set up to do the searching: you give it the data and the SPARQL search terms, and it does the rest. But PSC’s folks worked with Yarc to customize Sherlock with GPUs, to give it additional flexibility, including programmability. And even better, because it isn’t a massively distributed machine that requires you to tell it how to split a problem into tens of thousands of little parallel pieces, you can program it much like you would your desktop computer.
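
I won’t pretend to know Urika’s innards, but for the curious, “giving it the data and the SPARQL search terms” looks roughly like this. The sketch below uses Python’s rdflib as a humble stand-in for the appliance, and the tiny dataset and query are invented:

    # A taste of graph search with SPARQL, using Python's rdflib as a
    # stand-in for the Urika appliance. The data and query are made up.
    from rdflib import Graph

    g = Graph()
    g.parse(data="""
        @prefix ex: <http://example.org/> .
        ex:paper1 ex:cites ex:paper2 .
        ex:paper2 ex:cites ex:paper3 .
        ex:paper1 ex:topic "malaria" .
        ex:paper3 ex:topic "malaria" .
    """, format="turtle")

    # Which malaria papers does a paper reach through a chain of two citations?
    results = g.query("""
        PREFIX ex: <http://example.org/>
        SELECT ?start ?end WHERE {
            ?start ex:cites ?mid .
            ?mid   ex:cites ?end .
            ?end   ex:topic "malaria" .
        }
    """)
    for row in results:
        print(row.start, "indirectly reaches", row.end)

The appliance’s trick isn’t the query language; it’s chasing those relationship hops through enormous, irregular datasets fast enough to be useful.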

The ability to run Java-based Discover — very unusual in a supercomputer^ — comes with that setup, and that’s what Woolls is trying to take advantage of. Discover, he told me when we were preparing a press release on the collaboration, is meant to automate the tough next step of the search process — one that we all do manually today: Read the content to see if it’s really what you were looking for.

“In essence we take over where search engines stop,” he said. “We take those documents and read them for you.” The added kick is that the program will not only help you fish out relevance — it can also make connections to related content that you didn’t think to ask for, but which is on target for your needs. Sherlock will kick the testing of Discover into overdrive, expanding on CFL’s earlier work with U.S. Patent records and Wikipedia.

“The aim is to increase the amount of data that we’re going through,” Woolls said. “We’ll test out existing queries to see what kind of response is coming out, then consider other queries which we can answer — potentially, other ways … which we haven’t been able to explore so far.”

I’m looking forward to his talk. That’s 2 p.m. on Sept. 4; it’s an open talk, but RSVP early; seating is limited.

*For the sticklers, the Gallo “HTLV-3” paper appeared in May of that year, and the Montagnier “LAV” paper in 1983, so although I was cutting it close I should have had a shot at it.
^PSC’s Blacklight has that capability too, BTW.
Posted in General, HPC Research, People

Gathering dust on what I imagine to be a dimly lit shelf in Widener Library, Cambridge, Massachusetts, is a biochemistry dissertation I’m sure nobody has ever taken down and read.

It’s a shame. After all these years, I’m not concerned with its overall significance. But I’d bet a casual reader, pulling it off the shelf and blowing the detritus from it, would get a hoot out of the dedication page, including this quote (from Tolkien):

He that breaks a thing to find out what it is has left the path of wisdom.

Bit of an inside joke. But here I was, reducing the disulfides that hold together the halves of the insulin receptor molecule so I could deduce something about its structure.* Literally, breaking it to find out what it is.

Any home mechanic or budding IT tech can tell you:  Any idiot can take something apart. You have to be on the ball, though, to put it back together.

Nerve cell structure in the brain’s visual cortex, reconstructed from electron micrographs of ultra-thin sections.

Which brings us to my belated re-entry into PSC blogging, with a nod to Aaron Dubrow, my opposite number at the Texas Advanced Computing Center, who covered a keynote speech at the XSEDE13 conference by Terrence Sejnowski, Director of the Computational Neurobiology Laboratory at the Salk Institute for Biological Studies, about the NIH’s Brain Initiative.

The Brain Initiative, Sejnowski said (I was there too, though I covered other talks for XSEDE’s External Relations group), is meant to be the moon shot of our generation, harnessing HPC to really understand the brain and how it works. I’ve been around long enough to remember the Moon Shot, capital em ess, and from that tender-age experience I can only say: Go. Go. Go.

But enough of channeling a particular seven-year-old who could name the parts of the Apollo spacecraft (including the Saturn V booster) and just about every dinosaur at the Museum of Natural History in Manhattan. If the Brain Initiative is to be our generation’s moon shot, we’ve got a heck of a job ahead of us.

The work began some time ago; with luck, the attention and money of the new program will accelerate it. For PSC’s slice of it, our National Resource for Biomedical SuperComputing has been working with a group at Harvard that’s using HPC resources to knit electron micrograph images of ultra-thin slices of brain tissue back together, computationally, into their original structure in the living brain. It’s not easy; the process of slicing the tissue creates wrinkles and other distortions, which have to be corrected in the computations. And that’s not the end of it: the group then correlated those reconstructed structures with the activity of the live brain, as shown by two-photon calcium imaging. Pairing the structure with the function gave a new perspective on how nerve cells in the visual cortex process information.
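
I’m out of my depth on the Harvard group’s actual pipeline, but the heart of the “knitting” step, figuring out how one section lines up with the next, can be sketched in a few lines of Python. This is my own minimal illustration using FFT cross-correlation on two already-loaded grayscale sections; the real reconstruction has to handle wrinkles, tears and warps, not just a rigid shift:

    # Minimal flavor of section-to-section alignment: find the (row, column)
    # displacement of one slice relative to the next by FFT cross-correlation.
    # Illustration only; real pipelines correct nonrigid distortions too.
    import numpy as np

    def displacement(section_a, section_b):
        """Return (dy, dx): how far section_b is shifted relative to section_a."""
        A = np.fft.fft2(section_a - section_a.mean())
        B = np.fft.fft2(section_b - section_b.mean())
        corr = np.fft.ifft2(np.conj(A) * B).real
        dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
        # fold wrap-around indices back to negative shifts
        if dy > section_a.shape[0] // 2:
            dy -= section_a.shape[0]
        if dx > section_a.shape[1] // 2:
            dx -= section_a.shape[1]
        return int(dy), int(dx)

    # quick self-test: a synthetic "section" shifted by (3, -5) is recovered
    rng = np.random.default_rng(0)
    a = rng.random((128, 128))
    b = np.roll(a, shift=(3, -5), axis=(0, 1))
    print(displacement(a, b))   # prints (3, -5)

Do that for thousands of sections, at full resolution, with real distortions thrown in, and you can see why it’s supercomputer work.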

This is the gig for MMBioS,^ the National Center for Multiscale Modeling of Biological Systems, which we’re undertaking with CMU, Pitt, and the Salk Institute (Terry Sejnowski, by the way, is a project co-leader). It aims, ultimately, at nothing less than knitting together biological systems — the brain being an initial focus — from the behavior of individual atoms to brain functions, such as recognizing the orientation of an object.

In my day, we sketched out pictures of arrows between proteins in the cell, showing how signaling pathways were supposed to work. Today we realize it’s a network of interactions that can’t be sketched — it has to be simulated in a computer to be understood. MMBioS’s mission is that phenomenon on steroids: a heck of a putting-back-together problem. Obviously, making the connections between the atomic, molecular, subcellular, cellular, and tissue levels is going to have to be done in steps. But the fact that this task is imaginable with technology that is with us, or soon to be developed — that in 10 years, modeling a mouse brain in its entirety isn’t out of the question — is simply stunning.

Exciting times, my friends.

*Wrap your head around the primitiveness of the protein biochemistry I was doing.  I dare you.
^Say it with me: “mmmmmmBIos.”  (Sorry, Markus.)
Posted in Bioinformatics, General, HPC Research, People, XSEDE

There’s a flurry of activity right now on the normally quiet 4th floor where my office is.  Curious to find out what’s going on, I asked Pallavi Ishwad, in charge of outreach activities for PSC, whose office is a couple doors down from mine.  “We’re preparing for the MARC interns and workshop,” she said. Oh, now I remember!  Every year in June PSC brings in students from minority serving institutions (MSIs) all over the country to participate in this program.  PSC’s MARC program, led by Hugh Nicholas, is funded by the National Institutes of Health Minority Access to Research Careers Program. The overarching goal of the PSC program is to help establish and integrate bioinformatics in graduate and undergraduate courses at these institutions.

During the summer, from June 3 through August 2, eight students (four men and four women) will participate in an intense nine-week program, not only learning how to execute their research projects, or “do research,” but also, just as importantly, building the non-science skills needed to write peer-reviewed papers and journal articles and to present scientific work at national and international conferences.

I went to the right person to ask about all this because, as it turns out, Pallavi is taking a more active role in helping to mentor the interns. The program’s mentors make sure, among other things, that the interns stay on track with their projects. Once a week the students present their progress to the staff; it’s during these sessions that problems can be caught early and students can, if necessary, refine the goals of their specific projects.

I was curious to find out what projects these interns would be working on. I was even more curious about how their projects were selected. Well, the specifics vary, and I admit my eyes did glaze over when I asked the project’s co-investigator, Alex Ropelewski, and he mentioned RNA, Trinity and guinea fowl in the same sentence. TMI for this blog! I did, however, glean from our conversation that Trinity is the name of a software program some of the interns will be using to assemble their sequence data.

Throughout the year, leading up to this summer program, Alex, Hugh, Pallavi and other project staff travel to and work with liaisons at five MSIs—Tennessee State University, North Carolina A&T State University, Johnson C. Smith University, University of Puerto Rico at Mayaguez, and Jackson State University—developing bioinformatics under the grant. The summer internship program is open to students at other MSIs as well and this year includes interns from the University of Texas at El Paso and Morgan State University.

One example is a project being researched by incoming intern Aryan Vahedi-Faridi from Morgan State University. He has data from an experiment that compares the effect of sleep-deprivation stress on gene expression in the brain, and he hopes to analyze this data during his internship. He wants to understand how different signaling pathways are affected by this type of stress, and to relate those findings to sleep deprivation and, perhaps, to treating its symptoms. Another intern, Kayla Hinson from the University of Texas at El Paso, wishes to expand her work on the population biology and genetics of freshwater invertebrates. Her focus will be on arsenic detoxification pathways and identifying potential detoxifying enzymes in an aquatic model organism. You get the idea. This is serious. And intense! And I now know way more than I intended about correlation values, splicing variants and microarrays.

I’ll be following along this summer to see how they are progressing. I might even attend one or two of the weekly sessions so I can report back to you later on how the program is going. So stay tuned!


Posted in Bioinformatics, General, Outreach, People, Training

Hepburn and Bogart delivered more than the sum of their talents in "The African Queen" -- so must the stakeholders in VECNet, to conquer malaria.

I’m just gonna fess up and admit that the story I’ve been telling the PSC VECNet Cyber-Infrastructure crew is probably a myth. Director John Huston and Humphrey Bogart probably didn’t stave off malaria while filming “The African Queen” on location in Uganda and the Congo by keeping themselves blotto on gin and tonics.

It has some verisimilitude. Bogart and Huston, famous drinkers, did avoid contracting the disease. And tonic water contains quinine, the world’s first effective pharmaceutical malaria preventive.*

But the story falls apart on closer examination. Bogart, Wikipedia tells us, credited dodging the dysentery bullet to “the large supply of whiskey he had brought along with him.” Anjelica Huston, too, quoted them as drinking Scotch, not G&Ts: “Whenever a fly bit Huston or me, it dropped dead,” Bogart later told her.

Never mind all that. The more interesting part of the story, at least from the point of view of VECNet and its CI project, is that the incomparable Katharine Hepburn, disgusted by the boys’ functional alcoholism, eschewed the booze for local water — and came down with dysentery as a reward for her virtue.

Hepburn and Huston didn’t get each other at first. Vastly different people, these two geniuses needed some time to work around their different ways of looking at the world, of working. It’s our gain that they stuck it out; the combination of their talents produced something epic, vastly greater than its already considerable sum.

So, too, VECNet. What our PI, Nathan Stone, and collaborator Greg Madey, PI at the University of Notre Dame, are aiming for is nothing less than a plug-and-play simulator that allows people with minimal IT training to ask the question, “What happens to malaria incidence if I do X?” and get a statistically valid answer. One of high enough quality to make real-life decisions on what research to conduct, which products to develop, what policies to adopt, and whose projects to fund. And, importantly, to identify which other VECNet users have other pieces of the puzzle, spurring collaborations.

“The larger VECNet community has been at it for over three years now, assembling some of the necessary models and data,” Nathan tells me. “This year’s challenge is all about integration and execution — like assembling a race car with parts from two Humvees and a bus so that any cabbie could use it to win the Indy 500.”

It’s audacious. It’s promising as all hell. And it’s going to take input from people who see the world — and how to attack the problem — very differently:

  • Scientists, famous for not wanting to take a step without the other foot firmly planted, will need to see the project as built on solid scientific footing, with transparency as a chief goal.
  • Engineers, who tend to view the world in terms of what the product does rather than the process of building it, will want to see a system that above all works.
  • Corporate researchers will need to have their proprietary ideas guarded, so that they can reap the fruits of their R&D labors.
  • Government officials will need answers that are both economically feasible and politically doable in the context of their own systems of governance.

I don’t think I’m giving away state secrets to say that these folks are going to be — have been — arguing over how best to do all of the above. But it will be to the world’s gain that they stick it out.
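
For flavor, here’s the kind of “what happens if I do X?” comparison the finished tool is supposed to make routine, compressed into a toy Ross-Macdonald-style transmission model I wrote for illustration. Every parameter below is invented, and the real VECNet models are vastly richer:

    # Toy "what if?" experiment in the spirit of VECNet, NOT its actual models:
    # a bare-bones Ross-Macdonald-style malaria model, stepped forward with
    # simple Euler integration, run with and without a bed-net-like
    # intervention that halves the mosquito biting rate. Parameters invented.

    def prevalence_after(biting_rate, days=2000, dt=0.05):
        m, b, c = 5.0, 0.3, 0.5     # mosquitoes per human; infection probabilities
        r, g = 1 / 100, 1 / 10      # human recovery rate, mosquito death rate (per day)
        X, Z = 0.01, 0.01           # infected fractions: humans, mosquitoes
        for _ in range(int(days / dt)):
            dX = m * biting_rate * b * Z * (1 - X) - r * X
            dZ = biting_rate * c * X * (1 - Z) - g * Z
            X += dX * dt
            Z += dZ * dt
        return X

    baseline = prevalence_after(biting_rate=0.1)
    with_nets = prevalence_after(biting_rate=0.05)   # "what if we halve biting?"
    print(f"long-run human prevalence: {baseline:.0%} without nets, {with_nets:.0%} with")

The point isn’t these made-up numbers; it’s that a decision maker should be able to pose that question against validated models and trustworthy data, and get an answer worth acting on.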

______________________________

*Modern tonic water has far less of the stuff than the originals, and so wouldn’t work.

Posted in Bioinformatics, General, HPC Research, People

Nathan in his office.

I’ve been working with Nathan Stone for almost ten years, and over the past few months there has been a question nagging at me. How in the world does a guy with a Ph.D. in experimental nuclear physics wind up involved in PSC’s newest collaborative project, VECNet, which is researching new strategies to better control, or even eliminate, malaria in developing countries? In my mind, experimental nuclear physics != controlling malaria! So I sat down with Nathan and got a little more insight into what landed him in his current role as Modeling and Project Manager for the VECNet Cyber-Infrastructure (CI) team.

Nathan describes his undergrad major as a happy accident. When he applied to Purdue University, he wanted to be an engineer. Unfortunately, by the time he applied, all of the engineering school’s slots had already been filled, so he enrolled in Purdue’s College of Science and hoped to transfer to engineering his sophomore year.

As it turns out, he stuck it out in physics. He even participated in an REU program at Michigan State between his sophomore and junior years. While working in the accelerator lab on campus, he realized that research was much more than writing a book report. He enjoyed his research experience at Michigan State, and in the accelerator lab in particular, so much that he went back there to get his Ph.D.

Nathan giving a virtual machine room tour at PSC's 25th anniversary celebration to a group of K-12 teachers.

After completing his post-doc at Berkeley, Nathan started working at Brookhaven National Lab on Long Island as a full-time staff scientist. He was a member of a collaborative team bringing up a unique new particle detector, STAR, and was involved in data calibration and acquisition and experiment controls for the new device. Before STAR could receive funding to build the device, the team needed to prove to the funding agency, the Department of Energy (DOE), that it would work. This is where PSC enters the story… the STAR team ran simulations on PSC’s supercomputers to make their case to the DOE.

The STAR project required collaboration meetings about every 6 months, and during one of the meetings, Nathan gave a facility tour to PSCers Mike Levine & Sergiu Sanielevici.  Sergiu was looking for staffers to participate in the National Institutes of Health’s Collaboratories project.  Participants would help find new hardware, software, techniques, and tools that could help geographically distributed collaborators to work more effectively together.  Nathan’s experience with various partners at Brookhaven made him a perfect fit, and Mike & Sergiu encouraged Nathan to apply at PSC.

Nathan’s tenure at PSC has involved a number of projects, all of which have had two commonalities: ensuring that PSC users and collaborators have remote, secure and fast data access, and finding tools to make collaborative projects more efficient and effective.

The Star Trek door chime mounted outside Nathan’s office that makes a swoosh noise every time you visit.

So when PSCer Shawn Brown needed developers with a mind for research for the MIDAS influenza modeling project, he asked Nathan to join his team. Nathan worked to solve some data issues with the FRED model they were using, then moved on to developing a new vector model for mosquitoes to better track and predict the spread of the dengue virus.

Later, Bruce Lee at Pitt, who had done some economic modeling for the VECNet project, got Shawn and Nathan involved. The VECNet project needed a cyberinfrastructure that would help further the research, but the team just hadn’t been ready to justify it. Nathan did an applications analysis, which had not been done before, to justify and project the need for a more complete cyberinfrastructure. Proposal writing and contract negotiations took over a year, including traveling and meeting with various project collaborators and end-users (a.k.a. “stakeholders”). All of the hard work paid off recently when a generous grant of $1.6 million was awarded to make the proposed cyberinfrastructure a reality.

Now, Nathan is the project manager of the CI team and is ensuring that all of the pieces come together to help provide the VECNet team with tools that will make their research and collaboration more effective and efficient.

When Nathan’s not trying to save the world, one mosquito at a time, you can find him spending time with his family. He has five kids: two boys and three girls. He also enjoys hunting, and driving his co-workers mad with his Star Trek electronic door chime.

Posted in General, HPC Research, People

“The July 12, 1973 fire at the St. Louis National Personnel Records Center (NPRC) destroyed approximately 80% of Army personnel records from 1 Nov 1912 to 1 Jan 1960; and, 75% of the Air Force records from 25 Sep 1947 to 1 Jan 1964. In all, between 16 to 18 million military service files, including those for WWI and WWII, were destroyed.”

—   Kathleen Brandt, archives.com

The 1973 NPRC fire is of more than passing interest to me. Among the millions of Americans who lost records — 80 percent of those who served in the Army in both World Wars and Korea — was Alfred Bernard Chiacchia, my grandfather.

“Pete” Chiacchia — I never heard anybody but my grandmother call him “Alfred” — arrived in Normandy somewhere about 25 days after D-Day. Luckily, Pete — a 35-year-old draftee with three kids — missed the fighting there. But as a combat engineer with George S. Patton’s Third Army, he witnessed as much as a single dogface could have seen of Patton’s sweeping advances through France, Benelux, and into Germany. He told a story about the Battle of the Bulge that, to my deepest regret, I never heard — but which family members assure me was as hilarious as it was epic.

I’m not naïve. War is war. It was never really clear what portion of his stories was real and how much was … enhanced. But he always made it a good story.

“They made me a sergeant,” I did hear him say. “Twice.”

I think it’s safe to say that I inherited the storyteller gene.

Ms. Brandt assures me that, given his birth and death dates, Social Security number and, possibly, unique service number, I can probably reconstruct those records from other sources. I may well take up that quest — not that I’m certain how to get some of that information, now that his generation in my family is all but gone. But it certainly would have been easier just to get them from the NPRC.

Which brings us to our topic of today (well, yesterday, if you want to get technical): meaningful digitization of hand-written and other non-electronic records.

I say “meaningful,” because it’s pretty easy to digitize records if you don’t intend to make use of them. Moving from visual scans to searchable data, however, is incredibly expensive in time and manpower, requiring thousands of hours of human work to recognize cursive script and transcribe it into digital form.

Among those working to change that are Liana Diesendruck, Luigi Marini, Rob Kooper and Kenton McHenry at the National Center for Supercomputing Applications at the University of Illinois at Urbana-Champaign. They’ve been using PSC’s Blacklight to analyze scans of the 1940 Census to try to improve the ability to automate the process.

I look forward to telling this story in detail in the near future. Part of the beauty is that their approach doesn’t try to reproduce humans’ ability to recognize the script, which is beyond computers’ capabilities today. Instead, they match entries based on how similar they are in shape. So one Census worker’s “John” can be matched with another’s, even if their handwriting is very different. Humans can choose from the possible matches — and the system learns from their choices, getting better. Ultimately, the goal will be to digitize many different types of records, so that they won’t merely be protected from loss — they’ll be easily accessible, searchable, living documents available to generations to come.
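
For the curious, here’s the gist of shape-based matching in miniature. This is a hypothetical sketch of my own, not the NCSA team’s method, and the filenames are made up: squash two scanned word images to the same size, then score how similar the ink patterns look, so the best-scoring entries can be offered to a human as candidate matches.

    # A miniature, hypothetical take on shape-based word matching (not the
    # NCSA pipeline): scale two scanned word images to a common size and
    # score their resemblance. Filenames below are invented examples.
    import numpy as np
    from PIL import Image

    def word_signature(path, rows=32, cols=96):
        """Load a scanned word, grayscale it, and reduce it to a fixed-size array."""
        img = Image.open(path).convert("L").resize((cols, rows))
        arr = np.asarray(img, dtype=float)
        return (arr - arr.mean()) / (arr.std() + 1e-9)   # normalize ink density

    def similarity(sig_a, sig_b):
        """Normalized correlation: near 1.0 for near-identical shapes."""
        return float((sig_a * sig_b).mean())

    # two handwritten "John" entries from different enumerators (hypothetical files)
    a = word_signature("census_entry_0001.png")
    b = word_signature("census_entry_0002.png")
    print(f"shape similarity: {similarity(a, b):.2f}")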

It’s a story with a happy ending. Pete would have liked it.

*Note added on Veterans Day, 11/11/13: I am working on this story now, and in a more recent interview Dr. Diesendruck hit me with something I should have realized: The computers are doing something more like what our brains do than you’d think. In fact, when a fluent reader reads, he or she does *not* read through the words letter by letter — instead, the eye follows the general shapes and extrapolates from there. That’s one reason for the phenomenon of “reader error” — seeing a different word than the one that was there — and for being able t rd txt tht dsn’t hv vwls n t.

Posted in General, HPC Research, People, XSEDE

Ken working hard at his desk.

We haven’t talked about any of our fabulous staffers in a while, and since we did two posts on the XSEDE XRAC process a few weeks back, here and here, we thought this would be the perfect time to hear about the man behind it all: PSC staffer and XSEDE Allocations Manager Ken Hackworth, who isn’t afraid to wear pink.

Ken as a student, sporting the striped shirt.

Ken has been working for PSC since almost the beginning. We were founded in 1986, and Ken started working for the Center as a help-desk student in 1987. Around here our students get pulled into a variety of activities, and Ken’s experience was certainly no different: he also helped out with allocations and as a grunt-work office assistant during his tenure as a student.

When Ken graduated from the University of Pittsburgh in 1988 with a B.S. in Economics and a minor in Computer Science, he was hired full time at the Center as a User Consultant, assisting our user community by answering their questions and solving any problems they had. In 1999, he became our User Relations Coordinator, which got him more involved in allocations and consulting and had him overseeing the students working the hotline and allocations. He also took on coordinating PSC’s booth presence at the annual Supercomputing Conference.

Ken golfing in Nevada prior to the March XRAC meeting.

Ken got his ‘big break’ in July of 2011, when he took over as the XSEDE Allocations Manager. Ken’s experience here at PSC made him a perfect fit for the job. As the XSEDE Allocations Manager, Ken does a lot: overseeing the XSEDE allocations process, coordinating and organizing reviewers and their recommendations, working with NSF to develop and incorporate guidelines and, of course, working with XSEDE service provider site representatives.

When Ken isn’t in the office, you can most likely find him on the golf course or spending time with his family. He’s been married to his high school sweetheart for 23 years and has a 12-year-old daughter and two dogs, a black Lab and a chocolate Lab. Ken also reports that among his many superpowers, the most noteworthy is a photographic memory of golf courses.

Posted in People, XSEDE

I’m not a big fan of second-guessing the ancients’ perception of natural events. People in earlier times didn’t have incorrect ideas of what the world was like because they were ignorant — let alone stupid. Generally, their ideas were pretty much in line with the phenomena they saw, given the quality of the instrumentation they had to measure those phenomena.*

Case in point: Archaeologists have (re)discovered the gates of Hell. The literal gates of Hell. Quite frankly, even knowing all about carbon dioxide and other toxic gases emerging from volcanic caves, it’s hard not to side with the ancients on this one. You watch birds dropping dead at the entrance of an opening into the bowels of the earth, given no volcanology (the word hadn’t been invented yet), no chemistry, no physiology. Of course you’re going to build a temple and bring sacrificial animals (carefully) to the gate, saying, “Please not me, not yet.”

Which is all by way of saying that these things look pretty clear with a couple thousand years of hindsight. And if you improve the instrumentation, you improve your perception.

I’m looking forward to talking next week with Elia Zomot and Ivet Bahar at Pitt about their paper from last month showing how a molecular door in nerve cell membranes uses the relatively high concentration of sodium ions outside the cell to drive neurotransmitters inside. Kind of like an oshiya in the Tokyo subway, cramming commuters into a car. Important, because getting rid of the transmitters in the junction between nerve cells helps turn off neural signals when they’re no longer needed. That process may be faulty in conditions like epilepsy, ischemia and other stroke-associated causes of nerve damage, and Huntington’s disease.

For this task, the researchers used Anton, a PSC resource from D. E. Shaw Research that specializes in simulating the motions of all the atoms in large biomolecules. Anton allowed them to run an electronic model of the door, a protein called an aspartate transporter: first as it opened to the outside to pick up an aspartate molecule, and then as its components pivoted, clothespin-like, to release it inside the cell.

Fair-use image from Wikimedia Commons, http://commons.wikimedia.org/wiki/File:Clothespin-0157e3.jpg.

Their model spanned multiple microseconds. That’s a long stretch of time in molecular dynamics. It also would have been a major computational challenge as recently as three years ago, before a D. E. Shaw Research grant made PSC’s Anton the first of its kind available to researchers outside the company.
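
To put “multiple microseconds” in perspective, classical molecular dynamics typically advances in steps of a couple of femtoseconds, recomputing the forces on every atom at every step. A quick back-of-the-envelope calculation (my arithmetic, not figures from the paper):

    # Why microseconds are a big deal in molecular dynamics: at a typical
    # 2-femtosecond timestep, the step counts pile up fast. Back-of-the-
    # envelope numbers of my own, not figures from the paper.
    TIMESTEP_FS = 2.0                        # a common classical-MD timestep

    def steps_for(sim_time_us):
        """Integration steps needed to cover sim_time_us microseconds."""
        return sim_time_us * 1e9 / TIMESTEP_FS   # 1 microsecond = 1e9 femtoseconds

    for us in (1, 10):
        print(f"{us:>2} microsecond(s) simulated = {steps_for(us):.0e} timesteps")

That’s half a billion force evaluations per simulated microsecond, which is why hardware built for exactly this job makes such a difference.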

Based on what I glean from the paper, the “old,” static images of the transporter that we had from previous crystallographic experiments had given the impression that the door opens differently than the Anton calculations show it actually moves. A subtle correction, but one that could pay off in addressing neural damage in a number of diseases.

Our own Markus Dittrich, director of the National Resource for Biomedical SuperComputing at PSC, tells me that Anton can run these kinds of simulations about 100 times faster than anything else out there. Anton handles microsecond-scale simulations with relative ease, even offering glimpses into millisecond timeframes. By comparison, the brain can perceive auditory events as short as about 10 milliseconds.

That’s amazing, even if it isn’t quite as profound a transformation as explaining cave-entrance deaths as a phenomenon of carbon dioxide poisoning rather than of the miasma arising from the land of Death. But then, they had over 600 times as long to work on the problem.

*******************************

* I’m not counting the pre-Galilean notion that heavier objects fall faster than light objects. Then as now, people can hold wrong ideas about things they haven’t bothered to measure.

Posted in Bioinformatics, General, HPC Research, People