PSC Symposium: Cell Modeling with MCell

June 30, 2017 – 10am
300 S Craig St, Pittsburgh PA

 

Many challenges present themselves when creating realistic cellular models, from the detailed spatial representation of membranes and organelles to convoluted biochemical pathways. At smaller and smaller subcellular scales, the familiar concept of concentration loses its meaning, giving way to the stochastic behavior of heterogeneously distributed populations of molecules. MCell is designed for studying just such systems, allowing researchers to run realistic simulations with arbitrarily complex 3D geometry, tracking discrete molecules as they diffuse and react with one another.
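The core idea of particle-based stochastic simulation can be illustrated with a toy sketch (this is not MCell or its API, just a hypothetical minimal example of the concept): individual A and B molecules take random Brownian steps in a box, and any A-B pair that comes within a reaction radius is consumed to form a product. The state is a count of discrete molecules, not a concentration.

```python
import random
import math

def simulate(n_a=200, n_b=200, steps=100, box=1.0,
             step_len=0.01, react_radius=0.02, seed=42):
    """Toy particle-based stochastic simulation of A + B -> C.
    Discrete molecules diffuse by random steps; pairs within
    react_radius react and are removed."""
    rng = random.Random(seed)

    def new_pos():
        return [rng.uniform(0, box) for _ in range(3)]

    a = [new_pos() for _ in range(n_a)]
    b = [new_pos() for _ in range(n_b)]
    products = 0

    for _ in range(steps):
        # Diffusion: each molecule takes an independent Gaussian step,
        # clamped at the box walls.
        for pop in (a, b):
            for p in pop:
                for i in range(3):
                    p[i] += rng.gauss(0, step_len)
                    p[i] = min(max(p[i], 0.0), box)
        # Reaction: remove each A-B pair that is closer than react_radius.
        survivors = []
        for pa in a:
            hit = None
            for j, pb in enumerate(b):
                if math.dist(pa, pb) < react_radius:
                    hit = j
                    break
            if hit is None:
                survivors.append(pa)
            else:
                b.pop(hit)
                products += 1
        a = survivors
    return len(a), len(b), products
```

Unlike a differential-equation model of concentrations, repeated runs with different seeds give different outcomes, which is exactly the stochastic variability that matters at small subcellular scales.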

Join us as scientists from the PSC’s Biomedical Applications Group and Salk Institute discuss how MCell facilitates their research.

Please RSVP to save a seat for this event or to be notified of future events.

 

Presentations

Rozita Laghaei, PhD
Research Scientist
Pittsburgh Supercomputing Center

MCell Simulations of the Mouse Neuromuscular Junction Transmitter Release Site and Evaluation of a Novel Treatment for Neuromuscular Disease

The neuromuscular junction is a reliable synapse in which reliability derives from the summed activity of numerous unreliable elements, each consisting of a synaptic vesicle and associated voltage-gated calcium channels (VGCCs). Lambert-Eaton myasthenic syndrome (LEMS) is an autoimmune disease that reduces this reliability, leading to muscle weakness. The weakness is due to an autoantibody-mediated removal of some of the VGCCs that are critical for transmitter release, an upregulation of other VGCC types, and a disruption in the organization of these VGCCs. We have used a combination of electrophysiological recording and MCell computer simulations to examine structure-function relationships, the disease LEMS, and novel LEMS treatment strategies. We obtained a comprehensive MCell model, which provides a detailed understanding of the relationship between active zone (AZ) structural changes caused by LEMS and the resulting functional disease state. We have used an iterative approach combining MCell simulations and synaptic physiology to refine our mammalian model of the neuromuscular active zone. MCell is also being used to evaluate the current treatment for LEMS, which targets calcium channel activation during an action potential (the potassium channel blocker DAP), alone and in combination with a newly developed calcium channel gating modifier (the novel calcium channel agonist GV-58). These drugs are predicted to act synergistically to greatly enhance transmitter release.

 

Art Wetzel
Principal Computer Scientist
Pittsburgh Supercomputing Center 

A virtual volume approach to storage, analysis and delivery of petascale volumetric datasets for connectomics and cell modeling research

Biology and medicine are increasingly driven by analyses of 3-dimensional and time series image sets for studies that were not possible with conventional 2-dimensional imagery. Structural data required for building spatially realistic cell models, and even larger connectomics models, are particularly demanding of both resolution and spatial extent. Image capture methods for both optical and electron microscopy at gigapixel per second rates are now routine. In combination, these factors have already pushed datasets beyond 100 TB per specimen at data densities of 1 PB per cubic mm of tissue. New approaches are needed to economically handle these speeds and data scales and to make the resulting datasets available for long-term on-demand analyses by researchers and students nationwide.

A virtual volume approach to these data handling problems is suggested by current trends in the economics of computation and data storage along with typical data access patterns. In recent years the relative improvements in the speed and cost of computation have dramatically outpaced gains in data storage cost and performance. This is particularly true for GPGPU computation, where the bandwidth of data retrieval is often the limiting factor in overall throughput. The essence of the virtual file mechanism is to employ on-the-fly computation to replace redundant data storage for critical operations such as image registration and rendering. This is accomplished using the Linux Filesystem in Userspace (FUSE) mechanism to provide a compatible interface to user programs that conventionally operate from data files. This virtual interface produces the appropriate data content on demand as application programs access the virtual files. The virtual filesystem provides a framework for connecting multiple application program units in pipe-like fashion while reducing redundant data storage. It also provides advantages by moving computation directly into the data access path and producing only those parts of the virtual data that end user applications actually require.
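The compute-on-read idea behind the virtual file mechanism can be sketched in a few lines (hypothetical names and a deliberately simplified model, not the actual PSC implementation, which sits behind a FUSE interface): rather than storing a registered copy of every image section, store only the raw section plus its alignment transform, and compute registered pixels on demand when a region is read.

```python
class VirtualSection:
    """Serves registered image data computed on the fly from raw data
    plus a stored alignment offset, instead of storing a second,
    registered copy of the section."""

    def __init__(self, raw, dx, dy):
        self.raw = raw              # raw 2D image as a list of rows
        self.dx, self.dy = dx, dy   # alignment offset for this section

    def read(self, x, y, w, h):
        """Return a w x h window of the *registered* image. Each pixel
        is looked up in the raw data through the stored offset; pixels
        falling outside the raw data are padded with 0."""
        out = []
        for row in range(y, y + h):
            r = []
            for col in range(x, x + w):
                sy, sx = row + self.dy, col + self.dx
                if 0 <= sy < len(self.raw) and 0 <= sx < len(self.raw[0]):
                    r.append(self.raw[sy][sx])
                else:
                    r.append(0)
            out.append(r)
        return out
```

Only the regions an application actually reads are ever computed, which is the property that lets the real system avoid materializing redundant registered or rendered copies of petascale volumes.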

 

Tom Bartol, PhD
Staff Researcher
Computational Neurobiology Laboratory
Salk Institute for Biological Studies

Talk TBA