ALPHA PROJECTS
Monte Carlo Cellular Microphysiology on the Grid
To understand diseases of nerves and of the brain (such as myasthenia gravis, which can cause paralysis and ultimately death), scientists first seek to understand the normal function of nerve cells, or neurons, at the molecular level. Computational neuroscience often studies interactions between neurons and their environment with models that attempt to represent the chemical reactions and electrical impulses that constitute neural signaling. Such models are a subset of the general problem of inter- and intracellular signaling. MCell, which is in use in two dozen labs around the world, is a program that simulates cellular microphysiology, and a new NPACI alpha project will implement and deploy MCell as a distributed grid resource.
Figure 1. Neurotransmitter Activity. MCell simulated the transmission of 6,000 molecules of the neurotransmitter acetylcholine (cyan specks) in a reconstructed mouse sternomastoid neuromuscular junction containing acetylcholinesterase (white spheres). The membrane is studded with unbound (blue), singly bound (red), doubly bound open (yellow), and doubly bound closed (green) acetylcholine receptors; current flows through the doubly bound open receptors, triggering a chain reaction that leads to muscle contraction. Image rendered by Tom Bartol of the Salk Institute and Joel Stiles of the Pittsburgh Supercomputing Center.
The project continues a collaboration between computer and computational scientists led by principal investigators Francine D. Berman, SDSC Fellow and professor of computer science and engineering at UC San Diego, and Terrence J. Sejnowski, who is a Howard Hughes Medical Institute investigator at the Salk Institute for Biological Studies, an SDSC Fellow, and a professor of biology and neuroscience at UC San Diego.

Berman is also the principal investigator for a new three-year, $2.5 million NSF Information Technology Research (ITR) grant that will use MCell as a model application in a project to demonstrate "scalable virtual software instruments" for the grid. "The MCell work provides an important opportunity for cell- and neurobiologists to obtain new results while supplying an equally important opportunity for computer scientists to develop new technology," Berman said. "In the alpha project and our ITR work, we will address the significant computer science problems that arise from the need to support steerable scientific simulations in large-scale grid environments."
MCELL
MCell is software for 3-D Monte Carlo simulation of ligand diffusion and chemical signaling, originally developed by Thomas M. Bartol and Joel R. Stiles in the lab of Miriam M. Salpeter and Edwin E. Salpeter at Cornell University. Bartol, now in Sejnowski's lab at Salk, and Stiles, now at the Pittsburgh Supercomputing Center, continue as the main MCell developers.

Biological structures like neurons show tremendous complexity and diversity at the subcellular level, and inter- and intracellular communication occurs via diverse chemical signaling pathways. A process like synaptic transmission, for example, encompasses both neurotransmitter and neuromodulator molecules. Proteins that affect the filling and emptying of synaptic vesicles with neurotransmitter molecules, receptor proteins, transport proteins, and oxidative and hydrolytic enzymes all play a role.

With MCell, scientists incorporate and vary any of these parameters within an arbitrarily complex 3-D geometric representation of the cellular structures involved. The Monte Carlo approach begins by populating this cellular environment with individual ligand and ligand-binding molecules, "in effect reconstituting the biochemistry of the volume considered," Bartol said. The diffusion of the molecules is represented as Brownian random-walk displacements, and high numerical accuracy is obtained by keeping the displacement length and time step larger than the Brownian mean free path and time between collisions. "We can simulate almost all factors governing the kinetics of neurotransmitter and receptor interaction at a synaptic junction, for example," Bartol said, "including ion channel opening and closing, receptor binding and unbinding, the diffusion constant of the neurotransmitter, and the influence on all these of different experimental conditions." Input files are written in a Model Description Language developed by Bartol and Stiles, and the probabilities for each interaction are determined by Monte Carlo methods.
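The random-walk-plus-binding idea can be sketched in a few lines of Python. This is an illustrative toy, not MCell's actual algorithm: it ignores cellular geometry and receptor positions entirely, and the diffusion constant, time step, and per-step binding probability are invented values chosen only to make the mechanics visible.

```python
import math
import random

# Toy sketch of Monte Carlo ligand diffusion (NOT MCell's real algorithm):
# one ligand takes Gaussian random-walk steps in 3-D, and at each step a
# binding event is tested against a fixed probability. The values of D,
# dt, and p_bind below are invented for illustration.

def simulate_ligand(n_steps=1000, dt=1e-6, D=4e-6, p_bind=0.01, seed=1):
    """Random-walk one ligand; return (bound, steps_taken, distance).

    D plays the role of a diffusion constant (cm^2/s); the RMS
    displacement per axis over a time step dt is sqrt(2 * D * dt).
    """
    rng = random.Random(seed)
    sigma = math.sqrt(2.0 * D * dt)        # per-axis step size
    x = y = z = 0.0
    for step in range(1, n_steps + 1):
        x += rng.gauss(0.0, sigma)
        y += rng.gauss(0.0, sigma)
        z += rng.gauss(0.0, sigma)
        if rng.random() < p_bind:          # Monte Carlo binding test
            return True, step, math.sqrt(x * x + y * y + z * z)
    return False, n_steps, math.sqrt(x * x + y * y + z * z)
```

In MCell itself, binding probabilities come from measured reaction rate constants and binding is tested against the actual 3-D surfaces a ligand's trajectory crosses; the sketch collapses all of that into one fixed probability per step.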
Averages over many runs of each distinct task increase the signal-to-noise ratio of the outcome (while preserving the realistic noise spectrum of laboratory experiments). Increases in computer power have made highly realistic simulations with MCell feasible.

A recent experiment on Blue Horizon, done in collaboration with Ed Salpeter and Bruce Land of Cornell, simulated the diffusion of acetylcholine (ACh) across a neuromuscular junction, its binding to specific receptors, and its subsequent unbinding and hydrolysis (destruction) by the enzyme acetylcholinesterase (AChE). After determining how the process takes place under normal conditions (Figure 1), Bartol and Stiles experimented by "poisoning" the junction with di-isopropylfluorophosphate (DFP, a nerve gas) and with a combination of DFP and alpha-bungarotoxin, a component of venom from Bungarus multicinctus, a snake found in Taiwan. While DFP simply binds to and inhibits AChE, preventing deactivation of ACh, the snake toxin at low concentrations tends to bind to one side of a receptor, blocking one of the two ACh molecules required for activation from binding there. The toxin thus lowers the amplitude and slows the rate of signal transmission across the neuromuscular junction. "This interested us because something similar happens in a disease like myasthenia gravis," Bartol said. "Our hope is that MCell can ultimately be used to model such autoimmune disease processes."

To illustrate the scale of these computations, Bartol noted that 15,680 runs were done for each condition (intact, DFP, and DFP plus toxin), for a total of more than 47,000 runs. "Some of these took only 10 or 15 seconds, while others took as long as four to six hours. The average run was about an hour, with each run simulating 40 milliseconds of real time," Bartol said. "This resulted in about 60 gigabytes of output, which we are still analyzing. We achieved a performance of 250 gigaflops on 1,024 processors (128 nodes) of Blue Horizon."
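Why such enormous numbers of runs pay off can be illustrated with a toy calculation (invented numbers, not MCell output): the spread of an averaged estimate shrinks roughly as one over the square root of the number of runs averaged.

```python
import random
import statistics

# Toy illustration of why averaging many Monte Carlo runs raises the
# signal-to-noise ratio: each "run" returns a noisy measurement of the
# same underlying signal, and the standard error of the N-run average
# falls roughly as 1/sqrt(N). The signal and noise values are invented.

def noisy_run(rng, signal=100.0, noise=10.0):
    """One simulated run: the true signal plus Gaussian noise."""
    return rng.gauss(signal, noise)

def spread_of_average(n_runs, n_trials=200, seed=0):
    """Standard deviation of the n_runs-average, over repeated trials."""
    rng = random.Random(seed)
    means = [
        statistics.mean(noisy_run(rng) for _ in range(n_runs))
        for _ in range(n_trials)
    ]
    return statistics.stdev(means)
```

Averaging 100 runs cuts the spread by about a factor of 10 relative to a single run, which is why thousands of runs per condition sharpen the outcome without washing out the realistic run-to-run noise.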
RUNNING MCELL ACROSS THE GRID
Such simulations are just the beginning for MCell scientists. To perform larger simulations of more complex phenomena, MCell researchers are partnering with computer scientists at UC San Diego and the University of Tennessee to develop implementations that target large numbers of networked computational resources across the grid. The software being developed by the computer scientists can also serve a broad class of scientific applications whose structure is related to that of MCell, such as other Monte Carlo simulations and parameter studies.

The MCell software can scan a broad parameter space of interest, just as a telescope scans the sky, and it should be possible to steer MCell, like a telescope, to regions of special interest. "Our ultimate objective is to use the MCell software to prototype virtual scientific instruments, enabling disciplinary scientists to use the grid as a tool as effectively as other laboratory instruments critical to new results," Berman said. She is working with other computer scientists, including Henri Casanova, project scientist in the Grid Computing Laboratory at UC San Diego, computer pioneer Jack Dongarra (University of Tennessee Distinguished Professor), and Rich Wolski, also a professor of computer science at Tennessee, to design and develop software that eases the use and improves the performance of applications like MCell on the grid.

The MCell software leverages three key components. Application-Level Scheduling (AppLeS) middleware, developed by Casanova, Berman, and Wolski, provides a framework for implementing and deploying performance-oriented grid applications. NetSolve, developed by Dongarra, Casanova, and others, is client/agent/server software that binds disparate computational resources into coherent, fault-tolerant units. The Network Weather Service (NWS), developed by Wolski, interrogates resources for status and predicts resource availability dynamically.

"All of these pieces are being integrated to accommodate large-scale MCell runs in a networked computational environment," said Casanova. "The MCell input language lets the user construct a biological model, while AppLeS can schedule the necessary computations and data movements efficiently on available resources. NetSolve assembles the resources, and we are hoping to target the software to the Globus metacomputing system as well. The NWS is essential to report and forecast systems status to all of the collaborating network software." He noted, for example, that the Blue Horizon test "was a real workout for the software because of the enormous range of individual task runtimes, from seconds to hours."
THE MANY USES OF MCELL
Also involved in the project are Mark Ellisman and Maryann Martone, director and associate director of the National Center for Microscopy and Imaging Research at UC San Diego. "We understand what a great resource MCell can be for the neuroscience community," said Ellisman. "We'll be looking for ways to integrate MCell into our research cycle, and we want to make the federated brain databases being developed within NPACI into resources for MCell."

"This is a project poised to deliver on the promise of neurocomputation at the highest level," said co-principal investigator Sejnowski. "Both as an alpha project within NPACI and in its wider implications as an NSF ITR project, MCell development is going to help us ask new and more productive questions."