NEUROSCIENCE

Making and Testing Neuron Models

PROJECT LEADERS
James M. Bower, Professor of Biology, Caltech
Terrence J. Sejnowski, Professor of Biology, Director, Institute for Neural Computation, UC San Diego; Professor, Salk Institute for Biological Studies; Investigator, Neuroscience, Howard Hughes Medical Institute
Thomas M. Bartol, Postdoctoral Researcher, Salk Institute for Biological Studies

Computational models describe the behavior of neurons and the communications among neurons in networks. Such models are essential for investigating how neurons or networks contribute to the functions of brain regions. At the next level, models assist in studies of how interactions between regions underlie the functions of the entire nervous system, including behavior and thought. Models can be used to perform "virtual experiments" that are too difficult or impossible to conduct using biological tissue or living subjects. Computationally intensive simulations can incorporate greater numbers of neurons to model increasingly complex and realistic properties, both electrical and chemical. The Neuroscience thrust area is working to provide neuroscientists with the infrastructure and high-performance computing required to perform these large-scale simulations.

"The modeling component of the NPACI Neuroscience thrust area consists of two independent projects," said thrust area leader Mark Ellisman of UC San Diego. One is the very well known project called GENESIS, directed by Jim Bower of Caltech; the other is a general Monte Carlo simulator of microcellular physiology called MCell that has been developed by two groups, one at the Salk Institute for Biological Studies (Tom Bartol and Terry Sejnowski) and the other at Cornell University (Joel Stiles, Ed Salpeter, and Miriam Salpeter).


Figure 1: GENESIS Simulation of a Purkinje Cell

GENESIS supports simulations of neural systems, such as this detailed multicompartmental model of a cerebellar Purkinje cell, created by Erik De Schutter and James Bower of Caltech, and visualized by Jason Leigh using the GENESIS Visualizer program.

 


GENESIS

GENESIS--short for GEneral NEural SImulation System--is a general-purpose simulation platform developed to support the simulation of neural systems ranging from complex models of single neurons to large networks made up of more abstract neuronal components. GENESIS has provided the basis for laboratory courses in neural simulation at both Caltech and the Marine Biological Laboratory in Woods Hole, Massachusetts, as well as many other institutions. "Most current GENESIS applications involve realistic simulations of biological neural systems," said developer James M. Bower of Caltech.

The system permits realistic modeling of single neurons, conceptualized as linked compartments, and of groups of neurons with realistic interconnections, which can be assembled into networks that simulate the behavior of larger brain structures such as the cerebral cortex (Figure 1). The workstation version is in wide use by computational neuroscientists across the country and around the world. In February 1998, Bower and co-editor David Beeman of the University of Colorado brought out the second edition of The Book of GENESIS (TELOS/Springer-Verlag). Both a manual and a report of modeling work done with the system, the book contains material by 14 contributors from major neuroscience laboratories.
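
To give a flavor of the compartmental approach, the sketch below integrates a toy two-compartment passive neuron in Python. It is purely illustrative; GENESIS models are written in the simulator's own script language and include voltage-dependent channels, and all parameter values here are assumed.

    # Toy two-compartment passive neuron (illustrative only; all values assumed).
    import numpy as np

    DT = 0.01e-3            # time step (s)
    E_LEAK = -70e-3         # leak reversal potential (V)
    C_M = 1e-10             # membrane capacitance per compartment (F)
    G_LEAK = 1e-8           # leak conductance per compartment (S)
    G_AXIAL = 5e-8          # axial conductance coupling the compartments (S)

    v = np.array([E_LEAK, E_LEAK])        # soma and dendrite voltages (V)
    i_inject = np.array([50e-12, 0.0])    # current injected into the soma (A)

    for _ in range(int(0.1 / DT)):        # simulate 100 ms
        i_leak = G_LEAK * (E_LEAK - v)    # leak pulls each compartment to rest
        i_axial = G_AXIAL * (v[::-1] - v) # axial current between compartments
        v = v + DT * (i_leak + i_axial + i_inject) / C_M

    print("final voltages (mV):", v * 1e3)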

With NPACI support, Bower and his group have worked to develop a parallel version of the GENESIS system, P-GENESIS, which is now available on such NPACI resources as the CRAY T3E at SDSC. Graduate student Fidel Santamaria is working to demonstrate a network simulation of cerebellar cortical circuitry on these resources. Improvements are also being made to a module of GENESIS that performs automated parameter searching, using a variety of methods (gradient descent, stochastic searching, and genetic algorithms). "We are particularly interested in using GENESIS to develop and support neuroscience-related databases," Bower said, "and we are collaborating with the groups working on Federating Brain Data."
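
As a rough illustration of the stochastic side of such a search, the toy Python snippet below perturbs a parameter set at random and keeps any change that reduces a placeholder error function. The objective and parameter values are hypothetical and do not come from GENESIS.

    # Stochastic parameter search on a placeholder objective (hypothetical).
    import random

    def model_error(params):
        # Stand-in for running a simulation and comparing it to target data.
        target = (1.2, 0.8)
        return sum((p - t) ** 2 for p, t in zip(params, target))

    params = [1.0, 1.0]                  # initial guess
    best = model_error(params)
    for _ in range(2000):
        trial = [p + random.gauss(0, 0.05) for p in params]  # random perturbation
        err = model_error(trial)
        if err < best:                   # keep only improvements
            params, best = trial, err

    print("best parameters:", params, "error:", best)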


Figure 2: MCell Simulation of Ionic Current Generation

An MCell simulation shows a distant view of the nerve junction scene presented in close-up on this magazine's cover. After release, the neurotransmitter acetylcholine (ACh, cyan spheres) binds to acetylcholine receptors (cup-shaped objects) and acetylcholinesterase (AChE). Doubly bound open receptors (yellow) conduct a current that initiates a cascade of events leading to muscle fiber contraction. In NPACI, MCell is collaborating with the Programming Tools and Environments thrust area to allow larger and more complex simulations.

 


MCELL

MCell is a software tool for 3-D Monte Carlo simulation of ligand diffusion and chemical signaling. Developed by Tom Bartol of Terrence Sejnowski's lab at the Salk Institute for Biological Studies and Joel Stiles of the Cornell laboratory of Miriam Salpeter and Ed Salpeter, the system permits detailed modeling of chemical transmission across synapses, which involves an array of complex electrochemical processes (Figure 2).

"We are focusing on the events that occur when a neuron sends its chemical message across a synapse to influence another neuron. This involves modeling the release of neurotransmitter, its diffusion across the synapse, and its binding to receptors to generate quantal currents in the receiving neuron," Bartol said.

With NPACI support, MCell's serial code has been parallelized and its features greatly expanded to allow simulation of multiple ligand and receptor classes along with complex 3-D arrangements of diffusion boundaries representing multiple cell or organelle membranes. The new features include subdivision of the simulation space and structures into subvolumes, which has resulted in a 150-fold speedup of simulations. "We expect higher speedups with an increase in simulation complexity," Bartol said. MCell now also contains modules to speed the output of data for animations of simulation results.
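
The speedup follows the usual spatial-partitioning argument: if receptors and boundaries are binned into subvolumes, each diffusing molecule need only be tested against the contents of its own subvolume and its immediate neighbors, rather than against everything in the simulation. A minimal Python sketch of that lookup, with an assumed cell size and invented names, might look like this.

    # Bin receptors into a coarse grid of subvolumes (assumed cell size and names).
    from collections import defaultdict

    CELL = 1e-6                                  # subvolume edge length (cm)

    def cell_of(pos):
        return tuple(int(c // CELL) for c in pos)

    def build_grid(receptors):
        grid = defaultdict(list)
        for r in receptors:
            grid[cell_of(r)].append(r)
        return grid

    def nearby_receptors(grid, pos):
        # Only the molecule's own subvolume and its 26 neighbors are searched.
        cx, cy, cz = cell_of(pos)
        return [r for dx in (-1, 0, 1) for dy in (-1, 0, 1) for dz in (-1, 0, 1)
                for r in grid.get((cx + dx, cy + dy, cz + dz), [])]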

The MCell team has collaborated with Jack Dongarra of the Programming Tools and Environments thrust area to implement MCell using NetSolve, the metacomputing system under development by Dongarra and his group at the University of Tennessee. The NetSolve server version will permit the CRAY T3E or clusters of workstations to participate in large-scale parallel processing of MCell simulations.
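
The NetSolve calls themselves are not shown here, but the underlying pattern is task farming: many independent simulation runs handed out to whatever processors are available. The sketch below uses Python's multiprocessing module purely as a stand-in for the metacomputing layer, and run_simulation is a placeholder for an actual MCell run.

    # Task-farming stand-in: multiprocessing in place of NetSolve (placeholder run).
    from multiprocessing import Pool
    import random

    def run_simulation(seed):
        # Placeholder for one independent simulation run.
        rng = random.Random(seed)
        return seed, sum(rng.random() < 0.1 for _ in range(10000))

    if __name__ == "__main__":
        with Pool() as pool:
            results = pool.map(run_simulation, range(32))  # 32 independent runs
        print(results[:4])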

Simulations are designed using a Model Description Language to define neurotransmitters and other molecular constituents such as receptors, enzymes, and uptake sites; the arrangement of boundaries; the timing of release; and additional parameters. Thus, many processes in addition to synaptic transmission can now be modeled. "We can now use fully arbitrary 3-D polygonal representations of cellular structures, with resolution down to the electron microscope level," Bartol said.
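
As a rough indication of what such a description gathers, the hypothetical Python structure below lists a ligand, its effectors, a boundary mesh, and a release event. The field names are invented for illustration and do not reflect the Model Description Language's actual syntax.

    # Hypothetical summary of a synapse model's ingredients (invented field names).
    synapse_model = {
        "ligand": {"name": "ACh", "diffusion_coeff": 4e-6},          # cm^2/s
        "effectors": [
            {"name": "AChR", "kind": "receptor", "density": 1.0e4},  # sites/um^2
            {"name": "AChE", "kind": "enzyme",   "density": 3.0e3},
        ],
        "boundaries": "endplate_mesh.polygons",     # 3-D polygonal reconstruction
        "release": {"site": (0.0, 0.0, 0.05), "time": 0.0, "molecules": 6000},
        "output_times": [1e-4, 5e-4, 1e-3],         # seconds
    }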

MCell can now be applied to model synaptic transmission using reconstructions of actual neuromuscular junctions, including junctions altered by diseases such as myasthenia gravis. A collaboration has begun with Manfred Lindau of Cornell University, using MCell to simulate endocrine and neurotransmitter exocytosis--the release of chemical packets or vesicles from the neuron at the synapse.

NPACI support will fund a postdoctoral researcher to pursue a collaboration with Kristen Harris, a neurobiologist at Harvard University, on synaptic transmission using reconstructions of brain tissue, including synaptic spines and surrounding glial cells. This project will involve adding new capabilities to MCell to include reactions between mobile species, such as calcium buffers.
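
In its simplest well-mixed form, a reaction between two mobile species can be sketched as a per-pair binding probability applied at each time step. The Python toy below treats Ca2+ and a buffer this way; the rate constant and volume are assumed, and the planned MCell extension would of course track both species in space.

    # Well-mixed stochastic Ca2+/buffer binding (rate constant and volume assumed).
    import random

    K_ON = 1e8            # forward rate constant (1/(M*s))
    VOLUME = 1e-16        # reaction volume (L), roughly a dendritic spine
    N_AV = 6.022e23       # Avogadro's number
    DT = 1e-6             # time step (s)

    ca, buf, bound = 1000, 5000, 0                 # molecule counts
    p_pair = K_ON * DT / (N_AV * VOLUME)           # per Ca-buffer pair, per step

    for _ in range(5000):
        events = 0
        for _ in range(ca):
            if random.random() < buf * p_pair:     # this Ca2+ binds a buffer
                events += 1
        ca, buf, bound = ca - events, buf - events, bound + events

    print("free Ca:", ca, "free buffer:", buf, "bound:", bound)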

EXPANDED MODELING TOOLS

Ellisman noted that this NPACI effort is critical to bringing together local efforts and transforming neural modeling from a cottage industry into an essential part of a large-scale, collaborative neuroscience sector of the national computational infrastructure. "We are also considering the addition of other modeling systems, with the idea of making them available on parallel machinery for large-scale computational neuroscience," he said.

Neuroscience modeling simulations can be used to gain insight into how experimentally observed behaviors contribute to brain processing, leading to new questions for experimental research. They are also used to test hypotheses concerning brain function. As more information becomes available from experimental observations on the characteristics, constituents, and interconnections of neurons, simulation programs are being asked to accommodate ever larger and more complex simulations. These projects also serve as testbed applications for other NPACI thrust areas developing technology and computational infrastructure to enable large-scale simulations.

"Neuroscience is poised at the edge of a great epoch of practical and theoretical integration, with computation as an essential component of discovery," Ellisman said. "Our NPACI thrust area is designed to bring our discipline in line with others as a driver of the development of a national computational science infrastructure." --MM END