

It's the End of the Galaxy as We Know It

PROJECT LEADERS
Lars E. Hernquist, Harvard-Smithsonian Center for Astrophysics
John Dubinski, University of Toronto

SAC TEAM
Stuart Johnson
Robert Leary
SDSC

COLLABORATORS
Steinn Sigurdsson, Pennsylvania State University
Chris Mihos, Case Western Reserve University
Romeel Davé, Princeton University
Kathryn V. Johnston, The Institute for Advanced Study
Roeland van der Marel, Space Telescope Science Institute
Stephen Vine, Ohio University

We're doomed. In one of the first Strategic Applications Collaborations (SAC) efforts, scientists at SDSC assisted a group of astrophysicists headed by Lars Hernquist of the Harvard-Smithsonian Center for Astrophysics in modeling galaxy dynamics and evolution faster and more accurately than ever before. According to their latest, state-of-the-art galaxy dynamics simulations on NPACI's newest supercomputer, Blue Horizon, our Milky Way galaxy may collide with the slightly larger Andromeda Galaxy in a mere 3 billion years. The results are catastrophic--the two galaxies eventually merge, but spray billions of stars into the lonely depths of intergalactic space in the process.


Galaxies consist of collections of billions of stars, vast clouds of dust and gas, and unseen swarms of "dark matter" that stay together due to their mutual gravitational attraction. Galaxies themselves throng together in clusters, and occasionally interact, collide with, or merge with one another. Until the rise of detailed numerical simulations on high-speed computers within the last 30 years, researchers explained the shapes and configurations of galaxies and clusters with vague, qualitative guesses. In fact, computer simulations support the once-controversial theories that galaxies cannibalize one another and that the stars, gas, and dust that astronomers observe may be only a minor constituent of the universe compared to the mysterious dark matter.

Many aspects of galactic dynamics can be modeled by collisionless interactions of self-gravitating mass points that represent relatively small volumes containing stars, gas clouds, or dark matter--the more particles, the better the accuracy and resolution of the simulation. In the collisionless N-body problem, each of N particles interacts with every other particle, so unless approximations are used, the cost of the calculation grows as N². A straightforward N-body simulation of a system with millions of particles is dauntingly time-consuming, even on a supercomputer.
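The brute-force approach can be written down in a few lines. The sketch below (illustrative only; the variable names, code units, and softening parameter are assumptions, not details of the researchers' codes) computes pairwise gravitational accelerations directly, making the N² scaling explicit.

    import numpy as np

    G = 1.0  # gravitational constant in code units (an assumed normalization)

    def direct_accelerations(pos, mass, softening=0.01):
        """Pairwise gravitational accelerations for N particles.

        pos  : (N, 3) float array of positions
        mass : (N,)   array of masses
        The loop over all particle pairs is what makes the cost grow as N².
        """
        acc = np.zeros_like(pos)
        for i in range(len(mass)):
            d = pos - pos[i]                          # vectors from particle i to all others
            r2 = (d * d).sum(axis=1) + softening**2   # softened squared distances
            r2[i] = np.inf                            # exclude self-interaction
            acc[i] = (G * mass[:, None] * d / r2[:, None]**1.5).sum(axis=0)
        return acc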


Figure 1. Galactic Collision - Interacting galaxies NGC 2207 (left) and the smaller IC 2163, 114 million light-years away in the constellation Canis Major. The pair is similar to the Andromeda Galaxy and the Milky Way. Tidal forces have pulled streamers of material 100,000 light-years from IC 2163. Observations indicate that, billions of years from now, IC 2163 is destined to swing past the larger galaxy again and eventually merge into it. Image courtesy of NASA and The Hubble Heritage Team.

WHAT'S PAST IS PROLOGUE

The work with Hernquist's group was one of the three initial collaborations in the SAC program, which Jay Boisseau, SDSC associate director for Scientific Computing, launched as an experiment in March 1998. Over the course of a year, SDSC's Scientific Computing group member Stuart Johnson and Senior Staff Scientist Bob Leary worked primarily with Hernquist, John Dubinski of the University of Toronto, and Steinn Sigurdsson of Pennsylvania State University to improve the execution efficiency of their astrophysics codes and to port them to new machines.

The collaboration was a success and the official project concluded, but the effort continues. "Just because we've achieved our goal for the formal collaboration doesn't mean we stop working with the researchers," Boisseau said.

"Over the past several years, first at UC Santa Cruz and more recently at the University of Toronto and the Canadian Institute for Theoretical Astrophysics, I have developed a parallel, N-body tree code for simulations in structure formation, galaxy formation, and general galaxy dynamics," Dubinski said. "The code is based on the Message Passing Interface (MPI) library and is generally portable to most parallel supercomputing platforms."

Dubinski's PARTREE code package calculates gravitational forces from nearby particles by examining individual particle-particle attractions, but organizes more distant particles into a hierarchical tree structure to aggregate and approximate their gravitational effects.
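The general idea behind such a tree code can be sketched briefly. The following is a minimal Barnes-Hut-style illustration, not PARTREE itself; the Cell attributes, the opening angle THETA, and the monopole-only far-field approximation are assumptions made for clarity.

    import numpy as np

    G = 1.0      # gravitational constant in code units (assumed)
    THETA = 0.5  # opening angle; a typical but assumed value

    def tree_acceleration(p, cell, softening=0.01):
        """Acceleration at position p due to one tree cell.

        Each cell is assumed to carry: its linear size `size`, total mass `mass`,
        center of mass `com`, child cells `children`, and, for leaves, member
        particles as (mass, position) pairs in `particles`.
        """
        if not cell.children:
            # Leaf cell: exact particle-particle sum, as for nearby particles.
            acc = np.zeros(3)
            for m, q in cell.particles:
                r = q - p
                r2 = (r * r).sum() + softening**2
                if r2 > softening**2:                 # skip the particle itself
                    acc += G * m * r / r2**1.5
            return acc
        d = cell.com - p
        dist = np.sqrt((d * d).sum()) + 1e-12
        if cell.size / dist < THETA:
            # Distant cell: treat all of its mass as one point at the center of mass.
            return G * cell.mass * d / (dist**2 + softening**2)**1.5
        # Cell too close or too large: open it and descend into its children.
        return sum(tree_acceleration(p, c, softening) for c in cell.children)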

"I analyzed PARTREE's performance on one processor of our IBM SP," Johnson said, "and found that it was spending about 80% of its time in two force calculation subroutines." He fine-tuned the two sections of code for the SP and the Cray T3E, using a test case supplied by Dubinski that consisted of 2 million particles distributed in a galactic disk and halo of stars. "For in-cache data, the tuned code now runs about 3.7 times faster on the SP than the original, and the tuned subroutines speed up the whole code by a factor of two," Johnson said.

Optimization efforts for the Cray T3E were similar to those for the IBM SP, although the code was not as easy to tune. "The difficulty of deciding how much tuning to do, and when to stop, made performance programming for the T3E much like black magic," Johnson said.

Another code package, SCF (Self-Consistent Field), simulates the gravitational effects of large numbers of mass points symmetrically swarming about a small number of centers, as is the case in globular star clusters, the centers of galaxies, halos of stars and dark matter around galaxies, and spherical clusters of galaxies.
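The structure of an SCF-style calculation can be illustrated with a toy example. The basis functions below are purely hypothetical stand-ins for the tailored basis the real code uses; what matters is the two-phase pattern of a global coefficient reduction followed by independent per-particle evaluation, which is what makes the method parallelize so well.

    import numpy as np

    def scf_coefficients(radii, masses, basis_funcs):
        """Phase 1: global coefficients a_n = sum over particles of m_k * phi_n(r_k)."""
        return np.array([(masses * phi(radii)).sum() for phi in basis_funcs])

    def scf_potential(r, coeffs, basis_funcs):
        """Phase 2: evaluate the smooth field at radius r from the global coefficients."""
        return sum(a * phi(r) for a, phi in zip(coeffs, basis_funcs))

    # Hypothetical radial basis functions, for illustration only; the real code
    # expands in a basis tailored to the density profile being modeled.
    basis = [lambda r, n=n: np.exp(-r) * r**n for n in range(4)]

    radii = np.random.rand(100000)        # toy particle radii in code units
    masses = np.full(100000, 1e-5)
    a = scf_coefficients(radii, masses, basis)
    phi_half = scf_potential(0.5, a, basis)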

Working with Sigurdsson, SDSC's Bob Leary sped up SCF with arithmetic and algebraic refinements that improved computational performance and with changes to interprocessor communication that improved parallel-processing efficiency.

"Romeel Davé, Lars Hernquist, and I have also developed a parallel version of TreeSPH, a hybrid smoothed particle hydrodynamics (SPH) tree code for modeling cosmological hydrodynamics and galaxy formation," Dubinski said. The new PTreeSPH code combines a parallel tree approach to modeling gravitational forces with an SPH estimation approach to calculate temperature and pressure effects in interstellar gas. The technique is useful for simulating star formation in gas clouds compressed by tidal interactions, calculating the spectra of gas clouds in early galactic formation, and modeling hydrodynamics of dark and baryonic matter.

Five years ago, enVision reported on the use of SPH and an earlier version of PARTREE to model galaxy collisions by Hernquist, Dubinski, and Chris Mihos, all then at UC Santa Cruz. Those state-of-the-art PARTREE calculations used 250,000 particles. Today, SAC simulations work with 10 to 100 times that number.

"Now that our simulations run two to three times faster, we can ask more precise questions and get better answers," Hernquist said.


Figure 2. Glimpses of Our Future
The first images from a simulation on Blue Horizon of an encounter between our galaxy and Andromeda. Each model galaxy contains about 10 million point-mass "stars" and is surrounded by a 2 million–particle dark matter halo, for a total of 24 million interacting particles tracked in the simulation. The images depict the stars as particles of constant brightness and do not show star formation due to hydrodynamic effects in gas clouds.

THE MILKY WAY'S FATE ON THE HORIZON

The effectiveness of the continuing informal collaboration was demonstrated in the past few months, when one of the largest galaxy simulations ever run was ported to Blue Horizon. This teraflops-performance machine is compatible with the smaller SP used in the SAC project. The astrophysicists in Hernquist's group remained allocated users of NPACI resources--indeed, one point of the SAC effort was to make future production runs of scientific simulations by such users easier. Both the researchers and the code specialists were eager to use the new machine, so the collaboration continued informally.

During the new machine's checkout phase, Dubinski and the SAC team ported PARTREE to Blue Horizon. As a test case, Dubinski chose a glimpse of our distant future. In one plausible scenario, our Milky Way galaxy will have a close encounter 3 billion years from now with its relatively near neighbor, the slightly larger Andromeda Galaxy.

"I executed one high-resolution scenario of the Milky Way-Andromeda encounter on Blue Horizon in January," Dubinski said. "We can measure Andromeda's velocity toward us, and we know where the Milky Way is headed. There are still uncertainties in the trajectories because we can't directly measure Andromeda's motion tangential to our line of sight, but our model used reasonable values for the encounter parameters and the amount of dark matter. The simulation results were dramatic."

The two galaxies sideswipe each other, fling some material into intergalactic space in vast arcs, then coalesce into a single larger galaxy surrounded by an extended halo. The two galactic cores merge, but in the process tens of billions of stars are ejected into the intergalactic void. "Astronomers are coming to understand how large spiral and elliptical galaxies form through mergers with smaller ones," Dubinski said. "We observe single snapshots of the process in the sky (Figure 1), but in the computer we can watch the sequence from start to finish."

This initial run tracked 24 million particles (Figure 2). Dubinski is preparing a simulation for Blue Horizon that will involve 120 million particles and will occupy 256 or 512 of the machine's processors. "These simulations should reveal a large amount of detail in the structure of the debris once the galaxies merge," he said. "They elucidate the relative importance of violent relaxation and phase-mixing in the merging process. Analyses of the kinematics of debris streams and shell structures also provide an independent way to measure the total mass of elliptical galaxies."

SCF also ran well on the new teraflops SP. "SCF is extremely scalable," Leary said. "This was the first application to run on all 1,152 processors during Blue Horizon's acceptance test."

"The SAC efforts help researchers use our systems more effectively, and encourage exchange of ideas between researchers and staff experts," Boisseau said. "The ultimate goal is to give researchers the ability to conduct new science by performing new simulations and analyzing their results more effectively. The galactic dynamics simulations in particular have involved both formal and informal consulting and collaboration, and we're seeing the payoff." --MG *
