
PROGRAMMING TOOLS AND ENVIRONMENTS

Closing the Loop between Computation
and Laboratory Experiment

PROJECT LEADER
J.C. Browne
, University of Texas

PARTICIPANTS
Shyamal Mitra, James Overfelt, Yuhong Fu
, University of Texas

COLLABORATIONS
Engineering
Characterization of Heterogeneous Materials

Composite materials, or simply composites, can be very strong and stiff for their weight. Many composites are made of small, rigid fibers or particles embedded in a soft matrix. The fibers add strength and stiffness while the matrix provides light weight and toughness. By changing the fiber and matrix properties and the arrangement of fibers, materials scientists can dramatically change the properties of a composite. However, there is no easy way to predict how the fiber and matrix properties translate into the mechanical properties of the composite. A project led by James Browne at the University of Texas is making headway toward better predictions by linking computational tools with experimental data.

"This project is driven by the notion that we really want to integrate experiments and computation," said Browne, Regents Chair in Computer Sciences and professor in both the Physics and Electrical and Computer Engineering departments at the University of Texas at Austin. "Data from experiment and data from computations should be used consistently, and right now there's no way to do that."

Figure 1. From Material to CT Scan
A top view of the rock sample, generated from computed tomography scan data.

CAPTURING EXPERIMENTAL DATA

The first obstacle to such integration is that it cannot be done in the abstract. A scientist must have a situation for which comparable data can be extracted from both experiment and computation. The microstructure of composites is the real-world application being tackled by the Data Fusion project of NPACI's Programming Tools and Environments thrust area, led by Browne in collaboration with Greg Rodin of the Texas Institute for Computational and Applied Mathematics (TICAM). Rodin leads the Characterization of Heterogeneous Materials project of the Engineering thrust area.

The experiment starts with a piece of material about the size of a thumbnail. A computed tomography (CT) scan reveals the internal density of the material at the 10-micron scale and shows the fibers embedded in the matrix. The scan produces X-ray slices of the material as a series of image files (Figure 1).
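As a rough illustration of what the scan delivers, here is a minimal sketch (Python with NumPy and Pillow; the file layout, image format, and density threshold are assumptions, not the project's code) that stacks the slice images into a single 3-D density array and estimates the fiber volume fraction by thresholding.

    # Minimal sketch, assuming the CT slices are numbered TIFF files.
    import glob
    import numpy as np
    from PIL import Image

    slice_files = sorted(glob.glob("ct_slices/slice*.tif"))   # hypothetical path
    volume = np.stack([np.asarray(Image.open(f)) for f in slice_files], axis=0)

    voxel_size_microns = 10.0   # resolution quoted in the article
    print(volume.shape, "voxels at", voxel_size_microns, "microns per voxel")

    # Rough fiber/matrix segmentation; the threshold (the mean density) is assumed.
    fiber_fraction = (volume > volume.mean()).mean()
    print(f"Estimated fiber volume fraction: {fiber_fraction:.2f}")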

The next step is to create a 3-D model of the material from the 2-D image slices. Browne and Rodin accomplish this with off-the-shelf visualization tools, including the Application Visualization System (AVS) and the Visualization Toolkit (VTK). The 2-D images are processed to create a boundary element model in the format that Rodin's modeling application uses for input (Figure 2).
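The sketch below shows one way this step could look using VTK's Python bindings; it is only a stand-in for the project's actual AVS/VTK pipeline. It reads the hypothetical slice stack from the previous sketch, extracts the fiber/matrix interface as a triangulated surface with marching cubes, and writes the surface out. The extents, spacing, and iso-value are illustrative, and the real pipeline emits a boundary element model in the input format of Rodin's solver, not an STL file.

    # Sketch only: 2-D CT slices to a 3-D surface model with VTK's Python bindings.
    import vtk

    # Read a numbered stack of slice images (e.g. ct_slices/slice000.tif ...).
    reader = vtk.vtkTIFFReader()
    reader.SetFilePrefix("ct_slices/slice")       # assumed naming convention
    reader.SetFilePattern("%s%03d.tif")
    reader.SetDataExtent(0, 511, 0, 511, 0, 99)   # 512x512 pixels, 100 slices (assumed)
    reader.SetDataSpacing(0.01, 0.01, 0.01)       # 10-micron voxels, expressed in mm
    reader.Update()

    # Marching cubes extracts a surface at an iso-density separating fibers from matrix.
    surface = vtk.vtkMarchingCubes()
    surface.SetInputConnection(reader.GetOutputPort())
    surface.SetValue(0, 128.0)                    # assumed density threshold
    surface.ComputeNormalsOn()

    # Save the triangulated boundary for downstream modeling.
    writer = vtk.vtkSTLWriter()
    writer.SetInputConnection(surface.GetOutputPort())
    writer.SetFileName("fiber_boundary.stl")
    writer.Write()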

"In the future, we'd like to take the CT data, map it to the proper format, and store it and the results of the computations in the Active Data Repository (ADR)," Browne said. "This is a very large computational problem, and we're serving as a pilot project for interfacing data with the ADR." Browne is working with the ADR project to make this happen. Eventually, the size of the data and the computations may require using a metasystem such as Legion or Globus for this step of the process.




COMPUTATIONAL TOOLS

In particular, the CT data is converted to a representation in a Scalable Distributed Dynamic Array (SDDA). Rodin's application is also built on the SDDA library, developed by Browne and colleagues at Texas. SDDA generalizes the standard array concept into a structure that adapts at runtime to the size and shape of the object it represents and is automatically distributed across the processors of a parallel machine. Because the library handles data movement between the processors, an application written in ordinary Fortran with SDDA remains portable.
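SDDA itself is not reproduced here; the following conceptual sketch (mpi4py and NumPy, every name hypothetical) only illustrates the idea the article describes: an array whose global size is fixed at runtime and whose storage is partitioned across the ranks of a parallel machine, with the owner of any global index resolved by the library rather than by application code.

    # Conceptual sketch only (not the SDDA API).
    from mpi4py import MPI
    import numpy as np

    class BlockDistributedArray:
        def __init__(self, global_size, comm=MPI.COMM_WORLD):
            self.comm = comm
            self.resize(global_size)

        def resize(self, global_size):
            """(Re)partition the global index range into contiguous per-rank blocks."""
            rank, nprocs = self.comm.Get_rank(), self.comm.Get_size()
            counts = [global_size // nprocs + (1 if r < global_size % nprocs else 0)
                      for r in range(nprocs)]
            self.offsets = np.cumsum([0] + counts)
            self.local = np.zeros(counts[rank])   # this rank's block of the data
            self.global_size = global_size

        def owner(self, i):
            """Rank that stores global index i (what a real library resolves internally)."""
            return int(np.searchsorted(self.offsets, i, side="right") - 1)

    # Example: the size is known only at runtime, e.g. the number of boundary
    # elements produced from a particular CT scan.
    a = BlockDistributedArray(global_size=1_000_000)
    print(f"rank {a.comm.Get_rank()} stores {a.local.size} of {a.global_size} entries")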

Rodin's simulations look at how composites deform and ultimately break under strain (see the April-June 1998 ENVISION). These methods promise to save time and money by designing materials with desired features more quickly than traditional trial-and-error in the laboratory. However, before materials designers accept these methods, they must know how the model's predictions compare to the behavior of real materials.

Closing the loop is a two-step process. First, Rodin must run his simulations with the model built from the CT scan data and predict what will happen to the material. This he can do. Second, a materials engineer must put the material sample through an actual stress test, take another CT scan, and compare the effects of the stress on the sample to the prediction from Rodin's techniques. Performing the physical stress tests will allow Rodin's group to verify and improve the accuracy of their model. Once the model proves itself, it can become part of the toolbox available to materials designers.
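The comparison in the second step could be as simple as the following sketch, which assumes the predicted deformed state and the post-test scan have already been resampled onto a common voxel grid and saved as NumPy arrays (both file names are hypothetical).

    # Sketch of the comparison step; file names and the common-grid assumption are ours.
    import numpy as np

    predicted = np.load("predicted_density.npy")    # resampled simulation prediction
    measured = np.load("posttest_ct_density.npy")   # resampled post-test CT scan

    # A simple global figure of merit: the relative L2 difference between the fields.
    rel_error = np.linalg.norm(predicted - measured) / np.linalg.norm(measured)
    print(f"Relative mismatch between prediction and experiment: {rel_error:.3f}")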

"Until recently, we have had to make up reasonable models, but now we have access to actual data," Rodin said. "We want to run a physical test to close the loop, but it requires a different set of skills. I don't know any one person who works on every part of the loop."



Figure 2. From CT Scan to Model Mesh
An enlarged view of a section of the meshes that were generated to model the crystals in the sample scanned in Figure 1.

BRIDGING THE GAP

The techniques developed by Browne for bringing experimental data into a simulation apply to most digital signal processing applications. Another possible application at TICAM is the subsurface simulation effort led by Mary Wheeler, director of the Center for Subsurface Modeling and leader of NPACI projects in the Engineering and Earth Systems Science thrust areas. Seismic data is routinely used to build such models indirectly; Browne's techniques would make it possible to build them directly from the data.

"Our goals are now straightforward," Browne said. "There are other experimental data that will deliver data on the microscopic level. We could do the same thing, for example, with X-ray crystallography data."

One might ask why all computational models don't incorporate experimental data as input. In the long run, this will probably be the case, Browne said, but for now, it remains a difficult task. A great deal of effort goes into preparing, collecting, and analyzing the experimental data before it can be used by a computational model. In addition, the experimentally based models can require very large-scale computations.

"We haven't been doing this very long, and we're still figuring out the best way to do it," Browne said. "We're just getting to the point where we can get good enough data out of experiments and computations that it's worth comparing. People want to do this. We may be a little early here, but I expect it will happen." --DH
