
ENERGY AND THE ENVIRONMENT

Exploiting Aging Oil Fields with Advanced Computational Tools


The fortunes of Slav immigrant Anthony Lucas and the state of Texas rose together on January 10, 1901, when a 150-foot geyser of oil erupted from a well near Beaumont. However, nothing like the 75,000-barrel-a-day Lucas Gusher spurted from the ground near Borger, Midland, or Beaumont in 2001. Even though half the known petroleum reserves under Texas (and other oil-producing states) remain underground, extracting them has become increasingly difficult. To help, NPACI and its partners in Texas, Maryland, New Jersey, and Ohio are integrating computational tools to recover petroleum more efficiently from the nation’s dwindling reserves. "The key questions for oil and gas production companies are plumbing questions," said Steven Bryant, associate director for Microstructure and Pore Scale Modeling at the Center for Subsurface Modeling at the University of Texas, Austin. "What is the rock like between existing wells, and how is the reservoir connected? Two tools we’re working on to answer these questions are reservoir simulation and searching quickly through mountains of simulation data."

Reservoir Simulation

In this example, the Center for Subsurface Modeling at the University of Texas simulated several well-pattern placements over five years of water-flooding to determine the best drilling scenario. High pressure (red) corresponds to areas with injection (water) wells, and low pressure (blue) to production (oil, gas, and water) wells.

Plumbing is vital to an industry running out of new discoveries. "There are still virgin territories where companies and countries are searching for new oil fields, but there are fewer and fewer of those areas and they are in increasingly difficult environments," said David Lumley, one of the world’s leading authorities on oil field reservoir monitoring technologies and president of 4th Wave Imaging Corp., of Aliso Viejo, CA. "There is a growing emphasis on becoming more efficient in extracting oil from reservoirs that have already been discovered."

The shift has led to a burgeoning interest in oil-reservoir modeling and data-mining techniques, areas of expertise of an NPACI alpha project, Multi-Component Models for Energy and the Environment. Research by the alpha project’s members is being applied to protect groundwater, but the same geophysics applies to oil extraction. The alpha project is supported directly and indirectly by the National Science Foundation, the Department of Energy, the Department of Defense, the states of Texas and Louisiana, and industrial partners Aramco, British Petroleum, ChevronTexaco Corp., ExxonMobil Corp., Halliburton, IBM, and others.

GEOPHYSICAL AND FINANCIAL FORCES

The plumbing of subsurface oil reservoirs is often an unpredictably contorted labyrinth rather than a neat layer cake of porous and nonporous rock. No two oil reservoirs are the same. Researchers model as best they can both the variations in the permeability of subsurface rock and the geophysical forces that control the movement of oil, gas, and water through the fickle medium. An oil company could simply drill more wells in an existing field to produce more petroleum, but the costs of such an approach could far exceed any additional revenue. This is why NPACI partners at the University of Texas, Austin, have factored economic and financial constraints into their latest models.

For example, some simulations show that an oil company in need of sustained income would be better off drilling a few strategically placed water-injection wells that "sweep" oil like a squeegee toward producing wells.

Of course, the oil business is inherently risky, which may explain why some executives try to improve their odds with a mathematical technique called Monte Carlo simulation. The methodology was named for Monte Carlo, Monaco, home of roulette wheels, card games, and other forms of gambling. Monte Carlo simulation randomly picks values, within a known range, for variables whose values at particular times or events are uncertain. It is a "brute force," computationally intensive technique used to solve complex problems when no other method is effective. Researchers who would never stack chips on red at a roulette table routinely use Monte Carlo simulation to investigate everything from particle physics to cancer treatment (see story, page 6).

In oil reservoir simulations, the technique can calculate many scenarios of a given geostatistical model by repeatedly sampling values from the probability distributions for the uncertain variables. The technique uses those values to create subunits, or "cells," of a reservoir. In other words, an oil reservoir (typically hundreds of cubic meters) can be subdivided into hundreds, thousands, or millions of cells and "sampled" in random configurations to describe the reservoir as a whole. The more cells sampled, the better.
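The idea can be sketched in a few lines of Python. The permeability distribution, the cell count, and the stand-in "recovery" formula below are illustrative assumptions only, not IPARS code; the point is simply that repeating a calculation over many random realizations yields a range of outcomes rather than a single guess.

import numpy as np

rng = np.random.default_rng(seed=42)

N_CELLS = 9_000          # cells in one realization (size chosen for illustration)
N_REALIZATIONS = 200     # independent samples of the uncertain permeability field

def sample_permeability(n_cells):
    """Draw one realization: log-normal permeability (in millidarcies) per cell."""
    return rng.lognormal(mean=3.0, sigma=1.5, size=n_cells)

def toy_recovery(perm):
    """Stand-in for a full flow simulation: assume higher-permeability cells
    give up a larger fraction of their oil."""
    return float(np.mean(perm / (perm + 50.0)))

recoveries = [toy_recovery(sample_permeability(N_CELLS))
              for _ in range(N_REALIZATIONS)]

# The spread across realizations bounds the uncertainty in the forecast.
print(f"mean recovery fraction: {np.mean(recoveries):.3f}")
print(f"5th-95th percentiles:   {np.percentile(recoveries, 5):.3f} to "
      f"{np.percentile(recoveries, 95):.3f}")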

TEXAS LEGEND

Permeability Patterns

These 14-foot-thick slabs of color-coded rock, about 0.5 mile on a side and between 6,000 and 8,000 feet underground, reveal how highly stratified a petroleum reservoir can be. In this case, the highest permeability area is a thin stratum near the top of the bottom layers—red tracer in a “water-flood sweep” has almost completely swept away the purple color—but just below that layer is a low permeability bed with lots of oil left behind by the flooding.

Lucas had drilled into Louisiana salt domes, hitting oil in two wells. He was obsessed by the sulfur-smelling water near Beaumont, which had been bottled for years to treat various human ailments. Lucas guessed that the bitter fumes were a harbinger of oil, and he used nearly all of his personal funds to buy leases at the springs. The rest is Texas history.

Modern production companies rely on geological assessments, seismographic studies that reveal subsurface rock formations, and past experience. Some oil executives still prefer back-of-the-envelope calculations to arcane-sounding mathematics, but fluctuating oil prices can squeeze profits, prompting more companies to apply computational approaches, particularly in cases where performance distinctions are subtle.

Oil reservoir modeling uses measurements of porosity, quartz content, and other properties of the rock perforated by drilling. Geophysicists also use measurements from nearby outcroppings and time-lapse seismic data that can reveal the year-to-year flow of subsurface fluid. With those data and random "realizations" made with Monte Carlo simulation, geophysicists create a 3-D model of an oil reservoir. "This approach requires a tight integration between the mathematical sciences, the engineering sciences, and the geological sciences," said James Jennings, a research scientist at the Bureau of Economic Geology at the University of Texas, Austin.

Mary F. Wheeler, director of the Center for Subsurface Modeling at the University of Texas, Austin, has led a large group of researchers in the development of the Integrated Parallel Accurate Reservoir Simulator (IPARS), which is designed for large-scale parallel simulations of multiphase, multicomponent flow and transport in the subsurface. IPARS is taking Monte Carlo simulations to a higher level of sophistication by integrating financial and economic models. "We can look at the total oil production from a reservoir versus the total water injection, which is very helpful because the injection can be costly," said Malgorzata Peszynska, associate director for Subsurface Modeling in Wheeler’s group. "We know that the rate at which you produce oil from a reservoir will affect its total production; if you produce very fast you may get a large, immediate financial return, but you may also leave lots of oil and gas behind, which can’t be recovered unless you drill more wells. Our model takes these and many other factors into account. It’s extremely complicated."
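A hypothetical sketch of how simulator output might be folded into such an economic comparison follows. The prices, injection costs, discount rate, and production curves are invented for illustration and are far simpler than the coupled models in IPARS.

import numpy as np

YEARS = np.arange(1, 11)   # 10-year horizon
OIL_PRICE = 25.0           # dollars per barrel produced (assumed)
WATER_COST = 1.0           # dollars per barrel of water injected (assumed)
DISCOUNT = 0.10            # annual discount rate (assumed)

def net_present_value(oil_per_year, water_per_year):
    """Discounted yearly oil revenue minus water-injection cost."""
    cash = OIL_PRICE * oil_per_year - WATER_COST * water_per_year
    return float(np.sum(cash / (1.0 + DISCOUNT) ** YEARS))

# "Fast" strategy: high early production rates, but more oil left behind overall.
fast_oil   = 1.0e6 * np.exp(-0.35 * YEARS)   # barrels produced each year
fast_water = 2.5e6 * np.ones(len(YEARS))     # barrels of water injected each year

# "Slow" strategy: lower early rates, higher cumulative recovery.
slow_oil   = 0.5e6 * np.exp(-0.10 * YEARS)
slow_water = 1.5e6 * np.ones(len(YEARS))

for name, oil, water in [("fast", fast_oil, fast_water), ("slow", slow_oil, slow_water)]:
    print(f"{name}: total oil {oil.sum():.2e} bbl, NPV ${net_present_value(oil, water):.2e}")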

DEMONSTRATING IPARS

During the Supercomputing 2001 (SC2001) conference in Denver, CO, members of the Multi-Component Models alpha project prepared an IPARS demonstration. The team included Peszynska, Wheeler, Bryant, and Ryan Martino, all with the Center for Subsurface Modeling; Joel Saltz, an Ohio State University professor and co-leader of the alpha project; Tahsin Kurc and Umit Catalyurek, assistant professors at Ohio State; and Alan Sussman of the University of Maryland. The demonstration rapidly solved a challenging benchmark problem designed by the Society of Petroleum Engineers (SPE).

As a group of SC2001 attendees watched, Saltz queried 1.5 terabytes of IPARS modeling data, which had been generated earlier and stored at the University of Maryland. The data included the results of 200 realizations of the SPE benchmark problem. As part of the solution of the problem, DataCutter, a middleware infrastructure developed by researchers at the University of Maryland and the Johns Hopkins Pathology Informatics Department, enabled the subsetting and user-defined filtering of multi-dimensional data sets stored in archival storage systems across a wide-area network.
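The sketch below, which does not use the actual DataCutter interface, conveys the general pattern: archived realization data are read in chunks, subset by a time and cell range, and passed through a user-defined filter, so only the small fraction of interest ever reaches the client. All sizes and values are invented.

import numpy as np

def iter_chunks(n_realizations=20, n_cells=1_000, n_steps=10, chunk=100):
    """Stand-in for reading archived simulation output piece by piece:
    yields (realization_id, time_step, cell_indices, oil_saturation)."""
    rng = np.random.default_rng(0)
    for r in range(n_realizations):
        for t in range(n_steps):
            for start in range(0, n_cells, chunk):
                cells = np.arange(start, min(start + chunk, n_cells))
                yield r, t, cells, rng.random(cells.size)

def query(time_range, cell_range, user_filter):
    """Subset by time step and cell index, then apply a user-defined filter."""
    for r, t, cells, sat in iter_chunks():
        if not (time_range[0] <= t <= time_range[1]):
            continue
        mask = (cells >= cell_range[0]) & (cells <= cell_range[1])
        hits = user_filter(sat[mask])
        if hits:
            yield r, t, hits

# Example: count cells with oil saturation above 0.8 in a small space-time window.
high_oil = lambda sat: int(np.count_nonzero(sat > 0.8))
total = sum(h for _, _, h in query((4, 5), (0, 499), high_oil))
print(f"high-saturation cells found in the window: {total}")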

"We had 9,000 grid cells, which is not very big–we’ve made runs with a couple million grid cells," said Martino, a graduate student in Wheeler’s group. "In the realizations, we used the same well patterns, the same initial conditions. The only difference was how the oil flowed because of the differences in the permeability field."

Within seconds of each of Saltz’s queries, results moved from a 50-node storage cluster at Maryland to Saltz’s computers at Ohio State, where they were processed with visualization software to create easy-to-understand images, which then flashed onto a display screen at the Denver Convention Center. "An interesting part of this demonstration was the computation of pockets of bypassed oil," said Sussman, an assistant professor at Maryland. "An oil company might well want to drill wells at particular places if the concentration of oil is high enough and the oil is present in enough cells for a sufficient length of time."
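A hedged sketch of that bypassed-oil criterion appears below: flag cells whose oil saturation stays above a threshold for enough consecutive time steps, and recommend a drilling target only if enough such cells exist. The thresholds and the saturation values are invented for illustration, not demonstration data.

import numpy as np

SAT_THRESHOLD = 0.6   # "high enough" oil saturation (assumed)
MIN_STEPS = 5         # "sufficient length of time", in time steps (assumed)
MIN_CELLS = 100       # "enough cells" to justify a new well (assumed)

def bypassed_cells(saturation):
    """saturation: array of shape (n_time_steps, n_cells).
    Returns indices of cells above the threshold at every one of the
    last MIN_STEPS time steps -- oil the water flood never swept."""
    recent = saturation[-MIN_STEPS:] > SAT_THRESHOLD
    return np.flatnonzero(recent.all(axis=0))

# Toy example: 20 time steps over 9,000 cells of random saturation.
rng = np.random.default_rng(1)
pocket = bypassed_cells(rng.random((20, 9_000)))
if pocket.size >= MIN_CELLS:
    print(f"candidate drilling target: {pocket.size} bypassed cells")
else:
    print(f"only {pocket.size} bypassed cells; no new well recommended")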

Each demonstration worked in seconds–so fast that some conference attendees may not have realized the scale of the computation. "With the advent of PC clusters, anybody could generate gigabytes or terabytes of data, but people who actually do it always say, ‘I’ll never do that again–I don’t have time to look at all of the data,’ " said Bryant, chuckling. "What Saltz and his collaborators have created is a tool that allows us to access huge data sets in a cleverly indexed fashion and rapidly query and visualize the results of the queries."

THE INSTRUMENTED OIL FIELD

The Texas group’s IPARS model is slated to become even more powerful. In October 2001, Wheeler and her colleagues won a three-year, $1.4 million Information Technology Research grant from the National Science Foundation to design technologies to "better monitor and optimize oil and gas production." The grant will help Wheeler’s group play a major role in designing "the instrumented oil field of the future." Other participants in this research include Peszynska, Mrinal Sen, Paul Stoffa, and Clinton Dawson at the University of Texas, Austin; Rick Stevens and Mike Papka at Argonne National Laboratory; Manish Parashar at Rutgers University; Saltz, Kurc, and Catalyurek at Ohio State; and Sussman at Maryland.

Wheeler’s group plans to integrate seismic information, well sensors, fiber optics, and remote-control operations. "A major outcome of the proposed research is a computing portal that will enable reservoir simulation and geophysical calculations to interact dynamically with the data and with each other," said Wheeler. "Since the proposed research is directed toward the general problem of modeling and characterization of the Earth’s subsurface, it has immediate application to other areas, including environmental remediation and storage of hazardous wastes." The research also has major economic implications for the nation because the total remaining oil reserves in the U.S. may exceed 200 billion barrels–about 70 years’ worth at the current rate of consumption.

PETROLEUM LEFT BEHIND

Increasingly powerful parallel computers, capable of more detailed and more realistic simulations, are needed to make the instrumented oil field a practical possibility. "If you do enough realizations, you can be more confident of an upper and lower bound," said Martino.

The simulation estimated the costs and potential benefits of various production scenarios, such as injecting water to sweep oil toward production wells. Such "water flooding" is so common that some reservoirs in Texas now produce 100 barrels of water for every barrel of crude. "If you own an oil company, there are a lot of things you could try, but you can’t estimate which one is going to be best without running a reservoir performance prediction," said Jennings, one of several academic researchers collaborating with the Center for Subsurface Modeling. "You simply must run some sort of reservoir performance prediction to evaluate candidate proposals in order to determine how best to operate a reservoir."

For the instrumented oil reservoir of the future, multi-sensor array measurements of pressure, saturation, and other parameters–including seismic images–will travel as a constant stream of data to remote computer sites. "All that data will need to be computationally processed in order to update reservoir models in real time," said Lumley. "Continuous monitoring measurements will allow you to test and refine your reservoir model on a continuous basis and thus make it more accurate over time. This is the Holy Grail we are all striving for." – RG


Project Leader
Mary F. Wheeler
University of Texas, Austin

Participants
Steven Bryant,
Ryan Martino,
Malgorzata Peszynska
University of Texas, Austin

Alan Sussman
University of Maryland

Joel Saltz
Ohio State University

Manish Parashar
Rutgers University