Gee Whiz Geophysics…But What About the Log Data?: Normalizing, Editing, and Supplementing Log, Core, and Production Data from 1935 to the Present
Jeff S. Arbogast1 and Steven M. Goolsby2
1 Petroleum Software Technologies, LLC, Aurora, CO
2 Goolsby Brothers and Associates, Inc, Centennial, CO
Geophysicists have been trying to squeeze as much usable information as possible from seismic data since long before the discovery of bright spots. Today they display this information with 3D visualization software, and 3D seismic is touted as the answer to all things…but what about the log data?
Most log data (even ancient log data) have 10 to 25 times better vertical resolution than today's seismic data; however, many geoscientists treat log data much as they were treated in 1935. They obtain copies of the logs, display them in cross sections, correlate them, and map them. The use of mixed-vintage, incomplete, and/or poor-quality log data, however, can lead to serious problems in interpretation. Without accurate, normalized, high-resolution log data for every well in a study area, correlations, seismic ties, and maps may be incorrect. As a result, 3D seismic interpretations based on these data may turn out to be amazingly colorful but inaccurate representations of what is actually happening in the subsurface.
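One common way to normalize mixed-vintage logs is two-point (percentile) normalization: each well's curve is linearly rescaled so that its low and high percentiles match values picked from a key (type) well. The sketch below illustrates the arithmetic only; the percentile choices and reference values are hypothetical, not from this abstract.

```python
# Minimal sketch of two-point (percentile) log normalization. Inputs are
# plain Python lists of curve values; percentile picks (5th/95th) and the
# reference values are illustrative assumptions.

def percentile(values, p):
    """Linear-interpolation percentile of a list (p in 0..100)."""
    s = sorted(values)
    k = (len(s) - 1) * p / 100.0
    lo, hi = int(k), min(int(k) + 1, len(s) - 1)
    return s[lo] + (s[hi] - s[lo]) * (k - lo)

def two_point_normalize(curve, ref_low, ref_high, p_low=5, p_high=95):
    """Rescale a well's curve so its p_low/p_high percentiles match the
    reference values picked from a key (type) well."""
    w_low = percentile(curve, p_low)
    w_high = percentile(curve, p_high)
    scale = (ref_high - ref_low) / (w_high - w_low)
    return [ref_low + (v - w_low) * scale for v in curve]

# Hypothetical usage: shift a miscalibrated curve onto the field standard.
normalized = two_point_normalize([55, 60, 72, 90, 110, 140], 40.0, 120.0)
```

The same linear rescaling applies to any curve type; in practice the percentiles are picked in a geologically stable shale interval so true reservoir variation is not normalized away.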
Today the oil and gas industry is challenged with evaluating declining production in aging fields, which can involve hundreds of wells with log data recorded from 1935 to last week. New plays often involve laminated, poor-quality, low-permeability, fractured, or unconventional reservoirs. Using resistivity and SP inversion processing and neural-network modeling run on a PC, geologists and geophysicists can generate complete suites of accurate, high-resolution, edited log, core, and production data for every well in a study area. Examples from the Mid-Continent, California, and the Rocky Mountains will be shown.
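The neural-network idea can be sketched as training a small network on wells that have a complete log suite, then predicting the missing curve in wells that lack it. The toy below is not the authors' method: the "GR", "resistivity", and "density" data and their relationship are synthetic, invented purely to show the mechanics of a one-hidden-layer network in pure Python.

```python
# Sketch only: a tiny one-hidden-layer tanh network trained by SGD to
# predict a synthetic "density" curve from synthetic "GR" and resistivity.
# All data and the underlying relationship are invented (noise-free) for
# illustration; a real workflow would train on measured curves.
import math, random

random.seed(0)

def make_sample():
    gr = random.uniform(20, 150)        # gamma ray, API units (synthetic)
    rt = random.uniform(1, 100)         # resistivity, ohm-m (synthetic)
    rhob = 2.0 + 0.004 * gr - 0.05 * math.log10(rt)  # invented relationship
    return (gr / 150.0, math.log10(rt) / 2.0), rhob  # scaled inputs, target

data = [make_sample() for _ in range(200)]

H = 4                                   # hidden units
w1 = [[random.uniform(-0.5, 0.5) for _ in range(2)] for _ in range(H)]
b1 = [0.0] * H
w2 = [random.uniform(-0.5, 0.5) for _ in range(H)]
b2 = 0.0

def forward(x):
    h = [math.tanh(sum(w1[j][i] * x[i] for i in range(2)) + b1[j])
         for j in range(H)]
    return sum(w2[j] * h[j] for j in range(H)) + b2, h

lr = 0.05
for epoch in range(300):
    for x, y in data:
        yhat, h = forward(x)
        err = yhat - y
        # Backpropagate the squared-error gradient through both layers.
        for j in range(H):
            grad_h = err * w2[j] * (1.0 - h[j] ** 2)
            w2[j] -= lr * err * h[j]
            for i in range(2):
                w1[j][i] -= lr * grad_h * x[i]
            b1[j] -= lr * grad_h
        b2 -= lr * err

rmse = math.sqrt(sum((forward(x)[0] - y) ** 2 for x, y in data) / len(data))
```

Once trained, `forward` supplies a predicted curve for any well where only the input curves were recorded, which is how a complete pseudo-suite can be built for mixed-vintage fields.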