Gee Whiz Geophysics…But What About the Log Data? Normalizing, Editing, and Supplementing Log, Core, and Production Data from 1935 to the Present
Jeff S. Arbogast
Petroleum Software Technologies, LLC, 14001 E. Iliff Avenue #414, Aurora, CO 8014
Geophysicists have been trying to squeeze as much usable information as possible from seismic data since long before the discovery of bright spots. Today they display this information with 3D visualization software, and 3D seismic is touted as the answer to all things…but what about the log data? Most log data (even ancient log data) have 10-25 times better vertical resolution than today's seismic data; however, many geoscientists treat log data much as they were treated in 1935. They obtain copies of the logs, display them in cross sections, correlate them, and map them. The use of mixed-vintage, incomplete, and/or poor-quality log data, however, can lead to serious problems in interpretation. Without accurate, normalized, high-resolution log data for every well in a study area, correlations, seismic ties, and maps may be incorrect. As a result, 3D seismic interpretations based on these data may turn out to be amazingly colorful but inaccurate representations of what is actually happening in the subsurface.

Today the oil and gas industry is challenged with evaluating declining production in aging fields, which may involve hundreds of wells with log data recorded from 1935 to last week. New plays often involve laminated, poor-quality, low-permeability, fractured, or unconventional reservoirs. Using resistivity and SP inversion processing and neural network modeling run on a PC, geologists and geophysicists can generate complete suites of accurate, high-resolution, edited log, core, and production data for every well in a study area. Examples from California, the Mid-Continent, and the Rocky Mountains will be shown.
AAPG Search and Discovery Article #90076©2008 AAPG Pacific Section, Bakersfield, California