Processing, Survey Design and Quality Control

Steven B. Campbell

My first seismic data processing experience was at Teledyne Exploration, with crooked-line Vibroseis data acquired along mountain roads in Appalachia.  This regional survey began in upstate New York and continued south across Pennsylvania, West Virginia, Virginia, Tennessee and into Georgia.  While this was over two decades ago, I remember that the northern part of the survey was dominated by two very bright reflectors, which I now assume to have been the gas-rich Marcellus and Utica shales.

This work was accomplished entirely by IBM card input of positioning and parameters.  The seismic data were read straight off ½-inch 9-track tape, processed and written to other tapes.  Hard disk space in that era was measured in tens of megabytes and was used just for operating systems and processing software, not data storage.  The positioning QC and stacking layout were done on scotch-taped montage maps of computer printout pages.  We corrected errors and took coordinates off these maps to build the stacking geometry that filled the bins of the crooked lines.  Velocity and data QC were done with long paper plots of velocity-corrected CDP gathers rolled out down hallways and inspected “on foot”.

Then, at Western Geophysical in Houston, I worked in 2D marine processing, including Gulf of Mexico data off Texas, Louisiana and MAFLA.  Parameters were input through “green screen” text-only terminals.  For a while, I was a velocity specialist and therefore saw data from offshore California and Alaska as well.  Velocity analysis was done by interpreting paper plots of semblance analyses called “Velans” with colored pencils.  The paper plots were then digitized.  We helped to develop an interactive velocity analysis system called Expeditor, which eventually became standard.
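The coherence measure behind those Velan plots is the classic semblance statistic: for a gather that has been moveout-corrected with the right velocity, the traces line up and semblance approaches 1. A minimal sketch (the gather values here are invented for illustration, not from any actual survey):

```python
# Semblance of an NMO-corrected gather: ratio of stacked energy to total
# energy, scaled by the number of traces. Near 1.0 for aligned events.
def semblance(gather):
    """gather: list of equal-length traces (lists of samples)."""
    n = len(gather)
    nt = len(gather[0])
    num = sum(sum(tr[t] for tr in gather) ** 2 for t in range(nt))
    den = n * sum(tr[t] ** 2 for tr in gather for t in range(nt))
    return num / den if den else 0.0

flat = [[0.0, 1.0, 0.0]] * 4                      # identical traces: coherent
skewed = [[0, 1, 0], [1, 0, 0], [0, 0, 1], [0, 1, 0]]  # misaligned event
assert abs(semblance(flat) - 1.0) < 1e-9
assert semblance(skewed) < 0.5
```

Scanning this statistic over a range of trial velocities produces the semblance panel that we interpreted with colored pencils.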

Later, I was part of the Field Ancillary Computing Effort (FACE), which included MicroMAX data processing on a Compaq 386 desktop in Venezuela.  I processed data in the field for QC on the first 3D survey in Venezuela.  The FACE system had some preliminary utilities for 3D survey design, which I made use of in subsequent bids.  I did the same sort of data processing and QC for a 3D transition-zone project along the eastern shore of Lake Maracaibo.  I trained aboard the Western Atlantic to process navigation data from the boats deploying the hydrophones, which were recorded by a Sercel 368 system that had been “hardened” for shallow marine recording.  The source boat navigation also had to be processed and QC’ed.  I trained the navigators (surveyors) for the receiver-deploying “nav boats”, and I was called upon to take over the night watch as navigator of the source boat during acquisition.

Later, I was stationed aboard the Western Atlantic, making 2D brute stacks of the 3D marine data as it was acquired from Lake Maracaibo.  I also did QC and processing of positioning data from the navigation boats that deployed the telemetry buoys and hydrophones.  The binning coverage was also my responsibility, with a very early real-time binning system that had to be edited and recalculated daily.

From there, I went to Bolivia and processed Vibroseis data (also on MicroMAX) as it was acquired in the Altiplano (high desert) at 3,500-4,000 meters in altitude.  This was done alongside a parallel processing effort with the new ProMAX system.  My sections were superior, but to be fair, the ProMAX system was very new then and the operators were still working out the parameters.  At that time, vibrator similarity analysis, refraction data processing and scouting were also my responsibility.

At Grant Geophysical I did a great deal of survey design with a DOS-based program called FD (Field Design).  It was there that I worked with Green Mountain/SIS in a consortium to develop MESA.  A bit later, OMNI split off from MESA, and Grant, being the consortium member and legacy FD client, had working copies of both.  My colleague Julie Martin and I were founding user-developers of both MESA and OMNI.  We requested many changes and improvements that later became standard features.

We also did a great deal of mapping and used Digital Orthomaps (DOMs) to scout shot and receiver locations as dictated by the terrain and obstructions.  I spent weeks working with a DOM of False River Lake, Louisiana to plan source and receiver locations for an Amoco project.  Of the first ten wells drilled on that prospect, nine were successful and one blew out.  Years later, I flew over the area on the way to Europe and recognized it instantly.

At Petroleum Geo-Services (PGS), I led a group dedicated to onshore survey design.  We used both OMNI and MESA.  When the Onshore Department was sold off, my staff chose to go with the new company, which is now part of GeoKinetics.  I chose to remain with PGS in the Geophysical Support Group and changed over to deep-water marine survey design and QC.  We did a great deal of seismic modeling (finite difference, ray tracing and design attribute analysis).  We made use of a 3D “virtual reality” system called holoSeis for visualization of modeled data, projected coverage maps and “cultural data”.

Data processing with ProMAX was also part of the analysis we did to form a basis for survey design and quality control.  It was with PGS that we developed and patented (US Patent No. 6,925,386 B2 – Pramik, Mathur, Campbell, et al.) a method for near-real-time illumination analysis.  The term “illumination” in this case refers to ray tracing based on the P190 positioning data over an interpreted surface in a velocity field, performed daily as the navigation data arrive over a satellite link.  At that time, specialized compression software was necessary to make transmission feasible.  The coverage is projected in holoSeis (3D visualization) on the interpreted surface and examined for areas that might (or might not) need infill.  We accomplished this on several large 3D projects.
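The coverage-inspection step can be sketched in miniature: once rays have been traced, each hit point on the interpreted surface is binned on a grid, and cells with too few hits stand out as candidate infill areas. This toy sketch is not the patented implementation; the hit points, bin size and threshold are invented for illustration.

```python
# Bin ray-trace hit points on a surface grid and flag sparsely hit cells.
def flag_infill(hit_points, bin_size, min_hits):
    """hit_points: (x, y) ray hit locations on the interpreted surface."""
    grid = {}
    for x, y in hit_points:
        key = (int(x // bin_size), int(y // bin_size))
        grid[key] = grid.get(key, 0) + 1          # hit count per cell
    # Cells whose hit count falls below the threshold may need infill.
    return sorted(k for k, n in grid.items() if n < min_hits)

hits = [(5, 5), (6, 4), (7, 6), (15, 5)]          # one dense cell, one sparse
assert flag_infill(hits, bin_size=10, min_hits=2) == [(1, 0)]
```

In the actual workflow the flagged map was rendered on the interpreted surface in holoSeis rather than returned as a list.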

There was another methodology that analyzed binned coverage based on geophysical parameters.  Pre-stack time migration on modeled data was used to establish a tolerance level for the coverage gaps permitted in each of four offset ranges.  Navigation data were forwarded to the office by satellite to be binned and analyzed as to whether infill would be appropriate, and the results were delivered to the clients and the vessel overnight.
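The per-offset-range check amounts to a simple rule per bin: count the traces falling in each offset range and fail the bin if any range drops below its tolerance. A minimal sketch, with offset ranges and tolerances invented for illustration rather than taken from any actual project:

```python
# Flag a bin for infill if any offset range has too few traces.
# Ranges in meters; tolerances would come from migration tests on modeled data.
OFFSET_RANGES = [(0, 1000), (1000, 2000), (2000, 3000), (3000, 4000)]
MIN_TRACES = [1, 1, 1, 1]   # one trace per range in this toy example

def bin_needs_infill(offsets):
    """offsets: source-receiver offsets (m) of the traces in one bin."""
    for (lo, hi), need in zip(OFFSET_RANGES, MIN_TRACES):
        got = sum(1 for o in offsets if lo <= o < hi)
        if got < need:
            return True     # a coverage gap in this offset range
    return False

assert bin_needs_infill([500, 1500, 2500])            # far offsets missing
assert not bin_needs_infill([500, 1500, 2500, 3500])  # all ranges covered
```

Running this test over every bin of the freshly binned navigation data is what made an overnight infill recommendation practical.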

There was a series of surveys over a permanent installation of fiber-optic cables and sensors installed around a platform off Brazil.  The survey design was mine, and I tracked the coverage in the Houston office as the source point navigation was sent in by satellite.  Updated coverage maps sorted by eight offset ranges were produced and presented on a weekly basis.

While all the previous was going on, we also did airgun source modeling for every array configuration on every job.  PGS’s Marine Source Modeling Utility (MASOMO) is an industry-standard package that is solidly backed by field research.  In my last years at PGS, I was the go-to authority on source modeling and instructed on the use of MASOMO at places like CGG and ExxonMobil.

I hope that this document has made clear my considerable experience in processing, QC and survey design.  As I mentioned before, I can go anywhere I am needed and stay for long periods.  I have a valid professional visa for Brazil that does not expire until 2022.

Steve Campbell


