Seismic data processing is an art and a science. The open-source package Seismic Unix alone offers modules for over 20 migration routines, and other packages are similarly extensive. Before we get to choosing the best migration routine, there are a couple of things anyone handling seismic data should remember. Here are seven tips from my experience:
Constantly Review your Seismic Data
This may seem obvious, but I have seen more than once that people process their data without looking at the results of intermediate steps. Seismic processing is often also called (seismic) imaging; we are an imaging discipline. The physics behind a method may be perfectly valid, but that does not mean applying the operation blindly will improve your data set.
Perform Parameter Testing
Especially when it comes to complicated processes, some people tend to adopt default or published parameters without testing. This will probably do more harm than good to your image.
Do small-scale initial tests on a subset of your data to estimate parameter fitness. Then expand the subset and test the best candidates again. By this process of elimination you will find the best set of parameters for each method.
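The coarse-then-refined sweep described above can be sketched in a few lines. This is a toy illustration, not a Seismic Unix workflow: the low-pass filter, the synthetic trace, and the scoring against a known clean signal are all assumptions made so the example is self-contained (in real processing you would score against a QC criterion, not a known answer).

```python
import numpy as np

def bandlimit(trace, cutoff, dt=0.004):
    """Crude low-pass: zero all FFT components above `cutoff` Hz (toy filter)."""
    spec = np.fft.rfft(trace)
    freqs = np.fft.rfftfreq(trace.size, d=dt)
    spec[freqs > cutoff] = 0.0
    return np.fft.irfft(spec, n=trace.size)

def score(filtered, reference):
    """Parameter fitness: negative RMS misfit to a reference (higher is better)."""
    return -np.sqrt(np.mean((filtered - reference) ** 2))

rng = np.random.default_rng(0)
t = np.arange(0, 1, 0.004)
clean = np.sin(2 * np.pi * 20 * t)                 # 20 Hz "signal"
noisy = clean + 0.5 * rng.standard_normal(t.size)  # signal + noise

# Step 1: coarse sweep on a small subset (a single trace here).
candidates = [10, 30, 60, 120]
coarse = {c: score(bandlimit(noisy, c), clean) for c in candidates}
best = max(coarse, key=coarse.get)

# Step 2: refine around the coarse winner before committing to the full data set.
refined = [best * f for f in (0.75, 1.0, 1.25)]
fine = {c: score(bandlimit(noisy, c), clean) for c in refined}
print("best cutoff:", max(fine, key=fine.get), "Hz")
```

The same two-stage pattern applies whatever the module is: cheap coarse tests to eliminate bad regions of parameter space, then a finer pass on a larger subset.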
Document Every Processing Step
Major processing packages usually have this built in: you define a processing flow that must be saved before it can be applied to the data. However, especially when using Seismic Unix, you may be prone to quickly applying an operation and then forgetting about it, with neither a document to show for it nor any information in the trace headers.
Write it down. Document why you chose these parameters and this method. Especially when you are doing a thesis over 6 months (MSc) or 3 years (PhD), you will be grateful, right before the defense, to have a reminder of why you did anything to your data. Of course, a client will also want to know why you chose these methods if they weren't agreed upon beforehand, but my expertise is more on the thesis side of things.
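One lightweight way to enforce this discipline is a small wrapper that refuses to run a step without a stated rationale, appending both to a log file. This is a sketch of my own devising, not part of any processing package; the `echo`-wrapped command stands in for a real Seismic Unix pipeline, which you would substitute.

```python
import datetime
import pathlib
import subprocess

LOG = pathlib.Path("processing_log.txt")

def run_step(cmd, why):
    """Run a shell pipeline and append the command plus rationale to the log.

    Nothing here is Seismic Unix specific -- the helper just makes it
    impossible to apply a step without writing down the 'why'."""
    if not why.strip():
        raise ValueError("refusing to run an undocumented processing step")
    stamp = datetime.datetime.now().isoformat(timespec="seconds")
    with LOG.open("a") as f:
        f.write(f"{stamp}\n  cmd: {cmd}\n  why: {why}\n")
    return subprocess.run(cmd, shell=True, check=True)

# Hypothetical step; replace the echo with your actual pipeline, e.g.
# "sufilter f=5,10,60,80 < in.su > out.su".
run_step("echo sufilter f=5,10,60,80",
         "bandpass to suppress low-frequency swell noise (parameter test winner)")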
Reverse your Seismic Processing Operation
Noise removal in seismic data can make you feel like a seismic processing surgeon: you try to remove only the bad parts, leaving everything else intact. However, it is just as important to check what you actually took out.
In some cases I have found it very convenient to “reverse the operation”. Take the FK filter as an example: you transform your data into the f-k domain and pick a polygon of data you would like to keep. This is the accept mode of an FK filter. Applying the same polygon in reject mode leaves only the parts you would normally cut away, which very easily shows whether there is any primary energy you missed. Essentially, this makes the rejected output a form of quality control for some filters.
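The accept and reject modes are exact complements, which is what makes the rejected output usable as QC. A minimal numpy sketch, assuming a random array as a stand-in for a shot gather and a rectangular mask as a stand-in for the picked polygon:

```python
import numpy as np

rng = np.random.default_rng(1)
data = rng.standard_normal((64, 64))   # stand-in for a (time, offset) gather

fk = np.fft.fft2(data)                 # to the f-k domain

# Stand-in for the picked polygon: a boolean mask over the f-k plane.
# In practice this comes from your interactive polygon pick.
mask = np.zeros(fk.shape, dtype=bool)
mask[:16, :16] = True                  # "keep" region

accept = np.fft.ifft2(np.where(mask, fk, 0)).real   # accept mode
reject = np.fft.ifft2(np.where(mask, 0, fk)).real   # reject mode

# The two modes sum back to the input exactly. Inspect `reject`:
# any coherent primary energy there means the polygon needs adjusting.
assert np.allclose(accept + reject, data)
```

Because accept + reject reconstructs the input, plotting the reject-mode result costs nothing extra and tells you precisely what the filter removed.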
Read the Manual
This may be a novel idea to some, but reading the manual can actually save you a significant amount of headache. Promax is a prime example of this: there is a document of “known problems” that lists bugs and errors in seismic processing modules, and it should be the first place to check when trying out a new module. Some legacy modules will be impossible to run on machines where you don't have full admin rights; some may even be deprecated entirely.
Keep your Goal in mind
First I contemplated calling this section “the client is king”, but I was sure students would skip that part. As a student you have several clients to work with: your professors, advisors, and examiners, and of course yourself. This can easily lead to information overload.
You may compromise, but know your priorities and stay true to the objective. Each data set and each objective can be achieved in different ways, and there are even more ways to get distracted. Keep your focus.
Ask for Help
There is always someone who is better at what you're doing. At Schlumberger they had a system of “technical seniority” in place, through which you could advance, which made it very clear who was much more knowledgeable than you. Depending on the confidentiality of your work, don't be afraid to ask your entire network. The exchanges I had with experts during my thesis helped me immensely.
I am someone who likes to ask questions, and I think collaborative environments make us excel at what we do. You can read some nice thoughts about “asking questions” on the Agile blog, as well as its counterpart on answering questions.