Double-blended seismic acquisition for high-resolution imaging
Conference paper
Abstract
In current seismic acquisition, the time interval between successive shots is kept sufficiently large to avoid temporal overlap between records. To economize on survey time, the current compromise is to keep the number of shots to an acceptable minimum. The result is a poorly sampled source domain. We propose to abandon the condition of non-overlapping shot records to allow densely sampled, wide-azimuth source distributions (source blending). The rationale is that interpolation is much harder than separation. Blending has significant implications for both quality and economics. The blending concept can also be applied in the detector space (detector blending). The recording channels then consist of a superposition of detected signals, each signal with its own code. With detector blending, many more detectors can be used for the same number of recording channels. This is particularly beneficial when the number of channels is limited, as in wireless recording or ocean-bottom seismometer (OBS) surveys. Double blending is defined as the case where both source blending and detector blending are applied. Double blending allows significant data compression during acquisition. After a deblending procedure, existing processing schemes can be used. The challenge, however, is to design new algorithms that do not require a deblending preprocessing step.
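The blending idea in the abstract can be illustrated with a minimal numerical sketch. The following Python snippet is not from the paper; it assumes, for illustration only, synthetic single-trace shot records and uses random firing-time delays as the per-source code. It shows source blending (superposing time-shifted shot records into one continuous trace) and pseudo-deblending (shifting the blended trace back per source), after which each trace holds its own shot plus incoherent crosstalk from the overlapping shots:

```python
import numpy as np

rng = np.random.default_rng(0)

n_samples = 200  # samples per single-shot record (hypothetical)
n_sources = 3    # shots fired within one blended record

# Synthetic single-shot records: one trace per shot (illustrative data only).
records = rng.standard_normal((n_sources, n_samples))

# Each source gets its own code; here the code is a random firing-time delay.
delays = rng.integers(0, 50, size=n_sources)

# Source blending: superpose the delayed shot records into one long trace.
blended = np.zeros(n_samples + int(delays.max()))
for rec, d in zip(records, delays):
    blended[d:d + n_samples] += rec

# Pseudo-deblending: window the blended trace at each source's firing time.
# Each source's own signal aligns coherently; the other shots remain as
# incoherent blending noise that a deblending step must suppress.
pseudo = np.stack([blended[d:d + n_samples] for d in delays])

# The residual per source is exactly the crosstalk from the other shots.
crosstalk = pseudo - records
print(crosstalk.shape)
```

Because every delayed record fits inside the blended trace, the superposition conserves the total signal; the deblending problem is then to separate the coherent part from the crosstalk, which the abstract argues is easier than interpolating a sparsely shot survey.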
TNO Identifier
464035
Source title
71st EAGE Conference and Exhibition incorporating SPE EUROPEC 2009, 8-11 June 2009, Amsterdam, The Netherlands
Collation
5 p.
Pages
S007-1 - S007-4