CN113256547A - Seismic data fusion method based on wavelet technology - Google Patents
- Publication number: CN113256547A
- Application number: CN202110577205.6A
- Authority
- CN
- China
- Prior art keywords
- wavelet
- image
- seismic data
- fusion
- frequency
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/50—Image enhancement or restoration using two or more images, e.g. averaging or subtraction
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F17/00—Digital computing or data processing equipment or methods, specially adapted for specific functions
- G06F17/10—Complex mathematical operations
- G06F17/14—Fourier, Walsh or analogous domain transformations, e.g. Laplace, Hilbert, Karhunen-Loeve, transforms
- G06F17/148—Wavelet transforms
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20212—Image combination
- G06T2207/20221—Image fusion; Image merging
Abstract
The invention discloses a seismic data fusion method based on wavelet technology, comprising the following steps: S1, acquiring a seismic texture attribute image and performing image processing on it to obtain decomposition coefficients in different directions; S2, fusing the decomposition coefficients in different directions to obtain fused coefficients; and S3, performing an inverse wavelet transform on the fused coefficients to obtain a fused image. The discrete wavelet transform of the invention conserves the energy of the original image in the decomposition coefficients, yielding a low-frequency coefficient and high-frequency coefficients in different directions; the low-frequency part mainly reflects the overall appearance of the image, while the high-frequency part mainly reflects its details. Classifying the decomposition coefficients into high-frequency and low-frequency groups improves the accuracy of the subsequent fusion step, saves computation time in the second-level wavelet transform, and improves efficiency.
Description
Technical Field
The invention relates to the field of seismic exploration, in particular to a seismic data fusion method based on a wavelet technology.
Background
Under the limitations of early seismic exploration techniques, the work of seismic exploration consisted primarily of seismic data processing and seismic data imaging, and seismic data were used mainly to solve geological problems. As seismic exploration techniques have developed, the focus of the work has shifted from solving geological structure problems to interpreting both geological structure and stratigraphic properties. Needless to say, a large amount of geological information is hidden in the strata, and seismic exploration obtains the geological information required by an actual project by mining the seismic attribute data hidden there; the geological information is thus processed and interpreted through seismic attribute data. However, actual underground geological conditions are complex and variable, and too many uncertain factors influence the seismic attribute information, so the geological information required by a project cannot be obtained accurately from a single piece of seismic attribute information alone.
Against this background, the idea of attribute fusion emerged and has since borne fruit. As the fusion idea developed, a variety of multi-source attribute fusion methods appeared and achieved relatively good fusion results for their time; but because the fusion technology was not yet mature, many of these methods have significant limitations and can hardly meet today's requirements for high-resolution attribute fusion. A method for fusing multi-source seismic data that solves the problems in the prior art is therefore urgently needed.
Disclosure of Invention
The invention aims to provide a seismic data fusion method based on wavelet technology, which solves the problems in the prior art and enables multi-source seismic images to be fused.
In order to achieve the purpose, the invention provides the following scheme:
the invention provides a seismic data fusion method based on wavelet technology, which comprises the following steps:
s1, acquiring a seismic texture attribute image, and performing image processing on the seismic texture attribute image to obtain seismic data decomposition coefficients in different directions;
s2, fusing the seismic data decomposition coefficients in different directions to obtain a fused wavelet coefficient and an approximate wavelet coefficient;
and S3, performing inverse wavelet transform based on the fused wavelet coefficient and the approximate wavelet coefficient to obtain a seismic data fusion image.
Further, the seismic data in S1 include, but are not limited to: amplitude, frequency, phase, coherence, dip, Q-value, velocity, envelope, quadrature, semblance, and wave impedance; these seismic data are the data used for fusion.
Further, the image processing method in S1 is: processing the seismic texture attribute image using a two-dimensional discrete wavelet transform, whose expression DWT(j, k1, k2) is:

DWT(j, k1, k2) = a0^(-j) * Σ_m Σ_n f(m, n) ψ(a0^(-j)m - k1, a0^(-j)n - k2)

where j, k1 and k2 are arbitrary integers, a0 is a constant, ψ denotes the two-dimensional wavelet mother function, m denotes the abscissa and n denotes the ordinate.
Further, in S1, when the two-dimensional discrete wavelet transform is performed once on the seismic texture attribute image, the seismic data decomposition coefficients obtained in different directions comprise high-frequency subbands and a low-frequency subband.

Further, the high-frequency subbands comprise a high-frequency subband in the horizontal direction, a high-frequency subband in the vertical direction and a high-frequency subband in the diagonal direction, whose two-dimensional wavelet mother functions are denoted φ1(x, y), φ2(x, y) and φ3(x, y) respectively, where x denotes the abscissa and y denotes the ordinate.
Further, when the seismic texture attribute image is processed a second time, only the low-frequency subband is processed; the two-dimensional mother function of the low-frequency subband is denoted φ(x, y).
Further, in S2, the method for fusing the seismic data decomposition coefficients in different directions is seismic data image fusion based on multi-scale transforms, including but not limited to: Laplacian pyramid decomposition and gradient pyramid decomposition based on the pyramid transform.
Further, in the process of fusing the seismic data decomposition coefficients in different directions:
in the high-frequency subbands, selecting the optimal wavelet coefficients: the coefficient with the largest average of absolute values over the image neighborhood is selected as the high-frequency fused wavelet coefficient;

and in the low-frequency subband, selecting a weighted average of the low-frequency coefficients of the multi-source images as the approximate wavelet coefficient.
Furthermore, in the process of carrying out image fusion on the seismic data based on multi-scale transformation, the fusion is pixel-level image fusion.
Further, in S3, the inverse wavelet transform converts the fused coefficients back into an image matrix; during the inverse transform, the fused wavelet coefficients and the approximate wavelet coefficients are used as the inputs of the inverse wavelet transform.
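The two coefficient-fusion rules described above (largest neighborhood average of absolute values for the high-frequency subbands; weighted average for the low-frequency subband) can be sketched as follows. This is a minimal illustration in Python/NumPy rather than the patent's own code, and the 3x3 neighborhood size is an assumption the patent does not fix:

```python
import numpy as np


def neighborhood_abs_mean(coeff, size=3):
    """Mean of absolute values over a size x size window around each pixel."""
    pad = size // 2
    padded = np.pad(np.abs(coeff), pad, mode="edge")  # replicate borders
    out = np.zeros(coeff.shape)
    for dy in range(size):
        for dx in range(size):
            out += padded[dy:dy + coeff.shape[0], dx:dx + coeff.shape[1]]
    return out / (size * size)


def fuse_high(c1, c2, size=3):
    """High-frequency rule: keep the coefficient whose neighborhood
    mean of absolute values is larger."""
    mask = neighborhood_abs_mean(c1, size) >= neighborhood_abs_mean(c2, size)
    return np.where(mask, c1, c2)


def fuse_low(c1, c2, w=0.5):
    """Low-frequency rule: weighted average of the approximation coefficients."""
    return w * c1 + (1.0 - w) * c2
```

A coefficient from the image with the stronger local detail activity wins in `fuse_high`, while `fuse_low` blends the two overall appearances.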
The invention discloses the following technical effects:
(1) the method has great advantages in extracting high-frequency-domain feature information from the image, and improves the degree to which the spatial and frequency characteristics of the image are retained;

(2) the discrete wavelet transform conserves the energy of the original image in the decomposition coefficients, yielding a low-frequency coefficient and high-frequency coefficients in different directions (horizontal, vertical and diagonal); the low-frequency part mainly reflects the overall appearance of the image and the high-frequency part mainly reflects its details, so classifying the coefficients into high-frequency and low-frequency groups improves the accuracy of the subsequent fusion, saves computation time in the second wavelet transform, and improves efficiency;

(3) by appropriately collecting the original image information, decomposition coefficients from different images are suitably combined into new coefficients, improving the integration of multi-source information;

(4) the feature information of the images is retained in the combined coefficients; once the coefficients are combined, the final fused image is obtained by the inverse discrete wavelet transform, improving fusion efficiency.
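The energy-conservation claim in effect (2) can be checked numerically. The sketch below assumes an orthonormal Haar wavelet (the patent does not specify a wavelet basis); for an orthonormal transform, Parseval's relation makes the total energy of the four subbands equal the energy of the image:

```python
import numpy as np


def haar2d(img):
    """One level of the orthonormal 2-D Haar transform.

    Splits the image into 2x2 cells; with the 1/2 normalisation the
    transform is orthonormal, so coefficient energy equals image energy.
    """
    a = img[0::2, 0::2]   # top-left pixel of each 2x2 cell
    b = img[0::2, 1::2]   # top-right
    c = img[1::2, 0::2]   # bottom-left
    d = img[1::2, 1::2]   # bottom-right
    ll = (a + b + c + d) / 2.0   # low frequency: overall appearance
    hl = (a - b + c - d) / 2.0   # horizontal detail
    lh = (a + b - c - d) / 2.0   # vertical detail
    hh = (a - b - c + d) / 2.0   # diagonal detail
    return ll, hl, lh, hh


# Parseval check: total energy of the four subbands equals the image energy.
img = np.arange(64.0).reshape(8, 8)
subbands = haar2d(img)
assert np.isclose(np.sum(img ** 2), sum(np.sum(s ** 2) for s in subbands))
```

The same relation holds for any orthonormal wavelet; for biorthogonal wavelets the energies differ slightly, which is why the choice of basis matters for the claim.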
Drawings
To illustrate the embodiments of the present invention or the technical solutions in the prior art more clearly, the drawings needed in the embodiments are briefly described below. The drawings in the following description are only some embodiments of the present invention; those skilled in the art can obtain other drawings from them without creative effort.
FIG. 1 is a diagram illustrating a pixel-level image fusion architecture in this embodiment;
FIG. 2 is a flowchart of a two-dimensional image fusion process based on wavelet transform in this embodiment;
FIG. 3 is a schematic diagram of the frequency distribution after one discrete wavelet transform in this embodiment;
FIG. 4 is a schematic diagram of the distribution of coefficients after three discrete wavelet transforms in this embodiment;
FIG. 5 is a slice image of the source region to be fused in the present embodiment, wherein (a) is an amplitude attribute slice image and (b) is a frequency (10Hz) attribute slice image;
FIG. 6 is a diagram illustrating a wavelet fusion result of two slice images with different attributes in this embodiment;
fig. 7 is a schematic diagram showing comparison of the seismic texture image fusion results in this embodiment, where (a) is an amplitude attribute slice, (b) is a frequency (10Hz) attribute slice, and (c) is a seismic attribute fusion result based on wavelet transform.
Detailed Description
Reference will now be made in detail to various exemplary embodiments of the invention. The detailed description should not be construed as limiting the invention, but rather as a more detailed description of certain aspects, features and embodiments of the invention.
It is to be understood that the terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. Further, for numerical ranges in this disclosure, it is understood that each intervening value, between the upper and lower limit of that range, is also specifically disclosed. Every smaller range between any stated value or intervening value in a stated range and any other stated or intervening value in a stated range is encompassed within the invention. The upper and lower limits of these smaller ranges may independently be included or excluded in the range.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. Although only preferred methods and materials are described herein, any methods and materials similar or equivalent to those described herein can be used in the practice or testing of the present invention. All documents mentioned in this specification are incorporated by reference herein for the purpose of disclosing and describing the methods and/or materials associated with the documents. In case of conflict with any incorporated document, the present specification will control.
It will be apparent to those skilled in the art that various modifications and variations can be made in the specific embodiments of the present disclosure without departing from the scope or spirit of the disclosure. Other embodiments will be apparent to those skilled in the art from consideration of the specification. The specification and examples are exemplary only.
As used herein, the terms "comprising," "including," "having," "containing," and the like are open-ended terms that mean including, but not limited to.
The "parts" in the present invention are all parts by mass unless otherwise specified.
Example 1
The pixel-level image fusion approach is the foundation of all image fusion and is the lowest level of fusion. In contrast to other levels of image fusion, it processes the original images directly; the images to be fused are sometimes pre-processed to improve detection performance and accuracy, but this is not necessary. The key requirement is that the registration of the images participating in the fusion must be accurate, and at least at pixel-level precision.

Pixel-level image fusion organically integrates the complementary information between the images to be fused into a single representation, thereby reducing or suppressing the ambiguity, defects, uncertainty and errors that may exist in the interpretation of the perceived object or environment. It makes full use of the data provided by the multi-source images, ultimately improving the effectiveness and accuracy of feature extraction, classification, fusion and target identification. The framework of pixel-level fusion is shown in Figure 1.
Pixel-level fusion of multi-source images is divided into two categories:
1) image fusion algorithm based on spatial domain
Image fusion algorithms based on the spatial domain are simple to implement and fast to run, and in certain specific situations they can obtain good fusion results; in most cases, however, they cannot produce a satisfactory result. Common representative methods include: traditional image fusion by direct pixel averaging or weighted averaging, the Principal Component Analysis (PCA) dimensionality-reduction algorithm, color-space fusion methods based on RGB mapping and on the IHS color space, modulation-based image fusion, logical filtering, optimization methods, and artificial neural network methods.
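As a point of comparison with the transform-domain methods discussed next, the two simplest spatial-domain methods named above, weighted averaging and PCA-derived weighting, can be sketched in a few lines. This is an illustrative reconstruction under common formulations, not the patent's code:

```python
import numpy as np


def weighted_average_fusion(img1, img2, w1=0.5):
    """Spatial-domain fusion by per-pixel weighted averaging.

    w1 is the weight of the first image; the second image gets 1 - w1.
    """
    return w1 * img1 + (1.0 - w1) * img2


def pca_weights(img1, img2):
    """Derive fusion weights from the dominant eigenvector of the 2x2
    covariance matrix of the two images (a simple PCA weighting scheme)."""
    data = np.stack([img1.ravel(), img2.ravel()])
    cov = np.cov(data)                        # 2x2 covariance matrix
    vals, vecs = np.linalg.eigh(cov)          # eigenvalues in ascending order
    v = np.abs(vecs[:, np.argmax(vals)])      # dominant eigenvector
    return v / v.sum()                        # normalise so weights sum to 1
```

The PCA weights can then be passed to `weighted_average_fusion` as `w1 = weights[0]`; the image carrying more variance contributes more to the fused result.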
2) Image fusion algorithm based on transform domain
In the recent development of transform-domain image fusion algorithms, those based on multi-scale decomposition have been the focus of research, so most of the mature fusion algorithms at present are multi-scale-decomposition image fusion algorithms. Commonly used methods include pyramid-transform approaches such as the Laplacian pyramid transform and gradient pyramid decomposition. The wavelet transform, an emerging technique that has gradually matured in recent years, offers a superior alternative: wavelet decomposition has the advantages of compact support, symmetry and orthogonality, and therefore achieves better image fusion performance than fusion methods such as gradient pyramid fusion, RGB fusion and PCA.
Although each of the two classes of methods has its own advantages and disadvantages, they are not completely independent; many fusion algorithms combine them organically so that each contributes its strengths toward a finer, higher-resolution fusion result.
In the wavelet fusion method based on the wavelet transform, a wavelet is, as the name implies, a very small wave. Unlike the Fourier transform, whose mother functions are trigonometric functions of infinite extent, the wavelet transform uses a decaying mother wavelet. It is a multi-scale analysis method in which the area of the window function is constant while its extent in the two domains varies: the wavelet transform has higher frequency resolution and lower time resolution in the low-frequency part, and lower frequency resolution but higher time resolution in the high-frequency part. It is this property that gives the wavelet transform its unique ability to adapt dynamically to a signal.
Wavelet transforms inherit and develop the mathematical ideas of the Fourier transform. Analyzing a function (signal) with a short-time (windowed) Fourier transform is equivalent to using a magnifying glass of constant shape, size and magnification, translated over the time-frequency plane in which the signal lies, to observe frequency feature information over a fixed length of time. The problem is that although the short-time (windowed) Fourier transform solves the time-domain localization problem of the transform, the size and shape of the window function are fixed, i.e. the window function has no dynamic adaptation capability; it cannot be tailored to different problems.
In practice, each scenario imposes its own requirements. For signals whose useful feature information lies in the high frequencies, the waveform is relatively narrow, so the time-domain resolution should be made as fine as possible (equivalently, a sufficiently narrow time-domain window should be used) to observe the high-frequency feature information effectively. For low-frequency information, the waveform is relatively wide, so the time window needs to be relatively long (a wider time-domain window) to capture the low-frequency content completely. With a fixed short-time (windowed) Fourier transform, choosing a wide window function makes the low-frequency components clearly visible but performs very poorly on the high-frequency part; choosing a narrower window function resolves the signal well at high frequencies, but the time resolution cannot accommodate the low frequencies.

Thus, a method truly suitable for multiple scenarios must make the length and width of the window function variable, and the wavelet transform was developed to realize exactly this. The wavelet functions are obtained from a mother wavelet through dilation (scaling) and time-domain translation; with this flexible window function, high- and low-frequency information can be extracted from a signal very conveniently.

In practical applications, since ordinary images, seismic texture image data and the like are mostly two-dimensional matrices, the wavelet transform of one-dimensional signals must be extended to the two-dimensional plane; image fusion therefore uses the two-dimensional wavelet, an extension of the one-dimensional wavelet transform.
Let f(t1, t2) ∈ L2(R2) represent a two-dimensional signal on R2, where t1 and t2 correspond to the abscissa and the ordinate respectively, and let ψ(t1, t2) denote the mother function of the two-dimensional wavelet. Through resolution dilation and two-dimensional displacement, the definition of the two-dimensional continuous wavelet is obtained as formula (1):

ψ_(a,τ1,τ2)(t1, t2) = (1/a) ψ((t1 - τ1)/a, (t2 - τ2)/a)    (1)

where a is the scale (resolution) parameter, τ1 and τ2 are translation parameters, ψ_(a,τ1,τ2) represents the two-dimensional continuous wavelet obtained after resolution dilation and two-dimensional displacement, and the factor 1/a is a normalization factor introduced to keep the energy conserved before and after the wavelet is dilated.

The two-dimensional continuous wavelet transform is then obtained as formula (2):

WT_f(a, τ1, τ2) = (1/a) ∫∫ f(t1, t2) ψ*((t1 - τ1)/a, (t2 - τ2)/a) dt1 dt2    (2)

Having derived the continuous transform (2) of the two-dimensional wavelet, the inverse transform of the two-dimensional wavelet follows, as shown in formula (3):

f(t1, t2) = (1/C_ψ) ∫ (1/a^3) [ ∫∫ WT_f(a, τ1, τ2) ψ_(a,τ1,τ2)(t1, t2) dτ1 dτ2 ] da    (3)

where C_ψ is the admissibility constant of the mother wavelet.
Two-dimensional discrete wavelet transform

To realize the two-dimensional wavelet transform in a computer, a two-dimensional discrete wavelet must be implemented. After parameter discretization (a = a0^j, τ1 = k1·a0^j·τ10, τ2 = k2·a0^j·τ20), the two-dimensional continuous wavelet transform becomes the two-dimensional discrete-parameter wavelet transform (DPWT), shown in equation (4):

DPWT(j, k1, k2) = a0^(-j) ∫∫ f(t1, t2) ψ(a0^(-j)t1 - k1·τ10, a0^(-j)t2 - k2·τ20) dt1 dt2    (4)

Discretizing the integral into a summation then gives the wavelet transform in two-dimensional discrete space. The two-dimensional discrete-space wavelet transform (DSWT) is shown in equation (5):

DSWT(j, k1, k2) = a0^(-j) Σ_m Σ_n f(m, n) ψ(a0^(-j)m - k1·τ10, a0^(-j)n - k2·τ20)    (5)

Finally, specializing the two-dimensional discrete-space wavelet transform yields the two-dimensional discrete wavelet transform, shown in equation (6):

DWT(j, k1, k2) = 2^(-j) Σ_m Σ_n f(m, n) ψ(2^(-j)m - k1, 2^(-j)n - k2)    (6)

where a0 = 2 and τ10 = τ20 = 1.
Wavelet transform is used for image fusion:
the use of wavelet transform is to be used for image fusion. Firstly, decomposing an image into high frequency and low frequency through wavelet transformation, then respectively executing a fusion process, and finally, carrying out wavelet inverse transformation to convert the image into an image matrix.
In the image fusion process, in order to keep the characteristics of the multi-source image as much as possible, the wavelet decomposition of a high frequency domain is selected to be optimal, a coefficient with a large average of the image neighborhood absolute values is selected as a fusion wavelet coefficient, and the wavelet decomposition in a low frequency domain is selected as a weighted average of the low frequency coefficients of the multi-source image. The high frequency reflects the detailed part of the data and the low frequency reflects the image profile.
During the inverse transform, the fused wavelet coefficients and the approximation wavelet coefficients are used as inputs to the inverse wavelet transform. After the fused image is output, it is further processed. The image data fusion method based on wavelet transformation has higher operation efficiency and more complete image processing effect, and most or all characteristic information in the source image is reserved in the fusion result. It has high use value in extensive research field at present.
The two-dimensional image fusion based on wavelet transformation mainly comprises three steps:
(1) performing wavelet decomposition on each original image to be fused to obtain decomposition coefficients in different directions;
(2) carrying out fusion processing on coefficients in different directions obtained by decomposition;
(3) and performing wavelet inverse transformation on the fused coefficients to obtain a final result image.
The image fusion process based on two-dimensional wavelet transform is shown in fig. 2.
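The three steps above can be sketched end to end. The following is a minimal illustration assuming a one-level orthonormal Haar wavelet and a simplified per-pixel maximum-absolute-value selection rule for the high-frequency coefficients (the patent's rule uses a neighborhood average); it is a sketch of the technique, not the patent's implementation:

```python
import numpy as np


def haar2d(img):
    """Step (1): one-level orthonormal 2-D Haar decomposition -> (LL, HL, LH, HH)."""
    a, b = img[0::2, 0::2], img[0::2, 1::2]
    c, d = img[1::2, 0::2], img[1::2, 1::2]
    return ((a + b + c + d) / 2, (a - b + c - d) / 2,
            (a + b - c - d) / 2, (a - b - c + d) / 2)


def ihaar2d(ll, hl, lh, hh):
    """Step (3): inverse transform, rebuilding the image matrix from the subbands."""
    h, w = ll.shape
    img = np.empty((2 * h, 2 * w))
    img[0::2, 0::2] = (ll + hl + lh + hh) / 2
    img[0::2, 1::2] = (ll - hl + lh - hh) / 2
    img[1::2, 0::2] = (ll + hl - lh - hh) / 2
    img[1::2, 1::2] = (ll - hl - lh + hh) / 2
    return img


def fuse(img1, img2):
    """Steps (1)-(3): decompose both images, fuse the coefficients by direction,
    and inverse-transform the fused coefficients."""
    s1, s2 = haar2d(img1), haar2d(img2)
    ll = 0.5 * s1[0] + 0.5 * s2[0]                      # low frequency: weighted average
    details = [np.where(np.abs(d1) >= np.abs(d2), d1, d2)
               for d1, d2 in zip(s1[1:], s2[1:])]       # high frequency: larger magnitude wins
    return ihaar2d(ll, *details)
```

As a sanity check on the transform pair, `ihaar2d(*haar2d(img))` reconstructs `img` exactly, and fusing an image with itself returns the image unchanged.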
Since a digital image is a two-dimensional matrix, the image matrix is generally assumed to be of size N x N with N = 2^n (n a positive integer). One wavelet transform then produces four subband regions, each one quarter the size of the original image and each containing the corresponding band coefficients, as shown in formulas (7) to (10). This process is equivalent to sampling simultaneously in the horizontal and vertical directions; Fig. 3 shows the frequency distribution after one discrete two-dimensional wavelet transform. This constitutes a mathematical prototype of the wavelet transform of an image.
(1) Energy-concentrated band LL:

c_(j+1)(k1, k2) = Σ_m Σ_n h(m - 2k1) h(n - 2k2) c_j(m, n)    (7)

(2) Horizontal high-frequency edge information band HL:

d1_(j+1)(k1, k2) = Σ_m Σ_n g(m - 2k1) h(n - 2k2) c_j(m, n)    (8)

(3) Vertical high-frequency edge information band LH:

d2_(j+1)(k1, k2) = Σ_m Σ_n h(m - 2k1) g(n - 2k2) c_j(m, n)    (9)

(4) High-frequency information band HH in the diagonal direction:

d3_(j+1)(k1, k2) = Σ_m Σ_n g(m - 2k1) g(n - 2k2) c_j(m, n)    (10)

where c_j denotes the approximation coefficients at level j, h is the low-pass filter and g is the high-pass filter.
the principle of wavelet transform for image processing is to convolve an image with low-pass and high-pass filters, respectively, and then sample either of them. Thus, the image is decomposed into a low frequency sub-band and three high frequency sub-bands by one layer of wavelet transform. Wherein the low frequency sub-band LL1The image is obtained by low-pass filtering in both the horizontal direction and the vertical direction of the image; high-frequency subband HL1The image is obtained by carrying out high-pass filtering in the horizontal direction and low-pass filtering in the vertical direction of the image; high-frequency sub-band LH1The image filtering method is characterized by comprising the following steps of carrying out low-pass filtering on the horizontal direction of an image and carrying out high-pass filtering on the vertical direction of the image; high frequency subband HH1The image is obtained by carrying out high-pass filtering on the horizontal direction and the vertical direction of the image. The resolution of each sub-band being that of the original image
Similarly, for a second wavelet transform only the low-frequency subband needs to be transformed, decomposing the LL1 subband into LL2, LH2, HL2 and HH2, each with 1/16 the resolution of the original image; wavelet transforms of three or more levels follow by analogy. Thus one level of wavelet transform yields four subbands, two levels yield seven subbands, and N-level decomposition yields 3N + 1 subbands. Since computation speed is limited, one or two wavelet decompositions usually suffice to achieve the required processing result. Fig. 4 shows the coefficient distribution after a three-level wavelet transform.
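The subband bookkeeping described above (only the LL band is decomposed again, so N levels yield 3N + 1 subbands) can be demonstrated directly. The sketch below assumes a Haar wavelet and an image side length divisible by 2^N; neither assumption comes from the patent:

```python
import numpy as np


def haar2d(img):
    """One-level orthonormal 2-D Haar decomposition -> (LL, HL, LH, HH)."""
    a, b = img[0::2, 0::2], img[0::2, 1::2]
    c, d = img[1::2, 0::2], img[1::2, 1::2]
    return ((a + b + c + d) / 2, (a - b + c - d) / 2,
            (a + b - c - d) / 2, (a - b - c + d) / 2)


def wavedec2(img, levels):
    """Multi-level decomposition: only the LL band is decomposed again,
    so N levels yield 3N + 1 subbands in total."""
    subbands = []
    ll = img
    for _ in range(levels):
        ll, hl, lh, hh = haar2d(ll)
        subbands.extend([hl, lh, hh])   # three detail bands per level
    subbands.append(ll)                 # final approximation band
    return subbands
```

Each level halves the side length of the approximation band, so after three levels a 16 x 16 image leaves a 2 x 2 approximation band, matching the 1/4-per-level resolution reduction described in the text.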
In summary, the image fusion method based on the two-dimensional wavelet transform technique is similar to pyramid decomposition but superior to it, and when applied to image fusion it has the following four properties:
(1) the method has great advantages in extracting the high-frequency domain characteristic information of the image, and well keeps the spatial characteristic and the frequency characteristic of the image.
(2) The coefficient of the discrete wavelet decomposed image keeps the energy conservation of the original image, the low frequency coefficient and the high frequency coefficient in different directions (horizontal, vertical and diagonal), the low frequency part mainly reflects the general image, and the high frequency part mainly reflects the details of the image.
(3) Information from the original images can be collected appropriately: decomposition coefficients from the different images are combined to obtain new coefficients.
(4) The feature information of the images is preserved in the combined coefficients; once the coefficients are combined, the final fused image is obtained by the inverse discrete wavelet transform.
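The four properties above suggest a straightforward fusion pipeline: decompose each source image, combine the coefficients, and invert. The sketch below is an illustrative Python/NumPy version using a one-layer Haar transform (an assumption for simplicity); keeping the larger-magnitude high-frequency coefficient and averaging the low-frequency coefficients is one common rule, not the only one:

```python
import numpy as np

def haar_dwt2(img):
    """One-layer 2-D Haar transform -> (LL, (LH, HL, HH))."""
    img = np.asarray(img, dtype=float)
    lo = (img[:, 0::2] + img[:, 1::2]) / np.sqrt(2)
    hi = (img[:, 0::2] - img[:, 1::2]) / np.sqrt(2)
    return ((lo[0::2] + lo[1::2]) / np.sqrt(2),
            ((lo[0::2] - lo[1::2]) / np.sqrt(2),
             (hi[0::2] + hi[1::2]) / np.sqrt(2),
             (hi[0::2] - hi[1::2]) / np.sqrt(2)))

def haar_idwt2(ll, bands):
    """Inverse of haar_dwt2: rebuild the full-resolution image."""
    lh, hl, hh = bands
    r, c = ll.shape
    lo = np.empty((2 * r, c))
    hi = np.empty((2 * r, c))
    lo[0::2], lo[1::2] = (ll + lh) / np.sqrt(2), (ll - lh) / np.sqrt(2)
    hi[0::2], hi[1::2] = (hl + hh) / np.sqrt(2), (hl - hh) / np.sqrt(2)
    img = np.empty((2 * r, 2 * c))
    img[:, 0::2], img[:, 1::2] = (lo + hi) / np.sqrt(2), (lo - hi) / np.sqrt(2)
    return img

def wavelet_fuse(img_a, img_b):
    """Decompose both images, combine the coefficients, inverse-transform:
    the low-frequency coefficients are averaged, and for each high-frequency
    sub-band the larger-magnitude coefficient is kept."""
    (ll_a, det_a), (ll_b, det_b) = haar_dwt2(img_a), haar_dwt2(img_b)
    ll_f = (ll_a + ll_b) / 2
    det_f = tuple(np.where(np.abs(a) >= np.abs(b), a, b)
                  for a, b in zip(det_a, det_b))
    return haar_idwt2(ll_f, det_f)
```

Because the Haar transform is orthonormal, fusing an image with itself reconstructs it exactly, consistent with the energy-preservation property in (2).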
Embodiment 2
Compared with traditional seismic image fusion methods, the wavelet-transform-based image fusion technique used here for fusing seismic texture attribute images is a pixel-level technique that fuses directly at the data level. Traditional fusion methods have certain limitations: for example, a color-space method such as conventional RGB fusion loses much of the information in the original images when more than three attributes are fused, whereas the wavelet-transform-based method retains most of the feature information contained in the original data. This is the advantage of the wavelet transform.
From the use of the wavelet transform for image fusion in Embodiment 1, it is known that, because of the limits on operation speed and the frequency separation after decomposition, only one or two layers of wavelet decomposition are selected in most cases. In fact, the number of decomposition layers determines the number of frequency layers, which has a certain effect on the fusion result.
Embodiment 2 implements multi-source image fusion based on the wavelet transform with MATLAB tooling. Finally, the wavelet-transform-based multi-source attribute fusion technique is used to fuse seismic texture images of different attributes obtained from slices of an actual work area.
Program interface and functional overview:
To show the difference and contrast before and after fusion more intuitively, a simple front-end interface is built with the GUI module in MATLAB, and the relevant functions are wired together to perform the wavelet image fusion. The program consists of three main parts. The first part loads the two original images: the native MATLAB library function uigetfile is called to interactively select a picture file, which is then read, loaded and displayed. The second, core part is the two-dimensional wavelet decomposition, fusion and reconstruction, carried out by a series of internal processing steps. The final part displays the result image on the software interface. The interface is simple; it mainly displays the images to be fused and encapsulates the specific flow of the two-dimensional wavelet transform algorithm.
The program foreground comprises an image display area and a control panel area. To present the image fusion process effectively, concisely and clearly, the image display area provides two display controls, one for each loaded image; the control panel area holds the operations for loading the two images and for wavelet fusion and reconstruction. After a successful fusion, a wavelet reconstruction result dialog pops up showing the fusion result, which can conveniently be saved and backed up.
Example of image fusion: first the experimental images are loaded. Because the two pictures were taken at different focal lengths, each has certain defects in display quality, so the two-dimensional wavelet-transform-based image fusion algorithm is needed to integrate the features of the images to be processed.
The two-dimensional wavelet-transform-based fusion method merges the features of the two source images, overcomes their different focal lengths, and yields a complete, clear image in which the finer features of both inputs are comprehensively displayed. This indicates that the wavelet decomposition and fusion method does not cause obvious information loss.
In an actual calculation example, the seismic image used in this embodiment is an amplitude-attribute work-area slice from an actual seismic work area of a geophysical prospecting company in the southwest; after processing, a work-area slice of the frequency (10 Hz) attribute, revealing other hidden feature information, is also obtained. In this embodiment the two images with different feature information are fused using the two-dimensional wavelet-transform-based image fusion method.
Actual example of image fusion:
The two source work-area slice images with different attributes are shown in Fig. 5 as the amplitude-attribute slice (a) and the frequency (10 Hz) attribute slice (b). The two images clearly contain complementary feature regions, such as distinct river channels, river beds, and faults large and small. In actual work, fusing seismic texture slice images of different attributes provides reliable information and a basis for geological attribute analysis and interpretation, such as oil and gas prediction, well drilling, reservoir description and lithology analysis. Fig. 6 shows the fused image of the two attribute slices obtained with the wavelet-transform-based image fusion method.
finally, the amplitude attribute slice (a) and the frequency (10Hz) attribute slice (b) of the seismic texture attribute slice are compared with the image (c) generated by image fusion based on wavelet transformation, and the comparison result is shown in fig. 7.
In Fig. 7, the amplitude-attribute slice (a) clearly shows lithology and faults, while the frequency-attribute slice (b) clearly shows river-channel and riverbed deposition. After wavelet fusion, the features of both images are well combined and clearly displayed, and some features not easily recognized by the human eye also appear in the fused image. This shows that the wavelet transform decomposes the original images into a series of sub-images with different spatial resolutions and frequency-domain characteristics, reflecting the feature variation of the source images; fusing over multiple decomposition layers and frequency bands gives a better fusion result. The two-dimensional wavelet-based fusion produces a clearer image that integrates the features of both sources, demonstrating the applicability of the wavelet-transform-based fusion method to seismic texture image fusion and its suitability for practical use in this field.
This embodiment mainly introduces a multi-source image fusion method and its practical use on seismic texture attribute image data. A large amount of geological information is hidden in seismic attribute data, but analysis of a single attribute leads to ambiguity and uncertainty of judgment because of the non-uniqueness of the interpretation result. Collecting and fusing as much effective seismic attribute data as possible therefore makes the geological structure easier and more accurate to judge, so that geological work such as reservoir prediction, oil and gas prediction, well drilling and lithology determination can be carried out accurately and effectively. In recent years, as research on wavelet-transform-based image fusion has continued, many achievements have been made in the image fusion field; this work focuses on the wavelet-transform-based method among the lowest-level, pixel-level fusion methods.
The above-described embodiments merely illustrate preferred embodiments of the present invention and do not limit its scope. Various modifications and improvements of the technical solutions of the present invention may be made by those skilled in the art without departing from its spirit, and all such solutions fall within the scope of protection defined by the claims.
Claims (10)
1. A seismic data fusion method based on wavelet technology, characterized by comprising the following steps:
s1, acquiring a seismic texture attribute image, and performing image processing on the seismic texture attribute image to obtain seismic data decomposition coefficients in different directions;
s2, fusing the seismic data decomposition coefficients in different directions to obtain a fused wavelet coefficient and an approximate wavelet coefficient;
and S3, performing inverse wavelet transform based on the fused wavelet coefficient and the approximate wavelet coefficient to obtain a seismic data fusion image.
2. The wavelet-technology-based seismic data fusion method of claim 1, wherein: the seismic data in S1 include, but are not limited to: amplitude, frequency, phase, coherence, dip, Q-value, velocity, envelope, quadrature, semblance, and wave impedance; any of these data may be used for fusion.
3. The wavelet-technology-based seismic data fusion method of claim 1, wherein: the image processing method in S1 comprises: processing the seismic texture attribute image using a two-dimensional discrete wavelet transform, whose expression DWT(j, k1, k2) is:

DWT(j, k1, k2) = 2^(-j) Σx Σy f(x, y) φ(2^(-j)x - k1, 2^(-j)y - k2)

where f(x, y) is the image, j is the scale parameter, and k1 and k2 are the translation parameters.
4. A method of wavelet-based seismic data fusion as recited in claim 3, wherein: in S1, when the two-dimensional discrete wavelet transform is performed on the seismic texture attribute image once, the obtained seismic data decomposition coefficients in different directions include a high-frequency subband and a low-frequency subband.
5. A method of wavelet-based seismic data fusion as recited in claim 3, wherein: the high-frequency sub-bands comprise a horizontal high-frequency sub-band, a vertical high-frequency sub-band and a diagonal high-frequency sub-band:

φ1(x, y) = φ(x)ψ(y), wherein φ1 represents the two-dimensional wavelet mother function in the horizontal direction, x represents the abscissa, and y represents the ordinate;

φ2(x, y) = ψ(x)φ(y), wherein φ2 represents the two-dimensional wavelet mother function in the vertical direction;

φ3(x, y) = ψ(x)ψ(y), wherein φ3 represents the two-dimensional wavelet mother function in the diagonal direction;

where φ and ψ denote the one-dimensional scaling function and wavelet function, respectively.
6. The wavelet-technology-based seismic data fusion method of claim 4, wherein: when the seismic texture attribute image is processed a second time, only the low-frequency sub-band is processed, and the expression of the low-frequency sub-band is:

φ(x, y) = φ(x)φ(y)

where φ represents the mother function of the two-dimensional wavelet.
7. The wavelet-technology-based seismic data fusion method of claim 1, wherein: in S2, the method for fusing the seismic data decomposition coefficients in different directions comprises: seismic data image fusion based on multi-scale transforms, including but not limited to: Laplacian pyramid decomposition and gradient pyramid decomposition based on the pyramid transform.
8. The seismic data fusion method based on wavelet technology as claimed in claim 4 or 7, wherein, in the process of fusing the seismic data decomposition coefficients in different directions:
in the high-frequency sub-bands, the optimal wavelet coefficients are selected: the coefficient with the maximum mean of absolute values over its image neighborhood is taken as the high-frequency fusion wavelet coefficient;
and in the low-frequency sub-band, a weighted average of the low-frequency coefficients of the multi-source images is taken as the approximate wavelet coefficient.
9. The wavelet-technology-based seismic data fusion method of claim 7, wherein: in the process of carrying out seismic data image fusion based on multi-scale transformation, the fusion is pixel-level image fusion.
10. The wavelet-technology-based seismic data fusion method of claim 1, wherein: in S3, the inverse wavelet transform converts the coefficients back into an image matrix, and in the inverse transformation the fused wavelet coefficients and the approximate wavelet coefficients are used as the inputs.
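For illustration only (this sketch is not part of the claims), the selection rules of claim 8 can be written out in Python/NumPy as follows; the 3×3 neighborhood and the equal weighting are assumptions, since the claims fix neither the window size nor the weights:

```python
import numpy as np

def neighborhood_abs_mean(coeff, size=3):
    """Mean of absolute values over a size x size window (zero-padded edges)."""
    pad = size // 2
    padded = np.pad(np.abs(coeff), pad)
    out = np.zeros(coeff.shape, dtype=float)
    for di in range(size):          # sum the shifted copies of the padded array,
        for dj in range(size):      # which is equivalent to a box filter
            out += padded[di:di + coeff.shape[0], dj:dj + coeff.shape[1]]
    return out / (size * size)

def fuse_high(c_a, c_b):
    """High-frequency rule of claim 8: at each position keep the coefficient
    whose neighborhood mean of absolute values is larger."""
    return np.where(neighborhood_abs_mean(c_a) >= neighborhood_abs_mean(c_b),
                    c_a, c_b)

def fuse_low(c_a, c_b, w=0.5):
    """Low-frequency rule of claim 8: weighted average of the low-frequency
    coefficients of the multi-source images."""
    return w * c_a + (1.0 - w) * c_b
```

A strong isolated coefficient (high neighborhood energy) in one source wins locally in the high-frequency rule, while smooth background regions fall back to the other source.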
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110577205.6A CN113256547B (en) | 2021-05-26 | 2021-05-26 | Seismic data fusion method based on wavelet technology |
Publications (2)
Publication Number | Publication Date |
---|---|
CN113256547A true CN113256547A (en) | 2021-08-13 |
CN113256547B CN113256547B (en) | 2022-09-23 |
Family
ID=77184504
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110577205.6A Active CN113256547B (en) | 2021-05-26 | 2021-05-26 | Seismic data fusion method based on wavelet technology |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113256547B (en) |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107194903A (en) * | 2017-04-25 | 2017-09-22 | 阜阳师范学院 | A kind of multi-focus image fusing method based on wavelet transformation |
CN108388928A (en) * | 2018-03-27 | 2018-08-10 | 西南石油大学 | A kind of seismic attribute fusion method based on triangle kernel function |
US20200326442A1 (en) * | 2019-04-09 | 2020-10-15 | Petrochina Company Limited | Seismic full horizon tracking method, computer device and computer-readable storage medium |
Non-Patent Citations (2)
Title |
---|
PANG Shiming: "Image fusion technology and its application in petroleum exploration", Progress in Exploration Geophysics * |
GUO Biao et al.: "Research on multi-source information fusion prediction of seismic attributes based on wavelet transform", Mineralogy and Petrology * |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114511830A (en) * | 2022-01-10 | 2022-05-17 | 上海应用技术大学 | Unmanned vehicle vision identification method for garbage classification |
CN114511830B (en) * | 2022-01-10 | 2024-05-17 | 上海应用技术大学 | Unmanned vehicle visual recognition method for garbage classification |
Also Published As
Publication number | Publication date |
---|---|
CN113256547B (en) | 2022-09-23 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||