CN107065006A - Seismic signal coding method based on online dictionary updating - Google Patents
Classifications
- G: PHYSICS
- G01: MEASURING; TESTING
- G01V: GEOPHYSICS; GRAVITATIONAL MEASUREMENTS; DETECTING MASSES OR OBJECTS; TAGS
- G01V1/00: Seismology; seismic or acoustic prospecting or detecting
- G01V1/28: Processing seismic data, e.g. analysis, for interpretation, for correction
- G01V1/288: Event detection in seismic signals, e.g. microseismics
- G01V2210/00: Details of seismic processing or analysis
- G01V2210/10: Aspects of acoustic signal generation or detection
- G01V2210/14: Signal detection
Abstract
A seismic signal coding method based on online dictionary updating. The method belongs to the field of seismic signal data coding and transmission; it solves the dictionary-transmission problem that arises when dictionary learning and sparse representation are used for seismic signal coding, and it can be applied to any ground-based survey that relies on seismic measurements. The invention comprises: (1) dividing the input seismic signal into groups in chronological order and sparse-coding each group with the dictionary held in a cache to obtain sparse coefficients; (2) quantizing and entropy-coding the sparse coefficients of step (1); (3) reading the reconstructed data of the previous P transmitted groups from the cache and performing dictionary learning together with the sparse coefficients of the currently transmitted group, thereby updating the dictionary needed for the sparse representation of the next group. Because the dictionary is updated online, dictionary information need not be transmitted in real time while an effective sparse representation of the signal is still guaranteed; the data rate is therefore reduced substantially, and the method is suitable for all kinds of high-speed seismic acquisition scenarios.
Description
Technical field
The invention belongs to the field of seismic signal data transmission and relates in particular to a lossy seismic signal coding method.
Background art
Surveying based on seismic measurements is one of the most effective current methods for probing subsurface structure and mineral resources. A single survey in which the ground equipment records seismic signals can produce more than 100 TB of data, while the available transmission bandwidth is extremely limited; the data volume must therefore be reduced by seismic signal coding before transmission. In the prior art, a seismic signal coding method based on the discrete cosine transform has been proposed that achieves a compression ratio close to 3. A two-dimensional discrete cosine transform adapted to local seismic characteristics has also been used, so that the key features of the reconstructed seismic signal are preserved. Furthermore, coding based on adaptive wavelet packets attains a higher compression ratio and better reconstruction quality and, owing to its good preservation of directional features, is now widely used for seismic feature extraction. The common idea behind these methods is to represent the seismic signal with a suitable basis or a redundant dictionary so that the representation becomes sparse. In recent years, sparse representation through dictionary learning has received wide attention; in remote-sensing image coding, in particular, dictionaries learned with a double-sparsity model have yielded good coding performance. These results indicate that applying dictionary learning and sparse representation to seismic signal coding is feasible.
Traditional coding methods based on dictionary learning and sparse representation follow one of two main approaches. (1) A dictionary learned offline is used for the sparse representation of online data acquired in real time. This approach requires an offline training set to be available in advance, from which the required dictionary is obtained; whether the online data can then be represented sparsely and effectively depends heavily on the correlation between the offline and online data, and for real seismic measurements it is difficult to obtain a general offline training set that suits every situation. (2) The dictionary is trained on the online data itself and then used for their sparse representation. In this case the dictionary must be transmitted so that the sparse-representation processes at the encoder and the decoder stay synchronized; transmitting the dictionary information enlarges the coded bitstream and thus lowers coding efficiency.
Summary of the invention
The present invention proposes a seismic signal coding method based on online dictionary updating. It belongs to the field of seismic signal data coding and transmission, solves the problem of how to transmit the dictionary when dictionary learning and sparse representation are used for seismic signal coding, and can be applied to any ground-based survey that relies on seismic measurements.
The seismic signal coding method based on online dictionary updating according to the present invention comprises a coding step and a decoding step; wherein,
the coding step comprises:
Step 1: divide the input seismic signal into groups in chronological order and sparse-code each group with the dictionary held in the cache; specifically:
Step 11: the seismic data of the T most recent traces form one group, and each group is processed independently. Suppose the current group is the Z-th group and denote it Y_Z. The data of every trace are split into units; each unit y_i is a column vector of length M × 1, ordered column-wise, so Y_Z = [y_1, ..., y_i, ..., y_N]. If every trace holds U recorded samples, the relation T × U = M × N holds.
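As an illustrative sketch (not part of the patent; the function and variable names are mine), the segmentation of step 11, cutting T traces of U samples into N column units of length M with T × U = M × N, can be written as:

```python
import numpy as np

def group_to_units(traces, M):
    """Split a group of T traces (a T x U array) into column units.

    Each trace is cut into consecutive segments of length M; the
    segments become the columns y_i of Y_Z, so T*U == M*N holds.
    """
    T, U = traces.shape
    assert U % M == 0, "trace length must be divisible by the unit length"
    # flatten trace by trace, then stand the M-long segments up as columns
    Y = traces.reshape(-1, M).T          # shape (M, N) with N = T*U // M
    return Y

# toy example: 2 traces of 8 samples, units of length 4 -> 4 columns
traces = np.arange(16, dtype=float).reshape(2, 8)
Y = group_to_units(traces, 4)
print(Y.shape)   # (4, 4)
```

The choice of row-major flattening here is an assumption; the patent only requires that the decoder invert whatever ordering the encoder uses.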
Step 12: read the dictionary D_{Z-1} from the cache, set the sparsity of the coefficient matrix W_Z to L, and solve the optimization

$$W_Z = \underset{W}{\operatorname{argmin}} \; \|Y_Z - D_{Z-1}W\|_2^2, \quad \text{s.t.} \; \|W\|_0 \le L \qquad (1)$$
Step 2: quantize and entropy-code the sparse coefficients of step 1, specifically:
Step 21: quantize the sparse coefficient matrix uniformly:

$$w_Z^r(i,j) = \operatorname{round}\!\left(\frac{w_Z(i,j)}{\Delta}\right)$$

where w_Z(i,j) is the coefficient at position (i,j) of the sparse coefficient matrix W_Z, Δ is the quantization step, w_Z^r(i,j) is the quantized value of that coefficient, and round(·) denotes rounding;
Step 22: build the nonzero-coefficient position matrix PT of 0s and 1s as follows:

$$PT(i,j) = \begin{cases} 1, & \text{if } \operatorname{abs}(w_Z^r(i,j)) > 0 \\ 0, & \text{if } \operatorname{abs}(w_Z^r(i,j)) = 0 \end{cases}$$

where abs(·) denotes the absolute value;
Step 23: code the position matrix PT with arithmetic coding;
Step 24: code the nonzero coefficients w_Z^r(i,j) at the positions where PT(i,j) = 1 with Huffman coding.
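A minimal sketch of the data preparation in steps 21 and 22 (quantization and the 0/1 position matrix; the Huffman and arithmetic coders themselves are omitted, and the names are mine, not the patent's):

```python
import numpy as np

def quantize(W, delta):
    """Step 21, uniform quantization: w_Z^r(i,j) = round(w_Z(i,j) / delta)."""
    return np.round(W / delta).astype(np.int64)

def position_matrix(Wq):
    """Step 22: PT(i,j) = 1 where the quantized coefficient is nonzero, else 0."""
    return (np.abs(Wq) > 0).astype(np.uint8)

W = np.array([[0.0, 1500.0], [-2100.0, 0.0]])
Wq = quantize(W, delta=1024)   # quantized coefficients
PT = position_matrix(Wq)       # goes to the arithmetic coder
nonzeros = Wq[PT == 1]         # the values handed to the Huffman coder
```

The position matrix lets the decoder place the Huffman-decoded values back at the right (i, j) coordinates, which is exactly how step 43 of the decoder recombines the two bitstreams.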
Step 3: read the reconstructed data of the previous P transmitted groups from the cache and perform dictionary learning together with the sparse coefficients of the currently transmitted group, thereby updating the dictionary needed for the sparse representation of the next group, specifically:
Step 31: compute the P + 1 reconstructed data blocks

$$\tilde{Y}_p = D_{p-1} W_p^q, \qquad p \in [Z-P, Z]$$

(the current group being the Z-th), where W_p^q denotes the quantized sparse coefficients of group p; the blocks Ỹ_p are assembled unit-by-unit into Ỹ_{Z+1};
Step 32: obtain the required dictionary D_Z from the optimization

$$[D_Z, W'] = \underset{D,W}{\operatorname{argmin}} \; \|\tilde{Y}_{Z+1} - DW\|_2^2, \quad \text{s.t.} \; \|W\|_0 \le L \qquad (2)$$

where the a_i are constants describing the correlation between groups; formula (2) is solved by the following iteration:
Step 321: with D_Z fixed, compute W' with the PS method;
Step 322: with W' fixed, update D_Z with the MOD method:

$$D_Z = \left(\tilde{Y}_{Z+1} W'^{T}\right)\left(W' W'^{T}\right)^{-1}$$

Step 323: repeat steps 321 and 322 for the given number of iterations to obtain the updated dictionary D_Z;
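The MOD update of step 322 is an ordinary least-squares fit of the dictionary to the reconstructed data for fixed coefficients. A sketch (my own code, assuming W'W'^T is well conditioned; the atom normalization that MOD implementations usually add is omitted for brevity):

```python
import numpy as np

def mod_update(Y_tilde, W):
    """MOD step: D_Z = (Y~ W^T)(W W^T)^{-1}, the least-squares dictionary
    minimizing ||Y~ - D W||_F^2 for fixed sparse coefficients W."""
    # Solve (W W^T) D^T = W Y~^T rather than forming an explicit inverse.
    Dt, *_ = np.linalg.lstsq(W @ W.T, W @ Y_tilde.T, rcond=None)
    return Dt.T

# sanity check: if Y~ = D0 W exactly, the update returns a dictionary
# that reproduces Y~ with (numerically) zero residual
rng = np.random.default_rng(0)
D0 = rng.standard_normal((8, 4))
W = rng.standard_normal((4, 10))
Y_tilde = D0 @ W
D = mod_update(Y_tilde, W)
residual = np.linalg.norm(Y_tilde - D @ W)
```

Solving the normal equations through `lstsq` keeps the update stable even when some atoms are rarely used and W'W'^T is close to singular.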
The decoding step comprises:
Step 4: inverse-quantize and entropy-decode the received sparse coefficients to generate the nonzero-coefficient matrix W'_Z, as follows:
Step 41: Huffman-decode the nonzero-coefficient bitstream to obtain the nonzero coefficients w_c;
Step 42: inverse-quantize the nonzero coefficients w_c to obtain the dequantized coefficients w'_c, as follows:
w'_c = w_c × Δ
Step 43: arithmetic-decode the position-matrix bitstream to obtain the nonzero-coefficient position matrix PT and, combining it with the dequantized coefficients w'_c generated in step 42, generate the nonzero-coefficient matrix W'_Z;
Step 5: reconstruct the seismic signal, as follows:
Step 51: read the dictionary D_{Z-1} from the cache and generate the reconstructed signal Y'_Z, as follows:
Y'_Z = D_{Z-1} × W'_Z
Step 52: rearrange the reconstructed signal Y'_Z = [y'_1, ..., y'_i, ..., y'_N] (each unit y'_i of length M × 1): consecutive units are concatenated head-to-tail column-wise into one trace, so each trace has length U = M × N / T and there are T traces in total;
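Steps 51 and 52 on the decoder side can be sketched as follows (illustrative code, not from the patent; the names are mine):

```python
import numpy as np

def decode_group(D_prev, W_rec, T):
    """Step 51: Y'_Z = D_{Z-1} x W'_Z; step 52: stitch the N unit columns
    (length M each) head-to-tail into T traces of length U = M*N/T."""
    Y_rec = D_prev @ W_rec                # (M, N) reconstructed units
    M, N = Y_rec.shape
    assert N % T == 0, "units must divide evenly over the traces"
    return Y_rec.T.reshape(T, -1)         # (T, U)

# round-trip check with an identity dictionary: the 4 columns of W_rec
# become the consecutive length-4 segments of 2 traces of length 8
W_rec = np.arange(16, dtype=float).reshape(2, 8).reshape(-1, 4).T
traces = decode_group(np.eye(4), W_rec, T=2)
print(traces.shape)   # (2, 8)
```

This is the exact inverse of the encoder's segmentation in step 11, which is what makes the cached reconstructions on both sides bit-identical.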
Step 6: generate the cached dictionary D_Z for the reconstruction of the next group; that is, read the reconstructed data of the previous P transmitted groups from the cache and perform dictionary learning together with the sparse coefficients of the currently transmitted group, thereby updating the dictionary needed for the sparse representation of the next group, specifically:
Step 61: compute the P + 1 reconstructed data blocks Ỹ_p = D_{p-1} W_p^q for p ∈ [Z - P, Z] (the current group being the Z-th), where W_p^q denotes the quantized sparse coefficients of group p; the blocks Ỹ_p are assembled unit-by-unit into Ỹ_{Z+1};
Step 62: obtain the required dictionary D_Z from optimization (2), where the a_i are constants describing the correlation between groups; formula (2) is solved by the following iteration:
Step 621: with D_Z fixed, compute W' with the PS method;
Step 622: with W' fixed, update D_Z with the MOD method, D_Z = (Ỹ_{Z+1} W'^T)(W' W'^T)^{-1};
Step 623: repeat steps 621 and 622 for the given number of iterations to obtain the updated dictionary D_Z.
Because the dictionary is updated online, the present invention does not need to transmit dictionary information in real time while still guaranteeing an effective sparse representation of the signal; the data rate is thus reduced substantially, and the method is applicable to all kinds of high-speed seismic acquisition scenarios.
Brief description of the drawings
Fig. 1 is the flow chart of the present invention;
Fig. 2 shows part of the test seismic data;
Fig. 3 shows the learned dictionary;
Fig. 4 compares the performance of the different methods.
Detailed description of the embodiments
The invention mainly comprises:
S1: divide the input seismic signal into groups in chronological order and sparse-code each group with the dictionary held in the cache;
S2: quantize and entropy-code the sparse coefficients of step S1;
S3: read the reconstructed data of the previous P transmitted groups from the cache and perform dictionary learning together with the sparse coefficients of the currently transmitted group, thereby updating the dictionary needed for the sparse representation of the next group.
Specifically, step S1 is as follows:
S11: the seismic data of the T most recent traces form one group, and each group is processed independently. Suppose the current group is the Z-th group, denoted Y_Z. The data of every trace are split into units; each unit y_i is a column vector of length M × 1, ordered column-wise, so Y_Z = [y_1, ..., y_i, ..., y_N]. If every trace holds U recorded samples, the relation T × U = M × N holds.
S12: read the dictionary D_{Z-1} from the cache, set the sparsity of the coefficient matrix W_Z to L, and solve the optimization of formula (1):

$$W_Z = \underset{W}{\operatorname{argmin}} \; \|Y_Z - D_{Z-1}W\|_2^2, \quad \text{s.t.} \; \|W\|_0 \le L \qquad (1)$$
Formula (1) is solved with the PS method ("Partial search vector selection for sparse signal representation," in NORSIG-03). The PS method is based on the OMP algorithm (orthogonal matching pursuit; "Comparison of basis selection methods," in Signals, Systems and Computers, 1996. Conference Record of the Thirtieth Asilomar Conference), so the flow of the OMP algorithm is given first:
Where OMP in step (1) above searches only for the single dictionary unit of maximum correlation, the PS method searches several units of maximal correlation, so that more search candidates are examined and a better sparse vector is obtained.
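For illustration, a compact OMP implementation is sketched below (my own code; the patent's PS variant would widen the argmax in the loop to the several most-correlated atoms and keep the best resulting support, which is not shown here):

```python
import numpy as np

def omp(D, y, L):
    """Orthogonal Matching Pursuit: greedily select at most L atoms of D,
    re-fitting all selected coefficients by least squares each round."""
    residual = y.astype(float).copy()
    support = []
    coeffs = np.zeros(0)
    for _ in range(L):
        # step (1) of the OMP flow: the atom most correlated with the residual
        k = int(np.argmax(np.abs(D.T @ residual)))
        if k not in support:
            support.append(k)
        # orthogonal projection onto the span of the selected atoms
        coeffs, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
        residual = y - D[:, support] @ coeffs
        if np.linalg.norm(residual) < 1e-10:
            break
    w = np.zeros(D.shape[1])
    w[support] = coeffs
    return w

# toy run: approximate a 2-atom mixture with sparsity L = 2
rng = np.random.default_rng(1)
D = rng.standard_normal((64, 128))
D /= np.linalg.norm(D, axis=0)
w_true = np.zeros(128)
w_true[3], w_true[17] = 2.0, -1.5
y = D @ w_true
w = omp(D, y, L=2)
```

Re-fitting all selected coefficients at every round (the "orthogonal" part) is what distinguishes OMP from plain matching pursuit and makes the residual orthogonal to every atom already chosen.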
Step S2 is as follows:
S21: quantize the sparse coefficient matrix uniformly:

$$w_Z^r(i,j) = \operatorname{round}\!\left(\frac{w_Z(i,j)}{\Delta}\right)$$

where w_Z(i,j) is the coefficient at position (i,j) of the sparse coefficient matrix W_Z, Δ is the quantization step, w_Z^r(i,j) is the quantized value of that coefficient, and round(·) denotes rounding.
S22: build the nonzero-coefficient position matrix PT of 0s and 1s:

$$PT(i,j) = \begin{cases} 1, & \text{if } \operatorname{abs}(w_Z^r(i,j)) > 0 \\ 0, & \text{if } \operatorname{abs}(w_Z^r(i,j)) = 0 \end{cases}$$

where abs(·) denotes the absolute value.
S23: code the position matrix PT with arithmetic coding.
S24: code the nonzero coefficients w_Z^r(i,j) at the positions where PT(i,j) = 1 with Huffman coding.
Step S3 is as follows:
S31: compute the P + 1 reconstructed data blocks Ỹ_p = D_{p-1} W_p^q for p ∈ [Z - P, Z] (the current group being the Z-th), where W_p^q denotes the quantized sparse coefficients of group p; the blocks are assembled unit-by-unit into Ỹ_{Z+1}.
S32: obtain the required dictionary D_Z from optimization (2), where the a_i are constants describing the correlation between groups.
Formula (2) is solved by the following iteration:
1) with D_Z fixed, compute W' with the PS method above;
2) with W' fixed, update D_Z with the MOD method ("Method of Optimal Directions for Frame Design," in 1999 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)):

$$D_Z = \left(\tilde{Y}_{Z+1} W'^{T}\right)\left(W' W'^{T}\right)^{-1}$$

3) repeat steps 1) and 2) for the given number of iterations to generate the updated dictionary D_Z.
Embodiment 1:
1. The test seismic data come from the UTAM database (http://utam.gg.utah.edu/SeismicData/SeismicData.html); the Find-Trapped-miners data set, which contains 72 sensors with 135 traces each, is used as test data;
2. 1600 time samples are taken per trace and every 10 traces form one group; part of the test data is shown in Fig. 2;
3. Suppose the current group is the 3rd group (the first 2 groups have been coded and their related data output to the decoder and to the cache); the dictionary D_2 is read from the cache for sparse coding; in this embodiment the cached dictionary has size 16 × 64 and the sparsity is 1/16 (the fraction of coefficients that are nonzero); part of the computed W_3 is as follows:
0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
0 | 0 | 0 | 0 | 0 | 191086.69 | 0 | 0 |
0 | 0 | 0 | 0 | 0 | 0 | 0 | 69516.63 |
0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
0 | 0 | 263371.03 | 0 | 0 | 0 | 0 | 0 |
-275961.58 | 248225.21 | 0 | 0 | 0 | 0 | 0 | 0 |
0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
4. W_3 is quantized with quantization step 1024; the nonzero entries after quantization are coded with Huffman coding and the nonzero-entry positions with arithmetic coding; measuring the difference between the original and the reconstructed signal by SNR, the SNR at this point is 21.2 dB.
5. The reconstructed data of the first two groups are read together with W_3 and the dictionary is updated; the updated dictionary D_3 is shown in Fig. 3.
Embodiment 2:
1. The 4th group of data is sparsely represented with the dictionary D_3 updated from the 3rd group, and dictionary D_4 is updated;
2. The 5th group of data is sparsely represented with the dictionary D_4 updated from the 4th group, and dictionary D_5 is updated;
3. Each group above is quantized and coded, the bit rate is computed, and the distortion is measured by SNR;
4. Different sparsity levels are set and the three steps above are repeated, giving the distortion at different bit rates;
5. To demonstrate the effectiveness of the algorithm, seismic signal coding algorithms based on DCT, on Curvelet, and on offline dictionary learning (K-SVD + ORMP) were also tested, giving the rate-distortion behaviour of the different algorithms; the comparison results are shown in Fig. 4.
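The distortion measure used throughout the embodiments is the SNR between the original and the reconstructed signal. A sketch of how such a distortion figure could be computed (illustrative code; the 21.2 dB and the bit-rate figures above come from the patent's own experiment, not from this snippet):

```python
import numpy as np

def snr_db(Y, Y_rec):
    """SNR = 10 * log10(||Y||^2 / ||Y - Y_rec||^2), in dB."""
    return 10.0 * np.log10(np.sum(Y**2) / np.sum((Y - Y_rec)**2))

# toy example: reconstruction error at 1/10 the signal amplitude gives 20 dB
Y = np.array([10.0, -10.0, 10.0, -10.0])
Y_rec = Y + np.array([1.0, -1.0, 1.0, -1.0])
snr = snr_db(Y, Y_rec)
print(round(snr, 1))  # 20.0
```

Sweeping the sparsity L and recording (bit rate, SNR) pairs of this kind is what produces the rate-distortion curves compared in Fig. 4.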
The specific embodiments described here merely illustrate the spirit of the present invention. A person skilled in the art may make various modifications or additions to the described embodiments, or substitute them in a similar manner, without departing from the spirit of the invention or exceeding the scope defined by the appended claims.
Claims (1)
1. A seismic signal coding method based on online dictionary updating, characterized in that it comprises a coding step and a decoding step; wherein
the coding step comprises:
Step 1: divide the input seismic signal into groups in chronological order and sparse-code each group with the dictionary held in the cache; specifically:
Step 11: the seismic data of the T most recent traces form one group and each group is processed independently; suppose the current group is the Z-th group, denoted Y_Z; the data of every trace are split into units, each unit y_i being a column vector of length M × 1, ordered column-wise, so Y_Z = [y_1, ..., y_i, ..., y_N]; if every trace holds U recorded samples, the relation T × U = M × N holds;
Step 12: read the dictionary D_{Z-1} from the cache, set the sparsity of the coefficient matrix W_Z to L, and solve the optimization:
$$W_Z = \underset{W}{\operatorname{argmin}} \; \|Y_Z - D_{Z-1}W\|_2^2, \quad \text{s.t.} \; \|W\|_0 \le L \qquad (1)$$
Step 2: quantize and entropy-code the sparse coefficients of step 1, comprising:
Step 21: quantize the sparse coefficient matrix uniformly, as follows:
$$w_Z^r(i,j) = \operatorname{round}\!\left(\frac{w_Z(i,j)}{\Delta}\right)$$
where w_Z(i,j) is the coefficient at position (i,j) of the sparse coefficient matrix W_Z, Δ is the quantization step, w_Z^r(i,j) is the quantized value of that coefficient, and round(·) denotes rounding;
Step 22: build the nonzero-coefficient position matrix PT of 0s and 1s as follows:
$$PT(i,j) = \begin{cases} 1, & \text{if } \operatorname{abs}(w_Z^r(i,j)) > 0 \\ 0, & \text{if } \operatorname{abs}(w_Z^r(i,j)) = 0 \end{cases}$$
where abs(·) denotes the absolute value;
Step 23: code the position matrix PT with arithmetic coding;
Step 24: code the nonzero coefficients w_Z^r(i,j) at the positions where PT(i,j) = 1 with Huffman coding;
Step 3: read the reconstructed data of the previous P transmitted groups from the cache and perform dictionary learning together with the sparse coefficients of the currently transmitted group, thereby updating the dictionary needed for the sparse representation of the next group, comprising:
Step 31: compute the P + 1 reconstructed data blocks Ỹ_p for p ∈ [Z - P, Z] (the current group being the Z-th) as:
$$\tilde{Y}_p = D_{p-1} W_p^q$$
where W_p^q denotes the quantized sparse coefficients of group p, and the blocks Ỹ_p are assembled unit-by-unit into Ỹ_{Z+1};
Step 32: obtain the required dictionary D_Z from the following optimization:
$$[D_Z, W'] = \underset{D,W}{\operatorname{argmin}} \; \|\tilde{Y}_{Z+1} - DW\|_2^2, \quad \text{s.t.} \; \|W\|_0 \le L \qquad (2)$$
where the a_i are constants describing the correlation between groups; formula (2) is solved by the following iteration:
Step 321: with D_Z fixed, compute W' with the PS method;
Step 322: with W' fixed, update D_Z with the MOD method:
$$D_Z = \left(\tilde{Y}_{Z+1} W'^{T}\right)\left(W' W'^{T}\right)^{-1}$$
Step 323: repeat steps 321 and 322 for the given number of iterations to obtain the updated dictionary D_Z;
the decoding step comprises:
Step 4: inverse-quantize and entropy-decode the received sparse coefficients to generate the nonzero-coefficient matrix W'_Z, as follows:
Step 41: Huffman-decode the nonzero-coefficient bitstream to obtain the nonzero coefficients w_c;
Step 42: inverse-quantize the nonzero coefficients w_c to obtain the dequantized coefficients w'_c, as follows:
w'_c = w_c × Δ
Step 43: arithmetic-decode the position-matrix bitstream to obtain the nonzero-coefficient position matrix PT and, combining it with the dequantized coefficients w'_c generated in step 42, generate the nonzero-coefficient matrix W'_Z;
Step 5: reconstruct the seismic signal, as follows:
Step 51: read the dictionary D_{Z-1} from the cache and generate the reconstructed signal Y'_Z, as follows:
Y'_Z = D_{Z-1} × W'_Z
Step 52: rearrange the reconstructed signal Y'_Z = [y'_1, ..., y'_i, ..., y'_N] (each unit y'_i of length M × 1): consecutive units are concatenated head-to-tail column-wise into one trace, so each trace has length U = M × N / T and there are T traces in total;
Step 6: generate the cached dictionary D_Z for the reconstruction of the next group; that is, read the reconstructed data of the previous P transmitted groups from the cache and perform dictionary learning together with the sparse coefficients of the currently transmitted group, thereby updating the dictionary needed for the sparse representation of the next group, comprising:
Step 61: compute the P + 1 reconstructed data blocks Ỹ_p for p ∈ [Z - P, Z] (the current group being the Z-th) as:
$$\tilde{Y}_p = D_{p-1} W_p^q$$
where W_p^q denotes the quantized sparse coefficients of group p, and the blocks Ỹ_p are assembled unit-by-unit into Ỹ_{Z+1};
Step 62: obtain the required dictionary D_Z from the following optimization:
$$[D_Z, W'] = \underset{D,W}{\operatorname{argmin}} \; \|\tilde{Y}_{Z+1} - DW\|_2^2, \quad \text{s.t.} \; \|W\|_0 \le L \qquad (2)$$
where the a_i are constants describing the correlation between groups; formula (2) is solved by the following iteration:
Step 621: with D_Z fixed, compute W' with the PS method;
Step 622: with W' fixed, update D_Z with the MOD method:
$$D_Z = \left(\tilde{Y}_{Z+1} W'^{T}\right)\left(W' W'^{T}\right)^{-1}$$
Step 623: repeat steps 621 and 622 for the given number of iterations to obtain the updated dictionary D_Z.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710062515.8A CN107065006B (en) | 2017-01-23 | 2017-01-23 | A kind of seismic signal coding method based on online dictionary updating |
Publications (2)
Publication Number | Publication Date |
---|---|
CN107065006A true CN107065006A (en) | 2017-08-18 |
CN107065006B CN107065006B (en) | 2019-06-11 |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107664773A (en) * | 2017-09-26 | 2018-02-06 | 武汉大学 | It is a kind of based on time shift and entropy constrained seismic signal coding method |
CN109581483A (en) * | 2017-09-29 | 2019-04-05 | 中国石油化工股份有限公司 | Processing Seismic Data and system based on rarefaction representation |
CN112634454A (en) * | 2021-03-08 | 2021-04-09 | 南京泛在实境科技有限公司 | Point cloud classical building curved surface reconstruction method based on OLDL _ DWT |
Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20070053603A1 (en) * | 2005-09-08 | 2007-03-08 | Monro Donald M | Low complexity bases matching pursuits data coding and decoding |
CN102879818A (en) * | 2012-08-30 | 2013-01-16 | 中国石油集团川庆钻探工程有限公司地球物理勘探公司 | Improved method for decomposing and reconstructing seismic channel data |
CN103489203A (en) * | 2013-01-31 | 2014-01-01 | 清华大学 | Image coding method and system based on dictionary learning |
CN103594084A (en) * | 2013-10-23 | 2014-02-19 | 江苏大学 | Voice emotion recognition method and system based on joint penalty sparse representation dictionary learning |
CN104517267A (en) * | 2014-12-23 | 2015-04-15 | 电子科技大学 | Infrared image enhancement and reestablishment method based on spectra inversion |
CN105182417A (en) * | 2015-09-11 | 2015-12-23 | 合肥工业大学 | Surface wave separation method and system based on morphological component analysis |
CN105701775A (en) * | 2016-01-06 | 2016-06-22 | 山东师范大学 | Image denoising method based on improved adaptive dictionary learning |
US20160335224A1 (en) * | 2014-03-31 | 2016-11-17 | Los Alamos National Security, Llc | Efficient convolutional sparse coding |
CN106157254A (en) * | 2015-04-21 | 2016-11-23 | 南京理工大学 | Rarefaction representation remote sensing images denoising method based on non local self-similarity |
CN106203414A (en) * | 2016-07-01 | 2016-12-07 | 昆明理工大学 | A kind of based on the method differentiating dictionary learning and the scene image character detection of rarefaction representation |
Patent Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20070053603A1 (en) * | 2005-09-08 | 2007-03-08 | Monro Donald M | Low complexity bases matching pursuits data coding and decoding |
CN102879818A (en) * | 2012-08-30 | 2013-01-16 | 中国石油集团川庆钻探工程有限公司地球物理勘探公司 | Improved method for decomposing and reconstructing seismic channel data |
CN103489203A (en) * | 2013-01-31 | 2014-01-01 | 清华大学 | Image coding method and system based on dictionary learning |
CN103594084A (en) * | 2013-10-23 | 2014-02-19 | 江苏大学 | Voice emotion recognition method and system based on joint penalty sparse representation dictionary learning |
US20160335224A1 (en) * | 2014-03-31 | 2016-11-17 | Los Alamos National Security, Llc | Efficient convolutional sparse coding |
CN104517267A (en) * | 2014-12-23 | 2015-04-15 | 电子科技大学 | Infrared image enhancement and reconstruction method based on spectral inversion |
CN106157254A (en) * | 2015-04-21 | 2016-11-23 | 南京理工大学 | Remote sensing image denoising method using sparse representation based on non-local self-similarity |
CN105182417A (en) * | 2015-09-11 | 2015-12-23 | 合肥工业大学 | Surface wave separation method and system based on morphological component analysis |
CN105701775A (en) * | 2016-01-06 | 2016-06-22 | 山东师范大学 | Image denoising method based on improved adaptive dictionary learning |
CN106203414A (en) * | 2016-07-01 | 2016-12-07 | 昆明理工大学 | Scene image text detection method based on discriminative dictionary learning and sparse representation |
Non-Patent Citations (2)
Title |
---|
Tang Jie: "Research on Coding Schemes for Pseudo-Random Noise Coded Seismic Sources", Proceedings of Chinese Geophysics 2010 (26th Annual Meeting of the Chinese Geophysical Society and 13th Academic Conference of the Seismological Society of China) * |
Shao Jie et al.: "Seismic Denoising Method Based on Wavelet-Domain Sparse Representation with Dictionary Training", Oil Geophysical Prospecting (《石油地球物理勘探》) * |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107664773A (en) * | 2017-09-26 | 2018-02-06 | 武汉大学 | A seismic signal coding method based on time shifting and entropy constraints |
CN109581483A (en) * | 2017-09-29 | 2019-04-05 | 中国石油化工股份有限公司 | Seismic data processing method and system based on sparse representation |
CN112634454A (en) * | 2021-03-08 | 2021-04-09 | 南京泛在实境科技有限公司 | Point cloud curved-surface reconstruction method for classical buildings based on OLDL_DWT |
CN112634454B (en) * | 2021-03-08 | 2021-06-29 | 南京泛在实境科技有限公司 | Point cloud curved-surface reconstruction method for classical buildings based on OLDL_DWT |
Also Published As
Publication number | Publication date |
---|---|
CN107065006B (en) | 2019-06-11 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN107832837B (en) | Convolutional neural network compression method and decompression method based on compressed sensing principle | |
CN103327326B (en) | Based on the SAR image transmission method of compressed sensing and channel self-adapting | |
CN107065006A (en) | A seismic signal coding method based on online dictionary updating | |
CN108924148B (en) | Multi-source signal collaborative compressed sensing data recovery method | |
CN106157339A (en) | Animated mesh sequence compression algorithm based on low-rank vertex trajectory subspace extraction | |
CN104935349A (en) | Vibration signal compressing and sampling method | |
CN103077544B (en) | Magnetic resonance parameter matching method and device and medical image processing equipment | |
CN101621514A (en) | Network data compressing method, network system and synthesis center equipment | |
CN105741333A (en) | Real-time compression and reconstruction method of Video-SAR (Synthetic Aperture Radar) image | |
CN104036519B (en) | Partitioning compressive sensing reconstruction method based on image block clustering and sparse dictionary learning | |
CN104301728A (en) | Compressed video capture and reconstruction system based on structured sparse dictionary learning | |
CN107942377A (en) | A seismic data compression and reconstruction method | |
CN105206277A (en) | Speech compression method based on one-bit compressed sensing | |
CN116346549A (en) | Underwater acoustic channel sparse estimation method adopting convolutional neural network channel cluster detection | |
CN105100810B (en) | Image compression and decompression method and system for a real-time imaging sonar processing system | |
CN103558498B (en) | Sparse representation method for insulator pollution flashover leakage current signals based on wavelet analysis | |
CN109194968B (en) | Image compression sensing method fusing information source channel decoding | |
CN116128070B (en) | Federal learning method based on wireless air calculation and multi-bit quantization compressed sensing | |
CN104125459B (en) | Support set and signal value detection based video compressive sensing reconstruction method | |
CN103985100A (en) | Block compressed sensing method based on adaptive observation combinatorial optimization | |
CN103985096A (en) | Hyperspectral image regression prediction compression method based on off-line training | |
CN116071441A (en) | Remote sensing image compression method based on end-to-end convolutional neural network | |
CN107818325A (en) | Image sparse representation method based on ensemble dictionary learning | |
CN107664773A (en) | A seismic signal coding method based on time shifting and entropy constraints | |
CN109246437B (en) | Image compression sensing method based on Reed-Solomon code |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
CF01 | Termination of patent right due to non-payment of annual fee | ||
Granted publication date: 2019-06-11; Termination date: 2020-01-23 |