CN106226816A - Pre-stack seismic signal waveform classification method - Google Patents

Pre-stack seismic signal waveform classification method

Info

Publication number
CN106226816A
CN106226816A (application CN201610815668.0A)
Authority
CN
China
Prior art keywords
degree
membership
training
data
neuron
Prior art date
Legal status
Granted
Application number
CN201610815668.0A
Other languages
Chinese (zh)
Other versions
CN106226816B (en)
Inventor
钱峰
张乐
胡光岷
Current Assignee
University of Electronic Science and Technology of China
Original Assignee
University of Electronic Science and Technology of China
Priority date
Filing date
Publication date
Application filed by University of Electronic Science and Technology of China
Priority to CN201610815668.0A
Publication of CN106226816A
Application granted
Publication of CN106226816B
Legal status: Active


Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01VGEOPHYSICS; GRAVITATIONAL MEASUREMENTS; DETECTING MASSES OR OBJECTS; TAGS
    • G01V1/00Seismology; Seismic or acoustic prospecting or detecting
    • G01V1/28Processing seismic data, e.g. for interpretation or for event detection
    • G01V1/30Analysis

Landscapes

  • Engineering & Computer Science (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Acoustics & Sound (AREA)
  • Environmental & Geological Engineering (AREA)
  • Geology (AREA)
  • General Life Sciences & Earth Sciences (AREA)
  • General Physics & Mathematics (AREA)
  • Geophysics (AREA)
  • Geophysics And Detection Of Objects (AREA)

Abstract

The present invention discloses a pre-stack seismic signal waveform classification method. The data are first extracted and denoised to obtain an integrated set of all azimuth data; DBN pre-training is then used to extract features and reduce dimensionality; SOM training is carried out on the extracted features; and the SOM neuron weights are used as the initial cluster centers of the FCM algorithm. By using a deep-learning DBN model, the method both preserves the abstract characteristics of the original data and reduces time complexity compared with traditional methods. Combining a self-organizing neural network with fuzzy C-means yields a good classification result, and the output of the fuzzy clustering is more than a simple class label: the membership of each sample to each cluster leaves open the possibility of further exploring the data.

Description

Pre-stack seismic signal waveform classification method
Technical field
The present invention relates to the field of seismic data processing, and in particular to a seismic signal waveform classification method.
Background technology
Waveform classification, as an important technique of geophysical exploration, plays a significant role in reservoir prediction and exploration and has become an important technology in the field of petroleum exploration. Waveform classification also plays a very important role in geological research, especially in identifying sedimentary facies structures. The final effectiveness of a waveform classification algorithm depends on whether, at the time of classification, the actual local geological signal and the number of seismic facies it contains have been grasped accurately. Before three-dimensional geological waveform classification was introduced into oil exploration, the prediction of oil distribution relied mainly on experienced geologists analyzing the data according to their usual experience and knowledge. This required a large amount of manpower and material resources, and because of excessive human intervention the results obtained were not very convincing and practical applications often went wrong.
As seismology has been introduced into oil exploration, people have gradually found that scientific and accurate early-stage prediction from seismic data can play a large role in oil exploitation, reducing the cost in manpower and materials while speeding up exploration and development. Three-dimensional seismic waveform classification, as an important branch of seismic interpretation, plays a decisive role. Three-dimensional waveform classification analyzes the underground signals (the signal amplitudes returned from blasting and collected by sensors), extracts features from these signals to characterize the original signals, and then, according to the similarity of these features, assigns each signal to one of a set number of classes; a seismic facies map is generated according to the positions of the signals in the plane. The subsurface can then be predicted from the seismic facies map.
Among unsupervised waveform classification algorithms, the self-organizing neural network currently achieves good results, making it a mainstream algorithm for the unsupervised waveform classification problem. The method classifies well and converges quickly; its principle is to adjust the neuron weights through repeated training, in a direction such that the winning neuron is more likely to win again in the next training round, and the neurons around the winning neuron also have their advantage increased in the next round.
Traditional signal classification methods are mainly aimed at post-stack signals. A post-stack signal is a rough simplification of the pre-stack signal: although the data volume is reduced, a large amount of anisotropy information is lost as a result. At the same time, the classification results obtained from post-stack signals increasingly fail to meet the ever higher accuracy requirements of seismic data interpretation in current oil and gas exploration, and traditional waveform classification methods cannot handle high-dimensional pre-stack data.
Summary of the invention
To solve the above technical problems, the present invention proposes a pre-stack seismic signal waveform classification method: the data are first extracted and denoised to obtain an integrated set of all azimuth data; DBN pre-training is then used to extract features and reduce dimensionality; SOM training is carried out on the features obtained in the previous step; and the SOM neuron weights are used as the initial cluster centers of the FCM algorithm. Compared with traditional methods, the time complexity is reduced.
The technical solution adopted by the present invention is a pre-stack seismic signal waveform classification method, comprising:
S1: denoise the input wide-azimuth seismic data;
S2: extract horizon-windowed data from the data obtained in step S1 to obtain the original training samples; specifically, with t sample points per horizon window and m azimuth gathers, the training set is X = {x_1, ..., x_n}, where each x_i has dimension t × m, i = 1, 2, ..., n, and n is the number of training samples;
S3: extract features from the original training samples obtained in step S2 using the restricted Boltzmann machine networks built in a deep belief network, specifically including the following sub-steps:
S31: input the original training samples and obtain their dimension;
S32: initialize the numbers of nodes of the K restricted Boltzmann machine layers;
S33: take the output of the previous restricted Boltzmann machine layer as the input of the next layer, and train each restricted Boltzmann machine layer in turn;
S34: output the extracted features;
S4: feed the features extracted in step S3 into a self-organizing neural network for training, use the resulting neuron weight vectors as the initial cluster centers of fuzzy C-means, and obtain the classification result when the FCM training terminates.
Further, the noise reduction described in step S1 is implemented using structure-oriented filtering.
Further, step S4 specifically includes the following sub-steps:
S41: input the features extracted in step S3 and initialize the SOM parameters;
S42: carry out SOM training and take the neurons once the number of iterations exceeds a preset threshold;
S43: use the neurons obtained in step S42 as the initial FCM cluster centers;
S44: calculate the degree of membership of each sample to each cluster, then update the cluster centers;
S45: judge whether the rate of change of the membership degrees is less than a preset value ε; if so, output the classification result, otherwise return to step S44.
Further, the rate of change of the membership degree refers to the currently calculated membership degree minus the membership degree calculated in the previous iteration, divided by the membership degree calculated in the previous iteration.
Beneficial effects of the present invention: the data are first extracted and denoised to obtain an integrated set of all azimuth data; DBN pre-training is then used to extract features and reduce dimensionality; SOM training is carried out on the extracted features; and the SOM neuron weights are used as the initial cluster centers of the FCM algorithm. By using a deep-learning DBN model, the method both preserves the abstract characteristics of the original data and reduces time complexity compared with traditional methods. Combining a self-organizing neural network with fuzzy C-means yields a good classification result, and the output of the fuzzy clustering is more than a simple class label: the membership of each sample to each cluster leaves open the possibility of further exploring the data.
Brief description of the drawings
Fig. 1 is the flow chart of the scheme provided by the present invention.
Fig. 2 is a typical DBN network.
Fig. 3 is the flow chart of RBM-network feature extraction provided by the present invention.
Fig. 4 is an example of neuron adjustment.
Fig. 5 is the SOM and FCM clustering flow chart provided by the present invention.
Detailed description of the invention
To help those skilled in the art understand the technical content of the present invention, the invention is further explained below with reference to the accompanying drawings.
Fig. 1 shows the flow chart of the scheme of the present application. The technical scheme of the present application is a pre-stack seismic signal waveform classification method, comprising:
S1: denoise the input wide-azimuth seismic data; here the noise reduction is implemented using structure-oriented filtering.
S2: extract horizon-windowed data from the denoised wide-azimuth seismic data obtained in step S1 to obtain the original training samples; specifically, with t sample points per horizon window and m azimuth gathers, the training set is X = {x_1, ..., x_n}, where each x_i has dimension t × m, i = 1, 2, ..., n, and n is the number of training samples.
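For illustration only, the following minimal numpy sketch shows one way the training matrix X described in step S2 could be assembled from m azimuth-sectored horizon windows; the array names, shapes and random stand-in data are assumptions made for the example, not values prescribed by the patent.

import numpy as np

# Assumed inputs: m azimuth-sectored horizon windows, each of shape (n_traces, t),
# i.e. t amplitude samples around the picked horizon at each of n_traces positions.
m, t, n_traces = 6, 32, 1000
azimuth_windows = [np.random.randn(n_traces, t) for _ in range(m)]  # stand-in data

# Each training sample x_i concatenates the t samples of all m azimuths,
# giving a vector of dimension t * m; X stacks the n_traces samples row-wise.
X = np.concatenate(azimuth_windows, axis=1)   # shape (n_traces, t * m)
assert X.shape == (n_traces, t * m)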
S3: extract features from the original training samples obtained in step S2 using the restricted Boltzmann machine networks built in a deep belief network. After obtaining the original training samples, the present invention builds a DBN (deep belief network). Because the present application only uses the DBN to extract features, no backward fine-tuning is performed; the restricted Boltzmann machine (RBM) networks are built to extract the features of the original training samples.
The deep belief network (DBN) used in this application is a generative probabilistic model. In contrast to traditional discriminative neural network models, a generative model establishes a joint distribution between observed data and labels.
A DBN is composed of multiple restricted Boltzmann machine (Restricted Boltzmann Machine) layers; a typical network is shown in Fig. 2. These networks are "restricted" to a visible layer and a hidden layer: connections exist between the layers, but not between units within a layer. The hidden units are trained to capture the correlations of higher-order data expressed in the visible layer.
The connections of a DBN are guided and determined by top-down generative weights. First, the weights of the generative model are obtained by pre-training with an unsupervised greedy layer-by-layer method; this method was proved effective by Hinton, who called it contrastive divergence.
In this training stage, the visible layer produces a vector v and passes its values to the hidden layer. In turn, the inputs of the visible layer are chosen stochastically in an attempt to reconstruct the original input signal. Finally, these new visible units are propagated forward again to activate the hidden units, obtaining h. These backward and forward steps are known as Gibbs sampling, and the difference in correlations between the hidden-layer activations and the visible-layer input is the main basis for the weight update.
Fig. 3 shows the flow of RBM-network feature extraction, which specifically includes the following sub-steps:
S31: input the original training samples and obtain their dimension;
S32: initialize the numbers of nodes of the K restricted Boltzmann machine layers;
S33: take the output of the previous restricted Boltzmann machine layer as the input of the next layer, and train each restricted Boltzmann machine layer in turn;
S34: output the extracted features. A minimal illustrative sketch of this greedy layer-wise pre-training is given after this list.
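The sketch below is a compact, illustrative implementation of the greedy layer-wise pre-training described in steps S31 to S34, using 1-step contrastive divergence (CD-1) for Bernoulli RBMs. The layer sizes, learning rate, epoch count and the assumption that the inputs have been scaled to [0, 1] are choices made for the example; the patent does not fix these values.

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_rbm(data, n_hidden, lr=0.05, epochs=10, seed=0):
    # Train one Bernoulli RBM with 1-step contrastive divergence (CD-1).
    rng = np.random.default_rng(seed)
    n_visible = data.shape[1]
    W = 0.01 * rng.standard_normal((n_visible, n_hidden))
    b_v = np.zeros(n_visible)                                # visible bias
    b_h = np.zeros(n_hidden)                                 # hidden bias
    for _ in range(epochs):
        v0 = data
        p_h0 = sigmoid(v0 @ W + b_h)                         # up-pass
        h0 = (rng.random(p_h0.shape) < p_h0).astype(float)   # sample hidden units
        p_v1 = sigmoid(h0 @ W.T + b_v)                       # reconstruction (down-pass)
        p_h1 = sigmoid(p_v1 @ W + b_h)                       # re-activate hidden layer
        # CD-1 update: difference between data and reconstruction correlations
        W += lr * (v0.T @ p_h0 - p_v1.T @ p_h1) / len(data)
        b_v += lr * (v0 - p_v1).mean(axis=0)
        b_h += lr * (p_h0 - p_h1).mean(axis=0)
    return W, b_h

def dbn_features(X, layer_sizes=(128, 64, 16)):
    # Greedy layer-wise pre-training: the output of each RBM becomes the input
    # of the next layer (step S33); the top-layer activations are the features.
    h = X
    for n_hidden in layer_sizes:
        W, b_h = train_rbm(h, n_hidden)
        h = sigmoid(h @ W + b_h)
    return h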
S4: feed the features extracted in step S3 into a self-organizing neural network (Self-Organizing Map, SOM) for training, use the resulting neuron weight vectors as the initial cluster centers of fuzzy C-means, and obtain the classification result when the FCM training reaches its stopping condition.
As shown in Fig. 5, step S4 specifically includes the following sub-steps:
S41: input the features extracted in step S3 and initialize the SOM parameters;
S42: carry out SOM training; complete convergence is not required, and the neurons are taken once the number of iterations exceeds a preset threshold. Here the preset threshold is about 10 iterations, to be chosen according to the actual training data; alternatively, iteration can be stopped when the rate of change of the neuron centers falls below a certain value.
S43: use the neurons obtained in step S42 as the initial FCM cluster centers;
S44: calculate the degree of membership of each sample to each cluster, then update the cluster centers;
S45: judge whether the rate of change of the membership degrees is less than a preset value ε; if so, output the classification result, otherwise return to step S44. The rate of change of the membership degree is the currently calculated membership degree minus the membership degree calculated in the previous iteration, divided by the membership degree calculated in the previous iteration.
The Self-Organizing Map (SOM) algorithm used in this application is an unsupervised clustering algorithm proposed by Kohonen. Its purpose is to discover the structural pattern hidden in the data, a pattern that cannot be found accurately by manual inspection. A SOM is a network structure composed of neurons; various neuron layouts exist, and the chosen network structure is generally two-dimensional. Let M denote all neurons and M_i the i-th neuron. Each neuron has the same dimension n as the input data, so M_i = [M_i1, M_i2, ..., M_in]. Each neuron M_i is first initialized randomly and then adjusted according to the input data. The Self-Organizing Map is a winner-take-all algorithm: according to its rules, the winning neuron is adjusted toward the input data. Because the input data are similar to some extent, after the neurons are adjusted, when the next input enters the network the probability that the previously winning neuron wins again is increased; this is the meaning of winner-take-all. Fig. 4 shows an example of neuron adjustment.
In each learning step, an input sample is chosen at random from the data to be processed, and its Euclidean distance to all neurons is computed; the neuron closest to the input sample is the winning neuron. The formula is as follows:
||x − m_b|| = min_i ||x − m_i||    (1-1)
The weights of the winning neuron and its neighboring neurons are updated according to formula (1-2) below; from this formula it can be seen that the weights of the winning neuron and its neighbors move toward the input data. The update formula is as follows:
m_i(t+1) = m_i(t) + λ(t)·h_bi(t)·[x − m_i(t)]    (1-2)
where t is the iteration number, λ(t) is the learning rate, and h_bi(t) is the neighborhood function based on the distance between neuron i and the winning neuron b; this neighborhood shrinks as the number of iterations grows, according to formula (1-3):
h_bi(t) = exp(−||r_b − r_i||² / (2σ²(t)))    (1-3)
where ||r_b − r_i|| is the distance between the winning neuron b and neuron i on the topology.
The SOM algorithm iterates the above two steps until the stopping condition is reached. The stopping condition is set manually, usually when σ(t) in h_bi(t) becomes sufficiently small; σ(t) is a decreasing function of the iteration number t, for example σ(t) = 1/t.
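As an illustration of update rules (1-1) to (1-3), the following minimal SOM sketch picks a random sample, finds the winning neuron, and moves it and its topological neighbours toward the sample. The grid size, iteration count, learning-rate schedule and neighbourhood decay are illustrative choices, not values fixed by the patent.

import numpy as np

def train_som(X, grid=(5, 5), n_iter=2000, lr0=0.5, sigma0=2.0, seed=0):
    rng = np.random.default_rng(seed)
    rows, cols = grid
    M = 0.1 * rng.standard_normal((rows * cols, X.shape[1]))      # neuron weights m_i
    coords = np.array([(r, c) for r in range(rows) for c in range(cols)], dtype=float)
    for t in range(1, n_iter + 1):
        x = X[rng.integers(len(X))]                               # random input sample
        b = np.argmin(np.linalg.norm(x - M, axis=1))              # (1-1): winning neuron
        lam = lr0 * np.exp(-t / n_iter)                           # decaying learning rate
        sigma = max(sigma0 * (1.0 - t / n_iter), 1e-3)            # shrinking sigma(t)
        h = np.exp(-np.sum((coords - coords[b]) ** 2, axis=1) / (2 * sigma ** 2))  # (1-3)
        M += lam * h[:, None] * (x - M)                           # (1-2): move toward x
    return M                                                      # rows are neuron weight vectors

In the flow of Fig. 5, the returned neuron weight vectors would serve as the initial FCM cluster centers.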
This application employs the fuzzy C-means clustering algorithm (FCM, also written FCMA). Among the many fuzzy clustering algorithms, FCM is the most widely used and relatively successful: by optimizing an objective function it obtains the degree of membership of each sample point to every class center and thereby determines the class of each sample point, achieving automatic classification of the sample data.
The specific steps are as follows:
1. Input the objects X = {x_1, ..., x_m}, determine the number of clusters N, determine the fuzzy weighting exponent p (typically 2), and randomly initialize the membership degrees in [0, 1]; the memberships are computed as shown in (1-4):
w_ij = (1/dist(x_i, c_j)²)^(1/(p−1)) / Σ_{q=1..N} (1/dist(x_i, c_q)²)^(1/(p−1))    (1-4)
2. For each cluster C_j, calculate the centroid c_j according to formula (1-5):
c_j = Σ_{i=1..m} w_ij^p · x_i / Σ_{i=1..m} w_ij^p    (1-5)
3. Update the membership degree w_ij of each object with respect to each cluster center, as shown in (1-6):
w_ij = (1/dist(x_i, c_j)²)^(1/(p−1)) / Σ_{q=1..N} (1/dist(x_i, c_q)²)^(1/(p−1))    (1-6)
Steps 2 and 3 are repeated until the algorithm converges, i.e. until the difference between successive cluster centers falls below a threshold or the number of iterations reaches a predetermined count.
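A minimal sketch of the FCM iteration defined by formulas (1-4) to (1-6) is given below, seeded with initial centers (for example the SOM neuron weights from step S43) and stopped by a membership rate-of-change test; the eps value, iteration cap and the max-norm form of the rate-of-change test are assumptions made for the example.

import numpy as np

def fcm(X, init_centers, p=2, eps=1e-4, max_iter=100):
    C = np.array(init_centers, dtype=float)       # initial cluster centers (e.g. SOM weights)
    n, k = len(X), len(C)
    W = np.full((n, k), 1.0 / k)                  # membership degrees w_ij
    for _ in range(max_iter):
        W_old = W
        # (1-5): centroids as membership-weighted means
        C = (W.T ** p) @ X / (W.T ** p).sum(axis=1, keepdims=True)
        # (1-4)/(1-6): memberships from inverse squared distances to the centers
        d2 = ((X[:, None, :] - C[None, :, :]) ** 2).sum(axis=2) + 1e-12
        inv = (1.0 / d2) ** (1.0 / (p - 1))
        W = inv / inv.sum(axis=1, keepdims=True)
        # rate of change of the memberships: (new - old) relative to old (max norm)
        if np.abs(W - W_old).max() / np.abs(W_old).max() < eps:
            break
    return W.argmax(axis=1), W                    # hard labels and soft memberships

For example, labels, W = fcm(features, train_som(features)) would chain this with the SOM sketch above (the function names are the illustrative ones used in these sketches).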
Those of ordinary skill in the art will appreciate that the embodiments described here are intended to help the reader understand the principle of the present invention, and it should be understood that the scope of protection of the present invention is not limited to such specific statements and embodiments. Various modifications and variations of the present invention are possible for those skilled in the art. Any modification, equivalent substitution, improvement, etc. made within the spirit and principle of the present invention shall be included within the scope of the claims of the present invention.

Claims (4)

1. A pre-stack seismic signal waveform classification method, characterized by comprising:
S1: denoising the input wide-azimuth seismic data;
S2: extracting horizon-windowed data from the data obtained in step S1 to obtain original training samples; specifically, with t sample points per horizon window and m azimuth gathers, the training set is X = {x_1, ..., x_n}, where each x_i has dimension t × m, i = 1, 2, ..., n, and n is the number of training samples;
S3: extracting features from the original training samples obtained in step S2 using the restricted Boltzmann machine networks built in a deep belief network, specifically including the following sub-steps:
S31: inputting the original training samples and obtaining their dimension;
S32: initializing the numbers of nodes of the K restricted Boltzmann machine layers;
S33: taking the output of the previous restricted Boltzmann machine layer as the input of the next layer, and training each restricted Boltzmann machine layer in turn;
S34: outputting the extracted features;
S4: feeding the features extracted in step S3 into a self-organizing neural network for training, using the resulting neuron weight vectors as the initial cluster centers of fuzzy C-means, and obtaining the classification result when the FCM training terminates.
2. The pre-stack seismic signal waveform classification method according to claim 1, characterized in that the noise reduction described in step S1 is implemented using structure-oriented filtering.
3. The pre-stack seismic signal waveform classification method according to claim 1, characterized in that step S4 specifically includes the following sub-steps:
S41: inputting the features extracted in step S3 and initializing the SOM parameters;
S42: carrying out SOM training and taking the neurons once the number of iterations exceeds a preset threshold;
S43: using the neurons obtained in step S42 as the initial FCM cluster centers;
S44: calculating the degree of membership of each sample to each cluster, then updating the cluster centers;
S45: judging whether the rate of change of the membership degrees is less than a preset value ε; if so, outputting the classification result, otherwise returning to step S44.
4. The pre-stack seismic signal waveform classification method according to claim 3, characterized in that the rate of change of the membership degree refers to the currently calculated membership degree minus the membership degree calculated in the previous iteration, divided by the membership degree calculated in the previous iteration.
CN201610815668.0A 2016-09-12 2016-09-12 A kind of pre-stack seismic signal waveform sorting technique Active CN106226816B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610815668.0A CN106226816B (en) 2016-09-12 2016-09-12 A kind of pre-stack seismic signal waveform sorting technique

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201610815668.0A CN106226816B (en) 2016-09-12 2016-09-12 A kind of pre-stack seismic signal waveform sorting technique

Publications (2)

Publication Number Publication Date
CN106226816A true CN106226816A (en) 2016-12-14
CN106226816B CN106226816B (en) 2018-03-09

Family

ID=58073782

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610815668.0A Active CN106226816B (en) 2016-09-12 2016-09-12 A kind of pre-stack seismic signal waveform sorting technique

Country Status (1)

Country Link
CN (1) CN106226816B (en)

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106886043A (en) * 2017-03-01 2017-06-23 成都理工大学 Reservoir detecting method based on geological data deep learning
CN107688201A (en) * 2017-08-23 2018-02-13 电子科技大学 Based on RBM earthquake prestack signal clustering methods
CN108226997A (en) * 2017-11-16 2018-06-29 中国石油天然气集团公司 A kind of seismic facies analysis method based on earthquake data before superposition
CN108680954A (en) * 2018-08-01 2018-10-19 中国石油天然气股份有限公司 Window wave shape clustering method and its device when a kind of more data volumes of frequency domain become
CN109143355A (en) * 2018-08-23 2019-01-04 电子科技大学 Semi-supervised global optimization seismic facies quantitative analysis method based on SOM
CN110135236A (en) * 2019-03-21 2019-08-16 云南路普斯数据科技有限公司 A kind of video face identification method based on wavelet transformation and neural network algorithm
CN110462445A (en) * 2017-02-09 2019-11-15 地质探索系统公司 Geophysics deep learning
CN111898650A (en) * 2020-07-08 2020-11-06 国网浙江省电力有限公司杭州供电公司 Marketing and distribution data automatic clustering analysis equipment and method based on deep learning
CN112905717A (en) * 2021-02-25 2021-06-04 北方工业大学 Public safety data distribution method and device
CN113109869A (en) * 2021-03-30 2021-07-13 成都理工大学 Automatic picking method for first arrival of shale ultrasonic test waveform

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020161526A1 (en) * 2001-04-27 2002-10-31 Frederique Fournier Method for facilitating monitoring, in the course of time, of the evolution of an underground zone by compared analysis of various seismic record sets
CN1382262A (en) * 1999-10-20 2002-11-27 菲利浦石油公司 Multi-attibute seismic waveform classification
US6993440B2 (en) * 2002-04-22 2006-01-31 Harris Corporation System and method for waveform classification and characterization using multidimensional higher-order statistics
CN102650702A (en) * 2012-05-03 2012-08-29 中国石油天然气股份有限公司 Seismic waveform analysis and reservoir prediction method and device
CN103487832A (en) * 2013-09-12 2014-01-01 电子科技大学 Method for classifying supervised waveforms in three-dimensional seismic signal
CN104181597A (en) * 2014-08-31 2014-12-03 电子科技大学 Seismic facies analysis method based on prestack seismic data
CN104914467A (en) * 2015-05-22 2015-09-16 中国石油天然气股份有限公司 Classification model channel extracting seismic facies clustering analysis method
CN105008963A (en) * 2012-11-03 2015-10-28 钻井信息公司 Seismic waveform classification system and method

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1382262A (en) * 1999-10-20 2002-11-27 菲利浦石油公司 Multi-attibute seismic waveform classification
US20020161526A1 (en) * 2001-04-27 2002-10-31 Frederique Fournier Method for facilitating monitoring, in the course of time, of the evolution of an underground zone by compared analysis of various seismic record sets
US6993440B2 (en) * 2002-04-22 2006-01-31 Harris Corporation System and method for waveform classification and characterization using multidimensional higher-order statistics
CN102650702A (en) * 2012-05-03 2012-08-29 中国石油天然气股份有限公司 Seismic waveform analysis and reservoir prediction method and device
CN105008963A (en) * 2012-11-03 2015-10-28 钻井信息公司 Seismic waveform classification system and method
CN103487832A (en) * 2013-09-12 2014-01-01 电子科技大学 Method for classifying supervised waveforms in three-dimensional seismic signal
CN104181597A (en) * 2014-08-31 2014-12-03 电子科技大学 Seismic facies analysis method based on prestack seismic data
CN104914467A (en) * 2015-05-22 2015-09-16 中国石油天然气股份有限公司 Classification model channel extracting seismic facies clustering analysis method

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
SONG CHENGYUN ET AL.: "Pre-stack-texture-based reservoir characteristics and seismic facies analysis", 《APPLIED GEOPHYSICS》 *
邓传伟 et al.: "Application of waveform classification technology in the prediction of reservoir sedimentary microfacies", 《石油物探》 (Geophysical Prospecting for Petroleum) *

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110462445A (en) * 2017-02-09 2019-11-15 地质探索系统公司 Geophysics deep learning
CN106886043A (en) * 2017-03-01 2017-06-23 成都理工大学 Reservoir detecting method based on geological data deep learning
CN107688201A (en) * 2017-08-23 2018-02-13 电子科技大学 Based on RBM earthquake prestack signal clustering methods
CN107688201B (en) * 2017-08-23 2019-12-31 电子科技大学 RBM-based seismic prestack signal clustering method
CN108226997A (en) * 2017-11-16 2018-06-29 中国石油天然气集团公司 A kind of seismic facies analysis method based on earthquake data before superposition
CN108680954A (en) * 2018-08-01 2018-10-19 中国石油天然气股份有限公司 Window wave shape clustering method and its device when a kind of more data volumes of frequency domain become
CN109143355A (en) * 2018-08-23 2019-01-04 电子科技大学 Semi-supervised global optimization seismic facies quantitative analysis method based on SOM
CN109143355B (en) * 2018-08-23 2019-11-05 电子科技大学 Semi-supervised global optimization seismic facies quantitative analysis method based on SOM
CN110135236A (en) * 2019-03-21 2019-08-16 云南路普斯数据科技有限公司 A kind of video face identification method based on wavelet transformation and neural network algorithm
CN111898650A (en) * 2020-07-08 2020-11-06 国网浙江省电力有限公司杭州供电公司 Marketing and distribution data automatic clustering analysis equipment and method based on deep learning
CN112905717A (en) * 2021-02-25 2021-06-04 北方工业大学 Public safety data distribution method and device
CN113109869A (en) * 2021-03-30 2021-07-13 成都理工大学 Automatic picking method for first arrival of shale ultrasonic test waveform

Also Published As

Publication number Publication date
CN106226816B (en) 2018-03-09

Similar Documents

Publication Publication Date Title
CN106226816B (en) A kind of pre-stack seismic signal waveform sorting technique
CN104570083B (en) Geologic body automatic identifying method based on multi-dimensional earthquake attribute
CN105510970B (en) Obtain seismic facies optimal classes purpose method
CN110609320B (en) Pre-stack seismic reflection pattern recognition method based on multi-scale feature fusion
CN114463333B (en) While-drilling geosteering real-time stratum lattice intelligent updating method and system
CN106886043B (en) Reservoir detection method based on seismic data deep learning
CN104866810A (en) Face recognition method of deep convolutional neural network
CN106778921A (en) Personnel based on deep learning encoding model recognition methods again
CN102279929B (en) Remote-sensing artificial ground object identifying method based on semantic tree model of object
CN106707335B (en) A kind of poststack seismic signal waveform sorting technique
CN104181597B (en) Seismic facies analysis method based on prestack seismic data
CN107688201A (en) Based on RBM earthquake prestack signal clustering methods
CN102840860B (en) A kind of method for recognising star map based on Hybrid Particle Swarm
CN109670539A (en) A kind of silt particle layer detection method based on log deep learning
CN106443822A (en) Geological integrated identification method and device based on gravity-magnetic-electric-seismic three-dimensional joint inversion
Chopra et al. Unsupervised machine learning facies classification in the Delaware Basin and its comparison with supervised Bayesian facies classification
Zhu et al. An automatic identification method of imbalanced lithology based on Deep Forest and K-means SMOTE
CN117236330B (en) Mutual information and antagonistic neural network based method for enhancing theme diversity
Ali et al. Data-driven lithofacies prediction in complex tight sandstone reservoirs: a supervised workflow integrating clustering and classification models
CN109143355B (en) Semi-supervised global optimization seismic facies quantitative analysis method based on SOM
Jervis et al. Deep learning network optimization and hyperparameter tuning for seismic lithofacies classification
Smith et al. Self-organizing artificial neural nets for automatic anomaly identification
CN104714250A (en) Practical internal substratum automatic interpretation method
DELL'AVERSANA An integrated multi-physics Machine Learning approach for exploration risk mitigation.
Graciolli A novel classification method applied to well log data calibrated by ontology based core descriptions

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant