CN108062575A - A high-similarity image recognition and classification method - Google Patents

A high-similarity image recognition and classification method

Info

Publication number
CN108062575A
CN108062575A
Authority
CN
China
Prior art keywords
image
wavelet transformation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201810006119.8A
Other languages
Chinese (zh)
Inventor
姜文超
刘海波
孙傲冰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Institute of Computing Technology of CAS
Original Assignee
Institute of Computing Technology of CAS
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Institute of Computing Technology of CAS filed Critical Institute of Computing Technology of CAS
Priority to CN201810006119.8A priority Critical patent/CN108062575A/en
Publication of CN108062575A publication Critical patent/CN108062575A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/22Matching criteria, e.g. proximity measures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/24Classification techniques

Landscapes

  • Engineering & Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Image Analysis (AREA)

Abstract

The present invention relates to the technical field of image processing, and in particular to a high-similarity image recognition and classification method. In the method, the main object in an image is first extracted by a segmentation algorithm, so that the image subject is properly separated from the background; this prevents the background from interfering with the classification of the main object and improves accurate-classification performance. In the data preprocessing stage, a wavelet transform decomposes the image into sub-images of different frequency bands, and features are extracted from each frequency-domain sub-image. The output of the wavelet transform is then used as the input of a convolutional neural network, whose weight parameters are optimized over repeated training rounds until the parameters with the highest classification accuracy are obtained, yielding the optimal classification model. By combining the wavelet transform with a convolutional neural network, the present invention achieves accurate recognition and classification of images, and can be used for image processing.

Description

A high-similarity image recognition and classification method
Technical field
The present invention relates to the technical field of image processing, and in particular to a high-similarity image recognition and classification method.
Background technology
Renninger L.W. et al. argue that simple texture analysis of an image can uncover cues for scene classification, and that such analysis simplifies image recognition and its computation. Benmokhtar et al. fused the texture and color features of images to classify football scenes; however, that method classifies using global features only, so its discriminative power is limited. LBP (local binary patterns) computes the "uniform" local binary patterns of an image and performs histogram statistics over the counts of each pattern; however, its classification performance on color images is mediocre. In recent years the gray-level co-occurrence matrix (GLCM) algorithm has been applied to the texture features of remote-sensing images. The GLCM is the joint distribution between pixels with a given spatial relation; it can be regarded as the joint histogram of the gray levels of pixel pairs and belongs to the second-order statistics. The GLCM describes the spatial dependence of gray levels, that is, a spatial relationship among pixel gray levels under a texture pattern, but it does not fully capture the local gray-level characteristics of an image, so its extraction of locally specific image features is unsatisfactory. The wavelet transform can effectively extract local texture features, but its performance for overall image classification is mediocre. The prior art thus has defects in the following respects: (1) extraction of local detail features is not prominent enough; (2) the range of application is narrow, and the classification of color images is mediocre; (3) classification based on the overall characteristics of an image has not reached a satisfactory level.
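As an illustration of the gray-level co-occurrence matrix described above, a minimal numpy sketch (the offset (1, 0) and the 4-level gray quantization are illustrative choices, not taken from this patent):

```python
import numpy as np

def glcm(img, dx=1, dy=0, levels=4):
    """Gray-level co-occurrence matrix: the joint histogram of gray-level
    pairs (img[y, x], img[y + dy, x + dx]), a second-order statistic."""
    m = np.zeros((levels, levels), dtype=np.int64)
    h, w = img.shape
    for y in range(h - dy):
        for x in range(w - dx):
            m[img[y, x], img[y + dy, x + dx]] += 1
    return m

img = np.array([[0, 0, 1, 1],
                [0, 0, 1, 1],
                [0, 2, 2, 2],
                [2, 2, 3, 3]])
m = glcm(img)    # m[i, j] counts horizontal neighbour pairs (i, j)
print(m.sum())   # 12 pairs: 4 rows x 3 horizontal neighbours each
```

Normalizing `m` to sum to one gives the joint-probability form; texture statistics such as contrast and homogeneity are then derived from it.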
Content of the invention
The technical problem solved by the present invention is to provide a high-similarity image recognition and classification method that fuses a wavelet transform with a convolutional neural network.
The technical solution by which the present invention solves the above technical problem is:
In the method, the main object in an image is first extracted by a segmentation algorithm, so that the image subject is properly separated from the background; this prevents the background from interfering with the classification of the main object and improves accurate-classification performance. In the data preprocessing stage, a wavelet transform decomposes the image into sub-images of different frequency bands, and features are extracted from each frequency-domain sub-image. The output of the wavelet transform is then used as the input of a convolutional neural network, whose weight parameters are optimized over repeated training rounds until the parameters with the highest classification accuracy are obtained, yielding the optimal classification model.
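The patent does not name the partitioning (segmentation) algorithm it uses. Purely as an illustrative stand-in, a simple Otsu threshold can separate a bright main object from a darker background:

```python
import numpy as np

def otsu_threshold(img):
    """Otsu's method: choose the gray threshold that maximises the
    between-class variance.  This is NOT the patent's algorithm, only
    an illustrative segmentation stand-in."""
    hist = np.bincount(img.ravel(), minlength=256).astype(np.float64)
    p = hist / hist.sum()
    w0 = np.cumsum(p)                    # probability of class 0 (<= t)
    mu = np.cumsum(p * np.arange(256))   # cumulative mean
    mu_t = mu[-1]                        # global mean
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b = (mu_t * w0 - mu) ** 2 / (w0 * (1.0 - w0))
    sigma_b[np.isnan(sigma_b)] = 0.0     # empty classes contribute nothing
    sigma_b[np.isinf(sigma_b)] = 0.0
    return int(np.argmax(sigma_b))

img = np.array([[10, 12, 11, 200],
                [ 9, 11, 210, 205],
                [10, 199, 201, 198],
                [11, 10, 12, 202]], dtype=np.uint8)
t = otsu_threshold(img)
mask = img > t   # foreground mask: the (bright) main object
```

Here `mask` plays the role of the subject/background separation; the masked-out background pixels no longer influence classification.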
Extracting the features of each frequency-domain sub-image means computing the energy parameter of each sub-image as a texture descriptor; preliminary feature extraction is performed on images of different classes and different resolutions to determine the optimal texture-feature difference threshold.
In the method, the energy features of each sub-image are normalized before being input to the convolutional neural network.
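The energy descriptor and the normalisation step described above might look like the following (mean-square energy per sub-band and min-max scaling are assumed; the patent does not spell out either formula):

```python
import numpy as np

def subband_energy(sub):
    """Energy of one frequency-domain sub-image: mean squared coefficient."""
    return float(np.mean(sub.astype(np.float64) ** 2))

def normalize(feats):
    """Min-max normalise the energy feature vector to [0, 1] before it
    is fed to the convolutional network."""
    f = np.asarray(feats, dtype=np.float64)
    lo, hi = f.min(), f.max()
    return (f - lo) / (hi - lo) if hi > lo else np.zeros_like(f)

subbands = [np.ones((4, 4)), 2 * np.ones((4, 4)), 3 * np.ones((4, 4))]
feats = [subband_energy(s) for s in subbands]   # 1.0, 4.0, 9.0
print(normalize(feats))                         # 0.0, 0.375, 1.0
```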
The wavelet transformation is as follows:
The image is first discretized according to formula (1):
$\psi_{j,k}(x)=a_0^{-j/2}\,\psi\!\left(a_0^{-j}x-kb_0\right)$ (1)
where $a_0$ is the scale parameter and $b_0$ is the location parameter;
Then the wavelet transform is computed by fast decomposition according to formulas (2)-(5), where h and g are respectively the low-pass and high-pass filters of the wavelet transform:
$A_j(k,l)=\sum_{k_1}\sum_{k_2}h(k_1-2k)\,h(k_2-2l)\,f_{j-1}(k_1,k_2)$ (2)
$D_j^{1}(k,l)=\sum_{k_1}\sum_{k_2}g(k_1-2k)\,h(k_2-2l)\,f_{j-1}(k_1,k_2)$ (3)
$D_j^{2}(k,l)=\sum_{k_1}\sum_{k_2}h(k_1-2k)\,g(k_2-2l)\,f_{j-1}(k_1,k_2)$ (4)
$D_j^{3}(k,l)=\sum_{k_1}\sum_{k_2}g(k_1-2k)\,g(k_2-2l)\,f_{j-1}(k_1,k_2)$ (5)
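Formulas (2)-(5) are the standard separable fast wavelet decomposition: filter along rows and columns with h or g, keeping every second sample. A one-level numpy sketch with Haar filters (h = [1, 1]/√2, g = [1, -1]/√2 chosen purely for illustration; the patent does not fix the wavelet):

```python
import numpy as np

H = np.array([1.0, 1.0]) / np.sqrt(2.0)   # low-pass filter h
G = np.array([1.0, -1.0]) / np.sqrt(2.0)  # high-pass filter g

def filt_down(x, f):
    """Filter each row of x with f, then keep every second sample
    (the '-2k' shift in formulas (2)-(5) is exactly this downsampling)."""
    return np.apply_along_axis(
        lambda r: np.convolve(r, f, mode="full")[1::2], -1, x)

def dwt2(img):
    """One decomposition level: approximation A plus the three detail
    sub-images D1, D2, D3, loosely following formulas (2)-(5)."""
    lo = filt_down(img, H).T   # rows low-passed; transpose to do columns
    hi = filt_down(img, G).T   # rows high-passed
    A  = filt_down(lo, H).T    # low/low
    D2 = filt_down(lo, G).T    # low/high
    D1 = filt_down(hi, H).T    # high/low
    D3 = filt_down(hi, G).T    # high/high
    return A, D1, D2, D3

A, D1, D2, D3 = dwt2(np.ones((4, 4)))  # constant image: detail bands vanish
```

Applying `dwt2` again to `A` yields the next decomposition level; the energy of each sub-image then serves as the texture descriptor described above.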
The convolutional-layer update of the convolutional neural network is as follows:
In each convolutional layer of the network, the feature maps of the preceding layer are convolved with convolution kernels obtained by learning to produce new feature maps, using formula (6):
where $M_j$ denotes the selected combination of input feature maps, $K_{i,j}$ is the convolution kernel used in the connection between the i-th input feature map and the j-th output feature map, $b_j$ is the bias of the j-th feature map, and f is the activation function;
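Formula (6) can be transcribed directly in numpy (cross-correlation, as in common CNN practice, and tanh standing in for the unspecified activation f):

```python
import numpy as np

def conv2d_valid(x, k):
    """'Valid' cross-correlation of one feature map x with one kernel k
    (the usual CNN 'convolution')."""
    kh, kw = k.shape
    oh, ow = x.shape[0] - kh + 1, x.shape[1] - kw + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * k)
    return out

def conv_layer(inputs, kernels, biases, f=np.tanh):
    """Formula (6): x_j = f( sum over i in M_j of  x_i * k_ij  +  b_j ).
    inputs:  list of input feature maps x_i^{l-1}
    kernels: kernels[i][j] connects input map i to output map j
    biases:  one bias b_j per output feature map."""
    outs = []
    for j in range(len(biases)):
        s = sum(conv2d_valid(x, kernels[i][j]) for i, x in enumerate(inputs))
        outs.append(f(s + biases[j]))
    return outs
```

For example, one 3x3 input map of ones through a single 2x2 kernel of ones with zero bias and identity activation yields a 2x2 map of fours.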
The sensitivity of the neurons of the convolutional neural network is calculated using formula (7), where up(·) denotes the up-sampling operation;
The gradient of the error cost function with respect to the convolution kernels is then computed according to formula (8):
$\frac{\partial E}{\partial K_{i,j}^{l}}=\sum_{u,v}\big(\delta_j^{l}\big)_{u,v}\big(P_i^{l-1}\big)_{u,v}$ (8)
In the down-sampling layer, a new feature map is obtained by formula (9):
$x_j^{l}=f\big(\beta_j^{l}\,\mathrm{down}(x_j^{l-1})+b_j^{l}\big)$ (9)
The corresponding sensitivity is then calculated;
Finally, the gradients are computed to obtain better network weight parameters, e.g. the bias gradient of formula (11):
$\frac{\partial E}{\partial b_j}=\sum_{u,v}\big(\delta_j^{l}\big)_{u,v}$ (11)
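The down-sampling map of formula (9) and the bias gradient of formula (11) can be sketched as follows (sum-pooling over 2x2 blocks is assumed for down(·); the patent does not fix the block size):

```python
import numpy as np

def downsample(x, n=2):
    """down(): sum over each n-by-n block of the feature map."""
    h, w = x.shape
    return (x[:h - h % n, :w - w % n]
            .reshape(h // n, n, w // n, n)
            .sum(axis=(1, 3)))

def pool_layer(x_prev, beta, b, f=np.tanh):
    """Formula (9): x_j^l = f( beta_j^l * down(x_j^{l-1}) + b_j^l )."""
    return f(beta * downsample(x_prev) + b)

def bias_grad(delta):
    """Formula (11): dE/db_j = sum over spatial positions (u, v)
    of the sensitivity map delta_j^l."""
    return float(delta.sum())

x = np.arange(16, dtype=float).reshape(4, 4)
print(downsample(x))   # 2x2 block sums: [[10, 18], [42, 50]]
```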
In the present invention, a wavelet transform decomposes the image into sub-images of different frequency bands; the energy parameter of each sub-image is then computed as a texture descriptor, and preliminary feature extraction on images of different classes and resolutions determines the optimal texture-feature difference threshold. The normalized energy features of each sub-image are input to a convolutional neural network; the network parameters are adjusted over repeated training rounds and the actual classification accuracy of the weights is verified, finally yielding the network model with the highest classification accuracy. Through the combination of the wavelet transform and the convolutional neural network, accurate recognition and classification of images can be achieved.
Description of the drawings
The present invention is further described below in conjunction with the accompanying drawings:
Fig. 1 is a flow chart of the present invention.
Specific embodiment
As shown in Fig. 1, the present invention first extracts the main object in the image by a segmentation algorithm, so that the image subject is properly separated from the background; this prevents the background from interfering with the classification of the main object and improves accurate-classification performance. In the data preprocessing stage, the wavelet transform decomposes the image and features are extracted from each frequency-domain sub-image. The output of the wavelet transform is then used as the input of the neural network, whose weight parameters are optimized over repeated training rounds, obtaining the neural network parameters with the highest classification accuracy and thus the optimal classification model.
The core of the present invention comprises two parts: local preprocessing of the image data by the wavelet transform, and final global classification of the image by the convolutional neural network. In the wavelet transform, since image data is a two-dimensional digital signal, the image is first discretized according to formula (1), where $a_0$ is the scale parameter and $b_0$ is the location parameter.
The wavelet transform is then computed by fast decomposition according to formulas (2)-(5), where h and g are respectively the low-pass and high-pass filters of the wavelet transform.
$A_j(k,l)=\sum_{k_1}\sum_{k_2}h(k_1-2k)\,h(k_2-2l)\,f_{j-1}(k_1,k_2)$ (2)
The second technical core is the weight update of the convolutional neural network, which determines the final result of image-data classification. To obtain higher classification accuracy, many experiments were carried out, focusing on the weight-update parts of the convolutional and pooling layers of the neural network.
In each convolutional layer of the convolutional neural network, the feature maps of the preceding layer are convolved with convolution kernels obtained by learning, producing new feature maps, as in formula (6),
where $M_j$ denotes the selected combination of input feature maps, $K_{i,j}$ is the convolution kernel used in the connection between the i-th input feature map and the j-th output feature map, $b_j$ is the bias of the j-th feature map, and f is the activation function.
The sensitivity of the neurons of the convolutional neural network is calculated as in formula (7), where up(·) denotes the up-sampling operation, and the gradient of the error cost function is obtained from formula (8).
In the down-sampling layer, a new feature map is obtained by formula (9); the corresponding sensitivity is calculated,
and finally the gradients are computed to obtain better network weight parameters.
After the weights of the convolutional and sampling layers of the neural network have been updated, the optimization parameters of the other layers must also be set in detail; the parameters of each layer of the convolutional neural network used in this technique are listed in Table 1.
The network comprises 5 convolutional layers, 3 pooling layers, 3 fully connected layers, 1 softmax classification layer, 2 LRN layers and 2 stochastic sampling layers; repeated testing showed that image classification accuracy is high under these parameters.
Table 1. Parameters of each layer of the convolutional neural network
The meaning of each parameter of the present invention is shown in the following table:
Table 2. Concrete meaning of each parameter

Claims (5)

1. A high-similarity image recognition and classification method, characterized in that: the main object in an image is first extracted by a segmentation algorithm, so that the image subject is properly separated from the background; the background is thereby prevented from interfering with the classification of the main object, improving accurate-classification performance; in a data preprocessing stage, a wavelet transform decomposes the image into sub-images of different frequency bands, and features are extracted from each frequency-domain sub-image; the output of the wavelet transform is then used as the input of a convolutional neural network, whose weight parameters are optimized over repeated training rounds until the parameters with the highest classification accuracy are obtained, yielding the optimal classification model.
2. The method according to claim 1, characterized in that: extracting the features of each frequency-domain sub-image means computing the energy parameter of each sub-image as a texture descriptor; preliminary feature extraction is performed on images of different classes and different resolutions to determine the optimal texture-feature difference threshold.
3. The method according to claim 2, characterized in that: after the wavelet transform, the energy features of each sub-image are normalized and then input to the convolutional neural network.
4. The method according to claim 3, characterized in that: after the image data is input to the network, the weights of the pooling and convolutional layers of the convolutional neural network are updated to obtain better classification parameters.
The wavelet transform according to claim 2 is as follows:
the image is first discretized according to formula (1);
$\psi_{j,k}(x)=a_0^{-j/2}\,\psi\!\left(a_0^{-j}x-kb_0\right)$ (1)
where $a_0$ is the scale parameter, $b_0$ is the location parameter, and $\psi_{j,k}$ is the output of the wavelet discretization;
then the wavelet transform is computed by fast decomposition according to formulas (2)-(5), where h and g are respectively the low-pass and high-pass filters of the wavelet transform;
$A_j(k,l)=\sum_{k_1}\sum_{k_2}h(k_1-2k)\,h(k_2-2l)\,f_{j-1}(k_1,k_2)$ (2)
$D_j^{1}(k,l)=\sum_{k_1}\sum_{k_2}g(k_1-2k)\,h(k_2-2l)\,f_{j-1}(k_1,k_2)$ (3)
$D_j^{2}(k,l)=\sum_{k_1}\sum_{k_2}h(k_1-2k)\,g(k_2-2l)\,f_{j-1}(k_1,k_2)$ (4)
$D_j^{3}(k,l)=\sum_{k_1}\sum_{k_2}g(k_1-2k)\,g(k_2-2l)\,f_{j-1}(k_1,k_2)$ (5)
5. The method according to claim 4, characterized in that:
the convolutional-layer update of the convolutional neural network is as follows:
in each convolutional layer of the network, the feature maps of the preceding layer are convolved with convolution kernels obtained by learning to produce new feature maps, using formula (6);
$x_j^{l}=f\Big(\sum_{i\in M_j}X_i^{l-1}*k_{ij}^{l}+b_j^{l}\Big)$ (6)
where $M_j$ denotes the selected combination of input feature maps, $K_{i,j}$ is the convolution kernel used in the connection between the i-th input feature map and the j-th output feature map, $b_j$ is the bias of the j-th feature map, and f is the activation function;
the sensitivity of the neurons of the convolutional neural network is calculated using formula (7), where up(·) denotes the up-sampling operation;
the gradient of the error cost function with respect to the convolution kernels is
$\frac{\partial E}{\partial K_{i,j}^{l}}=\sum_{u,v}\big(\delta_j^{l}\big)_{u,v}\big(P_i^{l-1}\big)_{u,v}$ (8)
in the down-sampling layer, a new feature map is obtained by formula (9);
$x_j^{l}=f\big(\beta_j^{l}\,\mathrm{down}(x_j^{l-1})+b_j^{l}\big)$ (9)
the corresponding sensitivity is calculated;
finally the gradients are computed to obtain better network weight parameters according to formula (11);
$\frac{\partial E}{\partial b_j}=\sum_{u,v}\big(\delta_j^{l}\big)_{u,v}$ (11)
The high-similarity image recognition and classification method combining the wavelet transform and the convolutional neural network overcomes the shortcoming of insufficient fine-grained texture-feature extraction from image data, improves the generalization ability of classification on image data, and significantly improves overall image classification performance.
CN201810006119.8A 2018-01-03 2018-01-03 A high-similarity image recognition and classification method Pending CN108062575A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810006119.8A CN108062575A (en) 2018-01-03 2018-01-03 A high-similarity image recognition and classification method


Publications (1)

Publication Number Publication Date
CN108062575A true CN108062575A (en) 2018-05-22

Family

ID=62141119

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810006119.8A Pending CN108062575A (en) 2018-01-03 2018-01-03 A high-similarity image recognition and classification method

Country Status (1)

Country Link
CN (1) CN108062575A (en)


Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103971106A * 2014-05-27 2014-08-06 深圳市赛为智能股份有限公司 Multi-view facial image gender recognition method and device
CN104751186A * 2015-04-10 2015-07-01 山东师范大学 Iris image quality classification method based on BP (back propagation) network and wavelet transform
CN107239751A * 2017-05-22 2017-10-10 西安电子科技大学 High-resolution SAR image classification method based on a non-subsampled contourlet fully convolutional network
CN107368791A * 2017-06-29 2017-11-21 广东欧珀移动通信有限公司 Living iris detection method and related product
CN107463948A * 2017-07-13 2017-12-12 西安电子科技大学 Multispectral image classification method based on a dual-channel multi-feature fusion network


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Gao Qiang (高强) et al.: "Research on insulator fault recognition algorithm based on convolutional neural network", 《电测与仪表》 (Electrical Measurement & Instrumentation) *

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI703961B (en) * 2018-07-06 2020-09-11 台達電子工業股份有限公司 Oral image analysis system and method
CN109145723A * 2018-07-09 2019-01-04 长江大学 A seal recognition method, system, terminal device and storage medium
CN109034256A * 2018-08-02 2018-12-18 燕山大学 A breast tumor detection system and method based on LTP and HOG feature fusion
CN111382764A (en) * 2018-12-29 2020-07-07 北大方正集团有限公司 Neural network model establishing method and device and computer readable storage medium
CN111382764B (en) * 2018-12-29 2024-02-13 新方正控股发展有限责任公司 Neural network model building method and device for face recognition or gesture recognition and computer readable storage medium
CN111079845A (en) * 2019-12-20 2020-04-28 西安交通大学 Classification and identification method for coal gangue and coal in coal
CN113362096A (en) * 2020-03-04 2021-09-07 驰众信息技术(上海)有限公司 Frame advertisement image matching method based on deep learning
CN111767927A (en) * 2020-06-03 2020-10-13 上海交通大学 Lightweight license plate recognition method and system based on full convolution network
CN114067392A (en) * 2021-11-06 2022-02-18 北京九州安华信息安全技术有限公司 Image identification method and device based on Riesz wavelet transformation
CN114169430A (en) * 2021-12-06 2022-03-11 深圳依时货拉拉科技有限公司 Method and device for identifying copied image, storage medium and computer equipment

Similar Documents

Publication Publication Date Title
CN108062575A (en) A high-similarity image recognition and classification method
Mei et al. Hyperspectral image classification using group-aware hierarchical transformer
CN108460342A (en) Hyperspectral image classification method based on convolutional and recurrent neural networks
CN103366180A (en) Cell image segmentation method based on automatic feature learning
Liu et al. The classification and denoising of image noise based on deep neural networks
CN103927531A (en) Human face recognition method based on local binary value and PSO BP neural network
CN110647977B (en) Method for optimizing Tiny-YOLO network for detecting ship target on satellite
CN111080678A (en) Multi-temporal SAR image change detection method based on deep learning
CN114155371A (en) Semantic segmentation method based on channel attention and pyramid convolution fusion
CN112950780A (en) Intelligent network map generation method and system based on remote sensing image
Li et al. Automatic Recognition of Rock Images Based on Convolutional Neural Network and Discrete Cosine Transform.
Gong et al. Multispectral and SAR image fusion for multi-scale decomposition based on least squares optimization rolling guidance filtering
CN114937202A (en) Double-current Swin transform remote sensing scene classification method
Liu et al. High-resolution remote sensing image information extraction and target recognition based on multiple information fusion
Ichimura Spatial frequency loss for learning convolutional autoencoders
CN111275732B (en) Foreground object image segmentation method based on depth convolution neural network
CN112733736A (en) Class imbalance hyperspectral image classification method based on enhanced oversampling
Rezvani et al. Single image denoising via a new lightweight learning-based model
CN112818840B (en) Unmanned aerial vehicle online detection system and method
CN116594061B (en) Seismic data denoising method based on multi-scale U-shaped attention network
Yuan Remote sensing image classification methods based on CNN: Challenge and trends
CN117953310A (en) Remote sensing multi-mode image classification method based on continuous scale feature network
CN112418262A (en) Vehicle re-identification method, client and system
Ma et al. An improved 3D-2D convolutional neural network based on feature optimization for hyperspectral image classification
Premk et al. Automatic latent fingerprint segmentation using convolutional neural networks

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20180522