CN105718934A - Method for pest image feature learning and identification based on low-rank sparse coding technology - Google Patents

Method for pest image feature learning and identification based on low-rank sparse coding technology

Info

Publication number
CN105718934A
Authority
CN
China
Prior art keywords
lambda
pest
low
images
sparse coding
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201610048633.9A
Other languages
Chinese (zh)
Inventor
朱会宾
宋良图
谢成军
王伟
胡文皓
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Anhui Z-Hope Technology Co Ltd
Wuxi Zhongke Funong Internet Of Things Technology Co Ltd
Jiangsu IoT Research and Development Center
Original Assignee
Anhui Z-Hope Technology Co Ltd
Wuxi Zhongke Funong Internet Of Things Technology Co Ltd
Jiangsu IoT Research and Development Center
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Anhui Z-Hope Technology Co Ltd, Wuxi Zhongke Funong Internet Of Things Technology Co Ltd, Jiangsu IoT Research and Development Center filed Critical Anhui Z-Hope Technology Co Ltd
Priority to CN201610048633.9A priority Critical patent/CN105718934A/en
Publication of CN105718934A publication Critical patent/CN105718934A/en
Pending legal-status Critical Current


Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 - Arrangements for image or video recognition or understanding
    • G06V10/40 - Extraction of image or video features
    • G06V10/46 - Descriptors for shape, contour or point-related descriptors, e.g. scale invariant feature transform [SIFT] or bags of words [BoW]; Salient regional features
    • G06V10/462 - Salient features, e.g. scale invariant feature transforms [SIFT]
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 - Pattern recognition
    • G06F18/20 - Analysing
    • G06F18/21 - Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/213 - Feature extraction, e.g. by transforming the feature space; Summarisation; Mappings, e.g. subspace methods
    • G06F18/2136 - Feature extraction based on sparsity criteria, e.g. with an overcomplete basis
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 - Pattern recognition
    • G06F18/20 - Analysing
    • G06F18/24 - Classification techniques
    • G06F18/241 - Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06F18/2411 - Classification techniques based on the proximity to a decision surface, e.g. support vector machines

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Data Mining & Analysis (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • General Engineering & Computer Science (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Multimedia (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)

Abstract

The invention relates to a method for pest image feature learning and identification based on a low-rank sparse coding technology. Compared with the prior art, the method overcomes the shortcoming that pest images are difficult to identify. The method comprises the steps of: performing large-scale random sampling of pest training sample images and decomposing the images into a plurality of superpixel regions by an image segmentation method; collecting scale-invariant feature transform (SIFT) features in each superpixel region of the pest images; encoding and learning the local features of the images with the low-rank sparse coding technology; and performing multi-class classifier identification, in which the images to be classified are obtained and the local features are trained and classified by a multi-class classifier to determine the pest category of each sample. The method greatly improves computational efficiency and accuracy.

Description

Pest image feature learning and identifying method based on low-rank sparse coding technology
Technical Field
The invention relates to the technical field of image recognition processing, in particular to a pest image feature learning and recognition method based on a low-rank sparse coding technology.
Background
In modern agriculture, computer vision technology is often used to identify crop pest images. Because pests vary widely in shape and the imaging environment is complex, extracting the image characteristics of pests is particularly important. Image visual features (hereinafter referred to as image features) are a coding technique used in computer vision for machine learning and image perception. Image features are classified into global features and local features, and the most common local feature is the scale-invariant feature transform (SIFT) feature. Sparse coding is a coding technique that expresses a vector as sparsely as possible with a group of overcomplete bases; it is widely applied in various fields of machine learning such as compressed sensing, image restoration and face recognition, and achieves good results. It is generally accepted in academia that image data have a sparse structure. Owing to its great success in the field of image processing, sparse coding has become one of the widely used techniques in computer vision. Therefore, how to combine SIFT features with sparse coding has become an urgent technical problem to be solved in the field of pest identification.
Disclosure of Invention
The invention aims to overcome the defect of the prior art that pest images are difficult to identify, and provides a pest image feature learning and identification method based on a low-rank sparse coding technology.
In order to achieve the purpose, the technical scheme of the invention is as follows:
a pest image feature learning and identifying method based on a low-rank sparse coding technology comprises the following steps:
randomly sampling and decomposing pest training sample images, randomly sampling large-scale pest training sample images, and decomposing the images into a plurality of super-pixel regions by an image segmentation method;
acquiring SIFT characteristics, and acquiring SIFT characteristics in each super-pixel region of the pest image;
coding and learning local features of the image by adopting a low-rank sparse coding technology;
and performing multi-class classifier identification: acquiring the images to be classified, training and learning the local features through the multi-class classifier, and determining the pest category of the test sample.
The method for acquiring SIFT features comprises the following steps:
if each super-pixel region comprises n local features, and SIFT descriptors of the n local features form a matrix X, then
X = [\vec{x}_1, \vec{x}_2, \ldots, \vec{x}_n],
wherein each column is a d-dimensional vector in R^d representing one local feature point;
based on an overcomplete dictionary D = [\vec{d}_1, \vec{d}_2, \ldots, \vec{d}_m],
the matrix X can be represented as X = DZ, and the following minimization expression is proposed:
\min_Z \frac{1}{2}\|X - DZ\|_F^2 + \lambda_1\|Z\|_* + \lambda_2\|Z\|_{1,1},
wherein \|Z\|_* is the nuclear norm, representing the low-rank factor of the matrix; \|Z\|_{1,1} expresses the sparsification factor; \lambda_1 is the weight coefficient of the low-rank term; \lambda_2 is the weight coefficient of the sparsity term;
SIFT features are collected in each super-pixel area of the pest image until all super-pixel areas have been processed.
The method for coding and learning the image local features by adopting the low-rank sparse coding technology comprises the following steps:
for the minimization expression
\min_Z \frac{1}{2}\|X - DZ\|_F^2 + \lambda_1\|Z\|_* + \lambda_2\|Z\|_{1,1},
introducing two auxiliary variables and two equality constraints yields:
\min_{Z_1, Z_2, Z_3} \frac{1}{2}\|X - DZ_3\|_F^2 + \lambda_1\|Z_1\|_* + \lambda_2\|Z_2\|_{1,1} \quad \text{s.t. } Z_3 = Z_1,\ Z_3 = Z_2;
two operators are introduced:
S_\lambda(A_{ij}) = \begin{cases} 0, & |A_{ij}| - \lambda \le 0 \\ \mathrm{sign}(A_{ij})\,(|A_{ij}| - \lambda), & |A_{ij}| - \lambda > 0 \end{cases}
J_\lambda(A) = U_A S_\lambda(\Sigma_A) V_A^T, wherein A = U_A \Sigma_A V_A^T;
with these two operators, the following closed-form solutions are obtained:
X^* = \arg\min_X \|X - A\|_F^2 + 2\lambda\|X\|_{1,1} = S_\lambda(A),
X^* = \arg\min_X \|X - A\|_F^2 + 2\lambda\|X\|_* = J_\lambda(A);
the constraints are added into the objective function by means of Lagrange multipliers:
L(Z_1, Z_2, Z_3) = \frac{1}{2}\|X - DZ_3\|_F^2 + \lambda_1\|Z_1\|_* + \lambda_2\|Z_2\|_{1,1} + \mathrm{tr}[Y_1^T(Z_3 - Z_1)] + \frac{\mu_1}{2}\|Z_3 - Z_1\|_F^2 + \mathrm{tr}[Y_2^T(Z_3 - Z_2)] + \frac{\mu_2}{2}\|Z_3 - Z_2\|_F^2,
wherein Y_1 and Y_2 are the Lagrange multipliers, and \mu_1 and \mu_2 are two penalty parameters;
solving the objective function minimization problem through IALM.
The solving of the objective function minimization problem through IALM comprises the following steps:
solving for Z_1 with Z_2, Z_3, Y_1, \mu_1, \mu_2 fixed:
Z_1^* = \arg\min_{Z_1} \frac{\lambda_1}{\mu_1}\|Z_1\|_* + \frac{1}{2}\|Z_1 - (Z_3 + \frac{1}{\mu_1}Y_1)\|_F^2 \;\Rightarrow\; Z_1^* = J_{\lambda_1/\mu_1}(Z_3 + \frac{1}{\mu_1}Y_1);
solving for Z_2 with Z_1, Z_3, Y_2, \mu_1, \mu_2 fixed:
Z_2^* = \arg\min_{Z_2} \frac{\lambda_2}{\mu_2}\|Z_2\|_{1,1} + \frac{1}{2}\|Z_2 - (Z_3 + \frac{1}{\mu_2}Y_2)\|_F^2 \;\Rightarrow\; Z_2^* = S_{\lambda_2/\mu_2}(Z_3 + \frac{1}{\mu_2}Y_2);
solving for Z_3 with Z_1, Z_2, Y_1, Y_2, \mu_1, \mu_2 fixed:
Z_3^* = \arg\min_{Z_3} \frac{1}{2}\|X - DZ_3\|_F^2 + \mathrm{tr}[Y_1^T(Z_3 - Z_1)] + \frac{\mu_1}{2}\|Z_3 - Z_1\|_F^2 + \mathrm{tr}[Y_2^T(Z_3 - Z_2)] + \frac{\mu_2}{2}\|Z_3 - Z_2\|_F^2 \;\Rightarrow\; Z_3^* = (D^T D + \mu_1 I + \mu_2 I)^{-1} G,
wherein G = D^T X - Y_1 - Y_2 + \mu_1 Z_1 + \mu_2 Z_2;
updating Y_1, Y_2, \mu_1, \mu_2:
Y_1 = Y_1 + \mu_1(Z_3 - Z_1); \quad Y_2 = Y_2 + \mu_2(Z_3 - Z_2); \quad \mu_1 = \rho\mu_1; \quad \mu_2 = \rho\mu_2,
wherein \rho > 0 is a user-defined parameter;
judging whether the solving process has reached the preset number of iterations or the precision requirement; if so, ending the minimization solution and generating the minimized objective function L_{\min}(Z_1, Z_2, Z_3); if not, returning to the updates of Z_1, Z_2, Z_3, Y_1, Y_2, \mu_1 and \mu_2.
Advantageous effects
Compared with the prior art, the pest image feature learning and identification method based on the low-rank sparse coding technology adopts a more robust low-rank sparse learning method for feature coding, which mines the correlations among different local features; under the condition of ensuring sparsity, the similarity of local region features is introduced when constructing the overcomplete dictionary; and the sparsity and spatial correlation of SIFT features are exploited, so that the computational efficiency is greatly improved. Compared with existing advanced feature coding methods, both the computational efficiency and the accuracy are greatly improved.
Drawings
FIG. 1 is a flow chart of the method of the present invention.
Detailed Description
So that the above-recited features of the present invention can be readily understood, a more particular description of the invention, briefly summarized above, is given below with reference to embodiments, some of which are illustrated in the appended drawings:
in the academic world, it has been agreed that image data have a sparse structure, due to great success in the field of image processing, sparse coding techniques have become one of the widely used techniques in the field of computer visionm×nIs low rank or nearly low rank, so a low rank version of the acquisition matrix can be recovered by the low rank matrix. The imprecise Lagrange multiplier method (Inexact augmented Lagrange Multiplier, hereinafter referred to as simply "Inexact Lagrange Multiplier")IALM) has the characteristics of high speed, small storage space and the like, and is widely applied to solving the problem of low-rank matrix recovery.
As shown in FIG. 1, the pest image feature learning and identification method based on the low-rank sparse coding technology comprises the following steps:
Firstly, the pest training sample images are randomly sampled and decomposed. In a laboratory environment, pest images are collected; that is, large-scale pest training sample images are randomly sampled to determine the local features of the pests. Each training sample image is decomposed into a plurality of super-pixel regions by a common image segmentation method, and the size of a super-pixel region is typically about 30 × 30 pixels.
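As an illustration of this first step, the sketch below decomposes a pest image into super-pixel regions with scikit-image's SLIC; the patent only calls for a common image segmentation method, so the choice of SLIC, the segment count and the file name are assumptions made for the example.

```python
# Minimal sketch (illustrative, not the patent's prescribed method): decompose a
# pest training image into super-pixel regions using SLIC segmentation.
import numpy as np
from skimage import io
from skimage.segmentation import slic

def decompose_into_superpixels(image_path, n_segments=200):
    """Return the image and an integer label map of its super-pixel regions."""
    image = io.imread(image_path)                       # H x W x 3 pest image
    labels = slic(image, n_segments=n_segments, compactness=10)
    return image, labels

# Example usage (hypothetical file name):
# image, labels = decompose_into_superpixels("pest_sample.jpg")
# region_ids = np.unique(labels)                        # one id per super-pixel region
```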
Secondly, SIFT features are acquired in each super-pixel region of the pest image. The method specifically comprises the following steps:
(1) if each super-pixel region comprises n local features, and SIFT descriptors of the n local features form a matrix X, then
X = [\vec{x}_1, \vec{x}_2, \ldots, \vec{x}_n],
wherein each column is a d-dimensional vector in R^d representing one local feature point; d is typically 128.
(2) Based on an overcomplete dictionary D = [\vec{d}_1, \vec{d}_2, \ldots, \vec{d}_m], each feature point of the matrix X is a linear combination of the vectors of the overcomplete dictionary; therefore,
for the matrix X there exists a representation X = DZ.
The following two points are considered here:
first, since the same local area tends to have similar descriptors, their representation with respect to D is similar, i.e. the representation matrix Z is low rank.
Second, because the dictionary D is overcomplete, the linear representation of each feature with respect to D also tends to be sparse.
Based on the above two considerations, we propose the following minimization expression:
\min_Z \frac{1}{2}\|X - DZ\|_F^2 + \lambda_1\|Z\|_* + \lambda_2\|Z\|_{1,1},
wherein \|Z\|_* is the nuclear norm, representing the low-rank factor of the matrix; \|Z\|_{1,1} expresses the sparsification factor; \lambda_1 is the weight coefficient of the low-rank term; \lambda_2 is the weight coefficient of the sparsity term.
(3) SIFT features are collected in each super-pixel area of the pest image until all super-pixel areas have been processed.
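As an illustration of this step, the sketch below assembles the descriptor matrix X for a single super-pixel region; OpenCV's SIFT implementation and the mask-based restriction to one region are assumptions made for the example, not requirements of the method.

```python
# Minimal sketch (illustrative): collect the SIFT descriptors that fall inside one
# super-pixel region and stack them as the columns of X, so each column is a
# d-dimensional descriptor (d = 128 for standard SIFT).
import numpy as np
import cv2

def sift_matrix_for_region(gray_image, labels, region_id):
    """Return X of shape (128, n): SIFT descriptors of one super-pixel region."""
    sift = cv2.SIFT_create()
    mask = (labels == region_id).astype(np.uint8) * 255   # restrict detection to the region
    keypoints, descriptors = sift.detectAndCompute(gray_image, mask)
    if descriptors is None:                                # the region may contain no keypoints
        return np.zeros((128, 0))
    return descriptors.T                                   # columns x_1, ..., x_n
```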
Thirdly, the local features of the image are coded and learned by the low-rank sparse coding technique. The specific steps are as follows:
(1) for the minimized expression obtained in the second step
\min_Z \frac{1}{2}\|X - DZ\|_F^2 + \lambda_1\|Z\|_* + \lambda_2\|Z\|_{1,1},
introducing two auxiliary variables and two equality constraints yields:
\min_{Z_1, Z_2, Z_3} \frac{1}{2}\|X - DZ_3\|_F^2 + \lambda_1\|Z_1\|_* + \lambda_2\|Z_2\|_{1,1} \quad \text{s.t. } Z_3 = Z_1,\ Z_3 = Z_2.
(2) As can be seen from the above, after the two auxiliary variables and the two equality constraints are added, the minimization expression is converted into a problem solvable by IALM, so two operators are introduced here:
S_\lambda(A_{ij}) = \begin{cases} 0, & |A_{ij}| - \lambda \le 0 \\ \mathrm{sign}(A_{ij})\,(|A_{ij}| - \lambda), & |A_{ij}| - \lambda > 0 \end{cases}
J_\lambda(A) = U_A S_\lambda(\Sigma_A) V_A^T, wherein A = U_A \Sigma_A V_A^T;
with these two operators, the following closed-form solutions are obtained:
X^* = \arg\min_X \|X - A\|_F^2 + 2\lambda\|X\|_{1,1} = S_\lambda(A),
X^* = \arg\min_X \|X - A\|_F^2 + 2\lambda\|X\|_* = J_\lambda(A).
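The two operators introduced above have a direct matrix realization; the short NumPy sketch below (function names are ours) implements S_λ as element-wise soft-thresholding and J_λ as singular-value thresholding via the SVD.

```python
# Sketch of the two operators: S_lambda (element-wise soft-thresholding, the
# proximal operator of the ||.||_{1,1} term) and J_lambda (singular-value
# thresholding, the proximal operator of the nuclear norm ||.||_*).
import numpy as np

def soft_threshold(A, lam):
    """S_lambda(A): shrink every entry of A toward zero by lam."""
    return np.sign(A) * np.maximum(np.abs(A) - lam, 0.0)

def singular_value_threshold(A, lam):
    """J_lambda(A) = U_A * S_lambda(Sigma_A) * V_A^T, computed via the SVD of A."""
    U, sigma, Vt = np.linalg.svd(A, full_matrices=False)
    return U @ np.diag(soft_threshold(sigma, lam)) @ Vt
```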
(3) The constraints are added into the objective function by means of Lagrange multipliers:
L(Z_1, Z_2, Z_3) = \frac{1}{2}\|X - DZ_3\|_F^2 + \lambda_1\|Z_1\|_* + \lambda_2\|Z_2\|_{1,1} + \mathrm{tr}[Y_1^T(Z_3 - Z_1)] + \frac{\mu_1}{2}\|Z_3 - Z_1\|_F^2 + \mathrm{tr}[Y_2^T(Z_3 - Z_2)] + \frac{\mu_2}{2}\|Z_3 - Z_2\|_F^2,
wherein Y_1 and Y_2 are the Lagrange multipliers, and \mu_1 and \mu_2 are two penalty parameters.
(4) The objective function minimization problem is solved by IALM, which comprises the following steps: A. solving for Z_1 with Z_2, Z_3, Y_1, \mu_1, \mu_2 fixed:
Z_1^* = \arg\min_{Z_1} \frac{\lambda_1}{\mu_1}\|Z_1\|_* + \frac{1}{2}\|Z_1 - (Z_3 + \frac{1}{\mu_1}Y_1)\|_F^2 \;\Rightarrow\; Z_1^* = J_{\lambda_1/\mu_1}(Z_3 + \frac{1}{\mu_1}Y_1);
B. solving for Z_2 with Z_1, Z_3, Y_2, \mu_1, \mu_2 fixed:
Z_2^* = \arg\min_{Z_2} \frac{\lambda_2}{\mu_2}\|Z_2\|_{1,1} + \frac{1}{2}\|Z_2 - (Z_3 + \frac{1}{\mu_2}Y_2)\|_F^2 \;\Rightarrow\; Z_2^* = S_{\lambda_2/\mu_2}(Z_3 + \frac{1}{\mu_2}Y_2);
C. solving for Z_3 with Z_1, Z_2, Y_1, Y_2, \mu_1, \mu_2 fixed:
Z_3^* = \arg\min_{Z_3} \frac{1}{2}\|X - DZ_3\|_F^2 + \mathrm{tr}[Y_1^T(Z_3 - Z_1)] + \frac{\mu_1}{2}\|Z_3 - Z_1\|_F^2 + \mathrm{tr}[Y_2^T(Z_3 - Z_2)] + \frac{\mu_2}{2}\|Z_3 - Z_2\|_F^2 \;\Rightarrow\; Z_3^* = (D^T D + \mu_1 I + \mu_2 I)^{-1} G,
wherein G = D^T X - Y_1 - Y_2 + \mu_1 Z_1 + \mu_2 Z_2;
D. updating Y_1, Y_2, \mu_1, \mu_2:
Y_1 = Y_1 + \mu_1(Z_3 - Z_1); \quad Y_2 = Y_2 + \mu_2(Z_3 - Z_2); \quad \mu_1 = \rho\mu_1; \quad \mu_2 = \rho\mu_2,
wherein \rho > 0 is a user-defined parameter;
E. judging whether the solving process has reached the preset number of iterations or the precision requirement; if so, ending the minimization solution and generating the minimized objective function L_{\min}(Z_1, Z_2, Z_3); if not, returning to the updates of Z_1, Z_2, Z_3, Y_1, Y_2, \mu_1 and \mu_2.
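Sub-steps A to E admit a compact matrix implementation; the NumPy sketch below (an illustration, not the patent's reference implementation) strings the updates together, with the two thresholding operators repeated so the block is self-contained. The initial values of μ1 and μ2, the choice ρ = 1.1, the tolerance and the iteration cap are assumptions chosen for the example.

```python
# Sketch of the IALM iteration for min_Z 0.5*||X - DZ||_F^2 + lam1*||Z||_* + lam2*||Z||_{1,1}.
# Parameter values below are illustrative, not prescribed by the patent.
import numpy as np

def soft_threshold(A, lam):                       # S_lambda: element-wise shrinkage
    return np.sign(A) * np.maximum(np.abs(A) - lam, 0.0)

def singular_value_threshold(A, lam):             # J_lambda: shrink the singular values
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    return U @ np.diag(soft_threshold(s, lam)) @ Vt

def low_rank_sparse_coding(X, D, lam1=0.1, lam2=0.1, mu1=1.0, mu2=1.0,
                           rho=1.1, max_iter=100, tol=1e-6):
    """Return the code matrix Z (approximately low rank and sparse) with X ~ D @ Z."""
    m, n = D.shape[1], X.shape[1]
    Z1, Z2, Z3 = (np.zeros((m, n)) for _ in range(3))
    Y1, Y2 = np.zeros((m, n)), np.zeros((m, n))
    DtD, DtX = D.T @ D, D.T @ X
    for _ in range(max_iter):
        Z1 = singular_value_threshold(Z3 + Y1 / mu1, lam1 / mu1)       # step A
        Z2 = soft_threshold(Z3 + Y2 / mu2, lam2 / mu2)                 # step B
        G = DtX - Y1 - Y2 + mu1 * Z1 + mu2 * Z2                        # step C
        Z3 = np.linalg.solve(DtD + (mu1 + mu2) * np.eye(m), G)
        Y1 = Y1 + mu1 * (Z3 - Z1)                                      # step D
        Y2 = Y2 + mu2 * (Z3 - Z2)
        mu1, mu2 = rho * mu1, rho * mu2
        # step E: stop once both constraints Z3 = Z1 and Z3 = Z2 are nearly satisfied
        if max(np.abs(Z3 - Z1).max(), np.abs(Z3 - Z2).max()) < tol:
            break
    return Z3
```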
Fourthly, multi-class classifier identification is performed. The images to be classified are obtained, the local features are trained and learned by an existing multi-class classifier (a linear SVM classifier), and the pest category of the test sample is determined.
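A minimal scikit-learn sketch of this classification step is given below; the patent states only that the coded local features are fed to a linear SVM classifier, so the max-pooling used here to turn each image's code matrix Z into a fixed-length feature vector is an assumption made for illustration.

```python
# Sketch of the classification step: pool each image's low-rank sparse codes into a
# fixed-length feature and train a multi-class linear SVM. Max-pooling over the code
# matrix Z is one common choice and is assumed here for illustration.
import numpy as np
from sklearn.svm import LinearSVC

def pool_codes(Z):
    """Collapse a code matrix Z (m x n) into one m-dimensional image feature."""
    return Z.max(axis=1)

def train_and_predict(train_codes, train_labels, test_codes):
    X_train = np.vstack([pool_codes(Z) for Z in train_codes])
    X_test = np.vstack([pool_codes(Z) for Z in test_codes])
    classifier = LinearSVC()                 # multi-class via one-vs-rest
    classifier.fit(X_train, train_labels)
    return classifier.predict(X_test)        # predicted pest categories
```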
In the experiments on the D1 and D2 sample libraries, the D1 sample library comprises a training set and a test set: the training set comprises 20 butterfly images of size 128 × 128, and the test set comprises 720 butterfly images. The D2 sample library comprises 225 insect images in 9 categories, of which a part is randomly selected as training samples and the rest are used as test samples. The butterfly images in the D1 sample library all have similar appearance characteristics, such as shape, and their main distinguishing characteristics are texture features. In the experiments, the method of the present invention is compared with several existing image recognition and classification models, including ScSPM (sparse coding spatial pyramid matching), LLC (locality-constrained linear coding), LScSPM (Laplacian sparse coding), SC (salient coding) and LCSRC (locality-constrained and spatially regularized coding); the comparison results on the D1 and D2 sample libraries are shown in Table 1.
Table 1: Comparison of recognition rates (%) of different methods on the D1 and D2 sample libraries
As can be seen from Table 1, the method of the present invention is significantly superior to other prior art methods in image recognition rate, and has the characteristic of high efficiency in practical application.
The foregoing shows and describes the general principles, essential features, and advantages of the invention. It will be understood by those skilled in the art that the present invention is not limited to the embodiments described above, which are merely illustrative of the principles of the invention, but that various changes and modifications may be made without departing from the spirit and scope of the invention, which fall within the scope of the invention as claimed. The scope of the invention is defined by the appended claims and equivalents thereof.

Claims (4)

1. A pest image feature learning and identifying method based on a low-rank sparse coding technology is characterized by comprising the following steps of:
11) randomly sampling and decomposing pest training sample images, randomly sampling large-scale pest training sample images, and decomposing the images into a plurality of super-pixel regions by an image segmentation method;
12) acquiring SIFT characteristics, and acquiring SIFT characteristics in each super-pixel region of the pest image;
13) coding and learning local features of the image by adopting a low-rank sparse coding technology;
14) performing multi-class classifier identification: acquiring the images to be classified, training and learning the local features through the multi-class classifier, and determining the pest category of the test sample.
2. The pest image feature learning and identifying method based on the low-rank sparse coding technology as claimed in claim 1, wherein the collecting SIFT features comprises the following steps:
21) if each super-pixel region comprises n local features, and SIFT descriptors of the n local features form a matrix X, then
X = [\vec{x}_1, \vec{x}_2, \ldots, \vec{x}_n],
wherein each column is a d-dimensional vector in R^d representing one local feature point;
22) based on an overcomplete dictionary D = [\vec{d}_1, \vec{d}_2, \ldots, \vec{d}_m],
the matrix X can be represented as X = DZ, and the following minimization expression is proposed:
\min_Z \frac{1}{2}\|X - DZ\|_F^2 + \lambda_1\|Z\|_* + \lambda_2\|Z\|_{1,1},
wherein \|Z\|_* is the nuclear norm, representing the low-rank factor of the matrix; \|Z\|_{1,1} expresses the sparsification factor; \lambda_1 is the weight coefficient of the low-rank term; \lambda_2 is the weight coefficient of the sparsity term;
23) SIFT features are collected in each super-pixel area of the pest image until all super-pixel areas have been processed.
3. The pest image feature learning and identification method based on the low-rank sparse coding technology as claimed in claim 1, wherein the encoding and learning of the local features of the image by the low-rank sparse coding technology comprises the following steps:
31) for the minimization expression
\min_Z \frac{1}{2}\|X - DZ\|_F^2 + \lambda_1\|Z\|_* + \lambda_2\|Z\|_{1,1}
introducing two auxiliary variables and two equality constraints yields:
\min_{Z_1, Z_2, Z_3} \frac{1}{2}\|X - DZ_3\|_F^2 + \lambda_1\|Z_1\|_* + \lambda_2\|Z_2\|_{1,1} \quad \text{s.t. } Z_3 = Z_1,\ Z_3 = Z_2;
32) two operators are introduced:
S_\lambda(A_{ij}) = \begin{cases} 0, & |A_{ij}| - \lambda \le 0 \\ \mathrm{sign}(A_{ij})\,(|A_{ij}| - \lambda), & |A_{ij}| - \lambda > 0 \end{cases}
J_\lambda(A) = U_A S_\lambda(\Sigma_A) V_A^T, wherein A = U_A \Sigma_A V_A^T;
with these two operators, the following closed-form solutions are obtained:
X^* = \arg\min_X \|X - A\|_F^2 + 2\lambda\|X\|_{1,1} = S_\lambda(A),
X^* = \arg\min_X \|X - A\|_F^2 + 2\lambda\|X\|_* = J_\lambda(A);
33) the constraints are added into the objective function by means of Lagrange multipliers:
L(Z_1, Z_2, Z_3) = \frac{1}{2}\|X - DZ_3\|_F^2 + \lambda_1\|Z_1\|_* + \lambda_2\|Z_2\|_{1,1} + \mathrm{tr}[Y_1^T(Z_3 - Z_1)] + \frac{\mu_1}{2}\|Z_3 - Z_1\|_F^2 + \mathrm{tr}[Y_2^T(Z_3 - Z_2)] + \frac{\mu_2}{2}\|Z_3 - Z_2\|_F^2,
wherein Y_1 and Y_2 are the Lagrange multipliers, and \mu_1 and \mu_2 are two penalty parameters;
34) solving the objective function minimization problem through IALM.
4. The pest image feature learning and identification method based on the low-rank sparse coding technology as claimed in claim 3, wherein the solving of the objective function minimization problem by IALM comprises the following steps:
41) solving for Z_1 with Z_2, Z_3, Y_1, \mu_1, \mu_2 fixed:
Z_1^* = \arg\min_{Z_1} \frac{\lambda_1}{\mu_1}\|Z_1\|_* + \frac{1}{2}\|Z_1 - (Z_3 + \frac{1}{\mu_1}Y_1)\|_F^2 \;\Rightarrow\; Z_1^* = J_{\lambda_1/\mu_1}(Z_3 + \frac{1}{\mu_1}Y_1);
42) solving for Z_2 with Z_1, Z_3, Y_2, \mu_1, \mu_2 fixed:
Z_2^* = \arg\min_{Z_2} \frac{\lambda_2}{\mu_2}\|Z_2\|_{1,1} + \frac{1}{2}\|Z_2 - (Z_3 + \frac{1}{\mu_2}Y_2)\|_F^2 \;\Rightarrow\; Z_2^* = S_{\lambda_2/\mu_2}(Z_3 + \frac{1}{\mu_2}Y_2);
43) solving for Z_3 with Z_1, Z_2, Y_1, Y_2, \mu_1, \mu_2 fixed:
Z_3^* = \arg\min_{Z_3} \frac{1}{2}\|X - DZ_3\|_F^2 + \mathrm{tr}[Y_1^T(Z_3 - Z_1)] + \frac{\mu_1}{2}\|Z_3 - Z_1\|_F^2 + \mathrm{tr}[Y_2^T(Z_3 - Z_2)] + \frac{\mu_2}{2}\|Z_3 - Z_2\|_F^2 \;\Rightarrow\; Z_3^* = (D^T D + \mu_1 I + \mu_2 I)^{-1} G,
wherein G = D^T X - Y_1 - Y_2 + \mu_1 Z_1 + \mu_2 Z_2;
44) updating Y_1, Y_2, \mu_1, \mu_2:
Y_1 = Y_1 + \mu_1(Z_3 - Z_1); \quad Y_2 = Y_2 + \mu_2(Z_3 - Z_2); \quad \mu_1 = \rho\mu_1; \quad \mu_2 = \rho\mu_2,
wherein \rho > 0 is a user-defined parameter;
45) judging whether the solving process has reached the preset number of iterations or the precision requirement; if so, ending the minimization solution and generating the minimized objective function L_{\min}(Z_1, Z_2, Z_3); if not, returning to the updates of Z_1, Z_2, Z_3, Y_1, Y_2, \mu_1 and \mu_2.
CN201610048633.9A 2016-01-25 2016-01-25 Method for pest image feature learning and identification based on low-rank sparse coding technology Pending CN105718934A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610048633.9A CN105718934A (en) 2016-01-25 2016-01-25 Method for pest image feature learning and identification based on low-rank sparse coding technology

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201610048633.9A CN105718934A (en) 2016-01-25 2016-01-25 Method for pest image feature learning and identification based on low-rank sparse coding technology

Publications (1)

Publication Number Publication Date
CN105718934A true CN105718934A (en) 2016-06-29

Family

ID=56154991

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610048633.9A Pending CN105718934A (en) 2016-01-25 2016-01-25 Method for pest image feature learning and identification based on low-rank sparse coding technology

Country Status (1)

Country Link
CN (1) CN105718934A (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070258648A1 (en) * 2006-05-05 2007-11-08 Xerox Corporation Generic visual classification with gradient components-based dimensionality enhancement
CN103632138A (en) * 2013-11-20 2014-03-12 南京信息工程大学 Low-rank partitioning sparse representation human face identifying method
CN103761537A (en) * 2014-02-07 2014-04-30 重庆市国土资源和房屋勘测规划院 Image classification method based on low-rank optimization feature dictionary model
CN104102922A (en) * 2014-07-15 2014-10-15 中国科学院合肥物质科学研究院 Pest image classification method based on context sensing dictionary learning
CN104318261A (en) * 2014-11-03 2015-01-28 河南大学 Graph embedding low-rank sparse representation recovery sparse representation face recognition method
CN105184298A (en) * 2015-08-27 2015-12-23 重庆大学 Image classification method through fast and locality-constrained low-rank coding process

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106326903A (en) * 2016-08-31 2017-01-11 中国科学院空间应用工程与技术中心 Typical target recognition method based on affine scaling invariant feature and sparse representation
CN106886797A (en) * 2017-02-24 2017-06-23 电子科技大学 A kind of high resolution detection and recognition methods to composite debonding defect
CN106886797B (en) * 2017-02-24 2019-11-19 电子科技大学 The high resolution detection and recognition methods of a kind of pair of composite material debonding defect
CN109063738A (en) * 2018-07-03 2018-12-21 浙江理工大学 A kind of ceramic water valve plates automatic on-line detection method of compressed sensing
CN109063738B (en) * 2018-07-03 2021-12-21 浙江理工大学 Automatic online detection method for compressed sensing ceramic water valve plate
CN110265039A (en) * 2019-06-03 2019-09-20 南京邮电大学 A kind of method for distinguishing speek person decomposed based on dictionary learning and low-rank matrix
CN110265039B (en) * 2019-06-03 2021-07-02 南京邮电大学 Speaker recognition method based on dictionary learning and low-rank matrix decomposition
CN113128514A (en) * 2021-04-26 2021-07-16 山东大学 Cotton pest positioning and classifying identification method and system

Similar Documents

Publication Publication Date Title
CN112750140B (en) Information mining-based disguised target image segmentation method
CN105678284B (en) A kind of fixed bit human body behavior analysis method
CN108108751B (en) Scene recognition method based on convolution multi-feature and deep random forest
CN103400143B (en) A kind of data Subspace clustering method based on various visual angles
CN105528595A (en) Method for identifying and positioning power transmission line insulators in unmanned aerial vehicle aerial images
CN105718934A (en) Method for pest image feature learning and identification based on low-rank sparse coding technology
CN105160310A (en) 3D (three-dimensional) convolutional neural network based human body behavior recognition method
CN107977661B (en) Region-of-interest detection method based on FCN and low-rank sparse decomposition
CN103902989B (en) Human action video frequency identifying method based on Non-negative Matrix Factorization
CN113052185A (en) Small sample target detection method based on fast R-CNN
CN105740833A (en) Human body behavior identification method based on depth sequence
CN110458192B (en) Hyperspectral remote sensing image classification method and system based on visual saliency
CN105205449A (en) Sign language recognition method based on deep learning
CN107944459A (en) A kind of RGB D object identification methods
CN105243154A (en) Remote sensing image retrieval method and system based on significant point characteristics and spare self-encodings
CN109635726B (en) Landslide identification method based on combination of symmetric deep network and multi-scale pooling
Verma et al. Wild animal detection from highly cluttered images using deep convolutional neural network
Han et al. Adaptive spatial-scale-aware deep convolutional neural network for high-resolution remote sensing imagery scene classification
Abdollahifard et al. Stochastic simulation of patterns using Bayesian pattern modeling
CN112381144A (en) Heterogeneous deep network method for non-European and European domain space spectrum feature learning
CN106803105B (en) Image classification method based on sparse representation dictionary learning
CN107845064B (en) Image super-resolution reconstruction method based on active sampling and Gaussian mixture model
CN104715266A (en) Image characteristics extracting method based on combination of SRC-DP and LDA
CN115424288A (en) Visual Transformer self-supervision learning method and system based on multi-dimensional relation modeling
Salem et al. Semantic image inpainting using self-learning encoder-decoder and adversarial loss

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20160629
