CN108573263A - A dictionary learning method combining structured sparse representation and low-dimensional embedding - Google Patents

A dictionary learning method combining structured sparse representation and low-dimensional embedding

Info

Publication number
CN108573263A
CN108573263A CN201810444013.6A CN201810444013A CN108573263A CN 108573263 A CN108573263 A CN 108573263A CN 201810444013 A CN201810444013 A CN 201810444013A CN 108573263 A CN108573263 A CN 108573263A
Authority
CN
China
Prior art keywords
dictionary
matrix
low
sparse representation
coefficient
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201810444013.6A
Other languages
Chinese (zh)
Inventor
陈万军
张二虎
蔺广逢
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Xian University of Technology
Original Assignee
Xian University of Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Xian University of Technology filed Critical Xian University of Technology
Priority to CN201810444013.6A priority Critical patent/CN108573263A/en
Publication of CN108573263A publication Critical patent/CN108573263A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/21Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • G06V10/513Sparse representations

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • General Physics & Mathematics (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • General Engineering & Computer Science (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Multimedia (AREA)
  • Character Discrimination (AREA)
  • Image Analysis (AREA)
  • Machine Translation (AREA)

Abstract

The invention discloses a dictionary learning method combining structured sparse representation and low-dimensional embedding. The dictionary and the dimensionality-reduction projection matrix are constructed jointly and updated alternately. By enforcing a block-diagonal structure on the sparse-representation coefficient matrix, between-class incoherence of the dictionary is enhanced in the low-dimensional projection space; at the same time, a low-rank constraint on the representation coefficients over each class-specific sub-dictionary preserves within-class correlation. Dictionary construction and projection learning thus reinforce each other, fully preserving the sparse structure of the data and producing coding coefficients with stronger class discriminability. The invention solves the problem of prior-art dictionary learning methods in which, owing to the high dimensionality of the training sample data and the lack of a strict block-diagonal structural constraint on the dictionary, the coded representation coefficients have weak class discriminability and poor distinctiveness.

Description

A dictionary learning method combining structured sparse representation and low-dimensional embedding
Technical field
The invention belongs to the field of digital image processing technology, and in particular relates to a dictionary learning method combining structured sparse representation and low-dimensional embedding.
Background technology
The core idea of sparse representation rests on the following observation: many signals in nature can be linearly combined, or encoded, using only a few dictionary atoms from an over-complete dictionary. The key problem in sparse-representation research is the construction of a dictionary with strong representational power. Sparse-representation techniques are now widely used in many application fields, such as image classification, face recognition, and human-action recognition.
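As an illustrative sketch (not part of the patent), a signal that is a sparse linear combination of atoms from an over-complete dictionary can be constructed as follows; all sizes and coefficient values are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(0)
D = rng.standard_normal((20, 50))      # over-complete dictionary: 50 atoms in R^20
x = np.zeros(50)
x[[3, 17, 41]] = [1.5, -2.0, 0.7]      # only three non-zero coefficients
y = D @ x                              # the signal is a sparse combination of 3 atoms
```

The vector x is the sparse code of y: it selects and weights a handful of atoms out of the 50 available.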
Dictionary learning aims to learn an optimal dictionary from training samples so that a given signal or feature can be better represented or encoded. For classification based on sparse representation, the ideal sparse coefficient matrix of the training samples under the dictionary should be block-diagonal: the coefficients of a sample over the sub-dictionary of its own class are non-zero, while its coefficients over the sub-dictionaries of other classes are zero. Such a structured coefficient matrix has the best class-discrimination ability. In addition, the high dimensionality of the training data and the scarcity of training samples still pose a great challenge to dictionary learning methods. It is therefore natural to introduce dimensionality reduction into the dictionary learning process. Unfortunately, existing dictionary learning methods often treat dimensionality reduction and dictionary learning as two independent processing steps: the training data are first reduced in dimension, and dictionary learning is then carried out in the low-dimensional feature space. This serial, cascaded scheme is likely to leave the pre-learned low-dimensional projection unable to preserve and promote the latent sparse structure of the data, which is unfavorable to learning a strongly discriminative dictionary.
Summary of the invention
The object of the present invention is to provide a dictionary learning method combining structured sparse representation and low-dimensional embedding, solving the problem in prior-art dictionary learning methods that, owing to the high dimensionality of the training sample data and the lack of a strict block-diagonal structural constraint on the dictionary, the coded representation coefficients have weak class discriminability and poor distinctiveness.
The technical solution adopted by the present invention is a dictionary learning method combining structured sparse representation and low-dimensional embedding, implemented according to the following steps:
Step 1: read in the feature data set of the training samples, where C is the number of classes, n is the dimension of the features, and the subset for the i-th class is formed by its N_i samples, i = 1, 2, …, C;
Step 2: solve the optimization problem by the alternating-direction method of multipliers to obtain the coding dictionary D, the dimensionality-reduction projection matrix P, and the coding coefficient matrix X;
Step 3: read in the feature of a test sample; using the coding dictionary D and the dimensionality-reduction projection matrix P, obtain the sparse representation coefficient of the test sample by solving the following optimization problem:
Step 4: compute the reconstruction error e_i of the test sample's sparse representation coefficient over each class-specific sub-dictionary D_i, using the coefficients corresponding to the i-th sub-dictionary D_i, with D = [D_1, D_2, …, D_C], i = 1, 2, …, C;
Step 5: classify the test sample according to the minimal-reconstruction-error criterion; the class label is given by:
The present invention is further characterized in that:
In step 2
s.t. X = diag(X_11, X_22, …, X_CC), PPᵀ = I,
where the parameters λ1, λ2, λ3 > 0; P is the low-dimensional projection transformation matrix, m ≪ n; the representation coefficient matrix of the training samples Y under the dictionary is X:
X_ij is the representation coefficient of the j-th class of training samples Y_j over the i-th class sub-dictionary, i, j ∈ {1, 2, …, C};
X is required to satisfy the following block-diagonal structural constraint:
Step 2 is specifically implemented according to the following steps:
Step 2.1: introduce the auxiliary variable set and let Z_ii = X_ii; the optimization problem is converted into:
s.t. X = diag(X_11, X_22, …, X_CC), PPᵀ = I,
Z_ii = X_ii, i = 1, 2, …, C,
The augmented Lagrangian function is:
s.t. PPᵀ = I,
where F_ii are the Lagrange multipliers and γ > 0 is the penalty parameter;
Step 2.2: alternately update the matrices P, D, X, and Z_ii until P, D, X, and Z_ii converge.
Step 2.2 is specifically implemented according to the following steps:
Step 2.2.1: fixing the other variables, update the matrix X by the following formula:
where sgn(x) is defined as:
Step 2.2.2: fixing the other variables, update the matrix Z_ii by the following formula:
where UΛVᵀ is the singular value decomposition of the matrix, and the soft-threshold operator is defined as follows:
Step 2.2.3: fixing the other variables, update the matrix D by the following formula:
After the dictionary D has been updated column by column by the above formula, the value of the entire updated dictionary is obtained:
Step 2.2.4: fixing the other variables, update the matrix P by the following formula:
First, perform an eigenvalue decomposition of the matrix (φ(P^(t-1)) − λ1·S):
[U, Λ, V] = EVD(φ(P^(t-1)) − λ1·S),
where φ(P) = (Y − PᵀΔ)(Y − PᵀΔ)ᵀ, Δ = DX, S = YYᵀ, and Λ is the diagonal matrix formed by the eigenvalues of (φ(P^(t-1)) − λ1·S). The updated projection matrix P consists of the eigenvectors U(1:m, :) corresponding to the first m eigenvalues of (φ(P^(t-1)) − λ1·S), i.e.:
P^(t) = U(1:m, :);
Step 2.2.5: update the multipliers F_ii and the parameter γ by the following formulas:
γ^(t) = min{ρ·γ^(t-1), γ_max},
where ρ = 1.1 and γ_max = 10^6;
the coding dictionary D and the dimensionality-reduction projection matrix P are obtained after the above updates.
The beneficial effects of the invention are as follows: the dictionary learning method combining structured sparse representation and low-dimensional embedding eliminates the between-class correlation to obtain coding coefficients with better discriminative performance; a low-rank constraint enhances the coherence among within-class sparse representation coefficients, further promoting the clustering of the representation coefficients over each class-specific sub-dictionary; meanwhile, the learned projection matrix in turn enhances the representational power of the dictionary and improves the robustness of the sparse-representation model.
Detailed description of the embodiments
The present invention is described in detail below with reference to a specific embodiment.
In the dictionary learning method combining structured sparse representation and low-dimensional embedding of the present invention, the dictionary and the dimensionality-reduction projection matrix are constructed in parallel and updated alternately. By enforcing a block-diagonal structure on the sparse-representation coefficient matrix, between-class incoherence of the dictionary is enhanced in the low-dimensional projection space; at the same time, a low-rank constraint on the representation coefficients over each class-specific sub-dictionary preserves within-class correlation. Dictionary construction and projection learning reinforce each other, fully preserving the sparse structure of the data and coding out coefficients with stronger class discriminability. The method is specifically implemented according to the following steps:
Step 1: read in the feature data set of the training samples, where C is the number of classes, n is the dimension of the features, and the subset for the i-th class is formed by its N_i samples, i = 1, 2, …, C;
Step 2: solve the optimization problem by the alternating-direction method of multipliers to obtain the coding dictionary D, the dimensionality-reduction projection matrix P, and the coding coefficient matrix X, where
s.t. X = diag(X_11, X_22, …, X_CC), PPᵀ = I,
where the parameters λ1, λ2, λ3 > 0; P is the low-dimensional projection transformation matrix, m ≪ n; the representation coefficient matrix of the training samples Y under the dictionary is X:
X_ij is the representation coefficient of the j-th class of training samples Y_j over the i-th class sub-dictionary, i, j ∈ {1, 2, …, C};
X is required to satisfy the following block-diagonal structural constraint:
Step 2 is specifically implemented according to the following steps:
Step 2.1: introduce the auxiliary variable set and let Z_ii = X_ii; the optimization problem is converted into:
s.t. X = diag(X_11, X_22, …, X_CC), PPᵀ = I,
Z_ii = X_ii, i = 1, 2, …, C,
The augmented Lagrangian function is:
s.t. PPᵀ = I,
where F_ii are the Lagrange multipliers and γ > 0 is the penalty parameter;
Step 2.2: alternately update the matrices P, D, X, and Z_ii until P, D, X, and Z_ii converge, specifically according to the following steps:
Step 2.2.1: fixing the other variables, update the matrix X by the following formula:
where sgn(x) is defined as:
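The update formula itself is not reproduced in this text (the equation image is missing), but the two elementwise primitives it relies on, the sign function and soft-thresholding, can be sketched as follows (Python/NumPy assumed):

```python
import numpy as np

def sgn(x):
    # sign function used in the X-update of step 2.2.1
    return np.sign(x)

def soft_threshold(x, tau):
    # elementwise shrinkage: sgn(x) * max(|x| - tau, 0)
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)
```

Soft-thresholding shrinks every entry toward zero by tau and sets entries smaller than tau to exactly zero, which is what produces sparse coefficients.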
Step 2.2.2: fixing the other variables, update the matrix Z_ii by the following formula:
where UΛVᵀ is the singular value decomposition of the matrix, and the soft-threshold operator is defined as follows:
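Applying the soft-threshold operator to the singular values Λ of a matrix is singular value thresholding, the standard proximal step for a low-rank (nuclear-norm) term; a minimal sketch of the operator named in step 2.2.2:

```python
import numpy as np

def svt(M, tau):
    # singular value thresholding: U * max(Lambda - tau, 0) * V^T
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt
```

Shrinking the singular values reduces the rank of Z_ii, which is how the low-rank constraint on the within-class coefficients is enforced.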
Step 2.2.3: fixing the other variables, update the matrix D by the following formula:
After the dictionary D has been updated column by column by the above formula, the value of the entire updated dictionary is obtained:
Step 2.2.4: fixing the other variables, update the matrix P by the following formula:
First, perform an eigenvalue decomposition of the matrix (φ(P^(t-1)) − λ1·S):
[U, Λ, V] = EVD(φ(P^(t-1)) − λ1·S),
where φ(P) = (Y − PᵀΔ)(Y − PᵀΔ)ᵀ, Δ = DX, S = YYᵀ, and Λ is the diagonal matrix formed by the eigenvalues of (φ(P^(t-1)) − λ1·S). The updated projection matrix P consists of the eigenvectors U(1:m, :) corresponding to the first m eigenvalues of (φ(P^(t-1)) − λ1·S), i.e.:
P^(t) = U(1:m, :);
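A sketch of this eigendecomposition-based update (Python/NumPy assumed). Note that "the first m eigenvalues" is taken here to mean the m smallest, since `numpy.linalg.eigh` returns eigenvalues in ascending order; the ordering convention is an assumption, as it is not stated explicitly in this text:

```python
import numpy as np

def update_projection(Y, D, X, P_prev, lam1, m):
    # phi(P) = (Y - P^T * Delta)(Y - P^T * Delta)^T, Delta = D X, S = Y Y^T
    Delta = D @ X
    R = Y - P_prev.T @ Delta
    M = R @ R.T - lam1 * (Y @ Y.T)   # symmetric n x n matrix
    w, V = np.linalg.eigh(M)         # eigenvalues in ascending order
    return V[:, :m].T                # rows = first m eigenvectors, so P P^T = I
```

Because the rows of the returned P are orthonormal eigenvectors, the constraint PPᵀ = I from step 2 holds by construction.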
Step 2.2.5: update the multipliers F_ii and the parameter γ by the following formulas:
γ^(t) = min{ρ·γ^(t-1), γ_max},
where ρ = 1.1 and γ_max = 10^6;
the coding dictionary D and the dimensionality-reduction projection matrix P are obtained after the above updates;
Step 3: read in the feature of a test sample; using the coding dictionary D and the dimensionality-reduction projection matrix P, obtain the sparse representation coefficient of the test sample by solving the following optimization problem:
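The exact test-time objective is not reproduced here (the formula image is missing). Assuming a standard ℓ1-regularized model, min_x ‖y_low − Dx‖² + λ‖x‖₁ with y_low the projected test feature, the coefficients can be obtained with ISTA, for example:

```python
import numpy as np

def sparse_code_ista(y_low, D, lam, n_iter=500):
    # ISTA for min_x ||y_low - D x||_2^2 + lam * ||x||_1  (assumed objective)
    L = 2.0 * np.linalg.norm(D, 2) ** 2              # Lipschitz constant of the gradient
    x = np.zeros(D.shape[1])
    for _ in range(n_iter):
        z = x - 2.0 * D.T @ (D @ x - y_low) / L      # gradient step
        x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # shrinkage step
    return x
```

Any ℓ1 solver (e.g. FISTA or a homotopy method) could be substituted; ISTA is shown only because it is short and self-contained.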
Step 4: compute the reconstruction error e_i of the test sample's sparse representation coefficient over each class-specific sub-dictionary D_i, using the coefficients corresponding to the i-th sub-dictionary D_i, with D = [D_1, D_2, …, D_C], i = 1, 2, …, C;
Step 5: classify the test sample according to the minimal-reconstruction-error criterion; the class label is given by:
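Steps 4 and 5 can be sketched as follows (Python/NumPy assumed; `atoms_per_class` is an illustrative way of partitioning D column-wise into the sub-dictionaries D_i):

```python
import numpy as np

def classify_min_residual(y_low, D, x_hat, atoms_per_class):
    # e_i = || y_low - D_i x_i ||_2 over each class sub-dictionary D_i (step 4);
    # label = argmin_i e_i (step 5)
    errors, start = [], 0
    for k in atoms_per_class:
        Di, xi = D[:, start:start + k], x_hat[start:start + k]
        errors.append(np.linalg.norm(y_low - Di @ xi))
        start += k
    return int(np.argmin(errors)), errors
```

The test sample is assigned to the class whose sub-dictionary and corresponding coefficients reconstruct it with the smallest residual.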
In the dictionary learning method combining structured sparse representation and low-dimensional embedding of the present invention, class-label prior information is fully exploited in the low-dimensional space during dictionary construction to learn a block-structured sparse coding of the samples, so that the coded coefficients have stronger representational and class-discrimination ability, significantly improving the accuracy on classification problems.

Claims (4)

1. A dictionary learning method combining structured sparse representation and low-dimensional embedding, characterized in that it is specifically implemented according to the following steps:
Step 1: read in the feature data set of the training samples, where C is the number of classes, n is the dimension of the features, and the subset for the i-th class is formed by its N_i samples, i = 1, 2, …, C;
Step 2: solve the optimization problem by the alternating-direction method of multipliers to obtain the coding dictionary D, the dimensionality-reduction projection matrix P, and the coding coefficient matrix X;
Step 3: read in the feature of a test sample; using the coding dictionary D and the dimensionality-reduction projection matrix P, obtain the sparse representation coefficient of the test sample by solving the following optimization problem:
Step 4: compute the reconstruction error e_i of the test sample's sparse representation coefficient over each class-specific sub-dictionary D_i, using the coefficients corresponding to the i-th sub-dictionary D_i, with D = [D_1, D_2, …, D_C], i = 1, 2, …, C;
Step 5: classify the test sample according to the minimal-reconstruction-error criterion; the class label is given by:
2. The dictionary learning method combining structured sparse representation and low-dimensional embedding according to claim 1, characterized in that, in said step 2,
s.t. X = diag(X_11, X_22, …, X_CC), PPᵀ = I,
where the parameters λ1, λ2, λ3 > 0; P is the low-dimensional projection transformation matrix, m ≪ n; the representation coefficient matrix of the training samples Y under the dictionary is X:
X_ij is the representation coefficient of the j-th class of training samples Y_j over the i-th class sub-dictionary, i, j ∈ {1, 2, …, C};
X is required to satisfy the following block-diagonal structural constraint:
3. The dictionary learning method combining structured sparse representation and low-dimensional embedding according to claim 2, characterized in that said step 2 is specifically implemented according to the following steps:
Step 2.1: introduce the auxiliary variable set and let Z_ii = X_ii; the optimization problem is converted into:
s.t. X = diag(X_11, X_22, …, X_CC), PPᵀ = I,
Z_ii = X_ii, i = 1, 2, …, C; the augmented Lagrangian function is:
s.t. PPᵀ = I,
where F_ii are the Lagrange multipliers and γ > 0 is the penalty parameter;
Step 2.2: alternately update the matrices P, D, X, and Z_ii until P, D, X, and Z_ii converge.
4. The dictionary learning method combining structured sparse representation and low-dimensional embedding according to claim 3, characterized in that said step 2.2 is specifically implemented according to the following steps:
Step 2.2.1: fixing the other variables, update the matrix X by the following formula:
where sgn(x) is defined as:
Step 2.2.2: fixing the other variables, update the matrix Z_ii by the following formula:
where UΛVᵀ is the singular value decomposition of the matrix, and the soft-threshold operator is defined as follows:
Step 2.2.3: fixing the other variables, update the matrix D by the following formula:
After the dictionary D has been updated column by column by the above formula, the value of the entire updated dictionary is obtained:
Step 2.2.4: fixing the other variables, update the matrix P by the following formula:
First, perform an eigenvalue decomposition of the matrix (φ(P^(t-1)) − λ1·S):
[U, Λ, V] = EVD(φ(P^(t-1)) − λ1·S),
where φ(P) = (Y − PᵀΔ)(Y − PᵀΔ)ᵀ, Δ = DX, S = YYᵀ, and Λ is the diagonal matrix formed by the eigenvalues of (φ(P^(t-1)) − λ1·S). The updated projection matrix P consists of the eigenvectors U(1:m, :) corresponding to the first m eigenvalues of (φ(P^(t-1)) − λ1·S), i.e.:
P^(t) = U(1:m, :);
Step 2.2.5: update the multipliers F_ii and the parameter γ by the following formulas:
γ^(t) = min{ρ·γ^(t-1), γ_max},
where ρ = 1.1 and γ_max = 10^6;
the coding dictionary D and the dimensionality-reduction projection matrix P are obtained after the above updates.
CN201810444013.6A 2018-05-10 2018-05-10 A dictionary learning method combining structured sparse representation and low-dimensional embedding Pending CN108573263A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810444013.6A CN108573263A (en) 2018-05-10 2018-05-10 A dictionary learning method combining structured sparse representation and low-dimensional embedding

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810444013.6A CN108573263A (en) 2018-05-10 2018-05-10 A dictionary learning method combining structured sparse representation and low-dimensional embedding

Publications (1)

Publication Number Publication Date
CN108573263A true CN108573263A (en) 2018-09-25

Family

ID=63572539

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810444013.6A Pending CN108573263A (en) A dictionary learning method combining structured sparse representation and low-dimensional embedding

Country Status (1)

Country Link
CN (1) CN108573263A (en)


Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109829352A (en) * 2018-11-20 2019-05-31 中国人民解放军陆军工程大学 Communication fingerprint identification method integrating multilayer sparse learning and multi-view learning
CN109829352B (en) * 2018-11-20 2024-06-11 中国人民解放军陆军工程大学 Communication fingerprint identification method integrating multilayer sparse learning and multi-view learning
CN110033824A (en) * 2019-04-13 2019-07-19 湖南大学 A kind of gene expression profile classification method based on shared dictionary learning
CN111666967A (en) * 2020-04-21 2020-09-15 浙江工业大学 Image classification method based on incoherent joint dictionary learning
CN111666967B (en) * 2020-04-21 2023-06-13 浙江工业大学 Image classification method based on incoherence combined dictionary learning
CN112183300A (en) * 2020-09-23 2021-01-05 厦门大学 AIS radiation source identification method and system based on multi-level sparse representation
CN112183300B (en) * 2020-09-23 2024-03-22 厦门大学 AIS radiation source identification method and system based on multi-level sparse representation
CN112734763A (en) * 2021-01-29 2021-04-30 西安理工大学 Image decomposition method based on convolution and K-SVD dictionary joint sparse coding
CN112734763B (en) * 2021-01-29 2022-09-16 西安理工大学 Image decomposition method based on convolution and K-SVD dictionary joint sparse coding

Similar Documents

Publication Publication Date Title
CN108573263A (en) A dictionary learning method combining structured sparse representation and low-dimensional embedding
CN108509854B (en) Pedestrian re-identification method based on projection matrix constraint and discriminative dictionary learning
Ding et al. Low-rank embedded ensemble semantic dictionary for zero-shot learning
CN111460077B (en) Cross-modal Hash retrieval method based on class semantic guidance
CN114564991B (en) Electroencephalogram signal classification method based on transducer guided convolutional neural network
CN103902964B (en) A kind of face identification method
CN107402993A (en) The cross-module state search method for maximizing Hash is associated based on identification
CN105095863B (en) The Human bodys' response method of semi-supervised dictionary learning based on similitude weights
CN108875459B (en) Weighting sparse representation face recognition method and system based on sparse coefficient similarity
Duong et al. Shrinkteanet: Million-scale lightweight face recognition via shrinking teacher-student networks
CN109190472B (en) Pedestrian attribute identification method based on image and attribute combined guidance
Ma et al. Linearization to nonlinear learning for visual tracking
CN113723312B (en) Rice disease identification method based on visual transducer
CN107832747B (en) Face recognition method based on low-rank dictionary learning algorithm
CN107818345A (en) It is a kind of based on the domain self-adaptive reduced-dimensions method that maximum dependence is kept between data conversion
Li et al. Dating ancient paintings of Mogao Grottoes using deeply learnt visual codes
CN104298977A (en) Human-behavior recognition method based on low-rank representation with an incoherence constraint
CN107066964A (en) Rapid collaborative representation face classification method
CN115054270A (en) Sleep staging method and system for extracting sleep spectrogram features based on GCN
CN1858773A (en) Image identifying method based on Gabor phase mode
CN110443169A (en) A kind of face identification method of edge reserve judgement analysis
CN117523587A (en) Zero-sample Chinese character recognition method based on character sensitive editing distance
CN107291813A (en) Exemplary search method based on semantic segmentation scene
CN109063766B (en) Image classification method based on discriminant prediction sparse decomposition model
CN107085700A (en) A kind of face identification method being combined based on rarefaction representation with neural networks with single hidden layer technology

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20180925

RJ01 Rejection of invention patent application after publication