CN108921088A - Face recognition method based on a discriminative objective function - Google Patents

Face recognition method based on a discriminative objective function

Info

Publication number
CN108921088A
Authority
CN
China
Prior art keywords
test sample
residual
face
sample
coefficient
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201810699862.6A
Other languages
Chinese (zh)
Other versions
CN108921088B (en)
Inventor
胡建国
马媛
李元新
杨焕
吴明华
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sun Yat Sen University
SYSU CMU Shunde International Joint Research Institute
Research Institute of Zhongshan University Shunde District Foshan
National Sun Yat Sen University
Original Assignee
SYSU CMU Shunde International Joint Research Institute
Research Institute of Zhongshan University Shunde District Foshan
National Sun Yat Sen University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by SYSU CMU Shunde International Joint Research Institute, Research Institute of Zhongshan University Shunde District Foshan, National Sun Yat Sen University
Priority to CN201810699862.6A
Publication of CN108921088A publication Critical patent/CN108921088A/en
Application granted granted Critical
Publication of CN108921088B publication Critical patent/CN108921088B/en
Legal status: Active (granted)


Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16 - Human faces, e.g. facial parts, sketches or expressions
    • G06V40/172 - Classification, e.g. identification
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 - Pattern recognition
    • G06F18/20 - Analysing
    • G06F18/24 - Classification techniques

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • General Physics & Mathematics (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • General Engineering & Computer Science (AREA)
  • Artificial Intelligence (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a face recognition method based on a discriminative objective function. First, a test sample is linearly represented with the first training samples and a first objective function is constructed; the first classes corresponding to the smallest residuals of the test sample are then computed. These first classes serve as the second training samples and are used to linearly represent the test sample again; a second objective function is then constructed, and solving it yields the second class corresponding to the smallest residual of the test sample, which is the class to which the test sample belongs. By representing the test sample twice and computing residuals twice, the method reduces the design complexity of the image classification process, classifies faces more accurately, and is highly robust under different illumination and poses.

Description

Face recognition method based on a discriminative objective function
Technical field
The present invention relates to the technical field of image recognition, and in particular to a face recognition method based on a discriminative objective function.
Background technique
Face recognition has long been a hot topic in fields such as pattern recognition, image processing, and computer vision. Sparse representation has attracted great attention for image classification since it was proposed, and it performs strongly in many applications. Sparse representation methods for image classification fall into two categories: training-sample-based methods and dictionary-based methods. Training-sample-based sparse representation uses the original training samples directly as atoms to represent the test sample, whereas dictionary-based sparse representation first learns a dictionary from the original training samples with an algorithm and then represents the test sample with that dictionary. Although sparse representation achieves high performance in many applications, existing sparse representation methods still suffer from drawbacks such as high computational complexity, low efficiency, and poor robustness to different illumination, facial expressions, and poses.
Summary of the invention
To solve the above problems, the purpose of the present invention is to provide a face recognition method based on a discriminative objective function. Through a two-step residual computation, its design complexity is lower than that of traditional sparse representation, it classifies faces more accurately, and it is highly robust under different illumination and poses.
The technical solution adopted by the present invention to solve the above problems is as follows:
A face recognition method based on a discriminative objective function includes the following steps:
A. dividing face samples into first training samples and a test sample, and linearly representing the test sample with the first training samples;
B. constructing a first objective function and solving for the first representation coefficients;
C. computing the residuals of the test sample from the first representation coefficients to obtain the first classes corresponding to the smallest residuals; these first classes serve as the second training samples;
D. linearly representing the test sample with the second training samples;
E. constructing a second objective function and solving for the second representation coefficients;
F. computing the residuals of the test sample from the second representation coefficients to obtain the second class corresponding to the smallest residual; this class is the class to which the test sample belongs.
Further, in step A, when the test sample is linearly represented with the first training samples, the expression of the test sample is:

$y = b_1x_1 + b_2x_2 + \cdots + b_Nx_N = XB$;

where $y$ is the test sample, $x_1, x_2, \ldots, x_N$ are the first training samples, $X = [x_1, x_2, \ldots, x_N]$, $b_1, b_2, \ldots, b_N$ are the corresponding first representation coefficients, $B = [b_1, b_2, \ldots, b_N]^T$, $N$ is the number of first face samples with $N = n \cdot L$, $L$ is the number of subjects in the face database, and $n$ is the number of different photos of each subject. A facial image containing $x$ pixels is represented as a column vector $x_i \in \mathbb{R}^{x \times 1}$, and the test sample $y$ is represented as a column vector $y \in \mathbb{R}^{x \times 1}$.
Further, in step B, a first objective function is constructed and the first representation coefficients are solved for. In the first objective function, $\delta$ is a positive constant, $y$ is the test sample, and $X$ is the matrix of first training samples, with

$X = [x_1, x_2, \ldots, x_N]$, $B = [b_1, b_2, \ldots, b_N]^T$,

$B_i = [b_{n(i-1)+1}, b_{n(i-1)+2}, \ldots, b_{ni}]^T$,

$B_j = [b_{n(j-1)+1}, b_{n(j-1)+2}, \ldots, b_{nj}]^T$ ($i = 1, \ldots, L$; $j = 1, \ldots, n$), where $L$ is the number of subjects in the face database and $n$ is the number of different photos of each subject. The first objective function is solved and the result is denoted $\hat{B}$, where

$X_C = [x_1, x_2, \ldots, x_n]$.
Further, in step C, the specific procedure for obtaining the first classes corresponding to the smallest residuals from the residuals of the test sample computed with the first representation coefficients is: the first classes corresponding to the $s$ smallest residuals are obtained from the residual $f_i$ between the test sample and the first training samples of the $i$-th class, where the expression of the residual $f_i$ is:

$f_i = \|y - b_{ij}x_{ij}\|_2$

where $i = 1, \ldots, L$ and $j = 1, \ldots, n$, $L$ is the number of subjects in the face database, $n$ is the number of different photos of each subject, $b_{ij}$ is the first representation coefficient corresponding to the $j$-th photo of the $i$-th subject, and $x_{ij}$ is the $j$-th photo of the $i$-th subject.
Further, in step D, when the test sample is linearly represented with the second training samples, the test sample is:

$y = d_1z_1 + d_2z_2 + \cdots + d_rz_r = ZD$;

where $y$ is the test sample, $r$ is the number of second face samples with $r = s \cdot n$, $s$ is the number of first classes corresponding to the smallest residuals computed in step C, $Z$ is the matrix of second training samples, $Z = [z_1, z_2, \ldots, z_r]$, $n$ is the number of different photos in each first class, and $D$ is the vector of second representation coefficients, $D = [d_1, d_2, \ldots, d_r]^T$.
Further, in step E, a second objective function is constructed and the second representation coefficients are solved for. In the second objective function, $\delta$ is a positive constant, $y$ is the test sample, $Z$ is the matrix of second training samples, $Z = [z_1, z_2, \ldots, z_r]$, $D$ is the vector of second representation coefficients, $D = [d_1, d_2, \ldots, d_r]^T$, and $s$ is the number of first classes corresponding to the smallest residuals;

$D_i = [d_{n(i-1)+1}, d_{n(i-1)+2}, \ldots, d_{ni}]^T$,

$D_j = [d_{n(j-1)+1}, d_{n(j-1)+2}, \ldots, d_{nj}]^T$ ($i = 1, \ldots, s$; $j = 1, \ldots, n$), where $n$ is the number of different photos of each subject in the first classes. The second objective function is solved and the result is denoted $\hat{D}$, where $Z_E = [z_1, z_2, \ldots, z_n]$.
Further, in step F, the specific procedure for obtaining the second class corresponding to the smallest residual from the residuals of the test sample computed with the second representation coefficients is: the second class corresponding to the smallest residual is obtained from the residual $g_i$ between the test sample and the second training samples of the $i$-th class, where the expression of the residual $g_i$ is:

$g_i = \|y - d_{ij}z_{ij}\|_2$

where $i = 1, \ldots, s$ and $j = 1, \ldots, n$, $d_{ij}$ is the second representation coefficient corresponding to the $j$-th photo of the $i$-th subject, and $z_{ij}$ is the $j$-th photo of the $i$-th subject.
The beneficial effects of the invention are as follows: in the face recognition method based on a discriminative objective function adopted by the present invention, two objective functions are constructed, the test sample is represented twice, and residuals are computed twice. This reduces the design complexity of the image classification process, classifies faces more accurately, and provides strong robustness under different illumination and poses.
Detailed description of the invention
The invention is further described below with reference to the accompanying drawing and examples.
Fig. 1 is a flow chart of the face recognition method based on a discriminative objective function of the present invention.
Specific embodiment
Referring to Fig. 1, the face recognition method based on a discriminative objective function of the invention includes the following steps:
A. Divide face samples into first training samples and a test sample, and linearly represent the test sample with the first training samples.
Assume the face database contains the faces of $L$ people and each person has $n$ different photos, so there are $N$ first face samples in total, where $N = n \cdot L$. A facial image containing $x$ pixels can be expressed as a column vector $x_i \in \mathbb{R}^{x \times 1}$. Let $P = [X_1, X_2, \ldots, X_C, \ldots, X_L]$, where $X_C = [x_1, x_2, \ldots, x_n]$ ($C = 1, 2, \ldots, L$). The test sample is represented as a column vector $y \in \mathbb{R}^{x \times 1}$, so the test sample can be linearly represented with the first training samples, and its expression is:

$y = b_1x_1 + b_2x_2 + \cdots + b_Nx_N = XB$;

where $B$ is the vector of first representation coefficients, $B = [b_1, b_2, \ldots, b_N]^T$, and $X$ is the matrix of first training samples, $X = [x_1, x_2, \ldots, x_N]$.
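As an editorial illustration (not part of the original patent text), the following Python/NumPy sketch shows the data layout assumed above: every photo is flattened into a column of the first training matrix $X$, and a test image is flattened the same way. The `faces` structure and the optional column normalisation are assumptions made for this example only.

```python
import numpy as np

def build_training_matrix(faces):
    """Stack the n photos of each of the L subjects as columns of X (x-by-N, N = n*L).

    `faces` is assumed to be a list of L subjects, each a list of n 2-D grayscale
    images with x pixels in total; the patent does not prescribe this structure.
    """
    columns = [img.reshape(-1).astype(float) for subject in faces for img in subject]
    X = np.stack(columns, axis=1)
    # Unit-norm columns are a common convention, not something stated in the patent.
    return X / (np.linalg.norm(X, axis=0, keepdims=True) + 1e-12)

def vectorize_test_image(test_image):
    """Represent the test sample y as a column vector with x entries."""
    return test_image.reshape(-1).astype(float)
```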
B. Construct the first objective function and solve for the first representation coefficients.
In the constructed first objective function, $\delta$ is a positive constant; when it takes a suitable value, each term in the representation of the test sample $y$ works well, with $B = [b_1, b_2, \ldots, b_N]^T$, $B_i = [b_{n(i-1)+1}, b_{n(i-1)+2}, \ldots, b_{ni}]^T$ and $B_j = [b_{n(j-1)+1}, b_{n(j-1)+2}, \ldots, b_{nj}]^T$, where $i = 1, \ldots, L$ and $j = 1, \ldots, n$. The first objective function is solved and the result is denoted $\hat{B}$.
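The closed form of the first objective function appears only as a formula image in the original filing and is not reproduced in this text, so the sketch below substitutes an l2-regularised least-squares (collaborative-representation-style) solution in which $\delta$ plays the role of the positive constant. This stand-in is an assumption for illustration, not the patent's exact discriminative objective.

```python
import numpy as np

def solve_first_coefficients(X, y, delta):
    """Stand-in for step B: B_hat = argmin_B ||y - X B||_2^2 + delta * ||B||_2^2.

    This assumed regularised form is used only for illustration; the patent's
    discriminative objective additionally couples the class-wise blocks B_i, B_j.
    """
    N = X.shape[1]
    return np.linalg.solve(X.T @ X + delta * np.eye(N), X.T @ y)
```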
C. Compute the residuals of the test sample from the first representation coefficients to obtain the first classes corresponding to the smallest residuals; these classes serve as the second training samples.
Let $f_i$ denote the residual between the test sample $y$ and the first training samples of the $i$-th class. The smaller the residual, the closer the test sample $y$ is to the first training samples of the $i$-th class. By computing the residuals $f_i$, the first classes corresponding to the $s$ smallest residuals are obtained, where the expression of the residual $f_i$ is:

$f_i = \|y - b_{ij}x_{ij}\|_2$

where $i = 1, \ldots, L$ and $j = 1, \ldots, n$, $b_{ij}$ is the first representation coefficient corresponding to the $j$-th photo of the $i$-th subject, and $x_{ij}$ is the $j$-th photo of the $i$-th subject.
The first classes corresponding to the $s$ smallest residuals obtained in step C serve as the second training samples.
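A sketch of step C follows, reading the per-class residual $f_i$ as the error of reconstructing the test sample from the photos of class $i$ alone, i.e. $\|y - X_iB_i\|_2$ with $X_i$ the photos of class $i$ and $B_i$ the matching block of coefficients; that reading of the notation, and the function names, are assumptions for the example.

```python
import numpy as np

def select_candidate_classes(X, y, B_hat, n, L, s):
    """Step C: per-class residuals f_i, then keep the s classes with the
    smallest residuals and gather their photos into the second training matrix Z."""
    residuals = np.empty(L)
    for i in range(L):
        cols = slice(i * n, (i + 1) * n)                 # photos of subject i
        residuals[i] = np.linalg.norm(y - X[:, cols] @ B_hat[cols])
    candidates = np.argsort(residuals)[:s]               # s smallest residuals
    Z = np.hstack([X[:, c * n:(c + 1) * n] for c in candidates])
    return candidates, Z
```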
D. Linearly represent the test sample with the second training samples.
The first classes corresponding to the $s$ smallest residuals contain $r$ second face samples in total, where $r = s \cdot n$ and each first class contains $n$ different photos. The test sample can then be linearly represented with these $r$ face samples, and its expression is:

$y = d_1z_1 + d_2z_2 + \cdots + d_rz_r = ZD$;

where $Z$ is the matrix of second training samples, $Z = [z_1, z_2, \ldots, z_r]$, and $D$ is the vector of second representation coefficients, $D = [d_1, d_2, \ldots, d_r]^T$. A sample of the classes corresponding to the smallest residuals, containing $z$ pixels, is represented as a column vector $z_i \in \mathbb{R}^{z \times 1}$. Let $P_1 = [Z_1, Z_2, \ldots, Z_E, \ldots, Z_s]$, where $Z_E = [z_1, z_2, \ldots, z_n]$ ($E = 1, 2, \ldots, s$).
E. Construct the second objective function and solve for the second representation coefficients.
In the second objective function, $\delta$ is a positive constant, $y$ is the test sample, $D = [d_1, d_2, \ldots, d_r]^T$, and $s$ is the number of first classes corresponding to the smallest residuals;

$D_i = [d_{n(i-1)+1}, d_{n(i-1)+2}, \ldots, d_{ni}]^T$,

$D_j = [d_{n(j-1)+1}, d_{n(j-1)+2}, \ldots, d_{nj}]^T$ ($i = 1, \ldots, s$; $j = 1, \ldots, n$).

The second objective function is solved and the result is denoted $\hat{D}$.
F. Compute the residuals of the test sample from the second representation coefficients to obtain the second class corresponding to the smallest residual.
Let $g_i$ denote the residual between the test sample $y$ and the second training samples of the $i$-th class. The smaller the residual, the closer the test sample $y$ is to the second training samples of the $i$-th class. By computing the residuals $g_i$, the second class corresponding to the smallest residual is obtained, where the expression of the residual $g_i$ is:

$g_i = \|y - d_{ij}z_{ij}\|_2$

where $i = 1, \ldots, s$ and $j = 1, \ldots, n$, $d_{ij}$ is the second representation coefficient corresponding to the $j$-th photo of the $i$-th subject, and $z_{ij}$ is the $j$-th photo of the $i$-th subject. The second class corresponding to the smallest residual is the class to which the test sample belongs; at this point the face in the test sample is considered to belong to the class of facial images with the smallest residual.
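The second stage can be sketched in the same way: the second representation coefficients are solved on the reduced set $Z$ with the same assumed regularised form as in the first-stage sketch, the per-candidate residuals $g_i$ are computed, and the candidate with the smallest residual is returned as the predicted identity. As before, the regularised solve is a stand-in for the patent's second objective function, not its exact form.

```python
import numpy as np

def second_stage_label(Z, y, candidates, n, delta):
    """Steps D-F: represent y with the r = s*n columns of Z, compute g_i for each
    candidate class, and return the class with the smallest residual."""
    r = Z.shape[1]
    D_hat = np.linalg.solve(Z.T @ Z + delta * np.eye(r), Z.T @ y)
    g = np.empty(len(candidates))
    for k in range(len(candidates)):
        cols = slice(k * n, (k + 1) * n)                 # photos of the k-th candidate
        g[k] = np.linalg.norm(y - Z[:, cols] @ D_hat[cols])
    return candidates[int(np.argmin(g))]
```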
The method of the invention first linearly represents the test sample with all the first training samples and constructs an objective function (the first objective function); by computing the residuals it finds the first classes corresponding to the $s$ smallest residuals. The test sample is then linearly represented again with these $s$ classes, and the second class corresponding to the smallest residual is obtained by solving a new objective function (the second objective function), thereby classifying the face. The biggest advantage of the invention is the use of a new discriminative objective function, which directly strengthens the between-class differences of the samples; through the two-step residual computation, faces can be classified more accurately, the design complexity of image classification is reduced, and the method is highly robust under different illumination and poses.
The method of the invention generally comprises two stages: the first stage consists of steps A, B and C, and the second stage consists of steps D, E and F. The first stage computes the residuals of the test sample represented with the first training samples and uses them to obtain the second training samples, thereby narrowing the search. The test sample is then represented again with the second training samples and its smallest residual is computed; the second class corresponding to this smallest residual is the class to which the test sample belongs, i.e. the test face is considered to belong to the class of facial images with the smallest residual.
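Chaining the sketches above, a hypothetical end-to-end call could look as follows; the value of `delta` and the choice `s=5` (the setting reported below as giving the highest accuracy) are assumptions carried over for the example.

```python
# X: x-by-(n*L) first training matrix, y: vectorised test image (see earlier sketches)
B_hat = solve_first_coefficients(X, y, delta=0.01)
candidates, Z = select_candidate_classes(X, y, B_hat, n=10, L=40, s=5)
predicted_subject = second_stage_label(Z, y, candidates, n=10, delta=0.01)
```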
The invention has been verified by repeated tests: when the number of first classes corresponding to the smallest residuals is $s = 5$, the recognition accuracy of the invention is highest.
To verify the effectiveness of the invention, the method of the invention was compared with methods such as CRC, L1LS, FISTA, INNC, FCM, LDA, Homotopy and DALM. The initial data were set to $L = 40$ faces in the face database and $n = 10$ photos of different poses per person, so the whole face database contains 400 (40*10) face pictures. For each face, 2 to 6 different photos (80 to 240 photos in total) were taken as training samples, and the remaining facial images were used as test samples. The specific test results are shown in Table 1:
Table 1. Recognition accuracy of the method of the invention compared with other image recognition methods
The data in Table 1 show that, regardless of how many photos of each face are chosen, the accuracy of the method of the invention is better than that of the other methods. The statistics in Table 1 indicate that, compared with existing sparse representation methods for image classification, the method of the invention is more accurate, simpler in design, and offers higher robustness and efficiency.
The above are only preferred embodiments of the present invention; the invention is not limited to the above embodiments, and any implementation that achieves the technical effects of the invention by the same means shall fall within the protection scope of the invention.

Claims (7)

1. A face recognition method based on a discriminative objective function, characterised by comprising the following steps:
A. dividing face samples into first training samples and a test sample, and linearly representing the test sample with the first training samples;
B. constructing a first objective function and solving for the first representation coefficients;
C. computing the residuals of the test sample from the first representation coefficients to obtain the first classes corresponding to the smallest residuals, the first classes corresponding to the smallest residuals serving as the second training samples;
D. linearly representing the test sample with the second training samples;
E. constructing a second objective function and solving for the second representation coefficients;
F. computing the residuals of the test sample from the second representation coefficients to obtain the second class corresponding to the smallest residual, the second class corresponding to the smallest residual being the class to which the test sample belongs.
2. The face recognition method based on a discriminative objective function according to claim 1, characterised in that: in step A, when the test sample is linearly represented with the first training samples, the expression of the test sample is:

$y = b_1x_1 + b_2x_2 + \cdots + b_Nx_N = XB$;

where $y$ is the test sample, $x_1, x_2, \ldots, x_N$ are the first training samples, $X = [x_1, x_2, \ldots, x_N]$, $b_1, b_2, \ldots, b_N$ are the corresponding first representation coefficients, $B = [b_1, b_2, \ldots, b_N]^T$, $N$ is the number of first face samples with $N = n \cdot L$, $L$ is the number of subjects in the face database, $n$ is the number of different photos of each subject, a facial image containing $x$ pixels is represented as a column vector $x_i \in \mathbb{R}^{x \times 1}$, and the test sample $y$ is represented as a column vector $y \in \mathbb{R}^{x \times 1}$.
3. The face recognition method based on a discriminative objective function according to claim 1, characterised in that: in step B, a first objective function is constructed and the first representation coefficients are solved for; in the first objective function, $\delta$ is a positive constant, $y$ is the test sample, $X$ is the matrix of first training samples, $X = [x_1, x_2, \ldots, x_N]$, $B$ is the vector of first representation coefficients, $B = [b_1, b_2, \ldots, b_N]^T$,

$B_i = [b_{n(i-1)+1}, b_{n(i-1)+2}, \ldots, b_{ni}]^T$,

$B_j = [b_{n(j-1)+1}, b_{n(j-1)+2}, \ldots, b_{nj}]^T$ ($i = 1, \ldots, L$; $j = 1, \ldots, n$), $L$ is the number of subjects in the face database, and $n$ is the number of different photos of each subject; the first objective function is solved and the result is denoted $\hat{B}$, where

$X_C = [x_1, x_2, \ldots, x_n]$.
4. The face recognition method based on a discriminative objective function according to claim 3, characterised in that: in step C, the specific procedure for obtaining the first classes corresponding to the smallest residuals from the residuals of the test sample computed with the first representation coefficients is: the first classes corresponding to the $s$ smallest residuals are obtained from the residual $f_i$ between the test sample and the first training samples of the $i$-th class, where the expression of the residual $f_i$ is:

$f_i = \|y - b_{ij}x_{ij}\|_2$

where $i = 1, \ldots, L$ and $j = 1, \ldots, n$, $L$ is the number of subjects in the face database, $n$ is the number of different photos of each subject, $b_{ij}$ is the first representation coefficient corresponding to the $j$-th photo of the $i$-th subject, and $x_{ij}$ is the $j$-th photo of the $i$-th subject.
5. The face recognition method based on a discriminative objective function according to claim 1, characterised in that: in step D, when the test sample is linearly represented with the second training samples, the test sample is:

$y = d_1z_1 + d_2z_2 + \cdots + d_rz_r = ZD$;

where $y$ is the test sample, $r$ is the number of second face samples with $r = s \cdot n$, $s$ is the number of first classes corresponding to the smallest residuals computed in step C, $Z$ is the matrix of second training samples, $Z = [z_1, z_2, \ldots, z_r]$, $n$ is the number of different photos in each first class, and $D$ is the vector of second representation coefficients, $D = [d_1, d_2, \ldots, d_r]^T$.
6. The face recognition method based on a discriminative objective function according to claim 1, characterised in that: in step E, a second objective function is constructed and the second representation coefficients are solved for; in the second objective function, $\delta$ is a positive constant, $y$ is the test sample, $Z$ is the matrix of second training samples, $Z = [z_1, z_2, \ldots, z_r]$, $D$ is the vector of second representation coefficients, $D = [d_1, d_2, \ldots, d_r]^T$, and $s$ is the number of first classes corresponding to the smallest residuals;

$D_i = [d_{n(i-1)+1}, d_{n(i-1)+2}, \ldots, d_{ni}]^T$,

$D_j = [d_{n(j-1)+1}, d_{n(j-1)+2}, \ldots, d_{nj}]^T$ ($i = 1, \ldots, s$; $j = 1, \ldots, n$), where $n$ is the number of different photos of each subject in the first classes; the second objective function is solved and the result is denoted $\hat{D}$, where

$Z_E = [z_1, z_2, \ldots, z_n]$.
7. The face recognition method based on a discriminative objective function according to claim 6, characterised in that: in step F, the specific procedure for obtaining the second class corresponding to the smallest residual from the residuals of the test sample computed with the second representation coefficients is: the second class corresponding to the smallest residual is obtained from the residual $g_i$ between the test sample and the second training samples of the $i$-th class, where the expression of the residual $g_i$ is:

$g_i = \|y - d_{ij}z_{ij}\|_2$

where $i = 1, \ldots, s$ and $j = 1, \ldots, n$, $d_{ij}$ is the second representation coefficient corresponding to the $j$-th photo of the $i$-th subject, and $z_{ij}$ is the $j$-th photo of the $i$-th subject.
CN201810699862.6A 2018-06-29 2018-06-29 Face recognition method based on discriminant target equation Active CN108921088B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810699862.6A CN108921088B (en) 2018-06-29 2018-06-29 Face recognition method based on discriminant target equation

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810699862.6A CN108921088B (en) 2018-06-29 2018-06-29 Face recognition method based on discriminant target equation

Publications (2)

Publication Number Publication Date
CN108921088A true CN108921088A (en) 2018-11-30
CN108921088B CN108921088B (en) 2022-03-04

Family

ID=64423181

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810699862.6A Active CN108921088B (en) 2018-06-29 2018-06-29 Face recognition method based on discriminant target equation

Country Status (1)

Country Link
CN (1) CN108921088B (en)


Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101976352A (en) * 2010-10-29 2011-02-16 上海交通大学 Various illumination face identification method based on small sample emulating and sparse expression
CN104063714A (en) * 2014-07-20 2014-09-24 詹曙 Fast human face recognition algorithm used for video monitoring and based on CUDA parallel computing and sparse representing
CN104182734A (en) * 2014-08-18 2014-12-03 桂林电子科技大学 Linear-regression based classification (LRC) and collaborative representation based two-stage face identification method
CN104599298A (en) * 2014-12-06 2015-05-06 西北农林科技大学 Two-dimensional subspace tracking based image reconstruction method
CN105740884A (en) * 2016-01-22 2016-07-06 厦门理工学院 Hyper-spectral image classification method based on singular value decomposition and neighborhood space information
CN106446774A (en) * 2016-08-24 2017-02-22 施志刚 Face recognition method based on secondary nearest neighbor sparse reconstruction
CN106960420A (en) * 2017-02-20 2017-07-18 南京邮电大学 A kind of image reconstructing method of segment iteration matching pursuit algorithm
CN107273793A (en) * 2017-04-28 2017-10-20 广东顺德中山大学卡内基梅隆大学国际联合研究院 A kind of feature extracting method for recognition of face

Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
BAI S et al.: "Multiple stage residual model for image classification and vector compression", IEEE Transactions on Multimedia *
DING L et al.: "Discrimination and identification between mainlobe repeater jamming and target echo by basis pursuit", IET Radar, Sonar & Navigation *
YONG XU et al.: "A New Discriminative Sparse Representation Method for Robust Face Recognition via l2 Regularization", IEEE Transactions on Neural Networks and Learning Systems *
侯姗姗 (HOU Shanshan): "Research on the application of sparse representation in image classification", China Master's Theses Full-text Database (Information Science and Technology) *
曹冬寅 (CAO Dongyin) et al.: "Ensemble classification algorithm based on sparse reconstruction residual and random forest", Journal of Nanjing University (Natural Science) *

Also Published As

Publication number Publication date
CN108921088B (en) 2022-03-04

Similar Documents

Publication Publication Date Title
CN109961089A (en) Small sample and zero sample image classification method based on metric learning and meta learning
CN110363122B (en) Cross-domain target detection method based on multi-layer feature alignment
CN105354595B (en) A kind of robust visual pattern classification method and system
CN108875794B (en) Image visibility detection method based on transfer learning
CN108875818A (en) Based on variation from code machine and confrontation network integration zero sample image classification method
CN106203483B (en) A kind of zero sample image classification method based on semantic related multi-modal mapping method
CN109214470B (en) Image visibility detection method based on coding network fine adjustment
CN104866871B (en) Hyperspectral image classification method based on projection structure sparse coding
CN113435282B (en) Unmanned aerial vehicle image ear recognition method based on deep learning
CN108492298A (en) Based on the multispectral image change detecting method for generating confrontation network
CN108596274A (en) Image classification method based on convolutional neural networks
CN105608478A (en) Combined method and system for extracting and classifying features of images
CN106355195A (en) The system and method used to measure image resolution value
CN109344871A (en) A kind of target classification identification method based on multi-source field fusion transfer learning
CN109117860A (en) A kind of image classification method based on subspace projection and dictionary learning
CN106250918B (en) A kind of mixed Gauss model matching process based on improved soil-shifting distance
CN115330876B (en) Target template graph matching and positioning method based on twin network and central position estimation
CN109583498A (en) A kind of fashion compatibility prediction technique based on low-rank regularization feature enhancing characterization
CN110533063A (en) A kind of cloud amount calculation method and device based on satellite image and GMDH neural network
CN108664941A (en) The sparse description face identification method of core based on Geodesic Mapping analysis
CN110796182A (en) Bill classification method and system for small amount of samples
CN110046669A (en) Half Coupling Metric based on sketch image identifies the pedestrian retrieval method of dictionary learning
CN106485739B (en) A kind of point set method for registering based on L2 distance
CN106095811A (en) A kind of image search method of the discrete Hash of supervision based on optimum code
CN108921088A (en) A kind of face identification method based on discriminate target equation

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant