CN111464459A - Network flow characteristic extraction method based on principal component analysis and linear discriminant analysis - Google Patents

Network flow characteristic extraction method based on principal component analysis and linear discriminant analysis

Info

Publication number
CN111464459A
CN111464459A
Authority
CN
China
Prior art keywords
matrix
feature
principal component
linear discriminant
data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010203309.6A
Other languages
Chinese (zh)
Inventor
曲桦
刘宇钦
赵季红
张艳鹏
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Xian Jiaotong University
Original Assignee
Xian Jiaotong University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Xian Jiaotong University filed Critical Xian Jiaotong University
Priority to CN202010203309.6A priority Critical patent/CN111464459A/en
Publication of CN111464459A publication Critical patent/CN111464459A/en
Pending legal-status Critical Current

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L47/00 Traffic control in data switching networks
    • H04L47/10 Flow control; Congestion control
    • H04L47/24 Traffic characterised by specific attributes, e.g. priority or QoS
    • H04L47/2441 Traffic characterised by specific attributes, e.g. priority or QoS relying on flow classification, e.g. using integrated services [IntServ]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/24 Classification techniques
    • G06F18/241 Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06F18/2415 Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on parametric or probabilistic models, e.g. based on likelihood ratio or false acceptance rate versus a false rejection rate
    • G06F18/24155 Bayesian classification

Abstract

The invention discloses a flow characteristic extraction method based on principal component analysis and linear discriminant analysis. The method combines the advantages of principal component analysis and linear discriminant analysis while remedying their drawbacks, namely that principal component analysis lacks class information and that linear discriminant analysis is computationally expensive and its within-class dispersion matrix is often singular. The extracted features enable the subsequent classification process to achieve a better result. Experimental results on the classic Moore data set demonstrate that the invention enables the classifier to obtain more accurate classification results in a shorter time.

Description

Network flow characteristic extraction method based on principal component analysis and linear discriminant analysis
Technical Field
The invention belongs to the field of flow classification, relates to a network flow characteristic extraction method under a big data background, and particularly relates to a network flow characteristic extraction method based on principal component analysis and linear discriminant analysis.
Background
With the evolution of mobile communication technology from 4G to 5G, wireless communication networks are becoming faster and more stable, providing a solid foundation for the development of rich Internet services. To make mobile communication networks and Internet technology develop in step, operators are striving to shift from traffic operation to capability operation and to build an open network-capability platform that is coordinated internally and shared externally. To provide the appropriate network capabilities, such a platform must be able to identify network data flows accurately. However, the huge amount of information carried by high-dimensional data flows makes processing by computers very difficult. Moreover, because of the correlation and redundancy among data-flow features, data with a large number of features not only incurs unnecessary computation time and resource overhead but also reduces classification accuracy. Therefore, extracting data-flow features is an essential step when classifying traffic with machine-learning methods.
Existing feature extraction methods, however, have their own drawbacks. For example, PCA analyzes features only from the perspective of variance and cannot account for their characteristics in terms of the mean; moreover, because PCA lacks class information, even though the dimensionality of the data is reduced to a minimum, the subsequent classification may become more difficult. LDA, in turn, suffers from high computational complexity and a within-class dispersion matrix that is often singular, which causes difficulties in practical application.
Disclosure of Invention
The invention aims to provide a network flow characteristic extraction method based on principal component analysis and linear discriminant analysis that overcomes the original defects of PCA and LDA, fully combines their advantages, analyzes comprehensively and extracts network data-flow features accurately, and helps a classifier achieve a better classification effect.
In order to achieve the purpose, the invention adopts the following technical scheme:
the network flow characteristic extraction method based on principal component analysis and linear discriminant analysis comprises the following steps:
1) constructing a projection matrix W_PCA containing discriminant information using principal component analysis;
2) projecting the data set X onto the feature matrix W_PCA to delete irrelevant and redundant features, obtaining a feature set Y;
3) constructing a feature matrix W_LDA using linear discriminant analysis;
4) converting the feature set Y into the feature space defined by W_LDA through the projection matrix W_LDA, obtaining a feature set Z.
The invention is further improved in that the specific process of the step 1) is as follows:
a) representing one data stream with n attributes as a vector of size 1 × n, all m data streams constituting a data set X of size m × n;
b) calculating a covariance matrix C of the data set X according to the following formula;
μ = (1/m) Σ_{i=1}^{m} x_i  (1)
φ_i = x_i − μ  (2)
A = [φ_1, ..., φ_m]^T  (3)
C = (1/m) A^T A  (4)
where μ represents the mean of the entire data set, φ_i is the difference of each data item from the mean, A is the difference matrix, and C is the covariance matrix;
c) calculating the eigenvalues and eigenvectors of the covariance matrix C and sorting the eigenvalues in descending order;
d) forming the projection matrix W_PCA from the eigenvectors corresponding to the k largest eigenvalues, where k is the dimensionality of the data in the PCA feature space.
A further development of the invention is that, in step d), the projection matrix W_PCA = [w_1, ..., w_k], where w_1 is the eigenvector corresponding to the largest eigenvalue and w_k is the eigenvector corresponding to the k-th largest eigenvalue.
A further development of the invention is that, in step d), the dimensionality of the data in the projection matrix W_PCA is 225.
In a further development of the invention, in step a), m has a value of 377526 and n has a value of 248.
A further development of the invention is that in step 2), the feature set Y is as follows:
Y=WPCA TX (5)。
The invention is further improved in that the specific process of step 3) is as follows:
3.1) calculating the within-class dispersion matrix S_w and the between-class dispersion matrix S_b of the feature set Y;
S_w = Σ_{i=1}^{N} Σ_{y_i ∈ class i} (y_i − μ_i)(y_i − μ_i)^T  (6)
S_b = Σ_{i=1}^{N} m_i (μ_i − μ)(μ_i − μ)^T  (7)
where i denotes the class index, N is the total number of classes, y_i denotes each flow data item after the PCA transformation, μ_i is the mean vector of class i, m_i is the number of samples of class i, and μ is the overall mean vector;
3.2) calculating the eigenvalues and eigenvectors of S_w^{-1} S_b by means of the generalized Rayleigh quotient;
J(ω) = (ω^T S_b ω) / (ω^T S_w ω)  (8)
where ω denotes a basis vector of the LDA feature space and J(ω) denotes the generalized Rayleigh quotient;
3.3) sorting the eigenvalues in descending order and taking the eigenvectors corresponding to the p largest eigenvalues to form the projection matrix W_LDA, where p is the dimensionality of the data in the LDA feature space.
A further development of the invention is that the projection matrix W_LDA is as follows:
W_LDA = [ω_1, ..., ω_p]
where ω_1 is the eigenvector corresponding to the largest eigenvalue and ω_p is the eigenvector corresponding to the p-th largest eigenvalue.
A further development of the invention is that in step 4), the feature set Z is as follows:
Z=WLDA TY (9)。
Compared with the prior art, the invention has the following beneficial technical effects:
The invention deletes the redundant and irrelevant features of a high-dimensional training set with principal component analysis, then applies a linear discriminant analysis algorithm to reduce the dimensionality further, converting the original flow data set into a new feature space. The invention retains the advantages of principal component analysis and linear discriminant analysis while making up for their original defects. First, principal component analysis reduces the original data to a dimensionality smaller than the rank of the within-class dispersion matrix, so that the within-class dispersion matrix becomes invertible and linear discriminant analysis can be carried out smoothly. Second, whereas principal component analysis only attends to the variance characteristics of the data, linear discriminant analysis considers both the mean and the variance, so the method analyzes the data characteristics more comprehensively. Therefore, the invention achieves a better feature extraction effect, enables the classifier to reach higher classification efficiency, and meets the requirement of flow classification under a big-data background.
Drawings
FIG. 1 is a flow chart of a flow feature extraction method based on principal component analysis and linear discriminant analysis;
FIG. 2 shows the change in classification accuracy and recall of a naive Bayes (NB) classifier in three feature spaces: the original features, the features extracted by the PCA method, and the features extracted by the PCA+LDA method of the present invention;
FIG. 3 compares the training time and testing time required by a naive Bayes classifier in the three feature spaces: the original features, the features extracted by the PCA method, and the features extracted by the PCA+LDA method of the present invention.
Detailed Description
The invention is described in further detail below with reference to the following figures and examples:
referring to fig. 1, the present invention comprises the steps of:
1) constructing a projection matrix W_PCA containing discriminant information by principal component analysis (PCA); the dimensionality of the data in the projection matrix W_PCA is 225.
Projecting an original data set into a feature space, comprising the specific steps of:
a) Input the flow data set: one data flow with n attributes is represented as a vector of size 1 × n, and all m data flows constitute a data set X of size m × n; in the present invention m is 377526 and n is 248, depending on the data set used.
b) Calculating a covariance matrix C of the data set X according to the following formula;
μ = (1/m) Σ_{i=1}^{m} x_i  (1)
φ_i = x_i − μ  (2)
A = [φ_1, ..., φ_m]^T  (3)
C = (1/m) A^T A  (4)
where μ represents the mean of the entire data set, φ_i is the difference of each data item from the mean, A is the difference matrix, and C is the covariance matrix of the data set, computed from A;
c) calculating the eigenvalues and eigenvectors of the covariance matrix C and sorting the eigenvalues in descending order;
d) forming the projection matrix W_PCA from the eigenvectors corresponding to the k largest eigenvalues, where k is the dimensionality of the data in the PCA feature space, which in the present invention is 225.
W_PCA = [w_1, ..., w_k]
where w_1 is the eigenvector corresponding to the largest eigenvalue and w_k is the eigenvector corresponding to the k-th largest eigenvalue.
2) Projecting the data set X onto the feature matrix W_PCA to delete irrelevant and redundant features and obtain a feature set Y. The constructed feature matrix W_PCA eliminates redundant and irrelevant features in the high-dimensional data and, because the dimensionality of the projected data is smaller than the rank of the within-class dispersion matrix, makes the within-class dispersion matrix invertible, ensuring that LDA can be carried out smoothly. The specific process is as follows:
converting the original data into a PCA feature space to obtain a feature set Y of a data set X projected to the PCA feature space;
Y = W_PCA^T X  (5)
3) Constructing a feature matrix W_LDA by linear discriminant analysis (LDA). The specific process is as follows:
3.1) calculating the within-class dispersion matrix S_w and the between-class dispersion matrix S_b of the feature set Y;
S_w = Σ_{i=1}^{N} Σ_{y_i ∈ class i} (y_i − μ_i)(y_i − μ_i)^T  (6)
S_b = Σ_{i=1}^{N} m_i (μ_i − μ)(μ_i − μ)^T  (7)
where i denotes the class index, N is the total number of classes, y_i denotes each flow data item after the PCA transformation, μ_i is the mean vector of class i, m_i is the number of samples of class i, and μ is the overall mean vector.
3.2) calculating S_w^{-1} S_b;
3.3) calculating the eigenvalues and eigenvectors of S_w^{-1} S_b by means of the generalized Rayleigh quotient;
J(ω) = (ω^T S_b ω) / (ω^T S_w ω)  (8)
where ω denotes a basis vector of the LDA feature space and J(ω) denotes the generalized Rayleigh quotient.
3.4) sorting the eigenvalues in descending order and taking the eigenvectors corresponding to the p largest eigenvalues to form the projection matrix W_LDA, where p is the dimensionality of the data in the LDA feature space, with a value of 10 in the invention.
W_LDA = [ω_1, ..., ω_p]
where ω_1 is the eigenvector corresponding to the largest eigenvalue and ω_p is the eigenvector corresponding to the p-th largest eigenvalue; ω_1, ..., ω_p are the eigenvectors corresponding to the p largest eigenvalues.
4) Using the projection matrix W_LDA, converting the data Y (the projection of the data set X into the PCA feature space) into the LDA feature space, obtaining a feature set Z;
Z = W_LDA^T Y  (9)
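The same two-stage reduction can also be sketched with scikit-learn, which may be convenient in practice; this is only an illustrative equivalent under the assumption of row-per-sample arrays and stand-in data, and scikit-learn's PCA and LinearDiscriminantAnalysis solvers are not necessarily identical to the eigen-decomposition procedure described above.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Stand-in arrays with the feature count of the embodiment (248 features, 12 classes);
# in the method X would be the Moore traffic matrix and y its labels.
X = np.random.rand(2000, 248)
y = np.random.randint(0, 12, size=2000)

Y = PCA(n_components=225).fit_transform(X)                           # steps 1)-2): 225-dim PCA space
Z = LinearDiscriminantAnalysis(n_components=10).fit_transform(Y, y)  # steps 3)-4): 10-dim LDA space
print(Z.shape)  # (2000, 10)
```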
5) The flow data classification process is completed with three classification models, namely a support vector machine (SVM), K-nearest neighbors (KNN) and a decision tree (DT), to verify the improvement the method brings to the classification result. The flow data set is processed by the PCA and LDA algorithms to obtain the flow feature set Z, which is then used as the input for training the classification models so that they achieve higher classification efficiency.
6) A naive Bayes classification model is used to complete the classification of the flow data and to verify the improvement the method brings by considering both mean and variance characteristics. When the attributes are continuous rather than discrete and mutually independent, the naive Bayes model assumes that the data follow a Gaussian distribution with mean μ and variance σ², so the classification result is influenced by the mean and the variance simultaneously; the model can therefore be used to verify the effect that attending to these two factors during feature extraction has on the classification result.
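A hedged sketch of how steps 5) and 6) could be evaluated with scikit-learn; the 80/20 split, the default hyper-parameters and the stand-in arrays are assumptions made for illustration, not values reported in the patent.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.metrics import accuracy_score

# Stand-in for the extracted feature set Z (10 dimensions) and its 12-class labels;
# in the method these are produced by the PCA+LDA projection described above.
Z = np.random.rand(2000, 10)
y = np.random.randint(0, 12, size=2000)
Z_train, Z_test, y_train, y_test = train_test_split(Z, y, test_size=0.2, random_state=0)

models = {
    "SVM": SVC(),
    "KNN": KNeighborsClassifier(),
    "DT": DecisionTreeClassifier(),
    "NB": GaussianNB(),   # Gaussian naive Bayes uses the per-class mean and variance
}
for name, model in models.items():
    model.fit(Z_train, y_train)
    print(name, accuracy_score(y_test, model.predict(Z_test)))
```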
The two feature extraction processes are performed in the order of principal component analysis and linear discriminant analysis.
Taking the Moore data set as an example, the data set contains 377526 network traffic records belonging to 12 label categories, where each sample consists of 248-dimensional features and one label; each piece of traffic data is represented as a vector of size 1 × 248. In this embodiment n is 248 and m is 377526. The whole implementation process is divided into three major parts, and the specific implementation steps are as follows:
1. The first part projects the data set onto the PCA feature space;
1) First the data set is read in and represented as the matrix X of size 377526 × 248;
2) Calculating a covariance matrix according to the formulas (1), (2), (3) and (4) and constructing a projection matrix;
3) The data set is projected by equation (5) into the PCA feature space; in this embodiment the data in the PCA feature space are denoted Y, of size 377526 × 225;
2. The second part converts the data in the PCA feature space into the LDA feature space;
1) calculating an intra-class dispersion matrix and an inter-class dispersion matrix of the data in the PCA characteristic space according to formulas (6) and (7);
2) Computing the eigenvalues and eigenvectors of S_w^{-1} S_b from equation (8) and constructing the projection matrix W_LDA;
3) The data in the PCA feature space are converted into the LDA feature space by equation (9).
3. The third part is a recognition process of the data subjected to feature extraction;
1) Classifying the feature-extracted data with three classification models (support vector machine, K-nearest neighbors and decision tree) to verify the effectiveness and generality of the method in improving the classification result;
in order to verify the effectiveness of the method, the original characteristics, the characteristics extracted by the PCA method and the characteristics extracted by the PCA + L DA method are compared respectively for different classification effects of flow classification.
To verify the generality of the method, experiments are carried out with three classification models, namely a support vector machine, K-nearest neighbors and a decision tree, to ensure that the method improves the recognition ability of a variety of classifiers. Each experiment is repeated 10 times and the average is taken as the final result. Table 1 describes the changes in classification accuracy, precision and recall of the three classifiers in the three feature spaces. Table 2 describes the changes in the training time and testing time required by the three classifiers in the three feature spaces. As Tables 1 and 2 show, PCA+LDA greatly reduces the training and testing time required by the three classifiers while improving the accuracy of their classification results.
2) Classifying the feature-extracted data with a naive Bayes classifier to verify the improvement that PCA+LDA, which considers both mean and variance, brings to the classification result relative to PCA alone;
The classification model of naive Bayes can be expressed as:
P(Y | X) = P(Y) ∏_{i=1}^{n} P(x_i | Y) / P(X)
It assumes that the attributes are discrete and mutually independent. When an attribute is continuous, naive Bayes assumes that the data follow a Gaussian distribution with mean μ and variance σ², and P(X | Y) can then be written using:
P(x_i | Y) = (1 / √(2πσ²)) exp(−(x_i − μ)² / (2σ²))
That is, the classification result is influenced by the mean and the variance simultaneously, so this model can be used to verify the effect that PCA+LDA's attention to both factors during feature extraction has on the classification result.
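As a small numerical illustration of the Gaussian class-conditional term above, in which both the class mean and the class variance shape the probability (the helper function below is hypothetical, not part of the patent):

```python
import numpy as np

def gaussian_likelihood(x, mu, sigma2):
    # P(x_i | Y): Gaussian density with per-class mean mu and variance sigma2;
    # both parameters shape the value, which is why features that are
    # discriminative in both mean and variance help the classifier.
    return np.exp(-(x - mu) ** 2 / (2.0 * sigma2)) / np.sqrt(2.0 * np.pi * sigma2)

print(gaussian_likelihood(1.0, mu=0.0, sigma2=1.0))  # approximately 0.2420
```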
FIG. 2 illustrates the change in classification accuracy and recall of the naive Bayes classifier in the three feature spaces. As FIG. 2 shows, adding the analysis of the mean characteristics of the data improves the accuracy of the naive Bayes model from 0.87 to 0.97 and its recall from 0.02 to 0.95. FIG. 3 illustrates the change in the training time and testing time required by the naive Bayes classifier in the three feature spaces. As FIG. 3 shows, the PCA+LDA method of the present invention reduces the training and testing time required by the naive Bayes model to one tenth of that required by the PCA method.
3) The variance over the 10 experiments is calculated to confirm the stability of the improvement in classification. Table 3 describes the variance of the accuracy, precision and recall of the classification results in the three feature spaces for the four classification models. Table 4 describes the variance of the training time and testing time for the four classification models in the three feature spaces. As Tables 3 and 4 show, the variance of the accuracy, precision, recall, training time and testing time of the four classification models over the ten experiments is small relative to the result values, which demonstrates the stability of the improvement in classification brought by the method.
Taken together, the experimental results show that the PCA+LDA method provided by the invention makes up for the respective defects of the original PCA and LDA algorithms, combines their advantages, and performs more effective feature extraction, so that a classifier can obtain a more accurate classification result in a shorter time.
The invention deletes the redundant and irrelevant features of a high-dimensional training set with principal component analysis, then applies a linear discriminant analysis algorithm to reduce the dimensionality further, converting the original flow data set into a new feature space. The method combines the advantages of principal component analysis and linear discriminant analysis while remedying their drawbacks, namely that principal component analysis lacks class information and that linear discriminant analysis is computationally expensive and its within-class dispersion matrix is often singular. The extracted features enable the subsequent classification process to achieve a better result. Experimental results on the classic Moore data set demonstrate that the invention enables the classifier to obtain more accurate classification results in a shorter time.
TABLE 1 variation of classification accuracy, precision and recall of three classifiers in three feature spaces
TABLE 2 variation of training time and testing time required for three classifiers in three feature spaces
TABLE 3 variance of classification accuracy, precision and recall in three feature spaces for four classification models
TABLE 4 variance of training time and testing time in three feature spaces for four classification models

Claims (9)

1. The network flow characteristic extraction method based on principal component analysis and linear discriminant analysis is characterized by comprising the following steps of:
1) constructing a projection matrix W_PCA containing discriminant information using principal component analysis;
2) projecting the data set X onto the feature matrix W_PCA to delete irrelevant and redundant features, obtaining a feature set Y;
3) constructing a feature matrix W_LDA using linear discriminant analysis;
4) converting the feature set Y into the feature space defined by W_LDA through the projection matrix W_LDA, obtaining a feature set Z.
2. The method for extracting network traffic characteristics based on principal component analysis and linear discriminant analysis according to claim 1, wherein the specific process of step 1) is as follows:
a) representing one data stream with n attributes as a vector of size 1 × n, all m data streams constituting a data set X of size m × n;
b) calculating a covariance matrix C of the data set X according to the following formula;
μ = (1/m) Σ_{i=1}^{m} x_i  (1)
φ_i = x_i − μ  (2)
A = [φ_1, ..., φ_m]^T  (3)
C = (1/m) A^T A  (4)
where μ represents the mean of the entire data set, φ_i is the difference of each data item from the mean, A is the difference matrix, and C is the covariance matrix;
c) calculating the eigenvalues and eigenvectors of the covariance matrix C and sorting the eigenvalues in descending order;
d) forming the projection matrix W_PCA from the eigenvectors corresponding to the k largest eigenvalues, where k is the dimensionality of the data in the PCA feature space.
3. The method for extracting network traffic characteristics based on principal component analysis and linear discriminant analysis as claimed in claim 2, wherein in step d), the projection matrix W_PCA = [w_1, ..., w_k], where w_1 is the eigenvector corresponding to the largest eigenvalue and w_k is the eigenvector corresponding to the k-th largest eigenvalue.
4. The method for extracting network traffic characteristics based on principal component analysis and linear discriminant analysis as claimed in claim 2, wherein in step d), the dimensionality of the data in the projection matrix W_PCA is 225.
5. The method for extracting network traffic characteristics based on principal component analysis and linear discriminant analysis according to claim 2, wherein in step a), m has a value of 377526, and n has a value of 248.
6. The method for extracting network traffic characteristics based on principal component analysis and linear discriminant analysis according to claim 1, wherein in step 2), the characteristic set Y is as follows:
Y = W_PCA^T X  (5).
7. the method for extracting network traffic characteristics based on principal component analysis and linear discriminant analysis according to claim 1, wherein the specific process of step 3) is as follows:
3.1) calculating the within-class dispersion matrix S_w and the between-class dispersion matrix S_b of the feature set Y;
S_w = Σ_{i=1}^{N} Σ_{y_i ∈ class i} (y_i − μ_i)(y_i − μ_i)^T  (6)
S_b = Σ_{i=1}^{N} m_i (μ_i − μ)(μ_i − μ)^T  (7)
where i denotes the class index, N is the total number of classes, y_i denotes each flow data item after the PCA transformation, μ_i is the mean vector of class i, m_i is the number of samples of class i, and μ is the overall mean vector;
3.2) calculating the eigenvalues and eigenvectors of S_w^{-1} S_b by means of the generalized Rayleigh quotient;
J(ω) = (ω^T S_b ω) / (ω^T S_w ω)  (8)
where ω denotes a basis vector of the LDA feature space and J(ω) denotes the generalized Rayleigh quotient;
3.3) sorting the eigenvalues in descending order and taking the eigenvectors corresponding to the p largest eigenvalues to form the projection matrix W_LDA, where p is the dimensionality of the data in the LDA feature space.
8. The method of claim 7, wherein the projection matrix W_LDA is as follows:
W_LDA = [ω_1, ..., ω_p]
where ω_1 is the eigenvector corresponding to the largest eigenvalue and ω_p is the eigenvector corresponding to the p-th largest eigenvalue.
9. The method for extracting network traffic characteristics based on principal component analysis and linear discriminant analysis according to claim 1, wherein in step 4), the characteristic set Z is as follows:
Z = W_LDA^T Y  (9).
CN202010203309.6A 2020-03-20 2020-03-20 Network flow characteristic extraction method based on principal component analysis and linear discriminant analysis Pending CN111464459A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010203309.6A CN111464459A (en) 2020-03-20 2020-03-20 Network flow characteristic extraction method based on principal component analysis and linear discriminant analysis

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010203309.6A CN111464459A (en) 2020-03-20 2020-03-20 Network flow characteristic extraction method based on principal component analysis and linear discriminant analysis

Publications (1)

Publication Number Publication Date
CN111464459A true CN111464459A (en) 2020-07-28

Family

ID=71680805

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010203309.6A Pending CN111464459A (en) 2020-03-20 2020-03-20 Network flow characteristic extraction method based on principal component analysis and linear discriminant analysis

Country Status (1)

Country Link
CN (1) CN111464459A (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101219048A (en) * 2008-01-25 2008-07-16 北京工业大学 Method for extracting brain electrical character of imagine movement of single side podosoma
CN101226591A (en) * 2008-01-31 2008-07-23 上海交通大学 Personal identification method based on mobile phone pick-up head combining with human face recognition technique
CN101329724A (en) * 2008-07-29 2008-12-24 上海天冠卫视技术研究所 Optimized human face recognition method and apparatus

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113489685A (en) * 2021-06-15 2021-10-08 江苏大学 Secondary feature extraction and malicious attack identification method based on kernel principal component analysis
CN113489685B (en) * 2021-06-15 2023-03-21 江苏大学 Secondary feature extraction and malicious attack identification method based on kernel principal component analysis

Similar Documents

Publication Publication Date Title
Zhu et al. Exploring auxiliary context: discrete semantic transfer hashing for scalable image retrieval
Wang et al. View-based discriminative probabilistic modeling for 3D object retrieval and recognition
CN109063565B (en) Low-resolution face recognition method and device
Almazán et al. Segmentation-free word spotting with exemplar SVMs
Kyperountas et al. Weighted piecewise LDA for solving the small sample size problem in face verification
US9275306B2 (en) Devices, systems, and methods for learning a discriminant image representation
CN106971180B (en) A kind of micro- expression recognition method based on the sparse transfer learning of voice dictionary
CN102324022B (en) Composite gradient vector-based face recognition method
Kan et al. Learning prototype hyperplanes for face verification in the wild
Xu et al. Discriminative analysis for symmetric positive definite matrices on lie groups
Tao et al. Quotient vs. difference: comparison between the two discriminant criteria
CN115170868A (en) Clustering-based small sample image classification two-stage meta-learning method
Hu et al. Action recognition using multiple pooling strategies of CNN features
CN111464459A (en) Network flow characteristic extraction method based on principal component analysis and linear discriminant analysis
Sun et al. Multiple-kernel, multiple-instance similarity features for efficient visual object detection
Li et al. Spatial-temporal dynamic hand gesture recognition via hybrid deep learning model
Chang et al. Face recognition based on stacked convolutional autoencoder and sparse representation
CN110458002B (en) Lightweight rapid face recognition method
Che et al. Boosting few-shot open-set recognition with multi-relation margin loss
Lin et al. A small sample face recognition method based on deep learning
Mantziou et al. Large-scale semi-supervised learning by approximate laplacian eigenmaps, VLAD and pyramids
CN111931788A (en) Image feature extraction method based on complex value
Shao et al. Automatic scene recognition based on constructed knowledge space learning
Tompkins et al. Image analysis with regularized laplacian eigenmaps
CN113111161B (en) Cross-media association analysis method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20200728