CN112990371B - Unsupervised night image classification method based on feature amplification - Google Patents


Info

Publication number: CN112990371B (granted publication of application CN112990371A)
Application number: CN202110459160.2A
Authority: CN (China)
Original language: Chinese (zh)
Prior art keywords: feature, image, classification, night, class
Inventors: 章依依, 郑影, 朱岳江, 徐晓刚, 曹卫强, 朱亚光
Assignee (original and current): Zhejiang Lab
Application filed by Zhejiang Lab
Legal status: Active

Classifications

    • G06F18/24: Physics; Computing; Electric digital data processing; Pattern recognition; Analysing; Classification techniques
    • G06N3/04: Computing arrangements based on specific computational models; Computing arrangements based on biological models; Neural networks; Architecture, e.g. interconnection topology
    • G06N3/088: Neural networks; Learning methods; Non-supervised learning, e.g. competitive learning
    • G06V10/40: Image or video recognition or understanding; Extraction of image or video features

Abstract

The invention belongs to the technical field of computer vision recognition and relates to an unsupervised night image classification method based on feature amplification. A classification network is trained on an open data set with daytime image classification labels; feature vectors of input images are extracted through the classification network, and the feature mean and covariance matrix of each class are calculated. Unlabeled nighttime images are then input into the classification network to obtain pseudo labels, and the feature mean and covariance matrix of each class of nighttime images are calculated in the feature space according to the pseudo labels. The covariance matrices obtained from daytime and nighttime images of the same class are weighted-averaged to obtain a final covariance matrix, and feature sampling is performed according to the feature mean of each nighttime class and the weighted-average covariance matrix. Finally, the classification network is retrained on the sampled feature values together with the original feature values. By learning the feature distribution of the labeled daytime images, the method amplifies the nighttime data at the feature level, realizing unsupervised classification of nighttime images.

Description

Unsupervised night image classification method based on feature amplification
Technical Field
The invention belongs to the technical field of computer vision recognition, and particularly relates to an unsupervised night image classification method based on feature amplification.
Background
Image classification is the most classical task in the field of computer vision recognition, is the basis of many other vision problems, and has great practical value and application prospects. Image classification is essentially a pattern classification problem: its goal is to assign different images to different classes while minimizing classification error. With the success of convolutional neural networks (CNN), deep learning has proven to be an effective solution to the image classification problem.
Currently available large public data sets related to image classification mainly include ImageNet, COCO and Pascal VOC; however, these data sets consist almost entirely of images acquired in daytime environments. Research shows that daytime and nighttime images exhibit a clear domain gap, and a neural network trained on a daytime data set often suffers a sharp performance drop when processing nighttime data. Two main approaches address this problem:
1. Domain adaptation. In transfer learning, when the data distributions of the source domain and the target domain differ but the two tasks are the same, this special case of transfer learning is called domain adaptation. Domain adaptation is mainly achieved by finding a feature space in which the distributions of the source and target domains are matched. Matching the distributions of daytime and nighttime data by learning such a shared space can effectively improve classification performance on nighttime data. Current techniques mainly rely on generative adversarial networks, but suffer from problems such as unstable training and long training times.
2. Data amplification. Because there is no large nighttime image classification data set, nighttime data amplification can be performed without supervision. For example, a GAN can generate a corresponding low-light image from a daytime image, i.e., data amplification at the image level. However, nighttime images generated this way do not match the real data distribution and still exhibit a domain gap from real nighttime data.
Therefore, how to effectively amplify a nighttime data set so that the features extracted by the trained model better fit the real nighttime data distribution is an urgent problem for unsupervised nighttime image classification.
Disclosure of Invention
To address the lack of nighttime classification data sets in the prior art, the invention provides an unsupervised night image classification method based on feature amplification. The feature distribution of each daytime class is first obtained by training a daytime image classifier. Under the assumption that each dimension of the feature vector follows a Gaussian distribution, nighttime features are then sampled using the mean and covariance of the daytime feature distributions, realizing nighttime data amplification at the feature level. Finally, the model is retrained on the original data together with the sampled data, improving nighttime image classification performance. The specific technical scheme is as follows:
an unsupervised night image classification method based on feature augmentation comprises the following steps:
step 1: constructing a data set: downloading an open source night image classification data set Exclusive Dark (ExDark), selecting partial images from the open source night image classification data set Exclusive Dark to construct an unsupervised night image data set A, and using the rest images as a night image classification performance verification set B; randomly selecting images of which the parts correspond to the ExDark data set from the Pascal VOC public data set as a daytime image classification data set T;
step 2: training a classification network to extract image features, and obtaining a mean value and a covariance matrix of the features of each category of the image: training a classification network by adopting a daytime image classification data set T, extracting a characteristic vector of an input image through the classification network, and calculating a characteristic mean value and a covariance matrix of each class;
and step 3: inputting a night image data set A to the classification network to obtain a pseudo label of an input image;
and 4, step 4: counting feature mean values and covariance matrixes of all classes of the night image according to the pseudo labels;
and 5: carrying out weighted average on covariance matrixes obtained by daytime and nighttime images of the same category to obtain a fusion covariance matrix;
step 6: performing feature sampling according to the feature mean value and the fusion covariance matrix of each category of the night image;
and 7: and retraining the classification network by the characteristic sample generated by sampling and the original sample.
Further, the step 2 specifically includes:
Step 2.1: selecting a ResNet50 deep residual network as the classification network and pre-training it on the ImageNet data set;
Step 2.2: training the classification network on the daytime image classification data set T: the output number of the last classification layer of ResNet50 is changed to the number of classes in T, the learning rate of the last layer is set to 0.001 and that of the pre-trained layers to 0.0001, the model is optimized with SGD (stochastic gradient descent) with batch size 32, and 50 epochs are trained in total, using the cross-entropy loss function L_CE, computed as:

L_CE = -(1/N) · Σ_{i=1}^{N} y_i · log(p_i)

where N represents the total number of samples, y_i the label of the i-th sample, and p_i the predicted probability value of the i-th sample;
Step 2.3: extracting the feature vectors of the daytime image classification data set T through the trained ResNet50 network, i.e., removing the last fully connected layer of the classification network and extracting the feature vectors output by the second-to-last layer, then performing two-dimensional feature-vector visualization analysis with the t-SNE algorithm to verify the effectiveness of the classification network;
Step 2.4: for each class, calculating the corresponding feature mean μ_i and covariance matrix Σ_i^day; the feature mean μ_i of class i is expressed as:

μ_i = (1/n_i) · Σ_{j=1}^{n_i} x_{i,j}

where x_{i,j} represents the j-th input image feature point belonging to class i and n_i represents the total number of images belonging to class i; the feature distribution covariance matrix Σ_i^day of class i is expressed as:

Σ_i^day = (1/(n_i - 1)) · Σ_{j=1}^{n_i} (x_{i,j} - μ_i)(x_{i,j} - μ_i)^T
Further, the step 3 specifically includes:
Step 3.1: inputting the nighttime image data set A into the trained classification network, and calculating the Euclidean distance between the feature point of each input image and the mean of each class's feature points computed in step 2.4, i.e., the feature center μ_i; the Euclidean distance d_i between the feature point f of an image and the i-th class feature center μ_i is expressed as:

d_i = ‖f - μ_i‖₂

Step 3.2: for each input image, finding the nearest class feature center according to the distances calculated in step 3.1; if the distance between the feature point f and the nearest class feature center μ_c is less than a hyper-parameter threshold τ, the pseudo label of f is set to the class of μ_c; otherwise, the feature point f is judged to be noise and discarded, resulting in the feature set:

S = {(f_j, ŷ_j)}

where ŷ_j is the pseudo label of feature vector f_j.
Further, the step 4 specifically includes:
according to the obtained feature set S, computing the feature mean μ_i^night and covariance matrix Σ_i^night of each nighttime image class; the feature mean μ_i^night of class i is expressed as:

μ_i^night = (1/m_i) · Σ_{j=1}^{m_i} f_{i,j}

where f_{i,j} represents the j-th feature vector belonging to class i in the feature set S and m_i represents the total number of feature vectors belonging to class i; the feature distribution covariance matrix Σ_i^night of class i is expressed as:

Σ_i^night = (1/(m_i - 1)) · Σ_{j=1}^{m_i} (f_{i,j} - μ_i^night)(f_{i,j} - μ_i^night)^T
Further, the step 5 specifically includes:
taking the weighted average of the acquired daytime covariance matrix Σ_i^day and nighttime covariance matrix Σ_i^night to obtain the fused covariance matrix Σ_i', expressed as:

Σ_i' = ρ · Σ_i^day + (1 - ρ) · Σ_i^night

where ρ is a weight value that balances the contributions of the daytime and nighttime distributions.
Further, the step 6 specifically includes:
according to the obtained feature mean value of each class of night images
Figure 509115DEST_PATH_IMAGE023
And a final weighted covariance matrix
Figure 120356DEST_PATH_IMAGE033
Characteristic sampling is performed from the following Gaussian distribution
Figure 100002_DEST_PATH_IMAGE035
In which random generation belongs to class i
Figure 100002_DEST_PATH_IMAGE036
A sample
Figure 100002_DEST_PATH_IMAGE037
Figure 100002_DEST_PATH_IMAGE038
The number of samples generated by each category is unified by the hyper-parameter
Figure 457272DEST_PATH_IMAGE036
And (4) setting.
Further, the step 7 specifically includes:
retraining the classification network and the loss function by the feature sample Z generated by sampling, the original real daytime feature data x and the nighttime feature data y with the pseudo label
Figure 100002_DEST_PATH_IMAGE039
Expressed as:
Figure 100002_DEST_PATH_IMAGE040
wherein
Figure 100002_DEST_PATH_IMAGE041
A representation belongs to any feature vector in the training sample,
Figure 100002_DEST_PATH_IMAGE042
indicating the label to which it corresponds,
Figure 100002_DEST_PATH_IMAGE043
representing classification model parameters.
The invention has the following beneficial effects:
1. the labeled real daytime image classification data are used to migrate feature distributions to the nighttime data, so that the nighttime features obtained by sampling are closer to the real data distribution;
2. the nighttime data are amplified at the feature level, which has a lower dimensionality than the image level and facilitates feature distribution statistics and migration;
3. the extra cost is incurred mainly in the training stage and the inference stage is unaffected, so nighttime image classification performance can be effectively improved while preserving inference speed.
Drawings
FIG. 1 is a schematic diagram of the distances in feature space between the feature point of an input image and each class feature center;
fig. 2 is a flow chart of the method of the present invention.
Detailed Description
In order to make the objects, technical solutions and technical effects of the present invention more apparent, the present invention is further described in detail below with reference to the accompanying drawings and examples.
As shown in fig. 1 and 2, the unsupervised nighttime image classification method based on feature augmentation of the present invention includes the following steps:
Step 1: constructing the data sets: 11 categories of the open-source data set Exclusively Dark (ExDark) are adopted, namely bicycles, boats, bottles, buses, cars, cats, chairs, dogs, motorcycles, people and tables, and 800 corresponding images are selected from the Pascal VOC public data set for each of the 11 categories as the daytime image classification data set T; in addition, the ExDark data set is divided into two parts: 400 images are selected from each of the 11 categories to construct the unsupervised nighttime image data set A, and the remaining images serve as the nighttime image classification performance verification set B to evaluate the effectiveness of the algorithm.
Step 2: training the classification network to extract image features and obtain the feature mean and covariance matrix of each image class, specifically:
Step 2.1, classification network pre-training:
a ResNet50 deep residual network is selected as the classification network and pre-trained on the ImageNet data set, giving the network prior knowledge, accelerating convergence, and avoiding overfitting;
Step 2.2, classification network fine-tuning:
the last classification layer of the ResNet50 model trained in step 2.1 is changed from 1000 outputs to 11, and the network is fine-tuned with the daytime image classification data set T constructed in step 1; the learning rate of the last layer is set to 0.001 and that of the pre-trained layers to 0.0001, the model is optimized with SGD with batch size 32, and 50 epochs are trained in total, using the cross-entropy loss function L_CE, computed as:

L_CE = -(1/N) · Σ_{i=1}^{N} y_i · log(p_i)

where N represents the total number of samples, y_i the label of the i-th sample, and p_i the predicted probability value of the i-th sample.
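As an illustrative sketch outside the patent text, the cross-entropy formula above can be checked numerically with NumPy; the toy probabilities and one-hot labels below are invented for demonstration:

```python
import numpy as np

def cross_entropy(y_onehot, probs, eps=1e-12):
    # L_CE = -(1/N) * sum_i y_i . log(p_i): mean negative log-probability
    # assigned to each sample's true class.
    n = y_onehot.shape[0]
    return float(-np.sum(y_onehot * np.log(probs + eps)) / n)

# Toy check: 3 samples, 11 classes, true class predicted with p = 0.5 each.
probs = np.full((3, 11), 0.05)
probs[np.arange(3), [0, 1, 2]] = 0.5
y = np.zeros((3, 11))
y[np.arange(3), [0, 1, 2]] = 1.0
loss = cross_entropy(y, probs)
```

In actual training this loss is minimized over the ResNet50 parameters by SGD; the sketch only verifies the formula itself.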
Step 2.3, image feature extraction:
extracting the feature vector of the daytime image classification data set T by using the classification network trained in the step 2.2, namely removing the last full-connection layer of the classification network, and extracting 2048-dimensional feature vectors output by the second last layer; performing two-dimensional space feature vector visualization analysis by using a t-SNE algorithm; if the features of the same category are mutually aggregated and the features of different categories are mutually distinguished, the classification network is trained; otherwise, the classification network needs to be trained continuously until the expected classification effect is achieved, which indicates that the classification network has better feature extraction and feature distinguishing capabilities;
step 2.4, counting the characteristic distribution of the daytime data set:
each feature extracted in the step 2.3 is a feature vector with 2048 dimensions, and each dimension of the feature vectors with the same category is regarded as a Gaussian distribution, so that a new feature vector can be sampled according to the mean value and the variance; for 11 classes, respectively calculating corresponding feature mean values
Figure 751801DEST_PATH_IMAGE005
And covariance matrix
Figure 884973DEST_PATH_IMAGE006
Mean of features of class i
Figure 783659DEST_PATH_IMAGE007
Expressed as:
Figure 759706DEST_PATH_IMAGE008
wherein the content of the first and second substances,
Figure 628304DEST_PATH_IMAGE009
representing the jth input image feature point belonging to class i,
Figure 6196DEST_PATH_IMAGE010
representing the total number of images belonging to class i, class
Figure 685570DEST_PATH_IMAGE011
Feature distribution covariance matrix of
Figure 148913DEST_PATH_IMAGE012
Expressed as:
Figure 100002_DEST_PATH_IMAGE045
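The per-class statistics of step 2.4 can be sketched in NumPy as follows; the feature dimension is reduced from 2048 to 3 and the features are random, purely for illustration (np.cov uses the unbiased 1/(n-1) normalization):

```python
import numpy as np

def class_statistics(features, labels, num_classes):
    """Feature mean mu_i and covariance Sigma_i for each class.
    features: (N, D) array; labels: (N,) integer array."""
    means, covs = [], []
    for i in range(num_classes):
        x = features[labels == i]             # features of class i
        means.append(x.mean(axis=0))          # class mean, shape (D,)
        covs.append(np.cov(x, rowvar=False))  # covariance, shape (D, D)
    return np.stack(means), np.stack(covs)

# Demonstration on random 3-D "features" for 2 classes.
rng = np.random.default_rng(0)
feats = rng.normal(size=(40, 3))
labs = np.array([0] * 20 + [1] * 20)
mu, sigma = class_statistics(feats, labs, 2)
```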
Step 3: inputting the nighttime image data set A into the classification network to obtain pseudo labels for the input images, specifically:
Step 3.1, inputting the data set A into the classification network trained in step 2, and calculating the Euclidean distance between the feature point of each input image and the 11 class feature centers μ_i obtained in step 2.4; the Euclidean distance d_i between the feature point f of an image and the i-th class feature center μ_i is expressed as:

d_i = ‖f - μ_i‖₂

As shown in fig. 1, the rectangle represents the feature point of an input image, and the triangle and the circle represent two different class feature centers; the figure is simplified to 2 centers, while there are actually 11.
Step 3.2, for each input image, the nearest class feature center is found according to the distances calculated in step 3.1; if the distance between the feature point f and the nearest class feature center μ_c is less than a hyper-parameter threshold τ, the pseudo label of f is set to the class of μ_c; otherwise, the feature point f is judged to be noise and discarded, resulting in the feature set:

S = {(f_j, ŷ_j)}

The feature set comprises the feature vectors of the nighttime images and their corresponding pseudo labels.
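A minimal sketch of the nearest-center pseudo-labelling of steps 3.1 and 3.2, assuming Euclidean distance and a hypothetical threshold value tau = 2.0 (the patent leaves the threshold as a tunable hyper-parameter); the 2-D centers and points are invented:

```python
import numpy as np

def pseudo_label(night_feats, class_means, tau):
    """Assign each nighttime feature the class of its nearest daytime
    class center; discard points whose nearest distance is >= tau."""
    kept, labels = [], []
    for f in night_feats:
        d = np.linalg.norm(class_means - f, axis=1)  # distances d_i
        c = int(np.argmin(d))                        # nearest center
        if d[c] < tau:
            kept.append(f)
            labels.append(c)
        # else: the point is treated as noise and dropped
    return np.array(kept), np.array(labels)

# Two well-separated 2-D centers; the third point is an outlier.
centers = np.array([[0.0, 0.0], [10.0, 10.0]])
feats = np.array([[0.1, -0.2], [9.8, 10.1], [50.0, 50.0]])
S_feats, S_labels = pseudo_label(feats, centers, tau=2.0)
```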
Step 4: computing the feature mean and covariance matrix of each nighttime image class according to the pseudo labels, specifically:
according to the feature set S obtained in step 3.2, the feature mean μ_i^night and covariance matrix Σ_i^night of each nighttime image class are computed; the feature mean μ_i^night of class i is expressed as:

μ_i^night = (1/m_i) · Σ_{j=1}^{m_i} f_{i,j}

where f_{i,j} represents the j-th feature vector belonging to class i in the feature set S and m_i represents the total number of feature vectors belonging to class i; the feature distribution covariance matrix Σ_i^night of class i is expressed as:

Σ_i^night = (1/(m_i - 1)) · Σ_{j=1}^{m_i} (f_{i,j} - μ_i^night)(f_{i,j} - μ_i^night)^T
Step 5: taking the weighted average of the covariance matrices obtained from daytime and nighttime images of the same category to obtain the fused covariance matrix, specifically: the daytime covariance matrix Σ_i^day acquired in step 2.4 and the nighttime covariance matrix Σ_i^night acquired in step 4 are weighted-averaged to obtain the fused covariance matrix Σ_i', expressed as:

Σ_i' = ρ · Σ_i^day + (1 - ρ) · Σ_i^night

where ρ is a weight value that balances the contributions of the daytime and nighttime distributions; in this example, performance is best when ρ = 0.8, indicating that fusing in the daytime data distribution can effectively improve the diversity of the nighttime data distribution and thereby improve nighttime classification performance.
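A one-line sketch of the covariance fusion; note that assigning ρ to the daytime term is an assumption here, since the original formula is rendered only as an image, with ρ = 0.8 reported as the best-performing value:

```python
import numpy as np

def fuse_covariance(cov_day, cov_night, rho=0.8):
    # Sigma' = rho * Sigma_day + (1 - rho) * Sigma_night.
    # Assumption: rho weights the daytime term; the patent's own
    # formula is only available as an image.
    return rho * cov_day + (1.0 - rho) * cov_night

cov_day = np.eye(2) * 4.0
cov_night = np.eye(2) * 1.0
fused = fuse_covariance(cov_day, cov_night)
```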
Step 6: performing feature sampling according to the feature mean of each nighttime class and the fused covariance matrix, specifically: according to the feature mean μ_i^night of each nighttime class obtained in step 4 and the weighted covariance matrix Σ_i' obtained in step 5, feature sampling is performed; taking class i as an example, from the Gaussian distribution N(μ_i^night, Σ_i'), K samples z_{i,j} belonging to class i are randomly generated:

z_{i,j} ~ N(μ_i^night, Σ_i'),  j = 1, …, K

The number of samples generated for each class is uniformly set by the hyper-parameter K; in this example K = 400. Because of the domain gap between nighttime and daytime data, only the feature points of the nighttime images are considered when calculating the feature mean, while the distribution diversity of the daytime data can increase the sample richness of the nighttime images, which is why the daytime covariance distribution is fused in.
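Step 6 reduces to drawing from a multivariate Gaussian; a NumPy sketch, using a small 3-dimensional example in place of the 2048-dimensional features:

```python
import numpy as np

def sample_class_features(mu_night, cov_fused, k=400, seed=0):
    """Draw K synthetic nighttime feature vectors for one class from
    N(mu_night, Sigma'); K = 400 in the embodiment."""
    rng = np.random.default_rng(seed)
    return rng.multivariate_normal(mu_night, cov_fused, size=k)

mu = np.array([1.0, -1.0, 0.5])
cov = np.eye(3) * 0.01
z = sample_class_features(mu, cov, k=400)   # shape (400, 3)
```

Repeating this per class yields the amplified feature set Z used in step 7.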
Step 7: retraining the classification network on the feature samples generated by sampling together with the original samples, specifically: the classification network is retrained on the feature samples Z generated by sampling, the original real daytime feature data x, and the pseudo-labeled nighttime feature data y; the loss function L is expressed as:

L(θ) = -(1/M) · Σ_{(u,v)} log p_θ(v | u)

where u denotes any feature vector in the training samples, v the label corresponding to it, θ the classification model parameters, and M the total number of training feature vectors.
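For step 7, a linear softmax head trained directly on pooled feature vectors can stand in for retraining the network's final layer; this NumPy sketch with invented 2-D features is illustrative only, not the patent's ResNet50 training:

```python
import numpy as np

def retrain_head(X, y, num_classes, lr=0.5, epochs=300):
    """Minimise the cross-entropy L(theta) over a linear softmax head
    applied to feature vectors (sampled Z, daytime x, and pseudo-labelled
    nighttime y pooled into X) by plain gradient descent."""
    n, d = X.shape
    W = np.zeros((d, num_classes))
    b = np.zeros(num_classes)
    Y = np.eye(num_classes)[y]                       # one-hot labels
    for _ in range(epochs):
        logits = X @ W + b
        logits -= logits.max(axis=1, keepdims=True)  # numerical stability
        p = np.exp(logits)
        p /= p.sum(axis=1, keepdims=True)            # softmax probabilities
        g = (p - Y) / n                              # dL/dlogits
        W -= lr * (X.T @ g)
        b -= lr * g.sum(axis=0)
    return W, b

# Linearly separable toy features for two classes.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(-2, 0.3, (50, 2)), rng.normal(2, 0.3, (50, 2))])
y = np.array([0] * 50 + [1] * 50)
W, b = retrain_head(X, y, 2)
acc = float(((X @ W + b).argmax(axis=1) == y).mean())
```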
The invention trains the classification network with the labeled daytime data set and migrates its feature diversity to the nighttime data, thereby compensating for the scarcity of nighttime data. Using a classification network trained on daytime data only, the classification performance on the ExDark verification set B is 58.74%; with the feature amplification method provided by the invention, the classification performance on the ExDark verification set B reaches 69.22%, an improvement of 10.48% over the baseline, greatly improving nighttime classification performance and fully demonstrating the practical benefit and application value of the method.

Claims (7)

1. An unsupervised night image classification method based on feature augmentation, characterized by comprising the following steps:
Step 1: constructing data sets: downloading the open-source nighttime image classification data set Exclusively Dark (ExDark), selecting part of its images to construct an unsupervised nighttime image data set A, and using the remaining images as a nighttime image classification performance verification set B; randomly selecting images from the Pascal VOC public data set corresponding to the ExDark classes as a daytime image classification data set T;
Step 2: training a classification network to extract image features and obtain the feature mean and covariance matrix of each image class: training the classification network on the daytime image classification data set T, extracting the feature vector of each input image through the network, and calculating the feature mean and covariance matrix of each image class;
Step 3: inputting the nighttime image data set A into the classification network to obtain pseudo labels for the input images;
Step 4: computing the feature mean and covariance matrix of each nighttime image class according to the pseudo labels;
Step 5: taking the weighted average of the covariance matrices obtained from daytime and nighttime images of the same class to obtain a fused covariance matrix;
Step 6: performing feature sampling according to the feature mean of each nighttime class and the fused covariance matrix;
Step 7: retraining the classification network on the feature samples generated by sampling together with the original real daytime feature data x and the pseudo-labeled nighttime feature data y.
2. The unsupervised night image classification method based on feature augmentation as claimed in claim 1, wherein the step 2 specifically comprises:
Step 2.1: selecting a ResNet50 deep residual network as the classification network and pre-training it on the ImageNet data set;
Step 2.2: training the classification network on the daytime image classification data set T: changing the output number of the last classification layer of ResNet50 to the number of classes in T, setting the learning rate of the last layer to 0.001 and that of the pre-trained layers to 0.0001, optimizing the model with SGD (stochastic gradient descent) with batch size 32, and training 50 epochs in total, using the cross-entropy loss function L_CE, computed as:

L_CE = -(1/N) · Σ_{i=1}^{N} y_i · log(p_i)

where N represents the total number of samples, y_i the label of the i-th sample, and p_i the predicted probability value of the i-th sample;
Step 2.3: extracting the feature vectors of the daytime image classification data set T through the trained ResNet50 network, and performing two-dimensional feature-vector visualization analysis on them with the t-SNE algorithm;
Step 2.4: for each class, calculating the corresponding feature mean μ_i and covariance matrix Σ_i^day; the feature mean μ_i of class i is expressed as:

μ_i = (1/n_i) · Σ_{j=1}^{n_i} x_{i,j}

where x_{i,j} represents the j-th input image feature point belonging to class i and n_i represents the total number of images belonging to class i; the daytime image feature distribution covariance matrix Σ_i^day of class i is expressed as:

Σ_i^day = (1/(n_i - 1)) · Σ_{j=1}^{n_i} (x_{i,j} - μ_i)(x_{i,j} - μ_i)^T
3. The unsupervised night image classification method based on feature augmentation as claimed in claim 2, wherein the step 3 specifically comprises:
Step 3.1: inputting the nighttime image data set A into the trained classification network, and calculating the Euclidean distance between the feature point of each input image and the mean of each class's feature points computed in step 2.4, i.e., the feature center μ_i; the Euclidean distance d_i between the feature point f of an image and the i-th class feature center μ_i is expressed as:

d_i = ‖f - μ_i‖₂

Step 3.2: for each input image, finding the nearest class feature center according to the distances calculated in step 3.1; if the distance between the feature point f and the nearest class feature center μ_c is less than a hyper-parameter threshold τ, setting the pseudo label of f to the class of μ_c; otherwise, judging the feature point f to be noise and discarding it, resulting in the feature set:

S = {(f_j, ŷ_j)}
4. The unsupervised night image classification method based on feature augmentation as claimed in claim 3, wherein the step 4 specifically comprises:
according to the obtained feature set S, computing the feature mean μ_i^night and covariance matrix Σ_i^night of each nighttime image class; the feature mean μ_i^night of class i is expressed as:

μ_i^night = (1/m_i) · Σ_{j=1}^{m_i} f_{i,j}

where f_{i,j} represents the j-th feature vector belonging to class i in the feature set S and m_i represents the total number of feature vectors belonging to class i; the nighttime image feature distribution covariance matrix Σ_i^night of class i is expressed as:

Σ_i^night = (1/(m_i - 1)) · Σ_{j=1}^{m_i} (f_{i,j} - μ_i^night)(f_{i,j} - μ_i^night)^T
5. The unsupervised nighttime image classification method based on feature augmentation as claimed in claim 4, wherein the step 5 specifically comprises:
carrying out a weighted average of the obtained daytime image covariance matrix $\Sigma_i$ and the nighttime image covariance matrix $\hat\Sigma_i$ to obtain the fused covariance matrix $\Sigma'_i$, expressed as:

$$\Sigma'_i = \rho\,\Sigma_i + (1-\rho)\,\hat\Sigma_i$$

wherein $\rho$ is a weight value.
6. The unsupervised nighttime image classification method based on feature augmentation as claimed in claim 5, wherein the step 6 specifically comprises:
according to the obtained feature mean $\hat\mu_i$ of each class of the nighttime images and the final weighted covariance matrix $\Sigma'_i$, performing feature sampling from the Gaussian distribution $\mathcal N(\hat\mu_i, \Sigma'_i)$, in which $K$ samples $z_{i1}, z_{i2}, \dots, z_{iK}$ belonging to class $i$ are randomly generated; the number of samples generated for each class is set uniformly by the hyper-parameter $K$.
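Steps 5 and 6 can be sketched together in NumPy (the names `augment_class`, `rho`, and `K` are illustrative; the fixed seed is only for reproducibility):

```python
import numpy as np

def augment_class(mu_night, sigma_day, sigma_night, rho, K, seed=0):
    """Step 5: fuse the day/night covariance matrices by weighted average;
    step 6: draw K synthetic feature samples for the class from a Gaussian.

    mu_night:    (D,) nighttime feature mean of the class
    sigma_day:   (D, D) daytime covariance matrix of the class
    sigma_night: (D, D) nighttime covariance matrix of the class
    rho:         fusion weight in [0, 1]
    K:           number of samples to generate per class
    Returns a (K, D) array of synthetic class features.
    """
    sigma_fused = rho * sigma_day + (1.0 - rho) * sigma_night
    rng = np.random.default_rng(seed)
    return rng.multivariate_normal(mu_night, sigma_fused, size=K)
```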
7. The unsupervised nighttime image classification method based on feature augmentation as claimed in claim 6, wherein the step 7 specifically comprises:
retraining the classification network with the feature samples $Z$ generated by sampling, the original real daytime feature data $x$, and the pseudo-labeled nighttime feature data $y$; the loss function $L$ is expressed as:

$$L = -\sum_{k}\log p\left(c_k \mid v_k;\, \theta\right)$$

wherein $v_k$ represents any feature vector belonging to the training samples, $c_k$ represents its corresponding label, and $\theta$ represents the classification model parameters.
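The retraining objective of step 7 can be sketched as a softmax cross-entropy over the mixed feature set; the linear classifier (`W`, `b`) and the function name are illustrative assumptions, since the patent specifies only the loss form, not the classifier head:

```python
import numpy as np

def softmax_xent(W, b, feats, labels):
    """Step 7 objective: cross-entropy of a classifier over the union of
    real daytime features, pseudo-labeled nighttime features, and the
    sampled feature set Z.

    W: (D, C) classifier weights, b: (C,) bias -- the parameters theta
    feats: (N, D) mixed training features, labels: (N,) integer labels
    Returns the mean negative log-likelihood of the correct classes.
    """
    logits = feats @ W + b
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -log_probs[np.arange(len(labels)), labels].mean()
```

In practice this loss would be minimized over mini-batches drawn jointly from the three feature sources.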
CN202110459160.2A 2021-04-27 2021-04-27 Unsupervised night image classification method based on feature amplification Active CN112990371B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110459160.2A CN112990371B (en) 2021-04-27 2021-04-27 Unsupervised night image classification method based on feature amplification

Publications (2)

Publication Number Publication Date
CN112990371A CN112990371A (en) 2021-06-18
CN112990371B (en) 2021-09-10

Family

ID=76340379

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110459160.2A Active CN112990371B (en) 2021-04-27 2021-04-27 Unsupervised night image classification method based on feature amplification

Country Status (1)

Country Link
CN (1) CN112990371B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113657561B (en) * 2021-10-20 2022-03-18 之江实验室 Semi-supervised night image classification method based on multi-task decoupling learning
CN113989597B (en) * 2021-12-28 2022-04-05 中科视语(北京)科技有限公司 Vehicle weight recognition method and device, electronic equipment and storage medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105608433A (en) * 2015-12-23 2016-05-25 北京化工大学 Nuclear coordinated expression-based hyperspectral image classification method
CN111814871A (en) * 2020-06-13 2020-10-23 浙江大学 Image classification method based on reliable weight optimal transmission
CN112016392A (en) * 2020-07-17 2020-12-01 浙江理工大学 Hyperspectral image-based small sample detection method for soybean pest damage degree
CN112434723A (en) * 2020-07-23 2021-03-02 之江实验室 Day/night image classification and object detection method based on attention network

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108764281A (en) * 2018-04-18 2018-11-06 华南理工大学 A kind of image classification method learning across task depth network based on semi-supervised step certainly
CN110348399B (en) * 2019-07-15 2020-09-29 中国人民解放军国防科技大学 Hyperspectral intelligent classification method based on prototype learning mechanism and multidimensional residual error network

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
Learning to See in Extremely Low-Light Environments with Small Data; Xu Yifeng et al.; 《https://www.proquest.com/openview/ac68d120e78fcc554a08f3c465b3300e/1?pq-origsite=gscholar&cbl=2032404》; 2020-06-17; pp. 1-15 *
Low-shot Learning via Covariance-Preserving Adversarial Augmentation Networks; Hang Gao et al.; 《https://arxiv.org/abs/1810.11730》; 2018-12-13; pp. 1-13 *
Making of Night Vision: Object Detection Under Low-Illumination; Yuxuan Xiao et al.; 《IEEE Access》; 2020-07-07; pp. 123075-123086 *
Subtle Surface Defects of Workpieces Based on Super-Resolution Feature Fusion; Liu Xiaobao et al.; 《https://kns.cnki.net/kcms/detail/11.5946.TP.20210129.1755.004.html》; 2021-02-01; pp. 1-18 *

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant