CN110781942B - Semi-supervised image classification method and system - Google Patents


Info

Publication number
CN110781942B
Authority
CN
China
Prior art keywords
sample
state transition
transition matrix
sample set
module
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910995951.XA
Other languages
Chinese (zh)
Other versions
CN110781942A (en)
Inventor
康宇
吕文君
许镇义
李泽瑞
昌吉
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
University of Science and Technology of China USTC
Original Assignee
University of Science and Technology of China USTC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by University of Science and Technology of China USTC filed Critical University of Science and Technology of China USTC
Priority to CN201910995951.XA priority Critical patent/CN110781942B/en
Publication of CN110781942A publication Critical patent/CN110781942A/en
Application granted granted Critical
Publication of CN110781942B publication Critical patent/CN110781942B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/21 Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F 18/214 Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • G06F 18/2155 Generating training patterns; Bootstrap methods, e.g. bagging or boosting, characterised by the incorporation of unlabelled data, e.g. multiple instance learning [MIL], semi-supervised techniques using expectation-maximisation [EM] or naïve labelling
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/24 Classification techniques
    • G06F 18/241 Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06F 18/2411 Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches, based on the proximity to a decision surface, e.g. support vector machines

Landscapes

  • Engineering & Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a semi-supervised image classification method and system, belonging to the technical field of computers and comprising the following steps: collecting raw data and converting it into a sample set; letting L = {1, 2, …, c, …, l} denote the set of label values, c = 1, 2, …, l, and establishing a state transition matrix T ∈ ℝ^{l×l}, where T_{m,n}, m, n = 1, 2, …, l, represents the transition probability from state m to state n; setting a labeling interval according to the smallest element on the diagonal of the state transition matrix; labeling the samples at that interval to obtain a labeled sample set and an unlabeled sample set; and training the constructed support vector machine model with the labeled and unlabeled sample sets to obtain a trained support vector machine. The sample labeling of the invention reflects the actual distribution of the data classes, and the labeled data is used to train the model, improving the accuracy of semi-supervised classification.

Description

Semi-supervised image classification method and system
Technical Field
The invention relates to the technical field of computers, in particular to a semi-supervised image classification method and system.
Background
How to exploit massive data is an important task facing current machine learning. The traditional support vector machine is a supervised learning method and needs a large number of labeled samples for training. In practical applications, however, most of the available sample data is unlabeled and labeled sample points are scarce; if only the few labeled samples are used, the information contained in the large number of unlabeled samples is lost. Researchers have therefore proposed semi-supervised learning methods, which exploit the knowledge in unlabeled sample data alongside the knowledge in a small amount of labeled sample data. However, existing semi-supervised learning methods still do not fully exploit spatial smoothness, resulting in a large number of misclassifications.
Disclosure of Invention
The invention aims to overcome the above defects of the prior art and improve the accuracy of semi-supervised classification.
In order to achieve the above object, in one aspect, a semi-supervised image classification method is adopted, including the following steps:
collecting raw data and performing feature extraction on it to convert it into a sample set X = {x_1, x_2, …, x_p, …, x_P}, where x_p is one sample, p = 1, 2, …, P, and P is the number of all samples; each sample x_p ∈ ℝ^d, where ℝ denotes the set of real numbers and d is the sample dimension;
letting L = {1, 2, …, c, …, l} denote the set of label values, c = 1, 2, …, l, and establishing a state transition matrix T ∈ ℝ^{l×l}, where T_{m,n}, m, n = 1, 2, …, l, represents the transition probability from state m to state n;
setting a labeling interval Δ according to the smallest element on the diagonal of the state transition matrix, where Δ is an integer;
labeling x_p at the labeling interval Δ to obtain a labeled sample set {x_i, y_i}, i = 1, 2, …, I, where y_i is the label of sample x_i, and an unlabeled sample set {x_j}, j = 1, 2, …, J, where I + J = P;
training the constructed support vector machine model by using the labeled sample set and the unlabeled sample set to obtain a trained support vector machine;
and classifying the currently acquired data by using a trained support vector machine.
Further, setting the labeling interval Δ according to the smallest element on the diagonal of the state transition matrix includes:
computing the set of w satisfying T_min^(w) ≤ τ to obtain its smallest element w_min, where T_min denotes the smallest element on the diagonal of the state transition matrix, T_min^(w) denotes the smallest element on the diagonal of the w-step transition matrix T^w, τ ∈ (0, 1) is a threshold, and w = 0, 1, 2, …;
calculating the maximum labeling interval W_max = 2w_min + 1 and setting the labeling interval Δ ∈ [1, W_max].
Further, before training the constructed support vector machine model with the labeled and unlabeled sample sets, the method further includes establishing the support vector machine model as:

f* = argmin_{f ∈ H_K} (1/I) Σ_{i=1}^{I} V(x_i, y_i, f) + γ_A ‖f‖_K² + γ_S Σ_{p=1}^{P} Σ_{q=1}^{P} θ_{pq} s_{pq} (f(x_p) - f(x_q))²

where H_K is the reproducing kernel Hilbert space, V(x_i, y_i, f) is the loss function, ‖f‖_K is the complexity norm of f in the reproducing kernel Hilbert space, γ_A, γ_S > 0, θ_{pq} is the feature similarity coefficient, x_q is a sample, and s_{pq} is the spatial similarity coefficient.
Further, θ_{pq} is calculated as:

θ_{pq} = exp(-‖x_p - x_q‖² / t_θ) if x_p ∈ N(x_q) or x_q ∈ N(x_p), and θ_{pq} = 0 otherwise,

where N(x_q) is the set of the N samples nearest to x_q in feature space and t_θ is the Gaussian kernel width.
Further, the spatial similarity coefficient s_{ij} is calculated by the formula:

[formula image: s_{ij} in terms of T_{y_i, y_i} and a Gaussian kernel of ρ(x_i, x_j)]

where t_s is the Gaussian kernel width, T_{y_i, y_i} is the y_i-th element on the diagonal of the state transition matrix, and ρ(x_i, x_j) is the sampling spatial distance between x_i and x_j.
In another aspect, a semi-supervised image classification system is employed, comprising: a data collection module, a state transition matrix establishing module, a labeling interval determining module, a labeling module, a model training module and a classification module;
the data collection module is used for collecting raw data and performing feature extraction on it to convert it into a sample set X = {x_1, x_2, …, x_p, …, x_P}, where x_p is one sample, p = 1, 2, …, P, and P is the number of all samples; each sample x_p ∈ ℝ^d, where ℝ denotes the set of real numbers and d is the sample dimension;
the state transition matrix establishing module is used for letting L = {1, 2, …, c, …, l} denote the set of label values, c = 1, 2, …, l, and establishing a state transition matrix T ∈ ℝ^{l×l}, where T_{m,n}, m, n = 1, 2, …, l, represents the transition probability from state m to state n;
the labeling interval determining module is used for setting a labeling interval Δ according to the smallest element on the diagonal of the state transition matrix, where Δ is an integer;
the labeling module is used for labeling x_p at the labeling interval Δ to obtain a labeled sample set {x_i, y_i}, i = 1, 2, …, I, where y_i is the label of sample x_i, and an unlabeled sample set {x_j}, j = 1, 2, …, J, where I + J = P;
the model training module is used for training the constructed support vector machine model with the labeled and unlabeled sample sets to obtain a trained support vector machine;
the classification module is used for classifying the currently acquired data by utilizing a trained support vector machine.
Further, the labeling interval determining module is specifically configured to:
computing the set of w satisfying T_min^(w) ≤ τ to obtain its smallest element w_min, where T_min denotes the smallest element on the diagonal of the state transition matrix, T_min^(w) denotes the smallest element on the diagonal of the w-step transition matrix T^w, τ ∈ (0, 1) is a threshold, and w = 0, 1, 2, …;
calculating the maximum labeling interval W_max = 2w_min + 1 and setting the labeling interval Δ ∈ [1, W_max].
Further, the system also comprises a support vector machine model building module for establishing the support vector machine model as:

f* = argmin_{f ∈ H_K} (1/I) Σ_{i=1}^{I} V(x_i, y_i, f) + γ_A ‖f‖_K² + γ_S Σ_{p=1}^{P} Σ_{q=1}^{P} θ_{pq} s_{pq} (f(x_p) - f(x_q))²

where H_K is the reproducing kernel Hilbert space, V(x_i, y_i, f) is the loss function, ‖f‖_K is the complexity norm of f in the reproducing kernel Hilbert space, γ_A, γ_S > 0, θ_{pq} is the feature similarity coefficient, x_q is a sample, q = 1, 2, …, P, and s_{pq} is the spatial similarity coefficient.
Further, θ_{pq} is calculated as:

θ_{pq} = exp(-‖x_p - x_q‖² / t_θ) if x_p ∈ N(x_q) or x_q ∈ N(x_p), and θ_{pq} = 0 otherwise,

where N(x_q) is the set of the N samples nearest to x_q in feature space and t_θ is the Gaussian kernel width.
Further, the spatial similarity coefficient s_{ij} is calculated by the formula:

[formula image: s_{ij} in terms of T_{y_i, y_i} and a Gaussian kernel of ρ(x_i, x_j)]

where t_s is the Gaussian kernel width, T_{y_i, y_i} is the y_i-th element on the diagonal of the state transition matrix, and ρ(x_i, x_j) is the sampling spatial distance between x_i and x_j.
Compared with the prior art, the invention has the following technical effects: 1) it prevents any class of sample from going unlabeled; 2) it better exploits the spatial smoothness assumption, improving classification accuracy.
Drawings
The following detailed description of embodiments of the invention refers to the accompanying drawings in which:
FIG. 1 is a schematic flow diagram of a semi-supervised image classification method;
fig. 2 is a schematic structural diagram of a semi-supervised image classification system.
Detailed Description
To further illustrate the features of the present invention, refer to the following detailed description of the invention and the accompanying drawings. The drawings are for reference and illustration purposes only and are not intended to limit the scope of the present disclosure.
As shown in fig. 1, the present embodiment discloses a semi-supervised image classification method, including the following steps S1 to S6:
S1, collecting raw data and performing feature extraction on it to convert it into a sample set X = {x_1, x_2, …, x_p, …, x_P}, where x_p is one sample, p = 1, 2, …, P, and P is the number of all samples; each sample x_p ∈ ℝ^d, where ℝ denotes the set of real numbers and d is the sample dimension;
It should be noted that the raw data may be vibration signals and ground images collected for robot terrain classification, rock images collected during underground lithology identification, or hyperspectral images collected during satellite hyperspectral image classification.
S2, letting L = {1, 2, …, c, …, l} denote the set of label values, c = 1, 2, …, l, and establishing a state transition matrix T ∈ ℝ^{l×l}, where T_{m,n}, m, n = 1, 2, …, l, represents the transition probability from state m to state n;
it should be noted that the label may be a ground type number, a lithology type number, or a satellite hyperspectral category number.
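By way of illustration, the state transition matrix T may be estimated empirically from a previously recorded label sequence (for example, the ground-type sequence of a calibration run). The patent does not prescribe how T is obtained, so the counting scheme and function name below are illustrative assumptions:

```python
import numpy as np

def estimate_transition_matrix(labels, l):
    """Estimate an l-by-l state transition matrix T from a label sequence.

    T[m-1, n-1] is the empirical probability of moving from label m to
    label n between consecutive samples (labels take values 1..l).
    """
    counts = np.zeros((l, l))
    for m, n in zip(labels[:-1], labels[1:]):
        counts[m - 1, n - 1] += 1
    row_sums = counts.sum(axis=1, keepdims=True)
    row_sums[row_sums == 0] = 1.0  # avoid division by zero for unseen labels
    return counts / row_sums

# e.g. a terrain sequence that mostly stays in the same class
seq = [1, 1, 1, 2, 2, 2, 2, 1, 1, 1]
T = estimate_transition_matrix(seq, l=2)
```

A diagonally dominant T (classes tend to persist between consecutive samples) is exactly the situation the labeling-interval rule below exploits.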
S3, setting a labeling interval Δ according to the smallest element on the diagonal of the state transition matrix, where Δ is an integer;
S4, labeling x_p at the labeling interval Δ to obtain a labeled sample set {x_i, y_i}, i = 1, 2, …, I, where y_i is the label of sample x_i, and an unlabeled sample set {x_j}, j = 1, 2, …, J, where I + J = P;
s5, training the constructed support vector machine model by using the labeled sample set and the unlabeled sample set to obtain a trained support vector machine;
and S6, classifying the currently acquired data by using the trained support vector machine.
It should be noted that, in this embodiment, labeling the samples in the sample set in this way reflects the actual distribution of the data classes and prevents any class of sample from going unlabeled.
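Step S4 then amounts to sending every Δ-th sample for manual labeling and leaving the rest unlabeled; a minimal sketch (starting at index 0 is our choice, since the text fixes only the interval Δ):

```python
def split_by_interval(num_samples, delta):
    """Return (labeled_indices, unlabeled_indices) for step S4:
    every delta-th sample (starting at index 0, an assumption) is
    sent for manual labeling; the remaining samples stay unlabeled."""
    labeled = list(range(0, num_samples, delta))
    unlabeled = [p for p in range(num_samples) if p % delta != 0]
    return labeled, unlabeled

labeled, unlabeled = split_by_interval(num_samples=10, delta=3)
# together the two index lists cover all 10 samples exactly once
```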
It should be noted that the scheme can be applied to robot terrain classification, underground lithology identification, satellite hyperspectral image classification and the like, to improve classification accuracy. Take the ground classification of a robot as an example:
the robot comprises a robot body, a vibration sensor, a camera lens and a control module, wherein the vibration sensor is arranged on the robot body to detect vibration signals in the direction perpendicular to the ground, the camera lens faces the ground and is used for shooting the ground where the robot is located at present, and the vibration sensor and the camera are in an equal-time sampling working mode. The robot can randomly walk on the ground expected to be identified by utilizing the original data of the vibration sensor and the camera, and collects vibration signals and image signals from the vibration sensor and the camera, wherein the vibration signals and the image signals have time stamps.
Every S consecutive vibration readings are taken as one vibration frame, converting the vibration signal into a set of vibration frames {v_1, v_2, …, v_p, …, v_P}, where v_p is a vibration frame, p = 1, 2, …, P, and P is the number of all vibration frames. Each vibration frame is matched to a ground image by its time stamp, giving a set of ground images aligned one-to-one with the frames. Each vibration frame is then converted into a sample by a d-point Fourier transform, yielding the sample set X = {x_1, x_2, …, x_p, …, x_P}. At the labeling interval Δ, the ground image corresponding to x_p is read, the true ground type it shows is identified manually, and x_p is labeled accordingly.
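The frame-and-transform pipeline above can be sketched as follows; taking the magnitude of the d-point Fourier transform as the feature vector is an assumption, since the text names only a d-point Fourier transform:

```python
import numpy as np

def vibration_frames_to_samples(signal, S, d):
    """Group every S consecutive vibration readings into a frame v_p,
    then map each frame to a d-dimensional sample x_p via a d-point
    Fourier transform (magnitude spectrum, our assumption)."""
    P = len(signal) // S              # number of complete frames
    frames = np.reshape(signal[:P * S], (P, S))
    return np.abs(np.fft.fft(frames, n=d, axis=1))

X = vibration_frames_to_samples(np.random.randn(1000), S=100, d=64)
# X has shape (10, 64): ten frames, each a 64-dimensional sample
```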
Further, the above step S3 of setting the labeling interval Δ according to the smallest element on the diagonal of the state transition matrix includes the following sub-steps S31 to S32:
S31, computing the set of w satisfying T_min^(w) ≤ τ to obtain its smallest element w_min, where T_min denotes the smallest element on the diagonal of the state transition matrix, T_min^(w) denotes the smallest element on the diagonal of the w-step transition matrix T^w, τ ∈ (0, 1) is a threshold, and w = 0, 1, 2, …;
S32, calculating the maximum labeling interval W_max = 2w_min + 1 and setting the labeling interval Δ ∈ [1, W_max].
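Under one reading of sub-steps S31 to S32, w_min is the first w at which the smallest diagonal element of T^w has decayed to the threshold τ; the direction of the inequality is an assumption on our part, since the original formula survives only as an image. A sketch under that assumption:

```python
import numpy as np

def labeling_interval_bound(T, tau, w_cap=1000):
    """Return (w_min, W_max). w_min is taken as the smallest w with
    min(diag(T^w)) <= tau, i.e. the first step count at which staying
    in the same class is no longer likely; W_max = 2 * w_min + 1.
    The <= direction of the condition is an assumption."""
    T = np.asarray(T, dtype=float)
    Tw = np.eye(T.shape[0])           # T^0 = identity
    for w in range(w_cap + 1):
        if np.min(np.diag(Tw)) <= tau:
            return w, 2 * w + 1
        Tw = Tw @ T
    raise ValueError("threshold not reached within w_cap steps")

w_min, W_max = labeling_interval_bound([[0.9, 0.1], [0.2, 0.8]], tau=0.5)
```

With this persistent two-class chain the bound comes out at w_min = 4, so labeling every Δ ∈ [1, 9] samples would be admissible.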
Further, before step S5 trains the constructed support vector machine model with the labeled and unlabeled sample sets, the method further includes establishing the support vector machine model as:

f* = argmin_{f ∈ H_K} (1/I) Σ_{i=1}^{I} V(x_i, y_i, f) + γ_A ‖f‖_K² + γ_S Σ_{p=1}^{P} Σ_{q=1}^{P} θ_{pq} s_{pq} (f(x_p) - f(x_q))²

where H_K is the reproducing kernel Hilbert space, V(x_i, y_i, f) is the loss function, ‖f‖_K is the complexity norm of f in the reproducing kernel Hilbert space, γ_A, γ_S > 0, θ_{pq} is the feature similarity coefficient, x_q is a sample, q = 1, 2, …, P, and s_{pq} is the spatial similarity coefficient.
Further, θ_{pq} is calculated as:

θ_{pq} = exp(-‖x_p - x_q‖² / t_θ) if x_p ∈ N(x_q) or x_q ∈ N(x_p), and θ_{pq} = 0 otherwise,

where N(x_q) is the set of the N samples nearest to x_q in feature space and t_θ is the Gaussian kernel width.
The spatial similarity coefficient s_{ij} is calculated by the formula:

[formula image: s_{ij} in terms of T_{y_i, y_i} and a Gaussian kernel of ρ(x_i, x_j)]

where t_s is the Gaussian kernel width, T_{y_i, y_i} is the y_i-th element on the diagonal of the state transition matrix, and ρ(x_i, x_j) is the sampling spatial distance between x_i and x_j.
It should be noted that the support vector machine model constructed in this embodiment better exploits the spatial smoothness assumption, thereby improving classification accuracy.
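As a concrete illustration of the smoothness penalty, the sketch below builds the feature similarity θ_{pq} as k-nearest-neighbour Gaussian weights (our reading of its formula) and trains a manifold-regularised model in the spirit of the support vector machine model above. A squared loss and a linear kernel stand in for the unspecified loss V and kernel, so this is an illustrative stand-in rather than the patented method:

```python
import numpy as np

def feature_similarity(X, N, t_theta):
    """theta[p, q] = exp(-||x_p - x_q||^2 / t_theta) for N-nearest-
    neighbour pairs in feature space, else 0 (our reading of theta_pq)."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    order = np.argsort(d2, axis=1)    # column 0 is each point itself
    mask = np.zeros(d2.shape, dtype=bool)
    mask[np.arange(len(X))[:, None], order[:, 1:N + 1]] = True
    mask |= mask.T                    # symmetrise the neighbour graph
    return np.where(mask, np.exp(-d2 / t_theta), 0.0)

def manifold_ridge(X, y_partial, W, gamma_A, gamma_S):
    """Squared-loss, linear-kernel stand-in for the model above:
    minimise the loss on labelled samples + gamma_A ||f||^2
    + gamma_S * sum_pq W[p, q] (f(x_p) - f(x_q))^2, solved in closed
    form via the representer theorem (Laplacian-regularised ridge)."""
    P = len(X)
    L = np.diag(W.sum(axis=1)) - W    # graph Laplacian of the weights
    J = np.diag([0.0 if t is None else 1.0 for t in y_partial])
    y = np.array([0.0 if t is None else t for t in y_partial])
    K = X @ X.T                       # linear kernel matrix
    alpha = np.linalg.solve(J @ K + gamma_A * np.eye(P) + gamma_S * L @ K,
                            J @ y)
    return lambda x: float(alpha @ (X @ x))

# two clusters; one labelled point per cluster, the rest unlabelled
X = np.array([[1.0, 0.0], [1.1, 0.0], [1.0, 0.1],
              [-1.0, 0.0], [-1.1, 0.0], [-1.0, 0.1]])
W = feature_similarity(X, N=2, t_theta=1.0)
f = manifold_ridge(X, [1.0, None, None, -1.0, None, None], W,
                   gamma_A=0.01, gamma_S=0.1)
```

The graph weights vanish between the two clusters, so the smoothness term propagates each label across its own cluster and the unlabelled points inherit the sign of their cluster's single labelled sample.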
As shown in fig. 2, the present embodiment discloses a semi-supervised image classification system, comprising: a data collection module 10, a state transition matrix establishment module 20, a labeling interval determination module 30, a labeling module 40, a model training module 50 and a classification module 60;
The data collection module 10 is configured to collect raw data and perform feature extraction on it to convert it into a sample set X = {x_1, x_2, …, x_p, …, x_P}, where x_p is one sample, p = 1, 2, …, P, and P is the number of all samples; each sample x_p ∈ ℝ^d, where ℝ denotes the set of real numbers and d is the sample dimension;
The state transition matrix establishing module 20 is configured to let L = {1, 2, …, c, …, l} denote the set of label values, c = 1, 2, …, l, and to establish a state transition matrix T ∈ ℝ^{l×l}, where T_{m,n}, m, n = 1, 2, …, l, represents the transition probability from state m to state n;
The labeling interval determining module 30 is configured to set a labeling interval Δ according to the smallest element on the diagonal of the state transition matrix, where Δ is an integer;
The labeling module 40 is configured to label x_p at the labeling interval Δ, obtaining a labeled sample set {x_i, y_i}, i = 1, 2, …, I, where y_i is the label of sample x_i, and an unlabeled sample set {x_j}, j = 1, 2, …, J, where I + J = P;
the model training module 50 is configured to train the constructed support vector machine model by using the labeled sample set and the unlabeled sample set to obtain a trained support vector machine;
the classification module 60 is configured to classify the currently acquired data by using a trained support vector machine.
Further, the labeling interval determining module 30 is specifically configured to:
computing the set of w satisfying T_min^(w) ≤ τ to obtain its smallest element w_min, where T_min denotes the smallest element on the diagonal of the state transition matrix, T_min^(w) denotes the smallest element on the diagonal of the w-step transition matrix T^w, τ ∈ (0, 1) is a threshold, and w = 0, 1, 2, …;
calculating the maximum labeling interval W_max = 2w_min + 1 and setting the labeling interval Δ ∈ [1, W_max].
Further, the system also comprises a support vector machine model building module for establishing the support vector machine model as:

f* = argmin_{f ∈ H_K} (1/I) Σ_{i=1}^{I} V(x_i, y_i, f) + γ_A ‖f‖_K² + γ_S Σ_{p=1}^{P} Σ_{q=1}^{P} θ_{pq} s_{pq} (f(x_p) - f(x_q))²

where H_K is the reproducing kernel Hilbert space, V(x_i, y_i, f) is the loss function, ‖f‖_K is the complexity norm of f in the reproducing kernel Hilbert space, γ_A, γ_S > 0, θ_{pq} is the feature similarity coefficient, x_q is a sample, q = 1, 2, …, P, and s_{pq} is the spatial similarity coefficient.
The θ_{pq} calculation formula is:

θ_{pq} = exp(-‖x_p - x_q‖² / t_θ) if x_p ∈ N(x_q) or x_q ∈ N(x_p), and θ_{pq} = 0 otherwise,

where N(x_q) is the set of the N samples nearest to x_q in feature space and t_θ is the Gaussian kernel width.
The spatial similarity coefficient s_{ij} is calculated by the formula:

[formula image: s_{ij} in terms of T_{y_i, y_i} and a Gaussian kernel of ρ(x_i, x_j)]

where t_s is the Gaussian kernel width, T_{y_i, y_i} is the y_i-th element on the diagonal of the state transition matrix, and ρ(x_i, x_j) is the sampling spatial distance between x_i and x_j.
The above description covers only preferred embodiments of the present invention and is not intended to limit it; any modifications, equivalent substitutions, improvements and the like made within the spirit and principle of the present invention shall fall within its scope of protection.

Claims (8)

1. A semi-supervised image classification method is characterized by comprising the following steps:
collecting raw data, which is image data, and performing feature extraction on it to convert it into a sample set X = {x_1, x_2, …, x_p, …, x_P}, where x_p is one sample, p = 1, 2, …, P, and P is the number of all samples; each sample x_p ∈ ℝ^d, where ℝ denotes the set of real numbers and d is the sample dimension;
letting L = {1, 2, …, c, …, l} denote the set of label values, c = 1, 2, …, l, and establishing a state transition matrix T ∈ ℝ^{l×l}, where T_{m,n}, m, n = 1, 2, …, l, represents the transition probability from state m to state n;
setting a labeling interval Δ according to the smallest element on the diagonal of the state transition matrix, where Δ is an integer, specifically including:
computing the set of w satisfying T_min^(w) ≤ τ to obtain w_min, where T_min denotes the smallest element on the diagonal of the state transition matrix, τ ∈ (0, 1) is a threshold, w = 0, 1, 2, …, w_min is the smallest element in the set of such w, and T_min^(w) denotes the smallest element on the diagonal of the state transition matrix after w steps;
calculating the maximum annotation separation Wmax=2wmin+1 and setting the marking interval Delta epsilon [1, W ∈ ]max];
labeling x_p at the labeling interval Δ to obtain a labeled sample set {x_i, y_i}, i = 1, 2, …, I, where y_i is the label of sample x_i, and an unlabeled sample set {x_j}, j = 1, 2, …, J, where I + J = P;
training the constructed support vector machine model by using the labeled sample set and the unlabeled sample set to obtain a trained support vector machine;
and classifying the currently acquired data by using a trained support vector machine.
2. The semi-supervised image classification method of claim 1, wherein before the training of the constructed support vector machine model with the labeled and unlabeled sample sets, the method further comprises establishing the support vector machine model as:

f* = argmin_{f ∈ H_K} (1/I) Σ_{i=1}^{I} V(x_i, y_i, f) + γ_A ‖f‖_K² + γ_S Σ_{p=1}^{P} Σ_{q=1}^{P} θ_{pq} s_{pq} (f(x_p) - f(x_q))²

where H_K is the reproducing kernel Hilbert space, V(x_i, y_i, f) is the loss function, ‖f‖_K is the complexity norm of f in the reproducing kernel Hilbert space, γ_A, γ_S > 0, θ_{pq} is the feature similarity coefficient, x_q is a sample, q = 1, 2, …, P, and s_{pq} is the spatial similarity coefficient.
3. The semi-supervised image classification method of claim 2, wherein θ_{pq} is calculated as:

θ_{pq} = exp(-‖x_p - x_q‖² / t_θ) if x_p ∈ N(x_q) or x_q ∈ N(x_p), and θ_{pq} = 0 otherwise,

where N(x_q) is the set of the N samples nearest to x_q in feature space and t_θ is the Gaussian kernel width.
4. The semi-supervised image classification method of claim 2, wherein the spatial similarity coefficient s_{ij} is calculated by the formula:

[formula image: s_{ij} in terms of T_{y_i, y_i} and a Gaussian kernel of ρ(x_i, x_j)]

where t_s is the Gaussian kernel width, T_{y_i, y_i} is the y_i-th element on the diagonal of the state transition matrix, and ρ(x_i, x_j) is the sampling spatial distance between x_i and x_j.
5. A semi-supervised image classification system, comprising: a data collection module, a state transition matrix establishing module, a labeling interval determining module, a labeling module, a model training module and a classification module;
the data collection module is used for collecting raw data, which is image data, and performing feature extraction on it to convert it into a sample set X = {x_1, x_2, …, x_p, …, x_P}, where x_p is one sample, p = 1, 2, …, P, and P is the number of all samples; each sample x_p ∈ ℝ^d, where ℝ denotes the set of real numbers and d is the sample dimension;
the state transition matrix establishing module is used for letting L = {1, 2, …, c, …, l} denote the set of label values, c = 1, 2, …, l, and establishing a state transition matrix T ∈ ℝ^{l×l}, where T_{m,n}, m, n = 1, 2, …, l, represents the transition probability from state m to state n;
the labeling interval determining module is configured to set a labeling interval Δ according to the smallest element on the diagonal of the state transition matrix, where Δ is an integer, the labeling interval determining module being specifically configured to:

compute the set of w satisfying T_min^(w) ≤ τ to obtain w_min, where T_min denotes the smallest element on the diagonal of the state transition matrix, τ ∈ (0, 1) is a threshold, w = 0, 1, 2, …, w_min is the smallest element in the set of such w, and T_min^(w) denotes the smallest element on the diagonal of the state transition matrix after w steps;
calculating the maximum annotation separation Wmax=2wmin+1 and setting the marking interval Delta epsilon [1, W ∈ ]max];
the labeling module is used for labeling x_p at the labeling interval Δ to obtain a labeled sample set {x_i, y_i}, i = 1, 2, …, I, where y_i is the label of sample x_i, and an unlabeled sample set {x_j}, j = 1, 2, …, J, where I + J = P;
the model training module is used for training the constructed support vector machine model by utilizing the marked sample set and the unmarked sample set to obtain a trained support vector machine;
the classification module is used for classifying the currently acquired data by utilizing a trained support vector machine.
6. The semi-supervised image classification system of claim 5, further comprising a support vector machine model construction module for establishing the support vector machine model as:

f* = argmin_{f ∈ H_K} (1/I) Σ_{i=1}^{I} V(x_i, y_i, f) + γ_A ‖f‖_K² + γ_S Σ_{p=1}^{P} Σ_{q=1}^{P} θ_{pq} s_{pq} (f(x_p) - f(x_q))²

where H_K is the reproducing kernel Hilbert space, V(x_i, y_i, f) is the loss function, ‖f‖_K is the complexity norm of f in the reproducing kernel Hilbert space, γ_A, γ_S > 0, θ_{pq} is the feature similarity coefficient, x_q is a sample, q = 1, 2, …, P, and s_{pq} is the spatial similarity coefficient.
7. The semi-supervised image classification system of claim 6, wherein θ_{pq} is calculated as:

θ_{pq} = exp(-‖x_p - x_q‖² / t_θ) if x_p ∈ N(x_q) or x_q ∈ N(x_p), and θ_{pq} = 0 otherwise,

where N(x_q) is the set of the N samples nearest to x_q in feature space and t_θ is the Gaussian kernel width.
8. The semi-supervised image classification system of claim 6, wherein the spatial similarity coefficient s_{ij} is calculated by the formula:

[formula image: s_{ij} in terms of T_{y_i, y_i} and a Gaussian kernel of ρ(x_i, x_j)]

where t_s is the Gaussian kernel width, T_{y_i, y_i} is the y_i-th element on the diagonal of the state transition matrix, and ρ(x_i, x_j) is the sampling spatial distance between x_i and x_j.
CN201910995951.XA 2019-10-18 2019-10-18 Semi-supervised image classification method and system Active CN110781942B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910995951.XA CN110781942B (en) 2019-10-18 2019-10-18 Semi-supervised image classification method and system


Publications (2)

Publication Number Publication Date
CN110781942A CN110781942A (en) 2020-02-11
CN110781942B true CN110781942B (en) 2021-03-09

Family

ID=69386069

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910995951.XA Active CN110781942B (en) 2019-10-18 2019-10-18 Semi-supervised image classification method and system

Country Status (1)

Country Link
CN (1) CN110781942B (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108985342A * 2018-06-22 2018-12-11 华南理工大学 An imbalanced classification method based on deep reinforcement learning
CN108985204A * 2018-07-04 2018-12-11 北京师范大学珠海分校 Pedestrian detection and tracking method and device

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103886330B * 2014-03-27 2017-03-01 西安电子科技大学 Classification method based on semi-supervised SVM ensemble learning
CN106228183A * 2016-07-18 2016-12-14 北京邮电大学 A semi-supervised learning classification method and device
CN107392230A * 2017-06-22 2017-11-24 江南大学 A semi-supervised image classification method with maximized knowledge utilization capability
CN107392015B * 2017-07-06 2019-09-17 长沙学院 An intrusion detection method based on semi-supervised learning


Also Published As

Publication number Publication date
CN110781942A (en) 2020-02-11


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant