CN113076976B - Small sample image classification method based on local feature relation exploration - Google Patents


Info

Publication number: CN113076976B
Application number: CN202110287779.XA
Authority: CN (China)
Original language: Chinese (zh)
Other versions: CN113076976A
Inventors: 苏勤亮 (Su Qinliang), 陈佳星 (Chen Jiaxing)
Original and current assignee: Sun Yat-sen University
Legal status: Active (granted)
Prior art keywords: local, task, features, neural network, small sample
Application filed by Sun Yat-sen University; priority to CN202110287779.XA
Classifications

    • G06F18/241 — Pattern recognition; classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06F18/214 — Pattern recognition; generating training patterns; bootstrap methods, e.g. bagging or boosting
    • G06N3/045 — Neural networks; combinations of networks
    • G06N3/08 — Neural networks; learning methods


Abstract

The invention provides a small sample (few-shot) image classification method based on local feature relation exploration, which adopts a multi-level graph neural network. First, the relations between the local features within each image are mined by a local-level graph neural network, extracting more representative local semantic features of the image; then, the relations among the local semantic features of all samples in each task are explored by a task-level graph neural network, so as to learn more discriminative task-level local semantic features. Compared with previous graph-neural-network-based small sample learning methods, the multi-level local relation mining of this method makes the learned similarity between samples finer and more accurate, thereby improving small sample image classification performance.

Description

Small sample image classification method based on local feature relation exploration
Technical Field
The invention relates to the field of small sample image classification, in particular to a small sample image classification method based on local feature relation exploration.
Background
In recent years, deep learning has achieved great success in many applications, but it relies on massive amounts of training data. In some scenarios, such as cold-start problems in recommendation systems or medical problems, the available data resources are very scarce and the cost of acquiring such data is high. Under these scenarios, deep learning methods easily overfit and often fail to achieve good performance.
These problems have led more and more researchers to turn to small sample learning, which aims to learn new knowledge quickly from little data or limited resources. Small sample learning faces two major difficulties: first, the sample categories in the test dataset never appear in the training dataset; second, for those unseen categories in the test dataset, only a few labeled samples are available for training. Many methods have been proposed to address these difficulties, falling mainly into two groups: optimization-based methods and metric-learning-based methods. Optimization-based methods learn model parameters efficiently by adjusting the optimization strategy used to update the network. Metric-learning-based methods typically use a neural network to extract sample features and then predict by computing similarities between samples.
Since available data resources in the small sample learning problem are very scarce, it is important to fully exploit the relationships between samples. Because graph neural networks are well suited to exploring relationships among samples, many graph-neural-network-based methods have been proposed for small sample learning. However, existing graph-neural-network-based methods all explore only instance-level relationships among samples: they usually treat an image as a whole, learn a representation of the entire image, and compute the similarity between image representations, where the similarity is usually a scalar. Yet the relationship between two images tends to be more complex than a scalar can express. First, each image typically contains multiple local semantic features, such as a dog's fur, mouth, and eyes. Second, two images may be similar in some local features but dissimilar in others. Thus, if only one scalar is used to represent the similarity between two images, the computed similarity may be inaccurate.
Disclosure of Invention
The invention provides a highly accurate small sample image classification method based on local feature relation exploration.
In order to achieve the technical effects, the technical scheme of the invention is as follows:
a small sample image classification method based on local feature relation exploration comprises the following steps:
s1: extracting initial local features of the image by using a convolutional neural network;
s2: constructing a local level diagram by using the initial local features obtained in the step S1, and extracting the local semantic features of the image through a local level diagram neural network;
s3: and (3) constructing a task level graph by using the local semantic features of all the images obtained in the step (S2), exploring the relation among the local semantic features through a task level graph neural network, and applying the relation to small sample image classification.
Further, the specific process of step S1 is:
For an input image x_p, a feature map U of the image is first obtained through a convolutional neural network, as in formula (1); the feature map is then reshaped through a stretching operation to obtain m initial local features, as in formula (2):

U = Conv(x_p) ∈ R^(c×h×w) (1)
X_p = reshape(U) = {x_i ∈ R^c | i = 1, …, m} (2)

where Conv(·) denotes the convolutional neural network, and c, h and w denote the number of channels, height and width of the feature map, respectively; m denotes the number of initial local features, m = h×w, and c can be understood as the dimension of the initial local features.
Further, in step S2, for each image x_p, a local-level graph G_p^local is constructed using the initial local features obtained in step S1. Each node in the graph represents one local feature of the image, and each node is connected to its top k_l most similar nodes. The adjacency matrix A_p^local is computed as in formulas (3)(4):

d_i^(k_l) = d(x_i, x_i^(k_l)) (3)
A_p^local(i, j) = 1 if d(x_i, x_j) ≤ d_i^(k_l), and 0 otherwise (4)

where the function d(·,·) denotes the distance calculation formula, d_i^(k_l) denotes the distance between node v_i and x_i^(k_l), and x_i^(k_l) is the k_l-th nearest node to node v_i;
After constructing the local-level graph G_p^local from the initial local features of each image, G_p^local is input into the graph neural network, and the local features are updated by means of the adjacency relations among nodes, as in formula (5); the updated local features are then aggregated into r clusters by a graph pooling operation to obtain r local semantic features H_p, as in formulas (6)(7):

X'_p = GNN_local(A_p^local, X_p) = ReLU(A_p^local X_p W_local) (5)
S = softmax(X'_p W') (6)
H_p = S^T X'_p (7)

where GNN_local(·) denotes the local-level graph neural network, W_local is a network parameter, and ReLU(·) is a nonlinear activation function; S is the assignment matrix, S(i, j) denotes the probability that the i-th node belongs to the j-th cluster, W' is a randomly initialized parameter, and softmax(·) denotes the normalization operation; r and d_l denote the number and dimension of the learned local semantic features, respectively.
Further, in step S3, in small sample image classification each task contains N classes, each class having K labeled training samples and T unlabeled test samples. For each task, all local semantic features of all images it contains are denoted H, as in formula (8), and a task-level graph G^task is constructed using all local semantic features in the task, with each node connected to its top k_t most similar nodes. The adjacency matrix A_task of the task-level graph is computed as in formulas (9)(10), and the relations among the local semantic features are mined by the task-level graph neural network to obtain the task-level local features Z, as in formula (11):

H = {H_p | p = 1, …, N×K+T} = {h_i | i = 1, …, (N×K+T)×r} (8)
d_i^(k_t) = d(h_i, h_i^(k_t)) (9)
A_task(i, j) = 1 if d(h_i, h_j) ≤ d_i^(k_t), and 0 otherwise (10)
Z = GNN_task(A_task, H) = ReLU(A_task H W_task) (11)

where GNN_task(·) denotes the task-level graph neural network and W_task is a network parameter;
the obtained task-level local features Z are directly input into a classifier for classification, the probability that each local feature belongs to each category is calculated, and a sample q is tested i Is the kth local feature of (2)Probability belonging to the j-th class, as in equation (12):
wherein ,Cj Representing the center point of the j-th class, and calculating and averaging all local features in the class;
calculating the probability that each test sample belongs to each category, i.e. averaging the probabilities of all local features in the sample, as in equation (13):
calculating a loss value for each task as in equation (14):
wherein ,representing a cross entropy loss function; y is i Representative test sample q i Is a true generic of (c).
Compared with the prior art, the technical scheme of the invention has the beneficial effects that:
the method adopts a multi-level graph neural network, firstly, the relation between local features in each image is mined through the local level graph neural network, and more representative local semantic features of the image are extracted; and then, the relation among the local semantic features of all samples in each task is explored through a task-level graph neural network, so as to learn more distinguishable task-level local semantic features. Compared with the previous small sample learning method based on the graph neural network, the method of the invention ensures that the learned similarity among samples is finer and more accurate through multi-level local relation mining, thereby improving the performance of small sample image classification.
Drawings
FIG. 1 is a schematic diagram of a model of the present method;
fig. 2 is a schematic flow diagram of the local level graph neural network and the task level graph neural network in steps S2 and S3.
Detailed Description
The drawings are for illustrative purposes only and are not to be construed as limiting the present patent;
for the purpose of better illustrating the embodiments, certain elements of the drawings may be omitted, enlarged or reduced and do not represent the actual product dimensions;
it will be appreciated by those skilled in the art that certain well-known structures in the drawings and descriptions thereof may be omitted.
The technical scheme of the invention is further described below with reference to the accompanying drawings and examples.
As shown in fig. 1-2, a small sample image classification method based on local feature relation exploration comprises the following steps:
s1: extracting initial local features of the image by using a convolutional neural network;
s2: constructing a local level diagram by using the initial local features obtained in the step S1, and extracting the local semantic features of the image through a local level diagram neural network;
s3: and (3) constructing a task level graph by using the local semantic features of all the images obtained in the step (S2), exploring the relation among the local semantic features through a task level graph neural network, and applying the relation to small sample image classification.
The specific process of step S1 is:
For an input image x_p, a feature map U of the image is first obtained through a convolutional neural network, as in formula (1); the feature map is then reshaped through a stretching operation to obtain m initial local features, as in formula (2):

U = Conv(x_p) ∈ R^(c×h×w) (1)
X_p = reshape(U) = {x_i ∈ R^c | i = 1, …, m} (2)

where Conv(·) denotes the convolutional neural network, and c, h and w denote the number of channels, height and width of the feature map, respectively; m denotes the number of initial local features, m = h×w, and c can be understood as the dimension of the initial local features.
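The reshaping in step S1 can be sketched in NumPy (a hedged stand-in: the random array replaces real convolutional output, and all shape values are illustrative assumptions, not taken from the patent):

```python
import numpy as np

# A CNN backbone (not shown) would produce a feature map U of shape (c, h, w);
# reshaping it yields m = h*w initial local features of dimension c, one per
# spatial position. The random array here stands in for real CNN output.
rng = np.random.default_rng(0)
c, h, w = 64, 5, 5
U = rng.standard_normal((c, h, w))   # feature map from Conv(x_p)

m = h * w                            # number of initial local features
X_p = U.reshape(c, m).T              # (m, c): one row per local feature
print(X_p.shape)                     # (25, 64)
```

Each row of X_p then becomes one node of the local-level graph built in step S2.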
In step S2, for each image x_p, a local-level graph G_p^local is constructed using the initial local features obtained in step S1. Each node in the graph represents one local feature of the image, and each node is connected to its top k_l most similar nodes. The adjacency matrix A_p^local is computed as in formulas (3)(4):

d_i^(k_l) = d(x_i, x_i^(k_l)) (3)
A_p^local(i, j) = 1 if d(x_i, x_j) ≤ d_i^(k_l), and 0 otherwise (4)

where the function d(·,·) denotes the distance calculation formula, d_i^(k_l) denotes the distance between node v_i and x_i^(k_l), and x_i^(k_l) is the k_l-th nearest node to node v_i;
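The k-nearest-neighbour adjacency of formulas (3)(4) can be sketched as follows (an illustrative NumPy implementation; the function name, the choice of Euclidean distance, and the inclusion of self-loops are assumptions):

```python
import numpy as np

def knn_adjacency(X, k):
    """Connect each row of X to its k nearest rows (Euclidean distance)."""
    # pairwise distances between all local features
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    A = np.zeros_like(d)
    for i in range(len(X)):
        nn = np.argsort(d[i])[:k + 1]   # node itself (distance 0) plus k nearest
        A[i, nn] = 1.0
    return A

rng = np.random.default_rng(1)
X = rng.standard_normal((6, 4))          # 6 toy local features of dimension 4
A = knn_adjacency(X, k=2)
print(A.sum(axis=1))                     # each row has k+1 = 3 ones
```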
After constructing the local-level graph G_p^local from the initial local features of each image, G_p^local is input into the graph neural network, and the local features are updated by means of the adjacency relations among nodes, as in formula (5); the updated local features are then aggregated into r clusters by a graph pooling operation to obtain r local semantic features H_p, as in formulas (6)(7):

X'_p = GNN_local(A_p^local, X_p) = ReLU(A_p^local X_p W_local) (5)
S = softmax(X'_p W') (6)
H_p = S^T X'_p (7)

where GNN_local(·) denotes the local-level graph neural network, W_local is a network parameter, and ReLU(·) is a nonlinear activation function; S is the assignment matrix, S(i, j) denotes the probability that the i-th node belongs to the j-th cluster, W' is a randomly initialized parameter, and softmax(·) denotes the normalization operation; r and d_l denote the number and dimension of the learned local semantic features, respectively.
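Formulas (5)-(7), one graph-convolution update followed by soft assignment pooling, can be sketched as below (random matrices stand in for the learned parameters W_local and W'; all shape values are illustrative assumptions):

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

rng = np.random.default_rng(2)
m, c, d_l, r = 25, 64, 32, 4
A_local = np.eye(m)                      # trivial adjacency, for the demo only
X = rng.standard_normal((m, c))          # initial local features from step S1
W_local = rng.standard_normal((c, d_l)) * 0.1
W_prime = rng.standard_normal((d_l, r)) * 0.1

X_upd = relu(A_local @ X @ W_local)      # formula (5): graph convolution
S = softmax(X_upd @ W_prime, axis=1)     # formula (6): m x r soft assignments
H_p = S.T @ X_upd                        # formula (7): r local semantic features
print(H_p.shape)                         # (4, 32)
```

Each row of S sums to one, so every local feature is softly distributed over the r clusters before pooling.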
In step S3, in small sample image classification each task contains N classes, each class having K labeled training samples and T unlabeled test samples. For each task, all local semantic features of all images it contains are denoted H, as in formula (8), and a task-level graph G^task is constructed using all local semantic features in the task, with each node connected to its top k_t most similar nodes. The adjacency matrix A_task of the task-level graph is computed as in formulas (9)(10), and the relations among the local semantic features are mined by the task-level graph neural network to obtain the task-level local features Z, as in formula (11):

H = {H_p | p = 1, …, N×K+T} = {h_i | i = 1, …, (N×K+T)×r} (8)
d_i^(k_t) = d(h_i, h_i^(k_t)) (9)
A_task(i, j) = 1 if d(h_i, h_j) ≤ d_i^(k_t), and 0 otherwise (10)
Z = GNN_task(A_task, H) = ReLU(A_task H W_task) (11)

where GNN_task(·) denotes the task-level graph neural network and W_task is a network parameter;
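The task-level stage of formulas (8)-(11) can be sketched the same way (again with illustrative shapes; the random H stands in for the pooled per-image features, and W_task for a learned parameter):

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

rng = np.random.default_rng(3)
n_images, r, d_l = 6, 4, 32
# formula (8): stack the r semantic features of all n_images task members
H = rng.standard_normal((n_images * r, d_l))

# formulas (9)-(10): k_t-nearest-neighbour adjacency over all task nodes
d = np.linalg.norm(H[:, None, :] - H[None, :, :], axis=-1)
k_t = 3
A_task = np.zeros_like(d)
for i in range(len(H)):
    A_task[i, np.argsort(d[i])[:k_t + 1]] = 1.0

# formula (11): one task-level graph convolution
W_task = rng.standard_normal((d_l, d_l)) * 0.1
Z = relu(A_task @ H @ W_task)
print(Z.shape)                           # (24, 32)
```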
the obtained task-level local features Z are directly input into a classifier for classification, the probability that each local feature belongs to each category is calculated, and a sample q is tested i Is the kth local feature of (2)Probability belonging to the j-th class, as in equation (12):
wherein ,Cj Representing the center point of the j-th class, and calculating and averaging all local features in the class;
calculating the probability that each test sample belongs to each category, i.e. averaging the probabilities of all local features in the sample, as in equation (13):
calculating a loss value for each task as in equation (14):
wherein ,representing a cross entropy loss function; y is i Representative test sample q i Is a true generic of (c).
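The classification and loss of formulas (12)-(14) can be sketched under the usual prototypical-network reading, a softmax over negative distances to class centers; this exact functional form is an assumption consistent with, but not guaranteed by, the surrounding text, and the toy features are illustrative only:

```python
import numpy as np

rng = np.random.default_rng(4)
N, r, d_l = 3, 4, 8
C = rng.standard_normal((N, d_l))                  # class centers C_j
# query's r task-level local features, placed near class 1 on purpose
Zq = C[1] + 0.01 * rng.standard_normal((r, d_l))

dist = np.linalg.norm(Zq[:, None, :] - C[None, :, :], axis=-1)   # (r, N)
e = np.exp(-dist)
p_local = e / e.sum(axis=1, keepdims=True)         # formula (12), per feature
p = p_local.mean(axis=0)                           # formula (13), per sample
loss = -np.log(p[1])                               # one CE term of formula (14)
print(p.argmax())                                  # 1
```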
In this embodiment, two common datasets, miniImageNet and tieredImageNet, are used, both of which are subsets of ImageNet. The miniImageNet dataset contains 100 classes with 600 images per class; the tieredImageNet dataset contains 608 classes drawn from 34 superclasses.
The method comprises the following specific steps:
firstly, constructing a convolutional neural network, inputting data into the convolutional neural network, and extracting initial local features of an image.
And secondly, constructing a local level graph neural network, constructing a local level graph by using the initial local features of each picture, inputting the local level graph into the local level graph neural network, and extracting the local semantic features of the image by mining the relation among the initial local features.
Thirdly, a task level graph neural network is built, a task level graph is constructed by using local semantic features of all pictures in each task, the task level graph is input into the task level graph neural network, and task level local features are obtained by mining relations among the local semantic features; and finally, inputting the task-level local features into a classifier for classification, and verifying classification results.
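Putting the three steps together, a toy episode can be sketched end to end (random arrays stand in for CNN output; every dimension, parameter, and helper name here is an illustrative assumption, not the patent's implementation):

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def knn_adjacency(X, k):
    # connect each node to itself and its k nearest neighbours
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    A = np.zeros_like(d)
    for i in range(len(X)):
        A[i, np.argsort(d[i])[:k + 1]] = 1.0
    return A

rng = np.random.default_rng(5)
N, K, T, m, c, r = 2, 1, 1, 9, 16, 3       # 2-way 1-shot task with one query
n_img = N * K + T
feats = rng.standard_normal((n_img, m, c))  # step 1: stand-in CNN local features

W_local = rng.standard_normal((c, c)) * 0.1
W_prime = rng.standard_normal((c, r)) * 0.1
H = []
for X in feats:                             # step 2: per-image local-level graph
    A = knn_adjacency(X, 3)
    Xu = relu(A @ X @ W_local)
    e = np.exp(Xu @ W_prime)
    S = e / e.sum(axis=1, keepdims=True)    # soft assignment into r clusters
    H.append(S.T @ Xu)
H = np.concatenate(H)                       # all task nodes, (n_img * r, c)

A_task = knn_adjacency(H, 3)                # step 3: task-level graph
W_task = rng.standard_normal((c, c)) * 0.1
Z = relu(A_task @ H @ W_task)               # task-level local features
print(Z.shape)                              # (9, 16)
```

Z would then feed the distance-based classifier described above.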
The same or similar reference numerals correspond to the same or similar components;
the positional relationship depicted in the drawings is for illustrative purposes only and is not to be construed as limiting the present patent;
it is to be understood that the above examples of the present invention are provided by way of illustration only and not by way of limitation of the embodiments of the present invention. Other variations or modifications of the above teachings will be apparent to those of ordinary skill in the art. It is not necessary here nor is it exhaustive of all embodiments. Any modification, equivalent replacement, improvement, etc. which come within the spirit and principles of the invention are desired to be protected by the following claims.

Claims (8)

1. The small sample image classification method based on local feature relation exploration is characterized by comprising the following steps of:
s1: extracting initial local features of the image by using a convolutional neural network;
s2: constructing a local level diagram by using the initial local features obtained in the step S1, and extracting the local semantic features of the image through a local level diagram neural network;
s3: constructing a task level graph by using the local semantic features of all the images obtained in the step S2, exploring the relation among the local semantic features through a task level graph neural network, and applying the relation to small sample image classification;
in the step S2, for each image x_p, a local-level graph G_p^local is constructed using the initial local features obtained in step S1; each node in the graph represents one local feature of the image, each node is connected to its top k_l most similar nodes, and the adjacency matrix A_p^local is computed as in formulas (3)(4):

d_i^(k_l) = d(x_i, x_i^(k_l)) (3)
A_p^local(i, j) = 1 if d(x_i, x_j) ≤ d_i^(k_l), and 0 otherwise (4)

where the function d(·,·) denotes the distance calculation formula, d_i^(k_l) denotes the distance between node v_i and x_i^(k_l), and x_i^(k_l) is the k_l-th nearest node to node v_i;
in the step S3, in small sample image classification each task contains N classes, each class having K labeled training samples and T unlabeled test samples; for each task, all local semantic features of all images it contains are denoted H, as in formula (8), and a task-level graph G^task is constructed using all local semantic features in the task, with each node connected to its top k_t most similar nodes; the adjacency matrix A_task of the task-level graph is computed as in formulas (9)(10), and the relations among the local semantic features are mined by the task-level graph neural network to obtain the task-level local features Z, as in formula (11):

H = {H_p | p = 1, …, N×K+T} = {h_i | i = 1, …, (N×K+T)×r} (8)
d_i^(k_t) = d(h_i, h_i^(k_t)) (9)
A_task(i, j) = 1 if d(h_i, h_j) ≤ d_i^(k_t), and 0 otherwise (10)
Z = GNN_task(A_task, H) = ReLU(A_task H W_task) (11)

where GNN_task(·) denotes the task-level graph neural network and W_task is a network parameter.
2. The small sample image classification method based on local feature relation exploration according to claim 1, wherein the specific process of step S1 is:
for an input image x_p, a feature map U of the image is first obtained through a convolutional neural network, as in formula (1), and the feature map is reshaped through a stretching operation to obtain m initial local features, as in formula (2):

U = Conv(x_p) ∈ R^(c×h×w) (1)
X_p = reshape(U) = {x_i ∈ R^c | i = 1, …, m} (2)

where Conv(·) denotes the convolutional neural network, and c, h and w denote the number of channels, height and width of the feature map, respectively; m denotes the number of initial local features.
3. The small sample image classification method based on local feature relation exploration according to claim 1, wherein in said step S2, after constructing the local-level graph G_p^local from the initial local features of each image, G_p^local is input into the graph neural network, and the local features are updated by means of the adjacency relations among nodes, as in formula (5); the updated local features are then aggregated into r clusters by a graph pooling operation to obtain r local semantic features H_p, as in formulas (6)(7):

X'_p = GNN_local(A_p^local, X_p) = ReLU(A_p^local X_p W_local) (5)
S = softmax(X'_p W') (6)
H_p = S^T X'_p (7)

where GNN_local(·) denotes the local-level graph neural network, W_local is a network parameter, and ReLU(·) is a nonlinear activation function; S is the assignment matrix, S(i, j) denotes the probability that the i-th node belongs to the j-th cluster, W' is a randomly initialized parameter, and softmax(·) denotes the normalization operation; r and d_l denote the number and dimension of the learned local semantic features, respectively.
4. The small sample image classification method based on local feature relation exploration according to claim 1, wherein in the step S3, the obtained task-level local features Z are directly input into a classifier for classification, and the probability that each local feature belongs to each class is calculated; for the k-th local feature z_k^(q_i) of test sample q_i, the probability of belonging to the j-th class is given by formula (12):

P(y = j | z_k^(q_i)) = exp(−d(z_k^(q_i), C_j)) / Σ_j' exp(−d(z_k^(q_i), C_j')) (12)

where C_j denotes the center point of the j-th class, computed by averaging all local features of that class.
5. The small sample image classification method based on local feature relation exploration according to claim 4, wherein in the step S3, the probability that each test sample belongs to each class is calculated by averaging the probabilities of all local features in the sample, as in formula (13):

P(y = j | q_i) = (1/r) Σ_{k=1}^{r} P(y = j | z_k^(q_i)) (13)
6. The small sample image classification method based on local feature relation exploration according to claim 5, wherein in step S3, the loss value of each task is calculated as in formula (14):

L = Σ_{i=1}^{T} CE(P(y | q_i), y_i) (14)

where CE(·,·) denotes the cross-entropy loss function and y_i denotes the true class of test sample q_i.
7. The small sample image classification method based on local feature relation exploration according to any of claims 2-6, characterized in that m = h x w.
8. The small sample image classification method based on local feature relation exploration of claim 2, wherein c is understood as the dimension of the initial local feature.
Application CN202110287779.XA — priority date 2021-03-17, filing date 2021-03-17 — Small sample image classification method based on local feature relation exploration — Active — CN113076976B (en)

Priority Applications (1)

Application CN202110287779.XA — priority date 2021-03-17, filing date 2021-03-17 — Small sample image classification method based on local feature relation exploration


Publications (2)

CN113076976A — published 2021-07-06
CN113076976B — granted and published 2023-08-18

Family

ID=76612760

Family Applications (1)

Application CN202110287779.XA — Active — CN113076976B — Small sample image classification method based on local feature relation exploration

Country Status (1)

CN — CN113076976B (en)

Citations (4)

* Cited by examiner, † Cited by third party

CN109961089A * — priority 2019-02-26, published 2019-07-02 — Sun Yat-sen University — Small sample and zero sample image classification method based on metric learning and meta learning
CN111210435A * — priority 2019-12-24, published 2020-05-29 — Chongqing University of Posts and Telecommunications — Image semantic segmentation method based on local and global feature enhancement module
CN111858991A * — priority 2020-08-06, published 2020-10-30 — Nanjing University — Small sample learning algorithm based on covariance measurement
CN112308115A * — priority 2020-09-25, published 2021-02-02 — Anhui University of Technology — Multi-label image deep learning classification method and equipment


Also Published As

CN113076976A — published 2021-07-06


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant