CN113076976A - Small sample image classification method based on local feature relation exploration - Google Patents

Small sample image classification method based on local feature relation exploration

Info

Publication number: CN113076976A
Application number: CN202110287779.XA
Authority: CN (China)
Prior art keywords: local, task, features, neural network, small sample
Legal status: Granted; Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Other languages: Chinese (zh)
Other versions: CN113076976B
Inventors: 苏勤亮, 陈佳星
Current assignee: Sun Yat-sen University (the listed assignees may be inaccurate; Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list)
Original assignee: Sun Yat-sen University
Priority and filing date: 2021-03-17 (the priority date is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed)
Publication date: 2021-07-06 (CN113076976A); application granted and published as CN113076976B on 2023-08-18


Classifications

    • G06F 18/241: Pattern recognition; Analysing; Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06F 18/214: Pattern recognition; Design or setup of recognition systems or techniques; Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • G06N 3/045: Computing arrangements based on biological models; Neural networks; Architecture, e.g. interconnection topology; Combinations of networks
    • G06N 3/08: Computing arrangements based on biological models; Neural networks; Learning methods

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Evolutionary Computation (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Computational Linguistics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Evolutionary Biology (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Image Analysis (AREA)

Abstract

The invention provides a small sample image classification method based on local feature relation exploration, which adopts a multi-level graph neural network. First, the relations among the local features within each image are mined by a local-level graph neural network to extract more representative local semantic features of the image; then, the relations among the local semantic features of all samples in each task are explored by a task-level graph neural network to learn more discriminative task-level local semantic features. Compared with existing graph-neural-network-based small sample learning methods, the multi-level local relation mining makes the learned similarity between samples finer-grained and more accurate, thereby improving small sample image classification performance.

Description

Small sample image classification method based on local feature relation exploration
Technical Field
The invention relates to the field of small sample image classification, in particular to a small sample image classification method based on local feature relation exploration.
Background
In recent years, deep learning has achieved great success in many applications, but it relies on massive amounts of training data. In some scenarios, such as the cold-start problem in recommendation systems or medical applications, the available data are very scarce and costly to acquire. In these scenarios, deep learning methods therefore overfit easily and may not achieve good performance.
These problems have drawn more and more researchers' attention to small sample learning, whose goal is to learn new knowledge quickly from very little data or limited resources. Small sample learning faces two main difficulties: first, the sample classes in the test set never appear in the training set; second, for those unseen classes only a few labeled samples are available for training. Many methods have been proposed to address these difficulties, and they currently fall into two main groups: optimization-based methods and metric-learning-based methods. Optimization-based methods learn model parameters efficiently by improving the optimization strategy used to train the network. Metric-learning-based methods usually use a neural network to extract sample features and then compute the similarity between samples to make predictions.
Because the available data are so scarce in small sample learning, it is very important to fully mine and exploit the relationships among samples. Since graph neural networks are well suited to exploring such relationships, many graph-neural-network-based methods have been proposed for small sample learning. However, existing graph-neural-network-based methods all explore instance-level relationships between samples: they treat an image as a whole, learn a representation of the entire image, and then compute the similarity between image representations, which is usually a scalar. The relationship between two images, however, is often more complex than a single scalar can express. First, each image usually contains multiple local semantic features, such as a dog's fur, mouth and eyes. Second, two images may be similar in some local features but not in others. Therefore, representing the similarity between two images by a single scalar can make the computed similarity inaccurate.
Disclosure of Invention
The invention provides a small sample image classification method with high accuracy based on local feature relation exploration.
In order to achieve this technical effect, the technical solution of the invention is as follows:
a small sample image classification method based on local feature relation exploration comprises the following steps:
s1: extracting initial local features of the image by using a convolutional neural network;
s2: constructing a local level graph by using the initial local features obtained in the step S1, and extracting local semantic features of the image through a neural network of the local level graph;
s3: and (4) constructing a task-level graph by using the local semantic features of all the images obtained in the step (S2), researching the relation among the local semantic features through a task-level graph neural network, and applying the relation to small sample image classification.
Further, the specific process of step S1 is:
for an input picture xpFirstly, obtaining a feature map U of the image through a convolutional neural network, as shown in formula (1), and deforming the feature map of the image through a stretching operation to obtain m initial local features, as shown in formula (2):
Figure BDA0002981198730000021
Figure BDA0002981198730000022
wherein Conv (·) represents a convolutional neural network, and c, h and w represent the number of channels, height and width of the feature map, respectively; m denotes the number of initial local features, and m — h × w, c can be understood as the dimension of the initial local features.
Further, in step S2, for each image x_p a local-level graph G_p^local is constructed using the initial local features obtained in step S1. Each node in the graph represents a local feature of the image, and each node is connected to the k_l nodes most similar to it. The adjacency matrix A_p^local is calculated as in formulas (3) and (4):

A_p^local(i, j) = 1, if d(v_i, v_j) ≤ d(v_i, v_i^(k_l))    (3)

A_p^local(i, j) = 0, otherwise    (4)

wherein the function d(·,·) denotes a distance measure, d(v_i, v_j) is the distance between nodes v_i and v_j, and v_i^(k_l) is the k_l-th nearest node to v_i;
After the local-level graph G_p^local has been constructed from the initial local features of each picture, G_p^local is input into a graph neural network and the local features are updated by means of the adjacency relations between nodes, as in formula (5); a graph pooling operation then aggregates the local features into r clusters to obtain r local semantic features H_p, as in formulas (6) and (7):

V_p' = GNN_local(A_p^local, V_p) = ReLU(A_p^local V_p W_local)    (5)

S = softmax(V_p' W')    (6)

H_p = S^T V_p'    (7)

wherein GNN_local(·) represents the local-level graph neural network, V_p denotes the matrix stacking the m initial local features of x_p as graph nodes, W_local is a network parameter, and ReLU(·) is a nonlinear activation function; S is an assignment matrix, S(i, j) represents the probability that the i-th node belongs to the j-th cluster, W' is a randomly initialized parameter, and softmax(·) denotes the normalization operation; r and d_l respectively denote the number and the dimension of the learned local semantic features.
Further, in step S3, in small sample image classification each task includes N classes, each class has K labeled training samples and T unlabeled test samples. For each task, all local semantic features of all the images it contains are denoted H, as in formula (8), and a task-level graph G^task is constructed from all local semantic features in the task, each node being connected to the k_t nodes most similar to it. The adjacency matrix of the task-level graph is calculated as in formulas (9) and (10), the relations between the local semantic features are mined through the task-level graph neural network, and the task-level local features Z are obtained as in formula (11):

H = {H_p | p = 1, …, N×K+T} = {h_i | i = 1, …, (N×K+T)×r}    (8)

A^task(i, j) = 1, if d(h_i, h_j) ≤ d(h_i, h_i^(k_t))    (9)

A^task(i, j) = 0, otherwise    (10)

Z = GNN_task(A^task, H) = ReLU(A^task H W_task)    (11)

wherein GNN_task(·) represents the task-level graph neural network and W_task is a network parameter;
The obtained task-level local features Z are input directly into a classifier for classification, and the probability of each local feature belonging to each class is calculated; the probability that the k-th local feature z_{q_i}^k of a test sample q_i belongs to the j-th class is given by formula (12):

P(y = j | z_{q_i}^k) = exp(-d(z_{q_i}^k, C_j)) / Σ_{j'=1}^{N} exp(-d(z_{q_i}^k, C_{j'}))    (12)

wherein C_j represents the center point of the j-th class, computed by averaging all local features belonging to that class;

The probability that each test sample belongs to each class is calculated by averaging the probabilities of all the local features in the sample, as in formula (13):

P(y = j | q_i) = (1/r) Σ_{k=1}^{r} P(y = j | z_{q_i}^k)    (13)

The loss value of each task is calculated as in formula (14):

L = Σ_{i=1}^{T} L_CE(P(· | q_i), y_i)    (14)

wherein L_CE represents the cross entropy loss function and y_i is the true class label of test sample q_i.
Compared with the prior art, the technical solution of the invention has the following beneficial effects:
The method adopts a multi-level graph neural network. First, the relations among the local features within each image are mined by a local-level graph neural network to extract more representative local semantic features of the image; then, the relations among the local semantic features of all samples in each task are explored by a task-level graph neural network to learn more discriminative task-level local semantic features. Compared with existing graph-neural-network-based small sample learning methods, the multi-level local relation mining makes the learned similarity between samples finer-grained and more accurate, thereby improving small sample image classification performance.
Drawings
FIG. 1 is a schematic diagram of the model of the present method;
FIG. 2 is a schematic flow chart of the local-level graph neural network and the task-level graph neural network in steps S2 and S3.
Detailed Description
The drawings are for illustrative purposes only and are not to be construed as limiting the patent;
for the purpose of better illustrating the embodiments, certain features of the drawings may be omitted, enlarged or reduced, and do not represent the size of an actual product;
it will be understood by those skilled in the art that certain well-known structures in the drawings and descriptions thereof may be omitted.
The technical solution of the present invention is further described below with reference to the accompanying drawings and examples.
As shown in FIGS. 1-2, a small sample image classification method based on local feature relation exploration includes the following steps:
S1: extracting initial local features of the image by using a convolutional neural network;
S2: constructing a local-level graph using the initial local features obtained in step S1, and extracting local semantic features of the image through a local-level graph neural network;
S3: constructing a task-level graph using the local semantic features of all images obtained in step S2, exploring the relations among the local semantic features through a task-level graph neural network, and applying them to small sample image classification.
The specific process of step S1 is:
for an input picture xpFirstly, obtaining a feature map U of the image through a convolutional neural network, as shown in formula (1), and deforming the feature map of the image through a stretching operation to obtain m initial local features, as shown in formula (2):
Figure BDA0002981198730000041
Figure BDA0002981198730000042
wherein Conv (·) represents a convolutional neural network, and c, h and w represent the number of channels, height and width of the feature map, respectively; m denotes the number of initial local features, and m — h × w, c can be understood as the dimension of the initial local features.
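The following is a minimal PyTorch-style sketch of step S1; the four-block Conv-64 backbone, the channel count c = 64 and all class and variable names are illustrative assumptions rather than requirements of the invention. It computes the feature map U of formula (1) and flattens it into the m = h×w initial local features of formula (2).

```python
import torch
import torch.nn as nn

class LocalFeatureExtractor(nn.Module):
    """Step S1: CNN backbone producing m = h*w initial local features per image.

    The four-block Conv-64 backbone below is only an illustrative choice;
    the method merely requires some convolutional neural network Conv(.).
    """
    def __init__(self, in_channels: int = 3, c: int = 64):
        super().__init__()
        def block(cin, cout):
            return nn.Sequential(
                nn.Conv2d(cin, cout, 3, padding=1),
                nn.BatchNorm2d(cout),
                nn.ReLU(),
                nn.MaxPool2d(2),
            )
        self.conv = nn.Sequential(block(in_channels, c), block(c, c),
                                  block(c, c), block(c, c))

    def forward(self, x_p: torch.Tensor) -> torch.Tensor:
        # x_p: (B, 3, H, W) -> U: (B, c, h, w), formula (1)
        U = self.conv(x_p)
        B, c, h, w = U.shape
        # flatten U into m = h*w local features of dimension c, formula (2)
        return U.view(B, c, h * w).transpose(1, 2)   # (B, m, c)
```

For example, with 84×84 inputs this backbone yields h = w = 5, i.e. m = 25 initial local features of dimension 64 per image.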
In step S2, for each image x_p a local-level graph G_p^local is constructed using the initial local features obtained in step S1. Each node in the graph represents a local feature of the image, and each node is connected to the k_l nodes most similar to it. The adjacency matrix A_p^local is calculated as in formulas (3) and (4):

A_p^local(i, j) = 1, if d(v_i, v_j) ≤ d(v_i, v_i^(k_l))    (3)

A_p^local(i, j) = 0, otherwise    (4)

wherein the function d(·,·) denotes a distance measure, d(v_i, v_j) is the distance between nodes v_i and v_j, and v_i^(k_l) is the k_l-th nearest node to v_i;
After the local-level graph G_p^local has been constructed from the initial local features of each picture, G_p^local is input into a graph neural network and the local features are updated by means of the adjacency relations between nodes, as in formula (5); a graph pooling operation then aggregates the local features into r clusters to obtain r local semantic features H_p, as in formulas (6) and (7):

V_p' = GNN_local(A_p^local, V_p) = ReLU(A_p^local V_p W_local)    (5)

S = softmax(V_p' W')    (6)

H_p = S^T V_p'    (7)

wherein GNN_local(·) represents the local-level graph neural network, V_p denotes the matrix stacking the m initial local features of x_p as graph nodes, W_local is a network parameter, and ReLU(·) is a nonlinear activation function; S is an assignment matrix, S(i, j) represents the probability that the i-th node belongs to the j-th cluster, W' is a randomly initialized parameter, and softmax(·) denotes the normalization operation; r and d_l respectively denote the number and the dimension of the learned local semantic features.
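A minimal sketch of one local-level GNN layer and the graph pooling of formulas (5) to (7) is given below, assuming a single propagation layer and the soft assignment S = softmax(V'W'); the module and parameter names are illustrative.

```python
import torch
import torch.nn as nn

class LocalGraphModule(nn.Module):
    """Step S2: one local-level GNN layer (formula (5)) followed by graph
    pooling into r clusters (formulas (6)-(7))."""
    def __init__(self, c: int, d_l: int, r: int):
        super().__init__()
        self.W_local = nn.Linear(c, d_l, bias=False)    # W_local in formula (5)
        self.W_assign = nn.Linear(d_l, r, bias=False)   # randomly initialized W'
        self.relu = nn.ReLU()

    def forward(self, V: torch.Tensor, A: torch.Tensor) -> torch.Tensor:
        # V: (m, c) initial local features, A: (m, m) local adjacency
        V_new = self.relu(A @ self.W_local(V))            # formula (5)
        S = torch.softmax(self.W_assign(V_new), dim=-1)   # (m, r), formula (6)
        H_p = S.transpose(0, 1) @ V_new                   # (r, d_l), formula (7)
        return H_p
```

H_p gathers the r local semantic features of image x_p; stacking H_p over the N×K+T images of a task yields the set H of formula (8).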
In step S3, in small sample image classification each task includes N classes, each class has K labeled training samples and T unlabeled test samples. For each task, all local semantic features of all the images it contains are denoted H, as in formula (8), and a task-level graph G^task is constructed from all local semantic features in the task, each node being connected to the k_t nodes most similar to it. The adjacency matrix of the task-level graph is calculated as in formulas (9) and (10), the relations between the local semantic features are mined through the task-level graph neural network, and the task-level local features Z are obtained as in formula (11):

H = {H_p | p = 1, …, N×K+T} = {h_i | i = 1, …, (N×K+T)×r}    (8)

A^task(i, j) = 1, if d(h_i, h_j) ≤ d(h_i, h_i^(k_t))    (9)

A^task(i, j) = 0, otherwise    (10)

Z = GNN_task(A^task, H) = ReLU(A^task H W_task)    (11)

wherein GNN_task(·) represents the task-level graph neural network and W_task is a network parameter;
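The task-level step of formula (11) then amounts to one more GNN layer over all local semantic features of the task, with the adjacency built by the same k-nearest-neighbour rule as formulas (9) and (10). The sketch below takes the precomputed adjacency as input; names and the defaults are assumptions.

```python
import torch
import torch.nn as nn

class TaskGraphModule(nn.Module):
    """Step S3, first half: task-level GNN over all local semantic features
    of one task (formula (11))."""
    def __init__(self, d_l: int, d_t: int, k_t: int = 10):
        super().__init__()
        self.W_task = nn.Linear(d_l, d_t, bias=False)   # W_task in formula (11)
        self.relu = nn.ReLU()
        self.k_t = k_t                                   # neighbours per node, formulas (9)-(10)

    def forward(self, H: torch.Tensor, A_task: torch.Tensor) -> torch.Tensor:
        # H: ((N*K+T)*r, d_l) local semantic features of the task, formula (8)
        # A_task: kNN adjacency of the task-level graph, formulas (9)-(10)
        return self.relu(A_task @ self.W_task(H))        # Z, formula (11)
```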
The obtained task-level local features Z are input directly into a classifier for classification, and the probability of each local feature belonging to each class is calculated; the probability that the k-th local feature z_{q_i}^k of a test sample q_i belongs to the j-th class is given by formula (12):

P(y = j | z_{q_i}^k) = exp(-d(z_{q_i}^k, C_j)) / Σ_{j'=1}^{N} exp(-d(z_{q_i}^k, C_{j'}))    (12)

wherein C_j represents the center point of the j-th class, computed by averaging all local features belonging to that class;

The probability that each test sample belongs to each class is calculated by averaging the probabilities of all the local features in the sample, as in formula (13):

P(y = j | q_i) = (1/r) Σ_{k=1}^{r} P(y = j | z_{q_i}^k)    (13)

The loss value of each task is calculated as in formula (14):

L = Σ_{i=1}^{T} L_CE(P(· | q_i), y_i)    (14)

wherein L_CE represents the cross entropy loss function and y_i is the true class label of test sample q_i.
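A sketch of the classifier and loss of formulas (12) to (14) follows. The softmax over negative distances to the class centers C_j is an assumption consistent with, but not literally fixed by, the text, which only states that C_j is the mean of the class's local features and that per-sample probabilities are averages over local features.

```python
import torch
import torch.nn.functional as F

def classify_and_loss(Z_support, y_support, Z_query, y_query):
    """Formulas (12)-(14) on the task-level local features Z.

    Z_support: (N*K, r, d_t) local features of the labeled samples
    y_support: (N*K,) class indices 0..N-1
    Z_query:   (T, r, d_t) local features of the test samples
    y_query:   (T,) true labels, used only for the loss
    """
    N = int(y_support.max().item()) + 1
    d_t = Z_support.size(-1)
    # C_j: mean of all support local features of class j
    centers = torch.stack([Z_support[y_support == j].reshape(-1, d_t).mean(0)
                           for j in range(N)])                      # (N, d_t)
    # formula (12): per local feature, softmax over negative distances to C_j
    dists = torch.cdist(Z_query.reshape(-1, d_t), centers)          # (T*r, N)
    p_local = torch.softmax(-dists, dim=-1).reshape(Z_query.size(0), -1, N)
    # formula (13): average over the r local features of each test sample
    p_sample = p_local.mean(dim=1)                                   # (T, N)
    # formula (14): cross entropy over the task's test samples
    loss = F.nll_loss(torch.log(p_sample + 1e-8), y_query)
    return p_sample, loss
```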
This embodiment employs two commonly used datasets, mini-ImageNet and tiered-ImageNet, both of which are subsets of ImageNet. The mini-ImageNet dataset comprises 100 categories, each containing 600 pictures; the tiered-ImageNet dataset contains 608 classes drawn from 34 super-classes.
The method comprises the following specific steps:
Firstly, building a convolutional neural network, inputting the data into the convolutional neural network, and extracting the initial local features of the images.
Secondly, building a local-level graph neural network, constructing a local-level graph from the initial local features of each picture, inputting it into the local-level graph neural network, and extracting the local semantic features of the image by mining the relations among the initial local features.
Thirdly, building a task-level graph neural network, constructing a task-level graph from the local semantic features of all pictures in each task, inputting it into the task-level graph neural network, and obtaining task-level local features by mining the relations among the local semantic features; finally, inputting the task-level local features into a classifier for classification, and verifying the classification result.
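Putting the sketches together, one training episode could be wired as follows; everything here (names, the hyperparameters k_l, k_t and r, and the reuse of the helpers defined above) is hypothetical glue code, not part of the patent text.

```python
import torch

# Hypothetical wiring of the sketches above for one N-way K-shot episode.
extractor = LocalFeatureExtractor(c=64)
local_gnn = LocalGraphModule(c=64, d_l=64, r=8)         # r local semantic features per image
task_gnn  = TaskGraphModule(d_l=64, d_t=64, k_t=10)

def embed_task(images: torch.Tensor, k_l: int = 5) -> torch.Tensor:
    """images: (N*K+T, 3, H, W) -> task-level local features Z: (N*K+T, r, d_t)."""
    V = extractor(images)                                         # (B, m, c), step S1
    H = torch.stack([local_gnn(v, knn_adjacency(v, k_l))          # (B, r, d_l), step S2
                     for v in V])
    H_flat = H.reshape(-1, H.size(-1))                            # formula (8)
    Z = task_gnn(H_flat, knn_adjacency(H_flat, task_gnn.k_t))     # step S3
    return Z.reshape(H.size(0), H.size(1), -1)

# Z = embed_task(episode_images)
# probs, loss = classify_and_loss(Z[:N * K], y_support, Z[N * K:], y_query)
# loss.backward(); optimizer.step()
```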
The same or similar reference numerals correspond to the same or similar parts;
the positional relationships depicted in the drawings are for illustrative purposes only and are not to be construed as limiting the present patent;
it should be understood that the above-described embodiments of the present invention are merely examples for clearly illustrating the present invention, and are not intended to limit the embodiments of the present invention. Other variations and modifications will be apparent to persons skilled in the art in light of the above description. And are neither required nor exhaustive of all embodiments. Any modification, equivalent replacement, and improvement made within the spirit and principle of the present invention should be included in the protection scope of the claims of the present invention.

Claims (10)

1. A small sample image classification method based on local feature relation exploration, characterized by comprising the following steps:
S1: extracting initial local features of the image by using a convolutional neural network;
S2: constructing a local-level graph using the initial local features obtained in step S1, and extracting local semantic features of the image through a local-level graph neural network;
S3: constructing a task-level graph using the local semantic features of all images obtained in step S2, exploring the relations among the local semantic features through a task-level graph neural network, and applying them to small sample image classification.
2. The small sample image classification method based on local feature relation exploration according to claim 1, wherein the specific process of step S1 is:
for an input picture x_p, a feature map U is first obtained through a convolutional neural network, as in formula (1), and the feature map is reshaped by a flattening operation to obtain m initial local features, as in formula (2):

U = Conv(x_p), U ∈ R^(c×h×w)    (1)

{u_1, u_2, …, u_m} = reshape(U), u_i ∈ R^c    (2)

wherein Conv(·) represents a convolutional neural network, c, h and w represent the number of channels, the height and the width of the feature map, respectively, and m represents the number of initial local features.
3. The small sample image classification method based on local feature relation exploration according to claim 2, wherein in step S2, for each image x_p a local-level graph G_p^local is constructed using the initial local features obtained in step S1; each node in the graph represents a local feature of the image, each node is connected to the k_l nodes most similar to it, and the adjacency matrix A_p^local is calculated as in formulas (3) and (4):

A_p^local(i, j) = 1, if d(v_i, v_j) ≤ d(v_i, v_i^(k_l))    (3)

A_p^local(i, j) = 0, otherwise    (4)

wherein the function d(·,·) denotes a distance measure, d(v_i, v_j) is the distance between nodes v_i and v_j, and v_i^(k_l) is the k_l-th nearest node to v_i.
4. The small sample image classification method based on local feature relation exploration according to claim 3, wherein in step S2, after the local-level graph G_p^local is constructed from the initial local features of each picture, G_p^local is input into a graph neural network and the local features are updated by means of the adjacency relations between nodes, as in formula (5); a graph pooling operation then aggregates the local features into r clusters to obtain r local semantic features H_p, as in formulas (6) and (7):

V_p' = GNN_local(A_p^local, V_p) = ReLU(A_p^local V_p W_local)    (5)

S = softmax(V_p' W')    (6)

H_p = S^T V_p'    (7)

wherein GNN_local(·) represents the local-level graph neural network, V_p stacks the m initial local features as graph nodes, W_local is a network parameter, and ReLU(·) is a nonlinear activation function; S is an assignment matrix, S(i, j) represents the probability that the i-th node belongs to the j-th cluster, W' is a randomly initialized parameter, and softmax(·) denotes the normalization operation; r and d_l respectively denote the number and the dimension of the learned local semantic features.
5. The small sample image classification method based on local feature relation exploration according to claim 4, wherein in step S3, in small sample image classification each task includes N classes, each class has K labeled training samples and T unlabeled test samples; for each task, all local semantic features of all the images it contains are denoted H, as in formula (8), and a task-level graph G^task is constructed from all local semantic features in the task, each node being connected to the k_t nodes most similar to it; the adjacency matrix of the task-level graph is calculated as in formulas (9) and (10), and the relations between the local semantic features are mined through the task-level graph neural network to obtain the task-level local features Z, as in formula (11):

H = {H_p | p = 1, …, N×K+T} = {h_i | i = 1, …, (N×K+T)×r}    (8)

A^task(i, j) = 1, if d(h_i, h_j) ≤ d(h_i, h_i^(k_t))    (9)

A^task(i, j) = 0, otherwise    (10)

Z = GNN_task(A^task, H) = ReLU(A^task H W_task)    (11)

wherein GNN_task(·) represents the task-level graph neural network and W_task is a network parameter.
6. The small sample image classification method based on local feature relation exploration according to claim 5, wherein in step S3, the obtained task-level local features Z are input directly into a classifier for classification, and the probability of each local feature belonging to each class is calculated; the probability that the k-th local feature z_{q_i}^k of a test sample q_i belongs to the j-th class is given by formula (12):

P(y = j | z_{q_i}^k) = exp(-d(z_{q_i}^k, C_j)) / Σ_{j'=1}^{N} exp(-d(z_{q_i}^k, C_{j'}))    (12)

wherein C_j represents the center point of the j-th class, computed by averaging all local features belonging to that class.
7. The small sample image classification method based on local feature relation exploration according to claim 6, wherein in step S3, the probability that each test sample belongs to each class is calculated by averaging the probabilities of all local features in the sample, as in formula (13):

P(y = j | q_i) = (1/r) Σ_{k=1}^{r} P(y = j | z_{q_i}^k)    (13)
8. The small sample image classification method based on local feature relation exploration according to claim 7, wherein in step S3, the loss value of each task is calculated as in formula (14):

L = Σ_{i=1}^{T} L_CE(P(· | q_i), y_i)    (14)

wherein L_CE represents the cross entropy loss function and y_i is the true class label of test sample q_i.
9. The small sample image classification method based on local feature relation exploration according to any one of claims 2 to 8, wherein m = h × w.
10. The small sample image classification method based on local feature relation exploration according to claim 2, wherein c can be understood as the dimension of an initial local feature.
CN202110287779.XA, priority date 2021-03-17, filing date 2021-03-17: Small sample image classification method based on local feature relation exploration. Status: Active. Granted as CN113076976B.

Priority Applications (1)

CN202110287779.XA (granted as CN113076976B), priority date 2021-03-17, filing date 2021-03-17: Small sample image classification method based on local feature relation exploration

Applications Claiming Priority (1)

CN202110287779.XA (granted as CN113076976B), priority date 2021-03-17, filing date 2021-03-17: Small sample image classification method based on local feature relation exploration

Publications (2)

CN113076976A, published 2021-07-06
CN113076976B, granted and published 2023-08-18

Family

ID=76612760

Family Applications (1)

CN202110287779.XA (Active), priority date 2021-03-17, filing date 2021-03-17: Small sample image classification method based on local feature relation exploration

Country Status (1)

CN: CN113076976B

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109961089A (en) * 2019-02-26 2019-07-02 中山大学 Small sample and zero sample image classification method based on metric learning and meta learning
CN111210435A (en) * 2019-12-24 2020-05-29 重庆邮电大学 Image semantic segmentation method based on local and global feature enhancement module
CN111858991A (en) * 2020-08-06 2020-10-30 南京大学 Small sample learning algorithm based on covariance measurement
CN112308115A (en) * 2020-09-25 2021-02-02 安徽工业大学 Multi-label image deep learning classification method and equipment

Also Published As

CN113076976B (en), published 2023-08-18


Legal Events

PB01: Publication
SE01: Entry into force of request for substantive examination
GR01: Patent grant