CN110008967A - Behavior characterization method and system fusing structural and semantic modes - Google Patents

Behavior characterization method and system fusing structural and semantic modes

Info

Publication number
CN110008967A
CN110008967A (application CN201910276931.7A)
Authority
CN
China
Prior art keywords
node
subgraph
block matrix
constituted
data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201910276931.7A
Other languages
Chinese (zh)
Inventor
李建欣
宁元星
彭浩
龚其然
李培文
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beihang University
Original Assignee
Beihang University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beihang University filed Critical Beihang University
Priority to CN201910276931.7A priority Critical patent/CN110008967A/en
Publication of CN110008967A publication Critical patent/CN110008967A/en
Pending legal-status Critical Current


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/25 Fusion techniques
    • G06F18/254 Fusion techniques of classification results, e.g. of results related to same input data
    • G06F18/256 Fusion techniques of classification results of results relating to different input data, e.g. multimodal recognition
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology
    • G06N3/045 Combinations of networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/08 Learning methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/40 Extraction of image or video features
    • G06V10/44 Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • General Engineering & Computer Science (AREA)
  • Computing Systems (AREA)
  • Software Systems (AREA)
  • Molecular Biology (AREA)
  • Computational Linguistics (AREA)
  • Biophysics (AREA)
  • Biomedical Technology (AREA)
  • Mathematical Physics (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Evolutionary Biology (AREA)
  • Multimedia (AREA)
  • Machine Translation (AREA)

Abstract

This application discloses a behavior characterization method and system fusing structural and semantic modes. The method comprises: processing raw data to obtain multiple subgraphs; performing data structuring on each of the multiple subgraphs; feeding the structured data obtained by the processing into a neural network for training, to obtain an output result for each subgraph; and merging the output results of the subgraphs.

Description

Behavior characterization method and system fusing structural and semantic modes
Technical field
This application relates to neural network technology, and more particularly to a behavior characterization method and system fusing structural and semantic modes.
Background technique
In recent years, the rapid development of convolutional neural networks has achieved great success on grid data, for example in image processing and handwritten-digit recognition. Deep learning methods have achieved remarkable results in object recognition and image classification. However, many real-world problems cannot simply be modeled as grid data. The most typical example is a social network, in which users form nodes, acquaintance between users determines whether an edge exists, edges carry weights, and the degree of a node is determined by its edges; many problems in social networks can be solved with this model. Therefore, recent work has focused on extending convolutional neural networks beyond grids, i.e., from 2D/3D images to arbitrary graphs or networks. Convolutional neural networks on graphs are commonly referred to as graph convolutional networks (Graph Convolutional Network, GCN); by representing an arbitrary graph as a fixed feature map with rich structural and semantic information, they are robust in feature extraction.
However, this approach of performing convolution on subgraphs currently has several problems:
1) The graph convolution operation is over-simplified within the local neighborhood. Because it aggregates node values inside a local neighborhood, information associated with convolution over the parent graph may be lost;
2) Classification of isomorphic graphs is poor. The GCN model cannot be applied directly, because it depends on the node ordering of the graph, which means it cannot guarantee that any two isomorphic graphs always produce identical outputs;
3) The ability to obtain global information is limited. Graph convolution filters operate on local data and provide only a flattened/aggregated view. This shortcoming causes serious difficulty when handling graphs without node labels;
4) Max pooling (Max Pooling) has drawbacks. Because it depends on node ordering, max pooling cannot guarantee that isomorphic subgraphs share an invariant feature vector; yet the max pooling operator is the popular choice after aggregation for achieving invariance.
Because of these problems, current neural networks perform poorly when learning on subgraph data.
Summary of the application
To solve the above technical problems, embodiments of the present application provide a behavior characterization method and system fusing structural and semantic modes.
The neural network data processing method provided by embodiments of the present application comprises:
processing raw data to obtain multiple subgraphs; performing data structuring on each of the multiple subgraphs;
feeding the structured data obtained by the processing into a neural network for training, to obtain an output result for each subgraph;
merging the output results of the subgraphs.
The neural network data processing device provided by embodiments of the present application comprises:
a subgraph extraction module, configured to process raw data to obtain multiple subgraphs;
a structuring processing module, configured to perform data structuring on each of the multiple subgraphs;
a convolution processing module, configured to feed the structured data obtained by the processing into a neural network for training, to obtain an output result for each subgraph;
a merging module, configured to merge the output results of the subgraphs.
With the above technical solution of the embodiments of the present application: 1) the influence of different edges on node labels is taken into account, and a deep convolutional neural network is implemented to extract discriminative graph features, in particular to avoid the isomorphic-subgraph problem; 2) complex spatial information can be extracted, and different subgraphs can be extracted from real-world graphs; 3) by using the proposed matching-based subgraph normalization and self-attention layer, a significant performance boost is obtained on most tasks.
Detailed description of the invention
Fig. 1 is a flow diagram of the neural network data processing method provided by embodiments of the present application;
Fig. 2 is an architecture diagram of behavior characterization in a social network provided by embodiments of the present application;
Fig. 3 is an architecture diagram of neural network data processing provided by embodiments of the present application;
Fig. 4 is a schematic structural diagram of the neural network data processing device provided by embodiments of the present application.
Specific embodiment
Various exemplary embodiments of the present application are now described in detail with reference to the accompanying drawings. It should be noted that, unless otherwise specified, the relative arrangement of components and steps, the numerical expressions and the numerical values set forth in these embodiments do not limit the scope of the application.
Meanwhile, it should be understood that, for ease of description, the sizes of the parts shown in the drawings are not drawn according to actual proportional relationships.
The following description of at least one exemplary embodiment is merely illustrative and is in no way intended to limit the application or its use.
Techniques, methods and devices known to a person of ordinary skill in the relevant art may not be discussed in detail, but where appropriate, such techniques, methods and devices should be considered part of the specification.
It should also be noted that similar reference numerals and letters denote similar items in the following drawings; therefore, once an item is defined in one drawing, it need not be further discussed in subsequent drawings.
Embodiments of the present application can be applied to electronic devices such as computer systems/servers, which can operate together with numerous other general-purpose or special-purpose computing system environments or configurations. Examples of well-known computing systems, environments and/or configurations suitable for use with electronic devices such as computer systems/servers include, but are not limited to: personal computer systems, server computer systems, thin clients, thick clients, handheld or laptop devices, microprocessor-based systems, set-top boxes, programmable consumer electronics, network PCs, minicomputer systems, mainframe computer systems, and distributed cloud computing environments that include any of the above systems, and the like.
Electronic devices such as computer systems/servers may be described in the general context of computer-executable instructions (such as program modules) executed by a computer system. Generally, program modules may include routines, programs, target programs, components, logic, data structures and the like, which perform specific tasks or implement specific abstract data types. The computer system/server may be implemented in a distributed cloud computing environment, where tasks are performed by remote processing devices linked through a communication network. In a distributed cloud computing environment, program modules may be located on local or remote computing system storage media that include storage devices.
A graph model is a very common structure, and many real-world problems can be modeled with it, but processing such models still raises many issues. Therefore, the embodiments of the present application propose a new system fusing structural and semantic modes to solve these issues. This system can overcome the limitation of convolution operations that only obtain adjacent-node features; by selecting a reasonable structure, it can better obtain the data information of the whole graph and process it faster. Taking experiments on social networks as an example: the embodiments of the present application carefully design suitable network motifs for the social network dataset and perform motif-guided graph normalization on the selected small subgraphs. A computable self-attention precondition is also adopted, which not only abandons the integration operations in two directions, but also ensures that each dimension in the feature map represents a characteristic of the graph.
Fig. 1 is a flow diagram of the neural network data processing method provided by embodiments of the present application. As shown in Fig. 1, the neural network data processing method comprises the following steps:
Step 101: process the raw data to obtain multiple subgraphs, and perform data structuring on each of the multiple subgraphs.
In embodiments of the present application, the graph-tool library for Python may be used to process the raw data, obtain the multiple subgraphs, and perform data structuring on each subgraph. It should be noted that embodiments of the present application can be implemented in a Python environment, but are not limited thereto; without departing from the core of the technical solution of the embodiments, they can also be implemented in other language environments.
In embodiments of the present application, subgraph extraction can be implemented by the following steps (see the sketch after this list):
1) sort the nodes in the target network by closeness centrality;
2) select the top k nodes as the central nodes of the subgraphs, k being the number of subgraphs;
3) for each of the top k nodes, add the m nodes around that node to the subgraph corresponding to that node, m being the number of nodes in the subgraph.
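For illustration only, the following sketch shows one way such extraction could be written in Python. It uses networkx rather than the graph-tool library mentioned above, and the function name extract_subgraphs and the breadth-first choice of the m surrounding nodes are assumptions, since the publication does not specify how the m neighbors are selected.

    import networkx as nx

    def extract_subgraphs(G: nx.Graph, k: int, m: int):
        # Rank nodes by closeness centrality and keep the top-k as subgraph centers.
        centrality = nx.closeness_centrality(G)
        ranked = sorted(G.nodes, key=lambda v: centrality[v], reverse=True)
        centers = ranked[:k]

        subgraphs = []
        for c in centers:
            members = [c]
            # Grow each subgraph outward from its center until it has m nodes
            # (breadth-first order is an assumed choice of the m surrounding nodes).
            for _, v in nx.bfs_edges(G, c):
                members.append(v)
                if len(members) >= m:
                    break
            subgraphs.append(G.subgraph(members).copy())
        return subgraphs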
In embodiments of the present application, data structuring of a subgraph can be implemented by the following steps (see the sketch after this list):
1) for each of the multiple subgraphs, sort the nodes in the subgraph by closeness centrality, and select a two-hop structure to structure the subgraph, obtaining a central node, second-hop nodes and third-hop nodes; form a first block matrix from the triangle structures constituted by the central node and the nodes after it;
2) form a second block matrix from the triangle structures constituted by the second-hop nodes and the nodes after them; form a third block matrix from the triangle structures constituted by the third-hop nodes and the nodes after them;
3) concatenate the first block matrix, the second block matrix and the third block matrix to constitute the structured representation of the subgraph.
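For illustration only, the sketch below shows one possible reading of the block-matrix construction. Two points are assumptions rather than the patent's specification: the "triangle structure" is read here as a three-node, two-hop unit (a node, one of its neighbors, and a neighbor of that neighbor), matching the later remark that two-hop paths are easily expressed as arrays, and each matrix row is taken to be the concatenated feature vectors of the unit's three nodes.

    import numpy as np
    import networkx as nx

    def structure_subgraph(sg: nx.Graph, center, feat: dict) -> np.ndarray:
        # feat maps each node to a 1-D numpy feature vector (assumed input).
        # Split nodes by hop distance from the center: 0 (center), 1, 2.
        dist = nx.single_source_shortest_path_length(sg, center, cutoff=2)
        levels = {h: [v for v, d in dist.items() if d == h] for h in (0, 1, 2)}

        def rows_for(nodes):
            rows = []
            for u in nodes:
                for v in sg.neighbors(u):
                    for w in sg.neighbors(v):
                        if w != u:  # (u, v, w) is a two-hop, three-node unit
                            rows.append(np.concatenate([feat[u], feat[v], feat[w]]))
            return rows

        # One block per hop level; stacking the blocks gives the
        # structured (matrix) representation of the subgraph.
        blocks = [rows_for(levels[h]) for h in (0, 1, 2)]
        rows = [r for block in blocks for r in block]
        feat_dim = len(next(iter(feat.values())))
        return np.stack(rows) if rows else np.zeros((0, 3 * feat_dim))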
Step 102: feed the structured data obtained by the processing into a neural network for training, and obtain the output result of each subgraph.
Here, the neural network processes the structured data of each subgraph through convolution operations.
Step 103: merge the output results of the subgraphs.
After the convolutional layers of the neural network, an attention model is added, and the attention model is used to extract the connection information between adjacent subgraphs. The attention model uses the following formula:
where h′_i is the result computed by the attention model for the i-th node, j denotes a node adjacent to node i, S is the number of attention heads, h_j is the merged output result of subgraph j obtained from neural network training, and α_ij denotes the connection relationship factor between subgraphs.
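The formula itself appears as an embedded image in the published document and is not reproduced in this text. A reconstruction consistent with the variable descriptions above, written in the style of multi-head graph attention, is sketched below; the symbols h′_i, h_j and α_ij, the averaging over the S attention heads, and the weight matrix W are assumed notation, not the patent's verbatim formula.

    h'_i = \sigma\!\left( \frac{1}{S} \sum_{s=1}^{S} \sum_{j \in \mathcal{N}(i)} \alpha_{ij}^{s} \, W^{s} h_j \right)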
The technical solution of the embodiments of the present application is further described below with a specific application example. Referring to Fig. 2, Fig. 2 is an architecture diagram of behavior characterization in a social network provided by embodiments of the present application.
(1) Extracting subgraphs from the social network
A social network is, first of all, a very large dataset containing a huge amount of information. Training directly on the raw data would pose great challenges to memory requirements and training speed; therefore, small subgraphs must first be selected from the graph for training. However, too much of the graph's information must not be discarded, otherwise the amount of information is insufficient and the network cannot learn accurately. The subgraphs therefore need to be extracted reasonably.
The embodiments of the present application select subgraphs by closeness centrality; the formula for closeness centrality is as follows:
The formula expresses how easily a node can reach the other nodes, i.e., it is the inverse of the average of the distances from the node to all other nodes.
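The closeness centrality formula is likewise an embedded image in the published document. Written out from the description above (the inverse of the average shortest-path distance to all other nodes), the standard form is the following, where d(v, u) is the shortest-path distance between nodes v and u and n is the number of nodes; the notation is assumed.

    C(v) = \frac{n - 1}{\sum_{u \neq v} d(v, u)}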
The nodes in the social network are first sorted by closeness centrality; then the top k nodes (k being the number of subgraphs) are taken as center points for subgraph expansion, and the m nodes around each center (m being the number of nodes in each subgraph) are added in.
(2) Obtaining data information and a structured representation from a subgraph
A subgraph cannot be processed directly; it needs to be converted into a data structure that a neural network can handle, namely a matrix.
First, the nodes in the subgraph are again sorted by closeness centrality; obviously, the central node comes first. Then the two-hop structure is selected to give the subgraph a structured representation. Specifically, the nodes are divided into the central node, the two-hop nodes (nodes at distance 1 from the center) and the three-hop nodes (nodes at distance 2 from the center), because the differences between social network datasets in terms of activation groups or degrees are relatively small, and the two-hop path motif has symmetry and is suitable for local matching when the heterogeneity of nodes is ignored. Finally, each of these nodes, together with the triangle structure formed with the nodes after it, constitutes one matrix row; the three block matrices are then concatenated to constitute the structured representation of the subgraph data, see Fig. 2. Because edges are distributed very unevenly in a graph, triangles or more complex patterns are rarely matched; even a dense subgraph can be divided into multiple two-hop path patterns and substantially reconstructed. In addition, a two-hop path can easily be expressed as an array.
(3) Training the network on the structured subgraph data
Referring to Fig. 3, the structured data of the subgraph is fed into the neural network for learning. Because one structural information unit is a triangle structure, one row of the input matrix corresponds to one structural information unit, and the convolution therefore needs to process a whole row together.
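As an illustration only: one way to ensure that the convolution treats a whole row (one structural information unit) as a unit is to use a kernel whose width equals the row width, so each filter sees one complete unit per step. The choice of PyTorch, the layer sizes and the pooling head below are assumptions, not the patent's specification.

    import torch
    import torch.nn as nn

    class SubgraphConvNet(nn.Module):
        """Sketch: convolve over whole rows of the structured subgraph matrix."""

        def __init__(self, row_width: int, n_filters: int = 32, out_dim: int = 64):
            super().__init__()
            # Kernel width == row width, so each filter covers one full
            # structural unit (one matrix row) and slides over the rows.
            self.conv = nn.Conv2d(1, n_filters, kernel_size=(1, row_width))
            self.pool = nn.AdaptiveAvgPool2d((1, 1))
            self.fc = nn.Linear(n_filters, out_dim)

        def forward(self, x):             # x: (batch, n_rows, row_width)
            x = x.unsqueeze(1)            # -> (batch, 1, n_rows, row_width)
            x = torch.relu(self.conv(x))  # -> (batch, n_filters, n_rows, 1)
            x = self.pool(x).flatten(1)   # -> (batch, n_filters)
            return self.fc(x)             # per-subgraph output vector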
(4) Further extracting connection information between subgraphs using an attention model
As can be observed from Fig. 2, an attention layer is added after the network training in this example is completed. The main purpose of this layer is to learn the connection information between subgraphs, so that the structural information of the whole graph can be better obtained. Specifically, the subgraph data after convolution in the neural network is merged to obtain a one-dimensional vector h_i; the attention layer then exchanges information between adjacent nodes to obtain a new feature vector h′_i. The formula of the attention model is:
where h′_i is the result computed by the attention model for the i-th node, j denotes a node adjacent to node i, S is the number of attention heads, h_j is the merged output result of subgraph j obtained from neural network training, and α_ij denotes the connection relationship factor between subgraphs.
Further, the formula for α_ij is as follows:
Here W is a parameter, and α_ij indicates the connection relationship between subgraphs. Multiple attention heads can extract more information: different attention heads may focus on different points, and thus more information can be extracted.
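The α_ij formula is also rendered as an embedded image in the published document. A reconstruction in the style of standard graph attention, consistent with W being a parameter and with softmax-normalized pairwise scores, is sketched below; the scoring function a(·,·) and the softmax normalization are assumptions.

    \alpha_{ij} = \frac{\exp\!\big( a(W h_i, \, W h_j) \big)}{\sum_{k \in \mathcal{N}(i)} \exp\!\big( a(W h_i, \, W h_k) \big)}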
Fig. 4 is a schematic structural diagram of the neural network data processing device provided by embodiments of the present application. As shown in Fig. 4, the device includes:
a subgraph extraction module 401, configured to process raw data to obtain multiple subgraphs;
a structuring processing module 402, configured to perform data structuring on each of the multiple subgraphs;
a convolution processing module 403, configured to feed the structured data obtained by the processing into a neural network for training, to obtain an output result for each subgraph;
a merging module 404, configured to merge the output results of the subgraphs.
In one embodiment, the subgraph extraction module 401 is configured to:
sort the nodes in the target network by closeness centrality;
select the top k nodes as the central nodes of the subgraphs, k being the number of subgraphs;
for each of the top k nodes, add the m nodes around that node to the subgraph corresponding to that node, m being the number of nodes in the subgraph.
In one embodiment, the structuring processing module 402 is configured to:
for each of the multiple subgraphs, sort the nodes in the subgraph by closeness centrality, and select a two-hop structure to structure the subgraph, obtaining a central node, second-hop nodes and third-hop nodes;
form a first block matrix from the triangle structures constituted by the central node and the nodes after it; form a second block matrix from the triangle structures constituted by the second-hop nodes and the nodes after them; form a third block matrix from the triangle structures constituted by the third-hop nodes and the nodes after them;
concatenate the first block matrix, the second block matrix and the third block matrix to constitute the structured representation of the subgraph.
In one embodiment, the device further includes:
an attention module 405, configured to extract connection information between adjacent subgraphs using an attention model.
In one embodiment, the attention model uses the following formula:
where h′_i is the result computed by the attention model for the i-th node, j denotes a node adjacent to node i, S is the number of attention heads, h_j is the merged output result of subgraph j obtained from neural network training, and α_ij denotes the connection relationship factor between subgraphs.
Those skilled in the art will appreciate that the functions implemented by the modules of the neural network data processing device shown in Fig. 4 can be understood with reference to the foregoing description of the neural network data processing method. The functions of the modules of the device shown in Fig. 4 may be implemented by a program running on a processor, or by specific logic circuits.
The description of the present invention has been given for purposes of illustration and description, and is not intended to be exhaustive or to limit the invention to the form disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art. The embodiments were chosen and described in order to better explain the principles of the invention and its practical application, and to enable those skilled in the art to understand the invention and to design various embodiments with various modifications suited to particular uses.

Claims (10)

1. A behavior characterization method and system fusing structural and semantic modes, characterized in that the method comprises:
processing raw data to obtain multiple subgraphs; performing data structuring on each of the multiple subgraphs;
feeding the structured data obtained by the processing into a neural network for training, to obtain an output result for each subgraph;
merging the output results of the subgraphs.
2. The method according to claim 1, characterized in that the processing of raw data to obtain multiple subgraphs comprises:
sorting the nodes in a target network by closeness centrality;
selecting the top k nodes as the central nodes of the subgraphs, k being the number of subgraphs;
for each of the top k nodes, adding the m nodes around that node to the subgraph corresponding to that node, m being the number of nodes in the subgraph.
3. The method according to claim 1, characterized in that the performing of data structuring on each of the multiple subgraphs comprises:
for each of the multiple subgraphs, sorting the nodes in the subgraph by closeness centrality, and selecting a two-hop structure to structure the subgraph, obtaining a central node, second-hop nodes and third-hop nodes;
forming a first block matrix from the triangle structures constituted by the central node and the nodes after it; forming a second block matrix from the triangle structures constituted by the second-hop nodes and the nodes after them; forming a third block matrix from the triangle structures constituted by the third-hop nodes and the nodes after them;
concatenating the first block matrix, the second block matrix and the third block matrix to constitute the structured representation of the subgraph.
4. The method according to any one of claims 1 to 3, characterized in that the method further comprises:
extracting connection information between adjacent subgraphs using an attention model.
5. The method according to claim 4, characterized in that the attention model uses the following formula:
where h′_i is the result computed by the attention model for the i-th node, j denotes a node adjacent to node i, S is the number of attention heads, h_j is the merged output result of subgraph j obtained from neural network training, and α_ij denotes the connection relationship factor between subgraphs.
6. A neural network data processing device, characterized in that the device comprises:
a subgraph extraction module, configured to process raw data to obtain multiple subgraphs;
a structuring processing module, configured to perform data structuring on each of the multiple subgraphs;
a convolution processing module, configured to feed the structured data obtained by the processing into a neural network for training, to obtain an output result for each subgraph;
a merging module, configured to merge the output results of the subgraphs.
7. The device according to claim 6, characterized in that the subgraph extraction module is configured to:
sort the nodes in a target network by closeness centrality;
select the top k nodes as the central nodes of the subgraphs, k being the number of subgraphs;
for each of the top k nodes, add the m nodes around that node to the subgraph corresponding to that node, m being the number of nodes in the subgraph.
8. The device according to claim 6, characterized in that the structuring processing module is configured to:
for each of the multiple subgraphs, sort the nodes in the subgraph by closeness centrality, and select a two-hop structure to structure the subgraph, obtaining a central node, second-hop nodes and third-hop nodes;
form a first block matrix from the triangle structures constituted by the central node and the nodes after it; form a second block matrix from the triangle structures constituted by the second-hop nodes and the nodes after them; form a third block matrix from the triangle structures constituted by the third-hop nodes and the nodes after them;
concatenate the first block matrix, the second block matrix and the third block matrix to constitute the structured representation of the subgraph.
9. The device according to any one of claims 6 to 8, characterized in that the device further comprises:
an attention module, configured to extract connection information between adjacent subgraphs using an attention model.
10. The device according to claim 9, characterized in that the attention model uses the following formula:
where h′_i is the result computed by the attention model for the i-th node, j denotes a node adjacent to node i, S is the number of attention heads, h_j is the merged output result of subgraph j obtained from neural network training, and α_ij denotes the connection relationship factor between subgraphs.
CN201910276931.7A 2019-04-08 2019-04-08 Behavior characterization method and system fusing structural and semantic modes Pending CN110008967A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910276931.7A CN110008967A (en) 2019-04-08 2019-04-08 Behavior characterization method and system fusing structural and semantic modes

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910276931.7A CN110008967A (en) 2019-04-08 2019-04-08 Behavior characterization method and system fusing structural and semantic modes

Publications (1)

Publication Number Publication Date
CN110008967A (en) 2019-07-12

Family

ID=67170293

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910276931.7A Pending CN110008967A (en) Behavior characterization method and system fusing structural and semantic modes

Country Status (1)

Country Link
CN (1) CN110008967A (en)


Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104200272A (en) * 2014-08-28 2014-12-10 北京工业大学 Complex network community mining method based on improved genetic algorithm
CN105243593A (en) * 2015-08-04 2016-01-13 电子科技大学 Weighted network community clustering method based on hybrid measure

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2021073211A1 (en) * 2019-10-14 2021-04-22 支付宝(杭州)信息技术有限公司 Method and device for training graph neural network model

Similar Documents

Publication Publication Date Title
Patil et al. Convolutional neural networks: an overview and its applications in pattern recognition
Yavartanoo et al. Spnet: Deep 3d object classification and retrieval using stereographic projection
Xiang et al. Objectnet3d: A large scale database for 3d object recognition
Zhang et al. Relationship proposal networks
Azizpour et al. Factors of transferability for a generic convnet representation
US9183467B2 (en) Sketch segmentation
Liu et al. Robust graph mode seeking by graph shift
CN110059807A (en) Image processing method, device and storage medium
CN112396106B (en) Content recognition method, content recognition model training method, and storage medium
CN103745201B (en) A kind of program identification method and device
Korzh et al. Convolutional neural network ensemble fine-tuning for extended transfer learning
Suh et al. Subgraph matching using compactness prior for robust feature correspondence
EP3642764A1 (en) Learning unified embedding
Singhal et al. Towards a unified framework for visual compatibility prediction
Mehmood et al. Effect of complementary visual words versus complementary features on clustering for effective content-based image search
Chen et al. RRGCCAN: Re-ranking via graph convolution channel attention network for person re-identification
Krishan Kumar et al. Two viewpoints based real‐time recognition for hand gestures
CN116310318A (en) Interactive image segmentation method, device, computer equipment and storage medium
Luo et al. Grounded affordance from exocentric view
CN110008967A (en) A kind of the behavior characterizing method and system of fusion structure and semantic mode
Daryanto et al. Survey: recent trends and techniques in image co-segmentation challenges, issues and its applications
CN116958729A (en) Training of object classification model, object classification method, device and storage medium
Boroujerdi et al. Deep interactive region segmentation and captioning
Murthy et al. A simplified and novel technique to retrieve color images from hand-drawn sketch by human.
Xu et al. Image keypoint matching using graph neural networks

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20190712