CN113673627A - Interpretive automatic commodity classification method and system - Google Patents

Interpretive automatic commodity classification method and system

Info

Publication number
CN113673627A
Authority
CN
China
Prior art keywords
classification
node
network model
cnn network
decision tree
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202111026189.8A
Other languages
Chinese (zh)
Other versions
CN113673627B (en)
Inventor
申林山
闫鑫
姜佳成
徐丽
贾我欢
娄茹珍
李悦齐
钱婧捷
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Harbin Engineering University
Original Assignee
Harbin Engineering University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Harbin Engineering University filed Critical Harbin Engineering University
Priority to CN202111026189.8A priority Critical patent/CN113673627B/en
Publication of CN113673627A publication Critical patent/CN113673627A/en
Application granted granted Critical
Publication of CN113673627B publication Critical patent/CN113673627B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/24 Classification techniques
    • G06F18/243 Classification techniques relating to the number of classes
    • G06F18/24323 Tree-organised classifiers
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology
    • G06N3/045 Combinations of networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/08 Learning methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00 Commerce
    • G06Q30/06 Buying, selling or leasing transactions
    • G06Q30/0601 Electronic shopping [e-shopping]

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Business, Economics & Management (AREA)
  • Evolutionary Computation (AREA)
  • Software Systems (AREA)
  • Accounting & Taxation (AREA)
  • Finance (AREA)
  • Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • Mathematical Physics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Evolutionary Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Development Economics (AREA)
  • Economics (AREA)
  • Marketing (AREA)
  • Strategic Management (AREA)
  • General Business, Economics & Management (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)

Abstract

An interpretable automatic commodity classification method and system, belonging to the technical field of image recognition and classification. The invention addresses the problem that existing commodity image recognition and classification algorithms are difficult to make interpretable, which leaves existing methods unable to classify complex commodities accurately. The method labels images in a double-label format with the PyTorch tool to construct a corresponding data set, trains the designed network architecture on the constructed data set, classifies images with the trained network architecture, and displays the visualized results in a web page, thereby providing a commodity recognition and classification model with high classification accuracy and strong interpretability and overcoming the difficulty traditional methods have in recognizing and classifying complex commodities. The invention can be applied to image recognition and classification.

Description

Interpretive automatic commodity classification method and system
Technical Field
The invention belongs to the technical field of image recognition and classification, and particularly relates to an interpretive automatic commodity classification method and system.
Background
Target image recognition and classification is an important branch of computer vision with broad application prospects. Classifying commodities automatically with traditional machine learning reduces a large amount of work and saves labor resources, but it also introduces new problems.
As mobile internet technology has matured, the online shopping market has become increasingly active. To meet people's demand for a better life, commodities have grown ever more varied and numerous, which in turn causes many problems. For online commodity managers, the sheer complexity of commodity categories is a major obstacle to classification. Most commodity classification is performed by artificial intelligence algorithms, but their poor interpretability makes the overly complicated commodity classification problem hard to handle.
Beyond experimental work, researchers have also explored the theory of deep-learning interpretability. In 2018, Lipton first analyzed the meaning of interpretability for deep learning models from four aspects: trust, causal association, transfer learning, and the information provided, and pointed out that decisions made by interpretable deep learning models tend to earn greater trust; people may still trust such a model even when the result it gives differs from the actual situation.
In summary, to help commodity image recognition and classification algorithms overcome the current difficulty of obtaining interpretability, to balance a model's interpretability against its accuracy, and to make the model usable for complicated commodity classification problems under real environmental conditions and practical application requirements, an interpretable automatic commodity classification method is needed.
Disclosure of Invention
The invention aims to solve the problem that existing commodity image recognition and classification algorithms are difficult to make interpretable, which leaves existing methods unable to classify complex commodities accurately, and provides an interpretable automatic commodity classification method and system.
The technical solution adopted by the invention to solve this technical problem is as follows:
According to one aspect of the invention, the interpretable automatic commodity classification method specifically comprises the following steps:
Step one, designing a network architecture consisting of a base classifier and an interpretation hierarchy, wherein the base classifier is a CNN network model and the interpretation hierarchy is an induced decision tree;
after the CNN network model is pre-trained, the induced decision tree is constructed by loading the weights of the last fully connected layer of the pre-trained CNN network model;
step two, training the network architecture designed in step one with the labeled commodity image data set, and stopping training when the classification accuracy of the network architecture on the labeled data set reaches a set threshold;
and step three, inputting the commodity images to be classified into the trained network architecture, outputting the classification result with the CNN network model, and outputting an explanation of the decision process with the induced decision tree.
According to another aspect of the invention, an interpretable automatic commodity classification system is provided for executing the interpretable automatic commodity classification method.
The invention has the following beneficial effects: the invention provides an interpretable automatic commodity classification method and system in which the PyTorch tool is used to label images in a double-label format to construct a corresponding data set, the designed network architecture is trained on the constructed data set, images are classified with the trained network architecture, and the visualized results are displayed in a web page, thereby providing a commodity recognition and classification model with high classification accuracy and strong interpretability and overcoming the difficulty traditional methods have in recognizing and classifying complex commodities.
The loss function and the embedded decision rules used by the method give the CNN model interpretability while balancing accuracy and interpretability, saving staff the time spent debugging the commodity classification model; together with the front-end interface, the model is convenient to use and debug.
Drawings
FIG. 1 is a flow chart of the method of the present invention;
FIG. 2 is a diagram of the construction of an induced decision tree according to the present invention;
FIG. 3 is a system display diagram in the present invention;
FIG. 4 is a graph of the visualization of the results of the present invention.
Detailed Description
First embodiment: this embodiment is described with reference to FIG. 1. The interpretable automatic commodity classification method of this embodiment specifically includes the following steps:
Step one, designing a network architecture consisting of a base classifier and an interpretable level, wherein the base classifier is a CNN network model and the interpretable level is an induced decision tree;
after the CNN network model is pre-trained, the induced decision tree is constructed by loading the weights of the last fully connected layer of the pre-trained CNN network model;
step two, training the network architecture designed in step one with the labeled commodity image data set, and stopping training when the classification accuracy of the network architecture on the labeled data set reaches a set threshold;
and step three, inputting the commodity images to be classified into the trained network architecture, outputting the classification result with the CNN network model, and outputting an explanation of the decision process with the induced decision tree.
Because the induced hierarchy is built from model weights, intermediate nodes are not forced to split on foreground objects. Whereas a hierarchy such as WordNet provides hypotheses about the meaning of each node, the induced tree may split on unexpected contextual, semantic, or visual properties. To diagnose the visual meaning of a node, the following four-step test is performed:
1) Form a hypothesis about the meaning of the node.
This hypothesis can be computed automatically for a given taxonomy or deduced by manually examining the leaves of each child.
2) Collect a data set with new, unseen classes to test the meaning hypothesized in step 1). The samples in this data set are called out-of-distribution samples and are drawn from a separate labeled data set.
3) Pass samples from this data set through the node.
For each sample, check whether the selected child node conforms to the hypothesis.
4) The hypothesis accuracy is the percentage of samples routed to the correct child node. If the accuracy is low, repeat the test with a different hypothesis.
Repeating these four steps allows the visual meaning of the node to be refined.
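As an illustration of step 4 only, the short PyTorch sketch below measures how often held-out samples are routed to the hypothesized child when each node routes by the largest inner product with a child's representative vector (the decision rule of the induced hierarchy described later); the function names, tensor shapes, and routing helper are assumptions introduced here, not identifiers from the patent.

```python
import torch

def child_choice(child_vectors: torch.Tensor, feature: torch.Tensor) -> int:
    """Route a sample to the child whose representative vector has the
    largest inner product with the sample's feature vector."""
    return int(torch.argmax(child_vectors @ feature))

def hypothesis_accuracy(child_vectors: torch.Tensor,
                        features: list,
                        expected_child: int) -> float:
    """Step 4: fraction of out-of-distribution samples routed to the child
    predicted by the hypothesis."""
    hits = sum(child_choice(child_vectors, f) == expected_child for f in features)
    return hits / len(features)
```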
The invention takes the structural differences between neural networks into account and adopts embedded inference rules to construct the induced hierarchy and decision tree instead of predicting with a single convolutional neural network alone, which effectively increases the interpretability of the network. At the same time, the decision process is optimized with a tree loss function, so accuracy is preserved while interpretability is achieved, effectively balancing the gap between these two important indicators. The method can quickly and accurately extract target features from a limited number of pictures with complex backgrounds, make classification decisions, and give decision explanations that staff can understand.
Second embodiment: this embodiment differs from the first embodiment in that the CNN network model is built on a ResNet-18 model, which comprises a convolutional layer, a pooling layer, four blocks, and a fully connected layer;
each block is composed of two building blocks, and each building block is composed of two smaller blocks.
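For concreteness, the base classifier could be assembled from torchvision's ResNet-18 as in the minimal sketch below; the use of torchvision and the `num_classes` parameter are assumptions consistent with the PyTorch tooling mentioned in the description, not requirements stated in the patent.

```python
import torch.nn as nn
from torchvision.models import resnet18

def build_base_classifier(num_classes: int) -> nn.Module:
    # ResNet-18: an initial convolution, pooling, four residual blocks
    # (each made of two basic building blocks), and a fully connected layer.
    model = resnet18()
    model.fc = nn.Linear(model.fc.in_features, num_classes)
    return model
```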
Other steps and parameters are the same as those in the first embodiment.
Third embodiment: this embodiment is described with reference to FIG. 2. It differs from the first or second embodiment in that the induced decision tree is constructed by loading the weights of the last fully connected layer of the pre-trained CNN network model, with the following specific process:
step S1, loading the weights of the last fully connected layer of the pre-trained CNN network model;
step S2, taking each row of the loaded weight matrix as the representative vector of a leaf node;
step S3, taking the average of each pair of leaf representative vectors as the representative vector of their parent;
and step S4, taking the average of all leaf representative vectors as the representative vector of the ancestor.
Other steps and parameters are the same as those in the first or second embodiment.
A hierarchical structure is established in weight space, yielding an interpretable model with higher precision. This contrasts with existing decision-tree-based methods, which either use an existing hierarchy such as WordNet or build a hierarchy in feature space with data-dependent heuristics such as information gain. Specifically, the row vectors ω_i are taken from the fully connected layer, each representing one class, and the class representatives are clustered hierarchically. Each leaf is represented by its vector ω_i, each intermediate node is represented by the mean r_i of the representatives beneath it, and decisions are made by taking inner products with these representatives. This hierarchy is referred to as the induced hierarchy. Alternative hierarchies were also compared: a classical information-gain hierarchy built on neural features and the WordNet hierarchy. The WordNet relations also provide interpretable labels for other candidate decision-tree nodes, for example classifying shirt under fastening. Using these labels, the earliest ancestor of the leaves of each subtree is found, and a semantic hypothesis is additionally generated for each intermediate node of the induced hierarchy.
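A simplified sketch of steps S1-S4 is shown below, assuming the pre-trained classifier is a PyTorch model whose last layer is exposed as `model.fc` (as in torchvision's ResNet-18); pairing leaves in index order is only an illustration, since the patent obtains parents by hierarchically clustering the class representatives.

```python
import torch

def induce_hierarchy(model):
    """Build leaf, parent, and root representative vectors from the last
    fully connected layer of a pre-trained classifier."""
    w = model.fc.weight.detach()                   # S1: load the fc-layer weights
    leaves = [w[i] for i in range(w.shape[0])]     # S2: one row per class (leaf)
    parents = [torch.stack(pair).mean(dim=0)       # S3: parent = mean of a pair of leaves
               for pair in zip(leaves[0::2], leaves[1::2])]
    root = torch.stack(leaves).mean(dim=0)         # S4: ancestor = mean of all leaves
    return leaves, parents, root
```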
Fourth embodiment: this embodiment differs from the first to third embodiments in that the network architecture designed in step one is trained with the following loss function:
L = CrossEntropy(D_pred, D_label) + ω·CrossEntropy(D_nbdt, D_label)    (1)
where CrossEntropy denotes the conventional cross-entropy loss function, D_pred is the prediction provided by the CNN network model, D_label is the value provided by the label, D_nbdt is the initial prediction of the induced decision tree, and ω is a hyperparameter.
The conventional cross-entropy loss function is:
H(p, q) = -∑_x [ p(x) log q(x) + (1 - p(x)) log(1 - q(x)) ]    (2)
The hyperparameter setting differs between data sets: for commodity data sets with 100 or fewer classes the hyperparameter ω is set to 1, and for commodity data sets with between 100 and 1000 classes ω is set to 10. Each node simply returns the probability of each child node as a normalized inner product. For each leaf node, the probability of its path to the root is computed, and the leaf node with the highest probability is selected. The fine-tuned decision tree is more accurate. For an existing data set, training the model with this loss function in place of the conventional cross-entropy loss promotes the generation of the decision tree, and every node of the decision tree acquires a corresponding visual meaning.
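A minimal PyTorch sketch of equation (1) is given below, assuming the CNN and the induced decision tree each produce a vector of class scores per sample (`cnn_logits` and `nbdt_logits` are illustrative names) and that the standard `torch.nn.functional.cross_entropy` stands in for the conventional cross-entropy term.

```python
import torch
import torch.nn.functional as F

def tree_loss(cnn_logits: torch.Tensor,
              nbdt_logits: torch.Tensor,
              labels: torch.Tensor,
              omega: float = 1.0) -> torch.Tensor:
    """L = CrossEntropy(D_pred, D_label) + omega * CrossEntropy(D_nbdt, D_label)."""
    return F.cross_entropy(cnn_logits, labels) + omega * F.cross_entropy(nbdt_logits, labels)
```

Following the setting above, omega would be 1 for data sets with at most 100 classes and 10 for data sets with between 100 and 1000 classes.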
Other steps and parameters are the same as those in one of the first to third embodiments.
Fifth embodiment: this embodiment differs from the first to fourth embodiments in that the induced decision tree explains the decision process by the following steps:
step 1, in the induced decision tree, the probability of each child node is as follows:
S_i = e^{v_i} / ∑_{c=1}^{C} e^{v_c}
where S_i is the probability of the i-th child node, i is the class index (each child node corresponds to a class), C is the total number of classes, v_i is the output of the unit in the stage preceding the classifier's output, and e is the base of the natural logarithm;
S_i therefore expresses the ratio of the exponential of the current element to the sum of the exponentials of all elements;
a decision tree defined in this way achieves higher classification accuracy than a traditional decision tree and is better suited to classifying the target; the structure of the decision tree is determined by combining tree growth with pruning, and the tree is modified to improve its generalization ability;
step 2, for any leaf node j, let the l-th path from the root node to that leaf node be C_l(j); then, starting from leaf node j, the probability p(l) of reaching the root node through all child nodes on the l-th path is:
p(l) = ∏_{k ∈ C_l(j)} S_k
where k ∈ C_l(j) indicates that node k is a child node on the l-th path, and S_k is the probability of the k-th child node;
similarly, the probabilities of the other paths from the root node to leaf node j, and of the paths from the root node to every other leaf node, are computed; the leaf node with the largest probability is found, and the classification result corresponding to that leaf node is obtained.
Other steps and parameters are the same as in one of the first to fourth embodiments.
In this embodiment, a leaf node is a node at the end of a decision tree, a root node is a node at the top of the decision tree, and the rest of the nodes are children.
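The decision process of steps 1 and 2 can be sketched as follows: at each node the inner products between the sample's feature vector and the children's representative vectors are normalized with a softmax (the S_i above), and each leaf's score is the product of the child probabilities along its root-to-leaf path. The `Node` class, the tree layout, and the use of inner products as the scores v_i follow the normalized-inner-product rule mentioned in the fourth embodiment but are otherwise illustrative assumptions.

```python
import torch

class Node:
    def __init__(self, vector=None, children=None, class_index=None):
        self.vector = vector                # representative vector of this node
        self.children = children or []      # an empty list means this node is a leaf
        self.class_index = class_index      # class id, set only on leaf nodes

def leaf_path_probs(root: Node, feature: torch.Tensor) -> dict:
    """Map each leaf's class index to the probability p(l) of its root-to-leaf path."""
    results = {}

    def walk(node: Node, prob: float):
        if not node.children:                       # reached a leaf: record p(l)
            results[node.class_index] = prob
            return
        scores = torch.stack([child.vector @ feature for child in node.children])
        probs = torch.softmax(scores, dim=0)        # S_i: normalized inner products
        for child, s_i in zip(node.children, probs):
            walk(child, prob * float(s_i))          # multiply the S_k along the path

    walk(root, 1.0)
    return results

# The predicted class is the leaf with the largest path probability:
# probs = leaf_path_probs(tree_root, feature_vector)
# predicted_class = max(probs, key=probs.get)
```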
The invention also includes a matching front-end component that interconnects the front end and back end of the overall model, displays the visualized training results, and renders the visualization as an html file for display at the front end.
Sixth embodiment: this embodiment differs from the first to fifth embodiments in that the labeling tool used for the commodity image data set is PyTorch.
Other steps and parameters are the same as those in one of the first to fifth embodiments.
Seventh embodiment: this embodiment is described with reference to FIG. 3 and FIG. 4. It differs from the first to sixth embodiments in that the method further comprises a step four, which is specifically:
and the decision process of the CNN network model can be visualized into an html-format file through a front-end and back-end joint debugging mode, and the decision process of the CNN network model is displayed through a webpage by utilizing a written front-end template, so that the CNN network model is convenient for workers to use.
Other steps and parameters are the same as those in one of the first to sixth embodiments.
Eighth embodiment: the system of this embodiment is used to execute the method of any one of the first to seventh embodiments.
The above examples of the invention merely explain its computational model and workflow in detail and are not intended to limit its embodiments. Other variations and modifications can be made by those skilled in the art on the basis of the above description; it is neither possible nor necessary to enumerate every embodiment here, and obvious variations and modifications derived from the above fall within the scope of the invention.

Claims (8)

1. An interpretable automatic commodity classification method, characterized by comprising the following steps:
step one, designing a network architecture consisting of a base classifier and an interpretation hierarchy, wherein the base classifier is a CNN network model and the interpretation hierarchy is an induced decision tree;
after the CNN network model is pre-trained, the induced decision tree is constructed by loading the weights of the last fully connected layer of the pre-trained CNN network model;
step two, training the network architecture designed in step one with the labeled commodity image data set, and stopping training when the classification accuracy of the network architecture on the labeled data set reaches a set threshold;
and step three, inputting the commodity images to be classified into the trained network architecture, outputting the classification result with the CNN network model, and outputting an explanation of the decision process with the induced decision tree.
2. The method of claim 1, wherein the CNN network model is constructed based on a ResNet-18 model, and the ResNet-18 model includes a convolutional layer, a pooling layer, four blocks, and a fully connected layer;
each block is composed of two building blocks, and each building block is composed of two smaller blocks.
3. The method as claimed in claim 2, wherein the induced decision tree is constructed by loading the weights of the last fully connected layer of the pre-trained CNN network model, with the following specific process:
step S1, loading the weights of the last fully connected layer of the pre-trained CNN network model;
step S2, taking each row of the loaded weight matrix as the representative vector of a leaf node;
step S3, taking the average of each pair of leaf representative vectors as the representative vector of their parent;
and step S4, taking the average of all leaf representative vectors as the representative vector of the ancestor.
4. The method according to claim 2, wherein the network architecture designed in the first step is trained by using a loss function as follows:
L = CrossEntropy(D_pred, D_label) + ω·CrossEntropy(D_nbdt, D_label)    (1)
where CrossEntropy denotes the conventional cross-entropy loss function, D_pred is the prediction provided by the CNN network model, D_label is the value provided by the label, D_nbdt is the initial prediction of the induced decision tree, and ω is a hyperparameter.
5. The method as claimed in claim 4, wherein the induced decision tree explains the decision process by the following steps:
step 1, in the induced decision tree, the probability of each child node is as follows:
S_i = e^{v_i} / ∑_{c=1}^{C} e^{v_c}
where S_i is the probability of the i-th child node, i is the class index, C is the total number of classes, v_i is the output of the unit in the stage preceding the classifier's output, and e is the base of the natural logarithm;
step 2, for any leaf node j, let the l-th path from the root node to that leaf node be C_l(j); then, starting from leaf node j, the probability p(l) of reaching the root node through all child nodes on the l-th path is:
p(l) = ∏_{k ∈ C_l(j)} S_k
where k ∈ C_l(j) indicates that node k is a child node on the l-th path, and S_k is the probability of the k-th child node;
similarly, the probabilities of the other paths from the root node to leaf node j, and of the paths from the root node to every other leaf node, are computed; the leaf node with the largest probability is found, and the classification result corresponding to that leaf node is obtained.
6. The method of claim 5, wherein the commodity image data set is annotated with PyTorch as the labeling tool.
7. The interpretable automatic commodity classification method according to claim 6, characterized in that the method further comprises a step four, which is specifically:
and (3) visualizing the decision process of the CNN network model into an html format file through a front-end and back-end joint debugging mode, and displaying the decision process of the CNN network model through a webpage by utilizing a compiled front-end template.
8. An interpretable automatic commodity classification system, characterized in that the system is configured to execute the interpretable automatic commodity classification method of any one of claims 1 to 7.
CN202111026189.8A 2021-09-02 2021-09-02 Automatic commodity classification method and system with interpretation Active CN113673627B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111026189.8A CN113673627B (en) 2021-09-02 2021-09-02 Automatic commodity classification method and system with interpretation

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111026189.8A CN113673627B (en) 2021-09-02 2021-09-02 Automatic commodity classification method and system with interpretation

Publications (2)

Publication Number Publication Date
CN113673627A true CN113673627A (en) 2021-11-19
CN113673627B CN113673627B (en) 2024-02-13

Family

ID=78548306

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111026189.8A Active CN113673627B (en) 2021-09-02 2021-09-02 Automatic commodity classification method and system with interpretation

Country Status (1)

Country Link
CN (1) CN113673627B (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020247949A1 (en) * 2019-06-07 2020-12-10 The Regents Of The University Of California General form of the tree alternating optimization (tao) for learning decision trees
CN112491796A (en) * 2020-10-28 2021-03-12 北京工业大学 Intrusion detection and semantic decision tree quantitative interpretation method based on convolutional neural network
CN112738015A (en) * 2020-10-28 2021-04-30 北京工业大学 Multi-step attack detection method based on interpretable convolutional neural network CNN and graph detection
CN112434790A (en) * 2020-11-10 2021-03-02 西安理工大学 Self-interpretation method for convolutional neural network to judge partial black box problem
CN112784127A (en) * 2021-03-12 2021-05-11 清华大学 Multi-string pattern matching method and device, computer equipment and storage medium

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
ZHANG Yingfang; LING Weixin: "A GA-SVM multi-class binary tree method based on dynamic adjustment", Science Technology and Engineering, no. 07, pages 177-181 *
YAN Xin: "Research on pedestrian pose transfer methods using generative adversarial networks", professional master's degree thesis *

Also Published As

Publication number Publication date
CN113673627B (en) 2024-02-13

Similar Documents

Publication Publication Date Title
CN112308115B (en) Multi-label image deep learning classification method and equipment
Rahman et al. Discretization of continuous attributes through low frequency numerical values and attribute interdependency
CN112364352B (en) Method and system for detecting and recommending interpretable software loopholes
Patnaik et al. Intelligent and adaptive web data extraction system using convolutional and long short-term memory deep learning networks
CN113051914A (en) Enterprise hidden label extraction method and device based on multi-feature dynamic portrait
Reformat et al. Software quality analysis with the use of computational intelligence
CN110737805B (en) Method and device for processing graph model data and terminal equipment
CN112732921B (en) False user comment detection method and system
CN113254507B (en) Intelligent construction and inventory method for data asset directory
Ko et al. Architectural spatial layout planning using artificial intelligence
Lonij et al. Open-world visual recognition using knowledge graphs
CN116129286A (en) Method for classifying graphic neural network remote sensing images based on knowledge graph
CN116108191A (en) Deep learning model recommendation method based on knowledge graph
US11995573B2 (en) Artificial intelligence system providing interactive model interpretation and enhancement tools
Wigness et al. Efficient label collection for image datasets via hierarchical clustering
CN111126443A (en) Network representation learning method based on random walk
CN113673627B (en) Automatic commodity classification method and system with interpretation
CN115269816A (en) Core personnel mining method and device based on information processing method and storage medium
Daradkeh et al. Lifelong machine learning for topic modeling based on hellinger distance
CN114840717A (en) Digger data mining method and device, electronic equipment and readable storage medium
Huang et al. Community detection algorithm for social network based on node intimacy and graph embedding model
Yu et al. Workflow recommendation based on graph embedding
Peng et al. TH-SLP: Web Service Link Prediction Based on Topic-aware Heterogeneous Graph Neural Network
Yu et al. An image classification approach for painting using improved convolutional neural algorithm
Cook Learning context-aware representations of subtrees

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant