CN112818678A - Relationship reasoning method and system based on dependency relationship graph

Relationship reasoning method and system based on dependency relationship graph

Info

Publication number
CN112818678A
Authority
CN
China
Prior art keywords
dependency
relationship
word
tree
sentence
Prior art date
Legal status
Granted
Application number
CN202110205890.XA
Other languages
Chinese (zh)
Other versions
CN112818678B (en)
Inventor
张月国
蒋兴健
董莉莉
Current Assignee
Shanghai Jiaotong University
Original Assignee
Shanghai Jiaotong University
Priority date
Filing date
Publication date
Application filed by Shanghai Jiaotong University filed Critical Shanghai Jiaotong University
Priority to CN202110205890.XA
Publication of CN112818678A
Application granted
Publication of CN112818678B
Active legal status (current)
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 40/00 Handling natural language data
    • G06F 40/20 Natural language analysis
    • G06F 40/279 Recognition of textual entities
    • G06F 40/30 Semantic analysis
    • G06F 16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/90 Details of database functions independent of the retrieved data types
    • G06F 16/901 Indexing; Data structures therefor; Storage structures
    • G06F 16/9024 Graphs; Linked lists
    • G06F 16/9027 Trees
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/02 Neural networks
    • G06N 3/08 Learning methods
    • G06N 3/084 Backpropagation, e.g. using gradient descent

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Databases & Information Systems (AREA)
  • Software Systems (AREA)
  • General Health & Medical Sciences (AREA)
  • Computational Linguistics (AREA)
  • Health & Medical Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Data Mining & Analysis (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Evolutionary Computation (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • Mathematical Physics (AREA)
  • Machine Translation (AREA)

Abstract

The invention provides a relationship reasoning method and system based on a dependency graph. Word segmentation and word feature construction are performed on a given sentence pair using word sense features; a dependency tree is extracted from the segmented text by a dependency extractor; the dependencies serve as the basis for updating the word features, which are learned and updated in combination with a deep learning network; the plurality of updated word features obtained from the given sentence pair are taken as local features and fused into a global feature; the global feature is taken as the sentence-meaning feature, the two sentences interact, and the result is fed into an output layer to obtain an output, which is compared with the true label to calculate the loss function of the learning model; the learning model is then corrected according to the loss function result, and the target parameters of the learning model are determined. The expressiveness of the syntactic dependency tree for natural language inference is thereby effectively improved.

Description

Relationship reasoning method and system based on dependency relationship graph
Technical Field
The invention relates to the field of computer technology, and in particular to a relationship reasoning method and system based on a dependency graph.
Background
With the development of deep learning models, the main trend in natural language inference has been to use ever more complex network models to obtain the semantic information of sentences and determine the relationship between them. In conventional networks, however, positional information is captured directly from the sequence structure by long short-term memory (LSTM) networks; only surface positional information can be obtained this way, while the relative positions carried by deep sentence meaning cannot be captured. Deep learning networks that operate directly on syntactic dependency trees are generally based on complex tree-structured neural networks, whose training is slow and which often cannot incorporate information beyond a node's children.
Chinese patent document CN109902301A in the prior art discloses a relationship inference method based on a deep neural network. The syntactic-dependency-tree features used in that method exploit only the dependency features along paths in the tree and do not effectively incorporate the dependency relation types, so the extracted features are insufficient and lack specificity. Although LSTM and CNN are used to capture on-path features, off-path information cannot be captured, and the results are not accurate enough.
Disclosure of Invention
In view of the above defects in the prior art, the present invention aims to provide a relationship reasoning method and system based on a dependency graph.
The relationship reasoning method based on a dependency graph provided by the invention comprises the following steps:
a segmentation and construction step: obtaining a given sentence pair, and performing word segmentation and word feature construction on the given sentence pair using word sense features;
a dependency extraction step: obtaining, through a dependency extractor, the dependency tree extracted from the segmented text;
a feature updating step: taking the dependencies in the dependency tree as the basis for updating word features, and learning and updating the word features in the given sentence pair in combination with a deep learning network;
a feature fusion step: taking the plurality of updated word features obtained from the given sentence pair as local features, and fusing them to obtain a global feature;
a loss function calculation step: taking the global feature as the sentence-meaning feature, performing interaction between the two sentences, feeding the result into an output layer of the deep learning network to obtain an output, comparing the output with the true label, and calculating the loss function of the learning model;
a learning model correction step: correcting the learning model according to the loss function result, and determining the target parameters of the learning model.
Preferably, the feature updating step includes:
encoding each triple in the dependency tree, where a triple comprises a head word, a tail word, and a dependency relation;
processing the head word and the tail word with the dependency relation matrix according to the dependency relation to obtain a relation triple;
and aggregating the processed relation triples back into the head word and the tail word.
Preferably, the head word in the relation triple corresponds to the head node in the dependency tree, the tail word corresponds to the tail node in the dependency tree, and the dependency relation corresponds to one of the relation types in the dependency tree and is marked with a sequence number, the sequence number being that of the corresponding relation in the relation dictionary.
Preferably, the order of the head word and the tail word is encoded using a fully connected linear mapping.
Preferably, the aggregation of the relation triples is completed through a message passing process:
x_i = γ(x_i, Σ l)
where x_i denotes the feature of the i-th node, γ denotes the aggregation operation, Σ denotes aggregation over all triples related to the i-th node, and l is the information of a triple.
The invention also provides a relationship reasoning system based on a dependency graph, which comprises:
a segmentation and construction module: obtaining a given sentence pair, and performing word segmentation and word feature construction on the given sentence pair using word sense features;
a dependency extraction module: obtaining, through a dependency extractor, the dependency tree extracted from the segmented text;
a feature update module: taking the dependencies in the dependency tree as the basis for updating word features, and learning and updating the word features in the given sentence pair in combination with a deep learning network;
a feature fusion module: taking the plurality of updated word features obtained from the given sentence pair as local features, and fusing them to obtain a global feature;
a loss function calculation module: taking the global feature as the sentence-meaning feature, performing interaction between the two sentences, feeding the result into an output layer of the deep learning network to obtain an output, comparing the output with the true label, and calculating the loss function of the learning model;
a learning model correction module: correcting the learning model according to the loss function result, and determining the target parameters of the learning model.
Preferably, the feature update module:
encodes each triple in the dependency tree, where a triple comprises a head word, a tail word, and a dependency relation;
processes the head word and the tail word with the dependency relation matrix according to the dependency relation to obtain a relation triple;
and aggregates the processed relation triples back into the head word and the tail word.
Preferably, the head word in the relation triple corresponds to the head node in the dependency tree, the tail word corresponds to the tail node in the dependency tree, and the dependency relation corresponds to one of the relation types in the dependency tree and is marked with a sequence number, the sequence number being that of the corresponding relation in the relation dictionary.
Preferably, the order of the head word and the tail word is encoded using a fully connected linear mapping.
Preferably, the aggregation of the relation triples is completed through a message passing process:
x_i = γ(x_i, Σ l)
where x_i denotes the feature of the i-th node, γ denotes the aggregation operation, Σ denotes aggregation over all triples related to the i-th node, and l is the information of a triple.
Compared with the prior art, the invention has the following beneficial effects:
the invention represents the relation information in the syntactic dependency tree in the form of triples and, through an improved graph network algorithm, better incorporates the structural information of the tree. This effectively improves the expressiveness of the syntactic dependency tree for natural language inference and solves the problem that existing deep-network-based relationship reasoning methods make poor use of the information in the dependency tree, which leads to relatively poor reasoning results.
Drawings
Other features, objects and advantages of the invention will become more apparent upon reading the following detailed description of non-limiting embodiments with reference to the drawings:
FIG. 1 is a flow chart of a dependency graph-based relationship inference method;
FIG. 2 is a detailed flow chart of the word feature update method;
FIG. 3 is a diagram of syntactic dependencies;
FIG. 4 is a feature update aggregation schematic;
FIG. 5 is a diagram of the relationship inference apparatus;
FIG. 6 is a relational inference flow diagram.
Detailed Description
The present invention will be described in detail below with reference to specific embodiments. The following embodiments will assist those skilled in the art in further understanding the invention, but do not limit the invention in any way. It should be noted that various changes and modifications that would be obvious to those skilled in the art can be made without departing from the spirit of the invention; all such changes and modifications fall within the scope of the present invention.
The relationship inference method based on a dependency graph provided by the invention, as shown in FIG. 1, comprises the following steps:
Step 1: obtain a given sentence pair, perform word segmentation on it using word sense features, and construct features for the segmented words.
Step 2: obtain, through a dependency extractor, the dependency tree extracted from the segmented text.
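The patent does not name a concrete dependency extractor. The following is a minimal Python sketch of Step 2, assuming spaCy as the extractor and an English sentence in place of the embodiment's Chinese example; the model name "en_core_web_sm" and the on-the-fly relation dictionary are illustrative choices, not part of the patent:

    # Minimal sketch of Step 2 (assumed extractor: spaCy).
    import spacy

    nlp = spacy.load("en_core_web_sm")  # tokenizer + dependency parser

    def extract_triples(sentence):
        """Return (head word, tail word, relation id) triples plus the relation dictionary."""
        doc = nlp(sentence)
        rel_dict = {}   # relation name -> sequence number (the "relation dictionary")
        triples = []
        for token in doc:
            if token.head is token:      # skip the root's self-loop
                continue
            rel_id = rel_dict.setdefault(token.dep_, len(rel_dict))
            triples.append((token.head.text, token.text, rel_id))
        return triples, rel_dict

    triples, rel_dict = extract_triples("Natural language processing solves hard problems.")

Each returned triple corresponds to one directed edge of the dependency graph, with the relation marked by its sequence number in the relation dictionary, as described below.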
Step 3: take the dependencies in the dependency tree as the basis for updating word features, and learn and update the features of the words in the given sentence pair in combination with a deep learning network.
Step 4: take the plurality of updated word features obtained from the given sentence pair as local features and fuse them to obtain a global feature. The fusion method may be max pooling, average pooling, RNN sequence processing, or the like; RNN sequence processing is used as the example in this embodiment, and a sketch of the three options follows.
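As an illustration of these fusion options, the following PyTorch sketch computes a global feature from updated word features by max pooling, average pooling, or GRU sequence processing; the batch size and feature dimension are assumptions for illustration only:

    # Minimal sketch of Step 4: fusing local word features into a global feature.
    import torch
    import torch.nn as nn

    words = torch.randn(1, 12, 128)       # (batch, sentence length, word feature dim)

    max_pooled = words.max(dim=1).values  # max pooling             -> (1, 128)
    avg_pooled = words.mean(dim=1)        # average pooling         -> (1, 128)

    rnn = nn.GRU(input_size=128, hidden_size=128, batch_first=True)
    _, h_n = rnn(words)                   # RNN sequence processing
    global_feature = h_n[-1]              # final hidden state as the global feature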
Step 5: take the global feature as the sentence-meaning feature, perform interaction between the two sentences, feed the result into the output layer to obtain an output, compare the output with the true label, and calculate the loss function of the model.
Step 6: correct the learning model according to the loss function result, and determine the target parameters of the learning model to generate the trained model.
Step 3 is the main word feature update step; as shown in FIG. 2, it comprises the following sub-steps:
and 3.1, coding each triple in the dependency relationship graph, including a head word, a tail word and a dependency relationship.
As shown in fig. 3, the sentences in a given sentence pair can be divided into a dependency graph, where each word serves as a node in the graph, and the connection between the nodes represents the dependency between the nodes. In this embodiment, for each edge and two adjacent nodes on the edge in the dependency relationship graph, the two adjacent nodes can be encoded into a relationship triple, and the relationship triple mainly consists of a head word, a tail word and a dependency relationship. The corresponding relation of the head words depends on the head node in the tree, namely the starting point of the relation. The corresponding relation of the tail words depends on the tail nodes in the tree, namely the directional points of the relation. The dependency relationship correspondence relationship may be marked by a sequence number, where the corresponding sequence number is the sequence number of the correspondence relationship in the relationship dictionary. Specifically, in fig. 3, a relational triple associated with the word "natural language processing" is (natural language processing, solving, SBV), where the head word is natural language processing and the tail word is solving, and the relationship between the two is SBV, which corresponds to a sequence number in the relational dictionary.
Step 3.2: process the head word and the tail word with the dependency relation matrix according to the dependency relation.
Specifically, each triple in the dependency graph includes its head word and tail word, which carry different order information: the head word sits at the starting point of the directed relation edge and the tail word at its end point. The order of the two words could be encoded with a convolutional neural network, a long short-term memory network, or the like; this application prefers a fully connected linear mapping (FC), because a single fully connected layer contains fewer parameters while the concatenated word vectors retain their relative position information and represent the whole triple more smoothly. As shown in formula (1), W_r and b_r are the weight and bias parameters of the fully connected layer corresponding to the relation r, h denotes the head word, t denotes the tail word, and l is the feature vector output by the network that represents the triple information.
l = W_r(h, t) + b_r    (1)
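A minimal PyTorch sketch of formula (1) follows, with one fully connected layer per relation type; reading (h, t) as the concatenation of the two word vectors, and the feature dimension and relation-dictionary size, are assumptions consistent with but not dictated by the text:

    # Minimal sketch of formula (1): l = W_r(h, t) + b_r, one FC layer per relation r.
    import torch
    import torch.nn as nn

    dim, num_relations = 128, 40   # assumed feature dim and relation-dictionary size

    relation_fc = nn.ModuleList(   # one (W_r, b_r) pair per relation type r
        [nn.Linear(2 * dim, dim) for _ in range(num_relations)]
    )

    def encode_triple(h, t, r):
        """Encode one (head word, tail word, relation) triple into its feature vector l."""
        return relation_fc[r](torch.cat([h, t], dim=-1))

    l = encode_triple(torch.randn(dim), torch.randn(dim), r=3)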
Step 3.3: aggregate the processed relation triples back into the head word and the tail word.
Specifically, each obtained relation triple can be regarded as the information of one edge in the dependency graph. Every node in the graph is related to its adjacent edges, so aggregating the contents of the relation triples into their head and tail nodes helps those nodes better represent the information contained in the graph. In a graph convolutional network, this aggregation can be completed through a message passing process, as shown in formula (2), where x_i denotes the feature of the i-th node, γ denotes the aggregation operation, Σ denotes aggregation over all triples related to the i-th node, and l is the information of a triple.
x_i = γ(x_i, Σ l)    (2)
It should be noted that each edge in the dependency graph is associated with both its starting point and its end point. During aggregation, therefore, every relation edge incident to a node must be taken into account, regardless of whether the node is the edge's start node or end node. The aggregation operation could be a simple average, maximum, or sum, but this application prefers a sequence network, because its gating units can better decide which information in each triple matters for the word representation. The gated recurrent unit (GRU) retains the gating ability of a sequence network while using fewer parameters, so it fits the data faster and achieves better results; the GRU is therefore chosen to aggregate the relation triples connected to a single word.
As shown in FIG. 4, the actual aggregation operation of the GRU is as follows: the feature vector l of each dependency triple is regarded as the content connecting the head word and the tail word of that triple. For example, the triples related to the word "is" include (is, solve, VOB), (problem, is, SBV), and (core, is, VOB). These are fed into the GRU as a sequence whose order can be regarded as having no influence on the aggregation; starting from the initial feature representation of "is", after the inputs are processed, a new representation of "is" is obtained in which all the related triple representations have been aggregated.
As shown in FIG. 5, a relationship inference apparatus includes:
an embedding module, configured to acquire the sample set and, for each sample, apply the selected word segmenter and word feature construction to determine the word features of each sample text;
an encoding module, configured to build the dependency graph of a sentence, combine each word feature with the dependency triples related to it to obtain the corresponding relation features, and aggregate them to obtain the updated word features;
a fusion module, configured to fuse the local word features of the different words of a sample to obtain the global feature of the corresponding sample sentence;
an interaction module, configured to combine the two global features corresponding to the two sample sentences to obtain an interaction vector;
and a generating module, configured to determine, from the interaction vector obtained after interaction, the relationship between the two sample sentences, and to construct the loss function of the learning model from the difference between the predicted relationship and the actual relationship of the two sample sentences.
The specific operation is shown in FIG. 6:
In the initial stage, a pair of sample sentences is given, where sentence one is "what the researcher can process when researching natural language processing" and sentence two is "what natural language processing can solve is the core concern of researchers". Because the two sentences of a sample pair are processed in exactly the same way, the processing of sentence one is taken as the example. The dependency tree of sentence one is obtained through the word segmenter and the dependency analyzer; each node of the tree is converted into a node of the graph and each dependency edge of the tree into a directed edge of a directed graph, so the dependency tree is converted into a dependency graph. As shown in FIG. 3, each dependency in the graph can be represented as a dependency triple comprising a head node, a tail node, and a dependency relation. The triple is constructed by linear transformation: the head node and the tail node are concatenated and then transformed by the dependency relation matrix chosen according to the relation type. The transformed dependency triple contains the implicit information carried by each edge; for each node, the implicit information related to it helps the node express new semantics in the sequence, so every dependency connected to a node is aggregated into that node by the aggregation operation to form its new feature representation. As shown in FIG. 4, the information of each relation triple is treated as one element of a sequence, a gated recurrent unit completes the aggregation, and the final state is taken as the new feature representation. The new representation of each node serves as a local feature of the sequence; a recurrent network extracts features from these local features, and an attention mechanism strengthens the fusion of information across the whole sequence to obtain a sentence-meaning vector containing the content of the whole sentence. Through the interaction module, the sentence-meaning vectors of the given sample sentence pair interact and are mapped into a high-dimensional interaction space, yielding an interaction vector that contains the interaction information. The interaction vector is used as the input of the generating module: a multilayer perceptron captures the key information in the interaction vector from all aspects and outputs a relation vector over the relation categories, the maximum of which is taken as the final predicted category. Loss is then calculated between this output and the one-hot vector of the correct category using the selected loss function, backpropagation propagates the gradients along the transmission path of the network, and the parameters of the model are updated. This completes the training of the model parameters in FIG. 5, improves the predictive ability of the model, and allows the prediction task to be completed better.
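The patent specifies interaction, a multilayer perceptron, a loss against the one-hot correct category, and backpropagation, but gives no formulas for them. The following sketch fills that gap under stated assumptions: the interaction heuristic [v1; v2; v1 - v2; v1 * v2], the MLP sizes, and cross-entropy loss are all illustrative choices, not taken from the patent:

    # Minimal sketch of the interaction module, generating module, and one training step.
    import torch
    import torch.nn as nn

    dim, num_classes = 128, 3
    mlp = nn.Sequential(nn.Linear(4 * dim, dim), nn.ReLU(), nn.Linear(dim, num_classes))
    optimizer = torch.optim.Adam(mlp.parameters(), lr=1e-3)
    criterion = nn.CrossEntropyLoss()  # class-index form of the one-hot comparison

    v1 = torch.randn(8, dim)           # sentence-meaning vectors of sentence one
    v2 = torch.randn(8, dim)           # sentence-meaning vectors of sentence two
    labels = torch.randint(0, num_classes, (8,))  # correct relation categories

    interaction = torch.cat([v1, v2, v1 - v2, v1 * v2], dim=-1)  # interaction vector
    logits = mlp(interaction)          # relation vector over the relation categories
    pred = logits.argmax(dim=-1)       # maximum taken as the final predicted category

    loss = criterion(logits, labels)   # loss against the correct category
    loss.backward()                    # backpropagation along the network
    optimizer.step()                   # update the model parameters
    optimizer.zero_grad()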
Those skilled in the art will appreciate that, in addition to being implemented as pure computer-readable program code, the system and its various devices, modules and units provided by the present invention can be implemented entirely by logically programming the method steps in the form of logic gates, switches, application-specific integrated circuits, programmable logic controllers, embedded microcontrollers and the like. Therefore, the system and its various devices, modules and units can be regarded as a hardware component; the devices, modules and units included in it for realizing various functions can be regarded as structures within the hardware component; and the means, modules and units for performing the various functions can also be regarded as both software modules for performing the method and structures within the hardware component.
In the description of the present application, it is to be understood that the terms "upper", "lower", "front", "rear", "left", "right", "vertical", "horizontal", "top", "bottom", "inner", "outer", and the like indicate orientations or positional relationships based on those shown in the drawings, and are only for convenience in describing the present application and simplifying the description, but do not indicate or imply that the referred device or element must have a specific orientation, be constructed in a specific orientation, and be operated, and thus, should not be construed as limiting the present application.
The foregoing description of specific embodiments of the present invention has been presented. It is to be understood that the present invention is not limited to the specific embodiments described above, and that various changes or modifications may be made by one skilled in the art within the scope of the appended claims without departing from the spirit of the invention. The embodiments and features of the embodiments of the present application may be combined with each other arbitrarily without conflict.

Claims (10)

1. A relationship inference method based on a dependency graph, characterized by comprising the following steps:
a segmentation and construction step: obtaining a given sentence pair, and performing word segmentation and word feature construction on the given sentence pair using word sense features;
a dependency extraction step: obtaining, through a dependency extractor, the dependency tree extracted from the segmented text;
a feature updating step: taking the dependencies in the dependency tree as the basis for updating word features, and learning and updating the word features in the given sentence pair in combination with a deep learning network;
a feature fusion step: taking the plurality of updated word features obtained from the given sentence pair as local features, and fusing them to obtain a global feature;
a loss function calculation step: taking the global feature as the sentence-meaning feature, performing interaction between the two sentences, feeding the result into an output layer of the deep learning network to obtain an output, comparing the output with the true label, and calculating the loss function of the learning model;
a learning model correction step: correcting the learning model according to the loss function result, and determining the target parameters of the learning model.
2. The dependency-graph-based relationship inference method according to claim 1, wherein the feature updating step comprises:
encoding each triple in the dependency tree, where a triple comprises a head word, a tail word, and a dependency relation;
processing the head word and the tail word with the dependency relation matrix according to the dependency relation to obtain a relation triple;
and aggregating the processed relation triples back into the head word and the tail word.
3. The dependency-graph-based relationship inference method according to claim 2, wherein the head word in the relation triple corresponds to the head node in the dependency tree, the tail word corresponds to the tail node in the dependency tree, and the dependency relation corresponds to one of the relation types in the dependency tree and is marked with a sequence number, the sequence number being that of the corresponding relation in the relation dictionary.
4. The dependency-graph-based relationship inference method according to claim 2, wherein the order of the head word and the tail word is encoded using a fully connected linear mapping.
5. The dependency-graph-based relationship inference method according to claim 2, wherein the aggregation of the relation triples is completed through a message passing process:
x_i = γ(x_i, Σ l)
where x_i denotes the feature of the i-th node, γ denotes the aggregation operation, Σ denotes aggregation over all triples related to the i-th node, and l is the information of a triple.
6. A relationship inference system based on a dependency graph, characterized by comprising:
a segmentation and construction module: obtaining a given sentence pair, and performing word segmentation and word feature construction on the given sentence pair using word sense features;
a dependency extraction module: obtaining, through a dependency extractor, the dependency tree extracted from the segmented text;
a feature update module: taking the dependencies in the dependency tree as the basis for updating word features, and learning and updating the word features in the given sentence pair in combination with a deep learning network;
a feature fusion module: taking the plurality of updated word features obtained from the given sentence pair as local features, and fusing them to obtain a global feature;
a loss function calculation module: taking the global feature as the sentence-meaning feature, performing interaction between the two sentences, feeding the result into an output layer of the deep learning network to obtain an output, comparing the output with the true label, and calculating the loss function of the learning model;
a learning model correction module: correcting the learning model according to the loss function result, and determining the target parameters of the learning model.
7. The dependency-graph-based relationship inference system according to claim 6, wherein the feature update module:
encodes each triple in the dependency tree, where a triple comprises a head word, a tail word, and a dependency relation;
processes the head word and the tail word with the dependency relation matrix according to the dependency relation to obtain a relation triple;
and aggregates the processed relation triples back into the head word and the tail word.
8. The dependency-graph-based relationship inference system according to claim 7, wherein the head word in the relation triple corresponds to the head node in the dependency tree, the tail word corresponds to the tail node in the dependency tree, and the dependency relation corresponds to one of the relation types in the dependency tree and is marked with a sequence number, the sequence number being that of the corresponding relation in the relation dictionary.
9. The dependency-graph-based relationship inference system according to claim 7, wherein the order of the head word and the tail word is encoded using a fully connected linear mapping.
10. The dependency-graph-based relationship inference system according to claim 7, wherein the aggregation of the relation triples is completed through a message passing process:
x_i = γ(x_i, Σ l)
where x_i denotes the feature of the i-th node, γ denotes the aggregation operation, Σ denotes aggregation over all triples related to the i-th node, and l is the information of a triple.
CN202110205890.XA (priority and filing date 2021-02-24) Dependency relationship graph-based relationship reasoning method and system. Status: Active. Granted as CN112818678B.

Priority Applications (1)

Application Number | Priority Date | Filing Date | Title
CN202110205890.XA | 2021-02-24 | 2021-02-24 | Dependency relationship graph-based relationship reasoning method and system (granted as CN112818678B)

Applications Claiming Priority (1)

Application Number | Priority Date | Filing Date | Title
CN202110205890.XA | 2021-02-24 | 2021-02-24 | Dependency relationship graph-based relationship reasoning method and system (granted as CN112818678B)

Publications (2)

Publication Number | Publication Date
CN112818678A | 2021-05-18
CN112818678B | 2022-10-28

Family

ID=75865359

Family Applications (1)

Application Number | Title | Priority Date | Filing Date
CN202110205890.XA (Active, granted as CN112818678B) | Dependency relationship graph-based relationship reasoning method and system | 2021-02-24 | 2021-02-24

Country Status (1)

Country Link
CN (1) CN112818678B (en)


Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108446769A (en) * 2018-01-23 2018-08-24 深圳市阿西莫夫科技有限公司 Knowledge mapping relation inference method, apparatus, computer equipment and storage medium
CN108733792A (en) * 2018-05-14 2018-11-02 北京大学深圳研究生院 A kind of entity relation extraction method
CN108763376A (en) * 2018-05-18 2018-11-06 浙江大学 Syncretic relation path, type, the representation of knowledge learning method of entity description information
CN109284361A (en) * 2018-09-29 2019-01-29 深圳追科技有限公司 A kind of entity abstracting method and system based on deep learning
CN109902301A (en) * 2019-02-26 2019-06-18 广东工业大学 Relation inference method, device and equipment based on deep neural network
CN110968660A (en) * 2019-12-09 2020-04-07 四川长虹电器股份有限公司 Information extraction method and system based on joint training model
CN111026875A (en) * 2019-11-26 2020-04-17 中国人民大学 Knowledge graph complementing method based on entity description and relation path
CN111325243A (en) * 2020-02-03 2020-06-23 天津大学 Visual relation detection method based on regional attention learning mechanism
US20210012061A1 (en) * 2019-07-12 2021-01-14 Nec Laboratories America, Inc. Supervised cross-modal retrieval for time-series and text using multimodal triplet loss


Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114327473A (en) * 2021-12-15 2022-04-12 中电信数智科技有限公司 Software package dependency relationship detection method
CN114327473B (en) * 2021-12-15 2022-09-06 中电信数智科技有限公司 Software package dependency relationship detection method
CN115150152A (en) * 2022-06-30 2022-10-04 中国人民解放军陆军工程大学 Method for rapidly reasoning actual authority of network user based on authority dependency graph reduction
CN115150152B (en) * 2022-06-30 2024-04-26 中国人民解放军陆军工程大学 Network user actual authority quick reasoning method based on authority dependency graph reduction

Also Published As

Publication number Publication date
CN112818678B (en) 2022-10-28

Similar Documents

Publication Publication Date Title
US20220051056A1 (en) Semantic segmentation network structure generation method and apparatus, device, and storage medium
Hui et al. Linguistic structure guided context modeling for referring image segmentation
US20230025317A1 (en) Text classification model training method, text classification method, apparatus, device, storage medium and computer program product
US9928040B2 (en) Source code generation, completion, checking, correction
JP2024500182A (en) Explainable transducer transformer
WO2019205318A1 (en) Public opinion information classification method and apparatus, computer device, and storage medium
WO2021174774A1 (en) Neural network relationship extraction method, computer device, and readable storage medium
CN112818678B (en) Dependency relationship graph-based relationship reasoning method and system
US20220138185A1 (en) Scene graph modification based on natural language commands
US20220222447A1 (en) Translation method and apparatus, electronic device, and computer-readable storage medium
WO2022041015A1 (en) Neural network model optimisation method and apparatus
CN115470232A (en) Model training and data query method and device, electronic equipment and storage medium
CN117540221B (en) Image processing method and device, storage medium and electronic equipment
CN114818707A (en) Automatic driving decision method and system based on knowledge graph
US20240086158A1 (en) Assisted composition of quantum algorithms
US20240046127A1 (en) Dynamic causal discovery in imitation learning
CN111159424B (en) Method and device for labeling knowledge graph entity, storage medium and electronic equipment
CN116541020A (en) Code generation method, device, equipment, medium and product based on field model
CN116975743A (en) Industry information classification method, device, computer equipment and storage medium
CN116644180A (en) Training method and training system for text matching model and text label determining method
CN116974554A (en) Code data processing method, apparatus, computer device and storage medium
CN115204171A (en) Document-level event extraction method and system based on hypergraph neural network
CN113268599B (en) Training method and device for file classification model, computer equipment and storage medium
CN116167361A (en) Text error correction method, apparatus, device, computer storage medium, and program product
CN112650861A (en) Personality prediction method, system and device based on task layering

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant