CN112818678B - Dependency relationship graph-based relationship reasoning method and system


Info

Publication number
CN112818678B
Authority
CN
China
Prior art keywords
dependency
word
relationship
tree
relation
Prior art date
Legal status
Active
Application number
CN202110205890.XA
Other languages
Chinese (zh)
Other versions
CN112818678A (en)
Inventor
张月国
蒋兴健
董莉莉
Current Assignee
Shanghai Jiaotong University
Original Assignee
Shanghai Jiaotong University
Priority date
Filing date
Publication date
Application filed by Shanghai Jiaotong University
Priority to CN202110205890.XA
Publication of CN112818678A
Application granted
Publication of CN112818678B


Classifications

    • G06F40/279 Handling natural language data; Natural language analysis; Recognition of textual entities
    • G06F16/9024 Information retrieval; Indexing; Data structures therefor; Graphs; Linked lists
    • G06F16/9027 Information retrieval; Indexing; Data structures therefor; Trees
    • G06F40/30 Handling natural language data; Semantic analysis
    • G06N3/084 Neural networks; Learning methods; Backpropagation, e.g. using gradient descent


Abstract

The invention provides a relationship reasoning method and system based on a dependency graph. A given sentence pair is divided into words, and word features are constructed from word-sense features; a dependency tree is extracted from the divided text by a dependency extractor; the dependency relations serve as the basis for updating the word features, which are learned and updated with a deep learning network; the multiple updated word features obtained from the given sentence pair are taken as local features and fused into global features; the global features, taken as sentence-meaning features, interact between the two sentences and are fed into an output layer, whose output is compared with the true label to calculate the loss function of the learning model; the learning model is corrected according to the calculated loss, and the target parameters of the learning model are determined. The method effectively improves how well the syntactic dependency tree is expressed in natural language inference.

Description

Dependency relationship graph-based relationship reasoning method and system
Technical Field
The invention relates to the field of computers, and in particular to a relationship reasoning method and system based on a dependency graph.
Background
With the continuous development of deep learning models, the main trend in natural language inference is to use ever more complex network models to obtain the semantic information of sentences and determine the relationship between them. In conventional networks, however, positional information is captured directly from the sequence structure by long short-term memory (LSTM) networks; only surface positional information can be obtained, while relative positions at the deeper semantic level cannot be captured. Deep learning networks that operate directly on syntactic dependency trees are generally based on complex tree-structured neural networks, whose training is slow and which often cannot incorporate information beyond child nodes.
For example, Chinese patent CN109902301A discloses a relationship inference method based on a deep neural network. The features it draws from the syntactic dependency tree use only the dependency features on paths, without effectively combining them with the dependency relation types in the tree, so the extracted features are insufficient and unspecific. Although LSTM and CNN are used to capture on-path features, some off-path information cannot be captured, so the method is not accurate enough.
Disclosure of Invention
To address the above shortcomings of the prior art, the invention aims to provide a relationship inference method and system based on a dependency graph.
The dependency-graph-based relationship reasoning method provided by the invention comprises the following steps:
a dividing and constructing step: obtaining a given sentence pair, dividing it into words, and constructing word features using word-sense features;
a dependency extraction step: obtaining a dependency tree extracted from the divided text by a dependency extractor;
a feature updating step: taking the dependency relations in the dependency tree as the basis for updating word features, and learning and updating the word features in the given sentence pair with a deep learning network;
a feature fusion step: taking the multiple updated word features obtained from the given sentence pair as local features, and fusing them to obtain global features;
a loss function calculation step: taking the global features as sentence-meaning features, performing interaction between the two sentences, feeding the result into the output layer of the deep learning network to obtain an output, comparing the output with the true label, and calculating the loss function of the learning model;
a learning model correction step: correcting the learning model according to the calculated loss and determining the target parameters of the learning model.
Preferably, the feature updating step comprises:
encoding each triple in the dependency tree, a triple comprising a head word, a tail word and a dependency relation;
processing the head and tail words with the dependency relation matrix corresponding to the dependency relation to obtain relation triples;
and aggregating the processed relation triples back into the head and tail words.
Preferably, the head word in a relation triple corresponds to the head node in the dependency tree, the tail word corresponds to the tail node, and the dependency relation corresponds to a relation type in the tree; the relation is marked with a sequence number, namely its index in the relation dictionary.
Preferably, the order of the head and tail words is encoded by a fully-connected-layer linear mapping.
Preferably, the aggregation of the relation triples is completed by a message passing process:

x_i = γ(x_i, Σ l)

where x_i denotes the feature of the i-th node, γ denotes the aggregation operation, Σ denotes aggregation over all triples involving the i-th node, and l is the encoded information of each such triple.
The invention also provides a dependency-graph-based relationship reasoning system, comprising:
a dividing and constructing module: obtaining a given sentence pair, dividing it into words, and constructing word features using word-sense features;
a dependency extraction module: obtaining a dependency tree extracted from the divided text by a dependency extractor;
a feature updating module: taking the dependency relations in the dependency tree as the basis for updating word features, and learning and updating the word features in the given sentence pair with a deep learning network;
a feature fusion module: taking the multiple updated word features obtained from the given sentence pair as local features, and fusing them to obtain global features;
a loss function calculation module: taking the global features as sentence-meaning features, performing interaction between the two sentences, feeding the result into the output layer of the deep learning network to obtain an output, comparing the output with the true label, and calculating the loss function of the learning model;
a learning model correction module: correcting the learning model according to the calculated loss and determining the target parameters of the learning model.
Preferably, the feature updating module comprises:
encoding each triple in the dependency tree, a triple comprising a head word, a tail word and a dependency relation;
processing the head and tail words with the dependency relation matrix corresponding to the dependency relation to obtain relation triples;
and aggregating the processed relation triples back into the head and tail words.
Preferably, the head word in a relation triple corresponds to the head node in the dependency tree, the tail word corresponds to the tail node, and the dependency relation corresponds to a relation type in the tree; the relation is marked with a sequence number, namely its index in the relation dictionary.
Preferably, the order of the head and tail words is encoded by a fully-connected-layer linear mapping.
Preferably, the aggregation of the relation triples is completed by a message passing process:

x_i = γ(x_i, Σ l)

where x_i denotes the feature of the i-th node, γ denotes the aggregation operation, Σ denotes aggregation over all triples involving the i-th node, and l is the encoded information of each such triple.
Compared with the prior art, the invention has the following beneficial effects:
the invention represents the relation information in the syntactic dependency tree by using the form of the triple, and better combines the structure information in the syntactic dependency tree by using the graph network improvement algorithm, thereby effectively improving the expression of the syntactic dependency tree on natural language reasoning and solving the problem that the existing deep network-based relation reasoning method does not reasonably use the information in the dependency relationship tree, which causes the relatively poor relation reasoning result.
Drawings
Other features, objects and advantages of the invention will become more apparent upon reading the detailed description of non-limiting embodiments with reference to the following drawings:
FIG. 1 is a flow chart of the dependency-graph-based relationship inference method;
FIG. 2 is a detailed flow chart of the word feature updating method;
FIG. 3 is a diagram of syntactic dependencies;
FIG. 4 is a schematic diagram of feature update aggregation;
FIG. 5 is a diagram of the relationship inference apparatus;
FIG. 6 is a flow chart of relationship inference.
Detailed Description
The present invention will be described in detail below with reference to specific embodiments. The following embodiments will assist those skilled in the art in further understanding the invention, but do not limit it in any way. It should be noted that several changes and improvements can be made by those skilled in the art without departing from the concept of the invention, and these all fall within the scope of the invention.
The relationship inference method based on a dependency graph provided by the invention, as shown in FIG. 1, comprises the following steps:
step 1: and obtaining a given sentence pair, carrying out word division on the given sentence pair by using the word sense characteristics and constructing the word characteristics after division.
Step 2: obtain a dependency tree extracted from the divided text by a dependency extractor.
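The patent leaves the choice of dependency extractor open. As a hedged example, spaCy's parser is one possible stand-in (LTP or Stanza would serve equally well); the model name below is an assumption:

```python
# Illustrative stand-in for the dependency extractor, not the
# extractor the patent prescribes.
import spacy

nlp = spacy.load("en_core_web_sm")   # assumed model name

def extract_triples(sentence):
    """Return (head word, tail word, dependency relation) triples."""
    doc = nlp(sentence)
    return [(tok.head.text, tok.text, tok.dep_)
            for tok in doc if tok.dep_ != "ROOT"]

print(extract_triples("Researchers study natural language processing."))
```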
Step 3: take the dependency relations in the dependency tree as the basis for updating word features, and learn and update the features of the words in the given sentence pair with a deep learning network.
Step 4: take the multiple updated word features obtained from the given sentence pair as local features and fuse them to obtain global features. Fusion methods include max pooling, average pooling, RNN sequence processing and so on; this embodiment takes RNN sequence processing as the example.
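A rough sketch of this fusion step (all dimensions are assumptions): the updated word features are run through a GRU-based RNN whose final hidden state serves as the global sentence feature; max or average pooling would be drop-in alternatives.

```python
import torch
import torch.nn as nn

d_word, d_sent = 128, 256    # assumed feature sizes

class RnnFusion(nn.Module):
    """Fuse local (word) features into one global (sentence) feature."""
    def __init__(self):
        super().__init__()
        self.rnn = nn.GRU(d_word, d_sent, batch_first=True)

    def forward(self, word_feats):       # (batch, num_words, d_word)
        _, h_n = self.rnn(word_feats)    # final hidden state
        return h_n.squeeze(0)            # (batch, d_sent)

global_feat = RnnFusion()(torch.randn(1, 7, d_word))
```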
Step 5: take the global features as sentence-meaning features, perform interaction between the two sentences, feed the result into the output layer to obtain an output, compare the output with the true label, and calculate the loss function of the model.
Step 6: correct the learning model according to the calculated loss, and determine the target parameters of the learning model to generate the final model.
Step 3 is the main word feature updating step and, as shown in FIG. 2, comprises the following sub-steps:
and 3.1, coding each triple in the dependency graph, including a head word, a tail word and a dependency relationship.
As shown in FIG. 3, each sentence in a given sentence pair can be converted into a dependency graph, where each word serves as a node and the connections between nodes represent their dependency relations. In this embodiment, each edge and its two adjacent nodes in the dependency graph are encoded as a relation triple consisting of a head word, a tail word and a dependency relation. The head word corresponds to the head node in the dependency tree, i.e. the starting point of the relation. The tail word corresponds to the tail node, i.e. the point the relation points to. The dependency relation is marked with a sequence number, namely its index in the relation dictionary. Specifically, in FIG. 3, a relation triple associated with the word "natural language processing" is (natural language processing, solving, SBV), where the head word is "natural language processing", the tail word is "solving", and the relation between the two is SBV, which corresponds to a sequence number in the relation dictionary.
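A minimal sketch of this encoding, assuming a toy relation dictionary (the real label set comes from whichever dependency parser is used):

```python
# Toy relation dictionary; the actual label set depends on the parser.
relations = ["SBV", "VOB", "ATT", "ADV", "HED"]
rel_dict = {r: i for i, r in enumerate(relations)}

# A triple stores the head word, the tail word, and the relation's
# sequence number in the relation dictionary.
triple = ("natural language processing", "solving", rel_dict["SBV"])
print(triple)   # ('natural language processing', 'solving', 0)
```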
Step 3.2: process the head and tail words with the dependency relation matrix corresponding to the dependency relation.
Specifically, each triple in the dependency graph contains a head word and a tail word that carry different order information: the head word sits at the starting point of the directed edge and the tail word at its end point. A convolutional neural network, a long short-term memory network or the like may be used to encode the order of the two words; formula (1) shows the linear case, where h and t are the features of the head and tail words, W_r is the dependency relation matrix of relation r, and b_r is the corresponding bias:
l = W_r(h, t) + b_r (1)
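A sketch of formula (1) under the fully-connected-layer reading, with one weight matrix per relation type; the feature size and the number of relations below are assumptions:

```python
import torch
import torch.nn as nn

d, num_rels = 128, 40        # assumed feature size and relation count

# One linear map W_r (with bias b_r) per dependency relation type.
edge_maps = nn.ModuleList(nn.Linear(2 * d, d) for _ in range(num_rels))

def encode_triple(h, t, r):
    """l = W_r(h, t) + b_r: concatenate head and tail features and
    transform them with the matrix of relation r."""
    return edge_maps[r](torch.cat([h, t], dim=-1))

l = encode_triple(torch.randn(d), torch.randn(d), r=3)
```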
Step 3.3: aggregate the processed relation triples back into the head and tail words.
Specifically, the resulting relation triple can be regarded as the information of one edge in the dependency graph. Each node in the graph is related to its adjacent edges, so aggregating the contents of the relation triples into their head and tail nodes helps those nodes better represent the information contained in the graph. In a graph convolution network, the aggregation of relation triples can be completed by a message passing process, as shown in formula (2), where x_i denotes the feature of the i-th node, γ denotes the aggregation operation, Σ denotes aggregation over all triples involving the i-th node, and l is the encoded information of each such triple:

x_i = γ(x_i, Σ l) (2)
It should be noted that each edge in the dependency graph is associated with both its starting point and its end point. During aggregation, therefore, every edge incident to a node should influence that node and be included in the aggregation operation, whether the node is the edge's start or end. The aggregation operation may use simple averaging, maximum or summation, but this application prefers a sequence network, because its gating units can better decide how much of the information in each triple should control the word representation. The gated recurrent unit (GRU) provides the gating capability of a sequence network with fewer parameters, so it fits the data faster and achieves a better effect; the GRU is therefore chosen to aggregate the relation triples connected to a single word.
As shown in FIG. 4, the actual aggregation operation of the GRU is as follows:
regarding the feature vector l of each dependency relationship triplet as the content connected with the head word and the tail word in the triplet relationship, for example, the triplet related to "yes" includes: (is, solve, VOB), (problem, is,SBV) (core, is,VOB). The GRU sequence is used as an input in the GRU sequence, the sequence of the GRU sequence can be regarded as having no influence on the aggregation operation, the characteristic representation with the initial input of 'yes' can be obtained, and after the input processing, a new representation 'yes' with all related triple representations aggregated can be obtained.
As shown in FIG. 5, the relationship inference apparatus comprises:
an embedding module, used to obtain a sample word feature set, applying the selected word divider and word feature construction to each sample in the sample set to determine the word features of each sample text;
an encoding module, used to build the dependency graph of a sentence, combine each word feature with its related dependency triples to obtain relation features, and aggregate them into updated word features;
a fusion module, used to fuse the local word features of the words in a sample into the global feature of the sample sentence;
an interaction module, used to combine the global features of the two sample sentences into an interaction vector;
and a generating module, used to predict the relationship of the two sample sentences from the interaction vector obtained after interaction, and to construct the loss function of the learning model from the difference between the predicted relationship and the actual relationship of the two sample sentences.
The specific operation is shown in FIG. 6:
In the initial stage, a pair of sample sentences is given: sentence one is "what the researcher can process when researching natural language processing" and sentence two is "what natural language processing can solve is the core concern of researchers". Since the two sentences are processed in exactly the same way, sentence one is taken as the example. Its dependency tree is obtained by the word divider and the dependency analyzer; each node of the tree becomes a node of the graph and each dependency edge becomes a directed edge, so the dependency tree is converted into a dependency graph.

As shown in FIG. 3, each dependency relation in the graph can be represented as a dependency triple of a head node, a tail node and a dependency relation. The triples are constructed by linear transformation: the head node and the tail node are concatenated and then transformed by the dependency relation matrix corresponding to the relation type. The converted triple contains the implicit information carried by its edge; for each node, this implicit information helps it express new semantics in the sequence, so every dependency relation connected to a node is aggregated into that node as its new feature representation. As shown in FIG. 4, the information of each relation triple is treated as one element of a sequence, a gated recurrent unit completes the aggregation, and its final state is used as the node's new feature representation.

The new feature representation of each node serves as a local feature of the sequence; a recurrent network extracts features from these local features, and an attention mechanism strengthens the fusion of information across the whole sequence, yielding a sentence-meaning vector that covers the whole sentence. Through the interaction module, the contents of the sentence-meaning vectors of the sample sentence pair interact and are mapped into a high-dimensional interaction space to obtain an interaction vector carrying the interaction information. The interaction vector is then the input of the generating module: a multilayer perceptron captures its key information comprehensively and outputs a relation vector over the relation categories, whose maximum entry is taken as the final predicted category. The loss is computed with the selected loss function and the one-hot vector of the correct category, propagated backwards along the network by backpropagation, and the parameters of the model are updated. This completes the training of the parameters of the model in FIG. 5, improves the model's predictive ability, and allows it to better complete the prediction task. A minimal sketch of the interaction and output stage follows.
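In this hedged sketch, the concat/difference/product interaction features are a common choice in sentence-pair models, not one the patent fixes, and all sizes are assumptions; the backward pass illustrates the loss-driven parameter update described above:

```python
import torch
import torch.nn as nn

d_sent, d_inter, num_classes = 256, 512, 3   # assumed sizes

class InteractAndClassify(nn.Module):
    """Map two sentence-meaning vectors to relation-category logits."""
    def __init__(self):
        super().__init__()
        # Interaction: project combined features to a higher-dim space.
        self.proj = nn.Linear(4 * d_sent, d_inter)
        # Generating module: a small multilayer perceptron.
        self.mlp = nn.Sequential(nn.ReLU(), nn.Linear(d_inter, num_classes))

    def forward(self, s1, s2):
        v = torch.cat([s1, s2, s1 - s2, s1 * s2], dim=-1)
        return self.mlp(self.proj(v))

model = InteractAndClassify()
logits = model(torch.randn(2, d_sent), torch.randn(2, d_sent))
pred = logits.argmax(dim=-1)        # maximum entry = predicted category
loss = nn.CrossEntropyLoss()(logits, torch.tensor([0, 2]))
loss.backward()                     # backpropagation updates the parameters
```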
Those skilled in the art know that, besides implementing the system and its devices, modules and units provided by the invention purely as computer-readable program code, the same functions can be implemented entirely in hardware by logically programming the method steps in the form of logic gates, switches, application-specific integrated circuits, programmable logic controllers, embedded microcontrollers and the like. Therefore, the system and its devices, modules and units provided by the invention can be regarded as a hardware component, and the devices, modules and units included for realizing various functions can be regarded as structures within that hardware component; they can also be regarded as both software modules for implementing the method and structures within the hardware component.
In the description of the present application, it is to be understood that the terms "upper", "lower", "front", "rear", "left", "right", "vertical", "horizontal", "top", "bottom", "inner", "outer", and the like indicate orientations or positional relationships based on those shown in the drawings, and are only for convenience in describing the present application and simplifying the description, but do not indicate or imply that the referred device or element must have a specific orientation, be constructed in a specific orientation, and be operated, and thus, should not be construed as limiting the present application.
The foregoing description has described specific embodiments of the present invention. It is to be understood that the present invention is not limited to the specific embodiments described above, and that various changes or modifications may be made by one skilled in the art within the scope of the appended claims without departing from the spirit of the invention. The embodiments and features of the embodiments of the present application may be combined with each other arbitrarily without conflict.

Claims (2)

1. A relationship inference method based on a dependency graph, characterized by comprising the following steps:
a dividing and constructing step: obtaining a given sentence pair, dividing it into words, and constructing word features using word-sense features;
a dependency extraction step: obtaining a dependency tree extracted from the divided text by a dependency extractor;
a feature updating step: taking the dependency relations in the dependency tree as the basis for updating word features, and learning and updating the word features in the given sentence pair with a deep learning network;
a feature fusion step: taking the multiple updated word features obtained from the given sentence pair as local features, and fusing them to obtain global features;
a loss function calculation step: taking the global features as sentence-meaning features, performing interaction between the two sentences, feeding the result into the output layer of the deep learning network to obtain an output, comparing the output with the true label, and calculating the loss function of the learning model;
a learning model correction step: correcting the learning model according to the calculated loss and determining the target parameters of the learning model;
the feature updating step comprising:
encoding each triple in the dependency tree, a triple comprising a head word, a tail word and a dependency relation;
processing the head and tail words with the dependency relation matrix corresponding to the dependency relation to obtain relation triples;
aggregating the processed relation triples back into the head and tail words;
wherein the head word in a relation triple corresponds to the head node in the dependency tree, the tail word corresponds to the tail node, and the dependency relation corresponds to a relation type in the tree; the relation is marked with a sequence number, namely its index in the relation dictionary;
the order of the head and tail words is encoded by a fully-connected-layer linear mapping;
and the aggregation of the relation triples is completed by a message passing process:

x_i = γ(x_i, Σ l)

where x_i denotes the feature of the i-th node, γ denotes the aggregation operation, Σ denotes aggregation over all triples involving the i-th node, and l is the encoded information of each such triple.
2. A relationship inference system based on a dependency graph, characterized by comprising:
a dividing and constructing module: obtaining a given sentence pair, dividing it into words, and constructing word features using word-sense features;
a dependency extraction module: obtaining a dependency tree extracted from the divided text by a dependency extractor;
a feature updating module: taking the dependency relations in the dependency tree as the basis for updating word features, and learning and updating the word features in the given sentence pair with a deep learning network;
a feature fusion module: taking the multiple updated word features obtained from the given sentence pair as local features, and fusing them to obtain global features;
a loss function calculation module: taking the global features as sentence-meaning features, performing interaction between the two sentences, feeding the result into the output layer of the deep learning network to obtain an output, comparing the output with the true label, and calculating the loss function of the learning model;
a learning model correction module: correcting the learning model according to the calculated loss and determining the target parameters of the learning model;
the feature updating module comprising:
encoding each triple in the dependency tree, a triple comprising a head word, a tail word and a dependency relation;
processing the head and tail words with the dependency relation matrix corresponding to the dependency relation to obtain relation triples;
aggregating the processed relation triples back into the head and tail words;
wherein the head word in a relation triple corresponds to the head node in the dependency tree, the tail word corresponds to the tail node, and the dependency relation corresponds to a relation type in the tree; the relation is marked with a sequence number, namely its index in the relation dictionary;
the order of the head and tail words is encoded by a fully-connected-layer linear mapping;
and the aggregation of the relation triples is completed by a message passing process:

x_i = γ(x_i, Σ l)

where x_i denotes the feature of the i-th node, γ denotes the aggregation operation, Σ denotes aggregation over all triples involving the i-th node, and l is the encoded information of each such triple.
CN202110205890.XA 2021-02-24 2021-02-24 Dependency relationship graph-based relationship reasoning method and system Active CN112818678B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110205890.XA CN112818678B (en) 2021-02-24 2021-02-24 Dependency relationship graph-based relationship reasoning method and system


Publications (2)

Publication Number, Publication Date
CN112818678A (en), 2021-05-18
CN112818678B (en), 2022-10-28

Family

Family ID: 75865359

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110205890.XA Active CN112818678B (en) 2021-02-24 2021-02-24 Dependency relationship graph-based relationship reasoning method and system

Country Status (1)

Country Link
CN (1) CN112818678B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114327473B (en) * 2021-12-15 2022-09-06 中电信数智科技有限公司 Software package dependency relationship detection method
CN115150152B (en) * 2022-06-30 2024-04-26 中国人民解放军陆军工程大学 Network user actual authority quick reasoning method based on authority dependency graph reduction

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108446769A (en) * 2018-01-23 2018-08-24 深圳市阿西莫夫科技有限公司 Knowledge mapping relation inference method, apparatus, computer equipment and storage medium
CN108733792A (en) * 2018-05-14 2018-11-02 北京大学深圳研究生院 A kind of entity relation extraction method
CN109284361A (en) * 2018-09-29 2019-01-29 深圳追科技有限公司 A kind of entity abstracting method and system based on deep learning
CN111026875A (en) * 2019-11-26 2020-04-17 中国人民大学 Knowledge graph complementing method based on entity description and relation path

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108763376B (en) * 2018-05-18 2020-09-29 浙江大学 Knowledge representation learning method for integrating relationship path, type and entity description information
CN109902301B (en) * 2019-02-26 2023-02-10 广东工业大学 Deep neural network-based relationship reasoning method, device and equipment
US20210012061A1 (en) * 2019-07-12 2021-01-14 Nec Laboratories America, Inc. Supervised cross-modal retrieval for time-series and text using multimodal triplet loss
CN110968660B (en) * 2019-12-09 2022-05-06 四川长虹电器股份有限公司 Information extraction method and system based on joint training model
CN111325243B (en) * 2020-02-03 2023-06-16 天津大学 Visual relationship detection method based on regional attention learning mechanism


Also Published As

Publication number Publication date
CN112818678A (en) 2021-05-18


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant