CN113869033A - Graph neural network sentence sequencing method integrated with iterative sentence pair relation prediction
- Publication number
- CN113869033A (application CN202111123744.9A)
- Authority
- CN
- China
- Prior art keywords
- sentence
- entity
- graph
- iterative
- pair
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F40/00—Handling natural language data
- G06F40/20—Natural language analysis
- G06F40/205—Parsing
- G06F40/211—Syntactic parsing, e.g. based on context-free grammar [CFG] or unification grammars
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F40/00—Handling natural language data
- G06F40/20—Natural language analysis
- G06F40/279—Recognition of textual entities
- G06F40/289—Phrasal analysis, e.g. finite state techniques or chunking
- G06F40/295—Named entity recognition
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
Abstract
The invention discloses a graph neural network sentence ordering method incorporating iterative sentence-pair relation prediction, and a corresponding medium, wherein the method comprises the following steps: constructing a sentence-entity graph; analyzing the sentence-entity graph with an initial sentence-pair order classifier to predict the order of the connected sentence pairs in the graph; training an iterative sentence-pair order classifier on the predicted order of the connected sentence pairs, and iteratively updating the weights of the connecting edges in the sentence-entity graph with that classifier; training a sentence ordering model on the iteratively updated sentence-entity graph; and obtaining the sentences to be ordered, inputting them into the sentence ordering model, and outputting the corresponding sentence ordering. The method can effectively improve sentence ordering accuracy.
Description
Technical Field
The invention relates to the technical field of natural language processing, and in particular to a graph neural network sentence ordering method incorporating iterative sentence-pair relation prediction, and to a computer-readable storage medium.
Background
With the rapid development and growing adoption of natural language processing (NLP) technology, modeling the coherence of sentences has become an important research task, because such models provide useful information for locating, evaluating, and generating multi-sentence text. Sentence ordering is an important subtask that aims to restore a set of unordered sentences to a genuinely coherent paragraph. It requires handling both logical and syntactic consistency, and has attracted increasing interest because of its wide range of applications.
In the related art, sentence ordering accuracy is improved either by improving the sentence-pair ordering model alone or by improving the encoder-decoder architecture alone; however, the orderings produced by these approaches are still not accurate enough.
Disclosure of Invention
The present invention aims to solve, at least to some extent, one of the technical problems in the related art described above. Accordingly, an object of the present invention is to provide a graph neural network sentence ordering method incorporating iterative sentence-pair relation prediction, which can effectively improve the accuracy of sentence ordering.
A second object of the invention is to propose a computer-readable storage medium.
To achieve the above object, an embodiment of the first aspect of the present invention provides a graph neural network sentence ordering method incorporating iterative sentence-pair relation prediction, comprising the following steps: constructing a sentence-entity graph; analyzing the sentence-entity graph with an initial sentence-pair order classifier to predict the order of the connected sentence pairs in the graph; training an iterative sentence-pair order classifier on the predicted order of the connected sentence pairs, and iteratively updating the weights of the connecting edges in the sentence-entity graph with that classifier; training a sentence ordering model on the iteratively updated sentence-entity graph; and obtaining the sentences to be ordered, inputting them into the sentence ordering model, and outputting the corresponding sentence ordering.
According to the graph neural network sentence ordering method incorporating iterative sentence-pair relation prediction of the embodiment of the present invention, a sentence-entity graph is first constructed; the sentence-entity graph is then analyzed with an initial sentence-pair order classifier to predict the order of the connected sentence pairs; an iterative sentence-pair order classifier is then trained on those predicted orders and used to iteratively update the weights of the connecting edges in the graph; a sentence ordering model is then trained on the iteratively updated graph; finally, the sentences to be ordered are obtained and input into the sentence ordering model, which outputs the corresponding ordering. The accuracy of sentence ordering is thereby effectively improved.
In addition, the graph neural network sentence ordering method incorporating iterative sentence-pair relation prediction according to the above embodiment of the present invention may further have the following additional technical features:
Optionally, the nodes of the sentence-entity graph include sentence nodes and entity nodes; a sentence node is represented by a sentence vector obtained from any sentence encoding model, and an entity node by a word vector obtained from any word representation model.
Optionally, the connecting edges include sentence-sentence edges, sentence-entity edges, and entity-entity edges. When two sentences contain a common entity, the corresponding sentence nodes are connected by two directed edges in opposite directions, forming a sentence-sentence edge; when an entity appears in two or more sentences, an undirected edge is added between the entity node and each corresponding sentence node, forming a sentence-entity edge; and when the similarity between two entities exceeds a preset threshold, an undirected edge is added between the corresponding entity nodes, forming an entity-entity edge.
Optionally, training the iterative sentence-pair order classifier on the order of the connected sentence pairs in the sentence-entity graph includes: assigning the weights of the sentence-sentence edges in the graph according to the order relation of the connected sentence pairs; randomly adding noise to the sentence-sentence edges, and encoding the noise-added graph with a graph neural network; merging the sentence nodes of each connected sentence pair into a sentence-pair representation, and predicting the order of the pair from that representation with a classifier; and computing a target error with a classification objective function and updating the parameters of the graph neural network accordingly, finally obtaining the iterative sentence-pair order classifier.
Optionally, iteratively updating the weights of the connecting edges in the sentence-entity graph with the iterative sentence-pair order classifier includes: predicting, with the classifier, the probability value of each sentence-pair order, and computing the confidence of each order from the probability value and preset probability threshold intervals; and selecting, according to the confidence, the probability values used to update the weights of the connecting edges in the sentence-entity graph.
To achieve the above object, an embodiment of the second aspect of the present invention provides a computer-readable storage medium storing a graph neural network sentence ordering program incorporating iterative sentence-pair relation prediction, which, when executed by a processor, implements the graph neural network sentence ordering method incorporating iterative sentence-pair relation prediction described above.
The computer-readable storage medium of the embodiment of the present invention stores the graph neural network sentence ordering program incorporating iterative sentence-pair relation prediction, so that when the processor executes the program it carries out the sentence ordering method described above, thereby effectively improving sentence ordering accuracy.
Drawings
FIG. 1 is a flow chart of a graph neural network sentence ordering method incorporating iterative sentence-pair relation prediction according to an embodiment of the present invention;
FIG. 2 is a process diagram of a graph neural network sentence ordering method incorporating iterative sentence-pair relation prediction according to an embodiment of the present invention;
FIG. 3 is a diagram illustrating the noise-adding process for sentence-sentence edge weights according to an embodiment of the present invention.
Detailed Description
Reference will now be made in detail to embodiments of the present invention, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals denote the same or similar elements or elements having the same or similar functions throughout. The embodiments described below with reference to the drawings are illustrative, intended to explain the invention, and are not to be construed as limiting the invention.
In the related art, the adopted sentence ordering methods still do not produce sufficiently accurate orderings. According to the graph neural network sentence ordering method incorporating iterative sentence-pair relation prediction of the embodiment of the present invention, a sentence-entity graph is first constructed; the graph is then analyzed with an initial sentence-pair order classifier to predict the order of the connected sentence pairs; an iterative sentence-pair order classifier is then trained on those predicted orders and used to iteratively update the weights of the connecting edges; a sentence ordering model is then trained on the iteratively updated graph; finally, the sentences to be ordered are input into the sentence ordering model, which outputs the corresponding ordering. The accuracy of sentence ordering is thereby effectively improved.
In order to better understand the above technical solutions, exemplary embodiments of the present invention will be described in more detail below with reference to the accompanying drawings. While exemplary embodiments of the invention are shown in the drawings, it should be understood that the invention can be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the invention to those skilled in the art.
In order to better understand the technical solution, the technical solution will be described in detail with reference to the drawings and the specific embodiments.
Fig. 1 is a schematic flow chart of a graph neural network sentence ordering method incorporating iterative sentence-pair relation prediction according to an embodiment of the present invention. As shown in Fig. 1, the method includes the following steps:
s101, constructing a sentence entity graph.
The sentence entity graph can be constructed in various ways.
In some embodiments, the nodes of the sentence-entity graph include sentence nodes and entity nodes; a sentence node is represented by a sentence vector obtained from any sentence encoding model, and an entity node by a word vector obtained from any word representation model.
That is, in the sentence-entity graph, entity nodes are represented by word vectors obtained from any word representation model (e.g., Word2vec, GloVe, BERT), and sentence nodes by sentence vectors obtained from any sentence encoding model (e.g., LSTM, Bi-LSTM, mean-pooling).
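As a concrete illustration (not code from the patent), a mean-pooling sentence encoder, one of the interchangeable options listed above, can be sketched in a few lines; real systems would more likely use LSTM/Bi-LSTM states or BERT embeddings:

```python
def mean_pool(word_vectors):
    """Mean-pooling sentence encoder: the sentence vector is the element-wise
    average of the sentence's word vectors. Illustrative sketch only."""
    dim = len(word_vectors[0])
    n = len(word_vectors)
    return [sum(v[d] for v in word_vectors) / n for d in range(dim)]
```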
In some embodiments, the connecting edges include sentence-sentence edges, sentence-entity edges, and entity-entity edges. When two sentences contain a common entity, the corresponding sentence nodes are connected by two directed edges in opposite directions, forming a sentence-sentence edge; when an entity appears in two or more sentences, an undirected edge is added between the entity node and each corresponding sentence node, forming a sentence-entity edge; and when the similarity between two entities exceeds a preset threshold, an undirected edge is added between the corresponding entity nodes, forming an entity-entity edge.
That is, when the sentences corresponding to two sentence nodes contain a common entity, two directed edges in opposite directions are connected between the two sentence nodes as sentence-sentence edges. The sentence-sentence edge weights can be designed in various ways; for example, the initial weight of each sentence-sentence edge is set to 0.5, indicating that the order information is unknown. If it is subsequently predicted that sentence a precedes sentence b with probability 90%, then the edge pointing from sentence node a to sentence node b is assigned weight 0.9, and the edge pointing from sentence node b to sentence node a is assigned weight 0.1. Next, when an entity appears in two or more sentences, the entity node is connected to each corresponding sentence node, forming sentence-entity edges; specifically, corresponding tags can be attached according to the entity's role in the sentence (e.g., the subject is labeled S, the object O, and others X). Then, the similarity between entities is computed with a preset tool or algorithm, and it is judged whether the similarity between two entities exceeds a preset threshold; if so, an undirected edge is connected between the two entity nodes as an entity-entity edge, whose weight is the similarity value between the two entities.
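The edge-construction rules above can be sketched as follows. This is an illustrative reading of the description, not code from the patent: the per-sentence entity roles and the `entity_sim` similarity function stand in for whatever NER tool and similarity measure an implementation actually uses.

```python
def build_sentence_entity_graph(sentences, entities_per_sentence, entity_sim,
                                sim_threshold=0.8):
    """sentences: list of sentence ids.
    entities_per_sentence: {sent_id: {entity: role}} with role in {S, O, X}.
    entity_sim: function(e1, e2) -> similarity in [0, 1]."""
    graph = {"ss_edges": {}, "se_edges": [], "ee_edges": []}

    # Sentence-sentence edges: two opposite directed edges whenever two
    # sentences share an entity; initial weight 0.5 means "order unknown".
    for i, si in enumerate(sentences):
        for sj in sentences[i + 1:]:
            if set(entities_per_sentence[si]) & set(entities_per_sentence[sj]):
                graph["ss_edges"][(si, sj)] = 0.5
                graph["ss_edges"][(sj, si)] = 0.5

    # Sentence-entity edges: undirected, tagged with the entity's role,
    # added only for entities appearing in two or more sentences.
    entity_sentences = {}
    for s, ents in entities_per_sentence.items():
        for e, role in ents.items():
            entity_sentences.setdefault(e, []).append((s, role))
    for e, occurrences in entity_sentences.items():
        if len(occurrences) >= 2:
            for s, role in occurrences:
                graph["se_edges"].append((s, e, role))

    # Entity-entity edges: undirected, weighted by similarity above threshold.
    ents = list(entity_sentences)
    for i, e1 in enumerate(ents):
        for e2 in ents[i + 1:]:
            sim = entity_sim(e1, e2)
            if sim > sim_threshold:
                graph["ee_edges"].append((e1, e2, sim))
    return graph
```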
S102, analyzing the sentence-entity graph with the initial sentence-pair order classifier to predict the order of the connected sentence pairs in the graph.
That is, an initial sentence-pair order classifier based on the sentence-entity graph is trained, and the graph is analyzed with it to predict the order of the connected sentence pairs.
As an example, as shown in FIG. 2, the initial sentence-pair order classifier predicts that the edge weight from sentence node S1 to sentence node S2 is 0.4, from S2 to S1 is 0.6, from S2 to S3 is 0.9, and from S3 to S2 is 0.1.
S103, training an iterative sentence-pair order classifier on the order of the connected sentence pairs in the sentence-entity graph, and iteratively updating the weights of the connecting edges in the graph with that classifier.
The iterative sentence-pair order classifier predicts iteratively: it takes as input the sentence-entity graph updated with the predictions of the initial classifier (or of a previous iteration of itself), and its output predictions iteratively update the edge weights between sentence nodes.
In some embodiments, training the iterative sentence-pair order classifier on the order of the connected sentence pairs in the sentence-entity graph includes: assigning the weights of the sentence-sentence edges in the graph according to the order of the connected sentence pairs; randomly adding noise to the sentence-sentence edges, and encoding the noise-added graph with a graph neural network; merging the sentence nodes of each connected sentence pair into a sentence-pair representation, and predicting the order of the pair from that representation with a classifier; and computing a target error with a classification objective function and updating the parameters of the graph neural network accordingly, finally obtaining the iterative sentence-pair order classifier.
As an example, as shown in Fig. 3, the weights of the sentence-sentence edges are first assigned according to the order of the connected sentence pairs: if sentence 1 precedes sentence 2, the edge pointing from sentence node 1 to sentence node 2 is assigned weight 1, and the edge pointing from sentence node 2 to sentence node 1 is assigned weight 0. Next, noise is added randomly: although sentence 1 is known to precede sentence 2, the directed edge from sentence node 1 to sentence node 2 is given an erroneous weight of 0.3 (< 0.5), and the directed edge from sentence node 2 to sentence node 1 an erroneous weight of 0.7 (> 0.5). The noise-added sentence-entity graph is then encoded with a graph recurrent neural network; the sentence nodes of each connected sentence pair are merged into a sentence-pair representation, which is fed to a two-layer multilayer perceptron acting as the classifier to predict the order of the pair; finally, a target error is computed with the classification objective function to update the parameters of the graph recurrent neural network.
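The weight-assignment and noise-injection step of Fig. 3 can be sketched as below. This is an illustrative sketch only: the corrupted weights 0.3/0.7 come from the figure's example, while the noise rate is an assumed parameter the patent does not fix.

```python
import random

def assign_and_noise_ss_weights(ordered_pairs, noise_rate=0.1, rng=None):
    """Assign gold sentence-sentence edge weights (1 for the true direction,
    0 for the reverse) and randomly corrupt a fraction of the pairs, as in
    the Fig. 3 example where a corrupted pair receives weights 0.3 / 0.7."""
    rng = rng or random.Random()
    weights = {}
    for a, b in ordered_pairs:  # (a, b) means sentence a precedes sentence b
        if rng.random() < noise_rate:
            weights[(a, b)] = 0.3   # erroneous weight wrongly below 0.5
            weights[(b, a)] = 0.7   # erroneous weight wrongly above 0.5
        else:
            weights[(a, b)] = 1.0   # gold: a precedes b
            weights[(b, a)] = 0.0
    return weights
```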
In some embodiments, iteratively updating the weights of the connecting edges in the sentence-entity graph with the iterative sentence-pair order classifier includes: predicting, with the classifier, the probability value of each sentence-pair order, and computing the confidence of each order from the probability value and preset probability threshold intervals; and selecting, according to the confidence, the probability values used to update the weights of the connecting edges in the sentence-entity graph.
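One simple concrete reading of this confidence rule is sketched below; the patent speaks of preset probability threshold intervals without fixing them, so taking confidence as the distance of the predicted probability from the "unknown" value 0.5 is an assumption, not the patent's definitive scheme.

```python
def update_edge_weights(ss_weights, predictions, conf_threshold=0.2):
    """predictions: {(a, b): p}, where p is the classifier's predicted
    probability that sentence a precedes sentence b. Only predictions far
    enough from the undecided value 0.5 overwrite the current weights."""
    updated = dict(ss_weights)
    for (a, b), p in predictions.items():
        if abs(p - 0.5) >= conf_threshold:   # confident enough to commit
            updated[(a, b)] = p
            updated[(b, a)] = 1.0 - p
    return updated
```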
S104, training on the iteratively updated sentence-entity graph to obtain a sentence ordering model.
S105, obtaining the sentences to be ordered, inputting them into the sentence ordering model, and outputting the corresponding sentence ordering through the model.
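The patent does not specify how the final ordering is decoded from the pairwise edge weights; one simple possibility, shown here purely as a sketch, is to score each sentence by the total weight of its outgoing edges (how strongly it is predicted to precede the others) and sort in descending order. With the FIG. 2 weights this recovers the order S2, S1, S3.

```python
def decode_order(sentences, ss_weights):
    """Greedy decoding from pairwise precedence weights: a hypothetical
    stand-in for the model's actual output step, not the patent's method."""
    def precedence_score(s):
        # Sum of weights on edges leaving s, i.e. evidence that s comes early.
        return sum(w for (a, _b), w in ss_weights.items() if a == s)
    return sorted(sentences, key=precedence_score, reverse=True)
```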
In summary, according to the graph neural network sentence ordering method incorporating iterative sentence-pair relation prediction of the embodiment of the present invention, a sentence-entity graph is first constructed; the graph is then analyzed with an initial sentence-pair order classifier to predict the order of the connected sentence pairs; an iterative sentence-pair order classifier is then trained on those predicted orders and used to iteratively update the weights of the connecting edges; a sentence ordering model is then trained on the iteratively updated graph; finally, the sentences to be ordered are input into the sentence ordering model, which outputs the corresponding ordering. The accuracy of sentence ordering is thereby effectively improved.
To implement the foregoing embodiments, an embodiment of the present invention provides a computer-readable storage medium storing a graph neural network sentence ordering program incorporating iterative sentence-pair relation prediction, which, when executed by a processor, implements the graph neural network sentence ordering method incorporating iterative sentence-pair relation prediction described above.
The computer-readable storage medium of the embodiment of the present invention stores the graph neural network sentence ordering program incorporating iterative sentence-pair relation prediction, so that when the processor executes the program it carries out the sentence ordering method described above, thereby effectively improving sentence ordering accuracy.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, system, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
It should be noted that in the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. The word "comprising" does not exclude the presence of elements or steps not listed in a claim. The word "a" or "an" preceding an element does not exclude the presence of a plurality of such elements. The invention may be implemented by means of hardware comprising several distinct elements, and by means of a suitably programmed computer. In the unit claims enumerating several means, several of these means may be embodied by one and the same item of hardware. The use of the words first, second, third, etc. does not indicate any ordering; these words may be interpreted as names.
While preferred embodiments of the present invention have been described, additional variations and modifications in those embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. Therefore, it is intended that the appended claims be interpreted as including preferred embodiments and all such alterations and modifications as fall within the scope of the invention.
It will be apparent to those skilled in the art that various changes and modifications may be made in the present invention without departing from the spirit and scope of the invention. Thus, if such modifications and variations of the present invention fall within the scope of the claims of the present invention and their equivalents, the present invention is also intended to include such modifications and variations.
In the description of the present invention, it is to be understood that the terms "first", "second" and the like are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implying any number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more of that feature. In the description of the present invention, "a plurality" means two or more unless specifically defined otherwise.
In the present invention, unless otherwise expressly stated or limited, the terms "mounted," "connected," "secured," and the like are to be construed broadly and can, for example, be fixedly connected, detachably connected, or integrally formed; can be mechanically or electrically connected; either directly or indirectly through intervening media, either internally or in any other relationship. The specific meanings of the above terms in the present invention can be understood by those skilled in the art according to specific situations.
In the present invention, unless otherwise expressly stated or limited, the first feature "on" or "under" the second feature may be directly contacting the first and second features or indirectly contacting the first and second features through an intermediate. Also, a first feature "on," "over," and "above" a second feature may be directly or diagonally above the second feature, or may simply indicate that the first feature is at a higher level than the second feature. A first feature being "under," "below," and "beneath" a second feature may be directly under or obliquely under the first feature, or may simply mean that the first feature is at a lesser elevation than the second feature.
In the description herein, references to the description of the term "one embodiment," "some embodiments," "an example," "a specific example," or "some examples," etc., mean that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention. In this specification, the schematic representations of the terms used above should not be understood to necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. Furthermore, various embodiments or examples and features of different embodiments or examples described in this specification can be combined and combined by one skilled in the art without contradiction.
Although embodiments of the present invention have been shown and described above, it is understood that the above embodiments are exemplary and should not be construed as limiting the present invention, and that variations, modifications, substitutions and alterations can be made to the above embodiments by those of ordinary skill in the art within the scope of the present invention.
Claims (6)
1. A graph neural network sentence sequencing method integrated with iterative sentence pair relation prediction, characterized by comprising the following steps:
constructing a sentence entity graph;
analyzing the sentence entity graph through an initial sentence pair sequence classifier to predict the sequence of the connected sentence pairs in the sentence entity graph;
training according to the sequence of the connected sentence pairs in the sentence entity graph to obtain an iterative sentence pair sequence classifier, and iteratively updating the weights of the connected edges in the sentence entity graph through the iterative sentence pair sequence classifier;
training according to the sentence entity graph after iterative updating to obtain a sentence sequencing model;
and obtaining information of sentences to be sorted, inputting the information of the sentences to be sorted into the sentence sorting model, and outputting a sentence sorting mode corresponding to the information of the sentences to be sorted through the sentence sorting model.
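The five steps of claim 1 can be read as a training loop that alternates between refining the graph's edge weights and retraining the pair classifier. The following is a minimal Python sketch of that orchestration only; every callable argument (`build_graph`, `train_classifier`, `update_weights`, `train_order_model`) and the iteration count are hypothetical stand-ins, not components specified by the claim.

```python
def sentence_ordering_pipeline(sentences, build_graph, train_classifier,
                               update_weights, train_order_model, n_iters=3):
    """Hypothetical orchestration of the steps in claim 1.  All callable
    arguments are stand-ins supplied by the caller."""
    graph = build_graph(sentences)                 # step 1: sentence entity graph
    classifier = train_classifier(graph)           # step 2: initial pair classifier
    for _ in range(n_iters):                       # step 3: iterative refinement
        graph = update_weights(graph, classifier)  # update connected-edge weights
        classifier = train_classifier(graph)       # retrain on the updated graph
    return train_order_model(graph)                # steps 4-5: final ordering model
```

The number of refinement rounds (`n_iters`) is an assumption; the claim itself only states that the weight update is iterative.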
2. The method of claim 1, wherein the nodes of the sentence entity graph comprise sentence nodes and entity nodes;
the sentence nodes are sentence vectors obtained by encoding with any sentence encoding model, and the entity nodes are word vectors obtained from any word representation model.
3. The method of claim 1, wherein the connected edges comprise sentence-sentence connected edges, sentence-entity connected edges, and entity-entity connected edges;
when two sentences contain a common entity, the sentence nodes corresponding to the two sentences are connected by two directed edges in opposite directions to form a sentence-sentence connected edge;
when an entity appears in two or more sentences, an undirected edge is connected between the entity node corresponding to the entity and each corresponding sentence node to form a sentence-entity connected edge;
and when the similarity between two entities is greater than a preset threshold, an undirected edge is connected between the corresponding entity nodes to form an entity-entity connected edge.
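The three edge rules of claim 3 can be sketched directly. The sketch below is a hypothetical illustration: the Jaccard character overlap used as `entity_sim` in the usage example and the threshold value are placeholders, since the claim permits any similarity measure and any preset threshold.

```python
def build_edges(sentence_entities, entity_sim, threshold=0.5):
    """sentence_entities: list mapping sentence index -> set of entity strings.
    Returns the three edge sets described in claim 3."""
    ss_edges, se_edges, ee_edges = set(), set(), set()

    # Sentence-sentence: two opposite directed edges when sentences share an entity.
    n = len(sentence_entities)
    for i in range(n):
        for j in range(i + 1, n):
            if sentence_entities[i] & sentence_entities[j]:
                ss_edges.add((i, j))
                ss_edges.add((j, i))

    # Sentence-entity: undirected edge between an entity and each sentence
    # containing it, only for entities appearing in two or more sentences.
    counts = {}
    for ents in sentence_entities:
        for e in ents:
            counts[e] = counts.get(e, 0) + 1
    for i, ents in enumerate(sentence_entities):
        for e in ents:
            if counts[e] >= 2:
                se_edges.add((i, e))

    # Entity-entity: undirected edge when similarity exceeds the preset threshold.
    entities = sorted(counts)
    for a in range(len(entities)):
        for b in range(a + 1, len(entities)):
            if entity_sim(entities[a], entities[b]) > threshold:
                ee_edges.add((entities[a], entities[b]))
    return ss_edges, se_edges, ee_edges
```

For example, with sentences whose entity sets are `[{"cat", "mat"}, {"cat"}, {"dog"}]`, the shared entity "cat" produces the two directed edges (0, 1) and (1, 0) and the sentence-entity edges (0, "cat") and (1, "cat"), while "mat" and "dog", each appearing in a single sentence, produce no sentence-entity edges.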
4. The method of claim 3, wherein training according to the sequence of the connected sentence pairs in the sentence entity graph to obtain the iterative sentence pair sequence classifier comprises:
assigning weights to the sentence-sentence connected edges in the sentence entity graph according to the sequence of the connected sentence pairs in the sentence entity graph;
randomly adding noise to the sentence-sentence connected edges in the sentence entity graph, and encoding the noise-added sentence entity graph with a graph neural network;
merging the sentence nodes corresponding to each connected sentence pair into a sentence pair representation, and predicting the sequence of the sentence pair from the sentence pair representation with a classifier;
and calculating the target error through a classification objective function, updating the parameters of the graph neural network according to the target error, and finally obtaining the iterative sentence pair sequence classifier.
5. The method of claim 1, wherein iteratively updating weights of edges in the sentence entity graph by the iterative sentence pair order classifier comprises:
predicting the probability value corresponding to each sentence pair sequence through the iterative sentence pair sequence classifier, and calculating the confidence of each sentence pair sequence according to the probability value and preset probability threshold intervals;
and selecting, according to the confidence, the probability values of the corresponding sentence pair sequences to update the weights of the connected edges in the sentence entity graph.
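Claim 5's confidence-gated update can be illustrated with a simple rule: predictions inside an ambiguous probability band are treated as low-confidence and leave the existing edge weight untouched, while confident predictions overwrite it. The band boundaries below are hypothetical placeholders for the claim's "preset probability threshold intervals".

```python
def update_edge_weights(weights, probs, low=0.4, high=0.6):
    """Hypothetical sketch of claim 5.  probs[k] is the classifier's
    predicted probability that edge k's sentence pair is in forward
    order; (low, high) is an assumed low-confidence band."""
    new_weights = []
    for w, p in zip(weights, probs):
        if low < p < high:              # low confidence: keep the old weight
            new_weights.append(w)
        else:                           # high confidence: adopt the prediction
            new_weights.append(p)
    return new_weights
```

For instance, with current weights `[0.5, 0.5, 0.5]` and predictions `[0.9, 0.55, 0.1]`, only the confident first and third predictions replace their weights, giving `[0.9, 0.5, 0.1]`.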
6. A computer-readable storage medium having stored thereon a graph neural network sentence sequencing program incorporating iterative sentence pair relation prediction, which, when executed by a processor, implements the graph neural network sentence sequencing method integrated with iterative sentence pair relation prediction according to any one of claims 1-5.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111123744.9A CN113869033A (en) | 2021-09-24 | 2021-09-24 | Graph neural network sentence sequencing method integrated with iterative sentence pair relation prediction |
Publications (1)
Publication Number | Publication Date |
---|---|
CN113869033A true CN113869033A (en) | 2021-12-31 |
Family
ID=78993971
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202111123744.9A Pending CN113869033A (en) | 2021-09-24 | 2021-09-24 | Graph neural network sentence sequencing method integrated with iterative sentence pair relation prediction |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113869033A (en) |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9336186B1 (en) * | 2013-10-10 | 2016-05-10 | Google Inc. | Methods and apparatus related to sentence compression |
CN107870964A (en) * | 2017-07-28 | 2018-04-03 | 北京中科汇联科技股份有限公司 | Sentence ordering method and system applied to an answer fusion system |
CN109739973A (en) * | 2018-12-20 | 2019-05-10 | 北京奇安信科技有限公司 | Text summary generation method and device, electronic equipment and storage medium |
WO2020042925A1 (en) * | 2018-08-29 | 2020-03-05 | 腾讯科技(深圳)有限公司 | Man-machine conversation method and apparatus, electronic device, and computer readable medium |
CN112380835A (en) * | 2020-10-10 | 2021-02-19 | 中国科学院信息工程研究所 | Question answer extraction method fusing entity and sentence reasoning information and electronic device |
CN112465593A (en) * | 2020-11-27 | 2021-03-09 | 中国科学技术大学 | Method for realizing fashion suit recommendation through graph neural network |
CN112948543A (en) * | 2021-02-20 | 2021-06-11 | 河海大学 | Multi-language multi-document abstract extraction method based on weighted TextRank |
WO2021174946A1 (en) * | 2020-10-12 | 2021-09-10 | 平安科技(深圳)有限公司 | Visualization method, system, computer device, and storage medium |
Non-Patent Citations (2)
Title |
---|
TANG XIAONA; SU JINSONG: "Analysis of Bayesian Classification in Word Sense Disambiguation", Heilongjiang Science and Technology Information, no. 07, 15 April 2007 (2007-04-15) * |
SU JINSONG: "Research on the Construction of a Complete Song Ci Corpus and Computational Methods for Its Style and Sentiment Analysis", Master's Theses Electronic Journal, Information Science and Technology Series, no. 2008, 15 July 2008 (2008-07-15) * |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114970491A (en) * | 2022-08-02 | 2022-08-30 | 深圳市城市公共安全技术研究院有限公司 | Text connectivity judgment method and device, electronic equipment and storage medium |
CN114970491B (en) * | 2022-08-02 | 2022-10-04 | 深圳市城市公共安全技术研究院有限公司 | Text connectivity judgment method and device, electronic equipment and storage medium |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109120462B (en) | Method and device for predicting opportunistic network link and readable storage medium | |
JP6743934B2 (en) | Method, apparatus and system for estimating causal relationship between observed variables | |
CN111406267A (en) | Neural architecture search using performance-predictive neural networks | |
CN110366734A (en) | Optimization neural network framework | |
CN112364880B (en) | Omics data processing method, device, equipment and medium based on graph neural network | |
CN108304679A (en) | A kind of adaptive reliability analysis method | |
KR20180014471A (en) | Method and apparatus for searching new material | |
CN108536784B (en) | Comment information sentiment analysis method and device, computer storage medium and server | |
CN110909868A (en) | Node representation method and device based on graph neural network model | |
CN111428866A (en) | Incremental learning method and device, storage medium and electronic equipment | |
CN112086144A (en) | Molecule generation method, molecule generation device, electronic device, and storage medium | |
Trivodaliev et al. | Exploring function prediction in protein interaction networks via clustering methods | |
CN112420125A (en) | Molecular attribute prediction method and device, intelligent equipment and terminal | |
Mo et al. | Network simplification and k-terminal reliability evaluation of sensor-cloud systems | |
CN113869033A (en) | Graph neural network sentence sequencing method integrated with iterative sentence pair relation prediction | |
KR20180056013A (en) | Method and apparatus for predicting toxicity of nano material | |
CN112508177A (en) | Network structure searching method and device, electronic equipment and storage medium | |
CN116933657A (en) | Complex profile processing parameter feature extraction method, system, equipment and medium | |
CN113449176A (en) | Recommendation method and device based on knowledge graph | |
CN110705889A (en) | Enterprise screening method, device, equipment and storage medium | |
CN107105052B (en) | Heuristic Web service combination method based on graph planning | |
CN113783715B (en) | Opportunistic network topology prediction method adopting causal convolutional neural network | |
KR101151013B1 (en) | Method for evaluating performance of tire | |
CN115345303A (en) | Convolutional neural network weight tuning method, device, storage medium and electronic equipment | |
CN116805384A (en) | Automatic searching method, automatic searching performance prediction model training method and device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||