CN113486647A - Semantic parsing method and device, electronic equipment and storage medium


Info

Publication number
CN113486647A
Authority
CN
China
Prior art keywords: target, abstract, operation sequence, natural language, syntax tree
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110744721.3A
Other languages
Chinese (zh)
Inventor
桂朔
丁小进
王雪萌
张翼
王茹梦
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
China Construction Bank Corp
Original Assignee
China Construction Bank Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by China Construction Bank Corp filed Critical China Construction Bank Corp
Priority to CN202110744721.3A
Publication of CN113486647A
Legal status: Pending

Classifications

    • G06F40/253 Grammatical analysis; Style critique
    • G06F16/3344 Query execution using natural language analysis
    • G06F18/214 Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • G06F40/205 Parsing
    • G06F40/30 Semantic analysis
    (All under G Physics; G06 Computing, calculating or counting; G06F Electric digital data processing.)

Abstract

The invention provides a semantic parsing method and device, electronic equipment and a storage medium. A target natural language vector, obtained by vectorizing an acquired target natural language, is input into a pre-established GCDL enc-dec model. The first-layer codec of the GCDL enc-dec model processes the target natural language vector to obtain a target coding vector and processes the target coding vector to obtain a target abstract summary tree operation sequence. The second-layer codec of the GCDL enc-dec model processes the target abstract summary tree operation sequence and the target coding vector to obtain a target abstract syntax tree operation sequence. The target abstract syntax tree operation sequence is input into a conversion system to obtain a target abstract syntax tree, and a syntax model performs meaning conversion on the target abstract syntax tree to obtain the meaning representation of the target natural language. The method and the device can improve the accuracy of semantic parsing of natural language.

Description

Semantic parsing method and device, electronic equipment and storage medium
Technical Field
The present invention relates to the field of artificial intelligence, and more particularly, to a semantic parsing method, apparatus, electronic device, and storage medium.
Background
With the rapid development of artificial intelligence, semantic analysis systems built with artificial intelligence, which understand natural language and enable communication between people and computers, are increasingly widely applied in academia and industry. The core component of such a system is semantic parsing.
Existing semantic parsing models construct semantic correspondences by manually defining a vocabulary table, or by learning a vocabulary table from paired natural language and knowledge bases, and thereby perform semantic parsing of natural language. However, the semantic correspondences constructed in this way are not distinct enough, so the meaning representation obtained by semantic parsing of the natural language is inaccurate.
Disclosure of Invention
The invention provides a semantic parsing method, a semantic parsing device, electronic equipment and a storage medium, and aims to improve the accuracy of semantic parsing on natural language.
The invention discloses a semantic parsing method in a first aspect, which comprises the following steps:
acquiring a target natural language, and inputting a target natural language vector obtained after vectorization processing of the target natural language into a pre-established GCDL enc-dec model, wherein the GCDL enc-dec model comprises a first layer of codec and a second layer of codec obtained by training based on a training example, and the training example comprises a natural language sample and a target meaning representation corresponding to the natural language sample;
processing the target natural language vector by using the first layer of codec to obtain a target coding vector, and processing the target coding vector to obtain a target abstract summary tree operation sequence;
processing the target abstract summary tree operation sequence and the target coding vector by utilizing the second layer of codec to obtain a target abstract syntax tree operation sequence;
and inputting the target abstract syntax tree operation sequence into a conversion system to obtain a target abstract syntax tree, and performing meaning conversion on the target abstract syntax tree by using a syntax model to obtain meaning representation of the target natural language.
Optionally, the process of training the obtained first layer codec and second layer codec based on the training example includes:
acquiring a training example, wherein the training example comprises a natural language sample and a target meaning representation corresponding to the natural language sample;
inputting the target meaning representation corresponding to the natural language sample into the grammar model to obtain an abstract grammar tree and an abstract summary tree;
inputting the abstract syntax tree and the abstract summary tree into the conversion system, so that the conversion system processes the abstract syntax tree to obtain a first abstract syntax tree operation sequence and processes the abstract summary tree to obtain a first abstract summary tree operation sequence;
inputting a natural language sample vector obtained after vectorization processing of the natural language into a first layer of codec to be trained to obtain a coding vector, and processing the coding vector to obtain a second abstract summary tree operation sequence;
inputting the coding vector and the second abstract summary tree operation sequence into a second layer of codec to be trained for processing to obtain a second abstract syntax tree operation sequence;
inputting the second abstract syntax tree operation sequence into the conversion system to obtain an abstract syntax tree, and performing meaning conversion on the abstract syntax tree by using the syntax model to obtain the meaning representation of the natural language sample;
calculating a loss function by using the first abstract syntax tree operation sequence, the first abstract summary tree operation sequence, the second abstract summary tree operation sequence and the second abstract syntax tree operation sequence;
and taking the meaning expression of the natural language sample approaching the target meaning expression as a training target, and adjusting the parameters of the first layer codec to be trained and the second layer codec to be trained through the loss function until the first layer codec to be trained and the second layer codec to be trained converge to obtain a first layer codec and a second layer codec.
Optionally, the first-layer codec includes a sentence encoder and an abstract summary tree decoder, and the inputting of the natural language sample vector obtained by vectorizing the natural language sample into the first-layer codec to be trained to obtain a coding vector, and the processing of the coding vector to obtain a second abstract summary tree operation sequence, include:
inputting a natural language sample vector obtained after vectorization processing of the natural language into the sentence encoder to obtain an encoding vector;
and inputting the coding vector into the abstract summary tree decoder, and enabling the abstract summary tree decoder to process the coding vector to obtain a second abstract summary tree operation sequence.
Optionally, the second-layer codec includes an abstract summary tree encoder and an abstract syntax tree decoder, and the inputting of the coding vector and the second abstract summary tree operation sequence into the second-layer codec to be trained for processing to obtain a second abstract syntax tree operation sequence includes:
inputting the second abstract summary tree operation sequence into the abstract summary tree encoder to obtain encoding information of the second abstract summary tree operation sequence;
and inputting the coding information of the second abstract summary tree operation sequence and the coding vector into the abstract syntax tree decoder to obtain a second abstract syntax tree operation sequence.
Optionally, the processing the target natural language vector by using the first layer codec to obtain a target coding vector, and processing the target coding vector to obtain a target abstract summary tree operation sequence includes:
processing the target natural language vector by using the sentence encoder to obtain a target encoding vector;
and processing the target coding vector by using the abstract summary tree decoder to obtain a target abstract summary tree operation sequence.
Optionally, the processing, by using the second layer codec, of the target abstract summary tree operation sequence and the target coding vector to obtain a target abstract syntax tree operation sequence includes:
processing a target abstract summary tree operation sequence by using the abstract summary tree encoder to obtain encoding information of the target abstract summary tree operation sequence;
and processing the coding information of the target abstract summary tree operation sequence and the target coding vector by using the abstract syntax tree decoder to obtain a target abstract syntax tree operation sequence.
The second aspect of the present invention discloses a semantic analysis device, which includes:
a first acquisition unit, configured to acquire a target natural language and input a target natural language vector, obtained by vectorizing the target natural language, into a pre-established GCDL enc-dec model, where the GCDL enc-dec model includes a first-layer codec and a second-layer codec trained based on a training example, and the training example includes a natural language sample and a target meaning representation corresponding to the natural language sample;
the first processing unit is used for processing the target natural language vector by utilizing the first layer of codec to obtain a target coding vector and processing the target coding vector to obtain a target abstract summary tree operation sequence;
a second processing unit, configured to process the target abstract summary tree operation sequence and the target coding vector by using the second-layer codec to obtain a target abstract syntax tree operation sequence;
and the first semantic parsing unit is used for inputting the target abstract syntax tree operation sequence into a conversion system to obtain a target abstract syntax tree, and performing meaning conversion on the target abstract syntax tree based on a syntax model to obtain meaning representation of the target natural language.
Optionally, the process of training the first layer codec and the second layer codec based on the training example set includes:
the second acquisition unit is used for acquiring a training example, wherein the training example comprises a natural language sample and a corresponding target meaning representation thereof;
the third processing unit is used for inputting the target meaning representation corresponding to the natural language sample into the grammar model to obtain an abstract syntax tree and an abstract summary tree;
the conversion unit is used for inputting the abstract syntax tree and the abstract summary tree into the conversion system, so that the conversion system processes the abstract syntax tree to obtain a first abstract syntax tree operation sequence and processes the abstract summary tree to obtain a first abstract summary tree operation sequence;
the fourth processing unit is used for inputting a natural language sample vector obtained after vectorization processing is carried out on the natural language into a first layer of codec to be trained to obtain a coding vector, and processing the coding vector to obtain a second abstract summary tree operation sequence;
a fifth processing unit, configured to input the coding vector and the second abstract summary tree operation sequence into a second-layer codec to be trained for processing to obtain a second abstract syntax tree operation sequence;
the second semantic parsing unit is used for inputting the second abstract syntax tree operation sequence into the conversion system to obtain a third abstract syntax tree, and performing semantic parsing on the third abstract syntax tree by using the syntax model to obtain a meaning expression of the natural language sample;
a calculating unit, configured to calculate a loss function by using the first abstract syntax tree operation sequence, the first abstract summary tree operation sequence, the second abstract summary tree operation sequence and the second abstract syntax tree operation sequence;
and the training unit is used for adjusting the parameters of the first layer codec to be trained and the second layer codec to be trained through the loss function until the first layer codec to be trained and the second layer codec to be trained converge to obtain a first layer codec and a second layer codec.
A third aspect of the invention shows an electronic device including a processor and a memory, where the memory is configured to store program code and data for semantic parsing, and the processor is configured to call the program instructions in the memory to perform the semantic parsing method shown in the first aspect of the invention.
A fourth aspect of the present invention shows a storage medium including a stored program, where, when the program runs, a device on which the storage medium is located is controlled to execute the semantic parsing method shown in the first aspect of the present invention.
The invention provides a semantic parsing method and device, electronic equipment and a storage medium. A target natural language vector is obtained by vectorizing an acquired target natural language and is input into a pre-established GCDL enc-dec model. The first-layer codec in the GCDL enc-dec model processes the target natural language vector to obtain a target coding vector and processes the target coding vector to obtain a target abstract summary tree operation sequence. The second-layer codec in the GCDL enc-dec model then processes the target abstract summary tree operation sequence and the target coding vector to obtain a target abstract syntax tree operation sequence. Finally, the target abstract syntax tree operation sequence is input into a conversion system to obtain a target abstract syntax tree, and the syntax model performs meaning conversion on the target abstract syntax tree to obtain the meaning representation of the target natural language. Because the first-layer codec directly processes the vectorized target natural language to obtain the coding vector and the target abstract summary tree operation sequence, and the second-layer codec processes these to obtain the target abstract syntax tree operation sequence, the corresponding meaning representation can be obtained without constructing a vocabulary table, which improves the accuracy of semantic parsing of natural language.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, it is obvious that the drawings in the following description are only embodiments of the present invention, and for those skilled in the art, other drawings can be obtained according to the provided drawings without creative efforts.
FIG. 1 is a schematic structural diagram of a semantic parsing method in the prior art;
FIG. 2 is a schematic structural diagram of another conventional semantic analysis method;
fig. 3 is a schematic structural diagram of a semantic parsing system according to an embodiment of the present invention;
FIG. 4 is a flowchart illustrating a method for generating a GCDL enc-dec model according to an embodiment of the present invention;
fig. 5 is a schematic flow chart of a semantic analysis method according to an embodiment of the present invention;
fig. 6 is a schematic structural diagram of a semantic analysis apparatus according to an embodiment of the present invention;
fig. 7 is a schematic structural diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The term "include" and variations thereof as used herein are open-ended, i.e., "including but not limited to". The term "based on" is "based, at least in part, on". The term "one embodiment" means "at least one embodiment"; the term "another embodiment" means "at least one additional embodiment"; the term "some embodiments" means "at least some embodiments". Relevant definitions for other terms will be given in the following description.
It should be noted that the terms "first", "second", and the like in the present disclosure are only used for distinguishing different devices, modules, or units, and are not used for limiting the order or interdependence of the functions performed by the devices, modules, or units.
It is noted that the modifiers "a", "an", and "the" in the disclosure are exemplary rather than limiting, and those skilled in the art will understand them as "one or more" unless the context clearly dictates otherwise.
Abstract syntax tree: abstract Syntax Tree, a Tree representation of the Syntax structure of the AST source code, where each node in the Tree represents the structure of part of the code in the source code.
Conversion system: consists of a set of states and a set of transitions that represent moves between states; a transition system is used to describe dynamic processes.
Grammar model: code can be parsed into a standard-library abstract syntax tree through Python's built-in ast module, and the abstract syntax tree defined by the standard library can be further converted into an abstract syntax tree under an ASDL description. The abstract syntax tree can also be converted back into a meaning representation.
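As a concrete illustration of this round trip, the following minimal sketch uses only Python's standard ast module (ast.unparse and the indent argument of ast.dump require Python 3.9+); the toy statement is an invented example, not one from the patent.

```python
import ast

code = "x = a + b"               # a meaning representation written as Python source
tree = ast.parse(code)           # source -> standard-library abstract syntax tree

print(ast.dump(tree, indent=2))  # inspect the tree: Assign(targets=[Name(id='x')], ...)
print(ast.unparse(tree))         # abstract syntax tree -> back to source: x = a + b
```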
As noted in the background above, existing semantic parsing models construct semantic correspondences by manually defining a vocabulary table, or by learning a vocabulary table from paired natural language and knowledge bases, and thereby perform semantic parsing of natural language. However, the semantic correspondences constructed in this way are not distinct enough, so the meaning representation obtained by semantic parsing is inaccurate.
To avoid the inaccurate meaning representations caused by such indistinct semantic correspondences, the prior art also performs semantic parsing with a seq2seq model based on an encoder-decoder framework: the natural language is split into words and fed into the seq2seq model, which processes the input words in turn and generates the corresponding meaning representation, as shown in fig. 1. However, a meaning representation generated this way risks violating grammatical logic.
To solve that problem, the prior art further splits the natural language into words, feeds them into a seq2seq model that processes each input word in turn to generate a corresponding operation sequence, and inputs the generated operation sequence into an abstract syntax tree for processing to obtain the corresponding meaning representation, as shown in fig. 2. Although a meaning representation generated this way no longer violates grammatical logic, the abstract syntax tree requires the input operation sequence to reach a certain length, which forces the seq2seq model to generate a long operation sequence; an overlong operation sequence degrades the accuracy of the generated sequence and therefore the accuracy of the final meaning representation.
Therefore, the present invention provides a semantic parsing method, apparatus, electronic device and storage medium. The first-layer codec in a pre-established syntax-based dual-layer codec model (GCDL enc-dec model) directly processes the target natural language vector obtained by vectorizing the target natural language, yielding the corresponding coding vector and a target abstract summary tree operation sequence. The second-layer codec in the GCDL enc-dec model then processes the obtained coding vector and target abstract summary tree operation sequence to obtain a target abstract syntax tree operation sequence, and the obtained target abstract syntax tree operation sequence is processed to obtain the corresponding meaning representation. No vocabulary table needs to be constructed, so the accuracy of semantic parsing of natural language is improved. In addition, splitting the generation of the abstract syntax tree operation sequence into two stages effectively avoids the overlong operation sequences generated by existing seq2seq models, improving the accuracy of the generated operation sequence and hence of the generated meaning representation.
Referring to fig. 3, a schematic structural diagram of a semantic parsing system provided in an embodiment of the present invention is shown, where the semantic parsing system includes: a pre-established GCDL enc-dec model 100, a conversion system 200 and a grammar model 300. The pre-established GCDL enc-dec model 100 comprises a first layer of codec and a second layer of codec which are trained based on a training instance set, wherein the first layer of codec comprises a sentence coder and an abstract summary tree decoder, and the second layer of codec comprises an abstract summary tree coder and an abstract syntax tree decoder.
Referring to fig. 4, a flow diagram of the GCDL enc-dec model generation process provided in an embodiment of the present invention is shown; the generation process specifically includes the following steps:
s401: and acquiring a training example, wherein the training example comprises a natural language sample and a target meaning representation corresponding to the natural language sample.
In the specific implementation of step S401, a training instance containing a natural language sample and the meaning representation corresponding to it may be collected (for ease of distinction, the meaning representation corresponding to the collected natural language sample is referred to as the target meaning representation).
It should be noted that a plurality of training examples may be collected, and implementers may collect training examples according to their needs; this is not limited in the embodiments of the present application.
S402: and representing the target meaning corresponding to the natural language sample into a grammar model to obtain an abstract grammar tree and an abstract summary tree.
In the process of specifically executing step S402, after the training instance is obtained, the target meaning representation corresponding to the natural language sample in the obtained training instance may be input into the grammar model, so that the grammar model processes the input target meaning representation based on the grammar algorithm to obtain the abstract syntax tree; the leaf nodes of the obtained abstract syntax tree are then removed to obtain the abstract summary tree, as sketched below.
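The patent gives no code for this leaf-removal step; the sketch below is one plausible reading of "remove the leaf nodes of the abstract syntax tree", with the Node class and the example tree invented purely for illustration.

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    label: str
    children: list = field(default_factory=list)

def to_summary_tree(node: Node) -> Node:
    """Drop leaf nodes (instance-specific tokens), keeping the structural skeleton."""
    kept = [to_summary_tree(c) for c in node.children if c.children]
    return Node(node.label, kept)

# Abstract syntax tree for "x = a + b"; the leaves are 'x', 'a', 'b' and Add.
ast_root = Node("Assign", [
    Node("Name", [Node("x")]),
    Node("BinOp", [Node("Name", [Node("a")]), Node("Add"), Node("Name", [Node("b")])]),
])
summary = to_summary_tree(ast_root)  # -> Assign(Name, BinOp(Name, Name))
```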
S403: and inputting the abstract syntax tree and the abstract summary tree into a conversion system, processing the abstract syntax tree by the conversion system to obtain a first abstract syntax tree operation sequence, and processing the abstract summary tree to obtain a first abstract summary tree operation sequence.
In the specific execution of step S403, after the syntax model processes the input target meaning representation to obtain the abstract syntax tree and the abstract summary tree, the conversion system converts the abstract syntax tree into an abstract syntax tree operation sequence (for ease of distinction, referred to as the first abstract syntax tree operation sequence) and converts the abstract summary tree into an abstract summary tree operation sequence (for ease of distinction, referred to as the first abstract summary tree operation sequence). One plausible form of this tree-to-sequence conversion is sketched below.
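The patent does not spell out its transition set; the sketch below shows one common design in which a pre-order traversal emits APPLY(label, arity) operations and the inverse transition replays them. The Node class (repeated from the sketch above) and the APPLY operation are illustrative assumptions.

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    label: str
    children: list = field(default_factory=list)

def tree_to_ops(node: Node) -> list:
    """Linearise a tree into an operation sequence by pre-order traversal."""
    ops = [("APPLY", node.label, len(node.children))]
    for child in node.children:
        ops += tree_to_ops(child)
    return ops

def ops_to_tree(ops: list) -> Node:
    """Replay the operation sequence to rebuild the tree (the inverse transition)."""
    it = iter(ops)
    def build() -> Node:
        _, label, arity = next(it)
        return Node(label, [build() for _ in range(arity)])
    return build()

tree = Node("Assign", [Node("Name"), Node("BinOp", [Node("Name"), Node("Name")])])
ops = tree_to_ops(tree)          # [('APPLY', 'Assign', 2), ('APPLY', 'Name', 0), ...]
assert ops_to_tree(ops) == tree  # the round trip is lossless
```

The same replay logic serves both directions used in the patent: converting gold trees into operation sequences for training (S403) and converting predicted operation sequences back into trees at inference (S504).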
S404: and inputting a natural language sample vector obtained after vectorization processing of the natural language into a first layer of codec to be trained to obtain a coding vector, and processing the coding vector to obtain a second abstract summary tree operation sequence.
In the specific process of executing step S404, after the training example is obtained, the natural language sample in it may be vectorized to obtain a natural language sample vector, and this vector is input into the first-layer codec to be trained. The sentence encoder (NLD-encoder) in the first-layer codec to be trained encodes the natural language sample vector to obtain a coding vector, and the abstract sketch tree decoder (ASKT-decoder) in the first-layer codec to be trained decodes the obtained coding vector to obtain an abstract summary tree operation sequence (for ease of distinction, the abstract summary tree operation sequence obtained by processing the coding vector with the abstract summary tree decoder is referred to as the second abstract summary tree operation sequence).
It should be noted that the manner in which the abstract summary tree decoder in the first-layer codec to be trained decodes the coding vector to obtain the second abstract summary tree operation sequence can be referred to in formula (1).
p(AS | X) = ∏_t p(as_t | as_{<t}, S)    (1)

where X is the target natural language vector, S is the coding vector, AS is the second abstract summary tree operation sequence, as_t is the sub-operation obtained by processing the coding vector at time t, and as_{<t} is the sub-operation sequence obtained by processing the coding vector before time t.
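To make formula (1) concrete, here is a minimal PyTorch sketch of the first-layer codec; the patent fixes no architecture, so the LSTM layers, dimensions and teacher-forced interface are assumptions for illustration only.

```python
import torch.nn as nn

class SentenceEncoder(nn.Module):
    """NLD-encoder: encodes natural-language token ids into the coding vector S."""
    def __init__(self, vocab_size: int, emb_dim: int = 128, hidden_dim: int = 256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.lstm = nn.LSTM(emb_dim, hidden_dim, batch_first=True)

    def forward(self, tokens):               # tokens: (batch, seq_len)
        outputs, state = self.lstm(self.embed(tokens))
        return outputs, state                # outputs play the role of S

class SummaryTreeDecoder(nn.Module):
    """ASKT-decoder: models p(as_t | as_<t, S) from formula (1)."""
    def __init__(self, n_ops: int, emb_dim: int = 128, hidden_dim: int = 256):
        super().__init__()
        self.embed = nn.Embedding(n_ops, emb_dim)
        self.lstm = nn.LSTM(emb_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, n_ops)

    def forward(self, prev_ops, enc_state):  # prev_ops: gold ops shifted right
        hidden, _ = self.lstm(self.embed(prev_ops), enc_state)
        return self.out(hidden)              # logits over the next operation at each t
```

During training, prev_ops is the gold second abstract summary tree operation sequence shifted right by one step; at inference, the decoder feeds its own predictions back in step by step.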
S405: and inputting the coding vector and the second abstract summary tree operation sequence into a second layer of codec to be trained for processing to obtain a second abstract syntax tree operation sequence.
In the process of specifically executing step S405, after the first-layer codec to be trained processes the natural language sample vector to obtain the coding vector and the second abstract summary tree operation sequence, the obtained coding vector and second abstract summary tree operation sequence are input into the second-layer codec to be trained. The abstract summary tree encoder (ASKT-encoder) of the second-layer codec to be trained encodes the second abstract summary tree operation sequence to obtain the coding information of the second abstract summary tree operation sequence, and the abstract syntax tree decoder (AST-decoder) decodes the coding information of the second abstract summary tree operation sequence together with the coding vector to obtain an abstract syntax tree operation sequence (for ease of distinction, referred to as the second abstract syntax tree operation sequence).
The manner in which the abstract syntax tree decoder decodes the coding information of the second abstract summary tree operation sequence and the coding vector to obtain the second abstract syntax tree operation sequence can be referred to in formula (2).
p(AY | X) = ∏_t p(ay_t | ay_{<t}, Y, S)    (2)

where X is the target natural language vector, S is the coding vector, Y is the coding information of the second abstract summary tree operation sequence, AY is the second abstract syntax tree operation sequence, ay_t is the sub-operation obtained by processing the coding information at time t, and ay_{<t} is the sub-operation sequence obtained by processing the coding information before time t.
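A matching PyTorch sketch of the second-layer codec; conditioning each decoding step on pooled summaries of Y and S is one simple realisation of formula (2), not the patent's stated design.

```python
import torch
import torch.nn as nn

class SummaryOpEncoder(nn.Module):
    """ASKT-encoder: turns the summary tree operation sequence into coding info Y."""
    def __init__(self, n_ops: int, emb_dim: int = 128, hidden_dim: int = 256):
        super().__init__()
        self.embed = nn.Embedding(n_ops, emb_dim)
        self.lstm = nn.LSTM(emb_dim, hidden_dim, batch_first=True)

    def forward(self, ops):                       # ops: (batch, T_as)
        return self.lstm(self.embed(ops))[0]      # Y: (batch, T_as, hidden)

class AstOpDecoder(nn.Module):
    """AST-decoder: models p(ay_t | ay_<t, Y, S) by feeding pooled Y and S
    into every decoding step."""
    def __init__(self, n_ops: int, emb_dim: int = 128, hidden_dim: int = 256):
        super().__init__()
        self.embed = nn.Embedding(n_ops, emb_dim)
        self.lstm = nn.LSTM(emb_dim + 2 * hidden_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, n_ops)

    def forward(self, prev_ops, Y, S):            # S: (batch, T_x, hidden)
        ctx = torch.cat([Y.mean(dim=1), S.mean(dim=1)], dim=-1)  # (batch, 2*hidden)
        ctx = ctx.unsqueeze(1).expand(-1, prev_ops.size(1), -1)
        step_in = torch.cat([self.embed(prev_ops), ctx], dim=-1)
        hidden, _ = self.lstm(step_in)
        return self.out(hidden)                   # logits over AST operations
```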
S406: and inputting the second abstract syntax tree operation sequence into a conversion system to obtain an abstract syntax tree, and performing meaning conversion on the abstract syntax tree by using the second abstract syntax tree operation sequence and a syntax model to obtain meaning representation of the natural language sample.
In the process of specifically executing step S406, after the second layer codec to be trained is used to process the coding vector and the second abstract syntax tree operation sequence to obtain a second abstract syntax tree operation sequence, the second abstract syntax tree operation sequence is input into the conversion system, so that the conversion system is used to convert the second abstract syntax tree operation sequence into an abstract syntax tree, and the syntax model is used to perform meaning conversion on the abstract syntax tree to obtain meaning representation of the natural language sample.
S407: and calculating by using the first abstract syntax tree operation sequence, the first abstract summary tree operation sequence, the second abstract syntax tree operation sequence and the second abstract syntax tree operation sequence to obtain a loss function.
In the process of specifically executing step S407, a first loss function is calculated by using the first abstract summary tree operation sequence and the second abstract summary tree operation sequence, a second loss function is calculated by using the first abstract syntax tree operation sequence and the second abstract syntax tree operation sequence, and finally the loss function of the GCDL enc-dec model is calculated from the first loss function and the second loss function, as sketched below.
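The patent does not specify the loss form; a common choice for supervised sequence prediction, assumed here, is token-level cross-entropy on each layer's operation sequence, summed into one objective.

```python
import torch.nn.functional as F

def gcdl_loss(sketch_logits, gold_sketch_ops, ast_logits, gold_ast_ops):
    # First loss: predicted (second) vs. gold (first) abstract summary tree sequence.
    # Logits arrive as (batch, T, n_ops); cross_entropy wants (batch, n_ops, T).
    loss_summary = F.cross_entropy(sketch_logits.transpose(1, 2), gold_sketch_ops)
    # Second loss: predicted (second) vs. gold (first) abstract syntax tree sequence.
    loss_ast = F.cross_entropy(ast_logits.transpose(1, 2), gold_ast_ops)
    return loss_summary + loss_ast   # combined GCDL enc-dec training loss
```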
S408: and taking the meaning expression of the natural language sample approaching the target meaning expression as a training target, and adjusting the parameters of the first layer codec to be trained and the second layer codec to be trained through a loss function until the first layer codec to be trained and the second layer codec to be trained converge to obtain the first layer codec and the second layer codec.
In the process of specifically executing step S408, after the loss function of the GCDL enc-dec model is calculated, with the meaning representation of the natural language sample approaching the target meaning representation as the training target, the parameters of the sentence encoder, the abstract summary tree decoder, the abstract summary tree encoder and the abstract syntax tree decoder are adjusted through the loss function until these four components converge, so as to obtain the GCDL enc-dec model; a sketch of this parameter-adjustment loop follows.
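A minimal training-loop sketch tying the pieces together; the model wrapper, batch fields and optimizer choice are assumptions, and gcdl_loss is the sketch above.

```python
import torch

def train_gcdl(model, train_loader, epochs: int = 10, lr: float = 1e-3):
    """`model` is assumed to wrap the four sub-modules sketched above and to
    return teacher-forced logits for both operation sequences."""
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)
    for _ in range(epochs):
        for batch in train_loader:   # batches built from the training instances
            sketch_logits, ast_logits = model(batch.tokens, batch.gold_sketch_ops)
            loss = gcdl_loss(sketch_logits, batch.gold_sketch_ops,
                             ast_logits, batch.gold_ast_ops)
            optimizer.zero_grad()
            loss.backward()          # adjust the parameters of both layers
            optimizer.step()
```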
In the embodiment of the application, the first-layer codec and the second-layer codec in the GCDL enc-dec model to be trained are trained with training examples to obtain the GCDL enc-dec model. When semantic parsing is subsequently performed on a natural language, the first-layer codec in the GCDL enc-dec model directly processes the target natural language vector obtained by vectorizing the target natural language to obtain the corresponding target coding vector and target abstract summary tree operation sequence; the second-layer codec in the GCDL enc-dec model then processes the obtained target coding vector and target abstract summary tree operation sequence to obtain the target abstract syntax tree operation sequence, and the obtained target abstract syntax tree operation sequence is processed to obtain the corresponding meaning representation, without constructing a vocabulary table. The accuracy of semantic parsing of natural language is therefore improved. In addition, splitting the generation of the abstract syntax tree operation sequence into two stages effectively avoids the overlong operation sequences generated by existing seq2seq models, improving the accuracy of the generated operation sequence and hence of the generated meaning representation.
Referring to fig. 5, a schematic flow diagram of a semantic analysis method provided in an embodiment of the present invention is shown, where the semantic analysis method specifically includes the following steps:
s501: the method comprises the steps of obtaining a target natural language, carrying out vectorization processing on the target natural language to obtain a target natural language vector, inputting the target natural language vector into a pre-established GCDL enc-dec model, wherein the GCDL enc-dec model comprises a first layer of codec and a second layer of codec which are obtained based on training of a training example, and the training example comprises a natural language sample and a target meaning representation corresponding to the natural language sample.
In step S501, a GCDL enc-dec model is pre-established, where the GCDL enc-dec model includes a first layer codec and a second layer codec trained based on a training instance set, the first layer codec includes a sentence coder and an abstract summary tree decoder, the second layer codec includes an abstract summary tree coder and an abstract syntax tree decoder, and the training instance includes a natural language sample and a target meaning representation corresponding to the natural language sample.
It should be noted that, for the process of training the first layer codec and the second layer codec based on the training example set, reference may be made to a method for generating the GCDL enc-dec model disclosed in fig. 4, and details are not repeated here.
In the specific execution process of step S501, after the natural language to be semantically parsed is acquired (for ease of distinction, referred to as the target natural language), the acquired target natural language is vectorized to obtain a target natural language vector, and the obtained target natural language vector is input into the pre-constructed GCDL enc-dec model.
S502: and processing the target natural language vector by using a first layer of codec to obtain a target coding vector, and processing the target coding vector to obtain a target abstract summary tree operation sequence.
In the process of specifically executing step S502, after the obtained target natural language vector is input into the pre-constructed GCDL enc-dec model, the sentence encoder in the first-layer codec of the GCDL enc-dec model encodes the target natural language vector to obtain a target coding vector, and the abstract summary tree decoder in the first-layer codec decodes the target coding vector to obtain a target abstract summary tree operation sequence.
S503: and processing the target abstract syntax tree operation sequence and the target coding vector by utilizing a second layer of coder-decoder to obtain a target abstract syntax tree operation sequence.
In the process of specifically executing step S503, after the target natural language vector is processed by the first layer codec to obtain the target coding vector and the target abstract summary tree operation sequence, the obtained target coding vector and the target abstract summary tree operation sequence are input to the second layer codec, so that the abstract summary tree encoder of the second layer codec performs coding processing on the target abstract summary tree operation sequence to obtain the coding information of the target abstract summary tree operation sequence, and the abstract syntax tree decoder performs decoding processing on the coding information of the target abstract summary tree operation sequence and the target coding vector to obtain the target abstract syntax tree operation sequence.
S504: and inputting the operation sequence of the target abstract syntax tree into a conversion system to obtain a target abstract syntax tree, and performing meaning conversion on the target abstract syntax tree by using a syntax model to obtain meaning representation of the target natural language.
In the process of specifically executing step S504, after the target coding vector and the target abstract syntax tree operation sequence are processed by the second layer codec to obtain a target abstract syntax tree operation sequence, the target abstract syntax tree operation sequence is input into the conversion system, so that the target abstract syntax tree operation sequence is converted into the target abstract syntax tree by the conversion system, and the target abstract syntax tree is subjected to meaning conversion by the syntax model to obtain a meaning representation of the target natural language.
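Putting S501-S504 together, the sketch below shows the end-to-end inference flow; every helper name (vectorize, generate, replay, to_meaning) is an illustrative assumption rather than an interface defined by the patent.

```python
def parse_to_meaning(sentence, model, transition_system, grammar_model):
    """End-to-end flow of fig. 5 under assumed interfaces."""
    x = vectorize(sentence)                          # S501: target natural language vector
    s = model.sentence_encoder(x)                    # S502: target coding vector
    as_ops = model.summary_tree_decoder.generate(s)  # S502: summary tree op sequence
    y = model.summary_tree_encoder(as_ops)           # S503: coding info of the op sequence
    ay_ops = model.ast_decoder.generate(y, s)        # S503: target syntax tree op sequence
    ast_tree = transition_system.replay(ay_ops)      # S504: op seq -> abstract syntax tree
    return grammar_model.to_meaning(ast_tree)        # S504: tree -> meaning representation
```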
The invention provides a semantic parsing method. A target natural language vector is obtained by vectorizing an acquired target natural language and is input into a pre-established GCDL enc-dec model. The first-layer codec in the GCDL enc-dec model processes the target natural language vector to obtain a target coding vector and processes the target coding vector to obtain a target abstract summary tree operation sequence. The second-layer codec in the GCDL enc-dec model processes the target abstract summary tree operation sequence and the target coding vector to obtain a target abstract syntax tree operation sequence. The target abstract syntax tree operation sequence is input into a conversion system to obtain a target abstract syntax tree, and the syntax model performs meaning conversion on the target abstract syntax tree to obtain the meaning representation of the target natural language. With this technical scheme, the corresponding meaning representation is obtained without constructing a vocabulary table, improving the accuracy of semantic parsing of natural language. In addition, splitting the generation of the abstract syntax tree operation sequence into two stages effectively avoids the overlong operation sequences generated by existing seq2seq models, improving the accuracy of the generated operation sequence and hence of the generated meaning representation.
Based on the semantic parsing method disclosed by the embodiment of the present invention, the embodiment of the present invention further discloses a semantic parsing apparatus, as shown in fig. 6, the semantic parsing apparatus includes:
the first obtaining unit 61 is configured to obtain a target natural language, and input a vector of the target natural language obtained after vectorization processing of the target natural language into a pre-established GCDL enc-dec model, where the GCDL enc-dec model includes a first layer of codec and a second layer of codec obtained by training based on a training example, and the training example includes a natural language sample and a target meaning representation corresponding to the natural language sample;
a first processing unit 62, configured to process the target natural language vector by using a first layer codec to obtain a target coding vector, and process the target coding vector to obtain a target abstract summary tree operation sequence;
a second processing unit 63, configured to process the target abstract summary tree operation sequence and the target coding vector by using a second-layer codec to obtain a target abstract syntax tree operation sequence;
and the first semantic parsing unit 64 is configured to input the target abstract syntax tree operation sequence into the conversion system to obtain a target abstract syntax tree, and perform meaning conversion on the target abstract syntax tree by using a syntax model to obtain a meaning representation of the target natural language.
The specific principle and the execution process of each unit in the semantic analysis device disclosed in the embodiment of the present invention are the same as those of the semantic analysis method disclosed in the embodiment of the present invention, and reference may be made to corresponding parts in the semantic analysis method disclosed in the embodiment of the present invention, which are not described herein again.
The invention provides a semantic parsing device. A target natural language vector is obtained by vectorizing an acquired target natural language and is input into a pre-established GCDL enc-dec model. The first-layer codec in the GCDL enc-dec model processes the target natural language vector to obtain a target coding vector and processes the target coding vector to obtain a target abstract summary tree operation sequence. The second-layer codec in the GCDL enc-dec model processes the target abstract summary tree operation sequence and the target coding vector to obtain a target abstract syntax tree operation sequence. The target abstract syntax tree operation sequence is input into the conversion system to obtain a target abstract syntax tree, and the syntax model performs meaning conversion on the target abstract syntax tree to obtain the meaning representation of the target natural language. With this scheme, the corresponding meaning representation is obtained without constructing a vocabulary table, improving the accuracy of semantic parsing of natural language; splitting the generation of the abstract syntax tree operation sequence into two stages also avoids the overlong operation sequences generated by existing seq2seq models, improving the accuracy of the generated operation sequence and hence of the generated meaning representation.
Optionally, the process of training the first layer codec and the second layer codec based on the training instance set includes:
the second acquisition unit is used for acquiring a training example, wherein the training example comprises a natural language sample and a target meaning representation corresponding to the natural language sample;
the third processing unit is used for inputting the target meaning representation corresponding to the natural language sample into the grammar model to obtain an abstract syntax tree and an abstract summary tree;
the conversion unit is used for inputting the abstract syntax tree and the abstract summary tree into the conversion system, so that the conversion system processes the abstract syntax tree to obtain a first abstract syntax tree operation sequence and processes the abstract summary tree to obtain a first abstract summary tree operation sequence;
the fourth processing unit is used for inputting a natural language sample vector obtained after vectorization processing of a natural language into the first layer of codec to be trained to obtain a coding vector, and processing the coding vector to obtain a second abstract summary tree operation sequence;
the fifth processing unit is used for inputting the coding vector and the second abstract summary tree operation sequence into a second layer codec to be trained for processing to obtain a second abstract syntax tree operation sequence;
the second semantic parsing unit is used for inputting the second abstract syntax tree operation sequence into the conversion system to obtain a third abstract syntax tree, and performing meaning conversion on the third abstract syntax tree by using the syntax model to obtain meaning representation of the natural language sample;
the calculation unit is used for calculating a loss function by utilizing the first abstract syntax tree operation sequence, the first abstract summary tree operation sequence, the second abstract summary tree operation sequence and the second abstract syntax tree operation sequence;
and the training unit is used for adjusting the parameters of the first layer codec to be trained and the second layer codec to be trained through the loss function until the first layer codec to be trained and the second layer codec to be trained converge to obtain the first layer codec and the second layer codec.
Optionally, the first layer codec includes a sentence coder and an abstract summary tree decoder, and the fourth processing unit includes:
the first coding unit is used for inputting a natural language sample vector obtained after vectorization processing of a natural language into a sentence coder to obtain a coding vector;
and the first decoding unit is used for inputting the coding vector into the abstract summary tree decoder so that the abstract summary tree decoder processes the coding vector to obtain a second abstract summary tree operation sequence.
Optionally, the second-layer codec includes an abstract summary tree encoder and an abstract syntax tree decoder, and the fifth processing unit includes:
the second coding unit is used for inputting the second abstract summary tree operation sequence into the abstract summary tree coder to obtain the coding information of the second abstract summary tree operation sequence;
and the second decoding unit is used for inputting the coding information of the second abstract summary tree operation sequence and the coding vector into the abstract syntax tree decoder to obtain a second abstract syntax tree operation sequence.
Optionally, the first processing unit includes:
the third coding unit is used for processing the target natural language vector by utilizing a sentence coder to obtain a target coding vector;
and the third decoding unit is used for processing the target coding vector by using the abstract summary tree decoder to obtain a target abstract summary tree operation sequence.
Optionally, the second processing unit includes:
the fourth coding unit is used for processing the target abstract summary tree operation sequence by using the abstract summary tree coder to obtain the coding information of the target abstract summary tree operation sequence;
and the fourth decoding unit is used for processing the coding information of the target abstract summary tree operation sequence and the target coding vector by using the abstract syntax tree decoder to obtain the target abstract syntax tree operation sequence.
An electronic device is provided in an embodiment of the present application, as shown in fig. 7; the electronic device includes a processor 701 and a memory 702, where the memory 702 is configured to store program code and data for semantic parsing, and the processor 701 is configured to call the program instructions in the memory to execute the steps of the semantic parsing method shown in the foregoing embodiments.
An embodiment of the present application provides a storage medium including a stored program; when the program runs, the device on which the storage medium is located is controlled to execute the semantic parsing method shown in the foregoing embodiments.
The embodiments in the present specification are described in a progressive manner, and the same and similar parts among the embodiments are referred to each other, and each embodiment focuses on the differences from the other embodiments. In particular, the system or system embodiments are substantially similar to the method embodiments and therefore are described in a relatively simple manner, and reference may be made to some of the descriptions of the method embodiments for related points. The above-described system and system embodiments are merely illustrative, wherein units described as separate components may or may not be physically separate, and components shown as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the present embodiment. One of ordinary skill in the art can understand and implement it without inventive effort.
Those of skill would further appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, or combinations of both, and that the various illustrative components and steps have been described above generally in terms of their functionality in order to clearly illustrate this interchangeability of hardware and software. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
The previous description of the disclosed embodiments is provided to enable any person skilled in the art to make or use the present invention. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the invention. Thus, the present invention is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.
The foregoing is only a preferred embodiment of the present invention, and it should be noted that it is obvious to those skilled in the art that various modifications and improvements can be made without departing from the principle of the present invention, and these modifications and improvements should also be considered as the protection scope of the present invention.

Claims (10)

1. A method of semantic parsing, the method comprising:
acquiring a target natural language, and inputting a target natural language vector obtained after vectorization processing of the target natural language into a pre-established GCDL enc-dec model, wherein the GCDL enc-dec model comprises a first layer of codec and a second layer of codec obtained by training based on a training example, and the training example comprises a natural language sample and a target meaning representation corresponding to the natural language sample;
processing the target natural language vector by using the first layer of codec to obtain a target coding vector, and processing the target coding vector to obtain a target abstract summary tree operation sequence;
processing the target abstract summary tree operation sequence and the target coding vector by utilizing the second layer of codec to obtain a target abstract syntax tree operation sequence;
and inputting the target abstract syntax tree operation sequence into a conversion system to obtain a target abstract syntax tree, and performing meaning conversion on the target abstract syntax tree by using a syntax model to obtain meaning representation of the target natural language.
2. The method of claim 1, wherein the process of training to obtain the first layer codec and the second layer codec based on the training example comprises:
acquiring a training example, wherein the training example comprises a natural language sample and a target meaning representation corresponding to the natural language sample;
inputting the target meaning representation corresponding to the natural language sample into the syntax model to obtain an abstract syntax tree and an abstract summary tree;
inputting the abstract syntax tree and the abstract summary tree into the conversion system, so that the conversion system processes the abstract syntax tree to obtain a first abstract syntax tree operation sequence and processes the abstract summary tree to obtain a first abstract summary tree operation sequence;
inputting a natural language sample vector obtained after vectorization processing of the natural language sample into a first layer codec to be trained to obtain a coding vector, and processing the coding vector to obtain a second abstract summary tree operation sequence;
inputting the coding vector and the second abstract summary tree operation sequence into a second layer codec to be trained for processing to obtain a second abstract syntax tree operation sequence;
inputting the second abstract syntax tree operation sequence into the conversion system to obtain a third abstract syntax tree, and performing semantic parsing on the third abstract syntax tree by using the syntax model to obtain a meaning representation of the natural language sample;
calculating by using the first abstract syntax tree operation sequence, the first abstract summary tree operation sequence, the second abstract syntax tree operation sequence and the second abstract summary tree operation sequence to obtain a loss function;
and taking, as a training target, that the meaning representation of the natural language sample approaches the target meaning representation, and adjusting parameters of the first layer codec to be trained and the second layer codec to be trained through the loss function until the first layer codec to be trained and the second layer codec to be trained converge, to obtain the first layer codec and the second layer codec.
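For illustration, one plausible reading of the loss computation in claim 2 is token-level cross-entropy between each decoder's predicted operation distribution and the gold operation sequence produced by the conversion system, combined across the two layers. The weighting factor alpha and the tensor shapes below are assumptions of this sketch:

    import torch
    import torch.nn.functional as F

    def joint_loss(summary_logits, gold_summary_ops,
                   syntax_logits, gold_syntax_ops, alpha=0.5):
        # summary_logits: (T1, V1) scores from the first layer decoder
        # syntax_logits:  (T2, V2) scores from the second layer decoder
        # gold_*_ops:     LongTensors of gold operation ids, shapes (T1,) and (T2,)
        # alpha is an assumed weighting between the two terms.
        loss_summary = F.cross_entropy(summary_logits, gold_summary_ops)
        loss_syntax = F.cross_entropy(syntax_logits, gold_syntax_ops)
        return alpha * loss_summary + (1.0 - alpha) * loss_syntax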
3. The method according to claim 2, wherein the first layer codec comprises a sentence encoder and an abstract summary tree decoder, and the inputting the natural language sample vector obtained after vectorization processing of the natural language sample into the first layer codec to be trained to obtain a coding vector, and processing the coding vector to obtain a second abstract summary tree operation sequence comprises:
inputting the natural language sample vector obtained after vectorization processing of the natural language sample into the sentence encoder to obtain a coding vector;
and inputting the coding vector into the abstract summary tree decoder, so that the abstract summary tree decoder processes the coding vector to obtain a second abstract summary tree operation sequence.
4. The method of claim 3, wherein the second layer codec comprises an abstract summary tree encoder and an abstract syntax tree decoder, and the inputting the coding vector and the second abstract summary tree operation sequence into the second layer codec to be trained for processing to obtain a second abstract syntax tree operation sequence comprises:
inputting the second abstract summary tree operation sequence into the abstract summary tree encoder to obtain coding information of the second abstract summary tree operation sequence;
and inputting the coding information of the second abstract summary tree operation sequence and the coding vector into the abstract syntax tree decoder to obtain a second abstract syntax tree operation sequence.
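Claims 3 and 4 together name four modules: a sentence encoder, an abstract summary tree decoder, an abstract summary tree encoder, and an abstract syntax tree decoder. The GRU-based sketch below shows one way these modules could be wired for teacher-forced training; the network types, dimensions, and the fusion of the two encodings are assumptions, as the claims do not fix them:

    import torch
    import torch.nn as nn

    class TwoLayerCodec(nn.Module):
        # Minimal sketch of the two layer encoder-decoder pipeline.
        def __init__(self, n_tokens, n_summary_ops, n_syntax_ops, d=256):
            super().__init__()
            self.tok_emb = nn.Embedding(n_tokens, d)
            self.sum_emb = nn.Embedding(n_summary_ops, d)
            self.syn_emb = nn.Embedding(n_syntax_ops, d)
            self.sentence_enc = nn.GRU(d, d, batch_first=True)  # sentence encoder
            self.summary_dec = nn.GRU(d, d, batch_first=True)   # abstract summary tree decoder
            self.summary_enc = nn.GRU(d, d, batch_first=True)   # abstract summary tree encoder
            self.syntax_dec = nn.GRU(d, d, batch_first=True)    # abstract syntax tree decoder
            self.fuse = nn.Linear(2 * d, d)                     # assumed fusion of both encodings
            self.sum_out = nn.Linear(d, n_summary_ops)
            self.syn_out = nn.Linear(d, n_syntax_ops)

        def forward(self, tokens, summary_ops, syntax_ops):
            # First layer: encode the sentence, decode summary tree operations.
            _, h_sent = self.sentence_enc(self.tok_emb(tokens))
            dec1, _ = self.summary_dec(self.sum_emb(summary_ops), h_sent)
            summary_logits = self.sum_out(dec1)
            # Second layer: encode the summary tree operation sequence, fuse it
            # with the sentence encoding, then decode syntax tree operations.
            _, h_sum = self.summary_enc(self.sum_emb(summary_ops))
            h0 = torch.tanh(self.fuse(torch.cat([h_sent, h_sum], dim=-1)))
            dec2, _ = self.syntax_dec(self.syn_emb(syntax_ops), h0)
            return summary_logits, self.syn_out(dec2)

During training, the gold operation sequences serve as decoder inputs (teacher forcing), and the per-step logits would feed a loss such as the one sketched after claim 2.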
5. The method of claim 3, wherein the processing the target natural language vector by using the first layer codec to obtain a target coding vector, and processing the target coding vector to obtain a target abstract summary tree operation sequence comprises:
processing the target natural language vector by using the sentence encoder to obtain a target coding vector;
and processing the target coding vector by using the abstract summary tree decoder to obtain a target abstract summary tree operation sequence.
6. The method of claim 4, wherein the processing the target abstract summary tree operation sequence and the target coding vector by using the second layer codec to obtain a target abstract syntax tree operation sequence comprises:
processing the target abstract summary tree operation sequence by using the abstract summary tree encoder to obtain coding information of the target abstract summary tree operation sequence;
and processing the coding information of the target abstract summary tree operation sequence and the target coding vector by using the abstract syntax tree decoder to obtain a target abstract syntax tree operation sequence.
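At inference time (claims 5 and 6) no gold sequences exist, so the first layer must emit the target abstract summary tree operation sequence step by step before the second layer consumes it. The CPU-only greedy-decoding sketch below reuses the hypothetical TwoLayerCodec modules and imports above; the start and end operation ids are assumptions:

    @torch.no_grad()
    def greedy_summary_ops(model, tokens, start_id, end_id, max_len=64):
        # Decode the target abstract summary tree operation sequence greedily.
        _, h = model.sentence_enc(model.tok_emb(tokens))  # tokens: (1, T)
        ops = [start_id]
        for _ in range(max_len):
            inp = model.sum_emb(torch.tensor([[ops[-1]]]))
            out, h = model.summary_dec(inp, h)
            next_id = model.sum_out(out[:, -1]).argmax(-1).item()
            if next_id == end_id:
                break
            ops.append(next_id)
        return ops[1:]  # strip the start symbol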
7. A semantic parsing apparatus, the apparatus comprising:
a first acquisition unit, used for acquiring a target natural language and inputting a target natural language vector obtained after vectorization processing of the target natural language into a pre-established GCDLenc-dec model, wherein the GCDLenc-dec model comprises a first layer codec and a second layer codec obtained by training based on a training example, and the training example comprises a natural language sample and a target meaning representation corresponding to the natural language sample;
a first processing unit, used for processing the target natural language vector by using the first layer codec to obtain a target coding vector, and processing the target coding vector to obtain a target abstract summary tree operation sequence;
a second processing unit, used for processing the target abstract summary tree operation sequence and the target coding vector by using the second layer codec to obtain a target abstract syntax tree operation sequence;
and a first semantic parsing unit, used for inputting the target abstract syntax tree operation sequence into a conversion system to obtain a target abstract syntax tree, and performing meaning conversion on the target abstract syntax tree based on a syntax model to obtain a meaning representation of the target natural language.
8. The apparatus of claim 7, wherein, for the process of training to obtain the first layer codec and the second layer codec based on the training example, the apparatus further comprises:
a second acquisition unit, used for acquiring a training example, wherein the training example comprises a natural language sample and a target meaning representation corresponding to the natural language sample;
a third processing unit, used for inputting the target meaning representation corresponding to the natural language sample into the syntax model to obtain an abstract syntax tree and an abstract summary tree;
a conversion unit, used for inputting the abstract syntax tree and the abstract summary tree into the conversion system, so that the conversion system processes the abstract syntax tree to obtain a first abstract syntax tree operation sequence and processes the abstract summary tree to obtain a first abstract summary tree operation sequence;
a fourth processing unit, used for inputting a natural language sample vector obtained after vectorization processing of the natural language sample into a first layer codec to be trained to obtain a coding vector, and processing the coding vector to obtain a second abstract summary tree operation sequence;
a fifth processing unit, used for inputting the coding vector and the second abstract summary tree operation sequence into a second layer codec to be trained for processing to obtain a second abstract syntax tree operation sequence;
a second semantic parsing unit, used for inputting the second abstract syntax tree operation sequence into the conversion system to obtain a third abstract syntax tree, and performing semantic parsing on the third abstract syntax tree by using the syntax model to obtain a meaning representation of the natural language sample;
a calculating unit, used for calculating by using the first abstract syntax tree operation sequence, the first abstract summary tree operation sequence, the second abstract syntax tree operation sequence and the second abstract summary tree operation sequence to obtain a loss function;
and a training unit, used for adjusting parameters of the first layer codec to be trained and the second layer codec to be trained through the loss function until the first layer codec to be trained and the second layer codec to be trained converge, to obtain the first layer codec and the second layer codec.
9. An electronic device, comprising a processor and a memory, wherein the memory stores program code and data for semantic parsing, and the processor is configured to invoke the program code in the memory to perform the semantic parsing method according to any one of claims 1-6.
10. A storage medium, wherein the storage medium comprises a stored program, and when the program runs, a device on which the storage medium is located is controlled to execute the semantic parsing method according to any one of claims 1-6.
CN202110744721.3A 2021-06-30 2021-06-30 Semantic parsing method and device, electronic equipment and storage medium Pending CN113486647A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110744721.3A CN113486647A (en) 2021-06-30 2021-06-30 Semantic parsing method and device, electronic equipment and storage medium

Publications (1)

Publication Number Publication Date
CN113486647A true CN113486647A (en) 2021-10-08

Family

ID=77939141

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110744721.3A Pending CN113486647A (en) 2021-06-30 2021-06-30 Semantic parsing method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN113486647A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023160346A1 (en) * 2022-02-28 2023-08-31 International Business Machines Corporation Meaning and sense preserving textual encoding and embedding

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination