CN113553822A - Ancient poetry generation model training method, ancient poetry generation method, device and storage medium
- Publication number: CN113553822A (application CN202110869571.9A)
- Authority: CN (China)
- Prior art keywords: ancient poetry, input, poetry, ancient, different formats
- Prior art date:
- Legal status: Granted (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- G06F40/166 — Handling natural language data; Text processing; Editing, e.g. inserting or deleting
- G06F40/186 — Handling natural language data; Text processing; Templates
- G06F40/103 — Handling natural language data; Text processing; Formatting, i.e. changing of presentation of documents
- G06F40/117 — Handling natural language data; Text processing; Tagging; Marking up; Designating a block; Setting of attributes
- Y02D10/00 — Energy efficient computing, e.g. low power processors, power management or thermal management
Abstract
The application provides an ancient poetry generation model training method, an ancient poetry generation method, a device, and a storage medium. The training method comprises: inputting an ancient poetry data set into an input module of the ancient poetry generation model, the data set comprising a plurality of ancient poems in different formats together with the global control information and format control information corresponding to each poem; mapping the global control information into a number of individual characters and/or identifiers, and explicitly constructing global control templates in different formats for each character or identifier; for each character in the poem, explicitly constructing format control templates in different formats based on the format control information; and inputting the global control templates and format control templates into a decoding module of the ancient poetry generation model to train it. The method allows ancient poetry data in a variety of formats to be mixed for model training, and the format of the poetry to be generated can be controlled explicitly at generation time, which improves user interactivity and the flexibility and controllability of the generated lines.
Description
Technical Field
The application relates to the technical field of ancient poetry generation, and in particular to an ancient poetry generation model training method, an ancient poetry generation method, a device, and a storage medium.
Background
In recent years, as artificial intelligence and natural language processing technologies have been studied in ever greater depth, more and more deep learning techniques have been put into practice. Classical Chinese poetry is a treasure of traditional Chinese culture: its elegant, concise lines and neatly paralleled couplets are a pleasure to read. In games, generating corresponding ancient poems in a personalized way for certain events, themes, or even individual players can increase players' sense of novelty and interaction. In some marketing scenarios, generating ancient poems customized to players' requirements encourages spontaneous sharing by players, reduces marketing and promotion costs, and greatly improves promotion efficiency.
Deep-learning-based ancient poetry generation has matured in recent years, but traditional text generation models based on recurrent neural networks (RNNs) have difficulty explicitly controlling the format of the generated text, such as the length of each line, the tonal pattern (pingze, i.e. level and oblique tones), and the rhyme. Conventional Transformer-based encoder-decoder models encode text format information by adding different position encodings, so that the length can be controlled at generation time. These methods are not well suited to ancient poetry generation: either a separate model must be trained for each format requirement, which wastes resources; or an encoder-decoder architecture is used, whose overall computation is too heavy to respond quickly to real-time user requests; and formats such as rhyme and tonal pattern still cannot be explicitly controlled well.
Disclosure of Invention
In view of this, an object of the present application is to provide an ancient poetry generation model training method, an ancient poetry generation method, a device and a storage medium, which allow ancient poetry data in a variety of formats to be mixed during model training and allow the format of the poetry to be generated to be controlled explicitly during generation, thereby improving user interactivity and the flexibility and controllability of the generated lines.
In a first aspect, an embodiment of the present application provides a training method for an ancient poetry generation model, the training method comprising:
acquiring an ancient poetry data set and a pre-constructed ancient poetry generation model; the data set comprises a plurality of ancient poems in different formats together with the global control information and format control information corresponding to each poem;
inputting the ancient poetry data set into an input module of the ancient poetry generation model;
extracting, for each poem, the corresponding global control information and format control information;
mapping the global control information corresponding to the poem into a number of individual characters and/or identifiers, and explicitly constructing global control templates in different formats for each character or identifier;
for each character in the poem, explicitly constructing format control templates in different formats based on the format control information corresponding to the poem;
inputting the global control templates and format control templates in different formats corresponding to the poem into a decoding module of the ancient poetry generation model, and training the decoding module.
In one possible embodiment, the global control information includes at least one of: theme information and emotion information.
In one possible embodiment, the step of mapping the global control information corresponding to the poem into a plurality of individual characters and/or identifiers includes:
mapping the theme information corresponding to the poem into a plurality of individual characters; and/or
mapping the emotion information corresponding to the poem into a plurality of individual identifiers.
In one possible embodiment, the format control information includes at least one of: tonal pattern (pingze) information, rhyme information, line length, and number of lines.
In a possible implementation, the step of explicitly constructing, for each character in the poem, format control templates in different formats based on the format control information corresponding to the poem includes:
for each character in the poem, explicitly constructing a format control template representing the tonal pattern based on the pingze information corresponding to the poem; and/or
for each character in the poem, explicitly constructing a format control template representing the rhyme based on the rhyme information corresponding to the poem; and/or
for each character in the poem, explicitly constructing a format control template representing a relative position and/or a global position based on the line length corresponding to the poem; and/or
for each character in the poem, explicitly constructing a format control template representing the line type based on the number of lines corresponding to the poem.
In a possible implementation, the step of inputting the global control templates and format control templates in different formats corresponding to the poem into the decoding module of the ancient poetry generation model includes:
for each character or identifier corresponding to the global control information, converting the explicitly constructed global control templates in different formats into global control vectors in different formats;
summing the global control vectors in the different formats to obtain a target global control vector corresponding to each character or identifier;
for each character in the poem, converting the explicitly constructed format control templates in different formats into format control vectors in different formats;
summing the character vector of the previous moment and the format control vectors, in the different formats, of the character to be generated at the current moment, to obtain a target format control vector corresponding to the current moment in the poem;
inputting the target global control vector corresponding to each character or identifier and the target format control vector corresponding to the current moment into the decoding module of the ancient poetry generation model, and autoregressively decoding the predicted text of the character to be generated at the current moment.
In a second aspect, an embodiment of the present application further provides an ancient poetry generation method, the generation method comprising:
acquiring input global control information and/or input format control information matching a user;
inputting the input global control information and/or the input format control information into the trained ancient poetry generation model;
mapping the input global control information into a plurality of individual characters and/or identifiers through an input module of the ancient poetry generation model, and matching input global control templates in different formats for each character or identifier;
for each character of the ancient poem to be generated, matching input format control templates in different formats based on the input format control information;
inputting the input global control templates in the different formats and the input format control templates of each character in the different formats into a decoding module of the ancient poetry generation model, and autoregressively decoding the predicted text of the next character, so as to generate the ancient poem.
In one possible embodiment, the input global control information includes at least one of: input theme information and input emotion information.
In one possible embodiment, the step of mapping, by the input module of the ancient poetry generation model, the input global control information into a plurality of individual characters and/or identifiers includes:
mapping the input theme information into a plurality of individual characters through the input module of the ancient poetry generation model; and/or
mapping the input emotion information into a plurality of individual identifiers through the input module of the ancient poetry generation model.
In one possible embodiment, the input format control information includes at least one of: input tonal pattern (pingze) information, input rhyme information, input line length, and input number of lines.
In a possible embodiment, the step of matching, for each character of the ancient poem to be generated, input format control templates in different formats based on the input format control information includes:
for each character of the ancient poem to be generated, matching an input format control template representing the tonal pattern based on the input pingze information; and/or
for each character of the ancient poem to be generated, matching an input format control template representing the rhyme based on the input rhyme information; and/or
for each character of the ancient poem to be generated, matching an input format control template representing a relative position and/or a global position based on the input line length; and/or
for each character of the ancient poem to be generated, matching an input format control template representing the line type based on the input number of lines.
In one possible embodiment, the step of inputting the input global control templates in the different formats and the input format control templates of each character in the different formats into the decoding module of the ancient poetry generation model to autoregressively decode the predicted text of the next character includes:
for each character or identifier corresponding to the input global control information, converting the matched input global control templates in different formats into input global control vectors in different formats;
summing the input global control vectors in the different formats to obtain a target input global control vector corresponding to each character or identifier;
for each character of the ancient poem to be generated, converting the matched input format control templates in different formats into input format control vectors in different formats;
summing the character vector of the previous moment and the input format control vectors, in the different formats, of the character to be generated at the current moment, to obtain a target input format control vector corresponding to the current moment;
inputting the target input global control vector corresponding to each character or identifier and the target input format control vector corresponding to the current moment into the decoding module of the ancient poetry generation model, and autoregressively decoding the predicted text of the character to be generated at the current moment.
In a third aspect, an embodiment of the present application further provides a training apparatus for an ancient poetry generation model, the training apparatus comprising:
an ancient poetry acquisition module, configured to acquire an ancient poetry data set and a pre-constructed ancient poetry generation model, the data set comprising a plurality of ancient poems in different formats together with the global control information and format control information corresponding to each poem;
an ancient poetry input module, configured to input the ancient poetry data set into the input module of the ancient poetry generation model;
an information extraction module, configured to extract, for each poem, the corresponding global control information and format control information;
a global construction module, configured to map the global control information corresponding to the poem into a plurality of individual characters and/or identifiers, and to explicitly construct global control templates in different formats for each character or identifier;
a format construction module, configured to explicitly construct, for each character in the poem, format control templates in different formats based on the format control information corresponding to the poem;
a training module, configured to input the global control templates and format control templates in different formats corresponding to the poem into the decoding module of the ancient poetry generation model, and to train the decoding module.
In a fourth aspect, an embodiment of the present application further provides an ancient poetry generation apparatus, the generation apparatus comprising:
an information acquisition module, configured to acquire input global control information and/or input format control information matching a user;
an information input module, configured to input the input global control information and/or the input format control information into the trained ancient poetry generation model;
a global matching module, configured to map the input global control information into a plurality of individual characters and/or identifiers through the input module of the ancient poetry generation model, and to match input global control templates in different formats for each character or identifier;
a format matching module, configured to match, for each character of the ancient poem to be generated, input format control templates in different formats based on the input format control information;
a generation module, configured to input the input global control templates in the different formats and the input format control templates of each character in the different formats into the decoding module of the ancient poetry generation model, and to autoregressively decode the predicted text of the next character, so as to generate the ancient poem.
In a fifth aspect, an embodiment of the present application further provides an electronic device, comprising a processor, a storage medium and a bus, the storage medium storing machine-readable instructions executable by the processor. When the electronic device runs, the processor and the storage medium communicate with each other through the bus, and the processor executes the machine-readable instructions to perform the steps of the method in any possible implementation of the first aspect or the second aspect.
In a sixth aspect, the present application further provides a computer-readable storage medium having a computer program stored thereon; when executed by a processor, the computer program performs the steps of the method in any possible implementation of the first aspect or the second aspect.
An embodiment of the present application provides a training method for an ancient poetry generation model. An ancient poetry data set and a pre-constructed ancient poetry generation model are first acquired; the data set comprises a plurality of ancient poems in different formats together with the global control information and format control information corresponding to each poem, which means that ancient poetry data in a variety of formats can be mixed for model training. The data set is then input into the input module of the model, and the global control information and format control information corresponding to each poem are extracted. Next, the global control information corresponding to the poem is mapped into a number of individual characters and/or identifiers, and global control templates in different formats are explicitly constructed for each character or identifier; for each character in the poem, format control templates in different formats are explicitly constructed based on the format control information corresponding to the poem. In other words, the application explicitly constructs global control templates and format control templates in different formats, so the format of the generated text, such as the length of each line, the tonal pattern (pingze) and the rhyme, can be controlled explicitly. Finally, the global control templates and format control templates in different formats corresponding to the poem are input into the decoding module of the model, and the decoding module is trained. Ancient poetry data in many different formats can thus be mixed during training, and because the explicitly constructed templates are fed to the decoding module, the generated poems strictly conform to the preset format requirements, which provides a variety of gameplay options and enriches game content for players. At the same time, the shortcoming of RNN-based models that the generated results are not semantically fluent is overcome.
An embodiment of the present application further provides an ancient poetry generation method. Input global control information and/or input format control information matching a user are first acquired and input into the trained ancient poetry generation model; that is, the format of the poem to be generated can be controlled according to the user's requirements. The input global control information is then mapped into a number of individual characters and/or identifiers through the input module of the model, and input global control templates in different formats are matched for each character or identifier; for each character of the poem to be generated, input format control templates in different formats are matched based on the input format control information. In other words, the format of the generated text, such as the length of each line, the tonal pattern and the rhyme, can be controlled explicitly. Finally, the input global control templates in the different formats and the input format control templates of each character in the different formats are input into the decoding module of the model, and the predicted text of the next character is decoded autoregressively to generate the poem. During generation, the format of the poem to be generated can thus be controlled explicitly, which improves user interactivity and the flexibility and controllability of the generated lines.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings required for the embodiments are briefly described below. It should be understood that the following drawings show only some embodiments of the present application and therefore should not be regarded as limiting the scope; for those skilled in the art, other related drawings can be obtained from these drawings without inventive effort.
Fig. 1 is a schematic structural diagram of the ancient poetry generation model provided in an embodiment of the present application;
Fig. 2 is a schematic flow chart of the training method for the ancient poetry generation model provided in an embodiment of the present application;
Fig. 3 is a schematic flow chart of the ancient poetry generation method provided in an embodiment of the present application;
Fig. 4 is a schematic structural diagram of the training apparatus for the ancient poetry generation model provided in an embodiment of the present application;
Fig. 5 is a schematic structural diagram of the ancient poetry generation apparatus provided in an embodiment of the present application;
Fig. 6 is a schematic structural diagram of an electronic device provided in an embodiment of the present application.
Detailed Description
In order to make the purpose, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments are described clearly and completely below with reference to the drawings. It should be understood that the drawings in the present application are for illustrative and descriptive purposes only and are not intended to limit the scope of protection of the present application; additionally, the schematic drawings are not necessarily drawn to scale. The flowcharts used in this application illustrate operations implemented according to some embodiments of the present application. It should be understood that the operations of the flowcharts may be performed out of order, and steps without a logical dependency may be performed in reverse order or simultaneously. Under the guidance of this application, one skilled in the art may add one or more other operations to a flowchart, or remove one or more operations from it.
In addition, the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. The components of the embodiments of the present application, generally described and illustrated in the figures herein, can be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the embodiments of the present application, presented in the accompanying drawings, is not intended to limit the scope of the claimed application, but is merely representative of selected embodiments of the application. All other embodiments, which can be derived by a person skilled in the art from the embodiments of the present application without making any creative effort, shall fall within the protection scope of the present application.
It should be noted that in the embodiments of the present application, the term "comprising" is used to indicate the presence of the features stated hereinafter, but does not exclude the addition of further features.
In conventional solutions, on the one hand, traditional text generation models based on recurrent neural networks (RNNs) have difficulty explicitly controlling the format of the generated text, such as the length of each line, the tonal pattern (pingze) and the rhyme. On the other hand, conventional Transformer-based encoder-decoder models encode text format information by adding different position encodings, so that the length can be controlled at generation time. These methods are not well suited to ancient poetry generation: either a separate model must be trained for each format requirement, which wastes resources; or an encoder-decoder architecture is used, whose overall computation is too heavy to respond quickly to real-time user requests; and formats such as rhyme and tonal pattern still cannot be explicitly controlled well. Based on this, the embodiments of the present application provide an ancient poetry generation model training method, an ancient poetry generation method, a device, and a storage medium, which are described below by way of embodiments.
Fig. 1 is a schematic structural diagram of the ancient poetry generation model provided in an embodiment of the present application. The ancient poetry generation model comprises an input module 1 and a decoding module 2: the input module 1 comprises an input layer and control templates in a plurality of different formats, and the decoding module 2 comprises a Transformer decoder and an output layer.
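For orientation only, the overall structure described above can be sketched roughly as follows. This is a minimal sketch assuming a PyTorch implementation; the class name, dimensions, the set of template embedding tables, and the use of encoder layers with a causal mask as a decoder-only stack are assumptions of this illustration, not details taken from the patent.

```python
import torch
import torch.nn as nn

class AncientPoetryGenerator(nn.Module):
    """Decoder-only sketch: summed control-template embeddings feed a Transformer decoder stack."""

    def __init__(self, vocab_size, num_template_ids, d_model=512, nhead=8, num_layers=6):
        super().__init__()
        # Input module: one embedding table per control-template format
        # (character, relative position, pingze, line type, global position).
        self.char_emb = nn.Embedding(vocab_size, d_model)
        self.template_embs = nn.ModuleDict({
            name: nn.Embedding(num_template_ids, d_model)
            for name in ["spos", "pingze", "type", "pos"]
        })
        # Decoding module: Transformer layers used decoder-only (self-attention with a
        # causal mask), plus an output layer over the character vocabulary.
        layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.decoder = nn.TransformerEncoder(layer, num_layers)
        self.output = nn.Linear(d_model, vocab_size)

    def forward(self, char_ids, template_ids):
        # Sum the character embedding with every control-template embedding at each position.
        h = self.char_emb(char_ids)
        for name, emb in self.template_embs.items():
            h = h + emb(template_ids[name])
        seq_len = char_ids.size(1)
        causal = torch.triu(torch.full((seq_len, seq_len), float("-inf")), diagonal=1)
        h = self.decoder(h, mask=causal)
        return self.output(h)  # logits for the next character at each position
```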
The training method for the ancient poetry generation model provided in the embodiment of the present application is explained in detail below with reference to the ancient poetry generation model shown in Fig. 1.
Referring to Fig. 2, which is a schematic flow chart of the training method for the ancient poetry generation model provided in the embodiment of the present application. The training method may comprise the following steps:
S201, acquiring an ancient poetry data set and a pre-constructed ancient poetry generation model; the data set comprises a plurality of ancient poems in different formats together with the global control information and format control information corresponding to each poem;
S202, inputting the ancient poetry data set into an input module of the ancient poetry generation model;
S203, extracting, for each poem, the corresponding global control information and format control information;
S204, mapping the global control information corresponding to the poem into a plurality of individual characters and/or identifiers, and explicitly constructing global control templates in different formats for each character or identifier;
S205, for each character in the poem, explicitly constructing format control templates in different formats based on the format control information corresponding to the poem;
S206, inputting the global control templates and format control templates in different formats corresponding to the poem into a decoding module of the ancient poetry generation model, and training the decoding module.
In step S201, the ancient poetry data set includes a plurality of ancient poems in different formats, together with the global control information and format control information corresponding to each poem. The different formats may include five-character (wuyan) poems, seven-character (qiyan) poems, ci poems, and the like. The global control information may include at least one of theme information and emotion information: the theme information may include keywords of the poem, and the emotion information may include information representing the degree of sentiment of the poem, such as very negative, generally negative, neutral, generally positive, or very positive. The format control information may include at least one of: tonal pattern (pingze) information, rhyme information, line length, and number of lines.
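Purely as an illustration of how one entry of such a data set might be organized (all field names and annotation values below are assumptions, not the patent's actual data format):

```python
# Hypothetical layout of a single training example.
sample = {
    "text": "举头望明月",                      # the poem line itself
    "global_control": {
        "theme": "明月",                       # keyword(s), later mapped to individual characters
        "emotion": "generally negative",       # later mapped to identifiers such as S3/S2
    },
    "format_control": {
        "pingze": [1, 0, 1, 0, 1],             # illustrative per-character tonal annotation (1 = ze, 0 = ping)
        "rhyme": "rhyme-class-id",             # placeholder; the actual rhyme encoding is not specified here
        "line_length": 5,                      # five-character line
        "num_lines": 2,                        # two lines in the poem
    },
}
```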
In step S204, the global control information corresponding to each poem includes theme information and/or emotion information. The theme information corresponding to the poem is mapped into a number of individual characters, and/or the emotion information corresponding to the poem is mapped into a number of individual identifiers. For each character or identifier, global control templates in different formats are explicitly constructed. As shown in Fig. 1, the different formats may include "CHAR", "SPOS", "Pingze", "TYPE" and "POS", where "CHAR" represents the character itself, "SPOS" represents the relative position within the whole control template, "Pingze" represents the tonal value of the character within a line (level (ping) or oblique (ze)), "TYPE" represents the line type, and "POS" represents the global position within the whole control template.
As shown in the first and second columns from the left of Fig. 1, the theme information corresponding to the poem is "明月" (bright moon), which is mapped into the characters "明" and "月". For the character "明", global control templates in different formats are explicitly constructed: E_明, E_t0, E_t0, E_tpc and E_0, where the subscript of E_明 denotes the character "明" itself, the subscript of E_t0 denotes moment t0, the subscript of E_tpc denotes the theme, and the subscript of E_0 denotes global position 0. For the character "月", global control templates in different formats are explicitly constructed: E_月, E_t1, E_t1, E_tpc and E_1, where the subscript of E_月 denotes the character "月" itself, the subscript of E_t1 denotes moment t1, the subscript of E_tpc denotes the theme, and the subscript of E_1 denotes global position 1.
As shown in the third and fourth columns from the left of Fig. 1, the emotion information corresponding to the poem is "generally negative", which is mapped into the identifiers "S3" and "S2". For the identifier "S3", global control templates in different formats are explicitly constructed: E_S3, E_S0, E_S0, E_stm and E_2, where the subscript of E_S3 denotes the identifier "S3" itself, the subscript of E_S0 denotes an arbitrarily defined S0 tag with no concrete meaning, the subscript of E_stm denotes the emotion, and the subscript of E_2 denotes global position 2. For the identifier "S2", global control templates in different formats are explicitly constructed: E_S2, E_S1, E_S1, E_stm and E_3, where the subscript of E_S2 denotes the identifier "S2" itself, the subscript of E_S1 denotes an arbitrarily defined S1 tag with no concrete meaning, the subscript of E_stm denotes the emotion, and the subscript of E_3 denotes global position 3.
In step S205, the format control information corresponding to each poem includes at least one of: tonal pattern (pingze) information, rhyme information, line length, and number of lines. For each character in the poem, a format control template representing the tonal pattern is explicitly constructed based on the pingze information corresponding to the poem; and/or a format control template representing the rhyme is explicitly constructed based on the rhyme information corresponding to the poem; and/or a format control template representing a relative position and/or a global position is explicitly constructed based on the line length corresponding to the poem; and/or a format control template representing the line type is explicitly constructed based on the number of lines corresponding to the poem.
As shown in the sixth column from the left of Fig. 1, for the character "举", a format control template E_PZ1 representing its tonal value is explicitly constructed based on the pingze information corresponding to the poem; a format control template E_P4 representing the relative position and a format control template E_4 representing the global position are explicitly constructed based on the line length "5" corresponding to the poem; and a format control template E_sn0 representing the line type is explicitly constructed based on the number of lines "2" corresponding to the poem. For the character "头", a format control template E_PZ0 representing its tonal value is explicitly constructed based on the pingze information; a relative-position template E_P3 and a global-position template E_5 are explicitly constructed based on the line length "5"; and a line-type template E_sn0 is explicitly constructed based on the number of lines "2". The other characters are handled analogously and are not described again here. The subscript of E_PZ1 denotes an oblique (ze) tone, the subscript of E_PZ0 denotes a level (ping) tone, the subscript of E_sn0 denotes the first line, and the subscript of E_sn1 denotes the second line.
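As an illustration only, the explicit template construction walked through above might be sketched as follows; the row layout and the downward-counting relative position are read off Fig. 1, while the function names, dictionary keys and ID encodings are assumptions of this sketch rather than the patent's implementation:

```python
# Hypothetical construction of control-template rows mirroring the CHAR/SPOS/Pingze/TYPE/POS columns of Fig. 1.
def build_global_templates(theme_chars, emotion_ids):
    """One row of template symbols per theme character and per emotion identifier."""
    rows, pos = [], 0
    for t, ch in enumerate(theme_chars):                 # e.g. ["明", "月"]
        rows.append({"char": ch, "spos": f"t{t}", "pingze": f"t{t}",
                     "type": "tpc", "pos": pos})         # "tpc" marks the theme
        pos += 1
    for s, ident in enumerate(emotion_ids):              # e.g. ["S3", "S2"]
        rows.append({"char": ident, "spos": f"S{s}", "pingze": f"S{s}",
                     "type": "stm", "pos": pos})         # "stm" marks the emotion
        pos += 1
    return rows

def build_format_templates(pingze, line_length, line_index, start_pos):
    """One row per character position of the line to be generated."""
    rows = []
    for i in range(line_length):
        rows.append({
            "spos": f"P{line_length - 1 - i}",           # relative position counts down (举 -> P4, 头 -> P3)
            "pingze": f"PZ{pingze[i]}",                  # assumed encoding: 1 = oblique (ze), 0 = level (ping)
            "type": f"sn{line_index}",                   # which line of the poem (sn0 = first line)
            "pos": start_pos + i,                        # global position in the whole control template
        })
    return rows
```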
In a possible implementation, step S206 may include the following sub-steps (an illustrative formulation of the vector computations is given after the list):
S2061, for each character or identifier corresponding to the global control information, converting the explicitly constructed global control templates in different formats into global control vectors in different formats;
S2062, summing the global control vectors in the different formats to obtain a target global control vector corresponding to each character or identifier;
S2063, for each character in the poem, converting the explicitly constructed format control templates in different formats into format control vectors in different formats;
S2064, summing the character vector of the previous moment and the format control vectors, in the different formats, of the character to be generated at the current moment, to obtain a target format control vector corresponding to the current moment in the poem;
S2065, inputting the target global control vector corresponding to each character or identifier and the target format control vector corresponding to the current moment into the decoding module of the ancient poetry generation model, and autoregressively decoding the predicted text of the character to be generated at the current moment.
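For readability, sub-steps S2062 and S2064 can be summarized by the following illustrative formulas; the notation is introduced here for explanation and does not appear in the patent:

```latex
% S2062: target global control vector of the k-th control character/identifier,
% summed over all template formats f.
\mathbf{g}_k \;=\; \sum_{f \in \{\mathrm{CHAR},\,\mathrm{SPOS},\,\mathrm{Pingze},\,\mathrm{TYPE},\,\mathrm{POS}\}} \mathbf{e}_f(k)

% S2064: target format control vector at the current moment t, i.e. the character vector of
% the previously generated character x_{t-1} plus the format control vectors c_t^f of the
% character to be generated at moment t.
\mathbf{h}_t \;=\; \mathbf{e}_{\mathrm{CHAR}}(x_{t-1}) \;+\; \sum_{f \in \{\mathrm{SPOS},\,\mathrm{Pingze},\,\mathrm{TYPE},\,\mathrm{POS}\}} \mathbf{e}_f\!\left(c_t^{f}\right)
```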
Sub-steps S2061 to S2065 are described below with reference to Fig. 1 as an example.
In step S2061, for example, for the character "明", the explicitly constructed global control templates in five different formats, E_明, E_t0, E_t0, E_tpc and E_0, are converted into global control vectors (embeddings) in five different formats. For the identifier "S3", the explicitly constructed global control templates in five different formats, E_S3, E_S0, E_S0, E_stm and E_2, are converted into global control vectors in five different formats.
In step S2062, for example, for the character "明", the global control vectors in the five different formats are summed to obtain the target global control vector corresponding to "明". For the identifier "S3", the global control vectors in the five different formats are summed to obtain the target global control vector corresponding to "S3".
In step S2063, for example, for the character "举", the explicitly constructed format control templates in five different formats, E_举, E_P4, E_PZ1, E_sn0 and E_4, are converted into format control vectors in five different formats.
In step S2064, for example, for the character "举" generated at the previous moment, the character vector corresponding to E_举 is added to the format control vectors corresponding to the templates E_P3, E_PZ0, E_sn0 and E_5 of the character to be generated at the current moment in different formats, to obtain the target format control vector corresponding to the current moment.
In step S2065, the target global control vectors corresponding to the character "明", the character "月", the identifier "S3" and the identifier "S2", together with the target format control vector corresponding to the current moment, are input into the decoding module 2, and the predicted text "头" of the character to be generated at the current moment is decoded autoregressively. The target format control vector corresponding to the character "头" generated at the current moment is then input into the decoding module 2 to predict "望", the character to be generated at the next moment, and so on, until the whole line is predicted. That is, the prediction of the next character depends on the characters already generated at previous moments. Owing to the strong representation capability of the Transformer decoder, the semantics of the generated text are more fluent.
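Sub-steps S2061-S2065 might be realized along the following lines; this is a sketch that assumes the model class and template-row format from the earlier sketches, and every function and variable name here is an assumption rather than the patent's implementation:

```python
import torch

def decode_step(model, global_rows, format_rows, generated_ids, char2id, template2id):
    """One autoregressive step: sum the template embeddings, run the decoder, pick the next character.

    template2id maps every template symbol (and position index) to an embedding-table ID.
    """
    # S2061/S2062: for each control character/identifier, sum its vectors over all template formats.
    global_vecs = []
    for row in global_rows:
        v = model.char_emb(torch.tensor([char2id[row["char"]]]))
        for name in ("spos", "pingze", "type", "pos"):
            v = v + model.template_embs[name](torch.tensor([template2id[row[name]]]))
        global_vecs.append(v)

    # S2063/S2064: previous character's vector plus the format control vectors of the
    # character to be generated at the current moment.
    step_vecs = []
    for t, prev_id in enumerate(generated_ids):
        v = model.char_emb(torch.tensor([prev_id]))
        row = format_rows[min(t + 1, len(format_rows) - 1)]
        for name in ("spos", "pingze", "type", "pos"):
            v = v + model.template_embs[name](torch.tensor([template2id[row[name]]]))
        step_vecs.append(v)

    # S2065: feed the target vectors to the decoding module and read off the next character.
    h = torch.cat(global_vecs + step_vecs, dim=0).unsqueeze(0)          # (1, seq_len, d_model)
    causal = torch.triu(torch.full((h.size(1), h.size(1)), float("-inf")), diagonal=1)
    out = model.decoder(h, mask=causal)
    logits = model.output(out[:, -1])                                   # distribution for the current moment
    return int(logits.argmax(dim=-1))
```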
In summary, in the training method provided by this embodiment of the application, an ancient poetry data set containing poems in a variety of formats, together with the global control information and format control information corresponding to each poem, is fed to the input module of a pre-constructed ancient poetry generation model; the global control information is mapped into individual characters and/or identifiers for which global control templates in different formats are explicitly constructed, format control templates in different formats are explicitly constructed for each character of the poem, and these templates are input into the decoding module for training. Ancient poetry data in many different formats can therefore be mixed during training, and the format of the generated text, such as line length, tonal pattern and rhyme, can be controlled explicitly, so that the generated poems strictly conform to the preset format requirements, which provides a variety of gameplay options and enriches game content for players; at the same time, the shortcoming of RNN-based models that the generated results are not semantically fluent is overcome.
Referring to Fig. 3, which is a schematic flow chart of the ancient poetry generation method provided in the embodiment of the present application. The generation method may comprise the following steps:
S301, acquiring input global control information and/or input format control information matching a user;
S302, inputting the input global control information and/or the input format control information into the trained ancient poetry generation model;
S303, mapping the input global control information into a plurality of individual characters and/or identifiers through the input module of the ancient poetry generation model, and matching input global control templates in different formats for each character or identifier;
S304, for each character of the ancient poem to be generated, matching input format control templates in different formats based on the input format control information;
S305, inputting the input global control templates in the different formats and the input format control templates of each character in the different formats into the decoding module of the ancient poetry generation model, and autoregressively decoding the predicted text of the next character, so as to generate the ancient poem.
In step S301, input global control information and/or input format control information entered manually by the user may be obtained, so that the format of the poem to be generated can be controlled according to the user's requirements; alternatively, input global control information and/or input format control information matching the user may be mined from the user's game history. The input global control information includes at least one of input theme information and input emotion information: the input theme information may include keywords for the poem, and the input emotion information may include information representing the desired degree of sentiment, such as very negative, generally negative, neutral, generally positive, or very positive. The input format control information includes at least one of: input tonal pattern (pingze) information, input rhyme information, input line length, and input number of lines.
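For illustration only, a user request carrying such input control information might look like the following; every field name and value here is an assumption:

```python
# Hypothetical request describing the poem the user wants generated.
request = {
    "theme": "明月",                 # input theme keywords
    "emotion": "generally negative", # input emotion information
    "pingze": None,                  # None: no explicit tonal-pattern requirement
    "rhyme": None,                   # None: no explicit rhyme requirement
    "line_length": 5,                # five characters per line
    "num_lines": 2,                  # a two-line poem
}
```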
In step S303, the input global control information includes at least one of input theme information and input emotion information. The input theme information is mapped into a plurality of individual characters through the input module of the ancient poetry generation model, and/or the input emotion information is mapped into a plurality of individual identifiers through the input module. Input global control templates in different formats are then matched for each character or identifier.
As shown in the first and second columns from the left of Fig. 1, the input theme information corresponding to the poem is "明月" (bright moon), which is mapped into the characters "明" and "月". For the character "明", input global control templates in different formats are matched: E_明, E_t0, E_t0, E_tpc and E_0, where the subscript of E_明 denotes the character "明" itself, the subscript of E_t0 denotes moment t0, the subscript of E_tpc denotes the theme, and the subscript of E_0 denotes global position 0. For the character "月", input global control templates in different formats are matched: E_月, E_t1, E_t1, E_tpc and E_1, where the subscript of E_月 denotes the character "月" itself, the subscript of E_t1 denotes moment t1, the subscript of E_tpc denotes the theme, and the subscript of E_1 denotes global position 1.
As shown in the third and fourth columns from the left of Fig. 1, the input emotion information corresponding to the poem is "generally negative", which is mapped into the identifiers "S3" and "S2". For the identifier "S3", input global control templates in different formats are matched: E_S3, E_S0, E_S0, E_stm and E_2, where the subscript of E_S3 denotes the identifier "S3" itself, the subscript of E_S0 denotes an arbitrarily defined S0 tag with no concrete meaning, the subscript of E_stm denotes the emotion, and the subscript of E_2 denotes global position 2. For the identifier "S2", input global control templates in different formats are matched: E_S2, E_S1, E_S1, E_stm and E_3, where the subscript of E_S2 denotes the identifier "S2" itself, the subscript of E_S1 denotes an arbitrarily defined S1 tag with no concrete meaning, the subscript of E_stm denotes the emotion, and the subscript of E_3 denotes global position 3.
In step S304, the input format control information includes at least one of: input tonal pattern (pingze) information, input rhyme information, input line length, and input number of lines. For each character of the poem to be generated, an input format control template representing the tonal pattern is matched based on the input pingze information; and/or an input format control template representing the rhyme is matched based on the input rhyme information; and/or an input format control template representing a relative position and/or a global position is matched based on the input line length; and/or an input format control template representing the line type is matched based on the input number of lines.
As shown in the sixth column from the left of Fig. 1, for the character "举", an input format control template E_PZ1 representing its tonal value is matched based on the input pingze information; an input format control template E_P4 representing the relative position and an input format control template E_4 representing the global position are matched based on the input line length "5"; and an input format control template E_sn0 representing the line type is matched based on the input number of lines "2".
In a possible implementation, step S305 may include the following sub-steps:
S3051, for each character or identifier corresponding to the input global control information, converting the matched input global control templates in different formats into input global control vectors in different formats;
S3052, summing the input global control vectors in the different formats to obtain a target input global control vector corresponding to each character or identifier;
S3053, for each character of the ancient poem to be generated, converting the matched input format control templates in different formats into input format control vectors in different formats;
S3054, summing the character vector of the previous moment and the input format control vectors, in the different formats, of the character to be generated at the current moment, to obtain a target input format control vector corresponding to the current moment;
S3055, inputting the target input global control vector corresponding to each character or identifier and the target input format control vector corresponding to the current moment into the decoding module of the ancient poetry generation model, and autoregressively decoding the predicted text of the character to be generated at the current moment.
Sub-steps S3051 to S3055 are described below with reference to Fig. 1 as an example.
In step S3051, for example, for the character "明", the matched input global control templates in five different formats, E_明, E_t0, E_t0, E_tpc and E_0, are converted into input global control vectors (embeddings) in five different formats. For the identifier "S3", the matched input global control templates in five different formats, E_S3, E_S0, E_S0, E_stm and E_2, are converted into input global control vectors in five different formats.
In step S3052, for example, for the character "明", the input global control vectors in the five different formats are summed to obtain the target input global control vector corresponding to "明". For the identifier "S3", the input global control vectors in the five different formats are summed to obtain the target input global control vector corresponding to "S3".
In step S3053, for example, for the character "举", the matched input format control templates in five different formats, E_举, E_P4, E_PZ1, E_sn0 and E_4, are converted into input format control vectors in five different formats.
In step S3054, for example, for the character "举" generated at the previous moment, the character vector corresponding to E_举 is added to the input format control vectors corresponding to the templates E_P3, E_PZ0, E_sn0 and E_5 of the character to be generated at the current moment in different formats, to obtain the target input format control vector corresponding to the current moment.
In step S3055, the target input global control vectors corresponding to the character "明", the character "月", the identifier "S3" and the identifier "S2", together with the target input format control vector corresponding to the current moment, are input into the decoding module 2, and the predicted text "头" of the character to be generated at the current moment is decoded autoregressively. The target input format control vector corresponding to the character "头" generated at the current moment is then input into the decoding module 2 to predict "望", the character to be generated at the next moment, and so on, until the whole poem is predicted. That is, the prediction of the next character depends on the characters already generated at previous moments. Owing to the strong representation capability of the Transformer decoder, the semantics of the generated text are more fluent.
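Finally, the whole generation loop can be sketched as follows, reusing the decode_step sketch above; the stopping rule and the start symbol are assumptions of this illustration:

```python
def generate_poem(model, global_rows, format_rows,
                  char2id, template2id, id2char, bos_id, max_len=20):
    """Generate the poem character by character; each step conditions on all characters so far."""
    generated = [bos_id]                                  # hypothetical start symbol before the first character
    for _ in range(max_len):
        next_id = decode_step(model, global_rows, format_rows,
                              generated, char2id, template2id)
        generated.append(next_id)
        if len(generated) - 1 >= len(format_rows):        # stop once every planned position has been filled
            break
    return "".join(id2char[i] for i in generated[1:])
```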
In summary, in the ancient poetry generation method provided by this embodiment of the application, input global control information and/or input format control information matching the user are fed to the trained ancient poetry generation model; the input global control information is mapped into individual characters and/or identifiers for which input global control templates in different formats are matched, input format control templates in different formats are matched for each character of the poem to be generated, and the decoding module then autoregressively decodes the poem. The format of the poem to be generated, such as line length, tonal pattern and rhyme, can therefore be controlled explicitly according to the user's requirements, which improves user interactivity and the flexibility and controllability of the generated lines.
Based on the same technical concept, embodiments of the present application further provide a training apparatus for the ancient poetry generation model, an ancient poetry generation apparatus, an electronic device, a computer storage medium and the like, as described in the following embodiments.
Referring to fig. 4, a schematic structural diagram of a training apparatus for generating a model of ancient poetry provided in the embodiment of the present application is shown. The training device may comprise:
an ancient poetry obtaining module 401, configured to obtain an ancient poetry data set and obtain a pre-constructed ancient poetry generating model; the ancient poetry data set comprises a plurality of ancient poetry with different formats, and global control information and format control information corresponding to each ancient poetry;
an ancient poetry input module 402, configured to input the ancient poetry data set into an input module of the ancient poetry generating model;
an information extraction module 403, configured to extract, for each ancient poem, global control information and format control information corresponding to the ancient poem;
a global construction module 404, configured to map global control information corresponding to the ancient poetry sentence into a plurality of individual characters and/or identifiers, and explicitly construct global control templates in different formats for each character or identifier;
a format construction module 405, configured to explicitly construct, for each character in the ancient poetry sentence, a format control template in different formats based on format control information corresponding to the ancient poetry sentence;
and a training module 406, configured to input the global control template and the format control template in different formats corresponding to the ancient poem into the decoding module of the ancient poetry generating model, so as to train the decoding module.
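A minimal sketch of how modules 401 to 406 could be wired together is given below; the class name, dictionary keys and model methods (build_global_templates, build_format_templates, train_step) are assumed placeholders, not interfaces defined by the patent.

```python
class PoemModelTrainer:
    """Illustrative wiring of the training apparatus; not the patent's implementation."""

    def __init__(self, dataset, model):
        self.dataset = dataset   # module 401: ancient poetry data set (poems plus control information)
        self.model = model       # module 401: pre-constructed ancient poetry generating model

    def train(self):
        for poem in self.dataset:                                    # module 402: feed the data set to the input module
            glob_info = poem["global"]                               # module 403: extract global control information
            fmt_info = poem["format"]                                # module 403: extract format control information
            glob_templates = self.model.build_global_templates(glob_info)               # module 404
            fmt_templates = self.model.build_format_templates(poem["text"], fmt_info)   # module 405
            self.model.train_step(glob_templates, fmt_templates, poem["text"])          # module 406
```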
Referring to fig. 5, a schematic structural diagram of an ancient poetry generating apparatus provided in the embodiment of the present application is shown. The generating apparatus may comprise:
an information obtaining module 501, configured to obtain input global control information and/or input format control information that is matched with a user;
an information input module 502, configured to input the input global control information and/or the input format control information into the trained ancient poetry generating model;
a global matching module 503, configured to map the input global control information into a plurality of individual characters and/or identifiers through an input module in the ancient poetry generating model, and match the input global control templates in different formats for each character or identifier;
a format matching module 504, configured to match, for each character of the ancient poem to be generated, input format control templates in different formats based on the input format control information;
and a generating module 505, configured to input the input global control templates in different formats and the input format control templates of each character in different formats into the decoding module in the ancient poetry generating model, and to decode, in an autoregressive manner, the predicted text of the next character, so as to generate the ancient poem.
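For illustration, the sketch below traces a single call through modules 501 to 505; the helper names (map_global_info, match_format_templates, decode_autoregressively) are hypothetical placeholders rather than functions defined by the patent.

```python
def generate_poem(model, theme=None, emotion=None, format_info=None):
    """Illustrative end-to-end flow of the ancient poetry generating apparatus."""
    # module 501: obtain the input global/format control information matched with the user
    global_info = {"theme": theme, "emotion": emotion}
    # module 502: feed the information into the trained ancient poetry generating model
    # module 503: map it to characters/identifiers and match input global control templates
    global_templates = model.map_global_info(global_info)
    # module 504: match input format control templates for every character to be generated
    format_templates = model.match_format_templates(format_info)
    # module 505: autoregressively decode the next character until the poem is complete
    return model.decode_autoregressively(global_templates, format_templates)

# Example call (assumed interface): a five-character quatrain about a bright moon.
# poem = generate_poem(model, theme="bright moon", emotion="nostalgia",
#                      format_info={"line_len": 5, "n_lines": 4})
```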
Referring to fig. 6, fig. 6 is a schematic structural diagram of an electronic device according to an embodiment of the present application. As shown in fig. 6, the electronic device includes: a processor 601, a memory 602 and a bus 603, wherein the memory 602 stores machine-readable instructions executable by the processor 601; when the electronic device runs, the processor 601 communicates with the memory 602 through the bus 603, and the processor 601 executes the machine-readable instructions to perform the method described in the foregoing method embodiments.
The computer program product for the ancient poetry generating model training method and the ancient poetry generating method provided by the embodiment of the application comprises a computer-readable storage medium storing non-volatile program code executable by a processor. The instructions included in the program code can be used to execute the methods described in the foregoing method embodiments; for specific implementation, reference may be made to the method embodiments, and details are not repeated herein.
It can be clearly understood by those skilled in the art that, for convenience and brevity of description, the specific working processes of the system and the apparatus described above may refer to corresponding processes in the method embodiments, and are not described in detail in this application. In the several embodiments provided in the present application, it should be understood that the disclosed system, apparatus and method may be implemented in other ways. The above-described apparatus embodiments are merely illustrative, and for example, the division of the modules is merely a logical division, and there may be other divisions in actual implementation, and for example, a plurality of modules or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection of devices or modules through some communication interfaces, and may be in an electrical, mechanical or other form.
The modules described as separate parts may or may not be physically separate, and parts displayed as modules may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a non-volatile computer-readable storage medium executable by a processor. Based on such understanding, the technical solution of the present application or portions thereof that substantially contribute to the prior art may be embodied in the form of a software product stored in a storage medium and including instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present application. And the aforementioned storage medium includes: various media capable of storing program codes, such as a U disk, a removable hard disk, a ROM, a RAM, a magnetic disk, or an optical disk.
The above description is only for the specific embodiments of the present application, but the scope of the present application is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present application, and shall be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.
Claims (14)
1. A training method for an ancient poetry generating model is characterized by comprising the following steps:
acquiring an ancient poetry data set and acquiring a pre-constructed ancient poetry generating model; the ancient poetry data set comprises a plurality of ancient poetry with different formats, and global control information and format control information corresponding to each ancient poetry;
inputting the ancient poetry data set into an input module of the ancient poetry generating model;
extracting global control information and format control information corresponding to each ancient poem;
mapping global control information corresponding to the ancient poetry sentence into a plurality of independent characters and/or identifications, and explicitly constructing global control templates in different formats aiming at each character or identification;
for each character in the ancient poetry sentence, explicitly constructing format control templates in different formats based on format control information corresponding to the ancient poetry sentence;
and inputting the global control template and the format control template which correspond to the ancient poetry and are in different formats into a decoding module of the ancient poetry generating model, and training the decoding module.
2. The training method of claim 1, wherein the global control information comprises at least one of: theme information, emotion information.
3. The training method as claimed in claim 2, wherein the step of mapping the global control information corresponding to the ancient poetry sentence into a plurality of independent characters and/or identifications comprises:
mapping the theme information corresponding to the ancient poetry sentence into a plurality of independent characters; and/or
And mapping the emotion information corresponding to the ancient poetry sentence into a plurality of independent identifications.
4. The training method of claim 1, wherein the format control information comprises at least one of: level and oblique tone information, rhyme information, poetry sentence length and poetry sentence quantity.
5. The training method as claimed in claim 4, wherein the step of explicitly constructing format control templates in different formats based on format control information corresponding to the sentence of ancient poetry for each character in the sentence of ancient poetry comprises:
for each character in the ancient poetry sentence, explicitly constructing a format control template representing level and oblique tones based on the level and oblique tone information corresponding to the ancient poetry sentence; and/or
For each character in the ancient poetry sentence, explicitly constructing a format control template for expressing rhyme based on rhyme information corresponding to the ancient poetry sentence; and/or
For each character in the ancient poetry, a format control template representing a relative position and/or a global position is explicitly constructed based on the poetry length corresponding to the ancient poetry; and/or
And for each character in the ancient poetry, explicitly constructing a format control template representing the poem type based on the poetry sentence quantity corresponding to the ancient poetry.
6. The training method of claim 1, wherein the step of inputting the global control template and the format control template corresponding to the ancient poetry in different formats into a decoding module of the ancient poetry generating model comprises:
converting, for each character or identification corresponding to the global control information, the explicitly constructed global control templates in different formats into global control vectors in different formats;
summing the global control vectors in different formats to obtain a target global control vector corresponding to each character or identification;
converting, for each character in the sentence of ancient poetry, the explicitly constructed format control templates in different formats into format control vectors in different formats;
adding the character vector of each previous moment and the format control vectors, in different formats, of the character to be generated at the current moment, to obtain a target format control vector corresponding to the current moment in the ancient poetry sentence;
and inputting the target global control vector corresponding to each character or identification and the target format control vector corresponding to the current moment into a decoding module of the ancient poetry generating model, and decoding, in an autoregressive manner, the predicted text of the character to be generated at the current moment.
7. An ancient poetry generating method is characterized by comprising the following steps:
acquiring input global control information and/or input format control information matched with a user;
inputting the input global control information and/or the input format control information into the trained ancient poetry generating model;
mapping the input global control information into a plurality of independent characters and/or identifications through an input module in the ancient poetry generating model, and matching the input global control templates in different formats aiming at each character or identification;
for each character of the ancient poetry to be generated, matching input format control templates in different formats based on the input format control information;
and inputting the input global control templates with different formats and the input format control templates of each character with different formats into a decoding module in the ancient poetry generating model, and decoding a predicted text of the next character of the character in an autoregressive manner to generate the ancient poetry.
8. The generation method according to claim 7, wherein the input global control information includes at least one of: input theme information and input emotion information.
9. The generating method of claim 8, wherein the step of mapping the input global control information into a plurality of independent characters and/or identifications through an input module in the ancient poetry generating model comprises:
mapping the input subject information into a plurality of independent characters through an input module in the ancient poetry generating model; and/or
And mapping the input emotional information into a plurality of independent identifications through an input module in the ancient poetry generating model.
10. The generation method according to claim 7, wherein the input format control information includes at least one of: input level and oblique tone information, input rhyme information, input poetry sentence length and input poetry sentence quantity.
11. The generating method of claim 10, wherein the step of matching an input format control template in different formats based on the input format control information for each word of the ancient poetry to be generated comprises:
for each character of the ancient poetry to be generated, matching an input format control template representing level and oblique tones based on the input level and oblique tone information; and/or
Matching an input format control template representing rhyme based on input rhyme information for each character of the ancient poetry to be generated; and/or
Matching an input format control template representing a relative position and/or a global position based on the length of an input verse for each character of an ancient poem to be generated; and/or
And for each character of the ancient poetry to be generated, matching an input format control template representing the poem type based on the input poetry sentence quantity.
12. The method of generating of claim 7, wherein the step of inputting the input global control templates in different formats and the input format control templates for each word in different formats to the decoding module in the ancient poetry generating model to auto-regressively decode the predicted text of the next word of the word comprises:
for each character or identification corresponding to the input global control information, converting the matched input global control templates in different formats into input global control vectors in different formats;
summing the input global control vectors in different formats to obtain a target input global control vector corresponding to each character or identification;
aiming at each character of the ancient poetry to be generated, converting the matched input format control templates in different formats into input format control vectors in different formats;
adding the word vector of each previous moment and the input format control vectors of the words to be generated at the current moment under different formats to obtain a target input format control vector corresponding to the current moment;
and inputting the target input global control vector corresponding to each character or identification and the target input format control vector corresponding to the current moment into a decoding module of the ancient poetry generating model, and decoding, in an autoregressive manner, the predicted text of the character to be generated at the current moment.
13. An electronic device, comprising: a processor, a storage medium and a bus, the storage medium storing machine-readable instructions executable by the processor, the processor and the storage medium communicating via the bus when the electronic device is operating, the processor executing the machine-readable instructions to perform the steps of the method according to any one of claims 1 to 12.
14. A computer-readable storage medium, having stored thereon a computer program which, when being executed by a processor, is adapted to carry out the steps of the method according to any one of claims 1 to 12.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110869571.9A CN113553822B (en) | 2021-07-30 | 2021-07-30 | Ancient poetry generating model training, ancient poetry generating method, equipment and storage medium |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110869571.9A CN113553822B (en) | 2021-07-30 | 2021-07-30 | Ancient poetry generating model training, ancient poetry generating method, equipment and storage medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN113553822A true CN113553822A (en) | 2021-10-26 |
CN113553822B CN113553822B (en) | 2023-06-30 |
Family
ID=78133311
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110869571.9A Active CN113553822B (en) | 2021-07-30 | 2021-07-30 | Ancient poetry generating model training, ancient poetry generating method, equipment and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113553822B (en) |
Patent Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109086270A (en) * | 2018-07-24 | 2018-12-25 | 重庆大学 | System and method of composing poem automatically based on classic poetry corpus vectorization |
CN109815493A (en) * | 2019-01-09 | 2019-05-28 | 厦门大学 | A kind of modeling method that the intelligence hip-hop music lyrics generate |
CN111382580A (en) * | 2020-01-21 | 2020-07-07 | 沈阳雅译网络技术有限公司 | Encoder-decoder framework pre-training method for neural machine translation |
CN112183109A (en) * | 2020-09-22 | 2021-01-05 | 甘肃农业大学 | MASS-based poetry sentence generation information steganography method |
CN112257775A (en) * | 2020-10-21 | 2021-01-22 | 东南大学 | Poetry method by graph based on convolutional neural network and unsupervised language model |
CN112287678A (en) * | 2020-11-03 | 2021-01-29 | 沈阳雅译网络技术有限公司 | Ancient poetry automatic generation method based on pre-training model |
CN112651235A (en) * | 2020-12-24 | 2021-04-13 | 北京搜狗科技发展有限公司 | Poetry generation method and related device |
Non-Patent Citations (2)
Title |
---|
HE, Jing; ZHOU, Ming; JIANG, Long: "基于统计的汉语格律诗生成研究" [Research on statistics-based generation of Chinese metrical poetry], Journal of Chinese Information Processing (中文信息学报) *
ZHOU, Changle; YOU, Wei; DING, Xiaojun: "一种宋词自动生成的遗传算法及其机器实现" [A genetic algorithm for automatic generation of Song Ci poetry and its machine implementation], Journal of Software (软件学报) *
Also Published As
Publication number | Publication date |
---|---|
CN113553822B (en) | 2023-06-30 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN107680580B (en) | Text conversion model training method and device, and text conversion method and device | |
CN104391980B (en) | The method and apparatus for generating song | |
So | Transpacific Community: America, China, and the Rise and Fall of a Cultural Network | |
Plaks | Ta Hsüeh and Chung Yung: The highest order of cultivation and on the practice of the Mean | |
Marchesi | Dante and Augustine: Linguistics, Poetics, Hermeneutics | |
Barchiesi | Roman Callimachus | |
Anthony | The Journey to the West, Revised Edition, Volume 1 | |
CN112199502A (en) | Emotion-based poetry sentence generation method and device, electronic equipment and storage medium | |
CN117216234A (en) | Artificial intelligence-based speaking operation rewriting method, device, equipment and storage medium | |
De Bruijn | Ruby in the Dust. Poetry and History in Padmāvat by the South Asian Sufi Poet Muhammad Jāyasī | |
CN113553822B (en) | Ancient poetry generating model training, ancient poetry generating method, equipment and storage medium | |
Scott | Restorying: The creative act of retelling | |
Lukin et al. | Automating direct speech variations in stories and games | |
Fägersten | Language play in contemporary Swedish comic strips | |
CN113268983B (en) | Role-oriented story ending generation method | |
Maciver | Triphiodorus and the poetics of Imperial Greek epic | |
CN113051513A (en) | Online Chinese culture interesting learning inheritance platform, interaction method and APP terminal | |
McEntire | Struggling with God: an introduction to the Pentateuch | |
Tope | English-language literature of the Philippines | |
Robinson | Translator, touretter: avant-garde translation and the touretter sublime | |
CN113838445B (en) | Song creation method and related equipment | |
CN113436591B (en) | Pitch information generation method, device, computer equipment and storage medium | |
Affiah et al. | From orality to print: An oraliterary examination of Efua T. Sutherland’s The Marriage of Anansewa and Femi Osofisan’s Morountodun | |
Lam | Cultural Transplantation: The Writing of Classical Chinese Poetry in Colonial Singapore (1887‒1945) | |
Sönmez | The rise of multimodality: applying translation criticism to video games |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
GR01 | Patent grant |