CN106598921A - Method and device for converting to ancient poem from modern article based on long short term memory (LSTM) model - Google Patents


Info

Publication number
CN106598921A
CN106598921A (application CN201611140395.0A)
Authority
CN
China
Prior art keywords: lstm, lstm models, word, ancient poetry, poem
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201611140395.0A
Other languages
Chinese (zh)
Inventor
王东
白紫薇
冯洋
杜新凯
游世学
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Zhongke Huilian Technology Co Ltd
Tsinghua University
Original Assignee
Beijing Zhongke Huilian Technology Co Ltd
Tsinghua University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Zhongke Huilian Technology Co Ltd, Tsinghua University filed Critical Beijing Zhongke Huilian Technology Co Ltd
Priority to CN201611140395.0A
Publication of CN106598921A
Legal status: Pending


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F40/00Handling natural language data
    • G06F40/10Text processing
    • G06F40/12Use of codes for handling textual entities
    • G06F40/151Transformation
    • G06F40/16Automatic learning of transformation rules, e.g. from examples

Abstract

The invention discloses a method for converting a modern-Chinese article into a classical poem based on a long short-term memory (LSTM) model. The LSTM model comprises an encoding part and a decoding part: the encoding part converts the user's input word sequence into an activation state, and the decoding part adjusts the generation of the target sequence according to that activation state; both parts are realized with LSTMs. The invention also discloses a device implementing the method. The method has the following beneficial effects: the modern article is converted into a classical poem automatically by computer, which better suits common usage habits, and an ordinary user can generate a poem simply by entering a modern article, which is more convenient than traditional theme-based generation; by expressing the logic of the modern article through the logic of the poem, mere word-piling is avoided, yielding a rule-conforming poem with real expressive power; and the generation process can be controlled finely, giving the resulting poem a logical thread.

Description

Method and device for converting modern text into classical poetry based on an LSTM model
Technical field
The present invention relates to a method and a device for converting modern text into classical poetry based on an LSTM model.
Background technology
Reading history makes one wise, and reading the classics makes one refined. Poetry has been the foundation of Chinese culture and a national treasure for thousands of years: it is the essence of language, a crystallization of wisdom, the flower of thought, and the purest spiritual home of mankind. With the development of language, however, modern people understand poetry less and less, and their ability to compose poems falls far short of the ancients.
Fortunately, with the arrival of the big-data era and the rapid development of artificial intelligence, automatic generation of classical poetry by computer has become possible. For example, given a target theme, a machine can automatically generate a classical poem that matches it. Letting machines compose poems spares humans the difficulty of choosing words and building sentences, freeing poets to attend to more valuable matters such as theme and artistic conception. Moreover, machine-written poems are not bound by traditional human thinking and are strikingly novel. Although this novelty is not always entirely reasonable, it offers poets rich prompts, stimulating further creative inspiration and leading to innovative works with more unconventional ideas. We believe that automatic generation of classical poetry can greatly stimulate public interest in the classics and play an important role in the development and inheritance of traditional Chinese art.
However, current automatic poem-composition methods that work from a specified theme have significant limitations. On the one hand, they place high demands on the choice of theme words: only if the theme word is well chosen will the generated poem be reasonable, which is a major obstacle for ordinary users. We would like the user to merely describe, in modern Chinese, the artistic conception and content the poem should express, and have the machine condense that description into a poem; this would greatly improve the applicability of poem-generation systems. On the other hand, when only a theme word is specified, the author can only roughly bound the content of the poem and cannot precisely control its flow. In fact, a classical poem is not merely a rule-conforming collection of words; more importantly, the poet's emotion rises progressively through the description of scenery and the course of narration. Poems generated under a theme constraint lack this continuity of thought: they cannot form a rational expressive logic or a complete, progressive development of the subject. The resulting poems are largely piles of words, lacking emotion and logic, and are of limited use.
The content of the invention
The object of the present invention is to overcome the shortcomings of the prior art and provide a method and a device for converting modern text into classical poetry based on an LSTM model.
Compared with the traditional method of composing from specified theme words, generating a poem from modern text is much harder. One significant difficulty is that modern text is usually far longer than a poem: converting it requires extracting the most valuable semantic information and expressing it in concise classical Chinese. For example, the sentence "It is now early February in spring; the spring breeze brushes past and the spring grass sprouts; before long, the southern bank of the river will be green" corresponds to the poem line "春风又绿江南岸" ("the spring breeze again greens the southern bank of the river"). In the modern text, phrases such as "it is now" and "before long" contribute little to the meaning; we need to extract the clearly meaningful words, such as "green", "southern bank", and "river", to form the sense of the line. How to extract sentence meaning is therefore significant for generating a reasonable poem.
On the other hand, a single modern-Chinese sentence may have several classical renderings; how to naturally generate poems with a variety of expressions is another problem to be solved.
The object of the invention is achieved through the following technical solution: a method for converting modern text into classical poetry based on an LSTM model, wherein the LSTM model comprises an encoding part for transforming the user's input word sequence into a dense vector of fixed dimension (an activation state) and a decoding part for adjusting the generation of the target sequence according to that activation state.
Both the encoding and the decoding parts are realized with LSTMs. First, the system passes each sentence of the user's modern-text input through a bidirectional LSTM network for encoding; each sentence is represented as one semantic vector, and together they form a group of semantic vectors. These semantic vectors serve as the encoding of the user's intent. During generation, an LSTM network runs in a loop to produce each character of the output; when generating each character, the semantic vector corresponding to the current sentence is used as a reference input, so that the generated line agrees with the meaning the user requires.
The purpose of introducing intent vectors is to focus, during generation, on the parts of the input sequence most relevant to the output. For example, in "It is now early February in spring; the spring breeze brushes past and the spring grass sprouts; before long, the southern bank of the river will be green", we attend more to words such as "February" and "southern bank" than to phrases such as "it is now" or "before long".
Note that during generation we add style rules for punctuation, rhyme, and tonal pattern (level and oblique tones), ensuring that the generated character string follows the obligatory rules of the poetic form while satisfying the user's intent to the greatest possible extent.
Encoding part:
In the encoding process we use a bidirectional LSTM model. The output of a unidirectional LSTM at a given moment depends only on the current input and earlier inputs, whereas the output of a bidirectional LSTM at a given moment also depends on the inputs that follow.
The forward LSTM reads the input sequence in input order, $(x_1, \dots, x_T)$, and computes the forward hidden states $(\overrightarrow{h}_1, \dots, \overrightarrow{h}_T)$. The backward LSTM reads the sequence in the reverse order, $(x_T, \dots, x_1)$, and computes the backward hidden states $(\overleftarrow{h}_1, \dots, \overleftarrow{h}_T)$. Concatenating the forward and backward states yields the annotation vector of each word, $h_j = [\overrightarrow{h}_j; \overleftarrow{h}_j]$. In this way, each $h_j$ contains information both about the words before position $j$ and about the words after it.
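For illustration, a minimal sketch of such a bidirectional sentence encoder in PyTorch (the patent does not name a framework; the library choice, layer sizes, and names here are assumptions):

```python
import torch
import torch.nn as nn

class SentenceEncoder(nn.Module):
    """Encode one modern-text sentence into per-character annotation
    vectors h_j = [forward; backward] plus a single semantic vector."""
    def __init__(self, vocab_size, emb_dim=128, hidden_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # bidirectional=True runs the forward and backward LSTMs at once
        self.lstm = nn.LSTM(emb_dim, hidden_dim, batch_first=True,
                            bidirectional=True)

    def forward(self, char_ids):                   # (batch, T)
        x = self.embed(char_ids)                   # (batch, T, emb_dim)
        h, (h_n, _) = self.lstm(x)                 # h: (batch, T, 2*hidden_dim)
        # semantic vector: final forward state concatenated with final backward state
        sem = torch.cat([h_n[0], h_n[1]], dim=-1)  # (batch, 2*hidden_dim)
        return h, sem
```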
Decoding part:
In the decoder layer, we define the conditional probability as

$p(y_i \mid y_1, \dots, y_{i-1}, x) = g(y_{i-1}, s_i, c_i)$

where $s_i$ is the hidden state of the LSTM at time $i$. The conditional probability depends on a distinct context vector $c_i$ for each word $y_i$, which represents the external input information during generation. $c_i$ depends on the sequence of hidden states $(h_1, \dots, h_T)$:

$c_i = \sum_{j=1}^{T} \alpha_{ij} h_j$

Each $c_i$ includes information from the whole input sequence but focuses on the information around the $i$-th word.
The weight $\alpha_{ij}$ is

$\alpha_{ij} = \dfrac{\exp(e_{ij})}{\sum_{k=1}^{T} \exp(e_{ik})}$

where

$e_{ij} = v^{\top} \tanh(W s_{i-1} + U h_j)$

and $v$, $W$, $U$ are the three matrices to be optimized during training.
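A sketch of this attention computation, following the formulation above; the module structure and dimensions are illustrative assumptions, with `W`, `U`, and `v` mirroring the three matrices to be optimized:

```python
import torch
import torch.nn as nn

class Attention(nn.Module):
    """Compute alpha_ij = softmax_j(v^T tanh(W s_{i-1} + U h_j)) and
    the context vector c_i = sum_j alpha_ij * h_j."""
    def __init__(self, dec_dim, enc_dim, attn_dim=128):
        super().__init__()
        self.W = nn.Linear(dec_dim, attn_dim, bias=False)
        self.U = nn.Linear(enc_dim, attn_dim, bias=False)
        self.v = nn.Linear(attn_dim, 1, bias=False)

    def forward(self, s_prev, h):  # s_prev: (batch, dec_dim); h: (batch, T, enc_dim)
        e = self.v(torch.tanh(self.W(s_prev).unsqueeze(1) + self.U(h)))  # (batch, T, 1)
        alpha = torch.softmax(e, dim=1)          # attention weights over the T positions
        c = (alpha * h).sum(dim=1)               # context vector: (batch, enc_dim)
        return c, alpha.squeeze(-1)
```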
Consistency guarantee
In the above generation process, each sentence of the modern text yields one semantic vector and generates one line of the poem, so the poem as a whole stays on the theme. Meanwhile, the generation of each line depends on the characters generated in the previous line, which ensures the coherence of the entire poem. Our method can thus generate a poem that is coherent from beginning to end while also following the logical thread of the modern text, guaranteeing consistency.
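A minimal sketch of this sentence-by-sentence flow; `encode` and `generate_line` are hypothetical names standing in for the trained encoding and decoding parts:

```python
def modern_text_to_poem(sentences, encode, generate_line):
    """Turn a list of modern-text sentences into poem lines.
    `encode` maps a sentence to its semantic (intent) vector;
    `generate_line` runs the decoder character by character,
    conditioned on that vector and on the previously generated line."""
    poem, prev_line = [], None
    for sentence in sentences:
        vec = encode(sentence)                # one semantic vector per sentence
        line = generate_line(vec, prev_line)  # keeps the line on-theme and coherent
        poem.append(line)
        prev_line = line                      # the next line depends on this one
    return poem
```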
The LSTM model may further be subject to a style-rule constraint, so that the generated poem must comply with the formal requirements. By changing this style rule, classical poems of multiple formats can be generated. For example, for seven-character regulated verse the style rule requires that an end-of-line marker be produced after every seven characters; meanwhile, the seventh character of each line must follow a unified rhyme, and each character within a line must satisfy certain tonal (level/oblique) requirements. By switching the style rule, poems with different rhymes and of different types can be obtained; one can even design style rules for different ci tune patterns (such as "Yu Meiren" or "Lang Tao Sha") to generate Song-dynasty ci of various patterns.
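To make the style-rule constraint concrete, a small sketch of a checker for two of the seven-character-verse constraints named above; the rhyme lookup table is a hypothetical stand-in for a real rhyme dictionary:

```python
def rhyme_class(char):
    # Hypothetical lookup mapping a character to its rhyme group.
    # A real system would consult a full rhyme dictionary.
    RHYME_TABLE = {"天": "xian", "边": "xian", "年": "xian"}
    return RHYME_TABLE.get(char)

def check_qilu(lines):
    """Check two constraints of seven-character regulated verse:
    every line has exactly 7 characters, and the final character of
    each rhyming (even-numbered) line belongs to one rhyme group."""
    if any(len(line) != 7 for line in lines):
        return False
    rhymes = {rhyme_class(line[-1]) for line in lines[1::2]}
    return len(rhymes) == 1 and None not in rhymes
```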
Generating diverse poems
We can generate poems with different expressions by introducing randomness. For example, we can randomly add near-synonyms to the input vernacular text, so that the generated poems vary. This randomness can be realized through operations such as insertion, deletion, and modification. Taking the example given above: by deleting "the southern bank of the river" or adding "small birds", one can obtain "草长莺飞二月天" ("grass grows and orioles fly on a February day"). Adding a small amount of randomization increases the diversity of the generated poems without letting the final poem stray too far from the theme.
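A sketch of the insert/delete/modify perturbations described here; the near-synonym table and probabilities are illustrative assumptions:

```python
import random

SYNONYMS = {"小鸟": ["飞鸟", "黄莺"]}  # hypothetical near-synonym table

def perturb(words, p=0.1):
    """Randomly delete, modify, or insert words to diversify the input."""
    out = []
    for w in words:
        r = random.random()
        if r < p / 3:
            continue                                # delete the word
        if r < 2 * p / 3 and w in SYNONYMS:
            out.append(random.choice(SYNONYMS[w]))  # modify: swap a near-synonym
            continue
        out.append(w)
        if r > 1 - p / 3 and w in SYNONYMS:
            out.append(random.choice(SYNONYMS[w]))  # insert a near-synonym
    return out
```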
Training
During training, we feed the vector output by the decoding part into a softmax regression layer to obtain a probability distribution over the possible outcomes. We choose as the loss function the cross entropy between the probability distribution produced by softmax and the desired output (in one-hot form):

$C = -\dfrac{1}{n} \sum_{s} \sum_{w} \left[ y \ln a + (1 - y) \ln(1 - a) \right]$

where $y$ is the desired output, $a$ is the actual output, $w$ ranges over words, $s$ ranges over sentences, and $n$ is the batch size.
To speed up training and to avoid getting trapped in local optima, the present invention adopts mini-batch stochastic gradient descent (mini-batch SGD) as the optimization algorithm:

$w \leftarrow w - \dfrac{\eta}{m} \sum_{x} C'_x$

where $\eta$ is the learning rate, $m$ is the mini-batch size, and $C'$ is the derivative of the loss function.
We additionally use the AdaDelta algorithm to adaptively adjust the learning rate.
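A sketch of this training procedure under the stated loss and optimizers (PyTorch assumed, as in the earlier sketches; `model` and `loader` are placeholders):

```python
import torch
import torch.nn as nn

def train(model, loader, epochs=10):
    """Mini-batch training with cross-entropy loss and AdaDelta.
    Assumes `model` maps a batch of character ids to logits of shape
    (batch, T, vocab_size) and `loader` yields (inputs, targets) batches."""
    criterion = nn.CrossEntropyLoss()                     # cross entropy vs one-hot targets
    optimizer = torch.optim.Adadelta(model.parameters())  # adaptive learning rate
    for _ in range(epochs):
        for inputs, targets in loader:                    # mini-batch SGD
            optimizer.zero_grad()
            logits = model(inputs)
            loss = criterion(logits.reshape(-1, logits.size(-1)),
                             targets.reshape(-1))
            loss.backward()                               # computes C', the loss gradient
            optimizer.step()
```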
A device implementing the above method for converting modern text into classical poetry based on an LSTM model comprises a coding module for transforming the user's input word sequence into an activation state and a decoding module for adjusting the generation of the target sequence according to that activation state.
The present invention has the following advantages:
The invention combines neural networks with rules: the neural network performs semantic understanding of the modern Chinese input, and the rule system then generates a poem that conforms to the semantics. To ensure the diversity of the generated poems, we use two methods of introducing randomness: one applies random perturbations to the input modern text, and the other modifies the style rules. The former introduces expressive diversity; the latter introduces stylistic diversity (for example, five-character verse, seven-character verse, and various rhythmic patterns).
Modern text is converted into classical poetry automatically by computer, which better suits common usage habits: an ordinary user can generate a poem simply by entering modern text, which is more convenient than traditional theme-based generation.
By expressing the logic of the modern text through the logic of the poem, mere word-piling is avoided, and a rule-conforming poem with real expressive power is formed; the generation process can be controlled finely, giving the resulting poem a logical thread.
By adding randomness, colorful poems with various expressions and styles can be written.
Description of the drawings
Fig. 1 is a schematic diagram of the principle of the present invention.
Specific embodiment
The present invention is further described below with reference to the accompanying drawing.
In the method for converting modern text into classical poetry based on an LSTM model, the LSTM model comprises an encoding part for transforming the user's input word sequence into a dense vector of fixed dimension (an activation state) and a decoding part for adjusting the generation of the target sequence according to that activation state. Both parts are realized with LSTMs. As shown in the lower half of Fig. 1, the system passes each sentence of the user's modern-text input through a bidirectional LSTM network for encoding; each sentence is represented as one semantic vector, and together they form a group of semantic vectors that encode the user's intent. In the generation process (the upper half of Fig. 1), an LSTM network runs in a loop to produce each character of the output; when generating each character, the semantic vector corresponding to the current sentence is used as a reference input, so that the generated line agrees with the meaning the user requires.
The intent vectors, the style rules, the encoding and decoding parts, the consistency guarantee, the style-rule constraint, the generation of diverse poems, the training procedure, and the device structure in this embodiment are as described above in the Content of the Invention.

Claims (8)

1. A method for converting modern text into classical poetry based on an LSTM model, characterized in that the LSTM model comprises an encoding part for transforming the user's input word sequence into an activation state and a decoding part for adjusting the generation of the target sequence according to the activation state, the encoding part and the decoding part both being realized with LSTMs.
2. The method for converting modern text into classical poetry based on an LSTM model according to claim 1, characterized in that the encoding part uses a bidirectional LSTM model: the forward LSTM reads the input sequence $(x_1, \dots, x_T)$ in input order and computes the forward hidden states $(\overrightarrow{h}_1, \dots, \overrightarrow{h}_T)$; the backward LSTM reads the sequence in the reverse order $(x_T, \dots, x_1)$ and computes the backward hidden states $(\overleftarrow{h}_1, \dots, \overleftarrow{h}_T)$; and the forward and backward states are concatenated to obtain the annotation vector $h_j = [\overrightarrow{h}_j; \overleftarrow{h}_j]$ of each word.
3. The method for converting modern text into classical poetry based on an LSTM model according to claim 1, characterized in that in the decoding part, in the decoder layer, the conditional probability is defined as

$p(y_i \mid y_1, \dots, y_{i-1}, x) = g(y_{i-1}, s_i, c_i)$,

where $s_i$ is the hidden state of the LSTM at time $i$, and $c_i$ is the distinct context vector of each word $y_i$, representing the external input information during generation,

$c_i = \sum_{j=1}^{T} \alpha_{ij} h_j$;

the weight $\alpha_{ij}$ is

$\alpha_{ij} = \dfrac{\exp(e_{ij})}{\sum_{k=1}^{T} \exp(e_{ik})}$,

where

$e_{ij} = v^{\top} \tanh(W s_{i-1} + U h_j)$.
4. The method for converting modern text into classical poetry based on an LSTM model according to claim 1, characterized in that during generation, the generation of each line depends on the characters generated in the previous line.
5. The method for converting modern text into classical poetry based on an LSTM model according to claim 1, characterized in that the LSTM model is subject to a style-rule constraint.
6. The method for converting modern text into classical poetry based on an LSTM model according to claim 1, characterized in that the training method of the LSTM model is: the vector output by the decoding part is fed into a softmax regression layer to obtain a probability distribution over the possible outcomes, and the cross entropy between the probability distribution produced by softmax and the desired output is selected as the loss function,

$C = -\dfrac{1}{n} \sum_{s} \sum_{w} \left[ y \ln a + (1 - y) \ln(1 - a) \right]$,

where $y$ is the desired output, $a$ is the actual output, $w$ ranges over words, $s$ ranges over sentences, and $n$ is the batch size.
7. The method for converting modern text into classical poetry based on an LSTM model according to claim 6, characterized in that mini-batch stochastic gradient descent is adopted as the optimization algorithm,

$w \leftarrow w - \dfrac{\eta}{m} \sum_{x} C'_x$,

where $C'$ is the derivative of the loss function; and the AdaDelta algorithm is used to adjust the learning rate.
8. A device implementing the method for converting modern text into classical poetry based on an LSTM model according to any one of claims 1 to 7, characterized by comprising a coding module for transforming the user's input word sequence into an activation state and a decoding module for adjusting the generation of the target sequence according to the activation state.
CN201611140395.0A 2016-12-12 2016-12-12 Method and device for converting to ancient poem from modern article based on long short term memory (LSTM) model Pending CN106598921A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201611140395.0A CN106598921A (en) 2016-12-12 2016-12-12 Method and device for converting to ancient poem from modern article based on long short term memory (LSTM) model


Publications (1)

Publication Number Publication Date
CN106598921A 2017-04-26

Family

ID=58597666

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201611140395.0A Pending CN106598921A (en) 2016-12-12 2016-12-12 Method and device for converting to ancient poem from modern article based on long short term memory (LSTM) model

Country Status (1)

Country Link
CN (1) CN106598921A (en)



Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1425974A (en) * 2002-11-14 2003-06-25 范金娣 Chinese character input method code
CN103052943A (en) * 2010-09-09 2013-04-17 株式会社日立制作所 Source code conversion method and source code conversion program
CN103092594A (en) * 2011-11-07 2013-05-08 金蝶软件(中国)有限公司 Model conversion method and device
CN105159890A (en) * 2014-06-06 2015-12-16 谷歌公司 Generating representations of input sequences using neural networks
CN104157290A (en) * 2014-08-19 2014-11-19 大连理工大学 Speaker recognition method based on depth learning
CN104217214A (en) * 2014-08-21 2014-12-17 广东顺德中山大学卡内基梅隆大学国际联合研究院 Configurable convolutional neural network based red green blue-distance (RGB-D) figure behavior identification method
CN105868774A (en) * 2016-03-24 2016-08-17 西安电子科技大学 Selective search and convolutional neural network based vehicle logo recognition method
CN105929845A (en) * 2016-05-18 2016-09-07 中国计量大学 Unmanned aerial vehicle network-based river channel cruise system and cruise method
CN106126492A (en) * 2016-06-07 2016-11-16 北京高地信息技术有限公司 Statement recognition methods based on two-way LSTM neutral net and device

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
DZMITRY BAHDANAU et al., "Neural Machine Translation by Jointly Learning to Align and Translate", arXiv:1409.0473v7, http://arxiv.org/abs/1409.0473v7 *
QIXIN WANG et al., "Chinese Song Iambics Generation with Neural Attention-based Model", arXiv:1604.06274, http://arxiv.org/abs/1604.06274 *

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107590192A (en) * 2017-08-11 2018-01-16 深圳市腾讯计算机系统有限公司 Mathematicization processing method, device, equipment and the storage medium of text question
CN107590192B (en) * 2017-08-11 2023-05-05 深圳市腾讯计算机系统有限公司 Mathematical processing method, device, equipment and storage medium for text questions
CN108304436A (en) * 2017-09-12 2018-07-20 深圳市腾讯计算机系统有限公司 The generation method of style sentence, the training method of model, device and equipment
WO2019052311A1 (en) * 2017-09-12 2019-03-21 腾讯科技(深圳)有限公司 Style statement generation method, model training method and apparatus, and computer device
CN108304436B (en) * 2017-09-12 2019-11-05 深圳市腾讯计算机系统有限公司 Generation method, the training method of model, device and the equipment of style sentence
US11348570B2 (en) 2017-09-12 2022-05-31 Tencent Technology (Shenzhen) Company Limited Method for generating style statement, method and apparatus for training model, and computer device
US11869485B2 (en) 2017-09-12 2024-01-09 Tencent Technology (Shenzhen) Company Limited Method for generating style statement, method and apparatus for training model, and computer device
CN107943525A (en) * 2017-11-17 2018-04-20 魏茨怡 A kind of mobile phone app interactive modes based on Recognition with Recurrent Neural Network
WO2019169719A1 (en) * 2018-03-08 2019-09-12 平安科技(深圳)有限公司 Automatic abstract extraction method and apparatus, and computer device and storage medium
CN110263150A (en) * 2019-03-05 2019-09-20 腾讯科技(深圳)有限公司 Document creation method, device, computer equipment and storage medium
CN110263150B (en) * 2019-03-05 2023-10-31 腾讯科技(深圳)有限公司 Text generation method, device, computer equipment and storage medium


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20170426