CN107368475A - Machine translation method and system based on a generative adversarial neural network - Google Patents

Machine translation method and system based on a generative adversarial neural network Download PDF

Info

Publication number
CN107368475A
Authority
CN
China
Prior art keywords
network
machine translation
generation
vector
neural network
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201710586841.9A
Other languages
Chinese (zh)
Other versions
CN107368475B (en)
Inventor
李世奇
程国艮
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Mandarin Technology (Beijing) Co., Ltd.
Original Assignee
Mandarin Technology (Beijing) Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Mandarin Technology (Beijing) Co., Ltd. filed Critical Mandarin Technology (Beijing) Co., Ltd.
Priority to CN201710586841.9A priority Critical patent/CN107368475B/en
Publication of CN107368475A publication Critical patent/CN107368475A/en
Application granted granted Critical
Publication of CN107368475B publication Critical patent/CN107368475B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 40/00 Handling natural language data
    • G06F 40/40 Processing or translation of natural language
    • G06F 40/58 Use of machine translation, e.g. for multi-lingual retrieval, for server-side translation for client devices or for real-time translation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/02 Neural networks
    • G06N 3/08 Learning methods
    • G06N 3/084 Backpropagation, e.g. using gradient descent

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Artificial Intelligence (AREA)
  • Health & Medical Sciences (AREA)
  • Computational Linguistics (AREA)
  • General Health & Medical Sciences (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Machine Translation (AREA)

Abstract

The invention belongs to the field of computer technology and discloses a machine translation method and system based on a generative adversarial neural network. The method includes: on the basis of the original machine-translation generator network, introducing a discriminator network that competes against the generator network; the discriminator judges whether a target-language translation comes from the parallel training corpus or is a result produced by the original machine-translation generator network; the discriminator network uses a multilayer perceptron feed-forward neural network model to perform binary classification. The system includes: a discriminator network, a generator network, a monolingual corpus, and a parallel corpus. While making full use of manually annotated bilingual parallel corpus resources, the invention can also exploit monolingual corpus resources and perform semi-supervised learning. Monolingual corpus resources are abundant and easy to obtain, which alleviates the problem of insufficient training corpora for neural machine translation models.

Description

Machine translation method and system based on a generative adversarial neural network
Technical field
The invention belongs to the field of computer technology and in particular relates to a machine translation method and system based on a generative adversarial neural network.
Background technology
Machine translation is the process of automatically converting a sentence in a source language into a sentence in another target language by means of computer algorithms. Machine translation is a research direction of artificial intelligence and has very important scientific and practical value. With the continuous deepening of globalization and the rapid development of the Internet, machine translation technology plays an increasingly important role in political, economic, social, and cultural exchanges at home and abroad.
At present, machine translation methods based on deep neural networks achieve the best results in the field. They mainly adopt an "encoder-decoder" structure consisting of two parts, an encoder and a decoder, both built from recurrent neural network (RNN) and long short-term memory (LSTM) network structures. The translation flow is as follows: first, the encoder converts the input source-language sentence into a sequence of word vectors, which serves as the input of the recurrent neural network; the encoder then outputs a dense vector of fixed length, called the context vector. Next, the decoder takes the context vector as input and, using another recurrent neural network combined with a Softmax classifier, outputs a word-vector sequence in the target language. Finally, a dictionary maps the word vectors one by one to target-language words, completing the whole translation process.
In summary, the problems existing in the prior art are as follows:
The most important defect of the prior art is that training a deep neural network model relies heavily on large-scale, manually annotated bilingual parallel sentence-pair corpora. Because manual annotation is costly, large-scale, high-quality manually annotated bilingual parallel corpora are scarce. As a result, neural machine translation models suffer from insufficient training data and poor performance, which is the bottleneck problem faced by existing neural machine translation models. For some low-resource languages in particular, the parallel corpus resources available for training neural network models are even scarcer, making it difficult to build high-performance machine translation systems.
Summary of the invention
In view of the problems existing in the prior art, the invention provides a machine translation method and system based on a generative adversarial neural network.
The invention is achieved as follows: in the machine translation method based on a generative adversarial neural network, on the basis of the original machine-translation generator network, a discriminator network that competes against the generator network is introduced; the discriminator judges whether a target-language translation comes from the training corpus or is a result of machine translation by the original machine-translation generator network; the discriminator network uses a multilayer perceptron feed-forward neural network model to perform binary classification.
Further, the method for the two-value classification includes:
Using the form of hyperbolic tangent function:
Wherein, T (x) is the activation primitive of hidden layer;H (x) is implicit layer functions;
Whole multilayer perceptron BP network model function f (x) can formalization representation be:
F (x)=S (W2·h(x)+b2)=S (W2·T(W1x+b1)+b2),
Wherein, model parameter W2And b2Represent hidden layer to the weight matrix and output layer bias vector of output layer respectively;S (x) be hidden layer activation primitive;The activation primitive uses the form of sigmoid functions:
When multilayer perceptron BP network model carries out two-value classification, input layer vector X is substituted into f (x) and calculated Go out output vector Y, select the classification representated by the larger dimension of numerical value in Y, as classification results, instruction translation is derived from training Language material, also it is derived from generating network.
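For illustration only, the binary classification just described can be sketched in Python as follows; the layer sizes, the helper name mlp_discriminate, the label convention, and the random parameters are assumptions and do not form part of the claimed method.

```python
import numpy as np

def mlp_discriminate(x, W1, b1, W2, b2):
    """Binary classification with one tanh hidden layer and a sigmoid output.

    x  : input vector (n,)
    W1 : input-to-hidden weights (m, n); b1 : hidden bias (m,)
    W2 : hidden-to-output weights (2, m); b2 : output bias (2,)
    Returns the index of the larger-valued output dimension, read here as
    0 = translation from the training corpus, 1 = translation from the generator
    (label convention assumed for illustration).
    """
    h = np.tanh(W1 @ x + b1)                    # hidden layer h(x) = T(W1·x + b1)
    y = 1.0 / (1.0 + np.exp(-(W2 @ h + b2)))    # output layer f(x) = S(W2·h(x) + b2)
    return int(np.argmax(y))                    # class of the larger-valued dimension of Y

# Example with random (untrained) parameters, assuming a 16-dimensional input
rng = np.random.default_rng(0)
n, m = 16, 8
x = rng.normal(size=n)
W1, b1 = rng.normal(size=(m, n)), np.zeros(m)
W2, b2 = rng.normal(size=(2, m)), np.zeros(2)
print(mlp_discriminate(x, W1, b1, W2, b2))
```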
Further, the generator network consists of two parts, an encoder and a decoder. The encoder uses a bidirectional long short-term memory (LSTM) neural network structure. The encoder first converts the input source-language sentence into a sequence of word vectors, which serves as the input of the LSTM network; the network generates a dense vector of fixed length, called the context vector, which is the output of the encoder.
Then, the decoder uses another unidirectional LSTM neural network, taking the context vector output by the encoder as input. A Softmax classifier is stacked on the output layer of the neural machine translation model to output the word-vector sequence of the target language; a dictionary maps the word vectors one by one to target-language words, completing the automatic translation process.
Further, the inputs x_t and h_(t-1) of the neural machine translation model denote the input word vector and the output of the LSTM neural network unit at time t-1, respectively; the output h_t denotes the output of the LSTM neural network unit at the current time t.
The computation specifically includes:
i_t = g(W_xi·x_t + W_hi·h_(t-1) + b_i);
f_t = g(W_xf·x_t + W_hf·h_(t-1) + b_f);
o_t = g(W_xo·x_t + W_ho·h_(t-1) + b_o);
c̃_t = tanh(W_xc·x_t + W_hc·h_(t-1) + b_c);
c_t = f_t·c_(t-1) + i_t·c̃_t;
h_t = o_t·tanh(c_t);
where i_t, f_t, and o_t denote the input gate, forget gate, and output gate, respectively; g is the gate activation function; c_(t-1) denotes the cell state at time t-1, and c_t and c̃_t denote the cell state and the candidate cell state; h_t is the output of the LSTM neuron; the parameters W and b denote the connection weights and bias terms of each layer.
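A minimal numpy sketch of a single LSTM step following the equations above is given below, assuming that g is the sigmoid function; the parameter shapes and the names lstm_step and P are illustrative only.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x_t, h_prev, c_prev, P):
    """One LSTM time step following the gate equations above.

    P is a dict of parameters W_xi, W_hi, b_i, W_xf, W_hf, b_f,
    W_xo, W_ho, b_o, W_xc, W_hc, b_c (names are illustrative).
    """
    i_t = sigmoid(P["W_xi"] @ x_t + P["W_hi"] @ h_prev + P["b_i"])    # input gate
    f_t = sigmoid(P["W_xf"] @ x_t + P["W_hf"] @ h_prev + P["b_f"])    # forget gate
    o_t = sigmoid(P["W_xo"] @ x_t + P["W_ho"] @ h_prev + P["b_o"])    # output gate
    c_hat = np.tanh(P["W_xc"] @ x_t + P["W_hc"] @ h_prev + P["b_c"])  # candidate cell state
    c_t = f_t * c_prev + i_t * c_hat                                  # new cell state
    h_t = o_t * np.tanh(c_t)                                          # unit output
    return h_t, c_t

# Example with random parameters: 4-dimensional input, 3-dimensional hidden state
rng = np.random.default_rng(0)
d_in, d_h = 4, 3
P = {k: rng.normal(size=(d_h, d_in)) for k in ("W_xi", "W_xf", "W_xo", "W_xc")}
P.update({k: rng.normal(size=(d_h, d_h)) for k in ("W_hi", "W_hf", "W_ho", "W_hc")})
P.update({k: np.zeros(d_h) for k in ("b_i", "b_f", "b_o", "b_c")})
h, c = lstm_step(rng.normal(size=d_in), np.zeros(d_h), np.zeros(d_h), P)
```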
Further, the encoder uses two LSTM networks, one taking the forward word-vector sequence as input and the other the reversed word-vector sequence, forming a bidirectional LSTM network; the output vectors of the two networks are concatenated to form the context vector. The decoder uses one LSTM network that takes the context vector as input and outputs a state sequence, which is then passed through a Softmax classifier of the following functional form:
P(y = i | x; θ) = exp(θ_i·x) / Σ_(j=1..k) exp(θ_j·x),
where (θ_1, θ_2, ..., θ_k) are the classifier parameters, k is the total number of classes, and i denotes a particular class. The states output by the decoder are converted into target-language word vectors, which are then assembled in sequence to form the translation result.
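As a small illustration of the Softmax classifier given above, the following sketch computes class probabilities from a decoder state; the function name and the dimensions are hypothetical.

```python
import numpy as np

def softmax_classifier(x, theta):
    """P(y = i | x) = exp(theta_i · x) / sum_j exp(theta_j · x).

    x     : decoder state vector (d,)
    theta : classifier parameter matrix (k, d), one row per class
    """
    scores = theta @ x
    scores -= scores.max()                       # numerical stability; does not change the result
    probs = np.exp(scores) / np.exp(scores).sum()
    return probs

theta = np.random.default_rng(0).normal(size=(5, 8))   # k = 5 classes, d = 8 state dimensions
x = np.random.default_rng(1).normal(size=8)
print(softmax_classifier(x, theta).sum())               # probabilities sum to 1
```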
Further, the discriminator network is trained adversarially, so as to simultaneously improve the generator network's ability to generate the target language and the discriminator network's ability to judge the source of a translation. During adversarial training, the discriminator judges whether a translation result is real data from the corpus or a result of machine translation by the original machine-translation generator network.
In the machine translation method based on a generative adversarial neural network, the learning process is a competition between the generator network and the discriminator network, specifically including:
taking one sample at random from either the real samples or the samples produced by the generator model, and letting the discriminator network judge whether it is real;
through this competitive machine-learning mechanism, the performance of both the generator network and the discriminator network is continuously improved; when the whole network reaches a Nash equilibrium, i.e., when the parameters of the two networks are stable, training is complete. At this point, the machine translation results produced by the generator network are able to fool the discriminator into believing that the translations come from the parallel corpus, and the generator network model can be used as the output machine translation model.
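The competition described above can be outlined as the following alternating training loop. This is only a sketch under simplifying assumptions: real translations and generated translations are replaced by toy vectors, PyTorch is assumed to be available, and all names and hyperparameters are illustrative rather than part of the claimed training procedure.

```python
import torch
import torch.nn as nn

# Toy stand-ins: "real translations" are vectors from a fixed Gaussian, the generator
# maps noise to vectors, and the discriminator outputs the probability of "real".
torch.manual_seed(0)
dim = 8
generator = nn.Sequential(nn.Linear(dim, 16), nn.Tanh(), nn.Linear(16, dim))
discriminator = nn.Sequential(nn.Linear(dim, 16), nn.Tanh(), nn.Linear(16, 1), nn.Sigmoid())
g_opt = torch.optim.Adam(generator.parameters(), lr=1e-3)
d_opt = torch.optim.Adam(discriminator.parameters(), lr=1e-3)
bce = nn.BCELoss()

def real_batch(n):          # stand-in for translations drawn from the parallel corpus
    return torch.randn(n, dim) * 0.5 + 2.0

for step in range(2000):
    # 1) Train the discriminator: real samples labelled 1, generated samples labelled 0.
    real = real_batch(32)
    fake = generator(torch.randn(32, dim)).detach()
    d_loss = bce(discriminator(real), torch.ones(32, 1)) + \
             bce(discriminator(fake), torch.zeros(32, 1))
    d_opt.zero_grad(); d_loss.backward(); d_opt.step()

    # 2) Train the generator to fool the discriminator (target label 1).
    fake = generator(torch.randn(32, dim))
    g_loss = bce(discriminator(fake), torch.ones(32, 1))
    g_opt.zero_grad(); g_loss.backward(); g_opt.step()

# Training would be stopped once both losses (and hence the parameters) stabilise,
# approximating the Nash equilibrium described in the text.
```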
Further, the machine translation method based on a generative adversarial neural network uses monolingual corpus resources in addition to the manually annotated bilingual parallel corpus resources, and thereby performs semi-supervised learning.
Further, the machine translation method based on a generative adversarial neural network specifically includes:
building a bidirectional long short-term memory neural network as the discriminator network;
combining the generator network and the discriminator network to form a complete generative adversarial network; concatenating the input vector of the encoder and the output vector of the decoder in the generator network and passing the result to the discriminator network as input; meanwhile, feeding the output result (0 or 1) of the discriminator network back to the generator network (one possible wiring is sketched below);
integrating the parallel corpus and the monolingual corpus into a semi-supervised corpus, and training the whole generative adversarial network with this semi-supervised corpus; when the parameters of the generative adversarial network remain stable, training is complete.
After the training of the generative adversarial network model is completed, the generator network part of the network is taken as the output machine translation model for subsequent use.
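One possible reading of the wiring described in the steps above is sketched below: a source-side representation built from the encoder inputs and a target-side representation built from the decoder outputs are concatenated and scored by the discriminator, and the 0/1 judgment is fed back as a training signal for the generator. The mean-pooling step, the simplified feed-forward scorer, and all names are assumptions made for illustration; the patent only states that the two vectors are concatenated and passed to the discriminator.

```python
import torch
import torch.nn as nn

d_model = 32
discriminator = nn.Sequential(              # the bidirectional-LSTM discriminator is simplified
    nn.Linear(2 * d_model, 64), nn.Tanh(),  # to a feed-forward scorer for brevity
    nn.Linear(64, 1), nn.Sigmoid())

def discriminator_input(encoder_inputs, decoder_outputs):
    """Concatenate a pooled source-side vector with a pooled target-side vector.

    encoder_inputs  : (src_len, d_model) word vectors fed to the encoder
    decoder_outputs : (tgt_len, d_model) vectors produced by the decoder
    Mean pooling over time is an assumption made here for illustration.
    """
    src_vec = encoder_inputs.mean(dim=0)
    tgt_vec = decoder_outputs.mean(dim=0)
    return torch.cat([src_vec, tgt_vec], dim=-1)

x = discriminator_input(torch.randn(7, d_model), torch.randn(9, d_model))
p_real = discriminator(x)          # probability that the translation comes from the corpus
feedback = (p_real > 0.5).float()  # the 0/1 result fed back to the generator
```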
Another object of the present invention is to provide a machine translation system based on a generative adversarial neural network, comprising:
a discriminator network, which uses a multilayer perceptron feed-forward neural network model to perform binary classification and judges whether a target-language translation comes from the training corpus or is a result of machine translation by the original generator network.
Further, the machine translation system based on a generative adversarial neural network also includes:
a generator network, which is combined with the discriminator network to form a complete generative adversarial network; the input vector of the encoder and the output vector of the decoder in the generator network are concatenated and passed to the discriminator network as input; meanwhile, the output result (0 or 1) of the discriminator network is fed back to the generator network;
a monolingual corpus, which is integrated with the parallel corpus to form a semi-supervised corpus with which the whole generative adversarial network is trained; when the parameters of the generative adversarial network remain stable, training is complete.
Advantages and positive effects of the present invention:
On the basis of the original machine-translation generator network, i.e. the "encoder-decoder" artificial neural network machine translation model, the present invention introduces a discriminator network that competes against the generator network and judges whether a target-language translation comes from the training corpus or is a result of machine translation by the original machine-translation generator network.
The present invention improves the overall framework of existing machine translation methods based on artificial neural networks. It provides a machine translation method based on a generative adversarial network, giving the neural machine translation model a capability of self-learning. While making full use of the manually annotated bilingual parallel corpus resources, it can also use monolingual corpus resources for semi-supervised learning. Monolingual corpus resources are abundant and easy to obtain, which relieves the bottleneck of insufficient training corpora for neural machine translation and can save more than 50% of the cost of manual corpus annotation.
After the model of the present invention is trained, its parameter scale and running time in practical applications are comparable to those of current neural machine translation models, so the complexity of the machine translation model in practical use will not increase.
Brief description of the drawings
Fig. 1 is a flowchart of the machine translation method based on a generative adversarial neural network provided by an embodiment of the present invention.
Fig. 2 is a schematic diagram of the machine translation system based on a generative adversarial neural network provided by an embodiment of the present invention.
In the figure: 1. discriminator network; 2. generator network; 3. monolingual corpus; 4. parallel corpus.
Fig. 3 is a schematic diagram of the neural machine translation model based on the "encoder-decoder" structure provided by an embodiment of the present invention.
Fig. 4 is a schematic diagram of an LSTM neural network unit provided by an embodiment of the present invention.
Detailed description of the embodiments
In order to make the objectives, technical solutions, and advantages of the present invention clearer, the present invention is further described below with reference to the embodiments. It should be understood that the specific embodiments described here are merely illustrative of the present invention and are not intended to limit the present invention.
At present, the most important defect of the prior art is that training a deep neural network model relies heavily on large-scale, manually annotated bilingual parallel sentence-pair corpora. Because manual annotation is costly, large-scale, high-quality manually annotated bilingual parallel corpora are scarce, so neural machine translation models suffer from insufficient training data and poor performance, which is the bottleneck problem faced by existing neural machine translation models. For some low-resource languages in particular, the parallel corpus resources available for training neural network models are even scarcer, making it difficult to build high-performance machine translation systems.
The present invention constructs the discriminator network with a multilayer perceptron feed-forward neural network model that performs binary classification. The multilayer perceptron neural network model comprises an input layer X: {x1, x2, ..., xn}, a hidden layer H: {h1, h2, ..., hm}, and an output layer Y: {y1, y2}.
The hidden-layer function h(x) can be formalized as:
h(x) = T(W1·x + b1),
where the model parameters W1 and b1 denote the input-to-hidden weight matrix and the hidden-layer bias vector, respectively; T(x) is the hidden-layer activation function, for which the present invention uses the hyperbolic tangent form:
T(x) = (e^x - e^(-x)) / (e^x + e^(-x)).
The whole multilayer perceptron neural network model function f(x) can be formalized as:
f(x) = S(W2·h(x) + b2) = S(W2·T(W1·x + b1) + b2),
where the model parameters W2 and b2 denote the hidden-to-output weight matrix and the output-layer bias vector, respectively; S(x) is the output-layer activation function, for which the present invention uses the sigmoid form:
S(x) = 1 / (1 + e^(-x)).
When the multilayer perceptron neural network model performs binary classification, the input-layer vector X is substituted into f(x) to compute the two-dimensional output vector Y, and the class corresponding to the larger-valued dimension of Y is selected as the classification result.
The application principle of the present invention is described in detail below with reference to the accompanying drawings and specific embodiments.
As shown in Fig. 1, the machine translation method based on a generative adversarial neural network provided by the embodiment of the present invention is as follows:
On the basis of traditional neural network machine translation, another artificial neural network that competes with it is introduced, called the discriminator network; the original machine-translation LSTM neural network is called the generator network. In the generator-network machine translation model, the model used by the generator is the traditional "encoder-decoder" neural network translation model, whose function is to generate the corresponding target-language sentence from the input source-language sentence. The model used by the discriminator network is a multilayer perceptron feed-forward neural network model that performs binary classification; each node in the adversarial network is a perceptron. The function of the discriminator network is to judge whether a target-language translation comes from the training corpus or is a result of machine translation by the recurrent-neural-network-based generator.
The generative adversarial network introduces a mechanism of competition between the generator network and the discriminator network. Through adversarial training, the ability of the generator network to generate the target language and the ability of the discriminator network to judge the source of a translation are improved simultaneously. During training, the objective of the discriminator network is to judge whether a translation result is real data from the corpus or the result of machine translation, while the objective of the generator network is to produce translation results that can fool the discriminator network into believing that the machine-translated result is a result from the real corpus.
The learning process of the machine translation method based on a generative adversarial neural network provided by the embodiment of the present invention becomes a competition between the generator network and the discriminator network: one sample is taken at random from either the real samples or the samples produced by the generator model, and the discriminator network judges whether it is real. Through this competitive machine-learning mechanism, the performance of the generator network and the discriminator network is continuously improved. When the whole network reaches a Nash equilibrium, i.e., when the parameters of the two networks essentially no longer change, training is complete. At this point the machine translation results produced by the generator network are able to fool the discriminator network into believing that the translations come from the parallel corpus, and the generator network model can be used as the output machine translation model.
As shown in Fig. 2, the machine translation system based on a generative adversarial neural network provided by the embodiment of the present invention includes:
a discriminator network 1, which uses a multilayer perceptron feed-forward neural network model to perform binary classification and judges whether a target-language translation comes from the training corpus or is a result of machine translation by the original generator network.
The machine translation system based on a generative adversarial neural network also includes:
a generator network 2, which is combined with the discriminator network to form a complete generative adversarial network; the input vector of the encoder and the output vector of the decoder in the generator network are concatenated and passed to the discriminator network as input; meanwhile, the output result (0 or 1) of the discriminator network is fed back to the generator network;
a monolingual corpus 3, which is integrated with the parallel corpus 4 to form a semi-supervised corpus with which the whole generative adversarial network is trained; when the parameters of the generative adversarial network remain stable, training is complete.
The present invention is further described below in connection with its positive effects.
The embodiment of the present invention builds a long short-term memory neural network based on the "encoder-decoder" structure and then trains the generator network with a bilingual parallel corpus.
The embodiment of the present invention constructs another bidirectional long short-term memory neural network as the discriminator network.
The application principle of the present invention is further described below with reference to specific embodiments.
In the embodiment of the present invention, the binary classification method includes:
The hidden-layer activation function uses the hyperbolic tangent form:
T(x) = (e^x - e^(-x)) / (e^x + e^(-x)),
where T(x) is the hidden-layer activation function and h(x) = T(W1·x + b1) is the hidden-layer function;
The whole multilayer perceptron feed-forward neural network model function f(x) can be formalized as:
f(x) = S(W2·h(x) + b2) = S(W2·T(W1·x + b1) + b2),
where the model parameters W2 and b2 denote the hidden-to-output weight matrix and the output-layer bias vector, respectively; S(x) is the output-layer activation function, which uses the sigmoid form:
S(x) = 1 / (1 + e^(-x)).
When the multilayer perceptron feed-forward neural network model performs binary classification, the input-layer vector X is substituted into f(x) to compute the output vector Y; the class corresponding to the larger-valued dimension of Y is selected as the classification result, indicating whether the translation comes from the training corpus or from the generator network.
As shown in Fig. 3, the generator network consists of two parts, an encoder and a decoder. The encoder uses a bidirectional long short-term memory (LSTM) neural network structure. The encoder first converts the input source-language sentence into a sequence of word vectors, which serves as the input of the LSTM network; the network generates a dense vector of fixed length, called the context vector, which is the output of the encoder.
Then, the decoder uses another unidirectional LSTM neural network, taking the context vector output by the encoder as input. A Softmax classifier is stacked on the output layer of the neural machine translation model to output the word-vector sequence of the target language; a dictionary maps the word vectors one by one to target-language words, completing the automatic translation process.
As shown in Fig. 4, the inputs x_t and h_(t-1) of the neural machine translation model denote the input word vector and the output of the LSTM neural network unit at time t-1, respectively; the output h_t denotes the output of the LSTM neural network unit at the current time t.
The computation specifically includes:
i_t = g(W_xi·x_t + W_hi·h_(t-1) + b_i);
f_t = g(W_xf·x_t + W_hf·h_(t-1) + b_f);
o_t = g(W_xo·x_t + W_ho·h_(t-1) + b_o);
c̃_t = tanh(W_xc·x_t + W_hc·h_(t-1) + b_c);
c_t = f_t·c_(t-1) + i_t·c̃_t;
h_t = o_t·tanh(c_t);
where i_t, f_t, and o_t denote the input gate, forget gate, and output gate, respectively; g is the gate activation function; c_(t-1) denotes the cell state at time t-1, and c_t and c̃_t denote the cell state and the candidate cell state; h_t is the output of the LSTM neuron; the parameters W and b denote the connection weights and bias terms of each layer.
The encoder uses two LSTM networks, one taking the forward word-vector sequence as input and the other the reversed word-vector sequence, forming a bidirectional LSTM network; the output vectors of the two networks are concatenated to form the context vector. The decoder uses one LSTM network that takes the context vector as input and outputs a state sequence, which is then passed through a Softmax classifier of the following functional form:
P(y = i | x; θ) = exp(θ_i·x) / Σ_(j=1..k) exp(θ_j·x),
where (θ_1, θ_2, ..., θ_k) are the classifier parameters, k is the total number of classes, and i denotes a particular class. The states output by the decoder are converted into target-language word vectors, which are then assembled in sequence to form the translation result.
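A compact sketch of the generator just described is given below, assuming PyTorch and toy vocabulary sizes; it follows the bidirectional-LSTM encoder / unidirectional-LSTM decoder / Softmax layout, but the class name Seq2SeqGenerator and all dimensions are illustrative rather than the patented implementation itself.

```python
import torch
import torch.nn as nn

class Seq2SeqGenerator(nn.Module):
    """Encoder-decoder generator: bidirectional LSTM encoder, LSTM decoder, Softmax output."""

    def __init__(self, src_vocab, tgt_vocab, emb=64, hidden=128):
        super().__init__()
        self.src_emb = nn.Embedding(src_vocab, emb)
        self.tgt_emb = nn.Embedding(tgt_vocab, emb)
        self.encoder = nn.LSTM(emb, hidden, bidirectional=True)   # forward + reverse LSTM
        self.decoder = nn.LSTM(emb, 2 * hidden)                   # initialised with the concatenated context
        self.classifier = nn.Linear(2 * hidden, tgt_vocab)        # Softmax classifier over target words

    def forward(self, src_ids, tgt_ids):
        enc_out, (h_n, _) = self.encoder(self.src_emb(src_ids))
        # Concatenate the final states of the two directions to form the context vector.
        context = torch.cat([h_n[0], h_n[1]], dim=-1).unsqueeze(0)
        dec_out, _ = self.decoder(self.tgt_emb(tgt_ids),
                                  (context, torch.zeros_like(context)))
        return torch.log_softmax(self.classifier(dec_out), dim=-1)

# Toy usage: a 7-word source sentence and a 5-word target prefix, batch size 1.
model = Seq2SeqGenerator(src_vocab=1000, tgt_vocab=1200)
src = torch.randint(0, 1000, (7, 1))
tgt = torch.randint(0, 1200, (5, 1))
log_probs = model(src, tgt)        # shape (5, 1, 1200): one distribution per target position
```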
The discriminator network is trained adversarially, so as to simultaneously improve the generator network's ability to generate the target language and the discriminator network's ability to judge the source of a translation. During adversarial training, the discriminator judges whether a translation result is real data from the corpus or a result of machine translation by the original machine-translation generator network.
In the machine translation method based on a generative adversarial neural network, the learning process is a competition between the generator network and the discriminator network, specifically including:
taking one sample at random from either the real samples or the samples produced by the generator model, and letting the discriminator network judge whether it is real;
through this competitive machine-learning mechanism, the performance of both the generator network and the discriminator network is continuously improved; when the whole network reaches a Nash equilibrium, i.e., when the parameters of the two networks are stable, training is complete. At this point, the machine translation results produced by the generator network are able to fool the discriminator into believing that the translations come from the parallel corpus, and the generator network model can be used as the output machine translation model.
In the described machine translation method based on a generative adversarial neural network, monolingual corpus resources are used in addition to the manually annotated bilingual parallel corpus resources, and semi-supervised learning is performed.
The described machine translation method based on a generative adversarial neural network specifically includes:
building a bidirectional long short-term memory neural network as the discriminator network;
combining the generator network and the discriminator network to form a complete generative adversarial network; concatenating the input vector of the encoder and the output vector of the decoder in the generator network and passing the result to the discriminator network as input; meanwhile, feeding the output result (0 or 1) of the discriminator network back to the generator network;
integrating the parallel corpus and the monolingual corpus into a semi-supervised corpus, and training the whole generative adversarial network with this semi-supervised corpus; when the parameters of the generative adversarial network remain stable, training is complete.
After the training of the generative adversarial network model is completed, the generator network part of the network is taken as the output machine translation model for subsequent use.
The present invention combines the generator network and the discriminator network to form a complete generative adversarial network. Specifically, the input vector of the encoder and the output vector of the decoder in the generator network are concatenated and passed to the discriminator network as input; meanwhile, the output result (0 or 1) of the discriminator network is fed back to the generator network.
The present invention integrates the parallel corpus and the monolingual corpus to form a large-scale semi-supervised corpus, with which the whole generative adversarial network is trained. When the parameters of the generative adversarial network remain stable, training is complete.
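For illustration, one way such a semi-supervised training set could be assembled is sketched below: sentence pairs from the parallel corpus are labelled as real (1), and translations produced by the current generator from monolingual source sentences are labelled as generated (0). The labelling convention and all helper names are assumptions; the patent only states that the two corpora are integrated.

```python
import random

def build_semi_supervised_corpus(parallel_pairs, monolingual_sources, translate):
    """Merge a parallel corpus and a monolingual corpus into labelled training examples.

    parallel_pairs      : list of (source_sentence, target_sentence) from the parallel corpus
    monolingual_sources : list of source_sentence strings from the monolingual corpus
    translate           : callable(source_sentence) -> target_sentence (the current generator)
    Returns a shuffled list of (source, target, label), label 1 = corpus, 0 = generated.
    """
    examples = [(src, tgt, 1) for src, tgt in parallel_pairs]
    examples += [(src, translate(src), 0) for src in monolingual_sources]
    random.shuffle(examples)
    return examples

# Toy usage with a placeholder "generator" that simply echoes the source sentence.
parallel = [("你好 世界", "hello world"), ("谢谢", "thank you")]
monolingual = ["早上 好", "再见"]
corpus = build_semi_supervised_corpus(parallel, monolingual, translate=lambda s: s)
```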
After the present invention completes the training of the generative adversarial network model, the generator network part of the network can be used as the output machine translation model for subsequent use. The specific method of use is: perform word segmentation on the source-language sentence, input the segmentation result into the encoder of the generator network, and feed each source-language word in turn into the corresponding neural network node; the output of the generator network's decoder is the corresponding target-language translation.
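The usage procedure above could look roughly like the following sketch, which performs greedy decoding with a trained generator such as the Seq2SeqGenerator sketched earlier. The segmenter (e.g. a Chinese word segmenter such as jieba.lcut), the start/end token ids, and all names are assumptions made for illustration and are not part of the claims.

```python
import torch

def translate_sentence(source_text, segment, encode_ids, generator, tgt_itos,
                       bos_id=1, eos_id=2, max_len=50):
    """Greedy translation with a trained generator network (illustrative only).

    segment    : callable(str) -> list of source words, e.g. a Chinese word segmenter
    encode_ids : callable(list of words) -> LongTensor of source word ids, shape (src_len, 1)
    generator  : trained model; generator(src_ids, tgt_ids) returns log-probabilities of
                 shape (tgt_len, 1, tgt_vocab), as in the generator sketch above
    tgt_itos   : dict mapping target word ids to target words
    """
    src_ids = encode_ids(segment(source_text))
    out_ids = [bos_id]                               # start-of-sentence token (assumed)
    for _ in range(max_len):
        tgt = torch.tensor(out_ids).unsqueeze(1)     # shape (t, 1)
        log_probs = generator(src_ids, tgt)          # shape (t, 1, tgt_vocab)
        next_id = int(log_probs[-1, 0].argmax())     # most probable next target word
        if next_id == eos_id:                        # end-of-sentence token (assumed)
            break
        out_ids.append(next_id)
    return " ".join(tgt_itos.get(i, "<unk>") for i in out_ids[1:])
```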
The foregoing is merely a preferred embodiment of the present invention and is not intended to limit the present invention. Any modification, equivalent replacement, or improvement made within the spirit and principles of the present invention shall be included within the protection scope of the present invention.

Claims (10)

1. A machine translation method based on a generative adversarial neural network, characterized in that, in the machine translation method based on a generative adversarial neural network, on the basis of the original machine-translation generator network, a discriminator network that competes against the generator network is introduced; the discriminator judges whether a target-language translation comes from the training corpus or is a result of machine translation by the original machine-translation generator network; the discriminator network uses a multilayer perceptron feed-forward neural network model to perform binary classification.
2. The machine translation method based on a generative adversarial neural network according to claim 1, characterized in that the binary classification method includes:
The hidden-layer activation function uses the hyperbolic tangent form:
T(x) = (e^x - e^(-x)) / (e^x + e^(-x)),
where T(x) is the hidden-layer activation function and h(x) = T(W1·x + b1) is the hidden-layer function;
The whole multilayer perceptron feed-forward neural network model function f(x) can be formalized as:
f(x) = S(W2·h(x) + b2) = S(W2·T(W1·x + b1) + b2),
where the model parameters W2 and b2 denote the hidden-to-output weight matrix and the output-layer bias vector, respectively; S(x) is the output-layer activation function, which uses the sigmoid form:
S(x) = 1 / (1 + e^(-x));
when the multilayer perceptron feed-forward neural network model performs binary classification, the input-layer vector X is substituted into f(x) to compute the output vector Y, and the class corresponding to the larger-valued dimension of Y is selected as the classification result, indicating whether the translation comes from the training corpus or from the generator network.
3. The machine translation method based on a generative adversarial neural network according to claim 1, characterized in that the generator network consists of two parts, an encoder and a decoder; the encoder uses a bidirectional long short-term memory neural network structure; the encoder first converts the input source-language sentence into a sequence of word vectors, which serves as the input of the LSTM network; the network generates a dense vector of fixed length, called the context vector, which is the output of the encoder;
then, the decoder uses another unidirectional LSTM neural network, taking the context vector output by the encoder as input; a Softmax classifier is stacked on the output layer of the neural machine translation model to output the word-vector sequence of the target language; a dictionary maps the word vectors one by one to target-language words, completing the automatic translation process.
4. The machine translation method based on a generative adversarial neural network according to claim 3, characterized in that the inputs x_t and h_(t-1) of the neural machine translation model denote the input word vector and the output of the LSTM neural network unit at time t-1, respectively; the output h_t denotes the output of the LSTM neural network unit at the current time t;
the computation specifically includes:
i_t = g(W_xi·x_t + W_hi·h_(t-1) + b_i);
f_t = g(W_xf·x_t + W_hf·h_(t-1) + b_f);
o_t = g(W_xo·x_t + W_ho·h_(t-1) + b_o);
c̃_t = tanh(W_xc·x_t + W_hc·h_(t-1) + b_c);
c_t = f_t·c_(t-1) + i_t·c̃_t;
h_t = o_t·tanh(c_t);
where i_t, f_t, and o_t denote the input gate, forget gate, and output gate, respectively; g is the gate activation function; c_(t-1) denotes the cell state at time t-1, and c_t and c̃_t denote the cell state and the candidate cell state; h_t is the output of the LSTM neuron; the parameters W and b denote the connection weights and bias terms of each layer.
5. The machine translation method based on a generative adversarial neural network according to claim 3, characterized in that the encoder uses two LSTM networks, one taking the forward word-vector sequence as input and the other the reversed word-vector sequence, forming a bidirectional LSTM network; the output vectors of the two networks are concatenated to form the context vector; the decoder uses one LSTM network that takes the context vector as input and outputs a state sequence, which is then passed through a Softmax classifier of the following functional form:
P(y = i | x; θ) = exp(θ_i·x) / Σ_(j=1..k) exp(θ_j·x),
where (θ_1, θ_2, ..., θ_k) are the classifier parameters, k is the total number of classes, and i denotes a particular class; the states output by the decoder are converted into target-language word vectors, which are then assembled in sequence to form the translation result.
6. The machine translation method based on a generative adversarial neural network according to claim 1, characterized in that the discriminator network is trained adversarially, so as to simultaneously improve the generator network's ability to generate the target language and the discriminator network's ability to judge the source of a translation; during adversarial training, the discriminator judges whether a translation result is real data from the corpus or a result of machine translation by the original machine-translation generator network;
in the machine translation method based on a generative adversarial neural network, the learning process is a competition between the generator network and the discriminator network, specifically including:
taking one sample at random from either the real samples or the samples produced by the generator model, and letting the discriminator network judge whether it is real;
through this competitive machine-learning mechanism, the performance of both the generator network and the discriminator network is continuously improved; when the whole network reaches a Nash equilibrium, i.e., when the parameters of the two networks are stable, training is complete; at this point, the machine translation results produced by the generator network are able to fool the discriminator into believing that the translations come from the parallel corpus, and the generator network model can be used as the output machine translation model.
7. The machine translation method based on a generative adversarial neural network according to claim 1, characterized in that, in the machine translation method based on a generative adversarial neural network, monolingual corpus resources are used in addition to the manually annotated bilingual parallel corpus resources, and semi-supervised learning is performed.
8. The machine translation method based on a generative adversarial neural network according to claim 1, characterized in that the machine translation method based on a generative adversarial neural network specifically includes:
building a bidirectional long short-term memory neural network as the discriminator network;
combining the generator network and the discriminator network to form a complete generative adversarial network; concatenating the input vector of the encoder and the output vector of the decoder in the generator network and passing the result to the discriminator network as input; meanwhile, feeding the output result (0 or 1) of the discriminator network back to the generator network;
integrating the parallel corpus and the monolingual corpus into a semi-supervised corpus, and training the whole generative adversarial network with this semi-supervised corpus; when the parameters of the generative adversarial network remain stable, training is complete;
after the training of the generative adversarial network model is completed, the generator network part of the network is taken as the output machine translation model for subsequent use.
9. A machine translation system based on a generative adversarial neural network, using the machine translation method based on a generative adversarial neural network according to claim 1, characterized in that the machine translation system based on a generative adversarial neural network includes:
a discriminator network, which uses a multilayer perceptron feed-forward neural network model to perform binary classification and judges whether a target-language translation comes from the training corpus or is a result of machine translation by the original generator network.
10. The machine translation system based on a generative adversarial neural network according to claim 9, characterized in that the machine translation system based on a generative adversarial neural network also includes:
a generator network, which is combined with the discriminator network to form a complete generative adversarial network; the input vector of the encoder and the output vector of the decoder in the generator network are concatenated and passed to the discriminator network as input; meanwhile, the output result (0 or 1) of the discriminator network is fed back to the generator network;
a monolingual corpus, which is integrated with the parallel corpus to form a semi-supervised corpus with which the whole generative adversarial network is trained; when the parameters of the generative adversarial network remain stable, training is complete.
CN201710586841.9A 2017-07-18 2017-07-18 Machine translation method and system based on a generative adversarial neural network Active CN107368475B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710586841.9A CN107368475B (en) Machine translation method and system based on a generative adversarial neural network

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710586841.9A CN107368475B (en) Machine translation method and system based on a generative adversarial neural network

Publications (2)

Publication Number Publication Date
CN107368475A true CN107368475A (en) 2017-11-21
CN107368475B CN107368475B (en) 2021-06-04

Family

ID=60308088

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710586841.9A Active CN107368475B (en) Machine translation method and system based on a generative adversarial neural network

Country Status (1)

Country Link
CN (1) CN107368475B (en)

Cited By (42)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107991876A (en) * 2017-12-14 2018-05-04 南京航空航天大学 Aero-engine condition monitoring data creation method based on production confrontation network
CN108304390A (en) * 2017-12-15 2018-07-20 腾讯科技(深圳)有限公司 Training method, interpretation method, device based on translation model and storage medium
CN108415906A (en) * 2018-03-28 2018-08-17 中译语通科技股份有限公司 Based on field automatic identification chapter machine translation method, machine translation system
CN108734276A (en) * 2018-04-28 2018-11-02 同济大学 A kind of learning by imitation dialogue generation method generating network based on confrontation
CN108829685A (en) * 2018-05-07 2018-11-16 内蒙古工业大学 A kind of illiteracy Chinese inter-translation method based on single language training
CN108846130A (en) * 2018-06-29 2018-11-20 北京百度网讯科技有限公司 A kind of question text generation method, device, equipment and medium
CN108874978A (en) * 2018-06-08 2018-11-23 杭州知智能科技有限公司 One method that conference content abstract task is solved based on layering adaptability segmented network
CN108897740A (en) * 2018-05-07 2018-11-27 内蒙古工业大学 A kind of illiteracy Chinese machine translation method based on confrontation neural network
CN109241540A (en) * 2018-08-07 2019-01-18 中国科学院计算技术研究所 A kind of blind automatic switching method of Chinese based on deep neural network and system
CN109410179A (en) * 2018-09-28 2019-03-01 合肥工业大学 A kind of image abnormity detection method based on generation confrontation network
CN109523021A (en) * 2018-09-28 2019-03-26 浙江工业大学 A kind of dynamic network Structure Prediction Methods based on long memory network in short-term
CN109547320A (en) * 2018-09-29 2019-03-29 阿里巴巴集团控股有限公司 Social contact method, device and equipment
CN109670180A (en) * 2018-12-21 2019-04-23 语联网(武汉)信息技术有限公司 The method and device of the translation personal characteristics of vectorization interpreter
CN109887494A (en) * 2017-12-01 2019-06-14 腾讯科技(深圳)有限公司 The method and apparatus of reconstructed speech signal
CN109887047A (en) * 2018-12-28 2019-06-14 浙江工业大学 A kind of signal-image interpretation method based on production confrontation network
CN109902310A (en) * 2019-01-15 2019-06-18 深圳中兴网信科技有限公司 Vocabulary detection method, vocabulary detection system and computer readable storage medium
CN110069790A (en) * 2019-05-10 2019-07-30 东北大学 It is a kind of by translation retroversion to machine translation system and method literally
CN110110337A (en) * 2019-05-08 2019-08-09 网易有道信息技术(北京)有限公司 Translation model training method, medium, device and calculating equipment
WO2019161753A1 (en) * 2018-02-26 2019-08-29 腾讯科技(深圳)有限公司 Information translation method and device, and storage medium and electronic device
CN110309512A (en) * 2019-07-05 2019-10-08 北京邮电大学 A kind of Chinese grammer error correction method thereof based on generation confrontation network
CN110334361A (en) * 2019-07-12 2019-10-15 电子科技大学 A kind of neural machine translation method towards rare foreign languages language
CN110472255A (en) * 2019-08-20 2019-11-19 腾讯科技(深圳)有限公司 Neural network machine interpretation method, model, electric terminal and storage medium
CN110555247A (en) * 2019-08-16 2019-12-10 华南理工大学 structure damage early warning method based on multipoint sensor data and BilSTM
CN110598221A (en) * 2019-08-29 2019-12-20 内蒙古工业大学 Method for improving translation quality of Mongolian Chinese by constructing Mongolian Chinese parallel corpus by using generated confrontation network
CN110750997A (en) * 2018-07-05 2020-02-04 普天信息技术有限公司 Machine translation method and device based on generation countermeasure learning
CN110852066A (en) * 2018-07-25 2020-02-28 清华大学 Multi-language entity relation extraction method and system based on confrontation training mechanism
CN110866404A (en) * 2019-10-30 2020-03-06 语联网(武汉)信息技术有限公司 Word vector generation method and device based on LSTM neural network
CN110866395A (en) * 2019-10-30 2020-03-06 语联网(武汉)信息技术有限公司 Word vector generation method and device based on translator editing behavior
CN110874537A (en) * 2018-08-31 2020-03-10 阿里巴巴集团控股有限公司 Generation method of multi-language translation model, translation method and translation equipment
CN110895935A (en) * 2018-09-13 2020-03-20 阿里巴巴集团控股有限公司 Speech recognition method, system, device and medium
WO2020063710A1 (en) * 2018-09-26 2020-04-02 Huawei Technologies Co., Ltd. Systems and methods for multilingual text generation
CN111178094A (en) * 2019-12-20 2020-05-19 沈阳雅译网络技术有限公司 Pre-training-based scarce resource neural machine translation training method
CN111178097A (en) * 2019-12-24 2020-05-19 语联网(武汉)信息技术有限公司 Method and device for generating Chinese and Tai bilingual corpus based on multi-level translation model
CN111310480A (en) * 2020-01-20 2020-06-19 昆明理工大学 Weakly supervised Hanyue bilingual dictionary construction method based on English pivot
CN111460837A (en) * 2020-03-31 2020-07-28 广州大学 Character-level confrontation sample generation method and device for neural machine translation
CN111523308A (en) * 2020-03-18 2020-08-11 大箴(杭州)科技有限公司 Chinese word segmentation method and device and computer equipment
CN111914552A (en) * 2020-07-31 2020-11-10 平安科技(深圳)有限公司 Training method and device of data enhancement model
CN112633018A (en) * 2020-12-28 2021-04-09 内蒙古工业大学 Mongolian Chinese neural machine translation method based on data enhancement
CN113283249A (en) * 2020-02-19 2021-08-20 阿里巴巴集团控股有限公司 Machine translation method, device and computer readable storage medium
CN113343719A (en) * 2021-06-21 2021-09-03 哈尔滨工业大学 Unsupervised bilingual translation dictionary acquisition method for collaborative training by using different word embedding models
CN113642341A (en) * 2021-06-30 2021-11-12 深译信息科技(横琴)有限公司 Deep confrontation generation method for solving scarcity of medical text data
CN115567239A (en) * 2022-08-16 2023-01-03 广州大学 Encrypted flow characteristic hiding system and method based on generation countermeasure

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4481972B2 (en) * 2006-09-28 2010-06-16 株式会社東芝 Speech translation device, speech translation method, and speech translation program
DE202017102381U1 (en) * 2017-04-21 2017-05-11 Robert Bosch Gmbh Device for improving the robustness against "Adversarial Examples"

Cited By (70)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11482237B2 (en) 2017-12-01 2022-10-25 Tencent Technology (Shenzhen) Company Limited Method and terminal for reconstructing speech signal, and computer storage medium
CN109887494A (en) * 2017-12-01 2019-06-14 腾讯科技(深圳)有限公司 The method and apparatus of reconstructed speech signal
CN107991876A (en) * 2017-12-14 2018-05-04 南京航空航天大学 Aero-engine condition monitoring data creation method based on production confrontation network
CN108304390B (en) * 2017-12-15 2020-10-16 腾讯科技(深圳)有限公司 Translation model-based training method, training device, translation method and storage medium
WO2019114695A1 (en) * 2017-12-15 2019-06-20 腾讯科技(深圳)有限公司 Translation model-based training method, translation method, computer device and storage medium
CN108304390A (en) * 2017-12-15 2018-07-20 腾讯科技(深圳)有限公司 Training method, interpretation method, device based on translation model and storage medium
US11270079B2 (en) 2017-12-15 2022-03-08 Tencent Technology (Shenzhen) Company Limited Translation model based training method and translation method, computer device, and storage medium
WO2019161753A1 (en) * 2018-02-26 2019-08-29 腾讯科技(深圳)有限公司 Information translation method and device, and storage medium and electronic device
US11710003B2 (en) 2018-02-26 2023-07-25 Tencent Technology (Shenzhen) Company Limited Information conversion method and apparatus, storage medium, and electronic device
CN108415906A (en) * 2018-03-28 2018-08-17 中译语通科技股份有限公司 Based on field automatic identification chapter machine translation method, machine translation system
CN108415906B (en) * 2018-03-28 2021-08-17 中译语通科技股份有限公司 Automatic identification discourse machine translation method and machine translation system based on field
CN108734276B (en) * 2018-04-28 2021-12-31 同济大学 Simulated learning dialogue generation method based on confrontation generation network
CN108734276A (en) * 2018-04-28 2018-11-02 同济大学 A kind of learning by imitation dialogue generation method generating network based on confrontation
CN108897740A (en) * 2018-05-07 2018-11-27 内蒙古工业大学 A kind of illiteracy Chinese machine translation method based on confrontation neural network
CN108829685A (en) * 2018-05-07 2018-11-16 内蒙古工业大学 A kind of illiteracy Chinese inter-translation method based on single language training
CN108874978A (en) * 2018-06-08 2018-11-23 杭州知智能科技有限公司 One method that conference content abstract task is solved based on layering adaptability segmented network
CN108874978B (en) * 2018-06-08 2021-09-10 杭州一知智能科技有限公司 Method for solving conference content abstract task based on layered adaptive segmented network
CN108846130A (en) * 2018-06-29 2018-11-20 北京百度网讯科技有限公司 A kind of question text generation method, device, equipment and medium
CN108846130B (en) * 2018-06-29 2021-02-05 北京百度网讯科技有限公司 Question text generation method, device, equipment and medium
CN110750997A (en) * 2018-07-05 2020-02-04 普天信息技术有限公司 Machine translation method and device based on generation countermeasure learning
CN110852066B (en) * 2018-07-25 2021-06-01 清华大学 Multi-language entity relation extraction method and system based on confrontation training mechanism
CN110852066A (en) * 2018-07-25 2020-02-28 清华大学 Multi-language entity relation extraction method and system based on confrontation training mechanism
CN109241540B (en) * 2018-08-07 2020-09-15 中国科学院计算技术研究所 Hanblindness automatic conversion method and system based on deep neural network
CN109241540A (en) * 2018-08-07 2019-01-18 中国科学院计算技术研究所 A kind of blind automatic switching method of Chinese based on deep neural network and system
CN110874537A (en) * 2018-08-31 2020-03-10 阿里巴巴集团控股有限公司 Generation method of multi-language translation model, translation method and translation equipment
CN110874537B (en) * 2018-08-31 2023-06-27 阿里巴巴集团控股有限公司 Method for generating multilingual translation model, translation method and equipment
CN110895935B (en) * 2018-09-13 2023-10-27 阿里巴巴集团控股有限公司 Speech recognition method, system, equipment and medium
CN110895935A (en) * 2018-09-13 2020-03-20 阿里巴巴集团控股有限公司 Speech recognition method, system, device and medium
CN113228030A (en) * 2018-09-26 2021-08-06 华为技术有限公司 Multi-language text generation system and method
WO2020063710A1 (en) * 2018-09-26 2020-04-02 Huawei Technologies Co., Ltd. Systems and methods for multilingual text generation
US11151334B2 (en) 2018-09-26 2021-10-19 Huawei Technologies Co., Ltd. Systems and methods for multilingual text generation field
CN113228030B (en) * 2018-09-26 2023-11-03 华为技术有限公司 Multilingual text generation system and method
CN109523021A (en) * 2018-09-28 2019-03-26 浙江工业大学 A kind of dynamic network Structure Prediction Methods based on long memory network in short-term
CN109410179A (en) * 2018-09-28 2019-03-01 合肥工业大学 A kind of image abnormity detection method based on generation confrontation network
CN109410179B (en) * 2018-09-28 2021-07-23 合肥工业大学 Image anomaly detection method based on generation countermeasure network
CN109547320A (en) * 2018-09-29 2019-03-29 阿里巴巴集团控股有限公司 Social contact method, device and equipment
CN109670180A (en) * 2018-12-21 2019-04-23 语联网(武汉)信息技术有限公司 The method and device of the translation personal characteristics of vectorization interpreter
CN109887047B (en) * 2018-12-28 2023-04-07 浙江工业大学 Signal-image translation method based on generation type countermeasure network
CN109887047A (en) * 2018-12-28 2019-06-14 浙江工业大学 A kind of signal-image interpretation method based on production confrontation network
CN109902310A (en) * 2019-01-15 2019-06-18 深圳中兴网信科技有限公司 Vocabulary detection method, vocabulary detection system and computer readable storage medium
CN110110337B (en) * 2019-05-08 2023-04-18 网易有道信息技术(北京)有限公司 Translation model training method, medium, device and computing equipment
CN110110337A (en) * 2019-05-08 2019-08-09 网易有道信息技术(北京)有限公司 Translation model training method, medium, device and calculating equipment
CN110069790A (en) * 2019-05-10 2019-07-30 东北大学 It is a kind of by translation retroversion to machine translation system and method literally
CN110309512A (en) * 2019-07-05 2019-10-08 北京邮电大学 A kind of Chinese grammer error correction method thereof based on generation confrontation network
CN110334361B (en) * 2019-07-12 2022-11-22 电子科技大学 Neural machine translation method for Chinese language
CN110334361A (en) * 2019-07-12 2019-10-15 电子科技大学 A kind of neural machine translation method towards rare foreign languages language
CN110555247A (en) * 2019-08-16 2019-12-10 华南理工大学 structure damage early warning method based on multipoint sensor data and BilSTM
CN110472255B (en) * 2019-08-20 2021-03-02 腾讯科技(深圳)有限公司 Neural network machine translation method, model, electronic terminal, and storage medium
CN110472255A (en) * 2019-08-20 2019-11-19 腾讯科技(深圳)有限公司 Neural network machine interpretation method, model, electric terminal and storage medium
CN110598221A (en) * 2019-08-29 2019-12-20 内蒙古工业大学 Method for improving translation quality of Mongolian Chinese by constructing Mongolian Chinese parallel corpus by using generated confrontation network
CN110866404B (en) * 2019-10-30 2023-05-05 语联网(武汉)信息技术有限公司 Word vector generation method and device based on LSTM neural network
CN110866395B (en) * 2019-10-30 2023-05-05 语联网(武汉)信息技术有限公司 Word vector generation method and device based on translator editing behaviors
CN110866395A (en) * 2019-10-30 2020-03-06 语联网(武汉)信息技术有限公司 Word vector generation method and device based on translator editing behavior
CN110866404A (en) * 2019-10-30 2020-03-06 语联网(武汉)信息技术有限公司 Word vector generation method and device based on LSTM neural network
CN111178094B (en) * 2019-12-20 2023-04-07 沈阳雅译网络技术有限公司 Pre-training-based scarce resource neural machine translation training method
CN111178094A (en) * 2019-12-20 2020-05-19 沈阳雅译网络技术有限公司 Pre-training-based scarce resource neural machine translation training method
CN111178097B (en) * 2019-12-24 2023-07-04 语联网(武汉)信息技术有限公司 Method and device for generating Zhongtai bilingual corpus based on multistage translation model
CN111178097A (en) * 2019-12-24 2020-05-19 语联网(武汉)信息技术有限公司 Method and device for generating Chinese and Tai bilingual corpus based on multi-level translation model
CN111310480A (en) * 2020-01-20 2020-06-19 昆明理工大学 Weakly supervised Hanyue bilingual dictionary construction method based on English pivot
CN111310480B (en) * 2020-01-20 2021-12-28 昆明理工大学 Weakly supervised Hanyue bilingual dictionary construction method based on English pivot
CN113283249A (en) * 2020-02-19 2021-08-20 阿里巴巴集团控股有限公司 Machine translation method, device and computer readable storage medium
CN111523308A (en) * 2020-03-18 2020-08-11 大箴(杭州)科技有限公司 Chinese word segmentation method and device and computer equipment
CN111523308B (en) * 2020-03-18 2024-01-26 大箴(杭州)科技有限公司 Chinese word segmentation method and device and computer equipment
CN111460837A (en) * 2020-03-31 2020-07-28 广州大学 Character-level confrontation sample generation method and device for neural machine translation
CN111914552A (en) * 2020-07-31 2020-11-10 平安科技(深圳)有限公司 Training method and device of data enhancement model
CN112633018A (en) * 2020-12-28 2021-04-09 内蒙古工业大学 Mongolian Chinese neural machine translation method based on data enhancement
CN113343719A (en) * 2021-06-21 2021-09-03 哈尔滨工业大学 Unsupervised bilingual translation dictionary acquisition method for collaborative training by using different word embedding models
CN113642341A (en) * 2021-06-30 2021-11-12 深译信息科技(横琴)有限公司 Deep confrontation generation method for solving scarcity of medical text data
CN115567239A (en) * 2022-08-16 2023-01-03 广州大学 Encrypted flow characteristic hiding system and method based on generation countermeasure
CN115567239B (en) * 2022-08-16 2024-08-20 广州大学 Encryption traffic feature hiding system and method based on generation countermeasure

Also Published As

Publication number Publication date
CN107368475B (en) 2021-06-04

Similar Documents

Publication Publication Date Title
CN107368475A (en) A kind of machine translation method and system based on generation confrontation neutral net
CN107578106B (en) Neural network natural language reasoning method fusing word semantic knowledge
CN109376242B (en) Text classification method based on cyclic neural network variant and convolutional neural network
Cheng et al. Facial expression recognition method based on improved VGG convolutional neural network
CN108897740A (en) A kind of illiteracy Chinese machine translation method based on confrontation neural network
CN107066445B (en) The deep learning method of one attribute emotion word vector
CN108875807A (en) A kind of Image Description Methods multiple dimensioned based on more attentions
CN110390397B (en) Text inclusion recognition method and device
CN108595602A (en) The question sentence file classification method combined with depth model based on shallow Model
CN111859978A (en) Emotion text generation method based on deep learning
CN107818302A (en) Non-rigid multi-scale object detection method based on convolutional neural network
CN109635109A (en) Sentence classification method based on LSTM and combination part of speech and more attention mechanism
CN108280064A (en) Participle, part-of-speech tagging, Entity recognition and the combination treatment method of syntactic analysis
CN108763326A (en) A kind of sentiment analysis model building method of the diversified convolutional neural networks of feature based
CN106897254A (en) A kind of network representation learning method
CN108009285A (en) Forest Ecology man-machine interaction method based on natural language processing
CN107832310A (en) Structuring argument generation method and system based on seq2seq models
CN108932232A (en) A kind of illiteracy Chinese inter-translation method based on LSTM neural network
CN110851760A (en) Human-computer interaction system for integrating visual question answering in web3D environment
CN113157919B (en) Sentence text aspect-level emotion classification method and sentence text aspect-level emotion classification system
CN115630156A (en) Mongolian emotion analysis method and system fusing Prompt and SRU
CN108519976A (en) The method for generating extensive sentiment dictionary based on neural network
CN108388944A (en) LSTM neural network chips and its application method
CN110297894A (en) A kind of Intelligent dialogue generation method based on auxiliary network
CN115408515A (en) Automatic classification method for online collaborative discussion interactive text

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information
CB02 Change of applicant information

Address after: Room 1601, 16th Floor, No. 20 Shijingshan Road, Shijingshan District, Beijing 100040

Applicant after: Global Tone Communication Technology Co., Ltd.

Address before: 16th Floor, Railway Building, Shijingshan District, Beijing 100040

Applicant before: Mandarin Technology (Beijing) Co., Ltd.

GR01 Patent grant
GR01 Patent grant