CN110674280A - Answer selection algorithm based on enhanced question importance expression - Google Patents

Answer selection algorithm based on enhanced question importance expression

Info

Publication number
CN110674280A
CN110674280A
Authority
CN
China
Prior art keywords
question
answer
vector
word
algorithm based
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201911143753.7A
Other languages
Chinese (zh)
Other versions
CN110674280B (en)
Inventor
琚生根
谢正文
熊熙
孙界平
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Zhongke Microparticle Biotechnology Co., Ltd.
Original Assignee
Sichuan University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sichuan University filed Critical Sichuan University
Publication of CN110674280A publication Critical patent/CN110674280A/en
Application granted granted Critical
Publication of CN110674280B publication Critical patent/CN110674280B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/30 Information retrieval; Database structures therefor; File system structures therefor of unstructured textual data
    • G06F 16/33 Querying
    • G06F 16/332 Query formulation
    • G06F 16/3329 Natural language query formulation or dialogue systems
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/02 Neural networks
    • G06N 3/04 Architecture, e.g. interconnection topology
    • G06N 3/044 Recurrent networks, e.g. Hopfield networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/02 Neural networks
    • G06N 3/04 Architecture, e.g. interconnection topology
    • G06N 3/045 Combinations of networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/02 Neural networks
    • G06N 3/08 Learning methods
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D 10/00 Energy efficient computing, e.g. low power processors, power management or thermal management

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Mathematical Physics (AREA)
  • General Engineering & Computer Science (AREA)
  • Artificial Intelligence (AREA)
  • General Physics & Mathematics (AREA)
  • Computational Linguistics (AREA)
  • Data Mining & Analysis (AREA)
  • Biomedical Technology (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • Evolutionary Computation (AREA)
  • Biophysics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Software Systems (AREA)
  • Health & Medical Sciences (AREA)
  • Human Computer Interaction (AREA)
  • Databases & Information Systems (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)

Abstract

The invention relates to an answer selection algorithm based on an enhanced question importance representation, comprising the following steps: S1, encoding the question and the answer through a BiLSTM encoding layer; S2, obtaining a new question vector from the encoded question using a self-attention mechanism; S3, establishing a word-level similarity matrix between the question and the answer and aligning them; S4, capturing semantic information at multiple granularities, and fusing and comparing the vectors of different granularities; and S5, extracting the fused features through a multi-window CNN to obtain the best option. To address noise words in sentences, the answer selection algorithm based on an enhanced question importance representation builds on a question importance representation network: it uses a self-attention mechanism to re-assign different weights to different words, generating a "clean" question sentence vector, and uses a word-level interaction matrix to capture fine-grained semantic information between question and answer sentences, thereby mitigating the influence of noise words in answer sentences.

Description

Answer selection algorithm based on enhanced question importance expression
Technical Field
The invention relates to the technical field of answer selection, and in particular to an answer selection algorithm based on an enhanced question importance representation.
Background
Answer Selection (AS) is a subtask of Question Answering (QA) and has been a hot topic in Information Retrieval (IR) in recent years. Answer selection aims to select the most appropriate answer from a list of candidate answers for a given input question.
However, existing methods are affected by noise words, which reduces the accuracy of the selected answers.
Disclosure of Invention
In order to solve the above problems, an object of the present invention is to provide an answer selection algorithm based on an enhanced representation of question importance.
In order to achieve the above object, the invention provides the following technical solution: an answer selection algorithm based on an enhanced question importance representation, comprising the following steps:
S1, encoding the question and the answer through a BiLSTM encoding layer;
S2, regenerating the question vector from the encoded question using a self-attention mechanism to obtain a new question vector;
S3, aligning the words in the question and the answer using a word-level similarity matrix;
S4, capturing semantic information at multiple granularities, and comparing the vectors of different granularities;
and S5, extracting the fused features through a multi-window CNN to obtain the best option.
Preferably, in step S1, let Q denote the question and A denote the answer, and let $H_q=\{h_{q1},\ldots,h_{qm}\}$ and $H_a=\{h_{a1},\ldots,h_{an}\}$ denote the question sentence vector and the answer sentence vector, where $h_{qi}$ is the embedding of the $i$-th word of sentence $H_q$, and m and n denote the lengths of the question and the answer, respectively;
the question and the answer capture sentence-context information through a BiLSTM encoding layer; the hidden-layer dimension of the LSTM is u, the word embedding at time t is $x_t$, the hidden state and memory cell at the previous time step are $h_{t-1}$ and $c_{t-1}$, and the hidden state $h_t$ and memory cell $c_t$ at the current time step are computed as follows:
$$g_t=\phi(W_g x_t+V_g h_{t-1}+b_g),$$
$$i_t=\sigma(W_i x_t+V_i h_{t-1}+b_i),$$
$$f_t=\sigma(W_f x_t+V_f h_{t-1}+b_f),$$
$$o_t=\sigma(W_o x_t+V_o h_{t-1}+b_o),$$
$$c_t=g_t\odot i_t+c_{t-1}\odot f_t,$$
$$h_t=c_t\odot o_t,$$
where the W and V terms are weight matrices and the b terms are bias vectors; $\sigma$ and $\phi$ are the sigmoid and tanh functions, respectively; $\odot$ denotes element-wise multiplication of two vectors; the input gate i, forget gate f, and output gate o automatically control the flow of information; the memory cell $c_t$ retains long-range information; and $h_t$ is the vector representation at time t.
Preferably, step S2 processes the question sentence $T_q=\{t_{q1},\ldots,t_{qm}\}$ and the answer sentence $T_a=\{t_{a1},\ldots,t_{an}\}$ obtained in step S1, computes a weight for each word in the question, and updates the representation to generate a new question vector:
$$v=T_q W_1,$$
$$\alpha_q=\mathrm{sigmoid}(v),$$
$$U_q=\alpha_q\odot T_q,$$
where $W_1$ is a trainable weight matrix, $\alpha_q$ holds the per-word weights, and $U_q$ is the re-weighted question representation.
Preferably, the word-level matrix is computed as:
$$M(i,j)=U_q(i)\,T_a(j)^T,$$
where each row of the word-level matrix captures the influence of a question word on each word in the answer; the rows and columns of the word-level matrix are normalized with a softmax function to obtain the mutual-information influence factors $\lambda_q(i,j)$ and $\lambda_a(i,j)$, whose values all lie in $[0,1]$; the question vector and the answer vector are multiplied by the corresponding influence factors to obtain two new vectors $E_q$ and $E_a$.
Preferably, in step S4, the original question vector is denoted Q and the vector passing through the attention alignment layer is denoted $\bar{Q}$; the original answer vector is denoted A and the vector passing through the attention alignment layer is denoted $\bar{A}$. Vector subtraction captures the Euclidean distance between two vectors, and element-wise multiplication approximates the cosine distance between two vectors. [The specific fusion formulas producing $K_q$ and $K_a$ appear only as equation images in the original.]
Preferably, the calculation in step S5 is:
$$u=\mathrm{CNN}(\mathrm{Fuse}),$$
where Fuse denotes the fused content $K_q$ or $K_a$. The CNN output u is passed through max pooling and average pooling to obtain $S_{q,max}$, $S_{a,max}$, $S_{q,mean}$, and $S_{a,mean}$, which are then concatenated into a vector S; the final prediction vector is obtained through a multi-layer perceptron (MLP), and the score vector is computed with:
$$G=\mathrm{softmax}(\mathrm{Score}).$$
Training reduces the difference between the probability distribution of the predicted values and the probability distribution of the label values. [The formulas for the prediction vector, Score, and the loss appear only as equation images in the original.]
compared with the prior art, the invention has the beneficial effects that: the answer selection algorithm based on the enhanced question importance expression provides an answer selection algorithm based on a question importance expression network aiming at noise words in sentences, and the method generates 'clean' question sentence vectors by endowing different words with different weights again by using a self-attention mechanism; and capturing fine-grained semantic information between the question sentences and the answer sentences by using the word-level interaction matrix, thereby relieving the influence of noise words in the answer sentences.
Drawings
FIG. 1 is an overall framework diagram of the present invention;
FIG. 2 illustrates the question vector generation based on the self-attention mechanism.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Referring to fig. 1-2, the present invention provides a technical solution: an answer selection algorithm based on enhanced question importance representation, comprising the steps of:
S1, encoding the question and the answer through a BiLSTM encoding layer. In step S1, let Q denote the question and A denote the answer, and let $H_q=\{h_{q1},\ldots,h_{qm}\}$ and $H_a=\{h_{a1},\ldots,h_{an}\}$ denote the question sentence vector and the answer sentence vector, where $h_{qi}$ is the embedding of the $i$-th word of sentence $H_q$, and m and n denote the lengths of the question and the answer, respectively.
The question and the answer capture sentence-context information through a BiLSTM encoding layer. The hidden-layer dimension of the LSTM is u, the word embedding at time t is $x_t$, the hidden state and memory cell at the previous time step are $h_{t-1}$ and $c_{t-1}$, and the hidden state $h_t$ and memory cell $c_t$ at the current time step are computed as follows:
$$g_t=\phi(W_g x_t+V_g h_{t-1}+b_g),$$
$$i_t=\sigma(W_i x_t+V_i h_{t-1}+b_i),$$
$$f_t=\sigma(W_f x_t+V_f h_{t-1}+b_f),$$
$$o_t=\sigma(W_o x_t+V_o h_{t-1}+b_o),$$
$$c_t=g_t\odot i_t+c_{t-1}\odot f_t,$$
$$h_t=c_t\odot o_t,$$
where the W and V terms are weight matrices and the b terms are bias vectors; $\sigma$ and $\phi$ are the sigmoid and tanh functions, respectively; $\odot$ denotes element-wise multiplication of two vectors. The input gate i, forget gate f, and output gate o automatically control the flow of information, the memory cell $c_t$ retains long-range information, and $h_t$ is the vector representation at time t.
S2, regenerating the question vector from the encoded question with a self-attention mechanism to obtain a new question vector. Step S2 processes the question sentence $T_q=\{t_{q1},\ldots,t_{qm}\}$ and the answer sentence $T_a=\{t_{a1},\ldots,t_{an}\}$ obtained in step S1, computes a weight for each word in the question, and updates the representation to generate a new question vector:
$$v=T_q W_1,$$
$$\alpha_q=\mathrm{sigmoid}(v),$$
$$U_q=\alpha_q\odot T_q,$$
where $W_1$ is a trainable weight matrix, $\alpha_q$ holds the per-word weights, and $U_q$ is the re-weighted question representation.
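A minimal sketch of this re-weighting step follows, assuming $W_1$ projects each 2u-dimensional word vector to a single score (the original shape annotations survive only as images):

```python
import torch

u, m = 150, 12               # assumed hidden size and question length
T_q = torch.randn(m, 2 * u)  # BiLSTM-encoded question from step S1

W_1 = torch.randn(2 * u, 1) * 0.01  # assumed shape: one score per word
v = T_q @ W_1                # v = T_q W_1
alpha_q = torch.sigmoid(v)   # alpha_q = sigmoid(v), per-word weights in (0, 1)
U_q = alpha_q * T_q          # U_q = alpha_q (.) T_q, broadcast over each row
```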
S3, establishing a word-level similarity matrix between the question and the answer and aligning them. The word-level matrix is computed as:
$$M(i,j)=U_q(i)\,T_a(j)^T,$$
where each row of the word-level matrix captures the influence of a question word on each word in the answer; the rows and columns of the word-level matrix are normalized with a softmax function to obtain the mutual-information influence factors $\lambda_q(i,j)$ and $\lambda_a(i,j)$, whose values all lie in $[0,1]$; the question vector and the answer vector are multiplied by the corresponding influence factors to obtain two new vectors $E_q$ and $E_a$.
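A sketch of step S3 follows. The matrix product and the row/column softmax follow the text directly; the final alignment products for $E_q$ and $E_a$ are an assumed reading of "multiplying the vectors with the corresponding influence factors":

```python
import torch

u, m, n = 150, 12, 30        # assumed sizes
U_q = torch.randn(m, 2 * u)  # re-weighted question from step S2
T_a = torch.randn(n, 2 * u)  # encoded answer from step S1

M = U_q @ T_a.T                     # M(i, j) = U_q(i) T_a(j)^T, shape m x n
lambda_q = torch.softmax(M, dim=1)  # row-wise normalization, values in [0, 1]
lambda_a = torch.softmax(M, dim=0)  # column-wise normalization

E_q = lambda_q @ T_a         # question aligned by the answer's influence factors
E_a = lambda_a.T @ U_q       # answer aligned by the question's influence factors
```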
S4, capturing semantic information at multiple granularities, and fusing and comparing the vectors of different granularities. The original question vector is denoted Q and the vector passing through the attention alignment layer is denoted $\bar{Q}$; the original answer vector is denoted A and the vector passing through the attention alignment layer is denoted $\bar{A}$. Vector subtraction captures the Euclidean distance between two vectors, and element-wise multiplication approximates the cosine distance between two vectors. [The specific fusion formulas producing $K_q$ and $K_a$ appear only as equation images in the original.]
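Since the fusion formulas themselves are not recoverable here, the sketch below uses a common fusion pattern consistent with the surrounding text: each original vector is concatenated with its aligned counterpart, their element-wise difference (the Euclidean-style comparison), and their element-wise product (the cosine-style comparison). The form of fuse is an assumption, not the patent's exact formula:

```python
import torch

u, m = 150, 12
Q = torch.randn(m, 2 * u)      # original question vectors
Q_bar = torch.randn(m, 2 * u)  # attention-aligned question vectors (E_q above)

def fuse(orig, aligned):
    # assumed fusion: [orig; aligned; orig - aligned; orig (.) aligned]
    return torch.cat([orig, aligned, orig - aligned, orig * aligned], dim=-1)

K_q = fuse(Q, Q_bar)           # fused question features, m x 8u
```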
S5, extracting the fused features through a multi-window CNN to obtain the best option. The calculation is:
$$u=\mathrm{CNN}(\mathrm{Fuse}),$$
where Fuse denotes the fused content $K_q$ or $K_a$. The CNN output u is passed through max pooling and average pooling to obtain $S_{q,max}$, $S_{a,max}$, $S_{q,mean}$, and $S_{a,mean}$, which are then concatenated into a vector S; the final prediction vector is obtained through a multi-layer perceptron (MLP), and the score vector is computed with:
$$G=\mathrm{softmax}(\mathrm{Score}).$$
Training reduces the difference between the probability distribution of the predicted values and the probability distribution of the label values. [The formulas for the prediction vector, Score, and the loss appear only as equation images in the original.]
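A sketch of the step-S5 pipeline follows: a multi-window CNN over the fused features, max and average pooling, concatenation into S, an MLP scorer, and a cross-entropy objective as one standard way to reduce the difference between the predicted and label distributions. Window sizes, channel counts, MLP width, and the 8u input width (from the fusion assumed above) are all assumptions:

```python
import torch
import torch.nn as nn

u = 150
K_q = torch.randn(12, 8 * u)  # fused question features from step S4
K_a = torch.randn(30, 8 * u)  # fused answer features from step S4

windows = [1, 2, 3]           # assumed convolution window sizes
convs = nn.ModuleList(
    [nn.Conv1d(8 * u, 100, kernel_size=w, padding=w // 2) for w in windows]
)

def cnn_pool(K):
    x = K.T.unsqueeze(0)      # 1 x channels x seq_len layout for Conv1d
    maxes, means = [], []
    for conv in convs:
        out = torch.relu(conv(x))          # u = CNN(Fuse) for one window size
        maxes.append(out.max(dim=2).values)
        means.append(out.mean(dim=2))
    return torch.cat(maxes, dim=1), torch.cat(means, dim=1)

S_q_max, S_q_mean = cnn_pool(K_q)
S_a_max, S_a_mean = cnn_pool(K_a)
S = torch.cat([S_q_max, S_a_max, S_q_mean, S_a_mean], dim=1)

mlp = nn.Sequential(nn.Linear(S.size(1), 128), nn.ReLU(), nn.Linear(128, 2))
score = mlp(S)                 # Score for this (question, answer) pair
G = torch.softmax(score, dim=1)             # G = softmax(Score)
label = torch.tensor([1])                   # 1 = correct answer (assumed labeling)
loss = nn.functional.cross_entropy(score, label)
```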
although the present invention has been described in detail with reference to the foregoing embodiments, it will be apparent to those skilled in the art that various changes in the embodiments and/or modifications of the invention can be made, and equivalents and modifications of some features of the invention can be made without departing from the spirit and scope of the invention.

Claims (6)

1. An answer selection algorithm based on an enhanced question importance representation, characterized in that it comprises the following steps:
S1, encoding the question and the answer through a BiLSTM encoding layer;
S2, obtaining a new question vector from the encoded question using a self-attention mechanism;
S3, establishing a word-level similarity matrix between the question and the answer and aligning them;
S4, capturing semantic information at multiple granularities, and fusing and comparing the vectors of different granularities;
and S5, extracting the fused features through the multi-window CNN to obtain the best option.
2. The answer selection algorithm based on an enhanced question importance representation according to claim 1, characterized in that: in step S1, Q denotes the question and A denotes the answer, and $H_q=\{h_{q1},\ldots,h_{qm}\}$ and $H_a=\{h_{a1},\ldots,h_{an}\}$ denote the question sentence vector and the answer sentence vector, where $h_{qi}$ is the embedding of the $i$-th word of sentence $H_q$ and m and n denote the lengths of the question and the answer, respectively;
the question and the answer capture sentence-context information through a BiLSTM encoding layer; the hidden-layer dimension of the LSTM is u, the word embedding at time t is $x_t$, the hidden state and memory cell at the previous time step are $h_{t-1}$ and $c_{t-1}$, and the hidden state $h_t$ and memory cell $c_t$ at the current time step are computed as follows:
$$g_t=\phi(W_g x_t+V_g h_{t-1}+b_g),$$
$$i_t=\sigma(W_i x_t+V_i h_{t-1}+b_i),$$
$$f_t=\sigma(W_f x_t+V_f h_{t-1}+b_f),$$
$$o_t=\sigma(W_o x_t+V_o h_{t-1}+b_o),$$
$$c_t=g_t\odot i_t+c_{t-1}\odot f_t,$$
$$h_t=c_t\odot o_t,$$
where the W and V terms are weight matrices and the b terms are bias vectors; $\sigma$ and $\phi$ are the sigmoid and tanh functions, respectively; $\odot$ denotes element-wise multiplication of two vectors; the input gate i, forget gate f, and output gate o automatically control the flow of information; the memory cell $c_t$ retains long-range information; and $h_t$ is the vector representation at time t.
3. The answer selection algorithm based on an enhanced question importance representation according to claim 1, characterized in that: step S2 processes the question sentence $T_q=\{t_{q1},\ldots,t_{qm}\}$ and the answer sentence $T_a=\{t_{a1},\ldots,t_{an}\}$ obtained in step S1, computes a weight for each word in the question, and updates the representation to generate a new question vector:
$$v=T_q W_1,$$
$$\alpha_q=\mathrm{sigmoid}(v),$$
$$U_q=\alpha_q\odot T_q,$$
where $W_1$ is a trainable weight matrix, $\alpha_q$ holds the per-word weights, and $U_q$ is the re-weighted question representation.
4. The answer selection algorithm based on an enhanced question importance representation according to claim 1, characterized in that: the word-level matrix is computed as:
$$M(i,j)=U_q(i)\,T_a(j)^T,$$
where each row of the word-level matrix captures the influence of a question word on each word in the answer; the rows and columns of the word-level matrix are normalized with a softmax function to obtain the mutual-information influence factors $\lambda_q(i,j)$ and $\lambda_a(i,j)$, whose values all lie in $[0,1]$; the question vector and the answer vector are multiplied by the corresponding influence factors to obtain two new vectors $E_q$ and $E_a$.
5. The answer selection algorithm based on an enhanced question importance representation according to claim 1, characterized in that: in step S4, the original question vector is denoted Q and the vector passing through the attention alignment layer is denoted $\bar{Q}$; the original answer vector is denoted A and the vector passing through the attention alignment layer is denoted $\bar{A}$; vector subtraction captures the Euclidean distance between two vectors, and element-wise multiplication approximates the cosine distance between two vectors. [The specific fusion formulas appear only as equation images in the original.]
6. The answer selection algorithm based on an enhanced question importance representation according to claim 4, characterized in that: the calculation in step S5 is:
$$u=\mathrm{CNN}(\mathrm{Fuse}),$$
where Fuse denotes the fused content $K_q$ or $K_a$; the CNN output u is passed through max pooling and average pooling to obtain $S_{q,max}$, $S_{a,max}$, $S_{q,mean}$, and $S_{a,mean}$, which are then concatenated into a vector S; the final prediction vector is obtained through a multi-layer perceptron (MLP), and the score vector is computed with:
$$G=\mathrm{softmax}(\mathrm{Score}),$$
the training objective reducing the difference between the probability distribution of the predicted values and that of the label values. [The formulas for the prediction vector, Score, and the loss appear only as equation images in the original.]
CN201911143753.7A 2019-06-21 2019-11-20 Answer selection algorithm based on enhanced question importance representation Active CN110674280B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910542759 2019-06-21
CN2019105427595 2019-06-21

Publications (2)

Publication Number Publication Date
CN110674280A true CN110674280A (en) 2020-01-10
CN110674280B CN110674280B (en) 2023-12-15

Family

ID=69087950

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911143753.7A Active CN110674280B (en) 2019-06-21 2019-11-20 Answer selection algorithm based on enhanced question importance representation

Country Status (1)

Country Link
CN (1) CN110674280B (en)


Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180276525A1 (en) * 2015-12-03 2018-09-27 Huawei Technologies Co., Ltd. Method and neural network system for human-computer interaction, and user equipment
JP2018063696A (en) * 2016-10-07 2018-04-19 国立研究開発法人情報通信研究機構 Non-factoid question answering system and method, and computer program therefor
US20180329884A1 (en) * 2017-05-12 2018-11-15 Rsvp Technologies Inc. Neural contextual conversation learning
US20200134263A1 (en) * 2017-07-13 2020-04-30 National Institute Of Information And Communications Technology Non-factoid question-answering device
CN108763284A (en) * 2018-04-13 2018-11-06 华南理工大学 A kind of question answering system implementation method based on deep learning and topic model
CN108829719A (en) * 2018-05-07 2018-11-16 中国科学院合肥物质科学研究院 The non-true class quiz answers selection method of one kind and system
CN108845990A (en) * 2018-06-12 2018-11-20 北京慧闻科技发展有限公司 Answer selection method, device and electronic equipment based on two-way attention mechanism
CN108959246A (en) * 2018-06-12 2018-12-07 北京慧闻科技发展有限公司 Answer selection method, device and electronic equipment based on improved attention mechanism
CN109033068A (en) * 2018-06-14 2018-12-18 北京慧闻科技发展有限公司 It is used to read the method, apparatus understood and electronic equipment based on attention mechanism
CN109670029A (en) * 2018-12-28 2019-04-23 百度在线网络技术(北京)有限公司 For determining the method, apparatus, computer equipment and storage medium of problem answers

Non-Patent Citations (7)

* Cited by examiner, † Cited by third party
Title
Y. Song: "P-CNN: Enhancing text matching with positional convolutional neural network", Knowledge-Based Systems *
Pang Liang et al.: "A Survey on Deep Text Matching" [深度文本匹配综述], Chinese Journal of Computers [计算机学报] *
Luan Kexin et al.: "Automatic Answer Extraction Method Based on Intra-Sentence Attention Mechanism" [基于句内注意力机制的答案自动抽取方法], Intelligent Computer and Applications [智能计算机与应用] *
Xiong Xue et al.: "Research on Answer Selection Methods Based on the Attention Mechanism" [基于注意力机制的答案选择方法研究], Intelligent Computer and Applications [智能计算机与应用] *
Wang Ziliang: "Research on QA Answer Selection Algorithms Based on Deep Learning" [基于深度学习的QA答案选择算法研究], China Masters' Theses Full-text Database, Information Science and Technology [中国优秀硕士学位论文全文数据库信息科技辑] *
Wang Xia et al.: "A Multiple-Choice Reading Comprehension Model Based on Intra-Paragraph Reasoning and Joint Question-Answer Matching" [基于段落内部推理和联合问题答案匹配的选择型阅读理解模型], Journal of Sichuan University (Natural Science Edition) [四川大学学报(自然科学版)] *
Chen Kejin et al.: "Answer Selection Algorithm Based on Multi-Scale Similarity Features" [基于多尺度相似度特征的答案选择算法], Systems Engineering and Electronics [系统工程与电子技术] *

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111665819A (en) * 2020-06-08 2020-09-15 杭州电子科技大学 Deep learning multi-model fusion-based complex chemical process fault diagnosis method

Also Published As

Publication number Publication date
CN110674280B (en) 2023-12-15

Similar Documents

Publication Publication Date Title
US11170287B2 (en) Generating dual sequence inferences using a neural network model
CN109783827B (en) Deep neural machine translation method based on dynamic linear polymerization
CN109992773B (en) Word vector training method, system, device and medium based on multi-task learning
CN110929092B (en) Multi-event video description method based on dynamic attention mechanism
US11562177B2 (en) Triple verification device and triple verification method
CN108665506B (en) Image processing method, image processing device, computer storage medium and server
CN109871541B (en) Named entity identification method suitable for multiple languages and fields
US20210375280A1 (en) Systems and methods for response selection in multi-party conversations with dynamic topic tracking
US20030236662A1 (en) Sequential conditional generalized iterative scaling
CN111414749B (en) Social text dependency syntactic analysis system based on deep neural network
CN111428525A (en) Implicit discourse relation identification method and system and readable storage medium
CN112527966B (en) Network text emotion analysis method based on Bi-GRU neural network and self-attention mechanism
Vialatte et al. A study of deep learning robustness against computation failures
WO2023231513A1 (en) Conversation content generation method and apparatus, and storage medium and terminal
CN115080715B (en) Span extraction reading understanding method based on residual structure and bidirectional fusion attention
CN111914553A (en) Financial information negative subject judgment method based on machine learning
CN109979461B (en) Voice translation method and device
CN111652000A (en) Sentence similarity judging method and system
CN110674280B (en) Answer selection algorithm based on enhanced question importance representation
CN116543289B (en) Image description method based on encoder-decoder and Bi-LSTM attention model
CN112906398A (en) Sentence semantic matching method, system, storage medium and electronic equipment
CN116362242A (en) Small sample slot value extraction method, device, equipment and storage medium
WO2023017568A1 (en) Learning device, inference device, learning method, and program
CN110909860A (en) Method and device for initializing neural network parameters
CN114743056A (en) Dynamic early-quit-based image description generation model and model training method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20230411

Address after: 518000 room 110, No. 41, Longguan West Road, Dalang, Longhua District, Shenzhen, Guangdong

Applicant after: Zhuo Mucheng

Address before: 610064, No. 24, south section of Ring Road, Sichuan, Chengdu

Applicant before: SICHUAN University

TA01 Transfer of patent application right

Effective date of registration: 20231110

Address after: Room 302, Unit 1, 3rd Floor, Building 4, Yard 1, Wujiachang Road, Haidian District, Beijing, 100080

Applicant after: Beijing Zhongke Microparticle Biotechnology Co.,Ltd.

Address before: 518000 room 110, No. 41, Longguan West Road, Dalang, Longhua District, Shenzhen, Guangdong

Applicant before: Zhuo Mucheng

GR01 Patent grant