WO2022050724A1 - Device, method, and system for determining responses to queries - Google Patents
- Publication number
- WO2022050724A1 (PCT/KR2021/011861)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- graph
- query
- embedding
- response
- language
- Prior art date
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/30—Information retrieval; Database structures therefor; File system structures therefor of unstructured textual data
- G06F16/33—Querying
- G06F16/332—Query formulation
- G06F16/3329—Natural language query formulation or dialogue systems
- G06F16/3331—Query processing
- G06F16/334—Query execution
- G06F16/3344—Query execution using natural language analysis
- G06F16/90—Details of database functions independent of the retrieved data types
- G06F16/901—Indexing; Data structures therefor; Storage structures
- G06F16/9024—Graphs; Linked lists
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N20/00—Machine learning
Definitions
- the present invention relates to an apparatus, method, and system for determining a response to a query using a deep neural network (DNN) model.
- An object of the present invention is to provide an apparatus, method, and system that determine an appropriate response (correct answer) to a given query while making the response-determination process easier to interpret.
- an apparatus for determining a response to a query, a method for determining a response, and a system for determining a response are provided.
- the response determination device may include a graph generator that generates an integrated graph for a query, a graph encoder that generates a graph embedding for the graph, a text encoder that generates a language embedding for the query, and a result obtaining unit that concatenates the graph embedding and the language embedding and then determines a response to the query by performing learning.
- the response determination system includes a terminal device that receives a query and a server device communicatively connected to the terminal device that obtains a response corresponding to the query. At least one of the terminal device and the server device generates an integrated graph for the query, generates a graph embedding for the graph, or generates a language embedding for the query, and the server device concatenates the graph embedding and the language embedding and then performs learning to determine a response to the query.
- a response determination method comprises generating an integrated graph for a query, generating a graph embedding for the graph, generating a language embedding for the query, concatenating the graph embedding and the language embedding, and determining an answer to the query by performing learning on the concatenated result.
- answers to common-sense queries can be searched for and output more accurately and quickly, and it becomes easier to interpret and explain by what process, or for what reason, an answer to a question was determined.
- a device capable of answering a question, such as an artificial intelligence speaker or an artificial intelligence personal assistant device or program, can make it easy to understand how it found an answer to the user's query, enabling a more rational conversation between the device and the person; accordingly, the device can provide necessary information more quickly and accurately.
- FIG. 1 is a block diagram of an embodiment of an apparatus for determining a response to an inquiry.
- FIG. 2 is a block diagram of one embodiment of a processor.
- FIG. 3 is a diagram illustrating an example of an abstract meaning representation (AMR) graph.
- FIG. 4 is a diagram illustrating an example of an integrated graph.
- FIG. 5 is a diagram of an embodiment of a system for determining a response to a query.
- FIG. 6 is a flowchart of an embodiment of a method for determining a response to a query.
- a term suffixed with 'unit' as used below may be implemented in software or hardware. Depending on the embodiment, one 'unit' may be implemented as one physical or logical part, a plurality of 'units' may be implemented as one physical or logical part, or one 'unit' may be implemented as a plurality of physical or logical parts.
- when it is said that a certain part is connected to another part, this may mean that the two parts are physically connected or electrically connected.
- when it is said that a part includes another part, this does not exclude parts other than the stated part unless otherwise specified, and further parts may be included according to the designer's choice.
- Hereinafter, various embodiments of an apparatus for determining a response to a query will be described with reference to FIGS. 1 to 4.
- FIG. 1 is a block diagram of an embodiment of an apparatus for determining a response to an inquiry.
- the response determining apparatus 100 illustrated in FIG. 1 may be implemented using at least one information processing apparatus capable of receiving data and performing a predetermined arithmetic process on it.
- the at least one information processing device may include an artificial intelligence response device (for example, an artificial intelligence speaker), a smartphone, a tablet PC, a smart watch, a portable game machine, a head mounted display (HMD) device, a navigation device, a personal digital assistant (PDA), a desktop computer, a laptop computer, a digital television, a set-top box, a consumer electronics appliance, a vehicle, a manned or unmanned aerial vehicle, a robot, an industrial machine, and/or an electronic device specifically designed to determine responses to queries, but is not limited thereto; various devices that a designer can consider may be used as the above-described response determining device 100.
- the response determining apparatus 100 may include an input unit 101, an output unit 102, a storage unit 103, and a processor 110, as shown in FIG. 1.
- depending on the embodiment, the input unit 101, the output unit 102, and/or the storage unit 103 may be omitted.
- the input unit 101 may receive at least one query from a user or an external device and, according to an embodiment, may further receive, simultaneously or sequentially, a plurality of options prepared in response to each of the at least one query.
- the query may be formed of text (which may include at least one letter, symbol, number, and/or figure), sound (for example, voice), an image, and/or any other expression system capable of representing a question.
- the options may likewise be formed of text, sound, an image, and/or a predetermined expression system.
- the query and the options may be formed in the same expression system, for example text, or in different expression systems.
- the query may include a common sense question.
- the plurality of options may include answers presented in advance to the query, and at least one of the plurality of options may be an answer suitable for the query.
- the input unit 101 may receive a common sense question answer including a question and an option.
- Common-sense question answering refers to a multiple-choice question-answering dataset in which various kinds of common sense are required to predict the correct answer. That is, a common-sense question-and-answer item is a set of data including a question and at least two corresponding options, such that the option corresponding to the correct answer can be determined among them using common sense alone.
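The common-sense question-and-answer data described above can be sketched as a simple record. The field names and the example question below are illustrative assumptions, not the patent's actual data format.

```python
# One multiple-choice common-sense QA item: a question, several prepared
# options, and an index marking the option that is the common-sense answer.
qa_item = {
    "question": "A boy wants to go. Where might he be headed?",
    "options": ["school", "ocean", "moon", "sadness", "cupboard"],
    "answer_index": 0,  # exactly one option is marked as the correct answer
}

def correct_answer(item):
    """Return the option marked as the correct answer."""
    return item["options"][item["answer_index"]]

print(correct_answer(qa_item))  # school
```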
- the input unit 101 may also receive, from a user, a designer, or an external device, data, applications (which may also be called software, programs, or apps), and/or commands or instructions required for the operation of the response determining device 100.
- the input unit 101 may be implemented using, for example, a keyboard, a mouse, a graphics tablet, a microphone, a touch screen, a touch pad, a track ball, a track pad, a data input/output terminal (such as a USB terminal), and/or a wired or wireless communication module, but is not limited thereto.
- the output unit 102 may provide processing results to a user, a designer, or an external device by visually or aurally outputting data acquired during the processing of the processor 110, or the processing result of the processor 110, to the outside.
- the output unit 102 may output information on an option determined as a correct answer among the options and/or may output data or graphs obtained in the process of processing.
- the output unit 102 may further output a query or at least one option prepared in advance for the query.
- the output unit 102 may output the processing result or data acquired during processing visually or audibly, for example, in a visual form such as text or an image.
- the output unit 102 may be implemented using, for example, a cathode ray tube, a display panel, a speaker device, a data input/output terminal, and/or a wired or wireless communication module, and may also be implemented with various other devices or parts according to the designer's choice.
- the storage unit 103 is electrically connected to at least one of the input unit 101, the output unit 102, and the processor 110, and temporarily or non-temporarily stores one or more applications or data necessary for the overall operation of the response determining device 100 or for individual operations of the processor 110.
- the storage unit 103 may temporarily or non-temporarily store a query input through the input unit 101, at least one option prepared in advance for the query, an abstract meaning representation (AMR) graph generated by the graph generating unit 111, an integrated graph obtained by adding ConceptNet to the abstract meaning representation graph, a graph embedding result obtained by the graph encoder (112 in FIG. 2), a language embedding result obtained by the text encoder (117 in FIG. 2), or a finally obtained result.
- the storage unit 103 may store one or more applications for generating an integrated graph or obtaining a result, and may also store at least one learning model required to run the applications.
- the applications may be directly created and input by the designer and stored in the storage unit 103, or obtained through a distribution network accessible through a wired or wireless communication network (for example, electronic software distribution (ESD)).
- the storage unit 103 may include at least one of a main memory device and an auxiliary memory device; the auxiliary memory device may include, for example, a flash memory, a solid state drive (SSD), a hard disk drive (HDD), a DVD, an SD card, and/or a compact disc (CD).
- the processor 110 is provided to perform operations, determinations, processing, and/or control necessary for the overall operation of the response determining apparatus 100, or to perform at least one operation for determining a correct answer to an input query. In this case, the processor 110 may perform a predetermined operation by running an application stored in the storage unit 103.
- the processor 110 may be implemented using, for example, a central processing unit (CPU), a micro controller unit (MCU), an application processor (AP), an electronic control unit (ECU), and/or at least one other semiconductor-based electronic device capable of various computations and control processing. The detailed operation of the processor 110 will be described later.
- FIG. 2 is a block diagram of one embodiment of a processor.
- the processor 110 may include a graph generating unit 111 , a graph encoder 112 , a text encoder 117 , and a result obtaining unit 119 . At least two of the graph generating unit 111 , the graph encoder 112 , the text encoder 117 , and the result obtaining unit 119 may be logically separated or physically separated. When physically separated, at least one of the graph generating unit 111 , the graph encoder 112 , the text encoder 117 , and the result obtaining unit 119 may be implemented using a separate processor.
- the query may be input to the graph generating unit 111 and the text encoder 117 , respectively.
- the at least one option may be input to at least one of the graph generator 111 and the text encoder 117 simultaneously or sequentially with the query.
- the graph generator 111 may generate an integrated graph based on the input query, and the text encoder 117 may extract a semantic expression for the query.
- FIG. 3 is a diagram illustrating an example of an abstract meaning representation (AMR) graph.
- FIG. 4 is a diagram illustrating an example of an integrated graph.
- the graph generating unit 111 may convert a query 90 in the form of a sentence into an abstract meaning representation (AMR) graph 91.
- Abstract meaning representation is a semantic representation language; it is a technique for representing words or sentences in a structured graph form.
- the abstract meaning representation graph 91 is a single-rooted, labeled, directed, and acyclic graph obtained by transforming one or several sentences based on their propositional meaning.
- in the abstract meaning representation graph 91, at least one of an action (event), an actor, and an action target (object of the action) in the sentence 90 may be connected and expressed according to their relationships.
- as shown in FIG. 3, when a sentence 90 of [A boy wants to go] is input, the sentence 90 may include an event i0 ([want]), an actor s0 ([boy]), and an action target s1 ([go]).
- the abstract meaning representation corresponding to the sentence 90 may be formed of at least one arrow from a first point e toward each of the other points p0 and p1, and at least one arrow from each point (e, p0, p1) toward a specific instance.
- each arrow from the first point e to the respective points p0 and p1 may be labeled with a relation, for example, the actor relation ARG0 or the action-target relation ARG1.
- at least one further arrow may be formed from each point (e, p0, p1) to the specific instance corresponding to that point.
- at the first point e, an arrow points to the event i0 in the sentence 90 (for example, [want]); at the second point p0, located at the end of the arrow labeled with the actor relation ARG0, an arrow points to the actor s0 (for example, [boy]); and at the third point p1, located at the end of the arrow labeled with the action-target relation ARG1, an arrow points to the action target s1 (for example, [go]).
- at least one abstract semantic representation graph 91 corresponding to the at least one query 90 may be formed.
- an abstract semantic representation graph for at least one option for the query may be generated together.
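The single-rooted, labeled, directed acyclic AMR graph for [A boy wants to go] described above can be sketched as plain node and edge collections. The dictionary/tuple representation and the root check below are illustrative, not the patent's internal format; the node and edge labels follow the figure's notation (e, p0, p1, ARG0, ARG1).

```python
# AMR graph for "A boy wants to go": points map to instances, and labeled
# directed edges connect the event to its actor (ARG0) and target (ARG1).
amr_nodes = {"e": "want", "p0": "boy", "p1": "go"}
amr_edges = [
    ("e", "ARG0", "p0"),  # arrow from the event toward the actor
    ("e", "ARG1", "p1"),  # arrow from the event toward the action target
]

def is_single_rooted(nodes, edges):
    """A node with no incoming edge is a root; an AMR graph has exactly one."""
    targets = {t for _, _, t in edges}
    roots = [n for n in nodes if n not in targets]
    return len(roots) == 1

print(is_single_rooted(amr_nodes, amr_edges))  # True
```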
- the graph generating unit 111 may generate and obtain the integrated graph 92 shown in FIG. 4 by adding a ConceptNet graph for the query to the obtained at least one abstract meaning representation graph 91.
- ConceptNet is a semantic network; more specifically, concepts (nouns, noun phrases, verb phrases, adjective phrases, or other mutually related natural-language phrases) are expressed as nodes, and the nodes corresponding to the concepts are connected by paths (line segments) representing the relationships between them, forming a network.
- ConceptNet may be generated based on a predetermined database, for example, the Open Mind Common Sense (OMCS) database.
- as shown in FIG. 4, a node ni is prepared corresponding to the event i0 in the sentence 90, and the node ni for the event i0 is connected to the node n0 for the actor s0 in the sentence 90 through a path representing the relationship ARG0 between the two nodes (ni, n0).
- the node ni for the event i0 may also be connected through a predetermined path (IsWayOf) to another node ni1 corresponding to a word or sentence having a specific relationship with the event i0 (for example, [demand]).
- the node n0 for the actor s0 may be connected to the node n1 corresponding to the action target s1 in the sentence 90 through a path ARG1 indicating the relationship between the two nodes (n0, n1).
- the node n0 for the actor s0 may also be connected, through a path corresponding to the hypernym relation (for example, IsATypeOf), to a node (n01, for example, [man]) for a word or sentence that is a higher-level concept of the actor s0, and, through a path corresponding to the synonym relation (Synonym), to a node (n03, for example, [male child]) corresponding to a synonymous word or sentence.
- the node n1 for the action target s1 may likewise be connected, through a path (IsRelatedTo), to a node n10 corresponding to a word related to the action target s1 (for example, [leave]).
- at least one integrated graph 92 corresponding to the query 90 and/or at least one option for the query 90 can be generated.
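The integrated-graph construction above can be sketched as a union of the AMR edges with ConceptNet-style edges that touch AMR nodes. The neighbour words and the merge-by-union strategy below are illustrative assumptions; the relation labels (IsWayOf, IsA, Synonym, IsRelatedTo) follow the relations named in the description.

```python
# AMR edges for "A boy wants to go", keyed by instance word for simplicity.
amr_edges = [("want", "ARG0", "boy"), ("want", "ARG1", "go")]

# ConceptNet-style edges around the same words (illustrative examples).
conceptnet_edges = [
    ("want", "IsWayOf", "demand"),
    ("boy", "IsA", "man"),
    ("boy", "Synonym", "male child"),
    ("go", "IsRelatedTo", "leave"),
]

def build_integrated_graph(amr, conceptnet):
    """Union the AMR edges with ConceptNet edges that touch an AMR node."""
    amr_nodes = {n for s, _, t in amr for n in (s, t)}
    extra = [(s, r, t) for s, r, t in conceptnet
             if s in amr_nodes or t in amr_nodes]
    return amr + extra

graph = build_integrated_graph(amr_edges, conceptnet_edges)
print(len(graph))  # 6 edges: 2 from the AMR graph, 4 grafted from ConceptNet
```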
- the graph generator 111 may transmit the integrated graph 92 to the graph encoder 112, as shown in FIG. 2.
- the graph encoder 112 may perform graph embedding so that the semantic information of the integrated graph 92 can be input, in the form of embeddings, to the learning model processed by the result obtaining unit 119.
- Embedding means converting input data into at least one vector format before learning is performed, and graph embedding means converting the integrated graph 92 obtained by the graph generating unit 111 into a vector format.
- the obtained vector may have one or more dimensions, and each element in the vector may be expressed numerically.
- the graph encoder 112 may include a relation embedding processing unit 113, a node embedding processing unit 114, a concatenation unit 115, and a learning unit 116.
- the relation embedding processing unit 113 may search for and obtain at least one path (ARG0, ARG1, IsWayOf, IsATypeOf, Antonym, Synonym, IsRelatedTo) in the integrated graph, and then obtain a relation embedding based on each obtained path.
- the relation embedding processing unit 113 may use a predetermined learning model to obtain relation embeddings from each path. In this case, each path may be used as an input value of the learning model.
- the learning model used by the relation embedding processing unit 113 may include, for example, a long short-term memory (LSTM), a recurrent neural network (RNN), a gated recurrent unit (GRU), or at least one other learning model the designer can consider for embedding.
- the relation embedding may be passed to the concatenation unit 115.
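One way the relation embedding processing unit could derive an embedding from a path is to run a recurrent model over the path's relation labels. The sketch below uses a vanilla RNN with small, randomly initialized fixed weights purely for illustration; the description names LSTM, RNN, and GRU as candidate models, and in practice the weights would be learned.

```python
import math
import random

random.seed(0)  # fixed seed so the illustrative weights are deterministic
DIM = 8
RELATIONS = ["ARG0", "ARG1", "IsWayOf", "IsATypeOf",
             "Antonym", "Synonym", "IsRelatedTo"]
# One input vector per relation label, plus a recurrent weight matrix.
label_vecs = {r: [random.uniform(-1, 1) for _ in range(DIM)] for r in RELATIONS}
W = [[random.uniform(-0.5, 0.5) for _ in range(DIM)] for _ in range(DIM)]

def rnn_relation_embedding(path):
    """Vanilla RNN over the relation labels on a path: h_t = tanh(W h + x)."""
    h = [0.0] * DIM
    for label in path:
        x = label_vecs[label]
        h = [math.tanh(sum(W[i][j] * h[j] for j in range(DIM)) + x[i])
             for i in range(DIM)]
    return h  # final hidden state serves as the relation embedding

emb = rnn_relation_embedding(["ARG0", "Synonym"])
print(len(emb))  # 8
```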
- the node embedding processing unit 114 may obtain a node embedding corresponding to each node (ni, n0, n1). According to an embodiment, the node embedding processing unit 114 may use the Global Vectors for Word Representation (GloVe) method to obtain node embeddings. Acquisition of the node embeddings may be performed simultaneously with, before, or after the acquisition of the relation embeddings, or independently of it. The node embeddings may be passed to the concatenation unit 115.
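GloVe node embeddings amount to look-ups in a pretrained word-vector table. The tiny table below is an illustrative stand-in for real GloVe vectors, and averaging the word vectors of a multi-word node label (such as [male child]) is an assumption, not the patent's stated procedure.

```python
# Illustrative 3-dimensional "GloVe" table; real GloVe vectors are
# pretrained, much larger, and typically 50-300 dimensional.
glove = {
    "boy":   [0.1, 0.3, -0.2],
    "male":  [0.2, 0.1, -0.1],
    "child": [0.0, 0.4, -0.3],
}
ZERO = [0.0, 0.0, 0.0]  # fallback for out-of-vocabulary words

def node_embedding(node_label):
    """Look up each word of the node label and average the vectors."""
    vecs = [glove.get(w, ZERO) for w in node_label.split()]
    return [sum(v[i] for v in vecs) / len(vecs) for i in range(3)]

print(node_embedding("male child"))  # componentwise mean of "male", "child"
```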
- the concatenation unit 115 receives the relation embedding output by the relation embedding processing unit 113 and the node embedding output by the node embedding processing unit 114, and concatenates them to obtain a concatenated embedding.
- the concatenated embedding may be transmitted to the learning unit 116.
- the learning unit 116 may acquire the graph embedding by performing learning on the concatenated embedding. Specifically, the learning unit 116 may process a learning model using the concatenated embedding as an input value, and obtain the graph embedding from the processing result of the learning model.
- the learning model may include, for example, a model employing self-attention, such as a transformer model.
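The self-attention step at the heart of the transformer-style learning unit can be sketched as scaled dot-product attention over the sequence of concatenated embeddings. Using identity query/key/value projections and toy input vectors is an illustrative simplification; a real transformer learns those projections and stacks many such layers.

```python
import math

def softmax(xs):
    m = max(xs)  # subtract the max for numerical stability
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def self_attention(seq):
    """Single-head scaled dot-product self-attention with identity Q/K/V:
    out_i = sum_j softmax_j(q_i . k_j / sqrt(d)) * v_j."""
    d = len(seq[0])
    out = []
    for q in seq:
        scores = [sum(qi * kj for qi, kj in zip(q, k)) / math.sqrt(d)
                  for k in seq]
        w = softmax(scores)  # attention weights over all positions
        out.append([sum(w[j] * seq[j][i] for j in range(len(seq)))
                    for i in range(d)])
    return out

# Toy "concatenated embeddings" (relation part + node part per position).
joined = [[1.0, 0.0, 0.5, 0.2], [0.0, 1.0, 0.1, 0.4]]
attended = self_attention(joined)
print(len(attended), len(attended[0]))  # 2 4
```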
- the text encoder 117 may receive the query 90 from the input unit 101 and obtain a language embedding for the received query 90. If necessary, the text encoder 117 may further receive at least one option prepared for the query 90 and then acquire language embeddings for the received option(s). In this case, the text encoder 117 may acquire the language embeddings for the query 90 and the options using at least one pre-trained learning model; according to an embodiment, the same learning model may be applied to the query 90 and the options to obtain their respective embeddings, or different learning models may be applied to each. The learning model used may include, for example, Bidirectional Encoder Representations from Transformers (BERT) or XLNet.
- in order to obtain embeddings for the query 90 and its related options together, the text encoder 117 may insert an identifier distinguishing the query 90 from the options between them and input the combination into the learning model. For example, when BERT is used as the learning model, the query 90 and the options are combined with a [SEP] token, which separates sentences, inserted between them; the result is input to BERT to extract the semantic representation, and the language embedding is obtained accordingly. Depending on the embodiment, a [CLS] token indicating the beginning of the sequence may further be added at the front of the query 90.
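Assembling the BERT input described above can be sketched as follows. Whitespace tokenisation is an illustrative simplification of BERT's WordPiece tokeniser, and the trailing [SEP] token (standard in BERT sentence-pair inputs) is an assumption beyond what the description states.

```python
def build_bert_input(question, option):
    """Join a query and one option with [CLS]/[SEP] markers, one sequence
    per (question, option) pair, as in BERT sentence-pair encoding."""
    return (["[CLS]"] + question.split()
            + ["[SEP]"] + option.split() + ["[SEP]"])

tokens = build_bert_input("A boy wants to go", "school")
print(tokens)  # [CLS] first, [SEP] between the question and the option
```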
- the graph embedding and language embedding obtained as described above may be transmitted to the result obtaining unit 119 .
- the result obtaining unit 119 may determine a response that can be regarded as the correct answer to the query based on a learning model. For example, the result obtaining unit 119 may select one or more options recognized as correct answers to the query from among the at least one option.
- the operation of the result obtaining unit 119 may be performed using a learning model.
- the learning model may include at least one linear layer. If necessary, the learning model may be prepared by including a Softmax function, which is a normalized exponential function.
- the learning model of the result obtaining unit 119 may include a transformer, a long short-term memory, a recurrent neural network, a deep neural network (DNN), a convolutional neural network (CNN), a deep belief network (DBN), and/or a deep reinforcement learning model.
- the result obtained by the result obtaining unit 119 may be transmitted to the output unit 102, and the output unit 102 can output it visually or audibly.
- according to an embodiment, an attention weight may further be used to determine which part of the graph is used to resolve an input query during the operation of the above-described processor 110, graph generator 111, or graph encoder 112.
- when the attention weight is added, the operation of the processor 110 can be interpreted; accordingly, it becomes easier to interpret and explain how or why a response to a query was determined.
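The interpretability claim above can be sketched simply: once attention weights over graph elements are available, the highest-weighted element points to the part of the graph that most influenced the answer. The elements and weights below are illustrative values, not the output of a trained model.

```python
# Attention weights over graph elements, assumed normalised to sum to 1.
graph_elements = ["want", "boy", "go", "leave"]
attention_weights = [0.1, 0.2, 0.5, 0.2]

def most_attended(elements, weights):
    """Return the graph element that received the highest attention weight,
    i.e. the element that best explains the model's decision."""
    return elements[max(range(len(weights)), key=lambda i: weights[i])]

print(most_attended(graph_elements, attention_weights))  # go
```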
- FIG. 5 is a diagram of an embodiment of a system for determining a response to a query.
- in an embodiment, the response determination system 200 may include at least one terminal device 210 and a server device 220 communicatively connected to the at least one terminal device 210 through a wired and/or wireless communication network 201.
- the at least one terminal device 210 may receive a query from a user or a designer, or receive a query and at least one option corresponding to the query. According to an embodiment, the at least one terminal device 210 may transmit the received query, or the query and options, to the server device 220. According to another embodiment, the at least one terminal device 210 may perform at least one of the operations of the graph generator 111, the graph encoder 112, and the text encoder 117 described above, and transmit the result of the operation (for example, at least one of the integrated graph 92, the graph embedding, and the language embedding) to the server device 220.
- the at least one terminal device 210 may receive the result transmitted by the server device 220 (for example, a response determined as the correct answer to the query, or information on the option determined as the correct answer among the at least one option).
- the received result may be output to the outside through a display or a speaker device.
- the at least one terminal device 210 may be, for example, an artificial intelligence response device, a smartphone, a tablet PC, a smart watch, a digital television, a desktop computer, a laptop computer, a head mounted display device, a navigation device, a portable game machine, a home appliance, a robot, and/or any electronic device capable of receiving information and communicating with the server device 220.
- the server device 220 may receive, from the terminal device 210, a query, a query and options, or the result of the terminal device 210's processing (for example, at least one of the integrated graph 92, the graph embedding, and the language embedding), perform the appropriate processing operation on the received data, and thereby obtain a result (for example, a response determined as the correct answer to the query and/or the option determined as the correct answer among the at least one option).
- the server device 220 may perform the operation of the above-described result obtaining unit 119 and, depending on the type of data received from the terminal device 210, may also perform the operations of the above-described graph generator 111, graph encoder 112, and/or text encoder 117.
- the server device 220 may be implemented using one or more server computing devices. When implemented using two or more server computing devices, each device may perform the same operation or different operations. In the latter case, for example, one server computer may generate the integrated graph and obtain the graph embedding, while another server computer performs the language embedding acquisition and the final result acquisition. In addition, the server device 220 may be implemented using a desktop computer, a laptop computer, or a set-top box.
- FIG. 6 is a flowchart of an embodiment of a method for determining a response to a query.
- a query may be input first ( 300 ).
- at least one option for the query may be input together with the query, and the at least one option may include at least one option that is the correct answer for the query.
- an integrated graph for the query may be generated or an integrated graph for each of the query and the options may be generated ( 302 to 306 ).
- the generation of the integrated graph may be performed by converting the query or an option to obtain an abstract meaning representation graph (302), adding a ConceptNet graph to the obtained abstract meaning representation graph (304), and finally obtaining the integrated graph (306).
- graph embeddings for the integrated graph may be obtained (308 to 312). Specifically, the relation embedding and the node embedding for the integrated graph are each acquired (308), the acquired relation embedding and node embedding are concatenated to obtain the concatenated embedding (310), and learning is performed on the obtained concatenated embedding (312).
- according to an embodiment, the acquisition of the relation embedding may be performed using a learning model such as a long short-term memory, a recurrent neural network, or a gated recurrent unit, and the acquisition of the node embedding may be performed using the Global Vectors for Word Representation method or the like.
- learning on the concatenated embedding may be performed using a learning model employing self-attention, such as a transformer.
- the language embedding process (320) may be performed independently of, or together with, the acquisition of the integrated graph and the graph embedding (302 to 312).
- the language embedding process (320) may be performed based on, for example, a learning model such as BERT or XLNet.
- Language embeddings may be obtained according to the language embedding process 320 .
- the graph embedding and the language embedding may be concatenated (330).
- a result of joining the graph embedding and the language embedding may be input to at least one learning model, and a response to the query (i.e., the correct answer) may be determined according to the processing result of the learning model (332).
- the learning model may include at least one linear layer and, according to an embodiment, may be implemented to include a softmax function.
- the at least one learning model to which the result of joining the graph embedding and the language embedding is input may be implemented using, for example, a transformer or a long short-term memory (LSTM).
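The answer determination of steps 330 to 332 can be sketched end to end: concatenate the two embeddings per answer choice, score with a linear layer, and pick the choice with the highest softmax probability. The weights and embeddings below are made-up toy values, not trained parameters.

```python
import math

# Sketch of steps 330-332: join graph and language embeddings per
# answer choice, score with a linear layer, select via softmax.

def linear_score(vec, weights, bias):
    """One linear layer producing a scalar score."""
    return sum(v * w for v, w in zip(vec, weights)) + bias

def softmax(xs):
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

# one (graph_embedding, language_embedding) pair per answer choice
choices = {
    "library": ([0.1, 0.9], [0.2, 0.1]),
    "school":  ([0.8, 0.7], [0.9, 0.6]),
}
weights, bias = [1.0, 0.5, 1.0, 0.5], 0.0

# step 330: join the two embeddings; step 332: score and select
scores = {name: linear_score(g + l, weights, bias) for name, (g, l) in choices.items()}
probs = softmax(list(scores.values()))
answer = max(scores, key=scores.get)
```

With these toy values the "school" choice receives the higher score and is selected as the response.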
- the method for determining an answer to a query may be implemented in the form of a program that can be executed by a computer device.
- the program may include program instructions, data files, and data structures, alone or in combination.
- the program may be designed and produced using machine code or high-level language code.
- the program may be specially designed to implement the above-described method, or may be implemented using various functions or definitions that are known and available to those skilled in the art of computer software.
- the computer device may be implemented including a processor or memory that enables the function of the program to be realized, and may further include a communication device if necessary.
- a program for implementing the above-described method for determining an answer to a query may be recorded in a computer-readable recording medium.
- the computer-readable recording medium may include, for example, at least one type of physical device capable of storing a program executed in response to a call from a computer, such as a semiconductor storage device (e.g., a solid state drive (SSD), ROM, RAM, or flash memory), a magnetic disk storage medium (e.g., a hard disk or floppy disk), an optical recording medium (e.g., a compact disc or DVD), a magneto-optical recording medium, or a magnetic tape.
- the apparatus for determining a response to a query, a method for determining a response to a query, and a system for determining a response to a query have been described above, but the apparatus, method, and system for determining a response to a query are not limited to the above-described embodiment.
- Various devices, methods, or systems that those skilled in the art can implement by modifying or varying the above-described embodiments may also be recognized as embodiments of the above-described device, method, and system for determining a response to a query.
- Even if the described techniques are performed in an order different from that described, and/or the described components of the system, structure, apparatus, circuit, etc. are coupled or combined in a form different from that described, or are replaced or substituted by other components or equivalents, the result may still be an embodiment of the above-described apparatus for determining a response to a query, method for determining a response to a query, and system for determining a response to a query.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- Databases & Information Systems (AREA)
- Data Mining & Analysis (AREA)
- General Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Software Systems (AREA)
- Mathematical Physics (AREA)
- Artificial Intelligence (AREA)
- Computational Linguistics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Evolutionary Computation (AREA)
- Medical Informatics (AREA)
- Computing Systems (AREA)
- Human Computer Interaction (AREA)
- Machine Translation (AREA)
- Information Retrieval, Db Structures And Fs Structures Therefor (AREA)
Abstract
The present invention relates to a device, method, and system for determining responses to queries. The response determination device may comprise: a graph generation unit for generating an integrated graph for queries; a graph encoder for generating a graph embedding for the graph; a text encoder for generating a language embedding for queries; and a result acquisition unit for determining responses to queries by performing learning after joining the graph embedding and the language embedding.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR10-2020-0111846 | 2020-09-02 | ||
KR1020200111846A KR102526501B1 (ko) | 2020-09-02 | 2020-09-02 | Apparatus, method and system for determining a response to a query |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022050724A1 true WO2022050724A1 (fr) | 2022-03-10 |
Family
ID=80491237
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/KR2021/011861 WO2022050724A1 (fr) | 2021-09-02 | Device, method and system for determining responses to queries |
Country Status (2)
Country | Link |
---|---|
KR (1) | KR102526501B1 (fr) |
WO (1) | WO2022050724A1 (fr) |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20240052511A (ko) | 2022-10-14 | 2024-04-23 | 고려대학교 산학협력단 | Machine-reading-comprehension-based question answering apparatus and method capable of numerical operations |
KR20240071906A (ko) | 2022-11-16 | 2024-05-23 | 부산대학교 산학협력단 | Table question answering system and method based on clue generation for numerical reasoning operations |
KR20240076978A (ko) | 2022-11-24 | 2024-05-31 | 고려대학교 산학협력단 | Apparatus and method for generating a dialogue model using knowledge and personas |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20160103911A (ko) * | 2015-02-24 | 2016-09-02 | 한국과학기술원 | Question answering method and system using concept graph matching |
US20160357855A1 (en) * | 2015-06-02 | 2016-12-08 | International Business Machines Corporation | Utilizing Word Embeddings for Term Matching in Question Answering Systems |
KR20180092808A (ko) * | 2017-02-08 | 2018-08-20 | 한국과학기술원 | Concept-graph-based question answering system and context search method using the same |
KR20200062521A (ko) * | 2018-11-27 | 2020-06-04 | 한국과학기술원 | Concept-graph-based question answering apparatus and method |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR101653256B1 (ko) | 2015-02-23 | 2016-09-01 | 건국대학교 산학협력단 | Method for building a meta-knowledge database for user question answering |
US10909329B2 (en) | 2015-05-21 | 2021-02-02 | Baidu Usa Llc | Multilingual image question answering |
KR102034646B1 (ko) | 2017-12-20 | 2019-10-22 | 주식회사 솔트룩스 | Automatic question answering system having a plurality of question answering modules |
US11625620B2 (en) * | 2018-08-16 | 2023-04-11 | Oracle International Corporation | Techniques for building a knowledge graph in limited knowledge domains |
KR102039294B1 (ko) | 2019-03-25 | 2019-10-31 | 김보언 | Method, apparatus and program for acquiring incentive tourism destinations through AI-based question answering |
- 2020
  - 2020-09-02 KR KR1020200111846A patent/KR102526501B1/ko active IP Right Grant
- 2021
  - 2021-09-02 WO PCT/KR2021/011861 patent/WO2022050724A1/fr active Application Filing
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20160103911A (ko) * | 2015-02-24 | 2016-09-02 | 한국과학기술원 | Question answering method and system using concept graph matching |
US20160357855A1 (en) * | 2015-06-02 | 2016-12-08 | International Business Machines Corporation | Utilizing Word Embeddings for Term Matching in Question Answering Systems |
KR20180092808A (ko) * | 2017-02-08 | 2018-08-20 | 한국과학기술원 | Concept-graph-based question answering system and context search method using the same |
KR20200062521A (ko) * | 2018-11-27 | 2020-06-04 | 한국과학기술원 | Concept-graph-based question answering apparatus and method |
Non-Patent Citations (1)
Title |
---|
YOO SOYEOP, JEONG OKRAN: " An Intelligent Chatbot Utilizing BERT Model and Knowledge Graph", THE JOURNAL OF SOCIETY FOR E-BUSINESS STUDIES, vol. 24, no. 3, 1 August 2019 (2019-08-01), pages 87 - 98, XP055909472, ISSN: 2288-3908, DOI: 10.7838/jsebs.2019.24.3.087 * |
Also Published As
Publication number | Publication date |
---|---|
KR20220030088A (ko) | 2022-03-10 |
KR102526501B1 (ko) | 2023-04-27 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2022050724A1 (fr) | Device, method and system for determining responses to queries | |
CN108846130B (zh) | Question text generation method, apparatus, device and medium | |
CN108985358B (zh) | Emotion recognition method, apparatus, device and storage medium | |
WO2021049706A1 (fr) | Ensemble question answering system and method | |
EP3627398A1 (fr) | Method, system and computer program for artificial intelligence answer | |
JP2022039973A (ja) | Method and apparatus for managing quality, electronic device, storage medium, and computer program | |
KR20210154705A (ko) | Semantic matching method, apparatus, device and storage medium | |
CN111898374A (zh) | Text recognition method, apparatus, storage medium and electronic device | |
Nguyen et al. | Ontology-based integration of knowledge base for building an intelligent searching chatbot. | |
EP4145303A1 (fr) | Information search method and device, electronic device, and storage medium | |
Nguyen et al. | Design intelligent educational chatbot for information retrieval based on integrated knowledge bases | |
CN112232066A (zh) | Teaching syllabus generation method, apparatus, storage medium and electronic device | |
CN116662496A (zh) | Information extraction method, and method and apparatus for training a question answering processing model | |
CN110543551B (zh) | Question sentence processing method and apparatus | |
CN114676705B (zh) | Dialogue relation processing method, computer and readable storage medium | |
Wen et al. | CQACD: A concept question-answering system for intelligent tutoring using a domain ontology with rich semantics | |
CN111767720B (zh) | Title generation method, computer and readable storage medium | |
Rao et al. | Smart-bot assistant for college information system | |
CN113627197B (zh) | Text intent recognition method, apparatus, device and storage medium | |
CN116108918A (zh) | Training method for a dialogue pre-training model and related apparatus | |
CN114020908A (zh) | Text classification method and apparatus, computer-readable storage medium and electronic device | |
CN109933788B (zh) | Type determination method, apparatus, device and medium | |
CN111966895A (zh) | Method, apparatus and system for building a movie question answering system based on the Watson dialogue service | |
Jeyanthi et al. | AI‐Based Development of Student E‐Learning Framework | |
CN114564562B (zh) | Answer-guided question generation method, apparatus, device and storage medium | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 21864683; Country of ref document: EP; Kind code of ref document: A1 |
NENP | Non-entry into the national phase | Ref country code: DE |
122 | Ep: pct application non-entry in european phase | Ref document number: 21864683; Country of ref document: EP; Kind code of ref document: A1 |