CN110008476A - Semantic analytic method, device, equipment and storage medium - Google Patents

Semantic analytic method, device, equipment and storage medium

Info

Publication number
CN110008476A
CN110008476A (application CN201910284812.6A)
Authority
CN
China
Prior art keywords
vector
word
attention
sequence
feature
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201910284812.6A
Other languages
Chinese (zh)
Other versions
CN110008476B (en
Inventor
Meng Zhennan (孟振南)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Chumen Wenwen Information Technology Co Ltd
Original Assignee
Chumen Wenwen Information Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Chumen Wenwen Information Technology Co Ltd filed Critical Chumen Wenwen Information Technology Co Ltd
Priority to CN201910284812.6A priority Critical patent/CN110008476B/en
Publication of CN110008476A publication Critical patent/CN110008476A/en
Application granted granted Critical
Publication of CN110008476B publication Critical patent/CN110008476B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F40/00Handling natural language data
    • G06F40/30Semantic analysis

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • Computational Linguistics (AREA)
  • General Health & Medical Sciences (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Machine Translation (AREA)

Abstract

The present disclosure provides a semantic parsing method, comprising: obtaining features relevant to each word in an input sequence to obtain a sequence of per-word features; generating a first vector representation of the input sequence based on that sequence; generating an intent attention vector and a slot attention vector based on the first vector representation; and obtaining, from the intent attention vector and the slot attention vector, a semantic parsing result corresponding to the input sequence. The present disclosure further provides a semantic parsing apparatus, an electronic device, and a readable storage medium.

Description

Semantic analytic method, device, equipment and storage medium
Technical field
The present disclosure relates to a semantic parsing method, a semantic parsing apparatus, an electronic device, and a readable storage medium.
Background technique
Traditional semantic parsing methods divide intent prediction and semantic slot filling into two steps: first, intent classification is performed on the input sentence; second, slot filling is performed under the specific intent. This often ignores the relationship between the intent and the semantic slots within the same sentence, even though the slots usually have a strong connection with the intent.
The prior art has proposed methods that use an attention mechanism to model the relationship between semantic slots and intents, performing intent prediction and slot filling simultaneously in the hope of improving the accuracy of both. However, in the slot-filling task, using the attention mechanism alone cannot achieve a good result.
Summary of the invention
To solve at least one of the above technical problems, the present disclosure provides a semantic parsing method, a semantic parsing apparatus, an electronic device, and a readable storage medium.
According to one aspect of the disclosure, a semantic parsing method includes: obtaining features relevant to each word in an input sequence to obtain a sequence of per-word features; generating a first vector representation of the input sequence based on that sequence; generating an intent attention vector and a slot attention vector based on the first vector representation; and obtaining, from the first vector representation, the intent attention vector, and the slot attention vector, a semantic parsing result corresponding to the input sequence, wherein the semantic parsing result includes an intent prediction result and a slot-filling result.
According to one aspect of the disclosure, the sequence of per-word features is obtained by fusing the word embeddings of the input sequence with the relevant features.
According to one aspect of the disclosure, generating the first vector representation of the input sequence based on the sequence of per-word features includes: inputting the sequence of per-word features into a bidirectional long short-term memory network (BLSTM), and concatenating the forward hidden sequence and the backward hidden sequence to obtain the first vector representation.
According to one aspect of the disclosure, generating the intent attention vector and the slot attention vector based on the first vector representation includes: computing an attention weight for each hidden state in the first vector representation; obtaining the slot attention vector based on each hidden state and its corresponding attention weight; and obtaining the intent attention vector based on the hidden state of the last time step and its corresponding attention weight.
According to one aspect of the disclosure, the method further includes: obtaining, based on the intent attention vector and the slot attention vector, a weighted feature relevant to each hidden state; and obtaining the slot-filling result based on the weighted feature, the first vector representation, and the slot attention vector.
According to one aspect of the disclosure, the method further includes inputting the weighted feature, the first vector representation, and the slot attention vector into a conditional random field (CRF) layer to obtain the slot-filling result.
According to one aspect of the disclosure, the method further includes obtaining the features relevant to each word in the input sequence from a knowledge graph, and fusing the word embeddings of the input sequence position-wise with the relevant features to obtain the sequence of per-word features.
According to one aspect of the disclosure, a semantic parsing apparatus is provided, comprising: a word-feature sequence generation module for obtaining features relevant to each word in an input sequence and obtaining a sequence of per-word features; a first-vector-representation generation module for generating a first vector representation of the input sequence based on that sequence; an attention vector generation module for generating an intent attention vector and a slot attention vector based on the first vector representation; and a semantic-parsing-result generation module for obtaining, from the first vector representation, the intent attention vector, and the slot attention vector, a semantic parsing result corresponding to the input sequence, wherein the semantic parsing result includes an intent prediction result and a slot-filling result.
According to another aspect of the disclosure, an electronic device comprises: a memory storing computer-executable instructions; and a processor that executes the computer-executable instructions stored in the memory, so that the processor performs the above method.
According to a further aspect of the disclosure, a readable storage medium stores computer-executable instructions which, when executed by a processor, implement the above method.
Detailed description of the invention
The accompanying drawings, which are included to provide a further understanding of the disclosure and are incorporated in and constitute a part of this specification, illustrate exemplary embodiments of the disclosure and together with the description serve to explain the principles of the disclosure.
Fig. 1 is a schematic flow chart of a semantic parsing method according to an embodiment of the disclosure.
Fig. 2 is a schematic flow chart of an attention vector generating process according to an embodiment of the disclosure.
Fig. 3 is a schematic block diagram of a semantic parsing apparatus according to an embodiment of the disclosure.
Fig. 4 is a schematic block diagram of an attention vector generating apparatus according to an embodiment of the disclosure.
Fig. 5 is a schematic view of an electronic device according to an embodiment of the disclosure.
Specific embodiment
The disclosure is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are only used to explain the related content and are not restrictions on the disclosure. It should also be noted that, for convenience of description, only the parts relevant to the disclosure are illustrated in the drawings.
It should be noted that, in the absence of conflict, the embodiments of the disclosure and the features in the embodiments may be combined with each other. The disclosure is described in detail below with reference to the drawings and in conjunction with embodiments.
In the related art, semantic parsing is a very important module in natural language processing (NLP), comprising intent prediction and semantic slot filling. Intent prediction refers to judging the user's intent; common intents include "weather lookup", "music on demand", and "video on demand". Semantic slot filling refers to extracting the corresponding entities under a specific intent to further refine the semantics. In a "weather lookup" intent, the slots may be "city name" and "time"; in a "music on demand" intent, the slots may be "song title", "singer name", "album name", "song type", and the like; and in a "video on demand" intent, the slots may be "video name", "director", "actor", and the like.
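As an illustration of the task described above, the following hand-written sketch shows the shape of an intent-prediction plus slot-filling result; the label names and BIO-style tagging scheme are assumptions for illustration, not taken from the patent.

```python
# Illustrative only: the joint output of intent prediction (one label per
# utterance) and slot filling (one tag per token).
def parse_example():
    utterance = ["play", "Jay", "Chou", "'s", "Nocturne"]
    intent = "music_on_demand"           # intent prediction result
    slots = ["O", "B-singer", "I-singer", "O", "B-song"]  # slot-filling result
    return {"intent": intent, "slots": list(zip(utterance, slots))}

result = parse_example()
```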
Fig. 1 is the schematic flow chart according to the semantic analytic method of one embodiment of the disclosure.
In one embodiment of the present disclosure, the semantic parsing method includes: S11, obtaining features relevant to each word in an input sequence to obtain a sequence of per-word features; S12, generating a first vector representation of the input sequence based on that sequence; S13, generating an intent attention vector and a slot attention vector based on the first vector representation; and S14, obtaining, from the first vector representation, the intent attention vector, and the slot attention vector, a semantic parsing result corresponding to the input sequence, wherein the semantic parsing result includes an intent prediction result and a slot-filling result.
In step S11, features relevant to each word in the input sequence are obtained, and the sequence of per-word features is obtained by fusing the word embeddings of the input sequence with the relevant features.
In the disclosure, for an input sequence x = {x1, x2, ..., xT}, at each time step i the word embedding is fused (e.g. spliced) with one or more relevant features to obtain (xi, fi), so as to generate the sequence of per-word features. In the disclosure, this sequence is referred to as a word-feature sequence.
Specifically, one or more features relevant to each word in the input sequence are queried from a knowledge graph. The features may include part of speech, attributes, relationships with other entities, and the like. Then, the word embeddings of the input sequence are fused position-wise with the relevant features to obtain the sequence of per-word features.
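A minimal sketch of this word-feature sequence construction, assuming fusion by position-wise concatenation: each token's embedding is spliced with a feature vector looked up for that token. The dimensions, lookup tables, and feature encoding here are illustrative assumptions.

```python
import numpy as np

EMB_DIM, FEAT_DIM = 4, 2
rng = np.random.default_rng(0)
# Stand-ins for a trained embedding table and a knowledge-graph lookup.
word_emb = {"play": rng.normal(size=EMB_DIM), "nocturne": rng.normal(size=EMB_DIM)}
kg_feat = {"play": np.array([1.0, 0.0]),       # e.g. part of speech: verb
           "nocturne": np.array([0.0, 1.0])}   # e.g. entity type: song

def word_feature_sequence(tokens):
    # Position-wise fusion (x_i, f_i): concatenate embedding and feature.
    return [np.concatenate([word_emb[t], kg_feat[t]]) for t in tokens]

seq = word_feature_sequence(["play", "nocturne"])
```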
In step S12, the word-feature sequence is input into a neural network model, such as a bidirectional long short-term memory network (BLSTM), to obtain the first vector representation, which models the context information in the input word-feature sequence. Those skilled in the art will understand that the BLSTM described above is exemplary; other neural network models capable of achieving similar or better functionality may also be used.
Specifically, in the bidirectional long short-term memory network (BLSTM), the forward hidden sequence and the backward hidden sequence are concatenated to obtain the first vector representation. For example, at each time step i, the hidden state may be expressed as hi.
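The per-step concatenation can be sketched as follows. The recurrences here are deliberate stand-ins (simple decayed sums), not a real LSTM cell; only the structure — a forward pass, a backward pass, and h_i = [fwd_i ; bwd_i] per time step — reflects the text above.

```python
import numpy as np

def toy_rnn(xs, reverse=False):
    # Placeholder recurrence standing in for an LSTM update.
    hs, h = [], np.zeros_like(xs[0])
    for x in (reversed(xs) if reverse else xs):
        h = 0.5 * h + 0.5 * x
        hs.append(h)
    return hs[::-1] if reverse else hs  # backward pass re-aligned to time order

def bilstm_like(xs):
    fwd, bwd = toy_rnn(xs), toy_rnn(xs, reverse=True)
    # First vector representation: concatenate forward and backward states.
    return [np.concatenate([f, b]) for f, b in zip(fwd, bwd)]

xs = [np.ones(3), 2 * np.ones(3)]
hidden = bilstm_like(xs)
```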
Then, in step S13, the first vector representation is respectively input into an intent attention layer and a slot attention layer, so as to obtain the intent attention vector and the slot attention vector.
Finally, in step S14, the first vector representation, the intent attention vector, and the slot attention vector are input into an output layer to obtain the semantic parsing result corresponding to the input sequence, wherein the semantic parsing result includes an intent prediction result and a slot-filling result.
For example, for the input sequence "play XYZ's Daphne Odora" (where XYZ stands for a singer's name), the predicted intent of the input sentence is "music on demand", and the corresponding slots include "singer name" and "song title": "XYZ" fills the "singer name" slot, and "Daphne Odora" fills the "song title" slot.
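When the output layer is a CRF, decoding the slot tags typically amounts to a Viterbi search over per-token scores plus tag-transition scores. The following is a sketch under that assumption; the tag set, emission scores, and zero transition matrix are illustrative, not parameters from the patent.

```python
import numpy as np

TAGS = ["O", "B-singer", "B-song"]

def viterbi(emissions, transitions):
    # emissions: (T, K) per-token tag scores; transitions: (K, K) tag-to-tag scores.
    T, K = emissions.shape
    score = emissions[0].copy()
    back = np.zeros((T, K), dtype=int)
    for t in range(1, T):
        total = score[:, None] + transitions + emissions[t][None, :]
        back[t] = total.argmax(axis=0)   # best previous tag for each current tag
        score = total.max(axis=0)
    path = [int(score.argmax())]
    for t in range(T - 1, 0, -1):        # follow back-pointers
        path.append(int(back[t][path[-1]]))
    return [TAGS[i] for i in reversed(path)]

emissions = np.array([[0.1, 2.0, 0.1],   # token 1: likely B-singer
                      [2.0, 0.1, 0.1],   # token 2: likely O
                      [0.1, 0.1, 2.0]])  # token 3: likely B-song
tags = viterbi(emissions, np.zeros((3, 3)))
```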
Fig. 2 is the schematic flow chart according to the attention force vector generating process of one embodiment of the disclosure.
In step S21, in the intent attention layer and the slot attention layer, attention weights are obtained by using the attention mechanism.
In steps S22 and S23, the hidden states hi at each time step i, weighted by their attention weights, are summed to obtain the slot attention vector, and the intent attention vector is obtained using the hidden state value of the last time step with its corresponding attention weight.
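A sketch of these two steps: softmax weights over the hidden states give the slot attention vector as a weighted sum, while the intent attention vector is derived from the last time step's state. The scoring function (dot product with a query vector) is an assumption; the patent does not specify how the weights are scored.

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def attention_vectors(hidden, query):
    H = np.stack(hidden)                   # (T, d) hidden states
    weights = softmax(H @ query)           # one attention weight per time step
    slot_vec = weights @ H                 # weighted sum -> slot attention vector
    intent_vec = weights[-1] * hidden[-1]  # last step's state, attention-weighted
    return weights, slot_vec, intent_vec

hidden = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]
w, slot_vec, intent_vec = attention_vectors(hidden, np.array([1.0, 1.0]))
```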
Then, in step S24, a weighted feature relevant to each hidden state is obtained based on the slot attention vector and the intent attention vector, for example through a slot gate (slot-gated mechanism).
Then, the first vector representation, the slot attention vector, and the weighted feature relevant to each hidden state, obtained by fusing the slot attention vector with the intent attention vector, are used as the input of the output layer (for example, a conditional random field layer) to obtain the slot-filling result. Here, fusing the slot attention vector with the intent attention vector models the relationship between the intent and the semantic slots, improving the performance of both intent prediction and slot filling.
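The fusion step can be sketched with a slot-gate of the form g = sum(v · tanh(c_slot + W·c_intent)), in which the intent context modulates the slot context before the output layer. This gate form follows the general slot-gated modeling idea; the parameters v and W and the way the gate is applied to each hidden state are stand-in assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
d = 4
v, W = rng.normal(size=d), rng.normal(size=(d, d))  # stand-in gate parameters

def slot_gate(c_slot, c_intent):
    # Scalar gate coupling the slot and intent attention vectors.
    return float(np.sum(v * np.tanh(c_slot + W @ c_intent)))

def gated_features(hidden, c_slot, c_intent):
    g = slot_gate(c_slot, c_intent)
    # Weighted feature per hidden state: the gated slot context added to h_i.
    return [h + c_slot * g for h in hidden]

hidden = [rng.normal(size=d) for _ in range(3)]
feats = gated_features(hidden, rng.normal(size=d), rng.normal(size=d))
```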
According to a further embodiment of the disclosure, a semantic parsing apparatus 30 is provided. Fig. 3 shows the semantic parsing apparatus according to one embodiment of the disclosure, comprising: a word-feature sequence generation module 31 for obtaining features relevant to each word in an input sequence and obtaining a sequence of per-word features; a first-vector-representation generation module 32 for generating a first vector representation of the input sequence based on that sequence; an attention vector generation module 33 for generating an intent attention vector and a slot attention vector based on the first vector representation; and a semantic-parsing-result generation module 34 for obtaining, from the first vector representation, the intent attention vector, and the slot attention vector, a semantic parsing result corresponding to the input sequence, wherein the semantic parsing result includes an intent prediction result and a slot-filling result.
According to a further embodiment of the disclosure, an attention vector generating apparatus 40 is provided as a specific implementation within the semantic parsing apparatus 30. Fig. 4 shows the attention vector generating apparatus 40 according to another embodiment of the disclosure. It comprises: an attention weight generation module 41 for generating an attention weight corresponding to each hidden state; a slot attention vector generation module 42 for obtaining the slot attention vector based on each hidden state and its corresponding attention weight; an intent attention vector generation module 43 for obtaining the intent attention vector based on the hidden state of the last time step and its corresponding attention weight; and a joint weighted feature generation module 44 for obtaining a weighted feature relevant to each hidden state based on the intent attention vector and the slot attention vector.
The processes executed in the above modules correspond respectively to the processes specifically described in the above method.
The disclosure also provides an electronic device. As shown in Fig. 5, the device comprises: a communication interface 1000, a memory 2000, and a processor 3000. The communication interface 1000 is used for communicating with external devices for data interaction. The memory 2000 stores a computer program executable on the processor 3000. The processor 3000 implements the method in the above embodiments when executing the computer program. There may be one or more of each of the memory 2000 and the processor 3000.
The memory 2000 may include high-speed RAM, and may further include non-volatile memory, for example at least one magnetic disk memory.
If the communication interface 1000, the memory 2000, and the processor 3000 are implemented independently, they may be connected to each other by a bus and communicate with each other. The bus may be an Industry Standard Architecture (ISA) bus, a Peripheral Component Interconnect (PCI) bus, an Extended Industry Standard Architecture (EISA) bus, or the like. The bus may be divided into an address bus, a data bus, a control bus, and so on. For convenience of representation, only one thick line is used in the figure, which does not mean that there is only one bus or only one type of bus.
Optionally, in a specific implementation, if the communication interface 1000, the memory 2000, and the processor 3000 are integrated on one chip, they may communicate with each other through an internal interface.
The present invention mainly addresses, in combination with practical business needs, the shortcomings of existing word-segmentation algorithms by combining machine-learning algorithms with domain-customized dictionaries, which on the one hand can improve segmentation accuracy and on the other hand can improve domain adaptability for practical application scenarios.
Any process or method description in a flow chart or otherwise described herein may be understood to represent a module, segment, or portion of code including one or more executable instructions for implementing specific logical functions or steps of the process, and the scope of the preferred embodiments of the disclosure includes other implementations in which functions may be executed out of the order shown or discussed, including in a substantially simultaneous manner or in the reverse order depending on the functions involved, as should be understood by those skilled in the art to which the embodiments of the disclosure belong. The processor performs the various methods and processes described above. For example, the method embodiments in the disclosure may be implemented as a software program tangibly embodied in a machine-readable medium, such as a memory. In some embodiments, some or all of the software program may be loaded and/or installed via the memory and/or the communication interface. When the software program is loaded into the memory and executed by the processor, one or more steps of the methods described above may be performed. Alternatively, in other embodiments, the processor may be configured to perform one of the above methods by any other suitable means (for example, by means of firmware).
The logic and/or steps represented in the flow charts or otherwise described herein may be embodied in any readable storage medium for use by, or in connection with, an instruction execution system, apparatus, or device (such as a computer-based system, a system including a processor, or another system that can fetch and execute instructions from an instruction execution system, apparatus, or device).
For the purposes of this specification, a "readable storage medium" can be any apparatus that can contain, store, communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. More specific examples (a non-exhaustive list) of readable storage media include: an electrical connection (electronic device) having one or more wires, a portable computer diskette (magnetic device), a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a fiber-optic device, and a portable compact disc read-only memory (CDROM). In addition, the readable storage medium can even be paper or another suitable medium on which the program is printed, since the program can be obtained electronically, for example by optically scanning the paper or other medium and then editing, interpreting, or otherwise processing it in a suitable manner if necessary, and then stored in a memory.
It should be understood that each part of the disclosure may be implemented in hardware, software, or a combination thereof. In the above embodiments, multiple steps or methods may be implemented with software stored in a memory and executed by a suitable instruction execution system. For example, if implemented in hardware, as in another embodiment, any one or a combination of the following technologies known in the art may be used: a discrete logic circuit having logic gate circuits for implementing logic functions on data signals, an application-specific integrated circuit having suitable combinational logic gate circuits, a programmable gate array (PGA), a field-programmable gate array (FPGA), and the like.
Those skilled in the art will understand that all or part of the steps of the above embodiment methods may be completed by a program instructing relevant hardware; the program may be stored in a readable storage medium and, when executed, includes one of the steps of the method embodiments or a combination thereof.
In addition, the functional units in the embodiments of the disclosure may be integrated into one processing module, or each unit may exist physically alone, or two or more units may be integrated into one module. The integrated module may be implemented in the form of hardware or in the form of a software functional module. If the integrated module is implemented in the form of a software functional module and sold or used as an independent product, it may also be stored in a readable storage medium. The storage medium may be a read-only memory, a magnetic disk, an optical disc, or the like.
In the description of this specification, reference to the terms "one embodiment/mode", "some embodiments/modes", "example", "specific example", "some examples", or the like means that a specific feature, structure, material, or characteristic described in connection with the embodiment/mode or example is included in at least one embodiment/mode or example of the application. In this specification, schematic expressions of the above terms do not necessarily refer to the same embodiment/mode or example. Moreover, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments/modes or examples. In addition, without mutual contradiction, those skilled in the art may combine the different embodiments/modes or examples described in this specification and the features thereof.
In addition, the terms "first" and "second" are used for descriptive purposes only and cannot be understood as indicating or implying relative importance or implicitly indicating the number of the indicated technical features. Thus, a feature defined with "first" or "second" may explicitly or implicitly include at least one such feature. In the description of the application, "plurality" means at least two, for example two or three, unless otherwise specifically defined.
Those skilled in the art will understand that the above embodiments are only used to clearly illustrate the disclosure and are not intended to limit its scope. For those skilled in the art, other variations or modifications may be made on the basis of the above disclosure, and these variations or modifications remain within the scope of the disclosure.

Claims (10)

1. A semantic parsing method, characterized by comprising:
obtaining features relevant to each word in an input sequence to obtain a sequence of per-word features;
generating a first vector representation of the input sequence based on the sequence of per-word features;
generating an intent attention vector and a slot attention vector based on the first vector representation; and
obtaining, from the first vector representation, the intent attention vector, and the slot attention vector, a semantic parsing result corresponding to the input sequence,
wherein the semantic parsing result includes an intent prediction result and a slot-filling result.
2. The method of claim 1, characterized in that the sequence of per-word features is obtained by fusing the word embeddings of the input sequence with the relevant features.
3. The method of claim 1 or 2, characterized in that generating the first vector representation of the input sequence based on the sequence of per-word features comprises:
inputting the sequence of per-word features into a bidirectional long short-term memory network, and concatenating the forward hidden sequence and the backward hidden sequence to obtain the first vector representation.
4. The method of any one of claims 1 to 3, characterized in that generating the intent attention vector and the slot attention vector based on the first vector representation comprises:
computing an attention weight for each hidden state in the first vector representation,
obtaining the slot attention vector based on each hidden state and its corresponding attention weight, and
obtaining the intent attention vector based on the hidden state of the last time step and its corresponding attention weight.
5. The method of claim 3 or 4, characterized by further comprising:
obtaining a weighted feature relevant to each hidden state based on the intent attention vector and the slot attention vector, and
obtaining the slot-filling result based on the weighted feature, the first vector representation, and the slot attention vector.
6. The method of claim 5, characterized by further comprising:
inputting the weighted feature, the first vector representation, and the slot attention vector into a conditional random field layer to obtain the slot-filling result.
7. The method of any one of claims 1 to 6, characterized in that obtaining features relevant to each word in the input sequence and obtaining the sequence of per-word features comprises:
obtaining the features relevant to each word in the input sequence from a knowledge graph, and fusing the word embeddings of the input sequence position-wise with the relevant features to obtain the sequence of per-word features.
8. A semantic parsing apparatus, characterized by comprising:
a word-feature sequence generation module for obtaining features relevant to each word in an input sequence and obtaining a sequence of per-word features;
a first-vector-representation generation module for generating a first vector representation of the input sequence based on the sequence of per-word features;
an attention vector generation module for generating an intent attention vector and a slot attention vector based on the first vector representation; and
a semantic-parsing-result generation module for obtaining, from the first vector representation, the intent attention vector, and the slot attention vector, a semantic parsing result corresponding to the input sequence,
wherein the semantic parsing result includes an intent prediction result and a slot-filling result.
9. An electronic device, characterized by comprising:
a memory storing execution instructions; and
a processor that executes the execution instructions stored in the memory, so that the processor performs the method of any one of claims 1 to 7.
10. A readable storage medium, characterized in that execution instructions are stored thereon which, when executed by a processor, implement the method of any one of claims 1 to 7.
CN201910284812.6A 2019-04-10 2019-04-10 Semantic analysis method, device, equipment and storage medium Active CN110008476B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910284812.6A CN110008476B (en) 2019-04-10 2019-04-10 Semantic analysis method, device, equipment and storage medium


Publications (2)

Publication Number Publication Date
CN110008476A true CN110008476A (en) 2019-07-12
CN110008476B CN110008476B (en) 2023-04-28

Family

ID=67170734

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910284812.6A Active CN110008476B (en) 2019-04-10 2019-04-10 Semantic analysis method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN110008476B (en)


Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107315737A * 2017-07-04 2017-11-03 北京奇艺世纪科技有限公司 Semantic logic processing method and system
CN108920666A (en) * 2018-07-05 2018-11-30 苏州思必驰信息科技有限公司 Searching method, system, electronic equipment and storage medium based on semantic understanding
US20180357225A1 (en) * 2017-06-13 2018-12-13 Beijing Baidu Netcom Science And Technology Co., Ltd. Method for generating chatting data based on artificial intelligence, computer device and computer-readable storage medium
CN109241524A (en) * 2018-08-13 2019-01-18 腾讯科技(深圳)有限公司 Semantic analysis method and device, computer readable storage medium, electronic equipment


Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110674314A (en) * 2019-09-27 2020-01-10 北京百度网讯科技有限公司 Sentence recognition method and device
CN110674314B (en) * 2019-09-27 2022-06-28 北京百度网讯科技有限公司 Sentence recognition method and device
CN110853626A (en) * 2019-10-21 2020-02-28 成都信息工程大学 Bidirectional attention neural network-based dialogue understanding method, device and equipment
CN111046674A (en) * 2019-12-20 2020-04-21 科大讯飞股份有限公司 Semantic understanding method and device, electronic equipment and storage medium
CN111046674B (en) * 2019-12-20 2024-05-31 科大讯飞股份有限公司 Semantic understanding method and device, electronic equipment and storage medium
WO2021190259A1 (en) * 2020-03-23 2021-09-30 华为技术有限公司 Slot identification method and electronic device
WO2022198750A1 (en) * 2021-03-26 2022-09-29 南京邮电大学 Semantic recognition method
JP2023522502A (en) * 2021-03-26 2023-05-31 南京郵電大学 Semantic recognition method
JP7370033B2 (en) 2021-03-26 2023-10-27 南京郵電大学 Semantic recognition method

Also Published As

Publication number Publication date
CN110008476B (en) 2023-04-28

Similar Documents

Publication Publication Date Title
CN110008476A (en) Semantic analytic method, device, equipment and storage medium
Holzapfel et al. Ethical dimensions of music information retrieval technology
Parekh Principles of multimedia
CN102483698B (en) The client tier checking of dynamic WEB application
CN109543022A (en) Text error correction method and device
US11264006B2 (en) Voice synthesis method, device and apparatus, as well as non-volatile storage medium
Clemens Prosodic noun incorporation: The relationship between prosody and argument structure in Niuean
CN110308902A (en) Document generating method, device, equipment and storage medium
CN109599095A (en) A kind of mask method of voice data, device, equipment and computer storage medium
US11200881B2 (en) Automatic translation using deep learning
US11200366B2 (en) Using classifications from text to determine instances of graphical element types to include in a template layout for digital media output
JP2023547802A (en) Answer span correction
CN108304387A (en) The recognition methods of noise word, device, server group and storage medium in text
CN110297633A (en) Code conversion method, device, equipment and storage medium
CN107239482B (en) A kind of processing method converting the image into music and server
US20100131849A1 (en) Method and apparatus for providing advertising moving picture
Lopes et al. EvoDesigner: Towards aiding creativity in graphic design
US20210150215A1 (en) Intelligent performance rating
CN110990531A (en) Text emotion recognition method and device
KR102034633B1 (en) System for sharing knowhow and method for providing service of the same
US20220164680A1 (en) Environment augmentation based on individualized knowledge graphs
JP7058438B2 (en) Dialogue response system, model learning device and dialogue device
CN114443916A (en) Supply and demand matching method and system for test data
Zhong et al. Smoodi: Stylized motion diffusion model
CN110119204A (en) The processing method of any VR/AR/MR device can be matched by editing a VR/AR/MR content

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant