WO2012158572A2 - Exploiting query click logs for domain detection in spoken language understanding - Google Patents

Exploiting query click logs for domain detection in spoken language understanding

Info

Publication number
WO2012158572A2
Authority
WO
WIPO (PCT)
Prior art keywords
query
log data
domain
link
data
Prior art date
Application number
PCT/US2012/037668
Other languages
English (en)
French (fr)
Other versions
WO2012158572A3 (en)
Inventor
Dilek Hakkani-Tur
Larry Paul Heck
Gokhan Tur
Original Assignee
Microsoft Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US13/234,186 external-priority patent/US20120290509A1/en
Priority claimed from US13/234,202 external-priority patent/US20120290293A1/en
Application filed by Microsoft Corporation filed Critical Microsoft Corporation
Priority to EP12786677.0A priority Critical patent/EP2707808A4/en
Priority to CN201280023613.6A priority patent/CN103534696B/zh
Publication of WO2012158572A2 publication Critical patent/WO2012158572A2/en
Publication of WO2012158572A3 publication Critical patent/WO2012158572A3/en

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00: Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/90: Details of database functions independent of the retrieved data types
    • G06F16/95: Retrieval from the web
    • G06F16/951: Indexing; Web crawling techniques
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F40/00: Handling natural language data
    • G06F40/40: Processing or translation of natural language

Definitions

  • [001] Search queries mined from search engine query logs may be analyzed to improve domain detection in spoken language understanding (SLU) applications.
  • Three key tasks in such understanding applications are domain classification, intent determination, and slot filling.
  • Domain classification is often completed first in SLU systems, serving as a top-level triage for subsequent processing.
  • Domain detection may be framed as a classification problem. Given a user utterance or sentence x_t, a set y_t ⊆ C of semantic domain labels may be associated with x_t, where C is the finite set of domains covered. To perform this classification task, the class with the maximum conditional probability, ŷ_t = argmax_{y ∈ C} P(y | x_t), may be selected.
  • Supervised classification methods may be used to estimate these conditional probabilities, and each domain class may be trained from a set of labeled utterances. Collecting and annotating naturally spoken utterances to train these domain classes is often costly, representing a significant barrier to deployment in terms of both effort and finances.
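  • As a concrete illustration of this formulation, the sketch below trains a simple classifier on a handful of labeled utterances and selects the domain with the maximum conditional probability. The toy utterances, domain names, and the bag-of-words logistic-regression pipeline are illustrative assumptions, not part of the disclosure.

```python
# Minimal sketch of domain detection as y_hat = argmax_{y in C} P(y | x_t),
# assuming scikit-learn and an invented toy training set.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

labeled_utterances = [
    ("book me a room at the holiday inn", "hotels"),
    ("i need a hotel with free parking", "hotels"),
    ("find a cheap flight to boston", "flights"),
    ("show me italian restaurants nearby", "restaurants"),
]
texts, domains = zip(*labeled_utterances)

# P(y | x_t) is estimated by a supervised classifier over bag-of-words features.
classifier = make_pipeline(CountVectorizer(), LogisticRegression(max_iter=1000))
classifier.fit(texts, domains)

utterance = "i need a place to stay tonight"
posteriors = classifier.predict_proba([utterance])[0]
predicted_domain = classifier.classes_[posteriors.argmax()]  # argmax over the domain set C
print(predicted_domain, dict(zip(classifier.classes_, posteriors.round(3))))
```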
  • Domain detection training in a spoken language understanding system may be provided.
  • Log data associated with a search engine, each entry associated with a search query, may be received.
  • A domain label for each search query may be identified, and the domain label and link data may be provided to a training set for a spoken language understanding model.
  • Fig. 1 is a block diagram of an operating environment.
  • Fig. 2 is a flow chart of a method for providing domain detection training.
  • Fig. 3 is a flow chart of a subroutine of the method of Fig. 2 for classifying domain labels.
  • Fig. 4 is a block diagram of a computing device.
  • Embodiments of the present invention may provide for a system and/or method for exploiting query click logs in domain detection of spoken language utterances.
  • The abundance of implicitly labeled web search queries in search engines may be leveraged to aid in training domain detection classes.
  • Large-scale search engines such as Bing® or Google® log more than 100M search queries per day.
  • Each query in the log may be associated with a set of Uniform Resource Locators (URLs) that were clicked after the users entered the query.
  • This user click information may be used to infer domain class labels and, therefore, may provide (potentially noisy) supervision in training domain classifiers. For example, the queries of two users who click on the same URL (e.g., http://www.hotels.com) are probably from the same domain (e.g., "hotels").
  • A clicked URL category may be assigned as the domain label of a user query.
  • For example, the label "hotels" may be assigned to the user query "Holiday Inn and Suites" when the user has clicked on http://www.hotels.com.
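  • As a tiny illustration of this click-based labeling, the sketch below maps a clicked URL to a domain category; the URL-to-category table is an invented example rather than data from the disclosure.

```python
# Sketch of inferring a (possibly noisy) domain label for a query from the clicked URL.
from urllib.parse import urlparse

URL_CATEGORIES = {            # invented host-to-domain table for illustration
    "www.hotels.com": "hotels",
    "www.opentable.com": "restaurants",
    "www.expedia.com": "flights",
}

def click_domain_label(query, clicked_url):
    host = urlparse(clicked_url).netloc
    return query, URL_CATEGORIES.get(host)   # None when the host is uncategorized

print(click_domain_label("Holiday Inn and Suites", "http://www.hotels.com"))
# -> ('Holiday Inn and Suites', 'hotels')
```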
  • Click data may be noisy and occur with low frequency.
  • It may also be useful to estimate successful clicks by mining query click logs to gather the set of URLs clicked on by the people who searched using the same exact query.
  • Query entropy, dwell times, and session length may be evaluated for mining high-quality clicks.
  • User action patterns and dwell time may be used to estimate successful search sessions.
  • Query entropy and frequency may be integrated with other features for domain detection, such as the probabilities assigned by a domain detection model trained on labeled data, to sample high-quality clicks, both for adding examples to the training set and for pre-sampling the data for use in supervised classifier training and/or semi- and lightly-supervised learning methods such as label propagation.
  • A label propagation algorithm may transfer domain annotations from labeled natural language (NL) utterances to unlabeled web search queries. Click information may also be considered as noisy supervision, and the domain label extracted from the clicked URL category may be incorporated into the label propagation algorithm.
  • Query click data may include logs of search engine users' queries and the links they click from a list of sites returned by the search engine. Some click data, however, is very noisy and may include links that were clicked on almost randomly. A sampling measure may be applied to select queries and domain labels from the clicked URLs for use in domain detection. Supervision from the noisy user clicks may then be incorporated into the label propagation algorithm that transfers domain labels from labeled examples to the sampled search queries.
  • A set of queries whose users clicked on the URLs that are related to target domain categories may be extracted.
  • The query click logs may then be mined to download instances of these search queries and the set of links that were clicked on by search engine users who entered the same query.
  • Criteria for sampling a subset of the queries may comprise query frequency, query (click) entropy, and/or query length.
  • Query frequency may refer to the number of times a query has been searched by different users in a given time frame.
  • Users of a spoken dialog system may ask the same things as web search users; hence, adding frequent search queries to the domain detection training set may help to improve its accuracy.
  • Query (click) entropy aims to measure the diversity of the URLs clicked on by the users of a query q, and may be computed according to Equation 1: H(q) = -Σ_{u ∈ U_q} P(u | q) log P(u | q), where U_q is the set of URLs clicked for query q and P(u | q) is the fraction of clicks for q that landed on URL u.
  • Low click entropy may be a good indicator of the correctness of the domain category estimated from the query click label.
  • Query length may refer to the number of words in the query.
  • The number of words in a query may comprise a good indicator of natural language utterances, and search queries that include natural language utterances instead of simply a sequence of keywords may be more useful as training data for SLU domain classification.
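  • The three sampling criteria above can be computed directly from an aggregated click log, as in the sketch below. The in-memory log format, the thresholds, and the use of the natural logarithm are assumptions made for illustration; the entropy follows the definition of Equation 1.

```python
# Sketch of sampling queries by frequency, click entropy, and query length.
import math
from collections import Counter, defaultdict

# Each record: (query, clicked_url), aggregated over many users and sessions.
click_log = [
    ("holiday inn and suites", "http://www.hotels.com"),
    ("holiday inn and suites", "http://www.hotels.com"),
    ("holiday inn and suites", "http://www.holidayinn.com"),
    ("cheap flights to boston", "http://www.expedia.com"),
    ("weather", "http://www.weather.com"),
]

clicks = defaultdict(Counter)            # query -> Counter of clicked URLs
for query, url in click_log:
    clicks[query][url] += 1

def click_entropy(url_counts):
    """Equation 1: H(q) = -sum_u P(u|q) log P(u|q) over the URLs clicked for q."""
    total = sum(url_counts.values())
    return -sum((c / total) * math.log(c / total) for c in url_counts.values())

def sample_queries(clicks, min_freq=2, max_entropy=1.0, min_words=3):
    """Keep frequent, low-entropy, natural-language-like queries with their top URL."""
    sampled = []
    for query, url_counts in clicks.items():
        frequency = sum(url_counts.values())       # query frequency
        entropy = click_entropy(url_counts)        # query (click) entropy
        length = len(query.split())                # query length in words
        if frequency >= min_freq and entropy <= max_entropy and length >= min_words:
            sampled.append((query, url_counts.most_common(1)[0][0]))
    return sampled

print(sample_queries(clicks))
# -> [('holiday inn and suites', 'http://www.hotels.com')]
```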
  • The sampled queries may be added, with the domain labels estimated from the clicked URLs, to a labeled training set, or these sampled examples may be used for semi-supervised learning approaches such as self-training and/or label propagation.
  • The label propagation algorithm may be extended to exploit the domain information from the clicked URLs.
  • Self-training may involve training an initial classifier from existing manually labeled examples.
  • The initial classifier may be used to automatically assign labels to a larger set of unlabeled examples. Then the examples that were assigned classes with high posterior probabilities may be added to the training data.
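  • A minimal self-training loop along these lines is sketched below; it assumes a scikit-learn-style classifier exposing predict_proba (such as the pipeline shown earlier), and the confidence threshold is an illustrative choice.

```python
# Sketch of self-training: auto-label unlabeled queries with an initial classifier
# and add back only the predictions with high posterior probability.
import numpy as np

def self_train(classifier, texts, labels, unlabeled_texts, threshold=0.9):
    classifier.fit(texts, labels)                        # initial classifier
    posteriors = classifier.predict_proba(unlabeled_texts)
    confident = posteriors.max(axis=1) >= threshold
    new_texts = [t for t, keep in zip(unlabeled_texts, confident) if keep]
    new_labels = classifier.classes_[posteriors.argmax(axis=1)][confident]
    # Retrain on the union of manually labeled and confidently auto-labeled data.
    classifier.fit(list(texts) + new_texts, list(labels) + list(new_labels))
    return classifier
```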
  • Label propagation may comprise a graph-based, iterative algorithm commonly used for semi-supervised learning.
  • The algorithm may propagate the labels through a dataset along high-density areas defined by unlabeled examples, in a manner similar to the k-Nearest-Neighbor (kNN) classification algorithm.
  • LP may enable the classifier to see samples that have no phrases in common with the training set. For example, if the training set has the phrase "hotel" but not "suites", the example query above, "holiday inn and suites", may propagate the label to another query, say "ocean-view suites", which will propagate it to others.
  • The LP algorithm converges and has a closed-form solution for easier implementation.
  • Let {(x_1, y_1), ..., (x_l, y_l)} be labeled samples with label set Y_L = {y_1, ..., y_l}, and let x_{l+1}, ..., x_{l+u} be unlabeled samples whose label set Y_U = {y_{l+1}, ..., y_{l+u}} is unknown.
  • The goal of label propagation may be to estimate Y_U from X = {x_1, ..., x_{l+u}} and Y_L.
  • A fully connected graph may be created using the samples as nodes.
  • The edges between the nodes, w_ij, may be computed from the Euclidean distance with a control parameter σ according to Equation 3: w_ij = exp( -Σ_{d=1..D} (x_i^d - x_j^d)^2 / σ^2 ).
  • x_i^d may comprise the value of the d-th feature of sample x_i.
  • The graph may then be represented using an (l + u) × (l + u) probabilistic transition matrix T, computed according to Equation 4: T_ij = P(j → i) = w_ij / Σ_{k=1..l+u} w_kj.
  • A corresponding (l + u) × |C| matrix Y may also be defined for the labels.
  • The labels for the unlabeled samples may initially be set randomly before iterating as follows. First, labels may be propagated one step (Y ← TY). Next, the rows of Y may be normalized to maintain a probability distribution, and the labels of the labeled data are then restored (clamped). This sequence converges to the fixed solution of Equation 5: Y_U = (I - T̄_uu)^(-1) T̄_ul Y_L, where T̄ is the row-normalized form of T (T̄_ij = T_ij / Σ_k T_ik), and T̄_ul and T̄_uu are the bottom-left and bottom-right parts of T̄, obtained by splitting T̄ after the l-th row and column into four sub-matrices.
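  • The propagate/normalize/clamp iteration described above can be written compactly with NumPy. The following is a schematic sketch, assuming dense feature vectors, one-hot label rows, and an illustrative σ; it implements the iterative form rather than the closed-form solution of Equation 5.

```python
# Sketch of graph-based label propagation: propagate (Y <- TY), row-normalize, clamp.
import numpy as np

def label_propagation(X_labeled, Y_labeled, X_unlabeled, sigma=1.0, iterations=1000):
    X = np.vstack([X_labeled, X_unlabeled])            # (l + u) x D feature matrix
    l, n = len(X_labeled), len(X)

    # Equation 3: w_ij = exp(-sum_d (x_i^d - x_j^d)^2 / sigma^2) on a fully connected graph.
    sq_dists = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=2)
    W = np.exp(-sq_dists / sigma ** 2)

    # Equation 4: T_ij = P(j -> i) = w_ij / sum_k w_kj (column-normalized weights).
    T = W / W.sum(axis=0, keepdims=True)

    # (l + u) x |C| label matrix; unlabeled rows start as a uniform distribution.
    num_classes = Y_labeled.shape[1]
    Y = np.vstack([Y_labeled, np.full((n - l, num_classes), 1.0 / num_classes)])

    for _ in range(iterations):
        Y = T @ Y                                      # propagate labels one step
        Y /= Y.sum(axis=1, keepdims=True)              # keep each row a distribution
        Y[:l] = Y_labeled                              # clamp (restore) the labeled data
    return Y[l:]                                       # label estimates for the unlabeled samples
```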
  • User-clicked URLs may provide a noisy label for each query.
  • The domain category assigned to each example by LP and the domain category of the clicked URL may therefore be checked for agreement, and those examples with high-probability labels from LP that also agree with the click label may be added to a training data set.
  • A category of the clicked URL may also be used as a feature in the representation of a query. This may allow labels to propagate with higher weight in LP between queries that have the same click labels, thereby extending feature transformation approaches such as supervised latent Dirichlet allocation (sLDA), which incorporates the correct labels, and factored latent semantic analysis (fLSA), which supports the use of additional features.
  • |C| binary features may be included, one for each domain, resulting in a (D + |C|)-dimensional feature space.
  • A value of 1 may be assigned to the feature corresponding to the click label of the query, and 0 to all the others. This may result in a straightforward extension of the computation of the Euclidean distance with noisy supervision, as illustrated by Equation 6: w_ij = exp( -Σ_{d=1..D+|C|} (x_i^d - x_j^d)^2 / σ^2 ).
  • x_i^{D+k} may comprise a binary feature indicating a click on a URL of the k-th domain.
  • The LP algorithm may then be run, and the top-scoring examples for each domain may be added to the classification training data.
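  • One way to realize the |C| additional click-label features and the agreement check is sketched below: a one-hot block over the domain set is appended to each query's feature vector before LP (so that Equation 6 is simply Equation 3 over the augmented vectors), and only high-probability LP labels that match the click label are kept. The domain list, weight, and threshold are assumptions for illustration.

```python
# Sketch of click-label feature augmentation and the LP/click-label agreement check.
import numpy as np

DOMAINS = ["hotels", "flights", "restaurants"]        # illustrative domain set C

def augment_with_click_label(features, click_domain, weight=1.0):
    """Append a one-hot block over C encoding the domain of the clicked URL."""
    click_block = np.zeros(len(DOMAINS))
    click_block[DOMAINS.index(click_domain)] = weight
    return np.concatenate([features, click_block])    # D + |C| dimensions

def select_agreeing_examples(lp_posteriors, click_domains, threshold=0.8):
    """Keep examples whose confident LP label agrees with the (noisy) click label."""
    selected = []
    for i, row in enumerate(lp_posteriors):
        lp_domain = DOMAINS[int(np.argmax(row))]
        if row.max() >= threshold and lp_domain == click_domains[i]:
            selected.append((i, lp_domain))
    return selected
```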
  • Fig. 1 is a block diagram of an operating environment 100 for providing a spoken dialog system (SDS) 110.
  • SDS 110 may comprise a labeled data storage 115, a spoken language understanding component 120, and a statistical dialog manager 125.
  • Labeled data 115 may be received from a label propagation system 130 comprising a plurality of session logs 135, such as may be associated with web search sessions, and a session processing module 140.
  • Session processing module 140 may be operative to analyze data from session logs 135 and provide training data comprising domain labels for various search queries to SDS 110.
  • SDS 110 may be operative to interact with a user device 150, such as over a network (not shown).
  • SDS 110 and label propagation system 130 may comprise separate servers in communication via a network and/or may comprise applications, processes, and/or services executing on shared hardware.
  • User device 150 may comprise an electronic communications device such as a computer, laptop, cell phone, tablet, game console and/or other device.
  • User device 150 may be coupled to a capture device 155 that may be operative to record a user and capture spoken words, motions and/or gestures made by the user, such as with a camera and/or microphone.
  • User device 150 may be further operative to capture other inputs from the user such as by a keyboard, touchscreen and/or mouse (not pictured).
  • Capture device 155 may comprise any speech and/or motion detection device capable of detecting the actions of the user.
  • Capture device 155 may comprise a Microsoft® Kinect® motion capture device comprising a plurality of cameras and a plurality of microphones.
  • Fig. 2 is a flow chart setting forth the general stages involved in a method 200 consistent with an embodiment of the invention for providing statistical dialog manager training.
  • Method 200 may be implemented using a computing device 400 as described in more detail below with respect to Fig. 4. Ways to implement the stages of method 200 will be described in greater detail below.
  • Method 200 may begin at starting block 205 and proceed to stage 210 where computing device 400 may receive a plurality of query log data.
  • The query log data may comprise a search query, followed links (e.g., uniform resource locators), non-followed links, and/or link characteristics, such as dwell time, associated with a web search session.
  • Method 200 may then advance to stage 220 where computing device 400 may sample a subset of the plurality of query log data according to one or more of the link characteristics.
  • Label propagation system 130 may analyze link characteristics such as dwell time, query entropy, query frequency, and search query lengths to identify which of the log data comprises high correlations with a target domain.
  • Method 200 may then advance to subroutine 230 where computing device 400 may classify each of the subset of the plurality of query log data into a domain label. For example, a session log comprising a search query of "hotels in Redmond" and a followed link to http://www.hotels.com may be classified in the "hotels" domain. The classification process is described below in greater detail with respect to Fig. 3.
  • Method 200 may then advance to stage 240 where computing device 400 may provide the subset of the plurality of query log data to a spoken language understanding model.
  • Label propagation system 130 may provide the classified data to SDS 110 as training data and/or for use in responding to live queries.
  • Method 200 may then advance to stage 250 where computing device 400 may receive a natural language query from a user.
  • Capture device 155 may record a user query of "I need a place to stay tonight," and provide it, via user device 150, to SDS 110.
  • Method 200 may then advance to stage 260 where computing device 400 may assign a query domain to the natural language query according to the spoken language understanding model. For example, based on labeled log data received from label propagation system 130, the query may be mapped to prior web search queries of users looking for hotel rooms. Such prior queries may be classified in the "hotels" domain, and that data may result in SDM 125 assigning the received query into the same domain.
  • Method 200 may then advance to stage 270 where computing device 400 may provide a query response to the user according to the assigned query domain.
  • SDS 110 may perform a web search of hotels restricted by other information in the query (e.g., needs to have availability "tonight" and/or a presumption that the user is looking for a hotel nearby).
  • Method 200 may then end at stage 275.
  • Fig. 3 is a flow chart setting forth the general stages of subroutine 230 of method 200 consistent with an embodiment of the invention for classifying a domain label.
  • Subroutine 230 may be implemented using computing device 400 as described in more detail below with respect to Fig. 4. Ways to implement the stages of subroutine 230 will be described in greater detail below. Subroutine 230 may begin at starting block 305 and proceed to stage 310 where computing device 400 may identify a plurality of possible domains associated with the link data. For example, session processing module 140 may select a group of target domains for which training data is sought and/or may select all possible domains associated with SDS 110.
  • Subroutine 230 may then advance to stage 320 where computing device 400 may generate a probability associated with each of the plurality of possible domains that the at least one of the plurality of link data is associated with the domain. For example, session processing module 140 may assign a probability that the search terms of the query are associated with each domain used by SLU 120.
  • Subroutine 230 may then advance to stage 330 where computing device 400 may select the classifying domain for the at least one of the plurality of possible link data from the plurality of possible domains. For example, session processing module 140 may select the domain having the highest probability among the plurality of possible domains.
  • Subroutine 230 may then end at stage 335 and return to method 200.
  • An embodiment consistent with the invention may comprise a system for providing domain detection training.
  • The system may comprise a memory storage and a processing unit coupled to the memory storage.
  • The processing unit may be operative to receive a plurality of log data associated with a search engine, wherein each of the plurality of log data is associated with a search query, identify a domain label for the search query of at least one of the plurality of log data, and provide the domain label and the at least one of the plurality of link data to a training set for an understanding model.
  • Another embodiment consistent with the invention may comprise a system for providing domain detection training.
  • The system may comprise a memory storage and a processing unit coupled to the memory storage.
  • The processing unit may be operative to identify a plurality of query log data associated with a target domain label, extract, from each of the plurality of query log data, a search query, at least one followed link, and at least one link characteristic, sample a subset of the plurality of query log data according to the at least one link characteristic, assign the target domain label to each of the subset of the plurality of query log data, and provide the subset of the plurality of query log data to a spoken language understanding model.
  • An embodiment consistent with the invention may comprise a system for providing domain detection training.
  • The system may comprise a memory storage and a processing unit coupled to the memory storage.
  • The processing unit may be operative to receive a plurality of query log data, each comprising at least a search query, at least one followed link, and at least one link characteristic associated with a web search session, sample a subset of the plurality of query log data according to the at least one link characteristic associated with each of the subset of the plurality of query log data, classify each of the subset of the plurality of query log data into a domain label, and provide the subset of the plurality of query log data to a spoken language understanding model.
  • The processing unit may be further operative to receive a natural language query from a user, assign a query domain to the natural language query according to the spoken language understanding model, and provide a query response to the user according to the assigned query domain.
  • Fig. 4 is a block diagram of a system including computing device 400.
  • the aforementioned memory storage and processing unit may be implemented in a computing device, such as computing device 400 of FIG. 4. Any suitable combination of hardware, software, or firmware may be used to implement the memory storage and processing unit.
  • The memory storage and processing unit may be implemented with computing device 400 or with any of the other computing devices 418, in combination with computing device 400.
  • The aforementioned system, device, and processors are examples, and other systems, devices, and processors may comprise the aforementioned memory storage and processing unit, consistent with embodiments of the invention.
  • Computing device 400 may comprise operating environment 100 as described above. Methods described in this specification may operate in other environments and are not limited to computing device 400.
  • A system consistent with an embodiment of the invention may include a computing device, such as computing device 400.
  • Computing device 400 may include at least one processing unit 402 and a system memory 404.
  • System memory 404 may comprise, but is not limited to, volatile memory (e.g., random access memory (RAM)), nonvolatile memory (e.g., read-only memory (ROM)), flash memory, or any combination thereof.
  • System memory 404 may include operating system 405, one or more programming modules 406, and may include SDM 125. Operating system 405, for example, may be suitable for controlling computing device 400's operation.
  • Embodiments of the invention may be practiced in conjunction with a graphics library, other operating systems, or any other application program, and are not limited to any particular application or system. This basic configuration is illustrated in FIG. 4 by those components within dashed line 408.
  • Computing device 400 may have additional features or functionality.
  • Computing device 400 may also include additional data storage devices (removable and/or non-removable) such as, for example, magnetic disks, optical disks, or tape.
  • Such additional storage is illustrated in FIG. 4 by a removable storage 409 and a non-removable storage 410.
  • Computing device 400 may also contain a communication connection 416 that may allow device 400 to communicate with other computing devices 418, such as over a network in a distributed computing environment, for example, an intranet or the Internet.
  • Communication connection 416 is one example of communication media.
  • Computer readable media may include computer storage media.
  • Computer storage media may include volatile and nonvolatile, removable and nonremovable media implemented in any method or technology for storage of information, such as computer readable instructions, data structures, program modules, or other data.
  • System memory 404, removable storage 409, and non-removable storage 410 are all computer storage media examples (i.e., memory storage).
  • Computer storage media may include, but is not limited to, RAM, ROM, electrically erasable read-only memory (EEPROM), flash memory, and other memory technologies.
  • Computing device 400 may also have input device(s) 412 such as a keyboard, a mouse, a pen, a sound input device, a touch input device, etc.
  • Output device(s) 414 such as a display, speakers, a printer, etc. may also be included. The aforementioned devices are examples and others may be used.
  • Computer readable media may also include communication media.
  • Communication media may be embodied by computer readable instructions, data structures, program modules, or other data in a modulated data signal, such as a carrier wave or other transport mechanism, and includes any information delivery media.
  • A modulated data signal may describe a signal that has one or more characteristics set or changed in such a manner as to encode information in the signal.
  • Communication media may include wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, radio frequency (RF), infrared, and other wireless media.
  • Program modules and data files may be stored in system memory 404, including operating system 405.
  • While executing on processing unit 402, programming modules 406 (e.g., statistical dialog manager 125) may perform processes including, for example, one or more of the stages of method 200 described above.
  • Processing unit 402 may also perform other processes.
  • Other programming modules that may be used in accordance with embodiments of the present invention may include electronic mail and contacts applications, word processing applications, spreadsheet applications, database applications, slide presentation applications, drawing or computer-aided application programs, etc.
  • Program modules may include routines, programs, components, data structures, and other types of structures that may perform particular tasks or that may implement particular abstract data types.
  • Embodiments of the invention may be practiced with other computer system configurations, including hand-held devices, multiprocessor systems, microprocessor-based or programmable consumer electronics, minicomputers, mainframe computers, and the like.
  • Embodiments of the invention may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network.
  • Program modules may be located in both local and remote memory storage devices.
  • Embodiments of the invention may be practiced in an electrical circuit comprising discrete electronic elements, packaged or integrated electronic chips containing logic gates, a circuit utilizing a microprocessor, or on a single chip containing electronic elements or microprocessors.
  • Embodiments of the invention may also be practiced using other technologies capable of performing logical operations such as, for example, AND, OR, and NOT, including but not limited to mechanical, optical, fluidic, and quantum technologies.
  • Embodiments of the invention may be practiced within a general-purpose computer or in any other circuits or systems.
  • Embodiments of the invention may be implemented as a computer process (method), a computing system, or as an article of manufacture, such as a computer program product or computer readable media.
  • The computer program product may be a computer storage medium readable by a computer system and encoding a computer program of instructions for executing a computer process.
  • The computer program product may also be a propagated signal on a carrier readable by a computing system and encoding a computer program of instructions for executing a computer process.
  • The present invention may be embodied in hardware and/or in software (including firmware, resident software, micro-code, etc.).
  • Embodiments of the present invention may take the form of a computer program product on a computer-usable or computer-readable storage medium having computer-usable or computer-readable program code embodied in the medium for use by or in connection with an instruction execution system.
  • A computer-usable or computer-readable medium may be any medium that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device.
  • The computer-usable or computer-readable medium may be, for example but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, device, or propagation medium. As more specific examples (a non-exhaustive list), the computer-readable medium may include the following: an electrical connection having one or more wires, a portable computer diskette, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), an optical fiber, and a portable compact disc read-only memory (CD-ROM).
  • The computer-usable or computer-readable medium could even be paper or another suitable medium upon which the program is printed, as the program can be electronically captured via, for instance, optical scanning of the paper or other medium, then compiled, interpreted, or otherwise processed in a suitable manner, if necessary, and then stored in a computer memory.
  • Embodiments of the invention may be practiced via a system-on-a-chip (SOC) where each or many of the components illustrated in Figure 4 may be integrated onto a single integrated circuit.
  • Such an SOC device may include one or more processing units, graphics units, communications units, system virtualization units and various application functionalities, all of which may be integrated (or "burned") onto the chip substrate as a single integrated circuit.
  • The functionality described herein with respect to providing training data for a spoken language understanding system may operate via application-specific logic integrated with other components of the computing device/system on the single integrated circuit (chip).
  • Embodiments of the present invention, for example, are described above with reference to block diagrams and/or operational illustrations of methods, systems, and computer program products according to embodiments of the invention.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Databases & Information Systems (AREA)
  • Data Mining & Analysis (AREA)
  • Health & Medical Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • Computational Linguistics (AREA)
  • General Health & Medical Sciences (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)
PCT/US2012/037668 2011-05-13 2012-05-11 Exploiting query click logs for domain detection in spoken language understanding WO2012158572A2 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP12786677.0A EP2707808A4 (en) 2011-05-13 2012-05-11 Exploiting query click logs for domain detection in spoken language understanding
CN201280023613.6A CN103534696B (zh) 2011-05-13 2012-05-11 Exploiting query click logs for domain detection in spoken language understanding

Applications Claiming Priority (8)

Application Number Priority Date Filing Date Title
US201161485778P 2011-05-13 2011-05-13
US201161485664P 2011-05-13 2011-05-13
US61/485,664 2011-05-13
US61/485,778 2011-05-13
US13/234,186 2011-09-16
US13/234,202 2011-09-16
US13/234,186 US20120290509A1 (en) 2011-05-13 2011-09-16 Training Statistical Dialog Managers in Spoken Dialog Systems With Web Data
US13/234,202 US20120290293A1 (en) 2011-05-13 2011-09-16 Exploiting Query Click Logs for Domain Detection in Spoken Language Understanding

Publications (2)

Publication Number Publication Date
WO2012158572A2 true WO2012158572A2 (en) 2012-11-22
WO2012158572A3 WO2012158572A3 (en) 2013-03-21

Family

ID=47177580

Family Applications (2)

Application Number Title Priority Date Filing Date
PCT/US2012/037667 WO2012158571A2 (en) 2011-05-13 2012-05-11 Training statistical dialog managers in spoken dialog systems with web data
PCT/US2012/037668 WO2012158572A2 (en) 2011-05-13 2012-05-11 Exploiting query click logs for domain detection in spoken language understanding

Family Applications Before (1)

Application Number Title Priority Date Filing Date
PCT/US2012/037667 WO2012158571A2 (en) 2011-05-13 2012-05-11 Training statistical dialog managers in spoken dialog systems with web data

Country Status (3)

Country Link
EP (2) EP2707808A4 (zh)
CN (2) CN103534696B (zh)
WO (2) WO2012158571A2 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120290509A1 (en) * 2011-05-13 2012-11-15 Microsoft Corporation Training Statistical Dialog Managers in Spoken Dialog Systems With Web Data

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DK2994908T3 (da) 2013-05-07 2019-09-23 Veveo Inc Interface for incremental speech input with real-time feedback
US9852136B2 (en) 2014-12-23 2017-12-26 Rovi Guides, Inc. Systems and methods for determining whether a negation statement applies to a current or past query
US10817519B2 (en) * 2016-06-06 2020-10-27 Baidu Usa Llc Automatic conversion stage discovery
KR20190100428A (ko) * 2016-07-19 2019-08-28 Gatebox Inc. Image display device, topic selection method, topic selection program, image display method, and image display program
CN106407333B (zh) * 2016-09-05 2020-03-03 Beijing Baidu Netcom Science and Technology Co., Ltd. Artificial intelligence-based spoken query recognition method and apparatus
CN107291828B (zh) * 2017-05-27 2021-06-11 Beijing Baidu Netcom Science and Technology Co., Ltd. Artificial intelligence-based spoken query parsing method, apparatus, and storage medium
CN108121814B (zh) * 2017-12-28 2022-04-22 Beijing Baidu Netcom Science and Technology Co., Ltd. Search result ranking model generation method and apparatus
CN109086332A (zh) * 2018-07-04 2018-12-25 Shenzhen Power Supply Bureau Co., Ltd. Power dispatching log query method and system
CN109901896A (zh) * 2018-12-06 2019-06-18 Huawei Technologies Co., Ltd. Human-machine interaction system and multi-task processing method in a human-machine interaction system

Family Cites Families (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1116134A1 (en) * 1998-08-24 2001-07-18 BCL Computers, Inc. Adaptive natural language interface
IL140805A0 (en) * 1998-10-02 2002-02-10 Ibm Structure skeletons for efficient voice navigation through generic hierarchical objects
US6314398B1 (en) * 1999-03-01 2001-11-06 Matsushita Electric Industrial Co., Ltd. Apparatus and method using speech understanding for automatic channel selection in interactive television
WO2000055843A1 (en) * 1999-03-12 2000-09-21 Entropic Limited Man-machine dialogue system and method
EP1236175A4 (en) * 1999-08-06 2006-07-12 Lexis Nexis SYSTEM AND METHOD FOR CLASSIFYING LEGAL CONCEPTS USING A LEGAL TOPIC SCHEME
US7092928B1 (en) * 2000-07-31 2006-08-15 Quantum Leap Research, Inc. Intelligent portal engine
KR20020049164A (ko) * 2000-12-19 2002-06-26 오길록 Automatic document classification system and method using category learning with a genetic algorithm and word clusters
US20020198714A1 (en) * 2001-06-26 2002-12-26 Guojun Zhou Statistical spoken dialog system
US7720674B2 (en) * 2004-06-29 2010-05-18 Sap Ag Systems and methods for processing natural language queries
US7835911B2 (en) * 2005-12-30 2010-11-16 Nuance Communications, Inc. Method and system for automatically building natural language understanding models
US7840538B2 (en) * 2006-12-20 2010-11-23 Yahoo! Inc. Discovering query intent from search queries and concept networks
US8165877B2 (en) * 2007-08-03 2012-04-24 Microsoft Corporation Confidence measure generation for speech related searching
US8126869B2 (en) * 2008-02-08 2012-02-28 Microsoft Corporation Automated client sitemap generation
US8244752B2 (en) * 2008-04-21 2012-08-14 Microsoft Corporation Classifying search query traffic

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See references of EP2707808A4 *

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120290509A1 (en) * 2011-05-13 2012-11-15 Microsoft Corporation Training Statistical Dialog Managers in Spoken Dialog Systems With Web Data

Also Published As

Publication number Publication date
CN103534696B (zh) 2018-02-16
EP2707807A2 (en) 2014-03-19
CN103534697B (zh) 2017-11-21
WO2012158571A2 (en) 2012-11-22
EP2707808A2 (en) 2014-03-19
EP2707808A4 (en) 2015-10-21
WO2012158572A3 (en) 2013-03-21
WO2012158571A3 (en) 2013-03-28
EP2707807A4 (en) 2015-10-21
CN103534697A (zh) 2014-01-22
CN103534696A (zh) 2014-01-22

Similar Documents

Publication Publication Date Title
US20120290293A1 (en) Exploiting Query Click Logs for Domain Detection in Spoken Language Understanding
US11151175B2 (en) On-demand relation extraction from text
CN107832414B (zh) Method and apparatus for pushing information
CN107256267B (zh) Query method and apparatus
WO2012158572A2 (en) Exploiting query click logs for domain detection in spoken language understanding
US11514235B2 (en) Information extraction from open-ended schema-less tables
CN107924483B (zh) Generation and application of a universal hypothesis ranking model
US20190354810A1 (en) Active learning to reduce noise in labels
CN110069709B (zh) Intent recognition method and apparatus, computer-readable medium, and electronic device
US8073877B2 (en) Scalable semi-structured named entity detection
KR101754473B1 (ko) Method and system for summarizing a document into image-based content
CN107862046B (zh) Tax commodity code classification method and system based on short-text similarity
US9305083B2 (en) Author disambiguation
US20130060769A1 (en) System and method for identifying social media interactions
CN109416705A (zh) Utilizing information available in a corpus for data parsing and prediction
TW202020691A (zh) Method, apparatus, and server for determining feature words
WO2014031458A1 (en) Translating natural language utterances to keyword search queries
CN108027814B (zh) Stop word recognition method and apparatus
CN112329824A (zh) Multi-model fusion training method, text classification method, and apparatus
WO2022174496A1 (zh) Generative model-based data annotation method, apparatus, device, and storage medium
CN109271624B (zh) Target word determination method, apparatus, and storage medium
JP2021508391A (ja) Facilitating domain-specific and client-specific application program interface recommendations
CN111886596A (zh) Machine translation locking using sequence-based lock/unlock classification
CN113947086A (zh) Sample data generation method, training method, corpus generation method, and apparatus
CN114416998A (zh) Text label recognition method and apparatus, electronic device, and storage medium

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase

Ref document number: 2012786677

Country of ref document: EP

NENP Non-entry into the national phase in:

Ref country code: DE