US20140282136A1 - Query intent expression for search in an embedded application context - Google Patents
- Publication number
- US20140282136A1 (application US 13/904,887, US201313904887A)
- Authority
- US
- United States
- Prior art keywords
- intent
- search
- entity
- user
- interaction
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G06F17/30864—
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q30/00—Commerce
- G06Q30/02—Marketing; Price estimation or determination; Fundraising
- G06Q30/0241—Advertisements
- G06Q30/0251—Targeted advertisements
- G06Q30/0269—Targeted advertisements based on user profile or attribute
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/10—File systems; File servers
- G06F16/17—Details of further file system functions
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/10—File systems; File servers
- G06F16/17—Details of further file system functions
- G06F16/176—Support for shared access to files; File sharing support
- G06F16/1767—Concurrency control, e.g. optimistic or pessimistic approaches
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/90—Details of database functions independent of the retrieved data types
- G06F16/903—Querying
- G06F16/9032—Query formulation
- G06F16/90324—Query formulation using system suggestions
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/90—Details of database functions independent of the retrieved data types
- G06F16/95—Retrieval from the web
- G06F16/951—Indexing; Web crawling techniques
Definitions
- Search engines and interfaces allow users to retrieve information by inputting search queries, for instance, into a search input region. While a user is inputting a search prefix associated with a search query, automatic systems provide likely completions of, or suggestions for, the search prefix being input. When the user executes the search query, either by manually inputting the desired search query or by selecting a suggestion, the search engine directs the user to a search engine results page (“SERP”).
- Systems, methods, computer storage media, and user interfaces are provided for query intent expression for search in an embedded application context.
- A search interaction is received from a user.
- The search interaction may comprise an interaction with a device or application, or a learned intent based on a previous interaction.
- Remote data from a remote data source is received.
- Local data is received from each available device or embedded application.
- The remote data and/or local data may provide one or more intent suggestions based on the search interaction.
- The remote data is merged with the local data to personalize a result set comprising one or more entity identifications associated with the one or more intent suggestions.
- The result set may be prioritized based on a set of rules associated with each available device or embedded application.
- The result set is provided to the user and includes an aggregated intent preview comprising metadata corresponding to one or more entities associated with at least one of the one or more entity identifications.
- FIG. 1 is a block diagram of an exemplary computing environment suitable for use in implementing embodiments of the present invention.
- FIG. 2 is a flow diagram showing an exemplary method for intent preview, disambiguation, and refinement of a search, in accordance with an embodiment of the present invention.
- FIG. 3 schematically shows a network diagram suitable for performing embodiments of the present invention.
- FIGS. 4-20 depict illustrative screen displays, in accordance with exemplary embodiments of the present invention.
- FIG. 21 is a flow diagram showing an exemplary method for intent preview, disambiguation, and refinement of a search, in accordance with an embodiment of the present invention.
- FIG. 22 schematically shows another network diagram suitable for performing embodiments of the present invention.
- FIGS. 23-28 depict additional illustrative screen displays, in accordance with exemplary embodiments of the present invention.
- FIG. 29 is a flow diagram showing an exemplary method for query intent expression for search in an embedded application context, in accordance with an embodiment of the present invention.
- Entities are instances of abstract concepts and objects, including people, places, things, events, locations, businesses, movies, and the like.
- The SERP may or may not include information about the particular entity the user is searching for.
- Existing autosuggestion systems do not address tail queries (i.e., infrequent or unseen queries) or intents the system has not encountered or that are otherwise ambiguous during the query formulation process.
- Intent refers to the target of the search, which may be an entity.
- Existing autosuggestion systems also do not allow disambiguation of intent or allow users to express intent prior to retrieving the SERP. Any change to the search query, such as selection of a suggestion or input of additional characters, causes the SERP to refresh, which can be distracting to the user and inefficient from a resource perspective.
- Summarized data, such as a search history or search session, is limited to presenting the individual queries of a set. This can make it difficult for a user to ascertain the appropriate context or intent of a given session, which effectively limits the ability to share the data in a meaningful way.
- A search prefix comprising one or more characters associated with an unexecuted search query is received.
- One or more intent suggestions are suggested to a user.
- One or more entity identifications associated with each of the one or more intent suggestions are received.
- Metadata corresponding to at least one entity associated with the one or more entity identifications is retrieved from an entity data store. Without retrieving search results for the unexecuted search query, an aggregated intent preview based on the retrieved metadata corresponding to the at least one entity is provided.
- The one or more entities are ranked based on entity-intrinsic signals (e.g., the number of attributes associated with an entity, the entity type, and the number of information sources associated with an entity), query-entity interactions by users (e.g., explicit interactions or clicks on an entity in a search window or third-party entity repository, and interactions attributed to an entity via a query-url-entity tripartite graph), and query pattern likelihood scores, populating the intent suggestions or aggregated intent preview in order of relevance or likelihood of query intent.
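The ranking described above can be sketched as a weighted blend of the three signal families. The Python sketch below is illustrative only: the weights, field names, and scoring formula are assumptions, since the patent does not specify how the signals are combined.

```python
from dataclasses import dataclass

@dataclass
class EntityCandidate:
    entity_id: str
    num_attributes: int        # entity-intrinsic: richness of the entity record
    num_sources: int           # entity-intrinsic: corroborating information sources
    click_count: int           # query-entity interactions observed from users
    pattern_likelihood: float  # query pattern likelihood score in [0, 1]

def rank_entities(candidates, w_intrinsic=0.3, w_clicks=0.4, w_pattern=0.3):
    """Order candidates by a weighted blend of the three signal families.

    The weights are hypothetical; a production system would tune or learn them.
    """
    def score(c):
        intrinsic = c.num_attributes + c.num_sources
        return (w_intrinsic * intrinsic
                + w_clicks * c.click_count
                + w_pattern * c.pattern_likelihood)
    return sorted(candidates, key=score, reverse=True)
```

The ranked list would then populate the intent suggestions or aggregated intent preview from most to least likely intent.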
- A refined intent preview, associated with metadata corresponding to one or more subentities based on a selected item of metadata associated with the one or more entities, is provided, conserving time and resources by allowing the user to further refine intent without executing the unexecuted search query.
- Task completion for a selected entity or subentity is enabled, allowing the user to easily and quickly take a particular action or complete a task associated with the entity or subentity without having to execute the unexecuted search query.
- Task completion refers to the opening and execution or completion of a task within an application, independent window, link, or process, with or without affecting the search or search window.
- A set of queries issued by the user and entities corresponding to the set of queries may be provided, enabling the user to easily and quickly interact with a search history via the provided entities.
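The entity-centric search history described above can be sketched as grouping a raw query log by the entity each query resolved to. The pair-list input format and helper name below are hypothetical.

```python
def entity_centric_history(history):
    """Group a raw query log by resolved entity.

    `history` is a list of (query, entity_id) pairs. Grouping by entity lets a
    session be summarized by what the user was researching rather than by the
    individual query strings.
    """
    grouped = {}
    for query, entity_id in history:
        grouped.setdefault(entity_id, []).append(query)
    return grouped
```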
- One embodiment of the present invention is directed to one or more computer storage media storing computer-useable instructions that, when used by one or more computing devices, cause the one or more computing devices to perform a method of non-committal intent preview, disambiguation, and refinement of a search.
- The method includes receiving a search prefix from a user, the search prefix comprising one or more characters associated with a search query.
- One or more intent suggestions are provided to the user based on a comparison of the search prefix with an autosuggest data store.
- One or more entity identifications associated with the intent suggestions are identified based on an entity ranking.
- An aggregated intent preview comprising metadata corresponding to one or more entities associated with at least one of the one or more entity identifications is provided.
- A refinement request is received from the user.
- The refinement request comprises an indication that the user has selected an item of metadata corresponding to a subentity and associated with the one or more entities.
- A refined intent preview comprising metadata corresponding to the subentity is provided.
- Another embodiment of the present invention is directed to a graphical user interface (GUI) for non-committal intent preview, disambiguation, and refinement of a search.
- An autosuggest display area displays, without executing the search, one or more intent suggestions to the user.
- An entity display area displays, without executing the search, an aggregated intent preview comprising metadata associated with at least one entity corresponding to entity identifications associated with the one or more intent suggestions.
- A refinement display area displays, without executing the search, a refined intent preview comprising metadata associated with a subentity corresponding to an item of metadata selected by the user and associated with the at least one entity.
- Yet another embodiment of the present invention includes a system for providing non-committal intent preview, disambiguation, and refinement of a search.
- The system includes one or more processors coupled to a computer storage medium, the computer storage medium having stored thereon a plurality of computer software components executable by the one or more processors.
- The computer software components include an autosuggest component that receives a search prefix comprising one or more characters associated with an unexecuted search query and suggests one or more intent suggestions to a user.
- An entity identification component receives, for each of the one or more intent suggestions, one or more associated entity identifications.
- A metadata component retrieves metadata from an entity data store. The metadata corresponds to at least one entity associated with the one or more entity identifications.
- A preview component provides, without retrieving search results for the unexecuted search query, an aggregated intent preview based on the retrieved metadata corresponding to the at least one entity.
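The four components above form a pipeline from search prefix to aggregated intent preview. The sketch below wires together in-memory stand-ins for the autosuggest data store, the suggestion-to-entity-ID mapping, and the entity data store; all store contents, names, and IDs are invented for illustration.

```python
# Hypothetical in-memory stand-ins for the completion trie and entity data store.
AUTOSUGGEST_STORE = {
    "harry p": ["harry potter movie", "harry potter book"],
}
ENTITY_IDS = {
    "harry potter movie": ["ent:hp-movie-3"],
    "harry potter book": ["ent:hp-book-1"],
}
ENTITY_METADATA = {
    "ent:hp-movie-3": {"title": "Prisoner of Azkaban", "type": "movie"},
    "ent:hp-book-1": {"title": "Philosopher's Stone", "type": "book"},
}

def aggregated_intent_preview(prefix):
    """Return preview metadata for a search prefix without executing the query.

    Pipeline: prefix -> intent suggestions -> entity IDs -> entity metadata.
    """
    suggestions = AUTOSUGGEST_STORE.get(prefix.lower(), [])
    entity_ids = [eid for s in suggestions for eid in ENTITY_IDS.get(s, [])]
    return [ENTITY_METADATA[eid] for eid in entity_ids if eid in ENTITY_METADATA]
```

Note that nothing in the pipeline issues the search itself; only the preview metadata is materialized.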
- Traditionally, search is thought of as a query-based action taken by a user specifically to identify a piece of information.
- Search can be extended, however, to any user interaction, such as with an application, user interface, operating system, or device, or even to a learned intent based on previous user interactions.
- The entry point for a search can be anywhere the user is able to interact with the application, user interface, operating system, or device.
- A flyout surface area enables the user to interact with the aggregated intent preview within any application, user interface, operating system, or device to provide the user with rich suggestions; concrete, instant answers to questions (based on local and remote context); enabled tasks or actions; and general assistance in refining an intent associated with the search.
- The flyout surface may be any secondary canvas or surface area for receiving a search interaction or providing search intent preview, disambiguation, and refinement.
- One embodiment of the present invention is directed to computer storage media having computer-executable instructions embodied thereon that, when executed by one or more computing devices, cause the one or more computing devices to perform a method of query intent expression for search in an embedded application context.
- The method includes receiving a search interaction from a user, the search interaction comprising an interaction with a device or application, or a learned intent based on a previous interaction.
- Remote data is received from a remote data source, the remote data providing one or more intent suggestions based on the search interaction.
- Local data from each available device or embedded application is received, the local data providing one or more intent suggestions based on the search interaction.
- The remote data is merged with the local data to personalize a result set comprising one or more entity identifications associated with the one or more intent suggestions.
- The result set is prioritized based on a set of rules associated with each available device or embedded application.
- The result set is provided to the user, the result set including an aggregated intent preview comprising metadata corresponding to one or more entities associated with at least one of the one or more entity identifications.
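The merge and prioritization steps above can be sketched as deduplicating remote and local suggestion lists by entity identification and then ordering the merged set by a per-source rule table. The tie-breaking policy (local data overrides remote) and the rule table values are assumptions for illustration.

```python
def merge_results(remote, local):
    """Merge remote and local intent suggestions, deduplicating by entity ID.

    Local entries win ties so that on-device context personalizes the result set
    (an assumed policy, not specified by the patent).
    """
    merged = {}
    for item in remote + local:  # later (local) entries overwrite remote ones
        merged[item["entity_id"]] = item
    return list(merged.values())

def prioritize(result_set, source_priority):
    """Order the merged set by a per-source rule table (smaller = higher priority)."""
    return sorted(result_set, key=lambda r: source_priority.get(r["source"], 99))
```

A rule table such as `{"calendar_app": 0, "local_files": 1, "web": 2}` would, for example, surface an embedded calendar application's results ahead of generic web suggestions.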
- Another embodiment of the present invention is directed to a graphical user interface (GUI) for query intent expression for search in an embedded application context.
- The GUI includes an interaction display area for receiving, from a user, a search interaction corresponding to a search.
- An autosuggest display area displays, without executing the search, one or more intent suggestions to the user, the one or more intent suggestions comprising remote data and local data based on the search interaction.
- An entity display area displays, without executing the search, an aggregated intent preview comprising metadata associated with at least one entity corresponding to entity identifications associated with the one or more intent suggestions, the aggregated intent preview being prioritized in accordance with a set of rules associated with each available device or embedded application.
- Yet another embodiment of the present invention includes a system for providing intent expression for search in an embedded application context.
- The system includes one or more processors coupled to a computer storage medium, the computer storage medium having stored thereon a plurality of computer software components executable by the one or more processors.
- The computer software components include an interaction component that receives a search interaction from a user, the search interaction comprising an interaction with a device or application, or a learned intent based on a previous interaction.
- A merge component merges remote data with local data to personalize a result set comprising one or more entity identifications associated with the one or more intent suggestions.
- A priority component prioritizes the result set based on a set of rules associated with each available device or embedded application.
- A preview component provides the result set to the user, the result set including an aggregated intent preview comprising metadata corresponding to one or more entities associated with at least one of the one or more entity identifications.
- An exemplary operating environment in which embodiments of the present invention may be implemented is described below in order to provide a general context for various aspects of the present invention.
- An exemplary operating environment for implementing embodiments of the present invention is shown and designated generally as computing device 100.
- The computing device 100 is but one example of a suitable computing environment and is not intended to suggest any limitation as to the scope of use or functionality of embodiments of the invention. Neither should the computing device 100 be interpreted as having any dependency or requirement relating to any one component or any combination of components illustrated.
- Embodiments of the invention may be described in the general context of computer code or machine-useable instructions, including computer-useable or computer-executable instructions such as program modules, being executed by a computer or other machine, such as a personal data assistant or other handheld device.
- Program modules include routines, programs, objects, components, data structures, and the like, and/or refer to code that performs particular tasks or implements particular abstract data types.
- Embodiments of the invention may be practiced in a variety of system configurations, including hand-held devices, consumer electronics, general-purpose computers, more specialty computing devices, and the like.
- Embodiments of the invention may also be practiced in distributed computing environments where tasks are performed by remote-processing devices that are linked through a communications network.
- The computing device 100 includes a bus 110 that directly or indirectly couples the following devices: a memory 112, one or more processors 114, one or more presentation components 116, one or more input/output (I/O) ports 118, one or more I/O components 120, and an illustrative power supply 122.
- The bus 110 represents what may be one or more busses (such as an address bus, data bus, or combination thereof).
- FIG. 1 is merely illustrative of an exemplary computing device that can be used in connection with one or more embodiments of the present invention. Distinction is not made between such categories as “workstation,” “server,” “laptop,” “hand-held device,” etc., as all are contemplated within the scope of FIG. 1 and reference to “computing device.”
- The computing device 100 typically includes a variety of computer-readable media.
- Computer-readable media may be any available media that is accessible by the computing device 100 and includes both volatile and nonvolatile media, and removable and non-removable media.
- Computer-readable media comprises computer storage media and communication media; computer storage media excludes signals per se.
- Computer storage media includes volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer-readable instructions, data structures, program modules, or other data.
- Computer storage media includes, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by the computing device 100.
- Communication media embodies computer-readable instructions, data structures, program modules or other data in a modulated data signal such as a carrier wave or other transport mechanism and includes any information delivery media.
- A modulated data signal means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal.
- Communication media includes wired media, such as a wired network or direct-wired connection, and wireless media, such as acoustic, RF, infrared, and other wireless media. Combinations of any of the above should also be included within the scope of computer-readable media.
- The memory 112 includes computer-storage media in the form of volatile and/or nonvolatile memory.
- The memory may be removable, non-removable, or a combination thereof.
- Exemplary hardware devices include solid-state memory, hard drives, optical-disc drives, and the like.
- The computing device 100 includes one or more processors that read data from various entities such as the memory 112 or the I/O components 120.
- The presentation component(s) 116 present data indications to a user or other device.
- Exemplary presentation components include a display device, speaker, printing component, vibrating component, and the like.
- The I/O ports 118 allow the computing device 100 to be logically coupled to other devices including the I/O components 120, some of which may be built in.
- Illustrative I/O components include a microphone, joystick, game pad, satellite dish, scanner, printer, wireless device, a controller, such as a stylus, a keyboard and a mouse, a natural user interface (NUI), and the like.
- A NUI processes air gestures (i.e., motion or movements associated with a user's hand or hands or other parts of the user's body), voice, or other physiological inputs generated by a user. These inputs may be interpreted as search prefixes, search requests, requests for interacting with intent suggestions, requests for interacting with entities or subentities, or requests for interacting with advertisements, entity or disambiguation tiles, actions, search histories, and the like provided by the computing device 100. These requests may be transmitted to the appropriate network element for further processing.
- A NUI implements any combination of speech recognition, touch and stylus recognition, facial recognition, biometric recognition, gesture recognition both on screen and adjacent to the screen, air gestures, head and eye tracking, and touch recognition associated with displays on the computing device 100.
- The computing device 100 may be equipped with depth cameras, such as stereoscopic camera systems, infrared camera systems, RGB camera systems, and combinations of these, for gesture detection and recognition. Additionally, the computing device 100 may be equipped with accelerometers or gyroscopes that enable detection of motion. The output of the accelerometers or gyroscopes is provided to the display of the computing device 100 to render immersive augmented reality or virtual reality.
- Aspects of the subject matter described herein may be described in the general context of computer-executable instructions, such as program modules, being executed by a computing device.
- Program modules include routines, programs, objects, components, data structures, and so forth, which perform particular tasks or implement particular abstract data types.
- Aspects of the subject matter described herein may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network.
- Program modules may be located in both local and remote computer storage media, including memory storage devices.
- The intent disambiguation engine may also encompass a server, a Web browser, a set of one or more processes distributed on one or more computers, one or more stand-alone storage devices, a set of one or more other computing or storage devices, any application, process, or device capable of providing search functionality; search intent preview, disambiguation, and refinement as described herein; a combination of one or more of the above; and the like.
- Embodiments of the present invention are generally directed to systems, methods, and computer-readable storage media for non-committal intent preview, disambiguation, and refinement of a search.
- A search prefix comprising one or more characters associated with an unexecuted search query is received.
- One or more intent suggestions are suggested to a user.
- One or more associated entity identifications are received.
- Metadata corresponding to at least one entity associated with the one or more entity identifications is retrieved from an entity data store. Without retrieving search results for the unexecuted search query, an aggregated intent preview based on the retrieved metadata corresponding to the at least one entity is provided.
- The one or more entities may be ranked based on entity-intrinsic signals, query-entity interactions by users, and query pattern likelihood scores.
- A refined intent preview associated with metadata corresponding to one or more subentities based on a selected item of metadata associated with the one or more entities may be provided.
- Task completion for a selected entity or subentity may be enabled.
- A set of queries issued by the user and entities corresponding to the set of queries may be provided.
- The entities enable the user to interact with a search history.
- A flow diagram is provided in FIG. 2 showing an exemplary method 200 for intent preview, disambiguation, and refinement of a search, in accordance with an embodiment of the present invention.
- The method 200 allows a user to surface content (e.g., an intent) that is difficult to find using common expressions.
- The user can enter the method 200 at any point.
- A user can exit the method 200 at any point, such as by executing the unexecuted query.
- The user may determine that reentry is necessary to refine the query.
- An initial essence of the query is expressed.
- The user may begin inputting a search prefix associated with the search query "Harry Potter."
- The user may actually type "Harry Potter," or an intent suggestion for "Harry Potter" may be provided and selected based on the search prefix.
- A search term like "Harry Potter" may map onto a large set of entities varying in type (e.g., books, characters, movies, actors, costumes, toys, and the like).
- The search term by itself may therefore be ambiguous.
- Intent suggestions identifying basic groups of entities, or a few of the top-most ranked entity groups, can be provided to the user.
- A type of entity may be expressed. For example, the user may type "Harry Potter movie" or select an intent suggestion "Harry Potter movie."
- At the entity disambiguation stage 230, more specific information regarding the type of entity may be expressed. For example, the user may desire information about a particular Harry Potter movie. The user may type "Harry Potter movie prisoner of Azkaban" or select an intent suggestion "Harry Potter movie prisoner of Azkaban." Each token or word added to the unexecuted query string provides a deeper understanding of the intent.
- The user may focus the search on a particular aspect of the previewed entity.
- For example, the user may desire to locate information about the cast of the selected movie. For instance, the user may type or select "Harry Potter movie prisoner of Azkaban cast."
- The user can execute the unexecuted search query at the consume stage 250, and the SERP 252 is provided.
- Alternatively, the user may desire to narrow the focus of the search and may refine it further at the react stage 260.
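The progressive narrowing across the stages above can be illustrated as token-by-token filtering of a candidate entity set: each added token shrinks the set of entities consistent with the expressed intent. The keyword-matching scheme below is a simplified, hypothetical stand-in for the intent disambiguation the patent describes.

```python
def refine_candidates(entities, tokens):
    """Keep only entities whose metadata matches every query token so far.

    `entities` is a list of dicts with a 'keywords' set. Each token added to
    the (unexecuted) query string filters the candidate set further, modeling
    how deeper intent expression narrows ambiguity without executing a search.
    """
    for token in tokens:
        entities = [e for e in entities if token in e["keywords"]]
    return entities
```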
- The computing system 300 illustrates an environment in which a search session may be conducted utilizing pre-existing search navigation patterns.
- The computing system 300 generally includes user computing devices 310 (e.g., mobile device, television, kiosk, watch, touch screen or tablet device, workstation, gaming system, internet-connected consoles, and the like) and an intent disambiguation engine 320 in communication with one another via a network 302.
- The network 302 may include, without limitation, one or more local area networks (LANs) and/or wide area networks (WANs).
- Any number of user computing devices 310 and/or intent disambiguation engines 320 may be employed in the computing system 300 within the scope of embodiments of the present invention. Each may comprise a single device/interface or multiple devices/interfaces cooperating in a distributed environment.
- The intent disambiguation engine 320 may comprise multiple devices and/or modules arranged in a distributed environment that collectively provide the functionality of the intent disambiguation engine 320 described herein. Additionally, other components or modules not shown also may be included within the computing system 300.
- In some embodiments, one or more of the illustrated components/modules may be implemented as stand-alone applications. In other embodiments, one or more of the illustrated components/modules may be implemented via a user computing device 310, the intent disambiguation engine 320, or as an Internet-based service. It will be understood by those of ordinary skill in the art that the components/modules illustrated in FIG. 3 are exemplary in nature and in number and should not be construed as limiting. Any number of components/modules may be employed to achieve the desired functionality within the scope of embodiments hereof. Further, components/modules may be located on and/or shared by any number of intent disambiguation engines and/or user computing devices. By way of example only, the intent disambiguation engine 320 might be provided as a single computing device (as shown), a cluster of computing devices, or a computing device remote from one or more of the remaining components.
- The user computing device 310 may include any type of computing device, such as the computing device 100 described with reference to FIG. 1, for example.
- The user computing device 310 includes a display and is capable of executing a search or acting as a host for search results.
- The search process is configured to render search engine home pages (or other online landing pages) and search engine results pages (SERPs), in association with the display of the user computing device 310.
- The user computing device 310 is further configured to receive user input of requests for various web pages (including search engine home pages), receive user input search queries, receive user input to refine search intent and/or take action on an entity (generally input via a user interface provided on the display and permitting alpha-numeric, voice, motion/gesture, and/or textual input into a designated search input region), and receive content for presentation on the display, for instance, from the intent disambiguation engine 320.
- The functionality described herein as being performed by the user computing device 310 and/or the intent disambiguation engine 320 may be performed by any operating system, application, process, web browser, or web browser chrome, or via accessibility to an operating system, application, process, web browser, web browser chrome, or any device otherwise capable of executing a search or acting as a host for search results. It should further be noted that embodiments of the present invention are equally applicable to mobile computing devices and devices accepting touch, gesture, and/or voice input. Any and all such variations, and any combination thereof, are contemplated to be within the scope of embodiments of the present invention.
- The intent disambiguation engine 320 of FIG. 3 is configured to, among other things, provide intent preview, disambiguation, and refinement of a search.
- The intent disambiguation engine 320 is additionally configured to, among other things, enable actions on entities and provide entity-centric search history and shared data.
- The intent disambiguation engine 320 includes a search prefix component 322, an autosuggest component 324, an entity identification component 326, a metadata component 328, a preview component 330, a ranking component 332, a refinement component 334, and an action component 336.
- The illustrated intent disambiguation engine 320 also has access to a completion trie 340 and an entity data store 350.
- the completion trie 340 is a data store configured to store and associate intent suggestions with entity identifications (“entity IDs”).
- entity data store 350 is a high performance data store configured to provide fast lookup of entities and metadata associated with entities corresponding to one or more entity IDs identified by the completion trie 340 . It will be understood and appreciated by those of ordinary skill in the art that the information stored in association with the completion trie 340 and the entity data store 350 may be configurable and may include any information relevant to search queries/terms/histories, intent suggestions, entity identifications, entities, and metadata associated with the entities. The content and volume of such information are not intended to limit the scope of embodiments of the present invention in any way.
- each of the completion trie 340 and the entity data store 350 may, in fact, be a plurality of storage devices, for instance a database cluster, portions of which may reside in association with the intent disambiguation engine 320 , the user computing device 310 , another external computing device (not shown), and/or any combination thereof. Further, the completion trie 340 and the entity data store 350 may be combined in a single storage device or database cluster.
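As a concrete illustration of the completion trie described above, the structure can be sketched as a character-keyed tree whose nodes carry the intent suggestions (and their associated entity IDs) reachable from a given prefix. This is a minimal in-memory sketch under assumed class, field, and entity-ID naming, not the implementation from the specification:

```python
class CompletionTrie:
    """Minimal sketch of a completion trie (cf. completion trie 340):
    maps search prefixes to intent suggestions, each tagged with zero or
    more entity IDs for later lookup in an entity data store."""

    def __init__(self):
        self._children = {}      # next character -> child node
        self._suggestions = []   # (suggestion, entity_ids) reachable from this node

    def insert(self, suggestion, entity_ids=()):
        node = self
        for ch in suggestion:
            node = node._children.setdefault(ch, CompletionTrie())
            # Record the suggestion along its whole path so that any
            # prefix of the suggestion can retrieve it.
            node._suggestions.append((suggestion, tuple(entity_ids)))

    def lookup(self, prefix, limit=8):
        node = self
        for ch in prefix:
            node = node._children.get(ch)
            if node is None:
                return []
        return node._suggestions[:limit]


trie = CompletionTrie()
trie.insert("avengers", entity_ids=("movie:avengers-2012",))
trie.insert("avengers trailer", entity_ids=("movie:avengers-2012",))
trie.insert("avenue")
print(trie.lookup("aven"))
```

A production trie would cap and rank the per-node suggestion lists; here they are kept whole for clarity.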
- the search prefix component 322 of the intent disambiguation engine 320 is configured to receive a search prefix, for instance, utilizing search functionality associated with the user computing device 310 .
- the search prefix comprises one or more characters associated with an unexecuted search query.
- the search prefix component 322 communicates the search prefix to the autosuggest component 324 .
- the autosuggest component 324 of the intent disambiguation engine 320 is configured to receive the search prefix comprising one or more characters associated with an unexecuted search query. Upon receiving the search prefix, the autosuggest component 324 retrieves one or more intent suggestions associated with the search prefix. In one embodiment, the one or more intent suggestions are retrieved from the completion trie 340 .
- the intent suggestions represent the most likely intent of the user and/or target(s) of the unexecuted search query. The most likely intent of the user and/or target of the unexecuted search query may be determined by identifying the type of query and the possible types of entities associated with that type of query.
- Each of the intent suggestions may also be associated with one or more entity IDs. An entity ID indicates the intent suggestion is associated with one or more entities and may assist the user in distinguishing one intent suggestion from another.
- the entity identification component (“entity ID component”) 326 of the intent disambiguation engine 320 is configured to retrieve the entity ID.
- entity ID may be used to look up metadata associated with one or more entities that is stored, in one embodiment, in the entity data store 350 .
- entity ID may further describe or indicate the type of entity associated with the entity ID. Such indication may help the user readily locate or identify a particular search within a search history or share a particular search with others.
- the metadata component 328 of the intent disambiguation engine 320 is configured to retrieve metadata from the entity data store 350 .
- the metadata corresponds to at least one entity associated with the one or more entity identifications.
- the metadata may include content associated with the entity such as data or snippets of data that may be returned by or be available via links in search results for that entity. Metadata for multiple entities may be retrieved allowing the user to narrow or refine a search. For example, a primary intent suggestion representing the likely primary focus of the search as well as one or more secondary intent suggestions representing subcategories or subentities of the primary intent suggestion can be retrieved. Similarly, a primary intent suggestion representing the most likely target of the search as well as secondary intent suggestions representing less likely targets of the search can be retrieved.
- a request to retrieve metadata in one embodiment, is initiated when the user hovers over or selects an intent suggestion. In another embodiment, metadata for the first intent suggestion or most likely intent suggestion is automatically selected or retrieved.
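The metadata lookup keyed by entity ID might be sketched as follows; the store contents, field names, and entity-ID scheme are illustrative assumptions standing in for the entity data store 350:

```python
# Hypothetical in-memory stand-in for the entity data store 350:
# entity IDs map to the metadata that would back an intent preview.
ENTITY_STORE = {
    "movie:avengers-2012": {
        "type": "movie",
        "title": "Marvel's The Avengers",
        "snippets": ["2012 superhero film", "Directed by Joss Whedon"],
        "subentities": ["Images", "Shopping", "Reviews", "Cast & Crew"],
    },
}

def retrieve_metadata(entity_ids):
    """Fetch metadata for each entity ID attached to an intent suggestion.
    In the described flow this is triggered when the user hovers over or
    selects a suggestion, or automatically for the most likely suggestion.
    Unknown IDs are skipped."""
    return [ENTITY_STORE[eid] for eid in entity_ids if eid in ENTITY_STORE]

preview = retrieve_metadata(["movie:avengers-2012", "unknown:id"])
print(preview[0]["title"])  # Marvel's The Avengers
```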
- the preview component 330 of the intent disambiguation engine 320 is configured to provide an aggregated intent preview based on the retrieved metadata corresponding to the at least one entity (or a category of entities, e.g., “Seattle restaurants” or “Jackie Chan movies” and the like).
- the aggregated intent preview is provided without retrieving search results for the unexecuted search query. This allows the user to preview metadata associated with the intent suggestions without consuming the resources necessary to execute the full unexecuted search query. Rather than updating the SERP each time the user selects one of the intent suggestions, the aggregated intent preview provides the user with enough information about a particular entity to narrow the focus of the search.
- the aggregated intent preview provides a non-committal preview of one or more entities or subentities to help the user to refine an intent associated with the search without committing to the search until the user is actually ready to execute the search. More simply, the aggregated intent preview does not distract the user by constantly refreshing a SERP associated with a search because the search query is not executed until the user is satisfied the intent of the search is properly identified and adequately focused.
- a user may be searching for a particular person or thing.
- the autosuggest component 324 may retrieve several intent suggestions associated with the search prefix.
- Each of the intent suggestions may be associated with an entity ID, where the intent suggestion completes the search prefix (e.g., completes the spelling of one or more persons or things associated with the search prefix).
- the one or more entities or subentities identified in the aggregated intent preview are associated with the intent suggestion and may further identify one or more subcategories or subentities associated with the intent suggestion to help the user refine the search accordingly.
- the ranking component 332 of the intent disambiguation engine 320 is configured to rank the one or more entities.
- the ranking may be utilized to automatically determine the intent or target of the unexecuted search query.
- the ranking may be based on entity-intrinsic signals, query-entity interactions by users, and/or query pattern likelihood scores.
- the entity-intrinsic signals may comprise a number of attributes or a number of information sources. For example, one intent suggestion may be ranked higher than another if it includes more attributes associated with a particular entity. Similarly, one intent suggestion may be associated with a particular entity that has a higher number of information sources than another intent suggestion associated with a different entity.
- Each of these entity-intrinsic signals may be utilized to assign a static ranking score to the intent suggestion, independent of the unexecuted search query. The same methodology can be utilized to rank and influence the display of entities or subentities provided in the aggregated intent preview.
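A static, query-independent score built from the entity-intrinsic signals named above (attribute count and number of information sources) could look like the following sketch; the weights are illustrative assumptions, not values given in the text:

```python
def static_rank(entity):
    """Assign a query-independent score from entity-intrinsic signals:
    entities with more attributes and more information sources rank
    higher. The 1.0/2.0 weights are illustrative assumptions."""
    return 1.0 * len(entity.get("attributes", [])) + \
           2.0 * entity.get("num_sources", 0)

entities = [
    {"name": "Entity A", "attributes": ["a", "b"], "num_sources": 1},
    {"name": "Entity B", "attributes": ["a", "b", "c", "d"], "num_sources": 3},
]
# Higher static score first; the same ordering can drive the display of
# entities or subentities in the aggregated intent preview.
ranked = sorted(entities, key=static_rank, reverse=True)
print([e["name"] for e in ranked])  # Entity B first
```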
- the query pattern likelihood scores may be based on expected patterns.
- the expected patterns may be based on entity type, quality standards independent of an individual entity, quality standards independent of associated queries, dominance of one particular entity over another, non-entity associations of the query, and the like.
- Expected patterns represent the identification by the system of one or more expected terms, based on the entity type, associated with the intent suggestion. Expected patterns generally are based on data that is typically associated with an entity and which users have come to expect having associated with the particular entity type. For example, each intent suggestion associated with an entity can be examined to identify expected patterns based on the entity type. If the entity type is a business, expected patterns of the intent suggestions may include business names, locations, type of businesses, and the like. On the other hand, if the entity type is a person, expected patterns of the intent suggestions may include first names, middle initials, locations, last names, occupations, and the like.
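An expected-pattern check by entity type might be sketched as below; the per-type field sets follow the business/person examples in the text, while the fractional scoring itself is an illustrative assumption:

```python
# Expected-pattern fields per entity type, following the examples in the
# text (businesses vs. people). Real systems would derive these from data.
EXPECTED_PATTERNS = {
    "business": {"business_name", "location", "business_type"},
    "person": {"first_name", "middle_initial", "last_name",
               "location", "occupation"},
}

def pattern_match_score(entity_type, suggestion_terms):
    """Fraction of a suggestion's recognized term types that fit the
    patterns expected for this entity type; an assumed way to turn the
    expected-pattern idea into a query pattern likelihood score."""
    expected = EXPECTED_PATTERNS.get(entity_type, set())
    if not suggestion_terms:
        return 0.0
    hits = sum(1 for field in suggestion_terms if field in expected)
    return hits / len(suggestion_terms)

# 2 of 3 term types match the expected person patterns.
print(pattern_match_score("person", ["first_name", "last_name", "color"]))
```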
- the quality standards may be independent of the individual entity but may be based on the entity type. For example, a determination can be made to make sure the query includes at least one well known business name or person name.
- the quality standards may also be independent of the intent suggestions or unexecuted search query. For example, entities may only be included in the aggregated intent preview if they contain a minimum number of attributes or have been updated recently (e.g., within a predetermined or configurable amount of time). Thus, the quality standards ensure that items associated with the query or the entities included in the aggregated intent preview are expected or known (e.g., one or more known terms), meet minimum requirements (e.g., a minimum number of entity-intrinsic signals), and are up-to-date.
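The entity-independent quality gate just described (a minimum attribute count plus recency of update) can be sketched as follows; the specific thresholds are assumptions standing in for configurable values:

```python
from datetime import datetime, timedelta

def passes_quality_standards(entity, min_attributes=3, max_age_days=30, now=None):
    """Entity-independent quality gate: require a minimum number of
    attributes and a sufficiently recent update before the entity is
    eligible for the aggregated intent preview. The defaults (3
    attributes, 30 days) are illustrative assumptions."""
    now = now or datetime.utcnow()
    recent = now - entity["last_updated"] <= timedelta(days=max_age_days)
    return len(entity["attributes"]) >= min_attributes and recent

entity = {"attributes": ["name", "location", "hours", "rating"],
          "last_updated": datetime.utcnow() - timedelta(days=5)}
print(passes_quality_standards(entity))  # True
```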
- in some embodiments, one particular entity (e) dominates the intent suggestions for an unexecuted search query over intent suggestions associated with less dominant entities e′, e′′, e′′′, etc.
- dominance may be determined against a threshold: given a set of intent suggestions for an unexecuted search query, when the percentage of those intent suggestions that corresponds to (i.e., is directed to) the entity (e) meets or exceeds the threshold, entity (e) may be considered to dominate the intent suggestions for the unexecuted search query.
- even when entity (e) dominates the intent suggestions for the unexecuted search query, intent suggestions associated with other entities e′, e′′, e′′′, etc. (i.e., less dominant entities) may still be associated with the selected intent suggestion, even when the more dominant query-entity pair exceeds the particular configurable or automatically determined threshold.
- a business entity may be dominant to all other entities for the intent suggestion “hotel California.”
- song entities associated with the intent suggestion “hotel California” may actually be the target or intent of the user. Even if the business entity exceeds the threshold to be determined a dominant entity for that particular intent suggestion, the song entities are still associated with the intent suggestion until the actual intent or target of the unexecuted search query is determined.
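The dominance test described above, i.e., whether the share of intent suggestions directed at one entity meets a configurable threshold, can be sketched as below; the entity-ID labels and the 0.5 default threshold are illustrative assumptions:

```python
from collections import Counter

def dominant_entity(suggestion_entities, threshold=0.5):
    """Given the entity each intent suggestion maps to, return the entity
    whose share of the suggestions meets or exceeds the (configurable)
    threshold, or None if no entity dominates."""
    if not suggestion_entities:
        return None
    counts = Counter(suggestion_entities)
    entity, count = counts.most_common(1)[0]
    return entity if count / len(suggestion_entities) >= threshold else None

# Five suggestions for "hotel california", four mapping to the business
# entity and one to the song entity.
mapping = ["hotel:california", "hotel:california", "hotel:california",
           "hotel:california", "song:hotel-california"]
print(dominant_entity(mapping))  # hotel:california
```

Consistent with the text, finding a dominant entity does not discard the rest: the less dominant entities (here, the song) remain associated with the suggestion until the actual intent of the query is determined.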
- non-entity associations of an intent suggestion may also be considered to determine whether a particular entity is dominant.
- an intent suggestion or unexecuted search query may not have an entity intent (an entity intent suggests the intent or target of the search is an entity).
- the intent suggestion or target of the unexecuted search query is not an entity.
- the intent suggestion or the target of the unexecuted search query may instead target a web resource. In this instance, even when an entity (e.g., a business or person entity) exists, the primary intent is the web resource and the query-entity association is dropped.
- the primary intent may be determined based on user signals at the time the search prefix is input; how the user interacts with the intent suggestions or aggregated intent preview (e.g., query-entity interactions, entity clicks or clicks on an entity in a search window or third party entity repository, etc.); a search history associated with the user (e.g., search logs, previous query-entity interactions, previous entity clicks or clicks on an entity in a search window or third party entity repository, etc.); and/or third party search history (e.g., search logs, previous third party query-entity interactions, previous third party entity clicks or clicks on an entity in a search window or third party entity repository, etc.).
- the refinement component 334 of the intent disambiguation engine 320 is configured to, without retrieving search results for the unexecuted search query, provide a refined intent preview.
- the refined intent preview is associated with metadata corresponding to one or more subentities.
- the one or more subentities are based on a selected item of metadata associated with the one or more entities. For example, a user may select or interact with an item from the aggregated intent preview.
- the selected item may be based on metadata corresponding to the one or more entities associated with an intent suggestion.
- the selected item may be associated with one or more subentities related to the entity. Such a selection allows the user to further refine the search by narrowing the focus or intent of the search without actually executing the unexecuted search query.
- the action component 336 of the intent disambiguation engine 320 is configured to enable task completion for a selected entity or subentity in association with the aggregated intent preview. This allows the aggregated intent preview to not only identify an intent of the search but actually allows the user to complete a task or action associated with the unexecuted search query. For example, a user may desire information about a particular movie. The action component allows the user to actually view or download the movie, such as on Netflix®. The action component may provide a link or tile that, upon selection, opens an application, independent window, link, or process to execute the task. In one embodiment, upon selection of the link or tile, the action component opens an application, independent window, link, or process without affecting the search window.
- in another embodiment, upon selection of the link or tile, the action component opens an application, independent window, link, or process and the search is refined or updated. In yet another embodiment, upon selection of the link or tile, the action component opens an application, independent window, link, or process and the search window is closed.
- any number of actions or tasks may be enabled by the action component 336 .
- an application may be available that relates to a particular entity or subentity. Upon selection, the application is installed on the user device. Similarly, tickets or reservations to a particular event or place can be purchased or made by the action component 336 .
- the action component 336 may further enable third party components to execute external actions (e.g., reservations, purchases, and the like).
- the action component 336 is configured to include paid placement text or display advertisements in association with the aggregated intent preview.
- illustrative screen displays for non-committal intent preview, disambiguation, and refinement of a search are provided. It is understood that each of the illustrative screen displays is connected logically, such that together they comprise a user interface designed for non-committal intent preview, disambiguation, and refinement of a search.
- the screen displays may appear in any order and with any number of screen displays, without regard to whether the screen display is described or depicted herein.
- a search display area displays a search bar 410 for receiving a search prefix 412 from a user corresponding to an unexecuted search.
- Autosuggest display area 420 displays, without executing the search, one or more intent suggestions 421 , 422 , 423 , 424 , 425 , 426 , 427 , 428 to the user.
- Entity display area 430 displays, without executing the search, an aggregated intent preview comprising metadata associated with at least one entity 432 , 434 , 436 , 438 , 440 corresponding to entity identifications associated with the one or more intent suggestions.
- the aggregated intent preview may include a main or primary entity 432 that appears larger than the other entities (i.e., secondary entities).
- the primary entity 432 may be ranked higher than the other entities, such as by the ranking methodology described herein.
- the secondary entities 434 , 436 , 438 , 440 may be subentities associated with the primary entity 432 or may be distinct entities altogether, such as lower ranked entities.
- Each of the entities may be selectable, such as to further refine the intent of the search without executing the search, or to enable action on, or completion of, a particular task, such as those actions and tasks described herein.
- with reference to FIG. 5, an illustrative screen display 500 of an embodiment of the present invention is shown.
- the search display area displays a search bar 510 with the search prefix 512 “aven.”
- Autosuggest display area 520 displays, without executing the search, intent suggestions 521 , 522 , 523 , 524 , 525 , 526 , 527 , 528 .
- intent suggestions 521 , 522 , 523 , 524 , 525 , 526 , 527 , 528 include “avengers”, “avenue”, “avengers alternate opening”, “avengers trailer”, “avenged sevenfold”, “avengers games”, “avenade”, “aventa learning” to the user.
- Entity display area 530 displays, without executing the search, an aggregated intent preview comprising metadata associated with at least one entity.
- the entities include the primary entity 532 , Marvel's The Avengers and secondary entities 534 , 536 , 538 , 540 .
- the secondary entities 534 , 536 , 538 , 540 include Images (i.e., associated with the primary entity, “Marvel's The Avengers”), Shopping (i.e., for items associated with the primary entity), Reviews (i.e., of the primary entity), Cast & Crew (i.e., for the primary entity) and correspond to entity identifications associated with the one or more intent suggestions or, in this case, the primary entity 532 .
- the primary entity 532 may be identified by the user, such as by selecting an intent suggestion, or may be automatically selected corresponding to a ranking identifying the most likely entity (and thus, intent) of the search.
- Each of the entities is selectable, such as to further refine the intent of the search without executing the search, or to enable action on, or completion of, a particular task, such as those actions and tasks described herein.
- the user can narrow the search to identify images associated with “Marvel's The Avengers” by selecting the tile or secondary entity 534 (e.g., Images), or by typing additional characters into the search bar 510 .
- Selection of the Images entity may narrow the search further, such as by identifying particular scenes or characters. This results in the display of a refinement display area that displays, without executing the search, a refined intent preview comprising metadata associated with a subentity corresponding to a selected item of metadata associated with the at least one entity.
- Selection of the Images entity may also enable the user to complete a task, such as by allowing the user to view images associated with the movie via another website, application, and the like.
- the search display area displays a search bar 610 with the search prefix 612 .
- Autosuggest display area 620 displays, without executing the search, one or more intent suggestions to the user.
- the intent suggestions may be ranked such as by the ranking methodology described herein.
- Entity display area 630 displays, without executing the search, an aggregated intent preview comprising metadata associated with one or more entities 632 , 634 , 636 .
- the entities 632 , 634 , 636 correspond to entity identifications associated with the one or more intent suggestions or, in this case, multiple entities 632 , 634 , 636 of the same entity type.
- while the most likely intent may be predicted or automatically selected, such as by the ranking methodology described herein, other candidates of the same entity type may also be provided to allow the user to more easily identify the target of the search.
- the most likely intent (i.e., primary intent) 632 is scaled, in one embodiment, to appear larger or occupy more space in the aggregated intent preview provided in the entity display area 630 than the less likely, secondary intents 634 , 636 .
- the aggregated intent preview is automatically expanded based on intent confidence. In other words, if the intent confidence, as determined by the ranking methodology described herein or any methodology for determining confidence, exceeds a configurable threshold, the aggregated intent preview is automatically expanded and populated with one or more entities and associated metadata based on the determined or predicted intent.
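The confidence-gated auto-expansion just described might be sketched as follows; the 0.75 threshold and the return shape are illustrative assumptions:

```python
def maybe_expand_preview(intent_confidence, entities, threshold=0.75):
    """If the confidence in the predicted intent (e.g., from the ranking
    methodology) meets or exceeds a configurable threshold, auto-expand
    the aggregated intent preview and populate it with the predicted
    entities; otherwise leave it collapsed."""
    if intent_confidence >= threshold:
        return {"expanded": True, "entities": entities}
    return {"expanded": False, "entities": []}

print(maybe_expand_preview(0.9, ["michael jackson (singer)"])["expanded"])  # True
print(maybe_expand_preview(0.4, ["michael jackson (singer)"])["expanded"])  # False
```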
- the search prefix 612 “michal j” and selection of intent suggestion “michael jackson” may result in the display of three entities 632 , 634 , 636 corresponding to the selected intent suggestion.
- the three entities 632, 634, 636 may be of the same type, in this case people entities, each associated with metadata corresponding to a subentity (e.g., profession, location, and the like) associated with each people entity.
- the entity display area 630 is scaled, in one embodiment, based on relevance signals or likelihood of intent. For example, an entity tile associated with the primary intent 632 may appear larger or occupy more space in the entity display area 630 than other secondary intents 634 , 636 .
- Each of these primary and secondary intents 632 , 634 , 636 is selectable to allow the user to identify the appropriate intent of the search or further refine the search without actually executing the search. For instance, the user can narrow the search to identify, target, or preview subentities associated with the selected entity. Further, each of the entities 632 , 634 , 636 may enable action or completion of a particular task, such as those actions and tasks described herein. Once the user has properly identified or narrowed the intent of the search, the user can execute the search.
- a popular now display area 710 displays metadata 734 , 736 associated with one or more entities 732 corresponding to entity identifications not associated with a search prefix received in the search bar.
- the popular now display area 710 can be provided without receiving a search prefix corresponding to a search.
- the popular now display area 710 displays intent suggestions 720 associated with the most popular entities based on a search history, query-entity interactions, or entity click data.
- the search history, query-entity interactions, or entity click data may be associated with a user or group of users, such as a group of users associated with the user in a social network, location, place of employment, occupation, interest, proximity, subscription, school, demographics, and the like.
- the search history, query-entity interactions, or entity click data may be based on a configurable time range.
- the configurable time range may be set for any time range in the past, present, or future (i.e., such as for predicting likely popular searches, search terms, and entities in the future based on expected news, forecasts, events, schedules, holidays, press releases, product information, and the like).
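The "popular now" aggregation over a configurable time range can be sketched as below; the click-log format of (entity, timestamp) pairs and the default window length are assumptions:

```python
from collections import Counter
from datetime import datetime, timedelta

def popular_entities(click_log, window_days=7, top_n=5, now=None):
    """Count entity clicks within a configurable time range and return
    the most clicked entities, as could back a 'popular now' display.
    The log could equally hold query-entity interactions or a group's
    aggregated search history."""
    now = now or datetime.utcnow()
    cutoff = now - timedelta(days=window_days)
    counts = Counter(entity for entity, ts in click_log if ts >= cutoff)
    return [entity for entity, _ in counts.most_common(top_n)]

now = datetime.utcnow()
log = [("rihanna", now), ("rihanna", now), ("avengers", now),
       ("old news", now - timedelta(days=30))]
print(popular_entities(log))  # ['rihanna', 'avengers']
```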
- an illustrative screen display 800 of an embodiment of the present invention is shown.
- the autosuggest display area 820 displays, without executing the search, one or more intent suggestions to the user.
- the entity display area 830 displays, without executing the search, an aggregated intent preview comprising metadata associated with at least one entity corresponding to entity identifications associated with the one or more intent suggestions.
- the intent suggestion 822 “kelly white” was selected.
- multiple entities corresponding to the entity identification associated with the intent suggestion “kelly white” are provided in the entity display area 830 .
- Metadata corresponding to each entity is provided, allowing the user to determine which entity (i.e., which "kelly white") is the intent or target of the search.
- the search may need to be refined further, such as by selecting one of the entities in the entity display area 830 .
- the user may select the corresponding entity to refine the search further, in which case additional metadata is retrieved and provided for subentities associated with the selected entity allowing the user to select the appropriate entity or subentity.
- with reference to FIG. 9, an illustrative screen display 900 of an embodiment of the present invention is shown. Similar to the example set forth and illustrated in FIG. 8, FIG. 9 depicts an entity display area 930 displaying multiple entities associated with a selected intent suggestion 922. Each of the entities allows the user to further refine the search until the actual intent is determined and the search query is executed. For example, a user may type the search prefix 912 "canon cameras." Intent suggestions are provided in the autosuggest display area 920. Once the intent suggestion 922 is selected, either automatically based on intent confidence or manually by the user, metadata corresponding to entities associated with the intent suggestion is retrieved and provided in the entity display area 930. As desired, the user can refine the search further by selecting an entity or subentity, which results in subentities and corresponding metadata being provided in the entity display area 930.
- an illustrative screen display 1000 of an embodiment of the present invention is shown.
- the metadata provided by the entity display area 1030 is provided for a single entity and is divided into entity or intent disambiguation tiles corresponding to a primary intent 1032 and secondary intents 1034 , 1036 , 1038 , 1040 .
- the primary intent 1032 may allow the user to execute the search for the selected intent suggestion or entity depicted by the entity display area 1030.
- the secondary intents 1034 , 1036 , 1038 , 1040 may allow the user to refine the search further for the entity or a particular subentity as already described herein.
- a navigational display area 1134 may appear in the entity display area 1130 corresponding to the intent suggestion 1122 selected from the one or more intent suggestions 1120 .
- the navigational display area 1134 may represent local intent associated with the entity 1132 .
- the navigational display area 1134 displays a map and/or directions to an entity provided in the entity display area 1130 .
- the navigational display area 1134 may further enable an action or task, as described in more detail below, such as providing directions from a location associated with the user to the nearest entity or an entity associated with a selected location.
- an advertisement display area 1136 displays text or display advertisements for a particular entity.
- the text or display advertisements may be paid for by or auctioned to a provider distinct or independent of the search provider, such as the provider associated with the entity (e.g., Starbucks®).
- the text or display advertisements (e.g., paid placement for advertisements) may also be associated with an action or completion of a task (e.g., promoting download of an application) as described in more detail below.
- An action display area 1232 displays an action available for the user to take on a particular entity.
- the action enables task completion for the intent suggestion 1222 selected from the one or more intent suggestions 1220 .
- the task completion is provided by a provider distinct or independent of the search provider.
- the action may request or install an application associated with the distinct or independent provider.
- the action display area 1332 may appear in the entity display area 1330 corresponding to the intent suggestion 1322 selected from the one or more intent suggestions 1320 .
- the action is provided by a provider distinct or independent of the search provider.
- the action may execute an application provided by the distinct or independent application provider, request permission to install the application, or request login credentials for an account associated with the application or provider.
- with reference to FIGS. 14A and 14B, illustrative screen displays depict mobile embodiments of the present invention.
- the autosuggest display area 1420 displays, without executing the search, one or more intent suggestions to the user.
- the entity display area 1430 displays (as illustrated in FIG. 14B ), without executing the search, an aggregated intent preview comprising metadata associated with at least one entity corresponding to entity identifications associated with the one or more intent suggestions.
- the intent suggestion 1422 “rihanna” was selected.
- multiple entities corresponding to the entity identification associated with the intent suggestion “rihanna” are provided in the entity display area 1430 .
- Metadata corresponding to each entity is provided allowing the user to determine which entity associated with the intent suggestion “rihanna” is the actual intent or target of the search.
- the search may need to be refined further, such as by selecting one of the entities in the entity display area 1430 .
- the user may select the desired entity or subentity to refine the search further, in which case additional metadata is retrieved and provided for the selected entity or subentity allowing the user to identify the appropriate entity or subentity.
- a search history display area 1510 displays, in one embodiment, a set of queries 1520 issued by the user and entities 1540 corresponding to the set of queries 1520 .
- the entities 1540 enable the user to interact with the search history. This allows a user to quickly assimilate and understand a view associated with the user's search history.
- the search history display area 1510 displays an aggregate set of queries issued by multiple users and entities corresponding to the aggregate set of queries, such as might be implemented in an embodiment to display which entities are being searched for the most by a population of users.
- a social network display area 1600 displays a topic or entity 1620 shared by one or more users 1610 via a social network.
- the topic or entity represents a set of queries issued by the one or more users 1610 and characterized by metadata associated with the at least one entity 1620 .
- the entity may be selectable, allowing a user to retrace the steps of the research performed by the user sharing the entity 1620 .
- the entity may be selectable, allowing a user to execute a search associated with the entity 1620 .
- the search may be executed with the same operating system, application, process, web browser, web browser chrome, or device otherwise capable of executing a search or acting as a host for search results as the original shared search.
- the search may be executed with a different operating system, application, process, web browser, web browser chrome or via accessibility to an operating system, application, process, web browser, web browser chrome, or any device otherwise capable of executing a search or acting as a host for search results than the original shared search.
- Referring to FIG. 17 , an illustrative screen display 1700 of an embodiment of the present invention is shown.
- a multi-user history display area 1700 displays a set of entities 1730 corresponding to a set of the most popular searched-for entities 1720 over a predetermined period of time by a population of users.
- the multi-user history display area 1700 may be tailored by a user to select specific topics or entities.
- the multi-user history display area 1700 then identifies the set of most popular searched-for entities corresponding to the selected topic or entity. For example, and referring again to FIG. 17 , a user may be interested in the most researched presidents in the last month. As illustrated, such a search for the most popular president entities may result in a set of entities 1730 that includes presidents 1732 , 1734 , 1736 .
- Referring to FIG. 18 , an illustrative screen display 1800 of an embodiment of the present invention is shown.
- an annotated query display area 1800 displays an annotated query 1810 with a set of previously identified entities 1812 , 1814 , 1816 for the query 1810 .
- Referring to FIG. 19 , an illustrative screen display 1900 of an embodiment of the present invention is shown.
- an annotated entity display area 1900 displays an annotated entity 1912 , 1914 , 1916 , 1918 , 1920 with a set of previously identified metadata 1922 , 1924 , 1926 , 1928 , 1930 for each entity.
- In each of the annotated query display area 1800 and the annotated entity display area 1900 , the set of previously identified entities for a query, or the metadata for the entity, is automatically populated with previously existing information. This allows the intent disambiguation engine 320 of FIG. 3 , for example, to retrieve valuable information for a user with minimal time, effort, and resources.
- Referring to FIG. 20 , an illustrative screen display 2000 of an embodiment of the present invention is shown.
- the metadata provided by the entity preview area 2010 is provided for related intents 2032 , 2034 , 2036 , 2038 .
- the related intents 2032 , 2034 , 2036 , 2038 are the result of a user executing a query for a selected primary intent, contextual signals, and further interaction.
- a user may select to execute the search for one of the entities, such as the primary intent 1032 of FIG. 10 .
- any interactions (e.g., query-entity interactions, entity clicks, etc.) with the search bar 2020 may provide contextual information for the selected intent suggestion provided by the entity preview area 2010 .
- in that case, rather than the original intent suggestions, related intents 2032 , 2034 , 2036 , 2038 are provided.
- the related intents 2032 , 2034 , 2036 , 2038 allow the user to continue the search experience recognizing that the intent has changed in accordance with the contextual information and further interaction with the search bar 2020 .
- a search prefix is received from a user (e.g., utilizing the search prefix receiving component 322 of the intent disambiguation engine 320 of FIG. 3 ).
- the search prefix comprises one or more characters associated with a search query.
- one or more intent suggestions are provided to the user (e.g., utilizing the autosuggest component 324 of the intent disambiguation engine 320 of FIG. 3 ).
- the one or more intent suggestions may be based on a comparison of the search prefix to an autosuggest store.
- the one or more intent suggestions may be retrieved by an application programming interface (API) call to the autosuggest store.
- the one or more intent suggestions may be rendered by a separate service from the intent disambiguation engine 320 .
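The specification later notes (with reference to FIG. 3 ) that the engine may operate using a completion trie. A minimal sketch of such a prefix-to-suggestion lookup is shown below; the class name, scoring scheme, and example queries are illustrative assumptions, not taken from the specification:

```python
# Minimal completion trie for prefix-based intent suggestions.
# Scores, names, and example queries are illustrative assumptions.

class CompletionTrie:
    def __init__(self):
        self.root = {}

    def insert(self, query, score):
        # Walk/create one node per character; store completions at "$".
        node = self.root
        for ch in query:
            node = node.setdefault(ch, {})
        node.setdefault("$", []).append((score, query))

    def suggest(self, prefix, limit=5):
        # Descend to the node matching the search prefix.
        node = self.root
        for ch in prefix:
            if ch not in node:
                return []
            node = node[ch]
        # Collect every completion beneath the prefix node.
        results = []
        stack = [node]
        while stack:
            cur = stack.pop()
            for key, child in cur.items():
                if key == "$":
                    results.extend(child)
                else:
                    stack.append(child)
        results.sort(reverse=True)  # highest-scored completions first
        return [q for _, q in results[:limit]]

trie = CompletionTrie()
for query, score in [("rihanna", 98), ("rihanna songs", 75),
                     ("bellevue weather", 60), ("bellevue wa", 40)]:
    trie.insert(query, score)

print(trie.suggest("rih"))
print(trie.suggest("bellevue"))
```

In this sketch the autosuggest store is the trie itself; in the arrangement described above, the same lookup could equally sit behind an API call rendered by a separate service.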
- One or more entity IDs associated with the intent suggestions are identified as indicated at block 2114 (e.g., utilizing the entity identification component 326 of the intent disambiguation engine 320 of FIG. 3 ).
- the one or more intent suggestions may be based on an entity ranking.
- the entities associated with the intent suggestions that are the most likely target or intent of the search may be ranked and identified.
- the ranking may be in accordance with the ranking methodology described herein.
- the one or more entities are ranked based on entity-intrinsic signals, query-entity interactions by users, and query pattern likelihood scores.
- the query pattern likelihood scores are based on entity type, quality standards independent of an individual entity, quality standards independent of associated queries, dominance of one particular entity over another, and non-entity associations of the query.
- the ranked entities are associated with a set of user queries.
- the set of user queries may be associated with a single user or multiple users over time.
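The three signal families named above (entity-intrinsic signals, query-entity interactions, and query pattern likelihood scores) can be combined in many ways; one simple weighted-sum sketch follows. The weights, field names, and candidate entities are assumptions for illustration only:

```python
# Illustrative entity ranking over the three signal families named in the
# specification; weights and field names are assumptions.

def rank_entities(entities, weights=(0.4, 0.4, 0.2)):
    """Rank candidate entities for an intent suggestion.

    Each entity dict carries (all normalized to 0..1):
      intrinsic    - entity-intrinsic signal (e.g., popularity)
      interactions - query-entity interaction rate across users
      pattern      - query pattern likelihood score
    """
    w_i, w_q, w_p = weights
    scored = [
        (w_i * e["intrinsic"] + w_q * e["interactions"] + w_p * e["pattern"], e)
        for e in entities
    ]
    scored.sort(key=lambda s: s[0], reverse=True)
    return [e for _, e in scored]

candidates = [
    {"id": "rihanna/musician", "intrinsic": 0.9, "interactions": 0.8, "pattern": 0.7},
    {"id": "rihanna/movie",    "intrinsic": 0.3, "interactions": 0.2, "pattern": 0.4},
]
ranked = rank_entities(candidates)
print(ranked[0]["id"])  # the musician entity dominates for this prefix
```

Dominance of one entity over another, as described above, falls out of the relative scores: the top-ranked entity is the most likely target or intent of the search.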
- An aggregated intent preview is provided as indicated at block 2116 (e.g., utilizing the preview component 330 of the intent disambiguation engine 320 of FIG. 3 ).
- the aggregated intent preview comprises metadata corresponding to one or more entities associated with at least one of the one or more entity IDs (the metadata is retrieved, for example, by the metadata component 328 of the intent disambiguation engine 320 of FIG. 3 ).
- the metadata may be retrieved in a separate API call from the user interface than the API call that retrieves the one or more intent suggestions.
- a separate service renders the metadata than the service rendering the one or more intent suggestions and/or the intent disambiguation engine 320 .
- a refinement request is received from the user.
- the refinement request comprises an indication the user has selected an item associated with the one or more entities. More simply, the refinement request is an indication the user determined to refine or narrow the focus or intent of the search.
- the item of metadata may correspond to a subentity (i.e., a subset of metadata associated with the entity that may focus on one aspect associated with or further define or distinguish the entity). Metadata associated with the selected subentity is retrieved, for example, by the metadata component 328 of the intent disambiguation engine 320 of FIG. 3 .
- a refined intent preview is provided as indicated at block 2120 (e.g., utilizing the refinement component 334 of the intent disambiguation engine 320 of FIG. 3 ).
- the refined intent preview allows the user to narrow the intent of the search without executing it. For example, the user may enter the search prefix “bellevue weath.” Based on this search prefix, one of the intent suggestions provided may be “bellevue weather.” After this intent suggestion is selected, either automatically based on confidence or manually by the user, the aggregated intent preview may comprise metadata corresponding to “Bellevue, Washington weather”, “Bellevue, Wyoming weather”, and “Bellevue, Ohio weather”. Based on the intent of the user, the user is able to identify the appropriate location of the desired weather and refine the intent accordingly. After the user refines the intent to the desired location, additional metadata associated with that selected location may be provided and refined even further, as described herein.
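The preview-then-refine flow of blocks 2110 - 2120 can be sketched end to end with the “bellevue weath” example. The stores, entity IDs, and metadata below are stand-ins invented for illustration; none are defined by the specification:

```python
# Sketch of the preview-then-refine flow: suggestion lookup, aggregated
# intent preview, then a refinement request. All data is illustrative.

AUTOSUGGEST = {"bellevue weath": ["bellevue weather"]}
ENTITY_IDS = {"bellevue weather": ["bellevue-wa", "bellevue-wy", "bellevue-oh"]}
METADATA = {
    "bellevue-wa": {"label": "Bellevue, Washington weather", "forecast": "rain"},
    "bellevue-wy": {"label": "Bellevue, Wyoming weather", "forecast": "snow"},
    "bellevue-oh": {"label": "Bellevue, Ohio weather", "forecast": "sun"},
}

def aggregated_intent_preview(prefix):
    """Return preview metadata for every entity behind the top suggestion,
    without executing the search itself."""
    suggestions = AUTOSUGGEST.get(prefix, [])
    if not suggestions:
        return None, []
    suggestion = suggestions[0]
    preview = [METADATA[eid] for eid in ENTITY_IDS.get(suggestion, [])]
    return suggestion, preview

def refine(entity_id):
    """Refinement request: the user selected one entity from the preview."""
    return METADATA[entity_id]

suggestion, preview = aggregated_intent_preview("bellevue weath")
print(suggestion)
print([m["label"] for m in preview])
print(refine("bellevue-wa"))
```

Note that no search results are retrieved at any step; only metadata for candidate entities is surfaced, matching the non-committal character of the preview.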
- Referring to FIG. 22 , a block diagram is provided illustrating an exemplary computing system 2200 in which embodiments of the present invention may be employed.
- the computing system 2200 illustrates an environment in which intents and query understanding may be provided and merged from a remote service and a local front-end application (e.g., INTERNET EXPLORER) as part of the search intent preview experience.
- the computing system 2200 generally includes intent expression engine 2210 , remote service 2240 , remote data source 2242 , local service or application 2250 , local data sources 2252 , 2254 , 2256 , 2258 , prioritization rules 2260 , user computing devices 2270 (e.g., mobile device, television, kiosk, watch, touch screen or tablet device, workstation, gaming system, internet-connected consoles, and the like) which may also provide local intents and query understanding, and an intent disambiguation engine 2280 (e.g., intent disambiguation engine 320 as shown in FIG. 3 ) in communication with one another via a network 2202 .
- the network 2202 may include, without limitation, one or more local area networks (LANs) and/or wide area networks (WANs). Such networking environments are commonplace in offices, enterprise-wide computer networks, intranets and the Internet. Accordingly, the network 2202 is not further described herein.
- Any number of intent expression engines 2210 , user computing devices 2270 , and/or intent disambiguation engines 2280 may be employed in the computing system 2200 within the scope of embodiments of the present invention. Each may comprise a single device/interface or multiple devices/interfaces cooperating in a distributed environment.
- the intent disambiguation engine 2280 may comprise multiple devices and/or modules arranged in a distributed environment that collectively provide the functionality of the intent disambiguation engine 2280 described herein. Additionally, other components or modules not shown also may be included within the computing system 2200 .
- one or more of the illustrated components/modules may be implemented as stand-alone applications. In other embodiments, one or more of the illustrated components/modules may be implemented via the intent expression engine 2210 , a user computing device 2270 , the intent disambiguation engine 2280 , or as an Internet-based service. It will be understood by those of ordinary skill in the art that the components/modules illustrated in FIG. 22 are exemplary in nature and in number and should not be construed as limiting. Any number of components/modules may be employed to achieve the desired functionality within the scope of embodiments hereof. Further, components/modules may be located on and/or shared by any number of intent disambiguation engines, intent expression engines and/or user computing devices.
- the intent disambiguation engine 2280 might be provided as a single computing device (as shown), a cluster of computing devices, or a computing device remote from one or more of the remaining components.
- the intent disambiguation engine 2280 and intent expression engine 2210 could be provided together on a single computing device, a cluster of computing devices, or a computing device remote from one or more of the remaining components.
- the intent disambiguation engine 2280 and intent expression engine 2210 may be provided by a single entity or multiple entities.
- a search engine provider could provide both the intent disambiguation engine 2280 and intent expression engine 2210 .
- a search provider could provide the intent disambiguation engine 2280 and a separate provider could provide the intent expression engine 2210 . Any and all such variations are contemplated to be within the scope of embodiments herein.
- Each of the user computing devices 2270 and the intent disambiguation engine 2280 may be similar to the user devices 310 and intent disambiguation engine 320 , respectively, discussed above with reference to FIG. 3 .
- the intent disambiguation engine 2280 may include a number of components (search prefix component, autosuggest component, entity identification component, metadata component, preview component, ranking component, refinement component, and action component) and operate using a completion trie and entity date store in a manner similar to that described above with reference to FIG. 3 . As such, details of these components of FIG. 22 will not be described in further detail here.
- the intent expression engine 2210 generally operates to merge signals from remote and local data sources to identify one or more intent suggestions that are provided to users on user computing devices 2270 . As shown in FIG. 22 , the intent expression engine 2210 includes, in various embodiments, interaction component 2212 , remote data component 2214 , local data component 2216 , merge component 2218 , rule component 2220 , priority component 2222 , and preview component 2224 .
- Interaction component 2212 receives a search interaction from a user, the search interaction comprising an interaction with a device or application or a learned intent based on a previous interaction.
- the search interaction comprises an interaction with a device or application or a learned intent based on a previous interaction.
- the user may have searched on multiple occasions for a particular item or information (e.g., a weather forecast or stock prices).
- the search interaction may become a learned intent based on these previous interactions.
- the interaction may include a search prefix comprising one or more characters associated with a search query.
- the interaction may include a gesture or voice command.
- the interaction may include a navigation within an application, a user interface, or on a device such as a movement of a cursor, mouse, or a touch on a display.
- remote data component 2214 receives the remote data from a remote data source.
- the remote data provides one or more intent suggestions based on the search interaction.
- the remote data source may include remote data provided by the intent disambiguation engine as described above with respect to FIG. 3 .
- local data component 2216 receives the local data from each available device or embedded application.
- the local data provides one or more intent suggestions based on the search interaction.
- the local data may be favorites or preferences associated with the device or application from which the search interaction is received or initiated.
- the local data may be capabilities, functionalities, tasks, or actions provided by the device or application.
- the local data may be local device information.
- the local data may be local data associated with an application or residing on or accessible by the application or device.
- Merge component 2218 merges remote data with local data to personalize a result set comprising one or more entity identifications associated with the one or more intent suggestions.
- rule component 2220 generates a set of rules based on an identification of a host application or device and a nature of the host application or device.
- Priority component 2222 prioritizes the result set based on the set of rules associated with each available device or embedded application.
- For example, if the host application is a web browser, the nature of the host application is to browse websites; the set of rules may then be generated ranking the result set with websites higher than an entity identification that launches an application.
- the local data may include favorites, pinned websites, and the like that are identified within local data of the browser application.
- the host device may be an XBOX. Because the nature of the XBOX is tailored to games, movies, and music, the set of rules may be generated ranking the result set to launch games, movies, and music higher than other results. Or the set of rules may be generated ranking the result set to launch already installed items higher than finding items that are not currently identified within the local data (i.e., local data items are ranked higher than remote data items).
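The merge-and-prioritize behavior described above (a browser host favoring websites, a game console favoring already-installed media and applications) can be sketched as follows. The rule tables, field names, and example items are illustrative assumptions, not part of the specification:

```python
# Sketch of merging remote and local intent suggestions and prioritizing
# the merged result set by host-specific rules. Rule tables are assumed.

HOST_RULES = {
    # A browser host ranks websites ahead of application launches.
    "web_browser": {"website": 0, "app_launch": 1, "media": 2},
    # A game console ranks installed media/games ahead of websites.
    "game_console": {"media": 0, "app_launch": 1, "website": 2},
}

def merge_and_prioritize(remote, local, host):
    """Merge remote and local suggestion lists (local items first on ties),
    then order the merged set by the host's rule table."""
    rules = HOST_RULES[host]
    merged = local + [r for r in remote if r not in local]
    merged.sort(key=lambda item: rules.get(item["kind"], 99))  # stable sort
    return merged

remote = [{"name": "halo wiki", "kind": "website"}]
local = [{"name": "Halo 4", "kind": "media"},
         {"name": "Halo app", "kind": "app_launch"}]

console = merge_and_prioritize(remote, local, "game_console")
browser = merge_and_prioritize(remote, local, "web_browser")
print([i["name"] for i in console])
print([i["name"] for i in browser])
```

The same merged set is thus personalized differently per host: the console surfaces the installed item first, while the browser surfaces the website first.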
- Preview component 2224 provides the result set to the user, the result set including an aggregated intent preview comprising metadata corresponding to one or more entities associated with at least one of the one or more entity identifications.
- the user may interact with the aggregated intent preview by selecting the desired metadata to refine the search (prior to actually executing the search) or to execute the search.
- the user may also interact with the aggregated intent preview by selecting the desired metadata to accomplish a task or launch an application.
- screen displays of intent expression for search in an embedded application context are provided. It should be understood that the screen displays are provided by way of example only and should not be viewed as limiting. The screen displays may appear in any order and with any number of screen displays, without regard to whether the screen display is described or depicted herein.
- an interaction display area displays a search bar 2310 for receiving a search interaction from a user corresponding to an unexecuted search.
- the search bar 2310 may be any area within an application or service capable of receiving a user interaction and is not limited to an area that is designed solely for receiving a search interaction.
- the search bar 2310 may be any area of a display provided to the user and the search interaction may be the user looking or gesturing to a certain portion of the display.
- the search bar 2310 may also include any area of a display provided to the user capable of receiving any sort of user interaction (e.g., any area the user can type and not one that is merely there to receive search terms).
- the search bar 2310 may not even be a visible portion of a display; rather, the search bar 2310 may be voice recognition associated with an application or service such that the user interaction is received when the user communicates audibly with the application or service.
- Autosuggest display area 2320 displays, without executing the search, one or more intent suggestions to the user.
- the one or more intent suggestions comprise remote data and local data based on the search interaction.
- the local data is received from each available device (e.g., the device the search interaction was initiated by or received from) or embedded application.
- the remote data is received from a remote data source.
- Entity display area 2330 displays, without executing the search, an aggregated intent preview.
- the aggregated intent preview comprises metadata associated with at least one entity corresponding to entity identifications associated with the one or more intent suggestions.
- the aggregated intent preview is prioritized in accordance with a set of rules associated with each available device or embedded application.
- the set of rules may be generated based on a host application and/or a nature of the host application.
- the aggregated intent preview may comprise user-selectable tiles 2332 , 2334 , 2336 , 2338 , 2340 , 2342 associated with at least one entity corresponding to entity identifications associated with the one or more intent suggestions.
- Each of the tiles 2332 , 2334 , 2336 , 2338 , 2340 , 2342 may be selectable, such as to further refine the intent of the search, but without executing the search, or enable action or completion of a particular task, such as those actions and tasks described herein.
- Each of the autosuggest display area 2320 and entity display area 2330 may be provided in a flyout surface area 2350 on the device or inside a browser chrome to provide small targeted intents with a rich display of structured content.
- the flyout surface area 2350 displays a flyout surface in response to the user interaction.
- the flyout surface is an entry point for intent expression inside an application.
- FIG. 24 provides an illustrative screen display 2400 in which a search interaction has been received with a high navigational intent (e.g., Amazon).
- Entity display area 2410 displays, without executing the search, an aggregated intent preview comprising user-selectable tiles 2412 , 2414 , 2416 , 2418 associated with at least one entity corresponding to entity identifications associated with one or more intent suggestions.
- Each of the tiles 2412 , 2414 , 2416 , 2418 may be selectable, such as to further refine the intent of the search, but without executing the search, or enable action or completion of a particular task, such as those actions and tasks described herein.
- the tiles include a main tile 2412 corresponding with a main entity (e.g., Amazon.com) and subtiles 2414 , 2416 , 2418 , corresponding with subentities, which are deeplinks in this case.
- the term “deeplink” refers to a hyperlink to web pages associated with the main destination web page that allow the user to access specific information or perform a specific task.
- the deeplinks may include subtiles corresponding to local data (e.g., launching an application that is installed on a device); in the illustrated example, advertisements 2332 and 2334 have been presented for two of the deeplinks.
- Entity display area may further include non-navigational tiles 2422 , 2424 , 2426 , such as related entities corresponding to a category associated with the primary entity.
- the category associated with Amazon.com may be ecommerce.
- other ecommerce sites may be suggested (e.g., Ebay).
- local data may be received indicating that a particular application (e.g., an Ebay application) is installed on the device.
- a non-navigational tile may be provided that allows the user to launch that application (e.g., the Ebay application).
- FIG. 25 provides a screen display 2500 in which the user has entered the search prefix “fa” which results in a number of intent suggestions provided in an autosuggest display 2510 .
- the entity display area includes tiles 2512 , 2514 , 2516 , 2518 , 2520 , 2522 , 2524 , 2526 , 2528 , 2530 , which include metadata associated with at least one entity corresponding to entity identifications associated with the one or more intent suggestions.
- Each of the tiles 2512 , 2514 , 2516 , 2518 , 2520 , 2522 , 2524 , 2526 , 2528 , 2530 corresponds to an intent suggestion based on remote data and local data (e.g., an application installed on the device where the search interaction was received or initiated, a local file on that device, remote data, local data from the application where the search interaction was received or initiated, or various sub-sources such as search-engine-indexed data, structured or unstructured data from sources not harvested by a web search engine (e.g., deep web data) that may sit behind a website or application, or advertisements).
- FIG. 26 provides an illustrative screen display 2600 in which various intent suggestions are provided based on merged local and remote data.
- the various intent suggestions may be provided in a format or include metadata that identifies a data source of the intent suggestions.
- the intent suggestion may further suggest an action or task to the user (e.g., execute a search, install or launch an application and the like).
- FIG. 27 provides an illustrative screen display 2700 in which the intent suggestions utilize both remote and local data to provide suggestions to the user.
- a search interaction may have been received indicating the user is interested in a place to eat.
- local data is received from the device on which the search interaction was initiated, providing a location of the user (e.g., via GPS, Wi-Fi triangulation, and the like). Consequently, the entity display area 2710 provides intent suggestions 2712 , 2714 that merge both remote and local data.
- FIG. 28 provides an illustrative screen display 2800 in which the search interactions are based on learned intent.
- the user may have a habit of searching for the weather, movie times, or a stock price on a regular basis.
- a learned intent display area 2810 displays, prior to receiving the search interaction from the user, at least a portion of the result set 2812 , 2814 , 2816 .
- the portion of the result set is based on a previous interaction and comprises remote data and local data based on the learned intent.
- the local data may provide context to the one or more intent suggestions (such as weather in a particular location the user initiates the search, movie times for a nearby theater, or stock prices for nearby companies).
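The learned-intent behavior above (repeated searches becoming a proactive suggestion, contextualized by local data) can be sketched as follows. The repeat threshold, function names, and example history are assumptions for illustration:

```python
# Sketch of deriving a learned intent from repeated prior searches and
# attaching local device context. Threshold and fields are assumed.

from collections import Counter

def learned_intents(history, min_repeats=3):
    """Treat any query issued at least min_repeats times as a learned intent."""
    counts = Counter(history)
    return [q for q, n in counts.items() if n >= min_repeats]

def contextualize(intent, local_context):
    """Attach local device context (e.g., the user's location) to an intent."""
    if intent == "weather":
        return f"weather in {local_context['location']}"
    return intent

history = ["weather", "news", "weather", "stock msft", "weather"]
intents = learned_intents(history)
print(intents)
print(contextualize("weather", {"location": "Bellevue, WA"}))
```

Such learned intents could populate a display area like 2810 before any new search interaction is received, as described above.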
- a search interaction is received from a user.
- the search interaction comprises an interaction with a device or application or a learned intent based on a previous interaction.
- the user may have searched on multiple occasions for a particular item or information (e.g., a weather forecast or stock prices).
- the search interaction may become a learned intent based on these previous interactions.
- the interaction may include a search prefix comprising one or more characters associated with a search query.
- the interaction may include a gesture or voice command.
- the interaction may include a navigation within an application or on a device such as a movement of a cursor, mouse, or a touch on a display.
- remote data is received from a remote data source.
- the remote data provides one or more intent suggestions based on the search interaction.
- the remote data source may include remote data provided by the intent disambiguation engine as described above with respect to FIG. 3 .
- local data is received from each available device or embedded application. The local data provides one or more intent suggestions based on the search interaction.
- the remote data is merged with the local data, at block 2914 , to personalize a result set.
- the result set comprises one or more entity identifications associated with one or more intent suggestions.
- the result set is prioritized, at block 2916 , based on a set of rules associated with each available device or embedded application.
- a host application is identified. A nature of the host application may be determined.
- the set of rules may be generated based on the host application and/or the nature of the host application.
- a host device is identified. The set of rules may be generated based on the host device and/or the nature of the host device.
- For example, if the entry point of the search interaction is identified as INTERNET EXPLORER, the nature of the host application may be determined as browsing websites.
- a set of rules may then prioritize the result set according to items specified by the user within the application, such as typed URLs, favorites (e.g., INTERNET EXPLORER favorites), browser history, domain suggestions, and search suggestions (i.e., as provided by the remote data source).
- the result set can be tailored to the user taking into account personalized settings and preferences within the application and/or device itself.
- the rules may also identify and prioritize tasks related to applications installed on or functionalities provided by the device.
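The browser entry-point ordering described above (typed URLs, then favorites, then browser history, then domain suggestions, then remote search suggestions) can be sketched as a source-priority sort. The source labels and example results are assumptions mirroring the example, not part of the specification:

```python
# Sketch of the browser rule set: order the personalized result set by
# source, local user-specified sources ahead of remote suggestions.

SOURCE_PRIORITY = ["typed_url", "favorite", "history",
                   "domain_suggestion", "search_suggestion"]

def prioritize(result_set):
    # Unknown sources sort last.
    order = {src: i for i, src in enumerate(SOURCE_PRIORITY)}
    return sorted(result_set, key=lambda r: order.get(r["source"], len(order)))

results = [
    {"value": "facebook.com", "source": "search_suggestion"},
    {"value": "fantasy football", "source": "history"},
    {"value": "family.example.com", "source": "favorite"},
]
ordered = prioritize(results)
for r in ordered:
    print(r["source"], r["value"])
```

Because the sort is stable, results from the same source keep their original relative order, so each source's own ranking survives the host-level prioritization.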
- the result set is provided to the user.
- the result set includes an aggregated intent preview comprising metadata corresponding to one or more entities associated with at least one of the one or more entity identifications.
- the result set allows the user to further refine the search without actually executing the search by interacting further with the entity identifications provided in the result set.
- a refinement request is received from the user.
- the refinement request comprises an indication that the user has selected an item of metadata associated with the one or more entities.
- the item of metadata corresponds to a subentity.
- a refined intent preview comprises metadata corresponding to the subentity, allowing the user to further refine or execute the search.
- embodiments of the present invention provide systems, methods, and computer-readable storage media for, among other things, non-committal intent preview, disambiguation, and refinement of a search.
- a search prefix comprising one or more characters associated with an unexecuted search query may be received.
- One or more intent suggestions may be suggested to a user.
- one or more entity identifications associated with each of the one or more intent suggestions may be received.
- Metadata corresponding to at least one entity associated with the one or more entity identifications may be retrieved from an entity data store. Without retrieving search results for the unexecuted search query, an aggregated intent preview based on the retrieved metadata corresponding to the at least one entity may be provided.
Description
- This patent application is a continuation-in-part of and claims priority to International Application No. PCT/CN2013/072599 (Attorney Docket No. 338258.01/MFCP.179914), filed Mar. 14, 2013, which is incorporated herein by reference in the entirety.
- Search engines and interfaces allow users to retrieve information by inputting search queries, for instance, into a search input region. While a user is inputting a search prefix associated with a search query, automatic systems provide likely completions or suggestions to the search prefix being input. When the user executes the search query, either by manually inputting the desired search query or by selecting a suggestion, the search engine directs the user to a search engine results page (“SERP”).
- This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.
- In various embodiments, systems, methods, computer storage media, and user interfaces are provided for query intent expression for search in an embedded application context. A search interaction is received from a user. The search interaction may comprise an interaction with a device or application or a learned intent based on a previous interaction. Remote data from a remote data source is received. Local data is received from each available device or embedded application. The remote data and/or local data may provide one or more intent suggestions based on the search interaction. The remote data is merged with the local data to personalize a result set comprising one or more entity identifications associated with the one or more intent suggestions. The result set may be prioritized based on a set of rules associated with each available device or embedded application. The result set is provided to the user and includes an aggregated intent preview comprising metadata corresponding to one or more entities associated with at least one of the one or more entity identifications.
- The present invention is illustrated by way of example and not limitation in the accompanying figures in which like reference numerals indicate similar elements and in which:
-
FIG. 1 is a block diagram of an exemplary computing environment suitable for use in implementing embodiments of the present invention; -
FIG. 2 is a flow diagram showing an exemplary method for intent preview, disambiguation, and refinement of a search, in accordance with an embodiment of the present invention; -
FIG. 3 schematically shows a network diagram suitable for performing embodiments of the present invention; -
FIGS. 4-20 depict illustrative screen displays, in accordance with exemplary embodiments of the present invention; -
FIG. 21 is a flow diagram showing an exemplary method for intent preview, disambiguation, and refinement of a search, in accordance with an embodiment of the present invention; -
FIG. 22 schematically shows another network diagram suitable for performing embodiments of the present invention; -
FIGS. 23-28 depict additional illustrative screen displays, in accordance with exemplary embodiments of the present invention; and -
FIG. 29 is a flow diagram showing an exemplary method for query intent expression for search in an embedded application context, in accordance with an embodiment of the present invention. - The subject matter of the present invention is described with specificity herein to meet statutory requirements. However, the description itself is not intended to limit the scope of this patent. Rather, the inventors have contemplated that the claimed subject matter might also be embodied in other ways, to include different steps or combinations of steps similar to the ones described in this document, in conjunction with other present or future technologies. Moreover, although the terms “step” and/or “block” may be used herein to connote different elements of methods employed, the terms should not be interpreted as implying any particular order among or between various steps herein disclosed unless and except when the order of individual steps is explicitly described.
- Users are often searching for a particular entity. Entities are instances of abstract concepts and objects, including people, places, things, events, locations, businesses, movies, and the like. Depending on the search query a user inputs or selects, the SERP may or may not include information about the particular entity the user is searching for.
- Existing autosuggestion systems do not address tail queries (i.e., infrequent or unseen queries) or intent that the system has not encountered or that is otherwise ambiguous during the query formulation process. Intent refers to the target of the search, which may be an entity. Further, existing autosuggestion systems do not allow disambiguation of intent or allow users to express intent prior to retrieving the SERP. Any change to the search query, such as selection of a suggestion or input of additional characters, causes the SERP to refresh, which can be distracting to the user and inefficient from a resource perspective. Still further, summarized data, such as in a search history or search session, is limited to presenting individual queries of a set. This can make it difficult for a user to ascertain the appropriate context or intent of a given session, which effectively limits the ability to share the data in a meaningful way.
- Various aspects of the technology described herein are generally directed to systems, methods, and computer-readable storage media for non-committal intent preview, disambiguation, and refinement of a search. A search prefix comprising one or more characters associated with an unexecuted search query is received. One or more intent suggestions are suggested to a user. For each of the one or more intent suggestions, one or more associated entity identifications are received. Metadata corresponding to at least one entity associated with the one or more entity identifications is retrieved from an entity data store. Without retrieving search results for the unexecuted search query, an aggregated intent preview based on the retrieved metadata corresponding to the at least one entity is provided. In embodiments, the one or more entities are ranked based on entity-intrinsic signals (i.e., number of attributes associated with an entity, entity type, number of information sources associated with an entity), query-entity interactions by users (i.e., explicit interactions or clicks on an entity in a search window or third party entity repository, interactions attributed to an entity via a query-url-entity tripartite graph), and query pattern likelihood scores, populating the intent suggestions or aggregated intent preview in order of relevance or likelihood of query intent. In embodiments, a refined intent preview associated with metadata corresponding to one or more subentities based on a selected item of metadata associated with the one or more entities is provided, conserving time and resources by allowing the user to further refine intent without executing the unexecuted search query.
In embodiments, task completion for a selected entity or subentity is enabled allowing the user to easily and quickly take a particular action or complete a task associated with the entity or subentity without having to execute the unexecuted search query. In other words, task completion refers to the opening and execution or completion of a task within an application, independent window, link, or process with or without affecting the search or search window. In embodiments, a set of queries issued by the user and entities corresponding to the set of queries may be provided, enabling the user to easily and quickly interact with a search history via the provided entities.
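The entity ranking mentioned above can be sketched in a few lines of Python. This is only an illustrative sketch: the field names, the linear blend, and the weights are assumptions chosen to show how entity-intrinsic signals, query-entity interactions by users, and query pattern likelihood scores might be combined, not the claimed implementation.

```python
# Illustrative sketch of ranking candidate entities. All field names and
# weights below are assumptions, not the patent's required implementation.
from dataclasses import dataclass, field

@dataclass
class Entity:
    name: str
    entity_type: str
    attributes: dict = field(default_factory=dict)   # entity-intrinsic signal
    sources: list = field(default_factory=list)      # entity-intrinsic signal
    click_count: int = 0             # query-entity interactions by users
    pattern_likelihood: float = 0.0  # query pattern likelihood score

def rank_entities(entities):
    """Order candidates so the most likely query intents surface first."""
    def score(e):
        intrinsic = len(e.attributes) + len(e.sources)
        return 0.5 * intrinsic + 0.3 * e.click_count + 0.2 * e.pattern_likelihood
    return sorted(entities, key=score, reverse=True)
```

An entity with many attributes and sources, frequent user interactions, and a high pattern likelihood would then populate the intent suggestions or aggregated intent preview ahead of sparser candidates.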
- Accordingly, one embodiment of the present invention is directed to one or more computer storage media storing computer-useable instructions that, when used by one or more computing devices, cause the one or more computing devices to perform a method of non-committal intent preview, disambiguation, and refinement of a search. The method includes receiving a search prefix from a user, the search prefix comprising one or more characters associated with a search query. One or more intent suggestions are provided to the user based on a comparison of the search prefix with an autosuggest data store. One or more entity identifications associated with the intent suggestions are identified based on an entity ranking. An aggregated intent preview comprising metadata corresponding to one or more entities associated with at least one of the one or more entity identifications is provided. A refinement request is received from the user. The refinement request comprises an indication that the user has selected an item of metadata corresponding to a subentity and associated with the one or more entities. A refined intent preview comprising metadata corresponding to the subentity is provided.
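The method steps above can be traced end to end with a toy sketch in which in-memory dictionaries stand in for the autosuggest data store and the entity data store; every name and data value below is an illustrative assumption rather than the stores' actual contents or schema.

```python
# Toy stand-ins for the autosuggest data store and entity data store
# (contents are illustrative assumptions).
AUTOSUGGEST_STORE = {"harry": ["harry potter movie", "harry potter book"]}
ENTITY_IDS = {"harry potter movie": ["e1"], "harry potter book": ["e2"]}
ENTITY_STORE = {"e1": {"title": "Harry Potter (film series)"},
                "e2": {"title": "Harry Potter (novel series)"}}

def aggregated_intent_preview(search_prefix):
    """Build an intent preview for an unexecuted query: search prefix ->
    intent suggestions -> entity IDs -> metadata, with no SERP retrieved."""
    suggestions = AUTOSUGGEST_STORE.get(search_prefix, [])
    entity_ids = [eid for s in suggestions for eid in ENTITY_IDS.get(s, [])]
    metadata = [ENTITY_STORE[eid] for eid in entity_ids if eid in ENTITY_STORE]
    return {"suggestions": suggestions, "preview": metadata}
```

A refinement request would then narrow the preview to a subentity's metadata, still without executing the search query.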
- Another embodiment of the present invention is directed to computer storage media having computer-executable instructions embodied thereon that, when executed by one or more computing devices, cause the one or more computing devices to produce a graphical user interface (GUI) for non-committal intent preview, disambiguation, and refinement of a search. The GUI includes a search display area that displays a search bar for receiving a search prefix corresponding to an unexecuted search from a user. An autosuggest display area displays, without executing the search, one or more intent suggestions to the user. An entity display area displays, without executing the search, an aggregated intent preview comprising metadata associated with at least one entity corresponding to entity identifications associated with the one or more intent suggestions. A refinement display area displays, without executing the search, a refined intent preview comprising metadata associated with a subentity corresponding to an item of metadata selected by the user and associated with the at least one entity.
- Yet another embodiment of the present invention includes a system for providing non-committal intent preview, disambiguation, and refinement of a search. The system includes one or more processors coupled to a computer storage medium, the computer storage medium having stored thereon a plurality of computer software components executable by the processor. The computer software components include an autosuggest component that receives a search prefix comprising one or more characters associated with an unexecuted search query and suggests one or more intent suggestions to a user. An entity identification component receives, for each of the one or more intent suggestions, one or more associated entity identifications. A metadata component retrieves metadata from an entity data store. The metadata corresponds to at least one entity associated with the one or more entity identifications. A preview component provides, without retrieving search results for the unexecuted search query, an aggregated intent preview based on the retrieved metadata corresponding to the at least one entity.
- Further embodiments are directed to query intent expression for search in an embedded application context discussed herein. Traditionally, search is thought of as a query based action taken by a user specifically to identify a piece of information. However, as described herein, search can be extended to any user interaction, such as with an application, user interface, operating system, device, or even extended to a learned intent based on previous user interactions. Thus, the entry point for a search can be anywhere the user is able to interact with the application, user interface, operating system, or device. A flyout surface area enables the user to interact with the aggregated intent preview within any application, user interface, operating system, or device to provide the user with rich suggestions, concrete, instant answers to questions (based on local and remote context), enable tasks or actions, and generally assist the user in refining an intent associated with the search. The flyout surface may be any secondary canvas or surface area for receiving a search interaction or providing search intent preview, disambiguation, and refinement of search.
- Accordingly, one embodiment of the present invention is directed to computer storage media having computer-executable instructions embodied thereon that, when executed by one or more computing devices, cause the one or more computing devices to perform a method of query intent expression for search in an embedded application context. The method includes receiving a search interaction from a user, the search interaction comprising an interaction with a device or application or a learned intent based on a previous interaction. Remote data is received from a remote data source, the remote data providing one or more intent suggestions based on the search interaction. Local data from each available device or embedded application is received, the local data providing one or more intent suggestions based on the search interaction. The remote data is merged with the local data to personalize a result set comprising one or more entity identifications associated with the one or more intent suggestions. The result set is prioritized based on a set of rules associated with each available device or embedded application. The result set is provided to the user, the result set including an aggregated intent preview comprising metadata corresponding to one or more entities associated with at least one of the one or more entity identifications.
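The merge and prioritization steps of this method might look like the following sketch. The pair format, the rule table, and the choice to let local data override remote attribution are assumptions made for illustration, not the patent's required implementation.

```python
# Hedged sketch of merging remote and local suggestion data and
# prioritizing the merged result set by per-source rules.
def merge_and_prioritize(remote, local, rules):
    """Merge remote and local suggestion lists into one personalized,
    prioritized result set.

    remote, local: lists of (entity_id, source) pairs, where source names
                   the device or embedded application that produced the
                   suggestion.
    rules:         {source: priority}; a lower number means the source's
                   suggestions surface first.
    """
    merged = {}
    # Local data is applied last so it personalizes (overrides) the remote
    # attribution for the same entity identification.
    for entity_id, source in remote + local:
        merged[entity_id] = source
    # Prioritize by the per-source rule table; unknown sources sink last.
    return sorted(merged.items(), key=lambda item: rules.get(item[1], 99))
```

For example, with rules {"calendar": 0, "web": 1}, an entity surfaced by a local calendar application would outrank the same entity identification arriving only from the remote web index.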
- Another embodiment of the present invention is directed to computer storage media having computer-executable instructions embodied thereon that, when executed by one or more computing devices, cause the one or more computing devices to produce a graphical user interface (GUI) for intent expression for search in an embedded application context. The GUI includes an interaction display area for receiving, from a user, a search interaction corresponding to a search. An autosuggest display area displays, without executing the search, one or more intent suggestions to the user, the one or more intent suggestions comprising remote data and local data based on the search interaction. An entity display area displays, without executing the search, an aggregated intent preview comprising metadata associated with at least one entity corresponding to entity identifications associated with the one or more intent suggestions, the aggregated intent preview being prioritized in accordance with a set of rules associated with each available device or embedded application.
- Yet another embodiment of the present invention includes a system for providing intent expression for search in an embedded application context. The system includes one or more processors coupled to a computer storage medium, the computer storage medium having stored thereon a plurality of computer software components executable by the processor. The computer software components include an interaction component that receives a search interaction from a user, the search interaction comprising an interaction with a device or application or a learned intent based on a previous interaction. A merge component merges remote data with local data to personalize a result set comprising one or more entity identifications associated with the one or more intent suggestions. A priority component prioritizes the result set based on a set of rules associated with each available device or embedded application. A preview component provides the result set to the user, the result set including an aggregated intent preview comprising metadata corresponding to one or more entities associated with at least one of the one or more entity identifications.
- Having briefly described an overview of embodiments of the present invention, an exemplary operating environment in which embodiments of the present invention may be implemented is described below in order to provide a general context for various aspects of the present invention. Referring to the figures in general and initially to
FIG. 1 in particular, an exemplary operating environment for implementing embodiments of the present invention is shown and designated generally as computing device 100. The computing device 100 is but one example of a suitable computing environment and is not intended to suggest any limitation as to the scope of use or functionality of embodiments of the invention. Neither should the computing device 100 be interpreted as having any dependency or requirement relating to any one component or any combination of components illustrated. - Embodiments of the invention may be described in the general context of computer code or machine-useable instructions, including computer-useable or computer-executable instructions such as program modules, being executed by a computer or other machine, such as a personal data assistant or other handheld device. Generally, program modules include routines, programs, objects, components, data structures, and the like, and/or refer to code that performs particular tasks or implements particular abstract data types. Embodiments of the invention may be practiced in a variety of system configurations, including hand-held devices, consumer electronics, general-purpose computers, more specialty computing devices, and the like. Embodiments of the invention may also be practiced in distributed computing environments where tasks are performed by remote-processing devices that are linked through a communications network.
- With continued reference to
FIG. 1, the computing device 100 includes a bus 110 that directly or indirectly couples the following devices: a memory 112, one or more processors 114, one or more presentation components 116, one or more input/output (I/O) ports 118, one or more I/O components 120, and an illustrative power supply 122. The bus 110 represents what may be one or more busses (such as an address bus, data bus, or combination thereof). Although the various blocks of FIG. 1 are shown with lines for the sake of clarity, in reality, these blocks represent logical, not necessarily actual, components. For example, one may consider a presentation component such as a display device to be an I/O component. Also, processors have memory. The inventors hereof recognize that such is the nature of the art, and reiterate that the diagram of FIG. 1 is merely illustrative of an exemplary computing device that can be used in connection with one or more embodiments of the present invention. Distinction is not made between such categories as “workstation,” “server,” “laptop,” “hand-held device,” etc., as all are contemplated within the scope of FIG. 1 and reference to “computing device.” - The
computing device 100 typically includes a variety of computer-readable media. Computer-readable media may be any available media that is accessible by the computing device 100 and includes both volatile and nonvolatile media, removable and non-removable media. Computer-readable media comprises computer storage media and communication media; computer storage media excluding signals per se. Computer storage media includes volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer-readable instructions, data structures, program modules or other data. Computer storage media includes, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by computing device 100. - Communication media, on the other hand, embodies computer-readable instructions, data structures, program modules or other data in a modulated data signal such as a carrier wave or other transport mechanism and includes any information delivery media. The term “modulated data signal” means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal. By way of example, and not limitation, communication media includes wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, RF, infrared and other wireless media. Combinations of any of the above should also be included within the scope of computer-readable media.
- The
memory 112 includes computer-storage media in the form of volatile and/or nonvolatile memory. The memory may be removable, non-removable, or a combination thereof. Exemplary hardware devices include solid-state memory, hard drives, optical-disc drives, and the like. The computing device 100 includes one or more processors that read data from various entities such as the memory 112 or the I/O components 120. The presentation component(s) 116 present data indications to a user or other device. Exemplary presentation components include a display device, speaker, printing component, vibrating component, and the like. - The I/
O ports 118 allow the computing device 100 to be logically coupled to other devices including the I/O components 120, some of which may be built in. Illustrative I/O components include a microphone, joystick, game pad, satellite dish, scanner, printer, wireless device, a controller, such as a stylus, a keyboard and a mouse, a natural user interface (NUI), and the like. - A NUI processes air gestures (i.e., motion or movements associated with a user’s hand or hands or other parts of the user’s body), voice, or other physiological inputs generated by a user. These inputs may be interpreted as search prefixes, search requests, requests for interacting with intent suggestions, requests for interacting with entities or subentities, or requests for interacting with advertisements, entity or disambiguation tiles, actions, search histories, and the like provided by the
computing device 100. These requests may be transmitted to the appropriate network element for further processing. A NUI implements any combination of speech recognition, touch and stylus recognition, facial recognition, biometric recognition, gesture recognition both on screen and adjacent to the screen, air gestures, head and eye tracking, and touch recognition associated with displays on the computing device 100. The computing device 100 may be equipped with depth cameras, such as stereoscopic camera systems, infrared camera systems, RGB camera systems, and combinations of these for gesture detection and recognition. Additionally, the computing device 100 may be equipped with accelerometers or gyroscopes that enable detection of motion. The output of the accelerometers or gyroscopes is provided to the display of the computing device 100 to render immersive augmented reality or virtual reality. - Aspects of the subject matter described herein may be described in the general context of computer-executable instructions, such as program modules, being executed by a computing device. Generally, program modules include routines, programs, objects, components, data structures, and so forth, which perform particular tasks or implement particular abstract data types. Aspects of the subject matter described herein may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules may be located in both local and remote computer storage media including memory storage devices.
- Furthermore, although the term “intent disambiguation engine” is used herein, it will be recognized that this term may also encompass a server, a Web browser, a set of one or more processes distributed on one or more computers, one or more stand-alone storage devices, a set of one or more other computing or storage devices, any application, process, or device capable of providing search functionality; search intent preview, disambiguation, and refinement as described herein; a combination of one or more of the above; and the like.
- As previously mentioned, embodiments of the present invention are generally directed to systems, methods, and computer-readable storage media for non-committal intent preview, disambiguation, and refinement of a search. A search prefix comprising one or more characters associated with an unexecuted search query is received. One or more intent suggestions are suggested to a user. For each of the one or more intent suggestions, one or more associated entity identifications are received. Metadata corresponding to at least one entity associated with the one or more entity identifications is retrieved from an entity data store. Without retrieving search results for the unexecuted search query, an aggregated intent preview based on the retrieved metadata corresponding to the at least one entity is provided. The one or more entities may be ranked based on entity-intrinsic signals, query-entity interactions by users, and query pattern likelihood scores. A refined intent preview associated with metadata corresponding to one or more subentities based on a selected item of metadata associated with the one or more entities may be provided. Task completion for a selected entity or subentity may be enabled. A set of queries issued by the user and entities corresponding to the set of queries may be provided. In embodiments, the entities enable the user to interact with a search history.
- Referring to
FIG. 2, a flow diagram is provided showing an exemplary method 200 for intent preview, disambiguation, and refinement of a search, in accordance with an embodiment of the present invention. The method 200 allows a user to surface content (e.g., an intent) that is difficult to find using common expressions. The user can enter the method 200 at any point. Similarly, a user can exit the method 200 at any point, such as by executing the unexecuted query. However, the user may determine that reentry is necessary to refine the query. - At the
concept stage 210, an initial essence of the query is expressed. For example, the user may begin inputting a search prefix associated with the search query “Harry Potter.” The user may actually type “Harry Potter” or an intent suggestion for “Harry Potter” may be provided and selected based on the search prefix. Because a search term like “Harry Potter” may map onto a large set of entities varying in type (e.g., books, characters, movies, actors, costumes, toys, and the like), the search term by itself may be ambiguous. In order to identify the intent or target of the search, intent suggestions identifying basic groups of entities or a few of the top-most ranked entity groups can be provided to the user. - At the
segment disambiguation stage 220, a type of entity may be expressed. For example, the user may type “Harry Potter movie” or select an intent suggestion “Harry Potter movie.” Similarly, at the entity disambiguation stage 230, more specific information regarding the type of entity may be expressed. For example, the user may desire information about a particular Harry Potter movie. The user may type “Harry Potter movie prisoner of Azkaban” or select an intent suggestion “Harry Potter movie prisoner of Azkaban.” Each token or word added to the unexecuted query string provides a deeper understanding of the intent. - At the
intent refinement stage 240, the user may focus the search on a particular aspect of the previewed entity. In the present example, the user may desire to locate information about the cast of the selected movie. For instance, the user may type or select “Harry Potter movie prisoner of Azkaban cast.” As previously mentioned, once the user is satisfied that the intent or target of the unexecuted search query has been properly identified, the user can execute the unexecuted search query, at the consume stage 250, and the SERP 252 is provided. The user may desire to narrow the focus of the search and may refine the search further at the react stage 260. - Referring now to
FIG. 3, a block diagram is provided illustrating an exemplary computing system 300 in which embodiments of the present invention may be employed. Generally, the computing system 300 illustrates an environment in which a search session may be conducted utilizing pre-existing search navigation patterns. Among other components not shown, the computing system 300 generally includes user computing devices 310 (e.g., mobile device, television, kiosk, watch, touch screen or tablet device, workstation, gaming system, internet-connected consoles, and the like) and an intent disambiguation engine 320 in communication with one another via a network 302. The network 302 may include, without limitation, one or more local area networks (LANs) and/or wide area networks (WANs). Such networking environments are commonplace in offices, enterprise-wide computer networks, intranets and the Internet. Accordingly, the network 302 is not further described herein. - It should be understood that any number of
user computing devices 310 and/or intent disambiguation engines 320 may be employed in the computing system 300 within the scope of embodiments of the present invention. Each may comprise a single device/interface or multiple devices/interfaces cooperating in a distributed environment. For instance, the intent disambiguation engine 320 may comprise multiple devices and/or modules arranged in a distributed environment that collectively provide the functionality of the intent disambiguation engine 320 described herein. Additionally, other components or modules not shown also may be included within the computing system 300. - In some embodiments, one or more of the illustrated components/modules may be implemented as stand-alone applications. In other embodiments, one or more of the illustrated components/modules may be implemented via a
user computing device 310, the intent disambiguation engine 320, or as an Internet-based service. It will be understood by those of ordinary skill in the art that the components/modules illustrated in FIG. 3 are exemplary in nature and in number and should not be construed as limiting. Any number of components/modules may be employed to achieve the desired functionality within the scope of embodiments hereof. Further, components/modules may be located on and/or shared by any number of intent disambiguation engines and/or user computing devices. By way of example only, the intent disambiguation engine 320 might be provided as a single computing device (as shown), a cluster of computing devices, or a computing device remote from one or more of the remaining components. - It should be understood that this and other arrangements described herein are set forth only as examples. Other arrangements and elements (e.g., machines, interfaces, functions, orders, and groupings of functions, etc.) can be used in addition to or instead of those shown, and some elements may be omitted altogether. Further, many of the elements described herein are functional entities that may be implemented as discrete or distributed components or in conjunction with other components, and in any suitable combination and location. Various functions described herein as being performed by one or more entities may be carried out by hardware, firmware, and/or software. For instance, various functions may be carried out by a processor executing instructions stored in memory.
- The
user computing device 310 may include any type of computing device, such as the computing device 100 described with reference to FIG. 1, for example. Generally, the user computing device 310 includes a display and is capable of executing a search or acting as a host for search results. The search process, among other things, is configured to render search engine home pages (or other online landing pages) and search engine results pages (SERPs), in association with the display of the user computing device 310. The user computing device 310 is further configured to receive user input of requests for various web pages (including search engine home pages), receive user input search queries, receive user input to refine search intent and/or take action on an entity (generally input via a user interface provided on the display and permitting alpha-numeric, voice, motion/gesture, and/or textual input into a designated search input region), and to receive content for presentation on the display, for instance, from the intent disambiguation engine 320. It should be noted that the functionality described herein as being performed by the user device 310 and/or intent disambiguation engine 320 may be performed by any operating system, application, process, web browser, web browser chrome or via accessibility to an operating system, application, process, web browser, web browser chrome, or any device otherwise capable of executing a search or acting as a host for search results. It should further be noted that embodiments of the present invention are equally applicable to mobile computing devices and devices accepting touch, gesture, and/or voice input. Any and all such variations, and any combination thereof, are contemplated to be within the scope of embodiments of the present invention. - The
intent disambiguation engine 320 of FIG. 3 is configured to, among other things, provide intent preview, disambiguation, and refinement of a search. The intent disambiguation engine 320 is additionally configured to, among other things, enable actions on entities and provide entity-centric search history and shared data. As illustrated, in various embodiments, the intent disambiguation engine 320 includes a search prefix component 322, an autosuggest component 324, an entity identification component 326, a metadata component 328, a preview component 330, a ranking component 332, a refinement component 334, and an action component 336. The illustrated intent disambiguation engine 320 also has access to a completion trie 340 and an entity data store 350. The completion trie 340 is a data store configured to store and associate intent suggestions with entity identifications (“entity IDs”). The entity data store 350 is a high-performance data store configured to provide fast lookup of entities and metadata associated with entities corresponding to one or more entity IDs identified by the completion trie 340. It will be understood and appreciated by those of ordinary skill in the art that the information stored in association with the completion trie 340 and the entity data store 350 may be configurable and may include any information relevant to search queries/terms/histories, intent suggestions, entity identifications, entities, and metadata associated with the entities. The content and volume of such information are not intended to limit the scope of embodiments of the present invention in any way.
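A data structure associating intent suggestions with entity IDs might be sketched as the minimal completion trie below. The node layout (full suggestion lists cached at every node along the path, trading memory for single-walk prefix lookup) is an assumption for illustration, not a description of how completion trie 340 is actually implemented.

```python
# Minimal completion trie: each node caches the (suggestion, entity IDs)
# pairs reachable through it, so prefix lookup is a single downward walk.
class CompletionTrie:
    def __init__(self):
        self.children = {}
        self.suggestions = []  # (suggestion, [entity_ids]) pairs

    def insert(self, suggestion, entity_ids=()):
        """Register an intent suggestion, tagged with its entity IDs."""
        node = self
        for ch in suggestion:
            node = node.children.setdefault(ch, CompletionTrie())
            node.suggestions.append((suggestion, list(entity_ids)))

    def complete(self, prefix):
        """Return intent suggestions (with entity IDs) for a search prefix."""
        node = self
        for ch in prefix:
            node = node.children.get(ch)
            if node is None:
                return []
        return node.suggestions
```

A fast entity store keyed by those entity IDs would then resolve each returned ID to the entity metadata shown in the aggregated intent preview.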
Further, though illustrated as two independent components, each of the completion trie 340 and the entity data store 350 may, in fact, be a plurality of storage devices, for instance a database cluster, portions of which may reside in association with the intent disambiguation engine 320, the user computing device 310, another external computing device (not shown), and/or any combination thereof. Further, the completion trie 340 and the entity data store 350 may be combined in a single storage device or database cluster. - The
search prefix component 322 of the intent disambiguation engine 320 is configured to receive a search prefix, for instance, utilizing search functionality associated with the user computing device 310. The search prefix comprises one or more characters associated with an unexecuted search query. Upon receiving the one or more characters associated with the unexecuted search query, the search prefix component 322 communicates the search prefix to the autosuggest component 324. - The
autosuggest component 324 of the intent disambiguation engine 320 is configured to receive the search prefix comprising one or more characters associated with an unexecuted search query. Upon receiving the search prefix, the autosuggest component 324 retrieves one or more intent suggestions associated with the search prefix. In one embodiment, the one or more intent suggestions are retrieved from the completion trie 340. The intent suggestions represent the most likely intent of the user and/or target(s) of the unexecuted search query. The most likely intent of the user and/or target of the unexecuted search query may be determined by determining the type of query and possible types of entities associated with that type of query. Each of the intent suggestions may also be associated with one or more entity IDs. An entity ID indicates the intent suggestion is associated with one or more entities and may assist the user in distinguishing one intent suggestion from another. - If the intent suggestion is associated with an entity ID, the entity identification component (“entity ID component”) 326 of the
intent disambiguation engine 320 is configured to retrieve the entity ID. The entity ID may be used to look up metadata associated with one or more entities that is stored, in one embodiment, in the entity data store 350. The entity ID may further describe or indicate the type of entity associated with the entity ID. Such indication may help the user readily locate or identify a particular search within a search history or share a particular search with others. - The
metadata component 328 of the intent disambiguation engine 320 is configured to retrieve metadata from the entity data store 350. The metadata corresponds to at least one entity associated with the one or more entity identifications. The metadata may include content associated with the entity, such as data or snippets of data that may be returned by or be available via links in search results for that entity. Metadata for multiple entities may be retrieved, allowing the user to narrow or refine a search. For example, a primary intent suggestion representing the likely primary focus of the search, as well as one or more secondary intent suggestions representing subcategories or subentities of the primary intent suggestion, can be retrieved. Similarly, a primary intent suggestion representing the most likely target of the search, as well as secondary intent suggestions representing less likely targets of the search, can be retrieved. A request to retrieve metadata, in one embodiment, is initiated when the user hovers over or selects an intent suggestion. In another embodiment, metadata for the first intent suggestion or most likely intent suggestion is automatically selected or retrieved. - The
preview component 330 of the intent disambiguation engine 320 is configured to provide an aggregated intent preview based on the retrieved metadata corresponding to the at least one entity (or a category of entities, e.g., “Seattle restaurants” or “Jackie Chan movies” and the like). The aggregated intent preview is provided without retrieving search results for the unexecuted search query. This allows the user to preview metadata associated with the intent suggestions without consuming the resources necessary to execute the full unexecuted search query. Rather than updating the SERP each time the user selects one of the intent suggestions, the aggregated intent preview provides the user with enough information about a particular entity to narrow the focus of the search. In other words, the aggregated intent preview provides a non-committal preview of one or more entities or subentities to help the user refine an intent associated with the search without committing to the search until the user is actually ready to execute the search. More simply, the aggregated intent preview does not distract the user by constantly refreshing a SERP associated with a search because the search query is not executed until the user is satisfied the intent of the search is properly identified and adequately focused. - For example, a user may be searching for a particular person or thing. After receiving a search prefix associated with a search query input by the user seeking information regarding that person or thing, the
autosuggest component 324 may retrieve several intent suggestions associated with the search prefix. Each of the intent suggestions may be associated with an entity ID that is associated with an intent suggestion that completes the search prefix (e.g., completes the spelling of one or more persons or things associated with the search prefix). The one or more entities or subentities identified in the aggregated intent preview are associated with the intent suggestion and may further identify one or more subcategories or subentities associated with the intent suggestion to help the user refine the search accordingly. - In one embodiment, the
ranking component 332 of the intent disambiguation engine 320 is configured to rank the one or more entities. The ranking may be utilized to automatically determine the intent or target of the unexecuted search query. The ranking may be based on entity-intrinsic signals, query-entity interactions by users, and/or query pattern likelihood scores. The entity-intrinsic signals may comprise a number of attributes or a number of information sources. For example, one intent suggestion may be ranked higher than another if it includes more attributes associated with a particular entity. Similarly, one intent suggestion may be associated with a particular entity that has a higher number of information sources than another intent suggestion associated with a different entity. Each of these entity-intrinsic signals may be utilized to assign a static ranking score to the intent suggestion, independent of the unexecuted search query. The same methodology can be utilized to rank and influence the display of entities or subentities provided in the aggregated intent preview. - The query pattern likelihood scores may be based on expected patterns. The expected patterns may be based on entity type, quality standards independent of an individual entity, quality standards independent of associated queries, dominance of one particular entity over another, non-entity associations of the query, and the like.
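By way of illustration and not limitation, the entity-intrinsic static ranking described above may be sketched as follows. The disclosure specifies only that more attributes and more information sources raise the score; the equal weighting and field names in this sketch are assumptions.

```python
# Sketch of a static, query-independent ranking score from the
# entity-intrinsic signals named above: the number of attributes and the
# number of information sources for an entity. Equal weighting of the two
# signals is an illustrative assumption.

def static_rank(entity):
    return len(entity.get("attributes", ())) + len(entity.get("sources", ()))

def rank_intent_suggestions(suggestions):
    """Order intent suggestions by the static rank of their entity."""
    return sorted(suggestions, key=lambda s: static_rank(s["entity"]), reverse=True)

# Hypothetical suggestions: the richer entity ranks first.
suggestions = [
    {"text": "hotel california song",
     "entity": {"attributes": ["artist", "album"], "sources": ["a"]}},
    {"text": "hotel california business",
     "entity": {"attributes": ["name", "location", "hours", "phone"],
                "sources": ["a", "b"]}},
]
```

Because the score depends only on the entity, it can be precomputed and stored with the entity, then reused for both intent suggestion ordering and the display of entities in the aggregated intent preview.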
- Expected patterns represent the system's identification of one or more expected terms, based on the entity type, associated with the intent suggestion. Expected patterns generally are based on data that is typically associated with an entity and that users have come to expect to be associated with the particular entity type. For example, each intent suggestion associated with an entity can be examined to identify expected patterns based on the entity type. If the entity type is a business, expected patterns of the intent suggestions may include business names, locations, types of businesses, and the like. On the other hand, if the entity type is a person, expected patterns of the intent suggestions may include first names, middle initials, locations, last names, occupations, and the like.
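By way of illustration and not limitation, the per-entity-type expected patterns described above may be sketched as follows. The term sets are drawn from the business and person examples in the paragraph above; the scoring function itself is an assumption.

```python
# Illustrative mapping of entity types to expected pattern terms, based on
# the business/person examples above. The fractional score is an assumed
# way to quantify how well observed terms match the expected pattern.

EXPECTED_PATTERNS = {
    "business": {"business_name", "location", "business_type"},
    "person": {"first_name", "middle_initial", "last_name",
               "location", "occupation"},
}

def pattern_score(entity_type, observed_terms):
    """Fraction of observed terms that are expected for this entity type."""
    expected = EXPECTED_PATTERNS.get(entity_type, set())
    if not observed_terms:
        return 0.0
    return len(expected & set(observed_terms)) / len(observed_terms)
```

A suggestion whose terms match the expected pattern for its entity type would contribute a higher query pattern likelihood score than one whose terms do not.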
- The quality standards may be independent of the individual entity but may be based on the entity type. For example, a determination can be made to ensure the query includes at least one well-known business name or person name. The quality standards may also be independent of the intent suggestions or unexecuted search query. For example, entities may only be included in the aggregated intent preview if they contain a minimum number of attributes or have been updated recently (e.g., within a predetermined or configurable amount of time). Thus, the quality standards ensure that items associated with the query or the entities included in the aggregated intent preview are expected or known (e.g., one or more known terms), meet minimum requirements (e.g., a minimum number of entity-intrinsic signals), and are up-to-date.
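By way of illustration and not limitation, the entity-independent quality standards above may be sketched as a simple gate. The particular thresholds (three attributes, thirty days) are illustrative assumptions standing in for the "predetermined or configurable" values the disclosure mentions.

```python
from datetime import datetime, timedelta

# Sketch of the quality standards described above: an entity is included
# in the aggregated intent preview only if it has a minimum number of
# attributes and was updated within a configurable window. The default
# thresholds here are illustrative assumptions.

def passes_quality(entity, now, min_attributes=3, max_age=timedelta(days=30)):
    """Gate an entity on minimum attribute count and recency of update."""
    recent = (now - entity["last_updated"]) <= max_age
    return len(entity["attributes"]) >= min_attributes and recent
```

Entities failing either bar would simply be omitted from the aggregated intent preview rather than shown with stale or sparse metadata.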
- In the instance where one particular entity (e) dominates intent suggestions for an unexecuted search query, it may be determined that intent suggestions associated with a less dominant entity (e′, e″, e′″, etc.) should not be provided for the unexecuted search query. When one entity (e) exceeds a particular configurable, predetermined, or automatically determined threshold (e.g., given a set of intent suggestions for an unexecuted search query, a percentage of those intent suggestions that corresponds to/is directed to the entity (e) meets or exceeds a threshold), entity (e) may be considered to dominate the intent suggestions for the unexecuted search query. For example, if over fifty percent of the intent suggestions for an unexecuted search query are associated with an entity (e), entity (e) dominates the intent suggestions for the unexecuted search query. As a result, it may be determined that intent suggestions associated with other entities (e′, e″, e′″, etc.) should not be provided for the unexecuted search query.
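By way of illustration and not limitation, the dominance test described above may be sketched as follows, using the over-fifty-percent example from the paragraph above as the threshold. The function name and input shape are assumptions.

```python
from collections import Counter

# Sketch of the dominance test described above: an entity (e) dominates
# the intent suggestions for an unexecuted search query when the share of
# suggestions directed to it exceeds a threshold (50% in the example
# above). Less dominant entities' suggestions would then be withheld.

def dominant_entity(suggestion_to_entity, threshold=0.5):
    """suggestion_to_entity: the entity ID associated with each suggestion."""
    counts = Counter(suggestion_to_entity)
    total = len(suggestion_to_entity)
    for entity, n in counts.most_common():
        if n / total > threshold:
            return entity
    return None  # no entity dominates; keep suggestions for all entities
```

When this returns an entity, intent suggestions associated with the other entities (e′, e″, e‴, etc.) would not be provided for the unexecuted search query.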
- However, in situations where multiple entity types may be identified as the possible or likely target or intent of the search, less dominant entities may be associated with the selected intent suggestion even when another more dominant query-entity pair exceeds the particular configurable or automatically determined threshold. For example, a business entity may be dominant to all other entities for the intent suggestion “hotel California.” However, song entities associated with the intent suggestion “hotel California” may actually be the target or intent of the user. Even if the business entity exceeds the threshold to be determined a dominant entity for that particular intent suggestion, the song entities are still associated with the intent suggestion until the actual intent or target of the unexecuted search query is determined.
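By way of illustration and not limitation, the exception described above (multiple possible entity types, as with the business versus the songs for "hotel California") may be sketched as follows. The function name, input shape, and threshold are assumptions.

```python
from collections import Counter

# Sketch of the exception above: when candidate entities for an intent
# suggestion span multiple entity types, less dominant entities remain
# associated with the suggestion even if one entity exceeds the dominance
# threshold, until the actual intent of the query is determined.

def entities_to_keep(candidates, threshold=0.5):
    """candidates: list of (entity_id, entity_type) pairs for one suggestion."""
    counts = Counter(e for e, _ in candidates)
    total = len(candidates)
    dominant = next((e for e, n in counts.most_common()
                     if n / total > threshold), None)
    types = {t for _, t in candidates}
    if dominant is not None and len(types) == 1:
        return [dominant]  # single entity type: drop less dominant entities
    return sorted({e for e, _ in candidates})  # keep all until intent is known
```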
- Similarly, non-entity associations of an intent suggestion may also be considered to determine whether a particular entity is dominant. For example, an intent suggestion or unexecuted search query may not have an entity intent (an entity intent suggests the intent or target of the search is an entity). In other words, the intent suggestion or target of the unexecuted search query is not an entity. The intent suggestion or the target of the unexecuted search query may instead target a web resource. In this instance, even when an entity (e.g., business or person entity) exists, the primary intent is the web resource and the query-entity association is dropped. The primary intent may be determined based on user signals at the time the search prefix is input, how the user interacts with the intent suggestions or aggregated intent preview (e.g., query-entity interactions, entity clicks or clicks on an entity in a search window or third party entity repository, etc.), a search history associated with the user (e.g., search logs, previous query-entity interactions, previous entity clicks or clicks on an entity in a search window or third party entity repository, etc.), and/or third party search history (e.g., search logs, previous third party query-entity interactions, previous third party entity clicks or clicks on an entity in a search window or third party entity repository, etc.).
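By way of illustration and not limitation, the non-entity determination above may be sketched as a comparison of interaction signals. The signal names and the simple majority rule are assumptions; the disclosure only names the categories of signals considered.

```python
# Sketch of the non-entity check above: if interaction signals (user
# signals at prefix time, query-entity interactions, user and third party
# search history) indicate the primary intent is a web resource rather
# than an entity, the query-entity association is dropped. Signal names
# and the comparison rule are illustrative assumptions.

def primary_intent(signals):
    """signals: aggregated counts of prior interactions for a suggestion."""
    if signals.get("web_resource_clicks", 0) > signals.get("entity_clicks", 0):
        return "web_resource"  # query-entity association is dropped
    return "entity"
```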
- The
refinement component 334 of the intent disambiguation engine 320 is configured to, without retrieving search results for the unexecuted search query, provide a refined intent preview. The refined intent preview is associated with metadata corresponding to one or more subentities. The one or more subentities are based on a selected item of metadata associated with the one or more entities. For example, a user may select or interact with an item from the aggregated intent preview. The selected item may be based on metadata corresponding to the one or more entities associated with an intent suggestion. The selected item may be associated with one or more subentities related to the entity. Such a selection allows the user to further refine the search by narrowing the focus or intent of the search without actually executing the unexecuted search query. - The
action component 336 of the intent disambiguation engine 320 is configured to enable task completion for a selected entity or subentity in association with the aggregated intent preview. This allows the aggregated intent preview not only to identify an intent of the search but also to allow the user to complete a task or action associated with the unexecuted search query. For example, a user may desire information about a particular movie. The action component allows the user to actually view or download the movie, such as on Netflix®. The action component may provide a link or tile that, upon selection, opens an application, independent window, link, or process to execute the task. In one embodiment, upon selection of the link or tile, the action component opens an application, independent window, link, or process without affecting the search window. In one embodiment, upon selection of the link or tile, the action component opens an application, independent window, link, or process and the search is refined or updated. In one embodiment, upon selection of the link or tile, the action component opens an application, independent window, link, or process and the search window is closed. As can be appreciated, any number of actions or tasks may be enabled by the action component 336. For example, an application may be available that relates to a particular entity or subentity. Upon selection, the application is installed on the user device. Similarly, tickets or reservations to a particular event or place can be purchased or made by the action component 336. The action component 336 may further enable third party components to execute external actions (e.g., reservations, purchases, and the like). In one embodiment, the action component 336 is configured to include paid placement text or display advertisements in association with the aggregated intent preview. - With reference to
FIGS. 4-19, illustrative screen displays for non-committal intent preview, disambiguation, and refinement of a search are provided. It is understood that each of the illustrative screen displays is connected logically, such that they comprise a user interface designed for non-committal intent preview, disambiguation, and refinement of a search. The screen displays may appear in any order and with any number of screen displays, without regard to whether the screen display is described or depicted herein. - Referring now to
FIG. 4, an illustrative screen display 400 of an embodiment of the present invention is shown. A search display area displays a search bar 410 for receiving a search prefix 412 from a user corresponding to an unexecuted search. Autosuggest display area 420 displays, without executing the search, one or more intent suggestions 421, 422, 423, 424, 425, 426, 427, 428 to the user. Entity display area 430 displays, without executing the search, an aggregated intent preview comprising metadata associated with at least one entity. As illustrated, the entities include a primary entity 432 that appears larger than the other entities (i.e., secondary entities). The primary entity 432 may be ranked higher than the other entities, such as by the ranking methodology described herein. The secondary entities may be subentities of the primary entity 432 or may be distinct entities altogether, such as lower ranked entities. Each of the entities may be selectable, such as to further refine the intent of the search, but without executing the search, or enable action or completion of a particular task, such as those actions and tasks described herein. - In
FIG. 5, an illustrative screen display 500 of an embodiment of the present invention is shown. The search display area displays a search bar 510 with the search prefix 512 "aven." Autosuggest display area 520 displays, without executing the search, one or more intent suggestions. Entity display area 530 displays, without executing the search, an aggregated intent preview comprising metadata associated with at least one entity. As illustrated, the entities include the primary entity 532, Marvel's The Avengers, and secondary entities. The secondary entities may be subentities or related entities of the primary entity 532. The primary entity 532 may be identified by the user, such as by selecting an intent suggestion, or may be automatically selected corresponding to a ranking identifying the most likely entity (thus, intent) of the search. Each of the entities is selectable, such as to further refine the intent of the search, but without executing the search, or enable action or completion of a particular task, such as those actions and tasks described herein. For instance, the user can narrow the search to identify images associated with "Marvel's The Avengers" by selecting the tile or secondary entity 534 (e.g., Images), or by typing additional characters into the search bar 510. Selection of the Images entity may narrow the search further, such as by identifying particular scenes or characters. This results in the display of a refinement display area that displays, without executing the search, a refined intent preview comprising metadata associated with a subentity corresponding to a selected item of metadata associated with the at least one entity. Selection of the Images entity may also enable the user to complete a task, such as by allowing the user to view images associated with the movie via another website, application, and the like. - Turning now to
FIG. 6, an illustrative screen display 600 of an embodiment of the present invention is shown. The search display area displays a search bar 610 with the search prefix 612. Autosuggest display area 620 displays, without executing the search, one or more intent suggestions to the user. The intent suggestions may be ranked, such as by the ranking methodology described herein. Entity display area 630 displays, without executing the search, an aggregated intent preview comprising metadata associated with one or more entities. As illustrated, multiple entities may be displayed, with the most likely, primary intent displayed more prominently in the entity display area 630 than the less likely, secondary intents. - As illustrated in
FIG. 6, the search prefix 612 "michal j" and selection of intent suggestion "michael jackson" may result in the display of three entities. Each of the entities displayed in the entity display area 630 is scaled, in one embodiment, based on relevance signals or likelihood of intent. For example, an entity tile associated with the primary intent 632 may appear larger or occupy more space in the entity display area 630 than other secondary intents or entities. - With reference now to
FIG. 7, an illustrative screen display 700 of an embodiment of the present invention is shown. In one embodiment, a popular now display area 710 displays metadata associated with one or more entities 732 corresponding to entity identifications not associated with a search prefix received in the search bar. In other words, the popular now display area 710 can be provided without receiving a search prefix corresponding to a search. Rather, the popular now display area 710 displays intent suggestions 720 associated with the most popular entities based on a search history, query-entity interactions, or entity click data. The search history, query-entity interactions, or entity click data may be associated with a user or group of users, such as a group of users associated with the user in a social network, location, place of employment, occupation, interest, proximity, subscription, school, demographics, and the like. The search history, query-entity interactions, or entity click data may be based on a configurable time range. The configurable time range may be set for any time range in the past, present, or future (i.e., such as for predicting likely popular searches, search terms, and entities in the future based on expected news, forecasts, events, schedules, holidays, press releases, product information, and the like). - In
FIG. 8, an illustrative screen display 800 of an embodiment of the present invention is shown. As illustrated, after a search prefix 812 is received in the search bar 810, the autosuggest display area 820 displays, without executing the search, one or more intent suggestions to the user. Once an intent suggestion 822 is selected, the entity display area 830 displays, without executing the search, an aggregated intent preview comprising metadata associated with at least one entity corresponding to entity identifications associated with the one or more intent suggestions. In this example, the intent suggestion 822 "kelly white" was selected. As illustrated, multiple entities corresponding to the entity identification associated with the intent suggestion "kelly white" are provided in the entity display area 830. Metadata corresponding to each entity is provided, allowing the user to determine which entity or "kelly white" is the intent or target of the search. In some instances, the search may need to be refined further, such as by selecting one of the entities in the entity display area 830. For example, there may be multiple entities or subentities associated with one of the entities. In this case, there may be multiple ski instructors named "Kelly White" in North Bend, Wash. The user may select the corresponding entity to refine the search further, in which case additional metadata is retrieved and provided for subentities associated with the selected entity, allowing the user to select the appropriate entity or subentity. - Turning now to
FIG. 9, an illustrative screen display 900 of an embodiment of the present invention is shown. Similar to the example set forth and illustrated in FIG. 8, FIG. 9 depicts an entity display area 930 displaying multiple entities associated with a selected intent suggestion 922. Each of the entities allows the user to further refine the search until the actual intent is determined and the search query is executed. For example, a user may type the search prefix 912 "canon cameras." Intent suggestions are provided in the autosuggest display area 920. Once the intent suggestion 922 is selected, either automatically based on intent confidence or manually by the user, metadata corresponding to entities associated with the intent suggestion is retrieved and provided in the entity display area 930. As desired, the user can refine the search further by selecting an entity or subentity, which results in subentities and corresponding metadata being provided in the entity display area 930. - With reference now to
FIG. 10, an illustrative screen display 1000 of an embodiment of the present invention is shown. As illustrated, the metadata provided by the entity display area 1030 is provided for a single entity and is divided into entity or intent disambiguation tiles corresponding to a primary intent 1032 and secondary intents. The primary intent 1032 may allow the user to execute the search for the selected intent suggestion or entity depicted by the entity display area 1030, while the secondary intents may allow the user to further refine the intent of the search. - In
FIG. 11, an illustrative screen display 1100 of an embodiment of the present invention is shown. A navigational display area 1134 may appear in the entity display area 1130 corresponding to the intent suggestion 1122 selected from the one or more intent suggestions 1120. The navigational display area 1134 may represent local intent associated with the entity 1132. As illustrated, the navigational display area 1134 displays a map and/or directions to an entity provided in the entity display area 1130. The navigational display area 1134 may further enable an action or task, as described in more detail below, such as providing directions from a location associated with the user to the nearest entity or an entity associated with a selected location. In one embodiment, an advertisement display area 1136 displays text or display advertisements for a particular entity. The text or display advertisements may be paid for by or auctioned to a provider distinct or independent of the search provider, such as the provider associated with the entity (e.g., Starbucks®). The text or display advertisements (e.g., paid placement for advertisements) may also be associated with an action or completion of a task (e.g., promoting download of an application) as described in more detail below. - Turning now to
FIG. 12, an illustrative screen display 1200 of an embodiment of the present invention is shown. An action display area 1232, in one embodiment, displays an action available for the user to take on a particular entity. The action enables task completion for the intent suggestion 1222 selected from the one or more intent suggestions 1220. In one embodiment, the task completion is provided by a provider distinct or independent of the search provider. In one embodiment, the action may request or install an application associated with the distinct or independent provider. - Similarly, and with reference now to
FIG. 13, an illustrative screen display 1300 of an embodiment of the present invention is shown. The action display area 1332 may appear in the entity display area 1330 corresponding to the intent suggestion 1322 selected from the one or more intent suggestions 1320. In one embodiment, the action is provided by a provider distinct or independent of the search provider. In one embodiment, the action may execute an application provided by the distinct or independent application provider, request permission to install the application, or request login credentials for an account associated with the application or provider. - In
FIGS. 14A and 14B, illustrative screen displays depict mobile embodiments of the present invention. As illustrated in FIG. 14A, after a search prefix 1412 is received in the search bar 1410, the autosuggest display area 1420 displays, without executing the search, one or more intent suggestions to the user. Once an intent suggestion 1422 is selected, the entity display area 1430 displays (as illustrated in FIG. 14B), without executing the search, an aggregated intent preview comprising metadata associated with at least one entity corresponding to entity identifications associated with the one or more intent suggestions. In this example, the intent suggestion 1422 "rihanna" was selected. As illustrated, multiple entities corresponding to the entity identification associated with the intent suggestion "rihanna" are provided in the entity display area 1430. Metadata corresponding to each entity is provided, allowing the user to determine which entity associated with the intent suggestion "rihanna" is the actual intent or target of the search. In some instances, the search may need to be refined further, such as by selecting one of the entities in the entity display area 1430. For example, there may be multiple entities or subentities associated with one of the entities. In this case, there may be multiple items of metadata associated with one of the entities. - Turning now to
FIG. 15, an illustrative screen display 1500 of an embodiment of the present invention is shown. As illustrated, a search history display area 1510 displays, in one embodiment, a set of queries 1520 issued by the user and entities 1540 corresponding to the set of queries 1520. The entities 1540 enable the user to interact with the search history. This allows a user to quickly assimilate and understand a view associated with the user's search history. In one embodiment, the search history display area 1510 displays an aggregate set of queries issued by multiple users and entities corresponding to the aggregate set of queries, such as might be implemented in an embodiment to display which entities are being searched for the most by a population of users. - With reference now to
FIG. 16, an illustrative screen display of an embodiment of the present invention is shown. As illustrated, a social network display area 1600 displays a topic or entity 1620 shared by one or more users 1610 via a social network. The topic or entity represents a set of queries issued by the one or more users 1610 and characterized by metadata associated with the at least one entity 1620. The entity may be selectable, allowing a user to retrace the steps of the research performed by the user sharing the entity 1620. The entity may be selectable, allowing a user to execute a search associated with the entity 1620. In one embodiment, the search may be executed with the same operating system, application, process, web browser, web browser chrome, or device otherwise capable of executing a search or acting as a host for search results as the original shared search. In one embodiment, the search may be executed with a different operating system, application, process, web browser, web browser chrome or via accessibility to an operating system, application, process, web browser, web browser chrome, or any device otherwise capable of executing a search or acting as a host for search results than the original shared search. - In
FIG. 17, an illustrative screen display 1700 of an embodiment of the present invention is shown. As illustrated, a multi-user history display area 1700 displays a set of entities 1730 corresponding to a set of most popular searched-for entities 1720 over a predetermined period of time by a population of users. The multi-user history display area 1700 may be tailored by a user to select specific topics or entities. The multi-user history display area 1700 then identifies the set of most popular searched-for entities corresponding to the selected topic or entity. For example, and referring again to FIG. 17, a user may be interested in the most researched presidents in the last month. As illustrated, a most popular searched-for presidents search may result in a set of entities 1730 that includes the most researched presidents over that period. - Turning now to
FIG. 18, an illustrative screen display 1800 of an embodiment of the present invention is shown. As illustrated, an annotated query display area 1800 displays an annotated query 1810 with a set of previously identified entities corresponding to the annotated query 1810. Similarly, and with reference now to FIG. 19, an illustrative screen display 1900 of an embodiment of the present invention is shown. As illustrated, an annotated entity display area 1900 displays an annotated entity with corresponding metadata. In each of the annotated query display area 1800 and annotated entity display area 1900, the set of previously identified entities for a query or metadata for the entity is automatically populated with previously existing information. This allows the intent disambiguation engine 320 of FIG. 3, for example, to retrieve valuable information for a user with minimal time, effort, and resources. - In
FIG. 20, an illustrative screen display 2000 of an embodiment of the present invention is shown. As illustrated, the metadata provided by the entity preview area 2010 is provided for related intents. For example, continuing the example of FIG. 10, a user may select to execute the search for one of the entities, such as the primary intent 1032 of FIG. 10. If the user is not satisfied with the search results, any interactions (e.g., query-entity interactions, entity clicks, etc.) may provide contextual information when the user further interacts with the search bar 2020 for the selected intent suggestion provided by the entity preview area 2010. As a result, rather than identifying the same primary intent 1032 and secondary intents of FIG. 10, by identifying and leveraging any contextual information, related intents are identified and provided when the user further interacts with the search bar 2020. - Referring now to
FIG. 21 , a flow diagram is illustrated showing an exemplary method 2100 of non-committal intent preview, disambiguation, and refinement of a search. As indicated at block 2110, a search prefix is received from a user (e.g., utilizing the search prefix receiving component 322 of the intent disambiguation engine 320 of FIG. 3 ). The search prefix comprises one or more characters associated with a search query. As indicated at block 2112, one or more intent suggestions are provided to the user (e.g., utilizing the autosuggest component 324 of the intent disambiguation engine 320 of FIG. 3 ). The one or more intent suggestions may be based on a comparison of the search prefix to an autosuggest store. The one or more intent suggestions may be retrieved by an application programming interface (API) call to the autosuggest store. The one or more intent suggestions may be rendered by a service separate from the intent disambiguation engine 320. - One or more entity IDs associated with the intent suggestions are identified as indicated at block 2114 (e.g., utilizing the
entity identification component 326 of the intent disambiguation engine 320 of FIG. 3 ). The one or more intent suggestions may be based on an entity ranking. In other words, the entities associated with the intent suggestions that are the most likely target or intent of the search may be ranked and identified. The ranking may be in accordance with the ranking methodology described herein. For example, in one embodiment, the one or more entities are ranked based on entity-intrinsic signals, query-entity interactions by users, and query pattern likelihood scores. In one embodiment, the query pattern likelihood scores are based on entity type, quality standards independent of an individual entity, quality standards independent of associated queries, dominance of one particular entity over another, and non-entity associations of the query. In one embodiment, the ranked entities are associated with a set of user queries. The set of user queries may be associated with a single user or multiple users over time. - An aggregated intent preview is provided as indicated at block 2116 (e.g., utilizing the
preview component 330 of the intent disambiguation engine 320 of FIG. 3 ). The aggregated intent preview comprises metadata corresponding to one or more entities associated with at least one of the one or more entity IDs (the metadata is retrieved, for example, by the metadata component 328 of the intent disambiguation engine 320 of FIG. 3 ). To provide better efficiency and conserve network bandwidth and user device resources, the metadata may be retrieved by the user interface in an API call separate from the API call that retrieves the one or more intent suggestions. In one embodiment, the metadata is rendered by a service separate from the service rendering the one or more intent suggestions and/or from the intent disambiguation engine 320. - As indicated at
block 2118, a refinement request is received from the user. The refinement request comprises an indication that the user has selected an item of metadata associated with the one or more entities. More simply, the refinement request is an indication that the user has determined to refine or narrow the focus or intent of the search. The item of metadata may correspond to a subentity (i.e., a subset of metadata associated with the entity that may focus on one aspect of the entity or further define or distinguish the entity). Metadata associated with the selected subentity is retrieved, for example, by the metadata component 328 of the intent disambiguation engine 320 of FIG. 3 . - A refined intent preview is provided as indicated at block 2120 (e.g., utilizing the
refinement component 334 of the intent disambiguation engine 320 of FIG. 3 ). The refined intent preview allows the user to narrow the intent of the search without executing it. For example, the user may enter the search prefix "bellevue weath." Based on this search prefix, one of the intent suggestions provided may be "bellevue weather." After this intent suggestion is selected, either automatically based on confidence or manually by the user, the aggregated intent preview may comprise metadata corresponding to "Bellevue, Washington weather," "Bellevue, Nebraska weather," and "Bellevue, Ohio weather." Based on the intent of the user, the user is able to identify the appropriate location of the desired weather and refine the intent accordingly. After the user refines the intent to the desired location, additional metadata associated with that selected location may be provided and refined even further, as described herein. - As indicated previously, further embodiments are directed to intent expression for search in an embedded application context. Referring now to
FIG. 22 , a block diagram is provided illustrating an exemplary computing system 2200 in which embodiments of the present invention may be employed. Generally, the computing system 2200 illustrates an environment in which intents and query understanding may be provided and merged from a remote service and a local front-end application (e.g., INTERNET EXPLORER) as part of the search intent preview experience. The computing system 2200 generally includes an intent expression engine 2210, a remote service 2240, a remote data source 2242, a local service or application 2250, local data sources, prioritization rules 2260, user computing devices 2270 (e.g., mobile device, television, kiosk, watch, touch screen or tablet device, workstation, gaming system, internet-connected consoles, and the like), which may also provide local intents and query understanding, and an intent disambiguation engine 2280 (e.g., the intent disambiguation engine 320 as shown in FIG. 3 ), in communication with one another via a network 2202. The network 2202 may include, without limitation, one or more local area networks (LANs) and/or wide area networks (WANs). Such networking environments are commonplace in offices, enterprise-wide computer networks, intranets, and the Internet. Accordingly, the network 2202 is not further described herein. - It should be understood that any number of
intent expression engines 2210, user computing devices 2270, and/or intent disambiguation engines 2280 may be employed in the computing system 2200 within the scope of embodiments of the present invention. Each may comprise a single device/interface or multiple devices/interfaces cooperating in a distributed environment. For instance, the intent disambiguation engine 2280 may comprise multiple devices and/or modules arranged in a distributed environment that collectively provide the functionality of the intent disambiguation engine 2280 described herein. Additionally, other components or modules not shown also may be included within the computing system 2200. - In some embodiments, one or more of the illustrated components/modules may be implemented as stand-alone applications. In other embodiments, one or more of the illustrated components/modules may be implemented via the
intent expression engine 2210, a user computing device 2270, the intent disambiguation engine 2280, or as an Internet-based service. It will be understood by those of ordinary skill in the art that the components/modules illustrated in FIG. 22 are exemplary in nature and in number and should not be construed as limiting. Any number of components/modules may be employed to achieve the desired functionality within the scope of embodiments hereof. Further, components/modules may be located on and/or shared by any number of intent disambiguation engines, intent expression engines, and/or user computing devices. By way of example only, the intent disambiguation engine 2280 might be provided as a single computing device (as shown), a cluster of computing devices, or a computing device remote from one or more of the remaining components. As another example, the intent disambiguation engine 2280 and intent expression engine 2210 could be provided together on a single computing device, a cluster of computing devices, or a computing device remote from one or more of the remaining components. Additionally, the intent disambiguation engine 2280 and intent expression engine 2210 may be provided by a single entity or multiple entities. For instance, a search engine provider could provide both the intent disambiguation engine 2280 and the intent expression engine 2210. Alternatively, a search provider could provide the intent disambiguation engine 2280 and a separate provider could provide the intent expression engine 2210. Any and all such variations are contemplated to be within the scope of embodiments herein. - Each of the
user computing devices 2270 and the intent disambiguation engine 2280 may be similar to the user devices 310 and intent disambiguation engine 320, respectively, discussed above with reference to FIG. 3 . Additionally, the intent disambiguation engine 2280 may include a number of components (search prefix component, autosuggest component, entity identification component, metadata component, preview component, ranking component, refinement component, and action component) and operate using a completion trie and entity data store in a manner similar to that described above with reference to FIG. 3 . As such, the details of these components of FIG. 22 will not be repeated here. - The
intent expression engine 2210 generally operates to merge signals from remote and local data sources to identify one or more intent suggestions that are provided to users on user computing devices 2270. As shown in FIG. 22 , the intent expression engine 2210 includes, in various embodiments, interaction component 2212, remote data component 2214, local data component 2216, merge component 2218, rule component 2220, and priority component 2222. -
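The interplay of these components can be sketched in a few lines of Python. Everything here is illustrative: the function names, signal shapes, and the local-over-remote ordering are assumptions made for the sketch, not the patent's actual implementation.

```python
def receive_interaction(prefix):
    # Interaction component: wrap the raw user input in a record.
    return {"type": "prefix", "value": prefix}

def remote_suggestions(interaction):
    # Remote data component: what a remote suggestion service might return.
    return [{"text": interaction["value"] + " weather", "source": "remote"}]

def local_suggestions(interaction):
    # Local data component: favorites or capabilities of the host device/app.
    return [{"text": interaction["value"] + " (pinned site)", "source": "local"}]

def merge(remote, local):
    # Merge component: combine both signal sets into one result set.
    return local + remote

def prioritize(result_set):
    # Priority component: in this sketch, local items outrank remote ones.
    return sorted(result_set, key=lambda r: 0 if r["source"] == "local" else 1)

interaction = receive_interaction("bellevue")
result_set = prioritize(merge(remote_suggestions(interaction),
                              local_suggestions(interaction)))
```

In a real system each stage would be asynchronous and fed by the services described below; the sketch only shows how the components hand data to one another.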
Interaction component 2212 receives a search interaction from a user, the search interaction comprising an interaction with a device or application or a learned intent based on a previous interaction. For example, the user may have searched on multiple occasions for a particular item or information (e.g., a weather forecast or stock prices). The search interaction may become a learned intent based on these previous interactions. The interaction may include a search prefix comprising one or more characters associated with a search query. The interaction may include a gesture or voice command. The interaction may include a navigation within an application, a user interface, or on a device, such as a movement of a cursor, mouse, or a touch on a display. - In one embodiment,
remote data component 2214 receives the remote data from a remote data source. The remote data provides one or more intent suggestions based on the search interaction. The remote data source may include remote data provided by the intent disambiguation engine as described above with respect to FIG. 3 . - In one embodiment,
local data component 2216 receives the local data from each available device or embedded application. The local data provides one or more intent suggestions based on the search interaction. The local data may be favorites or preferences associated with the device or application from which the search interaction is received or initiated. The local data may be capabilities, functionalities, tasks, or actions provided by the device or application. The local data may be local device information. The local data may be data associated with an application or residing on or accessible by the application or device. -
Merge component 2218 merges remote data with local data to personalize a result set comprising one or more entity identifications associated with the one or more intent suggestions. In one embodiment, rule component 2220 generates a set of rules based on an identification of a host application or device and a nature of the host application or device. Priority component 2222 prioritizes the result set based on the set of rules associated with each available device or embedded application. - For example, if the host application is a web browser, the nature of the host application is to browse websites. Accordingly, the set of rules may be generated ranking websites in the result set higher than an entity identification that launches an application. Similarly, the local data may include favorites, pinned websites, and the like that are identified within the local data of the browser application. In another example, the host device may be an XBOX. Because the nature of the XBOX is tailored to games, movies, and music, the set of rules may be generated ranking results that launch games, movies, and music higher than other results. Or the set of rules may be generated ranking already installed items higher than items that are not currently identified within the local data (i.e., local data items are ranked higher than remote data items).
-
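The browser and XBOX examples above amount to a per-host ranking table. A minimal sketch, with invented host names, categories, and weights:

```python
# Hypothetical rule tables keyed by the nature of the host; lower rank
# values surface earlier in the result set.
RULES = {
    "browser": {"website": 0, "app_launch": 1},         # websites first
    "console": {"installed_item": 0, "store_item": 1},  # installed items first
}

def prioritize(result_set, host):
    rules = RULES[host]
    # Unknown categories sink to the bottom with a large default rank.
    return sorted(result_set, key=lambda r: rules.get(r["category"], 99))

result_set = [
    {"title": "Install Halo", "category": "store_item"},
    {"title": "Launch Halo", "category": "installed_item"},
]
ordered = prioritize(result_set, "console")
```

On the hypothetical console host, the already-installed item is ranked above the store item, mirroring the local-over-remote preference described above.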
Preview component 2224 provides the result set to the user, the result set including an aggregated intent preview comprising metadata corresponding to one or more entities associated with at least one of the one or more entity identifications. The user may interact with the aggregated intent preview by selecting the desired metadata to refine the search (prior to actually executing the search) or to execute the search. The user may also interact with the aggregated intent preview by selecting the desired metadata to accomplish a task or launch an application. - With reference to
FIGS. 23-28 , illustrative screen displays of intent expression for search in an embedded application context are provided. It should be understood that the screen displays are provided by way of example only and should not be viewed as limiting. The screen displays may appear in any order and with any number of screen displays, without regard to whether the screen display is described or depicted herein. - Referring initially to
FIG. 23 , an illustrative screen display 2300 of an embodiment of the present invention is shown. As shown in FIG. 23 , an interaction display area displays a search bar 2310 for receiving a search interaction from a user corresponding to an unexecuted search. The search bar 2310 may be any area within an application or service capable of receiving a user interaction and is not limited to an area that is designed solely for receiving a search interaction. For example, the search bar 2310 may be any area of a display provided to the user, and the search interaction may be the user looking or gesturing toward a certain portion of the display. The search bar 2310 may also include any area of a display provided to the user capable of receiving any sort of user interaction (e.g., any area in which the user can type, not merely one provided to receive search terms). The search bar 2310 may not even be a visible portion of a display; rather, the search bar 2310 may be voice recognition associated with an application or service such that the user interaction is received when the user communicates audibly with the application or service. -
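The different interaction channels just described (typed prefix, voice command, gesture) can be normalized into a single search-interaction record before the rest of the pipeline runs. A hypothetical sketch, with made-up field names:

```python
def normalize_interaction(kind, payload):
    # Collapse heterogeneous inputs into one record the engine can consume.
    interaction = {"kind": kind, "query_prefix": None}
    if kind == "prefix":
        interaction["query_prefix"] = payload          # characters typed so far
    elif kind == "voice":
        interaction["query_prefix"] = payload.lower()  # recognized utterance
    elif kind == "gesture":
        interaction["target"] = payload                # region the user indicated
    return interaction

typed = normalize_interaction("prefix", "bellevue weath")
spoken = normalize_interaction("voice", "Bellevue Weather")
```

Downstream components then only need to handle one record shape, regardless of whether the "search bar" was a text field, a microphone, or a gaze target.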
Autosuggest display area 2320 displays, without executing the search, one or more intent suggestions to the user. The one or more intent suggestions comprise remote data and local data based on the search interaction. In one embodiment, the local data is received from each available device (e.g., the device the search interaction was initiated by or received from) or embedded application. In one embodiment, the remote data is received from a remote data source. Entity display area 2330 displays, without executing the search, an aggregated intent preview. The aggregated intent preview comprises metadata associated with at least one entity corresponding to entity identifications associated with the one or more intent suggestions. The aggregated intent preview is prioritized in accordance with a set of rules associated with each available device or embedded application. The set of rules may be generated based on a host application and/or a nature of the host application. The aggregated intent preview may comprise user-selectable tiles. The autosuggest display area 2320 and entity display area 2330 may be provided in a flyout surface area 2350 on the device or inside a browser chrome to provide small targeted intents with a rich display of structured content. The flyout surface area 2350 displays a flyout surface in response to the user interaction. In one embodiment, the flyout surface is an entry point for intent expression inside an application. -
FIG. 24 provides an illustrative screen display 2400 in which a search interaction has been received with a high navigational intent (e.g., Amazon). Entity display area 2410 displays, without executing the search, an aggregated intent preview comprising user-selectable tiles. The tiles include a main tile 2412 corresponding with a main entity (e.g., Amazon.com) and subtiles corresponding to deeplinks. Advertisements 2332 and 2334 have been presented for two of the deeplinks. - The entity display area may further include
non-navigational tiles. - By way of example,
FIG. 25 provides a screen display 2500 in which the user has entered the search prefix "fa," which results in a number of intent suggestions provided in an autosuggest display 2510. The entity display area includes tiles corresponding to the intent suggestions. -
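Prefix matching of this kind, in the spirit of the completion trie and autosuggest store described earlier, can be sketched as a simple filter. The store contents and the suggestion limit here are invented for illustration:

```python
# Hypothetical autosuggest store; a production system would use a
# completion trie rather than a flat list.
AUTOSUGGEST_STORE = ["facebook", "fandango", "fafsa", "fox news"]

def suggest(prefix, store=AUTOSUGGEST_STORE, limit=3):
    # Return up to `limit` stored intents that start with the typed prefix.
    return [s for s in store if s.startswith(prefix.lower())][:limit]

suggestions = suggest("fa")
```

Typing "fa" yields the three stored intents beginning with that prefix, while "fox news" is excluded.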
FIG. 26 provides an illustrative screen display 2600 in which various intent suggestions are provided based on merged local and remote data. The various intent suggestions may be provided in a format, or include metadata, that identifies a data source of the intent suggestions. An intent suggestion may further suggest an action or task to the user (e.g., execute a search, install or launch an application, and the like). -
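Tagging each suggestion with its data source and a suggested action, as described above, might look like the following sketch. The suggestion entries are hypothetical:

```python
# Each suggestion carries its origin (local vs. remote) and a proposed
# action (search, launch, install); all values are illustrative.
suggestions = [
    {"text": "weather app", "source": "local", "action": "launch"},
    {"text": "weather forecast", "source": "remote", "action": "search"},
    {"text": "weather channel app", "source": "remote", "action": "install"},
]

def describe(suggestion):
    # Render a label that surfaces the data source and action to the user.
    return f'{suggestion["text"]} [{suggestion["source"]}: {suggestion["action"]}]'

labels = [describe(s) for s in suggestions]
```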
FIG. 27 provides an illustrative screen display 2700 in which the intent suggestions utilize both remote and local data to provide suggestions to the user. In this example, a search interaction may have been received indicating the user is interested in a place to eat. In addition to receiving remote data, local data is received from the device on which the search interaction was initiated, providing a location of the user (e.g., via GPS, Wi-Fi triangulation, and the like). Consequently, the entity display area 2710 provides intent suggestions 2712, 2714 that merge both remote and local data. -
FIG. 28 provides an illustrative screen display 2800 in which the search interactions are based on learned intent. For example, the user may have a habit of searching for the weather, movie times, or a stock price on a regular basis. A learned intent display area 2810 displays, prior to receiving the search interaction from the user, at least a portion of the result set 2812, 2814, 2816. The portion of the result set is based on a previous interaction and comprises remote data and local data based on the learned intent. The local data may provide context to the one or more intent suggestions (such as weather for the particular location from which the user initiates the search, movie times for a nearby theater, or stock prices for nearby companies). - Referring now to
FIG. 29 , a flow diagram is provided that illustrates a method 2900 for query intent expression for search in an embedded application context. As shown at block 2910, a search interaction is received from a user. The search interaction comprises an interaction with a device or application or a learned intent based on a previous interaction. For example, the user may have searched on multiple occasions for a particular item or information (e.g., a weather forecast or stock prices). The search interaction may become a learned intent based on these previous interactions. The interaction may include a search prefix comprising one or more characters associated with a search query. The interaction may include a gesture or voice command. The interaction may include a navigation within an application or on a device, such as a movement of a cursor, mouse, or a touch on a display. - As shown at
block 2912, remote data is received from a remote data source. The remote data provides one or more intent suggestions based on the search interaction. The remote data source may include remote data provided by the intent disambiguation engine as described above with respect to FIG. 3 . Similarly, at block 2914, local data is received from each available device or embedded application. The local data provides one or more intent suggestions based on the search interaction. - The remote data is merged with the local data, at
block 2914, to personalize a result set. The result set comprises one or more entity identifications associated with one or more intent suggestions. The result set is prioritized, at block 2916, based on a set of rules associated with each available device or embedded application. In one embodiment, a host application is identified. A nature of the host application may be determined. The set of rules may be generated based on the host application and/or the nature of the host application. In one embodiment, a host device is identified. The set of rules may be generated based on the host device and/or the nature of the host device. - For example, if the entry point of the search interaction is identified as INTERNET EXPLORER, the nature of the host application may be determined as browsing websites. A set of rules may prioritize the result set according to items specified by the user within the application, such as typed URLs, favorites (e.g., INTERNET EXPLORER favorites), browser history, domain suggestions, and search suggestions (i.e., as provided by the remote data source). Thus, rather than merely providing an unpersonalized set of entity identifications as provided by a remote service, the result set can be tailored to the user, taking into account personalized settings and preferences within the application and/or device itself. As can be appreciated, the rules may also identify and prioritize tasks related to applications installed on, or functionalities provided by, the device.
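The browser-specific ordering described above (typed URLs ahead of favorites, browser history, domain suggestions, and remote search suggestions) reduces to sorting by a fixed priority list. A sketch with invented result-set entries:

```python
# Hypothetical priority order for a browser host; earlier kinds rank higher.
PRIORITY = ["typed_url", "favorite", "history", "domain", "search_suggestion"]

def prioritize(result_set):
    # Sort results by the position of their kind in the priority list.
    return sorted(result_set, key=lambda r: PRIORITY.index(r["kind"]))

result_set = [
    {"text": "bing.com/search?q=news", "kind": "search_suggestion"},
    {"text": "news.example.com", "kind": "favorite"},
    {"text": "news.ycombinator.com", "kind": "typed_url"},
]
ordered = prioritize(result_set)
```

The personalized local signals (typed URLs, favorites) surface above the generic remote search suggestion, matching the tailoring described in the text.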
- At
block 2920, the result set is provided to the user. The result set includes an aggregated intent preview comprising metadata corresponding to one or more entities associated with at least one of the one or more entity identifications. The result set allows the user to further refine the search without actually executing the search by interacting further with the entity identifications provided in the result set. In one embodiment, a refinement request is received from the user. The refinement request comprises an indication that the user has selected an item of metadata associated with the one or more entities. The item of metadata corresponds to a subentity. In one embodiment, a refined intent preview comprises metadata corresponding to the subentity, allowing the user to further refine or execute the search. - As can be understood, embodiments of the present invention provide systems, methods, and computer-readable storage media for, among other things, non-committal intent preview, disambiguation, and refinement of a search. A search prefix comprising one or more characters associated with an unexecuted search query may be received. One or more intent suggestions may be suggested to a user. For each of the one or more intent suggestions, one or more entity identifications associated with each of the one or more intent suggestions may be received. Metadata corresponding to at least one entity associated with the one or more entity identifications may be retrieved from an entity data store. Without retrieving search results for the unexecuted search query, an aggregated intent preview based on the retrieved metadata corresponding to the at least one entity may be provided.
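The refinement flow described above, from an aggregated preview of candidate entities down to a selected subentity's metadata, all without executing a search, can be sketched as two lookups. The entity store contents here are invented:

```python
# Hypothetical entity data store mapping an intent to its candidate
# subentities and their metadata; no search is executed at any point.
ENTITY_STORE = {
    "bellevue weather": {
        "Bellevue, WA": {"forecast": "rain"},
        "Bellevue, NE": {"forecast": "sun"},
    },
}

def aggregated_preview(intent):
    # Metadata candidates for every subentity, shown before any search runs.
    return list(ENTITY_STORE[intent])

def refined_preview(intent, subentity):
    # Metadata for the one subentity the user selected in a refinement request.
    return ENTITY_STORE[intent][subentity]

candidates = aggregated_preview("bellevue weather")
refined = refined_preview("bellevue weather", "Bellevue, WA")
```

Selecting "Bellevue, WA" narrows the preview to that subentity's metadata; only if the user then commits would the actual search be executed.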
- The present invention has been described in relation to particular embodiments, which are intended in all respects to be illustrative rather than restrictive. Alternative embodiments will become apparent to those of ordinary skill in the art to which the present invention pertains without departing from its scope.
- While the invention is susceptible to various modifications and alternative constructions, certain illustrated embodiments thereof are shown in the drawings and have been described above in detail. It should be understood, however, that there is no intention to limit the invention to the specific forms disclosed, but on the contrary, the intention is to cover all modifications, alternative constructions, and equivalents falling within the spirit and scope of the invention.
- It will be understood by those of ordinary skill in the art that the order of steps shown in
methods 200 of FIG. 2 , 2100 of FIG. 21 , and 2900 of FIG. 29 is not meant to limit the scope of the present invention in any way and, in fact, the steps may occur in a variety of different sequences within embodiments hereof. Any and all such variations, and any combination thereof, are contemplated to be within the scope of embodiments of the present invention.
Claims (20)
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP14717618.4A EP2973028A4 (en) | 2013-03-14 | 2014-03-14 | Query intent expression for search in an embedded application context |
CN201480015031.2A CN105122240A (en) | 2013-03-14 | 2014-03-14 | Query intent expression for search in an embedded application context |
PCT/US2014/028318 WO2014152936A2 (en) | 2013-03-14 | 2014-03-14 | Query intent expression for search in an embedded application context |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CNPCT/CN2013/072599 | 2013-03-14 | ||
PCT/CN2013/072599 WO2014139120A1 (en) | 2013-03-14 | 2013-03-14 | Search intent preview, disambiguation, and refinement |
Publications (1)
Publication Number | Publication Date |
---|---|
US20140282136A1 (en) | 2014-09-18 |
Family
ID=51533054
Family Applications (4)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/839,395 Abandoned US20140280015A1 (en) | 2013-03-14 | 2013-03-15 | Serving advertisements for search preview based on user intents |
US13/904,887 Abandoned US20140282136A1 (en) | 2013-03-14 | 2013-05-29 | Query intent expression for search in an embedded application context |
US13/911,844 Active 2035-02-09 US10175860B2 (en) | 2013-03-14 | 2013-06-06 | Search intent preview, disambiguation, and refinement |
US13/917,260 Abandoned US20140280093A1 (en) | 2013-03-14 | 2013-06-13 | Social entity previews in query formulation |
Country Status (4)
Country | Link |
---|---|
US (4) | US20140280015A1 (en) |
EP (2) | EP2973035A4 (en) |
BR (1) | BR112015020551A2 (en) |
WO (4) | WO2014139120A1 (en) |
Families Citing this family (36)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8862622B2 (en) * | 2007-12-10 | 2014-10-14 | Sprylogics International Corp. | Analysis, inference, and visualization of social networks |
US8688514B1 (en) | 2011-06-24 | 2014-04-01 | Google Inc. | Ad selection using image data |
US11087424B1 (en) | 2011-06-24 | 2021-08-10 | Google Llc | Image recognition-based content item selection |
US10972530B2 (en) | 2016-12-30 | 2021-04-06 | Google Llc | Audio-based data structure generation |
US10586127B1 (en) | 2011-11-14 | 2020-03-10 | Google Llc | Extracting audiovisual features from content elements on online documents |
US11093692B2 (en) | 2011-11-14 | 2021-08-17 | Google Llc | Extracting audiovisual features from digital components |
US10331686B2 (en) * | 2013-03-14 | 2019-06-25 | Microsoft Corporation | Conducting search sessions utilizing navigation patterns |
US11030239B2 (en) | 2013-05-31 | 2021-06-08 | Google Llc | Audio based entity-action pair based selection |
US9953085B1 (en) | 2013-05-31 | 2018-04-24 | Google Llc | Feed upload for search entity based content selection |
US20150088648A1 (en) * | 2013-09-24 | 2015-03-26 | Google Inc. | Determining commercial intent |
US20150161200A1 (en) * | 2013-11-27 | 2015-06-11 | Placester, Inc. | System and method for entity-based search, search profiling, and dynamic search updating |
CN104899214B (en) | 2014-03-06 | 2018-05-22 | Alibaba Group Holding Ltd. | Data processing method and system for establishing input suggestions |
US9727617B1 (en) * | 2014-03-10 | 2017-08-08 | Google Inc. | Systems and methods for searching quotes of entities using a database |
DE112015001468T5 (en) * | 2014-03-27 | 2016-12-15 | Sony Corporation | Electronic device and method for identifying input commands of a user |
US11004139B2 (en) | 2014-03-31 | 2021-05-11 | Monticello Enterprises LLC | System and method for providing simplified in store purchases and in-app purchases using a use-interface-based payment API |
US12008629B2 (en) | 2014-03-31 | 2024-06-11 | Monticello Enterprises LLC | System and method for providing a social media shopping experience |
US11080777B2 (en) | 2014-03-31 | 2021-08-03 | Monticello Enterprises LLC | System and method for providing a social media shopping experience |
US10511580B2 (en) | 2014-03-31 | 2019-12-17 | Monticello Enterprises LLC | System and method for providing a social media shopping experience |
US9894413B2 (en) * | 2014-06-12 | 2018-02-13 | Google Llc | Systems and methods for locally detecting consumed video content |
US10572810B2 (en) | 2015-01-07 | 2020-02-25 | Microsoft Technology Licensing, Llc | Managing user interaction for input understanding determinations |
JP2018512090A (en) * | 2015-02-27 | 2018-05-10 | キーポイント テクノロジーズ インディア プライベート リミテッド | Context discovery technology |
US10147095B2 (en) | 2015-04-30 | 2018-12-04 | Microsoft Technology Licensing, Llc | Chain understanding in search |
US20160350421A1 (en) * | 2015-06-01 | 2016-12-01 | Boyd Cannon Multerer | Personal searchable document collections with associated user references |
US10249297B2 (en) | 2015-07-13 | 2019-04-02 | Microsoft Technology Licensing, Llc | Propagating conversational alternatives using delayed hypothesis binding |
US10248967B2 (en) | 2015-09-25 | 2019-04-02 | Microsoft Technology Licensing, Llc | Compressing an original query while preserving its intent |
US10180833B2 (en) * | 2015-12-31 | 2019-01-15 | Samsung Electronics Co., Ltd. | Cooperative web-assisted deep link redirection |
US10446137B2 (en) | 2016-09-07 | 2019-10-15 | Microsoft Technology Licensing, Llc | Ambiguity resolving conversational understanding system |
US20180081893A1 (en) * | 2016-09-19 | 2018-03-22 | Ebay Inc. | Prediction-based instant search |
EP3627322A4 (en) | 2017-06-14 | 2020-04-29 | Beijing Xiaomi Mobile Software Co., Ltd. | Application interaction method, interaction method and device |
US20200007645A1 (en) * | 2018-06-27 | 2020-01-02 | Microsoft Technology Licensing, Llc | Managing profile data for multiple enterprise identities |
CN109669534B (en) * | 2018-11-13 | 2022-07-15 | Beijing Lingxi Weiguang Technology Co., Ltd. | Augmented reality display method and device |
CN110147494B (en) * | 2019-04-24 | 2020-05-08 | Beijing Sankuai Online Technology Co., Ltd. | Information searching method and device, storage medium and electronic equipment |
US11409805B2 (en) | 2019-05-30 | 2022-08-09 | AdMarketplace | Computer implemented system and methods for implementing a search engine access point enhanced for suggested listing navigation |
US11429879B2 (en) * | 2020-05-12 | 2022-08-30 | Ubs Business Solutions Ag | Methods and systems for identifying dynamic thematic relationships as a function of time |
WO2021247655A1 (en) | 2020-06-02 | 2021-12-09 | Liveperson, Inc. | Systems and method for intent messaging |
US11086949B1 (en) | 2021-02-25 | 2021-08-10 | Fmr Llc | Systems and methods for intent guided related searching using sequence semantics |
Citations (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20060242586A1 (en) * | 2005-04-20 | 2006-10-26 | Microsoft Corporation | Searchable task-based interface to control panel functionality |
US20060288015A1 (en) * | 2005-06-15 | 2006-12-21 | Schirripa Steven R | Electronic content classification |
US20070185847A1 (en) * | 2006-01-31 | 2007-08-09 | Intellext, Inc. | Methods and apparatus for filtering search results |
US20090112848A1 (en) * | 2007-10-31 | 2009-04-30 | Samsung Electronics Co., Ltd. | Method and system for suggesting search queries on electronic devices |
US7603360B2 (en) * | 2005-09-14 | 2009-10-13 | Jumptap, Inc. | Location influenced search results |
US20100146012A1 (en) * | 2008-12-04 | 2010-06-10 | Microsoft Corporation | Previewing search results for suggested refinement terms and vertical searches |
US20110087686A1 (en) * | 2003-12-30 | 2011-04-14 | Microsoft Corporation | Incremental query refinement |
US8027990B1 (en) * | 2008-07-09 | 2011-09-27 | Google Inc. | Dynamic query suggestion |
US20120084291A1 (en) * | 2010-09-30 | 2012-04-05 | Microsoft Corporation | Applying search queries to content sets |
US8204897B1 (en) * | 2008-09-09 | 2012-06-19 | Google Inc. | Interactive search querying |
US20120246165A1 (en) * | 2011-03-22 | 2012-09-27 | Yahoo! Inc. | Search assistant system and method |
US20130275456A1 (en) * | 2012-04-13 | 2013-10-17 | Yahoo! Inc. | Method and System for Content Search |
US20130282749A1 (en) * | 2012-04-23 | 2013-10-24 | Yahoo! Inc. | Instant search results with page previews |
US8601019B1 (en) * | 2012-04-03 | 2013-12-03 | Google Inc. | Presenting autocomplete suggestions |
Family Cites Families (71)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6564213B1 (en) | 2000-04-18 | 2003-05-13 | Amazon.Com, Inc. | Search query autocompletion |
US7332748B2 (en) | 2002-12-04 | 2008-02-19 | Nec Electronics Corporation | Electro-static discharge protection device |
US7523096B2 (en) | 2003-12-03 | 2009-04-21 | Google Inc. | Methods and systems for personalized network searching |
US8015119B2 (en) | 2004-01-21 | 2011-09-06 | Google Inc. | Methods and systems for the display and navigation of a social network |
US8005714B2 (en) | 2004-02-02 | 2011-08-23 | David Shaw | System and method for providing a discount |
US8082264B2 (en) | 2004-04-07 | 2011-12-20 | Inquira, Inc. | Automated scheme for identifying user intent in real-time |
WO2006008716A2 (en) | 2004-07-16 | 2006-01-26 | Blu Ventures Llc | A method to access and use an integrated web site in a mobile environment |
WO2006036781A2 (en) | 2004-09-22 | 2006-04-06 | Perfect Market Technologies, Inc. | Search engine using user intent |
US20070050251A1 (en) | 2005-08-29 | 2007-03-01 | Microsoft Corporation | Monetizing a preview pane for ads |
US7660581B2 (en) | 2005-09-14 | 2010-02-09 | Jumptap, Inc. | Managing sponsored content based on usage history |
US20070100650A1 (en) | 2005-09-14 | 2007-05-03 | Jorey Ramer | Action functionality for mobile content search results |
US8209344B2 (en) | 2005-09-14 | 2012-06-26 | Jumptap, Inc. | Embedding sponsored content in mobile applications |
US8010523B2 (en) | 2005-12-30 | 2011-08-30 | Google Inc. | Dynamic search box for web browser |
US7657523B2 (en) | 2006-03-09 | 2010-02-02 | Customerforce.Com | Ranking search results presented to on-line users as a function of perspectives of relationships trusted by the users |
US20070294240A1 (en) | 2006-06-07 | 2007-12-20 | Microsoft Corporation | Intent based search |
US20080022216A1 (en) | 2006-07-21 | 2008-01-24 | Duval John J | Method and system for obtaining primary search terms for use in conducting an internet search |
US20080109491A1 (en) | 2006-11-03 | 2008-05-08 | Sezwho Inc. | Method and system for managing reputation profile on online communities |
US8954500B2 (en) | 2008-01-04 | 2015-02-10 | Yahoo! Inc. | Identifying and employing social network relationships |
US7840538B2 (en) | 2006-12-20 | 2010-11-23 | Yahoo! Inc. | Discovering query intent from search queries and concept networks |
US8156135B2 (en) * | 2006-12-22 | 2012-04-10 | Yahoo! Inc. | Method and system for progressive disclosure of search results |
US7966321B2 (en) | 2007-01-17 | 2011-06-21 | Google Inc. | Presentation of local results |
US8027964B2 (en) | 2007-07-13 | 2011-09-27 | Medio Systems, Inc. | Personalized query completion suggestion |
WO2009039392A1 (en) * | 2007-09-21 | 2009-03-26 | The Board Of Trustees Of The University Of Illinois | A system for entity search and a method for entity scoring in a linked document database |
US8694483B2 (en) * | 2007-10-19 | 2014-04-08 | Xerox Corporation | Real-time query suggestion in a troubleshooting context |
US8086590B2 (en) | 2008-04-25 | 2011-12-27 | Microsoft Corporation | Product suggestions and bypassing irrelevant query results |
US20090313100A1 (en) | 2008-06-11 | 2009-12-17 | Yahoo! Inc. | System and method for previewing search results |
US8082278B2 (en) | 2008-06-13 | 2011-12-20 | Microsoft Corporation | Generating query suggestions from semantic relationships in content |
US8135616B2 (en) | 2008-06-26 | 2012-03-13 | Microsoft Corporation | Browsing and quality of service features |
US20100169364A1 (en) | 2008-06-30 | 2010-07-01 | Blame Canada Holdings Inc. | Metadata Enhanced Browser |
US8521731B2 (en) | 2008-07-09 | 2013-08-27 | Yahoo! Inc. | Systems and methods for query expansion in sponsored search |
US8010537B2 (en) | 2008-08-27 | 2011-08-30 | Yahoo! Inc. | System and method for assisting search requests with vertical suggestions |
US7979415B2 (en) | 2008-09-04 | 2011-07-12 | Microsoft Corporation | Predicting future queries from log data |
US8108778B2 (en) * | 2008-09-30 | 2012-01-31 | Yahoo! Inc. | System and method for context enhanced mapping within a user interface |
US7949647B2 (en) * | 2008-11-26 | 2011-05-24 | Yahoo! Inc. | Navigation assistance for search engines |
US20100180001A1 (en) | 2009-01-11 | 2010-07-15 | Dick Clarence Hardt | Contextual messaging and notification system |
US8452794B2 (en) | 2009-02-11 | 2013-05-28 | Microsoft Corporation | Visual and textual query suggestion |
US9684741B2 (en) | 2009-06-05 | 2017-06-20 | Microsoft Technology Licensing, Llc | Presenting search results according to query domains |
US8341157B2 (en) | 2009-07-31 | 2012-12-25 | Yahoo! Inc. | System and method for intent-driven search result presentation |
US9405841B2 (en) | 2009-10-15 | 2016-08-02 | A9.Com, Inc. | Dynamic search suggestion and category specific completion |
US8332748B1 (en) | 2009-10-22 | 2012-12-11 | Google Inc. | Multi-directional auto-complete menu |
US8326859B2 (en) | 2009-11-02 | 2012-12-04 | Microsoft Corporation | Task prediction |
WO2011060291A2 (en) | 2009-11-13 | 2011-05-19 | Dreamwell, Ltd. | Manufacturer-linked landing page for online advertising |
US20120284253A9 (en) | 2009-12-01 | 2012-11-08 | Rishab Aiyer Ghosh | System and method for query suggestion based on real-time content stream |
US8631004B2 (en) | 2009-12-28 | 2014-01-14 | Yahoo! Inc. | Search suggestion clustering and presentation |
US20110184981A1 (en) | 2010-01-27 | 2011-07-28 | Yahoo! Inc. | Personalize Search Results for Search Queries with General Implicit Local Intent |
US9129012B2 (en) | 2010-02-03 | 2015-09-08 | Google Inc. | Information search system with real-time feedback |
US8346795B2 (en) * | 2010-03-10 | 2013-01-01 | Xerox Corporation | System and method for guiding entity-based searching |
US8880520B2 (en) | 2010-04-21 | 2014-11-04 | Yahoo! Inc. | Selectively adding social dimension to web searches |
US8554756B2 (en) * | 2010-06-25 | 2013-10-08 | Microsoft Corporation | Integrating social network data with search results |
US20120059838A1 (en) | 2010-09-07 | 2012-03-08 | Microsoft Corporation | Providing entity-specific content in response to a search query |
US20120123857A1 (en) | 2010-11-11 | 2012-05-17 | Mrugank Kiran Surve | Bidding Model for Sponsored Search Advertising Based on User Query Intent |
US8538978B2 (en) | 2010-11-22 | 2013-09-17 | International Business Machines Corporation | Presenting a search suggestion with a social comments icon |
US8977979B2 (en) | 2010-12-06 | 2015-03-10 | International Business Machines Corporation | Social network relationship mapping |
US20120158461A1 (en) * | 2010-12-17 | 2012-06-21 | Verizon Patent And Licensing Inc. | Content management and advertisement management |
US20120316955A1 (en) * | 2011-04-06 | 2012-12-13 | Yahoo! Inc. | System and Method for Mobile Application Search |
US9633392B2 (en) | 2011-04-13 | 2017-04-25 | Paypal, Inc. | Integrated finding experience systems and methods |
US20120265784A1 (en) | 2011-04-15 | 2012-10-18 | Microsoft Corporation | Ordering semantic query formulation suggestions |
US20120296743A1 (en) | 2011-05-19 | 2012-11-22 | Yahoo! Inc. | Method and System for Personalized Search Suggestions |
US8700544B2 (en) | 2011-06-17 | 2014-04-15 | Microsoft Corporation | Functionality for personalizing search results |
US8495058B2 (en) | 2011-08-05 | 2013-07-23 | Google Inc. | Filtering social search results |
US20130054631A1 (en) * | 2011-08-30 | 2013-02-28 | Microsoft Corporation | Adding social network data to search suggestions |
US9244985B1 (en) | 2011-09-06 | 2016-01-26 | Google Inc. | Generating search results for people |
US9043350B2 (en) * | 2011-09-22 | 2015-05-26 | Microsoft Technology Licensing, Llc | Providing topic based search guidance |
US9489458B1 (en) | 2011-09-30 | 2016-11-08 | Google Inc. | Suggesting interaction among members of a social network |
US8671106B1 (en) | 2012-05-23 | 2014-03-11 | Google Inc. | Indicators for entities corresponding to search suggestions |
US8799276B1 (en) | 2012-05-30 | 2014-08-05 | Google Inc. | Displaying social content in search results |
US20140149932A1 (en) | 2012-11-26 | 2014-05-29 | Nero Ag | System and method for providing a tapestry presentation |
US9026429B2 (en) * | 2012-12-05 | 2015-05-05 | Facebook, Inc. | Systems and methods for character string auto-suggestion based on degree of difficulty |
US9092527B2 (en) * | 2013-01-30 | 2015-07-28 | Quixey, Inc. | Performing application search based on entities |
US9336211B1 (en) * | 2013-03-13 | 2016-05-10 | Google Inc. | Associating an entity with a search query |
WO2014139120A1 (en) | 2013-03-14 | 2014-09-18 | Microsoft Corporation | Search intent preview, disambiguation, and refinement |
- 2013
- 2013-03-14 WO PCT/CN2013/072599 patent/WO2014139120A1/en active Application Filing
- 2013-03-15 US US13/839,395 patent/US20140280015A1/en not_active Abandoned
- 2013-05-29 US US13/904,887 patent/US20140282136A1/en not_active Abandoned
- 2013-06-06 US US13/911,844 patent/US10175860B2/en active Active
- 2013-06-13 US US13/917,260 patent/US20140280093A1/en not_active Abandoned
- 2014
- 2014-03-14 BR BR112015020551A patent/BR112015020551A2/en not_active IP Right Cessation
- 2014-03-14 WO PCT/US2014/028554 patent/WO2014152989A2/en active Application Filing
- 2014-03-14 EP EP14720831.8A patent/EP2973035A4/en not_active Withdrawn
- 2014-03-14 EP EP14717618.4A patent/EP2973028A4/en not_active Withdrawn
- 2014-03-14 WO PCT/US2014/028993 patent/WO2014153086A2/en active Application Filing
- 2014-03-14 WO PCT/US2014/028318 patent/WO2014152936A2/en active Application Filing
Cited By (68)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10175860B2 (en) | 2013-03-14 | 2019-01-08 | Microsoft Technology Licensing, Llc | Search intent preview, disambiguation, and refinement |
US10375186B2 (en) | 2013-06-20 | 2019-08-06 | Microsoft Technology Licensing, Llc | Frequent sites based on browsing patterns |
US9374431B2 (en) | 2013-06-20 | 2016-06-21 | Microsoft Technology Licensing, Llc | Frequent sites based on browsing patterns |
US10311118B2 (en) * | 2014-02-07 | 2019-06-04 | Samsung Electronics Co., Ltd. | Systems and methods for generating search results using application-specific rule sets |
US10083205B2 (en) * | 2014-02-12 | 2018-09-25 | Samsung Electronics Co., Ltd. | Query cards |
US20150227633A1 (en) * | 2014-02-12 | 2015-08-13 | Quixey, Inc. | Query Cards |
US20150278370A1 (en) * | 2014-04-01 | 2015-10-01 | Microsoft Corporation | Task completion for natural language input |
US9824149B2 (en) * | 2014-04-23 | 2017-11-21 | Excalibur Ip, Llc | Opportunistically solving search use cases |
US20150310058A1 (en) * | 2014-04-23 | 2015-10-29 | Yahoo! Inc. | Opportunistically solving search use cases |
US10885039B2 (en) * | 2014-05-30 | 2021-01-05 | Apple Inc. | Machine learning based search improvement |
US10613751B2 (en) * | 2014-06-27 | 2020-04-07 | Telenav, Inc. | Computing system with interface mechanism and method of operation thereof |
US20150378597A1 (en) * | 2014-06-27 | 2015-12-31 | Telenav, Inc. | Computing system with interface mechanism and method of operation thereof |
US10592261B1 (en) | 2014-07-11 | 2020-03-17 | Google Llc | Automating user input from onscreen content |
US10248440B1 (en) | 2014-07-11 | 2019-04-02 | Google Llc | Providing a set of user input actions to a mobile device to cause performance of the set of user input actions |
US11704136B1 (en) | 2014-07-11 | 2023-07-18 | Google Llc | Automatic reminders in a mobile environment |
US10652706B1 (en) | 2014-07-11 | 2020-05-12 | Google Llc | Entity disambiguation in a mobile environment |
US10055433B2 (en) | 2014-09-18 | 2018-08-21 | Microsoft Technology Licensing, Llc | Referenced content indexing |
US20160092506A1 (en) * | 2014-09-29 | 2016-03-31 | Linkedin Corporation | Generating suggested structured queries |
US20190073408A1 (en) * | 2014-09-30 | 2019-03-07 | Apple Inc. | Generating preferred metadata for content items |
US10114883B1 (en) * | 2014-09-30 | 2018-10-30 | Apple Inc. | Generating preferred metadata for content items |
US20160092550A1 (en) * | 2014-09-30 | 2016-03-31 | Yahoo!, Inc. | Automated search intent discovery |
US11321357B2 (en) * | 2014-09-30 | 2022-05-03 | Apple Inc. | Generating preferred metadata for content items |
US11593431B2 (en) * | 2014-12-31 | 2023-02-28 | Ebay Inc. | Dynamic content delivery search system |
US10394892B2 (en) * | 2014-12-31 | 2019-08-27 | Ebay Inc. | Dynamic content delivery search system |
US20230177087A1 (en) * | 2014-12-31 | 2023-06-08 | Ebay Inc. | Dynamic content delivery search system |
US10885125B2 (en) | 2015-06-23 | 2021-01-05 | Splunk Inc. | Techniques for curating data for query processing |
US11113342B2 (en) | 2015-06-23 | 2021-09-07 | Splunk Inc. | Techniques for compiling and presenting query results |
US11042591B2 (en) * | 2015-06-23 | 2021-06-22 | Splunk Inc. | Analytical search engine |
US11868411B1 (en) | 2015-06-23 | 2024-01-09 | Splunk Inc. | Techniques for compiling and presenting query results |
US10866994B2 (en) | 2015-06-23 | 2020-12-15 | Splunk Inc. | Systems and methods for instant crawling, curation of data sources, and enabling ad-hoc search |
US10803391B2 (en) * | 2015-07-29 | 2020-10-13 | Google Llc | Modeling personal entities on a mobile device using embeddings |
US20170098159A1 (en) * | 2015-10-01 | 2017-04-06 | Google Inc. | Action suggestions for user-selected content |
US12026593B2 (en) | 2015-10-01 | 2024-07-02 | Google Llc | Action suggestions for user-selected content |
US10970646B2 (en) * | 2015-10-01 | 2021-04-06 | Google Llc | Action suggestions for user-selected content |
US10496711B2 (en) * | 2015-12-29 | 2019-12-03 | Yandex Europe Ag | Method of and system for processing a prefix associated with a search query |
US20170185681A1 (en) * | 2015-12-29 | 2017-06-29 | Yandex Europe Ag | Method of and system for processing a prefix associated with a search query |
US9965530B2 (en) | 2016-04-20 | 2018-05-08 | Google Llc | Graphical keyboard with integrated search features |
US10140017B2 (en) | 2016-04-20 | 2018-11-27 | Google Llc | Graphical keyboard application with integrated search |
US9720955B1 (en) | 2016-04-20 | 2017-08-01 | Google Inc. | Search query predictions by a keyboard |
WO2017184220A1 (en) * | 2016-04-20 | 2017-10-26 | Google Inc. | Keyboard with a suggested search query region |
US9946773B2 (en) | 2016-04-20 | 2018-04-17 | Google Llc | Graphical keyboard with integrated search features |
US9977595B2 (en) | 2016-04-20 | 2018-05-22 | Google Llc | Keyboard with a suggested search query region |
US10078673B2 (en) | 2016-04-20 | 2018-09-18 | Google Llc | Determining graphical elements associated with text |
US10305828B2 (en) | 2016-04-20 | 2019-05-28 | Google Llc | Search query predictions by a keyboard |
US10222957B2 (en) * | 2016-04-20 | 2019-03-05 | Google Llc | Keyboard with a suggested search query region |
US11334800B2 (en) * | 2016-05-12 | 2022-05-17 | International Business Machines Corporation | Altering input search terms |
US20170329778A1 (en) * | 2016-05-12 | 2017-11-16 | International Business Machines Corporation | Altering input search terms |
US20180046626A1 (en) * | 2016-05-12 | 2018-02-15 | International Business Machines Corporation | Altering input search terms |
US11200498B2 (en) * | 2016-05-12 | 2021-12-14 | International Business Machines Corporation | Altering input search terms |
US10635661B2 (en) * | 2016-07-11 | 2020-04-28 | Facebook, Inc. | Keyboard-based corrections for search queries on online social networks |
US20180011900A1 (en) * | 2016-07-11 | 2018-01-11 | Facebook, Inc. | Keyboard-Based Corrections for Search Queries on Online Social Networks |
US10664157B2 (en) * | 2016-08-03 | 2020-05-26 | Google Llc | Image search query predictions by a keyboard |
US10535005B1 (en) | 2016-10-26 | 2020-01-14 | Google Llc | Providing contextual actions for mobile onscreen content |
US11734581B1 (en) | 2016-10-26 | 2023-08-22 | Google Llc | Providing contextual actions for mobile onscreen content |
US10877961B1 (en) * | 2017-09-29 | 2020-12-29 | TAM-C Solutions, LLC | Technologies for collecting network-based information |
CN112204539A (en) * | 2018-01-16 | 2021-01-08 | 索尼互动娱乐有限责任公司 | Adaptive search using social graph information |
US11397770B2 (en) * | 2018-11-26 | 2022-07-26 | Sap Se | Query discovery and interpretation |
US10817654B2 (en) * | 2018-11-27 | 2020-10-27 | Snap-On Incorporated | Method and system for modifying web page based on tags associated with content file |
US20200167410A1 (en) * | 2018-11-27 | 2020-05-28 | Snap-On Incorporated | Method and system for modifying web page based on tags associated with content file |
US11409947B2 (en) | 2018-11-27 | 2022-08-09 | Snap-On Incorporated | Method and system for modifying web page based on tags associated with content file |
US11663269B2 (en) * | 2019-02-21 | 2023-05-30 | Beijing Baidu Netcom Science And Technology Co., Ltd. | Error correction method and apparatus, and computer readable medium |
US20220214897A1 (en) * | 2020-03-11 | 2022-07-07 | Atlassian Pty Ltd. | Computer user interface for a virtual workspace having multiple application portals displaying context-related content |
US11899734B2 (en) * | 2021-01-05 | 2024-02-13 | Vmware, Inc. | Extracting and populating content from an email link |
US20220215067A1 (en) * | 2021-01-05 | 2022-07-07 | Vmware, Inc. | Extracting and populating content from an email link |
US20230061328A1 (en) * | 2021-08-30 | 2023-03-02 | Red Hat, Inc. | Seamless integration of multiple applications in tutorials |
US11928487B2 (en) * | 2021-08-30 | 2024-03-12 | Red Hat, Inc. | Seamless integration of multiple applications in tutorials |
US20230117568A1 (en) * | 2021-10-18 | 2023-04-20 | Microsoft Technology Licensing, Llc | Knowledge attributes and passage information based interactive next query recommendation |
US11995139B2 (en) * | 2021-10-18 | 2024-05-28 | Microsoft Technology Licensing, Llc | Knowledge attributes and passage information based interactive next query recommendation |
Also Published As
Publication number | Publication date |
---|---|
WO2014152989A3 (en) | 2015-02-19 |
WO2014153086A2 (en) | 2014-09-25 |
EP2973028A4 (en) | 2016-08-31 |
US20140280093A1 (en) | 2014-09-18 |
EP2973035A2 (en) | 2016-01-20 |
EP2973035A4 (en) | 2016-08-31 |
US20140280015A1 (en) | 2014-09-18 |
WO2014152989A2 (en) | 2014-09-25 |
WO2014152936A3 (en) | 2015-01-08 |
US10175860B2 (en) | 2019-01-08 |
EP2973028A2 (en) | 2016-01-20 |
BR112015020551A2 (en) | 2017-07-18 |
WO2014153086A3 (en) | 2014-12-04 |
WO2014139120A1 (en) | 2014-09-18 |
US20140280092A1 (en) | 2014-09-18 |
WO2014152936A2 (en) | 2014-09-25 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10175860B2 (en) | Search intent preview, disambiguation, and refinement | |
CN108139849B (en) | Action suggestions for user-selected content | |
US9613132B2 (en) | Method of and system for displaying a plurality of user-selectable refinements to a search query | |
US9582549B2 (en) | Computer application data in search results | |
US10885076B2 (en) | Computerized system and method for search query auto-completion | |
US9760541B2 (en) | Systems and methods for delivery techniques of contextualized services on mobile devices | |
RU2581840C2 (en) | Registration for system level search user interface | |
US9652556B2 (en) | Search suggestions based on viewport content | |
US20130254031A1 (en) | Dynamic Modification of Advertisements Displayed in Response to a Search Engine Query | |
US20210279297A1 (en) | Linking to a search result | |
US20160299978A1 (en) | Device dependent search experience | |
US11995139B2 (en) | Knowledge attributes and passage information based interactive next query recommendation | |
Spencer | Google power search | |
JP7492994B2 (en) | Search result providing method, system, and computer program | |
US20240256624A1 (en) | Knowledge attributes and passage information based interactive next query recommendation | |
KR20230032811A (en) | Method, system, and computer program to dynamically provide sub-item recommendation list for each item included in search results based on search query | |
TW201447615A (en) | Social entity previews in query formulation | |
WO2022251130A1 (en) | Linking to a search result |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: MICROSOFT CORPORATION, WASHINGTON Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:MARANTZ, DANIEL;WANG, KUANSAN;KUO, YU-TING;AND OTHERS;SIGNING DATES FROM 20130529 TO 20130803;REEL/FRAME:031099/0789 |
|
AS | Assignment |
Owner name: MICROSOFT TECHNOLOGY LICENSING, LLC, WASHINGTON Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:MICROSOFT CORPORATION;REEL/FRAME:034747/0417 Effective date: 20141014 Owner name: MICROSOFT TECHNOLOGY LICENSING, LLC, WASHINGTON Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:MICROSOFT CORPORATION;REEL/FRAME:039025/0454 Effective date: 20141014 |
|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |