NL2027356B1 - Visual search query generation - Google Patents
Visual search query generation
- Publication number
- NL2027356B1
- Authority
- NL
- Netherlands
- Prior art keywords
- search
- visual object
- user
- processor
- gui
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/90—Details of database functions independent of the retrieved data types
- G06F16/903—Querying
- G06F16/9032—Query formulation
- G06F16/90324—Query formulation using system suggestions
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/90—Details of database functions independent of the retrieved data types
- G06F16/903—Querying
- G06F16/9032—Query formulation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/90—Details of database functions independent of the retrieved data types
- G06F16/903—Querying
- G06F16/9035—Filtering based on additional data, e.g. user or group profiles
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q30/00—Commerce
- G06Q30/06—Buying, selling or leasing transactions
- G06Q30/0601—Electronic shopping [e-shopping]
- G06Q30/0623—Item investigation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q30/00—Commerce
- G06Q30/06—Buying, selling or leasing transactions
- G06Q30/0601—Electronic shopping [e-shopping]
- G06Q30/0641—Shopping interfaces
- G06Q30/0643—Graphical representation of items or shoppers
Abstract
A system for generating a search query is disclosed, which includes a processor and a computer-readable medium storing instructions that control the system to perform rendering a GUI for an application, which includes a first visual object associated with a first operation; in response to detecting that a user has selected the first visual object during a first mode of operation, executing the first operation; and in response to detecting that the user has selected the first visual object during a second mode of operation, performing disabling the first operation, inferring, from information related to the first visual object, one or more search filtering options contextually relevant to the first visual object, displaying the one or more search filtering options via the GUI, and, in response to detecting that the user has selected one of the search filtering options, adding the selected search filtering option to a search query.
Description
[0001] Recent developments in computer and communication technologies have transformed the way information is conveyed to users from a text-based presentation to a graphic-based presentation, which is more intuitive and user-friendly. Nowadays, most consumer-targeted operating systems and applications are programmed to offer a graphical user interface (GUI), which displays a number of visual objects that convey information and represent actions that can be taken by the user. User interface (UI) design is also an important part of web design. A website with a well-thought-out UI design typically attracts more traffic, which could contribute to increased sales and profit. Also, an application with a streamlined UI typically allows the user to complete a task in a shorter period of time, improving performance at both the individual and organization levels.
[0002] Despite the improved user experiences that recent developments in GUI design and implementation have brought, users still are not provided with many options to search for desired information from the applications or websites they are accessing. Some applications or websites are designed to provide a text field where a user can enter a search term. Such a text field, however, cannot be used if the user does not know or remember a keyword or term to be searched. Even if the user knows or remembers a keyword, searching for that keyword may return too many matching data entities. To provide more targeted searches, some GUIs allow a user to build a search query by using a plurality of field codes (e.g., “title/needle or ttl/syringe andnot (sew or thread$)”), but most users are not familiar with search query syntaxes and very often fail to formulate the complete search query required for a desired search result. Accordingly, there remain significant areas for new and improved implementations of more intuitive and user-friendly GUI-based information searches.
[0003] In an implementation, a system for generating a search query includes a processor and a computer-readable medium in communication with the processor and storing instructions that, when executed by the processor, cause the processor to control the system to perform rendering a graphical user interface, GUI, for an application, the GUI including a plurality of visual objects including a first visual object associated with a first operation; in response to detecting that a user has selected the first visual object during a first mode of operation, executing the first operation associated with the first visual object; and in response to detecting that the user has selected the first visual object during a second mode of operation, performing disabling the first operation associated with the first visual object; inferring, from information related to the first visual object, one or more search filtering options contextually relevant to the first visual object; displaying, via the GUI, the one or more search filtering options including a first search filtering option; and in response to detecting that the user has selected the first search filtering option, adding the selected first search filtering option to a search query.
[0004] In another implementation, a method of operating a system for generating a search query includes rendering a graphical user interface, GUI, for an application, the GUI including a plurality of visual objects including a first visual object associated with a first operation; in response to detecting that a user has selected the first visual object during a first mode of operation, executing the first operation associated with the first visual object; and in response to detecting that the user has selected the first visual object during a second mode of operation, performing disabling the first operation associated with the first visual object; inferring, from information related to the first visual object, one or more search filtering options contextually relevant to the first visual object; displaying, via the GUI, the one or more search filtering options including a first search filtering option; and in response to detecting that the user has selected the first search filtering option, adding the selected first search filtering option to a search query.
[0005] In another implementation, a non-transitory computer-readable medium stores executable instructions that, when executed by a processor, cause the processor to perform rendering a graphical user interface, GUI, for an application, the GUI including a plurality of visual objects including a first visual object associated with a first operation; in response to detecting that a user has selected the first visual object during a first mode of operation, executing the first operation associated with the first visual object; and in response to detecting that the user has selected the first visual object during a second mode of operation, performing disabling the first operation associated with the first visual object; inferring, from information related to the first visual object, one or more search filtering options contextually relevant to the first visual object; displaying, via the GUI, the one or more search filtering options including a first search filtering option; and in response to detecting that the user has selected the first search filtering option, adding the selected first search filtering option to a search query.
[0006] This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter. Furthermore, the claimed subject matter is not limited to implementations that solve any or all disadvantages noted in any part of this disclosure.
[0007] The drawing figures depict one or more implementations in accord with the present teachings, by way of example only, not by way of limitation. In the figures, like reference numerals refer to the same or similar elements. Furthermore, it should be understood that the drawings are not necessarily to scale.
[0008] FIG. 1 illustrates an implementation of a system for visually generating a search query.
[0009] FIG. 2A illustrates an example website displayed via a browser application GUI.
[0010] FIG. 2B illustrates the GUI of FIG. 2A displaying an example popup menu showing search filtering options.
[0011] FIG. 3A illustrates an example mobile chat application GUI displaying a popup menu showing search filtering options.
[0012] FIGS. 4A to 4F illustrate an implementation of operations for visually creating and modifying a search query.
[0013] FIG. 5 is a flow diagram showing an implementation of operations for visually creating a search query.
[0014] FIG. 6 is a block diagram showing an example computer system upon which aspects of this disclosure may be implemented.
[0015] In the following detailed description, numerous specific details are set forth by way of examples in order to provide a thorough understanding of the relevant teachings. However, it should be apparent that the present teachings may be practiced without such details. In other instances, well known methods, procedures, components, and/or circuitry have been described at a relatively high-level, without detail, in order to avoid unnecessarily obscuring aspects of the present teachings.
[0016] This description is directed to operating a graphical user interface (GUI) in two different modes: a normal mode and a visual search query mode. During the normal mode, when a visual object displayed on the GUI is selected by a user, a predefined operation associated with the visual object is performed. When the visual search query mode is activated, the predefined operations associated with the visual objects are disabled. Instead, a user’s selection of a visual object causes a list of one or more search filtering options that are contextually relevant to the selected visual object to be displayed. The user can then review the available search filtering options and select a desired filtering option, which causes the selected search filtering option to be added to a search query. Accordingly, users can create or modify a search query by selecting a visual object that might be relevant to a desired search filtering option, and then selecting a desired option from the displayed list, which results in the selected search filtering option being added to the search query. Hence, this description provides an intuitive and user-friendly way to interact with a GUI for generating a search query and carrying out a search, thereby providing technical solutions to the technical problems of conventional approaches, in which users are required to know or learn how to activate and execute search functions, which keywords or search terms should be used, and how to formulate a search query, in order to obtain desired search results.
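The two-mode behavior described above can be sketched in simplified form. All names here (Mode, VisualObject, Gui, etc.) are illustrative assumptions, not identifiers from the patent; the sketch only shows the dispatch between executing a predefined operation and returning filtering options.

```python
# Illustrative sketch of normal mode vs. visual search query mode.
from dataclasses import dataclass, field
from enum import Enum, auto
from typing import Callable, Dict, List

class Mode(Enum):
    NORMAL = auto()
    VISUAL_SEARCH = auto()

@dataclass
class VisualObject:
    name: str
    operation: Callable[[], str]                    # predefined operation (e.g., follow a link)
    metadata: Dict[str, str] = field(default_factory=dict)

@dataclass
class Gui:
    mode: Mode = Mode.NORMAL
    query: List[str] = field(default_factory=list)

    def on_select(self, obj: VisualObject):
        if self.mode is Mode.NORMAL:
            # Normal mode: run the object's predefined operation.
            return obj.operation()
        # Visual search query mode: the predefined operation is disabled;
        # instead, derive filtering options from the object's metadata.
        return [f"{k}: {v}" for k, v in obj.metadata.items()]

    def add_filter(self, option: str) -> None:
        # Selecting an option adds it to the search query being built.
        self.query.append(option)
```

In use, the same selection gesture yields the predefined operation in one mode and candidate filtering options in the other; the actual implementation would of course attach this dispatch to real GUI event handlers.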
[0017] With this overview, attention is now turned to the figures to describe various implementations of the present teachings. FIG. 1 illustrates an implementation of a system 100 that allows a user to interact with a GUI for generating a search query. The system 100 may include a local device 110, an application server 130, and/or various data resources, such as internet resources 150 (e.g., an internet encyclopedia, content data storage, etc.), data resources (e.g., web sites/web pages 160A, contacts/directory databases 160B, maps 160C, accounts 160D, etc.), an artificial intelligence (AI) engine 170A, a deep learning (DL) engine 170B, an application database 180, and/or the like.
[0018] The local device 110 is representative of any physical or virtual computing system, device, or collection thereof, such as a smart phone, laptop computer, tablet computer, desktop computer, hybrid computer, gaming machine, smart television, entertainment device, Internet appliance, virtual machine, or wearable computer, as well as any variation or combination thereof. The local device 110 may operate remotely from the application server 130, and hence the two may communicate with each other by way of data and information exchanged over a suitable communication link or links. The local device 110 may implement all or portions of the functions for visually generating a search query. The local device 110 may host, be integrated with, or be in communication with some or all of the data sources. For example, the local device 110 may be in communication with the web sites/web pages 160A, contacts/directory databases 160B, maps 160C and user/member accounts 160D (collectively “data resources 160” hereinafter) via the internet resources 150. The local device 110 may be in communication with the AI engine 170A, DL engine 170B and application database 180 via the application server 130.
[0019] The local device 110 may include or be connected to a display 112 and may further include or be connected to one or more user interface devices, such as a mouse, keyboard, speaker, microphone, etc. (not shown). The local device 110 may host a local application 120. The local application 120 is representative of any software application, module, component, or collection thereof capable of visually conveying information via a GUI, such as a web browser, email client application, chat application, collaboration solution application (e.g., Microsoft® Teams®, etc.), productivity application (e.g., Microsoft® Office®), etc. The local application 120 may also include certain functions of an operating system (OS) related to visually conveying information via a GUI. The local application 120 may be a locally installed and executed application, a streamed application, a mobile application, or any combination or variation thereof. The local application 120 may be implemented as a standalone application or may be distributed across multiple applications.
[0020] The application server 130 is representative of any physical or virtual computing system, device, or collection thereof, such as a web server, rack server, blade server, virtual machine server, or tower server, as well as any other type of computing system, which may, in some scenarios, be implemented in a data center, a virtual data center, or some other suitable facility. The application server 130 may operate an application service 140, which may implement all or portions of the functions to visually convey information to a local user associated with the local device 110. The application service 140 may host, be integrated with, or be in communication with various data sources such as the internet resources 150, data resources 160, AI engine 170A, DL engine 170B, application database 180, etc.
[0021] The application service 140 may be any software application, module, component, or collection thereof capable of providing an application service to the local application 120. In some cases, the application service 140 is a standalone application providing a productivity tool for collaboration among various users, including members and non-members of an organization or entity. In some cases, the application service 140 includes a productivity and/or collaboration solution (or other solution) for which the visual search query generation may be provided. Examples of the productivity and/or collaboration (and other) applications for which the visual search query generation may be provided include, but are not limited to, office/productivity suites, collaboration applications, web browsers, email applications, chat applications, conferencing applications, blogging and micro-blogging applications, social networking applications, gaming applications, etc.
[0022] The features and functionality provided by the local application 120 and application service 140 can be co-located or even integrated as a single application. In addition to the above-mentioned features and functionality available across application and service platforms, aspects of the described visual search query generation may be carried out across multiple devices, on the same or different computing devices. For example, some functionality for the visual search query generation may be provided by the local application 120 on the local device 110, and the local application 120 may communicate by way of data and information exchanged with the application server 130 or other devices.
[0023] The system 100 may carry out various functions for a user of the local device 110 to visually generate a search query and execute a search based on the search query. For example, the system 100 may render a GUI for an application, which includes a plurality of visual objects including a first visual object associated with a first operation. In response to detecting that a user has selected the first visual object, the system 100 may execute the first operation associated with the first visual object. When a user input requesting to activate a visual search query mode is received, the system 100 may disable the first operation associated with the first visual object. When the user selects the first visual object while the visual search query mode is activated, the system 100 may infer, from information related to the first visual object, one or more search filtering options contextually relevant to the first visual object. The system 100 may then display, via the GUI, the one or more search filtering options including a first search filtering option. When the user selects the first search filtering option, the system 100 may add the selected first search filtering option to a search query.
[0024] The search filtering options may be inferred in various ways. For example, when a visual object is selected by a user, the system 100 may determine whether programming code for the local application 120 and/or a data collection relevant to the local application 120 contains information contextually relevant to the selected visual object. Such programming code and/or data collection may contain various information related to the selected visual object, such as a web address, directory link, storage link, file name, user ID, text data, attribute data, storage location, etc., which may be used to infer one or more search filtering options that the user is likely to have in mind when the user has selected the particular visual object. Such inference may be done by the local device 110 alone or in communication with the application service 140. Each search filtering option may identify a person, group, organization, activity, project, topic, subject, category, channel, time, date, period, location, communication session, communication type, etc.
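The inference step of paragraph [0024] can be illustrated with a minimal sketch that maps information attached to a selected visual object (file name, user ID, attributes, etc.) to candidate filtering options. The field names and option formats below are assumptions for illustration only; the patent does not prescribe a particular mapping.

```python
# Hypothetical mapping from a visual object's associated information to
# contextually relevant search filtering options.
from typing import Dict, List

def infer_filter_options(object_info: Dict) -> List[str]:
    """Return candidate search filtering options for a selected visual object."""
    options: List[str] = []
    if "user_id" in object_info:
        # e.g., an avatar is associated with a user ID -> filter by sender.
        options.append(f"FROM: {object_info['user_id']}")
    if "file_name" in object_info:
        # e.g., a document icon carries a file name -> filter by file type.
        ext = object_info["file_name"].rsplit(".", 1)[-1].upper()
        options.append(f"FILE TYPE: {ext}")
    for attr, value in object_info.get("attributes", {}).items():
        # e.g., a product image carries attribute data (color, type, maker).
        options.append(f"{attr.upper()}: {value}")
    return options
```

A real system would draw the `object_info` dictionary from the application's code or data collection, and could consult remote resources for additional context.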
[0025] For more accurate inference, various information from various sources may be searched and considered, such as the internet resources 150, data resources 160, AI engine 170A, DL engine 170B, application database 180, etc. For example, the system 100 may search the data collection related to the local application 120 and/or the application service 140, which may be stored in the local device 110, application server 130 and/or application database 180. The system 100 may also search the web sites/web pages 160A, contacts/directory databases 160B, maps 160C, and/or user/member accounts 160D.
[0026] Furthermore, the system 100 may use the AI engine 170A and/or DL engine 170B, etc. to provide a more accurate and relevant estimation of a keyword or search filter. The AI and DL engines 170A and 170B may be implemented based on machine learning (ML), which generally involves various algorithms that can automatically learn over time. The foundation of these algorithms is generally built on mathematics and statistics that can be employed to predict events, classify entities, diagnose problems, and model function approximations. As an example, a system can be trained to identify patterns in user activity, determine associations between various datapoints, and make decisions based on the patterns and associations. Such determinations may be made following the accumulation, review, and/or analysis of data from a large number of users over time, which may be configured to provide the ML algorithm (MLA) with an initial or ongoing training set.
[0027] In different implementations, a training system may be used that includes an initial ML model (which may be referred to as an “ML model trainer”) configured to generate a subsequent trained ML model from training data obtained from a training data repository. The generation of this ML model may be referred to as “training” or “learning.” The training system may include and/or have access to substantial computation resources for training, such as a cloud, including many computer server systems adapted for machine learning training. In some implementations, the ML model trainer is configured to automatically generate multiple different ML models from the same or similar training data for comparison. For example, different underlying ML algorithms may be trained, such as, but not limited to, decision trees, random decision forests, neural networks, deep learning (for example, convolutional neural networks), support vector machines, and regression (for example, support vector regression, Bayesian linear regression, or Gaussian process regression). As another example, the size or complexity of a model may be varied between different ML models, such as a maximum depth for decision trees, or a number and/or size of hidden layers in a convolutional neural network.
[0028] Moreover, different training approaches may be used for training different ML models, such as, but not limited to, selection of training, validation, and test sets of training data, ordering and/or weighting of training data objects, or numbers of training iterations. One or more of the resulting multiple trained ML models may be selected based on factors such as, but not limited to, accuracy, computational efficiency, and/or power efficiency. In some implementations, a single trained ML model may be produced. The training data may be continually updated, and one or more of the models used by the system can be revised or regenerated to reflect the updates to the training data. Over time, the training system (whether stored remotely, locally, or both) can be configured to receive and accumulate more and more training data objects, thereby increasing the amount and variety of training data available for ML model training, resulting in increased accuracy, effectiveness, and robustness of trained ML models.
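The train-multiple-models-then-select loop of paragraphs [0027] and [0028] can be sketched as follows. To stay self-contained, the "models" here are trivial one-dimensional threshold classifiers standing in for decision trees, neural networks, and the like; the selection criterion is validation accuracy, one of the factors named above. Everything in this sketch is illustrative.

```python
# Illustrative model-selection loop: train several candidate models and
# keep the one that scores best on a held-out validation set.
from typing import Callable, List, Tuple

def train_threshold(train_data, threshold: float) -> Callable[[float], int]:
    """Stand-in 'training' step: returns a 1-D threshold classifier."""
    return lambda x: 1 if x >= threshold else 0

def select_best_model(train_data,
                      val_data: List[Tuple[float, int]],
                      candidate_thresholds: List[float]):
    """Vary a model parameter, evaluate each candidate, keep the best."""
    best_model, best_acc = None, -1.0
    for t in candidate_thresholds:        # analogous to varying depth/layers
        model = train_threshold(train_data, t)
        acc = sum(model(x) == y for x, y in val_data) / len(val_data)
        if acc > best_acc:
            best_model, best_acc = model, acc
    return best_model, best_acc
```

In practice the candidates would be genuinely different ML algorithms or hyperparameter settings, and selection could also weigh computational and power efficiency, as noted above.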
[0029] For example, the DL engine 170B may be provided with user selection history. From the user selection history, the DL engine 170B may have learned that a particular user has previously indicated a certain search filtering option as the intended search filtering option when a particular visual object or a particular visual object type was selected in a particular circumstance. For example, a user history record may indicate that the user has frequently selected an avatar in a chat GUI to search emails or chat messages sent by a member associated with the avatar. Such information may then be shared with the AI engine 170A such that the system 100 can more accurately and efficiently infer one or more search filtering options.
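A deliberately simple, non-ML stand-in for the history-based learning described in paragraph [0029] is to count which filtering option a user ultimately chose after selecting each visual object type, and rank candidate options by past frequency. The class and method names are hypothetical.

```python
# Frequency-based stand-in for learning a user's intended filtering option
# from selection history.
from collections import Counter, defaultdict
from typing import DefaultDict, List

class SelectionHistory:
    def __init__(self) -> None:
        self._counts: DefaultDict[str, Counter] = defaultdict(Counter)

    def record(self, object_type: str, chosen_option: str) -> None:
        """Record that the user chose this option after selecting this object type."""
        self._counts[object_type][chosen_option] += 1

    def ranked_options(self, object_type: str) -> List[str]:
        """Options for this object type, most frequently chosen first."""
        return [opt for opt, _ in self._counts[object_type].most_common()]
```

A DL engine would replace the raw counts with a trained model that also conditions on circumstance (application, time, prior query, etc.), but the input/output contract is the same: selection context in, ranked filtering options out.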
[0030] For example, the local application 120 may be a web browser operating in association with a web page, web site, web service, or the like. An example web page 200 is shown in FIG. 2A, which displays a plurality of visual objects, such as textual objects, image objects, etc., via a browser GUI displayed on the display 112. The textual objects, or texts, include category names 210A (e.g., “Valises-pushcarts (229),” “Valises (2),” etc.), item names and numbers 210B (e.g., “CARTON 006J155;38,” etc.), item descriptions 210C, and item prices 210D (e.g., “772$”), which are collectively referred to as texts 210. The image objects include a plurality of images (e.g., bag images 220A, 220B, 220C and 220D, which are collectively referred to as images 220). The web page 200 also includes a text field 230, where a user can type a search term to search a data collection associated with the web page 200.
[0031] Conventionally, when a user selects a visual object displayed on the web page 200, the browser application executes a predetermined operation associated with the selected visual object. For example, the image 220A may be associated with an embedded link to another web page (not shown) which provides detailed product information (e.g., an item name, price, color options, description, etc.). When the user wants to know more about the item shown in the image 220A, the user may select the image 220A by, for example, moving a cursor 240 over the image 220A and performing a left-click, which leads to a predetermined operation of displaying the linked web page. The user’s interaction with or selection of the image 220A does not result in any operation related to performing a search or creating or modifying a search query.
[0032] On the other hand, according to an aspect, once a visual search query mode is activated, the predefined operations associated with the visual objects are disabled. The user interface 200 may include an activation button 232 to allow a user to activate the visual search query mode. For example, as shown in FIG. 2A, the activation button 232 may be placed at the right side of the text field 230. Alternatively, the visual search query mode may be automatically activated when the user interacts with the text field 230. For example, when the user clicks the text field 230, the system 100 may activate the visual search query mode. The visual search query mode may also be activated by a combination of key entries (e.g., control + F9), a voice command, etc. Regarding the image 220A, upon activating the visual search query mode, the predetermined operation of displaying the linked web page is disabled. Hence, the user's selection of the image 220A does not cause displaying of the linked web page. Instead, when the user selects the image 220A by, for example, moving the cursor 240 to hover over the image 220A, the system 100 may infer, from various information relevant to the selected image 220A (e.g., an item name, item model, item type, item size, item color, item manufacturer, etc.), one or more search filtering options (e.g., items having the same color, items having the same type, items by the same manufacturer, etc.) that the user may use for creating or modifying a search query or filtering a search. Such search filtering options may be displayed as a list via the GUI 200. For example, as shown in FIG. 2B, the system 100 may display a popup menu 250 listing one or more search filtering options (e.g., "COLOR: BLACK," "TYPE: TRAVEL LUGGAGE," "MANUFACTURER: CARLTON," etc.). The system 100 may also search inventory data to retrieve quantity information (e.g., "(156)," "(124)," "(22)," etc.) relevant to the search filtering options, which may be provided along with the search filtering options in the popup menu 250, as shown in FIG. 2B.
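The behavior described in paragraphs [0031] and [0032], where a visual object's default click operation is suppressed once the visual search query mode is active and filtering options are instead inferred from information about the object, can be sketched in a few lines of Python. This is a hypothetical illustration only; the class names, attributes, and option format below are assumptions, not the claimed implementation.

```python
from dataclasses import dataclass

@dataclass
class VisualObject:
    metadata: dict        # e.g. {"color": "black", "type": "travel luggage"}
    default_action: str   # e.g. a link to a detailed product page

class SearchUI:
    def __init__(self):
        self.visual_mode = False  # visual search query mode is off by default

    def on_select(self, obj):
        if not self.visual_mode:
            # Conventional behavior: run the predetermined operation.
            return ("navigate", obj.default_action)
        # Mode active: the default operation is disabled; instead, infer
        # search filtering options from information about the object.
        options = [f"{k.upper()}: {v.upper()}" for k, v in obj.metadata.items()]
        return ("show_options", options)

ui = SearchUI()
bag = VisualObject({"color": "black", "type": "travel luggage"},
                   "https://example.com/luggage")
assert ui.on_select(bag) == ("navigate", "https://example.com/luggage")
ui.visual_mode = True
kind, options = ui.on_select(bag)
assert kind == "show_options"
assert "COLOR: BLACK" in options and "TYPE: TRAVEL LUGGAGE" in options
```

The key design point is that the same selection gesture is routed to different handlers depending on the mode flag, rather than the mode changing the objects themselves.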
[0033] The user may then select a desired search filtering option from the popup menu 250, which results in creating a new search query or modifying a search query. For example, the user may move the cursor 240 over the search filtering option "TYPE: TRAVEL LUGGAGE" and perform a left-click, which may result in creating a new search query including the selected search filtering option "TYPE: TRAVEL LUGGAGE." Hence, even if the user does not know what the type of the selected item is called, the user still can create or modify a search query including the accurate type name (e.g., "TRAVEL LUGGAGE") by looking at the displayed images 220 and selecting one that has the desired shape or configuration. The system 100 may then automatically generate one or more search filtering options including the type of the selected item. As such, the description provides an intuitive and user-friendly way of creating or modifying a search query, which may eliminate a user's need to know or learn how to formulate a search query or how to perform a search, which may vary from one application to another.
[0034] The user may add more search filtering options to the search query by selecting a different visual object and repeating the same process described above. As shown in FIG. 2B, the selected search filtering option may be displayed in the text field 230. When there is an existing search query including a previously added search term or search filtering option, selecting a new search filtering option may result in modifying the existing search query to add the newly selected search filtering option. The user may also modify the search query by typing a search term in the text field 230. The user may create a search query by typing a search term in the text field 230 and then modify the search query by activating the visual search query mode and selecting a search filtering option. As such, a search query may include a mixture of one or more text entries and one or more search filtering options. Once the search query is completed, the user may click a search button 234, and the system 100 may perform a search to display a search result.
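The query-building behavior of paragraph [0034], a single query holding both typed terms and selected filtering options, might be modeled as follows. The `SearchQuery` class and its string rendering are illustrative assumptions, not the patented design.

```python
class SearchQuery:
    """A query mixing free-text terms and selected filtering options."""

    def __init__(self):
        self.terms = []     # free-text entries typed into the text field
        self.filters = []   # filtering options picked from popup menus

    def add_term(self, term):
        self.terms.append(term)

    def add_filter(self, option):
        # Selecting the same filtering option twice should not duplicate it.
        if option not in self.filters:
            self.filters.append(option)

    def __str__(self):
        return " ".join(self.terms + [f'"{f}"' for f in self.filters])

q = SearchQuery()
q.add_term("LT review")
q.add_filter("TYPE: TRAVEL LUGGAGE")
q.add_filter("TYPE: TRAVEL LUGGAGE")  # repeated selection is ignored
assert str(q) == 'LT review "TYPE: TRAVEL LUGGAGE"'
```

Keeping terms and filters in separate lists mirrors the document's point that a query can be started in either mode and extended in the other.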
[0035] FIG. 3 shows an implementation of a mobile chat application GUI 300 rendered for a mobile device associated with a user (e.g., John Smith). The GUI 300 includes a search box 310, chat display area 312 and text entry area 314. The search box 310 includes a button 320 for activating a visual search query mode. The chat display area 312 includes a plurality of chats (e.g., chats 330A, 330B and 330C) exchanged between two users (e.g., Sam Dole and John Smith). The chat display area 312 also displays user avatars 340A and 340B next to the corresponding chats. The avatars 340A and 340B may be associated with a predetermined operation. For example, when Sam Dole's avatar 340A is selected by the user, Sam Dole's user information (e.g., email address, phone number, etc.) may be displayed.
[0036] When the user activates a visual search query mode by, for example, pressing the button 320, the predetermined operation associated with the avatar 340A may be disabled. Instead, the system 100 may search a data collection associated with the chat application and other applications and/or services to identify data entities contextually relevant to the avatar 340A. For example, when the user selects the avatar 340A by, for example, moving a cursor 342 to hover over the avatar 340A, the system 100 may identify that the avatar 340A is associated with a user ID (e.g., "samdole"). The system 100 may then search a data collection associated with the chat application and identify data entities relevant to the user ID, such as chats sent by the user, chatrooms created by the user, chatrooms in which the user has participated, etc. The system 100 may also search other data sources, such as data collections associated with other applications (e.g., an email application, scheduling application, note-taking application, collaboration application, etc.), and identify data entities relevant to the user ID (e.g., emails sent by the user, meetings involving the user, files uploaded by the user, notes shared by the user, etc.). Based on the identified data entities, the system 100 may infer one or more search filtering options that the user may use to filter a search. The system 100 may create one or more data filters for the identified data entities. For example, from one or more data collections, the system 100 may find a number of data entities associated with the user, for example, chats sent by the user, scheduled meetings involving the user, files uploaded by the user, notes shared by the user, etc. The system 100 may then determine how to filter, from numerous data entities from various data sources, the identified data entities of different data categories. Then, the system 100 may generate a search filter corresponding to each data category.
For example, upon identifying that there are a number of chats sent by the user, the system 100 may create a search filter named, for example, "Chat sent by: Sam Dole" or the like, which may be used for filtering, from numerous data entities from various data sources, only the chats sent by the user. Other filters may be created in a similar manner. For example, a "Meeting involving Sam Dole" filter may be created based on the scheduling data entities in which the user is identified as a participant. The system 100 may avoid creating, or may eliminate, one or more filters that the user is not likely to use based on, for example, prior user selections or other users' general tendencies.
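One way to read paragraph [0036] is as a group-by over data entities: count the entities of each category that involve the selected user, then emit one labeled filter per category. The entity shape and label templates below are assumptions made for illustration; they are not taken from the claims.

```python
from collections import Counter

# Assumed mapping from an entity's data category to a filter label template,
# using the label wording shown in the example GUI.
FILTER_LABELS = {
    "chat": "Chat sent by: {user}",
    "meeting": "Meetings: {user}",
    "file": "Files uploaded by: {user}",
    "note": "Notes shared by: {user}",
}

def build_filters(entities, user):
    """Return (filter label, entity count) pairs for one selected user."""
    counts = Counter(e["category"] for e in entities if e["user"] == user)
    return [
        (FILTER_LABELS[cat].format(user=user), n)
        for cat, n in counts.items()
        if cat in FILTER_LABELS  # skip categories the user is unlikely to use
    ]

entities = [
    {"category": "chat", "user": "Sam Dole"},
    {"category": "chat", "user": "Sam Dole"},
    {"category": "file", "user": "Sam Dole"},
    {"category": "chat", "user": "John Smith"},
]
assert build_filters(entities, "Sam Dole") == [
    ("Chat sent by: Sam Dole", 2),
    ("Files uploaded by: Sam Dole", 1),
]
```

The final filter over `FILTER_LABELS` plays the role of the pruning step the document describes, dropping filters a user is not expected to select.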
[0037] After generating the one or more search filters, the system 100 may generate a list of search filtering options, which may be displayed via the mobile application GUI 300. For example, the GUI 300 may display a popup menu 350 showing a list of search filtering options including, for example, "Chat sent by: Sam Dole," "Meetings: Sam Dole," "Files uploaded by: Sam Dole," "Notes shared by: Sam Dole," etc. By selecting one of the search filtering options, the user may create a new search query or add the selected search filtering option to an existing search query. Hence, despite a visual object being associated with a predetermined operation irrelevant to a search function, the user "John Smith" may create or modify a search query by interacting with the visual object. Also, the user may be able to create a search query and perform a search without knowing or learning how to formulate a search query to filter particular data entities associated with a particular user, activity, item, etc.
[0038] FIGS. 4A to 4F illustrate an implementation of a GUI 400 for a workplace collaboration application. In FIG. 4A, the GUI 400 may include a search box 410 where a user can type a search term 420, for example, "LT Review." The system 100 may then search one or more data collections relevant to the application and identify one or more data entities matching the search term, which may be displayed via a pull-down menu 430. The pull-down menu 430 may present the relevant data entities in a categorized manner. For example, as shown in FIG. 4A, the pull-down menu 430 may show the matching data entities arranged by data categories, such as top hits, chats, files, etc. Entering the search term 420 may create a new search query, which may be displayed via the search box 410.
[0039] The GUI 400 may provide an option to activate a visual search query mode. For example, the search box 410 may include an icon 412, which is labeled "Visual query builder." To activate the visual search query mode, the user may operate a mouse to move a cursor 402 to the icon 412 as shown in FIG. 4B and then perform a left-click. The visual search query mode may also be activated in a different manner, for example, by a hot key, a combination of key inputs, a voice command, etc. Also, interacting with the search box 410 may automatically activate the visual search query mode.
[0040] Upon activating the visual search query mode, predetermined operations associated with visual objects displayed on the GUI 400 may be disabled. For example, in FIG. 4C, a visual object 440 (labeled "shiproom") may be associated with a predetermined operation of displaying a canvas for a "shiproom" channel (not shown). However, the predetermined operation is disabled when the visual search query mode is activated. Instead, when the user selects the visual object 440 by, for example, moving the cursor 402 to hover over the visual object 440, the system 100 may search one or more data collections associated with the application to identify one or more data entities relevant to the visual object 440, and then infer one or more search filtering options. For example, the system 100 may determine that the visual object 440 is relevant to an existing channel named "Northwind Traders > Shiproom." On this basis, the system 100 may infer a search filtering option for the data entities associated with the "Shiproom" channel. The system 100 may also determine quantity information of the identified data entities. Based on the identified data entities, channel information, quantity information, etc., the system 100 may display a popup menu 442 showing a list of available search filtering options. For example, the system 100 may infer, from the relevant data entities, that there is only one filtering option available for the selected visual object 440, which is named "Northwind Traders > Shiproom" in FIG. 4C. The popup menu 442 may then display the "Northwind Traders > Shiproom" search filtering option. The popup menu 442 may also show the quantity information, such as "28 matches in chats," "12 matches in files," "4 matches in OneNote," etc. When the user selects the "Northwind Traders > Shiproom" search filtering option shown in the popup menu 442, the system 100 may modify the search query by adding the selected search filtering option, which may be displayed in the search box 410 as shown in FIG. 4E.
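The popup-menu content in paragraph [0040], a filtering option accompanied by per-source match counts, reduces to a small rendering helper. The data shapes here are assumptions for illustration.

```python
def option_with_counts(option_name, matches):
    """Render one filtering option plus its quantity information.

    matches maps a data source to its match count, e.g. {"chats": 28}.
    """
    lines = [option_name]
    lines += [f"{count} matches in {source}" for source, count in matches.items()]
    return lines

menu = option_with_counts(
    "Northwind Traders > Shiproom",
    {"chats": 28, "files": 12, "OneNote": 4},
)
assert menu == [
    "Northwind Traders > Shiproom",
    "28 matches in chats",
    "12 matches in files",
    "4 matches in OneNote",
]
```

Showing counts before the filter is applied lets the user judge whether an option is worth selecting, which is the stated purpose of the quantity information.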
[0041] The user may interact with other visual objects to add additional search filtering options to the search query. For example, as shown in FIG. 4D, the user may select an avatar 450 by, for example, moving the cursor 402 to hover over the avatar 450. The system 100 may then identify data entities relevant to the avatar 450, and then infer one or more search filtering options for the selected avatar 450. For example, the system 100 may determine that the selected visual object 450 is associated with a user named Charlotte de Crum. Based on the user identity information, the system 100 may search one or more data collections and infer one or more search filtering options, which are named "From: Charlotte de Crum," "In: Northwind Traders > Marketing," etc. The system 100 may then display a popup menu 452 showing a list of the search filtering options along with quantity information (e.g., "2 matches in chats," etc.). When the user selects the "From: Charlotte de Crum" search filtering option, the selected filtering option may be added to the search query.
[0042] FIG. 4E shows the search box 410 showing the search query including the search term "LT review" 420 and two search filtering options, "Northwind Traders > Marketing" 422 and "From: Charlotte de Crum" 424. When the user indicates that the search query is completed by, for example, pressing an "enter" key or the like, the system 100 may conduct a search based on the search query and display a search result via the GUI 400. For example, as shown in FIG. 4F, the GUI 400 may display two messages 460A and 460B as the search result. As such, users may create a search query or modify an existing search query by interacting with visual objects displayed on a GUI. The system 100 is configured to carry out the difficult part: inferring search filtering options that the user is likely to use to create or modify a search query. By selecting relevant search filtering options, the user may create or modify a search query. Hence, users are not required to know or learn how to formulate a search query or which search terms should be used for complete and accurate searches.
[0043] FIG. 5 is a flow diagram showing an implementation of operations performed for visually generating a search query. At step 510, the system 100 may render a graphical user interface (GUI) for an application. The GUI may include a plurality of visual objects including a first visual object associated with a first operation. At step 520, in response to detecting that a user has selected the first visual object, the system 100 may execute the first operation associated with the first visual object. At step 530, in response to receiving a user input requesting to activate a visual search query mode, the system 100 may disable the first operation associated with the first visual object. At step 540, in response to detecting that the user has selected the first visual object while the visual search query mode is activated, the system 100 may infer, from information related to the selected first visual object, one or more search filtering options contextually relevant to the first visual object. At step 550, the system 100 may display, via the GUI, the one or more search filtering options including a first search filtering option. At step 560, in response to detecting that the user has selected the first search filtering option, the system 100 may add the selected first search filtering option to a search query.
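The six steps of FIG. 5 can be condensed into a single event handler, sketched below under assumed event and state shapes; this is an interpretation of the flow, not the claimed implementation.

```python
def infer_options(obj_info):
    # Step 540: infer filtering options from information about the object.
    return [f"{k}: {v}" for k, v in obj_info.items()]

def handle_event(state, event):
    """state: {"mode": bool, "query": list}; event: (kind, payload)."""
    kind, payload = event
    if kind == "activate_mode":
        # Step 530: entering the mode disables default operations.
        state["mode"] = True
    elif kind == "select_object":
        if not state["mode"]:
            # Step 520: outside the mode, run the predetermined operation.
            return ("run_default_operation", payload)
        # Steps 540-550: infer options and hand them to the GUI for display.
        return ("show_options", infer_options(payload))
    elif kind == "select_option":
        # Step 560: add the chosen filtering option to the search query.
        state["query"].append(payload)
    return None

state = {"mode": False, "query": []}  # step 510: GUI rendered, mode off
assert handle_event(state, ("select_object", {"type": "luggage"}))[0] == "run_default_operation"
handle_event(state, ("activate_mode", None))
assert handle_event(state, ("select_object", {"type": "luggage"}))[0] == "show_options"
handle_event(state, ("select_option", "type: luggage"))
assert state["query"] == ["type: luggage"]
```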
[0044] FIG. 6 is a block diagram showing an example computer system 600 upon which aspects of this disclosure may be implemented. The computer system 600 may include a bus 602 or other communication mechanism for communicating information, and a processor 604 coupled with the bus 602 for processing information. The computer system 600 may also include a main memory 606, such as a random-access memory (RAM) or other dynamic storage device, coupled to the bus 602 for storing information and instructions to be executed by the processor 604. The main memory 606 may also be used for storing temporary variables or other intermediate information during execution of instructions to be executed by the processor 604. The computer system 600 may implement, for example, the local device 110 and/or application server 130.
[0045] The computer system 600 may further include a read only memory (ROM) 608 or other static storage device coupled to the bus 602 for storing static information and instructions for the processor 604. A storage device 610, such as a flash or other non-volatile memory may be coupled to the bus 602 for storing information and instructions.
[0046] The computer system 600 may be coupled via the bus 602 to a display 612, such as a liquid crystal display (LCD), for displaying information. One or more user input devices, such as the example user input device 614, may be coupled to the bus 602, and may be configured for receiving various user inputs, such as user command selections, and communicating these to the processor 604 or to the main memory 606. The user input device 614 may include physical structure, or virtual implementation, or both, providing user input modes or options, for controlling, for example, a cursor, visible to a user through the display 612 or through other techniques, and such modes or operations may include, for example, a virtual mouse, trackball, or cursor direction keys.
[0047] The computer system 600 may include respective resources of the processor 604 executing, in an overlapping or interleaved manner, respective program instructions.
Instructions may be read into the main memory 606 from another machine-readable medium, such as the storage device 610. In some examples, hard-wired circuitry may be used in place of or in combination with software instructions. The term “machine-readable medium” as used herein refers to any medium that participates in providing data that causes a machine to operate in a specific fashion. Such a medium may take forms, including but not limited to, non-volatile media, volatile media, and transmission media. Non-volatile media may include, for example, optical or magnetic disks, such as storage device 610. Transmission media may include optical paths, or electrical or acoustic signal propagation paths, and may include acoustic or light waves, such as those generated during radio-wave and infra-red data communications, that are capable of carrying instructions detectable by a physical mechanism for input to a machine.
[0048] The computer system 600 may also include a communication interface 618 coupled to the bus 602, for two-way data communication coupling to a network link 620 connected to a local network 622. The network link 620 may provide data communication through one or more networks to other data devices. For example, the network link 620 may provide a connection through the local network 622 to a host computer 624 or to data equipment operated by an Internet Service Provider (ISP) 626 to access through the Internet 628 a server 630, for example, to obtain code for an application program.
[0049] While various embodiments have been described, the description is intended to be exemplary, rather than limiting, and it is understood that many more embodiments and implementations are possible that are within the scope of the embodiments. Although many possible combinations of features are shown in the accompanying figures and discussed in this detailed description, many other combinations of the disclosed features are possible.
Any feature of any embodiment may be used in combination with or substituted for any other feature or element in any other embodiment unless specifically restricted. Therefore, it will be understood that any of the features shown and/or discussed in the present disclosure may be implemented together in any suitable combination. Accordingly, the embodiments are not to be restricted except in light of the attached claims and their equivalents. Also, various modifications and changes may be made within the scope of the attached claims.
[0050] While the foregoing has described what are considered to be the best mode and/or other examples, it is understood that various modifications may be made therein and that the subject matter disclosed herein may be implemented in various forms and examples, and that the teachings may be applied in numerous applications, only some of which have been described herein. It is intended by the following claims to claim any and all applications, modifications and variations that fall within the true scope of the present teachings.
[0051] Unless otherwise stated, all measurements, values, ratings, positions, magnitudes, sizes, and other specifications that are set forth in this specification, including in the claims that follow, are approximate, not exact. They are intended to have a reasonable range that is consistent with the functions to which they relate and with what is customary in the art to which they pertain.
[0052] The scope of protection is limited solely by the claims that now follow. That scope is intended and should be interpreted to be as broad as is consistent with the ordinary meaning of the language that is used in the claims when interpreted in light of this specification and the prosecution history that follows and to encompass all structural and functional equivalents. Notwithstanding, none of the claims are intended to embrace subject matter that fails to satisfy the requirement of Sections 101, 102, or 103 of the Patent Act, nor should they be interpreted in such a way. Any unintended embracement of such subject matter is hereby disclaimed.
[0053] Except as stated immediately above, nothing that has been stated or illustrated is intended or should be interpreted to cause a dedication of any component, step, feature, object, benefit, advantage, or equivalent to the public, regardless of whether it is or is not recited in the claims.
[0054] It will be understood that the terms and expressions used herein have the ordinary meaning as is accorded to such terms and expressions with respect to their corresponding respective areas of inquiry and study except where specific meanings have otherwise been set forth herein. Relational terms such as first and second and the like may be used solely to distinguish one entity or action from another without necessarily requiring or implying any actual such relationship or order between such entities or actions. The terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. An element preceded by "a" or "an" does not, without further constraints, preclude the existence of additional identical elements in the process, method, article, or apparatus that comprises the element.
[0055] The Abstract of the Disclosure is provided to allow the reader to quickly ascertain the nature of the technical disclosure. It is submitted with the understanding that it will not be used to interpret or limit the scope or meaning of the claims. In addition, in the foregoing Detailed Description, it may be seen that various features are grouped together in various examples for the purpose of streamlining the disclosure. This method of disclosure is not to be interpreted as reflecting an intention that the claims require more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive subject matter lies in less than all features of a single disclosed example. Thus, the following claims are hereby incorporated into the Detailed Description, with each claim standing on its own as a separately claimed subject matter.
Claims (15)
Priority Applications (2)

| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| NL2027356A (NL2027356B1) | 2021-01-20 | 2021-01-20 | Visual search query generation |
| PCT/US2022/011904 (WO2022159302A1) | 2021-01-20 | 2022-01-11 | Visual search query generation |
Publications (1)

| Publication Number | Publication Date |
|---|---|
| NL2027356B1 | 2022-07-28 |
Family

- ID: 74858721
Citations (4)

| Publication Number | Priority Date | Publication Date | Assignee | Title |
|---|---|---|---|---|
| US8949250B1 | 2013-12-19 | 2015-02-03 | Facebook, Inc. | Generating recommended search queries on online social networks |
| US20150169067A1 | 2012-05-11 | 2015-06-18 | Google Inc. | Methods and systems for content-based search |
| US20150309690A1 | 2014-04-24 | 2015-10-29 | Alibaba Group Holding Limited | Method and system for searching information records |
| US20190318405A1 | 2018-04-16 | 2019-10-17 | Microsoft Technology Licensing, LLC | Product identification in image with multiple products |
Also Published As

| Publication Number | Publication Date |
|---|---|
| WO2022159302A1 | 2022-07-28 |
Legal Events

| Date | Code | Title |
|---|---|---|
| 2024-02-01 | MM | Lapsed because of non-payment of the annual fee |