WO2014019077A1 - Taste-based navigation at multiple levels of granularity - Google Patents
- Publication number
- WO2014019077A1 (PCT/CA2013/000693)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- type
- search result
- search
- search results
- user
- Prior art date
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/90—Details of database functions independent of the retrieved data types
- G06F16/903—Querying
- G06F16/9032—Query formulation
- G06F16/90324—Query formulation using system suggestions
Definitions
- the subject matter disclosed generally relates to user interfaces.
- Conventional search interfaces allow the user to type in a search query and hit the search button to retrieve a list of products.
- the initial search may provide search results including a plurality of items.
- the user may open and view the first item, and move to the next item if a "next" button is provided.
- the user may go back to the main results page using breadcrumbs to view the other items (breadcrumbs are a navigation aid used in user interfaces which provides links back to each previous page the user navigated through to reach the current page).
- a user interface for searching and viewing products in a user friendly manner.
- a user interface in accordance with the present embodiments saves the user the trouble of typing new search queries and moving back and forth through the different search result pages to refine the search, by accounting for the user's taste as the user navigates through different search results and the different granularity levels of the search products.
- such an interface is well suited for implementation on smart phones and other portable devices with limited screen sizes.
- a method for navigating levels of granularity of search results displayed on a user interface implemented on a display device comprising: receiving a plurality of type 1 search results returned in response to a search query, said type 1 search results having a granularity level that is based on the search query and a type of data that is available at a search engine; storing said type 1 search results in a queue based on a relevance-coefficient associated with each search result, said relevance-coefficient indicating a relevance of the search result to the search query; displaying a first subset of type 1 search results comprising type 1 search results having the highest relevance-coefficients; in response to receiving a user input representing a zoom-in instruction over a selected type 1 search result, displaying a first subset of type 2 search results, said type 2 search results representing a lower-granularity level of the selected type 1 search result; receiving a user input representing a user preference of a selected type 2 search result; adjusting relevance-coefficients
- a computing device for navigating search results, the computing device comprising: a processor; a memory having recorded thereon one or more programs which when executed by the processor cause the system to display a user interface on a display device operably connected to the computing device for displaying search results in a display area of the user interface; wherein the computing device is configured to: receive a plurality of search results returned by a search engine in response to a search query, and store the plurality of search results in a queue based on a relevance coefficient associated with each search result; display a first set of search results in the display area, said first set including search results having the highest relevance coefficients; upon receiving a user input representing a zoom-in instruction over a selected search result, display a new set of search results representing a lower granularity of the selected search result; upon detecting a user input liking or disliking a given lower granularity search result, adjust a relevance coefficient associated with each search result that is related to the given search result based on a similar
- a method for navigating levels of granularity of search results comprising: displaying a first subset of type 1 search results returned in response to a search query on a touch-sensitive display of a portable computing device, the type 1 search results having a granularity level that is defined by a search engine for the search query; in response to detecting a pinch gesture representing a zoom-in instruction over a selected type 1 search result, displaying a first subset of type n search results representing a lower-granularity level of the selected type 1 search result; receiving a user input liking or disliking a selected type n search result; in response to the liking or disliking, adjusting a relevance-coefficient associated with each search result that is related to the selected type n search result based on a similarity coefficient between a related search result and the selected type n search result; in response to receiving a user input representing a zoom-out instruction, displaying a second subset of type 1 search results including
- Figures 1 a to 1c illustrate user interfaces with edges having different shapes, in accordance with an embodiment
- Figure 2 illustrates an example of a user interface having four edges defining a rectangle
- Figure 3 illustrates an exemplary user interface illustrating a cluster of search results presented as icons with visual links between related results
- Figure 4 illustrates an example of a queue corresponding to the interface of Figure 3 for storing the search results
- Figures 5a to 5e illustrate the steps of updating the cluster of search results displayed in the search interface when a search result is liked
- Figure 6 illustrates an updated version of the queue of Figure 4, corresponding to the interface of Figures 5b to 5e;
- Figures 7a to 7f illustrate the steps of updating the cluster of search results displayed in the search interface when a tag is liked
- Figure 8 illustrates the queue corresponding to the interface of Figure 7a
- Figure 9 illustrates the top artists associated with the tag selected in Figure 7b;
- Figure 10 illustrates an updated version of the queue of Figure 8, corresponding to the interface of Figures 7b to 7e;
- Figures 11a to 11g illustrate an example of zoom-in navigation through different granularity levels on a search interface in accordance with the present embodiments
- Figures 12a to 12d illustrate examples of queues corresponding to the search results in Figures 11a to 11d, respectively;
- Figure 13 illustrates an example of search results returned by the search engine at different granularity levels in response to the search query "car";
- Figure 13a illustrates a different hierarchy of granularity levels for the search results of Figure 13;
- Figure 14a illustrates an example of search results that are related to M3 and the similarity factor between M3 and each related search result, before receiving the user's preference;
- Figure 14b illustrates the search results of Figure 14a after receiving the user's preference
- Figures 15a to 15d illustrate an example of zoom-out navigations of different granularity levels on a search interface in accordance with the present embodiments
- Figures 16a to 16d illustrate examples of queues corresponding to the search results in Figures 15a to 15d, respectively;
- Figures 17a and 17b illustrate examples of zoom in and zoom out gestures, respectively;
- Figures 18a and 18b illustrate an embodiment of a search interface which allows the user to express a degree of liking/disliking of a given search result
- Figures 19a to 19e are screen shots of a user interface in accordance with the present embodiment.
- Figure 20 is a flowchart of a method for navigating levels of granularity of search results displayed on a user interface implemented on a display device
- Figure 21 is a flowchart of a method for navigating levels of granularity of search results
- Figure 22 illustrates an embodiment of a computing environment in which embodiments of the present invention may be practiced.
- a user interface which detects the user's taste as the user is navigating through different search results and through different granularities of the search results.
- the interface refines the search results across the different granularities based on the user's taste to simplify and expedite the search process, thereby, guiding the user toward the search results that are relevant to the user making the search.
- the interface refines the search results across the different granularities with every user input indicating a user preference/taste, whereby the results presented to the user at one or more granularity levels before receiving/detecting a user preference differ from the results displayed to the user after receiving the user preference.
- search result provider / search engine: the entity that provides the search results, e.g. a search engine, a database, a website, an API, etc.
- the received search results are stored in a queue in a memory provided on the computing device in accordance with a relevance coefficient which represents the relevance of a given search result to the search query.
- the search results displayed in the interface are those having the highest coefficients, since they are considered as the most relevant to the search query.
- the search results may also have similarity coefficients, each similarity coefficient indicating a degree of similarity between a given search result and another one.
- the relevance coefficients and sometimes the similarity coefficients are received from the search result provider.
- the user interface may detect the user's taste as the user is navigating through the search results.
- the interface may detect the user's liking and disliking of selected search results presented in the interface (horizontal navigation) to adjust the relevance coefficient associated with the liked/disliked search results and all related search results.
- adjusting the relevance coefficient of related search results is based on the similarity coefficient between the liked/disliked search result and each related search result.
- the user interface may be displayed on a display area of a display device (e.g. screen, monitor etc.).
- the interface may define a periphery surrounding the display area.
- the periphery may have edges having different shapes as shown in Figures 1a to 1c.
- the edge may be provided near or at the periphery.
- the periphery may have one edge or a plurality of edges.
- the user interface would have a rectangular edge with four edge portions (hereinafter four edges) as shown in Figure 1a, to use the maximum space possible, since most displays/monitors have a rectangular shape.
- Figure 2 illustrates an example of a user interface 200 having four edges 202 to 208 defining a rectangle.
- the interface 200 may include a search query region 210 for typing the search query in.
- the search query region 210 may be provided within the edges of the interface 200 or outside them. It is also possible that the search query region 210 appears and disappears when the user touches a certain location on the screen/keyboard.
- the user may type in a search query and hit search.
- the search results may appear within the edges of the interface 200.
- a function indicating a user preference is assigned to at least one edge (or portion of an edge) of the user interface.
- edge 202 is assigned the function "like"
- edge 204 is assigned the function "preview"
- edge 206 is assigned the function "dislike"
- edge 208 is assigned the function "ignore".
- the user may drag a search result toward one of the edges 202 to 208 of the interface 200 to activate the function associated with that edge.
- prior to activating the function, the interface may provide a visual indicator indicating the edge toward which the user is moving the search result and the function associated with that edge.
- the function may be activated by bringing the search result in proximity or in contact with the edge, or by throwing the search result toward the edge, as will be described in further detail hereinbelow.
- An interface in accordance with the present embodiments may be used for performing different types of searches. For example, it may be used for performing a product search, a regular search in a web browser using the internet or the like, or a local search of a local database/library.
- Examples of products may include: artist, author, singer, dancer, music composer, music type, band, actor, music album, song, painting, book, movie, game, electronic device, car, house, etc.
- the interface may return a number of search results each having associated therewith: a coefficient of relevance (hereinafter coefficient), a list of similar results/products, a factor of similarity (aka similarity-coefficient) between the result and each similar product, and one or more common characteristics that relates the result to the similar product.
- the interface stores the results and the associated data in a queue in memory.
- the results may be ordered in the queue in accordance with the magnitude of the coefficients associated therewith, so that the results having the highest coefficients are stored at the beginning of the queue and displayed first and those having lower coefficients are stored at the end of the queue and displayed last (if they are ever displayed).
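The ordering rule above can be sketched as follows; this is a minimal illustration, assuming a simple list of (name, coefficient) pairs. The names and coefficients mirror the singer example discussed later in the text, and the data structure itself is an assumption, not the patented implementation:

```python
def build_queue(results):
    """results: iterable of (name, relevance_coefficient) pairs.
    Returns a list ordered so the highest coefficients come first."""
    return sorted(results, key=lambda r: r[1], reverse=True)

def display_subset(queue, slots):
    """Only the first `slots` entries fit in the display area."""
    return [name for name, _ in queue[:slots]]

queue = build_queue([
    ("Nicole Scherzinger", 0.87),  # coefficients follow Figure 4
    ("Beyonce", 1.0),
    ("Rihanna", 1.0),
    ("Wanessa", 0.65),
])
```

With three display slots, only Beyonce, Rihanna and Nicole Scherzinger would be shown; Wanessa stays at the end of the queue until the coefficients change.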
- the interface may display a number (or all) of the results stored in the queue based on the space allocated to the interface 200 on the display.
- the products may be displayed as icons or nodes.
- the interface may display the search results within the edges thereof while showing a visual link between two (or more) of the results based on a common characteristic that exists between the two results.
- Figure 3 illustrates an exemplary user interface illustrating a cluster of search results presented as icons with visual links between related results.
- the common characteristic may be a characteristic that relates to the search result such as type of music, character of the actor etc. and may also relate to the user's social network. For example, the common characteristic could be the fact that the user's friends recommend the product or have bought it, or they like it (or dislike it), or wrote a review about it etc.
- the interface 200 may associate a tag to the visual link to indicate the common characteristic that exists between the search results linked by the visual link.
- the user may type the name of the singer in the search query region 210 and hit search.
- the interface 200 returns and displays a number of singers and provides a visual link between related singers. For example, Beyonce and Nicole Scherzinger are related by being 'sexy', while Kelly Rowland and Beyonce are related by the type of music 'soul' that they sing.
- Figure 4 illustrates an example of a queue 240 illustrating a number of search results listed in the order of the magnitude of their coefficients.
- the singers Beyonce and Rihanna have a coefficient of 1 (which is the highest) and take the first two places in the queue 240 since they both have many characteristics in common;
- Nicole Scherzinger has a coefficient 0.87 and comes in the third place and so on.
- Figure 4 also shows the singers that are similar to Nicole Scherzinger along with a factor of similarity. Each singer in the queue has a list of related singers; however, for space constraints, Figure 4 shows only the lists associated with Nicole Scherzinger and Wanessa.
- one or more of the interface edges may have a function associated with them. Accordingly, if the user drags an icon toward one of these edges and activates the function associated with that edge, the interface would refine the search results displayed within the interface or play a sample of the product represented by the dragged icon. For instance, if the user drags an icon toward the like edge 202, the interface 200 may remove the dragged icon from the display, adjust the relevance coefficient of that icon to be the highest, and then adjust the coefficients of the similar products in the queue. Adjusting the coefficients of the similar results may be based on the similarity factor between these results, e.g. 0.5 between Nicole Scherzinger and Jennifer Lopez as shown in Figure 4. The queue may then be reordered to take into consideration the new coefficients, and a new result may be displayed in the interface along with a new link and a new tag. An example is illustrated in Figures 5a to 5e.
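The like-edge flow just described can be sketched as below; this is a hedged illustration, not the patented implementation. The 0.5 similarity between Nicole Scherzinger and Jennifer Lopez comes from Figure 4, but the rule used to raise related coefficients is an assumption:

```python
def like_result(queue, displayed, liked, similar):
    """queue: dict name -> relevance coefficient.
    displayed: list of names currently shown in the interface.
    similar: dict name -> similarity factor with the liked result."""
    queue[liked] = max(queue.values())   # liked result now ranks highest
    displayed.remove(liked)              # but leaves the visible cluster
    for name, factor in similar.items():
        # Assumed rule: close the gap to 1.0 in proportion to similarity.
        queue[name] = min(1.0, queue[name] + factor * (1.0 - queue[name]))
    # Reorder and surface the best result not yet displayed.
    for name in sorted(queue, key=queue.get, reverse=True):
        if name not in displayed and name != liked:
            displayed.append(name)
            break
    return displayed
```

Disliking would follow the same shape with the coefficients decreased instead of increased, as the text notes below.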
- the queue 240 is reordered to take into account the new coefficients, as shown in Figure 6.
- the liked singer would remain in the queue but would not appear in the interface unless the user brings it back manually, for example by touching or double-clicking the liking edge to bring the liked results back into the cluster to be displayed.
- a new singer (The Pussy Cat Dolls) and new links/tags (e.g. 'sweet') may then appear in the cluster.
- the common characteristics may also be stored in accordance with their relevance.
- the 'smart' tag between Rihanna and Kelly Rowland has been removed in favor of a 'sweet' tag that relates Rihanna, The Pussy Cat Dolls and Destiny's Child.
- the user may dislike one of the results by dragging the corresponding icon toward the dislike edge 246.
- the interface would then remove the dragged product from the cluster, decrease the coefficient of the dragged product and the coefficient of the related products based on the factor of similarity. This step may be performed in the same manner as the step of liking with the exception that the coefficients are decreased.
- Dragging a tag: the user may perform the actions described in Figures 5 and 6 with a tag rather than an icon.
- Figure 7a illustrates an exemplary user interface including a cluster of icons representing search results and tagged visual links between related results.
- Figure 8 illustrates the queue corresponding to the interface of Figure 7a. If the user likes singers that are 'sexy' they may drag the tag 'sexy' toward the liking edge 202, as shown in Figure 7b. When the tag is liked, the tag may be removed from the interface as shown in Figure 7c, and the tag's top artists are fetched as shown in Figure 9 to increase their coefficients. After the tag is removed other tags are displayed as shown in Figure 7d. Once the coefficients are modified the queue may be reordered to take into account the new coefficients, as shown in Figure 10.
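The tag-liking step above can be sketched as follows, under the assumption that the search result provider exposes a tag-to-top-artists mapping (represented here by a plain dict); the boost value is illustrative:

```python
# Hypothetical tag -> top-artists mapping (would come from the provider).
TAG_TOP_ARTISTS = {
    "sexy": ["Shakira", "Britney Spears", "Beyonce"],
}

def like_tag(queue, tag, boost=0.2):
    """Raise the coefficient of every top artist of the liked tag, then
    return the queue keys reordered by the new coefficients."""
    for artist in TAG_TOP_ARTISTS.get(tag, []):
        if artist in queue:
            queue[artist] = min(1.0, queue[artist] + boost)
    return sorted(queue, key=queue.get, reverse=True)
```

After the boost, artists associated with the liked tag move ahead of unrelated artists in the reordered queue, which is what makes Shakira and Britney Spears surface in Figure 7e.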
- the cluster of products shown in the interface 200 is changed. For example, since Shakira and Britney Spears moved into the top five places in the queue as shown in Figure 10, the interface would display them instead of Kelly Rowland and Destiny's Child, as shown in Figure 7e. New tags may then be calculated between the related icons as shown in Figure 7f, which shows that Britney Spears, Beyonce and Shakira are linked by the tag 'pop', which indicates that the three singers sing pop music.
- the other functions may also be applied by dragging the tag toward the desired edge or portion of an edge. These functions have been discussed above in connection with the dragging of icons representing search results and will not be repeated herein. Accordingly, the embodiments discussed above describe how the user may navigate different search results and how the search results displayed to the user change every time the user selects/drags and likes or dislikes a given search result, based on the relevance-coefficient associated with each search result in the queue and a similarity-coefficient between the given search result and every other similar search result.
- the interface may allow the user to navigate through different levels of granularity of the search results (aka vertical navigation) to view search results at lower or higher granularity levels.
- An example is provided with respect to Figures 11a to 11f.
- Figures 11a to 11f illustrate an example of zoom-in navigation through different granularity levels on a search interface in accordance with the present embodiments
- Figures 12a to 12d illustrate examples of queues corresponding to the search results in Figures 11a to 11d, respectively.
- Figure 13 illustrates an example of search results returned by the search engine at different granularity levels in response to the search query "car".
- the search results may or may not all be returned to the computing device in one shot. Some of the search results may be sent by the search result provider to the computing device as the user navigates through the different results. For example, the different classes and models of Mercedes may only be sent to the computing device if the user zooms in over the Mercedes icon in Figure 11b.
- the embodiments are not limited to a specific hierarchical structure.
- the embodiments are not limited to the hierarchy shown in Figure 13, and may still be practised with other hierarchy structures, as exemplified in Figure 13a (when compared to Figure 13).
- the granularity levels may differ from one data provider (e.g. search engine, database, etc.) to another.
- the interface may display the granularity levels in accordance with the hierarchy structure provided by the data provider.
- Figure 11a illustrates type 1 search results returned by a search engine (e.g. http://www.caranddriver.com/) in response to the search query "car".
- the type 1 results typically represent the results at the highest granularity level for the search query.
- the type 1 granularity level usually depends on the search query and the data available at the search engine. For example, if the search query was "vehicle" instead of "car", a search engine such as http://www.autotrader.com/ would have returned different choices of vehicles as the type 1 results, including for example motorcycles, cars, boats, trucks, etc. On the other hand, if the user is searching for a "car" on a search engine associated with a specific manufacturer, e.g. BMW, the type 1 search results returned would immediately be those shown in Figure 11c, since BMW does not provide Asian and/or American cars, etc.
- the interface allows the user to navigate through different levels of granularity of the search results. For example, using a zoom-in gesture (e.g. reverse pinch gesture as exemplified in Figure 17a), on a touch sensitive display the user may navigate into a deeper granularity level of a given search result, and may return to the first level using a zoom out gesture (e.g. pinch gesture as exemplified in Figure 17b).
- more than two granularity levels may be provided, and the granularity level that the user may reach depends on the width of the pinch gesture performed by the user.
- the user may navigate into the type 2 granularity by performing a reverse pinch gesture that is between 1 cm and 1.5 cm wide on a portable device. Assuming however that the user has performed a reverse pinch gesture that is 2 cm wide, the interface may present type 3 (or type 4) search results immediately without showing the type 2 results.
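The width-to-granularity mapping described above can be sketched as follows; the thresholds (one extra level per ~0.5 cm of reverse-pinch width beyond 1 cm) are assumptions chosen to match the quoted example:

```python
def target_level(current_level, pinch_width_cm, max_level=4):
    """Map a reverse-pinch width to the granularity level to display:
    widths from 1 cm up to (but not including) 1.5 cm descend one level,
    and each additional ~0.5 cm descends one level more, capped at the
    deepest available level."""
    if pinch_width_cm < 1.0:
        return current_level             # too narrow to trigger a zoom
    levels_down = 1 + int((pinch_width_cm - 1.0) // 0.5)
    return min(max_level, current_level + levels_down)
```

With these assumed thresholds, a 1.2 cm gesture over a type 1 result reaches type 2, while a 2 cm gesture skips intermediate levels, matching the behavior the paragraph describes.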
- the user may perform a zoom-in gesture over that search result to view the type 2 results illustrated in Figure 11b, which represent a lower granularity level for European cars. If the user performs another zoom-in gesture over "BMW" in Figure 11b, they may view the type 3 results presented in Figure 11c, which illustrate a lower granularity level for the type 2 search results. By zooming in over "3 series" in Figure 11c, the user may view the type 4 search results illustrated in Figure 11d, which represent a lower granularity level for the BMW 3 series search result.
- the interface may provide a visual indicator that indicates the granularity level of the results currently displayed on the interface as shown at 280.
- the visual indicator 280 may be provided so that it shows all the available granularity levels at the same time and also indicates the current granularity level of the search results displayed in the interface, as shown at 280-a in Figure 11g.
- the user may express their taste on the interface by dragging the M3 icon toward a like edge as exemplified in Figure 11e, or by opening the link (e.g. by tapping/clicking the associated icon) to view more details about that search result as shown in Figure 11f and pressing a like/dislike button 282.
- the interface may adjust the relevance coefficients associated with related search results stored in the queue based on the similarity coefficient between each related search result and the liked search result to display different results at each granularity level.
- Figure 14a illustrates an example of search results that are related to BMW M3 and the similarity factor between BMW M3 and each related search result, before receiving the user's liking of BMW M3. It should be noted that in Figure 14a, the similarity factors displayed are limited to search results of the same granularity level. However, there may be direct or indirect similarity factors between results of different granularities as will be described below.
- the similarity coefficient with Mercedes C63 AMG is 0.9, with Audi S4 is 0.85, and with Lexus IS 350F is 0.8. These models are known as competitors of the same class (compact sport sedan/coupe).
- Figure 14a also shows that the relevance coefficient before the user expressed their taste is 0.6 for M3, 0.6 for C63 AMG, 0.6 for Audi S4, and 0.5 for IS 350F Sport.
- the relevance coefficients are usually received from the search engine and represent a degree of relevance to the search query.
- the relevance coefficient of the liked result M3 may be increased, and the relevance coefficients of related search results across all granularity levels may be adjusted based on their similarity coefficients with M3.
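Using the numbers quoted from Figures 14a and 14b (similarities 0.9, 0.85 and 0.8; prior relevance coefficients 0.6, 0.6, 0.6 and 0.5), the adjustment might look as follows. The text does not specify how much each related coefficient rises, so the proportional rule below is an assumption:

```python
relevance = {"M3": 0.6, "C63 AMG": 0.6, "Audi S4": 0.6, "IS 350F": 0.5}
similarity_to_m3 = {"C63 AMG": 0.9, "Audi S4": 0.85, "IS 350F": 0.8}

relevance["M3"] = 1.0                    # the liked result is promoted
for model, sim in similarity_to_m3.items():
    # Assumed rule: close the gap to 1.0 in proportion to similarity.
    relevance[model] += sim * (1.0 - relevance[model])

# Reorder so the more relevant results surface first at this level.
ranking = sorted(relevance, key=relevance.get, reverse=True)
```

Under this assumed rule the competitors end up ranked by their similarity to the liked M3, which is the effect Figures 15a to 15d then show at each granularity level.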
- Figures 15a to 15d illustrate an example of zoom-out navigations of different granularity levels on a search interface in accordance with the present embodiments
- Figure 16a to 16d illustrate examples of queues corresponding to the search results in Figures 15a to 15d, respectively.
- the interface may change the search results (type 4) at the same granularity level of the liked result to present results that are more relevant to the user.
- Figure 15a illustrates results that are different than those shown in Figure 11d for the same granularity level (models level).
- the interface now shows C63 AMG, Audi S4, and Lexus 350F instead of the different 3 Series models of BMW shown in Figure 11d. If the user zooms out to a higher granularity level, e.g. type 3, they may view the type 3 search results shown in Figure 15b, which also differ from those shown in Figure 11c for the same granularity level (series). The same applies to Figures 15c and 11b, and to Figures 15d and 11a.
- the queue may be re-ordered on each level of granularity every time a user preference (liking /disliking a given search result) is received.
- the reordering takes into account the new coefficients of the different search results, whereby different results are presented to the user as the user navigates through the different granularity levels. For example, if the user likes BMW M3 cars and expresses their taste on the interface, several search results at different granularities may be promoted and several others may be demoted as a result of receiving the user's expression of taste. In the present example, all year models of the BMW M3 may be promoted (e.g. 1989 M3s to 2013 M3s), the maker may be promoted as a whole (e.g. BMW), as may similar cars of the same maker (e.g. BMW 335i M-Sport Package, M5, M1, M6, X5M, X6M), all cars of the same type or class (e.g. C63 AMG of Mercedes, Lexus IS 350F, Audi S4, etc.), all makers of the same type of cars in the same class (e.g. Audi, Mercedes, Lexus), and all year models of those cars of the same type (e.g. all year models of the Lexus IS-F, Audi S4 and Mercedes AMG).
- the Lexus search results presented to the user would be different than those that would have been shown prior to receiving the user's liking of BMW M3.
- the Lexus search results displayed to the user may in their majority be from the Lexus IS categories, which include sport cars such as the Lexus IS 250-F and Lexus IS 350F, which are similar to the BMW M3, while the majority of results that would have been shown prior to receiving the user's liking of BMW M3 would have been those classified in the luxury class.
- the interface may be configured to demote car makers that do not have an equivalent to the search result liked by the user.
- the relevance coefficients associated with these results are reduced (or at least not increased) across the different granularity levels since they do not have search results that are relevant to the user making the search.
- the similarity coefficients are sometimes provided by the entity that provides the search results (e.g. search engine, a database, website, api etc.). However, this data is not always available and sometimes when it is available it is not complete.
- the interface may establish similarity coefficients between related search results using one or more of the following methods.
- the interface may establish/set the similarity coefficients between two given search results based on common attributes of the search results.
- attributes may include the number/amount of available horsepower, torque, number of doors, weight, maximum speed, acceleration, fuel consumption, transmission type, luxury level, size, number of passengers, manufacturer's country, class, type, available options such as navigation, rear view camera, etc.
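The attribute-based method can be sketched as below; the matching rule (the fraction of shared attributes, with numeric attributes counting as shared when within 10% of each other) and the spec values in the usage example are assumptions for illustration:

```python
def attribute_similarity(a, b):
    """a, b: dicts of attribute name -> value. Returns the fraction of
    shared attributes; numeric attributes count as shared when the
    values are within 10% of each other (an assumed tolerance)."""
    keys = set(a) & set(b)
    if not keys:
        return 0.0
    matches = 0
    for k in keys:
        va, vb = a[k], b[k]
        if isinstance(va, (int, float)) and isinstance(vb, (int, float)):
            if abs(va - vb) <= 0.1 * max(abs(va), abs(vb)):
                matches += 1
        elif va == vb:
            matches += 1
    return matches / len(keys)
```

For instance, with illustrative horsepower figures, an M3 and a C63 AMG sharing class, door count and comparable horsepower would score higher than a pair whose horsepower differs by more than the tolerance.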
- the interface may establish the similarity coefficients by detecting the user's activities on the interface, for example by monitoring the search results that the user is viewing and/or the time spent viewing each of these results. For example, if the user views M3, then navigates to C63 AMG and spends a similar amount of time reviewing this vehicle, the interface may determine that these two vehicles are similar and may establish a similarity coefficient between them.
- the interfaces associated with different devices may each report the navigation activities performed on their associated devices to a remote server (e.g. The Tastefilter server) over the internet or another telecommunications network. The server may then compare the different navigation activities performed on the different devices to find a trend and establish similarity between different search results.
- the server may then detect a match between M3 and C63 AMG and establish a similarity coefficient between them.
- the detection of a match may be based on one or more of the following: the number of users who viewed and/or liked one result and then navigated to view and/or like the other, the time spent viewing each result, and the proportion of users who viewed both results relative to the number of users who viewed only one of them.
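The co-viewing criterion can be sketched as follows; the log format (one set of viewed result names per user) and the ratio used are assumptions:

```python
def co_view_similarity(view_logs, a, b):
    """view_logs: iterable of per-user sets of viewed result names.
    Returns the share of users who viewed both a and b among all
    users who viewed at least one of them."""
    both = either = 0
    for viewed in view_logs:
        if a in viewed or b in viewed:
            either += 1
            if a in viewed and b in viewed:
                both += 1
    return both / either if either else 0.0
```

A server aggregating such logs from many devices could set the similarity coefficient between M3 and C63 AMG from this ratio, along the lines the paragraph describes.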
- the interface may provide the user with the option of expressing a degree of liking/disliking e.g. like a lot, like a bit etc.
- the degree of liking or disliking may be expressed based on the position at which the user drags the search result.
- the edge may be graded to give a score to each end of the edge and intermediate scores for positions in between the two ends. For example, it is possible to assign the lowest score to one end and the highest score to the opposite end.
- Figures 18a and 18b illustrate an embodiment of a search interface which allows the user to express a degree of liking/disliking of a given search result.
- the "like" edge 202 has two ends 264 and 266.
- the first end 264 is assigned the lowest liking score whereas the second end 266 is assigned the highest liking score.
- a visual indicator may be displayed beside each end of the edge 202 to indicate the degree of liking associated with each end (a big heart for the "like a lot" end 266 and a small heart for the "like a bit" end 264).
- the coefficient of the dragged item is adjusted based on the location of intersection with the associated edge.
- if the selected item is dragged beside the first end 264, the coefficient of the dragged item is increased by the minimum possible value, and if the selected item is dragged beside the second end 266 the coefficient of the item is increased to the maximum.
- the user expressed a moderate liking of M3 by dragging the M3 icon in the middle as shown in Figure 18a.
- the relevance coefficient associated with M3 may be promoted/increased based on the degree of liking.
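One possible way to turn the drop position along the graded edge into a coefficient adjustment is linear interpolation between the two ends, as sketched below; the pixel coordinates and the boost range are illustrative assumptions:

```python
def liking_boost(drop_pos, edge_start, edge_end, min_boost, max_boost):
    """Interpolate a relevance boost from where the item was dropped along
    the graded edge: `edge_start` is the 'like a bit' end (264) and
    `edge_end` the 'like a lot' end (266)."""
    t = (drop_pos - edge_start) / (edge_end - edge_start)
    t = max(0.0, min(1.0, t))  # clamp positions outside the edge
    return min_boost + t * (max_boost - min_boost)

# dropping M3 at the middle of a 0..400-pixel edge expresses moderate liking
relevance = {"M3": 0.8}
relevance["M3"] += liking_boost(200, 0, 400, 0.1, 1.0)
print(relevance["M3"])  # boosted by the midpoint value 0.55
```

A "dislike" edge would use the same interpolation with negative boosts.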
- a visual indicator may be displayed beside each end of the dislike edge to indicate the degree of disliking associated with each end (a big trash can for the "dislike a lot" end 267 and a small trash can for the "dislike a bit" end 265).
- Figures 19a to 19e are screen shots of a user interface in accordance with the present embodiment.
- the granularity level of the results displayed in the interface may depend on the width of the pinch gesture performed over the selected search result.
- Figure 19a shows the user performing a pinch gesture (using two fingers 253) over the tag "rock". As the pinch gesture goes wider, the top tags for "rock" may appear in the interface as shown in Figure 19b; if the gesture goes wider, the top artists for "rock" may appear as shown in Figure 19c; if the pinch goes even wider, the top albums for "rock" may appear as shown in Figure 19d; and if the pinch goes wider still, the top songs for "rock" may appear as shown in Figure 19e.
- the tag "rock" appears in the interface (e.g. in the center thereof) at the same time the lower granularity levels for "rock" are displayed to indicate to the user that the search results currently displayed are classified under "rock".
- the visual indicator 280 informs the user of the available granularities that may be navigated, and at the same time indicates the current granularity of the search results currently shown in the interface as indicated at 280-a.
- the search results shown in any one of Figures 19c, 19d, and 19e may be displayed immediately after Figure 19a if the pinch performed in Figure 19a was wide enough.
- the interface may be configured so that the width of the pinch gesture determines the granularity level of the search results displayed in the interface. The same applies to zoom-out gestures.
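A sketch of mapping pinch width to granularity level, under the assumption that the gesture width is reported in pixels and that the level names follow the music example of Figures 19a to 19e:

```python
# granularity levels from coarsest to finest, per the music example
GRANULARITY_LEVELS = ["tag", "top tags", "top artists", "top albums", "top songs"]

def level_for_pinch(width_px, max_width_px=400):
    """Map the final width of a pinch-out gesture to a granularity level:
    a narrow pinch steps down one level, the widest pinch jumps straight
    to the finest level. The pixel thresholds are illustrative."""
    fraction = min(width_px / max_width_px, 1.0)
    step = 1 + int(fraction * (len(GRANULARITY_LEVELS) - 2))
    return GRANULARITY_LEVELS[min(step, len(GRANULARITY_LEVELS) - 1)]

print(level_for_pinch(100))  # narrow pinch  -> 'top tags'
print(level_for_pinch(400))  # widest pinch -> 'top songs'
```

A pinch-in (zoom-out) gesture would walk the same list in the opposite direction.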
- zoom-in and zoom-out gestures may be received via a touch sensitive display; a vision-based tool that analyses the user's gestures in the air using a camera, radar, motion detector, glove, and the like; a pointing device; a keyboard; or any other device that allows for interfacing with a computing device.
- Figure 20 is a flowchart of a method 350 for navigating levels of granularity of search results displayed on a user interface implemented on a display device.
- the method comprises, at step 352, receiving a plurality of type 1 search results returned in response to a search query, said type 1 search results having a granularity level that is based on the search query and a type of data that is available at a search engine.
- Step 354 comprises storing said type 1 search results in a queue based on a relevance-coefficient associated with each search result, said relevance-coefficient indicating a relevance of the search result to the search query.
- Step 356 comprises displaying a first subset of type 1 search results comprising type 1 search results having the highest relevance-coefficients.
- Step 360 comprises, in response to receiving a user input representing a zoom-in instruction over a selected type 1 search result, displaying a first subset of type 2 search results, said type 2 search results representing a lower-granularity level of the selected type 1 search result; receiving a user input representing a user preference of a selected type 2 search result.
- Step 362 comprises adjusting relevance-coefficients associated with type 1 and type 2 search results based on the user preference and a similarity-coefficient between the selected type 2 search result and any related type 1 and/or type 2 search result.
- Step 364 comprises re-ordering the queue based on the adjusted relevance-coefficients.
- Step 366 comprises displaying a second subset of type 1 search results upon receiving a user input representing a zoom-out instruction, the second subset of type 1 search results having at least one type 1 search result different than the first subset of type 1 search results.
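Steps 352 to 366 can be sketched as a small queue class; the data layout and the similarity-weighted adjustment rule below are assumptions consistent with the description, not normative definitions from the claims:

```python
class ResultQueue:
    """Sketch of the queue behind method 350: results ordered by
    relevance-coefficient and re-ordered after each expressed preference."""

    def __init__(self, results):
        # results: iterable of (name, relevance_coefficient) -- step 354
        self.relevance = dict(results)

    def top(self, n):
        # steps 356/366: the subset with the highest coefficients
        ranked = sorted(self.relevance, key=self.relevance.get, reverse=True)
        return ranked[:n]

    def apply_preference(self, selected, delta, similarity):
        # steps 362/364: boost the selected result, then each related
        # result in proportion to its similarity coefficient
        self.relevance[selected] = self.relevance.get(selected, 0.0) + delta
        for other, sim in similarity.get(selected, {}).items():
            self.relevance[other] += delta * sim

# type 1 results with hypothetical relevance coefficients
q = ResultQueue([("BMW", 0.9), ("Mercedes", 0.7), ("Toyota", 0.8)])
q.relevance["M3"] = 0.5  # a type 2 (finer-granularity) result
similarity = {"M3": {"BMW": 1.0, "Mercedes": 0.8}}  # assumed coefficients
q.apply_preference("M3", 0.5, similarity)  # user liked M3
print(q.top(2))  # ['BMW', 'Mercedes']
```

After the like, the zoom-out of step 366 would draw from the re-ordered queue, so BMW and Mercedes lead while Toyota drops out of the displayed subset.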
- Figure 21 is a flowchart of a method 370 for navigating levels of granularity of search results.
- the method comprises, at step 372, displaying a first subset of type 1 search results returned in response to a search query on a touch-sensitive display of a portable computing device, the type 1 search results having a granularity level that is defined by a search engine for the search query.
- Step 374 comprises, in response to detecting a pinch gesture representing a zoom-in instruction over a selected type 1 search result, displaying a first subset of type n search results representing a lower-granularity level of the selected type 1 search result.
- Step 376 comprises receiving a user input liking or disliking a selected type n search result.
- Step 378 comprises, in response to the liking or disliking, adjusting a relevance-coefficient associated with each search result that is related to the selected type n search result based on a similarity coefficient between a related search result and the selected type n search result.
- Step 380 comprises, in response to receiving a user input representing a zoom-out instruction, displaying a second subset of type 1 search results including at least one type 1 search result which is different from the first subset of type 1 search results.
- Embodiments of the invention may be implemented/operated using a client machine.
- the client machine can be embodied in any one of the following computing devices: a computing workstation; a desktop computer; a tablet, a laptop or notebook computer; a server; a handheld computer; a mobile telephone; a portable telecommunication device; a media playing device; a gaming system; a mobile computing device; a device of the IPOD or IPAD family of devices manufactured by Apple Computer; any one of the PLAYSTATION family of devices manufactured by the Sony Corporation; any one of the Nintendo family of devices manufactured by Nintendo Co; any one of the XBOX family of devices manufactured by the Microsoft Corporation; or any other type and/or form of computing, telecommunications or media device that is capable of communication and that has sufficient processor power and memory capacity to perform the methods and systems described herein.
- the client machine can be a mobile device or any other mobile computing device capable of performing the methods and systems described herein.
- Still other embodiments of the client machine include a mobile client machine that can be any one of the following: any one series of Blackberry, Playbook or other handheld device manufactured by Research In Motion Limited; the iPhone manufactured by Apple Computer; Windows Phone 7, HTC, Sony Ericsson, any telephone or computing device running the Android operating system, or any handheld or smart phone; a Pocket PC; a Pocket PC Phone; or any other handheld mobile device supporting Microsoft Windows Mobile Software, etc.
- the client machine may include a display and a touch-sensitive surface. It should be understood, however, that the computing device may also include one or more other physical user interface devices, such as a physical keyboard, a mouse and/or a joystick. In another embodiment, the computing device may include or be operably connected to a motion detector or a vision based interface (such as a virtual keyboard) for receiving the user's inputs.
- the client machine may be in communication with a remote server via a communication network.
- the data may be loaded from a local database or from local data files (e.g. XML, JSON, etc.).
- Figure 22 illustrates an embodiment of a computing environment that includes client machines 102A-102N in communication with servers 106A-106N, and a network 104 installed between the client machines 102A-102N and the servers 106A-106N.
- client machines 102A-102N may be referred to as a single client machine 102 or a single group of client machines 102
- servers may be referred to as a single server 106 or a single group of servers 106.
- One embodiment includes a single client machine 102 communicating with more than one server 106, another embodiment includes a single server 106 communicating with more than one client machine 102, while another embodiment includes a single client machine 102 communicating with a single server 106.
- the client machine 102 may in some embodiments execute, operate or otherwise provide an application that can be any one of the following: software; a program; executable instructions; a web browser; HTML5; Javascript; WebGL; a web-based client; a client-server application; a thin-client computing client; software related to voice over internet protocol (VoIP) communications like a soft IP telephone; an application for streaming video and/or audio; an application for facilitating real-time-data communications; a HTTP client; a FTP client; an Oscar client; a Telnet client; any application from any application store such as Apple's app store, or the Google play store, or the Amazon app store, or blackberry; or any other type and/or form of executable instructions capable of executing on client machine 102.
- Still other embodiments may include a computing environment 101 with an application that is either server-based or remote-based, and an application that is executed on the server 106 on behalf of the client machine 102.
- the client machine 102 may include a network interface to interface to a Local Area Network (LAN), Wide Area Network (WAN) or the Internet through a variety of connections including, but not limited to, standard telephone lines, LAN or WAN links (e.g., 802.11, T1, T3, 56 kb, X.25, SNA, DECNET), broadband connections (e.g. ATM, Gigabit Ethernet, Ethernet-over-SONET), wireless connections, or some combination of any or all of the above.
- the computing environment 101 can in some embodiments include a server 106 or more than one server 106 configured to provide the functionality of any one of the following server types: a file server; an application server; a web server; a proxy server; an appliance; a network appliance; a gateway; an application gateway; a gateway server; a virtualization server; a deployment server; a SSL VPN server; a firewall; a master application server; a server 106 configured to operate as an active directory; a server 106 configured to operate as an application acceleration application that provides firewall functionality, application functionality, or load balancing functionality; or any other type of computing machine configured to operate as a server 106.
- a server 106 may include a remote authentication dial-in user service such that the server 106 is a RADIUS server.
- the network 104 between the client machine 102 and the server 106 is a connection over which data is transferred between the client machine 102 and the server 106.
- While Figure 22 depicts a network 104 connecting the client machines 102 to the servers 106, other embodiments include a computing environment 101 with client machines 102 installed on the same network as the servers 106.
- a computing environment 101 with a network 104 can be any of the following: a local-area network (LAN); a metropolitan area network (MAN); a wide area network (WAN); a primary network comprised of multiple sub-networks located between the client machines 102 and the servers 106; a primary public network with a private subnetwork; a primary private network with a public sub-network; or a primary private network with a private sub-network.
- Still further embodiments include a network 104 that can be any of the following network types: a point to point network; a broadcast network; a telecommunications network; a data communication network; a computer network; an ATM (Asynchronous Transfer Mode) network; a SONET (Synchronous Optical Network) network; a SDH (Synchronous Digital Hierarchy) network; a wireless network; a wireline network; a network 104 that includes a wireless link where the wireless link can be an infrared channel or satellite band; or any other network type able to transfer data from client machines 102 to servers 106 and vice versa to accomplish the methods and systems described herein.
- Network topology may differ within different embodiments; possible network topologies include: a bus network topology; a star network topology; a ring network topology; a repeater-based network topology; a tiered-star network topology; or any other network topology able to transfer data from client machines 102 to servers 106, and vice versa, to accomplish the methods and systems described herein.
- Additional embodiments may include a network 104 of mobile telephone networks that use a protocol to communicate among mobile devices, where the protocol can be any one of the following: LTE; AMPS; TDMA; CDMA; GSM; GPRS; UMTS; or any other protocol able to transmit data among mobile devices to accomplish the systems and methods described herein.
Abstract
A user interface which detects the user's taste as the user is navigating through different search results and through different granularities of search results. The interface refines the search results across the different granularities based on the user's taste to simplify and expedite the search process, thereby guiding the user toward the search results that are relevant to the user making the search. In an embodiment, the interface refines the search results across the different granularities with every user input indicating a user preference/taste, whereby the results presented to the user at one or more granularity levels before receiving/detecting a user preference differ from the results displayed to the user after receiving the user preference.
Description
TASTE-BASED NAVIGATION AT MULTIPLE LEVELS OF GRANULARITY
CROSS-REFERENCE TO RELATED APPLICATIONS
[0001] This application claims priority under 35 U.S.C. §119(e) of US provisional patent application No. 61/678809 filed on August 2, 2012, the specification of which is hereby incorporated by reference.
BACKGROUND
(a) Field
[0002] The subject matter disclosed generally relates to user interfaces.
(b) Related Prior Art
[0003] There is a need in the market for a user interface which simplifies the task of searching especially when the search is being done on a computing device that does not have a physical keyboard.
[0004] Conventional search interfaces allow the user to type in a search query and hit the search button to retrieve a list of products. The initial search may provide search results including a plurality of items. The user may open and view the first item, and move to the next item if a "next" button is provided. The user may go back to the main results page using breadcrumbs to view the other items (Breadcrumbs is a navigation aid used in user interfaces which provides links back to each previous page the user navigated through to get to the current page).
[0005] These activities are time consuming and confusing for the average consumer. The challenge also increases when the page is being surfed using a portable device having a smaller screen such as a portable phone that does not have a physical keyboard or when the keyboard takes a portion of the available screen.
[0006] Therefore, there is a need in the market for a user interface which simplifies the task of refining a search to guide the user to search results that are relevant to that user.
SUMMARY
[0007] The present embodiments describe such an interface.
[0008] In embodiments there is disclosed a user interface for searching and viewing products in a user friendly manner. A user interface in accordance with the present embodiments saves the user the trouble of typing new search queries and moving back and forth through the different search result pages to refine the search by accounting for the user's taste as the user navigates through different search results and the different granularity levels of the search products. In addition to being user friendly, such an interface is ideal for implementation on smart phones and portable devices with limited screen sizes.
[0009] In an aspect, there is provided a method for navigating levels of granularity of search results displayed on a user interface implemented on a display device, the method comprising: receiving a plurality of type 1 search results returned in response to a search query, said type 1 search results having a granularity level that is based on the search query and a type of data that is available at a search engine; storing said type 1 search results in a queue based on a relevance-coefficient associated with each search result, said relevance-coefficient indicating a relevance of the search result to the search query; displaying a first subset of type 1 search results comprising type 1 search results having the highest relevance-coefficients; in response to receiving a user input representing a zoom-in instruction over a selected type 1 search result, displaying a first subset of type 2 search results, said type 2 search results representing a lower-granularity level of the selected type 1 search result; receiving a user input representing a user preference of a selected type 2 search result; adjusting relevance-coefficients associated with type 1 and type 2 search results based on the user preference and a similarity-coefficient between the
selected type 2 search result and any related type 1 and/or type 2 search result; re-ordering the queue based on the adjusted relevance-coefficients;
displaying a second subset of type 1 search results upon receiving a user input representing a zoom-out instruction, the second subset of type 1 search results having at least one type 1 search result different than the first subset of type 1 search results.
[0010] In another aspect, there is provided a computing device for navigating search results, the computing device comprising: a processor; a memory having recorded thereon one or more programs which when executed by the processor cause the system to display a user interface on a display device operably connected to the computing device for displaying search results in a display area of the user interface; wherein the computing device is configured to: receive a plurality of search results returned by a search engine in response to a search query, and store the plurality of search results in a queue based on a relevance coefficient associated with each search result; display a first set of search results in the display area, said first set including search results having the highest relevance coefficients; upon receiving a user input representing a zoom-in instruction over a selected search result, display a new set of search results representing a lower granularity of the selected search result; upon detecting a user input liking or disliking a given lower granularity search result, adjust a relevance coefficient associated with each search result that is related to the given search result based on a similarity coefficient between the liked/disliked search result and the related search result; re-order the queue based on the adjusted relevance coefficients; and upon receiving a user input representing a zoom-out instruction display a second set of search results different than the first set.
[0011] In yet a further aspect, there is provided a method for navigating levels of granularity of search results, the method comprising: displaying a first subset of type 1 search results returned in response to a search query on a
touch-sensitive display of a portable computing device, the type 1 search results having a granularity level that is defined by a search engine for the search query; in response to detecting a pinch gesture representing a zoom-in instruction over a selected type 1 search result, displaying a first subset of type n search results representing a lower-granularity level of the selected type 1 search result; receiving a user input liking or disliking a selected type n search result; in response to the liking or disliking, adjusting a relevance-coefficient associated with each search result that is related to the selected type n search result based on a similarity coefficient between a related search result and the selected type n search result; in response to receiving a user input representing a zoom-out instruction, displaying a second subset of type 1 search results including at least one type 1 search result which is different from the first subset of type 1 search results.
[0012] Features and advantages of the subject matter hereof will become more apparent in light of the following detailed description of selected embodiments, as illustrated in the accompanying figures. As will be realized, the subject matter disclosed and claimed is capable of modifications in various respects, all without departing from the scope of the claims. Accordingly, the drawings and the description are to be regarded as illustrative in nature, and not as restrictive and the full scope of the subject matter is set forth in the claims.
BRIEF DESCRIPTION OF THE DRAWINGS
[0013] Further features and advantages of the present disclosure will become apparent from the following detailed description, taken in combination with the appended drawings, in which:
[0014] Figures 1a to 1c illustrate user interfaces with edges having different shapes, in accordance with an embodiment;
[0015] Figure 2 illustrates an example of a user interface having four edges defining a rectangle;
[0016] Figure 3 illustrates an exemplary user interface illustrating a cluster of search results presented as icons with visual links between related results;
[0017] Figure 4 illustrates an example of a queue corresponding to the interface of Figure 3 for storing the search results;
[0018] Figures 5a to 5e illustrate the steps of updating the cluster of search results displayed in the search interface when a search result is liked;
[0019] Figure 6 illustrates an updated version of the queue of Figure 4, corresponding to the interface of Figures 5b to 5e;
[0020] Figures 7a to 7f illustrate the steps of updating the cluster of search results displayed in the search interface when a tag is liked;
[0021] Figure 8 illustrates the queue corresponding to the interface of Figure 7a;
[0022] Figure 9 illustrates the top artists associated with the tag selected in Figure 7b;
[0023] Figure 10 illustrates an updated version of the queue of Figure 8, corresponding to the interface of Figures 7b to 7e;
[0024] Figures 11a to 11g illustrate an example of zoom-in navigations of different granularity levels on a search interface in accordance with the present embodiments;
[0025] Figures 12a to 12d illustrate examples of queues corresponding to the search results in Figures 11a to 11d, respectively;
[0026] Figure 13 illustrates an example of search results returned by the search engine at different granularity levels in response to the search query "car";
[0027] Figure 13a illustrates different hierarchy of granularity levels for search results of Figure 13;
[0028] Figure 14a illustrates an example of search results that are related to M3 and the similarity factor between M3 and each related search result, before receiving the user's preference;
[0029] Figure 14b illustrates the search results of Figure 14a after receiving the user's preference;
[0030] Figures 15a to 15d illustrate an example of zoom-out navigations of different granularity levels on a search interface in accordance with the present embodiments;
[0031] Figures 16a to 16d illustrate examples of queues corresponding to the search results in Figures 15a to 15d, respectively;
[0032] Figures 17a and 17b illustrate examples of zoom in and zoom out gestures, respectively;
[0033] Figures 18a and 18b illustrate an embodiment of a search interface which allows the user to express a degree of liking/disliking of a given search result;
[0034] Figures 19a to 19e are screen shots of a user interface in accordance with the present embodiment;
[0035] Figure 20 is a flowchart of a method for navigating levels of granularity of search results displayed on a user interface implemented on a display device;
[0036] Figure 21 is a flowchart of a method for navigating levels of granularity of search results;
[0037] Figure 22 illustrates an embodiment of a computing environment in which embodiments of the present invention may be practiced.
[0038] It will be noted that throughout the appended drawings, like features are identified by like reference numerals.
DETAILED DESCRIPTION
[0039] In an embodiment, a user interface is provided which detects the user's taste as the user is navigating through different search results and through different granularities of the search results. The interface refines the search results across the different granularities based on the user's taste to simplify and expedite the search process, thereby guiding the user toward the search results that are relevant to the user making the search. In an embodiment, the interface refines the search results across the different granularities with every user input indicating a user preference/taste, whereby the results presented to the user at one or more granularity levels before receiving/detecting a user preference differ from the results displayed to the user after receiving the user preference.
[0040] When a user enters a search query and hits the search button, the search query is relayed to a search result provider (the entity that provides the search results, e.g. a search engine, a database, a website, an API, etc.), which returns a list of search results. The received search results are stored in a queue in a memory provided on the computing device in accordance with a relevance coefficient which represents the relevance of a given search result to the search query. The search results displayed in the interface are those having the highest coefficients, since they are considered the most relevant to the search query. The search results may also have similarity coefficients, each similarity coefficient indicating a degree of similarity between a given search result and another one.
[0041] In a non-limiting example of implementation, the relevance coefficients and sometimes the similarity coefficients are received from the search result provider.
[0042] In an embodiment, the user interface may detect the user's taste as the user is navigating through the search results. In a non-limiting example of implementation, the interface may detect the user's liking and disliking of selected search results presented in the interface (horizontal navigation) to adjust
the relevance coefficient associated with the liked/disliked search results and all related search results. In an embodiment, adjusting the relevance coefficient of related search results is based on the similarity coefficient between the liked/disliked search result and each related search result.
[0043] The present invention will be more readily understood by referring to the following examples which are given to illustrate the invention rather than to limit its scope.
[0044] The following paragraphs describe an example of how the interface may detect the user's taste by activating functions associated with edges of the interface as the user navigates through the search results. It should be noted that the embodiments are not limited to this implementation and that other methods may also be used without departing from the scope of this disclosure.
Detecting User's Taste
[0045] The user interface may be displayed on a display area of a display device (e.g. screen, monitor etc.). The interface may define a periphery surrounding the display area. The periphery may have edges having different shapes as shown in Figures 1a to 1c. The edge may be provided near or at the periphery. The periphery may have one edge or a plurality of edges. Preferably, the user interface would have a rectangular edge with four edge portions (hereinafter four edges) as shown in Figure 1a to use the maximum space possible since most displays/monitors have a rectangular shape. However, nothing prevents implementing the user interface as a circle, ellipse, arc, etc. as exemplified in Figures 1b and 1c.
[0046] Figure 2 illustrates an example of a user interface 200 having four edges 202 to 208 defining a rectangle. The interface 200 may include a search query region 210 for typing the search query in. The search query region may be provided within the edges of the interface 200 and may also be provided outside the interface 200. It is also possible that the search query region 210 appears
and disappears when touching a certain location on the screen/keyboard. The user may type in a search query and hit search. The search results may appear within the edges of the interface 200.
[0047] In an embodiment, a function indicating a user preference is assigned to at least one edge (or portion of an edge) of the user interface. For example, as shown in Figure 2, edge 202 is assigned the function "like", edge 204 is assigned the function "preview", edge 206 is assigned the function "dislike", and edge 208 is assigned the function "ignore". Whereby, the user may drag a search result toward one of the edges 202 to 208 of the interface 200 to activate the function associated with that edge. Prior to activating the function, the interface may provide a visual indicator indicating the edge toward which the user is moving the search result and the function associated with that edge. The function may be activated by bringing the search result in proximity or in contact with the edge, or by throwing the search result toward the edge as will be described in further detail hereinbelow.
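The edge-function dispatch described above might look like the following sketch, where the mapping of physical edges to functions and the proximity threshold are assumed values for illustration:

```python
# edge -> function mapping loosely following Figure 2's layout (which
# physical edge carries which function is an assumption here)
EDGE_FUNCTIONS = {"top": "like", "right": "preview",
                  "bottom": "dislike", "left": "ignore"}

def function_for_drop(x, y, width, height, threshold=20):
    """Return the function to activate when a search result is dropped
    within `threshold` pixels of an edge, or None otherwise."""
    if y <= threshold:
        return EDGE_FUNCTIONS["top"]
    if y >= height - threshold:
        return EDGE_FUNCTIONS["bottom"]
    if x <= threshold:
        return EDGE_FUNCTIONS["left"]
    if x >= width - threshold:
        return EDGE_FUNCTIONS["right"]
    return None  # not close enough to any edge

print(function_for_drop(5, 300, 800, 600))    # near the left edge
print(function_for_drop(400, 300, 800, 600))  # middle of the display
```

The None case corresponds to the drag not yet reaching an edge, which is when the visual indicator described above would preview the pending function.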
[0048] An interface in accordance with the present embodiments may be used for performing different types of searches. For example, it may be used for performing a product search, a regular search in a web browser over the internet or the like, or a local search of a local database/library. Examples of products may include: artist, author, singer, dancer, music composer, music type, band, actor, music album, song, painting, book, movie, game, electronic device, car, house, etc.
[0049] In an embodiment, after receiving a search query command, the interface (or the processor of the computing device on which the interface is implemented) may return a number of search results each having associated therewith: a coefficient of relevance (hereinafter coefficient), a list of similar results/products, a factor of similarity (aka similarity-coefficient) between the result and each similar product, and one or more common characteristics that relate the result to the similar product. The interface stores the results and the associated data in a queue in memory. In an embodiment, the results may be ordered in the queue in accordance with the magnitude of the coefficients associated therewith, so that the results having the highest coefficients are stored at the beginning of the queue and displayed first and those having lower coefficients are stored at the end of the queue and displayed last (if they are ever displayed). The interface may display a number (or all) of the results stored in the queue based on the space allocated to the interface 200 on the display. The products may be displayed as icons or nodes.
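The per-result data and the ordered queue described above can be sketched as follows. This is a minimal illustration under stated assumptions, not the patented implementation; the `SearchResult` structure and its field names are hypothetical, inferred from paragraph [0049].

```python
from dataclasses import dataclass, field

@dataclass
class SearchResult:
    # Hypothetical container for the per-result data of paragraph [0049].
    name: str
    coefficient: float                           # relevance coefficient
    similar: dict = field(default_factory=dict)  # similar result -> similarity factor
    tags: dict = field(default_factory=dict)     # similar result -> common characteristic

def build_queue(results):
    """Order results so those with the highest coefficients come first
    and are displayed first; Python's sort is stable, so ties keep
    their original relative order."""
    return sorted(results, key=lambda r: r.coefficient, reverse=True)

queue = build_queue([
    SearchResult("Nicole Scherzinger", 0.87, similar={"Jennifer Lopez": 0.5}),
    SearchResult("Beyonce", 1.0),
    SearchResult("Rihanna", 1.0),
])
# Beyonce and Rihanna (coefficient 1) take the first two places, as in Figure 4.
```

The interface would then display only as many results from the front of the queue as fit in the space allocated to it.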
[0050] In an embodiment, the interface may display the search results within the edges thereof while showing a visual link between two (or more) of the results based on a common characteristic that exists between the two results. Figure 3 illustrates an exemplary user interface illustrating a cluster of search results presented as icons with visual links between related results. The common characteristic may be a characteristic that relates to the search result, such as the type of music, the character of an actor, etc., and may also relate to the user's social network. For example, the common characteristic could be the fact that the user's friends recommend the product or have bought it, or they like it (or dislike it), or wrote a review about it, etc. As shown in Figure 3, the interface 200 may associate a tag to the visual link to indicate the common characteristic that exists between the search results linked by the visual link. With reference to Figure 3, assuming that the user wants to search the music of the singer 'Beyonce', the user may type the name of the singer in the search query region 210 and hit search. As shown in Figure 3, the interface 200 returns and displays a number of singers and provides a visual link between related singers. For example, Beyonce and Nicole Scherzinger are related by being 'sexy', while Kelly Rowland and Beyonce are related by the type of music 'soul' that they sing.
[0051] Figure 4 illustrates an example of a queue 240 illustrating a number of search results listed in the order of the magnitude of their coefficients. For instance, the singers Beyonce and Rihanna have a coefficient of 1 (which is the highest) and take the first two places in the queue 240 since they both have many characteristics in common; Nicole Scherzinger has a coefficient of 0.87 and comes in third place, and so on. Figure 4 also shows the singers that are similar to Nicole Scherzinger along with a factor of similarity. Each singer listed in the queue has a list of related singers; however, for space constraints, Figure 4 shows only the lists associated with Nicole Scherzinger and Wanessa.
Dragging An Icon
[0052] As discussed above with reference to Figure 2, one or more of the interface edges may have a function associated with them. Accordingly, if the user drags an icon toward one of these edges and activates the function associated with that edge, the interface would refine the search results displayed within the interface or play a sample of the product represented by the dragged icon. For instance, if the user drags an icon toward the like edge 202, the interface 200 may remove the dragged icon from the display, adjust the coefficient of that icon to be the highest, and then adjust the coefficients of the similar products in the queue. Adjusting the coefficients of the similar results may be based on the similarity factor between these results, e.g. 0.5 between Nicole Scherzinger and Jennifer Lopez as shown in Figure 4. The queue may then be reordered to take into consideration the new coefficients, and a new result may be displayed in the interface along with a new link and a new tag. An example is illustrated in Figures 5a to 5e.
[0053] Assuming that the user likes Nicole Scherzinger, they may drag the icon representing her toward the like edge 202, as shown in Figure 5a. The interface would then remove the icon of Nicole Scherzinger from the cluster, as shown in Figure 5b, and recalculate the links as shown in Figure 5c. In the meantime, the interface may increase the coefficient of Nicole Scherzinger to 1, and adjust the coefficients of the related singers based on the factor of similarity. In the present example, re-adjustment of the coefficient is done by adding the factor of similarity to the coefficient of the related singer and dividing the sum by two. In another embodiment, it is possible to determine an amount to be added, multiply that amount by the factor of similarity, and add the result to the coefficient of the similar result. Other methods may also be used to adjust the coefficients.
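The averaging adjustment just described can be sketched as below. This is a non-authoritative illustration of paragraph [0053]; the dictionary-based representation of the queue is an assumption.

```python
def apply_like(liked, coefficients, similarity):
    """Adjust coefficients after a 'like': the liked result's coefficient
    is raised to 1 (the highest), and each related result's new
    coefficient is the average of its old coefficient and its similarity
    factor with the liked result (paragraph [0053]).

    coefficients: result name -> relevance coefficient
    similarity:   related result name -> similarity factor with `liked`
    """
    new = dict(coefficients)
    new[liked] = 1.0
    for name, factor in similarity.items():
        new[name] = (coefficients[name] + factor) / 2.0
    # Reorder the queue to take the new coefficients into consideration.
    queue = sorted(new, key=new.get, reverse=True)
    return new, queue

coeffs = {"Nicole Scherzinger": 0.87, "Jennifer Lopez": 0.6, "Rihanna": 1.0}
new, queue = apply_like("Nicole Scherzinger", coeffs, {"Jennifer Lopez": 0.5})
# Jennifer Lopez: (0.6 + 0.5) / 2 = 0.55; Nicole Scherzinger rises to 1.
```

The liked result stays in the reordered queue even though, per paragraph [0054], its icon is no longer displayed in the cluster.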
[0054] The queue 240 is reordered to take into account the new coefficients, as shown in Figure 6. Of course, the liked singer (result) would remain in the queue but would not appear in the interface unless the user brings it back manually, for example by touching or double-clicking on the like edge to bring the liked results back into the cluster to be displayed. Once the queue 240 is reordered, a new singer (The Pussy Cat Dolls) is displayed in the interface 200 as shown in Figure 5d, and new links/tags (sweet) may be calculated and displayed as illustrated in Figure 5e.
[0055] It should be noted that it is not mandatory to display the steps of each of Figures 5a to 5e to the user, but this method may be useful to indicate the progress while the system is calculating the results and/or updating the queue.
[0056] In an embodiment, the common characteristics may also be stored in accordance with their relevance. In the example of Figure 5d, the 'smart' tag between Rihanna and Kelly Rowland has been removed in favor of a 'sweet' tag that relates Rihanna, The Pussy Cat Dolls, and Destiny's Child.
[0057] In an embodiment, the user may dislike one of the results by dragging the corresponding icon toward the dislike edge 246. The interface would then remove the dragged product from the cluster, decrease the coefficient of the dragged product and the coefficient of the related products based on the factor of similarity. This step may be performed in the same manner as the step of liking with the exception that the coefficients are decreased.
Dragging A Tag
[0058] In a further embodiment, the user may perform the actions described in Figures 5 and 6 with a tag rather than an icon.
[0059] For example, when the user drags a tag toward the like edge 202, the tag's top results are fetched to increase their coefficients and/or add them to the queue where applicable. The queue is then reordered to take into account the new coefficients associated with the results. On the interface level, the tag that is dragged disappears from the interface and is replaced by one or more other tags, and the cluster of search results may also be modified to take into consideration the new coefficients. An example is illustrated in Figures 7a to 7f.
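The tag-liking step can be sketched as follows, under stated assumptions: the document does not fix a formula for the increase, so the fixed `boost` increment and the shape of the fetched "top results" are hypothetical.

```python
def like_tag(tag_top_results, coefficients, boost=0.2):
    """When a tag is dragged to the like edge, fetch the tag's top
    results and increase their coefficients, adding them to the queue
    if absent (paragraph [0059]). `boost` is an assumed increment,
    capped so coefficients never exceed 1."""
    new = dict(coefficients)
    for name, provider_coefficient in tag_top_results.items():
        base = new.get(name, provider_coefficient)  # add to queue if absent
        new[name] = min(1.0, base + boost)
    # Reorder the queue to take the new coefficients into account.
    return new, sorted(new, key=new.get, reverse=True)

coeffs = {"Beyonce": 1.0, "Kelly Rowland": 0.7}
top_sexy = {"Shakira": 0.6, "Britney Spears": 0.55}  # hypothetical fetched results
new, queue = like_tag(top_sexy, coeffs)
# Shakira and Britney Spears now rank above Kelly Rowland in the queue.
```

This mirrors the Figure 7 example, where liking 'sexy' promotes Shakira and Britney Spears into the displayed cluster at the expense of lower-ranked results.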
[0060] Figure 7a illustrates an exemplary user interface including a cluster of icons representing search results and tagged visual links between related results. Figure 8 illustrates the queue corresponding to the interface of Figure 7a. If the user likes singers that are 'sexy', they may drag the tag 'sexy' toward the like edge 202, as shown in Figure 7b. When the tag is liked, the tag may be removed from the interface as shown in Figure 7c, and the tag's top artists are fetched, as shown in Figure 9, to increase their coefficients. After the tag is removed, other tags are displayed as shown in Figure 7d. Once the coefficients are modified, the queue may be reordered to take into account the new coefficients, as shown in Figure 10. Once the queue is reordered, the cluster of products shown in the interface 200 is changed. For example, since Shakira and Britney Spears moved into the top five places in the queue as shown in Figure 10, the interface would display them instead of Kelly Rowland and Destiny's Child, as shown in Figure 7e. New tags may then be calculated between the related icons, as shown in Figure 7f, which shows that Britney Spears, Beyonce and Shakira are linked by the tag 'pop', indicating that the three singers sing pop music.
[0061] The other functions (dislike, ignore, preview, etc.) may also be applied by dragging the tag toward the desired edge or portion of an edge. These functions have been discussed above in connection with the dragging of icons representing search results and will not be repeated herein.
[0062] Accordingly, the embodiments discussed above describe how the user may navigate different search results, and how the search results displayed to the user are changed every time the user selects/drags a given search result and likes or dislikes it, based on the relevance-coefficient associated with each search result in the queue and a similarity-coefficient between the given search result and every other similar search result.
Navigating Different Levels of Granularity
[0063] In an embodiment, the interface may allow the user to navigate through different levels of granularity of the search results (aka vertical navigation) to view search results at lower or higher granularity levels. An example is provided with respect to Figures 11a to 11f.
[0064] Figures 11a to 11f illustrate an example of zoom-in navigations of different granularity levels on a search interface in accordance with the present embodiments, and Figures 12a to 12d illustrate examples of queues corresponding to the search results in Figures 11a to 11d, respectively. Figure 13 illustrates an example of search results returned by the search engine at different granularity levels in response to the search query "car".
[0065] It should be noted that the search results may or may not all be returned to the computing device in one shot. It may be that some of the search results are sent by the search result provider to the computing device as the user navigates through the different results. For example, the different classes and models of Mercedes may only be sent to the computing device if the user zooms in over the Mercedes icon in Figure 11b.
[0066] It is also to be noted that the embodiments are not limited to a specific hierarchical structure. In other words, the embodiments are not limited to the hierarchy shown in Figure 13, and may still be practised with other hierarchy structures, as exemplified in Figure 13a (when compared to Figure 13). For example, the granularity levels may differ from one data provider (e.g. search engine, database, etc.) to another. However, the interface may display the granularity levels in accordance with the hierarchy structure provided by the data provider.
[0067] Referring back to Figure 11a, this Figure illustrates type 1 search results returned by a search engine (e.g. http://www.caranddriver.com/ ) in response to the search query "car".
[0068] The type 1 results typically represent the results at the highest granularity level for the search query. The type 1 granularity level usually depends on the search query and the data available at the search engine. For example, if the search query was "vehicle" instead of "car", a search engine such as http://www.autotrader.com/ would have returned different choices of vehicles as the type 1 results, including for example: motorcycles, cars, boats, trucks, etc. On the other hand, if the user is searching for a "car" on a search engine associated with a specific manufacturer, e.g. BMW, the type 1 search results that are returned would immediately be those shown in Figure 11c, since BMW does not provide Asian and/or American cars, etc.
[0069] As discussed above, the interface allows the user to navigate through different levels of granularity of the search results. For example, using a zoom-in gesture (e.g. a reverse pinch gesture as exemplified in Figure 17a) on a touch sensitive display, the user may navigate into a deeper granularity level of a given search result, and may return to the first level using a zoom-out gesture (e.g. a pinch gesture as exemplified in Figure 17b). In an embodiment, more than two granularity levels may be provided, and the granularity level that the user may reach depends on the width of the pinch gesture performed by the user. For example, assuming that the search results presented on the interface represent the highest level (type 1) of granularity for the search query, the user may navigate into the type 2 granularity by performing a reverse pinch gesture that is between 1 cm and 1.5 cm on a portable device. Assuming however that the user has performed a reverse pinch gesture that is 2 cm wide, the interface may present type 3 (or type 4) search results immediately without showing the type 2 results.
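The pinch-width mapping can be sketched as below. Only the 1-1.5 cm example is given in the text; the other thresholds, the function name, and the level cap are assumptions.

```python
def granularity_for_pinch(width_cm, current_level=1, max_level=4):
    """Map the width of a reverse-pinch gesture to a target granularity
    level (paragraph [0069]). A 1-1.5 cm pinch goes one level deeper;
    wider pinches may skip intermediate levels. Thresholds outside the
    1-1.5 cm example stated in the text are assumed values."""
    if width_cm < 1.0:
        target = current_level          # too narrow: stay at this level
    elif width_cm <= 1.5:
        target = current_level + 1      # e.g. type 1 -> type 2
    elif width_cm < 2.0:
        target = current_level + 2
    else:
        target = current_level + 3      # a ~2 cm pinch may skip to type 3/4
    return min(target, max_level)

# A 1.2 cm reverse pinch over type 1 results would show type 2 results,
# while a 2 cm pinch would skip directly past type 2.
```

The same mapping, mirrored, would serve the zoom-out (pinch) gesture described in paragraph [0095].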
[0070] Referring back to Figure 11a and assuming that the user is interested in European cars, the user may perform a zoom-in gesture over that search result to view the type 2 results illustrated in Figure 11b, which represent a lower granularity level for European cars. If the user performs another zoom-in gesture over "BMW" in Figure 11b, they may view the type 3 results presented in Figure 11c, which illustrate a lower granularity level for the type 2 search results. By zooming in over "3 series" in Figure 11c, the user may view the type 4 search results illustrated in Figure 11d, which represent a lower granularity level for the BMW 3 series search result.
[0071] In an embodiment, the interface may provide a visual indicator that indicates the granularity level of the results currently displayed on the interface, as shown at 280. In another embodiment, the visual indicator 280 may be provided so that it shows all the available granularity levels at the same time and also indicates the current granularity level of the search results displayed in the interface, as shown at 280-a in Figure 11g.
[0072] If the user likes M3 cars (shown in Figure 11d), they may express their taste on the interface by dragging the M3 icon toward a like edge as exemplified in Figure 11e, or by opening the link (e.g. by tapping/clicking over the associated icon) to view more details about that search result as shown in Figure 11f and pressing a like/dislike button 282. Upon receipt of the user's expression of taste/preference, the interface may adjust the relevance coefficients associated with related search results stored in the queue, based on the similarity coefficient between each related search result and the liked search result, to display different results at each granularity level.
[0073] Figure 14a illustrates an example of search results that are related to BMW M3 and the similarity factor between BMW M3 and each related search result, before receiving the user's liking of BMW M3. It should be noted that in Figure 14a, the similarity factors displayed are limited to search results of the same granularity level. However, there may be direct or indirect similarity factors between results of different granularities, as will be described below.
[0074] As shown in Figure 14a, the similarity coefficient with Mercedes C63 AMG is 0.9, with Audi S4 is 0.85, and with Lexus IS 350F is 0.8. These models are known as competitors of the same class (compact sport sedan/coupe). Figure 14a also shows that the relevance coefficient before the user expressed their taste is 0.6 for M3, 0.6 for C63 AMG, 0.6 for Audi S4, and 0.5 for IS 350F Sport. As discussed above, the relevance coefficients are usually received from the search engine and represent a degree of relevance to the search query.
[0075] In a non-limiting example of implementation, after receiving the user's input liking M3, the relevance coefficient of the liked result M3 may be increased, and the relevance coefficients of related search results across all granularity levels may be adjusted to take into account the new relevance coefficient of M3.
[0076] After receiving the user's expression of taste, different search results will be presented to the user, as exemplified in Figures 15a to 15d.
[0077] Figures 15a to 15d illustrate an example of zoom-out navigations of different granularity levels on a search interface in accordance with the present embodiments, and Figures 16a to 16d illustrate examples of queues corresponding to the search results in Figures 15a to 15d, respectively.
[0078] After receiving the user's input liking of M3, the interface may change the search results (type 4) at the same granularity level of the liked result to present results that are more relevant to the user. In the present example, Figure 15a illustrates results that are different than those shown in Figure 11d for the same granularity level (models level). In particular, the interface now shows C63 AMG, Audi S4, and Lexus IS 350F instead of the different 3 series models of BMW shown in Figure 11d. If the user zooms out to a higher granularity level, e.g. type 3, they may view the type 3 search results shown in Figure 15b, which also include differences when compared to those shown in Figure 11c for the same granularity level (series). The same applies to Figures 15c and 11b, and to Figures 15d and 11a.
[0079] In an embodiment, the queue may be re-ordered on each level of granularity every time a user preference (liking/disliking a given search result) is received. The reordering takes into account the new coefficients of the different search results, whereby different results are presented to the user as the user navigates through the different granularity levels. For example, if the user likes BMW M3 cars and expresses their taste on the interface, several search results at different granularities may be promoted and several others may be demoted as a result of receiving the user's expression of taste. In the present example: all year models of BMW M3 may be promoted (e.g. 1989 M3s to 2013 M3s); the maker may be promoted as a whole (e.g. BMW); similar cars of the same maker may be promoted (e.g. BMW 335i M-Sport Package, M5, M1, M6, X5M, X6M); all cars of the same type or class may be promoted (e.g. C63 AMG of Mercedes, Lexus IS 350F, Audi S4, etc.); all makers of the same type of cars in the same class may be promoted (e.g. Audi, Mercedes, Lexus); and all year models of those cars of the same type may be promoted (e.g. all year models of: Lexus IS-F, Audi S4, Mercedes AMG). Thus, even if the user has not yet viewed Lexus search results, if the user had already liked BMW M3 and then zooms out to Lexus and then zooms in over Lexus, the Lexus search results presented to the user would be different from those that would have been shown prior to receiving the user's liking of BMW M3. In the present case, the Lexus search results displayed to the user may in their majority be from the Lexus IS categories, which include sport cars such as the Lexus IS 250-F and Lexus IS 350F that are similar to the BMW M3, while the majority of results that would have been shown prior to receiving the user's liking of BMW M3 would have been those classified in the luxury class.
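The cross-granularity promotion described above can be sketched as a walk up a hierarchy of results. This is a sketch under stated assumptions: the `parents` tree, the `boost` value, and the function names are hypothetical, and the document leaves the exact promotion amounts open.

```python
def promote_across_levels(liked, parents, similar, coefficients, boost=0.3):
    """Sketch of the cross-granularity promotion of paragraph [0079]:
    liking a result (e.g. BMW M3) promotes the result and every ancestor
    in the hierarchy (3 series -> BMW -> European), and also promotes
    same-class competitors together with their ancestors, scaled by the
    similarity coefficient. The tree and boost value are assumptions."""
    new = dict(coefficients)

    def promote_chain(node, amount):
        # Walk up the hierarchy, promoting the node and each ancestor.
        while node is not None:
            new[node] = min(1.0, new.get(node, 0.0) + amount)
            node = parents.get(node)

    promote_chain(liked, boost)
    for competitor, s in similar.get(liked, {}).items():
        promote_chain(competitor, boost * s)
    return new

parents = {"M3": "3 series", "3 series": "BMW", "BMW": "European",
           "C63 AMG": "C-Class", "C-Class": "Mercedes", "Mercedes": "European"}
similar = {"M3": {"C63 AMG": 0.9}}
coeffs = {"M3": 0.6, "C63 AMG": 0.6, "BMW": 0.5, "Mercedes": 0.5}
new = promote_across_levels("M3", parents, similar, coeffs)
# M3 and BMW are promoted directly; C63 AMG and Mercedes are promoted
# through the 0.9 similarity; makers without an equivalent get nothing,
# consistent with the demotion behaviour of paragraph [0080].
```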
[0080] In an embodiment, the interface may be configured to demote car makers that do not have an equivalent to the search result liked by the user. In the present example, since Hyundai, Nissan, Volvo and all American cars do not have an equivalent to BMW M3, the relevance coefficients associated with these results are reduced (or at least not increased) across the different granularity levels since they do not have search results that are relevant to the user making the search.
[0081] Needless to say, if the user preference indicates that the user dislikes a given search result, the disliked search result and all its related search results will receive a reduction in their relevance coefficients using the same concept discussed above, and may also disappear from the interface if other results in the queue end up with higher coefficients.
[0082] It should be noted that the embodiments are not limited to cars and that other types of search results may be used without departing from the scope of this disclosure. Such other types may include, by order of granularity level:
- music type, artist, album, individual tracks;
- electronics, computers, portable devices, tablets;
- electronics, camcorders, individual cameras;
- recipe book, recipe, ingredients;
- etc.
[0083] As discussed above, it is possible that the similarity coefficients are sometimes provided by the entity that provides the search results (e.g. a search engine, a database, a website, an API, etc.). However, this data is not always available, and sometimes when it is available it is not complete. In an embodiment, the interface may establish similarity coefficients between related search results using one or more of the following methods.
[0084] In one embodiment, the interface may establish/set the similarity coefficients between two given search results based on common attributes of the search results. For example, in the example of cars, such attributes may include the number/amount of available horsepower, torque, number of doors, weight, maximum speed, acceleration, fuel consumption, transmission type, luxury level, size, number of passengers, manufacturer's country, class, type, available options such as navigation, rear view camera, etc.
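One way to sketch such an attribute-based similarity coefficient is shown below. The attribute values, the normalization ranges, and the equal weighting are all assumptions, since the paragraph only lists candidate attributes without a formula.

```python
def attribute_similarity(a, b, ranges):
    """Derive a similarity coefficient from shared attributes
    (paragraph [0084]). Numeric attributes contribute 1 minus their
    normalized difference; categorical attributes (range None)
    contribute 1 on an exact match. Equal weighting is an assumption."""
    scores = []
    for attr, span in ranges.items():
        if attr not in a or attr not in b:
            continue                      # skip attributes missing on either side
        if span is None:                  # categorical: class, transmission type...
            scores.append(1.0 if a[attr] == b[attr] else 0.0)
        else:                             # numeric: horsepower, torque, weight...
            scores.append(1.0 - min(1.0, abs(a[attr] - b[attr]) / span))
    return sum(scores) / len(scores) if scores else 0.0

# Hypothetical attribute values for illustration only.
m3  = {"horsepower": 414, "doors": 2, "class": "compact sport sedan/coupe"}
c63 = {"horsepower": 451, "doors": 2, "class": "compact sport sedan/coupe"}
ranges = {"horsepower": 500.0, "doors": None, "class": None}
sim = attribute_similarity(m3, c63, ranges)  # close to 1: same class, similar power
```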
[0085] In another example, the interface may establish the similarity coefficients by detecting the user's activities on the interface, for example by monitoring the search results that the user is viewing and/or the time spent viewing each of these results. For example, if the user views M3, then navigates to C63 AMG and spends a similar amount of time reviewing this vehicle, the interface may determine that these two vehicles are similar and may establish a similarity coefficient between them. In an embodiment, the interfaces associated with different devices may each report the navigation activities performed on their associated devices to a remote server (e.g. the Tastefilter server) over the internet or another telecommunications network. The server may then compare the different navigation activities performed on the different devices to find a trend and establish similarity between different search results. For example, assume that no similarity coefficient is established between the C63 AMG of Mercedes and the BMW M3, and a given number of users who viewed one of these results also navigated to the other result to view it. The server may then detect a match between M3 and C63 AMG and establish a similarity coefficient between them. The detection of a match may be based on one or more of the following: the number of users who viewed and/or liked one result and then navigated to view and/or like the other; the time spent viewing each result; and the percentage of users who viewed both results relative to the entire number of users who viewed only one but not the other.
[0086] In another embodiment, the interface may establish a similarity coefficient between two search results based on the similarity coefficients associated with common related search results. For example, assume that the similarity coefficient between M3 and IS 350-F is 0.8 and between IS 350-F and the IS series of Lexus is 0.9. In this case, the similarity factor between M3 and the IS series would be 0.8*0.9 = 0.72, as shown in Figure 14b.
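The indirect derivation above amounts to multiplying the coefficients along the chain of common related results; a minimal sketch:

```python
def chained_similarity(*coefficients):
    """Derive an indirect similarity coefficient by multiplying the
    similarity coefficients along a chain of common related results
    (paragraph [0086])."""
    product = 1.0
    for s in coefficients:
        product *= s
    return product

# M3 -- 0.8 -- IS 350-F -- 0.9 -- Lexus IS series
sim = chained_similarity(0.8, 0.9)  # approximately 0.72, matching Figure 14b
```

Since every coefficient is at most 1, a chained coefficient never exceeds the weakest direct link, which keeps indirect similarities conservative.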
[0087] In another embodiment, the interface may provide the user with the option of expressing a degree of liking/disliking, e.g. like a lot, like a bit, etc. In a non-limiting example of implementation, the degree of liking or disliking may be expressed based on the position at which the user drags the search result. In an embodiment, the edge may be graded to give a score to each end of the edge and intermediate scores for positions in between the two ends. For example, it is possible to assign the lowest score to one end and the highest score to the opposite end. Figures 18a and 18b illustrate an embodiment of a search interface which allows the user to express a degree of liking/disliking of a given search result.
[0088] With reference to Figure 18a, the "like" edge 202 has two ends 264 and 266. The first end 264 is assigned the lowest liking score whereas the second end 266 is assigned the highest liking score. For example, as shown in Figure 18a, when the user drags the search result M3 toward the "like" edge 202, visual indicators may be displayed beside each end of the edge 202 to indicate the degree of liking associated with each end (a big heart for the "like a lot" end 266 and a small heart for the "like a bit" end 264). The coefficient of the dragged item is thereby adjusted based on the location of intersection with the associated edge. For example, if the selected item is dragged beside the first end 264, the coefficient of the dragged item would be increased by the minimum possible value, and if the selected item is dragged beside the second end 266, the coefficient of the item is increased to the maximum. In the present example, the user expressed a moderate liking of M3 by dragging the M3 icon to the middle, as shown in Figure 18a. In the present case, the relevance coefficient associated with M3 may be promoted/increased based on the degree of liking.
[0089] Similarly, if the user drags the search result toward the dislike edge 206, a visual indicator may be displayed beside each end of the dislike edge to indicate the degree of disliking associated with each end (a big trash can for the "dislike a lot" end 267 and a small trash can for the "dislike a bit" end 265).
[0090] In a non-limiting example of implementation, adjustment of the relevance coefficients of the liked/disliked item and the similar search results across the same or different granularity levels may be done in the following manner: if search result X has a relevance coefficient Rx, and X receives a 0.5 degree of liking, then: New Rx = Old Rx + 0.5.
[0091] At the same time, assume that Y is a search result that is similar to X and has a similarity coefficient s1 with X. If the old relevance coefficient of Y was Ry, then the new relevance coefficient of Y may be: New Ry = (0.5*s1) + Old Ry.
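The two update rules of paragraphs [0090] and [0091] can be written together as one small function; the dictionary representation of the queue is an assumption, and the formulas are applied literally as stated.

```python
def apply_graded_like(liked, degree, coefficients, similarity):
    """New Rx = Old Rx + degree for the liked result X, and
    New Ry = (degree * s1) + Old Ry for each similar result Y, where s1
    is Y's similarity coefficient with X (paragraphs [0090]-[0091])."""
    new = dict(coefficients)
    new[liked] = coefficients[liked] + degree
    for name, s1 in similarity.items():
        new[name] = coefficients[name] + degree * s1
    return new

# M3 receives a 0.5 degree of liking; C63 AMG has similarity 0.9 with M3.
new = apply_graded_like("M3", 0.5, {"M3": 0.6, "C63 AMG": 0.6},
                        {"C63 AMG": 0.9})
# M3: 0.6 + 0.5 = 1.1; C63 AMG: 0.6 + (0.5 * 0.9) = 1.05
```

A negative `degree` would express the graded disliking of paragraph [0089] with the same formula.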
[0092] Figures 19a to 19e are screen shots of a user interface in accordance with the present embodiments. As discussed above, the granularity level of the results displayed in the interface may depend on the width of the pinch gesture performed over the selected search result. With reference to Figures 19a to 19e, it is shown in Figure 19a that the user performed a pinch gesture (using two fingers 253) over the tag "rock". As the pinch gesture goes wider, the top tags for "rock" may appear in the interface as shown in Figure 19b; if the gesture goes wider, the top artists for "rock" may appear as shown in Figure 19c; if the pinch goes even wider, the top albums for "rock" may appear as shown in Figure 19d; and if the pinch goes wider still, the top songs for "rock" may appear as shown in Figure 19e.
[0093] As shown in Figures 19b to 19e, the tag "rock" appears in the interface (e.g. in the center thereof) at the same time the lower granularity levels for "rock" are displayed, to indicate to the user that the search results currently displayed are classified under "rock". Meanwhile, the visual indicator 280 informs the user of the available granularities that may be navigated, and indicates the current granularity of the search results currently shown in the interface, as indicated at 280-a.
[0094] Assuming that the user has let go of the pinch in Figure 19b and then zoomed over "soft rock", the "soft rock" icon would appear while the top artists, albums, and songs for "soft rock" are displayed instead of those for the higher granularity "rock", and the same applies to any lower granularity.
[0095] It is also to be noted that the search results shown in any one of Figures 19c, 19d, and 19e may be displayed immediately after Figure 19a if the pinch performed in Figure 19a was wide enough. In other words, the interface may be configured so that the width of the pinch gesture decides the granularity level of the search results displayed in the interface. The same applies to the zoom-out gestures.
[0096] It is to be noted that the zoom-in and zoom-out gestures may be received on a touch sensitive display; on a vision-based tool that analyses the gesture of the user in the air using a camera, radar, motion detector, glove, or the like; or via a pointing device, a keyboard, or any other device that allows for interfacing with a computing device.
[0097] Figure 20 is a flowchart of a method 350 for navigating levels of granularity of search results displayed on a user interface implemented on a display device. The method comprises, at step 352, receiving a plurality of type 1 search results returned in response to a search query, said type 1 search results having a granularity level that is based on the search query and a type of data that is available at a search engine. Step 354 comprises storing said type 1
search results in a queue based on a relevance-coefficient associated with each search result, said relevance-coefficient indicating a relevance of the search result to the search query. Step 356 comprises displaying a first subset of type 1 search results comprising the type 1 search results having the highest relevance-coefficients. Step 360 comprises, in response to receiving a user input representing a zoom-in instruction over a selected type 1 search result, displaying a first subset of type 2 search results, said type 2 search results representing a lower-granularity level of the selected type 1 search result; and receiving a user input representing a user preference of a selected type 2 search result. Step 362 comprises adjusting relevance-coefficients associated with type 1 and type 2 search results based on the user preference and a similarity-coefficient between the selected type 2 search result and any related type 1 and/or type 2 search result. Step 364 comprises re-ordering the queue based on the adjusted relevance-coefficients. Step 366 comprises displaying a second subset of type 1 search results upon receiving a user input representing a zoom-out instruction, the second subset of type 1 search results having at least one type 1 search result different than the first subset of type 1 search results.
[0098] Figure 21 is a flowchart of a method 370 for navigating levels of granularity of search results. The method comprises, at step 372, displaying a first subset of type 1 search results returned in response to a search query on a touch-sensitive display of a portable computing device, the type 1 search results having a granularity level that is defined by a search engine for the search query. At step 374, in response to detecting a pinch gesture representing a zoom-in instruction over a selected type 1 search result, displaying a first subset of type n search results representing a lower-granularity level of the selected type 1 search result. Step 376 comprises receiving a user input liking or disliking a selected type n search result. Step 378 comprises, in response to the liking or disliking, adjusting a relevance-coefficient associated with each search result that is related to the selected type n search result based on a similarity coefficient
between a related search result and the selected type n search result. Step 380 comprises, in response to receiving a user input representing a zoom-out instruction, displaying a second subset of type 1 search results including at least one type 1 search result which is different from the first subset of type 1 search results.
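The similarity coefficient used in step 378 (and, per claims 8 and 15, derivable from "common attributes" between two results) could be realised, for instance, as a Jaccard overlap of attribute sets. The formula below is one plausible choice offered purely for illustration; the document fixes no specific computation.

```python
def similarity_coefficient(attributes_a, attributes_b):
    """Jaccard overlap of two attribute sets: 0.0 (nothing shared) to 1.0
    (identical). One possible reading of 'common attributes'."""
    a, b = set(attributes_a), set(attributes_b)
    return len(a & b) / len(a | b) if a | b else 0.0
```

Two summer dresses sharing colour and season but differing in fabric would score 3/5 = 0.6, so a like on one propagates a proportionally smaller boost to the other.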
Hardware and Operating Environment
[0099] Embodiments of the invention may be implemented/operated using a client machine.
[00100] The client machine can be embodied in any one of the following computing devices: a computing workstation; a desktop computer; a tablet, a laptop or notebook computer; a server; a handheld computer; a mobile telephone; a portable telecommunication device; a media playing device; a gaming system; a mobile computing device; a device of the IPOD or IPAD family of devices manufactured by Apple Computer; any one of the PLAYSTATION family of devices manufactured by the Sony Corporation; any one of the Nintendo family of devices manufactured by Nintendo Co; any one of the XBOX family of devices manufactured by the Microsoft Corporation; or any other type and/or form of computing, telecommunications or media device that is capable of communication and that has sufficient processor power and memory capacity to perform the methods and systems described herein. In other embodiments the client machine can be a mobile device or any other mobile computing device capable of performing the methods and systems described herein.
[00101] Still other embodiments of the client machine include a mobile client machine that can be any one of the following: any one series of Blackberry, Playbook or other handheld device manufactured by Research In Motion Limited; the iPhone manufactured by Apple Computer; Windows Phone 7, HTC, Sony Ericsson, any telephone or computing device running the Android operating system, or any handheld or smart phone; a Pocket PC; a Pocket PC Phone; or
any other handheld mobile device supporting Microsoft Windows Mobile Software, etc.
[00102] The client machine may include a display and a touch-sensitive surface. It should be understood, however, that the computing device may also include one or more other physical user interface devices, such as a physical keyboard, a mouse and/or a joystick. In another embodiment, the computing device may include or be operably connected to a motion detector or a vision based interface (such as a virtual keyboard) for receiving the user's inputs.
[00103] The client machine may be in communication with a remote server via a communication network. In another implementation, the data may be loaded from a local database or from local data files, e.g. XML or JSON.
[00104] Figure 22 illustrates an embodiment of a computing environment
101 that includes one or more client machines 102A-102N in communication with servers 106A-106N, and a network 104 installed in between the client machines 102A-102N and the servers 106A-106N. In some embodiments, client machines 102A-102N may be referred to as a single client machine 102 or a single group of client machines 102, while servers may be referred to as a single server 106 or a single group of servers 106. One embodiment includes a single client machine
102 communicating with more than one server 106, another embodiment includes a single server 106 communicating with more than one client machine 102, while another embodiment includes a single client machine 102 communicating with a single server 106.
[00105] The client machine 102 may in some embodiments execute, operate or otherwise provide an application that can be any one of the following: software; a program; executable instructions; a web browser; HTML5; Javascript; WebGL; a web-based client; a client-server application; a thin-client computing client; software related to voice over internet protocol (VoIP) communications like a soft IP telephone; an application for streaming video and/or audio; an
application for facilitating real-time-data communications; an HTTP client; an FTP client; an Oscar client; a Telnet client; any application from any application store such as Apple's App Store, the Google Play store, the Amazon app store, or BlackBerry; or any other type and/or form of executable instructions capable of executing on client machine 102. Still other embodiments may include a computing environment 101 with an application that is any of either server-based or remote-based, and an application that is executed on the server 106 on behalf of the client machine 102. The client machine 102 may include a network interface to interface to a Local Area Network (LAN), Wide Area Network (WAN) or the Internet through a variety of connections including, but not limited to, standard telephone lines, LAN or WAN links (e.g., 802.11, T1, T3, 56 kb, X.25, SNA, DECNET), broadband connections (e.g., ATM, Gigabit Ethernet, Ethernet-over-SONET), wireless connections, or some combination of any or all of the above.
[00106] The computing environment 101 can in some embodiments include a server 106 or more than one server 106 configured to provide the functionality of any one of the following server types: a file server; an application server; a web server; a proxy server; an appliance; a network appliance; a gateway; an application gateway; a gateway server; a virtualization server; a deployment server; a SSL VPN server; a firewall; a web server; an application server or a master application server; a server 106 configured to operate as an active directory; a server 106 configured to operate as an application acceleration application that provides firewall functionality, application functionality, or load balancing functionality; or another type of computing machine configured to operate as a server 106. In some embodiments, a server 106 may include a remote authentication dial-in user service such that the server 106 is a RADIUS server.
[00107] The network 104 between the client machine 102 and the server 106 is a connection over which data is transferred between the client machine 102 and the server 106. Although the illustration in Figure 22 depicts a network
104 connecting the client machines 102 to the servers 106, other embodiments include a computing environment 101 with client machines 102 installed on the same network as the servers 106. Other embodiments can include a computing environment 101 with a network 104 that can be any of the following: a local-area network (LAN); a metropolitan area network (MAN); a wide area network (WAN); a primary network comprised of multiple sub-networks located between the client machines 102 and the servers 106; a primary public network with a private sub-network; a primary private network with a public sub-network; or a primary private network with a private sub-network. Still further embodiments include a network 104 that can be any of the following network types: a point to point network; a broadcast network; a telecommunications network; a data communication network; a computer network; an ATM (Asynchronous Transfer Mode) network; a SONET (Synchronous Optical Network) network; a SDH (Synchronous Digital Hierarchy) network; a wireless network; a wireline network; a network 104 that includes a wireless link where the wireless link can be an infrared channel or satellite band; or any other network type able to transfer data from client machines 102 to servers 106 and vice versa to accomplish the methods and systems described herein. Network topology may differ within different embodiments; possible network topologies include: a bus network topology; a star network topology; a ring network topology; a repeater-based network topology; a tiered-star network topology; or any other network topology able to transfer data from client machines 102 to servers 106, and vice versa, to accomplish the methods and systems described herein.
Additional embodiments may include a network 104 of mobile telephone networks that use a protocol to communicate among mobile devices, where the protocol can be any one of the following: LTE; AMPS; TDMA; CDMA; GSM; GPRS; UMTS; or any other protocol able to transmit data among mobile devices to accomplish the systems and methods described herein.
[00108] While preferred embodiments have been described above and illustrated in the accompanying drawings, it will be evident to those skilled in the art that modifications may be made without departing from this disclosure. Such modifications are considered as possible variants comprised in the scope of the disclosure.
Claims
1. A method for navigating levels of granularity of search results displayed on a user interface implemented on a display device, the method comprising:
- receiving a plurality of type 1 search results returned in response to a search query, said type 1 search results having a granularity level that is based on the search query and a type of data that is available at a search engine;
- storing said type 1 search results in a queue based on a relevance-coefficient associated with each search result, said relevance-coefficient indicating a relevance of the search result to the search query;
- displaying a first subset of type 1 search results comprising type 1 search results having the highest relevance-coefficients;
- in response to receiving a user input representing a zoom-in instruction over a selected type 1 search result, displaying a first subset of type 2 search results, said type 2 search results representing a lower-granularity level of the selected type 1 search result;
- receiving a user input representing a user preference of a selected type 2 search result, said user preference indicating a liking or a disliking of the selected type 2 search result;
- adjusting relevance-coefficients associated with type 1 and type 2 search results based on the user preference and a similarity-coefficient between the selected type 2 search result and any related type 1 and/or type 2 search result;
- re-ordering the queue based on the adjusted relevance-coefficients;
- displaying a second subset of type 1 search results upon receiving a user input representing a zoom-out instruction, the second subset of type 1 search results having at least one type 1 search result different than the first subset of type 1 search results.
2. The method of claim 1 further comprising:
- displaying a plurality of type 3 search results upon receiving a user input representing a zoom-in instruction over a given type 2 search result, said type 3 search results representing a lower-granularity level of the given type 2 search result.
3. The method of claim 2 further comprising:
- receiving a user input indicating a user preference on a type 3 search result;
- adjusting relevance-coefficients associated with type 1, type 2, and type 3 search results based on the user preference and a similarity-coefficient between the given type 3 search result and any related type 1, type 2, and type 3 search result;
- re-ordering the queue based on the adjusted relevance-coefficients;
- displaying a second subset of type 2 search results upon receiving a user input representing a zoom-out instruction, the second subset having at least one type 2 search result different than the first subset.
4. The method of claim 1, further comprising displaying a second subset of type 2 search results different than the first subset of type 2 search results after receiving the user input indicating the user preference and before receiving the user input representing the zoom-out instruction.
5. The method of claim 1, wherein the user preference represents a liking of the given type 2 search result, the method further comprising increasing the relevance-coefficient of the liked search result and increasing the relevance-coefficient of a related search result based on the similarity coefficient between the related search result and the liked search result.
6. The method of claim 1, wherein the user preference represents a disliking of the given type 2 search result, the method further comprising decreasing the relevance-coefficient of the disliked search result and decreasing the relevance-coefficient of a related search result based on the similarity coefficient between the related search result and the disliked search result.
7. The method of claim 1, further comprising displaying a visual indicator on the user interface for indicating the granularity level of the results displayed in the user interface.
8. The method of claim 1, further comprising setting the similarity coefficient between the selected type 2 search result and another search result based on common attributes between the selected type 2 search result and the other search result.
9. The method of claim 1, further comprising setting the similarity coefficient between the selected type 2 search result and another search result by monitoring the user's activities and/or time spent in reviewing the selected type 2 search result and the other search result.
10. The method of claim 1, further comprising displaying the selected type 1 search result and the first subset of type 2 search results simultaneously in the user interface.
11. The method of claim 1, wherein:
- the user interface is displayed on a touch-sensitive device;
- the user input representing a zoom-in instruction includes a reverse pinch gesture; and
- the user input representing a zoom-out instruction includes a pinch gesture.
12. The method of claim 1, wherein the user preference indicates one of liking or disliking the selected search result and the user interface comprises a first edge having a "like" function associated therewith and a second edge having a "dislike" function associated therewith, wherein receiving the user input indicating
the user preference comprises detecting a dragging of the selected search result in proximity or in contact with the first edge or the second edge.
13. A computing device for navigating search results, the computing device comprising:
- a processor;
- a memory having recorded thereon one or more programs which when executed by the processor cause the system to display a user interface on a display device operably connected to the computing device for displaying search results in a display area of the user interface; wherein the computing device is configured to:
o receive a plurality of search results returned by a search engine in response to a search query, and store the plurality of search results in a queue based on a relevance coefficient associated with each search result;
o display a first set of search results in the display area, said first set including search results having the highest relevance coefficients;
o upon receiving a user input representing a zoom-in instruction over a selected search result, display a new set of search results representing a lower granularity of the selected search result;
o upon detecting a user input liking or disliking a given lower granularity search result, adjust a relevance coefficient associated with each search result that is related to the given search result at each granularity level based on a similarity coefficient between the liked/disliked search result and the related search result;
o re-order the queue based on the adjusted relevance coefficients;
o upon receiving a user input representing a zoom-out instruction, display a second set of search results different than the first set.
14. The computing device of claim 13, wherein a like/dislike function is associated with an edge of the interface, said function being activated on the given search result when the given search result is dragged in proximity or in contact with the edge.
15. The computing device of claim 13, wherein the computing device sets the similarity coefficient based on common attributes between the liked/disliked search result and the related search result.
16. The computing device of claim 13, wherein the computing device sets the similarity coefficient by monitoring the user's activities and/or time spent in reviewing the liked/disliked search result and the related search result.
17. The computing device of claim 13, wherein the user interface comprises a first edge having a "like" function associated therewith and a second edge having a "dislike" function associated therewith, wherein receiving the user input liking or disliking a given search result comprises detecting a dragging of the given search result in proximity or in contact with the first edge or the second edge.
18. The computing device of claim 17, wherein at least one of the first edge and the second edge comprises a first end defining a minimum degree of liking/disliking and a second end defining a maximum degree of liking/disliking, wherein a degree of liking/disliking associated with a dragged search result is based on a position of an intersection point between the dragging trajectory and the edge, with respect to the first end and the second end.
19. The computing device of claim 13, wherein the computing device displays a visual indicator on the interface, said visual indicator showing all available granularity levels and indicating the granularity level of the search results currently displayed in the interface.
20. A method for navigating levels of granularity of search results, the method comprising:
displaying a first subset of type 1 search results returned in response to a search query on a display of a portable computing device, the type 1 search results having a granularity level that is defined by a search engine for the search query;
- in response to detecting a zoom-in instruction over a selected type 1 search result, displaying a first subset of type n search results representing a lower-granularity level of the selected type 1 search result;
- receiving a user input liking or disliking a selected type n search result;
- in response to the liking or disliking of the selected type n search result, adjusting a relevance-coefficient associated with each search result that is related to the selected type n search result based on a similarity coefficient between a related search result and the selected type n search result;
- in response to receiving a user input representing a zoom-out instruction, displaying a second subset of type 1 search results including at least one type 1 search result which is different from the first subset of type 1 search results.
21. The method of claim 20, wherein the zoom-in instruction represents a pinch gesture, the method further comprising setting the granularity level of the type n search results based on a width of the pinch gesture.
22. The method of claim 20 further comprising, receiving the user input using one or more of: a touch sensitive display, a motion detector, a vision based detector, a keyboard, and a pointing device.
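The edge-drag "degree of liking" mechanism of claim 18 can be sketched as below. The linear mapping along one screen axis and the clamping behaviour are assumptions; the claim only requires that the degree depend on where the drag trajectory crosses the edge relative to its two ends.

```python
def degree_of_preference(intersection, first_end, second_end):
    """Map the coordinate where the drag trajectory intersects the like/dislike
    edge to a degree in [0, 1]: crossing at the first end yields the minimum
    degree, at the second end the maximum (a linear sketch of claim 18)."""
    span = second_end - first_end
    if span == 0:
        return 0.0
    t = (intersection - first_end) / span
    return max(0.0, min(1.0, t))  # clamp drags that cross beyond either end
```

So a result dragged across the midpoint of a "like" edge running from pixel 0 to pixel 100 would register a 0.5 degree of liking, while a drag past the far end clamps to the maximum.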
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201261678809P | 2012-08-02 | 2012-08-02 | |
US61/678,809 | 2012-08-02 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2014019077A1 (en) | 2014-02-06 |
Family
ID=50027025
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CA2013/000693 WO2014019077A1 (en) | 2012-08-02 | 2013-08-02 | Taste-based navigation at multiple levels of granularity |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2014019077A1 (en) |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6567812B1 (en) * | 2000-09-27 | 2003-05-20 | Siemens Aktiengesellschaft | Management of query result complexity using weighted criteria for hierarchical data structuring |
US20080098311A1 (en) * | 2006-07-28 | 2008-04-24 | Guillaume Delarue | Method and System for Navigating in a Database of a Computer System |
US20080222540A1 (en) * | 2007-03-05 | 2008-09-11 | Apple Inc. | Animating thrown data objects in a project environment |
US20090204599A1 (en) * | 2008-02-13 | 2009-08-13 | Microsoft Corporation | Using related users data to enhance web search |
2013
- 2013-08-02: WO PCT/CA2013/000693 filed as WO2014019077A1 (active, Application Filing)
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11543958B2 (en) | Control of search results with multipoint pinch gestures | |
US10394420B2 (en) | Computer-implemented method of generating a content recommendation interface | |
US11720633B2 (en) | Aggregating personalized suggestions from multiple sources | |
US9852227B2 (en) | Control for persistent search results and iterative searching | |
US20130167059A1 (en) | User interface for displaying and refining search results | |
US7941429B2 (en) | Interface for visually searching and navigating objects | |
US8280901B2 (en) | Method and system for displaying search results | |
AU2012304710B2 (en) | Facilitating interaction with system level search user interface | |
US20170068739A1 (en) | Queryless search based on context | |
US8863014B2 (en) | User interface for product comparison | |
US9699490B1 (en) | Adaptive filtering to adjust automated selection of content using weightings based on contextual parameters of a browsing session | |
US9953011B1 (en) | Dynamically paginated user interface | |
US9285958B1 (en) | Browser interface for accessing predictive content | |
JP2018538643A (en) | Mobile user interface | |
EP2507728A1 (en) | Method and apparatus for providing media content searching capabilities | |
US9310974B1 (en) | Zoom-based interface navigation | |
WO2023207490A1 (en) | Information display method and apparatus, device and storage medium | |
WO2014019077A1 (en) | Taste-based navigation at multiple levels of granularity | |
US20140047003A1 (en) | Processing data entities associated to descriptors | |
JP2023008982A (en) | Generating action elements suggesting content for ongoing tasks | |
WO2021105994A1 (en) | A digital content selection and management method | |
TWI427495B (en) | Operation platform system, operation method and host apparatus | |
AU2015275297A1 (en) | Multipoint pinch gesture control of search results | |
MX2008004831A (en) | Simultaneously spawning multiple searches across multiple providers |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 13825836 Country of ref document: EP Kind code of ref document: A1 |
NENP | Non-entry into the national phase |
Ref country code: DE |
32PN | Ep: public notification in the ep bulletin as address of the addressee cannot be established |
Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205N DATED 10/04/2015) |
122 | Ep: pct application non-entry in european phase |
Ref document number: 13825836 Country of ref document: EP Kind code of ref document: A1 |