WO2013090946A1 - Systems and methods involving features of search and/or search integration - Google Patents
- Publication number
- WO2013090946A1 (PCT Application No. PCT/US2012/070214)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- search results
- user
- search
- multimedia presentation
- results page
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/90—Details of database functions independent of the retrieved data types
- G06F16/95—Retrieval from the web
- G06F16/953—Querying, e.g. by the use of web search engines
- G06F16/9538—Presentation of query results
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/0482—Interaction with lists of selectable items, e.g. menus
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/1633—Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
- G06F1/1684—Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
- G06F1/1694—Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675 the I/O peripheral being a single or a set of motion sensors for pointer control or gesture input obtained by sensing movements of the portable computer
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/90—Details of database functions independent of the retrieved data types
- G06F16/95—Retrieval from the web
- G06F16/951—Indexing; Web crawling techniques
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2200/00—Indexing scheme relating to G06F1/04 - G06F1/32
- G06F2200/16—Indexing scheme relating to G06F1/16 - G06F1/18
- G06F2200/161—Indexing scheme relating to constructional details of the monitor
- G06F2200/1614—Image rotation following screen orientation, e.g. switching from landscape to portrait mode
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2200/00—Indexing scheme relating to G06F1/04 - G06F1/32
- G06F2200/16—Indexing scheme relating to G06F1/16 - G06F1/18
- G06F2200/163—Indexing scheme relating to constructional details of the computer
- G06F2200/1637—Sensing arrangement for detection of housing movement or orientation, e.g. for controlling scrolling or cursor movement on the display of an handheld computer
Definitions
- the typical search engine result is augmented or even replaced by a mixed-media module 601 that enhances the results.
- Navigation to a desired result, e.g., a selected mixed-media module, may be an expansion inside the normal search results into a larger display.
- a 'new search' button, icon or functionality may be included within mixed-media modules, e.g., a magnifying glass icon 603. This may be configured to allow for further searching or re-querying within the mixed-media module.
- a media/asset loading bar 605 may also be included, allowing for audio and/or video to play in the mixed-media module or in another window.
- the mixed-media module may also include one or more hyperlinks 610 to other web pages.
- An expander button 615 may also be included to allow for the mixed-media information in the module to be displayed in a full screen format.
- systems and methods herein may involve methods of processing search information comprising a computer server configured to communicate with at least one search engine web crawler.
- Exemplary methods also may include the computer server configured to receive the search engine web crawler results from at least a first query, and to generate search results for display in a browser window based on the first query.
- Methods may also include embodiments involving provision of search results that include at least a customizable caption, various multimedia content, and at least one hyperlink configured to cause a re-query of the search engine web crawler.
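- As a rough sketch of the server-side assembly described above (the ResultEntry structure, its field names, and the "/search" endpoint are assumptions made for illustration, not the actual implementation), each result might be paired with its customizable caption, multimedia content, and a re-query hyperlink:

```python
# Hypothetical sketch: assembling a results-page entry that carries a
# publisher-customizable caption, optional multimedia content, and a
# hyperlink that triggers a re-query of the search engine.
from dataclasses import dataclass, field
from typing import List, Optional
from urllib.parse import urlencode

@dataclass
class ResultEntry:
    url: str
    caption: str                                      # customizable by the publisher
    media: List[str] = field(default_factory=list)    # images, audio, video, etc.
    requery_link: Optional[str] = None                # deeper search within this result

def build_entry(url: str, caption: str, media: List[str], topic: str) -> ResultEntry:
    # The re-query link simply re-runs the engine with a narrower query;
    # the "/search" endpoint and "q" parameter are illustrative placeholders.
    requery = "/search?" + urlencode({"q": topic})
    return ResultEntry(url=url, caption=caption, media=media, requery_link=requery)

entry = build_entry(
    url="https://example.com/tokyo",
    caption="Watch the Qwiki, Tokyo",
    media=["tokyo_skyline.jpg", "tokyo_tour.mp4"],
    topic="Tokyo travel",
)
print(entry.requery_link)   # -> /search?q=Tokyo+travel
```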
- a mixed-media module integrated SERP also improves the usefulness of search.
- such interactive component has a higher density of information than the prior art, which proves to be more valuable to the end user, online content providers, and the search engines.
- the search engine crawlers can detect certain mixed-media modules, such as via detection of metadata associated with Qwiki™ modules, and embed them in a search results page (SERP). Further, implementations herein may utilize the mixed-media module as an interactive and playable caption, 605.
- the mixed-media module may expand within the page as shown in FIG. 6 at 601 and can offer the user a variety of options to explore related content: triggering new search queries, 603; media/asset loading, 605; links to related pages, 610; and playback options, 615. Further, component video or audio files may be played within the mixed-media module on the SERP, without need for loading an external page.
- implementations provide functionality to display associated content on the same SERP by instantly modifying it or the contents around it.
- this new integrated content may be displayed without triggering new tabs.
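- A minimal sketch of the crawler-side detection mentioned above; the meta tag names, the MixedMediaModule record, and the sample publisher markup are hypothetical illustrations, not an actual Qwiki or search engine API:

```python
# Hypothetical sketch: a crawler pass that looks for publisher-supplied
# mixed-media module markup and records it with the indexed page, so the
# results page can later embed a playable caption instead of plain text.
from dataclasses import dataclass
from html.parser import HTMLParser
from typing import Dict, Optional

@dataclass
class MixedMediaModule:              # assumed record shape, for illustration only
    title: str
    manifest_url: str                # where the module's assets/timeline live

class ModuleMetaParser(HTMLParser):
    """Collects hypothetical <meta name="mixed-media-module:..."> tags."""
    def __init__(self) -> None:
        super().__init__()
        self.fields: Dict[str, str] = {}

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        attrs = dict(attrs)
        name, content = attrs.get("name", ""), attrs.get("content", "")
        if name and name.startswith("mixed-media-module:"):
            self.fields[name.split(":", 1)[1]] = content

def detect_module(page_html: str) -> Optional[MixedMediaModule]:
    parser = ModuleMetaParser()
    parser.feed(page_html)
    if "title" in parser.fields and "manifest" in parser.fields:
        return MixedMediaModule(parser.fields["title"], parser.fields["manifest"])
    return None   # no module markup: fall back to an ordinary text caption

# Example publisher markup with invented tag names:
sample_page = """
<html><head>
  <meta name="mixed-media-module:title" content="Watch the Qwiki, Tokyo">
  <meta name="mixed-media-module:manifest" content="https://example.com/tokyo.json">
</head><body>...</body></html>
"""
print(detect_module(sample_page))
```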
- systems and methods herein may provide a controllable interactive media component within a search results page. For the first time, then, implementations herein involving the mixed-media modules allow the publisher of a site control over its brand and its content as it appears on the search results page within the mixed-media module.
- search engine results are clustered in a way that isn't helpful and can be overwhelming. Users get results that don't help with a decision because they are unrelated to what the user actually needs. The limited text in a caption often doesn't reveal enough information. As a result, the user must select links, search that site and, if it is not the desired result, back up to the original search results or begin a new search from scratch. It is time consuming, awkward, and easy for a user to get lost.
- a search result enhanced via present mixed-media module(s) implementations may also involve innovations associated with second or follow-up queries, referred to herein as "re-query.”
- FIG 7 is a diagram illustrating an example search engine results page associated with a re-query, consistent with aspects of the innovations herein.
- a re-query allows a search engine user to refine their search results without losing the original search. Clicking on a hyperlink within the mixed-media module allows the user to "re-query" the search engine and dig deeper into a subject by searching the mixed-media/interactive components within a module.
- Implementations are also configured such that this opens a new window without closing the original one, thereby reducing the need to constantly hit the "back" button in order to return to the original results. This enables the ability to search, and then re-search, specific details of interest within a search result without getting distracted or lost.
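- A small sketch of this window-preserving re-query behavior is shown below; run_search and the SearchSession class are illustrative placeholders rather than real components:

```python
# Hypothetical sketch: a hyperlink click inside a mixed-media module runs a
# refined query and keeps the refined results alongside the original ones,
# so the user never needs the "back" button to recover the first search.
from typing import Dict, List

def run_search(query: str) -> List[str]:
    # Placeholder for the real engine; returns fake result URLs.
    return [f"https://results.example/{query.replace(' ', '-')}/{i}" for i in range(3)]

class SearchSession:
    def __init__(self, query: str) -> None:
        self.windows: Dict[str, List[str]] = {query: run_search(query)}

    def requery(self, original_query: str, refinement: str) -> List[str]:
        # Open the refined results in a new "window" while keeping the original.
        refined = f"{original_query} {refinement}"
        self.windows[refined] = run_search(refined)   # original results stay put
        return self.windows[refined]

session = SearchSession("Tokyo")
session.requery("Tokyo", "cherry blossom season")
print(list(session.windows))   # -> ['Tokyo', 'Tokyo cherry blossom season']
```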
- a method of processing search information comprising processing information to return, to a user, search results via a search engine, in a results page.
- This example method could also include where the results page includes at least one pre-prepared, non-rendered narrative multimedia presentation, such as a mixed-media module.
- the example method could include providing, for display via a search results page, at least one interactive multimedia presentation selectable by the user. Also, providing, as a function of a first interaction of the user with a selected multimedia presentation, user access to at least one of third party information, web sites, content, applications and/or other multimedia.
- the example method could also include providing, as a function of a second interaction of the user with the selected multimedia presentation, functionality configured to receive a new search query and generate a new search results page.
- Various "re-query" implementations also allow users to stay on a search page and refine their searches in new windows without losing the original search or getting lost. This is more efficient for users and less frustrating, as they are more likely to find their desired results.
- Systems and methods herein may be configured to refine a SERP via such functionality, allowing for high information density.
- the re-query can show a selected caption with images 705. It can also show video or animation 710.
- specific concepts may even be suggested for further re-query 715.
- Video is becoming more prevalent online because publishers don't want to present text-only sites and there is a desire to differentiate/supplement search placement; however, traditional streaming video is time-consuming to create and view. Video content is also highly compressed on mobile devices resulting in poor streaming and picture quality. Video is also hard to interact with because there is no standard, universal layer for interactivity. For the most part, video is a stand-alone experience because of the lack of an interactive layer.
- a method of processing search information comprising a computer server configured to communicate with at least one search engine web crawler.
- the example method could also have the computer server configured to interact with the search engine web crawler search results by causing display of the search results.
- the example method may include wherein the search results include interactive multimedia content, e.g., one or more mixed-media modules, and/or associated content such as at least one hyperlink, etc.
- mixed-media module integrated implementations can incorporate interactive images, text and other types of media. Further, because such implementations operate without large-bandwidth video transmissions (especially rendered video content) for the audiovisual/multimedia experience, systems and methods herein provide an expanded interactive search with other mixed media, thus allowing for quicker loads and less bandwidth consumption during use.
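- One way to read "non-rendered" here is that the module ships as a small manifest of discrete, individually loadable assets rather than a single rendered video stream; the sketch below uses entirely assumed field names and asset sizes:

```python
# Hypothetical sketch: a mixed-media module described as a lightweight JSON
# manifest of discrete assets, loaded on demand, instead of one pre-rendered
# video file. All sizes and field names are illustrative only.
import json

module_manifest = {
    "title": "Tokyo",
    "scenes": [
        {"kind": "image", "src": "skyline.jpg", "kb": 120, "narration": "intro.mp3"},
        {"kind": "text",  "body": "Tokyo is the capital of Japan...", "kb": 2},
        {"kind": "video", "src": "street_clip.mp4", "kb": 900, "lazy": True},
    ],
    "links": ["https://en.wikipedia.org/wiki/Tokyo"],
}

# Only eagerly loaded assets count toward the initial page weight.
initial_kb = sum(s["kb"] for s in module_manifest["scenes"] if not s.get("lazy"))
rendered_video_kb = 4000   # assumed size of an equivalent fully rendered video

print(json.dumps(module_manifest["scenes"][0], indent=2))
print(f"initial load: {initial_kb} KB vs rendered video: {rendered_video_kb} KB")
```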
- FIG. 8 is an example showing illustrative ad placement 801 features, consistent with aspects of the innovations herein.
- the integration of mixed-media module interactive summaries into a SERP creates additional advertising monetization units. These units can be presented as interactive captions on the CPC/PPC (Cost Per Click/Pay Per Click) advertisements that traditionally are placed alongside search results, or the CPC/PPC ads (and other promotional units) can be placed within the mixed-media module itself, as shown in FIG 8.
- the interactive summary can be presented as a caption on the CPC advertisements that are traditionally placed alongside organic search results 801.
- the CPC ads can be placed within the multimedia presentation or mixed-media module 805 itself.
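- A toy sketch of placing a CPC ad unit inside the module's content sequence (805) rather than only alongside the organic results (801); the data shapes here are invented for illustration:

```python
# Hypothetical sketch: splice a CPC ad unit into the module's scene list so
# the ad is presented within the mixed-media module itself.
from typing import Dict, List

def insert_ad(scenes: List[Dict], ad: Dict, position: int = 1) -> List[Dict]:
    # Return a copy of the module's scene list with the CPC ad spliced in.
    return scenes[:position] + [dict(ad, kind="cpc_ad")] + scenes[position:]

scenes = [{"kind": "image", "src": "skyline.jpg"}, {"kind": "text", "body": "..."}]
cpc_ad = {"advertiser": "Example Hotels", "click_url": "https://ads.example/click?id=123"}
print(insert_ad(scenes, cpc_ad))
```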
- FIG 8 may give the appearance that the CPC ad is loading within the Wikipedia result.
- implementations may include the CPC ad displaying its own mixed-media module. Loading the CPC ad into the Wikipedia mixed-media module is a different embodiment from such implementations.
- a method of processing search information comprising returning search results in a search results page including one or more pre-prepared narrative multimedia presentations.
- the example method could also include providing at least one integrated multimedia presentation selected by a user, and also providing access to at least one of additional third party information, sites, content, applications and other multimedia.
- the example method could include wherein the multimedia presentations are configured in association with other features for low-bandwidth (e.g., non-rendered, etc.) display for use on a mobile device.
- streaming and picture quality can be easily optimized for specific mobile devices. Further, such systems and methods herein may include features and implementations involving interactive and coordinated hyperlinks for deeper exploration of the content within the video— this feature of coordinating links/content inside of the mixed-media module interactive summary allows new attribution and monetization capabilities by content creators and search engines utilizing the underlying model(s).
- a “mobile device” can be any kind of smartphone, tablet computer, laptop, notebook, or any kind of similar device. These devices are typically touch screen enabled and retain internet connectivity through either a shorter range radio such as those used in WiFi technologies, or through cellular telephone connections, or both. The device may connect to the internet in any fashion.
- FIG. 9 depicts an illustrative SERP with mixed-media module implementation formatted for a mobile smartphone or tablet computer, consistent with aspects of the innovations herein.
- an illustrative "Play Qwiki module” icon is shown directly beneath the first search result in the search result screen.
- FIG. 10 is an illustration of a search engine results page with the integration of touch-enabled functionality consistent with aspects of the innovations herein.
- a user is shown tapping the "Play Qwiki module” icon using a finger.
- Touch-enabled screens allow such interaction with a stylus or other such device as well, and such features may also be navigated with various cursor-based functionality.
- FIG. 11 is an illustration of exemplary mobile device display and functionality consistent with aspects of the innovations herein.
- the mobile smartphone may be rotated to initiate a specified function associated with the SERP or just to allow for a landscape display instead of a portrait display.
- FIG 12 is an exemplary screenshot illustrating mobile device display of a search results content such as a mixed-media module consistent with certain aspects related to the innovations herein.
- FIG 13 is an exemplary screenshot of an illustrative mobile device display showing user interaction with a mixed-media module from the search results consistent with certain aspects related to the innovations herein.
- Figure 13 shows a user interacting with a portion of the mixed-media module, here tapping the particular media or object with respect to which additional content (details, information, etc) or further functionality ("re-query", etc) is desired.
- the search engine may be configured to interoperate with such action in a variety of ways.
- FIG 14 is an exemplary screenshot of a mobile device display showing an illustrative result of a user interaction consistent with certain aspects related to the innovations herein.
- this example shows an illustrative re-direct associated with the tapped object to a particular web page.
- the result shows a multimedia text and image or video within the web page.
- an illustrative multimedia presentation herein may be configured as an interactive system of mixed-media/interactive content with clickable components.
- Various mixed-media modules here, may also provide a visual confirmation of search results which means less frustration and more productivity for the user.
- These mixed-media modules may also provide visual relevancy - the multimedia nature of such interactive component provides more in-depth detail of a topic than text alone.
- pages with multi-media components are often ranked higher in search engine results.
- systems and methods herein provide ways for content creators to provide interactive multi-media content and, in some implementations, improve their search engine ranking through increased metadata information.
- the visual nature of embodiments herein also means that such result would not have to be ranked at the very top of an SERP to catch the attention of a search engine user since visual images are more efficiently scanned than text.
- better search results will mean greater return on investment. Online ads will be viewed within a more appropriate context and, therefore, more likely to target the right consumers. Interactions with the associated mixed-media modules can also provide additional data to rank pages.
- mixed-media module interactive summaries as integrated herein are lightweight - they use less bandwidth than pure video and are a rich, interactive, multi-media experience. Viewing such mixed-media modules is faster and easier than video alone because they are interactive and have more discrete sets of contents that can easily be traversed, beyond the simple play bar associated with most traditional video.
- Mixed-media modules herein also contain more information (meta-data) than video because of their multitude of components (mixed media), their interactive nature, and the ability to re-query.
- another way that implementations herein are an improvement over the traditional search experience, especially over online video, is that the end user does not experience the mixed-media module in a linear fashion.
- a user can readily jump to different collections of media once a quick scan assures them the preset set of options will not yield the desired results.
- the user can also choose their path through the content by clicking on hyperlinks (meta-data) within the mixed-media module. This allows the end-user to explore the information that is of the most interest to them, in greater detail and in their preferred format (i.e. text, photos, or video).
- Innovations herein also work across multiple platforms.
- mixed-media module interactive components herein can run inside a standard web browser and their player software can be integrated into mobile devices, TV devices, video game units, etc. Further, such mixed-media module(s) may be configured as a universal component across all media and devices.
- mixed-media modules herein can act as an "interactive summary/caption" which highlights the curated content from a search result and presents it in narrative form.
- users may "preview” the contents of the search in an engaging, interactive experience on multiple devices.
- an interaction a user may have with the mixed-media module is via "Gestures", such as set forth in connection with FIGs. 15-19. These Gestures may include various touch-screen enabled interactions whereby a user is able to tap, pinch, tap and hold, and swipe or scroll the mixed-media module.
- search engines, servers and/or intermediaries may be configured to respond to or interact in accordance with these Gestures in different ways, such as the examples as described in the Figures and associated descriptions herein.
- some implementations herein include methods wherein the interactive multimedia content is configured to allow a new search query and generate a new search results page.
- FIG 15 shows an example Gesture consistent with aspects of the innovations herein.
- systems and methods herein may be configured to respond to a user tap or click of an object in the grid or in the feed to open another mixed-media module, webpage, video, or detailed animation in an overlay over the current screen.
- some embodiments include methods wherein the interaction includes a tap of a portion, button or link of the selected multimedia presentation used in the generation of the new search results page.
- FIG 16 shows another example Gesture consistent with aspects of the innovations herein.
- a user can pinch into an object in the grid to see detailed or related information on the object, including its source and related media, or to access interactive animations, view full video, read the full article, and the like.
- some embodiments include methods wherein interactions include a pinch of a portion, button or link of the selected multimedia presentation used in the generation of the new search results page.
- FIG 17 shows another example Gesture consistent with aspects of the innovations herein.
- systems and methods herein may be configured such that a user can tap or click and hold on an element in the grid or in the feed to provide various or additional options.
- options may include, though are not limited to, open now, queue for later, add to favorites, etc.
- some embodiments include methods wherein interactions include a tap and hold of a portion, button or link of the selected multimedia presentation used in the generation of the new search results page.
- FIG 18 shows another example Gesture consistent with aspects of the innovations herein.
- a user can swipe or scroll with one finger left or right over the grid to advance or rewind the presentation of the mixed-media.
- some embodiments include methods wherein interactions include a swipe or scroll of a portion, button or link of the selected multimedia presentation used in the generation of the new search results page.
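- Taken together, the gestures of FIGs. 15-18 might map to module actions roughly as in the following sketch; gesture names and action strings are illustrative only, not an actual interface definition:

```python
# Hypothetical sketch mapping touch gestures to mixed-media module actions.
def handle_gesture(gesture: str, target: str) -> str:
    actions = {
        "tap":          f"open overlay for {target}",                              # FIG. 15
        "pinch":        f"show detail and related media for {target}",             # FIG. 16
        "tap_and_hold": f"show options (open now, queue, favorite) for {target}",  # FIG. 17
        "swipe":        "advance or rewind the mixed-media presentation",          # FIG. 18
    }
    return actions.get(gesture, "ignore")

print(handle_gesture("pinch", "grid item 3"))   # -> show detail and related media for grid item 3
```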
- FIG 19 shows another example of an illustrative interface involving a sample search result and mixed-media module presentation, consistent with aspects of the innovations herein.
- the mixed-media module may be presented as the foremost piece of content, such as the first item to select in the upper-left portion of the search result.
- Such placement yields easy user access to greater content in the mixed-media module, and all of the
- the mobile wireless devices can be touch screen enabled, using a stylus or finger or other such thing to interact with the screen, and objects on the screen.
- the touch screen enabled technologies also allow for pinching in or out to zoom in or out or enlarge or shrink an object or the display. Sliding a touch can scroll either in vertical or horizontal directions, or any other direction supported by the system.
- the touch screens can also detect a prolonged tap, opening further functionality when a prolonged tap and hold occurs on an object.
- Such functionality can be accomplished by a cursor or pointer of some sort, typically controlled by a mouse, pointer stick, roller ball, etc.
- There may be additional functionality embedded into the display objects to allow for some of the functionality such as a scroll bar or zoom buttons, etc.
- each module can be implemented as a software program stored on a tangible memory (e.g., random access memory, read only memory, CD-ROM memory, hard disk drive) within or associated with the computing elements, sensors, receivers, etc. disclosed above, e.g., to be read by a processing unit to implement the functions of the innovations herein.
- the modules can comprise programming instructions transmitted to a general purpose computer or to processing hardware via a transmission carrier wave.
- the modules can be implemented as hardware logic circuitry implementing the functions encompassed by the innovations herein.
- modules can be implemented using special purpose instructions (e.g., SIMD instructions), field programmable logic arrays, or any mix thereof which provides the desired level of performance and cost.
- implementations and features of the invention may be implemented through computer-hardware, software and/or firmware.
- the systems and methods disclosed herein may be embodied in various forms including, for example, a data processor, such as a computer that also includes a database, digital electronic circuitry, firmware, software, or in combinations of them.
- components such as software, systems and methods consistent with the innovations herein may be implemented with any combination of hardware, software and/or firmware.
- the above-noted features and other aspects and principles of the innovations herein may be implemented in various environments.
- Such environments and related applications may be specially constructed for performing the various processes and operations according to the invention or they may include a general-purpose computer or computing platform selectively activated or reconfigured by code to provide the necessary functionality.
- the processes disclosed herein are not inherently related to any particular computer, network, architecture, environment, or other apparatus, and may be implemented by a suitable combination of hardware, software, and/or firmware.
- various general-purpose machines may be used with programs written in accordance with teachings of the invention, or it may be more convenient to construct a specialized apparatus or system to perform the required methods and techniques.
- aspects of the method and system described herein may be implemented as functionality programmed into any of a variety of circuitry, including programmable logic devices (“PLDs”), such as field programmable gate arrays (“FPGAs”), programmable array logic (“PAL”) devices, electrically programmable logic and memory devices and standard cell-based devices, as well as application specific integrated circuits.
- Some other possibilities for implementing aspects include: memory devices, microcontrollers with memory (such as EEPROM), embedded microprocessors, firmware, software, etc.
- aspects may be embodied in microprocessors having software-based circuit emulation, discrete logic (sequential and combinatorial), custom devices, fuzzy (neural) logic, quantum devices, and hybrids of any of the above device types.
- the underlying device technologies may be provided in a variety of component types, e.g., metal-oxide semiconductor field-effect transistor (MOSFET) technologies such as complementary metal-oxide semiconductor (CMOS), emitter-coupled logic (ECL), polymer technologies (e.g., silicon-conjugated polymer and metal-conjugated polymer-metal structures), mixed analog and digital, and so on.
- Computer-readable media in which such formatted data and/or instructions may be embodied include, but are not limited to, non-volatile storage media in various forms (e.g., optical, magnetic or semiconductor storage media), though do not include non-tangible media.
- the words “comprise,” “comprising,” and the like are to be construed in an inclusive sense as opposed to an exclusive or exhaustive sense; that is to say, in a sense of "including, but not limited to.” Words using the singular or plural number also include the plural or singular number respectively.
- the words “herein,” “hereunder,” “above,” “below,” and words of similar import refer to this application as a whole and not to any particular portions of this application.
Abstract
Systems and methods are disclosed for performing processing involved with search, such as processing search information to return search results. In one exemplary implementation, there is provided a method for processing information to return search results including mixed-media presentation(s) selectable by a user. Moreover, such method may involve user interaction to manipulate the presentation, display various media and/or effect other functionality. Further implementations may involve generation of interactive, visually rich mixed-media content of high information density providing improved user experience and/or improved value to various participants.
Description
SYSTEMS AND METHODS INVOLVING FEATURES OF SEARCH
AND/OR SEARCH INTEGRATION
CROSS-REFERENCE TO RELATED APPLICATIONS
This application claims benefit/priority of U.S. provisional patent application No. 61/576,352, filed 15 December 2011, which is incorporated herein by reference in its entirety.
BACKGROUND
Field:
Aspects of the present innovations relate to computer networking searches, and, more particularly, to associated systems and methods, such as processing search information, providing interactive search results, and search integration.
Description of Related Information:
The web has evolved into a rich, multi-media experience, but the process of searching online and its associated drawbacks have changed little in the last fifteen years. Search is still primarily text based (captions) with only small thumbnail images (or previews) appearing as a visual search result. Text captions are machine generated and are not a rich or efficient user experience. Also, humans process visual information much faster than they process text, but there is limited visual information in search results. Search engines have tried to remedy this problem by providing "live previews" of the source web pages and presenting them in text and graphical form. Unfortunately, this process is expensive, storage heavy and adds little value for the end user. Further, Internet searches often result in lists of hyperlinks that are not very informative to the searching user.
For example, FIGs. 1 and 2 show exemplary screenshots of prior art search result pages. These prior art examples show how, generally, when an end user performs an Internet search, the search engine produces a search results page (also called a "SERP"). The prior art, as shown in FIGs. 1 and 2, contains lists of results with hyperlinks and a sentence or two about each result, 101 and 201. That text, 101, 201, is machine-selected by proprietary algorithms unique to each search engine— as opposed to being curated by humans— and is sometimes random and not an adequate description of the linked page. As such, there is no end-user control of the displayed text.
The selected text is called a "caption" as shown in FIG. 1 at 101, and FIG. 2 at 201. Captions were first used when there was no rich media on the web and, therefore, were only text-based.
Because of this legacy architecture, search results are mostly text-based captions as shown in FIGs. 1 and 2, and the way users consume this media is in a limited format— meaning that they can only view search results as one form of media at any given time, such as limited to just video, or just text. Continuing with FIGs. 1 and 2, the prior art presented results as text, still images or video. There is not a great deal of context to the captions in search results, and the presentation of those results is different for every search engine, as each search engine has its own proprietary search algorithms. In order to refine a search in the prior art systems, one must start a search over or hit the "back" button to return to earlier results. Further, searches from mobile devices only compound problems in the prior art, with limited screen real estate, proprietary operating systems, limited bandwidth, and a variety of interfaces such as touch, voice, and keyboards, both on-screen and physical.
FIGs. 3 and 4 are illustrations of exemplary prior art web page previews. FIGs. 3 and 4 show that even when an entire page is presented as a live preview, 301, 401— as it is with an example company SERP, there is not much value added to the user's search. The information is densely packed and the graphics are too small to be useful. Only the general layout of the page is discernible, which does little in terms of adding content or context.
Another problem is that search engine results are often inaccurate and imperfect. Text captions do not always accurately represent the content on a site because they lack context and richness. As a result, a search may not be efficient. Users often waste time uncovering the actual context of individual search results.
Currently, companies or website publishers do not have control over how their caption(s) appear within a SERP. The captions are algorithmically machine generated and cannot be curated by the owner of a site. In sum, there is a need for systems and methods that address the above drawbacks and/or provide other beneficial functionality or advantages to parties involved with search.
SUMMARY
Systems and methods consistent with the present innovations are directed to implementations such as processing search information, providing interactive search results, and search integration, among others. According to some implementations, systems and methods herein may allow for search results of an improved nature, such as results that are interactive, expanded, deeper and/or richer as a function of mixed-media components, as well as improved user experience and/or improved value to various participants, among other benefits.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the inventions, as described. Further features and/or variations may be provided in addition to those set forth herein. For example, the present inventions may be directed to various combinations and subcombinations of the disclosed features and/or combinations and subcombinations of several further features disclosed below in the detailed description.
DESCRIPTION OF THE DRAWINGS
The accompanying drawings, which constitute a part of this specification, illustrate various implementations and features of the present inventions and, together with the description, explain aspects of innovations herein. In the drawings:
FIG. 1A is an exemplary screenshot of a prior art search result page.
FIG. 1B is a block diagram of FIG. 1A.
FIG. 2A is an exemplary screenshot of a prior art search result page.
FIG. 2B is a block diagram of FIG. 2A.
FIG. 3A is an illustration of exemplary prior art web page previews.
FIG. 3B is a block diagram of FIG. 3A.
FIG. 4A is an illustration of exemplary prior art web page previews.
FIG. 4B is a block diagram of FIG. 4A.
FIG. 5A is an illustration of a search engine results page with integration features consistent with certain aspects of the innovations herein.
FIG. 5B is a block diagram of FIG. 5A consistent with certain aspects of the innovations herein.
FIG. 6A is an illustration of a live preview showing an example search engine results page consistent with certain aspects of the innovations herein.
FIG. 6B is a block diagram of FIG. 6A consistent with certain aspects of the innovations herein.
FIG 7A is a diagram illustrating an example search engine results page from a re-query consistent with certain aspects related to the innovations herein.
FIG. 7B is a block diagram of FIG. 7A consistent with certain aspects of the innovations herein.
FIG. 8A is an example showing ad placement in an implementation consistent with certain aspects related to the innovations herein.
FIG. 8B is a block diagram of FIG. 8A consistent with certain aspects of the innovations herein.
FIG. 9 is an exemplary screenshot showing an illustrative mobile device display including a search engine results page with integrated mixed-media component consistent with certain aspects related to the innovations herein.
FIG. 10 is an illustration of an exemplary search engine results page showing user action with a mobile device display search results page consistent with certain aspects related to the innovations herein.
FIG. 11 is an exemplary screenshot illustrating further mobile device display functionality consistent with certain aspects related to the innovations herein.
FIG. 12 is an exemplary screenshot illustrating mobile device display of search results content such as a mixed-media module consistent with certain aspects related to the innovations herein.
FIG. 13 is an exemplary screenshot of an illustrative mobile device display showing user interaction with a mixed-media module from the search results consistent with certain aspects related to the innovations herein.
FIG 14 is an exemplary screenshot of a mobile device display showing an illustrative result of a user interaction consistent with certain aspects related to the innovations herein.
FIG 15 is an illustration showing an example gesture consistent with certain aspects related to the innovations herein.
FIG 16 is an illustration showing an example gesture consistent with certain aspects related to the innovations herein.
FIG 17 is an illustration showing an example gesture consistent with certain aspects related to the innovations herein.
FIG 18 is an illustration showing an example gesture consistent with certain aspects related to the innovations herein.
FIG 19 is an illustration of an exemplary search engine results page showing integration/ position aspects consistent with certain aspects related to the innovations herein.
DETAILED DESCRIPTION OF ILLUSTRATIVE IMPLEMENTATIONS
Reference will now be made in detail to the invention, examples of which are illustrated in the accompanying drawings. The implementations set forth in the following description do not represent all implementations consistent with the claimed invention. Instead, they are merely some examples consistent with certain aspects related to the invention. Wherever possible, the same reference numbers will be used throughout the drawings to refer to the same or like parts.
According to some implementations, systems and methods consistent with the innovations herein are directed to providing search results with improved features. For example, aspects herein may relate to innovative integration of a rich, mixed-media, interactive component, also sometimes referred to as a 'Qwiki'™ component or module, into search results pages. In some implementations, this component or module may be an interactive narrative presentation of the content that is being searched and it may feature an interactive layer which allows the recipient of the search result to receive more detailed information without leaving the search engine results page ("SERP"). According to certain embodiments, systems and methods involving search results integrated with these component(s) may include features that are innovative over existing systems as a function of the information density and mixed-media/multimedia capabilities of such "mixed-media" integrated component(s).
As set forth herein, implementations may involve the integration of such component into a search engine results page (SERP). This can be any existing or future SERP including those popular today. Moreover, various SERP-component integrated systems and methods herein provide display of search engine results in an interactive playable format compatible with mobile devices and their variety of interfaces.
FIGs. 5A and 5B are illustrations of an exemplary search engine results page including an integrated mixed-media module 501 consistent with aspects of the innovations herein. Such implementations allow the user to stay on the search page and efficiently interact with the search engine in a way that is beneficial for that search engine through deeper, more refined searches, increased ad views and clickthrough rates (CTR). Further, in various embodiments set forth herein, the integrated component may include features that serve as an "interactive summary" of a web page/search result which enhances the utility of the search experience. This results in higher quality searches and increased revenue for the search provider, such as through re-queries (deeper searches in the existing topic).
In one illustrative implementation, for example, there is provided a method of processing search information comprising processing information to return, to a user, search results via a search engine, in a results page. The search results page, in one example, includes at least one pre-prepared, non-rendered narrative multimedia presentation. The example method further comprises providing, for display via a search results page, at least one interactive multimedia presentation selectable by the user. In particular, such a multimedia presentation may be a mixed-media module as specified herein. Additionally, the example method further comprises providing, as a function of a first interaction of the user with a selected multimedia presentation, user access to at least one of third party information, web sites, content, applications and/or other multimedia. Also, the example method could include providing, as a function of a second interaction of the user with the selected multimedia presentation, functionality configured to receive a new search query and generate a new search results page.
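By way of non-limiting illustration only, the following sketch shows one possible way a client might route the first and second interactions described above. The interaction types, field names, and the runSearch callback are illustrative assumptions, not part of any particular search engine's API.

```typescript
// Hypothetical shape of an interaction with an interactive module on a SERP.
interface ModuleInteraction {
  kind: "open-content" | "re-query"; // first vs. second interaction in the example method
  targetUrl?: string;                // third-party page, site, application, etc.
  queryText?: string;                // text used to build a new search query
}

// Illustrative handler: route a user interaction with the selected presentation
// either to third-party content or to generation of a new results page.
function handleModuleInteraction(
  interaction: ModuleInteraction,
  runSearch: (query: string) => Promise<string> // assumed to return HTML for a new results page
): Promise<string | void> {
  if (interaction.kind === "open-content" && interaction.targetUrl) {
    // First interaction: surface third-party information without leaving the SERP.
    window.open(interaction.targetUrl, "_blank");
    return Promise.resolve();
  }
  if (interaction.kind === "re-query" && interaction.queryText) {
    // Second interaction: receive a new search query and generate a new results page.
    return runSearch(interaction.queryText);
  }
  return Promise.resolve();
}
```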
Further integrations of such components with search results also involve creation of new ad units (and thereby, in some implementations, new sources of revenue) inside of the mixed-media interactive summary, as explained further below.
Consistent with certain implementations, another way systems and methods herein may depart from the traditional search experience, and from online video, is that the end user does not have to experience the mixed-media module in a linear fashion. The user can choose their path through the content with various functionality, such as clicking on hyperlinks within the mixed-media module, via navigation functionality/gestures, and the like. This allows the end-user to explore the information that is of the most interest to them, in greater detail and in their preferred format, e.g., text, photos, video, etc.
Turning again to FIGs. 5A and 5B, a mixed-media module 501 can be a controllable media component within a search results page. Here, for example, a mixed-media module may give the publisher of a site control over its brand and its content as it appears on the search results page within such mixed-media module. This may be accomplished via creator tools associated with creation of such modules that generate an embeddable interactive object, or via markup language that publishers can include on their site that is recognized by search engine crawlers. This also leads to a better search experience for the end user. Implementations include the integration of a multimedia component such as a mixed-media module into the SERP of an Internet search engine as illustrated in FIGs. 5A-5B. Such a component/module's interactive summary creates a playable caption that surfaces the best contents from the page 501. The title in this illustration, for example, "Watch the Qwiki [module], Tokyo," may be specified by the creator 505.
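By way of non-limiting illustration only, publisher markup of the kind mentioned above might be detected by a crawler roughly as follows. The meta tag names, field names, and parsing approach are hypothetical assumptions used purely to illustrate the concept; no published markup specification is implied.

```typescript
// Hypothetical publisher markup a crawler might look for on a page, e.g.:
//   <meta name="mixed-media-module" content="https://example.com/modules/tokyo.json">
//   <meta name="mixed-media-title"  content="Watch the Qwiki [module], Tokyo">
interface ModuleDescriptor {
  moduleUrl: string; // where the embeddable interactive object lives (assumed)
  title?: string;    // creator-specified caption (assumed)
}

// Crawler-side sketch: scan fetched HTML for the publisher-declared module.
function extractModuleDescriptor(html: string): ModuleDescriptor | null {
  const urlMatch = html.match(/<meta\s+name="mixed-media-module"\s+content="([^"]+)"/i);
  if (!urlMatch) return null;
  const titleMatch = html.match(/<meta\s+name="mixed-media-title"\s+content="([^"]+)"/i);
  return { moduleUrl: urlMatch[1], title: titleMatch ? titleMatch[1] : undefined };
}
```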
FIGs. 6A and 6B depict exemplary preview illustrations showing illustrative search engine results pages with mixed-media modules. Consistent with this basic preview as explained herein, a mixed-media module integrated into the search results page provides for a richer user experience and increases traffic for that page. Further, implementations may include playable captions that provide more context than regular text captions used in existing systems.
Consistent with the innovations herein, systems and methods are provided involving procedures and/or mechanisms for enhancing search results via novel integration of mixed-media modules. Such implementations may present coordinated text, images, video, documents, narrations and links all available in one interactive screen format or window. Examples of these can be seen in FIGs. 6A and 6B. Here, for example, the mixed-media module may be a rich multimedia visual and interactive piece of content. A search results page, SERP, integrated with such a mixed-media module acts as an interactive multimedia summary of a search result rather than just a text-based caption integrated into an SERP as previously done.
As seen in connection with FIGs. 6A and 6B, the typical search engine result is augmented or even replaced by a mixed-media module 601 that enhances the results. Navigation to a desired result, e.g., a selected mixed-media module, may expand it within the normal search results into a larger display. Further, a 'new search' button, icon or functionality may be included within mixed-media modules, e.g., a magnifying glass icon 603. This may be configured to allow for further searching or re-querying within the mixed-media module. Further, a media/asset loading bar 605 may also be included, allowing for audio and/or video to play in the mixed-media module or in another window. The mixed-media module may also include one or more hyperlinks 610 to other web pages. An expander button 615 may also be included to allow for the mixed-media information in the module to be displayed in a full screen format.
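By way of non-limiting illustration only, the on-page controls called out above (reference numerals 603, 605, 610, 615) could be modeled on the client roughly as follows. The field names are assumptions for illustration; the full-screen behavior uses the standard browser Fullscreen API.

```typescript
// Illustrative data model for the module controls described above. All field
// names are hypothetical; the numerals refer to FIGs. 6A and 6B.
interface MixedMediaModuleView {
  newSearchButton: { icon: string; onReQuery: (query: string) => void }; // 603
  loadingBar: { loadedAssets: number; totalAssets: number };             // 605
  hyperlinks: { label: string; href: string }[];                         // 610
  expandButton: { onExpand: () => void };                                // 615
}

// Sketch: expand the module to a full screen format when the expander (615) is activated.
function makeExpandHandler(container: HTMLElement): () => void {
  return () => {
    if (container.requestFullscreen) {
      void container.requestFullscreen(); // standard Fullscreen API
    }
  };
}
```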
With regard to these implementations, such as 'new search' functionality, systems and methods herein may involve methods of processing search information comprising a computer server configured to communicate with at least one search engine web crawler. Exemplary methods also may include the computer server configured to receive the search engine web crawler results from at least a first query, and to generate search results for display in a browser window based on the first query. Methods may also include embodiments involving provision of search
results that include at least a customizable caption, various multimedia content, and at least one hyperlink configured to cause a re-query of the search engine web crawler.
Referring still to FIGs. 6A and 6B, in accordance with some aspects of the innovations herein, a mixed-media module integrated SERP also improves the usefulness of search. As seen in FIGs. 6A and 6B, such an interactive component has a higher density of information than the prior art, which proves to be more valuable to the end user, online content providers, and the search engines. The search engine crawlers can detect certain mixed-media modules, such as via detection of metadata associated with Qwiki™ modules, and embed them in a search results page (SERP). Further, implementations herein may utilize the mixed-media module as an interactive and playable caption, 605.
According to implementations herein, once played, the mixed-media module may expand within the page as shown in FIG. 6 at 601 and can offer the user a variety of options to explore related content triggering new search queries, 603, media/asset loading, 605, links to related pages, 610, and playback options, 615. Further, component video or audio files may be played within the mixed-media module on the SERP, without need for loading an external page.
In addition to the display of related media/links in a new window on the same page, further implementations provide functionality to display associated content on the same SERP by instantly modifying it or the contents around it. Here, for example, this new integrated content may be displayed without triggering new tabs. Additionally, in contrast to existing SERP functionality where captions are algorithmically machine generated and cannot be curated by relevant parties, systems and methods herein may provide a controllable interactive media component within a search results page. For the first time, then, implementations herein involving the mixed-media modules allow the publisher of a site control over its brand and its content as it appears on the search results page within the mixed-media module.
Further, consistent with certain aspects related to the innovations herein, present implementations improve upon and enhance existing search technologies because they provide narrative context to search results, something lacking until now. Results herein are a richer experience with more visual, linked information and interactive features.
As a function of the present mixed-media module embodiments, which may be created by participants of the search process, search results may be more accurate and provide better context. Consistent with implementations herein, brand managers and content publishers can control their story within a search engine result without purchasing expensive search advertising. This is particularly valuable because existing captions are often not relevant for a search engine user; they add little or no value to the process. They contain a limited amount of data and few clues as to the overall content of the site those captions are supposed to summarize. In other words, captions lack the context and visual richness provided via the innovations herein. Additionally, search engine results are clustered in a way that isn't helpful and can be overwhelming. Users get results that don't help with a decision because they are unrelated to what the user actually needs. The limited text in a caption often doesn't reveal enough information. As a result, the user must select links, search that site and, if it is not the desired result, back up to the original search results or begin a new search from scratch. It's time consuming, awkward and makes it easy for a user to get lost.

According to further embodiments, a search result enhanced via present mixed-media module(s) implementations may also involve innovations associated with second or follow-up queries, referred to herein as "re-query." FIG 7 is a diagram illustrating an example search engine results page associated with a re-query, consistent with aspects of the innovations herein. Notably, a re-query allows a search engine user to refine their search results without losing the original search. Clicking on a hyperlink within the mixed-media module allows the user to "re-query" the search engine and dig deeper into a subject by searching the mixed-media/interactive components within a module. Implementations are also configured such that this opens a new window without closing the original one, thereby reducing the need to constantly hit the "back" button in order to return to the original results. This enables the ability to search, and then re-search specific details of interest within a search result, without getting distracted or lost.
In one illustrative implementation, for example, there is provided a method of processing search information comprising processing information to return, to a user, search results via a search engine, in a results page. This example method could also include wherein the results page includes at least one pre-prepared, non-rendered narrative multimedia presentation, such as a mixed-media module. Further, the example method could include providing, for display via a search results page, at least one interactive multimedia presentation selectable by the user. Also, providing, as a function of a first interaction of the user with a selected multimedia presentation, user access to at least one of third party information, web sites, content, applications and/or other multimedia. And the example method could also include providing, as a function of a second interaction of the user with the selected multimedia presentation, functionality configured to receive a new search query and generate a new search results page.

Various "re-query" implementations also allow users to stay on a search page and refine their searches in new windows without losing the original search or getting lost. This is more efficient for users and less frustrating, as they are more likely to find their desired results. Systems and methods herein may be configured to refine a SERP via such functionality, allowing for high information density. For example, the re-query can show a selected caption with images 705. It can also show video or animation 710. Moreover, specific concepts may even be suggested for further re-query 715. In addition to the display of related media/links in a new window on the same page, there is an option to display associated content on the same SERP by instantly modifying it or the contents around it. This new integrated content is displayed without triggering new tabs. These "re-query" innovations may also drive a deeper understanding of a queried subject matter by displaying related search topics. As such, systems and methods herein provide for a mixed-media/multimedia capability which can illustrate/enhance a selected search result with images, videos, animations, documents and even narrations. Specific concepts can be suggested for re-query, driving additional search engine traffic. This additional traffic yields higher advertising rates on the re-query pages, as the searches are more specific and focused by a more specific customer interest. The richness of the re-queried media also achieves beneficial advertising results, given that richer media fetches an increased CPM/CPT (Cost Per Thousand impressions) that advertisers are willing to pay.
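By way of non-limiting illustration only, a minimal client-side sketch of the re-query behavior described above might look as follows. The URL pattern, function names, and the example refinement are assumptions for illustration, not any particular search engine's interface.

```typescript
// Minimal re-query sketch: open refined results in a new window so the
// original SERP and its state are preserved and no "back" navigation is needed.
function reQuery(originalQuery: string, refinement: string): Window | null {
  const refined = `${originalQuery} ${refinement}`.trim();
  const url = `/search?q=${encodeURIComponent(refined)}`; // assumed URL pattern
  return window.open(url, "_blank");
}

// Hypothetical usage: from a module about Tokyo, digging deeper into a suggested concept.
reQuery("Tokyo", "Shibuya crossing");
```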
Systems and methods herein overcome other issues with search engine results, namely problems associated with search placement. Placement on the SERP is important because the higher the placement, the more likely a site will be selected by a user. For this reason, the top of the page is seen as the most valuable real estate on an SERP. Entire industries have been created just to place a search result in a higher position in the SERP, as processing pages and pages of text results is time consuming. Presently, search engines consider it a success when a user spends a minimal amount of time on their page. This might be counterintuitive, but it's because a quick search process means that the user is finding the information that they need and moving on. The downside to this is that the search engine only has a limited amount of time to display ads and monetize the interaction. As such, implementations herein provide an innovative and ideal scenario for search engines, e.g., keeping users on their site through a layer of interactivity that allows for a deeper exploration of search results without leaving the original search, time and time again.
While visual results in searches can yield better results, previews of websites are very expensive for search engines to create, maintain and store. Bandwidth is also an issue when end users access search engines via mobile devices. On mobile devices and smart phones, in particular, there is limited screen real estate and text-based search results are tiny and difficult to read. It's even more difficult for a user to differentiate between search results when looking at a tiny screen.
Moreover, many search engines are adding video content to their search results. Video is becoming more prevalent online because publishers don't want to present text-only sites and there is a desire to differentiate/supplement search placement; however, traditional streaming video is time-consuming to create and view. Video content is also highly compressed on mobile devices, resulting in poor streaming and picture quality. Video is also hard to interact with because there is no standard, universal layer for interactivity. For the most part, video is a stand-alone experience because of the lack of an interactive layer. In addition, similar to exploring component web pages, watching and re-searching for appropriate videos is very time consuming. Because of limited previews, users often don't know if they have discovered the right or wrong video related to their topic, as the videos are indexed and retrieved via keyword, not according to the content of the pages that are also part of the same search result.
Embodiments herein address these issues and drawbacks, as well. In one illustrative implementation, for example, there is provided a method of processing search information comprising a computer server configured to communicate with at least one search engine web crawler. The example method could also have the computer server configured to interact with the search engine web crawler search results by causing display of the search results. And the example method may include wherein the search results include interactive multimedia content, e.g., one or more mixed-media modules, and/or associated content such as at least one hyperlink, etc.
Especially in view of the issues with traditional video content noted above, systems and methods herein are an improvement on other rich media such as online video technology because they use less bandwidth, are easily customizable and flexible, and incorporate interactive video, images, text and other types of media. In still other exemplary embodiments herein, mixed-media module integrated implementations can incorporate interactive images, text and other types of media. Further, given that such implementations operate without large-bandwidth video transmissions, especially rendered video content for the audiovisual/multimedia experience, systems and methods herein provide an expanded interactive search with other mixed media, thus allowing for quicker loads and consumption of less bandwidth during utilization.
FIG. 8 is an example showing illustrative ad placement 801 features, consistent with aspects of the innovations herein. The integration of mixed-media module interactive summaries into a SERP creates additional advertising monetization units. These units can be presented as interactive captions on the CPC/PPC (Cost Per Click/Pay Per Click) advertisements that traditionally are placed alongside search results, or the CPC/PPC ads (and other promotional units) can be placed within the mixed-media module itself, as shown in FIG 8. For example, the interactive summary can be presented as a caption on the CPC advertisements that are traditionally placed alongside organic search results 801. In some implementations, the CPC ads can be placed within the multimedia presentation or mixed-media module 805, itself. It should be noted that FIG 8 may give the appearance that the CPC ad is loading within the Wikipedia result. However, implementations may include the CPC ad displaying its own mixed-media module. Loading the CPC ad into the Wikipedia mixed-media module is a different embodiment from such implementations.
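By way of non-limiting illustration only, the two placement options described above (alongside organic results 801, or inside the module 805) could be represented on the serving side roughly as follows. The type names, cost models, and selection rule are illustrative assumptions.

```typescript
// Hypothetical ad placement options corresponding to 801 and 805 above.
type AdPlacement = "alongside-results" | "inside-module";

interface AdUnit {
  placement: AdPlacement;
  creativeUrl: string;        // ad creative, or per the text, its own mixed-media module
  costModel: "CPC" | "CPM";   // illustrative cost models only
}

// Sketch: when the module is expanded and playing, an in-module slot becomes
// available in addition to the traditional placements alongside the results.
function selectAdSlots(moduleExpanded: boolean): AdPlacement[] {
  return moduleExpanded
    ? ["alongside-results", "inside-module"]
    : ["alongside-results"];
}
```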
Referring now to FIGs. 9-14, implementations herein with mixed-media module integrations involving video can yield improved/higher quality on mobile devices, consistent with aspects of the innovations herein. In one illustrative implementation, for example, there is provided a method of processing search information comprising returning search results in a search results page including one or more pre-prepared narrative multimedia presentations. The example method could also include providing at least one integrated multimedia presentation selected by a user. And, also, providing access to at least one of additional third party information, sites, content, applications and other multimedia. Further, the example method could include
wherein the multimedia presentations are configured in association with other features for low-bandwidth (e.g., non-rendered, etc.) display for use on a mobile device.
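By way of non-limiting illustration only, one possible client-side sketch of low-bandwidth asset selection for a mobile device follows. The asset manifest, field names, and byte-budget rule are assumptions made purely for illustration.

```typescript
// Hypothetical, lightweight asset manifest for a non-rendered presentation.
interface ModuleAsset {
  kind: "image" | "text" | "clip";
  url: string;
  bytes: number;
}

// Sketch: on a mobile device, keep assets in order until a bandwidth budget
// is spent, rather than shipping a single large rendered video.
function selectAssetsForDevice(
  assets: ModuleAsset[],
  isMobile: boolean,
  budgetBytes: number
): ModuleAsset[] {
  if (!isMobile) return assets;
  const chosen: ModuleAsset[] = [];
  let used = 0;
  for (const asset of assets) {
    if (used + asset.bytes > budgetBytes) continue;
    chosen.push(asset);
    used += asset.bytes;
  }
  return chosen;
}
```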
Also, given the flexible and non-rendered nature of the mixed-media modules, streaming and picture quality can be easily optimized for specific mobile devices. Further, such
implementations allow ease of interaction by providing a standard universal layer for interactivity. In other embodiments, systems and methods herein may include features and implementations involving interactive and coordinated hyperlinks for deeper exploration of the content within the video. This feature of coordinating links/content inside of the mixed-media module interactive summary allows new attribution and monetization capabilities by content creators and search engines utilizing the underlying model(s).
Here, it should be noted that a "mobile device" can be any kind of smartphone, tablet computer, laptop, notebook, or any kind of similar device. These devices are typically touch screen enabled and retain internet connectivity through either a shorter range radio such as those used in WiFi technologies, or through cellular telephone connections, or both. The device may connect to the internet in any fashion.
FIG. 9 depicts an illustrative SERP with mixed-media module implementation formatted for a mobile smartphone or tablet computer, consistent with aspects of the innovations herein. As shown, for example, an illustrative "Play Qwiki module" icon is shown directly beneath the first search result in the search result screen. FIG. 10 is an illustration of a search engine results page with the integration of touch-enabled functionality consistent with aspects of the innovations herein. In FIG 10, a user is shown tapping the "Play Qwiki module" icon using a finger. Touch-enabled screens allow such interaction with a stylus or other such device, and such features may be navigated with various cursor-based functionality as well. FIG. 11 is an illustration of exemplary mobile device display and functionality consistent with aspects of the innovations herein. In the example in FIG 11, the mobile smartphone may be rotated to initiate a specified function associated with the SERP or just to allow for a landscape display instead of a portrait display.
FIG 12 is an exemplary screenshot illustrating mobile device display of a search results content such as a mixed-media module consistent with certain aspects related to the innovations herein.
FIG 13 is an exemplary screenshot of an illustrative mobile device display showing user interaction with a mixed-media module from the search results consistent with certain aspects related to the innovations herein. FIG 13 shows a user interacting with a portion of the mixed-media module, here tapping the particular media or object with respect to which additional content (details, information, etc.) or further functionality ("re-query", etc.) is desired. As set forth elsewhere herein, the search engine may be configured to interoperate with such action in a variety of ways.
FIG 14 is an exemplary screenshot of a mobile device display showing an illustrative result of a user interaction consistent with certain aspects related to the innovations herein. Here, this example shows an illustrative re-direct, associated with the tapped object, to a particular web page. The result shows multimedia text and an image or video within the web page.
Turning to some more general aspects, an illustrative multimedia presentation herein may be configured as an interactive system of mixed-media/interactive content with clickable components. Various mixed-media modules, here, may also provide a visual confirmation of search results, which means less frustration and more productivity for the user. These mixed-media modules may also provide visual relevancy - the multimedia nature of such an interactive component provides more in-depth detail of a topic than text alone.
Further, it is noted that pages with multi-media components are often ranked higher in search engine results. In accordance with aspects of the present innovations herein, systems and methods herein provide ways for content creators to provide interactive multi-media content and, in some implementations, improve their search engine ranking through increased metadata information. The visual nature of embodiments herein also means that such a result would not have to be ranked at the very top of an SERP to catch the attention of a search engine user, since visual images are more efficiently scanned than text. For online advertisers, better search results will mean greater return on investment. Online ads will be viewed within a more appropriate context and, therefore, more likely to target the right consumers. Interactions with the associated mixed-media modules can also provide additional data to rank pages.
In accordance with aspects of the present innovations, mixed-media module interactive summaries as integrated herein are lightweight - they use less bandwidth than pure video and are a rich, interactive, multi-media experience. Viewing such a mixed-media module is faster and easier than viewing video alone because they are interactive and have more discrete sets of contents that can easily be traversed beyond the simple play bar associated with most traditional video. Mixed-media modules herein also contain more information (meta-data) than video because of their multitude of components (mixed media), their interactive nature, and the ability to re-query. With regard to certain aspects of the innovations herein, another way that implementations herein are an improvement over the traditional search experience, especially over online video, is that the end user does not experience the mixed-media module in a linear fashion. A user can readily jump to different collections of media once a quick scan assures them the preset set of options will not yield the desired results. The user can also choose their path through the content by clicking on hyperlinks (meta-data) within the mixed-media module. This allows the end-user to explore the information that is of the most interest to them, in greater detail and in their preferred format (e.g., text, photos, or video). Innovations herein also work across multiple platforms. For example, mixed-media module interactive components herein can run inside a standard web browser, and its player software can be integrated into mobile devices, TV devices, video game units, etc. Further, such mixed-media module(s) may be configured as a universal component across all media and devices.
In accordance with aspects of the present innovations, mixed-media modules herein can act as an "interactive summary/caption" which highlights the curated content from a search result and presents it in narrative form. As such, users may "preview" the contents of the search in an engaging, interactive experience on multiple devices. In certain implementations, an interaction a user may have with the mixed-media module is via "Gestures", such as set forth in connection with FIGs. 15-19. These Gestures may include various touch-screen enabled interactions whereby a user is able to tap, pinch, tap and hold, and swipe or scroll the mixed-media module. Various search engines, servers and/or intermediaries may be configured to respond to or interact in accordance with these Gestures in different ways, such as the examples as described in the Figures and associated descriptions herein. Thus, some implementations herein include methods wherein the interactive multimedia content is configured to allow a new search query and generate a new search results page.
FIG 15 shows an example Gesture consistent with aspects of the innovations herein. Here, within a search result expanded to the selected mixed-media module, systems and methods herein may be configured to respond to a user tap or click of an object in the grid or in the feed to open another mixed-media module, webpage, video, or detailed animation in an overlay over
the current screen. Thus, some embodiments include methods wherein the interaction includes a tap of a portion, button or link of the selected multimedia presentation used in the generation of the new search results page.
FIG 16 shows another example Gesture consistent with aspects of the innovations herein. Here, a user can pinch into an object in the grid to see detailed or related information on the object including source, related media, access interactive animations, view full video, read full article, and the like. Thus, some embodiments include methods wherein interactions include a pinch of a portion, button or link of the selected multimedia presentation used in the generation of the new search results page. FIG 17 shows another example Gesture consistent with aspects of the innovations herein.
Here, for example, systems and methods herein may be configured such that a user can tap or click and hold on an element in the grid or in the feed to provide various or additional options. Such options may include, though are not limited to, open now, queue for later, add to favorites, etc. Thus, some embodiments include methods wherein interactions include a tap and hold of a portion, button or link of the selected multimedia presentation used in the generation of the new search results page.
FIG 18 shows another example Gesture consistent with aspects of the innovations herein. Here, a user can swipe or scroll with one finger left or right over the grid to advance or rewind the presentation of the mixed-media. Thus, some embodiments include methods wherein interactions include a swipe or scroll of a portion, button or link of the selected multimedia presentation used in the generation of the new search results page.
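By way of non-limiting illustration only, the gesture-to-behavior mapping described in connection with FIGs. 15-18 might be dispatched on a client roughly as follows. The type names, action callbacks, and dispatch rule are assumptions made solely to illustrate the mapping.

```typescript
// Gesture types corresponding to FIGs. 15-18.
type Gesture = "tap" | "pinch" | "tap-and-hold" | "swipe";

// Hypothetical actions a search engine, server, or intermediary might expose.
interface GestureActions {
  openOverlay: (objectId: string) => void;           // FIG 15: open module/page/video in an overlay
  showDetails: (objectId: string) => void;           // FIG 16: source, related media, full article, etc.
  showOptions: (objectId: string) => void;           // FIG 17: open now, queue for later, add to favorites
  advance: (direction: "forward" | "back") => void;  // FIG 18: advance or rewind the presentation
}

// Sketch: route a recognized gesture on a grid/feed object to the matching behavior.
function dispatchGesture(
  gesture: Gesture,
  objectId: string,
  actions: GestureActions,
  swipeDirection: "forward" | "back" = "forward"
): void {
  switch (gesture) {
    case "tap":          actions.openOverlay(objectId); break;
    case "pinch":        actions.showDetails(objectId); break;
    case "tap-and-hold": actions.showOptions(objectId); break;
    case "swipe":        actions.advance(swipeDirection); break;
  }
}
```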
FIG 19 shows another example of an illustrative interface involving a sample search result and mixed-media module presentation, consistent with aspects of the innovations herein. Here, for example, the mixed-media module may be presented as the foremost piece of content, such as the first item to select in the upper, left portion of the search result. Such placement yields easy user access to greater content in the mixed-media module, and all of the associated benefits thereof to the search engine/provider and ad revenue partners.
In the description here, it is to be understood that both mouse/cursor-enabled computing devices and those without cursors that instead use touch screen technologies are fully supported. To that end, the terms "click," "tap," and "touch" can be used synonymously and interchangeably. Thus, a clickthrough is the same as a tap-through or any other term with the equivalent meaning. The mobile wireless devices can be touch screen enabled, using a stylus, finger or other such thing to interact with the screen and objects on the screen. The touch screen enabled technologies also allow for pinching in or out to zoom in or out, or to enlarge or shrink an object or the display. Sliding a touch can scroll either in vertical or horizontal directions, or any other direction supported by the system. The touch screens can also detect a prolonged tap, opening further functionality when a prolonged tap and hold occurs on an object. In devices that do not support a touch screen, such functionality can be accomplished by a cursor or pointer of some sort, typically controlled by a mouse, pointer stick, roller ball, etc. There may be additional functionality embedded into the display objects to allow for some of the functionality, such as a scroll bar or zoom buttons, etc. These functionalities are also fully supported here and can be used interchangeably with the touch screen enabled technologies.
In the present description, the terms component, module, device, etc. may refer to any type of logical or functional process or blocks that may be implemented in a variety of ways. For example, the functions of various blocks can be combined with one another into any other number of modules. Each module can be implemented as a software program stored on a tangible memory (e.g., random access memory, read only memory, CD-ROM memory, hard disk drive) within or associated with the computing elements, sensors, receivers, etc. disclosed above, e.g., to be read by a processing unit to implement the functions of the innovations herein. Or, the modules can comprise programming instructions transmitted to a general purpose computer or to processing hardware via a transmission carrier wave. Also, the modules can be implemented as hardware logic circuitry implementing the functions encompassed by the innovations herein. Finally, the modules can be implemented using special purpose instructions (SIMD instructions), field programmable logic arrays or any mix thereof which provides the desired level of performance and cost.

As disclosed herein, implementations and features of the invention may be implemented through computer-hardware, software and/or firmware. For example, the systems and methods disclosed herein may be embodied in various forms including, for example, a data processor, such as a computer that also includes a database, digital electronic circuitry, firmware, software, or in combinations of them. Further, while some of the disclosed implementations describe components such as software, systems and methods consistent with the innovations herein may be implemented with any combination of hardware, software and/or firmware. Moreover, the above-noted features and other aspects and principles of the innovations herein may be implemented in various environments. Such environments and related applications
may be specially constructed for performing the various processes and operations according to the invention or they may include a general-purpose computer or computing platform selectively activated or reconfigured by code to provide the necessary functionality. The processes disclosed herein are not inherently related to any particular computer, network, architecture, environment, or other apparatus, and may be implemented by a suitable combination of hardware, software, and/or firmware. For example, various general-purpose machines may be used with programs written in accordance with teachings of the invention, or it may be more convenient to construct a specialized apparatus or system to perform the required methods and techniques. Aspects of the method and system described herein, such as the location estimate features, may be implemented as functionality programmed into any of a variety of circuitry, including programmable logic devices ("PLDs"), such as field programmable gate arrays ("FPGAs"), programmable array logic ("PAL") devices, electrically programmable logic and memory devices and standard cell-based devices, as well as application specific integrated circuits. Some other possibilities for implementing aspects include: memory devices, microcontrollers with memory (such as EEPROM), embedded microprocessors, firmware, software, etc.
Furthermore, aspects may be embodied in microprocessors having software-based circuit emulation, discrete logic (sequential and combinatorial), custom devices, fuzzy (neural) logic, quantum devices, and hybrids of any of the above device types. The underlying device technologies may be provided in a variety of component types, e.g., metal-oxide
semiconductor field-effect transistor ("MOSFET") technologies like complementary metal-oxide semiconductor ("CMOS"), bipolar technologies like emitter-coupled logic ("ECL"), polymer technologies (e.g., silicon-conjugated polymer and metal-conjugated polymer-metal structures), mixed analog and digital, and so on. It should also be noted that the various logic and/or functions disclosed herein may be enabled using any number of combinations of hardware, firmware, and/or as data and/or instructions embodied in various machine-readable or computer-readable media, in terms of their behavioral, register transfer, logic component, and/or other characteristics. Computer-readable media in which such formatted data and/or instructions may be embodied include, but are not limited to, non-volatile storage media in various forms (e.g., optical, magnetic or semiconductor storage media), though they do not include non-tangible media.
Unless the context clearly requires otherwise, throughout the description and the claims, the words "comprise," "comprising," and the like are to be construed in an inclusive sense as opposed to an exclusive or exhaustive sense; that is to say, in a sense of "including, but not limited to." Words using the singular or plural number also include the plural or singular number respectively. Additionally, the words "herein," "hereunder," "above," "below," and words of similar import refer to this application as a whole and not to any particular portions of this application. When the word "or" is used in reference to a list of two or more items, that word covers all of the following interpretations of the word: any of the items in the list, all of the items in the list and any combination of the items in the list. Other implementations of the invention will be apparent to those skilled in the art from consideration of the specification and practice of the invention disclosed herein. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the invention being indicated by the disclosure above in combination with the following paragraphs describing the scope of one or more implementations of the following invention.
Claims
1. A method for processing search information, the method comprising:
processing information to return, to a user, search results via a search engine, in a results page including multimedia information; and
providing, for display via a search results page, at least one mixed-media module selectable by the user.
2. The method of claim 1 further comprising providing, as a function of interaction of the user with a selected mixed-media module, additional information and/or functionality to the user.
3. A method for processing search information, the method comprising:
processing information to return, to a user, search results via a search engine, in a results page including at least one multimedia presentation;
providing, for display via a search results page, at least one interactive multimedia presentation selectable by the user;
providing, as a function of a first interaction of the user with a selected multimedia presentation, user access to at least one of third party information, web sites, content, applications and/or other multimedia;
providing, as a function of a second interaction of the user with the selected multimedia presentation, functionality configured to receive a new search query and generate a new search results page.
4. The method of claim 3 wherein the search engine is configured to process, as the second interaction, receipt of a gesture by the user on a mobile device.
5. The method of claim 4 wherein the gesture includes one or more of:
a tap of a portion, button or link of the selected multimedia presentation used in the generation of the new search results page; and/or
a swipe or scroll of a portion, button or link of the selected multimedia presentation used in the generation of the new search results page.
6. The method of claim 4 wherein the gesture includes one or more of: a tap and hold of a portion, button or link of the selected multimedia presentation used in the generation of the new search results page; and/or
a pinch of a portion, button or link of the selected multimedia presentation used in the generation of the new search results page.
7. The method of any claim herein wherein the server is configured to provide the multimedia presentation for display in an expanded view in the search result page.
8. The method of any claim herein wherein the search results further include advertisements provided as a function of the user's interaction with the multimedia presentation.
9. A method for processing search information, the method comprising:
returning search results in a search results page including one or more multimedia presentations;
providing at least one interactive multimedia presentation selected by a user;
providing access to at least one of, additional third party information, sites, content, applications and/or other multimedia;
wherein, the multimedia presentation is configured to,
allow user interaction to manipulate the presentation;
cause display of more than one kind of media.
10. The method of claim 9 or any claim herein wherein the presentations are configured to be displayed on at least one mobile device.
11. The method of claim 9 or any claim herein further comprising, providing, as a function of the user interaction with the multimedia presentation, functionality configured to receive a new search query and generate a new search results page.
12. The method of claim 11 or any claim herein wherein the user interaction includes a tap of a portion, button or link of the selected multimedia presentation used in the generation of the new search results page.
13. The method of claim 11 or any claim herein wherein the user interaction includes a swipe or scroll of a portion, button or link of the selected multimedia presentation used in the generation of the new search results page.
14. The method of claim 11 or any claim herein wherein the user interaction includes a tap and hold of a portion, button or link of the selected multimedia presentation used in the generation of the new search results page.
15. The method of claim 11 or any claim herein wherein the user interaction includes a pinch of a portion, button or link of the selected multimedia presentation used in the generation of the new search results page.
16. The method of claim 9 or any claim herein wherein the server is configured to provide the multimedia presentation for display in an expanded view in the search result page.
17. The method of claim 9 or any claim herein wherein the search results further include advertisements provided as a function of the user's interaction with the multimedia presentation.
18. A method for processing search information, the method comprising:
returning search results in a search results page including one or more multimedia presentations;
providing at least one integrated multimedia presentation selected by a user;
providing access to at least one of additional third party information, sites, content, applications and/or other multimedia;
wherein, the multimedia presentations are configured for low-bandwidth and/or are non-rendered to facilitate display on a mobile device.
19. The method of claim 18 or any claim herein wherein the pre-prepared narrative multimedia presentation is interactive.
20. The method of claim 18 or any claim herein further comprising, providing, as a function of the user interaction with the multimedia presentation, functionality configured to receive a new search query and generate a new search results page.
21. The method of claim 20 or any claim herein wherein the user interaction includes a tap of a portion, button or link of the selected multimedia presentation used in the generation of the new search results page.
22. The method of claim 20 or any claim herein wherein the user interaction includes a swipe or scroll of a portion, button or link of the selected multimedia presentation used in the generation of the new search results page.
23. The method of claim 20 or any claim herein wherein the user interaction includes a tap and hold of a portion, button or link of the selected multimedia presentation used in the generation of the new search results page.
24. The method of claim 20 or any claim herein wherein the user interaction includes a pinch of a portion, button or link of the selected multimedia presentation used in the generation of the new search results page.
25. The method of claim 18 or any claim herein wherein the server is configured to provide the multimedia presentation for display in an expanded view in the search result page.
26. The method of claim 18 or any claim herein wherein the search results further include advertisements provided as a function of the user's interaction with the multimedia presentation.
27. An interactive multimedia search summary system comprising:
one or more servers and/or processing components configured to,
communicate with at least one search engine;
perform processing regarding interaction with the search engine search results associated with display of the search results, wherein the search results include
interactive multimedia content; and/or
at least one hyperlink.
28. The system of claim 27 or any claim herein further comprising a customizable caption that is configured to be played by the user.
29. The system of claim 27 or any claim herein wherein the search results further include metadata, allowing interaction with the search engine.
30. The system of claim 28 or any claim herein wherein the playable caption is configured to expand on a display and offer further search options.
31. The system of claim 30 or any claim herein wherein the further search options include at least one of, a new search query, media/asset loading, a link to a related page, and playback options.
32. The system of claim 27 or any claim herein wherein search result further includes at least one of component video and/or audio.
33. The system of claim 27 or any claim herein wherein the hyperlink is configured to allow a re- query of the search engine web crawler.
34. The system of claim 33 or any claim herein wherein the hyperlink is further configured to cause a new window to open to display the search results of the re-query.
35. The system of claim 34 or any claim herein wherein the re-query of the search engine is a search of the meta data within the search result.
36. The system of claim 29 or any claim herein wherein the meta data is derived from parsed unstructured data or selecting media.
37. The system of claim 27 or any claim herein wherein the computer server is further configured to run a script on the search engine to obtain structured data.
38. The system of claim 37 or any claim herein wherein the search results include information based on the obtained structured data.
39. The system of claim 27 or any claim herein wherein the server is further configured to detect entities in text.
40. The system of claim 27 or any claim herein wherein the display of the search results is configured and/or formatted for display on a mobile smartphone.
41. The system of claim 27 or any claim herein wherein the search results are configured and/or formatted for display on a tablet computer.
42. The system of claim 27 or any claim herein wherein the search results are further configured with advertisements, such as advertisements embedded within a mixed-media module.
43. The system of claim 27 or any claim herein wherein the interactive multimedia content is configured to allow a new search query and generate a new search results page.
44. The system of claim 43 or any claim herein wherein the interaction includes a tap of a portion, button or link of the selected multimedia presentation used in the generation of the new search results page.
45. The system of claim 43 or any claim herein wherein the interaction includes a swipe or scroll of a portion, button or link of the selected multimedia presentation used in the generation of the new search results page.
46. The system of claim 43 or any claim herein wherein the interaction includes a tap and hold of a portion, button or link of the selected multimedia presentation used in the generation of the new search results page.
47. The system of claim 43 or any claim herein wherein the interaction includes a pinch of a portion, button or link of the selected multimedia presentation used in the generation of the new search results page.
48. The system of claim 27 or any claim herein wherein the server is configured to provide the multimedia presentation for display in an expanded view in the search result page.
49. The system of claim 27 or any claim herein wherein the search results further include advertisements provided as a function of the user's interaction with the multimedia presentation.
50. A system for causing display of search results, the system comprising:
one or more servers and/or processing components configured to,
communicate with at least one search engine;
receive the search engine results from at least a first query;
generate search results for display in a browser window based on the first query, wherein the search results include one or more of,
a customizable caption,
multimedia, and/or
at least one hyperlink configured to cause a re-query of the search engine web crawler.
51. The system of claim 50 or any claim herein wherein the server is further configured to cause display of the search results from the re-query in a new browser window.
52. The system of claim 50 or any claim herein wherein the multimedia includes at least one of an image, a video, an animation, a map, a document and a narrative.
53. The system of claim 50 or any claim herein wherein the server is further configured to parse unstructured data into structured meta data and media.
54. The system of claim 53 or any claim herein wherein the search results include information from the parsed unstructured data.
55. The system of claim 50 or any claim herein wherein the server is further configured to detect entities in text.
56. The system of claim 50 or any claim herein wherein the search results further include advertisements.
57. The system of claim 50 or any claim herein wherein the interactive multimedia content is configured to allow a new search query and generate a new search results page.
58. The system of claim 57 or any claim herein wherein the interaction includes a tap of a portion, button or link of the selected multimedia presentation used in the generation of the new search results page.
59. The system of claim 57 or any claim herein wherein the interaction includes a swipe or scroll of a portion, button or link of the selected multimedia presentation used in the generation of the new search results page.
60. The system of claim 57 or any claim herein wherein the interaction includes a tap and hold of a portion, button or link of the selected multimedia presentation used in the generation of the new search results page.
61. The system of claim 57 or any claim herein wherein the interaction includes a pinch of a portion, button or link of the selected multimedia presentation used in the generation of the new search results page.
62. The system of claim 50 or any claim herein wherein the server is configured to provide the multimedia presentation for display in an expanded view in the search result page.
63. The system of claim 50 or any claim herein wherein the search results further include advertisements provided as a function of the user's interaction with the multimedia presentation.
64. The method or system of any claim above wherein the multimedia presentation includes a mixed media module comprising at least two items of text, animation, images, pictures and/or video, such as presentation of three or more items in a non-rendered format.
65. The method or system of claim 64 or any claim herein wherein the mixed media module is comprised of one or more slots into which the server causes media to display.
66. The method or system of any claim above wherein the mixed media modules include realtime information objects.
67. The method or system of any claim above wherein narration of the multimedia presentation includes causing portions of the multimedia presentation to highlight in coordination with played audio, and/or wherein the highlighted portion is magnified.
68. The method or system of any claim above further comprising, creating at least one index file including one or more of a summary of the media objects, a snippet, location, title alias, overall quality rank, and/or an iconic picture.
69. The method or system of any claim above wherein:
the search engine further causes display of related information to the user entered search, in the multimedia presentation; and/or
wherein the multimedia presentation further includes a film strip tool bar configured to allow a user to scroll portions of a video.
70. The invention of any claim herein, wherein the multimedia presentation and/or search results page includes a mixed media module.
71. The invention of any claim herein, wherein the mixed media module includes two or more types of media content including text, a hyperlink, an image, audio, a document, text entry, interactive icon, toolbar and video.
72. The invention of any claim herein, wherein the mixed media module includes an interactive layer for user interaction and providing a deeper exploration of search results without leaving the original search results page.
73. The invention of any claim herein, wherein the mixed media module is expandable in display size within its original search result page.
74. The invention of any claim herein, wherein the mixed media module includes playable captions to display and describe the mixed media content being displayed so as to provide narrative context to the search results displayed.
75. The invention of any claim herein, wherein the mixed media module includes advertising monetization units in place of, or in addition to, advertising monetization units within the search results page and outside the mixed media module.
76. The invention of any claim herein, further comprising a second mixed media module as an advertising monetization unit.
77. The invention of any claim herein, wherein the mixed media module includes a play mode to visually advance the plurality of media content displayed in time.
78. The invention of any claim herein, wherein the mixed media module displays playable captions generated by a site publisher of the media content being displayed such that a brand and an appearance of the media content displayed within the mixed media module on the search results page is determined by the site publisher and not a search results page algorithm or search engine.
79. The invention of any claim herein, wherein the site publisher controls the display of the site publisher content displayed in the mixed media module.
80. The invention of any claim herein, wherein the playable captions are not algorithmically generated.
81. The invention of any claim herein, wherein the mixed media module provides visual relevancy where the plural types of media content displayed provides more in-depth detail of a topic than text alone.
82. The invention of any claim herein, wherein the mixed media module provides additional data to rank pages for search results.
83. The invention of any claim herein, wherein the media content of the mixed media module includes more meta-data than meta-data of video.
84. The invention of any claim herein, wherein the user does not operate to control the mixed media module display linearly, such that a user chooses a display path through the media content of the plurality of media types within the mixed media module.
85. The invention of any claim herein, wherein the mixed media module provides an interactive summary in narrative form of curated mixed media content from a search result.
86. The invention of any claim herein, wherein the mixed media module displays re-query suggestions of related search concepts.
87. The invention of any claim herein, wherein the mixed media module provides higher information density display over display of a single media type.
88. The invention of any claim herein, further configured with functionality such that a user selecting a re-query search within an original mixed media module opens a new mixed media module to provide high information density.
89. The invention of any claim herein, wherein the mixed media module comprises three or more different types of content selected from text, image, video, audio, document, text entry, interactive icon, toolbar and/or video.
90. The invention of any claim herein, wherein the mixed media module comprises three or more different types of content selected from text, image, video, audio, document, text entry, interactive icon, toolbar and/or video to provide a user experience characterized as visually rich by the three or more different types of content displayed at once in the mixed media module.
91. The invention of any claim herein, further configured such that multiple different pieces of media content of the mixed media module being displayed each provide the user with additional information of differing scope.
92. The invention of any claim herein, wherein the mixed media module provides a layer of interactivity such that deeper exploration of search results is provided to the user without leaving the original search result page.
93. The invention of any claim herein, wherein the mixed media module displays media content without the search engine generating previews of websites to save bandwidth.
94. The invention of any claim herein, wherein the user operates the interactive layer to customize the content of the mixed media module display.
95. The invention of any claim herein, wherein the mixed media module loads audio and video data within the mixed media module so as to prevent loading of the audio and video data to an external page.
96. The invention of any claim herein, wherein the mixed media module loads audio and video data within the mixed media module such that bandwidth is reduced.
97. A method for processing search information, the method comprising:
processing information to return, to a user, search results via a search engine, in a results page including at least one multimedia presentation;
providing, for display via a search results page, at least one interactive multimedia presentation selectable by the user;
providing, as a function of a first interaction of the user with a selected multimedia presentation, user access to at least one of third party information, web sites, content, applications and/or other multimedia;
providing, as a function of a second interaction of the user with the selected multimedia presentation, functionality configured to receive a new search query and generate a new search results page.
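As a rough, non-authoritative illustration of the two interactions recited in claim 97, the sketch below separates the first interaction (access to third party information, sites, content, applications or other multimedia) from the second interaction (submitting a new query and generating a new search results page). All types and names are assumptions made for the example.

```typescript
// Hypothetical sketch of the two-interaction flow of claim 97.
interface MultimediaPresentation {
  id: string;
  thirdPartyLinks: string[];  // third party information, sites, content, apps, other multimedia
}

interface SearchResultsPage {
  query: string;
  results: string[];
  presentations: MultimediaPresentation[];
}

type SearchEngine = (query: string) => Promise<SearchResultsPage>;

// First interaction: expose access to third party material tied to the selected presentation.
function onFirstInteraction(selected: MultimediaPresentation): string[] {
  return selected.thirdPartyLinks;
}

// Second interaction: treat the user's action as a new search query and build a new results page.
async function onSecondInteraction(newQuery: string, search: SearchEngine): Promise<SearchResultsPage> {
  return search(newQuery);
}
```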
98. The method of claim 97 or other claims herein, wherein the search engine is configured to process, as the second interaction, selection by the user of a piece of content within the selected multimedia presentation during play of the selected multimedia presentation.
99. A method for processing search information, the method comprising:
processing information to return, to a user, search results via a search engine, in a results page including at least one pre-prepared, non-rendered narrative multimedia presentation;
providing, for display via a search results page, at least one interactive multimedia presentation selectable by the user;
providing, as a function of a first interaction of the user with a selected multimedia presentation, user access to at least one of third party information, web sites, content, applications and/or other multimedia;
providing, as a function of a second interaction of the user with the selected multimedia presentation, functionality configured to receive a new search query and generate a new search results page.
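Claim 99 differs from claim 97 in that the presentation is a pre-prepared, non-rendered narrative multimedia presentation. One possible, purely illustrative data shape for such a presentation is sketched below; the field names and the generator-based playback are assumptions, not the claimed implementation.

```typescript
// One possible shape for a pre-prepared, non-rendered narrative presentation (claim 99):
// an ordered set of mixed-media segments assembled ahead of time and played back by the
// client, rather than a single server-rendered video file.
interface NarrativeSegment {
  caption: string;                                                // playable caption, cf. claim 80
  media: { kind: "text" | "image" | "audio"; source: string }[];
  durationMs: number;
}

interface NarrativePresentation {
  query: string;
  segments: NarrativeSegment[];  // prepared before the search results are served
}

// Playback simply walks the prepared segments; nothing is transcoded or rendered to video.
function* play(presentation: NarrativePresentation): Generator<NarrativeSegment> {
  for (const segment of presentation.segments) {
    yield segment;
  }
}
```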
100. A method for processing search information, the method comprising:
returning search results in a search results page including one or more pre-prepared narrative multimedia presentations;
providing at least one interactive multimedia presentation selected by a user;
providing access to at least one of additional third party information, sites, content, applications and other multimedia;
wherein the pre-prepared narrative multimedia presentation is configured to:
allow user interaction to manipulate the presentation; and
cause display of more than one kind of media not including rendered video.
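The manipulation recited in claim 100 (and the non-linear display path of claim 84) can be pictured as the user choosing which prepared segment to show next rather than stepping through a fixed order. The sketch below is a minimal, hypothetical model of that state transition; the names are illustrative only.

```typescript
// Hypothetical model of non-linear manipulation: the user chooses the display path
// through the prepared segments instead of following a fixed order.
interface PresentationState {
  segments: string[];  // identifiers of the available mixed-media segments
  visited: string[];   // the display path chosen by the user so far
}

function chooseNext(state: PresentationState, segmentId: string): PresentationState {
  if (!state.segments.includes(segmentId)) {
    throw new Error(`unknown segment: ${segmentId}`);
  }
  return { ...state, visited: [...state.visited, segmentId] };
}
```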
101. A method for processing search information, the method comprising:
returning search results in a search results page including one or more pre-prepared narrative multimedia presentations;
providing at least one integrated multimedia presentation selected by a user;
providing access to at least one of additional third party information, sites, content, applications and other multimedia;
wherein the multimedia presentations are configured for low-bandwidth and non-rendered display for use on a mobile device.
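Claim 101 ties the presentation to low-bandwidth, non-rendered display on a mobile device. A simple way to picture this is variant selection at serving time, as in the hypothetical sketch below; the size threshold and field names are assumptions for illustration only.

```typescript
// Hypothetical variant selection for low-bandwidth, non-rendered display on mobile devices.
interface PresentationVariant {
  rendered: boolean;           // true = pre-rendered video, false = segment-based narrative
  estimatedKilobytes: number;  // rough transfer size of the variant
}

function pickVariant(
  variants: PresentationVariant[],
  isMobile: boolean,
  maxKilobytes = 512  // assumed threshold, for illustration only
): PresentationVariant | undefined {
  const candidates = isMobile
    ? variants.filter(v => !v.rendered && v.estimatedKilobytes <= maxKilobytes)
    : variants;
  // Prefer the smallest acceptable variant; undefined if nothing qualifies.
  return [...candidates].sort((a, b) => a.estimatedKilobytes - b.estimatedKilobytes)[0];
}
```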
102. A method for processing search information, the method comprising:
processing information to return, to a user, search results via a search engine, in a results page including multimedia information;
providing, for display via a search results page, at least one mixed-media module selectable by the user; and
providing, as a function of interaction of the user with a selected mixed-media module, additional information and/or functionality to the user.
103. One or more components and/or computer-readable media comprising computer-readable instructions executable by one or more processing components to perform a method for processing search information, the method comprising:
one or more steps and/or features recited in any claim(s) herein and/or set forth in the present disclosure.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201280060719.3A CN104145265B (en) | 2011-12-15 | 2012-12-17 | It is related to the system and method for feature for searching for and/or searching for integration |
EP12857892.9A EP2791780A4 (en) | 2011-12-15 | 2012-12-17 | Systems and methods involving features of search and/or search integration |
CA2857517A CA2857517A1 (en) | 2011-12-15 | 2012-12-17 | Systems and methods involving features of search and/or search integration |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201161576352P | 2011-12-15 | 2011-12-15 | |
US61/576,352 | 2011-12-15 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2013090946A1 true WO2013090946A1 (en) | 2013-06-20 |
Family
ID=48613295
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/US2012/070214 WO2013090946A1 (en) | 2011-12-15 | 2012-12-17 | Systems and methods involving features of seach and/or search integration |
Country Status (4)
Country | Link |
---|---|
EP (1) | EP2791780A4 (en) |
CN (1) | CN104145265B (en) |
CA (1) | CA2857517A1 (en) |
WO (1) | WO2013090946A1 (en) |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104991897B (en) * | 2015-05-29 | 2018-09-25 | 百度在线网络技术(北京)有限公司 | Weights and measures searching method and device |
CN105159993A (en) * | 2015-09-02 | 2015-12-16 | 百度在线网络技术(北京)有限公司 | Search method and device |
CN105138697B (en) * | 2015-09-25 | 2018-11-13 | 百度在线网络技术(北京)有限公司 | A kind of search result shows method, apparatus and system |
CN107229741B (en) * | 2017-06-20 | 2021-12-10 | 百度在线网络技术(北京)有限公司 | Information searching method, device, equipment and storage medium |
CN107657024B (en) * | 2017-09-27 | 2021-03-23 | 百度在线网络技术(北京)有限公司 | Search result display method, device, equipment and storage medium |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101339548A (en) * | 2007-07-05 | 2009-01-07 | 上海美通无线网络信息有限公司 | Method for implementing resource search proxy to mobile phone by internet |
US9268856B2 (en) * | 2007-09-28 | 2016-02-23 | Yahoo! Inc. | System and method for inclusion of interactive elements on a search results page |
2012
- 2012-12-17 CA CA2857517A patent/CA2857517A1/en not_active Abandoned
- 2012-12-17 EP EP12857892.9A patent/EP2791780A4/en not_active Withdrawn
- 2012-12-17 CN CN201280060719.3A patent/CN104145265B/en not_active Expired - Fee Related
- 2012-12-17 WO PCT/US2012/070214 patent/WO2013090946A1/en active Application Filing
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5911139A (en) * | 1996-03-29 | 1999-06-08 | Virage, Inc. | Visual image database search engine which allows for different schema |
US7716235B2 (en) * | 2001-04-20 | 2010-05-11 | Yahoo! Inc. | Phonetic self-improving search engine |
RU2435212C2 (en) * | 2006-03-02 | 2011-11-27 | Майкрософт Корпорейшн | Collecting data on user behaviour during web search to increase web search relevance |
US20090019034A1 (en) | 2007-06-26 | 2009-01-15 | Seeqpod, Inc. | Media discovery and playlist generation |
US20090327268A1 (en) * | 2008-06-27 | 2009-12-31 | Microsoft Corporation | Providing targeted information for entertainment-oriented searches |
US20110288913A1 (en) | 2010-05-20 | 2011-11-24 | Google Inc. | Interactive Ads |
Non-Patent Citations (2)
Title |
---|
HOW TO EMBED ALMOST ANYTHING IN YOUR WEBSITE, 9 January 2009 (2009-01-09) |
See also references of EP2791780A4 |
Cited By (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2015072880A1 (en) * | 2013-11-15 | 2015-05-21 | Yandex Europe Ag | A method of processing a user request within a search result page |
WO2015072881A1 (en) * | 2013-11-15 | 2015-05-21 | Yandex Europe Ag | A method of presenting information on a search result page |
CN103902678A (en) * | 2014-03-21 | 2014-07-02 | 百度在线网络技术(北京)有限公司 | Search recommendation method and device |
WO2015139459A1 (en) * | 2014-03-21 | 2015-09-24 | 百度在线网络技术(北京)有限公司 | Method and device for search and recommendation |
KR20160018564A (en) * | 2014-03-21 | 2016-02-17 | 바이두 온라인 네트웍 테크놀러지 (베이징) 캄파니 리미티드 | Method and device for search and recommendation |
KR101711322B1 (en) | 2014-03-21 | 2017-02-28 | 바이두 온라인 네트웍 테크놀러지 (베이징) 캄파니 리미티드 | Method and device for search and recommendation |
US10909112B2 (en) | 2014-06-24 | 2021-02-02 | Yandex Europe Ag | Method of and a system for determining linked objects |
US10089412B2 (en) | 2015-03-30 | 2018-10-02 | Yandex Europe Ag | Method of and system for processing a search query |
WO2017084306A1 (en) * | 2015-11-20 | 2017-05-26 | 乐视控股(北京)有限公司 | Method and apparatus for playing key information of video in browser of mobile device |
Also Published As
Publication number | Publication date |
---|---|
CN104145265B (en) | 2019-04-05 |
EP2791780A4 (en) | 2016-05-11 |
EP2791780A1 (en) | 2014-10-22 |
CA2857517A1 (en) | 2013-06-20 |
CN104145265A (en) | 2014-11-12 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2013090946A1 (en) | Systems and methods involving features of seach and/or search integration | |
US10394420B2 (en) | Computer-implemented method of generating a content recommendation interface | |
US10417289B2 (en) | Systems and methods involving integration/creation of search results media modules | |
JP5845254B2 (en) | Customizing the search experience using images | |
US8943440B2 (en) | Method and system for organizing applications | |
US20100146012A1 (en) | Previewing search results for suggested refinement terms and vertical searches | |
US20110289458A1 (en) | User interface animation for a content system | |
US10303723B2 (en) | Systems and methods involving search enhancement features associated with media modules | |
US9843823B2 (en) | Systems and methods involving creation of information modules, including server, media searching, user interface and/or other features | |
KR20130130748A (en) | Multi-mode web browsing | |
WO2012039966A1 (en) | Media content recommendations based on prefernces different types of media content | |
US20170091336A1 (en) | Method and apparatus for generating a recommended set of items for a user | |
US10296158B2 (en) | Systems and methods involving features of creation/viewing/utilization of information modules such as mixed-media modules | |
US10387503B2 (en) | Systems and methods involving features of search and/or search integration | |
KR20210154957A (en) | Method and system for adding tag to video content | |
US11099714B2 (en) | Systems and methods involving creation/display/utilization of information modules, such as mixed-media and multimedia modules | |
US20170052953A1 (en) | Systems and methods involving features of search and/or search integration | |
US10504555B2 (en) | Systems and methods involving features of creation/viewing/utilization of information modules such as mixed-media modules | |
EP2795444A1 (en) | Systems and methods involving features of creation/viewing/utilization of information modules | |
JP5973480B2 (en) | Information processing apparatus, information processing method, and program | |
WO2013188603A2 (en) | Systems and methods involving search enhancement features associated with media modules | |
JP2012038315A (en) | System and method for retrieving image information |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 12857892; Country of ref document: EP; Kind code of ref document: A1 |
| ENP | Entry into the national phase | Ref document number: 2857517; Country of ref document: CA |
| NENP | Non-entry into the national phase | Ref country code: DE |
| WWE | Wipo information: entry into national phase | Ref document number: 2012857892; Country of ref document: EP |