AU2011218613A1 - Video interaction - Google Patents

Video interaction

Info

Publication number
AU2011218613A1
Authority
AU
Australia
Prior art keywords
information
user
video content
video
user input
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
AU2011218613A
Inventor
Dale Herigstad
Harry Kargman
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
ACK Ventures Holdings LLC
Original Assignee
ACK Ventures Holdings LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by ACK Ventures Holdings LLC filed Critical ACK Ventures Holdings LLC
Priority to AU2011218613A priority Critical patent/AU2011218613A1/en
Publication of AU2011218613A1 publication Critical patent/AU2011218613A1/en
Abandoned legal-status Critical Current

Landscapes

  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)

Abstract

A method comprising: while primary video content is being presented through a first communication channel on a primary video device, on a handheld wireless device, displaying information describing video content available for viewing on the primary video device, the information received through a second communication channel; and in response to user input, displaying and storing a graphical element representing the information.

Description

AUSTRALIA Patents Act 1990 COMPLETE SPECIFICATION Standard Patent Applicant(s): ACK Ventures Holdings, LLC Invention Title: VIDEO INTERACTION The following statement is a full description of this invention, including the best method for performing it known to me/us:

VIDEO INTERACTION

RELATED APPLICATION

The present application is a divisional application of Australian patent application no. 2010201682, which in turn is a divisional application of Australian patent application no. 2007200584, which in turn is a divisional application of Australian patent application no. 2001295029. The present application relates to subject matter disclosed in Australian patent application nos. 2010201682, 2007200584 or 2001295029. Most of the disclosure of applications 2010201682, 2007200584 and 2001295029 is included herein to facilitate understanding of the present invention. If necessary, reference may be made to application nos. 2010201682, 2007200584 or 2001295029 to understand the present invention, and the disclosure of application nos. 2010201682, 2007200584 and 2001295029 is incorporated herein by reference.

FIELD

This invention relates to video interaction.

BACKGROUND

One rudimentary level of television interaction, for example, is achieved with a simple remote controller, which enables local control of the channel, volume, picture quality, and other aspects of the television presentation. A variety of schemes also have been proposed to enable a user to interact more actively with the television program content.
SUMMARY

In one aspect the invention provides a method comprising: while primary video content is being presented through a first communication channel on a primary video device, on a handheld wireless device, displaying information describing video content available for viewing on the primary video device, the information received through a second communication channel; and in response to user input, displaying and storing a graphical element representing the information.

2795823_1 (GHMatters) P60814.AU.3 26/08/11

In an embodiment, the method comprises displaying second information describing second video content available for viewing on the primary video device, the second information received through the second communication channel.

In an embodiment, the method comprises displaying information on the handheld wireless device related to the primary video content displayed on the primary video device.

In an embodiment, the information describing video content available for viewing is displayed in response to user input.

In an embodiment, the user input comprises an invocation of the graphical element.

In an embodiment, the method comprises receiving the information describing video content from an Internet server.

In an embodiment, the method comprises displaying a graphical element representing interactivity material.

In an embodiment, the information describing video content includes an image.

In an embodiment, the image comprises a thumbnail image.

In an embodiment, the information describing video content includes video.

In an embodiment, the information describing video content includes text.

In an embodiment, the user input comprises a selection of a channel.

In an embodiment, the information describing video content is provided by an electronic program guide.

In an embodiment, the method comprises accepting a choice by the user of displaying, on the primary video device, the video content available for viewing.
22601261 (GHMatters) 28/04/10

In an embodiment, the method comprises generating a tag representative of the information represented by the graphical element.

In an embodiment, the tag is generated by capturing a time and a channel of video content.

In an embodiment, the tag is generated by sending a query from the handheld wireless device to a server in response to user input.

In an embodiment, information or functions corresponding to the query are received from the server and stored in the handheld wireless device for use in interacting with the user.

In an embodiment, the tag is generated by a server in response to a query from the handheld wireless device.

In an embodiment, the information describing video content comprises a promotion.

In an embodiment, the promotion is user-targeted.

In an embodiment, the information describing video content comprises an advertisement.

In an embodiment, the advertisement is user-targeted.

In an embodiment, the information comprises an advertisement for video content.

In an embodiment, the method comprises displaying video content represented by the advertisement.

In an embodiment, the advertisement for video content is a television advertisement.

In an embodiment, the method comprises sending, to another user, a reference to the information.

In an embodiment, the method comprises sending, to another mobile device, a reference to the information.

In an embodiment, the method comprises sending, to a computer, a reference to the information.

In an embodiment, the method comprises sending, to a video device, a reference to the information.

In an embodiment, the method comprises storing a reference to the information.

In an embodiment, the information is displayed in a first portion of a display of the handheld wireless device.

In an embodiment, the graphical element is displayed in a second portion of the display of the handheld wireless device.
In an embodiment, the graphical element is an icon.

In an embodiment, the graphical element is received through the second communication channel.

In another aspect, the invention provides a method comprising: while primary video content is being presented through a primary communication channel on a video device, on a handheld wireless device, displaying a graphical element representing interactivity material matched with the video program, the interactivity material received through a second communication channel; and in response to user input, displaying the interactivity material.

In an embodiment, the interactivity material is received from another user.

In an embodiment, the interactivity material is an advertisement.

In an embodiment, the advertisement is user-targeted.

In an embodiment, the interactivity material involves an e-commerce opportunity.

In an embodiment, the interactivity material is a promotion.

In an embodiment, the promotion is user-targeted.

In an embodiment, the user input comprises an invocation of the graphical element.

In an embodiment, the user input comprises a voice command.

In an embodiment, the user input comprises keyboard input.

In an embodiment, the user input comprises microphone input.

In an embodiment, the method comprises, in response to user input, storing a reference to the interactivity material.

In an embodiment, the method comprises, in response to user input, retrieving a stored reference to the interactivity material.

In an embodiment, the method comprises, in response to user input, sending a reference to the interactivity material to another user.

In an embodiment, the method comprises, in response to user input, sending a reference to the interactivity material to another device.

In an embodiment, the method comprises, in response to user input, sending a reference to the interactivity material to a web site.
In an embodiment, the method comprises receiving the interactivity material from another user.

In an embodiment, the interactivity material comprises an image.

In an embodiment, the interactivity material comprises video.

In an embodiment, the interactivity material comprises text.

In an embodiment, the interactivity material comprises an embedded link.

In an embodiment, the interactivity material is displayed in a first portion of a display of the handheld wireless device.

In an embodiment, the graphical element is displayed in a second portion of the display of the handheld wireless device.

In an embodiment, the graphical element is an icon.

Among the advantages of the embodiments of the invention are one or more of the following:

Instead of requiring modifications to the video content broadcast stream or demanding that a user purchase additional equipment, embodiments of the invention work with existing video systems and a user's handheld device, requiring only an Internet connection.

Embodiments of the invention incorporate an intuitive, aesthetic interface that allows a user simply to tap on the screen of his handheld when TV content of interest appears on the TV screen.

For the content provider, all that is required is to provide the information necessary for the interactive links on their own servers.

Embodiments of the invention allow PDA and mobile phone users to expand their TV viewing experience by letting them "grab" subjects or individual items in TV programs or commercials to expand the viewing experience and gather further information on content of interest.

Embodiments of the invention resolve the design battle for screen real estate by allowing the enhanced television experience to occur in the palm of the hand, rather than on the television set.
Embodiments of the invention avoid interrupting the viewing experience, which is protected because the user is not required to deal immediately with the grabbed content.

Embodiments of the invention enhance television commercials, allowing users to acquire additional information as well as purchase goods. Advertisers can offer users targeted promotions as well as gain instant feedback about the effectiveness of their advertising.

Embodiments of the invention enable a user's mobile device to act also as a universal remote control for the user's television and entertainment system. The mobile device can also display functions of an Electronic Programming Guide (EPG), give specifically targeted promotions for programming, and offer easily accessed program schedules, all within the same device that provides the content enhancements.

Given user permissions, the mobile device can organize content "grabbed" by the user so that content is hierarchically displayed according to a user's pre-set interests. Advertisers can use this information to offer user-targeted promotions. The system can also allow filtering to streamline the display based on the user's preferences. For instance, a hockey fan viewing enhanced sports content from a news broadcast may not want to see further information on the day's tennis match. He can set his profile to indicate this.

Other advantages and features will become apparent from the following description and from the claims.

DESCRIPTION

Figures 1 through 23 show user interface screens. Figure 24 is a block diagram.

As shown in figure 24, a user's experience in viewing and using video content on a video device 10 can be enhanced by enabling the user 12 to interact with a mobile device 14 (such as an advanced remote controller) that is synchronized with the video content, for example, by indicating other content that may be of interest to the user.
As the user views the video content, he may be prompted periodically, e.g., by a "hot" icon 16 on a display 18 of the mobile device or in some other way, about an opportunity for interactivity that coincides with or is related to the video content. The viewer may indicate an interest in the available interactivity by, for example, invoking the icon, which triggers the generation of a tag 20. The tag may be stored in the local memory 22 of the mobile device 14 and/or provided in some way to a remote server 24. The tags enable the server to provide the interactivity to the user either immediately or at some later time or in some other way.

Tagging can occur in any of at least the following four ways:

1. The tag can be a timestamp that captures the time (using a system clock on the mobile device) and the channel of video content that is being viewed at that time. The time and channel identity can together be called the "coordinates". When the coordinates 23 are sent to a remote server 24, the server can retrieve interactivity information and functions 26 corresponding to the coordinates and can return the information or provide the functions to the mobile device.

2. The tag can be in the form of a query 26 sent to the server at the exact time that the user indicates interest by invoking the icon. The server responds with the corresponding information or functions, which are then stored in the remote device for use in interacting with the user.

3. The tagging can be done by the server in response to the mobile device querying the server at the exact time of the viewer's indicated interest. The server may generate the tag including embedded links 28 and return it to the mobile device, where it is stored. Later, by clicking on a displayed version of this tag and on displayed versions of the embedded links, the user can access the server to retrieve the interactivity information or functions.

4.
Alternatively, the server may constantly broadcast a changing set of tags that are dynamically synchronized to the changing video content being viewed by the user. The mobile device displays the tags as they are received. When the user sees something of interest, he invokes the currently displayed tag, which corresponds to the current video content. The tag holds embedded links, and the user can then access interactivity information or functions that correspond to the links.

The tagging methods allow the user to access and retrieve additional information and functions synchronously or asynchronously to the changing stream of video content. For example, the additional information or functions could involve (a) additional news that supplements news contained in the video content, (b) e-commerce opportunities, (c) web pages on the Internet, (d) participation in newsgroups, (e) an opportunity to ask questions, (f) setting the mobile device to perform some action such as recording future video content, or (g) any other information, action or transaction that the host of the remote server 24 chooses to make available.

The mobile device 14 may be powered by a battery 30 and include an input/output capability 32, such as a touch screen, a keypad, or a microphone to receive voice. A wireless access capability 34 may be included to provide for wireless access to the video device, both to control the video device (much as a conventional remote controller would do) and to display information on the video device. A second wireless access capability 36 provides wireless access to the Internet or other wide area network to carry the tags, queries, coordinates, and interactivity information and functions. (In some cases, wireless access 34 to the video device will be sufficient to provide the functions of wireless capability 36 if the video device has the ability to communicate through the network 40.)
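As a non-normative illustration of the coordinate-based tagging described above (all class and parameter names here are hypothetical, not taken from the specification), invoking the hot icon captures the device clock time and the current channel, stores the resulting tag in local memory (item 22), and optionally forwards it to the remote server (item 24):

```python
import time

class MobileDevice:
    """Sketch of client-side tagging: invoking the "hot" icon generates a
    tag holding the current "coordinates" (time and channel), which is
    stored in local memory and/or forwarded to the remote server."""

    def __init__(self, channel, send_to_server=None):
        self.channel = channel                # channel currently being viewed
        self.local_memory = []                # stored tags (local memory 22)
        self.send_to_server = send_to_server  # optional uplink to server 24

    def invoke_hot_icon(self):
        # Method 1: a timestamp from the device's system clock plus the
        # channel identity together form the tag's "coordinates".
        tag = {"time": time.time(), "channel": self.channel}
        self.local_memory.append(tag)
        if self.send_to_server is not None:
            self.send_to_server(tag)  # server resolves the interactivity
        return tag
```

Methods 2 and 3 differ only in where the tag is built (a query at invocation time, or a server-generated tag with embedded links); the client-side capture step is the same.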
The mobile device runs software 42 that is configured to perform the following functions: (1) provide secure networking capability to access the server 24 through the network 40, (2) display text and images, and perform sound and video, (3) tag information as explained earlier, and (4) store information as explained earlier.

The interactivity information and functions, which can together be called the "interaction content" 46, are created by an interaction content creator using content creation software 44, and then loaded into the server for use at run time. The content creator develops the interaction content based on the video content on various channels of the video feed 60 that are known to the creator in advance of the feed being delivered to the user's video device. Each item of interaction content may be related to one or more segments of video content and may be associated with those segments as part of the content creation.

The content creation software provides the following functions: (1) a graphical user interface that enables the content creator to author information and functions that are to be synchronized to video content that will be conveyed to the user, (2) a tag generation function that enables the creator to build tags that include audio, video, text, and images, (3) a function that enables a content creator to build icons and interactivity information and functions, (4) the testing of software to validate links, and (5) staging software to mount finished interactivity content to the staging server 24. The interaction content is loaded into the interactivity information and functions 26 stored at the server. At the same time, a table or map 29 that associates video content segments with related interaction segments can be loaded into the server.
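The table or map (29) that associates video segments with interaction segments could be sketched as follows. Keying segments by channel plus a [start, end) time window is an assumption made for illustration; the specification does not fix a concrete data layout:

```python
class InteractionMap:
    """Sketch of the table/map (29): each interaction item is associated
    with one or more video segments at content-creation time, then used
    at run time to resolve a tag's coordinates to interaction content."""

    def __init__(self):
        self._entries = []  # list of (channel, start, end, interaction_id)

    def associate(self, channel, start, end, interaction_id):
        """Content-creation step: tie an interaction item to a segment."""
        self._entries.append((channel, start, end, interaction_id))

    def resolve(self, tag):
        """Run-time step: given a tag's coordinates, return the related
        interaction items the server should serve."""
        return [iid for ch, start, end, iid in self._entries
                if ch == tag["channel"] and start <= tag["time"] < end]
```

In this sketch the server would call `resolve` with the coordinates received from the mobile device and could further filter the result against the user's preference and personalization information (52).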
The server runs server staging software 50 that (1) manages and hosts interactivity content 46 created using the content creation software, (2) affords networking capability to manage multiple simultaneous requests from mobile devices, (3) manages preference and personalization information 52 that customizes the interactivity content to the user, (4) builds reports on usage to enable feedback for advertising and marketing purposes, and (5) enables creation and recording of e-commerce transactions. At run time, the server can use tags generated by user interaction with the mobile device indicating the video content being viewed to identify and serve related interaction content, based on associations stored in the map 29. The server can also base the interaction content on the more general preference and personalization information 52.

Referring in more detail to the mobile device, the wireless capability 34, 36 could use any of a wide variety of protocols including PCS, cellular, Bluetooth, 802.11, WiFi, infrared, or alternative technology. The mobile device could be able to connect to and control the video feed 60 that drives the video device, for example, in the way that a universal remote controller controls a television, VCR, settop box, or other video device. The mobile device must also simultaneously connect to the network 40.

The mobile device could be, for example, a wireless-enabled PDA such as a Compaq iPaq or Palm Pilot, or an advanced-functionality remote controller of the kind that includes a color LCD screen, an infrared link to control the television, VCR, settop box, or other video device, and an 802.11b card to connect to a local area network that provides access to the Internet.

In one example of the use of the system, a user watching a television would also see a video or still picture on the screen of his remote controller that is serving as the mobile device.
The video or still picture (which may be thought of as a mobile feed 62 that is separate from the video feed) changes continually (as arranged by the content creator) to correspond to the video content carried on the video feed.

If the user changes the channel of the video feed using the remote controller, the mobile feed is automatically changed by the server to the new channel. The information and functions on the remote controller are updated in real time over the network and are synchronized with the video feed by the server.

From time to time, the mobile feed displayed on the remote controller will display a "hot" icon to prompt the user that interactivity is available. The interactivity opportunity is predetermined by the content author of the mobile feed. If the video content on the video feed piques the user's interest at the time of the prompting, the user can tap the hot icon on the screen (or invoke the interactivity opportunity by voice command or keyboard), which tags the content.

From a user interface perspective, when the user tags the content, the hot icon moves from the mobile feed window to a folder or area where it is stored for later access. The stored icon then represents the link that the user can later click to access interactivity information or functions that the author of the tag has created. The user can access, save, store and send the information or function. The author of the tag can determine whether the user can edit the information or function. The user may choose to access the tagged information on the mobile device or may send it to another device such as a computer, television or other device that has a screen and input/output capability. The interactivity content is portable and may be sent to other people and other devices.
To enable the synchronization of content between the video feed and the mobile feed, the mobile device is able to identify the channel that the video feed is on and report the channel to the server. If the mobile device is one that has remote controller functionality, then when the user changes channel, the channel identity is automatically known to the controller and can be reported to the server. Alternatively, the user may specifically enter the number of the channel into the mobile device at the time he changes channels, or the television, settop box, or VCR may send the current channel information to the mobile device at the time the user changes the channel. By determining the channel information, the mobile device can provide information to the server necessary to synchronize the mobile feed with the video feed.

The content creation software may be similar to a simple version of website development software.

The mobile feed controls the synchronization of the video feed with the information that the mobile device retrieves. Sometimes, the mobile feed can provide ancillary information to augment or complement the video feed. The content creation software also provides tools to embed hot icons in the mobile feed to indicate interactivity. It also has a module to create the links and to build the tags and information that are staged on the server and will be viewed on either the mobile device or another end-user device.

The server stages the mobile feed and the embedded links. The content creator uses available software to create both the mobile feed and the embedded information which the user accesses when the user tags the content. The mobile feed may be text, a picture or even full video. The source of the video feed and the source of the mobile feed can be located in one place or in two different places.

An example of a user interface that could be provided to a user of a mobile device is illustrated in figures 1 through 23.
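The channel-reporting scheme described above can be sketched minimally (names hypothetical): when the channel changes through the device's remote-controller function, the new channel identity is known locally and is reported so the server can switch the mobile feed to match:

```python
class ChannelSync:
    """Sketch of mobile-feed synchronization: the remote-controller
    function knows the new channel at the moment of the change and
    reports it to the server, which re-synchronizes the mobile feed
    with the video feed."""

    def __init__(self, report_to_server):
        self.report_to_server = report_to_server  # hypothetical uplink
        self.channel = None

    def change_channel(self, channel):
        self.channel = channel          # channel identity known locally
        self.report_to_server(channel)  # server switches the mobile feed
```

The same report could equally originate from the television, settop box, or VCR, or from the user entering the channel number by hand, as the specification notes.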
During a television show, the PDA may display images or information that relate directly or indirectly to what is being shown on the television program. The user can bookmark or flag (i.e., tag) pieces of information that come from the mobile feed as he sees them on the PDA. Items that are bookmarked or flagged can be retrieved later from the server at a time convenient to the user.

For example, if the user sees a short clip of a program, such as a baseball game, on the television, he can indicate through the PDA interface that he wishes to bookmark the program. Later he can retrieve additional information about the program from the server.

In general, the invention enables a user to work with a second parallel synchronized source of information and interaction while watching a television show, to identify items of interest that are displayed on the PDA, and to later retrieve or take some other action with respect to the items of interest.

Figure 1 shows the channel and volume controls on the left side that would be used for remote control of the television. The current channel is MTV News, channel 082.

Figure 2 shows a banner related to "The Source Hip-Hop Music Awards" which is associated with a related segment being shown on the television. If the user is interested in bookmarking this item, he presses on the place where the down arrow is shown and his request is transmitted to the server for storage.

On figure 3, a different banner is shown and the user again has the opportunity to bookmark the item by pressing the down arrow.

On the next screen, figure 4, an icon associated with the banner that appeared on figure 3 is shown in the bottom half of the screen, indicating that the user has bookmarked this item.

On figure 7, the user has pressed the nine-button icon in the upper left corner, indicating a desire to enter information on a numeric keypad, and the buttons of the icon have turned black.
In figures 8, 9, and 10, the keyboard is scrolled from left to right onto the screen. In figures 11, 12, and 13, the user has pressed 043 to change the channel to CNN Headline Sports. The information is sent to the Internet server, which then changes the program material that is being transmitted to the user's television. The banner for that program is then displayed on the top of the screen.

In figure 15, the user sees a banner for the NASCAR Brickyard 400 program and has the opportunity to press a large down arrow to bookmark it or to press a small down arrow that is labeled ticker.

In figure 16, the NASCAR icon has been added to the bookmarked items.

In figure 17, the banner has changed to the Senior Burnet Classic in accordance with a short item being shown at the same time on the television. As before, the user has the chance to add this to his bookmarked set at the bottom half of the screen.

In figure 18, the Major League Baseball scores are shown. In figure 19, the user has highlighted the score of the Brewers-Dodgers game and added it to the bookmarked items.

In figure 21, the user is shown information about the Brickyard 400 race, including the three leading contenders. The user is given the chance to view the race or scores or to buy stuff. On figure 22, the user has pressed the item the RACE and is shown four thumbnails of race pictures. By pressing on the lower left-hand picture, the user is shown an enlarged image on figure 23 together with a text caption.

The ticker arrow allows the user to scroll through an entire sequence of different short clips, just as, for example, the television may broadcast a series of short clips of baseball games. Because the Internet server has personalization information about the user on the server, the ticker can be altered to suit the user's tastes so that when he presses the ticker arrow, he sees a sequence of short clips that are of interest to him.
Any icon that has been generated as a result of bookmarking can be invoked at any time by the user by simply pressing that icon. Then the Internet server will serve images, video, or information related to that icon.

Other implementations are within the scope of the following claims.

For the purposes of this specification it will be clearly understood that the word "comprising" means "including but not limited to", and that the word "comprises" has a corresponding meaning.

Claims (43)

  1. 2. The method of claim I comprising displaying second information describing second video content available for viewing on the primary video device, the second information 15 received through the second communication channel.
  2. 3. The method of claim I comprising displaying information on the handheld wireless device related to the primary video content displayed on the primary video device. 20 4. The method of claim I wherein the information describing video content available for viewing is displayed in response to user input.
  3. 5. The method of claim 4 wherein the user input comprises an invocation of the graphical element. 25
  4. 6. The method of claim I comprising receiving the information describing video content from an Internet server.
  5. 7. The method of claim I comprising displaying a graphical element representing 30 interactivity material.
  6. 8. The method of claim I wherein the information describing video content includes an image. 35 9. The method of claim 8 wherein the image comprises a thumbnail image. 22601261 (GHMatters) 28/04/10 16
  7. 10. The method of claim I wherein the information describing video content includes video.
  8. 11. The method of claim I wherein the information describing video content includes text. 5
  9. 12. The method of claim I wherein the user input comprises a selection of a channel.
  10. 13. The method of claim I wherein the information describing video content is provided by an electronic program guide. 10
  11. 14. The method of claim 1 comprising accepting a choice by the user of displaying, on the primary video device, the video content available for viewing.
  12. 15. The method of claim I comprising generating a tag representative of the information 15 represented by the graphical element.
  13. 16. The method of claim 15 wherein the tag is generated by capturing a time and a channel of video content. 20 17. The method of claim 15 wherein the tag is generated by sending a query from the handheld wireless device to a server in response to user input.
  14. 18. The method of claim 17 wherein information or functions corresponding to the query are received from the server and stored in the handheld wireless device for use in interacting 25 with the user.
  15. 19. The method of claim 15 wherein the tag is generated by a server in response to a query from the handheld wireless device. 30 20. The method of claim I wherein the information describing video content comprises a promotion.
  16. 21. The method of claim 20 wherein the promotion is user-targeted. 35 22. The method of claim I wherein the information describing video content comprises an advertisement. 22601261 (GHMatters) 28/04/10 17
  17. 23. The method of claim 22 wherein the advertisement is user-targeted.
  18. 24. The method of claim 22 wherein the information comprises an advertisement for video 5 content.
25. The method of claim 24 comprising displaying video content represented by the advertisement.

26. The method of claim 24 wherein the advertisement for video content is a television advertisement.
27. The method of claim 1 comprising sending, to another user, a reference to the information.
28. The method of claim 1 comprising sending, to another mobile device, a reference to the information.
29. The method of claim 1 comprising sending, to a computer, a reference to the information.
30. The method of claim 1 comprising sending, to a video device, a reference to the information.

31. The method of claim 1 comprising storing a reference to the information.
32. The method of claim 1 wherein the information is displayed in a first portion of a display of the handheld wireless device.

33. The method of claim 32 wherein the graphical element is displayed in a second portion of the display of the handheld wireless device.
34. The method of claim 22 wherein the graphical element is an icon.

35. The method of claim 22 wherein the graphical element is received through the second communication channel.
36. A method comprising: while primary video content is being presented through a primary communication channel on a video device, on a handheld wireless device, displaying a graphical element representing interactivity material matched with the video program, the interactivity material received through a second communication channel; and in response to user input, displaying the interactivity material.

37. The method of claim 36 wherein the interactivity material is received from another user.
38. The method of claim 36 wherein the interactivity material is an advertisement.

39. The method of claim 38 wherein the advertisement is user-targeted.
40. The method of claim 36 wherein the interactivity material involves an e-commerce opportunity.

41. The method of claim 36 wherein the interactivity material is a promotion.
42. The method of claim 41 wherein the promotion is user-targeted.
43. The method of claim 36 wherein the user input comprises an invocation of the graphical element.
44. The method of claim 36 wherein the user input comprises a voice command.
45. The method of claim 36 wherein the user input comprises keyboard input.
46. The method of claim 36 wherein the user input comprises microphone input.
47. The method of claim 36 comprising, in response to user input, storing a reference to the interactivity material.
48. The method of claim 36 comprising, in response to user input, retrieving a stored reference to the interactivity material.
49. The method of claim 36 comprising, in response to user input, sending a reference to the interactivity material to another user.
50. The method of claim 36 comprising, in response to user input, sending a reference to the interactivity material to another device.

51. The method of claim 36 comprising, in response to user input, sending a reference to the interactivity material to a web site.
52. The method of claim 36 comprising receiving the interactivity material from another user.
53. The method of claim 36 wherein the interactivity material comprises an image.
54. The method of claim 36 wherein the interactivity material comprises video.

55. The method of claim 36 wherein the interactivity material comprises text.
56. The method of claim 36 wherein the interactivity material comprises an embedded link.
57. The method of claim 36 wherein the interactivity material is displayed in a first portion of a display of the handheld wireless device.
58. The method of claim 57 wherein the graphical element is displayed in a second portion of the display of the handheld wireless device.

59. The method of claim 36 wherein the graphical element is an icon.
AU2011218613A 2000-09-08 2011-08-26 Video interaction Abandoned AU2011218613A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU2011218613A AU2011218613A1 (en) 2000-09-08 2011-08-26 Video interaction

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US60/231,285 2000-09-08
AU2011218613A AU2011218613A1 (en) 2000-09-08 2011-08-26 Video interaction

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
AU2010201682A Division AU2010201682B2 (en) 2000-09-08 2010-04-28 Video interaction

Publications (1)

Publication Number Publication Date
AU2011218613A1 true AU2011218613A1 (en) 2011-09-15

Family

ID=45465299

Family Applications (1)

Application Number Title Priority Date Filing Date
AU2011218613A Abandoned AU2011218613A1 (en) 2000-09-08 2011-08-26 Video interaction

Country Status (1)

Country Link
AU (1) AU2011218613A1 (en)

Similar Documents

Publication Publication Date Title
AU2010201682B2 (en) Video interaction
AU2001295029A1 (en) Video interaction
JP5770874B2 (en) Access to Internet data through a television system
JP3167109B2 (en) Method and apparatus for automatically displaying an Internet homepage on a television screen in cooperation with a television program
US20090228921A1 (en) Content Matching Information Presentation Device and Presentation Method Thereof
KR20090085791A (en) Apparatus for serving multimedia contents and method thereof, and multimedia contents service system having the same
JP2006136015A (en) Query-based electronic program guide
JPH11196345A (en) Display system
JPH11243512A (en) Master-slave joint type display system
CA2763736C (en) Systems and methods for displaying program data relating to a show
AU2011218613A1 (en) Video interaction
WO2020085943A1 (en) Method for interactively displaying contextual information when rendering a video stream
JP4371667B2 (en) Interface device used with multimedia content playback device to search multimedia content being played back
AU3785900A (en) Systems and methods for providing television schedule information

Legal Events

Date Code Title Description
MK4 Application lapsed section 142(2)(d) - no continuation fee paid for the application