AU2011249132A1 - Mobile device remote retour channel - Google Patents
Info
- Publication number
- AU2011249132A1
- Authority
- AU
- Australia
- Prior art keywords
- instructions
- control device
- image
- central server
- input
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Description
MOBILE DEVICE REMOTE RETOUR CHANNEL
The present invention relates to a method for manipulating the display of images of an image display and/or user interface originating from an image source of a remote server and/or a storage medium on a local rendering device by means of a control device, such as a handheld device, of a user which is connected to a network and which makes contact with the remote server or the local rendering device via a network connection. The present invention also relates to a control device, such as a device connected to a network, such as a handheld device, suitable for application in such a method. The present invention also relates to a central server, a local rendering device and a system. The present invention further relates to computer software for executing such a method.
Known from the international patent application with publication number WO 2008/044916 of the same applicant as this document is a system for providing image information to local users by means of a plurality of individual video streams on the basis of for instance a video codec. For this purpose the images are generated on the basis of for instance a plurality of individual applications which are executed on a central server, on the basis of which individual video streams are generated in the central server. This patent application also includes a number of further optimizations of this general principle. The content of this patent application is hereby deemed included in this text by way of reference, for the purpose of providing a combined disclosure of all individual aspects of this earlier application in combination with individual aspects of this present application text.
In the system of the above stated application '916 use is made of a remote control as known in a standard set-top box for directly providing the set-top box with instructions which are provided to the central server via the network connection of the set-top box. Such a remote control has a large number of limitations in respect of the operation of the user interface.
In order to provide improvements in the operation of the user interface, the present invention provides a method for manipulating the display of images of an image display and/or user interface originating from an image source of a remote server and/or a storage medium on a local rendering device by means of a control device, such as a
handheld device, of a user which is connected to a network and which makes contact with the remote server or the local rendering device via a network connection, wherein the method comprises steps for:
- the remote server or the local rendering device receiving manipulation instructions from the control device provided with manipulation software suitable for executing of the method by the control device;
- processing the manipulation instructions on the central server and/or the local rendering device, and
- sending image information from the central server and/or the local rendering device for the purpose of displaying the images and/or the user interface for final display on a display device such as a TV or monitor.
An advantage of a method according to the present invention is that instructions can be received from the control device via a network connection. It hereby becomes possible to use a relatively advanced device as control device, such as a general purpose computer device. Such a general purpose computer device has a relative wealth of input options for the user, such as a touchscreen, a motion detector and so on. With the present invention it becomes possible to provide such a relative wealth of input options to a user of a system according to the stated international patent application. It further becomes possible to provide such a relative wealth of input options to the user of a local rendering device such as a video recorder, computer, media player and so on. Such a rendering device must for this purpose be provided with a network connection for receiving the instructions. Alternatively, it is possible to provide a direct mutual connection in similar manner to a known remote control by means of for instance an infrared connection or a cable.
It is further possible by means of the richer input options to make use of a large number of interactive applications such as games, chat and so on.
According to a first preferred embodiment, a method according to the present invention comprises steps for generating video codec operations, such as MPEG operations, on the basis of the input manipulation instructions, the MPEG operations being used for the image display. In combination with video processing operations as described in said publication '916 it is possible to apply the instructions for the purpose of executing the video codec operations on the basis thereof. Operations hereby become possible on the basis of the relatively rich user interface of the control device. Examples hereof are for instance zoom operations which can be performed on the basis of multi-touch input or input of gestures.
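The translation from rich touch input to codec-level operations can be sketched as follows. The gesture fields and operation names here are illustrative assumptions, not terms taken from '916:

```python
def gesture_to_codec_ops(gesture):
    """Translate a touch gesture into abstract video-codec operations.

    Hypothetical sketch: a pinch becomes a zoom operation with the measured
    scale factor, a swipe becomes a translation vector, and a tap becomes a
    selection at the touched coordinates.
    """
    kind = gesture["kind"]
    if kind == "pinch":
        return [("zoom", {"scale": gesture["scale"]})]
    if kind == "swipe":
        return [("translate", {"vector": (gesture["dx"], gesture["dy"])})]
    if kind == "tap":
        return [("select", {"x": gesture["x"], "y": gesture["y"]})]
    # Unrecognized gestures produce no codec operation.
    return []
```

A server-side video pipeline could consume such operation tuples to drive the movement and translation vectors described below.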
In a further preferred embodiment the method comprises steps for changing the display of the user interface on the basis of the manipulation instructions. It is hereby possible for instance to navigate through a menu structure. It is for instance possible to switch between two menu screens by means of performing a swipe movement over a touchscreen. It is however also possible to select and activate a submenu item and thereby switch to a further menu page.
The method more preferably comprises image processing operations which are operable within a video codec, such as the application of movement vectors and/or translation vectors for realizing displacement effects and/or zoom effects. It hereby becomes possible in advantageous manner to for instance select one of a plurality of small displays and subsequently enlarge this to full-screen. Compare this to the use of a photo page on the internet. If a user selects one of these images by means of a mouse, this is shown enlarged on the screen. It is possible by means of the present invention to show in for instance a user interface nine photos or moving video images, one of which the user selects which is then shown enlarged. It is further possible here by means of said zoom operations to gradually enlarge the image in a smooth movement starting from the already available, relatively small image. Then, when the high-resolution larger image is available from the background data storage, the image is shown definitively in high quality. Such a situation can be timed such that it appears to the user as if the image is enlarged immediately following clicking, whereby there does not appear to be the latency of retrieval of the background image with a higher resolution.
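The two-stage zoom described above can be sketched as follows: the already-available small image is scaled up smoothly at once, while the high-resolution image is fetched in the background and swapped in as soon as it arrives, hiding the retrieval latency. The class and callback names are hypothetical:

```python
import threading
import time

class ZoomRenderer:
    """Sketch of the two-stage zoom: smooth preview scaling of the small
    image, then a definitive high-quality render once the background fetch
    completes. fetch_high_res and display are caller-supplied callbacks."""

    def __init__(self, fetch_high_res, display):
        self.fetch_high_res = fetch_high_res  # image_id -> high-res image
        self.display = display                # (image, scale, quality) -> None

    def zoom(self, image_id, thumbnail, target_scale=4.0, duration=0.3, steps=6):
        result = {}
        # Start retrieving the high-resolution image in the background.
        worker = threading.Thread(
            target=lambda: result.setdefault("hires", self.fetch_high_res(image_id)))
        worker.start()
        # Meanwhile, zoom the thumbnail smoothly toward the target scale.
        for step in range(1, steps + 1):
            scale = 1.0 + step * (target_scale - 1.0) / steps
            self.display(thumbnail, scale, "preview")
            time.sleep(duration / steps)
        # Show the definitive high-quality image once it is available.
        worker.join()
        self.display(result["hires"], target_scale, "final")
```

If the fetch finishes before the zoom animation does, the user perceives no latency at all, which is the timing effect the text describes.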
In such image processing operations on the basis of the manipulation instructions from the control device use is more preferably made of inter-encoding and intra-encoding. Image processing operations known from '916 can hereby be applied.
Manipulation instructions are preferably also applied which are entered by means of a touchscreen, such as sliding movements for scroll instructions or slide instructions, zoom in and out movements for zoom instructions, the instructions preferably being generated by means of multi-touch instructions. The user is hereby provided with a relatively great wealth of input options.
The instructions are more preferably generated by means of moving the control device, wherein these movements can be detected by means of a movement detector or a gravity detector arranged in the control device. It is for instance possible here to move a level to the right in the menu structure by means of rotating the control device to the right or, alternatively, to move a level to the left in the menu structure by means of rotating the control device to the left. It also becomes possible for instance to implement an action effect as chosen by the user by means of shaking the control device.
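Such motion input could be mapped to navigation instructions along the following lines. The 30-degree threshold and the instruction names are assumptions for illustration only:

```python
def motion_to_instruction(rotation_deg=0.0, shake=False):
    """Map control-device motion to a navigation instruction.

    Hypothetical mapping: rotating the device past +/-30 degrees moves one
    level right/left in the menu structure; shaking triggers the chosen
    action effect. Returns None when no gesture is recognized.
    """
    if shake:
        return "action"
    if rotation_deg > 30.0:
        return "menu_level_right"
    if rotation_deg < -30.0:
        return "menu_level_left"
    return None
```

On a real device the rotation angle and shake flag would come from the movement or gravity detector's readings.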
The instructions more preferably comprise text input, speech input and/or image input. It hereby becomes possible in simple manner to input larger quantities of textual information. In a known remote control text is usually entered by means of successively selecting letters by means of a four-direction cursor key. According to the prior art this is very time-consuming and is obviated in effective manner by means of an aspect of the present preferred embodiment.
In order to provide greater security and identification of the user in relation to the central server or the local rendering device, a further embodiment provides steps for mutually pairing the central server and/or the local rendering device.
This is more preferably performed by the central server and/or local rendering device sending a code to the screen for input thereof in the control device and receiving from the control device information on the basis of which the input of the code can be verified.
Further methods of inputting data for pairing purposes can be executed by means of text input, gestures, motions, speech and/or image input.
A further aspect according to the present invention relates to a control device, such as a device connected to a network, such as a handheld device suitable for application in a method according to one or more of the foregoing claims, comprising:
- a central processing unit, at least one memory and preferably a touchscreen and/or a motion sensor, which are mutually connected to form a computer device for executing manipulation software for the purpose of generating manipulation instructions,
- the manipulation software for generation by the control device of manipulation instructions for the purpose of manipulating the image display and/or user interface,
- transmitting means for transferring the manipulation instructions from the control device to a central server and/or a local rendering device via the network connection. Advantages can be gained by means of such a control device together with a central server and/or a local rendering device as referred to in the foregoing and as will be described in great detail hereinbelow.
A further aspect according to the present invention relates to a central server for streaming a number of parallel user sessions from at least one server to at least one
client device of a plurality of client devices for displaying the sessions on a screen connectable to a client device, wherein the sessions comprise video data and optional additional data such as audio data, wherein the central server comprises receiving means for receiving the instructions from a network connection, and wherein the central server comprises processing means for processing instructions comprising text input, gestures, motions, speech and/or image input.
A further aspect according to the present invention relates to a local rendering device, such as a video recorder, computer, media player, for displaying a user session and/or video information on a screen, wherein the media player comprises receiving means for receiving the instructions from a network connection, and wherein the local rendering device comprises processing means for processing instructions comprising text input, gestures, motions, speech and/or image input.
A further aspect according to the present invention relates to a system for streaming a number of parallel user sessions from at least one server to at least one client device of a plurality of client devices for displaying the sessions on a screen connectable to a client device, wherein the sessions comprise video data and optional additional data such as audio data, wherein the central server comprises receiving means for receiving the instructions relating to display on a respective client from a plurality of control devices by means of a network connection, and wherein the central server comprises processing means for processing instructions comprising text input, gestures, motions, speech and/or image input.
A further aspect according to the present invention relates to computer software for executing a method according to one or more of the foregoing claims and/or for use in a central server, local rendering device, control device and/or system according to one or more of the foregoing claims.
Such aspects according to the present invention provide respective advantages as stated in the foregoing and as described in great detail hereinbelow.
Further advantages, features and details of the present invention will be described in greater detail hereinbelow on the basis of one or more preferred embodiments, with reference to the accompanying figures.
Fig. 1 shows a schematic representation of a preferred embodiment according to the present invention.
Fig. 2 shows a representation of the prior art (B) and a representation in accordance with a preferred embodiment according to the present invention (A).
A first preferred embodiment (Fig. 1) according to the present invention relates to a mobile computer 100. This is similar to for instance a mobile phone. Mobile computer 100 comprises a screen 41 which is preferably touch-sensitive. The mobile computer also comprises four control buttons 42 arranged on the bottom side. A touch-sensitive surface 43 for navigation is situated between control buttons 42. Also situated on the bottom side is a microphone 44 for recording sounds such as voice sounds. A loudspeaker 46 for reproducing sounds is situated on the top side. Situated adjacently of loudspeaker 46 is a camera for recording images. A camera (not shown), likewise for recording images, is also situated on the rear side. The images are further transmitted via set-top box 3 or rendering device 3 to television 20.
Disclosed up to this point is a per se known mobile computer, such as a mobile phone or a PDA. According to the
present invention this mobile computer is provided with a software application for detecting input for the purpose of the present invention, and transmitting such input by means of a network connection. The software application is provided for this purpose with connecting means for making a connection to the network access means of the mobile computer. Access is hereby obtained to for instance a wireless network which is connected to the internet 19 or which is a separate wireless network. Alternatively, a fixed network is of course also possible. It is also possible in alternative manner for the wireless network to be a mobile network run by a mobile network operator.
Via the network connection the mobile device has contact with either the server 101 or local rendering device 3. Server 101 can likewise be a schematic representation of the components 5, 4, 102, 103 of Fig. 2. Fig. 2B is the same view as figure 9 of the cited document '916. Fig. 2A shows as modification the lay-out of the return path of the remote control executed by mobile computer 100. The return path runs via internet 19 (as shown in figure 1) directly from the mobile computer to server 102. Parallel use (not shown) can also be made here of the standard remote control of set-top box 3. This can however also be switched off.
The control information which mobile computer 100 transmits to server 102 (which forms part of server 101 of Fig. 1) is enriched according to the present invention with said input options in respect of text input, gestures, motions, speech and/or image input.
A plurality of accelerated operating options hereby becomes possible which would not be possible by means of the standard remote control with buttons. It becomes possible by means of for instance the gestures and the motions to
indicate the speed of the movement. A user can hereby determine in dynamic manner how quickly an operation is performed, or for instance how much information is scrolled during performing of a single movement. It also becomes possible to rotate the image, for instance by describing a circle on the touchscreen or for instance rotating two fingertips on the screen. A further example is that the number of fingertips which operate the screen simultaneously determines which function is activated.
For transmission of the instructions from the mobile device to the server use is made of general internet technology, such as HTTP. The application on the mobile computer converts the touches on the touchscreen to parameters which are important for the user interface displayed on screen 20. In order to perform the sliding movement on the screen use is made of the "swipe=true" parameter, and for the speed of the movement the parameter "velocity=V", wherein V is a value of the speed. Further parameters are provided in similar manner, such as pinching for zooming, a rotation movement for rotation and text for text input. Examples which are used are as follows.
An instruction takes the form of a URL for reaching the server, providing an identification and providing an instruction. An instruction for transmitting an arrow up instruction from a user to the server is as follows:
http://sessionmanager/key?clientid=avplay&key=up
An instruction to perform a similar operation by means of an upward sliding movement on the touchscreen of the mobile computer is as follows:
http://sessionmanager/key?clientid=avplay&key=up&swipe=true&velocity=3.24, which indicates that an upward movement has to be made at a speed of 3.24. This achieves that the desired speed is likewise displayed by the user interface.
Through repeated use the user can learn which speed produces which practical effect. Alternatively, it is possible to allow the user to set individual preferred settings.
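The key-instruction URLs shown above can be assembled by a small helper. The host name sessionmanager and the clientid, key, swipe and velocity parameters follow the examples in the text; the helper itself is only a sketch:

```python
from urllib.parse import urlencode

def key_instruction_url(client_id, key, swipe=False, velocity=None):
    """Build a key-instruction URL in the form used by the examples above.

    swipe and velocity are only appended when a sliding movement was made,
    matching the plain key press and swipe variants in the text.
    """
    params = {"clientid": client_id, "key": key}
    if swipe:
        params["swipe"] = "true"
    if velocity is not None:
        params["velocity"] = f"{velocity:g}"  # e.g. 3.24
    return "http://sessionmanager/key?" + urlencode(params)
```

The mobile application would issue such a URL over HTTP for each detected key press or swipe.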
An instruction to zoom out in order to reduce in size a part of the image is as follows:
http://sessionmanager/event?clientid=avplay&event=onscale&scale=2.11, this achieving that a pinching movement is performed on the image with a factor 2.11, whereby the part of the image that has been selected is reduced in size. It is conversely possible to zoom in using such a function.
If a user wishes to input text in the user interface, the following function can be used:
http://sessionmanager/event?clientid=avplay&event=onstring&text=bladibla, whereby the text value "bladibla" is used in the user interface to give for instance a name to a photo or video fragment. Because text input becomes possible, it is also possible according to the invention to use for instance chat applications with such a system.
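The event-instruction URLs (onscale, onstring) follow one pattern and can be built generically. The host and parameter names again come from the examples above; the helper is a sketch:

```python
from urllib.parse import urlencode

def event_instruction_url(client_id, event, **values):
    """Build an event-instruction URL as in the onscale/onstring examples.

    Extra keyword arguments (e.g. scale="2.11" or text="bladibla") become
    the event-specific query parameters.
    """
    params = {"clientid": client_id, "event": event, **values}
    return "http://sessionmanager/event?" + urlencode(params)
```

One helper thus covers pinch, rotation and text events alike, with the event name selecting how the server interprets the remaining parameters.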
The pairing of a mobile device with the remote server or local rendering device can be executed in that the server displays a code on the screen, this code being entered into the mobile computer by means of for instance text input. Once the code has been recognized as authentic, the user can use the mobile computer to manipulate the session to which he/she has rights. Alternatively, it is possible to pair by for instance showing on the screen a code which is recorded by means of one of the cameras of the mobile computer. The code can then be forwarded by means of a challenge to the remote server and/or local rendering device in order to effect the authentication of the user of the mobile computer. Pairing has the further advantage
of providing additional security, so that instructions can also be applied for the purpose of purchasing for instance video on-demand or other pay services such as games.
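The on-screen-code pairing flow can be sketched as follows. The text does not prescribe a concrete verification scheme, so the six-digit code and the constant-time comparison here are assumptions:

```python
import secrets

class PairingServer:
    """Sketch of pairing via a code shown on the TV screen.

    The server displays a short random code; the user types it into the
    mobile device (or the device reads it with its camera), and the device
    sends it back for verification before the session is paired.
    """

    def __init__(self):
        self.pending = {}   # session_id -> code currently shown on screen
        self.paired = set()

    def start_pairing(self, session_id):
        code = f"{secrets.randbelow(1_000_000):06d}"
        self.pending[session_id] = code
        return code  # rendered on the screen for the user to read

    def verify(self, session_id, entered_code):
        # Constant-time comparison avoids leaking the code via timing.
        ok = secrets.compare_digest(self.pending.get(session_id, ""), entered_code)
        if ok:
            self.paired.add(session_id)
            del self.pending[session_id]  # a code is usable only once
        return ok
```

A successful verification grants the control device access only to the session whose code it entered, which is what ties the pairing to the user's rights.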
It is once again stated here that the present invention has been changed specifically for the purpose of application in a system as according to '916. The skilled person in the field will be able to interpret the present disclosure clearly in the light of the disclosure of this document and in combination with individual aspects of the two documents. Fig. 2B is for instance included as a copy of Fig. 9 of '916. Further parts of the disclosure of this earlier document are likewise deemed to be incorporated in the present document in order to form part of the disclosure of this document. The purpose of this comprehensive and detailed reference is to save textual description. All the figures of '916 are also deemed to be included in this document, individually and in combination with all individual aspects of the disclosure of the present new document.
The present invention has been described in the foregoing on the basis of several preferred embodiments. Different aspects of different embodiments are deemed described in combination with each other, wherein all combinations which can be deemed by a skilled person in the field as falling within the scope of the invention on the basis of reading of this document are included. These preferred embodiments are not limitative for the scope of protection of this document. The rights sought are defined in the appended claims.
Claims (19)
1. Method for manipulating the display of images of an image display and/or user interface originating from an image source of a remote server and/or a storage medium on a local rendering device by means of a control device, such as a handheld device, of a user which is connected to a network and which makes contact with the remote server or the local rendering device via a network connection, wherein the method comprises steps for:
- the remote server or the local rendering device receiving manipulation instructions from the control device provided with manipulation software suitable for executing of the method by the control device;
- processing the manipulation instructions on the central server and/or the local rendering device, and
- sending image information from the central server and/or the local rendering device for the purpose of displaying the images and/or the user interface for final display on a display device such as a TV or monitor.
2. Method as claimed in claim 1, comprising steps for generating video codec operations, such as MPEG operations, on the basis of the input manipulation instructions, the MPEG operations being used for the image display.
3. Method as claimed in claim 1 or 2, comprising steps for changing the display of the user interface on the basis of the manipulation instructions.
4. Method as claimed in one or more of the foregoing claims, comprising image processing operations which are operable within a video codec, such as the application of movement vectors and/or translation vectors for realizing displacement effects and/or zoom effects.
5. Method as claimed in one or more of the foregoing claims, comprising image processing operations on the basis of the manipulation instructions from the control device while applying inter encoding and intra encoding.
6. Method as claimed in one or more of the foregoing claims, wherein the manipulation instructions comprise instructions which are entered by means of a touchscreen, such as sliding movements for scroll instructions or slide instructions, zoom in and out movements for zoom instructions, the instructions preferably being generated by means of multi-touch instructions.
7. Method as claimed in one or more of the foregoing claims, wherein the instructions are generated by means of moving the control device, wherein these movements can be recorded by means of a movement detector or a gravity detector arranged in the control device.
8. Method as claimed in one or more of the foregoing claims, wherein the instructions comprise text input, speech input and/or image input.
9. Method as claimed in one or more of the foregoing claims, comprising steps for mutually pairing the central server and/or the local rendering device.
10. Method as claimed in claim 9, comprising steps for the central server and/or the local rendering device sending a code to the screen for input thereof in the control device and receiving from the control device information on the basis of which the input of the code can be verified.
11. Method as claimed in claim 9 or 10, wherein the input into the control device can be executed by means of text input, gestures, motions, speech and/or image input.
12. Method as claimed in one or more of the foregoing claims, comprising instructions for selecting an item in a user interface and/or for activating the selected item, preferably further comprising instructions for enlargement of the selected item in the image.
13. Method as claimed in claim 12, wherein during enlargement image information with a higher resolution is retrieved from a data store, while a zoom rendering is executed on the basis of the small image information already available in the user interface.
14. Method as claimed in claim 13, wherein during executing of the zoom rendering a relatively high-quality rendering is executed on the basis of retrieved high-resolution information which is displayed as soon as it is available instead of the zoom rendering on the basis of the small image information.
15. Control device, such as a device connected to a network, such as a handheld device suitable for application in a method as claimed in one or more of the foregoing claims, comprising:
- a central processing unit, at least one memory and preferably a touchscreen and/or a motion sensor, which are mutually connected to form a computer device for executing manipulation software for the purpose of generating manipulation instructions,
- the manipulation software for generation by the control device of manipulation instructions for the purpose of manipulating the image display and/or user interface,
- transmitting means for transferring the manipulation instructions from the control device to a central server and/or a local rendering device via the network connection.
16. Central server for streaming a number of parallel user sessions from at least one server to at least one client device of a plurality of client devices for displaying the sessions on a screen connectable to a client device, wherein the sessions comprise video data and optional additional data such as audio data, wherein the central server comprises receiving means for receiving the instructions from a network connection, and wherein the central server comprises processing means for processing instructions comprising text input, gestures, motions, speech and/or image input.
17. Local rendering device, such as a video recorder, computer, media player, for displaying a user session and/or video information on a screen, wherein the media player comprises receiving means for receiving the instructions from a network connection, and wherein the local rendering device comprises processing means for processing instructions comprising text input, gestures, motions, speech and/or image input.
18. System for streaming a number of parallel user sessions from at least one server to at least one client device of a plurality of client devices for displaying the sessions on a screen connectable to a client device, wherein the sessions comprise video data and optional additional data such as audio data, wherein the central server comprises receiving means for receiving the instructions relating to display on a respective client from a plurality of control devices by means of a network connection, and wherein the central server comprises processing means for processing instructions comprising text input, gestures, motions, speech and/or image input.
19. Computer software for executing a method as claimed in one or more of the foregoing claims and/or for use in a central server, local rendering device, control device and/or system according to one or more of the foregoing claims.
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
NL NL2004670 | 2010-05-04 | | |
NL2004670A NL2004670C2 (en) | 2010-05-04 | 2010-05-04 | METHOD FOR MULTIMODAL REMOTE CONTROL. |
PCT/NL2011/050308 WO2011139155A1 (en) | 2010-05-04 | 2011-05-04 | Mobile device remote retour channel |
Publications (2)
Publication Number | Publication Date |
---|---|
AU2011249132A1 (en) | 2012-11-22 |
AU2011249132B2 (en) | 2015-09-24 |
Family
ID=44475067
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
AU2011249132A Ceased AU2011249132B2 (en) | 2010-05-04 | 2011-05-04 | Mobile device remote retour channel |
Country Status (10)
Country | Link |
---|---|
US (1) | US20130198776A1 (en) |
EP (1) | EP2567545A1 (en) |
JP (1) | JP2013526232A (en) |
KR (1) | KR20130061149A (en) |
AU (1) | AU2011249132B2 (en) |
BR (1) | BR112012028137A2 (en) |
CA (1) | CA2797930A1 (en) |
IL (1) | IL222830A0 (en) |
NL (1) | NL2004670C2 (en) |
WO (1) | WO2011139155A1 (en) |
JP4695474B2 (en) * | 2005-09-21 | 2011-06-08 | 株式会社東芝 | Composite video control apparatus, composite video control method, and program |
JP4774921B2 (en) * | 2005-11-01 | 2011-09-21 | Kddi株式会社 | File display method and system |
US7634296B2 (en) * | 2005-12-02 | 2009-12-15 | General Instrument Corporation | Set top box with mobile phone interface |
AU2006101096B4 (en) * | 2005-12-30 | 2010-07-08 | Apple Inc. | Portable electronic device with multi-touch input |
JP5044961B2 (en) * | 2006-03-29 | 2012-10-10 | カシオ計算機株式会社 | Client device and program |
US7864163B2 (en) * | 2006-09-06 | 2011-01-04 | Apple Inc. | Portable electronic device, method, and graphical user interface for displaying structured electronic documents |
US20100146139A1 (en) | 2006-09-29 | 2010-06-10 | Avinity Systems B.V. | Method for streaming parallel user sessions, system and computer software |
JP4791929B2 (en) * | 2006-09-29 | 2011-10-12 | 株式会社日立製作所 | Information distribution system, information distribution method, content distribution management device, content distribution management method, and program |
SE533185C2 (en) * | 2007-02-16 | 2010-07-13 | Scalado Ab | Method for processing a digital image and image representation format |
EP2188700A1 (en) * | 2007-09-18 | 2010-05-26 | Thomson Licensing | User interface for set top box |
JP2009159188A (en) * | 2007-12-26 | 2009-07-16 | Hitachi Ltd | Server for displaying content |
US9900557B2 (en) * | 2007-12-28 | 2018-02-20 | Verizon Patent And Licensing Inc. | Method and apparatus for remote set-top box management |
US20090228922A1 (en) * | 2008-03-10 | 2009-09-10 | United Video Properties, Inc. | Methods and devices for presenting an interactive media guidance application |
US9210355B2 (en) * | 2008-03-12 | 2015-12-08 | Echostar Technologies L.L.C. | Apparatus and methods for controlling an entertainment device using a mobile communication device |
WO2009114247A2 (en) * | 2008-03-12 | 2009-09-17 | Echostar Technologies Llc | Apparatus and methods for controlling an entertainment device using a mobile communication device |
JP5322094B2 (en) * | 2008-03-31 | 2013-10-23 | Kddi株式会社 | VoD system for client-controlled video communication terminals |
JP5090246B2 (en) * | 2008-05-09 | 2012-12-05 | ソニー株式会社 | Information providing apparatus, portable information terminal, content processing device, content processing system, and program |
US9641884B2 (en) * | 2008-11-15 | 2017-05-02 | Adobe Systems Incorporated | Method and device for establishing a content mirroring session |
EP2343881B1 (en) * | 2010-01-07 | 2019-11-20 | LG Electronics Inc. | Method of processing application in digital broadcast receiver connected with interactive network, and digital broadcast receiver |
- 2010-05-04 NL NL2004670A patent/NL2004670C2/en not_active IP Right Cessation
- 2011-05-04 BR BR112012028137A patent/BR112012028137A2/en not_active IP Right Cessation
- 2011-05-04 JP JP2013509016A patent/JP2013526232A/en active Pending
- 2011-05-04 WO PCT/NL2011/050308 patent/WO2011139155A1/en active Application Filing
- 2011-05-04 EP EP11738835A patent/EP2567545A1/en not_active Withdrawn
- 2011-05-04 AU AU2011249132A patent/AU2011249132B2/en not_active Ceased
- 2011-05-04 CA CA2797930A patent/CA2797930A1/en not_active Abandoned
- 2011-05-04 KR KR1020127031648A patent/KR20130061149A/en not_active Application Discontinuation
- 2012-11-01 IL IL222830A patent/IL222830A0/en unknown
- 2012-11-02 US US13/668,004 patent/US20130198776A1/en not_active Abandoned
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9077860B2 (en) | 2005-07-26 | 2015-07-07 | Activevideo Networks, Inc. | System and method for providing video content associated with a source image to a television in a communication network |
US9042454B2 (en) | 2007-01-12 | 2015-05-26 | Activevideo Networks, Inc. | Interactive encoded content system including object models for viewing on a remote device |
US9826197B2 (en) | 2007-01-12 | 2017-11-21 | Activevideo Networks, Inc. | Providing television broadcasts over a managed network and interactive content over an unmanaged network to a client device |
US9021541B2 (en) | 2010-10-14 | 2015-04-28 | Activevideo Networks, Inc. | Streaming digital video between video devices using a cable television system |
US9123084B2 (en) | 2012-04-12 | 2015-09-01 | Activevideo Networks, Inc. | Graphical application integration with MPEG objects |
US11073969B2 (en) | 2013-03-15 | 2021-07-27 | Activevideo Networks, Inc. | Multiple-mode system and method for providing user selectable video content |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
AU2011249132B2 (en) | | Mobile device remote retour channel |
AU2011249132A1 (en) | | Mobile device remote retour channel |
US11126343B2 (en) | Information processing apparatus, information processing method, and program | |
US9736540B2 (en) | System and method for multi-device video image display and modification | |
JP6913634B2 (en) | Interactive computer systems and interactive methods | |
KR101763887B1 (en) | Contents synchronization apparatus and method for providing synchronized interaction | |
US9723123B2 (en) | Multi-screen control method and device supporting multiple window applications | |
US20130326583A1 (en) | Mobile computing device | |
KR101843592B1 (en) | Primary screen view control through kinetic ui framework | |
US7984377B2 (en) | Cascaded display of video media | |
US20140282061A1 (en) | Methods and systems for customizing user input interfaces | |
US20110271227A1 (en) | Zoom display navigation | |
KR20120014868A (en) | Information processing device, information processing method, computer program, and content display system | |
US20150281744A1 (en) | Viewing system and method | |
JP2009093356A (en) | Information processor and scroll method | |
Sánchez et al. | Controlling multimedia players using nfc enabled mobile phones | |
CN103782603B (en) | The system and method that user interface shows | |
US11843816B2 (en) | Apparatuses, systems, and methods for adding functionalities to a circular button on a remote control device | |
JP7195015B2 (en) | instruction system, program | |
CN116266868A (en) | Display equipment and viewing angle switching method | |
KR20230075365A (en) | A System and Method for Providing Multiple 3D Contents using a Web-browser | |
KR20130123679A (en) | Video confenrece apparatus, and method for operating the same | |
Barkhuus et al. | New interaction modes for rich panoramic live video experiences | |
JP2013109459A (en) | Display device, display method and program |