WO2012142043A1 - Gesture visualization and sharing between electronic devices and remote displays - Google Patents


Info

Publication number
WO2012142043A1
Authority
WO
WIPO (PCT)
Prior art keywords
graphical output
remote display
touch inputs
electronic device
computer
Application number
PCT/US2012/032929
Other languages
French (fr)
Inventor
Nicholas V. King
Original Assignee
Apple Inc.
Application filed by Apple Inc. filed Critical Apple Inc.
Priority to AU2012243007A priority Critical patent/AU2012243007B2/en
Priority to EP12714517.5A priority patent/EP2678771B1/en
Priority to CN201280016099.3A priority patent/CN103460177B/en
Priority to KR1020137025984A priority patent/KR101629072B1/en
Priority to JP2014505219A priority patent/JP5902288B2/en
Publication of WO2012142043A1 publication Critical patent/WO2012142043A1/en


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14Digital output to display device ; Cooperation and interconnection of the display device with other functional units
    • G06F3/1454Digital output to display device ; Cooperation and interconnection of the display device with other functional units involving copying of the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00Network arrangements or protocols for supporting network services or applications
    • H04L67/01Protocols
    • H04L67/131Protocols for games, networked simulations or virtual reality
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2340/00Aspects of display data processing
    • G09G2340/04Changes in size, position or resolution of an image
    • G09G2340/0407Resolution change, inclusive of the use of different resolutions for different screen areas
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2340/00Aspects of display data processing
    • G09G2340/06Colour space transformation
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2354/00Aspects of interface with display user
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2370/00Aspects of data communication
    • G09G2370/04Exchange of auxiliary data, i.e. other than image data, between monitor and graphics controller

Definitions

  • the present embodiments relate to techniques for driving remote displays from electronic devices. More specifically, the present embodiments relate to techniques for driving a remote display using visualizations of gestures on an electronic device, as well as techniques for sharing gestures between the electronic device and the remote display.
  • Modern portable electronic devices typically include functionality to create, store, open, and/or update various forms of digital media.
  • a mobile phone may include a camera for capturing images, memory in which images may be stored, software for viewing images, and/or software for editing images.
  • the portability and convenience associated with portable electronic devices allows users of the portable electronic devices to incorporate digital media into everyday activities.
  • the camera on a mobile phone may allow a user of the mobile phone to take pictures at various times and in multiple settings, while the display screen on the mobile phone and installed software may allow the user to display the pictures to others.
  • the display screen on a tablet computer may be too small to be used in a presentation to a large group of people.
  • the user of the tablet computer may conduct the presentation by driving a large remote display using a screen sharing application on the tablet computer.
  • the disclosed embodiments provide a system that facilitates interaction between an electronic device and a remote display.
  • the system includes a first application and an encoding apparatus on the electronic device, and a second application and a decoding apparatus on the remote display.
  • the encoding apparatus obtains graphical output for a display of the electronic device and a first set of touch inputs associated with the graphical output from a first touch screen.
  • the encoding apparatus encodes the graphical output.
  • the first application transmits the graphical output and the first set of touch inputs to the remote display.
  • the decoding apparatus decodes the graphical output.
  • the second application uses the graphical output and a visual representation of the first set of touch inputs to drive the remote display.
  • the second application also obtains a second set of touch inputs associated with the graphical output from a second touch screen and transmits the second set of touch inputs to the electronic device.
  • the first application then updates the graphical output based on the second set of touch inputs.
  • the first application also identifies the remote display as a source of the second set of touch inputs.
  • the identified remote display may enable modification of the graphical output by the first application prior to transmitting the graphical output to the remote display and/or transmission of data from the first application to the remote display based on the second set of touch inputs.
  • transmitting the graphical output and the first set of touch inputs to the remote display involves at least one of compositing the visual representation of the first set of touch inputs into the graphical output, and transmitting the first set of touch inputs as auxiliary data associated with the graphical output to the remote display.
  • using the graphical output and the visual representation of the first set of touch inputs to drive the remote display involves drawing the graphical output to a first buffer, drawing the visual representation of the first set of touch inputs to a second buffer, and using the first and second buffers to drive the remote display.
  • the electronic device is at least one of a mobile phone, a tablet computer, and a portable media player.
  • the remote display is at least one of a tablet computer, a mobile phone, a portable media player, a projector, and a monitor.
  • FIG. 1 shows a schematic of a system in accordance with an embodiment.
  • FIG. 2 shows a system for facilitating interaction between an electronic device and a remote display in accordance with an embodiment.
  • FIG. 3 shows an exemplary interaction between an electronic device and a remote display in accordance with an embodiment.
  • FIG. 4 shows an exemplary interaction between an electronic device and a remote display in accordance with an embodiment.
  • FIG. 5 shows a flowchart illustrating the process of interacting with a remote display in accordance with an embodiment.
  • FIG. 6 shows a flowchart illustrating the process of interacting with an electronic device in accordance with an embodiment.
  • FIG. 7 shows a computer system in accordance with an embodiment.
  • the data structures and code described in this detailed description are typically stored on a computer-readable storage medium, which may be any device or medium that can store code and/or data for use by a computer system.
  • the computer-readable storage medium includes, but is not limited to, volatile memory, non-volatile memory, magnetic and optical storage devices such as disk drives, magnetic tape, CDs (compact discs), DVDs (digital versatile discs or digital video discs), or other media capable of storing code and/or data now known or later developed.
  • the methods and processes described in the detailed description section can be embodied as code and/or data, which can be stored in a computer-readable storage medium as described above.
  • when a computer system reads and executes the code and/or data stored on the computer-readable storage medium, the computer system performs the methods and processes embodied as data structures and code and stored within the computer-readable storage medium.
  • methods and processes described herein can be included in hardware modules or apparatus.
  • modules or apparatus may include, but are not limited to, an application-specific integrated circuit (ASIC) chip, a field-programmable gate array (FPGA), a dedicated or shared processor that executes a particular software module or a piece of code at a particular time, and/or other programmable-logic devices now known or later developed.
  • FIG. 1 shows a schematic of a system in accordance with an embodiment.
  • the system includes an electronic device 102 and a remote display 104.
  • Electronic device 102 may correspond to a mobile phone, tablet computer, portable media player, and/or other compact electronic device that includes functionality to store digital media such as documents, images, audio, and/or video.
  • Remote display 104 may also correspond to a compact electronic device such as a tablet computer, mobile phone, and/or portable media player, or remote display 104 may include a projector, monitor, and/or other type of electronic display that is external to and/or larger than a display on electronic device 102.
  • remote display 104 facilitates the sharing of digital media from electronic device 102.
  • electronic device 102 may be used to drive remote display 104 so that graphical output on remote display 104 is substantially the same as graphical output on electronic device 102.
  • a user of electronic device 102 may control the display of a photo slideshow, presentation, and/or document on both remote display 104 and electronic device 102 from an application on electronic device 102.
  • because remote display 104 provides additional space for displaying the graphical output, remote display 104 may allow the photo slideshow, presentation, and/or document to be viewed by more people than if the photo slideshow, presentation, and/or document were displayed only on electronic device 102.
  • a server 106 on electronic device 102 may be used to communicate with a client 108 on remote display 104.
  • Server 106 may transmit graphical output from electronic device 102 to client 108, and client 108 may update remote display 104 with the graphical output.
  • server 106 and client 108 may correspond to a remote desktop server and remote desktop client that communicate over a network connection between electronic device 102 and remote display 104.
  • the remote desktop server may propagate changes to the desktop and/or display of electronic device 102 to the remote desktop client, and the remote desktop client may update remote display 104 accordingly.
  • server 106 and client 108 may allow electronic device 102 to drive remote display 104 without connecting to remote display 104 using a video interface such as Digital Visual Interface (DVI) or High-Definition Multimedia Interface (HDMI).
  • Server 106 and client 108 may additionally be configured to drive remote display 104 using visual representations of a first set of touch inputs from electronic device 102 and/or update the graphical output based on a second set of touch inputs from remote display 104.
  • a first application associated with server 106 may obtain a first set of touch inputs from a first touch screen associated with (e.g., provided by) electronic device 102.
  • Each of the touch inputs may correspond to a tapping gesture, a swiping gesture, a pinching gesture, a rotating gesture, and/or another touch-based gesture on the touch screen.
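The patent does not specify how gestures are recognized, but a minimal single-finger classifier distinguishing the tap and swipe gestures mentioned above might look like the following sketch (the thresholds and function name are illustrative assumptions; pinch and rotate gestures would require tracking two contacts):

```python
import math

def classify_gesture(start, end, duration_s, tap_max_dist=10, tap_max_s=0.3):
    """Rough single-finger classifier: a short, nearly stationary contact
    is treated as a tap; anything else is treated as a swipe.
    Thresholds are hypothetical, in pixels and seconds."""
    dist = math.hypot(end[0] - start[0], end[1] - start[1])
    if dist <= tap_max_dist and duration_s <= tap_max_s:
        return "tap"
    return "swipe"
```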
  • server 106 may transmit the first set of touch inputs, along with the graphical output, to remote display 104.
  • a second application associated with client 108 may then use the graphical output and a visual representation of the first set of touch inputs to drive remote display 104.
  • the second application may update remote display 104 with the graphical output, as well as a set of dots representing locations of the first set of touch inputs within the graphical output.
  • the second application may additionally obtain the second set of touch inputs from a second touch screen associated with (e.g., provided by) remote display 104.
  • the second set of touch inputs may include a number of touch-based gestures.
  • client 108 may transmit the second set of touch inputs to electronic device 102, and the first application may update the graphical output based on the second set of touch inputs.
  • the first application may use the second set of touch inputs to update the graphical output as if the second set of touch inputs were received from the first touch screen on the electronic device.
  • the first application may identify the remote display as a source of the second set of touch inputs.
  • the first application may then use the identified remote display and/or second set of touch inputs to generate a graphical overlay that is displayed over the graphical output on the remote display.
  • the first application may also transmit data to the remote display based on the second set of touch inputs. Consequently, the first and second applications may facilitate both the sharing of digital media from electronic device 102 and interaction between electronic device 102 and remote display 104.
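One way to realize this dual behavior is for the first application to tag each touch input with its source, handling remote touches like local ones while also recording them for remote-only processing. The class and tag names below are hypothetical, not from the patent:

```python
LOCAL, REMOTE = "local", "remote"  # hypothetical source tags

class GestureApp:
    """Minimal sketch of an application (such as application 210) that
    processes remote touch inputs as if they were local, but records
    their source so it can build a remote-only overlay or transmit data
    back to the remote display."""

    def __init__(self):
        self.touches = []           # all touches, used to update output
        self.overlay_requests = []  # remote touches needing extra handling

    def handle_touch(self, x, y, source=LOCAL):
        self.touches.append((x, y))
        if source == REMOTE:
            self.overlay_requests.append((x, y))
```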
  • FIG. 2 shows a system for facilitating interaction between electronic device 102 and remote display 104 in accordance with an embodiment.
  • electronic device 102 may drive remote display 104 so that graphical output 208 on electronic device 102 is substantially the same as graphical output 228 on remote display 104.
  • electronic device 102 may enable the display of a presentation, photo slideshow, and/or document on both remote display 104 and the display of electronic device 102.
  • a first application 210 associated with server 106 may generate graphical output 208 using a graphics-processing mechanism 206 (e.g., graphics-processing unit (GPU), graphics stack, etc.) in electronic device 102.
  • application 210 may provide a user interface 202 (e.g., graphical user interface (GUI)) that obtains a series of touch inputs 204 (e.g., gestures) from a user through a touch screen associated with electronic device 102.
  • Application 210 may then issue draw commands to graphics-processing mechanism 206 based on touch inputs 204 to generate graphical output 208 that is shown within user interface 202 and/or the touch screen.
  • the user may interact with application 210 by providing touch inputs 204 to application 210 through the touch screen and/or user interface 202 and receiving graphical output 208 from application 210 through the touch screen and/or user interface 202.
  • graphical output 208 may be obtained by application 210 and encoded by an encoding apparatus 212 associated with application 210.
  • a conversion mechanism 214 in encoding apparatus 212 may convert graphical output 208 from a first color space to a second color space, and a scaling mechanism 216 may scale graphical output 208.
  • encoding apparatus 212 may include functionality to encode graphical output 208 using an H.264 codec. Conversion mechanism 214 may thus convert graphical output 208 from an RGB color space into a YUV color space.
  • scaling mechanism 216 may scale graphical output 208 up or down to allow graphical output 208 to match the resolution of remote display 104.
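As a sketch of what conversion mechanism 214 and scaling mechanism 216 might compute, the following uses the standard BT.601 full-range RGB-to-YUV formulas and an aspect-preserving fit; the function names and the choice of BT.601 coefficients are assumptions, since the patent only says an H.264 codec is used:

```python
def rgb_to_yuv(r, g, b):
    """Convert one 8-bit RGB pixel to YUV (BT.601, full range), as a
    conversion step before H.264 encoding might do."""
    y = 0.299 * r + 0.587 * g + 0.114 * b
    u = -0.14713 * r - 0.28886 * g + 0.436 * b + 128
    v = 0.615 * r - 0.51499 * g - 0.10001 * b + 128
    return round(y), round(u), round(v)

def scale_to_display(width, height, target_w, target_h):
    """Compute output dimensions that fit the remote display while
    preserving the source aspect ratio."""
    factor = min(target_w / width, target_h / height)
    return round(width * factor), round(height * factor)
```

For example, a 1024x768 tablet frame scaled for a 1920x1080 display would be resized to 1440x1080 and letterboxed horizontally.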
  • server 106 may transmit graphical output 208 to client 108 over a network (e.g., wireless network, local area network (LAN), wide area network (WAN), etc.) connection.
  • a second application 218 associated with client 108 may then use graphical output 208 to update remote display 104.
  • a decoding apparatus 220 associated with application 218 may decode graphical output 208.
  • decoding apparatus 220 may include an H.264 codec that obtains frames of pixel values from the encoded graphical output 208.
  • the pixel values may then be sent to a graphics-processing mechanism 226 (e.g., GPU, graphics stack) in remote display 104 and used by graphics-processing mechanism 226 to generate graphical output 228 for driving remote display 104.
  • the resulting graphical output 228 may be shown within a user interface 222 provided by application 218 and/or a touch screen associated with remote display 104.
  • a visual representation 230 of touch inputs 204 may also be used to drive remote display 104. More specifically, touch inputs 204 may be obtained by application 210 and/or server 106 and transmitted along with graphical output 208 to client 108. Visual representation 230 may be generated using touch inputs 204 and provided to graphics-processing mechanism 226 so that graphics-processing mechanism 226 may drive remote display 104 using visual representation 230.
  • visual representation 230 may include a set of dots representing the locations of touch inputs 204 within graphical output 208 and 228. As a result, visual representation 230 may be generated by drawing a dot at each location associated with touch inputs 204. Visual representations of touch inputs are discussed in further detail below with respect to FIG. 3.
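Drawing a dot at each touch location can be sketched as a simple raster operation over the output frame; the representation of the frame as a 2-D list and the marker value are illustrative assumptions:

```python
def draw_touch_dots(frame, touches, radius=2):
    """Overlay a dot at each touch location, as a visual representation
    of touch inputs might be rendered into the graphical output.
    `frame` is a mutable 2-D list of pixel values; a real implementation
    would draw anti-aliased circles into a GPU framebuffer."""
    h, w = len(frame), len(frame[0])
    for (tx, ty) in touches:
        for y in range(max(0, ty - radius), min(h, ty + radius + 1)):
            for x in range(max(0, tx - radius), min(w, tx + radius + 1)):
                if (x - tx) ** 2 + (y - ty) ** 2 <= radius ** 2:
                    frame[y][x] = 1  # dot marker value
    return frame
```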
  • in one or more embodiments, visual representation 230 is generated on electronic device 102 and composited into graphical output 208, so that server 106 may transmit a single data stream containing graphical output 208 and visual representation 230 to client 108.
  • the data stream may then be decoded by decoding apparatus 220 and used by graphics-processing mechanism 226 to drive remote display 104.
  • application 210 and/or server 106 may transmit touch inputs 204 as auxiliary data associated with graphical output 208 to client 108.
  • graphical output 208 may be transmitted through a main communication channel between server 106 and client 108
  • touch inputs 204 may be transmitted through a sideband channel between server 106 and client 108.
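A main channel plus sideband channel can be multiplexed over one transport by framing each message with a channel identifier and length. The channel numbering, JSON payload format, and function names below are assumptions for illustration, not the patent's wire format:

```python
import json
import struct

MAIN, SIDEBAND = 0, 1  # hypothetical channel identifiers

def frame_message(channel, payload):
    """Prefix a payload with a 1-byte channel id and a 4-byte big-endian
    length so video frames and auxiliary touch data can share one
    transport."""
    return struct.pack(">BI", channel, len(payload)) + payload

def encode_touch_inputs(touches):
    """Serialize touch inputs as auxiliary (sideband) data."""
    payload = json.dumps([{"x": x, "y": y} for x, y in touches]).encode()
    return frame_message(SIDEBAND, payload)

def parse_message(data):
    """Split a framed message back into (channel, payload)."""
    channel, length = struct.unpack(">BI", data[:5])
    return channel, data[5:5 + length]
```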
  • application 218 and/or graphics-processing mechanism 226 may drive remote display 104 by drawing graphical output 228 to a first buffer, drawing visual representation 230 to a second buffer, and using the first and second buffers to drive the remote display.
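The two-buffer approach amounts to compositing the overlay buffer over the output buffer at display time. A minimal sketch, assuming buffers are 2-D lists of pixel values and nonzero overlay pixels take precedence:

```python
def drive_display(output_buffer, overlay_buffer):
    """Composite the graphical-output buffer with the touch-overlay
    buffer: wherever the overlay has a nonzero pixel it wins, otherwise
    the graphical output shows through."""
    return [
        [o if o else g for g, o in zip(out_row, ovl_row)]
        for out_row, ovl_row in zip(output_buffer, overlay_buffer)
    ]
```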
  • touch inputs 224 from remote display 104 may be used in the update of graphical output 208 and 228.
  • a user of remote display 104 may provide touch inputs 224 as touch-based gestures through a touch screen associated with remote display 104.
  • User interface 222 and/or application 218 may obtain touch inputs 224 from the touch screen, and client 108 may transmit touch inputs 224 to server 106 and/or application 210.
  • application 210 may update graphical output 208 based on touch inputs 224.
  • application 210 may generate graphical output 208 from touch inputs 224 as if touch inputs 224 were received from user interface 202 and/or the touch screen associated with electronic device 102.
  • transmission of touch inputs 224 to electronic device 102 may allow a user to interact with application 210 from remote display 104 in the same way as the user would from electronic device 102.
  • server 106 and/or application 210 may include functionality to identify remote display 104 as the source of touch inputs 224. Such identification of remote display 104 may allow application 210 to modify graphical output 208 based on touch inputs 224 prior to transmitting graphical output 208 to remote display 104 and/or transmit data to remote display 104 based on touch inputs 224.
  • application 210 may use touch inputs 224 to generate a graphical overlay for graphical output 208 that is transmitted with graphical output 208 to client 108 but not provided to graphics-processing mechanism 206.
  • application 218 may provide the graphical overlay and graphical output 208 to graphics-processing mechanism 226, which drives remote display 104 using both the graphical overlay and graphical output 208.
  • the graphical overlay may be shown within user interface 222 and/or remote display 104 but not within user interface 202 and/or the display (e.g., touch screen) on electronic device 102.
  • Touch inputs 224 associated with the graphical overlay within user interface 222 may then be transmitted to server 106 and used by application 210 to update the graphical overlay and/or transmit data associated with the graphical overlay to remote display 104. Modification of graphical output 208 and/or transmission of data to remote display 104 based on touch inputs 224 is discussed in further detail below with respect to FIG. 4.
  • applications 210 and 218 may allow electronic device 102 and/or remote display 104 to visualize and/or share graphical output 208 and 228 and touch inputs 204 and 224.
  • applications 210 and 218 may facilitate the sharing of digital media from electronic device 102, as well as interaction between electronic device 102 and remote display 104.
  • the transmission of graphical output 208 and touch inputs 204 from application 210 to application 218 may allow a user of remote display 104 to view user interface 202 on remote display 104 and/or observe the use of electronic device 102 by another user.
  • the transmission of touch inputs 224 from application 218 to application 210 may allow the user of remote display 104 to interact with application 210 and/or obtain data (e.g., digital media) from electronic device 102.
  • encoding apparatus 212 and server 106 may execute within application 210 and/or independently of application 210.
  • decoding apparatus 220 and client 108 may execute within application 218 and/or independently of application 218.
  • applications 210 and 218 may correspond to identical applications that each implement encoding apparatus 212, server 106, client 108, and decoding apparatus 220 to enable viewing of and/or interaction with either user interface 202 or user interface 222 from both electronic device 102 and remote display 104.
  • FIG. 3 shows an exemplary interaction between an electronic device 302 and a remote display 304 in accordance with an embodiment.
  • Electronic device 302 may be used to drive remote display 304 so that graphical output on remote display 304 is substantially the same as graphical output on electronic device 302.
  • graphical output for a display of electronic device 302 may be transmitted to remote display 304 and used to drive remote display 304.
  • a set of touch inputs 306-308 may be obtained from electronic device 302.
  • Touch inputs 306-308 may be associated with tapping gestures, swiping gestures, pinching gestures, rotating gestures, and/or other touch-based gestures on a touch screen associated with electronic device 302.
  • Touch inputs 306-308 may also be transmitted to remote display 304 to enable the driving of remote display 304 using visual representations 310-312 of touch inputs 306-308.
  • visual representations 310-312 may correspond to dots that represent the locations of touch inputs 306-308 within the graphical output.
  • Visual representations 310-312 may also persist for a period after touch inputs 306-308 cease and/or change location to convey motion information (e.g., lines, arcs, etc.) associated with touch inputs 306-308 to a user of remote display 304. In other words, visual representations 310-312 may allow the user of remote display 304 to observe the use of electronic device 302 by a different user.
  • touch inputs 306-308 may be transmitted to remote display 304 by compositing visual representations 310-312 into the graphical output of electronic device 302 prior to transmitting the graphical output to remote display 304.
  • the graphical output and touch inputs 306-308 may thus be transmitted as a single data stream to remote display 304 and drawn to a single buffer that is used to drive remote display 304.
  • touch inputs 306-308 may be transmitted as auxiliary data associated with the graphical output to remote display 304.
  • the graphical output may be drawn to a first buffer
  • visual representations 310-312 may be drawn to a second buffer based on touch inputs 306-308
  • the first and second buffers may be used to drive remote display 304.
  • FIG. 4 shows an exemplary interaction between an electronic device 402 and a remote display 404 in accordance with an embodiment.
  • electronic device 402 may be used to drive remote display 404 so that graphical output is substantially the same on both electronic device 402 and remote display 404.
  • a graphical overlay 406 is shown on remote display 404 but not on electronic device 402. Overlay 406 may result from the transmission of touch inputs from remote display 404 to electronic device 402, as well as the subsequent processing of the touch inputs by an application on electronic device 402. For example, the application may generate overlay 406 by updating the graphical output based on the touch inputs prior to transmitting the graphical output to remote display 404 but not prior to using the graphical output to drive a display (e.g., touch screen) on electronic device 402.
  • Overlay 406 may also facilitate the transmission of data from the application to remote display 404 based on the touch inputs.
  • overlay 406 may correspond to a dialog box that gives a user of remote display 404 an option to save a file associated with the graphical output and another option to not save the file. Touch inputs provided by the user within the dialog box may then be sent to electronic device 402 for processing by the application. If the application determines that the touch inputs represent the selection of the option to save the file, the application may remove overlay 406 from remote display 404 and transmit data for the file to remote display 404.
  • the application may generate a dialog box on electronic device 402 to query the user of electronic device 402 for permission to transmit the data to remote display 404, or the application may transmit the data without obtaining permission from the user of electronic device 402. Conversely, if the application determines that the touch inputs represent the selection of the option to not save the file, the application may remove overlay 406 from remote display 404 without transmitting file data to remote display 404.
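The save-dialog flow above can be sketched as hit-testing a remote touch against the dialog's buttons and deciding what to remove and transmit. The button rectangles, dictionary keys, and function name are hypothetical:

```python
def process_overlay_touch(touch, save_button, discard_button, file_data):
    """Map a remote touch inside a hypothetical save dialog to an
    action. Buttons are (x, y, w, h) rectangles; returns whether to
    remove the overlay and what data, if any, to transmit."""
    def hit(btn):
        x, y, w, h = btn
        return x <= touch[0] < x + w and y <= touch[1] < y + h

    if hit(save_button):
        return {"remove_overlay": True, "transmit": file_data}
    if hit(discard_button):
        return {"remove_overlay": True, "transmit": None}
    return {"remove_overlay": False, "transmit": None}  # missed both
```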
  • FIG. 5 shows a flowchart illustrating the process of interacting with a remote display in accordance with an embodiment.
  • one or more of the steps may be omitted, repeated, and/or performed in a different order. Accordingly, the specific arrangement of steps shown in FIG. 5 should not be construed as limiting the scope of the embodiments.
  • graphical output for a display of an electronic device is obtained (operation 502), and a first set of touch inputs associated with the graphical output is obtained from a touch screen associated with the electronic device (operation 504).
  • the graphical output is encoded (operation 506).
  • the graphical output may be encoded using an H.264 codec that converts the graphical output from a first color space to a second color space and/or scales the graphical output.
  • the graphical output and first set of touch inputs are then transmitted to the remote display (operation 508), where the graphical output and a visual representation of the first set of touch inputs are used to drive the remote display.
  • Driving of the remote display using the graphical output and visual representation of the first set of touch inputs is discussed in further detail below with respect to FIG. 6.
  • a second set of touch inputs may also be received (operation 510) from the remote display. If the second set of touch inputs is not received, no processing related to the second set of touch inputs is performed. If the second set of touch inputs is received, the graphical output is updated based on the second set of touch inputs (operation 512). For example, the second set of touch inputs may be provided to an application configured to generate the graphical output. The application may process the second set of touch inputs as if the second set of touch inputs were obtained through the touch screen of the electronic device.
  • the application may identify the remote display as a source of the second set of touch inputs and modify the graphical output prior to transmitting the graphical output to the remote display.
  • the application may also use the identification of the remote display to transmit data to the remote display based on the second set of touch inputs.
  • Interaction with the remote display may continue (operation 514).
  • the electronic device may interact with the remote display as long as a network connection exists between the electronic device and remote display and/or digital media is being shared between the electronic device and remote display.
  • the graphical output and the first set of touch inputs are obtained (operations 502-504), the graphical output is encoded (operation 506), and the graphical output and first set of touch inputs are transmitted to the remote display (operation 508).
  • the second set of touch inputs may also be received (operation 510) from the remote display and used to update the graphical output (operation 512).
  • the graphical output and visual representation of the first set of touch inputs may continue to be obtained, modified, and/or transmitted until interaction between the electronic device and the remote display ceases.
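The electronic-device side of this loop (operations 502-512) can be sketched with the transport and codec abstracted as callables; all hook names are assumptions, and at least one frame is assumed:

```python
def run_server_loop(frames, receive_remote, encode, transmit, update):
    """Sketch of the FIG. 5 loop: for each frame, obtain the graphical
    output and local touch inputs, encode and transmit them, then fold
    any touch inputs received from the remote display back into the
    output."""
    for output, local_touches in frames:          # operations 502-504
        transmit(encode(output), local_touches)   # operations 506-508
        for touch in receive_remote():            # operation 510
            output = update(output, touch)        # operation 512
    return output
```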
  • FIG. 6 shows a flowchart illustrating the process of interacting with an electronic device in accordance with an embodiment.
  • one or more of the steps may be omitted, repeated, and/or performed in a different order. Accordingly, the specific arrangement of steps shown in FIG. 6 should not be construed as limiting the scope of the embodiments.
  • graphical output and a first set of touch inputs associated with the graphical output are obtained from the electronic device (operation 602).
  • the first set of touch inputs may be composited into the graphical output and/or transmitted as auxiliary data associated with the graphical output.
  • the graphical output is decoded (operation 604).
  • an H.264 codec may be used to obtain frames of pixel values from the graphical output.
  • the graphical output and a visual representation of the first set of touch inputs are then used to drive a remote display (operation 606). If the first set of touch inputs is composited into the graphical output, the composited graphical output may be drawn to a single buffer that is used to drive the remote display. If the first set of touch inputs is transmitted as auxiliary data, the graphical output may be drawn to a first buffer, the visual representation of the first set of touch inputs may be drawn to a second buffer, and the first and second buffers may be used to drive the remote display.
  • A second set of touch inputs may also be provided (operation 608) by a user of the remote display.
  • the second set of touch inputs is not provided, no processing related to the second set of touch inputs is performed. If the second set of touch inputs is provided, the second set of touch inputs is obtained from a touch screen associated with the remote display (operation 610) and transmitted to the electronic device (operation 612). The second set of touch inputs may then be used by the electronic device to update the graphical output and/or transmit data to the remote display.
  • Interaction with the electronic device may continue (operation 614). If interaction with the electronic device is to continue, the graphical output and first set of touch inputs are received from the electronic device (operation 602), the graphical output is decoded (operation 604), and the graphical output and a visual representation of the first set of touch inputs are used to drive the remote display (operation 606). Concurrently, a second set of touch inputs may be provided (operation 608) by a user of the remote display, obtained from the touch screen (operation 610), and transmitted to the electronic device (operation 612). Use of the graphical output and visual representation to drive the remote display and transmission of the second set of touch inputs to the electronic device may continue until interaction between the remote display and the electronic device ceases.
  • FIG. 7 shows a computer system 700 in accordance with an embodiment.
  • Computer system 700 may correspond to an apparatus that includes a processor 702, memory 704, storage 706, and/or other components found in electronic computing devices.
  • Processor 702 may support parallel processing and/or multi-threaded operation with other processors in computer system 700.
  • Computer system 700 may also include input/output (I/O) devices such as a keyboard 708, a mouse 710, and a display 712.
  • Computer system 700 may include functionality to execute various components of the present embodiments.
  • computer system 700 may include an operating system (not shown) that coordinates the use of hardware and software resources on computer system 700, as well as one or more applications that perform specialized tasks for the user.
  • applications may obtain the use of hardware resources on computer system 700 from the operating system, as well as interact with the user through a hardware and/or software framework provided by the operating system.
  • computer system 700 provides a system for facilitating interaction between an electronic device and a remote display.
  • the system may include a first application and an encoding apparatus on the electronic device, and a second application and a decoding apparatus on the remote display.
  • the encoding apparatus may obtain graphical output for a display of the electronic device and a first set of touch inputs associated with the graphical output from a first touch screen on the electronic device.
  • the encoding apparatus may encode the graphical output, and the first application may transmit the graphical output and the first set of touch inputs to the remote display.
  • the decoding apparatus may decode the graphical output.
  • the second application may then use the graphical output and a visual representation of the first set of touch inputs to drive the remote display.
  • the second application may obtain a second set of touch inputs associated with the graphical output from a second touch screen on the remote display and transmit the second set of touch inputs to the electronic device.
  • the first application may then update the graphical output based on the second set of touch inputs.
  • the first application may identify the remote display as a source of the second set of touch inputs.
  • the first application may then use the second set of touch inputs to generate an overlay that is displayed over the graphical output on the remote display.
  • the first application may also transmit data to the remote display based on the second set of touch inputs.
  • one or more components of computer system 700 may be remotely located and connected to the other components over a network.
  • Portions of the present embodiments (e.g., first application, second application, encoding apparatus, decoding apparatus, etc.) may also be located on different nodes of a distributed system that implements the embodiments.
  • the present embodiments may be implemented using a cloud computing system that communicates with the electronic device over a network connection and displays graphical output and a visual representation of the first set of touch inputs from the electronic device on a set of remote displays.


Abstract

The disclosed embodiments provide a system that facilitates interaction between an electronic device and a remote display. The system includes a first application and an encoding apparatus on the electronic device, and a second application and a decoding apparatus on the remote display. The encoding apparatus obtains graphical output for a display of the electronic device and a first set of touch inputs associated with the graphical output from a first touch screen. Next, the encoding apparatus encodes the graphical output, and the first application transmits the graphical output and the first set of touch inputs to the remote display. Upon receiving the graphical output and the first set of touch inputs at the remote display, the decoding apparatus decodes the graphical output. The second application then uses the graphical output and a visual representation of the first set of touch inputs to drive the remote display.

Description

GESTURE VISUALIZATION AND SHARING BETWEEN ELECTRONIC DEVICES AND REMOTE DISPLAYS
Inventor: Nicholas V. King
BACKGROUND
Field
[0001] The present embodiments relate to techniques for driving remote displays from electronic devices. More specifically, the present embodiments relate to techniques for driving a remote display using visualizations of gestures on an electronic device, as well as techniques for sharing gestures between the electronic device and the remote display.
Related Art
[0001] Modern portable electronic devices typically include functionality to create, store, open, and/or update various forms of digital media. For example, a mobile phone may include a camera for capturing images, memory in which images may be stored, software for viewing images, and/or software for editing images. Moreover, the portability and convenience associated with portable electronic devices allows users of the portable electronic devices to incorporate digital media into everyday activities. For example, the camera on a mobile phone may allow a user of the mobile phone to take pictures at various times and in multiple settings, while the display screen on the mobile phone and installed software may allow the user to display the pictures to others.
[0002] However, size and resource limitations may prevent users of portable electronic devices from effectively sharing media on the portable electronic devices. For example, the display screen on a tablet computer may be too small to be used in a presentation to a large group of people. Instead, the user of the tablet computer may conduct the presentation by driving a large remote display using a screen sharing application on the tablet computer.
[0003] Hence, what is needed is a mechanism for facilitating the sharing of media from a portable electronic device.
SUMMARY
[0004] The disclosed embodiments provide a system that facilitates interaction between an electronic device and a remote display. The system includes a first application and an encoding apparatus on the electronic device, and a second application and a decoding apparatus on the remote display. The encoding apparatus obtains graphical output for a display of the electronic device and a first set of touch inputs associated with the graphical output from a first touch screen. Next, the encoding apparatus encodes the graphical output, and the first application transmits the graphical output and the first set of touch inputs to the remote display. Upon receiving the graphical output and the first set of touch inputs at the remote display, the decoding apparatus decodes the graphical output. The second application then uses the graphical output and a visual representation of the first set of touch inputs to drive the remote display.
[0005] In some embodiments, the second application also obtains a second set of touch inputs associated with the graphical output from a second touch screen and transmits the second set of touch inputs to the electronic device. The first application then updates the graphical output based on the second set of touch inputs.
[0006] In some embodiments, the first application also identifies the remote display as a source of the second set of touch inputs. The identified remote display may enable modification of the graphical output by the first application prior to transmitting the graphical output to the remote display and/or transmission of data from the first application to the remote display based on the second set of touch inputs.
[0007] In some embodiments, transmitting the graphical output and the first set of touch inputs to the remote display involves at least one of compositing the visual representation of the first set of touch inputs into the graphical output, and transmitting the first set of touch inputs as auxiliary data associated with the graphical output to the remote display.
[0008] In some embodiments, using the graphical output and the visual representation of the first set of touch inputs to drive the remote display involves:
(i) drawing the graphical output to a first buffer;
(ii) drawing the visual representation of the first set of touch inputs to a second buffer; and
(iii) using the first and second buffers to drive the remote display.
[0009] In some embodiments, the electronic device is at least one of a mobile phone, a tablet computer, and a portable media player.
[0010] In some embodiments, the remote display is at least one of a tablet computer, a mobile phone, a portable media player, a projector, and a monitor.
BRIEF DESCRIPTION OF THE FIGURES
[0011] FIG. 1 shows a schematic of a system in accordance with an embodiment.
[0012] FIG. 2 shows a system for facilitating interaction between an electronic device and a remote display in accordance with an embodiment.
[0013] FIG. 3 shows an exemplary interaction between an electronic device and a remote display in accordance with an embodiment.
[0014] FIG. 4 shows an exemplary interaction between an electronic device and a remote display in accordance with an embodiment.
[0015] FIG. 5 shows a flowchart illustrating the process of interacting with a remote display in accordance with an embodiment.
[0016] FIG. 6 shows a flowchart illustrating the process of interacting with an electronic device in accordance with an embodiment.
[0017] FIG. 7 shows a computer system in accordance with an embodiment.
[0018] In the figures, like reference numerals refer to the same figure elements.
DETAILED DESCRIPTION
[0019] The following description is presented to enable any person skilled in the art to make and use the embodiments, and is provided in the context of a particular application and its requirements. Various modifications to the disclosed embodiments will be readily apparent to those skilled in the art, and the general principles defined herein may be applied to other embodiments and applications without departing from the spirit and scope of the present disclosure. Thus, the present invention is not limited to the embodiments shown, but is to be accorded the widest scope consistent with the principles and features disclosed herein.
[0020] The data structures and code described in this detailed description are typically stored on a computer-readable storage medium, which may be any device or medium that can store code and/or data for use by a computer system. The computer-readable storage medium includes, but is not limited to, volatile memory, non-volatile memory, magnetic and optical storage devices such as disk drives, magnetic tape, CDs (compact discs), DVDs (digital versatile discs or digital video discs), or other media capable of storing code and/or data now known or later developed.
[0021] The methods and processes described in the detailed description section can be embodied as code and/or data, which can be stored in a computer-readable storage medium as described above. When a computer system reads and executes the code and/or data stored on the computer-readable storage medium, the computer system performs the methods and processes embodied as data structures and code and stored within the computer-readable storage medium.
[0022] Furthermore, methods and processes described herein can be included in hardware modules or apparatus. These modules or apparatus may include, but are not limited to, an application-specific integrated circuit (ASIC) chip, a field-programmable gate array (FPGA), a dedicated or shared processor that executes a particular software module or a piece of code at a particular time, and/or other programmable-logic devices now known or later developed. When the hardware modules or apparatus are activated, they perform the methods and processes included within them.
[0023] FIG. 1 shows a schematic of a system in accordance with an embodiment. The system includes an electronic device 102 and a remote display 104. Electronic device 102 may correspond to a mobile phone, tablet computer, portable media player, and/or other compact electronic device that includes functionality to store digital media such as documents, images, audio, and/or video. Remote display 104 may also correspond to a compact electronic device such as a tablet computer, mobile phone, and/or portable media player, or remote display 104 may include a projector, monitor, and/or other type of electronic display that is external to and/or larger than a display on electronic device 102.
[0024] In one or more embodiments, remote display 104 facilitates the sharing of digital media from electronic device 102. In particular, electronic device 102 may be used to drive remote display 104 so that graphical output on remote display 104 is substantially the same as graphical output on electronic device 102. For example, a user of electronic device 102 may control the display of a photo slideshow, presentation, and/or document on both remote display 104 and electronic device 102 from an application on electronic device 102. Because remote display 104 provides additional space for displaying the graphical output, remote display 104 may allow the photo slideshow, presentation, and/or document to be viewed by more people than if the photo slideshow, presentation, and/or document were displayed only on electronic device 102.
[0025] To enable the driving of remote display 104 from electronic device 102, a server 106 on electronic device 102 may be used to communicate with a client 108 on remote display 104. Server 106 may transmit graphical output from electronic device 102 to client 108, and client 108 may update remote display 104 with the graphical output. For example, server 106 and client 108 may correspond to a remote desktop server and remote desktop client that communicate over a network connection between electronic device 102 and remote display 104. The remote desktop server may propagate changes to the desktop and/or display of electronic device 102 to the remote desktop client, and the remote desktop client may update remote display 104 accordingly. In other words, server 106 and client 108 may allow electronic device 102 to drive remote display 104 without connecting to remote display 104 using a video interface such as Digital Visual Interface (DVI), High-Definition Multimedia Interface (HDMI), and/or DisplayPort.
[0026] Server 106 and client 108 may additionally be configured to drive remote display 104 using visual representations of a first set of touch inputs from electronic device 102 and/or update the graphical output based on a second set of touch inputs from remote display 104. As discussed in further detail below with respect to FIG. 2, a first application associated with server 106 may obtain a first set of touch inputs from a first touch screen associated with (e.g., provided by) electronic device 102. Each of the touch inputs may correspond to a tapping gesture, a swiping gesture, a pinching gesture, a rotating gesture, and/or another touch-based gesture on the touch screen. Next, server 106 may transmit the first set of touch inputs, along with the graphical output, to remote display 104. A second application associated with client 108 may then use the graphical output and a visual representation of the first set of touch inputs to drive remote display 104. For example, the second application may update remote display 104 with the graphical output, as well as a set of dots representing locations of the first set of touch inputs within the graphical output.
[0027] The second application may additionally obtain the second set of touch inputs from a second touch screen associated with (e.g., provided by) remote display 104. As with the first set of touch inputs, the second set of touch inputs may include a number of touch-based gestures. Next, client 108 may transmit the second set of touch inputs to electronic device 102, and the first application may update the graphical output based on the second set of touch inputs. For example, the first application may use the second set of touch inputs to update the graphical output as if the second set of touch inputs were received from the first touch screen on the electronic device. On the other hand, the first application may identify the remote display as a source of the second set of touch inputs. The first application may then use the identified remote display and/or second set of touch inputs to generate a graphical overlay that is displayed over the graphical output on the remote display. The first application may also transmit data to the remote display based on the second set of touch inputs. Consequently, the first and second applications may facilitate both the sharing of digital media from electronic device 102 and interaction between electronic device 102 and remote display 104.
[0028] FIG. 2 shows a system for facilitating interaction between electronic device 102 and remote display 104 in accordance with an embodiment. As described above, electronic device 102 may drive remote display 104 so that graphical output 208 on electronic device 102 is substantially the same as graphical output 228 on remote display 104. For example, electronic device 102 may enable the display of a presentation, photo slideshow, and/or document on both remote display 104 and the display of electronic device 102.
[0029] To drive remote display 104 from electronic device 102, a first application 210 associated with server 106 may generate graphical output 208 using a graphics-processing mechanism 206 (e.g., graphics-processing unit (GPU), graphics stack, etc.) in electronic device 102. For example, application 210 may provide a user interface 202 (e.g., graphical user interface (GUI)) that obtains a series of touch inputs 204 (e.g., gestures) from a user through a touch screen associated with electronic device 102. Application 210 may then issue draw commands to graphics-processing mechanism 206 based on touch inputs 204 to generate graphical output 208 that is shown within user interface 202 and/or the touch screen. As a result, the user may interact with application 210 by providing touch inputs 204 to application 210 through the touch screen and/or user interface 202 and receiving graphical output 208 from application 210 through the touch screen and/or user interface 202.
[0030] After graphical output 208 is generated by graphics-processing mechanism 206, graphical output 208 may be obtained by application 210 and encoded by an encoding apparatus 212 associated with application 210. During encoding, a conversion mechanism 214 in encoding apparatus 212 may convert graphical output 208 from a first color space to a second color space, and a scaling mechanism 216 may scale graphical output 208. For example, encoding apparatus 212 may include functionality to encode graphical output 208 using an H.264 codec. Conversion mechanism 214 may thus convert graphical output 208 from an RGB color space into a YUV color space. At the same time, scaling mechanism 216 may scale graphical output 208 up or down to allow graphical output 208 to match the resolution of remote display 104.
[0031] Once graphical output 208 is encoded, server 106 may transmit graphical output 208 to client 108 over a network (e.g., wireless network, local area network (LAN), wide area network (WAN), etc.) connection. A second application 218 associated with client 108 may then use graphical output 208 to update remote display 104. More specifically, a decoding apparatus 220 associated with application 218 may decode graphical output 208. For example, decoding apparatus 220 may include an H.264 codec that obtains frames of pixel values from the encoded graphical output 208. The pixel values may then be sent to a graphics-processing mechanism 226 (e.g., GPU, graphics stack) in remote display 104 and used by graphics-processing mechanism 226 to generate graphical output 228 for driving remote display 104. As with display of graphical output 208 in electronic device 102, graphical output 228 may be shown within a user interface 222 provided by application 218 and/or a touch screen associated with remote display 104.
[0032] As mentioned previously, a visual representation 230 of touch inputs 204 may also be used to drive remote display 104. More specifically, touch inputs 204 may be obtained by application 210 and/or server 106 and transmitted along with graphical output 208 to client 108. Visual representation 230 may be generated using touch inputs 204 and provided to graphics-processing mechanism 226 so that graphics-processing mechanism 226 may drive remote display 104 using visual representation 230. For example, visual representation 230 may include a set of dots representing the locations of touch inputs 204 within graphical output 208 and 228. As a result, visual representation 230 may be generated by drawing a dot at each location associated with touch inputs 204. Visual representations of touch inputs are discussed in further detail below with respect to FIG. 3.
[0033] In one or more embodiments, visual representation 230 is generated by application 210 and/or graphics-processing mechanism 206 and composited into graphical output 208 at electronic device 102. Consequently, server 106 may transmit a single data stream containing graphical output 208 and visual representation 230 to client 108. The data stream may then be decoded by decoding apparatus 220 and used by graphics-processing mechanism 226 to drive remote display 104.
[0034] On the other hand, application 210 and/or server 106 may transmit touch inputs 204 as auxiliary data associated with graphical output 208 to client 108. For example, graphical output 208 may be transmitted through a main communication channel between server 106 and client 108, while touch inputs 204 may be transmitted through a sideband channel between server 106 and client 108. As a result, application 218 and/or graphics-processing mechanism 226 may drive remote display 104 by drawing graphical output 228 to a first buffer, drawing visual representation 230 to a second buffer, and using the first and second buffers to drive the remote display.
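The main-channel/sideband-channel split in [0034] can be sketched with a simple tagged-message framing. The tags and serialization below are invented for illustration; a real transport might use separate sockets or stream identifiers instead.

```python
# Hypothetical framing of [0034]: encoded frames travel on a main channel
# and touch inputs on a sideband channel. Tags are made-up examples.

import json

def main_channel_message(encoded_frame: bytes) -> bytes:
    return b"VID" + encoded_frame  # graphical output 208 stream

def sideband_message(touch_inputs) -> bytes:
    return b"TCH" + json.dumps(touch_inputs).encode()  # auxiliary data

main = main_channel_message(b"\x00\x01\x02")
side = sideband_message([[10, 20], [30, 40]])
print(main[:3], side[:3])  # → b'VID' b'TCH'
```

Keeping the touch data out of the video stream is what lets the client draw graphical output 228 and visual representation 230 into separate buffers on the remote display.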
[0035] Interaction between electronic device 102 and remote display 104 may additionally be facilitated by allowing touch inputs 224 from remote display 104 to be used in the update of graphical output 208 and 228. A user of remote display 104 may provide touch inputs 224 as touch-based gestures through a touch screen associated with remote display 104. User interface 222 and/or application 218 may obtain touch inputs 224 from the touch screen, and client 108 may transmit touch inputs 224 to server 106 and/or application 210.
[0036] Next, application 210 may update graphical output 208 based on touch inputs 224. For example, application 210 may generate graphical output 208 from touch inputs 224 as if touch inputs 224 were received from user interface 202 and/or the touch screen associated with electronic device 102. In other words, transmission of touch inputs 224 to electronic device 102 may allow a user to interact with application 210 from remote display 104 in the same way as the user would from electronic device 102.
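The behavior in [0036], where remote touch inputs are applied exactly as local ones would be, can be sketched as a single event-handling path. The TouchEvent and Application names are assumptions for illustration; the patent prescribes no particular API.

```python
# Sketch of [0036]: touch inputs 224 from the remote display are handled
# by the same code path as touch inputs 204 from the local touch screen.

from dataclasses import dataclass

@dataclass
class TouchEvent:
    x: int
    y: int
    gesture: str
    source: str = "local"  # "local" or "remote"

class Application:
    def __init__(self):
        self.handled = []

    def handle_touch(self, event):
        # Identical processing regardless of where the event originated.
        self.handled.append((event.gesture, event.x, event.y))

app = Application()
app.handle_touch(TouchEvent(10, 20, "tap"))                     # local touch screen
app.handle_touch(TouchEvent(30, 40, "swipe", source="remote"))  # from client 108
print(app.handled)
```

Because the `source` field is still carried on each event, the application can alternatively branch on it, which is the behavior described in [0037] below.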
[0037] Conversely, server 106 and/or application 210 may include functionality to identify remote display 104 as the source of touch inputs 224. Such identification of remote display 104 may allow application 210 to modify graphical output 208 based on touch inputs 224 prior to transmitting graphical output 208 to remote display 104 and/or transmit data to remote display 104 based on touch inputs 224.
[0038] For example, application 210 may use touch inputs 224 to generate a graphical overlay for graphical output 208 that is transmitted with graphical output 208 to client 108 but not provided to graphics-processing mechanism 206. Next, application 218 may provide the graphical overlay and graphical output 208 to graphics-processing mechanism 226, which drives remote display 104 using both the graphical overlay and graphical output 208. As a result, the graphical overlay may be shown within user interface 222 and/or remote display 104 but not within user interface 202 and/or the display (e.g., touch screen) on electronic device 102. Touch inputs 224 associated with the graphical overlay within user interface 222 may then be transmitted to server 106 and used by application 210 to update the graphical overlay and/or transmit data associated with the graphical overlay to remote display 104. Modification of graphical output 208 and/or transmission of data to remote display 104 based on touch inputs 224 is discussed in further detail below with respect to FIG. 4.
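The remote-only overlay of [0037]-[0038] amounts to forking the frame before transmission. The function below is a hedged sketch with invented names; it only models the decision, not actual compositing.

```python
# Sketch of [0037]-[0038]: when touch inputs are identified as coming from
# the remote display, a graphical overlay is added to the transmitted copy
# of the frame only, never to the device's own display.

def prepare_frames(base_frame, touch_source):
    local_frame = base_frame   # drives the display on electronic device 102
    remote_frame = base_frame  # transmitted to remote display 104
    if touch_source == "remote":
        remote_frame = base_frame + "+overlay"
    return local_frame, remote_frame

print(prepare_frames("frame", "remote"))  # → ('frame', 'frame+overlay')
print(prepare_frames("frame", "local"))   # → ('frame', 'frame')
```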
[0039] Consequently, applications 210 and 218 may allow electronic device 102 and/or remote display 104 to visualize and/or share graphical output 208 and 228 and touch inputs 204 and 224. In turn, applications 210 and 218 may facilitate the sharing of digital media from electronic device 102, as well as interaction between electronic device 102 and remote display 104. For example, the transmission of graphical output 208 and touch inputs 204 from application 210 to application 218 may allow a user of remote display 104 to view user interface 202 on remote display 104 and/or observe the use of electronic device 102 by another user. Similarly, the transmission of touch inputs 224 from application 218 to application 210 may allow the user of remote display 104 to interact with application 210 and/or obtain data (e.g., digital media) from electronic device 102.
[0040] Those skilled in the art will appreciate that the system of FIG. 2 may be implemented in a variety of ways. First, encoding apparatus 212 and server 106 may execute within application 210 and/or independently of application 210. Along the same lines, decoding apparatus 220 and client 108 may execute within application 218 and/or independently of application 218. Moreover, applications 210 and 218 may correspond to identical applications that each implement encoding apparatus 212, server 106, client 108, and decoding apparatus 220 to enable viewing of and/or interaction with either user interface 202 or user interface 222 from both electronic device 102 and remote display 104. On the other hand, applications 210 and 218 may occupy complementary roles, such that only one user interface (e.g., user interface 202) is accessible from both electronic device 102 and remote display 104.
[0041] FIG. 3 shows an exemplary interaction between an electronic device 302 and a remote display 304 in accordance with an embodiment. Electronic device 302 may be used to drive remote display 304 so that graphical output on remote display 304 is substantially the same as graphical output on electronic device 302. For example, graphical output for a display of electronic device 302 may be transmitted to remote display 304 and used to drive remote display 304.
[0042] In addition, a set of touch inputs 306-308 may be obtained from electronic device 302. Touch inputs 306-308 may be associated with tapping gestures, swiping gestures, pinching gestures, rotating gestures, and/or other touch-based gestures on a touch screen associated with electronic device 302. Touch inputs 306-308 may also be transmitted to remote display 304 to enable the driving of remote display 304 using visual representations 310-312 of touch inputs 306-308. For example, visual representations 310-312 may correspond to dots that represent the locations of touch inputs 306-308 within the graphical output. Visual representations 310-312 may also persist for a period after touch inputs 306-308 cease and/or change location to convey motion information (e.g., lines, arcs, etc.) associated with touch inputs 306-308 to a user of remote display 304. In other words, visual representations 310-312 may allow the user of remote display 304 to observe the use of electronic device 302 by a different user.
[0043] In particular, touch inputs 306-308 may be transmitted to remote display 304 by compositing visual representations 310-312 into the graphical output of electronic device 302 prior to transmitting the graphical output to remote display 304. The graphical output and touch inputs 306-308 may thus be transmitted as a single data stream to remote display 304 and drawn to a single buffer that is used to drive remote display 304. Alternatively, touch inputs 306-308 may be transmitted as auxiliary data associated with the graphical output to remote display 304. As a result, the graphical output may be drawn to a first buffer, visual representations 310-312 may be drawn to a second buffer based on touch inputs 306-308, and the first and second buffers may be used to drive remote display 304.
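The two delivery paths in [0043] can be sketched as a single branch: a composited stream needs one buffer, while auxiliary touch data yields a separate second buffer. Buffers are modeled as plain Python values here purely for illustration.

```python
# Sketch of [0043]: single-buffer (composited) versus two-buffer
# (auxiliary-data) driving of the remote display.

def buffers_for_display(graphical_output, auxiliary_touches=None):
    if auxiliary_touches is None:
        # Visual representations were composited before transmission:
        # the decoded stream alone drives the display.
        return [graphical_output]
    # Auxiliary-data path: draw the touch dots into their own buffer.
    touch_buffer = [("dot", x, y) for (x, y) in auxiliary_touches]
    return [graphical_output, touch_buffer]

print(len(buffers_for_display("frame")))            # → 1 (composited)
print(len(buffers_for_display("frame", [(1, 2)])))  # → 2 (auxiliary)
```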
[0044] FIG. 4 shows an exemplary interaction between an electronic device 402 and a remote display 404 in accordance with an embodiment. Like electronic device 302 and remote display 304 of FIG. 3, electronic device 402 may be used to drive remote display 404 so that graphical output is substantially the same on both electronic device 402 and remote display 404.
[0045] However, a graphical overlay 406 is shown on remote display 404 but not on electronic device 402. Overlay 406 may result from the transmission of touch inputs from remote display 404 to electronic device 402, as well as the subsequent processing of the touch inputs by an application on electronic device 402. For example, the application may generate overlay 406 by updating the graphical output based on the touch inputs prior to transmitting the graphical output to remote display 404 but not prior to using the graphical output to drive a display (e.g., touch screen) on electronic device 402.
[0046] Overlay 406 may also facilitate the transmission of data from the application to remote display 404 based on the touch inputs. For example, overlay 406 may correspond to a dialog box that gives a user of remote display 404 an option to save a file associated with the graphical output and another option to not save the file. Touch inputs provided by the user within the dialog box may then be sent to electronic device 402 for processing by the application. If the application determines that the touch inputs represent the selection of the option to save the file, the application may remove overlay 406 from remote display 404 and transmit data for the file to remote display 404. In addition, the application may generate a dialog box on electronic device 402 to query the user of electronic device 402 for permission to transmit the data to remote display 404, or the application may transmit the data without obtaining permission from the user of electronic device 402. Conversely, if the application determines that the touch inputs represent the selection of the option to not save the file, the application may remove overlay 406 from remote display 404 without transmitting file data to remote display 404.
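A rough sketch of the dialog-box logic described above, with hypothetical names and a simple rectangle hit test standing in for the application's real input handling:

```python
def handle_dialog_touch(touch, save_button_rect, file_data):
    """Decide what the device-side application does with a touch that the
    remote display forwarded from an overlay dialog: if the touch lands
    inside the 'save' button, return the file data to transmit; otherwise
    transmit nothing. In either case the overlay is removed.
    Rect format (left, top, right, bottom) and all names are hypothetical."""
    x, y = touch
    left, top, right, bottom = save_button_rect
    if left <= x <= right and top <= y <= bottom:
        return {"remove_overlay": True, "transmit": file_data}
    return {"remove_overlay": True, "transmit": None}
```

In a fuller sketch, the "transmit" value would additionally be gated on a permission dialog shown on the electronic device, as the paragraph above notes.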
[0047] FIG. 5 shows a flowchart illustrating the process of interacting with a remote display in accordance with an embodiment. In one or more embodiments, one or more of the steps may be omitted, repeated, and/or performed in a different order. Accordingly, the specific arrangement of steps shown in FIG. 5 should not be construed as limiting the scope of the embodiments.
[0048] First, graphical output for a display of an electronic device is obtained (operation 502), and a first set of touch inputs associated with the graphical output is obtained from a touch screen associated with the electronic device (operation 504). Next, the graphical output is encoded (operation 506). For example, the graphical output may be encoded using an H.264 codec that converts the graphical output from a first color space to a second color space and/or scales the graphical output.
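The color-space conversion and scaling mentioned for operation 506 can be illustrated with the standard BT.601 luma equation and a simple aspect-preserving resize (helper names are hypothetical; a real H.264 encoder computes full Y'CbCr planes rather than luma alone):

```python
def rgb_to_luma(r, g, b):
    """BT.601 luma: the Y component produced when converting RGB pixels
    toward the Y'CbCr color space that H.264 encoders typically consume."""
    return 0.299 * r + 0.587 * g + 0.114 * b

def scale_dimensions(width, height, max_width):
    """Downscale the graphical output to fit the remote display's width,
    preserving the aspect ratio; pass through if it already fits."""
    if width <= max_width:
        return width, height
    scale = max_width / width
    return max_width, round(height * scale)
```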
[0049] The graphical output and first set of touch inputs are then transmitted to the remote display (operation 508), where the graphical output and a visual representation of the first set of touch inputs are used to drive the remote display. Driving of the remote display using the graphical output and visual representation of the first set of touch inputs is discussed in further detail below with respect to FIG. 6.
[0050] A second set of touch inputs may also be received (operation 510) from the remote display. If the second set of touch inputs is not received, no processing related to the second set of touch inputs is performed. If the second set of touch inputs is received, the graphical output is updated based on the second set of touch inputs (operation 512). For example, the second set of touch inputs may be provided to an application configured to generate the graphical output. The application may process the second set of touch inputs as if the second set of touch inputs were obtained through the touch screen of the electronic device.
Alternatively, the application may identify the remote display as a source of the second set of touch inputs and modify the graphical output prior to transmitting the graphical output to the remote display. The application may also use the identification of the remote display to transmit data to the remote display based on the second set of touch inputs.
[0051] Interaction with the remote display may continue (operation 514). For example, the electronic device may interact with the remote display as long as a network connection exists between the electronic device and remote display and/or digital media is being shared between the electronic device and remote display. If interaction with the remote display is to continue, the graphical output and the first set of touch inputs are obtained (operations 502-504), the graphical output is encoded (operation 506), and the graphical output and first set of touch inputs are transmitted to the remote display (operation 508). At the same time, the second set of touch inputs may also be received (operation 510) from the remote display and used to update the graphical output (operation 512). The graphical output and visual representation of the first set of touch inputs may continue to be obtained, modified, and/or transmitted until interaction between the electronic device and the remote display ceases.
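The repeating operations 502-512 can be sketched as a loop with injected callables (all names are hypothetical; this is a shape illustration under stated assumptions, not the claimed implementation):

```python
def device_loop(get_frame, get_local_touches, encode, send,
                receive_remote_touches, update_output, keep_going):
    """One possible shape of the FIG. 5 loop: capture, encode, transmit,
    and fold any remote touches back into the graphical output until
    interaction ends. All collaborators are passed in so the sketch
    stays self-contained."""
    while keep_going():
        frame = get_frame()                # operation 502
        touches = get_local_touches()      # operation 504
        send(encode(frame), touches)       # operations 506-508
        remote = receive_remote_touches()  # operation 510 (may be None)
        if remote:
            update_output(remote)          # operation 512
```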
[0052] FIG. 6 shows a flowchart illustrating the process of interacting with an electronic device in accordance with an embodiment. In one or more embodiments, one or more of the steps may be omitted, repeated, and/or performed in a different order. Accordingly, the specific arrangement of steps shown in FIG. 6 should not be construed as limiting the scope of the embodiments.
[0053] Initially, graphical output and a first set of touch inputs associated with the graphical output are obtained from the electronic device (operation 602). The first set of touch inputs may be composited into the graphical output and/or transmitted as auxiliary data associated with the graphical output. Next, the graphical output is decoded (operation 604). For example, an H.264 codec may be used to obtain frames of pixel values from the graphical output.
[0054] The graphical output and a visual representation of the first set of touch inputs are then used to drive a remote display (operation 606). If the first set of touch inputs is composited into the graphical output, the composited graphical output may be drawn to a single buffer that is used to drive the remote display. If the first set of touch inputs is transmitted as auxiliary data, the graphical output may be drawn to a first buffer, the visual representation of the first set of touch inputs may be drawn to a second buffer, and the first and second buffers may be used to drive the remote display.
[0055] A second set of touch inputs may also be provided (operation 608) by a user of the remote display. If the second set of touch inputs is not provided, no processing related to the second set of touch inputs is performed. If the second set of touch inputs is provided, the second set of touch inputs is obtained from a touch screen associated with the remote display (operation 610) and transmitted to the electronic device (operation 612). The second set of touch inputs may then be used by the electronic device to update the graphical output and/or transmit data to the remote display.
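When the first set of touch inputs arrives as auxiliary data, the two-buffer drive of operation 606 might combine buffers along these lines (grayscale lists and an assumed alpha value keep the sketch self-contained):

```python
def blend_buffers(frame_buffer, touch_buffer, alpha=0.6):
    """Combine the decoded graphical output (first buffer) with the
    touch-visualization layer (second buffer) into the pixels actually
    scanned out to the remote display. Grayscale row lists are used for
    illustration; a touch_buffer value of None means 'no overlay at
    this pixel'."""
    out = []
    for frame_row, touch_row in zip(frame_buffer, touch_buffer):
        row = []
        for base, over in zip(frame_row, touch_row):
            if over is None:
                row.append(base)  # no touch visual here; pass through
            else:
                row.append(round((1 - alpha) * base + alpha * over))
        out.append(row)
    return out
```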
[0056] Interaction with the electronic device may continue (operation 614). If interaction with the electronic device is to continue, the graphical output and first set of touch inputs are received from the electronic device (operation 602), the graphical output is decoded (operation 604), and the graphical output and a visual representation of the first set of touch inputs are used to drive the remote display (operation 606). Concurrently, a second set of touch inputs may be provided (operation 608) by a user of the remote display, obtained from the touch screen
(operation 610), and transmitted to the electronic device (operation 612). Use of the graphical output and visual representation to drive the remote display and transmission of the second set of touch inputs to the electronic device may continue until interaction between the remote display and the electronic device ceases.
[0057] FIG. 7 shows a computer system 700 in accordance with an embodiment.
Computer system 700 may correspond to an apparatus that includes a processor 702, memory 704, storage 706, and/or other components found in electronic computing devices. Processor 702 may support parallel processing and/or multi-threaded operation with other processors in computer system 700. Computer system 700 may also include input/output (I/O) devices such as a keyboard 708, a mouse 710, and a display 712.
[0058] Computer system 700 may include functionality to execute various components of the present embodiments. In particular, computer system 700 may include an operating system (not shown) that coordinates the use of hardware and software resources on computer system 700, as well as one or more applications that perform specialized tasks for the user. To perform tasks for the user, applications may obtain the use of hardware resources on computer system 700 from the operating system, as well as interact with the user through a hardware and/or software framework provided by the operating system.
[0059] In one or more embodiments, computer system 700 provides a system for facilitating interaction between an electronic device and a remote display. The system may include a first application and an encoding apparatus on the electronic device, and a second application and a decoding apparatus on the remote display. The encoding apparatus may obtain graphical output for a display of the electronic device and a first set of touch inputs associated with the graphical output from a first touch screen on the electronic device. The encoding apparatus may encode the graphical output, and the first application may transmit the graphical output and the first set of touch inputs to the remote display. Upon receiving the graphical output and the first set of touch inputs at the remote display, the decoding apparatus may decode the graphical output. The second application may then use the graphical output and a visual representation of the first set of touch inputs to drive the remote display.
[0060] Furthermore, the second application may obtain a second set of touch inputs associated with the graphical output from a second touch screen on the remote display and transmit the second set of touch inputs to the electronic device. The first application may then update the graphical output based on the second set of touch inputs. For example, the first application may identify the remote display as a source of the second set of touch inputs. The first application may then use the second set of touch inputs to generate an overlay that is displayed over the graphical output on the remote display. The first application may also transmit data to the remote display based on the second set of touch inputs.
[0061] In addition, one or more components of computer system 700 may be remotely located and connected to the other components over a network. Portions of the present embodiments (e.g., first application, second application, encoding apparatus, decoding apparatus, etc.) may also be located on different nodes of a distributed system that implements the embodiments. For example, the present embodiments may be implemented using a cloud computing system that communicates with the electronic device using a network connection with the electronic device and displays graphical output and a visual representation of the first set of touch inputs from the electronic device on a set of remote displays.
[0062] The foregoing descriptions of various embodiments have been presented only for purposes of illustration and description. They are not intended to be exhaustive or to limit the present invention to the forms disclosed. Accordingly, many modifications and variations will be apparent to practitioners skilled in the art. Additionally, the above disclosure is not intended to limit the present invention.

Claims

What Is Claimed Is:
1. A computer-implemented method for interacting with a remote display, comprising:
obtaining graphical output for a display of an electronic device;
obtaining a first set of touch inputs associated with the graphical output from a touch screen associated with the electronic device; and
transmitting the graphical output and the first set of touch inputs to the remote display, wherein the graphical output and a visual representation of the first set of touch inputs are used to drive the remote display.
2. The computer-implemented method of claim 1, further comprising:
encoding the graphical output prior to transmitting the graphical output to the remote display.
3. The computer-implemented method of claim 2, wherein encoding the graphical output involves at least one of:
converting the graphical output from a first color space to a second color space; and scaling the graphical output.
4. The computer-implemented method of claim 1, further comprising:
receiving a second set of touch inputs from the remote display; and
updating the graphical output based on the second set of touch inputs.
5. The computer-implemented method of claim 4, wherein updating the graphical output based on the second set of touch inputs involves:
providing the second set of touch inputs to an application configured to generate the graphical output.
6. The computer-implemented method of claim 5, wherein updating the graphical output based on the second set of touch inputs further involves:
identifying the remote display as a source of the second set of touch inputs,
wherein the identified remote display enables modification of the graphical output by the application prior to transmitting the graphical output to the remote display.
7. The computer-implemented method of claim 6, wherein the identified remote display further enables the transmission of data from the application to the remote display based on the second set of touch inputs.
8. The computer-implemented method of claim 1, wherein transmitting the graphical output and the first set of touch inputs to the remote display involves at least one of:
compositing the visual representation of the first set of touch inputs into the graphical output; and
transmitting the first set of touch inputs as auxiliary data associated with the graphical output to the remote display.
9. The computer-implemented method of claim 1, wherein the visual representation of the first set of touch inputs comprises a set of dots.
10. A computer-implemented method for interacting with an electronic device, comprising:
receiving graphical output and a first set of touch inputs associated with the graphical output from the electronic device; and
using the graphical output and a visual representation of the first set of touch inputs to drive a remote display.
11. The computer-implemented method of claim 10, further comprising:
decoding the graphical output prior to using the graphical output to drive the remote display.
12. The computer-implemented method of claim 10, further comprising:
obtaining a second set of touch inputs associated with the graphical output from a touch screen associated with the remote display; and
transmitting the second set of touch inputs to the electronic device, wherein the second set of touch inputs is used by the electronic device to update the graphical output.
13. The computer-implemented method of claim 12, wherein the second set of touch inputs is further used by the electronic device to transmit data to the remote display.
14. The computer-implemented method of claim 10, wherein using the graphical output and the visual representation of the first set of touch inputs to drive the remote display involves:
drawing the graphical output to a first buffer;
drawing the visual representation of the first set of touch inputs to a second buffer; and using the first and second buffers to drive the remote display.
15. The computer-implemented method of claim 10, wherein the remote display is at least one of a tablet computer, a mobile phone, a portable media player, a projector, and a monitor.
16. A system for facilitating interaction between an electronic device and a remote display, comprising:
a first application on the electronic device, wherein the first application is configured to: obtain graphical output for a display of the electronic device;
obtain a first set of touch inputs associated with the graphical output from a first touch screen on the electronic device; and
transmit the graphical output and the first set of touch inputs to the remote display; and
a second application on the remote display, wherein the second application is configured to use the graphical output and a visual representation of the first set of touch inputs to drive the remote display.
17. The system of claim 16, further comprising:
an encoding apparatus on the electronic device, wherein the encoding apparatus is configured to encode the graphical output prior to transmitting the graphical output to the remote display; and
a decoding apparatus on the remote display, wherein the decoding apparatus is configured to decode the graphical output prior to using the graphical output to drive the remote display.
18. The system of claim 16,
wherein the second application is further configured to:
obtain a second set of touch inputs associated with the graphical output from a second touch screen on the remote display; and
transmit the second set of touch inputs to the electronic device, and wherein the first application is further configured to update the graphical output based on the second set of touch inputs.
19. The system of claim 18, wherein the first application is further configured to identify the remote display as a source of the second set of touch inputs.
20. The system of claim 19, wherein the identified remote display enables at least one of:
modification of the graphical output by the first application prior to transmitting the graphical output to the remote display; and
transmission of data from the first application to the remote display based on the second set of touch inputs.
21. The system of claim 16, wherein transmitting the graphical output and the first set of touch inputs to the remote display involves at least one of:
compositing the visual representation of the first set of touch inputs into the graphical output; and
transmitting the first set of touch inputs as auxiliary data associated with the graphical output to the remote display.
22. The system of claim 16, wherein the electronic device is at least one of a mobile phone, a tablet computer, and a portable media player.
23. A computer-readable storage medium storing instructions that when executed by a computer cause the computer to perform a method for interacting with a remote display, the method comprising:
obtaining graphical output for a display of an electronic device;
obtaining a first set of touch inputs associated with the graphical output from a touch screen associated with the electronic device; and
transmitting the graphical output and the first set of touch inputs to the remote display, wherein the graphical output and a visual representation of the first set of touch inputs are used to drive the remote display.
24. The computer-readable storage medium of claim 23, the method further comprising: receiving a second set of touch inputs from the remote display; and
updating the graphical output based on the second set of touch inputs.
25. The computer-readable storage medium of claim 24, wherein updating the graphical output based on the second set of touch inputs involves:
providing the second set of touch inputs to an application configured to generate the graphical output.
26. The computer-readable storage medium of claim 23, wherein transmitting the graphical output and the first set of touch inputs to the remote display involves at least one of: compositing the visual representation of the first set of touch inputs into the graphical output; and
transmitting the first set of touch inputs as auxiliary data associated with the graphical output to the remote display.
PCT/US2012/032929 2011-04-12 2012-04-10 Gesture visualization and sharing between electronic devices and remote displays WO2012142043A1 (en)

Priority Applications (5)

Application Number Priority Date Filing Date Title
AU2012243007A AU2012243007B2 (en) 2011-04-12 2012-04-10 Gesture visualization and sharing between electronic devices and remote displays
EP12714517.5A EP2678771B1 (en) 2011-04-12 2012-04-10 Gesture visualization and sharing between electronic devices and remote displays
CN201280016099.3A CN103460177B (en) 2011-04-12 2012-04-10 Gesture visualization between electronic equipment and remote display and sharing
KR1020137025984A KR101629072B1 (en) 2011-04-12 2012-04-10 Gesture visualization and sharing between electronic devices and remote displays
JP2014505219A JP5902288B2 (en) 2011-04-12 2012-04-10 Gesture visualization and sharing between electronic device and remote display

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US13/084,779 2011-04-12
US13/084,779 US9152373B2 (en) 2011-04-12 2011-04-12 Gesture visualization and sharing between electronic devices and remote displays

Publications (1)

Publication Number Publication Date
WO2012142043A1 true WO2012142043A1 (en) 2012-10-18

Family

ID=45955179

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2012/032929 WO2012142043A1 (en) 2011-04-12 2012-04-10 Gesture visualization and sharing between electronic devices and remote displays

Country Status (7)

Country Link
US (1) US9152373B2 (en)
EP (1) EP2678771B1 (en)
JP (1) JP5902288B2 (en)
KR (1) KR101629072B1 (en)
CN (1) CN103460177B (en)
AU (1) AU2012243007B2 (en)
WO (1) WO2012142043A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2016506586A (en) * 2013-01-10 2016-03-03 フォックス スポーツ プロダクションズ,インコーポレイティッド System, method and interface for viewer interaction for 3D display of vehicles

Families Citing this family (50)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120113091A1 (en) * 2010-10-29 2012-05-10 Joel Solomon Isaacson Remote Graphics
US10976981B2 (en) * 2011-07-15 2021-04-13 Vmware, Inc. Remote desktop exporting
US9727227B2 (en) * 2011-07-28 2017-08-08 Microsoft Technology Licensing, Llc Multi-touch remoting
US8819296B2 (en) * 2011-11-17 2014-08-26 Nokia Corporation Apparatus, a method and a computer program
US20130234984A1 (en) * 2012-03-06 2013-09-12 Industry-University Cooperation Foundation Hanyang University System for linking and controlling terminals and user terminal used in the same
CN103513909A (en) * 2012-06-29 2014-01-15 联想(北京)有限公司 Method for controlling electronic device and electronic device
US9213515B2 (en) * 2012-09-24 2015-12-15 At&T Intellectual Property I, L.P. On-demand multi-screen computing
US20140108940A1 (en) * 2012-10-15 2014-04-17 Nvidia Corporation Method and system of remote communication over a network
US9930082B2 (en) 2012-11-20 2018-03-27 Nvidia Corporation Method and system for network driven automatic adaptive rendering impedance
US10185416B2 (en) 2012-11-20 2019-01-22 Samsung Electronics Co., Ltd. User gesture input to wearable electronic device involving movement of device
US11372536B2 (en) 2012-11-20 2022-06-28 Samsung Electronics Company, Ltd. Transition and interaction model for wearable electronic device
US10423214B2 (en) * 2012-11-20 2019-09-24 Samsung Electronics Company, Ltd Delegating processing from wearable electronic device
US10551928B2 (en) 2012-11-20 2020-02-04 Samsung Electronics Company, Ltd. GUI transitions on wearable electronic device
US11157436B2 (en) 2012-11-20 2021-10-26 Samsung Electronics Company, Ltd. Services associated with wearable electronic device
US8994827B2 (en) 2012-11-20 2015-03-31 Samsung Electronics Co., Ltd Wearable electronic device
US9477313B2 (en) 2012-11-20 2016-10-25 Samsung Electronics Co., Ltd. User gesture input to wearable electronic device involving outward-facing sensor of device
US9936335B2 (en) 2012-12-13 2018-04-03 Microsoft Technology Licensing, Llc Mobile computing device application sharing
JP2014123311A (en) * 2012-12-21 2014-07-03 International Business Maschines Corporation Device, method and program for providing corresponding application program with input from input device
US9772978B2 (en) * 2013-02-07 2017-09-26 Adobe Systems Incorporated Touch input visualizations based on user interface context
US9445155B2 (en) 2013-03-04 2016-09-13 Google Technology Holdings LLC Gesture-based content sharing
US9438543B2 (en) 2013-03-04 2016-09-06 Google Technology Holdings LLC Gesture-based content sharing
US20180173416A1 (en) * 2013-03-07 2018-06-21 UBE, INC. d/b/a PLUM Distributed networking of configurable load controllers
KR102097640B1 (en) * 2013-03-08 2020-04-06 엘지전자 주식회사 Mobile terminal and control method thereof
US9122366B2 (en) * 2013-03-15 2015-09-01 Navico Holding As Residue indicators
US9547466B2 (en) * 2013-05-29 2017-01-17 Vmware, Inc. Systems and methods for using screen sampling to detect display changes
KR102053822B1 (en) 2013-06-03 2019-12-09 삼성전자주식회사 Portable apparatus and method for displaying a screen
US9819604B2 (en) 2013-07-31 2017-11-14 Nvidia Corporation Real time network adaptive low latency transport stream muxing of audio/video streams for miracast
WO2015030795A1 (en) * 2013-08-30 2015-03-05 Hewlett Packard Development Company, L.P. Touch input association
KR102184269B1 (en) * 2013-09-02 2020-11-30 삼성전자 주식회사 Display apparatus, portable apparatus and method for displaying a screen thereof
US9958289B2 (en) 2013-09-26 2018-05-01 Google Llc Controlling navigation software on a portable device from the head unit of a vehicle
US10054463B2 (en) 2013-09-26 2018-08-21 Google Llc Systems and methods for providing navigation data to a vehicle
US9109917B2 (en) 2013-09-26 2015-08-18 Google Inc. Systems and methods for providing input suggestions via the head unit of a vehicle
US20150103015A1 (en) * 2013-10-10 2015-04-16 Blackberry Limited Devices and methods for generating tactile feedback
KR20150069155A (en) * 2013-12-13 2015-06-23 삼성전자주식회사 Touch indicator display method of electronic apparatus and electronic appparatus thereof
KR20150081708A (en) * 2014-01-06 2015-07-15 삼성전자주식회사 user terminal apparatus and control method thereof
US10691332B2 (en) 2014-02-28 2020-06-23 Samsung Electronics Company, Ltd. Text input on an interactive display
US9817549B2 (en) * 2014-06-25 2017-11-14 Verizon Patent And Licensing Inc. Method and system for auto switching applications based on device orientation
US20170371515A1 (en) 2014-11-19 2017-12-28 Honda Motor Co., Ltd. System and method for providing absolute and zone coordinate mapping with graphic animations
US9727231B2 (en) 2014-11-19 2017-08-08 Honda Motor Co., Ltd. System and method for providing absolute coordinate and zone mapping between a touchpad and a display screen
TWI506534B (en) * 2014-12-09 2015-11-01 Awind Inc Mirror display system having low data traffic and method thereof
US10817244B2 (en) 2014-12-22 2020-10-27 Zoho Corporation Private Limited Orientation-responsive data rendering
WO2016108683A1 (en) * 2014-12-29 2016-07-07 Emerico Sdn Bhd An apparatus for use in a bank teller system
CN106406784A (en) * 2015-07-27 2017-02-15 宏碁股份有限公司 Image output device
US10872690B2 (en) 2018-11-28 2020-12-22 General Electric Company System and method for remote visualization of medical images
US10452868B1 (en) 2019-02-04 2019-10-22 S2 Systems Corporation Web browser remoting using network vector rendering
US10558824B1 (en) 2019-02-04 2020-02-11 S2 Systems Corporation Application remoting using network vector rendering
US11880422B2 (en) 2019-02-04 2024-01-23 Cloudflare, Inc. Theft prevention for sensitive information
US10552639B1 (en) 2019-02-04 2020-02-04 S2 Systems Corporation Local isolator application with cohesive application-isolation interface
US11194468B2 (en) 2020-05-11 2021-12-07 Aron Ezra Systems and methods for non-contacting interaction with user terminals
WO2022181899A1 (en) * 2021-02-26 2022-09-01 엘지전자 주식회사 Signal processing device and vehicle display device having same

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060284785A1 (en) * 2003-09-10 2006-12-21 Bitterlich Jeans Y Generation of an object-processing platform between two computers by joining screens
US20080115073A1 (en) 2005-05-26 2008-05-15 ERICKSON Shawn Method and Apparatus for Remote Display of Drawn Content

Family Cites Families (35)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2952955B2 (en) * 1990-04-19 1999-09-27 ソニー株式会社 Image creation device
US5652849A (en) 1995-03-16 1997-07-29 Regents Of The University Of Michigan Apparatus and method for remote control using a visual information stream
JPH0922339A (en) 1995-07-05 1997-01-21 Matsushita Electric Ind Co Ltd Remote controller
JPH0997042A (en) 1995-10-03 1997-04-08 Matsushita Electric Ind Co Ltd Image display device
US6333929B1 (en) * 1997-08-29 2001-12-25 Intel Corporation Packet format for a distributed system
US8479122B2 (en) 2004-07-30 2013-07-02 Apple Inc. Gestures for touch sensitive input devices
US6564380B1 (en) 1999-01-26 2003-05-13 Pixelworld Networks, Inc. System and method for sending live video on the internet
US6396523B1 (en) 1999-07-29 2002-05-28 Interlink Electronics, Inc. Home entertainment device remote control
TW456112B (en) 1999-12-10 2001-09-21 Sun Wave Technology Corp Multi-function remote control with touch screen display
JP2001197461A (en) 2000-01-07 2001-07-19 Matsushita Electric Ind Co Ltd Sharing operation method for multimedia information operation window
US6765557B1 (en) 2000-04-10 2004-07-20 Interlink Electronics, Inc. Remote control having touch pad to screen mapping
EP1364362A1 (en) 2001-01-24 2003-11-26 Interlink Electronics, Inc. Game and home entertainment device remote control
JP2002244987A (en) 2001-02-15 2002-08-30 Matsushita Electric Ind Co Ltd Transfer system, first device, second device, and program
US6914551B2 (en) 2002-04-12 2005-07-05 Apple Computer, Inc. Apparatus and method to facilitate universal remote control
EP1499127B1 (en) 2003-07-18 2006-08-30 Alcatel A method of distributing real time data streams across a multimedia network as well as a mediation device and a multimedia network therefore
US7535465B2 (en) 2003-09-02 2009-05-19 Creative Technology Ltd. Method and system to display media content data
JP2007001237A (en) * 2005-06-27 2007-01-11 Konica Minolta Business Technologies Inc Apparatus, apparatus system, image forming apparatus, and method and program for controlling apparatus
WO2007075196A1 (en) * 2005-09-07 2007-07-05 Vidyo, Inc. System and method for a high reliability base layer trunk
JP2007173952A (en) 2005-12-19 2007-07-05 Sony Corp Content reproduction system, reproducing unit and method, providing device and providing method, program, and recording medium
US8436889B2 (en) 2005-12-22 2013-05-07 Vidyo, Inc. System and method for videoconferencing using scalable video coding and compositing scalable video conferencing servers
US7733808B2 (en) 2006-11-10 2010-06-08 Microsoft Corporation Peer-to-peer aided live video sharing system
US20080178224A1 (en) 2007-01-20 2008-07-24 Michael Laude Upgradeable intelligent remote control device with integrated program guide
US8555180B2 (en) * 2007-03-27 2013-10-08 Amulet Technologies, Llc Smart peripheral architecture for portable media players
JP2008310443A (en) 2007-06-12 2008-12-25 Canon Inc Image processor, image processing method, program, and recording medium
US7889175B2 (en) 2007-06-28 2011-02-15 Panasonic Corporation Touchpad-enabled remote controller and user interaction methods
US9767681B2 (en) 2007-12-12 2017-09-19 Apple Inc. Handheld electronic devices with remote control functionality and gesture recognition
US8542323B2 (en) 2007-12-21 2013-09-24 Sony Corporation Touch sensitive wireless navigation device for remote control
US20090284476A1 (en) * 2008-05-13 2009-11-19 Apple Inc. Pushing a user interface to a remote device
US20100020033A1 (en) * 2008-07-23 2010-01-28 Obinna Ihenacho Alozie Nwosu System, method and computer program product for a virtual keyboard
JP2010130445A (en) 2008-11-28 2010-06-10 Sharp Corp Viewing system, display device, remote controller device, and viewing method
JP5380147B2 (en) 2009-04-30 2014-01-08 オリンパスイメージング株式会社 Imaging device, display device, and imaging system
US20110029915A1 (en) * 2009-08-02 2011-02-03 Harris Technology, Llc Layered desktop system
EP2553561A4 (en) * 2010-04-01 2016-03-30 Citrix Systems Inc Interacting with remote applications displayed within a virtual desktop of a tablet computing device
US20120060095A1 (en) * 2010-09-03 2012-03-08 Rovi Technologies Corporation Systems and methods for generating personalized media content embedding images of multiple users
US8958018B2 (en) * 2010-12-22 2015-02-17 Google Technology Holdings LLC Remote control device and method for controlling operation of a media display system

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060284785A1 (en) * 2003-09-10 2006-12-21 Bitterlich Jeans Y Generation of an object-processing platform between two computers by joining screens
US20080115073A1 (en) 2005-05-26 2008-05-15 ERICKSON Shawn Method and Apparatus for Remote Display of Drawn Content

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2016506586A (en) * 2013-01-10 2016-03-03 フォックス スポーツ プロダクションズ,インコーポレイティッド System, method and interface for viewer interaction for 3D display of vehicles

Also Published As

Publication number Publication date
CN103460177B (en) 2016-08-17
AU2012243007A1 (en) 2013-10-10
KR20140000328A (en) 2014-01-02
JP2014513831A (en) 2014-06-05
AU2012243007B2 (en) 2015-11-12
EP2678771B1 (en) 2019-11-13
CN103460177A (en) 2013-12-18
JP5902288B2 (en) 2016-04-13
US20120262379A1 (en) 2012-10-18
KR101629072B1 (en) 2016-06-09
US9152373B2 (en) 2015-10-06
EP2678771A1 (en) 2014-01-01

Similar Documents

Publication Publication Date Title
AU2012243007B2 (en) Gesture visualization and sharing between electronic devices and remote displays
US9727301B2 (en) Gesture-based prioritization of graphical output on remote displays
US20130141471A1 (en) Obscuring graphical output on remote displays
KR101401216B1 (en) Mirroring graphics content to an external display
US20200007602A1 (en) Remote desktop video streaming alpha-channel
US11403121B2 (en) Streaming per-pixel transparency information using transparency-agnostic video codecs
US10089057B2 (en) Display redistribution between a primary display and a secondary display
US8786634B2 (en) Adaptive use of wireless display
US10056059B2 (en) Resolution-independent virtual display
US20160210769A1 (en) System and method for a multi-device display unit
GB2484736A (en) Connecting a display device via USB interface
US20170229102A1 (en) Techniques for descriptor overlay superimposed on an asset
US9774821B2 (en) Display apparatus and control method thereof
US20160005379A1 (en) Image Generation
CN112866784A (en) Large-screen local playback control method, control system, equipment and storage medium
US20110102442A1 (en) Recording Contents of Display Screens
JP6395971B1 (en) Modification of graphical command token
US9317891B2 (en) Systems and methods for hardware-accelerated key color extraction
US11817036B2 (en) Display apparatus and control method thereof
JP2012008585A (en) Display control device and program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
    Ref document number: 12714517
    Country of ref document: EP
    Kind code of ref document: A1
WWE Wipo information: entry into national phase
    Ref document number: 2012714517
    Country of ref document: EP
ENP Entry into the national phase
    Ref document number: 2014505219
    Country of ref document: JP
    Kind code of ref document: A
    Ref document number: 20137025984
    Country of ref document: KR
    Kind code of ref document: A
ENP Entry into the national phase
    Ref document number: 2012243007
    Country of ref document: AU
    Date of ref document: 20120410
    Kind code of ref document: A
NENP Non-entry into the national phase
    Ref country code: DE