WO2017193328A1 - Human interface device and automatic calibration for back-controlling source device during remote screen casting session


Info

Publication number
WO2017193328A1
Authority
WO
WIPO (PCT)
Prior art keywords
source device
screen
local
remote screen
message
Application number
PCT/CN2016/081844
Other languages
French (fr)
Inventor
Huajie WU
Kuichu NI
Wenbin FU
Jiang Li
Original Assignee
Qualcomm Incorporated
Application filed by Qualcomm Incorporated
Priority to PCT/CN2016/081844
Publication of WO2017193328A1

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/14: Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00: Arrangements for program control, e.g. control units
    • G06F 9/06: Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44: Arrangements for executing specific programs
    • G06F 9/451: Execution arrangements for user interfaces
    • G06F 9/452: Remote windowing, e.g. X-Window System, desktop virtualisation
    • G: PHYSICS
    • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G: ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G 3/00: Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
    • G09G 3/20: Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters
    • G: PHYSICS
    • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G: ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G 5/00: Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G: PHYSICS
    • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G: ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G 2320/00: Control of display operating conditions
    • G09G 2320/06: Adjustment of display parameters
    • G09G 2320/0693: Calibration of display systems
    • G: PHYSICS
    • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G: ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G 2340/00: Aspects of display data processing
    • G09G 2340/04: Changes in size, position or resolution of an image
    • G09G 2340/0442: Handling or displaying different aspect ratios, or changing the aspect ratio
    • G: PHYSICS
    • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G: ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G 2370/00: Aspects of data communication
    • G09G 2370/12: Use of DVI or HDMI protocol in interfaces along the display data pipeline

Definitions

  • the various aspects and embodiments described herein relate to back-controlling a source device through one or more user inputs applied at a sink device, and further to automatically calibrating displays at the source device and the sink device and rates at which the user inputs are sampled at the sink device and sent to the source device to improve overall user experience during a media share session.
  • handheld computing devices are often linked to an on-board hands-free device installed within the vehicle.
  • the on-board hands-free device may be integrated into the vehicle hardware, or the on-board hands-free device may be an external device (e.g., a device that can be mounted to the vehicle dashboard) .
  • smartphones can cast a screen associated therewith into the vehicle kit via a suitable communication protocol, such as High Definition Multimedia Interface (HDMI) , Mobile High-Definition Link (MHL) , Miracast, and/or others.
  • HDMI and MHL do not enable the vehicle kit to back-control the smartphone.
  • Miracast defines a user input back channel (UIBC) to provide back-control.
  • the vehicle kit may use an input device such as a touchscreen or a Bluetooth Human Interface Device (HID) (e.g., a mouse, trackball, touchpad, etc.) to back-control the smartphone.
  • this traditional method requires the user to watch the smartphone screen while controlling the smartphone, which is impractical and dangerous for a driver and thus largely defeats the point of linking the smartphone to the on-board hands-free device.
  • An additional problem with accurately back-controlling the smartphone using either a touch screen or a Bluetooth HID is that the remote screen cast from the smartphone to the vehicle kit needs to be calibrated.
  • the sink device may implement a Bluetooth Human Interface Device (HID) (e.g., an HID mouse with consumer control) to locally back-control the source device, wherein the sink device may further implement a Bluetooth Serial Port Profile (SPP) application to aid the Bluetooth HID in back-controlling the source device.
  • the user (e.g., a driver, where the sink device is an integrated vehicle kit displaying a screen cast from a smartphone corresponding to the source device) can instead simply view the screen displayed on the sink device and use the Bluetooth HID implemented on the sink device to back-control the source device.
  • the Bluetooth HID and the SPP application can be implemented to back-control the source device via user inputs applied at the sink device, with the sink device implementing a Bluetooth HID role and a SPP initiator role and the source device implementing a Bluetooth HID host role and a SPP acceptor role, which can be especially advantageous in vehicular platforms.
  • the screen calibration issues mentioned above may be addressed through an automatic calibration algorithm that can automatically calibrate the remote screen cast from the source device in both landscape and portrait orientations. After the calibration has been performed once, the calibration data can be stored and subsequently retrieved anytime that the same source device is linked to the sink device.
  • the automatic calibration algorithm may allow any source device to be connected to the sink device without the user needing to manually (and unnecessarily) re-calibrate the source device when subsequently connected to the sink device.
  • the automatic calibration algorithm may be more accurate than any manual user-performed calibration.
  • a remote screen calibration algorithm can be used to automatically determine an output rectangle when the screen at the source device is cast into the local screen at the sink device in landscape and portrait orientations.
  • an application at the sink device may transmit a screen calibration message to the source device, wherein the screen calibration message may include a request to show a calibration screen at the source device in either a landscape or portrait orientation.
  • the calibration screen may comprise a full-white screen or another suitable screen configured to display a unique color such that valid and invalid regions can be suitably differentiated.
  • the source device may then display the calibration screen and information indicating a resolution associated with the screen at the source device (e.g., a height and width, which may be expressed in pixels) .
  • the sink device may then calculate the output rectangle in the local screen according to the calibration screen displayed at the source device and cast to the sink device, wherein the output rectangle may comprise a valid region limited according to four (4) vertices.
  • the sink device may subsequently convert coordinates in the local screen to coordinates in the remote screen at the source device according to the resolution associated with the remote screen and the coordinates associated with the vertices in the output rectangle.
  • the above-mentioned calibration may then be repeated in the other orientation (e.g., landscape or portrait, which can be calibrated in any suitable order) .
  • the results from the remote screen calibration algorithm may then be stored in the sink device to be used in future screen sharing sessions between the source device and the sink device. Therefore, once the remote screen at the source device has been calibrated for use at the sink device, there may be no need to perform any further calibration with respect to that source device. Furthermore, the source device and the sink device can carry out the above-described remote screen calibration algorithm substantially autonomously without requiring any user input.
  • the user inputs applied at the sink device can be sampled and reported back to the source device in a manner that depends on a pointer velocity at the source device, among various other parameters.
  • the time to send a packet from the sink device to the source device to report one or more user inputs applied at the sink device may be considered in combination with the user input sample rate employed in the sink device to prevent the user from perceiving the latency when using the sink device to back-control the source device.
  • the pointer velocity at the source device and the low velocity threshold associated with the pointer at the source device may be considered to avoid the appearance of a floating pointer at the source device (e.g., when user inputs are quickly applied) .
  • the sink device may transmit a packet to report a user input to the source device in response to determining that the pointer velocity at the source device is less than the low velocity threshold at the source device, wherein the pointer velocity at the source device may depend on a distance between local points at which successive user inputs are captured at the sink device and a time to send the packet to the source device.
  • the sink device may send the user input report to the source device upon detecting a touch down event, upon detecting a touch move event that satisfies certain sampling criteria, and/or upon detecting a touch up event. Furthermore, in the case where the sink device sends the user input report upon detecting the touch up event, the sink device may further send a user input report associated with the last touch move event preceding the touch up event to ensure that the touch up event is processed according to the correct coordinates relative to the most recently sampled user input.
  • a source device may therefore cast a remote screen associated therewith to a sink device (e.g., via a wired or wireless connection) , wherein the remote screen may be shown on a local screen at the sink device.
  • the sink device may support a Human Interface Device (HID) to back-control the source device and a helper application configured to wirelessly communicate with the source device to retrieve at least a resolution and a current orientation associated with the remote screen, to aid a process to automatically calibrate the remote screen, and to obtain other relevant parameters associated with the source device and/or the remote screen at the source device (e.g., pointer movement units, pointer velocity control parameters, etc. ) .
  • coordinates associated with the local touch messages in the local screen may be converted to coordinates in the remote screen based on the current orientation and the resolution associated with the calibrated remote screen.
  • HID input reports may be periodically transmitted from the sink device to the source device based on various pointer velocity and latency parameters (e.g., to avoid triggering pointer acceleration that may result in a floating pointer at the source device and to avoid conditions that may result in user-perceptible latency and/or recognition difficulties with respect to rendering local touch inputs applied at the sink device in the remote screen at the source device) .
  • a method for back-controlling a source device may comprise determining, at a sink device, at least a current orientation and a resolution associated with a remote screen casted from the source device, wherein the remote screen casted from the source device is displayed on a local screen at the sink device, sending, to the source device, a message to request that the source device display a calibration screen configured to fill the remote screen with a unique color, automatically calculating, at the sink device, an output rectangle associated with the current orientation at the remote screen casted from the source device, wherein the output rectangle comprises a region in the local screen filled with the unique color, capturing a local touch message at the sink device, the local touch message associated with first coordinates in the local screen at the sink device, and converting the first coordinates in the local screen to second coordinates in the remote screen to back-control the source device via the local touch message captured at the sink device, wherein the first coordinates are converted to the second coordinates based on the resolution associated with the remote screen and the calculated output rectangle associated with the current orientation at the remote screen.
  • an apparatus may comprise means for determining at least a current orientation and a resolution associated with a remote screen casted from a source device, wherein the remote screen casted from the source device is displayed on a local screen at the apparatus, means for sending, to the source device, a message to request that the source device display a calibration screen configured to fill the remote screen with a unique color, means for automatically calculating an output rectangle associated with the current orientation at the remote screen casted from the source device, wherein the output rectangle comprises a region in the local screen filled with the unique color, means for capturing a local touch message, the local touch message associated with first coordinates in the local screen at the apparatus, and means for converting the first coordinates in the local screen to second coordinates in the remote screen based on the resolution associated with the remote screen and the calculated output rectangle associated with the current orientation at the remote screen.
  • a method for back-controlling a source device may additionally (or alternatively) comprise implementing, at a sink device, a human interface device (HID) to back-control a source device casting a remote screen into a local screen at the sink device, determining, at the sink device, at least a low threshold velocity above which the source device accelerates a pointer in the remote screen, sampling, at the sink device, a local touch message applied via the HID to the local screen according to a native sample rate, and transmitting, to the source device, a message to report the local touch message at the native sample rate in response to the local touch message comprising a press, a release, or a move that satisfies a predetermined condition that depends, at least in part, on the low threshold velocity above which the source device accelerates the pointer in the remote screen.
  • a sink device configured to back-control a source device may additionally (or alternatively) comprise a local screen configured to display a remote screen casted from the source device, a transceiver configured to communicate with the source device to determine at least a low threshold velocity above which the source device accelerates a pointer in the remote screen, and one or more processors configured to implement a human interface device (HID) to back-control the source device, sample a local touch message applied via the HID to the local screen according to a native sample rate, and cause the transceiver to transmit a message to report the local touch message to the source device at the native sample rate in response to the local touch message comprising a press, a release, or a move that satisfies a predetermined condition that depends, at least in part, on the low threshold velocity above which the source device accelerates the pointer in the remote screen.
  • an apparatus may comprise means for implementing a human interface device (HID) to back-control a source device casting a remote screen into a local screen at the apparatus, means for determining at least a low threshold velocity above which the source device accelerates a pointer in the remote screen, means for sampling a local touch message applied via the HID to the local screen according to a native sample rate, and means for transmitting, to the source device, a message to report the local touch message at the native sample rate in response to the local touch message comprising a press, a release, or a move that satisfies a predetermined condition that depends, at least in part, on the low threshold velocity above which the source device accelerates the pointer in the remote screen.
  • FIG. 1 illustrates an exemplary system in which a source device may cast a screen to a sink device and user inputs can be applied at the sink device to back-control the source device, according to various aspects.
  • FIG. 2 illustrates an exemplary high-level system architecture associated with a sink device that may display a screen casted from a source device and back-control the source device via user inputs applied at the sink device, according to various aspects.
  • FIG. 3 illustrates relationships between an Open Systems Interconnect (OSI) seven-layer model and a Bluetooth protocol stack that can be used to back-control a source device via user inputs applied at a sink device, according to various aspects.
  • FIG. 4 illustrates an exemplary method that can be implemented at a sink device to automatically calibrate a screen at a source device, according to various aspects.
  • FIGS. 5A-5B illustrate exemplary calibration screens that can be used to automatically calibrate a screen at a source device in a landscape orientation according to the method shown in FIG. 4, according to various aspects.
  • FIGS. 6A-6B illustrate exemplary calibration screens that can be used to automatically calibrate a screen at a source device in a portrait orientation according to the method shown in FIG. 4, according to various aspects.
  • FIG. 7 illustrates an exemplary method and timing diagram that can be used to capture and transmit user inputs applied at a sink device to a source device to thereby back-control the source device, according to various aspects.
  • FIG. 8 illustrates an exemplary call flow that can be implemented at a source device and a sink device to back-control the source device via user inputs applied at the sink device and to automatically calibrate a screen that the source device casts to the sink device, according to various aspects.
  • FIG. 9 illustrates an exemplary device that can implement the various aspects and embodiments described herein.
  • FIG. 10 illustrates an exemplary architecture that can be implemented to enable a source device to cast a screen to a sink device and to back-control the source device via user inputs applied at the sink device, according to various aspects.
  • aspects and/or embodiments may be described in terms of sequences of actions to be performed by, for example, elements of a computing device.
  • Those skilled in the art will recognize that various actions described herein can be performed by specific circuits (e.g., an application specific integrated circuit (ASIC)) , by program instructions being executed by one or more processors, or by a combination of both.
  • these sequences of actions described herein can be considered to be embodied entirely within any form of non-transitory computer-readable medium having stored thereon a corresponding set of computer instructions that upon execution would cause an associated processor to perform the functionality described herein.
  • the various aspects described herein may be embodied in a number of different forms, all of which have been contemplated to be within the scope of the claimed subject matter.
  • the corresponding form of any such aspects may be described herein as, for example, “logic configured to” and/or other structural components configured to perform the described action.
  • a source device and a sink device may establish a communication link to enable the source device to cast a screen associated therewith to the sink device such that the screen at the source device is further displayed at the sink device.
  • the source device and the sink device may implement one or more communication technologies to enable the remote screen casting from the source device to the sink device and to back-control the source device via one or more user inputs applied at the sink device.
  • the sink device may implement a Bluetooth Human Interface Device (HID) (e.g., an HID mouse with consumer control) to locally back-control the source device in addition to a Bluetooth Serial Port Profile (SPP) application to aid the Bluetooth HID in back-controlling the source device.
  • the user can instead simply view the screen displayed on the sink device and use the Bluetooth HID implemented on the sink device to back-control the source device.
  • the Bluetooth HID and the SPP application can be implemented to back-control the source device via user inputs applied at the sink device, with the sink device implementing a Bluetooth HID role and a SPP initiator role, while the source device may implement a Bluetooth HID host role and a SPP acceptor role.
  • the sink device may be configured to automatically calibrate the screen at the source device in both landscape and portrait orientations such that after the calibration has been performed once, results from the calibration can be stored in a suitable memory (e.g., a local memory, a memory accessible via a network, etc. ) and the sink device may subsequently retrieve the calibration results anytime that the same source device is linked to the sink device.
  • the automatic calibration algorithm may allow any source device to be connected to the sink device without the user needing to manually (and unnecessarily) re-calibrate the source device when subsequently connected to the sink device, while providing more accurate results than any manual user-performed calibration.
  • a remote screen calibration algorithm can be used to automatically determine an output rectangle associated with the screen casted from the source device to the sink device in landscape and portrait orientations. More particularly, an application at the sink device may transmit a screen calibration message to the source device, wherein the screen calibration message may include a request to show a calibration screen at the source device in either a landscape orientation or a portrait orientation.
  • the calibration screen may comprise a full-white screen or another suitable screen configured to display a unique color such that valid and invalid regions can be suitably differentiated.
  • the source device may then display the calibration screen and information indicating a resolution associated with the screen at the source device (e.g., a height and width, which may be expressed in pixels) .
  • the sink device may then calculate the output rectangle in the local screen according to the calibration screen displayed at the source device and cast to the sink device, wherein the output rectangle may comprise a valid region limited according to four (4) vertices.
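The output rectangle computation described above can be sketched as follows. This is a minimal illustration under stated assumptions, not the patented implementation: it assumes the cast calibration frame is available on the sink device as rows of (R, G, B) pixel tuples and that the calibration screen uses pure white as the unique color; the function name is hypothetical.

```python
def find_output_rectangle(frame, match=(255, 255, 255)):
    """Locate the valid (calibration-colored) region in the local screen.

    frame: list of rows, each row a list of (r, g, b) pixel tuples,
    representing the calibration frame as rendered on the sink device.
    Returns the output rectangle as (left, top, right, bottom) vertices,
    or None if no pixel matches the calibration color.
    """
    left = top = right = bottom = None
    for y, row in enumerate(frame):
        for x, pixel in enumerate(row):
            if pixel == match:
                left = x if left is None else min(left, x)
                right = x if right is None else max(right, x)
                top = y if top is None else min(top, y)
                bottom = y if bottom is None else max(bottom, y)
    if left is None:
        return None
    return (left, top, right, bottom)
```

Scanning for the extreme matching coordinates yields the four vertices of the valid region; any letterboxed (non-matching) border is excluded automatically.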
  • the sink device may subsequently convert coordinates in the local screen to coordinates in the remote screen at the source device according to the resolution associated with the remote screen and the coordinates associated with the vertices in the output rectangle.
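The coordinate conversion step can be illustrated with a short sketch; the function name and the (left, top, right, bottom) rectangle representation are assumptions for illustration, not part of the disclosure:

```python
def local_to_remote(x_local, y_local, rect, remote_w, remote_h):
    """Map a touch point on the local screen to remote-screen coordinates.

    rect: (left, top, right, bottom) vertices of the calibrated output
    rectangle in the local screen.
    remote_w, remote_h: resolution of the remote screen in pixels.
    Returns None when the touch falls outside the valid region.
    """
    left, top, right, bottom = rect
    if not (left <= x_local <= right and top <= y_local <= bottom):
        return None  # touch landed outside the calibrated output rectangle
    x_remote = (x_local - left) * remote_w / (right - left)
    y_remote = (y_local - top) * remote_h / (bottom - top)
    return (int(x_remote), int(y_remote))
```

For example, with an output rectangle of (100, 0, 900, 600) on the local screen and a 1920x1080 remote screen, a local touch at (500, 300) maps to the center of the remote screen.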
  • the above-mentioned calibration may then be repeated in the other orientation (e.g., landscape or portrait, which can be calibrated in any suitable order) .
  • the results from the remote screen calibration algorithm may then be stored in the sink device to be used in future screen sharing sessions between the source device and the sink device. Therefore, once the remote screen at the source device has been calibrated for use at the sink device, there may be no need to perform any further calibration with respect to that source device. Furthermore, the source device and the sink device can carry out the above-described remote screen calibration algorithm substantially autonomously without requiring any user input.
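Persisting and retrieving per-device calibration results might look like the following sketch. The JSON file format and the use of a device identifier (e.g., the source device's Bluetooth address) as the cache key are illustrative assumptions only:

```python
import json
import os

def save_calibration(path, device_id, results):
    """Persist calibration results keyed by source-device identifier so
    they can be reused the next time the same source device connects."""
    cache = {}
    if os.path.exists(path):
        with open(path) as f:
            cache = json.load(f)
    cache[device_id] = results
    with open(path, "w") as f:
        json.dump(cache, f)

def load_calibration(path, device_id):
    """Return stored calibration results for a device, or None if this
    source device has not yet been calibrated against this sink."""
    if not os.path.exists(path):
        return None
    with open(path) as f:
        return json.load(f).get(device_id)
```

On each new screen-sharing session, the sink would first try `load_calibration`; only a cache miss triggers the automatic calibration algorithm again.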
  • the sink device may further capture the user inputs that are applied at the sink device and transmit messages to report the captured user inputs to the source device according to a latency associated with a wireless link between the source device and the sink device and various other parameters relating to a velocity associated with the captured user inputs (e.g., a pointer velocity at the source device, a distance that a pointer has been moved between successive inputs, etc. ) .
  • a time to send a packet from the sink device to the source device in order to report one or more user inputs applied at the sink device may be considered in combination with the user input sample rate employed in the sink device to prevent the user from perceiving the latency when using the sink device to back-control the source device.
  • the pointer velocity at the source device and the low velocity threshold associated with the pointer at the source device may be considered to avoid the appearance of a floating pointer at the source device (e.g., when user inputs are quickly applied) .
  • the sink device may transmit a packet to report a user input to the source device in response to determining that the pointer velocity at the source device is less than the low velocity threshold at the source device, wherein the pointer velocity at the source device may depend on a distance between local points at which successive user inputs are captured at the sink device and a time to send the packet to the source device.
  • the sink device may send the user input report to the source device upon detecting a touch down event, upon detecting a touch move event that satisfies certain sampling criteria, and/or upon detecting a touch up event. Furthermore, in the case where the sink device sends the user input report upon detecting the touch up event, the sink device may further send a user input report associated with the last touch move event preceding the touch up event to ensure that the touch up event is processed according to the correct coordinates relative to the most recently sampled user input.
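The reporting rules above can be sketched as a small state machine. The class, the fixed packet send time, and the velocity handling below are illustrative assumptions rather than the claimed implementation; the key ideas are that a move is reported only when the resulting pointer velocity at the source stays below the low velocity threshold (avoiding pointer acceleration and a "floating" pointer), and that a touch up flushes the last unreported move first so the release is processed at the correct coordinates:

```python
import math

def should_report_move(prev_point, cur_point, send_time_s, low_velocity_threshold):
    """Decide whether a sampled touch move should be reported to the source.

    prev_point, cur_point: local-screen coordinates of the last reported
    sample and the new sample.
    send_time_s: measured time to deliver one input-report packet.
    """
    dx = cur_point[0] - prev_point[0]
    dy = cur_point[1] - prev_point[1]
    velocity = math.hypot(dx, dy) / send_time_s  # pixels per second
    return velocity < low_velocity_threshold

class TouchReporter:
    """Always report press and release; hold back fast moves, and flush
    the last unreported move before a release."""
    def __init__(self, send, send_time_s, low_velocity_threshold):
        self.send = send                # callback that transmits a report
        self.send_time_s = send_time_s
        self.low_velocity_threshold = low_velocity_threshold
        self.last_reported = None       # coordinates of the last sent sample
        self.pending_move = None        # last move held back from reporting

    def on_event(self, kind, point):
        if kind == "down":
            self.send(kind, point)
            self.last_reported = point
        elif kind == "move":
            if self.last_reported is None or should_report_move(
                    self.last_reported, point,
                    self.send_time_s, self.low_velocity_threshold):
                self.send(kind, point)
                self.last_reported = point
                self.pending_move = None
            else:
                self.pending_move = point  # too fast: hold this sample back
        elif kind == "up":
            if self.pending_move is not None:
                self.send("move", self.pending_move)  # flush the last move
            self.send(kind, point)
            self.last_reported = None
            self.pending_move = None
```

In this sketch a slow drag is reported sample by sample, while a fast flick is suppressed until the finger lifts, at which point the final position is reported immediately before the release.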
  • a source device 120 may cast a screen to a sink device 160 and user inputs can be applied at the sink device 160 to back-control the source device 120.
  • the source device 120 and the sink device 160 may communicate with one another via one or more communication channels 150, wherein the one or more communication channels 150 may be used to cast the screen associated with the source device 120 to the sink device 160, to back-control the source device 120 via the user inputs applied at the sink device 160, to automatically calibrate the screen casted from the source device 120 to the sink device 160, and to determine various parameters that may be used to control when messages to report the user inputs applied at the sink device 160 are transmitted to the source device 120.
  • the source device 120 may include a memory 122, a display 124, a speaker 126, an audio and/or a video (A/V) encoder 128, an audio and/or a video (A/V) control module 130, and a transceiver 132 (e.g., a transmitter/receiver (TX/RX) unit) .
  • the sink device 160 may include a transceiver 162 (e.g., a transmitter/receiver (TX/RX) unit) , an audio and/or a video (A/V) decoder 164, a display 166, a speaker 168, one or more input devices 170, and a user input processing module (UIPM) 172.
  • the illustrated components shown in FIG. 1 represent merely one example configuration associated with the system 100, whereby other configurations may include fewer and/or more components than illustrated in FIG. 1.
  • the source device 120 can display a visual portion associated with multimedia data on the local display 124 and can output an audio portion associated with the multimedia data using the speaker 126.
  • the multimedia data may be stored locally on the memory 122, accessed from an external storage medium such as a file server, hard drive, external memory, Blu-ray disc, DVD, or other physical storage medium, or may be streamed to the source device 120 via a network connection such as via the Internet.
  • the source device 120 may include a camera and/or a microphone (not explicitly shown in FIG. 1) that can capture multimedia data in real-time.
  • Multimedia data may include content such as movies, television shows, music, or the like, as well as real-time content generated at the source device 120.
  • one or more applications running on the source device 120 may produce and/or capture the real-time multimedia content generated at the source device 120 (e.g., video data captured during a video telephony session) .
  • the real-time content may include a video frame that includes one or more user input options available to select, video frames that combine multiple different content types (e.g., a video frame from a movie or television program that has user input options overlaid on the video frame) , and/or other suitable configurations.
  • the A/V encoder 128 at the source device 120 can encode the multimedia data and the transceiver 132 can be used to transmit the encoded data over the communication channel 150 to the sink device 160.
  • the transceiver 162 associated with the sink device 160 may therefore receive the encoded data and the A/V decoder 164 associated with the sink device 160 may decode the encoded data and output the decoded data on the display 166 and/or the speaker 168.
  • the audio and/or video data rendered via the display 124 and/or the speaker 126 at the source device 120 can be simultaneously rendered via the display 166 and/or the speaker 168 at the sink device 160.
  • the audio and/or video data may only be rendered via the display 166 and/or the speaker 168 at the sink device 160.
  • the audio and/or video data may be arranged in one or more frames, and any audio frames may be time-synchronized with the video frames when rendered.
  • the communication channel 150 used to transmit the multimedia data from the source device 120 to the sink device 160 may comprise a wired link, such as a wired High-Definition Multimedia Interface (HDMI) link that can be used to transfer uncompressed video data and compressed or uncompressed digital audio from the source device 120 to the sink device 160, a wired link that implements the Mobile High-Definition Link (MHL) standard that allows consumers to connect mobile phones, tablets, and/or other portable consumer electronics to HD-capable sink devices (e.g., via a five-pin or an eleven-pin Micro-USB-to-HDMI adapter, an MHL passive cable, a USB Type-C connector, a reversible superMHL connector, etc. ) .
  • the communication channel 150 used to transmit the multimedia data from the source device 120 to the sink device 160 may comprise a wireless link configured in accordance with the Wi-Fi Display (also known as Miracast) technical specification, which defines a wireless peer-to-peer protocol that can be used to connect the source device 120 to the sink device 160 and thereby transfer high-definition video and surround sound audio over a Wi-Fi Direct connection such that the source device 120 and the sink device 160 can communicate directly without using an intermediary device (e.g., a wireless access point) .
  • the source device 120 and the sink device 160 may also establish a tunneled direct link setup (TDLS) to avoid or reduce network congestion.
  • Wi-Fi Direct and TDLS may be intended to establish relatively short-distance communication sessions, which in the present context may refer to a range less than approximately seventy (70) meters, although in a noisy or obstructed environment the distance between devices may be even shorter, such as approximately thirty-five (35) meters or less or generally within a vehicle interior.
  • the A/V encoder 128 and the A/V decoder 164 may implement one or more audio and video compression standards, such as the ITU-T H.264 standard, alternatively referred to as MPEG-4, Part 10, Advanced Video Coding (AVC) , the high efficiency video coding (HEVC) standard, or the Display Stream Compression (DSC) standard. Many other proprietary and/or standardized compression techniques may also be used.
  • the A/V decoder 164 may be configured to perform reciprocal coding operations with respect to the A/V encoder 128. Furthermore, although not explicitly shown in FIG. 1, the A/V encoder 128 and the A/V decoder 164 may each be integrated with an audio encoder and decoder, and may include appropriate MUX-DEMUX units, or other hardware and software, to encode both audio and video in a common data stream or separate data streams.
  • multimedia data may be stored on or received at source device 120 in an encoded form and/or transferred over the communication channel 150 in an uncompressed form (e.g., when using a wired HDMI/MHL link) , wherein the multimedia data may not require further compression at the A/V encoder 128 in such use cases.
  • the communication channel 150 may carry audio payload data and video payload data separately or alternatively carry audio and video payload data in a common data stream.
  • the MUX-DEMUX units mentioned above may conform to the ITU H.223 multiplexer protocol, or other protocols such as the user datagram protocol (UDP) .
  • the A/V encoder 128 and the A/V decoder 164 each may be implemented as one or more microprocessors, digital signal processors (DSPs) , application specific integrated circuits (ASICs) , field programmable gate arrays (FPGAs) , discrete logic, software, hardware, firmware or any combinations thereof.
  • the A/V encoder 128 and the A/V decoder 164 may each be included in one or more encoders or decoders that are integrated in a combined encoder/decoder (CODEC) . Accordingly, the source device 120 and the sink device 160 may each comprise specialized machines configured to implement the various aspects described herein.
  • the system 100 illustrated in FIG. 1 may be implemented in a vehicular context, whereby the sink device 160 may represent a vehicle head unit (or integrated car kit) that can integrate one or more user interface devices with the source device 120.
  • the user interface devices may include one or more input devices 170 configured to receive input from a user through tactile, audio, or video feedback, wherein example input devices 170 may include a presence-sensitive and/or a touch-sensitive display, a mouse, a keyboard, a voice responsive system, a video camera, a microphone, steering wheel button (s) or knob (s) or other controls in the vehicle that can be pushed or rotated (e.g., to increase or decrease volume) , and/or any other input device 170 that can capture a command from a user.
  • any references herein to a “user” associated with the sink device 160 and/or the source device 120 may include a driver or passenger in an automobile that includes the sink device 160.
  • the user interface devices may also include one or more output devices configured to provide output to a user using tactile, audio, and/or video stimuli, wherein example output devices may include the speaker 168 and the display 166, wherein the latter may be a presence-sensitive display, a touch-sensitive display, a liquid crystal display (LCD) , etc.
  • the output devices may include any suitable device that can convert a signal into an appropriate form that humans or machines can understand.
  • the communication channel 150 may include one or more Bluetooth connections that can be used to back-control the source device 120 via user inputs applied through the one or more input devices 170 at the sink device 160.
  • the source device 120 and the sink device 160 can use the communication channel 150 to enable the source device 120 to cast a screen associated therewith to the sink device 160 (e.g., via a wired HDMI/MHL link, a wireless Wi-Fi Display link, etc.
  • the sink device 160 may include a user input processing module (UIPM) 172 that can capture the user inputs received at the one or more input devices 170 and report the captured user inputs to the source device 120 via the communication channel 150.
  • the user may control multimedia data transmitted from the source device 120 to the sink device 160, launch and control applications on the source device 120, and/or otherwise perform commands that may be processed at the source device 120 remotely and without directly interacting with the source device 120 while having an experience that looks and feels as though the user is directly and locally interacting with the source device 120.
  • FIG. 2 illustrates an exemplary high-level system architecture 200 associated with a sink device that may display a screen casted from a source device and back-control the source device via user inputs applied at the sink device.
  • the sink device may comprise a back-control application 220, which may generally execute in a user space associated with a particular operating system (e.g., WinCE, Linux, etc. ) .
  • the back-control application 220 may support a Human Interface Device (HID) 221 to back-control the source device, wherein the HID 221 may be configured to capture the user inputs applied at the sink device.
  • a HID mouse may comprise a handheld, button-activated input device with two axes and one, two, or three buttons, wherein the HID mouse may direct an indicator to move correspondingly about a visual display screen when rolled along a flat surface, thereby allowing a user to move the indicator freely in select operations or to manipulate text, graphics, and/or other information output on the visual display screen.
  • a HID touch-screen may refer to a digitizer having an integrated display that allows a finger or stylus to be used for pointing, where some touch-screen technologies can differentiate between a finger touch and a stylus touch.
  • the HID touch-screen supports an absolute x-axis and an absolute y-axis rather than relative x-y axes.
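The absolute-axis behavior can be illustrated with a hypothetical touch-screen input report, in contrast to the relative dx/dy of a mouse report. The report ID, field widths, and 12-bit logical maxima below are assumptions for illustration; a real device declares all of these in its HID report descriptor.

```python
import struct

def touch_report(tip_down, x, y, max_x=4095, max_y=4095):
    """Hypothetical absolute touch-screen input report: a tip-switch byte
    plus 16-bit little-endian absolute x/y. Report ID 3 and the 0..4095
    logical range are illustrative assumptions, not values from the source."""
    assert 0 <= x <= max_x and 0 <= y <= max_y, "coordinates must be pre-scaled"
    return struct.pack("<BBHH", 3, 1 if tip_down else 0, x, y)
```

Because the axes are absolute, the host interprets (x, y) as a position on the digitizer rather than a displacement, which is why local coordinates must first be scaled into the declared logical range.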
  • a HID-compliant consumer control device may refer to a general consumer control device.
  • many smartphones include additional virtual keys that are not displayed on a screen (e.g., a “Home” button, a “Lock” button, volume control buttons, etc. ) .
  • the HID consumer control device can be configured to provide such virtual keys.
  • the back-control application 220 may support a Serial Port Profile (SPP) application 223 to retrieve information associated with a remote screen at the source device (e.g., a resolution and orientation associated with the remote screen casted from the source device to the sink device) .
  • the SPP application 223 may establish a virtual serial port that can enable communication between the sink device and the source device over an emulated serial cable and thereby allow the back-control application 220 to request and receive the information associated with the remote screen at the source device over the emulated serial port connection.
  • the remote screen information requested and received via the SPP application 223 may thereby allow the HID 221 to back-control the source device, as the HID 221 would be unable to suitably back-control the source device without knowing the current orientation and valid/invalid points on the remote screen at the source device.
  • the sink device may further comprise a media casting application 225 that also runs in the user space, wherein the media casting application 225 may cast the screen associated with the source device into a local system on chip (SoC) or other suitable platform at the sink device.
  • the source device may cast the screen associated therewith to the sink device via HDMI, MHL, and/or another suitable wired technology, whereby the sink device may include an HDMI chip 254 configured to receive data relating to the screen casted from the source device over a wired connection, wherein the data relating to the remote screen at the source device may then be provided to an HDMI driver 234 implemented in a kernel space.
  • the remote screen data may then be provided to an HDMI/MHL application 227 implemented in the media casting application 225 such that the remote screen at the source device may be rendered at the sink device.
  • the source device may cast the screen associated therewith to the sink device via a wireless technology such as Wi-Fi Display or Miracast 229, whereby the sink device may include a Wi-Fi chip 256 to handle wireless communication with the source device and a Wi-Fi stack 236 implemented in a kernel space to transport data between the Wi-Fi chip 256 and the media casting application 225.
  • a protocol stack that can be used to wirelessly cast the remote screen at the source device to the sink device will be described in further detail below with reference to FIG. 10.
  • the back-control application 220 may communicate with the media casting application 225 to exchange information relating to a connection status associated with the HID 221 and a link connecting the source device to the sink device (e.g., HDMI/MHL, Miracast, etc. ) . Furthermore, as appropriate, the back-control application 220 may issue a request to the media casting application 225 in order to calibrate the remote screen at the source device. Accordingly, in response to receiving the request to calibrate the remote screen at the source device, the media casting application 225 may lock a surface associated with the local screen on the sink device so as to calibrate the remote screen at the source device in both landscape and portrait orientations. For example, as will be described in further detail below with reference to FIG. 5 through FIG.
  • the remote screen at the source device may be calibrated with respect to the local screen at the sink device to enable remote back-control via the HID 221, as the back-control application 220 otherwise would not know the actual output rectangle associated with the remote screen casted into the local screen at the sink device and therefore be unable to construct a valid HID input report to back-control the source device.
  • the surface associated with the local screen may be locked to ensure that the back-control application 220 has exclusive permission to access the local screen surface during the remote screen calibration process.
  • the back-control application 220 and the media casting application 225 may communicate via inter-process communication (IPC) within the user space, wherein the particular IPC may be operating system dependent (e.g., Windows Message Queue or Named Event can be used in Microsoft Windows CE) .
  • the back-control application 220 may communicate with a Bluetooth stack 232, which may implement at least the Bluetooth HID profile and the Bluetooth Serial Port Profile (SPP) , and the Bluetooth stack 232 may further communicate with a Bluetooth chip 252 through a Host Controller Interface (HCI) transport, such as BlueCore Serial Protocol.
  • the OSI seven-layer model 310 was established to standardize information transmission between points over the Internet or other wired and/or wireless networks, wherein the OSI model 310 separates communications processes between two points in a network into seven stacked layers, with each layer adding certain functions. Each device handles a message such that a downward flow through each layer occurs at a sending endpoint and an upward flow through the layers occurs at a receiving endpoint.
  • the programming and/or hardware that provides the seven layers in the OSI model 310 is typically a combination of device operating systems, application software, TCP/IP and/or other transport and network protocols, and other software and hardware.
  • the OSI model 310 includes a physical layer 312 (OSI Layer 1) used to convey a bit stream through a network at a physical level.
  • the Institute of Electrical and Electronics Engineers (IEEE) sub-divides the physical layer 312 into the PLCP (Physical Layer Convergence Procedure) sub-layer and the PMD (Physical Medium Dependent) sub-layer.
  • the data link layer 314 (OSI Layer 2) provides physical level synchronization, performs bit-stuffing, and furnishes transmission protocol knowledge and management, etc.
  • the network layer 316 (OSI Layer 3) handles data transfer across a network (e.g., routing and forwarding) in a manner independent from any media and specific network topology
  • the transport layer 318 (OSI Layer 4) manages end-to-end control and error-checking to multiplex data transfer across the network according to application-level reliability requirements
  • the session layer 320 (OSI Layer 5) establishes, coordinates, and terminates conversations, exchanges, and dialogs between the applications to provide management and data flow control services.
  • the presentation layer 322 (OSI Layer 6) converts incoming and outgoing data from one presentation format to another.
  • the presentation layer 322 may add service structure to the data units to provide data to the application layer 324 (OSI Layer 7) according to a common representation, while the application layer 324 is where communication partners are identified, quality of service (QoS) is identified, user authentication and privacy are considered, constraints on data syntax are identified, and any other functions that may be relevant to managing communications between host applications are managed.
  • the radio frequency (RF) layer 332 generally corresponds to the physical layer 312 in the OSI model 310
  • the baseband layer 334 and the link manager protocol layer 336 generally correspond to the data link layer 314, and a host controller interface (HCI) 338 separates the RF layer 332, the baseband layer 334, and the link manager protocol layer 336 from the upper layers.
  • the Physical Layer 312 in the OSI model 310 manages electrical interfaces to communications media, which includes modulation and channel coding, and therefore covers the Bluetooth radio in the RF layer 332 (and possibly part of the baseband layer 334) , while the data link layer 314 manages transmission, framing, and error control over a particular link, which overlaps tasks performed in the link manager protocol layer 336 and the control end of the baseband layer 334 (e.g., error checking and correction) .
  • the presentation layer 322 and the application layer 324 in the OSI seven-layer model 310 correspond to the Bluetooth Profiles layer in the Bluetooth protocol stack 330, wherein the Bluetooth Profiles layer may include the Serial Port Profile (SPP) 356 and any additional applications or profiles 358 (e.g., the Hands-Free Profile (HFP) for voice, the Advanced Audio Distribution Profile (A2DP) for high-quality audio streaming, the Video Distribution Profile (VDP) for video streaming, etc. ) .
  • the applications or profiles 358 may correspond to the presentation layer 322 and the application layer 324 in the OSI model 310. Accordingly, a Bluetooth Profile may generally be considered synonymous with an “application” in the OSI seven-layer model 310.
  • the RFCOMM channel 342 comprises a communication channel named “service level connection” ( “SLC” ) that emulates a serial port used for further communication between an Audio Gateway (AG) device and a Handsfree (HF) device.
  • a separate baseband link called a synchronous connection-oriented (SCO) channel carries the voice data, represented as Audio (SCO) 350 in FIG. 3.
  • for A2DP, the audio data (unidirectional high-quality audio content) goes over AVDTP 348, which in turn goes over L2CAP 340.
  • all L2CAP 340 data flows over a logical link.
  • the sink device may implement at least the Human Interface Device (HID) profile 346 and the SPP 356.
  • the HID profile 346 may use the universal serial bus (USB) definition associated with a HID device in order to leverage existing class drivers and further describe how to use the USB HID protocol to discover a feature set associated with a HID class device and how a Bluetooth-enabled device can support HID services using the L2CAP layer 340.
  • the HID profile 346 is designed to enable initialization and control of self-describing devices as well as provide a low latency link with low power requirements, wherein the HID profile 346 runs natively on the L2CAP layer 340 and does not reuse Bluetooth protocols other than SDP 344 in order to provide a simple implementation.
  • the sink device may therefore implement an adapter associated with a parser used in the HID profile 346 in order to provide utilities to initialize a HID input report relating to local touch messages captured at the sink device (e.g., via a HID mouse, a HID consumer control device, etc. ) .
  • the SPP 356 may be implemented to emulate an RS232 (or similar) serial cable such that one or more applications may use the Bluetooth-implemented SPP 356 as a cable replacement through an operating system dependent virtual serial port abstraction.
  • the Bluetooth stack 232 implemented in the sink device may support a HID profile to back-control the source device.
  • the HID profile implemented in the Bluetooth stack 232 may have capabilities to register a HID 221, accept and establish a connection with the HID 221, and disconnect from the HID 221.
  • the sink device therefore effectively acts as a HID device, sending HID input reports to the source device according to one or more touch messages captured at the sink device via the HID 221.
  • the source device may be configured to initiate a HID connection with the sink device when a Bluetooth connection is established from the source device, provided that the HID Service Discovery Protocol (SDP) record is activated in the sink device.
  • the sink device may reject the HID connection due to native limitations (e.g., the source device may implement the “host” role associated with the Bluetooth HID profile and therefore initially request human data input and output services from the sink device implementing the “HID” role associated with the Bluetooth HID profile) . Consequently, the back-control application 220 implemented in the sink device may only have the ability to initiate a HID connection with the source device after a HID connection between the sink device and the source device has been established at least once before.
  • the back-control application 220 may be configured to back-control the source device via the HID 221 and may further support the SPP application 223 to retrieve information associated with the remote screen at the source device (e.g., a resolution, an orientation, control parameters, etc. ) .
  • the HID 221 may be a “combo device” that comprises a HID mouse in combination with a HID consumer control.
  • a HID mouse may not support additional virtual keys that are provided on many source devices (e.g., external buttons on a smartphone that are not displayed on the screen) .
  • a HID consumer control can provide the virtual keys that are needed to back-control the source device but cannot be provided via the HID mouse.
  • the HID consumer control can be fitted with controls provided in a media player or other suitable applications that are running on the source device, especially where the source device does not cast a surface layer in which such controls are displayed into the sink device when the media casting application 225 is active (e.g., smartphones that run the Android operating system may not cast the surface layer associated with the media player into the sink device when connected to the sink device via HDMI 227) .
  • the back-control application 220 may construct a “HID combo device” report according to local user inputs captured from the HID mouse and/or HID consumer control.
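A combo-device report of that kind might be constructed as in the following sketch, assuming report ID 1 for the mouse collection and report ID 2 for the consumer-control collection. The IDs, field widths, and byte layout are illustrative assumptions; a valid report must match the layout the combo device declares in its HID report descriptor.

```python
def mouse_report(buttons, dx, dy):
    """Hypothetical relative mouse report: report ID 1, a button-bits byte,
    then signed 8-bit dx/dy clamped to the usual -127..127 range."""
    clamp = lambda v: max(-127, min(127, v))
    return bytes([1, buttons & 0x07, clamp(dx) & 0xFF, clamp(dy) & 0xFF])

def consumer_report(usage_code):
    """Hypothetical consumer control report: report ID 2, then a 16-bit
    little-endian usage code (e.g., 0x00E9 is Volume Increment in the
    HID consumer usage page), covering virtual keys a mouse cannot send."""
    return bytes([2, usage_code & 0xFF, (usage_code >> 8) & 0xFF])
```

The two report IDs let a single HID input endpoint carry both pointer movement and virtual-key presses, which is the point of treating the HID 221 as a combo device.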
  • a HID mouse may not support multi-touch features that may be built into the source device
  • another option may be to use a HID touch-screen as a pointer device to back-control the source device (e.g., where the sink device is implemented in a vehicle or another suitable environment that includes a HID touch-screen that supports multi-touch features) .
  • the HID touch-screen may not always be feasible, as a mapping between the HID touch-screen and the remote screen at the source device may not fully match.
  • the HID touch-screen may have a lower resolution than the remote screen at the source device (e.g., the HID touch-screen may have a 800x480 pixel resolution, whereas the source device may have a 1280x720 pixel resolution) .
  • the back-control application 220 may be unable to pre-define the maximum values on the x-axis and the y-axis in a touch-screen HID report.
  • if the back-control application 220 constructs a touch-screen HID report with maximum values on the x-y axes based on the local screen width and height (e.g., 800, 480) , only a partial region within the remote screen at the source device would be effective due to the difference between the screen resolutions at the source and sink devices.
  • the sink device needs to know at least the screen resolution at the source device in order to properly back-control the source device. For example, as mentioned above, whether a HID touch-screen can be used depends on the difference between the screen resolution at the sink and source devices. Furthermore, in order to use a HID mouse as the HID 221, coordinates in the local screen at the sink device may need to be mapped to coordinates in the remote screen at the source device.
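The coordinate mapping described here can be sketched as a linear scaling from the calibrated output rectangle in the local screen into the remote resolution. The function name and the (left, top, width, height) rectangle representation are assumptions for illustration.

```python
def map_to_remote(x, y, out_rect, remote_w, remote_h):
    """Map a touch point on the sink's local screen into remote-screen
    coordinates at the source. out_rect is the region of the local screen
    that the casted remote screen actually occupies (obtained via
    calibration); returns None if the touch landed outside that region."""
    left, top, w, h = out_rect
    if not (left <= x < left + w and top <= y < top + h):
        return None                       # invalid region: ignore the touch
    rx = (x - left) * remote_w // w       # scale into the remote resolution
    ry = (y - top) * remote_h // h
    return rx, ry
```

For example, with an 800x480 local screen fully covered by the casted image and a 1280x720 remote screen, a local touch at (400, 240) maps to (640, 360) at the source device.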
  • the back-control application 220 may further need to know the previous cursor position in the source device and the initial cursor position.
  • the mapping between the coordinates in the local screen at the sink device and the remote screen at the source device may vary depending on whether the source device is in a landscape or portrait orientation (especially in vehicles where the local screen is always in a landscape orientation or other environments in which the local screen has a fixed orientation) .
  • the back-control application 220 may need to know the exact output rectangle in the local screen depending on whether the remote screen is in a landscape orientation or a portrait orientation.
  • the SPP application 223 may provide the back-control application 220 with various parameters that enable the HID 221 to be used in back-controlling the source device, including the orientation and exact output rectangle at the remote screen.
  • the SPP application 223 implemented at the sink device may communicate with a helper SPP application at the source device (not explicitly shown in FIG. 2) to request and retrieve information associated with the remote screen at the source device.
  • the SPP application 223 implemented at the sink device and the helper SPP application at the source device may support primitives to allow the sink device to retrieve a screen resolution, a screen orientation (or direction) , a pointer or cursor movement unit, and pointer velocity control parameters from the source device.
  • the SPP application 223 at the sink device and the helper SPP application at the source device may support primitives to enable the source device to calibrate the remote screen at the source device (e.g., primitives to start a screen calibration process, show a calibration screen, stop the screen calibration process, etc. ) .
  • communication between the SPP application 223 at the sink device and the helper SPP application at the source device may be based on a client-server architecture, wherein the SPP application 223 implemented at the sink device issues requests to the helper SPP application at the source device and the helper SPP application at the source device responds to the requests from the SPP application 223 implemented at the sink device.
  • the helper SPP application at the source device may send unsolicited indicators to the SPP application 223 implemented at the sink device (e.g., to inform the sink device about the orientation or direction, pointer movement speed, etc. associated with the remote screen, as such parameters may dynamically change at the source device from time to time) .
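The client-server exchange might look like the following server-side sketch at the helper SPP application. The textual command names and line-oriented framing are purely illustrative assumptions; the wire format of these primitives is not specified here.

```python
# Hypothetical request handler for the helper SPP application at the
# source device. Command names are assumptions covering the primitives
# named above (screen query and calibration control).
REQUESTS = ("GET_RESOLUTION", "GET_ORIENTATION",
            "START_CALIBRATION", "SHOW_CALIBRATION_SCREEN", "STOP_CALIBRATION")

def handle_request(line, screen_state):
    """Respond to one request line from the sink's SPP application 223."""
    cmd = line.strip()
    if cmd == "GET_RESOLUTION":
        w, h = screen_state["resolution"]
        return f"OK {w}x{h}"
    if cmd == "GET_ORIENTATION":
        return f"OK {screen_state['orientation']}"
    if cmd in REQUESTS:
        return "OK"                 # calibration control: acknowledge only
    return "ERR unknown"
```

Unsolicited indicators (e.g., an orientation change at the source) would travel in the opposite direction over the same emulated serial link, outside this request/response pairing.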
  • the back-control application 220 may be unable to establish a connection between the SPP application 223 and the corresponding helper SPP application at the source device until the relevant HID 221 has been connected with the source device. Likewise, if the relevant HID 221 loses the connection with the source device, the back-control application 220 may automatically disconnect from the SPP application 223.
  • although the Bluetooth stack 232 at the sink device may support multiple connections with the SPP application 223, the helper SPP application at the source device may only support one SPP connection.
  • an exemplary method 400 as illustrated therein may be implemented at a sink device to enable the sink device to automatically calibrate a screen at a source device.
  • the sink device may need to know the output rectangle when the remote screen is cast from the source device into the local screen at the sink device in a landscape orientation and a portrait orientation.
  • the sink device may perform the method 400 shown in FIG. 4 once with the remote screen at the source device arranged in the landscape orientation and perform the method 400 again with the remote screen at the source device arranged in the portrait orientation.
  • the remote screen may be suitably calibrated in the reverse order as well.
  • the method 400 may only be performed once with the remote screen in the relevant orientation.
  • the sink device may initially send a message to the source device to request that the source device show a calibration screen in a landscape orientation.
  • the calibration screen may comprise a screen filled with a unique color that may permit the sink device to clearly differentiate between valid and invalid regions within the remote screen cast from the source device into the local screen at the sink device.
  • FIG. 5A illustrates an exemplary calibration screen 510 that may be displayed at the source device, shown as a full-white screen in FIG. 5A.
  • the sink device may differentiate between the valid and invalid regions within the remote screen cast to the sink device and thereby calculate the output rectangle when the remote screen is cast from the source device into the local screen in the landscape orientation.
  • the source device may rotate the remote screen into the landscape orientation in response to the request from the sink device and then display the calibration screen 510 on the remote screen.
  • the sink device may then lock the surface associated with the local screen at the sink device to ensure that an application handling the remote screen calibration has exclusive permission to access the local screen surface and thereby enable the sink device to calculate the output rectangle in the local screen for the remote screen cast from the source device at block 430.
  • the local screen at the sink device may have four vertices, which are denoted in FIG. 5A as P0, P1, P2, P3.
  • the resolution associated with the local screen at the sink device is 800x480 pixels
  • x-y coordinates respectively corresponding to the four vertices P0, P1, P2, P3 in the local screen at the sink device may be (0, 0) , (800, 0) , (0, 480) , (800, 480) .
  • the sink device may determine that the output rectangle for the remote screen is limited to the four vertices in FIG. 5A that are denoted P0′ (X0′, Y0′), P1′ (X1′, Y1′), P2′ (X2′, Y2′), P3′ (X3′, Y3′), such that the output rectangle calculated at block 430 may comprise the region within those four vertices.
  • the sink device may then constrain valid points in the local screen according to the output rectangle that was calculated in block 430 for the remote screen in the landscape orientation.
  • FIG. 5B illustrates an example screen with valid and invalid regions when the remote screen at the source device is in the landscape orientation, wherein valid points may include any points that fall inside region 512 and invalid points may include any points that fall outside region 512 and within region 514 and/or region 516.
  • the sink device may then store screen coordinate conversion results for the landscape orientation such that points in the local screen at the sink device can be suitably converted to points in the remote screen at the source device. For example, in response to any subsequent user input captured at the sink device, an initial determination may be made as to whether a coordinate associated with the user input falls within the valid region 512. In the event that the coordinate associated with the user input falls within either invalid region 514 or 516, the user input would not map to any point on the remote screen and can therefore be discarded.
  • the sink device may need to map coordinates associated with the local point 518 to coordinates at the source device to enable proper back-control via the user input received via the local screen.
  • the screen coordinate conversion results that are stored at block 450 may be used to map the coordinates associated with the local point 518 (or any other local point in the valid region 512) to coordinates associated with remote points on the remote screen at the source device.
  • coordinates associated with the local point 518 in the local screen may be converted to the coordinates associated with a remote point in the remote screen as follows: X′ = (X − L0) / (R0 − L0) × W′ and Y′ = (Y − T0) / (B0 − T0) × H′, where:
  • X and Y represent the x-y coordinates associated with the local point 518 in the screen at the sink device
  • X′ and Y′ represent the x-y coordinates associated with the remote point in the screen at the source device
  • W′ and H′ represent the width and height associated with the remote screen (e.g., W′ may be the maximum value in the resolution associated with the remote screen, while H′ may be the minimum value in the resolution associated with the remote screen, whereby a remote screen having a 1024x720 resolution would result in W′ being 1024 and H′ being 720).
  • L0 and T0 correspond to the x-y coordinates associated with the upper-left corner in the output rectangle region 512 in the landscape orientation
  • R0 and B0 correspond to the x-y coordinates associated with the lower-right corner in the output rectangle region 512 in the landscape orientation (i.e., across a diagonal from the upper-left corner)
  • coordinates associated with any arbitrary local point in the output rectangle region 512 can be converted to coordinates in the remote screen at the source device.
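The landscape conversion described above may be sketched in Python as follows. This is an illustrative sketch only: the 800x480 local resolution, the 1024x720 remote resolution, and the example output rectangle coordinates are assumptions, while the conversion mirrors the linear mapping implied by the variables defined above.

```python
def in_valid_region(x, y, rect):
    """Return True when a local point falls inside the output rectangle
    (region 512); points outside it do not map to any remote point."""
    left, top, right, bottom = rect
    return left <= x <= right and top <= y <= bottom

def local_to_remote_landscape(x, y, rect, remote_w, remote_h):
    """Apply X' = (X - L0) / (R0 - L0) * W' and
    Y' = (Y - T0) / (B0 - T0) * H' to a valid local point."""
    l0, t0, r0, b0 = rect
    return ((x - l0) / (r0 - l0) * remote_w,
            (y - t0) / (b0 - t0) * remote_h)

# Assumed example: 800x480 local screen with letterbox bands above and
# below, so the output rectangle is (L0, T0, R0, B0) = (0, 60, 800, 420),
# and a 1024x720 remote resolution (W' = 1024, H' = 720).
rect = (0, 60, 800, 420)
if in_valid_region(400, 240, rect):
    rx, ry = local_to_remote_landscape(400, 240, rect, 1024, 720)
```

A touch at the center of the local screen thus maps to the center of the remote screen, while a touch in a letterbox band fails the validity check and is discarded.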
  • the sink device may perform the method 400 in substantially the same manner as described above to calibrate the remote screen at the source device in a portrait orientation.
  • the sink device may send a message to the source device to request that the source device show the calibration screen in the portrait orientation, wherein the calibration screen may comprise a screen filled with a unique color that may permit the sink device to clearly differentiate between valid and invalid regions (e.g., a full-white screen) .
  • the source device may rotate the remote screen into the portrait orientation as-needed in response to the request from the sink device and then display a calibration screen 610.
  • the sink device may lock the local screen surface to ensure that the application handling the remote screen calibration has exclusive permission to access the local screen while calculating the output rectangle in the local screen for the remote screen in the portrait orientation.
  • the local screen at the sink device may have four vertices, which are denoted in FIG. 6A as P0 (0, 0) , P1 (800, 0) , P2 (0, 480) , P3 (800, 480) assuming an 800x480 resolution. Accordingly, when calculating the output rectangle in the portrait orientation at block 430, the sink device may determine that the output rectangle in the portrait orientation is limited to the four vertices in FIG. 6A that are denoted P0′′ (X0′′, Y0′′) , P1′′ (X1′′, Y1′′) , P2′′ (X2′′, Y2′′) , P3′′ (X3′′, Y3′′) .
  • the sink device may then constrain valid points in the local screen according to the output rectangle that was calculated in block 430 for the remote screen in the portrait orientation. For example, FIG. 6B illustrates an example screen with valid and invalid regions when the remote screen at the source device is in the portrait orientation, wherein valid points may include any points that fall inside region 612 and invalid points may include any points that fall outside region 612 and within region 614 and/or region 616.
  • the sink device may then store screen coordinate conversion results for the portrait orientation such that points in the local screen at the sink device can be suitably converted to points in the remote screen at the source device. For example, in response to any subsequent user input captured at the sink device, an initial determination may be made as to whether a coordinate associated with the user input falls within the valid region 612. In the event that the coordinate associated with the user input falls within either invalid region 614 or 616, the user input would not map to any point on the remote screen and can therefore be discarded.
  • the sink device may need to map coordinates associated with the local point 618 to coordinates at the source device to enable proper back-control via the user input received via the local screen.
  • the screen coordinate conversion results that are stored at block 450 may be used to map the coordinates associated with the local point 618 (or any other local point in the valid region 612) to coordinates associated with remote points on the remote screen at the source device.
  • coordinates associated with the local point 618 in the local screen may be converted to the coordinates associated with a remote point in the remote screen as follows: X′ = (X − L1) / (R1 − L1) × H′ and Y′ = (Y − T1) / (B1 − T1) × W′, where:
  • X and Y represent the x-y coordinates associated with the local point 618 in the screen at the sink device
  • X′ and Y′ represent the x-y coordinates associated with the remote point in the screen at the source device
  • W′ and H′ represent the width and height associated with the remote screen (i.e., the maximum and minimum values in the resolution associated with the remote screen).
  • L1 and T1 correspond to the x-y coordinates associated with the upper-left corner in the output rectangle region 612 in the portrait orientation
  • R1 and B1 correspond to the x-y coordinates associated with the lower-right corner in the output rectangle region 612 in the portrait orientation (i.e., across a diagonal from the upper-left corner in the output rectangle region 612) . Accordingly, given the resolution associated with the remote screen and the coordinates associated with the output rectangle region 612 in the portrait orientation, coordinates associated with any local point in the output rectangle region 612 can be converted to coordinates in the remote screen at the source device.
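A corresponding sketch for the portrait orientation is shown below. It assumes the portrait remote screen is H′ pixels wide and W′ pixels tall (the swap implied by rotating the screen, since W′ is defined as the larger dimension), and the example rectangle coordinates are illustrative assumptions.

```python
def local_to_remote_portrait(x, y, rect, remote_w, remote_h):
    """Apply X' = (X - L1) / (R1 - L1) * H' and
    Y' = (Y - T1) / (B1 - T1) * W', assuming the portrait remote screen
    is H' pixels wide and W' pixels tall."""
    l1, t1, r1, b1 = rect
    return ((x - l1) / (r1 - l1) * remote_h,
            (y - t1) / (b1 - t1) * remote_w)

# Assumed example: portrait output rectangle pillarboxed in the middle of
# an 800x480 local screen, with (L1, T1, R1, B1) = (231, 0, 569, 480)
# and a 1024x720 remote resolution (W' = 1024, H' = 720).
rect = (231, 0, 569, 480)
rx, ry = local_to_remote_portrait(400, 240, rect, 1024, 720)
```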
  • an exemplary method 700 and corresponding timing diagram 770 as shown therein can be used to capture and transmit user inputs applied at a sink device to a source device to thereby back-control the source device. More particularly, once the source device has been suitably linked to the sink device and any necessary calibration with respect to the remote screen has been completed and/or results from a prior calibration have been retrieved, a user may locally back-control the source device via a touch-screen or a Bluetooth Human Interface Device (HID) (e.g., a HID mouse and consumer control “combo” device) deployed in the sink device.
  • user input from such input devices may be captured via a back-control application as a touch message, wherein the back-control application may generally send a HID input report to the source device each time that a touch message is locally captured at the sink device (i.e., in a default implementation, one touch message corresponds to at least one HID input report) .
  • sending a HID input report per touch message may be impractical in certain circumstances.
  • the time to send a Host Controller Interface (HCI) Asynchronous Connection-Less (ACL) packet wrapped with the HID input report will typically be longer than a touch message sample rate locally used in the sink device, which will lead to apparent user-perceptible latency when using the Bluetooth HID to back-control the source device.
  • the method 700 shown in FIG. 7 may consider a time required to send HID input reports to the source device and a local touch message sample rate used in the sink device to address the above-mentioned latency issues.
  • the method 700 shown in FIG. 7 may consider the velocity at which the pointer moves in the source device to address issues in which a floating pointer may appear at the source device when the user makes several quick touch messages in succession (e.g., during quick handwriting) .
  • the pointer in an Android phone has a speed in a range between [-7, 7] , wherein a scale factor is applied to the velocity to adapt the resolution associated with the input device to the resolution associated with the output device.
  • the scale factor is applied to adapt the resolution associated with the local screen in the sink device to the resolution associated with the remote screen in the source device.
  • the source device will have a default low threshold velocity (LV) , which is the scaled speed at which acceleration begins to be applied to the pointer.
  • the LV is the upper bound on a pointer speed at which small motions can be performed without acceleration (e.g., in an Android phone, the LV is limited to 500 pixels-per-second) .
  • the source device will accelerate the pointer such that the pointer can move the extra distance. Consequently, the sink device will lose the ability to control the pointer in the source device under such conditions where the pointer velocity exceeds the LV limit in the source device. Accordingly, the method 700 shown in FIG. 7 may consider the velocity at which the pointer moves in the source device to address the above-mentioned floating pointer issue.
  • the HID input report associated with the touch-screen and/or other suitable HID devices used to back-control the source device may be initialized according to a local touch message captured at the sink device at some point in time after the Bluetooth HID has been connected to the source device and the link used to cast the remote screen at the source device into the sink device is active.
  • the HID input report may be determined according to a HID descriptor that outlines specific information about usage parameters associated with the HID device.
  • a HID input report that includes information about a touch message captured from a HID mouse may provide maximum and minimum values on relative x-y axes. Accordingly, a HID input report based on a HID mouse touch input may only include one (1) byte to limit logical minimum and maximum values to a range between [-127, 127] based on a HID mouse natively supporting relative coordinates. As such, when a HID mouse is implemented to back-control the source device, the back-control application may store a previous point in the remote screen to calculate a difference between a current remote point and the previous remote point.
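Because the mouse report carries one signed byte per axis, the difference between the previous and current remote points may be clamped as in the following sketch (illustrative Python; the function name and example coordinates are assumptions):

```python
def relative_report(prev_remote, cur_remote):
    """Compute the relative (dx, dy) fields of a HID mouse input report
    from the previous and current remote points, clamped to the one-byte
    logical range [-127, 127]."""
    clamp = lambda v: max(-127, min(127, int(round(v))))
    return (clamp(cur_remote[0] - prev_remote[0]),
            clamp(cur_remote[1] - prev_remote[1]))

# Illustrative remote points: a large horizontal jump is clamped to 127,
# so the back-control application must track the previous remote point.
dx, dy = relative_report((100.0, 100.0), (400.0, 90.0))
```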
  • the touch message captured at block 710 may include a touch point (x, y) in the local screen at the sink device and a touch state (or type) , which may include, for example, a touch down (e.g., press) , a touch move (e.g., swipe) , or a touch up (e.g., release) .
  • the sink device may have a native sample rate (ST) used to capture local touch messages, which may generally be a static value (e.g., 125 Hz such that a local touch message is sampled every eight (8) milliseconds).
  • the sink device may be configured to re-sample the touch message at a re-sample rate (RT) that differs from the native sample rate ST to determine how frequently to send the HID input reports to the source device.
  • the sink device may periodically send a HID input report to the source device.
  • the method 700 shown in FIG. 7 may consider various parameters to send the HID input reports to the source device on a periodic basis (e.g., one HID input report every ten to forty milliseconds and at most one HID input report per local touch message) .
  • the re-sample rate RT may generally correspond to a rate at which the back-control application used to send the HID input report to the source device captures the touch message.
  • with a re-sample rate RT equal to n, the back-control application may handle every n-th touch message, whereby successive touch messages may become increasingly difficult to recognize at the source device as n increases.
  • the back-control application may be implicitly required to send an additional padding report, wherein the padding report may also be a HID input report, with the main difference that the (x, y) fields in the padding report are always zero.
  • the padding report does not move the pointer in the source device and instead aims to mitigate, to some extent, the floating pointer issue that arises during quick handwriting.
  • the padding report may therefore be added whenever the back-control application handles a touch message, with the padding report having a count limited to a range between [0, RT − 1].
  • the native sample rate (ST) and the re-sample rate (RT) as described above may therefore be used to determine conditions under which the sink device sends the HID input report to the source device. More particularly, the native (default) sample rate ST to capture the local touch message at the sink device (e.g., 125 Hz or every eight (8) milliseconds) may determine how many (or how often) “raw” touch messages will be captured at block 710. Accordingly, the back-control application configured to handle the local touch message may first en-queue the local touch message that was captured at block 710 and a separate thread may then de-queue the local touch message to allow proper handling according to the re-sample rate RT mentioned above.
  • the thread in the back-control application configured to handle the local touch message may then send the HID input report into the Bluetooth stack, and the Bluetooth stack may then send the HCI ACL packet wrapped with the HID input report into the Bluetooth chip.
  • a time to send the HID input report into the Bluetooth chip may vary among different Bluetooth stack implementations (e.g., typically between eight (8) to sixteen (16) milliseconds) .
  • Another variable that may be considered in determining the conditions under which the HID input report is sent to the source device may depend on a time to send the HID input report from the Bluetooth chip to the source device (TC) , which may be determined according to a “sniff” interval employed in the Bluetooth chip.
  • the source device may request adjustments to the Bluetooth sniff interval to control the variable TC that depends on the time to send the HID input report to the source device and thereby guarantee that the HCI ACL packet wrapped with the HID input report will be sent to the source device at a minimum periodicity (e.g., every ten (10) to forty (40) milliseconds) .
  • the sink device may determine how to process each local touch message captured at block 710. More particularly, the sink device may determine whether the local touch message captured at block 710 was a “touch down” message, a “touch move” message, or a “touch up” message. In response to determining at block 720 that the local touch message was a touch down message, the sink device may transmit the HID input report to the source device at block 722. Alternatively, in response to determining at block 730 that the local touch message was a “touch move” message, the sink device may determine whether to transmit the HID input report corresponding to the touch move message or alternatively store the touch move message as a reference point without transmitting the HID input report to the source device.
  • the sink device may determine a movement distance (D) associated with the pointer in the source device based on a previous remote point and a current remote point in the remote screen at the source device.
  • the sink device may store coordinates associated with a previous local point P0 (X0, Y0) that corresponds to a previous local touch message captured in the sink device and a current local point P1 (X1, Y1) that corresponds to the current local touch message captured at block 710.
  • the coordinates (X0, Y0) and (X1, Y1) associated with the previous and current local points P0, P1 may therefore be converted to coordinates (X0′, Y0′) and (X1′, Y1′) that correspond to the previous and current points in the remote screen at the source device, which may be referred to as P0′, P1′.
  • the coordinates associated with the previous and current local points P0, P1 in the local screen at the sink device may be converted to coordinates associated with the previous and current points P0′, P1′in the remote screen at the source device according to the equations described in further detail above with reference to FIG. 4 through FIG. 5.
  • the movement distance D associated with the pointer in the source device may be computed as follows: D = √((X1′ − X0′)² + (Y1′ − Y0′)²).
  • the sink device may then compute the velocity (S) associated with the pointer in the source device based on the movement distance D associated with the pointer in the source device and the variable TC as-determined according to the above-mentioned sniff interval, as follows: S = D / TC.
  • the sink device may then determine whether the velocity S associated with the pointer in the source device is less than the low threshold velocity LV at which acceleration begins to be applied to the pointer at the source device, wherein the HID input report corresponding to the touch move message may be sent to the source device at block 740 in response to determining that S is less than LV.
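The distance, velocity, and threshold comparison described above may be sketched as follows. This is an illustrative Python sketch: the remote point coordinates and the 40-millisecond TC value are assumptions, while the 500 pixels-per-second LV follows the Android example mentioned above.

```python
import math

LV = 500.0  # low threshold velocity in pixels per second (Android example)

def pointer_velocity(p0_remote, p1_remote, tc_seconds):
    """Compute D = sqrt((X1' - X0')^2 + (Y1' - Y0')^2) and return
    the pointer velocity S = D / TC."""
    d = math.hypot(p1_remote[0] - p0_remote[0], p1_remote[1] - p0_remote[1])
    return d / tc_seconds

# Illustrative remote points converted from two consecutive local touch
# moves, with TC assumed to be 40 milliseconds.
s = pointer_velocity((100.0, 100.0), (106.0, 108.0), 0.040)
send_now = s < LV  # below LV: no pointer acceleration at the source device
```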
  • the above-mentioned padding reports may be sent to the source device at block 740 in addition to the HID input report corresponding to the current touch move message when S is less than LV.
  • the sink device may determine whether a current touch move index is a multiple of the re-sample rate RT at block 738. For example, assuming an RT equal to three (3) , HID input reports and the above-mentioned padding reports may be sent to the source device at block 740 with respect to touch move messages having an index of zero, three, six, nine, twelve, etc.
  • the sink device may ignore the touch move input at block 742 without sending the HID input report (or the padding reports) to the source device.
  • the sink device may further set the last touch move message to the current touch move message at block 742 in order to maintain the previous local/remote point coordinates that may be needed to provide a reference point with respect to subsequent touch messages that are captured at the sink device.
  • the sink device may determine that the local touch message was a touch up message at block 740. In such a case, the sink device may transmit the HID input report corresponding to the last touch move message and the current touch up message at block 742. In addition, the sink device may reset the touch move index so as to continue re-sampling in the same manner described above with respect to any subsequent touch move messages.
  • the timing diagram 770 shown in FIG. 7 illustrates an exemplary touch message sequence that may be handled according to the method 700, wherein the timing diagram 770 assumes that the re-sample rate RT equals three (3).
  • RT can have a dynamic value greater than or equal to one (1) , wherein using a larger RT value will result in more touch messages being ignored such that inputs (e.g., handwriting) controlled via the HID may be more difficult to recognize when shown in the remote screen at the source device.
  • smaller RT values may result in the sink device sending more touch messages that are captured via the HID to the source device, which may result in an apparent (user-perceptible) latency in showing the captured inputs at the source device.
  • the variable RT may generally have a value calculated to balance the above-mentioned tradeoffs to avoid a floating pointer in the remote screen at the source device, which may be three (3) in the exemplary timing diagram 770 shown in FIG. 7.
  • the sink device may transmit the corresponding HID input report to the source device.
  • the sink device may transmit the corresponding HID input report (s) to the source device according to the re-sample rate RT.
  • the sink device may transmit the HID input report corresponding to touch move messages M0, M3, and M6, while touch move messages M1, M2, M4, M5, M7, and M8 may not be reported based on the re-sample rate RT.
  • the sink device may transmit the HID input report corresponding to touch move message M8 (i.e., the last touch move message prior to the touch up message) and transmit the HID input report corresponding to the touch up message.
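The report-or-ignore decisions reflected in timing diagram 770 may be reproduced with the following sketch. This is an illustrative Python sketch only: the branch ordering is one plausible reading of method 700, and a touch up is modeled as transmitting the last stored move followed by the touch up report.

```python
def handle_touch(msg_type, move_index=0, velocity=0.0, rt=3, lv=500.0):
    """Decide which reports to transmit for one locally captured touch
    message: a touch down always produces a report; a touch move is
    reported when the pointer velocity stays below LV or when its index
    is a multiple of RT, and is otherwise stored only as the previous
    reference point; a touch up transmits the last stored move followed
    by the touch up itself."""
    if msg_type == "down":
        return ["report"]
    if msg_type == "up":
        return ["last_move_report", "report"]
    if velocity < lv or move_index % rt == 0:
        return ["report"]
    return []

# Reproduce timing diagram 770 with RT = 3 and an assumed pointer
# velocity above LV: touch moves M0..M8 followed by a touch up.
sent = {f"M{i}": handle_touch("move", i, velocity=800.0) for i in range(9)}
```

With these assumptions, only M0, M3, and M6 produce reports, while M8 is transmitted together with the subsequent touch up, matching the sequence described above.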
  • FIG. 8 illustrates an exemplary call flow that can be implemented at a source device 814 and a sink device to back-control the source device 814 via user inputs applied at the sink device.
  • the sink device may implement a back-control application 810 to back-control the source device 814 and a media casting application 812 to manage a link through which the source device 814 casts a remote screen associated therewith to a local screen at the sink device.
  • the source device 814 may initially connect to the back-control application 810 as a Human Interface Device (HID) to enable the back-control application 810 to back-control the source device 814.
  • the media casting application 812 may establish the link through which the source device 814 casts the remote screen to the local screen at the sink device (e.g., via HDMI/MHL, Miracast, etc. ) .
  • the back-control application 810 may then establish a Serial Port Profile (SPP) connection with the source device 814, which may generally occur after the HID has been connected at 820 and the link with the source device 814 has been established at 822.
  • the back-control application 810 may be configured to recover or otherwise re-establish the SPP connection in the event that the SPP connection is terminated while the HID is still connected to the source device 814.
  • the back-control application 810 may then retrieve information associated with the remote screen at the source device 814 and send a message requesting that the source device 814 show a calibration screen at 828.
  • the back-control application 810 may retrieve the information associated with the remote screen at the source device 814 and request that the source device 814 show the calibration screen through the SPP connection.
  • the back-control application 810 may then send a request to calibrate the remote screen to the media casting application 812, which may then lock a surface of the local screen at 832 and calculate the (landscape or portrait) output rectangle at 834.
  • the local screen at the sink device may be locked to ensure that the back-control application 810 handling the remote screen calibration process has exclusive permission to access the local screen surface while calibrating the remote screen.
  • the coordinates associated with the output rectangle may then be provided to the back-control application 810, which may store the calibration result at 838 such that the back-control application 810 may convert coordinates in the local screen to coordinates in the remote screen.
  • the calibration process performed at 826 through 838 is described in further detail above with respect to FIG. 4 through FIG. 6, wherein the messages from 828 through 838 may be performed once with the remote screen at the source device 814 in a landscape orientation and once with the remote screen in a portrait orientation.
  • the messages shown at 828 through 838 may be omitted to the extent that the remote screen cast into the local screen has been previously calibrated (e.g., where the same source device 814 is reconnected to the sink device, a source device 814 with the same resolution as a prior calibrated source device 814 is connected to the sink device, etc.).
  • the back-control application 810 may then capture a local touch input and initialize a corresponding HID input report at 842.
  • initializing the HID input report may comprise converting coordinates associated with the local touch input to coordinates in the remote screen at the source device 814 according to the prior calibration and depending on whether the source device 814 is currently in a landscape or portrait orientation.
  • the back-control application 810 may then send the HID input report to the source device at 844 when the local touch input satisfies certain criteria (e.g., where the local touch input is a touch down message, a touch up message, a last touch move message before a touch up message, a touch move message that matches a re-sample interval, a touch move message that results in a pointer velocity at the source device 814 below a low threshold velocity above which pointer acceleration is triggered at the source device 814, etc. ) .
  • the process to report local touch messages as shown at 840 through 844 is described in further detail above with reference to FIG. 7.
  • the source device 814 may send a message to disconnect the HID to the back-control application 810, as depicted at 846, which may further result in the back-control application 810 automatically terminating the SPP connection with the source device 814, as shown at 848.
  • FIG. 9 illustrates an exemplary device 900 that can implement the various aspects and embodiments described herein.
  • the device 900 shown in FIG. 9 may correspond to a source device that can cast a screen associated therewith to a sink device (e.g., via Miracast, HDMI/MHL, or another suitable technology that may allow screen sharing between two devices) .
  • the device 900 may further implement a Bluetooth Human Interface Device (HID) host role to allow user inputs to be applied at the sink device to back-control the device 900 operating as the source device, as well as a Bluetooth Serial Port Profile (SPP) acceptor role to enable communication with the sink device over an emulated physical cable.
  • the device 900 may correspond to a sink device that can display a screen cast from a source device and further implement a Bluetooth HID role to accept user inputs used to back-control the source device as well as a Bluetooth SPP initiator role to initiate communication with the source device over an emulated physical cable.
  • the device 900 may include a housing 910, a processor 920, a memory 922, a signal detector 924, a user interface 926, a digital signal processor (DSP) 928, a transmitter 932, a receiver 934, an antenna 936, a local display 938, and a bus system 950.
  • the functions associated with the transmitter 932 and the receiver 934 can be incorporated into a transceiver 930.
  • the device 900 can also be configured to communicate in a wireless network that includes, for example, a base station, an access point, or the like.
  • the processor 920 can be configured to control operations associated with the device 900, wherein the processor 920 may also be referred to as a central processing unit (CPU) .
  • the memory 922 can be coupled to the processor 920, can be in communication with the processor 920, and can provide instructions and data to the processor 920.
  • the processor 920 can perform logical and arithmetic operations based on program instructions stored within the memory 922.
  • the instructions in the memory 922 can be executable to perform one or more methods and processes described herein.
  • the processor 920 can include, or be a component in, a processing system implemented with one or more processors.
  • the one or more processors can be implemented with any one or more general-purpose microprocessors, microcontrollers, digital signal processors (DSPs) , field programmable gate array (FPGAs) , programmable logic devices (PLDs) , controllers, state machines, gated logic, discrete hardware components, dedicated hardware finite state machines, combinations thereof, and/or any other suitable entities that can perform calculations and/or manipulate information.
  • the processing system can also include machine-readable media configured to store software, which can be broadly construed to include any suitable instructions, whether referred to as software, firmware, middleware, microcode, hardware description language, or otherwise. Instructions can include code in a source code format, a binary code format, an executable code format, and/or any other suitable format. The instructions, when executed on the one or more processors, can cause the processing system to perform one or more functions described herein.
  • the memory 922 can include read-only memory (ROM) , random access memory (RAM) , and/or any suitable combination thereof.
  • the memory 922 can also include non-volatile random access memory (NVRAM) .
  • the transmitter 932 and the receiver 934 can transmit and receive data between the device 900 and a remote location.
  • the antenna 936 can be attached to the housing 910 and electrically coupled to the transceiver 930.
  • the device 900 can also include multiple transmitters, multiple receivers, multiple transceivers, and/or multiple antennas (not illustrated) .
  • the signal detector 924 can be used to detect and quantify the level associated with one or more signals received at the transceiver 930. The signal detector 924 can detect such signals as total energy, energy per subcarrier per symbol, power spectral density, and/or in other ways.
  • the digital signal processor (DSP) 928 can be used to process signals, wherein the DSP 928 can be configured to generate a packet to be transmitted via the transmitter 932 and/or the transceiver 930.
  • the packet can include a physical layer protocol data unit (PPDU) .
  • the user interface 926 can include, for example, a keypad, a microphone, a speaker, a display, and/or other suitable interfaces.
  • the user interface 926 can include any element or component that conveys information to a user associated with the device 900 and/or receives input from the user.
  • the user interface 926 may comprise at least the screen casted to the sink device.
  • the user interface 926 may comprise a display to show the screen casted from the source device in addition to one or more touchscreen sensors, steering wheel control buttons, a microphone, a mouse, and/or any other suitable human interface device, whereby the user interface 926 may receive user inputs to back-control the source device and one or more packets to report the user inputs received via the user interface 926 may be transmitted to the source device via the transmitter 932 and/or transceiver 930.
  • the device 900 shown in FIG. 9 may include the transmitter 932 and the receiver 934 (or the transceiver 930) to transmit and receive data between the device 900 and a remote location.
  • the transmitter 932 and the receiver 934 (or the transceiver 930) may be used to communicate appropriate wireless signals to cast the screen associated with the source device to the sink device.
  • the various aspects and embodiments described herein contemplate that the screen casting session between the source device and the sink device may be implemented via HDMI/MHL or another wired technology.
  • the device 900 may optionally include a connector 940 such that an appropriate physical cable can be plugged into the connector 940 to convey the screen casting signals between the source device and the sink device in addition to the transmitter 932 and the receiver 934 (or the transceiver 930) that are used to communicate wireless signals used to back-control the source device via one or more user inputs applied at the sink device.
  • the connector 940 may be compatible with a five-pin or an eleven-pin Micro-USB-to-HDMI adapter, an MHL passive cable, a USB Type-C connector, a reversible superMHL connector, and/or other suitable physical cables.
  • the local display 938 may comprise any suitable video output devices such as a cathode ray tube (CRT) , a liquid crystal display (LCD) , a plasma display, a light emitting diode (LED) display, an organic light emitting diode (OLED) display, or another display device.
  • the local display 938 may be an emissive display or a transmissive display.
  • the local display 938 may also be a touch-screen display or a presence-sensitive display such that the local display 938 is simultaneously an input device and an output (display) device.
  • a touch-screen display may be capacitive, resistive, and/or another suitable touch or presence-sensitive panel that allows a user to provide user input.
  • the various components associated with the device 900 can be coupled together via the bus system 950, wherein the bus system 950 can include a data bus as well as a power bus, a control signal bus, and/or a status signal bus in addition to the data bus.
  • the device 900 can also include other components or elements not illustrated in FIG. 9.
  • One or more components associated with the device 900 can be in communication with another one or more components associated with the device 900 via means that may comprise another communication channel (not illustrated) to provide, for example, an input signal to the other component.
  • Although FIG. 9 illustrates various separate components, one or more components shown therein can be combined or commonly implemented.
  • the processor 920 and the memory 922 can be embodied on a single chip.
  • the processor 920 can additionally (or alternatively) contain memory, such as processor registers.
  • one or more functional blocks or portions thereof can be embodied on a single chip.
  • the functionality associated with a particular block can be implemented on two or more chips.
  • the processor 920 can be used to implement the functionality described above with respect to the signal detector 924 and/or the DSP 928.
  • FIG. 10 illustrates an exemplary data communication model 1000 or protocol stack that can be used in a system in which a source device casts a screen to a sink device and the source device is back-controlled via user inputs applied at the sink device.
  • the data communication model 1000 illustrates interactions between data and control protocols used to transmit data between a source device and a sink device during a session in which the source device casts at least a screen associated therewith to the sink device.
  • the data communication model 1000 may include a physical (PHY) layer 1002, a media access control (MAC) layer 1004, an internet protocol (IP) layer 1006, a user datagram protocol (UDP) layer 1008, a real time protocol (RTP) layer 1010, an MPEG2 transport stream (MPEG2-TS) 1012, a content protection layer 1014, packetized elementary stream (PES) packetization 1016, a video codec 1018, an audio codec 1020, a transport control protocol (TCP) layer 1022, a real time streaming protocol (RTSP) layer 1024, feedback packetization 1026, and a human interface device (HID) input layer 1030.
  • the physical layer 1002 and the MAC layer 1004 may define physical signaling, addressing, and channel access control used in communications between the source device and the sink device.
  • the physical layer 1002 and the MAC layer 1004 may define the frequency band structure used in the communications (e.g., Federal Communications Commission bands defined at 2.4 GHz, 3.6 GHz, 5 GHz, or 60 GHz, or Ultrawideband (UWB) frequency band structures).
  • the physical layer 1002 and the MAC layer 1004 may also define data modulation techniques (e.g., analog and digital amplitude modulation, frequency modulation, phase modulation techniques, and combinations thereof) .
  • the physical layer 1002 and the MAC layer 1004 may also define multiplexing techniques (e.g., orthogonal frequency division multiplexing (OFDM) , time division multi access (TDMA) , frequency division multi access (FDMA) , code division multi access (CDMA) , or any combination of OFDM, FDMA, TDMA and/or CDMA) .
  • the physical layer 1002 and the MAC layer 1004 may be defined according to a Wi-Fi standard (e.g., IEEE 802.11-2007 and IEEE 802.11n-2009).
  • the physical layer 1002 and the MAC layer 1004 may be defined according to WirelessHD, Wireless Home Digital Interface (WHDI) , WiGig, Wireless USB, High Definition Multimedia Interface (HDMI) , Mobile High-Definition Link (MHL) , Ethernet, and/or other suitable network architectures.
  • the IP layer 1006, the UDP layer 1008, the RTP layer 1010, the TCP layer 1022, and the RTSP layer 1024 may define packet structures and encapsulations used in the communications between the source device and the sink device and may be defined according to the standards maintained by the Internet Engineering Task Force (IETF) .
  • the source device and the sink device may use the RTSP layer 1024 to negotiate capabilities, establish a session, and maintain and manage a session.
  • the source device and the sink device may further establish a feedback channel to back-control the source device via user inputs applied at the sink device (e.g., via a Bluetooth Human Interface Device (HID) and Serial Port Profile (SPP) helper application, as described above) .
  • the video codec 1018 may define one or more video data coding techniques that may be used to manage video data casted from the source device to the sink device.
  • the video codec 1018 may implement one or more video compression standards, which may include ITU-T H.261, ISO/IEC MPEG-1 Visual, ITU-T H.262 or ISO/IEC MPEG-2 Visual, ITU-T H.263, ISO/IEC MPEG-4 Visual, ITU-T H.264 (also known as ISO/IEC MPEG-4 AVC), VP8, and High Efficiency Video Coding (HEVC).
  • the audio codec 1020 may define audio data coding techniques that may be used to manage audio data casted from the source device to the sink device.
  • the audio data may be coded using multi-channel formats such as those developed by Dolby and Digital Theater Systems.
  • the audio data may further be coded using a compressed or uncompressed format, wherein example compressed audio formats may include, without limitation, MPEG-1, 2 Audio Layers II and III, AC-3, AAC, etc., while an example uncompressed audio format may include the pulse-code modulation (PCM) audio format.
  • the packetized elementary stream (PES) packetization 1016 and MPEG2 transport stream (MPEG2-TS) 1012 may define how coded audio and video data is packetized and transmitted.
  • the PES packetization 1016 and MPEG2-TS 1012 may be defined according to MPEG-2 Part 1.
  • audio and video data may be packetized and transmitted according to other packetization and transport stream protocols.
  • the content protection 1014 may protect against unauthorized copying of audio or video data, wherein the content protection 1014 may be defined according to the High-bandwidth Digital Content Protection (HDCP) 2.0 specification in one example.
  • the feedback packetization 1026 may define how user input and performance information is packetized, wherein the feedback may generally affect how subsequent media data is presented to the user at the sink device (e.g., zoom and pan operations) and how the source device processes (e.g., encodes and/or transmits) the media data to the sink device.
  • the feedback may generally be used to back-control any functional aspect associated with the source device.
  • HID input 1030 may be received at one or more HID devices at the sink device, which may implement a Bluetooth HID to back-control the source device and a Bluetooth Serial Port Profile (SPP) helper application to retrieve various parameters associated with the remote screen at the source device and to calibrate the remote screen at the source device relative to a local screen at the sink device.
  • the feedback packetization 1026 may provide functionality to convert coordinates associated with one or more touch messages or other HID input 1030 received at the sink device to coordinates in the local screen at the sink device, as described in further detail above with respect to FIG. 4 through FIG. 6, and the feedback packetization 1026 may further provide functionality to packetize such information for wireless transmission to the source device.
  • the feedback packetization 1026 may implement one or more controls to determine whether and/or when to transmit packets reporting the HID input 1030 to the source device, as described in further detail above with reference to FIG. 7.
  • the source device may cast the screen associated therewith to the sink device via any suitable screen sharing technology.
  • the source device may cast the screen associated therewith to the sink device via Miracast or another suitable wireless technology, or via HDMI/MHL or another suitable wired technology.
  • the particular data communication model 1000 shown in FIG. 10 is for illustrative purposes only and is not intended to limit the communications between the source device and the sink device to any particular screen sharing technology.
  • in a screen sharing session implemented via Miracast or another wireless technology, the source device and the sink device may communicate screen casting information via wireless signals, whereas in a screen sharing session implemented via HDMI/MHL, the source device and the sink device may be connected to one another via a physical cable and the screen casting information may be conveyed between the source device and the sink device via the physical cable.
  • The various illustrative logical blocks, modules, and circuits described in connection with the aspects disclosed herein may be implemented or performed with a general purpose processor, a digital signal processor (DSP), an application specific integrated circuit (ASIC), a field programmable gate array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein.
  • a general purpose processor may be a microprocessor, but in the alternative, the processor may be any conventional processor, controller, microcontroller, or state machine.
  • a processor may also be implemented as a combination of computing devices (e.g., a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration) .
  • a software module may reside in RAM, flash memory, ROM, EPROM, EEPROM, registers, hard disk, a removable disk, a CD-ROM, or any other form of non-transitory computer-readable medium known in the art.
  • An exemplary non-transitory computer-readable medium may be coupled to the processor such that the processor can read information from, and write information to, the non-transitory computer-readable medium.
  • the non-transitory computer-readable medium may be integral to the processor.
  • the processor and the non-transitory computer-readable medium may reside in an ASIC.
  • the ASIC may reside in an IoT device.
  • the processor and the non-transitory computer-readable medium may be discrete components in a user terminal.
  • the functions described may be implemented in hardware, software, firmware, or any combination thereof. If implemented in software, the functions may be stored on or transmitted over as one or more instructions or code on a non-transitory computer-readable medium.
  • Computer-readable media may include storage media and/or communication media including any non-transitory medium that may facilitate transferring a computer program from one place to another.
  • a storage media may be any available media that can be accessed by a computer.
  • such computer-readable media can comprise RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer.
  • any connection is properly termed a computer-readable medium.
  • if the software is transmitted from a website, server, or other remote source using a coaxial cable, fiber optic cable, twisted pair, DSL, or wireless technologies such as infrared, radio, and microwave, then the coaxial cable, fiber optic cable, twisted pair, DSL, or those wireless technologies are included in the definition of a medium.
  • disk and disc, as used herein, include CD, laser disc, optical disc, DVD, floppy disk, and Blu-ray disc, where disks usually reproduce data magnetically, while discs reproduce data optically with lasers. Combinations of the above should also be included within the scope of computer-readable media.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Software Systems (AREA)
  • Computer Hardware Design (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

According to various aspects, a source device may cast a remote screen associated therewith to a sink device, wherein the remote screen may be shown on a local screen at the sink device. Furthermore, the sink device may support a Human Interface Device (HID) to back-control the source device and a helper application configured to retrieve at least a resolution and a current orientation associated with the remote screen and to aid in automatically calibrating the remote screen. Accordingly, in response to capturing one or more local touch messages via the HID, coordinates associated with the local touch messages in the local screen may be converted to coordinates in the remote screen based on the current orientation and the resolution associated with the calibrated remote screen. Furthermore, HID input reports may be periodically transmitted from the sink device to the source device based on various pointer velocity and latency parameters.

Description

HUMAN INTERFACE DEVICE (HID) AND AUTOMATIC CALIBRATION FOR BACK-CONTROLLING A SOURCE DEVICE DURING A REMOTE SCREEN CASTING SESSION TECHNICAL FIELD
The various aspects and embodiments described herein relate to back-controlling a source device through one or more user inputs applied at a sink device, and further to automatically calibrating displays at the source device and the sink device and rates at which the user inputs are sampled at the sink device and sent to the source device to improve overall user experience during a media share session.
BACKGROUND
Using a mobile phone or other handheld computing device while operating a moving vehicle is both impractical and dangerous. Accordingly, to improve safety and convenience, handheld computing devices are often linked to an on-board hands-free device installed within the vehicle. For example, in various use cases, the on-board hands-free device may be integrated into the vehicle hardware, or the on-board hands-free device may be an external device (e.g., a device that can be mounted to the vehicle dashboard). Generally, smartphones can cast a screen associated therewith into the vehicle kit via a suitable communication protocol, such as High Definition Multimedia Interface (HDMI), Mobile High-Definition Link (MHL), Miracast, and/or others. However, HDMI and MHL do not enable the vehicle kit to back-control the smartphone. Furthermore, although Miracast defines a user input back channel (UIBC) to provide back-control, many smartphone devices available on the market do not support UIBC, which is limited to operating over Wi-Fi.
Furthermore, once the smartphone has cast the screen associated therewith into the vehicle kit via a suitable communication protocol, the vehicle kit may use an input device such as a touchscreen or a Bluetooth Human Interface Device (HID) (e.g., a mouse, trackball, touchpad, etc.) to back-control the smartphone. However, this traditional method requires the user to concurrently watch the smartphone screen while the user controls the smartphone, which is impractical and also dangerous for the driver, thus somewhat defeating the point of linking the smartphone to the on-board hands-free device. An additional problem with accurately back-controlling the smartphone using either a touch screen or a Bluetooth HID is that the remote screen cast from the smartphone to the vehicle kit needs to be calibrated. However, calibration is typically manual and requires the user to calibrate the remote screen for every smartphone connected to the vehicle kit in both landscape and portrait modes, which can be time consuming and inaccurate. Further still, depending on the rate at which the user applies inputs at the vehicle kit in order to back-control the smartphone, the smartphone may be unable to process or render the inputs correctly. For example, depending on the pointer velocity at the smartphone, the inputs applied at the vehicle kit may float when shown at the smartphone, potentially leading to inaccurate and unrecognizable results.
Accordingly, there exists a need to provide improved mechanisms to back-control a source device (e.g., a smartphone) at a sink device (e.g., a vehicle kit) and to improve user experience when displaying a remote screen at a sink device.
SUMMARY
The following presents a simplified summary relating to one or more aspects and/or embodiments disclosed herein. As such, the following summary should not be considered an extensive overview relating to all contemplated aspects and/or embodiments, nor should the following summary be regarded to identify key or critical elements relating to all contemplated aspects and/or embodiments or to delineate the scope associated with any particular aspect and/or embodiment. Accordingly, the following summary has the sole purpose to present certain concepts relating to one or more aspects and/or embodiments relating to the mechanisms disclosed herein in a simplified form to precede the detailed description presented below.
According to various aspects, to address the above-mentioned deficiencies with respect to providing the ability to back-control a source device that casts a screen to a sink device, the sink device may implement a Bluetooth Human Interface Device (HID) (e.g., an HID mouse with consumer control) to locally back-control the source device, wherein the sink device may further implement a Bluetooth Serial Port Profile (SPP) application to aid the Bluetooth HID in back-controlling the source device. As such, rather than having to concurrently view the screen associated with the source device in order to control the source device, the user (e.g., a driver where the sink device is an integrated vehicle kit displaying a screen casted from a smartphone that corresponds to the source device) can instead simply view the screen displayed on the sink device and  use the Bluetooth HID implemented on the sink device to back-control the source device. Accordingly, the Bluetooth HID and the SPP application can be implemented to back-control the source device via user inputs applied at the sink device, with the sink device implementing a Bluetooth HID role and a SPP initiator role and the source device implementing a Bluetooth HID host role and a SPP acceptor role, which can be especially advantageous in vehicular platforms. Furthermore, according to various aspects, the screen calibration issues mentioned above may be addressed through an automatic calibration algorithm that can automatically calibrate the remote screen cast from the source device in both landscape and portrait orientations. After the calibration has been performed once, the calibration data can be stored and subsequently retrieved anytime that the same source device is linked to the sink device. 
As such, the automatic calibration algorithm may allow any source device to be connected to the sink device without the user needing to manually (and unnecessarily) re-calibrate the source device when subsequently connected to the sink device. Furthermore, the automatic calibration algorithm may be more accurate than any manual user-performed calibration.
According to various aspects, to further address the above-mentioned problems with respect to calibrating a local screen at a sink device with a screen at a source device, a remote screen calibration algorithm can be used to automatically determine an output rectangle when the screen at the source device is cast into the local screen at the sink device in landscape and portrait orientations. More particularly, an application at the sink device may transmit a screen calibration message to the source device, wherein the screen calibration message may include a request to show a calibration screen at the source device in either a landscape or portrait orientation. For example, in various embodiments, the calibration screen may comprise a full-white screen or another suitable screen configured to display a unique color such that valid and invalid regions can be suitably differentiated. Accordingly, the source device may then display the calibration screen and information indicating a resolution associated with the screen at the source device (e.g., a height and width, which may be expressed in pixels) . The sink device may then calculate the output rectangle in the local screen according to the calibration screen displayed at the source device and cast to the sink device, wherein the output rectangle may comprise a valid region limited according to four (4) vertices. As such, the sink device may subsequently convert coordinates in the local screen to coordinates in the remote screen at the source device according to the resolution  associated with the remote screen and the coordinates associated with the vertices in the output rectangle. The above-mentioned calibration may then be repeated in the other orientation (e.g., landscape or portrait, which can be calibrated in any suitable order) . 
The results from the remote screen calibration algorithm may then be stored in the sink device to be used in future screen sharing sessions between the source device and the sink device. Therefore, once the remote screen at the source device has been calibrated for use at the sink device, there may be no need to perform any further calibration with respect to that source device. Furthermore, the source device and the sink device can carry out the above-described remote screen calibration algorithm substantially autonomously without requiring any user input.
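The output rectangle calculation described above can be illustrated with the following minimal sketch, which is not the patented implementation: it assumes the sink device can read the local framebuffer as a two-dimensional array of pixel values while the source device displays the calibration screen, and the function name, pixel representation, and single calibration color are all illustrative assumptions.

```python
def find_output_rectangle(frame, calibration_color):
    """Locate the valid region (output rectangle) in the local screen.

    `frame` is a row-major sequence of pixel rows captured from the local
    display while the source device shows the calibration screen; pixels
    matching `calibration_color` mark the valid region.
    Returns (left, top, right, bottom) in local-screen pixels, or None
    if the calibration color was not found.
    """
    top = left = None
    bottom = right = -1
    for y, row in enumerate(frame):
        for x, pixel in enumerate(row):
            if pixel == calibration_color:
                if top is None:
                    top = y          # first matching row seen
                bottom = y           # last matching row seen so far
                left = x if left is None else min(left, x)
                right = max(right, x)
    if top is None:
        return None
    return (left, top, right, bottom)
```

The four returned values bound the valid region, corresponding to the four vertices of the output rectangle; pixels outside those bounds can be treated as the invalid (letterbox) region.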
According to various aspects, to further address the above-mentioned problems with respect to latency in a source device rendering user inputs that are applied at a sink device to back-control the source device, the user inputs applied at the sink device can be sampled and reported back to the source device in a manner that depends on a pointer velocity at the source device, among various other parameters. For example, the time to send a packet from the sink device to the source device to report one or more user inputs applied at the sink device may be considered in combination with the user input sample rate employed in the sink device to prevent the user from perceiving the latency when using the sink device to back-control the source device. Furthermore, the pointer velocity at the source device and the low velocity threshold associated with the pointer at the source device may be considered to avoid the appearance of a floating pointer at the source device (e.g., when user inputs are quickly applied) . Accordingly, in various embodiments, the sink device may transmit a packet to report a user input to the source device in response to determining that the pointer velocity at the source device is less than the low velocity threshold at the source device, wherein the pointer velocity at the source device may depend on a distance between local points at which successive user inputs are captured at the sink device and a time to send the packet to the source device. Otherwise, the sink device may send the user input report to the source device upon detecting a touch down event, upon detecting a touch move event that satisfies certain sampling criteria, and/or upon detecting a touch up event. 
Furthermore, in the case where the sink device sends the user input report upon detecting the touch up event, the sink device may further send a user input report associated with the last touch move event preceding the touch up event to ensure that the touch up event is processed  according to the correct coordinates relative to the most recently sampled user input.
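The reporting decision described above can be sketched as follows. The event names ("down", "move", "up"), the parameter units, and the minimum-interval sampling criterion for move events are illustrative assumptions rather than details fixed by the disclosure; the velocity estimate follows the description above, depending on the distance between successive local touch points and the time to send a report packet to the source device.

```python
def pointer_velocity(distance_px, send_time_s):
    """Estimated pointer velocity at the source device, derived from the
    distance between successive local touch points and the time needed
    to send a report packet to the source device."""
    return distance_px / send_time_s if send_time_s > 0 else float("inf")

def should_send(event, distance_px, send_time_s, low_velocity_threshold,
                time_since_last_s, sample_interval_s):
    """Decide whether a local touch event should be reported now."""
    if event in ("down", "up"):
        # Touch down and touch up events are always reported
        return True
    if pointer_velocity(distance_px, send_time_s) < low_velocity_threshold:
        # Slow movement will not trigger pointer acceleration, so report it
        return True
    # Otherwise, rate-limit move events to the sampling interval
    return time_since_last_s >= sample_interval_s
```

In a fuller implementation, the caller would also retain the last unsent move event so it can be reported immediately before a touch up event, as described above.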
According to various aspects, a source device may therefore cast a remote screen associated therewith to a sink device (e.g., via a wired or wireless connection) , wherein the remote screen may be shown on a local screen at the sink device. Furthermore, the sink device may support a Human Interface Device (HID) to back-control the source device and a helper application configured to wirelessly communicate with the source device to retrieve at least a resolution and a current orientation associated with the remote screen, to aid a process to automatically calibrate the remote screen, and to obtain other relevant parameters associated with the source device and/or the remote screen at the source device (e.g., pointer movement units, pointer velocity control parameters, etc. ) . Accordingly, in response to capturing one or more local touch messages via the HID, coordinates associated with the local touch messages in the local screen may be converted to coordinates in the remote screen based on the current orientation and the resolution associated with the calibrated remote screen. Furthermore, HID input reports may be periodically transmitted from the sink device to the source device based on various pointer velocity and latency parameters (e.g., to avoid triggering pointer acceleration that may result in a floating pointer at the source device and to avoid conditions that may result in user-perceptible latency and/or recognition difficulties with respect to rendering local touch inputs applied at the sink device in the remote screen at the source device) .
According to various aspects, a method for back-controlling a source device may comprise determining, at a sink device, at least a current orientation and a resolution associated with a remote screen casted from the source device, wherein the remote screen casted from the source device is displayed on a local screen at the sink device, sending, to the source device, a message to request that the source device display a calibration screen configured to fill the remote screen with a unique color, automatically calculating, at the sink device, an output rectangle associated with the current orientation at the remote screen casted from the source device, wherein the output rectangle comprises a region in the local screen filled with the unique color, capturing a local touch message at the sink device, the local touch message associated with first coordinates in the local screen at the sink device, and converting the first coordinates in the local screen to second coordinates in the remote screen to back-control the source device via the local touch message captured at the sink device, wherein the first  coordinates are converted to the second coordinates based on the resolution associated with the remote screen and the calculated output rectangle associated with the current orientation at the remote screen.
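The coordinate conversion recited above may be sketched as follows. The `(left, top, right, bottom)` layout of the output rectangle and the function name are illustrative assumptions, not a definitive implementation of the claimed method:

```python
def convert_local_to_remote(local_x, local_y, output_rect, remote_resolution):
    """Map a touch point captured on the local (sink) screen into the
    coordinate space of the remote (source) screen.

    output_rect: (left, top, right, bottom) of the calibrated valid region
    in local-screen coordinates; remote_resolution: (width, height) of the
    remote screen in pixels. Returns None when the touch falls outside the
    valid region (e.g., in a letterboxed area of the local screen)."""
    left, top, right, bottom = output_rect
    if not (left <= local_x <= right and top <= local_y <= bottom):
        return None  # invalid region; no back-control input is generated
    remote_w, remote_h = remote_resolution
    remote_x = (local_x - left) * remote_w / (right - left)
    remote_y = (local_y - top) * remote_h / (bottom - top)
    return (remote_x, remote_y)
```

For example, a touch at the bottom-right vertex of the output rectangle would map to the bottom-right corner of the remote screen at its native resolution.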
According to various aspects, a sink device configured to back-control a source device may comprise a local screen configured to display a remote screen casted from the source device, a transceiver configured to communicate with the source device to retrieve at least a current orientation and a resolution associated with the remote screen casted from the source device and to request that the source device display a calibration screen configured to fill the remote screen with a unique color, and one or more processors configured to automatically calculate an output rectangle associated with the current orientation at the remote screen casted from the source device, wherein the output rectangle comprises a region in the local screen filled with the unique color, capture a local touch message at the sink device, the local touch message associated with first coordinates in the local screen at the sink device, and convert the first coordinates in the local screen to second coordinates in the remote screen based on the resolution associated with the remote screen and the calculated output rectangle associated with the current orientation at the remote screen.
According to various aspects, an apparatus may comprise means for determining at least a current orientation and a resolution associated with a remote screen casted from a source device, wherein the remote screen casted from the source device is displayed on a local screen at the apparatus, means for sending, to the source device, a message to request that the source device display a calibration screen configured to fill the remote screen with a unique color, means for automatically calculating an output rectangle associated with the current orientation at the remote screen casted from the source device, wherein the output rectangle comprises a region in the local screen filled with the unique color, means for capturing a local touch message, the local touch message associated with first coordinates in the local screen at the apparatus, and means for converting the first coordinates in the local screen to second coordinates in the remote screen based on the resolution associated with the remote screen and the calculated output rectangle associated with the current orientation at the remote screen.
According to various aspects, a method for back-controlling a source device may additionally (or alternatively) comprise implementing, at a sink device, a human interface device (HID) to back-control a source device casting a remote screen into a  local screen at the sink device, determining, at the sink device, at least a low threshold velocity above which the source device accelerates a pointer in the remote screen, sampling, at the sink device, a local touch message applied via the HID to the local screen according to a native sample rate, and transmitting, to the source device, a message to report the local touch message at the native sample rate in response to the local touch message comprising a press, a release, or a move that satisfies a predetermined condition that depends, at least in part, on the low threshold velocity above which the source device accelerates the pointer in the remote screen.
According to various aspects, a sink device configured to back-control a source device may additionally (or alternatively) comprise a local screen configured to display a remote screen casted from the source device, a transceiver configured to communicate with the source device to determine at least a low threshold velocity above which the source device accelerates a pointer in the remote screen, and one or more processors configured to implement a human interface device (HID) to back-control the source device, sample a local touch message applied via the HID to the local screen according to a native sample rate, and cause the transceiver to transmit a message to report the local touch message to the source device at the native sample rate in response to the local touch message comprising a press, a release, or a move that satisfies a predetermined condition that depends, at least in part, on the low threshold velocity above which the source device accelerates the pointer in the remote screen.
According to various aspects, an apparatus may comprise means for implementing a human interface device (HID) to back-control a source device casting a remote screen into a local screen at the apparatus, means for determining at least a low threshold velocity above which the source device accelerates a pointer in the remote screen, means for sampling a local touch message applied via the HID to the local screen according to a native sample rate, and means for transmitting, to the source device, a message to report the local touch message at the native sample rate in response to the local touch message comprising a press, a release, or a move that satisfies a predetermined condition that depends, at least in part, on the low threshold velocity above which the source device accelerates the pointer in the remote screen.
Other objects and advantages associated with the aspects and embodiments disclosed herein will be apparent to those skilled in the art based on the accompanying drawings and detailed description.
BRIEF DESCRIPTION OF THE DRAWINGS
A more complete appreciation of the various aspects and embodiments described herein and many attendant advantages thereof will be readily obtained as the same becomes better understood by reference to the following detailed description when considered in connection with the accompanying drawings which are presented solely for illustration and not limitation, and in which:
FIG. 1 illustrates an exemplary system in which a source device may cast a screen to a sink device and user inputs can be applied at the sink device to back-control the source device, according to various aspects.
FIG. 2 illustrates an exemplary high-level system architecture associated with a sink device that may display a screen casted from a source device and back-control the source device via user inputs applied at the sink device, according to various aspects.
FIG. 3 illustrates relationships between an Open Systems Interconnect (OSI) seven-layer model and a Bluetooth protocol stack that can be used to back-control a source device via user inputs applied at a sink device, according to various aspects.
FIG. 4 illustrates an exemplary method that can be implemented at a sink device to automatically calibrate a screen at a source device, according to various aspects.
FIGS. 5A-5B illustrate exemplary calibration screens that can be used to automatically calibrate a screen at a source device in a landscape orientation according to the method shown in FIG. 4, according to various aspects.
FIGS. 6A-6B illustrate exemplary calibration screens that can be used to automatically calibrate a screen at a source device in a portrait orientation according to the method shown in FIG. 4, according to various aspects.
FIG. 7 illustrates an exemplary method and timing diagram that can be used to capture and transmit user inputs applied at a sink device to a source device to thereby back-control the source device, according to various aspects.
FIG. 8 illustrates an exemplary call flow that can be implemented at a source device and a sink device to back-control the source device via user inputs applied at the sink device and to automatically calibrate a screen that the source device casts to the sink device, according to various aspects.
FIG. 9 illustrates an exemplary device that can implement the various aspects and embodiments described herein.
FIG. 10 illustrates an exemplary architecture that can be implemented to enable a source device to cast a screen to a sink device and to back-control the source device via user inputs applied at the sink device, according to various aspects.
DETAILED DESCRIPTION
Various aspects and embodiments are disclosed in the following description and related drawings to show specific examples relating to exemplary aspects and embodiments. Alternate aspects and embodiments will be apparent to those skilled in the pertinent art upon reading this disclosure, and may be constructed and practiced without departing from the scope or spirit of the disclosure. Additionally, well-known elements will not be described in detail or may be omitted so as to not obscure the relevant details of the aspects and embodiments disclosed herein.
The word “exemplary” is used herein to mean “serving as an example, instance, or illustration. ” Any embodiment described herein as “exemplary” is not necessarily to be construed as preferred or advantageous over other embodiments. Likewise, the term “embodiments” does not require that all embodiments include the discussed feature, advantage, or mode of operation.
The terminology used herein describes particular embodiments only and should not be construed to limit any embodiments disclosed herein. As used herein, the singular forms “a, ” “an, ” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. Those skilled in the art will further understand that the terms “comprises, ” “comprising, ” “includes, ” and/or “including, ” as used herein, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
Further, various aspects and/or embodiments may be described in terms of sequences of actions to be performed by, for example, elements of a computing device. Those skilled in the art will recognize that various actions described herein can be performed by specific circuits (e.g., an application specific integrated circuit (ASIC)) , by program instructions being executed by one or more processors, or by a combination of both. Additionally, these sequences of actions described herein can be considered to be embodied entirely within any form of non-transitory computer-readable medium having stored thereon a corresponding set of computer instructions that upon execution would cause an associated processor to perform the functionality described herein. Thus, the various aspects described herein may be embodied in a number of different forms, all of which have been contemplated to be within the scope of the claimed subject matter. In addition, for each of the aspects described herein, the corresponding form of any such aspects may be described herein as, for example, “logic configured to” and/or other structural components configured to perform the described action.
According to various aspects, a source device and a sink device may establish a communication link to enable the source device to cast a screen associated therewith to the sink device such that the screen at the source device is further displayed at the sink device. As such, according to various embodiments, the source device and the sink device may implement one or more communication technologies to enable the remote screen casting from the source device to the sink device and to back-control the source device via one or more user inputs applied at the sink device. For example, in various embodiments, the sink device may implement a Bluetooth Human Interface Device (HID) (e.g., an HID mouse with consumer control) to locally back-control the source device in addition to a Bluetooth Serial Port Profile (SPP) application to aid the Bluetooth HID in back-controlling the source device. As such, rather than having to concurrently view the screen associated with the source device in order to control the source device, the user can instead simply view the screen displayed on the sink device and use the Bluetooth HID implemented on the sink device to back-control the source device. Accordingly, the Bluetooth HID and the SPP application can be implemented to back-control the source device via user inputs applied at the sink device, with the sink device implementing a Bluetooth HID role and a SPP initiator role, while the source device may implement a Bluetooth HID host role and a SPP acceptor role. Furthermore, according to various aspects, the sink device may be configured to automatically calibrate the screen at the source device in both landscape and portrait orientations such that after the calibration has been performed once, results from the calibration can be stored in a suitable memory (e.g., a local memory, a memory accessible via a network, etc. 
) and the sink device may subsequently retrieve the calibration results anytime that the same source device is linked to the sink device. As such, the automatic calibration algorithm may allow any source device to be connected to the sink device without the user needing to manually (and unnecessarily) re-calibrate the source device when subsequently connected to the sink device, while providing more accurate results than any manual user-performed calibration.
According to various aspects, to automatically calibrate a local screen at the sink device with a remote screen at the source device, a remote screen calibration algorithm can be used to automatically determine an output rectangle associated with the screen casted from the source device to the sink device in landscape and portrait orientations. More particularly, an application at the sink device may transmit a screen calibration message to the source device, wherein the screen calibration message may include a request to show a calibration screen at the source device in either a landscape orientation or a portrait orientation. For example, in various embodiments, the calibration screen may comprise a full-white screen or another suitable screen configured to display a unique color such that valid and invalid regions can be suitably differentiated. Accordingly, the source device may then display the calibration screen and information indicating a resolution associated with the screen at the source device (e.g., a height and width, which may be expressed in pixels) . The sink device may then calculate the output rectangle in the local screen according to the calibration screen displayed at the source device and cast to the sink device, wherein the output rectangle may comprise a valid region limited according to four (4) vertices. As such, the sink device may subsequently convert coordinates in the local screen to coordinates in the remote screen at the source device according to the resolution associated with the remote screen and the coordinates associated with the vertices in the output rectangle. The above-mentioned calibration may then be repeated in the other orientation (e.g., landscape or portrait, which can be calibrated in any suitable order) . The results from the remote screen calibration algorithm may then be stored in the sink device to be used in future screen sharing sessions between the source device and the sink device. 
Therefore, once the remote screen at the source device has been calibrated for use at the sink device, there may be no need to perform any further calibration with respect to that source device. Furthermore, the source device and the sink device can carry out the above-described remote screen calibration algorithm substantially autonomously without requiring any user input.
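One possible way for the sink device to compute the output rectangle from a captured frame of the unique-color calibration screen is a simple bounding-box scan, sketched below. The frame representation (a two-dimensional array of pixel values) and the function name are assumptions for illustration; an actual implementation could equally operate on hardware-captured frame buffers:

```python
def find_output_rectangle(frame, calibration_color):
    """Locate the valid output region in a captured local-screen frame by
    computing the bounding box of all pixels matching the unique
    calibration color (e.g., full white). `frame` is a 2-D list of pixel
    values indexed as frame[y][x]; returns (left, top, right, bottom) in
    local-screen coordinates, or None if the color is absent."""
    xs, ys = [], []
    for y, row in enumerate(frame):
        for x, pixel in enumerate(row):
            if pixel == calibration_color:
                xs.append(x)
                ys.append(y)
    if not xs:
        return None  # calibration screen not visible in this frame
    return (min(xs), min(ys), max(xs), max(ys))
```

The four values returned correspond to the four vertices that limit the valid region; repeating the scan in the other orientation yields the second output rectangle, and both results can then be stored for reuse in future sessions with the same source device.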
According to various aspects, the sink device may further capture the user inputs that are applied at the sink device and transmit messages to report the captured user inputs to the source device according to a latency associated with a wireless link between the source device and the sink device and various other parameters relating to a  velocity associated with the captured user inputs (e.g., a pointer velocity at the source device, a distance that a pointer has been moved between successive inputs, etc. ) . For example, a time to send a packet from the sink device to the source device in order to report one or more user inputs applied at the sink device may be considered in combination with the user input sample rate employed in the sink device to prevent the user from perceiving the latency when using the sink device to back-control the source device. Furthermore, the pointer velocity at the source device and the low velocity threshold associated with the pointer at the source device may be considered to avoid the appearance of a floating pointer at the source device (e.g., when user inputs are quickly applied) . Accordingly, in various embodiments, the sink device may transmit a packet to report a user input to the source device in response to determining that the pointer velocity at the source device is less than the low velocity threshold at the source device, wherein the pointer velocity at the source device may depend on a distance between local points at which successive user inputs are captured at the sink device and a time to send the packet to the source device. Otherwise, the sink device may send the user input report to the source device upon detecting a touch down event, upon detecting a touch move event that satisfies certain sampling criteria, and/or upon detecting a touch up event. 
Furthermore, in the case where the sink device sends the user input report upon detecting the touch up event, the sink device may further send a user input report associated with the last touch move event preceding the touch up event to ensure that the touch up event is processed according to the correct coordinates relative to the most recently sampled user input.
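The event-driven reporting rules above, including re-sending the last touch move preceding a touch up, can be sketched as follows. The class name, callback names, and the pluggable `move_filter` sampling criterion are illustrative assumptions rather than a definitive implementation:

```python
class TouchReporter:
    """Illustrative sketch: always report touch-down and touch-up events;
    report a touch-move only when it satisfies the sampling criteria; and,
    immediately before a touch-up, flush the last unreported move so the
    release is processed at the correct coordinates."""

    def __init__(self, move_filter):
        self.move_filter = move_filter  # sampling criterion for move events
        self.pending_move = None        # last move not yet reported
        self.sent = []                  # reports "transmitted" to the source

    def on_touch_down(self, point):
        self.sent.append(("down", point))

    def on_touch_move(self, point):
        if self.move_filter(point):
            self.sent.append(("move", point))
            self.pending_move = None
        else:
            self.pending_move = point  # hold until flushed or superseded

    def on_touch_up(self, point):
        if self.pending_move is not None:
            # re-send the most recently sampled (but unreported) move so
            # the touch-up coordinates line up with the latest input
            self.sent.append(("move", self.pending_move))
            self.pending_move = None
        self.sent.append(("up", point))
```

In this sketch, even if every intermediate move is suppressed by the sampling criterion, the source device still receives the final move before the release, preserving gesture recognition at the correct coordinates.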
More particularly, with specific reference to FIG. 1, an exemplary system 100 is illustrated in which a source device 120 may cast a screen to a sink device 160 and user inputs can be applied at the sink device 160 to back-control the source device 120. In various embodiments, as shown in FIG. 1 and as will be described in further detail herein, the source device 120 and the sink device 160 may communicate with one another via one or more communication channels 150, wherein the one or more communication channels 150 may be used to cast the screen associated with the source device 120 to the sink device 160, to back-control the source device 120 via the user inputs applied at the sink device 160, to automatically calibrate the screen casted from the source device 120 to the sink device 160, and to determine various parameters that may be used to control when messages to report the user inputs applied at the sink device 160 are transmitted to the source device 120.
According to various aspects, in the exemplary system 100 shown in FIG. 1, the source device 120 may include a memory 122, a display 124, a speaker 126, an audio and/or a video (A/V) encoder 128, an audio and/or a video (A/V) control module 130, and a transceiver 132 (e.g., a transmitter/receiver (TX/RX) unit) . Furthermore, the sink device 160 may include a transceiver 162 (e.g., a transmitter/receiver (TX/RX) unit) , an audio and/or a video (A/V) decoder 164, a display 166, a speaker 168, one or more input devices 170, and a user input processing module (UIPM) 172. However, those skilled in the art will appreciate that the illustrated components shown in FIG. 1 represent merely one example configuration associated with the system 100, whereby other configurations may include fewer and/or more components than illustrated in FIG. 1. In the example shown in FIG. 1, the source device 120 can display a visual portion associated with multimedia data on the local display 124 and can output an audio portion associated with the multimedia data using the speaker 126. The multimedia data may be stored locally on the memory 122, accessed from an external storage medium such as a file server, hard drive, external memory, Blu-ray disc, DVD, or other physical storage medium, or may be streamed to the source device 120 via a network connection such as via the Internet. In some instances, the source device 120 may include a camera and/or a microphone (not explicitly shown in FIG. 1) that can capture multimedia data in real-time. Multimedia data may include content such as movies, television shows, music, or the like, as well as real-time content generated at the source device 120. For example, one or more applications running on the source device 120 may produce and/or capture the real-time multimedia content generated at the source device 120 (e.g., video data captured during a video telephony session) . 
In various instances, the real-time content may include a video frame that includes one or more user input options available to select, video frames that combine multiple different content types (e.g., a video frame from a movie or television program that has user input options overlaid on the video frame) , and/or other suitable configurations.
According to various embodiments, in addition to (and/or alternatively to) rendering multimedia data via the local display 124 and the local speaker 126, the A/V encoder 128 at the source device 120 can encode the multimedia data and the transceiver 132 can be used to transmit the encoded data over the communication channel 150 to the sink device 160. The transceiver 162 associated with the sink device  160 may therefore receive the encoded data and the A/V decoder 164 associated with the sink device 160 may decode the encoded data and output the decoded data on the display 166 and/or the speaker 168. In this manner, the audio and/or video data rendered via the display 124 and/or the speaker 126 at the source device 120 can be simultaneously rendered via the display 166 and/or the speaker 168 at the sink device 160. Alternatively, the audio and/or video data may only be rendered via the display 166 and/or the speaker 168 at the sink device 160. In either case, the audio and/or video data may be arranged in one or more frames, and any audio frames may be time-synchronized with the video frames when rendered. In various embodiments, the communication channel 150 used to transmit the multimedia data from the source device 120 to the sink device 160 may comprise a wired link, such as a wired High-Definition Multimedia Interface (HDMI) link that can be used to transfer uncompressed video data and compressed or uncompressed digital audio from the source device 120 to the sink device 160, a wired link that implements the Mobile High-Definition Link (MHL) standard that allows consumers to connect mobile phones, tablets, and/or other portable consumer electronics to HD-capable sink devices (e.g., via a five-pin or an eleven-pin Micro-USB-to-HDMI adapter, an MHL passive cable, a USB Type-C connector, a reversible superMHL connector, etc. ) , or another suitable wired interface. 
Alternatively, in various embodiments, the communication channel 150 used to transmit the multimedia data from the source device 120 to the sink device 160 may comprise a wireless link configured in accordance with the Wi-Fi Display (also known as Miracast) technical specification, which defines a wireless peer-to-peer protocol that can be used to connect the source device 120 to the sink device 160 and thereby transfer high-definition video and surround sound audio over a Wi-Fi Direct connection such that the source device 120 and the sink device 160 can communicate directly without using an intermediary device (e.g., a wireless access point) . The source device 120 and the sink device 160 may also establish a tunneled direct link setup (TDLS) to avoid or reduce network congestion. In general, Wi-Fi Direct and TDLS may be intended to establish relatively short-distance communication sessions, which in the present context may refer to a range less than approximately seventy (70) meters, although in a noisy or obstructed environment the distance between devices may be even shorter, such as approximately thirty-five (35) meters or less or generally within a vehicle interior.
In various embodiments, the A/V encoder 128 and the A/V decoder 164 may implement one or more audio and video compression standards, such as the ITU-T H.264 standard, alternatively referred to as MPEG-4, Part 10, Advanced Video Coding (AVC) , the high efficiency video coding (HEVC) standard, or the Display Stream Compression (DSC) standard. Many other proprietary and/or standardized compression techniques may also be used. Generally speaking, the A/V decoder 164 may be configured to perform reciprocal coding operations with respect to the A/V encoder 128. Furthermore, although not explicitly shown in FIG. 1, the A/V encoder 128 and the A/V decoder 164 may each be integrated with an audio encoder and decoder, and may include appropriate MUX-DEMUX units, or other hardware and software, to encode both audio and video in a common data stream or separate data streams. In certain use cases, multimedia data may be stored on or received at the source device 120 in an encoded form and/or transferred over the communication channel 150 in an uncompressed form (e.g., when using a wired HDMI/MHL link) , wherein the multimedia data may not require further compression at the A/V encoder 128 in such use cases. Furthermore, in various embodiments, the communication channel 150 may carry audio payload data and video payload data separately or alternatively carry audio and video payload data in a common data stream. If applicable, the MUX-DEMUX units mentioned above may conform to the ITU-T H.223 multiplexer protocol, or other protocols such as the user datagram protocol (UDP) . The A/V encoder 128 and the A/V decoder 164 each may be implemented as one or more microprocessors, digital signal processors (DSPs) , application specific integrated circuits (ASICs) , field programmable gate arrays (FPGAs) , discrete logic, software, hardware, firmware, or any combinations thereof. 
The A/V encoder 128 and the A/V decoder 164 may each be included in one or more encoders or decoders that are integrated in a combined encoder/decoder (CODEC) . Accordingly, the source device 120 and the sink device 160 may each comprise specialized machines configured to implement the various aspects described herein.
For example, in various embodiments, the system 100 illustrated in FIG. 1 may be implemented in a vehicular context, whereby the sink device 160 may represent a vehicle head unit (or integrated car kit) that can integrate one or more user interface devices with the source device 120. In particular, the user interface devices may include one or more input devices 170 configured to receive input from a user through tactile, audio, or video feedback, wherein example input devices 170 may include a presence-sensitive and/or a touch-sensitive display, a mouse, a keyboard, a voice responsive  system, a video camera, a microphone, steering wheel button (s) or knob (s) or other controls in the vehicle that can be pushed or rotated (e.g., to increase or decrease volume) , and/or any other input device 170 that can capture a command from a user. As such, any references herein to a “user” associated with the sink device 160 and/or the source device 120 may include a driver or passenger in an automobile that includes the sink device 160. The user interface devices may also include one or more output devices configured to provide output to a user using tactile, audio, and/or video stimuli, wherein example output devices may include the speaker 168 and the display 166, wherein the latter may be a presence-sensitive display, a touch-sensitive display, a liquid crystal display (LCD) , etc. However, those skilled in the art will appreciate that the output devices may include any suitable device that can convert a signal into an appropriate form that humans or machines can understand.
According to various aspects, in addition to providing a wired and/or wireless link to enable the source device 120 to cast information shown on the display 124 to the sink device 160, which then shows the information on the local display 166, the communication channel 150 may include one or more Bluetooth connections that can be used to back-control the source device 120 via user inputs applied through the one or more input devices 170 at the sink device 160. As such, in various embodiments, the source device 120 and the sink device 160 can use the communication channel 150 to enable the source device 120 to cast a screen associated therewith to the sink device 160 (e.g., via a wired HDMI/MHL link, a wireless Wi-Fi Display link, etc. ) and further to enable the sink device 160 to automatically calibrate the local display 166 with respect to the particular display 124 at the source device 120 and to capture user inputs at the input devices 170 and format the user inputs into a data packet structure that can be transmitted to and interpreted at the source device 120 to provide the user with the ability to seamlessly back-control the source device 120 through interaction with the sink device 160. Accordingly, as will be described in further detail below, the sink device 160 may include a user input processing module (UIPM) 172 that can capture the user inputs received at the one or more input devices 170 and report the captured user inputs to the source device 120 via the communication channel 150. As such, the user may control multimedia data transmitted from the source device 120 to the sink device 160, launch and control applications on the source device 120, and/or otherwise perform commands that may be processed at the source device 120 remotely and without directly  interacting with the source device 120 while having an experience that looks and feels as though the user is directly and locally interacting with the source device 120.
For example, according to various aspects, FIG. 2 illustrates an exemplary high-level system architecture 200 associated with a sink device that may display a screen casted from a source device and back-control the source device via user inputs applied at the sink device. As shown in FIG. 2, the sink device may comprise a back-control application 220, which may generally execute in a user space associated with a particular operating system (e.g., WinCE, Linux, etc. ) . In various embodiments, the back-control application 220 may support a Human Interface Device (HID) 221 to back-control the source device, wherein the HID 221 may be configured to capture the user inputs applied at the sink device. For example, a HID mouse may comprise a handheld, button-activated input device with two axes and one, two, or three buttons, wherein the HID mouse may direct an indicator to move correspondingly about a visual display screen when rolled along a flat surface, thereby allowing a user to move the indicator freely in select operations or to manipulate text, graphics, and/or other information output on the visual display screen. A HID touch-screen may refer to a digitizer having an integrated display that allows a finger or stylus to be used for pointing, where some touch-screen technologies can differentiate between a finger touch and a stylus touch. Typically, the HID touch-screen supports an absolute x-axis and an absolute y-axis rather than relative x-y axes. A HID-compliant consumer control device may refer to a general consumer control device. For example, many smartphones include additional virtual keys that are not displayed on a screen (e.g., a “Home” button, a “Lock” button, volume control buttons, etc. ) . 
Accordingly, because a HID mouse may not be fitted to support certain controls such as virtual keys and corresponding virtual keys are not displayed or otherwise casted to the sink device, the HID consumer control device can be configured to provide such virtual keys.
According to various aspects, in addition to supporting the HID 221 to enable back-controlling the source device, the back-control application 220 may support a Serial Port Profile (SPP) application 223 to retrieve information associated with a remote screen at the source device (e.g., a resolution and orientation associated with the remote screen casted from the source device to the sink device) . For example, in various embodiments, the SPP application 223 may establish a virtual serial port that can enable communication between the sink device and the source device over an  emulated serial cable and thereby allow the back-control application 220 to request and receive the information associated with the remote screen at the source device over the emulated serial port connection. The remote screen information requested and received via the SPP application 223 may thereby allow the HID 221 to back-control the source device, as the HID 221 would be unable to suitably back-control the source device without knowing the current orientation and valid/invalid points on the remote screen at the source device.
In various embodiments, the sink device may further comprise a media casting application 225 that also runs in the user space, wherein the media casting application 225 may cast the screen associated with the source device into a local system on chip (SoC) or other suitable platform at the sink device. For example, in various embodiments, the source device may cast the screen associated therewith to the sink device via HDMI, MHL, and/or another suitable wired technology, whereby the sink device may include an HDMI chip 254 configured to receive data relating to the screen casted from the source device over a wired connection, wherein the data relating to the remote screen at the source device may then be provided to an HDMI driver 234 implemented in a kernel space. The remote screen data may then be provided to an HDMI/MHL application 227 implemented in the media casting application 225 such that the remote screen at the source device may be rendered at the sink device. Alternatively, according to various embodiments, the source device may cast the screen associated therewith to the sink device via a wireless technology such as Wi-Fi Display or Miracast 229, whereby the sink device may include a Wi-Fi chip 256 to handle wireless communication with the source device and a Wi-Fi stack 236 implemented in a kernel space to transport data between the Wi-Fi chip 256 and the media casting application 225. According to various embodiments, further detail relating to a protocol stack that can be used to wirelessly cast the remote screen at the source device to the sink device will be described in further detail below with reference to FIG. 10.
According to various aspects, the back-control application 220 may communicate with the media casting application 225 to exchange information relating to a connection status associated with the HID 221 and a link connecting the source device to the sink device (e.g., HDMI/MHL, Miracast, etc. ) . Furthermore, as appropriate, the back-control application 220 may issue a request to the media casting application 225 in order to calibrate the remote screen at the source device. Accordingly, in response to  receiving the request to calibrate the remote screen at the source device, the media casting application 225 may lock a surface associated with the local screen on the sink device so as to calibrate the remote screen at the source device in both landscape and portrait orientations. For example, as will be described in further below with reference to FIG. 5 through FIG. 7, the remote screen at the source device may be calibrated with respect to the local screen at the sink device to enable remote back-control via the HID 221, as the back-control application 220 otherwise would not know the actual output rectangle associated with the remote screen casted into the local screen at the sink device and therefore be unable to construct a valid HID input report to back-control the source device. Furthermore, according to various aspects, the surface associated with the local screen may be locked to ensure that the back-control application 220 has exclusive permission to access the local screen surface during the remote screen calibration process. In general, the back-control application 220 and the media casting application 225 may communicate via inter-process communication (IPC) within the user space, wherein the particular IPC may be operating system dependent (e.g., Windows Message Queue or Named Event can be used in Microsoft Windows CE) . 
The back-control application 220 may communicate with a Bluetooth stack 232, which may implement at least the Bluetooth HID profile and the Bluetooth Serial Port Profile (SPP) , and the Bluetooth stack 232 may further communicate with a Bluetooth chip 252 through a Host Controller Interface (HCI) transport, such as BlueCore Serial Protocol.
According to various aspects, referring to FIG. 3, relationships between an Open Systems Interconnect (OSI) seven-layer model 310 and a Bluetooth protocol stack 330 will now be described to further illustrate and explain how the various components shown in FIG. 2 may be implemented to back-control a source device via user inputs applied at a sink device. In particular, the OSI seven-layer model 310 was established to standardize information transmission between points over the Internet or other wired and/or wireless networks, wherein the OSI model 310 separates communications processes between two points in a network into seven stacked layers, with each layer adding certain functions. Each device handles a message such that a downward flow through each layer occurs at a sending endpoint and an upward flow through the layers occurs at a receiving endpoint. The programming and/or hardware that provides the seven layers in the OSI model 310 is typically a combination of device operating systems, application software, TCP/IP and/or other transport and network protocols, and  other software and hardware.
More particularly, referring to FIG. 3, the OSI model 310 includes a physical layer 312 (OSI Layer 1) used to convey a bit stream through a network at a physical level. The Institute of Electrical and Electronics Engineers (IEEE) sub-divides the physical layer 312 into the PLCP (Physical Layer Convergence Procedure) sub-layer and the PMD (Physical Medium Dependent) sub-layer. The data link layer 314 (OSI Layer 2) provides physical level synchronization, performs bit-stuffing, and furnishes transmission protocol knowledge and management, etc. The IEEE sub-divides the data link layer 314 into two further sub-layers, which comprise the Media Access Control (MAC) sub-layer to control data transfer to and from the physical layer and the Logical Link Control (LLC) sub-layer to interface with the network layer 316 (OSI Layer 3) , interpret commands, and perform error recovery. The network layer 316 (OSI Layer 3) handles data transfer across a network (e.g., routing and forwarding) in a manner independent from any media and specific network topology, the transport layer 318 (OSI Layer 4) manages end-to-end control and error-checking to multiplex data transfer across the network according to application-level reliability requirements, and the session layer 320 (OSI Layer 5) establishes, coordinates, and terminates conversations, exchanges, and dialogs between the applications to provide management and data flow control services. The presentation layer 322 (OSI Layer 6) converts incoming and outgoing data from one presentation format to another. 
For example, the presentation layer 322 may add service structure to the data units to provide data to the application layer 324 (OSI Layer 7) according to a common representation, while the application layer 324 is where communication partners are identified, quality of service (QoS) is identified, user authentication and privacy are considered, constraints on data syntax are identified, and any other functions that may be relevant to managing communications between host applications are managed.
Turning now to the Bluetooth protocol stack 330, the radio frequency (RF) layer 332 generally corresponds to the physical layer 312 in the OSI model 310, the baseband layer 334 and the link manager protocol layer 336 generally correspond to the data link layer 314, and a host controller interface (HCI) 338 separates the RF layer 332, the baseband layer 334, and the link manager protocol layer 336 from the upper layers. For example, the Physical Layer 312 in the OSI model 310 manages electrical interfaces to communications media, which includes modulation and channel coding, and therefore  covers the Bluetooth radio in the RF layer 332 (and possibly part of the baseband layer 334) , while the data link layer 314 manages transmission, framing, and error control over a particular link, which overlaps tasks performed in the link manager protocol layer 336 and the control end of the baseband layer 334 (e.g., error checking and correction) .
Above the HCI 338, the Logical Link Control and Adaptation Protocol (L2CAP) 340, RF communication (RFCOMM) 342, Synchronous Connection Oriented (SCO) Audio 350, object exchange (OBEX) 352, TCP/IP 354, Service Discovery Protocol (SDP) 344, Human Interface Device (HID) 346, and Audio/Video Distribution Transport Protocol (AVDTP) 348 functions correspond to the network layer 316, transport layer 318, and session layer 320. The presentation layer 322 and the application layer 324 in the OSI seven-layer model 310 correspond to the Bluetooth Profiles layer in the Bluetooth protocol stack 330, wherein the Bluetooth Profiles layer may include the Serial Port Profile (SPP) 356 and any additional applications or profiles 358 (e.g., the Hands-Free Profile (HFP) for voice, the Advanced Audio Distribution Profile (A2DP) for high-quality audio streaming, the Video Distribution Profile (VDP) for video streaming, etc.). Accordingly, a Bluetooth Profile may generally be considered synonymous with an “application” in the OSI seven-layer model 310. In relation to the Bluetooth HFP, the RFCOMM channel 342 comprises a communication channel named “service level connection” ( “SLC” ) that emulates a serial port used for further communication between an Audio Gateway (AG) device and a Handsfree (HF) device. For voice audio connections, such as in the Bluetooth HFP, a separate baseband link called a synchronous connection-oriented (SCO) channel carries the voice data, represented as Audio (SCO) 350 in FIG. 3. For A2DP, the audio data (unidirectional high-quality audio content) goes over AVDTP 348, which in turn goes over L2CAP 340. At the radio level, all L2CAP 340 data flows over a logical link.
As such, to enable a sink device to back-control a source device, the sink device may implement at least the Human Interface Device (HID) profile 346 and the SPP 356. In particular, the HID profile 346 may use the universal serial bus (USB) definition associated with a HID device in order to leverage existing class drivers and further describe how to use the USB HID protocol to discover a feature set associated with a HID class device and how a Bluetooth-enabled device can support HID services using  the L2CAP layer 340. Notably, the HID profile 346 is designed to enable initialization and control self-describing devices as well as provide a low latency link with low power requirements, wherein the HID profile 346 runs natively on the L2CAP layer 340 and does not reuse Bluetooth protocols other than SDP 344 in order to provide a simple implementation. In various embodiments, the sink device may therefore implement an adapter associated with a parser used in the HID profile 346 in order to provide utilities to initialize a HID input report relating to local touch messages captured at the sink device (e.g., via a HID mouse, a HID consumer control device, etc. ) . Furthermore, the SPP 356 may be implemented to emulate an RS232 (or similar) serial cable such that one or more applications may use the Bluetooth-implemented SPP 356 as a cable replacement through an operating system dependent virtual serial port abstraction.
Accordingly, referring again to FIG. 2, the Bluetooth stack 232 implemented in the sink device may support a HID profile to back-control the source device. In particular, the HID profile implemented in the Bluetooth stack 232 may have capabilities to register a HID 221, accept and establish a connection with the HID 221, and disconnect from the HID 221. The sink device therefore effectively acts as a HID device, sending HID input reports to the source device according to one or more touch messages captured at the sink device via the HID 221. Notably, the source device may be configured to initiate a HID connection with the sink device when a Bluetooth connection is established from the source device, provided that the HID Service Discovery Protocol (SDP) record is activated in the sink device. In contrast, where the sink device attempts to initiate a HID connection with the source device and a HID connection has not been established before, the attempt may be rejected due to native limitations (e.g., the source device may implement the “host” role associated with the Bluetooth HID profile and therefore initially request human data input and output services from the sink device implementing the “HID” role associated with the Bluetooth HID profile). Consequently, the back-control application 220 implemented in the sink device may only have the ability to initiate a HID connection with the source device after a HID connection between the sink device and the source device has been established at least once before.
According to various aspects, as mentioned above, the back-control application 220 may be configured to back-control the source device via the HID 221 and may further support the SPP application 223 to retrieve information associated with the  remote screen at the source device (e.g., a resolution, an orientation, control parameters, etc. ) . In various embodiments, the HID 221 may be a “combo device” that comprises a HID mouse in combination with a HID consumer control. For example, as mentioned above, a HID mouse may not support additional virtual keys that are provided on many source devices (e.g., external buttons on a smartphone that are not displayed on the screen) . Accordingly, in various embodiments, a HID consumer control can provide the virtual keys that are needed to back-control the source device and cannot be provided via the HID mouse. Furthermore, the HID consumer control can be fitted with controls provided in a media player or other suitable applications that are running on the source device, especially where the source device does not cast a surface layer in which such controls are displayed into the sink device when the media casting application 225 is active (e.g., smartphones that run the Android operating system may not cast the surface layer associated with the media player into the sink device when connected to the sink device via HDMI 227) . As such, when a HID mouse and a HID consumer control are used in combination to back-control the source device, the back-control application 220 may construct a “HID combo device” report according to local user inputs captured from the HID mouse and/or HID consumer control. 
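To make the report construction concrete, the sketch below shows how a “HID combo device” report of the kind described above might be assembled, assuming a simple two-report layout (report ID 1 for the mouse, report ID 2 for the consumer control). The report IDs and byte positions are illustrative assumptions; the actual layout would be fixed by the report descriptor the sink registers, which the patent does not reproduce.

```python
def mouse_report(buttons, dx, dy, report_id=1):
    """Build a HID input report for the mouse part of the combo device.

    buttons: bitmask (bit 0 = left, bit 1 = right, bit 2 = middle).
    dx, dy: signed relative movement in the range -127..127.
    The deltas are encoded as two's-complement single bytes.
    """
    return bytes([report_id, buttons & 0x07, dx & 0xFF, dy & 0xFF])

def consumer_report(usage_code, report_id=2):
    """Build a HID input report for the consumer-control part, carrying a
    16-bit Consumer Page usage code little-endian (e.g. a virtual 'Home'
    or volume key that the mouse portion cannot express)."""
    return bytes([report_id, usage_code & 0xFF, (usage_code >> 8) & 0xFF])
```

For instance, `consumer_report(0x00E9)` would encode the Consumer Page “Volume Increment” usage, the kind of virtual key described above that a plain HID mouse report cannot carry.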
However, because a HID mouse may not support multi-touch features that may be built into the source device, another option may be to use a HID touch-screen as a pointer device to back-control the source device (e.g., where the sink device is implemented in a vehicle or another suitable environment that includes a HID touch-screen that supports multi-touch features). Even so, using a HID touch-screen may not always be feasible, as the mapping between the HID touch-screen and the remote screen at the source device may not fully match. For example, the HID touch-screen may have a lower resolution than the remote screen at the source device (e.g., the HID touch-screen may have an 800x480 pixel resolution, whereas the source device may have a 1280x720 pixel resolution). In such cases, the back-control application 220 may be unable to pre-define the maximum values on the x-axis and the y-axis in a touch-screen HID report. For example, where the back-control application 220 constructs a touch-screen HID report with maximum values on the x-y axes based on the local screen width and height (e.g., 800, 480), only a partial region within the remote screen at the source device would be effective due to the difference between the screen resolutions at the source and sink devices.
As such, according to various aspects, the sink device needs to know at least the  screen resolution at the source device in order to properly back-control the source device. For example, as mentioned above, whether a HID touch-screen can be used depends on the difference between the screen resolution at the sink and source devices. Furthermore, in order to use a HID mouse as the HID 221, coordinates in the local screen at the sink device may need to be mapped to coordinates in the remote screen at the source device. Further still, because a HID mouse reports relative coordinates (e.g., relative to a base point located at a previous cursor position) whereas a touch message captured at the sink device normally supports absolute x-y coordinates relative to a base point in the upper-left corner of the local screen, the back-control application 220 may further need to know the previous cursor position in the source device and the initial cursor position. Additionally, because the base-point typically represents the upper-left corner in the screen, both at the source device and the sink device, the mapping between the coordinates in the local screen at the sink device and the remote screen at the source device may vary depending on whether the source device is in a landscape or portrait orientation (especially in vehicles where the local screen is always in a landscape orientation or other environments in which the local screen has a fixed orientation) .
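Because a HID mouse carries relative deltas while local touch messages carry absolute coordinates, the sink must track the previous cursor position as described above, and a large jump may need to be split into several reports whose single-byte delta fields saturate at ±127. The class below is an illustrative sketch of that bookkeeping, not the patent's implementation; the class name and the ±127 clamp (typical of simple HID mouse reports) are assumptions.

```python
class RelativeMouseMapper:
    """Convert absolute local coordinates into the relative deltas a HID
    mouse reports, tracking the previous cursor position."""

    def __init__(self, initial_x=0, initial_y=0):
        # The initial cursor position must be known, per the discussion above.
        self.prev_x = initial_x
        self.prev_y = initial_y

    def deltas(self, x, y):
        """Return the (dx, dy) steps needed to move from the previous
        position to (x, y), split into chunks a 1-byte field can carry."""
        dx, dy = x - self.prev_x, y - self.prev_y
        self.prev_x, self.prev_y = x, y
        steps = []
        while dx or dy:
            step_x = max(-127, min(127, dx))
            step_y = max(-127, min(127, dy))
            steps.append((step_x, step_y))
            dx -= step_x
            dy -= step_y
        return steps
```

A move from (0, 0) to (200, -130) would thus be emitted as two reports, (127, -127) followed by (73, -3).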
Accordingly, in order to back-control the source device after the source device casts the remote screen into the sink device, the back-control application 220 may need to know the exact output rectangle in the local screen depending on whether the remote screen is in a landscape orientation or a portrait orientation. However, the HID 221 supported via the back-control application 220 (e.g., mouse and consumer control) and the media casting application 225 (e.g., HDMI 227 or Miracast 229) may lack the ability to provide the orientation and exact output rectangle at the remote screen. As such, according to various aspects, the SPP application 223 may provide the back-control application 220 with various parameters that enable the HID 221 to be used in back-controlling the source device, including the orientation and exact output rectangle at the remote screen. In particular, the SPP application 223 implemented at the sink device may communicate with a helper SPP application at the source device (not explicitly shown in FIG. 2) to request and retrieve information associated with the remote screen at the source device. For example, the SPP application 223 implemented at the sink device and the helper SPP application at the source device may support primitives to allow the sink device to retrieve a screen resolution, a screen orientation (or direction) , a pointer or cursor movement unit, and pointer velocity control parameters from the  source device. In addition, as will be described in further detail below with reference to FIG. 4 through FIG. 6, the SPP application 223 at the sink device and the helper SPP application at the source device may support primitives to enable the source device to calibrate the remote screen at the source device (e.g., primitives to start a screen calibration process, show a calibration screen, stop the screen calibration process, etc. ) .
In various embodiments, communication between the SPP application 223 at the sink device and the helper SPP application at the source device may be based on a client-server architecture, wherein the SPP application 223 implemented at the sink device issues requests to the helper SPP application at the source device and the helper SPP application at the source device responds to the requests from the SPP application 223 implemented at the sink device. However, in various embodiments, the helper SPP application at the source device may send unsolicited indicators to the SPP application 223 implemented at the sink device (e.g., to inform the sink device about the orientation or direction, pointer movement speed, etc. associated with the remote screen, as such parameters may dynamically change at the source device from time to time). Furthermore, those skilled in the art will appreciate that the back-control application 220 may be unable to establish a connection between the SPP application 223 and the corresponding helper SPP application at the source device until the relevant HID 221 has been connected with the source device. Likewise, if the relevant HID 221 loses the connection with the source device, the back-control application 220 may automatically disconnect from the SPP application 223. In addition, although the Bluetooth stack 232 at the sink device may support multiple connections with the SPP application 223, the helper SPP application at the source device may only support one SPP connection.
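The request/response exchange (plus occasional unsolicited indications) between the SPP application 223 and the helper SPP application could be framed as sketched below. The patent does not specify a wire format, so the newline-delimited JSON, the primitive name, and the message fields here are purely illustrative assumptions standing in for whatever serialization the emulated serial port actually carries.

```python
import json

def spp_request(primitive, **params):
    """Frame a request from the sink's SPP application to the helper SPP
    application at the source (e.g. to fetch resolution/orientation, or to
    start/stop the calibration screen)."""
    msg = {"type": "request", "primitive": primitive, "params": params}
    return (json.dumps(msg) + "\n").encode("utf-8")

def parse_spp_message(line):
    """Parse one message from the source: either a response to a prior
    request or an unsolicited indication (e.g. an orientation change
    pushed by the source without a matching request)."""
    msg = json.loads(line.decode("utf-8"))
    if msg["type"] not in ("response", "indication"):
        raise ValueError("unexpected SPP message type: " + msg["type"])
    return msg
```

Under this sketch, the sink would write `spp_request("get_screen_info")` to the virtual serial port and then treat any `indication` frames it reads as dynamic updates to its cached remote-screen parameters.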
According to various aspects, referring to FIG. 4, an exemplary method 400 as illustrated therein may be implemented at a sink device to enable the sink device to automatically calibrate a screen at a source device. For example, as mentioned above, the sink device may need to know the output rectangle when the remote screen is cast from the source device into the local screen at the sink device in a landscape orientation and a portrait orientation. As such, the sink device may perform the method 400 shown in FIG. 4 once with the remote screen at the source device arranged in the landscape orientation and perform the method 400 again with the remote screen at the source device arranged in the portrait orientation. Accordingly, although the following description provides details relating to the remote screen calibration in landscape orientation prior to providing details relating to the remote screen calibration in portrait orientation, those skilled in the art will appreciate that the remote screen may be suitably calibrated in the reverse order as well. Furthermore, to the extent that the source device only supports a portrait orientation or a landscape orientation, the method 400 may only be performed once with the remote screen in the relevant orientation.
More particularly, at block 410, the sink device may initially send a message to the source device to request that the source device show a calibration screen in a landscape orientation. For example, in various embodiments, the calibration screen may comprise a screen filled with a unique color that may permit the sink device to clearly differentiate between valid and invalid regions within the remote screen casted from the source device into the local screen at the sink device. For example, according to various aspects, FIG. 5A illustrates an exemplary calibration screen 510 that may be displayed at the source device, shown as a full-white screen in FIG. 5A. Accordingly, when the source device shows the calibration screen 510, the sink device may differentiate between the valid and invalid regions within the remote screen casted to the sink device and thereby calculate the output rectangle when the remote screen is cast from the source device into the local screen in the landscape orientation. As such, to the extent that the remote screen at the source device was in a portrait orientation, the source device may rotate the remote screen into the landscape orientation in response to the request from the sink device and then display the calibration screen 510 on the remote screen. At block 420, the sink device may then lock the surface associated with the local screen at the sink device to ensure that an application handling the remote screen calibration has exclusive permission to access the local screen surface and thereby enable the sink device to calculate the output rectangle in the local screen for the remote screen casted from the source device at block 430.
More particularly, as shown in FIG. 5A, the local screen at the sink device may have four vertices, which are denoted in FIG. 5A as P0, P1, P2, P3. As such, assuming that the resolution associated with the local screen at the sink device is 800x480 pixels, x-y coordinates respectively corresponding to the four vertices P0, P1, P2, P3 in the local screen at the sink device may be (0, 0), (800, 0), (0, 480), (800, 480). Accordingly, when calculating the output rectangle at block 430, the sink device may determine that the output rectangle for the remote screen is limited to the four vertices in FIG. 5A that are denoted P0′ (X0′, Y0′), P1′ (X1′, Y1′), P2′ (X2′, Y2′), P3′ (X3′, Y3′). Accordingly, in the landscape orientation, the output rectangle calculated at block 430 may comprise the region within the four vertices denoted P0′ (X0′, Y0′), P1′ (X1′, Y1′), P2′ (X2′, Y2′), P3′ (X3′, Y3′). In various embodiments, at block 440, the sink device may then constrain valid points in the local screen according to the output rectangle that was calculated in block 430 for the remote screen in the landscape orientation. For example, FIG. 5B illustrates an example screen with valid and invalid regions when the remote screen at the source device is in the landscape orientation, wherein valid points may include any points that fall inside region 512 and invalid points may include any points that fall outside region 512 and within region 514 and/or region 516.
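The output-rectangle calculation at block 430 amounts to finding the bounding box of calibration-colored pixels on the locked local surface. A minimal sketch follows, assuming the surface can be read as rows of pixel values and that a caller-supplied predicate identifies the unique calibration color (e.g., full white); both assumptions go beyond what the patent specifies.

```python
def find_output_rect(frame, is_calibration_pixel):
    """Locate the bounding rectangle of calibration-colored pixels.

    frame: 2-D sequence of pixel values (rows of the locked local surface).
    is_calibration_pixel: predicate that is True for the unique
    calibration color shown by the source device.
    Returns (left, top, right, bottom) or None if no pixel matches.
    """
    left = right = top = bottom = None
    for y, row in enumerate(frame):
        for x, px in enumerate(row):
            if is_calibration_pixel(px):
                if left is None or x < left:
                    left = x
                if right is None or x > right:
                    right = x
                if top is None:
                    top = y       # first matching row from the top
                bottom = y        # last matching row seen so far
    if left is None:
        return None
    return (left, top, right, bottom)
```

The returned corners correspond to P0′ (upper-left) and P3′ (lower-right) of the valid region 512.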
In various embodiments, at block 450, the sink device may then store screen coordinate conversion results for the landscape orientation such that points in the local screen at the sink device can be suitably converted to points in the remote screen at the source device. For example, in response to any subsequent user input captured at the sink device, an initial determination may be made as to whether a coordinate associated with the user input falls within the valid region 512. In the event that the coordinate associated with the user input falls within either invalid region 514 or 516, the user input would not map to any point on the remote screen and can therefore be discarded. Otherwise, assuming that a user input is received at a local point 518 that falls within the valid region 512, the sink device may need to map coordinates associated with the local point 518 to coordinates at the source device to enable proper back-control via the user input received via the local screen. As such, the screen coordinate conversion results that are stored at block 450 may be used to map the coordinates associated with the local point 518 (or any other local point in the valid region 512) to coordinates associated with remote points on the remote screen at the source device. In particular, when the remote screen at the source device is in the landscape orientation, coordinates associated with the local point 518 in the local screen may be converted to the coordinates associated with a remote point in the remote screen as follows:
X′ = (X - L0) × W′ / (R0 - L0)
Y′ = (Y - T0) × H′ / (B0 - T0)
In the above example, X and Y represent the x-y coordinates associated with the local point 518 in the screen at the sink device, X′ and Y′ represent the x-y coordinates associated with the remote point in the screen at the source device, and W′ and H′ represent the width and height associated with the remote screen (e.g., W′ may be the maximum value in the resolution associated with the remote screen, while H′ may be the minimum value in the resolution associated with the remote screen, whereby a remote screen having a 1024x720 resolution would result in W′ being 1024 and H′ being 720). Furthermore, L0 and T0 correspond to the x-y coordinates associated with the upper-left corner in the output rectangle region 512 in the landscape orientation, while R0 and B0 correspond to the x-y coordinates associated with the lower-right corner in the output rectangle region 512 in the landscape orientation (i.e., across a diagonal from the upper-left corner). Accordingly, given the resolution associated with the remote screen and the coordinates associated with the output rectangle region 512 in the landscape orientation, coordinates associated with any arbitrary local point in the output rectangle region 512 can be converted to coordinates in the remote screen at the source device.
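The landscape conversion can be sketched as a small mapping function that also discards points outside the valid region, consistent with the discard behavior described for block 450. The function name and the rounding to integer pixel coordinates are assumptions for illustration.

```python
def local_to_remote_landscape(x, y, rect, remote_w, remote_h):
    """Map a local point inside the landscape output rectangle to
    remote-screen coordinates via linear scaling.

    rect: (L0, T0, R0, B0) -- output rectangle in the local screen.
    remote_w, remote_h: remote screen width/height (e.g. 1280 x 720).
    Returns None for points in the invalid regions (they map nowhere).
    """
    l0, t0, r0, b0 = rect
    if not (l0 <= x <= r0 and t0 <= y <= b0):
        return None  # invalid region: discard the user input
    xr = (x - l0) * remote_w / (r0 - l0)
    yr = (y - t0) * remote_h / (b0 - t0)
    return (round(xr), round(yr))
```

For example, with a letterboxed output rectangle (80, 0, 720, 480) on an 800x480 local screen and a 1280x720 remote screen, the local center (400, 240) maps to the remote center (640, 360).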
According to various aspects, as noted above, the sink device may perform the method 400 in substantially the same manner as described above to calibrate the remote screen at the source device in a portrait orientation. For example, at block 410, the sink device may send a message to the source device to request that the source device show the calibration screen in the portrait orientation, wherein the calibration screen may comprise a screen filled with a unique color that may permit the sink device to clearly differentiate between valid and invalid regions (e.g., a full-white screen) . Accordingly, as shown in FIG. 6A, the source device may rotate the remote screen into the portrait orientation as-needed in response to the request from the sink device and then display a calibration screen 610. At block 420, the sink device may lock the local screen surface to ensure that the application handling the remote screen calibration has exclusive permission to access the local screen while calculating the output rectangle in the local screen for the remote screen in the portrait orientation.
More particularly, as shown in FIG. 6A, the local screen at the sink device may have four vertices, which are denoted in FIG. 6A as P0 (0, 0), P1 (800, 0), P2 (0, 480), P3 (800, 480) assuming an 800x480 resolution. Accordingly, when calculating the output rectangle in the portrait orientation at block 430, the sink device may determine that the output rectangle in the portrait orientation is limited to the four vertices in FIG. 6A that are denoted P0″ (X0″, Y0″), P1″ (X1″, Y1″), P2″ (X2″, Y2″), P3″ (X3″, Y3″). Accordingly, in the portrait orientation, the output rectangle calculated at block 430 may comprise the region within the four vertices denoted P0″ (X0″, Y0″), P1″ (X1″, Y1″), P2″ (X2″, Y2″), P3″ (X3″, Y3″). In various embodiments, at block 440, the sink device may then constrain valid points in the local screen according to the output rectangle that was calculated in block 430 for the remote screen in the portrait orientation. For example, FIG. 6B illustrates an example screen with valid and invalid regions when the remote screen at the source device is in the portrait orientation, wherein valid points may include any points that fall inside region 612 and invalid points may include any points that fall outside region 612 and within region 614 and/or region 616.
In various embodiments, at block 450, the sink device may then store screen coordinate conversion results for the portrait orientation such that points in the local screen at the sink device can be suitably converted to points in the remote screen at the source device. For example, in response to any subsequent user input captured at the sink device, an initial determination may be made as to whether a coordinate associated with the user input falls within the valid region 612. In the event that the coordinate associated with the user input falls within either invalid region 614 or 616, the user input would not map to any point on the remote screen and can therefore be discarded. Otherwise, assuming that a user input is received at a local point 618 that falls within the valid region 612, the sink device may need to map coordinates associated with the local point 618 to coordinates at the source device to enable proper back-control via the user input received via the local screen. As such, the screen coordinate conversion results that are stored at block 450 may be used to map the coordinates associated with the local point 618 (or any other local point in the valid region 612) to coordinates associated with remote points on the remote screen at the source device. In particular, when the remote screen at the source device is in the portrait orientation, coordinates associated with the local point 618 in the local screen may be converted to the coordinates associated with a remote point in the remote screen as follows:
X′ = (X − L1) × W′ / (R1 − L1)

Y′ = (Y − T1) × H′ / (B1 − T1)
In the above example, X and Y represent the x-y coordinates associated with the local point 618 in the screen at the sink device, X′ and Y′ represent the x-y coordinates associated with the remote point in the screen at the source device, and W′ and H′ represent the width and height associated with the remote screen (i.e., the maximum and minimum values in the resolution associated with the remote screen). Furthermore, L1 and T1 correspond to the x-y coordinates associated with the upper-left corner in the output rectangle region 612 in the portrait orientation, while R1 and B1 correspond to the x-y coordinates associated with the lower-right corner in the output rectangle region 612 in the portrait orientation (i.e., across a diagonal from the upper-left corner in the output rectangle region 612). Accordingly, given the resolution associated with the remote screen and the coordinates associated with the output rectangle region 612 in the portrait orientation, coordinates associated with any local point in the output rectangle region 612 can be converted to coordinates in the remote screen at the source device.
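The conversion described above can be sketched as a straight proportional mapping from the output rectangle into the remote resolution. The function name and the example bounds below are illustrative assumptions (e.g., a 480x800 portrait remote screen pillarboxed into an 800x480 local screen):

```python
def local_to_remote(x, y, l1, t1, r1, b1, w_remote, h_remote):
    """Map a valid local point (x, y) inside the output rectangle
    [l1, t1]..[r1, b1] to remote-screen coordinates, per the
    proportional conversion described above."""
    x_remote = (x - l1) * w_remote / (r1 - l1)
    y_remote = (y - t1) * h_remote / (b1 - t1)
    return x_remote, y_remote

# Assumed output rectangle x in [272, 528] on the 800x480 local screen.
print(local_to_remote(272, 0, 272, 0, 528, 480, 480, 800))    # -> (0.0, 0.0)
print(local_to_remote(528, 480, 272, 0, 528, 480, 480, 800))  # -> (480.0, 800.0)
print(local_to_remote(400, 240, 272, 0, 528, 480, 480, 800))  # -> (240.0, 400.0)
```

The corners of the output rectangle map to the corners of the remote screen, and interior points scale linearly in between.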
According to various aspects, referring now to FIG. 7, an exemplary method 700 and corresponding timing diagram 770 as shown therein can be used to capture and transmit user inputs applied at a sink device to a source device to thereby back-control the source device. More particularly, once the source device has been suitably linked to the sink device and any necessary calibration with respect to the remote screen has been completed and/or results from a prior calibration have been retrieved, a user may locally back-control the source device via a touch-screen or a Bluetooth Human Interface Device (HID) (e.g., a HID mouse and consumer control “combo” device) deployed in the sink device. In general, user input from such input devices may be captured via a back-control application as a touch message, wherein the back-control application may generally send a HID input report to the source device each time that a touch message is locally captured at the sink device (i.e., in a default implementation, one touch message corresponds to at least one HID input report). However, sending a HID input report per touch message may be impractical in certain circumstances. For example, the time to send a Host Controller Interface (HCI) Asynchronous Connection-Less (ACL) packet wrapped with the HID input report will typically be longer than the touch message sample interval locally used in the sink device, which will lead to apparent user-perceptible latency when using the Bluetooth HID to back-control the source device. Accordingly, the method 700 shown in FIG. 7 may consider a time required to send HID input reports to the source device and a local touch message sample rate used in the sink device to address the above-mentioned latency issues.
Furthermore, the method 700 shown in FIG. 7 may consider the velocity at which the pointer moves in the source device to address issues in which a floating  pointer may appear at the source device when the user makes several quick touch messages in succession (e.g., during quick handwriting) . For example, in an Android smartphone, there exists a native limitation with respect to pointer sensitivity, which is measured based on the speed and acceleration associated with the pointer. The pointer in an Android phone has a speed in a range between [-7, 7] , wherein a scale factor is applied to the velocity to adapt the resolution associated with the input device to the resolution associated with the output device. In this case, the scale factor is applied to adapt the resolution associated with the local screen in the sink device to the resolution associated with the remote screen in the source device. The source device will have a default low threshold velocity (LV) , which is the scaled speed at which acceleration begins to be applied to the pointer. In other words, the LV is the upper bound on a pointer speed at which small motions can be performed without acceleration (e.g., in an Android phone, the LV is limited to 500 pixels-per-second) . In the event that the pointer has a velocity that exceeds the LV, the source device will accelerate the pointer such that the pointer can move the extra distance. Consequently, the sink device will lose the ability to control the pointer in the source device under such conditions where the pointer velocity exceeds the LV limit in the source device. Accordingly, the method 700 shown in FIG. 7 may consider the velocity at which the pointer moves in the source device to address the above-mentioned floating pointer issue.
More particularly, the following description may assume that the Bluetooth HID has been connected to the source device and that a communication link from the source device to the sink device is active such that the remote screen at the source device is cast into the sink device. Accordingly, at block 710, the HID input report associated with the touch-screen and/or other suitable HID devices used to back-control the source device may be initialized according to a local touch message captured at the sink device at some point in time after the Bluetooth HID has been connected to the source device and the link used to cast the remote screen at the source device into the sink device is active. In particular, the HID input report may be determined according to a HID descriptor that outlines specific information about usage parameters associated with the HID device. For example, a HID input report that includes information about a touch message captured from a HID mouse may provide maximum and minimum values on relative x-y axes. Accordingly, a HID input report based on a HID mouse touch input may only include one (1) byte to limit logical minimum and maximum values to a range between [-127, 127] based on a HID mouse natively supporting relative coordinates. As such, when a HID mouse is implemented to back-control the source device, the back-control application may store a previous point in the remote screen to calculate a difference between a current remote point and the previous remote point. Although different operating systems may define a touch message differently, in general, the touch message captured at block 710 may include a touch point (x, y) in the local screen at the sink device and a touch state (or type), which may include, for example, a touch down (e.g., press), a touch move (e.g., swipe), or a touch up (e.g., release).
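Because a HID mouse natively reports relative coordinates in a one-byte logical range, the difference between the previous and current remote points must be clamped to [-127, 127] before being placed into the input report. A minimal sketch, with illustrative names:

```python
def relative_report(prev_remote, curr_remote):
    """Build the relative (dx, dy) fields of a HID mouse input
    report from the previous and current remote points, clamped
    to the one-byte logical range [-127, 127] described above."""
    dx = max(-127, min(127, curr_remote[0] - prev_remote[0]))
    dy = max(-127, min(127, curr_remote[1] - prev_remote[1]))
    return dx, dy

print(relative_report((100, 100), (150, 90)))  # -> (50, -10)
print(relative_report((0, 0), (300, -200)))    # -> (127, -127)
```

Movements larger than the logical range are truncated, which is one reason the back-control application must retain the previous remote point as a reference for subsequent reports.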
According to various embodiments, the sink device may have a native sample rate (ST) used to capture local touch messages, which may generally be a static value (e.g., 125 Hz such that a local touch message is sampled every eight (8) milliseconds). However, because different users will typically interact with the local screen at different speeds, the sink device may be configured to re-sample the touch message at a re-sample rate (RT) that differs from the native sample rate ST to determine how frequently to send the HID input reports to the source device. In addition, based on the re-sample rate RT and the speed at which the pointer is moving in the remote screen at the source device, the sink device may periodically send a HID input report to the source device. For example, the method 700 shown in FIG. 7 may consider various parameters to send the HID input reports to the source device on a periodic basis (e.g., one HID input report every ten to forty milliseconds and at most one HID input report per local touch message). In this manner, the response time associated with the pointer in the remote screen can be improved and the pointer in the remote screen at the source device can be guaranteed not to reach the low threshold velocity LV. Consequently, the source device will not accelerate the pointer, which is what would otherwise cause the pointer to float.
According to various embodiments, the re-sample rate RT may generally correspond to a rate at which the back-control application used to send the HID input report to the source device captures the touch message. For example, where RT = n, the back-control application may handle every nth touch message (e.g., where n=1, the back-control application may handle every touch message, where n=2, the back-control application may handle every other touch message, and so on). In general, as RT increases, it becomes more difficult to make successive touch messages recognizable at the source device. Furthermore, in implementations where RT is greater than one, the back-control application may be implicitly required to send an additional padding report, wherein the padding report may also be a HID input report, with the main difference that the (x, y) fields in the padding report are always zero. In other words, the padding report does not move the pointer in the source device and instead aims to mitigate, to some extent, the floating pointer issue that arises during quick handwriting. The padding report may therefore be added whenever the back-control application handles a touch message, with the padding report having a count limited to a range between [0, RT - 1].
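The re-sampling behavior described above can be sketched as follows, where every RT-th touch move message is handled and each handled message carries up to RT − 1 padding reports standing in for the skipped messages (the function name is an illustrative assumption):

```python
def reports_for_sequence(num_moves, rt):
    """For a run of touch move messages, return (index, padding_count)
    pairs for the messages actually handled when re-sampling at rate
    rt: every rt-th message is handled, and each handled message is
    accompanied by rt - 1 zero-motion padding reports (a count within
    the [0, rt - 1] range described above)."""
    return [(i, rt - 1) for i in range(num_moves) if i % rt == 0]

print(reports_for_sequence(9, 3))  # -> [(0, 2), (3, 2), (6, 2)]
print(reports_for_sequence(4, 1))  # -> [(0, 0), (1, 0), (2, 0), (3, 0)]
```

With RT = 1 every message is handled and no padding is needed; with RT = 3 only every third message is handled, with two padding reports apiece.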
According to various aspects, the native sample rate (ST) and the re-sample rate (RT) as described above may therefore be used to determine conditions under which the sink device sends the HID input report to the source device. More particularly, the native (default) sample rate ST to capture the local touch message at the sink device (e.g., 125 Hz or every eight (8) milliseconds) may determine how many (or how often) “raw” touch messages will be captured at block 710. Accordingly, the back-control application configured to handle the local touch message may first en-queue the local touch message that was captured at block 710 and a separate thread may then de-queue the local touch message to allow proper handling according to the re-sample rate RT mentioned above. The thread in the back-control application configured to handle the local touch message, which has been integrated with the re-sample algorithm described herein, may then send the HID input report into the Bluetooth stack, and the Bluetooth stack may then send the HCI ACL packet wrapped with the HID input report into the Bluetooth chip. Accordingly, a time to send the HID input report into the Bluetooth chip (TH) may vary among different Bluetooth stack implementations (e.g., typically between eight (8) to sixteen (16) milliseconds). Furthermore, another variable that may be considered in determining the conditions under which the HID input report is sent to the source device may depend on a time to send the HID input report from the Bluetooth chip to the source device (TC), which may be determined according to a “sniff” interval employed in the Bluetooth chip. For example, Bluetooth has a mode called “sniff” mode, which allows a Bluetooth HID device to request various sniff intervals to reduce a radio duty cycle, address latency issues, reduce power consumption, etc. according to fine-grained adjustments in the sniff interval.
As such, the source device may request adjustments to the Bluetooth sniff interval to control the variable TC that depends on the time to send the HID input report to the source device and thereby guarantee that the HCI ACL packet wrapped with the HID input report will be sent to the source device at a minimum periodicity (e.g., every ten (10) to forty (40) milliseconds) .
Accordingly, based on at least the various parameters described above, the sink device may determine how to process each local touch message captured at block 710. More particularly, the sink device may determine whether the local touch message captured at block 710 was a “touch down” message, a “touch move” message, or a “touch up” message. In response to determining at block 720 that the local touch message was a touch down message, the sink device may transmit the HID input report to the source device at block 722. Alternatively, in response to determining at block 730 that the local touch message was a “touch move” message, the sink device may determine whether to transmit the HID input report corresponding to the touch move message or alternatively store the touch move message as a reference point without transmitting the HID input report to the source device. In particular, at block 732, the sink device may determine a movement distance (D) associated with the pointer in the source device based on a previous remote point and a current remote point in the remote screen at the source device. For example, in various embodiments, the sink device may store coordinates associated with a previous local point P0 (X0, Y0) that corresponds to a previous local touch message captured in the sink device and a current local point P1 (X1, Y1) that corresponds to the current local touch message captured at block 710. The coordinates (X0, Y0) and (X1, Y1) associated with the previous and current local points P0, P1 may therefore be converted to coordinates (X0′, Y0′) and (X1′, Y1′) that correspond to the previous and current points in the remote screen at the source device, which may be referred to as P0′, P1′.
For example, the coordinates associated with the previous and current local points P0, P1 in the local screen at the sink device may be converted to coordinates associated with the previous and current points P0′, P1′in the remote screen at the source device according to the equations described in further detail above with reference to FIG. 4 through FIG. 5. Accordingly, in various embodiments, the movement distance D associated with the pointer in the source device may be computed as follows:
D = √( (X1′ − X0′)² + (Y1′ − Y0′)² )
In various embodiments, at block 734, the sink device may then compute the velocity (S) associated with the pointer in the source device based on the movement distance D associated with the pointer in the source device and the variable TC as determined according to the above-mentioned sniff interval, as follows:
S = D / TC
In various embodiments, at block 736, the sink device may then determine whether the velocity S associated with the pointer in the source device is less than the low threshold velocity LV at which acceleration begins to be applied to the pointer at the source device, wherein the HID input report corresponding to the touch move message may be sent to the source device at block 740 in response to determining that S is less than LV. Furthermore, in various embodiments, the above-mentioned padding reports may be sent to the source device at block 740 in addition to the HID input report corresponding to the current touch move message when S is less than LV. Alternatively, in the event that the velocity S associated with the pointer in the source device equals or exceeds the low threshold velocity LV, the sink device may determine whether a current touch move index is a multiple of the re-sample rate RT at block 738. For example, assuming an RT equal to three (3) , HID input reports and the above-mentioned padding reports may be sent to the source device at block 740 with respect to touch move messages having an index of zero, three, six, nine, twelve, etc. Otherwise, where the pointer velocity S at the source device equals or exceeds the low threshold velocity LV and the current touch move index is not a multiple of the re-sample rate RT, the sink device may ignore the touch move input at block 742 without sending the HID input report (or the padding reports) to the source device. However, according to various embodiments, the sink device may further set the last touch move message to the current touch move message at block 742 in order to maintain the previous local/remote point coordinates that may be needed to provide a reference point with respect to subsequent touch messages that are captured at the sink device.
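The decision logic of blocks 732 through 738 can be sketched as follows, assuming illustrative values for TC, LV, and RT (e.g., TC = 25 ms and the Android-style LV of 500 pixels-per-second mentioned above); the function name and parameters are assumptions for illustration:

```python
import math

def handle_touch_move(prev_remote, curr_remote, tc_seconds, lv, index, rt):
    """Decide whether a touch move message should be reported:
    compute the remote movement distance D (block 732) and the
    pointer velocity S = D / TC (block 734), then report when
    S < LV (block 736) or when the touch move index is a multiple
    of the re-sample rate RT (block 738)."""
    d = math.hypot(curr_remote[0] - prev_remote[0],
                   curr_remote[1] - prev_remote[1])
    s = d / tc_seconds
    return s < lv or index % rt == 0

# Assumed numbers: TC = 25 ms, LV = 500 px/s, RT = 3.
print(handle_touch_move((0, 0), (3, 4), 0.025, 500, 1, 3))    # slow: report
print(handle_touch_move((0, 0), (30, 40), 0.025, 500, 1, 3))  # fast, index 1: skip
print(handle_touch_move((0, 0), (30, 40), 0.025, 500, 3, 3))  # fast, index 3: report
```

A slow movement (D = 5 px over 25 ms gives S = 200 px/s) is always reported, whereas a fast movement (S = 2000 px/s) is only reported on every RT-th touch move message.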
According to various embodiments, assuming that the local touch message is neither a touch down message nor a touch move message, the sink device may determine that the local touch message was a touch up message at block 740. In such a case, the sink device may transmit the HID input report corresponding to the last touch move message and the current touch up message at block 742. In addition, the sink device may reset the touch move index so as to continue re-sampling in the same manner described above with respect to any subsequent touch move messages.
Accordingly, the timing diagram 770 shown in FIG. 7 illustrates an exemplary touch message sequence that may be handled according to the method 700, wherein the timing diagram 770 assumes that the re-sample rate equals three (3). In general, RT can have a dynamic value greater than or equal to one (1), wherein using a larger RT value will result in more touch messages being ignored such that inputs (e.g., handwriting) controlled via the HID may be more difficult to recognize when shown in the remote screen at the source device. Conversely, smaller RT values may result in the sink device sending more touch messages that are captured via the HID to the source device, which may result in an apparent (user-perceptible) latency in showing the captured inputs at the source device. Accordingly, the variable RT may generally have a value calculated to balance the above-mentioned tradeoffs to avoid a floating pointer in the remote screen at the source device, which may be three (3) in the exemplary timing diagram 770 shown in FIG. 7.
As such, in response to capturing a touch down message, as depicted at 772, the sink device may transmit the corresponding HID input report to the source device. In response to capturing touch move messages M0-M8, as depicted at 774 through 790, the sink device may transmit the corresponding HID input report (s) to the source device according to the re-sample rate RT. In particular, the sink device may transmit the HID input report corresponding to touch move messages M0, M3, and M6, while touch move messages M1, M2, M4, M5, M7, and M8 may not be reported based on the re-sample rate RT. However, in response to capturing a touch up message, as depicted at 792, the sink device may transmit the HID input report corresponding to touch move message M8 (i.e., the last touch move message prior to the touch up message) and transmit the HID input report corresponding to the touch up message.
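The reporting pattern in the timing diagram 770 can be replayed with a short sketch, assuming RT = 3 and that the pointer velocity stays at or above LV for every touch move (names are illustrative):

```python
def reported_messages(moves, rt):
    """Replay the FIG. 7 sequence: a touch down is always reported,
    touch moves are reported when their index is a multiple of rt
    (assuming the pointer velocity equals or exceeds LV throughout),
    and a touch up triggers a report for the last touch move plus
    the touch up itself."""
    out = ["DOWN"]
    for i, m in enumerate(moves):
        if i % rt == 0:
            out.append(m)
    if moves and moves[-1] not in out:
        out.append(moves[-1])  # last touch move before touch up
    out.append("UP")
    return out

moves = [f"M{i}" for i in range(9)]  # M0..M8
print(reported_messages(moves, 3))
# -> ['DOWN', 'M0', 'M3', 'M6', 'M8', 'UP']
```

This reproduces the sequence depicted at 772 through 792: M0, M3, and M6 are reported per the re-sample rate, while M8 is reported only because it is the last touch move before the touch up.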
According to various aspects, FIG. 8 illustrates an exemplary call flow that can be implemented at a source device 814 and a sink device to back-control the source device 814 via user inputs applied at the sink device. More particularly, in the call flow shown in FIG. 8, the sink device may implement a back-control application 810 to back-control the source device 814 and a media casting application 812 to manage a link through which the source device 814 casts a remote screen associated therewith to a local screen at the sink device. For example, at 820, the source device 814 may initially connect to the back-control application 810 as a Human Interface Device (HID) to enable the back-control application 810 to back-control the source device 814. In various embodiments, at 822, the media casting application 812 may establish the link through which the source device 814 casts the remote screen to the local screen at the  sink device (e.g., via HDMI/MHL, Miracast, etc. ) . At 824, the back-control application 810 may then establish a Serial Port Profile (SPP) connection with the source device 814, which may generally occur after the HID has been connected at 820 and the link with the source device 814 has been established at 822. Furthermore, although not explicitly shown in FIG. 8, the back-control application 810 may be configured to recover or otherwise re-establish the SPP connection in the event that the SPP connection is terminated while the HID is still connected to the source device 814.
In various embodiments, as depicted at 826, the back-control application 810 may then retrieve information associated with the remote screen at the source device 814 and send a message requesting that the source device 814 show a calibration screen at 828. For example, in various embodiments, the back-control application 810 may retrieve the information associated with the remote screen at the source device 814 and request that the source device 814 show the calibration screen through the SPP connection. In various embodiments, at 830, the back-control application 810 may then send a request to calibrate the remote screen to the media casting application 812, which may then lock a surface of the local screen at 832 and calculate the (landscape or portrait) output rectangle at 834. For example, as mentioned above, the local screen at the sink device may be locked to ensure that the back-control application 810 handling the remote screen calibration process has exclusive permission to access the local screen surface while calibrating the remote screen. At 836, the coordinates associated with the output rectangle may then be provided to the back-control application 810, which may store the calibration result at 838 such that the back-control application 810 may convert coordinates in the local screen to coordinates in the remote screen. In general, the calibration process performed at 826 through 838 is described in further detail above with respect to FIG. 4 through FIG. 6, wherein the messages from 828 through 838 may be performed once with the remote screen at the source device 814 in a landscape orientation and once with the remote screen in a portrait orientation. 
Furthermore, because the back-control application 810 stores the calibration result, the messages shown at 828 through 838 may be omitted to the extent that the remote screen cast into the local screen has been previously calibrated (e.g., where the same source device 814 is reconnected to the sink device, a source device 814 with the same resolution as a prior calibrated source device 814 is connected to the sink device, etc.).
In various embodiments, at 840, the back-control application 810 may then  capture a local touch input and initialize a corresponding HID input report at 842. For example, initializing the HID input report may comprise converting coordinates associated with the local touch input to coordinates in the remote screen at the source device 814 according to the prior calibration and depending on whether the source device 814 is currently in a landscape or portrait orientation. The back-control application 810 may then send the HID input report to the source device at 844 when the local touch input satisfies certain criteria (e.g., where the local touch input is a touch down message, a touch up message, a last touch move message before a touch up message, a touch move message that matches a re-sample interval, a touch move message that results in a pointer velocity at the source device 814 below a low threshold velocity above which pointer acceleration is triggered at the source device 814, etc. ) . In general, the process to report local touch messages as shown at 840 through 844 is described in further detail above with respect to FIG. 7.
In various embodiments, at some point in time, the source device 814 may send a message to disconnect the HID to the back-control application 810, as depicted at 846, which may further result in the back-control application 810 automatically terminating the SPP connection with the source device 814, as shown at 848.
According to various aspects, FIG. 9 illustrates an exemplary device 900 that can implement the various aspects and embodiments described herein. For example, in various embodiments, the device 900 shown in FIG. 9 may correspond to a source device that can cast a screen associated therewith to a sink device (e.g., via Miracast, HDMI/MHL, or another suitable technology that may allow screen sharing between two devices). Furthermore, in such cases, the device 900 may further implement a Bluetooth Human Interface Device (HID) host role to allow user inputs to be applied at the sink device to back-control the device 900 operating as the source device as well as a Bluetooth Serial Port Profile (SPP) acceptor role to enable communication with the sink device over an emulated physical cable. Alternatively, according to various aspects, the device 900 may correspond to a sink device that can display a screen cast from a source device and further implement a Bluetooth HID device role to accept user inputs used to back-control the source device as well as a Bluetooth SPP initiator role to initiate communication with the source device over an emulated physical cable. In various embodiments, the device 900 may include a housing 910, a processor 920, a memory 922, a signal detector 924, a user interface 926, a digital signal processor (DSP) 928, a transmitter 932, a receiver 934, an antenna 936, a local display 938, and a bus system 950. Furthermore, in various embodiments, the functions associated with the transmitter 932 and the receiver 934 can be incorporated into a transceiver 930. The device 900 can also be configured to communicate in a wireless network that includes, for example, a base station, an access point, or the like.
In various embodiments, the processor 920 can be configured to control operations associated with the device 900, wherein the processor 920 may also be referred to as a central processing unit (CPU). The memory 922 can be coupled to the processor 920, can be in communication with the processor 920, and can provide instructions and data to the processor 920. The processor 920 can perform logical and arithmetic operations based on program instructions stored within the memory 922. The instructions in the memory 922 can be executable to perform one or more methods and processes described herein. Furthermore, in various embodiments, the processor 920 can include, or be a component in, a processing system implemented with one or more processors. The one or more processors can be implemented with any one or more general-purpose microprocessors, microcontrollers, digital signal processors (DSPs), field programmable gate arrays (FPGAs), programmable logic devices (PLDs), controllers, state machines, gated logic, discrete hardware components, dedicated hardware finite state machines, combinations thereof, and/or any other suitable entities that can perform calculations and/or manipulate information. In various embodiments, the processing system can also include machine-readable media configured to store software, which can be broadly construed to include any suitable instructions, whether referred to as software, firmware, middleware, microcode, hardware description language, or otherwise. Instructions can include code in a source code format, a binary code format, an executable code format, and/or any other suitable format. The instructions, when executed on the one or more processors, can cause the processing system to perform one or more functions described herein.
In various embodiments, the memory 922 can include read-only memory (ROM) , random access memory (RAM) , and/or any suitable combination thereof. The memory 922 can also include non-volatile random access memory (NVRAM) .
In various embodiments, the transmitter 932 and the receiver 934 (or the transceiver 930) can transmit and receive data between the device 900 and a remote location. The antenna 936 can be attached to the housing 910 and electrically coupled to the transceiver 930. In some implementations, the device 900 can also include multiple transmitters, multiple receivers, multiple transceivers, and/or multiple antennas (not illustrated). In various embodiments, the signal detector 924 can be used to detect and quantify the level associated with one or more signals received at the transceiver 930. The signal detector 924 can detect such signals as total energy, energy per subcarrier per symbol, power spectral density, and/or in other ways. In various embodiments, the digital signal processor (DSP) 928 can be used to process signals, wherein the DSP 928 can be configured to generate a packet to be transmitted via the transmitter 932 and/or the transceiver 930. In various embodiments, the packet can include a physical layer protocol data unit (PPDU).
In various embodiments, the user interface 926 can include, for example, a keypad, a microphone, a speaker, a display, and/or other suitable interfaces. The user interface 926 can include any element or component that conveys information to a user associated with the device 900 and/or receives input from the user. For example, where the device 900 corresponds to a source device that can cast a screen associated therewith to a sink device, the user interface 926 may comprise at least the screen cast to the sink device. Alternatively, where the device 900 corresponds to a sink device that can display a screen cast from a source device, the user interface 926 may comprise a display to show the screen cast from the source device in addition to one or more touchscreen sensors, steering wheel control buttons, a microphone, a mouse, and/or any other suitable human interface device, whereby the user interface 926 may receive user inputs to back-control the source device and one or more packets to report the user inputs received via the user interface 926 may be transmitted to the source device via the transmitter 932 and/or transceiver 930.
In various embodiments, as mentioned above, the device 900 shown in FIG. 9 may include the transmitter 932 and the receiver 934 (or the transceiver 930) to transmit and receive data between the device 900 and a remote location. Accordingly, in embodiments where the device 900 corresponds to a source device and/or a sink device involved in a screen casting session via Miracast or another suitable wireless screen sharing technology, the transmitter 932 and the receiver 934 (or the transceiver 930) may be used to communicate appropriate wireless signals to cast the screen associated with the source device to the sink device. Alternatively, the various aspects and embodiments described herein contemplate that the screen casting session between the  source device and the sink device may be implemented via HDMI/MHL or another wired technology. As such, in various embodiments, the device 900 may optionally include a connector 940 such that an appropriate physical cable can be plugged into the connector 940 to convey the screen casting signals between the source device and the sink device in addition to the transmitter 932 and the receiver 934 (or the transceiver 930) that are used to communicate wireless signals used to back-control the source device via one or more user inputs applied at the sink device. For example, in various embodiments, the connector 940 may be compatible with a five-pin or an eleven-pin Micro-USB-to-HDMI adapter, an MHL passive cable, a USB Type-C connector, a reversible superMHL connector, and/or other suitable physical cables.
In various embodiments, the local display 938 may comprise any suitable video output devices such as a cathode ray tube (CRT) , a liquid crystal display (LCD) , a plasma display, a light emitting diode (LED) display, an organic light emitting diode (OLED) display, or another display device. In these or other examples, the local display 938 may be an emissive display or a transmissive display. The local display 938 may also be a touch-screen display or a presence-sensitive display such that the local display 938 is simultaneously an input device and an output (display) device. In various embodiments, such a touch-screen display may be capacitive, resistive, and/or another suitable touch or presence-sensitive panel that allows a user to provide user input.
In various embodiments, the various components associated with the device 900 can be coupled together via the bus system 950, wherein the bus system 950 can include a data bus as well as a power bus, a control signal bus, and/or a status signal bus in addition to the data bus. In various embodiments, the device 900 can also include other components or elements not illustrated in FIG. 9. One or more components associated with the device 900 can be in communication with another one or more components associated with the device 900 via means that may comprise another communication channel (not illustrated) to provide, for example, an input signal to the other component. In various embodiments, although FIG. 9 illustrates various separate components, one or more components shown therein can be combined or commonly implemented. For example, the processor 920 and the memory 922 can be embodied on a single chip. In various embodiments, the processor 920 can additionally (or alternatively) contain memory, such as processor registers. Similarly, one or more functional blocks or portions thereof can be embodied on a single chip. Alternatively, the functionality  associated with a particular block can be implemented on two or more chips. For example, in addition to the functionality described above, the processor 920 can be used to implement the functionality described above with respect to the signal detector 924 and/or the DSP 928.
According to various aspects, FIG. 10 illustrates an exemplary data communication model 1000 or protocol stack that can be used in a system in which a source device casts a screen to a sink device and the source device is back-controlled via user inputs applied at the sink device. In particular, the data communication model 1000 illustrates interactions between data and control protocols used to transmit data between a source device and a sink device during a session in which the source device casts at least a screen associated therewith to the sink device. In various embodiments, the data communication model 1000 may include a physical (PHY) layer 1002, a media access control (MAC) layer 1004, an internet protocol (IP) layer 1006, a user datagram protocol (UDP) layer 1008, a real time protocol (RTP) layer 1010, an MPEG2 transport stream (MPEG2-TS) 1012, a content protection layer 1014, packetized elementary stream (PES) packetization 1016, a video codec 1018, an audio codec 1020, a transmission control protocol (TCP) layer 1022, a real time streaming protocol (RTSP) layer 1024, feedback packetization 1026, and a human interface device (HID) input layer 1030.
In various embodiments, the physical layer 1002 and the MAC layer 1004 may define physical signaling, addressing, and channel access control used in communications between the source device and the sink device. The physical layer 1002 and the MAC layer 1004 may define the frequency band structure used in the communications (e.g., Federal Communications Commission bands defined at 2.4 GHz, 3.6 GHz, 5 GHz, 60 GHz, or Ultrawideband (UWB) frequency band structures). The physical layer 1002 and the MAC layer 1004 may also define data modulation techniques (e.g., analog and digital amplitude modulation, frequency modulation, phase modulation techniques, and combinations thereof). The physical layer 1002 and the MAC layer 1004 may also define multiplexing techniques (e.g., orthogonal frequency division multiplexing (OFDM), time division multiple access (TDMA), frequency division multiple access (FDMA), code division multiple access (CDMA), or any combination of OFDM, FDMA, TDMA, and/or CDMA). In one example, the physical layer 1002 and the MAC layer 1004 may be defined according to a Wi-Fi standard (e.g., IEEE 802.11-2007 and 802.11n-2009). In other examples, the physical layer 1002 and the MAC layer 1004 may be defined according to WirelessHD, Wireless Home Digital Interface (WHDI), WiGig, Wireless USB, High Definition Multimedia Interface (HDMI), Mobile High-Definition Link (MHL), Ethernet, and/or other suitable network architectures.
In various embodiments, the IP layer 1006, the UDP layer 1008, the RTP layer 1010, the TCP layer 1022, and the RTSP layer 1024 may define packet structures and encapsulations used in the communications between the source device and the sink device and may be defined according to standards maintained by the Internet Engineering Task Force (IETF). For example, in various embodiments, the source device and the sink device may use the RTSP layer 1024 to negotiate capabilities and to establish, maintain, and manage a session. The source device and the sink device may further establish a feedback channel to back-control the source device via user inputs applied at the sink device (e.g., via a Bluetooth Human Interface Device (HID) and Serial Port Profile (SPP) helper application, as described above).
In various embodiments, the video codec 1018 may define one or more video data coding techniques that may be used to manage video data casted from the source device to the sink device. For example, in various embodiments, the video codec 1018 may implement one or more video compression standards, which may include ITU-T H.261, ISO/IEC MPEG-1 Visual, ITU-T H.262 or ISO/IEC MPEG-2 Visual, ITU-T H.263, ISO/IEC MPEG-4 Visual, ITU-T H.264 (also known as ISO/IEC MPEG-4 AVC), VP8, and High-Efficiency Video Coding (HEVC). However, those skilled in the art will appreciate that in certain implementations the source device may transmit either compressed or uncompressed video data to the sink device. In various embodiments, the audio codec 1020 may define audio data coding techniques that may be used to manage audio data casted from the source device to the sink device. For example, in various embodiments, the audio data may be coded using multi-channel formats such as those developed by Dolby and Digital Theater Systems. The audio data may further be coded using a compressed or uncompressed format, wherein example compressed audio formats may include, without limitation, MPEG-1 and MPEG-2 Audio Layers II and III, AC-3, AAC, etc., while an example uncompressed audio format may include the pulse-code modulation (PCM) audio format. In various embodiments, the packetized elementary stream (PES) packetization 1016 and the MPEG2 transport stream (MPEG2-TS) 1012 may define how coded audio and video data is packetized and transmitted. The PES packetization 1016 and the MPEG2-TS 1012 may be defined according to MPEG-2 Part 1. In other examples, audio and video data may be packetized and transmitted according to other packetization and transport stream protocols.
Furthermore, the content protection layer 1014 may protect against unauthorized copying of audio or video data, wherein the content protection layer 1014 may be defined according to the High-bandwidth Digital Content Protection (HDCP) 2.0 specification in one example.
In various embodiments, the feedback packetization 1026 may define how user input and performance information is packetized, wherein the feedback may generally affect how subsequent media data is presented to the user at the sink device (e.g., zoom and pan operations) and how the source device processes (e.g., encodes and/or transmits) the media data to the sink device. However, those skilled in the art will appreciate that the feedback may generally be used to back-control any functional aspect associated with the source device. For example, in various embodiments, Human Interface Device (HID) input 1030 may be received at one or more HID devices at the sink device, which may implement a Bluetooth HID to back-control the source device and a Bluetooth Serial Port Profile (SPP) helper application to retrieve various parameters associated with the remote screen at the source device and to calibrate the remote screen at the source device relative to a local screen at the sink device. Furthermore, the feedback packetization 1026 may provide functionality to convert coordinates associated with one or more touch messages or other HID input 1030 received at the sink device to coordinates in the local screen at the sink device, as described in further detail above with respect to FIG. 4 through FIG. 6, and the feedback packetization 1026 may further provide functionality to packetize such information for wireless transmission to the source device. Further still, the feedback packetization 1026 may implement one or more controls to determine whether and/or when to transmit packets reporting the HID input 1030 to the source device, as described in further detail above with reference to FIG. 7.
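As a minimal illustration of the coordinate conversion performed on HID input 1030 before packetization, the mapping from local-screen touch coordinates into remote-screen coordinates can be sketched as follows. The function names are hypothetical, and the sketch assumes that the output rectangle corners (L, T) and (R, B) in the local screen and the remote resolution for the current orientation have already been obtained via the calibration procedure described above.

```python
def convert_local_to_remote(x, y, rect, remote_w, remote_h):
    """Map a touch at (x, y) in the local screen into remote-screen
    coordinates, given the calibrated output rectangle and the remote
    resolution for the current orientation."""
    left, top, right, bottom = rect
    # Scale the offset inside the output rectangle to the remote resolution.
    x_remote = (x - left) / (right - left) * remote_w
    y_remote = (y - top) / (bottom - top) * remote_h
    return x_remote, y_remote


def in_output_rectangle(x, y, rect):
    """Touches outside the output rectangle fall in invalid regions
    (e.g., letterbox bars) and may simply be discarded rather than
    packetized and reported to the source device."""
    left, top, right, bottom = rect
    return left <= x <= right and top <= y <= bottom
```

For example, with a 1280x720 remote screen filling a 1920x1080 local screen, a touch at the center of the local screen maps to the center of the remote screen.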
According to various embodiments, as described in further detail above, the source device may cast the screen associated therewith to the sink device via any suitable screen sharing technology. For example, in various embodiments, the source device may cast the screen associated therewith to the sink device via Miracast or another suitable wireless technology or via HDMI/MHL or another suitable wired technology. Accordingly, those skilled in the art will appreciate that the particular data communication model 1000 shown in FIG. 10 is for illustrative purposes only and is not intended to limit the communications between the source device and the sink device to any particular screen sharing technology. As such, those skilled in the art will appreciate that there may be various changes, omissions, and/or additions to the data communication model 1000 depending on the particular technology used to cast the screen from the source device to the sink device. For example, in a screen sharing session implemented via Miracast, the source device and the sink device may communicate screen casting information via wireless signals, whereas in a screen sharing session implemented via HDMI/MHL, the source device and the sink device may be connected to one another via a physical cable and the screen casting information may be conveyed between the source device and the sink device via the physical cable.
Those skilled in the art will appreciate that information and signals may be represented using any of a variety of different technologies and techniques. For example, data, instructions, commands, information, signals, bits, symbols, and chips that may be referenced throughout the above description may be represented by voltages, currents, electromagnetic waves, magnetic fields or particles, optical fields or particles, or any combination thereof.
Further, those skilled in the art will appreciate that the various illustrative logical blocks, modules, circuits, and algorithm steps described in connection with the aspects disclosed herein may be implemented as electronic hardware, computer software, or combinations of both. To clearly illustrate this interchangeability of hardware and software, various illustrative components, blocks, modules, circuits, and steps have been described above generally in terms of their functionality. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the overall system. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted to depart from the scope of the various aspects and embodiments described herein.
The various illustrative logical blocks, modules, and circuits described in connection with the aspects disclosed herein may be implemented or performed with a general purpose processor, a digital signal processor (DSP) , an application specific integrated circuit (ASIC) , a field programmable gate array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described  herein. A general purpose processor may be a microprocessor, but in the alternative, the processor may be any conventional processor, controller, microcontroller, or state machine. A processor may also be implemented as a combination of computing devices (e.g., a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration) .
The methods, sequences, and/or algorithms described in connection with the aspects disclosed herein may be embodied directly in hardware, in a software module executed by a processor, or in a combination of the two. A software module may reside in RAM, flash memory, ROM, EPROM, EEPROM, registers, hard disk, a removable disk, a CD-ROM, or any other form of non-transitory computer-readable medium known in the art. An exemplary non-transitory computer-readable medium may be coupled to the processor such that the processor can read information from, and write information to, the non-transitory computer-readable medium. In the alternative, the non-transitory computer-readable medium may be integral to the processor. The processor and the non-transitory computer-readable medium may reside in an ASIC. The ASIC may reside in an IoT device. In the alternative, the processor and the non-transitory computer-readable medium may be discrete components in a user terminal.
In one or more exemplary aspects, the functions described may be implemented in hardware, software, firmware, or any combination thereof. If implemented in software, the functions may be stored on or transmitted over as one or more instructions or code on a non-transitory computer-readable medium. Computer-readable media may include storage media and/or communication media including any non-transitory medium that may facilitate transferring a computer program from one place to another. Storage media may be any available media that can be accessed by a computer. By way of example, and not limitation, such computer-readable media can comprise RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer. Also, any connection is properly termed a computer-readable medium. For example, if the software is transmitted from a website, server, or other remote source using a coaxial cable, fiber optic cable, twisted pair, DSL, or wireless technologies such as infrared, radio, and microwave, then the coaxial cable, fiber optic cable, twisted pair, DSL, or wireless technologies such as infrared, radio, and microwave are included in the definition of a medium. The terms disk and disc, as used herein, include CD, laser disc, optical disc, DVD, floppy disk, and Blu-ray disc, where disks usually reproduce data magnetically, while discs reproduce data optically with lasers. Combinations of the above should also be included within the scope of computer-readable media.
While the foregoing disclosure shows illustrative aspects and embodiments, those skilled in the art will appreciate that various changes and modifications could be made herein without departing from the scope of the disclosure as defined by the appended claims. Furthermore, in accordance with the various illustrative aspects and embodiments described herein, those skilled in the art will appreciate that the functions, steps, and/or actions in any methods described above and/or recited in any method claims appended hereto need not be performed in any particular order. Further still, to the extent that any elements are described above or recited in the appended claims in a singular form, those skilled in the art will appreciate that singular form (s) contemplate the plural as well unless limitation to the singular form (s) is explicitly stated.
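As a worked illustration of the calibration described above, the automatic output-rectangle calculation can be reduced to scanning a captured local frame for the unique calibration color and recording the bounding corners (L, T) and (R, B). The sketch below operates on a simple nested list of pixel values; an actual implementation would read pixels from the locked display surface, and all names here are hypothetical.

```python
def find_output_rectangle(frame, unique_color):
    """Locate the region of the local screen filled with the unique
    calibration color. frame is a 2-D grid (rows of pixel values);
    returns the bounding corners (L, T, R, B), or None if the
    calibration color is absent. Pixels of any other color belong to
    invalid regions outside the output rectangle."""
    left = top = right = bottom = None
    for y, row in enumerate(frame):
        for x, pixel in enumerate(row):
            if pixel == unique_color:
                if left is None or x < left:
                    left = x
                if right is None or x > right:
                    right = x
                if top is None:
                    top = y  # first matching row
                bottom = y  # last matching row seen so far
    if left is None:
        return None
    return left, top, right, bottom
```

The returned corners can be cached and reused when the source device establishes a subsequent casting session, as noted above.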

Claims (45)

  1. A method for back-controlling a source device, comprising:
    determining, at a sink device, at least a current orientation and a resolution associated with a remote screen casted from the source device, wherein the remote screen casted from the source device is displayed on a local screen at the sink device;
    sending, to the source device, a message to request that the source device display a calibration screen configured to fill the remote screen with a unique color;
    automatically calculating, at the sink device, an output rectangle associated with the current orientation at the remote screen casted from the source device, wherein the output rectangle comprises a region in the local screen filled with the unique color;
    capturing a local touch message at the sink device, the local touch message associated with first coordinates in the local screen at the sink device; and
    converting the first coordinates in the local screen to second coordinates in the remote screen based on the resolution associated with the remote screen and the calculated output rectangle associated with the current orientation at the remote screen.
  2. The method recited in claim 1, further comprising:
    causing the source device to apply the local touch message at the second coordinates in the remote screen in response to determining that the first coordinates associated with the local touch message are located within the calculated output rectangle associated with the current orientation at the remote screen.
  3. The method recited in claim 1, wherein automatically calculating the output rectangle further comprises differentiating the region in the local screen filled with the unique color from one or more invalid regions displaying a different color.
  4. The method recited in claim 3, further comprising:
    discarding the local touch message in response to determining that the first coordinates associated with the local touch message are located within the one or more invalid regions.
  5. The method recited in claim 1, wherein automatically calculating the output rectangle further comprises:
    identifying a first corner in the calculated output rectangle, the first corner having a first x-coordinate (L) and a first y-coordinate (T) in the local screen; and
    identifying a second corner in the calculated output rectangle, the second corner across a diagonal from the first corner and having a second x-coordinate (R) and a second y-coordinate (B) in the local screen.
  6. The method recited in claim 5, wherein the first coordinates in the local screen are converted to the second coordinates in the remote screen according to the equations:
    X′ = (X - L) / (R - L) × W′
    and
    Y′ = (Y - T) / (B - T) × H′
    where X and Y represent the first coordinates in the local screen, W′ represents a maximum value in the resolution associated with the remote screen, H′ represents a minimum value in the resolution associated with the remote screen, and X′ and Y′ represent the second coordinates in the remote screen when the remote screen casted from the source device is in a landscape orientation.
  7. The method recited in claim 5, wherein the first coordinates in the local screen are converted to the second coordinates in the remote screen according to the equations:
    X′ = (X - L) / (R - L) × H′
    and
    Y′ = (Y - T) / (B - T) × W′
    where X and Y represent the first coordinates in the local screen, W′ represents a maximum value in the resolution associated with the remote screen, H′ represents a minimum value in the resolution associated with the remote screen, and X′ and Y′ represent the second coordinates in the remote screen when the remote screen casted from the source device is in a portrait orientation.
  8. The method recited in claim 1, wherein automatically calculating the output rectangle further comprises locking a surface associated with the local screen to ensure that a calibration application has exclusive permission to access the surface associated with the local screen.
  9. The method recited in claim 1, wherein:
    the local touch message is captured from a human interface device (HID) supported at the sink device,
    the sink device implements a Bluetooth HID role to back-control the source device, and
    the sink device implements a Bluetooth Serial Port Profile (SPP) initiator role to retrieve the current orientation and the resolution associated with the remote screen and to request that the source device display the calibration screen.
  10. The method recited in claim 1, further comprising:
    storing, at the sink device, a result from calculating the output rectangle associated with the current orientation at the remote screen; and
    retrieving the stored result from calculating the output rectangle associated with the current orientation at the remote screen in response to the source device establishing a subsequent session to cast the remote screen into the local screen at the sink device.
  11. A sink device configured to back-control a source device, comprising:
    a local screen configured to display a remote screen casted from the source device;
    a transceiver configured to communicate with the source device to retrieve at least a current orientation and a resolution associated with the remote screen casted from the source device and to request that the source device display a calibration screen configured to fill the remote screen with a unique color; and
    one or more processors configured to:
    automatically calculate an output rectangle associated with the current orientation at the remote screen casted from the source device, wherein the output rectangle comprises a region in the local screen filled with the unique color;
    capture a local touch message at the sink device, the local touch message associated with first coordinates in the local screen at the sink device; and
    convert the first coordinates in the local screen to second coordinates in the remote screen based on the resolution associated with the remote screen and the calculated output rectangle associated with the current orientation at the remote screen.
  12. The sink device recited in claim 11, wherein the one or more processors are further configured to cause the source device to apply the local touch message at the second coordinates in the remote screen in response to a determination that the first coordinates associated with the local touch message are located within the calculated output rectangle associated with the current orientation at the remote screen.
  13. The sink device recited in claim 11, wherein the one or more processors are further configured to:
    differentiate the region in the local screen filled with the unique color from one or more invalid regions displaying a different color to automatically calculate the output rectangle; and
    discard the local touch message in response to a determination that the first coordinates associated with the local touch message are located within the one or more invalid regions.
  14. The sink device recited in claim 11, wherein the one or more processors are further configured to:
    identify a first corner in the calculated output rectangle, the first corner having a first x-coordinate (L) and a first y-coordinate (T) in the local screen; and
    identify a second corner in the calculated output rectangle, the second corner across a diagonal from the first corner and having a second x-coordinate (R) and a second y-coordinate (B) in the local screen.
  15. The sink device recited in claim 14, wherein the first coordinates in the local screen are converted to the second coordinates in the remote screen according to the equations:
    X′ = (X - L) / (R - L) × W′
    and
    Y′ = (Y - T) / (B - T) × H′
    where X and Y represent the first coordinates in the local screen, W′ represents a maximum value in the resolution associated with the remote screen, H′ represents a minimum value in the resolution associated with the remote screen, and X′ and Y′ represent the second coordinates in the remote screen when the remote screen casted from the source device is in a landscape orientation.
  16. The sink device recited in claim 14, wherein the first coordinates in the local screen are converted to the second coordinates in the remote screen according to the equations:
    X′ = (X - L) / (R - L) × H′
    and
    Y′ = (Y - T) / (B - T) × W′
    where X and Y represent the first coordinates in the local screen, W′ represents a maximum value in the resolution associated with the remote screen, H′ represents a minimum value in the resolution associated with the remote screen, and X′ and Y′ represent the second coordinates in the remote screen when the remote screen casted from the source device is in a portrait orientation.
  17. An apparatus, comprising:
    means for determining at least a current orientation and a resolution associated with a remote screen casted from a source device, wherein the remote screen casted from the source device is displayed on a local screen at the apparatus;
    means for sending, to the source device, a message to request that the source device display a calibration screen configured to fill the remote screen with a unique color;
    means for automatically calculating an output rectangle associated with the current orientation at the remote screen casted from the source device, wherein the output rectangle comprises a region in the local screen filled with the unique color;
    means for capturing a local touch message, the local touch message associated with first coordinates in the local screen at the apparatus; and
    means for converting the first coordinates in the local screen to second coordinates in the remote screen based on the resolution associated with the remote screen and the calculated output rectangle associated with the current orientation at the remote screen.
  18. The apparatus recited in claim 17, further comprising:
    means for causing the source device to apply the local touch message at the second coordinates in the remote screen in response to the first coordinates associated with the local touch message falling within the calculated output rectangle associated with the current orientation at the remote screen.
  19. The apparatus recited in claim 17, wherein the means for automatically calculating the output rectangle is further configured to differentiate the region in the local screen filled with the unique color from one or more invalid regions in which a different color is displayed.
  20. The apparatus recited in claim 19, further comprising:
    means for discarding the local touch message in response to the first coordinates associated with the local touch message falling within the one or more invalid regions.
  21. The apparatus recited in claim 17, further comprising:
    means for locking a surface associated with the local screen to ensure that a calibration application has exclusive permission to access the surface associated with the local screen.
  22. The apparatus recited in claim 17, wherein:
    the local touch message is captured from a human interface device (HID) ,
    the apparatus implements a Bluetooth HID role to back-control the source  device, and
    the apparatus implements a Bluetooth Serial Port Profile (SPP) initiator role to retrieve the current orientation and the resolution associated with the remote screen and to request that the source device display the calibration screen.
  23. The apparatus recited in claim 17, further comprising:
    means for storing a result from calculating the output rectangle associated with the current orientation at the remote screen; and
    means for retrieving the stored result from calculating the output rectangle associated with the current orientation at the remote screen in response to the source device establishing a subsequent session to cast the remote screen into the local screen at the apparatus.
  24. A method for back-controlling a source device, comprising:
    implementing, at a sink device, a human interface device (HID) to back-control a source device casting a remote screen into a local screen at the sink device;
    determining, at the sink device, at least a low threshold velocity above which the source device accelerates a pointer in the remote screen;
    sampling, at the sink device, a local touch message applied via the HID to the local screen according to a native sample rate; and
    transmitting, to the source device, a message to report the local touch message at the native sample rate in response to the local touch message comprising a press, a release, or a move that satisfies a predetermined condition that depends, at least in part, on the low threshold velocity above which the source device accelerates the pointer in the remote screen.
  25. The method recited in claim 24, wherein the move satisfies the predetermined condition where the local touch message moves the pointer in the remote screen from a prior point to a current point at a velocity less than the low threshold velocity above which the source device accelerates the pointer in the remote screen.
  26. The method recited in claim 25, wherein the velocity at which the local touch message moves the pointer in the remote screen depends on a distance between the prior  point and the current point in the remote screen in combination with a latency to send the message to report the local touch message to the source device.
  27. The method recited in claim 26, further comprising:
    determining coordinates associated with the prior point and the current point in the local screen at the sink device;
    converting the coordinates associated with the prior point and the current point in the local screen at the sink device to coordinates in the remote screen at the source device; and
    calculating the distance between the prior point and the current point in the remote screen based on the converted coordinates in the remote screen.
  28. The method recited in claim 24, further comprising:
    transmitting, to the source device, the message to report the local touch message at a re-sample rate in response to the local touch message comprising a move having an index that matches an interval associated with the re-sample rate, wherein the interval associated with the re-sample rate is greater than an interval associated with the native sample rate.
  29. The method recited in claim 28, further comprising:
    transmitting, to the source device, a padding report with the message transmitted to the source device at the re-sample rate, wherein the padding report comprises a HID input report that does not move the pointer in the remote screen at the source device.
  30. The method recited in claim 24, further comprising:
    storing the local touch message without transmitting the message to report the local touch message to the source device in response to the local touch message comprising a move that does not satisfy the predetermined condition or have an index that matches an interval associated with a re-sample rate that is greater than an interval associated with the native sample rate.
  31. The method recited in claim 24, further comprising:
    transmitting, to the source device, a HID input report corresponding to a last  touch message that moved the pointer with the message to report the local touch message comprising the release.
  32. A sink device configured to back-control a source device, comprising:
    a local screen configured to display a remote screen casted from the source device;
    a transceiver configured to communicate with the source device to determine at least a low threshold velocity above which the source device accelerates a pointer in the remote screen; and
    one or more processors configured to:
    implement a human interface device (HID) to back-control the source device;
    sample a local touch message applied via the HID to the local screen according to a native sample rate; and
    cause the transceiver to transmit a message to report the local touch message to the source device at the native sample rate in response to the local touch message comprising a press, a release, or a move that satisfies a predetermined condition that depends, at least in part, on the low threshold velocity above which the source device accelerates the pointer in the remote screen.
  33. The sink device recited in claim 32, wherein the move satisfies the predetermined condition where the local touch message moves the pointer in the remote screen from a prior point to a current point at a velocity less than the low threshold velocity above which the source device accelerates the pointer in the remote screen.
  34. The sink device recited in claim 33, wherein the velocity at which the local touch message moves the pointer in the remote screen depends on a distance between the prior point and the current point in the remote screen in combination with a latency to send the message to report the local touch message to the source device.
  35. The sink device recited in claim 32, wherein the one or more processors are further configured to:
    cause the transceiver to transmit the message to report the local touch message at a re-sample rate in response to the local touch message comprising a move having an index that matches an interval associated with the re-sample rate, wherein the interval associated with the re-sample rate is greater than an interval associated with the native sample rate; and
    cause the transceiver to transmit, to the source device, a padding report with the message transmitted to the source device at the re-sample rate, wherein the padding report comprises a HID input report that does not move the pointer in the remote screen at the source device.
  36. The sink device recited in claim 32, wherein the one or more processors are further configured to store the local touch message without transmitting the message to report the local touch message to the source device in response to the local touch message comprising a move that does not satisfy the predetermined condition or have an index that matches an interval associated with a re-sample rate that is greater than an interval associated with the native sample rate.
  37. The sink device recited in claim 32, wherein the one or more processors are further configured to:
    cause the transceiver to transmit, to the source device, a HID input report corresponding to a last touch message that moved the pointer with the message to report the local touch message comprising the release.
  38. An apparatus, comprising:
    means for implementing a human interface device (HID) to back-control a source device casting a remote screen into a local screen at the apparatus;
    means for determining at least a low threshold velocity above which the source device accelerates a pointer in the remote screen;
    means for sampling a local touch message applied via the HID to the local screen according to a native sample rate; and
    means for transmitting, to the source device, a message to report the local touch message at the native sample rate in response to the local touch message comprising a press, a release, or a move that satisfies a predetermined condition that depends, at least in part, on the low threshold velocity above which the source device accelerates the pointer in the remote screen.
  39. The apparatus recited in claim 38, wherein the move satisfies the predetermined condition where the local touch message moves the pointer in the remote screen from a prior point to a current point at a velocity less than the low threshold velocity above which the source device accelerates the pointer in the remote screen.
  40. The apparatus recited in claim 39, wherein the velocity at which the local touch message moves the pointer in the remote screen depends on a distance between the prior point and the current point in the remote screen in combination with a latency to send the message to report the local touch message to the source device.
  41. The apparatus recited in claim 40, further comprising:
    means for determining coordinates associated with the prior point and the current point in the local screen at the apparatus;
    means for converting the coordinates associated with the prior point and the current point in the local screen at the apparatus to coordinates in the remote screen at the source device; and
    means for calculating the distance between the prior point and the current point in the remote screen based on the converted coordinates in the remote screen.
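The calculation described across claims 39 through 41 (convert local-screen coordinates to remote-screen coordinates, take the distance between the converted points, and divide by the reporting latency to get a velocity) can be sketched as follows. This is an illustrative reading of the claims, not the application's implementation: the helper names and the resolution values are assumptions, and a simple linear scaling between the two screens is assumed.

```python
import math

def to_remote(x, y, local_w, local_h, remote_w, remote_h):
    """Convert a local-screen coordinate to the remote (source) screen,
    assuming the cast fills both screens (simple linear scaling)."""
    return x * remote_w / local_w, y * remote_h / local_h

def move_velocity(prior, current, latency_s,
                  local_res=(1280, 720), remote_res=(1920, 1080)):
    """Velocity, in remote-screen pixels per second, of a move from
    `prior` to `current`, given the latency to report the touch message
    to the source device. Resolutions here are illustrative defaults."""
    px, py = to_remote(*prior, *local_res, *remote_res)
    cx, cy = to_remote(*current, *local_res, *remote_res)
    # Distance is measured in the remote screen, per claims 40-41.
    distance = math.hypot(cx - px, cy - py)
    return distance / latency_s
```

Under this reading, a move would satisfy the predetermined condition of claim 39 when `move_velocity(...)` falls below the low threshold velocity negotiated with the source device.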
  42. The apparatus recited in claim 38, further comprising:
    means for transmitting, to the source device, the message to report the local touch message at a re-sample rate in response to the local touch message comprising a move having an index that matches an interval associated with the re-sample rate, wherein the interval associated with the re-sample rate is greater than an interval associated with the native sample rate.
  43. The apparatus recited in claim 42, further comprising:
    means for transmitting, to the source device, a padding report with the message transmitted to the source device at the re-sample rate, wherein the padding report comprises a HID input report that does not move the pointer in the remote screen at the source device.
  44. The apparatus recited in claim 38, further comprising:
    means for storing the local touch message without transmitting the message to report the local touch message to the source device in response to the local touch message comprising a move that does not satisfy the predetermined condition or have an index that matches an interval associated with a re-sample rate that is greater than an interval associated with the native sample rate.
  45. The apparatus recited in claim 38, further comprising:
    means for transmitting, to the source device, a HID input report corresponding to a last touch message that moved the pointer with the message to report the local touch message comprising the release.
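The reporting policy spelled out across claims 38 and 42 through 45 amounts to a small per-message decision procedure: presses and releases are always reported, slow moves are reported at the native sample rate, fast moves are reported only on re-sample boundaries (with a padding report), other moves are stored, and a stored move is replayed with the release. The sketch below is one illustrative reading of those claims, not the applicant's implementation; the class, method, and report names are assumptions.

```python
class TouchReporter:
    """Decides, per locally sampled touch message, which reports to
    transmit to the source device (an illustrative reading of claims
    38 and 42-45)."""

    def __init__(self, low_threshold, resample_interval):
        self.low_threshold = low_threshold          # velocity above which the source accelerates
        self.resample_interval = resample_interval  # re-sample interval, in native samples
        self.move_index = 0
        self.last_stored_move = None                # most recent unreported move, if any

    def handle(self, kind, velocity=None):
        """Return the list of reports to transmit for one touch message.
        `kind` is 'press', 'release', or 'move'."""
        if kind == 'press':
            return ['press']
        if kind == 'release':
            reports = []
            if self.last_stored_move is not None:
                # Claim 45: send the last pointer-moving report with the release.
                reports.append('last_move')
                self.last_stored_move = None
            reports.append('release')
            return reports
        # kind == 'move'
        self.move_index += 1
        if velocity < self.low_threshold:
            # Claims 38-39: a slow move is reported at the native sample rate.
            self.last_stored_move = None
            return ['move']
        if self.move_index % self.resample_interval == 0:
            # Claims 42-43: a fast move on a re-sample boundary is reported
            # together with a padding HID report that moves nothing.
            self.last_stored_move = None
            return ['move', 'padding']
        # Claim 44: otherwise the move is stored without being transmitted.
        self.last_stored_move = 'move'
        return []
```

Replaying the stored move on release (claim 45) keeps the remote pointer's final position consistent with the local touch even when intermediate fast moves were dropped by the re-sampling.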
PCT/CN2016/081844 2016-05-12 2016-05-12 Human interface device and automatic calibration for back-controlling source device during remote screen casting session WO2017193328A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/CN2016/081844 WO2017193328A1 (en) 2016-05-12 2016-05-12 Human interface device and automatic calibration for back-controlling source device during remote screen casting session

Publications (1)

Publication Number Publication Date
WO2017193328A1 true WO2017193328A1 (en) 2017-11-16

Family

ID=60266150

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2016/081844 WO2017193328A1 (en) 2016-05-12 2016-05-12 Human interface device and automatic calibration for back-controlling source device during remote screen casting session

Country Status (1)

Country Link
WO (1) WO2017193328A1 (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2013090489A1 (en) * 2011-12-14 2013-06-20 Motorola Mobility Llc Method and apparatus for data transfer of touch screen events between devices
CN103324322A (en) * 2012-03-20 2013-09-25 蓝云科技股份有限公司 Touch device and system provided with multiple touch devices
US20150082241A1 (en) * 2013-09-17 2015-03-19 Samsung Electronics Co., Ltd. Method for screen mirroring and source device thereof
CN105045584A (en) * 2015-07-07 2015-11-11 东风汽车有限公司 Touch control method and system for screen of vehicle machine
CN105528100A (en) * 2014-10-21 2016-04-27 惠州市德赛西威汽车电子股份有限公司 A reverse control method for interconnected vehicle-mounted infotainment device and mobile terminal
CN105573692A (en) * 2015-05-29 2016-05-11 宇龙计算机通信科技(深圳)有限公司 Projection control method, associated terminal and system

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20220027009A1 (en) * 2018-08-01 2022-01-27 Samsung Electronics Co., Ltd. Electronic device for processing input event and method of operating same
US11669192B2 (en) 2018-08-01 2023-06-06 Samsung Electronics Co., Ltd. Electronic device for processing input event and method of operating same
CN112513795A (en) * 2018-08-01 2021-03-16 三星电子株式会社 Electronic device for processing input event and operation method thereof
EP3807749A4 (en) * 2018-08-01 2021-07-28 Samsung Electronics Co., Ltd. Electronic device for processing input event and method of operating same
US11144154B2 (en) 2018-08-01 2021-10-12 Samsung Electronics Co., Ltd. Electronic device for processing input event and method of operating same
KR20200014544A (en) * 2018-08-01 2020-02-11 삼성전자주식회사 Electronic apparatus for processing input event and operating method thereof
KR102527080B1 (en) * 2018-08-01 2023-05-02 삼성전자주식회사 Electronic apparatus for processing input event and operating method thereof
CN110851368A (en) * 2019-11-19 2020-02-28 天津车之家数据信息技术有限公司 Multi-device collaborative testing method and device, computing device and system
US11862113B2 (en) * 2020-08-13 2024-01-02 Chongqing Konka Photoelectric Technology Research Institute Co., Ltd. Display panel and electronic device
US20230162694A1 (en) * 2020-08-13 2023-05-25 Chongqing Konka Photoelectric Technology Research Institute Co., Ltd. Display Panel and Electronic Device
CN113393796A (en) * 2021-06-22 2021-09-14 湖北亿咖通科技有限公司 Screen sharing method, device, system, vehicle, equipment and storage medium
CN113393796B (en) * 2021-06-22 2022-10-14 亿咖通(湖北)技术有限公司 Screen sharing method, device, system, vehicle, equipment and storage medium
WO2023020528A1 (en) * 2021-08-19 2023-02-23 华为技术有限公司 Mirroring method, device, storage medium, and computer program product
WO2024001812A1 (en) * 2022-06-27 2024-01-04 华为技术有限公司 Message management method, electronic device and system

Similar Documents

Publication Publication Date Title
WO2017193328A1 (en) Human interface device and automatic calibration for back-controlling source device during remote screen casting session
US10382494B2 (en) User input back channel for wireless displays
JP6013562B2 (en) Negotiating functionality between the wireless sink and the wireless source device
JP6092338B2 (en) User input back channel for wireless display
US9065876B2 (en) User input back channel from a wireless sink device to a wireless source device for multi-touch gesture wireless displays
US9413803B2 (en) User input back channel for wireless displays
US8914187B2 (en) Vehicle dashboard wireless display system
US8964783B2 (en) User input back channel for wireless displays
US9582239B2 (en) User input back channel for wireless displays
US20130003624A1 (en) User input back channel for wireless displays
JP2014508995A (en) User input back channel for wireless display
JP2014510434A (en) User input back channel for wireless display

Legal Events

Date Code Title Description
NENP Non-entry into the national phase (Ref country code: DE)
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 16901283; Country of ref document: EP; Kind code of ref document: A1)
122 Ep: pct application non-entry in european phase (Ref document number: 16901283; Country of ref document: EP; Kind code of ref document: A1)