CN118476234A - Source device, sink device, and method of operating the same

Info

Publication number: CN118476234A
Application number: CN202280086528.8A
Authority: CN (China)
Prior art keywords: user input, sink device, input data, source device, transmission
Legal status: Pending
Other languages: Chinese (zh)
Inventors: 金炯振, 李承范
Current Assignee: Samsung Electronics Co Ltd
Original Assignee: Samsung Electronics Co Ltd
Application filed by Samsung Electronics Co Ltd
Priority claimed from KR1020220019455A (KR20230103800A)
Priority claimed from PCT/KR2022/016772 (WO2023128206A1)
Classification: User Interface Of Digital Computer (AREA)

Abstract

A source device includes a wireless communication module, a memory, and a processor. The processor is configured to transmit, through the wireless communication module, screen image data generated by the source device to be displayed on a sink device. The processor is further configured to determine, while the screen image is displayed on the sink device, whether a target application is being executed, the target application being configured to change a transmission amount of user input data generated in the screen image, which is based on the screen image data, by an input device connected to the sink device. The processor is further configured to, based on determining that the target application is being executed, adjust a transmission bit rate of the screen image data by changing a transmission profile used to transmit the screen image data.

Description

Source device, sink device, and method of operating the same
Technical Field
The present disclosure relates to a source device, a sink device, and methods of operating the same.
Background
Human interface devices (HIDs), which provide a user interface between a user and a device, may handle various types of user input, such as touch input, gesture input, mouse input, keyboard input, and/or pen input. User input may be transmitted between devices through, for example, a user input back channel (UIBC). UIBC corresponds to a function of transmitting user input, occurring in an image displayed on a display screen of the sink device, to the source device so that the source device can process the user input. UIBC may thus be used to deliver user input from the sink device to a user interface included in the source device. Here, a device that provides an image to another electronic device may be referred to as a "source device" and a device that receives the image may be referred to as a "sink device."
Disclosure of Invention
As technology advances, the number and size of pieces of user input data generated by a human interface device (HID) increase in proportion to the increase in the original size of a video image. For example, for pen input, 300 to 400 pieces of input data may be generated per second. However, when network throughput is poor and UIBC data is nevertheless transmitted in the same manner as when network throughput is good, the amount and size of UIBC data to be processed within the limited throughput may increase. This may delay user input between the devices exchanging UIBC data, compromising usability. In addition, although the bit rate of user input data is much smaller than that of a video image, the sheer number of pieces of user input data may still cause user input to be delayed.
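For a rough sense of scale (the packet size here is an assumption for illustration, not a figure from the disclosure): if each pen event were carried in its own UIBC/TCP packet of about 60 bytes, 400 events per second would amount to only about 24 KB/s, or roughly 0.2 Mbit/s, which is negligible next to a video stream of several Mbit/s. The cost lies instead in the 400 individual packets per second that must each be delivered and processed in order; on a congested link, the queuing of those packets is what the user perceives as input lag.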
According to an embodiment, the amount of user input data (e.g., user input back channel (UIBC) data) exchanged between a source device and a sink device may be adaptively adjusted according to the type of application used in the source device and/or a network environment including network throughput.
According to an embodiment, the sink device may adaptively adjust the amount of user input data to be transmitted by determining network conditions.
According to the embodiments, when a target application in which a large amount of user input may occur is executed in a sink device, a source device may, through communication between the source device and the sink device, reduce the bit rate of a video image to be transmitted to the sink device and increase the amount of user input data to be transmitted to the source device.
According to one embodiment, a source device includes a wireless communication module, a memory, and a processor. The processor is configured to transmit, through the wireless communication module to a sink device, screen image data generated by the source device to be displayed on the sink device; determine, while the screen image is displayed on the sink device, whether a target application is being executed, the target application being configured to change a transmission amount of user input data generated in the screen image, based on the screen image data, by an input device connected to the sink device; and, when it is determined that the target application is being executed, adjust a transmission bit rate of the screen image data by changing a transmission profile used to transmit the screen image data.
According to one embodiment, a sink device includes a wireless communication module, a display module, a memory, and a processor. The processor may be configured to: receive, through the wireless communication module, screen image data generated by a source device to be displayed on the sink device; display a screen image based on the screen image data using the display module; acquire user input data generated in the screen image by an input device connected to the sink device while the screen image is displayed; dynamically change, based on a network quality between the source device and the sink device, a parameter for adaptively adjusting a transmission amount of the user input data; and transmit the dynamically changed parameter to the source device, the transmission amount of the user input data including at least one of a size of data to be transmitted or a number of pieces of data.
According to one embodiment, a method of operating a source device includes: transmitting, to a sink device, screen image data generated by the source device to be displayed on the sink device; determining, while the screen image is displayed on the sink device, whether a target application is being executed, the target application being configured to change a transmission amount of user input data generated in the screen image, based on the screen image data, by an input device connected to the sink device; and, when it is determined that the target application is being executed, adjusting a transmission bit rate of the screen image data by changing a transmission profile used to transmit the screen image data.
According to one embodiment, a method of operating a sink device includes: receiving screen image data generated by a source device to be displayed on the sink device; displaying a screen image based on the screen image data; acquiring user input data generated in the screen image by an input device connected to the sink device while the screen image is displayed; dynamically changing, based on a network quality between the source device and the sink device, a parameter for adaptively adjusting a transmission amount of the user input data, the parameter including at least one of a size of data to be transmitted or a number of pieces of data; and transmitting the dynamically changed parameter to the source device.
According to one embodiment, the sink device may determine network quality and adaptively increase or decrease the amount of transmission of user input data, thereby improving the availability of an input device (e.g., HID) connected to the sink device.
According to one embodiment, the quality of user input data may be improved by adaptively adjusting the amount of transmission of data exchanged between a source device and a sink device through communication between the source device and the sink device.
According to one embodiment, the latency of user input may be reduced and usability increased by adaptively adjusting the amount of user input data to be generated and/or the amount of user input data to be transmitted based on network conditions.
According to one embodiment, by adjusting the bit rate of data (e.g., screen image data) transmitted by a source device according to the type of application executing in a sink device, it is possible to reduce the delay of user input and improve usability.
Drawings
The foregoing and other aspects, features, and advantages of certain embodiments of the present disclosure will become more apparent from the following detailed description, taken in conjunction with the accompanying drawings, in which:
FIG. 1 is a block diagram illustrating an electronic device in a network environment, according to one embodiment;
FIG. 2 is a block diagram illustrating a program according to one embodiment;
FIG. 3 is a block diagram illustrating a source device according to one embodiment;
FIG. 4 is a block diagram illustrating a sink device according to one embodiment;
FIG. 5 is a diagram illustrating operations performed between a source device and a sink device according to one embodiment;
FIG. 6 is a state diagram of a transmission profile according to one embodiment;
FIG. 7 is a diagram illustrating the amount of transmission and data size of each of a plurality of transmission profiles according to one embodiment;
FIG. 8 is a diagram illustrating a method for a sink device to limit the number of pieces of user input data to be transmitted, according to one embodiment;
FIG. 9 is a diagram illustrating an example of an input report according to one embodiment;
FIG. 10 is a diagram illustrating a method of exchanging input reports and report descriptors between a source device and a sink device, according to one embodiment;
FIG. 11 is a diagram illustrating a report descriptor changed for each transmission profile in response to a touch input, according to one embodiment;
FIG. 12 is a diagram illustrating a report descriptor changed for each transmission profile in response to a pen input, according to one embodiment;
FIG. 13 is a flow chart illustrating a method of operating a source device according to one embodiment;
FIG. 14 is a flow chart illustrating a method of operating a sink device according to one embodiment;
FIG. 15 is a flow chart illustrating a method of adjusting the bit rate of data sent through a communication between a source device and a sink device according to one embodiment; and
FIG. 16 is a diagram illustrating a process of transmitting user input data when a source device is a user terminal and a sink device is smart glasses, according to one embodiment.
Detailed Description
Hereinafter, embodiments will be described in detail with reference to the accompanying drawings. When the embodiments are described with reference to the drawings, like reference numerals denote like elements, and any repetitive description thereof will be omitted.
FIG. 1 is a block diagram illustrating an electronic device 101 in a network environment 100 according to one embodiment. Referring to fig. 1, an electronic device 101 in a network environment 100 may communicate with the electronic device 102 via a first network 198 (e.g., a short-range wireless communication network) or with at least one of the electronic device 104 or the server 108 via a second network 199 (e.g., a long-range wireless communication network). According to one embodiment, the electronic device 101 may communicate with the electronic device 104 via a server 108. According to one embodiment, the electronic device 101 may include a processor 120, a memory 130, an input module 150, a sound output module 155, a display module 160, an audio module 170, a sensor module 176, an interface 177, a connection terminal 178, a haptic module 179, a camera module 180, a power management module 188, a battery 189, a communication module 190, a Subscriber Identity Module (SIM) 196, or an antenna module 197. In some embodiments, at least one component (e.g., connection terminal 178) may be omitted from electronic device 101, or one or more other components may be added to electronic device 101. In some embodiments, some components (e.g., sensor module 176, camera module 180, or antenna module 197) may be integrated into a single component (e.g., display module 160).
The processor 120 may execute, for example, software (e.g., program 140) to control at least one other component (e.g., hardware or software component) of the electronic device 101 connected to the processor 120, and may perform various data processing or calculations. According to one embodiment, as at least part of the data processing or calculation, the processor 120 may store commands or data received from another component (e.g., the sensor module 176 or the communication module 190) in the volatile memory 132, process the commands or data stored in the volatile memory 132, and store the resulting data in the non-volatile memory 134. According to one embodiment, the processor 120 may include a main processor 121 (e.g., a Central Processing Unit (CPU) or an Application Processor (AP)) or an auxiliary processor 123 (e.g., a Graphics Processing Unit (GPU), a Neural Processing Unit (NPU), an Image Signal Processor (ISP), a sensor hub processor, or a Communication Processor (CP)), and the auxiliary processor 123 may operate independently of the main processor 121 or in conjunction with the main processor 121. For example, when the electronic device 101 includes the main processor 121 and the auxiliary processor 123, the auxiliary processor 123 may be adapted to consume less power than the main processor 121 or be dedicated to a specified function. The auxiliary processor 123 may be implemented separately from the main processor 121 or as part of the main processor 121.
The auxiliary processor 123 may control at least some of the functions or states associated with at least one component of the electronic device 101 (e.g., the display module 160, the sensor module 176, or the communication module 190) in place of the main processor 121 when the main processor 121 is in an inactive (e.g., sleep) state or in conjunction with the main processor 121 when the main processor 121 is in an active state (e.g., executing an application). According to one embodiment, the auxiliary processor 123 (e.g., ISP or CP) may be implemented as part of another component (e.g., camera module 180 or communication module 190) functionally associated with the auxiliary processor 123. According to one embodiment, the auxiliary processor 123 (e.g., NPU) may include a hardware architecture designated for processing an Artificial Intelligence (AI) model. AI models may be generated by machine learning. Such learning may be performed by, for example, the electronic device 101 in which the AI model is executed, or via a separate server (e.g., server 108). The learning algorithm may include, but is not limited to, for example, supervised learning, unsupervised learning, semi-supervised learning, or reinforcement learning. The AI model may include a plurality of artificial neural network layers. The artificial neural network may include, for example, a Deep Neural Network (DNN), a Convolutional Neural Network (CNN), a Recurrent Neural Network (RNN), a restricted Boltzmann machine (RBM), a Deep Belief Network (DBN), a bi-directional recurrent deep neural network (BRDNN), a deep Q-network, or a combination of two or more thereof, but is not limited thereto. The AI model may additionally or alternatively include a software structure in addition to the hardware structure.
The memory 130 may store various data used by at least one component of the electronic device 101 (e.g., the processor 120 or the sensor module 176). The data may include, for example, software (e.g., program 140) and input data or output data for commands associated therewith. Memory 130 may include volatile memory 132 or nonvolatile memory 134.
The program 140 may be stored as software in the memory 130 and may include, for example, an Operating System (OS) 142, middleware 144, or applications 146.
The input module 150 may receive commands or data from outside the electronic device 101 (e.g., a user) to be used by another component of the electronic device 101 (e.g., the processor 120). The input module 150 may include, for example, a microphone, a mouse, a keyboard, keys (e.g., buttons) or a digital pen (e.g., a stylus).
The sound output module 155 may output a sound signal to the outside of the electronic device 101. The sound output module 155 may include, for example, a speaker or a receiver. The speakers may be used for general purposes such as playing multimedia or playing recordings. The receiver may be used to receive an incoming call. According to one embodiment, the receiver may be implemented separately from the speaker or as part of the speaker.
The display module 160 may visually provide information to an exterior (e.g., a user) of the electronic device 101. The display module 160 may include, for example, a display, a hologram device, or a projector, and a control circuit for controlling a corresponding one of the display, the hologram device, and the projector. According to one embodiment, the display module 160 may include a touch sensor adapted to detect a touch or a pressure sensor adapted to measure the strength of a force caused by a touch.
The audio module 170 may convert sound into electrical signals and vice versa. According to one embodiment, the audio module 170 may obtain sound via the input module 150 or output sound via the sound output module 155 or an external electronic device (e.g., electronic device 102 such as a speaker or earphone) connected directly or wirelessly to the electronic device 101.
The sensor module 176 may detect an operational state (e.g., power or temperature) of the electronic device 101 or an environmental state (e.g., a state of a user) external to the electronic device 101 and generate an electrical signal or data value corresponding to the detected state. According to one embodiment, the sensor module 176 may include, for example, a gesture sensor, a gyroscope sensor, an atmospheric pressure sensor, a magnetic sensor, an acceleration sensor, a grip sensor, a proximity sensor, a color sensor, an Infrared (IR) sensor, a biometric sensor, a temperature sensor, a humidity sensor, or a fingerprint sensor.
Interface 177 can support one or more specified protocols for electronic device 101 to couple directly (e.g., wired) or wirelessly with an external electronic device (e.g., electronic device 102). According to one embodiment, interface 177 may include, for example, a high-definition multimedia interface (HDMI), a Universal Serial Bus (USB) interface, a Secure Digital (SD) card interface, or an audio interface.
The connection terminal 178 may include a connector via which the electronic device 101 may be physically connected to an external electronic device (e.g., the electronic device 102). According to one embodiment, the connection terminal 178 may include, for example, an HDMI connector, a USB connector, an SD card connector, or an audio connector (e.g., a headphone connector).
The haptic module 179 may convert the electrical signal into a mechanical stimulus (e.g., vibration or movement) or an electrical stimulus that may be recognized by the user via his or her tactile or kinesthetic sensation. According to one embodiment, haptic module 179 may include, for example, a motor, a piezoelectric element, or an electrostimulator.
The camera module 180 may capture still images and moving images. According to one embodiment, the camera module 180 may include one or more lenses, image sensors, image signal processors, or flashes.
The power management module 188 may manage power supplied to the electronic device 101. According to one embodiment, the power management module 188 may be implemented as at least a portion of, for example, a Power Management Integrated Circuit (PMIC).
Battery 189 may provide power to at least one component of electronic device 101. According to one embodiment, battery 189 may include, for example, a non-rechargeable primary battery, a rechargeable secondary battery, or a fuel cell.
The communication module 190 may support establishing a direct (e.g., wired) communication channel or a wireless communication channel between the electronic device 101 and an external electronic device (e.g., the electronic device 102, the electronic device 104, or the server 108) and performing communication via the established communication channel. The communication module 190 may include one or more CPs that are operable independently of the processor 120 (e.g., an AP) and support direct (e.g., wired) or wireless communication. According to one embodiment, the communication module 190 may include a wireless communication module 192 (e.g., a cellular communication module, a short-range wireless communication module, or a Global Navigation Satellite System (GNSS) communication module) or a wired communication module 194 (e.g., a Local Area Network (LAN) communication module or a Power Line Communication (PLC) module). A corresponding one of these communication modules may communicate with the external electronic device 104 via a first network 198 (e.g., a short-range communication network such as BluetoothTM, wireless fidelity (Wi-Fi) Direct, or infrared data association (IrDA)) or a second network 199 (e.g., a long-range communication network such as a conventional cellular network, a 5G network, a next-generation communication network, the internet, or a computer network (e.g., a LAN or Wide Area Network (WAN))). These various types of communication modules may be implemented as a single component (e.g., a single chip), or may be implemented as multiple components (e.g., multiple chips) separate from each other. The wireless communication module 192 may use subscriber information (e.g., an International Mobile Subscriber Identity (IMSI)) stored in the SIM 196 to identify and authenticate the electronic device 101 in a communication network, such as the first network 198 or the second network 199.
The wireless communication module 192 may support a 5G network following a 4G network as well as next-generation communication technologies (e.g., New Radio (NR) access technology). The NR access technology may support enhanced mobile broadband (eMBB), massive machine type communications (mMTC), or ultra-reliable and low-latency communications (URLLC). The wireless communication module 192 may support a high frequency band (e.g., the mmWave band) to achieve, for example, high data transmission rates. The wireless communication module 192 may support various techniques for ensuring performance in a high frequency band, such as beamforming, massive multiple-input multiple-output (MIMO), full-dimensional MIMO (FD-MIMO), array antennas, analog beamforming, or large-scale antennas. The wireless communication module 192 may support various requirements specified in the electronic device 101, an external electronic device (e.g., electronic device 104), or a network system (e.g., second network 199). According to one embodiment, the wireless communication module 192 may support a peak data rate (e.g., 20 Gbps or greater) for implementing eMBB, loss coverage (e.g., 164 dB or less) for implementing mMTC, or U-plane latency (e.g., 0.5 ms or less for each of downlink (DL) and uplink (UL), or a round trip of 1 ms or less) for implementing URLLC.
The antenna module 197 may transmit signals or power to or receive signals or power from outside of the electronic device 101 (e.g., an external electronic device). According to one embodiment, the antenna module 197 may include an antenna including a radiating element including a conductive material or conductive pattern formed in or on a substrate (e.g., a Printed Circuit Board (PCB)). According to one embodiment, the antenna module 197 may include a plurality of antennas (e.g., array antennas). In this case, at least one antenna of a communication scheme suitable for use in a communication network, such as the first network 198 or the second network 199, may be selected from a plurality of antennas by, for example, the communication module 190. Signals or power may be transmitted or received between the communication module 190 and the external electronic device via at least one selected antenna. According to one embodiment, another component other than the radiating element, such as a Radio Frequency Integrated Circuit (RFIC), may additionally be formed as part of the antenna module 197.
According to an embodiment, antenna module 197 may form a millimeter wave antenna module. According to one embodiment, an mmWave antenna module may include a PCB, an RFIC disposed on or adjacent to a first surface (e.g., a bottom surface) of the PCB and capable of supporting a specified high frequency band (e.g., an mmWave frequency band), and a plurality of antennas (e.g., array antennas) disposed on or adjacent to a second surface (e.g., a top surface or a side surface) of the PCB and capable of transmitting or receiving signals of the specified high frequency band.
At least some of the above components may be coupled to each other and communicate signals (e.g., commands or data) therebetween via an inter-peripheral communication scheme (e.g., bus, general Purpose Input and Output (GPIO), serial Peripheral Interface (SPI), or Mobile Industrial Processor Interface (MIPI)).
According to one embodiment, commands or data may be sent or received between the electronic device 101 and the external electronic device 104 via the server 108 coupled with the second network 199. Each of the external electronic devices 102 or 104 may be the same type of device as the electronic device 101 or a different type of device.
According to one embodiment, all or some of the operations to be performed by electronic device 101 may be performed at one or more external electronic devices (e.g., external electronic devices 102 and 104 and server 108). For example, if the electronic device 101 needs to perform a function or service automatically, or in response to a request from a user or another device, the electronic device 101 may, instead of or in addition to executing the function or service itself, request one or more external electronic devices to perform at least a portion of the function or service. The one or more external electronic devices receiving the request may perform at least a portion of the requested function or service, or an additional function or service related to the request, and may transmit the result of the execution to the electronic device 101. The electronic device 101 may provide the result, with or without further processing, as at least a portion of the reply to the request. To this end, for example, cloud computing, distributed computing, mobile edge computing (MEC), or client-server computing techniques may be used. The electronic device 101 may provide ultra-low-latency services using, for example, distributed computing or mobile edge computing. In one embodiment, the external electronic device 104 may include an internet of things (IoT) device. Server 108 may be an intelligent server using machine learning and/or neural networks. According to one embodiment, the external electronic device 104 or the server 108 may be included in the second network 199. The electronic device 101 may be applied to smart services (e.g., smart home, smart city, smart car, or healthcare) based on 5G communication technology or IoT-related technology.
FIG. 2 is a block diagram 200 illustrating the program 140 according to one embodiment. According to one embodiment, the program 140 may include an OS 142 for controlling one or more resources of the electronic device 101, middleware 144, or an application 146 executable in the OS 142. The OS 142 may include, for example, AndroidTM, iOSTM, WindowsTM, SymbianTM, TizenTM, or BadaTM. For example, at least a portion of program 140 may be preloaded on electronic device 101 during manufacture or may be downloaded from or updated by an external electronic device (e.g., electronic device 102 or 104, or server 108) during use by a user.
The OS 142 may control management (e.g., allocation or deallocation) of one or more system resources (e.g., processes, memory, or power supply) of the electronic device 101. The OS 142 may additionally or alternatively include one or more drivers to drive other hardware devices of the electronic device 101, such as the input module 150, the sound output module 155, the display module 160, the audio module 170, the sensor module 176, the interface 177, the haptic module 179, the camera module 180, the power management module 188, the battery 189, the communication module 190, the SIM 196, or the antenna module 197.
Middleware 144 may provide various functions to application 146 such that the functions or information provided from one or more resources of electronic device 101 may be used by application 146. Middleware 144 may include, for example, an application manager 201, a window manager 203, a multimedia manager 205, a resource manager 207, a power manager 209, a Database (DB) manager 211, a package manager 213, a connectivity manager 215, a notification manager 217, a location manager 219, a graphics manager 221, a security manager 223, a telephony manager 225, or a voice recognition manager 227.
The application manager 201 may, for example, manage the lifecycle of the application 146. The window manager 203 may, for example, manage one or more Graphical User Interface (GUI) resources used on-screen. The multimedia manager 205 may, for example, identify one or more formats to be used for playing the media files and may encode or decode a corresponding one of the media files using a codec suitable for the corresponding format selected from the one or more formats. For example, the resource manager 207 may manage the source code of the application 146 or the memory space of the memory 130. For example, the power manager 209 may manage the capacity, temperature, or power of the battery 189 and may determine or provide relevant information to be used for operation of the electronic device 101 based at least in part on corresponding information of the capacity, temperature, or power of the battery 189. According to one embodiment, the power manager 209 may communicate with a basic input/output system (BIOS) (not shown) of the electronic device 101.
The DB manager 211 may, for example, generate, search, or change a DB to be used by the application 146. The package manager 213 may manage, for example, the installation or update of an application distributed in the form of a package file. For example, the connectivity manager 215 may manage wireless connections or direct connections between the electronic device 101 and external electronic devices. For example, notification manager 217 may provide functionality to notify a user of the occurrence of a specified event (e.g., an incoming call, message, or alarm). For example, the location manager 219 may manage location information about the electronic device 101. For example, the graphics manager 221 may manage one or more graphical effects or user interfaces related to one or more graphical effects to be provided to a user.
The security manager 223 may provide, for example, system security or user authentication. The telephony manager 225 may manage, for example, voice call functions or video call functions provided by the electronic device 101. The speech recognition manager 227 may, for example, send the user's speech data to the server 108 and may receive, from the server 108, a command corresponding to a function to be performed on the electronic device 101 based at least in part on the speech data, or text data converted based at least in part on the speech data. According to one embodiment, the middleware 144 may dynamically delete some existing components or add new components. According to one embodiment, at least a portion of middleware 144 may be included as part of the OS 142 or may be implemented as other software separate from the OS 142.
The applications 146 may include, for example, home 251, dialer 253, Short Message Service (SMS)/Multimedia Message Service (MMS) 255, Instant Message (IM) 257, browser 259, camera 261, alarm 263, contact 265, voice recognition 267, email 269, calendar 271, media player 273, album 275, watch 277, health 279 (e.g., for measuring exercise level or biometric information such as blood glucose), or environmental information 281 (e.g., for measuring air pressure, humidity, or temperature information) applications. According to one embodiment, the applications 146 may also include an information exchange application (not shown) capable of supporting information exchange between the electronic device 101 and an external electronic device. For example, the information exchange application may include a notification relay application adapted to transmit specified information (e.g., a call, a message, or an alarm) to the external electronic device, or a device management application adapted to manage the external electronic device. The notification relay application may transmit notification information corresponding to the occurrence of a specified event (e.g., receipt of an email) at another application of the electronic device 101 (e.g., email application 269) to the external electronic device. Additionally or alternatively, the notification relay application may receive notification information from an external electronic device and provide the notification information to a user of the electronic device 101.
The device management application may control the power (e.g., turn on or off) or functions (e.g., adjustment of brightness, resolution, or focus) of an external electronic device or some component of an external electronic device (e.g., a display module or a camera module of the external electronic device) in communication with the electronic device 101. The device management application may additionally or alternatively support installation, deletion, or update of applications operating on the external electronic device.
The electronic device according to the embodiment may be various types of electronic devices. The electronic device may include, for example, a portable communication device (e.g., a smart phone), a computer device, a portable multimedia device, a portable medical device, a camera, a wearable device, or a home appliance device. According to one embodiment of the present disclosure, the electronic device is not limited to those described above.
It should be understood that the embodiments of the present disclosure and the terminology used therein are not intended to limit the technical features set forth herein to the particular embodiments, and include various modifications, equivalents, or alternatives to the respective embodiments. The same reference numbers may be used for similar or related elements throughout the description taken in conjunction with the drawings. It is to be understood that the singular form of a noun corresponding to an item may include one or more things unless the context clearly indicates otherwise. As used herein, "A or B", "at least one of A and B", "at least one of A or B", "A, B, or C", "at least one of A, B, and C", and "at least one of A, B, or C" may include any one of the items listed together in a respective one of the phrases, or all possible combinations thereof. Terms such as "1st" and "2nd", or "first" and "second", may be used simply to distinguish a component from other components discussed and do not limit the components in other respects (e.g., importance or order). It will be understood that if an element (e.g., a first element) is referred to as being "coupled with", "coupled to", "connected with", or "connected to" another element (e.g., a second element), with or without the term "operatively" or "communicatively", the element may be coupled with the other element directly (e.g., through a wire), wirelessly, or via a third element.
As used in connection with one embodiment of the present disclosure, the term "module" may include an element implemented in hardware, software, or firmware, and may be used interchangeably with other terms (e.g., "logic," "logic block," "component," or "circuit"). A module may be a single integrated component or a minimal unit or portion thereof adapted to perform one or more functions. For example, according to one embodiment, a module may be implemented in the form of an Application Specific Integrated Circuit (ASIC).
The embodiments set forth herein may be implemented as software (e.g., program 140) comprising one or more instructions stored on a storage medium (e.g., internal memory 136 or external memory 138) readable by a machine (e.g., electronic device 101 of fig. 1). For example, a processor (e.g., processor 120) of a machine (e.g., electronic device 101) may invoke at least one of one or more instructions stored in a storage medium and execute it. This allows the machine to be operated to perform at least one function in accordance with the at least one instruction invoked. The one or more instructions may include code generated by a compiler or code executable by an interpreter. The machine-readable storage medium may be provided in the form of a non-transitory storage medium. Here, the term "non-transitory" simply means that the storage medium is a tangible device and does not include a signal (e.g., electromagnetic waves), but the term does not distinguish between a location where data is semi-permanently stored in the storage medium and a location where data is temporarily stored in the storage medium.
According to one embodiment, a method according to one embodiment of the present disclosure may be included and provided in a computer program product. The computer program product may be traded as a product between a seller and a buyer. The computer program product may be distributed in the form of a machine-readable storage medium (e.g., a compact disc read-only memory (CD-ROM)), distributed online (e.g., downloaded or uploaded) via an application store (e.g., Play StoreTM), or distributed directly between two user devices (e.g., smartphones). If distributed online, at least a portion of the computer program product may be temporarily generated or at least temporarily stored in a machine-readable storage medium, such as a memory of a manufacturer's server, a server of an application store, or a relay server.
According to an embodiment, each of the above-described components (e.g., a module or a program) may include a single entity or a plurality of entities, and some of the plurality of entities may be separately provided in different components. Depending on the embodiment, one or more of the above components may be omitted, or one or more other components may be added. Alternatively or additionally, multiple components (e.g., modules or programs) may be integrated into a single component. In this case, according to one embodiment, the integrated component may still perform one or more functions of each of the plurality of components in the same or similar manner as performed by the corresponding one of the plurality of components prior to integration. Depending on the embodiment, operations performed by a module, program, or another component may be performed sequentially, in parallel, repeatedly, or heuristically, or one or more of the operations may be performed in a different order or omitted, or one or more other operations may be added.
FIG. 3 is a block diagram illustrating a source device according to one embodiment. Hereinafter, in embodiments of the present disclosure, source device 300 (e.g., electronic devices 101 and 102 of fig. 1, source device 510 of fig. 5, source device 1001 of fig. 10, source device 1501 of fig. 15, and/or user terminal 1601 of fig. 16) is used to transmit screen image data to sink devices (e.g., electronic devices 101 and 102 of fig. 1, sink device 400 of fig. 4, sink device 530 of fig. 5, sink device 1003 of fig. 10, sink device 1503 of fig. 15, and/or smart glasses 1603 of fig. 16). The screen image data may include at least one of an image frame generated by, for example, copying a screen (e.g., screen 515 of fig. 5) displayed by display module 320 of source device 300, or an image frame associated with screen 515 displayed by display module 320 of source device 300.
For example, source device 300 may send, to sink device 400, the screen 515 actually displayed on source device 300 without change, or may send, to sink device 400, a combination of at least a portion of the image frames constituting the output (e.g., screen 515) displayed on source device 300. In another example, source device 300 may send, to sink device 400, a new image frame that is not output from source device 300. The image frames may be included in a data packet of a specified format and transmitted. Hereinafter, for convenience of description, the "image frame related to the screen 515" may be briefly expressed as "screen image data". The screen image data may be multimedia data including audio data in addition to image data.
Source device 300 and sink device 400 may be located adjacent to each other and may be connected, for example, to the same wireless fidelity (Wi-Fi) network; however, embodiments are not limited thereto. For example, the source device 300 and the sink device 400 may be connected through Bluetooth communication. The source device 300 may be, for example, an electronic device capable of supporting MiracastTM for wirelessly sharing multimedia data, including high-resolution photos and high-definition video content, between Wi-Fi devices. Source device 300 may transmit screen image data, generated by transcoding a screen (e.g., screen 515 of fig. 5) displayed on a display of source device 300, to sink device 400 through wireless communication such as Wi-Fi. The screen image data generated by the source device 300 and transmitted to the sink device 400 may correspond to an image frame different from the image frame output from the source device 300, because the screen image data is generated by copying and/or editing the image frame output from the source device 300, although its content is the same as that of the screen 515 output from the display module 320 included in the source device 300.
In addition, the screen image data transmitted to the sink device 400 may have the same resolution and aspect ratio as those of the screen 515 output from the display module 320 of the source device 300, or may have a different resolution and/or aspect ratio from those of the screen 515 output from the display module 320 of the source device 300.
Here, the video format and/or the audio format of the screen image data transmitted to the sink device 400 may be determined according to codec settings between the source device 300 and the sink device 400. Source device 300 may receive and process user input data, such as touch inputs and key inputs, from sink device 400 via, for example, a user input back channel (UIBC). UIBC may be used when user input data, for user input (e.g., user input 535 of fig. 5) generated by an input device connected to sink device 400 in the screen image 537 based on the screen image data, is transmitted to source device 300 to be processed there while the screen image 537 is displayed on a display screen of the sink device 400.
Referring to fig. 3, a source apparatus 300 includes a wireless communication module 310 (e.g., wireless communication module 192 of fig. 1), a display module 320 (e.g., display module 160 of fig. 1), a memory 330 (e.g., memory 130 of fig. 1), and a processor 340 (e.g., processor 120 of fig. 1).
Wireless communication module 310 may perform wireless communication with sink device 400 and may transmit to sink device 400 a plurality of image frames (e.g., screen image data) related to a screen of source device 300 (e.g., screen 515 of fig. 5) generated by source device 300 for display on sink device 400. In addition, wireless communication module 310 may receive user input data sent from sink device 400 to source device 300 over UIBC.
UIBC may have a reverse channel structure (also referred to as a user interface back channel) and may be configured to allow sink device 400 to transmit, to source device 300, user input data corresponding to user inputs occurring in an input device (e.g., input device 531 of fig. 5) connected to sink device 400. The reverse channel structure may also allow user interface functions and upper-layer messages for transmitting user input to reside in an Internet Protocol (IP) transport layer between sink device 400 and source device 300. To facilitate reliable transmission and sequential delivery of data packets including user input data, UIBC may also be configured to run over a packet-based communication protocol, such as transmission control protocol/internet protocol (TCP/IP) or user datagram protocol (UDP). In addition, UIBC may also be configured to send various types of user input data, including cross-platform or multi-platform user input data that can be handled by various types of computer platforms. For example, source device 300 and sink device 400 may each run a different OS. Support for a variety of user input formats may thus allow many different types of source devices 300 and sink devices 400 to exchange user input via UIBC. For example, a general input format may be used as the user input format, or a platform-specific input format (e.g., the HID format) may be used. In one embodiment, by transmitting and receiving user input data between source device 300 and sink device 400 via UIBC, flexibility in the platform and/or OS used by each device may be provided.
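As a rough illustration of this reverse channel, the sketch below frames a single HID-style input event in a simplified packet and writes it to a TCP socket. The header layout, field values, address, and port are assumptions for illustration and do not reproduce the exact UIBC framing of the Wi-Fi Display specification.

```kotlin
import java.io.DataOutputStream
import java.net.Socket

// Simplified, assumed framing: [inputCategory:1][length:2][payload]
// (the real Wi-Fi Display UIBC header differs; this is a sketch only)
fun sendUibcEvent(socket: Socket, inputCategory: Int, hidReport: ByteArray) {
    val out = DataOutputStream(socket.getOutputStream())
    out.writeByte(inputCategory)   // e.g., 0 = generic, 1 = HID (assumed)
    out.writeShort(hidReport.size) // payload length
    out.write(hidReport)           // platform-specific HID input report
    out.flush()                    // one event per write; TCP preserves order
}

fun main() {
    // Hypothetical sink-side usage: address and port are assumptions
    Socket("192.168.49.1", 7239).use { sock ->
        val penReport = byteArrayOf(0x01, 0x10, 0x20, 0x7F) // dummy report bytes
        sendUibcEvent(sock, inputCategory = 1, hidReport = penReport)
    }
}
```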
For example, when screen image data generated by source device 300 and transmitted to sink device 400 is being displayed using a display module of sink device 400 (e.g., display module 420 of fig. 4), user input data may be generated in screen image 537 by an input event of an input device connected to sink device 400 (or included in sink device 400).
The input devices may include, for example, all input devices that may be connected to sink device 400 via wires and/or wirelessly and may transmit user operations. Input devices may include, for example, a mouse, keyboard, touch screen, pen, microphone, and wearable device, but are not limited thereto. Input events may include, but are not limited to, mouse clicks, key inputs to a keyboard, touch inputs to a touch screen, pen inputs, voice inputs, gesture inputs, and gaze movement inputs, for example.
The display module 320 may display a screen generated by the source device 300 (e.g., screen 515 of fig. 5).
Memory 330 may store computer-executable instructions. Memory 330 may also store various information generated during processing by processor 340. In addition, the memory 330 may store various data and programs. Memory 330 may include, for example, volatile memory or nonvolatile memory. Memory 330 may include a high-capacity storage medium such as a hard disk to store various data.
Processor 340 may execute instructions by accessing memory 330. Processor 340 may transmit screen image data generated by source device 300 to be displayed on sink device 400 to sink device 400 through wireless communication module 310.
The screen image data generated by the source device 300 may include, for example, an image frame obtained by copying the screen 515 displayed on the display module 320 of the source device 300, and/or an image frame related to the screen 515 displayed on the display module 320 of the source device 300, but is not limited thereto.
Here, the image frame related to the screen 515 displayed on the display module 320 of the source device 300 may include the same content as that of the screen 515 displayed on the display module 320. The image frames associated with the screen 515 may include, for example, image frames that differ in size, resolution, and aspect ratio, as well as image frames generated by transcoding the screen 515 displayed on the display module 320 into a video format and/or audio format determined according to codec settings between the source device 300 and the sink device 400.
While the screen image 537 is displayed on the sink device 400, the processor 340 may determine whether a target application is being executed, the target application being configured to change the transmission amount of user input data generated in the screen image 537, based on the screen image data, by an input device connected to the sink device 400. Here, the target application corresponds to an application for which, when it is executed, the transmission amount of user input data from an input device connected to the sink device 400 is changed, because the application generates a larger amount of user input data, or generates user input data more frequently, than other applications. The target application may include, for example, an application for providing a predetermined service in which at least one of a pen input event or a touch input event occurs, such as a handwriting application, a photo editing application, and/or a drawing application, but is not limited thereto. In addition to applications for providing a predetermined service, the target application may include a user experience (UX) for displaying a menu of a basic framework.
For example, when it is determined that the target application is being executed in the screen image 537 based on the screen image data displayed on the sink device 400, the processor 340 may adjust the transmission bit rate of the screen image data by changing the transmission profile used to transmit the screen image data. If it is determined that the target application is being executed in the screen image 537, the processor 340 may reduce the transmission bit rate of the screen image data transmitted to the sink device 400 and request the sink device 400 to increase the transmission amount of the user input data based on the transmission profile. If it is determined that the target application is not being executed in the screen image 537, the processor 340 may increase the transmission bit rate of the screen image data transmitted to the sink device 400 and request the sink device 400 to reduce the transmission amount of the user input data based on the transmission profile.
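The decision just described can be summarized in a minimal sketch. The VideoEncoder and SinkSession interfaces, the bit-rate values, and the requestUibcAmount message are hypothetical names assumed for illustration, not part of the disclosure.

```kotlin
enum class UibcDemand { INCREASE, DECREASE }

interface VideoEncoder { fun setBitrateBps(bps: Int) }   // assumed interface
interface SinkSession { fun requestUibcAmount(demand: UibcDemand) } // assumed

// Hypothetical source-side logic (processor 340): trade video bit rate
// against UIBC transmission amount depending on the foreground application.
fun onForegroundAppChanged(
    isTargetApp: Boolean,  // e.g., handwriting/drawing application detected
    encoder: VideoEncoder,
    session: SinkSession,
) {
    if (isTargetApp) {
        encoder.setBitrateBps(2_000_000)               // assumed reduced rate
        session.requestUibcAmount(UibcDemand.INCREASE) // more input data
    } else {
        encoder.setBitrateBps(8_000_000)               // assumed full rate
        session.requestUibcAmount(UibcDemand.DECREASE) // fewer input data
    }
}
```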
Further, upon determining that the target application is being executed in the screen image 537 displayed on the sink device 400, the processor 340 may transmit, to the sink device 400, a message including a transmission profile related to the user input data. The transmission profile may define a communication scheme for data transfer between the sink device and the source device. The transmission profile may include, for example, at least one of a type, a structure, or a method of use of a protocol for transmitting the corresponding data, but is not limited thereto.
For example, the message transmitted from the source device 300 to the sink device 400 may be a Real Time Streaming Protocol (RTSP) message. RTSP is a network control protocol for controlling a streaming server and operates in the application layer of the Internet protocol suite. The RTSP message may include information about the transmission scheme of voice or video transmitted in real time, such as the transmission profile. The RTSP message may include a parameter such as "wfd_uibc_first=on".
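For illustration, such a message could be assembled as follows. The sketch follows the general shape of an RTSP SET_PARAMETER request (RFC 2326), but the URI, the header values, and the exact syntax in which the wfd_uibc_first parameter is carried are assumptions.

```kotlin
// Hypothetical RTSP SET_PARAMETER message carrying the UIBC parameter.
// The wfd_uibc_first parameter name is from the disclosure; its placement
// and the surrounding message fields here are assumed for illustration.
fun buildUibcSetParameter(cseq: Int): String {
    val body = "wfd_uibc_first: on\r\n"
    return buildString {
        append("SET_PARAMETER rtsp://localhost/wfd1.0 RTSP/1.0\r\n")
        append("CSeq: $cseq\r\n")
        append("Content-Type: text/parameters\r\n")
        append("Content-Length: ${body.length}\r\n") // ASCII body: chars == bytes
        append("\r\n")
        append(body)
    }
}
```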
Various examples of operations performed between source device 300 and sink device 400 when user input data is generated by an input device connected to sink device 400 will be described in more detail below with reference to fig. 5.
According to one embodiment, sink device 400 may be, for example, a wearable device (e.g., smart glasses 1603 of fig. 16) as shown below in fig. 16. If sink device 400 is smart glasses 1603, processor 340 may receive display information via wireless communication module 310 including at least one of: the number of screen images to be displayed on the smart glasses 1603, the size of each of the screen images, the resolution of each of the screen images, or the bit rate of each of the screen images. The processor 340 may request the smart glasses 1603 to transmit information through a user input interface for sharing additional information based on the display information. The user input interface will be described below with reference to fig. 16. The additional information may be information additionally used to process information not defined by the general UIBC protocol. The additional information may include, for example, information additionally used to process at least one of eye (iris), head and hand tracking information, or image information and/or depth information for gesture recognition, object recognition and tracking, but is not limited thereto.
However, the operation of the processor 340 is not limited to the above description. For example, the processor 340 may also perform the above-described operations along with at least one of the operations that will be described below with reference to fig. 5 to 16.
FIG. 4 is a block diagram illustrating a sink device according to one embodiment. Hereinafter, in embodiments of the present disclosure, sink device 400 (e.g., electronic devices 101 and 102 of fig. 1, sink device 530 of fig. 5, sink device 1003 of fig. 10, sink device 1503 of fig. 15, and/or smart glasses 1603 of fig. 16) may correspond to a device configured to establish communication with source devices (e.g., electronic devices 101 and 102 of fig. 1, source device 300 of fig. 3, source device 510 of fig. 5, source device 1001 of fig. 10, source device 1501 of fig. 15, and/or user terminal 1601 of fig. 16) and display a screen image (e.g., screen image 537 of fig. 5) based on screen image data transmitted by source device 300. Sink device 400 may be, for example, an electronic device (such as a Personal Computer (PC), a smart phone, a laptop computer, or a tablet computer), or may correspond to a wearable electronic device (such as smart glasses 1603), but is not limited thereto.
Referring to fig. 4, a sink device 400 according to one embodiment may include a wireless communication module 410 (e.g., wireless communication module 192 of fig. 1), a display module 420 (e.g., display module 160 of fig. 1), a memory 430 (e.g., memory 130 of fig. 1), and a processor 440 (e.g., processor 120 of fig. 1).
Wireless communication module 410 may receive screen image data generated by source device 300 to be displayed on sink device 400.
The display module 420 may display screen image data received through the wireless communication module 410.
Memory 430 may store computer-executable instructions. Memory 430 may also store various information generated during processing by processor 440. In addition, the memory 430 may store various data and programs. Memory 430 may include, for example, volatile memory or nonvolatile memory. The memory 430 may include a high-capacity storage medium such as a hard disk to store various data.
Processor 440 may execute instructions by accessing memory 430. Processor 440 may receive, from source device 300 through wireless communication module 410, screen image data transmitted by source device 300 to be displayed on sink device 400. The processor 440 may display a screen image 537 based on the screen image data using the display module 420. The screen image data may include, but is not limited to, image frames generated by, for example, copying a screen (e.g., screen 515 of fig. 5) displayed by display module 320 of source device 300, and/or image frames associated with screen 515 displayed by display module 320 of source device 300.
While screen image 537 is displayed by display module 420, processor 440 may obtain user input data generated in screen image 537 by an input device (e.g., input device 531 of fig. 5) connected to sink device 400.
Processor 440 may dynamically adjust parameters for adaptively adjusting the amount of transmission of user input data, including at least one of a data size and a number of data pieces to be transmitted, based on the network quality between source device 300 and sink device 400. Here, the parameters for adaptively adjusting the transmission amount of the user input data may include, for example, the size of the user input data and the number of pieces of the user input data, in addition to parameters of the input report and/or the report descriptor of the user input data, which will be described below with reference to fig. 9 and 10.
Processor 440 may determine a transmission profile for UIBC transmission of user input data based on current network conditions including network throughput (TP). Processor 440 may measure, for example, the network quality between source device 300 and sink device 400. For example, the processor 440 may determine the network quality between the source device 300 and the sink device 400 based on at least one of a TCP window size or a round trip time (RTT) between the source device 300 and the sink device 400, but the embodiment is not limited thereto.
Processor 440 may determine one of a plurality of transmission profiles (e.g., transmission profiles 610, 630, and 650 of fig. 6) for transmitting user input data based on the measured network quality.
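As an illustration of this selection step, the following Python sketch measures an application-level RTT over an existing connection and maps it to one of the three transmission profiles described below with reference to fig. 6. The probe mechanism, the 50 ms and 150 ms thresholds, and all names are illustrative assumptions, not values defined by this disclosure.

```python
import enum
import socket
import time


class TransmissionProfile(enum.Enum):
    HIGH = 1    # first transmission profile: good network quality
    MEDIUM = 2  # second transmission profile: normal network quality
    LOW = 3     # third transmission profile: poor network quality


def measure_rtt_ms(sock: socket.socket) -> float:
    """Measure an application-level round trip time over an existing TCP
    connection by sending a 1-byte probe; the peer is assumed to echo it."""
    start = time.monotonic()
    sock.sendall(b"\x00")
    sock.recv(1)
    return (time.monotonic() - start) * 1000.0


def select_profile(rtt_ms: float) -> TransmissionProfile:
    """Map measured network quality to a transmission profile.
    The threshold values are placeholders, not taken from the disclosure."""
    if rtt_ms < 50.0:
        return TransmissionProfile.HIGH
    if rtt_ms < 150.0:
        return TransmissionProfile.MEDIUM
    return TransmissionProfile.LOW
```

A fuller implementation would also consider the TCP window size, as noted above.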
The plurality of transmission profiles 610, 630, and 650 may include, for example, at least two of a first transmission profile 610 corresponding to a first network quality, a second transmission profile 630 corresponding to a second network quality less than the first network quality, and a third transmission profile 650 corresponding to a third network quality less than the second network quality, but are not necessarily limited thereto. The relationship between transfer profiles 610, 630, and 650 will be described in more detail below with reference to fig. 6.
Each of the transmission profiles 610, 630, and 650 may include a transmission amount of each of the transmission profiles 610, 630, and 650. The transmission amount of each of the transmission profiles 610, 630, and 650 may include, for example, at least one of a maximum size or a maximum number of user input data that may be transmitted at one time for each of the transmission profiles 610, 630, and 650, but is not limited thereto. The transmission amount and data size of each of the plurality of transmission profiles 610, 630, and 650 will be described in more detail below with reference to fig. 7.
The processor 440 may adaptively adjust the parameters based on the determined transfer profile. For example, the processor 440 may adjust, for each type of user input data, the number of pieces of user input data to be transmitted based on the transfer profile. The method by which the processor 440 adjusts the number of pieces of user input data to be transmitted will be described in more detail with reference to fig. 8.
The user input data may include, for example, at least one of an input report (e.g., input report 900 of fig. 9) indicating the content of the user input data, or a report descriptor (e.g., report descriptors 1110 and 1130 of fig. 11, and/or report descriptors 1210 and 1230 of fig. 12) that is transmitted before the input report and indicates the composition of the input report so that the values of the input report can be interpreted. If the transmission profile changes in response to a change in network quality, the processor 440 may adjust the data size of the user input data included in the report descriptor 1110 based on at least one of the changed transmission profile or the type of the user input data. An example of a method of exchanging user input data between the source device 300 and the sink device 400, and of the input report, will be described in more detail below with reference to fig. 9 and 10.
The method by which the processor 440 adaptively adjusts parameters based on the transmission profile will be described in more detail below with reference to fig. 11 and 12.
Processor 440 may send the dynamically changing parameters to source device 300.
According to one embodiment, when sink device 400 is a smart glasses (e.g., smart glasses 1603 of fig. 16), processor 440 may share display information with source device 300 that includes at least one of: the number of screen images displayed on the smart glasses 1603, the size of each of the screen images, the resolution of each of the screen images, or the bit rate of each of the screen images. Processor 440 may receive a request for information transfer from source device 300 through a user input interface for sharing additional information.
However, the operation of the processor 440 is not limited to the above-described operation, and the processor 440 may also perform at least one of the operations described below with reference to fig. 5 to 16 together with the above-described operation.
Fig. 5 is a diagram illustrating operations performed between a source device and a sink device according to one embodiment. Fig. 5 illustrates a case where source device 510 (e.g., electronic devices 101 and 102 of fig. 1, source device 1001 of fig. 10, source device 1501 of fig. 15, and/or user terminal 1601 of fig. 16) and sink device 530 (e.g., electronic devices 101 and 102 of fig. 1, sink device 400 of fig. 4, sink device 1003 of fig. 10, sink device 1503 of fig. 15, and/or smart glasses 1603 of fig. 16) exchange data over a communication channel according to one embodiment.
The communication channel may generally represent any communication medium or set of different communication media for transmitting video data from source device 510 to sink device 530. The communication channel may correspond to a relatively short-range communication channel, such as wireless fidelity (Wi-Fi) and bluetooth, or may include any wireless or wired communication medium, such as a Radio Frequency (RF) spectrum or one or more physical transmission lines, or any combination of wireless and wired media. According to an embodiment, the communication channel may form part of a data packet based network (e.g., a local area network, a wide area network, or a global network such as the internet). The communication channel may include UIBC as described above.
As described above, the source device 510 may transmit screen image data including audio data and/or video data including image frames associated with the screen 515 to the sink device 530. The source device 510 may use a common communication channel to transmit screen image data to the sink device 530.
The sink device 530 may display the screen image 537 by decoding and/or rendering data (e.g., screen image data) received from the source device 510. In addition, sink device 530 may obtain user input data corresponding to user input 535 generated by an input device 531 (e.g., mouse) connected to sink device 530. In addition to the mouse shown in fig. 5, the input device 531 may include, for example, a keyboard, a trackball, a track pad, a touch screen, a voice recognition module, a gesture recognition module, an iris recognition module, a mouth recognition module, and/or various types of Human Interface Devices (HIDs), but is not limited thereto.
Sink device 530 may format user input data corresponding to user input 535 (e.g., cursor movement by input device 531) into a data packet structure that may be interpreted by source device 510 and may send the formatted user input data to source device 510 via UIBC as described above.
While the screen image 537, which is based on the screen image data generated and transmitted by the source device 510, is displayed on the sink device 530, the source device 510 may respond to the user input 535 generated by the input device 531 connected to the sink device 530. Through the interaction described above, user input data corresponding to the user input 535 (e.g., movement of a cursor in the sink device 530) may be sent back to the source device 510 through UIBC.
Fig. 6 is a state diagram of transmission profiles according to one embodiment. Fig. 6 illustrates a graph 600 of transmission profile types (e.g., a first transmission profile 610, a second transmission profile 630, and a third transmission profile 650) that change based on network quality, according to one embodiment.
In one embodiment, depending on network conditions including network quality, or on the type of application used in the source device (e.g., electronic devices 101 and 102 of fig. 1, source device 300 of fig. 3, source device 510 of fig. 5, source device 1001 of fig. 10, source device 1501 of fig. 15, and/or user terminal 1601 of fig. 16), the sink device (e.g., electronic devices 101 and 102 of fig. 1, sink device 400 of fig. 4, sink device 530 of fig. 5, sink device 1003 of fig. 10, sink device 1503 of fig. 15, and/or smart glasses 1603 of fig. 16) may increase or decrease the transmission amount of user input data to be transmitted to the source device 300. Doing so reduces the latency of user input (e.g., user input 535 of fig. 5) occurring between the source device 300 and the sink device 400 and provides improved usability.
Sink device 400 may determine network conditions and adjust the amount of transmission of user input data to be sent to source device 300. For example, sink device 400 may determine the network condition by measuring the network quality, or determining the network quality based on at least a portion of information received from another device (e.g., source device 300 or an Access Point (AP)).
Sink device 400 may use, for example, round Trip Time (RTT) over TCP/IP and/or TCP window size to determine the network quality between source device 300 and sink device 400. Sink device 400 may define a transmission profile for transmission of user input data based on network quality and may flexibly select a transmission profile appropriate for current network conditions.
The first transfer profile 610 may correspond to a first network quality indicating good network quality. The first transfer profile 610 may also be denoted as "high profile" due to the large amount of transmission.
The second transfer profile 630 may correspond to a second network quality that indicates a normal (or intermediate) network quality and that is less than the first network quality. The second transfer profile 630 may also be denoted as "intermediate profile" due to the intermediate transfer volume.
The third transfer profile 650 may correspond to a third network quality that indicates a poor network quality and is lower than the second network quality. The third transfer profile 650 may also be denoted as "low profile" due to the small amount of transmission.
In one embodiment, the number of transmission profiles is not limited to three, and a plurality of other profiles may further be included.
Each of the first transmission profile 610, the second transmission profile 630, and the third transmission profile 650 may include a transmission amount of each transmission profile. The transmission amount of each transmission profile may include, for example, a bit rate, a maximum size of user input data transmitted once for each transmission profile, and/or a maximum number of pieces, but is not limited thereto.
Sink device 400 may flexibly select a transmission profile appropriate for the current network conditions. In one example, even though previous user input data was transmitted based on the first transmission profile 610, when the network quality is determined to have decreased at the time the current user input data is to be transmitted, the sink device 400 may change from the first transmission profile 610 to the second transmission profile 630 and transmit the user input data. In another example, even though previous user input data was transmitted based on the third transmission profile 650, when the network quality is determined to have increased at the time the current user input data is to be transmitted, the sink device 400 may change from the third transmission profile 650 to the second transmission profile 630 and transmit the user input data.
Sink device 400 may transmit user input data to source device 300 by adjusting the size and/or number of user input data to be transmitted according to the type of transmission profile selected. Examples of sink device 400 adjusting the size and/or number of user input data to be transmitted will be described in more detail below with reference to fig. 7 and 8.
Alternatively, the source device 300 may reduce the bit rate of the screen image data transmitted to the sink device 400 through communication between the source device 300 and the sink device 400, and the sink device 400 may increase the transmission amount including the bit rate of the user input data to be transmitted to the source device 300, and thus may enhance the quality of the user input.
Fig. 7 is a diagram showing a transmission amount and a data size of each of a plurality of transmission profiles according to one embodiment. Fig. 7 shows a graph 710 showing the transmission amount and size of user input data when the transmission profile is the first transmission profile, a graph 730 showing the transmission amount and size of user input data when the transmission profile is the second transmission profile, and a graph 750 showing the transmission amount and size of user input data when the transmission profile is the third transmission profile.
For example, a sink device (e.g., electronic devices 101 and 102 of fig. 1, sink device 400 of fig. 4, sink device 530 of fig. 5, sink device 1003 of fig. 10, sink device 1503 of fig. 15, and/or smart glasses 1603 of fig. 16) may determine a network condition, select a transfer profile based on the current network condition, and increase or decrease the amount of user input data to be sent, as described above with reference to fig. 6.
Sink device 400 may define at least some of the maximum size and/or maximum number of user input data to be transmitted based on each transfer profile.
For example, sink device 400 may define the first transmission profile as having a first transmission amount in which the maximum size and maximum number of pieces of user input data are unrestricted, as shown in diagram 710. The sink device 400 may define the second transmission profile as having a second transmission amount smaller than the first transmission amount by limiting the number of pieces of user input data to be transmitted, as shown in diagram 730. The second transmission amount may be, for example, 50% of the first transmission amount, which is the maximum transmission amount of the corresponding network, but is not limited thereto. The sink device 400 may define the third transmission profile as having a third transmission amount smaller than the second transmission amount by limiting both the size of the user input data and the number of pieces of user input data to be transmitted, as shown in diagram 750. The third transmission amount may correspond to the minimum transmission amount of the corresponding network, but is not limited thereto.
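The per-profile transmission amounts of diagrams 710, 730, and 750 can be summarized in a small table. The sketch below reuses the TransmissionProfile enum from the earlier sketch; the concrete numbers are placeholders, since the disclosure fixes only the relationships (unrestricted, about 50%, minimum), not specific values.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass(frozen=True)
class ProfileLimits:
    """Transmission amount for one profile: maximum size of one piece of
    user input data and maximum number of pieces per second; ``None`` means
    unrestricted. The numbers below are illustrative placeholders."""
    max_piece_bytes: Optional[int]
    max_pieces_per_second: Optional[int]


PROFILE_LIMITS = {
    TransmissionProfile.HIGH:   ProfileLimits(None, None),  # diagram 710: unrestricted
    TransmissionProfile.MEDIUM: ProfileLimits(None, 60),    # diagram 730: ~50% of maximum
    TransmissionProfile.LOW:    ProfileLimits(2, 30),       # diagram 750: minimum size/count
}
```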
Sink device 400 may limit the number of pieces of user input data to be transmitted by, for example, reducing or discarding the user input data. A method of limiting the number of pieces of user input data to be transmitted by the sink device 400 will be described in more detail below with reference to fig. 8.
Fig. 8 is a diagram illustrating a method in which a sink device limits the number of pieces of user input data to be transmitted according to one embodiment. Fig. 8 illustrates a graph 810 showing the number of pieces of user input data to be transmitted when a transmission profile according to one embodiment is a first transmission profile (e.g., first transmission profile 610 of fig. 6), and a graph 830 showing the number of pieces of user input data to be transmitted when the first transmission profile becomes a third transmission profile (e.g., third transmission profile 650 of fig. 6).
For example, as shown in diagram 810, five pieces of user input data, e.g., input1, input2, input3, input4, and input5, corresponding to x and y coordinates and handwriting pressure (i.e., pen pressure) entered via touch input, mouse input, and/or pen input, may be generated at the current time in a screen image (e.g., screen image 537 of fig. 5) of a sink device (e.g., electronic devices 101 and 102 of fig. 1, sink device 400 of fig. 4, sink device 530 of fig. 5, sink device 1003 of fig. 10, sink device 1503 of fig. 15, and/or smart glasses 1603 of fig. 16).
In this example, if the sink device 400 measures a relatively low network quality, the sink device 400 may determine the transmission profile as a third transmission profile based on the relatively low network quality. Sink device 400 may dynamically change parameters for adaptively adjusting the amount of transmission of user input data based on the third transmission profile.
For example, when a difference between the first user input data generated in the screen image 537 at the current time and the second user input data generated at a previous time before the current time is less than a predetermined value, the sink device 400 may reduce the number of pieces of user input data to be generated by discarding the first user input data generated at the current time based on the transmission profile.
For example, for each of the user input data input1, input2, input3, input4, and input5 shown in diagram 810, sink device 400 may calculate a first difference between a first coordinate of first user input data generated in screen image 537 at the current time and a second coordinate of second user input data generated at a previous time before the current time. For example, when the first difference for three pieces of user input data (e.g., user input data input2, input3, and input4) is smaller than the set first reference value, the sink device 400 may adjust the number of pieces of user input data to be transmitted by discarding the user input data input2, input3, and input4 as the first user input data based on the selected transmission profile (e.g., the third transmission profile), as shown in diagram 830. Sink device 400 may send the two remaining pieces of user input data (e.g., input1 and input5) to a source device (e.g., electronic devices 101 and 102 of fig. 1, source device 300 of fig. 3, source device 510 of fig. 5, source device 1001 of fig. 10, source device 1501 of fig. 15, and/or user terminal 1601 of fig. 16) based on the third transmission profile. In this example, the number of pieces of first user input data discarded by sink device 400 may vary depending on the type of transmission profile.
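A minimal sketch of this coalescing step: each piece is compared with the most recently kept piece, and pieces whose coordinate difference stays below the first reference value are discarded. Reading diagram 810 this way, input1 and input5 survive while input2 through input4 are dropped. Comparing against the last kept piece (rather than the immediately preceding piece) and using a sum of absolute coordinate differences are assumptions of the sketch.

```python
from typing import List, Tuple

Point = Tuple[int, int]  # (x, y) coordinates of one piece of user input data


def coalesce_by_delta(inputs: List[Point], first_reference: int) -> List[Point]:
    """Discard pieces whose coordinate change relative to the last kept
    piece is smaller than the first reference value."""
    kept: List[Point] = []
    for x, y in inputs:
        if not kept:
            kept.append((x, y))  # always keep the first piece (input1)
            continue
        lx, ly = kept[-1]
        if abs(x - lx) + abs(y - ly) >= first_reference:
            kept.append((x, y))
    return kept
```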
In another example, sink device 400 may be configured to discard inputs at a fixed rate, rather than by comparing the data in the inputs. For example, the second transmission profile may instruct sink device 400 to discard every other data input, resulting in a 50% reduction in the amount of data sent. Likewise, the third transmission profile may instruct sink device 400 to transmit only one of every four user inputs, resulting in a 75% reduction in the amount of data transmitted.
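The fixed-rate variant is simpler still; a sketch, with the 1-in-2 and 1-in-4 ratios taken from the text above:

```python
def decimate(inputs: list, keep_one_in: int) -> list:
    """Fixed-rate discard: keep one of every ``keep_one_in`` pieces.
    keep_one_in=2 reduces the amount sent by 50% (second profile);
    keep_one_in=4 reduces it by 75% (third profile)."""
    return inputs[::keep_one_in]
```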
The amount of user input data generated according to one embodiment may vary depending on the type of input device used by the user (e.g., the type of user input).
In an example, in a touch input, it may be determined that a new input event occurs whenever x and y coordinate values change, and if the x and y coordinate values remain unchanged, no new input event occurs. In another example, in pen input, a new input event may be determined to occur if a data value such as pen pressure and/or tilt changes even though the pen input points to exactly the same coordinates. In other words, a large amount of input data may be generated in response to pen input as compared to touch input.
For example, when a pen as an input device is connected to the sink device 400, user input data of the pen may generally include data such as pressure, tilt, and orientation in addition to x and y coordinate values. Thus, when network conditions are not good, it may be desirable to reduce the number of pieces of user input data, such as a large amount of data generated by pen input, and transmit the user input data.
If the input data is generated by a tilt in the pen input and/or a change in pen pressure, sink device 400 may reduce the number of pieces of user input data generated in the same manner as described above.
Sink device 400 may calculate a second difference between the first input information and the second input information. The first input information may include at least one of a first pressure, a first tilt, or a first orientation of first user input data generated at a current time in the screen image 537, and the second input information may include at least one of a second pressure, a second tilt, or a second orientation of second user input data generated at a previous time prior to the current time.
In one example, when the second difference is less than the set second reference value, sink device 400 may adjust the number of user input data to be transmitted by discarding the first user input data based on the selected transmission profile. According to the transmission profile, the sink device 400 may limit the number of pieces of user input data to be transmitted similar to the second transmission profile, or may set the minimum data size and the minimum number of pieces of user input data to be transmitted similar to the third transmission profile.
In another example, when the second difference is greater than or equal to the set second reference value, the corresponding user input data may be determined as meaningful data, and thus, the sink device 400 may transmit the first user input data to the source device 300 without change, instead of discarding the first user input data.
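Applied to pen input, the same pattern compares pressure, tilt, and orientation. Combining the three fields into a single difference (here, a sum of absolute differences) and the field names are assumptions of this sketch.

```python
def pen_input_is_meaningful(curr: dict, prev: dict, second_reference: float) -> bool:
    """Return True when the second difference between the first and second
    input information meets the second reference value, i.e. the piece is
    meaningful data and is transmitted without change."""
    diff = (abs(curr["pressure"] - prev["pressure"])
            + abs(curr["tilt"] - prev["tilt"])
            + abs(curr["orientation"] - prev["orientation"]))
    return diff >= second_reference
```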
The sink device 400 can reduce the number of pieces of user input data to be generated using the method described above with reference to fig. 8, and can define a maximum number of pieces of data to be transmitted per second for each profile.
Sink device 400 may adjust the number of pieces of user input data to be transmitted based on the transfer profile for each type of user input data.
If the transmission profile is the first transmission profile, sink device 400 may send all user input data generated by the user input to source device 300. The maximum number of pieces of user input data generated per predetermined time unit (e.g., one second) may vary according to the type of user input data. If the transmission profile is the second transmission profile, the sink device 400 may limit the number of pieces of user input data to be generated such that the maximum number of pieces for each type of user input data is limited to about 50% of the maximum number of pieces for the first transmission profile.
If the transmission profile is the third transmission profile, the sink device 400 may transmit to the source device 300, for each type of user input data, the minimum number of pieces of user input data at which no malfunction occurs. For example, when the type of user input data is touch input, omitting an input event such as a touch-down input and/or a touch-up input may cause a significant failure. However, if the movement amount of an input event (such as a touch-move input) is not large, a significant malfunction may not occur even if the input event is omitted. The sink device 400 may set a predetermined reference value for input events that may be omitted based on the movement amount (e.g., touch-move inputs), and may reduce the amount of user input data to be generated and transmitted by discarding input events that do not exceed the predetermined reference value.
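Under the third profile, then, the filtering must never drop events whose omission causes a malfunction. A sketch under that constraint; the event-type names and movement fields are illustrative, not from the disclosure:

```python
CRITICAL_TYPES = {"touch_down", "touch_up"}  # omitting these causes a failure


def filter_touch_events(events: list, move_reference: int) -> list:
    """Keep all touch-down/touch-up events; drop touch-move events whose
    movement amount does not exceed the predetermined reference value.
    ``events`` are dicts with 'type' and 'dx'/'dy' movement fields."""
    kept = []
    for ev in events:
        if ev["type"] in CRITICAL_TYPES:
            kept.append(ev)
        elif abs(ev.get("dx", 0)) + abs(ev.get("dy", 0)) > move_reference:
            kept.append(ev)
    return kept
```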
Fig. 9 is a diagram illustrating an example of an input report according to one embodiment. FIG. 9 illustrates an example of an input report 900 of user input data according to one embodiment.
For example, user input data generated by various input devices connected to a sink device (e.g., electronic devices 101 and 102 of fig. 1, sink device 400 of fig. 4, sink device 530 of fig. 5, sink device 1003 of fig. 10, sink device 1503 of fig. 15, and/or smart glasses 1603 of fig. 16) may be transmitted. In this example, a portion of the user input data may need to be sent in a format suitable for each type of input device (e.g., an HID format), or a portion of the user input data may be sent in a generic format. For example, when a touch input occurs on the screen of the sink device 400, the number of fingers and the x and y coordinate values may need to be transmitted as user input data. When a pen input occurs, the x and y coordinate values and information such as pressure, tilt, and orientation may need to be sent. In this example, user input data such as the number of fingers and the x and y coordinate values, and/or user input data such as the x and y coordinate values, pressure, tilt, and/or orientation, may be transmitted in the HID format.
If the user input data is sent in HID format, the user input data may include input report 900 and/or report descriptors (e.g., report descriptors 1110 and 1130 of fig. 11, and/or report descriptors 1210 and 1230 of fig. 12).
As shown in FIG. 9, the input report 900 may include the content of the user input data, e.g., actual data, such as x and y coordinate values associated with the user input. Here, the x and y coordinate values may be relative coordinate values.
The input report 900 may include, for example, a relative coordinate value corresponding to each of the first, second, and third transmission profiles, a value obtained by converting the relative coordinate value into a hexadecimal (hex) number, and byte alignment information, but is not limited thereto. The byte alignment information may indicate the total number of bytes representing the x and y coordinate values.
The report descriptor 1110 may correspond to data indicating a composition of the input report 900 (e.g., a composition such as a size and a transmission order of input data) for interpreting a value of the input report 900. Report descriptor 1110 may be transmitted prior to transmitting input report 900, as shown in fig. 10 below.
If data transmitted in an HID format is received, a source device (e.g., the electronic devices 101 and 102 of FIG. 1, the source device 300 of FIG. 3, the source device 510 of FIG. 5, the source device 1001 of FIG. 10, the source device 1501 of FIG. 15, and/or the user terminal 1601 of FIG. 16) may store a report descriptor 1110 and interpret the value of the input report 900 to be received based on the report descriptor 1110.
Fig. 10 is a diagram illustrating a method of exchanging input reports and report descriptors of user input data between a source device and a sink device, according to one embodiment. Fig. 10 shows a diagram 1000 of a case where an input report (e.g., input report 900 of fig. 9) and a report descriptor (e.g., report descriptors 1110 and 1130 of fig. 11, and/or report descriptors 1210 and 1230 of fig. 12) corresponding to each user input data are exchanged between a source device 1001 (e.g., electronic devices 101 and 102 of fig. 1, source device 300 of fig. 3, source device 510 of fig. 5, source device 1501 of fig. 15, and/or user terminal 1601 of fig. 16) and a sink device 1003 (e.g., electronic devices 101 and 102 of fig. 1, sink device 400 of fig. 4, sink device 530 of fig. 5, sink device 1503 of fig. 15, and/or smart glasses 1603 of fig. 16).
Sink device 1003 may send user input data to source device 1001 including input report 900 and report descriptor 1110 for each type of user input data generated in the respective input device.
In operation 1010, the sink device 1003 may transmit a report descriptor corresponding to each of the mouse input, the touch input, the keyboard input, and the pen input to the source device 1001. The source device 1001 may store the report descriptor transmitted in operation 1010.
In operation 1020, the sink device 1003 may transmit an input report corresponding to the report descriptor transmitted in operation 1010, for example, an input report corresponding to each of a mouse input, a touch input, a keyboard input, and a pen input, to the source device 1001. The source device 1001 may interpret an input report corresponding to each of a mouse input, a touch input, a keyboard input, and a pen input using each report descriptor stored in operation 1010.
An HID such as a keyboard or mouse typically does not send and receive a new report descriptor after the first report descriptor is sent, unless, for example, new settings are added. However, if the transfer profile changes due to network conditions and/or user settings, the sink device 1003 may variably generate a report descriptor of the user input data during the mirroring connection and adjust the data size of the user input data.
For example, if the network quality changes, the sink device 1003 may change the transmission profile in response to the change in the network quality. If the transmission profile changes, the sink device 1003 may adjust the data size of the user input data included in the report descriptor based on at least one of the changed transmission profile or the type of the user input data.
For example, to represent the x and y coordinates, sink device 1003 may change the size of the data (e.g., the x and y coordinates) based on the transmission profile for the current network conditions.
In one example, when the transmission profile is a first transmission profile (e.g., first transmission profile 610 of fig. 6), sink device 1003 may represent each of the x and y coordinate data as 16 bits such that the x and y coordinates may be represented by a total of 4 bytes. In another example, when the transmission profile is a second transmission profile (e.g., second transmission profile 630 of fig. 6), the sink device 1003 may represent each of the x and y coordinate data as 12 bits such that the x and y coordinates may be represented by a total of 3 bytes. In another example, when the transmission profile is a third transmission profile (e.g., third transmission profile 650 of fig. 6), sink device 1003 may represent each of the x and y coordinate data as 8 bits such that the x and y coordinates may be represented by a total of 2 bytes.
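The byte alignment of the input report follows directly from these widths. A sketch of the sink-side packing, reusing the TransmissionProfile enum from the earlier sketch; the little-endian layout and the 12-bit packing order are assumptions, since the disclosure fixes only the total byte counts.

```python
def pack_xy(x: int, y: int, profile: TransmissionProfile) -> bytes:
    """Pack the x and y coordinates at the per-profile field width."""
    if profile is TransmissionProfile.HIGH:    # 2 x 16 bits -> 4 bytes total
        return x.to_bytes(2, "little") + y.to_bytes(2, "little")
    if profile is TransmissionProfile.MEDIUM:  # 2 x 12 bits -> 3 bytes total
        return ((x & 0xFFF) | ((y & 0xFFF) << 12)).to_bytes(3, "little")
    return bytes([x & 0xFF, y & 0xFF])         # 2 x 8 bits -> 2 bytes total
```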
As described above, if the data size corresponding to the touch input is adjusted based on the transmission profile according to a network quality change, the sink device 1003 may transmit the changed report descriptor corresponding to the touch input to the source device 1001 in operation 1030. A method by which the sink device 1003 changes the report descriptor for each transmission profile (e.g., the first transmission profile 610 to the third transmission profile 650 of fig. 6) will be described in more detail below with reference to figs. 11 and 12.
In operation 1040, the sink device 1003 may transmit an input report changed based on the format of the changed report descriptor.
FIG. 11 is a diagram illustrating a report descriptor for each transfer profile change in response to a touch input, according to one embodiment. FIG. 11 illustrates an example of a report descriptor 1110 in a first transfer profile associated with x and y coordinates of a touch input (e.g., first transfer profile 610 of FIG. 6) and an example of a report descriptor 1130 in a third transfer profile (e.g., third transfer profile 650 of FIG. 6) according to one embodiment.
In an example, when the transmission profile is the first transmission profile, the sink device (e.g., the electronic devices 101 and 102 of fig. 1, the sink device 400 of fig. 4, the sink device 530 of fig. 5, the sink device 1003 of fig. 10, the sink device 1503 of fig. 15, and/or the smart glasses 1603 of fig. 16) may express the actual x and y coordinates corresponding to a portion of the report descriptor 1110 associated with the x and y coordinates of the touch input as relative values between "0" and "32767". Sink device 400 may use all 16 bits (as in "REPORT_SIZE (16)") to express the data representing the x and y coordinates of the touch input in report descriptor 1110 and transmit the data so that the original data (e.g., the x and y coordinates) is not lost. In addition, "LOGICAL_MAXIMUM (32767)" described in the report descriptor 1110 may indicate that the transmitted data is represented by a relative value between "0" and "32767", and "PHYSICAL_MAXIMUM (1920)" may indicate that the physical value corresponding to the relative value ranges from "0" to "1920".
In another example, when the transmission profile is the third transmission profile, the sink device 400 may express the actual x and y coordinates corresponding to a portion of the report descriptor 1130 associated with the x and y coordinates of the touch input as relative values between "0" and "127". The sink device may use 8 bits, as in "REPORT_SIZE (8)", to express the data representing the x and y coordinates of the touch input in report descriptor 1130. In this example, sink device 400 may express 16-bit data (e.g., the x and y coordinates) as 8-bit data by downscaling the data. If the downscaled data is transmitted to the source device 300 and restored, the x and y coordinate values may differ from the original values. Here, "PHYSICAL_MAXIMUM (1920)" described in the report descriptor 1130 may indicate that the physical value corresponding to the relative value ranges from "0" to "1920". According to one embodiment, since the source device 300 generates the screen image data, the maximum value of the actual x and y coordinates may be stored in the source device 300 in advance. In this case, the value of PHYSICAL_MAXIMUM may be omitted from the report descriptor 1130. "LOGICAL_MAXIMUM (127)" may indicate that data having a physical range of "0" to "1920" is downscaled to a value between "0" and "127" and transmitted. As described above, based on the information included in the report descriptor 1130, the source device 300 may interpret a value of 64 transmitted by the sink device 400 as approximately 960, or a value of 127 transmitted by the sink device 400 as 1920.
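The downscaling and its source-side interpretation reduce to a pair of linear mappings between the LOGICAL_MAXIMUM and PHYSICAL_MAXIMUM ranges. A sketch; the rounding rule is an assumption, which is why a transmitted 64 restores to roughly, rather than exactly, 960.

```python
def to_logical(physical: int, physical_max: int, logical_max: int) -> int:
    """Sink side: downscale a physical coordinate (0..PHYSICAL_MAXIMUM)
    to the logical range (0..LOGICAL_MAXIMUM) declared in the descriptor."""
    return round(physical * logical_max / physical_max)


def to_physical(logical: int, physical_max: int, logical_max: int) -> int:
    """Source side: restore a received logical value to the physical range."""
    return round(logical * physical_max / logical_max)


# Third profile for touch x/y: LOGICAL_MAXIMUM(127), PHYSICAL_MAXIMUM(1920).
print(to_physical(127, 1920, 127))  # 1920
print(to_physical(64, 1920, 127))   # 968, i.e. roughly the 960 in the text
# Pen pressure is downscaled analogously, from the 0..4096 range to 0..127.
```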
Sink device 400 may reduce the size of the data by reducing the pen pressure value of the pen input in the same manner as the x and y coordinate values of the touch input. A method of changing the data size of the pen pressure value of the pen input will be described in more detail with reference to fig. 12.
FIG. 12 is a diagram illustrating a report descriptor for each transfer profile change in response to pen input, according to one embodiment. FIG. 12 illustrates a diagram 1210 showing a portion of a report descriptor in a first transmission profile (e.g., first transmission profile 610 of FIG. 6) associated with a pen pressure value of a pen input, and a diagram 1230 showing a portion of a report descriptor in a third transmission profile (e.g., third transmission profile 650 of FIG. 6), according to one embodiment.
In an example, when the transmission profile is the first transmission profile, the sink device (e.g., the electronic devices 101 and 102 of fig. 1, the sink device 400 of fig. 4, the sink device 530 of fig. 5, the sink device 1003 of fig. 10, the sink device 1503 of fig. 15, and/or the smart glasses 1603 of fig. 16) may express the pen pressure value of the pen input as 16 bits, as in "REPORT_SIZE (16)" described in the report descriptor 1210 associated with the pen pressure of the pen input. In addition, "LOGICAL_MAXIMUM (4096)" described in the report descriptor 1210 may indicate that the transmitted pen pressure data is represented by a relative value between "0" and "4096".
In another example, when the transmission profile is the third transmission profile, sink device 400 may reduce the pen pressure value of the pen input, expressed as 16 bits, to 8 bits. Sink device 400 may express the pen pressure value of the pen input as 8 bits, as in "REPORT_SIZE (8)" described in report descriptor 1230 associated with the pen pressure of the pen input. "LOGICAL_MAXIMUM (127)" described in the report descriptor 1230 may indicate that the transmitted pen pressure data is represented by a relative value between "0" and "127".
Fig. 13 is a flowchart illustrating a method of operating a source device according to one embodiment. In the following embodiments, operations may be performed sequentially, but need not be. For example, the order of operations may be changed, and at least two operations may be performed in parallel. Referring to fig. 13, a source device (e.g., the electronic devices 101 and 102 of fig. 1, the source device 300 of fig. 3, the source device 510 of fig. 5, the source device 1001 of fig. 10, the electronic device 1501 of fig. 15, and/or the user terminal 1601 of fig. 16) according to one embodiment may adjust a transmission bit rate of a screen through operations 1310 to 1330.
In operation 1310, source device 300 may transmit screen image data generated by source device 300 to be displayed on a sink device (e.g., electronic devices 101 and 102 of fig. 1, sink device 400 of fig. 4, sink device 530 of fig. 5, sink device 1003 of fig. 10, sink device 1503 of fig. 15, and/or smart glasses 1603 of fig. 16) to sink device 400.
In operation 1320, while the screen image 537 is displayed on the sink device 400 through operation 1310, the source device 300 may determine whether a target application is being executed. The target application is configured to change the transmission amount of user input data generated, by an input device connected to the sink device 400, in a screen image (e.g., screen image 537 of fig. 5) based on the screen image data.
In operation 1330, when it is determined in operation 1320 that the target application is being executed, the source device 300 may adjust the transmission bit rate of the screen image data by changing the transmission profile for transmitting the screen image data.
Fig. 14 is a flowchart illustrating a method of operating a sink device according to one embodiment. In the following embodiments, operations may be performed sequentially, but need not be. For example, the order of operations may be changed, and at least two operations may be performed in parallel.
Referring to fig. 14, sink devices (e.g., the electronic devices 101 and 102 of fig. 1, the sink device 400 of fig. 4, the sink device 530 of fig. 5, the sink device 1003 of fig. 10, the sink device 1503 of fig. 15, and/or the smart glasses 1603 of fig. 16) may transmit parameters dynamically changed to a source device (e.g., the electronic devices 101 and 102 of fig. 1, the source device 300 of fig. 3, the source device 510 of fig. 5, the source device 1001 of fig. 10, the electronic device 1501 of fig. 15, and/or the user terminal 1601 of fig. 16) to adjust the transmission amount of user input data.
In operation 1410, the sink device 400 may receive screen image data generated by the source device 300 to be displayed on the sink device 400.
In operation 1420, the sink device 400 may display a screen image (e.g., screen image 537 of fig. 5) based on the screen image data received in operation 1410.
In operation 1430, when the screen image 537 is displayed in operation 1420, the sink device 400 may acquire user input data generated in the screen image 537 by an input device (e.g., the input device 531 of fig. 5) connected to the sink device 400.
In operation 1440, the sink device 400 may dynamically change a parameter for adaptively adjusting a transmission amount of user input data, including at least one of a data size or a number of pieces of data to be transmitted, based on a network quality between the source device 300 and the sink device 400.
In operation 1450, the sink device 400 may transmit the parameters dynamically changed in operation 1440 to the source device 300.
Fig. 15 is a flowchart illustrating a method of adjusting a bit rate of data transmitted through communication between a source device and a sink device according to one embodiment. In the following embodiments, operations may be performed sequentially, but need not be. For example, the order of operations may be changed, and at least two operations may be performed in parallel.
Referring to fig. 15, a source device 1501 (e.g., the electronic devices 101 and 102 of fig. 1, the source device 300 of fig. 3, the source device 510 of fig. 5, the source device 1001 of fig. 10, and/or the user terminal 1601 of fig. 16) and a sink device 1503 (e.g., the electronic devices 101 and 102 of fig. 1, the sink device 400 of fig. 4, the sink device 530 of fig. 5, the sink device 1003 of fig. 10, and/or the smart glasses 1603 of fig. 16) according to one embodiment may adaptively adjust a bit rate of an image and a bit rate of user input data through operations 1510 to 1580.
In operation 1510, the source device 1501 may determine or confirm whether a target application is being executed on an image displayed by the source device 1501. Here, the target application may correspond to an application configured to change the transmission amount of user input data of an input device (e.g., input device 531 of fig. 5) connected to the sink device 1503 because a relatively large amount of user input data is generated in the target application as compared to other applications. The target applications may include, for example, a handwriting application, a photo editing application, and a drawing application, in which at least one of a pen input event or a touch input event occurs, but are not limited thereto.
When it is determined in operation 1510 that the target application is being executed, the source device 1501 may transmit a message including a transmission profile associated with the user input data to the sink device 1503 in operation 1520 to share the current network condition with the sink device 1503. The message sent by source device 1501 in operation 1520 may be, for example, an RTSP message. RTSP is a network control protocol for controlling streaming servers and operates in the application layer of the internet protocol suite. The RTSP message may include information about the transmission scheme of voice or video transmitted in real time, such as the transmission profile. The RTSP message may include a parameter such as "wfd_uibc_first=on". The message transmitted by the source device 1501 in operation 1520 may include, for example, a signal requesting the sink device 1503 to increase the transmission amount of the user input data.
Operation 1520 may be performed in the background so that the user does not notice it, or the transmission amount of the user input data may be set to a value the user selects directly and transmitted in the message.
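A sketch of how the operation-1520 message might be assembled. The "wfd_uibc_first" parameter comes from the text above; the SET_PARAMETER framing follows generic RTSP/Wi-Fi Display conventions and is an assumption, not a verified capture of this exchange.

```python
def build_uibc_priority_message(cseq: int, enable: bool) -> str:
    """Build an RTSP SET_PARAMETER message carrying the user-input-data
    transmission hint (operation 1520 when enable=True, 1560 when False)."""
    body = f"wfd_uibc_first={'on' if enable else 'off'}\r\n"
    return (
        f"SET_PARAMETER rtsp://localhost/wfd1.0 RTSP/1.0\r\n"
        f"CSeq: {cseq}\r\n"
        f"Content-Type: text/parameters\r\n"
        f"Content-Length: {len(body)}\r\n"
        f"\r\n"
        f"{body}"
    )
```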
In operation 1530, the source device 1501 may reduce the bit rate of the screen image data based on the transmission profile included in the message transmitted in operation 1520. The screen image data may correspond to screen image data generated by the source device 1501 to be displayed on the sink device 1503. After the message is transmitted in operation 1520, the source device 1501 may reduce the bit rate of the previously transmitted image so that a greater share of the limited network throughput is available for transmitting user input data. For example, in a drawing application, because the complexity of the image is relatively low, the currently displayed image may not differ significantly from the previously displayed image even if the bit rate of the image is reduced.
In operation 1540, the sink device 1503 may increase the bit rate of the user input data (e.g., UIBC data) based on the transmission profile included in the message received in operation 1520. Sink device 1503 may enhance the quality of the user input data by raising the transmission profile or by transmitting the user input data without limitation, in the same manner as described above.
In one embodiment, operations 1530 and 1540 may be performed concurrently or sequentially with a predetermined time difference.
In operation 1550, the sink device 1503 may determine or confirm whether the execution of the target application determined in operation 1510 is terminated.
When it is determined in operation 1550 that execution of the target application is terminated, the source device 1501 may transmit a message including a transfer profile related to user input data to the sink device 1503 in operation 1560. The transfer profile included in the message transmitted by the source device 1501 in operation 1560 may include, for example, information for requesting the sink device 1503 to reduce the transfer amount of the user input data.
In operation 1570, the source device 1501 may increase a bit rate of an image based on a transfer profile included in the message transmitted in operation 1560.
In operation 1580, the sink device 1503 may again reduce the bit rate of the user input data based on the transmission profile included in the message received in operation 1560.
Fig. 16 is a diagram illustrating a process of transmitting user input data when a source device and a sink device are a user terminal and smart glasses, respectively, according to one embodiment.
Fig. 16 illustrates operations of transmitting and receiving user input data between a user terminal 1601 corresponding to a source device (e.g., electronic devices 101 and 102 of fig. 1, source device 300 of fig. 3, source device 510 of fig. 5, source device 1001 of fig. 10, and/or source device 1501 of fig. 15) and a smart glasses 1603 corresponding to a sink device (e.g., electronic devices 101 and 102 of fig. 1, sink device 400 of fig. 4, sink device 530 of fig. 5, sink device 1003 of fig. 10, and/or sink device 1503 of fig. 15) according to one embodiment. The smart glasses 1603 may be, for example, augmented Reality (AR) glasses, but are not limited thereto.
In addition to Miracast, the method of transmitting user input data described above with reference to fig. 3-15 may be equally applicable to AR devices, such as Head Mounted Displays (HMDs) or smart glasses 1603.
For example, in an AR device, user input data, e.g., gestures such as hand motions of a user, or gaze of a user, may be generated on a screen that mixes virtual content and a real screen.
In one embodiment, the above-described method of transmitting user input data is not limited to the UIBC interface, and may be equally applicable to various types of user input data generated in an AR device, thereby enhancing usability.
If the sink device is smart glasses 1603, the user input data may be extended to an interface that includes various additional information not defined by the universal UIBC protocol (e.g., image information and depth information for eye, head and hand tracking, gesture recognition, object recognition, and/or object tracking). Accordingly, an interface protocol for transmitting various additional information may be newly defined, and a format of metadata for sharing and/or transmitting a change of the additional information or a relationship between the additional information and original data including compressed data may also be defined.
In one embodiment, for transmission of user input data generated in the AR device, an interface for sharing various additional information used in place of the UIBC interface protocol may be defined as a "user input interface".
The synchronization process including transmitting user input data between the user terminal 1601 corresponding to the source device and the smart glasses 1603 corresponding to the sink device may be performed through, for example, operations 1610 to 1680.
In operation 1610, the user terminal 1601 may be connected to the smart glasses 1603 through a tether.
If a connection to the smart glasses 1603 is established through the tether, the user terminal 1601 may request the smart glasses 1603 to perform a capability check in operation 1620.
If a request to perform a capability check is received, the smart glasses 1603 may transmit, in operation 1630, information such as a transfer protocol, the type and form of data to be shared with the user terminal 1601, and/or the data to be transmitted to the user terminal 1601, to the user terminal 1601 through a user input interface.
In operation 1640, the smart glasses 1603 may share, with the user terminal 1601, change and/or display information regarding a real screen and/or a virtual screen including virtual content displayed on the smart glasses 1603 in real time. The display information may include, for example, at least one of the number of virtual screens (e.g., screen images) including virtual content, the size of the virtual screens, the resolution of the virtual screens, the bit rate of the virtual screens, or the size, resolution, or bit rate of the real screens, but is not limited thereto.
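The display information of operation 1640 is essentially a small structured record. A sketch of what the shared payload could look like; every field name here is illustrative, since the disclosure lists only the kinds of values (counts, sizes, resolutions, and bit rates).

```python
# Display information the smart glasses 1603 might share in operation 1640.
display_info = {
    "virtual_screen_count": 2,
    "virtual_screens": [
        {"size_px": (1280, 720), "resolution": (1280, 720), "bitrate_kbps": 4000},
        {"size_px": (640, 360), "resolution": (640, 360), "bitrate_kbps": 1000},
    ],
    "real_screen": {"size_px": (1920, 1080), "resolution": (1920, 1080),
                    "bitrate_kbps": 8000},
}
```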
In operation 1650, the user terminal 1601 may issue a request to send user input data to the smart glasses 1603 using the user input interface based on the display information shared in operation 1640. The user terminal 1601 may request the smart glasses 1603 to transmit user input data based on one profile determined among one or more preset profiles. For example, when there are a plurality of types of user input data, a profile having a different transmission bit rate may be defined according to each type of user input data. In this example, the user terminal 1601 may request the smart glasses 1603 to transmit the user input data according to a profile having a different transmission bit rate for each type of user input data. In operation 1660, in response to the request in operation 1650, the smart glasses 1603 may transmit the user input data generated by the smart glasses 1603 to the user terminal 1601.
In operation 1670, the smart glasses 1603 may determine whether a display information change event occurs through a program or application (e.g., a real estate application or a gaming application) installed in the smart glasses 1603.
If it is determined in operation 1670 that the display information change event occurs, the smart glasses 1603 may change the transfer profile using the user input interface in operation 1680.
In operation 1690, the smart glasses 1603 may transmit information including user input data based on the transmission profile changed in operation 1680 to the user terminal 1601 so that the smart glasses 1603 may be synchronized with the user terminal 1601.
The synchronization process between the user terminal 1601 and the smart glasses 1603 described above with reference to fig. 16 may also be performed in a distributed manner across one or more devices, such as server/client, master/slave, and/or multi-access edge computing (MEC) cloud/pico cloud configurations.
Embodiments described herein may be implemented using hardware components, software components, and/or combinations thereof. A processing device may be implemented using one or more general purpose or special purpose computers, such as, for example, a processor, controller and Arithmetic Logic Unit (ALU), digital Signal Processor (DSP), microcomputer, field Programmable Gate Array (FPGA), programmable Logic Unit (PLU), microprocessor, or any other device capable of responding to and executing instructions in a defined manner. The processing device may run an OS and one or more software applications running on the OS. The processing device may also access, store, manipulate, process, and create data in response to execution of the software. For simplicity, the description of the processing device is used in the singular; however, those skilled in the art will appreciate that a processing device may include multiple processing elements and multiple types of processing elements. For example, the processing device may include multiple processors, or a single processor and a single controller. In addition, different processing configurations are possible, such as parallel processors.
The software may include computer programs, code segments, instructions, or some combination thereof to individually or collectively instruct or configure the processing device to operate as needed. The software and data may be embodied permanently or temporarily in any type of machine, component, physical or virtual equipment, computer storage medium or device, or in a propagated signal wave capable of providing instructions or data to, or being interpreted by, a processing device. The software may also be distributed over network-coupled computer systems so that the software is stored and executed in a distributed fashion. The software and data may be stored by one or more non-transitory computer readable recording media.
The method according to the above-described embodiments may be recorded in a non-transitory computer-readable medium including program instructions for implementing various operations of the above-described embodiments. Media may also include data files, data structures, and the like, alone or in combination with program instructions. The program instructions recorded on the medium may be program instructions specially designed and constructed for the purposes of the embodiments, or they may be of the type well known and available to those having skill in the computer software arts. Examples of non-transitory computer readable media include magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as CD-ROM disks and DVDs; magneto-optical media such as optical disks; and hardware devices that are specially configured to store and perform program instructions, such as read-only memory (ROM), random Access Memory (RAM), flash memory, and the like. Examples of program instructions include both machine code, such as produced by a compiler, and files containing higher level code that may be executed by the computer using an interpreter.
The hardware devices described above may be configured to act as one or more software modules in order to perform the operations of the embodiments described above, and vice versa.
As described above, although the embodiments have been described with reference to the limited drawings, those skilled in the art can apply various technical modifications and variations thereto. For example, suitable results may be achieved if the described techniques were performed in a different order and/or if components in the described systems, architectures, devices or circuits were combined in a different manner and/or replaced or supplemented by other components or their equivalents. Accordingly, other implementations, other examples, and equivalents of the claims are within the scope of the following claims.
According to one embodiment, source device 101, 102, 300, 510, 1001, 1501, and/or 1601 includes wireless communication module 192, 310, memory 130, 330, and processor 120, 340. The processor 120, 340 may be configured to: transmit screen image data generated by the source device 101, 102, 300, 510, 1001, 1501, and/or 1601 to be displayed on sink device 101, 102, 400, 530, 1003, 1503, and/or 1603 to the sink device 101, 102, 400, 530, 1003, 1503, and/or 1603 through the wireless communication module 192, 310; determine, while the screen image 537 is displayed on the sink device 101, 102, 400, 530, 1003, 1503, and/or 1603, whether a target application that changes the transmission amount of user input data generated in the screen image 537 based on the screen image data by an input device 531 connected to the sink device 101, 102, 400, 530, 1003, 1503, and/or 1603 is being executed; and, when it is determined that the target application is being executed, adjust the transmission bit rate of the screen image data by changing a transmission profile used to transmit the screen image data.
According to one embodiment, the screen image data may include at least one of an image frame generated by copying a screen 515 displayed by the display module 160, 320 of the source device 101, 102, 300, 510, 1001, 1501, and/or 1601 or an image frame related to a screen 515 displayed by the display module 160, 320 of the source device 101, 102, 300, 510, 1001, 1501, and/or 1601.
According to one embodiment, the processor 120, 340 may be configured to: when it is determined that the target application is being executed, a message including a transfer profile associated with the user input data is sent to the sink device 101, 102, 400, 530, 1003, 1503, and/or 1603.
According to one embodiment, the processor 120, 340 may be configured to: when it is determined that the target application is being executed, request the sink device 101, 102, 400, 530, 1003, 1503, and/or 1603 to increase the transmission amount of the user input data based on the transmission profile.
According to one embodiment, the target application may include at least one of a handwriting application, a photo editing application, or a drawing application, in which at least one of a pen input event or a touch input event occurs.
According to one embodiment, when the sink device 101, 102, 400, 530, 1003, 1503, and/or 1603 is smart glasses 1603, the processor 120, 340 may be configured to: receive, through the wireless communication module 192, 310, display information including at least one of the number of screen images displayed on the smart glasses 1603, the size of each of the screen images, the resolution of each of the screen images, or the bit rate of each of the screen images; and request, based on the display information, the smart glasses 1603 to transmit information through a user input interface for sharing additional information.
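To make the source-side flow concrete, the following is a minimal Kotlin sketch, assuming a hypothetical session API: when a target application comes to the foreground, the source lowers the screen-image bit rate and asks the sink to raise its user-input transmission amount. The interface, method names, package names, and bit-rate values are all invented for illustration and are not part of the disclosure.

```kotlin
// Hypothetical stand-in for the Wi-Fi Display session; not an API from the disclosure.
interface MirroringSession {
    fun setVideoBitRateKbps(kbps: Int)
    fun requestInputBoost(maxEventsPerSend: Int)
}

class SourceInputBoostController(
    private val session: MirroringSession,
    // Assumed package names of pen/touch-heavy target applications.
    private val targetPackages: Set<String> = setOf(
        "com.example.handwriting",
        "com.example.photoeditor",
        "com.example.drawing",
    ),
) {
    fun onForegroundAppChanged(packageName: String) {
        if (packageName in targetPackages) {
            session.setVideoBitRateKbps(4_000)       // lower the screen-image bit rate
            session.requestInputBoost(Int.MAX_VALUE) // ask the sink for more user input data
        } else {
            session.setVideoBitRateKbps(10_000)      // restore the normal transmission profile
        }
    }
}
```

The trade is deliberate: video quality is sacrificed while the target application runs so that the back channel carrying pen and touch events gains headroom.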
According to one embodiment, a sink device 101, 102, 400, 530, 1003, 1503, and/or 1603 includes a wireless communication module 192, 410, a display module 160, 320, a memory 130, 430, and a processor 120, 440. The processor 120, 440 may be configured to: receive, through the wireless communication module 192, 410, screen image data generated by the source device 101, 102, 300, 510, 1001, 1501, and/or 1601 to be displayed on the sink device; display a screen image 537 based on the screen image data using the display module 160, 320; acquire, while the screen image 537 is displayed, user input data generated in the screen image 537 by an input device 531 connected to the sink device; dynamically change, based on the network quality between the source device and the sink device, a parameter for adaptively adjusting the transmission amount of the user input data, the parameter including at least one of a data size or a number of pieces of data to be transmitted; and transmit the dynamically changed parameter to the source device.
According to one embodiment, the screen image data may include at least one of an image frame generated by copying the screen 515 displayed by the display module 160, 320 of the source device 101, 102, 300, 510, 1001, 1501, and/or 1601, or an image frame related to that screen 515.
According to one embodiment, the processor 120, 440 may be configured to: measure the network quality between the source device 101, 102, 300, 510, 1001, 1501, and/or 1601 and the sink device 101, 102, 400, 530, 1003, 1503, and/or 1603; determine, based on the network quality, one of a plurality of transmission profiles 610, 630, and 650 for transmitting the user input data; and adaptively adjust the parameter based on the determined transmission profile.
According to one embodiment, the plurality of transmission profiles 610, 630, and 650 may include at least two of a first transmission profile 610 corresponding to a first network quality, a second transmission profile 630 corresponding to a second network quality lower than the first network quality, and a third transmission profile 650 corresponding to a third network quality lower than the second network quality. Each of the transmission profiles 610, 630, and 650 may include a transmission amount, which may include at least one of the maximum size or the maximum number of pieces of user input data to be transmitted at one time under that profile.
According to one embodiment, the first transmission profile 610 may have a first transmission amount in which the maximum size and maximum number of pieces of user input data are unrestricted. The second transmission profile 630 may have a second transmission amount smaller than the first transmission amount, obtained by reducing or discarding user input data. The third transmission profile 650 may have a third transmission amount in which the size and number of pieces of user input data are smaller than those of the second transmission amount.
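As a rough illustration, the three profiles and the quality-based choice among them could be modeled as follows. The enum, the numeric limits, and the mapping are assumptions for the sketch; the description above fixes only the ordering of the transmission amounts and that the first profile is unrestricted.

```kotlin
// Illustrative model of transmission profiles 610/630/650; the concrete
// limits are invented, only their ordering follows the description above.
enum class NetworkQuality { GOOD, FAIR, POOR }

data class TransmissionProfile(
    val maxEventBytes: Int,     // maximum size of one piece of user input data
    val maxEventsPerSend: Int,  // maximum number of pieces transmitted at one time
)

val firstProfile = TransmissionProfile(Int.MAX_VALUE, Int.MAX_VALUE) // unrestricted
val secondProfile = TransmissionProfile(maxEventBytes = 16, maxEventsPerSend = 30)
val thirdProfile = TransmissionProfile(maxEventBytes = 8, maxEventsPerSend = 10)

fun selectProfile(quality: NetworkQuality): TransmissionProfile = when (quality) {
    NetworkQuality.GOOD -> firstProfile  // first profile: no restriction
    NetworkQuality.FAIR -> secondProfile // reduce or discard user input data
    NetworkQuality.POOR -> thirdProfile  // reduce size and count further
}
```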
According to one embodiment, the processor 120, 440 may be configured to calculate a first difference between a first coordinate of first user input data generated in the screen image 537 at a current time and a second coordinate of second user input data generated at a previous time before the current time and, when the first difference is less than a set first reference value, adjust the number of pieces of user input data to be transmitted by discarding the first user input data based on the transmission profile.
According to one embodiment, the processor 120, 440 may be configured to calculate a second difference between first input information and second input information and, when the second difference is smaller than a set second reference value, adjust the number of pieces of user input data to be transmitted by discarding the first user input data based on the transmission profile. The first input information may include at least one of a first pressure, a first tilt, or a first orientation of first user input data generated in the screen image 537 at a current time, and the second input information may include at least one of a second pressure, a second tilt, or a second orientation of second user input data generated at a previous time before the current time.
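The two discard rules can be sketched as one stateful filter over incoming samples, as below. The threshold values stand in for the first and second reference values, which the embodiments leave unspecified, and the sketch drops a sample only when both differences are small; the embodiments state each rule separately, so this combination is one possible reading.

```kotlin
import kotlin.math.abs
import kotlin.math.hypot

// One piece of user input data, reduced to the fields the two rules use.
data class PenSample(val x: Float, val y: Float, val pressure: Float)

class SampleFilter(
    private val firstReference: Float = 2f,     // coordinate difference threshold (assumed)
    private val secondReference: Float = 0.01f, // pressure difference threshold (assumed)
) {
    private var previous: PenSample? = null

    /** Returns true if the sample should be transmitted, false if it is discarded. */
    fun accept(sample: PenSample): Boolean {
        val prev = previous ?: run { previous = sample; return true }
        val coordDiff = hypot(sample.x - prev.x, sample.y - prev.y)
        val inputDiff = abs(sample.pressure - prev.pressure)
        if (coordDiff < firstReference && inputDiff < secondReference) return false // discard
        previous = sample
        return true
    }
}
```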
According to one embodiment, the processor 120, 440 may be configured to adjust the number of pieces of user input data to be transmitted based on the transmission profile for each type of user input data.
According to one embodiment, the user input data may include an input report 900 indicating the content of the user input data and at least one of report descriptors 1110, 1130, 1210, and 1230, each of which is transmitted before the input report 900 and indicates the composition of the input report 900 so that its values can be interpreted. The processor 120, 440 may be configured to: when the transmission profile changes in response to a change in network quality, adjust the data size of the user input data described in the report descriptor 1110, 1130, 1210, 1230 based on at least one of the changed transmission profile or the type of user input data.
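A simplified picture of that relationship: the descriptor, sent first, announces the width of each field so the source can decode the input reports that follow, and a profile change triggers a new descriptor with narrower fields. The byte layout below is invented for the sketch and is not the actual report-descriptor format; TransmissionProfile is the type from the earlier profile sketch.

```kotlin
import java.nio.ByteBuffer

// Invented descriptor: announces field widths for subsequent input reports.
data class ReportDescriptor(val coordBytes: Int, val pressureBytes: Int)

// Narrow the fields when the profile restricts the per-event data size.
fun descriptorFor(profile: TransmissionProfile): ReportDescriptor =
    if (profile.maxEventBytes <= 8) ReportDescriptor(coordBytes = 2, pressureBytes = 1)
    else ReportDescriptor(coordBytes = 4, pressureBytes = 2)

// Encode one input report according to the current descriptor.
fun encodeInputReport(d: ReportDescriptor, x: Int, y: Int, pressure: Int): ByteArray {
    val buf = ByteBuffer.allocate(2 * d.coordBytes + d.pressureBytes)
    fun put(value: Int, bytes: Int) {
        for (shift in bytes - 1 downTo 0) buf.put(((value shr (8 * shift)) and 0xFF).toByte())
    }
    put(x, d.coordBytes)
    put(y, d.coordBytes)
    put(pressure, d.pressureBytes)
    return buf.array()
}
```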
According to one embodiment, the processor 120, 440 may be configured to determine the network quality between the source device 101, 102, 300, 510, 1001, 1501, and/or 1601 and the sink device 101, 102, 400, 530, 1003, 1503, and/or 1603 based on at least one of the TCP window size or the round-trip time (RTT) between the two devices.
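One of the two signals can be approximated with a timed reachability probe, as in the sketch below; the RTT thresholds are assumptions, and reading the TCP window size is platform-specific and omitted here. NetworkQuality is the enum from the profile sketch above.

```kotlin
import java.net.InetAddress

// Rough RTT probe: times an echo to the peer; returns null if the peer does
// not answer within the timeout. Real stacks would average several probes.
fun measureRttMillis(peer: InetAddress, timeoutMs: Int = 1000): Long? {
    val start = System.nanoTime()
    return if (peer.isReachable(timeoutMs)) (System.nanoTime() - start) / 1_000_000 else null
}

// Assumed thresholds mapping the measured RTT onto the quality levels
// used for transmission-profile selection.
fun classifyRtt(rttMs: Long?): NetworkQuality = when {
    rttMs == null || rttMs > 100 -> NetworkQuality.POOR
    rttMs > 30 -> NetworkQuality.FAIR
    else -> NetworkQuality.GOOD
}
```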
According to one embodiment, when the sink device 101, 102, 400, 530, 1003, 1503, and/or 1603 is smart glasses 1603, the processor 120, 440 may be configured to: share, with the source device 101, 102, 300, 510, 1001, 1501, and/or 1601, display information including at least one of the number of screen images displayed on the smart glasses 1603, the size of each of the screen images, the resolution of each of the screen images, or the bit rate of each of the screen images; and receive a request for information transmission from the source device through a user input interface for sharing additional information.
According to one embodiment, a method of operating a source device 101, 102, 300, 510, 1001, 1501, and/or 1601 includes: an operation 1310 of transmitting screen image data generated by the source device to be displayed on the sink device 101, 102, 400, 530, 1003, 1503, and/or 1603 to the sink device; an operation 1320 of determining, while the screen image 537 is displayed on the sink device, whether a target application is being executed, the target application being configured to change the transmission amount of user input data generated in the screen image 537, which is based on the screen image data, by an input device 531 connected to the sink device; and an operation 1330 of adjusting the transmission bit rate of the screen image data by changing the transmission profile used to transmit the screen image data when it is determined that the target application is being executed.
According to one embodiment, a method of operating a sink device 101, 102, 400, 530, 1003, 1503, and/or 1603 includes: an operation 1410 of receiving screen image data generated by the source device 101, 102, 300, 510, 1001, 1501, and/or 1601 to be displayed on the sink device; an operation 1420 of displaying a screen image 537 based on the screen image data; an operation 1430 of acquiring user input data generated in the screen image 537 by an input device 531 connected to the sink device; an operation 1440 of dynamically changing, based on the network quality between the source device and the sink device, a parameter for adaptively adjusting the transmission amount of the user input data, the parameter including at least one of a data size or a number of pieces of data to be transmitted; and an operation 1450 of transmitting the dynamically changed parameter to the source device.
According to one embodiment, the method of operating the sink device 101, 102, 400, 530, 1003, 1503, and/or 1603 may further include: measuring the network quality between the source device 101, 102, 300, 510, 1001, 1501, and/or 1601 and the sink device; and selecting, based on the network quality, one of a plurality of transmission profiles to be used for transmitting the user input data. According to one embodiment, the plurality of transmission profiles includes a first transmission profile corresponding to a first network quality and a second transmission profile corresponding to a second network quality lower than the first network quality, and each of the plurality of transmission profiles includes a transmission amount including at least one of the maximum size or the maximum number of pieces of user input data to be transmitted at one time.
According to one embodiment, the maximum number of pieces of user input data to be transmitted at one time under the second transmission profile may be approximately 50% of the maximum number under the first transmission profile.
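For instance, if the first profile allowed 60 pieces per transmission, the second profile would allow about 30. Below is a sketch of applying such per-send caps when batching events; the concrete numbers are assumptions, and only the roughly 50% ratio comes from the description above.

```kotlin
// Assumed caps: the second profile sends at one time ~50% of the first profile's maximum.
fun maxPerSend(profileRank: Int, firstCap: Int = 60): Int = when (profileRank) {
    1 -> firstCap        // first transmission profile
    2 -> firstCap / 2    // approximately 50% of the first profile's cap
    else -> firstCap / 4 // further-reduced third profile (illustrative)
}

// Transmit at most `cap` pieces of user input data per send.
fun <T> sendInBatches(events: List<T>, cap: Int, send: (List<T>) -> Unit) {
    events.chunked(cap).forEach(send)
}
```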

Claims (15)

1. A source device, comprising:
a wireless communication module;
a memory; and
a processor,
wherein the processor is configured to:
transmit, through the wireless communication module, screen image data generated by the source device to be displayed on a sink device to the sink device;
determine, while a screen image based on the screen image data is displayed on the sink device, whether a target application is being executed, the target application being configured to change a transmission amount of user input data generated in the screen image by an input device connected to the sink device; and
based on determining that the target application is being executed, adjust a transmission bit rate of the screen image data by changing a transmission profile used to transmit the screen image data.
2. The source device of claim 1, wherein the screen image data includes at least one of an image frame generated by copying a screen displayed by a display module of the source device and an image frame related to the screen displayed by the display module of the source device.
3. The source device of claim 1, wherein the processor is further configured to: based on determining that the target application is being executed, send, to the sink device, a message including the transmission profile associated with the user input data.
4. The source device of claim 3, wherein the processor is further configured to: based on determining that the target application is being executed, request the sink device to increase the transmission amount of the user input data based on the transmission profile.
5. The source device of claim 1, wherein the target application comprises at least one of a handwriting application, a photo editing application, and a drawing application, in which at least one of a pen input event and a touch input event occurs.
6. The source device of claim 1, wherein the sink device is smart glasses and the processor is configured to:
receive, through the wireless communication module, display information including at least one of a number of screen images displayed on the smart glasses, a size of each of the screen images, a resolution of each of the screen images, and a bit rate of each of the screen images; and
request, based on the display information, the smart glasses to transmit information through a user input interface for sharing additional information.
7. A sink device, comprising:
a wireless communication module;
a display module;
a memory; and
a processor,
wherein the processor is configured to:
receive, through the wireless communication module, screen image data generated by a source device to be displayed on the sink device;
display a screen image based on the screen image data using the display module;
acquire user input data generated in the screen image by an input device connected to the sink device while the screen image is displayed;
dynamically change, based on a network quality between the source device and the sink device, a parameter for adaptively adjusting a transmission amount of the user input data, the parameter including at least one of a data size and a number of pieces of data to be transmitted; and
transmit the dynamically changed parameter to the source device.
8. The sink device of claim 7, wherein the screen image data comprises at least one of an image frame generated by copying a screen displayed by a display module of the source device and an image frame related to the screen displayed by the display module of the source device.
9. The sink device of claim 7, wherein the processor is further configured to:
measure the network quality between the source device and the sink device;
determine, based on the network quality, one of a plurality of transmission profiles for transmission of the user input data; and
adaptively adjust the parameter based on the one of the plurality of transmission profiles.
10. The sink device of claim 9, wherein:
the plurality of transmission profiles includes at least two of a first transmission profile corresponding to a first network quality, a second transmission profile corresponding to a second network quality lower than the first network quality, and a third transmission profile corresponding to a third network quality lower than the second network quality; and
each of the plurality of transmission profiles includes a transmission amount including at least one of a maximum size or a maximum number of pieces of the user input data to be transmitted at a time.
11. The sink device of claim 10, wherein:
the first transmission profile has a first transmission amount in which the maximum size and the maximum number of pieces of the user input data are unrestricted;
the second transmission profile has a second transmission amount smaller than the first transmission amount, obtained by reducing or discarding the user input data; and
the third transmission profile has a third transmission amount in which the size and the number of pieces of the user input data are smaller than those corresponding to the second transmission amount.
12. The sink device of claim 9, wherein the processor is further configured to:
calculate a first difference between a first coordinate of first user input data generated in the screen image at a current time and a second coordinate of second user input data generated at a previous time before the current time; and
when the first difference is smaller than a set first reference value, adjust the number of pieces of user input data to be transmitted by discarding the first user input data based on the transmission profile.
13. The sink device of claim 9, wherein the processor is further configured to:
calculate a second difference between first input information and second input information, the first input information including at least one of a first pressure, a first tilt, or a first orientation of first user input data generated in the screen image at a current time, and the second input information including at least one of a second pressure, a second tilt, or a second orientation of second user input data generated at a previous time before the current time; and
when the second difference is smaller than a set second reference value, adjust the number of pieces of user input data to be transmitted by discarding the first user input data based on the transmission profile.
14. The sink device of claim 9, wherein the processor is further configured to adjust the number of pieces of user input data to be transmitted based on a transmission profile for each type of the user input data,
wherein the user input data comprises at least one of:
an input report indicating the content of the user input data; and
a report descriptor that is transmitted prior to the input report and indicates a composition of the input report for interpreting values of the input report,
and wherein the processor is further configured to: when the transmission profile is changed in response to a change in the network quality, adjust a data size of the user input data included in the report descriptor based on at least one of the changed transmission profile and the type of the user input data.
15. The sink device of claim 7, wherein the processor is further configured to, based on a determination that the sink device is smart glasses:
share display information with the source device, the display information including at least one of a number of screen images displayed on the smart glasses, a size of each of the screen images, a resolution of each of the screen images, and a bit rate of each of the screen images; and
receive a request for information transmission from the source device through a user input interface for sharing additional information.
CN202280086528.8A 2021-12-30 2022-10-30 Source device, sink device, and method of operating the same Pending CN118476234A (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
KR10-2021-0192800 2021-12-30
KR10-2022-0019455 2022-02-15
KR1020220019455A KR20230103800A (en) 2021-12-30 2022-02-15 Source device, sink device, and methods of operation thereof
PCT/KR2022/016772 WO2023128206A1 (en) 2021-12-30 2022-10-30 Source device, sink device, and operating methods therefor

Publications (1)

Publication Number Publication Date
CN118476234A (en)

Family

ID=92159668

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202280086528.8A Pending CN118476234A (en) 2021-12-30 2022-10-30 Source device, sink device, and method of operating the same

Country Status (1)

Country Link
CN (1) CN118476234A (en)

Similar Documents

Publication Publication Date Title
EP3046331B1 (en) Media control method and system based on cloud desktop
US20220156029A1 (en) Electronic device and method for providing application screen of display of external device thereof
US12074960B2 (en) Electronic device and method for electronic device processing received data packet
US11900015B2 (en) Electronic device and method for controlling audio volume thereof
KR20210101696A (en) Electronic device and method for controling buffer
US11706261B2 (en) Electronic device and method for transmitting and receiving content
US10319341B2 (en) Electronic device and method for displaying content thereof
US20160029027A1 (en) Device and method for processing image
US20240080530A1 (en) Electronic apparatus and operating method of electronic apparatus
US20230413348A1 (en) Method for transmitting audio data in electronic device
US12032867B2 (en) Source device and sink device for sharing expanded screen, and methods of operating the same
WO2019089398A1 (en) Networked user interface back channel discovery via wired video connection
US20230217169A1 (en) Electronic device and multichannel audio output method using same
US20230154500A1 (en) Electronic device, and method of synchronizing video data and audio data by using same
US11853637B2 (en) Electronic device for synchronizing output time point of content output by external devices and method of operating same
US20230214168A1 (en) Source device, sink device, and operating methods thereof
CN118476234A (en) Source device, sink device, and method of operating the same
US11797346B2 (en) Electronic device for controlling processing unit on basis of time spent for generating frame and maximum allowed time and method of operating electronic device
KR20230103800A (en) Source device, sink device, and methods of operation thereof
KR20200114707A (en) Electronic device and method for processing a streaming application in electronic device
US11863904B2 (en) Electronic device for performing video call using frame rate conversion and method for the same
EP4344216A1 (en) Electronic device for transmitting plurality of image streams while performing video call, and method for operating electronic device
US20230188917A1 (en) Electronic device for converting number of audio channels, and electronic device operating method
KR20220167624A (en) Electronic device for performing video call using frame rate conversion and method for the same
KR20230055331A (en) Source device, and sink device sharing extended screen, and operation method thereof

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication