WO2023128206A1 - Source device, sink device, and operating methods thereof - Google Patents

Source device, sink device, and operating methods thereof

Info

Publication number
WO2023128206A1
Authority
WO
WIPO (PCT)
Prior art keywords
user input
transmission
sink device
input data
source device
Prior art date
Application number
PCT/KR2022/016772
Other languages
English (en)
Korean (ko)
Inventor
김형진
이승범
Original Assignee
Samsung Electronics Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from KR1020220019455A (published as KR20230103800A)
Application filed by Samsung Electronics Co., Ltd.
Priority to US 18/120,913 (published as US20230214168A1)
Publication of WO2023128206A1

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/436 Interfacing a local distribution network, e.g. communicating with another STB or one or more peripheral devices inside the home
    • H04N21/4363 Adapting the video stream to a specific local network, e.g. a Bluetooth® network
    • H04N21/45 Management operations performed by the client for facilitating the reception of or the interaction with the content or administrating data related to the end-user or to the client device itself, e.g. learning user preferences for recommending movies, resolving scheduling conflicts
    • H04N21/462 Content or additional data management, e.g. creating a master electronic program guide from data received from the Internet and a Head-end, controlling the complexity of a video stream by scaling the resolution or bit-rate based on the client capabilities
    • H04N21/80 Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/81 Monomedia components thereof
    • H04N7/00 Television systems
    • H04N7/01 Conversion of standards, e.g. involving analogue television standards or digital television standards processed at pixel level

Definitions

  • Various embodiments of the present disclosure relate to a source device, a sink device, and operating methods thereof.
  • A human interface device, which provides a user interface between a user and a device, can receive various types of user input, such as touch input events, gesture input, mouse input, keyboard input, and/or pen input events.
  • user input may be transmitted to each device through a user input back channel (UIBC), for example.
  • UIBC may correspond to a function in which a user input generated from an image displayed on a display screen of a sink device is transmitted to a source device, and then the source device processes the user input.
  • the UIBC may be used to communicate user input from the sink device to a user interface residing on the source device.
  • a device providing an image to another electronic device may be referred to as a 'source device'
  • a device receiving the corresponding image may be referred to as a 'sink device'.
  • The amount and size of user input data generated by the human interface device also increase in proportion to the size of the original video image. For example, a pen input event may generate 300 to 400 pieces of input data per second.
  • When network throughput is poor, if UIBC data is transmitted in the same way as when network throughput is good, the amount and size of UIBC data to be processed within the limited throughput increases. As a result, a delay in user input may occur between the devices exchanging UIBC data, which may impair usability.
  • Although the bit rate of user input data is considerably smaller than that of video images, user input may still be delayed when a large amount of user input data is generated.
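  • As a rough, hypothetical illustration of the load described above, the back-channel traffic generated by a pen at 400 events per second can be estimated as follows; the 20-byte report size and the 54-byte per-packet overhead are assumptions for illustration, not values from the disclosure.

```kotlin
fun main() {
    val eventsPerSecond = 400          // upper end of the pen input rate cited above
    val reportBytes = 20               // assumed size of one pen input report (hypothetical)
    val perPacketOverheadBytes = 54    // assumed TCP/IP + link-layer overhead if each report is sent alone

    val payloadBps = eventsPerSecond * reportBytes * 8
    val worstCaseBps = eventsPerSecond * (reportBytes + perPacketOverheadBytes) * 8

    println("UIBC payload: ${payloadBps / 1000.0} kbit/s")      // 64.0 kbit/s
    println("Unbatched   : ${worstCaseBps / 1000.0} kbit/s")    // 236.8 kbit/s
}
```

  • Even though these figures are small compared with a typical video bit rate, sending every report in its own packet on a congested link adds per-packet overhead and queueing delay, which is one reason the adaptive adjustment described next can help.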
  • According to various embodiments, the transmission of user input data (e.g., user input back channel (UIBC) data) generated between a source device and a sink device can be adaptively adjusted according to a network environment, including network throughput, and/or the type of application used by the source device.
  • the sink device can adaptively adjust the amount of transmission of user input data by determining a network condition.
  • According to an embodiment, when a target application that can generate a large amount of user input is executed in the sink device, the source device may reduce the bit rate of the video image transmitted to the sink device through communication between the source device and the sink device, and the sink device may increase the transmission amount of user input data transmitted to the source device.
  • According to an embodiment, a source device includes a wireless communication module, a memory, and a processor. Through the wireless communication module, the processor may transmit screen image data generated by the source device to a sink device for display on the sink device, determine whether a target application configured to change the transmission amount of user input data generated from the screen image by an input device connected to the sink device is running while the screen image based on the screen image data is displayed on the sink device, and, based on the determination that the target application is running, adjust the transmission bit rate of the screen image data by changing a transmission profile for transmitting the screen image data.
  • According to an embodiment, a sink device includes a wireless communication module, a display module, a memory, and a processor. Through the wireless communication module, the processor may receive screen image data generated by a source device for display on the sink device, display a screen image based on the screen image data through the display module, and obtain user input data generated from the screen image by an input device connected to the sink device while the screen image is displayed.
  • According to an embodiment, a method of operating a source device may include: transmitting screen image data generated by the source device to a sink device for display on the sink device; determining whether a target application configured to change the transmission amount of user input data generated from the screen image by an input device connected to the sink device is running while a screen image based on the screen image data is displayed on the sink device; and, based on the determination that the target application is running, adjusting the transmission bit rate of the screen image data by changing a transmission profile for transmitting the screen image data.
  • According to an embodiment, a method of operating a sink device may include: receiving screen image data generated by a source device for display on the sink device; displaying a screen image based on the screen image data; obtaining user input data generated from the screen image by an input device connected to the sink device while the screen image is displayed; dynamically changing a parameter for transmission of the user input data, including at least one of a data size and a number of transmissions, based on network quality between the source device and the sink device; and transmitting the dynamically changed parameter to the source device.
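  • The embodiments above describe the same coordination from both sides: the source device lowers the video transmission profile when an input-heavy target application is running, and the sink device raises or lowers the amount of user input data it sends back. The sketch below is a minimal, hypothetical illustration of that control flow; the class names, the profile values, and the numeric limits are assumptions, not values taken from the disclosure.

```kotlin
// Hypothetical transmission profiles; the disclosure only says that a profile defines
// how screen image data and user input data are transmitted.
enum class TransmissionProfile(val videoBitrateMbps: Int, val maxInputReportsPerSec: Int) {
    VIDEO_PRIORITY(20, 120),   // no target application running: favor the screen image
    INPUT_PRIORITY(6, 400)     // target application running: favor the user input back channel
}

class SourceDeviceController(private var profile: TransmissionProfile = TransmissionProfile.VIDEO_PRIORITY) {
    /** Called when the foreground application on the mirrored screen changes. */
    fun onForegroundAppChanged(isTargetApp: Boolean): TransmissionProfile {
        profile = if (isTargetApp) TransmissionProfile.INPUT_PRIORITY else TransmissionProfile.VIDEO_PRIORITY
        // In the disclosure, the changed profile is signalled to the sink device and the
        // video encoder bit rate is adjusted accordingly.
        return profile
    }
}

class SinkDeviceController {
    /** Clamp the locally generated input-report rate to what the current profile allows. */
    fun allowedReportRate(measuredReportsPerSec: Int, profile: TransmissionProfile): Int =
        minOf(measuredReportsPerSec, profile.maxInputReportsPerSec)
}

fun main() {
    val source = SourceDeviceController()
    val sink = SinkDeviceController()

    val profile = source.onForegroundAppChanged(isTargetApp = true)   // e.g., a drawing app starts
    println("video bit rate -> ${profile.videoBitrateMbps} Mbps")                   // 6 Mbps
    println("pen reports    -> ${sink.allowedReportRate(380, profile)} per second") // 380
}
```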
  • the sink device can improve usability of an input device (eg, a human interface device) connected to the sink device by determining network quality and adaptively increasing or decreasing the amount of transmission of user input data.
  • According to an embodiment, the quality of user input data may be improved by adaptively adjusting, through communication between the source device and the sink device, the transmission amount of user input data and the bit rate of the data (e.g., screen image data) exchanged between the source device and the sink device.
  • FIG. 1 is a block diagram of an electronic device in a network environment according to various embodiments.
  • FIG. 2 is a block diagram illustrating a program according to various embodiments.
  • FIG. 3 is a block diagram of a source device according to an embodiment.
  • FIG. 4 is a block diagram of a sink device according to an embodiment.
  • FIG. 5 is a diagram for explaining an operation between a source device and a sink device according to an embodiment.
  • FIG. 6 is a block diagram illustrating a state diagram of a transmission profile according to an embodiment.
  • FIG. 7 is a diagram for explaining transmission amount and data size for each of a plurality of transmission profiles according to an embodiment.
  • FIG. 8 is a diagram for explaining a method in which a sink device limits the transmission number of user input data according to an embodiment.
  • FIG. 9 is a diagram illustrating an example of an input report according to an embodiment.
  • FIG. 10 is a diagram for explaining a method of exchanging an input report and a report descriptor between a source device and a sink device according to an embodiment.
  • FIG. 11 is a diagram for explaining a report descriptor that is changed for each transmission profile in response to a touch input event according to an exemplary embodiment.
  • FIG. 12 is a diagram for explaining a report descriptor that is changed for each transmission profile in response to a pen input event according to an exemplary embodiment.
  • FIG. 13 is a flowchart illustrating a method of operating a source device according to an embodiment.
  • FIG. 14 is a flowchart illustrating a method of operating a sink device according to an embodiment.
  • FIG. 15 is a flowchart illustrating a method of adjusting a bit rate of data transmitted through communication between a source device and a sink device according to an embodiment.
  • FIG. 16 is a diagram for explaining a process of transmitting user input data when a source device is a user terminal and a sink device is smart glasses, according to an embodiment.
  • FIG. 1 is a block diagram of an electronic device 101 within a network environment 100, according to various embodiments.
  • Referring to FIG. 1, the electronic device 101 may communicate with an electronic device 102 through a first network 198 (e.g., a short-range wireless communication network), or may communicate with at least one of an electronic device 104 or a server 108 through a second network 199 (e.g., a long-range wireless communication network). According to an embodiment, the electronic device 101 may communicate with the electronic device 104 through the server 108.
  • The electronic device 101 may include a processor 120, a memory 130, an input module 150, a sound output module 155, a display module 160, an audio module 170, a sensor module 176, an interface 177, a connection terminal 178, a haptic module 179, a camera module 180, a power management module 188, a battery 189, a communication module 190, a subscriber identification module 196, or an antenna module 197.
  • In some embodiments, at least one of these components (e.g., the connection terminal 178) may be omitted from the electronic device 101, or one or more other components may be added.
  • In some embodiments, some of these components (e.g., the sensor module 176, the camera module 180, or the antenna module 197) may be integrated into a single component (e.g., the display module 160).
  • The processor 120 may, for example, execute software (e.g., the program 140) to control at least one other component (e.g., a hardware or software component) of the electronic device 101 connected to the processor 120, and may perform various data processing or computations. According to an embodiment, as at least part of the data processing or computation, the processor 120 may load instructions or data received from another component (e.g., the sensor module 176 or the communication module 190) into a volatile memory 132, process the instructions or data stored in the volatile memory 132, and store the resulting data in a non-volatile memory 134.
  • According to an embodiment, the processor 120 may include a main processor 121 (e.g., a central processing unit or an application processor) or an auxiliary processor 123 (e.g., a graphics processing unit, a neural processing unit (NPU), an image signal processor, a sensor hub processor, or a communication processor) that can operate independently of, or together with, the main processor 121.
  • For example, the auxiliary processor 123 may be set to use less power than the main processor 121 or to be specialized for a designated function.
  • The auxiliary processor 123 may be implemented separately from, or as part of, the main processor 121.
  • The auxiliary processor 123 may, for example, control at least some of the functions or states related to at least one of the components of the electronic device 101 (e.g., the display module 160, the sensor module 176, or the communication module 190), in place of the main processor 121 while the main processor 121 is in an inactive (e.g., sleep) state, or together with the main processor 121 while the main processor 121 is in an active (e.g., application-execution) state.
  • According to an embodiment, the auxiliary processor 123 (e.g., an image signal processor or a communication processor) may be implemented as part of another component (e.g., the camera module 180 or the communication module 190) that is functionally related to it.
  • the auxiliary processor 123 may include a hardware structure specialized for processing an artificial intelligence model.
  • AI models can be created through machine learning. Such learning may be performed, for example, in the electronic device 101 itself where the artificial intelligence model is performed, or may be performed through a separate server (eg, the server 108).
  • The learning algorithm may include, for example, supervised learning, unsupervised learning, semi-supervised learning, or reinforcement learning, but is not limited to the above examples.
  • the artificial intelligence model may include a plurality of artificial neural network layers.
  • Artificial neural networks include deep neural networks (DNNs), convolutional neural networks (CNNs), recurrent neural networks (RNNs), restricted boltzmann machines (RBMs), deep belief networks (DBNs), bidirectional recurrent deep neural networks (BRDNNs), It may be one of deep Q-networks or a combination of two or more of the foregoing, but is not limited to the foregoing examples.
  • the artificial intelligence model may include, in addition or alternatively, software structures in addition to hardware structures.
  • the memory 130 may store various data used by at least one component (eg, the processor 120 or the sensor module 176) of the electronic device 101 .
  • the data may include, for example, input data or output data for software (eg, program 140) and commands related thereto.
  • the memory 130 may include volatile memory 132 or non-volatile memory 134 .
  • the program 140 may be stored as software in the memory 130 and may include, for example, an operating system 142 , middleware 144 , or an application 146 .
  • the input module 150 may receive a command or data to be used by a component (eg, the processor 120) of the electronic device 101 from the outside of the electronic device 101 (eg, a user).
  • the input module 150 may include, for example, a microphone, a mouse, a keyboard, a key (eg, a button), or a digital pen (eg, a stylus pen).
  • the sound output module 155 may output sound signals to the outside of the electronic device 101 .
  • the sound output module 155 may include, for example, a speaker or a receiver.
  • the speaker can be used for general purposes such as multimedia playback or recording playback.
  • a receiver may be used to receive an incoming call. According to one embodiment, the receiver may be implemented separately from the speaker or as part of it.
  • the display module 160 may visually provide information to the outside of the electronic device 101 (eg, a user).
  • The display module 160 may include, for example, a display, a hologram device, or a projector, and a control circuit for controlling the corresponding device.
  • the display module 160 may include a touch sensor set to detect a touch or a pressure sensor set to measure the intensity of force generated by the touch.
  • The audio module 170 may convert sound into an electrical signal, or vice versa. According to an embodiment, the audio module 170 may acquire sound through the input module 150, or may output sound through the sound output module 155 or an external electronic device (e.g., the electronic device 102, such as a speaker or headphones) connected directly or wirelessly to the electronic device 101.
  • The sensor module 176 may detect an operating state (e.g., power or temperature) of the electronic device 101 or an external environmental state (e.g., a user state), and generate an electrical signal or data value corresponding to the detected state.
  • The sensor module 176 may include, for example, a gesture sensor, a gyro sensor, an atmospheric pressure sensor, a magnetic sensor, an acceleration sensor, a grip sensor, a proximity sensor, a color sensor, an infrared (IR) sensor, a biometric sensor, a temperature sensor, a humidity sensor, or a fingerprint sensor.
  • the interface 177 may support one or more designated protocols that may be used to directly or wirelessly connect the electronic device 101 to an external electronic device (eg, the electronic device 102).
  • the interface 177 may include, for example, a high definition multimedia interface (HDMI), a universal serial bus (USB) interface, an SD card interface, or an audio interface.
  • connection terminal 178 may include a connector through which the electronic device 101 may be physically connected to an external electronic device (eg, the electronic device 102).
  • the connection terminal 178 may include, for example, an HDMI connector, a USB connector, an SD card connector, or an audio connector (eg, a headphone connector).
  • the haptic module 179 may convert electrical signals into mechanical stimuli (eg, vibration or motion) or electrical stimuli that a user may perceive through tactile or kinesthetic senses.
  • the haptic module 179 may include, for example, a motor, a piezoelectric element, or an electrical stimulation device.
  • The camera module 180 may capture still images and moving images. According to an embodiment, the camera module 180 may include one or more lenses, image sensors, image signal processors, or flashes.
  • the power management module 188 may manage power supplied to the electronic device 101 .
  • the power management module 188 may be implemented as at least part of a power management integrated circuit (PMIC), for example.
  • the battery 189 may supply power to at least one component of the electronic device 101 .
  • the battery 189 may include, for example, a non-rechargeable primary cell, a rechargeable secondary cell, or a fuel cell.
  • The communication module 190 may support establishment of a direct (e.g., wired) communication channel or a wireless communication channel between the electronic device 101 and an external electronic device (e.g., the electronic device 102, the electronic device 104, or the server 108), and communication through the established communication channel.
  • the communication module 190 may include one or more communication processors that operate independently of the processor 120 (eg, an application processor) and support direct (eg, wired) communication or wireless communication.
  • The communication module 190 may include a wireless communication module 192 (e.g., a cellular communication module, a short-range wireless communication module, or a global navigation satellite system (GNSS) communication module) or a wired communication module 194 (e.g., a local area network (LAN) communication module or a power line communication module).
  • A corresponding one of these communication modules may communicate with an external electronic device through the first network 198 (e.g., a short-range communication network such as Bluetooth, wireless fidelity (Wi-Fi) Direct, or infrared data association (IrDA)) or the second network 199 (e.g., a long-range communication network such as a legacy cellular network, a 5G network, a next-generation communication network, the Internet, or a computer network).
  • The wireless communication module 192 may identify and authenticate the electronic device 101 within a communication network, such as the first network 198 or the second network 199, using subscriber information (e.g., an International Mobile Subscriber Identity (IMSI)) stored in the subscriber identification module 196.
  • The wireless communication module 192 may support a 5G network following a 4G network, and a next-generation communication technology, for example, new radio (NR) access technology.
  • The NR access technology may support high-speed transmission of large-capacity data (enhanced mobile broadband (eMBB)), minimization of terminal power and access by a large number of terminals (massive machine type communications (mMTC)), or high reliability and low latency (ultra-reliable and low-latency communications (URLLC)).
  • the wireless communication module 192 may support a high frequency band (eg, mmWave band) to achieve a high data rate, for example.
  • The wireless communication module 192 may support various technologies for securing performance in a high-frequency band, for example, beamforming, massive multiple-input multiple-output (MIMO), full-dimensional MIMO (FD-MIMO), array antennas, analog beamforming, or large-scale antennas.
  • the wireless communication module 192 may support various requirements defined for the electronic device 101, an external electronic device (eg, the electronic device 104), or a network system (eg, the second network 199).
  • The wireless communication module 192 may support a peak data rate for eMBB realization (e.g., 20 Gbps or more), loss coverage for mMTC realization (e.g., 164 dB or less), or U-plane latency for URLLC realization (e.g., 0.5 ms or less for each of downlink (DL) and uplink (UL), or a round trip of 1 ms or less).
  • the antenna module 197 may transmit or receive signals or power to the outside (eg, an external electronic device).
  • the antenna module 197 may include an antenna including a radiator formed of a conductor or a conductive pattern formed on a substrate (eg, PCB).
  • The antenna module 197 may include a plurality of antennas (e.g., an array antenna). In this case, at least one antenna suitable for the communication scheme used in a communication network, such as the first network 198 or the second network 199, may be selected from the plurality of antennas by, for example, the communication module 190. A signal or power may then be transmitted or received between the communication module 190 and an external electronic device through the selected at least one antenna.
  • According to some embodiments, a component other than the radiator (e.g., a radio frequency integrated circuit (RFIC)) may be additionally formed as part of the antenna module 197.
  • the antenna module 197 may form a mmWave antenna module.
  • According to an embodiment, the mmWave antenna module may include a printed circuit board, an RFIC disposed on or adjacent to a first surface (e.g., a lower surface) of the printed circuit board and capable of supporting a designated high-frequency band (e.g., the mmWave band), and a plurality of antennas (e.g., an array antenna) disposed on or adjacent to a second surface (e.g., a top surface or a side surface) of the printed circuit board and capable of transmitting or receiving signals in the designated high-frequency band.
  • At least some of the above components may be connected to each other through a communication scheme between peripheral devices (e.g., a bus, general-purpose input and output (GPIO), serial peripheral interface (SPI), or mobile industry processor interface (MIPI)) and may exchange signals (e.g., commands or data) with each other.
  • commands or data may be transmitted or received between the electronic device 101 and the external electronic device 104 through the server 108 connected to the second network 199 .
  • Each of the external electronic devices 102 or 104 may be the same as or different from the electronic device 101 .
  • all or part of operations executed in the electronic device 101 may be executed in one or more external electronic devices among the external electronic devices 102 , 104 , or 108 .
  • For example, when the electronic device 101 needs to perform a certain function or service automatically or in response to a request from a user or another device, the electronic device 101 may, instead of executing the function or service by itself, request one or more external electronic devices to perform the function or at least part of the service.
  • One or more external electronic devices receiving the request may execute at least a part of the requested function or service or an additional function or service related to the request, and deliver the execution result to the electronic device 101 .
  • the electronic device 101 may provide the result as at least part of a response to the request as it is or additionally processed.
  • the electronic device 101 may provide an ultra-low latency service using, for example, distributed computing or mobile edge computing.
  • the external electronic device 104 may include an internet of things (IoT) device.
  • Server 108 may be an intelligent server using machine learning and/or neural networks.
  • the external electronic device 104 or server 108 may be included in the second network 199 .
  • the electronic device 101 may be applied to intelligent services (eg, smart home, smart city, smart car, or health care) based on 5G communication technology and IoT-related technology.
  • the program 140 includes an operating system 142, middleware 144, or an application 146 executable in the operating system 142 for controlling one or more resources of the electronic device 101.
  • the operating system 142 may include, for example, AndroidTM, iOSTM, WindowsTM, SymbianTM, TizenTM, or BadaTM.
  • At least part of the program 140 may be, for example, preloaded in the electronic device 101 at the time of manufacture, or downloaded or updated from an external electronic device (e.g., the electronic device 102 or 104) or the server 108 when used by a user.
  • the operating system 142 may control management (eg, allocation or reclamation) of one or more system resources (eg, process, memory, or power) of the electronic device 101 .
  • The operating system 142 may additionally or alternatively include one or more driver programs for driving other hardware devices of the electronic device 101, such as the input module 150, the sound output module 155, the display module 160, the audio module 170, the sensor module 176, the interface 177, the haptic module 179, the camera module 180, the power management module 188, the battery 189, the communication module 190, the subscriber identification module 196, or the antenna module 197.
  • the middleware 144 may provide various functions to the application 146 so that the function or information provided from one or more resources of the electronic device 101 may be used by the application 146 .
  • The middleware 144 may include, for example, an application manager 201, a window manager 203, a multimedia manager 205, a resource manager 207, a power manager 209, a database manager 211, a package manager 213, a connectivity manager 215, a notification manager 217, a location manager 219, a graphics manager 221, a security manager 223, a call manager 225, or a voice recognition manager 227.
  • the application manager 201 may manage the life cycle of the application 146 , for example.
  • the window manager 203 may manage one or more GUI resources used in a screen, for example.
  • The multimedia manager 205 may, for example, identify one or more formats necessary for reproducing media files, and encode or decode a corresponding media file using a codec suitable for the selected format.
  • the resource manager 207 may manage a source code of the application 146 or a memory space of the memory 130 .
  • The power manager 209 may, for example, manage the capacity, temperature, or power of the battery 189, and determine or provide related information necessary for the operation of the electronic device 101 using the corresponding information. According to an embodiment, the power manager 209 may interoperate with a basic input/output system (BIOS) (not shown) of the electronic device 101.
  • the database manager 211 may create, search, or change a database to be used by the application 146, for example.
  • the package manager 213 may manage installation or update of applications distributed in the form of package files, for example.
  • the connectivity manager 215 may manage, for example, a wireless connection or a direct connection between the electronic device 101 and an external electronic device.
  • the notification manager 217 may provide a function for notifying a user of occurrence of a designated event (eg, an incoming call, message, or alarm), for example.
  • the location manager 219 may manage location information of the electronic device 101, for example.
  • the graphic manager 221 may manage, for example, one or more graphic effects to be provided to a user or a user interface related thereto.
  • Security manager 223 may provide system security or user authentication, for example.
  • the telephony manager 225 may manage, for example, a voice call function or a video call function provided by the electronic device 101 .
  • The voice recognition manager 227 may, for example, transmit the user's voice data to the server 108 and receive from the server 108 a command corresponding to a function to be performed in the electronic device 101 based at least partially on the voice data, or text data converted based at least partially on the voice data.
  • According to an embodiment, the middleware 144 may dynamically delete some existing components or add new components.
  • at least part of the middleware 144 may be included as part of the operating system 142 or may be implemented as separate software different from the operating system 142 .
  • The application 146 may include, for example, a home application 251, a dialer application 253, an SMS/MMS application 255, an instant messaging (IM) application 257, a browser 259, a camera application 261, an alarm application 263, a contacts application 265, a voice recognition application 267, an email application 269, a calendar application 271, a media player 273, an album application 275, a watch application 277, a health application 279 (e.g., for measuring biometric information such as exercise or blood sugar), or an environmental information application 281 (e.g., for measuring atmospheric pressure, humidity, or temperature information). According to an embodiment, the application 146 may further include an information exchange application (not shown) capable of supporting information exchange between the electronic device 101 and an external electronic device.
  • the information exchange application may include, for example, a notification relay application configured to transmit designated information (eg, a call, message, or alarm) to an external electronic device, or a device management application configured to manage an external electronic device.
  • the notification relay application for example, transmits notification information corresponding to a designated event (eg, mail reception) generated in another application (eg, the email application 269) of the electronic device 101 to an external electronic device.
  • the notification relay application may receive notification information from an external electronic device and provide the notification information to the user of the electronic device 101 .
  • The device management application may, for example, control the power (e.g., turn-on or turn-off) of an external electronic device communicating with the electronic device 101, or of some component thereof (e.g., a display module or a camera module of the external electronic device), or a function thereof (e.g., brightness, resolution, or focus).
  • the device management application may additionally or alternatively support installation, deletion, or update of an application operating in an external electronic device.
  • Electronic devices may be devices of various types.
  • the electronic device may include, for example, a portable communication device (eg, a smart phone), a computer device, a portable multimedia device, a portable medical device, a camera, a wearable device, or a home appliance.
  • Terms such as 'first', 'second', or 'primary' or 'secondary' may simply be used to distinguish a component from other corresponding components, and do not limit the component in other respects (e.g., importance or order).
  • When a (e.g., first) component is said to be "coupled" or "connected" to another (e.g., second) component, with or without the terms "functionally" or "communicatively", it means that the component may be connected to the other component directly (e.g., by wire), wirelessly, or through a third component.
  • The term 'module' used in various embodiments of this document may include a unit implemented in hardware, software, or firmware, and may be used interchangeably with terms such as logic, logic block, part, or circuit.
  • a module may be an integrally constructed component or a minimal unit of components or a portion thereof that performs one or more functions.
  • the module may be implemented in the form of an application-specific integrated circuit (ASIC).
  • Various embodiments of this document may be implemented as software (e.g., the program 140) including one or more instructions stored in a storage medium (e.g., an internal memory 136 or an external memory 138) readable by a machine (e.g., the electronic device 101 of FIG. 1). For example, a processor (e.g., the processor 120) of the machine (e.g., the electronic device 101) may call at least one of the one or more stored instructions from the storage medium and execute it.
  • the one or more instructions may include code generated by a compiler or code executable by an interpreter.
  • a storage medium readable by a device may be provided in the form of a non-transitory storage medium.
  • Here, 'non-transitory' only means that the storage medium is a tangible device and does not contain a signal (e.g., an electromagnetic wave); this term does not distinguish between a case in which data is stored semi-permanently in the storage medium and a case in which data is stored temporarily.
  • the method according to various embodiments disclosed in this document may be provided by being included in a computer program product.
  • Computer program products may be traded between sellers and buyers as commodities.
  • A computer program product may be distributed in the form of a device-readable storage medium (e.g., a compact disc read-only memory (CD-ROM)), or may be distributed (e.g., downloaded or uploaded) online through an application store (e.g., Play Store™) or directly between two user devices (e.g., smartphones).
  • at least part of the computer program product may be temporarily stored or temporarily created in a storage medium readable by a device such as a manufacturer's server, an application store server, or a relay server's memory.
  • According to various embodiments, each of the above-described components (e.g., a module or a program) may include a single entity or a plurality of entities, and some of the plurality of entities may be separately disposed in another component.
  • one or more components or operations among the aforementioned corresponding components may be omitted, or one or more other components or operations may be added.
  • According to various embodiments, a plurality of components (e.g., modules or programs) may be integrated into a single component.
  • In such a case, the integrated component may perform one or more functions of each of the plurality of components identically or similarly to the way they were performed by the corresponding component among the plurality of components prior to the integration.
  • According to various embodiments, operations performed by a module, program, or other component may be executed sequentially, in parallel, repeatedly, or heuristically; one or more of the operations may be executed in a different order or omitted; or one or more other operations may be added.
  • Referring to FIG. 3, a source device 300 (e.g., the electronic device 101 or the electronic device 102 of FIG. 1, the source device 510 of FIG. 5, the source device 1001 of FIG. 10, the source device 1501 of FIG. 15, and/or the user terminal 1601 of FIG. 16) may transmit screen image data to a sink device (e.g., the electronic device 101 or the electronic device 102 of FIG. 1, the sink device 400 of FIG. 4, the sink device 530 of FIG. 5, the sink device 1003 of FIG. 10, the sink device 1503 of FIG. 15, and/or the smart glasses 1603 of FIG. 16).
  • The screen image data may include, for example, at least one of an image frame replicating a screen (e.g., the screen 515 of FIG. 5) displayed by the display module 320 of the source device 300 and/or an image frame related to the screen 515 displayed by the display module 320 of the source device 300.
  • The source device 300 may, for example, transmit the screen 515, which is the output actually displayed on the source device 300, to the sink device 400 as it is, or may combine at least some image frames of the output displayed on the source device 300 (e.g., the screen 515) and transmit them to the sink device 400.
  • the source device 300 may transmit new image frames not output from the source device 300 to the sink device 400 .
  • Image frames may be included in a data packet of a designated format and transmitted.
  • 'image frame related to the screen 515' may be simplified and expressed as 'screen image data'.
  • the screen image data may be multimedia data including not only video data but also audio data.
  • the source device 300 and the sink device 400 are located adjacent to each other and may be connected to the same Wi-Fi network, but are not necessarily limited thereto.
  • the source device 300 and the sink device 400 may be connected through Bluetooth communication.
  • the source device 300 may be, for example, an electronic device capable of supporting Miracast to wirelessly share multimedia data including high-resolution photos and high-definition video contents between Wi-Fi devices.
  • The source device 300 may transmit the screen image data, generated by transcoding the screen displayed on the display of the source device 300 (e.g., the screen 515 of FIG. 5), to the sink device 400 through wireless communication such as Wi-Fi.
  • Since the image frame output from the source device 300 is copied and/or edited before transmission, the transmitted frame may actually correspond to a different image frame.
  • The screen image data transmitted to the sink device 400 may have the same resolution and the same aspect ratio as the screen 515 output from the display module 320 of the source device 300, or may have a different resolution and/or a different aspect ratio from the screen 515 output from the display module 320.
  • a video format and/or an audio format of screen image data transmitted to the sink device 400 may be determined according to codec settings between the source device 300 and the sink device 400 .
  • According to an embodiment, the source device 300 may receive and process user input data, such as touch input events and key inputs, from the sink device 400 through a user input back channel (UIBC). While the screen image 537 based on the screen image data is displayed on the display screen of the sink device 400, the UIBC may be used to transmit, to the source device 300, user input data corresponding to a user input (e.g., the user input 535 of FIG. 5) generated on the screen image 537 by an input device connected to the sink device 400, after which the source device 300 processes the user input data.
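  • Because the screen image displayed on the sink device may have a different resolution and/or aspect ratio than the source screen, the source device typically has to map received input coordinates back into its own screen space before processing them. The sketch below shows a minimal, hypothetical version of that mapping; the centered (letterboxed) scaling assumption and the example resolutions are illustrative and not taken from the disclosure.

```kotlin
data class Size(val width: Int, val height: Int)

/**
 * Map a touch coordinate reported in the sink's displayed image back to the source screen,
 * assuming the source screen was scaled uniformly and centered (letterboxed) on the sink display.
 */
fun mapToSourceCoordinates(xSink: Int, ySink: Int, sinkImage: Size, sourceScreen: Size): Pair<Int, Int> {
    val scale = minOf(
        sinkImage.width.toDouble() / sourceScreen.width,
        sinkImage.height.toDouble() / sourceScreen.height
    )
    val offsetX = (sinkImage.width - sourceScreen.width * scale) / 2.0
    val offsetY = (sinkImage.height - sourceScreen.height * scale) / 2.0
    val xSource = ((xSink - offsetX) / scale).toInt().coerceIn(0, sourceScreen.width - 1)
    val ySource = ((ySink - offsetY) / scale).toInt().coerceIn(0, sourceScreen.height - 1)
    return xSource to ySource
}

fun main() {
    // A 1080x2340 phone screen mirrored onto a 1920x1080 sink display (illustrative numbers).
    val mapped = mapToSourceCoordinates(
        xSink = 960, ySink = 540,
        sinkImage = Size(1920, 1080), sourceScreen = Size(1080, 2340)
    )
    println(mapped)   // roughly the center of the source screen
}
```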
  • According to an embodiment, the source device 300 may include a wireless communication module 310 (e.g., the wireless communication module 192 of FIG. 1), a display module 320 (e.g., the display module 160 of FIG. 1), a memory 330 (e.g., the memory 130 of FIG. 1), and a processor 340 (e.g., the processor 120 of FIG. 1).
  • The wireless communication module 310 may perform wireless communication with the sink device 400 and may transmit, to the sink device 400, a plurality of image frames related to the screen of the source device 300 (e.g., the screen 515 of FIG. 5) generated by the source device 300 to be displayed on the sink device 400.
  • the wireless communication module 310 may receive user input data transmitted from the sink device 400 to the source device 300 through UIBC.
  • The UIBC has a reverse channel structure, also referred to as a user input back channel, and may be configured such that the sink device 400 transmits, to the source device 300, user input data corresponding to user inputs generated by an input device connected to the sink device 400 (e.g., the input device 531 of FIG. 5).
  • the reverse channel structure may allow upper layer messages and user interface functions for transmitting user inputs to reside in an internet protocol (IP) transport layer between the sink device 400 and the source device 300.
  • The UIBC may also be configured to run on top of a packet-based communication protocol, such as the transmission control protocol/internet protocol (hereinafter, 'TCP/IP') or the user datagram protocol (UDP).
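  • As a rough illustration of running the back channel on top of TCP, the sketch below serializes a hypothetical input event and writes it to a socket connected to the source device. The field layout (a 1-byte event type followed by two 16-bit coordinates) and the port number are assumptions for illustration only; they are not taken from the disclosure or from any particular UIBC specification.

```kotlin
import java.io.DataOutputStream
import java.net.Socket

// Hypothetical, simplified input event sent from the sink to the source over the back channel.
data class InputEvent(val type: Byte, val x: Int, val y: Int)

class UibcSender(host: String, port: Int = 7236 /* assumed port, illustration only */) {
    private val socket = Socket(host, port)
    private val out = DataOutputStream(socket.getOutputStream())

    /** Serialize one event as: type (1 byte) | x (2 bytes) | y (2 bytes). */
    fun send(event: InputEvent) {
        out.writeByte(event.type.toInt())
        out.writeShort(event.x)
        out.writeShort(event.y)
        out.flush()   // low latency matters more than batching for user input
    }

    fun close() {
        out.close()
        socket.close()
    }
}
```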
  • the UIBC may be configured to transmit various types of user input data including cross-platform or multi-platform user input data capable of operating on various types of computer platforms. For example, while the source device 300 runs iOS ® , the sink device 400 may run another operating system such as Android ® or Windows ® .
  • a plurality of different types of user input formats may enable a plurality of different types of source device 300 and sink devices 400 to utilize the protocol via UIBC.
  • As the user input format, for example, a generic input format may be used, or a platform-specific input format (e.g., an HID format) may be used.
  • By transmitting and receiving user input data between the source device 300 and the sink device 400 through the UIBC, flexibility regarding the platform and/or operating system used by each device may be provided.
  • The user input data may be generated on the screen image 537, which is displayed based on the screen image data through the display module of the sink device 400 (e.g., the display module 420 of FIG. 4), by an input event of an input device connected to (or included in) the sink device 400.
  • the input device may include, for example, a mouse, a keyboard, a touch screen, a pen, a microphone, and a wearable device, but is not necessarily limited thereto.
  • the input event may include, for example, a mouse click, a key input on a keyboard, a touch input event on a touch screen, a pen input event, a voice input, a gesture input, and a gaze movement input, but is not necessarily limited thereto.
  • the display module 320 may display a screen (eg, screen 515 of FIG. 5 ) generated by the source device 300 .
  • the memory 330 may store computer-executable instructions. In addition, the memory 330 may store various pieces of information generated during processing of the processor 340 . In addition, the memory 330 may store various data and programs. The memory 330 may include volatile memory or non-volatile memory. The memory 330 may include a mass storage medium such as a hard disk to store various types of data.
  • the processor 340 may access the memory 330 and execute instructions.
  • the processor 340 may transmit screen image data generated by the source device 300 to the sink device 400 to be displayed on the sink device 400 through the wireless communication module 310 .
  • The screen image data generated by the source device 300 may include, for example, an image frame replicating the screen 515 displayed on the display module 320 of the source device 300 and/or an image frame related to the screen 515 displayed on the display module 320 of the source device 300, but is not necessarily limited thereto.
  • The image frame related to the 'screen 515 displayed on the display module 320 of the source device 300' includes the same content as the screen 515 displayed on the display module 320, but may differ from it in, for example, image resolution and/or aspect ratio.
  • The processor 340 may determine whether a target application, configured to change the transmission amount of user input data generated from the screen image 537 by the input device connected to the sink device 400, is running.
  • The 'target application' may correspond to an application configured to change the transmission amount of user input data generated by an input device connected to the sink device 400 when the application is executed, because the application generates user input data in larger amounts or more frequently than other applications.
  • The target application may include, for example, an application providing a specific service in which at least one of a pen input event and a touch input event occurs, such as a writing application, a photo editing application, and/or a drawing application, but is not necessarily limited thereto.
  • the target application may include not only an application that provides a specific service, but also user experience (UX) of a basic framework displaying a menu.
  • Based on the determination that the target application is running, the processor 340 may adjust the transmission bit rate of the screen image data by changing the transmission profile for transmitting the screen image data.
  • For example, the processor 340 may lower the transmission bit rate of the screen image data transmitted to the sink device 400, while requesting the sink device 400, through the transmission profile, to increase the transmission amount of user input data.
  • Conversely, the processor 340 may increase the transmission bit rate of the screen image data transmitted to the sink device 400 and request the sink device 400, through the transmission profile, to lower the transmission amount of user input data.
  • the processor 340 may transmit a message including a transmission profile related to user input data to the sink device 400.
  • a transfer profile may define a communication scheme for data transmission between a sink device and a source device.
  • the transmission profile may include, for example, at least one of the type, structure, and usage method of a protocol used for transmission of corresponding data, but is not necessarily limited thereto.
  • a message transmitted from the source device 300 to the sink device 400 may be, for example, a real time streaming protocol (RTSP) message.
  • RTSP is a network control protocol for controlling a streaming media server and may operate in the application layer of the Internet protocol suite.
  • The RTSP message may include information, such as a transmission profile, about how voice or video transmitted in real time is to be transmitted.
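  • As a purely illustrative example of what such a message could look like, the sketch below builds an RTSP SET_PARAMETER request carrying a hypothetical parameter that names the requested transmission profile and an input-report budget. The parameter name "wfd_uibc_transmission_profile" and all values are assumptions for illustration; the disclosure does not define a specific message syntax.

```kotlin
fun buildProfileChangeRequest(sessionId: String, profile: String, maxInputReportsPerSec: Int): String =
    buildString {
        // Hypothetical RTSP SET_PARAMETER request; the parameter name below is an
        // assumption for illustration, not a defined Wi-Fi Display parameter.
        val body = "wfd_uibc_transmission_profile: $profile;max_input_reports=$maxInputReportsPerSec\r\n"
        append("SET_PARAMETER rtsp://localhost/wfd1.0 RTSP/1.0\r\n")
        append("CSeq: 5\r\n")
        append("Session: $sessionId\r\n")
        append("Content-Type: text/parameters\r\n")
        append("Content-Length: ${body.toByteArray().size}\r\n")
        append("\r\n")
        append(body)
    }

fun main() {
    println(buildProfileChangeRequest(sessionId = "6B8B4567", profile = "INPUT_PRIORITY", maxInputReportsPerSec = 400))
}
```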
  • the sink device 400 may be, for example, a wearable device (eg, smart glasses 1603 of FIG. 16 ) as shown in FIG. 16 below.
  • The processor 340 may receive, through the wireless communication module 310, display information including at least one of the number, size, resolution, and bit rate of screen images displayed on the smart glasses 1603.
  • the processor 340 may request information transmission from the smart glasses 1603 through a user input interface for sharing additional information according to display information.
  • the user input interface will be described later with reference to FIG. 16 .
  • 'Additional information' may be information additionally used to process information not defined by a general UIBC protocol.
  • The additional information may include, for example, at least one of eye (iris), head, and hand tracking information, and image information and/or depth information additionally used for gesture recognition and object recognition and tracking, but is not necessarily limited thereto.
  • the operation of the processor 340 is not limited to the above.
  • the processor 340 may perform the above-described operation together with at least one of operations described below with reference to FIGS. 5 to 16 .
  • Referring to FIG. 4, the sink device 400 (e.g., the electronic device 101 or the electronic device 102 of FIG. 1, the sink device 530 of FIG. 5, the sink device 1003 of FIG. 10, the sink device 1503 of FIG. 15, and/or the smart glasses 1603 of FIG. 16) may establish communication with a source device (e.g., the electronic device 101 or the electronic device 102 of FIG. 1, the source device 300 of FIG. 3, the source device 510 of FIG. 5, the source device 1001 of FIG. 10, the source device 1501 of FIG. 15, and/or the user terminal 1601 of FIG. 16).
  • The sink device 400 may be, for example, an electronic device such as a personal computer (PC), a smartphone, a laptop, or a tablet, or a wearable electronic device such as the smart glasses 1603, but is not necessarily limited thereto.
  • According to an embodiment, the sink device 400 may include a wireless communication module 410 (e.g., the wireless communication module 192 of FIG. 1), a display module 420 (e.g., the display module 160 of FIG. 1), a memory 430 (e.g., the memory 130 of FIG. 1), and a processor 440 (e.g., the processor 120 of FIG. 1).
  • the wireless communication module 410 may receive screen image data generated by the source device 300 to be displayed on the sink device 400 .
  • the display module 420 may display screen image data received through the wireless communication module 410 .
  • the memory 430 may store computer-executable instructions. Also, the memory 430 may store various pieces of information generated during processing of the processor 440 . In addition, the memory 430 may store various data and programs. The memory 430 may include volatile memory or non-volatile memory. The memory 430 may include a mass storage medium such as a hard disk to store various types of data.
  • the processor 440 may access the memory 430 and execute instructions.
  • the processor 440 may receive screen image data transmitted by the source device 300 to be displayed on the sink device 400 through the wireless communication module 410 .
  • the processor 440 may display a screen image 537 based on screen image data through the display module 420 .
  • The screen image data may include, for example, an image frame replicating a screen (e.g., the screen 515 of FIG. 5) displayed by the display module 320 of the source device 300 and/or an image frame related to the screen 515 displayed by the display module 320 of the source device 300, but is not necessarily limited thereto.
  • The processor 440 may obtain user input data generated from the screen image 537 by an input device (e.g., the input device 531 of FIG. 5) connected to the sink device 400.
  • The processor 440 may dynamically change a parameter for adaptively adjusting the transmission amount of the user input data, including at least one of a data size and a number of transmissions, based on the network quality between the source device 300 and the sink device 400.
  • The 'parameter for adaptively adjusting the transmission amount of user input data' may be understood as encompassing, for example, not only the parameters of the input report and/or the report descriptor of the user input data described below with reference to FIGS. 9 and 10, but also the size of the user input data and the number of pieces of user input data.
  • the processor 440 may determine a transmission profile for UIBC transmission of user input data according to current network conditions including network throughput (TP).
  • the processor 440 may measure network quality between the source device 300 and the sink device 400, for example.
• the processor 440 may determine the network quality between the source device 300 and the sink device 400 based on, for example, at least one of a round trip time (RTT) between the source device 300 and the sink device 400 and a TCP window size, but is not necessarily limited thereto.
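• As an illustration of how such a measurement could be mapped onto a quality level, the following minimal Python sketch classifies the network from an RTT and a TCP window size. The threshold values, the function name, and the three-level scale are assumptions made for illustration; the embodiment does not specify them.

```python
# Hypothetical sketch: classify network quality from RTT and TCP window size.
# The thresholds below are illustrative assumptions, not values from the embodiment.

def estimate_network_quality(rtt_ms: float, tcp_window_bytes: int) -> str:
    """Return 'good', 'normal', or 'bad' based on simple, assumed thresholds."""
    if rtt_ms < 20 and tcp_window_bytes >= 256 * 1024:
        return "good"      # first network quality
    if rtt_ms < 80 and tcp_window_bytes >= 64 * 1024:
        return "normal"    # second network quality
    return "bad"           # third network quality

# Example: a 15 ms RTT with a 512 KiB window maps to 'good'.
print(estimate_network_quality(15.0, 512 * 1024))
```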
• the processor 440 may determine one transmission profile among a plurality of transmission profiles (e.g., the transmission profiles 610, 630, and 650 of FIG. 6) used to transmit the user input data according to the measured network quality.
• the plurality of transmission profiles 610, 630, and 650 may include, for example, at least two of a first transmission profile 610 corresponding to a first network quality, a second transmission profile 630 corresponding to a second network quality lower than the first network quality, and a third transmission profile 650 corresponding to a third network quality lower than the second network quality, but are not necessarily limited thereto.
• the relationship between the transmission profiles 610, 630, and 650 will be described in more detail with reference to FIG. 6 below.
  • the transmission profiles 610 , 630 , and 650 may include a transmission amount for each transmission profile 610 , 630 , and 650 .
• the transmission amount for each of the transmission profiles 610, 630, and 650 may include, for example, at least one of the maximum size of user input data that can be transmitted at one time and the maximum number of user input data that can be transmitted at one time for each transmission profile 610, 630, and 650, but is not necessarily limited thereto. The transmission amounts and data sizes for each of the plurality of transmission profiles 610, 630, and 650 will be described in more detail with reference to FIG. 7 below.
  • the processor 440 may adaptively adjust parameters according to the determined transmission profile.
  • the processor 440 may adjust the transmission number of user input data according to a transmission profile for each type of user input data, for example. A method of controlling the number of transmissions of user input data by the processor 440 will be described in more detail with reference to FIG. 8 below.
• the user input data may include, for example, at least one of an input report indicating the contents of the user input data (e.g., the input report 900 of FIG. 9) and a report descriptor (e.g., the report descriptors 1110 and 1130 of FIG. 11 and/or the report descriptors 1210 and 1230 of FIG. 12), which is transmitted prior to the transmission of the input report and indicates the configuration of the input report used to interpret the value of the input report.
• the processor 440 may adjust the data size of the user input data included in the report descriptor 1110 based on at least one of the changed transmission profile and the type of the user input data.
  • a method for the processor 440 to adaptively adjust a parameter according to a transmission profile will be described in more detail with reference to FIGS. 11 and 12 below.
  • the processor 440 may transmit dynamically changed parameters to the source device 300 .
• the processor 440 may share, with the source device 300, display information including at least one of the number, size, resolution, and bit rate of screen images displayed on the smart glasses 1603.
• the processor 440 may receive a request for information transfer from the source device 300 through a user input interface for sharing additional information.
  • the operation of the processor 440 is not limited to the above-described operation, and the processor 440 may perform the above-described operation together with at least one of operations described later through FIGS. 5 to 16 .
  • FIG. 5 is a diagram for explaining an operation between a source device and a sink device according to an embodiment.
• Referring to FIG. 5, a situation is illustrated in which a source device 510 according to an embodiment (e.g., the electronic device 101 or the electronic device 102 of FIG. 1, the source device 1001 of FIG. 10, the source device 1501 of FIG. 15, and/or the user terminal 1601 of FIG. 16) and a sink device 530 (e.g., the electronic device 101 or the electronic device 102 of FIG. 1, the sink device 400 of FIG. 4, the sink device 1003 of FIG. 10, the sink device 1503 of FIG. 15, and/or the smart glasses 1603 of FIG. 16) exchange data through a communication channel.
  • a communication channel may generally represent any communication medium or collection of different communication mediums for transmitting video data from source device 510 to sink device 530 .
• the communication channel may correspond to a relatively short-range communication channel such as Wi-Fi or Bluetooth, or may correspond to any wired or wireless communication media, such as a radio frequency (RF) spectrum or one or more physical transmission lines, or any combination of wireless and wired media.
  • the communication channel may form part of a packet-based network, such as a local area network, a wide area network, or a global network such as the Internet.
  • the communication channel may include the aforementioned UIBC.
  • the source device 510 may transmit video data including image frames related to the screen 515 and/or screen image data including audio data to the sink device 530 .
  • the source device 510 may transmit screen image data to the sink device 530 using a general communication channel.
  • the sink device 530 may display the screen image 537 by decoding and/or rendering data (eg, screen image data) received from the source device 510 .
  • the sink device 530 may obtain user input data corresponding to a user input 535 generated by an input device (eg, a mouse) 531 connected to the sink device 530 .
• the input device 531 may include, for example, a keyboard, a track ball, a track pad, a touch screen, a voice recognition module, a gesture recognition module, an iris recognition module, a mouth-shape recognition module, and/or various other types of human interface device (HID) units or devices, but is not necessarily limited thereto.
• the sink device 530 may format user input data corresponding to a user input 535, such as a cursor movement generated by the input device 531, into a data packet structure that the source device 510 can interpret, and may transmit the formatted user input data to the source device 510 through the aforementioned UIBC.
• the source device 510 may respond to the user input 535 generated by the input device 531 connected to the sink device 530 while the screen image 537, based on the screen image data generated and transmitted by the source device 510, is displayed on the sink device 530. Through this interaction, user input data corresponding to the user input 535, such as a cursor movement generated at the sink device 530, may be transmitted back to the source device 510 through the UIBC.
  • FIG. 6 is a block diagram illustrating a state diagram of a transmission profile according to an embodiment.
• Referring to FIG. 6, types of transmission profiles (e.g., a first transmission profile 610, a second transmission profile 630, and a third transmission profile 650) that are changed in response to network quality according to an embodiment are illustrated.
• By adjusting the transmission amount of user input data according to network conditions including the network quality between a source device (e.g., the electronic device 101 or the electronic device 102 of FIG. 1, the source device 300 of FIG. 3, or the source device 510 of FIG. 5) and a sink device (e.g., the electronic device 101 or the electronic device 102 of FIG. 1, the sink device 400 of FIG. 4, the sink device 530 of FIG. 5, the sink device 1003 of FIG. 10, the sink device 1503 of FIG. 15, and/or the smart glasses 1603 of FIG. 16), it is possible to reduce the delay for a user input (e.g., the user input 535 of FIG. 5) and provide better usability.
• the sink device 400 may adjust the transmission amount of user input data transmitted to the source device 300 by determining a network condition. For example, the sink device 400 may determine the network condition by measuring the network quality or by determining the network quality based at least in part on information received from other devices (e.g., the source device 300 or an access point (AP)).
• the sink device 400 may, for example, use a round trip time (RTT) on TCP/IP and/or a TCP window size to determine the network quality of the connection between the source device 300 and the sink device 400.
  • the sink device 400 may define a transmission profile for transmission of user input data according to network quality, and may flexibly select an appropriate transmission profile suitable for the current network situation.
  • the first transmission profile 610 may correspond to a first network quality indicating a 'good' network quality.
  • the first transmission profile 610 can also be expressed as a 'High Profile' in that the transmission amount is large.
  • the second transmission profile 630 may correspond to a second network quality indicating a 'normal (or medium)' network quality that is lower than the first network quality.
  • the second transmission profile 630 can also be expressed as a 'Mid Profile' in that the amount of transmission is medium.
  • the third transmission profile 650 may correspond to a third network quality lower than the second network quality, indicating a 'bad' network quality.
  • the third transmission profile 650 can also be expressed as a 'Low Profile' in that the amount of transmission is small.
  • the number of profiles is not limited, and a number of other profiles may be further included.
  • Each of the first transmission profile 610, the second transmission profile 630, and the third transmission profile 650 may include a transmission amount for each transmission profile.
  • the transmission amount for each transmission profile may include, for example, the maximum size, maximum number, and/or bit rate of user input data that can be transmitted at one time for each transmission profile, but is not necessarily limited thereto.
• the sink device 400 can flexibly select an appropriate transmission profile suitable for the current network situation. For example, if the previous user input data was transmitted using the first transmission profile 610 but it is determined that the network quality has deteriorated at the time the current user input data is to be transmitted, the sink device 400 may change the transmission profile from the first transmission profile 610 to the second transmission profile 630 and transmit the current user input data. Alternatively, if the previous user input data was transmitted using the third transmission profile 650 but it is determined that the network quality has improved at the time the current user input data is to be transmitted, the sink device 400 may change the transmission profile from the third transmission profile 650 to the second transmission profile 630 and transmit the current user input data.
  • the sink device 400 may transmit to the source device 300 by adjusting the size and/or number of user input data to be transmitted according to the type of the selected transmission profile. An embodiment in which the sink device 400 adjusts the size and/or number of user input data will be described in more detail with reference to FIGS. 7 and 8 below.
• the source device 300 and the sink device 400 may communicate with each other so that the source device 300 reduces the bit rate of the screen image data transmitted to the sink device 400 and the sink device 400 increases the transmission amount, including the bit rate, of the user input data transmitted to the source device 300, whereby the quality of the user input may be improved.
  • FIG. 7 is a diagram for explaining transmission amount and data size for each of a plurality of transmission profiles according to an embodiment.
• Referring to FIG. 7, a diagram 710 showing the transmission amount and size of user input data when the transmission profile is the first transmission profile, a diagram 730 showing the transmission amount and size of user input data when the transmission profile is the second transmission profile, and a diagram 750 showing the transmission amount and size of user input data when the transmission profile is the third transmission profile are shown.
• The sink device (e.g., the electronic device 101 or the electronic device 102 of FIG. 1, the sink device 400 of FIG. 4, the sink device 530 of FIG. 5, the sink device 1003 of FIG. 10, the sink device 1503 of FIG. 15, and/or the smart glasses 1603 of FIG. 16) may define at least some of the maximum size and/or the maximum number of user input data to be transmitted through each transmission profile.
• the sink device 400 may define the first transmission profile to have a first transmission amount in which the user input data has the maximum size and the maximum number is unlimited.
  • the sink device 400 may define the second transmission profile to have a second transmission amount smaller than the first transmission amount as shown in the drawing 730 by limiting the transmission number of user input data.
  • the second transmission amount may have, for example, 50% of the first transmission amount, which is the maximum transmission amount of the corresponding network, but is not necessarily limited thereto.
  • the sink device 400 may define a third transmission profile such that the size and number of user input data have a third transmission amount smaller than the second transmission amount, as shown in drawing 750, by limiting the transmission number of user input data.
  • the third transmission amount may correspond to the minimum transmission amount of the corresponding network, but is not necessarily limited thereto.
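• As a rough illustration of the per-profile transmission amounts described for FIG. 7, the sketch below models the three transmission profiles as records holding a maximum data size and a maximum transmission count and selects one from a quality level. The concrete numbers and names are assumptions; the embodiment only requires that the second transmission amount be smaller than the first (for example, about 50% of it) and that the third correspond to the minimum.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class TransmissionProfile:
    name: str
    max_report_bytes: Optional[int]     # None means the size is not limited
    max_reports_per_sec: Optional[int]  # None means the count is not limited

# Illustrative values only; the embodiment defines the second amount as roughly
# 50% of the first and the third as the minimum for the network.
HIGH_PROFILE = TransmissionProfile("first/high", None, None)
MID_PROFILE  = TransmissionProfile("second/mid", 4, 60)
LOW_PROFILE  = TransmissionProfile("third/low", 2, 30)

def select_profile(quality: str) -> TransmissionProfile:
    """Pick a transmission profile from an assumed three-level quality label."""
    return {"good": HIGH_PROFILE, "normal": MID_PROFILE, "bad": LOW_PROFILE}[quality]

print(select_profile("normal").name)  # -> "second/mid"
```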
  • the sink device 400 may limit the transmission number of user input data by, for example, downscaling the size of the user input data or dropping the user input data. A method of limiting the transmission number of user input data by the sink device 400 will be described in more detail with reference to FIG. 8 below.
  • FIG. 8 is a diagram for explaining a method in which a sink device limits the transmission number of user input data according to an embodiment.
• Referring to FIG. 8, a diagram 810 showing the number of user input data transmitted when the transmission profile according to an embodiment is the first transmission profile (e.g., the first transmission profile 610 of FIG. 6) and a diagram 830 showing the number of user input data transmitted when the transmission profile is changed from the first transmission profile to the third transmission profile (e.g., the third transmission profile 650 of FIG. 6) are shown.
• In the screen image (e.g., the screen image 537 of FIG. 5) of the sink device (e.g., the electronic device 101 or the electronic device 102 of FIG. 1, the sink device 400 of FIG. 4, the sink device 530 of FIG. 5, the sink device 1003 of FIG. 10, the sink device 1503 of FIG. 15, and/or the smart glasses 1603 of FIG. 16), five user input data (input1, input2, input3, input4, and input5) may be generated at the current time point, as shown in the diagram 810, for example by the (x, y) coordinates of a touch input event or a mouse input, or by the writing pressure ('pen pressure') of a pen input event.
  • the sink device 400 may determine the transmission profile as the third profile according to the relatively low network quality.
  • the sink device 400 may dynamically change a parameter for adaptively adjusting the transmission amount of user input data according to the third transmission profile.
• when a difference between first user input data generated at the current time point and second user input data generated at a previous time point preceding the current time point in the screen image 537 is less than a predetermined value, the sink device 400 may reduce the number of occurrences of user input data by discarding the first user input data generated at the current time point according to the transmission profile.
• for each of the five user input data (input1, input2, input3, input4, and input5) shown in the diagram 810, the sink device 400 may calculate a first difference between a first coordinate of the first user input data generated at the current time point in the screen image 537 and a second coordinate of the second user input data generated at the previous time point preceding the current time point.
• when the first difference is smaller than a set first reference value, the sink device 400 may adjust the transmission number of user input data by dropping the three user input data (input2, input3, and input4) corresponding to the first user input data according to the selected transmission profile (e.g., the third transmission profile), as shown in the diagram 830.
• the sink device 400 may transmit the two remaining input data (input1 and input5) according to the third transmission profile to the source device (e.g., the electronic device 101 or the electronic device 102 of FIG. 1, the source device 300 of FIG. 3, the source device 510 of FIG. 5, the source device 1001 of FIG. 10, the source device 1501 of FIG. 15, and/or the user terminal 1601 of FIG. 16). The number of first user input data discarded by the sink device 400 may vary according to the type of transmission profile.
• the sink device 400 may be configured to drop inputs at a fixed rate rather than comparing the data of the inputs. For example, the second transmission profile may instruct the sink device 400 to drop every other data input, resulting in a 50% reduction in the amount of transmitted data, and the third transmission profile may instruct the sink device 400 to transmit only one of every four user inputs, resulting in a 75% reduction in the amount of transmitted data.
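• The two dropping strategies described above for FIG. 8 (discarding an input whose difference from the previously kept input is below a reference value, or dropping inputs at a fixed rate) could look roughly like the following sketch. The reference value, the keep ratio, and the coordinate-pair representation are illustrative assumptions.

```python
from typing import List, Tuple

Point = Tuple[int, int]

def drop_by_difference(inputs: List[Point], reference: int) -> List[Point]:
    """Keep an input only if it moved at least `reference` units from the last kept input."""
    kept: List[Point] = []
    for x, y in inputs:
        if not kept or abs(x - kept[-1][0]) + abs(y - kept[-1][1]) >= reference:
            kept.append((x, y))
    return kept

def drop_at_fixed_rate(inputs: List[Point], keep_one_of: int) -> List[Point]:
    """Transmit one out of every `keep_one_of` inputs (2 -> 50% reduction, 4 -> 75%)."""
    return inputs[::keep_one_of]

samples = [(100, 100), (101, 100), (101, 101), (102, 101), (160, 150)]
print(drop_by_difference(samples, reference=10))  # only the clearly moved inputs survive
print(drop_at_fixed_rate(samples, keep_one_of=4))
```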
  • the generation amount of user input data may vary according to the type of input device used by the user, for example, the type of user input.
• For a touch input event or a mouse input, a new input event occurs whenever the (x, y) coordinate value changes, and if the (x, y) coordinate value does not change, a new input event does not occur. For a pen input event, when data values such as pen pressure and/or tilt change, it may be determined that a new input event has occurred.
  • a pen input event may generate more input data than a touch input event.
• user input data generated by a pen may additionally include data such as pressure, tilt, and orientation in addition to the (x, y) coordinate values. Therefore, when network conditions are poor, reducing the number of user input data to be transmitted for inputs with a large amount of data, such as a pen input event, may be particularly meaningful.
  • the sink device 400 can reduce the number of occurrences of user input data in the same manner as described above even when input data is generated due to changes in pen pressure and/or inclination in a pen input event.
• the sink device 400 may calculate a second difference between first input information including at least one of a first pressure, a first tilt, and a first orientation of the first user input data generated at the current time point in the screen image 537 and second input information including at least one of a second pressure, a second tilt, and a second orientation of the second user input data generated at a previous time point preceding the current time point.
• when the second difference is smaller than a set second reference value, the sink device 400 may adjust the transmission number of user input data by dropping the first user input data according to the selected transmission profile. The sink device 400 may limit the number of transmitted user input data according to the transmission profile, as in the second transmission profile, or may set the user input data to be transmitted with the minimum data size and number, as in the third transmission profile. The sink device 400 may also transmit the first user input data to the source device 300 as it is, without discarding it.
  • the sink device 400 may reduce the number of generated user input data in the method described above with reference to FIG. 8 and may define the maximum transmission number per second for each profile.
  • the sink device 400 may adjust the transmission number of user input data according to a transmission profile for each type of user input data.
• the sink device 400 may transmit all user input data generated by user input to the source device 300 without limiting the maximum number ('maximum number of occurrences') of user input data generated per predetermined time unit (e.g., 1 second (sec)). Alternatively, the sink device 400 may limit the number of occurrences of user input data so that the maximum number of occurrences for each type of user input data is limited to about 50% of that of the first transmission profile.
• the sink device 400 may transmit to the source device 300 the minimum number of user input data that does not cause a malfunction for each type of user input data. For example, when the type of user input data is a touch input event, a large malfunction may occur if an input event such as touch down and/or touch up is omitted. On the other hand, for an input event such as a touch move whose movement amount is not large, a large malfunction may not occur even if the input event is omitted.
• the sink device 400 may set a specific reference value for an input event that can be omitted according to the amount of movement, such as a touch move, and may discard input events whose movement does not exceed the specific reference value, thereby reducing the generation and transmission of user input data.
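• The per-type rule above (never omitting touch-down or touch-up events, and omitting a touch-move only when its movement does not exceed the reference value) can be sketched as a small gating function; the event-type names and the default reference value are assumptions.

```python
def should_transmit(event_type: str, movement: int, reference: int = 8) -> bool:
    """Touch down/up are always sent; a touch move is sent only if it moved enough."""
    if event_type in ("touch_down", "touch_up"):
        return True
    if event_type == "touch_move":
        return movement > reference
    return True  # other event types are passed through in this sketch

print(should_transmit("touch_up", movement=0))    # True: never dropped
print(should_transmit("touch_move", movement=3))  # False: below the reference value
```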
  • FIG. 9 is a diagram illustrating an example of an input report according to an embodiment. Referring to FIG. 9 , a diagram illustrating an example of an input report 900 of user input data according to an exemplary embodiment is shown.
• The sink device (e.g., the electronic device 101 or the electronic device 102 of FIG. 1, the sink device 400 of FIG. 4, the sink device 530 of FIG. 5, the sink device 1003 of FIG. 10, the sink device 1503 of FIG. 15, and/or the smart glasses 1603 of FIG. 16) may transmit user input data in a general format (e.g., a generic format) or in the HID format. For example, in the case of a touch input event, the number of fingers and the (x, y) coordinate values should be transmitted as user input data, and in the case of a pen input event, information such as the (x, y) coordinate values, pressure, tilt, and orientation may need to be transmitted. User input data such as the number of fingers and the (x, y) coordinate values, and/or user input data such as the (x, y) coordinate values, pressure, tilt, and/or orientation, may be transmitted in the HID format.
• the user input data transmitted in the HID (Human Interface Devices) format may include an input report 900 and/or a report descriptor (e.g., the report descriptors 1110 and 1130 of FIG. 11 and/or the report descriptors 1210 and 1230 of FIG. 12).
  • the input report 900 may include contents of user input data, for example, actual data such as (x, y) coordinate values related to the user input.
  • the (x,y) coordinate values may be relative coordinate values.
• the input report 900 may include, for example, a relative coordinate value corresponding to each of the first transmission profile, the second transmission profile, and the third transmission profile, a hexadecimal (hex) conversion value for the relative coordinate value, and byte alignment information, but is not necessarily limited thereto.
  • the byte alignment information may indicate how many bytes the (x, y) coordinate value is represented in total.
• the report descriptor 1110 may correspond to data representing the configuration of the input report 900 (e.g., the configuration such as in what size and in what order the input data is transmitted) used to interpret the value of the input report 900.
  • the report descriptor 1110 may be transmitted prior to transmission of the input report 900, as shown in FIG. 10 below.
• the source device that received the data transmitted in the HID format (e.g., the electronic device 101 or the electronic device 102 of FIG. 1, the source device 300 of FIG. 3, the source device 510 of FIG. 5, the source device 1001 of FIG. 10, the source device 1501 of FIG. 15, and/or the user terminal 1601 of FIG. 16) may store the report descriptor 1110 and interpret the values of input reports 900 received thereafter according to the report descriptor 1110.
  • FIG. 10 is a diagram for explaining a method of exchanging an input report and a report descriptor of user input data between a source device and a sink device according to an embodiment.
• Referring to FIG. 10, a diagram 1000 is shown illustrating a situation in which a source device 1001 (e.g., the electronic device 101 or the electronic device 102 of FIG. 1, the source device 300 of FIG. 3, the source device 510 of FIG. 5, the source device 1501 of FIG. 15, and/or the user terminal 1601 of FIG. 16) and a sink device 1003 (e.g., the electronic device 101 or the electronic device 102 of FIG. 1, the sink device 400 of FIG. 4, the sink device 530 of FIG. 5, the sink device 1503 of FIG. 15, and/or the smart glasses 1603 of FIG. 16) exchange an input report (e.g., the input report 900 of FIG. 9) and report descriptors (e.g., the report descriptors 1110 and 1130 of FIG. 11 and/or the report descriptors 1210 and 1230 of FIG. 12).
  • the sink device 1003 may transmit user input data including an input report 900 and a report descriptor 1110 to the source device 1001 for each type of user input generated by various input devices.
  • the sink device 1003 may transmit a report descriptor corresponding to each of the mouse input, touch input event, keyboard input, and pen input event to the source device 1001.
  • the source device 1001 may store the report descriptor transmitted in operation 1010 .
• the sink device 1003 may transmit, to the source device 1001, an input report corresponding to the report descriptor transmitted in operation 1010, for example, an input report corresponding to each of a mouse input, a touch input event, a keyboard input, and a pen input event. The source device 1001 may interpret the input report corresponding to each of the mouse input, touch input event, keyboard input, and pen input event using each report descriptor stored in operation 1010.
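• The exchange of FIG. 10 amounts to the source device caching one report descriptor per input type (operation 1010) and decoding every later input report against the cached descriptor. The sketch below illustrates that bookkeeping with invented field names and byte-aligned fields; it is not the actual HID parsing used by the embodiment.

```python
# Hypothetical bookkeeping on the source device: descriptors are cached per input
# type (operation 1010) and reused to decode later input reports.

class SourceDescriptorCache:
    def __init__(self):
        self._descriptors = {}  # input type -> descriptor (e.g., field sizes)

    def store_descriptor(self, input_type: str, descriptor: dict) -> None:
        self._descriptors[input_type] = descriptor

    def decode_report(self, input_type: str, report: bytes) -> dict:
        desc = self._descriptors[input_type]   # must have been stored first
        size = desc["report_size_bits"] // 8   # bytes per coordinate (byte-aligned case)
        x = int.from_bytes(report[0:size], "little")
        y = int.from_bytes(report[size:2 * size], "little")
        return {"x": x, "y": y}

cache = SourceDescriptorCache()
cache.store_descriptor("touch", {"report_size_bits": 16})
print(cache.decode_report("touch", b"\x10\x00\x20\x00"))  # {'x': 16, 'y': 32}
```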
  • HID devices such as keyboards and mice may not send or receive new report descriptors unless a new setting is specially added after the first report descriptor is sent.
  • the sink device 1003 can adjust the data size of the user input data by variably generating a report descriptor of the user input data during the mirroring connection.
  • the sink device 1003 may change a transmission profile in response to the change in network quality.
  • the sink device 1003 may adjust the data size of user input data included in the report descriptor based on at least one of the changed transmission profile and the type of user input data.
• the sink device 1003 may change the representation size of data (e.g., (x, y) coordinates) according to the transmission profile corresponding to the current network situation. For example, when the transmission profile is the first transmission profile (e.g., the first transmission profile 610 of FIG. 6), the sink device 1003 may represent the x and y coordinate data with 16 bits each, so that the (x, y) coordinates are represented by a total of 4 bytes. When the transmission profile is the second transmission profile (e.g., the second transmission profile 630 of FIG. 6), the sink device 1003 may represent the x and y coordinate data with 12 bits each, so that the (x, y) coordinates are represented by a total of 3 bytes. When the transmission profile is the third transmission profile (e.g., the third transmission profile 650 of FIG. 6), the sink device 1003 may represent the x and y coordinate data with 8 bits each, so that the (x, y) coordinates are represented by a total of 2 bytes.
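• The 16-, 12-, and 8-bit representations above determine how many bytes an (x, y) pair occupies in the input report (4, 3, or 2 bytes in total). A minimal packing sketch is shown below; little-endian packing and already-downscaled coordinate values are assumptions, not requirements of the embodiment.

```python
def pack_xy(x: int, y: int, bits_per_axis: int) -> bytes:
    """Pack an (x, y) pair into ceil(2 * bits / 8) bytes: 16 -> 4, 12 -> 3, 8 -> 2 bytes."""
    value = (y << bits_per_axis) | x            # concatenate the two fixed-width fields
    total_bytes = (2 * bits_per_axis + 7) // 8
    return value.to_bytes(total_bytes, "little")

print(len(pack_xy(500, 300, 16)))  # 4 bytes (first transmission profile)
print(len(pack_xy(500, 300, 12)))  # 3 bytes (second transmission profile)
print(len(pack_xy(100, 80, 8)))    # 2 bytes (third transmission profile)
```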
• the sink device 1003 may transmit the changed report descriptor corresponding to the touch input event to the source device 1001. A method for the sink device 1003 to change the report descriptor for each transmission profile (e.g., the transmission profiles 610, 630, and 650 of FIG. 6) will be described in more detail with reference to FIGS. 11 and 12 below.
  • the sink device 1003 may transmit a changed input report according to the changed report descriptor format.
  • FIG. 11 is a diagram for explaining a report descriptor that is changed for each transmission profile in response to a touch input event according to an exemplary embodiment.
• Referring to FIG. 11, an example report descriptor 1110 in the first transmission profile (e.g., the first transmission profile 610 of FIG. 6) and an example report descriptor 1130 in the third transmission profile (e.g., the third transmission profile 650 of FIG. 6) are shown.
• The sink device (e.g., the electronic device 101 or the electronic device 102 of FIG. 1, the sink device 400 of FIG. 4, the sink device 530 of FIG. 5, the sink device 1003 of FIG. 10, the sink device 1503 of FIG. 15, and/or the smart glasses 1603 of FIG. 16) may express the actual (x, y) coordinates corresponding to a part of the report descriptor 1110 related to the (x, y) coordinates of the touch input event as relative values between 0 and 32767.
• the sink device 400 may express the data representing the (x, y) coordinates of the touch input event in the report descriptor 1110 using the full 16-bit size, such as "REPORT_SIZE(16)", so that the original data (e.g., the (x, y) coordinates) can be transmitted without loss.
• the sink device 400 may express the actual (x, y) coordinates corresponding to a part of the report descriptor 1130 related to the (x, y) coordinates of the touch input event as relative values between 0 and 127.
  • the sink device may express data representing the (x,y) coordinates of the touch input event in the report descriptor 1130 using an 8-bit size, such as "REPORT_SIZE(8)".
  • the sink device 400 may express 16-bit data (eg, (x, y) coordinates) as 8-bit data by downscaling. When the downscaled data is transmitted to the source device 300 and restored, the (x,y) coordinate values may differ from the original values.
• "PHYSICAL_MAXIMUM(1920)" described in the report descriptor 1130 may mean that the physical value corresponding to the relative value ranges from 0 to 1920.
• Since the source device 300 generates the screen image data, the source device 300 may store the maximum value of the actual (x, y) coordinates in advance. In this case, the value of PHYSICAL_MAXIMUM may not be included in the report descriptor 1130. "LOGICAL_MAXIMUM(127)" may mean that data having a physical maximum value (0 to 1920) is downscaled to a value between 0 and 127 and transmitted. According to the information included in the report descriptor 1130, the source device 300 may interpret the value 64 transmitted by the sink device 400 as 960, or may interpret the value 127 transmitted by the sink device 400 as 1920.
  • the sink device 400 may reduce the size of data by downscaling not only the above-described (x,y) coordinate values of the touch input event but also the pen pressure value of the pen in the same way.
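• The scaling implied by LOGICAL_MAXIMUM and PHYSICAL_MAXIMUM in the example above is a simple proportional mapping: the sink compresses the physical range (0 to 1920) into the logical range (0 to 127), and the source expands the received value back. The sketch below assumes simple rounding and reproduces the observation that a restored coordinate may differ slightly from the original.

```python
LOGICAL_MAXIMUM = 127     # range of the downscaled value in the report descriptor 1130
PHYSICAL_MAXIMUM = 1920   # range of the actual coordinate known to the source device

def downscale(physical: int) -> int:
    """Sink side: map 0..1920 to 0..127 before transmission."""
    return round(physical * LOGICAL_MAXIMUM / PHYSICAL_MAXIMUM)

def restore(logical: int) -> int:
    """Source side: map the received 0..127 value back to 0..1920."""
    return round(logical * PHYSICAL_MAXIMUM / LOGICAL_MAXIMUM)

print(downscale(960), restore(downscale(960)))  # 64 -> about 968, close to but not exactly 960
print(restore(127))                             # 1920
```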
  • a method of changing the size of data for a pen pressure value of a pen input will be described in more detail with reference to FIG. 12 below.
  • FIG. 12 is a diagram for explaining a report descriptor that is changed for each transmission profile in response to a pen input event according to an exemplary embodiment.
• Referring to FIG. 12, a diagram 1210 illustrating a part of a report descriptor in the first transmission profile (e.g., the first transmission profile 610 of FIG. 6) related to the pen pressure value of a pen input event according to an embodiment and a diagram 1230 illustrating a part of a report descriptor in the third transmission profile (e.g., the third transmission profile 650 of FIG. 6) are shown.
• The sink device (e.g., the electronic device 101 or the electronic device 102 of FIG. 1, the sink device 400 of FIG. 4, the sink device 530 of FIG. 5, the sink device 1003 of FIG. 10, the sink device 1503 of FIG. 15, and/or the smart glasses 1603 of FIG. 16) may express the value indicating the pen pressure of the pen input event in 16 bits, such as "REPORT_SIZE(16)" described in the report descriptor 1210 related to the pen pressure of the pen input event.
  • “LOGICAL_MAXIMUM (4096)” described in the report descriptor 1210 may mean that when sending pen pressure data, it will be expressed as a relative value between 0 and 4096.
  • the sink device 400 may downscale a value representing the pen pressure of the pen input event, which was expressed in 16 bits, and express it in 8 bits.
  • the sink device 400 may express a value representing the pen pressure of the pen input event with 8 bits, such as “REPORT_SIZE(8)” described in the report descriptor 1230 regarding the pen input event pressure.
  • "LOGICAL_MAXIMUM (127)" described in the report descriptor 1230 may mean that when sending pen pressure data, it is expressed as a relative value between 0 and 127.
  • FIG. 13 is a flowchart illustrating a method of operating a source device according to an embodiment.
  • each operation may be performed sequentially, but not necessarily sequentially. For example, the order of each operation may be changed, or at least two operations may be performed in parallel.
• A source device according to an embodiment (e.g., the electronic device 101 or the electronic device 102 of FIG. 1, the source device 300 of FIG. 3, the source device 510 of FIG. 5, the source device 1001 of FIG. 10, the source device 1501 of FIG. 15, and/or the user terminal 1601 of FIG. 16) may adjust the transmission bit rate of the screen through operations 1310 to 1330.
• the source device 300 may transmit, to the sink device (e.g., the electronic device 101 or the electronic device 102 of FIG. 1, the sink device 400 of FIG. 4, the sink device 530 of FIG. 5, the sink device 1003 of FIG. 10, the sink device 1503 of FIG. 15, and/or the smart glasses 1603 of FIG. 16), screen image data generated by the source device 300 to be displayed on the sink device 400.
• while a screen image (e.g., the screen image 537 of FIG. 5) based on the screen image data transmitted in operation 1310 is displayed on the sink device 400, the source device 300 may check whether a target application configured to change the transmission amount of user input data generated in the screen image 537 by an input device connected to the sink device 400 is running.
  • the source device 300 may adjust the transmission bit rate of the screen image data by changing a transmission profile for transmitting the screen image data.
  • each operation may be performed sequentially, but not necessarily sequentially.
  • the order of each operation may be changed, or at least two operations may be performed in parallel.
• A sink device according to an embodiment (e.g., the electronic device 101 or the electronic device 102 of FIG. 1, the sink device 400 of FIG. 4, the sink device 530 of FIG. 5, the sink device 1003 of FIG. 10, the sink device 1503 of FIG. 15, and/or the smart glasses 1603 of FIG. 16) may, through operations 1410 to 1450, dynamically change a parameter for adjusting the transmission amount of user input data and transmit the changed parameter to a source device (e.g., the electronic device 101 or the electronic device 102 of FIG. 1, the source device 300 of FIG. 3, the source device 510 of FIG. 5, the source device 1001 of FIG. 10, the source device 1501 of FIG. 15, and/or the user terminal 1601 of FIG. 16).
  • the sink device 400 may receive screen image data generated by the source device 300 to be displayed on the sink device 400.
  • the sink device 400 may display a screen image (eg, the screen image 537 of FIG. 5) based on the screen image data received in operation 1410.
• the sink device 400 may obtain user input data generated from the screen image 537 by an input device (e.g., the input device 531 of FIG. 5) connected to the sink device 400.
• the sink device 400 may dynamically change a parameter for adaptively adjusting the transmission amount of the user input data, including at least one of a data size and a transmission number, based on the network quality between the source device 300 and the sink device 400.
  • the sink device 400 may transmit the dynamically changed parameter in operation 1440 to the source device 300.
  • each operation may be performed sequentially, but not necessarily sequentially. For example, the order of each operation may be changed, or at least two operations may be performed in parallel.
• A source device 1501 according to an embodiment (e.g., the electronic device 101 or the electronic device 102 of FIG. 1, the source device 300 of FIG. 3, the source device 510 of FIG. 5, the source device 1001 of FIG. 10, and/or the user terminal 1601 of FIG. 16) and a sink device 1503 (e.g., the electronic device 101 or the electronic device 102 of FIG. 1, the sink device 400 of FIG. 4, the sink device 530 of FIG. 5, the sink device 1003 of FIG. 10, and/or the smart glasses 1603 of FIG. 16) may adaptively adjust the bit rate of the image and the bit rate of user input data through operations 1510 to 1580.
• In operation 1510, the source device 1501 may check or verify whether a target application, for which user input data is generated by the input device connected to the sink device 1503 (e.g., the input device 531 of FIG. 5), is running on the image displayed by the source device 1501.
  • the target application may include, for example, a writing application, a photo editing application, and a drawing application in which at least one of a pen input event and a touch input event occurs, but is not limited thereto.
  • the source device 1501 transmits a message including a transmission profile related to the user input data to the sink device 1503 to inform the sink device 1503 of the current network situation.
  • the message transmitted by the source device 1501 in operation 1520 may be, for example, a Real Time Streaming Protocol (RTSP) message.
• RTSP is a network control protocol for controlling a streaming media server and can operate in the application layer of the Internet Protocol.
  • the RTSP message may include information about a method of transmitting information of voice or video transmitted in real time, such as a transmission profile.
  • the message transmitted by the source device 1501 in operation 1520 may include, for example, a signal requesting the sink device 1503 to increase the transmission amount of user input data.
  • Operation 1520 may be performed in the background without the user recognizing it, or a message in which the transmission amount of the user input data may be set to a value desired by the user may be transmitted by giving the user a direct option.
  • the source device 1501 may lower the bit rate of screen image data according to the transmission profile included in the message transmitted in operation 1520.
  • the screen image data may correspond to screen image data generated by the source device 1501 to be displayed on the sink device 1503 .
  • the source device 1501 may lower the bit rate of the previously transmitted video in order to secure a bandwidth in which more user input data can be transmitted within the limited network throughput. For example, in the case of a drawing application, since the complexity of the image is not high, even if the bit rate of the image is lowered, the currently displayed image may not be significantly different from the previously displayed image.
  • the sink device 1503 may increase the bit rate of user input data (eg, UIBC data) according to the transmission profile included in the message received in operation 1520.
  • the sink device 1503 may transmit user input data without limit as in the above-described method or may improve the quality of user input data by increasing a transmission profile.
• operations 1530 and 1540 may be performed simultaneously or sequentially with a predetermined time difference.
  • the sink device 1503 may check or confirm whether execution of the target application, which was checked in operation 1510, has ended.
  • the source device 1501 may transmit a message including a transmission profile related to user input data to the sink device 1503.
  • the transmission profile included in the message transmitted by the source device 1501 in operation 1560 may include, for example, information requesting the sink device 1503 to lower the transmission amount of user input data.
  • the source device 1501 may increase the bit rate of the video again according to the transmission profile included in the message transmitted in operation 1560.
  • the sink device 1503 may lower the bit rate of user input data according to the transmission profile included in the message received in operation 1560.
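• Operations 1510 to 1580 can be read as the source and sink trading bandwidth: while the target application runs, the source lowers the video bit rate and the sink raises the user-input (UIBC) bit rate, and both revert when the application ends. The sketch below only illustrates that control flow; the message fields, method names, and bit-rate values are invented and do not reflect the actual RTSP parameters.

```python
# Hedged sketch of the FIG. 15 flow; message contents and bit rates are invented.

class SourceSketch:
    def __init__(self, sink):
        self.sink = sink
        self.video_bitrate_mbps = 20

    def on_target_app_changed(self, running: bool) -> None:
        profile = "high_input" if running else "normal_input"
        # operations 1520 / 1560: tell the sink which input transmission profile to use
        self.sink.on_profile_message({"uibc_profile": profile})
        # operations 1530 / 1570: rebalance the video bit rate accordingly
        self.video_bitrate_mbps = 10 if running else 20

class SinkSketch:
    def __init__(self):
        self.uibc_bitrate_kbps = 50

    def on_profile_message(self, message: dict) -> None:
        # operations 1540 / 1580: raise or lower the user input bit rate
        self.uibc_bitrate_kbps = 200 if message["uibc_profile"] == "high_input" else 50

sink = SinkSketch()
source = SourceSketch(sink)
source.on_target_app_changed(running=True)
print(source.video_bitrate_mbps, sink.uibc_bitrate_kbps)   # 10 200
source.on_target_app_changed(running=False)
print(source.video_bitrate_mbps, sink.uibc_bitrate_kbps)   # 20 50
```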
• FIG. 16 is a diagram for explaining a process of transmitting user input data when a source device is a user terminal and a sink device is smart glasses, according to an embodiment.
• Referring to FIG. 16, an operation is shown in which the user terminal 1601 corresponding to a source device (e.g., the electronic device 101 or the electronic device 102 of FIG. 1, the source device 300 of FIG. 3, the source device 510 of FIG. 5, the source device 1001 of FIG. 10, and/or the source device 1501 of FIG. 15) and the smart glasses 1603 corresponding to a sink device (e.g., the electronic device 101 or the electronic device 102 of FIG. 1, the sink device 400 of FIG. 4, the sink device 530 of FIG. 5, the sink device 1003 of FIG. 10, and/or the sink device 1503 of FIG. 15) transmit and receive user input data.
  • the smart glasses 1603 may be, for example, augmented reality (AR) glasses, but are not necessarily limited thereto.
• the method of transmitting user input data described above with reference to FIGS. 3 to 15 may be equally applied not only to Miracast but also to augmented reality (AR) devices such as the smart glasses 1603 or head mounted displays (HMDs).
• in an augmented reality device, user input data such as a gesture (e.g., a user's hand motion) or a user's gaze may be generated on a screen where virtual content and a real screen are mixed.
  • usability can be improved by applying the same method to various types of user input data generated from augmented reality devices without limiting the transmission method of the above-described user input data to the UIBC interface.
• the user input data may include various additional information not defined by the general UIBC protocol (e.g., eye, head, and hand tracking information, gesture recognition, object recognition and/or object tracking information, image information, and depth information). Therefore, an interface protocol for transmitting the various additional information may be newly defined, and metadata for sharing and/or transmitting not only raw data, including compressed data, but also the relationship between the additional information or the amount of change may also be defined.
  • an interface for sharing various additional information replacing the UIBC interface protocol may be defined as a 'user input interface' for transmission of user input data generated from augmented reality devices.
• A synchronization process including transmission of user input data between the user terminal 1601 corresponding to the source device and the smart glasses 1603 corresponding to the sink device may be performed as, for example, operations 1610 to 1680.
  • the user terminal 1601 may be connected to the smart glasses 1603 through tethering.
  • the user terminal 1601 may request the smart glasses 1603 to perform a capability check.
• In operation 1630, the smart glasses 1603 receiving the capability check request may transmit, to the user terminal 1601 through a user input interface, information such as the type and form of data to be shared with the user terminal 1601 and/or the transmission protocol of data to be transmitted to the user terminal 1601.
  • the smart glasses 1603 may share display information and/or variation of a real screen displayed on the smart glasses 1603 and/or a virtual screen including virtual content with the user terminal 1601 in real time.
• the display information may include, for example, at least one of the number, size, resolution, and bit rate of the real screen and/or virtual screens (e.g., screen images) including virtual content, but is not necessarily limited thereto.
  • the user terminal 1601 may request transmission of user input data from the smart glasses 1603 according to the display information shared in operation 1640 using the user input interface.
  • the user terminal 1601 may request transmission of user input data from the smart glasses 1603 according to one profile determined among one or more predetermined profiles. For example, when there are several types of user input data, profiles having transmission bitrates of different sizes may be defined according to each type of input data used. In this case, the user terminal 1601 may request the smart glasses 1603 to transfer the user input data according to a profile having a transmission bit rate of different sizes according to the type of input data used.
  • the smart glasses 1603 may transmit user input data generated in the smart glasses 1603 to the user terminal 1601 in response to the request of operation 1650.
  • the smart glasses 1603 may detect whether a change event of display information is generated by an application (eg, a real estate application or a game application) installed on the smart glasses 1603 or a program.
  • the smart glasses 1603 may change the transmission profile using the user input interface.
  • the smart glasses 1603 may be synchronized with the user terminal 1601 by transmitting information including user input data according to the transmission profile changed in operation 1680.
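• Operations 1610 to 1680 can be summarized as a small exchange between the terminal and the glasses: tethering, a capability check, display-information sharing, a transfer request under a chosen profile, and a profile change when the display information changes. The sketch below only illustrates that ordering; every class, method, and field in it is a hypothetical stand-in.

```python
# Hypothetical stand-ins for the two devices; every method and value below is
# invented solely to illustrate the ordering of operations 1610 to 1680.

class GlassesSketch:
    def __init__(self):
        self.display_info = {"screens": 1, "resolution": (1920, 1080), "bitrate_mbps": 20}
        self.profile = "default"

    def capability_check(self) -> dict:                   # operations 1620-1630
        return {"shared_data": ["gaze", "hand_tracking"], "protocol": "user_input_interface"}

    def share_display_info(self) -> dict:                 # operation 1640
        return dict(self.display_info)

    def send_user_input(self, profile: str) -> str:       # operation 1660
        return f"user input sent under profile '{profile}'"

    def change_profile(self, profile: str) -> None:       # operation 1680
        self.profile = profile

class TerminalSketch:
    def choose_profile(self, display_info: dict) -> str:  # operation 1650
        return "high_input" if display_info["bitrate_mbps"] >= 20 else "low_input"

glasses, terminal = GlassesSketch(), TerminalSketch()
print(glasses.capability_check()["protocol"])             # capability check response
profile = terminal.choose_profile(glasses.share_display_info())
print(glasses.send_user_input(profile))
# operation 1670: a display-information change event triggers a profile change (1680)
glasses.change_profile("low_input")
print(glasses.send_user_input(glasses.profile))
```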
• The synchronization process between the user terminal 1601 and the smart glasses 1603 described above with reference to FIG. 16 may be configured, for example, in a server/client, master device/slave device, and/or multi-access edge computing (MEC) cloud/pico-cloud form in which one or more devices distribute and process the processing.
  • the embodiments described above may be implemented as hardware components, software components, and/or a combination of hardware components and software components.
• the devices, methods, and components described in the embodiments may be implemented using, for example, a processor, a controller, an arithmetic logic unit (ALU), a digital signal processor, a microcomputer, a field programmable gate array (FPGA), programmable logic units (PLUs), a microprocessor, or any other device capable of executing and responding to instructions.
  • the processing device may execute an operating system (OS) and software applications running on the operating system.
  • a processing device may also access, store, manipulate, process, and generate data in response to execution of software.
• the processing device may include a plurality of processing elements and/or a plurality of types of processing elements.
  • a processing device may include a plurality of processors or a processor and a controller. Other processing configurations are also possible, such as parallel processors.
• Software may include a computer program, code, instructions, or a combination of one or more of the foregoing, and may configure a processing device to operate as desired or may command the processing device independently or collectively.
• Software and/or data may be permanently or temporarily embodied in any tangible machine, component, physical device, virtual equipment, computer storage medium or device, or in a transmitted signal wave, in order to be interpreted by a processing device or to provide instructions or data to the processing device.
  • Software may be distributed on networked computer systems and stored or executed in a distributed manner.
  • Software and data may be stored on computer readable media.
  • the method according to the embodiment may be implemented in the form of program instructions that can be executed through various computer means and recorded on a computer readable medium.
  • the computer readable medium may include program instructions, data files, data structures, etc. alone or in combination, and the program instructions recorded on the medium may be specially designed and configured for the embodiment or may be known and usable to those skilled in the art of computer software.
• Examples of computer-readable recording media include magnetic media such as hard disks, floppy disks, and magnetic tapes, optical media such as CD-ROMs and DVDs, magneto-optical media such as floptical disks, and hardware devices specially configured to store and execute program instructions, such as ROM, RAM, and flash memory.
  • Examples of program instructions include high-level language codes that can be executed by a computer using an interpreter, as well as machine language codes such as those produced by a compiler.
  • the hardware device described above may be configured to operate as one or a plurality of software modules to perform the operations of the embodiments, and vice versa.
• According to an embodiment, the source device (101, 102, 300, 510, 1001, 1501, and/or 1601) includes a wireless communication module (192, 310), a memory (130, 330), and a processor (120, 340). The processor (120, 340) may, through the wireless communication module (192, 310), transmit screen image data generated by the source device (101, 102, 300, 510, 1001, 1501, and/or 1601) to the sink device (101, 102, 400, 530, 1003, 1503, and/or 1603) to be displayed on the sink device (101, 102, 400, 530, 1003, 1503, and/or 1603), and, while the screen image 537 based on the screen image data is displayed on the sink device (101, 102, 400, 530, 1003, 1503, and/or 1603), determine whether a target application configured to change the transmission amount of user input data generated in the screen image 537 by the input device 531 connected to the sink device (101, 102, 400, 530, 1003, 1503, and/or 1603) is running.
• The screen image data may include at least one of an image frame generated by replicating the screen 515 displayed by the display module (160, 320) of the source device (101, 102, 300, 510, 1001, 1501, and/or 1601) or an image frame related to the screen 515 displayed by the display module (160, 320) of the source device (101, 102, 300, 510, 1001, 1501, and/or 1601).
• The processor (120, 340) may transmit, to the sink device (101, 102, 400, 530, 1003, 1503, and/or 1603), a message including a transmission profile related to the user input data based on the determination that the target application is running.
• The processor (120, 340) may request the sink device (101, 102, 400, 530, 1003, 1503, and/or 1603) to increase the transmission amount of the user input data.
  • the target application may include at least one of a writing application, a photo editing application, and a drawing application in which at least one of a pen input event and a touch input event occurs.
• When the sink device (101, 102, 400, 530, 1003, 1503, and/or 1603) is the smart glasses 1603, the processor (120, 340) may receive, through the wireless communication module (192, 310), display information including at least one of the number, size, resolution, and bit rate of screens displayed on the smart glasses 1603, and may request information transfer from the smart glasses 1603 through a user input interface for sharing additional information according to the display information.
• According to an embodiment, the sink device (101, 102, 400, 530, 1003, 1503, and/or 1603) includes a wireless communication module (192, 410), a display module (160, 420), a memory (130, 430), and a processor (120, 440). The processor (120, 440) may, through the wireless communication module (192, 410), receive screen image data generated by the source device (101, 102, 300, 510, 1001, 1501, and/or 1601) to be displayed on the sink device (101, 102, 400, 530, 1003, 1503, and/or 1603), display a screen image 537 based on the screen image data through the display module (160, 420), obtain, while the screen image 537 is displayed, user input data generated from the screen image 537 by an input device 531 connected to the sink device (101, 102, 400, 530, 1003, 1503, and/or 1603), dynamically change a parameter for adaptively adjusting the transmission amount of the user input data, including at least one of a data size and a transmission number, based on the network quality between the source device (101, 102, 300, 510, 1001, 1501, and/or 1601) and the sink device (101, 102, 400, 530, 1003, 1503, and/or 1603), and transmit the dynamically changed parameter to the source device (101, 102, 300, 510, 1001, 1501, and/or 1601).
• The screen image data may include at least one of an image frame replicating the screen 515 displayed by the display module (160, 320) of the source device (101, 102, 300, 510, 1001, 1501, and/or 1601) and/or an image frame related to the screen 515 displayed by the display module (160, 320) of the source device (101, 102, 300, 510, 1001, 1501, and/or 1601).
• The processor (120, 440) may measure the network quality between the source device (101, 102, 300, 510, 1001, 1501, and/or 1601) and the sink device (101, 102, 400, 530, 1003, 1503, and/or 1603), determine any one of a plurality of transmission profiles (610, 630, 650) used for transmission of the user input data according to the network quality, and adaptively adjust the parameter based on the determined one of the plurality of transmission profiles (610, 630, 650).
• The plurality of transmission profiles (610, 630, 650) may include at least two of a first transmission profile 610 corresponding to a first network quality, a second transmission profile 630 corresponding to a second network quality lower than the first network quality, and a third transmission profile 650 corresponding to a third network quality lower than the second network quality.
• Each of the transmission profiles (610, 630, 650) may include a transmission amount for each transmission profile (610, 630, 650), and the transmission amount may include at least one of the maximum size of user input data that can be transmitted at one time and the maximum number of user input data that can be transmitted at one time for each of the transmission profiles (610, 630, 650).
• The first transmission profile 610 may have a first transmission amount in which the user input data has the maximum size and the maximum number is unlimited, the second transmission profile 630 may have a second transmission amount smaller than the first transmission amount by limiting the transmission number of the user input data, and the third transmission profile 650 may have a third transmission amount in which the size and number of the user input data are smaller than those of the second transmission amount.
• The processor (120, 440) may calculate a first difference between first coordinates of first user input data generated at the current time point in the screen image 537 and second coordinates of second user input data generated at a previous time point preceding the current time point, and, when the first difference is smaller than a set first reference value, adjust the transmission number of the user input data by dropping the first user input data according to the selected transmission profile.
  • The processor (120, 440) may calculate a second difference between first input information including at least one of a first pressure, a first tilt, and a first orientation of the first user input data generated at the current time point and second input information including at least one of a second pressure, a second tilt, and a second orientation of the second user input data generated at the previous time point, and, when the second difference is smaller than a set second reference value, adjust the transmission number of the user input data by dropping the first user input data according to the selected transmission profile.
  • The processor (120, 440) may adjust the transmission number of the user input data according to the transmission profile for each type of the user input data.
  • The user input data may include at least one of an input report 900 indicating the contents of the user input data and a report descriptor (1110, 1130, 1210, 1230) that is transmitted prior to transmission of the input report 900 and indicates the configuration of the input report 900 used to interpret its values. When the transmission profile is changed in response to a change in the network quality, the processor (120, 440) may adjust the data size of the user input data included in the report descriptor (1110, 1130, 1210, 1230) based on at least one of the changed transmission profile and the type of the user input data (see the descriptor-resizing sketch after this list).
  • The processor (120, 440) may determine the network quality between the source device (101, 102, 300, 510, 1001, 1501, and/or 1601) and the sink device (101, 102, 400, 530, 1003, 1503, and/or 1603) based on at least one of a round trip time (RTT) and a transmission control protocol (TCP) window size between the source device and the sink device.
  • When the sink device (101, 102, 400, 530, 1003, 1503, and/or 1603) is the smart glasses 1603, the processor (120, 440) may share display information including at least one of the number, size, resolution, and bit rate of the screen images displayed on the smart glasses 1603 with the source device (101, 102, 300, 510, 1001, 1501, and/or 1601), and may request information transfer from the source device through a user input interface for sharing additional information.
  • The operating method of the source device (101, 102, 300, 510, 1001, 1501, and/or 1601) may include an operation of transmitting, to the sink device (101, 102, 400, 530, 1003, 1503, and/or 1603), screen image data generated by the source device so that it is displayed on the sink device, an operation of determining, while the screen image 537 based on the screen image data is displayed on the sink device, whether a target application configured to change the transmission amount of user input data generated on the screen image 537 by an input device 531 connected to the sink device is being executed, and an operation 1330 of adjusting, based on the determination that the target application is running, a transmission bit rate of the screen image data by changing a transmission profile for transmitting the screen image data.
  • The operating method of the sink device (101, 102, 400, 530, 1003, 1503, and/or 1603) may include an operation 1410 of receiving screen image data generated by the source device (101, 102, 300, 510, 1001, 1501, and/or 1601) so that it is displayed on the sink device, an operation of displaying the screen image 537 based on the screen image data, an operation of obtaining user input data generated on the screen image 537 by an input device 531 connected to the sink device while the screen image 537 is displayed, an operation of adjusting a parameter for transmission of the user input data, and an operation 1450 of transmitting the user input data to the source device (101, 102, 300, 510, 1001, 1501, and/or 1601).
  • The operating method of the sink device (101, 102, 400, 530, 1003, 1503, and/or 1603) may further include an operation of measuring network quality between the source device (101, 102, 300, 510, 1001, 1501, and/or 1601) and the sink device, an operation of selecting one of a plurality of transmission profiles to be used for transmission of the user input data based on the network quality, and an operation of adjusting the parameter based on the selected transmission profile.
  • The plurality of transmission profiles may include a first transmission profile corresponding to a first network quality and a second transmission profile corresponding to a second network quality lower than the first network quality, and each of the plurality of transmission profiles may have a transmission amount including at least one of a maximum size and a maximum number of user input data to be transmitted at one time.
  • The maximum number of pieces of user input data to be transmitted at one time in the second transmission profile may be about 50 percent of the maximum number of pieces of user input data to be transmitted at one time in the first transmission profile.
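
The bullet points above describe selecting a transmission profile from the measured network quality and capping the per-transmission amount of user input data. The following is a minimal sketch of that idea in Kotlin; the type names, the quality score, the thresholds, and the concrete size/count limits are illustrative assumptions and are not taken from the disclosure.

```kotlin
// Illustrative sketch only: names, thresholds, and limits are assumptions.
data class TransmissionProfile(
    val name: String,
    val maxInputSizeBytes: Int?,  // null = unlimited size per transmission
    val maxInputCount: Int?       // null = unlimited number of inputs per transmission
)

// Three profiles ordered from best to worst network quality; the first profile
// leaves size and count unlimited, the lower profiles cap them (values arbitrary).
val firstProfile = TransmissionProfile("first", maxInputSizeBytes = null, maxInputCount = null)
val secondProfile = TransmissionProfile("second", maxInputSizeBytes = 512, maxInputCount = 10)
val thirdProfile = TransmissionProfile("third", maxInputSizeBytes = 256, maxInputCount = 5)

// Network quality estimated from the round trip time and TCP window size between
// the source device and the sink device; the scoring function is a placeholder.
fun selectProfile(rttMillis: Long, tcpWindowBytes: Int): TransmissionProfile {
    val quality = tcpWindowBytes.toDouble() / rttMillis.coerceAtLeast(1L)
    return when {
        quality > 2_000.0 -> firstProfile   // first (best) network quality
        quality > 500.0 -> secondProfile    // second network quality
        else -> thirdProfile                // third (lowest) network quality
    }
}
```

Dividing the window size by the RTT is simply one plausible way to collapse the two measurements named above into a single throughput-like score; any monotone combination would serve the same purpose in this sketch.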
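The coordinate-difference and pressure/tilt/orientation checks described above amount to input coalescing: an event that barely differs from the previously transmitted one is dropped. A minimal sketch, assuming a flat UserInput structure and arbitrary reference values (neither is specified here):

```kotlin
import kotlin.math.abs
import kotlin.math.hypot

// Illustrative sketch only: UserInput and both reference values are assumptions.
data class UserInput(
    val x: Float, val y: Float,
    val pressure: Float, val tilt: Float, val orientation: Float
)

const val FIRST_REFERENCE = 2.0f    // minimum coordinate change worth transmitting
const val SECOND_REFERENCE = 0.05f  // minimum pressure/tilt/orientation change

// Returns true when the current input may be dropped instead of transmitted.
// The disclosure treats the two comparisons as separate criteria; this sketch
// combines them conservatively and only drops when both changes are negligible.
fun shouldDrop(current: UserInput, previous: UserInput): Boolean {
    val firstDifference = hypot(
        (current.x - previous.x).toDouble(),
        (current.y - previous.y).toDouble()
    )
    val secondDifference = maxOf(
        abs(current.pressure - previous.pressure),
        abs(current.tilt - previous.tilt),
        abs(current.orientation - previous.orientation)
    )
    return firstDifference < FIRST_REFERENCE && secondDifference < SECOND_REFERENCE
}
```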
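The report-descriptor adjustment mentioned above can be pictured as renegotiating how many bytes each field of a subsequent input report occupies. The sketch below uses a deliberately simplified descriptor; it is not the report descriptor format referenced by reference numerals 1110, 1130, 1210, and 1230, and the field widths and the "stylus" type check are assumptions.

```kotlin
// Illustrative sketch only: a simplified stand-in for a report descriptor.
enum class ProfileLevel { FIRST, SECOND, THIRD }

data class SimplifiedReportDescriptor(
    val coordinateBytes: Int,  // bytes per coordinate value in each input report
    val pressureBytes: Int     // bytes for pressure/tilt/orientation values
)

// Shrink the declared field sizes when the link degrades; keep pressure data only
// for input types (e.g. a stylus) that actually produce it.
fun resizeDescriptor(profile: ProfileLevel, inputType: String): SimplifiedReportDescriptor =
    when {
        profile == ProfileLevel.FIRST -> SimplifiedReportDescriptor(coordinateBytes = 4, pressureBytes = 2)
        inputType == "stylus" -> SimplifiedReportDescriptor(coordinateBytes = 2, pressureBytes = 1)
        else -> SimplifiedReportDescriptor(coordinateBytes = 2, pressureBytes = 0)
    }
```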

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Databases & Information Systems (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Control Of Indicators Other Than Cathode Ray Tubes (AREA)
  • Transceivers (AREA)
  • Facsimiles In General (AREA)

Abstract

A source device according to an embodiment includes a wireless communication module, a memory, and a processor. Through the wireless communication module, the processor transmits, to a sink device, screen image data generated by the source device so that it is displayed on the sink device. The processor also determines, while the screen image based on the screen image data is displayed on the sink device, whether a target application configured to change the transmission amount of user input data generated on the screen image by an input device connected to the sink device is being executed. Based on a determination that the target application is running, the processor may further adjust the transmission bit rate of the screen image data by changing a transmission profile for transmitting the screen image data.
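
In code, the behaviour summarized in this abstract reduces to a small decision on the source side. A minimal sketch in Kotlin, where ScreenStreamProfile, the bit-rate values, and the isTargetApplicationRunning flag are assumptions used for illustration rather than anything defined in the disclosure:

```kotlin
// Illustrative sketch only: profile names and bit rates are assumptions.
data class ScreenStreamProfile(val name: String, val videoBitRateKbps: Int)

val defaultStreamProfile = ScreenStreamProfile("default", videoBitRateKbps = 20_000)
val inputPriorityProfile = ScreenStreamProfile("input-priority", videoBitRateKbps = 8_000)

// While the screen image is shown on the sink device, the source device checks
// whether an application that changes the user-input transmission amount (for
// example a drawing or note-taking app) is running; if so, it switches to a
// profile with a lower screen-image bit rate so more of the link serves user input.
fun chooseStreamProfile(isTargetApplicationRunning: Boolean): ScreenStreamProfile =
    if (isTargetApplicationRunning) inputPriorityProfile else defaultStreamProfile
```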
PCT/KR2022/016772 2021-12-30 2022-10-30 Dispositif source, dispositif récepteur et leurs procédés d'exploitation WO2023128206A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US18/120,913 US20230214168A1 (en) 2021-12-30 2023-03-13 Source device, sink device, and operating methods thereof

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
KR20210192800 2021-12-30
KR10-2021-0192800 2021-12-30
KR1020220019455A KR20230103800A (ko) 2021-12-30 2022-02-15 소스 장치, 싱크 장치 및 그 동작 방법들
KR10-2022-0019455 2022-02-15

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US18/120,913 Continuation US20230214168A1 (en) 2021-12-30 2023-03-13 Source device, sink device, and operating methods thereof

Publications (1)

Publication Number Publication Date
WO2023128206A1 true WO2023128206A1 (fr) 2023-07-06

Family

ID=86999320

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2022/016772 WO2023128206A1 (fr) 2021-12-30 2022-10-30 Dispositif source, dispositif récepteur et leurs procédés d'exploitation

Country Status (1)

Country Link
WO (1) WO2023128206A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101640854B1 (ko) * 2011-01-21 2016-07-19 Qualcomm Incorporated User input back channel for wireless displays
KR101780300B1 (ko) * 2013-01-25 2017-10-10 Qualcomm Incorporated Connectionless transport for user input control for wireless display devices
KR101914478B1 (ko) * 2012-01-10 2018-11-02 SK Planet Co., Ltd. Image providing system, apparatus therefor, and image providing method
KR102281341B1 (ko) * 2015-01-26 2021-07-23 LG Electronics Inc. Sink device and control method therefor
KR20210121777A (ko) * 2020-03-31 2021-10-08 LG Electronics Inc. Display device and operating method thereof

Similar Documents

Publication Publication Date Title
WO2014104744A1 (fr) Dispositif terminal et son procédé de commande
WO2018076866A1 (fr) Procédé de traitement de données, dispositif, support de stockage, dispositif électronique, et serveur
WO2016039576A2 (fr) Dispositif et procédé d'accès à une pluralité de réseaux dans un système de communications sans fil
WO2022031029A1 (fr) Dispositif électronique et procédé, mis en œuvre par un dispositif électronique, de fourniture d'un écran d'application sur une unité d'affichage de dispositif externe
WO2020009461A1 (fr) Appareil et procédé de réglage de paramètre de réseau
WO2021230589A1 (fr) Dispositif électronique et procédé pour dispositif électronique traitant un paquet de donnés reçu
WO2022005000A1 (fr) Dispositif électronique et procédé de fonctionnement associé
WO2022025606A1 (fr) Dispositif électronique et procédé de commande de volume d'audio associé
WO2022030836A1 (fr) Procédé et appareil de commande de réseau pour communication de données dans un dispositif électronique
WO2023128206A1 (fr) Dispositif source, dispositif récepteur et leurs procédés d'exploitation
WO2018047989A1 (fr) Dispositif d'affichage d'image et système associé
WO2022030908A1 (fr) Dispositif électronique et procédé de synchronisation de données vidéo et de données audio à l'aide de celui-ci
WO2022119194A1 (fr) Dispositif électronique et procédé de sortie audio multicanaux l'utilisant
WO2022065788A1 (fr) Dispositif électronique de réception ou de transmission de données de rcs, et procédé de fonctionnement de dispositif électronique
WO2022065707A1 (fr) Dispositif électronique pour une communication directe avec un dispositif électronique externe, et son procédé de fonctionnement
WO2022025463A1 (fr) Dispositif électronique pour synchroniser la synchronisation de sortie d'une sortie de contenu par des dispositifs externes et procédé pour faire fonctionner un dispositif électronique
WO2023068521A1 (fr) Dispositif source pour partager un écran étendu, dispositif collecteur et leur procédé de fonctionnement
WO2023080447A1 (fr) Dispositif électronique de planification de transmission et de réception de données par l'intermédiaire d'une pluralité de liaisons, et procédé de fonctionnement du dispositif électronique
WO2023027288A1 (fr) Dispositif électronique de transmission d'une pluralité de flux d'images tout en mettant en œuvre un appel vidéo, et procédé de fonctionnement d'un dispositif électronique
WO2022169106A1 (fr) Dispositif électronique pour transmettre et recevoir un flux multimédia et son procédé de fonctionnement
WO2023043057A1 (fr) Dispositif électronique et procédé de partage d'écrans et de signaux audio correspondant aux écrans
WO2022265209A1 (fr) Dispositif électronique effectuant un appel vidéo à l'aide d'une frc et procédé de fonctionnement pour dispositif électronique
WO2023085566A1 (fr) Dispositif électronique pour déterminer un intervalle de gop sur la base d'informations de performance concernant un dispositif électronique externe et procédé d'exploitation pour dispositif électronique
WO2022025384A1 (fr) Dispositif électronique de lecture de vidéo et procédé de lecture de vidéo
WO2022075743A1 (fr) Dispositif électronique de transmission d'un paquet par le biais d'une connexion de communication sans fil, et procédé de fonctionnement associé

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22916360

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2022916360

Country of ref document: EP

ENP Entry into the national phase

Ref document number: 2022916360

Country of ref document: EP

Effective date: 20240613