WO2020014880A1 - Multi-screen interaction method and device - Google Patents

Multi-screen interaction method and device

Info

Publication number
WO2020014880A1
Authority
WO
WIPO (PCT)
Prior art keywords
layer channel
screen
source device
application
interface
Prior art date
Application number
PCT/CN2018/096038
Other languages
English (en)
French (fr)
Inventor
魏治宇
徐辉
Original Assignee
华为技术有限公司 (Huawei Technologies Co., Ltd.)
Priority date
Filing date
Publication date
Application filed by 华为技术有限公司 (Huawei Technologies Co., Ltd.)
Priority to PCT/CN2018/096038 (WO2020014880A1)
Priority to CN201880072055.XA (CN111316598B)
Publication of WO2020014880A1

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04L: TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 12/00: Data switching networks
    • H04L 12/02: Details
    • H04L 12/16: Arrangements for providing special services to substations
    • H04L 12/18: Arrangements for providing special services to substations for broadcast or conference, e.g. multicast

Definitions

  • the present application relates to the technical field of terminals, and in particular, to a multi-screen interaction method and device.
  • In the screen sharing mode, different devices implement multi-screen interaction by sharing the screen of the operating system or a screen within an application. However, when such a screen is shared for a long time, garbled frames and freezes may occur, or the screens may fall out of sync due to a large delay, which affects the user experience.
  • In view of this, the present application provides a multi-screen interaction method and device, which are used to solve the prior-art problems of garbled frames, stuttering, and out-of-sync screens during multi-screen interaction between different devices.
  • In a first aspect, an embodiment of the present application provides a multi-screen interaction method.
  • The method includes: the source device detects a user's screen-casting operation and, in response to the screen-casting operation, transmits the currently displayed content to the target device through a first transport layer channel; the source device then determines a first network transmission parameter according to the currently displayed content or the screen-casting type. When the source device determines that the first network transmission parameter of the first transport layer channel does not satisfy a set condition, the source device transmits the currently displayed content to the target device through a second transport layer channel, where the second transport layer channel is determined by the source device according to the first network transmission parameter.
  • That is, the source device of the present application can switch from the first transport layer channel to a second transport layer channel with a lower delay or a lower packet loss rate, according to the currently displayed content or the screen-casting type.
  • In one possible design, the source device determines that the first network transmission parameter is the packet loss rate according to the interface of an audio/video service type application, where the packet loss rate of the second transport layer channel is lower than the packet loss rate of the first transport layer channel. That is, when the cast content is the interface of an audio/video application, the source device can switch to a transport layer channel with a lower packet loss rate.
  • In one possible design, the source device determines that the first network transmission parameter is the delay according to the interface of a game service type application, where the delay of the second transport layer channel is lower than the delay of the first transport layer channel. That is, when the cast content is the interface of a game application, the source device can switch from the first transport layer channel to a second transport layer channel with a lower delay, thereby mitigating out-of-sync pictures.
  • In one possible design, the source device determines that the first network transmission parameter is the delay according to the same-source mode, where the delay of the second transport layer channel is lower than the delay of the first transport layer channel. That is, when the source device casts the screen in the same-source mode, the source device can switch from the first transport layer channel to a second transport layer channel with a lower delay, thereby mitigating out-of-sync pictures.
  • In one possible design, the source device determines that the first network transmission parameter is the packet loss rate according to the heterogeneous mode, where the packet loss rate of the second transport layer channel is lower than the packet loss rate of the first transport layer channel. That is, when the source device casts the screen in the heterogeneous mode, the source device can switch from the first transport layer channel to a second transport layer channel with a lower packet loss rate, thereby mitigating picture freezes.
  • In one possible design, the first transport layer channel is the channel corresponding to VTP; the channel whose delay is lower than that of VTP is the channel corresponding to UDP, and the channel whose packet loss rate is lower than that of VTP is the channel corresponding to TCP.
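  • As an illustration of the selection rule in the possible designs above, the following Java sketch maps the content type to the decisive network transmission parameter and to the channel to switch to. The enum values, thresholds, and method names are illustrative assumptions; the patent only fixes the relationships that UDP's channel has a lower delay than VTP's and TCP's channel has a lower packet loss rate than VTP's.

```java
// Illustrative sketch of the channel-selection rule described above.
// All names and threshold values are assumptions, not the patent's code.

enum TransportChannel { VTP, UDP, TCP }

enum ServiceType { AUDIO_VIDEO, GAME }

final class ChannelSelector {
    // Example values for the "set condition"; the patent gives no numbers.
    static final double MAX_PACKET_LOSS = 0.05; // 5% loss tolerated
    static final long MAX_DELAY_MS = 100;       // 100 ms delay tolerated

    /** Returns the channel to use next, given the measured parameter. */
    static TransportChannel select(ServiceType type, double packetLossRate,
                                   long delayMs, TransportChannel current) {
        switch (type) {
            case AUDIO_VIDEO:
                // Audio/video interface: the decisive parameter is packet loss.
                return packetLossRate > MAX_PACKET_LOSS
                        ? TransportChannel.TCP : current;
            case GAME:
                // Game interface (and same-source casting): the decisive
                // parameter is delay.
                return delayMs > MAX_DELAY_MS
                        ? TransportChannel.UDP : current;
            default:
                return current;
        }
    }
}
```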
  • In a second aspect, an embodiment of the present application provides a multi-screen interaction method.
  • The method includes: the source device detects a user's split-screen operation and, in response to the split-screen operation, displays the interface of a currently running first application and the interface of a second application in split screens; the source device then detects the user's first screen-casting operation and, in response to the first screen-casting operation, transmits the interface of the first application to a first target device through a first transport layer channel, and transmits the interface of the second application to a second target device through a second transport layer channel.
  • In one possible design, the source device determines a first network transmission parameter according to the interface of the first application or the screen-casting type; when the source device determines that the first network transmission parameter of the first transport layer channel does not meet a set condition, the source device transmits the interface of the first application to the first target device through a third transport layer channel, where the third transport layer channel is determined according to the first network transmission parameter.
  • In one possible design, the source device determines a second network transmission parameter according to the interface of the second application or the screen-casting type; when the source device determines that the second network transmission parameter of the second transport layer channel is greater than or equal to a second threshold, the source device transmits the interface of the second application to the second target device through a fourth transport layer channel, where the fourth transport layer channel is determined by the source device according to the second network transmission parameter.
  • In other words, in a scenario where both the split-screen function and the screen-casting function are enabled, the source device can cast multiple application interfaces through the above multi-screen interaction method, which can mitigate the out-of-sync and stuttering problems during screen casting.
  • In one possible design, the source device detects a second screen-casting operation of the user, and the second screen-casting operation indicates that the cast content is only the interface of the first application; in response to the second screen-casting operation, the source device stops transmitting the interface of the second application to the second target device through the fourth transport layer channel. That is, when the user selects only one of the interfaces of the split-screen display for casting, the source device casts only that application's interface.
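  • A minimal self-contained sketch of the second aspect's flow, under the same assumptions: each split-screen stream keeps its own transport layer channel, the channels are re-selected independently, and the second screen-casting operation tears down the second stream. All identifiers and thresholds are hypothetical.

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch of the split-screen casting flow described above.
public class SplitScreenCastSketch {
    enum Channel { VTP, UDP, TCP }

    // streamId -> channel currently used for that application's interface
    private final Map<String, Channel> channels = new HashMap<>();

    public SplitScreenCastSketch() {
        channels.put("app1->target1", Channel.VTP); // first transport layer channel
        channels.put("app2->target2", Channel.VTP); // second transport layer channel
    }

    /** Re-select the channel for one stream from its measured parameter. */
    void reselect(String streamId, boolean delaySensitive,
                  double lossRate, long delayMs) {
        if (delaySensitive && delayMs > 100) {
            channels.put(streamId, Channel.UDP);  // lower-delay channel
        } else if (!delaySensitive && lossRate > 0.05) {
            channels.put(streamId, Channel.TCP);  // lower-loss channel
        }
    }

    /** Second screen-casting operation: cast only the first application. */
    void onSecondCastOperation() {
        channels.remove("app2->target2"); // stop sending the second interface
    }
}
```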
  • In a third aspect, an embodiment of the present application provides an electronic device, including a processor and a memory.
  • The memory is used to store one or more computer programs; when the one or more computer programs stored in the memory are executed by the processor, the terminal is enabled to implement any one of the possible design methods of the first aspect or any one of the possible design methods of the second aspect.
  • an embodiment of the present application further provides an electronic device, and the electronic device includes a module / unit that executes the first aspect or any one of the possible design methods of the first aspect.
  • These modules / units can be implemented by hardware, and can also be implemented by hardware executing corresponding software.
  • an embodiment of the present application further provides an electronic device, and the electronic device includes a module / unit that executes the second aspect or any one of the possible design methods of the second aspect.
  • These modules / units can be implemented by hardware, and can also be implemented by hardware executing corresponding software.
  • An embodiment of the present application further provides a computer-readable storage medium. The computer-readable storage medium includes a computer program, and when the computer program runs on an electronic device, the electronic device is caused to execute any one of the above possible design methods.
  • An embodiment of the present application further provides a computer program product, and when the computer program product runs on a terminal, the terminal is caused to execute any one of the above possible design methods.
  • FIG. 1 is a schematic diagram of a scenario architecture applicable to multi-screen interaction according to an embodiment of the present application
  • FIG. 2 is a schematic structural diagram of a terminal according to an embodiment of the present application.
  • FIG. 3 is a schematic structural diagram of an Android operating system according to an embodiment of the present application.
  • FIG. 4 is a block diagram of a technology and a protocol involved in a Wi-Fi Display provided in the prior art
  • FIG. 5 is a schematic flowchart of a multi-screen interaction method according to an embodiment of the present application.
  • FIG. 6 is a block diagram of a technology and a protocol related to a Wi-Fi Display provided by an embodiment of the present application;
  • FIG. 7a is a schematic flowchart of a multi-screen interaction method according to an embodiment of the present application.
  • FIGS. 7b to 7c are schematic diagrams of two types of multi-screen interaction scenarios provided by embodiments of the present application.
  • FIG. 7d is a second schematic flowchart of a multi-screen interaction method according to an embodiment of the present application.
  • FIG. 8 is a first schematic diagram of a screen interaction method after a split screen function is enabled according to an embodiment of the present application
  • FIG. 9 is a second schematic diagram of a screen interaction method after a split screen function is enabled according to an embodiment of the present application.
  • FIG. 10 is a schematic structural diagram of a terminal according to an embodiment of the present application.
  • FIG. 11 is a schematic structural diagram of a mobile phone according to an embodiment of the present application.
  • Multi-screen interaction technology means that terminal devices with different operating systems (such as smartphones, smart tablets, computers, and televisions) are compatible and interoperate with one another, transmitting digital multimedia content (such as high-definition video, audio, and pictures) over wireless network connections. Multi-screen interaction technology allows the displayed content to be shared on different terminal platform devices at the same time, which can enrich the multimedia life of users.
  • Multi-screen interaction generally includes the following three modes: a) content sharing mode: different devices share media content or media links to achieve multi-screen interaction; b) interface sharing mode: different devices share the system interface or an interface within an application to achieve multi-screen interaction; c) remote control mode: one device controls another device to realize interaction between multiple screens.
  • Current multi-screen interaction implementation solutions include Digital Living Network Alliance (DLNA), wireless high-definition technology (Intel Wireless Display, Intel WiDi), screen mirroring, the IGRS protocol, Miracast (also known as Wi-Fi Display), and other technologies.
  • the multi-screen interaction method provided in the embodiment of the present application is applicable to the architecture shown in FIG. 1, and the architecture includes a source device 10 and a target device 20.
  • the source device 10 and the target device 20 communicate with each other through a communication network.
  • the communication network may be a local area network or a wide area network transferred through a relay device.
  • For example, the communication network may be a Wi-Fi hotspot network, a Wi-Fi direct network, a Bluetooth network, a ZigBee network, a near field communication (NFC) network, or a similar communication network.
  • Alternatively, the communication network may be a 3rd-generation mobile communication technology (3G) network, a 4th-generation mobile communication technology (4G) network, a 5th-generation mobile communication technology (5G) network, a public land mobile network (PLMN) evolved in the future, or the Internet.
  • the source device 10 refers to a device that actively shares a system interface or an application interface.
  • the source device 10 may be a mobile phone, a tablet computer, a notebook computer, or the like.
  • the target device 20 refers to a device that passively accepts an interface shared by the source device 10.
  • the target device 20 may be a mobile phone, a tablet computer, a notebook computer, or the like.
  • Based on the aforementioned Miracast multi-screen interaction technology, the source device 10 encodes the shared interface into H.264 format data and then sends the H.264 format data to the target device 20. The target device 20 can then decode and display the H.264 format data.
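  • For context, the following is a minimal sketch of that H.264 path on Android using the public MediaCodec API. The bit rate, frame rate, and the sendToTarget placeholder are assumptions; in Miracast the encoded buffers would be packetized over RTP on the Wi-Fi P2P link, which the description does not detail at this level.

```java
import android.media.MediaCodec;
import android.media.MediaCodecInfo;
import android.media.MediaFormat;
import android.view.Surface;
import java.nio.ByteBuffer;

// Sketch of encoding the shared interface into H.264 and handing each
// encoded buffer to a (hypothetical) sender for the target device 20.
public class MirrorEncoder {
    public MediaCodec startEncoder(int width, int height) throws Exception {
        MediaFormat format = MediaFormat.createVideoFormat(
                MediaFormat.MIMETYPE_VIDEO_AVC, width, height);
        format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
                MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
        format.setInteger(MediaFormat.KEY_BIT_RATE, 4_000_000); // assumed
        format.setInteger(MediaFormat.KEY_FRAME_RATE, 30);      // assumed
        format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);

        MediaCodec codec = MediaCodec.createEncoderByType(
                MediaFormat.MIMETYPE_VIDEO_AVC);
        codec.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
        Surface input = codec.createInputSurface(); // screen is mirrored into this surface
        codec.start();
        return codec;
    }

    /** Drain one encoded buffer, if available, and pass it to the sender. */
    public void drainOnce(MediaCodec codec) {
        MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
        int index = codec.dequeueOutputBuffer(info, 10_000);
        if (index >= 0) {
            ByteBuffer encoded = codec.getOutputBuffer(index);
            sendToTarget(encoded, info);           // hypothetical transmission
            codec.releaseOutputBuffer(index, false);
        }
    }

    private void sendToTarget(ByteBuffer data, MediaCodec.BufferInfo info) {
        // Placeholder: in Miracast this would be RTP over the Wi-Fi P2P link.
    }
}
```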
  • the multi-screen interaction method provided in the embodiments of the present application can be applied to mobile phones, tablet computers, wearable devices, in-vehicle devices, augmented reality (AR) devices, virtual reality (VR) devices, Laptops, drones, ultra-mobile personal computers (UMPCs), netbooks, personal digital assistants (PDAs), smart TVs, and other display-supporting terminals.
  • The specific form of the terminal is not specially restricted in the embodiments of the present application.
  • the source device may be a mobile phone
  • the target device may be a tablet computer.
  • the target device may be the above-mentioned terminal having both a receiving function and a display function, or may be a device containing only a receiving function, such as a set-top box.
  • the target device is a device that only includes a receiving function
  • the target device can then be connected to a device with a display function to achieve the purpose of multi-screen interaction.
  • the target device is a set-top box, which is connected to a television display.
  • the source device 10 or the target device 20 in the embodiment of the present application may be the electronic device 100.
  • The embodiment is described below by taking the electronic device 100 as an example. It should be understood that the illustrated electronic device 100 is only one example of the source device 10 or the target device 20; the electronic device 100 may have more or fewer components than shown in the figure, may combine two or more components, or may have a different component configuration.
  • the various components shown in the figures can be implemented in hardware, software, or a combination of hardware and software, including one or more signal processing or application specific integrated circuits.
  • The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a USB interface 130, a charge management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, a headphone interface 170D, a sensor module 180, a button 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a SIM card interface 195, and the like.
  • the sensor module 180 may include a pressure sensor 180A, a gyro sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, and ambient light. Sensor 180L, bone conduction sensor 180M, etc.
  • the structure illustrated in the embodiment of the present invention does not constitute a specific limitation on the electronic device 100.
  • the electronic device 100 may include more or fewer parts than shown, or some parts may be combined, or some parts may be split, or different parts may be arranged.
  • the illustrated components can be implemented in hardware, software, or a combination of software and hardware.
  • the processor 110 may include one or more processing units.
  • For example, the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc.
  • different processing units may be independent devices or integrated in one or more processors.
  • the controller may be a nerve center and a command center of the electronic device 100.
  • the controller can generate operation control signals according to the instruction operation code and timing signals, and complete the control of fetching and executing instructions.
  • the processor 110 may further include a memory for storing instructions and data.
  • the memory in the processor 110 is a cache memory.
  • the memory may store instructions or data that the processor 110 has just used or used cyclically. If the processor 110 needs to use the instruction or data again, it can be directly called from the memory. Repeated accesses are avoided and the waiting time of the processor 110 is reduced, thereby improving the efficiency of the system.
  • the processor 110 may include one or more interfaces.
  • For example, the interface may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface.
  • the I2C interface is a bidirectional synchronous serial bus, including a serial data line (SDA) and a serial clock line (SCL).
  • the processor 110 may include multiple sets of I2C buses.
  • the processor 110 may be respectively coupled to a touch sensor 180K, a charger, a flash, a camera 193, and the like through different I2C bus interfaces.
  • the processor 110 may be coupled to the touch sensor 180K through the I2C interface, so that the processor 110 and the touch sensor 180K communicate through the I2C bus interface to implement the touch function of the electronic device 100.
  • the I2S interface can be used for audio communication.
  • the processor 110 may include multiple sets of I2S buses.
  • the processor 110 may be coupled with the audio module 170 through an I2S bus to implement communication between the processor 110 and the audio module 170.
  • the audio module 170 may transmit audio signals to the wireless communication module 160 through an I2S interface, so as to implement a function of receiving a call through a Bluetooth headset.
  • The PCM interface can also be used for audio communication, to sample, quantize, and encode an analog signal.
  • the audio module 170 and the wireless communication module 160 may be coupled through a PCM bus interface.
  • the audio module 170 may also transmit audio signals to the wireless communication module 160 through the PCM interface, so as to implement the function of receiving calls through a Bluetooth headset. Both the I2S interface and the PCM interface can be used for audio communication.
  • the UART interface is a universal serial data bus for asynchronous communication.
  • The bus may be a two-way communication bus that converts the data to be transmitted between serial and parallel forms.
  • a UART interface is generally used to connect the processor 110 and the wireless communication module 160.
  • the processor 110 communicates with a Bluetooth module in the wireless communication module 160 through a UART interface to implement a Bluetooth function.
  • the audio module 170 may transmit audio signals to the wireless communication module 160 through a UART interface, so as to implement a function of playing music through a Bluetooth headset.
  • the MIPI interface can be used to connect the processor 110 with peripheral devices such as the display 194, the camera 193, and the like.
  • the MIPI interface includes a camera serial interface (CSI), a display serial interface (DSI), and the like.
  • the processor 110 and the camera 193 communicate through a CSI interface to implement a shooting function of the electronic device 100.
  • the processor 110 and the display screen 194 communicate through a DSI interface to implement a display function of the electronic device 100.
  • the GPIO interface can be configured by software.
  • the GPIO interface can be configured as a control signal or as a data signal.
  • the GPIO interface may be used to connect the processor 110 with the camera 193, the display screen 194, the wireless communication module 160, the audio module 170, the sensor module 180, and the like.
  • The GPIO interface can also be configured as an I2C interface, an I2S interface, a UART interface, a MIPI interface, or the like.
  • the USB interface 130 is an interface that complies with the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type C interface, and the like.
  • the USB interface can be used to connect a charger to charge the electronic device 100, and can also be used to transfer data between the electronic device 100 and a peripheral device. It can also be used to connect headphones and play audio through headphones. This interface can also be used to connect other electronic devices, such as AR devices.
  • the interface connection relationship between the modules illustrated in the embodiments of the present invention is only a schematic description, and does not constitute a limitation on the structure of the electronic device 100.
  • the electronic device 100 may also adopt different interface connection modes or a combination of multiple interface connection modes in the above embodiments.
  • the charging management module 140 is configured to receive a charging input from a charger.
  • the charger may be a wireless charger or a wired charger.
  • the charging management module 140 may receive a charging input of a wired charger through a USB interface.
  • the charging management module 140 may receive a wireless charging input through a wireless charging coil of the electronic device 100. While the charge management module 140 is charging the battery 142, the power management module 141 can also provide power to the electronic device.
  • the power management module 141 is used to connect the battery 142, the charge management module 140 and the processor 110.
  • the power management module 141 receives inputs from the battery 142 and / or the charge management module 140, and supplies power to the processor 110, the internal memory 121, the external memory, the display screen 194, the camera 193, and the wireless communication module 160.
  • the power management module 141 can also be used to monitor parameters such as battery capacity, number of battery cycles, battery health (leakage, impedance) and other parameters.
  • the power management module 141 may also be disposed in the processor 110.
  • the power management module 141 and the charge management module 140 may be provided in the same device.
  • The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modem processor, the baseband processor, and the like.
  • the antenna 1 and the antenna 2 are used for transmitting and receiving electromagnetic wave signals.
  • Each antenna in the electronic device 100 may be used to cover a single or multiple communication frequency bands. Different antennas can also be multiplexed to improve antenna utilization. For example, a cellular network antenna can be multiplexed into a wireless LAN diversity antenna. In other embodiments, the antenna may be used in conjunction with a tuning switch.
  • the mobile communication module 150 may provide a wireless communication solution including 2G / 3G / 4G / 5G and the like applied on the electronic device 100.
  • the mobile communication module 150 may include at least one filter, a switch, a power amplifier, a Low Noise Amplifier (LNA), and the like.
  • the mobile communication module 150 may receive the electromagnetic wave by the antenna 1, and perform filtering, amplification, and other processing on the received electromagnetic wave, and transmit it to the modem processor for demodulation.
  • the mobile communication module 150 can also amplify the signal modulated by the modem processor and convert it into electromagnetic wave radiation through the antenna 1.
  • at least part of the functional modules of the mobile communication module 150 may be provided in the processor 110.
  • at least part of the functional modules of the mobile communication module 150 may be provided in the same device as at least part of the modules of the processor 110.
  • the modem processor may include a modulator and a demodulator.
  • the modulator is configured to modulate a low-frequency baseband signal to be transmitted into a high-frequency signal.
  • the demodulator is used to demodulate the received electromagnetic wave signal into a low-frequency baseband signal.
  • the demodulator then transmits the demodulated low-frequency baseband signal to the baseband processor for processing.
  • the low-frequency baseband signal is processed by the baseband processor and then passed to the application processor.
  • the application processor outputs a sound signal through an audio device (not limited to the speaker 170A, the receiver 170B, etc.), or displays an image or video through the display screen 194.
  • the modem processor may be a separate device.
  • the modem processor may be independent of the processor 110 and disposed in the same device as the mobile communication module 150 or other functional modules.
  • The wireless communication module 160 can provide wireless communication solutions applied to the electronic device 100, including wireless local area network (WLAN), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), and the like.
  • the wireless communication module 160 may be one or more devices that integrate at least one communication processing module.
  • the wireless communication module 160 receives electromagnetic waves via the antenna 2, frequency-modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 110.
  • the wireless communication module 160 may also receive a signal to be transmitted from the processor 110, frequency-modulate it, amplify it, and convert it into electromagnetic wave radiation through the antenna 2.
  • the antenna 1 of the electronic device 100 is coupled with the mobile communication module 150, and the antenna 2 is coupled with the wireless communication module 160, so that the electronic device 100 can communicate with the network and other devices through wireless communication technology.
  • The wireless communication technology may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technology.
  • The GNSS may include a global positioning system (GPS), a global navigation satellite system (GLONASS), a BeiDou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS), and/or a satellite-based augmentation system (SBAS).
  • the electronic device 100 implements a display function through a GPU, a display screen 194, and an application processor.
  • the GPU is a microprocessor for image processing and is connected to the display 194 and an application processor.
  • the GPU is used to perform mathematical and geometric calculations for graphics rendering.
  • the processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
  • the display screen 194 is used to display images, videos, and the like.
  • the display screen 194 includes a display panel.
  • The display panel may adopt a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like.
  • the electronic device 100 may include one or N display screens, where N is a positive integer greater than 1.
  • the electronic device 100 may implement a shooting function through an ISP, a camera 193, a video codec, a GPU, a display screen 194, and an application processor.
  • The ISP processes the data fed back by the camera 193. For example, when taking a photo, the shutter opens, light is transmitted through the lens to the photosensitive element of the camera, the optical signal is converted into an electrical signal, and the photosensitive element passes the electrical signal to the ISP for processing, converting it into an image visible to the naked eye. The ISP can also optimize the noise, brightness, and skin tone of the image, and can optimize parameters such as the exposure and color temperature of the shooting scene. In some embodiments, the ISP may be disposed in the camera 193.
  • the camera 193 is used to capture still images or videos.
  • An object generates an optical image through a lens and projects it onto a photosensitive element.
  • the photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor.
  • the photosensitive element converts the optical signal into an electrical signal, and then passes the electrical signal to the ISP to convert it into a digital image signal.
  • the ISP outputs digital image signals to the DSP for processing.
  • DSP converts digital image signals into image signals in standard RGB, YUV and other formats.
  • the electronic device 100 may include one or N cameras, where N is a positive integer greater than 1.
  • a digital signal processor is used to process digital signals. In addition to digital image signals, it can also process other digital signals. For example, when the electronic device 100 selects a frequency point, the digital signal processor is used to perform a Fourier transform on the frequency point energy and the like.
  • Video codecs are used to compress or decompress digital video.
  • the electronic device 100 may support one or more video codecs. In this way, the electronic device 100 can play or record videos in multiple encoding formats, such as: MPEG1, MPEG2, MPEG3, MPEG4, and so on.
  • NPU is a neural-network (NN) computing processor.
  • the NPU can realize applications such as intelligent recognition of the electronic device 100, such as image recognition, face recognition, speech recognition, text understanding, and the like.
  • the external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to extend the storage capacity of the electronic device 100.
  • the external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function. For example, save music, videos and other files on an external memory card.
  • the internal memory 121 may be used to store computer executable program code, where the executable program code includes instructions.
  • the processor 110 executes various functional applications and data processing of the electronic device 100 by executing instructions stored in the internal memory 121.
  • the memory 121 may include a storage program area and a storage data area.
  • the storage program area may store an operating system, at least one application required by a function (such as a sound playback function, an image playback function, etc.) and the like.
  • the storage data area may store data (such as audio data, phone book, etc.) created during the use of the electronic device 100.
  • The memory 121 may include a high-speed random access memory, and may further include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or a universal flash storage (UFS).
  • the electronic device 100 may implement audio functions through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, a headphone interface 170D, and an application processor. Such as music playback, recording, etc.
  • the audio module 170 is configured to convert digital audio information into an analog audio signal and output, and is also used to convert an analog audio input into a digital audio signal.
  • the audio module 170 may also be used to encode and decode audio signals.
  • the audio module 170 may be disposed in the processor 110, or some functional modules of the audio module 170 may be disposed in the processor 110.
  • The speaker 170A, also called a "loudspeaker", is used to convert an audio electrical signal into a sound signal.
  • the electronic device 100 can listen to music through the speaker 170A, or listen to a hands-free call.
  • The receiver 170B, also referred to as an "earpiece", is used to convert an audio electrical signal into a sound signal. When the electronic device 100 answers a call or plays a voice message, the receiver 170B can be held close to the human ear to listen to the voice.
  • The microphone 170C, also called a "mic" or "mike", is used to convert a sound signal into an electrical signal.
  • the user can make a sound through the mouth near the microphone 170C, and input a sound signal into the microphone 170C.
  • the electronic device 100 may be provided with at least one microphone 170C.
  • the electronic device 100 may be provided with two microphones, in addition to collecting sound signals, it may also implement a noise reduction function.
  • the electronic device 100 may be provided with three, four, or more microphones to collect sound signals, reduce noise, identify the sound source, and implement a directional recording function.
  • the headset interface 170D is used to connect a wired headset.
  • The earphone interface may be a USB interface, a 3.5 mm open mobile terminal platform (OMTP) standard interface, or a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface.
  • the pressure sensor 180A is used to sense a pressure signal, and can convert the pressure signal into an electrical signal.
  • the pressure sensor 180A may be disposed on the display screen 194.
  • the capacitive pressure sensor may be at least two parallel plates having a conductive material. When a force is applied to the pressure sensor 180A, the capacitance between the electrodes changes.
  • the electronic device 100 determines the intensity of the pressure according to the change in capacitance.
  • the electronic device 100 detects the intensity of the touch operation according to the pressure sensor 180A.
  • the electronic device 100 may also calculate the touched position based on the detection signal of the pressure sensor 180A.
  • touch operations acting on the same touch position but different touch operation intensities may correspond to different operation instructions. For example, when a touch operation with a touch operation intensity lower than the first pressure threshold is applied to the short message application icon, an instruction for viewing the short message is executed. When a touch operation with a touch operation intensity greater than or equal to the first pressure threshold is applied to the short message application icon, an instruction for creating a short message is executed.
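  • A tiny sketch of the short-message example above: the same touch position triggers different instructions depending on touch intensity. The threshold value and the instruction names are hypothetical.

```java
// Same touch position, different intensity -> different operation instruction.
public class PressureDispatcher {
    static final float FIRST_PRESSURE_THRESHOLD = 0.6f; // assumed 0..1 scale

    String instructionFor(float pressure) {
        return (pressure < FIRST_PRESSURE_THRESHOLD)
                ? "VIEW_SHORT_MESSAGE"     // light press on the SMS icon
                : "CREATE_SHORT_MESSAGE";  // firm press on the SMS icon
    }
}
```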
  • the gyro sensor 180B may be used to determine a movement posture of the electronic device 100.
  • In some embodiments, the angular velocities of the electronic device 100 around three axes (i.e., the x, y, and z axes) may be determined by the gyro sensor 180B.
  • the gyro sensor 180B can be used for image stabilization.
  • The gyro sensor 180B detects the angle at which the electronic device 100 shakes, calculates the distance that the lens module needs to compensate according to the angle, and lets the lens counteract the shake of the electronic device 100 through reverse motion, thereby achieving image stabilization.
  • the gyro sensor 180B can also be used for navigation and somatosensory game scenes.
  • the barometric pressure sensor 180C is used to measure air pressure.
  • the electronic device 100 calculates the altitude through the air pressure value measured by the air pressure sensor 180C, and assists in positioning and navigation.
  • the magnetic sensor 180D includes a Hall sensor.
  • the electronic device 100 can detect the opening and closing of the flip leather case by using the magnetic sensor 180D.
  • In some embodiments, when the electronic device 100 is a flip phone, the electronic device 100 may detect the opening and closing of the flip according to the magnetic sensor 180D, and further set features such as automatic unlocking upon flip-open according to the detected open or closed state of the leather case or flip cover.
  • the acceleration sensor 180E can detect the magnitude of acceleration of the electronic device 100 in various directions (generally three axes).
  • the magnitude and direction of gravity can be detected when the electronic device 100 is stationary. It can also be used to recognize the posture of electronic devices, and is used in applications such as switching between horizontal and vertical screens, and pedometers.
  • The distance sensor 180F is used to measure distance; the electronic device 100 can measure distance by infrared or laser. In some embodiments, when shooting a scene, the electronic device 100 can use the distance sensor 180F to measure distance to achieve fast focusing.
  • the proximity light sensor 180G may include, for example, a light emitting diode (LED) and a light detector, such as a photodiode.
  • the light emitting diode may be an infrared light emitting diode.
  • the electronic device 100 emits infrared light through a light emitting diode.
  • the electronic device 100 uses a photodiode to detect infrared reflected light from a nearby object. When sufficient reflected light is detected, it can be determined that there is an object near the electronic device 100. When insufficiently reflected light is detected, the electronic device 100 may determine that there is no object near the electronic device 100.
  • the electronic device 100 may use the proximity light sensor 180G to detect that the user is holding the electronic device 100 close to the ear to talk, so as to automatically turn off the screen to save power.
  • The proximity light sensor 180G can also be used in holster mode and pocket mode to automatically unlock and lock the screen.
  • the ambient light sensor 180L is used to sense ambient light brightness.
  • the electronic device 100 can adaptively adjust the brightness of the display screen 194 according to the perceived ambient light brightness.
  • Ambient light sensor 180L can also be used to automatically adjust white balance when taking pictures.
  • the ambient light sensor 180L can also cooperate with the proximity light sensor 180G to detect whether the electronic device 100 is in a pocket to prevent accidental touch.
  • the fingerprint sensor 180H is used to collect fingerprints.
  • the electronic device 100 may use the collected fingerprint characteristics to realize fingerprint unlocking, access application lock, fingerprint photographing, fingerprint answering an incoming call, and the like.
  • the temperature sensor 180J is used to detect the temperature.
  • the electronic device 100 executes a temperature processing strategy using the temperature detected by the temperature sensor 180J. For example, when the temperature reported by the temperature sensor 180J exceeds the threshold, the electronic device 100 performs a performance reduction of a processor located near the temperature sensor 180J so as to reduce power consumption and implement thermal protection.
  • In other embodiments, when the temperature is lower than another threshold, the electronic device 100 heats the battery 142 to avoid abnormal shutdown of the electronic device 100 caused by low temperature. In still other embodiments, when the temperature is lower than yet another threshold, the electronic device 100 boosts the output voltage of the battery 142 to avoid abnormal shutdown caused by low temperature.
  • The touch sensor 180K, also called a "touch panel", may be disposed on the display screen 194 and is used to detect touch operations on or near it. The detected touch operation may be passed to the application processor to determine the type of the touch event, and corresponding visual output is provided through the display screen 194. In other embodiments, the touch sensor 180K may also be disposed on the surface of the electronic device 100 at a position different from that of the display screen 194.
  • the bone conduction sensor 180M can acquire vibration signals.
  • In some embodiments, the bone conduction sensor 180M can acquire the vibration signal of the vibrating bone mass of the human vocal part. The bone conduction sensor 180M can also be in contact with the human pulse and receive a blood pressure pulse signal.
  • the bone conduction sensor 180M may also be provided in the headset.
  • the audio module 170 may analyze a voice signal based on the vibration signal of the oscillating bone mass of the vocal part obtained by the bone conduction sensor 180M to implement a voice function.
  • the application processor may analyze the heart rate information based on the blood pressure beating signal acquired by the bone conduction sensor 180M to implement a heart rate detection function.
  • the keys 190 include a power-on key, a volume key, and the like.
  • The keys may be mechanical keys or touch keys.
  • the electronic device 100 may receive a key input, and generate a key signal input related to user settings and function control of the electronic device 100.
  • the motor 191 may generate a vibration alert.
  • the motor 191 can be used for vibration alert for incoming calls, and can also be used for touch vibration feedback.
  • the touch operation applied to different applications can correspond to different vibration feedback effects.
  • Touch operations acting on different areas of the display screen 194 can also correspond to different vibration feedback effects of the motor 191.
  • Different application scenarios (such as time reminders, receiving information, alarm clocks, games, etc.) can also correspond to different vibration feedback effects.
  • Touch vibration feedback effect can also support customization.
  • the indicator 192 can be an indicator light, which can be used to indicate the charging status, power change, and can also be used to indicate messages, missed calls, notifications, and so on.
  • the SIM card interface 195 is used to connect to a subscriber identity module (SIM).
  • The SIM card can be brought into contact with or separated from the electronic device 100 by being inserted into or removed from the SIM card interface 195.
  • the electronic device 100 may support one or N SIM card interfaces, and N is a positive integer greater than 1.
  • SIM card interface 195 can support Nano SIM card, Micro SIM card, SIM card, etc. Multiple SIM cards can be inserted into the same SIM card interface at the same time. The types of the multiple cards may be the same or different.
  • the SIM card interface 195 may also be compatible with different types of SIM cards.
  • the SIM card interface 195 is also compatible with external memory cards.
  • the electronic device 100 interacts with the network through a SIM card to implement functions such as calling and data communication.
  • the electronic device 100 uses an eSIM, that is, an embedded SIM card.
  • the eSIM card can be embedded in the electronic device 100 and cannot be separated from the electronic device 100.
  • the software system of the electronic device 100 may adopt a layered architecture, an event-driven architecture, a micro-core architecture, a micro-service architecture, or a cloud architecture.
  • the embodiment of the present invention takes the Android system with a layered architecture as an example, and illustrates the software structure of the electronic device 100 by way of example.
  • FIG. 3 is a software structural block diagram of an electronic device 100 according to an embodiment of the present invention.
  • the layered architecture divides the software into several layers, each of which has a clear role and division of labor.
  • the layers communicate with each other through a software interface.
  • the Android system is divided into four layers, which are an application layer, an application framework layer, an Android runtime and a system library, and a kernel layer from top to bottom.
  • the application layer can include a series of application packages.
  • the application package can include applications such as camera, gallery, calendar, call, map, navigation, WLAN, Bluetooth, music, video, SMS, etc.
  • the application framework layer provides an application programming interface (API) and a programming framework for applications at the application layer.
  • the application framework layer includes some predefined functions.
  • the application framework layer may include a window manager, a content provider, a view system, a phone manager, a resource manager, a notification manager, and the like.
  • the window manager is used to manage window programs.
  • the window manager can obtain the display size, determine whether there is a status bar, lock the screen, take a screenshot, etc.
  • Content providers are used to store and retrieve data and make it accessible to applications.
  • the data may include videos, images, audio, calls made and received, browsing history and bookmarks, phone books, and so on.
  • the view system includes visual controls, such as controls that display text, controls that display pictures, and so on.
  • the view system can be used to build applications.
  • the display interface can consist of one or more views.
  • the display interface including the SMS notification icon may include a view that displays text and a view that displays pictures.
  • the phone manager is used to provide a communication function of the electronic device 100. For example, management of call status (including connection, hang up, etc.).
  • the resource manager provides various resources for the application, such as localized strings, icons, pictures, layout files, video files, and so on.
  • the notification manager enables the application to display notification information in the status bar, which can be used to convey notification-type messages that can disappear automatically after a short stay without user interaction.
  • the notification manager is used to inform download completion, message reminders, etc.
  • The notification manager may also present a notification in the status bar at the top of the system in the form of a graph or scroll-bar text, for example a notification of an application running in the background, or a notification that appears on the screen in the form of a dialog window.
  • For example, prompt text is displayed in the status bar, a prompt sound is emitted, the electronic device vibrates, or the indicator light flashes.
  • Android Runtime includes core libraries and virtual machines. Android runtime is responsible for the scheduling and management of the Android system.
  • The core library contains two parts: the functions that the Java language needs to call, and the core library of Android.
  • the application layer and the application framework layer run in a virtual machine.
  • the virtual machine executes the java files of the application layer and the application framework layer as binary files.
  • Virtual machines are used to perform object lifecycle management, stack management, thread management, security and exception management, and garbage collection.
  • The system library may include multiple functional modules, for example, a surface manager, media libraries, a three-dimensional graphics processing library (for example, OpenGL ES), and a 2D graphics engine (for example, SGL).
  • the Surface Manager is used to manage the display subsystem and provides a fusion of 2D and 3D layers for multiple applications.
  • the media library supports a variety of commonly used audio and video formats for playback and recording, as well as still image files.
  • the media library can support multiple audio and video encoding formats, such as: MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, etc.
  • the 3D graphics processing library is used to implement 3D graphics drawing, image rendering, compositing, and layer processing.
  • the 2D graphics engine is a graphics engine for 2D graphics.
  • the kernel layer is the layer between hardware and software.
  • the kernel layer contains at least a display driver, a camera driver, an audio driver, and a sensor driver.
  • the following describes the workflow of the software and hardware of the electronic device 100 by way of example in conjunction with capturing a photographing scene.
  • When the touch sensor 180K receives a touch operation, a corresponding hardware interrupt is sent to the kernel layer.
  • the kernel layer processes touch operations into raw input events (including touch coordinates, time stamps of touch operations, and other information). Raw input events are stored at the kernel level.
  • The application framework layer obtains the raw input event from the kernel layer and identifies the control corresponding to the event. Taking the touch operation being a tap operation and the control corresponding to the tap being the camera application icon as an example: the camera application calls the interface of the application framework layer to start the camera application, then starts the camera driver by calling the kernel layer, and the camera captures a still image or video.
  • FIG. 4 shows a technology and protocol block diagram related to the multi-screen interactive technology of Wi-Fi Display.
  • The technologies involved mainly include Wi-Fi point-to-point (Wi-Fi P2P), the Real-Time Streaming Protocol (RTSP), the Real-Time Transport Protocol (RTP), streaming media technology, and audio/video codec related technologies.
  • Wi-Fi Display is mainly based on Wi-Fi P2P technology and uses RTSP as the audio and video stream control protocol.
  • The target device 20 establishes a TCP connection with the source device 10 (the source device 10 can be regarded as the client, and the target device 20 as the server).
  • RTSP negotiation is performed between the target device 20 and the source device 10.
  • a Wi-Fi Display session is established.
  • the target device 20 and the source device 10 implement processes such as transmission and control of audio and video streams, encoding and decoding.
  • the source device 10 may perform encryption protection on the transmitted content (for example, the encryption technology may be High-bandwidth Digital Content Protection Technology (HDCP)).
  • RTSP is an application layer protocol used to control the transmission of data (such as multimedia streams) with real-time characteristics.
  • RTSP generally works together with underlying protocols such as RTP or Real-time Transport Control Protocol (RTCP) and Resource Reservation Protocol (RSVP) to provide a complete set of streaming services based on the Internet.
  • RTSP can select the transmission channel of the data stream (for example, the channel corresponding to the User Datagram Protocol (UDP) and the channel corresponding to the Transmission Control Protocol (TCP)).
  • TCP is used to provide reliable data transmission in the IP environment.
  • The services it provides include data stream transmission, reliability, effective flow control, full-duplex operation, and multiplexing. TCP sends data packets in a connection-oriented, end-to-end, reliable manner; in layman's terms, a connection channel is opened up in advance for the data to be sent, and the data is then sent through it.
  • TCP corresponds to applications with high reliability requirements. TCP focuses on the quality of transmission, but it will cause the problem of excessive delay.
  • UDP is a simple connectionless transport protocol: the client does not need to establish a connection with the server and sends data to it directly, and there is no mechanism to guarantee that the data has been delivered successfully. In other words, UDP focuses on transmission efficiency, but packet loss can occur; the contrast is sketched below.
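  • The difference can be seen in a few lines of illustrative socket code (addresses and ports are placeholders): TCP must set up a connection and keeps retransmitting until data is acknowledged, while UDP hands each datagram to the network immediately with no delivery guarantee.

```python
import socket

# TCP: a connected channel is opened first; delivery and ordering are guaranteed,
# at the cost of the handshake and of retransmission delay.
tcp = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
tcp.connect(("192.0.2.10", 9000))   # placeholder address and port
tcp.sendall(b"encoded-frame")       # kept/retransmitted until acknowledged
tcp.close()

# UDP: connectionless; the datagram is handed to the network immediately,
# and nothing confirms that it arrived, so it may simply be lost.
udp = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
udp.sendto(b"encoded-frame", ("192.0.2.10", 9001))
udp.close()
```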
  • For these reasons, in a multi-screen interaction scenario, if a UDP channel is used for data transfer between the source device 10 and the target device 20, the delay is small, but fancy screens and freezes may appear; if TCP is used for data transfer between the source device 10 and the target device 20, fancy screens and freezes are rarer, but the delay is larger, which harms the consistency of the user experience.
  • Although the existing technology switches the transport layer channel based on the network bandwidth, it does not also consider the service type corresponding to the data sent by the source device during multi-screen interaction.
  • For example, the data sent may be encoded data of a video service type, of a game service type, or of a text service type.
  • Different service types affect the network transmission parameters differently (the network transmission parameters can be the delay, the transmission quality, the transmission rate, the packet loss rate, and so on), so the source device can correspondingly use the channels of different transport layer protocols to transmit the encoded data.
  • In addition, if the screen-casting type selected by the source device is the same-source mode (in which the pictures on the source and target devices stay consistent), the delay requirement is high; if it is the heterogeneous mode (in which the pictures on the source and target devices differ), the delay requirement is lower. So for different screen-casting types the source device can also adopt different transport layer channels.
  • To this end, an embodiment of the present application provides a multi-screen interaction method that takes into account the interface shared by the source device and the screen-casting type, and dynamically switches between channels of different transport layers.
  • In the embodiments of this application, if the source device and the target device run the Android system, the implementation of the multi-screen interaction method may involve the following modules: 1. Media Player Service and related modules, mainly used for RTP/RTSP transport and the corresponding codec technology; 2. Surface Flinger and related modules, whose role is to mix the UI data of the individual layers and deliver it to the display device for display; 3. Window Manager Service and related modules, used to manage the position and attributes of each UI layer in the system, because not all UI layers are cast to the target device; 4. Display Manager Service and related modules, used to manage all display devices in the system; 5. Wi-Fi Service and related modules, used to establish the Wi-Fi P2P connection.
  • For example, a video on the mobile phone can be delivered to the target device for display, but if a password input box suddenly pops up during playback (possibly initiated by some background application), this password input box is generally not cast to the target device, in order to protect privacy.
  • Step 201: The source device detects a screen-casting operation of the user and transmits the currently displayed content to the target device through the first transport layer channel.
  • Generally, the first transport layer channel is the default transport layer channel; it may be set manually by the user or preset by the system.
  • In addition, the screen-casting type of the screen-casting operation can be the same-source mode or the heterogeneous mode, and the user can choose either mode according to his or her own needs.
  • Step 202: The source device determines a first network transmission parameter according to the currently displayed content or the screen-casting type.
  • The first network transmission parameter may be any one of the foregoing network transmission parameters, such as the delay, the transmission quality, the transmission rate, or the packet loss rate.
  • Specifically, the content displayed by the source device may be an interface of the operating system or an interface within an application.
  • Generally, the applications on the source device may be of types such as document applications, video applications, music players, photo albums, and game applications.
  • Different interfaces within an application correspond to different types of services, where the service types may include a text service type, a video service type, an audio service type, a picture service type, a game service type, and so on.
  • For example, the Kugou music application may contain album covers, lyrics, and MP3 music files, so the service types corresponding to its different interfaces may be the text service type, the picture service type, or the audio service type.
  • Different interfaces within an application may also correspond to the same type of service; for example, every interface in a photo-album application corresponds to the picture service type. Because different service types place different requirements on delay and packet loss rate, the first network transmission parameter can be determined according to the service type corresponding to the currently displayed content; for example, the game service type has strict delay requirements, so the first network transmission parameter is the delay.
  • The source device may determine the service type according to the name of the application to which the currently displayed content belongs; for example, if the application hosting the shared interface is a game application, the service type may be the game service type.
  • In addition, the source device can extract keywords from the interface to determine the service type; for example, if the shared interface contains keywords such as lyrics, it can be determined that the service type is the audio service type.
  • Besides this, the source device can determine the service type through a preset whitelist, or by using results learned in advance from the user's interaction frequency. For example, for third-party applications, which users install through the application store, the source device can obtain information about an application from the store at installation time, determine the service type corresponding to that application, and then generate a whitelist of the applications corresponding to each service type. As another example, when the frequency of the user's character input reaches a certain threshold, it can be determined that the service type corresponding to the currently shared interface is the text service type. A sketch combining these strategies follows.
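  • Purely as an illustration of how the three strategies might be combined (nothing below is specified by this application; the application name, keyword set, and threshold are hypothetical):

```python
GAME_APP_WHITELIST = {"com.example.racing"}   # hypothetical, built at install time
AUDIO_KEYWORDS = ("lyrics", "album")          # illustrative keyword set
TEXT_INPUT_THRESHOLD = 60                     # assumed characters per minute

def determine_service_type(app_name: str, screen_text: str,
                            chars_per_minute: float) -> str:
    """Combine app name, on-screen keywords, and input frequency."""
    if app_name in GAME_APP_WHITELIST:
        return "game"
    if any(k in screen_text.lower() for k in AUDIO_KEYWORDS):
        return "audio"
    if chars_per_minute >= TEXT_INPUT_THRESHOLD:
        return "text"
    return "video"   # fallback for the remaining interfaces in this sketch
```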
  • In addition, different screen-casting types place different requirements on delay, so the source device can determine the first network transmission parameter according to whether the same-source mode or the heterogeneous mode is used.
  • For example, in the same-source mode the first network transmission parameter is the delay, and in the heterogeneous mode the first network transmission parameter is the packet loss rate.
  • In one possible implementation, the source device may determine the first network transmission parameter according to both the currently displayed content and the screen-casting type; that is, the source device combines the service type corresponding to the currently displayed content with whether the screen-casting type selected by the user is the same-source mode or the heterogeneous mode. For example, if the interface shared by the source device is of the video service type and the user has selected the heterogeneous mode, the packet loss rate can be chosen as the first network transmission parameter; if the content currently displayed by the source device is of the game service type and the screen-casting type selected by the user is the same-source mode, the delay can be chosen as the first network transmission parameter.
  • Step 203: When the source device determines that the first network transmission parameter of the first transport layer channel does not meet the set condition, the source device transmits the currently displayed content to the target device through the second transport layer channel.
  • The second transport layer channel is determined according to the first network transmission parameter; that is, if the first network transmission parameter is the delay, the second transport layer channel is a channel whose delay is smaller than that of the first transport layer channel, and if the first network transmission parameter is the packet loss rate, the second transport layer channel is a channel whose packet loss rate is smaller than that of the first transport layer channel.
  • In addition, the set condition generally means that the packet loss rate is less than a first threshold, that the delay is less than a second threshold, that the transmission rate is greater than a third threshold, or that the transmission quality is greater than a fourth threshold.
  • For example, step 203 can be understood as follows: when the source device determines that the packet loss rate of the first transport layer channel is greater than or equal to the first threshold, the source device switches from the first transport layer channel to a second transport layer channel with a lower packet loss rate, encodes the currently displayed content, and sends the generated encoded data through the second transport layer channel to the target device, which then decodes and displays it. Or: when the source device determines that the delay of the first transport layer channel is greater than or equal to the second threshold, the source device switches from the first transport layer channel to a second transport layer channel with a smaller delay, encodes the currently displayed content, and sends the generated encoded data through the second transport layer channel to the target device, which then decodes and displays it. The condition check itself reduces to the threshold comparison sketched below.
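  • A minimal sketch of that set-condition check, using the example thresholds that appear later in this description (20% packet loss, 2 seconds of delay); the function and constant names are assumptions:

```python
LOSS_THRESHOLD = 0.20    # first threshold: 20% packet loss
DELAY_THRESHOLD = 2.0    # second threshold: 2 seconds of delay

def set_condition_violated(parameter: str, measured: float) -> bool:
    """True when the first transport layer channel should be abandoned."""
    if parameter == "packet_loss":
        return measured >= LOSS_THRESHOLD
    if parameter == "delay":
        return measured >= DELAY_THRESHOLD
    return False
```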
  • It should be noted that determining the second transport layer channel can also happen at the same time as step 202; that is, while determining the first network transmission parameter, the source device determines the second transport layer channel according to the currently displayed content or the screen-casting type. For example, if the source device determines that the service type corresponding to the currently displayed content is the video service type or the audio service type, which requires smooth pictures or sound, the channel corresponding to TCP can be selected as the second transport layer channel; if the service type is the game service type, which has strict latency requirements, the channel corresponding to UDP can be selected as the second transport layer channel.
  • Besides this, if the service type corresponding to the currently displayed content is the text service type or the picture service type, the latency requirement is not high but a non-freezing picture matters, so the channel corresponding to TCP can likewise be selected as the second transport layer channel.
  • For example, the correspondence among the network transmission parameters, the channels corresponding to the transport layer protocols, the screen-casting types, and the content currently displayed by the source device can be as shown in Table 1.
  • For example, if the source device's screen-casting type is the same-source mode, which has strict delay requirements, the first network transmission parameter is the delay, and the source device selects the channel corresponding to UDP as the second transport layer channel; if the source device determines that the screen-casting type is the heterogeneous mode, which demands a smooth picture, the first network transmission parameter is the packet loss rate, and the source device can choose the channel corresponding to TCP as the second transport layer channel.
  • Or, for example, if the content currently displayed by the source device corresponds to the video service type and the user has selected the heterogeneous mode, the scenario demands a smooth picture, so the source device selects the channel corresponding to TCP as the second transport layer channel. If the source device determines that the currently displayed content is of the game service type and the user has selected the same-source mode, the scenario has strict latency requirements, so the source device selects the channel corresponding to UDP as the second transport layer channel. A decision-function sketch of this correspondence follows.
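  • Table 1 itself is reproduced as an image in the original filing; the decision logic it encodes, as far as it can be inferred from the surrounding examples, might be written like this (illustrative only):

```python
def pick_parameter_and_channel(service_type: str, cast_mode: str):
    """Return (first network transmission parameter, second transport layer channel)."""
    if cast_mode == "same-source" or service_type == "game":
        # Delay-sensitive cases switch toward the lower-latency UDP channel.
        return "delay", "UDP"
    if cast_mode == "heterogeneous" or service_type in ("video", "audio",
                                                        "text", "picture"):
        # Smoothness-sensitive cases switch toward the lower-loss TCP channel.
        return "packet_loss", "TCP"
    return "packet_loss", "VTP"   # otherwise stay on the balanced default
```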
  • It should be noted that the Video Transmission Protocol (VTP) in Table 1 can use mechanisms such as packet-loss degradation avoidance, packet loss recovery, and packet loss retransmission to keep a balance between small delay and few freezes or fancy screens.
  • VTP adopts a stream-transmission design similar to TCP's, but uses UDP as the underlying channel; on this basis it achieves order-preserving, reliable, and strongly loss-resistant stream transmission. VTP has a packet loss monitoring mechanism, an adaptive FEC mechanism, and a Slow.Fast mechanism, together with an enhanced congestion algorithm and FEC.
  • Because the channel corresponding to VTP can balance small delay against few freezes and fancy screens, the source device can set the channel corresponding to VTP as the first transport layer channel. That is, at the beginning the source device casts the screen to the target device through the channel corresponding to VTP; during screen casting, the source device determines, according to the currently displayed content or the screen-casting type, whether the first network transmission parameter is the delay, the packet loss rate, or another parameter. If, for example, the first network transmission parameter is the packet loss rate, the second transport layer channel is the channel corresponding to TCP, whose packet loss rate is smaller than VTP's, and the source device switches from the channel corresponding to VTP to the channel corresponding to TCP when it determines that the delay or packet loss rate no longer meets the set condition.
  • With reference to FIG. 6: when the source device starts the screen-casting function and the network IP layer is connected successfully, the source device starts the VTP protocol at the transport layer to establish an end-to-end channel, so that the RTP and RTSP protocols run on top of the VTP protocol; transmission and control of the encoded data corresponding to the currently displayed content is then performed on the channel corresponding to VTP by default.
  • During screen casting, the source device may determine, according to the currently displayed content or the screen-casting type, that the second transport layer channel is the channel corresponding to TCP; once the packet loss rate no longer meets the set condition, it switches to the channel corresponding to TCP.
  • Through the above method, the embodiments of the present application can keep the delay synchronized during screen casting, or mitigate fancy screens and freezes, improving the user experience.
  • To help those skilled in the art understand the technical solution more systematically, the following uses the source device being a mobile phone and the target device being a personal computer (PC) as an example, in conjunction with FIG. 7a to FIG. 7c, to illustrate the specific flow of the multi-screen interaction method above.
  • FIG. 7a shows the multi-screen interaction process between the mobile phone and the PC. The specific steps are shown below.
  • Step 301: The mobile phone starts the multi-screen interaction function and first uses the standard Miracast technology to complete the wireless connection between the mobile phone and the PC.
  • Step 302: The mobile phone selects the first transport layer channel (that is, the channel corresponding to VTP) to send the encoded data corresponding to the content displayed by the phone to the PC.
  • Step 303a: When the mobile phone determines that the currently displayed content is of the video service type, or that the screen-casting type is the heterogeneous mode, it executes steps 304a to 307a; otherwise it jumps to step 303b.
  • Step 304a: During screen casting, according to the video service type or the heterogeneous mode corresponding to the content currently displayed on the mobile phone, it is determined that the first network transmission parameter is the packet loss rate and the second transport layer channel is the channel corresponding to TCP.
  • Step 305a: During screen casting, the mobile phone monitors whether the packet loss rate of the encoded data on the first transport layer channel is greater than or equal to the first threshold (for example, 20%); if it is, step 306a is performed, otherwise it jumps to step 307a.
  • Step 306a: The mobile phone switches to the second transport layer channel (for example, the channel corresponding to TCP) and transmits the encoded data corresponding to the interface shared by the mobile phone to the PC through the second transport layer channel.
  • Step 307a: If the packet loss rate is less than the first threshold, the mobile phone continues to use the first transport layer channel (for example, the channel corresponding to VTP) to transmit the encoded data corresponding to the interface shared by the mobile phone to the PC.
  • Step 303b: When the mobile phone determines that the currently displayed content is of the game service type, or that the screen-casting type is the same-source mode, it executes steps 304b to 306b; otherwise it jumps directly to step 307b.
  • Step 304b: During screen casting, according to the game service type or the same-source mode corresponding to the content currently displayed on the mobile phone, it is determined that the first network transmission parameter is the delay and the second transport layer channel is the channel corresponding to UDP.
  • Step 305b: During screen casting, the mobile phone monitors whether the delay of the encoded data on the first transport layer channel is greater than or equal to the second threshold (for example, 2 seconds); if it is, step 306b is performed, otherwise it jumps to step 307b.
  • Step 306b: If the delay is greater than or equal to the second threshold, the mobile phone switches to the second transport layer channel (for example, the channel corresponding to UDP) and transmits the encoded data corresponding to the interface shared by the mobile phone to the PC.
  • Step 307b: The mobile phone continues to use the channel corresponding to VTP to transmit the encoded data corresponding to the interface shared by the mobile phone to the PC.
  • For example, in the scenario shown in FIG. 7b, the mobile phone casts a movie interface to the PC. Because the content displayed by the phone is of the video service type, the delay requirement is low but the packet-loss requirement is high. If the phone detects that the packet loss rate of the channel corresponding to VTP is greater than or equal to 20%, it automatically switches to the channel corresponding to TCP, or the phone pops up a window notifying the user to manually switch the transport layer channel to the channel corresponding to TCP. After switching to the channel corresponding to TCP, if the packet loss rate of the phone's second transport layer channel falls back below 20%, the phone can switch back to the channel corresponding to VTP automatically or manually.
  • Specifically, the way the mobile phone and the PC complete the switch of the transport layer channel may be: the mobile phone first notifies the PC that the transport layer channel has been switched to the channel corresponding to TCP; after receiving the notification message, the PC still reads data from the original port, and once it reads empty data it obtains the data from the channel corresponding to TCP, as in the receiver-side sketch below.
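  • A simplified, single-threaded sketch of that receiver-side handover; the socket objects and the decode_and_display helper are hypothetical stand-ins, not part of the disclosed design:

```python
def decode_and_display(buf: bytes) -> None:
    """Stub standing in for the PC's real decode-and-render path."""

def receive_after_switch(old_sock, new_sock) -> None:
    # After the phone's notification, keep reading from the original port
    # until an empty read shows the old channel has been drained...
    while True:
        data = old_sock.recv(65536)
        if not data:
            break
        decode_and_display(data)
    # ...then obtain all further data from the channel corresponding to TCP.
    while True:
        data = new_sock.recv(65536)
        if not data:
            break
        decode_and_display(data)
```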
  • As another example, in the scenario shown in FIG. 7c, the mobile phone casts a game interface to the PC. Because the game service type has strict delay requirements, if the phone detects that the delay of the channel corresponding to VTP exceeds a certain value, for example 2 seconds, the phone automatically switches to the channel corresponding to UDP, or pops up a window notifying the user to manually switch the transport layer channel to the channel corresponding to UDP. After switching to the channel corresponding to UDP, if the delay falls back below 2 seconds, the phone can switch back to the channel corresponding to VTP automatically or manually. In this scenario, the way the phone and the PC complete the switch of the transport layer channel is as described above and is not repeated here.
  • In some other embodiments of this application, FIG. 7d illustrates the multi-screen interaction process between the mobile phone and the PC; the specific steps are shown below.
  • Step 401: The mobile phone starts the multi-screen interaction function and first uses the standard Miracast technology to complete the wireless connection between the mobile phone and the PC, including the RTSP connection, authentication, and related procedures.
  • Step 402: The mobile phone selects the first transport layer channel (that is, the channel corresponding to VTP) to send the encoded data corresponding to the content displayed by the phone to the PC.
  • Step 403a: When the mobile phone determines that the currently displayed content is of the video service type, or that the screen-casting type is the heterogeneous mode, it executes steps 404a to 406a; otherwise it jumps to step 403b.
  • Step 404a: During screen casting, the mobile phone monitors whether the packet loss rate of the encoded data on the first transport layer channel is greater than or equal to the first threshold (for example, 20%); if it is, step 405a is performed, otherwise it jumps to step 406a.
  • Step 405a: The mobile phone switches to a second transport layer channel whose packet loss rate is lower than the first transport layer channel's (for example, the channel corresponding to TCP), and transmits the encoded data corresponding to the interface shared by the mobile phone to the PC through the second transport layer channel.
  • Step 406a: If the packet loss rate is less than the first threshold, the mobile phone continues to use the first transport layer channel (for example, the channel corresponding to VTP) to transmit the encoded data corresponding to the interface shared by the mobile phone to the PC.
  • Step 403b: When the mobile phone determines that the currently displayed content is of the game service type, or that the screen-casting type is the same-source mode, it executes steps 404b to 406b; otherwise it jumps to step 407.
  • Step 404b: During screen casting, the mobile phone monitors whether the delay of the encoded data on the first transport layer channel is greater than or equal to the second threshold (for example, 2 seconds); if it is, step 405b is performed, otherwise it jumps to step 407.
  • Step 405b: If the delay is greater than or equal to the second threshold, the mobile phone switches to the second transport layer channel (for example, the channel corresponding to UDP) and transmits the encoded data corresponding to the interface shared by the mobile phone to the PC.
  • Step 406b: During screen casting, the mobile phone monitors whether the delay of the encoded data on the second transport layer channel is greater than or equal to the second threshold (for example, 2 seconds); if it is, step 405b is performed, otherwise it jumps to step 407.
  • Step 407: The mobile phone uses the channel corresponding to VTP to transmit the encoded data corresponding to the interface shared by the mobile phone to the PC. The overall FIG. 7d behavior is summarized in the sketch below.
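  • Putting the FIG. 7d branches together, the phone's monitoring loop behaves roughly like the generator below; the channel names, threshold values, and probe function are placeholders inferred from the steps above, not a definitive implementation:

```python
def fig7d_channel_sequence(service_type: str, cast_mode: str, probe):
    """Yield the channel to use next; probe(ch) returns (delay_s, loss_ratio)."""
    channel = "VTP"                                   # step 402: default channel
    while True:
        delay_s, loss_ratio = probe(channel)
        if service_type == "video" or cast_mode == "heterogeneous":
            # Steps 403a-406a: the loss-sensitive branch.
            channel = "TCP" if loss_ratio >= 0.20 else "VTP"
        elif service_type == "game" or cast_mode == "same-source":
            # Steps 403b-406b: the delay-sensitive branch.
            channel = "UDP" if delay_s >= 2.0 else "VTP"
        else:
            channel = "VTP"                           # step 407: stay on VTP
        yield channel
```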
  • In some other embodiments of this application, because the mobile phone also has a split-screen function, when the source device enables the multi-screen interaction function and the split-screen function at the same time, the source device can establish wireless connections with a first target device and a second target device.
  • When the source device detects the user's split-screen operation, it may, in response to that operation, display the interface of the currently running first application and the interface of the second application in split-screen mode.
  • If the source device detects the user's first screen-casting operation, then in response to it the source device transmits the interface of the first application to the first target device through the first transport layer channel, and transmits the interface of the second application to the second target device through the second transport layer channel. It subsequently determines a third transport layer channel or a fourth transport layer channel according to the interface of the first application or of the second application, the screen-casting type, and so on, and switches transport layer channels when the packet loss rate or the delay no longer meets the conditions. The way the third or the fourth transport layer channel is specifically determined can follow the way the second transport layer channel is switched, as described above, and is not repeated here.
  • It should also be noted that the first transport layer channel and the second transport layer channel may both be default transport layer channels, for example channels corresponding to VTP, or they may be channels corresponding to different transport layer protocols.
  • For example, as shown in FIG. 8, mobile phone A simultaneously projects the interfaces of two applications, a game and WeChat, onto mobile phone B and mobile phone C, respectively.
  • For instance, mobile phone A by default uses a channel corresponding to the VTP protocol to cast the game interface 501 to mobile phone B, and uses another channel corresponding to the VTP protocol to cast the WeChat screen 502 to mobile phone C. When mobile phone A detects that the data delay in the VTP channel between phone A and phone B is greater than 2 seconds, it switches that channel to the channel corresponding to UDP; when mobile phone A detects that the data packet loss rate in the VTP channel between phone A and phone C is greater than 20%, it switches that channel to the channel corresponding to TCP. Each stream is monitored independently, as in the sketch below.
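  • A minimal per-target sketch of that independent monitoring; the dictionary keys, thresholds, and function name are all illustrative assumptions:

```python
MONITORS = {
    "phone_B": {"param": "delay",       "threshold": 2.0,  "fallback": "UDP"},
    "phone_C": {"param": "packet_loss", "threshold": 0.20, "fallback": "TCP"},
}

def monitor_tick(target: str, measured: float, current: str) -> str:
    """One monitoring step for one target device; each stream starts on VTP."""
    cfg = MONITORS[target]
    if current == "VTP" and measured >= cfg["threshold"]:
        return cfg["fallback"]   # e.g. phone A -> phone B switches to UDP
    if current != "VTP" and measured < cfg["threshold"]:
        return "VTP"             # recover to the balanced default channel
    return current
```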
  • Further, if the source device detects a second screen-casting operation by the user, where the second screen-casting operation indicates that the content to be cast is only the interface of the first application, then in response to the second screen-casting operation the source device transmits only the interface of the first application to the first target device through the third transport layer channel.
  • For example, as shown in FIG. 9, the mobile phone can select the channel corresponding to UDP to cast the game interface 501 to the PC; when the phone detects that the data delay on the channel corresponding to UDP has fallen below 2 seconds, it switches back to the channel corresponding to VTP.
  • It can be seen that, in the embodiments of this application, the transport layer channel for the encoded data corresponding to the content displayed by the source device is determined by jointly considering factors such as the content displayed by the source device and the screen-casting type during multi-screen interaction. When these factors impose strict delay requirements, the channel corresponding to a lower-delay protocol is selected, which reduces the delay to a certain extent; when they impose strict picture-smoothness requirements, the channel corresponding to a lower-packet-loss protocol is selected, which reduces screen freezes to a certain extent.
  • An embodiment of the present application further provides a computer-readable storage medium. The computer-readable storage medium includes a computer program, and when the computer program runs on an electronic device, the electronic device is caused to execute any possible implementation of the foregoing multi-screen interaction method.
  • An embodiment of the present application further provides a computer program product that, when the computer program product runs on an electronic device, causes the electronic device to perform any one of the possible implementations of the multi-screen interaction method described above.
  • In some embodiments of the present application, an embodiment of the present application discloses an electronic device. As shown in FIG. 10, the electronic device is used to implement the methods described in the foregoing method embodiments and includes a transceiver module 1001 and a processing module 1002.
  • The transceiver module 1001 is configured to support the electronic device in performing step 201 in FIG. 5, the processing module 1002 is configured to support the electronic device in performing step 202 in FIG. 5, and the transceiver module 1001 is further configured to support the electronic device in performing step 203 in FIG. 5. For all related content of the steps in the foregoing method embodiment, refer to the function descriptions of the corresponding functional modules; details are not repeated here.
  • In some other embodiments of the present application, an embodiment of the present application discloses an electronic device. As shown in FIG. 11, the electronic device may include: one or more processors 1101; a memory 1102; a display 1103; one or more application programs (not shown); and one or more computer programs 1104, where the foregoing components may be connected via one or more communication buses 1105.
  • The one or more computer programs 1104 are stored in the memory 1102 and configured to be executed by the one or more processors 1101. The one or more computer programs 1104 include instructions, and the instructions may be used to execute the steps in FIG. 5 and the corresponding embodiments.
  • Each functional unit in each of the embodiments of the present application may be integrated into one processing unit, or each of the units may exist separately physically, or two or more units may be integrated into one unit.
  • the above integrated unit may be implemented in the form of hardware or in the form of software functional unit.
  • When the integrated unit is implemented in the form of a software functional unit and is sold or used as an independent product, it may be stored in a computer-readable storage medium.
  • Based on such an understanding, the technical solutions of the embodiments of the present application, in essence, or the part of them contributing to the prior art, or all or part of the technical solutions, may be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device) or a processor to perform all or part of the steps of the methods described in the embodiments of the present application.
  • The foregoing storage medium includes any medium that can store program code, such as a flash memory, a removable hard disk, a read-only memory, a random access memory, a magnetic disk, or an optical disc.

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • User Interface Of Digital Computer (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

This application provides a multi-screen interaction method and device. The method can be applied to an electronic device having a multi-screen interaction function, and includes: a source device detects a screen-casting operation of a user and, in response to the screen-casting operation, transmits the currently displayed content to the target device through a first transport layer channel; during screen casting, the source device determines a first network transmission parameter according to the content currently displayed by the source device and/or the screen-casting type; when the source device determines that the first network transmission parameter of the first transport layer channel does not meet a set condition, the source device transmits the currently displayed content to the target device through a second transport layer channel. Because the second transport layer channel is determined according to the first network transmission parameter, it has advantages such as a small delay or a small packet loss rate, so delay synchronization during screen casting can be guaranteed, or the problem of picture freezes can be alleviated.

Description

一种多屏互动方法及设备 技术领域
本申请涉及终端技术领域,尤其涉及一种多屏互动方法及设备。
背景技术
随着通讯技术、信息技术以及电视技术的快速发展及普及,信息的获取及传输以空前的速度进行,人们的生活方式发生了很大改变,并逐渐向快捷性和娱乐性的方向发展,家庭娱乐中心正逐渐成为用户获取信息和进行娱乐的新方式。移动智能协助互联网与家庭数字电视的有效互取,给数字电视媒体带来了全新的战和机遇。在智能化的背景下,电视、手机,电脑的功能趋于多元。如何充分利用各种智能终端的优势,实现智能终端间的互通互联与资源共享,成为未来家庭娱乐发展的目标。多屏互动技术正是在这样的需求驱动下出现并发展起来的,而且迅速成为了广电、家电厂商、互联网运营商以及通信相关行业关注的重点,也成为了未来家庭娱乐产业的关键技术。
目前从多屏互动的功能来看,存在一种屏幕分享模式,即不同设备间通过分享操作系统的屏幕或者应用内的屏幕来实现多屏互动,但是长时间分享操作系统的屏幕或者应用内的屏幕,可能会有花屏和卡顿的情况出现,或者因延迟时间较大出现屏幕之间不同步的情况,从而对用户体验造成影响。
发明内容
本申请提供一种多屏互动方法及设备,用以解决现有技术不同设备之间多屏互动时会出现花屏和卡顿,或者画面不同步问题。
第一方面,本申请实施例提供了一种多屏互动方法,该方法包括:源设备检测到用户的投屏操作,响应于该投屏操作,源设备通过第一传输层通道将当前显示的内容传输到目标设备,然后源设备根据所述源设备当前显示的内容或投屏类型确定第一网络传输参数,当所述源设备确定所述第一传输层通道的第一网络传输参数不满足设定条件时,所述源设备通过第二传输层通道将当前显示的内容传输到所述目标设备,所述第二传输层通道是所述源设备根据所述第一网络传输参数确定的。
通过该方法,本申请源设备可以在投屏过程中,可以结合当前显示的内容或投屏类型,从第一传输层切换到低时延的第二传输层通道,或者切换到低丢包率的第二传输层通道。
在一种可能的实现中,若所述源设备当前显示音视频业务类型的应用的界面,源设备根据所述音视频业务类型的应用的界面确定所述第一网络传输参数为丢包率;其中,所述第二传输层通道的丢包率低于所述第一传输层通道的丢包率。也就是说,当源设备投屏的是视频业务类型的应用的界面时,源设备可以从第一传输层通道切换至丢包率较低的第二传输层通道,从而改善画面卡顿的情况。
在一种可能的实现中,若源设备当前显示游戏业务类型的应用,所述源设备根据游戏业务类型的应用确定所述第一网络传输参数为时延;其中,所述第二传输层通道的时延低于所述第一传输层通道的时延。也就是说,当源设备投屏的是游戏业务类型的应用的界面时,源设备可以从第一传输层通道切换至时延较低的第二传输层通道,从而改善画面不一 致的情况。
在一种可能的实现中,若源设备当前投屏类型是同源方式,所述源设备根据同源方式确定所述第一网络传输参数为时延;其中,所述第二传输层通道的时延低于所述第一传输层通道的时延。也就是说,当源设备同源方式投屏时,源设备可以从第一传输层通道切换至时延较低的第二传输层通道,从而改善画面不一致的情况。
在一种可能的实现中，若源设备当前投屏类型是异源方式，所述源设备根据异源方式确定所述第一网络传输参数为丢包率；其中，所述第二传输层通道的丢包率低于所述第一传输层通道的丢包率。也就是说，当源设备异源方式投屏时，源设备可以从第一传输层通道切换至丢包率较低的第二传输层通道，从而改善画面卡顿的情况。
在一种可能的实现中,所述第一传输层通道为VTP对应的通道,时延低于VTP对应的通道为UDP对应的通道,或者丢包率低于所述VTP对应的通道为TCP对应的通道。
第二方面，本申请实施例提供了一种多屏互动方法，该方法包括：源设备检测到用户的分屏操作；响应于所述分屏操作，所述源设备将当前运行的第一应用的界面和第二应用的界面分屏显示；然后源设备检测到用户的第一投屏操作，响应于所述第一投屏操作，所述源设备通过第一传输层通道将第一应用的界面传输到第一目标设备，以及通过第二传输层通道将第二应用的界面传输到第二目标设备；在投屏过程中，源设备根据第一应用的界面或投屏类型确定第一网络传输参数；当所述源设备确定所述第一传输层通道的第一网络传输参数不满足设定条件时，所述源设备通过所述第三传输层通道将第一应用的界面传输到所述第一目标设备，所述第三传输层通道是根据所述第一网络传输参数确定的。
另外,所述源设备根据第二应用的界面或投屏类型确定第二网络传输参数;当所述源设备确定所述第二传输层通道的第二网络传输参数大于或等于第二阈值时,所述源设备通过所述第四传输层通道将第二应用的界面传输到所述第二目标设备,所述第四传输层通道是源设备根据所述第二网络传输参数确定的。
通过该方法,源设备可以在分屏功能和投屏功能均开启的场景下,通过上述多屏互动方法完成多个应用的界面的投屏,能够改善在投屏过程中画面不同步和画面不卡顿的问题。
在一种可能的实现,源设备检测到用户的第二投屏操作,所述第二投屏操作指示被投屏的内容仅为所述第一应用的界面;响应于所述第二投屏操作,源设备停止通过所述第四传输通道将所述第二应用的界面传输到所述第二目标设备。也就是说,当源设备只选择将分屏界面中的其中一个界面投屏时,源设备能够实现只投屏其中一个应用的界面。
第三方面,本申请实施例提供一种电子设备,包括处理器和存储器。其中,存储器用于存储一个或多个计算机程序;当存储器存储的一个或多个计算机程序被处理器执行时,使得终端能够实现第一方面的任意一种可能的设计的方法,或者第二方面的任意一种可能的设计的方法。
第四方面,本申请实施例还提供了一种电子设备,所述电子设备包括执行第一方面或者第一方面的任意一种可能的设计的方法的模块/单元。这些模块/单元可以通过硬件实现,也可以通过硬件执行相应的软件实现。
第五方面,本申请实施例还提供了一种电子设备,所述电子设备包括执行第二方面或者第二方面的任意一种可能的设计的方法的模块/单元。这些模块/单元可以通过硬件实现,也可以通过硬件执行相应的软件实现。
第六方面,本申请实施例中还提供一种计算机可读存储介质,所述计算机可读存储介质包括计算机程序,当计算机程序在电子设备上运行时,使得所述电子设备执行任意一种可能的设计的方法。
第七方面,本申请实施例还提供一种包含计算机程序产品,当所述计算机程序产品在终端上运行时,使得所述终端执行任意一种可能的设计的方法。
本申请的这些方面或其他方面在以下实施例的描述中会更加简明易懂。
附图说明
图1为本申请实施例提供的一种多屏互动适用的场景架构示意图;
图2为本申请实施例提供的终端结构示意图;
图3为本申请实施例提供的一种安卓操作系统的架构示意图;
图4为现有技术提供的一种Wi-Fi Display所涉及的技术及协议框图;
图5为本申请实施例提供的一种多屏互动方法流程示意图;
图6为本申请实施例提供的一种Wi-Fi Display所涉及的技术及协议框图;
图7a为本申请实施例提供的一种多屏互动方法流程示意图一;
图7b至图7c为本申请实施例提供的两种多屏互动场景示意图;
图7d为本申请实施例提供的一种多屏互动方法流程示意图二;
图8为本申请实施例提供的一种分屏功能开启后的屏互动方法示意图一;
图9为本申请实施例提供的一种分屏功能开启后的屏互动方法示意图二;
图10为本申请实施例提供的一种终端结构示意图;
图11为本申请实施例提供的一种手机结构示意图。
具体实施方式
下面将结合附图对本申请实施例作进一步地详细描述。
多屏互动技术指的是,在不同的操作系统,以及不同的终端设备(例如智能手机、智能平板、电脑、电视)之间可以相互兼容跨越操作,通过无线网络连接的方式,实现数字多媒体(例如高清视频,音频,图片)内容的传输。多屏互动技术可以实现在不同终端平台设备上同时共享显示内容,能够丰富用户的多媒体生活。
从多屏互动的功能来看，包括以下三种模式：a)内容分享模式：即不同设备之间通过分享媒体内容或媒体内容的链接来实现多屏互动；b)界面分享模式：即不同设备间通过分享系统的界面或者应用内的界面来实现多屏互动。c)远程控制模式：即通过一台设备控制另外一台设备，实现多屏间的互动。对于以上三种多屏互动的方式，国际标准化组织与各种产业联盟定义了多种实现方案。目前多屏互动实现方案有数字生活网络联盟（digital living network alliance，DLAN）、无线高清技术（Intel Wireless Display，Intel WIDI）、镜像方式、闪联协议、Miracast（也称为Wi-Fi Display）等技术。其中，支持界面分享模式的多屏互动方案有Miracast等技术，采用这一类多屏互动方案实现界面分享又称为无线投屏。
本申请实施例提供的多屏互动方法适用于图1所示的架构,该架构包括源设备10、目标设备20。源设备10和目标设备20通过通信网络互相通信。其中,该通信网络可以是局 域网,也可以是通过中继(relay)设备转接的广域网。当该通信网络为局域网时,示例性的,该通信网络可以是Wi-Fi热点网络、Wi-Fi直连网络、蓝牙网络、zigbee网络或近场通信(near field communication,NFC)网络等近距离通信网络。当该通信网络为广域网时,示例性的,该通信网络可以是第三代移动通信技术(3rd-generation wireless telephone technology,3G)网络、第四代移动通信技术(the 4th generation mobile communication technology,4G)网络、第五代移动通信技术(5th-generation mobile communication technology,5G)网络、未来演进的公共陆地移动网络(public land mobile network,PLMN)或因特网等。
其中,源设备10,是指主动分享系统的界面或者应用的界面的设备,示例性地,该源设备10可以是手机、平板电脑、笔记本电脑等。目标设备20,是指被动接受源设备10所分享的界面的设备,示例性地,目标设备20可以是手机、平板电脑、笔记本电脑等。
例如,源设备基于上述Miracast多屏互动技术,将所分享的界面编码为H264格式数据,再将H264格式数据发送至目标设备20。然后目标设备20就可以对H264格式数据进行解码并显示。
需要说明的是,本申请实施例提供的多屏互动方法可应用于手机、平板电脑、可穿戴设备、车载设备、增强现实(augmented reality,AR)设备、虚拟现实(virtual reality,VR)设备、笔记本电脑、无人机、超级移动个人计算机(ultra-mobile personal computer,UMPC)、上网本、个人数字助理(personal digital assistant,PDA)、智能电视等支持显示的终端上,本申请实施例中对终端的具体形式不做特殊限制。例如源设备可以是手机,目标设备可以是平板电脑。另外目标设备可以是上述兼具有接收功能的和显示功能的终端,也可以是只包含接收功能的设备,例如机顶盒。如果是目标设备是只包含接收功能的设备,那么该目标设备可以再连接到具有显示功能的设备上,以达到多屏互动的目的。例如目标设备是机顶盒,该机顶盒与电视显示器连接。
如图2所示,本申请实施例中的源设备10或目标设备20可以为电子设备100。下面以电子设备100为例对实施例进行具体说明。应该理解的是,图示电子设备100仅是源设备10或目标设备20的一个范例,并且电子设备100可以具有比图中所示出的更多的或者更少的部件,可以组合两个或更多的部件,或者可以具有不同的部件配置。图中所示出的各种部件可以在包括一个或多个信号处理或专用集成电路在内的硬件、软件、或硬件和软件的组合中实现。
图2中,电子设备100可以包括处理器110,外部存储器接口120,内部存储器121,USB接口130,充电管理模块140,电源管理模块141,电池142,天线1,天线2,移动通信模块150,无线通信模块160,音频模块170,扬声器170A,受话器170B,麦克风170C,耳机接口170D,传感器模块180,按键190,马达191,指示器192,摄像头193,显示屏194,以及SIM卡接口195等。其中传感器模块180可以包括压力传感器180A,陀螺仪传感器180B,气压传感器180C,磁传感器180D,加速度传感器180E,距离传感器180F,接近光传感器180G,指纹传感器180H,温度传感器180J,触摸传感器180K,环境光传感器180L,骨传导传感器180M等。
可以理解的是,本发明实施例示意的结构并不构成对电子设备100的具体限定。在本申请另一些实施例中,电子设备100可以包括比图示更多或更少的部件,或者组合某些部件,或者拆分某些部件,或者不同的部件布置。图示的部件可以以硬件,软件或软件和硬 件的组合实现。
处理器110可以包括一个或多个处理单元,例如:处理器110可以包括应用处理器(application processor,AP),调制解调处理器,图形处理器(graphics processing unit,GPU),图像信号处理器(image signal processor,ISP),控制器,存储器,视频编解码器,数字信号处理器(digital signal processor,DSP),基带处理器,和/或神经网络处理器(Neural-network Processing Unit,NPU)等。其中,不同的处理单元可以是独立的器件,也可以集成在一个或多个处理器中。
其中,控制器可以是电子设备100的神经中枢和指挥中心。控制器可以根据指令操作码和时序信号,产生操作控制信号,完成取指令和执行指令的控制。
处理器110中还可以设置存储器,用于存储指令和数据。在一些实施例中,处理器110中的存储器为高速缓冲存储器。该存储器可以保存处理器110刚用过或循环使用的指令或数据。如果处理器110需要再次使用该指令或数据,可从所述存储器中直接调用。避免了重复存取,减少了处理器110的等待时间,因而提高了系统的效率。
在一些实施例中,处理器110可以包括一个或多个接口。接口可以包括集成电路(inter-integrated circuit,I2C)接口,集成电路内置音频(inter-integrated circuit sound,I2S)接口,脉冲编码调制(pulse code modulation,PCM)接口,通用异步收发传输器(universal asynchronous receiver/transmitter,UART)接口,移动产业处理器接口(mobile industry processor interface,MIPI),通用输入输出(general-purpose input/output,GPIO)接口,用户标识模块(subscriber identity module,SIM)接口,和/或通用串行总线(universal serial bus,USB)接口等。
I2C接口是一种双向同步串行总线,包括一根串行数据线(serial data line,SDA)和一根串行时钟线(derail clock line,SCL)。在一些实施例中,处理器110可以包含多组I2C总线。处理器110可以通过不同的I2C总线接口分别耦合触摸传感器180K,充电器,闪光灯,摄像头193等。例如:处理器110可以通过I2C接口耦合触摸传感器180K,使处理器110与触摸传感器180K通过I2C总线接口通信,实现电子设备100的触摸功能。
I2S接口可以用于音频通信。在一些实施例中,处理器110可以包含多组I2S总线。处理器110可以通过I2S总线与音频模块170耦合,实现处理器110与音频模块170之间的通信。在一些实施例中,音频模块170可以通过I2S接口向无线通信模块160传递音频信号,实现通过蓝牙耳机接听电话的功能。
PCM接口也可以用于音频通信,将模拟信号抽样,量化和编码。在一些实施例中,音频模块170与无线通信模块160可以通过PCM总线接口耦合。在一些实施例中,音频模块170也可以通过PCM接口向无线通信模块160传递音频信号,实现通过蓝牙耳机接听电话的功能。所述I2S接口和所述PCM接口都可以用于音频通信。
UART接口是一种通用串行数据总线,用于异步通信。该总线可以为双向通信总线。它将要传输的数据在串行通信与并行通信之间转换。在一些实施例中,UART接口通常被用于连接处理器110与无线通信模块160。例如:处理器110通过UART接口与无线通信模块160中的蓝牙模块通信,实现蓝牙功能。在一些实施例中,音频模块170可以通过UART接口向无线通信模块160传递音频信号,实现通过蓝牙耳机播放音乐的功能。
MIPI接口可以被用于连接处理器110与显示屏194,摄像头193等外围器件。MIPI接口包括摄像头串行接口(camera serial interface,CSI),显示屏串行接口(display serial  interface,DSI)等。在一些实施例中,处理器110和摄像头193通过CSI接口通信,实现电子设备100的拍摄功能。处理器110和显示屏194通过DSI接口通信,实现电子设备100的显示功能。
GPIO接口可以通过软件配置。GPIO接口可以被配置为控制信号,也可被配置为数据信号。在一些实施例中,GPIO接口可以用于连接处理器110与摄像头193,显示屏194,无线通信模块160,音频模块170,传感器模块180等。GPIO接口还可以被配置为I2C接口,I2S接口,UART接口,MIPI接口等。
USB接口130是符合USB标准规范的接口,具体可以是Mini USB接口,Micro USB接口,USB Type C接口等。USB接口可以用于连接充电器为电子设备100充电,也可以用于电子设备100与外围设备之间传输数据。也可以用于连接耳机,通过耳机播放音频。该接口还可以用于连接其他电子设备,例如AR设备等。
可以理解的是,本发明实施例示意的各模块间的接口连接关系,只是示意性说明,并不构成对电子设备100的结构限定。在本申请另一些实施例中,电子设备100也可以采用上述实施例中不同的接口连接方式,或多种接口连接方式的组合。
充电管理模块140用于从充电器接收充电输入。其中,充电器可以是无线充电器,也可以是有线充电器。在一些有线充电的实施例中,充电管理模块140可以通过USB接口接收有线充电器的充电输入。在一些无线充电的实施例中,充电管理模块140可以通过电子设备100的无线充电线圈接收无线充电输入。充电管理模块140为电池142充电的同时,还可以通过电源管理模块141为电子设备供电。
电源管理模块141用于连接电池142,充电管理模块140与处理器110。电源管理模块141接收电池142和/或充电管理模块140的输入,为处理器110,内部存储器121,外部存储器,显示屏194,摄像头193,和无线通信模块160等供电。电源管理模块141还可以用于监测电池容量,电池循环次数,电池健康状态(漏电,阻抗)等参数。在其他一些实施例中,电源管理模块141也可以设置于处理器110中。在另一些实施例中,电源管理模块141和充电管理模块140也可以设置于同一个器件中。
电子设备100的无线通信功能可以通过天线模块1,天线模块2移动通信模块150,无线通信模块160,调制解调处理器以及基带处理器等实现。
天线1和天线2用于发射和接收电磁波信号。电子设备100中的每个天线可用于覆盖单个或多个通信频带。不同的天线还可以复用,以提高天线的利用率。例如:可以将蜂窝网天线复用为无线局域网分集天线。在另外一些实施例中,天线可以和调谐开关结合使用。
移动通信模块150可以提供应用在电子设备100上的包括2G/3G/4G/5G等无线通信的解决方案。移动通信模块150可以包括至少一个滤波器,开关,功率放大器,低噪声放大器(Low Noise Amplifier,LNA)等。移动通信模块150可以由天线1接收电磁波,并对接收的电磁波进行滤波,放大等处理,传送至调制解调处理器进行解调。移动通信模块150还可以对经调制解调处理器调制后的信号放大,经天线1转为电磁波辐射出去。在一些实施例中,移动通信模块150的至少部分功能模块可以被设置于处理器110中。在一些实施例中,移动通信模块150的至少部分功能模块可以与处理器110的至少部分模块被设置在同一个器件中。
调制解调处理器可以包括调制器和解调器。其中,调制器用于将待发送的低频基带信号调制成中高频信号。解调器用于将接收的电磁波信号解调为低频基带信号。随后解调器 将解调得到的低频基带信号传送至基带处理器处理。低频基带信号经基带处理器处理后,被传递给应用处理器。应用处理器通过音频设备(不限于扬声器170A,受话器170B等)输出声音信号,或通过显示屏194显示图像或视频。在一些实施例中,调制解调处理器可以是独立的器件。在另一些实施例中,调制解调处理器可以独立于处理器110,与移动通信模块150或其他功能模块设置在同一个器件中。
无线通信模块160可以提供应用在电子设备100上的包括无线局域网(wireless local area networks,WLAN),蓝牙(bluetooth,BT),全球导航卫星系统(global navigation satellite system,GNSS),调频(frequency modulation,FM),近距离无线通信技术(near field communication,NFC),红外技术(infrared,IR)等无线通信的解决方案。无线通信模块160可以是集成至少一个通信处理模块的一个或多个器件。无线通信模块160经由天线2接收电磁波,将电磁波信号调频以及滤波处理,将处理后的信号发送到处理器110。无线通信模块160还可以从处理器110接收待发送的信号,对其进行调频,放大,经天线2转为电磁波辐射出去。
在一些实施例中,电子设备100的天线1和移动通信模块150耦合,天线2和无线通信模块160耦合,使得电子设备100可以通过无线通信技术与网络以及其他设备通信。所述无线通信技术可以包括全球移动通讯系统(global system for mobile communications,GSM),通用分组无线服务(general packet radio service,GPRS),码分多址接入(code division multiple access,CDMA),宽带码分多址(wideband code division multiple access,WCDMA),时分码分多址(time-division code division multiple access,TD-SCDMA),长期演进(long term evolution,LTE),BT,GNSS,WLAN,NFC,FM,和/或IR技术等。所述GNSS可以包括全球卫星定位系统(global positioning system,GPS),全球导航卫星系统(global navigation satellite system,GLONASS),北斗卫星导航系统(beidou navigation satellite system,BDS),准天顶卫星系统(quasi-zenith satellite system,QZSS))和/或星基增强系统(satellite based augmentation systems,SBAS)。
电子设备100通过GPU,显示屏194,以及应用处理器等实现显示功能。GPU为图像处理的微处理器,连接显示屏194和应用处理器。GPU用于执行数学和几何计算,用于图形渲染。处理器110可包括一个或多个GPU,其执行程序指令以生成或改变显示信息。
显示屏194用于显示图像,视频等。显示屏194包括显示面板。显示面板可以采用LCD(liquid crystal display,液晶显示屏),OLED(organic light-emitting diode,有机发光二极管),有源矩阵有机发光二极体或主动矩阵有机发光二极体(active-matrix organic light emitting diode的,AMOLED),柔性发光二极管(flex light-emitting diode,FLED),Miniled,MicroLed,Micro-oLed,量子点发光二极管(quantum dot light emitting diodes,QLED)等。在一些实施例中,电子设备100可以包括1个或N个显示屏,N为大于1的正整数。
电子设备100可以通过ISP,摄像头193,视频编解码器,GPU,显示屏194以及应用处理器等实现拍摄功能。
ISP用于处理摄像头193反馈的数据。例如,拍照时,打开快门,光线通过镜头被传递到摄像头感光元件上,光信号转换为电信号,摄像头感光元件将所述电信号传递给ISP处理,转化为肉眼可见的图像。ISP还可以对图像的噪点,亮度,肤色进行算法优化。ISP还可以对拍摄场景的曝光,色温等参数优化。在一些实施例中,ISP可以设置在摄像头193中。
摄像头193用于捕获静态图像或视频。物体通过镜头生成光学图像投射到感光元件。感光元件可以是电荷耦合器件(charge coupled device,CCD)或互补金属氧化物半导体(complementary metal-oxide-semiconductor,CMOS)光电晶体管。感光元件把光信号转换成电信号,之后将电信号传递给ISP转换成数字图像信号。ISP将数字图像信号输出到DSP加工处理。DSP将数字图像信号转换成标准的RGB,YUV等格式的图像信号。在一些实施例中,电子设备100可以包括1个或N个摄像头,N为大于1的正整数。
数字信号处理器用于处理数字信号,除了可以处理数字图像信号,还可以处理其他数字信号。例如,当电子设备100在频点选择时,数字信号处理器用于对频点能量进行傅里叶变换等。
视频编解码器用于对数字视频压缩或解压缩。电子设备100可以支持一种或多种视频编解码器。这样,电子设备100可以播放或录制多种编码格式的视频,例如:MPEG1,MPEG2,MPEG3,MPEG4等。
NPU为神经网络(neural-network,NN)计算处理器,通过借鉴生物神经网络结构,例如借鉴人脑神经元之间传递模式,对输入信息快速处理,还可以不断的自学习。通过NPU可以实现电子设备100的智能认知等应用,例如:图像识别,人脸识别,语音识别,文本理解等。
外部存储器接口120可以用于连接外部存储卡,例如Micro SD卡,实现扩展电子设备100的存储能力。外部存储卡通过外部存储器接口120与处理器110通信,实现数据存储功能。例如将音乐,视频等文件保存在外部存储卡中。
内部存储器121可以用于存储计算机可执行程序代码,所述可执行程序代码包括指令。处理器110通过运行存储在内部存储器121的指令,从而执行电子设备100的各种功能应用以及数据处理。存储器121可以包括存储程序区和存储数据区。其中,存储程序区可存储操作系统,至少一个功能所需的应用程序(比如声音播放功能,图像播放功能等)等。存储数据区可存储电子设备100使用过程中所创建的数据(比如音频数据,电话本等)等。此外,存储器121可以包括高速随机存取存储器,还可以包括非易失性存储器,例如至少一个磁盘存储器件,闪存器件,通用闪存存储器(universal flash storage,UFS)等。
电子设备100可以通过音频模块170,扬声器170A,受话器170B,麦克风170C,耳机接口170D,以及应用处理器等实现音频功能。例如音乐播放,录音等。
音频模块170用于将数字音频信息转换成模拟音频信号输出,也用于将模拟音频输入转换为数字音频信号。音频模块170还可以用于对音频信号编码和解码。在一些实施例中,音频模块170可以设置于处理器110中,或将音频模块170的部分功能模块设置于处理器110中。
扬声器170A,也称“喇叭”,用于将音频电信号转换为声音信号。电子设备100可以通过扬声器170A收听音乐,或收听免提通话。
受话器170B,也称“听筒”,用于将音频电信号转换成声音信号。当电子设备100接听电话或语音信息时,可以通过将受话器170B靠近人耳接听语音。
麦克风170C,也称“话筒”,“传声器”,用于将声音信号转换为电信号。当拨打电话或发送语音信息时,用户可以通过人嘴靠近麦克风170C发声,将声音信号输入到麦克风170C。电子设备100可以设置至少一个麦克风170C。在另一些实施例中,电子设备100可以设置两个麦克风,除了采集声音信号,还可以实现降噪功能。在另一些实施例中,电 子设备100还可以设置三个,四个或更多麦克风,实现采集声音信号,降噪,还可以识别声音来源,实现定向录音功能等。
耳机接口170D用于连接有线耳机。耳机接口可以是USB接口,也可以是3.5mm的开放移动电子设备平台(open mobile terminal platform,OMTP)标准接口,美国蜂窝电信工业协会(cellular telecommunications industry association of the USA,CTIA)标准接口。
压力传感器180A用于感受压力信号,可以将压力信号转换成电信号。在一些实施例中,压力传感器180A可以设置于显示屏194。压力传感器180A的种类很多,如电阻式压力传感器,电感式压力传感器,电容式压力传感器等。电容式压力传感器可以是包括至少两个具有导电材料的平行板。当有力作用于压力传感器180A,电极之间的电容改变。电子设备100根据电容的变化确定压力的强度。当有触摸操作作用于显示屏194,电子设备100根据压力传感器180A检测所述触摸操作强度。电子设备100也可以根据压力传感器180A的检测信号计算触摸的位置。在一些实施例中,作用于相同触摸位置,但不同触摸操作强度的触摸操作,可以对应不同的操作指令。例如:当有触摸操作强度小于第一压力阈值的触摸操作作用于短消息应用图标时,执行查看短消息的指令。当有触摸操作强度大于或等于第一压力阈值的触摸操作作用于短消息应用图标时,执行新建短消息的指令。
陀螺仪传感器180B可以用于确定电子设备100的运动姿态。在一些实施例中,可以通过陀螺仪传感器180B确定电子设备100围绕三个轴(即,x,y和z轴)的角速度。陀螺仪传感器180B可以用于拍摄防抖。示例性的,当按下快门,陀螺仪传感器180B检测电子设备100抖动的角度,根据角度计算出镜头模组需要补偿的距离,让镜头通过反向运动抵消电子设备100的抖动,实现防抖。陀螺仪传感器180B还可以用于导航,体感游戏场景。
气压传感器180C用于测量气压。在一些实施例中,电子设备100通过气压传感器180C测得的气压值计算海拔高度,辅助定位和导航。
磁传感器180D包括霍尔传感器。电子设备100可以利用磁传感器180D检测翻盖皮套的开合。在一些实施例中,当电子设备100是翻盖机时,电子设备100可以根据磁传感器180D检测翻盖的开合。进而根据检测到的皮套的开合状态或翻盖的开合状态,设置翻盖自动解锁等特性。
加速度传感器180E可检测电子设备100在各个方向上(一般为三轴)加速度的大小。当电子设备100静止时可检测出重力的大小及方向。还可以用于识别电子设备姿态,应用于横竖屏切换,计步器等应用。
距离传感器180F,用于测量距离。电子设备100可以通过红外或激光测量距离。在一些实施例中,拍摄场景,电子设备100可以利用距离传感器180F测距以实现快速对焦。
接近光传感器180G可以包括例如发光二极管(LED)和光检测器,例如光电二极管。发光二极管可以是红外发光二极管。电子设备100通过发光二极管向外发射红外光。电子设备100使用光电二极管检测来自附近物体的红外反射光。当检测到充分的反射光时,可以确定电子设备100附近有物体。当检测到不充分的反射光时,电子设备100可以确定电子设备100附近没有物体。电子设备100可以利用接近光传感器180G检测用户手持电子设备100贴近耳朵通话,以便自动熄灭屏幕达到省电的目的。接近光传感器180G也可用于皮套模式,口袋模式自动解锁与锁屏。
环境光传感器180L用于感知环境光亮度。电子设备100可以根据感知的环境光亮度自适应调节显示屏194亮度。环境光传感器180L也可用于拍照时自动调节白平衡。环境 光传感器180L还可以与接近光传感器180G配合,检测电子设备100是否在口袋里,以防误触。
指纹传感器180H用于采集指纹。电子设备100可以利用采集的指纹特性实现指纹解锁,访问应用锁,指纹拍照,指纹接听来电等。
温度传感器180J用于检测温度。在一些实施例中,电子设备100利用温度传感器180J检测的温度,执行温度处理策略。例如,当温度传感器180J上报的温度超过阈值,电子设备100执行降低位于温度传感器180J附近的处理器的性能,以便降低功耗实施热保护。在另一些实施例中,当温度低于另一阈值时,电子设备100对电池142加热,以避免低温导致电子设备100异常关机。在其他一些实施例中,当温度低于又一阈值时,电子设备100对电池142的输出电压执行升压,以避免低温导致的异常关机。
触摸传感器180K,也称“触控面板”。可设置于显示屏194。用于检测作用于其上或附近的触摸操作。可以将检测到的触摸操作传递给应用处理器,以确定触摸事件类型,并通过显示屏194提供相应的视觉输出。在另一些实施例中,触摸传感器180K也可以设置于电子设备100的表面,与显示屏194所处的位置不同。
骨传导传感器180M可以获取振动信号。在一些实施例中,骨传导传感器180M可以获取人体声部振动骨块的振动信号。骨传导传感器180M也可以接触人体脉搏,接收血压跳动信号。在一些实施例中,骨传导传感器180M也可以设置于耳机中。音频模块170可以基于所述骨传导传感器180M获取的声部振动骨块的振动信号,解析出语音信号,实现语音功能。应用处理器可以基于所述骨传导传感器180M获取的血压跳动信号解析心率信息,实现心率检测功能。
按键190包括开机键,音量键等。按键可以是机械按键。也可以是触摸式按键。电子设备100可以接收按键输入,产生与电子设备100的用户设置以及功能控制有关的键信号输入。
马达191可以产生振动提示。马达191可以用于来电振动提示,也可以用于触摸振动反馈。例如,作用于不同应用(例如拍照,音频播放等)的触摸操作,可以对应不同的振动反馈效果。作用于显示屏194不同区域的触摸操作,马达191也可对应不同的振动反馈效果。不同的应用场景(例如:时间提醒,接收信息,闹钟,游戏等)也可以对应不同的振动反馈效果。触摸振动反馈效果还可以支持自定义。
指示器192可以是指示灯,可以用于指示充电状态,电量变化,也可以用于指示消息,未接来电,通知等。
SIM卡接口195用于连接用户标识模块(subscriber identity module,SIM)。SIM卡可以通过插入SIM卡接口,或从SIM卡接口拔出,实现和电子设备100的接触和分离。电子设备100可以支持1个或N个SIM卡接口,N为大于1的正整数。SIM卡接口195可以支持Nano SIM卡,Micro SIM卡,SIM卡等。同一个SIM卡接口可以同时插入多张卡。所述多张卡的类型可以相同,也可以不同。SIM卡接口195也可以兼容不同类型的SIM卡。SIM卡接口195也可以兼容外部存储卡。电子设备100通过SIM卡和网络交互,实现通话以及数据通信等功能。在一些实施例中,电子设备100采用eSIM,即:嵌入式SIM卡。eSIM卡可以嵌在电子设备100中,不能和电子设备100分离。
进一步来说,上述电子设备100的软件系统可以采用分层架构,事件驱动架构,微核架构,微服务架构,或云架构。本发明实施例以分层架构的Android系统为例,示例性说 明电子设备100的软件结构。
图3是本发明实施例的电子设备100的软件结构框图。
分层架构将软件分成若干个层,每一层都有清晰的角色和分工。层与层之间通过软件接口通信。在一些实施例中,将Android系统分为四层,从上至下分别为应用程序层,应用程序框架层,安卓运行时(Android runtime)和系统库,以及内核层。
1、应用程序层可以包括一系列应用程序包。
如图3所示,应用程序包可以包括相机,图库,日历,通话,地图,导航,WLAN,蓝牙,音乐,视频,短信息等应用程序。
2、应用程序框架层为应用程序层的应用程序提供应用编程接口(application programming interface,API)和编程框架。应用程序框架层包括一些预先定义的函数。
如图3所示,应用程序框架层可以包括窗口管理器,内容提供器,视图系统,电话管理器,资源管理器,通知管理器等。
窗口管理器用于管理窗口程序。窗口管理器可以获取显示屏大小,判断是否有状态栏,锁定屏幕,截取屏幕等。
内容提供器用来存放和获取数据,并使这些数据可以被应用程序访问。所述数据可以包括视频,图像,音频,拨打和接听的电话,浏览历史和书签,电话簿等。
视图系统包括可视控件,例如显示文字的控件,显示图片的控件等。视图系统可用于构建应用程序。显示界面可以由一个或多个视图组成的。例如,包括短信通知图标的显示界面,可以包括显示文字的视图以及显示图片的视图。
电话管理器用于提供电子设备100的通信功能。例如通话状态的管理(包括接通,挂断等)。
资源管理器为应用程序提供各种资源,比如本地化字符串,图标,图片,布局文件,视频文件等等。
通知管理器使应用程序可以在状态栏中显示通知信息,可以用于传达告知类型的消息,可以短暂停留后自动消失,无需用户交互。比如通知管理器被用于告知下载完成,消息提醒等。通知管理器还可以是以图表或者滚动条文本形式出现在系统顶部状态栏的通知,例如后台运行的应用程序的通知,还可以是以对话窗口形式出现在屏幕上的通知。例如在状态栏提示文本信息,发出提示音,电子设备振动,指示灯闪烁等。
3、Android Runtime包括核心库和虚拟机。Android runtime负责安卓系统的调度和管理。
核心库包含两部分:一部分是java语言需要调用的功能函数,另一部分是安卓的核心库。
应用程序层和应用程序框架层运行在虚拟机中。虚拟机将应用程序层和应用程序框架层的java文件执行为二进制文件。虚拟机用于执行对象生命周期的管理,堆栈管理,线程管理,安全和异常的管理,以及垃圾回收等功能。
4、系统库可以包括多个功能模块。例如:表面管理器(surface manager),媒体库(Media Libraries),三维图形处理库(例如:OpenGL ES),2D图形引擎(例如:SGL)等。
表面管理器用于对显示子系统进行管理,并且为多个应用程序提供了2D和3D图层的融合。
媒体库支持多种常用的音频,视频格式回放和录制,以及静态图像文件等。媒体库可以支持多种音视频编码格式,例如:MPEG4,H.264,MP3,AAC,AMR,JPG,PNG等。
三维图形处理库用于实现三维图形绘图,图像渲染,合成,和图层处理等。
2D图形引擎是2D绘图的绘图引擎。
5、内核层是硬件和软件之间的层。内核层至少包含显示驱动,摄像头驱动,音频驱动,传感器驱动。
下面结合捕获拍照场景,示例性说明电子设备100软件以及硬件的工作流程。
当触摸传感器180K接收到触摸操作,相应的硬件中断被发给内核层。内核层将触摸操作加工成原始输入事件(包括触摸坐标,触摸操作的时间戳等信息)。原始输入事件被存储在内核层。应用程序框架层从内核层获取原始输入事件,识别该原始输入事件所对应的控件。以该触摸操作是触摸单击操作,该单击操作所对应的控件为相机应用图标的控件为例,相机应用调用应用框架层的接口,启动相机应用,进而通过调用内核层启动摄像头驱动,通过摄像头捕获静态图像或视频。
示例性地,图4示出了Wi-Fi Display这一多屏互动技术所涉及的技术及协议框图,图中涉及的技术层面比较多,相关的协议也比较多,包括了Wi-Fi P2P(Wi-Fi点对点)技术、实时流协议(Real Time Streaming protocol,RTSP)及实时传输协议(Real-time Transport protocol,RTP)技术、流媒体技术以及音视频编解码相关的技术。Wi-Fi Display主要是基于Wi-Fi P2P技术,利用RTSP作为音频及视频流控制协议,在多屏互动过程中涉及了流媒体的传输、控制、加密、解密、编码及解码等技术流程。
具体来说,基于图4所示的协议框图原理,图1中的源设备10和目的设备20通过Wi-Fi P2P连接后,目标设备20与源设备10建立TCP连接(可以把源设备10看作为客户端,把目标设备20看作为服务器)。当建立了TCP连接后目标设备20与源设备10之间再进行RTSP协商。当协商成功后建立Wi-Fi Display会话。最终目标设备20与源设备10实现音频及视频流的传输与控制以及编码及解码等流程。另外,源设备10可以对传输的内容做加密保护(比如加密技术可以是高带宽数字内容保护技术(High-bandwidth Digital Content Protection,HDCP))。
其中,RTSP是一个应用层协议,用于控制具有实时特性的数据(例如多媒体流)的传送。RTSP一般与RTP或实时传输控制协议(Real-time Transport Control protocol,RTCP)和资源预留协议(Resource Reserve Protocol,RSVP)等底层协议一起协同工作,提供基于Internet(因特网)的整套的流服务。RTSP可以选择数据流的传输通道(例如:用户数据报协议(User Datagram Protocol,UDP)对应的通道、传输控制协议(Transmission Control Protocol,TCP)对应的通道)。
补充来说,TCP用于提供IP环境下的数据可靠传输,它提供的服务包括数据流传送、可靠性、有效流控、全双工操作和多路复用。通过面向连接、端到端和可靠的数据包发送。通俗地说,它是事先为所发送的数据开辟出连接好的通道,然后再进行数据发送。一般来说,TCP对应的是可靠性要求高的应用。TCP侧重传输的质量,但会产生时延过大的问题。UDP是一个简单的面向非连接的传输协议,也就是客户端不需要与服务端建立连接,就直接将数据发送给服务端,同时,没有机制保证这条数据已成功发送给服务端。也就是说UDP侧重传输效率,但会产生丢包的问题。
基于上述原因,在多屏互动场景下,如果源设备10和目标设备20之间使用UDP通 道进行数据传递,时延不大,但会有花屏和卡顿的情况出现;如果源设备10和目标设备20之间使用TCP进行数据传递,花屏和卡顿状况较少,但是时延较大,对用户的一致性体验造成了影响。
目前现有技术虽然会基于网络带宽情况,对传输层的通道进行切换,但是并没有结合考虑在多屏互动过程中源设备所发送的数据对应的业务类型,例如发送的数据可以是视频业务类型的编码数据、或者是游戏业务类型的编码数据、亦或者是文本业务类型的编码数据等。因不同的业务类型,对网络传输参数的影响不同,例如网络传输参数可以是时延、传输质量、传输速率、丢包率等,所以源设备可以对应地采用不同传输层协议对应的通道来传输编码数据。另外,如果源设备选择的投屏类型为同源方式(所谓同源方式,即源设备和目标设备之间的画面保持一致),则对时延的要求较高,相反,如果源设备选择的投屏类型为异源方式(所谓异源方式,即源设备和目标设备之间的画面不一致),则对时延的要求不高,所以针对不同的投屏类型,源设备也可以对应地采用不用的传输层的通道。
为此,本申请实施例提供一种多屏互动方法,该方法结合考虑源设备所分享的界面和投屏类型,动态地在不同的传输层的通道之间进行切换。在本申请实施例中,若源设备和目标设备采用的是Android系统,那么该多屏互动方法的实现可能涉及如下模块:1、Media Player Service及相关模块,主要用于RTP/RTSP传输及相应的编解码技术;2、Surface Flinger及相关模块,作用是将各层UI数据混屏并投递到显示设备中去显示;3、Window Manager Service及相关模块,用于管理系统中各个UI层的位置和属性。由于并非所有的UI层都会投屏到目标设备上。例如手机中的视频可投递到目标设备上去显示,但假如在播放过程中,突然弹出一个密码输入框(可能是某个后台应用程序发起的),为保护隐私,这个密码输入框一般不会投屏到目标设备上去显示;3、Display Manager Service及相关模块,用于管理系统中所有的显示设备;4、Wi-Fi Service及相关模块:用于建立Wi-Fi P2P连接。
下面结合附图5,对本申请实施例提供的多屏互动方法及设备进行详细说明,具体步骤如下。
步骤201、源设备检测到用户的投屏操作，通过第一传输层通道将当前显示的内容传输到目标设备。
一般地，第一传输层通道是默认的传输层通道，可以是用户手动设置的，也可以是系统预先设置的。另外，投屏操作的投屏类型可以有同源方式或者异源方式这两种方式，用户可以根据自身需求选择任意一种方式投屏。
步骤202、源设备根据当前显示的内容或投屏类型,确定第一网络传输参数。
其中,第一网络传输参数可以是上述网络传输参数中的任意一种,例如时延、或者传输质量、传输速率、丢包率等。
具体地,源设备显示的内容可以是操作系统的界面或者应用中的界面。一般地,源设备中的应用可以有文档应用、视频应用、音乐播放器、相册、游戏应用等应用类型。应用中的不同界面对应着不同类型的业务,其中,业务类型可以有文本业务类型、视频业务类型、音频业务类型、图片业务类型、游戏业务类型等。比如酷狗应用,该应用可能既有音乐专辑封面,也有歌词,还有音乐mp3文件,所以不同的界面对应的业务类型可以是文本业务类型、图片业务类型或者音频业务类型。另外,应用中的不同界面也可以对应着同一类型的业务,例如相册应用中的各个界面对应着图片业务类型。因不同的业务类型对时延、丢包率要求不同,所以可以根据当前显示的内容对应的业务类型确定第一网络传输参数, 例如,游戏业务类型对时延要求较高,则第一网络传输参数为时延。
具体而言,源设备可以根据当前显示的内容所在的应用的名称,确定业务类型,例如分享的界面所在的应用是游戏应用,那么业务类型可以是游戏业务类型。另外,源设备可以抓取界面上的关键词,确定业务类型,例如分享的界面上有歌词等关键词,可以确定业务类型可以是音频业务类型。
除此之外,源设备还可以通过预设的白名单,或者利用预先对用户交互频率的训练学习结果来确定业务类型。比如对于第三方应用来说,因为这类应用是用户通过应用商店安装的,所以源设备可以在安装应用时,从应用商店获取应用的相关信息,进而确定出该应用对应的业务类型,然后生成对应着同一种业务类型的应用的白名单。再比如说,当用户字符输入频率达到一定的阈值时,则可以确定当前分享的界面对应的业务类型是文本业务类型。
另外,投屏类型不同,对时延的要求也不同,所以源设备可以根据同源方式或异源方式确定第一网络传输参数。例如同源方式下,则第一网络传输参数为时延,异源方式下,则第一网络传输参数为丢包率。
在一种可能的实现中,源设备可以根据当前显示的内容和投屏类型,确定第一网络传输参数。也就是说,源设备结合当前显示的内容对应的业务类型,以及用户选择的投屏类型是同源方式还是异源方式,确定第一网络传输参数。例如源设备分享的界面是视频业务类型,且用户选择的是异源方式,那么可以确定选择丢包率作为第一网络传输参数,又比如说,源设备当前显示的内容是游戏业务类型,且用户选择的投屏类型是同源方式,那么可以确定选择时延作为第一网络传输参数。
步骤203、当所述源设备确定所述第一传输层通道的第一网络传输参数不满足设定条件时,源设备通过所述第二传输层通道将当前显示的内容传输到所述目标设备。
其中,第二传输层通道是根据第一网络传输参数确定的,也就是说,若第一网络传输参数是时延,则第二传输通道是时延小于第一传输层通道的通道;若第一网络传输参数是丢包率,则第二传输通道是丢包率小于第一传输层通道的通道。
另外,设定条件一般指的是丢包率小于第一阈值,或者时延小于第二阈值,又或者是传输速率大于第三阈值,或者传输质量大于第四阈值。
示例性地，上述步骤203可以理解为，当源设备确定第一传输通道的丢包率大于或者等于第一阈值，源设备从第一传输层通道切换至丢包率较小的第二传输层通道，通过第二传输层通道将当前显示的内容进行编码，并将生成的编码数据发送到目标设备，继而目标设备再进行解码并显示。或者理解为，当源设备确定第一传输通道的时延大于或者等于第二阈值，则源设备从第一传输层通道切换至时延较小的第二传输层通道，通过第二传输层通道将当前显示的内容进行编码，并将生成的编码数据发送到目标设备，继而目标设备再进行解码并显示。
需要说明的是,确定第二传输通道的过程也可以与步骤202同时进行,即源设备在确定第一网络传输参数的同时,根据当前显示的内容或投屏类型确定出第二传输通道。举例来说,如果源设备确定当前显示的内容对应的是业务类型是视频业务类型或音频业务类型,因为这类业务类型对画面或者声音顺畅要求较高,那么可以选择第二传输层通道为TCP对应的通道;如果确定当前显示的内容对应的是业务类型是游戏业务类型,因为这类业务类型对时延要求较高,那么可以选择UDP对应的通道作为第二传输层通道。除此之外, 如果确定当前显示的内容对应的业务类型是文本业务类型或图片业务类型,因为这类业务类型对时延要求并不高,对画面不卡顿要求较高,所以可以选择TCP对应的通道作为第二传输层通道。
示例性地,网络传输参数、传输层协议对应的通道、投屏类型、源设备的当前显示的内容之间的对应关系可以如表1所示。
表1
举例来说,如果源设备的投屏类型是同源方式时,因为同源方式对时延要求较高,则第一网络传输参数为时延,源设备选择UDP对应的通道作为第二传输层通道;如果源设备确定的投屏类型是异源方式时,因为异源方式对画面流畅要求较高,则第一网络传输参数为丢包率,那么源设备可以选择TCP对应的通道作为第二传输层通道。
或者,比如说,如果源设备当前显示的内容对应的视频业务类型,且用户选择的是异源方式,因为该场景对画面流畅要求较高,则源设备选择TCP对应的通道作为第二传输层通道。如果源设备确定当前显示的内容是游戏业务类型,且用户选择的是同源方式,因为该场景对时延要求较高,则源设备选择UDP对应的通道作为第二传输层通道。
需要说明的是,表1中视频传输协议(Video Transfer Protocol,VTP)可以利用丢包免恶化、丢包恢复和丢包重传等机制,保证时延小和卡顿花屏少之间的均衡。VTP采用了与TCP类似的流传输设计方案,但又以UDP为通道,在此基础上实现了保序、可靠和强抗丢包的流传输。VTP带有丢包监测机制、自适应FEC机制、Slow.Fast机制,并有增强的拥塞算法和FEC。
因VTP对应的通道可以保证时延小和卡顿花屏少之间的均衡，源设备可以将VTP对应的通道设置为第一传输层通道。也就是说，一开始，源设备通过VTP对应的通道投屏至目标设备，在投屏过程中，源设备根据当前显示的内容或投屏类型，确定第一网络传输参数是时延还是丢包率等，若第一网络传输参数是丢包率，则第二传输层通道是比VTP的丢包率小的TCP对应的通道，然后源设备确定在时延或丢包率不满足设定条件时，将VTP对应的通道切换至TCP对应的通道。结合图6来说，当源设备启动投屏功能，网络IP层连接成功之后，源设备在传输层启动VTP协议建立端到端的通道，使RTP和RTSP协议运行在VTP协议之上，然后默认在该VTP对应的通道上进行当前显示的内容对应的编码数据的传输和控制。在投屏过程中，源设备可以根据当前显示的内容或投屏类型确定第二传输层通道是TCP对应的通道，一旦丢包率不满足设定条件时，则切换至TCP对应的通道。
通过上述方法,本申请实施例可以实现投屏过程中的时延同步或者是改善画面花屏卡顿的问题,提高用户体验。
为了使本领域的技术人员更加系统地理解本申请实施例提供的技术方案,下面结合图7a至图7c,以源设备是手机,目标设备是个人计算机(personal computer,PC)为例,对上述多屏互动方法的具体流程进行举例说明。
图7a示出了手机和PC端之间的多屏互动过程,具体步骤见下方。
步骤301,手机启动多屏互动功能,先利用标准的miracast技术完成手机与PC的无线连接。
步骤302,手机选择第一传输层通道(即VTP对应的通道)将手机所显示的内容对应的编码数据发送至PC。
步骤303a,手机判断当前显示的内容是视频业务类型或者是投屏类型异源方式时,执行步骤304a至步骤307a,否则跳转至步骤303b。
步骤304a,在手机投屏过程中,根据手机当前显示的内容对应的视频业务类型或异源方式,确定第一网络传输参数是丢包率,第二传输层通道是TCP对应的通道。
步骤305a,手机在投屏过程中,监控第一传输层通道的编码数据的丢包率是否大于或等于第一阈值(例如20%),若大于或者等于第一阈值则执行步骤306a,否则跳转至步骤307a。
步骤306a,手机切换至第二传输层通道(例如TCP对应的通道),通过第二传输层通道传输手机所分享的界面对应的编码数据至PC。
步骤307a,若丢包率小于第一阈值,则手机继续利用第一传输层通道(例如VTP对应的通道)传输手机所分享的界面对应的编码数据至PC。
步骤303b,手机判断当前显示的内容是游戏业务类型或者是投屏类型同源方式时,执行步骤304b至步骤306b,否则直接跳转至步骤307b。
步骤304b,在手机投屏过程中,根据手机当前显示的内容对应的游戏业务类型或同源方式,确定第一网络传输参数是时延,第二传输层通道是UDP对应的通道。
步骤305b,手机在投屏过程中,监控第一传输层通道的编码数据的时延是否大于或等于第二阈值(例如2秒),若大于或者等于第二阈值则执行步骤306b,否则跳转至步骤307b。
步骤306b,若大于或者等于第二阈值,手机切换至第二传输层通道(例如UDP对应的通道),传输手机所分享的界面对应的编码数据至PC。
步骤307b,手机继续利用VTP对应的通道传输手机所分享的界面对应的编码数据至PC。
举例来说,如图7b所示的场景,该场景中手机将电影中的界面投屏至PC上,因手机显示的内容是视频业务类型,对时延的要求较低,但对丢包率要求较高,此时若手机监控到VTP对应的通道的丢包率大于等于20%,则自动切换至TCP对应的通道,或者手机弹出窗口通知用户手动将传输层的通道切换至TCP对应的通道。当切换到TCP对应的通道之后,如果手机第二传输层的通道的丢包率又重新低于到20%,则可以自动或者手动切换回VTP对应的通道。
具体地,手机和PC侧完成切换传输层的通道的方式可以是:手机先通知PC传输层的通道自身已切换至TCP对应的通道,然后PC收到该通知消息后,仍然从原来的端口读取数据,当读取到空数据时,则从TCP对应的通道获取数据。
示例性地,参见图7c所示的场景,该场景中手机将游戏中的界面投屏至PC上,因游戏业务类型对时延要求较高,若手机监控到VTP对应的通道的时延大于一定值,例如2 秒,那么手机自动切换至UDP对应的通道,或者手机弹出窗口通知用户手动将传输层的通道切换至UDP对应的通道。当切换到UDP对应的通道之后,如果时延又低于2秒,则可以自动或者手动切换回VTP对应的通道。其中,这种场景下,手机和PC侧完成切换传输层的通道的方式如上文所述,该处不再赘述。
在本申请另外一些实施例中,图7d示出了手机和PC端之间的多屏互动过程,具体步骤见下方。
步骤401,手机启动多屏互动功能,先利用标准的miracast技术完成手机与PC的无线连接,即RTSP协议的连接,以及认证等过程。
步骤402,手机选择第一传输层通道(即VTP对应的通道)将手机所显示的内容对应的编码数据发送至PC。
步骤403a,手机判断当前显示的内容是视频业务类型或者是投屏类型异源方式时,执行步骤404a至步骤407a,否则跳转至步骤403b。
步骤404a,手机在投屏过程中,监控第一传输层通道的编码数据的丢包率是否大于或等于第一阈值(例如20%),若大于或者等于第一阈值则执行步骤306a,否则跳转至步骤307a。
步骤405a,手机切换至丢包率比第一传输层通道小的第二传输层通道(例如TCP对应的通道),通过第二传输层通道传输手机所分享的界面对应的编码数据至PC。
步骤406a,若丢包率小于第一阈值,则手机继续利用第一传输层通道(例如VTP对应的通道)传输手机所分享的界面对应的编码数据至PC。
步骤403b,手机判断当前显示的内容是游戏业务类型或者是投屏类型同源方式时,执行步骤404b至步骤406b,否则跳转至步骤407。
步骤404b,手机在投屏过程中,监控第一传输层通道的编码数据的时延是否大于或等于第二阈值(例如2秒),若大于或者等于第二阈值则执行步骤405b,否则跳转至步骤407。
步骤405b,若大于或者等于第二阈值,手机切换至第二传输层通道(例如UDP对应的通道),传输手机所分享的界面对应的编码数据至PC。
步骤406b,手机在投屏过程中,监控第二传输层通道的编码数据的时延是否大于或等于第二阈值(例如2秒),若大于或者等于第二阈值则执行步骤405b,否则跳转至步骤407。
步骤407,手机使用VTP对应的通道传输手机所分享的界面对应的编码数据至PC。
在本申请另外一些实施例中,因手机还具有分屏功能,当源设备同时开启多屏互动功能和分屏功能时,源设备可以与第一目标设备和第二目标设备建立无线连接;当源设备检测到用户的分屏操作时,可以响应于该分屏操作,源设备将当前运行的第一应用的界面和第二应用的界面分屏显示;若源设备检测到用户的第一投屏操作,则可以响应于所述第一投屏操作,然后源设备通过第一传输层通道将第一应用的界面传输到所述第一目标设备,以及通过第二传输层通道将第二应用的界面传输到所述第二目标设备,后续根据第一应用的界面或者第二应用的界面、投屏类型等确定第三传输层通道或者第四传输层通道,在丢包率或者时延不满足条件时,切换传输层通道,具体确定第三传输层通道或者第四传输层通道的方式可以参加上文第二传输层通道的切换方式,在此不再赘述。
另外,需要说明的是,第一传输层通道、第二传输层通道可以均是默认的传输层通道,例如VTP对应的通道,也可以是不同传输层协议对应的通道。
举例来说,如图8所示,手机A同时将游戏和微信这两种应用的界面分别投射到手机 B和手机C上。比如,手机A默认使用VTP协议对应的通道将游戏界面501投屏至手机B,以及使用另一VTP协议对应的通道将微信画面502投屏至手机C,当手机A监控到手机A和手机B之间的VTP协议对应的通道中的数据时延大于2秒时,则切换为UDP对应的通道;当手机A监控到手机A和手机C之间的VTP协议对应的通道中的数据丢包率大于20%时,则切换为TCP对应的通道。
进一步来说,若源设备由检测到用户的第二投屏操作,其中,第二投屏操作指示被投屏的内容为所述第一应用的界面;则响应于所述第二投屏操作,所述源设备通过所述第三传输通道仅将所述第一应用的界面传输到所述第一目标设备。
例如,如图9所示,手机可以选择UDP对应的通道将游戏界面501投屏至PC上,当手机监控到UDP对应的通道的数据时延低于2秒时,则切换回VTP对应的通道。
可见,本申请实施例中,通过结合多屏互动过程中源设备所显示的内容以及投屏类型等因素,确定源设备所显示的内容对应的编码数据的传输层通道,当源设备所显示的内容以及投屏类型等因素对时延要求较高时,则选择时延较低的协议对应的通道,这样可以一定程度上降低时延;当源设备所显示的内容以及投屏类型等因素对画面顺畅要求较高时,则选择丢包率较低的协议对应的通道,这样可以一定程度上减少画面卡顿的问题。
本申请实施例中还提供一种计算机可读存储介质,所述计算机可读存储介质包括计算机程序,当计算机程序在电子设备上运行时,使得所述电子设备执行上述多屏互动方法任意一种可能的实现。
本申请实施例还提供一种包含计算机程序产品,当所述计算机程序产品在电子设备上运行时,使得所述电子设备执行上述多屏互动方法任意一种可能的实现。
In some embodiments of the present application, an electronic device is disclosed. As shown in FIG. 10, the electronic device is configured to implement the methods recorded in the above method embodiments, and includes a transceiver module 1001 and a processing module 1002. The transceiver module 1001 is configured to support the electronic device in performing steps 201 and 203 in FIG. 5, and the processing module 1002 is configured to support the electronic device in performing step 202 in FIG. 5. All related content of the steps involved in the above method embodiments may be incorporated by reference into the function descriptions of the corresponding functional modules; details are not repeated here.
In some other embodiments of the present application, an electronic device is disclosed. As shown in FIG. 11, the electronic device may include one or more processors 1101, a memory 1102, a display 1103, one or more applications (not shown), and one or more computer programs 1104, where the above components may be connected through one or more communication buses 1105. The one or more computer programs 1104 are stored in the memory 1102 and configured to be executed by the one or more processors 1101, and the one or more computer programs 1104 include instructions that may be used to perform the steps in FIG. 5 and the corresponding embodiments.
From the description of the above implementations, those skilled in the art can clearly understand that, for convenience and brevity of description, only the division of the above functional modules is used as an example for illustration. In practical applications, the above functions may be allocated to different functional modules as needed; that is, the internal structure of the apparatus may be divided into different functional modules to complete all or part of the functions described above. For the specific working processes of the system, apparatus, and units described above, refer to the corresponding processes in the foregoing method embodiments; details are not repeated here.
The functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units may be integrated into one unit. The integrated unit may be implemented in the form of hardware, or may be implemented in the form of a software functional unit.
If the integrated unit is implemented in the form of a software functional unit and sold or used as an independent product, it may be stored in a computer-readable storage medium. Based on such an understanding, the technical solutions of the embodiments of the present application essentially, or the part contributing to the prior art, or all or part of the technical solutions, may be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) or a processor to perform all or part of the steps of the methods described in the embodiments of the present application. The foregoing storage medium includes various media that can store program code, such as a flash memory, a removable hard disk, a read-only memory, a random access memory, a magnetic disk, or an optical disc.
The above descriptions are merely specific implementations of the embodiments of the present application, but the protection scope of the embodiments of the present application is not limited thereto. Any variation or replacement within the technical scope disclosed in the embodiments of the present application shall be covered by the protection scope of the embodiments of the present application. Therefore, the protection scope of the embodiments of the present application shall be subject to the protection scope of the claims.

Claims (17)

  1. A multi-screen interaction method, comprising:
    detecting, by a source device, a screen casting operation of a user;
    in response to the screen casting operation, transmitting, by the source device, currently displayed content to a target device through a first transport layer channel;
    determining, by the source device, a first network transmission parameter according to the content currently displayed by the source device or a screen casting type; and
    when the source device determines that the first network transmission parameter of the first transport layer channel does not meet a set condition, transmitting, by the source device, the currently displayed content to the target device through a second transport layer channel, wherein the second transport layer channel is determined by the source device according to the first network transmission parameter.
  2. The method according to claim 1, wherein the determining, by the source device, a first network transmission parameter according to the content currently displayed by the source device comprises:
    if the source device currently displays an interface of a first application, determining, by the source device according to the interface of the first application, that the first network transmission parameter is a packet loss rate, wherein the packet loss rate of the second transport layer channel is lower than that of the first transport layer channel, and the first application is an application of an audio/video service type.
  3. The method according to claim 1, wherein the determining, by the source device, the second transport layer channel according to the content currently displayed by the source device comprises:
    if the source device currently displays an interface of a second application, determining, by the source device according to the interface of the second application, that the first network transmission parameter is a delay, wherein the delay of the second transport layer channel is lower than that of the first transport layer channel, and the second application is an application of a game service type.
  4. The method according to claim 1, wherein the determining, by the source device, the second transport layer channel according to the screen casting type comprises:
    if the screen casting type detected by the source device is a homogeneous mode, determining, by the source device according to the homogeneous mode, that the first network transmission parameter is a delay, wherein the delay of the second transport layer channel is lower than that of the first transport layer channel.
  5. The method according to claim 1, wherein the determining, by the source device, the second transport layer channel according to the screen casting type comprises:
    if the screen casting type detected by the source device is a heterogeneous mode, determining, by the source device according to the heterogeneous mode, that the first network transmission parameter is a packet loss rate, wherein the packet loss rate of the second transport layer channel is lower than that of the first transport layer channel.
  6. The method according to any one of claims 1 to 5, wherein the first transport layer channel is a channel corresponding to the video transmission protocol (VTP), a second transport layer channel whose delay is lower than that of the first transport layer channel is a channel corresponding to the User Datagram Protocol (UDP), and a second transport layer channel whose packet loss rate is lower than that of the first transport layer channel is a channel corresponding to the Transmission Control Protocol (TCP).
  7. A multi-screen interaction method, comprising:
    detecting, by a source device, a split-screen operation of a user;
    in response to the split-screen operation, displaying, by the source device, an interface of a currently running first application and an interface of a second application in split-screen mode;
    detecting, by the source device, a first screen casting operation of the user, wherein the first screen casting operation indicates that content to be cast is the interface of the first application and the interface of the second application;
    in response to the first screen casting operation, transmitting, by the source device, the interface of the first application to a first target device through a first transport layer channel, and transmitting the interface of the second application to a second target device through a second transport layer channel;
    determining, by the source device, a first network transmission parameter according to the interface of the first application or a screen casting type;
    when the source device determines that the first network transmission parameter of the first transport layer channel does not meet a set condition, transmitting, by the source device, the interface of the first application to the first target device through a third transport layer channel, wherein the third transport layer channel is determined by the source device according to the first network transmission parameter;
    determining, by the source device, a second network transmission parameter according to the interface of the second application or the screen casting type; and
    when the source device determines that the second network transmission parameter of the second transport layer channel is greater than or equal to a second threshold, transmitting, by the source device, the interface of the second application to the second target device through a fourth transport layer channel, wherein the fourth transport layer channel is determined by the source device according to the second network transmission parameter.
  8. The method according to claim 7, further comprising:
    detecting, by the source device, a second screen casting operation of the user, wherein the second screen casting operation indicates that the content to be cast is only the interface of the first application; and
    in response to the second screen casting operation, stopping, by the source device, transmitting the interface of the second application to the second target device through the fourth transport layer channel.
  9. An electronic device, comprising a processor and a memory, wherein
    the memory is configured to store one or more computer programs; and
    when the one or more computer programs stored in the memory are executed by the processor, the electronic device is caused to perform the following:
    detecting a screen casting operation of a user;
    in response to the screen casting operation, transmitting currently displayed content to a target device through a first transport layer channel;
    determining a first network transmission parameter according to the currently displayed content or a screen casting type; and
    when determining that the first network transmission parameter of the first transport layer channel does not meet a set condition, transmitting the currently displayed content to the target device through a second transport layer channel, wherein the second transport layer channel is determined according to the first network transmission parameter.
  10. The electronic device according to claim 9, wherein when the one or more computer programs stored in the memory are executed by the processor, the electronic device is further caused to perform:
    if the currently displayed content is an interface of a first application, determining, according to the interface of the first application, that the first network transmission parameter is a packet loss rate, wherein the packet loss rate of the second transport layer channel is lower than that of the first transport layer channel, and the first application is an application of an audio/video service type.
  11. The electronic device according to claim 9, wherein when the one or more computer programs stored in the memory are executed by the processor, the electronic device is further caused to perform:
    if the currently displayed content is an interface of a second application, determining, according to the interface of the second application, that the first network transmission parameter is a delay, wherein the delay of the second transport layer channel is lower than that of the first transport layer channel, and the second application is an application of a game service type.
  12. The electronic device according to claim 9, wherein when the one or more computer programs stored in the memory are executed by the processor, the electronic device is further caused to perform:
    if the detected screen casting type is a homogeneous mode, determining, according to the homogeneous mode, that the first network transmission parameter is a delay, wherein the delay of the second transport layer channel is lower than that of the first transport layer channel.
  13. The electronic device according to claim 9, wherein when the one or more computer programs stored in the memory are executed by the processor, the electronic device is further caused to perform:
    if the detected screen casting type is a heterogeneous mode, determining, according to the heterogeneous mode, that the first network transmission parameter is a packet loss rate, wherein the packet loss rate of the second transport layer channel is lower than that of the first transport layer channel.
  14. The electronic device according to any one of claims 9 to 13, wherein the first transport layer channel is a channel corresponding to the video transmission protocol (VTP), a second transport layer channel whose delay is lower than that of the first transport layer channel is a channel corresponding to the User Datagram Protocol (UDP), and a second transport layer channel whose packet loss rate is lower than that of the first transport layer channel is a channel corresponding to the Transmission Control Protocol (TCP).
  15. An electronic device, comprising a processor and a memory, wherein
    the memory is configured to store one or more computer programs; and
    when the one or more computer programs stored in the memory are executed by the processor, the electronic device is caused to perform the following steps:
    detecting a split-screen operation of a user;
    in response to the split-screen operation, displaying an interface of a currently running first application and an interface of a second application in split-screen mode;
    detecting a first screen casting operation of the user, wherein the first screen casting operation indicates that content to be cast is the interface of the first application and the interface of the second application;
    in response to the first screen casting operation, transmitting the interface of the first application to a first target device through a first transport layer channel, and transmitting the interface of the second application to a second target device through a second transport layer channel;
    determining a first network transmission parameter according to the interface of the first application or a screen casting type;
    when determining that the first network transmission parameter of the first transport layer channel does not meet a set condition, transmitting the interface of the first application to the first target device through a third transport layer channel, wherein the third transport layer channel is determined according to the first network transmission parameter;
    determining a second network transmission parameter according to the interface of the second application or the screen casting type; and
    when determining that the second network transmission parameter of the second transport layer channel is greater than or equal to a second threshold, transmitting the interface of the second application to the second target device through a fourth transport layer channel, wherein the fourth transport layer channel is determined according to the second network transmission parameter.
  16. The electronic device according to claim 15, wherein when the one or more computer programs stored in the memory are executed by the processor, the electronic device is further caused to perform:
    detecting a second screen casting operation of the user, wherein the second screen casting operation indicates that the content to be cast is only the interface of the first application; and
    in response to the second screen casting operation, stopping transmitting the interface of the second application to the second target device through the fourth transport layer channel.
  17. A computer storage medium, wherein the computer-readable storage medium comprises a computer program, and when the computer program runs on an electronic device, the electronic device is caused to perform the method according to any one of claims 1 to 8.
PCT/CN2018/096038 2018-07-17 2018-07-17 Multi-screen interaction method and device WO2020014880A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/CN2018/096038 WO2020014880A1 (zh) 2018-07-17 2018-07-17 Multi-screen interaction method and device
CN201880072055.XA CN111316598B (zh) 2018-07-17 2018-07-17 Multi-screen interaction method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2018/096038 WO2020014880A1 (zh) 2018-07-17 2018-07-17 Multi-screen interaction method and device

Publications (1)

Publication Number Publication Date
WO2020014880A1 true WO2020014880A1 (zh) 2020-01-23

Family

ID=69163573

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/096038 WO2020014880A1 (zh) 2018-07-17 2018-07-17 Multi-screen interaction method and device

Country Status (2)

Country Link
CN (1) CN111316598B (zh)
WO (1) WO2020014880A1 (zh)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112714354A (zh) Multi-screen interaction method and apparatus
WO2022089271A1 (zh) Wireless screen projection method, mobile device, and computer-readable storage medium

Families Citing this family (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111787377B (zh) Display device and screen projection method
CN112055251A (zh) Media data playback method, apparatus, device, and storage medium
CN112328344B (zh) Screen projection processing method and first device
CN112506080A (zh) Remote control method and apparatus for an analysis device, computer device, and storage medium
CN114697733B (zh) Transmission method for screen-casting audio and video data, and related device
CN112861638A (zh) Screen projection method and apparatus
CN115080103A (zh) Method for synchronizing software features between devices, and electronic device
CN113242463B (zh) Method for enhancing screen-casting interaction capability through extended parameters
CN112994957B (zh) Network management method for intelligent extended screen casting and interaction, and storage medium
CN113535104B (zh) Virtual machine-based multi-screen display switching method and apparatus
CN114286402B (zh) Channel switching method, electronic device, and storage medium
CN113572836B (zh) Data transmission method, apparatus, server, and storage medium
CN113726817B (zh) Streaming media data transmission method, apparatus, and medium
CN115525453B (zh) Method for handling multi-screen collaboration interruption, and electronic device
CN114268936B (zh) Data transmission method and apparatus
CN117597888A (zh) Cross-device network management method and apparatus, and electronic device
CN115460442B (zh) Screen projection switching method and apparatus, electronic device, readable storage medium, and vehicle
CN117676481A (zh) Content collaboration method, apparatus, system, storage medium, and electronic device
CN116709417B (zh) Temperature control method and related device

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102802048A (zh) Multi-screen interaction system and method
CN104836672A (zh) Method, apparatus, system, and terminal device for data transmission in multi-screen interaction
CN104978156A (zh) Multi-screen display method and multi-screen display processing apparatus

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102791045B (zh) Mobile communication terminal and method for automatically selecting a preferred transmission protocol thereof
KR102264806B1 (ko) Method and apparatus for providing a screen mirroring service
CN106959796A (zh) Mobile terminal screen display method and apparatus

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102802048A (zh) Multi-screen interaction system and method
CN104836672A (zh) Method, apparatus, system, and terminal device for data transmission in multi-screen interaction
CN104978156A (zh) Multi-screen display method and multi-screen display processing apparatus

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022089271A1 (zh) Wireless screen projection method, mobile device, and computer-readable storage medium
CN112714354A (zh) Multi-screen interaction method and apparatus
CN112714354B (zh) Multi-screen interaction method and apparatus

Also Published As

Publication number Publication date
CN111316598B (zh) 2021-08-31
CN111316598A (zh) 2020-06-19

Similar Documents

Publication Publication Date Title
WO2020014880A1 (zh) Multi-screen interaction method and device
WO2020238871A1 (zh) Screen projection method and system, and related apparatus
CN113542839B (zh) Screen projection method for electronic device, and electronic device
JP7235871B2 (ja) Data transmission method and electronic device
WO2021000807A1 (zh) Method and apparatus for processing waiting scenes in an application
CN113923230B (zh) Data synchronization method, electronic device, and computer-readable storage medium
WO2021185244A1 (zh) Device interaction method and electronic device
WO2021175300A1 (zh) Data transmission method and apparatus, electronic device, and readable storage medium
WO2022121775A1 (zh) Screen projection method and device
WO2022105445A1 (zh) Browser-based application screen projection method and related apparatus
WO2021083128A1 (zh) Sound processing method and apparatus
KR102491006B1 (ko) Data transmission method and electronic device
WO2022042770A1 (zh) Method for controlling communication service state, terminal device, and readable storage medium
US11665274B2 (en) Call method and apparatus
CN114040242A (zh) Screen projection method and electronic device
US20240045643A1 (en) Codec negotiation and switching method
CN115237359A (zh) Screen projection display parameter adjustment method
WO2022222691A1 (zh) Call processing method and related device
WO2021052388A1 (zh) Video communication method and video communication apparatus
WO2022161006A1 (zh) Co-photographing method and apparatus, electronic device, and readable storage medium
WO2021218544A1 (zh) System and method for providing wireless internet access, and electronic device
WO2023093778A1 (zh) Screenshot method and related apparatus
CN113271577B (zh) Media data playback system and method, and related apparatus
WO2024022307A1 (zh) Screen projection method and electronic device
CN114153531A (zh) Method and apparatus for managing Internet of Things devices

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18926609

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18926609

Country of ref document: EP

Kind code of ref document: A1