WO2020244492A1 - A Screen Projection Display Method and Electronic Device - Google Patents

A Screen Projection Display Method and Electronic Device

Info

Publication number
WO2020244492A1
Authority
WO
WIPO (PCT)
Prior art keywords
interface
view information
target control
drawing instruction
electronic device
Prior art date
Application number
PCT/CN2020/093872
Other languages
English (en)
French (fr)
Inventor
范振华
曹原
卞苏成
杨婉艺
李鹏程
魏曦
Original Assignee
华为技术有限公司 (Huawei Technologies Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 华为技术有限公司 (Huawei Technologies Co., Ltd.)
Priority to US 17/616,833 (granted as US11907604B2)
Priority to EP 20818738.5 (published as EP3951585A4)
Publication of WO2020244492A1

Classifications

    • G06F3/1454 — Digital output to display device; cooperation and interconnection of the display device with other functional units, involving copying of the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay
    • G06F3/0481 — Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/1438 — Digital output to display device; controlling a plurality of local displays, e.g. CRT and flat panel display, using more than one graphics controller
    • G06F9/451 — Execution arrangements for user interfaces
    • H04M1/72409 — User interfaces specially adapted for cordless or mobile telephones, with means for local support of applications that increase the functionality by interfacing with external accessories
    • H04M1/724095 — Interfacing with a device worn on the user's wrist, hand or arm to provide access to telephonic functionalities, e.g. accepting a call, reading or composing a message
    • H04M1/72412 — User interfaces specially adapted for cordless or mobile telephones, interfacing with external accessories using two-way short-range wireless interfaces
    • G09G2340/14 — Solving problems related to the presentation of information to be displayed
    • G09G2370/20 — Details of the management of multiple sources of image data

Definitions

  • This application relates to the field of terminal technology, and in particular to a screen projection display method and an electronic device.
  • An electronic device can switch and display content among multiple devices by means such as screen projection.
  • For example, the user may send the display content of a mobile phone (that is, the source device) to another destination device that supports the screen projection function for display.
  • For example, the display interface of a mobile phone can be projected to a smart TV for display. Later, if the user wants to watch content from a tablet computer, the user has to disconnect the mobile phone from the smart TV, set the tablet computer as the new source device, and then project the tablet computer's display interface to the smart TV in place of the mobile phone's. Obviously, switching back and forth among multiple source devices for screen projection in this way is cumbersome and degrades the user experience.
  • To address this, the present application provides a screen projection display method and an electronic device, so that a destination device can display the content of multiple source devices at the same time, thereby improving the display effect and user experience of screen projection among multiple devices.
  • In a first aspect, the present application provides a screen projection display method, including: a destination device receives a first message sent by a first source device, where the first message includes a first drawing instruction for drawing one or more first target controls, each first target control being a control in a first interface displayed by the first source device; the destination device also receives a second message sent by a second source device, where the second message includes a second drawing instruction for drawing one or more second target controls, each second target control being a control in a second interface displayed by the second source device; the destination device then draws a screen projection interface according to the first drawing instruction and the second drawing instruction, and the drawn projection interface includes the first target control and the second target control.
  • In this way, the destination device displays not only the first target control that the first source device needs to project, but also the second target control that the second source device needs to project, so that the user can view the display content of multiple source devices on one destination device without switching among the source devices, thereby improving the display effect and user experience of screen projection among multiple devices.
  • In a possible design, the foregoing first message may also include first view information (for example, a first view tree) that records the layer order of the first target control in the projection interface; similarly, the second message may also include second view information (for example, a second view tree) that records the layer order of the second target control in the projection interface. In this case, drawing the projection interface according to the first drawing instruction and the second drawing instruction specifically includes: the destination device generates third view information from the first view information and the second view information, where the third view information records the layer order of the first target control and the second target control in the projection interface; the destination device then executes the first drawing instruction and the second drawing instruction in the layer order given by the third view information to draw the projection interface.
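The merge of the two pieces of view information can be sketched in a few lines. This is a minimal illustration, not the patent's implementation: it assumes each source sends a flat list of controls ordered by a per-source layer index, and simply stacks the second source's controls above the first source's; the names `Control` and `merge_view_info` are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Control:
    source: str      # which source device the control came from
    name: str        # control identifier, e.g. "video"
    layer: int       # layer order within its own source interface

def merge_view_info(first_view, second_view):
    """Generate 'third view information': one layer order covering the
    controls of both source devices in the projection interface."""
    # Keep each source's internal layer order; stack the second source's
    # controls above the first source's by offsetting its layer indices.
    offset = max((c.layer for c in first_view), default=-1) + 1
    merged = list(first_view)
    merged.extend(Control(c.source, c.name, c.layer + offset) for c in second_view)
    return sorted(merged, key=lambda c: c.layer)

first_view = [Control("phone", "title", 0), Control("phone", "video", 1)]
second_view = [Control("watch", "heart_rate", 0)]
third_view = merge_view_info(first_view, second_view)
print([c.name for c in third_view])  # ['title', 'video', 'heart_rate']
```

A real implementation could instead interleave the layers per a configuration file, as the designs below describe; the offset strategy here is only the simplest consistent choice.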
  • In a possible design, before the destination device generates the third view information according to the first view information and the second view information, the method further includes: the destination device obtains a configuration file corresponding to both the first interface and the second interface, and the configuration file records the first display position of the first target control in the projection interface as well as the second display position of the second target control in the projection interface. In this case, generating the third view information specifically includes: the destination device splits and reorganizes the controls in the first view information and the second view information according to the obtained configuration file, to obtain the third view information.
  • In another possible design, the method further includes: the destination device obtains a first configuration file corresponding to the first interface, where the first configuration file records the first display position of the first target control in the projection interface; and the destination device separately obtains a second configuration file corresponding to the second interface, where the second configuration file records the second display position of the second target control in the projection interface. In this case, generating the third view information according to the first view information and the second view information specifically includes: the destination device splits and reorganizes the controls in the first view information and the second view information according to the first configuration file and the second configuration file, to obtain the third view information.
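A hedged sketch of the configuration-file-driven split-and-reorganize step: assuming each configuration file maps a control name to its display position in the projection interface, controls absent from the file are filtered out and the survivors are assigned positions and a merged layer order. The dict-based data model and the function name `reorganize` are illustrative assumptions, not from the patent.

```python
def reorganize(first_view, second_view, first_config, second_config):
    """Split and reorganize the controls of two view infos according to
    per-interface configuration files, yielding third view information."""
    third_view = []
    pairs = [(c, first_config) for c in first_view] + \
            [(c, second_config) for c in second_view]
    for control, config in pairs:
        if control["name"] in config:          # only configured controls project
            x, y = config[control["name"]]     # display position in the projection UI
            third_view.append({**control, "x": x, "y": y})
    # Assign the merged layer order; a real implementation would honor the
    # per-source view-tree order rather than simple arrival order.
    for layer, control in enumerate(third_view):
        control["layer"] = layer
    return third_view

first_view = [{"name": "video"}, {"name": "ad_banner"}]
second_view = [{"name": "notification"}]
first_config = {"video": (0, 0)}               # ad_banner is filtered out
second_config = {"notification": (0, 720)}
result = reorganize(first_view, second_view, first_config, second_config)
print([c["name"] for c in result])  # ['video', 'notification']
```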
  • In a possible design, the destination device executing the first drawing instruction and the second drawing instruction to draw the projection interface according to the third view information specifically includes: the destination device executes the first drawing instruction at the first display position, following the layer order of the first target control in the third view information, to draw the first target control; and the destination device executes the second drawing instruction at the second display position, following the layer order of the second target control in the third view information, to draw the second target control.
  • In a possible design, the display position of the first target control in the first interface and in the projection interface may be the same or different; similarly, the display position of the second target control in the second interface and in the projection interface may be the same or different.
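The destination device's drawing step might be sketched as follows, under the simplifying assumption that each source-sent drawing instruction can be modeled as a callable taking a display position (on Android these would be recorded canvas operations). `draw_projection_interface` and the frame list are illustrative names only.

```python
def draw_projection_interface(third_view, instructions):
    """Replay each source-sent drawing instruction at its configured
    position, in the layer order given by the third view information."""
    frame = []  # stand-in for the destination device's canvas
    for control in sorted(third_view, key=lambda c: c["layer"]):
        draw = instructions[control["name"]]   # instruction sent by a source
        frame.append(draw(control["x"], control["y"]))
    return frame

third_view = [
    {"name": "video", "x": 0, "y": 0, "layer": 0},
    {"name": "heart_rate", "x": 800, "y": 0, "layer": 1},
]
instructions = {
    "video": lambda x, y: f"video drawn at ({x}, {y})",
    "heart_rate": lambda x, y: f"heart_rate drawn at ({x}, {y})",
}
frame = draw_projection_interface(third_view, instructions)
print(frame)  # ['video drawn at (0, 0)', 'heart_rate drawn at (800, 0)']
```

Because the loop iterates in layer order, a control with a higher layer index is drawn later and therefore appears on top, which is the property the third view information exists to guarantee.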
  • In a possible design, before the destination device receives the first message sent by the first source device, the method further includes: the destination device displays a third interface. In this case, the projection interface drawn by the destination device may also include one or more third target controls from the third interface. That is, during screen projection the destination device can simultaneously display the content of multiple source devices as well as its own content, so that the user can view the display content of multiple devices on one device.
  • In a second aspect, the present application provides a screen projection display method, including: a source device displays a first display interface; the source device receives a screen projection instruction from the user for projecting the first display interface to a destination device; in response to the screen projection instruction, the source device determines one or more first target controls in the first display interface that need to be projected to the destination device; the source device then sends a first message to the destination device, where the first message includes the first drawing instruction of the first target control, so that the destination device can draw the first target control in the projection interface according to the first drawing instruction.
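The source-side flow of the second aspect can be sketched as below. The whitelist of projectable controls, the message layout, and the `send` callback are all illustrative assumptions; the patent only specifies that the source determines the first target controls and sends a message containing their drawing instructions.

```python
PROJECTABLE = {"video", "subtitle"}   # hypothetical set of controls worth projecting

def on_projection_instruction(display_interface, send):
    """On a user's projection instruction, pick the target controls of the
    current interface and send their drawing instructions (plus view
    information with layer order) to the destination device."""
    targets = [c for c in display_interface if c["name"] in PROJECTABLE]
    first_message = {
        "draw_instructions": {c["name"]: c["draw_op"] for c in targets},
        "view_info": [{"name": c["name"], "layer": i}
                      for i, c in enumerate(targets)],
    }
    send(first_message)               # e.g. over the established Wi-Fi connection
    return first_message

interface = [
    {"name": "video", "draw_op": "drawBitmap(...)"},
    {"name": "status_bar", "draw_op": "drawRect(...)"},   # not projected
    {"name": "subtitle", "draw_op": "drawText(...)"},
]
sent = []
msg = on_projection_instruction(interface, sent.append)
print(sorted(msg["draw_instructions"]))  # ['subtitle', 'video']
```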
  • In a third aspect, this application provides an electronic device, which may be the aforementioned source device or destination device.
  • The electronic device includes: a touch screen, a communication module, one or more processors, one or more memories, and one or more computer programs, where the processor is coupled to the communication module, the touch screen, and the memory; the one or more computer programs are stored in the memory, and when the electronic device runs, the processor executes the computer programs stored in the memory so that the electronic device performs the screen projection display method described in any one of the above.
  • In a fourth aspect, the present application provides a computer storage medium including computer instructions that, when run on an electronic device, cause the electronic device to execute the screen projection display method according to any one of the first aspect.
  • In a fifth aspect, this application provides a computer program product that, when run on an electronic device, causes the electronic device to execute the screen projection display method described in any one of the first aspect.
  • The electronic device described in the third aspect, the computer storage medium described in the fourth aspect, and the computer program product described in the fifth aspect are all used to execute the corresponding methods provided above; for the beneficial effects they can achieve, refer to the beneficial effects of the corresponding methods, which are not repeated here.
  • FIG. 1 is a schematic architecture diagram of a communication system provided by an embodiment of this application;
  • FIG. 2 is a first schematic structural diagram of an electronic device provided by an embodiment of this application;
  • FIG. 3 is a schematic structural diagram of an operating system in an electronic device provided by an embodiment of this application;
  • FIG. 4 is a first schematic diagram of an application scenario of a screen projection display method provided by an embodiment of this application;
  • FIG. 5 is a second schematic diagram of an application scenario of a screen projection display method provided by an embodiment of this application;
  • FIG. 6 is a third schematic diagram of an application scenario of a screen projection display method provided by an embodiment of this application;
  • FIG. 7 is a fourth schematic diagram of an application scenario of a screen projection display method provided by an embodiment of this application;
  • FIG. 8 is a fifth schematic diagram of an application scenario of a screen projection display method provided by an embodiment of this application;
  • FIG. 9 is a sixth schematic diagram of an application scenario of a screen projection display method provided by an embodiment of this application;
  • FIG. 10 is a seventh schematic diagram of an application scenario of a screen projection display method provided by an embodiment of this application;
  • FIG. 11 is an eighth schematic diagram of an application scenario of a screen projection display method provided by an embodiment of this application;
  • FIG. 12 is a ninth schematic diagram of an application scenario of a screen projection display method provided by an embodiment of this application;
  • FIG. 13 is a tenth schematic diagram of an application scenario of a screen projection display method provided by an embodiment of this application;
  • FIG. 14 is an eleventh schematic diagram of an application scenario of a screen projection display method provided by an embodiment of this application;
  • FIG. 15 is a twelfth schematic diagram of an application scenario of a screen projection display method provided by an embodiment of this application;
  • FIG. 16 is a thirteenth schematic diagram of an application scenario of a screen projection display method provided by an embodiment of this application;
  • FIG. 17 is a second schematic structural diagram of an electronic device provided by an embodiment of this application.
  • a projection display method provided by an embodiment of the present application may be applied to a communication system 100, and the communication system 100 may include N (N>1) electronic devices.
  • the communication system 100 may include an electronic device 101 and an electronic device 102.
  • the electronic device 101 may be connected to the electronic device 102 through one or more communication networks 104.
  • The communication network 104 may be a wired network or a wireless network.
  • the aforementioned communication network 104 may be a local area network (local area networks, LAN), or a wide area network (wide area networks, WAN), such as the Internet.
  • the communication network 104 can be implemented using any known network communication protocol.
  • The above-mentioned network communication protocol may be any of a variety of wired or wireless communication protocols, such as Ethernet, universal serial bus (USB), FireWire, global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), Bluetooth, wireless fidelity (Wi-Fi), near field communication (NFC), voice over Internet protocol (VoIP), a communication protocol supporting a network slicing architecture, or any other suitable communication protocol.
  • the electronic device 101 may establish a Wi-Fi connection with the electronic device 102 through a Wi-Fi protocol.
  • the electronic device 101 may be used as a destination device, and the electronic device 102 may be used as a source device of the electronic device 101.
  • the electronic device 102 can project the display content in its display interface to the electronic device 101, and the electronic device 101 as a target device can display the display content sent by the electronic device 102 in its own display interface.
  • the above-mentioned communication system 100 may further include an electronic device 103, for example, the electronic device 103 may be a wearable device.
  • the electronic device 103 can also be used as the source device of the electronic device 101, and the display content in its display interface can also be projected to the electronic device 101.
  • The electronic device 101 can also display, on its own display interface, the display content sent by the electronic device 103.
  • In the embodiments of this application, a destination device (such as the aforementioned electronic device 101) can simultaneously receive display content from multiple source devices (such as the aforementioned electronic device 102 and electronic device 103), and can present the display content from the different source devices in its own display interface at the same time.
  • the electronic device 101 is still used as the destination device, and the electronic device 102 and the electronic device 103 are used as source devices.
  • For example, the electronic device 102 can identify each control in its display interface (for example, the first interface) and determine one or more controls that need to be projected to the electronic device 101 this time (hereinafter referred to as the first target control). The electronic device 102 can then send the first target control and the related drawing instructions to the electronic device 101.
  • the electronic device 103 can also recognize each control in its own display interface (for example, the second interface), and determine one or more controls that need to be projected to the electronic device 101 for display this time (hereinafter referred to as the second target control). Furthermore, the electronic device 103 can send the second target control and related drawing instructions to the electronic device 101.
  • the electronic device 101 can call the drawing instruction sent by the electronic device 102 to draw the first target control on its display interface.
  • the electronic device 101 can also call the drawing instruction sent by the electronic device 103 to draw the second target control on its own display interface.
  • the electronic device 101 can finally display a screen projection interface including the first target control and the second target control.
  • The screen projection interface displays not only the content that the first source device (i.e., the electronic device 102) needs to project, but also the content that the second source device (i.e., the electronic device 103) needs to project.
  • In this way, the destination device can display the content of multiple source devices at the same time, so that the user can view the display content of multiple source devices on one destination device without having to switch back and forth among the source devices.
  • any one or more electronic devices in the aforementioned communication system 100 can be used as a source device or a destination device, which is not limited in the embodiment of the present application.
  • the specific structures of the electronic device 101, the electronic device 102, and the electronic device 103 may be the same or different.
  • Each of the above electronic devices may specifically be a mobile phone, a tablet computer, a smart TV, a wearable electronic device, an in-vehicle device, a notebook computer, an ultra-mobile personal computer (UMPC), a handheld computer, a netbook, a personal digital assistant (PDA), a virtual reality device, or the like.
  • the embodiments of the present application do not make any restrictions on this.
  • FIG. 2 shows a schematic structural diagram of the electronic device 101.
  • The electronic device 101 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a camera 193, a display screen 194, and the like.
  • The structure illustrated in this embodiment of the present application does not constitute a specific limitation on the electronic device 101.
  • the electronic device 101 may include more or fewer components than shown in the figure, or combine certain components, or split certain components, or arrange different components.
  • the illustrated components can be implemented in hardware, software, or a combination of software and hardware.
  • the processor 110 may include one or more processing units.
  • For example, the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc.
  • the different processing units may be independent devices or integrated in one or more processors.
  • a memory may also be provided in the processor 110 to store instructions and data.
  • the memory in the processor 110 is a cache memory.
  • The memory can store instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs the instructions or data again, it can call them directly from this memory, avoiding repeated accesses and reducing the waiting time of the processor 110, thereby improving system efficiency.
  • the processor 110 may include one or more interfaces.
  • The interface may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, etc.
  • the charging management module 140 is used to receive charging input from the charger.
  • the charger can be a wireless charger or a wired charger.
  • the charging management module 140 may receive the charging input of the wired charger through the USB interface 130.
  • the charging management module 140 may receive the wireless charging input through the wireless charging coil of the electronic device 101. While the charging management module 140 charges the battery 142, it can also supply power to the electronic device through the power management module 141.
  • the power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110.
  • the power management module 141 receives input from the battery 142 and/or the charging management module 140, and supplies power to the processor 110, the internal memory 121, the display screen 194, the camera 193, and the wireless communication module 160.
  • the power management module 141 can also be used to monitor parameters such as battery capacity, battery cycle times, and battery health status (leakage, impedance).
  • the power management module 141 may also be provided in the processor 110.
  • the power management module 141 and the charging management module 140 may also be provided in the same device.
  • the wireless communication function of the electronic device 101 can be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modem processor, and the baseband processor.
  • the antenna 1 and the antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in the electronic device 101 can be used to cover a single or multiple communication frequency bands. Different antennas can also be reused to improve antenna utilization.
  • antenna 1 can be multiplexed as a diversity antenna of a wireless local area network.
  • the antenna can be used in combination with a tuning switch.
  • The mobile communication module 150 can provide wireless communication solutions applied to the electronic device 101, including 2G/3G/4G/5G and the like.
  • the mobile communication module 150 may include one or more filters, switches, power amplifiers, low noise amplifiers (LNA), etc.
  • The mobile communication module 150 can receive electromagnetic waves via the antenna 1, perform processing such as filtering and amplification on the received electromagnetic waves, and then transmit them to the modem processor for demodulation.
  • the mobile communication module 150 can also amplify the signal modulated by the modem processor, and convert it into electromagnetic waves for radiation via the antenna 1.
  • at least part of the functional modules of the mobile communication module 150 may be provided in the processor 110.
  • at least part of the functional modules of the mobile communication module 150 and at least part of the modules of the processor 110 may be provided in the same device.
  • the modem processor may include a modulator and a demodulator.
  • the modulator is used to modulate the low frequency baseband signal to be sent into a medium and high frequency signal.
  • the demodulator is used to demodulate the received electromagnetic wave signal into a low-frequency baseband signal. Then the demodulator transmits the demodulated low-frequency baseband signal to the baseband processor for processing.
  • the low-frequency baseband signal is processed by the baseband processor and then passed to the application processor.
  • the application processor outputs sound signals through audio devices (not limited to the speaker 170A, the receiver 170B, etc.), or displays images or videos through the display screen 194.
  • the modem processor may be an independent device. In other embodiments, the modem processor may be independent of the processor 110 and be provided in the same device as the mobile communication module 150 or other functional modules.
  • the wireless communication module 160 can provide wireless communication solutions applied to the electronic device 101, including wireless local area networks (WLAN) (such as wireless fidelity (Wi-Fi) networks), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), and the like.
  • the wireless communication module 160 may be one or more devices integrating one or more communication processing modules.
  • the wireless communication module 160 receives electromagnetic waves via the antenna 2, frequency modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 110.
  • the wireless communication module 160 can also receive the signal to be sent from the processor 110, perform frequency modulation, amplify it, and convert it into electromagnetic wave radiation via the antenna 2.
  • the antenna 1 of the electronic device 101 is coupled with the mobile communication module 150, and the antenna 2 is coupled with the wireless communication module 160, so that the electronic device 101 can communicate with the network and other devices through wireless communication technology.
  • the wireless communication technologies may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), broadband Code division multiple access (wideband code division multiple access, WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC , FM, and/or IR technology, etc.
  • the GNSS may include the global positioning system (GPS), the global navigation satellite system (GLONASS), the Beidou navigation satellite system (BDS), the quasi-zenith satellite system (QZSS), and/or satellite-based augmentation systems (SBAS).
  • the electronic device 101 implements a display function through a GPU, a display screen 194, and an application processor.
  • the GPU is a microprocessor for image processing, connected to the display 194 and the application processor.
  • the GPU is used to perform mathematical and geometric calculations for graphics rendering.
  • the processor 110 may include one or more GPUs, which execute program instructions to generate or change display information.
  • the display screen 194 is used to display images, videos, etc.
  • the display screen 194 includes a display panel.
  • the display panel can adopt a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a MiniLED, a MicroLED, a Micro-OLED, a quantum dot light-emitting diode (QLED), and the like.
  • the electronic device 101 may include one or N display screens 194, and N is a positive integer greater than one.
  • the electronic device 101 can realize a shooting function through an ISP, a camera 193, a video codec, a GPU, a display screen 194, and an application processor.
  • the ISP is used to process the data fed back from the camera 193. For example, when taking a picture, the shutter is opened, the light is transmitted to the photosensitive element of the camera through the lens, the light signal is converted into an electrical signal, and the photosensitive element of the camera transfers the electrical signal to the ISP for processing and is converted into an image visible to the naked eye.
  • ISP can also optimize the image noise, brightness, and skin color. ISP can also optimize the exposure, color temperature and other parameters of the shooting scene.
  • the ISP may be provided in the camera 193.
  • the camera 193 is used to capture still images or videos.
  • the object generates an optical image through the lens and projects it to the photosensitive element.
  • the photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor.
  • the photosensitive element converts the optical signal into an electrical signal, and then transmits the electrical signal to the ISP to convert it into a digital image signal.
  • ISP outputs digital image signals to DSP for processing.
  • DSP converts digital image signals into standard RGB, YUV and other formats.
  • the electronic device 101 may include 1 or N cameras 193, and N is a positive integer greater than 1.
  • Digital signal processors are used to process digital signals. In addition to digital image signals, they can also process other digital signals. For example, when the electronic device 101 selects the frequency point, the digital signal processor is used to perform Fourier transform on the energy of the frequency point.
  • Video codecs are used to compress or decompress digital video.
  • the electronic device 101 may support one or more video codecs. In this way, the electronic device 101 can play or record videos in a variety of encoding formats, such as: moving picture experts group (MPEG) 1, MPEG2, MPEG3, MPEG4, and so on.
  • the external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the electronic device 101.
  • the external memory card communicates with the processor 110 through the external memory interface 120 to realize the data storage function. For example, save music, video and other files in an external memory card.
  • the internal memory 121 may be used to store one or more computer programs, and the one or more computer programs include instructions.
  • the processor 110 can run the above-mentioned instructions stored in the internal memory 121 to enable the electronic device 101 to execute the projection display method provided in some embodiments of the present application, as well as various functional applications and data processing.
  • the internal memory 121 may include a storage program area and a storage data area. Among them, the storage program area can store the operating system; the storage program area can also store one or more application programs (such as a gallery, contacts, etc.) and so on.
  • the data storage area can store data (such as photos, contacts, etc.) created during the use of the electronic device 101.
  • the internal memory 121 may include a high-speed random access memory, and may also include a non-volatile memory, such as one or more magnetic disk storage devices, flash memory devices, universal flash storage (UFS), etc.
  • the processor 110 executes the instructions stored in the internal memory 121 and/or the instructions stored in the memory provided in the processor to cause the electronic device 101 to execute the screen projection display method provided in the embodiments of the present application, as well as various functional applications and data processing.
  • the electronic device 101 can implement audio functions through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the earphone interface 170D, and the application processor. For example, music playback, recording, etc.
  • the audio module 170 is used to convert digital audio information into an analog audio signal for output, and is also used to convert an analog audio input into a digital audio signal.
  • the audio module 170 can also be used to encode and decode audio signals.
  • the audio module 170 may be provided in the processor 110, or part of the functional modules of the audio module 170 may be provided in the processor 110.
  • the speaker 170A, also called a “loudspeaker”, is used to convert audio electrical signals into sound signals.
  • the electronic device 101 can listen to music through the speaker 170A, or listen to a hands-free call.
  • the receiver 170B, also called an “earpiece”, is used to convert audio electrical signals into sound signals.
  • when the electronic device 101 answers a call or receives a voice message, the user can listen to the voice by bringing the receiver 170B close to the ear.
  • the microphone 170C, also called a “mic”, is used to convert sound signals into electrical signals.
  • when making a sound, the user can put the mouth close to the microphone 170C to input the sound signal into the microphone 170C.
  • the electronic device 101 may be provided with one or more microphones 170C. In other embodiments, the electronic device 101 may be provided with two microphones 170C, which can implement noise reduction functions in addition to collecting sound signals. In some other embodiments, the electronic device 101 can also be provided with three, four or more microphones 170C to collect sound signals, reduce noise, identify sound sources, and realize directional recording functions.
  • the earphone interface 170D is used to connect wired earphones.
  • the earphone interface 170D may be a USB interface 130, a 3.5mm open mobile terminal platform (OMTP) standard interface, or a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface.
  • the sensor module 180 may include a pressure sensor, a gyroscope sensor, an air pressure sensor, a magnetic sensor, an acceleration sensor, a distance sensor, a proximity light sensor, a fingerprint sensor, a temperature sensor, a touch sensor, an ambient light sensor, a bone conduction sensor, and the like.
  • the above electronic device may also include one or more components such as buttons, motors, indicators, and SIM card interfaces, which are not limited in the embodiment of the present application.
  • the above-mentioned software system of the electronic device 101 may adopt a layered architecture, an event-driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture.
  • the embodiment of the present application takes a layered Android system as an example to illustrate the software structure of the electronic device 101.
  • FIG. 3 is a software structure block diagram of the electronic device 101 according to an embodiment of the present application.
  • the layered architecture divides the software into several layers, and each layer has a clear role and division of labor. Communication between layers through software interface.
  • the Android system is divided into four layers, from top to bottom, the application layer, the application framework layer, the Android runtime and system library, and the kernel layer.
  • the application layer can include a series of applications.
  • the above-mentioned application programs may include applications (APPs) such as Phone, Contacts, Camera, Gallery, Calendar, Maps, Navigation, Bluetooth, Music, Video, Messages, and the like.
  • the application framework layer provides application programming interfaces (application programming interface, API) and programming frameworks for applications in the application layer.
  • the application framework layer includes some predefined functions.
  • the application framework layer may include a view system, a notification manager, an activity manager, a window manager, a content provider, a resource manager, an input method manager, and so on.
  • the view system can be used to construct the display interface of the application.
  • Each display interface can consist of one or more controls.
  • controls can include interface elements such as icons, buttons, menus, tabs, text boxes, dialog boxes, status bars, navigation bars, and widgets.
  • the view system can obtain view information of the corresponding display interface when drawing the display interface, and the view information records the layer relationship between the various controls in the display interface to be drawn.
  • each control in the display interface is generally organized hierarchically according to a tree structure to form a complete ViewTree (view tree), which may be referred to as the view information of the above display interface.
  • the view system can draw the display interface according to the layer relationship between the controls set in the ViewTree.
  • Each control in the display interface corresponds to a set of drawing instructions, such as DrawLine, DrawPoint, DrawBitmap, etc.
  • FIG. 4 shows the chat interface 401 of the WeChat APP.
  • the bottom control in the chat interface 401 is the root node (root), and a base map 402 is set under the root node.
  • the following controls are also included: a title bar 403, a chat background 404, and an input bar 405.
  • the title bar 403 further includes a return button 406 and a title 407
  • the chat background 404 further includes an avatar 408 and a bubble 409
  • the input bar 405 further includes a voice input button icon 410, an input box 411, and a send button 412.
  • the above controls are layered in order to form a view tree A as shown in Figure 4(b).
  • the base map 402 is a child node of the root node
  • the title bar 403, the chat background 404, and the input field 405 are all child nodes of the base map 402.
  • Both the return button 406 and the title 407 are child nodes of the title bar 403.
  • Both the avatar 408 and the bubble 409 are child nodes of the chat background 404.
  • the voice input button icon 410, the input box 411, and the send button 412 are all child nodes of the input field 405.
  • the view system may call the drawing instructions of the corresponding control layer by layer to draw each control according to the layer relationship between the controls in the view tree A, and finally form the chat interface 401.
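The layer-by-layer drawing described above can be sketched as a pre-order traversal of the view tree: a parent control (lower layer) is drawn before its children. The following Python model is illustrative only; the control names follow FIG. 4, and the instruction strings stand in for real drawing instructions such as DrawBitmap:

```python
# Illustrative model of a view tree: each node is a control with a list of
# drawing instructions; drawing walks the tree from the root layer by layer.
class Control:
    def __init__(self, name, instructions=None):
        self.name = name
        self.instructions = instructions or []
        self.children = []

    def add(self, *children):
        self.children.extend(children)
        return self

def draw(control, out):
    # Pre-order traversal: a parent (lower layer) is drawn before its children.
    out.extend(control.instructions)
    for child in control.children:
        draw(child, out)
    return out

# View tree A of the chat interface 401 (FIG. 4(b)).
root = Control("root")
base_map = Control("base_map_402", ["DrawBitmap(base)"])
title_bar = Control("title_bar_403", ["DrawRect(title)"])
title_bar.add(Control("return_button_406", ["DrawBitmap(back)"]),
              Control("title_407", ["DrawText(title)"]))
chat_bg = Control("chat_background_404", ["DrawRect(bg)"])
chat_bg.add(Control("avatar_408", ["DrawBitmap(avatar)"]),
            Control("bubble_409", ["DrawBitmap(bubble)"]))
input_bar = Control("input_bar_405", ["DrawRect(input)"])
input_bar.add(Control("voice_icon_410", ["DrawBitmap(mic)"]),
              Control("input_box_411", ["DrawRect(box)"]),
              Control("send_button_412", ["DrawBitmap(send)"]))
base_map.add(title_bar, chat_bg, input_bar)
root.add(base_map)

instructions = draw(root, [])
```

Because the base map 402 is the lowest layer, its instruction is emitted first; the send button 412, the last leaf, is emitted last.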
  • a projection management module may be added to the view system of the source device.
  • the screen projection management module can record the drawing instructions for each control in the display screen drawn by the view system and the drawing resources (such as avatars, icons, etc.) required by the drawing instructions.
  • the projection management module can generate the view tree 2 of the projection interface that needs to be displayed in the target device based on the view tree 1 of the currently displayed screen.
  • the number of controls in the view tree 2 may be different from the number of controls in the view tree 1, and the positional relationship between the controls in the view tree 2 may also be different from the positional relationship between the controls in the view tree 1.
  • the projection management module can instruct the source device to send the view tree 2, together with the drawing instructions and drawing resources of each control in the view tree 2, to the destination device, so that the destination device can call the drawing instructions of the corresponding controls layer by layer according to the layer relationship between the controls in the view tree 2 to draw the projected screen projection interface.
  • each source device can send the view tree and related drawing instructions and drawing resources that need to be projected to the destination device this time according to the above method.
  • a projection management module can also be set in the view system of the destination device.
  • the destination device may receive the first UI message sent by the source device 1.
  • the first UI message may include view tree A and drawing instructions and drawing resources of each control in the view tree A
  • the destination device may receive the second UI message sent by the source device 2, and the second UI message may include view tree B and the drawing instructions and drawing resources of each control in view tree B.
  • the projection management module of the target device can split, delete, or reorganize the controls in view tree A and view tree B, thereby generating the view tree C of the screen projection interface that needs to be displayed this time. In this way, the target device can call the drawing instructions and drawing resources of the corresponding controls layer by layer according to the layer relationship between the controls in view tree C to draw the projected screen projection interface.
  • the final screen projection interface drawn by the destination device contains both the controls in view tree A and the controls in view tree B. That is, the projection interface contains both the display content in source device 1 and the display content in source device 2.
  • the user can watch the display content on different source devices at the same time in one destination device, thereby improving the display effect of the destination device and the user's use experience in the projection scene.
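The reorganization of view tree A and view tree B into view tree C can be sketched as follows. This is a minimal Python illustration; hanging both trees under a fresh root is an assumed layout, not the patent's exact splitting and deleting logic:

```python
# Illustrative sketch: the destination device combines the controls of two
# received view trees under a new root node, producing the view tree of the
# screen projection interface to be drawn this time.
def make_node(name, children=None):
    return {"name": name, "children": children or []}

def merge(tree_a, tree_b):
    # Reorganize: hang both source trees under a fresh root node.
    return make_node("projection_root", [tree_a, tree_b])

def count(node):
    # Total number of controls in a (sub)tree.
    return 1 + sum(count(c) for c in node["children"])

view_tree_a = make_node("source1_content", [make_node("title"), make_node("cover")])
view_tree_b = make_node("source2_content", [make_node("status")])
view_tree_c = merge(view_tree_a, view_tree_b)
```

The merged tree keeps every control from both sources, which is why the drawn interface shows the display content of source device 1 and source device 2 at the same time.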
  • the above-mentioned projection management module can also be set at the application framework layer independently of the view system, and the embodiment of the present application does not impose any limitation on this.
  • the aforementioned activity manager can be used to manage the life cycle of each application.
  • Applications usually run in the operating system in the form of activity.
  • the activity manager can schedule the activity process of the application to manage the life cycle of each application.
  • the window manager is used to manage window programs.
  • the window manager can obtain the size of the display, determine whether there is a status bar, lock the screen, take a screenshot, etc.
  • the content provider is used to store and retrieve data and make these data accessible to applications.
  • the data may include video, image, audio, phone calls made and received, browsing history and bookmarks, phone book, etc.
  • the resource manager provides various resources for the application, such as localized strings, icons, pictures, layout files, video files, etc.
  • Android Runtime includes core libraries and virtual machines. Android runtime is responsible for the scheduling and management of the Android system.
  • the core library consists of two parts: one part is the functions that the Java language needs to call, and the other part is the core library of Android.
  • the application layer and the application framework layer run in a virtual machine.
  • the virtual machine executes the java files of the application layer and the application framework layer as binary files.
  • the virtual machine is used to perform functions such as object life cycle management, stack management, thread management, security and exception management, and garbage collection.
  • the system library can include multiple functional modules. For example: surface manager (surface manager), media library (Media Libraries), three-dimensional graphics processing library (for example: OpenGL ES), 2D graphics engine (for example: SGL), etc.
  • the surface manager is used to manage the display subsystem, and provides a combination of 2D and 3D layers for multiple applications.
  • the media library supports playback and recording of a variety of commonly used audio and video formats, as well as still image files.
  • the media library can support multiple audio and video encoding formats, such as: MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, etc.
  • the 3D graphics processing library is used to realize 3D graphics drawing, image rendering, synthesis, and layer processing.
  • the 2D graphics engine is a graphics engine for 2D drawing.
  • the kernel layer is the layer between hardware and software.
  • the kernel layer includes at least a display driver, a camera driver, an audio driver, a sensor driver, etc., which are not limited in the embodiment of the present application.
  • the mobile phone 501 can be used as the source device.
  • the mobile phone 501 can display a prompt box 503, and the prompt box 503 contains one or more candidate devices that can be used as target devices for screen projection.
  • the user can select a candidate device in the prompt box 503 as the target device for this screencast. For example, if it is detected that the user clicks on the “mobile phone 500” in the prompt box 503, the mobile phone 501 can determine that the target device for this screen projection is the mobile phone 500.
  • the candidate device in the prompt box 503 may be an electronic device located in the same communication network as the mobile phone 501.
  • each candidate device in the prompt box 503 is an electronic device in the Wi-Fi network that the mobile phone 501 accesses.
  • each candidate device in the prompt box 503 is an electronic device under the mobile phone account that the mobile phone 501 logs in.
  • the mobile phone 502 can also be used as a source device.
  • the mobile phone 502 may display a prompt box 504. Similar to the above-mentioned prompt box 503, the prompt box 504 also contains one or more candidate devices that can be used as target devices for screen projection. The user can select a candidate device in the prompt box 504 as the target device for this screencast. For example, if it is detected that the user clicks on the “mobile phone 500” in the prompt box 504, the mobile phone 502 can determine that the target device for this screen projection is the mobile phone 500.
  • the candidate device in the prompt box 504 may be an electronic device located in the same communication network as the mobile phone 502.
  • each candidate device in the prompt box 504 is an electronic device in the Wi-Fi network that the mobile phone 502 accesses.
  • each candidate device in the prompt box 504 is an electronic device under the mobile phone account that the mobile phone 502 logs in.
  • In this case, the destination device of the mobile phone 501 (i.e., the first source device) and the destination device of the mobile phone 502 (i.e., the second source device) are both the mobile phone 500.
  • the user can also set a certain electronic device (such as the aforementioned mobile phone 500) as the destination device, and select multiple electronic devices (such as the aforementioned mobile phone 501 and mobile phone 502) as the source device for this screen projection.
  • the embodiment of the present application does not impose any restriction on this.
  • The following describes the projection display method by taking the mobile phone 501 and the mobile phone 502 as the source devices of the screen projection, and the mobile phone 500 as the destination device of the screen projection, as an example.
  • the playback interface 600 includes the following controls: a base map 601, a status bar 602, a title bar 603, an album cover 604, lyrics 605, and a control bar 606.
  • the status bar 602 includes controls such as time, signal strength, and battery capacity.
  • the title bar 603 includes controls such as the song name 6031 and the singer 6032.
  • the control bar 606 includes controls such as a progress bar 6061, a pause button 6062, a previous button 6063, and a next button 6064.
  • the mobile phone 501 can obtain the corresponding view information when the view system draws the above-mentioned playback interface 600.
  • the mobile phone 501 can obtain the view tree of the playback interface 600, and the drawing instructions and drawing resources of each control in the view tree.
  • the view tree 701 records the layer relationship between the various controls in the aforementioned playback interface 600.
  • the root node of the playback interface 600 includes the base map 601 as a sub-node.
  • the status bar 602, the title bar 603, the album cover 604, the lyrics 605, and the control bar 606 are all sub-nodes of the base map 601.
  • the song name 6031 and the singer 6032 are child nodes of the title bar 603.
  • the progress bar 6061, the pause button 6062, the previous button 6063, and the next button 6064 are child nodes of the control bar 606.
  • After the mobile phone 501 obtains the view tree 701 of the aforementioned playback interface 600, it can further determine one or more controls (i.e., the first target control) in the playback interface 600 that need to be projected to the mobile phone 500 for display.
  • a configuration file corresponding to the aforementioned playback interface 600 may be preset in the mobile phone 501.
  • the mobile phone 501 may obtain the configuration file corresponding to the playing interface 600 from the server.
  • the configuration file records one or more controls in the playback interface 600 that the source device needs to project to the destination device.
  • the identification of one or more controls in the playback interface 600 that needs to be projected to the target device can be recorded in the configuration file.
  • the mobile phone 501 can determine, according to the identification of the controls recorded in the configuration file, the first target control in the playback interface 600 that needs to be projected to the mobile phone 500 for display.
  • the identification of each control in the playback interface 600 may be updated, so determining the first target control by the identification of the control in the configuration file may not be accurate.
  • the specific display position, in the playback interface 600, of the one or more controls that need to be projected to the target device can also be recorded in the configuration file, such as the coordinates of the controls. In this way, the mobile phone 501 can uniquely determine, according to the display positions of the controls recorded in the configuration file, the first target control in the playback interface 600 that needs to be projected to the mobile phone 500 for display.
  • the mobile phone 501 may preset multiple configuration files corresponding to the aforementioned playback interface 600.
  • when the target device is a mobile phone type device, the configuration file corresponding to the playback interface 600 is configuration file 1; when the target device is a watch type device, the configuration file corresponding to the playback interface 600 is configuration file 2.
  • the mobile phone 501 can obtain a configuration file (for example, configuration file 1) corresponding to the playback interface 600 according to the type of the target device (for example, the above-mentioned mobile phone 500) selected by the user this time.
  • the configuration file 1 corresponding to the playback interface 600 may be as follows:
  • the configuration file can be stored in the mobile phone or the server in JSON (JavaScript Object Notation) format, XML (Extensible Markup Language) format, or text format, and the embodiment of the present application does not impose any limitation on this.
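As an illustration only, such a configuration file could look like the following JSON. The coordinate values are made up, and the parsing sketch simply collects the “src” fields:

```python
import json

# Hypothetical configuration file 1 for the playback interface 600.
# The field names follow the "src" convention; all values are illustrative.
CONFIG_JSON = """
{
  "src1": {"left": 100, "top": 50,  "width": 300, "height": 40},
  "src2": {"left": 100, "top": 120, "width": 200, "height": 200}
}
"""

config = json.loads(CONFIG_JSON)
# Collect the fields that record control positions in the source interface.
src_fields = {k: v for k, v in config.items() if k.startswith("src")}
```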
  • the configuration file 1 contains multiple "src” fields (for example, the aforementioned "src1" field and "src2" field).
  • Each "src” field records the specific position of a control in the playback interface 600.
  • the position of each control can be uniquely determined by the values of the 4 parameters: left, top, width, and height.
  • left is the x-axis coordinate of the top left corner of the control
  • top is the y-axis coordinate of the top left corner of the control
  • width is the width of the control
  • height is the height of the control.
  • the one or more controls recorded in the configuration file 1 are the first target controls that the mobile phone 501 needs to project to the mobile phone 500 to be displayed.
  • the mobile phone may be preset with one or more configuration files in advance.
  • the mobile phone 501 can obtain the package name (packagename) of the first music APP currently running in the foreground and the activity name (activityname) of the currently displayed playback interface 600.
  • the mobile phone 501 can query the configuration file 1 corresponding to the playback interface 600 in the obtained configuration file according to the packagename and activityname.
  • the mobile phone 501 can recognize the first target control in the playback interface 600 that needs to be projected to the mobile phone 500 according to the position of each control recorded in the "src" field in the configuration file 1.
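The query by packagename and activityname can be sketched as a simple lookup table. The package and activity names used here are hypothetical:

```python
# Illustrative mapping from (packagename, activityname) to the preset
# configuration file for that display interface. All names are made up.
CONFIG_FILES = {
    ("com.example.music", "PlayActivity"): "configuration file 1",
    ("com.example.music", "PlaylistActivity"): "configuration file 3",
}

def find_config(packagename, activityname):
    # Returns the matching configuration file, or None if none is preset.
    return CONFIG_FILES.get((packagename, activityname))
```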
  • the user can also manually specify in the playback interface 600 to project to the first target control displayed in the mobile phone 500.
  • the mobile phone 501 can detect a user's touch event in the playback interface 600. If it is detected that the user performs a preset touch event at point A of the playback interface 600, the mobile phone can find the corresponding control in the configuration file 1 according to the coordinates of point A, and determine the found control as the first target control to be projected to the mobile phone 500 for display.
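The manual selection described above amounts to a hit test of the coordinates of point A against the “src” rectangles recorded in configuration file 1. A minimal sketch, with made-up rectangles:

```python
# Hypothetical "src" rectangles from configuration file 1.
SRC_RECTS = {
    "src1": {"left": 100, "top": 50,  "width": 300, "height": 40},
    "src2": {"left": 100, "top": 120, "width": 200, "height": 200},
}

def hit_test(x, y, rects):
    # Returns the name of the first rectangle containing point (x, y).
    for name, r in rects.items():
        if (r["left"] <= x < r["left"] + r["width"]
                and r["top"] <= y < r["top"] + r["height"]):
            return name
    return None
```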
  • the configuration file 1 may also record the specific display position of the first target control in the target device after the screen is projected.
  • a "dest1" field corresponding to the "src1" field can be set in the above configuration file 1, and the "dest1" field is used to indicate the display position of the control 1 in the target device.
  • the "dest1" field is as follows:
  • the mobile phone 501 can determine the specific display position of each first target control in the play interface 600 after the screen is projected on the target device (ie, the mobile phone 500) according to each "dest" field in the configuration file 1.
  • the configuration file 1 may also record the change relationship of the display position of the first target control before and after the screen is projected.
  • the following fields are also set for control 1 in configuration file 1:
  • the "translationx” field and the “translationy” field are used to indicate the translation distance of the control 1 on the x-axis and the y-axis after the projection; the "scalex” field and the “scaley” field are respectively used to indicate that the control 1 is in the The zoom ratio on the x-axis and y-axis; the “rotatedegree” field is used to indicate the rotation angle of control 1 after projection; the “order” field is used to indicate the layer position of control 1 after projection (for example, in the bottom layer The layer is still in the top layer).
  • the mobile phone 501 can also determine the specific display position of the control 1 in the target device (ie, the mobile phone 500) after the screen is projected according to the change relationship of the display position of the control 1 before and after the screen is projected in the above field.
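Computing the post-projection position from the change-relationship fields can be sketched as follows. The field names follow the text above, the numeric values are made up, and the “rotatedegree” and “order” fields are omitted for brevity:

```python
# Hypothetical change-relationship fields for control 1.
FIELDS = {"translationx": 20, "translationy": -10, "scalex": 2.0, "scaley": 1.5}

def project_rect(rect, f):
    # Translate the top-left corner and scale the size of the control.
    return {
        "left":   rect["left"] + f["translationx"],
        "top":    rect["top"]  + f["translationy"],
        "width":  rect["width"]  * f["scalex"],
        "height": rect["height"] * f["scaley"],
    }

src1 = {"left": 100, "top": 50, "width": 300, "height": 40}
dest1 = project_rect(src1, FIELDS)   # display position on the destination device
```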
  • the mobile phone 501 may generate a view tree 701 including the first target control based on the playback interface 600.
  • a new view tree for the target control may be generated.
  • the mobile phone 501 recognizes the first target controls in the playback interface 600 according to the aforementioned configuration file 1, including: the song name 6031 and the singer 6041 in the title bar 603, the pause button 6062, the previous button 6063, and the next button 6064 in the control bar 606, and the album cover 604. Moreover, the positional relationship between these controls does not change before and after the screen is projected. That is, the layer relationship of the first target controls in the playback interface 600 is the same as the layer relationship of the first target controls in the subsequent screen projection interface.
  • when the mobile phone 501 splits the controls in the above-mentioned view tree 701, the nodes that are not first target controls (such as the base map 601, the status bar 602, the lyrics 605, and the progress bar 6061 in the control bar 606) are deleted.
  • the mobile phone 501 can reorganize the retained first target controls according to the layer order in the view tree 701 to obtain the view tree 801 corresponding to the playback interface 600 after the screen is projected.
  • the root node includes three sub-nodes: the title bar 603, the album cover 604, and the control bar 606.
  • the control bar 606 includes the pause button 6062, the previous button 6063, and the next button 6064 as child nodes.
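The split/delete/reorganize step above can be sketched as pruning a tree: nodes that are not target controls are removed, and their surviving children are promoted to the nearest surviving ancestor, preserving layer order. The node names follow the playback interface 600 example; the tree class and structure are illustrative assumptions.

```python
class Node:
    def __init__(self, name, children=None):
        self.name = name
        self.children = children or []

def prune(node, targets):
    """Return the projected subtree(s): keep `node` if it is a target control,
    otherwise drop it and promote its surviving children."""
    kept_children = []
    for child in node.children:
        kept_children.extend(prune(child, targets))
    if node.name in targets:
        return [Node(node.name, kept_children)]
    return kept_children

# Illustrative view tree 701 for the playback interface 600.
view_tree_701 = Node("root", [Node("base_map_601", [
    Node("status_bar_602"),
    Node("title_bar_603", [Node("song_name_6031"), Node("singer_6041")]),
    Node("album_cover_604"),
    Node("lyrics_605"),
    Node("control_bar_606", [Node("progress_bar_6061"), Node("pause_6062"),
                             Node("previous_6063"), Node("next_6064")]),
])])

targets = {"title_bar_603", "song_name_6031", "singer_6041", "album_cover_604",
           "control_bar_606", "pause_6062", "previous_6063", "next_6064"}

# Reorganize the retained controls under a new root: view tree 801.
view_tree_801 = Node("root", prune(view_tree_701, targets))
print([c.name for c in view_tree_801.children])
# → ['title_bar_603', 'album_cover_604', 'control_bar_606']
```

Note how the base map, status bar, lyrics, and progress bar disappear while the three retained sub-trees become direct children of the root, matching the structure of the view tree 801 described above.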
  • the mobile phone 501 (that is, the first source device) can send a first UI message to the mobile phone 500 (that is, the destination device) through the communication network 104.
  • the first UI message includes the view tree 801 and the drawing instructions and drawing resources related to each control in the view tree 801.
  • the mobile phone 501 and the mobile phone 500 may establish a socket connection based on the TCP/IP (Transmission Control Protocol/Internet Protocol) protocol.
  • the mobile phone 501 can use the socket connection to send the first UI message corresponding to the aforementioned playback interface 600 to the mobile phone 500.
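A rough sketch of sending a UI message over such a socket connection follows. The length-prefixed framing and the message fields are assumptions for illustration, not the patent's actual wire format; a local socket pair stands in for the Wi-Fi connection between the two phones.

```python
import json
import socket
import struct
import threading

def send_ui_message(sock, message):
    """Serialize the UI message and send it with a 4-byte length prefix."""
    payload = json.dumps(message).encode("utf-8")
    sock.sendall(struct.pack(">I", len(payload)) + payload)

def recv_ui_message(sock):
    """Read one length-prefixed UI message and deserialize it."""
    size = struct.unpack(">I", sock.recv(4))[0]
    data = b""
    while len(data) < size:
        data += sock.recv(size - len(data))
    return json.loads(data.decode("utf-8"))

# The source side sends; the destination side receives.
source, dest = socket.socketpair()
ui_message = {"view_tree": "801", "draw_instructions": ["drawRect", "drawText"]}
threading.Thread(target=send_ui_message, args=(source, ui_message)).start()
print(recv_ui_message(dest))
```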
  • the mobile phone 501 can continue to generate the first UI message corresponding to the new display interface according to the aforementioned method, and send the new first UI message to the mobile phone 500.
  • the first source device (for example, the mobile phone 501) is used as the execution subject to illustrate a projection display method provided by the embodiment of the present application.
  • each source device can project one or more controls in its display interface to the destination device for display according to the above method.
  • the mobile phone 502 can also enable the screen projection function, and set the target device of the current screen projection as the mobile phone 500.
  • the playback interface 900 includes the following controls: a base map 901, a status bar 902, a title bar 903, a bullet screen 904, an album cover 905, and a control bar 906.
  • the status bar 902 includes controls such as time, signal strength, and battery capacity.
  • the title bar 903 includes controls such as the song name 9031 and the singer 9032.
  • the control bar 906 includes controls such as a progress bar 9061, a pause button 9062, a previous button 9063, and a next button 9064. It can be seen that, compared with the above-mentioned playback interface 600 of the first music APP, the playback interface 900 of the second music APP provides the function of viewing the bullet screen (barrage).
  • the mobile phone 502 can obtain the view tree when its view system draws the aforementioned playback interface 900, as well as the drawing instructions and drawing resources of each control in the view tree.
  • the view tree 1001 of the above-mentioned playback interface 900.
  • the view tree 1001 records the layer relationship between the various controls in the aforementioned playback interface 900.
  • the root node of the playback interface 900 includes a sub-node of the base map 901.
  • the status bar 902, the title bar 903, the bullet screen 904, the album cover 905, and the control bar 906 are all sub-nodes of the base map 901.
  • the song name 9031 and the singer 9032 are child nodes of the title bar 903.
  • the progress bar 9061, the pause button 9062, the previous button 9063, and the next button 9064 are child nodes of the control bar 906.
  • the mobile phone 502 can also obtain the configuration file 2 corresponding to the playback interface 900 according to the packagename of the second music APP and the activityname of the playback interface 900. Similar to the aforementioned configuration file 1, the configuration file 2 records the second target controls in the playback interface 900 that need to be projected to the target device, such as the identification or display position of the second target controls in the playback interface 900.
  • the mobile phone 502 can recognize the second target control in the view tree 1001 that needs to be screened to the mobile phone 500 according to the configuration file 2.
  • the mobile phone 502 can also use the control manually set by the user in the playback interface 900 as the second target control that needs to be projected to the mobile phone 500 to display.
  • the mobile phone 502 splits and reorganizes the controls in the view tree 1001 to generate the view tree 1002 corresponding to the playback interface 900 after the screen is projected.
  • the root node of the view tree 1002 includes a bullet screen 904.
  • the mobile phone 502 (that is, the second source device) can send a second UI message to the mobile phone 500 (that is, the destination device) through the communication network 104.
  • the second UI message includes the view tree 1002 and the drawing instructions and drawing resources related to each control in the view tree 1002.
  • the mobile phone 502 can continue to generate the second UI message corresponding to the new display interface according to the aforementioned method, and send the new second UI message to the mobile phone 500.
  • the source device may send a UI message to the destination device in the form of a message.
  • the source device may serialize and compress the message.
  • the destination device can decompress and deserialize the received message, so as to parse the UI message carried in the message.
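The serialize-and-compress step on the source side and the decompress-and-deserialize step on the destination side can be sketched as follows. json and zlib stand in for whatever serialization and compression scheme the devices actually use; they are assumptions for illustration.

```python
import json
import zlib

def pack_ui_message(ui_message):
    serialized = json.dumps(ui_message).encode("utf-8")  # serialize
    return zlib.compress(serialized)                     # compress

def unpack_ui_message(packed):
    serialized = zlib.decompress(packed)                 # decompress
    return json.loads(serialized.decode("utf-8"))        # deserialize

# Round trip: the destination device recovers exactly what the source sent.
ui_message = {"view_tree": "1002", "controls": [{"id": "bullet_screen_904"}]}
packed = pack_ui_message(ui_message)
assert unpack_ui_message(packed) == ui_message
print(len(packed), "bytes on the wire")
```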
  • each source device (such as the aforementioned mobile phone 501 and mobile phone 502) can perform operations such as splitting, deleting, and reorganizing the controls in its display interface, so that the screen projection interface finally displayed in the destination device can adapt to the user's needs, thereby improving the display effect and user experience when projecting between multiple devices.
  • the mobile phone 500 may also pre-store a configuration file.
  • the configuration file stored in the destination device may contain information about target controls in multiple source devices.
  • the configuration file 3 stored in the mobile phone 500 is as follows:
  • the aforementioned configuration file 3 includes both the first target control in the playback interface 600 in the first music APP and the second target control in the playback interface 900 in the second music APP.
  • the first UI message sent by the mobile phone 501 to the mobile phone 500 may carry the packagename of the first music APP and the activityname of the playing interface 600.
  • the second UI message sent by the mobile phone 502 to the mobile phone 500 may carry the packagename of the second music APP and the activityname of the playing interface 900.
  • after the mobile phone 500 receives the first UI message and the second UI message, it can obtain a configuration file containing both the activityname of the above-mentioned playback interface 600 and the activityname of the above-mentioned playback interface 900 (that is, the above-mentioned configuration file 3).
  • the mobile phone 500 can determine, according to the configuration file 3, how the controls in the view tree 801 and the view tree 1002 are laid out after the screen is projected.
  • the mobile phone 500 can use the bullet screen 904 as a child node of the album cover 604.
  • a "stacking mode" field can also be set in the configuration file 3.
  • for example, the stacking mode may include gradual blending, deepening blending, or overlapping.
  • the mobile phone 500 can determine, according to the "stacking mode" field, the specific stacking mode to use when drawing two nodes whose positions overlap.
  • the mobile phone 500 may also store the configuration files used by each source device when projecting the screen, for example, the configuration file 1 and the configuration file 2 described above. Since the configuration file 1 records the display position of the first target control after the screen is cast, and the configuration file 2 records the display position of the second target control after the screen is cast, the mobile phone 500 can determine, according to the configuration file 1 and the configuration file 2, the specific display positions of the controls in the view tree 801 and the view tree 1002 after the screen is projected, and the layer relationship between the controls, thereby generating a view tree 1101 corresponding to the projection interface.
  • the mobile phone 500 can obtain the corresponding configuration file 1 according to the packagename of the first music APP and the activityname of the playback interface 600 in the first UI message.
  • the mobile phone 500 can obtain the corresponding configuration file 2 according to the packagename of the second music APP and the activityname of the playback interface 900 in the second UI message. Then, combining the display position of the first target control recorded in the configuration file 1 with the display position of the second target control recorded in the configuration file 2, as shown in Figure 11, the mobile phone 500 can generate the view tree 1101 corresponding to the projection interface.
  • when the mobile phone 500 detects that the display position of a control in the configuration file 1 (for example, the album cover 604) overlaps with the display position of a control in the configuration file 2 (for example, the bullet screen 904), the mobile phone 500 can determine the positional relationship between the album cover 604 and the bullet screen 904 according to the "order" field recorded in the configuration file 1 for the album cover 604 and the "order" field recorded in the configuration file 2 for the bullet screen 904.
  • for example, if the value of the "order" field of the album cover 604 in the configuration file 1 is 2, and the value of the "order" field of the bullet screen 904 in the configuration file 2 is 1, this indicates that the bullet screen 904 is located above the layer of the album cover 604, as shown in FIG.
  • the mobile phone 500 can set the bullet screen 904 as a child node of the album cover 604.
  • the mobile phone 500 may also perform processing such as blurring and translucency on the overlapping controls to improve the visual effect for the user.
  • when the mobile phone 500 detects, through the configuration file 1 and the configuration file 2, that the album cover 604 and the bullet screen 904 overlap, the mobile phone 500 can also modify the display position of the album cover 604 or the bullet screen 904 in the screen projection interface so that the album cover 604 and the bullet screen 904 do not overlap. At this time, the mobile phone 500 may regard both the album cover 604 and the bullet screen 904 as child nodes under the root node.
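The overlap handling above can be sketched as follows. The rectangle coordinates are made up for the demonstration, and the convention that a smaller "order" value means a higher layer is an assumption inferred from the example (order 2 for the album cover below order 1 for the bullet screen).

```python
def overlaps(a, b):
    """Axis-aligned rectangle overlap test on (left, top, width, height)."""
    return (a["left"] < b["left"] + b["width"] and b["left"] < a["left"] + a["width"]
            and a["top"] < b["top"] + b["height"] and b["top"] < a["top"] + a["height"])

def stack_order(ctrl_a, ctrl_b):
    """Return the two controls bottom-to-top if they overlap, else None.
    Assumes a smaller "order" value means a higher layer."""
    if not overlaps(ctrl_a, ctrl_b):
        return None  # no conflict: display positions can stay as-is
    return sorted([ctrl_a, ctrl_b], key=lambda c: -c["order"])

album_cover_604 = {"name": "album_cover_604", "left": 0, "top": 0,
                   "width": 400, "height": 400, "order": 2}
bullet_screen_904 = {"name": "bullet_screen_904", "left": 100, "top": 100,
                     "width": 300, "height": 80, "order": 1}

bottom, top = stack_order(album_cover_604, bullet_screen_904)
print(top["name"], "is drawn above", bottom["name"])
# → bullet_screen_904 is drawn above album_cover_604
```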
  • the mobile phone 500 may also lay out the display positions of the various controls in the projection interface according to the number of source devices. For example, when the mobile phone 500 has two source devices (such as the mobile phone 501 and the mobile phone 502), the mobile phone 500 can set the first target control in the view tree 801 sent by the mobile phone 501 to be displayed on the left side of the screen projection interface, and set the second target control in the view tree 1002 sent by the mobile phone 502 to be displayed on the right side of the screen projection interface.
  • those skilled in the art can also set corresponding strategies based on actual experience or actual application scenarios, so that the target device can lay out the target controls projected by each source device on the screen projection interface, and the embodiment of the application does not impose any limitation on this.
  • the mobile phone 500 can call the drawing instructions of each control in the view tree 1101 in turn, according to the level and order of the controls in the view tree 1101, to draw each control.
  • the mobile phone 500 can draw the control according to the display position of the control in the projection interface recorded in the configuration file 3, and execute the corresponding drawing instruction at the display position.
  • the mobile phone 500 can draw and display the screen projection interface 1201 based on the view tree 1101 and the configuration file 3.
  • the screen projection interface 1201 includes not only the first target control such as the title bar 603, the album cover 604, and the control bar 606 in the foregoing playback interface 600, but also the second target control of the bullet screen 904 in the foregoing playback interface 900.
  • the destination device can display the display content of multiple source devices at the same time, so that the user can watch the display content of multiple source devices in one destination device, without requiring the user to switch between multiple source devices.
  • the communication connection between the destination device and the source device can be dynamically added or deleted.
  • the mobile phone 502 can disconnect the communication connection (such as a Wi-Fi connection) with the mobile phone 500 (that is, the target device). Furthermore, the mobile phone 500 will not receive the second UI message sent by the mobile phone 502, nor will it display related content in the playback interface 900 on the screen projection interface.
  • the smart watch can establish a communication connection with the mobile phone 500. While receiving the above-mentioned first UI message and second UI message, the mobile phone 500 may also receive a third UI message from the smart watch. Furthermore, the mobile phone 500 can re-lay out the various controls in the projection interface based on the corresponding configuration file, so that the mobile phone 501, the mobile phone 502, and the smart watch can all project their display content to the mobile phone 500 for display.
  • the playback interface 600 in the mobile phone 501 and the playback interface 900 in the mobile phone 502 being projected to the mobile phone 500 is used as an example. It is understandable that multiple source devices can also project different types of display interfaces to the destination device at the same time.
  • the mobile phone 501 as the first source device can display the lock screen interface 1301 after enabling the screen projection function.
  • the lock screen interface 1301 includes a wallpaper 1302, and the wallpaper 1302 includes controls such as a status bar 1303, a text control 1304, a file control 1305, and a notification message 1306.
  • the notification message 1306 also includes controls such as application icon 1307, time 1308, and message content 1309.
  • the mobile phone 501 may obtain the view tree of the lock screen interface 1301 and the drawing instructions and drawing resources of each control in the view tree.
  • the mobile phone 501 can also obtain the configuration file A corresponding to the lock screen interface 1301.
  • the configuration file A records that the first target control in the lock screen interface 1301 that needs to be screened to the smart TV is: the icon 1307 in the notification message 1306 and the message content 1309.
  • the mobile phone 501 can generate a view tree 1310 corresponding to the lock screen interface 1301 after the screen is projected according to the display position of the first target control recorded in the configuration file A after the screen is projected. As shown in (b) in FIG. 13, in the view tree 1310, both the icon 1307 and the message content 1309 are child nodes under the root node.
  • the mobile phone 501 (that is, the first source device) can send a first UI message to the smart TV (that is, the destination device).
  • the first UI message includes the above-mentioned view tree 1310 and the drawing instructions and drawing resources related to each control in the view tree 1310.
  • as the second source device, the smart watch 1400 can display the detection interface 1401 after turning on the screen projection function.
  • the detection interface 1401 includes time 1402, date 1403, heart rate information 1404, and calorie information 1405.
  • the heart rate information 1404 further includes a heart rate icon and a heart rate value.
  • the calorie information 1405 also includes a calorie icon and a calorie value.
  • the smart watch 1400 can acquire the view tree of the detection interface 1401 and the drawing instructions and drawing resources of each control in the view tree.
  • the smart watch 1400 can also obtain the configuration file B corresponding to the detection interface 1401.
  • the configuration file B records that the second target controls in the detection interface 1401 that need to be projected to the smart TV are: each control in the heart rate information 1404 and the calorie information 1405.
  • the smart watch 1400 can generate a view tree 1410 corresponding to the detection interface 1401 after the screen is projected according to the display position of the second target control recorded in the configuration file B after the screen is projected.
  • the heart rate information 1404 and calorie information 1405 are both child nodes under the root node.
  • the child nodes of the heart rate information 1404 are the heart rate icon and the heart rate value, and the child nodes of the calorie information 1405 are the calorie icon and the calorie value.
  • the smart watch 1400 (that is, the second source device) can send a second UI message to the smart TV (that is, the destination device), and the second UI message includes the above-mentioned view tree 1410 and drawing instructions related to each control in the view tree 1410 and Drawing resources.
  • after the smart TV 1500 receives the first UI message and the second UI message, similar to the mobile phone 500, the smart TV 1500 can split and reorganize the view tree 1310 in the first UI message and the view tree 1410 in the second UI message to generate a view tree corresponding to this screen projection.
  • the smart TV 1500 may generate a view tree 1502 corresponding to the screen projection interface 1501 this time.
  • the view tree 1502 includes each control in the above-mentioned view tree 1310, and also includes each control in the above-mentioned view tree 1410.
  • the view tree 1502 may also include one or more controls being displayed in the smart TV 1500, such as a video screen 1503.
  • the smart TV 1500 can call a corresponding drawing instruction according to the view tree 1502 to draw each control on its display screen.
  • the smart TV 1500 can draw and display the screen projection interface 1501 corresponding to the view tree 1502.
  • the projection interface 1501 includes the video image 1503 being displayed on the smart TV 1500, the icon 1307 and message content 1309 in the notification message 1306 being displayed in the mobile phone 501, and the heart rate information 1404 and calorie information 1405 being displayed in the smart watch 1400.
  • the destination device can simultaneously display the display content of multiple source devices and the display content of the destination device itself, so that the user can watch the display content of multiple devices in one device, improving the display effect and user experience of screen projection between multiple devices.
  • the foregoing embodiment exemplarily shows an application scenario in which the display content of multiple source devices is projected to the same destination device. It should be understood that the foregoing projection display method can also be applied in other scenarios. The embodiment of this application does not impose any restriction on this.
  • an electronic device in a certain conference site can be used as a destination device, and various electronic devices in other conference sites can be used as a source device.
  • Each source device can send the display content that needs to be projected to the destination device to the destination device in the form of a UI message according to the foregoing method.
  • the destination device can display the display content projected by each source device this time on its own display screen according to the foregoing method.
  • the target device can also display the display content of the target device itself on its own display screen.
  • students can install the teaching assistant APP on their mobile phones or computers or tablets.
  • their electronic devices can be used as source devices to project the display content of the answer area to the teacher's mobile phone or computer or tablet for display.
  • the teacher can preview the answering process of multiple students in their respective answering areas in real time, and understand the students' answer ideas in time, thereby improving the teaching effect of the teaching assistant APP.
  • the embodiment of the present application discloses an electronic device including a processor, and a memory, an input device, an output device, and a communication module connected to the processor.
  • the input device and the output device can be integrated into one device.
  • a touch sensor can be used as an input device
  • a display screen can be used as an output device
  • the touch sensor and display screen can be integrated into a touch screen.
  • the above electronic device may include: a touch screen 1701, which includes a touch sensor 1706 and a display screen 1707; one or more processors 1702; a memory 1703; a communication module 1708; one or more application programs (not shown); and one or more computer programs 1704.
  • the above-mentioned devices can be connected through one or more communication buses 1705.
  • the one or more computer programs 1704 are stored in the aforementioned memory 1703 and configured to be executed by the one or more processors 1702; the one or more computer programs 1704 include instructions, and the instructions can be used to execute the steps in the aforementioned embodiments. All relevant content of the steps involved in the above method embodiments can be cited in the functional description of the corresponding physical device, which will not be repeated here.
  • the foregoing processor 1702 may specifically be the processor 110 shown in FIG. 2, the foregoing memory 1703 may specifically be the internal memory 121 and/or the external memory 120 shown in FIG. 2, and the foregoing display screen 1707 may specifically be the display screen 194 shown in FIG. 2; the touch sensor 1706 may specifically be the touch sensor in the sensor module 180 shown in FIG. 2; the communication module 1708 may specifically be the mobile communication module 150 and/or the wireless communication module 160 shown in FIG. 2. The embodiment of this application does not impose any restriction on this.
  • the functional units in the various embodiments of the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units may be integrated into one unit.
  • the above-mentioned integrated unit can be implemented in the form of hardware or software functional unit.
  • if the integrated unit is implemented in the form of a software functional unit and sold or used as an independent product, it can be stored in a computer-readable storage medium.
  • the computer-readable storage medium includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) or a processor to execute all or part of the steps of the methods described in the various embodiments of this application.
  • the aforementioned storage media include media that can store program code, such as a flash memory, a removable hard disk, a read-only memory, a random access memory, a magnetic disk, or an optical disc.


Abstract

A screen projection display method and an electronic device, relating to the field of terminal technologies, so that a destination device can simultaneously display the display content of multiple source devices, thereby improving the display effect and user experience of screen projection display between multiple devices. The method includes: the destination device receives a first message sent by a first source device, where the first message includes a first drawing instruction, the first drawing instruction is an instruction for drawing one or more first target controls, and the first target control is a control in a first interface displayed by the first source device; the destination device receives a second message sent by a second source device, where the second message includes a second drawing instruction, the second drawing instruction is an instruction for drawing one or more second target controls, and the second target control is a control in a second interface displayed by the second source device; and the destination device draws a screen projection interface according to the first drawing instruction and the second drawing instruction, where the screen projection interface includes the first target control and the second target control.

Description

A screen projection display method and electronic device
This application claims priority to Chinese Patent Application No. 201910487807.5, filed with the China National Intellectual Property Administration on June 5, 2019 and entitled "Screen Projection Display Method and Electronic Device", which is incorporated herein by reference in its entirety.
Technical Field
This application relates to the field of terminal technologies, and in particular, to a screen projection display method and an electronic device.
Background
With the development of smart home technologies, a user or a household often has multiple electronic devices that can communicate with each other. Different types of electronic devices generally have their own device characteristics; for example, a mobile phone is more portable, a television has a better display effect, and a speaker has better sound quality. To give full play to the characteristics of different electronic devices, an electronic device can switch and display its display content across multiple devices by means such as screen projection.
For example, a user can send the display content of a mobile phone (that is, the source device) to another destination device that supports the screen projection function for display. As the number of electronic devices owned by the user increases, if the user needs to project the display content of different source devices to the destination device, the user has to frequently switch the source device for projection.
For example, when using a mobile phone, the user can project the display interface of the mobile phone to a smart TV for display. Later, if the user wants to watch the display content of a tablet computer, the user can disconnect the mobile phone from the smart TV and set the tablet computer as the new source device. Then the user can project the display interface of the tablet computer to the smart TV for display, and the display interface of the mobile phone is no longer displayed. Obviously, this method of switching back and forth between multiple source devices for projection is cumbersome, and the user experience is poor.
Summary
This application provides a screen projection display method and an electronic device, so that a destination device can simultaneously display the display content of multiple source devices, thereby improving the display effect and user experience of screen projection display between multiple devices.
To achieve the foregoing objective, this application uses the following technical solutions:
According to a first aspect, this application provides a screen projection display method, including: a destination device receives a first message sent by a first source device, where the first message includes a first drawing instruction used to draw one or more first target controls, and the first target control is a control in a first interface displayed by the first source device; the destination device may further receive a second message sent by a second source device, where the second message includes a second drawing instruction used to draw one or more second target controls, and the second target control is a control in a second interface displayed by the second source device; and the destination device may then draw a screen projection interface according to the first drawing instruction and the second drawing instruction, where the drawn screen projection interface includes the first target control and the second target control.
That is, in a screen projection scenario, the destination device displays, in the screen projection interface, both the first target control that the first source device needs to project and the second target control that the second source device needs to project, so that the user can watch the display content of multiple source devices in one destination device without switching between multiple source devices, thereby improving the display effect and user experience of screen projection display between multiple devices.
In a possible implementation, the first message may further include first view information (for example, a first view tree), and the first view information includes the layer order of the first target control in the screen projection interface; similarly, the second message may further include second view information (for example, a second view tree), and the second view information includes the layer order of the second target control in the screen projection interface. In this case, that the destination device draws the screen projection interface according to the first drawing instruction and the second drawing instruction specifically includes: the destination device may generate third view information according to the first view information and the second view information, where the third view information includes the layer order of the first target control and the second target control in the screen projection interface; in this way, the destination device can execute the first drawing instruction and the second drawing instruction accordingly, following the layer order of the controls in the third view information, to draw the screen projection interface.
In a possible implementation, before the destination device executes the drawing instruction of the first target control and the drawing instruction of the second target control to draw the screen projection interface, the method further includes: the destination device may obtain a configuration file corresponding to both the first interface and the second interface, where the configuration file records not only a first display position of the first target control in the screen projection interface but also a second display position of the second target control in the screen projection interface. In this case, that the destination device generates the third view information according to the first view information and the second view information specifically includes: the destination device splits and reorganizes the controls in the first view information and the second view information according to the obtained configuration file to obtain the third view information.
Alternatively, before the destination device executes the first drawing instruction and the second drawing instruction to draw the screen projection interface, the method further includes: the destination device obtains a first configuration file corresponding to the first interface, where the first configuration file records a first display position of the first target control in the screen projection interface; and the destination device obtains a second configuration file corresponding to the second interface, where the second configuration file records a second display position of the second target control in the screen projection interface. In this case, that the destination device generates the third view information according to the first view information and the second view information specifically includes: the destination device may split and reorganize the controls in the first view information and the second view information according to the first configuration file and the second configuration file to obtain the third view information.
In a possible implementation, since the configuration file records the first display position of the first target control in the screen projection interface and the second display position of the second target control in the screen projection interface, that the destination device executes the first drawing instruction and the second drawing instruction according to the third view information to draw the screen projection interface specifically includes: the destination device executes the first drawing instruction at the first display position according to the layer order of the first target control in the third view information, to draw the first target control; and the destination device executes the second drawing instruction at the second display position according to the layer order of the second target control in the third view information, to draw the second target control.
In a possible implementation, the display positions of the first target control in the first interface and in the screen projection interface are the same or different; and the display positions of the second target control in the second interface and in the screen projection interface are the same or different.
In a possible implementation, before the destination device receives the first message sent by the first source device, the method further includes: the destination device displays a third interface. In this case, after projection, the screen projection interface drawn by the destination device may further include one or more third target controls in the third interface. That is, during projection, the destination device can simultaneously display the display content of multiple source devices and the display content of the destination device itself, so that the user can watch the display content of multiple devices in one device.
According to a second aspect, this application provides a screen projection display method, including: a source device displays a first display interface; the source device may receive a projection instruction from the user to project the first display interface to a destination device; in response to the projection instruction, the source device determines one or more first target controls in the first display interface that need to be projected to the destination device; and the source device may then send a first message to the destination device, where the first message includes a first drawing instruction of the first target control, so that the destination device can draw the first target control in the screen projection interface according to the first drawing instruction.
According to a third aspect, this application provides an electronic device, which may be the foregoing source device or destination device. The electronic device includes: a touchscreen, a communication module, one or more processors, one or more memories, and one or more computer programs, where the processor is coupled to the communication module, the touchscreen, and the memory; the one or more computer programs are stored in the memory, and when the electronic device runs, the processor executes the one or more computer programs stored in the memory, so that the electronic device performs the screen projection display method according to any one of the foregoing.
According to a fourth aspect, this application provides a computer storage medium, including computer instructions, where when the computer instructions run on an electronic device, the electronic device is enabled to perform the screen projection display method according to any one of the first aspect.
According to a fifth aspect, this application provides a computer program product, where when the computer program product runs on an electronic device, the electronic device is enabled to perform the screen projection display method according to any one of the first aspect.
It can be understood that the electronic device according to the third aspect, the computer storage medium according to the fourth aspect, and the computer program product according to the fifth aspect are all used to perform the corresponding methods provided above. Therefore, for the beneficial effects they can achieve, refer to the beneficial effects of the corresponding methods provided above. Details are not repeated here.
Brief Description of the Drawings
FIG. 1 is a schematic architectural diagram of a communication system according to an embodiment of this application;
FIG. 2 is a first schematic structural diagram of an electronic device according to an embodiment of this application;
FIG. 3 is a schematic architectural diagram of an operating system in an electronic device according to an embodiment of this application;
FIG. 4 is a first schematic diagram of an application scenario of a screen projection display method according to an embodiment of this application;
FIG. 5 is a second schematic diagram of an application scenario of a screen projection display method according to an embodiment of this application;
FIG. 6 is a third schematic diagram of an application scenario of a screen projection display method according to an embodiment of this application;
FIG. 7 is a fourth schematic diagram of an application scenario of a screen projection display method according to an embodiment of this application;
FIG. 8 is a fifth schematic diagram of an application scenario of a screen projection display method according to an embodiment of this application;
FIG. 9 is a sixth schematic diagram of an application scenario of a screen projection display method according to an embodiment of this application;
FIG. 10 is a seventh schematic diagram of an application scenario of a screen projection display method according to an embodiment of this application;
FIG. 11 is an eighth schematic diagram of an application scenario of a screen projection display method according to an embodiment of this application;
FIG. 12 is a ninth schematic diagram of an application scenario of a screen projection display method according to an embodiment of this application;
FIG. 13 is a tenth schematic diagram of an application scenario of a screen projection display method according to an embodiment of this application;
FIG. 14 is an eleventh schematic diagram of an application scenario of a screen projection display method according to an embodiment of this application;
FIG. 15 is a twelfth schematic diagram of an application scenario of a screen projection display method according to an embodiment of this application;
FIG. 16 is a thirteenth schematic diagram of an application scenario of a screen projection display method according to an embodiment of this application;
FIG. 17 is a second schematic structural diagram of an electronic device according to an embodiment of this application.
Detailed Description of Embodiments
The following describes the implementations of the embodiments in detail with reference to the accompanying drawings.
As shown in FIG. 1, a screen projection display method provided by an embodiment of this application can be applied to a communication system 100, and the communication system 100 may include N (N>1) electronic devices. For example, the communication system 100 may include an electronic device 101 and an electronic device 102.
For example, the electronic device 101 may be connected to the electronic device 102 through one or more communication networks 104.
The communication network 104 may be a wired network or a wireless network. For example, the communication network 104 may be a local area network (LAN) or a wide area network (WAN), such as the Internet. The communication network 104 may be implemented using any known network communication protocol, which may be any of various wired or wireless communication protocols, such as Ethernet, universal serial bus (USB), FIREWIRE, global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), Bluetooth, wireless fidelity (Wi-Fi), NFC, voice over Internet protocol (VoIP), a communication protocol supporting a network slicing architecture, or any other suitable communication protocol. For example, in some embodiments, the electronic device 101 may establish a Wi-Fi connection with the electronic device 102 through the Wi-Fi protocol.
For example, the electronic device 101 may serve as the destination device, and the electronic device 102 may serve as a source device of the electronic device 101. The electronic device 102 can project the display content in its display interface to the electronic device 101, and the electronic device 101, as the destination device, can display the content sent by the electronic device 102 in its own display interface.
Still as shown in FIG. 1, the communication system 100 may further include an electronic device 103; for example, the electronic device 103 may be a wearable device. For example, the electronic device 103 may also serve as a source device of the electronic device 101 and project the display content in its display interface to the electronic device 101; likewise, the electronic device 101 can display the content sent by the electronic device 103 in its own display interface.
In the embodiments of this application, one destination device (for example, the electronic device 101) can simultaneously receive display content sent by multiple source devices (for example, the electronic device 102 and the electronic device 103), and the destination device can simultaneously present, in its own display interface, the display content sent by the different source devices.
For example, the electronic device 101 is still used as the destination device, and the electronic device 102 and the electronic device 103 are used as source devices.
The electronic device 102 can identify the controls in its own display interface (for example, a first interface) and determine one or more controls (hereinafter referred to as first target controls) that need to be projected to the electronic device 101 for display this time. The electronic device 102 can then send the first target controls and the related drawing instructions to the electronic device 101.
Meanwhile, the electronic device 103 can also identify the controls in its own display interface (for example, a second interface) and determine one or more controls (hereinafter referred to as second target controls) that need to be projected to the electronic device 101 for display this time. The electronic device 103 can then send the second target controls and the related drawing instructions to the electronic device 101.
The electronic device 101 can then call the drawing instructions sent by the electronic device 102 to draw the first target controls in its own display interface. Meanwhile, the electronic device 101 can also call the drawing instructions sent by the electronic device 103 to draw the second target controls in its own display interface. In this way, the electronic device 101 can finally display a screen projection interface including the first target controls and the second target controls. The screen projection interface displays both the content that the first source device (that is, the electronic device 102) needs to project and the content that the second source device (that is, the electronic device 103) needs to project.
That is, in a screen projection scenario, the destination device can simultaneously display the display content of multiple source devices, so that the user can watch the display content of multiple source devices in one destination device without switching between multiple source devices, thereby improving the display effect and user experience of screen projection display between multiple devices.
Certainly, any one or more electronic devices in the communication system 100 may serve as a source device or a destination device, which is not limited in the embodiments of this application.
In some embodiments, the specific structures of the electronic device 101, the electronic device 102, and the electronic device 103 may be the same or different.
For example, each of the foregoing electronic devices may specifically be a mobile phone, a tablet computer, a smart TV, a wearable electronic device, an in-vehicle device, a notebook computer, an ultra-mobile personal computer (UMPC), a handheld computer, a netbook, a personal digital assistant (PDA), a virtual reality device, or the like, which is not limited in the embodiments of this application.
Taking the electronic device 101 as an example, FIG. 2 shows a schematic structural diagram of the electronic device 101.
The electronic device 101 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, a headset jack 170D, a sensor module 180, a camera 193, a display screen 194, and the like.
It can be understood that the structure illustrated in this embodiment of the present invention does not constitute a specific limitation on the electronic device 101. In some other embodiments of this application, the electronic device 101 may include more or fewer components than those shown in the figure, combine some components, split some components, or have a different component arrangement. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units. For example, the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU). Different processing units may be independent components or may be integrated into one or more processors.
A memory may be further disposed in the processor 110 to store instructions and data. In some embodiments, the memory in the processor 110 is a cache, which can store instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs to use the instructions or data again, it can call them directly from the memory. This avoids repeated access and reduces the waiting time of the processor 110, thereby improving system efficiency.
In some embodiments, the processor 110 may include one or more interfaces. The interfaces may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface.
The charging management module 140 is configured to receive a charging input from a charger. The charger may be a wireless charger or a wired charger. In some wired charging embodiments, the charging management module 140 may receive the charging input of a wired charger through the USB interface 130. In some wireless charging embodiments, the charging management module 140 may receive a wireless charging input through a wireless charging coil of the electronic device 101. While charging the battery 142, the charging management module 140 may also supply power to the electronic device through the power management module 141.
The power management module 141 is configured to connect the battery 142 and the charging management module 140 to the processor 110. The power management module 141 receives input from the battery 142 and/or the charging management module 140, and supplies power to the processor 110, the internal memory 121, the display screen 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be configured to monitor parameters such as battery capacity, battery cycle count, and battery state of health (leakage, impedance). In some other embodiments, the power management module 141 may alternatively be disposed in the processor 110. In still other embodiments, the power management module 141 and the charging management module 140 may alternatively be disposed in the same component.
电子设备101的无线通信功能可以通过天线1,天线2,移动通信模块150,无线通信模块160,调制解调处理器以及基带处理器等实现。
天线1和天线2用于发射和接收电磁波信号。电子设备101中的每个天线可用于覆盖单个或多个通信频带。不同的天线还可以复用,以提高天线的利用率。例如:可以将天线1复用为无线局域网的分集天线。在另外一些实施例中,天线可以和调谐开关结合使用。
移动通信模块150可以提供应用在电子设备101上的包括2G/3G/4G/5G等无线通信的解决方案。移动通信模块150可以包括一个或多个滤波器,开关,功率放大器,低噪声放大器(low noise amplifier,LNA)等。移动通信模块150可以由天线1接收电磁波,并对接收的电磁波进行滤波,放大等处理,传送至调制解调处理器进行解调。移动通信模块150还可以对经调制解调处理器调制后的信号放大,经天线1转为电磁波辐射出去。在一些实施例中,移动通信模块150的至少部分功能模块可以被设置于处理器110中。在一些实施例中,移动通信模块150的至少部分功能模块可以与处理器110的至少部分模块被设置在同一个器件中。
调制解调处理器可以包括调制器和解调器。其中,调制器用于将待发送的低频基带信号调制成中高频信号。解调器用于将接收的电磁波信号解调为低频基带信号。随后解调器将解调得到的低频基带信号传送至基带处理器处理。低频基带信号经基带处理器处理后,被传递给应用处理器。应用处理器通过音频设备(不限于扬声器170A,受话器170B等)输出声音信 号,或通过显示屏194显示图像或视频。在一些实施例中,调制解调处理器可以是独立的器件。在另一些实施例中,调制解调处理器可以独立于处理器110,与移动通信模块150或其他功能模块设置在同一个器件中。
无线通信模块160可以提供应用在电子设备101上的包括无线局域网(wireless local area networks,WLAN)(如无线保真(wireless fidelity,Wi-Fi)网络),蓝牙(Bluetooth,BT),全球导航卫星系统(global navigation satellite system,GNSS),调频(frequency modulation,FM),近距离无线通信技术(near field communication,NFC),红外技术(infrared,IR)等无线通信的解决方案。无线通信模块160可以是集成一个或多个通信处理模块的一个或多个器件。无线通信模块160经由天线2接收电磁波,将电磁波信号调频以及滤波处理,将处理后的信号发送到处理器110。无线通信模块160还可以从处理器110接收待发送的信号,对其进行调频,放大,经天线2转为电磁波辐射出去。
在一些实施例中,电子设备101的天线1和移动通信模块150耦合,天线2和无线通信模块160耦合,使得电子设备101可以通过无线通信技术与网络以及其他设备通信。所述无线通信技术可以包括全球移动通讯系统(global system for mobile communications,GSM),通用分组无线服务(general packet radio service,GPRS),码分多址接入(code division multiple access,CDMA),宽带码分多址(wideband code division multiple access,WCDMA),时分码分多址(time-division code division multiple access,TD-SCDMA),长期演进(long term evolution,LTE),BT,GNSS,WLAN,NFC,FM,和/或IR技术等。所述GNSS可以包括全球卫星定位系统(global positioning system,GPS),全球导航卫星系统(global navigation satellite system,GLONASS),北斗卫星导航系统(beidou navigation satellite system,BDS),准天顶卫星系统(quasi-zenith satellite system,QZSS)和/或星基增强系统(satellite based augmentation systems,SBAS)。
电子设备101通过GPU,显示屏194,以及应用处理器等实现显示功能。GPU为图像处理的微处理器,连接显示屏194和应用处理器。GPU用于执行数学和几何计算,用于图形渲染。处理器110可包括一个或多个GPU,其执行程序指令以生成或改变显示信息。
显示屏194用于显示图像,视频等。显示屏194包括显示面板。显示面板可以采用液晶显示屏(liquid crystal display,LCD),有机发光二极管(organic light-emitting diode,OLED),有源矩阵有机发光二极体或主动矩阵有机发光二极体(active-matrix organic light emitting diode,AMOLED),柔性发光二极管(flex light-emitting diode,FLED),Miniled,MicroLed,Micro-oLed,量子点发光二极管(quantum dot light emitting diodes,QLED)等。在一些实施例中,电子设备101可以包括1个或N个显示屏194,N为大于1的正整数。
电子设备101可以通过ISP,摄像头193,视频编解码器,GPU,显示屏194以及应用处理器等实现拍摄功能。
ISP用于处理摄像头193反馈的数据。例如,拍照时,打开快门,光线通过镜头被传递到摄像头感光元件上,光信号转换为电信号,摄像头感光元件将所述电信号传递给ISP处理,转化为肉眼可见的图像。ISP还可以对图像的噪点,亮度,肤色进行算法优化。ISP还可以对拍摄场景的曝光,色温等参数优化。在一些实施例中,ISP可以设置在摄像头193中。
摄像头193用于捕获静态图像或视频。物体通过镜头生成光学图像投射到感光元件。感光元件可以是电荷耦合器件(charge coupled device,CCD)或互补金属氧化物半导体(complementary metal-oxide-semiconductor,CMOS)光电晶体管。感光元件把光信号转换成电信号,之后将电信号传递给ISP转换成数字图像信号。ISP将数字图像信号输出到DSP加工 处理。DSP将数字图像信号转换成标准的RGB,YUV等格式的图像信号。在一些实施例中,电子设备101可以包括1个或N个摄像头193,N为大于1的正整数。
数字信号处理器用于处理数字信号,除了可以处理数字图像信号,还可以处理其他数字信号。例如,当电子设备101在频点选择时,数字信号处理器用于对频点能量进行傅里叶变换等。
视频编解码器用于对数字视频压缩或解压缩。电子设备101可以支持一种或多种视频编解码器。这样,电子设备101可以播放或录制多种编码格式的视频,例如:动态图像专家组(moving picture experts group,MPEG)1,MPEG2,MPEG3,MPEG4等。
外部存储器接口120可以用于连接外部存储卡,例如Micro SD卡,实现扩展电子设备101的存储能力。外部存储卡通过外部存储器接口120与处理器110通信,实现数据存储功能。例如将音乐,视频等文件保存在外部存储卡中。
内部存储器121可以用于存储一个或多个计算机程序,该一个或多个计算机程序包括指令。处理器110可以通过运行存储在内部存储器121的上述指令,从而使得电子设备101执行本申请一些实施例中所提供的投屏显示方法,以及各种功能应用和数据处理等。内部存储器121可以包括存储程序区和存储数据区。其中,存储程序区可存储操作系统;该存储程序区还可以存储一个或多个应用程序(比如图库、联系人等)等。存储数据区可存储电子设备101使用过程中所创建的数据(比如照片,联系人等)等。此外,内部存储器121可以包括高速随机存取存储器,还可以包括非易失性存储器,例如一个或多个磁盘存储器件,闪存器件,通用闪存存储器(universal flash storage,UFS)等。在另一些实施例中,处理器110通过运行存储在内部存储器121的指令,和/或存储在设置于处理器中的存储器的指令,来使得电子设备101执行本申请实施例中提供的投屏显示方法,以及各种功能应用和数据处理。
电子设备101可以通过音频模块170,扬声器170A,受话器170B,麦克风170C,耳机接口170D,以及应用处理器等实现音频功能。例如音乐播放,录音等。
音频模块170用于将数字音频信息转换成模拟音频信号输出,也用于将模拟音频输入转换为数字音频信号。音频模块170还可以用于对音频信号编码和解码。在一些实施例中,音频模块170可以设置于处理器110中,或将音频模块170的部分功能模块设置于处理器110中。
扬声器170A,也称“喇叭”,用于将音频电信号转换为声音信号。电子设备101可以通过扬声器170A收听音乐,或收听免提通话。
受话器170B,也称“听筒”,用于将音频电信号转换成声音信号。当电子设备101接听电话或语音信息时,可以通过将受话器170B靠近人耳接听语音。
麦克风170C,也称“话筒”,“传声器”,用于将声音信号转换为电信号。当拨打电话或发送语音信息时,用户可以通过人嘴靠近麦克风170C发声,将声音信号输入到麦克风170C。电子设备101可以设置一个或多个麦克风170C。在另一些实施例中,电子设备101可以设置两个麦克风170C,除了采集声音信号,还可以实现降噪功能。在另一些实施例中,电子设备101还可以设置三个,四个或更多麦克风170C,实现采集声音信号,降噪,还可以识别声音来源,实现定向录音功能等。
耳机接口170D用于连接有线耳机。耳机接口170D可以是USB接口130,也可以是3.5mm的开放移动电子设备平台(open mobile terminal platform,OMTP)标准接口,美国蜂窝电信工业协会(cellular telecommunications industry association of the USA,CTIA)标准接口。
传感器模块180可以包括压力传感器,陀螺仪传感器,气压传感器,磁传感器,加速度传感器,距离传感器,接近光传感器,指纹传感器,温度传感器,触摸传感器,环境光传感器,骨传导传感器等。
另外,上述电子设备中还可以包括按键、马达、指示器以及SIM卡接口等一种或多种部件,本申请实施例对此不做任何限制。
上述电子设备101的软件系统可以采用分层架构,事件驱动架构,微核架构,微服务架构,或云架构。本申请实施例以分层架构的Android系统为例,示例性说明电子设备101的软件结构。
图3是本申请实施例的电子设备101的软件结构框图。
分层架构将软件分成若干个层,每一层都有清晰的角色和分工。层与层之间通过软件接口通信。在一些实施例中,将Android系统分为四层,从上至下分别为应用程序层,应用程序框架层,安卓运行时(Android runtime)和系统库,以及内核层。
1、应用程序层
应用程序层可以包括一系列应用程序。
如图3所示,上述应用程序可以包括通话,联系人,相机,图库,日历,地图,导航,蓝牙,音乐,视频,短信息等APP(应用,application)。
2、应用程序框架层
应用程序框架层为应用程序层的应用程序提供应用编程接口(application programming interface,API)和编程框架。应用程序框架层包括一些预先定义的函数。
如图3所示,应用程序框架层中可以包括视图系统(view system),通知管理器,活动管理器,窗口管理器,内容提供器,资源管理器,输入法管理器等。
其中,视图系统可用于构建应用程序的显示界面。每个显示界面可以由一个或多个控件组成。一般而言,控件可以包括图标、按钮、菜单、选项卡、文本框、对话框、状态栏、导航栏、微件(Widget)等界面元素。
视图系统在绘制显示界面时可获取对应显示界面的视图信息,该视图信息中记录了需要绘制的显示界面中各个控件之间的图层关系。示例性的,显示界面中的各个控件一般按照树状结构分层组织,形成一个完整的ViewTree(视图树),该视图树可称为上述显示界面的视图信息。视图系统可根据ViewTree中设置好的各个控件之间的图层关系绘制显示界面。显示界面中的每一个控件在绘制时都对应一组绘制指令,例如DrawLine、DrawPoint、DrawBitmap等。
例如,图4中的(a)示出了微信APP的聊天界面401,聊天界面401中最底层的控件为根节点(root),根节点下设置有底图402这一控件,底图402中还包括以下控件:标题栏403、聊天背景404以及输入栏405。其中,标题栏403中进一步包括返回按钮406和标题407,聊天背景404中进一步包括头像408和气泡409,输入栏405中进一步包括语音输入按钮图标410、输入框411以及发送按钮412。
上述控件按照顺序分层可形成如图4中(b)所示的视图树A。其中,底图402为根节点的子节点,标题栏403、聊天背景404以及输入栏405均为底图402的子节点。返回按钮406和标题407均为标题栏403的子节点。头像408和气泡409均为聊天背景404的子节点。语音输入按钮图标410、输入框411以及发送按钮412均为输入栏405的子节点。视图系统在绘制聊天界面401时可按照视图树A中各个控件之间的图层关系,从根节点开始逐层调用对应控件的绘制指令绘制每个控件,最终形成聊天界面401。
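作为补充说明,上述"视图树按先根顺序逐层绘制"的过程可以用下面的示意代码表达(Python;节点结构与控件命名沿用图4示例,并非Android中View/ViewTree的真实API,仅用于说明遍历顺序):

```python
class View:
    """视图树节点:一个控件及其子控件(示意结构,非Android真实API)。"""

    def __init__(self, name):
        self.name = name
        self.children = []

    def add(self, *children):
        """按图层顺序挂接子控件。"""
        self.children.extend(children)
        return self

    def render(self, order):
        """从当前节点开始逐层"绘制":此处仅记录绘制顺序,
        真实实现中每个控件对应一组DrawLine/DrawBitmap等绘制指令。"""
        order.append(self.name)
        for child in self.children:
            child.render(order)


# 按图4构建聊天界面401的视图树A
root = View("root")
root.add(
    View("底图402").add(
        View("标题栏403").add(View("返回按钮406"), View("标题407")),
        View("聊天背景404").add(View("头像408"), View("气泡409")),
        View("输入栏405").add(View("语音输入按钮图标410"), View("输入框411"), View("发送按钮412")),
    )
)

draw_order = []
root.render(draw_order)
# draw_order即为从根节点开始逐层绘制各控件的顺序
```

从draw_order可以看出,绘制从根节点开始,先绘制父控件(底层图层),再依次绘制其子控件,与上文描述的逐层调用绘制指令一致。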
在本申请实施例中,可以在源设备的视图系统增加一个投屏管理模块。
当用户打开源设备的投屏功能后,投屏管理模块可记录视图系统绘制显示画面中每个控件的绘制指令,以及该绘制指令所需的绘图资源(例如头像、图标等)。并且,投屏管理模块可基于当前显示画面的视图树1,生成本次在目的设备中需要显示的投屏界面的视图树2。视图树2中控件的数量与视图树1中控件的数量可以不同,视图树2中控件之间的位置关系也可与视图树1中控件之间的位置关系不同。进而,投屏管理模块可指示源设备将视图树2以及视图树2中各个控件的绘制指令和绘图资源发送给目的设备,使得目的设备可按照视图树2中控件之间的图层关系,逐层调用相应控件的绘制指令绘制投屏后的投屏界面。
那么,当同一目的设备有多个不同的源设备时,每个源设备均可按照上述方法向目的设备发送本次需要投屏显示的视图树及相关的绘制指令和绘图资源。
相应的,目的设备的视图系统中也可设置投屏管理模块。
例如,目的设备可以接收源设备1发来的第一UI消息,第一UI消息中可以包括视图树A以及视图树A中各个控件的绘制指令和绘图资源,并且,目的设备可以接收源设备2发来的第二UI消息,第二UI消息中可以包括视图树B以及视图树B中各个控件的绘制指令和绘图资源。进而,目的设备的投屏管理模块可基于视图树A和视图树B,对视图树A和视图树B中的控件进行拆分、删减或重组等操作,从而生成本次需要显示的投屏界面的视图树C。这样,目的设备可按照视图树C中控件之间的图层关系,逐层调用相应控件的绘制指令和绘图资源绘制投屏后的投屏界面。
这样一来,目的设备最终绘制的投屏界面中同时包含视图树A中的控件和视图树B中的控件,即投屏界面中同时包含源设备1中的显示内容和源设备2中的显示内容。用户可以在一个目的设备中同时观看到不同源设备上的显示内容,从而提高投屏场景下目的设备的显示效果和用户的使用体验。
需要说明的是,上述投屏管理模块也可以独立于视图系统单独设置在应用程序框架层,本申请实施例对此不做任何限制。
另外,上述活动管理器可用于管理每个应用的生命周期。应用通常以activity的形式运行在操作系统中。活动管理器可以调度应用的activity进程管理每个应用的生命周期。窗口管理器用于管理窗口程序。窗口管理器可以获取显示屏大小,判断是否有状态栏,锁定屏幕,截取屏幕等。内容提供器用来存放和获取数据,并使这些数据可以被应用程序访问。所述数据可以包括视频,图像,音频,拨打和接听的电话,浏览历史和书签,电话簿等。资源管理器为应用程序提供各种资源,比如本地化字符串,图标,图片,布局文件,视频文件等。
3、Android runtime和系统库
Android Runtime包括核心库和虚拟机。Android runtime负责安卓系统的调度和管理。
核心库包含两部分:一部分是java语言需要调用的功能函数,另一部分是安卓的核心库。
应用程序层和应用程序框架层运行在虚拟机中。虚拟机将应用程序层和应用程序框架层的java文件执行为二进制文件。虚拟机用于执行对象生命周期的管理,堆栈管理,线程管理,安全和异常的管理,以及垃圾回收等功能。
系统库可以包括多个功能模块。例如:表面管理器(surface manager),媒体库(Media Libraries),三维图形处理库(例如:OpenGL ES),2D图形引擎(例如:SGL)等。
其中,表面管理器用于对显示子系统进行管理,并且为多个应用程序提供了2D和3D图层的融合。媒体库支持多种常用的音频,视频格式回放和录制,以及静态图像文件等。媒体库可以支持多种音视频编码格式,例如:MPEG4,H.264,MP3,AAC,AMR,JPG,PNG等。三维图形处理库用于实现三维图形绘图,图像渲染,合成,和图层处理等。2D图形引擎是2D绘图的绘图引擎。
4、内核层
内核层是硬件和软件之间的层。内核层至少包含显示驱动,摄像头驱动,音频驱动,传感器驱动等,本申请实施例对此不做任何限制。
以下将结合附图详细阐述本申请实施例提供的一种投屏显示方法。
示例性的,如图5中的(a)所示,用户在手机501中开启投屏功能后,手机501可作为源设备。并且,手机501可显示提示框503,提示框503中包含可作为目的设备进行投屏的一个或多个候选设备。用户可在提示框503中选择一个候选设备作为本次投屏的目的设备。例如,如果检测到用户点击提示框503中的“手机500”,则手机501可确定本次投屏的目的设备为手机500。
其中,上述提示框503中的候选设备可以是与手机501位于同一通信网络中的电子设备。例如,提示框503中的各个候选设备均为手机501接入的Wi-Fi网络中的电子设备。又例如,提示框503中的各个候选设备均为手机501登录的手机账号下的电子设备。
类似的,如图5中的(b)所示,用户在手机502中开启投屏功能后,手机502也可作为源设备。并且,手机502可显示提示框504,与上述提示框503类似的,提示框504中也包含可作为目的设备进行投屏的一个或多个候选设备。用户可在提示框504中选择一个候选设备作为本次投屏的目的设备。例如,如果检测到用户点击提示框504中的“手机500”,则手机502可确定本次投屏的目的设备为手机500。
其中,上述提示框504中的候选设备可以是与手机502位于同一通信网络中的电子设备。例如,提示框504中的各个候选设备均为手机502接入的Wi-Fi网络中的电子设备。又例如,提示框504中的各个候选设备均为手机502登录的手机账号下的电子设备。
可以看出,当上述手机500、手机501与手机502属于同一通信网络时,其中的一个手机可同时作为其他两个手机的目的设备进行投屏显示。例如,手机501(即第一源设备)投屏的目的设备为手机500,并且,手机502(即第二源设备)投屏的目的设备也为手机500。
当然,用户也可以将某一电子设备(例如上述手机500)设置为目的设备,并在目的设备中选择多个电子设备(例如上述手机501和手机502)作为源设备进行本次投屏,本申请实施例对此不做任何限制。
仍以手机501和手机502为本次投屏的源设备,手机500为本次投屏的目的设备举例。
如图6所示,如果手机501开启投屏功能后显示第一音乐APP的播放界面600,说明此时需要将播放界面600中的显示内容投射至手机500中显示。示例性的,播放界面600中包括以下控件:底图601、状态栏602、标题栏603、专辑封面604、歌词605以及控制栏606。其中,状态栏602中包括时间、信号强度以及电池容量等控件。标题栏603中包括歌曲名称6031和演唱者6032等控件。控制栏606中包括进度条6061、暂停按钮6062、上一首按钮6063以及下一首按钮6064等控件。
进而,手机501可获取view system绘制上述播放界面600时对应的视图信息。以视图树为视图信息举例,手机501可获取播放界面600的视图树,以及视图树中各个控件的绘制指令和绘图资源。例如,如图7所示,为上述播放界面600的视图树701。视图树701记录了上述播放界面600中各个控件之间的图层关系。在视图树701中,播放界面600的根节点下包括底图601这一子节点,状态栏602、标题栏603、专辑封面604、歌词605以及控制栏606均为底图601的子节点。歌曲名称6031和演唱者6032为标题栏603的子节点。进度条6061、暂停按钮6062、上一首按钮6063以及下一首按钮6064为控制栏606的子节点。
手机501获取到上述播放界面600的视图树701后,可进一步确定播放界面600中需要投射至手机500中显示的一个或多个控件(即第一目标控件)。
示例性的,手机501中可预先设置与上述播放界面600对应的配置文件。或者,手机501可从服务器中获取与播放界面600对应的配置文件。该配置文件中记录了播放界面600中源设备需要投射至目的设备上的一个或多个控件。
示例性的,在配置文件中可以记录播放界面600中需要投射至目的设备上的一个或多个控件的标识,这样,手机501根据配置文件中记录的控件的标识可确定出播放界面600中需要投射至手机500中显示的第一目标控件。又例如,由于不同版本的第一音乐APP在运行播放界面600时,播放界面600内各个控件的标识可能会被更新,因此通过配置文件中控件的标识确定第一目标控件时可能会不准确。对此,也可在配置文件中记录需要投射至目的设备上的一个或多个控件在播放界面600中的具体显示位置,例如控件的坐标,这样,手机501根据配置文件中记录的控件在播放界面600中的显示位置可以唯一确定出播放界面600中需要投射至手机500中显示的第一目标控件。
当然,对于不同类型的目的设备,手机501可预先设置多份与上述播放界面600对应的配置文件。例如,当目的设备为手机类型的设备时,与播放界面600对应的配置文件为配置文件1;当目的设备为手表类型的设备时,与播放界面600对应的配置文件为配置文件2。那么,手机501可根据本次用户选中的目的设备(例如上述手机500)的类型,获取与播放界面600对应的配置文件(例如配置文件1)。
示例性的,与播放界面600对应的配置文件1可以为:
(配置文件1的具体内容以附图形式给出)
需要说明的是,上述配置文件可以采用JSON(JavaScript Object Notation)格式、XML(Extensible Markup Language)格式或文本格式等格式存储在手机中或服务器中,本申请实施例对此不做任何限制。
可以看出,配置文件1中包含多个“src”字段(例如上述“src1”字段和“src2”字段)。每个“src”字段中记录了播放界面600中一个控件的具体位置。例如,每个控件的位置均可通过left,top,width,height这4个参数的取值唯一确定。其中,left为控件左上角顶点的x轴坐标,top为控件左上角顶点的y轴坐标,width为控件的宽度,height为控件的高度。配置文件1中记录的一个或多个控件即为手机501需要投射至手机500中显示的第一目标控件。
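结合本段对left、top、width、height四个参数的描述,一个“src”字段在JSON格式配置文件中的形态可示意如下(字段取值均为假设,仅用于说明结构,并非原文附图中的实际内容):

```json
{
  "src1": { "left": 100, "top": 200, "width": 300, "height": 80 },
  "src2": { "left": 100, "top": 320, "width": 300, "height": 300 }
}
```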
示例性的,检测到用户在手机501中将手机500选中为目的设备后,手机501可获取预先为手机这一类型的目的设备设置的一个或多个配置文件。并且,手机501可获取当前在前台运行的第一音乐APP的包名(packagename),以及当前显示的播放界面600的activityname。进而,手机501可在获取到的配置文件中,根据该packagename和activityname,查询与播放界面600对应的配置文件1。最终,手机501根据配置文件1中“src”字段记录的各个控件的位置,可识别出播放界面600中需要投射至手机500中显示的第一目标控件。
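上述"按packagename与activityname查询配置文件"的匹配逻辑可粗略示意如下(Python;其中的包名、activity名及数据结构均为举例假设,并非本申请的实际实现):

```python
# 以(packagename, activityname)为键索引配置文件(示意数据,均为假设)
CONFIG_FILES = {
    ("com.example.music1", "PlayActivity"): "配置文件1",
    ("com.example.music2", "PlayActivity"): "配置文件2",
}

def find_config(package_name, activity_name):
    """根据前台应用的packagename与当前界面的activityname查询对应配置文件。"""
    return CONFIG_FILES.get((package_name, activity_name))

matched = find_config("com.example.music1", "PlayActivity")
```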
又或者,用户还可以在播放界面600中手动指定投射至手机500中显示的第一目标控件。例如,手机501可检测用户在播放界面600中的触摸事件。如果检测到用户在播放界面600的A点执行预设的触摸事件,则手机501可根据A点的坐标在配置文件1中查找对应的控件,并将查找到的控件确定为投射至手机500中显示的第一目标控件。
在一些实施例中,上述配置文件1中还可以记录第一目标控件投屏后在目的设备中的具体显示位置。例如,可以在上述配置文件1设置与“src1”字段对应的“dest1”字段,“dest1”字段用于指示控件1在目的设备中的显示位置。示例性的,“dest1”字段如下所示:
(“dest1”字段的具体内容以附图形式给出)
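与上文“src1”字段相对应,“dest1”字段可示意如下(JSON,取值为假设,并非原文附图中的实际内容),用于指示控件1投屏后在目的设备中的显示位置:

```json
{
  "dest1": { "left": 40, "top": 60, "width": 300, "height": 80 }
}
```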
那么,手机501根据配置文件1中的各个“dest”字段,可以确定出播放界面600中的每个第一目标控件投屏后在目的设备(即手机500)中的具体显示位置。
在另一些实施例中,上述配置文件1中还可以记录第一目标控件在投屏前后显示位置的变化关系。例如,配置文件1中针对控件1还设置有如下字段:
(该字段的具体内容以附图形式给出)
其中,“translationx”字段和“translationy”字段分别用于指示控件1投屏后在x轴和y轴上的平移距离;“scalex”字段和“scaley”字段分别用于指示控件1投屏后在x轴和y轴上的缩放比例;“rotatedegree”字段用于指示控件1投屏后的旋转角度;“order”字段用于指示控件1投屏后的所在的图层位置(例如在最底层图层还是在最顶层图层中)。
同样,手机501根据上述字段中记录的控件1在投屏前后显示位置的变化关系,也可以确定出控件1投屏后在目的设备(即手机500)中的具体显示位置。
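根据“translationx/translationy”“scalex/scaley”“rotatedegree”“order”等字段,由控件投屏前的位置推算投屏后位置的计算可示意如下(Python;字段取值为假设,且未展开旋转对包围盒的影响):

```python
def apply_transform(src, transform):
    """按配置文件中记录的投屏前后变化关系,计算控件投屏后的显示位置(示意)。"""
    return {
        "left": src["left"] + transform.get("translationx", 0),    # x轴平移
        "top": src["top"] + transform.get("translationy", 0),      # y轴平移
        "width": src["width"] * transform.get("scalex", 1.0),      # x轴缩放
        "height": src["height"] * transform.get("scaley", 1.0),    # y轴缩放
        "rotate": transform.get("rotatedegree", 0),                # 旋转角度
        "order": transform.get("order", 0),                        # 投屏后所在图层位置
    }

src1 = {"left": 100, "top": 200, "width": 300, "height": 80}
dest1 = apply_transform(src1, {"translationx": -60, "translationy": 40,
                               "scalex": 0.5, "scaley": 0.5, "order": 1})
```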
示例性的,手机501根据上述配置文件1识别出播放界面600中的第一目标控件,以及第一目标控件投屏后的具体显示位置后,可基于播放界面600的视图树701生成包含第一目标控件的新的视图树。
例如,手机501根据上述配置文件1识别出播放界面600中的第一目标控件包括:标题栏603中的歌曲名称6031和演唱者6032,控制栏606中的暂停按钮6062、上一首按钮6063以及下一首按钮6064,以及专辑封面604。并且,这些控件之间的位置关系在投屏前后没有变化。也就是说,第一目标控件在播放界面600中的图层关系与第一目标控件在后续投屏界面中的图层关系相同。
那么,如图8所示,手机501将上述视图树701中的各个控件拆分后,可将不是第一目标控件的节点(例如底图601、状态栏602、歌词605以及控制栏606中的进度条6061)删除。并且,手机501可将保留的第一目标控件按照视图树701中的图层顺序进行重组,得到播放界面600投屏后对应的视图树801。在视图树801中,根节点下包括:标题栏603、专辑封面604以及控制栏606这三个子节点,控制栏606下包括:暂停按钮6062、上一首按钮6063以及下一首按钮6064这三个子节点。
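图8所示的"拆分-删减-重组"过程,其核心是对视图树做一次只保留目标控件的递归删减:被删除节点的子树中若仍有目标控件,则上提一层挂接。下面是一个示意实现(Python;节点用dict表示,结构为举例假设,并非本申请的实际实现):

```python
def prune(node, keep):
    """返回删减后的子树列表:node在keep集合中则保留(其子树同样删减);
    否则删除node本身,其被保留的子节点上提一层。"""
    kept_children = []
    for child in node["children"]:
        kept_children.extend(prune(child, keep))
    if node["name"] in keep:
        return [{"name": node["name"], "children": kept_children}]
    return kept_children  # 节点被删,子节点上提

# 按图7构建播放界面600的视图树701(仅列出控件名称)
tree701 = {"name": "root", "children": [{"name": "底图601", "children": [
    {"name": "状态栏602", "children": []},
    {"name": "标题栏603", "children": [
        {"name": "歌曲名称6031", "children": []},
        {"name": "演唱者6032", "children": []}]},
    {"name": "专辑封面604", "children": []},
    {"name": "歌词605", "children": []},
    {"name": "控制栏606", "children": [
        {"name": "进度条6061", "children": []},
        {"name": "暂停按钮6062", "children": []},
        {"name": "上一首按钮6063", "children": []},
        {"name": "下一首按钮6064", "children": []}]}]}]}

first_targets = {"标题栏603", "歌曲名称6031", "演唱者6032", "专辑封面604",
                 "控制栏606", "暂停按钮6062", "上一首按钮6063", "下一首按钮6064"}

# root不在keep集合中,其保留下来的后代上提为新根的子节点,得到视图树801
tree801 = {"name": "root", "children": prune(tree701, first_targets)}
```

删减后的tree801中,底图601、状态栏602、歌词605和进度条6061均被移除,与图8中视图树801的结构一致。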
进而,手机501(即第一源设备)可通过上述通信网络104向手机500(即目的设备)发送第一UI消息,第一UI消息中包括上述视图树801以及视图树801中每个控件相关的绘制指令和绘图资源。例如,手机501与手机500可基于TCP/IP(Transmission Control Protocol/Internet Protocol)协议建立socket连接。进而,手机501可使用该socket连接将与上述播放界面600对应的第一UI消息发送给手机500。
后续,当手机501更新了上述播放界面600时,手机501可继续按照上述方法生成与新的显示界面对应的第一UI消息,并将新的第一UI消息发送给手机500。
上述实施例中是以第一源设备(例如手机501)为执行主体阐述本申请实施例提供的一种投屏显示方法。当目的设备(例如手机500)有多个源设备同时对目的设备进行投屏时,每个源设备都可按照上述方法将自身显示界面中的一个或多个控件投屏至目的设备中显示。
示例性的,在手机501向手机500进行投屏的同时,手机502也可以开启投屏功能,并将本次投屏的目的设备设置为手机500。
如图9所示,如果手机502开启投屏功能后显示第二音乐APP的播放界面900,说明此时需要将播放界面900中的显示内容投射至手机500中显示。示例性的,播放界面900中包括以下控件:底图901、状态栏902、标题栏903、弹幕904、专辑封面905以及控制栏906。其中,状态栏902中包括时间、信号强度以及电池容量等控件。标题栏903中包括歌曲名称9031和演唱者9032等控件。控制栏906中包括进度条9061、暂停按钮9062、上一首按钮9063以及下一首按钮9064等控件。可以看出,与上述第一音乐APP的播放界面600相比,第二音乐APP的播放界面900中设置了观看弹幕的功能。
同样,手机502可获取其view system绘制上述播放界面900时的视图树,以及视图树中各个控件的绘制指令和绘图资源。例如,如图10中的(a)所示,为上述播放界面900的视图树1001。视图树1001记录了上述播放界面900中各个控件之间的图层关系。在视图树1001中,播放界面900的根节点下包括底图901这一子节点,状态栏902、标题栏903、弹幕904、专辑封面905以及控制栏906均为底图901的子节点。歌曲名称9031和演唱者9032为标题栏903的子节点。进度条9061、暂停按钮9062、上一首按钮9063以及下一首按钮9064为控制栏906的子节点。
另外,手机502还可以根据第二音乐APP的packagename和播放界面900的activityname获取与播放界面900对应的配置文件2。与上述配置文件1类似的,配置文件2中记录了播放界面900中需要投射至目的设备上的第二目标控件,例如第二目标控件在播放界面900中的标识或显示位置。
那么,手机502根据配置文件2可识别出视图树1001中需要投屏至手机500显示的第二目标控件。当然,手机502也可以将用户在播放界面900中手动设置的控件作为需要投屏至手机500显示的第二目标控件。
以第二目标控件为上述播放界面900中的弹幕904举例,如图10中的(b)所示,手机502对视图树1001中的控件进行拆分和重组后可生成投屏后与播放界面900对应的视图树1002。其中,视图树1002的根节点下包括弹幕904。
进而,手机502(即第二源设备)可通过上述通信网络104向手机500(即目的设备)发送第二UI消息,第二UI消息中包括上述视图树1002以及视图树1002中每个控件相关的绘制指令和绘图资源。后续,当手机502更新了上述播放界面900时,手机502可继续按照上述方法生成与新的显示界面对应的第二UI消息,并将新的第二UI消息发送给手机500。
另外,源设备可以报文的形式向目的设备发送UI消息,在发送携带UI消息的报文前,源设备可对报文进行序列化(serialization)和压缩。相应的,目的设备接收到源设备发送的报文后,可对接收到的报文进行解压和反序列化,从而解析出报文中携带的UI消息。
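报文发送前的序列化与压缩、接收后的解压与反序列化,可用如下示意代码表达(Python,以JSON序列化和zlib压缩为例;本申请并未限定实际采用的序列化与压缩方式,此处仅作举例):

```python
import json
import zlib

def pack(ui_message):
    """发送方:先序列化(serialization),再压缩,得到待发送的报文载荷。"""
    return zlib.compress(json.dumps(ui_message).encode("utf-8"))

def unpack(payload):
    """接收方:先解压,再反序列化,解析出报文中携带的UI消息。"""
    return json.loads(zlib.decompress(payload).decode("utf-8"))

ui_message = {"packagename": "com.example.music1",
              "viewtree": {"name": "root", "children": []}}
payload = pack(ui_message)
restored = unpack(payload)
```

pack与unpack互为逆过程,接收方还原出的UI消息与发送方的原始消息一致。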
可以看出,各个源设备(例如上述手机501和手机502)在向目的设备投屏其显示界面时,可对显示界面中的控件进行拆分、删减和重组等操作,使得最终在目的设备中显示的投屏界面能够适应用户的使用需求,从而提高多设备之间投屏时的显示效果和用户体验。
仍以本次投屏的目的设备为手机500举例,手机500内也可预先存储配置文件。示例性的,与上述源设备中的配置文件(例如上述配置文件1)不同的是,目的设备(即手机500)中存储的配置文件中可以包含多个源设备中目标控件的相关信息。
示例性的,手机500内存储的配置文件3如下所示:
(配置文件3的具体内容以附图形式给出)
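结合下文描述(配置文件3同时记录播放界面600中第一目标控件与播放界面900中第二目标控件的位置,且可带有“order”字段与“层叠模式”字段),其结构可示意如下(JSON,键名与取值均为假设,并非原文附图中的实际内容):

```json
{
  "第一音乐APP/播放界面600": {
    "专辑封面604": { "left": 40, "top": 120, "width": 300, "height": 300, "order": 2 }
  },
  "第二音乐APP/播放界面900": {
    "弹幕904": { "left": 40, "top": 150, "width": 300, "height": 60, "order": 1, "层叠模式": "叠加" }
  }
}
```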
可以看出,上述配置文件3中既包括第一音乐APP中播放界面600内的第一目标控件,又包括第二音乐APP中播放界面900内的第二目标控件。
示例性的,手机501向手机500发送的第一UI消息中可以携带第一音乐APP的packagename和播放界面600的activityname。手机502向手机500发送的第二UI消息中可以携带第二音乐APP的packagename和播放界面900的activityname。那么,手机500接收到第一UI消息和第二UI消息后,可获取既包含上述播放界面600的activityname,又包含上述播放界面900的activityname的配置文件(即上述配置文件3)。
示例性的,手机500接收到第一UI消息中的视图树801和第二UI消息中的视图树1002后,如图11所示,手机500可以根据配置文件3确定投屏后视图树801和视图树1002中各个控件的具体显示位置以及各个控件之间的图层关系,从而生成与投屏界面对应的视图树1101。例如,如果配置文件3中记录了播放界面900中的弹幕904与播放界面600中的专辑封面604有部分重叠,且弹幕904的图层(order)位于专辑封面604的图层之上,那么在投屏界面的视图树1101中,手机500可将弹幕904作为专辑封面604的一个子节点。
另外,当配置文件3中记录的两个节点的位置发生重合时,还可以在配置文件3中设置“层叠模式”字段。例如,该层叠模式可以包括渐变融合、加深融合或叠加等。手机500可以根据“层叠模式”字段确定在绘制位置重合的两个节点时的具体层叠模式。
在另一些实施例中,手机500中也可以存储上述各个源设备在投屏时使用的配置文件,例如,上述配置文件1和配置文件2。由于配置文件1中记录了第一目标控件投屏后的显示位置,配置文件2中记录了第二目标控件投屏后的显示位置,因此,手机500根据配置文件1和配置文件2可确定出投屏后上述视图树801和视图树1002中各个控件的具体显示位置以及各个控件之间的图层关系,从而生成与投屏界面对应的视图树1101。
示例性的,手机500可根据第一UI消息中第一音乐APP的packagename和播放界面600的activityname,获取到对应的配置文件1。并且,手机500可根据第二UI消息中第二音乐APP的packagename和播放界面900的activityname,获取到对应的配置文件2。那么,结合配置文件1中第一目标控件在投屏界面中的显示位置,以及配置文件2中第二目标控件在投屏界面中的显示位置,如图11所示,手机500可生成与本次投屏界面对应的视图树1101。
另外,当手机500检测到配置文件1中的某一控件(例如专辑封面604)的显示位置与配置文件2中的某一控件(例如弹幕904)的显示位置发生重叠时,手机500可根据专辑封面604这一控件在配置文件1中记录的“order”字段和弹幕904这一控件在配置文件2中记录的“order”字段,确定专辑封面604与弹幕904之间的位置关系。例如,配置文件1中专辑封面604的“order”字段的取值为2,配置文件2中弹幕904的“order”字段的取值为1,说明弹幕904位于专辑封面604的图层之上,仍如图11所示,手机500可将弹幕904设置为专辑封面604的一个子节点。当然,当两个控件在投屏界面上重叠时,手机500还可以对重叠的控件进行虚化、半透明等处理,提高用户观看时的视觉效果。
又例如,当手机500通过配置文件1和配置文件2检测到专辑封面604与弹幕904这两个控件重叠时,手机500还可以修改专辑封面604或弹幕904在投屏界面中的显示位置,使得专辑封面604与弹幕904不重叠。此时,手机500可将专辑封面604和弹幕904均作为根节点下的子节点。
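上述"检测两个控件显示位置是否重叠,并按order字段确定图层上下关系"的逻辑可示意如下(Python;矩形相交判断为常规实现,order取值沿用上文专辑封面604与弹幕904的示例,坐标为假设):

```python
def overlaps(a, b):
    """判断两个控件的显示区域(left/top/width/height)是否相交。"""
    return (a["left"] < b["left"] + b["width"] and b["left"] < a["left"] + a["width"]
            and a["top"] < b["top"] + b["height"] and b["top"] < a["top"] + a["height"])

album = {"name": "专辑封面604", "left": 40, "top": 120, "width": 300, "height": 300, "order": 2}
danmaku = {"name": "弹幕904", "left": 40, "top": 150, "width": 300, "height": 60, "order": 1}

# 沿用上文示例的约定:order取值小者位于图层上方;重叠时将上层控件挂为下层控件的子节点
if overlaps(album, danmaku) and danmaku["order"] < album["order"]:
    relation = "弹幕904作为专辑封面604的子节点"
else:
    relation = "两者均作为根节点的子节点"
```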
在另一些实施例中,手机500也可以根据源设备的数量布局投屏界面中各个控件的显示位置。例如,当手机500有2个源设备(例如上述手机501和手机502)时,手机500可设置将手机501发来的视图树801中的第一目标控件显示在投屏界面的左侧,并设置将手机502发来的视图树1002中的第二目标控件显示在投屏界面的右侧。当然,本领域技术人员还可以根据实际经验或实际应用场景设置相应的策略,使得目的设备可以在投屏界面中布局各个源设备投射的目标控件,本申请实施例对此不做任何限制。
仍以上述视图树1101举例,手机500确定出本次投屏界面的视图树1101后,便可按照视图树1101中各个控件的层次和顺序,依次调用视图树1101中每一个控件的绘制指令绘制该控件。在具体绘制每个控件时,手机500可按照配置文件3中记录的该控件在投屏界面中的显示位置,在该显示位置处执行相应的绘制指令绘制该控件。最终,如图12所示,手机500基于视图树1101和配置文件3可绘制并显示出投屏界面1201。投屏界面1201中不仅包括上述播放界面600中的标题栏603、专辑封面604以及控制栏606等第一目标控件,还包括上述播放界面900中的弹幕904这一第二目标控件。
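目的设备按视图树的层次与顺序、在配置文件记录的显示位置处逐个绘制控件的过程可示意如下(Python;以记录绘制日志代替真实的绘制指令,树结构与坐标均为举例假设):

```python
def draw(node, positions, canvas):
    """先根遍历视图树,在对应显示位置"执行"该控件的绘制指令(此处以追加日志代替)。"""
    pos = positions.get(node["name"])
    if pos is not None:
        canvas.append((node["name"], pos["left"], pos["top"]))
    for child in node["children"]:
        draw(child, positions, canvas)

# 按图11示意构建投屏界面的视图树1101(部分节点,名称取自上文示例)
tree1101 = {"name": "root", "children": [
    {"name": "标题栏603", "children": []},
    {"name": "专辑封面604", "children": [
        {"name": "弹幕904", "children": []}]},
    {"name": "控制栏606", "children": []}]}

positions = {"标题栏603": {"left": 40, "top": 40},
             "专辑封面604": {"left": 40, "top": 120},
             "弹幕904": {"left": 40, "top": 150},
             "控制栏606": {"left": 40, "top": 440}}

canvas = []
draw(tree1101, positions, canvas)
# canvas中专辑封面604先于其子节点弹幕904被绘制:子节点绘制在父节点图层之上
```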
这样一来,在投屏过程中,目的设备可以同时显示多个源设备中的显示内容,使得用户可在一个目的设备中观看到多个源设备上的显示内容,无需用户在多个源设备之间进行切换,从而提高多设备之间投屏显示的显示效果和用户体验。
另外,在投屏过程中,目的设备与源设备之间的通信连接可以动态的增加或删除。
例如,当用户关闭手机502中的投屏功能后,手机502可断开与手机500(即目的设备)之间的通信连接(例如Wi-Fi连接)。进而,手机500不会接收到手机502发来的第二UI消息,也不会在投屏界面中显示播放界面900中的相关内容。
又例如,在手机501和手机502向手机500投屏的过程中,如果加入了新的源设备(例如智能手表),则智能手表可建立与手机500之间的通信连接。手机500在接收上述第一UI消息和第二UI消息的同时,还可以接收来自智能手表的第三UI消息。进而,手机500可基于对应的配置文件重新布局投屏界面中的各个控件,使得手机501、手机502和智能手表均可将其显示内容投射至手机500中显示。
上述实施例中是以将手机501中的播放界面600和手机502中的播放界面900投射至手机500中举例说明的,可以理解的是,多个源设备还可以将不同类型的显示界面同时投射至目的设备中显示。
以手机和智能手表为本次投屏的源设备,智能电视为本次投屏的目的设备举例。
如图13中的(a)所示,手机501作为第一源设备开启投屏功能后可显示锁屏界面1301。锁屏界面1301中包括壁纸1302,壁纸1302上包括状态栏1303、文本控件1304、文件控件1305以及通知消息1306等控件。通知消息1306中又包括应用图标1307、时间1308以及消息内容1309等控件。
示例性的,手机501可获取上述锁屏界面1301的视图树以及视图树中每个控件的绘制指令和绘制资源。并且,手机501还可获取与锁屏界面1301对应的配置文件A。配置文件A中记录了锁屏界面1301中需要投屏至智能电视中的第一目标控件为:通知消息1306中的图标1307和消息内容1309。进而,手机501根据配置文件A中记录的第一目标控件在投屏后的显示位置,可生成投屏后与上述锁屏界面1301对应的视图树1310。如图13中的(b)所示,在视图树1310中,图标1307和消息内容1309均为根节点下的子节点。
进而,手机501(即第一源设备)可向智能电视(即目的设备)发送第一UI消息,第一UI消息中包括上述视图树1310以及视图树1310中每个控件相关的绘制指令和绘图资源。
类似的,如图14中的(a)所示,在手机501向智能电视进行投屏的同时,智能手表1400作为第二源设备开启投屏功能后可显示检测界面1401。检测界面1401中包括时间1402、日期1403、心率信息1404以及热量信息1405。其中,心率信息1404中又包括心率图标和心率值。热量信息1405中又包括热量图标和热量值。
那么,智能手表1400可获取上述检测界面1401的视图树以及视图树中每个控件的绘制指令和绘制资源。并且,智能手表1400还可获取与检测界面1401对应的配置文件B。配置文件B中记录了检测界面1401中需要投屏至智能电视中的第二目标控件为:心率信息1404和热量信息1405中的各个控件。进而,智能手表1400根据配置文件B中记录的第二目标控件在投屏后的显示位置,可生成投屏后与上述检测界面1401对应的视图树1410。如图14中的(b)所示,在视图树1410中,心率信息1404和热量信息1405均为根节点下的子节点,心率信息1404的子节点为心率图标和心率值,热量信息1405的子节点为热量图标和热量值。
进而,智能手表1400(即第二源设备)可向智能电视(即目的设备)发送第二UI消息,第二UI消息中包括上述视图树1410以及视图树1410中每个控件相关的绘制指令和绘图资源。
智能电视1500接收到上述第一UI消息和第二UI消息后,与上述手机500类似的,智能电视1500可对第一UI消息中的视图树1310和第二UI消息中的视图树1410进行拆分和重组,生成与本次投屏界面对应的视图树。
示例性的,如图15中的(a)所示,智能电视1500可生成与本次投屏界面1501对应的视图树1502。视图树1502中包括上述视图树1310中的各个控件,还包括上述视图树1410中的各个控件。并且,视图树1502中还可以包括智能电视1500中正在显示的一个或多个控件,例如视频画面1503。
进而,智能电视1500可按照视图树1502调用相应的绘制指令在其显示屏中绘制每个控件。最终,如图15中的(b)所示,智能电视1500可绘制并显示与视图树1502对应的投屏界面1501。投屏界面1501中包括智能电视1500正在显示的视频画面1503,手机501中正在显示的通知消息1306中的图标1307和消息内容1309,以及智能手表1400中正在显示的心率信息1404和热量信息1405。
这样一来,在投屏过程中,目的设备可以同时显示多个源设备中的显示内容,以及目的设备自身的显示内容,使得用户可在一个设备中观看到多个设备上的显示内容,从而提高多设备之间投屏显示的显示效果和用户体验。
需要说明的是,上述实施例中示例性的示出了将多个源设备中的显示内容投射至同一目的设备的应用场景,可以理解的是,上述投屏显示方法还可以应用在其他场景中,本申请实施例对此不做任何限制。
示例性的,如图16所示,在召开视频会议时,某一会场中的电子设备可作为目的设备,其他会场中的各个电子设备可作为源设备。各个源设备可按照上述方法将需要投射至目的设备的显示内容以UI消息的形式发送给目的设备。进而,目的设备可按照上述方法在自身的显示屏中显示各个源设备本次投射的显示内容。当然,目的设备还可以在自身的显示屏中显示目的设备自身的显示内容。
又例如,学生可以在自己的手机或电脑或平板中安装教辅APP。学生在使用教辅APP答题时,其电子设备可作为源设备将答题区域的显示内容投射至老师使用的手机或电脑或平板中显示。这样,老师可以实时预览多个学生在各自答题区域中的答题过程,及时了解学生的答题思路,从而提高教辅APP的教学效果。
本申请实施例公开了一种电子设备,包括处理器,以及与处理器相连的存储器、输入设备、输出设备和通信模块。其中,输入设备和输出设备可集成为一个设备,例如,可将触摸传感器作为输入设备,将显示屏作为输出设备,并将触摸传感器和显示屏集成为触摸屏。
此时,如图17所示,上述电子设备可以包括:触摸屏1701,所述触摸屏1701包括触摸传感器1706和显示屏1707;一个或多个处理器1702;存储器1703;通信模块1708;一个或多个应用程序(未示出);以及一个或多个计算机程序1704,上述各器件可以通过一个或多个通信总线1705连接。其中该一个或多个计算机程序1704被存储在上述存储器1703中并被配置为被该一个或多个处理器1702执行,该一个或多个计算机程序1704包括指令,上述指令可以用于执行上述实施例中的各个步骤。其中,上述方法实施例涉及的各步骤的所有相关内容均可以援引到对应实体器件的功能描述,在此不再赘述。
示例性的,上述处理器1702具体可以为图2所示的处理器110,上述存储器1703具体可以为图2所示的内部存储器121和/或外部存储器120,上述显示屏1707具体可以为图2所示的显示屏194,上述触摸传感器1706具体可以为图2所示的传感器模块180中的触摸传感器,上述通信模块1708具体可以为图2所示的移动通信模块150和/或无线通信模块160,本申请实施例对此不做任何限制。
通过以上的实施方式的描述,所属领域的技术人员可以清楚地了解到,为描述的方便和简洁,仅以上述各功能模块的划分进行举例说明,实际应用中,可以根据需要而将上述功能分配由不同的功能模块完成,即将装置的内部结构划分成不同的功能模块,以完成以上描述的全部或者部分功能。上述描述的系统,装置和单元的具体工作过程,可以参考前述方法实施例中的对应过程,在此不再赘述。
在本申请实施例各个实施例中的各功能单元可以集成在一个处理单元中,也可以是各个 单元单独物理存在,也可以两个或两个以上单元集成在一个单元中。上述集成的单元既可以采用硬件的形式实现,也可以采用软件功能单元的形式实现。
所述集成的单元如果以软件功能单元的形式实现并作为独立的产品销售或使用时,可以存储在一个计算机可读取存储介质中。基于这样的理解,本申请实施例的技术方案本质上或者说对现有技术做出贡献的部分或者该技术方案的全部或部分可以以软件产品的形式体现出来,该计算机软件产品存储在一个存储介质中,包括若干指令用以使得一台计算机设备(可以是个人计算机,服务器,或者网络设备等)或处理器执行本申请各个实施例所述方法的全部或部分步骤。而前述的存储介质包括:快闪存储器、移动硬盘、只读存储器、随机存取存储器、磁碟或者光盘等各种可以存储程序代码的介质。
以上所述,仅为本申请实施例的具体实施方式,但本申请实施例的保护范围并不局限于此,任何在本申请实施例揭露的技术范围内的变化或替换,都应涵盖在本申请实施例的保护范围之内。因此,本申请实施例的保护范围应以所述权利要求的保护范围为准。

Claims (16)

  1. 一种投屏显示方法,其特征在于,包括:
    目的设备接收第一源设备发送的第一消息,所述第一消息中包括第一绘制指令,所述第一绘制指令为用于绘制一个或多个第一目标控件的指令,所述第一目标控件为所述第一源设备显示的第一界面中的控件;
    所述目的设备接收第二源设备发送的第二消息,所述第二消息中包括第二绘制指令,所述第二绘制指令为用于绘制一个或多个第二目标控件的指令,所述第二目标控件为所述第二源设备显示的第二界面中的控件;
    所述目的设备根据所述第一绘制指令和所述第二绘制指令绘制投屏界面,所述投屏界面中包括所述第一目标控件和所述第二目标控件。
  2. 根据权利要求1所述的方法,其特征在于,所述第一消息中还包括第一视图信息,所述第一视图信息包括所述第一目标控件在所述投屏界面中的图层顺序;所述第二消息中还包括第二视图信息,所述第二视图信息包括所述第二目标控件在所述投屏界面中的图层顺序;
    其中,所述目的设备根据所述第一绘制指令和所述第二绘制指令绘制投屏界面,包括:
    所述目的设备根据所述第一视图信息和所述第二视图信息生成第三视图信息,所述第三视图信息包括所述第一目标控件和所述第二目标控件在所述投屏界面中的图层顺序;
    所述目的设备按照所述第三视图信息,执行所述第一绘制指令和所述第二绘制指令绘制投屏界面。
  3. 根据权利要求2所述的方法,其特征在于,在所述目的设备执行所述第一目标控件的绘制指令和所述第二目标控件的绘制指令绘制投屏界面之前,还包括:
    所述目的设备获取与所述第一界面和所述第二界面均对应的配置文件,所述配置文件中记录了所述第一目标控件在所述投屏界面中的第一显示位置,以及所述第二目标控件在所述投屏界面中的第二显示位置;
    其中,所述目的设备根据所述第一视图信息和所述第二视图信息生成第三视图信息,包括:
    所述目的设备根据所述配置文件,对所述第一视图信息和所述第二视图信息中的控件进行拆分和重组,得到所述第三视图信息。
  4. 根据权利要求2所述的方法,其特征在于,在所述目的设备执行所述第一绘制指令和所述第二绘制指令绘制投屏界面之前,还包括:
    所述目的设备获取与所述第一界面对应的第一配置文件,所述第一配置文件中记录了所述第一目标控件在所述投屏界面中的第一显示位置;
    所述目的设备获取与所述第二界面对应的第二配置文件,所述第二配置文件中记录了所述第二目标控件在所述投屏界面中的第二显示位置;
    其中,所述目的设备根据所述第一视图信息和所述第二视图信息生成第三视图信息,包括:
    所述目的设备根据所述第一配置文件和所述第二配置文件,对所述第一视图信息和所述第二视图信息中的控件进行拆分和重组,得到所述第三视图信息。
  5. 根据权利要求3或4所述的方法,其特征在于,所述目的设备按照所述第三视图信息,执行所述第一绘制指令和所述第二绘制指令绘制投屏界面,包括:
    所述目的设备根据所述第三视图信息中所述第一目标控件的图层顺序,在所述第一显示位置执行所述第一绘制指令,以绘制所述第一目标控件;并且,
    所述目的设备根据所述第三视图信息中所述第二目标控件的图层顺序,在所述第二显示位置执行所述第二绘制指令,以绘制所述第二目标控件。
  6. 根据权利要求1-5中任一项所述的方法,其特征在于,
    所述第一目标控件在所述第一界面和所述投屏界面中的显示位置相同或不同;
    所述第二目标控件在所述第二界面和所述投屏界面中的显示位置相同或不同。
  7. 根据权利要求1-6中任一项所述的方法,其特征在于,在目的设备接收第一源设备发送的第一消息之前,还包括:
    所述目的设备显示第三界面;
    其中,所述投屏界面中还包括所述第三界面中的一个或多个第三目标控件。
  8. 一种电子设备,其特征在于,包括:
    触摸屏,所述触摸屏包括触摸传感器和显示屏;
    通信模块;
    一个或多个处理器;
    一个或多个存储器;
    以及一个或多个计算机程序,其中所述一个或多个计算机程序被存储在所述一个或多个存储器中,所述一个或多个计算机程序包括指令,当所述指令被所述电子设备执行时,使得所述电子设备执行以下步骤:
    接收第一源设备发送的第一消息,所述第一消息中包括第一绘制指令,所述第一绘制指令为用于绘制一个或多个第一目标控件的指令,所述第一目标控件为所述第一源设备显示的第一界面中的控件;
    接收第二源设备发送的第二消息,所述第二消息中包括第二绘制指令,所述第二绘制指令为用于绘制一个或多个第二目标控件的指令,所述第二目标控件为所述第二源设备显示的第二界面中的控件;
    根据所述第一绘制指令和所述第二绘制指令绘制投屏界面,所述投屏界面中包括所述第一目标控件和所述第二目标控件。
  9. 根据权利要求8所述的电子设备,其特征在于,所述第一消息中还包括第一视图信息,所述第一视图信息包括所述第一目标控件在所述投屏界面中的图层顺序;所述第二消息中还包括第二视图信息,所述第二视图信息包括所述第二目标控件在所述投屏界面中的图层顺序;
    其中,所述电子设备根据所述第一绘制指令和所述第二绘制指令绘制投屏界面,具体包括:
    根据所述第一视图信息和所述第二视图信息生成第三视图信息,所述第三视图信息包括所述第一目标控件和所述第二目标控件在所述投屏界面中的图层顺序;
    按照所述第三视图信息,执行所述第一绘制指令和所述第二绘制指令绘制投屏界面。
  10. 根据权利要求9所述的电子设备,其特征在于,在所述电子设备执行所述第一目标控件的绘制指令和所述第二目标控件的绘制指令绘制投屏界面之前,所述电子设备还用于执行:
    获取与所述第一界面和所述第二界面均对应的配置文件,所述配置文件中记录了所述第一目标控件在所述投屏界面中的第一显示位置,以及所述第二目标控件在所述投屏界面中的第二显示位置;
    其中,所述电子设备根据所述第一视图信息和所述第二视图信息生成第三视图信息,包括:
    根据所述配置文件,对所述第一视图信息和所述第二视图信息中的控件进行拆分和重组,得到所述第三视图信息。
  11. 根据权利要求9所述的电子设备,其特征在于,在所述电子设备执行所述第一绘制指令和所述第二绘制指令绘制投屏界面之前,所述电子设备还用于执行:
    获取与所述第一界面对应的第一配置文件,所述第一配置文件中记录了所述第一目标控件在所述投屏界面中的第一显示位置;
    获取与所述第二界面对应的第二配置文件,所述第二配置文件中记录了所述第二目标控件在所述投屏界面中的第二显示位置;
    其中,所述电子设备根据所述第一视图信息和所述第二视图信息生成第三视图信息,包括:
    根据所述第一配置文件和所述第二配置文件,对所述第一视图信息和所述第二视图信息中的控件进行拆分和重组,得到所述第三视图信息。
  12. 根据权利要求10或11所述的电子设备,其特征在于,所述电子设备按照所述第三视图信息,执行所述第一绘制指令和所述第二绘制指令绘制投屏界面,具体包括:
    根据所述第三视图信息中所述第一目标控件的图层顺序,在所述第一显示位置执行所述第一绘制指令,以绘制所述第一目标控件;并且,
    根据所述第三视图信息中所述第二目标控件的图层顺序,在所述第二显示位置执行所述第二绘制指令,以绘制所述第二目标控件。
  13. 根据权利要求8-12中任一项所述的电子设备,其特征在于,
    所述第一目标控件在所述第一界面和所述投屏界面中的显示位置相同或不同;
    所述第二目标控件在所述第二界面和所述投屏界面中的显示位置相同或不同。
  14. 根据权利要求8-13中任一项所述的电子设备,其特征在于,在电子设备接收第一源设备发送的第一消息之前,所述电子设备还用于执行:
    显示第三界面;
    其中,所述投屏界面中还包括所述第三界面中的一个或多个第三目标控件。
  15. 一种计算机可读存储介质,所述计算机可读存储介质中存储有指令,其特征在于,当所述指令在电子设备上运行时,使得所述电子设备执行如权利要求1-7中任一项所述的投屏显示方法。
  16. 一种包含指令的计算机程序产品,其特征在于,当所述计算机程序产品在电子设备上运行时,使得所述电子设备执行如权利要求1-7中任一项所述的投屏显示方法。
PCT/CN2020/093872 2019-06-05 2020-06-02 一种投屏显示方法及电子设备 WO2020244492A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US17/616,833 US11907604B2 (en) 2019-06-05 2020-06-02 Screen mirroring display method using layer orders of target controls and electronic device
EP20818738.5A EP3951585A4 (en) 2019-06-05 2020-06-02 SCREEN PROJECTION DISPLAY METHOD AND ELECTRONIC DEVICE

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910487807.5A CN110389736A (zh) 2019-06-05 2019-06-05 一种投屏显示方法及电子设备
CN201910487807.5 2019-06-05

Publications (1)

Publication Number Publication Date
WO2020244492A1 true WO2020244492A1 (zh) 2020-12-10

Family

ID=68285216

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/093872 WO2020244492A1 (zh) 2019-06-05 2020-06-02 一种投屏显示方法及电子设备

Country Status (4)

Country Link
US (1) US11907604B2 (zh)
EP (1) EP3951585A4 (zh)
CN (1) CN110389736A (zh)
WO (1) WO2020244492A1 (zh)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113093977A (zh) * 2021-04-12 2021-07-09 Tcl通讯(宁波)有限公司 移动终端手表的设置方法、装置、智能终端及存储介质
WO2022143508A1 (zh) * 2020-12-31 2022-07-07 华为技术有限公司 一种近场中传输数据的方法、设备及系统

Families Citing this family (28)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112231025B (zh) * 2019-03-06 2022-12-06 华为终端有限公司 Ui组件显示的方法及电子设备
CN110381195A (zh) * 2019-06-05 2019-10-25 华为技术有限公司 一种投屏显示方法及电子设备
CN110389736A (zh) * 2019-06-05 2019-10-29 华为技术有限公司 一种投屏显示方法及电子设备
CN110941376A (zh) * 2019-11-29 2020-03-31 联想(北京)有限公司 显示控制方法及电子设备
CN116055773A (zh) 2019-12-17 2023-05-02 华为技术有限公司 一种多屏协同方法、系统及电子设备
CN111158836B (zh) * 2019-12-31 2022-03-25 联想(北京)有限公司 一种信息处理方法和电子设备
CN114157756A (zh) * 2020-08-20 2022-03-08 华为技术有限公司 任务处理方法及相关电子设备
CN111399792B (zh) * 2020-03-20 2023-01-17 维沃移动通信有限公司 一种内容共享方法及电子设备
CN111447598B (zh) * 2020-03-23 2023-06-20 维沃移动通信有限公司 一种交互方法和显示设备
CN113515327A (zh) * 2020-03-25 2021-10-19 华为技术有限公司 一种显示时间的方法及电子设备
CN111506283B (zh) * 2020-04-26 2023-10-27 西安万像电子科技有限公司 图像显示方法、装置及系统
CN115756270B (zh) 2020-05-29 2024-03-26 华为技术有限公司 一种内容分享的方法、装置及系统
CN111770379B (zh) * 2020-07-10 2021-08-24 腾讯科技(深圳)有限公司 一种视频投放方法、装置及设备
CN111901674B (zh) * 2020-07-13 2021-08-03 腾讯科技(深圳)有限公司 一种视频播放控制方法及装置
CN114071169B (zh) * 2020-07-29 2023-04-11 华为技术有限公司 一种媒体内容共享方法和装置
CN111988653A (zh) * 2020-08-25 2020-11-24 京东方科技集团股份有限公司 多视频投屏信息的交互方法、装置、设备及存储介质
CN112035081A (zh) * 2020-09-01 2020-12-04 平安付科技服务有限公司 投屏方法、装置、计算机设备及存储介质
CN114173184B (zh) * 2020-09-10 2024-10-18 华为终端有限公司 投屏方法和电子设备
CN112256221A (zh) * 2020-10-20 2021-01-22 北京字跳网络技术有限公司 信息显示方法、装置和电子设备
CN115145517A (zh) * 2021-03-31 2022-10-04 华为技术有限公司 一种投屏方法、电子设备和系统
CN113242435B (zh) * 2021-04-13 2022-05-20 江苏视博云信息技术有限公司 一种投屏的方法、装置及系统
CN113810761B (zh) * 2021-09-17 2023-11-21 上海哔哩哔哩科技有限公司 多终端交互方法、装置及系统
CN116301516A (zh) * 2021-12-21 2023-06-23 北京小米移动软件有限公司 一种应用共享方法及装置、电子设备、存储介质
CN114501120B (zh) * 2022-01-11 2023-06-09 烽火通信科技股份有限公司 多终端无线投屏切换方法与电子设备
CN114489405A (zh) * 2022-01-29 2022-05-13 青岛海信移动通信技术股份有限公司 用户界面显示方法及终端设备
WO2024080735A1 (ko) * 2022-10-11 2024-04-18 삼성전자 주식회사 연관 콘텐트를 미러링하는 디바이스 및 방법
US11689695B1 (en) * 2022-12-15 2023-06-27 Northern Trust Corporation Computing technologies for screensharing
CN118250506A (zh) * 2022-12-23 2024-06-25 华为技术有限公司 一种投屏方法、电子设备及系统

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108124173A (zh) * 2017-12-11 2018-06-05 深圳创维-Rgb电子有限公司 一种一对多投屏显示方法、系统及存储介质
CN108366062A (zh) * 2018-01-24 2018-08-03 上海哇嗨网络科技有限公司 通过邀请建立投屏连接的方法、客户端、服务器和系统
CN109032722A (zh) * 2018-06-27 2018-12-18 广州视源电子科技股份有限公司 更新ui组件的显示效果的方法、装置及设备、介质
CN109508162A (zh) * 2018-10-12 2019-03-22 福建星网视易信息系统有限公司 一种投屏显示方法、系统及存储介质
CN110389736A (zh) * 2019-06-05 2019-10-29 华为技术有限公司 一种投屏显示方法及电子设备

Family Cites Families (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6075531A (en) * 1997-12-15 2000-06-13 International Business Machines Corporation Computer system and method of manipulating multiple graphical user interface components on a computer display with a proximity pointer
US20040169654A1 (en) * 2003-02-27 2004-09-02 Teracruz, Inc. System and method for tree map visualization for database performance data
US8127248B2 (en) * 2003-06-20 2012-02-28 Apple Inc. Computer interface having a virtual single-layer mode for viewing overlapping objects
US8473851B2 (en) * 2008-02-27 2013-06-25 Cisco Technology, Inc. Multi-party virtual desktop
CN101692196B (zh) * 2009-08-25 2012-10-10 宇龙计算机通信科技(深圳)有限公司 一种窗口排列方法及系统
WO2011094931A1 (en) * 2010-02-03 2011-08-11 Nokia Corporation Method and apparatus for providing context attributes and informational links for media data
US10013137B2 (en) * 2010-08-31 2018-07-03 Datapath Limited System and method for unlimited multi-user computer desktop environment
US8682973B2 (en) * 2011-10-05 2014-03-25 Microsoft Corporation Multi-user and multi-device collaboration
CN103257704B (zh) * 2012-02-20 2016-12-14 联想(北京)有限公司 信息处理设备和方法
KR101340780B1 (ko) 2012-02-29 2013-12-11 주식회사 팬택 데이터 공유 시스템 및 방법
PT2962478T (pt) * 2013-02-26 2020-04-22 Mersive Tech Inc Sistema e método para controlo de múltiplos utilizadores e transmissão de multimédia para um ecrã partilhado
US10191713B2 (en) * 2014-03-24 2019-01-29 Lenovo (Beijing) Co., Ltd. Information processing method and electronic device
JP2016110178A (ja) * 2014-12-02 2016-06-20 ソニー株式会社 情報処理装置、情報処理方法およびプログラム
US10326822B2 (en) * 2015-12-03 2019-06-18 Google Llc Methods, systems and media for presenting a virtual operating system on a display device
KR20170096408A (ko) * 2016-02-16 2017-08-24 삼성전자주식회사 어플리케이션을 표시하는 방법 및 이를 지원하는 전자 장치
WO2018010021A1 (en) * 2016-07-11 2018-01-18 Light Wave Technology Inc. Pointer control in a handheld computer by way of hid commands
CN106203160A (zh) * 2016-06-30 2016-12-07 联想(北京)有限公司 一种控制方法及电子设备
US20180004715A1 (en) * 2016-07-01 2018-01-04 Facebook, Inc. Optimizing view hierarchy by automatically removing layout-only views
US20180053003A1 (en) * 2016-08-18 2018-02-22 Qualcomm Incorporated Selectively obfuscating a portion of a stream of visual media that is streamed to at least one sink during a screen-sharing session
CN106528025B (zh) * 2016-11-14 2019-12-31 卓智网络科技有限公司 多屏图像投屏方法、终端、服务器和系统
US10621271B2 (en) * 2017-05-25 2020-04-14 Microsoft Technology Licensing, Llc Reordering a multi-level layout using a hierarchical tree
CN108874341B (zh) * 2018-06-13 2021-09-14 深圳市东向同人科技有限公司 屏幕投影方法及终端设备
CN109445733A (zh) * 2018-10-16 2019-03-08 杭州橙鹰数据技术有限公司 跨屏展示方法、装置、计算设备以及储存介质


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3951585A4


Also Published As

Publication number Publication date
EP3951585A1 (en) 2022-02-09
US20220308823A1 (en) 2022-09-29
EP3951585A4 (en) 2022-04-06
CN110389736A (zh) 2019-10-29
US11907604B2 (en) 2024-02-20

Similar Documents

Publication Publication Date Title
WO2020244492A1 (zh) 一种投屏显示方法及电子设备
WO2020244495A1 (zh) 一种投屏显示方法及电子设备
WO2020244500A1 (zh) 一种投屏场景下的触控方法及电子设备
WO2020244497A1 (zh) 一种柔性屏幕的显示方法及电子设备
US11722449B2 (en) Notification message preview method and electronic device
US11385857B2 (en) Method for displaying UI component and electronic device
CN115473957B (zh) 一种图像处理方法和电子设备
WO2021121052A1 (zh) 一种多屏协同方法、系统及电子设备
WO2020192456A1 (zh) 一种语音交互方法及电子设备
WO2020155014A1 (zh) 智能家居设备分享系统、方法及电子设备
CN112714214A (zh) 一种内容接续方法及电子设备
CN116360725B (zh) 显示交互系统、显示方法及设备
CN114647350B (zh) 应用共享方法、电子设备和存储介质
WO2022078295A1 (zh) 一种设备推荐方法及电子设备
WO2020155875A1 (zh) 电子设备的显示方法、图形用户界面及电子设备
WO2022161119A1 (zh) 一种显示方法及电子设备
JP2022515863A (ja) スマートホームデバイスによってネットワークにアクセスするための方法および関連するデバイス
WO2023005711A1 (zh) 一种服务的推荐方法及电子设备
WO2022143310A1 (zh) 一种双路投屏的方法及电子设备
CN115016697A (zh) 投屏方法、计算机设备、可读存储介质和程序产品
WO2021042881A1 (zh) 消息通知方法及电子设备
CN115485685A (zh) 应用程序安全检测方法、装置、存储介质及电子设备
WO2022188632A1 (zh) 主题展示方法、装置、终端及计算机存储介质
WO2023202445A1 (zh) 演示系统、方法、图形界面及相关装置
WO2024088225A1 (zh) 一种蓝牙测距方法、电子设备及系统

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20818738

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2020818738

Country of ref document: EP

Effective date: 20211104

NENP Non-entry into the national phase

Ref country code: DE