WO2020244495A1 - Screen projection display method and electronic device - Google Patents

Screen projection display method and electronic device

Info

Publication number
WO2020244495A1
WO2020244495A1 (PCT/CN2020/093892, CN2020093892W)
Authority
WO
WIPO (PCT)
Prior art keywords
electronic device
interface
controls
display
target
Prior art date
Application number
PCT/CN2020/093892
Other languages
English (en)
French (fr)
Inventor
范振华
曹原
魏曦
Original Assignee
Huawei Technologies Co., Ltd. (华为技术有限公司)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co., Ltd. (华为技术有限公司)
Priority to EP20819136.1A priority Critical patent/EP3958548A4/en
Priority to US17/616,901 priority patent/US11880628B2/en
Publication of WO2020244495A1

Classifications

    • H04M1/72412 — User interfaces for cordless or mobile telephones interfacing with external accessories using two-way short-range wireless interfaces
    • H04M1/72409 — User interfaces for cordless or mobile telephones interfacing with external accessories
    • G06F3/1454 — Digital output to display device; copying the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay
    • G06F3/1423 — Digital output to display device; controlling a plurality of local displays, e.g. CRT and flat panel display
    • G06F9/44505 — Configuring for program initiating, e.g. using registry or configuration files
    • G06F9/452 — Remote windowing, e.g. X-Window System, desktop virtualisation
    • H04M1/72439 — User interfaces for cordless or mobile telephones with interactive means for internal management of messages, for image or video messaging
    • H04M1/72469 — User interfaces for cordless or mobile telephones for operating the device by selecting functions from two or more displayed items, e.g. menus or icons
    • G09G2340/14 — Solving problems related to the presentation of information to be displayed
    • G09G2370/20 — Details of the management of multiple sources of image data
    • H04M1/724095 — Interfacing with a device worn on the user's wrist, hand or arm to provide access to telephonic functionalities
    • H04M1/72442 — User interfaces for cordless or mobile telephones for playing music files
    • H04M2250/16 — Telephonic subscriber devices including more than one display unit
    • H04M2250/22 — Telephonic subscriber devices including a touch pad, a touch sensor or a touch detector

Definitions

  • This application relates to the field of terminal technology, and in particular to a screen projection display method and electronic device.
  • An electronic device can switch and display multimedia data among multiple devices by means of screen projection.
  • For example, a user may send the content displayed on a mobile phone (that is, the source device) to other destination devices that support the screen projection function for display.
  • However, the destination device may be unable to display the content sent by the source device, or may display it poorly.
  • This application provides a screen projection display method and electronic device.
  • With this method, the destination device can lay out the display content sent by the source device according to its own device characteristics, thereby improving the display effect and user experience of screen projection among multiple devices.
  • In a first aspect, the present application provides a screen projection display method, including: a first electronic device (i.e., a source device) displays a first display interface; the first electronic device receives a user's projection instruction to project the first display interface to a second electronic device (i.e., destination device 1); in response to the projection instruction, the first electronic device determines one or more first target controls in the first display interface that need to be projected; further, the first electronic device sends a first message to the second electronic device, the first message including drawing instructions for the first target controls, so that the second electronic device draws a first projection interface including the first target controls according to those drawing instructions.
  • In other words, when performing projection display, the source device can project one or more controls in its display interface to the projection interface of the destination device, so that the display interface of the source device can be transformed or reorganized before being displayed on the destination device.
  • In this way, the projection can adapt more flexibly to device characteristics such as the screen size of the destination device, improving the display effect and user experience in projection scenarios.
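The "first message" described above carries drawing instructions for the selected controls rather than a pixel copy of the screen. A minimal sketch of how such a message might be assembled on the source side is shown below; all field names, the command tuples, and the example controls are illustrative assumptions, not part of the patent.

```python
# Hypothetical sketch of the "first message": the source device serializes
# draw commands for each target control instead of sending rendered pixels.
# Field names and command formats are assumptions for illustration only.

def build_first_message(interface_id, target_controls):
    """Pack drawing instructions for the controls to be projected."""
    return {
        "interface_id": interface_id,
        "draw_instructions": [
            {
                "control_id": c["id"],
                # e.g. a recorded sequence of primitive draw commands
                "commands": c["commands"],
            }
            for c in target_controls
        ],
    }

# Example: project only the title and play button of a music interface.
controls = [
    {"id": "title", "commands": [("drawText", "Song A", 0, 0)]},
    {"id": "play_btn", "commands": [("drawCircle", 50, 50, 20)]},
]
msg = build_first_message("music_player/now_playing", controls)
```

Sending instructions rather than pixels lets the destination device re-lay-out the controls for its own screen, which is the flexibility the method aims for.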
  • In a possible implementation, the first electronic device determining the first target controls in the first display interface specifically includes: the first electronic device obtains, according to the type of the second electronic device, a configuration file corresponding to the first display interface; the configuration file records the first target controls in the first display interface that need to be projected; further, the first electronic device determines the first target controls in the first display interface according to the configuration file.
  • For example, the configuration file may record the identifiers of the first target controls in the first display interface; in this case, the first electronic device determines the first target controls to be projected this time according to those identifiers.
  • Alternatively, the configuration file may record the display positions of the first target controls in the first display interface; in this case, the first electronic device determines the first target controls to be projected this time according to those display positions.
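A configuration file of the kind described above could, for instance, list per destination-device type which controls to project and where to place them. The JSON schema, device-type keys, and helper function below are assumptions for the sketch, not a format defined by the patent.

```python
import json

# Illustrative configuration file for one source interface: for each
# destination device type, it lists the controls to project (by identifier)
# and, optionally, where each should appear in the projection interface.
# The schema is a hypothetical example.
CONFIG = json.loads("""
{
  "interface": "music_player/now_playing",
  "watch": {
    "target_controls": ["title", "play_btn"],
    "positions": {"title": [0, 0], "play_btn": [60, 80]}
  },
  "tv": {
    "target_controls": ["title", "album_art", "play_btn", "progress_bar"]
  }
}
""")

def select_target_controls(config, device_type):
    """Pick the controls to project for a given destination device type."""
    return config[device_type]["target_controls"]
```

Keying the file by device type matches the claim that the source device chooses the configuration file "according to the type of the second electronic device": a watch gets a sparse subset, a TV a richer one.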
  • In a possible implementation, the method further includes: the first electronic device acquires first view information generated when the first display interface is drawn (for example, the view tree of the first display interface); the first view information includes the layer order among the controls in the first display interface; further, the first electronic device may determine second view information according to the first view information, the second view information including the layer order of the first target controls in the first projection interface.
  • In other words, the source device can split, delete, or reorganize the controls in its display interface, so as to display a new projection interface on the destination device that is adapted to device characteristics such as the destination device's display size, thereby improving the display effect and user experience of the destination device in the projection scenario.
  • Furthermore, the first message sent by the source device to the destination device may also include the second view information, so that the second electronic device calls the drawing instructions of the first target controls in sequence, according to the layer order of the first target controls in the second view information, to draw the first projection interface.
  • In a possible implementation, determining the second view information according to the first view information includes: the first electronic device splits and reorganizes the first target controls in the first view information to obtain the second view information.
  • For example, the layer order of the first target controls may be the same in the first view information and in the second view information; that is, the layer relationship of the first target controls in the first display interface before projection is the same as their layer relationship in the first projection interface after projection.
  • In a possible implementation, the configuration file corresponding to the first display interface also records the display positions of the first target controls in the first projection interface; in this case, determining the second view information according to the first view information includes: the first electronic device separates the first target controls that need to be projected from the first view information; further, the first electronic device may reorganize the first target controls according to their display positions in the first projection interface as recorded in the configuration file, to obtain the second view information. In this case, the position and layer of a first target control after projection may be the same as, or different from, its position and layer before projection.
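One way to derive "second view information" from the first view information, as sketched below, is to flatten the source view tree in draw order, keep only the target controls (preserving their relative layer order), and attach any new positions from the configuration file. The tree structure and names are illustrative assumptions.

```python
# Hypothetical sketch: derive second view information from the view tree
# of the source interface. Depth-first traversal order stands in for the
# layer (draw) order; real view systems differ in detail.

def flatten_view_tree(node, out=None):
    """Depth-first flatten; traversal order doubles as layer order."""
    if out is None:
        out = []
    out.append(node["id"])
    for child in node.get("children", []):
        flatten_view_tree(child, out)
    return out

def second_view_info(view_tree, target_ids, positions=None):
    """Keep target controls in their original layer order, with positions."""
    order = [i for i in flatten_view_tree(view_tree) if i in target_ids]
    return [{"id": i, "pos": (positions or {}).get(i)} for i in order]

# Example source interface: album art behind title and play button.
tree = {"id": "root", "children": [
    {"id": "album_art"},
    {"id": "title"},
    {"id": "play_btn"},
]}
info = second_view_info(tree, {"title", "play_btn"}, {"title": (0, 0)})
# The relative layer order of the kept controls matches the original tree.
```

This is the simple case where the layer order is preserved; a configuration file could equally reorder the list to place controls on different layers after projection, as the text allows.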
  • In a possible implementation, the first message may also include the drawing resources of the first target controls, such as icons and avatars; the drawing resources are used by the second electronic device when executing the drawing instructions of the first target controls, to draw the user interface of each first target control.
  • In a possible implementation, the method further includes: the first electronic device receives a user's projection instruction to project the first display interface to a third electronic device (i.e., destination device 2); in response to the projection instruction, the first electronic device determines second target controls in the first display interface; the way the first electronic device determines the second target controls is similar to the way it determines the first target controls described above.
  • Furthermore, the first electronic device may send a second message carrying the drawing instructions of the second target controls to the third electronic device.
  • Alternatively, the first electronic device may carry the drawing instructions of the second target controls in the first message.
  • In this case, the first electronic device may send the first message to both the second electronic device and the third electronic device. In this way, the first electronic device can simultaneously project different controls in its display interface to multiple destination devices for display.
  • In a second aspect, the present application provides a screen projection display method, including: a second electronic device (i.e., a destination device) receives a first message sent by a first electronic device (i.e., a source device), the first message including drawing instructions for N controls (N is an integer greater than 0); then, the second electronic device draws a first projection interface according to the drawing instructions of the N controls, the first projection interface including at least one of the N controls.
  • In a possible implementation, the first message further includes first view information corresponding to the N controls, the first view information including the layer order among the N controls. In this case, drawing the first projection interface according to the drawing instructions of the N controls specifically includes: the second electronic device determines the drawing order of the N controls according to the layer order among the N controls in the first view information; then, the second electronic device executes the drawing instructions corresponding to the N controls in that drawing order, drawing the N controls to form the first projection interface.
  • In a possible implementation, the first message further includes the identifier of the first application interface to which the first target controls belong. After the second electronic device receives the first message sent by the first electronic device, the method further includes: the second electronic device obtains a configuration file corresponding to the identifier of the first application interface, the configuration file recording the display positions of the N controls in the first projection interface after projection. In this case, drawing the first projection interface according to the drawing instructions of the N controls includes: the second electronic device executes the drawing instruction of each control at the corresponding display position recorded in the configuration file, in the drawing order of the N controls indicated by the first view information, drawing the N controls in sequence to form the first projection interface.
  • In a possible implementation, the first message further includes the identifier of the first application interface to which the first target controls belong. After the second electronic device receives the first message sent by the first electronic device, the method further includes: the second electronic device obtains a configuration file corresponding to the identifier of the first application interface, the configuration file recording the display positions of M target controls (M is an integer not greater than N) in the first projection interface.
  • The M target controls are a subset of the N controls. In other words, the M target controls are the part of the N controls in the first message that need to be projected on the destination device. The second electronic device can therefore determine, according to the configuration file, which of the N controls in the first message need to be projected this time.
  • In this case, drawing the first projection interface according to the drawing instructions of the N controls includes: the second electronic device determines, from the drawing instructions of the N controls, the drawing instructions of the M target controls to be used this time; further, the second electronic device draws the M target controls according to their drawing instructions to form the first projection interface.
  • In a possible implementation, the method further includes: the second electronic device generates second view information of the projection interface according to the above configuration file, the second view information including the layer order of the M target controls in the first projection interface. In this case, drawing the M target controls according to their drawing instructions to form the first projection interface includes: the second electronic device calls the drawing instruction of each target control in the drawing order of the M target controls indicated by the second view information, drawing each target control at the corresponding display position recorded in the configuration file, finally forming the first projection interface.
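The destination-side behavior described above can be sketched as a drawing loop: the configuration file selects the M target controls and their positions, and the layer order fixes the drawing sequence over the N received instructions. In the hypothetical sketch below the "canvas" is just a list recording what was drawn and where; real rendering is device-specific, and all names are illustrative.

```python
# Hypothetical destination-side drawing loop: given N drawing instructions
# from the first message, draw only the M configured target controls, in
# layer order, at their configured positions. The canvas is a stand-in
# that records (control, position, command) tuples in draw order.

def draw_projection_interface(message, layer_order, positions):
    by_id = {d["control_id"]: d for d in message["draw_instructions"]}
    canvas = []
    for cid in layer_order:            # lower layers are drawn first
        if cid not in by_id:
            continue                    # not among the N received controls
        for cmd in by_id[cid]["commands"]:
            canvas.append((cid, positions.get(cid), cmd))
    return canvas

# N = 3 controls arrive; the config selects M = 2 of them and places them.
msg = {"draw_instructions": [
    {"control_id": "title", "commands": [("drawText", "Song A")]},
    {"control_id": "album_art", "commands": [("drawBitmap",)]},
    {"control_id": "play_btn", "commands": [("drawCircle",)]},
]}
canvas = draw_projection_interface(
    msg,
    layer_order=["title", "play_btn"],
    positions={"title": (0, 0), "play_btn": (60, 80)})
```

Note how `album_art` is received but never drawn: selecting M of the N controls on the destination side is what lets the same first message serve destination devices with different capabilities.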
  • In a third aspect, the present application provides an electronic device, which may be the aforementioned source device or destination device.
  • The electronic device includes: a touch screen, a communication module, one or more processors, one or more memories, and one or more computer programs; the processor is coupled with the communication module, the touch screen, and the memory; the one or more computer programs are stored in the memory, and when the electronic device runs, the processor executes the one or more computer programs stored in the memory, so that the electronic device executes any of the screen projection display methods described above.
  • In a fourth aspect, the present application provides a computer storage medium including computer instructions which, when run on an electronic device, cause the electronic device to execute the screen projection display method according to any implementation of the first aspect.
  • In a fifth aspect, the present application provides a computer program product which, when run on an electronic device, causes the electronic device to execute the screen projection display method according to any implementation of the first aspect.
  • In a sixth aspect, the present application provides a screen projection display system, which may include at least one source device and at least one destination device; the source device can be used to execute the screen projection display method according to any implementation of the first aspect, and the destination device can be used to execute the screen projection display method according to any implementation of the second aspect.
  • It can be understood that the electronic device of the third aspect, the computer storage medium of the fourth aspect, the computer program product of the fifth aspect, and the system of the sixth aspect provided above are all used to execute the corresponding methods provided above; therefore, for the beneficial effects they can achieve, reference may be made to the beneficial effects of the corresponding methods, which will not be repeated here.
  • FIG. 1 is a schematic architectural diagram of a communication system provided by an embodiment of this application;
  • FIG. 2 is a first schematic structural diagram of an electronic device provided by an embodiment of this application;
  • FIG. 3 is a schematic structural diagram of an operating system in an electronic device provided by an embodiment of this application;
  • FIG. 4 is a first schematic diagram of an application scenario of a screen projection display method provided by an embodiment of this application;
  • FIG. 5 is a second schematic diagram of an application scenario of a screen projection display method provided by an embodiment of this application;
  • FIG. 6 is a third schematic diagram of an application scenario of a screen projection display method provided by an embodiment of this application;
  • FIG. 7 is a fourth schematic diagram of an application scenario of a screen projection display method provided by an embodiment of this application;
  • FIG. 8 is a fifth schematic diagram of an application scenario of a screen projection display method provided by an embodiment of this application;
  • FIG. 9 is a sixth schematic diagram of an application scenario of a screen projection display method provided by an embodiment of this application;
  • FIG. 10 is a seventh schematic diagram of an application scenario of a screen projection display method provided by an embodiment of this application;
  • FIG. 11 is an eighth schematic diagram of an application scenario of a screen projection display method provided by an embodiment of this application;
  • FIG. 12 is a ninth schematic diagram of an application scenario of a screen projection display method provided by an embodiment of this application;
  • FIG. 13 is a tenth schematic diagram of an application scenario of a screen projection display method provided by an embodiment of this application;
  • FIG. 14 is an eleventh schematic diagram of an application scenario of a screen projection display method provided by an embodiment of this application;
  • FIG. 15 is a twelfth schematic diagram of an application scenario of a screen projection display method provided by an embodiment of this application;
  • FIG. 16 is a thirteenth schematic diagram of an application scenario of a screen projection display method provided by an embodiment of this application;
  • FIG. 17 is a fourteenth schematic diagram of an application scenario of a screen projection display method provided by an embodiment of this application;
  • FIG. 18 is a fifteenth schematic diagram of an application scenario of a screen projection display method provided by an embodiment of this application;
  • FIG. 19 is a sixteenth schematic diagram of an application scenario of a screen projection display method provided by an embodiment of this application;
  • FIG. 20 is a second schematic structural diagram of an electronic device provided by an embodiment of this application.
  • A screen projection display method provided by an embodiment of the present application may be applied to a communication system 100, and the communication system 100 may include N (N>1) electronic devices.
  • For example, the communication system 100 may include an electronic device 101 and an electronic device 102.
  • The electronic device 101 may be connected to the electronic device 102 through one or more communication networks 104.
  • The communication network 104 may be a wired network or a wireless network.
  • For example, the communication network 104 may be a local area network (LAN) or a wide area network (WAN), such as the Internet.
  • The communication network 104 can be implemented using any known network communication protocol.
  • The above network communication protocol may be any of various wired or wireless communication protocols, such as Ethernet, universal serial bus (USB), FireWire (IEEE 1394), global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), Bluetooth, wireless fidelity (Wi-Fi), NFC, voice over Internet protocol (VoIP), a communication protocol supporting a network slicing architecture, or any other suitable communication protocol.
  • For example, the electronic device 101 may establish a Wi-Fi connection with the electronic device 102 through the Wi-Fi protocol.
  • The electronic device 101 may serve as a source device, and the electronic device 102 may serve as a destination device of the electronic device 101.
  • The electronic device 101 can project a display interface on its display screen to the display screen of the electronic device 102 for display.
  • In some embodiments, the above communication system 100 may further include an electronic device 103; for example, the electronic device 103 may be a wearable device. The electronic device 101 can then also serve as the source device of the electronic device 103, projecting the display interface on its display screen to the display screen of the electronic device 103 as well.
  • In some embodiments, when the electronic device 101 is the source device, the electronic device 101 can project its display interface to multiple destination devices (for example, the above-mentioned electronic device 102 and electronic device 103) for display at the same time.
  • That is, any electronic device in the aforementioned communication system 100 can serve as a source device; the embodiments of the present application do not impose any limitation on this.
  • the electronic device 101 can identify each control in its display interface, and the position and layer relationship between each control. Furthermore, the electronic device 101 may perform transformations such as splitting, cutting, or reorganization of these controls, so as to obtain one or more controls (which may be referred to as target controls in the following) that need to be projected to the electronic device 102 (or the electronic device 103). Furthermore, the electronic device 101 may send a drawing instruction related to the target control to the electronic device 102 (or the electronic device 103) so that the electronic device 102 (or the electronic device 103) can draw the target control in the projection interface displayed by itself.
  • still taking the electronic device 101 as the source device as an example, the electronic device 101 can identify each control in its display interface.
  • the electronic device 101 can send drawing instructions related to these controls to the electronic device 102 (or the electronic device 103).
  • the electronic device 102 (or the electronic device 103) can split, crop, or reorganize the controls in the display interface of the electronic device 101, thereby obtaining the one or more target controls to be displayed this time. Subsequently, the electronic device 102 (or the electronic device 103) may draw these target controls on its own display screen.
  • the display interface finally presented to the user by the destination device may be different from the display interface in the source device.
  • the display interface of the source device can be cropped, transformed, or reorganized before being displayed on the destination device, so as to adapt more flexibly to device characteristics such as the destination device's screen size, improving the display effect and user experience in the projection scenario.
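The source-side flow described above (identify the controls, select the target controls, and forward only their drawing instructions to the destination device) can be sketched as follows. This is an illustrative Python model, not the actual Android view system: the `Control` class, control names, and instruction strings are all invented for illustration.

```python
from dataclasses import dataclass, field

# Illustrative model of a control in a display interface: each control
# carries the drawing instructions needed to render it, plus its children.
@dataclass
class Control:
    name: str
    draw_ops: list                       # e.g. ["DrawBitmap", "DrawRect"]
    children: list = field(default_factory=list)

def collect_draw_ops(root, targets):
    """Depth-first walk that keeps only the drawing instructions of the
    controls whose names are in `targets` (the "target controls")."""
    ops, stack = [], [root]
    while stack:
        node = stack.pop()
        if node.name in targets:
            ops.extend((node.name, op) for op in node.draw_ops)
        stack.extend(reversed(node.children))
    return ops

# A toy source-device interface: a title bar (with a back button) plus a
# video area; only the video control is projected to the destination.
ui = Control("root", ["DrawColor"], [
    Control("title_bar", ["DrawRect"], [Control("back_button", ["DrawBitmap"])]),
    Control("video", ["DrawBitmap"]),
])

print(collect_draw_ops(ui, {"video"}))  # [('video', 'DrawBitmap')]
```

The list returned here stands in for the "drawing instructions related to the target controls" that the source device would send over the network.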
  • the specific structures of the electronic device 101, the electronic device 102, and the electronic device 103 may be the same or different.
  • each of the above electronic devices may specifically be a mobile phone, tablet computer, smart TV, wearable electronic device, in-vehicle device, notebook computer, ultra-mobile personal computer (UMPC), handheld computer, netbook, personal digital assistant (PDA), virtual reality device, etc.
  • the embodiments of the present application do not make any restrictions on this.
  • FIG. 2 shows a schematic structural diagram of the electronic device 101.
  • the electronic device 101 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a camera 193, a display screen 194, etc.
  • the structure illustrated in the embodiment of the present application does not constitute a specific limitation on the electronic device 101.
  • the electronic device 101 may include more or fewer components than shown in the figure, or combine certain components, or split certain components, or arrange different components.
  • the illustrated components can be implemented in hardware, software, or a combination of software and hardware.
  • the processor 110 may include one or more processing units.
  • the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc.
  • the different processing units may be independent devices or integrated in one or more processors.
  • a memory may also be provided in the processor 110 to store instructions and data.
  • the memory in the processor 110 is a cache memory.
  • the memory can store instructions or data that the processor 110 has just used or used cyclically. If the processor 110 needs to use the instructions or data again, it can call them directly from this memory, which avoids repeated accesses, reduces the waiting time of the processor 110, and improves system efficiency.
  • the processor 110 may include one or more interfaces.
  • the interface may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, etc.
  • the charging management module 140 is used to receive charging input from the charger.
  • the charger can be a wireless charger or a wired charger.
  • the charging management module 140 may receive the charging input of the wired charger through the USB interface 130.
  • the charging management module 140 may receive the wireless charging input through the wireless charging coil of the electronic device 101. While the charging management module 140 charges the battery 142, it can also supply power to the electronic device through the power management module 141.
  • the power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110.
  • the power management module 141 receives input from the battery 142 and/or the charging management module 140, and supplies power to the processor 110, the internal memory 121, the display screen 194, the camera 193, and the wireless communication module 160.
  • the power management module 141 can also be used to monitor parameters such as battery capacity, battery cycle times, and battery health status (leakage, impedance).
  • the power management module 141 may also be provided in the processor 110.
  • the power management module 141 and the charging management module 140 may also be provided in the same device.
  • the wireless communication function of the electronic device 101 can be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modem processor, and the baseband processor.
  • the antenna 1 and the antenna 2 are used to transmit and receive electromagnetic wave signals.
  • each antenna in the electronic device 101 can be used to cover a single communication frequency band or multiple communication frequency bands, and different antennas can also be multiplexed to improve antenna utilization.
  • antenna 1 can be multiplexed as a diversity antenna of a wireless local area network.
  • the antenna can be used in combination with a tuning switch.
  • the mobile communication module 150 can provide a wireless communication solution including 2G/3G/4G/5G and the like applied to the electronic device 101.
  • the mobile communication module 150 may include one or more filters, switches, power amplifiers, low noise amplifiers (LNA), etc.
  • the mobile communication module 150 can receive electromagnetic waves via the antenna 1, perform processing such as filtering and amplification on the received electromagnetic waves, and transmit them to the modem processor for demodulation.
  • the mobile communication module 150 can also amplify the signal modulated by the modem processor, and convert it into electromagnetic waves for radiation via the antenna 1.
  • at least part of the functional modules of the mobile communication module 150 may be provided in the processor 110.
  • at least part of the functional modules of the mobile communication module 150 and at least part of the modules of the processor 110 may be provided in the same device.
  • the modem processor may include a modulator and a demodulator.
  • the modulator is used to modulate the low frequency baseband signal to be sent into a medium and high frequency signal.
  • the demodulator is used to demodulate the received electromagnetic wave signal into a low-frequency baseband signal. Then the demodulator transmits the demodulated low-frequency baseband signal to the baseband processor for processing.
  • the low-frequency baseband signal is processed by the baseband processor and then passed to the application processor.
  • the application processor outputs a sound signal through an audio device (not limited to the speaker 170A, the receiver 170B, etc.), or displays an image or video through the display screen 194.
  • the modem processor may be an independent device.
  • the modem processor may be independent of the processor 110 and be provided in the same device as the mobile communication module 150 or other functional modules.
  • the wireless communication module 160 can provide wireless communication solutions applied to the electronic device 101, including wireless local area network (WLAN) (such as wireless fidelity (Wi-Fi) networks), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR) technology, etc.
  • the wireless communication module 160 may be one or more devices integrating one or more communication processing modules.
  • the wireless communication module 160 receives electromagnetic waves via the antenna 2, frequency modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 110.
  • the wireless communication module 160 can also receive the signal to be sent from the processor 110, perform frequency modulation, amplify it, and convert it into electromagnetic wave radiation via the antenna 2.
  • the antenna 1 of the electronic device 101 is coupled with the mobile communication module 150, and the antenna 2 is coupled with the wireless communication module 160, so that the electronic device 101 can communicate with the network and other devices through wireless communication technology.
  • the wireless communication technologies may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technology, etc.
  • the GNSS may include the global positioning system (GPS), the global navigation satellite system (GLONASS), the Beidou navigation satellite system (BDS), the quasi-zenith satellite system (QZSS), and/or satellite-based augmentation systems (SBAS).
  • the electronic device 101 implements a display function through a GPU, a display screen 194, and an application processor.
  • the GPU is a microprocessor for image processing, connected to the display 194 and the application processor.
  • the GPU is used to perform mathematical and geometric calculations for graphics rendering.
  • the processor 110 may include one or more GPUs, which execute program instructions to generate or change display information.
  • the display screen 194 is used to display images, videos, etc.
  • the display screen 194 includes a display panel.
  • the display panel may adopt a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a MiniLED, a MicroLED, a Micro-OLED, a quantum dot light-emitting diode (QLED), etc.
  • the electronic device 101 may include one or N display screens 194, and N is a positive integer greater than one.
  • the electronic device 101 can realize a shooting function through an ISP, a camera 193, a video codec, a GPU, a display screen 194, and an application processor.
  • the ISP is used to process the data fed back by the camera 193. For example, when taking a photo, the shutter opens and light is transmitted through the lens to the camera's photosensitive element; the photosensitive element converts the optical signal into an electrical signal and passes it to the ISP, which processes it into an image visible to the naked eye.
  • the ISP can also perform algorithm optimization on image noise, brightness, and skin tone, and can optimize parameters such as the exposure and color temperature of the shooting scene.
  • the ISP may be provided in the camera 193.
  • the camera 193 is used to capture still images or videos.
  • the object generates an optical image through the lens and projects it to the photosensitive element.
  • the photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor.
  • the photosensitive element converts the optical signal into an electrical signal, and then transmits the electrical signal to the ISP to convert it into a digital image signal.
  • ISP outputs digital image signals to DSP for processing.
  • the DSP converts digital image signals into image signals in standard RGB, YUV, and other formats.
  • the electronic device 101 may include 1 or N cameras 193, and N is a positive integer greater than 1.
  • digital signal processors are used to process digital signals; in addition to digital image signals, they can also process other digital signals. For example, when the electronic device 101 selects a frequency point, the digital signal processor is used to perform a Fourier transform on the energy of the frequency point.
  • Video codecs are used to compress or decompress digital video.
  • the electronic device 101 may support one or more video codecs. In this way, the electronic device 101 can play or record videos in multiple encoding formats, such as moving picture experts group (MPEG) 1, MPEG2, MPEG3, MPEG4, and so on.
  • the external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the electronic device 101.
  • the external memory card communicates with the processor 110 through the external memory interface 120 to realize the data storage function. For example, save music, video and other files in an external memory card.
  • the internal memory 121 may be used to store one or more computer programs, and the one or more computer programs include instructions.
  • the processor 110 can run the above-mentioned instructions stored in the internal memory 121 to enable the electronic device 101 to execute the projection display method provided in some embodiments of the present application, as well as various functional applications and data processing.
  • the internal memory 121 may include a program storage area and a data storage area. The program storage area can store the operating system, and can also store one or more application programs (such as a gallery or contacts).
  • the data storage area can store data (such as photos, contacts, etc.) created during the use of the electronic device 101.
  • the internal memory 121 may include a high-speed random access memory, and may also include a non-volatile memory, such as one or more magnetic disk storage devices, flash memory devices, universal flash storage (UFS), etc.
  • the processor 110 executes the instructions stored in the internal memory 121 and/or the instructions stored in the memory provided in the processor, so as to cause the electronic device 101 to execute the projection display method provided in the embodiments of the present application, as well as various functional applications and data processing.
  • the electronic device 101 can implement audio functions through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the earphone interface 170D, and the application processor. For example, music playback, recording, etc.
  • the audio module 170 is used to convert digital audio information into an analog audio signal for output, and is also used to convert an analog audio input into a digital audio signal.
  • the audio module 170 can also be used to encode and decode audio signals.
  • the audio module 170 may be provided in the processor 110, or part of the functional modules of the audio module 170 may be provided in the processor 110.
  • the speaker 170A, also called a "loudspeaker", is used to convert audio electrical signals into sound signals.
  • the electronic device 101 can listen to music through the speaker 170A, or listen to a hands-free call.
  • the receiver 170B, also called an "earpiece", is used to convert audio electrical signals into sound signals.
  • when the electronic device 101 answers a call or a voice message, the user can listen to the voice by bringing the receiver 170B close to the ear.
  • the microphone 170C, also called a "mic", is used to convert sound signals into electrical signals. When making a sound, the user can speak with the mouth close to the microphone 170C to input the sound signal into the microphone 170C.
  • the electronic device 101 may be provided with one or more microphones 170C. In other embodiments, the electronic device 101 may be provided with two microphones 170C, which can implement noise reduction functions in addition to collecting sound signals. In some other embodiments, the electronic device 101 can also be provided with three, four or more microphones 170C to collect sound signals, reduce noise, identify sound sources, and realize directional recording functions.
  • the earphone interface 170D is used to connect wired earphones.
  • the earphone interface 170D may be the USB interface 130, a 3.5 mm open mobile terminal platform (OMTP) standard interface, or a cellular telecommunications industry association of the USA (CTIA) standard interface.
  • the sensor module 180 may include a pressure sensor, a gyroscope sensor, an air pressure sensor, a magnetic sensor, an acceleration sensor, a distance sensor, a proximity light sensor, a fingerprint sensor, a temperature sensor, a touch sensor, an ambient light sensor, a bone conduction sensor, etc.
  • the above electronic device may also include one or more components such as buttons, motors, indicators, and SIM card interfaces, which are not limited in the embodiment of the present application.
  • the software system of the above electronic device 101 may adopt a layered architecture, an event-driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture.
  • the embodiment of the present application takes a layered Android system as an example to illustrate the software structure of the electronic device 101.
  • FIG. 3 is a software structure block diagram of the electronic device 101 according to an embodiment of the present application.
  • the layered architecture divides the software into several layers, each with a clear role and division of labor. The layers communicate with each other through software interfaces.
  • the Android system is divided into four layers, from top to bottom, the application layer, the application framework layer, the Android runtime and system library, and the kernel layer.
  • the application layer can include a series of applications.
  • the above-mentioned application programs may include APPs (applications) such as call, contact, camera, gallery, calendar, map, navigation, Bluetooth, music, video, short message, etc.
  • the application framework layer provides application programming interfaces (application programming interface, API) and programming frameworks for applications in the application layer.
  • the application framework layer includes some predefined functions.
  • the application framework layer may include a view system, a notification manager, an activity manager, a window manager, a content provider, a resource manager, an input method manager, and so on.
  • the view system can be used to construct the display interface of the application.
  • Each display interface can consist of one or more controls.
  • controls can include interface elements such as icons, buttons, menus, tabs, text boxes, dialog boxes, status bars, navigation bars, and widgets.
  • the view system can obtain view information of the corresponding display interface when drawing the display interface, and the view information records the layer relationship between the various controls in the display interface to be drawn.
  • each control in the display interface is generally organized hierarchically according to a tree structure to form a complete ViewTree (view tree), which may be referred to as the view information of the above display interface.
  • the view system can draw the display interface according to the layer relationship between the controls set in the ViewTree.
  • Each control in the display interface corresponds to a set of drawing instructions, such as DrawLine, DrawPoint, DrawBitmap, etc.
  • the view system can in turn call the drawing instructions of the corresponding control to draw the control according to the layer relationship between the controls in the ViewTree.
  • FIG. 4 shows the chat interface 401 of the WeChat APP.
  • the bottommost control in the chat interface 401 is the root node (root), under which a base map 402 is set. On the base map 402, the following controls are further included: a title bar 403, a chat background 404, and an input bar 405.
  • the title bar 403 further includes a return button 406 and a title 407
  • the chat background 404 further includes an avatar 408 and a bubble 409
  • the input bar 405 further includes a voice input button icon 410, an input box 411, and a send button 412.
  • the base map 402 is located under the root node; the title bar 403, the chat background 404, and the input bar 405 are all child nodes of the base map 402. The return button 406 and the title 407 are both child nodes of the title bar 403, and the avatar 408 and the bubble 409 are both child nodes of the chat background 404. The voice input button icon 410, the input box 411, and the send button 412 are all child nodes of the input bar 405.
  • the view system can call the corresponding drawing instructions layer by layer to draw each control according to the layer relationship between the controls in the view tree A, and finally form the chat interface 401.
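The parent-before-child drawing order described for view tree A can be illustrated with a small sketch. The node names follow the reference numerals above, while the tuple-based tree representation and the traversal function are assumptions made purely for illustration (they are not the Android view system).

```python
def draw_order(node, order=None):
    """Return the order in which controls are drawn: each parent is drawn
    before its children, so child controls end up stacked on top of the
    layers beneath them."""
    if order is None:
        order = []
    name, children = node
    order.append(name)
    for child in children:
        draw_order(child, order)
    return order

# View tree A for the chat interface 401; each node is (name, children).
view_tree_a = ("root", [
    ("basemap_402", [
        ("title_bar_403", [("back_button_406", []), ("title_407", [])]),
        ("chat_bg_404", [("avatar_408", []), ("bubble_409", [])]),
        ("input_bar_405", [("voice_icon_410", []),
                           ("input_box_411", []),
                           ("send_button_412", [])]),
    ]),
])

print(draw_order(view_tree_a))
```

Calling the drawing instruction of each name in this order (root first, leaf controls last) reproduces the layer-by-layer drawing the text describes.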
  • a projection management module may be added to the view system of the source device.
  • the projection management module can record the drawing instructions used by the view system to draw each control in each display interface, as well as the drawing resources (such as avatars or icons) required by these drawing instructions.
  • the projection management module can determine one or more target controls in the current display interface that need to be projected to the target device for display.
  • based on view tree 1 of the current display interface, the projection management module can generate view tree 2 for the projection interface formed after the target controls are projected.
  • the number of controls in view tree 2 may be different from the number of controls in view tree 1, and the positional relationship between controls in view tree 2 may also be different from the positional relationship of controls in view tree 1.
  • the projection management module can instruct the source device to send view tree 2, together with the drawing instructions and drawing resources of each control in view tree 2, to the destination device, so that the destination device can call the drawing instructions of the corresponding controls layer by layer, according to the layer relationship between the controls in view tree 2, to draw the projection interface.
  • in this way, the source device can project the target controls in its display interface into the projection interface displayed on the destination device.
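One way to picture how view tree 2 is derived from view tree 1 is a pruning pass that keeps only the target controls and their ancestor chain, dropping every other branch. This is a hedged sketch of the idea, not the actual projection management module; the tree representation and control names are invented for illustration.

```python
def prune(node, targets):
    """Return a pruned copy of a view tree: a node is kept if it is a
    target control or has a kept descendant; everything else is dropped."""
    name, children = node
    kept = [sub for sub in (prune(child, targets) for child in children) if sub]
    if name in targets or kept:
        return (name, kept)
    return None

# A toy view tree 1 on the source device.
view_tree_1 = ("root", [
    ("title_bar", [("back_button", [])]),
    ("video", []),
    ("comment_list", []),
])

# Only the video control is projected; view tree 2 keeps it plus its
# ancestor chain (here just the root).
view_tree_2 = prune(view_tree_1, {"video"})
print(view_tree_2)  # ('root', [('video', [])])
```

Reorganization (changing where a kept control is attached, or its position) would be a further transformation on the pruned tree; this sketch only shows the splitting/deleting step.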
  • a projection management module can also be added to the view system of the destination device.
  • when the above-mentioned electronic device 101 is a destination device, it can receive a UI message sent by the source device.
  • the UI message can include the view tree 1 of the display interface in the source device and the drawing instructions and drawing resources of each control in the view tree 1.
  • the projection management module can generate view tree 2 of the projection interface to be displayed on the destination device this time. In this way, the destination device can call the drawing instructions and drawing resources of the corresponding controls layer by layer, according to the layer relationship between the controls in view tree 2, to draw the projection interface.
  • the above-mentioned projection management module can split, delete, and reorganize the controls in the source device's display interface, so as to display in the destination device a new projection interface adapted to device characteristics such as the destination device's display size, thereby improving the display effect and user experience of the destination device in the projection scenario.
  • the above-mentioned projection management module can also be set at the application framework layer independently of the view system, and the embodiment of the present application does not impose any limitation on this.
  • the aforementioned activity manager can be used to manage the life cycle of each application.
  • Applications usually run in the operating system in the form of activity.
  • the activity manager can schedule the activity process of the application to manage the life cycle of each application.
  • the window manager is used to manage window programs.
  • the window manager can obtain the size of the display, determine whether there is a status bar, lock the screen, take a screenshot, etc.
  • the content provider is used to store and retrieve data and make these data accessible to applications.
  • the data may include video, image, audio, phone calls made and received, browsing history and bookmarks, phone book, etc.
  • the resource manager provides various resources for the application, such as localized strings, icons, pictures, layout files, video files, etc.
  • the Android runtime includes a core library and a virtual machine, and is responsible for the scheduling and management of the Android system.
  • the core library consists of two parts: one part is the functions that the Java language needs to call, and the other part is the core library of Android.
  • the application layer and the application framework layer run in a virtual machine.
  • the virtual machine executes the java files of the application layer and the application framework layer as binary files.
  • the virtual machine is used to perform functions such as object life cycle management, stack management, thread management, security and exception management, and garbage collection.
  • the system library can include multiple functional modules. For example: surface manager (surface manager), media library (Media Libraries), three-dimensional graphics processing library (for example: OpenGL ES), 2D graphics engine (for example: SGL), etc.
  • the surface manager is used to manage the display subsystem, and provides a combination of 2D and 3D layers for multiple applications.
  • the media library supports playback and recording of a variety of commonly used audio and video formats, as well as still image files.
  • the media library can support multiple audio and video encoding formats, such as: MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, etc.
  • the 3D graphics processing library is used to realize 3D graphics drawing, image rendering, synthesis, and layer processing.
  • the 2D graphics engine is a drawing engine for 2D drawing.
  • the kernel layer is the layer between hardware and software.
  • the kernel layer includes at least a display driver, a camera driver, an audio driver, a sensor driver, etc., which are not limited in the embodiment of the present application.
  • taking a mobile phone as an example of a source device for screen projection, the screen projection display method provided by an embodiment of the present application is described in detail below with reference to the accompanying drawings.
  • a screen projection button can be set in the mobile phone in advance, and the screen projection button can be used to project the display interface in the mobile phone to other electronic devices for display.
  • the above-mentioned screen projection button may be set in a position such as a pull-down menu, a pull-up menu, or a negative one-screen menu, which is not limited in the embodiment of the present application.
  • when the user uses the music APP to play songs on the mobile phone, the mobile phone can display the music playing interface 501 of the music APP. If the user wants to project the music playing interface 501 of the mobile phone to another electronic device for continued display, the user can perform a preset operation in the music playing interface 501.
  • the preset operation may be a pull-down operation.
  • in response to a pull-down operation performed by the user in the music playing interface 501, as shown in (b) of FIG. 5, the mobile phone may display a pull-down menu 503 including a device switching button 502.
  • the mobile phone may display a prompt box 601, and the prompt box 601 may contain one or more candidate devices. The user can select one or more of these candidate devices as the target device. Subsequently, the mobile phone may project the above-mentioned music playing interface 501 to the destination device selected by the user for screen projection display.
  • the mobile phone can query other electronic devices located in the same network. For example, if the mobile phone is connected to a Wi-Fi network named "1234", the mobile phone can query which electronic devices are included in the Wi-Fi network. If it is found that the Wi-Fi network also includes smart TVs, notebooks, tablets, and smart watches, the mobile phone can use these electronic devices as candidate devices and display the icons of these electronic devices in the prompt box 601.
  • the mobile phone can also request the server to query other electronic devices logged in to the same Huawei account as the mobile phone. The server may then send the mobile phone the identifiers of the one or more electronic devices found to be logged in to the same Huawei account. In this way, the mobile phone can use these electronic devices as candidate devices and display their icons in the prompt box 601.
  • the mobile phone may first split, cut, or reorganize the display content in the music playing interface 501 to obtain a screen projection interface corresponding to the music playing interface 501. Furthermore, the mobile phone can project the screen projection interface to the smart watch for display, so that the screen projection interface displayed in the smart watch can not only reflect the content in the music playback interface 501, but also adapt to the screen size of the smart watch's own display screen.
  • Each configuration file can correspond to a specific application or a specific display interface.
  • Each configuration file records one or more target controls in the source device that need to be projected to the destination device.
  • the configuration file may record the identification of one or more controls in the music playback interface 501 that need to be projected to the target device.
  • the mobile phone can determine the target controls in the music playing interface 501 that need to be projected according to the identifiers of the controls recorded in the configuration file.
  • the identifiers of the controls in the music playing interface 501 may be updated over time, so the target controls determined from the control identifiers in the configuration file may be inaccurate.
  • the specific display position of one or more controls that need to be projected on the music playing interface 501 may also be recorded in the configuration file.
  • the position of each control can be uniquely determined by the values of four parameters: left, top, width, and height. Here, left is the x-coordinate of the control's top-left vertex, top is the y-coordinate of the control's top-left vertex, width is the width of the control, and height is the height of the control. In this way, the mobile phone can uniquely determine the target controls in the music playing interface 501 that need to be projected according to the display positions of the controls recorded in the configuration file.
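The four-parameter scheme above can be sketched as follows (a minimal illustration; the `Bounds` type and the sample coordinates are assumptions for illustration, not part of the patent):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Bounds:
    """Display region of a control, as recorded in a configuration file."""
    left: int    # x-coordinate of the control's top-left vertex
    top: int     # y-coordinate of the control's top-left vertex
    width: int   # width of the control
    height: int  # height of the control

def matches(recorded: Bounds, on_screen: Bounds) -> bool:
    """A control in the interface is a target control when its on-screen
    region equals the region recorded in the configuration file."""
    return recorded == on_screen

control_bar = Bounds(left=0, top=1800, width=1080, height=360)
print(matches(control_bar, Bounds(0, 1800, 1080, 360)))  # True
```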
  • the configuration file can also record the display position of the target control in the projection interface after being projected to the target device.
  • the music playing interface 501 includes the following controls: a base map 701, a status bar 702, a title bar 703, an album cover 704, lyrics 705, and a control bar 706.
  • the status bar 702 includes controls such as time, signal strength, and battery capacity.
  • the title bar 703 includes controls such as the song name 7031 and the singer 7032.
  • the control bar 706 includes controls such as a progress bar 7061, a pause button 7062, a previous button 7063, and a next button 7064.
  • the mobile phone can preset the corresponding configuration file 1 for projecting the music playing interface 501 to the smart watch. Because the display screen of the smart watch is small, the user's main purpose in projecting the music playing interface 501 from the mobile phone to the smart watch is to conveniently control the music APP. Therefore, the target controls in the music playing interface 501 that need to be projected can be preset in configuration file 1 as the title bar 703 and the control bar 706. In addition, configuration file 1 can record the display position of each control in the title bar 703 and the control bar 706 in the music playing interface 501 before projection, as well as the display position of each of these controls after projection.
  • the configuration file 1 corresponding to the music playing interface 501 may be:
  • configuration file can be stored in a mobile phone or server in JSON (JavaScript Object Notation) format, XML (Extensible Markup Language) format, or text format, and the embodiment of the application does not impose any limitation on this.
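For illustration, a hypothetical configuration file in the spirit of configuration file 1, stored as JSON and parsed on the source device. The "src" and "dest" field names follow the patent's description; the package name, activity name, control ids, and all coordinate values are invented for this sketch:

```python
import json

# Hypothetical configuration file for the music playing interface:
# "src" records a target control's position before projection and
# "dest" records its position in the projection interface.
CONFIG_1 = """
{
  "packagename": "com.example.music",
  "activityname": "MusicPlayActivity",
  "targets": [
    {"id": "title_bar",
     "src":  {"left": 0, "top": 120,  "width": 1080, "height": 160},
     "dest": {"left": 0, "top": 0,    "width": 454,  "height": 80}},
    {"id": "control_bar",
     "src":  {"left": 0, "top": 1800, "width": 1080, "height": 360},
     "dest": {"left": 0, "top": 300,  "width": 454,  "height": 154}}
  ]
}
"""

config = json.loads(CONFIG_1)
target_ids = [t["id"] for t in config["targets"]]
print(target_ids)  # ['title_bar', 'control_bar']
```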
  • JSON JavaScript Object Notation
  • XML Extensible Markup Language
  • the mobile phone can obtain one or more configuration files preset for the smart watch.
  • the mobile phone can obtain the package name (packagename) of the music application currently running in the foreground and the activityname of the current display interface.
  • the mobile phone can query the configuration file 1 corresponding to the music playing interface 501 according to the obtained packagename and activityname. Then, according to the identifiers or the location information recorded in the "src" fields of the target controls in configuration file 1, the mobile phone can identify one or more target controls in the music playing interface 501 that need to be projected to the smart watch.
  • the mobile phone or the server can also preset corresponding configuration files for smart watches of different models or different display specifications (such as screen shape, resolution, etc.). Then, after detecting that the user selects the smart watch in the prompt box 601, the mobile phone can further obtain the model or display specification of the smart watch that is projected this time, and find the configuration file corresponding to the model or display specification.
  • the target controls that need to be projected by smart watches of different models or different display specifications can also be recorded in the same configuration file (for example, the aforementioned configuration file 1).
  • after obtaining configuration file 1, the mobile phone can find the IDs or the corresponding "src" fields of the target controls in configuration file 1 according to the model or display specifications of the smart watch for this projection, so as to identify the one or more target controls that need to be projected to and displayed on the smart watch this time.
  • the mobile phone can project the above-mentioned music playing interface 501 to the smart watch for display according to an existing screen projection scheme; the embodiment of this application does not impose any restriction on this.
  • the mobile phone can also obtain the view information corresponding to the view system when the music playing interface 501 is drawn.
  • the view tree 801 records the layer relationship between the various controls in the music playing interface 501 described above.
  • the root node of the music playing interface 501 has one child node, the base map 701, and the status bar 702, the title bar 703, the album cover 704, the lyrics 705, and the control bar 706 are all child nodes of the base map 701.
  • the song name 7031 and the singer 7032 are child nodes of the title bar 703.
  • the progress bar 7061, the pause button 7062, the previous button 7063, and the next button 7064 are child nodes of the control bar 706.
  • the mobile phone recognizes, through the above configuration file 1, the target controls in the music playing interface 501, including: the song name 7031 and the singer 7032 in the title bar 703; the pause button 7062, the previous button 7063, and the next button 7064 in the control bar 706; and the album cover 704. Furthermore, since configuration file 1 records the display positions of the target controls in the projection interface after projection, the mobile phone can split, crop, and reorganize the view tree 801 of the music playing interface 501 according to configuration file 1, to generate the view tree 901 of the projection interface displayed on the smart watch after projection.
  • FIG. 9 is a schematic diagram of the view tree 901 of the screen projection interface.
  • the mobile phone deletes the nodes in the view tree 801 that are not target controls, such as the aforementioned base map 701, the status bar 702, the controls in the status bar 702, and the progress bar 7061 in the control bar 706.
  • since configuration file 1 records that, after projection, the target controls in the title bar 703 and the control bar 706 are located on the layer above the album cover 704, the mobile phone can set the title bar 703 and the control bar 706 as child nodes of the album cover 704 in the view tree 901.
  • the child nodes of the title bar 703 include the song name 7031 and the singer 7032.
  • the sub-nodes of the control bar 706 include the pause button 7062, the previous button 7063, and the next button 7064.
  • after the mobile phone recognizes the target controls in the music playing interface 501 through the above configuration file 1, it can also use the layer relationship of the target controls in the music playing interface 501 to generate the view tree corresponding to the screen projection interface. For example, the mobile phone can split the target controls out of the view tree 801 and, according to the layer relationship of the target controls in the view tree 801, generate a new view tree, that is, the view tree corresponding to the projection interface.
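The split/crop/reorganize step described above can be sketched as follows, assuming a simple tuple-based view tree. The control names and the reorganization target are illustrative; the patent's actual view system data structures are not reproduced here:

```python
# Derive a projection view tree (in the spirit of view tree 901) from the
# source view tree (in the spirit of view tree 801): non-target nodes are
# deleted and their surviving children promoted, then the targets are
# reorganized per the configuration file.
def keep_targets(node, targets):
    """Return a list of subtrees that contain only target controls,
    preserving the relative layer order of the targets."""
    name, children = node
    surviving = []
    for child in children:
        surviving.extend(keep_targets(child, targets))
    if name in targets:
        return [(name, surviving)]
    return surviving  # drop this node, promote its surviving children

view_tree_801 = ("root", [("base_map", [
    ("status_bar", []),
    ("title_bar", [("song_name", []), ("singer", [])]),
    ("album_cover", []),
    ("lyrics", []),
    ("control_bar", [("progress_bar", []), ("pause_button", []),
                     ("previous_button", []), ("next_button", [])]),
])])

targets = {"title_bar", "song_name", "singer", "album_cover",
           "control_bar", "pause_button", "previous_button", "next_button"}
forest = dict(keep_targets(view_tree_801, targets))

# Reorganize per the configuration file: the title bar and control bar sit
# on the layer above the album cover, i.e. become its child nodes.
view_tree_901 = ("root", [("album_cover", [
    ("title_bar", forest["title_bar"]),
    ("control_bar", forest["control_bar"]),
])])
print(view_tree_901)
```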
  • the mobile phone (i.e., the source device) can send a UI message to the smart watch, where the UI message includes the view tree corresponding to the projection interface (for example, the view tree 901) and the drawing instructions and drawing resources related to each control in the view tree.
  • a mobile phone and a smart watch can establish a socket connection based on the TCP/IP protocol.
  • the mobile phone can use the socket connection to send the UI message corresponding to the music playing interface 501 to the smart watch.
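A minimal sketch of sending a UI message over such a socket connection. The patent specifies only a socket connection based on the TCP/IP protocol; the length-prefixed JSON wire format, function names, and message contents below are assumptions:

```python
import json
import socket
import threading

def _recv_exact(sock, n):
    """Read exactly n bytes from the socket."""
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("socket closed")
        buf += chunk
    return buf

def send_ui_message(sock, ui_message):
    """Serialize a UI message as length-prefixed JSON and send it."""
    payload = json.dumps(ui_message).encode("utf-8")
    sock.sendall(len(payload).to_bytes(4, "big") + payload)

def recv_ui_message(sock):
    """Receive one length-prefixed JSON UI message."""
    size = int.from_bytes(_recv_exact(sock, 4), "big")
    return json.loads(_recv_exact(sock, size).decode("utf-8"))

# Loopback demo standing in for the phone-to-watch connection.
phone_end, watch_end = socket.socketpair()
ui_message = {"view_tree": ["album_cover", "title_bar", "control_bar"],
              "instructions": ["drawBitmap", "drawText"]}
threading.Thread(target=send_ui_message, args=(phone_end, ui_message)).start()
print(recv_ui_message(watch_end)["view_tree"][0])  # album_cover
```

Length-prefixing is one common way to frame discrete messages on a TCP stream, which is byte-oriented and has no built-in message boundaries.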
  • the screen projection method provided in this embodiment of the application can reduce the transmission bandwidth when the source device interacts with the destination device, thereby improving the transmission speed during projection.
  • the mobile phone can generate a UI message corresponding to the new display interface according to the above method, and send the new UI message to the smart watch.
  • after the smart watch receives the UI message corresponding to the above-mentioned music playing interface 501, it can call the drawing instruction of each target control in the view tree 901 in turn, according to the layer relationship between the target controls in the view tree 901, to draw the target controls. Finally, as shown in FIG. 10, the smart watch can draw the screen projection interface 1001 obtained after the above-mentioned music playing interface 501 is projected. Each control in the projection interface 1001 corresponds one-to-one to a control in the view tree 901.
  • the smart watch may also store configuration files of different display interfaces, or the smart watch may also obtain configuration files of different display interfaces from the server.
  • the UI message sent from the mobile phone may also carry the identifier of the music playing interface 501 described above.
  • the smart watch can find a corresponding configuration file (for example, the aforementioned configuration file 1) according to the identifier of the music playing interface 501.
  • when the destination device draws the projection interface 1001, it can determine the drawing order of the target controls according to the layer relationship between the target controls in the view tree 901.
  • the destination device can determine, according to the view tree 901, to first draw the album cover 704 and then draw the child nodes of the album cover 704 (such as the title bar 703).
  • the smart watch can also determine the specific drawing position of the album cover 704 according to the "dest" field of the album cover 704 in configuration file 1, and then call the drawing instruction corresponding to the album cover 704 to draw the album cover 704 at this position. Similarly, the smart watch can draw each target control in turn based on the layer relationship between the target controls in the view tree 901, thereby forming the projection interface 1001 shown in FIG. 10.
  • the configuration file 1 of the music playing interface 501 may not include the aforementioned "dest" field. In that case, when the smart watch draws the target control 1 in the music playing interface 501, it can determine the translation distance of the target control 1 on the x-axis and y-axis according to the "translationx" and "translationy" fields; it can determine the scaling ratio of the target control 1 on the x-axis and y-axis according to the "scalex" and "scaley" fields; and, for another example, it can determine the rotation angle of the target control 1 according to the "rotatedegree" field. In this way, the smart watch can calculate the specific display position of the target control 1 after projection, and can then call the drawing instruction of the target control 1 to draw and display the target control 1 in the corresponding position.
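Under the obvious interpretation of these fields, the projected position could be computed as follows. This is a sketch: the field names follow the patent, but the exact semantics shown are assumptions, and rotation via "rotatedegree" is omitted for brevity:

```python
def project_bounds(src, transform):
    """Compute a control's projected display region from its source
    region and the transform fields recorded in the configuration file."""
    return {
        "left":   src["left"] + transform.get("translationx", 0),
        "top":    src["top"]  + transform.get("translationy", 0),
        "width":  src["width"]  * transform.get("scalex", 1.0),
        "height": src["height"] * transform.get("scaley", 1.0),
    }

# e.g. move the control bar up and shrink it for the watch screen
src = {"left": 0, "top": 1800, "width": 1080, "height": 360}
print(project_bounds(src, {"translationy": -1500, "scalex": 0.5, "scaley": 0.5}))
# {'left': 0, 'top': 300, 'width': 540.0, 'height': 180.0}
```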
  • the screen projection interface 1001 can be adapted to the display size of the display screen in the smart watch and the user's usage requirements, thereby improving the display effect and user experience when the screen is projected between multiple devices.
  • the mobile phone can also project the current display interface to multiple destination devices for display at the same time.
  • when the mobile phone displays the video playback interface 1101 of the video APP, if it detects that the user has turned on the screen projection function and selected a smart watch and a smart TV as the destination devices for this projection, the mobile phone can obtain the configuration file A preset for the smart watch and corresponding to the video playback interface 1101, and the configuration file B preset for the smart TV and corresponding to the video playback interface 1101.
  • the mobile phone can identify the first target control in the video playback interface 1101 according to the identifier of the first target control recorded in configuration file A, or according to the display position of the first target control before projection; and it can identify the second target control in the video playback interface 1101 according to the identifier of the second target control recorded in configuration file B, or according to the display position of the second target control before projection.
  • the first target control corresponds to the smart watch
  • the second target control corresponds to the smart TV.
  • the video playback interface 1101 includes a status bar 1100, a video screen 1102, a text control 1103, a progress bar 1104, and a control bar 1105.
  • the control bar 1105 includes a pause button 1106, a previous button 1107, and a next button 1108.
  • the view tree 1201 corresponding to the video playback interface 1101 includes a root node, a status bar 1100, a video screen 1102, and a control bar 1105 located under the root node.
  • the video screen 1102 includes two sub-nodes, a text control 1103 and a progress bar 1104, and the control bar 1105 includes three sub-nodes: a pause button 1106, a previous button 1107, and a next button 1108.
  • through configuration file A, the mobile phone can recognize that the first target controls to be projected from the video playback interface 1101 to the smart watch are: the control bar 1105 and the sub-nodes under the control bar 1105; through configuration file B, it can recognize that the second target controls to be projected from the video playback interface 1101 to the smart TV are: the video screen 1102 and each sub-node under the video screen 1102.
  • the mobile phone can use the union of the first target control and the second target control in the view tree 1201 as the view tree to be sent this time.
  • after the mobile phone deletes the non-target controls (i.e., the status bar 1100) in the view tree 1201, the view tree 1202 including the first target controls and the second target controls can be obtained.
  • the mobile phone can send UI messages to smart watches and smart TVs via the aforementioned communication network 104.
  • the UI messages include not only the related drawing instructions and drawing resources of the first target control, but also the related drawing instructions and drawing resources of the second target control.
  • the UI message may also include the view tree 1202.
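The union-based view tree described above can be sketched as simple set operations (the control names are illustrative stand-ins for the controls in view tree 1201):

```python
# Target-control sets for the two destination devices.
first_targets = {"control_bar", "pause_button",
                 "previous_button", "next_button"}          # smart watch
second_targets = {"video_screen", "text_control",
                  "progress_bar"}                           # smart TV
all_controls = {"status_bar"} | first_targets | second_targets

union = first_targets | second_targets   # controls carried in the one UI message
dropped = all_controls - union           # non-targets deleted from the view tree
print(sorted(dropped))  # ['status_bar']
```

One UI message carrying the union lets the phone serve both destination devices at once; each device then keeps only its own target subset, as described below.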
  • the smart watch may pre-store the configuration file A corresponding to the video playback interface 1101, or the smart watch may obtain the corresponding configuration file A from the server after receiving the aforementioned UI message. Furthermore, as shown in FIG. 13, the smart watch can split, crop, and reorganize the view tree 1202 according to the first target controls recorded in configuration file A, to generate the view tree 1301 of the first screen projection interface displayed on the smart watch. For example, the view tree 1301 includes the control bar 1105 under the root node, and the child nodes under the control bar 1105 are: the pause button 1106, the previous button 1107, and the next button 1108.
  • the smart watch can use the corresponding drawing instructions to draw each first target control, according to the view tree 1301, at the specific position in the projection interface recorded in configuration file A, thereby displaying the first screen projection interface 1302 shown in FIG. 13. That is, after the mobile phone projects the video playback interface 1101 to the smart watch, the smart watch displays the relevant controls of the control bar in the video playback interface 1101.
  • the smart TV may also pre-store the configuration file B corresponding to the video playback interface 1101, or the smart TV may obtain the corresponding configuration file B from the server after receiving the aforementioned UI message. Furthermore, as shown in FIG. 14, the smart TV can split, crop, and reorganize the view tree 1202 according to the second target controls recorded in configuration file B, to generate the view tree 1401 of the second screen projection interface displayed on the smart TV. For example, the view tree 1401 includes the video screen 1102 under the root node, and the child nodes under the video screen 1102 are: the text control 1103 and the progress bar 1104.
  • the smart TV can use the corresponding drawing instructions to draw each second target control, according to the view tree 1401, at the specific position in the projection interface recorded in configuration file B, thereby displaying the second screen projection interface 1402 shown in FIG. 14. That is, after the mobile phone projects the video playback interface 1101 to the smart TV, the smart TV displays the relevant controls of the video screen in the video playback interface 1101.
  • after the mobile phone obtains the view tree 1201 of the aforementioned video playback interface 1101, it can also generate the view tree 1301 corresponding to the smart watch according to the aforementioned configuration file A, and generate the view tree 1401 corresponding to the smart TV according to the aforementioned configuration file B. Furthermore, the mobile phone can send a first UI message to the smart watch, which contains the view tree 1301 and the drawing instructions for each control in the view tree 1301, so that the smart watch can draw and display the first screen projection interface 1302 based on the view tree 1301. Correspondingly, the mobile phone can send a second UI message to the smart TV, which contains the view tree 1401 and the drawing instructions for each control in the view tree 1401, so that the smart TV can draw and display the above second screen projection interface 1402 based on the view tree 1401.
  • the controls in the display interface can be split, deleted, and reorganized to display different screen projection interfaces on different destination devices, so as to adapt to device characteristics such as the screen size of each destination device, thereby improving the display effect and user experience during projection.
  • the user can also manually specify the target control to be cast to the destination device in the display interface of the source device.
  • the source device can project one or more target controls manually designated by the user to the target device for display.
  • the mobile phone is displaying the lyrics browsing interface 1601 in the music APP.
  • the mobile phone can display multiple candidate devices that support this screen projection in the prompt box. If it is detected that the user selects the smart watch in the prompt box, it means that the user wants to project the currently displayed lyrics browsing interface 1601 to the smart watch for display. Then, the mobile phone can obtain the packagename of the music APP and the activityname of the lyrics browsing interface 1601, and the mobile phone can query the configuration files corresponding to the packagename and activityname in multiple configuration files corresponding to the smart watch (for example, configuration file 2). Similar to the aforementioned configuration file 1, the configuration file 2 records one or more target controls that can be projected to the smart watch in the browsing interface 1601.
  • the browsing interface 1601 includes the following controls: a status bar 1602, a title bar 1603, a lyrics 1604, and a control bar 1605.
  • the control bar 1605 includes controls such as a progress bar, a pause button, a previous button, and a next button.
  • the title bar 1603 includes controls such as song name and artist.
  • the status bar 1602 includes controls such as time, signal strength, and battery capacity.
  • after detecting that the user selects the smart watch as the destination device, as shown in (a) in FIG. 17, the mobile phone can display one or more circle selection boxes 1701 on the browsing interface 1601.
  • the circle selection box 1701 can be used to select the target controls to be projected to the smart watch.
  • the user can adjust the size and position of the circle selection box 1701 in the browsing interface 1601 to select the target control that the user wants to project to the smart watch this time.
  • a preset operation can be performed to trigger the mobile phone to start projecting.
  • the preset operation may be an operation such as double-clicking the first area or pressing the first area by the user.
  • the mobile phone may display a completion button near the circle selection box 1701. After the user selects the first area using the circle selection box 1701, the user can click the completion button to trigger the mobile phone to start screen projection.
  • the mobile phone can detect the specific coordinates of the first area in the circle box 1701 in the browsing interface 1601. Furthermore, the mobile phone can determine whether the first area matches the target control recorded in the configuration file 2 according to the aforementioned configuration file 2. For example, when the coordinates of the first area are the same as or close to the coordinates of a certain target control (for example, target control 1) in the configuration file 2, the mobile phone can determine that the first area matches the target control 1 in the configuration file 2.
  • the user can also click or circle the area that needs to be projected on the browsing interface 1601. Taking the user clicking on point A in the browsing interface 1601 as an example, the mobile phone can query the target control to which point A belongs in configuration file 2 according to the coordinates of point A as target control 1.
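Matching the tapped point against the control positions recorded in the configuration file amounts to a hit test, sketched below (the coordinates and the `src` layout follow the hypothetical configuration sketched earlier and are assumptions):

```python
def control_at(point, controls):
    """Return the id of the recorded control whose source region
    contains `point`, or None if the tap hits no target control."""
    x, y = point
    for ctl in controls:
        s = ctl["src"]
        if s["left"] <= x < s["left"] + s["width"] and \
           s["top"] <= y < s["top"] + s["height"]:
            return ctl["id"]
    return None

controls = [
    {"id": "title_bar",
     "src": {"left": 0, "top": 120,  "width": 1080, "height": 160}},
    {"id": "control_bar",
     "src": {"left": 0, "top": 1800, "width": 1080, "height": 360}},
]
print(control_at((540, 1900), controls))  # control_bar
```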
  • the mobile phone can determine, in response to the user's manual operation, the one or more target controls in configuration file 2 that the user needs to project this time. Furthermore, similar to the projection method in the above embodiments, as shown in FIG. 18, based on the view tree 1801 of the browsing interface 1601, the mobile phone can generate the view tree 1802 of the projection interface according to the target controls selected by the user in configuration file 2. For example, the view tree 1802 includes the controls in the control bar 1605 manually selected by the user this time.
  • the mobile phone can carry the view tree 1802 and the drawing instructions of the target controls in the view tree 1802 in a UI message and send it to the smart watch.
  • after the smart watch receives the UI message, as shown in FIG. 19, it can call the drawing instructions of the corresponding controls according to the layer order of the controls in the view tree 1802, and draw each control in the control bar 1605 on the display of the smart watch, thereby forming the projection interface 1901 after projection.
  • the mobile phone can also use the controls circled by the user as the target controls for this screen projection, and dynamically generate a configuration file corresponding to these target controls.
  • the mobile phone can record the specific position of each target control circled by the user in a dynamically generated configuration file, and the mobile phone can also set, in the configuration file, the specific display position of each target control on the destination device according to parameters such as the resolution and screen size of the destination device for this projection.
  • the mobile phone can generate the view tree of the subsequent screen projection interface based on the dynamically generated configuration file, and carry the view tree and related drawing instructions in the UI message and send it to the target device.
  • the mobile phone can also send the dynamically generated configuration file this time to the destination device.
  • the target device can call the drawing instruction at the corresponding display position to draw the target control manually circled by the user.
  • the user can manually select which content in the display interface of the source device is projected to the destination device for display, which makes the display interface more flexible when displayed across multiple devices and improves the user experience during screen projection.
  • the embodiment of the present application discloses an electronic device including a processor, and a memory, an input device, an output device, and a communication module connected to the processor.
  • the input device and the output device can be integrated into one device.
  • a touch sensor can be used as an input device
  • a display screen can be used as an output device
  • the touch sensor and display screen can be integrated into a touch screen.
  • the above electronic device may include: a touch screen 2001, which includes a touch sensor 2006 and a display screen 2007; one or more processors 2002; a memory 2003; a communication module 2008; one or more Application programs (not shown); and one or more computer programs 2004.
  • the above-mentioned devices may be connected through one or more communication buses 2005.
  • the one or more computer programs 2004 are stored in the aforementioned memory 2003 and are configured to be executed by the one or more processors 2002. The one or more computer programs 2004 include instructions, and the instructions can be used to execute the steps in the foregoing embodiments. All relevant content of the steps involved in the above method embodiments can be cited in the functional description of the corresponding physical device, and will not be repeated here.
  • the foregoing processor 2002 may specifically be the processor 110 shown in FIG. 2; the foregoing memory 2003 may specifically be the internal memory 121 and/or the external memory 120 shown in FIG. 2; the foregoing display screen 2007 may specifically be the display screen shown in FIG. 2; the touch sensor 2006 may be the touch sensor in the sensor module 200 shown in FIG. 2; and the communication module 2008 may be the mobile communication module 150 and/or the wireless communication module 160 shown in FIG. 2. The embodiment of this application does not impose any restriction on this.
  • the functional units in the various embodiments of the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units may be integrated into one unit.
  • the above-mentioned integrated unit can be implemented in the form of hardware or software functional unit.
  • the integrated unit is implemented in the form of a software functional unit and sold or used as an independent product, it can be stored in a computer readable storage medium.
  • a computer readable storage medium includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) or a processor to execute all or part of the steps of the methods described in the various embodiments of the present application.
  • the aforementioned storage media include: flash memory, removable hard disk, read-only memory, random access memory, magnetic disk, optical disk, or other media that can store program code.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Software Systems (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • Business, Economics & Management (AREA)
  • General Business, Economics & Management (AREA)
  • User Interface Of Digital Computer (AREA)
  • Controls And Circuits For Display Device (AREA)
  • Telephone Function (AREA)

Abstract

The embodiments of this application provide a screen projection display method and an electronic device, relating to the field of terminal technologies. A destination device can lay out the display content sent by a source device according to its own device characteristics, thereby improving the display effect and user experience of screen projection among multiple devices. The method includes: a first electronic device displays a first display interface; the first electronic device receives a screen projection instruction from a user to project the first display interface to a second electronic device; in response to the screen projection instruction, the first electronic device determines one or more first target controls in the first display interface; the first electronic device sends a first message to the second electronic device, where the first message includes drawing instructions of the first target controls, so that the second electronic device draws a first screen projection interface according to the drawing instructions of the first target controls, the first screen projection interface including the first target controls.

Description

Screen projection display method and electronic device
This application claims priority to Chinese Patent Application No. 201910487829.1, filed with the China National Intellectual Property Administration on June 5, 2019 and entitled "Screen Projection Display Method and Electronic Device", which is incorporated herein by reference in its entirety.
Technical Field
This application relates to the field of terminal technologies, and in particular, to a screen projection display method and an electronic device.
Background
With the development of smart home technology, a user or a household often owns multiple electronic devices that can communicate with each other. Different types of electronic devices generally have their own device characteristics; for example, a mobile phone is more portable, a television screen has a better display effect, and a speaker has better sound quality. To make full use of the characteristics of different electronic devices, an electronic device can switch and display multimedia data among multiple devices by means such as screen projection.
For example, a user can send the content displayed on a mobile phone (i.e., the source device) to another destination device that supports the screen projection function for display. However, because specifications such as the resolution and size of the destination device's display may differ from those of the source device's display, the destination device may fail to display the content sent by the source device, or may display it poorly.
Summary
This application provides a screen projection display method and an electronic device, so that a destination device can lay out the display content sent by a source device according to its own device characteristics, thereby improving the display effect and user experience of screen projection among multiple devices.
To achieve the above objective, this application adopts the following technical solutions:
第一方面,本申请提供一种投屏显示方法,包括:第一电子设备(即源设备)显示第一显示界面;第一电子设备接收用户将第一显示界面投射至第二电子设备(即目的设备1)的投屏指令;响应于该投屏指令,第一电子设备可确定第一显示界面中需要投屏的一个或多个第一目标控件;进而,第一电子设备可向第二电子设备发送第一消息,第一消息中包括第一目标控件的绘制指令,使得第二电子设备按照第一目标控件的绘制指令绘制包含第一目标控件的第一投屏界面。
也就是说,在进行投屏显示时,源设备可将其显示界面中的一个或多个控件投射至目的设备的投屏界面中,使得源设备中的显示界面可以被变换或重组后显示在目的设备中,从而更加灵活的适应目的设备的屏幕尺寸等设备特点,提高投屏场景下的显示效果和使用体验。
在一种可能的实现方式中,第一电子设备确定第一显示界面中的第一目标控件,具体包括:第一电子设备可根据第二电子设备的类型,获取与第一显示界面对应的配置文件,该配置文件中记录有第一显示界面中需要投屏的第一目标控件;进而,第一电子设备可根据该配置文件,确定第一显示界面中的第一目标控件。
示例性的,上述配置文件中可记录第一目标控件在第一显示界面中的标识;此时,第一电子设备根据该配置文件,确定第一显示界面中的第一目标控件,具体包括:第一电子设备根据第一目标控件在第一显示界面中的标识,确定本次需要投屏的第一目标控件。
或者,上述配置文件中可记录第一目标控件在第一显示界面中的显示位置;此时,第一电子设备根据该配置文件,确定第一显示界面中的第一目标控件,具体包括:第一电子设备 根据第一目标控件在第一显示界面中的显示位置,确定本次需要投屏的第一目标控件。
在一种可能的实现方式中,在第一电子设备确定第一显示界面中的一个或多个第一目标控件之后,还包括:第一电子设备获取绘制第一显示界面时的第一视图信息(例如第一显示界面的视图树),第一视图信息包括第一显示界面中各个控件之间的图层顺序;进而,第一电子设备可根据第一视图信息确定第二视图信息,第二视图信息包括第一目标控件在第一投屏界面中的图层顺序;
可以看出,源设备可将其显示界面中的各个控件进行拆分、删减或重组等变化,从而在目的设备中显示出新的投屏界面,以适应目的设备的显示尺寸等设备特点,从而提高投屏场景下目的设备的显示效果和用户体验。
另外,源设备向目的设备发送的第一消息中还可以包括上述第二视图信息,以使第二电子设备根据第二视图信息中第一目标控件的图层顺序,依次调用第一目标控件的绘制指令绘制第一投屏界面。
在一种可能的实现方式中,第一电子设备根据第一视图信息确定第二视图信息,包括:第一电子设备对第一视图信息中的第一目标控件进行拆分和重组,得到第二视图信息,此时,第一目标控件在第一视图信息和第二视图信息中的图层顺序相同,即第一目标控件在投屏前的第一显示界面中的图层关系与第一目标控件在投屏后的第一投屏界面中的图层关系相同。
在一种可能的实现方式中,与上述第一显示界面对应的配置文件中还记录有第一目标控件在第一投屏界面中的显示位置;此时,第一电子设备根据第一视图信息确定第二视图信息,包括:第一电子设备从第一视图信息中拆分出需要投屏的第一目标控件;进而,第一电子设备可按照配置文件中记录的第一目标控件在第一投屏界面中的显示位置,将第一目标控件重组后得到第二视图信息。此时,第一目标控件投屏后所在的位置和图层与投屏前所在的位置和图层可以相同或不同。
在一种可能的实现方式中,上述第一消息中还可以包括第一目标控件的绘制资源,例如图标、头像等;第一目标控件的绘制资源用于供第二电子设备执行第一目标控件的绘制指令时,使用该绘制资源绘制第一目标控件的用户界面。
在一种可能的实现方式中,在第一电子设备显示第一显示界面之后,还包括:第一电子设备接收用户将第一显示界面投射至第三电子设备(即目的设备2)的投屏指令;响应于该投屏指令,第一电子设备确定第一显示界面中的第二目标控件;其中,第一电子设备确定第二目标控件的方法与第一电子设备确定上述第一目标控件的方法类似。确定出第二目标控件后,与第一电子设备向第二电子设备发送第一消息类似的,第一电子设备可向第三电子设备发送携带第二目标控件的绘制指令的第二消息。又或者,确定出第二目标控件后,第一电子设备可将第二目标控件的绘制指令也携带在第一消息中,此时,第一电子设备可同时将该第一消息发送给第二电子设备和第三电子设备。这样,第一电子设备可将其显示界面中的不同控件同时投射至多个目的设备中进行显示。
第二方面,本申请提供一种投屏显示方法,包括:第二电子设备(即目的设备)可接收第一电子设备(即源设备)发送的第一消息,第一消息中包括N(N为大于0的整数)个控件的绘制指令;那么,第二电子设备可按照这N个控件的绘制指令绘制第一投屏界面,第一投屏界面中包括上述N个控件中的至少一个。
在一种可能的实现方式中,上述第一消息中还包括与上述N个控件对应的第一视图信息,第一视图信息中包括这N个控件之间的图层顺序;此时,第二电子设备按照上述N个控件的绘制指令绘制第一投屏界面,具体包括:第二电子设备根据第一视图信息中N个控件之间的 图层顺序可确定这N个控件的绘制顺序,进而,第二电子设备可按照绘制顺序分别执行对应N个控件的绘制指令,绘制出上述N个控件,形成第一投屏界面。
在一种可能的实现方式中,上述第一消息中还包括第一目标控件所属的第一应用界面的标识;那么,在第二电子设备接收第一电子设备发送的第一消息之后,还包括:第二电子设备获取与第一应用界面的标识对应的配置文件,该配置文件中记录有上述N个控件投屏后在第一投屏界面中的显示位置;此时,第二电子设备按照上述N个控件的绘制指令绘制第一投屏界面,包括:第二电子设备按照第一视图信息所指示的N个控件的绘制顺序,分别在配置文件记录的相应的显示位置执行每个控件的绘制指令,依次绘制出上述N个控件,形成第一投屏界面。
在一种可能的实现方式中,上述第一消息中还包括第一目标控件所属的第一应用界面的标识;那么,在第二电子设备接收第一电子设备发送的第一消息之后,还包括:第二电子设备获取与第一应用界面的标识对应的配置文件,该配置文件中记录有M(M为不大于N的整数)个目标控件在第一投屏界面中的显示位置,该M个目标控件为该N个控件的子集。也就是说,需要投屏在目的设备上显示的M个目标控件为第一消息中N个控件的一部分,那么,第二电子设备可根据上述配置文件在这N个控件中确定需要在第一投屏界面中显示的M个目标控件;此时,第二电子设备按照上述N个控件的绘制指令绘制第一投屏界面,包括:第二电子设备从上述N个控件的绘制指令中确定本次使用的M个目标控件的绘制指令;进而,第二电子设备可按照这M个目标控件的绘制指令绘制这M个目标控件,形成第一投屏界面。
示例性的,在第二电子设备根据上述配置文件在N个控件中确定需要在第一投屏界面中显示的M个目标控件之后,还包括:第二电子设备根据上述配置文件生成投屏后的第二视图信息,第二视图信息包括上述M个目标控件在第一投屏界面中的图层顺序;此时,第二电子设备按照上述M个目标控件的绘制指令绘制这M个目标控件,形成第一投屏界面,包括:第二电子设备可以根据第二视图信息所指示的M个目标控件的绘制顺序,分别调用相应目标控件的绘制指令,在配置文件所记录的相应显示位置绘制每个目标控件,最终形成第一投屏界面。
第三方面,本申请提供一种电子设备,该电子设备可以为上述源设备或目的设备。其中,该电子设备包括:触摸屏、通信模块、一个或多个处理器、一个或多个存储器、以及一个或多个计算机程序;其中,处理器与通信模块、触摸屏以及存储器均耦合,上述一个或多个计算机程序被存储在存储器中,当电子设备运行时,该处理器执行该存储器存储的一个或多个计算机程序,以使电子设备执行上述任一项所述的投屏显示方法。
第四方面,本申请提供一种计算机存储介质,包括计算机指令,当计算机指令在电子设备上运行时,使得电子设备执行如第一方面中任一项所述的投屏显示方法。
第五方面,本申请提供一种计算机程序产品,当计算机程序产品在电子设备上运行时,使得电子设备执行如第一方面中任一项所述的投屏显示方法。
第六方面,本申请提供一种投屏显示系统,该系统中可包括至少一个源设备和至少一个目的设备;源设备可用于执行如第一方面中任一项所述的投屏显示方法,目的设备可用于执行如第二方面中任一项所述的投屏显示方法。
可以理解地,上述提供的第三方面所述的电子设备、第四方面所述的计算机存储介质、第五方面所述的计算机程序产品以及第六方面所述的系统均用于执行上文所提供的对应的方法,因此,其所能达到的有益效果可参考上文所提供的对应的方法中的有益效果,此处不再赘述。
附图说明
图1为本申请实施例提供的一种通信系统的架构示意图;
图2为本申请实施例提供的一种电子设备的结构示意图一;
图3为本申请实施例提供的一种电子设备内操作系统的架构示意图;
图4为本申请实施例提供的一种投屏显示方法的应用场景示意图一;
图5为本申请实施例提供的一种投屏显示方法的应用场景示意图二;
图6为本申请实施例提供的一种投屏显示方法的应用场景示意图三;
图7为本申请实施例提供的一种投屏显示方法的应用场景示意图四;
图8为本申请实施例提供的一种投屏显示方法的应用场景示意图五;
图9为本申请实施例提供的一种投屏显示方法的应用场景示意图六;
图10为本申请实施例提供的一种投屏显示方法的应用场景示意图七;
图11为本申请实施例提供的一种投屏显示方法的应用场景示意图八;
图12为本申请实施例提供的一种投屏显示方法的应用场景示意图九;
图13为本申请实施例提供的一种投屏显示方法的应用场景示意图十;
图14为本申请实施例提供的一种投屏显示方法的应用场景示意图十一;
图15为本申请实施例提供的一种投屏显示方法的应用场景示意图十二;
图16为本申请实施例提供的一种投屏显示方法的应用场景示意图十三;
图17为本申请实施例提供的一种投屏显示方法的应用场景示意图十四;
图18为本申请实施例提供的一种投屏显示方法的应用场景示意图十五;
图19为本申请实施例提供的一种投屏显示方法的应用场景示意图十六;
图20为本申请实施例提供的一种电子设备的结构示意图二。
具体实施方式
下面将结合附图对本申请实施例的实施方式进行详细描述。
如图1所示,本申请实施例提供的一种投屏显示方法可应用于通信系统100,通信系统100中可以包括N(N>1)个电子设备。例如,通信系统100中可包括电子设备101和电子设备102。
示例性地,电子设备101可以通过一个或多个通信网络104与电子设备102连接。
该通信网络104可以是有线网络,也可以是无线网络。例如,上述通信网络104可以是局域网(local area networks,LAN),也可以是广域网(wide area networks,WAN),例如互联网。该通信网络104可使用任何已知的网络通信协议来实现,上述网络通信协议可以是各种有线或无线通信协议,诸如以太网、通用串行总线(universal serial bus,USB)、火线(FIREWIRE)、全球移动通讯系统(global system for mobile communications,GSM)、通用分组无线服务(general packet radio service,GPRS)、码分多址接入(code division multiple access,CDMA)、宽带码分多址(wideband code division multiple access,WCDMA),时分码分多址(time-division code division multiple access,TD-SCDMA),长期演进(long term evolution,LTE)、蓝牙、无线保真(wireless fidelity,Wi-Fi)、NFC、基于互联网协议的语音通话(voice over Internet protocol,VoIP)、支持网络切片架构的通信协议或任何其他合适的通信协议。示例性地,在一些实施例中,电子设备101可以通过Wi-Fi协议与电子设备102建立Wi-Fi连接。
示例性的,电子设备101可以作为源设备,电子设备102可以作为电子设备101的目的设备。电子设备101可将显示屏中的显示界面投射至电子设备102的显示屏中进行显示。
仍如图1所示,上述通信系统100中还可以包括电子设备103,例如,电子设备103可以为可穿戴设备。那么,电子设备101也可以作为电子设备103的源设备,将其显示屏中的显示界面也投射至电子设备103的显示屏中进行显示。
也就是说,当电子设备101为源设备时,电子设备101可将其显示界面同时投射至多个目的设备(例如上述电子设备102和电子设备103)中显示。当然,上述通信系统100中的任意电子设备均可作为源设备,本申请实施例对此不做任何限制。
在一些实施例中,以电子设备101为源设备举例,电子设备101可识别自身显示界面中的各个控件,以及各个控件之间的位置和图层等关系。进而,电子设备101可对这些控件进行拆分、剪裁或重组等变换,从而得到需要投射至电子设备102(或电子设备103)的一个或多个控件(后续可称为目标控件)。进而,电子设备101可将目标控件相关的绘制指令发送给电子设备102(或电子设备103),使得电子设备102(或电子设备103)可以在自身显示的投屏界面中绘制出上述目标控件。
在另一些实施例中,仍以电子设备101为源设备举例,电子设备101可识别自身显示界面中的各个控件。并且,电子设备101可将这些控件相关的绘制指令发送给电子设备102(或电子设备103)。进而,可由电子设备102(或电子设备103)对电子设备101中的各个控件进行拆分、剪裁或重组等变换,从而得到本次需要显示的一个或多个目标控件。后续,电子设备102(或电子设备103)可在自身的显示屏中绘制上述目标控件。
也就是说,在电子设备101(源设备)向电子设备102(目的设备)投射显示内容的过程中,目的设备最终为用户呈现的显示界面与源设备中的显示界面可以是不相同的。这样,在进行投屏显示时,源设备中的显示界面可以被裁剪、变换或重组后显示在目的设备中,从而更加灵活地适应目的设备的屏幕尺寸等设备特点,提高投屏场景下的显示效果和使用体验。
在一些实施例中,上述电子设备101、电子设备102以及电子设备103的具体结构可以是相同的,也可以是不同的。
例如,上述各个电子设备具体可以是手机、平板电脑、智能电视、可穿戴电子设备、车机、笔记本电脑、超级移动个人计算机(ultra-mobile personal computer,UMPC)、手持计算机、上网本、个人数字助理(personal digital assistant,PDA)、虚拟现实设备等,本申请实施例对此不做任何限制。
以电子设备101举例,图2示出了电子设备101的结构示意图。
电子设备101可以包括处理器110,外部存储器接口120,内部存储器121,通用串行总线(universal serial bus,USB)接口130,充电管理模块140,电源管理模块141,电池142,天线1,天线2,移动通信模块150,无线通信模块160,音频模块170,扬声器170A,受话器170B,麦克风170C,耳机接口170D,传感器模块180,摄像头193,显示屏194等。
可以理解的是,本发明实施例示意的结构并不构成对电子设备101的具体限定。在本申请另一些实施例中,电子设备101可以包括比图示更多或更少的部件,或者组合某些部件,或者拆分某些部件,或者不同的部件布置。图示的部件可以以硬件,软件或软件和硬件的组合实现。
处理器110可以包括一个或多个处理单元,例如:处理器110可以包括应用处理器(application processor,AP),调制解调处理器,图形处理器(graphics processing unit,GPU),图像信号处理器(image signal processor,ISP),控制器,视频编解码器,数字信号处理器(digital signal processor,DSP),基带处理器,和/或神经网络处理器(neural-network processing unit,NPU)等。其中,不同的处理单元可以是独立的器件,也可以集成在一个或多个处理器中。
处理器110中还可以设置存储器,用于存储指令和数据。在一些实施例中,处理器110 中的存储器为高速缓冲存储器。该存储器可以保存处理器110刚用过或循环使用的指令或数据。如果处理器110需要再次使用该指令或数据,可从所述存储器中直接调用。避免了重复存取,减少了处理器110的等待时间,因而提高了系统的效率。
在一些实施例中,处理器110可以包括一个或多个接口。接口可以包括集成电路(inter-integrated circuit,I2C)接口,集成电路内置音频(inter-integrated circuit sound,I2S)接口,脉冲编码调制(pulse code modulation,PCM)接口,通用异步收发传输器(universal asynchronous receiver/transmitter,UART)接口,移动产业处理器接口(mobile industry processor interface,MIPI),通用输入输出(general-purpose input/output,GPIO)接口,用户标识模块(subscriber identity module,SIM)接口,和/或通用串行总线(universal serial bus,USB)接口等。
充电管理模块140用于从充电器接收充电输入。其中,充电器可以是无线充电器,也可以是有线充电器。在一些有线充电的实施例中,充电管理模块140可以通过USB接口130接收有线充电器的充电输入。在一些无线充电的实施例中,充电管理模块140可以通过电子设备101的无线充电线圈接收无线充电输入。充电管理模块140为电池142充电的同时,还可以通过电源管理模块141为电子设备供电。
电源管理模块141用于连接电池142,充电管理模块140与处理器110。电源管理模块141接收电池142和/或充电管理模块140的输入,为处理器110,内部存储器121,显示屏194,摄像头193,和无线通信模块160等供电。电源管理模块141还可以用于监测电池容量,电池循环次数,电池健康状态(漏电,阻抗)等参数。在其他一些实施例中,电源管理模块141也可以设置于处理器110中。在另一些实施例中,电源管理模块141和充电管理模块140也可以设置于同一个器件中。
电子设备101的无线通信功能可以通过天线1,天线2,移动通信模块150,无线通信模块160,调制解调处理器以及基带处理器等实现。
天线1和天线2用于发射和接收电磁波信号。电子设备101中的每个天线可用于覆盖单个或多个通信频带。不同的天线还可以复用,以提高天线的利用率。例如:可以将天线1复用为无线局域网的分集天线。在另外一些实施例中,天线可以和调谐开关结合使用。
移动通信模块150可以提供应用在电子设备101上的包括2G/3G/4G/5G等无线通信的解决方案。移动通信模块150可以包括一个或多个滤波器,开关,功率放大器,低噪声放大器(low noise amplifier,LNA)等。移动通信模块150可以由天线1接收电磁波,并对接收的电磁波进行滤波,放大等处理,传送至调制解调处理器进行解调。移动通信模块150还可以对经调制解调处理器调制后的信号放大,经天线1转为电磁波辐射出去。在一些实施例中,移动通信模块150的至少部分功能模块可以被设置于处理器110中。在一些实施例中,移动通信模块150的至少部分功能模块可以与处理器110的至少部分模块被设置在同一个器件中。
调制解调处理器可以包括调制器和解调器。其中,调制器用于将待发送的低频基带信号调制成中高频信号。解调器用于将接收的电磁波信号解调为低频基带信号。随后解调器将解调得到的低频基带信号传送至基带处理器处理。低频基带信号经基带处理器处理后,被传递给应用处理器。应用处理器通过音频设备(不限于扬声器170A,受话器170B等)输出声音信号,或通过显示屏194显示图像或视频。在一些实施例中,调制解调处理器可以是独立的器件。在另一些实施例中,调制解调处理器可以独立于处理器110,与移动通信模块150或其他功能模块设置在同一个器件中。
无线通信模块160可以提供应用在电子设备101上的包括无线局域网(wireless local area networks,WLAN)(如无线保真(wireless fidelity,Wi-Fi)网络),蓝牙(Bluetooth,BT),全球导 航卫星系统(global navigation satellite system,GNSS),调频(frequency modulation,FM),近距离无线通信技术(near field communication,NFC),红外技术(infrared,IR)等无线通信的解决方案。无线通信模块160可以是集成一个或多个通信处理模块的一个或多个器件。无线通信模块160经由天线2接收电磁波,将电磁波信号调频以及滤波处理,将处理后的信号发送到处理器110。无线通信模块160还可以从处理器110接收待发送的信号,对其进行调频,放大,经天线2转为电磁波辐射出去。
在一些实施例中,电子设备101的天线1和移动通信模块150耦合,天线2和无线通信模块160耦合,使得电子设备101可以通过无线通信技术与网络以及其他设备通信。所述无线通信技术可以包括全球移动通讯系统(global system for mobile communications,GSM),通用分组无线服务(general packet radio service,GPRS),码分多址接入(code division multiple access,CDMA),宽带码分多址(wideband code division multiple access,WCDMA),时分码分多址(time-division code division multiple access,TD-SCDMA),长期演进(long term evolution,LTE),BT,GNSS,WLAN,NFC,FM,和/或IR技术等。所述GNSS可以包括全球卫星定位系统(global positioning system,GPS),全球导航卫星系统(global navigation satellite system,GLONASS),北斗卫星导航系统(beidou navigation satellite system,BDS),准天顶卫星系统(quasi-zenith satellite system,QZSS)和/或星基增强系统(satellite based augmentation systems,SBAS)。
电子设备101通过GPU,显示屏194,以及应用处理器等实现显示功能。GPU为图像处理的微处理器,连接显示屏194和应用处理器。GPU用于执行数学和几何计算,用于图形渲染。处理器110可包括一个或多个GPU,其执行程序指令以生成或改变显示信息。
显示屏194用于显示图像,视频等。显示屏194包括显示面板。显示面板可以采用液晶显示屏(liquid crystal display,LCD),有机发光二极管(organic light-emitting diode,OLED),有源矩阵有机发光二极管(active-matrix organic light-emitting diode,AMOLED),柔性发光二极管(flex light-emitting diode,FLED),Mini-LED,Micro-LED,Micro-OLED,量子点发光二极管(quantum dot light emitting diodes,QLED)等。在一些实施例中,电子设备101可以包括1个或N个显示屏194,N为大于1的正整数。
电子设备101可以通过ISP,摄像头193,视频编解码器,GPU,显示屏194以及应用处理器等实现拍摄功能。
ISP用于处理摄像头193反馈的数据。例如,拍照时,打开快门,光线通过镜头被传递到摄像头感光元件上,光信号转换为电信号,摄像头感光元件将所述电信号传递给ISP处理,转化为肉眼可见的图像。ISP还可以对图像的噪点,亮度,肤色进行算法优化。ISP还可以对拍摄场景的曝光,色温等参数优化。在一些实施例中,ISP可以设置在摄像头193中。
摄像头193用于捕获静态图像或视频。物体通过镜头生成光学图像投射到感光元件。感光元件可以是电荷耦合器件(charge coupled device,CCD)或互补金属氧化物半导体(complementary metal-oxide-semiconductor,CMOS)光电晶体管。感光元件把光信号转换成电信号,之后将电信号传递给ISP转换成数字图像信号。ISP将数字图像信号输出到DSP加工处理。DSP将数字图像信号转换成标准的RGB,YUV等格式的图像信号。在一些实施例中,电子设备101可以包括1个或N个摄像头193,N为大于1的正整数。
数字信号处理器用于处理数字信号,除了可以处理数字图像信号,还可以处理其他数字信号。例如,当电子设备101在频点选择时,数字信号处理器用于对频点能量进行傅里叶变换等。
视频编解码器用于对数字视频压缩或解压缩。电子设备101可以支持一种或多种视频编解码器。这样,电子设备101可以播放或录制多种编码格式的视频,例如:动态图像专家组(moving picture experts group,MPEG)1,MPEG2,MPEG3,MPEG4等。
外部存储器接口120可以用于连接外部存储卡,例如Micro SD卡,实现扩展电子设备101的存储能力。外部存储卡通过外部存储器接口120与处理器110通信,实现数据存储功能。例如将音乐,视频等文件保存在外部存储卡中。
内部存储器121可以用于存储一个或多个计算机程序,该一个或多个计算机程序包括指令。处理器110可以通过运行存储在内部存储器121的上述指令,从而使得电子设备101执行本申请一些实施例中所提供的投屏显示方法,以及各种功能应用和数据处理等。内部存储器121可以包括存储程序区和存储数据区。其中,存储程序区可存储操作系统;该存储程序区还可以存储一个或多个应用程序(比如图库、联系人等)等。存储数据区可存储电子设备101使用过程中所创建的数据(比如照片,联系人等)等。此外,内部存储器121可以包括高速随机存取存储器,还可以包括非易失性存储器,例如一个或多个磁盘存储器件,闪存器件,通用闪存存储器(universal flash storage,UFS)等。在另一些实施例中,处理器110通过运行存储在内部存储器121的指令,和/或存储在设置于处理器中的存储器的指令,来使得电子设备101执行本申请实施例中提供的投屏显示方法,以及各种功能应用和数据处理。
电子设备101可以通过音频模块170,扬声器170A,受话器170B,麦克风170C,耳机接口170D,以及应用处理器等实现音频功能。例如音乐播放,录音等。
音频模块170用于将数字音频信息转换成模拟音频信号输出,也用于将模拟音频输入转换为数字音频信号。音频模块170还可以用于对音频信号编码和解码。在一些实施例中,音频模块170可以设置于处理器110中,或将音频模块170的部分功能模块设置于处理器110中。
扬声器170A,也称“喇叭”,用于将音频电信号转换为声音信号。电子设备101可以通过扬声器170A收听音乐,或收听免提通话。
受话器170B,也称“听筒”,用于将音频电信号转换成声音信号。当电子设备101接听电话或语音信息时,可以通过将受话器170B靠近人耳接听语音。
麦克风170C,也称“话筒”,“传声器”,用于将声音信号转换为电信号。当拨打电话或发送语音信息时,用户可以通过人嘴靠近麦克风170C发声,将声音信号输入到麦克风170C。电子设备101可以设置一个或多个麦克风170C。在另一些实施例中,电子设备101可以设置两个麦克风170C,除了采集声音信号,还可以实现降噪功能。在另一些实施例中,电子设备101还可以设置三个,四个或更多麦克风170C,实现采集声音信号,降噪,还可以识别声音来源,实现定向录音功能等。
耳机接口170D用于连接有线耳机。耳机接口170D可以是USB接口130,也可以是3.5mm的开放移动电子设备平台(open mobile terminal platform,OMTP)标准接口,美国蜂窝电信工业协会(cellular telecommunications industry association of the USA,CTIA)标准接口。
传感器模块180可以包括压力传感器,陀螺仪传感器,气压传感器,磁传感器,加速度传感器,距离传感器,接近光传感器,指纹传感器,温度传感器,触摸传感器,环境光传感器,骨传导传感器等。
另外,上述电子设备中还可以包括按键、马达、指示器以及SIM卡接口等一种或多种部件,本申请实施例对此不做任何限制。
上述电子设备101的软件系统可以采用分层架构,事件驱动架构,微核架构,微服务架构,或云架构。本申请实施例以分层架构的Android系统为例,示例性说明电子设备101的软件结构。
图3是本申请实施例的电子设备101的软件结构框图。
分层架构将软件分成若干个层,每一层都有清晰的角色和分工。层与层之间通过软件接口通信。在一些实施例中,将Android系统分为四层,从上至下分别为应用程序层,应用程序框架层,安卓运行时(Android runtime)和系统库,以及内核层。
1、应用程序层
应用程序层可以包括一系列应用程序。
如图3所示,上述应用程序可以包括通话,联系人,相机,图库,日历,地图,导航,蓝牙,音乐,视频,短信息等APP(应用,application)。
2、应用程序框架层
应用程序框架层为应用程序层的应用程序提供应用编程接口(application programming interface,API)和编程框架。应用程序框架层包括一些预先定义的函数。
如图3所示,应用程序框架层中可以包括视图系统(view system),通知管理器,活动管理器,窗口管理器,内容提供器,资源管理器,输入法管理器等。
其中,视图系统可用于构建应用程序的显示界面。每个显示界面可以由一个或多个控件组成。一般而言,控件可以包括图标、按钮、菜单、选项卡、文本框、对话框、状态栏、导航栏、微件(Widget)等界面元素。
视图系统在绘制显示界面时可获取对应显示界面的视图信息,该视图信息中记录了需要绘制的显示界面中各个控件之间的图层关系。示例性的,显示界面中的各个控件一般按照树状结构分层组织,形成一个完整的ViewTree(视图树),该视图树可称为上述显示界面的视图信息。视图系统可根据ViewTree中设置好的各个控件之间的图层关系绘制显示界面。显示界面中的每一个控件都对应一组绘制指令,例如DrawLine、DrawPoint、DrawBitmap等。视图系统可按照ViewTree中各个控件之间的图层关系,依次调用相应控件的绘制指令绘制该控件。
例如,图4中的(a)示出了微信APP的聊天界面401,聊天界面401中最底层的控件为根节点(root),根节点下设置有底图402这一控件,底图402中还包括以下控件:标题栏403、聊天背景404以及输入栏405。其中,标题栏403中进一步包括返回按钮406和标题407,聊天背景404中进一步包括头像408和气泡409,输入栏405中进一步包括语音输入按钮图标410、输入框411以及发送按钮412。
这些控件按照顺序分层可形成如图4中(b)所示的视图树A。其中,底图402位于根节点下,标题栏403、聊天背景404以及输入栏405均为底图402的子节点。返回按钮406和标题407均为标题栏403的子节点。头像408和气泡409均为聊天背景404的子节点。语音输入按钮图标410、输入框411以及发送按钮412均为输入栏405的子节点。视图系统在绘制聊天界面401时可按照视图树A中各个控件之间的图层关系,从根节点开始逐层调用对应的绘制指令绘制每个控件,最终形成聊天界面401。
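上述“按照视图树的图层关系逐层调用绘制指令”的过程,可以用下面一段简化的Python代码示意(其中的 Control 类、draw_view_tree 函数以及“绘制”动作均为示意性假设,并非Android视图系统的真实实现):

```python
# 简化示意:按视图树的层级顺序(先父节点、后子节点)依次"绘制"每个控件。
# Control、draw_view_tree 等名称均为示意,并非真实的 Android API。

class Control:
    def __init__(self, name, children=None):
        self.name = name
        self.children = children or []

def draw_view_tree(root, canvas):
    """先绘制当前控件,再递归绘制其子节点,模拟视图系统的绘制顺序。"""
    canvas.append(f"draw({root.name})")   # 对应该控件的一组绘制指令
    for child in root.children:
        draw_view_tree(child, canvas)
    return canvas

# 以图4中聊天界面401的部分控件为例构造视图树
tree = Control("底图402", [
    Control("标题栏403", [Control("返回按钮406"), Control("标题407")]),
    Control("输入栏405", [Control("输入框411"), Control("发送按钮412")]),
])
result = draw_view_tree(tree, [])
print(result)
```

输出的列表即控件的绘制顺序:先父节点、后子节点,与视图树中的层级关系一致。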
在本申请实施例中,可以在源设备的视图系统增加一个投屏管理模块。当用户打开源设备的投屏功能后,投屏管理模块可记录视图系统绘制每一显示界面中每个控件的绘制指令,以及该绘制指令所需的绘图资源(例如头像、图标等)。并且,投屏管理模块可确定出当前显示界面中需要投射至目的设备显示的一个或多个目标控件。进而,投屏管理模块可基于当前显示界面的视图树1,生成目标控件在投屏后的投屏界面的视图树2。视图树2中控件的数量 与视图树1中控件的数量可以不同,视图树2中控件之间的位置关系也可与视图树1中控件的位置关系不同。进而,投屏管理模块可指示源设备将视图树2以及视图树2中各个控件的绘制指令和绘图资源发送给目的设备,使得目的设备可按照视图树2中控件之间的图层关系,逐层调用相应控件的绘制指令绘制投屏后的投屏界面。这样,源设备便可将其显示界面中的目标控件投射至目的设备显示的投屏界面中显示。
在另一些实施例中,也可以在目的设备的视图系统增加一个投屏管理模块。例如,当上述电子设备101为目的设备时,可以接收源设备发来的UI消息,UI消息中可以包括源设备中显示界面的视图树1以及视图树1中各个控件的绘制指令和绘图资源。进而,投屏管理模块可基于视图树1,生成本次在目的设备中需要显示的投屏界面的视图树2。这样,目的设备可按照视图树2中控件之间的图层关系,逐层调用相应控件的绘制指令和绘图资源绘制投屏后的投屏界面。
也就是说,在投屏过程中,上述投屏管理模块可将源设备显示界面中的各个控件进行拆分、删减和重组,从而在目的设备中显示出新的投屏界面,以适应目的设备的显示尺寸等设备特点,从而提高投屏场景下目的设备的显示效果和用户体验。
需要说明的是,上述投屏管理模块也可以独立于视图系统单独设置在应用程序框架层,本申请实施例对此不做任何限制。
另外,上述活动管理器可用于管理每个应用的生命周期。应用通常以activity的形式运行在操作系统中。活动管理器可以调度应用的activity进程管理每个应用的生命周期。窗口管理器用于管理窗口程序。窗口管理器可以获取显示屏大小,判断是否有状态栏,锁定屏幕,截取屏幕等。内容提供器用来存放和获取数据,并使这些数据可以被应用程序访问。所述数据可以包括视频,图像,音频,拨打和接听的电话,浏览历史和书签,电话簿等。资源管理器为应用程序提供各种资源,比如本地化字符串,图标,图片,布局文件,视频文件等。
3、Android runtime和系统库
Android Runtime包括核心库和虚拟机。Android runtime负责安卓系统的调度和管理。
核心库包含两部分:一部分是java语言需要调用的功能函数,另一部分是安卓的核心库。
应用程序层和应用程序框架层运行在虚拟机中。虚拟机将应用程序层和应用程序框架层的java文件执行为二进制文件。虚拟机用于执行对象生命周期的管理,堆栈管理,线程管理,安全和异常的管理,以及垃圾回收等功能。
系统库可以包括多个功能模块。例如:表面管理器(surface manager),媒体库(Media Libraries),三维图形处理库(例如:OpenGL ES),2D图形引擎(例如:SGL)等。
其中,表面管理器用于对显示子系统进行管理,并且为多个应用程序提供了2D和3D图层的融合。媒体库支持多种常用的音频,视频格式回放和录制,以及静态图像文件等。媒体库可以支持多种音视频编码格式,例如:MPEG4,H.264,MP3,AAC,AMR,JPG,PNG等。三维图形处理库用于实现三维图形绘图,图像渲染,合成,和图层处理等。2D图形引擎是2D绘图的绘图引擎。
4、内核层
内核层是硬件和软件之间的层。内核层至少包含显示驱动,摄像头驱动,音频驱动,传感器驱动等,本申请实施例对此不做任何限制。
以下将以手机作为投屏时的源设备举例,结合附图详细阐述本申请实施例提供的一种投屏显示方法。
示例性的,可预先在手机中设置一个投屏按钮,该投屏按钮可用于将手机中的显示界面 投射至其他电子设备中显示。例如,上述投屏按钮可以设置在下拉菜单、上拉菜单或负一屏菜单等位置,本申请实施例对此不做任何限制。
以投屏音乐APP的显示界面举例,如图5中的(a)所示,用户在手机使用音乐APP播放歌曲时,手机可显示音乐APP的音乐播放界面501。如果此时用户希望将手机中的音乐播放界面501投屏至其他电子设备中继续显示,则用户可在音乐播放界面501中执行预设操作。例如,该预设操作可以为下拉操作。进而,响应用户在音乐播放界面501中执行的下拉操作,如图5中的(b)所示,手机可显示包含投屏按钮502的下拉菜单503。
如图6中的(a)所示,如果检测到用户选中该投屏按钮502,说明用户希望将当前手机中显示的音乐播放界面501投射至其他的一个或多个电子设备中继续显示。那么,如图6中的(b)所示,手机可显示提示框601,该提示框601中可包含一个或多个候选设备。用户可从这些候选设备中选择一个或多个作为目的设备。后续,手机可将上述音乐播放界面501投射至用户选中的目的设备中进行投屏显示。
例如,检测到用户选中上述投屏按钮502后,手机可查询位于同一网络中的其他电子设备。例如,如果手机接入了名称为“1234”的Wi-Fi网络,则手机可查询该Wi-Fi网络内还包括哪些电子设备。如果查询到该Wi-Fi网络中还包括智能电视、笔记本、平板电脑以及智能手表,则手机可将这些电子设备作为候选设备,并将这些电子设备的图标显示在上述提示框601中。
又例如,检测到用户选中上述投屏按钮502后,手机还可以请求服务器查询与手机登录同一华为账号的其他电子设备。进而,服务器可将查询到的登录同一华为账号的一个或多个电子设备的标识发送给手机,这样,手机可将这些电子设备作为候选设备,并将这些电子设备的图标显示在上述提示框601中。
仍以上述提示框601中的多个候选设备举例,如果检测到用户选中提示框601中的智能手表,说明用户希望将当前手机中的显示界面(即音乐播放界面501)投射至智能手表中显示。在本申请实施例中,手机可以先对音乐播放界面501中的显示内容进行拆分、剪裁或重组等变换,得到与音乐播放界面501对应的投屏界面。进而,手机可将该投屏界面投射至智能手表中显示,使得智能手表中显示的投屏界面既能够体现出音乐播放界面501中的内容,又可以适应智能手表自身显示屏的屏幕尺寸。
仍以将手机中的音乐播放界面501投射至智能手表举例,可以预先在手机中为智能手表这一类目的设备设置多个配置文件。每个配置文件可与特定的应用或特定的显示界面对应。每个配置文件中记录了源设备中需要投射至目的设备上的一个或多个目标控件。
示例性的,配置文件中可以记录音乐播放界面501中需要投射至目的设备上的一个或多个控件的标识,这样,手机根据配置文件中记录的控件的标识可确定出音乐播放界面501中需要投屏的目标控件。
又例如,由于不同版本的音乐APP在运行音乐播放界面501时,音乐播放界面501内各个控件的标识可能会被更新,因此通过配置文件中控件的标识确定目标控件时可能会不准确。对此,也可在配置文件中记录需要投屏的一个或多个控件在音乐播放界面501中的具体显示位置。例如,每个控件的位置均可通过left,top,width,height这4个参数的取值唯一确定。其中,left为控件左上角顶点在x轴的大小,top为控件左上角顶点在y轴的大小,width为控件的宽度,height为控件的高度。这样,手机根据配置文件中记录的控件在音乐播放界面501中的显示位置可以唯一确定出音乐播放界面501内需要投屏的目标控件。
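上述“通过 left、top、width、height 四个参数唯一确定控件”的匹配过程,可用如下Python片段示意(控件列表中的控件ID与坐标数值均为示意性假设):

```python
# 简化示意:按 (left, top, width, height) 在界面控件列表中唯一确定目标控件。
controls = [
    {"id": "title_bar", "left": 0, "top": 80,   "width": 1080, "height": 120},
    {"id": "ctrl_bar",  "left": 0, "top": 1700, "width": 1080, "height": 220},
]

def find_control_by_rect(controls, left, top, width, height):
    """返回与给定位置完全匹配的控件ID;无匹配时返回 None。"""
    for c in controls:
        if (c["left"], c["top"], c["width"], c["height"]) == (left, top, width, height):
            return c["id"]
    return None

print(find_control_by_rect(controls, 0, 1700, 1080, 220))  # ctrl_bar
```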
另外,配置文件中除了记录源设备中需要投射至目的设备上的一个或多个目标控件外, 还可以记录目标控件投射至目的设备后在投屏界面中的显示位置。
例如,以图7中所示手机的音乐播放界面501举例,音乐播放界面501中包括以下控件:底图701、状态栏702、标题栏703、专辑封面704、歌词705以及控制栏706。其中,状态栏702中包括时间、信号强度以及电池容量等控件。标题栏703中包括歌曲名称7031和演唱者7032等控件。控制栏706中包括进度条7061、暂停按钮7062、上一首按钮7063以及下一首按钮7064等控件。
手机可预先为音乐播放界面501设置投射至智能手表时对应的配置文件1。由于智能手表的显示屏尺寸较小,用户将手机中的音乐播放界面501投射至智能手表的主要目的是为了方便控制音乐APP。因此,可预先在配置文件1中设置音乐播放界面501中需要被投射的目标控件为标题栏703和控制栏706。并且,可在配置文件1中设置标题栏703和控制栏706中各个控件投屏前在音乐播放界面501中的显示位置,以及标题栏703和控制栏706中各个控件投屏后的显示位置。
示例性的,与音乐播放界面501对应的配置文件1可以为:
(配置文件1的具体内容在专利原文中以附图形式给出,此处从略。)
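作为参考,若该配置文件采用后文提到的JSON格式,其结构可能形如下例(其中的包名、控件ID、坐标数值等均为基于上下文的假设,并非专利附图中的真实内容):

```json
{
  "packagename": "com.example.music",
  "activityname": "MusicPlayActivity",
  "targets": [
    {
      "id": "title_bar_703",
      "src":  { "left": 0, "top": 100,  "width": 1080, "height": 150 },
      "dest": { "left": 20, "top": 30,  "width": 360,  "height": 60 }
    },
    {
      "id": "control_bar_706",
      "src":  { "left": 0, "top": 1650, "width": 1080, "height": 250 },
      "dest": { "left": 20, "top": 280, "width": 360,  "height": 100 }
    }
  ]
}
```

其中,“src”字段对应控件投屏前在源设备界面中的位置,“dest”字段对应控件投屏后在目的设备投屏界面中的位置。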
需要说明的是,上述配置文件可以采用JSON(JavaScript Object Notation)格式、XML(Extensible Markup Language)格式或文本格式等格式存储在手机中或服务器中,本申请实施例对此不做任何限制。
仍以上述配置文件1举例,如果检测到用户选中提示框601中的智能手表,则手机可获取预先为智能手表设置的一个或多个配置文件。并且,手机可获取当前在前台运行的音乐应用的包名(packagename),以及当前显示界面的activityname。进而,手机可在获取到的配置文件中,根据该packagename和activityname,查询与音乐播放界面501对应的配置文件1。那么,根据配置文件1中目标控件1的ID或“src”字段记录的位置信息,手机可识别出音乐播放界面501中需要投射至智能手表的一个或多个目标控件。
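上述“根据前台应用的 packagename 和当前界面的 activityname 查询对应配置文件”的过程,可用如下Python片段示意(其中的包名、界面名与文件名均为示意性假设):

```python
# 简化示意:以 (packagename, activityname) 为键查找对应的投屏配置文件。
config_files = {
    ("com.example.music", "MusicPlayActivity"): "profile_1.json",
    ("com.example.video", "VideoPlayActivity"): "profile_2.json",
}

def lookup_profile(packagename, activityname):
    """返回与当前前台界面对应的配置文件名;未配置时返回 None。"""
    return config_files.get((packagename, activityname))

print(lookup_profile("com.example.music", "MusicPlayActivity"))  # profile_1.json
```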
又或者,手机或服务器中还可以预先为不同型号或不同显示规格(例如屏幕形状、分辨率等)的智能手表设置对应的配置文件。那么,检测到用户选中提示框601中的智能手表后,手机还可以进一步获取本次投屏的智能手表的型号或显示规格,并查找与该型号或显示规格对应的配置文件。当然,也可以在同一配置文件(例如上述配置文件1)中记录不同型号或不同显示规格的智能手表需要投射的目标控件。那么,手机获取到配置文件1后,可根据本次投屏的智能手表的型号或显示规格,在配置文件1中查找到相应的“src”字段或目标控件的ID,从而识别出本次需要投射至智能手表中显示的一个或多个目标控件。
当然,如果手机没有获取到与智能手表对应的配置文件,或者手机没有获取到与上述音乐应用的包名或activityname对应的配置文件,则手机可按照现有的投屏方案将上述音乐播放界面501投射至智能手表中显示,本申请实施例对此不做任何限制。
另外,检测到用户选中提示框601中的智能手表后,手机还可以获取视图系统在绘制上述音乐播放界面501时对应的视图信息。以视图树为视图信息举例,如图8所示,视图树801记录了上述音乐播放界面501中各个控件之间的图层关系。在视图树801中,音乐播放界面501的根节点下包括底图701这一子节点,状态栏702、标题栏703、专辑封面704、歌词705以及控制栏706均为底图701的子节点。歌曲名称7031和演唱者7032为标题栏703的子节点。进度条7061、暂停按钮7062、上一首按钮7063以及下一首按钮7064为控制栏706的子节点。
示例性的,手机通过上述配置文件1识别出音乐播放界面501中的目标控件包括:标题栏703中的歌曲名称7031和演唱者7032,控制栏706中的暂停按钮7062、上一首按钮7063以及下一首按钮7064,以及专辑封面704。进而,由于配置文件1中记录了目标控件投屏后在投屏界面中的显示位置,因此,手机可根据上述配置文件1对音乐播放界面501的视图树801进行拆分、裁剪和重组等操作,生成投屏后在智能手表上显示的投屏界面的视图树901。
如图9所示,为投屏界面的视图树901的示意图。在视图树901中,手机删除了视图树801中不是目标控件的节点,例如上述底图701、状态栏702、状态栏702中的各个控件以及控制栏706中的进度条7061。并且,如果配置文件1中记录了投屏后标题栏703和控制栏706中的目标控件位于专辑封面704的图层之上,则在视图树901中,手机可将标题栏703和控制栏706作为专辑封面704的子节点。并且,标题栏703的子节点包括歌曲名称7031和演唱者7032,控制栏706的子节点包括暂停按钮7062、上一首按钮7063以及下一首按钮7064。
在另一些实施例中,手机通过上述配置文件1识别出音乐播放界面501中的目标控件后,也可以沿用目标控件在音乐播放界面501中的图层关系生成投屏后与投屏界面对应的视图树。例如,手机可将视图树801中的目标控件拆分出来,并按照目标控件在视图树801中的图层关系生成新的视图树,即与投屏界面对应的视图树。
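上述“从视图树中拆分出目标控件、并沿用其原有图层关系”的操作,可用如下Python片段示意(树的数据结构与控件名均为示意性假设):

```python
# 简化示意:从视图树中剪除非目标控件,保留目标控件及其原有父子(图层)关系。
def prune_tree(node, targets):
    """返回仅包含 targets 中控件(及其祖先)的子树;整棵子树都不含目标控件时返回 None。"""
    kept_children = [c for c in (prune_tree(ch, targets) for ch in node["children"]) if c]
    if node["name"] in targets or kept_children:
        return {"name": node["name"], "children": kept_children}
    return None

tree = {"name": "root", "children": [
    {"name": "底图", "children": [
        {"name": "状态栏", "children": []},
        {"name": "控制栏", "children": [{"name": "暂停按钮", "children": []}]},
    ]},
]}
pruned = prune_tree(tree, {"控制栏", "暂停按钮"})
print(pruned)
```

剪裁后的树中,“状态栏”被删除,而“控制栏”与“暂停按钮”保持原有的父子层级。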
进而,手机(即源设备)可通过上述通信网络104向智能手表(即目的设备)发送UI消息,该UI消息中包括投屏界面对应的视图树(例如,上述视图树901以及视图树901中每个控件相关的绘制指令和绘图资源)。例如,手机与智能手表可基于TCP/IP协议建立socket连接。进而,手机可使用该socket连接将与上述音乐播放界面501对应的UI消息发送给智能手表。由于UI消息中仅包括目标控件的相关信息,因此,相比于手机将整个音乐播放界面501发送给智能手表进行投屏,本申请实施例提供的投屏方法可降低源设备与目的设备交互时的传输带宽,提高投屏时的传输速度。
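源设备通过socket连接向目的设备发送UI消息的过程,可用如下Python片段示意:这里用一对本机socket模拟源设备与目的设备,消息采用“长度前缀+JSON消息体”的方式传输(消息字段与编码方式均为示意性假设,并非专利限定的真实格式):

```python
import json
import socket

# 简化示意:用一对本地 socket 模拟源设备与目的设备之间传输 UI 消息。
src, dst = socket.socketpair()

ui_message = {
    "interface": "music_play_501",   # 所属界面的标识(示意)
    "view_tree": {"name": "专辑封面", "children": [{"name": "标题栏", "children": []}]},
    "draw_ops": {"标题栏": ["DrawText"], "专辑封面": ["DrawBitmap"]},
}
payload = json.dumps(ui_message).encode("utf-8")
src.sendall(len(payload).to_bytes(4, "big") + payload)   # 先发4字节长度,再发消息体

length = int.from_bytes(dst.recv(4), "big")
received = json.loads(dst.recv(length).decode("utf-8"))
print(received["interface"])  # music_play_501
src.close(); dst.close()
```

实际场景中,源设备与目的设备可如正文所述基于TCP/IP协议建立socket连接后,按同样的思路发送序列化后的视图树与绘制指令。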
后续,当手机更新音乐播放界面501时,手机可按照上述方法生成与新的显示界面对应的UI消息,并将新的UI消息发送给智能手表。
在一些实施例中,智能手表接收到与上述音乐播放界面501对应的UI消息后,可按照视图树901中各个目标控件之间图层关系,依次调用视图树901中每一个目标控件的绘制指令绘制目标控件。最终,如图10所示,智能手表可绘制出上述音乐播放界面501投屏后的投屏界面1001。投屏界面1001中的各个控件与视图树901中各个控件一一对应。
示例性的,智能手表中也可存储不同显示界面的配置文件,或者,智能手表中也可以从服务器中获取不同显示界面的配置文件。手机发来的UI消息中还可以携带上述音乐播放界面501的标识。进而,智能手表可根据音乐播放界面501的标识查找到对应的配置文件(例如上述配置文件1)。那么,目的设备在绘制投屏界面1001时,可根据视图树901中目标控件之间的图层关系确定目标控件的绘制顺序。例如,目的设备根据视图树901可确定先绘制专辑封面704再绘制专辑封面704的子节点(例如标题栏703)。在绘制专辑封面704时,智能手表还可以根据配置文件1中专辑封面704的“dest”字段确定专辑封面704的具体绘制位置,进而,智能手表可调用与专辑封面704对应的绘制指令在该位置绘制专辑封面704。类似的,智能手表可按照视图树901中目标控件之间的图层关系依次绘制每个目标控件,从而形成如图10所示的投屏界面1001。
仍以音乐播放界面501举例,音乐播放界面501的配置文件1中也可以不包括上述“dest”字段。那么,智能手表在绘制音乐播放界面501中的目标控件1时,可以根据“translationx”字段和“translationy”字段确定目标控件1在x轴和y轴的平移距离;又例如,智能手表可以根据“scalex”字段和“scaley”字段确定目标控件1在x轴和y轴的缩放比例;又例如,智能手表还可以根据“rotatedegree”字段确定目标控件1的旋转角度。这样,智能手表也可计算出目标控件1投射在智能手表中的具体显示位置,进而,智能手表可调用目标控件1的绘制指令在相应位置绘制并显示出上述目标控件1。
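根据 translationx/translationy、scalex/scaley 字段推算目标控件投屏后显示区域的计算,可用如下Python片段示意(公式与字段默认值均为基于上下文的合理假设,rotatedegree 对应的旋转计算从略):

```python
# 简化示意:由平移与缩放字段推算目标控件在目的设备上的显示区域。
def transform_rect(src, cfg):
    """src 为控件投屏前的 (left, top, width, height);cfg 为配置文件中的变换字段。"""
    left, top, width, height = src
    sx = cfg.get("scalex", 1.0)
    sy = cfg.get("scaley", 1.0)
    new_left = left * sx + cfg.get("translationx", 0)   # 先缩放再平移(示意)
    new_top = top * sy + cfg.get("translationy", 0)
    return (new_left, new_top, width * sx, height * sy)

cfg = {"translationx": 10, "translationy": 20, "scalex": 0.5, "scaley": 0.5}
print(transform_rect((100, 200, 400, 80), cfg))  # (60.0, 120.0, 200.0, 40.0)
```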
可以看出,手机在将上述音乐播放界面501投屏至智能手表中显示时,可对音乐播放界面501中的控件进行拆分、删减和重组等操作,使得最终投屏在智能手表中的投屏界面1001能够适用智能手表中显示屏的显示尺寸以及用户的使用需求,从而提高多设备之间投屏时的显示效果和用户体验。
在本申请的另一些实施例中,手机还可以将当前的显示界面同时投射至多个目的设备中进行显示。
如图11所示,手机在显示视频APP的视频播放界面1101时,如果检测到用户开启投屏功能,并选中智能手表和智能电视作为本次投屏的目的设备,则手机可获取预先为智能手表设置的与上述视频播放界面1101对应的配置文件A,以及预先为智能电视设置的与上述视频播放界面1101对应的配置文件B。
与上述实施例中手机识别音乐播放界面501中目标控件的方法类似,手机可根据配置文件A中记录的第一目标控件的标识或第一目标控件投屏前的显示位置识别视频播放界面1101中的第一目标控件,并根据配置文件B中记录的第二目标控件的标识或第二目标控件投屏前的显示位置识别视频播放界面1101中的第二目标控件。第一目标控件与智能手表对应,第二目标控件与智能电视对应。
示例性的,仍如图11所示,视频播放界面1101中包括状态栏1100、视频画面1102,文本控件1103、进度条1104以及控制栏1105。控制栏1105中包括暂停按钮1106、上一个按钮1107以及下一个按钮1108。那么,如图12中的(a)所示,与视频播放界面1101对应的视图树1201包括:根节点,位于根节点下的状态栏1100、视频画面1102和控制栏1105。视频画面1102下包括文本控件1103和进度条1104这两个子节点,控制栏1105下包括暂停按钮1106、上一个按钮1107以及下一个按钮1108这三个子节点。
例如,手机通过配置文件A可识别出向智能手表投射视频播放界面1101的第一目标控件为:控制栏1105以及控制栏1105下的各个子节点,并且,手机通过配置文件B可识别出向智能电视投射视频播放界面1101的第二目标控件为:视频画面1102以及视频画面1102下的各个子节点。
那么,手机可将第一目标控件和第二目标控件在视图树1201中的并集作为本次需要发送的视图树。如图12中的(b)所示,手机删减掉视图树1201中的非目标控件(即状态栏1100)后,可得到包含第一目标控件和第二目标控件的视图树1202。进而,手机可通过上述通信网络104向智能手表和智能电视发送UI消息,该UI消息中既包括第一目标控件的相关绘制指令和绘图资源,也包括第二目标控件的相关绘制指令和绘图资源。当然,该UI消息中还可以包括视图树1202。
示例性的,智能手表可以预先存储与视频播放界面1101对应的配置文件A,或者,智能手表接收到上述UI消息后可从服务器中获取对应的配置文件A。进而,如图13所示,智能手表可根据配置文件A中记录的第一目标控件,对上述视图树1202进行拆分、裁剪和重组等操作,生成投屏后在智能手表上显示的第一投屏界面的视图树1301。例如,视图树1301包括位于根节点下的控制栏1105,控制栏1105下的子节点为:暂停按钮1106、上一个按钮1107以及下一个按钮1108。
进而,智能手表可按照视图树1301,根据配置文件A记录的目标控件投屏后在投屏界面中的具体位置,使用对应的绘制指令在对应的位置绘制每个第一目标控件,从而显示出图13所示的第一投屏界面1302。也就是说,手机将视频播放界面1101投屏到智能手表后,在智能手表中显示出了视频播放界面1101中控制栏的相关控件。
类似的,智能电视也可以预先存储与视频播放界面1101对应的配置文件B,或者,智能电视接收到上述UI消息后可从服务器中获取对应的配置文件B。进而,如图14所示,智能电视可根据配置文件B中记录的第二目标控件,对上述视图树1202进行拆分、裁剪和重组等操作,生成投屏后在智能电视上显示的第二投屏界面的视图树1401。例如,视图树1401包括位于根节点下的视频画面1102,视频画面1102下的子节点为:文本控件1103和进度条1104。
进而,智能电视可按照视图树1401,根据配置文件B记录的目标控件投屏后在投屏界面中的具体位置,使用对应的绘制指令在对应的位置绘制每个第二目标控件,从而显示出图14所示的第二投屏界面1402。也就是说,手机将视频播放界面1101投屏到智能电视后,在智能电视中显示出了视频播放界面1101中视频画面内的相关控件。
在另一些实施例中,如图15所示,手机获取到上述视频播放界面1101的视图树1201后, 也可以根据上述配置文件A生成与智能手表对应的视图树1301,并根据上述配置文件B生成与智能电视对应的视图树1401。进而,手机可向智能手表发送第一UI消息,第一UI消息中包含视图树1301以及视图树1301中各个控件的绘制指令,使得智能手表可基于视图树1301绘制并显示上述第一投屏界面1302。相应的,手机可向智能电视发送第二UI消息,第二UI消息中包含视图树1401以及视图树1401中各个控件的绘制指令,使得智能电视可基于视图树1401绘制并显示上述第二投屏界面1402。
可以看出,手机在向不同目的设备上投射同一显示画面时,可通过对显示画面中的控件进行拆分、删减和重组,在不同目的设备上显示出不同的投屏界面,以适应不同目的设备的屏幕大小等设备特点,从而提高投屏时的显示效果和用户体验。
在本申请的另一些实施例中,用户还可以手动在源设备的显示界面中指定向目的设备投屏的目标控件。这样,源设备可将用户手动指定的一个或多个目标控件投屏至目的设备中进行显示。
仍以手机为源设备举例,如图16所示,手机正在显示音乐APP中的歌词浏览界面1601。用户在手机中开启投屏功能后,手机可在提示框中显示支持本次投屏的多个候选设备。如果检测到用户选中提示框中的智能手表,说明用户希望将当前显示的歌词浏览界面1601投射至智能手表中显示。那么,手机可获取音乐APP的packagename和歌词浏览界面1601的activityname,并且,手机可在与智能手表对应的多个配置文件中查询与该packagename和activityname对应的配置文件(例如配置文件2)。与上述配置文件1类似的,配置文件2中记录了浏览界面1601中可以投屏至智能手表中显示的一个或多个目标控件。
如图16所示的浏览界面1601,该浏览界面1601中包括以下控件:状态栏1602、标题栏1603、歌词1604以及控制栏1605。控制栏1605中包括进度条、暂停按钮、上一首按钮以及下一首按钮等控件。标题栏1603中包括歌曲名称和演唱者等控件。状态栏1602中包括时间、信号强度以及电池容量等控件。
检测到用户选中智能手表作为目的设备后,如图17中的(a)所示,手机可在浏览界面1601上显示一个或多个圈选框1701。圈选框1701可用于选择向智能手表投射的目标控件。例如,用户可调节圈选框1701在浏览界面1601中的大小和位置,以选择本次用户希望投射至智能手表中的目标控件。仍如图17中的(a)所示,用户使用圈选框1701选中了浏览界面1601中的第一区域后,可执行预设的操作触发手机开始投屏。例如,该预设的操作可以为用户双击第一区域或按压第一区域等操作。又例如,手机可以在圈选框1701附近显示完成按钮,用户使用圈选框1701选中第一区域后,可点击该完成按钮,触发手机开始投屏。
示例性的,用户使用圈选框1701选中了浏览界面1601中的第一区域后,手机可以检测到圈选框1701内的第一区域在浏览界面1601中的具体坐标。进而,手机可根据上述配置文件2确定第一区域是否与配置文件2中记录的目标控件匹配。例如,当第一区域的坐标与配置文件2中某一目标控件(例如目标控件1)的坐标相同或接近时,手机可确定第一区域与配置文件2中的目标控件1匹配。
又或者,检测到用户选中智能手表作为目的设备后,如图17中的(b)所示,用户也可以在浏览界面1601上点击或圈选需要投屏的区域。以用户点击浏览界面1601中的A点举例,手机可根据A点的坐标,在配置文件2中查询A点所属的目标控件为目标控件1。
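上述“根据A点坐标在配置文件中查询其所属目标控件”本质上是一次命中测试(hit test),可用如下Python片段示意(控件区域数据为示意性假设):

```python
# 简化示意:判断用户点击坐标落在配置文件记录的哪个目标控件区域内。
def hit_test(point, controls):
    """返回包含点击坐标的控件ID;未命中任何控件时返回 None。"""
    x, y = point
    for c in controls:
        if c["left"] <= x < c["left"] + c["width"] and c["top"] <= y < c["top"] + c["height"]:
            return c["id"]
    return None

controls = [
    {"id": "歌词1604",   "left": 0, "top": 300,  "width": 1080, "height": 1200},
    {"id": "控制栏1605", "left": 0, "top": 1600, "width": 1080, "height": 300},
]
print(hit_test((540, 1700), controls))  # 控制栏1605
```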
也就是说,手机可响应用户的手动操作在配置文件2中确定用户本次需要投屏的一个或多个目标控件。进而,与上述实施例中的投屏方法类似的,如图18所示,基于浏览界面1601的视图树1801,手机可根据配置文件2中用户选中的目标控件生成投屏后投屏界面的视图树 1802。例如,视图树1802中包括用户本次手动选中的控制栏1605中的各个控件。
那么,手机可将视图树1802及视图树1802中目标控件的绘制指令携带在UI消息中发送给智能手表。智能手表接收到该UI消息后,如图19所示,可按照视图树1802中各个控件的图层顺序调用相应控件的绘制指令,在智能手表的显示屏中绘制控制栏1605中的各个控件,形成投屏后的投屏界面1901。
在另一些实施例中,手机开启投屏模式后,如果检测到用户在当前的显示界面中手动圈选了一个或多个控件,则手机也可以将用户圈选的控件作为本次投屏的目标控件,并动态的生成与目标控件对应的配置文件。例如,手机可在动态生成的配置文件中记录用户圈选的每个目标控件的具体位置,并且,手机还可以根据本次投屏的目的设备的分辨率、屏幕尺寸等参数,在配置文件中设置每个目标控件在目的设备中的具体显示位置。
进而,与上述实施例中的投屏方法类似的,手机基于动态生成的配置文件可生成后续投屏界面的视图树,并将该视图树及相关绘制指令携带在UI消息发送给目的设备。并且,手机还可以将本次动态生成的配置文件也发送给目的设备。这样,目的设备基于动态生成的配置文件,可在相应的显示位置调用绘制指令绘制用户手动圈选的目标控件。这样一来,在投屏过程中,用户可以手动选择将源设备的显示界面中的哪些内容投射至目的设备中显示,显示界面在多设备之间显示时的灵活度更高,提高了投屏时用户的使用体验。
本申请实施例公开了一种电子设备,包括处理器,以及与处理器相连的存储器、输入设备、输出设备和通信模块。其中,输入设备和输出设备可集成为一个设备,例如,可将触摸传感器作为输入设备,将显示屏作为输出设备,并将触摸传感器和显示屏集成为触摸屏。
此时,如图20所示,上述电子设备可以包括:触摸屏2001,所述触摸屏2001包括触摸传感器2006和显示屏2007;一个或多个处理器2002;存储器2003;通信模块2008;一个或多个应用程序(未示出);以及一个或多个计算机程序2004,上述各器件可以通过一个或多个通信总线2005连接。其中该一个或多个计算机程序2004被存储在上述存储器2003中并被配置为被该一个或多个处理器2002执行,该一个或多个计算机程序2004包括指令,上述指令可以用于执行上述实施例中的各个步骤。其中,上述方法实施例涉及的各步骤的所有相关内容均可以援引到对应实体器件的功能描述,在此不再赘述。
示例性的,上述处理器2002具体可以为图2所示的处理器110,上述存储器2003具体可以为图2所示的内部存储器121和/或外部存储器120,上述显示屏2007具体可以为图2所示的显示屏194,上述触摸传感器2006具体可以为图2所示的传感器模块180中的触摸传感器,上述通信模块2008具体可以为图2所示的移动通信模块150和/或无线通信模块160,本申请实施例对此不做任何限制。
通过以上的实施方式的描述,所属领域的技术人员可以清楚地了解到,为描述的方便和简洁,仅以上述各功能模块的划分进行举例说明,实际应用中,可以根据需要而将上述功能分配由不同的功能模块完成,即将装置的内部结构划分成不同的功能模块,以完成以上描述的全部或者部分功能。上述描述的系统,装置和单元的具体工作过程,可以参考前述方法实施例中的对应过程,在此不再赘述。
在本申请实施例各个实施例中的各功能单元可以集成在一个处理单元中,也可以是各个单元单独物理存在,也可以两个或两个以上单元集成在一个单元中。上述集成的单元既可以采用硬件的形式实现,也可以采用软件功能单元的形式实现。
所述集成的单元如果以软件功能单元的形式实现并作为独立的产品销售或使用时,可以存储在一个计算机可读取存储介质中。基于这样的理解,本申请实施例的技术方案本质上或 者说对现有技术做出贡献的部分或者该技术方案的全部或部分可以以软件产品的形式体现出来,该计算机软件产品存储在一个存储介质中,包括若干指令用以使得一台计算机设备(可以是个人计算机,服务器,或者网络设备等)或处理器执行本申请各个实施例所述方法的全部或部分步骤。而前述的存储介质包括:快闪存储器、移动硬盘、只读存储器、随机存取存储器、磁碟或者光盘等各种可以存储程序代码的介质。
以上所述,仅为本申请实施例的具体实施方式,但本申请实施例的保护范围并不局限于此,任何在本申请实施例揭露的技术范围内的变化或替换,都应涵盖在本申请实施例的保护范围之内。因此,本申请实施例的保护范围应以所述权利要求的保护范围为准。

Claims (17)

  1. 一种投屏显示方法,其特征在于,包括:
    第一电子设备显示第一显示界面;
    所述第一电子设备接收用户将所述第一显示界面投射至第二电子设备的投屏指令;
    响应于所述投屏指令,所述第一电子设备确定所述第一显示界面中的一个或多个第一目标控件;
    所述第一电子设备向所述第二电子设备发送第一消息,所述第一消息中包括所述第一目标控件的绘制指令,使得所述第二电子设备按照所述第一目标控件的绘制指令绘制第一投屏界面,所述第一投屏界面中包括所述第一目标控件。
  2. 根据权利要求1所述的方法,其特征在于,所述第一电子设备确定所述第一显示界面中的第一目标控件,包括:
    所述第一电子设备根据所述第二电子设备的类型,获取与所述第一显示界面对应的配置文件,所述配置文件中记录有所述第一显示界面中需要投屏的第一目标控件;
    所述第一电子设备根据所述配置文件,确定所述第一显示界面中的第一目标控件。
  3. 根据权利要求2所述的方法,其特征在于,
    所述配置文件中记录有所述第一目标控件在所述第一显示界面中的标识;其中,所述第一电子设备根据所述配置文件,确定所述第一显示界面中的第一目标控件,包括:
    所述第一电子设备根据所述第一目标控件在所述第一显示界面中的标识,确定所述第一目标控件;或者,
    所述配置文件中记录有所述第一目标控件在所述第一显示界面中的显示位置;其中,所述第一电子设备根据所述配置文件,确定所述第一显示界面中的第一目标控件,包括:
    所述第一电子设备根据所述第一目标控件在所述第一显示界面中的显示位置,确定所述第一目标控件。
  4. 根据权利要求1-3中任一项所述的方法,其特征在于,在所述第一电子设备确定所述第一显示界面中的一个或多个第一目标控件之后,还包括:
    所述第一电子设备获取绘制所述第一显示界面时的第一视图信息,所述第一视图信息包括所述第一显示界面中各个控件之间的图层顺序;
    所述第一电子设备根据所述第一视图信息确定第二视图信息,所述第二视图信息包括所述第一目标控件在所述第一投屏界面中的图层顺序;
    其中,所述第一消息中还包括所述第二视图信息,以使所述第二电子设备根据所述第二视图信息,及所述第一目标控件的绘制指令绘制第一投屏界面。
  5. 根据权利要求4所述的方法,其特征在于,所述第一电子设备根据所述第一视图信息确定第二视图信息,包括:
    所述第一电子设备对所述第一视图信息中的所述第一目标控件进行拆分和重组,得到第二视图信息,所述第一目标控件在所述第一视图信息和所述第二视图信息中的图层顺序相同。
  6. 根据权利要求4所述的方法,其特征在于,与所述第一显示界面对应的配置文件中还记录有所述第一目标控件在所述第一投屏界面中的显示位置;
    其中,所述第一电子设备根据所述第一视图信息确定第二视图信息,包括:
    所述第一电子设备从所述第一视图信息中拆分出所述第一目标控件;
    所述第一电子设备按照所述配置文件中所述第一目标控件在所述第一投屏界面中的显示位置,将所述第一目标控件重组后得到所述第二视图信息。
  7. 根据权利要求1-6中任一项所述的方法,其特征在于,所述第一消息中还包括所述第一目标控件的绘制资源;所述第一目标控件的绘制资源用于供所述第二电子设备执行所述第一目标控件的绘制指令时,使用该绘制资源绘制所述第一目标控件的用户界面。
  8. 根据权利要求1-7中任一项所述的方法,其特征在于,在第一电子设备显示第一显示界面之后,还包括:
    所述第一电子设备接收用户将所述第一显示界面投射至第三电子设备的投屏指令;
    响应于所述投屏指令,所述第一电子设备确定所述第一显示界面中的第二目标控件;
    所述第一电子设备向所述第三电子设备发送第二消息,所述第二消息中包括所述第二目标控件的绘制指令;或者,
    所述第一电子设备向所述第三电子设备发送所述第一消息,所述第一消息中还包括所述第二目标控件的绘制指令。
  9. 一种投屏显示方法,其特征在于,包括:
    第二电子设备接收第一电子设备发送的第一消息,所述第一消息中包括N个控件的绘制指令,N为大于0的整数;
    所述第二电子设备按照所述N个控件的绘制指令绘制第一投屏界面,所述第一投屏界面中包括所述N个控件中的至少一个。
  10. 根据权利要求9所述的方法,其特征在于,所述第一消息中还包括与所述N个控件对应的第一视图信息,所述第一视图信息包括所述N个控件之间的图层顺序;
    其中,所述第二电子设备按照所述N个控件的绘制指令绘制第一投屏界面,包括:
    所述第二电子设备根据所述第一视图信息,分别执行所述N个控件的绘制指令,绘制所述第一投屏界面。
  11. 根据权利要求10所述的方法,其特征在于,所述第一消息中还包括第一应用界面的标识,所述N个控件位于所述第一应用界面中;
    其中,在第二电子设备接收第一电子设备发送的第一消息之后,还包括:
    所述第二电子设备获取与所述第一应用界面的标识对应的配置文件,所述配置文件中记录有所述N个控件在所述第一投屏界面中的显示位置;
    其中,所述第二电子设备按照所述N个控件的绘制指令绘制第一投屏界面,包括:
    所述第二电子设备根据所述第一视图信息,分别执行所述N个控件的绘制指令,在所述配置文件记录的显示位置依次绘制所述N个控件,形成所述第一投屏界面。
  12. 根据权利要求9所述的方法,其特征在于,所述第一消息中还包括第一应用界面的标识,所述N个控件中的M个目标控件位于所述第一应用界面中;
    其中,在第二电子设备接收第一电子设备发送的第一消息之后,还包括:
    所述第二电子设备获取与所述第一应用界面的标识对应的配置文件,所述配置文件中记录有M个目标控件在所述第一投屏界面中的显示位置,所述M个目标控件为所述N个控件的子集,M为不大于N的整数;
    所述第二电子设备根据所述配置文件在所述N个控件中确定需要在所述第一投屏界面中显示的M个目标控件;
    其中,所述第二电子设备按照所述N个控件的绘制指令绘制第一投屏界面,包括:
    所述第二电子设备从所述N个控件的绘制指令中确定所述M个目标控件的绘制指令;
    所述第二电子设备按照所述M个目标控件的绘制指令绘制所述M个目标控件,形成所述第一投屏界面。
  13. 根据权利要求12所述的方法,其特征在于,在所述第二电子设备根据所述配置文件在所述N个控件中确定需要在所述第一投屏界面中显示的M个目标控件之后,还包括:
    所述第二电子设备根据所述配置文件生成第二视图信息,所述第二视图信息包括所述M个目标控件在所述第一投屏界面中的图层顺序;
    其中,所述第二电子设备按照所述M个目标控件的绘制指令绘制所述M个目标控件,形成所述第一投屏界面,包括:
    所述第二电子设备根据所述第二视图信息,分别执行所述M个目标控件的绘制指令,在所述配置文件记录的显示位置绘制所述M个目标控件,形成所述第一投屏界面。
  14. 一种电子设备,其特征在于,包括:
    触摸屏,所述触摸屏包括触摸传感器和显示屏;
    通信模块;
    一个或多个处理器;
    一个或多个存储器;
    以及一个或多个计算机程序,其中所述一个或多个计算机程序被存储在所述一个或多个存储器中,所述一个或多个计算机程序包括指令,当所述指令被所述电子设备执行时,使得所述电子设备执行如权利要求1-8或权利要求9-13中任一项所述的投屏显示方法。
  15. 一种计算机可读存储介质,所述计算机可读存储介质中存储有指令,其特征在于,当所述指令在电子设备上运行时,使得所述电子设备执行如权利要求1-8或权利要求9-13中任一项所述的投屏显示方法。
  16. 一种包含指令的计算机程序产品,其特征在于,当所述计算机程序产品在电子设备上运行时,使得所述电子设备执行如权利要求1-8或权利要求9-13中任一项所述的投屏显示方法。
  17. 一种投屏显示系统,其特征在于,所述系统包括至少一个源设备和至少一个目的设备;其中,所述源设备用于执行如权利要求1-8中任一项所述的投屏显示方法,所述目的设备用于执行如权利要求9-13中任一项所述的投屏显示方法。
PCT/CN2020/093892 2019-06-05 2020-06-02 一种投屏显示方法及电子设备 WO2020244495A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP20819136.1A EP3958548A4 (en) 2019-06-05 2020-06-02 DISPLAY METHOD BY SCREEN PROJECTION AND ELECTRONIC DEVICE
US17/616,901 US11880628B2 (en) 2019-06-05 2020-06-02 Screen mirroring display method and electronic device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910487829.1 2019-06-05
CN201910487829.1A CN110381195A (zh) 2019-06-05 2019-06-05 一种投屏显示方法及电子设备

Publications (1)

Publication Number Publication Date
WO2020244495A1 true WO2020244495A1 (zh) 2020-12-10

Family

ID=68249819

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/093892 WO2020244495A1 (zh) 2019-06-05 2020-06-02 一种投屏显示方法及电子设备

Country Status (4)

Country Link
US (1) US11880628B2 (zh)
EP (1) EP3958548A4 (zh)
CN (1) CN110381195A (zh)
WO (1) WO2020244495A1 (zh)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113961161A (zh) * 2021-10-18 2022-01-21 阿里云计算有限公司 数据展示方法、系统、移动终端、存储介质及程序产品
EP4242811A4 (en) * 2020-12-16 2024-05-01 Huawei Technologies Co., Ltd. METHOD FOR CUTTING AN APPLICATION INTERFACE AND ELECTRONIC DEVICE

Families Citing this family (57)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110381195A (zh) * 2019-06-05 2019-10-25 华为技术有限公司 一种投屏显示方法及电子设备
CN110908627A (zh) * 2019-10-31 2020-03-24 维沃移动通信有限公司 投屏方法及第一电子设备
CN111625211B (zh) * 2019-12-03 2023-11-28 蘑菇车联信息科技有限公司 一种屏幕投屏方法、装置、安卓设备及显示设备
CN111107624B (zh) * 2019-12-16 2022-04-01 北京小米移动软件有限公司 负一屏同步方法、负一屏同步装置及电子设备
CN116055773A (zh) 2019-12-17 2023-05-02 华为技术有限公司 一种多屏协同方法、系统及电子设备
CN111050091B (zh) * 2019-12-23 2021-04-13 联想(北京)有限公司 输出控制方法、装置及电子设备
CN113138816A (zh) * 2020-01-19 2021-07-20 华为技术有限公司 一种息屏显示主题显示方法及移动设备
CN111324327B (zh) * 2020-02-20 2022-03-25 华为技术有限公司 投屏方法及终端设备
CN111399789B (zh) * 2020-02-20 2021-11-19 华为技术有限公司 界面布局方法、装置及系统
CN113391734A (zh) * 2020-03-12 2021-09-14 华为技术有限公司 图像处理方法和图像显示设备、存储介质和电子设备
CN111510671A (zh) * 2020-03-13 2020-08-07 海信集团有限公司 一种监控视频调取显示的方法及智能终端
CN111443884A (zh) * 2020-04-23 2020-07-24 华为技术有限公司 投屏方法、装置和电子设备
CN111767012A (zh) * 2020-05-29 2020-10-13 维沃移动通信有限公司 投屏方法及装置
CN111813302B (zh) * 2020-06-08 2022-02-08 广州视源电子科技股份有限公司 Screen projection display method, apparatus, terminal device and storage medium
CN113805981A (zh) * 2020-06-17 2021-12-17 Oppo(重庆)智能科技有限公司 Watch face pattern display method and apparatus, watch, electronic device, and storage medium
CN112260907A (zh) * 2020-07-01 2021-01-22 华为技术有限公司 Cross-device control method, apparatus and system
CN113938743B (zh) * 2020-07-08 2023-06-27 华为技术有限公司 Collaborative control method and system between electronic devices
CN113961157B (zh) * 2020-07-21 2023-04-07 华为技术有限公司 Display interaction system, display method and device
CN112035048B (zh) * 2020-08-14 2022-03-25 广州视源电子科技股份有限公司 Touch data processing method, apparatus, device and storage medium
CN112000410B (zh) * 2020-08-17 2024-03-19 努比亚技术有限公司 Screen projection control method, device and computer-readable storage medium
CN114079809A (zh) * 2020-08-20 2022-02-22 华为技术有限公司 Terminal and input method and apparatus thereof
WO2022042162A1 (zh) * 2020-08-25 2022-03-03 华为技术有限公司 User interface implementation method and apparatus
CN112153459A (zh) * 2020-09-01 2020-12-29 三星电子(中国)研发中心 Method and apparatus for screen projection display
CN115480670A (zh) * 2020-09-07 2022-12-16 华为技术有限公司 Navigation bar display method, display method and first electronic device
CN114173184A (zh) * 2020-09-10 2022-03-11 华为终端有限公司 Screen projection method and electronic device
CN114168236A (zh) * 2020-09-10 2022-03-11 华为技术有限公司 Application access method and related apparatus
CN114422640B (zh) * 2020-10-12 2023-10-13 华为技术有限公司 Device recommendation method and electronic device
CN113867663B (zh) * 2020-10-22 2024-04-09 华为技术有限公司 Display method and electronic device
CN114530148A (zh) * 2020-10-30 2022-05-24 华为终端有限公司 Control method, apparatus and electronic device
CN114442971A (zh) * 2020-10-30 2022-05-06 华为技术有限公司 Wireless screen projection method, mobile device and computer-readable storage medium
CN112328344B (zh) * 2020-11-02 2022-11-22 联想(北京)有限公司 Screen projection processing method and first device
CN114489532B (zh) * 2020-11-12 2023-11-10 聚好看科技股份有限公司 Terminal device and method for linkage between terminal device and display device
CN114510203A (zh) * 2020-11-16 2022-05-17 荣耀终端有限公司 Electronic device, inter-device screen collaboration method thereof, and medium
CN112383803B (zh) * 2020-11-16 2023-04-11 Oppo广东移动通信有限公司 Information processing method and related apparatus
CN112286477B (zh) * 2020-11-16 2023-12-08 Oppo广东移动通信有限公司 Screen projection display method and related products
CN114584828B (zh) * 2020-11-30 2024-05-17 上海新微技术研发中心有限公司 Android screen projection method, computer-readable storage medium and device
CN112732212A (zh) * 2020-12-31 2021-04-30 咪咕音乐有限公司 Display method, electronic device and storage medium
CN112732384B (zh) * 2021-01-04 2024-03-26 联想(北京)有限公司 Data processing method and apparatus
CN114816692A (zh) * 2021-01-29 2022-07-29 Oppo广东移动通信有限公司 Screen projection display method, apparatus, mobile terminal and storage medium
CN114915834A (zh) * 2021-02-08 2022-08-16 华为技术有限公司 Screen projection method and electronic device
CN113242463B (zh) * 2021-03-26 2023-03-03 北京汗粮科技有限公司 Method for enhancing screen projection interaction capability through extended parameters
CN115145519A (zh) * 2021-03-31 2022-10-04 华为技术有限公司 Display method, electronic device and system
CN113190196B (zh) * 2021-04-27 2023-09-05 北京京东振世信息技术有限公司 Multi-device linkage implementation method, apparatus, medium and electronic device
CN115525366A (zh) * 2021-06-25 2022-12-27 华为技术有限公司 Screen projection method and related apparatus
CN115599325A (zh) * 2021-06-28 2023-01-13 华为技术有限公司(Cn) Screen projection control method and electronic device
CN115544469A (zh) * 2021-06-29 2022-12-30 华为技术有限公司 Access control method and related apparatus
CN113703849B (zh) * 2021-07-15 2023-04-18 荣耀终端有限公司 Method and apparatus for opening a screen projection application
CN116301516A (zh) * 2021-12-21 2023-06-23 北京小米移动软件有限公司 Application sharing method and apparatus, electronic device, and storage medium
CN115567630B (zh) * 2022-01-06 2023-06-16 荣耀终端有限公司 Electronic device management method, electronic device and readable storage medium
CN114461124B (zh) * 2022-01-30 2023-03-21 深圳创维-Rgb电子有限公司 Screen projection control method, apparatus, screen projector and computer-readable storage medium
CN115134341A (zh) * 2022-06-27 2022-09-30 联想(北京)有限公司 Display method and apparatus
CN117492672A (zh) * 2022-07-26 2024-02-02 华为技术有限公司 Screen projection method and electronic device
US11909544B1 (en) * 2022-09-20 2024-02-20 Motorola Mobility Llc Electronic devices and corresponding methods for redirecting user interface controls during a videoconference
CN117850718A (zh) * 2022-10-09 2024-04-09 华为技术有限公司 Display screen selection method and electronic device
CN116679895B (zh) * 2022-10-26 2024-06-07 荣耀终端有限公司 Collaborative service scheduling method, electronic device and collaboration system
US11689695B1 (en) * 2022-12-15 2023-06-27 Northern Trust Corporation Computing technologies for screensharing
US11907606B1 (en) * 2023-03-15 2024-02-20 Motorola Mobility Llc Compute box and corresponding systems and methods for formatting content for presentation on flexible content presentation companion devices

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2014063259A (ja) * 2012-09-20 2014-04-10 Fujitsu Ltd Terminal device and processing program
WO2015036649A1 (en) * 2013-09-13 2015-03-19 Polar Electro Oy Remote wireless display for biometric data with bidirectional communications
US10191713B2 (en) 2014-03-24 2019-01-29 Lenovo (Beijing) Co., Ltd. Information processing method and electronic device
CN103986935B (zh) * 2014-04-30 2018-03-06 华为技术有限公司 Encoding method, encoder, screen sharing device and system
WO2016056858A2 (en) * 2014-10-10 2016-04-14 Samsung Electronics Co., Ltd. Method for sharing screen and electronic device thereof
KR20170096408A (ko) 2016-02-16 2017-08-24 삼성전자주식회사 Method for displaying an application and electronic device supporting the same
CN106331669A (zh) * 2016-08-14 2017-01-11 深圳市芯智科技有限公司 Projection method based on holographic automatic stepless zoom function
CN109218731B (zh) * 2017-06-30 2021-06-01 腾讯科技(深圳)有限公司 Screen projection method, apparatus and system for mobile device
CN107493375B (zh) * 2017-06-30 2020-06-16 北京超卓科技有限公司 Extended screen projection method and screen projection system for mobile terminal
WO2019036942A1 (zh) 2017-08-23 2019-02-28 华为技术有限公司 Display method and apparatus

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103744810A (zh) * 2013-12-23 2014-04-23 西安酷派软件科技有限公司 Terminal, electronic device, synchronous display system and method
CN105100907A (zh) * 2014-04-28 2015-11-25 宇龙计算机通信科技(深圳)有限公司 Selective screen projection method and apparatus thereof
CN108713185A (zh) * 2016-03-02 2018-10-26 三星电子株式会社 Electronic device and method for displaying and transmitting images thereof
EP3451654A1 (en) * 2017-08-25 2019-03-06 Bellevue Investments GmbH & Co. KGaA Method and system for 360 degree video editing with latency compensation
CN108958684A (zh) * 2018-06-22 2018-12-07 维沃移动通信有限公司 Screen projection method and mobile terminal
CN109508162A (zh) * 2018-10-12 2019-03-22 福建星网视易信息系统有限公司 Screen projection display method, system and storage medium
CN111190558A (zh) * 2018-11-15 2020-05-22 腾讯科技(深圳)有限公司 Screen projection control method, apparatus, computer-readable storage medium and computer device
CN110377250A (zh) * 2019-06-05 2019-10-25 华为技术有限公司 Touch control method in a screen projection scenario and electronic device
CN110381195A (zh) * 2019-06-05 2019-10-25 华为技术有限公司 Screen projection display method and electronic device
CN110389736A (zh) * 2019-06-05 2019-10-29 华为技术有限公司 Screen projection display method and electronic device

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3958548A4

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP4242811A4 (en) * 2020-12-16 2024-05-01 Huawei Technologies Co., Ltd. METHOD FOR CUTTING AN APPLICATION INTERFACE AND ELECTRONIC DEVICE
CN113961161A (zh) * 2021-10-18 2022-01-21 阿里云计算有限公司 Data display method, system, mobile terminal, storage medium and program product

Also Published As

Publication number Publication date
US11880628B2 (en) 2024-01-23
US20220391161A1 (en) 2022-12-08
EP3958548A1 (en) 2022-02-23
EP3958548A4 (en) 2022-07-06
CN110381195A (zh) 2019-10-25

Similar Documents

Publication Publication Date Title
WO2020244495A1 (zh) Screen projection display method and electronic device
WO2020244492A1 (zh) Screen projection display method and electronic device
WO2020244500A1 (zh) Touch control method in a screen projection scenario and electronic device
US11818420B2 (en) Cross-device content projection method and electronic device
WO2020244497A1 (zh) Flexible screen display method and electronic device
WO2021023220A1 (zh) Content continuation method and system, and electronic device
CN112714214B (zh) Content continuation method, device, system, GUI and computer-readable storage medium
WO2020192456A1 (zh) Voice interaction method and electronic device
WO2020155014A1 (zh) Smart home device sharing system and method, and electronic device
WO2021121052A1 (zh) Multi-screen collaboration method, system and electronic device
WO2020119464A1 (zh) Video splitting method and electronic device
WO2021249318A1 (zh) Screen projection method and terminal
WO2020211705A1 (zh) Contact recommendation method and electronic device
CN116360725B (zh) Display interaction system, display method and device
WO2022078295A1 (zh) Device recommendation method and electronic device
CN113691842A (zh) Cross-device content projection method and electronic device
WO2022143883A1 (zh) Photographing method, system and electronic device
WO2022135527A1 (zh) Video recording method and electronic device
JP2022515863A (ja) Method for accessing a network by a smart home device, and related device
WO2022156721A1 (zh) Photographing method and electronic device
CN115016697A (zh) Screen projection method, computer device, readable storage medium and program product
WO2021042881A1 (zh) Message notification method and electronic device
WO2023005711A1 (zh) Service recommendation method and electronic device
WO2022206763A1 (zh) Display method, electronic device and system
WO2022143310A1 (zh) Dual-channel screen projection method and electronic device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20819136

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2020819136

Country of ref document: EP

Effective date: 20211116

NENP Non-entry into the national phase

Ref country code: DE