CN114442969A - Inter-device screen cooperation method and device - Google Patents


Info

Publication number
CN114442969A
Authority
CN
China
Prior art keywords
window
electronic device
screen
content
electronic equipment
Prior art date
Legal status
Granted
Application number
CN202011190260.1A
Other languages
Chinese (zh)
Other versions
CN114442969B (en)
Inventor
叶灵洁
阚彬
Current Assignee
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Priority to CN202011190260.1A (granted as CN114442969B)
Priority to PCT/CN2021/125218 (published as WO2022089294A1)
Publication of CN114442969A
Application granted
Publication of CN114442969B
Legal status: Active
Anticipated expiration

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/14 - Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G06F 3/1454 - Digital output to display device; Cooperation and interconnection of the display device with other functional units involving copying of the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/14 - Digital output to display device; Cooperation and interconnection of the display device with other functional units

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Information Transfer Between Computers (AREA)
  • User Interface Of Digital Computer (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

The application provides an inter-device screen collaboration method and device, applied to a system formed by a first electronic device and a second electronic device. The method includes: the first electronic device and the second electronic device establish a communication connection; the first electronic device displays a first window including first content; the second electronic device displays a second window including second content; the first electronic device sends first screen projection data including information of the first content to the second electronic device; the second electronic device receives the first screen projection data and displays the second window and a third window including the first content; the second electronic device sends second screen projection data including information of the second content to the first electronic device; and the first electronic device receives the second screen projection data and displays the first window and a fourth window including the second content. In this scheme, the first electronic device and the second electronic device can project screens to each other simultaneously, so that bidirectional screen collaboration between devices is achieved and the efficiency of inter-device screen collaboration is improved.

Description

Inter-device screen cooperation method and device
Technical Field
The present application relates to the field of electronic devices, and in particular, to a method and device for screen collaboration between devices.
Background
With the development of technology, electronic devices have become increasingly powerful, and multi-window modes and inter-device cooperation modes are becoming widespread. At present, a user can open multiple operation windows on an electronic device at the same time, and can also share a window (application) to another device for display. For example, the current inter-device cooperation mode allows a device A (e.g., a mobile phone) to cooperatively project its screen onto a device B: operations performed by the user on device A are synchronously displayed on device B, and operations performed on device A's projection on device B are synchronously fed back to device A. However, the current inter-device cooperation mode has a master-slave relationship; for example, after device A projects to device B, device B acts as a slave device and can no longer project to device A, which results in a poor user experience.
Therefore, current cross-device sharing supports only one-way sharing between devices and does not support two-way sharing, so the efficiency of cooperation between devices is low.
Disclosure of Invention
In a first aspect, an embodiment of the present application provides an inter-device screen coordination method, which is applied to a system formed by a first electronic device and a second electronic device, and the method includes:
the first electronic device and the second electronic device establish a communication connection;
the first electronic device displays a first window, wherein the first window comprises first content;
the second electronic device displays a second window, wherein the second window comprises second content;
the first electronic device sends first screen projection data to the second electronic device, wherein the first screen projection data comprises information of the first content;
the second electronic device receives the first screen projection data and displays the second window and a third window, wherein the third window comprises the first content;
the second electronic device sends second screen projection data to the first electronic device, wherein the second screen projection data comprises information of the second content;
and the first electronic device receives the second screen projection data and displays the first window and a fourth window, wherein the fourth window comprises the second content.
In this method, when the first electronic device displays a first window and the second electronic device displays a second window, the first electronic device sends first screen projection data corresponding to the first window to the second electronic device. The second electronic device can then display a projection window of the first window, namely the third window, according to the first screen projection data while still displaying its own second window, which realizes screen projection from the first electronic device to the second electronic device. At the same time, the first electronic device receives second screen projection data sent by the second electronic device and displays the projection window of the second electronic device according to the second screen projection data while still displaying its own first window, which realizes screen projection from the second electronic device to the first electronic device. The method therefore achieves bidirectional screen projection between the first electronic device and the second electronic device: each device displays both its own window and the window of the peer device, which improves the efficiency of screen collaboration between devices and improves the user experience.
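Purely as an illustration of the data flow just described, the following Kotlin sketch shows each device acting simultaneously as a projection source for its own window and as a projection sink for the peer's window; the PeerLink, ProjectionData, and Window types and the render step are hypothetical placeholders, not part of the claimed method.

```kotlin
// Illustrative sketch only: PeerLink, ProjectionData, and Window are hypothetical
// placeholders, not types defined by the application.
class ProjectionData(val sourceDevice: String, val content: ByteArray)
class Window(val title: String, var content: ByteArray)

interface PeerLink {
    fun send(data: ProjectionData)
    fun onReceive(handler: (ProjectionData) -> Unit)
}

class ScreenCooperationDevice(private val name: String, private val link: PeerLink) {
    private val localWindow = Window("local window", ByteArray(0))   // e.g. the first window
    private var remoteWindow: Window? = null                         // e.g. the fourth window

    fun start() {
        // Project the local window to the peer (first screen projection data).
        link.send(ProjectionData(name, localWindow.content))
        // At the same time, accept the peer's projection (second screen projection data)
        // and show it alongside the local window.
        link.onReceive { data ->
            remoteWindow = Window("projected from ${data.sourceDevice}", data.content)
            render(localWindow, remoteWindow)
        }
    }

    private fun render(vararg windows: Window?) {
        windows.filterNotNull().forEach { println("showing ${it.title}") }
    }
}
```

In this sketch each device runs the same start() routine, which is what makes the cooperation bidirectional rather than master-slave.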
In one possible design, before the first electronic device sends the first screen projection data to the second electronic device, the method further includes:
the first electronic device, in response to a received first operation, sends a first screen projection request to the second electronic device, wherein the first screen projection request is used for requesting to project the first content to the second electronic device;
before the second electronic device displays the second window and the third window, the method further comprises:
and the second electronic device displays first prompt information, wherein the first prompt information is used for prompting the user whether to accept the first electronic device projecting the first content to the second electronic device.
In this method, the first electronic device requests screen projection to the second electronic device according to a user operation, and the second electronic device displays corresponding prompt information after receiving the request, so that whether to accept the projection is determined according to a user operation. Therefore, the first electronic device and the second electronic device can perform inter-device screen cooperative control based on user requirements, which provides a good user experience.
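As a hedged sketch of this request-and-confirm design, the handshake could be modelled as below; the data classes and the promptUser callback are illustrative assumptions rather than names used in the application.

```kotlin
// Hypothetical sketch of the request-and-confirm handshake; the data classes and
// function names are illustrative, not taken from the application.
data class ProjectionRequest(val fromDevice: String, val contentDescription: String)
data class ProjectionAnswer(val accepted: Boolean)

// Initiating side: the first operation triggers the first screen projection request.
fun onUserRequestsProjection(send: (ProjectionRequest) -> Unit, device: String, content: String) {
    send(ProjectionRequest(device, content))
}

// Receiving side: show the first prompt information before displaying the projected window.
fun onProjectionRequest(request: ProjectionRequest, promptUser: (String) -> Boolean): ProjectionAnswer {
    val accepted = promptUser(
        "Allow ${request.fromDevice} to project \"${request.contentDescription}\" to this device?"
    )
    return ProjectionAnswer(accepted)
}
```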
In one possible design, after the second electronic device displays the second window and a third window, the method further includes:
the first electronic device updates the first content in the first window in response to a received second operation; or
the second electronic device, in response to a received third operation, sends a first update request to the first electronic device, wherein the first update request is used for requesting the first electronic device to update the first content in the first window;
the first electronic device updates the first content in the first window in response to the received first update request.
In this method, a user can control the updating of the content in the first window of the first electronic device from either the first electronic device side or the second electronic device side, which greatly improves the flexibility and convenience of device control during inter-device screen cooperation. In addition, the second electronic device that receives the projection is able to control updates to the display window of the first electronic device that initiates the projection.
In one possible design, when the first electronic device updates the first content in the first window, the method further includes:
the first electronic device sends updated first screen projection data to the second electronic device;
and the second electronic device updates the first content in the third window according to the received updated first screen projection data.
In this method, when the first electronic device updates the display content of its first window, the updated content is indicated to the second electronic device, and the second electronic device can synchronously update the projection window corresponding to the first window, namely the third window.
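A minimal sketch of this update propagation, under the assumption of hypothetical types standing in for the first window, the projected third window, and the updated screen projection data:

```kotlin
// Minimal sketch of update propagation; the types and callback are assumptions.
class MirroredWindow(var content: ByteArray)       // the third window on the peer

class ProjectionSource(private val sendUpdate: (ByteArray) -> Unit) {
    var content: ByteArray = ByteArray(0)          // content of the first window
        private set

    // Called for a local edit (second operation) or an accepted remote update request
    // (third operation relayed by the peer).
    fun updateContent(newContent: ByteArray) {
        content = newContent
        sendUpdate(newContent)                     // updated first screen projection data
    }
}

// On the peer: refresh the third window when updated projection data arrives.
fun onUpdatedProjectionData(target: MirroredWindow, newContent: ByteArray) {
    target.content = newContent
}
```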
In one possible design, after the first electronic device displays the first window and the fourth window, the method further includes:
the first electronic device displays a first control option;
before the first electronic device updates the first content in the first window in response to the received first update request, the method further includes:
the first electronic device receives a fourth operation acting on the first control option.
In this method, after the second electronic device projects to the first electronic device, the user can set whether the first electronic device accepts control by the second electronic device, which improves the flexibility of inter-device screen cooperative control.
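The gating behaviour of the first control option might be sketched as follows; the class and method names are assumptions made for illustration only.

```kotlin
// Sketch of the first control option: remote update requests are honoured only after
// the user enables reverse control. Class and method names are illustrative assumptions.
class ReverseControlGate {
    private var remoteControlAllowed = false

    // The fourth operation acts on the first control option.
    fun onUserTogglesControlOption(enabled: Boolean) {
        remoteControlAllowed = enabled
    }

    // A first update request from the peer is applied only when permitted.
    fun onRemoteUpdateRequest(applyUpdate: () -> Unit) {
        if (remoteControlAllowed) applyUpdate() else println("remote control not permitted")
    }
}
```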
In one possible design, before the second electronic device sends the second screen projection data to the first electronic device, the method further comprises:
the second electronic device responds to the received fifth operation and sends a second screen projection request to the first electronic device, wherein the second screen projection request is used for requesting to project the second content to the first electronic device;
before the first electronic device displays the first window and the fourth window, the method further comprises:
and the first electronic device displays second prompt information, wherein the second prompt information is used for prompting the user whether to accept the second electronic device projecting the second content to the first electronic device.
In this method, the second electronic device requests screen projection to the first electronic device according to a user operation, and the first electronic device displays corresponding prompt information after receiving the request, so that whether to accept the projection is determined according to a user operation. Therefore, the first electronic device and the second electronic device can perform inter-device screen cooperative control based on user requirements, which provides a good user experience.
In one possible design, after the first electronic device displays the first window and the fourth window, the method further includes:
the second electronic device updates the second content in the second window in response to a received sixth operation; or
The first electronic device responds to the received seventh operation and sends a second updating request to the second electronic device, wherein the second updating request is used for requesting the second electronic device to update the second content in the second window;
the second electronic device updates the second content in the second window in response to the received second update request.
In this method, the user can control the updating of the content in the second window of the second electronic device from either the second electronic device side or the first electronic device side, which greatly improves the flexibility and convenience of device control during inter-device screen cooperation. In addition, this supports the first electronic device that receives the projection controlling updates to the display window of the second electronic device that initiates the projection.
In one possible design, when the second electronic device updates the second content in the second window, the method further includes:
the second electronic device sends updated second screen projection data to the first electronic device;
and the first electronic device updates the second content in the fourth window according to the received updated second screen projection data.
In the method, when the second electronic device updates the display content of the second window of the second electronic device, the updated content is indicated to the first electronic device, and the first electronic device can synchronously update the screen projection window corresponding to the second window, namely the fourth window.
In one possible design, after the second electronic device displays the second window and a third window, the method further includes:
the second electronic device displays a second control option;
before the second electronic device updates the second content in the second window in response to the received second update request, the method further includes:
the second electronic device receives an eighth operation acting on the second control option.
In this method, after the first electronic device projects to the second electronic device, the user can set whether the second electronic device accepts control by the first electronic device, which improves the flexibility of inter-device screen cooperative control.
In one possible design, the display area of the first window is different from that of the fourth window, or the display area of the first window covers a partial area in the display area of the fourth window, or the display area of the fourth window covers a partial area in the display area of the first window.
In this method, when the first electronic device simultaneously displays its own first window and the fourth window projected by the second electronic device, multiple different display modes can be adopted, which improves the flexibility of inter-device collaborative screen display and meets the display requirements of multiple application scenarios.
In one possible design, the display area of the second window is different from the display area of the third window, or the display area of the second window covers a partial area in the display area of the third window, or the display area of the third window covers a partial area in the display area of the second window.
In this method, when the second electronic device simultaneously displays its own second window and the third window projected by the first electronic device, multiple different display modes can be adopted, which improves the flexibility of inter-device collaborative screen display and meets the display requirements of multiple application scenarios.
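The three layout relationships described in the two designs above could, for illustration, be expressed as in the following sketch; the Rect type and the concrete split ratios are assumptions, not values given in the application.

```kotlin
// Illustrative sketch of the three layout relationships; Rect and the concrete split
// ratios are assumptions, not values from the application.
data class Rect(val left: Int, val top: Int, val right: Int, val bottom: Int)

enum class LayoutMode { SIDE_BY_SIDE, LOCAL_OVER_PROJECTED, PROJECTED_OVER_LOCAL }

// Returns (local window area, projected window area) for a screen of the given size.
fun layoutWindows(width: Int, height: Int, mode: LayoutMode): Pair<Rect, Rect> {
    val full = Rect(0, 0, width, height)
    return when (mode) {
        // Display areas do not overlap, e.g. a split screen.
        LayoutMode.SIDE_BY_SIDE ->
            Rect(0, 0, width / 2, height) to Rect(width / 2, 0, width, height)
        // The local window covers a partial area of the projected window.
        LayoutMode.LOCAL_OVER_PROJECTED ->
            Rect(0, 0, width / 3, height / 3) to full
        // The projected window covers a partial area of the local window.
        LayoutMode.PROJECTED_OVER_LOCAL ->
            full to Rect(2 * width / 3, 0, width, height / 3)
    }
}
```

Swapping the two elements of the returned pair gives the corresponding layouts on the peer device for its own window and the window projected to it.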
In one possible design, the system further includes a third electronic device, after the first electronic device displays the first window and the fourth window, the method further includes:
the first electronic device and the third electronic device establish a communication connection;
the third electronic device displays a fifth window, wherein the fifth window comprises third content;
the first electronic device sends the first screen projection data to the third electronic device;
the third electronic device receives the first screen projection data and displays the fifth window and a sixth window, wherein the sixth window comprises the first content;
the third electronic device sends third screen projection data to the first electronic device, wherein the third screen projection data comprises information of the third content;
and the first electronic device receives the third screen projection data and displays the first window and a seventh window, wherein the seventh window comprises the third content.
In this method, after the first electronic device and the second electronic device establish bidirectional screen projection, the first electronic device and the third electronic device can also establish bidirectional screen projection, so that the first electronic device can project to and control multiple electronic devices while simultaneously receiving projection and control from those devices, which extends inter-device screen cooperation to multi-device scenarios.
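One way to picture the multi-device case is one bidirectional session per peer, as in the sketch below; all names are hypothetical.

```kotlin
// Hypothetical sketch of one bidirectional session per peer in the multi-device case;
// all names are assumptions made for illustration.
class PeerProjection(val sourceDevice: String, val payload: ByteArray)

class CooperationSession(val peerName: String, private val send: (PeerProjection) -> Unit) {
    var projectedFromPeer: PeerProjection? = null    // shown in the fourth / seventh window

    fun projectTo(local: PeerProjection) = send(local)
    fun onPeerProjection(data: PeerProjection) { projectedFromPeer = data }
}

class MultiDeviceCoordinator(private val localName: String) {
    private val sessions = mutableMapOf<String, CooperationSession>()

    fun connect(peer: String, send: (PeerProjection) -> Unit) {
        sessions[peer] = CooperationSession(peer, send)
    }

    // The same local window is projected to every connected peer, while each peer's
    // projection is kept in its own session and shown in its own window locally.
    fun projectLocalWindow(content: ByteArray) {
        val data = PeerProjection(localName, content)
        sessions.values.forEach { it.projectTo(data) }
    }
}
```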
In a second aspect, an embodiment of the present application provides an inter-device screen coordination method, which is applied to a first electronic device, and the method includes:
establishing a communication connection with a second electronic device;
displaying a first window, wherein the first window comprises first content;
sending first screen projection data to the second electronic device, wherein the first screen projection data comprises information of the first content;
receiving second screen projection data from the second electronic device, wherein the second screen projection data comprises information of second content, and the second content is content included in a second window displayed by the second electronic device;
and displaying the first window and a fourth window, wherein the fourth window comprises the second content.
In one possible design, before sending the first screen projection data to the second electronic device, the method further includes:
and in response to a received first operation, sending a first screen projection request to the second electronic device, wherein the first screen projection request is used for requesting to project the first content to the second electronic device.
In one possible design, prior to receiving second screen projection data from the second electronic device, the method further includes:
and displaying second prompt information, wherein the second prompt information is used for prompting the user whether to accept the second electronic device projecting the second content to the first electronic device.
In one possible design, after sending the first screen projection data to the second electronic device, the method further includes:
updating the first content in the first window in response to the received second operation; or
Updating the first content in the first window in response to a first update request received from the second electronic device, wherein the first update request is used for requesting the first electronic device to update the first content in the first window.
In one possible design, when updating the first content in the first window, the method further includes:
and sending the updated first screen projection data to the second electronic device.
In one possible design, after displaying the first window and the fourth window, the method further includes:
displaying a first control option;
before updating the first content in the first window in response to the first update request from the second electronic device, the method further comprises:
receiving a fourth operation acting on the first control option.
In one possible design, after displaying the first window and the fourth window, the method further includes:
in response to the received seventh operation, sending a second update request to the second electronic device, where the second update request is used to request the second electronic device to update the second content in the second window.
In one possible design, after sending the second update request to the second electronic device, the method further includes:
updating the second content in the fourth window in response to receiving updated second screen projection data from the second electronic device.
In one possible design, the display area of the first window is different from that of the fourth window, or the display area of the first window covers a partial area in the display area of the fourth window, or the display area of the fourth window covers a partial area in the display area of the first window.
In one possible design, after displaying the first window and the fourth window, the method further includes:
establishing a communication connection with a third electronic device;
sending the first screen projection data to the third electronic equipment;
receiving third screen projection data from the third electronic device, wherein the third screen projection data comprises information of third content, and the third content is content included in a fifth window displayed by the third electronic device;
and displaying the first window and a seventh window, wherein the seventh window comprises the third content.
In a third aspect, an embodiment of the present application provides an electronic device, which includes a display screen, a memory, and one or more processors;
wherein the memory is configured to store computer program code comprising computer instructions; and when the instructions are executed by the one or more processors, the electronic device is enabled to perform the method according to the second aspect or any possible design of the second aspect.
In a fourth aspect, an embodiment of the present application provides a chip, where the chip is coupled with a memory in an electronic device, so that the chip, when running, invokes a computer program stored in the memory, to implement the method according to the first aspect or any one of the possible designs provided by the first aspect of the embodiment of the present application, or to implement the method according to the second aspect or any one of the possible designs provided by the second aspect of the embodiment of the present application.
In a fifth aspect, the present application provides a computer storage medium storing a computer program which, when run on an electronic device, causes the electronic device to perform the method according to the first aspect or any possible design of the first aspect, or the method according to the second aspect or any possible design of the second aspect of the present application.
In a sixth aspect, the present application provides a computer program product, which when run on an electronic device, causes the electronic device to perform the method of the first aspect or any one of the possible designs of the first aspect, or to implement the method of the second aspect or any one of the possible designs provided by the second aspect of the present application.
Drawings
Fig. 1 is a schematic diagram of cross-device cooperative screen projection provided in an embodiment of the present application;
fig. 2 is a schematic diagram of an inter-device screen collaboration system architecture according to an embodiment of the present application;
fig. 3 is a schematic structural diagram of an electronic device according to an embodiment of the present disclosure;
fig. 4 is a schematic structural diagram of an android operating system of an electronic device according to an embodiment of the present application;
fig. 5 is a schematic diagram of a method for screen collaboration between devices according to an embodiment of the present application;
fig. 6a is a schematic diagram of a method for establishing a screen collaboration between devices according to an embodiment of the present application;
fig. 6b is a schematic view of a display screen of a first electronic device according to an embodiment of the present application;
fig. 6c is a schematic view of a display screen of a second electronic device according to an embodiment of the present application;
fig. 6d is a schematic view of a display screen of a second electronic device accepting screen projection according to an embodiment of the present application;
fig. 6e is a schematic view of an updated display screen of a first electronic device according to an embodiment of the present application;
fig. 6f is a schematic view of a display screen updated after the second electronic device receives screen projection according to an embodiment of the present application;
fig. 7a is a schematic diagram of a method for establishing bidirectional inter-device screen collaboration according to an embodiment of the present application;
fig. 7b is a schematic view of a display screen of a second electronic device accepting screen projection according to an embodiment of the present application;
fig. 7c is a schematic view of a display screen of a first electronic device according to an embodiment of the present application;
fig. 7d is a schematic view of a display screen of a first electronic device accepting screen projection according to an embodiment of the present application;
fig. 7e is a schematic view of an updated display screen of a second electronic device according to an embodiment of the present application;
fig. 7f is a schematic view of a display screen updated after the first electronic device receives screen projection according to an embodiment of the present application;
fig. 8 is a schematic flowchart of bidirectional inter-device screen cooperative control according to an embodiment of the present application;
fig. 9 is a schematic effect diagram of a bidirectional inter-device screen coordination method according to an embodiment of the present application;
fig. 10 is a schematic effect diagram of a bidirectional inter-device screen coordination method according to an embodiment of the present application;
fig. 11 is a schematic effect diagram of a bidirectional inter-device screen coordination method according to an embodiment of the present application;
fig. 12 is a schematic diagram of a bidirectional inter-device screen coordination control method according to an embodiment of the present application;
fig. 13a is a schematic view of a display screen of a first electronic device accepting screen projection according to an embodiment of the present application;
fig. 13b is a schematic view of a display screen of a first electronic device receiving screen projections from a plurality of electronic devices according to an embodiment of the present application;
fig. 14 is a schematic diagram of a method for controlling screen coordination between devices according to an embodiment of the present application;
fig. 15 is a schematic diagram of screen cooperative control between devices according to an embodiment of the present application;
fig. 16 is a schematic diagram of screen cooperative control between devices according to an embodiment of the present application;
fig. 17 is a schematic diagram of a method for screen collaboration between devices according to an embodiment of the present application;
fig. 18 is a schematic view of an electronic device according to an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions, and advantages of the embodiments of the present application clearer, the embodiments of the present application are described in further detail below with reference to the accompanying drawings. In the description of the embodiments of the present application, the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more of that feature.
For ease of understanding, an explanation of concepts related to the present application is given by way of example for reference, as follows:
1) An electronic device is a device with a display screen. In some embodiments of the present application, the electronic device may be a portable device, such as a mobile phone, a tablet computer, a wearable device with a wireless communication function (e.g., a watch, a bracelet, a helmet, a headset, etc.), an in-vehicle terminal device, an Augmented Reality (AR)/Virtual Reality (VR) device, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, a Personal Digital Assistant (PDA), a smart home device (e.g., a smart television, etc.), or another smart terminal device.
In some embodiments of the present application, the electronic device may also be a portable terminal device that includes other functions, such as a personal digital assistant function and/or a picture display function. Exemplary embodiments of the portable terminal device include, but are not limited to, devices running a mobile operating system or another operating system. The above-described portable terminal device may also be another portable terminal device, such as a laptop computer (Laptop) with a display screen. It should also be understood that in some other embodiments of the present application, the electronic device may not be a portable terminal device, but may be a desktop computer having a display screen.
2) Screen coordination, also called multi-screen coordination, screen projection, co-screen, fly-screen, or screen sharing, means that a picture output and displayed on a screen a of a device A (such as a mobile phone, a tablet, a notebook, or a computer) is displayed in real time in a set area of a screen b of a device B (such as a tablet, a notebook, a computer, a television, a kiosk, or a projector). At the same time, a picture change on screen a caused by operating device A is synchronously displayed in the set area of screen b, and a picture change in the set area of screen b caused by operating that area on device B is synchronously displayed in the picture of screen a.
3) Middleware is software interposed between the operating system and applications, used to connect software components of the operating system with users' application software; it is a common service located between the platform (hardware and operating system) and applications. Middleware uses the basic services (functions) provided by system software to link the parts of an application system, or different applications on a network, so as to achieve resource sharing and function sharing. Middleware is an independent system-level software service program that supports distributed computing and can provide interactive functions for applications or services transparently across networks and hardware.
It should be understood that "at least one" in the embodiments of the present application means one or more, and "a plurality" means two or more. "And/or" describes the association relationship of the associated objects and indicates that three relationships may exist. For example, A and/or B may mean: A alone, both A and B, or B alone, where A and B may be singular or plural. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship. "At least one of the following" or a similar expression refers to any combination of these items, including any combination of singular or plural items. For example, at least one of a, b, or c may represent: a, b, c, a and b, a and c, b and c, or a, b and c, where a, b, and c may each be singular or plural.
Referring to fig. 1, a schematic diagram of cross-device cooperative screen projection provided in an embodiment of the present application is shown. As shown in fig. 1, multi-device cooperation currently supports a device A (e.g., a mobile phone) cooperatively projecting its screen onto a device B (e.g., a tablet computer), so that a display window corresponding to the entire display screen of device A is output on the display screen of device B. After device A projects the display window corresponding to its display screen to device B, both device A and device B can control that display window: control operations and their results in the display window on device A are synchronously displayed in the corresponding display window on the display screen of device B, and control operations and their results in the corresponding display window on device B are synchronously displayed in the display window on the display screen of device A.
However, in the above method, after the display window of device A is projected to device B, the display window of device B itself cannot simultaneously be projected to device A for display. Device A and device B therefore cannot perform bidirectional cooperative control at the same time, which fails to meet users' service requirements for bidirectional screen projection and bidirectional cooperation.
In view of this, the embodiment of the present application provides an inter-device screen coordination method, which is applied to a scene of cross-device screen projection control.
Fig. 2 is a schematic diagram of an inter-device screen collaboration system architecture according to an embodiment of the present application. As shown in fig. 2, the system architecture may include: a first electronic device 201 (e.g., a mobile phone as shown in the figure) and a second electronic device 202 (e.g., a tablet computer as shown in the figure). Wherein the second electronic device 202 may be any one of at least one electronic device connected with the first electronic device 201; the first electronic device 201 and the second electronic device 202 may respectively project the display windows on their own display screens onto the display screens of the other parties for display, so as to implement bidirectional screen projection cooperation and support mutual operation control. The display window on the display screen is a whole window corresponding to the display screen or a local window corresponding to a certain application displayed on the display screen.
In this system, the first electronic device 201 and the second electronic device 202 can communicate with each other. Optionally, the first electronic device 201 and the second electronic device 202 access the same local area network. The first electronic device 201 and the second electronic device 202 can communicate through a short-range wireless communication technology such as bluetooth and Wi-Fi. Optionally, the first electronic device 201 and the second electronic device 202 are connected in a wired manner and communicate with each other. The first electronic device 201 and the second electronic device 202 may communicate with each other by connecting them via a data line, such as a Universal Serial Bus (USB) data line.
In an example in which the first electronic device 201 and the second electronic device 202 access the same local area network, specifically, the first electronic device 201 and the second electronic device 202 may establish a wireless connection with the same wireless access point.
In addition, the first electronic device 201 and the second electronic device 202 may access the same wireless fidelity (Wi-Fi) hotspot, or, for example, the first electronic device 201 and the second electronic device 202 may access the same Bluetooth beacon through a Bluetooth protocol. For another example, the first electronic device 201 and the second electronic device 202 may also trigger a communication connection through a near field communication (NFC) tag and transmit encrypted information through a Bluetooth module to perform identity authentication. After the authentication succeeds, data transmission is performed in a point-to-point (P2P) manner.
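For illustration only, the connection options listed above could be modelled roughly as follows; the Transport enum and the authenticate callback are assumptions and not a real device API.

```kotlin
// Rough illustration of the connection options above; the Transport enum and the
// authenticate callback are assumptions, not a real device API.
enum class Transport { SAME_WLAN, BLUETOOTH, USB_CABLE, NFC_TRIGGERED_P2P }

data class Connection(val transport: Transport, val authenticated: Boolean)

fun establishConnection(transport: Transport, authenticate: () -> Boolean): Connection? =
    when (transport) {
        Transport.SAME_WLAN, Transport.BLUETOOTH, Transport.USB_CABLE ->
            Connection(transport, authenticated = true)
        Transport.NFC_TRIGGERED_P2P ->
            // The NFC tag triggers the connection; identity is verified over Bluetooth
            // before point-to-point (P2P) data transmission begins.
            if (authenticate()) Connection(transport, authenticated = true) else null
    }
```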
In some embodiments of the present application, the first electronic device 201 and the second electronic device 202 are smart devices with an output display function, such as a mobile phone, a tablet, a computer, a smart television, and the like.
It should be noted that the system shown in fig. 2 does not limit the applicable scenarios of the inter-device screen collaboration method provided in the present application.
It should be further noted that, the present application also does not limit the number of the electronic devices in the inter-device screen collaboration system, and the inter-device screen collaboration system may include two electronic devices, or may include more electronic devices, for example, may include 3 or 4 electronic devices, and the like.
In some embodiments of the present application, when the inter-device screen coordination system includes a plurality of electronic devices, any two of the electronic devices may execute the inter-device screen coordination method provided in the present application, so as to implement bidirectional screen projection and coordination control. For example, when the inter-device screen coordination system includes 3 electronic devices, namely, a device a, a device B, and a device C, the inter-device screen coordination may be that both the device a and the device B project screens to the device C, and the device C projects screens to the device a and the device B, respectively; or, the screen cooperation among the devices may be that the device a respectively projects a screen to the device B and the device C, the device B respectively projects a screen to the device a and the device C, and the device C respectively projects a screen to the device a and the device B; alternatively, the screen coordination among the devices may be that the device a projects a screen to the device B, the device B projects a screen to the device C, the device C projects a screen to the device a, and the like, which are not listed here.
Referring to fig. 3, a structure of an electronic device to which the method provided in the embodiment of the present application is applied is described.
As shown in fig. 3, the electronic device 300 may include a processor 310, an external memory interface 320, an internal memory 321, a USB interface 330, a charging management module 340, a power management module 341, a battery 342, an antenna 1, an antenna 2, a mobile communication module 350, a wireless communication module 360, an audio module 370, a speaker 370A, a receiver 370B, a microphone 370C, an earphone interface 370D, a sensor module 380, keys 390, a motor 391, an indicator 392, a camera 393, a display screen 394, a SIM card interface 395, and the like. Wherein the sensor module 380 may include a gyroscope sensor, an acceleration sensor, a proximity light sensor, a fingerprint sensor, a touch sensor, a temperature sensor, a pressure sensor, a distance sensor, a magnetic sensor, an ambient light sensor, an air pressure sensor, a bone conduction sensor, and the like.
It is to be understood that the electronic device shown in fig. 3 is merely an example and does not constitute a limitation on the electronic device; the electronic device may have more or fewer components than those shown in the drawings, may combine two or more components, or may have a different configuration of components. The various components shown in fig. 3 may be implemented in hardware, software, or a combination of hardware and software, including one or more signal processing and/or application specific integrated circuits.
Processor 310 may include one or more processing units, such as: the processor 310 may include an Application Processor (AP), a modem processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a memory, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a Neural-Network Processing Unit (NPU), among others. The different processing units may be separate devices or may be integrated into one or more processors. The controller may be, among other things, a neural center and a command center of the electronic device 300. The controller can generate an operation control signal according to the instruction operation code and the timing signal to complete the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 310 for storing instructions and data. In some embodiments, the memory in the processor 310 is a cache memory. The memory may hold instructions or data that have just been used or recycled by the processor 310. If the processor 310 needs to reuse the instruction or data, it can be called directly from the memory. Avoiding repeated accesses reduces the latency of the processor 310, thereby increasing the efficiency of the system.
Execution of the inter-device screen coordination method provided in this embodiment of the present application may be controlled by the processor 310, or completed by the processor 310 calling other components, for example, calling a processing program stored in the internal memory 321 to control the wireless communication module 360 to communicate data with other electronic devices, so as to implement inter-device screen coordination, improve the coordination control efficiency of the electronic devices, and improve the user experience. The processor 310 may include different devices. For example, when a CPU and a GPU are integrated, the CPU and the GPU may cooperate to execute the inter-device screen cooperation method provided in this embodiment of the present application; for example, some algorithms in the inter-device screen cooperation method are executed by the CPU and others are executed by the GPU, to achieve higher processing efficiency.
The display screen 394 is used to display images, video, and the like. The display screen 394 includes a display panel. The display panel may adopt a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the electronic device 300 may include 1 or N display screens 394, where N is a positive integer greater than 1. The display screen 394 may be used to display information input by or provided to the user as well as various graphical user interfaces (GUIs). For example, the display screen 394 may display a photograph, a video, a web page, or a file. As another example, the display screen 394 may display a graphical user interface of an electronic device as shown in fig. 2. For example, the graphical user interface of the electronic device shown in fig. 2 includes a status bar, a Dock bar, a concealable navigation bar, a time and weather widget, and icons of applications, such as a browser icon. The status bar includes the name of the operator (e.g., China Mobile), the mobile network (e.g., 4G), the time, and the remaining power. Further, it is understood that in some embodiments, a Bluetooth icon, a Wi-Fi icon, an add-on icon, and the like may also be included in the status bar. It is further understood that, in other embodiments, a Dock bar may be included in the graphical user interface of the electronic device shown in fig. 2, and commonly used application icons may be included in the Dock bar. When the processor 310 detects a touch event of a user's finger (or a stylus, etc.) on an application icon, in response to the touch event, the user interface of the application corresponding to the application icon is opened and displayed on the display screen 394.
In this embodiment, the display screen 394 may be an integrated flexible display screen, or a spliced display screen formed by two rigid screens and a flexible screen located between the two rigid screens may be used. After the processor 310 executes the inter-device screen coordination method provided in the embodiment of the present application, the processor 310 may control the display screen 394 to display the relevant result.
The camera 393 (front or rear, or a camera that can serve as both front and rear) is used to capture still images or video. In general, the camera 393 may include a lens group and a photosensitive element such as an image sensor, where the lens group includes a plurality of lenses (convex or concave) for collecting light signals reflected by the object to be photographed and transferring the collected light signals to the image sensor. The image sensor generates an original image of the object to be photographed according to the optical signal.
The internal memory 321 may be used to store computer-executable program code, which includes instructions. The processor 310 executes various functional applications of the electronic device 300 and data processing by executing instructions stored in the internal memory 321. The internal memory 321 may include a program storage area and a data storage area. The storage program area may store codes of an operating system, an application program (such as an inter-device screen cooperation function, etc.), and the like. The storage data area may store data created during the usage of the electronic device 300 (e.g., process data generated by performing the inter-device screen coordination function provided by the embodiment of the present application), and the like.
The internal memory 321 may further store one or more computer programs corresponding to the inter-device screen cooperation algorithm provided in the embodiments of the present application. The one or more computer programs stored in the internal memory 321 and configured to be executed by the one or more processors 310 include instructions that may be used to perform the steps in the following embodiments.
In addition, the internal memory 321 may include a high-speed random access memory, and may further include a nonvolatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash storage (UFS), or the like.
Of course, the codes of the inter-device screen cooperation algorithm provided in the embodiments of the present application may also be stored in the external memory. In this case, the processor 310 may execute the code of the inter-device screen cooperation algorithm stored in the external memory through the external memory interface 320.
The sensor module 380 may include a gyroscope sensor, an acceleration sensor, a proximity light sensor, a fingerprint sensor, a touch sensor, and the like.
A touch sensor is also known as a "touch panel". The touch sensor may be disposed on the display screen 394, and the touch sensor and the display screen 394 form a touchscreen, also called a "touch screen". The touch sensor is used to detect a touch operation applied to it or near it. The touch sensor can pass the detected touch operation to the application processor to determine the touch event type. Visual output related to the touch operation may be provided through the display screen 394. In other embodiments, the touch sensor may be disposed on a surface of the electronic device 300 at a different location than the display screen 394.
Illustratively, the display screen 394 of the electronic device 300 displays a main interface including icons of a plurality of applications (e.g., a camera application, a WeChat application, etc.). The user clicks the icon of the camera application in the main interface through the touch sensor, triggering the processor 310 to start the camera application and open the camera 393. The display screen 394 then displays an interface of the camera application, such as a viewfinder interface.
The wireless communication function of the electronic device 300 may be implemented by the antenna 1, the antenna 2, the mobile communication module 350, the wireless communication module 360, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in electronic device 300 may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 350 may provide a solution including 2G/3G/4G/5G wireless communication applied to the electronic device 300. The mobile communication module 350 may include at least one filter, a switch, a power amplifier, a Low Noise Amplifier (LNA), and the like. The mobile communication module 350 may receive the electromagnetic wave from the antenna 1, filter, amplify, etc. the received electromagnetic wave, and transmit the filtered electromagnetic wave to the modem processor for demodulation. The mobile communication module 350 may also amplify the signal modulated by the modem processor, and convert the signal into electromagnetic wave through the antenna 1 to radiate the electromagnetic wave. In some embodiments, at least some of the functional modules of the mobile communication module 350 may be disposed in the processor 310. In some embodiments, at least some of the functional modules of the mobile communication module 350 may be disposed in the same device as at least some of the modules of the processor 310. In the embodiment of the present application, the mobile communication module 350 may also be configured to perform information interaction with other electronic devices, for example, send a screen-projecting or screen-projecting window-updating instruction to the other electronic devices, or the mobile communication module 350 may be configured to receive a screen-projecting or screen-projecting window-updating instruction sent by the other electronic devices.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating a low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then passes the demodulated low frequency baseband signal to a baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs a sound signal through an audio device (not limited to the speaker 370A, the receiver 370B, etc.) or displays an image or video through the display screen 394. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be separate from the processor 310, and may be disposed in the same device as the mobile communication module 350 or other functional modules.
The wireless communication module 360 may provide solutions for wireless communication applied to the electronic device 300, including Wireless Local Area Networks (WLANs) (e.g., wireless fidelity (Wi-Fi) networks), Bluetooth (BT), Global Navigation Satellite System (GNSS), Frequency Modulation (FM), Near Field Communication (NFC), Infrared (IR), and the like. The wireless communication module 360 may be one or more devices integrating at least one communication processing module. The wireless communication module 360 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering processing on electromagnetic wave signals, and transmits the processed signals to the processor 310. The wireless communication module 360 may also receive a signal to be transmitted from the processor 310, frequency-modulate and amplify the signal, and convert the signal into electromagnetic waves via the antenna 2 to radiate the electromagnetic waves. In this embodiment, the wireless communication module 360 is configured to establish a connection with other electronic devices for data interaction. Or the wireless communication module 360 may be used to access the access point device, send a screen projection or screen projection window update instruction to the other electronic device, or receive a screen projection or screen projection window update instruction sent by the other electronic device.
For example, the first electronic device and the second electronic device shown in fig. 2 may receive or transmit instructions and data related to screen projection through the mobile communication module 350 or the wireless communication module 360, so as to implement the inter-device screen coordination function.
In addition, the electronic device 300 may implement an audio function through the audio module 370, the speaker 370A, the receiver 370B, the microphone 370C, the earphone interface 370D, and the application processor. Such as music playing, recording, etc. The electronic device 300 may receive key 390 inputs, generating key signal inputs related to user settings and function control of the electronic device 300. Electronic device 300 may generate a vibration alert (such as an incoming call vibration alert) using motor 391. Indicator 392 in electronic device 300 may be an indicator light that may be used to indicate a charging status, a change in charge, or a message, missed call, notification, etc. The SIM card interface 395 in the electronic device 300 is used for connecting a SIM card. The SIM card can be attached to and detached from the electronic device 300 by being inserted into and removed from the SIM card interface 395.
It should be understood that, in practical applications, the electronic device 300 may include more or fewer components than those shown in fig. 3, and the embodiment of the present application is not limited thereto. The illustrated electronic device 300 is merely an example; it may combine two or more components or have a different configuration of components. The various components shown in the figures may be implemented in hardware, software, or a combination of hardware and software, including one or more signal processing and/or application-specific integrated circuits.
The software system of the electronic device 300 may employ a layered architecture, an event-driven architecture, a micro-core architecture, a micro-service architecture, or a cloud architecture. The embodiment of the invention takes an Android system with a layered architecture as an example, and exemplarily illustrates a software structure of an electronic device. Fig. 4 is a block diagram of a software structure of an electronic device according to an embodiment of the present invention. For example, fig. 4 is a schematic diagram of a software architecture that can be run in the first electronic device or the second electronic device. The layered architecture divides the software into several layers, each layer having a clear role and division of labor. The layers communicate with each other through a software interface. As shown in fig. 4, the software architecture may be divided into five layers, which are an application layer, an application framework layer, an android runtime and system library, a hardware abstraction layer, and a Linux kernel layer.
The application layer is the top layer of the operating system and includes native applications of the operating system, such as email clients, Bluetooth, camera, music, video, text messages, calls, calendar, browser, contacts, etc. An application (app) in the embodiments of the present application is a software program capable of implementing one or more specific functions. Generally, a plurality of applications can be installed in a terminal device, for example, a camera application, a mailbox application, a smart home control application, and the like. The applications mentioned below may be system applications installed when the terminal device leaves the factory, or third-party applications downloaded from a network or acquired from another terminal device by the user while using the terminal device.
Of course, a developer may also write an application program and install it in this layer.
In some embodiments of the present application, the application layer may include an interface setting service, which is used to implement presentation of a setting interface, where the setting interface may be used for a user to set an inter-device screen cooperation function of the terminal device. For example, the user can perform on or off setting of the inter-device screen cooperation function in the setting interface. For example, the setting interface may be content in a status bar or a notification bar displayed on a touch screen of the terminal device, or may be a control interface related to a device control function displayed on the touch screen of the terminal device.
In a possible implementation manner, an application program may be developed in the Java language by calling an application programming interface (API) provided by the application framework layer, and a developer may interact with the lower layers of the operating system (e.g., the hardware abstraction layer and the kernel layer) through the application framework layer to develop an application program. The application framework layer is mainly a series of services and management systems of the operating system.
The application framework layer provides an Application Programming Interface (API) and a programming framework for the application program of the application layer. The application framework layer may include some predefined functions. As shown in FIG. 4, the application framework layers may include a window manager, content provider, view system, phone manager, resource manager, notification manager, and the like.
The window manager is used for managing window programs. The window manager can obtain the size of the display screen, determine whether there is a status bar, lock the screen, capture the screen, and the like. Content providers are used to store and retrieve data and make it accessible to applications. The view system may be used to build applications; a display interface may be composed of one or more views. The phone manager is used to provide the communication functions of the terminal device. The notification manager enables an application to display notification information in the status bar and can be used to convey notification-type messages, which may disappear automatically after a short stay without user interaction.
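As an illustration of the window manager capability mentioned above (obtaining the size of the display screen), the following minimal Java sketch uses the standard Android WindowManager API. It is provided only as a generic example of a framework-layer service and is not part of the inter-device screen cooperation service described in this application; the helper class name is illustrative.

```java
import android.content.Context;
import android.graphics.Point;
import android.view.Display;
import android.view.WindowManager;

public final class DisplayInfoHelper {
    // Returns the display size in pixels via the window manager service.
    // (getDefaultDisplay()/getSize() are deprecated on newer API levels but
    // still illustrate the "obtain the size of the display screen" capability.)
    public static Point getDisplaySize(Context context) {
        WindowManager wm =
                (WindowManager) context.getSystemService(Context.WINDOW_SERVICE);
        Display display = wm.getDefaultDisplay();
        Point size = new Point();
        display.getSize(size);
        return size;
    }
}
```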
In this embodiment, the application framework layer may further include an inter-device screen cooperation service for controlling and implementing the inter-device screen cooperation function, and the inter-device screen cooperation service may include a multi-window framework service, which is mainly used to provide a device control function and a window display control function in cooperation with the inter-device screen cooperation service. For example, it can be used to manage a currently connected electronic device, manage a window displayed on a display screen, and the like.
In some embodiments of the present application, the inter-device screen collaboration service may further include a notification manager for information interaction with other data layers.
The android runtime includes a core library and a virtual machine. The android runtime is responsible for scheduling and managing the android system. The core library of the android system comprises two parts: one part is a function which needs to be called by the Java language, and the other part is a core library of the android system. The application layer and the application framework layer run in a virtual machine. Taking Java as an example, the virtual machine executes Java files of the application layer and the application framework layer as binary files. The virtual machine is used for performing the functions of object life cycle management, stack management, thread management, safety and exception management, garbage collection and the like.
The system library may include a plurality of functional modules, for example: a surface manager, media libraries, a three-dimensional graphics processing library (e.g., OpenGL ES), a two-dimensional graphics engine (e.g., SGL), and the like. The surface manager is used to manage the display subsystem and provides fusion of two-dimensional and three-dimensional layers for multiple applications. The media library supports playback and recording of a variety of commonly used audio and video formats, as well as still image files. The media library may support a variety of audio and video encoding formats, such as MPEG4, H.264, MP3, AAC, AMR, JPG, and PNG. The three-dimensional graphics processing library is used to implement three-dimensional graphics drawing, image rendering, composition, layer processing, and the like. The two-dimensional graphics engine is a drawing engine for two-dimensional drawing.
The hardware abstraction layer (HAL) supports the application framework and is an important link connecting the application framework layer and the Linux kernel layer; it can provide services to developers through the application framework layer.
The kernel layer provides the core system services of the operating system; security, memory management, process management, the network protocol stack, the driver model, and the like are implemented based on this layer. The kernel layer also acts as an abstraction layer between the hardware and the software stack. This layer contains many drivers related to the electronic device, the main ones being: the display driver; the Linux-based frame buffer driver; the keyboard driver as an input device; the flash driver based on memory technology devices; the camera driver; the audio driver; the Bluetooth driver; the Wi-Fi driver; and the like.
In some embodiments of the present application, the kernel layer, serving as the abstraction layer between the hardware and the software stack, includes a touch driver service, which is used to acquire and report the operation information, received by hardware components (e.g., the touch screen or touch sensors), that triggers a window update.
It should be noted that, in the inter-device screen collaboration system shown in fig. 2, both the first electronic device and the second electronic device may be implemented by the above hardware architecture and software architecture.
The following describes the inter-device screen coordination method provided in the embodiments of the present application with reference to the schematic diagram of the method shown in fig. 5 and the hardware structure and software structure of the electronic device shown in fig. 3 and fig. 4.
As shown in fig. 5, in the embodiment of the present application, the above-mentioned software structure is combined with a middleware and a soft bus structure, so as to implement bidirectional screen projection control between two electronic devices (a first electronic device and a second electronic device).
The middleware connects the application layer and the application framework layer of the operating system, and provides services for applications such as interface standardization, protocol unification, and shielding of specific operation details.
The soft bus resides at the link layer. It is a module that encapsulates the operating system's operations on resources shared by multiple processes, such as inter-process communication resources and shared memory, and it can provide task processes with standard interfaces for requesting, using, and reclaiming resources; task processes share these resources through the interfaces and the identifiers of the protocol.
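The following is a minimal sketch of what such standard resource interfaces for task processes might look like. All names (SoftBus, ResourceHandle, and the method names) are hypothetical and are only meant to illustrate the request/use/reclaim pattern described above; they are not an actual soft bus API.

```java
/** Hypothetical soft-bus resource interface; names are illustrative only. */
public interface SoftBus {
    /** Request (apply for) a shared resource such as an IPC channel or shared memory. */
    ResourceHandle acquire(String resourceId, String protocolTag);

    /** Use the resource, e.g., write encoded screen-projection data to it. */
    void write(ResourceHandle handle, byte[] data);

    /** Return (reclaim) the resource when the task process is done with it. */
    void release(ResourceHandle handle);
}

/** Opaque handle identifying an acquired shared resource. */
final class ResourceHandle {
    final int id;
    ResourceHandle(int id) { this.id = id; }
}
```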
In the embodiment of the application, the electronic device initiating the inter-device screen collaboration sends screen projection information to the application framework layer and the middleware through the application layer, where the screen projection information includes the screen projection data to be displayed on the projected screen, the screen projection target, and other information. After the middleware performs display stream coding on the screen projection data, the encoded data is passed to the link layer and sent by the soft bus to the electronic device receiving the screen projection. After the receiving electronic device receives the encoded data sent by the initiating electronic device, it performs display stream decoding on the encoded data to obtain the screen projection data, and reports the screen projection data to the application framework layer, which performs multi-window display according to the screen projection data.
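For illustration, the following Java sketch shows one way the display stream coding step on the initiating device could be realized with the standard Android MediaCodec API: the window content is rendered into the encoder's input surface, and each encoded frame is handed to the link layer for transmission. MediaCodec and MediaFormat are real Android APIs; LinkLayerSender and the class name are hypothetical stand-ins for the soft-bus transport, and the parameters (H.264, 4 Mbps, 30 fps) are assumptions for the sketch rather than values specified in this application.

```java
import android.media.MediaCodec;
import android.media.MediaCodecInfo;
import android.media.MediaFormat;
import android.view.Surface;

public final class DisplayStreamEncoder {
    /** Hypothetical stand-in for the link layer / soft bus transport. */
    public interface LinkLayerSender {
        void send(byte[] encodedFrame);
    }

    private final MediaCodec encoder;
    private final Surface inputSurface;
    private final LinkLayerSender sender;

    public DisplayStreamEncoder(int width, int height, LinkLayerSender sender)
            throws Exception {
        this.sender = sender;
        MediaFormat format = MediaFormat.createVideoFormat("video/avc", width, height);
        format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
                MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
        format.setInteger(MediaFormat.KEY_BIT_RATE, 4_000_000);
        format.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
        format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);
        encoder = MediaCodec.createEncoderByType("video/avc");
        encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
        // The window content to be projected would be rendered onto this surface.
        inputSurface = encoder.createInputSurface();
        encoder.start();
    }

    /** Drains one encoded frame from the codec and forwards it to the link layer. */
    public void drainOnce() {
        MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
        int index = encoder.dequeueOutputBuffer(info, 10_000);
        if (index >= 0) {
            java.nio.ByteBuffer buf = encoder.getOutputBuffer(index);
            byte[] frame = new byte[info.size];
            buf.position(info.offset);
            buf.get(frame, 0, info.size);
            sender.send(frame);
            encoder.releaseOutputBuffer(index, false);
        }
    }

    public Surface getInputSurface() { return inputSurface; }
}
```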
Specifically, as shown in fig. 5, after the first electronic device establishes a communication connection with the second electronic device, when the first electronic device casts a screen to the second electronic device, screen casting data to be projected to the second electronic device for display is sent to the middleware through the application layer, the middleware performs display stream coding on the screen casting data and sends the coded screen casting data to the link layer, and the link layer sends the coded screen casting data to the second electronic device.
As an alternative embodiment, the link layer sends the data to the link layer of the second electronic device over the soft bus using short-range communication means such as Bluetooth or Wi-Fi.
The link layer of the second electronic device receives the encoded screen projection data sent by the link layer of the first electronic device and sends it to the middleware; the middleware performs display stream decoding on the encoded screen projection data and sends the decoded screen projection data to the application framework layer, and the application framework layer displays the screen projection data. When the application framework layer of the second electronic device displays the screen projection data sent by the first electronic device, the second electronic device's own window data displayed on the current display screen is sent to the application framework layer through the application layer, and the application framework layer simultaneously outputs the window data and the screen projection data sent by the link layer on the display screen for display.
When the second electronic device cooperatively projects the screen to the first electronic device, screen projection data to be projected to the first electronic device for display are sent to the middleware through the application program layer, the middleware carries out display stream coding on the screen projection data and then sends the coded screen projection data to the link layer, and the link layer sends the coded screen projection data to the first electronic device.
As an alternative embodiment, the link layer sends the data to the link layer of the first electronic device over the soft bus using short-range communication.
The link layer of the first electronic device receives the encoded screen projection data sent by the link layer of the second electronic device and sends it to the middleware; the middleware performs display stream decoding on the encoded screen projection data and sends the decoded screen projection data to the application framework layer, and the application framework layer displays the screen projection data. When the application framework layer of the first electronic device displays the screen projection data sent by the second electronic device, the first electronic device's own window data displayed on the current display screen is sent to the application framework layer through the application layer, and the application framework layer simultaneously outputs the window data and the screen projection data sent by the link layer on the display screen for display.
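Correspondingly, the following sketch shows a possible receiver-side display stream decoding step: encoded frames received from the link layer are decoded into a surface that backs the screen projection window, while the device's own window continues to be displayed alongside it. MediaCodec and MediaFormat are real Android APIs; the class name, method names, and the assumption of an H.264 stream are illustrative only and are not taken from this application.

```java
import android.media.MediaCodec;
import android.media.MediaFormat;
import android.view.Surface;

public final class DisplayStreamDecoder {
    private final MediaCodec decoder;

    public DisplayStreamDecoder(int width, int height, Surface projectionWindowSurface)
            throws Exception {
        MediaFormat format = MediaFormat.createVideoFormat("video/avc", width, height);
        decoder = MediaCodec.createDecoderByType("video/avc");
        // Decoded frames are rendered directly into the projection window's surface;
        // the application framework layer composes it next to the local window.
        decoder.configure(format, projectionWindowSurface, null, 0);
        decoder.start();
    }

    /** Called by the link layer for each encoded frame received from the peer device. */
    public void onEncodedFrame(byte[] frame, long presentationTimeUs) {
        int inIndex = decoder.dequeueInputBuffer(10_000);
        if (inIndex >= 0) {
            java.nio.ByteBuffer buf = decoder.getInputBuffer(inIndex);
            buf.clear();
            buf.put(frame);
            decoder.queueInputBuffer(inIndex, 0, frame.length, presentationTimeUs, 0);
        }
        MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
        int outIndex = decoder.dequeueOutputBuffer(info, 10_000);
        if (outIndex >= 0) {
            decoder.releaseOutputBuffer(outIndex, /* render to surface */ true);
        }
    }
}
```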
The screen projection data that the first electronic device and the second electronic device cooperatively project to each other is the window data that each device itself generates and displays on its own display screen, that is, the data of the whole window corresponding to the display screen.
The first electronic device cooperatively projecting its screen to the second electronic device and the second electronic device cooperatively projecting its screen to the first electronic device may be performed separately, or the two directions may be projected simultaneously.
In the embodiment of the present application, as an optional implementation manner, the first electronic device and the second electronic device are in the same local area network, and communicate through a short-range communication technology. The same local area network may be a WiFi local area network, a bluetooth local area network, or the like.
In the above embodiment, the first electronic device and the second electronic device respectively establish information transmission routes for bidirectional inter-device screen collaboration, and can respectively transmit screen projection data to the peer device so that the screen projection data is displayed on the display screen of the peer device, thereby realizing bidirectional screen projection control and solving the problem that bidirectional collaborative sharing between devices cannot be realized.
The inter-device screen coordination method provided by the embodiment of the present application is described in detail below with reference to specific embodiments.
In some embodiments of the present application, a screen cooperation switch for controlling whether a screen cooperation function between the devices is activated is provided in each of the first electronic device and the second electronic device, and the screen cooperation switch may be turned on or off by a user. When the screen cooperation switch is turned on, the electronic equipment supports the screen cooperation function between the equipment, can execute the screen cooperation method between the equipment provided by the embodiment of the application, and can project screens to other electronic equipment or receive screen projection of other electronic equipment; when the screen cooperation switch is turned off, the electronic equipment does not support the screen cooperation function between the equipment.
In the embodiment of the present application, a first electronic device is taken as a device that initiates inter-device screen coordination first, and a second electronic device is taken as a device that initiates inter-device screen coordination later, as an example, that is, after the first electronic device initiates inter-device screen coordination and projects a screen to the second electronic device, the second electronic device initiates inter-device screen coordination again and projects a screen to the first electronic device.
It should be noted that, in the embodiment of the present application, establishing screen cooperation between the first electronic device and the second electronic device means that the first electronic device projects a screen to the second electronic device on the basis of establishing communication connection with the second electronic device; the second electronic device, in cooperation with the first electronic device, means that the second electronic device projects a screen to the first electronic device on the basis of establishing a communication connection with the first electronic device.
In some embodiments of the present application, the first electronic device may automatically initiate inter-device screen collaboration, or initiate inter-device screen collaboration according to an instruction of a user.
As an optional implementation manner, when detecting that the communication connection with the second electronic device can be established, the first electronic device may directly establish inter-device screen cooperation with the second electronic device, and project a display window of the first electronic device onto a display screen of the second electronic device for display.
As another optional implementation, when detecting that a communication connection with the second electronic device can be established, the first electronic device displays a screen projection button in a display window on the display screen, where the screen projection button is used to trigger the first electronic device to project the window displayed on the display screen onto the display screen of the second electronic device for display. Illustratively, as shown in fig. 6a, the first electronic device sends a screen projection request to the second electronic device in response to the user operating the screen projection button; after the second electronic device approves the request, the first electronic device cooperatively projects its screen to the second electronic device, projecting the display window a (also referred to as a first window in this application) displayed on the current display screen onto the display screen of the second electronic device, and the second electronic device simultaneously displays on its display screen its own display window B and a screen projection window a1 (also referred to as a third window in this application) corresponding to the display window of the first electronic device. In fig. 6a, window a is the display window of the first electronic device on its current display screen, window B is the display window of the second electronic device (also referred to as a second window in this application) on its current display screen, and window a1 is the screen projection window corresponding to window a after it is projected onto the display screen of the second electronic device.
Illustratively, when the first electronic device and the second electronic device are two different tablet computers, if the current display window of the first electronic device is the window corresponding to the desktop, then when the first electronic device (the tablet computer 1) detects that a communication connection can be established with the second electronic device (the tablet computer 2), it displays the screen projection button 601 in its own display window, as shown in fig. 6b. The user can trigger the first electronic device and the second electronic device to establish inter-device screen cooperation by clicking the screen projection button, and the first electronic device, in response to the user's operation of clicking the screen projection button, sends a screen projection request to the second electronic device, requesting to establish inter-device screen cooperation with the second electronic device. As shown in fig. 6c, after receiving the screen projection request, the second electronic device displays a prompt message in its display window to prompt the user to select whether to accept the screen projection of the first electronic device, for example, a query message "Accept screen projection?". If it is determined according to the user operation that the screen projection of the first electronic device is accepted, the second electronic device sends feedback information of accepting the screen projection to the first electronic device. After receiving the feedback information, the first electronic device sends data corresponding to its window to the second electronic device, and the second electronic device displays the window displayed by the first electronic device according to the received data, as shown in fig. 6d, where window 1 is the window displayed by the second electronic device before it accepts the screen projection, and window 2 is the window projected by the first electronic device onto the second electronic device, whose display content is the same as that of the window of the first electronic device.
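The request/prompt/feedback exchange described in this example could be organized as in the following sketch, in which the receiving device shows the "Accept screen projection?" prompt and returns feedback only after the user decides. All interface, class, and method names here are hypothetical and are not part of the application's actual message format.

```java
public final class ProjectionRequestHandler {
    /** Hypothetical channel back to the device that initiated the projection. */
    public interface PeerChannel {
        void sendAccept();
        void sendReject();
    }

    /** Hypothetical UI hook that shows the prompt and reports the user's choice. */
    public interface UserPrompt {
        void show(String message, java.util.function.Consumer<Boolean> onResult);
    }

    private final PeerChannel peer;
    private final UserPrompt prompt;

    public ProjectionRequestHandler(PeerChannel peer, UserPrompt prompt) {
        this.peer = peer;
        this.prompt = prompt;
    }

    /** Invoked when a screen-projection request arrives from the initiating device. */
    public void onProjectionRequest(String initiatorName) {
        prompt.show("Accept screen projection from " + initiatorName + "?", accepted -> {
            if (accepted) {
                peer.sendAccept();   // the initiator then starts sending its window data
            } else {
                peer.sendReject();
            }
        });
    }
}
```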
In the embodiment of the application, the display content in the window includes information displayed on an interface corresponding to the window on a display screen. For example, when the tablet pc 1 shown in fig. 6b is the first electronic device, the first window is a window corresponding to the entire display interface of the display screen, and the first content included in the first window includes all display elements in the entire display interface of the display screen, including the content of the status bar and the application icon shown in the figure. For another example, when the tablet pc 1 shown in fig. 6e is the first electronic device, the first window is a window corresponding to the entire display interface of the display screen, and the first content included in the first window includes information in the application interface of the short message application. For another example, when the tablet pc 1 shown in fig. 7f is a first electronic device, the first window is the window 1 shown in the drawing, and the first content included in the first window includes information in an application interface of a short message application shown in the window 1 in the drawing.
The first electronic device can determine when to initiate the inter-device screen cooperation according to the actual scenario or user settings, which improves the flexibility of inter-device screen cooperation control.
After the screen cooperation between the first electronic device and the second electronic device is established, when a user operates the display window of the first electronic device to trigger a window update, the corresponding screen projection window on the display screen of the second electronic device is updated synchronously. Specifically, after receiving an operation by which a user triggers updating of the display window, the first electronic device updates the data displayed by the display window, sends the updated data to the second electronic device, and instructs the second electronic device to synchronously update the corresponding screen projection window; the second electronic device displays the received updated data in the screen projection window according to the instruction of the first electronic device. For example, in the window distribution shown in fig. 6a, when a user operates the display window a in the first electronic device to trigger a window update, the content of the screen projection window a1 on the display screen of the second electronic device is updated synchronously and kept consistent with the content of the display window a.
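A minimal sketch of the synchronization just described, assuming a hypothetical peer channel: when a user operation updates the local display window, the device refreshes its own window first and then sends the updated data together with an instruction for the peer device to refresh the corresponding screen projection window. Names and data formats are illustrative only.

```java
public final class WindowSyncController {
    /** Hypothetical channel to the peer device. */
    public interface PeerChannel {
        /** Sends updated window data and instructs the peer to refresh window windowId. */
        void sendWindowUpdate(String windowId, byte[] updatedWindowData);
    }

    private final PeerChannel peer;

    public WindowSyncController(PeerChannel peer) { this.peer = peer; }

    /** Called when a user operation changes the content of the local display window. */
    public void onLocalWindowChanged(String windowId, byte[] updatedWindowData) {
        redrawLocalWindow(windowId, updatedWindowData);       // update the own display first
        peer.sendWindowUpdate(windowId, updatedWindowData);   // then sync the projection window
    }

    private void redrawLocalWindow(String windowId, byte[] data) {
        // Window rendering is device-specific and omitted in this sketch.
    }
}
```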
In the embodiment of the application, any case in which the content displayed in a window on the display screen of the electronic device changes is regarded as a window update, and any operation that can cause the content displayed in a window to change is regarded as an operation that triggers a window update.
For example, based on the above examples shown in fig. 6a to 6d, after the first electronic device projects its screen to the second electronic device, if the user clicks the "information" application icon in the desktop window of the first electronic device to open the short message interface, the window displayed by the first electronic device is updated from the desktop window shown in fig. 6b to the window corresponding to the short message interface shown in fig. 6e. The first electronic device sends the related data of the updated short message window to the second electronic device, and the second electronic device synchronously updates the screen projection window (window 2) of the first electronic device shown in fig. 6d according to the received data, obtaining the display shown in fig. 6f, where window 1 is the second electronic device's own window, and window 2 is the updated screen projection window obtained after the first electronic device projects its screen to the second electronic device.
After the screen cooperation between the first electronic device and the second electronic device is established, when a user operates the screen projection window on the display screen of the second electronic device to trigger a window update, the corresponding display window in the first electronic device is updated synchronously. Specifically, after receiving an operation by which a user triggers updating of the screen projection window, the second electronic device updates the data displayed by the screen projection window, sends the updated data to the first electronic device, and instructs the first electronic device to synchronously update the corresponding display window; the first electronic device displays the received updated data in the corresponding display window according to the instruction of the second electronic device. For example, in the window distribution shown in fig. 6a, when a user operates the screen projection window a1 on the display screen of the second electronic device to trigger a window update, the content of the display window a in the first electronic device is updated synchronously and kept consistent with the content of the screen projection window a1.
The inter-device screen cooperation established between the first electronic device and the second electronic device is unidirectional, that is, after the inter-device screen cooperation is established between the first electronic device and the second electronic device, the first electronic device projects the display window to the second electronic device for displaying, but the second electronic device does not project the display window to the first electronic device for displaying.
In some embodiments of the present application, as an optional implementation manner, on the basis that the first electronic device and the second electronic device establish inter-device screen cooperation, the second electronic device directly establishes inter-device screen cooperation with the first electronic device, and projects a display window of the second electronic device onto a display screen of the first electronic device for display.
As another optional implementation, after the first electronic device establishes inter-device screen cooperation with the second electronic device and projects its display window onto the display screen of the second electronic device for display, the second electronic device displays a screen projection button in its own display window, where the screen projection button is used to trigger the second electronic device to project the window displayed on its display screen onto the display screen of the first electronic device, with which inter-device screen cooperation has been established, for display. For example, as shown in schematic diagram (a) in fig. 7a, after the first electronic device cooperatively projects its screen to the second electronic device, the second electronic device displays a screen projection button in its own display window B, and the user clicks the screen projection button to trigger the second electronic device to project its own window B displayed on the display screen onto the display screen of the first electronic device for display. The second electronic device sends a screen projection request message to the first electronic device in response to the user's operation of the screen projection button; after the first electronic device approves the screen projection request, the second electronic device cooperatively projects its screen to the first electronic device, projecting the display window B of the current display screen onto the display screen of the first electronic device for display, and the first electronic device simultaneously displays on its display screen its own display window a and a screen projection window B1 (which may also be referred to as a fourth window in this application) corresponding to the display window of the second electronic device, as shown in schematic diagram (B) in fig. 7a. The window B1 is the screen projection window corresponding to the projection of the display window B of the second electronic device onto the display screen of the first electronic device.
The second electronic device can determine when to initiate the inter-device screen cooperation according to the actual scenario or user settings, which improves the flexibility of inter-device screen cooperation control.
Illustratively, based on the example shown in fig. 6f above, after the first electronic device projects its screen to the second electronic device, the second electronic device displays a screen projection button 701 in its own display window (window 1), as shown in fig. 7b. The second electronic device, in response to the user clicking the screen projection button, sends a screen projection request to the first electronic device. As shown in fig. 7c, after receiving the screen projection request, the first electronic device displays a prompt message in its display window to prompt the user to select whether to accept the screen projection of the second electronic device, for example, a query message "Accept screen projection?". If it is determined according to the user operation that the screen projection of the second electronic device is accepted, the first electronic device sends feedback information of accepting the screen projection to the second electronic device. After receiving the feedback information, the second electronic device sends data corresponding to its own window to the first electronic device, and the first electronic device displays the window projected by the second electronic device according to the received data, as shown in fig. 7d. The window 1 is the window displayed by the first electronic device before it accepts the screen projection, and the window 2 is the window projected by the second electronic device onto the first electronic device, whose display content is the same as that of the window of the second electronic device.
After the screen cooperation between the second electronic device and the first electronic device is established, when a user operates the display window of the second electronic device to trigger a window update, the corresponding screen projection window on the display screen of the first electronic device is updated synchronously. Specifically, after receiving an operation by which a user triggers updating of the display window, the second electronic device updates the data displayed by the display window, sends the updated data to the first electronic device, and instructs the first electronic device to synchronously update the corresponding screen projection window; the first electronic device displays the received updated data in the screen projection window according to the instruction of the second electronic device. For example, in the window distribution shown in schematic diagram (B) in fig. 7a, when a user operates the display window B in the second electronic device to trigger a window update, the content of the screen projection window B1 on the display screen of the first electronic device is updated synchronously and kept consistent with the content of the display window B.
For example, based on the above examples shown in fig. 7a to 7d, after the second electronic device projects its screen to the first electronic device, if the user clicks the "camera" application icon in the desktop to open the photographing interface, the window displayed by the second electronic device is updated from the desktop window shown in fig. 7b to the window corresponding to the photographing interface shown in fig. 7e. The second electronic device sends the related data of the updated camera window to the first electronic device, and the first electronic device synchronously updates the screen projection window (window 2) of the second electronic device shown in fig. 7d according to the received data, obtaining the display shown in fig. 7f. The window 1 is the first electronic device's own window, and the window 2 is the updated screen projection window obtained after the second electronic device projects its screen to the first electronic device.
After the screen cooperation between the second electronic device and the first electronic device is established, when a user operates the screen projection window on the display screen of the first electronic device to trigger a window update, the corresponding display window in the second electronic device is updated synchronously. Specifically, after receiving an operation by which a user triggers updating of the screen projection window, the first electronic device updates the data displayed by the screen projection window, sends the updated data to the second electronic device, and instructs the second electronic device to synchronously update the corresponding display window; the second electronic device displays the received updated data in the corresponding display window according to the instruction of the first electronic device. For example, in the window distribution shown in schematic diagram (B) in fig. 7a, when a user operates the screen projection window B1 on the display screen of the first electronic device to trigger a window update, the content of the display window B in the second electronic device is updated synchronously and kept consistent with the content of the screen projection window B1.
After the first electronic device cooperatively projects its screen to the second electronic device and, at the same time, the second electronic device cooperatively projects its screen to the first electronic device, the first electronic device and the second electronic device have established bidirectional inter-device screen cooperation, so that bidirectional screen projection control can be realized.
In some embodiments of the present application, after the first electronic device and the second electronic device establish the screen coordination between the two-way devices, the display window and the screen projection window on the display screen may be adaptively adjusted, so as to obtain a better visual display effect, for example, information such as the size of the display window and the screen projection window, the display area, and the like may be adjusted.
The inter-device screen cooperation method provided by the present application is described with reference to a schematic flow diagram of a bi-directional inter-device screen cooperation control process shown in fig. 8. As shown in fig. 8, the specific process of the method includes:
step 1: the first electronic device generates corresponding operation information after detecting that a user clicks a screen projection button in a display window of a display screen.
The display window can refer to the window a in fig. 6a or refer to the window in fig. 6b, and the screen-projection button can refer to the screen-projection button in the window a in fig. 6a or refer to the screen-projection button 601 in fig. 6 b.
Step 2: The first electronic device sends a screen projection request message to the second electronic device according to the operation information, and establishes screen cooperation with the second electronic device.
Step 3: The first electronic device carries out display stream coding on the related data in its display window on the display screen.
Step 4: The first electronic device transmits the encoded display stream data to the second electronic device.
Step 5: The second electronic device decodes the received encoded display stream data, generates a corresponding screen projection window, and simultaneously displays its own display window and the screen projection window on the display screen.
For the second electronic device's own display window, see window B shown in fig. 6a or window 1 described in fig. 6d; for the screen projection window, see window a1 shown in fig. 6a or window 2 described in fig. 6d.
Step 6: The second electronic device generates corresponding operation information after detecting that a user clicks a screen projection button in the display window of the second electronic device on the display screen.
The display window can refer to the window B in the schematic diagram (a) in fig. 7a or refer to the window 1 in fig. 7B, and the screen-casting button can refer to the screen-casting button in the window B in the schematic diagram (a) in fig. 7a or refer to the screen-casting button 701 in the window 1 in fig. 7B.
Step 7: The second electronic device sends a screen projection request message to the first electronic device according to the operation information, and establishes screen cooperation with the first electronic device.
Step 8: The second electronic device carries out display stream coding on the related data in its display window on the display screen.
Step 9: The second electronic device transmits the encoded display stream data to the first electronic device.
Step 10: the first electronic equipment decodes the received encoded display stream data, generates a corresponding screen projection window, and simultaneously displays a display window of the first electronic equipment and the screen projection window on a display screen.
For the first electronic device's own display window, see window a in schematic diagram (B) of fig. 7a or window 1 described in fig. 7d; for the screen projection window, see window B1 in schematic diagram (B) of fig. 7a or window 2 described in fig. 7d.
In the embodiment of the application, the display windows and the screen projection windows in the first electronic device and the second electronic device may be displayed in a layered manner, or may be displayed in a partitioned manner.
For example, assuming that the first electronic device and the second electronic device are both mobile phones, when the display windows and the screen projection windows in the first electronic device and the second electronic device are displayed in a layered manner, the effect of the two devices after the method is executed is shown in fig. 9. Layer 1 of the first electronic device is used for displaying the display window of the first electronic device, and layer 2 is used for displaying the screen projection window obtained after the display window of the second electronic device is projected to the first electronic device; the content of this screen projection window is the same as that of the display window displayed in layer 3 of the second electronic device, and layer 2 is located above layer 1. Layer 3 of the second electronic device is used for displaying the display window of the second electronic device, and layer 4 is used for displaying the screen projection window obtained after the display window of the first electronic device is projected to the second electronic device; the content of this screen projection window is the same as that of the display window displayed in layer 1, and layer 4 is located above layer 3.
For example, assuming that the first electronic device and the second electronic device are both mobile phones, when the display windows in the first electronic device and the second electronic device are displayed in a partition manner with the screen projection window, the effect of the first electronic device and the second electronic device after the method is executed is shown in fig. 10. The window 1 is a display window of the first electronic device on the display screen, and the window 2 is a screen projection window of the second electronic device after the display window of the second electronic device is projected to the first electronic device, and the display content of the window is the same as that of the window 3 in the second electronic device. The window 3 is a display window of the second electronic device on the display screen, and the window 4 is a screen projection window of the first electronic device after the display window of the first electronic device is projected to the second electronic device, and the display content of the window is the same as that of the window 1.
In the embodiment of the application, after the first electronic device and the second electronic device establish the bidirectional screen projection, when the first electronic device or the second electronic device displays a plurality of windows, for example, simultaneously displays its own window and a window projected by the peer device, the windows to be displayed may be scaled and then displayed, for example, in the display manner shown in fig. 9; or the content layout in the windows to be displayed may be adaptively adjusted and then displayed, for example, in the display manner shown in fig. 10.
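The two display manners described above could be expressed as simple layout rules, as in the following sketch: the partitioned manner splits the screen between the local window and the screen projection window, while the layered manner scales the projection window and places it on a layer above the local window. The concrete proportions are assumptions for illustration, not values specified in this application.

```java
import android.graphics.Rect;

public final class CooperationLayout {
    /** Partitioned manner: upper half for the local window, lower half for the projection window. */
    public static Rect[] partitioned(int screenWidth, int screenHeight) {
        Rect localWindow = new Rect(0, 0, screenWidth, screenHeight / 2);
        Rect projectionWindow = new Rect(0, screenHeight / 2, screenWidth, screenHeight);
        return new Rect[] { localWindow, projectionWindow };
    }

    /** Layered manner: the projection window is scaled down and drawn on a layer above the local window. */
    public static Rect layeredProjectionWindow(int screenWidth, int screenHeight) {
        int w = screenWidth / 2;
        int h = screenHeight / 2;
        int left = (screenWidth - w) / 2;
        int top = (screenHeight - h) / 2;
        return new Rect(left, top, left + w, top + h);
    }
}
```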
After the screen cooperation between the two-way devices is established between the first electronic device and the second electronic device, the operation of the first electronic device or the second electronic device on each window in the display screen can be synchronized to the corresponding window in the display screen of the opposite terminal. Therefore, the user using the first electronic device and the user using the second electronic device can check the information displayed by the two in real time, the user experience is improved, the first electronic device and the second electronic device can mutually support synchronous operation, and the efficiency of the cooperative operation between the devices is improved. For example, as shown in fig. 11, taking a first electronic device and a second electronic device as mobile phones as an example, assuming that a user C using the first electronic device and a user D using the second electronic device are in the same game scene, the first electronic device and the second electronic device respectively project their own display windows onto a display screen of the other party for display by executing the inter-device screen cooperation method, and can simultaneously display game scene interfaces of the user C and the user D on the display screens of the first electronic device and the second electronic device, so as to implement cooperative bidirectional sharing, and the user C and the user D can view state information of the other party in real time in the projection screen window of the display screen of the own device without switching scenes.
After the screen cooperation between the two-way devices is established between the first electronic device and the second electronic device, a control button for controlling the screen cooperation between the two-way devices can be further output and displayed in the display screen, so that a user can flexibly control the two-way screen projection between the first electronic device and the second electronic device.
In some embodiments of the present application, after the first electronic device and the second electronic device establish the screen coordination between the two-way devices, both the first electronic device and the second electronic device may set a control opposite-end switch button in a screen-projecting window on a display screen, where the control opposite-end switch button is used to control whether the electronic device synchronously controls a corresponding display window when updating the screen-projecting window, so that the display window is synchronously updated. Specifically, when the control opposite-end switch button in the first electronic device is opened, and the first electronic device updates the screen projection window, the corresponding display window in the second electronic device is controlled to be updated at the same time; when a user operates a screen projection window in a display screen of first electronic equipment to trigger window updating, a corresponding display window in second electronic equipment is synchronously updated. When the control opposite-end switch button in the second electronic equipment is opened, and the second electronic equipment updates the screen projection window, the corresponding display window in the first electronic equipment is controlled to be updated at the same time; when a user operates a screen projection window in a display screen of the second electronic equipment to trigger window updating, the corresponding display window in the first electronic equipment is synchronously updated.
In some embodiments of the present application, after the first electronic device and the second electronic device establish the screen cooperation between the two-way devices, both the first electronic device and the second electronic device may set an opposite-end-allowed control switch button in a display window of the first electronic device on the display screen (in this application, the opposite-end-allowed control switch button displayed by the first electronic device may also be referred to as a first control option, and the opposite-end-allowed control switch button displayed by the second electronic device may also be referred to as a second control option), where the opposite-end-allowed control switch button is used to control whether the display window of the electronic device itself receives synchronous control of a corresponding screen projection window, and is updated synchronously with the screen projection window. Specifically, when the control switch button of the opposite terminal is allowed to be opened in the first electronic device, the first electronic device allows a user to control the display window of the first electronic device through the screen projecting window of the second electronic device, and when the user operates the screen projecting window in the display screen of the second electronic device to trigger window updating, the corresponding display window in the first electronic device is updated synchronously. When the opposite-end control switch button in the second electronic equipment is allowed to be opened, the second electronic equipment allows a user to control the display window of the second electronic equipment through the screen projecting window of the first electronic equipment, and when the user operates the screen projecting window in the display screen of the first electronic equipment to trigger window updating, the corresponding display window in the second electronic equipment is synchronously updated.
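The two switch buttons described above can be thought of as two independent flags, one evaluated on the device showing the screen projection window and one on the device owning the display window, as in the following hypothetical sketch. The class and method names are illustrative only.

```java
public final class PeerControlPolicy {
    private volatile boolean controlPeerEnabled;      // "control opposite end" switch, set on the device showing the projection window
    private volatile boolean allowPeerControlEnabled; // "allow opposite-end control" switch, set on the device owning the display window

    public void setControlPeer(boolean on)      { controlPeerEnabled = on; }
    public void setAllowPeerControl(boolean on) { allowPeerControlEnabled = on; }

    /** Local side: should an update to the projection window also be propagated to the peer? */
    public boolean shouldPropagateToPeer() {
        return controlPeerEnabled;
    }

    /** Remote side: should a sync request received from the peer be applied to the display window? */
    public boolean shouldAcceptPeerControl() {
        return allowPeerControlEnabled;
    }
}
```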
For example, as shown in fig. 12, after the first electronic device establishes bidirectional screen projection with the second electronic device, the screen projection window B1 of the first electronic device displays the control opposite-end switch button, and the second electronic device displays the permission opposite-end control switch button in its own display window B. When the user opens the control opposite-end switch button in the screen projection window B1 by performing a corresponding operation (e.g., clicking, sliding, etc.) on the control opposite-end switch button, the first electronic device is set to the control opposite-end mode in response to the operation, and when the user closes the control opposite-end switch button in the screen projection window B1 by performing a corresponding operation on the control opposite-end switch button, the first electronic device is set to the non-control opposite-end mode in response to the operation. When the user opens the opposite-end-allowed control switch button in the display window B by performing corresponding operation (for example, clicking, sliding, and the like) on the opposite-end-allowed control switch button, the second electronic device sets the opposite-end-allowed control mode in response to the operation, and when the user closes the opposite-end-allowed control switch button in the display window B by performing corresponding operation on the opposite-end-allowed control switch button, the second electronic device sets the opposite-end-disallowed control mode in response to the operation.
Among them, one possible case is: when the first electronic device is set to a control opposite terminal mode, sending request information for controlling an opposite terminal to the second electronic device, after the second electronic device receives the request information, if the second electronic device is determined to be set to a control opposite terminal allowed mode, returning feedback information for allowing control to the first electronic device, if the second electronic device is determined to be set to a control opposite terminal disallowed mode, returning feedback information for disallowing control to the first electronic device, after the first electronic device receives the feedback information for allowing control returned by the second electronic device, allowing a user to control the second electronic device through the screen projection window B1, and when the user operates a screen projection window B1 of the first electronic device to trigger window updating, the first electronic device updates the screen projection window B1 and simultaneously sends a request for synchronously updating a display window B corresponding to the screen projection window B1 to the second electronic device, so that the second electronic device synchronously updates the display window B. After receiving feedback information of disallowed control returned by the second electronic device, the first electronic device disallows the user to control the second electronic device through the screen projection window B1, and when the user operates the screen projection window B1 of the first electronic device to trigger window updating, the first electronic device only updates the screen projection window B1.
When the first electronic device is set to be in the opposite-end mode without control, the user is not allowed to control the second electronic device through the screen projection window B1, and when the user operates the screen projection window B1 of the first electronic device to trigger window updating, the first electronic device only updates the screen projection window B1.
Another possible scenario is as follows: when the second electronic device is set to the opposite-end-control-allowed mode, it sends, to the first electronic device, prompt information indicating that the corresponding display window B is allowed to be controlled through the screen projection window B1; after receiving the prompt information, the first electronic device feeds it back to the user and, according to the user's corresponding operation on the control opposite-end switch button in the screen projection window B1, sets the control-opposite-end mode or the no-control-opposite-end mode. When a user operates the screen projection window B1 of the first electronic device to trigger a window update, the first electronic device may, according to the set control-opposite-end mode or no-control-opposite-end mode, determine to update only the screen projection window B1, or to simultaneously control the second electronic device to update the corresponding display window B.
When the second electronic device is set to be in the opposite-end control impermissible mode, sending prompt information which does not allow the screen projection window B1 to control the corresponding display window B to the first electronic device, feeding the prompt information back to the user after the first electronic device receives the prompt information, and only updating the screen projection window B1 by the first electronic device when the user operates the screen projection window B1 of the first electronic device to trigger window updating.
In some embodiments of the application, as an optional implementation manner, when the second electronic device is set to the opposite-end control impermissible mode, it sends a prompt message to the first electronic device indicating that control through the screen projection window is not accepted, and the first electronic device blocks the user from controlling the display window B through the screen projection window B1 according to the prompt message, so that when the user operates the screen projection window B1 to trigger a window update, the first electronic device only updates the screen projection window B1. As another optional implementation manner, when the second electronic device is set to the opposite-end control impermissible mode, the second electronic device blocks the user from controlling the display window B through the screen projection window B1; when the user operates the screen projection window B1 to trigger a window update, the first electronic device updates the screen projection window B1 and requests the second electronic device to update the display window B synchronously, but the second electronic device does not respond after receiving the request, and the display window B remains unchanged.
The operation and control modes of the control opposite-end switch button in the second electronic device screen projection window a1 and the permitted opposite-end control switch button in the first electronic device display window a can refer to the control opposite-end switch button in the first electronic device screen projection window B1 and the permitted opposite-end control switch button in the second electronic device display window B, which are not described herein again.
In some embodiments of this application, after the first electronic device and the second electronic device establish bidirectional screen projection, the two devices are opposite-end devices of each other. When the first electronic device or the second electronic device updates a screen projection window projected by the opposite-end device, if the electronic device can generate the content to be displayed after the update, it updates the screen projection window according to the corresponding updated data. If the corresponding display window on the opposite-end device also needs to be updated, the electronic device sends a synchronous-update request to the opposite-end device, and the opposite-end device updates its own display window according to the request. If the electronic device cannot generate the content to be displayed after the update, it requests the corresponding updated data from the opposite-end device and updates the screen projection window according to the data returned by the opposite-end device. If the corresponding display window on the opposite-end device also needs to be updated, the electronic device may request the synchronous update of the display window together with the updated data, and the opposite-end device updates its own display window while returning the updated data.
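The two update paths just described, rendering the updated projection content locally versus requesting it from the opposite end, optionally together with a synchronous update of the opposite end's display window, can be sketched as follows. All class, method, and argument names in this sketch are assumptions made for illustration only.

```python
# Toy model of the update flow: a device either renders the projection content itself
# or asks the opposite end for it, and may piggyback a synchronous-update request.

class CollabDevice:
    def __init__(self, name, can_render_locally):
        self.name = name
        self.can_render_locally = can_render_locally
        self.peer = None

    # ---- updating a screen projection window that mirrors the peer ----
    def update_projection_window(self, update, also_update_peer_window):
        if self.can_render_locally:
            frame = f"locally rendered: {update}"
            print(f"{self.name}: projection window <- {frame}")
            if also_update_peer_window:
                self.peer.sync_display_window(update)
        else:
            frame = self.peer.provide_updated_data(
                update, sync_own_window=also_update_peer_window)
            print(f"{self.name}: projection window <- {frame}")

    # ---- handlers on the opposite end ----
    def sync_display_window(self, update):
        print(f"{self.name}: display window <- {update}")

    def provide_updated_data(self, update, sync_own_window):
        if sync_own_window:
            self.sync_display_window(update)   # update own window while returning data
        return f"data rendered by {self.name}: {update}"


a, b = CollabDevice("first", can_render_locally=False), CollabDevice("second", True)
a.peer, b.peer = b, a
a.update_projection_window("next slide", also_update_peer_window=True)
```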
On the basis that bidirectional inter-device screen collaboration is established between the first electronic device and the second electronic device, a control-opposite-end switch button is added to the screen projection windows on the display screens of the first electronic device and the second electronic device, and an allow-opposite-end-control switch button is added to the display windows, so that the user can select the bidirectional screen projection control functions as required and flexibly switch whether to initiate control and whether to accept control, thereby improving the flexibility of bidirectional inter-device screen collaboration control.
In the embodiments of this application, the method for establishing bidirectional inter-device screen collaboration between a first electronic device and a second electronic device may also be applied to a multi-device scenario including more than two devices. Any two electronic devices in the multi-device scenario may respectively serve as the first electronic device and the second electronic device in the foregoing embodiments and execute the corresponding inter-device screen collaboration method, so as to project their own display windows onto each other's display screens for display.
In some embodiments of this application, any electronic device in the multi-device scenario may serve as the first electronic device in the foregoing embodiments, and every other electronic device capable of establishing a communication connection with it may serve as a second electronic device in the foregoing embodiments, cooperating with the first electronic device to execute the inter-device screen collaboration method in the foregoing embodiments and projecting its display window onto the display screen of the other party for display.
As an optional implementation, after the first electronic device establishes bidirectional screen projection with the second electronic device, if it detects that screen collaboration can also be established with a third electronic device, it displays prompt information asking whether to establish screen collaboration with the third electronic device, and, in response to a received instruction to establish screen collaboration with the third electronic device, establishes a new screen projection relationship with the third electronic device while maintaining the bidirectional screen projection with the second electronic device.
For example, based on the example shown in fig. 7f, after the tablet computer 1 serving as the first electronic device and the tablet computer 2 serving as the second electronic device establish bidirectional screen projection, the tablet computer 1 simultaneously displays its own window (window 1) and the window projected by the tablet computer 2 (window 2) on its display screen. If the tablet computer 1 detects that screen collaboration can be established with a third electronic device, for example the mobile phone 1, the tablet computer 1 displays prompt information in its display window to ask the user whether to project the screen to the mobile phone 1, for example the inquiry message "Whether to project the screen to the mobile phone 1?", as shown in fig. 13a. If it is determined according to the user operation that the screen is to be projected to the mobile phone 1, then, while the bidirectional screen projection between the tablet computer 1 and the tablet computer 2 is kept uninterrupted, the tablet computer 1 serves as the first electronic device, the mobile phone 1 serves as the second electronic device, and they execute the inter-device screen collaboration method applied to the first electronic device and the second electronic device provided in the embodiments of this application, so that the tablet computer 1 projects its display window to the display screen of the mobile phone 1 for display. After projecting its screen to the mobile phone 1, the tablet computer 1 can also receive the screen projection of the mobile phone 1, and simultaneously display its own display window, the window projected by the tablet computer 2, and the window projected by the mobile phone 1 on its display screen. For example, based on the display screen of the tablet computer 1 shown in fig. 7f, when the display window of the mobile phone 1 is the window corresponding to a memo interface, the display interface on the display screen of the tablet computer 1 after the tablet computer 1 has established bidirectional screen projection with both the tablet computer 2 and the mobile phone 1 is shown in fig. 13b, where window 1 is the display window of the tablet computer 1, window 2 is the window projected by the tablet computer 2 to the tablet computer 1, and window 3 is the window projected by the mobile phone 1 to the tablet computer 1.
As another optional implementation, if an electronic device detects that communication connections can be established with multiple other electronic devices, the electronic device displays, in its own display window on the display screen, a collaboration request button for establishing screen collaboration with those electronic devices, and sets a corresponding expanded list, where the expanded list includes the device identifiers of the multiple electronic devices, and the user may select, according to the device identifiers, the electronic devices with which a multi-screen connection needs to be established.
For example, assuming that an electronic device 1 detects that communication connections can be established with other electronic devices, it displays, in its own display window on the display screen, a screen projection button 1401 for projecting the screen to other electronic devices, and sets a corresponding expanded list, as shown in diagram (a) of fig. 14. After the user clicks the screen projection button, the electronic device 1 displays the expanded list, as shown in diagram (b) of fig. 14; the user selects at least one electronic device in the expanded list by a click operation, and the electronic device 1 establishes inter-device screen collaboration with the at least one selected electronic device in response to the user's operation. For example, as shown in diagram (b) of fig. 14, when the expanded list includes four electronic devices, namely the electronic device 2, the electronic device 3, the electronic device 4, and the electronic device 5, and the user selects the electronic device 2 and the electronic device 3 to establish a multi-screen connection, the electronic device 1 serves as the first electronic device in the foregoing embodiments, the electronic device 2 and the electronic device 3 each serve as the second electronic device in the foregoing embodiments, and each of them cooperates with the electronic device 1 to execute the inter-device screen collaboration method in the foregoing embodiments, thereby implementing bidirectional screen projection control between the electronic device 1 and each of the electronic device 2 and the electronic device 3.
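The expanded-list flow above, in which one device discovers peers, the user picks several, and a bidirectional projection session is then set up with each selected device, is sketched below. The ScreenCollabSession class and its methods are hypothetical names introduced only for this illustration.

```python
# Sketch of the expanded-list selection flow; names are illustrative assumptions.

class ScreenCollabSession:
    def __init__(self, local_name, peer_name):
        self.local_name, self.peer_name = local_name, peer_name

    def start_bidirectional(self):
        # each side projects its own display window to the other
        print(f"{self.local_name} -> {self.peer_name}: projecting local window")
        print(f"{self.peer_name} -> {self.local_name}: projecting local window")


def on_projection_button_clicked(local_name, discovered_devices, user_selection):
    """discovered_devices: the identifiers shown in the expanded list."""
    sessions = []
    for peer in user_selection:
        if peer in discovered_devices:
            session = ScreenCollabSession(local_name, peer)
            session.start_bidirectional()
            sessions.append(session)
    return sessions


discovered = ["electronic device 2", "electronic device 3",
              "electronic device 4", "electronic device 5"]
on_projection_button_clicked("electronic device 1", discovered,
                             ["electronic device 2", "electronic device 3"])
```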
Fig. 15 is a schematic diagram of the bidirectional screen projection control effect of the inter-device screen collaboration method. As shown in the figure, window 101 is the display window of the electronic device 1 on its display screen, window 201 is the display window of the electronic device 2 on its display screen, and window 301 is the display window of the electronic device 3 on its display screen; window 102 is the screen projection window obtained after window 201 of the electronic device 2 is projected to the electronic device 1 and has the same display content as window 201, and window 103 is the screen projection window obtained after window 301 of the electronic device 3 is projected to the electronic device 1 and has the same display content as window 301; window 202 is the screen projection window obtained after window 101 of the electronic device 1 is projected to the electronic device 2 and has the same display content as window 101; window 302 is the screen projection window obtained after window 101 of the electronic device 1 is projected to the electronic device 3 and has the same display content as window 101.
On this basis, the electronic device 2 and the electronic device 3 may also respectively serve as the first electronic device and the second electronic device in the foregoing embodiments and cooperatively execute the inter-device screen collaboration method in the foregoing embodiments, so as to implement bidirectional screen projection control between the electronic device 2 and the electronic device 3. In this way, bidirectional screen projection control between any two of the electronic device 1, the electronic device 2, and the electronic device 3 can be realized.
Fig. 16 is a schematic diagram of the bidirectional screen projection control effect of the inter-device screen collaboration method. As shown in the figure, on the basis of the bidirectional screen projection control effect shown in fig. 15, the following windows are added: window 203 is the screen projection window obtained after window 301 of the electronic device 3 is projected to the electronic device 2 and has the same display content as window 301; window 303 is the screen projection window obtained after window 201 of the electronic device 2 is projected to the electronic device 3 and has the same display content as window 201.
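The window relationships of fig. 15 and fig. 16 can be written out as a small mapping from each screen projection window to the display window it mirrors. The dictionary below only restates the figures; its layout is an illustration and assumes nothing beyond the window numbers already given.

```python
# The fully connected topology of fig. 16 as data: each screen projection window
# maps to (source device, source display window). Window numbers follow the figure.

projection_windows = {
    # on the electronic device 1
    102: ("electronic device 2", 201),
    103: ("electronic device 3", 301),
    # on the electronic device 2
    202: ("electronic device 1", 101),
    203: ("electronic device 3", 301),
    # on the electronic device 3
    302: ("electronic device 1", 101),
    303: ("electronic device 2", 201),
}

for window, (source_device, source_window) in projection_windows.items():
    print(f"window {window} mirrors window {source_window} on {source_device}")
```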
When this embodiment is applied to a multi-device scenario, inter-device screen collaboration can be performed among a plurality of devices: one device can project its display window to other devices and can also accept projections from other devices. Therefore, the application objects of inter-device screen collaboration can be flexibly adjusted according to actual requirements in the multi-device scenario, efficient screen collaboration is ensured, and the practicability of inter-device screen collaboration is improved.
Based on the foregoing embodiments, an embodiment of this application provides an inter-device screen collaboration method applied to a system formed by a first electronic device and a second electronic device. As shown in fig. 17, the method includes the following steps:
Step S1701: The first electronic device establishes a communication connection with the second electronic device.
Step S1702: The first electronic device displays a first window, where the first window includes first content.
Step S1703: The second electronic device displays a second window, where the second window includes second content.
Step S1704: The first electronic device sends first screen projection data to the second electronic device, where the first screen projection data includes information of the first content.
Step S1705: The second electronic device receives the first screen projection data, and displays the second window and a third window, where the third window includes the first content.
Step S1706: The second electronic device sends second screen projection data to the first electronic device, where the second screen projection data includes information of the second content.
Step S1707: The first electronic device receives the second screen projection data, and displays the first window and a fourth window, where the fourth window includes the second content.
For the specific steps executed by the first electronic device and the second electronic device in the method, reference may be made to the foregoing embodiments; details are not described herein again.
It should be noted that the execution sequence or timing of the above steps may vary. For example, step S1706 may be executed after step S1704 and step S1705, or may be executed simultaneously with step S1704.
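For readers who prefer code to a flowchart, the exchange of steps S1701 to S1707 can be simulated in a few lines. The sketch below uses invented names (ScreenCastEndpoint and so on) and merely models the data flow between the two devices; it is not the method's actual implementation.

```python
# Compact simulation of steps S1701-S1707: both devices display a local window,
# exchange screen projection data, and each shows the peer's content in an extra window.

class ScreenCastEndpoint:
    def __init__(self, name, local_content):
        self.name = name
        self.windows = {"local": local_content}   # S1702 / S1703

    def send_projection_data(self):
        return {"source": self.name, "content": self.windows["local"]}  # S1704 / S1706

    def receive_projection_data(self, data):
        # S1705 / S1707: keep the local window and add a window for the peer's content
        self.windows[f"from {data['source']}"] = data["content"]


first = ScreenCastEndpoint("first device", "first content")      # first window
second = ScreenCastEndpoint("second device", "second content")   # second window

second.receive_projection_data(first.send_projection_data())     # S1704 + S1705
first.receive_projection_data(second.send_projection_data())     # S1706 + S1707

print(first.windows)   # {'local': 'first content', 'from second device': 'second content'}
print(second.windows)  # {'local': 'second content', 'from first device': 'first content'}
```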
Based on the above embodiments, an embodiment of this application further provides an electronic device, which may be the first electronic device or the second electronic device and is configured to implement the inter-device screen collaboration method provided in the embodiments of this application. As shown in fig. 18, the electronic device 1800 may include a display screen 1801, one or more processors 1802, a memory 1803, and one or more computer programs (not shown). These components may be coupled by one or more communication buses 1804.
The display screen 1801 is used for displaying images, videos, application interfaces and other relevant user interfaces. One or more computer programs, including instructions, are stored in the memory 1803; the processor 1802 calls the instructions stored in the memory 1803, so that the electronic device 1800 executes the inter-device screen coordination method provided by the embodiment of the present application.
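The composition just listed can be restated as a small data structure for illustration. The field names below are assumptions that merely mirror the reference numerals of fig. 18; they do not describe an actual product.

```python
# Illustrative restatement of the device composition (display screen, processors,
# memory holding the program, communication bus); field names are assumptions.

from dataclasses import dataclass, field
from typing import List

@dataclass
class ElectronicDevice1800:
    display_screen_1801: str = "primary display"
    processors_1802: List[str] = field(default_factory=lambda: ["cpu0"])
    memory_1803: List[str] = field(default_factory=list)   # stores the computer program(s)
    bus_1804: str = "communication bus"

    def run_screen_collaboration(self):
        # the processor calls the instructions stored in memory to execute the method
        for instruction in self.memory_1803:
            print(f"{self.processors_1802[0]} executes: {instruction}")


device = ElectronicDevice1800(memory_1803=["establish connection", "send projection data"])
device.run_screen_collaboration()
```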
In the embodiments provided in the present application, the method provided in the embodiments of the present application is described from the perspective of an electronic device as an execution subject. In order to implement the functions in the method provided by the embodiments of the present application, the electronic device may include a hardware structure and/or a software module, and the functions are implemented in the form of a hardware structure, a software module, or a hardware structure and a software module. Whether any of the above-described functions is implemented as a hardware structure, a software module, or a hardware structure plus a software module depends upon the particular application and design constraints imposed on the technical solution.
As used in the above embodiments, the terms "when …" or "after …" may be interpreted to mean "if …" or "after …" or "in response to determining …" or "in response to detecting …", depending on the context. Similarly, depending on the context, the phrase "at the time of determination …" or "if (a stated condition or event) is detected" may be interpreted to mean "if the determination …" or "in response to the determination …" or "upon detection (a stated condition or event)" or "in response to detection (a stated condition or event)". In addition, in the above-described embodiments, relational terms such as first and second are used to distinguish one entity from another entity without limiting any actual relationship or order between the entities.
The steps of a method or algorithm described in the embodiments of this application may be embodied directly in hardware, in a software module executed by a processor, or in a combination of the two. The software module may be stored in random access memory (RAM), flash memory, read-only memory (ROM), EPROM memory, EEPROM memory, registers, a hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art. For example, a storage medium may be coupled to the processor such that the processor can read information from, and write information to, the storage medium. Alternatively, the storage medium may be integral to the processor. The processor and the storage medium may reside in an ASIC.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In one or more exemplary designs, the functions described in the embodiments of this application may be implemented in hardware, software, firmware, or any combination thereof. If implemented in software, the functions may be stored on or transmitted over as one or more instructions or code on a computer-readable medium. Computer-readable media include both computer storage media and communication media that facilitate the transfer of a computer program from one place to another. A storage medium may be any available medium that can be accessed by a general-purpose or special-purpose computer. For example, such computer-readable media can include, but are not limited to, RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium that can be used to carry or store program code in the form of instructions or data structures and that can be read by a general-purpose or special-purpose computer or a general-purpose or special-purpose processor. In addition, any connection is properly termed a computer-readable medium if, for example, the software is transmitted from a website, server, or other remote source over a coaxial cable, fiber optic cable, twisted pair, digital subscriber line (DSL), or wirelessly, e.g., by infrared, radio, or microwave. Disk and disc, as used herein, include compact disc, laser disc, optical disc, digital versatile disc (DVD), floppy disk, and Blu-ray disc, where disks usually reproduce data magnetically, while discs reproduce data optically with lasers. Combinations of the above may also be included in the computer-readable medium.

Claims (19)

1. A method for screen collaboration between devices is applied to a system composed of a first electronic device and a second electronic device, and is characterized by comprising the following steps:
the first electronic equipment and the second electronic equipment establish communication connection;
the first electronic equipment displays a first window, wherein the first window comprises first content;
the second electronic equipment displays a second window, wherein the second window comprises second content;
the first electronic equipment sends first screen projection data to the second electronic equipment, wherein the first screen projection data comprises information of the first content;
the second electronic equipment receives the first screen projection data and displays the second window and a third window, wherein the third window comprises the first content;
the second electronic equipment sends second screen projection data to the first electronic equipment, wherein the second screen projection data comprise information of the second content;
and the first electronic equipment receives the second screen projection data and displays the first window and a fourth window, wherein the fourth window comprises the second content.
2. The method of claim 1,
before the first electronic device sends the first screen projection data to the second electronic device, the method further includes:
the first electronic equipment responds to the received first operation and sends a first screen projection request to the second electronic equipment, and the first screen projection request is used for requesting to project the first content to the second electronic equipment;
before the second electronic device displays the second window and the third window, the method further comprises:
and the second electronic equipment displays first prompt information, wherein the first prompt information is used for prompting whether the user accepts the first electronic equipment to project the first content to the second electronic equipment.
3. The method of claim 1 or 2, wherein after the second electronic device displays the second window and the third window, the method further comprises:
the first electronic equipment responds to the received second operation and updates the first content in the first window; or
The second electronic device responds to the received third operation and sends a first updating request to the first electronic device, wherein the first updating request is used for requesting the first electronic device to update the first content in the first window;
the first electronic device updates the first content in the first window in response to the received first update request.
4. The method of claim 3, wherein when the first electronic device updates the first content in the first window, the method further comprises:
the first electronic equipment sends updated first screen projection data to the second electronic equipment;
and the second electronic equipment updates the first content in the third window according to the received updated first screen projection data.
5. The method of claim 3 or 4, wherein after the first electronic device displays the first window and the fourth window, the method further comprises:
the first electronic equipment displays a first control option;
before the first electronic device updates the first content in the first window in response to the received first update request, the method further includes:
the first electronic device receives a fourth operation acting on the first control option.
6. The method according to any one of claims 1 to 5, wherein the display area of the first window is different from that of the fourth window, or the display area of the first window covers a partial area of the display area of the fourth window, or the display area of the fourth window covers a partial area of the display area of the first window.
7. The method of any of claims 1-6, wherein the system further comprises a third electronic device, and wherein after the first electronic device displays the first window and the fourth window, the method further comprises:
the first electronic equipment and the third electronic equipment establish communication connection;
the third electronic equipment displays a fifth window, wherein the fifth window comprises third content;
the first electronic equipment sends the first screen projection data to the third electronic equipment;
the third electronic equipment receives the first screen projection data and displays a fifth window and a sixth window, wherein the sixth window comprises the first content;
the third electronic equipment sends third screen projection data to the first electronic equipment, wherein the third screen projection data comprise information of the third content;
and the first electronic equipment receives the third screen projection data and displays the first window and a seventh window, wherein the seventh window comprises the third content.
8. A method for screen collaboration among devices is applied to a first electronic device, and is characterized by comprising the following steps:
establishing a communication connection with a second electronic device;
displaying a first window, wherein the first window comprises first content;
sending first screen projection data to the second electronic equipment, wherein the first screen projection data comprises information of the first content;
receiving second screen projection data from the second electronic equipment, wherein the second screen projection data comprise information of second content, and the second content is content included in a second window displayed by the second electronic equipment;
and displaying the first window and a fourth window, wherein the fourth window comprises the second content.
9. The method of claim 8, wherein prior to sending the first screen projection data to the second electronic device, the method further comprises:
and responding to the received first operation, and sending a first screen projection request to the second electronic equipment, wherein the first screen projection request is used for requesting to project the first content to the second electronic equipment.
10. The method of claim 8 or 9, wherein prior to receiving second screen projection data from the second electronic device, the method further comprises:
and displaying second prompt information, wherein the second prompt information is used for prompting whether the user accepts the second electronic equipment to project the second content to the first electronic equipment.
11. The method of any one of claims 8-10, wherein after sending the first screen projection data to the second electronic device, the method further comprises:
updating the first content in the first window in response to the received second operation; or
Updating the first content in the first window in response to a first update request received from the second electronic device, wherein the first update request is used for requesting the first electronic device to update the first content in the first window.
12. The method of claim 11, wherein when updating the first content in the first window, the method further comprises:
and sending the updated first screen projection data to the second electronic equipment.
13. The method of claim 11 or 12, wherein after displaying the first and fourth windows, the method further comprises:
displaying a first control option;
before updating the first content in the first window in response to the first update request from the second electronic device, the method further includes:
receiving a fourth operation acting on the first control option.
14. The method of any of claims 8 to 13, wherein after displaying the first and fourth windows, the method further comprises:
in response to the received seventh operation, sending a second update request to the second electronic device, where the second update request is used to request the second electronic device to update the second content in the second window.
15. The method of claim 14, wherein after sending the second update request to the second electronic device, the method further comprises:
updating the second content in the fourth window in response to receiving updated second screen projection data from the second electronic device.
16. The method according to any one of claims 8 to 15, wherein the display area of the first window is different from that of the fourth window, or the display area of the first window covers a partial area of the display area of the fourth window, or the display area of the fourth window covers a partial area of the display area of the first window.
17. The method of any of claims 8 to 16, wherein after displaying the first and fourth windows, the method further comprises:
establishing a communication connection with a third electronic device;
sending the first screen projection data to the third electronic equipment;
receiving third screen projection data from the third electronic device, wherein the third screen projection data comprises information of third content, and the third content is content included in a fifth window displayed by the third electronic device;
and displaying the first window and a seventh window, wherein the seventh window comprises the third content.
18. An electronic device, comprising a display screen, a memory, and one or more processors;
wherein the memory is to store computer program code comprising computer instructions; the computer instructions, when executed by the one or more processors, cause the electronic device to perform the method of any of claims 8-17.
19. A computer-readable storage medium, characterized in that it comprises program instructions which, when run on a terminal device, cause the terminal device to perform the method of any of claims 1 to 7 or to perform the method of any of claims 8 to 17.
CN202011190260.1A 2020-10-30 2020-10-30 Inter-equipment screen collaboration method and equipment Active CN114442969B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202011190260.1A CN114442969B (en) 2020-10-30 2020-10-30 Inter-equipment screen collaboration method and equipment
PCT/CN2021/125218 WO2022089294A1 (en) 2020-10-30 2021-10-21 Inter-device screen collaboration method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011190260.1A CN114442969B (en) 2020-10-30 2020-10-30 Inter-equipment screen collaboration method and equipment

Publications (2)

Publication Number Publication Date
CN114442969A true CN114442969A (en) 2022-05-06
CN114442969B CN114442969B (en) 2024-06-18

Family

ID=81357685

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011190260.1A Active CN114442969B (en) 2020-10-30 2020-10-30 Inter-equipment screen collaboration method and equipment

Country Status (2)

Country Link
CN (1) CN114442969B (en)
WO (1) WO2022089294A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023230765A1 (en) * 2022-05-30 2023-12-07 京东方科技集团股份有限公司 Screen information synchronization method and system
WO2024065449A1 (en) * 2022-09-29 2024-04-04 京东方科技集团股份有限公司 Data sharing display method and intelligent display system

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117560792A (en) * 2022-08-05 2024-02-13 荣耀终端有限公司 Multi-device cooperation method, electronic device and related products

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110471639A (en) * 2019-07-23 2019-11-19 华为技术有限公司 Display methods and relevant apparatus
CN110622123A (en) * 2017-08-23 2019-12-27 华为技术有限公司 Display method and device
CN110673782A (en) * 2019-08-29 2020-01-10 华为技术有限公司 Control method applied to screen projection scene and related equipment
CN111131866A (en) * 2019-11-25 2020-05-08 华为技术有限公司 Screen-projecting audio and video playing method and electronic equipment

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106293583B (en) * 2016-08-03 2019-10-11 广东威创视讯科技股份有限公司 Desktop window sharing method and system
US10331394B1 (en) * 2017-12-21 2019-06-25 Logmein, Inc. Manipulating shared screen content
CN110515580B (en) * 2019-09-02 2022-08-19 联想(北京)有限公司 Display control method, device and terminal
CN111158624A (en) * 2019-12-31 2020-05-15 维沃移动通信有限公司 Application sharing method, electronic equipment and computer readable storage medium

Also Published As

Publication number Publication date
WO2022089294A1 (en) 2022-05-05
CN114442969B (en) 2024-06-18

Similar Documents

Publication Publication Date Title
US20220342850A1 (en) Data transmission method and related device
WO2020238874A1 (en) Vr multi-screen display method and electronic device
CN112286618A (en) Device cooperation method, device, system, electronic device and storage medium
CN114442969B (en) Inter-equipment screen collaboration method and equipment
CN111221845A (en) Cross-device information searching method and terminal device
WO2021121052A1 (en) Multi-screen cooperation method and system, and electronic device
WO2021147406A1 (en) Audio output method and terminal device
US20230422154A1 (en) Method for using cellular communication function, and related apparatus and system
CN112527174B (en) Information processing method and electronic equipment
CN113050841A (en) Method, electronic equipment and system for displaying multiple windows
WO2024016559A1 (en) Multi-device cooperation method, electronic device and related product
US20230094172A1 (en) Cross-Device Application Invoking Method and Electronic Device
WO2022028494A1 (en) Multi-device data collaboration method and electronic device
CN114520868A (en) Video processing method, device and storage medium
CN113190362B (en) Service calling method and device, computer equipment and storage medium
CN114489529A (en) Screen projection method of electronic device, medium thereof and electronic device
CN115686401A (en) Screen projection method, electronic equipment and system
CN115941674B (en) Multi-device application connection method, device and storage medium
CN114510186A (en) Cross-device control method and device
WO2022143310A1 (en) Double-channel screen projection method and electronic device
CN114520867B (en) Camera control method based on distributed control and terminal equipment
CN115114607A (en) Sharing authorization method, device and storage medium
CN115113832A (en) Cross-device synchronous display control method and system
CN115543496A (en) Message processing method and related device
CN115225753A (en) Shooting method, related device and system

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant