CN112286477B - Screen projection display method and related product - Google Patents


Info

Publication number
CN112286477B
CN112286477B (application CN202011284012.3A)
Authority
CN
China
Prior art keywords
screen
picture
input operation
projected
information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202011284012.3A
Other languages
Chinese (zh)
Other versions
CN112286477A (en)
Inventor
杨俊拯
邓朝明
Current Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN202011284012.3A
Publication of CN112286477A
Priority to PCT/CN2021/116478 (published as WO2022100237A1)
Application granted
Publication of CN112286477B


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14Digital output to display device ; Cooperation and interconnection of the display device with other functional units
    • G06F3/1454Digital output to display device ; Cooperation and interconnection of the display device with other functional units involving copying of the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04817Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Controls And Circuits For Display Device (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application discloses a screen projection display method and related products. The method includes the following steps: the electronic device displays an editing interface window, where the editing interface window includes a first area and a second area, and a first picture is displayed in the first area; then, in response to at least one first input operation on the first picture, a to-be-projected picture list corresponding to the at least one first input operation is selected from the first picture; a second picture is displayed in the second area, and in response to at least one second input operation in the second area, the second picture is edited into a screen projection picture; finally, the screen projection picture is sent, so that the screen projection picture is displayed in a first application running on the destination device. With this method, one or more to-be-projected pictures can be captured from the interface of the target application on the source device and rearranged for display on the destination device, so that the application picture can adapt to the screen sizes of various electronic devices, which is convenient to use and improves the user experience.

Description

Screen projection display method and related product
Technical Field
The application relates to the technical field of electronics, in particular to a screen projection display method and related products.
Background
Screen projection refers to projecting the display picture of device A onto device B, so that device B can synchronously display the display picture of device A. Through screen projection technology, the display picture of a device with a smaller display screen (such as a mobile phone or a tablet computer) can be projected onto a device with a larger display screen (such as a television or a vehicle-mounted multimedia display), so as to synchronize data and facilitate sharing the display picture with multiple people.
However, in actual use, because the display screens of device A and device B differ in size and function, directly displaying the interface of device A on device B leads to a poor display effect, and such direct display may also mean that operations available on device A cannot be performed on device B.
Disclosure of Invention
The embodiments of the present application provide a screen projection display method and related products, which can adapt the application picture to the screen sizes of various electronic devices, are convenient to use, and improve the user experience.
In a first aspect, an embodiment of the present application provides a screen projection display method, applied to an electronic device, the method comprising:
displaying an editing interface window, where the editing interface window includes a first area and a second area, a first picture is displayed in the first area, and the first picture is a target picture in a target application running on the source device;
in response to at least one first input operation on the first picture, selecting, from the first picture, a to-be-projected picture list corresponding to the at least one first input operation, where the to-be-projected picture list includes at least one to-be-projected picture;
displaying a second picture in the second area, where the second picture includes the to-be-projected picture list;
in response to at least one second input operation within the second area, editing the second picture into a screen projection picture;
and sending the screen projection picture, so as to display the screen projection picture in a first application running on the destination device.
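The steps above can be sketched as follows. All names (`Region`, `select_regions`, `compose_screen_projection_picture`) and the side-by-side layout policy are illustrative assumptions for this sketch, not taken from the patent:

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class Region:
    """A rectangular to-be-projected region of the first picture: (x, y, width, height)."""
    x: int
    y: int
    w: int
    h: int

def select_regions(first_input_ops: List[Tuple[int, int, int, int]]) -> List[Region]:
    """Each first input operation selects one to-be-projected picture from the first picture."""
    return [Region(*op) for op in first_input_ops]

def compose_screen_projection_picture(regions: List[Region], dest_width: int) -> List[Region]:
    """The second input operations edit the second picture; modeled here as a simple
    side-by-side layout that must fit the destination screen width."""
    layout, x = [], 0
    for r in regions:
        layout.append(Region(x, 0, r.w, r.h))
        x += r.w
    if x > dest_width:
        raise ValueError("screen projection picture exceeds destination screen width")
    return layout

# Two regions selected from the first picture, then laid out for a 320-px-wide destination.
regions = select_regions([(0, 0, 100, 60), (0, 80, 100, 40)])
layout = compose_screen_projection_picture(regions, dest_width=320)
print([(r.x, r.y, r.w, r.h) for r in layout])  # [(0, 0, 100, 60), (100, 0, 100, 40)]
```

In a real implementation the layout would be driven by the user's edits in the second area rather than a fixed packing rule; the width check stands in for adapting the picture to the destination screen size.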
In a second aspect, an embodiment of the present application provides a screen projection display apparatus, applied to an electronic device, where the apparatus includes:
a display unit, configured to display an editing interface window, where the editing interface window includes a first area and a second area, a first picture is displayed in the first area, and the first picture is a target picture in a target application running on the source device;
a selecting unit, configured to select, in response to at least one first input operation on the first picture, a to-be-projected picture list corresponding to the at least one first input operation from the first picture, where the to-be-projected picture list includes at least one to-be-projected picture;
the display unit being further configured to display a second picture in the second area, where the second picture includes the to-be-projected picture list and/or an external to-be-projected picture list;
an editing unit, configured to edit the second picture into a screen projection picture in response to at least one second input operation in the second area;
and a transceiver unit, configured to send the screen projection picture, so as to display the screen projection picture in a first application running on the destination device.
In a third aspect, an embodiment of the present application provides an electronic device, including a processor, a memory, a communication interface, and one or more programs, where the one or more programs are stored in the memory and configured to be executed by the processor, the programs including instructions for performing steps in any of the methods of the first aspect of the embodiments of the present application.
In a fourth aspect, embodiments of the present application provide a computer-readable storage medium, wherein the computer-readable storage medium stores a computer program for electronic data exchange, wherein the computer program causes a computer to perform part or all of the steps as described in any of the methods of the first aspect of the embodiments of the present application.
In a fifth aspect, embodiments of the present application provide a computer program product, wherein the computer program product comprises a non-transitory computer readable storage medium storing a computer program operable to cause a computer to perform some or all of the steps described in any of the methods of the first aspect of the embodiments of the present application. The computer program product may be a software installation package.
It can be seen that, in the screen projection display method provided by the application, the electronic device displays an editing interface window including a first area and a second area, a first picture being displayed in the first area, where the first picture is a target picture in a target application running on the source device; then, in response to at least one first input operation on the first picture, a to-be-projected picture list corresponding to the at least one first input operation is selected from the first picture; a second picture is displayed in the second area, and in response to at least one second input operation in the second area, the second picture is edited into a screen projection picture; finally, the screen projection picture is sent, so that it is displayed in a first application running on the destination device. With this method, one or more to-be-projected pictures can be captured from the target picture of the target application on the source device and rearranged for display on the destination device, so that the application picture can adapt to the screen sizes of various electronic devices, which is convenient to use and improves the user experience.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application or in the prior art, the drawings required for describing the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the present application, and a person skilled in the art may derive other drawings from these drawings without inventive effort.
Fig. 1a is a schematic structural diagram of an electronic device according to an embodiment of the present application;
fig. 1b is a schematic software structure of an electronic device according to an embodiment of the present application;
fig. 2a is a schematic structural diagram of a screen projection system according to an embodiment of the present application;
FIG. 2b is a schematic diagram of another exemplary screen projection system according to an embodiment of the present application;
FIG. 2c is a schematic diagram of another exemplary screen projection system according to an embodiment of the present application;
fig. 3 is a schematic flow chart of a screen projection display method according to an embodiment of the present application;
fig. 4a is a schematic diagram of a target screen on a source device according to an embodiment of the present application;
FIG. 4b is a schematic diagram of screen projection application selection provided by an embodiment of the present application;
FIG. 4c is a schematic diagram of a screen selection according to an embodiment of the present application;
FIG. 5a is a schematic diagram of an editing interface window according to an embodiment of the present application;
FIG. 5b is a schematic diagram of another editing interface window provided by an embodiment of the present application;
FIG. 6a is a schematic diagram of selecting a screen to be projected according to an embodiment of the present application;
FIG. 6b is a schematic diagram of another embodiment of selecting a screen to be projected;
FIG. 6c is a schematic diagram of another embodiment of selecting a screen to be projected;
FIG. 6d is a schematic diagram of another embodiment of selecting a screen to be projected;
FIG. 6e is a schematic diagram of another embodiment of selecting a screen to be projected;
fig. 7a is a schematic diagram of moving a screen to be projected according to an embodiment of the present application;
fig. 7b is a schematic diagram of rotating a screen to be projected according to an embodiment of the present application;
fig. 7c is a schematic diagram of an external screen to be projected screen adding interface according to an embodiment of the present application;
FIG. 7d is a schematic diagram of adding a screen to be projected according to an embodiment of the present application;
fig. 7e is a schematic diagram of a screen to be projected in a hierarchical arrangement according to an embodiment of the present application;
FIG. 7f is a schematic diagram of a projection screen setup interface according to an embodiment of the present application;
FIG. 7g is a schematic diagram of a destination device configuration interface according to an embodiment of the present application;
fig. 7h is a schematic diagram of a source device screen-casting to a destination device according to an embodiment of the present application;
fig. 8a is a schematic diagram of a destination device control source device according to an embodiment of the present application;
fig. 8b is a schematic diagram of a source device controlling a destination device according to an embodiment of the present application;
FIG. 9a is a schematic diagram of another exemplary screen projection system according to an embodiment of the present application;
FIG. 9b is a schematic diagram of another exemplary screen projection system according to an embodiment of the present application;
fig. 10 is a schematic structural diagram of a projection display device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described below with reference to the accompanying drawings. In the description of the embodiments of the present application, unless otherwise indicated, "/" means "or"; for example, A/B may represent A or B. "And/or" herein merely describes an association relationship between associated objects and indicates that three relationships may exist; for example, A and/or B may mean: A exists alone, A and B exist together, or B exists alone. In addition, in the description of the embodiments of the present application, "plurality" means two or more.
The terms "first" and "second" below are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined with "first" or "second" may explicitly or implicitly include one or more such features. In the description of this embodiment, unless otherwise specified, "plurality" means two or more.
For a better understanding of aspects of embodiments of the present application, related terms and concepts that may be related to embodiments of the present application are described below.
Screen projection in the embodiments of the present application is a technology for projecting the picture or content of an application running on one device onto the display screen or display medium of another device, and is a typical form of information synchronization. In the embodiments of the present application, the device that projects its application picture is referred to as the source device, and the device that receives and displays the projected picture is referred to as the destination device.
In this embodiment, the source device and the destination device need to establish a communication connection in advance. The communication connection may be wired or wireless, and may be implemented, for example, via Bluetooth, Wi-Fi, or a universal serial bus; this embodiment is not particularly limited, and the connection may be selected adaptively according to the functions supported by both the source device and the destination device. The destination device can be flexibly determined according to the specific application scenario: for example, in a driving scenario the destination device may be a vehicle-mounted device, and in a home scenario it may be a home device such as a smart television.
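Selecting a connection according to the functions both devices support can be illustrated minimally as follows; the capability names and the preference order are assumptions for this sketch, not specified by the patent:

```python
def choose_transport(source_caps: set, dest_caps: set) -> str:
    """Pick a mutually supported link for the screen projection connection.
    The preference order (Wi-Fi, then Bluetooth, then USB) is illustrative."""
    for transport in ("wifi", "bluetooth", "usb"):
        if transport in source_caps and transport in dest_caps:
            return transport
    raise RuntimeError("no common transport between source and destination device")

# A phone supporting Wi-Fi and USB paired with a car unit supporting Bluetooth and Wi-Fi.
print(choose_transport({"wifi", "usb"}, {"bluetooth", "wifi"}))  # wifi
```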
Screen projection may be wired or wireless. For wired screen projection, a wired connection may be established between the source device and the destination device through a high-definition multimedia interface (HDMI), a universal serial bus (USB) interface, or the like, so as to transmit media data. For wireless screen projection, a wireless connection may be established between the source device and the destination device to transmit media data through the Digital Living Network Alliance (DLNA) protocol, wireless display sharing (Miracast), or the AirPlay protocol.
For example, during screen projection, the source device may compress the picture of the current video player through data encoding and then send it to the destination device; after decoding, the destination device rearranges the picture and displays the screen projection content on its display screen. Alternatively, the source device may first rearrange the picture of the current video player, encode and compress the rearranged picture, and send it to the destination device, which decodes it and displays the screen projection content on its display screen. Or, the source device may compress the picture of the current video player through data encoding and send it to a development device; the development device rearranges the picture, encodes and compresses it, and sends it to the destination device, which decodes it and displays the screen projection content on its display screen. For convenience of description, the screen projection content displayed on the display screen of the destination device may be referred to in the embodiments of the present application as a mirror image of the screen projection content of the source device.
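The first of the three pipelines above (source encodes and compresses, destination decodes and rearranges) can be sketched as follows. Here `zlib` stands in for a real video codec, and the byte-slice "rearrangement" is a deliberately simplified stand-in for real picture layout:

```python
import zlib

def source_encode(frame: bytes) -> bytes:
    """Source device: compress the current picture before transmission
    (zlib stands in for a real video codec such as H.264)."""
    return zlib.compress(frame)

def destination_decode_and_rearrange(payload: bytes, region: slice) -> bytes:
    """Destination device: decode the payload, then rearrange it for the local
    screen (modeled here as cropping one selected region)."""
    frame = zlib.decompress(payload)
    return frame[region]

# The destination keeps only the video area of the player picture.
frame = b"STATUSBAR|VIDEO-AREA|CONTROLS"
shown = destination_decode_and_rearrange(source_encode(frame), slice(10, 20))
print(shown)  # b'VIDEO-AREA'
```

The second and third pipelines differ only in which device performs the rearrangement step before or after the encode/decode pair.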
The embodiment of the present application provides a screen projection display method that can be applied to a first device; specifically, the first device may be the source device, the destination device, or a development device. The method can capture one or more to-be-projected pictures from the target application on the source device and rearrange them for display on the destination device, so that the application picture can adapt to the screen sizes of various electronic devices, which is convenient to use and improves the user experience.
The screen projection display method provided by the embodiment of the present application can be applied to an electronic device, which may be a handheld device, a vehicle-mounted device, a wearable device, an augmented reality (AR) device, a virtual reality (VR) device, a projection device, a projector, or another device connected to a wireless modem, and may also be user equipment (UE), a terminal device, a mobile phone (smart phone), a smart screen, a smart television, a smart watch, a notebook computer, a smart speaker, a camera, a gamepad, a microphone, a station (STA), an access point (AP), a mobile station (MS), a personal digital assistant (PDA), a personal computer (PC), a relay device, or the like.
Take two electronic devices, a smart watch and a mobile phone, as an example. When the smart watch and the mobile phone are connected through a wireless communication technology (such as Bluetooth, wireless fidelity, ZigBee, or near field communication) or a data line (such as a USB data line), the mobile phone, as the source device, can project the picture of a running application onto the display screen of the smart watch, with the smart watch serving as the destination device; or the smart watch, as the source device, can project the picture of a running application onto the display screen of the mobile phone, with the mobile phone serving as the destination device.
In the embodiment of the present application, the source device and the destination device for screen projection may be directly connected, for example through Bluetooth or Wi-Fi; alternatively, the two electronic devices may be indirectly connected by each connecting to another electronic device, such as a cloud server. During screen projection, the connection between the two electronic devices can switch between direct and indirect connection, which is not limited in the embodiment of the present application.
The application program interface in the embodiments of the present application is a medium interface for interaction and information exchange between an application program and a user; it converts between the internal form of information and a form acceptable to the user. A common presentation form of an application program interface is the graphical user interface (GUI), a user interface related to computer operations that is displayed graphically. The interface may include visual interface elements such as icons, buttons, menus, tabs, text boxes, dialog boxes, status bars, navigation bars, and widgets.
Part I: the software and hardware operating environment of the technical solution disclosed in the present application is introduced as follows.
By way of example, fig. 1a shows a schematic diagram of an electronic device 100. Taking a mobile phone as an example, the electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a USB interface 130, a charge management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, keys 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a SIM card interface 195, and the like. The sensor module 180 may include a gyroscope sensor 180A, an acceleration sensor 180B, a barometric pressure sensor 180C, a magnetic sensor 180D, an ambient light sensor 180E, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, and a touch sensor 180K (of course, the electronic device 100 may also include other sensors, not shown, such as a pressure sensor, a distance sensor, and a bone conduction sensor).
It should be understood that the illustrated structure of the embodiment of the present application does not constitute a specific limitation on the electronic device 100. In other embodiments of the application, electronic device 100 may include more or fewer components than shown, or certain components may be combined, or certain components may be split, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units. For example, the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU). The different processing units may be separate devices or may be integrated in one or more processors. The controller may be the neural hub and command center of the electronic device 100. The controller can generate operation control signals according to the instruction operation code and timing signals, so as to complete the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache. The memory may hold instructions or data that the processor 110 has just used or used cyclically. If the processor 110 needs to reuse the instructions or data, it can call them directly from the memory, which avoids repeated accesses, reduces the waiting time of the processor 110, and thereby improves system efficiency.
The processor 110 can run the screen projection method provided by the embodiment of the present application, thereby enriching the screen projection functions, improving the flexibility of screen projection, and improving the user experience. The processor 110 may include different devices; for example, when a CPU and a GPU are integrated, the CPU and the GPU may cooperate to execute the screen projection method provided by the embodiment of the present application, with some of the algorithms in the method executed by the CPU and others by the GPU, so as to obtain higher processing efficiency.
The display screen 194 is used to display images, videos, and the like. The display screen 194 includes a display panel. The display panel may employ a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, quantum dot light-emitting diodes (QLED), or the like. In some embodiments, the electronic device 100 may include 1 or N display screens 194, N being a positive integer greater than 1. The display screen 194 may be used to display information entered by or provided to the user, as well as various graphical user interfaces (GUIs). For example, the display screen 194 may display photographs, videos, web pages, or files. For another example, the display screen 194 may display a graphical user interface including a status bar, a hidden navigation bar, time and weather widgets, and application icons such as a browser icon. The status bar includes the operator name (e.g., China Mobile), the mobile network (e.g., 4G), the time, and the remaining battery level. The navigation bar includes a back key icon, a home screen key icon, and a forward key icon. Further, in some embodiments, the status bar may also include a Bluetooth icon, a Wi-Fi icon, an external device icon, and the like. In other embodiments, the graphical user interface may also include a Dock bar, which may contain commonly used application icons and the like.
When the processor detects a touch event of a user's finger (or a stylus, etc.) on an application icon, it opens the user interface of the application corresponding to that icon in response to the touch event, and displays the user interface of the application on the display screen 194.
In the embodiment of the present application, the display screen 194 may be an integral flexible display, or a tiled display composed of two rigid screens and a flexible screen located between them. After the processor 110 runs the screen projection method provided by the embodiment of the present application, the processor 110 can control an external audio output device to switch the output audio signal.
The camera 193 (front camera or rear camera, or one camera may be used as both front camera and rear camera) is used to capture still images or video. In general, the camera 193 may include a photosensitive element such as a lens group including a plurality of lenses (convex lenses or concave lenses) for collecting optical signals reflected by an object to be photographed and transmitting the collected optical signals to an image sensor. The image sensor generates an original image of the object to be photographed according to the optical signal.
The internal memory 121 may be used to store computer executable program code including instructions. The processor 110 executes various functional applications of the electronic device 100 and data processing by executing instructions stored in the internal memory 121. The internal memory 121 may include a storage program area and a storage data area. The storage program area may store, among other things, code for an operating system, an application program (e.g., a camera application, a WeChat application, etc.), and so on. The storage data area may store data created during use of the electronic device 100 (e.g., images, video, etc. captured by a camera application), and so on.
The internal memory 121 may also store one or more computer programs corresponding to the screen projection method provided in the embodiment of the present application. The one or more computer programs, which may include an account verification module, a priority comparison module, and a state synchronization module, are stored in the internal memory 121 and configured to be executed by the one or more processors 110. The account verification module is used for authenticating the system accounts of other terminal devices in the local area network. The priority comparison module may be used to compare the priority of an audio output request service with the priority of the service currently output by the audio output device. The state synchronization module may be used to synchronize the device state of the audio output device currently accessed by the terminal device to other terminal devices, or to synchronize the device state of the audio output device currently accessed by other devices to the local device. When the code of the screen projection method stored in the internal memory 121 is executed by the processor 110, the processor 110 may control the transmitting end to perform screen projection data processing.
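As a minimal sketch of the priority comparison module described above, the following illustrates comparing the priority of an audio output request service with that of the service the audio output device is currently rendering. The service names and priority values are illustrative assumptions, not values from the patent.

```python
# Hypothetical priority table; the patent does not specify service names or levels.
SERVICE_PRIORITY = {
    "call": 3,         # assumption: voice calls preempt everything else
    "screen_cast": 2,  # assumption: screen projection audio
    "music": 1,        # assumption: background media playback
}

def should_preempt(requested_service: str, current_service: str) -> bool:
    """Return True if the requested service may take over the audio output device."""
    return SERVICE_PRIORITY.get(requested_service, 0) > SERVICE_PRIORITY.get(current_service, 0)
```

A higher-priority request preempts the current output; equal or lower priority leaves the current service in place.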
In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (universal flash storage, UFS), and the like.
Of course, the codes of the screen projection method provided by the embodiment of the application can also be stored in an external memory. In this case, the processor 110 may run the code of the screen-casting method stored in the external memory through the external memory interface 120, and the processor 110 may control the transmitting end to perform the screen-casting data processing.
The gyro sensor 180A may be used to determine the motion attitude of the electronic device 100. In some embodiments, the angular velocity of the electronic device 100 about three axes (i.e., the x, y, and z axes) may be determined by the gyro sensor 180A. That is, the gyro sensor 180A may be used to detect the current motion state of the electronic device 100, such as shaking or stationary.
When the display screen in the embodiment of the present application is a foldable screen, the gyro sensor 180A may be used to detect a folding or unfolding operation acting on the display screen 194. The gyro sensor 180A may report the detected folding operation or unfolding operation to the processor 110 as an event to determine the folding state or unfolding state of the display screen 194.
The acceleration sensor 180B may detect the magnitude of the acceleration of the electronic device 100 in various directions (typically along three axes). That is, the acceleration sensor 180B may also be used to detect the current motion state of the electronic device 100, such as shaking or stationary. When the display screen in the embodiment of the present application is a foldable screen, the acceleration sensor 180B may be used to detect a folding or unfolding operation acting on the display screen 194. The acceleration sensor 180B may report the detected folding or unfolding operation as an event to the processor 110 to determine the folding or unfolding state of the display screen 194.
The proximity light sensor 180G may include, for example, a Light Emitting Diode (LED) and a light detector, such as a photodiode. The light emitting diode may be an infrared light emitting diode. The mobile phone emits infrared light outwards through the light emitting diode. The cell phone uses a photodiode to detect infrared reflected light from nearby objects. When sufficient reflected light is detected, it can be determined that there is an object in the vicinity of the handset. When insufficient reflected light is detected, the handset may determine that there is no object in the vicinity of the handset. When the display screen in the embodiment of the present application is a foldable screen, the proximity light sensor 180G may be disposed on a first screen of the foldable display screen 194, and the proximity light sensor 180G may detect a folding angle or an unfolding angle of the first screen and the second screen according to an optical path difference of the infrared signal.
The gyro sensor 180A (or the acceleration sensor 180B) may transmit the detected motion state information (such as the angular velocity) to the processor 110. The processor 110 determines whether the electronic device 100 is currently in a handheld state or a stand state based on the motion state information (for example, when the angular velocity is not 0, it indicates that the electronic device 100 is in the handheld state).
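The handheld-versus-stand heuristic above can be sketched as follows. This is a simplified illustration (the threshold `eps` is an assumption; a real implementation would filter sensor noise):

```python
def device_state(angular_velocity, eps=1e-3):
    """Classify the device per the heuristic in the text: a (significantly)
    nonzero angular velocity on any axis indicates a handheld state;
    otherwise the device is assumed to be on a stand."""
    return "handheld" if any(abs(w) > eps for w in angular_velocity) else "stand"
```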
The fingerprint sensor 180H is used to collect a fingerprint. The electronic device 100 may utilize the collected fingerprint feature to unlock the fingerprint, access the application lock, photograph the fingerprint, answer the incoming call, etc.
The touch sensor 180K is also referred to as a "touch panel". The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, also called a "touchscreen". The touch sensor 180K is used to detect a touch operation acting on or near it. The touch sensor may pass the detected touch operation to the application processor to determine the touch event type. Visual output related to the touch operation may be provided through the display screen 194. In other embodiments, the touch sensor 180K may also be disposed on the surface of the electronic device 100 at a location different from that of the display screen 194.
Illustratively, the display 194 of the electronic device 100 displays a main interface that includes icons of a plurality of applications (e.g., a camera application, a WeChat application, etc.). The user clicks the icon of the camera application in the main interface via the touch sensor 180K, triggering the processor 110 to launch the camera application and open the camera 193. The display 194 then displays the interface of the camera application, such as a viewfinder interface.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover a single or multiple communication bands. Different antennas may also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed into a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution for wireless communication including 2G/3G/4G/5G, etc., applied to the electronic device 100. The mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (low noise amplifier, LNA), etc. The mobile communication module 150 may receive electromagnetic waves from the antenna 1, perform processes such as filtering, amplifying, and the like on the received electromagnetic waves, and transmit the processed electromagnetic waves to the modem processor for demodulation. The mobile communication module 150 can amplify the signal modulated by the modem processor, and convert the signal into electromagnetic waves through the antenna 1 to radiate. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be provided in the same device as at least some of the modules of the processor 110. In the embodiment of the present application, the mobile communication module 150 may also be used for performing information interaction with other terminal devices, that is, sending screen-casting related data to other terminal devices, or the mobile communication module 150 may be used for receiving a screen-casting request and encapsulating the received screen-casting request into a message in a specified format.
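The text above says a received screen projection request is encapsulated into "a message in a specified format" without specifying that format. The following is a hypothetical sketch using a JSON envelope; all field names are illustrative assumptions:

```python
import json

def encapsulate_request(source_id, app_name, width, height):
    """Wrap a screen projection request in a hypothetical JSON envelope.
    The patent only states that the request is encapsulated into a message
    in a specified format; this schema is invented for illustration."""
    return json.dumps({
        "type": "screen_projection_request",
        "source": source_id,
        "app": app_name,
        "resolution": {"w": width, "h": height},
    }, sort_keys=True)
```

The destination (or relay) device can then parse the envelope and dispatch on the `type` field.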
The modem processor may include a modulator and a demodulator. The modulator is used for modulating the low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then transmits the demodulated low frequency baseband signal to the baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs sound signals through an audio device (not limited to the speaker 170A, the receiver 170B, etc.), or displays images or video through the display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 150 or other functional module, independent of the processor 110.
The wireless communication module 160 may provide solutions for wireless communication applied to the electronic device 100, including wireless local area network (WLAN) (e.g., a wireless fidelity (Wi-Fi) network), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), and the like. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, frequency-modulates and filters the electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, frequency-modulate and amplify it, and convert it into electromagnetic waves for radiation via the antenna 2. In the embodiment of the present application, the wireless communication module 160 is configured to establish a connection with a receiving end and display the screen projection content through the receiving end. Alternatively, the wireless communication module 160 may be configured to access the access point device, send a message corresponding to the screen projection request to other terminal devices, or receive a message corresponding to an audio output request sent by other terminal devices.
In addition, the electronic device 100 may implement audio functions through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the earphone interface 170D, and the application processor, etc. Such as music playing, recording, etc. The electronic device 100 may receive key 190 inputs, generating key signal inputs related to user settings and function control of the electronic device 100. The electronic device 100 may generate a vibration alert (such as an incoming call vibration alert) using the motor 191. The indicator 192 in the electronic device 100 may be an indicator light, may be used to indicate a state of charge, a change in power, may be used to indicate a message, a missed call, a notification, etc. The SIM card interface 195 in the electronic device 100 is used to connect a SIM card. The SIM card may be inserted into the SIM card interface 195, or removed from the SIM card interface 195 to enable contact and separation with the electronic device 100.
It should be understood that in practical applications, electronic device 100 may include more or fewer components than those shown in FIG. 1a, and embodiments of the present application are not limited. The illustrated electronic device 100 is only one example, and the electronic device 100 may have more or fewer components than shown in the figures, may combine two or more components, or may have a different configuration of components. The various components shown in the figures may be implemented in hardware, software, or a combination of hardware and software, including one or more signal processing and/or application specific integrated circuits.
By way of example, fig. 1b shows a block diagram of the software architecture of the electronic device 100. The layered architecture divides the software into several layers, each with distinct roles and responsibilities. The layers communicate with each other through software interfaces. In some embodiments, the Android system is divided into four layers, from top to bottom: an application layer, an application framework layer, the Android runtime (Android Runtime) and system libraries, and a kernel layer. The application layer may include a series of application packages.
As shown in fig. 1b, the application package may include applications such as camera, gallery, calendar, phone call, map, navigation, WLAN, bluetooth, music, video, short message, etc.
The application framework layer provides an application programming interface (application programming interface, API) and programming framework for application programs of the application layer. The application framework layer includes a number of predefined functions.
As shown in FIG. 1b, the application framework layer may include a window manager, a content provider, a view system, a telephony manager, a resource manager, a notification manager, and the like.
The window manager is used for managing window programs. The window manager can acquire the size of the display screen, judge whether a status bar exists, lock the screen, intercept the screen and the like.
The content provider is used to store and retrieve data and make such data accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phonebooks, etc.
The view system includes visual controls, such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, a display interface including a text message notification icon may include a view displaying text and a view displaying a picture.
The telephony manager is used to provide the communication functions of the electronic device 100. Such as the management of call status (including on, hung-up, etc.).
The resource manager provides various resources for the application program, such as localization strings, icons, pictures, layout files, video files, and the like.
The notification manager allows an application to display notification information in the status bar. It can be used to convey notification-type messages that automatically disappear after a short stay without requiring user interaction. For example, the notification manager is used to notify that a download is complete, to give a message alert, and so on. The notification manager may also present notifications in the form of a chart or scroll-bar text in the system top status bar, such as a notification of an application running in the background, or a notification that appears on the screen in the form of a dialog window. For example, a text message is prompted in the status bar, a prompt tone is emitted, the electronic device vibrates, or an indicator light blinks.
The Android Runtime includes a core library and virtual machines. The Android Runtime is responsible for scheduling and management of the Android system.
The core library consists of two parts: one part is the functions that the Java language needs to call, and the other part is the core library of Android.
The application layer and the application framework layer run in a virtual machine. The virtual machine executes java files of the application program layer and the application program framework layer as binary files. The virtual machine is used for executing the functions of object life cycle management, stack management, thread management, security and exception management, garbage collection and the like.
The system library may include a plurality of functional modules. For example: surface manager (surface manager), media library (media library), three-dimensional graphics processing library (e.g., openGL ES), 2D graphics engine (e.g., SGL), etc.
The surface manager is used to manage the display subsystem and provides a fusion of 2D and 3D layers for multiple applications.
Media libraries support a variety of commonly used audio, video format playback and recording, still image files, and the like. The media library may support a variety of audio video encoding formats, such as: MPEG4, h.264, MP3, AAC, AMR, JPG, PNG, etc.
The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like.
The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is a layer between hardware and software. The inner core layer at least comprises a display driver, a camera driver, an audio driver and a sensor driver.
In the second part, an example application scenario disclosed in the embodiment of the present application is described below.
The technical solution of the embodiment of the present application may be applied to, for example, the screen projection system 20 shown in fig. 2 a. Wherein the screening system 20 may include a source device 210 and a destination device 220. The source device 210 may include an electronic device 210A, an electronic device 210B, and an electronic device 210C, and the destination device 220 may include an electronic device 220A, an electronic device 220B, and an electronic device 220C. Meanwhile, the source device 210 and the destination device 220 may be connected to each other through a wireless network or wired data communication.
Specifically, the source device 210 and the destination device 220 may be devices under the same user account. For example, when a user logs in a mobile phone, a desktop computer, a smart screen, a notebook computer, a relay device, and a smart watch using the same user account, the source device 210 and/or the destination device 220 may be a mobile phone, a desktop computer, a smart screen, a notebook computer, a relay device, and a smart watch, and the mobile phone, the desktop computer, the smart screen, the notebook computer, the relay device, and the smart watch may communicate with each other through a wireless network.
Specifically, the source device 210 and the destination device 220 may be connected to the same WLAN network through a relay device (e.g., a router). For example, when a user accesses a cell phone, a desktop computer, a smart screen, a notebook computer, and a smart watch to a Wi-Fi network provided by a relay device, the source device 210 and the destination device 220 may include the cell phone, the desktop computer, the smart screen, the notebook computer, the relay device, and the smart watch, and the cell phone, the desktop computer, the smart screen, the notebook computer, the relay device, and the smart watch form one WLAN network, so that communication between the respective devices within the WLAN network may be achieved through the relay device.
Further, the source device may project the current picture of the application it is running to the destination device for display. In addition, when the destination device needs to display the projected pictures of a plurality of source devices at the same time, the destination device can display them simultaneously in a split-screen manner. For example, electronic device 210A projects its screen to electronic device 220A and electronic device 220B, and electronic device 210B projects its screen to electronic device 220B and electronic device 220C. At this time, the electronic device 220B may simultaneously display the projected pictures from the electronic device 210A and the electronic device 210B in a split-screen manner.
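The split-screen arrangement described above can be sketched as a simple region-allocation function. This is an illustrative equal-width vertical split, assuming the patent's unspecified layout policy; real destination devices may use other tilings:

```python
def split_screen_rects(screen_w, screen_h, n_sources):
    """Divide the destination display into equal-width vertical regions,
    one per projecting source device. Returns (x, y, w, h) tuples.
    Equal vertical splitting is an assumption for illustration."""
    w = screen_w // n_sources
    return [(i * w, 0, w, screen_h) for i in range(n_sources)]
```

For example, with two sources on a 1920x1080 destination display, each projected picture gets a 960x1080 region.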
It should be appreciated that the screen projection system 20 may also include other numbers of electronic devices, which is not specifically limited herein.
Illustratively, the technical solution of the embodiment of the present application may be applied to the screen projection system 30 as shown in fig. 2 b. The screen-casting system 30 may include a source device 210, a destination device 220, and a cloud device 230, among others. The source device 210 and the destination device 220 may be devices under the same user account, and the cloud device 230 may store application layout information and/or an external resource list uploaded by the source device 210 and the destination device 220. The source device 210 or the destination device 220 may acquire application layout information and/or an external resource list from the cloud device 230 for screen projection display when screen projection is performed.
Illustratively, the technical solution of the embodiment of the present application may be applied to the screen projection system 30 as shown in fig. 2 c. The screen projection system 30 may include a source device 210, a destination device 220, a cloud device 230, and a development device 240, among others. The source device 210 and the destination device 220 may be devices under the same user account, and the development device 240 may be configured to develop an application layout and an external resource information list, and upload the developed application layout to the cloud device 230, so that the source device 210 or the destination device 220 may acquire the application layout from the cloud device 230 for screen-casting display.
In the third part, the screen projection display method disclosed in the embodiments of the present application is described below.
Referring to fig. 3, fig. 3 is a flowchart of a screen projection display method according to an embodiment of the present application, which is applied to the electronic device shown in fig. 1 a. As shown in fig. 3, the screen projection display method includes the following operations.
S310, displaying an editing interface window, wherein the editing interface window comprises a first area and a second area, a first picture is displayed in the first area, and the first picture is a target picture in a target application running on the source equipment.
When receiving a screen projection instruction, the electronic device may start the screen projection function and execute the operation of S310. The screen projection instruction may be input by a user or a third-party device, or may be actively generated by the electronic device, as determined by the actual application scenario.
In the embodiment of the application, screen projection is performed with an application program (hereinafter referred to as an application) as the object. Before projecting the screen, the source device needs to determine the target application to be projected this time. In the embodiment of the application, the application to be projected may be determined with reference to the following manners:
1. After the electronic device starts the screen projection function, the application displayed on the current user interface at the moment the function is started may be determined as the target application, so as to meet the user's personalized screen projection requirement. As shown in fig. 4a, the electronic device is a mobile phone whose current page displays a music player; when the screen projection function is started, the music player is determined as the target application, and a popup window is displayed on the current page for confirmation.
2. One or more default screen projection applications are preset by a technician or a user, and after starting the screen projection function the electronic device sets the default applications as the applications to be projected this time. On the basis of meeting the user's personalized screen projection requirement, the user does not need to select an application every time, so the operation is more convenient. As shown in fig. 4b, the electronic device is a mobile phone; when the screen projection function is started, a popup window is displayed on the current interface, the popup window may include one or more preset default screen projection applications, and the user may select a default application in the popup window. On this basis, when the screen projection function is started next time, the application to be projected can be selected according to the default application set by the user.
3. On the basis of setting the default screen projection application in manner 2 above, after the electronic device starts the screen projection function, a popup window including one or more preset default screen projection applications may be displayed on the current interface, and the user may select the target application for each screen projection. Compared with manner 2, manner 3 enables the user to control more flexibly which applications are projected each time.
4. The transmitting end automatically selects one or more target applications according to a preset rule. In manner 4, the technician may preset some application selection rules; for example, all applications that are running and support screen projection may be selected as target applications. After the screen projection function is started, the target application is selected automatically according to the preset rule.
In practical applications, the target application may be determined with reference to any one of the above 4 manners. A technician may also set other target application determination methods according to the requirements of the actual application scenario. The method for determining the target application is not unduly limited here.
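The selection manners above can be sketched as a single function: preset defaults (manners 2/3) take precedence, and otherwise an automatic rule (manner 4) selects all running applications that support projection. The dictionary keys are illustrative assumptions, not names from the patent:

```python
def select_target_apps(apps, default_apps=None):
    """Determine the target application(s) for projection.
    `default_apps` models a user/technician preset (manners 2 and 3);
    the fallback models the preset rule of manner 4: running apps
    that support screen projection."""
    if default_apps:
        return [a["name"] for a in apps if a["name"] in default_apps]
    return [a["name"] for a in apps if a["running"] and a["supports_projection"]]
```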
Further, after determining the target application of the screen projection, the target picture in the target application needs to be determined. In the embodiment of the present application, the target picture may be determined with reference to the following manners:
1. If the target application is determined to be the application displayed on the current user interface, the picture displayed on the current user interface at the moment the screen projection function is started may be determined as the target picture. For example, as shown in fig. 4a, assume that the electronic device is a mobile phone and the current page of the mobile phone displays a music player; the picture of the currently played music is thus determined as the target picture.
2. If the determined target application is an application currently running in the background, the picture last displayed by the target application on the user interface may be determined as the target picture. For example, assume that the electronic device is a mobile phone and the determined target application is a music player running in the background; if the last picture displayed by the music player on the user interface is as shown in fig. 4a, the picture of fig. 4a is determined as the target picture.
3. One or more default screen projection pictures of the target application are preset by a technician or a user, and after the screen projection function is started, the default screen projection picture is set as the target picture of this screen projection. For example, referring also to fig. 4c, assume that the electronic device is a mobile phone. The user can manually enter the screen projection picture selection interface before the screen projection function is started and select the default screen projection picture. On this basis, when the screen projection function is started next time, the target picture can be selected according to the default screen projection picture set by the user, and the screen projection picture selection interface can be entered again to modify the picture of the current screen projection.
4. One or more pictures to be projected are selected according to a preset rule. In manner 4, a technician may preset some picture selection rules; for example, the pictures that display target application information and are available for screen projection may all be selected as pictures to be projected. After the screen projection function is started, the picture to be projected is selected automatically according to the preset rule. For example, when the target application is a music player, the rule for selecting the target picture may be set to the control picture of the currently playing song, such as the interface shown in fig. 4c.
In practical applications, the target picture may be determined with reference to any one of the above 4 manners. A technician may also set other target picture determination methods according to the requirements of the actual application scenario. The method for determining the target picture is not unduly limited here.
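Analogously to target-application selection, the target-picture manners above can be sketched as one precedence rule. The field names are illustrative assumptions:

```python
def select_target_picture(app, default_picture=None):
    """Pick the target picture following the manners above: a preset or
    rule-selected default picture wins (manners 3/4); otherwise use the
    foreground picture (manner 1) or the picture last displayed before
    the app went to the background (manner 2)."""
    if default_picture is not None:       # manners 3/4
        return default_picture
    if app["foreground"]:                 # manner 1
        return app["current_picture"]
    return app["last_displayed_picture"]  # manner 2
```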
In the embodiment of the application, the editing interface window is used for producing the application layout of the picture to be projected. The editing interface window includes a first area and a second area. The first area is used for displaying the target picture in the target application acquired from the source device. The second area is used for performing application layout operations on the picture to be projected, and may include a first display area for displaying an operation icon corresponding to at least one interface layout and a second display area for displaying the picture to be projected or the picture already projected. The operation icon includes at least one of: rotation, movement, scaling, deletion, addition, hierarchy setting, and the like. The layout operation on the picture to be projected may be determined according to the operation icon selected by the user.
Optionally, the editing interface window further includes a third area, where the third area includes at least one shape icon corresponding to a selectable shape. The shape icon may include at least one of: rectangle, circle, ellipse, triangle, pentagon, hexagon, arbitrary shape, and the like. Illustratively, the shape of the picture to be projected may also be selected automatically by AI.
Illustratively, as shown in FIG. 5a, the editing interface window includes an APP bar, a layout bar, and a toolbar. Specifically, as shown in fig. 5b, the target application is a music player, and the APP bar is used for displaying the target picture of the music player of the source device. The layout bar includes a first display area for displaying at least one operation icon and a second display area for displaying at least one external picture and one or more pictures to be projected or already projected; the user can select the layout operation corresponding to an operation icon in the first display area and apply it to the picture to be projected in the second display area. The toolbar includes at least one shape icon, and the shape of the picture to be projected can be determined according to the shape icon selected by the user.
Alternatively, the electronic device may be the source device or the destination device, or may be a development device.
Because the editing interface window can be located on the source device, the destination device, or a development device, the application supports three screen projection modes. In the first mode, the source device performs application layout on the acquired target picture of the target application, encodes the resulting screen projection picture through the encoding module, and sends it to the destination device, which decodes it and directly displays it. In the second mode, the source device directly encodes the acquired target picture of the target application and sends it to the destination device; after decoding, the destination device performs application layout on the whole target picture or a selected part of it and then displays the screen projection picture. In the third mode, the source device directly encodes the acquired target picture and sends it to the development device; the development device performs application layout and encoding on the target picture and sends the result to the destination device, which decodes it and directly displays it. That is, the application layout is performed on the source device in the first mode, on the destination device in the second mode, and on a third-party device in the third mode.
Because the application layout process consumes a certain amount of resources, if the computing capabilities of the source device and the destination device differ, the application layout tends to be placed on the device with the stronger computing capability. In addition, if the application layout is performed on the source device, the content to be encoded and transmitted is smaller than the original picture, which saves network bandwidth and speeds up encoding and decoding; this is also an important factor in some scenarios. When the editing interface window is not installed on either the source device or the destination device, the application layout can be performed on a third-party device: the source device or the destination device can call the third-party device to perform the application layout, so that the editing interface window does not need to be installed on the source device or the destination device. Therefore, in practice, the specific scenario is considered comprehensively and the decision is made automatically.
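The automatic decision described above (weighing compute capability, bandwidth, and where the editing interface window is installed) can be sketched as a small heuristic. This is a minimal illustrative sketch in Python; the function name, parameters, and decision order are assumptions, not the patent's actual rules:

```python
def choose_layout_device(source_flops, dest_flops, editor_on_source,
                         editor_on_dest, bandwidth_limited):
    """Pick which device runs the application-layout step (modes 1/2/3)."""
    if not (editor_on_source or editor_on_dest):
        # Mode 3: neither end has the editing interface window installed.
        return "development_device"
    if bandwidth_limited and editor_on_source:
        # Mode 1: laying out first means less content to encode and send.
        return "source_device"
    if editor_on_source and source_flops >= dest_flops:
        # Mode 1: prefer the device with the stronger computing capability.
        return "source_device"
    # Mode 2: lay out on the destination device after decoding.
    return "destination_device"

print(choose_layout_device(10, 5, True, True, False))  # source_device
```

A real implementation would measure these inputs at connection time; the point of the sketch is only that the layout location is chosen per scenario rather than fixed.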
Specifically, when the electronic device is the source device, the source device starts the screen projection operation flow upon receiving a screen projection instruction, displays the editing interface window on the user interface, and displays the target picture of the target application on the source device in the first area of the editing interface window. When the electronic device is the destination device, the destination device starts the screen projection operation flow upon receiving a screen projection instruction, receives the target picture sent by the source device, displays the editing interface window on the user interface, and displays the received target picture in the first area of the editing interface window.
S320: in response to at least one first input operation on the first picture, select from the first picture a to-be-projected picture list corresponding to the at least one first input operation, where the to-be-projected picture list includes at least one picture to be projected.
In the embodiment of the application, because the screen parameters (such as screen size, screen resolution, and whether touch operation is supported) of the source device and the destination device may differ, the user can select part of the picture from the target picture for projection.
The first input operation may be, for example, a sliding or other touch operation of the user's finger on the first picture, or a sliding or other touch operation of the user's finger joint on the first picture, such as sliding a closed circle as shown in fig. 6a, an unclosed "C" shape as shown in fig. 6b, or another possible shape, which is not limited by the present application.
Optionally, selecting, in response to at least one first input operation on the first picture, the picture to be projected corresponding to the first input operation from the first picture includes: in response to a third input operation on a shape icon in the third area, determining the selected shape of each picture to be projected in the to-be-projected picture list; determining the selection area corresponding to each first input operation; and capturing, in the selected shape, the picture within the selection area of the first picture to obtain the to-be-projected picture list.
The first picture may include a plurality of functional areas, each of which implements a different function. For example, as shown in fig. 4a, the functional areas in the music playing picture include: a song name display function, a return function, a share function, a lyric display function, a collection function, a download function, a comment function, a playing time adjustment function, a loop setting function, a previous song selection function, a play function, a next song selection function, and a song list viewing function.
Specifically, when selecting the picture to be displayed on the destination device, the user may first click a shape icon in the third area to choose the shape in which the picture to be projected is captured from the first picture. The user may then select one or more pictures to be projected on the first picture through gesture selection, and the selected pictures are displayed, in the selected shape, in the second display area of the second area. Illustratively, the user may select on the first picture the functional areas to be displayed on the destination device according to the size of the destination device's display screen and the user's own functional requirements for the interface. For example, as shown in fig. 6c, the user selects the rectangular shape icon in the toolbar, so the selected shape is "rectangle", and selecting the area shown by the dashed box on the first picture can be understood as the user selecting the previous song selection function, the play function, and the next song selection function. The operation of selecting the area shown by the dashed box may be a sliding operation of the user along two position coordinates on a diagonal of that area, for example, a sliding operation from the upper-left starting coordinate position 1 of the dashed box to the lower-right coordinate position 2 shown in the figure, or from the lower-right starting coordinate position 2 to the upper-left coordinate position 1, which is not limited by this application.
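The diagonal slide described above (position 1 to position 2, or the reverse) can be sketched as a small normalization routine that yields the same rectangular selection area regardless of drag direction. This is an illustrative assumption of how the selection area might be computed, not the patent's actual implementation:

```python
def selection_rect(p1, p2):
    """Normalize a diagonal drag into (left, top, width, height).

    p1 and p2 are the start and end coordinates of the slide; either
    diagonal direction (upper-left to lower-right, or the reverse)
    yields the same rectangle, as described in the text.
    """
    (x1, y1), (x2, y2) = p1, p2
    return (min(x1, x2), min(y1, y2), abs(x2 - x1), abs(y2 - y1))

# Sliding 1 -> 2 and sliding 2 -> 1 select the same area.
print(selection_rect((40, 900), (320, 1020)))  # (40, 900, 280, 120)
```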
It should be noted that an electronic device supporting touch operation may select the picture to be projected through a sliding gesture; for electronic devices that do not support touch operation, such as desktop and notebook computers, the picture to be projected may be selected through a click-and-drag operation of a mouse; and for remotely operated electronic devices that do not support touch operation, such as smart televisions and smart speakers, the picture to be projected may be selected by choosing a preset functional area.
By way of example, dragging the mouse over the APP bar causes a shape corresponding to the selected shape icon to appear; the user positions it over the appropriate functional area in the first picture and releases the mouse when the functional area selection is complete. When a shape in the toolbar is selected, the user's operation in the APP bar is intercepted to select the content, and is not captured by the redirection system as an operation on the content itself.
For example, the user may select one or more pictures to be projected from the first picture, and the capture shape of each picture may be the same or different. Specifically, the user may capture a plurality of pictures to be projected from the first picture with the shape corresponding to a single selected shape icon; for example, as shown in fig. 6d, the shape icon selected by the user is a rectangle, the selected functional areas are a first functional area and a second functional area, and the two functional areas are each displayed in rectangular shape in the second display area of the layout bar. The user may also select a capture shape before each capture; for example, as shown in fig. 6e, the first shape icon selected by the user is a circle and the selected functional area is the first functional area, while the second selected shape icon is a rectangle and the selected functional area is the second functional area, so the first functional area is displayed in circular shape and the second functional area in rectangular shape in the second display area of the layout bar.
S330: display a second picture in the second area, where the second picture includes the to-be-projected picture list.
The one or more pictures to be projected that the user captured from the first picture may be displayed, in the selected shape, in the second display area of the second area. The second display area of the second area displays the pictures to be projected in the form of a picture list.
Optionally, the second picture may further include an external screen projection picture list, where the external screen projection picture list includes at least one external screen projection picture imported by the user.
The electronic device may also obtain an external resource list from a cloud device or locally, where the external resource list may include, but is not limited to: background pictures, logo pictures, and pictures to be projected. The user may add one or more external screen projection pictures to the picture list in the second display area through the "add" operation icon in the first display area of the second area.
It should be noted that, because the external screen projection picture is not derived from a picture of the target application on the source device, it is only used by the user to beautify the screen projection picture so that it is displayed more attractively on the destination device and meets user requirements; the external screen projection picture therefore does not include a functional display area.
S340: in response to at least one second input operation in the second area, edit the second picture into a screen projection picture.
In the embodiment of the application, the user can apply layout operations to the captured pictures to be projected and change their original display and layout relative to the target picture on the source device, so that the screen projection picture can be adapted across different devices. In addition, a custom appearance can be designed for the adapted screen projection picture without changing the original application, and external screen projection pictures can be used to beautify the screen projection picture so that it looks better on the destination device.
The second input operation may be, for example, a touch or other operation of the user's finger on the interface, or a touch or other operation of the user's finger joint on the interface, which is not limited by the present application.
It should be noted that an electronic device supporting touch operation may select the picture to be projected through a touch operation; for electronic devices that do not support touch operation, such as desktop and notebook computers, the picture to be projected may be selected through a mouse click; and for remotely operated electronic devices that do not support touch operation, such as smart televisions and smart speakers, the picture to be projected may be selected by choosing a preset functional area.
In one possible embodiment, the method further includes: acquiring first information of each picture to be projected in the to-be-projected picture list on the source device, where the first information includes at least one of: position, size, and shape; acquiring second information of the destination device, where the second information includes at least one of: interface size, display position of the screen projection picture, scaling of the screen projection picture, and rotation direction of the screen projection picture; and acquiring third information of each external screen projection picture in the external screen projection picture list, where the third information includes position and/or size.
Each captured picture to be projected has information relative to the target application on the source device, such as the position, shape, size, width, and height of its display on the source device's interface and the functions it implements, for example, the coordinates of its upper-left and upper-right starting points on the display interface of the source device and its display width and height there. Each captured picture to be projected also has information relative to the target application on the destination device, such as the interface size of the destination device, the display position of the screen projection picture, the position of the captured picture within the target application on the destination device, and the scaling and rotation of the picture on the destination device, for example, the coordinates of its upper-left and upper-right starting points on the display interface of the destination device and its display width and height there. Further, the electronic device may also obtain information about an external screen projection picture from a cloud device or locally, such as its size, source, shape, and position.
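As a rough sketch, the first, second, and third information described above can be modeled as plain records. The class and field names below are illustrative assumptions, not names from the patent:

```python
from dataclasses import dataclass

@dataclass
class CapturedPictureInfo:
    """'First information': a captured picture relative to the source device."""
    x: int          # upper-left starting point on the source display interface
    y: int
    width: int      # display width and height on the source device
    height: int
    shape: str      # e.g. "rectangle", "circle"

@dataclass
class DestinationInfo:
    """'Second information': how the picture is shown on the destination device."""
    interface_width: int
    interface_height: int
    display_x: int      # display position of the screen projection picture
    display_y: int
    scale: float        # scaling of the picture on the destination device
    rotation_deg: int   # rotation direction of the picture

@dataclass
class ExternalPictureInfo:
    """'Third information': an imported external screen projection picture."""
    x: int
    y: int
    width: int
    height: int
```

These records are what the application-layout step in S340 would read when extracting the information of each picture to be projected.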
Optionally, editing the second picture into a screen projection picture in response to at least one second input operation in the second area includes: acquiring the operation icon corresponding to the at least one second input operation; determining interface layout information of the second picture according to the operation icon; and editing the second picture into the screen projection picture according to the interface layout information and parameter information, where the parameter information includes at least one of the first information, the second information, and the third information.
The electronic device acquires the information of each picture to be projected so that this information can be extracted during the application layout process, thereby implementing the application layout operations on the pictures to be projected.
The user can operate the operation icons in the second area to move, rotate, scale, delete, add, set the hierarchy of, or otherwise edit any picture to be projected and/or external screen projection picture. For example, after the user formulates an application layout, the interface layout information corresponding to the application layout of the target application can be sent to a cloud device or stored locally. At the next setting, the interface layout information can be obtained directly from local storage or the cloud device and applied to the picture to be projected, without repeating the layout operation each time, which is more convenient. For example, the user may further beautify the picture to be projected on top of the original application layout by deleting icons, adding icons, or other operations, such as importing an external screen projection picture as a background picture, deleting one picture from a plurality of pictures to be projected, or rotating a picture to be projected by 180 degrees.
Specifically, as shown in fig. 6d, two pictures to be projected are displayed in the second display area of the layout bar. When the user wants to move the pictures to adjust their arrangement order, the user can touch the operation icon corresponding to "move" in the first display area of the layout bar to trigger the move operation. With the "move" icon selected, the user can rearrange the two pictures in the second display area; fig. 7a shows the picture displayed after the user executes the "move" operation on fig. 6d. When the user wants to rotate a picture to adjust the display direction of the two pictures, the user can touch the operation icon corresponding to "rotate" in the first display area of the layout bar to trigger the rotate operation; fig. 7b shows the screen projection picture displayed after the user executes the "rotate" operation on fig. 7a. When the user wants to delete one of the two pictures, the user can select the "delete" icon and then select the picture to be deleted.
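The move, rotate, and delete operation icons described above can be sketched as edits on an ordered picture list. The data shapes and operation encoding below are assumptions for illustration only:

```python
def apply_layout_ops(pictures, ops):
    """Apply a sequence of operation-icon actions to the picture list.

    pictures: list of dicts, each with at least 'name' and 'rotation'.
    ops: tuples mimicking the icons: ('move', i, j) swaps arrangement
    order, ('rotate', i, deg) rotates picture i, ('delete', i) removes
    it, and ('add', pic) imports an external picture.
    """
    for op in ops:
        kind = op[0]
        if kind == "move":
            _, i, j = op
            pictures[i], pictures[j] = pictures[j], pictures[i]
        elif kind == "rotate":
            _, i, deg = op
            pictures[i]["rotation"] = (pictures[i]["rotation"] + deg) % 360
        elif kind == "delete":
            pictures.pop(op[1])
        elif kind == "add":
            pictures.append(op[1])
    return pictures
```

For instance, the fig. 6d to fig. 7a transition would be a single ('move', 0, 1) operation, followed by a ('rotate', ...) for fig. 7b.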
In addition, when the user wants to import an external screen projection picture, the user can do so by triggering the operation icon corresponding to "add". After the user touches the "add" icon in fig. 7a, the display interface shown in fig. 7c is entered, where the user can import an external screen projection picture. For example, the user may add a background picture and a logo picture as shown in fig. 7d.
In addition, multiple pictures to be projected can be displayed in an overlapping manner, and the user can trigger the hierarchy setting operation through the operation icon corresponding to "hierarchy setting". Specifically, with the "hierarchy setting" icon selected, the selected picture to be projected can be set as the top layer or the bottom layer. For example, through this icon the user can set the background picture as the bottommost layer and the logo picture as the topmost layer. Fig. 7e shows the screen projection picture displayed after the user executes the "hierarchy setting" operation on fig. 7d.
In an exemplary embodiment, touching the first display area of the layout bar brings up the layout operation corresponding to the selected operation icon, the layout operation corresponding to that icon is applied to the selected picture to be projected in the second display area, and the touch ends once the picture has been selected. When an operation icon in the first display area is selected, the user's operation in the second display area is intercepted to execute the layout operation, and is not captured by the redirection system.
For example, between two adjacent pictures among the plurality of pictures to be projected, interpolation processing may be performed at the junction of the pictures so that the transition between different pictures is smooth.
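A minimal sketch of such interpolation, assuming grayscale pixel values along the two touching edges and simple linear blending (the patent does not specify the interpolation method):

```python
def blend_seam(edge_a, edge_b, steps=4):
    """Generate `steps` interpolated pixel rows between two touching edges.

    edge_a / edge_b: lists of grayscale pixel values along the adjoining
    edges of two pictures to be projected; the returned rows fade
    linearly from edge_a toward edge_b to smooth the junction.
    """
    return [
        [round(a + (b - a) * (i + 1) / (steps + 1)) for a, b in zip(edge_a, edge_b)]
        for i in range(steps)
    ]
```

Real pictures would blend per channel (RGB) over a few pixels on each side of the seam; the one-dimensional grayscale version above just shows the idea.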
Optionally, before projecting the screen, the user device may enter the screen projection setting interface shown in fig. 7f, which may include a plurality of setting menus for the process of displaying the target picture on the destination device, such as destination device, automatic application layout, and number of destination devices. When the user performs the operation shown in fig. 7f and clicks "automatic application layout", the electronic device automatically acquires the interface layout information and the parameter information to lay out the picture to be projected, without repeating the layout operation each time, which is more convenient.
It should be understood that the position and arrangement order of each picture to be projected may be adjusted according to the user's habits, and the shape and size of the picture may also be changed by the user. Optionally, when the user enables "automatic application layout" in the screen projection setting interface, the system may by default display the pictures with the shape and size of the user's last setting, or scale them to form the screen projection picture according to the shape and size of the destination device's display screen.
Illustratively, when the user performs the operation shown in fig. 7f and clicks the destination device setting menu, the display interface shown in fig. 7g is entered, and the user can select at least one destination device from the available device list included in the interface. For example, the user may select a smart watch device, or may simultaneously select a plurality of devices such as a watch device and a vehicle-mounted display device.
Illustratively, when the user performs the operation shown in fig. 7g and clicks the destination device number setting menu, the user can select the number of destination devices in the display interface, for example 1, 2, 3, or 4. Because transmitting the target picture or the screen projection picture to multiple destination devices simultaneously increases the overhead and network bandwidth consumption of the source device, which are also important factors in specific scenarios, in practice the user can set the number of destination devices for screen projection according to the actual scenario.
S350: send the screen projection picture, so that the screen projection picture is displayed in the first application running on the destination device.
Specifically, when the first device is the source device, after obtaining the screen projection picture, the first device may encode and compress it and transmit it to one or more destination devices; after receiving it, each destination device decompresses and decodes it and then displays it in the target application. When the first device is the destination device, after obtaining the screen projection picture, the first device directly sends it to the display template to display it in the target application.
For example, as shown in fig. 7h, the source device performs application layout on the target picture to obtain the screen projection picture in fig. 7h, and after receiving it, the destination device displays the screen projection picture shown in fig. 7h on the display interface of its target application.
In summary, the screen projection display method of the present application has been described with reference to the accompanying drawings. The method can capture one or more pictures to be projected from a target application on the source device and re-lay them out to generate a screen projection picture, which can be displayed on one or more destination devices. By capturing the target picture in the target application and laying out the captured pictures according to user requirements, the screen projection picture can adapt to the display screen sizes of various electronic devices, which is convenient to use and improves the user experience.
The above embodiments complete the screen projection display from the source device to the destination device: with any of the screen projection display methods described above, the screen projection picture is displayed on the destination device. With the screen projection display method provided by the embodiment of the application, the user can operate on the screen projection picture at the destination device and the operation result is transmitted to the source device, or the user can operate on the target picture at the source device and the operation is transmitted to the destination device, ensuring the consistency of operations between the destination device and the source device, so that the execution of the application on the source device is controlled through the screen projection picture on the destination device.
In the embodiment of the application, because the screen projection picture displayed on the destination device is not identical to the target picture on the source device, the screen projection picture may be combined from parts of the target picture; the position of a functional area in the screen projection picture may change greatly relative to its position in the target picture, and direction control also requires special processing.
Illustratively, the electronic device is the destination device, and after the screen projection picture is displayed in the first application running on the destination device, the method further includes:
in response to a fourth input operation on the screen projection picture, calculating a first position and a first position offset of the fourth input operation, where the first position is the position on the target picture of a first picture to be projected, the first picture to be projected is the picture to be projected corresponding to the fourth input operation, and the first position offset is the position offset of the fourth input operation relative to the first picture to be projected; and sending a first control message, where the first control message includes the fourth input operation, the first position, and the first position offset.
Optionally, calculating the first position and the first position offset of the fourth input operation includes: acquiring a third position of the fourth input operation, where the third position is the position of the fourth input operation relative to the screen projection picture; and calculating the first position and the first position offset according to the third position, the interface layout information, and the parameter information.
The destination device includes an editing interface window. The user operates on the screen projection picture of the destination device; according to the operation position on the screen projection picture, the destination device can calculate the corresponding operation position relative to the target picture, and sends the operation position on the target picture together with the input operation performed on the screen projection picture to the source device. After receiving them, the source device executes the input operation at that position on the target picture.
Specifically, the destination device collects the user's operation position on the screen projection picture and, through the interface layout information, calculates which picture to be projected was selected by the fourth input operation. Since what is clicked is a picture, behavior similar to a control event cannot be generated, and the clicked part must be obtained by position calculation. The interface layout information can be obtained from local storage or from a cloud device. Taking a click as an example, when the screen projection picture on the destination device is clicked, the third position is obtained; through the interface layout information, the destination device can calculate the position in the target picture of the selected picture to be projected, namely the first position, and the position offset of the third position relative to that picture to be projected, namely the first position offset. The position of the third position in the target picture can then be calculated from the selected picture to be projected and the relative offset of the third position with respect to it. Because the selected picture to be projected may have been scaled, moved, and rotated when it was edited into the screen projection picture, and this information is recorded in the interface layout information, the position in the target picture of the clicked third position can be calculated from the interface layout information, the position of the selected picture to be projected, and the click's position offset relative to that picture.
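The position calculation described above, which undoes the recorded move, rotation, and scaling to recover the first position and first position offset, can be sketched as follows. The layout-entry field names are assumptions, and the sketch handles a single captured picture:

```python
import math

def map_click_to_source(click, layout):
    """Map a click on the screen projection picture back to the target picture.

    layout: one interface-layout entry (field names are assumed):
      dx, dy   - where the captured picture was placed on the projection screen
      scale    - scaling applied when it was edited into the projection picture
      rotation - rotation in degrees applied at editing time
      sx, sy   - the picture's original position on the target picture
    Returns (first_position, first_position_offset).
    """
    # Undo the move recorded in the interface layout information.
    x, y = click[0] - layout["dx"], click[1] - layout["dy"]
    # Undo the rotation.
    t = math.radians(-layout["rotation"])
    x, y = x * math.cos(t) - y * math.sin(t), x * math.sin(t) + y * math.cos(t)
    # Undo the scaling: the result is the offset within the captured picture.
    offset = (x / layout["scale"], y / layout["scale"])
    return (layout["sx"], layout["sy"]), offset
```

For example, for a picture captured at (10, 20) on the target picture, placed at (100, 50) on the projection screen at 2x scale with no rotation, a click at (120, 60) yields the first position (10, 20) and the first position offset (10, 5); the first control message would carry the click, that position, and that offset.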
Fig. 8a shows the operation interface in which a smart watch presents part of the function controls of a mobile phone music application. While the mobile phone's music is playing, the user performs the operation shown in fig. 8a, that is, clicks the play control of the music application. In response to the user's click, the play control of the music application in the smart watch's operation interface changes to a paused state, and correspondingly, playback is paused in the mobile phone's music playing interface.
Illustratively, the electronic device is the source device, and after sending the screen projection picture, the method further includes:
in response to a fifth input operation on the target picture, calculating a second position and a second position offset of the fifth input operation, where the second position is the position on the screen projection picture of a second picture to be projected, the second picture to be projected is the picture to be projected corresponding to the fifth input operation, and the second position offset is the position offset of the fifth input operation relative to the second picture to be projected; and sending a second control message, where the second control message includes the fifth input operation, the second position, and the second position offset.
Optionally, calculating the second position and the second position offset of the fifth input operation includes: acquiring a fourth position of the fifth input operation, where the fourth position is the position of the fifth input operation relative to the target picture; and calculating the second position and the second position offset according to the fourth position, the interface layout information, and the parameter information.
The source device includes an editing interface window. When the user operates on the target picture of the source device, the source device can convert the operation position on the target picture into the corresponding position on the projected picture, and send that position together with the input operation to the destination device, which then performs the input operation at that position on the projected picture.
Specifically, the source device captures the position of the user's operation on the target picture and, through the interface layout information, determines which picture to be projected the fifth input operation falls on. The interface layout information may be read from local storage or obtained from the cloud device. Taking a click as an example: when the target picture on the source device is clicked, the fourth position is obtained. From the interface layout information, the source device can calculate the position, in the projected picture, of the picture to be projected that was hit (the second position) and the offset of the click relative to that picture (the second position offset). Because the selected picture to be projected may have been scaled, moved, or rotated when it was edited into the projected picture, and these transformations are recorded in the interface layout information, the position of the click in the projected picture can be recovered from the interface layout information, the position of the selected picture to be projected, and the click's offset relative to it.
For example, when the play control in the music-playing interface of the mobile phone is in the paused state, the user performs the operation shown in Fig. 8b, for example clicking the play control of the music application on the phone. In response to the click, the play control in the phone's music-playing interface switches to the playing state, and, correspondingly, the play control of the music application in the smart watch's operation interface switches to the playing state as well.
It should be understood that projecting the phone's music application onto the smart watch is only an example: the user can perform the corresponding operations on either the phone or the watch to control the music application. Similarly, for the projected picture in Fig. 8b, which is composed of a plurality of pictures to be projected, the user can operate on any functional area of the projected picture on the control panel of an in-vehicle terminal, and the operation is transmitted back to the phone so that the corresponding application on the phone is controlled accordingly.
For example, the electronic device is the destination device, and after the projected picture is displayed in the first application running on the destination device, the method further includes: receiving third control information, where the third control information includes a sixth input operation and a fifth position; calculating a sixth position and a third position offset of the sixth input operation from the fifth position, the interface layout information, and the parameter information, where the sixth position is the position, in the projected picture, of a third picture to be projected, the third picture to be projected is the picture to be projected that corresponds to the sixth input operation, and the third position offset is the offset of the sixth input operation relative to the third picture to be projected; calculating a seventh position from the sixth position and the third position offset, where the seventh position is the position of the sixth input operation relative to the projected picture; and performing the sixth input operation at the seventh position.
The electronic device is the destination device and includes the editing interface window. In response to the user's input operation on the target picture of the source device, the source device transmits the position of the input operation to the destination device; the destination device converts the operation position on the target picture into the corresponding position on the projected picture and performs the input operation at that position on the projected picture.
Specifically, the source device captures the position of the user's operation on the target picture. Taking a click as an example: when the target picture on the source device is clicked, the fifth position is obtained, and the source device sends the fifth position and the sixth input operation to the destination device. After receiving them, the destination device determines, through the interface layout information, which picture to be projected the sixth input operation falls on. The interface layout information may be read from local storage or obtained from the cloud device. From the interface layout information, the destination device can calculate the position, in the projected picture, of the picture to be projected that was hit (the sixth position) and the offset of the operation relative to that picture (the third position offset), and from these the position of the operation in the projected picture. Because the selected picture to be projected may have been scaled, moved, or rotated when it was edited into the projected picture, and these transformations are recorded in the interface layout information, the position in the projected picture can be recovered from the interface layout information, the position of the selected picture to be projected, and the operation's offset relative to it.
Illustratively, the electronic device is the source device, and after the projected picture is sent, the method further includes: receiving fourth control information, where the fourth control information includes a seventh input operation and an eighth position; calculating a ninth position and a fourth position offset of the seventh input operation from the eighth position, the interface layout information, and the parameter information, where the ninth position is the position, in the target picture, of a fourth picture to be projected, the fourth picture to be projected is the picture to be projected that corresponds to the seventh input operation, and the fourth position offset is the offset of the seventh input operation relative to the fourth picture to be projected; calculating a tenth position from the ninth position and the fourth position offset, where the tenth position is the position of the seventh input operation relative to the target picture; and performing the seventh input operation at the tenth position.
The electronic device is the source device and includes the editing interface window. In response to the user's input operation on the projected picture of the destination device, the destination device transmits the position of the input operation to the source device; the source device converts the operation position on the projected picture into the corresponding position on the target picture and performs the input operation at that position on the target picture.
Specifically, the destination device captures the position of the user's operation on the projected picture. Taking a click as an example: when the projected picture on the destination device is clicked, the eighth position is obtained, and the destination device sends the eighth position and the seventh input operation to the source device. After receiving them, the source device determines, through the interface layout information, which picture to be projected the seventh input operation falls on. The interface layout information may be read from local storage or obtained from the cloud device. From the interface layout information, the source device can calculate the position, in the target picture, of the picture to be projected that was hit (the ninth position) and the offset of the operation relative to that picture (the fourth position offset), and from these the position of the operation in the target picture. Because the selected picture to be projected may have been scaled, moved, or rotated when it was edited into the projected picture, and these transformations are recorded in the interface layout information, the position in the target picture can be recovered from the interface layout information, the position of the selected picture to be projected, and the operation's offset relative to it.
When the input-device types of the destination device and the source device do not match, the destination device needs to convert the operation from its input device into an operation adapted to the source device's input device, or the source device needs to convert the operation into one adapted to the destination device's input device. For example, when the destination device uses touch input and the source device uses mouse input, the destination device may convert the user's touch operation into a mouse operation adapted to the source device.
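Such an adaptation between input vocabularies could look like the following sketch. The event names and the mapping table are illustrative assumptions; a real system would also translate coordinates and gesture parameters.

```python
def adapt_input_operation(operation, target_kind):
    """Map an input operation between a touch vocabulary and a mouse
    vocabulary.  `operation` is an (event_kind, position) pair; names
    and the mapping table are assumptions for illustration."""
    touch_to_mouse = {"tap": "left_click",
                      "long_press": "right_click",
                      "swipe": "drag"}
    mouse_to_touch = {v: k for k, v in touch_to_mouse.items()}
    kind, position = operation
    table = touch_to_mouse if target_kind == "mouse" else mouse_to_touch
    # Unknown event kinds pass through unchanged.
    return table.get(kind, kind), position
```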
In summary, the screen-projection display method of the present application, described above with reference to the accompanying drawings, captures one or more pictures to be projected from a target application on a source device and lays them out again to generate a projected picture, which can be displayed on one or more destination devices. Any operation performed on the projected picture at a destination device can be transmitted back to the source device for execution, which keeps the source and destination devices consistent. By displaying the projected picture on a plurality of destination devices, the method breaks the barrier between different devices; by capturing the target picture of the target application and laying the captured pictures out according to the user's needs, the projected picture can adapt to the display screens of various electronic devices, which is convenient to use and improves the user experience.
In a specific implementation, as shown in Fig. 9a, the screen-projection display method provided by the present application can be implemented through the coordination of three modules: an application editor, a first redirection system, and a second redirection system.
The application editor may be located in the source device, the destination device, or a development device. It includes the application editing window shown in Fig. 5a and is mainly responsible for laying out the projected picture.
The first redirection system is located in the source device or the destination device and is mainly responsible for transferring data streams (such as video, audio, image, and control data) from the source device to the destination device, or for receiving data streams from the destination device.
The second redirection system is located in the destination device or the source device and is mainly responsible for receiving data streamed from the source device, or for transferring data from the destination device to the source device.
It should be appreciated that when the first redirection system is located in the source device, the second redirection system is located in the destination device; when the first redirection system is located in the destination device, the second redirection system is located in the source device.
The screen-projection display method provided by the present application also relies on the coordination of an application management module. The application management module is located in the cloud device and is mainly responsible for storing the interface layout information and/or the external resource list produced by the application editor.
In the following, the coordination among the different modules is described taking the case where the first redirection system is located in the source device as an example.
After the screen-projection function is enabled, the source device captures the target picture of the target application and redirects it, through the first redirection system, to the APP column of the application editor in the source device. The first picture displayed in the APP column coincides with the target picture. The user selects the shape of the picture to be projected from the toolbar of the application editor and cuts one or more pictures to be projected out of the first picture displayed in the APP column according to the selected shape; the pictures to be projected are then transmitted to the layout column of the application editor for display. The user lays out the one or more pictures to be projected with the operation icons in the layout column to generate the projected picture, and the application editor uploads the interface layout information produced in this process to the cloud device for storage. The source device encodes the projected picture through the first redirection system and sends it to the destination device; the second redirection system in the destination device receives the encoded projected picture, decodes it, and passes it to the first application, which displays the projected picture on its display interface.
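The shape-based cutting step can be illustrated with a small sketch. The row-major 2D-list pixel representation and the shape names are assumptions for illustration; a real editor would operate on image buffers.

```python
def crop_to_be_projected(frame, x, y, w, h, shape="rectangle"):
    """Cut one picture to be projected out of the captured target picture.
    `frame` is a row-major 2D list of pixel values; for the 'circle'
    shape, pixels outside the inscribed ellipse are blanked out."""
    out = [row[x:x + w] for row in frame[y:y + h]]   # rectangular cut
    if shape == "circle":
        rx, ry = w / 2, h / 2
        for j in range(h):
            for i in range(w):
                # pixel-centre test against the inscribed ellipse
                if ((i + 0.5 - rx) / rx) ** 2 + ((j + 0.5 - ry) / ry) ** 2 > 1:
                    out[j][i] = None
    return out
```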
As shown in Fig. 9b, the screen-projection display method of the present application may also be implemented through the coordination of three modules: an application management module, a first redirection system, and a second redirection system. The first redirection system may include an acquisition module, a redrawing module, an encoding module, a sending module, an input injection module, a receiving module, and a control module; the second redirection system may include a receiving module, a decoding module, a display module, an input processing module, and a sending module.
Again taking the case where the first redirection system is located in the source device as an example, the coordination among the modules is as follows. As shown in Fig. 9b, the acquisition module in the first redirection system captures the target picture from the application running on the source device and passes it to the redrawing module. The redrawing module can obtain the interface layout information and/or parameter information from the application management module in the cloud system and edit the target picture into the projected picture; alternatively, the user edits the target picture by operating the redrawing module to generate the projected picture, and the interface layout information produced in this process is uploaded to the application management module of the cloud system for storage. The redrawing module passes the generated projected picture to the encoding module, which encodes it and passes it to the sending module. The sending module sends the encoded data stream to the destination device using a proprietary streaming protocol. The receiving module in the second redirection system on the destination device forwards the received data to the decoding module, and the display module displays the decoded projected picture on the display interface.
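The acquisition -> redraw -> encode -> send and receive -> decode -> display chain can be sketched in-process as follows, with an in-memory queue standing in for the proprietary streaming protocol and zlib standing in for the video codec. All names and the choice of codec are illustrative assumptions.

```python
import queue
import zlib

channel = queue.Queue()   # stands in for the proprietary streaming protocol

def redraw(target_picture, layout_info):
    """Redrawing-module placeholder: a real implementation would crop,
    scale, and rotate the picture per the interface layout information."""
    return target_picture

def first_redirection_system_send(target_picture, layout_info=None):
    """Source side: acquisition -> redrawing -> encoding -> sending."""
    projected = redraw(target_picture, layout_info)
    channel.put(zlib.compress(projected))   # encoding-module stand-in

def second_redirection_system_receive():
    """Destination side: receiving -> decoding; the display module
    would then render the returned bytes."""
    return zlib.decompress(channel.get())
```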
Further, the user can operate on the projected picture at the destination device and have the operation transmitted back to the source device. After the second redirection system captures the user's input operation on the projected picture, it passes the operation to the input processing module. The input processing module calculates, from the operation position, the interface layout information, and the parameter information (obtained from the cloud device or locally), the position, in the target picture, of the picture to be projected on which the input operation falls and the offset of the operation relative to that picture; from these it derives the operation position in the target picture, and transmits that position together with the input operation to the first redirection system. After the receiving module in the first redirection system receives them, the input injection module performs the input operation at that position in the target picture of the target application on the source device.
It will be appreciated that, to achieve the above functions, the electronic device includes corresponding hardware and/or software modules that perform each function. In combination with the example algorithm steps described in the embodiments disclosed herein, the present application can be implemented in hardware or in a combination of hardware and computer software. Whether a function is implemented in hardware or in computer-software-driven hardware depends on the particular application and the design constraints of the solution. Those skilled in the art may use different approaches to implement the described functionality for each particular application, but such implementation decisions should not be interpreted as departing from the scope of the present application.
This embodiment may divide the electronic device into functional modules according to the above method examples; for example, each functional module may correspond to one function, or two or more functions may be integrated into one processing module. The integrated module may be implemented in hardware. It should be noted that the division of modules in this embodiment is schematic and is merely a division by logical function; other divisions are possible in actual implementation.
In the case where each functional module is divided according to each function, Fig. 10 shows a schematic structural diagram of a screen-projection display apparatus. As shown in Fig. 10, the screen-projection display apparatus 1000 is applied to an electronic device and may include a display unit 1100, a selection unit 1200, an editing unit 1300, and a transceiver unit 1400.
The display unit 1100 may be used to support the electronic device in performing S310 and S330 described above, and/or other processes for the techniques described herein.
The selection unit 1200 may be used to support the electronic device in performing S320 described above, and/or other processes for the techniques described herein.
The editing unit 1300 may be used to support the electronic device in performing S340 described above, and/or other processes for the techniques described herein.
The transceiver unit 1400 may be used to support the electronic device in performing S350 described above, and/or other processes for the techniques described herein.
It should be noted that all relevant details of the steps in the above method embodiments can be found in the functional descriptions of the corresponding functional modules, and are not repeated here.
The electronic device provided in this embodiment is configured to execute the above screen-projection display method, and can therefore achieve the same effects as the implementations described above.
In the case where an integrated unit is employed, the electronic device may include a processing module, a storage module, and a communication module. The processing module may be used to control and manage the actions of the electronic device, for example to support the electronic device in performing the steps performed by the display unit 1100, the selection unit 1200, the editing unit 1300, and the transceiver unit 1400. The storage module may be used to support the electronic device in storing program code, data, and the like. The communication module may be used to support communication between the electronic device and other devices.
The processing module may be a processor or a controller, which may implement or perform the various exemplary logic blocks, modules, and circuits described in connection with this disclosure. A processor may also be a combination that performs computing functions, for example a combination of one or more microprocessors, or of a digital signal processor (DSP) and a microprocessor. The storage module may be a memory. The communication module may be a radio-frequency circuit, a Bluetooth chip, a Wi-Fi chip, or another device that interacts with other electronic devices.
In one embodiment, when the processing module is a processor and the storage module is a memory, the electronic device according to this embodiment may be a device having the structure shown in fig. 1 a.
This embodiment also provides a computer storage medium storing computer instructions that, when run on an electronic device, cause the electronic device to execute the above related method steps to implement the screen-projection display method in the above embodiments.
This embodiment also provides a computer program product that, when run on a computer, causes the computer to perform the above related steps to implement the screen-projection display method in the above embodiments.
In addition, embodiments of the present application also provide an apparatus, which may be embodied as a chip, component, or module and may include a processor and a memory coupled to each other; the memory is configured to store computer-executable instructions, and when the apparatus runs, the processor can execute the computer-executable instructions stored in the memory, so that the chip performs the screen-projection display method in the above method embodiments.
The electronic device, computer storage medium, computer program product, or chip provided in this embodiment is used to execute the corresponding method provided above; for its beneficial effects, reference may be made to those of the corresponding method, which are not repeated here.
It will be appreciated by those skilled in the art that, for convenience and brevity of description, only the above division of functional modules is illustrated. In practical applications, the above functions may be allocated to different functional modules as needed; that is, the internal structure of the apparatus may be divided into different functional modules to perform all or part of the functions described above.
In the several embodiments provided by the present application, it should be understood that the disclosed apparatus and method may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative, e.g., the division of modules or units is merely a logical function division, and there may be additional divisions when actually implemented, e.g., multiple units or components may be combined or integrated into another apparatus, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or units, which may be in electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and the parts shown as units may be one physical unit or a plurality of physical units, may be located in one place, or may be distributed in a plurality of different places. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a readable storage medium. Based on such understanding, the technical solution of the embodiments of the present application, in essence, or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product stored in a storage medium, which includes several instructions for causing a device (which may be a single-chip microcomputer, a chip, or the like) or a processor to perform all or part of the steps of the methods of the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
The foregoing is merely illustrative of the present application, and the present application is not limited thereto, and any person skilled in the art will readily recognize that variations or substitutions are within the scope of the present application. Therefore, the protection scope of the application is subject to the protection scope of the claims.

Claims (24)

1. A screen-projection display method, applied to an electronic device, the method comprising:
displaying an editing interface window, where the editing interface window includes a first area, a second area, and a third area; a first picture is displayed in the first area, the first picture being a target picture in a target application running on a source device; and the third area includes at least one shape icon corresponding to a selectable shape;
in response to at least one first input operation on the first picture, selecting from the first picture a list of pictures to be projected corresponding to the at least one first input operation, including: in response to a third input operation on a shape icon in the third area, determining the selected shape of each picture to be projected in the list of pictures to be projected; determining the selection area corresponding to each first input operation; and cutting, in the selected shape, the picture to be projected out of the selection area of the first picture to obtain the list of pictures to be projected, the list of pictures to be projected including at least one picture to be projected;
displaying a second picture in the second area, where the second picture includes the list of pictures to be projected and a list of external projection pictures, the list of external projection pictures including at least one external projection picture imported by the user;
in response to at least one second input operation within the second area, editing the second picture into a projected picture;
and sending the projected picture, so that the projected picture is displayed in a first application running on a destination device.
2. The method of claim 1, wherein the electronic device is the source device or the destination device.
3. The method according to claim 1, wherein the method further comprises:
obtaining first information of each picture to be projected, on the source device, in the list of pictures to be projected, where the first information includes at least one of: position, size, shape;
obtaining second information of the destination device, where the second information includes at least one of: interface size, display position of the projected picture, scaling of the projected picture, rotation direction of the projected picture;
and obtaining third information of each external projection picture in the list of external projection pictures, where the third information includes position and/or size.
4. The method according to claim 3, wherein the second area includes at least one operation icon corresponding to the interface layout;
the editing the second picture into a projected picture in response to at least one second input operation in the second area includes:
obtaining the operation icon corresponding to the at least one second input operation;
determining interface layout information of the second picture according to the operation icon, where the operation icon includes at least one of: rotation, movement, scaling, deletion, addition, hierarchy setting;
and editing the second picture into the projected picture according to the interface layout information and parameter information, where the parameter information includes at least one of the first information, the second information, and the third information.
5. The method according to claim 4, wherein the method further comprises:
and sending the interface layout information to at least one of a cloud device, the source device, and the destination device.
6. The method according to claim 4 or 5, wherein the electronic device is the destination device, and after the projected picture is displayed in the first application running on the destination device, the method further comprises:
calculating a first position and a first position offset of a fourth input operation in response to the fourth input operation on the projected picture, where the first position is the position, in the target picture, of a first picture to be projected, the first picture to be projected is the picture to be projected that corresponds to the fourth input operation, and the first position offset is the offset of the fourth input operation relative to the first picture to be projected;
and sending a first control message, where the first control message includes the fourth input operation, the first position, and the first position offset.
7. The method according to claim 4 or 5, wherein the electronic device is the source device, and after the projected picture is sent, the method further comprises:
in response to a fifth input operation on the target picture, calculating a second position and a second position offset of the fifth input operation, where the second position is the position, in the projected picture, of a second picture to be projected, the second picture to be projected is the picture to be projected that corresponds to the fifth input operation, and the second position offset is the offset of the fifth input operation relative to the second picture to be projected;
and sending a second control message, where the second control message includes the fifth input operation, the second position, and the second position offset.
8. The method of claim 6, wherein the calculating the first position and the first position offset of the fourth input operation comprises:
acquiring a third position of the fourth input operation, wherein the third position is a position of the fourth input operation relative to the screen projection picture;
and calculating the first position and the first position offset according to the third position, the interface layout information and the parameter information.
9. The method of claim 7, wherein the calculating the second position and the second position offset of the fifth input operation comprises:
acquiring a fourth position of the fifth input operation, wherein the fourth position is a position of the fifth input operation relative to the target picture;
and calculating the second position and the second position offset according to the fourth position, the interface layout information and the parameter information.
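Claims 6 and 8 describe mapping a touch on the screen projection picture back to the sub-picture it hit: the "first position" is that sub-picture's origin on the source's target picture, and the "first position offset" is the touch point relative to the sub-picture. A minimal sketch, assuming rectangular unrotated regions and hypothetical field names (`proj`, `src`, `scale`):

```python
def locate_touch(third_pos, layout):
    """layout: list of regions, each giving the sub-picture's rectangle on the
    screen projection picture ('proj': x, y, w, h), its origin on the source's
    target picture ('src'), and its scaling factor ('scale')."""
    tx, ty = third_pos
    for region in layout:
        px, py, w, h = region["proj"]
        if px <= tx < px + w and py <= ty < py + h:
            # first position: where this sub-picture sits on the target picture
            first_pos = region["src"]
            # first position offset: touch point relative to the sub-picture,
            # rescaled back into source-side coordinates
            scale = region.get("scale", 1.0)
            offset = ((tx - px) / scale, (ty - py) / scale)
            return first_pos, offset
    return None  # the touch missed every sub-picture

layout = [{"proj": (0, 0, 200, 100), "src": (50, 300), "scale": 2.0}]
first_pos, offset = locate_touch((40, 30), layout)
# first_pos == (50, 300), offset == (20.0, 15.0)
```

The pair `(first_pos, offset)` is exactly what the first control message of claim 6 would carry alongside the input operation itself.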
10. The method according to claim 4 or 5, wherein the electronic device is the destination device, and after displaying the screen projection picture in the first application run by the destination device, the method further comprises:
receiving third control information, wherein the third control information comprises a sixth input operation and a fifth position;
calculating a sixth position and a third position offset of the sixth input operation according to the fifth position, the interface layout information and the parameter information, wherein the sixth position is a position of a third to-be-projected picture on the screen projection picture, the third to-be-projected picture is the to-be-projected picture corresponding to the sixth input operation, and the third position offset is a position offset of the sixth input operation relative to the third to-be-projected picture;
calculating a seventh position according to the sixth position and the third position offset, wherein the seventh position is a position of the sixth input operation relative to the screen projection picture;
and performing the sixth input operation at the seventh position.
11. The method according to claim 4 or 5, wherein the electronic device is the source device, and after sending the screen projection picture, the method further comprises:
receiving fourth control information, wherein the fourth control information comprises a seventh input operation and an eighth position;
calculating a ninth position and a fourth position offset of the seventh input operation according to the eighth position, the interface layout information and the parameter information, wherein the ninth position is a position of a fourth to-be-projected picture on the target picture, the fourth to-be-projected picture is the to-be-projected picture corresponding to the seventh input operation, and the fourth position offset is a position offset of the seventh input operation relative to the fourth to-be-projected picture;
calculating a tenth position according to the ninth position and the fourth position offset, wherein the tenth position is a position of the seventh input operation relative to the target picture;
and executing the seventh input operation at the tenth position.
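On the receiving side (claims 10 and 11), the device reverses the mapping: it resolves the sub-picture's origin in its own coordinate space, adds the in-picture offset, and performs the input operation at the resulting position. A hedged sketch; the message format, the layout lookup table and the `inject` callback are illustrative assumptions, not the patent's protocol:

```python
def replay(message, layout, inject):
    """message: (operation, (picture_id, offset)) as decomposed by the sender;
    layout maps a sub-picture id to its origin in the local coordinate space
    (on the screen projection picture or the target picture)."""
    operation, (picture_id, offset) = message
    origin = layout[picture_id]                       # e.g. the sixth position
    final = (origin[0] + offset[0], origin[1] + offset[1])  # e.g. the seventh position
    inject(operation, final)                          # perform the input operation
    return final

performed = []
pos = replay(("tap", ("chat", (20, 15))), {"chat": (300, 40)},
             lambda op, p: performed.append((op, p)))
# pos == (320, 55)
```

Sending the sub-picture-relative offset rather than raw screen coordinates is what lets the two ends disagree about scaling, rotation and placement yet still land the input on the same content.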
12. A screen projection display apparatus, applied to an electronic device, the apparatus comprising:
a display unit, configured to display an editing interface window, wherein the editing interface window comprises a first area, a second area and a third area, a first picture is displayed in the first area, the first picture is a target picture of a target application running on the source device, and the third area comprises at least one shape icon corresponding to a selection shape;
a selecting unit, configured to select, in response to at least one first input operation on the first picture, a to-be-projected picture list corresponding to the at least one first input operation from the first picture, including: determining, in response to a third input operation on the shape icons in the third area, the selection shape of each to-be-projected picture in the to-be-projected picture list; determining a selection area corresponding to each first input operation; and selecting, in the selection shape, the to-be-projected pictures within the selection area of the first picture to obtain the to-be-projected picture list, wherein the to-be-projected picture list comprises at least one to-be-projected picture;
wherein the display unit is further configured to display a second picture in the second area, the second picture comprises the to-be-projected picture list and/or an external picture list, and the external picture list comprises at least one external picture imported by a user;
an editing unit, configured to edit the second picture into a screen projection picture in response to at least one second input operation in the second area;
and a transceiver unit, configured to send the screen projection picture, so that the screen projection picture is displayed in a first application running on the destination device.
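The selecting unit of claim 12 reduces to a hit test: keep every candidate picture whose region intersects the user's selection area. The sketch below assumes a rectangular selection shape (the claim also allows other shapes chosen via the shape icons) and hypothetical rectangle tuples:

```python
def select_pictures(selection, candidates):
    """selection: (x, y, w, h) drawn by the first input operation;
    candidates: {name: (x, y, w, h)} regions within the first picture.
    Returns the to-be-projected picture list: names whose rects intersect."""
    sx, sy, sw, sh = selection
    picked = []
    for name, (x, y, w, h) in candidates.items():
        # standard axis-aligned rectangle intersection test
        if x < sx + sw and sx < x + w and y < sy + sh and sy < y + h:
            picked.append(name)
    return picked

cands = {"a": (0, 0, 50, 50), "b": (200, 200, 50, 50)}
result = select_pictures((10, 10, 100, 100), cands)
# result == ["a"]
```

A non-rectangular selection shape would replace the intersection test with a point-in-polygon or mask check, but the flow is the same.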
13. The apparatus of claim 12, wherein the electronic device is the source device or the destination device.
14. The apparatus of claim 12, further comprising an obtaining unit,
wherein the obtaining unit is configured to obtain first information of each to-be-projected picture of the source device in the to-be-projected picture list, and the first information comprises at least one of the following: position, size, shape;
the obtaining unit is further configured to obtain second information of the destination device, where the second information includes at least one of: interface size, display position of screen projection picture, scaling of screen projection picture, rotation direction of screen projection picture;
The obtaining unit is further configured to obtain third information of each external screen in the external screen list, where the third information includes a position and/or a size.
15. The apparatus of claim 14, wherein the second area comprises at least one operation icon corresponding to an interface layout;
the editing unit is specifically configured to obtain an operation icon corresponding to the at least one second input operation;
determining interface layout information of the second picture according to the operation icon, wherein the operation icon comprises at least one of the following items: rotation, movement, scaling, deletion, addition, hierarchy setting;
and editing the second picture into the screen projection picture according to the interface layout information and the parameter information, wherein the parameter information comprises at least one of the first information, the second information and the third information.
16. The apparatus of claim 15, wherein the transceiver unit is further configured to:
and sending the interface layout information to at least one of cloud equipment, source equipment and destination equipment.
17. The apparatus according to claim 15 or 16, wherein the electronic device is the destination device, the apparatus further comprises a calculating unit, and after the screen projection picture is displayed in the first application run by the destination device:
the calculating unit is configured to calculate, in response to a fourth input operation on the screen projection picture, a first position and a first position offset of the fourth input operation, wherein the first position is a position of a first to-be-projected picture on the target picture, the first to-be-projected picture is the to-be-projected picture corresponding to the fourth input operation, and the first position offset is a position offset of the fourth input operation relative to the first to-be-projected picture;
and the transceiver unit is further configured to send a first control message, wherein the first control message comprises the fourth input operation, the first position and the first position offset.
18. The apparatus according to claim 15 or 16, wherein the electronic device is the source device, the apparatus further comprises a calculating unit, and after the screen projection picture is sent:
the calculating unit is configured to calculate, in response to a fifth input operation on the target picture, a second position and a second position offset of the fifth input operation, wherein the second position is a position of a second to-be-projected picture on the screen projection picture, the second to-be-projected picture is the to-be-projected picture corresponding to the fifth input operation, and the second position offset is a position offset of the fifth input operation relative to the second to-be-projected picture;
and the transceiver unit is further configured to send a second control message, wherein the second control message comprises the fifth input operation, the second position and the second position offset.
19. The apparatus according to claim 17, wherein in calculating the first position and the first position offset for the fourth input operation, the calculating unit is specifically configured to:
acquiring a third position of the fourth input operation, wherein the third position is a position of the fourth input operation relative to the screen projection picture;
and calculating the first position and the first position offset according to the third position, the interface layout information and the parameter information.
20. The apparatus according to claim 18, wherein in calculating the second position and the second position offset of the fifth input operation, the calculating unit is specifically configured to:
acquiring a fourth position of the fifth input operation, wherein the fourth position is a position of the fifth input operation relative to the target picture;
and calculating the second position and the second position offset according to the fourth position, the interface layout information and the parameter information.
21. The apparatus according to claim 15 or 16, wherein the electronic device is the destination device, the apparatus further comprises an execution unit and a calculating unit, and after the screen projection picture is displayed in the first application run by the destination device:
the transceiver unit is configured to receive third control information, wherein the third control information comprises a sixth input operation and a fifth position;
the calculating unit is configured to calculate a sixth position and a third position offset of the sixth input operation according to the fifth position, the interface layout information and the parameter information, wherein the sixth position is a position of a third to-be-projected picture on the screen projection picture, the third to-be-projected picture is the to-be-projected picture corresponding to the sixth input operation, and the third position offset is a position offset of the sixth input operation relative to the third to-be-projected picture;
the calculating unit is further configured to calculate a seventh position according to the sixth position and the third position offset, wherein the seventh position is a position of the sixth input operation relative to the screen projection picture;
and the execution unit is configured to execute the sixth input operation at the seventh position.
22. The apparatus according to claim 15 or 16, wherein the electronic device is the source device, the apparatus further comprises an execution unit and a calculating unit, and after the screen projection picture is sent:
the transceiver unit is configured to receive fourth control information, wherein the fourth control information comprises a seventh input operation and an eighth position;
the calculating unit is configured to calculate a ninth position and a fourth position offset of the seventh input operation according to the eighth position, the interface layout information and the parameter information, wherein the ninth position is a position of a fourth to-be-projected picture on the target picture, the fourth to-be-projected picture is the to-be-projected picture corresponding to the seventh input operation, and the fourth position offset is a position offset of the seventh input operation relative to the fourth to-be-projected picture;
the calculating unit is further configured to calculate a tenth position according to the ninth position and the fourth position offset, wherein the tenth position is a position of the seventh input operation relative to the target picture;
and the execution unit is configured to execute the seventh input operation at the tenth position.
23. An electronic device, comprising a processor, a memory, a communication interface, and one or more programs stored in the memory and configured to be executed by the processor, wherein the one or more programs comprise instructions for performing the steps in the method according to any one of claims 1-11.
24. A computer-readable storage medium, wherein the computer-readable storage medium stores a computer program for electronic data exchange, and the computer program causes a computer to perform the method according to any one of claims 1-11.
CN202011284012.3A 2020-11-16 2020-11-16 Screen projection display method and related product Active CN112286477B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202011284012.3A CN112286477B (en) 2020-11-16 2020-11-16 Screen projection display method and related product
PCT/CN2021/116478 WO2022100237A1 (en) 2020-11-16 2021-09-03 Screen projection display method and related product

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011284012.3A CN112286477B (en) 2020-11-16 2020-11-16 Screen projection display method and related product

Publications (2)

Publication Number Publication Date
CN112286477A 2021-01-29
CN112286477B 2023-12-08

Family

ID=74399059

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011284012.3A Active CN112286477B (en) 2020-11-16 2020-11-16 Screen projection display method and related product

Country Status (2)

Country Link
CN (1) CN112286477B (en)
WO (1) WO2022100237A1 (en)

Families Citing this family (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112286477B (en) * 2020-11-16 2023-12-08 Oppo广东移动通信有限公司 Screen projection display method and related product
CN115964106B (en) * 2021-04-20 2024-02-13 华为技术有限公司 Graphic interface display method, electronic device, medium and program product
CN112988101B (en) * 2021-04-20 2023-07-21 西安诺瓦星云科技股份有限公司 Image processing method and device, nonvolatile storage medium and processor
CN115253285A (en) * 2021-04-30 2022-11-01 华为技术有限公司 Display method and related device
CN113805827B (en) * 2021-09-14 2024-05-07 北京百度网讯科技有限公司 Screen projection display method and device, electronic equipment and storage medium
CN113900760B (en) * 2021-10-26 2024-05-28 广州博冠信息科技有限公司 Popup window display method and device
CN113918262A (en) * 2021-10-27 2022-01-11 深圳市宝泽科技有限公司 Method and system for displaying bottom wallpaper applied to screen projection
CN114089940B (en) * 2021-11-18 2023-11-17 佛吉亚歌乐电子(丰城)有限公司 Screen projection method, device, equipment and storage medium
CN114785848A (en) * 2022-03-02 2022-07-22 阿里巴巴(中国)有限公司 Collaborative interaction and collaboration method, device and system between electronic devices
CN117632061A (en) * 2022-08-15 2024-03-01 华为技术有限公司 Screen projection method, electronic equipment and system
CN117850718A (en) * 2022-10-09 2024-04-09 华为技术有限公司 Display screen selection method and electronic equipment
CN117956219A (en) * 2022-10-18 2024-04-30 华为技术有限公司 Multi-screen multi-device interaction method, electronic device and system
CN115562539A (en) * 2022-11-09 2023-01-03 维沃移动通信有限公司 Control display method and device, electronic equipment and readable storage medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110381195A (en) * 2019-06-05 2019-10-25 华为技术有限公司 A kind of throwing screen display methods and electronic equipment
CN111324327A (en) * 2020-02-20 2020-06-23 华为技术有限公司 Screen projection method and terminal equipment
CN111443884A (en) * 2020-04-23 2020-07-24 华为技术有限公司 Screen projection method and device and electronic equipment
EP3706400A1 (en) * 2017-12-28 2020-09-09 Huawei Technologies Co., Ltd. Icon management method and device

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20150116399A (en) * 2014-04-02 2015-10-15 디에스글로벌 (주) Portable device for providing multiple function
CN110995923B (en) * 2019-11-22 2021-08-20 维沃移动通信(杭州)有限公司 Screen projection control method and electronic equipment
CN112286477B (en) * 2020-11-16 2023-12-08 Oppo广东移动通信有限公司 Screen projection display method and related product

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3706400A1 (en) * 2017-12-28 2020-09-09 Huawei Technologies Co., Ltd. Icon management method and device
CN110381195A (en) * 2019-06-05 2019-10-25 华为技术有限公司 A kind of throwing screen display methods and electronic equipment
CN111324327A (en) * 2020-02-20 2020-06-23 华为技术有限公司 Screen projection method and terminal equipment
CN111443884A (en) * 2020-04-23 2020-07-24 华为技术有限公司 Screen projection method and device and electronic equipment

Also Published As

Publication number Publication date
WO2022100237A1 (en) 2022-05-19
CN112286477A (en) 2021-01-29

Similar Documents

Publication Publication Date Title
CN112286477B (en) Screen projection display method and related product
CN111324327B (en) Screen projection method and terminal equipment
CN112269527B (en) Application interface generation method and related device
WO2022100239A1 (en) Device cooperation method, apparatus and system, electronic device and storage medium
CN112558825A (en) Information processing method and electronic equipment
CN112527174B (en) Information processing method and electronic equipment
CN114520868B (en) Video processing method, device and storage medium
CN115297405A (en) Audio output method and terminal equipment
WO2022105445A1 (en) Browser-based application screen projection method and related apparatus
CN112527222A (en) Information processing method and electronic equipment
CN114065706A (en) Multi-device data cooperation method and electronic device
WO2021254113A1 (en) Control method for three-dimensional interface and terminal
CN114356195B (en) File transmission method and related equipment
CN114513689B (en) Remote control method, electronic equipment and system
CN112822544A (en) Video material file generation method, video synthesis method, device and medium
CN114510186A (en) Cross-device control method and device
CN114449171B (en) Method for controlling camera, terminal device, storage medium and program product
CN114520867B (en) Camera control method based on distributed control and terminal equipment
CN111324255B (en) Application processing method based on double-screen terminal and communication terminal
CN116797767A (en) Augmented reality scene sharing method and electronic device
CN113079332A (en) Mobile terminal and screen recording method thereof
CN113157092A (en) Visualization method, terminal device and storage medium
CN115113832A (en) Cross-device synchronous display control method and system
CN111381801B (en) Audio playing method based on double-screen terminal and communication terminal
CN114615362B (en) Camera control method, device and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant