CN112394895A - Cross-equipment display method and device of picture and electronic equipment - Google Patents
- Publication number
- CN112394895A (application CN202011284014.2A)
- Authority
- CN
- China
- Prior art keywords
- picture
- screen
- operation event
- instruction corresponding
- screen projection
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/14—Digital output to display device ; Cooperation and interconnection of the display device with other functional units
- G06F3/1454—Digital output to display device ; Cooperation and interconnection of the display device with other functional units involving copying of the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
- Controls And Circuits For Display Device (AREA)
- Telephone Function (AREA)
Abstract
The embodiment of the application discloses a method and a device for displaying a picture across devices, and an electronic device, wherein the method comprises the following steps: displaying a first picture, wherein the first picture is a current mirror image of a running picture of a first application on the screen projection source device; acquiring a first operation event aiming at the first picture; judging, according to the execution action of the first operation event on the first picture, whether a first instruction corresponding to the first operation event is responded to at the local terminal; and if the local terminal does not respond to the first instruction corresponding to the first operation event, sending a second instruction corresponding to the first operation event to the screen projection source device, wherein the second instruction is used for instructing the screen projection source device to execute the operation on the running picture of the first application. Therefore, the method and the device are beneficial to realizing cross-device display and interactive operation of application pictures, improving the processing efficiency of picture interactive operation, and improving the screen projection experience.
Description
Technical Field
The application relates to the technical field of computers, in particular to a method and a device for cross-device display of a picture and electronic equipment.
Background
With the development of technology, a user may own multiple devices at the same time, and the devices may take multiple forms, such as a mobile phone, a computer, a tablet, a television, and a watch, and different devices may be equipped with different operating systems and software and hardware architectures. For example, a computer may run a system such as Windows or macOS, and a mobile phone may run a system such as Android or iOS.
With the application of screen projection technology, applications running on different devices or systems can be used across devices in a screen projection mode, thereby projecting application pictures across devices or systems. At present, some problems remain to be solved in displaying application pictures across devices.
Disclosure of Invention
The embodiment of the application provides a method and a device for cross-device display of a picture and electronic equipment, and aims to realize cross-device display and interaction control of the picture of an application, improve the processing efficiency of picture interaction operation and improve the use experience of screen projection.
In a first aspect, an embodiment of the present application provides a method for displaying a picture across devices, which is applied to a screen projection target device, and the method includes:
displaying a first picture, wherein the first picture is a current mirror image of an operation picture of a first application on the screen projection source equipment;
acquiring a first operation event aiming at the first picture;
judging, according to the execution action of the first operation event on the first picture, whether a first instruction corresponding to the first operation event is responded to at the local terminal;
and if the local terminal does not respond to the first instruction corresponding to the first operation event, sending a second instruction corresponding to the first operation event to the screen projection source device, wherein the second instruction corresponding to the first operation event is used for indicating the screen projection source device to execute operation on the running picture of the first application.
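The routing decision described in the first aspect can be sketched as follows. This is an illustrative Python sketch only, not the patent's implementation; the event names and the set of locally handled actions are assumptions for the example:

```python
# Hypothetical sketch of the screen-projection target device's event routing:
# an operation event on the mirrored first picture is either responded to at
# the local terminal or forwarded to the screen-projection source device.

# Assumption: window-management actions are handled locally on the target.
LOCAL_ACTIONS = {"resize_window", "move_window", "close_mirror"}

forwarded_log = []  # stands in for the communication link to the source device

def route_operation_event(action: str, payload: dict) -> str:
    """Return 'local' if the first instruction is responded to at the local
    terminal, or 'forwarded' if a second instruction is sent to the source."""
    if action in LOCAL_ACTIONS:
        # First instruction: respond at the local terminal (target device).
        return "local"
    # Second instruction: instruct the source device to execute the operation
    # on the running picture of the first application.
    forwarded_log.append((action, payload))
    return "forwarded"

print(route_operation_event("resize_window", {}))          # handled locally
print(route_operation_event("tap", {"x": 120, "y": 300}))  # sent to the source
```

The key design point is that the target device classifies each event by its execution action before deciding where to respond, rather than forwarding everything to the source.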
In a second aspect, an embodiment of the present application provides a cross-device screen display apparatus, which includes a processing unit and a communication unit, where the processing unit is configured to:
displaying a first picture, wherein the first picture is a current mirror image of an operation picture of a first application on the screen projection source equipment;
acquiring a first operation event aiming at the first picture;
judging whether a first instruction corresponding to the first operation event is responded to at the local terminal according to the execution action of the first operation event on the first picture;
and if the local terminal does not respond to the first instruction corresponding to the first operation event, sending a second instruction corresponding to the first operation event to the screen projection source device through the communication unit, wherein the second instruction is used for instructing the screen projection source device to execute the operation on the running picture of the first application.
In a third aspect, an embodiment of the present application provides an electronic device, which is a screen projection target device and includes a processor, a memory and a communication interface, where the memory stores one or more programs configured to be executed by the processor, the one or more programs including instructions for performing the steps in the first aspect of the embodiments of the present application.
In a fourth aspect, embodiments of the present application provide a computer-readable storage medium, where the computer-readable storage medium stores a computer program for electronic data exchange, and the computer program is operable to cause a computer to perform some or all of the steps described in the first aspect of the embodiments of the present application.
In a fifth aspect, the present application provides a computer program product, where the computer program product includes a computer program operable to cause a computer to perform some or all of the steps described in the first aspect of the present application. The computer program product may be a software installation package.
It can be seen that, in the embodiment of the present application, the screen projection target device displays the first picture and acquires the first operation event for the first picture. Then, according to the execution action of the first operation event on the first picture, it judges whether the first instruction corresponding to the first operation event is responded to at the local terminal. If the local terminal does not respond to the first instruction, it sends a second instruction corresponding to the first operation event to the screen projection source device. The first picture is a current mirror image, displayed on the screen of the screen projection target device, of the running picture of the first application on the screen projection source device, and the second instruction corresponding to the first operation event is used for instructing the screen projection source device to execute the operation on the running picture of the first application, so that cross-device display of the picture is achieved. In addition, because the screen projection target device judges whether to respond to the first instruction corresponding to the first operation event at the local terminal, it may respond locally or not: the screen projection target device can operate on the screen projection picture on its own display screen, and can also, through adaptive processing, control the running picture of the first application on the screen projection source device. This realizes cross-device interactive control of application pictures, improves the processing efficiency of picture interactive operation, and improves the screen projection experience.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art are briefly described below. It is to be expressly understood that the drawings described below are only illustrative of some embodiments of the present application; a person skilled in the art may derive other figures from these figures without inventive effort.
Fig. 1 is a schematic architecture diagram of a screen projection communication system provided in an embodiment of the present application;
fig. 2 is a schematic diagram of a hardware structure of an electronic device according to an embodiment of the present disclosure;
fig. 3 is a schematic diagram of a software structure of an electronic device according to an embodiment of the present application;
fig. 4 is a schematic structural diagram of a screen projection source device and a screen projection target device provided in an embodiment of the present application;
fig. 5 is a flowchart illustrating a cross-device display method for a picture according to an embodiment of the present application;
fig. 6 is a schematic structural diagram of cross-device display of a picture provided in an embodiment of the present application;
fig. 7 is a schematic structural diagram of still another cross-device display of a picture provided in an embodiment of the present application;
fig. 8 is a schematic structural diagram of still another cross-device display of a picture provided in an embodiment of the present application;
fig. 9 is a schematic structural diagram of still another cross-device display of a picture provided in an embodiment of the present application;
fig. 10 is a schematic structural diagram of still another cross-device display of a picture provided in an embodiment of the present application;
fig. 11 is a schematic structural diagram of still another cross-device display of a picture provided in an embodiment of the present application;
fig. 12 is a schematic structural diagram of still another cross-device display of a picture provided in an embodiment of the present application;
fig. 13 is a schematic structural diagram of still another cross-device display of a picture provided in an embodiment of the present application;
fig. 14 is a schematic structural diagram of still another cross-device display of a picture provided in an embodiment of the present application;
fig. 15 is a schematic structural diagram of still another cross-device display of a picture provided in an embodiment of the present application;
fig. 16 is a schematic structural diagram of still another cross-device display of a picture provided in an embodiment of the present application;
fig. 17 is a schematic structural diagram of still another cross-device display of a picture provided in an embodiment of the present application;
fig. 18 is a schematic structural diagram of still another cross-device display of a picture provided in an embodiment of the present application;
fig. 19 is a schematic structural diagram of still another cross-device display of a picture provided in an embodiment of the present application;
fig. 20 is a schematic structural diagram of still another cross-device display of a picture provided in an embodiment of the present application;
fig. 21 is a flowchart illustrating a further method for displaying a frame across devices according to an embodiment of the present application;
fig. 22 is a flowchart illustrating a further method for displaying a picture across devices according to an embodiment of the present application;
fig. 23 is a block diagram illustrating functional units of a cross-device display apparatus according to an embodiment of the present disclosure;
fig. 24 is a schematic structural diagram of another electronic device provided in the embodiment of the present application.
Detailed Description
In order to better understand the technical solutions of the present application, the technical solutions in the embodiments of the present application are described below clearly and completely with reference to the drawings. It is obvious that the described embodiments are only some embodiments of the present application, rather than all embodiments. All other embodiments obtained by a person skilled in the art on the basis of the embodiments of the present application without inventive effort fall within the scope of the present application.
Before describing the technical solution of the embodiment of the present application, the following description will be made on related concepts, a screen projection communication system, a software and hardware structure of an electronic device, and the like, which may be involved in the present application.
The screen projection in the embodiments of the present application is a technology for projecting the picture or content of an application running on one device onto the display screen or display medium of another device, and is a typical information synchronization method. In the embodiments of the present application, the device that projects the picture of its application is referred to as the screen projection source device, and the device that receives and displays the projected picture is referred to as the screen projection target device.
Secondly, the screen projection related to the present application may include wired screen projection and wireless screen projection. Wired screen projection can establish a wired connection between the screen projection source device and the screen projection target device through a High Definition Multimedia Interface (HDMI), a Universal Serial Bus (USB) interface, and the like, to transmit media data; wireless screen projection can establish a wireless connection between the screen projection source device and the screen projection target device through the Digital Living Network Alliance (DLNA) protocol, the wireless display sharing (Miracast) protocol, or the AirPlay protocol, to transmit media data.
For example, when a screen is projected, the screen projection source device may transmit a video stream in a current video player to the screen projection target device after the video stream is compressed by data encoding; then, after the screen projection target device decodes the video stream data, the screen projection content is displayed on the display screen thereof. For convenience of description, the screen projection content displayed on the display screen of the screen projection target device may be referred to as a mirror image of the screen projection content of the screen projection source device in the embodiments of the present application.
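The encode-compress-transmit-decode flow just described can be illustrated with a toy round trip. This is a sketch under stated assumptions: zlib stands in for a real video codec such as H.264, and the function names are invented for the example:

```python
import zlib

def source_encode(frame: bytes) -> bytes:
    # Data encoding / compression on the screen-projection source device.
    return zlib.compress(frame)

def target_decode(packet: bytes) -> bytes:
    # Decoding on the screen-projection target device before display.
    return zlib.decompress(packet)

frame = bytes(range(256)) * 64   # stand-in for one raw video frame
packet = source_encode(frame)    # what is actually sent over the connection
mirror = target_decode(packet)   # what the target displays as the mirror image

assert mirror == frame           # the displayed mirror matches the source picture
print(len(frame), len(packet))   # the transmitted packet is smaller than the frame
```

The point of the compression step is that only the smaller encoded packet crosses the wired or wireless link; the mirror is reconstructed losslessly (or, with a real lossy video codec, approximately) on the target side.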
In addition, the screen projection source device and the screen projection target device may be collectively referred to as electronic devices in the embodiments of the present application. The electronic device according to the embodiments of the present application may be a handheld device, a vehicle-mounted device, a wearable device, an Augmented Reality (AR) device, a Virtual Reality (VR) device, a projection device, a projector, or another device connected to a wireless modem, or may be various User Equipments (UEs), terminal devices, smart phones, smart screens, smart TVs, smart watches, laptops, smart stereos, cameras, game pads, microphones, Stations (STAs), Access Points (APs), Mobile Stations (MSs), Personal Digital Assistants (PDAs), Personal Computers (PCs), or relay devices.
For example, two electronic devices, a computer and a mobile phone, are taken as examples. When the computer and the mobile phone are connected through a wireless communication technology (such as Bluetooth, wireless fidelity, Zigbee, near field communication and the like) or a data line (such as a USB data line), the mobile phone is used as a screen projection source device to project pictures or contents of running applications onto a display screen of the computer through a screen projection technology, and the computer is used as a screen projection target device; or, the computer is used as a screen projection source device to project the picture or content of the running application onto the display screen of the mobile phone, and the mobile phone is used as a screen projection target device at the moment.
The technical solution of the embodiments of the present application can be applied to the screen projection communication system 10 shown in fig. 1. The screen-casting communication system 10 may include at least two electronic devices 110, for example an electronic device 110A, an electronic device 110B, an electronic device 110C, an electronic device 110D, an electronic device 110E, and an electronic device 110F. Each of the at least two electronic devices 110 may be communicatively connected to the others through a wireless network or wired data.
It should be noted that the wireless network may include a mobile cellular network (e.g., a fifth-generation (5G) mobile communication network), a Wireless Local Area Network (WLAN), a Wide Area Network (WAN), Bluetooth, wireless fidelity (Wi-Fi), Zigbee, Near Field Communication (NFC), Ultra Wide Band (UWB), and the like; the wired data may include HDMI data lines, USB data lines, and the like.
Specifically, each of the at least two electronic devices 110 may be devices under the same user account. For example, when a user logs in to a mobile phone, a desktop computer, a smart screen, a notebook computer, a relay device and a smart watch using the same user account, the at least two electronic devices 110 include the mobile phone, the desktop computer, the smart screen, the notebook computer, the relay device and the smart watch, and the mobile phone, the desktop computer, the smart screen, the notebook computer, the relay device and the smart watch can communicate with each other through a wireless network.
Specifically, each of the at least two electronic devices 110 may be connected to the same WLAN network through a relay device (e.g., a router). For example, when a user accesses a mobile phone, a desktop computer, a smart screen, a notebook computer and a smart watch to a Wi-Fi network provided by a relay device, the at least two electronic devices 110 include the mobile phone, the desktop computer, the smart screen, the notebook computer, the relay device and the smart watch, and the mobile phone, the desktop computer, the smart screen, the notebook computer, the relay device and the smart watch form a WLAN network, so that the devices in the WLAN network can communicate with each other through the relay device.
Specifically, each of the at least two electronic devices 110 may form a Peer-to-Peer (P2P) network through a wireless communication manner (e.g., Bluetooth, Zigbee, NFC, UWB, etc.). For example, a user may scan an NFC tag to form a P2P network among a cell phone, a laptop, a smart watch, and the like, and all devices in the P2P network may communicate with each other.
Further, one or more of the at least two electronic devices 110 may serve as screen projection source devices, and the other electronic devices may serve as screen projection target devices. The screen projection source device may project or transfer the current picture of an application running on it to a screen projection target device for display. In addition, when a screen projection target device needs to simultaneously display mirror images projected by a plurality of screen projection source devices, it may display the mirror images simultaneously in a split-screen manner. For example, electronic device 110A casts a screen to electronic device 110B and electronic device 110C, and electronic device 110D casts a screen to electronic device 110C and electronic device 110F. At this time, electronic device 110C may simultaneously display the screen projection images from electronic device 110A and electronic device 110D in a split-screen manner.
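The split-screen arrangement described above can be sketched as a simple layout computation. The even-width layout policy and all names here are assumptions for illustration, not the patent's algorithm:

```python
# Illustrative sketch: a screen-projection target device showing mirrors from
# several source devices divides its display width evenly among them.

def split_screen_regions(display_w: int, display_h: int, n_sources: int):
    """Return one (x, y, w, h) display region per screen-projection source."""
    w = display_w // n_sources
    return [(i * w, 0, w, display_h) for i in range(n_sources)]

# e.g. electronic device 110C showing mirrors from 110A and 110D side by side
# on a 1920x1080 display:
print(split_screen_regions(1920, 1080, 2))
```

Each source device's mirror is then scaled into its assigned region, so both screen projection pictures remain visible and operable at the same time.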
Further, the screen-casting communication system 10 may also include other numbers of electronic devices, which are not specifically limited herein.
The following describes the structure of the electronic device in the embodiment of the present application in detail with reference to fig. 2, and it is understood that the structure illustrated in fig. 2 does not specifically limit the electronic device. In other embodiments of the present application, the electronic device may also include more or fewer components than illustrated in FIG. 2, or some components may be combined, some components may be split, or a different arrangement of components. In addition, the components illustrated in fig. 2 may be implemented by hardware, software, or a combination of software and hardware.
Referring to fig. 2, the electronic device may include a processor 210, an antenna 1, an antenna 2, a mobile communication module 220, a wireless communication module 230, an audio module 240, a sensor module 250, a display module 260, a camera module 270, a charging management module 280, an internal memory 2901, an external memory interface 2902, and the like.
In particular, processor 210 may include one or more processing units. For example, the processor 210 may include an Application Processor (AP), a modem processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a memory, a video codec, a Digital Signal Processor (DSP), a Field Programmable Gate Array (FPGA), a baseband processor, and/or a neural-Network Processing Unit (NPU), and the like. The different processing units may be separate devices or may be integrated into one or more processors.
Further, a memory may be disposed within processor 210 for storing instructions and data. In some embodiments, the memory in the processor 210 is a cache memory. The memory may hold instructions or data that have just been used or recycled by processor 210. If the processor 210 needs to reuse the instruction or data, it can be called directly from the memory, thereby avoiding repeated accesses and reducing the latency of the processor 210 to improve system efficiency.
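The avoid-repeated-access behaviour described for the processor's cache can be illustrated with a small software analogy. This is only an analogy to the access pattern, not the hardware mechanism, and all names are invented for the example:

```python
# Software analogy of a cache: recently fetched values are reused directly,
# so the slower backing fetch is paid only once per address.

def make_cached_fetch(slow_fetch):
    cache = {}
    stats = {"hits": 0, "misses": 0}
    def fetch(addr):
        if addr in cache:
            stats["hits"] += 1        # reuse what was just used, no repeated access
            return cache[addr]
        stats["misses"] += 1          # pay the full fetch latency once
        cache[addr] = slow_fetch(addr)
        return cache[addr]
    return fetch, stats

fetch, stats = make_cached_fetch(lambda addr: addr * 2)  # stand-in for slow memory
for addr in (1, 2, 1, 1, 2):
    fetch(addr)
print(stats)  # the three repeated accesses hit the cache
```

As with the processor cache, the benefit grows with locality: the more often recently used instructions or data are needed again, the fewer slow accesses occur.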
Further, the processor 210 may include one or more interfaces. The interface may include an integrated circuit (I2C) interface, an integrated circuit built-in audio (I2S) interface, a Pulse Code Modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a Mobile Industry Processor Interface (MIPI), a general-purpose input/output (GPIO) interface, a Subscriber Identity Module (SIM) interface, and/or a USB interface, etc.
The wireless communication function of the electronic device may be implemented by the antenna 1, the antenna 2, the mobile communication module 220, the wireless communication module 230, the modem processor, the baseband processor, and the like. Wherein the antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in an electronic device may be used to cover a single or multiple communication bands. In addition, different antennas can be multiplexed to improve the utilization rate of the antennas. For example, antenna 1 is multiplexed as a diversity antenna for a wireless local area network.
Specifically, the mobile communication module 220 may provide a solution including 2G/3G/4G/5G wireless communication applied to the electronic device. The mobile communication module 220 may include at least one filter, a switch, a power amplifier, a Low Noise Amplifier (LNA), and the like.
Further, the mobile communication module 220 may receive the electromagnetic wave from the antenna 1, and filter, amplify, etc. the received electromagnetic wave, and transmit the electromagnetic wave to the modem for demodulation. In addition, the mobile communication module 220 may also amplify the signal modulated by the modem processor, and convert the signal into electromagnetic wave through the antenna 1 to radiate the electromagnetic wave. In this embodiment, the mobile communication module 220 can implement communication connection between the screen projection source device and the screen projection target device in the technical solution of this application.
Further, at least a part of the functional modules of the mobile communication module 220 may be disposed in the processor 210; alternatively, at least some of the functional modules of the mobile communication module 220 may be disposed in the same device as some of the modules of the processor 210.
Specifically, the wireless communication module 230 may provide solutions for wireless communication applied to the electronic device, including Bluetooth (BT), Wireless Local Area Network (WLAN), wireless fidelity (Wi-Fi) network, Near Field Communication (NFC), infrared technology (IR), and the like.
Further, the wireless communication module 230 may be one or more devices integrating at least one communication processing module. The wireless communication module 230 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering processing on the electromagnetic wave signals, and transmits the processed signals to the processor 210. The wireless communication module 230 may also receive a signal to be transmitted from the processor 210, perform frequency modulation and amplification on the signal, and convert the signal into electromagnetic waves for radiation through the antenna 2. In this embodiment, the wireless communication module 230 can implement the communication connection between the screen projection source device and the screen projection target device in the technical solution of this application.
It should be noted that the electronic device may implement an audio function through the audio module 240, the speaker 2401, the receiver 2402, the microphone 2403, the earphone interface 2404, the processor 210, and the like. Such as music playing, recording, etc.
Specifically, the audio module 240 may be configured to convert digital audio information into an analog audio signal for output, and may also be configured to convert an analog audio input into a digital audio signal. In addition, the audio module 240 may also be used to encode and decode an audio signal. In some embodiments, the audio module 240 may be disposed in the processor 210, or some functional modules of the audio module 240 may be disposed in the processor 210.
In particular, the speaker 2401 may be used to convert an audio electrical signal into an acoustic signal. The electronic device can play music or conduct a hands-free call through the speaker 2401.
In particular, the receiver 2402 may be used to convert an audio electrical signal into an acoustic signal. When the electronic device receives a call or a voice message, the receiver 2402 can be held close to the ear to hear the voice.
In particular, the microphone 2403 may be used to convert acoustic signals into electrical signals. When making a call or sending voice information, the user can input a sound signal into the microphone 2403 by speaking close to it. In addition, the electronic device may be provided with at least one microphone 2403. In one possible example, the electronic device may be provided with two microphones 2403, which may implement a noise reduction function in addition to collecting sound signals; in another possible example, the electronic device may further include three, four or more microphones 2403 to collect sound signals, reduce noise, identify sound sources, implement a directional recording function, and the like, which is not particularly limited.
In particular, the headset interface 2404 may be used to connect wired headsets. The headset interface 2404 may be the USB interface 2803, or may be a 3.5mm open mobile terminal platform (OMTP) standard interface, a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface, or the like.
Specifically, the sensor module 180 may include a pressure sensor, a gyroscope sensor, an air pressure sensor, a magnetic sensor, an acceleration sensor, a distance sensor, a proximity light sensor, a fingerprint sensor, a temperature sensor, a touch sensor, an ambient light sensor, a bone conduction sensor, an ultra-wideband UWB sensor, a near field communication NFC sensor, a laser sensor, a visible light sensor, and the like.
The electronic device may implement a display function through the GPU, the display module 260, the processor 210, and the like. The GPU may be configured to perform mathematical and geometric calculations and to perform graphics rendering. In addition, the GPU may be a microprocessor for image processing that connects the display module 260 and the processor 210. The processor 210 may include one or more GPUs that execute program instructions to generate or alter display information.
In particular, the display module 260 may be a display screen for displaying images, videos, and the like. The display module 260 may include a display panel. The display panel may be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a quantum dot light-emitting diode (QLED), or the like. In one possible example, the electronic device may include one or more display modules 260.
The electronic device may implement a shooting function through the ISP, the camera module 270, the video codec, the GPU, the display module 260, the processor 210, and the like. The ISP may be configured to process data fed back by the camera module 270. For example, when a photo is taken, the shutter is opened first, then light is transmitted to the camera photosensitive element through the lens, so that an optical signal is converted into an electrical signal, and finally the electrical signal is transmitted to the ISP through the camera photosensitive element to be processed so as to be converted into an image visible to naked eyes. In addition, the ISP can also carry out algorithm optimization on the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In one possible example, the ISP may be provided in the camera module 270.
In particular, the camera module 270 may be a camera for capturing still images or videos, etc. An object generates an optical image through the lens, and the image is projected onto the photosensitive element, which may be a charge-coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, which is then transmitted to the ISP to be converted into a digital image signal. The ISP outputs the digital image signal to the DSP for processing, and the DSP converts the digital image signal into an image signal in a standard RGB, YUV or other format. In one possible example, the electronic device may include one or more camera modules 270.
Specifically, the charging management module 280 is configured to receive charging input from a charger. The charger may be a wireless charger or a wired charger. In some wired charging embodiments, the charging management module 280 may receive charging input from a wired charger via the USB interface 2803. In some wireless charging embodiments, the charging management module 280 may receive a wireless charging input through a wireless charging coil of the electronic device. While the charging management module 280 charges the battery 2801, power may also be supplied to the electronic device through the power management module 2802.
Note that the power management module 2802 is used to connect the battery 2801, the charging management module 280, and the processor 210. The power management module 2802 receives input from the battery 2801 and/or the charging management module 280, and supplies power to the processor 210 and the other modules in the electronic device.
Specifically, the power management module 2802 may also be configured to monitor parameters such as battery capacity, battery cycle count, and battery state of health (leakage, impedance). In one possible example, the power management module 2802 can also be disposed in the processor 210; in one possible example, the power management module 2802 and the charge management module 280 may also be provided in the same device.
It is noted that internal memory 2901 can be utilized to store computer-executable program code, including instructions. The processor 210 executes various functional applications and data processing of the electronic device by executing instructions stored in the internal memory 2901. In one possible example, the internal memory 2901 stores program code that implements the technical aspects of the embodiments of the present application.
Specifically, the internal memory 2901 may include a program storage area and a data storage area. The storage program area may store an operating system, an application program (e.g., a sound playing function, an image playing function, etc.) required for at least one function, and the like. The storage data area may store data (e.g., audio data, a phonebook, etc.) created during use of the electronic device, and the like. In addition, the internal memory 2901 can include high-speed random access memory, and can also include non-volatile memory. Such as at least one magnetic disk storage device, flash memory device, Universal Flash Storage (UFS), etc.
Specifically, the external memory interface 2902 can be used for connecting an external memory card, such as a micro SD card, to extend the memory capability of the electronic device. The external memory card communicates with the processor 210 through the external memory interface 2902, implementing data storage functionality. For example, files such as music, video, and the like are saved in an external memory card.
In the embodiment of the present application, a software system of an electronic device may adopt a layered architecture, an event-driven architecture, a micro-core architecture, a micro-service architecture, or a cloud architecture. In the following, the embodiment of the present application takes an Android system with a layered architecture as an example, and exemplarily illustrates a software structure of an electronic device.
Fig. 3 is a schematic diagram of an architecture of a software and hardware system with an Android system. The internal memory 2901 may store a kernel layer 320, a system runtime layer 340, an application framework layer 360, and an application layer 380. The layers communicate with each other through a software interface, and the kernel layer 320, the system runtime layer 340 and the application framework layer 360 belong to an operating system space.
Specifically, the application layer 380 belongs to a user space, and at least one application program (or simply "application") runs in the application layer 380, and the application program may be a native application program carried by an operating system itself or a third-party application program developed by a third-party developer. For example, the application layer 380 may include applications for cameras, galleries, calendars, conversations, maps, navigation, WLAN, Bluetooth, music, video, and short messages.
In the embodiment of the application, a screen projection application can be further installed in the application layer. The user can open the screen-casting application from a desktop, a setup function, a pull-down menu, or the like. The screen projection application can be used as a bridge between the screen projection source device and the screen projection target device during content projection, and screen projection content (such as an application picture) of an application needing screen projection in the screen projection source device is sent to the screen projection target device. For example, the screen-casting application may receive a screen-casting event reported by the application framework layer 360, so that the screen-casting application interacts with an application (e.g., a video player) running, and content being displayed or played in the application is sent to the screen-casting target device as screen-casting content through a wireless communication mode such as Wi-Fi.
In addition, the user can also set a binding relationship between an NFC tag and one or more electronic devices using the screen projection application. For example, an option for binding an NFC tag is provided in the screen-casting application. When the electronic device detects that the user opens the option, the screen-casting application can display a list of electronic devices to be bound. After the user selects one or more electronic devices to be bound from the list, the electronic device can be brought close to the NFC tag to be bound. In this way, the electronic device can write the identifier of the electronic device selected by the user in the screen-casting application into the NFC tag through an NFC signal, so that the binding relationship between the NFC tag and the one or more electronic devices is established in the NFC tag.
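The binding flow above can be sketched as follows. This is a hypothetical illustration only: the payload format, the `;`-separated serialization, and the device identifiers are assumptions for this sketch, not the actual NFC data format of the embodiment.

```python
# Hypothetical sketch of the NFC tag binding flow: the identifiers of the
# devices selected in the screen-casting application are serialized into a
# tag payload, and can later be recovered when the tag is read.

def build_binding_payload(selected_device_ids):
    """Serialize the selected device identifiers into a tag payload."""
    if not selected_device_ids:
        raise ValueError("at least one device must be selected")
    return ";".join(sorted(selected_device_ids))

def parse_binding_payload(payload):
    """Recover the bound device identifiers from a tag payload."""
    return set(payload.split(";")) if payload else set()
```

A round trip (`parse_binding_payload(build_binding_payload(ids))`) yields the originally selected set, mirroring the write-then-read lifecycle of the bound tag.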
It should be noted that the application framework layer 360 provides various Application Programming Interfaces (APIs) and programming frameworks that may be used by the application programs that build the application layer, so that developers can build their own application programs by using these APIs. For example, a window manager (window manager), a content provider (content providers), a view system (view system), a telephone manager (telephone manager), a resource manager, a notification manager (notification manager), a message manager, an activity manager (activity manager), a package manager (package manager), a location manager (location manager), and an NFC service, etc.
In particular, the window manager may be used to manage the window program. The window manager can obtain the size of the display screen, determine whether there is a status bar, lock the screen, capture the screen, and the like.
In particular, the content provider may be used to store and retrieve data and make the data accessible to applications. The data may include, among other things, video, images, audio, calls made and answered, browsing history and bookmarks, phone books, and the like. In addition, the content provider may enable an application to access data of another application, such as a contacts database, or to share their own data.
In particular, the view system includes visual controls. For example, controls for displaying text, controls for displaying pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, the display interface including the short message notification icon may include a view for displaying text and a view for displaying pictures.
In particular, a phone manager is used to provide communication functions for the electronic device. For example, management of call status (e.g., on, off, etc.).
In particular, the resource manager may provide various resources for the application. Such as localized strings, icons, pictures, layout files, video files, etc.
Specifically, the notification manager enables an application to display notification information in the status bar. It can be used to convey notification-type messages that disappear automatically after a short stay without requiring user interaction. For example, the notification manager is used to notify of download completion, message alerts, and the like. The notification manager may also present notifications in the form of a chart or scroll-bar text in the top status bar of the system. In addition, a notification of an application running in the background may appear on the screen in the form of a dialog window. For example, text information is prompted in the status bar, a prompt tone sounds, the electronic device vibrates, or an indicator light flashes.
Specifically, the message manager may be configured to store data of messages reported by each application program, and process the data reported by each application program.
In particular, the activity manager may be used to manage the application lifecycle and provide the common navigation fallback function. In one possible example, the message manager may be part of the notification manager.
In the embodiment of the present application, an NFC service (NFC service) may be run in the application framework layer 360.
For example, after the NFC function is turned on, the mobile phone may start running the NFC service in the application framework layer. When the mobile phone approaches or touches an NFC tag, the NFC service can call the NFC driver of the kernel layer to read the binding relationship stored in the NFC tag, thereby obtaining the screen projection target device for this content projection. Then, the NFC service can report a screen projection event to the screen projection application, triggering the screen projection application to send the content being displayed or played by the mobile phone as screen projection content to the screen projection target device, thereby starting the content projection process.
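The tag-triggered flow above can be sketched as an event handler. The callback shape, the `;`-separated payload, and the event dictionary fields are illustrative assumptions, not the actual interface between the NFC service and the screen-casting application.

```python
# Minimal sketch of the NFC-triggered projection flow: when a bound tag is
# read, the handler extracts the target device(s) and reports a screen
# projection event upward to the screen-casting application.

def on_tag_read(tag_payload, report_event):
    """Hypothetical callback invoked after the NFC driver reads a tag."""
    targets = [d for d in tag_payload.split(";") if d]
    if not targets:
        return None          # tag not bound: nothing to project to
    event = {"type": "screen_projection", "targets": targets}
    report_event(event)      # hand the event up to the casting application
    return event
```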
It should be noted that the system runtime library layer 340 provides main feature support for the Android system through some C/C++ libraries. For example, the SQLite library provides support for databases, the OpenGL/ES library provides support for 3D drawing, the WebKit library provides support for a browser kernel, and the like. Also provided in the system runtime layer 340 is the Android Runtime library, which mainly provides some core libraries allowing developers to write Android applications using the Java language.
Specifically, the kernel layer 320 may provide underlying drivers for various hardware of the electronic device, such as a display driver, an audio driver, a camera driver, a bluetooth driver, a Wi-Fi driver, a power management, an NFC driver, a UWB driver, and the like.
The following provides a specific description of an application scenario of the embodiment of the present application. As shown in fig. 4, first, the acquisition module 4101 in the screen projection source device 410 acquires the current picture of an application running on it and transmits the picture to the encoding module 4102; secondly, the encoding module 4102 encodes and compresses the picture into a data stream, and the transmitting module 4103 transmits the data stream to the receiving module 4201 in the screen projection target device 420; thirdly, the decoding module 4202 decodes the data stream using a streaming media protocol and transmits the decoded data to the display module 4203; finally, a full or partial mirror image of the picture is displayed by the display module 4203, thereby enabling cross-device display of the picture.
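The capture, encode, transmit, decode, display chain of fig. 4 can be sketched end to end as below. This is a simplifying illustration: `zlib` stands in for a real video codec, and the "transmission" between modules 4103 and 4201 is collapsed into a direct function call rather than a network link.

```python
# Illustrative sketch of the Fig. 4 pipeline: acquisition -> encoding ->
# transmission -> decoding -> display, with zlib as a stand-in codec.
import zlib

def project_frame(frame_bytes):
    # Source side (device 410): the acquisition module captures the frame,
    # and the encoding module compresses it into a data stream.
    stream = zlib.compress(frame_bytes)
    # Target side (device 420): the receiving module obtains the stream,
    # and the decoding module restores the frame for the display module.
    return zlib.decompress(stream)
```

The mirror image displayed on the target equals the captured frame bit for bit, which is the mirroring property the embodiment relies on.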
In addition, in the embodiment of the present application, it is further proposed that an input operation (for example, zooming in, zooming out, scrolling, flipping, moving, etc.) performed through the input module 4204 on the picture displayed by the display module 4203 generates an input operation instruction, which is transmitted to the input processing module 4205, and the input processing module 4205 determines whether the input operation instruction is to be responded to by the screen projection target device 420 or by the screen projection source device 410. If the input operation instruction is to be responded to by the screen projection source device 410, the input processing module 4205 needs to perform adaptation processing on the input operation instruction and transmit the adapted operation instruction to the input injection module 4104 through the transmission module 4206, and then the input injection module 4104 performs the operation on the current picture of the application run by the screen projection source device 410.
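The routing decision of the input processing module 4205 can be sketched as follows. The event fields and the callback-based interfaces are illustrative assumptions; the actual adaptation and injection interfaces of modules 4205 and 4104 are not specified at this level of detail.

```python
# Hedged sketch of the input processing module's routing decision: an
# operation instruction is either answered locally on the projection target
# or adapted and injected back into the projection source.

def route_operation(event, local_can_handle, adapt, inject):
    """Return 'local' or 'source' depending on which device responds.

    local_can_handle: predicate deciding whether the target answers itself.
    adapt:            adaptation step applied before sending to the source.
    inject:           hands the adapted instruction to the injection module.
    """
    if local_can_handle(event):
        return "local"
    inject(adapt(event))     # adapt first, then forward to the source device
    return "source"
```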
As can be seen, the screen projection target device 420 can operate on the screen projection picture by zooming or moving it, and can also operate on the current picture of the application run by the screen projection source device 410, so that interactive operation on the application picture is realized and the screen projection use experience is improved. The following embodiments of the present application will be specifically described with reference to the method examples.
In conjunction with the above description, the steps of the implementation of the cross-device display method will be described below in terms of method examples, please refer to fig. 5. Fig. 5 is a schematic flowchart of a cross-device display method for a picture provided in an embodiment of the present application, where the method includes:
S510, displaying a first picture.
The first screen may be a current mirror image of a running screen of the first application on the screen projection source device.
It should be noted that the first application in the embodiment of the present application may be an application program or media data, such as a photo, a video, an audio, a game, a gallery, a document or multimedia, running on an application layer of the screen projection source device. Meanwhile, the screen projection source device can run the first application at the front end and can also run the first application at the back end. When the screen projection source equipment runs the first application at the front end, a display screen of the screen projection source equipment can display a running picture of the first application; when the screen projection source device runs the first application at the back end, the display screen of the screen projection source device may not display the running picture of the first application, and the background executes the first application.
It should be further noted that, in the embodiment of the present application, a picture that is displayed on the display screen of the screen projection target device and is projected by the screen projection source device may be referred to as a mirror image (i.e., a first picture or a target picture), and the mirror image may present all pictures or a part of pictures of the running picture of the first application. Meanwhile, the mirror image is already a picture which is presented after relevant operations such as resolution adjustment, display size adjustment, picture adjustment and the like in the screen projection process.
In addition, since the input operation can be performed on the screen displayed by the display module 4203 through the input module 4204 in the embodiment of the present application, a current mirror image (i.e., a first screen) displayed on the display screen of the screen projection target device may be a target mirror image (i.e., a target screen) to be projected by the screen projection source device with respect to the running screen of the first application, or may be a mirror image after performing operations such as zooming in, zooming out, or moving with respect to the target screen. For convenience of distinction, in the embodiments of the present application, a target image to be projected by a screen projection source device with respect to a running picture of a first application is referred to as a target picture, and a picture currently displayed on a display screen of a screen projection target device is referred to as a first picture. In the following, the screen projection source device is taken as a notebook computer and the screen projection target device is taken as a mobile phone in the embodiment of the present application for illustration.
As shown in fig. 6, 7, 8 and 9, for example. In fig. 6, the user opens the photo "sketched cat.jpg" through the "gallery" application on the notebook computer, and uses a screen projection technique to project the current picture of the "gallery" application onto the mobile phone. At this time, the target picture displayed on the display screen of the mobile phone is the same as the current picture of the "gallery" application on the notebook computer. However, in fig. 7, by setting the relevant screen projection option, the user can project only the "sketched cat.jpg" photo opened in the "gallery" application on the notebook computer to the mobile phone. At this time, the target picture displayed on the display screen of the mobile phone is only a partial picture of the current picture of the "gallery" application. In fig. 8, the user can perform an enlargement operation on the target picture on the display screen of the mobile phone shown in fig. 7 by sliding on the touch screen. When the mobile phone responds to the zoom-in operation, the display screen of the mobile phone displays the current picture, as shown in fig. 9. Similarly, in fig. 10, the user can perform a zoom-out operation on the target picture on the display screen of the mobile phone shown in fig. 7 by sliding on the touch screen. When the mobile phone responds to the zoom-out operation, the display screen of the mobile phone displays the current picture, as shown in fig. 11.
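The zoom operations illustrated in figs. 8 to 11 can be modelled as a scale factor applied to the target mirror. The dictionary representation and the clamping bounds below are assumptions made for this sketch, not values from the embodiment.

```python
# Hypothetical model of the zoom-in / zoom-out operations: the displayed
# mirror carries a scale factor relative to the target picture; zooming
# multiplies it, clamped to an assumed sensible range.

def apply_zoom(view, factor, min_scale=0.25, max_scale=8.0):
    """view is a dict with a 'scale' key; factor > 1 zooms in, < 1 out."""
    new_scale = min(max(view["scale"] * factor, min_scale), max_scale)
    return {**view, "scale": new_scale}
```

Starting from the target picture (`scale == 1.0`), fig. 9 corresponds to a factor above 1 and fig. 11 to a factor below 1.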
Specifically, the input module 4204 of the embodiment of the present application needs to be determined according to the electronic device. For example, when the electronic device is a device with a touch screen, such as a mobile phone or a smart watch, the input module may be the touch screen. At this time, the user can perform an input operation (such as zooming in, zooming out, scrolling, flipping, moving, etc.) on the displayed picture by sliding on the touch screen. When the electronic device is a desktop computer, a notebook computer, a smart television or another device without a touch screen, the input module may be an external device (such as a mouse, a keyboard, a remote controller, etc.). At this time, the user can perform an input operation on the displayed picture by operating the external device. In addition, when the external device is a mouse, the user can move the picture by pressing the left mouse button; alternatively, the picture can be enlarged or reduced by sliding the mouse wheel up and down, which is not particularly limited.
S520, acquiring a first operation event aiming at the first picture.
It should be noted that, in the embodiment of the present application, an input operation may be performed on the first screen displayed by the screen projection target device through the input module 4204 to generate a first operation event. Therefore, the first operation event may be an operation of moving the first screen in the first direction on the display screen of the screen projection target device, an operation of clicking the first screen on the display screen of the screen projection target device, or an operation of enlarging or reducing the first screen on the display screen of the screen projection target device, and is not particularly limited.
S530, judging whether a first instruction corresponding to the first operation event is responded at the local terminal according to the execution action of the first operation event on the first picture.
The following embodiment of the present application will specifically describe implementation manners of how the screen projection target device determines, according to the action performed by the first operation event on the first picture, whether to respond to the first instruction corresponding to the first operation event at the home terminal.
The first method is as follows:
in one possible example, determining, according to the action performed by the first operation event on the first picture, whether to respond to the first instruction corresponding to the first operation event at the local terminal may include the following step: if the first operation event is an operation of moving the first picture along the first direction, judging, according to the first picture and the target picture, whether the first instruction corresponding to the first operation event is responded to at the local terminal.
The target picture can be a target mirror image to be projected by the screen projection source device for the running picture of the first application.
It should be noted that, in the embodiment of the present application, since the input module 4204 can perform an input operation on the picture displayed by the screen projection target device, the current mirror image (i.e., the first picture) displayed on the display screen of the screen projection target device may be the target picture, or may be a picture obtained after operations such as enlarging, reducing, or moving are performed on the target picture. When the user needs to move the first picture along the first direction on the display screen of the screen projection target device, the embodiment of the application judges, according to the first picture and the target picture, whether to move the first picture along the first direction on the display screen of the screen projection target device. Because the screen projection target device can directly call its internal related modules (such as the GPU, the image processing module, etc.) to compare the first picture with the target picture, the processing time required to judge whether to respond to the first instruction corresponding to the first operation event at the local terminal is short, which is beneficial to improving the processing efficiency of the user's interactive operation on the picture and improving the screen projection use experience.
Specifically, the target screen presents the entire screen of the running screen of the first application; or, the target picture only presents a partial picture of the running picture of the first application.
It is to be understood that the target picture displayed on the display screen of the screen projection target device may be the entire picture to be projected by the screen projection source device for the running picture of the first application (as shown in fig. 6), or may be a partial picture to be projected by the screen projection source device for the running picture of the first application (as shown in fig. 7). Therefore, the method and the device are beneficial to realizing diversity of interactive operation on the application picture and improving the screen projection use experience.
Specifically, the first direction may be used to indicate any direction of a plane where a current display screen of the screen projection target device is located. For example, the first direction may be horizontal left, horizontal right, horizontal down, horizontal up, horizontal oblique left, etc. parallel to the plane of the display screen of the screen projection target device, and may be arc left, arc right, arc down, arc up, arc oblique up, or arc oblique left, etc. parallel to the plane of the display screen of the screen projection target device.
It should be noted that, taking the screen projection target device as a mobile phone as an example, because the gestures of the user holding the mobile phone are different, the plane where the display screen of the mobile phone is located may be perpendicular (or parallel, oblique, etc.) to the ground. Therefore, the first direction in the embodiment of the application needs to be specifically analyzed according to the plane where the current display screen of the screen projection target device is located.
In one possible example, after judging, according to the first picture and the target picture, whether to respond to the first instruction corresponding to the first operation event at the local terminal, the method further includes the following step: if the local terminal responds to the first instruction corresponding to the first operation event, moving the first picture along the first direction to display a second picture.
The second picture is a mirror image of the running picture of the first application, which is different from the first picture.
For example, in fig. 12, the user can slide the touch screen of the mobile phone shown in fig. 9 to move the current screen (i.e., the first screen) in the display screen to the left along a plane parallel to the display screen of the mobile phone. When the mobile phone responds to the moving operation, a current picture (i.e., a second picture) is displayed on the display screen of the mobile phone, as shown in fig. 13.
In one possible example, the determining, according to the first screen and the target screen, whether to respond to the first instruction corresponding to the first operation event at the local terminal may include: if the first picture is not the same as the target picture, judging whether a first instruction corresponding to the first operation event is responded at the home terminal or not according to the relation between the first picture and the target picture; or if the first picture and the target picture are the same picture, the first instruction corresponding to the first operation event is not responded to the local terminal.
It should be noted that the first screen and the target screen are different screens, and it can be understood that the user has performed an input operation on the target screen displayed on the display screen of the screen projection target device, so that the currently displayed screen is different from the target screen (as shown in fig. 9). At this time, the embodiment of the present application considers whether to move the first screen in the first direction on the display screen of the screen projection target device according to the relationship between the first screen and the target screen (i.e., the enlarged screen, the reduced screen, and/or the moving screen). Similarly, the first picture and the target picture are the same picture, and it can be understood that the user does not perform an input operation on the target picture displayed on the display screen of the screen projection target device, so that the currently displayed picture is the same as the target picture. Therefore, the method and the device are beneficial to realizing the diversity of interactive operation of the applied pictures and improving the use experience of screen projection.
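The decision described above can be sketched compactly. Representing a picture as a (scale, offset_x, offset_y) tuple is an assumption made for illustration; the embodiment only requires that the two pictures can be compared.

```python
# Sketch of the judgment: when the displayed mirror is still the unmodified
# target picture, a move operation is not answered at the local terminal;
# when the mirror has been zoomed in, the move can be handled locally by
# panning the enlarged picture.

def respond_locally(first, target):
    """first/target are hypothetical (scale, offset_x, offset_y) tuples."""
    if first == target:
        return False          # same picture: forward to the source device
    scale, ox, oy = first
    return scale > 1.0        # a zoomed-in mirror can be panned locally
```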
In addition, to illustrate the difference between responding and not responding to the first instruction corresponding to the first operation event at the local terminal, the embodiment of the present application takes the screen projection source device as a notebook computer and the screen projection target device as a mobile phone for illustration.
Responding to the first instruction corresponding to the first operation event at the local terminal can be understood as the mobile phone itself responding to the first instruction corresponding to the first operation event. For example, in fig. 13, the mobile phone responds to the moving operation and moves the partial picture (i.e., the second picture) on its display screen. Conversely, not responding to the first instruction corresponding to the first operation event at the local terminal can be understood as the mobile phone not responding to the first instruction corresponding to the first operation event. In this case, the mobile phone needs to perform adaptation processing on the first operation event and transmit the adapted operation instruction (i.e., the second instruction corresponding to the first operation event) to the notebook computer for execution. The first operation event needs to be adapted for several reasons. First, the input module of the mobile phone may be a touch screen while the input module of the notebook computer may be an external mouse, so an input operation generated through the touch screen needs to be adapted to the characteristics of a mouse input operation. Second, because the display screen of the mobile phone and the display screen of the notebook computer often differ in size, an input operation of sliding a certain length, moving in a certain direction, or clicking a certain position on the touch screen of the mobile phone cannot be reproduced with the same length, direction, or position on the notebook computer, so the input operation generated by the mobile phone needs corresponding adaptation. Third, the mobile phone and the notebook computer may run different operating systems, software and hardware architectures, etc., which also requires corresponding adaptation processing.
For example, in fig. 14, the user can slide the touch screen of the mobile phone shown in fig. 7 to move the current picture (i.e., the first picture) on the display screen horizontally to the right along a plane parallel to the display screen of the mobile phone. Because the current picture and the target picture are the same picture, the mobile phone does not respond to the instruction corresponding to the moving operation. Instead, the mobile phone performs adaptation processing on the moving operation and transmits the instruction corresponding to the adapted moving operation (i.e., the second instruction) to the notebook computer for execution. Then, after responding to the instruction corresponding to the adapted moving operation, the notebook computer switches "sketch cat.jpg" opened by the "gallery" application to "real cat.jpg" and synchronizes the mirror image of "real cat.jpg" to the mobile phone by using the screen projection technology, as shown in fig. 15. In this way, the mobile phone directly switches the current picture of the "gallery" application on the notebook computer, the interactive operation of the picture of the application is realized, and the screen projection use experience is improved.
Further, the judging, according to the relationship between the first picture and the target picture, whether to respond to the first instruction corresponding to the first operation event at the local terminal may include the following steps: if the first picture is a partial picture of the enlarged target picture, judging, according to whether the first picture would move beyond the edge of the target picture along the first direction, whether to respond to the first instruction corresponding to the first operation event at the local terminal; or, if the first picture is the full picture of the reduced target picture, not responding to the first instruction corresponding to the first operation event at the local terminal.
It should be noted that, in the embodiment of the present application, the relationship between the first picture and the target picture is determined according to whether the first picture is a partial picture of the enlarged target picture or the full picture of the reduced target picture, so as to respectively judge whether to move the first picture along the first direction on the display screen of the screen projection target device, thereby implementing the interactive operation of the picture of the application.
For example, in fig. 9, the current picture displayed by the mobile phone is a partial picture of the enlarged target picture; in this case, whether the mobile phone responds to the first instruction depends on whether moving the current picture along the first direction would exceed the edge of the target picture. In fig. 10, the current picture displayed by the mobile phone is the full picture of the reduced target picture; in this case, the mobile phone will not respond to the instruction corresponding to the first operation event, and instead the notebook computer responds to the adapted instruction after the first operation event is adapted.
Further, the judging, according to whether the first picture would move beyond the edge of the target picture along the first direction, whether to respond to the first instruction corresponding to the first operation event at the local terminal may include the following steps: if the first picture would move beyond the edge of the target picture along the first direction, not responding to the first instruction corresponding to the first operation event at the local terminal; or, if the first picture would not move beyond the edge of the target picture along the first direction, responding to the first instruction corresponding to the first operation event at the local terminal.
Illustratively, in fig. 16 (a), if the user slides the touch screen of the mobile phone shown in fig. 9 to move the current picture on its display screen horizontally to the right along a plane parallel to the display screen of the mobile phone, the horizontal rightward movement of the current picture will exceed the edge of the target picture, as shown in fig. 16 (b). In this case, the mobile phone will not respond to the instruction corresponding to the moving operation.
For example, in fig. 12, the user can slide the touch screen of the mobile phone shown in fig. 9 to move the current picture on the display screen horizontally to the left along a plane parallel to the display screen of the mobile phone. Because the horizontal leftward movement of the current picture does not exceed the edge of the target picture, the mobile phone may respond to the instruction corresponding to the moving operation, and the current picture displayed on the display screen of the mobile phone is as shown in fig. 13.
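Modeling the pictures as rectangles, the edge check described above might be sketched as follows; this is an illustrative simplification, and the `Rect` type and coordinate convention are assumptions, not part of this application:

```python
from dataclasses import dataclass

@dataclass
class Rect:
    x: float  # left edge
    y: float  # top edge
    w: float  # width
    h: float  # height

def can_respond_locally(visible: Rect, content: Rect, dx: float, dy: float) -> bool:
    """True if shifting the visible first picture by (dx, dy) keeps it
    within the edges of the (enlarged) target picture's content area."""
    nx, ny = visible.x + dx, visible.y + dy
    return (nx >= content.x and ny >= content.y
            and nx + visible.w <= content.x + content.w
            and ny + visible.h <= content.y + content.h)
```

For instance, with a 100x100 content area and a 50x50 visible region at (10, 10), a 5-pixel pan stays within the edge while a 50-pixel pan exceeds it, matching the two outcomes of fig. 12 and fig. 16.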
Further, after the first instruction corresponding to the first operation event is not responded to at the local terminal, the method further includes the following steps: displaying a target window on the display screen of the screen projection target device, where the target window is used for prompting that the first picture cannot move along the first direction; or calling a sensing module of the screen projection target device to make the screen projection target device vibrate; or calling a loudspeaker of the screen projection target device to make the screen projection target device sound.
It should be noted that, when the screen projection target device cannot respond to the first instruction corresponding to the first operation event to move the first picture along the first direction, the screen projection target device reminds the user by means of a pop-up window, vibration, or sound, so that the interactive operation of the picture of the application is realized and the screen projection use experience is improved.
For example, in fig. 17 (a), the user slides the touch screen of the mobile phone shown in fig. 9 to move the current picture on the display screen horizontally to the right along a plane parallel to the display screen of the mobile phone. Because the horizontal rightward movement of the current picture exceeds the edge of the target picture, a window pops up on the display screen of the mobile phone to prompt that the current picture cannot be moved horizontally and to display "see previous photo", as shown in fig. 17 (b). When the user selects the "yes" option, the mobile phone adapts the relevant operation (the moving operation or the "yes" option operation, etc.) and transmits the adapted operation instruction to the notebook computer, and the notebook computer then views the previous photo in the "gallery" application. Finally, the notebook computer projects the current picture of the "gallery" application to the mobile phone, so that the previous photo is viewed on the mobile phone and the screen projection use experience is improved.
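The three reminder mechanisms above could be dispatched as in the following sketch; the `Device` class and its methods are assumptions for illustration, not from this application:

```python
# Hypothetical sketch of the three reminder mechanisms (pop-up window,
# vibration, sound). The Device interface is illustrative only.

class Device:
    def __init__(self):
        self.events = []
    def show_window(self, text):   # display the target window
        self.events.append(("popup", text))
    def vibrate(self):             # call the sensing module
        self.events.append(("vibrate", None))
    def beep(self):                # call the loudspeaker
        self.events.append(("sound", None))

def remind_user(device, how="popup"):
    """Notify the user that the first picture cannot move further."""
    if how == "popup":
        device.show_window("The picture cannot move along this direction")
    elif how == "vibrate":
        device.vibrate()
    elif how == "sound":
        device.beep()
```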
Manner two:
In one possible example, the judging, according to the action performed by the first operation event on the first picture, whether to respond to the first instruction corresponding to the first operation event at the local terminal may include the following steps: if the first operation event is clicking the first picture, not responding to the first instruction corresponding to the first operation event at the local terminal.
It should be noted that, when the user clicks the first picture (or a certain position on the first picture) on the display screen of the screen projection target device, the embodiment of the present application does not respond to the first instruction corresponding to the first operation event at the local terminal, so that the processing efficiency of the interactive operation on the picture is improved and the screen projection use experience is improved.
Illustratively, in fig. 18, the user generates an input operation by clicking the position of the "x" on the display screen of the mobile phone shown in fig. 6. In this case, the mobile phone does not respond to the instruction corresponding to the input operation, but performs adaptation processing on the input operation and transmits the adapted operation instruction (i.e., the second instruction) to the notebook computer for execution. Then, after the notebook computer responds to the adapted operation instruction, the "gallery" application is closed and the screen projection from the notebook computer to the mobile phone ends, as shown in fig. 19.
Further, after the first instruction corresponding to the first operation event is not responded to at the local terminal, the method further includes the following steps: displaying a target window on the display screen of the screen projection target device, where the target window is used for prompting that the first picture cannot move along the first direction; or calling a sensing module of the screen projection target device to make the screen projection target device vibrate; or calling a loudspeaker of the screen projection target device to make the screen projection target device sound. As described above, details are not repeated herein.
And S540, if the local terminal does not respond to the first instruction corresponding to the first operation event, sending a second instruction corresponding to the first operation event to the screen projection source equipment.
The second instruction corresponding to the first operation event may be used for instructing the screen projection source device to execute an operation on the running picture of the first application.
In view of the above "manner one", the following describes, by way of example, how the screen projection target device determines the second instruction corresponding to the first operation event in the embodiment of the present application.
In one possible example, before sending the second instruction corresponding to the first operation event to the screen-projection source device, the method may further include the following steps: and determining a second instruction corresponding to the first operation event according to the first operation event and a preset operation mapping relation.
The preset operation mapping relation is used for representing the mapping relation between the operation aiming at the first picture and the operation aiming at the running picture of the first application.
As can be seen from the above description, different operating systems, software and hardware architectures, and the like may be installed on the screen projection source device and the screen projection target device. Therefore, in the process of establishing screen projection between the screen projection source device and the screen projection target device, the preset operation mapping relation is used to adapt an input operation of the screen projection target device to the screen projection source device, so as to ensure that the screen projection target device can operate the running picture of the application on the screen projection source device and realize the interactive operation of the picture of the application.
In the following, the embodiment of the present application illustrates the preset operation mapping relation by using the screen projection process between the notebook computer and the mobile phone illustrated in the above figures, as shown in table 1.
TABLE 1
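As an illustration only, such a preset operation mapping relation might be represented as a lookup table. The gesture and instruction names below are hypothetical and are not the actual contents of table 1, which is not reproduced here:

```python
# Hypothetical operation mapping between target-device touch gestures and
# source-device input instructions. These entries are illustrative and are
# NOT the contents of table 1 in this application.

OPERATION_MAP = {
    "tap":         "mouse_left_click",
    "long_press":  "mouse_right_click",
    "swipe_left":  "key_next",      # e.g. switch to the next photo
    "swipe_right": "key_previous",  # e.g. switch to the previous photo
}

def to_second_instruction(first_operation_event: str) -> str:
    """Translate a first operation event on the projection target into the
    second instruction to be sent to the projection source."""
    return OPERATION_MAP[first_operation_event]
```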
In view of the above "manner two", the following describes, by way of example, how the screen projection target device determines the second instruction corresponding to the first operation event in the embodiment of the present application.
In one possible example, before the sending the second instruction corresponding to the first operation event to the screen projection source device, the method may further include the following steps: acquiring a first zoom ratio and a second zoom ratio; and determining the second instruction corresponding to the first operation event according to the first zoom ratio, the second zoom ratio and the first operation event.
The first zoom ratio is used for representing the zoom ratio between the display size of the running picture of the first application and the display size of the target picture, the second zoom ratio is used for representing the zoom ratio between the display size of the first picture and the display size of the target picture, and the target picture is the initial mirror image of the running picture of the first application on the display screen of the screen projection target device.
As can be seen from the above description, the display screen of the screen projection source device and the display screen of the screen projection target device may differ in size, and the current picture displayed on the display screen of the screen projection target device may differ from the target picture. Therefore, in the process of establishing screen projection between the screen projection source device and the screen projection target device, the two zoom ratios are used to adapt an input operation of the screen projection target device to the screen projection source device, so as to ensure that the screen projection target device can operate the current picture of the application on the screen projection source device and realize the interactive operation of the picture of the application.
In fig. 6, the "gallery" application on the notebook computer displays "sketch cat.jpg", whose display size is different from that of the target picture displayed on the display screen of the mobile phone; therefore, the zoom ratio between them (i.e., the first zoom ratio) needs to be calculated and stored in the internal memory of the mobile phone. Then, in order to view the picture conveniently, the user enlarges the target picture displayed on the display screen of the mobile phone shown in fig. 6 (i.e., fig. 20 (a)) to obtain the current picture (i.e., the first picture) displayed on the display screen of the mobile phone shown in fig. 20 (b). At this time, the zoom ratio between the display size of the current picture and the display size of the target picture (i.e., the second zoom ratio) needs to be calculated and stored in the internal memory of the mobile phone. Finally, in fig. 20 (c), when the user clicks the position of the "x" on the display screen of the mobile phone shown in fig. 20 (b) to generate an input operation, the mobile phone does not respond to the input operation; instead, it performs adaptation processing according to the first zoom ratio, the second zoom ratio and the input operation, and transmits the adapted operation instruction to the notebook computer for execution. Then, the notebook computer closes the "gallery" application by responding to the adapted operation instruction, and the screen projection from the notebook computer to the mobile phone ends.
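One plausible way to combine the two zoom ratios when adapting a click coordinate is sketched below. The formula and the pan-offset parameter are assumptions for illustration; the application itself does not spell out the arithmetic:

```python
def map_click_to_source(x, y, first_zoom, second_zoom, pan=(0.0, 0.0)):
    """Map a click at (x, y) on the (possibly user-zoomed) first picture back
    into the coordinate space of the running picture on the source device.

    first_zoom:  source running-picture size / target (mirror) picture size
    second_zoom: first-picture size / target-picture size (user's local zoom)
    pan:         offset of the visible region within the zoomed content
    """
    # Undo the local zoom to recover target-picture coordinates, then scale
    # up into the source running picture's coordinate space.
    tx = (x + pan[0]) / second_zoom
    ty = (y + pan[1]) / second_zoom
    return tx * first_zoom, ty * first_zoom
```

With no local zoom (`second_zoom == 1.0`), the mapping reduces to scaling by the first zoom ratio alone, which matches the unzoomed click case described for fig. 18.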
In one possible example, after S540, the method further includes the following steps: displaying a third picture, where the third picture is a mirror image, different from the first picture, of the running picture of the first application after the screen projection source device responds to the second instruction corresponding to the first operation event.
It can be understood that, after responding to the second instruction corresponding to the first operation event, the screen projection source device displays the refreshed, changed or switched running picture of the first application and projects it to the screen projection target device, so as to implement synchronous display on the screen projection target device.
To sum up, the embodiments of the present application first exemplify the specific implementation of cross-device picture display by taking the screen projection source device as a notebook computer and the screen projection target device as a mobile phone. Input operations such as zooming in, zooming out, moving or clicking are performed on the current picture displayed on the display screen of the mobile phone, and it is judged whether each input operation should be executed on the mobile phone or adapted and transmitted to the notebook computer. In this way, the user can not only operate the projected picture on the display screen of the mobile phone by zooming, moving or clicking, but also control the current picture of the "gallery" application running on the notebook computer (for example, viewing different photos). Therefore, interactive control of the picture of the application is realized, the processing efficiency of the interactive operation on the picture of the application is improved, and the screen projection use experience is improved.
Then, because the mobile phone and the notebook computer may run different operating systems, software and hardware architectures, and the like, the present application can realize cross-device display of pictures between different devices, operating systems, or software and hardware architectures.
Finally, the application running on the notebook computer may also be a document editor, a video player, a music player, etc. In this case, the mobile phone may edit a document running on the notebook computer, switch a video played on the notebook computer, or switch music played on the notebook computer, which is not particularly limited.
It can be seen that, in the embodiment of the present application, the screen projection target device displays the first picture and acquires the first operation event for the first picture. Then, according to the action performed by the first operation event on the first picture, the screen projection target device judges whether to respond to the first instruction corresponding to the first operation event at the local terminal. If the local terminal does not respond to the first instruction, a second instruction corresponding to the first operation event is sent to the screen projection source device. The first picture is a current mirror image, on the display screen of the screen projection target device, of the running picture of the first application on the screen projection source device, and the second instruction corresponding to the first operation event is used for instructing the screen projection source device to execute an operation on the running picture of the first application, so that cross-device display of the picture is achieved. In addition, because the screen projection target device judges whether to respond to the first instruction corresponding to the first operation event at the local terminal, it can either respond at the local terminal to operate the projected picture on its own display screen, or, through adaptation processing, control the running picture of the first application on the screen projection source device. Cross-device interactive control of the picture of the application is thereby realized, the processing efficiency of the interactive operation on the picture is improved, and the screen projection use experience is improved.
Consistent with the embodiment described in fig. 5, please refer to fig. 21, where fig. 21 is a schematic flowchart of another cross-device display method for a frame provided in the embodiment of the present application, and the method includes:
and S2110, displaying a first picture.
The first screen may be a current mirror image of a running screen of the first application on the screen projection source device.
It should be noted that the first application in the embodiment of the present application may be an application program or media data, such as a photo, a video, an audio, a game, a gallery, a document or multimedia, running on an application layer of the screen projection source device. Meanwhile, the screen projection source device can run the first application at the front end and can also run the first application at the back end. When the screen projection source equipment runs the first application at the front end, a display screen of the screen projection source equipment can display a running picture of the first application; when the screen projection source device runs the first application at the back end, the display screen of the screen projection source device may not display the running picture of the first application, and the background executes the first application.
It should be further noted that, in the embodiment of the present application, a picture that is displayed on the display screen of the screen projection target device and is projected by the screen projection source device may be referred to as a mirror image (i.e., a first picture or a target picture), and the mirror image may present all pictures or a part of pictures of the running picture of the first application. Meanwhile, the mirror image is already a picture which is presented after relevant operations such as resolution adjustment, display size adjustment, picture adjustment and the like in the screen projection process.
In addition, since the input operation can be performed on the screen displayed by the display module 4203 through the input module 4204 in the embodiment of the present application, a current mirror image (i.e., a first screen) displayed on the display screen of the screen projection target device may be a target mirror image (i.e., a target screen) to be projected by the screen projection source device with respect to the running screen of the first application, or may be a mirror image after performing operations such as zooming in, zooming out, or moving with respect to the target screen. For convenience of distinction, in the embodiments of the present application, a target image to be projected by a screen projection source device with respect to a running picture of a first application is referred to as a target picture, and a picture currently displayed on a display screen of a screen projection target device is referred to as a first picture.
And S2120, acquiring a first operation event aiming at the first picture.
It should be noted that, in the embodiment of the present application, an input operation may be performed on the first screen displayed by the screen projection target device through the input module 4204 to generate a first operation event. Therefore, the first operation event may be an operation of moving the first screen in the first direction on the display screen of the screen projection target device, an operation of clicking the first screen on the display screen of the screen projection target device, or an operation of enlarging or reducing the first screen on the display screen of the screen projection target device, and is not particularly limited.
And S2130, if the first operation event is an operation of moving the first picture along the first direction, judging, according to the first picture and the target picture, whether to respond to the first instruction corresponding to the first operation event at the local terminal.
The target picture can be a target mirror image to be projected by the screen projection source device for the running picture of the first application.
Specifically, the target screen presents the entire screen of the running screen of the first application; or, the target picture only presents a partial picture of the running picture of the first application.
Specifically, the first direction may be used to indicate any direction of a plane where a current display screen of the screen projection target device is located. For example, the first direction may be horizontal left, horizontal right, horizontal down, horizontal up, horizontal oblique left, etc. parallel to the plane of the display screen of the screen projection target device, and may be arc left, arc right, arc down, arc up, arc oblique up, or arc oblique left, etc. parallel to the plane of the display screen of the screen projection target device.
In one possible example, the determining, according to the first picture and the target picture, whether to respond to the first instruction corresponding to the first operation event at the local terminal may include: if the first picture and the target picture are not the same picture, judging, according to the relationship between the first picture and the target picture, whether to respond to the first instruction corresponding to the first operation event at the local terminal; or, if the first picture and the target picture are the same picture, not responding to the first instruction corresponding to the first operation event at the local terminal.
In one possible example, the judging, according to the relationship between the first picture and the target picture, whether to respond to the first instruction corresponding to the first operation event at the local terminal may include: if the first picture is a partial picture of the enlarged target picture, judging, according to whether the first picture would move beyond the edge of the target picture along the first direction, whether to respond to the first instruction corresponding to the first operation event at the local terminal; or, if the first picture is the full picture of the reduced target picture, not responding to the first instruction corresponding to the first operation event at the local terminal.
In one possible example, the judging, according to whether the first picture would move beyond the edge of the target picture along the first direction, whether to respond to the first instruction corresponding to the first operation event at the local terminal may include: if the first picture would move beyond the edge of the target picture along the first direction, not responding to the first instruction corresponding to the first operation event at the local terminal; or, if the first picture would not move beyond the edge of the target picture along the first direction, responding to the first instruction corresponding to the first operation event at the local terminal.
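The branches of S2130 can be consolidated into one decision sketch; the picture model (a dict with a zoom factor and visible/content rectangles as (x, y, w, h) tuples) and all names here are illustrative assumptions, not from this application:

```python
# Consolidated sketch of the S2130 decision flow: same picture -> forward;
# enlarged partial picture -> edge check; reduced full picture -> forward.

def within_edges(visible, content, dx, dy):
    vx, vy, vw, vh = visible
    cx, cy, cw, ch = content
    nx, ny = vx + dx, vy + dy
    return cx <= nx and cy <= ny and nx + vw <= cx + cw and ny + vh <= cy + ch

def handle_move_event(first, target, dx, dy):
    """Return 'local' if the target device responds to the first instruction
    itself, or 'forward' if the adapted second instruction is sent to the
    screen projection source device."""
    if first == target:
        return "forward"                     # same picture: forward to source
    if first["zoom"] > 1.0:                  # enlarged partial picture
        ok = within_edges(first["visible"], first["content"], dx, dy)
        return "local" if ok else "forward"  # edge check decides
    return "forward"                         # reduced full picture
```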
S2140, if the local terminal does not respond to the first instruction corresponding to the first operation event, determining a second instruction corresponding to the first operation event according to the first operation event and a preset operation mapping relation.
S2150, if the local terminal responds to the first instruction corresponding to the first operation event, moving the first picture along the first direction to display a second picture.
The second picture is a mirror image of the running picture of the first application, which is different from the first picture.
S2160, sending a second instruction corresponding to the first operation event to the screen projection source equipment.
And the second instruction corresponding to the first operation event is used for indicating the screen projection source equipment to execute operation on the running picture of the first application.
And S2170, displaying a third screen.
The third picture is a mirror image of the running picture of the first application, which is different from the first picture, after the screen projection source equipment responds to the second instruction corresponding to the first operation event.
It can be understood that, after responding to the second instruction corresponding to the first operation event, the screen projection source device displays the refreshed, changed or switched running picture of the first application and projects it to the screen projection target device, so as to implement synchronous display on the screen projection target device.
It should be noted that, because the descriptions of the embodiments have respective emphasis, parts that are not described in detail in the specific embodiment illustrated in fig. 21 may refer to the description of the specific embodiment in fig. 5, and are not described again here.
It can be seen that, in the embodiment of the present application, the screen projection target device displays the first picture and acquires the first operation event for the first picture. Secondly, if the first operation event is an operation of moving the first picture along the first direction, whether to respond to the first instruction corresponding to the first operation event at the local terminal is judged according to the first picture and the target picture. Thirdly, if the local terminal does not respond to the first instruction corresponding to the first operation event, a second instruction corresponding to the first operation event is determined according to the first operation event and the preset operation mapping relation; if the local terminal responds to the first instruction corresponding to the first operation event, the first picture is moved along the first direction to display the second picture. Finally, the second instruction corresponding to the first operation event is sent to the screen projection source device, and a third picture is displayed. The first picture is a current mirror image, on the display screen of the screen projection target device, of the running picture of the first application on the screen projection source device, and the second instruction corresponding to the first operation event is used for instructing the screen projection source device to execute an operation on the running picture of the first application, so that cross-device display of the picture is achieved.
In addition, because the screen projection target device judges whether to respond to the first instruction corresponding to the first operation event at the local terminal, the screen projection target device can either operate on the projected picture on its own display screen or, through adaptation processing, control the running picture of the first application on the screen projection source device. Cross-device interactive control of application pictures is thereby realized, the processing efficiency of picture interaction operations is improved, and the screen projection experience is improved.
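The "respond locally or forward to the source" decision for a move event, as laid out in the judging rules of the fig. 5 embodiment, can be sketched as follows. This is an illustrative reconstruction, not the patent's implementation: the coordinate model (pictures as edge rectangles in display-screen coordinates) and all names are assumptions.

```python
# Illustrative sketch (not the patent's implementation) of the local-response
# decision for a "move along the first direction" event. Pictures are modeled
# as (left, top, right, bottom) edges in display-screen coordinates, and the
# direction as a (dx, dy) displacement; these modeling choices are assumptions.

def should_respond_locally(first, target, direction):
    """Return True if the move should be handled on the screen projection
    target device itself, False if it must be forwarded to the source."""
    if first == target:
        return False                    # same picture: forward to the source
    left, top, right, bottom = first
    t_left, t_top, t_right, t_bottom = target
    enlarged = (right - left) > (t_right - t_left)
    if not enlarged:
        return False                    # reduced full picture: forward
    dx, dy = direction
    n_left, n_top = left + dx, top + dy
    n_right, n_bottom = right + dx, bottom + dy
    # Respond locally only while the enlarged first picture, after the move,
    # still covers the target picture (has not moved beyond its edge).
    return (n_left <= t_left and n_top <= t_top and
            n_right >= t_right and n_bottom >= t_bottom)
```

Under this model, an enlarged partial view pans locally until its edge would cross the edge of the target picture, at which point the event is forwarded so the source device can scroll the running picture instead.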
Referring to fig. 22, in accordance with the embodiments described in fig. 5 and fig. 21, fig. 22 is a schematic flowchart of another cross-device picture display method provided in the embodiment of the present application, where the method includes:
S2210, displaying the first picture.
The first screen may be a current mirror image of a running screen of the first application on the screen projection source device.
It should be noted that the first application in the embodiment of the present application may be an application program or media data running on the application layer of the screen projection source device, such as a photo, a video, an audio, a game, a gallery, a document, or multimedia. Meanwhile, the screen projection source device can run the first application at the front end or at the back end. When the screen projection source device runs the first application at the front end, the display screen of the screen projection source device can display the running picture of the first application; when the screen projection source device runs the first application at the back end, the display screen of the screen projection source device may not display the running picture of the first application, and the first application is executed in the background.
It should be further noted that, in the embodiment of the present application, a picture that is displayed on the display screen of the screen projection target device and is projected by the screen projection source device may be referred to as a mirror image (i.e., a first picture or a target picture), and the mirror image may present all pictures or a part of pictures of the running picture of the first application. Meanwhile, the mirror image is already a picture which is presented after relevant operations such as resolution adjustment, display size adjustment, picture adjustment and the like in the screen projection process.
In addition, since an input operation can be performed, through the input module 4204, on the picture displayed by the display module 4203, the current mirror image (i.e., the first picture) displayed on the display screen of the screen projection target device may be the target mirror image (i.e., the target picture) to be projected by the screen projection source device with respect to the running picture of the first application, or may be a mirror image obtained after operations such as zooming in, zooming out, or moving are performed on the target picture. For ease of distinction, in the embodiments of the present application, the target mirror image to be projected by the screen projection source device with respect to the running picture of the first application is referred to as the target picture, and the picture currently displayed on the display screen of the screen projection target device is referred to as the first picture.
S2220, acquiring a first operation event for the first picture.
It should be noted that, in the embodiment of the present application, an input operation may be performed on the first screen displayed by the screen projection target device through the input module 4204 to generate a first operation event. Therefore, the first operation event may be an operation of moving the first screen in the first direction on the display screen of the screen projection target device, an operation of clicking the first screen on the display screen of the screen projection target device, or an operation of enlarging or reducing the first screen on the display screen of the screen projection target device, and is not particularly limited.
S2230, if the first operation event is an operation of clicking the first picture, not responding to the first instruction corresponding to the first operation event at the local terminal.
It should be noted that, when a user clicks the first picture (or a certain position on the first picture) on the display screen of the screen projection target device, the embodiment of the present application does not respond to the first instruction corresponding to the first operation event at the local terminal, which improves the processing efficiency of the user's picture interaction operation and improves the screen projection experience.
S2240, acquiring a first zoom ratio and a second zoom ratio.
The first zoom ratio is used for representing the zoom ratio between the display size of the running picture of the first application and the display size of the target picture, the second zoom ratio is used for representing the zoom ratio between the display size of the first picture and the display size of the target picture, and the target picture is the target mirror image, on the display screen of the screen projection target device, to be projected by the screen projection source device with respect to the running picture of the first application.
S2250, determining the second instruction corresponding to the first operation event according to the first zoom ratio, the second zoom ratio and the first operation event.
As can be seen from the above description, the display screen of the screen projection source device and the display screen of the screen projection target device may have different sizes, and the picture currently displayed on the display screen of the screen projection target device may differ from the target picture. Therefore, in the screen projection established between the screen projection source device and the screen projection target device, the input operation on the screen projection target device needs to be adapted to the screen projection source device, so as to ensure that the operation performed on the screen projection target device correctly acts on the running picture of the first application on the screen projection source device, thereby implementing the interactive operation on the picture of the application.
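Under the definitions in S2240, the adaptation for a click can be sketched as a coordinate transformation using the two zoom ratios. A minimal sketch, assuming the zoom ratios are expressed as simple size quotients, that both pictures share the same origin, and that the function and parameter names are purely illustrative:

```python
# Hypothetical sketch of adapting a tap position on the screen projection
# target device to the coordinate space of the running picture on the source
# device, using the two zoom ratios of S2240. All names are illustrative.

def map_tap_to_source(x, y, first_zoom, second_zoom):
    """
    x, y        : tap position on the first picture (target-device pixels)
    first_zoom  : running-picture display size / target-picture display size
    second_zoom : first-picture display size / target-picture display size
    Returns the corresponding position on the running picture.
    """
    # Undo the current zoom of the first picture relative to the target
    # picture, then scale up to the source device's running picture.
    tx, ty = x / second_zoom, y / second_zoom      # target-picture coordinates
    return tx * first_zoom, ty * first_zoom        # running-picture coordinates
```

For example, a tap at (100, 50) on a first picture shown at half the target-picture size, mirroring a running picture twice the target-picture size, maps to (400.0, 200.0) on the source.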
S2260, sending the second instruction corresponding to the first operation event to the screen projection source device.
And the second instruction corresponding to the first operation event is used for indicating the screen projection source equipment to execute operation on the running picture of the first application.
S2270, displaying the third picture.
The third picture is a mirror image, different from the first picture, of the running picture of the first application after the screen projection source device responds to the second instruction corresponding to the first operation event.
It can be understood that, after responding to the second instruction corresponding to the first operation event, the screen projection source device displays the refreshed, changed or switched running picture of the first application and projects that running picture to the screen projection target device, so that the screen projection target device displays it synchronously.
It should be noted that, because the descriptions of the embodiments have respective emphasis, parts that are not described in detail in the specific embodiment illustrated in fig. 22 may refer to the description of the specific embodiment in fig. 5, and are not described again here.
It can be seen that, in the embodiment of the present application, the screen projection target device displays the first picture and acquires the first operation event for the first picture. Secondly, if the first operation event is an operation of clicking the first picture, the first instruction corresponding to the first operation event is not responded to at the local terminal. Thirdly, the first zoom ratio and the second zoom ratio are acquired, and the second instruction corresponding to the first operation event is determined according to the first zoom ratio, the second zoom ratio and the first operation event. Finally, the second instruction corresponding to the first operation event is sent to the screen projection source device, and the third picture is displayed. The first picture is a current mirror image, on the display screen of the screen projection target device, of the running picture of the first application on the screen projection source device, and the second instruction corresponding to the first operation event is used for instructing the screen projection source device to execute an operation on the running picture of the first application, so that cross-device display of pictures is achieved.
In addition, because the screen projection target device judges whether to respond to the first instruction corresponding to the first operation event at the local terminal, the screen projection target device can either operate on the projected picture on its own display screen or, through adaptation processing, control the running picture of the first application on the screen projection source device. Cross-device interactive control of application pictures is thereby realized, the processing efficiency of picture interaction operations is improved, and the screen projection experience is improved.
The above description has introduced the solutions of the embodiments of the present application mainly from the perspective of the method-side implementation process. It can be understood that, in order to realize the above functions, the electronic device includes corresponding hardware structures and/or software modules for performing the respective functions. Those skilled in the art will readily appreciate that the units and algorithm steps of the examples described in connection with the embodiments provided herein can be implemented in hardware or in a combination of hardware and computer software. Whether a function is performed by hardware or by computer software driving hardware depends on the particular application and the design constraints of the technical solution. Skilled artisans may implement the described functionality in different ways for each particular application, but such implementation should not be considered beyond the scope of the present application.
In the embodiment of the present application, the electronic device may be divided into the functional units according to the method example, for example, each functional unit may be divided corresponding to each function, or two or more functions may be integrated into one processing unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit. It should be noted that the division of the units in the embodiment of the present application is illustrative, and is only one division of the logic functions, and there may be another division in actual implementation.
In the case of employing an integrated unit, fig. 23 shows a block diagram of the functional units of a picture cross-device display apparatus. The picture cross-device display apparatus 2300 is applied to a screen projection target device and specifically includes a processing unit 2320 and a communication unit 2330. The processing unit 2320 is used to control and manage the actions of the screen projection target device; for example, the processing unit 2320 is used to support the screen projection target device in performing some or all of the steps in fig. 5, as well as other processes of the techniques described herein. The communication unit 2330 is used to support communication between the screen projection target device and other devices. The picture cross-device display apparatus 2300 may further include a storage unit 2310 for storing program codes and data of the screen projection target device.
The processing unit 2320 may be a processor or controller, such as a CPU, a general-purpose processor, a DSP, an ASIC, an FPGA, transistor logic, a hardware component, or any combination thereof, and may implement or execute the various illustrative logical blocks, modules, and circuits described in connection with the embodiments disclosed herein. The processing unit 2320 may also be a combination that performs computing functions, for example a combination of one or more microprocessors, or a combination of a DSP and a microprocessor. The communication unit 2330 may be a communication interface, a transceiver, a transceiving circuit, and the like. The storage unit 2310 may be a memory. When the processing unit 2320 is a processor, the communication unit 2330 is a communication interface, and the storage unit 2310 is a memory, the picture cross-device display apparatus 2300 according to the embodiment of the present application may be the electronic device shown in fig. 24.
Specifically, the processing unit 2320 is configured to execute any step performed by the screen projection target device in the above method embodiment, and optionally invokes the communication unit 2330 to complete the corresponding operation when performing data transmission, such as sending. The details will be described below.
In a possible example, in terms of performing an action on the first screen according to the first operation event to determine whether to respond to the first instruction corresponding to the first operation event at the local end, the processing unit 2320 is specifically configured to: and if the first operation event is an operation of moving a first picture along a first direction, judging whether a first instruction corresponding to the first operation event is responded at the local terminal or not according to the first picture and a target picture, wherein the target picture is a target mirror image to be projected by the screen projection source equipment aiming at the running picture of the first application.
In a possible example, in terms of determining whether to respond to the first instruction corresponding to the first operation event at the local end according to the first screen and the target screen, the processing unit 2320 is specifically configured to: if the first picture is not the same as the target picture, judging whether a first instruction corresponding to the first operation event is responded at the home terminal or not according to the relation between the first picture and the target picture; or if the first picture and the target picture are the same picture, the first instruction corresponding to the first operation event is not responded to the local terminal.
In a possible example, in terms of determining whether to respond to the first instruction corresponding to the first operation event at the local end according to the relationship between the first screen and the target screen, the processing unit 2320 is specifically configured to: if the first picture is a partial picture after the target picture is amplified, judging whether a first instruction corresponding to a first operation event is responded at the local terminal according to whether the first picture moves along the first direction and exceeds the edge of the target picture; or if the first picture is a full picture after the target picture is reduced, the first instruction corresponding to the first operation event is not responded to the local terminal.
In a possible example, in terms of determining whether to respond to the first instruction corresponding to the first operation event at the local end according to whether the first image moves in the first direction and exceeds the edge of the target image, the processing unit 2320 is specifically configured to: if the first picture moves beyond the edge of the target picture along the first direction, a first instruction corresponding to the first operation event is not responded to the local terminal; or, if the first picture does not exceed the edge of the target picture along the first direction, responding to a first instruction corresponding to the first operation event at the local terminal.
In one possible example, before sending the second instruction corresponding to the first operation event to the screen-projection source device, the processing unit 2320 is further configured to: and determining a second instruction corresponding to the first operation event according to the first operation event and a preset operation mapping relation, wherein the preset operation mapping relation is used for representing the mapping relation between the operation aiming at the first picture and the operation aiming at the running picture of the first application.
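A preset operation mapping relation of the kind described in this example can be sketched as a lookup table that maps an operation on the first picture to an instruction for the running picture of the first application. The gesture names and mapped instructions below are purely illustrative assumptions, not the patent's actual mapping:

```python
# Hypothetical sketch of a preset operation mapping relation: operations on
# the first picture (mirror) map to instructions for the running picture on
# the screen projection source device. All names here are illustrative.

PRESET_OPERATION_MAP = {
    "click":      "inject_tap",     # tap on the mirror -> tap on the running picture
    "move_left":  "scroll_right",   # panning the mirror maps to scrolling content
    "move_right": "scroll_left",
    "zoom_in":    "scale_up",
    "zoom_out":   "scale_down",
}

def second_instruction(first_operation_event):
    """Determine the second instruction from the first operation event."""
    instruction = PRESET_OPERATION_MAP.get(first_operation_event)
    if instruction is None:
        raise ValueError(f"unsupported operation event: {first_operation_event}")
    return instruction
```

The table keeps the mapping explicit and extensible: adding support for a new gesture on the target device only requires a new entry, not a change to the dispatch logic.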
In one possible example, after determining whether to respond to the second instruction corresponding to the first operation event at the local end according to the first screen and the target screen, the processing unit 2320 is further configured to: and if the local terminal responds to a first instruction corresponding to the first operation event, the first picture is moved along the first direction to display a second picture, and the second picture is a mirror image of the running picture of the first application, which is different from the first picture.
In a possible example, in terms of performing an action on the first screen according to the first operation event to determine whether to respond to the first instruction corresponding to the first operation event at the local end, the processing unit 2320 is specifically configured to: if the first operation event is an operation of clicking the first picture, not respond to the first instruction corresponding to the first operation event at the local terminal.
In one possible example, before sending the second instruction corresponding to the first operation event to the screen-projection source device, the processing unit 2320 is further configured to: acquire a first zoom ratio and a second zoom ratio, wherein the first zoom ratio is used for representing the zoom ratio between the display size of the running picture of the first application and the display size of the target picture, the second zoom ratio is used for representing the zoom ratio between the display size of the first picture and the display size of the target picture, and the target picture is the target mirror image to be projected by the screen projection source device with respect to the running picture of the first application; and determine the second instruction corresponding to the first operation event according to the first zoom ratio, the second zoom ratio and the first operation event.
In one possible example, after sending the second instruction corresponding to the first operation event to the screen-projection source device, the processing unit 2320 is further configured to: and displaying a third picture, wherein the third picture is a mirror image of the running picture of the first application different from the first picture after the screen projection source equipment responds to a second instruction corresponding to the first operation event.
It can be seen that, in the embodiment of the present application, the first picture is displayed and the first operation event for the first picture is acquired. Then, whether the first instruction corresponding to the first operation event is responded to at the local terminal is judged according to the action that the first operation event performs on the first picture. If the local terminal does not respond to the first instruction, the second instruction corresponding to the first operation event is sent to the screen projection source device. The first picture is a current mirror image, on the display screen of the screen projection target device, of the running picture of the first application on the screen projection source device, and the second instruction corresponding to the first operation event is used for instructing the screen projection source device to execute an operation on the running picture of the first application, so that cross-device display of pictures is achieved. In addition, because the screen projection target device judges whether to respond to the first instruction corresponding to the first operation event at the local terminal, the screen projection target device can either operate on the projected picture on its own display screen or, through adaptation processing, control the running picture of the first application on the screen projection source device. Cross-device interactive control of application pictures is thereby realized, the processing efficiency of picture interaction operations is improved, and the screen projection experience is improved.
A schematic structural diagram of another electronic device 2400 provided in the embodiment of the present application is described below, as shown in fig. 24. Electronic device 2400 includes, among other things, a processor 2410, a memory 2420, a communication interface 2430, and at least one communication bus connecting processor 2410, memory 2420, and communication interface 2430.
The processor 2410 may be one or more central processing units (CPUs). When the processor 2410 is one CPU, the CPU may be a single-core CPU or a multi-core CPU. The memory 2420 includes, but is not limited to, random access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM), or compact disc read-only memory (CD-ROM), and the memory 2420 is used for storing related instructions and data. The communication interface 2430 is used for receiving and transmitting data.
The processor 2410 in the electronic device 2400 is configured to read the one or more programs 2421 stored in the memory 2420 for performing the following steps: displaying a first picture, wherein the first picture is a current mirror image of an operation picture of a first application on the screen projection source equipment; acquiring a first operation event aiming at a first picture; judging whether a first instruction corresponding to the first operation event is responded at the home terminal or not according to the execution action of the first operation event on the first picture; and if the local terminal does not respond to the first instruction corresponding to the first operation event, sending a second instruction corresponding to the first operation event to the screen projection source equipment, wherein the second instruction corresponding to the first operation event is used for indicating the screen projection source equipment to execute operation on the running picture of the first application.
It should be noted that, for specific implementation of each operation performed by the electronic device 2400, reference may be made to the corresponding description of the method embodiment shown in fig. 5, which is not described herein again.
It can be seen that, in the embodiment of the present application, the first picture is displayed and the first operation event for the first picture is acquired. Then, whether the first instruction corresponding to the first operation event is responded to at the local terminal is judged according to the action that the first operation event performs on the first picture. If the local terminal does not respond to the first instruction, the second instruction corresponding to the first operation event is sent to the screen projection source device. The first picture is a current mirror image, on the display screen of the screen projection target device, of the running picture of the first application on the screen projection source device, and the second instruction corresponding to the first operation event is used for instructing the screen projection source device to execute an operation on the running picture of the first application, so that cross-device display of pictures is achieved. In addition, because the screen projection target device judges whether to respond to the first instruction corresponding to the first operation event at the local terminal, the screen projection target device can either operate on the projected picture on its own display screen or, through adaptation processing, control the running picture of the first application on the screen projection source device. Cross-device interactive control of application pictures is thereby realized, the processing efficiency of picture interaction operations is improved, and the screen projection experience is improved.
Embodiments of the present application also provide a computer-readable storage medium, wherein the computer-readable storage medium stores a computer program for electronic data exchange, the computer program being operable to cause a computer to perform part or all of the steps of any of the methods as set forth in the above method embodiments.
Embodiments of the present application also provide a computer program product, where the computer program product includes a computer program operable to cause a computer to perform part or all of the steps of any one of the methods as described in the above method embodiments. The computer program product may be a software installation package.
For simplicity of description, each of the above method embodiments is described as a series of combined operations. However, those skilled in the art should appreciate that the present application is not limited by the order of the operations described, as some steps in the embodiments of the present application may be performed in other orders or concurrently. Moreover, those skilled in the art should also appreciate that the embodiments described in the specification are preferred embodiments and that the operations and modules involved are not necessarily required by the embodiments of the application.
In the foregoing embodiments, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
In the several embodiments provided in the present application, it should be understood that the described apparatus may be implemented in other ways; the apparatus embodiments described above are merely illustrative. For example, the division of the units is only a logical function division, and there may be other division modes in actual implementation; multiple units or components may be combined or integrated into another system, and some features may be omitted or not implemented. In addition, the shown or discussed mutual coupling, direct coupling, or communication connection may be an indirect coupling or communication connection through some interfaces, apparatuses, or units, and may be in electrical or other forms.
The above-mentioned units, if implemented in the form of software functional units and sold or used as independent products, may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present application, in essence, or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a computer software product. The computer software product is stored in a memory and includes several instructions for causing a computer device (a personal computer, a server, a network device, or the like) to perform all or part of the steps of the methods of the embodiments of the present application. The foregoing memory includes various media capable of storing program codes, such as a USB flash disk, a ROM, a RAM, a removable hard disk, a magnetic disk, or an optical disk.
While the embodiments of the present application have been described in detail above, it should be understood by those skilled in the art that these embodiments are only intended to assist in understanding the core concepts of the technical solutions of the present application, and that changes may be made to the specific implementations and the application scope. Accordingly, the contents of this specification should not be construed as limiting the present application.
Claims (13)
1. A picture cross-equipment display method is characterized by being applied to screen projection target equipment; the method comprises the following steps:
displaying a first picture, wherein the first picture is a current mirror image of an operation picture of a first application on the screen projection source equipment;
acquiring a first operation event aiming at the first picture;
judging whether a first instruction corresponding to the first operation event is responded at the home terminal or not according to the execution action of the first operation event on the first picture;
and if the local terminal does not respond to the first instruction corresponding to the first operation event, sending a second instruction corresponding to the first operation event to the screen projection source device, wherein the second instruction corresponding to the first operation event is used for indicating the screen projection source device to execute operation on the running picture of the first application.
2. The method according to claim 1, wherein the performing the action on the first screen according to the first operation event to determine whether to respond to a first instruction corresponding to the first operation event at a local terminal comprises:
if the first operation event is an operation of moving the first picture along a first direction, judging, according to the first picture and a target picture, whether a first instruction corresponding to the first operation event is responded to at the local terminal, wherein the target picture is a target mirror image to be projected by the screen projection source device with respect to the running picture of the first application.
3. The method according to claim 2, wherein the determining whether to respond to the first instruction corresponding to the first operation event at the local terminal according to the first picture and the target picture comprises:
if the first picture and the target picture are different pictures, judging whether a first instruction corresponding to the first operation event is responded at the local terminal or not according to the relation between the first picture and the target picture; or,
and if the first picture and the target picture are the same picture, not responding to a first instruction corresponding to the first operation event at the local terminal.
4. The method according to claim 3, wherein the determining, according to the relationship between the first picture and the target picture, whether to respond to the first instruction corresponding to the first operation event at the local terminal comprises:
if the first picture is a partial picture of the target picture after enlargement, determining, according to whether moving the first picture along the first direction exceeds an edge of the target picture, whether to respond to the first instruction corresponding to the first operation event at the local terminal; or
if the first picture is the full picture of the target picture after reduction, not responding to the first instruction corresponding to the first operation event at the local terminal.
5. The method according to claim 4, wherein the determining, according to whether moving the first picture along the first direction exceeds the edge of the target picture, whether to respond to the first instruction corresponding to the first operation event at the local terminal comprises:
if moving the first picture along the first direction exceeds the edge of the target picture, not responding to the first instruction corresponding to the first operation event at the local terminal; or
if moving the first picture along the first direction does not exceed the edge of the target picture, responding to the first instruction corresponding to the first operation event at the local terminal.
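The decision logic of claims 2-5 can be sketched as follows: a move gesture is handled on the screen projection target device only while the enlarged partial picture can still pan within the target picture; once the move would exceed the target picture's edge, or when the first picture is the same as or a reduced version of the target picture, the event is not responded to locally. This is a minimal illustrative sketch under assumed conventions, not the patented implementation; the function name, the one-dimensional viewport model, and all parameters are invented for illustration.

```python
def should_respond_locally(scale: float, offset: float,
                           view_len: float, target_len: float,
                           delta: float) -> bool:
    """Decide whether a pan gesture is handled on the target device.

    scale      -- zoom factor of the first picture relative to the target picture
    offset     -- current position of the visible window inside the target picture
    view_len   -- length of the visible window along the pan axis
    target_len -- full length of the target picture along the pan axis
    delta      -- signed pan distance of the move gesture
    """
    if scale <= 1.0:
        # Same picture, or reduced full picture: never respond locally;
        # the event is forwarded to the screen projection source device.
        return False
    # Enlarged partial picture: respond locally only while the moved
    # window stays entirely inside the target picture.
    new_offset = offset + delta
    return 0.0 <= new_offset and new_offset + view_len <= target_len
```

With this model, a swipe on a zoomed-in mirror simply pans the mirror (a "first instruction"), and only a swipe at the mirror's boundary is translated into a "second instruction" for the source device.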
6. The method according to claim 2, wherein before the sending the second instruction corresponding to the first operation event to the screen projection source device, the method further comprises:
determining the second instruction corresponding to the first operation event according to the first operation event and a preset operation mapping relationship, wherein the preset operation mapping relationship represents a mapping between an operation on the first picture and an operation on the running picture of the first application.
7. The method according to claim 2, wherein after the determining, according to the first picture and the target picture, whether to respond to the first instruction corresponding to the first operation event at the local terminal, the method further comprises:
if the first instruction corresponding to the first operation event is responded to at the local terminal, moving the first picture along the first direction to display a second picture, wherein the second picture is a mirror image, different from the first picture, of the running picture of the first application.
8. The method according to claim 1, wherein the determining, according to an action performed by the first operation event on the first picture, whether to respond to a first instruction corresponding to the first operation event at the local terminal comprises:
if the first operation event is an operation of clicking the first picture, not responding to the first instruction corresponding to the first operation event at the local terminal.
9. The method according to claim 8, wherein before the sending the second instruction corresponding to the first operation event to the screen projection source device, the method further comprises:
acquiring a first scaling ratio and a second scaling ratio, wherein the first scaling ratio represents the scaling ratio between the display size of the running picture of the first application and the display size of a target picture, the second scaling ratio represents the scaling ratio between the display size of the first picture and the display size of the target picture, and the target picture is a target mirror image, to be projected by the screen projection source device, of the running picture of the first application; and
determining, according to the first scaling ratio, the second scaling ratio and the first operation event, the second instruction corresponding to the first operation event to be performed on the running picture of the first application.
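Claim 9's two scaling ratios suggest a coordinate conversion for a click: undo the target device's local zoom via the second scaling ratio, then apply the source-side scaling via the first scaling ratio. The sketch below is a hypothetical reading of that mapping, assuming uniform scaling and aligned origins; the function name and coordinate conventions are assumptions, not the patent's disclosure.

```python
def map_click_to_source(x: float, y: float,
                        first_ratio: float,
                        second_ratio: float) -> tuple:
    """Map a click on the first picture to the running picture.

    first_ratio  -- running-picture display size / target-picture display size
    second_ratio -- first-picture display size / target-picture display size
    """
    # First picture -> target picture: undo the zoom applied on the
    # screen projection target device.
    tx, ty = x / second_ratio, y / second_ratio
    # Target picture -> running picture: apply the source-side scaling,
    # yielding coordinates the source device can inject into the app.
    return tx * first_ratio, ty * first_ratio
```

For example, a click at (100, 50) on a half-size mirror (second_ratio 0.5) of a target picture that is half the source picture (first_ratio 2.0) maps to (400, 200) on the running picture.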
10. The method according to any one of claims 1-9, wherein after the sending the second instruction corresponding to the first operation event to the screen projection source device, the method further comprises:
displaying a third picture, wherein the third picture is a mirror image, different from the first picture, of the running picture of the first application after the screen projection source device responds to the second instruction corresponding to the first operation event.
11. A cross-device picture display apparatus, applied to a screen projection target device, the apparatus comprising a processing unit and a communication unit, the processing unit being configured to perform:
displaying a first picture, wherein the first picture is a current mirror image of a running picture of a first application on a screen projection source device;
acquiring a first operation event for the first picture;
determining, according to an action performed by the first operation event on the first picture, whether to respond to a first instruction corresponding to the first operation event at the local terminal; and
if the first instruction corresponding to the first operation event is not to be responded to at the local terminal, sending, through the communication unit, a second instruction corresponding to the first operation event to the screen projection source device, wherein the second instruction corresponding to the first operation event is used for instructing the screen projection source device to perform an operation on the running picture of the first application.
12. An electronic device, wherein the electronic device is a screen projection target device comprising a processor, a memory and a communication interface, the memory storing one or more programs to be executed by the processor, the one or more programs comprising instructions for performing the steps of the method according to any one of claims 1-10.
13. A computer-readable storage medium, characterized in that the computer-readable storage medium stores a computer program for electronic data exchange, wherein the computer program is operable to cause a computer to perform the method according to any one of claims 1-10.
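Taken together, claims 1, 8 and 11 describe an event dispatcher on the screen projection target device: a click is always forwarded to the source device as a second instruction, while a move is executed locally (panning the mirror) whenever the local response check succeeds. The sketch below illustrates that dispatch under assumed interfaces; the `OperationEvent` type, the callback names, and the message format are all hypothetical.

```python
from dataclasses import dataclass
from typing import Callable


@dataclass
class OperationEvent:
    kind: str      # e.g. "move" or "click"
    payload: dict  # gesture parameters (coordinates, pan distance, ...)


def dispatch(event: OperationEvent,
             respond_locally: Callable[[OperationEvent], bool],
             pan_mirror: Callable[[OperationEvent], None],
             send_to_source: Callable[[dict], None]) -> str:
    """Route an operation event on the screen projection target device.

    A click is never handled locally; a move is handled locally only
    while the local response check (e.g. the zoomed mirror can still
    pan) succeeds. Otherwise a second instruction is sent to the
    screen projection source device.
    """
    if event.kind != "click" and respond_locally(event):
        pan_mirror(event)  # first instruction: move the mirror locally
        return "local"
    # Second instruction: ask the source device to operate on the
    # running picture of the first application.
    send_to_source({"op": event.kind, **event.payload})
    return "forwarded"
```

In this reading, the "first instruction" never leaves the target device, while the "second instruction" travels over the communication unit of claim 11 and later produces the refreshed third picture of claim 10.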
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011284014.2A CN112394895B (en) | 2020-11-16 | 2020-11-16 | Picture cross-device display method and device and electronic device |
PCT/CN2021/121014 WO2022100305A1 (en) | 2020-11-16 | 2021-09-27 | Cross-device picture display method and apparatus, and electronic device |
Publications (2)
Publication Number | Publication Date |
---|---|
CN112394895A true CN112394895A (en) | 2021-02-23 |
CN112394895B CN112394895B (en) | 2023-10-13 |
Family
ID=74600911
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202011284014.2A Active CN112394895B (en) | 2020-11-16 | 2020-11-16 | Picture cross-device display method and device and electronic device |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN112394895B (en) |
WO (1) | WO2022100305A1 (en) |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112394895B (en) * | 2020-11-16 | 2023-10-13 | Oppo广东移动通信有限公司 | Picture cross-device display method and device and electronic device |
- 2020-11-16: CN application CN202011284014.2A filed; granted as CN112394895B (Active)
- 2021-09-27: PCT application PCT/CN2021/121014 filed; published as WO2022100305A1 (Application Filing)
Patent Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104811639A (en) * | 2015-04-28 | 2015-07-29 | 联想(北京)有限公司 | Method for processing information and electronic equipment |
US20160321968A1 (en) * | 2015-04-28 | 2016-11-03 | Beijing Lenovo Software Ltd. | Information processing method and electronic device |
CN107483994A (en) * | 2017-07-31 | 2017-12-15 | 广州指观网络科技有限公司 | Reverse screen projection control system and method |
CN110377250A (en) * | 2019-06-05 | 2019-10-25 | 华为技术有限公司 | Touch control method in a screen projection scenario and electronic device |
CN110248226A (en) * | 2019-07-16 | 2019-09-17 | 广州视源电子科技股份有限公司 | Information screen projection method, device, system, storage medium and processor |
CN111221491A (en) * | 2020-01-09 | 2020-06-02 | Oppo(重庆)智能科技有限公司 | Interaction control method and device, electronic equipment and storage medium |
CN111562896A (en) * | 2020-04-26 | 2020-08-21 | 维沃移动通信有限公司 | Screen projection method and electronic equipment |
CN111918119A (en) * | 2020-07-24 | 2020-11-10 | 深圳乐播科技有限公司 | IOS system data screen projection method, device, equipment and storage medium |
Cited By (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2022100305A1 (en) * | 2020-11-16 | 2022-05-19 | Oppo广东移动通信有限公司 | Cross-device picture display method and apparatus, and electronic device |
WO2022227978A1 (en) * | 2021-04-30 | 2022-11-03 | 华为技术有限公司 | Display method and related apparatus |
CN113179555A (en) * | 2021-05-19 | 2021-07-27 | 北京小米移动软件有限公司 | Screen projection method, screen projection device, screen projection system, electronic device, and storage medium |
CN115460445B (en) * | 2021-06-09 | 2024-03-22 | 荣耀终端有限公司 | Screen projection method of electronic equipment and electronic equipment |
CN115460445A (en) * | 2021-06-09 | 2022-12-09 | 荣耀终端有限公司 | Screen projection method of electronic equipment and electronic equipment |
WO2023006035A1 (en) * | 2021-07-30 | 2023-02-02 | 华为技术有限公司 | Screen mirroring method and system, and electronic device |
WO2023030306A1 (en) * | 2021-08-31 | 2023-03-09 | 维沃移动通信有限公司 | Method and apparatus for video editing, and electronic device |
WO2023030099A1 (en) * | 2021-09-03 | 2023-03-09 | 华为技术有限公司 | Cross-device interaction method and apparatus, and screen projection system and terminal |
WO2023116311A1 (en) * | 2021-12-20 | 2023-06-29 | Oppo广东移动通信有限公司 | Data interaction method and apparatus, and device and storage medium |
CN114257631A (en) * | 2021-12-20 | 2022-03-29 | Oppo广东移动通信有限公司 | Data interaction method, device, equipment and storage medium |
WO2023165370A1 (en) * | 2022-03-02 | 2023-09-07 | 北京字节跳动网络技术有限公司 | Information exchange method and apparatus, display device, and storage medium |
CN115729502A (en) * | 2022-03-23 | 2023-03-03 | 博泰车联网(南京)有限公司 | Response method of screen projection terminal and display terminal, electronic device and storage medium |
CN115729502B (en) * | 2022-03-23 | 2024-02-27 | 博泰车联网(南京)有限公司 | Screen-throwing end and display end response method, electronic equipment and storage medium |
CN115174988A (en) * | 2022-06-24 | 2022-10-11 | 长沙联远电子科技有限公司 | Audio and video screen projection control method based on DLNA |
CN115174988B (en) * | 2022-06-24 | 2024-04-30 | 长沙联远电子科技有限公司 | Audio and video screen-throwing control method based on DLNA |
CN116719468A (en) * | 2022-09-02 | 2023-09-08 | 荣耀终端有限公司 | Interactive event processing method and device |
Also Published As
Publication number | Publication date |
---|---|
WO2022100305A1 (en) | 2022-05-19 |
CN112394895B (en) | 2023-10-13 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN112394895B (en) | Picture cross-device display method and device and electronic device | |
CN112398855B (en) | Method and device for transferring application contents across devices and electronic device | |
WO2021078284A1 (en) | Content continuation method and electronic device | |
CN112291764B (en) | Content connection system | |
WO2020238871A1 (en) | Screen projection method and system and related apparatus | |
EP4060475A1 (en) | Multi-screen cooperation method and system, and electronic device | |
WO2022121775A1 (en) | Screen projection method, and device | |
JP7369281B2 (en) | Device capacity scheduling method and electronic devices | |
EP2326136A2 (en) | Method and apparatus for remote controlling bluetooth device | |
CN114040242B (en) | Screen projection method, electronic equipment and storage medium | |
CN115793916A (en) | Method, electronic device and system for displaying multiple windows | |
CN112527174B (en) | Information processing method and electronic equipment | |
WO2022105445A1 (en) | Browser-based application screen projection method and related apparatus | |
KR101989016B1 (en) | Method and apparatus for transferring files during video telephony in electronic device | |
WO2022135157A1 (en) | Page display method and apparatus, and electronic device and readable storage medium | |
CN115550597A (en) | Shooting method, system and electronic equipment | |
CN114489529A (en) | Screen projection method of electronic device, medium thereof and electronic device | |
CN115686401A (en) | Screen projection method, electronic equipment and system | |
EP4254927A1 (en) | Photographing method and electronic device | |
CN113946302B (en) | Method and device for opening file | |
CN115086888B (en) | Message notification method and device and electronic equipment | |
CN117873367A (en) | Split screen display method and related device | |
WO2024022307A1 (en) | Screen mirroring method and electronic device | |
WO2024159925A1 (en) | Screen mirroring method, screen mirroring system, and electronic device | |
WO2024037542A1 (en) | Touch input method, system, electronic device, and storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||