CN112394895B - Picture cross-device display method and device and electronic device

Info

Publication number: CN112394895B
Application number: CN202011284014.2A
Authority: CN (China)
Prior art keywords: screen, picture, operation event, target, application
Legal status: Active
Other languages: Chinese (zh)
Other versions: CN112394895A
Inventors: 杨俊拯, 何�轩, 钟卫东
Current assignee: Guangdong Oppo Mobile Telecommunications Corp Ltd
Original assignee: Guangdong Oppo Mobile Telecommunications Corp Ltd
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN202011284014.2A
Publication of CN112394895A
Priority to PCT/CN2021/121014 (WO2022100305A1)
Application granted
Publication of CN112394895B

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14 Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G06F3/1454 Digital output to display device; Cooperation and interconnection of the display device with other functional units involving copying of the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay

Abstract

The embodiment of the application discloses a method and a device for displaying a picture across devices, and an electronic device. The method includes: displaying a first picture, where the first picture is a current mirror image of a running picture of a first application on the projection source device; acquiring a first operation event for the first picture; judging, according to the action performed by the first operation event on the first picture, whether to respond to a first instruction corresponding to the first operation event at the local end; and, if the first instruction corresponding to the first operation event is not responded to at the local end, sending a second instruction corresponding to the first operation event to the projection source device, the second instruction instructing the projection source device to perform the operation on the running picture of the first application. The embodiment of the application thus facilitates cross-device display and interactive operation of an application's picture, improves the processing efficiency of picture interaction, and improves the screen-projection experience.

Description

Picture cross-device display method and device and electronic device
Technical Field
The application relates to the field of computer technology, and in particular to a method and a device for displaying a picture across devices, and an electronic device.
Background
With the development of technology, a user may own multiple devices at the same time, and these devices may take many forms, such as mobile phones, computers, tablets, televisions, and watches; different devices may also run different operating systems and software and hardware architectures. For example, a computer may run a system such as Windows or macOS, while a mobile phone may run a system such as Android or iOS.
With screen projection technology, an application running on one device or system can be used across devices, so that the picture of the application is projected between different devices or systems. At present, some problems remain to be solved in displaying an application's picture across devices.
Disclosure of Invention
The embodiment of the application provides a method and a device for displaying a picture across devices, and an electronic device, which are expected to realize cross-device display and interactive control of an application's picture, improve the processing efficiency of picture interaction, and improve the screen-projection experience.
In a first aspect, an embodiment of the present application provides a method for displaying a picture across devices, applied to a projection target device, the method including:
displaying a first picture, where the first picture is a current mirror image of a running picture of a first application on a projection source device;
acquiring a first operation event for the first picture;
judging, according to the action performed by the first operation event on the first picture, whether to respond to a first instruction corresponding to the first operation event at the local end;
if the first instruction corresponding to the first operation event is not responded to at the local end, sending a second instruction corresponding to the first operation event to the projection source device, the second instruction being used to instruct the projection source device to perform the operation on the running picture of the first application.
In a second aspect, an embodiment of the present application provides an apparatus for displaying a picture across devices, the apparatus including a processing unit and a communication unit, where the processing unit is configured to:
display a first picture, where the first picture is a current mirror image of a running picture of a first application on a projection source device;
acquire a first operation event for the first picture;
judge, according to the action performed by the first operation event on the first picture, whether to respond to a first instruction corresponding to the first operation event at the local end;
if the first instruction corresponding to the first operation event is not responded to at the local end, send a second instruction corresponding to the first operation event to the projection source device through the communication unit, the second instruction being used to instruct the projection source device to perform the operation on the running picture of the first application.
In a third aspect, an embodiment of the present application provides an electronic device, the electronic device being a projection target device and including a processor, a memory, and a communication interface, where the memory stores one or more programs configured to be executed by the processor, the one or more programs including instructions for performing the steps of the first aspect of the embodiments of the present application.
In a fourth aspect, an embodiment of the present application provides a computer-readable storage medium storing a computer program for electronic data exchange, where the computer program causes a computer to perform some or all of the steps described in the first aspect of the embodiments of the present application.
In a fifth aspect, embodiments of the present application provide a computer program product, wherein the computer program product comprises a computer program operable to cause a computer to perform some or all of the steps described in the first aspect of the embodiments of the present application. The computer program product may be a software installation package.
It can be seen that, in the embodiment of the present application, the projection target device displays the first picture and acquires a first operation event for the first picture. It then judges, according to the action performed by the first operation event on the first picture, whether to respond to the first instruction corresponding to the first operation event at the local end. If the first instruction is not responded to at the local end, a second instruction corresponding to the first operation event is sent to the projection source device. Because the first picture is the current mirror image, on the display screen of the projection target device, of the running picture of the first application on the projection source device, and the second instruction instructs the projection source device to perform the operation on the running picture of the first application, cross-device display of the picture is realized. In addition, because the projection target device judges whether to respond to the first instruction at the local end, it can either respond locally, operating on the projected picture shown on its own display screen, or, after an adaptation process, control the running picture of the first application on the projection source device. This realizes cross-device interactive control of the application's picture, improves the processing efficiency of picture interaction, and improves the screen-projection experience.
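To make this decision flow concrete, here is a minimal Kotlin sketch of how a projection target device might route an operation event either to a local response (the first instruction) or to the projection source device (the second instruction). All names (OperationEvent, ScreenChannel, MirrorController) and the concrete routing rules are illustrative assumptions, not APIs or logic prescribed by the patent.

```kotlin
// Illustrative sketch only: the event types and routing rules are assumptions.
sealed class OperationEvent {
    data class Move(val dx: Float, val dy: Float) : OperationEvent()
    data class Scale(val factor: Float) : OperationEvent()
    data class Tap(val x: Float, val y: Float) : OperationEvent()
}

// Channel to the projection source device (the transport is out of scope here).
interface ScreenChannel {
    fun sendToSource(instruction: String) // carries the "second instruction"
}

class MirrorController(private val channel: ScreenChannel) {
    // True when the event can be consumed at the local end, i.e. by
    // transforming the mirror only; false when the source must act.
    private fun respondLocally(event: OperationEvent, zoomedIn: Boolean) =
        when (event) {
            is OperationEvent.Scale -> true      // zooming only changes the mirror
            is OperationEvent.Move -> zoomedIn   // a zoomed-in mirror can be panned
            is OperationEvent.Tap -> false       // clicks act on the running picture
        }

    fun onEvent(event: OperationEvent, zoomedIn: Boolean) {
        if (respondLocally(event, zoomedIn)) {
            // "first instruction": apply the transform to the local mirror
        } else {
            // "second instruction": ask the source device to operate on the
            // running picture of the first application
            channel.sendToSource(event.toString())
        }
    }
}
```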
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art are briefly described below. It is apparent that the figures described below are merely some embodiments of the present application; a person skilled in the art can obtain other figures from them without inventive effort.
FIG. 1 is a schematic diagram of the architecture of a screen-projection communication system according to an embodiment of the present application;
FIG. 2 is a schematic diagram of the hardware structure of an electronic device according to an embodiment of the present application;
FIG. 3 is a schematic diagram of the software structure of an electronic device according to an embodiment of the present application;
FIG. 4 is a schematic structural diagram of a projection source device and a projection target device according to an embodiment of the present application;
FIG. 5 is a schematic flow chart of a method for displaying a picture across devices according to an embodiment of the present application;
FIG. 6 is a schematic structural diagram of a cross-device picture display according to an embodiment of the present application;
FIG. 7 is a schematic structural diagram of another cross-device picture display according to an embodiment of the present application;
FIG. 8 is a schematic structural diagram of another cross-device picture display according to an embodiment of the present application;
FIG. 9 is a schematic structural diagram of another cross-device picture display according to an embodiment of the present application;
FIG. 10 is a schematic structural diagram of another cross-device picture display according to an embodiment of the present application;
FIG. 11 is a schematic structural diagram of another cross-device picture display according to an embodiment of the present application;
FIG. 12 is a schematic structural diagram of another cross-device picture display according to an embodiment of the present application;
FIG. 13 is a schematic structural diagram of another cross-device picture display according to an embodiment of the present application;
FIG. 14 is a schematic structural diagram of another cross-device picture display according to an embodiment of the present application;
FIG. 15 is a schematic structural diagram of another cross-device picture display according to an embodiment of the present application;
FIG. 16 is a schematic structural diagram of another cross-device picture display according to an embodiment of the present application;
FIG. 17 is a schematic structural diagram of another cross-device picture display according to an embodiment of the present application;
FIG. 18 is a schematic structural diagram of another cross-device picture display according to an embodiment of the present application;
FIG. 19 is a schematic structural diagram of another cross-device picture display according to an embodiment of the present application;
FIG. 20 is a schematic structural diagram of another cross-device picture display according to an embodiment of the present application;
FIG. 21 is a schematic flow chart of another method for displaying a picture across devices according to an embodiment of the present application;
FIG. 22 is a schematic flow chart of yet another method for displaying a picture across devices according to an embodiment of the present application;
FIG. 23 is a functional block diagram of an apparatus for displaying a picture across devices according to an embodiment of the present application;
FIG. 24 is a schematic structural diagram of another electronic device according to an embodiment of the present application.
Detailed Description
For better understanding of the technical solutions of the present application by those skilled in the art, the technical solutions of the embodiments of the present application will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present application, and it is apparent that the described embodiments are only some embodiments of the present application, but not all embodiments. All other embodiments, which can be made by those skilled in the art based on the description of the embodiments of the application without making any inventive effort, are intended to fall within the scope of the application.
Before the technical solutions of the embodiments of the present application are described, related concepts, the screen-projection communication system, and the software and hardware structure of the electronic device are introduced.
Screen projection in the embodiments of the present application is a technology that projects the picture or content of an application running on one device onto the display screen or display medium of another device, and is a typical form of information synchronization. In the embodiments of the present application, the device that projects the application's picture is called the projection source device, and the device that receives and displays the application's picture is called the projection target device.
Screen projection in the present application may be wired or wireless. In wired projection, a wired connection may be established between the projection source device and the projection target device through a high-definition multimedia interface (HDMI), a universal serial bus (USB) interface, or the like, to transmit media data; in wireless projection, a wireless connection may be established between the projection source device and the projection target device to transmit media data through the Digital Living Network Alliance (DLNA) protocol, Miracast wireless display sharing, or the AirPlay protocol.
For example, during projection, the projection source device may compress the video stream of the current video player through encoding and send it to the projection target device; the projection target device then decodes the video stream data and displays the projected content on its display screen. For ease of description, the embodiments of the present application refer to the projected content displayed on the display screen of the projection target device as the mirror image of the projected content of the projection source device.
In addition, the embodiments of the present application collectively refer to the projection source device and the projection target device as electronic devices. The electronic device in the embodiments of the present application may be a handheld device, a vehicle-mounted device, a wearable device, an augmented reality (AR) device, a virtual reality (VR) device, a projection device, a projector, or another device connected to a wireless modem, or may be user equipment (UE), a terminal device, a mobile phone (smart phone), a smart screen, a smart television, a smart watch, a notebook computer, a smart speaker, a camera, a gamepad, a microphone, a station (STA), an access point (AP), a mobile station (MS), a personal digital assistant (PDA), a personal computer (PC), or a relay device.
Take two electronic devices, a computer and a mobile phone, as an example. When the computer and the mobile phone are connected through a wireless communication technology (such as Bluetooth, wireless fidelity, ZigBee, or near field communication) or a data line (such as a USB data line), the mobile phone may act as the projection source device and project the picture or content of an application it runs onto the display screen of the computer through the projection technology, with the computer acting as the projection target device; alternatively, the computer may act as the projection source device and project the picture or content of an application it runs onto the display screen of the mobile phone, with the mobile phone acting as the projection target device.
The following technical solutions of the embodiments of the present application may be applied to the screen-projection communication system 10 shown in fig. 1. The screen-projection communication system 10 may include at least two electronic devices 110, for example an electronic device 110A, an electronic device 110B, an electronic device 110C, an electronic device 110D, an electronic device 110E, and an electronic device 110F. The at least two electronic devices 110 may be connected to each other through a wireless network or wired data communication.
It should be noted that the wireless network may include a mobile cellular network (such as a fifth-generation 5G mobile communication network), a wireless local area network (WLAN), a wide area network (WAN), Bluetooth, wireless fidelity (Wi-Fi), ZigBee, near field communication (NFC), ultra wide band (UWB), or the like; the wired data connection may use an HDMI data line, a USB data line, or the like.
Specifically, each of the at least two electronic devices 110 may be a device under the same user account. For example, when a user logs in a mobile phone, a desktop computer, a smart screen, a notebook computer, a relay device, and a smart watch using the same user account, the at least two electronic devices 110 include the mobile phone, the desktop computer, the smart screen, the notebook computer, the relay device, and the smart watch, and the mobile phone, the desktop computer, the smart screen, the notebook computer, the relay device, and the smart watch can communicate with each other through a wireless network.
Specifically, each of the at least two electronic devices 110 may be connected to the same WLAN network through a relay device (e.g., a router). For example, when a user accesses a cell phone, a desktop computer, a smart screen, a notebook computer, and a smart watch to a Wi-Fi network provided by a relay device, the at least two electronic devices 110 include the cell phone, the desktop computer, the smart screen, the notebook computer, the relay device, and the smart watch, and the cell phone, the desktop computer, the smart screen, the notebook computer, the relay device, and the smart watch form one WLAN network, so that communication between the respective devices within the WLAN network can be achieved through the relay device.
Specifically, each of the at least two electronic devices 110 may form a peer-to-peer (P2P) network through wireless communication (such as Bluetooth, ZigBee, NFC, or UWB). For example, a user forms a P2P network by scanning an NFC tag with a mobile phone, a notebook computer, a smart watch, and so on, and all devices in the P2P network may thereafter communicate with each other.
Further, one or more of the at least two electronic devices 110 may be projection source devices, while other electronic devices may be projection target devices. The projection source device may project or stream the current picture of the application it runs to the projection target device for display. In addition, when the projection target device needs to simultaneously display the mirror images projected by multiple projection source devices, it can display them simultaneously in a split-screen manner. For example, electronic device 110A projects to electronic device 110B and electronic device 110C, and electronic device 110D projects to electronic device 110C and electronic device 110F. In this case, electronic device 110C may simultaneously display the projected mirror images from electronic device 110A and electronic device 110D in a split-screen manner.
Further, the screen-projection communication system 10 may also include other numbers of electronic devices, which is not specifically limited herein.
The following describes the structure of the electronic device in the embodiment of the present application in detail with reference to fig. 2, and it is to be understood that the structure illustrated in fig. 2 does not constitute a specific limitation on the electronic device. In other embodiments of the application, the electronic device may also include more or fewer components than illustrated in FIG. 2, or certain components may be combined, certain components may be separated, or a different arrangement of components may be provided. In addition, the components illustrated in fig. 2 may be implemented by hardware, software, or a combination of software and hardware.
Referring to fig. 2, the electronic device may include a processor 210, an antenna 1, an antenna 2, a mobile communication module 220, a wireless communication module 230, an audio module 240, a sensor module 250, a display module 260, a camera module 270, a charge management module 280, an internal memory 2901, an external memory interface 2902, and the like.
In particular, the processor 210 may include one or more processing units. For example, the processor 210 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a field programmable gate array (FPGA), a baseband processor, and/or a neural-network processing unit (NPU), etc. The different processing units may be separate devices or may be integrated into one or more processors.
Further, a memory may be provided in the processor 210 for storing instructions and data. In some embodiments, the memory in the processor 210 is a cache memory. The memory may hold instructions or data that the processor 210 has just used or recycled. If the processor 210 needs to reuse the instruction or data, it can be called directly from the memory, thereby avoiding repeated accesses and reducing the latency of the processor 210 to increase system efficiency.
Further, the processor 210 may include one or more interfaces. The interfaces may include an integrated circuit (inter-integrated circuit, I2C) interface, an integrated circuit built-in audio (inter-integrated circuit sound, I2S) interface, a pulse code modulation (pulse code modulation, PCM) interface, a universal asynchronous receiver transmitter (universal asynchronous receiver/transmitter, UART) interface, a mobile industry processor interface (mobile industry processor interface, MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (subscriber identity module, SIM) interface, and/or a USB interface, among others.
The wireless communication function of the electronic device may be implemented by the antenna 1, the antenna 2, the mobile communication module 220, the wireless communication module 230, the modem processor, the baseband processor, and the like. Wherein the antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device may be used to cover a single or multiple communication bands. In addition, different antennas can also be multiplexed to improve the utilization of the antennas. For example, the antenna 1 is multiplexed into a diversity antenna of a wireless local area network.
In particular, the mobile communication module 220 may provide a solution for wireless communication including 2G/3G/4G/5G, etc. applied to an electronic device. The mobile communication module 220 may include at least one filter, switch, power amplifier and low noise amplifier (low noise amplifier, LNA), etc.
Further, the mobile communication module 220 may receive electromagnetic waves from the antenna 1, perform processes such as filtering and amplifying on the received electromagnetic waves, and transmit the electromagnetic waves to the modem processor for demodulation. In addition, the mobile communication module 220 may amplify the signal modulated by the modem processor, and convert the signal into electromagnetic waves through the antenna 1 to radiate the electromagnetic waves. In the embodiment of the present application, the mobile communication module 220 can implement communication connection between the screen-projection source device and the screen-projection target device in the technical solution of the present application.
Further, at least part of the functional modules of the mobile communication module 220 may be provided in the processor 210; alternatively, at least part of the functional modules of the mobile communication module 220 may be provided in the same device as part of the modules of the processor 210.
In particular, the wireless communication module 230 may provide solutions for wireless communication including Bluetooth (BT), wireless local area network (wireless local area networks, WLAN), wireless fidelity (wireless fidelity, wi-Fi) network, near field wireless communication (near field communication, NFC), infrared (IR), etc. applied on electronic devices.
Further, the wireless communication module 230 may be one or more devices integrating at least one communication processing module. The wireless communication module 230 receives electromagnetic waves via the antenna 2, modulates and filters the electromagnetic wave signals, and transmits the processed signals to the processor 210. The wireless communication module 230 may also receive a signal to be transmitted from the processor 210, frequency modulate and amplify the signal, and convert it into electromagnetic waves radiated through the antenna 2. In the embodiment of the present application, the wireless communication module 230 can implement the communication connection between the projection source device and the projection target device in the technical solution of the present application.
The electronic device may implement audio functions through the audio module 240, the speaker 2401, the receiver 2402, the microphone 2403, the earphone interface 2404, the processor 210, and the like. Such as music playing, recording, etc.
In particular, the audio module 240 may be used to convert digital audio information to an analog audio signal output, and may also be used to convert an analog audio input to a digital audio signal. In addition, the audio module 240 may also be used to encode and decode audio signals. In some embodiments, the audio module 240 may be disposed in the processor 210, or some functional modules of the audio module 240 may be disposed in the processor 210.
In particular, the speaker 2401 may be used to convert audio electrical signals into sound signals. The electronic device can play music, or support hands-free calls, through the speaker 2401.
In particular, the receiver 2402 may be used to convert an audio electrical signal into a sound signal. When the electronic device receives a call or a voice message, the receiver 2402 may be brought close to the ear to receive the voice.
In particular, the microphone 2403 may be used to convert sound signals into electrical signals. When making a call or sending voice information, the user can speak close to the microphone 2403 to input a sound signal into it. In addition, the electronic device may be provided with at least one microphone 2403. In one possible example, the electronic device may be provided with two microphones 2403, which can implement a noise reduction function in addition to collecting sound signals; in one possible example, the electronic device may further be provided with three, four, or more microphones 2403 to collect sound signals, reduce noise, identify sound sources, implement directional recording functions, and so on, without limitation.
In particular, the headphone interface 2404 may be used to connect wired headphones. The headphone interface 2404 may be a USB interface, a 3.5 mm open mobile terminal platform (OMTP) standard interface, a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface, or the like.
Specifically, the sensor module 250 may include a pressure sensor, a gyroscope sensor, a barometric sensor, a magnetic sensor, an acceleration sensor, a distance sensor, a proximity sensor, a fingerprint sensor, a temperature sensor, a touch sensor, an ambient light sensor, a bone conduction sensor, an ultra-wideband UWB sensor, a near field communication NFC sensor, a laser sensor, a visible light sensor, and the like.
It should be noted that, the electronic device may implement the display function through the GPU, the display module 260, the processor 210, and the like. Among other things, GPUs may be used to perform mathematical and geometric calculations and perform graphics rendering. In addition, the GPU may be a microprocessor for image processing and connect the display module 260 and the processor 210. Processor 210 may include one or more GPUs that execute program instructions to generate or change display information.
In particular, the display module 260 may be a display screen for displaying images, videos, and the like. The display module 260 may include a display panel. The display panel may employ a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a quantum dot light-emitting diode (QLED), or the like. In one possible example, the electronic device may include one or more display modules 260.
It should be noted that the electronic device may implement the photographing function through the ISP, the camera module 270, the video codec, the GPU, the display module 260, the processor 210, and the like. Wherein the ISP may be used to process the data fed back by the camera module 270. For example, when photographing, the shutter is opened first, then light is transmitted to the camera photosensitive element through the lens, so that the optical signal is converted into an electric signal, and finally the electric signal is transmitted to the ISP through the camera photosensitive element for processing so as to be converted into an image visible to naked eyes. In addition, ISP can also carry out algorithm optimization on noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In one possible example, an ISP may be provided in the camera module 270.
In particular, the camera module 270 may be a camera, which is used to capture still images or video, etc. Wherein the object is projected through a lens to generate an optical image to a photosensitive element, which may be a charge coupled device (charge coupled device, CCD) or a complementary metal oxide semiconductor (complementary metal-oxide-semiconductor, CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, which is then transmitted to the ISP for conversion into a digital image signal. The ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard RGB, YUV, or the like format. In one possible example, the electronic device may include one or more camera modules 270.
Specifically, the charge management module 280 is configured to receive a charge input from a charger. The charger can be a wireless charger or a wired charger. In some wired charging embodiments, the charge management module 280 may receive the charging input of a wired charger through a USB interface. In some wireless charging embodiments, the charge management module 280 may receive wireless charging input through a wireless charging coil of the electronic device. The charge management module 280 may also supply power to the electronic device through the power management module 2802 while charging the battery 2801.
Note that the power management module 2802 is used to connect the battery 2801, the charge management module 280, and the processor 210. The power management module 2802 receives inputs from the battery 2801 and/or the charge management module 280, and supplies power to the various modules in the electronic device, the processor 210, and so on.
In particular, the power management module 2802 may also be used to monitor parameters such as battery capacity, battery cycle times, battery health (leakage, impedance), etc. In one possible example, the power management module 2802 may also be disposed in the processor 210; in one possible example, the power management module 2802 and the charge management module 280 may also be provided in the same device.
It should be noted that the internal memory 2901 may be used to store computer executable program code including instructions. Among other things, the processor 210 executes various functional applications of the electronic device and data processing by executing instructions stored in the internal memory 2901. In one possible example, the internal memory 2901 stores program code that performs the technical aspects of embodiments of the present application.
In particular, the internal memory 2901 may include a stored program area and a stored data area. The storage program area may store, among other things, an operating system, application programs required for at least one function (e.g., a sound playing function, an image playing function, etc.), and the like. The storage data area may store data created during use of the electronic device (e.g., audio data, phonebook, etc.), and so on. In addition, the internal memory 2901 may include a high-speed random access memory, and may also include a nonvolatile memory. Such as at least one disk storage device, flash memory device, universal flash memory (universal flash storage, UFS), etc.
In particular, the external memory interface 2902 may be used to connect external memory cards, such as micro SD cards, to enable expanding the memory capabilities of the electronic device. The external memory card communicates with the processor 210 through an external memory interface 2902 to implement data storage functions. For example, files such as music, video, and the like are stored in an external memory card.
In the embodiment of the application, a software system of the electronic device can adopt a layered architecture, an event driven architecture, a microkernel architecture, a microservice architecture or a cloud architecture. In the following, the embodiment of the application takes an Android system with a layered architecture as an example, and illustrates a software structure of an electronic device.
The architecture of the software and hardware system provided with the Android system is shown in fig. 3. The internal memory 2901 may store a kernel layer 320, a system runtime layer 340, an application framework layer 360, and an application layer 380. The layers communicate through software interfaces, and the kernel layer 320, the system runtime layer 340, and the application framework layer 360 belong to the operating system space.
Specifically, the application layer 380 belongs to the user space, and at least one application program (or simply an "application") runs in the application layer 380. These applications may be native applications of the operating system or third-party applications developed by third-party developers. For example, the application layer 380 may include applications such as camera, gallery, calendar, call, map, navigation, WLAN, Bluetooth, music, video, and short messages.
In the embodiment of the present application, the application layer may be provided with a screen-projection application. The user may open the projection application from the desktop, a settings function, a pull-down menu, or the like. The projection application can serve as a bridge between the projection source device and the projection target device during content projection, sending the projected content (such as an application picture) of the application to be projected on the projection source device to the projection target device. For example, the projection application may receive a projection event reported by the application framework layer 360, interact with the running application (such as a video player), and send the content being displayed or played in that application as the projected content to the projection target device through a wireless communication mode such as Wi-Fi.
In addition, the user can also use the projection application to set a binding relationship between an NFC tag and one or more electronic devices. For example, an option for binding NFC tags is provided in the projection application. When the electronic device detects that the user opens this option, the projection application may display a list of electronic devices to be bound. After the user selects one or more electronic devices to bind in the list, the electronic device can be brought close to the NFC tag to be bound. In this way, the electronic device can write the identifiers of the electronic devices selected by the user in the projection application into the NFC tag through an NFC signal, thereby establishing in the NFC tag the binding relationship between the NFC tag and the one or more electronic devices.
It should be noted that the application framework layer 360 provides various application programming interfaces (application programming interface, APIs) and programming frameworks that may be used by applications that build the application layer, so that developers can also build their own applications by using these APIs. For example, a window manager (window manager), a content provider (content providers), a view system (view system), a phone manager (telephony manager), a resource manager, a notification manager (notification manager), a message manager, an activity manager (activity manager), a package manager (package manager), a location manager (location manager), an NFC service, and the like.
In particular, a window manager may be used to manage window programs. The window manager may obtain the size of the display screen, determine if there is a status bar, lock the screen, intercept the screen, etc.
In particular, the content provider may be used to store and retrieve data and make the data accessible to applications. The data may include, among other things, video, images, audio, calls made and received, browsing history and bookmarks, phone books, etc. In addition, the content provider may enable an application to access data of another application, such as a contact database, or share their own data.
In particular, the view system includes visual controls. For example, a control for displaying characters and a control for displaying pictures. The view system may be used to build applications. The display interface may be composed of one or more views. For example, a display interface including a text message notification icon may include a view displaying text and a view displaying a picture.
In particular, the telephony manager is used to provide communication functions for the electronic device. For example, management of call status (e.g., on, off, etc.).
In particular, the resource manager may provide various resources for the application. Such as localization strings, icons, pictures, layout files, video files, etc.
Specifically, the notification manager enables an application to display notification information in the status bar. It can be used to convey notification-type messages, which can automatically disappear after a short stay without user interaction. For example, the notification manager is used to report that a download is complete, give a message alert, and so on. The notification manager may also display notifications in the top system status bar in the form of a chart or scrolling text. In addition, a notification of an application running in the background may also appear on the screen in the form of a dialog window. For example, a text message is presented in the status bar, an alert tone sounds, the electronic device vibrates, or the indicator light blinks.
Specifically, the message manager may be configured to store data of the messages reported by each application program, and process the data reported by each application program.
In particular, the activity manager may be used to manage application lifecycle and provide common navigation rollback functionality. In one possible example, the message manager may be part of a notification manager.
In an embodiment of the present application, an NFC service (NFC service) may be run in the application framework layer 360.
For example, the mobile phone may start running the NFC service in the application framework layer after enabling the NFC function. When the mobile phone approaches or touches an NFC tag, the NFC service can call the NFC driver of the kernel layer to read the binding relationship stored in the NFC tag, thereby obtaining the projection target device for the content. Furthermore, the NFC service can report a projection event to the projection application, triggering the projection application to send the content being displayed or played by the mobile phone as the projected content to the projection target device, which starts the content projection process.
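A hypothetical Kotlin sketch of this NFC-triggered flow follows, under the assumption that a tag stores a list of bound device identifiers; NfcTag, ProjectionApp, and NfcService are invented names for illustration only, not Android or patent APIs.

```kotlin
// Hypothetical flow only: the tag payload format and all class names
// (NfcTag, ProjectionApp, NfcService) are invented for illustration.
data class NfcTag(val boundDeviceIds: List<String>)

class ProjectionApp {
    fun startProjection(targetId: String) {
        // send the content currently displayed or played to the target device
        println("projecting current content to $targetId")
    }
}

class NfcService(private val app: ProjectionApp) {
    // Called (e.g. via the kernel-layer NFC driver) when a tag is read.
    fun onTagRead(tag: NfcTag) {
        // the binding relation stored in the tag names the target device(s)
        tag.boundDeviceIds.forEach { app.startProjection(it) }
    }
}
```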
It should be noted that the system runtime layer 340 provides the main feature support for the Android system through a number of C/C++ libraries. For example, the SQLite library provides database support, the OpenGL/ES library provides 3D graphics support, the Webkit library provides browser kernel support, and so on. The system runtime layer 340 also provides the Android Runtime, which mainly supplies some core libraries that allow developers to write Android applications in the Java language.
In particular, kernel layer 320 may provide underlying drivers for various hardware of the electronic device, such as display drivers, audio drivers, camera drivers, bluetooth drivers, wi-Fi drivers, power management, NFC drivers, UWB drivers, and the like.
The following describes an application scenario of the embodiment of the present application in detail. As shown in fig. 4, the acquisition module 4101 in the projection source device 410 acquires the current picture of the application it runs and transmits it to the encoding module 4102; next, the encoding module 4102 encodes and compresses the picture into a data stream, and the sending module 4103 transmits the data stream to the receiving module 4201 in the projection target device 420; the decoding module 4202 then decodes the data stream using a streaming protocol and transmits the decoded data to the display module 4203; finally, the full or partial mirror image of the picture is displayed by the display module 4203, thereby enabling cross-device display of the picture.
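This capture, encode, transmit, decode, display pipeline can be sketched in a few lines of Kotlin. Frame, Packet, and every function body below are stubs standing in for the real capture, codec, and transport components of fig. 4, not actual codec APIs.

```kotlin
// Stub pipeline: Frame, Packet and all bodies are placeholders standing
// in for the real capture, codec and transport components of Fig. 4.
class Frame(val pixels: ByteArray)
class Packet(val bytes: ByteArray)

fun capture(): Frame = Frame(ByteArray(0))        // acquisition module 4101
fun encode(frame: Frame) = Packet(frame.pixels)   // encoding module 4102 (stub)
fun transmit(packet: Packet) = packet             // sending/receiving 4103/4201
fun decode(packet: Packet) = Frame(packet.bytes)  // decoding module 4202 (stub)
fun display(frame: Frame) { /* display module 4203 renders the mirror */ }

// One frame of the projected picture flowing through the whole pipeline.
fun projectOneFrame() = display(decode(transmit(encode(capture()))))
```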
In addition, in the embodiment of the present application, the input module 4204 performs an input operation (such as zooming in, zooming out, scrolling, flipping, or moving) on the picture displayed by the display module 4203 to generate an input operation instruction, and transmits it to the input processing module 4205; the input processing module 4205 then determines whether the input operation instruction is responded to at the projection target device 420 or at the projection source device 410. If the input operation instruction is to be responded to at the projection source device 410, the input processing module 4205 performs adaptation processing on it and transmits the adapted operation instruction through the sending module 4206 to the input injection module 4104, which then performs the operation on the current picture of the application running on the projection source device 410.
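One plausible reading of the adaptation processing is a coordinate mapping from the mirror shown on the target device back into the coordinate space of the source screen, before injection. The following Kotlin sketch works under that assumption; the linear scale-and-offset model is illustrative, not the patent's prescribed mapping.

```kotlin
// Assumed adaptation model: the mirror is the source picture scaled by
// mirrorScale and offset on the target display; real mappings may differ.
data class Point(val x: Float, val y: Float)

class InputAdapter(
    private val mirrorScale: Float,   // mirror size / source picture size
    private val mirrorOffsetX: Float, // mirror origin on the target display
    private val mirrorOffsetY: Float
) {
    // Map a point touched on the mirror back to source-screen coordinates
    // before the instruction is handed to the input injection module.
    fun toSourceCoordinates(onMirror: Point) = Point(
        (onMirror.x - mirrorOffsetX) / mirrorScale,
        (onMirror.y - mirrorOffsetY) / mirrorScale
    )
}
```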
It can be seen that the projection target device 420 can operate on the projected picture, for example by zooming or moving it, and can also operate on the current picture of the application on the projection source device 410, thereby realizing interactive operation on the application's picture and improving the screen-projection experience. The following embodiments of the present application are described in detail with reference to the exemplary method.
With reference to the above description, the steps of the method for displaying a picture across devices are described below from the perspective of a method example, referring to fig. 5. Fig. 5 is a schematic flow chart of a method for displaying a picture across devices, which includes:
S510, displaying a first picture.
The first picture may be the current mirror image of the running picture of the first application on the projection source device.
It should be noted that the first application of the embodiment of the present application may be an application program or media data running on the application layer of the projection source device, such as a photo, a video, audio, a game, a gallery, a document, or other multimedia. The projection source device may run the first application in the foreground or in the background. When the projection source device runs the first application in the foreground, its display screen displays the running picture of the first application; when the projection source device runs the first application in the background, the first application is executed in the background and its running picture need not be displayed on the display screen.
It should be further noted that, in the embodiment of the present application, the picture projected by the projection source device and displayed on the display screen of the projection target device may be referred to as a mirror image (i.e., the first picture or the target picture), and the mirror image may represent all or part of the running picture of the first application. This mirror image is the picture presented after related operations during projection such as resolution adjustment, display size adjustment, and picture adjustment.
In addition, since the input module 4204 of the embodiment of the present application performs input operations on the picture displayed by the display module 4203, the current mirror image (i.e., the first picture) displayed on the display screen of the projection target device may be the target mirror image (i.e., the target picture) of the running picture of the first application projected by the projection source device, or may be a picture obtained after operations such as zooming in, zooming out, or moving are performed on the target picture. For ease of distinction, the embodiment of the present application refers to the target mirror image projected by the projection source device for the running picture of the first application as the target picture, and refers to the picture currently displayed on the display screen of the projection target device as the first picture. In the following, the projection source device is a notebook computer and the projection target device is a mobile phone.
This is illustrated in fig. 6, 7, 8 and 9. In fig. 6, a user opens the photo "sketched cat.jpg" through the "gallery" application on the notebook computer and uses the projection technology to project the current picture of the "gallery" application onto the mobile phone. In this case, the target picture displayed on the display screen of the mobile phone is the same as the current picture of the "gallery" application on the notebook computer. In fig. 7, however, by setting the relevant projection option the user projects only the "sketched cat.jpg" photo opened by the "gallery" application on the notebook computer. In this case, the target picture displayed on the display screen of the mobile phone is only a partial picture of the current picture of the "gallery" application. In fig. 8, the user zooms in on the target picture on the display screen of the mobile phone shown in fig. 7 by sliding on the touch screen. When the mobile phone responds to the zoom-in operation, its display screen displays the current picture shown in fig. 9. Similarly, in fig. 10, the user may zoom out the target picture on the display screen of the mobile phone shown in fig. 7 by sliding on the touch screen. When the mobile phone responds to the zoom-out operation, its display screen displays the current picture shown in fig. 11.
Specifically, the input module 4204 of the embodiment of the present application depends on the electronic device. For example, when the electronic device is a device with a touch screen, such as a mobile phone or a smart watch, the input module of the electronic device may be the touch screen. The user can then perform input operations (e.g., zoom in, zoom out, scroll, flip, move) on the displayed picture by sliding on the touch screen. When the electronic device has no touch screen, such as a desktop computer, a notebook computer, or a smart television, the input module may be an external device (such as a mouse, a keyboard, or a remote control). The user then performs input operations on the displayed picture by operating the external device. For example, when the external device is a mouse, the user can move the picture by holding down the left mouse button, or enlarge or reduce the picture by scrolling the mouse wheel up and down; this is not specifically limited.
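A small Kotlin sketch of how such heterogeneous input modules (touch screen, mouse) might be normalized into common operation events; the event types and the 1.1x / 0.9x zoom steps are arbitrary illustrative choices, not values from the patent.

```kotlin
// Illustrative normalization: event types and the 1.1x / 0.9x zoom steps
// are arbitrary choices, not values taken from the patent.
data class ScaleEvent(val factor: Float)
data class MoveEvent(val dx: Float, val dy: Float)

// Touch screen: a pinch gesture widening (or narrowing) the finger span.
fun fromTouchPinch(spanDelta: Float) =
    ScaleEvent(if (spanDelta > 0) 1.1f else 0.9f)

// Mouse: wheel clicks zoom, dragging with the left button held moves.
fun fromMouseWheel(clicks: Int) =
    ScaleEvent(if (clicks > 0) 1.1f else 0.9f)

fun fromMouseDrag(dx: Float, dy: Float) = MoveEvent(dx, dy)
```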
S520, acquiring a first operation event for the first picture.
It should be noted that, in the embodiment of the present application, the input module 4204 may perform an input operation on the first picture displayed by the projection target device to generate the first operation event. Therefore, the first operation event may be an operation of moving the first picture along a first direction on the display screen of the projection target device, an operation of clicking on the first picture on the display screen of the projection target device, or an operation of zooming in or zooming out the first picture on the display screen of the projection target device, which is not specifically limited.
S530, judging, according to the action performed by the first operation event on the first picture, whether to respond to a first instruction corresponding to the first operation event at the local end.
The embodiment of the present application now describes in detail how the projection target device determines, according to the action performed by the first operation event on the first picture, whether to respond to the first instruction corresponding to the first operation event at the local end.
Mode one:
in one possible example, the determining whether to respond to the first instruction corresponding to the first operation event at the home end according to the execution action of the first operation event on the first screen may include the following steps: if the first operation event is an operation of moving the first picture along the first direction, judging whether to respond to a first instruction corresponding to the first operation event at the local end according to the first picture and the target picture.
The target image may be a target image of a screen to be projected by the projection source device for a running image of the first application.
It should be noted that, in the embodiment of the present application, the input module 4204 may perform an input operation on a screen displayed on the screen-projection target device, so that a current mirror image (i.e., the first screen) displayed on the display screen of the screen-projection target device may be a target screen, a screen after an operation such as zooming in, zooming out or moving, or a screen after an operation such as zooming in, zooming out or moving is performed on the target screen. When a user needs to move a first screen along a first direction on a display screen of a screen-throwing target device, the embodiment of the application considers whether to execute the movement of the first screen along the first direction on the display screen of the screen-throwing target device or not through the first screen and the target screen. Because the screen throwing target equipment can directly call the related modules (such as a GPU (graphics processing unit), an image processing module and the like) in the first picture and the target picture for comparison operation, the processing time required for the judging process of whether to respond to the first instruction corresponding to the first operation event at the local end is short, thereby being beneficial to improving the processing efficiency of the interactive operation of the picture of the user and improving the use experience of the screen throwing.
Specifically, the target screen presents the entire screen of the running screen of the first application; alternatively, the target screen presents only a partial screen of the running screen of the first application.
It may be understood that the target screen displayed on the display screen of the screen-throwing target device may be the entire running screen of the first application on the screen-throwing source device (as shown in fig. 6), or may be a partial screen of that running screen (as shown in fig. 7). Therefore, the embodiment of the application is beneficial to realizing diverse interactive operations on the application's screen and improving the screen-casting use experience.
Specifically, the first direction may represent any direction in the plane of the current display screen of the screen-casting target device. For example, the first direction may be a straight-line direction in that plane, such as horizontally to the left, horizontally to the right, vertically up, or vertically down, or it may be an arc-shaped direction in that plane, such as an arc to the left, to the right, upward, or downward.
It should be noted that, taking the screen-throwing target device being a mobile phone as an example, because users hold the mobile phone in different postures, the plane of the mobile phone's display screen may be perpendicular (or parallel, oblique, etc.) to the ground. It can be seen that, in the embodiment of the present application, the first direction needs to be analyzed specifically according to the plane in which the current display screen of the screen-throwing target device lies.
In one possible example, after the judging whether to respond to the first instruction corresponding to the first operation event at the home terminal according to the first screen and the target screen, the method further includes the following step: if the first instruction corresponding to the first operation event is responded to at the local end, moving the first screen along the first direction to display a second screen.
The second screen is a mirror image, different from the first screen, of the running screen of the first application.
Illustratively, in fig. 12, the user performs a horizontal leftward movement operation on the current screen (i.e., the first screen) of the mobile phone's display shown in fig. 9 by sliding on the touch screen along the plane of the display. After the mobile phone responds to the movement operation, the current screen displayed on the display screen of the mobile phone is the second screen, as shown in fig. 13.
In one possible example, the determining whether to respond to the first instruction corresponding to the first operation event at the home terminal according to the first screen and the target screen may include the following steps: if the first screen and the target screen are different screens, judging whether to respond to the first instruction corresponding to the first operation event at the local end according to the relationship between the first screen and the target screen; or, if the first screen and the target screen are the same screen, not responding to the first instruction corresponding to the first operation event at the local end.
It should be noted that the first screen being different from the target screen may be understood as the user having performed input operations on the target screen displayed on the display screen of the screen-throwing target device, so that the currently displayed screen differs from the target screen (as shown in fig. 9). In this case, the embodiment of the present application determines whether to move the first screen along the first direction on the display screen of the screen-throwing target device according to the relationship between the first screen and the target screen (i.e., an enlarged, reduced, and/or moved screen). Similarly, the first screen being the same as the target screen may be understood as the user not having performed any input operation on the target screen, so that the currently displayed screen is identical to the target screen. Therefore, the embodiment of the application is beneficial to realizing diverse interactive operations on the application's screen and improving the screen-casting use experience.
In addition, to illustrate the difference between responding to the first instruction corresponding to the first operation event at the local end and not responding to it at the local end, the embodiment of the application takes the screen-throwing source device being a notebook computer and the screen-throwing target device being a mobile phone as an example.
Responding to the first instruction corresponding to the first operation event at the home terminal may be understood as the mobile phone itself responding to that instruction. For example, in fig. 13, the mobile phone responds to the movement operation and moves to a partial screen (i.e., the second screen) on its display screen. Conversely, not responding at the home terminal may be understood as the mobile phone not responding to the first instruction corresponding to the first operation event. In that case, the mobile phone needs to perform adaptation processing on the first operation event and transmit the adapted operation instruction (i.e., the second instruction corresponding to the first operation event) to the notebook computer for execution. The first operation event needs adaptation for several reasons. First, the input module of the mobile phone may be a touch screen while the input module of the notebook computer may be an external mouse, so an input operation generated on the touch screen needs to be adapted to the characteristics of mouse input. Second, since the display screens of the mobile phone and the notebook computer often differ in size, a slide of a certain length, in a certain direction, or a click at a certain position on the touch screen of the mobile phone cannot be executed on the notebook computer with the same length, direction, or position, so the input operation generated on the mobile phone requires corresponding adaptation. Third, the mobile phone and the notebook computer may run different operating systems and software and hardware architectures, which also requires corresponding adaptation processing.
Illustratively, in fig. 14, the user performs a horizontal rightward movement operation on the current screen (i.e., the first screen) shown in fig. 7 by sliding on the touch screen of the mobile phone along the plane of its display screen. Since the current screen and the target screen are the same screen, the mobile phone does not respond to the instruction corresponding to the movement operation. Instead, the mobile phone performs adaptation processing on the movement operation and transmits the instruction corresponding to the adapted movement operation (i.e., the second instruction) to the notebook computer for execution. After responding to that instruction, the notebook computer switches the "sketched cat.jpg" opened in the "gallery" application to "true cat.jpg" and synchronizes the mirror image of "true cat.jpg" to the mobile phone using the screen-casting technology, as shown in fig. 15. In this way, the mobile phone directly performs a switching operation on the current screen of the "gallery" application on the notebook computer, realizing interactive operation of the application's screen and improving the screen-casting use experience.
Further, the determining whether to respond to the first instruction corresponding to the first operation event at the home terminal according to the relationship between the first screen and the target screen may include the following steps: if the first screen is a partial screen of the enlarged target screen, judging whether to respond to the first instruction corresponding to the first operation event at the local end according to whether the first screen, moving along the first direction, exceeds the edge of the target screen; or, if the first screen is the full screen of the reduced target screen, not responding to the first instruction corresponding to the first operation event at the local end.
It should be noted that, in the embodiment of the present application, the relationship between the first screen and the target screen is determined according to whether the first screen is a partial screen of the enlarged target screen or the full screen of the reduced target screen, so as to respectively decide whether to move the first screen along the first direction on the display screen of the screen-throwing target device, thereby implementing interactive operation of the application's screen.
In fig. 9, the current screen displayed by the mobile phone is a partial screen of the enlarged target screen, and whether the current screen, moving along the first direction, exceeds the edge of the target screen needs to be checked to determine whether the mobile phone responds at the local end; in fig. 10, the current screen displayed by the mobile phone is the full screen of the reduced target screen, so the mobile phone will not respond to the instruction corresponding to the first operation event, and the notebook computer instead needs to respond to the adapted instruction after the mobile phone adapts the first operation event.
Further, the determining whether to respond to the first instruction corresponding to the first operation event at the home terminal according to whether the first screen moves beyond the edge of the target screen along the first direction may include the following steps: if the first screen, moving along the first direction, exceeds the edge of the target screen, not responding to the first instruction corresponding to the first operation event at the local end; or, if the first screen, moving along the first direction, does not exceed the edge of the target screen, responding to the first instruction corresponding to the first operation event at the local end.
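Putting the "mode one" rules together, a sketch of the local-response judgment follows. It assumes the first screen is modelled as a viewport rectangle over the target screen and that the move shifts this viewport by (dx, dy) in target-screen coordinates; the sign convention and all names are assumptions, not the patent's actual implementation.

```kotlin
// Axis-aligned rectangle in target-screen coordinates (an assumed representation).
data class Box(val left: Float, val top: Float, val right: Float, val bottom: Float)

// Decides whether the first instruction corresponding to a move event should be
// responded to at the local end, per the rules described above.
fun shouldRespondLocally(
    sameAsTarget: Boolean,      // first screen == target screen (no zoom or pan applied)
    reducedFullScreen: Boolean, // first screen is the full screen of the reduced target screen
    viewport: Box,              // visible region of the enlarged target screen
    target: Box,                // edges of the target screen
    dx: Float, dy: Float        // requested move along the first direction
): Boolean {
    if (sameAsTarget) return false      // same picture: adapt and forward to the source device
    if (reducedFullScreen) return false // reduced full picture: adapt and forward to the source device
    // Enlarged partial picture: respond locally only if the moved viewport
    // does not exceed the edges of the target screen.
    val withinX = viewport.left + dx >= target.left && viewport.right + dx <= target.right
    val withinY = viewport.top + dy >= target.top && viewport.bottom + dy <= target.bottom
    return withinX && withinY
}
```

With this convention, a pan that would carry the viewport past the right edge of the target screen returns false, matching the example of fig. 16.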
For example, in fig. 16 (a), if the user performs a horizontal rightward movement operation of the current screen in the display screen of the mobile phone shown in fig. 9 along a plane parallel to the display screen of the mobile phone by sliding the touch screen of the mobile phone, the current screen moves beyond the edge of the target screen along the horizontal rightward direction, as in fig. 16 (b). At this time, the mobile phone will not respond to the instruction corresponding to the move operation.
Illustratively, in fig. 12, the user may perform a horizontal left movement operation by sliding the touch screen of the mobile phone shown in fig. 9 to move the current screen of the display screen of the mobile phone along a plane parallel to the plane of the display screen of the mobile phone. Because the current frame moves leftwards along the horizontal direction and does not exceed the edge of the target frame, the mobile phone can respond to the instruction corresponding to the moving operation, and the current frame displayed on the display screen of the mobile phone is shown in fig. 13.
Further, after the first instruction corresponding to the first operation event is not responded to at the home terminal, the method further includes the following steps: displaying a target window on the display screen of the screen-throwing target device, where the target window is used to prompt that the first screen cannot be moved along the first direction; or, invoking a sensing module of the screen-throwing target device to make the screen-throwing target device vibrate; or, invoking a speaker of the screen-throwing target device to make the screen-throwing target device sound.
It should be noted that, when the screen-throwing target device cannot respond to the first instruction corresponding to the first operation event to move the first screen along the first direction, the embodiment of the application has the screen-throwing target device remind the user by means of a popup window, vibration, or sound, thereby realizing interactive operation of the application's screen and improving the screen-casting use experience.
In fig. 17 (a), the user performs a horizontal rightward movement operation on the current screen shown in fig. 9 by sliding on the touch screen of the mobile phone along the plane of its display screen, and the current screen, moving horizontally rightward, exceeds the edge of the target screen. A window therefore pops up on the display screen of the mobile phone to prompt that the current screen cannot be moved horizontally and to ask whether to view the previous photo, as shown in fig. 17 (b). When the user selects the "yes" option, the mobile phone adapts the related operation (the movement operation, the "yes" option operation, etc.) and transmits the adapted operation instruction to the notebook computer, which then views the previous photo in the "gallery" application. Finally, the notebook computer casts the current screen of the "gallery" application to the mobile phone, so that the previous photo can be seen on the mobile phone, improving the screen-casting use experience.
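The three reminder options above (popup window, vibration, sound) might look as follows on Android. This is a hedged sketch; the dialog text, vibration duration, and tone are illustrative choices, not values from the patent.

```kotlin
import android.app.AlertDialog
import android.content.Context
import android.media.AudioManager
import android.media.ToneGenerator
import android.os.Build
import android.os.VibrationEffect
import android.os.Vibrator

// One assumed realization of the reminder when the first screen cannot be
// moved further along the first direction.
fun remindMoveNotPossible(context: Context, mode: Int) {
    when (mode) {
        0 -> AlertDialog.Builder(context) // target window (popup)
            .setMessage("The picture cannot be moved further. View the previous photo?")
            .setPositiveButton("Yes") { _, _ ->
                // Here the adapted operation instruction would be sent to the source device.
            }
            .setNegativeButton("No", null)
            .show()
        1 -> { // vibration via the sensing module
            val vibrator = context.getSystemService(Context.VIBRATOR_SERVICE) as Vibrator
            if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.O) {
                vibrator.vibrate(VibrationEffect.createOneShot(50L, VibrationEffect.DEFAULT_AMPLITUDE))
            } else {
                @Suppress("DEPRECATION")
                vibrator.vibrate(50L)
            }
        }
        else -> ToneGenerator(AudioManager.STREAM_NOTIFICATION, 80) // sound via the speaker
            .startTone(ToneGenerator.TONE_PROP_BEEP)
    }
}
```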
Mode two:
in one possible example, the determining whether to respond to the first instruction corresponding to the first operation event at the home end according to the execution action of the first operation event on the first screen may include the following step: if the first operation event is an operation of clicking the first screen, not responding to the first instruction corresponding to the first operation event at the local end.
It should be noted that, when the user needs to click the first screen (or a certain position on the first screen) on the display screen of the screen-throwing target device, the embodiment of the application does not respond to the first instruction corresponding to the first operation event at the local end, which is beneficial to improving the processing efficiency of the user's interactive operation on the screen and improving the screen-casting use experience.
Illustratively, in fig. 18, the user generates an input operation by clicking the position of the "×" on the display screen of the mobile phone shown in fig. 6. At this time, the mobile phone does not respond to the instruction corresponding to the input operation, but performs adaptation processing on the input operation and transmits the adapted operation instruction (i.e., the second instruction) to the notebook computer for execution. After responding to the adapted operation instruction, the notebook computer closes the "gallery" application and ends the screen casting from the notebook computer to the mobile phone, as shown in fig. 19.
Further, after the first instruction corresponding to the first operation event is not responded to at the home terminal, the method further includes the following steps: displaying a target window on the display screen of the screen-throwing target device, where the target window is used to prompt that the first screen cannot be moved along the first direction; or, invoking a sensing module of the screen-throwing target device to make the screen-throwing target device vibrate; or, invoking a speaker of the screen-throwing target device to make the screen-throwing target device sound. Specifically, as described above, details are not repeated here.
S540, if the first instruction corresponding to the first operation event is not responded to at the local end, a second instruction corresponding to the first operation event is sent to the screen-projection source device.
The second instruction corresponding to the first operation event may be used to instruct the screen-throwing source device to perform an operation on the running screen of the first application.
For the above "mode one", the following embodiment of the present application describes an example of how the screen-throwing target device determines the second instruction corresponding to the first operation event.
In one possible example, before sending the second instruction corresponding to the first operation event to the screen-throwing source device, the method may further include the following steps: and determining a second instruction corresponding to the first operation event according to the first operation event and a preset operation mapping relation.
The preset operation mapping relation is used to represent the mapping relationship between operations on the first screen and operations on the running screen of the first application.
As can be seen from the above, different operating systems and software and hardware architectures may be installed on the screen-throwing source device and the screen-throwing target device. Therefore, in the process of establishing screen casting between the two devices, a preset operation mapping relation may be agreed upon, so that input operations on the screen-throwing target device can be adapted to the screen-throwing source device, ensuring that the screen-throwing target device can operate the running screen of the application on the screen-throwing source device and realizing interactive operation of the application's screen.
The following embodiment of the present application illustrates a preset operation mapping relation for the screen-casting process between the notebook computer and the mobile phone shown in the figures above, as shown in Table 1.
TABLE 1
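The contents of Table 1 are not reproduced in this text. Purely as a hypothetical illustration of what a preset operation mapping relation between a touch-driven mobile phone and a mouse-driven notebook computer might contain, consider the following Kotlin map; every entry is an assumption, not an entry from the patent's table.

```kotlin
// Hypothetical operation vocabulary on each side of the screen-casting session.
enum class TargetOp { SWIPE_LEFT, SWIPE_RIGHT, TAP, LONG_PRESS, PINCH_IN, PINCH_OUT }
enum class SourceOp { NEXT_ITEM, PREVIOUS_ITEM, LEFT_CLICK, RIGHT_CLICK, ZOOM_OUT, ZOOM_IN }

// A purely illustrative preset operation mapping relation.
val presetOperationMapping: Map<TargetOp, SourceOp> = mapOf(
    TargetOp.SWIPE_LEFT to SourceOp.NEXT_ITEM,      // e.g. switch to the next photo in "gallery"
    TargetOp.SWIPE_RIGHT to SourceOp.PREVIOUS_ITEM,
    TargetOp.TAP to SourceOp.LEFT_CLICK,
    TargetOp.LONG_PRESS to SourceOp.RIGHT_CLICK,
    TargetOp.PINCH_IN to SourceOp.ZOOM_OUT,
    TargetOp.PINCH_OUT to SourceOp.ZOOM_IN,
)

// The second instruction is then determined by looking up the first operation event.
fun adapt(firstOp: TargetOp): SourceOp? = presetOperationMapping[firstOp]
```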
For the above "mode two", the following describes an example of how the screen-throwing target device determines the second instruction corresponding to the first operation event.
In one possible example, before sending the second instruction corresponding to the first operation event to the screen-throwing source device, the method may further include the following steps: acquiring a first scaling ratio and a second scaling ratio; and determining the second instruction corresponding to the first operation event according to the first scaling ratio, the second scaling ratio, and the first operation event.
The first scaling ratio is used for representing the scaling ratio between the display size of the running picture of the first application and the display size of the target picture, the second scaling ratio is used for representing the scaling ratio between the display size of the first picture and the display size of the target picture, and the target picture is the initial mirror image of the running picture of the first application on the display screen of the screen throwing target device.
As can be seen from the foregoing, the display screens of the screen-throwing source device and the screen-throwing target device may differ in size, and the current screen displayed on the screen-throwing target device may differ from the target screen. Therefore, in the process of establishing screen casting between the two devices, the above scaling ratios need to be obtained so that input operations on the screen-throwing target device can be adapted to the screen-throwing source device, ensuring that the screen-throwing target device can operate the current screen of the application on the screen-throwing source device and realizing interactive operation of the application's screen.
Illustratively, in fig. 6, the "sketched cat.jpg" displayed by the "gallery" application in the notebook computer has a display size different from that of the target screen displayed on the display screen of the mobile phone, so the scaling ratio between them (i.e., the first scaling ratio) needs to be calculated and stored in the internal memory of the mobile phone. Then, to view the screen more easily, the user enlarges the target screen displayed on the mobile phone shown in fig. 6 (i.e., fig. 20 (a)) to obtain the current screen (i.e., the first screen) shown in fig. 20 (b). At this time, the scaling ratio between the display size of the current screen and the display size of the target screen (i.e., the second scaling ratio) needs to be calculated and stored in the internal memory of the mobile phone. Finally, in fig. 20 (c), when the user clicks the position of the "×" on the display screen shown in fig. 20 (b) to generate an input operation, the mobile phone does not respond to the input operation, but performs adaptation processing according to the first scaling ratio, the second scaling ratio, and the input operation, and transmits the adapted operation instruction to the notebook computer for execution. The notebook computer then closes the "gallery" application by responding to the adapted operation instruction, ending the screen casting from the notebook computer to the mobile phone.
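The adaptation just described can be read as a coordinate transformation. The following Kotlin sketch maps a click on the first screen back to source-device coordinates through the two scaling ratios; the conventions (both ratios expressed relative to the target screen, a viewport origin accounting for local panning, uniform scaling in both axes) are assumptions made for illustration.

```kotlin
data class Point(val x: Float, val y: Float)

// Adapts a click on the first screen into coordinates on the running screen
// of the first application on the source device.
fun adaptClickToSource(
    click: Point,          // click position on the target device's display
    firstScale: Float,     // first scaling ratio: running-screen size / target-screen size
    secondScale: Float,    // second scaling ratio: first-screen size / target-screen size
    viewportOrigin: Point  // offset of the visible region inside the enlarged target
                           // screen, in first-screen pixels; (0, 0) if no pan applied
): Point {
    // Undo the local zoom and pan to land on the target screen...
    val targetX = (viewportOrigin.x + click.x) / secondScale
    val targetY = (viewportOrigin.y + click.y) / secondScale
    // ...then scale up to the running screen on the source device.
    return Point(targetX * firstScale, targetY * firstScale)
}
```

For example, with firstScale = 2.0f, secondScale = 1.5f, and viewportOrigin = (150f, 0f), a click at (300f, 90f) maps to (300f, 60f) on the target screen and then to (600f, 120f) on the source device, which is where a control such as the "×" would have to be hit.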
In one possible example, after S540, the method further includes the following step: displaying a third screen, where the third screen is a mirror image, different from the first screen, of the running screen of the first application after the screen-throwing source device responds to the second instruction corresponding to the first operation event.
It can be understood that, after responding to the second instruction corresponding to the first operation event, the screen-throwing source device displays the refreshed, changed, or switched running screen of the first application and casts it to the screen-throwing target device, so as to realize synchronous display on the screen-throwing target device.
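On the source-device side, this step might be sketched as follows; the interfaces are illustrative assumptions rather than an actual casting API.

```kotlin
// An assumed carrier for the adapted second instruction.
class SecondInstruction(val op: String, val args: Map<String, Float> = emptyMap())

interface FirstApplication {
    // Executes the operation on the running screen and returns the refreshed
    // frame as encoded pixel data.
    fun execute(instruction: SecondInstruction): ByteArray
}

interface CastSession {
    fun sendMirror(frame: ByteArray)
}

// After responding to the second instruction, cast the refreshed, changed, or
// switched running screen back; the target device displays it as the third screen.
fun onSecondInstruction(app: FirstApplication, session: CastSession, instr: SecondInstruction) {
    val refreshed = app.execute(instr)
    session.sendMirror(refreshed)
}
```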
In summary, the embodiment of the present application illustrates a specific implementation of cross-device screen display by taking the screen-throwing source device as a notebook computer and the screen-throwing target device as a mobile phone. By performing input operations such as zooming in, zooming out, moving, or clicking on the current screen displayed on the mobile phone, and judging whether the input operation should be executed on the mobile phone or adapted and transmitted to the notebook computer, the user can not only operate the cast screen on the display screen of the mobile phone by zooming, moving, or clicking, but also control the current screen of the "gallery" application running on the notebook computer (for example, viewing different photos). This realizes interactive control of the application's screen, improves the processing efficiency of the screen's interactive operation, and improves the screen-casting use experience.
Further, the mobile phone and the notebook computer may run different operating systems and software and hardware architectures, so the present application can realize cross-device display of screens between different devices, operating systems, or software and hardware architectures.
Finally, the application running on the notebook computer may also be a document editor, a video player, a music player, or the like. In these cases, the mobile phone can edit a document running on the notebook computer, switch a video played on the notebook computer, or switch music played on the notebook computer, which is not particularly limited.
It can be seen that, in the embodiment of the present application, the screen-throwing target device displays the first screen and acquires the first operation event for the first screen. It then judges, according to the action performed by the first operation event on the first screen, whether to respond to the first instruction corresponding to the first operation event at the local end. If the first instruction is not responded to at the local end, the second instruction corresponding to the first operation event is sent to the screen-throwing source device. Because the first screen is the current mirror image, on the display screen of the screen-throwing target device, of the running screen of the first application on the screen-throwing source device, and the second instruction instructs the screen-throwing source device to perform the operation on that running screen, cross-device display of screens is realized. In addition, because the screen-throwing target device judges whether to respond to the first instruction at the local end, it may respond locally or not: it can operate the cast screen on its own display screen, and it can also, through the adaptation processing, control the running screen of the first application on the screen-throwing source device. This realizes cross-device interactive control of the application's screen, improves the processing efficiency of the screen's interactive operation, and improves the screen-casting use experience.
In accordance with the embodiment described in fig. 5, please refer to fig. 21, fig. 21 is a flowchart of another method for displaying a picture across devices according to an embodiment of the present application, the method includes:
S2110, a first screen is displayed.
The first screen may be a current mirror image of a running screen of the first application on the projection source device.
It should be noted that the first application of the embodiment of the present application may be an application program or media data running on the application layer of the screen-throwing source device, such as a photo, a video, audio, a game, a gallery, a document, or other multimedia. Meanwhile, the screen-throwing source device may run the first application in the foreground or in the background. When the screen-throwing source device runs the first application in the foreground, its display screen may display the running screen of the first application; when it runs the first application in the background, the first application is executed in the background without its running screen being displayed on the display screen of the screen-throwing source device.
It should be further noted that, in the embodiment of the present application, a screen displayed on the display screen of the screen-throwing target device and thrown by the screen-throwing source device may be referred to as a mirror image (i.e., the first screen or the target screen), and the mirror image may represent all or part of the running screen of the first application. Meanwhile, the mirror image is a picture which is presented after related operations such as resolution adjustment, display size adjustment, picture adjustment and the like in the screen throwing process.
In addition, because the input module 4204 in the embodiment of the present application may perform input operations on the screen displayed by the display module 4203, the current mirror image (i.e., the first screen) displayed on the display screen of the screen-throwing target device may be the target mirror image (i.e., the target screen) of the running screen of the first application cast by the screen-throwing source device, or may be a screen obtained after operations such as zooming in, zooming out, or moving are performed on the target screen. For ease of distinction, in the embodiment of the present application, the target mirror image cast by the screen-throwing source device for the running screen of the first application is referred to as the target screen, and the screen currently displayed on the display screen of the screen-throwing target device is referred to as the first screen.
S2120, a first operation event for the first screen is acquired.
It should be noted that, in the embodiment of the present application, the input module 4204 may perform an input operation on a first screen displayed by the projection target device to generate a first operation event. Therefore, the first operation event may be an operation of moving the first screen along the first direction on the display screen of the screen-throwing target device, an operation of clicking the first screen on the display screen of the screen-throwing target device, or an operation of zooming in or out the first screen on the display screen of the screen-throwing target device, which is not particularly limited.
S2130, if the first operation event is an operation of moving the first screen along the first direction, determining whether to respond to the first instruction corresponding to the first operation event at the local end according to the first screen and the target screen.
The target screen may be the target mirror image cast by the screen-projection source device for the running screen of the first application.
Specifically, the target screen presents the entire screen of the running screen of the first application; alternatively, the target screen presents only a partial screen of the running screen of the first application.
Specifically, the first direction may represent any direction in the plane of the current display screen of the screen-casting target device. For example, the first direction may be a straight-line direction in that plane, such as horizontally to the left, horizontally to the right, vertically up, or vertically down, or it may be an arc-shaped direction in that plane, such as an arc to the left, to the right, upward, or downward.
In one possible example, the determining whether to respond to the first instruction corresponding to the first operation event at the home terminal according to the first screen and the target screen may include the following steps: if the first screen and the target screen are different screens, judging whether to respond to the first instruction corresponding to the first operation event at the local end according to the relationship between the first screen and the target screen; or, if the first screen and the target screen are the same screen, not responding to the first instruction corresponding to the first operation event at the local end.
In one possible example, the determining whether to respond to the first instruction corresponding to the first operation event at the home terminal according to the relationship between the first screen and the target screen may include the following steps: if the first screen is a partial screen of the enlarged target screen, judging whether to respond to the first instruction corresponding to the first operation event at the local end according to whether the first screen, moving along the first direction, exceeds the edge of the target screen; or, if the first screen is the full screen of the reduced target screen, not responding to the first instruction corresponding to the first operation event at the local end.
In one possible example, the determining whether to respond to the first instruction corresponding to the first operation event at the home terminal according to whether the first screen moves beyond the edge of the target screen along the first direction may include the following operations: if the first screen, moving along the first direction, exceeds the edge of the target screen, not responding to the first instruction corresponding to the first operation event at the local end; or, if the first screen, moving along the first direction, does not exceed the edge of the target screen, responding to the first instruction corresponding to the first operation event at the local end.
S2140, if the first instruction corresponding to the first operation event is not responded to at the home terminal, determining a second instruction corresponding to the first operation event according to the first operation event and a preset operation mapping relation.
S2150, if the first instruction corresponding to the first operation event is responded to at the home terminal, moving the first screen along the first direction to display the second screen.
The second screen is a mirror image, different from the first screen, of the running screen of the first application.
S2160, sending a second instruction corresponding to the first operation event to the screen throwing source equipment.
The second instruction corresponding to the first operation event is used to instruct the screen-throwing source device to perform an operation on the running screen of the first application.
S2170, the third screen is displayed.
The third screen is a mirror image, different from the first screen, of the running screen of the first application after the screen-throwing source device responds to the second instruction corresponding to the first operation event.
It can be understood that, after responding to the second instruction corresponding to the first operation event, the screen-throwing source device displays the refreshed, changed, or switched running screen of the first application and casts it to the screen-throwing target device, so as to realize synchronous display on the screen-throwing target device.
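As a compact illustration, S2130 through S2160 can be tied together for a move event as below; every name is an assumption.

```kotlin
// Dispatches a move event per the judgment of S2130: pan locally (S2150) or
// adapt via the preset operation mapping (S2140) and send to the source (S2160).
fun onMoveEvent(
    dx: Float, dy: Float,
    respondLocally: Boolean,                        // outcome of the S2130 judgment
    panFirstScreen: (Float, Float) -> Unit,         // S2150: display the second screen
    toSecondInstruction: (Float, Float) -> String,  // S2140: preset operation mapping
    sendToSourceDevice: (String) -> Unit            // S2160: transmit the second instruction
) {
    if (respondLocally) {
        panFirstScreen(dx, dy)
    } else {
        sendToSourceDevice(toSecondInstruction(dx, dy))
    }
}
```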
It should be noted that, since the description of each embodiment has its own emphasis, for the parts of the embodiment illustrated in fig. 21 that are not described in detail, reference may be made to the related description of the embodiment in fig. 5, and details are not repeated here.
It can be seen that, in the embodiment of the present application, the screen-throwing target device displays the first screen and acquires the first operation event for the first screen. Secondly, if the first operation event is an operation of moving the first screen along the first direction, it judges whether to respond to the first instruction corresponding to the first operation event at the local end according to the first screen and the target screen. Thirdly, if the first instruction is not responded to at the local end, the second instruction corresponding to the first operation event is determined according to the first operation event and the preset operation mapping relation; if the first instruction is responded to at the local end, the first screen is moved along the first direction to display the second screen. Finally, the second instruction corresponding to the first operation event is sent to the screen-throwing source device, and the third screen is displayed. Because the first screen is the current mirror image, on the display screen of the screen-throwing target device, of the running screen of the first application on the screen-throwing source device, and the second instruction instructs the screen-throwing source device to perform the operation on that running screen, cross-device display of screens is realized. In addition, because the screen-throwing target device judges whether to respond to the first instruction at the local end, it can operate the cast screen on its own display screen and can also, through the adaptation processing, control the running screen of the first application on the screen-throwing source device, thereby realizing cross-device interactive control of the application's screen, improving the processing efficiency of the screen's interactive operation, and improving the screen-casting use experience.
In accordance with the embodiments described in fig. 5 and 21, please refer to fig. 22, fig. 22 is a flowchart of another method for displaying a picture across devices according to an embodiment of the present application, which includes:
S2210, a first screen is displayed.
The first screen may be a current mirror image of a running screen of the first application on the projection source device.
It should be noted that the first application of the embodiment of the present application may be an application program or media data running on the application layer of the screen-throwing source device, such as a photo, a video, audio, a game, a gallery, a document, or other multimedia. Meanwhile, the screen-throwing source device may run the first application in the foreground or in the background. When the screen-throwing source device runs the first application in the foreground, its display screen may display the running screen of the first application; when it runs the first application in the background, the first application is executed in the background without its running screen being displayed on the display screen of the screen-throwing source device.
It should be further noted that, in the embodiment of the present application, a screen displayed on the display screen of the screen-throwing target device and thrown by the screen-throwing source device may be referred to as a mirror image (i.e., the first screen or the target screen), and the mirror image may represent all or part of the running screen of the first application. Meanwhile, the mirror image is a picture which is presented after related operations such as resolution adjustment, display size adjustment, picture adjustment and the like in the screen throwing process.
In addition, because the input module 4204 in the embodiment of the present application may perform input operations on the screen displayed by the display module 4203, the current mirror image (i.e., the first screen) displayed on the display screen of the screen-throwing target device may be the target mirror image (i.e., the target screen) of the running screen of the first application cast by the screen-throwing source device, or may be a screen obtained after operations such as zooming in, zooming out, or moving are performed on the target screen. For ease of distinction, in the embodiment of the present application, the target mirror image cast by the screen-throwing source device for the running screen of the first application is referred to as the target screen, and the screen currently displayed on the display screen of the screen-throwing target device is referred to as the first screen.
S2220, acquire a first operation event for the first screen.
It should be noted that, in the embodiment of the present application, the input module 4204 may perform an input operation on a first screen displayed by the projection target device to generate a first operation event. Therefore, the first operation event may be an operation of moving the first screen along the first direction on the display screen of the screen-throwing target device, an operation of clicking the first screen on the display screen of the screen-throwing target device, or an operation of zooming in or out the first screen on the display screen of the screen-throwing target device, which is not particularly limited.
S2230, if the first operation event is an operation of clicking the first screen, the first instruction corresponding to the first operation event is not responded to at the home terminal.
It should be noted that, when the user needs to click the first screen (or a certain position on the first screen) on the display screen of the screen-throwing target device, the embodiment of the application does not respond to the first instruction corresponding to the first operation event at the local end, which is beneficial to improving the processing efficiency of the user's interactive operation on the screen and improving the screen-casting use experience.
S2240, a first scaling ratio and a second scaling ratio are acquired.
The first scaling ratio is used for representing the scaling ratio between the display size of the running picture of the first application and the display size of the target picture, the second scaling ratio is used for representing the scaling ratio between the display size of the first picture and the display size of the target picture, and the target picture is the initial mirror image of the running picture of the first application on the display screen of the screen throwing target device.
S2250, determining a second instruction corresponding to the first operation event according to the first scaling ratio, the second scaling ratio and the first operation event.
As can be seen from the foregoing, the display screens of the screen-throwing source device and the screen-throwing target device may differ in size, and the current screen displayed on the screen-throwing target device may differ from the target screen. Therefore, in the process of establishing screen casting between the two devices, the above scaling ratios need to be obtained so that input operations on the screen-throwing target device can be adapted to the screen-throwing source device, ensuring that the screen-throwing target device can operate the current screen of the application on the screen-throwing source device and realizing interactive operation of the application's screen.
S2260, sending a second instruction corresponding to the first operation event to the screen projection source equipment.
The second instruction corresponding to the first operation event is used to instruct the screen-throwing source device to perform an operation on the running screen of the first application.
S2270, the third screen is displayed.
The third screen is a mirror image, different from the first screen, of the running screen of the first application after the screen-throwing source device responds to the second instruction corresponding to the first operation event.
It can be understood that, after responding to the second instruction corresponding to the first operation event, the screen-throwing source device displays the refreshed, changed, or switched running screen of the first application and casts it to the screen-throwing target device, so as to realize synchronous display on the screen-throwing target device.
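For the transmission in S2260, one conceivable realization is a simple instruction side channel alongside the cast video stream. The sketch below assumes JSON over TCP; the wire format, field names, and port are assumptions, not part of the patent.

```kotlin
import java.io.PrintWriter
import java.net.Socket

// Sends an adapted second instruction (e.g. a click mapped into source-device
// coordinates) to the screen-casting source device.
fun sendSecondInstruction(host: String, port: Int, op: String, x: Float, y: Float) {
    Socket(host, port).use { socket ->
        PrintWriter(socket.getOutputStream(), true).use { writer ->
            // Example payload: {"op":"click","x":600.0,"y":120.0}
            writer.println("""{"op":"$op","x":$x,"y":$y}""")
        }
    }
}
```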
It should be noted that, since the descriptions of the embodiments are focused, the portions of the embodiment shown in fig. 22 that are not described in detail may be referred to the related descriptions of the embodiment in fig. 5, and are not described herein again.
It can be seen that, in the embodiment of the present application, the screen-throwing target device displays the first screen and acquires the first operation event for the first screen. If the first operation event is an operation of clicking the first screen, the first instruction corresponding to the first operation event is not responded to at the local end. Then, the first scaling ratio and the second scaling ratio are acquired, and the second instruction corresponding to the first operation event is determined according to the first scaling ratio, the second scaling ratio, and the first operation event. Finally, the second instruction is sent to the screen-throwing source device, and the third screen is displayed. Because the first screen is the current mirror image, on the display screen of the screen-throwing target device, of the running screen of the first application on the screen-throwing source device, and the second instruction instructs the screen-throwing source device to perform the operation on that running screen, cross-device display of screens is realized. In addition, because the screen-throwing target device judges whether to respond to the first instruction at the local end, it can operate the cast screen on its own display screen and can also, through the adaptation processing, control the running screen of the first application on the screen-throwing source device, thereby realizing cross-device interactive control of the application's screen, improving the processing efficiency of the screen's interactive operation, and improving the screen-casting use experience.
The foregoing description of the embodiments of the present application has been presented primarily in terms of a method-side implementation. It will be appreciated that the electronic device, in order to achieve the above-described functions, includes corresponding hardware structures and/or software modules that perform the respective functions. Those of skill in the art will readily appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as hardware or combinations of hardware and computer software. Whether a function is implemented as hardware or computer software driven hardware depends upon the particular application and design constraints imposed on the solution. Those skilled in the art may implement the described functionality using different approaches for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
The embodiment of the present application may divide the electronic device into functional units according to the above method examples. For example, each functional unit may be divided corresponding to each function, or two or more functions may be integrated into one processing unit. The integrated unit may be implemented in the form of hardware or in the form of a software functional unit. It should be noted that the division of units in the embodiment of the present application is schematic and is merely a division by logical function; another division manner may be adopted in actual implementation.
In the case of integrated units, fig. 23 shows a block diagram of the functional units of a picture cross-device display apparatus. The picture cross-device display apparatus 2300 is applied to a screen-casting target device and specifically includes a processing unit 2320 and a communication unit 2330. The processing unit 2320 is configured to control and manage the actions of the screen-casting target device; for example, the processing unit 2320 is configured to support the screen-casting target device in performing some or all of the steps in fig. 5, as well as other processes of the techniques described herein. The communication unit 2330 is used to support communication between the screen-casting target device and other devices. The picture cross-device display apparatus 2300 may further include a storage unit 2310 for storing program codes and data of the screen-casting target device.
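As a rough illustration of this functional-unit division, the units of fig. 23 could be skeletonized as follows; the interfaces are assumptions, not the patent's actual structure.

```kotlin
interface StorageUnit {        // storage unit 2310: program codes and data
    fun store(key: String, value: ByteArray)
    fun load(key: String): ByteArray?
}

interface CommunicationUnit {  // communication unit 2330: talks to other devices
    fun send(payload: ByteArray)
}

// Processing unit 2320: controls and manages the actions of the target device.
class ProcessingUnit(
    private val storage: StorageUnit,
    private val comm: CommunicationUnit
) {
    // After deciding not to respond locally, forward the adapted second
    // instruction through the communication unit (cf. S540).
    fun forwardSecondInstruction(adapted: ByteArray) = comm.send(adapted)
}
```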
The processing unit 2320 may be a processor or a controller, such as a CPU, a general-purpose processor, a DSP, an ASIC, an FPGA, transistor logic, a hardware component, or any combination thereof, and it may implement or execute the various exemplary logic blocks, modules, and circuits described in connection with the embodiments of the present application. The processing unit 2320 may also be a combination implementing a computing function, for example, a combination of one or more microprocessors, or a combination of a DSP and a microprocessor. The communication unit 2330 may be a communication interface, a transceiver circuit, or the like, and the storage unit 2310 may be a memory. When the processing unit 2320 is a processor, the communication unit 2330 is a communication interface, and the storage unit 2310 is a memory, the picture cross-device display apparatus 2300 of the embodiment of the present application may be the electronic device shown in fig. 24.
In particular, the processing unit 2320 is configured to perform any steps performed by the screen-casting target device in the above method embodiment, and when performing data transmission such as sending, the communication unit 2330 is optionally invoked to complete a corresponding operation. The following is a detailed description.
The processing unit 2320 is configured to: display a first screen, where the first screen is a current mirror image of the running screen of the first application on the screen-throwing source device; acquire a first operation event for the first screen; judge, according to the action performed by the first operation event on the first screen, whether to respond to a first instruction corresponding to the first operation event at the local end; and, if the first instruction corresponding to the first operation event is not responded to at the local end, send a second instruction corresponding to the first operation event to the screen-throwing source device, where the second instruction corresponding to the first operation event is used to instruct the screen-throwing source device to perform an operation on the running screen of the first application.
In one possible example, in terms of determining, according to the action performed by the first operation event on the first screen, whether to respond to the first instruction corresponding to the first operation event at the home end, the processing unit 2320 is specifically configured to: if the first operation event is an operation of moving the first screen along a first direction, judge whether to respond to the first instruction corresponding to the first operation event at the local end according to the first screen and a target screen, where the target screen is the target mirror image cast by the screen-projection source device for the running screen of the first application.
In one possible example, in determining whether to respond to the first instruction corresponding to the first operation event at the home terminal according to the first screen and the target screen, the processing unit 2320 is specifically configured to: if the first screen and the target screen are different screens, judge whether to respond to the first instruction corresponding to the first operation event at the local end according to the relationship between the first screen and the target screen; or, if the first screen and the target screen are the same screen, not respond to the first instruction corresponding to the first operation event at the local end.
In one possible example, in determining whether to respond to the first instruction corresponding to the first operation event at the home terminal according to the relationship between the first screen and the target screen, the processing unit 2320 is specifically configured to: if the first screen is a partial screen of the enlarged target screen, judge whether to respond to the first instruction corresponding to the first operation event at the local end according to whether the first screen, moving along the first direction, exceeds the edge of the target screen; or, if the first screen is the full screen of the reduced target screen, not respond to the first instruction corresponding to the first operation event at the local end.
In one possible example, in determining whether to respond to the first instruction corresponding to the first operation event at the local end according to whether the first screen moves beyond the edge of the target screen along the first direction, the processing unit 2320 is specifically configured to: if the first screen, moving along the first direction, exceeds the edge of the target screen, not respond to the first instruction corresponding to the first operation event at the local end; or, if the first screen, moving along the first direction, does not exceed the edge of the target screen, respond to the first instruction corresponding to the first operation event at the local end.
In one possible example, before sending the second instruction corresponding to the first operation event to the screen-casting source device, the processing unit 2320 is further configured to: determine the second instruction corresponding to the first operation event according to the first operation event and a preset operation mapping relation, where the preset operation mapping relation is used to represent the mapping relationship between operations on the first screen and operations on the running screen of the first application.
In one possible example, after determining whether to respond to the first instruction corresponding to the first operation event at the home terminal according to the first screen and the target screen, the processing unit 2320 is further configured to: if the first instruction corresponding to the first operation event is responded to at the local end, move the first screen along the first direction to display a second screen, where the second screen is a mirror image, different from the first screen, of the running screen of the first application.
In one possible example, in terms of determining, according to the action performed by the first operation event on the first screen, whether to respond to the first instruction corresponding to the first operation event at the home terminal, the processing unit 2320 is specifically configured to: if the first operation event is an operation of clicking the first screen, not respond to the first instruction corresponding to the first operation event at the local end.
In one possible example, before sending the second instruction corresponding to the first operation event to the screen-casting source device, the processing unit 2320 is further configured to: obtain a first scaling ratio and a second scaling ratio, where the first scaling ratio is used to represent the scaling ratio between the display size of the running screen of the first application and the display size of the target screen, the second scaling ratio is used to represent the scaling ratio between the display size of the first screen and the display size of the target screen, and the target screen is the target mirror image cast by the screen-throwing source device for the running screen of the first application; and determine the second instruction corresponding to the first operation event according to the first scaling ratio, the second scaling ratio, and the first operation event.
In one possible example, after sending the second instruction corresponding to the first operation event to the screen-casting source device, the processing unit 2320 is further configured to: display a third screen, where the third screen is a mirror image, different from the first screen, of the running screen of the first application after the screen-throwing source device responds to the second instruction corresponding to the first operation event.
It can be seen that the embodiment of the present application displays the first screen and acquires the first operation event for the first screen, then judges, according to the action performed by the first operation event on the first screen, whether to respond to the first instruction corresponding to the first operation event at the local end, and, if the first instruction is not responded to at the local end, sends the second instruction corresponding to the first operation event to the screen-throwing source device. Because the first screen is the current mirror image, on the display screen of the screen-throwing target device, of the running screen of the first application on the screen-throwing source device, and the second instruction instructs the screen-throwing source device to perform the operation on that running screen, cross-device display of screens is realized. In addition, because the screen-throwing target device judges whether to respond to the first instruction at the local end, it can operate the cast screen on its own display screen and can also, through the adaptation processing, control the running screen of the first application on the screen-throwing source device, thereby realizing cross-device interactive control of the application's screen, improving the processing efficiency of the screen's interactive operation, and improving the screen-casting use experience.
A schematic structure of still another electronic device 2400 provided by an embodiment of the present application is described below, as shown in fig. 24. The electronic device 2400 includes a processor 2410, a memory 2420, a communication interface 2430, and at least one communication bus connecting the processor 2410, the memory 2420, and the communication interface 2430.
The processor 2410 may be one or more central processing units (CPUs). In the case where the processor 2410 is one CPU, the CPU may be a single-core CPU or a multi-core CPU. The memory 2420 includes, but is not limited to, random access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM), or compact disc read-only memory (CD-ROM), and the memory 2420 is used to store related instructions and data. The communication interface 2430 is used to receive and transmit data.
The processor 2410 in the electronic device 2400 is configured to read the one or more programs 2421 stored in the memory 2420 to perform the following steps: displaying a first picture, wherein the first picture is a current mirror image of a running picture of a first application on the screen-casting source device; acquiring a first operation event for the first picture; judging, according to the action performed by the first operation event on the first picture, whether to respond at the local end to a first instruction corresponding to the first operation event; and if the first instruction corresponding to the first operation event is not responded to at the local end, sending a second instruction corresponding to the first operation event to the screen-casting source device, wherein the second instruction corresponding to the first operation event is used for instructing the screen-casting source device to execute an operation on the running picture of the first application.
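As a rough illustration of the decision flow performed by these steps, the sketch below shows how a screen-casting target device might dispatch an incoming operation event: a click is always forwarded to the source device, while a move is consumed locally only while a zoomed-in first picture can still pan within the edges of the target picture, matching the behavior recited in the claims below. Every type and helper here (OpEvent, MirrorState, respondLocally, forwardToSource) is a hypothetical stand-in, not the patent's implementation.

```kotlin
data class Point(val x: Float, val y: Float)  // as in the previous sketch

enum class Direction { LEFT, RIGHT, UP, DOWN }

sealed class OpEvent {
    data class Click(val at: Point) : OpEvent()      // tap on the first picture
    data class Move(val dir: Direction) : OpEvent()  // drag of the first picture
}

class MirrorState(
    // true when the first picture is an enlarged portion of the target picture
    val firstIsZoomedCrop: Boolean,
    // whether panning in a direction stays within the target picture's edges
    private val panStaysInside: (Direction) -> Boolean
) {
    fun canPanLocally(dir: Direction) = firstIsZoomedCrop && panStaysInside(dir)
}

fun respondLocally(event: OpEvent) = println("respond at the local end: $event")
fun forwardToSource(event: OpEvent) = println("send second instruction for: $event")

fun handleOperationEvent(event: OpEvent, state: MirrorState) {
    when (event) {
        // A click targets the application's content, so its first instruction
        // is never answered locally; it is translated and sent to the source.
        is OpEvent.Click -> forwardToSource(event)
        // A move pans the mirror locally while it stays inside the target
        // picture's edges; otherwise the source device must act instead.
        is OpEvent.Move ->
            if (state.canPanLocally(event.dir)) respondLocally(event)
            else forwardToSource(event)
    }
}
```

In a real implementation, forwardToSource would build the second instruction, including the coordinate mapping sketched earlier, and transmit it over the screen-casting session.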
It should be noted that, for a specific implementation of each operation performed by the electronic device 2400, reference may be made to the corresponding description of the method embodiment shown in fig. 5, which is not repeated herein.
As with the foregoing apparatus embodiment, the electronic device 2400 displays the first picture, acquires the first operation event for it, judges whether to respond to the first instruction at the local end, and otherwise sends the second instruction to the screen-casting source device. The same benefits follow: cross-device display of the application's picture, cross-device interactive control through the adaptation process, higher processing efficiency for picture interaction, and a better screen-casting experience.
The present application also provides a computer-readable storage medium storing a computer program for electronic data exchange, the computer program being operable to cause a computer to perform part or all of the steps of any one of the methods described in the method embodiments above.
Embodiments of the present application also provide a computer program product, wherein the computer program product comprises a computer program operable to cause a computer to perform part or all of the steps of any one of the methods described in the method embodiments above. The computer program product may be a software installation package.
For simplicity of description, each of the foregoing method embodiments is presented as a series of combined actions. However, persons skilled in the art will appreciate that the present application is not limited by the order of the actions described, as some steps in the embodiments of the present application may be performed in other orders or concurrently. Moreover, those skilled in the art will also appreciate that the embodiments described in the specification are preferred embodiments, and the actions and modules involved are not necessarily required by the present application.
In the foregoing embodiments, the description of each embodiment has its own emphasis; for parts of one embodiment that are not described in detail, reference may be made to the related descriptions of other embodiments.
In the several embodiments provided by the present application, it should be understood that the disclosed apparatus may be implemented in other ways. The apparatus embodiments described above are merely illustrative. For example, the division into units is only one kind of logical function division, and there may be other division manners in actual implementation; multiple units or components may be combined or integrated into another system, and some features may be omitted or not performed. Furthermore, the couplings, direct couplings, or communication connections shown or discussed may be implemented through certain interfaces, devices, or units, and may be electrical or take other forms.
The above units, if implemented in the form of software functional units and sold or used as independent products, may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present application, or the part of it that contributes to the prior art, or all or part of the technical solution, may be embodied in the form of a computer software product. The computer software product is stored in a memory and includes instructions for causing a computer device (a personal computer, a server, a network device, or the like) to perform all or part of the steps of the methods of the embodiments of the present application. The aforementioned memory includes various media capable of storing program code, such as a USB flash drive, a ROM, a RAM, a removable hard disk, a magnetic disk, or an optical disc.
The embodiments of the present application have been described in detail above, but they are merely intended to help in understanding the method and core concept of the present application. Persons skilled in the art may make changes to the specific implementation and the application scope based on this concept, and the content of this specification should therefore not be construed as limiting the present application.

Claims (12)

1. A picture cross-device display method, characterized by being applied to a screen-casting target device; the method comprises the following steps:
displaying a first picture, wherein the first picture is a current mirror image of a running picture of a first application on a screen-casting source device;
acquiring a first operation event for the first picture;
judging, according to the action performed by the first operation event on the first picture, whether to respond at the local end to a first instruction corresponding to the first operation event;
if the first instruction corresponding to the first operation event is not responded to at the local end, sending a second instruction corresponding to the first operation event to the screen-casting source device, wherein the second instruction corresponding to the first operation event is used for instructing the screen-casting source device to execute an operation on the running picture of the first application;
wherein the judging, according to the action performed by the first operation event on the first picture, whether to respond at the local end to the first instruction corresponding to the first operation event comprises:
if the first operation event is an operation of moving the first picture along a first direction, determining, according to whether the first picture and a target picture are the same picture, whether to respond at the local end to the first instruction corresponding to the first operation event, wherein the target picture is a target mirror image of the running picture of the first application on the screen-casting source device.
2. The method of claim 1, wherein the determining, according to whether the first picture and the target picture are the same picture, whether to respond at the local end to the first instruction corresponding to the first operation event comprises:
if the first picture and the target picture are different pictures, determining, according to the enlargement or reduction relation between the first picture and the target picture, whether to respond at the local end to the first instruction corresponding to the first operation event; or
if the first picture and the target picture are the same picture, not responding at the local end to the first instruction corresponding to the first operation event.
3. The method of claim 2, wherein the determining, according to the enlargement or reduction relation between the first picture and the target picture, whether to respond at the local end to the first instruction corresponding to the first operation event comprises:
if the first picture is a portion of the target picture after enlargement, determining, according to whether the movement of the first picture along the first direction exceeds an edge of the target picture, whether to respond at the local end to the first instruction corresponding to the first operation event; or
if the first picture is the full picture of the target picture after reduction, not responding at the local end to the first instruction corresponding to the first operation event.
4. The method of claim 3, wherein the determining, according to whether the movement of the first picture along the first direction exceeds an edge of the target picture, whether to respond at the local end to the first instruction corresponding to the first operation event comprises:
if the movement of the first picture along the first direction exceeds the edge of the target picture, not responding at the local end to the first instruction corresponding to the first operation event; or
if the movement of the first picture along the first direction does not exceed the edge of the target picture, responding at the local end to the first instruction corresponding to the first operation event.
5. The method of claim 1, wherein before the sending the second instruction corresponding to the first operation event to the screen-casting source device, the method further comprises:
determining the second instruction corresponding to the first operation event according to the first operation event and a preset operation mapping relation, wherein the preset operation mapping relation is used for representing a mapping relation between operations on the first picture and operations on the running picture of the first application.
6. The method according to claim 1, wherein after the determining, according to the first picture and the target picture, whether to respond at the local end to the first instruction corresponding to the first operation event, the method further comprises:
if the first instruction corresponding to the first operation event is responded to at the local end, moving the first picture along the first direction to display a second picture, wherein the second picture is a mirror image of the running picture of the first application and is different from the first picture.
7. The method according to claim 1, wherein the judging, according to the action performed by the first operation event on the first picture, whether to respond at the local end to the first instruction corresponding to the first operation event comprises:
if the first operation event is an operation of clicking the first picture, not responding at the local end to the first instruction corresponding to the first operation event.
8. The method of claim 7, wherein before the sending the second instruction corresponding to the first operation event to the screen-casting source device, the method further comprises:
obtaining a first scaling ratio and a second scaling ratio, wherein the first scaling ratio is used for representing the scaling ratio between the display size of the running picture of the first application and the display size of a target picture, the second scaling ratio is used for representing the scaling ratio between the display size of the first picture and the display size of the target picture, and the target picture is a target mirror image of the running picture of the first application on the screen-casting source device;
and determining the second instruction corresponding to the first operation event according to the first scaling ratio, the second scaling ratio, and the first operation event.
9. The method of any of claims 1-8, wherein after the sending the second instruction corresponding to the first operation event to the screen-casting source device, the method further comprises:
displaying a third picture, wherein the third picture is a mirror image of the running picture of the first application after the screen-casting source device responds to the second instruction corresponding to the first operation event, and the third picture is different from the first picture.
10. A picture cross-device display apparatus, characterized by being applied to a screen-casting target device; the apparatus comprises a processing unit and a communication unit, wherein the processing unit is configured to:
display a first picture, wherein the first picture is a current mirror image of a running picture of a first application on a screen-casting source device;
acquire a first operation event for the first picture;
judge, according to the action performed by the first operation event on the first picture, whether to respond at the local end to a first instruction corresponding to the first operation event; and
if the first instruction corresponding to the first operation event is not responded to at the local end, send a second instruction corresponding to the first operation event to the screen-casting source device through the communication unit, wherein the second instruction corresponding to the first operation event is used for instructing the screen-casting source device to execute an operation on the running picture of the first application;
wherein, in terms of the judging, according to the action performed by the first operation event on the first picture, whether to respond at the local end to the first instruction corresponding to the first operation event, the processing unit is specifically configured to:
if the first operation event is an operation of moving the first picture along a first direction, determine, according to whether the first picture and a target picture are the same picture, whether to respond at the local end to the first instruction corresponding to the first operation event, wherein the target picture is a target mirror image of the running picture of the first application on the screen-casting source device.
11. An electronic device, characterized in that the electronic device is a screen-casting target device comprising a processor, a memory, and a communication interface, wherein the memory stores one or more programs, the one or more programs are executed by the processor, and the one or more programs comprise instructions for performing the steps of the method of any one of claims 1-9.
12. A computer readable storage medium storing a computer program for electronic data exchange, wherein the computer program is operable to cause a computer to perform the method of any one of claims 1-9.
CN202011284014.2A 2020-11-16 2020-11-16 Picture cross-device display method and device and electronic device Active CN112394895B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202011284014.2A CN112394895B (en) 2020-11-16 2020-11-16 Picture cross-device display method and device and electronic device
PCT/CN2021/121014 WO2022100305A1 (en) 2020-11-16 2021-09-27 Cross-device picture display method and apparatus, and electronic device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011284014.2A CN112394895B (en) 2020-11-16 2020-11-16 Picture cross-device display method and device and electronic device

Publications (2)

Publication Number Publication Date
CN112394895A CN112394895A (en) 2021-02-23
CN112394895B true CN112394895B (en) 2023-10-13

Family

ID=74600911

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011284014.2A Active CN112394895B (en) 2020-11-16 2020-11-16 Picture cross-device display method and device and electronic device

Country Status (2)

Country Link
CN (1) CN112394895B (en)
WO (1) WO2022100305A1 (en)

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112394895B (en) * 2020-11-16 2023-10-13 Oppo广东移动通信有限公司 Picture cross-device display method and device and electronic device
CN115253285A (en) * 2021-04-30 2022-11-01 华为技术有限公司 Display method and related device
CN115460445B (en) * 2021-06-09 2024-03-22 荣耀终端有限公司 Screen projection method of electronic equipment and electronic equipment
CN115700463A (en) * 2021-07-30 2023-02-07 华为技术有限公司 Screen projection method and system and electronic equipment
CN113891127A (en) * 2021-08-31 2022-01-04 维沃移动通信有限公司 Video editing method and device and electronic equipment
CN115756268A (en) * 2021-09-03 2023-03-07 华为技术有限公司 Cross-device interaction method and device, screen projection system and terminal
CN114257631A (en) * 2021-12-20 2022-03-29 Oppo广东移动通信有限公司 Data interaction method, device, equipment and storage medium
CN114579034A (en) * 2022-03-02 2022-06-03 北京字节跳动网络技术有限公司 Information interaction method and device, display equipment and storage medium
CN115729502B (en) * 2022-03-23 2024-02-27 博泰车联网(南京)有限公司 Screen-throwing end and display end response method, electronic equipment and storage medium
CN115174988A (en) * 2022-06-24 2022-10-11 长沙联远电子科技有限公司 Audio and video screen projection control method based on DLNA
CN116719468A (en) * 2022-09-02 2023-09-08 荣耀终端有限公司 Interactive event processing method and device


Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112394895B (en) * 2020-11-16 2023-10-13 Oppo广东移动通信有限公司 Picture cross-device display method and device and electronic device

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104811639A (en) * 2015-04-28 2015-07-29 联想(北京)有限公司 Method for processing information and electronic equipment
CN107483994A (en) * 2017-07-31 2017-12-15 广州指观网络科技有限公司 It is a kind of reversely to throw screen control system and method
CN110377250A (en) * 2019-06-05 2019-10-25 华为技术有限公司 A kind of touch control method and electronic equipment thrown under screen scene
CN110248226A (en) * 2019-07-16 2019-09-17 广州视源电子科技股份有限公司 Throwing screen method, apparatus, system, storage medium and the processor of information
CN111221491A (en) * 2020-01-09 2020-06-02 Oppo(重庆)智能科技有限公司 Interaction control method and device, electronic equipment and storage medium
CN111562896A (en) * 2020-04-26 2020-08-21 维沃移动通信有限公司 Screen projection method and electronic equipment
CN111918119A (en) * 2020-07-24 2020-11-10 深圳乐播科技有限公司 IOS system data screen projection method, device, equipment and storage medium

Also Published As

Publication number Publication date
CN112394895A (en) 2021-02-23
WO2022100305A1 (en) 2022-05-19

Similar Documents

Publication Publication Date Title
CN112394895B (en) Picture cross-device display method and device and electronic device
US11922005B2 (en) Screen capture method and related device
WO2021078284A1 (en) Content continuation method and electronic device
CN112291764B (en) Content connection system
CN112398855B (en) Method and device for transferring application contents across devices and electronic device
CN112558825A (en) Information processing method and electronic equipment
EP4060475A1 (en) Multi-screen cooperation method and system, and electronic device
CN114040242B (en) Screen projection method, electronic equipment and storage medium
WO2022105445A1 (en) Browser-based application screen projection method and related apparatus
CN112527174B (en) Information processing method and electronic equipment
WO2022121775A1 (en) Screen projection method, and device
CN112527222A (en) Information processing method and electronic equipment
WO2023030099A1 (en) Cross-device interaction method and apparatus, and screen projection system and terminal
WO2022042769A2 (en) Multi-screen interaction system and method, apparatus, and medium
WO2022135157A1 (en) Page display method and apparatus, and electronic device and readable storage medium
CN115016697A (en) Screen projection method, computer device, readable storage medium, and program product
JP2023534182A (en) File opening methods and devices
CN115550597A (en) Shooting method, system and electronic equipment
CN109714628B (en) Method, device, equipment, storage medium and system for playing audio and video
WO2023005711A1 (en) Service recommendation method and electronic device
CN115086888B (en) Message notification method and device and electronic equipment
CN113805825B (en) Method for data communication between devices, device and readable storage medium
WO2024022307A1 (en) Screen mirroring method and electronic device
WO2022188632A1 (en) Theme display method and apparatus, terminal, and computer storage medium
CN115309316A (en) Device using method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant