WO2022100305A1 - Procédé et appareil d'affichage d'image entre dispositifs, et dispositif électronique - Google Patents

Publication number
WO2022100305A1
Authority
WO
WIPO (PCT)
Prior art keywords
screen
picture
operation event
target
instruction corresponding
Prior art date
Application number
PCT/CN2021/121014
Other languages
English (en)
Chinese (zh)
Inventor
杨俊拯
何轩
钟卫东
Original Assignee
Oppo广东移动通信有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Application filed by Oppo广东移动通信有限公司
Publication of WO2022100305A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14Digital output to display device ; Cooperation and interconnection of the display device with other functional units
    • G06F3/1454Digital output to display device ; Cooperation and interconnection of the display device with other functional units involving copying of the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay

Definitions

  • the present application relates to the field of computer technologies, and in particular, to a method and device for displaying pictures across devices, and an electronic device.
  • a computer can be equipped with a system such as Windows or macOS
  • a mobile phone can be equipped with a system such as Android or iOS.
  • Embodiments of the present application provide a cross-device screen display method, device, and electronic device, so as to realize cross-device screen display and interactive control of an application, improve processing efficiency of screen interaction operations, and improve screen projection experience.
  • an embodiment of the present application provides a method for displaying pictures across devices, which is applied to a target device for screen projection, and the method includes:
  • the first picture is the current mirror image of the running picture of the first application on the screen projection source device
  • the second instruction corresponding to the first operation event is sent to the screen projection source device, and the second instruction corresponding to the first operation event is used to instruct the screen projection source device to perform an operation on the running screen of the first application.
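The target-device side of this method can be sketched as follows. This is an illustrative stand-in only: the names `OperationEvent`, `make_instruction` and `send_to_source`, and the coordinate-scaling policy, are assumptions rather than anything specified by the application.

```python
# Hypothetical sketch of the screen projection target device: an operation
# event received on the mirrored (first) picture is translated into an
# instruction and sent back to the screen projection source device.
from dataclasses import dataclass

@dataclass
class OperationEvent:
    kind: str      # e.g. "tap", "swipe"
    x: int         # coordinates on the mirrored picture
    y: int

def make_instruction(event: OperationEvent, scale: float) -> dict:
    """Map mirror coordinates back to the source screen and wrap the
    event as an instruction the source device can replay."""
    return {"op": event.kind, "x": int(event.x * scale), "y": int(event.y * scale)}

sent = []  # stands in for the communication unit's transport

def send_to_source(instruction: dict) -> None:
    sent.append(instruction)

# A tap at (100, 40) on a half-size mirror maps to (200, 80) on the source.
send_to_source(make_instruction(OperationEvent("tap", 100, 40), scale=2.0))
print(sent[0])  # {'op': 'tap', 'x': 200, 'y': 80}
```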
  • an embodiment of the present application provides an apparatus for displaying pictures across devices.
  • the apparatus includes a processing unit and a communication unit, where the processing unit is configured to:
  • the first picture is the current mirror image of the running picture of the first application on the screen projection source device
  • the communication unit sends the second instruction corresponding to the first operation event to the screen projection source device, and the second instruction corresponding to the first operation event is used to instruct the screen projection source device to perform an operation on the running screen of the first application.
  • an embodiment of the present application provides an electronic device, where the electronic device is a screen projection target device and includes a processor, a memory, and a communication interface; the memory stores one or more programs, the one or more programs are executed by the processor, and the one or more programs include instructions for executing the steps in the first aspect of the embodiments of the present application.
  • the embodiments of the present application provide a computer-readable storage medium, wherein the computer-readable storage medium stores a computer program for electronic data exchange, and the computer program is operable to cause a computer to execute some or all of the steps described in the first aspect of the embodiments of the present application.
  • an embodiment of the present application provides a computer program product, wherein the computer program product includes a computer program, and the computer program is operable to cause a computer to execute some or all of the steps described in the first aspect of the embodiments of the present application.
  • the computer program product may be a software installation package.
  • FIG. 1 is a schematic diagram of the architecture of a screen projection communication system provided by an embodiment of the present application
  • FIG. 2 is a schematic diagram of a hardware structure of an electronic device provided by an embodiment of the present application.
  • FIG. 3 is a schematic diagram of a software structure of an electronic device provided by an embodiment of the present application.
  • FIG. 4 is a schematic structural diagram of a screen projection source device and a screen projection target device provided by an embodiment of the present application;
  • FIG. 5 is a schematic flowchart of a method for displaying pictures across devices provided by an embodiment of the present application
  • FIG. 6 is a schematic structural diagram of a cross-device display of a picture provided by an embodiment of the present application.
  • FIG. 7 is a schematic structural diagram of another screen cross-device display provided by an embodiment of the present application.
  • FIG. 8 is a schematic structural diagram of another screen cross-device display provided by an embodiment of the present application.
  • FIG. 9 is a schematic structural diagram of another screen cross-device display provided by an embodiment of the present application.
  • FIG. 10 is a schematic structural diagram of another screen cross-device display provided by an embodiment of the present application.
  • FIG. 11 is a schematic structural diagram of another screen cross-device display provided by an embodiment of the present application.
  • FIG. 12 is a schematic structural diagram of another screen cross-device display provided by an embodiment of the present application.
  • FIG. 13 is a schematic structural diagram of another screen cross-device display provided by an embodiment of the present application.
  • FIG. 14 is a schematic structural diagram of another screen cross-device display provided by an embodiment of the present application.
  • FIG. 15 is a schematic structural diagram of another screen cross-device display provided by an embodiment of the present application.
  • FIG. 16 is a schematic structural diagram of another screen cross-device display provided by an embodiment of the present application.
  • FIG. 17 is a schematic structural diagram of another screen cross-device display provided by an embodiment of the present application.
  • FIG. 18 is a schematic structural diagram of another screen cross-device display provided by an embodiment of the present application.
  • FIG. 19 is a schematic structural diagram of another screen cross-device display provided by an embodiment of the present application.
  • FIG. 20 is a schematic structural diagram of another screen cross-device display provided by an embodiment of the present application.
  • FIG. 21 is a schematic flowchart of another method for displaying pictures across devices provided by an embodiment of the present application.
  • FIG. 22 is a schematic flowchart of another method for displaying pictures across devices provided by an embodiment of the present application.
  • FIG. 23 is a block diagram showing the composition of functional units of a screen cross-device display device provided by an embodiment of the present application.
  • FIG. 24 is a schematic structural diagram of another electronic device provided by an embodiment of the present application.
  • the screen projection in the embodiment of the present application is a technology for projecting the screen or content of an application running on a device to a display screen or a display medium of another device for display, which is a typical information synchronization method.
  • the device that projects the screen of its application is called the screen-casting source device
  • the device that receives and displays the screen of its application is called the screen-casting target device.
  • the screen projection involved in this application may include wired screen projection and wireless screen projection.
  • wired screen projection can establish a wired connection between the screen projection source device and the screen projection target device through a high definition multimedia interface (HDMI), a universal serial bus (USB) interface, etc., to transmit media data; wireless screen projection can establish a wireless connection between the screen projection source device and the screen projection target device through the digital living network alliance (DLNA) protocol, wireless display sharing (Miracast) or the AirPlay protocol to transmit media data.
  • the screencasting source device may encode and compress the video stream in the current video player and send it to the screencasting target device; the screencasting target device then decodes the video stream data and displays the projected content on its screen.
  • the screencast content displayed on the display screen of the screencast target device may be referred to as the mirror image of the screencast content of the screencast source device.
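The encode-compress-decode data path described above can be illustrated with a minimal sketch. Here `zlib` stands in for a real video codec purely for demonstration, and the function names `source_encode`/`target_decode` are invented for this example.

```python
# Minimal stand-in for the screencasting data path: the source device
# encodes and compresses a frame, the target device decodes it and
# displays the resulting mirror image.
import zlib

def source_encode(frame: bytes) -> bytes:
    return zlib.compress(frame)          # encode + compress on the source

def target_decode(payload: bytes) -> bytes:
    return zlib.decompress(payload)      # decode on the target, then display

frame = b"\x10\x20\x30" * 1000           # pretend raw frame data
payload = source_encode(frame)
mirror = target_decode(payload)

assert mirror == frame                   # the displayed mirror matches the source frame
print(len(payload) < len(frame))         # True: compression shrank the payload
```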
  • the screen projection source device and the screen projection target device may be collectively referred to as electronic devices.
  • the electronic device in this embodiment of the present application may be a handheld device, a vehicle-mounted device, a wearable device, an augmented reality (AR) device, a virtual reality (VR) device, a projection device, a projector, or another device with a wireless modem, and may also be various specific forms of user equipment (UE), terminal device, smart phone, smart screen, smart TV, smart watch, notebook computer, smart audio, camera, gamepad, microphone, station (STA), access point (AP), mobile station (MS), personal digital assistant (PDA), personal computer (PC), relay equipment, etc.
  • take two electronic devices, a computer and a mobile phone, as an example
  • the computer and the mobile phone are connected by wireless communication technology (such as Bluetooth, Wi-Fi, Zigbee, near field communication, etc.) or a data cable (such as a USB data cable)
  • the mobile phone, as the screen projection source device, projects the screen or content of the application it is running to the screen of the computer through screen projection technology, and the computer serves as the screen projection target device; or, the computer serves as the source device and projects the screen or content of the application it is running to the screen of the mobile phone, and the mobile phone serves as the screen projection target device.
  • the screen projection communication system 10 may include at least two electronic devices 110 .
  • the at least two electronic devices 110 may include electronic device 110A, electronic device 110B, electronic device 110C, electronic device 110D, electronic device 110E and electronic device 110F. Meanwhile, the at least two electronic devices 110 may be connected to each other through a wireless network or wired data communication.
  • the wireless network may include a mobile cellular network (such as a fifth-generation 5G mobile communication network), a wireless local area network (WLAN), a wide area network (WAN), Bluetooth, wireless fidelity (Wi-Fi), Zigbee, near field communication (NFC), ultra wide band (UWB), etc.; the wired data connection may use HDMI data lines, USB data lines, etc.
  • each electronic device in the at least two electronic devices 110 may be a device under the same user account.
  • the at least two electronic devices 110 include a mobile phone, a desktop computer, a smart screen, a notebook computer, a relay device and a smart watch, and the mobile phone, desktop computer, smart screen, notebook computer, relay device and smart watch can communicate with each other through a wireless network.
  • each of the at least two electronic devices 110 may be connected to the same WLAN network through a relay device (eg, a router).
  • the at least two electronic devices 110 include a mobile phone, a desktop computer, a smart screen, a notebook computer, a relay device and a smart watch, and these devices form a WLAN network, so that each device in the WLAN network can communicate with the others through the relay device.
  • each of the at least two electronic devices 110 may form a peer-to-peer (P2P) network through wireless communication (such as Bluetooth, Zigbee, NFC, UWB, etc.).
  • users can form a P2P network with mobile phones, laptops and smart watches by scanning NFC tags, and all devices in the P2P network can communicate with each other.
  • one or more of the at least two electronic devices 110 may be used as screen projection source devices, and other electronic devices may be used as screen projection target devices.
  • the screen-casting source device can screen-cast or stream the current screen of the running application to the screen-casting target device for display.
  • the screen-casting target device can display the images simultaneously in a split-screen manner.
  • the electronic device 110A casts the screen to the electronic device 110B and the electronic device 110C
  • the electronic device 110D casts the screen to the electronic device 110C and the electronic device 110F.
  • the electronic device 110C can simultaneously display the mirror images from the electronic device 110A and the electronic device 110D in a split-screen manner.
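The split-screen case above can be sketched with a simple layout helper. The even side-by-side division and the function name `split_screen` are assumptions for illustration; the application does not prescribe a particular layout policy.

```python
# Illustrative helper: a screen projection target device showing mirrors
# from several source devices divides its screen width evenly into
# side-by-side regions, one per projected mirror.

def split_screen(width: int, height: int, n_sources: int):
    """Return (x, y, w, h) regions, one per projected mirror image."""
    region_w = width // n_sources
    return [(i * region_w, 0, region_w, height) for i in range(n_sources)]

# Device 110C displaying mirrors from 110A and 110D on a 1920x1080 screen:
print(split_screen(1920, 1080, 2))  # [(0, 0, 960, 1080), (960, 0, 960, 1080)]
```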
  • the screen projection communication system 10 may also include other electronic devices, which are not specifically limited herein.
  • the structure of the electronic device in the embodiment of the present application will be described in detail below with reference to FIG. 2 . It can be understood that the structure shown in FIG. 2 does not constitute a specific limitation on the electronic device. In other embodiments of the present application, the electronic device may further include more or less components than those shown in FIG. 2 , or combine some components, or separate some components, or arrange different components. In addition, the components illustrated in FIG. 2 may be implemented by hardware, software, or a combination of software and hardware.
  • the electronic device may include a processor 210, an antenna 1, an antenna 2, a mobile communication module 220, a wireless communication module 230, an audio module 240, a sensor module 250, a display module 260, a camera module 270, a charging management module 280, an internal memory 2901, an external memory interface 2902, etc.
  • the processor 210 may include one or more processing units.
  • the processor 210 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a field programmable gate array (FPGA), a baseband processor and/or a neural-network processing unit (NPU), etc.
  • a memory may also be provided in the processor 210 for storing instructions and data.
  • the memory in processor 210 is cache memory.
  • the memory may hold instructions or data that have just been used or recycled by the processor 210 . If the processor 210 needs to use the instruction or data again, it can be directly called from the memory, thereby avoiding repeated access, reducing the waiting time of the processor 210 and improving the system efficiency.
  • the processor 210 may include one or more interfaces.
  • the interface may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface and/or a USB interface, etc.
  • the wireless communication function of the electronic device may be implemented by the antenna 1 , the antenna 2 , the mobile communication module 220 , the wireless communication module 230 , the modulation and demodulation processor, the baseband processor, and the like.
  • the antenna 1 and the antenna 2 are used for transmitting and receiving electromagnetic wave signals.
  • Each antenna in an electronic device can be used to cover a single or multiple communication frequency bands.
  • different antennas can also be multiplexed to improve the utilization of the antennas.
  • the antenna 1 is multiplexed as a diversity antenna of the wireless local area network.
  • the mobile communication module 220 can provide a wireless communication solution including 2G/3G/4G/5G, etc. applied on the electronic device.
  • the mobile communication module 220 may include at least one filter, a switch, a power amplifier, a low noise amplifier (LNA), and the like.
  • the mobile communication module 220 can receive electromagnetic waves from the antenna 1, filter and amplify the received electromagnetic waves, and transmit them to the modulation and demodulation processor for demodulation. In addition, the mobile communication module 220 can also amplify the signal modulated by the modulation and demodulation processor, and then turn it into an electromagnetic wave for radiation through the antenna 1 . In the embodiment of the present application, the mobile communication module 220 can realize the communication connection between the screen projection source device and the screen projection target device in the technical solution of the present application.
  • At least some functional modules of the mobile communication module 220 may be provided in the processor 210; or, at least some functional modules of the mobile communication module 220 may be provided in the same device as some modules of the processor 210.
  • the wireless communication module 230 can provide wireless communication solutions applied on the electronic device, including Bluetooth (BT), wireless local area network (WLAN), wireless fidelity (Wi-Fi), near field communication (NFC), infrared (IR), etc.
  • the wireless communication module 230 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 230 receives electromagnetic waves via the antenna 2, frequency modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 210.
  • the wireless communication module 230 can also receive the signal to be sent from the processor 210, perform frequency modulation and amplification on the signal, and then convert it into electromagnetic waves by the antenna 2 and radiate it out.
  • the wireless communication module 230 can realize the communication connection between the screen projection source device and the screen projection target device in the technical solution of the present application.
  • the electronic device may implement audio functions through the audio module 240 , the speaker 2401 , the receiver 2402 , the microphone 2403 , the headphone interface 2404 , the processor 210 , and the like. For example, music playback, recording, etc.
  • the audio module 240 can be used to convert digital audio information into analog audio signal output, and can also be used to convert analog audio input into digital audio signal. Additionally, the audio module 240 may also be used to encode and decode audio signals. In some embodiments, the audio module 240 may be provided in the processor 210 , or some functional modules of the audio module 240 may be provided in the processor 210 .
  • the speaker 2401 can be used to convert audio electrical signals into sound signals.
  • the electronic device can listen to music through the speaker 2401, or listen to a hands-free call.
  • the receiver 2402 can be used to convert audio electrical signals into sound signals.
  • the receiver 2402 can be close to the human ear to receive the voice.
  • the microphone 2403 can be used to convert sound signals into electrical signals.
  • the user can speak with the mouth close to the microphone 2403, so as to input the sound signal into the microphone 2403.
  • the electronic device may be provided with at least one microphone 2403 .
  • the electronic device may be provided with two microphones 2403, which can implement noise reduction function in addition to collecting sound signals; in a possible example, the electronic device may be provided with three, four or more microphones 2403 , to collect sound signals, reduce noise, identify sound sources, and implement directional recording functions, etc., which are not limited.
  • the earphone interface 2404 can be used to connect a wired earphone.
  • the earphone interface 2404 can be a USB interface 2703, a 3.5mm open mobile terminal platform (OMTP) standard interface, a cellular telecommunications industry association of the USA (CTIA) standard interface, etc.
  • the sensor module 250 may include a pressure sensor, a gyroscope sensor, an air pressure sensor, a magnetic sensor, an acceleration sensor, a distance sensor, a proximity light sensor, a fingerprint sensor, a temperature sensor, a touch sensor, an ambient light sensor, a bone conduction sensor, an ultra-wideband (UWB) sensor, a near field communication (NFC) sensor, a laser sensor, a visible light sensor, etc.
  • the electronic device may implement the display function through the GPU, the display module 260, the processor 210, and the like.
  • the GPU can be used to perform mathematical and geometric calculations and perform graphics rendering.
  • the GPU can be a microprocessor for image processing, and is connected to the display module 260 and the processor 210 .
  • Processor 210 may include one or more GPUs that execute program instructions to generate or alter display information.
  • the display module 260 may be a display screen, which is used to display images, videos, and the like.
  • the display screen 260 may include a display panel.
  • the display panel can be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a quantum dot light-emitting diode (QLED), etc.
  • the electronic device may include one or more display modules 260 .
  • the electronic device can realize the shooting function through the ISP, the camera module 270, the video codec, the GPU, the display module 260, the processor 210, and the like.
  • the ISP may be used to process the data fed back by the camera module 270 .
  • the shutter is first opened, then light is transmitted through the lens to the camera photosensitive element, which converts the light signal into an electrical signal; finally, the electrical signal is transmitted to the ISP for processing, converting it into an image visible to the naked eye.
  • ISP can also perform algorithm optimization on image noise, brightness, and skin tone. ISP can also optimize the exposure, color temperature and other parameters of the shooting scene.
  • the ISP may be provided in the camera module 270 .
  • the camera module 270 may be a camera, which is used to capture still images or videos, and the like.
  • the optical image generated by the object is projected to the photosensitive element through the lens, and the photosensitive element can be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor.
  • the photosensitive element converts the optical signal into an electrical signal, and then transmits the electrical signal to the ISP to convert it into a digital image signal.
  • the ISP outputs the digital image signal to the DSP for processing.
  • the DSP converts digital image signals into image signals in standard formats such as RGB and YUV.
  • the electronic device may include one or more camera modules 270 .
  • the charging management module 280 is configured to receive charging input from the charger.
  • the charger may be a wireless charger or a wired charger.
  • the charging management module 280 may receive charging input from the wired charger through the USB interface 2803 .
  • the charging management module 280 may receive wireless charging input through a wireless charging coil of the electronic device. While the charging management module 280 charges the battery 2801 , it can also supply power to the electronic device through the power management module 2802 .
  • the power management module 2802 is used to connect the battery 2801, the charging management module 280 and the processor 210.
  • the power management module 2802 receives input from the battery 2801 and/or the charging management module 280, and supplies power to each module in the electronic device, the processor 210, and the like.
  • the power management module 2802 can also be used to monitor parameters such as battery capacity, battery cycle times, battery health status (leakage, impedance).
  • the power management module 2802 may also be provided in the processor 210; in a possible example, the power management module 2802 and the charging management module 280 may also be provided in the same device.
  • the internal memory 2901 may be used to store computer executable program codes, where the executable program codes include instructions.
  • the processor 210 executes various functional applications and data processing of the electronic device by executing the instructions stored in the internal memory 2901 .
  • the internal memory 2901 stores program codes for executing the technical solutions of the embodiments of the present application.
  • the internal memory 2901 may include a program storage area and a data storage area.
  • the program storage area may store an operating system, an application program required for at least one function (for example, a sound playback function and an image playback function, etc.), and the like.
  • the data storage area can store data (eg, audio data, phone book, etc.) created during the use of the electronic device, and the like.
  • the internal memory 2901 may include high-speed random access memory, and may also include non-volatile memory. For example, at least one disk storage device, flash memory device, universal flash storage (UFS), and the like.
  • the external memory interface 2902 can be used to connect an external memory card, such as a micro SD card, to expand the storage capacity of the electronic device.
  • the external memory card communicates with the processor 210 through the external memory interface 2902 to realize the data storage function. For example, save files such as music, videos, etc. on an external memory card.
  • the software system of the electronic device may adopt a layered architecture, an event-driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture.
  • the following embodiments of the present application take an Android system with a layered architecture as an example to exemplarily describe the software structure of an electronic device.
  • the internal memory 2901 may store the kernel layer 320 , the system runtime layer 340 , the application framework layer 360 and the application layer 380 .
  • the layers communicate with each other through software interfaces, and the kernel layer 320 , the system runtime layer 340 and the application framework layer 360 belong to the operating system space.
  • the application layer 380 belongs to the user space, and at least one application program (or “application” for short) runs in the application layer 380.
  • application programs may be native applications provided by the operating system, or third-party applications developed by third-party developers.
  • the application layer 380 may include applications such as camera, gallery, calendar, call, map, navigation, WLAN, Bluetooth, music, video, and SMS.
  • a screen-casting application may also be installed in the application layer.
  • Users can open screen-casting apps from the desktop, settings, or drop-down menus.
  • the screencasting application can serve as a bridge between the screencasting source device and the screencasting target device during content projection, sending the screencasting content (such as the screen of an application) that needs to be projected from the screencasting source device to the screencasting target device.
  • a screen-casting application can receive a screen-casting event reported by the application framework layer 360, interact with a running application (such as a video player) through the screen-casting application, and send the content being displayed or played in that application, as the screen-casting content, to the screen projection target device through Wi-Fi or other wireless communication methods.
  • the user can also use the screen-casting application to set the binding relationship between an NFC tag and one or more electronic devices. For example, an option for binding an NFC tag may be set in the screen-casting application. After the electronic device detects that the user has turned on this option, the screen-casting application may display a list of electronic devices to be bound. After the user selects one or more electronic devices in the list, the user can bring the electronic device close to the NFC tag to be bound. In this way, the electronic device can write the identifiers of the electronic devices selected in the screen projection application into the NFC tag through an NFC signal, thereby establishing a binding relationship between the NFC tag and those electronic devices.
  • the application framework layer 360 provides various application programming interfaces (APIs) and programming frameworks that may be used to construct applications of the application layer, so that developers can build their own application programs using these APIs. For example: window manager, content providers, view system, telephony manager, resource manager, notification manager, message manager, activity manager, package manager, location manager, and NFC services, etc.
  • a window manager can be used to manage window programs.
  • the window manager can obtain the size of the display screen, determine whether there is a status bar, lock the screen, capture the screen, etc.
  • the content provider can be used to store and obtain data, and make the data accessible to applications.
  • the data may include video, images, audio, calls made and received, browsing history and bookmarks, phone book, and the like.
  • a content provider may enable an application to access another application's data, such as a contacts database, or to share its own data.
  • the view system includes visual controls. For example, controls that display text and controls that display pictures, etc. View systems can be used to build applications.
  • a display interface can consist of one or more views.
  • the display interface including the short message notification icon may include a view for displaying text and a view for displaying pictures.
  • the phone manager is used to provide the communication function of the electronic device. For example, the management of call status (such as connecting, hanging up, etc.).
  • the resource manager can provide various resources for the application. For example, localized strings, icons, pictures, layout files, video files, etc.
  • the notification manager enables the application to display notification information in the status bar, which can be used to convey notification-type messages, and can disappear automatically after a short stay without user interaction.
  • the notification manager is used to notify download completion, message reminders, etc.
  • the notification manager can also display notifications in the status bar at the top of the system in the form of a graphic or scroll bar text.
  • a notification from an application running in the background may also appear on the screen in the form of a dialog window. For example, text information is prompted in the status bar, a prompt sound is issued, the electronic device vibrates, or the indicator light flashes.
  • the message manager can be used to store the data of the messages reported by each application program, and process the data reported by each application program.
  • the activity manager can be used to manage the application life cycle and provide the usual navigation and fallback functions.
  • the message manager may be part of the notification manager.
  • the application framework layer 360 may run an NFC service (NFC service).
  • the NFC service can be started to run in the application framework layer.
  • the NFC service can call the NFC driver at the kernel layer to read the binding relationship stored in the NFC tag, so as to obtain the screen-casting target device for this content screen-casting.
  • the NFC service can report the screen-casting event to the above-mentioned screen-casting application, thereby triggering the screen-casting application to send the content being displayed or played on the mobile phone as the screen-casting content to the screen-casting target device to start the content-casting process.
  • system runtime layer 340 provides main feature support for the Android system through some C/C++ libraries.
  • the SQLite library provides database support
  • the OpenGL/ES library provides 3D drawing support
  • the Webkit library provides browser kernel support.
  • An Android runtime library (Android Runtime) is also provided in the system runtime library layer 340; it mainly provides some core libraries that allow developers to write Android applications in the Java language.
  • the kernel layer 320 may provide underlying drivers for various hardware of electronic devices, such as display drivers, audio drivers, camera drivers, Bluetooth drivers, Wi-Fi drivers, power management, NFC drivers, UWB drivers, and the like.
  • first, the acquisition module 4101 in the screen projection source device 410 acquires the current picture of the running application and transmits it to the encoding module 4102; second, the encoding module 4102 encodes and compresses the picture into a data stream, and the sending module 4103 transmits the data stream to the receiving module 4201 in the screen projection target device 420; next, the decoding module 4202 decodes the data stream using a streaming media protocol and transmits the decoded data to the display module 4203; finally, the display module 4203 displays a full mirror image or a partial mirror image of the picture, so as to realize cross-device display of the picture.
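  • The acquire → encode → send → receive → decode → display pipeline above can be sketched end to end. The snippet below is an illustrative stand-in only: zlib compression substitutes for a real video codec, a plain list substitutes for the wireless transport, and the module names in the comments merely mirror the reference numerals in the text.

```python
from dataclasses import dataclass
import zlib

@dataclass
class Frame:
    width: int
    height: int
    pixels: bytes  # raw picture of the running application

def encode(frame: Frame) -> bytes:          # encoding module 4102
    header = f"{frame.width}x{frame.height}:".encode()
    return header + zlib.compress(frame.pixels)

def decode(stream: bytes) -> Frame:         # decoding module 4202
    header, _, body = stream.partition(b":")
    w, h = (int(v) for v in header.split(b"x"))
    return Frame(w, h, zlib.decompress(body))

def project(frame: Frame, channel: list) -> Frame:
    channel.append(encode(frame))           # sending module 4103
    received = channel.pop(0)               # receiving module 4201
    return decode(received)                 # handed to display module 4203

channel: list = []
src = Frame(4, 2, bytes(range(8)))
mirror = project(src, channel)              # mirror image shown on the target
```

A real implementation would stream many frames over Wi-Fi with a hardware codec; the sketch only shows that the mirror displayed by module 4203 reproduces the source picture.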
  • the embodiment of the present application also proposes performing input operations (such as zooming in, zooming out, scrolling, flipping, moving, etc.) on the picture displayed by the display module 4203 through the input module 4204 to generate an input operation instruction, and transmitting the input operation instruction to the input processing module 4205, which determines whether the input operation instruction is responded to by the screen projection target device 420 or by the screen projection source device 410. If the screen projection source device 410 is to respond, the input processing module 4205 performs adaptation processing on the input operation instruction and transmits the adapted operation instruction to the input injection module 4104 through the sending module 4206; the input injection module 4104 then performs the operation on the current picture of the application running on the screen projection source device 410.
  • in this way, the screen projection target device 420 can not only operate the projected picture by zooming or moving it, but can also operate the current picture of the application on the screen projection source device 410, so as to realize interactive operation of the application picture and improve the user experience of screen projection.
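  • The decision made by the input processing module 4205 can be sketched as a small routing function. The event names and the concrete rules below are illustrative assumptions distilled from the examples later in the text (clicks are forwarded; a move on the unmodified mirror is forwarded; other operations are handled locally), not the patent's definitive logic.

```python
def route(event_type: str, first_is_target_mirror: bool) -> str:
    """Decide which device responds to an input operation instruction.

    first_is_target_mirror -- True when the first picture currently shown
    equals the target picture (no local zoom/move has been applied yet).
    """
    if event_type == "click":
        # a click acts on the application itself, so it is adapted and
        # forwarded to the screen projection source device
        return "source"
    if event_type == "move" and first_is_target_mirror:
        # the unmodified mirror cannot be panned locally
        return "source"
    # zooming, or moving an already-zoomed picture, is handled locally
    return "target"
```

For example, `route("click", True)` forwards the event, while `route("move", False)` keeps it on the target device.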
  • the following embodiments of the present application will be specifically described in terms of method examples.
  • FIG. 5 is a schematic flowchart of a method for displaying pictures across devices provided by an embodiment of the present application, and the method includes:
  • the first picture may be a current mirror image of the running picture of the first application on the screen projection source device.
  • the first application in this embodiment of the present application may be an application program or media data running on the application layer of the screen projection source device, such as photos, videos, audios, games, gallery, documents, or multimedia.
  • the screen projection source device can run the first application on the front end, and can also run the first application on the back end.
  • the display screen of the screencasting source device can display the running picture of the first application; when the screencasting source device runs the first application at the back end, the display screen of the screencasting source device may not display the running picture of the first application, and the first application is instead executed in the background.
  • the picture displayed on the display screen of the screen-casting target device (ie, the first picture or the target picture) may be referred to as a mirror image of the picture displayed on the display screen of the screen-casting source device, and the mirror image may present the first picture.
  • the mirror image is the image presented after operations such as resolution adjustment, display size adjustment and image adjustment during the projection process.
  • the image currently displayed on the display screen of the screen-casting target device (ie, the first picture) may be the target image to be projected by the screen-casting source device for the running picture of the first application, or may be an image obtained after performing operations such as zooming in, zooming out, or moving on that target image. For ease of distinction, the image to be projected by the screen projection source device for the running picture of the first application is called the target picture, and the image currently displayed on the display screen of the screen projection target device is called the first picture.
  • the following embodiments of the present application take the case where the screen projection source device is a notebook computer and the screen projection target device is a mobile phone as an example.
  • Illustratively, as shown in FIG. 6 , FIG. 7 , FIG. 8 and FIG. 9 .
  • the user opens the photo "Sketched cat.jpg" through the "Gallery" application on the laptop, and uses the screen projection technology to project the current picture of the "Gallery" application on the laptop onto the mobile phone.
  • the target screen displayed on the display screen of the mobile phone is the same as the current screen of the "Gallery” application on the laptop.
  • the user can set the relevant screencasting options to cast only the "Sketched cat.jpg" photo opened by the "Gallery" application on the laptop to the mobile phone.
  • the target screen displayed on the display screen of the mobile phone is only a part of the current screen of the "Gallery" application.
  • the user can slide the touch screen of the mobile phone shown in Figure 7 to perform a zoom-in operation on the target picture displayed on the screen. When the mobile phone responds to the zoom-in operation, the display screen of the mobile phone displays the current picture, as shown in Figure 9.
  • similarly, the user can slide the touch screen of the mobile phone shown in Figure 7 to perform a zoom-out operation on the target picture. When the mobile phone responds to the zoom-out operation, the display screen of the mobile phone displays the current picture, as shown in Figure 11.
  • the input module 4204 in this embodiment of the present application needs to be determined according to the electronic device. If the electronic device is a device with a touch screen, such as a mobile phone or a smart watch, the input module may be the touch screen.
  • the user can perform input operations (such as zooming in, zooming out, scrolling, flipping, moving, etc.) on the displayed screen by sliding the touch screen.
  • the input module may be an external device (such as a mouse, a keyboard, a remote control, etc.).
  • the user can perform input operations on the displayed screen by operating the external device.
  • the external device is a mouse
  • the user can move the picture by pressing and holding the left mouse button, or zoom the picture in or out by sliding the mouse wheel up and down; this is not specifically limited.
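  • One hypothetical way to reconcile these heterogeneous input modules is to normalize raw touch gestures and mouse actions into the same abstract operations before the input processing module acts on them. The gesture/action pairs in the table below are assumptions for illustration, not mappings specified by the text.

```python
def normalize(device: str, action: str) -> str:
    """Map a device-specific input action to an abstract picture operation."""
    table = {
        ("touchscreen", "pinch-out"): "zoom-in",
        ("touchscreen", "pinch-in"): "zoom-out",
        ("touchscreen", "slide"): "move",
        ("mouse", "wheel-up"): "zoom-in",      # sliding the wheel up
        ("mouse", "wheel-down"): "zoom-out",   # sliding the wheel down
        ("mouse", "left-drag"): "move",        # holding the left button
    }
    return table[(device, action)]
```

With this normalization, later decision logic can reason about "zoom-in" or "move" regardless of whether the target device has a touch screen or an external mouse.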
  • an input operation may be performed on the first screen displayed by the screen projection target device through the input module 4204 to generate a first operation event.
  • the first operation event may be an operation of moving the first picture in the first direction on the display screen of the screen projection target device, an operation of clicking the first picture on the display screen of the screen projection target device, or an operation of zooming in or zooming out the first picture on the display screen of the screen projection target device; this is not specifically limited.
  • determining whether to respond to the first instruction corresponding to the first operation event at the local end according to the execution action of the first operation event on the first picture may include the following steps: if the first operation event is an operation of moving the first picture in the first direction, determining whether the local end responds to the first instruction corresponding to the first operation event according to the first picture and the target picture.
  • the target image may be a target image to be projected by the screen projection source device for the running image of the first application.
  • the image currently displayed on the display screen of the screen-casting target device (ie, the first picture) may be either the target picture itself or a picture obtained after performing operations such as zooming in, zooming out, or moving on the target picture.
  • the embodiment of the present application considers both the first picture and the target picture to determine whether to move the first picture in the first direction on the display screen of the screen projection target device.
  • since the screen projection target device can directly call internal related modules (such as the GPU, an image processing module, etc.) to compare the first picture with the target picture, the processing time required to determine whether to respond to the first instruction corresponding to the first operation event at the local end is very short, which is conducive to improving the processing efficiency of the user's interactive operations on the picture and improving the screen projection experience.
  • the target screen presents the entire screen of the running screen of the first application; or, the target screen only presents a partial screen of the running screen of the first application.
  • the target picture displayed on the display screen of the screen projection target device can be either the entire picture to be projected by the source device for the running picture of the first application (as shown in Figure 6), or a part of the picture to be projected by the source device for the running picture of the first application (as shown in FIG. 7 ). It can be seen that the embodiments of the present application are beneficial to realizing diverse interactive operations on the application picture and improving the screen projection experience.
  • the first direction may be used to represent any direction in the plane where the current display screen of the screen projection target device is located. For example, the first direction may be horizontally left, horizontally right, horizontally downward, horizontally upward, or horizontally oblique, parallel to the plane where the display screen of the screen projection target device is located; or it may be an arc to the left, an arc to the right, an arc downward, an arc upward, or an arc oblique, within the plane where the display screen of the screen projection target device is located. In addition, the plane on which the display screen of the mobile phone is located may be vertical (or parallel, oblique, etc.) to the ground. It can be seen that the first direction in the embodiment of the present application needs to be specifically analyzed according to the plane where the current display screen of the screen projection target device is located.
  • the method further includes the following steps: if the local end responds to the first operation instruction, moving the first picture in the first direction to display the second picture.
  • the second picture is a mirror image of the running picture of the first application that is different from the first picture.
  • the user can, by sliding the touch screen of the mobile phone shown in FIG. 9 , perform an operation of moving the current picture (ie, the first picture) on the display screen horizontally to the left along the plane parallel to the display screen of the mobile phone. After the mobile phone responds to the moving operation, the current picture (ie, the second picture) is displayed on the display screen of the mobile phone, as shown in FIG. 13 .
  • judging whether to respond to the first instruction corresponding to the first operation event at the local end according to the first picture and the target picture may include the following steps: if the first picture and the target picture are different pictures, determining whether the local end responds to the first instruction corresponding to the first operation event according to the relationship between the first picture and the target picture; or, if the first picture and the target picture are the same picture, not responding at the local end to the first instruction corresponding to the first operation event.
  • that the first picture and the target picture are different pictures can be understood as meaning that the user has performed an input operation on the target picture displayed on the display screen of the screen projection target device, so that the currently displayed picture differs from the target picture (as shown in Figure 9). At this time, the embodiment of the present application considers the relationship between the first picture and the target picture (that is, whether the picture has been zoomed in, zoomed out, and/or moved) to determine whether to move the first picture in the first direction on the display screen of the screen projection target device.
  • the embodiments of the present application are beneficial to realize the diversity of interactive operations on the screen of the application, and improve the use experience of screen projection.
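  • The rule above can be sketched as a comparison on the two pictures. In the sketch below each picture is reduced to a zoom scale relative to the target picture, which is an assumed simplification; the branch outcomes follow the cases described in the text (same picture forwarded to the source, enlarged partial picture handled locally subject to an edge check, reduced full picture forwarded).

```python
def local_end_responds(first_scale: float, target_scale: float = 1.0) -> bool:
    """True if the screen projection target device itself answers a move."""
    if first_scale == target_scale:
        return False   # same picture: adapt the event and forward to the source
    if first_scale > target_scale:
        return True    # enlarged partial picture: pan locally (within edges)
    return False       # reduced full picture: forward to the source
```

For instance, after the user zooms the mirror to twice its size, `local_end_responds(2.0)` keeps the move on the phone.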
  • the following takes the case where the screen projection source device is a laptop computer and the screen projection target device is a mobile phone as an example.
  • if the first picture and the target picture are different pictures, the mobile phone responds to the first instruction corresponding to the first operation event; for example, the mobile phone responds to the moving operation and displays another part of the picture (ie, the second picture) on its display screen. If the first picture and the target picture are the same picture, the mobile phone does not respond to the first instruction corresponding to the first operation event; instead, the mobile phone needs to perform adaptation processing on the first operation event and transmit the adapted operation instruction (ie, the second instruction corresponding to the first operation event) to the notebook computer for execution.
  • the reason for adapting the first operation event is as follows. First, since the input module of the mobile phone may be a touch screen while the input module of the notebook computer may be an external mouse, an input operation generated through the touch screen needs to be adapted to the characteristics of mouse input. Second, because the display screen of the mobile phone and the display screen of the notebook computer often differ in size, an input operation generated by sliding a certain length, moving in a certain direction, or clicking a certain position on the touch screen of the mobile phone cannot be reproduced on the notebook computer with the same length, direction, or position, so corresponding adaptation processing needs to be performed on the input operation generated by the mobile phone. In addition, since the mobile phone and the notebook computer may be equipped with different operating systems, software and hardware architectures, and so on, corresponding adaptation processing is also required.
  • for example, the user can, by sliding the touch screen of the mobile phone shown in FIG. 7 , perform an operation of moving the current picture (ie, the first picture) on the display screen horizontally to the right along the plane parallel to the display screen of the mobile phone. Since the current picture and the target picture are the same picture, the mobile phone does not respond to the instruction corresponding to the moving operation. At this time, the mobile phone performs adaptation processing on the moving operation, and transmits the instruction corresponding to the adapted moving operation (ie, the second instruction) to the notebook computer for execution. After responding to that instruction, the laptop switches the "Sketched cat.jpg" opened by the "Gallery" application to "Real cat.jpg", and the picture of "Real cat.jpg" is synchronized to the mobile phone, as shown in Figure 15. In this way, the current picture of the "Gallery" application on the laptop can be switched directly through the mobile phone, realizing interactive operation of the application picture and improving the user experience of screen projection.
  • judging whether to respond to the instruction corresponding to the first operation event at the local end according to the relationship between the first picture and the target picture may include the following steps: if the first picture is an enlarged partial picture of the target picture, determining whether the local end responds to the first operation instruction according to whether the first picture, when moved in the first direction, goes beyond the edge of the target picture; or, if the first picture is a reduced full picture of the target picture, not responding at the local end to the first instruction corresponding to the first operation event.
  • that is, the relationship between the first picture and the target picture is determined according to whether the first picture is an enlarged partial picture of the target picture or a reduced full picture of the target picture, so as to determine whether to move the first picture in the first direction on the display screen of the screen projection target device, thereby realizing interactive operation of the application picture.
  • for example, the current picture displayed by the mobile phone may be a partial picture of the target picture after enlargement. At this time, whether the mobile phone responds to the first operation instruction needs to be determined according to whether moving the current picture in the first direction goes beyond the edge of the target picture.
  • alternatively, the current picture displayed by the mobile phone may be the full picture of the target picture after reduction. In this case, the mobile phone will not respond to the instruction corresponding to the first operation event; instead, the notebook computer needs to respond to the adapted instruction after the first operation event has been adapted.
  • specifically, the following steps may be included: if the first picture, when moved in the first direction, goes beyond the edge of the target picture, not responding at the local end to the first instruction corresponding to the first operation event; or, if the first picture, when moved in the first direction, does not go beyond the edge of the target picture, responding at the local end to the first instruction corresponding to the first operation event.
  • for example, the user can, by sliding the touch screen of the mobile phone shown in FIG. 9 , move the current picture on the display screen horizontally to the left along the plane parallel to the display screen of the mobile phone. Since moving the current picture horizontally to the left does not go beyond the edge of the target picture, the mobile phone can respond to the instruction corresponding to the moving operation, and the current picture displayed on the display screen of the mobile phone is shown in FIG. 13 .
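  • The edge test for moving an enlarged partial picture can be sketched as a bounds check on the visible window over the target picture. The one-dimensional coordinate convention below (window position and widths in target-picture pixels) is an assumption for illustration.

```python
def can_pan(view_x: int, view_w: int, target_w: int, dx: int) -> bool:
    """True if shifting the visible window by dx stays inside the target.

    view_x   -- left edge of the visible (enlarged) window in target coords
    view_w   -- width of the visible window in target coords
    target_w -- width of the target picture
    dx       -- requested shift (negative = pan left, positive = pan right)
    """
    new_x = view_x + dx
    return 0 <= new_x and new_x + view_w <= target_w
```

When `can_pan` is true the local end responds and pans the picture; when false, the move exceeds the target picture's edge and the local end does not respond (prompting the user or forwarding the event instead).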
  • the method further includes the following steps: displaying a target window on the display screen of the screen projection target device, the target window being used to prompt that the first picture cannot be moved in the first direction; or, calling the sensing module of the screen projection target device to make the screen projection target device vibrate; or, calling the speaker of the screen projection target device to make the screen projection target device emit a sound.
  • that is, when the first picture cannot be moved in the first direction, the embodiment of the present application considers that the screen-casting target device can remind the user by means of a pop-up window, vibration, or sound, so as to realize interactive operation of the application picture and improve the screen projection experience.
  • for example, the user moves the current picture on the display screen of the mobile phone horizontally to the right along the plane parallel to the display screen by sliding the touch screen of the mobile phone shown in FIG. 9 . Since moving the current picture horizontally to the right would go beyond the edge of the target picture, a window pops up on the screen of the mobile phone, indicating that the current picture cannot be moved horizontally and displaying "Whether to view the previous photo", as shown in Figure 17(b). When the user selects the "Yes" option, the mobile phone adapts the related operations (the moving operation or the "Yes" option operation, etc.) and transmits them to the notebook computer, which controls the "Gallery" application to view the previous photo. Finally, the notebook computer projects the current picture of the "Gallery" application to the mobile phone, so that the previous photo can be viewed on the mobile phone, improving the screen projection experience.
  • determining whether to respond to the first instruction corresponding to the first operation event at the local end according to the execution action of the first operation event on the first picture may include the following steps: if the first operation event is an operation of clicking the first picture, not responding at the local end to the first instruction corresponding to the first operation event.
  • that is, when the first operation event is a click on the first picture, the embodiment of the present application considers that the local end does not respond to the first instruction corresponding to the first operation event, thereby helping to improve the processing efficiency of the user's interactive operations on the picture and improving the screen projection experience.
  • the user generates an input operation by clicking the position of the "X" on the display screen of the mobile phone shown in FIG. 6 .
  • the mobile phone does not respond to the instruction corresponding to the input operation, but performs adaptation processing on the input operation, and transmits the adapted operation instruction (ie, the second instruction) to the notebook computer for execution.
  • the laptop closes the "Gallery" application, and ends the screen projection from the laptop to the mobile phone, as shown in Figure 19.
  • the second operation instruction may be used to instruct the screen projection source device to perform an operation on the running screen of the first application.
  • before sending the second instruction corresponding to the first operation event to the screen projection source device, the method may further include the following step: determining the second instruction corresponding to the first operation event according to the first operation event and a preset operation mapping relationship.
  • the preset operation mapping relationship is used to represent the mapping relationship between the operation for the first screen and the operation for the running screen of the first application.
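  • A preset operation mapping relationship of this kind can be sketched as a lookup table from operations on the first picture to operations on the running picture of the first application. The concrete pairs below are assumptions for illustration, not a table given in the text.

```python
# hypothetical mapping from target-device gestures to source-device operations
PRESET_OPERATION_MAP = {
    "tap": "mouse-left-click",
    "double-tap": "mouse-double-click",
    "long-press": "mouse-right-click",
    "slide": "mouse-drag",
}

def second_instruction(first_operation: str) -> str:
    """Translate an operation on the first picture for the source device."""
    return PRESET_OPERATION_MAP[first_operation]
```

For example, `second_instruction("tap")` yields the mouse operation the notebook computer would inject into the running application.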
  • since the screen projection source device and the screen projection target device may be equipped with different operating systems, software and hardware architectures, and so on, a mapping relationship between operations on the projected picture and operations on the picture of the application on the screen-casting source device may be established in the process of establishing the screen projection between the two devices, in order to adapt the input operations of the screen projection target device to the screen projection source device. This ensures that the screen projection target device can operate on the running picture of the application on the screen projection source device, realizing interactive operation of the application picture.
  • Manner 2: the following embodiments of the present application take as an example another way in which the screen projection target device determines the second instruction corresponding to the first operation event.
  • the method may further include the following steps: acquiring a first zoom ratio and a second zoom ratio; and determining the second instruction corresponding to the first operation event according to the first zoom ratio, the second zoom ratio, and the first operation instruction.
  • the first zoom ratio is used to represent the zoom ratio between the display size of the running picture of the first application and the display size of the target picture; the second zoom ratio is used to represent the zoom ratio between the display size of the first picture and the display size of the target picture; and the target picture is the initial mirror image of the running picture of the first application on the display screen of the screen projection target device.
  • that is, the embodiment of the present application needs to pre-determine the zoom ratio between the display size of the running picture of the first application and the display size of the target picture, as well as the zoom ratio between the display size of the first picture and the display size of the target picture.
  • for example, the display size of "Sketched cat.jpg" displayed by the "Gallery" application on the notebook computer is different from the display size of the target picture displayed on the display screen of the mobile phone, so the zoom ratio between them (ie, the first zoom ratio) needs to be calculated and stored in the internal memory of the mobile phone. Then, for the convenience of viewing the picture, the user enlarges the target picture displayed on the display screen of the mobile phone shown in FIG. 6 (that is, (a) of FIG. ) to obtain the current picture (ie, the first picture) displayed on the display screen of the mobile phone. At this time, the zoom ratio between the display size of the current picture and the display size of the target picture (ie, the second zoom ratio) also needs to be calculated and stored in the internal memory of the mobile phone.
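  • Using the two zoom ratios, a position on the first picture can be mapped back onto the running picture of the first application on the source device. The composition below (divide by the second zoom ratio to recover target-picture coordinates, then by the first zoom ratio to recover source coordinates, with an optional pan offset) is an assumed implementation sketch, not the patent's formula.

```python
def adapt_point(x: float, y: float, r1: float, r2: float,
                offset=(0.0, 0.0)) -> tuple:
    """Translate a point on the first picture into source-picture coordinates.

    r1 -- first zoom ratio: target picture size / running picture size
    r2 -- second zoom ratio: first picture size / target picture size
    offset -- pan offset of the first picture within the target picture
    """
    ox, oy = offset
    tx, ty = (x / r2) + ox, (y / r2) + oy   # back to target-picture coords
    return (tx / r1, ty / r1)               # then to the running picture
```

For example, a tap at (200, 100) on a 2x-zoomed first picture whose target mirror is shown at half the source size maps back to (200.0, 100.0) on the source.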
  • when the user clicks the position of the "X" on the display screen of the mobile phone as shown in FIG. 20(b) to generate an input operation, the mobile phone does not respond to the input operation at the local end, but performs adaptation processing according to the first scaling ratio, the second scaling ratio, and the input operation, and transmits the adapted operation instruction to the notebook computer for execution. The notebook computer then closes the "Gallery" application in response to the adapted operation instruction, ending the screen projection from the notebook computer to the mobile phone.
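  • The adaptation processing described above can be sketched as follows. This is a minimal, hypothetical illustration (the function name, parameters, and coordinate model are assumptions, not the patent's actual implementation): a click on the first picture is mapped back into the coordinate space of the running picture on the screen-casting source device using the two zoom ratios.

```python
def adapt_click(click_x, click_y, first_ratio, second_ratio,
                offset_x=0.0, offset_y=0.0):
    """Map a click on the first picture to a point on the running picture.

    first_ratio  -- first zoom ratio: running-picture size / target-picture size
    second_ratio -- second zoom ratio: first-picture size / target-picture size
    offset_x/y   -- top-left corner of the (possibly moved) first picture,
                    expressed in target-picture coordinates (an assumption)
    """
    # Undo the user's local zoom to obtain target-picture coordinates...
    target_x = click_x / second_ratio + offset_x
    target_y = click_y / second_ratio + offset_y
    # ...then undo the projection scaling to obtain source-side coordinates.
    return (target_x * first_ratio, target_y * first_ratio)
```

With both ratios equal to 1, the click passes through unchanged; with a 2x local zoom (second_ratio = 2.0), a click at (100, 50) on the first picture first lands at (50, 25) on the target picture before the first ratio is applied.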
  • the method further includes the following step: displaying a third picture, where the third picture is a mirror image of the running picture of the first application after the screen-casting source device responds to the second instruction corresponding to the first operation event, the mirror image being different from the first picture.
  • the screen-casting source device displays the refreshed, changed, or switched running picture of the first application, and casts the refreshed, changed, or switched running picture of the first application to the screen projection target device, so as to realize synchronous display on the screen projection target device.
  • the specific implementation of cross-device display of pictures is exemplified by taking the screen projection source device as a notebook computer and the screen projection target device as a mobile phone. By zooming in, zooming out, moving, or clicking the current picture displayed on the display screen of the mobile phone, it is determined whether the input operation should be executed on the mobile phone, or should be adapted and transmitted to the notebook computer. In this way, the user can operate the projected picture on the display screen of the mobile phone by zooming, moving, or clicking, and can also control the running picture of the "Gallery" application on the notebook computer (for example, to view different photos), thereby realizing cross-device interactive control of the application picture, improving the processing efficiency of interactive operations on the picture, and improving the screen projection experience.
  • the present application can realize cross-device display of pictures between different devices, operating systems, or software and hardware architectures.
  • applications running on a laptop can include document editors, video players, music players, and more.
  • through the mobile phone, the user can edit a document running on the notebook computer, switch the video played on the notebook computer, or switch the music played on the notebook computer, which is not specifically limited here.
  • the screen projection target device displays the first picture and acquires the first operation event for the first picture. Then, according to the execution action of the first operation event on the first picture, it determines whether to respond to the first instruction corresponding to the first operation event at the local end. If the local end does not respond to the first instruction, the second instruction corresponding to the first operation event is sent to the screen projection source device.
  • since the first picture is the current mirror image, on the display screen of the screen-casting target device, of the running picture of the first application on the screen-casting source device, and the second instruction corresponding to the first operation event is used to instruct the screen-casting source device to execute the operation on the running picture of the first application, cross-device display of the picture is realized.
  • since the screen projection target device needs to determine whether to respond to the first instruction corresponding to the first operation event at the local end, the screen projection target device may or may not respond locally. Therefore, the present application can not only respond to operations on the projected picture at the screen projection target device, but also control the running picture of the first application on the screen-casting source device through adaptation processing, thereby realizing cross-device interactive control of the application picture, improving the processing efficiency of interactive operations on the picture, and improving the screen projection experience.
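  • The decision flow on the screen projection target device can be sketched roughly as below. This is a hedged illustration only; the callback names are invented for this sketch and do not appear in the patent: the target device either responds to the event locally or adapts it into a second instruction and forwards it to the source device.

```python
def handle_operation_event(event, respond_locally, apply_locally,
                           adapt, send_to_source):
    """Dispatch a first operation event on the screen projection target device.

    respond_locally(event) -- predicate: should the local end respond?
    apply_locally(event)   -- e.g. move the first picture to show the second picture
    adapt(event)           -- build the second instruction for the source device
    send_to_source(instr)  -- transmit the second instruction to the source device
    """
    if respond_locally(event):
        apply_locally(event)
        return "local"
    send_to_source(adapt(event))
    return "forwarded"
```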
  • FIG. 21 is a schematic flowchart of another screen cross-device display method provided by an embodiment of the present application. The method includes:
  • the first picture may be a current mirror image of the running picture of the first application on the screen projection source device.
  • the first application in this embodiment of the present application may be an application program or media data running on the application layer of the screen projection source device, such as photos, videos, audios, games, gallery, documents, or multimedia.
  • the screen projection source device can run the first application on the front end, and can also run the first application on the back end.
  • when the screen-casting source device runs the first application at the front end, the display screen of the screen-casting source device displays the running picture of the first application; when the screen-casting source device runs the first application at the back end, the display screen of the screen-casting source device may not display the running picture of the first application, and the first application is instead executed in the background.
  • the picture displayed on the display screen of the screen-casting target device may be referred to as a mirror image (i.e., the first picture or the target picture) of the picture displayed on the display screen of the screen-casting source device, and the mirror image may present the running picture of the first application.
  • the mirror image is already the image presented after the relevant resolution adjustment, display size adjustment, image adjustment and other operations during the projection process.
  • the picture currently displayed on the display screen of the screen-casting target device (i.e., the first picture) may be the picture to be projected by the screen-casting source device for the running picture of the first application (i.e., the target picture), or may be a picture obtained after operations such as zooming in, zooming out, or moving the target picture. For ease of description, the picture to be projected by the screen projection source device for the running picture of the first application is called the target picture, and the picture currently displayed on the display screen of the screen projection target device is called the first picture.
  • an input operation may be performed on the first screen displayed by the screen projection target device through the input module 4204 to generate a first operation event.
  • the first operation event may be an operation of moving the first picture in a first direction on the display screen of the screen projection target device, an operation of clicking the first picture on the display screen of the screen projection target device, or an operation of zooming in or zooming out the first picture on the display screen of the screen projection target device, which is not specifically limited here.
  • if the first operation event is an operation of moving the first picture in the first direction, whether to respond to the first instruction corresponding to the first operation event at the local end is determined according to the first picture and the target picture, where the target picture is the picture to be projected by the screen projection source device for the running picture of the first application.
  • the target screen presents the entire screen of the running screen of the first application; or, the target screen only presents a partial screen of the running screen of the first application.
  • the first direction may be used to represent any direction of the plane where the current display screen of the screen projection target device is located.
  • for example, the first direction may be horizontally leftward, horizontally rightward, horizontally downward, horizontally upward, horizontally obliquely upward, or horizontally obliquely leftward in the plane where the display screen of the screen projection target device is located; or it may be an arc leftward, arc rightward, arc downward, arc upward, arc obliquely upward, or arc obliquely leftward in that plane, and so on.
  • judging whether to respond to the first instruction corresponding to the first operation event at the local end according to the first picture and the target picture may include the following steps: if the first picture and the target picture are different pictures, judging whether to respond to the first instruction corresponding to the first operation event at the local end according to the relationship between the first picture and the target picture; or, if the first picture and the target picture are the same picture, not responding to the first instruction corresponding to the first operation event at the local end.
  • judging whether to respond to the first instruction corresponding to the first operation event at the local end according to the relationship between the first picture and the target picture may include the following steps: if the first picture is an enlarged partial picture of the target picture, judging whether to respond to the first instruction corresponding to the first operation event at the local end according to whether the first picture moves in the first direction beyond the edge of the target picture; or, if the first picture is the full picture of the target picture after being reduced, not responding to the first instruction corresponding to the first operation event at the local end.
  • judging whether to respond to the first instruction corresponding to the first operation event at the local end according to whether the first picture moves in the first direction beyond the edge of the target picture may include the following operations: if the first picture moves in the first direction beyond the edge of the target picture, the local end does not respond to the first instruction corresponding to the first operation event; or, if the first picture moves in the first direction without exceeding the edge of the target picture, the local end responds to the first instruction corresponding to the first operation event.
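  • The edge-based judgment above can be sketched as follows (illustrative only; the viewport model and all names are assumptions). The first picture is modeled as a viewport over the target picture: the move is handled locally only while the viewport stays inside the target picture's edges.

```python
def respond_locally_to_move(dx, view_left, view_right, target_width,
                            first_is_enlarged_part):
    """Decide whether a horizontal move of the first picture is handled locally.

    view_left/view_right   -- current viewport edges in target-picture coordinates
    first_is_enlarged_part -- True if the first picture is an enlarged partial
                              picture of the target picture
    """
    if not first_is_enlarged_part:
        # Same picture, or the reduced full picture: never respond locally.
        return False
    new_left, new_right = view_left + dx, view_right + dx
    # Respond locally only if the move stays within the target picture's edges.
    return new_left >= 0 and new_right <= target_width
```

A move that would push the viewport past an edge is instead adapted and forwarded to the source device, for example to switch to the next photo.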
  • the second picture is a mirror image of the running picture of the first application that is different from the first picture.
  • the second instruction corresponding to the first operation event is used to instruct the screen projection source device to perform an operation on the running screen of the first application.
  • the third picture is a mirror image of the running picture of the first application after the screen projection source device responds to the second instruction corresponding to the first operation event, and is different from the first picture.
  • the screen-casting source device displays the refreshed, changed, or switched running picture of the first application, and casts the refreshed, changed, or switched running picture of the first application to the screen projection target device, so as to realize synchronous display on the screen projection target device.
  • in this embodiment, the screen projection target device displays the first picture and acquires the first operation event for the first picture. Secondly, if the first operation event is an operation of moving the first picture in the first direction, whether to respond to the first instruction corresponding to the first operation event at the local end is determined according to the first picture and the target picture. If the local end responds to the first instruction corresponding to the first operation event, the first picture is moved in the first direction to display the second picture; otherwise, the second instruction corresponding to the first operation event is determined according to the first operation event and the preset operation mapping relationship, the second instruction corresponding to the first operation event is sent to the screen projection source device, and the third picture is displayed. Since the first picture is the current mirror image, on the display screen of the screen-casting target device, of the running picture of the first application on the screen-casting source device, and the second instruction corresponding to the first operation event is used to instruct the screen-casting source device to execute the operation on the running picture of the first application, cross-device display of the picture is realized.
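  • The preset operation mapping relationship can be pictured as a simple lookup table, sketched below. The table entries and names are purely illustrative assumptions (e.g. a move past the edge of a Gallery photo becoming a "switch photo" instruction on the source device), not the patent's actual mapping:

```python
# Hypothetical preset operation mapping: (operation on the first picture) ->
# (operation on the running picture of the first application).
PRESET_OPERATION_MAP = {
    ("move", "left"):   "show_next_item",      # e.g. view the next photo
    ("move", "right"):  "show_previous_item",  # e.g. view the previous photo
    ("click", "close"): "close_application",   # e.g. clicking the window's "X"
}

def second_instruction_for(operation, detail):
    """Look up the second instruction for an event not handled locally."""
    return PRESET_OPERATION_MAP.get((operation, detail))
```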
  • since the screen projection target device needs to determine whether to respond to the first instruction corresponding to the first operation event at the local end, the screen projection target device may or may not respond locally. Therefore, the present application can not only respond to operations on the projected picture at the screen projection target device, but also control the running picture of the first application on the screen-casting source device through adaptation processing, thereby realizing cross-device interactive control of the application picture, improving the processing efficiency of interactive operations on the picture, and improving the screen projection experience.
  • FIG. 22 is a schematic flowchart of another method for displaying pictures across devices provided by an embodiment of the present application. The method includes:
  • the first picture may be a current mirror image of the running picture of the first application on the screen projection source device.
  • the first application in this embodiment of the present application may be an application program or media data running on the application layer of the screen projection source device, such as photos, videos, audios, games, gallery, documents, or multimedia.
  • the screen projection source device can run the first application on the front end, and can also run the first application on the back end.
  • when the screen-casting source device runs the first application at the front end, the display screen of the screen-casting source device displays the running picture of the first application; when the screen-casting source device runs the first application at the back end, the display screen of the screen-casting source device may not display the running picture of the first application, and the first application is instead executed in the background.
  • the picture displayed on the display screen of the screen-casting target device may be referred to as a mirror image (i.e., the first picture or the target picture) of the picture displayed on the display screen of the screen-casting source device, and the mirror image may present the running picture of the first application.
  • the mirror image is already the image presented after the relevant resolution adjustment, display size adjustment, image adjustment and other operations during the projection process.
  • the picture currently displayed on the display screen of the screen-casting target device (i.e., the first picture) may be the picture to be projected by the screen-casting source device for the running picture of the first application (i.e., the target picture), or may be a picture obtained after operations such as zooming in, zooming out, or moving the target picture. For ease of description, the picture to be projected by the screen projection source device for the running picture of the first application is called the target picture, and the picture currently displayed on the display screen of the screen projection target device is called the first picture.
  • an input operation may be performed on the first screen displayed by the screen projection target device through the input module 4204 to generate a first operation event.
  • the first operation event may be an operation of moving the first picture in a first direction on the display screen of the screen projection target device, an operation of clicking the first picture on the display screen of the screen projection target device, or an operation of zooming in or zooming out the first picture on the display screen of the screen projection target device, which is not specifically limited here.
  • if the first operation event is an operation of clicking the first picture, the local end does not respond to the first instruction corresponding to the first operation event.
  • the embodiment of the present application considers that the local end does not respond to the first instruction corresponding to the first operation event, thereby helping to improve the processing efficiency of the user's interactive operations on the picture and improving the screen projection experience.
  • the first zoom ratio is used to represent the zoom ratio between the display size of the running screen of the first application and the display size of the target screen
  • the second zoom ratio is used to represent the zoom ratio between the display size of the first screen and the display size of the target screen
  • the target picture is the initial mirror image of the running picture of the first application on the display screen of the screen projection target device.
  • the embodiment of the present application needs to pre-determine the scaling ratio between the display size of the running screen of the first application and the display size of the target screen, as well as the scaling ratio between the display size of the first screen and the display size of the target screen.
  • the second instruction corresponding to the first operation event is used to instruct the screen projection source device to perform an operation on the running screen of the first application.
  • the third picture is a mirror image of the running picture of the first application after the screen projection source device responds to the second instruction corresponding to the first operation event, and is different from the first picture.
  • the screen-casting source device displays the refreshed, changed, or switched running picture of the first application, and casts the refreshed, changed, or switched running picture of the first application to the screen projection target device, so as to realize synchronous display on the screen projection target device.
  • the screen projection target device displays the first picture and acquires the first operation event for the first picture. Secondly, if the first operation event is an operation of clicking the first picture, the local end does not respond to the first instruction corresponding to the first operation event. Thirdly, the first scaling ratio and the second scaling ratio are acquired, and the second instruction corresponding to the first operation event is determined according to the first scaling ratio, the second scaling ratio, and the first operation event. Finally, the second instruction corresponding to the first operation event is sent to the screen projection source device, and the third picture is displayed.
  • since the first picture is the current mirror image, on the display screen of the screen-casting target device, of the running picture of the first application on the screen-casting source device, and the second instruction corresponding to the first operation event is used to instruct the screen-casting source device to execute the operation on the running picture of the first application, cross-device display of the picture is realized.
  • since the screen projection target device needs to determine whether to respond to the first instruction corresponding to the first operation event at the local end, the screen projection target device may or may not respond locally. Therefore, the present application can not only respond to operations on the projected picture at the screen projection target device, but also control the running picture of the first application on the screen-casting source device through adaptation processing, thereby realizing cross-device interactive control of the application picture, improving the processing efficiency of interactive operations on the picture, and improving the screen projection experience.
  • the electronic device includes corresponding hardware structures and/or software modules for executing each function.
  • the present application can be implemented in hardware or in the form of a combination of hardware and computer software, in combination with the units and algorithm steps of each example described in the embodiments provided herein. Whether a function is performed by hardware or computer software driving hardware depends on the specific application and design constraints of the technical solution. Skilled artisans may implement the described functionality using different methods for each particular application, but such implementations should not be considered beyond the scope of this application.
  • the electronic device may be divided into functional units according to the foregoing method examples.
  • each functional unit may be divided corresponding to each function, or two or more functions may be integrated into one processing unit.
  • the above-mentioned integrated units may be implemented in the form of hardware, or may be implemented in the form of software functional units. It should be noted that, the division of units in the embodiments of the present application is illustrative, and is only a logical function division, and there may be other division manners in actual implementation.
  • FIG. 23 shows a block diagram of the functional units of a screen cross-device display apparatus.
  • the screen cross-device display apparatus 2300 is applied to the screen projection target device, and specifically includes: a processing unit 2320 and a communication unit 2330 .
  • the processing unit 2320 is used to control and manage the actions of the screen-casting target device, for example, the processing unit 2320 is used to support the screen-casting target device to perform some or all of the steps in FIG. 5 , and other processes for the techniques described herein.
  • the communication unit 2330 is used to support the communication between the screen projection target device and other devices.
  • the apparatus 2300 for displaying pictures across devices may further include a storage unit 2310 for storing program codes and data of the screen projection target device.
  • the processing unit 2320 may be a processor or a controller, such as a CPU, a general-purpose processor, a DSP, an ASIC, an FPGA, a transistor logic device, a hardware component, or any combination thereof. It may implement or execute various exemplary logical blocks, modules and circuits described in connection with the embodiments of the present application. In addition, the processing unit 2320 may also be a combination that implements computing functions, such as a combination of one or more microprocessors, a combination of a DSP and a microprocessor.
  • the communication unit 2330 may be a communication interface, a transceiver, a transceiver circuit, and the like.
  • the storage unit 2310 may be a memory. When the processing unit 2320 is a processor, the communication unit 2330 is a communication interface, and the storage unit 2310 is a memory, the cross-device display apparatus 2300 involved in this embodiment of the present application may be the electronic device shown in FIG. 24 .
  • the processing unit 2320 is configured to perform any step performed by the screen projection target device in the above method embodiments, and when performing data transmission such as sending, the communication unit 2330 can be selectively invoked to complete corresponding operations. A detailed description will be given below.
  • the processing unit 2320 is configured to: display the first picture, where the first picture is the current mirror image of the running picture of the first application on the screen projection source device; acquire the first operation event for the first picture; determine, according to the execution action of the first operation event on the first picture, whether to respond to the first instruction corresponding to the first operation event at the local end; and if the local end does not respond to the first instruction corresponding to the first operation event, send the second instruction corresponding to the first operation event to the screen projection source device, where the second instruction corresponding to the first operation event is used to instruct the screen projection source device to execute the operation on the running picture of the first application.
  • the processing unit 2320 is specifically configured to: if the first operation event is an operation of moving the first picture in the first direction, determine, according to the first picture and the target picture, whether to respond to the first instruction corresponding to the first operation event at the local end, where the target picture is the picture to be projected by the screen projection source device for the running picture of the first application.
  • the processing unit 2320 is specifically configured to: if the first picture and the target picture are different pictures, judge whether to respond to the first instruction corresponding to the first operation event at the local end according to the relationship between the first picture and the target picture; or, if the first picture and the target picture are the same picture, not respond to the first instruction corresponding to the first operation event at the local end.
  • the processing unit 2320 is specifically configured to: if the first picture is an enlarged partial picture of the target picture, judge whether to respond to the first instruction corresponding to the first operation event at the local end according to whether the first picture moves in the first direction beyond the edge of the target picture; or, if the first picture is the full picture of the target picture after being reduced, not respond to the first instruction corresponding to the first operation event at the local end.
  • the processing unit 2320 is specifically configured to: if the first picture moves in the first direction beyond the edge of the target picture, not respond to the first instruction corresponding to the first operation event at the local end; or, if the first picture moves in the first direction without exceeding the edge of the target picture, respond to the first instruction corresponding to the first operation event at the local end.
  • before sending the second instruction corresponding to the first operation event to the screen projection source device, the processing unit 2320 is further configured to: determine the second instruction corresponding to the first operation event according to the first operation event and the preset operation mapping relationship, where the preset operation mapping relationship is used to represent the mapping relationship between operations on the first picture and operations on the running picture of the first application.
  • the processing unit 2320 is further configured to: if the local end responds to the first instruction corresponding to the first operation event, move the first picture in the first direction to display the second picture, where the second picture is a mirror image of the running picture of the first application that is different from the first picture.
  • the processing unit 2320 is specifically configured to: if the first operation event is an operation of clicking the first picture, not respond to the first instruction corresponding to the first operation event at the local end.
  • before sending the second instruction corresponding to the first operation event to the screen projection source device, the processing unit 2320 is further configured to: acquire the first scaling ratio and the second scaling ratio, where the first scaling ratio is used to represent the scaling ratio between the display size of the running picture of the first application and the display size of the target picture, the second scaling ratio is used to represent the scaling ratio between the display size of the first picture and the display size of the target picture, and the target picture is the picture to be projected by the screen projection source device for the running picture of the first application; and determine the second instruction corresponding to the first operation event according to the first scaling ratio, the second scaling ratio, and the first operation event.
  • the processing unit 2320 is further configured to: display a third picture, where the third picture is a mirror image of the running picture of the first application after the screen projection source device responds to the second instruction corresponding to the first operation event, and is different from the first picture.
  • in this embodiment, the first operation event for the first picture is acquired by displaying the first picture. Then, according to the execution action of the first operation event on the first picture, whether to respond to the first instruction corresponding to the first operation event at the local end is determined. If the local end does not respond to the first instruction, the second instruction corresponding to the first operation event is sent to the screen projection source device. Since the first picture is the current mirror image, on the display screen of the screen-casting target device, of the running picture of the first application on the screen-casting source device, and the second instruction corresponding to the first operation event is used to instruct the screen-casting source device to execute the operation on the running picture of the first application, cross-device display of the picture is realized.
  • since the screen projection target device needs to determine whether to respond to the first instruction corresponding to the first operation event at the local end, the screen projection target device may or may not respond locally. Therefore, the present application can not only respond to operations on the projected picture at the screen projection target device, but also control the running picture of the first application on the screen-casting source device through adaptation processing, thereby realizing cross-device interactive control of the application picture, improving the processing efficiency of interactive operations on the picture, and improving the screen projection experience.
  • the following introduces a schematic structural diagram of another electronic device 2400 provided by an embodiment of the present application, as shown in FIG. 24 .
  • the electronic device 2400 includes a processor 2410 , a memory 2420 , a communication interface 2430 and at least one communication bus for connecting the processor 2410 , the memory 2420 , and the communication interface 2430 .
  • Processor 2410 may be one or more central processing units (CPUs). In the case where the processor 2410 is a CPU, the CPU may be a single-core CPU or a multi-core CPU.
  • the memory 2420 includes, but is not limited to, Random Access Memory (RAM), Read-Only Memory (ROM), Erasable Programmable Read-Only Memory (EPROM), or Compact Disc Read-Only Memory (CD-ROM), and the memory 2420 is used to store related instructions and data.
  • the communication interface 2430 is used to receive and transmit data.
  • the processor 2410 in the electronic device 2400 is configured to read the one or more programs 2421 stored in the memory 2420 to perform the following steps: displaying the first picture, where the first picture is the current mirror image of the running picture of the first application on the screen-casting source device; acquiring the first operation event for the first picture; determining, according to the execution action of the first operation event on the first picture, whether to respond to the first instruction corresponding to the first operation event at the local end; and if the local end does not respond to the first instruction corresponding to the first operation event, sending the second instruction corresponding to the first operation event to the screen-casting source device, where the second instruction corresponding to the first operation event is used to instruct the screen-casting source device to execute the operation on the running picture of the first application.
  • in this way, the first picture is displayed and the first operation event for the first picture is acquired; whether the local end responds to the first instruction corresponding to the first operation event is then determined according to the execution action of the first operation event on the first picture; and if the local end does not respond to the first instruction, the second instruction corresponding to the first operation event is sent to the screen-casting source device. Since the first picture is the current mirror image, on the display screen of the screen-projection target device, of the running picture of the first application on the screen-casting source device, and the second instruction instructs the screen-casting source device to perform the operation on that running picture, cross-device display of the picture is realized.
  • since the screen-projection target device determines whether to respond at the local end to the first instruction corresponding to the first operation event, it may respond locally or decline to do so. The present application can therefore not only respond to operation events on the screen-projection target device, but can also, through the adaptation process, control the running picture of the first application on the screen-casting source device. This realizes cross-device interactive control of an application's picture, improves the processing efficiency of picture interaction operations, and improves the screen-projection experience.
  • Embodiments of the present application further provide a computer-readable storage medium, wherein the computer-readable storage medium stores a computer program for electronic data exchange, and the computer program is operable to cause a computer to execute some or all of the steps of any method described in the foregoing method embodiments.
  • Embodiments of the present application further provide a computer program product, wherein the computer program product includes a computer program, and the computer program is operable to cause a computer to execute part or all of the steps of any method described in the above method embodiments.
  • the computer program product may be a software installation package.
  • if the above-mentioned units are implemented in the form of software functional units and sold or used as independent products, they may be stored in a computer-readable storage medium.
  • the technical solution of the present application (the part of the technical solution that contributes to the prior art, or the technical solution in whole or in part) can be embodied in the form of a computer software product.
  • the computer software product is stored in a memory and includes several instructions for causing a computer device (a personal computer, a server, a network device, etc.) to execute all or part of the steps of the methods described in the embodiments of the present application.
  • the above-mentioned computer-readable storage medium may be any of various memories such as a USB flash drive, a ROM, a RAM, a removable hard disk, a magnetic disk, or an optical disc.
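The steps performed by the processor described above (display the mirrored picture, acquire an operation event, decide whether to respond at the local end or forward a second instruction to the screen-casting source device) can be sketched as follows. This is a minimal illustration under stated assumptions, not the patented implementation: the names `OperationEvent`, `dispatch`, the set of locally handled actions, and the `send_to_source` callable are all introduced for the example.

```python
# Sketch of the screen-projection target device's event dispatch, as described
# above. All names are illustrative assumptions; transport to the source
# device is abstracted as a plain callable.

from dataclasses import dataclass

@dataclass
class OperationEvent:
    action: str        # e.g. "tap", "swipe", "close_mirror"
    x: float = 0.0
    y: float = 0.0

# Actions the target device can answer locally without involving the source
# (hypothetical examples: window management of the mirror itself).
LOCALLY_HANDLED_ACTIONS = {"resize_window", "move_window", "close_mirror"}

def dispatch(event: OperationEvent, send_to_source) -> str:
    """Decide whether to respond locally or forward a second instruction.

    `send_to_source` is a callable that transmits an instruction to the
    screen-casting source device.
    """
    if event.action in LOCALLY_HANDLED_ACTIONS:
        # First instruction: respond on the local (target) end.
        return f"local:{event.action}"
    # Second instruction: instruct the source device to perform the operation
    # on the running picture of the first application.
    send_to_source({"op": event.action, "pos": (event.x, event.y)})
    return "forwarded"
```

The key design point mirrored here is that the target device makes the local-versus-forward decision per event, so mirror-management gestures never round-trip to the source, while application gestures do.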

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Controls And Circuits For Display Device (AREA)
  • User Interface Of Digital Computer (AREA)
  • Telephone Function (AREA)

Abstract

Disclosed are a cross-device picture display method and apparatus, and an electronic device. The method comprises: displaying a first picture (S510), the first picture being the current mirror image of the running picture of a first application on a screen-casting source device; acquiring a first operation event for the first picture (S520); determining, according to an execution action of the first operation event on the first picture, whether to respond at a local end to a first instruction corresponding to the first operation event (S530); and, if the first instruction corresponding to the first operation event is not responded to at the local end, sending to the screen-casting source device a second instruction corresponding to the first operation event (S540), the second instruction being used to instruct the screen-casting source device to perform an operation on the running picture of the first application. The method realizes cross-device picture display and picture interaction operations for an application, thereby improving the processing efficiency of picture interaction operations and improving the screen-projection experience.
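One practical detail implied by the abstract: an operation captured on the mirror must be mapped back to the source device's screen coordinates before the source performs it on the running picture of the first application. A minimal sketch of that mapping, assuming simple proportional scaling (the function name and parameters are illustrative, not taken from the patent):

```python
def map_to_source(pos, mirror_size, source_size):
    """Scale a touch point from mirror (target-display) coordinates to
    source-screen coordinates, assuming the mirror is a uniform resize of
    the source picture (no letterboxing or rotation)."""
    x, y = pos
    mw, mh = mirror_size
    sw, sh = source_size
    return (x * sw / mw, y * sh / mh)

# A tap at (540, 960) on a 1080x1920 mirror of a 720x1280 source screen
# lands at (360.0, 640.0) on the source.
```

Real screen-casting stacks also have to handle aspect-ratio padding and display rotation; this sketch covers only the uniform-scaling case.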
PCT/CN2021/121014 2020-11-16 2021-09-27 Cross-device picture display method and apparatus, and electronic device WO2022100305A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202011284014.2A CN112394895B (zh) 2020-11-16 2020-11-16 Cross-device picture display method and apparatus, and electronic device
CN202011284014.2 2020-11-16

Publications (1)

Publication Number Publication Date
WO2022100305A1 true WO2022100305A1 (fr) 2022-05-19

Family

ID=74600911

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/121014 WO2022100305A1 (fr) 2021-09-27 Cross-device picture display method and apparatus, and electronic device

Country Status (2)

Country Link
CN (1) CN112394895B (fr)
WO (1) WO2022100305A1 (fr)

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112394895B (zh) * 2020-11-16 2023-10-13 Oppo广东移动通信有限公司 Cross-device picture display method and apparatus, and electronic device
CN115253285A (zh) * 2021-04-30 2022-11-01 华为技术有限公司 Display method and related apparatus
CN115460445B (zh) * 2021-06-09 2024-03-22 荣耀终端有限公司 Screen projection method for an electronic device, and electronic device
CN115700463A (zh) * 2021-07-30 2023-02-07 华为技术有限公司 Screen projection method and system, and electronic device
CN113891127A (zh) * 2021-08-31 2022-01-04 维沃移动通信有限公司 Video editing method and apparatus, and electronic device
CN115756268A (zh) * 2021-09-03 2023-03-07 华为技术有限公司 Cross-device interaction method and apparatus, screen projection system, and terminal
CN114257631A (zh) * 2021-12-20 2022-03-29 Oppo广东移动通信有限公司 Data interaction method, apparatus, device, and storage medium
CN114579034A (zh) * 2022-03-02 2022-06-03 北京字节跳动网络技术有限公司 Information interaction method and apparatus, display device, and storage medium
CN115729502B (zh) * 2022-03-23 2024-02-27 博泰车联网(南京)有限公司 Response method for a screen-casting end and a display end, electronic device, and storage medium
CN115174988B (zh) * 2022-06-24 2024-04-30 长沙联远电子科技有限公司 DLNA-based audio and video screen projection control method
CN116719468A (zh) * 2022-09-02 2023-09-08 荣耀终端有限公司 Interaction event processing method and apparatus

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160321968A1 (en) * 2015-04-28 2016-11-03 Beijing Lenovo Software Ltd. Information processing method and electronic device
CN110248226A (zh) * 2019-07-16 2019-09-17 广州视源电子科技股份有限公司 Information screen projection method, apparatus, system, storage medium, and processor
CN110377250A (zh) * 2019-06-05 2019-10-25 华为技术有限公司 Touch control method in a screen projection scenario, and electronic device
CN111221491A (zh) * 2020-01-09 2020-06-02 Oppo(重庆)智能科技有限公司 Interaction control method and apparatus, electronic device, and storage medium
CN111562896A (zh) * 2020-04-26 2020-08-21 维沃移动通信有限公司 Screen projection method and electronic device
CN111918119A (zh) * 2020-07-24 2020-11-10 深圳乐播科技有限公司 Screen projection method, apparatus, device, and storage medium for iOS system data
CN112394895A (zh) * 2020-11-16 2021-02-23 Oppo广东移动通信有限公司 Cross-device picture display method and apparatus, and electronic device

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107483994A (zh) * 2017-07-31 2017-12-15 广州指观网络科技有限公司 Reverse screen projection control system and method

Also Published As

Publication number Publication date
CN112394895A (zh) 2021-02-23
CN112394895B (zh) 2023-10-13

Similar Documents

Publication Publication Date Title
WO2022100305A1 (fr) Cross-device picture display method and apparatus, and electronic device
WO2020238871A1 (fr) Screen projection method and system, and related apparatus
US20220342850A1 (en) Data transmission method and related device
CN112291764B (zh) Content continuation system
WO2021078284A1 (fr) Content continuation method and electronic device
WO2022100304A1 (fr) Method and apparatus for transferring application content across devices, and electronic device
WO2021121052A1 (fr) Multi-screen collaboration method and system, and electronic device
CN112558825A (zh) Information processing method and electronic device
WO2021129253A1 (fr) Multi-window display method, electronic device, and system
WO2022105445A1 (fr) Browser-based application screen projection method and related apparatus
WO2022121775A1 (fr) Screen projection method and device
CN112527174B (zh) Information processing method and electronic device
WO2022017393A1 (fr) Display interaction system, display method, and device
WO2022083465A1 (fr) Screen projection method for an electronic device, related medium, and electronic device
CN114040242A (zh) Screen projection method and electronic device
WO2023273543A1 (fr) Folder management method and apparatus
WO2022135157A1 (fr) Page display method and apparatus, electronic device, and readable storage medium
WO2022042769A2 (fr) Multi-screen interaction system and method, apparatus, and storage medium
WO2022007678A1 (fr) File opening method and device
CN115016697A (zh) Screen projection method, computer device, readable storage medium, and program product
WO2023005900A1 (fr) Screen projection method, electronic device, and system
WO2023020012A1 (fr) Method for data communication between devices, device, storage medium, and program product
CN115086888A (zh) Message notification method and apparatus, and electronic device
WO2023231893A1 (fr) Cursor display method and electronic device
WO2023169237A1 (fr) Screenshot method, electronic device, and system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21890833

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21890833

Country of ref document: EP

Kind code of ref document: A1