WO2023283941A1 - A method and apparatus for processing screen projection images - Google Patents

A method and apparatus for processing screen projection images

Info

Publication number: WO2023283941A1
Authority: WO (WIPO (PCT))
Prior art keywords: electronic device, image, images, screen, operation track
Application number: PCT/CN2021/106821
Other languages: English (en), French (fr)
Inventors: 李龙华, 宋超迪, 郑万椿
Original assignee: 华为技术有限公司 (Huawei Technologies Co., Ltd.)
Priority date: the priority date is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed
Application filed by 华为技术有限公司 (Huawei Technologies Co., Ltd.)
Priority to PCT/CN2021/106821 (published as WO2023283941A1)
Priority to CN202180099334.7A (published as CN117501233A)
Publication of WO2023283941A1

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/14 - Digital output to display device; Cooperation and interconnection of the display device with other functional units

Definitions

  • the present application relates to the field of computer technology, and more specifically, to a processing method and device for projected screen images.
  • the screen projection technology transmits a display image on one electronic device to another electronic device through a wired screen projection port or a wireless screen projection port.
  • the user may perform operations on the electronic device with a finger, a stylus, or a mouse, and the data generated by the electronic device according to these operations may be referred to as reporting point data.
  • the reporting point data generated by the first electronic device according to the user's operation needs to be transmitted to the second electronic device through the wired or wireless screen projection port; the second electronic device completes the drawing and synthesis of the trajectory from the reporting point data, and then transmits the result back to the first electronic device. Therefore, the display delay of a hand-drawn trajectory is relatively large, and the user experience is poor.
  • the present application provides a method for processing a projected screen image, so as to reduce the display delay of an operation track.
  • the present application provides a method for processing a projected screen image, including: receiving a first projected screen image from a second electronic device; generating reporting point data in response to a user operation; drawing an operation track based on the reporting point data; combining the operation track with the first projected screen image to obtain a combined image; and displaying the combined image.
  • the handwriting application can locally draw the operation track based on the newly received operation.
  • the first electronic device can directly draw the latest operation trajectory for newly received operations, synthesize the operation trajectory with the previously received screen projection image (such as the first projected screen image) from the second electronic device, and then display the result on the first electronic device. Therefore, when a new operation occurs on the first electronic device, the display delay of the new operation track does not depend on the transmission delay between the second electronic device and the first electronic device, thereby reducing the display delay of the operation track.
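  • as a concrete illustration of this local draw-and-composite path, the following minimal Java sketch draws an operation track from report point data as a polyline on a transparent layer and composites it over the received projected image; the names LocalTrackCompositor and ReportPoint are illustrative assumptions, not identifiers from this application.

    import java.awt.BasicStroke;
    import java.awt.Color;
    import java.awt.Graphics2D;
    import java.awt.image.BufferedImage;
    import java.util.List;

    // Hypothetical model of the first device's local path: draw the track from
    // report points, then composite it over the most recent projected image.
    public class LocalTrackCompositor {

        // A single report point: screen coordinates plus a timestamp.
        public record ReportPoint(int x, int y, long timestampMs) {}

        // Draw the operation track as a polyline on a transparent layer.
        public static BufferedImage drawTrack(List<ReportPoint> points, int w, int h) {
            BufferedImage track = new BufferedImage(w, h, BufferedImage.TYPE_INT_ARGB);
            Graphics2D g = track.createGraphics();
            g.setColor(Color.BLACK);
            g.setStroke(new BasicStroke(3f));
            for (int i = 1; i < points.size(); i++) {
                ReportPoint a = points.get(i - 1), b = points.get(i);
                g.drawLine(a.x(), a.y(), b.x(), b.y());
            }
            g.dispose();
            return track;
        }

        // Composite an overlay over a base image; on the first device the base is
        // the received projected image and the overlay is the local operation track.
        public static BufferedImage composite(BufferedImage base, BufferedImage overlay) {
            BufferedImage out = new BufferedImage(
                    base.getWidth(), base.getHeight(), BufferedImage.TYPE_INT_ARGB);
            Graphics2D g = out.createGraphics();
            g.drawImage(base, 0, 0, null);      // e.g. the first projected screen image
            g.drawImage(overlay, 0, 0, null);   // e.g. the latest local operation track
            g.dispose();
            return out;
        }
    }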
  • the first electronic device may draw the operation trajectory in units of frames to obtain a frame of image.
  • the first electronic device may generate a frame of image at intervals, and each frame of image includes the operation track drawn in a recent period of time.
  • the duration of a period of time corresponding to the operation track may be greater than the sum of the two-way transmission delay between the first electronic device and the second electronic device and the delay for the second electronic device to locally process the received report data.
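  • the following small sketch makes that timing constraint concrete; the delay values are assumptions chosen only for illustration.

    // Illustrative check of the window-length constraint described above: the
    // track window must cover the two-way transmission delay plus the second
    // device's local processing delay, so no stroke segment disappears from the
    // locally drawn track before the updated projected image arrives.
    public class TrackWindow {
        public static long minWindowMs(long oneWayDelayMs, long remoteProcessingMs) {
            long roundTripMs = 2 * oneWayDelayMs;      // two-way transmission delay
            return roundTripMs + remoteProcessingMs;   // lower bound on the window length
        }

        public static void main(String[] args) {
            // Assumed values: 30 ms per direction, 20 ms remote processing.
            System.out.println("track window must exceed " + minWindowMs(30, 20) + " ms"); // 80 ms
        }
    }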
  • the method further includes: sending the operation track to the second electronic device, where the operation track is used for synthesizing with images of one or more layers on the second electronic device, and the images of the one or more layers are used by the second electronic device to synthesize the first projected screen image.
  • since the operation track can be drawn by the first electronic device, to display the operation track on the second electronic device, the first electronic device can send the drawn operation track to the second electronic device.
  • the method further includes: sending the report point data to the second electronic device, where the report point data is used by the second electronic device to draw the operation track, the operation track is used for synthesizing with the images of one or more layers on the second electronic device, and the images of the one or more layers are used by the second electronic device to synthesize the first projected screen image.
  • the operation track drawn by the second electronic device based on the report point data is the same as the operation track drawn by the first electronic device based on the report point data.
  • the first electronic device can directly send the reporting point data to the second electronic device, and the second electronic device draws and displays the operation track.
  • the screen projection image may be carried in the screen projection layer.
  • the projected screen image is an image synthesized by one or more layers including the application layer and sent to the first electronic device.
  • the aforementioned screen projection images include the first screen projection image or the second screen projection image.
  • the method further includes: receiving a second screen projection image from the second electronic device, where the second screen projection image is synthesized based on the images of the one or more layers and the operation track; and displaying the second projected screen image to update the synthesized image.
  • the operation trajectory may be the operation trajectory drawn by the first electronic device and sent to the second electronic device, or the operation trajectory drawn by the second electronic device itself according to the reporting point data, which is not limited in this application.
  • since the first electronic device and the second electronic device can draw the same operation trajectory based on the reporting point data, the resulting second projected screen image is the same whether it is synthesized from the operation trajectory drawn by the first electronic device and the images of one or more local layers of the second electronic device, or from the operation trajectory drawn by the second electronic device itself and those same local layers.
  • the second electronic device may send the newly generated second screen projection image to the first electronic device, so as to update the screen projection image of the first electronic device.
  • updating the locally displayed image by the first electronic device may refer to replacing the first projected screen image with the second projected screen image for display, or using the second projected screen image to synthesize a new image with the latest operation track.
  • the track drawn by the first electronic device based on the report point data is recorded as the first operation track, and the track drawn by the second electronic device based on the report point data is recorded as the second operation track.
  • Updating the locally displayed image of the first electronic device may include the following two implementation manners, which will be described in detail below.
  • the first electronic device when receiving a new user operation, can draw the first operation track by itself, and synthesize it with the first projected screen image from the second electronic device , to obtain a synthesized image, so as to reduce the display delay of the operation track.
  • the first electronic device may send the first operation track or the reporting data of the first operation track to the second electronic device.
  • after the second electronic device acquires a new operation track (such as the first operation track, or a second operation track drawn based on the report point data), it can synthesize the new operation track with the images of one or more local layers of the second electronic device, and then send the synthesized image to the first electronic device.
  • the images of one or more local layers of the second electronic device may refer to layers used to synthesize the second projected screen image.
  • the composited image of one or more local layers of the second electronic device is a new projected screen image, that is, the second projected screen image, which can be used to update the image displayed on the first electronic device.
  • a possible situation is that after the first electronic device sends the report data of the first operation track to the second electronic device, the user generates a new operation on the first electronic device, that is, the first electronic device can draw a new operation trajectory based on the new operation.
  • the first electronic device may generate a frame of image at intervals, and each frame of image may include the operation track drawn in a recent period of time. Therefore, the above-mentioned newly drawn operation track can be understood as a track drawn based on operations within a recent period of time, which is another example of the first operation track. It should be understood that the time periods used to generate two consecutive frames of images may overlap, so the first operation tracks in the two frames may overlap or approximately overlap.
  • the first operation track in the previous frame image will be recorded below as operation track 1, and the operation track in the next frame image as operation track 2.
  • the second electronic device may draw a second operation track based on the received report point data. It should be understood that the second operation track can be drawn by the second electronic device jointly from the report point data received in the latest period of time and the previously received report point data, or it can be obtained by synthesizing the operation trajectory drawn from the report point data received in the latest period of time with the previously drawn operation trajectory.
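  • the two alternatives just described can be sketched as follows, reusing the hypothetical LocalTrackCompositor helpers from the earlier example; this is an illustration of the idea, not code from the application.

    import java.awt.Graphics2D;
    import java.awt.image.BufferedImage;
    import java.util.List;

    // Two hypothetical ways the second device can obtain the second operation track.
    public class RemoteTrackBuilder {

        // Option 1: redraw the whole track from all report points received so far.
        public static BufferedImage redrawAll(List<LocalTrackCompositor.ReportPoint> allPoints,
                                              int w, int h) {
            return LocalTrackCompositor.drawTrack(allPoints, w, h);
        }

        // Option 2: draw only the points from the latest window and composite the
        // new segment over the previously drawn track.
        public static BufferedImage appendWindow(BufferedImage previousTrack,
                                                 List<LocalTrackCompositor.ReportPoint> windowPoints) {
            BufferedImage segment = LocalTrackCompositor.drawTrack(
                    windowPoints, previousTrack.getWidth(), previousTrack.getHeight());
            Graphics2D g = previousTrack.createGraphics();
            g.drawImage(segment, 0, 0, null);   // accumulate: old track + new segment
            g.dispose();
            return previousTrack;
        }
    }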
  • the second electronic device may combine the second operation track with the images of one or more local layers into a second projected screen image and send it to the first electronic device.
  • the first electronic device may synthesize based on the operation track 2 and the second projected screen image, so that the latest operation track can be displayed on the latest synthesized image.
  • in this case, a part of the operation track on the image may come from the second projected screen image, and a part from the operation track 2.
  • the first electronic device may combine the operation track 2 with the operation track 1, and then combine it with the second projected screen image.
  • the latest operation trajectory can be displayed on the latest synthesized image, and all the operation trajectories on the image come from the operation trajectories drawn locally by the first electronic device.
  • the image locally displayed on the first electronic device can be updated.
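  • a minimal sketch of this first update manner, again reusing the hypothetical LocalTrackCompositor from the earlier example: operation track 2 is merged with operation track 1, and the merged track is composited over the second projected screen image.

    import java.awt.image.BufferedImage;

    // Hypothetical sketch of update manner 1: every visible stroke comes from
    // locally drawn tracks, layered over the second projected screen image.
    public class UpdateManner1 {
        public static BufferedImage update(BufferedImage secondProjectedImage,
                                           BufferedImage track1, BufferedImage track2) {
            BufferedImage mergedTracks = LocalTrackCompositor.composite(track1, track2);
            return LocalTrackCompositor.composite(secondProjectedImage, mergedTracks);
        }
    }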
  • the first electronic device when receiving a new user operation, may draw an operation track by itself, and synthesize it with the first projected image from the second electronic device to obtain a synthesized image.
  • the first electronic device may draw an operation trajectory based on each new user operation, and synthesize the newly drawn operation trajectory with the last synthesized image to obtain an updated image.
  • the second electronic device may refresh locally stored image data based on the received operation track. Since the operation track is drawn locally by the first electronic device, the second electronic device does not need to send the operation track to the first electronic device.
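  • a minimal sketch of this second update manner, under the same assumptions as the earlier examples: each newly drawn track is composited over the last synthesized image, so the second device never needs to send the track back.

    import java.awt.image.BufferedImage;

    // Hypothetical sketch of update manner 2: incremental compositing on the
    // first device, starting from the received projected image.
    public class UpdateManner2 {
        private BufferedImage lastSynthesized;

        public UpdateManner2(BufferedImage firstProjectedImage) {
            this.lastSynthesized = firstProjectedImage;   // start from the received projection
        }

        // Called whenever a new user operation produces a freshly drawn track.
        public BufferedImage onNewTrack(BufferedImage newTrack) {
            lastSynthesized = LocalTrackCompositor.composite(lastSynthesized, newTrack);
            return lastSynthesized;                       // the image to display
        }
    }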
  • the present application provides a method for processing a projected screen image, including: sending a first projected screen image to a first electronic device; receiving an operation track from the first electronic device, where the operation track is drawn based on the user's operation on the first electronic device; and synchronizing the operation track with images of one or more layers, where the images of the one or more layers are used to synthesize the first projected screen image.
  • the second electronic device receives the operation track from the first electronic device (that is, the above-mentioned first operation track) and can synchronize the operation track locally so that the first operation track can be further processed locally, for example, synthesized with the image locally used for screen projection (such as the first screen projection image above) into a new screen projection image (such as the second screen projection image above), which is sent to the first electronic device to update the image displayed by the first electronic device.
  • the method further includes: synthesizing the operation track with images of one or more layers to obtain a second projected screen image; and displaying the second projected screen image.
  • the second screen projection image may be synthesized in this way; this process can be regarded as an example of synchronization.
  • the method further includes: sending the second screen projection image to the first electronic device.
  • the synthesized image sent by the second electronic device to the first electronic device is the latest screen projection image synthesized by the second electronic device, that is, the second screen projection image.
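  • the second device's side of this exchange can be sketched as follows; the layer order and types are illustrative assumptions, since the application does not prescribe a particular compositor.

    import java.awt.Graphics2D;
    import java.awt.image.BufferedImage;
    import java.util.List;

    // Hypothetical sketch of the second device: composite its local layers
    // bottom-to-top, then the synchronized operation track on top, to produce
    // the second projected screen image that is sent back to the first device.
    public class SecondDeviceCompositor {
        public static BufferedImage synthesize(List<BufferedImage> localLayers,
                                               BufferedImage operationTrack) {
            BufferedImage base = localLayers.get(0);
            BufferedImage out = new BufferedImage(
                    base.getWidth(), base.getHeight(), BufferedImage.TYPE_INT_ARGB);
            Graphics2D g = out.createGraphics();
            for (BufferedImage layer : localLayers) {
                g.drawImage(layer, 0, 0, null);      // e.g. handwriting app window, icons
            }
            g.drawImage(operationTrack, 0, 0, null); // synchronized track on top
            g.dispose();
            return out;                              // the second projected screen image
        }
    }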
  • the first electronic device may update its locally displayed image based on the latest received second screen projection image. For example, if a new operation is generated, the first electronic device can synthesize the newly drawn operation track (namely the above-mentioned operation track 2) with the second projected screen image and display the synthesized image; if no new operation is generated, the local operation track (which has no data at this time) is combined with the second projected screen image, so that finally the second projected screen image is displayed.
  • the present application provides an apparatus for processing a projected image, including a module or unit for implementing the method in the first aspect, the second aspect, or any possible implementation manner of the first aspect and the second aspect. It should be understood that each module or unit can realize corresponding functions by executing computer programs.
  • the present application provides a screen projection image processing apparatus, including a processor, where the processor is configured to execute the method for processing a projected screen image described in the first aspect, the second aspect, or any possible implementation manner of the first aspect and the second aspect.
  • the apparatus may also include memory for storing instructions and data.
  • the memory is coupled to the processor, and when the processor executes the instructions stored in the memory, the methods described in the foregoing aspects can be implemented.
  • the device may further include a communication interface, which is used for the device to communicate with other devices.
  • the communication interface may be a transceiver, a circuit, a bus, a module or other types of communication interfaces.
  • the present application provides a system-on-a-chip, which includes at least one processor configured to support the functions involved in implementing the above-mentioned first aspect, the second aspect, or any possible implementation manner of the first aspect and the second aspect, for example, receiving or processing the data and/or information involved in the methods described above.
  • the chip system further includes a memory, the memory is used to store program instructions and data, and the memory is located inside or outside the processor.
  • the system-on-a-chip may consist of chips, or may include chips and other discrete devices.
  • the present application provides a computer-readable storage medium, including a computer program, which, when run on a computer, enables the computer to implement the method in the first aspect, the second aspect, or any possible implementation manner of the first aspect and the second aspect.
  • the present application provides a computer program product, the computer program product including a computer program (also referred to as code, or instructions), which, when executed, causes the computer to execute the method in the first aspect, the second aspect, or any possible implementation manner of the first aspect and the second aspect.
  • FIG. 1 is a schematic block diagram of an electronic device provided by an embodiment of the present application;
  • FIG. 2 is a schematic block diagram of the software and hardware structure of an electronic device suitable for the method provided by the embodiment of the present application;
  • FIG. 3 is a schematic diagram of a screen projection system applicable to the processing method of a screen projection image provided in an embodiment of the present application;
  • FIG. 4 is a schematic diagram of a possible processing flow of wireless projection;
  • FIG. 5 is a schematic diagram of the time delay of wireless projection;
  • FIG. 6 and FIG. 7 are schematic flowcharts of a method for processing a projected screen image provided by an embodiment of the present application;
  • FIG. 8 is a schematic diagram of another time delay of wireless projection;
  • FIG. 9 is a schematic block diagram of a screen projection image processing device 900 provided by an embodiment of the present application;
  • FIG. 10 is a schematic block diagram of an apparatus 1000 for processing a projected screen image provided by an embodiment of the present application.
  • Wireless projection technology transmits the display screen on one electronic device to another electronic device for display through wireless communication technology.
  • the terminal device that transmits the display screen may be referred to as a second electronic device
  • the terminal device that receives and displays the display screen may be referred to as a first electronic device.
  • the first electronic device may be an electronic device that supports user's handwriting input and has a larger display screen, such as a TV, a smart screen and other electronic devices.
  • the embodiment of the present application does not impose any limitation on the specific types of the terminal device and the display device.
  • the second electronic device is an electronic device that supports screen projection, such as a mobile phone, a tablet computer, a vehicle-mounted device, a notebook computer, a personal computer (personal computer, PC), an ultra-mobile personal computer (ultra-mobile personal computer, UMPC), a netbook, and other electronic equipment.
  • both the first electronic device and the second electronic device in the embodiment of the present application are electronic devices with a display screen, and may be the same type of electronic device, or may be different types of electronic devices.
  • for the specific structures of the first electronic device and the second electronic device, reference can be made to the structure of the electronic device 100 shown in FIG. 1 and FIG. 2 below; either device may use more or fewer components than the structure shown in FIG. 1 and FIG. 2, which is not limited by the embodiments of the present application.
  • the terms "first" and "second" in the embodiments of the present application are used to distinguish similar objects, and are not necessarily used to describe a specific order or sequence.
  • the screen projection image in the embodiment of the present application is an image transmitted by the second electronic device to the first electronic device through a wired screen projection port or a wireless screen projection port, and then displayed on the display screen of the first electronic device.
  • the screen projection image may be carried in the screen projection layer.
  • the projected screen image may be carried in one or more layers including the application layer.
  • the second electronic device may project the screen to the first electronic device after synthesizing the one or more layers into a projected image.
  • one or more layers in the second electronic device can be, for example, the layer of the handwriting application window and the layers of other icons or applications in the current screen-casting interface, which can generally be regarded as the layers contained in the interface displayed for screen casting.
  • the projected screen image corresponds to the image of one or more local layers.
  • the methods described in the embodiments of the present application can support operating environments such as Linux, Android, Harmony operating system (Harmony OS), Mac, iOS, Windows, and Internet of Things operating systems (such as LiteOS).
  • FIG. 1 shows a schematic structural diagram of an electronic device 100 .
  • the electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (universal serial bus, USB) interface 130, a charging management module 140, a power management module 141, a battery 142, antenna 1, antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a button 190, a mouse 191, an indicator 192, a camera 193, a display screen 194, a subscriber identification module (subscriber identification module, SIM) card interface 195, a touch circuit 196 (touch panel integrated circuit, TP IC), a wireless fidelity (wireless fidelity, Wi-Fi) circuit 197 (Wi-Fi integrated circuit, Wi-Fi IC), etc.
  • the structure illustrated in this application does not constitute a specific limitation on the electronic device 100 .
  • the electronic device 100 may include more or fewer components than shown, or combine certain components, or separate certain components, or arrange different components.
  • the illustrated components can be realized in hardware, software or a combination of software and hardware.
  • the processor 110 may include one or more processing units, for example: the processor 110 may include an application processor (application processor, AP), a modem processor, a graphics processing unit (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), controller, memory, video codec, digital signal processor (digital signal processor, DSP), baseband processor and neural network processor (neural-network processing unit, NPU), etc. one or more. Wherein, different processing units may be independent devices, or may be integrated in one or more processors.
  • the application processor outputs sound signals through the audio module 170 (such as the speaker 170A, etc.), or displays images or videos through the display screen 194 .
  • the controller may be the nerve center and command center of the electronic device 100 .
  • the controller can generate an operation control signal according to the instruction opcode and timing signal, and complete the control of fetching and executing the instruction.
  • the controller can further process the signal collected by the touch circuit 196 .
  • a memory may also be provided in the processor 110 for storing instructions and data.
  • the memory in processor 110 is a cache memory.
  • the memory may hold instructions or data that the processor 110 has just used or used cyclically. If the processor 110 needs to use the instructions or data again, they can be called directly from the memory, which avoids repeated access and reduces the waiting time of the processor 110, thus improving the efficiency of the system.
  • the processor 110 may execute different operations to implement different functions by executing instructions.
  • the instruction may be, for example, an instruction stored in the memory before the device leaves the factory, or an instruction read from an application (APP) after the user installs the new APP during use, and this embodiment of the present application does not impose any restrictions on this.
  • processor 110 may include one or more interfaces.
  • the interface may include an inter-integrated circuit (inter-integrated circuit, I2C) interface, an inter-integrated circuit sound (inter-integrated circuit sound, I2S) interface, a pulse code modulation (pulse code modulation, PCM) interface, a universal asynchronous receiver/transmitter (universal asynchronous receiver/transmitter, UART) interface, a mobile industry processor interface (mobile industry processor interface, MIPI), a general-purpose input/output (general-purpose input/output, GPIO) interface, a SIM interface, and/or a USB interface, etc.
  • the I2C interface is a bidirectional synchronous serial bus, including a serial data line (serial data line, SDA) and a serial clock line (serial clock line, SCL).
  • the I2S interface can be used for audio communication.
  • processor 110 may include multiple sets of I2S buses.
  • the processor 110 may be coupled to the audio module 170 through an I2S bus to implement communication between the processor 110 and the audio module 170 .
  • the audio module 170 can transmit audio signals to the wireless communication module 160 through the I2S interface, so as to realize the function of answering calls through the Bluetooth headset.
  • the PCM interface can also be used for audio communication, sampling, quantizing and encoding the analog signal.
  • the audio module 170 and the wireless communication module 160 may be coupled through a PCM bus interface.
  • the audio module 170 can also transmit audio signals to the wireless communication module 160 through the PCM interface, so as to realize the function of answering calls through the Bluetooth headset. Both the I2S interface and the PCM interface can be used for audio communication.
  • the UART interface is a universal serial data bus used for asynchronous communication.
  • the bus can be a bidirectional communication bus. It converts the data to be transmitted between serial communication and parallel communication.
  • a UART interface is generally used to connect the processor 110 and the wireless communication module 160 .
  • the processor 110 communicates with the Bluetooth module in the wireless communication module 160 through the UART interface to realize the Bluetooth function.
  • the audio module 170 can transmit audio signals to the wireless communication module 160 through the UART interface, so as to realize the function of playing music through the Bluetooth headset.
  • the MIPI interface can be used to connect the processor 110 with peripheral devices such as the display screen 194 and the camera 193 .
  • MIPI interface includes camera serial interface (camera serial interface, CSI), display serial interface (display serial interface, DSI), etc.
  • the processor 110 communicates with the camera 193 through the CSI interface to realize the shooting function of the electronic device 100 .
  • the processor 110 communicates with the display screen 194 through the DSI interface to realize the display function of the electronic device 100 .
  • the GPIO interface can be configured by software.
  • the GPIO interface can be configured as a control signal or as a data signal.
  • the GPIO interface can be used to connect the processor 110 with the camera 193 , the display screen 194 , the wireless communication module 160 , the audio module 170 , the sensor module 180 and so on.
  • the GPIO interface can also be configured as an I2C interface, I2S interface, UART interface, MIPI interface, etc.
  • the USB interface 130 is an interface conforming to the USB standard specification, specifically, it can be a Mini USB interface, a Micro USB interface, a USB Type C interface, and the like.
  • the USB interface 130 can be used to connect a charger to charge the electronic device 100 , and can also be used to transmit data between the electronic device 100 and peripheral devices. It can also be used to connect headphones and play audio through them. This interface can also be used to connect other electronic devices, such as AR devices.
  • the interface connection relationship between modules shown in this application is only a schematic illustration, and does not constitute a structural limitation of the electronic device 100 .
  • the electronic device 100 may also adopt different interface connection methods in the above embodiments, or a combination of multiple interface connection methods.
  • the charging management module 140 is configured to receive a charging input from a charger.
  • the charger may be a wireless charger or a wired charger.
  • the charging management module 140 can receive charging input from the wired charger through the USB interface 130 .
  • the charging management module 140 may receive a wireless charging input through a wireless charging coil of the electronic device 100 . While the charging management module 140 is charging the battery 142 , it can also provide power for electronic devices through the power management module 141 .
  • the power management module 141 is used for connecting the battery 142 , the charging management module 140 and the processor 110 .
  • the power management module 141 receives the input from the battery 142 and/or the charging management module 140 to provide power for the processor 110 , the internal memory 121 , the external memory, the display screen 194 , the camera 193 , and the wireless communication module 160 .
  • the power management module 141 can also be used to monitor parameters such as battery capacity, battery cycle times, and battery health status (leakage, impedance).
  • the power management module 141 may also be disposed in the processor 110 .
  • the power management module 141 and the charging management module 140 can also be set in the same device.
  • the wireless communication function of the electronic device 100 can be realized by the antenna 1 , the antenna 2 , the mobile communication module 150 , the wireless communication module 160 , a modem processor, a baseband processor, and the like.
  • Antenna 1 and Antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in electronic device 100 may be used to cover single or multiple communication frequency bands. Different antennas can also be multiplexed to improve the utilization of the antennas.
  • Antenna 1 can be multiplexed as a diversity antenna of a wireless local area network.
  • the antenna may be used in conjunction with a tuning switch.
  • the mobile communication module 150 can provide wireless communication solutions including 2G/3G/4G/5G applied on the electronic device 100 .
  • the mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (low noise amplifier, LNA) and the like.
  • the mobile communication module 150 can receive electromagnetic waves through the antenna 1, filter and amplify the received electromagnetic waves, and send them to the modem processor for demodulation.
  • the mobile communication module 150 can also amplify the signals modulated by the modem processor, and convert them into electromagnetic waves through the antenna 1 for radiation.
  • at least part of the functional modules of the mobile communication module 150 may be set in the processor 110 .
  • at least part of the functional modules of the mobile communication module 150 and at least part of the modules of the processor 110 may be set in the same device.
  • a modem processor may include a modulator and a demodulator.
  • the modulator is used for modulating the low-frequency baseband signal to be transmitted into a medium-high frequency signal.
  • the demodulator is used to demodulate the received electromagnetic wave signal into a low frequency baseband signal. Then the demodulator sends the demodulated low-frequency baseband signal to the baseband processor for processing.
  • the low-frequency baseband signal is passed to the application processor after being processed by the baseband processor.
  • the application processor outputs sound signals through audio equipment (not limited to speaker 170A, receiver 170B, etc.), or displays images or videos through display screen 194 .
  • the modem processor may be a stand-alone device.
  • the modem processor may be independent from the processor 110, and be set in the same device as the mobile communication module 150 or other functional modules.
  • the wireless communication module 160 can provide wireless communication solutions applied on the electronic device 100, including wireless local area network (wireless local area networks, WLAN) (such as a Wi-Fi network), Bluetooth (bluetooth, BT), global navigation satellite system (global navigation satellite system, GNSS), frequency modulation (frequency modulation, FM), near field communication technology (near field communication, NFC), infrared technology (infrared, IR), and other wireless communication solutions.
  • the wireless communication module 160 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 160 receives electromagnetic waves via the antenna 2 , frequency-modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 110 .
  • the wireless communication module 160 can also receive the signal to be sent from the processor 110 , frequency-modulate it, amplify it, and convert it into electromagnetic waves through the antenna 2 for radiation.
  • the wireless communication module 160 can be used to realize data transmission between the host end and the display end; for example, the host end sends screen projection images to the display end through the Wi-Fi air interface, and the display end sends report point data to the host end through the Wi-Fi air interface.
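  • the following sketch illustrates those two flows over a single connection; the address, port, and framing are assumptions made only for illustration, since real wireless projection stacks use protocols such as Miracast rather than a raw TCP socket.

    import java.io.DataInputStream;
    import java.io.DataOutputStream;
    import java.net.Socket;

    // Hypothetical display-end client: send report point data upstream to the
    // host end and read one encoded projection frame downstream.
    public class ProjectionLink {
        public static void main(String[] args) throws Exception {
            try (Socket socket = new Socket("192.168.1.10", 7236);   // assumed host address/port
                 DataOutputStream up = new DataOutputStream(socket.getOutputStream());
                 DataInputStream down = new DataInputStream(socket.getInputStream())) {

                // Upstream: one report point (x, y, timestamp), assumed layout.
                up.writeInt(120);
                up.writeInt(340);
                up.writeLong(System.currentTimeMillis());
                up.flush();

                // Downstream: one length-prefixed encoded projection frame.
                int frameLen = down.readInt();
                byte[] frame = new byte[frameLen];
                down.readFully(frame);
                System.out.println("received projected frame: " + frameLen + " bytes");
            }
        }
    }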
  • the antenna 1 of the electronic device 100 is coupled to the mobile communication module 150, and the antenna 2 is coupled to the wireless communication module 160, so that the electronic device 100 can communicate with the network and other devices through wireless communication technology.
  • the wireless communication technology may include global system for mobile communications (global system for mobile communications, GSM), general packet radio service (general packet radio service, GPRS), code division multiple access (code division multiple access, CDMA), wideband code division multiple access (wideband code division multiple access, WCDMA), time-division code division multiple access (time-division code division multiple access, TD-SCDMA), long term evolution (long term evolution, LTE), fifth generation (5th generation, 5G) communication systems, BT, GNSS, WLAN, NFC, FM, and/or IR technologies, etc.
  • the GNSS may include a global positioning system (global positioning system, GPS), a global navigation satellite system (global navigation satellite system, GLONASS), a BeiDou navigation satellite system (BeiDou navigation satellite system, BDS), a quasi-zenith satellite system (quasi-zenith satellite system, QZSS), and/or satellite based augmentation systems (satellite based augmentation systems, SBAS).
  • the electronic device 100 may implement a display function through a GPU, a display screen 194, an application processor, and the like.
  • the GPU is a microprocessor for image processing, and is connected to the display screen 194 and the application processor. GPUs are used to perform mathematical and geometric calculations for graphics rendering.
  • Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
  • the display screen 194 which can also be called a screen, can be used to display images, videos and the like.
  • Display 194 may include a display panel.
  • the display panel can adopt a liquid crystal display (liquid crystal display, LCD), an organic light-emitting diode (organic light-emitting diode, OLED), an active-matrix organic light-emitting diode (active-matrix organic light emitting diode, AMOLED), a flexible light-emitting diode (flex light-emitting diode, FLED), a mini LED (Mini LED), a micro LED (Micro LED), a micro OLED (Micro-OLED), quantum dot light emitting diodes (quantum dot light emitting diodes, QLED), etc.
  • electronic device 100 may include one or more display screens 194 .
  • the display screen 194 may also include more components, such as a backlight panel, a display integrated circuit (integrated circuit, IC), and the like.
  • the backlight board can be used to provide a light source, and the display panel emits light based on the light source provided by the backlight board.
  • the display integrated circuit can be used to draw and synthesize images, and can be used to control the light transmission or opacity of the liquid crystal in the liquid crystal layer.
  • the electronic device 100 can realize the shooting function through the ISP, the camera 193 , the video codec, the GPU, the display screen 194 and the application processor.
  • the ISP is used for processing the data fed back by the camera 193 .
  • light is transmitted to the photosensitive element of the camera through the lens, the light signal is converted into an electrical signal, and the photosensitive element of the camera transmits the electrical signal to the ISP for processing, which converts it into an image visible to the naked eye.
  • ISP can also perform algorithm optimization on image noise, brightness, and skin color.
  • ISP can also optimize the exposure, color temperature and other parameters of the shooting scene.
  • the ISP may be located in the camera 193 .
  • Camera 193 is used to capture still images or video.
  • the object generates an optical image through the lens and projects it to the photosensitive element.
  • the photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor.
  • the photosensitive element converts the light signal into an electrical signal, and then transmits the electrical signal to the ISP to convert it into a digital image signal.
  • the ISP outputs the digital image signal to the DSP for processing.
  • DSP converts digital image signals into standard RGB, YUV and other image signals.
  • electronic device 100 may include one or more cameras 193 .
  • Digital signal processors are used to process digital signals. In addition to digital image signals, they can also process other digital signals. For example, when the electronic device 100 selects a frequency point, the digital signal processor is used to perform Fourier transform on the energy of the frequency point.
  • Video codecs are used to compress or decompress digital video.
  • the electronic device 100 may support one or more video codecs.
  • the electronic device 100 can play or record videos in various encoding formats, such as: moving picture experts group (moving picture experts group, MPEG) 1, MPEG2, MPEG3, MPEG4, etc.
  • the external memory interface 120 can be used to connect an external memory card, such as a Micro SD card, so as to expand the storage capacity of the electronic device 100.
  • the external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function, such as saving music, video, and other files in the external memory card.
  • the internal memory 121 may be used to store computer-executable program codes including instructions.
  • the processor 110 executes various functional applications and data processing of the electronic device 100 by executing instructions stored in the internal memory 121 .
  • the internal memory 121 may include an area for storing programs and an area for storing data.
  • the stored program area can store an operating system, at least one application program required by a function (such as a sound playing function, an image playing function, etc.) and the like.
  • the storage data area can store data created during the use of the electronic device 100 (such as audio data, phonebook, etc.) and the like.
  • the internal memory 121 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, flash memory device, universal flash storage (universal flash storage, UFS) and the like.
  • the SIM card interface 195 is used for connecting a SIM card.
  • the SIM card can be connected and separated from the electronic device 100 by inserting it into the SIM card interface 195 or pulling it out from the SIM card interface 195 .
  • the electronic device 100 may support one or more SIM card interfaces.
  • SIM card interface 195 can support Nano SIM card, Micro SIM card, SIM card etc. Multiple cards can be inserted into the same SIM card interface 195 at the same time. The types of the multiple cards may be the same or different.
  • the SIM card interface 195 is also compatible with different types of SIM cards.
  • the SIM card interface 195 is also compatible with external memory cards.
  • the electronic device 100 interacts with the network through the SIM card to implement functions such as calling and data communication.
  • the electronic device 100 adopts an eSIM, that is, an embedded SIM card.
  • the eSIM card can be embedded in the electronic device 100 and cannot be separated from the electronic device 100 .
  • the touch circuit 196 can be used to collect signals generated by the user by touching the display screen 194 .
  • the structure illustrated in this application does not constitute a specific limitation on the electronic device 100 .
  • the electronic device 100 may include more or fewer components than shown, or combine certain components, or separate certain components, or arrange different components.
  • the illustrated components can be realized in hardware, software or a combination of software and hardware.
  • the software system of the electronic device 100 may adopt a layered architecture, an event-driven architecture, a micro-kernel architecture, a micro-service architecture or a cloud architecture.
  • This application takes the Android system with a layered architecture as an example to illustrate the software structure of the electronic device 100 .
  • the present application does not limit the type of the operating system of the electronic device, for example, the Android system, HarmonyOS, etc.
  • Fig. 2 is a structural block diagram of software and hardware of an electronic device applicable to the method provided by the embodiment of the present application.
  • the layered architecture divides the software into several layers, and each layer has a clear role and division of labor. Layers communicate through software interfaces.
  • the Android system is divided into four layers, which are, from top to bottom, the application layer, the application framework layer, the Android runtime (Android runtime) and system library, and the kernel layer.
  • the application layer can consist of a series of application packages. As shown in Figure 2, the application layer can include handwriting applications (such as brush applications), camera, gallery, calendar, calls, maps, navigation, WLAN, Bluetooth, music, video, games, shopping, travel, instant messaging (such as SMS, WeChat, etc.), smart home, device control and other applications.
  • the embodiments of this application include but are not limited thereto.
  • the handwriting application integrates the software service package of the handwriting layer tool and the management service of the handwriting window, and the handwriting window management includes functions such as handwriting file management, handwriting data management, and handwriting application window scaling.
  • the brush application can realize part of the functions of the handwriting application, and provide the software service package of the handwriting layer tool for the handwriting tool set, including functions such as brushes and canvases.
  • a handwriting application can be regarded as an application capable of generating a handwriting layer for carrying operation tracks.
  • the first electronic device may deploy a brush application or a handwriting application, and the second electronic device may deploy a handwriting application.
  • the application framework layer provides an application programming interface (application programming interface, API) and a programming framework for applications in the application layer.
  • the application framework layer includes some predefined functions.
  • the application framework layer can include an input management service (input manager service, IMS), a display policy service, a power management service (power manager service, PMS), a display management service (display manager service, DMS), an activity management service, a resource management service, a content provision service, a view system, a phone management service, a notification management service, a window management service, a process management service, a drawing service, a wireless projection service, a reverse interaction service, a display framework, etc.
  • the input management service can be used to find the corresponding application program for the input data, so that the corresponding application program can process the input data.
  • the wireless screen projection service can work with the Wi-Fi driver and Wi-Fi IC to complete the transmission of handwritten data and screen projection images.
  • the function of the reverse interaction service is the same as that of the wireless projection service, and will not be repeated here.
  • the display framework can be used to call the display IC to draw and synthesize images, and can control the display to display images.
  • the window management service can be used to manage window programs.
  • the window management service can obtain the size of the display screen, determine whether there is a status bar, lock the screen, capture the screen, etc.
  • Content provisioning services can be used to store and retrieve data and make it accessible to applications.
  • the data may include video, images, audio, calls made and received, browsing history and bookmarks, a phonebook, etc.
  • the view system includes visual controls, such as controls for displaying text, controls for displaying pictures, and so on.
  • the view system can be used to build applications.
  • a display interface can consist of one or more views.
  • a display interface including a text message notification icon may include a view for displaying text and a view for displaying pictures.
  • the resource management service provides various resources for the application, such as localized strings, icons, pictures, layout files, video files, and so on.
  • the notification management service enables the application to display notification information in the status bar, which can be used to convey notification-type messages, and can automatically disappear after a short stay without user interaction.
  • the notification manager is used to notify the download completion, message reminder, etc.
  • the notification manager can also present a notification that appears on the top status bar of the system in the form of a chart or scroll-bar text, such as a notification of an application running in the background, or a notification that appears on the screen in the form of a dialog window, for example, prompting text information in the status bar, issuing a prompt sound, vibrating the electronic device, flashing the indicator light, etc.
  • the drawing service can be used to draw the interface.
  • the drawing service can convert the application layout and resources into a format that can be recognized by the display service of the kernel layer.
  • the drawing service can send the drawn interface to the display service of the kernel layer, and then realize the display of the drawn interface.
  • the Android runtime can include core libraries and a virtual machine.
  • the Android runtime is responsible for the scheduling and management of the Android system.
  • the core library can contain two parts: one part is the functions that the Java language needs to call, and the other part is the core library of the Android system.
  • the application layer and the application framework layer run in virtual machines.
  • the virtual machine executes the java files of the application program layer and the application program framework layer as binary files.
  • the virtual machine is used to perform functions such as object life cycle management, stack management, thread management, security and exception management, and garbage collection.
  • a system library can include multiple function modules, for example: a state monitoring service, a surface manager (surface manager), media libraries (media libraries), a three-dimensional graphics processing library (for example, OpenGL ES), a two-dimensional (2 dimensions, 2D) graphics engine (for example, SGL), etc.
  • the state monitoring service is used to determine the specific orientation of the mobile phone, the physical state of the flexible screen, etc. according to the monitoring data reported by the kernel layer.
  • the surface manager is used to manage the display subsystem, and provides fusion of 2D and three-dimensional (3 dimensions, 3D) layers for multiple applications.
  • the media library supports playback and recording of various commonly used audio and video formats, as well as still image files, etc.
  • the media library can support multiple audio and video encoding formats, such as MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, etc.
  • the 3D graphics processing library is used to implement 3D graphics drawing, image rendering, synthesis and layer processing, etc.
  • 2D graphics engine is a drawing engine for 2D drawing.
  • the kernel layer is the layer between hardware and software.
  • the kernel layer includes at least an input subsystem, a power management service, a sensor service (also called a sensor driver), a display service (also called a display driver), a camera driver, an audio driver, a Wi-Fi driver, etc.
  • the input subsystem includes a driver layer, an input subsystem core and an event processing layer, which are used to manage input devices of different types, different principles, and different input information.
  • the input device may be, for example, a keyboard, a mouse, a touch screen, a joystick, handwriting, or some somatosensory input devices.
  • the driver layer is used to convert the underlying hardware input into a unified event form and report to the input subsystem core (input core).
  • the driver layer may include a touch panel driver (TP Driver).
  • the touch driver is used to receive signals from the touch circuit and convert them into corresponding output signals, which are input to the system input management service for further processing.
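  • a minimal sketch of that normalization step, with hypothetical types; a real touch driver reports through the kernel's input subsystem rather than plain Java objects.

    // Hypothetical illustration of the driver layer's job: convert raw
    // touch-panel samples into a unified event form before handing them to the
    // input subsystem core. Types and fields are illustrative only.
    public class TouchDriverSketch {

        // Raw sample as the touch circuit might report it.
        public record RawSample(int rawX, int rawY, boolean down, long nanos) {}

        // Unified input event as consumed by the input management service.
        public record InputEvent(String type, int x, int y, long timestampMs) {}

        public static InputEvent toUnifiedEvent(RawSample s, double scaleX, double scaleY) {
            String type = s.down() ? "TOUCH_DOWN_OR_MOVE" : "TOUCH_UP";
            return new InputEvent(type,
                    (int) Math.round(s.rawX() * scaleX),   // map panel coordinates
                    (int) Math.round(s.rawY() * scaleY),   // to display coordinates
                    s.nanos() / 1_000_000);                // ns -> ms
        }
    }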
  • the Wi-Fi driver is used to drive the Wi-Fi IC to work.
  • the system library and kernel layer below the application framework layer can be called the underlying system.
  • the underlying system includes an underlying display system for providing display services.
  • the underlying display system includes a display driver in the kernel layer and a surface manager in the system library.
  • the hardware of the electronic device may include, but is not limited to, a power button, sensors, a display screen, a fingerprint sensor, a camera, and the like.
  • the screen projection involved in this application refers to the transmission of media data (such as audio, video, pictures, etc.) on one electronic device to another electronic device for presentation, so as to achieve the effect of synchronously displaying the same media data among multiple electronic devices.
  • the screen projection involved in this application may include wired screen projection and wireless screen projection.
  • wired screen projection can establish a connection between multiple electronic devices through a high definition multimedia interface (HDMI), and transmit media data through an HDMI transmission line.
  • wireless screen projection can establish a connection between multiple electronic devices through a wireless projection protocol such as Miracast, and transmit media data through a wireless communication module.
  • FIG. 3 is a schematic diagram of a screen projection system applicable to the method for processing a screen projection image provided in an embodiment of the present application.
  • the screen projection system 300 includes at least two electronic devices, such as a first electronic device 310 and a second electronic device 320 as shown in the figure.
  • a screen projection connection can be established between the first electronic device 310 and the second electronic device 320 through the screen projection port, so as to realize the transmission of media data.
  • the screen projection port may include a wired port and/or a wireless port; the wired port may be HDMI, and the wireless port may include an API and may also include a wireless hardware projection module. This embodiment of the present application does not limit it.
  • as shown in FIG. 3, the screen projection port of the first electronic device 310 includes a first wired port and a first wireless port.
  • the first wired port and the first wireless port may be integrated on the first electronic device 310, or may exist independently of the first electronic device 310.
  • the screen projection port of the second electronic device may include a second wired port and a second wireless port, and the second wired port and the second wireless port may be integrated on the second electronic device 320 or exist independently of the second electronic device 320. This embodiment of the present application does not limit it.
  • the second electronic device 320 performs a screen projection operation to the first electronic device 310 .
  • the user can make marks on the first electronic device 310 with a finger, a stylus, a mouse, or the like, and the data generated from these marks can be called report point data; the operations in this embodiment should therefore be understood as user operations in general, not limited to operations performed by hand.
  • in the current technology, the report point data needs to be transmitted to the second electronic device through the screen projection port; the second electronic device draws the trajectory of the report point data, synthesizes the drawn trajectory with the projected screen image, and then transmits the result back to the first electronic device.
  • FIG. 4 shows a possible processing flow of wireless screen projection.
  • the process shown by the solid arrow in FIG. 4 shows the process in which the first electronic device transmits the reporting point data to the second electronic device and the second electronic device draws the reporting point data and synthesizes it with the projected screen image.
  • the process shown by the hollow arrow in FIG. 4 shows the process in which the second electronic device transmits the synthesized image to the first electronic device and displays it on the first electronic device.
  • first, the first electronic device generates report point data in response to the user's operation.
  • the first electronic device can collect the handwritten data input by the user through the TP IC and feed the collected handwritten data into the controller, which calculates the report point data.
  • the TP IC inputs the report point data to the TP Driver, and the TP Driver converts the report point data into a certain event form.
  • the user can use a stylus to operate on the display screen of the first electronic device to generate handwritten data, and the user can also use a mouse or finger to operate on the display screen of the first electronic device.
  • the user can also perform operations in other ways, which are not limited in this embodiment of the present application.
  • the report point data may include position information corresponding to the operation on the display screen, as modeled in the sketch below.
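As a rough illustration only: the report point data might be modeled as a small record of screen coordinates plus timing, as in the Kotlin sketch below. All names (`ReportPoint`, `fromTouchSample`) and the pressure field are hypothetical; the patent text only requires that a report point carry the position of the operation on the display screen.

```kotlin
// Hypothetical model of one report point computed by the controller from TP IC samples.
data class ReportPoint(
    val x: Float,            // horizontal position on the display screen, in pixels
    val y: Float,            // vertical position on the display screen, in pixels
    val timestampMs: Long,   // sample time; used later for time-stamp synchronization
    val pressure: Float = 1f // stylus/finger pressure; an assumption, not required by the text
)

// Illustrative stand-in for the TP Driver's conversion of a raw hardware sample
// into the "certain event form" mentioned above.
fun fromTouchSample(rawX: Int, rawY: Int, nowMs: Long, scale: Float): ReportPoint =
    ReportPoint(rawX * scale, rawY * scale, nowMs)
```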
  • next, the first electronic device transmits the report point data to the second electronic device through the wireless screen projection service or the reverse interactive service.
  • the TP Driver of the first electronic device passes the converted report point data, via the wireless projection service or the reverse interactive service, to the Wi-Fi driver of the first electronic device.
  • the Wi-Fi driver of the first electronic device drives the Wi-Fi IC of the first electronic device into the working state, so that the report point data is transmitted to the Wi-Fi IC of the second electronic device.
  • the Wi-Fi IC of the second electronic device inputs the report point data to the Wi-Fi driver of the second electronic device, and that Wi-Fi driver transmits the report point data, through the wireless projection service, to the system input management service of the second electronic device.
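At its core, the hop just described is a serialize-and-send step. The sketch below uses a plain TCP socket as a stand-in for the wireless screen projection service and Wi-Fi air interface (the actual transport is not specified at this level), and reuses the hypothetical `ReportPoint` type from the earlier sketch; the wire format is likewise an assumption.

```kotlin
import java.io.DataOutputStream
import java.net.Socket

// Minimal sketch: push one batch of report points to the peer device.
fun sendReportPoints(socket: Socket, points: List<ReportPoint>) {
    val out = DataOutputStream(socket.getOutputStream())
    out.writeInt(points.size)        // number of points in this batch
    for (p in points) {
        out.writeFloat(p.x)
        out.writeFloat(p.y)
        out.writeLong(p.timestampMs) // lets the receiver order and associate samples
    }
    out.flush()                      // hand the batch to the Wi-Fi stack promptly
}
```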
  • the second electronic device draws a second operation track based on the reporting point data.
  • the system input management service of the second electronic device transmits the reporting point data to the handwriting application of the second electronic device, and triggers the handwriting application of the second electronic device to draw a second operation track.
  • the handwriting application of the second electronic device sends the report point data to the window management service of the second electronic device, the window management service sends it to the display framework of the second electronic device, and the display framework calls the display IC of the second electronic device to draw the second operation track.
  • the second operation track is drawn based on the report point data (a minimal drawing sketch follows).
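Drawing an operation track from report point data essentially means connecting successive points into a stroke. A minimal Android-style sketch (software rendering stands in for the display IC; stroke width and color are arbitrary choices), again assuming the `ReportPoint` type above:

```kotlin
import android.graphics.Canvas
import android.graphics.Color
import android.graphics.Paint
import android.graphics.Path

// Render the operation track onto the target layer's canvas by connecting
// consecutive report points with straight segments.
fun drawOperationTrack(canvas: Canvas, points: List<ReportPoint>) {
    if (points.isEmpty()) return
    val path = Path().apply {
        moveTo(points.first().x, points.first().y)
        for (p in points.drop(1)) lineTo(p.x, p.y) // connect successive samples
    }
    val paint = Paint().apply {
        style = Paint.Style.STROKE
        strokeWidth = 4f
        color = Color.BLACK
        isAntiAlias = true
    }
    canvas.drawPath(path, paint)
}
```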
  • the second electronic device synthesizes the second operation track and the projected screen image.
  • after the display IC of the second electronic device has drawn the second operation track, the second electronic device synthesizes the second operation track with the images of one or more local layers.
  • the second electronic device sends the synthesized image to the first electronic device.
  • the second electronic device sends the synthesized image to the first electronic device through the wireless screen projection service and the Wi-Fi air interface.
  • specifically, the display IC of the second electronic device transmits the synthesized image, through the wireless projection service, to the Wi-Fi driver of the second electronic device, and that Wi-Fi driver drives the Wi-Fi IC of the second electronic device into the working state, so that the synthesized image is transmitted to the Wi-Fi IC of the first electronic device.
  • the Wi-Fi IC of the first electronic device inputs the synthesized image to the Wi-Fi driver of the first electronic device, and that Wi-Fi driver transmits the synthesized image, through the wireless projection service, to the display framework of the first electronic device.
  • the first electronic device can display the synthesized image.
  • the display framework of the first electronic device invokes the display IC, and controls the display screen to display the synthesized image.
  • based on the above steps, the report point data generated from the user's operation on the first electronic device has to be transmitted to the second electronic device, and the second electronic device draws the operation track, synthesizes it with the images of one or more local layers, and only then transmits the result to the first electronic device, where it is displayed.
  • FIG. 5 shows the possible time delays caused by adopting the processing flow in FIG. 4.
  • assuming a report rate of 360 hertz (Hz) and a screen refresh rate of 120 Hz, it takes about 10 milliseconds (ms) for the first electronic device to obtain the report point data, about 6 ms to transmit the report point data to the second electronic device, about 8.3 ms for the second electronic device to draw the operation track and synthesize it with the projected screen image, about 52.5 ms to transmit the synthesized image back to the first electronic device, and about 7.5 ms to display the synthesized image on the first electronic device. From obtaining the report point data to displaying the operation track and the projected screen image on the first electronic device thus takes about 84.3 ms in total; the display delay of the handwritten trajectory is too long and the user experience is poor. It should be understood that the delays shown in FIG. 5 are only one possible example and should not limit the embodiments of the present application.
  • based on this, this application proposes a processing method for projected screen images: by deploying a paintbrush application on the first electronic device, the first electronic device can draw the operation track by itself.
  • once that operation track has been synthesized with the projected screen image from the second electronic device, the synthesized image can be displayed directly on the first electronic device.
  • this removes the delay caused by transmitting the report point data to the second electronic device and by the second electronic device transmitting the drawn operation track data back to the first electronic device, thereby reducing the display delay of the operation track and giving the user a better experience.
  • FIG. 6 and FIG. 7 are schematic flowcharts of a method for processing a projected screen image provided by an embodiment of the present application.
  • FIG. 6 shows a main flow of a method 600 for processing a projected image
  • FIG. 7 describes the specific implementation of each step in the method 600 in more detail in combination with the structural block diagram of software and hardware of the electronic device shown in FIG. 2 .
  • the process shown in FIG. 7 is described in detail by taking wireless screen projection as an example, but this should not constitute any limitation to the embodiment of the present application.
  • the method provided in the embodiment of the present application can be applied in the scenario of wired screen projection or wireless screen projection.
  • the method 600 may include step 601 to step 610 .
  • in step 601, the first electronic device receives a first projected screen image from a second electronic device.
  • the first projected screen image is an image synthesized from one or more local layers of the second electronic device and then sent by the second electronic device to the first electronic device.
  • for the specific sending process, reference may be made to the related description above in conjunction with FIG. 4; for the sake of brevity, details are not repeated here.
  • in step 602, the first electronic device generates report point data in response to the user's operation.
  • for the specific implementation of this step, reference may be made to the relevant description of FIG. 4; for the sake of brevity, details are not repeated here.
  • in step 603, the first electronic device draws a first operation track based on the report point data.
  • in this embodiment of the application, since a paintbrush application has been added to the first electronic device, after the first electronic device obtains the report point data it can draw the operation track directly through the paintbrush application based on the report point data.
  • as shown in FIG. 7, the TP Driver of the first electronic device sends the converted report point data to the paintbrush application of the first electronic device; the paintbrush application draws a handwriting layer reflecting the report point data and its distribution (that is, the first operation track), and sends this handwriting layer to the display framework of the first electronic device, which calls the display IC to draw the first operation track.
  • the first electronic device may draw the operation track in units of frames, so as to draw one frame of image at a time.
  • the first electronic device may draw one frame of image at intervals, and each frame of image includes the operation track drawn in the most recent period of time.
  • the duration of the period of time corresponding to the operation track may be greater than the sum of the two-way transmission delay between the first electronic device and the second electronic device and the delay for the second electronic device to locally process the report point data received within that period. The first operation track can therefore be an operation track drawn according to the report point data within a period of time whose duration is longer than that sum, as the toy calculation below illustrates.
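The constraint on the per-frame window can be read as a simple inequality: the window must outlast the round trip plus the peer's local processing time. A toy calculation, using the example delay figures from FIG. 5 purely as placeholders:

```kotlin
// The track window must satisfy: window > uplink + downlink + remote processing.
fun minTrackWindowMs(uplinkMs: Double, downlinkMs: Double, remoteProcessingMs: Double): Double =
    uplinkMs + downlinkMs + remoteProcessingMs

fun main() {
    val bound = minTrackWindowMs(uplinkMs = 6.0, downlinkMs = 52.5, remoteProcessingMs = 8.3)
    println("the track window must exceed $bound ms") // 66.8 ms with these figures
}
```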
  • in step 604, the first electronic device synthesizes the first operation track with the first projected screen image from the second electronic device to obtain a synthesized image.
  • for the first projected screen image, reference may be made to step 601; for example, it may be an image transmitted to the first electronic device through the wireless screen projection service and the Wi-Fi air interface.
  • in one possible implementation, as shown in FIG. 7, the display framework of the second electronic device acquires the first projected screen image, and the display IC transmits it, through the wireless projection service, to the Wi-Fi driver of the second electronic device; that Wi-Fi driver drives the Wi-Fi IC of the second electronic device into the working state, so that the first projected screen image is transmitted to the Wi-Fi IC of the first electronic device.
  • the Wi-Fi IC of the first electronic device inputs the first projected screen image to the Wi-Fi driver of the first electronic device, which transmits it, through the wireless projection service, to the display framework of the first electronic device.
  • the display framework of the first electronic device then calls the display IC of the first electronic device to synthesize the previously drawn first operation track with the first projected screen image from the second electronic device, obtaining the synthesized image.
  • for the first electronic device, the first operation track can be carried on the handwriting layer, and the first projected screen image can be carried on the screen projection layer.
  • the synthesis of the first operation track and the first projected screen image can therefore be performed by synthesizing the handwriting layer with the screen projection layer; the resulting synthesized image includes both the projected screen image in the screen projection layer and the operation track in the handwriting layer. That is to say, the synthesis of the operation track and the projected screen image is realized, as sketched below.
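Synthesizing the handwriting layer with the screen projection layer amounts to alpha-compositing one image over the other. A minimal sketch with standard Android bitmaps; in the real pipeline this work is attributed to the display IC, and a `Canvas` draw is used here purely for illustration:

```kotlin
import android.graphics.Bitmap
import android.graphics.Canvas

// Composite the (transparent-background) handwriting layer over the projected
// screen image; Canvas.drawBitmap blends with SRC_OVER by default.
fun composeLayers(projected: Bitmap, handwriting: Bitmap): Bitmap {
    val result = projected.copy(Bitmap.Config.ARGB_8888, /* isMutable = */ true)
    Canvas(result).drawBitmap(handwriting, 0f, 0f, null)
    return result
}
```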
  • it should also be understood that this embodiment of the present application does not limit the execution order between step 601 and steps 602 and 603; for example, step 601 may be executed before step 602 and step 603, or may be executed simultaneously with step 602 or step 603.
  • in step 605, the first electronic device displays the synthesized image.
  • after obtaining the synthesized image, the first electronic device can display it.
  • as shown in FIG. 7, once the display framework of the first electronic device learns that the display IC has finished synthesizing the image, it can control the display IC to transmit the synthesized image to the display screen and refresh the display.
  • based on the above technical content, by deploying a handwriting application on the first electronic device, the handwriting application can draw the operation track locally based on a newly received operation and then synthesize it with the first projected screen image from the second electronic device to obtain the synthesized image.
  • the display delay of the new operation track does not depend on the transmission delay between the second electronic device and the first electronic device, which can reduce the display delay of the operation track.
  • the first electronic device may directly display an image synthesized by the operation track and the first projected screen image based on the user's operation on the handwriting layer.
  • the time taken by the first electronic device from receiving the user's operation to displaying the synthesized image is shown in FIG. 8: it takes about 10 milliseconds (ms) for the TP IC and the input subsystem of the first electronic device to obtain the report point data; about 8.3 ms for the first electronic device to draw the operation track and synthesize it with the first projected screen image, corresponding to the flow from the system input service to the handwriting drawing of the handwriting application in FIG. 8; and about 7.5 ms to display the synthesized image on the first electronic device, corresponding to the flow from the display framework to the display screen in FIG. 8. The total is about 25.8 ms. Compared with the implementation of FIG. 5, this removes the roughly 6 ms spent transmitting the report point data to the second electronic device and the roughly 52.5 ms spent transmitting the synthesized image back to the first electronic device, saving about 58.5 ms and greatly reducing the end-to-end delay.
  • it should be understood that the time delay shown in FIG. 8 is given relative to the time delay shown in FIG. 5; when different devices are used, each step in FIG. 5 and FIG. 8 may produce a delay different from the example, but the proportion of time consumed by each part is similar, as the comparison below recomputes.
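The saving is simply the removal of the two network legs from the critical path. Recomputing both budgets with the example figures used in FIG. 5 and FIG. 8:

```kotlin
fun main() {
    // FIG. 5 path: acquire -> uplink -> remote draw + compose -> downlink -> display
    val remotePathMs = 10.0 + 6.0 + 8.3 + 52.5 + 7.5 // = 84.3 ms end to end
    // FIG. 8 path: acquire -> local draw + compose -> display
    val localPathMs = 10.0 + 8.3 + 7.5                // = 25.8 ms end to end
    println("remote: $remotePathMs ms, local: $localPathMs ms, saved: ${remotePathMs - localPathMs} ms")
}
```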
  • the method may further include step 606 to step 609 .
  • the first electronic device sends the first operation track to the second electronic device.
  • as shown in FIG. 7, the first electronic device can send the first operation track drawn by the display IC to the Wi-Fi IC of the second electronic device through the wireless projection service, the Wi-Fi driver, and the Wi-Fi IC.
  • optionally, when the first electronic device sends the first operation track to the second electronic device, it may also send the time stamp corresponding to the first operation track.
  • the time stamp records the drawing time of the first operation track; since the first operation track is drawn in real time based on the user's operation, the time stamp can also be regarded as time information recording the user's operation.
  • in step 607, the second electronic device may synchronize the received first operation track with the images of one or more layers.
  • as shown in FIG. 7, after the Wi-Fi IC of the second electronic device receives the first operation track from the first electronic device, it sends the first operation track, through the Wi-Fi driver, to the system input management service of the second electronic device.
  • the system input management service of the second electronic device sends the first operation track to the handwriting application, which realizes the synchronization of the data in the handwriting application.
  • here, synchronization may specifically mean associating the above first operation track and its corresponding time stamp with the images of the one or more layers used to synthesize the first projected screen image, and storing them in the second electronic device, so that the second electronic device can obtain the latest operation track.
  • it should be understood that the first electronic device may not have the function of saving the operation track, so the operation track needs to be synchronized to the second electronic device so that the second electronic device can further process the first operation track, for example save or delete it, or synthesize it with the images of one or more local layers of the second electronic device as described later.
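Synchronization on the second electronic device can be pictured as keying each received track (by its time stamp) so that later save, delete, or compose operations can find it. A hypothetical in-memory version; the class and method names are invented for illustration:

```kotlin
// Hypothetical synchronization store on the second electronic device.
class TrackStore {
    private val tracks = sortedMapOf<Long, List<ReportPoint>>()

    // Associate a received first operation track with its drawing time stamp.
    fun sync(timestampMs: Long, track: List<ReportPoint>) {
        tracks[timestampMs] = track
    }

    // Latest track, e.g. for composing the second projected screen image.
    fun latest(): List<ReportPoint>? = tracks.values.lastOrNull()

    // Drop a track the user has deleted.
    fun delete(timestampMs: Long) {
        tracks.remove(timestampMs)
    }
}
```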
  • in step 608, the second electronic device obtains a second projected screen image based on the first operation track and the images of one or more local layers.
  • the method 600 may further include: Step 609, the second electronic device displays the second projected screen image. That is to say, the first electronic device may send the first operation track drawn by itself to the second electronic device. Therefore, the second electronic device can directly combine the first operation track with images of one or more local layers, without drawing the operation track according to the reporting point data. It should be understood that the images of the one or more layers are images used to synthesize the first screen projection image.
  • the first operation track received by the second electronic device is an image carried on the handwriting layer.
  • the second electronic device may synthesize the images carried on one or more local layers and the first operation track carried on the handwritten layer to obtain a second projected screen image.
  • as shown in FIG. 7, the handwriting application of the second electronic device may transmit the first operation track to the system window management service, the system window management service transmits it to the display framework, and the display framework calls the display IC to synthesize the first operation track with the images of one or more local layers to obtain the second projected screen image.
  • the display IC of the second electronic device may transmit the second projected screen image to the display screen for display on the display screen.
  • in another possible implementation, the first electronic device may also send the acquired report point data to the second electronic device, so that the second electronic device draws the second operation track by itself according to the report point data. That is, step 606 above may be replaced by: the first electronic device sends the report point data to the second electronic device.
  • step 607 may be replaced by: the second electronic device draws a second operation track based on the received reporting point data.
  • the second electronic device obtains a second projected screen image based on the second operation track and images of one or more local layers; and in step 609, the second electronic device displays the second projected screen image.
  • optionally, the first electronic device may send the time stamp of the report point data together with the report point data to the second electronic device; the time stamp records the generation time of the report point data. Since the report point data is generated in real time based on the user's operation, the time stamp can also be regarded as time information recording the user's operation.
  • the second electronic device may draw the second operation track jointly from the most recently received report point data and the previously received report point data, or it may synthesize the operation track drawn from the most recently received report point data with the previously drawn operation track to obtain the second operation track; this embodiment of the present application does not limit it. Both options are sketched below.
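The two drawing options just described could look like the following sketch; both names are hypothetical. Option A redraws the whole second operation track from every point received so far, while option B keeps the previously drawn track and merges in only the newest segment:

```kotlin
// Option A: one track drawn jointly from all report points received so far.
fun trackFromAllPoints(history: List<ReportPoint>, fresh: List<ReportPoint>): List<ReportPoint> =
    history + fresh

// Option B: keep the previously drawn track and append the newly drawn segment,
// i.e. synthesize the new partial track with the earlier one.
class AccumulatedTrack {
    private val points = mutableListOf<ReportPoint>()
    fun merge(fresh: List<ReportPoint>): List<ReportPoint> {
        points += fresh
        return points
    }
}
```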
  • the second electronic device may determine whether to display the operation track based on the configuration of the user.
  • the user may set the second electronic device, for example, to "display the operation track locally".
  • the second electronic device can generate an instruction based on this setting and send it to the handwriting application, so that the handwriting application determines whether to display the operation track locally. If the second electronic device is set to display the operation track locally, step 609 above may be performed; if the second electronic device is set not to display the user's operation track on the first electronic device locally, step 609 may be skipped, and only the projected screen image that does not include the operation track may be displayed on the second electronic device, as the sketch below illustrates.
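The user-facing configuration can be a plain boolean that gates step 609; everything in the sketch below is a hypothetical name:

```kotlin
// Hypothetical setting controlling whether the second electronic device mirrors
// the operation track locally (step 609) or shows only the plain projected image.
class ProjectionSettings(var showTrackLocally: Boolean = true)

fun onSecondImageReady(settings: ProjectionSettings, withTrack: () -> Unit, plainImage: () -> Unit) {
    if (settings.showTrackLocally) withTrack() else plainImage()
}
```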
  • further optionally, the method 600 may further include step 610: the second electronic device sends the second projected screen image to the first electronic device. After the second electronic device obtains the second projected screen image, it may send it to the first electronic device; correspondingly, the first electronic device receives the second projected screen image from the second electronic device.
  • the synthesized image sent by the second electronic device to the first electronic device is the latest projected screen image.
  • the projected screen image may be referred to as a second projected screen image, for example.
  • the specific process is similar to the process shown above in conjunction with FIG. 4 , and will not be repeated here for the sake of brevity.
  • the second projected screen image can be used to overlay the image synthesized in the first electronic device based on the first operation track and the first projected screen image, thereby realizing local image update of the first electronic device.
  • in one possible case, after the first electronic device sends the report point data of the first operation track to the second electronic device, the user performs a new operation on the first electronic device; that is, the first electronic device can draw a new operation track based on the new operation.
  • the first electronic device may generate a frame of images at intervals, and each frame of images may include an operation track drawn in a recent period of time. Therefore, the above-mentioned newly drawn operation track can be understood as a track drawn based on operations within a recent period of time, which is another example of the first operation track.
  • to distinguish the first operation track in the previous frame of image from that in the next frame of image, the first operation track in the previous frame of image is denoted below as operation track 1, and the operation track in the next frame of image is denoted as operation track 2.
  • in one implementation, the first electronic device can synthesize operation track 2 with the newly received projected screen image from the second electronic device, that is, synthesize the handwriting layer with the screen projection layer, to obtain a new synthesized image, and display the new synthesized image on the first electronic device.
  • the above steps may be repeatedly performed.
  • the first electronic device may synthesize operation track 2 with the most recently received screen projection image (the second projected screen image) from the second electronic device; thereby, the latest operation track can be displayed on the latest synthesized image.
  • a part of the operation track on the image may come from the second projected screen image, and a part from the operation track 2 .
  • the first electronic device may also combine the operation track 2 with the operation track 1, and then combine it with the second projected screen image.
  • the latest operation trajectory can be displayed on the latest synthesized image, and all the operation trajectories on the image come from the operation trajectories drawn locally by the first electronic device.
  • the embodiment of the present application does not limit this.
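Schematically, the two update strategies differ only in which images the newest local stroke is composed with. A sketch reusing the hypothetical `composeLayers` helper above, with the track bitmaps and the enum invented for illustration:

```kotlin
import android.graphics.Bitmap

enum class UpdateStrategy { COMPOSE_WITH_REMOTE, MERGE_LOCAL_THEN_REMOTE }

// track1/track2 are bitmaps of the previously and most recently drawn local strokes.
fun updateDisplayedImage(
    strategy: UpdateStrategy,
    secondProjected: Bitmap,
    track1: Bitmap,
    track2: Bitmap
): Bitmap = when (strategy) {
    // Older strokes come back inside the second projected screen image; only
    // operation track 2 is taken from the local drawing.
    UpdateStrategy.COMPOSE_WITH_REMOTE ->
        composeLayers(secondProjected, track2)
    // All strokes stay local: merge track 2 into track 1 first, then compose.
    UpdateStrategy.MERGE_LOCAL_THEN_REMOTE ->
        composeLayers(secondProjected, composeLayers(track1, track2))
}
```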
  • based on the above technical content, by deploying a handwriting application on the first electronic device, the handwriting application can locally draw the operation track based on a newly received operation.
  • in this way, even if the latest projected screen image drawn by the second electronic device (that is, the image containing the operation track drawn from the new operation) has not yet reached the first electronic device, the first electronic device can directly draw the latest operation track for the newly received operation, synthesize it with the projected screen image previously received from the second electronic device, and display the result. Therefore, when a new operation occurs on the first electronic device, the display delay of the new operation track does not depend on the transmission delay between the second electronic device and the first electronic device, thereby reducing the display delay of the operation track.
  • the local image update of the first electronic device is not limited to the process illustrated above.
  • the first electronic device receives a new user operation, it can draw the operation track by itself, and synthesize it with the projected image from the second electronic device to obtain a synthesized image, without sending the new operation track or reporting point data to the second electronic device. That is, performing steps 601 to 605 in the above method 600 without performing steps 606 to 610 may also implement updating of the locally displayed image of the first electronic device. This embodiment of the present application does not limit it.
  • when steps 606 to 610 are executed, the present application does not limit the execution order of steps 609 and 610; for example, step 610 may be executed simultaneously with step 609, or before or after it.
  • FIG. 9 is a schematic block diagram of an apparatus 900 for processing a projected screen image provided by an embodiment of the present application.
  • the device 900 may include a transceiver module 910 , a processing module 920 and a display module 930 .
  • the transceiver module 910 can be used to receive the first projected screen image from the second electronic device; the processing module 920 can be used to generate report point data in response to the user's handwriting operation and to draw an operation track based on the report point data; the processing module 920 is further configured to synthesize the operation track with the first projected screen image to obtain a synthesized image; and the display module 930 can be used to display the synthesized image.
  • optionally, the transceiver module 910 is further configured to send the operation track to the second electronic device, where the operation track is used for synthesis with the images of one or more layers on the second electronic device, and the images of the one or more layers are used by the second electronic device to synthesize the first projected screen image.
  • optionally, the transceiver module 910 is further configured to send the report point data to the second electronic device, where the report point data is used by the second electronic device to draw the operation track, the operation track is used for synthesis with the images of one or more local layers on the second electronic device, and the images of the one or more layers are used by the second electronic device to synthesize the first projected screen image.
  • optionally, the transceiver module 910 is further configured to receive a second projected screen image from the second electronic device, where the second projected screen image is obtained by synthesizing the images of the one or more layers with the corresponding operation track; the display module 930 is further configured to display the second projected screen image, so as to update the synthesized image.
  • the apparatus 900 may include a module or unit for executing the method executed by the first electronic device in the embodiment of the method 600 in FIG. 6 .
  • the transceiving module 910 can be used to perform step 601, step 606 and step 610 in the above method 600
  • the processing module 920 can be used to perform steps 602 to 604 in the above method 600
  • the display module 930 can be used to perform step 605 in the above method 600, as the interface sketch below suggests.
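Read as software, apparatus 900 decomposes into three cooperating roles. One possible interface sketch whose method names mirror the steps of method 600 (the signatures themselves are assumptions, not taken from the text):

```kotlin
import android.graphics.Bitmap

// Hypothetical interfaces mirroring transceiver module 910, processing module 920
// and display module 930.
interface TransceiverModule {
    fun receiveProjectedImage(): Bitmap                    // step 601
    fun sendOperationTrack(track: Bitmap)                  // step 606
    fun receiveSecondProjectedImage(): Bitmap              // step 610
}

interface ProcessingModule {
    fun onUserOperation(rawX: Int, rawY: Int)              // step 602: generate report points
    fun drawTrack(): Bitmap                                // step 603
    fun compose(projected: Bitmap, track: Bitmap): Bitmap  // step 604
}

interface DisplayModule {
    fun show(image: Bitmap)                                // step 605
}
```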
  • the apparatus 900 may also be used to execute the method executed by the first electronic device in the embodiment in FIG. 7 . Moreover, each module in the device 900 and other operations and/or functions mentioned above are respectively for realizing the corresponding process of the embodiment in FIG. 7 . For the sake of brevity, it will not be described in detail here.
  • the apparatus 900 for processing a projected screen image may correspond to at least part of the first electronic device in the method embodiment according to the embodiment of the present application.
  • the apparatus 900 may be the first electronic device, or a component in the first electronic device, such as a chip or a chip system.
  • the functions implemented by the screen projection image processing apparatus 900 may be implemented by one or more processors executing corresponding programs.
  • FIG. 10 is a schematic block diagram of an apparatus 1000 for processing a projected screen image provided by an embodiment of the present application.
  • the device 1000 may include a transceiver module 1010 , a processing module 1020 and a display module 1030 .
  • the transceiver module 1010 is used to send the first projected screen image to the first electronic device;
  • the transceiver module 1010 is also used to receive the operation track from the first electronic device, where the operation track is drawn based on the user's operation on the first electronic device;
  • the processing module 1020 is configured to synchronize the operation trajectory with images of one or more layers, and the images of one or more layers are used to synthesize the first projected screen image .
  • the processing module 1020 is further configured to synthesize the operation trajectory with images of one or more layers to obtain a second screen projection image; the display module 1030 is configured to display the second screen projection image.
  • the transceiver module 1010 is configured to send the second screen projection image to the first electronic device.
  • the apparatus 1000 may include a module or unit for executing the method executed by the second electronic device in the embodiment of the method 600 in FIG. 6 .
  • the transceiving module 1010 can be used to perform step 601, step 606 and step 610 in the above method 600
  • the processing unit 1020 can be used to perform steps 607 to 608 in the above method 600
  • the display module 1030 can be used to perform step 609 in the above method 600.
  • the apparatus 1000 can also be used to execute the method executed by the second electronic device in the embodiment in FIG. 7 . Moreover, each module in the apparatus 1000 and other operations and/or functions mentioned above are respectively for realizing the corresponding flow of the embodiment in FIG. 7 . For the sake of brevity, it will not be described in detail here.
  • the apparatus 1000 for processing a projected screen image may correspond to at least part of the second electronic device in the method embodiment according to the embodiment of the present application.
  • the apparatus 1000 may be the second electronic device, or a component in the second electronic device, such as a chip or a chip system.
  • the functions implemented by the screen projection image processing apparatus 1000 may be implemented by one or more processors executing corresponding programs.
  • the present application also provides an electronic device or an internal device thereof.
  • the electronic device or device may include one or more processors, so as to realize the functions of the above-mentioned processing device 900 and device 1000 for projected screen images.
  • the one or more processors may include or execute, for example, the transceiver module 910, the processing module 920, and the display module 930 as well as the transceiver module 1010, the processing module 1020, and the display module 1030 described in the above embodiments.
  • the one or more processors may correspond to, for example, the processor 110 in the electronic device 100 shown in FIG. 1 .
  • the display module may correspond to the display screen 194 in the electronic device 100 shown in FIG. 1.
  • the electronic device or apparatus further includes one or more memories.
  • the one or more memories are used to store computer programs and/or data, such as reporting point data and the like.
  • the one or more memories may correspond to, for example, the external memory 120 and/or the internal memory 121 in the electronic device 100 shown in FIG. 1 .
  • the processor may acquire the computer program stored in the memory to execute the method procedures involved in the above embodiments.
  • the present application also provides a computer storage medium, where the computer storage medium stores computer instructions; when the computer instructions are run on the electronic device, the electronic device executes the steps of the above related methods to implement the method for processing projected screen images in the above embodiments.
  • the computer storage medium may correspond to, for example, the external memory 120 and/or the internal memory 121 in the electronic device 100 shown in FIG. 1 .
  • the transceiver module 910, processing module 920, and display module 930 shown in FIG. 9 and the transceiver module 1010, processing module 1020, and display module 1030 shown in FIG. 10 may exist in the form of software and be stored in the computer storage medium.
  • the present application also provides a computer program product, which can be stored in the computer storage medium; when the computer program product is run on a computer, the computer is made to execute the above related steps, so as to implement the method for processing projected screen images in the above embodiments.
  • the screen projection image processing apparatus, electronic device, computer storage medium, computer program product, and chip provided in the embodiments of the present application are all used to execute the corresponding methods provided above; therefore, for the beneficial effects they can achieve, reference may be made to the beneficial effects of the corresponding methods provided above, and details are not repeated here.
  • each functional unit in each embodiment of the present application may be integrated into one processing unit, each unit may exist separately physically, or two or more units may be integrated into one unit.
  • if the functions described above are realized in the form of software functional units and sold or used as independent products, they can be stored in a computer-readable storage medium.
  • based on this understanding, the technical solution of the present application, in essence, or the part that contributes to the prior art, or a part of the technical solution, can be embodied in the form of a software product; the computer software product is stored in a storage medium and includes several instructions for making a computer device (which may be a personal computer, a server, a network device, or the like) execute all or part of the steps of the methods described in the various embodiments of the present application.
  • the aforementioned storage medium includes: a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disc, or other media that can store program code.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

This application provides a method and an apparatus for processing a projected screen image. The method includes: a first electronic device receives a first projected screen image from a second electronic device; in response to a user's operation, generates report point data; draws an operation track based on the report point data; synthesizes the operation track with the first projected screen image to obtain a synthesized image; and finally displays the synthesized image. By deploying a handwriting application on the first electronic device, the first electronic device can draw the corresponding operation track when a new operation occurs, so that the display delay of the new operation track does not depend on the transmission delay between the second electronic device and the first electronic device, thereby reducing the display delay of the operation track.

通知管理服务使应用程序可以在状态栏中显示通知信息,可以用于传达告知类型的消息,可以短暂停留后自动消失,无需用户交互。比如通知管理器被用于告知下载完成,消息提醒等。通知管理器还可以是以图表或者滚动条文本形式出现在系统顶部状态栏的通知,例如后台运行的应用程序的通知,还可以是以对话窗口形式出现在屏幕上的通知。例如在状态栏提示文本信息,发出提示音,电子设备振动,指示灯闪烁等。
绘制服务可用于绘制界面,示例性地,绘制服务可将应用布局和资源转换为内核层的显示服务能够识别的格式。绘制服务可将绘制的界面发送给内核层的显示服务,进而实现绘制的界面的显示。
安卓运行时可以包括核心库和虚拟机。安卓运行时负责Android系统的调度和管理。
核心库可以包含两部分:一部分是java语言需要调用的功能函数,另一部分是Android系统的核心库。
应用程序层和应用程序框架层运行在虚拟机中。虚拟机将应用程序层和应用程序框架层的java文件执行为二进制文件。虚拟机用于执行对象生命周期的管理、堆栈管理、线程管理、安全和异常的管理,以及垃圾回收等功能。
系统库可以包括多个功能模块。例如:状态监测服务、表面管理器(surface manager)、媒体库(media libraries)、三维图形处理库(例如,OpenGLES)、二维(2 dimensions,2D)图形引擎(例如,SGL)等。
状态监测服务用于根据内核层上报的监测数据确定手机的具体朝向、柔性屏幕的物理状态等。表面管理器用于对显示子系统进行管理,并且为多个应用程序提供了2D和三维(3dimensions,3D)图层的融合。媒体库支持多种常用的音频,视频格式回放和录制,以及静态图像文件等。媒体库可以支持多种音视频编码格式,例如,MPEG4、H.264、MP3、AAC、AMR、JPG、PNG等。三维图形处理库用于实现三维图形绘图、图像渲染、合成和图层处理等。2D图形引擎是2D绘图的绘图引擎。
内核层是硬件和软件之间的层。内核层至少包含输入子系统、电源管理服务、传感器服务(也可以称,传感器驱动)、显示服务(也可以称,显示驱动)、摄像头驱动、音频驱动、Wi-Fi驱动(Wi-Fi driver)等。本申请实施例对此不做任何限制。
其中,输入子系统包括驱动层、输入子系统核心和事件处理层,用于管理不同类型、不同原理、不同的输入信息的输入设备。输入设备例如可以为键盘、鼠标、触摸屏、手柄、手写或者是一些体感输入设备等。
驱动层用于将底层的硬件输入转化为统一事件形式,向输入子系统核心(input core)汇报。驱动层可以包括触控驱动(touch panel driver,TP Driver)。在一些实施例中,触控驱动用于接收来自触控电路的信号,并将其转化为相应的输出信号,输入至系统输入管理服务进一步处理。
Wi-Fi驱动用于驱动Wi-Fi IC进入工作状态。
应用程序框架层以下的系统库和内核层等可称为底层系统。底层系统中包括用于提供显示服务的底层显示系统,例如,底层显示系统包括内核层中的显示驱动以及系统库中的表面管理器等。
底层系统之下为硬件,硬件可以为软件运行提供基础。如图2所示,该电子设备的硬件可以包括但不限于,开机键、传感器、显示屏、指纹、摄像头等。
本申请所涉及的投屏是指某个电子设备上的媒体数据(如音频、视频、图片等)传输至另外的电子设备上呈现,实现多个电子设备之间同步显示相同媒体数据的效果。本申请所涉及的投屏可以包括有线投屏和无线投屏。其中,有线投屏可以通过高清多媒体接口(high definition multimedia interface,HDMI)建立多个电子设备之间的连接,并通过HDMI传输线传输媒体数据。无线投屏可以通过例如Mira cast等无线投屏协议建立多个电子设备之间的连接,并通过无线通信模块传输媒体数据。
图3是适用于本申请实施例提供的投屏图像的处理方法的投屏系统的示意图。如图3所示,该投屏系统300包括至少两个电子设备,如图中所示的第一电子设备310和第二电子设备320。第一电子设备310和第二电子设备320之间可通过投屏端口建 立投屏连接,进而实现媒体数据的传输。投屏端口可以包括有线端口和/或无线端口。其中,有线端口可以为HDMI;无线端口可以包括API,也可以包括无线硬件投屏模块。本申请实施例对此不作限定。如图3所示,第一电子设备310的投屏端口包括第一有线端口和第一无线端口,第一有线端口和第一无线端口可以集成在第一电子设备310上,也可以独立于第一电子设备310而存在。第二电子设备的投屏端口可以包括第二有线端口和第二无线端口,第二有线端口和第二无线端口可以集成在第二电子设备320上,也可以独立于第二电子设备320而存在。本申请实施例对此不作限定。
第二电子设备320向第一电子设备310执行投屏操作。用户可以通过手指、手写笔或鼠标等在第一电子设备310上做标记,这些标记可以被称为报点数据。因此,本实施例的操作可以理解是用户操作,而不仅限于用手执行的操作。在当前技术中,该报点数据需要通过投屏端口被传输到第二电子设备,由第二电子设备完成对报点数据的轨迹的绘制,并将绘制后的轨迹与投屏图像合成,然后再传输到第一电子设备。
为便于理解,图4示出了无线投屏的一种可能的处理流程。图4中实心箭头所示的流程示出了第一电子设备将报点数据传送给第二电子设备以及第二电子设备绘制报点数据并与投屏图像合成的流程。图4中空心箭头所示的流程示出了第二电子设备将合成后的图像传送给第一电子设备以及在第一电子设备上显示的流程。
首先,第一电子设备响应于用户的操作,生成报点数据。示例性地,如图4所示,第一电子设备可以通过TP IC采集用户输入的手写数据,然后将采集的手写数据输入控制器,由控制器计算得到报点数据。TP IC将报点数据输入至TP Driver,TP Driver将报点数据转化为一定的事件形式。
应理解,用户可以通过手写笔在第一电子设备的显示屏上进行操作,产生手写数据,用户还可以通过鼠标或手指在第一电子设备的显示屏上进行操作。用户还可以通过其他方式进行操作,本申请实施例对此不作限定。其中,报点数据可以包括操作在显示屏上对应的位置信息。其次,第一电子设备通过无线投屏服务或者反向交互服务,将报点数据传送至第二电子设备。
示例性地,如图4所示,第一电子设备的TP Driver将转化后的报点数据通过无线投屏服务或者反向交互服务传送至第一电子设备的Wi-Fi driver,第一电子设备的Wi-Fi driver驱动第一电子设备的Wi-Fi IC进入工作状态,从而将报点数据传送至第二电子设备的Wi-Fi IC。第二电子设备的Wi-Fi IC将报点数据输入至第二电子设备的Wi-Fi driver,第二电子设备的Wi-Fi driver通过无线投屏服务将报点数据传输至第二电子设备的系统输入管理服务。
Thereafter, the second electronic device draws the second operation track based on the report point data. Illustratively, as shown in Figure 4, the system input management service of the second electronic device transmits the report point data to the handwriting application of the second electronic device, triggering the handwriting application of the second electronic device to draw the second operation track. The handwriting application of the second electronic device sends the report point data to the window management service of the second electronic device, the window management service sends it to the display framework of the second electronic device, and the display framework calls the display IC of the second electronic device to draw the second operation track. The second operation track is drawn based on the report point data.
Then, the second electronic device composites the second operation track with the projected image. Illustratively, as shown in Figure 4, after the display IC of the second electronic device has drawn the second operation track, it composites the second operation track with the images of one or more local layers. Finally, the second electronic device sends the composited image to the first electronic device. Illustratively, as shown in Figure 4, the second electronic device sends the composited image to the first electronic device through the wireless projection service and the Wi-Fi air interface. Specifically, the display IC of the second electronic device transmits the composited image to the Wi-Fi driver of the second electronic device through the wireless projection service; the Wi-Fi driver of the second electronic device drives the Wi-Fi IC of the second electronic device into the working state, thereby transmitting the composited image to the Wi-Fi IC of the first electronic device. The Wi-Fi IC of the first electronic device inputs the composited image into the Wi-Fi driver of the first electronic device, and the Wi-Fi driver of the first electronic device transmits the composited image to the display framework of the first electronic device through the wireless projection service.
At this point, the first electronic device can display the composited image. Illustratively, as shown in Figure 4, the display framework of the first electronic device calls the display IC and controls the display screen to display the composited image. Based on the above steps, the report point data generated from the user's operation on the first electronic device needs to be transmitted to the second electronic device, which draws the operation track, composites it with the images of one or more local layers, and transmits the result to the first electronic device for display.
Figure 5 shows the latency that the processing flow of Figure 4 may introduce. As shown in Figure 5, assuming a report point rate of 360 hertz (Hz) and a screen refresh rate of 120 Hz, the first electronic device takes about 10 milliseconds (ms) to acquire the report point data. Transmitting the report point data from the first electronic device to the second electronic device takes about 6 ms, and the second electronic device takes about 8.3 ms to draw the operation track and composite it with the projected image. Transmitting the composited image to the first electronic device takes about 52.5 ms, and displaying the composited image on the first electronic device takes about 7.5 ms. It can be seen that from the first electronic device obtaining the report point data to the operation track and projected image being displayed on the first electronic device, a total of about 84.3 ms elapses; the handwriting trajectory drawing latency is too long and the user experience is poor. It should be understood that the latency in the flow shown in Figure 5 is only one possible example and should not limit the embodiments of the present application.
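For clarity, the 84.3 ms figure above is simply the sum of the per-stage delays in this example:

$$10\,\text{ms} + 6\,\text{ms} + 8.3\,\text{ms} + 52.5\,\text{ms} + 7.5\,\text{ms} = 84.3\,\text{ms},$$

where the downlink transmission of the composited image (52.5 ms) dominates the total.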
Based on this, the present application proposes a method for processing projected screen images. By deploying a brush application on the first electronic device, the first electronic device can draw the operation track by itself; after this operation track is composited with the projected image from the second electronic device, the composited image can be displayed directly on the first electronic device. This eliminates the latency introduced by steps such as transmitting the report point data to the second electronic device and the second electronic device transmitting the drawn operation track data back to the first electronic device, thereby reducing the display latency of the operation track and giving the user a better experience.
The method for processing projected screen images provided by the embodiments of the present application is described in detail below with reference to the drawings. Figures 6 and 7 are schematic flowcharts of the method for processing projected screen images provided by the embodiments of the present application. Figure 6 shows the main flow of the method 600 for processing projected screen images, and Figure 7 describes the specific implementation of each step in the method 600 in more detail in combination with the structural block diagram of the software and hardware of the electronic device shown in Figure 2. It should be noted that the flow shown in Figure 7 is described in detail using wireless projection as an example, but this should not limit the embodiments of the present application in any way. The method provided by the embodiments of the present application can be applied to wired or wireless projection scenarios.
Each step in the method 600 is described in detail below with reference to the drawings. As shown in Figure 6, the method 600 may include steps 601 to 610.
In step 601, the first electronic device receives the first projected image from the second electronic device. The first projected image is an image composited from the images of one or more local layers of the second electronic device and then sent by the second electronic device to the first electronic device. For the specific sending process, refer to the relevant description above in connection with Figure 4; for brevity, it is not repeated here.
In step 602, the first electronic device generates report point data in response to the user's operation. For the specific implementation of this step, refer to the relevant description of Figure 4; for brevity, it is not repeated here.
In step 603, the first electronic device draws the first operation track based on the report point data. In this embodiment of the present application, since a brush application has been added on the first electronic device, after obtaining the report point data, the first electronic device can draw the operation track directly through the brush application based on the report point data. Illustratively, as shown in Figure 7, the TP Driver of the first electronic device sends the converted report point data to the brush application of the first electronic device; the brush application draws a handwriting layer that reflects the report point data and its distribution (that is, the first operation track), and sends this handwriting layer to the display framework of the first electronic device, which calls the display IC to draw the first operation track.
It should be understood that the first electronic device may draw the operation track frame by frame, producing images in units of frames. The first electronic device may draw one frame of image at regular intervals, and each frame of image contains the operation track drawn in the most recent period. The duration of the period corresponding to this operation track may be greater than the sum of the two-way transmission delay between the first electronic device and the second electronic device and the delay of the second electronic device locally processing the received report point data. Therefore, the first operation track may be an operation track drawn from the report point data within a period whose duration is greater than the sum of the two-way transmission delay between the first electronic device and the second electronic device and the delay of the second electronic device locally processing the report point data received within that period.
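The timing condition stated above can be written as a small check; this is a sketch only, assuming the two delays are known or measured, and all names are illustrative.

```kotlin
// Check that the per-frame drawing window is long enough to cover one
// round trip to the peer device plus the peer's local processing time.
data class LinkTiming(
    val roundTripMs: Double,        // two-way transmission delay
    val remoteProcessingMs: Double  // peer's local processing delay
)

fun frameWindowIsSufficient(frameWindowMs: Double, timing: LinkTiming): Boolean =
    frameWindowMs > timing.roundTripMs + timing.remoteProcessingMs
```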
In step 604, the first electronic device composites the first operation track with the first projected image from the second electronic device to obtain the composited image. For the first projected image, refer to step 601; for example, it may be the image transmitted to the first electronic device through the wireless projection service and the Wi-Fi air interface.
In one possible implementation, as shown in Figure 7, the display framework of the second electronic device acquires the first projected image and then, through the display IC, transmits the first projected image through the wireless projection service to the Wi-Fi driver of the second electronic device; the Wi-Fi driver of the second electronic device drives the Wi-Fi IC of the second electronic device into the working state, thereby transmitting the first projected image to the Wi-Fi IC of the first electronic device. The Wi-Fi IC of the first electronic device inputs the first projected image into the Wi-Fi driver of the first electronic device, and the Wi-Fi driver of the first electronic device transmits the first projected image to the display framework of the first electronic device through the wireless projection service. The display framework of the first electronic device then calls the display IC of the first electronic device to composite the previously drawn first operation track with the first projected image from the second electronic device, obtaining the composited image.
It should be understood that, for the first electronic device, the first operation track can be carried on the handwriting layer and the first projected image on the projection layer. Therefore, the compositing of the first operation track with the first projected image can specifically be accomplished by compositing the handwriting layer with the projection layer; the resulting composited image includes both the projected image in the projection layer and the operation track in the handwriting layer. That is, the compositing of the operation track with the projected image is achieved.
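Purely as an application-level illustration of this layer compositing (the flow above performs it in the display IC rather than in application code), a handwriting layer can be overlaid on a projection layer with the Android Canvas API roughly as follows:

```kotlin
import android.graphics.Bitmap
import android.graphics.Canvas

// Composite the handwriting layer over the projection layer. The
// handwriting bitmap is assumed to be transparent except for the strokes.
fun compositeLayers(projectionLayer: Bitmap, handwritingLayer: Bitmap): Bitmap {
    val result = projectionLayer.copy(Bitmap.Config.ARGB_8888, /* isMutable = */ true)
    Canvas(result).drawBitmap(handwritingLayer, 0f, 0f, null) // alpha-blends strokes on top
    return result
}
```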
It should also be understood that, for the first electronic device, receiving the first projected image and drawing the first operation track are not ordered relative to each other; that is, the embodiments of the present application do not limit the execution order between step 601 and steps 602 and 603. For example, step 601 may be executed before steps 602 and 603, or simultaneously with step 602 or step 603.
In step 605, the first electronic device displays the composited image. After obtaining the composited image, the first electronic device can display it. Illustratively, as shown in Figure 7, once the display framework of the first electronic device is informed that the display IC has finished compositing the image, it can control the display IC to transmit the composited image to the display screen, refreshing the display on the screen.
Based on the above technical content, by deploying a handwriting application on the first electronic device, the handwriting application can draw the operation track locally based on newly received operations and composite it with the first projected image from the second electronic device to obtain the composited image. In this way, when a new operation occurs on the first electronic device, the display latency of the new operation track does not depend on the transmission delay between the second electronic device and the first electronic device, which can reduce the display latency of the operation track.
It should be noted that, based on the above steps 601 to 605, the first electronic device can directly display the image composited from the operation track and the first projected image, based on the user's operation on the handwriting layer.
Based on the above flow, the time taken by the first electronic device from receiving the user's operation to displaying the composited image is shown in Figure 8. The process in which the TP IC and input subsystem of the first electronic device acquire the report point data takes about 10 milliseconds (ms); drawing the operation track on the first electronic device and compositing it with the first projected image takes about 8.3 ms, corresponding to the process from the system input service to the handwriting application drawing the trajectory in Figure 8; and displaying the composited image on the first electronic device takes about 7.5 ms, corresponding to the process from the display framework to the display screen in Figure 8; in total about 25.8 ms. Compared with the implementation in Figure 5, the 6 ms taken to transmit the report point data to the second electronic device and the 52.5 ms taken to transmit the image composited by the second electronic device back to the first electronic device are eliminated, saving about 58.5 ms in total. Therefore, compared with the end-to-end latency shown in Figure 5, the latency can be greatly reduced.
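Again for clarity, the figures above add up as follows, with the saving being exactly the two eliminated transmission stages:

$$10\,\text{ms} + 8.3\,\text{ms} + 7.5\,\text{ms} = 25.8\,\text{ms}, \qquad 6\,\text{ms} + 52.5\,\text{ms} = 58.5\,\text{ms} = 84.3\,\text{ms} - 25.8\,\text{ms}.$$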
It should be understood that the latency shown in Figure 8 is relative to the latency shown in Figure 5. When different devices are used, the individual steps in Figures 5 and 8 may produce latencies different from those in the examples, but the proportion of time taken by each part is similar.
Further, the method may also include steps 606 to 609.
In step 606, the first electronic device sends the first operation track to the second electronic device. Illustratively, as shown in Figure 7, the first electronic device can send the first operation track drawn by the display IC, via the wireless projection service, the Wi-Fi driver, and the Wi-Fi IC, to the Wi-Fi IC of the second electronic device.
Optionally, when sending the first operation track to the second electronic device, the first electronic device may also send the timestamp corresponding to the first operation track. This timestamp is used to record the drawing time of the first operation track; since the first operation track is drawn in real time based on the user's operation, the timestamp can also be regarded as time information recording the user's operation.
In step 607, the second electronic device can synchronize the received first operation track with the images of one or more layers. Illustratively, as shown in Figure 7, after the Wi-Fi IC of the second electronic device receives the first operation track from the first electronic device, it sends the first operation track through the Wi-Fi driver to the system input management service of the second electronic device. The system input management service of the second electronic device sends the first operation track to the handwriting application, thereby synchronizing the data in the handwriting application.
Synchronization may specifically mean associating the above-mentioned first operation track and its corresponding timestamp with the images of the one or more layers used to composite the first projected image, and saving them on the second electronic device, so that the second electronic device can obtain the latest operation track.
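A minimal sketch of this association, assuming an in-memory store keyed by timestamp (every name here is invented for illustration):

```kotlin
// Hypothetical synchronization store on the second electronic device:
// each received track frame is saved by timestamp together with an
// identifier of the layer images used to composite the projected image.
class TrackSyncStore {
    data class TrackFrame(
        val timestampNanos: Long,
        val strokePoints: List<Pair<Float, Float>>, // (x, y) points of the track
        val layerSetId: Long                        // layers behind the first projected image
    )

    private val frames = sortedMapOf<Long, TrackFrame>()

    fun save(frame: TrackFrame) {
        frames[frame.timestampNanos] = frame
    }

    // The latest operation track received so far, if any.
    fun latest(): TrackFrame? = frames.values.lastOrNull()
}
```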
It should be understood that the first electronic device may not have the function of saving the operation track, so the operation track needs to be synchronized to the second electronic device, so that the second electronic device can further process the first operation track, for example, save or delete it, or composite it with the images of one or more local layers of the second electronic device as described later.
In step 608, the second electronic device obtains the second projected image based on the first operation track and the images of one or more local layers.
Optionally, the method 600 may further include: step 609, the second electronic device displays the second projected image. That is, the first electronic device can send the first operation track it drew by itself to the second electronic device. Therefore, the second electronic device can directly composite this first operation track with the images of one or more local layers, without having to draw the operation track from the report point data. It should be understood that the images of the one or more layers are the images used to composite the first projected image.
The first operation track received by the second electronic device is an image carried on the handwriting layer. The second electronic device can composite the images carried on the one or more local layers with the first operation track carried on the handwriting layer, obtaining the second projected image.
Illustratively, as shown in Figure 7, the handwriting application of the second electronic device can transmit the first operation track to the system window management, which transmits it to the display framework; the display framework calls the display IC to composite the first operation track with the images on one or more local layers, obtaining the second projected image. Thereafter, the display IC of the second electronic device can transmit the second projected image to the display screen for display.
In another implementation, the first electronic device may also send the obtained report point data to the second electronic device, so that the second electronic device can draw the second operation track from the report point data by itself. That is, the above step 606 may be replaced by: the first electronic device sends the report point data to the second electronic device. Correspondingly, step 607 may be replaced by: the second electronic device draws the second operation track based on the received report point data. In the subsequent step 608, the second electronic device obtains the second projected image based on the second operation track and the images of one or more local layers; and in step 609, the second electronic device displays the second projected image.
Similar to the foregoing, the first electronic device may send, along with the report point data, the timestamp of that report point data, which is used to record the generation time of the report point data. Since the report point data is generated in real time based on the user's operation, the timestamp can also be regarded as time information recording the user's operation.
It should be understood that the second electronic device may draw the second operation track jointly from the newly received report point data and the previously received report point data, or may composite the operation track drawn from the newly received report point data with the previously drawn operation track to obtain the second operation track; the embodiments of the present application do not limit this.
It should also be understood that the above step 609 is only an example, and the second electronic device does not necessarily have to perform step 609. For example, in one possible design, the second electronic device may determine whether to display the operation track based on the user's configuration. Illustratively, the user can configure the second electronic device, for example, setting it to "display the operation track locally". Based on this setting, the second electronic device can generate an instruction and send it to the handwriting application, so that the handwriting application can determine whether to display the operation track locally. If the second electronic device is set to display the operation track locally, the above step 609 can be executed; if the second electronic device is set not to display locally the user's operation track on the first electronic device, step 609 may be skipped, and only the projected image without the operation track may be displayed on the second electronic device.
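A minimal sketch of such a configuration check, assuming a boolean preference whose key name is invented for illustration:

```kotlin
// Decide whether the handwriting application should render the remote
// user's operation track locally, based on a stored preference.
class TrackDisplayPolicy(private val prefs: Map<String, Boolean>) {
    fun shouldDisplayRemoteTrack(): Boolean =
        prefs["display_operation_track_locally"] ?: false
}

fun main() {
    val policy = TrackDisplayPolicy(mapOf("display_operation_track_locally" to true))
    if (policy.shouldDisplayRemoteTrack()) {
        println("Execute step 609: display the second projected image with the track")
    } else {
        println("Display the projected image without the operation track")
    }
}
```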
Further, the method 600 may also include: step 610, the second electronic device sends the second projected image to the first electronic device. After obtaining the second projected image, the second electronic device can further send it to the first electronic device; correspondingly, the first electronic device receives the second projected image from the second electronic device. It can be understood that the composited image that the second electronic device sends to the first electronic device is the latest projected image. For ease of distinction and description, this projected image may be called, for example, the second projected image. The specific process is similar to the process described above in connection with Figure 4; for brevity, it is not repeated here.
It should be understood that the second projected image can be used to overwrite the image composited on the first electronic device from the first operation track and the first projected image, thereby updating the local image on the first electronic device.
It should also be understood that, after the first electronic device sends the report point data of the first operation track to the second electronic device, the user may produce new operations on the first electronic device; that is, the first electronic device can draw a new operation track based on the new operations. As described above, the first electronic device can generate one frame of image at regular intervals, and each frame of image can contain the operation track drawn in the most recent period. Therefore, the newly drawn operation track can be understood as a track drawn based on the operations within the most recent period, and is another example of the first operation track. Since the periods used to generate two consecutive frames of images may overlap, the first operation tracks in the two consecutive frames may also overlap or nearly overlap. For ease of distinguishing the first operation track in the previous frame from that in the next frame, the first operation track in the previous frame is denoted below as operation track 1, and the operation track in the next frame as operation track 2. The first electronic device can composite operation track 2 with the latest projected image received from the second electronic device, that is, composite the handwriting layer with the projection layer, obtain a new composited image, and display this new composited image on the first electronic device. In other words, in the projection scenario, as long as the user performs operations on the first electronic device side, the above steps can be executed repeatedly.
Specifically, the first electronic device can composite operation track 2 with the latest projected image received from the second electronic device. In this way, the latest operation track can be displayed on the newly composited image. The operation track on this image may come partly from the second projected image and partly from operation track 2.
In addition, the first electronic device can also composite operation track 2 with operation track 1 first, and then composite the result with the second projected image (a merge of the two tracks is sketched below). In this way, the latest operation track can be displayed on the newly composited image, and the operation track on this image comes entirely from the operation tracks drawn locally on the first electronic device. The embodiments of the present application do not limit this.
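Because the windows behind two consecutive frames may overlap, merging operation track 1 with operation track 2 typically needs to drop duplicated points. A minimal sketch, under the assumption that points carry timestamps (all names illustrative):

```kotlin
// One point of an operation track, stamped with its generation time.
data class TimedPoint(val x: Float, val y: Float, val timestampNanos: Long)

// Merge the previous frame's track with the next frame's track, keeping
// each timestamped point only once despite the overlapping windows.
fun mergeTracks(track1: List<TimedPoint>, track2: List<TimedPoint>): List<TimedPoint> =
    (track1 + track2)
        .distinctBy { it.timestampNanos } // drop points shared by both windows
        .sortedBy { it.timestampNanos }
```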
Based on the above technical solution, by deploying a brush (handwriting) application on the first electronic device, this application can draw the operation track locally based on newly received operations. In this way, even if the latest projected image generated by the second electronic device (that is, the image containing the operation track drawn based on the new operation) has not yet reached the first electronic device, the first electronic device can directly draw the latest operation track based on the newly received operation, composite this operation track with the projected image previously received from the second electronic device, and then display the result on the first electronic device. Therefore, when a new operation occurs on the first electronic device, the display latency of the new operation track does not depend on the transmission delay between the second electronic device and the first electronic device, thereby reducing the display latency of the operation track.
It should be understood that the local image update on the first electronic device is not limited to the flow exemplified above. When receiving a new user operation, the first electronic device can draw the operation track by itself and composite it with the projected image from the second electronic device to obtain the composited image, without sending the new operation track or the report point data to the second electronic device. That is, executing steps 601 to 605 in the above method 600 without executing steps 606 to 610 can also update the image displayed locally on the first electronic device. The embodiments of the present application do not limit this.
It should also be understood that when steps 606 to 610 are executed, the present application does not limit the execution order of steps 609 and 610; for example, step 610 may be executed simultaneously with step 609, or before or after step 609.
Above, the method for processing projected screen images provided by the embodiments of the present application has been described in detail with reference to Figures 6 to 8. Below, the apparatus for processing projected screen images provided by the embodiments of the present application is described in detail with reference to Figures 9 and 10. Figure 9 is a schematic block diagram of an apparatus 900 for processing projected screen images provided by an embodiment of the present application. As shown in Figure 9, the apparatus 900 may include a transceiver module 910, a processing module 920, and a display module 930.
Specifically, the transceiver module 910 can be used to receive the first projected image from the second electronic device; the processing module 920 can be used to generate report point data in response to the user's handwriting operation, and to draw the operation track based on the report point data; the processing module 920 is also used to composite the operation track with the first projected image to obtain the composited image; and the display module 930 can be used to display the composited image.
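As an illustrative sketch only (not the claimed apparatus itself), the division of labor among the three modules could be expressed as interfaces along the following lines; every name here is an assumption.

```kotlin
import android.graphics.Bitmap

// Hypothetical interfaces mirroring the module split described above.
interface TransceiverModule {
    fun receiveProjectedImage(): Bitmap      // first projected image from the peer
    fun sendOperationTrack(track: Bitmap)    // optional, corresponding to step 606
}

interface ProcessingModule {
    fun generateReportPoints(): List<Pair<Float, Float>>      // from the user's operation
    fun drawTrack(points: List<Pair<Float, Float>>): Bitmap   // handwriting layer
    fun composite(track: Bitmap, projected: Bitmap): Bitmap   // composited image
}

interface DisplayModule {
    fun show(image: Bitmap)
}
```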
Optionally, the transceiver module 910 is also used to send the operation track to the second electronic device, where the operation track is used to be composited on the second electronic device with the images of one or more layers, and the images of the one or more layers are used by the second electronic device to composite the first projected image.
Optionally, the transceiver module 910 is also used to send the report point data to the second electronic device, where the report point data is used by the second electronic device to draw the operation track, the operation track is used to be composited on the second electronic device with the images of one or more local layers, and the images of the one or more layers are used by the second electronic device to composite the first projected image.
Optionally, the transceiver module 910 is also used to receive the second projected image from the second electronic device, where the second projected image is obtained by compositing the images of the one or more layers with the operation track; and the display module 930 is also used to display the second projected image to update the composited image.
Specifically, the apparatus 900 may include modules or units for executing the method executed by the first electronic device in the embodiment of the method 600 in Figure 6. The transceiver module 910 can be used to execute steps 601, 606, and 610 in the above method 600, the processing module 920 can be used to execute steps 602 to 604, and the display module 930 can be used to execute step 605.
The apparatus 900 can also be used to execute the method executed by the first electronic device in the embodiment in Figure 7. Moreover, the modules in the apparatus 900 and the other operations and/or functions described above are respectively intended to implement the corresponding flows of the embodiment in Figure 7. For brevity, details are not repeated here.
It should be understood that the apparatus 900 for processing projected screen images may correspond to at least part of the first electronic device in the method embodiments of the present application. For example, the apparatus 900 may be the first electronic device, or a component in the first electronic device, such as a chip or a chip system. Specifically, the functions implemented by the apparatus 900 for processing projected screen images can be implemented by one or more processors executing corresponding programs.
Figure 10 is a schematic block diagram of an apparatus 1000 for processing projected screen images provided by an embodiment of the present application. As shown in Figure 10, the apparatus 1000 may include a transceiver module 1010, a processing module 1020, and a display module 1030. Specifically, the transceiver module 1010 is used to send the first projected image to the first electronic device; the transceiver module 1010 is also used to receive the operation track from the first electronic device, where the operation track is drawn based on the user's operation on the first electronic device; and the processing module 1020 is used to synchronize the operation track with the images of one or more layers, where the images of the one or more layers are used to composite the first projected image.
Optionally, the processing module 1020 is also used to composite the operation track with the images of one or more layers to obtain the second projected image; and the display module 1030 is used to display the second projected image.
Optionally, the transceiver module 1010 is used to send the second projected image to the first electronic device.
Specifically, the apparatus 1000 may include modules or units for executing the method executed by the second electronic device in the embodiment of the method 600 in Figure 6. The transceiver module 1010 can be used to execute steps 601, 606, and 610 in the above method 600, the processing module 1020 can be used to execute steps 607 to 608, and the display module 1030 can be used to execute step 609.
The apparatus 1000 can also be used to execute the method executed by the second electronic device in the embodiment in Figure 7. Moreover, the modules in the apparatus 1000 and the other operations and/or functions described above are respectively intended to implement the corresponding flows of the embodiment in Figure 7. For brevity, details are not repeated here.
It should be understood that the apparatus 1000 for processing projected screen images may correspond to at least part of the second electronic device in the method embodiments of the present application. For example, the apparatus 1000 may be the second electronic device, or a component in the second electronic device, such as a chip or a chip system. Specifically, the functions implemented by the apparatus 1000 for processing projected screen images can be implemented by one or more processors executing corresponding programs.
The present application also provides an electronic device or an apparatus therein. The electronic device or apparatus may include one or more processors for implementing the functions of the above apparatuses 900 and 1000 for processing projected screen images. The one or more processors may, for example, include or execute the transceiver module 910, processing module 920, and display module 930, as well as the transceiver module 1010, processing module 1020, and display module 1030 described in the above embodiments. The one or more processors may, for example, correspond to the processor 110 in the electronic device 100 shown in Figure 1. The display module may correspond to the display screen 194 in the electronic device 100 shown in Figure 1.
Optionally, the electronic device or apparatus further includes one or more memories. The one or more memories are used to store computer programs and/or data, such as report point data. The one or more memories may, for example, correspond to the external memory 120 and/or internal memory 121 in the electronic device 100 shown in Figure 1. The processor can obtain the computer program stored in the memory to execute the method flows involved in the above embodiments.
The present application also provides a computer storage medium in which computer instructions are stored; when the computer instructions run on an electronic device, the electronic device is caused to execute the above related method steps to implement the method for processing projected screen images in the above embodiments. The computer storage medium may, for example, correspond to the external memory 120 and/or internal memory 121 in the electronic device 100 shown in Figure 1. The transceiver module 910, processing module 920, and display module 930 shown in Figure 9, as well as the transceiver module 1010, processing module 1020, and display module 1030 shown in Figure 10, may exist in software form and be stored in the computer storage medium.
The present application also provides a computer program product, which can be stored in the computer storage medium; when the computer program product runs on a computer, the computer is caused to execute the above related steps to implement the method for processing projected screen images in the above embodiments.
The apparatus for processing projected screen images, electronic device, computer storage medium, computer program product, or chip provided by the embodiments of the present application are all used to execute the corresponding methods provided above. Therefore, for the beneficial effects they can achieve, refer to the beneficial effects of the corresponding methods provided above, which are not repeated here.
A person of ordinary skill in the art may realize that the units and algorithm steps of the examples described in connection with the embodiments disclosed herein can be implemented with computer software, electronic hardware, or a combination of computer software and electronic hardware. Whether these functions are executed in hardware or software depends on the specific application and design constraints of the technical solution. A skilled person may use different methods for each specific application to implement the described functions, but such implementation should not be considered to be beyond the scope of the present application.
A person skilled in the art can clearly understand that, for convenience and brevity of description, for the specific working processes of the systems, apparatuses, and units described above, reference can be made to the corresponding processes in the foregoing method embodiments, which are not repeated here. In the several embodiments provided in the present application, it should be understood that the disclosed systems, apparatuses, and methods may be implemented in other ways. For example, the apparatus embodiments described above are merely illustrative; for example, the division of the units is only a logical functional division, and there may be other division manners in actual implementation; for example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not executed. In addition, the mutual couplings or direct couplings or communication connections shown or discussed may be indirect couplings or communication connections through some interfaces, apparatuses, or units, and may be electrical, mechanical, or in other forms.
The units described as separate components may or may not be physically separated, and the components displayed as units may or may not be physical units; that is, they may be located in one place or distributed over multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment. In addition, the functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist physically alone, or two or more units may be integrated into one unit.
If the functions are implemented in the form of software functional units and sold or used as independent products, they may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present application, in essence, or the part that contributes to the prior art, or part of the technical solution, may be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to execute all or part of the steps of the methods described in the embodiments of the present application. The aforementioned storage medium includes various media that can store program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
The above is only the specific implementation of the present application, but the protection scope of the present application is not limited thereto. Any person skilled in the art can easily think of changes or substitutions within the technical scope disclosed in the present application, which should all be covered within the protection scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (17)

  1. A method for processing projected screen images, characterized by comprising:
    receiving a first projected image from a second electronic device;
    generating report point data in response to a user's operation;
    drawing an operation track based on the report point data;
    compositing the operation track with the first projected image to obtain a composited image; and
    displaying the composited image.
  2. The method according to claim 1, characterized in that the method further comprises:
    sending the operation track to the second electronic device, wherein the operation track is used to be composited on the second electronic device with images of one or more layers, and the images of the one or more layers are used by the second electronic device to composite the first projected image.
  3. The method according to claim 1, characterized in that the method further comprises:
    sending the report point data to the second electronic device, wherein the report point data is used by the second electronic device to draw the operation track, the operation track is used to be composited on the second electronic device with images of one or more layers, and the images of the one or more layers are used by the second electronic device to composite the first projected image.
  4. The method according to claim 2 or 3, characterized in that the method further comprises:
    receiving a second projected image from the second electronic device, wherein the second projected image is obtained by compositing the images of the one or more layers with the operation track; and
    displaying the second projected image to update the composited image.
  5. A method for processing projected screen images, characterized by comprising:
    sending a first projected image to a first electronic device;
    receiving an operation track from the first electronic device, wherein the operation track is drawn based on a user's operation on the first electronic device; and
    synchronizing the operation track with images of one or more layers, wherein the images of the one or more layers are used to composite the first projected image.
  6. The method according to claim 5, the method further comprising:
    compositing the operation track with the images of one or more layers to obtain a second projected image; and
    displaying the second projected image.
  7. The method according to claim 6, the method further comprising:
    sending the second projected image to the first electronic device.
  8. An apparatus for processing projected screen images, characterized by comprising:
    a transceiver module, configured to receive a first projected image from a second electronic device;
    a processing module, configured to generate report point data in response to a user's operation, configured to draw an operation track based on the report point data, and further configured to composite the operation track with the first projected image to obtain a composited image; and
    a display module, configured to display the composited image.
  9. The apparatus according to claim 8, characterized in that the transceiver module is further configured to send the operation track to the second electronic device, wherein the operation track is used to be composited on the second electronic device with images of one or more layers, and the images of the one or more layers are used by the second electronic device to composite the first projected image.
  10. The apparatus according to claim 8, characterized in that the transceiver module is further configured to send the report point data to the second electronic device, wherein the report point data is used by the second electronic device to draw the operation track, the operation track is used to be composited on the second electronic device with images of one or more layers, and the images of the one or more layers are used by the second electronic device to composite the first projected image.
  11. The apparatus according to claim 9 or 10, characterized in that the transceiver module is further configured to receive a second projected image from the second electronic device, wherein the second projected image is obtained by compositing the images of the one or more layers with the operation track corresponding to the report point data;
    the display module is further configured to display the second projected image to update the composited image.
  12. An apparatus for processing projected screen images, characterized by comprising:
    a transceiver module, configured to send a first projected image to a first electronic device, and further configured to receive an operation track from the first electronic device, wherein the operation track is drawn based on a user's operation on the first electronic device; and
    a processing module, configured to synchronize the operation track with images of one or more layers, wherein the images of the one or more layers are used to composite the first projected image.
  13. The apparatus according to claim 12, characterized in that the processing module is further configured to composite the operation track with the images of one or more layers to obtain a second projected image;
    the apparatus further comprises a display module, configured to display the second projected image.
  14. The apparatus according to claim 13, characterized in that the transceiver module is further configured to send the second projected image to the first electronic device.
  15. An apparatus for processing projected screen images, characterized by comprising a memory and a processor, wherein:
    the memory is configured to store program code; and
    the processor is configured to call the program code to implement the method according to any one of claims 1 to 7.
  16. A computer-readable storage medium on which a computer program is stored, characterized in that, when the computer program is executed by an electronic device or a processor, the electronic device or the processor is caused to execute the method according to any one of claims 1 to 7.
  17. A computer program product, characterized by comprising a computer program which, when run, causes a computer to execute the method according to any one of claims 1 to 7.
PCT/CN2021/106821 2021-07-16 2021-07-16 一种投屏图像的处理方法和装置 WO2023283941A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/CN2021/106821 WO2023283941A1 (zh) 2021-07-16 2021-07-16 一种投屏图像的处理方法和装置
CN202180099334.7A CN117501233A (zh) 2021-07-16 2021-07-16 一种投屏图像的处理方法和装置

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2021/106821 WO2023283941A1 (zh) 2021-07-16 2021-07-16 一种投屏图像的处理方法和装置

Publications (1)

Publication Number Publication Date
WO2023283941A1 true WO2023283941A1 (zh) 2023-01-19

Family

ID=84919865

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/106821 WO2023283941A1 (zh) 2021-07-16 2021-07-16 一种投屏图像的处理方法和装置

Country Status (2)

Country Link
CN (1) CN117501233A (zh)
WO (1) WO2023283941A1 (zh)

Also Published As

Publication number Publication date
CN117501233A (zh) 2024-02-02


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21949724

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 202180099334.7

Country of ref document: CN

NENP Non-entry into the national phase

Ref country code: DE