WO2022111727A1 - A multi-screen collaborative display method and electronic device - Google Patents

A multi-screen collaborative display method and electronic device

Info

Publication number: WO2022111727A1
Authority: WIPO (PCT)
Prior art keywords: electronic device, frame, interface image, interface, image
Prior art date
Application number: PCT/CN2021/134390
Other languages: English (en), French (fr)
Inventors: Peng Guanqi (彭冠奇), Duan Xiaoxiao (段潇潇), Wang Jinbo (王金波), Zhong Xiaofei (钟小飞)
Original Assignee: Huawei Technologies Co., Ltd. (华为技术有限公司)
Priority date
Filing date
Publication date
Application filed by Huawei Technologies Co., Ltd.
Priority to EP21897227.1A (published as EP4231133A4)
Priority to US18/254,472 (published as US11977810B2)
Publication of WO2022111727A1

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14: Digital output to display device; cooperation and interconnection of the display device with other functional units
    • G06F3/1423: controlling a plurality of local displays, e.g. CRT and flat panel display
    • G06F3/1454: involving copying of the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay
    • G06F3/1462: with means for detecting differences between the image stored in the host and the images displayed on the remote displays
    • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G: ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2354/00: Aspects of interface with display user

Definitions

  • The embodiments of the present application relate to the technical field of distributed control, and in particular, to a multi-screen collaborative display method and electronic device.
  • In a multi-screen collaboration scenario, a screen-casting device (such as a mobile phone) establishes a connection (such as a wired connection or a wireless connection) with a screen-receiving device (such as a personal computer (PC)); the mobile phone can then cast its interface image onto the PC for display.
  • The above-mentioned process in which the mobile phone instructs the PC to update the interface image may specifically include: the mobile phone encodes the updated interface image to obtain encoded data; the mobile phone transmits the encoded data to the PC; the PC receives and decodes the encoded data to obtain the updated interface image; and the PC displays the updated interface image.
  • Because encoding and decoding take a long time, and because of network fluctuations and other factors, the PC may freeze noticeably when updating the interface image in the multi-screen collaboration scenario, and this freezing problem is becoming more and more serious.
  • In a reverse-control scenario, the PC needs to receive the update operation and send an update instruction to the mobile phone, so that the mobile phone can update the interface image in response to the instruction and then instruct the PC to update the interface image. This undoubtedly increases the time needed to update the PC's interface image, and further increases the possibility that the PC freezes while updating.
  • The present application provides a multi-screen collaborative display method and electronic device, which can solve the freezing problem that exists when the second electronic device updates the interface image in a multi-screen collaboration scenario.
  • A first aspect provides a multi-screen collaborative display method, applied to a first electronic device that is connected to a second electronic device.
  • The display method includes: the first electronic device first displays a first interface image and encodes it to obtain encoded data of the first interface image; it then sends the encoded information of the first interface image to the second electronic device. If the first electronic device receives an update operation on the first interface image, it displays, in response to the update operation, the i-th frame of interface image and acquires the N difference regions of the i-th frame of interface image compared with the (i-1)-th frame of interface image. If the area ratio of the N difference regions in the i-th frame of interface image is less than a preset value, the first electronic device encodes the image content of the N difference regions to obtain first encoded data; finally, the first electronic device sends first encoded information to the second electronic device.
  • The first encoded information includes the first encoded data and the position information of the N difference regions in the i-th frame of interface image. The first encoded information is used to trigger the second electronic device to update the (i-1)-th frame of interface image based on the first encoded information, obtain a second screen-projection interface, and display it; the content of the second screen-projection interface is a mirror image of the i-th frame of interface image.
  • In the display method, the first electronic device displays the first interface image and projects its content onto the second electronic device. If the first electronic device receives an update operation on the first interface image, it displays M frames of interface images and acquires the N difference regions of the i-th frame of interface image compared with the (i-1)-th frame. If the area ratio of the N difference regions in the i-th frame of interface image is less than the preset value, the first electronic device encodes only the N difference regions to obtain the first encoded data, and then sends the first encoded information including the first encoded data to the second electronic device.
  • Because the first encoded data is smaller than the encoded data of a whole frame, the second electronic device can use the first encoded information to update the (i-1)-th frame of interface image more quickly and display the mirror image of the i-th frame of interface image. In this way, the freezing problem that exists when the second electronic device updates the interface image in the multi-screen collaboration scenario can be solved, as the sketch below illustrates.
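  • As a minimal illustrative sketch of this sender-side decision (not the patent's definitive implementation): frames are assumed to be NumPy height x width x channel arrays, the threshold value is arbitrary because the patent does not fix the "preset value", and encode_region() and encode_full_frame() are hypothetical helpers (one possible encode_region() is sketched further below).

```python
AREA_RATIO_THRESHOLD = 0.4  # the "preset value"; an assumption, not from the patent

def build_encoded_info(cur_frame, regions, ts_i, ref_ts_i):
    """Choose between first encoded information (diff regions only) and
    second encoded information (the whole i-th frame)."""
    frame_area = cur_frame.shape[0] * cur_frame.shape[1]
    diff_area = sum((y2 - y1) * (x2 - x1) for (y1, x1, y2, x2) in regions)
    if diff_area / frame_area < AREA_RATIO_THRESHOLD:
        return {  # "first encoded information": region data, positions, timestamps
            "data": [encode_region(cur_frame, box) for box in regions],
            "positions": regions,
            "timestamp": ts_i,
            "ref_timestamp": ref_ts_i,
            "partial": True,
        }
    return {  # "second encoded information": the whole i-th frame
        "data": encode_full_frame(cur_frame),
        "timestamp": ts_i,
        "partial": False,
    }
```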
  • The above-mentioned receiving, by the first electronic device, of an update operation on the first interface image includes: the first electronic device receives an update instruction from the second electronic device, where the update instruction is triggered by an update operation on the first screen-projection interface; or the first electronic device receives the user's update operation on the first interface image displayed by the first electronic device.
  • In one case, the first electronic device directly receives the user's update operation on the first interface image it displays, updates its own interface image, and instructs (or triggers) the second electronic device to update the interface image.
  • In the reverse-control scenario, the second electronic device receives the user's update operation on the mirrored first interface image, and then sends an update instruction to the first electronic device; the first electronic device, in response to the update instruction, updates its interface image and instructs (or triggers) the second electronic device to update the interface image as well.
  • As noted above, the second electronic device can use the first encoded information including the first encoded data to update the (i-1)-th frame of interface image more quickly and display the mirror image of the i-th frame. In the reverse-control scenario, therefore, after the user's update operation is received, the second electronic device quickly updates the (i-1)-th frame of interface image by using the first encoded information, which means that the reverse-control delay of the first electronic device with respect to the second electronic device is small. That is, using this display method in a reverse-control scenario can reduce the reverse-control delay.
  • Acquiring the N difference regions of the i-th frame of interface image compared with the (i-1)-th frame of interface image includes: the first electronic device compares the pixel value of each pixel in the i-th frame of interface image with the pixel value of the corresponding pixel in the (i-1)-th frame of interface image, to obtain the difference pixels in the i-th frame; the first electronic device then determines the N difference regions of the i-th frame that contain the difference pixels. The pixel value of a difference pixel is different from the pixel value of the corresponding pixel in the (i-1)-th frame of interface image.
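  • A minimal sketch of this per-pixel comparison, assuming frames are NumPy arrays of shape (H, W, C) and using SciPy's connected-component labeling to group difference pixels into rectangular regions; the patent does not mandate any particular grouping method.

```python
import numpy as np
from scipy import ndimage

def diff_regions(prev_frame: np.ndarray, cur_frame: np.ndarray):
    """Return bounding boxes (y1, x1, y2, x2) of the N difference regions of
    the i-th frame compared with the (i-1)-th frame."""
    # A difference pixel is one whose value differs from the pixel at the
    # same position in the previous frame.
    changed = np.any(cur_frame != prev_frame, axis=-1)   # (H, W) boolean mask
    labeled, n = ndimage.label(changed)                  # group pixels into N regions
    boxes = []
    for sl in ndimage.find_objects(labeled):             # one slice pair per region
        boxes.append((sl[0].start, sl[1].start, sl[0].stop, sl[1].stop))
    return boxes
```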
  • The display method further includes: if the area ratio of the N difference regions in the i-th frame of interface image is greater than the preset value, the first electronic device encodes the whole i-th frame of interface image to obtain second encoded data, and sends second encoded information to the second electronic device. The second encoded information includes the second encoded data and is used to trigger the second electronic device to display a third screen-projection interface based on the second encoded information; the content of the third screen-projection interface is a mirror image of the i-th frame of interface image.
  • That is, the first electronic device either encodes the N difference regions and sends them to the second electronic device, or encodes the whole i-th frame of interface image and sends it. When the area ratio of the N difference regions in the i-th frame of interface image is greater than the preset value, encoding only the difference regions brings no significant saving, so the first electronic device may use the conventional technique of encoding the whole i-th frame of interface image and transmitting it.
  • The display method further includes: the first electronic device generates a first timestamp of the i-th frame in response to the update operation; the first timestamp of the i-th frame records the time at which the first electronic device generated the i-th frame of interface image. The second encoded information further includes the first timestamp of the i-th frame; that is, the first electronic device generates the second encoded information from the second encoded data and the first timestamp of the i-th frame.
  • Although the first electronic device sends the encoded information (i.e., the first encoded information or the second encoded information) corresponding to the M frames of interface images to the second electronic device in the order in which those frames are generated, differences in encoded-data size, network fluctuations, and other factors mean that the order in which the second electronic device receives the pieces of encoded information may differ from the order in which the M frames of interface images are generated.
  • For example, the first electronic device sends the encoded information corresponding to the i-th frame of interface image and then the encoded information corresponding to the (i+1)-th frame, but the second electronic device may receive the encoded information of the (i+1)-th frame first and that of the i-th frame afterwards. Therefore, both the first encoded information and the second encoded information may include the first timestamp of the i-th frame, to indicate that the second electronic device may refer to the first timestamp to display the interface images in chronological order.
  • The display method further includes: the first electronic device generates a second timestamp and saves it, where the second timestamp records the time at which the first electronic device generated the first interface image; the second timestamp is the first reference timestamp of the first frame, and the first reference timestamp is the reference time that the first electronic device records for screen projection. If the area ratio of the N difference regions in the i-th frame of interface image is greater than or equal to the preset value, the first electronic device determines that the first reference timestamp of the (i+1)-th frame is the first timestamp of the i-th frame; if the area ratio of the N difference regions in the i-th frame of interface image is less than the preset value, the first electronic device determines that the first reference timestamp of the (i+1)-th frame is the first reference timestamp of the i-th frame.
  • The first encoded information further includes the first timestamp of the i-th frame and the first reference timestamp of the i-th frame.
  • If the area ratio, in the i-th frame of interface image, of the N difference regions of the i-th frame compared with the (i-1)-th frame (referred to as the N difference regions of the i-th frame) is less than the preset value, the difference between the i-th frame and the (i-1)-th frame of interface image is small.
  • If the area ratio, in the (i-1)-th frame of interface image, of the N difference regions of the (i-1)-th frame compared with the (i-2)-th frame (referred to as the N difference regions of the (i-1)-th frame) is greater than the preset value, the (i-1)-th frame differs considerably from the (i-2)-th frame. Given that the i-th frame differs only slightly from the (i-1)-th frame, the i-th frame also differs considerably from the (i-2)-th frame.
  • Similarly, if the area ratio, in the (i-k)-th frame of interface image, of the N difference regions of the (i-k)-th frame compared with the (i-k-1)-th frame (referred to as the N difference regions of the (i-k)-th frame) is greater than the preset value, the (i-k)-th frame differs considerably from the (i-k-1)-th frame and from all earlier frames, where k is a positive integer.
  • If, owing to differences in encoded-data size, network fluctuations, or other factors, the second electronic device receives the first encoded information of the i-th frame of interface image before the encoded information of the (i-k)-th frame, and it displays the content of the i-th frame by using the received first encoded information, a picture-misalignment problem occurs.
  • For this reason, when the first electronic device determines that the area ratio of the N difference regions in the i-th frame of interface image is greater than or equal to the preset value, it uses the first timestamp of the i-th frame as the first reference timestamp of the (i+1)-th frame; otherwise, it uses the first reference timestamp of the i-th frame as the first reference timestamp of the (i+1)-th frame.
  • In this way, because the area ratio of the N difference regions of the (i-k)-th frame is greater than the preset value, the first reference timestamp of the (i-k+1)-th frame is equal to the first timestamp of the (i-k)-th frame of interface image.
  • Further, because the area ratio of the N difference regions of every frame from the (i-k+1)-th frame to the i-th frame in its own interface image is less than the preset value, the first reference timestamp of every frame from the (i-k+1)-th frame to the i-th frame is equal to the first reference timestamp of the (i-k+1)-th frame. Since the first reference timestamp of the (i-k+1)-th frame is equal to the first timestamp of the (i-k)-th frame of interface image, the first reference timestamp of the i-th frame is equal to the first timestamp of the (i-k)-th frame of interface image.
  • In the display method, the first electronic device carries the first timestamp of the i-th frame and the first reference timestamp of the i-th frame in the first encoded information. Because the first reference timestamp of the i-th frame is equal to the first timestamp of the (i-k)-th frame of interface image, it can be used to instruct the second electronic device to display the content of the i-th frame of interface image only after the (i-k)-th frame. In this way, the picture misalignment caused by displaying the content of the i-th frame before the (i-k)-th frame can be avoided; the sketch below summarizes the rule.
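  • The reference-timestamp rule above reduces to a one-line update per frame. A sketch under the same assumptions as before (the threshold and variable names are illustrative):

```python
def next_first_ref_timestamp(ref_ts_i, ts_i, area_ratio, threshold):
    """First reference timestamp of frame i+1.

    If frame i differs substantially from frame i-1 (area_ratio >= threshold),
    frame i's own first timestamp becomes the next reference; otherwise the
    current reference timestamp is carried forward unchanged.
    """
    return ts_i if area_ratio >= threshold else ref_ts_i
```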
  • A second aspect provides a multi-screen collaborative display method, applied to a second electronic device that is connected to a first electronic device.
  • The display method includes: the second electronic device displays a first screen-projection interface, whose content is a mirror image of the first interface image displayed by the first electronic device; the second electronic device receives first encoded information from the first electronic device, where the first encoded information includes first encoded data and the position information of the N difference regions of the i-th frame of interface image compared with the (i-1)-th frame; the second electronic device decodes the first encoded data to obtain the image content of the N difference regions; and the second electronic device displays a second screen-projection interface.
  • The i-th frame of interface image is generated by the first electronic device in response to an update operation, and the update operation is used to trigger the first electronic device to sequentially display M frames of interface images; M is a positive integer, i takes values in {1, ..., M} in sequence, and the 0th frame of interface image is the first interface image.
  • The pixel values of the pixels in the N difference regions are different from the pixel values of the corresponding pixels in the (i-1)-th frame of interface image, and N is a positive integer; the first encoded data is obtained by encoding the image content of the N difference regions.
  • The content of the second screen-projection interface is a mirror image of the i-th frame of interface image, which is obtained by updating the (i-1)-th frame of interface image according to the image content and position information of the N difference regions.
  • In the display method of the second aspect, the second electronic device displays the content of the first interface image projected by the first electronic device. The second electronic device may receive the first encoded information from the first electronic device, decode the first encoded data in it to obtain the image content of the N difference regions, and then display the second screen-projection interface. Since the content of the second screen-projection interface is the mirror image of the i-th frame of interface image, which is obtained by updating the (i-1)-th frame according to the image content and position information of the N difference regions, the second screen-projection interface displays the updated interface image by using the decoded image content of the N difference regions.
  • The N difference regions are the content of the i-th frame of interface image that differs from the (i-1)-th frame, so the first encoded data obtained by encoding the N difference regions is smaller than the encoded data that would be obtained by encoding the whole i-th frame of interface image. Because smaller encoded data takes less time to transmit and decode, the second electronic device can use the first encoded information including the first encoded data to update the (i-1)-th frame of interface image more quickly and display the mirror image of the i-th frame. In this way, the freezing problem that exists when the second electronic device updates the interface image in the multi-screen collaboration scenario can be solved.
  • Before the second electronic device receives the first encoded information from the first electronic device, the display method further includes: the second electronic device receives the user's update operation on the first screen-projection interface and, in response to the update operation, sends an update instruction to the first electronic device. The update instruction is used to trigger the first electronic device to sequentially display M frames of interface images.
  • The second electronic device receives the update operation on the first screen-projection interface through an external device connected to the second electronic device; the external device includes any one of the display screen of the second electronic device, a remote control, a mouse, or a stylus.
  • That is, in the reverse-control scenario, the second electronic device receives the user's update operation on the mirrored first interface image and then sends an update instruction to the first electronic device to instruct (or trigger) the first electronic device to update the interface image, as the sketch below illustrates.
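  • As an illustrative sketch only: the patent does not define a wire format for the update instruction, and every field name below is hypothetical. The instruction might carry the input event and its coordinates so that the first electronic device can replay it.

```python
import json
import socket

def send_update_instruction(sock: socket.socket, event_type: str, x: int, y: int) -> None:
    # Hypothetical length-prefixed JSON message; the patent only requires that
    # the instruction trigger the first electronic device to update the
    # interface image and re-project the resulting M frames.
    payload = json.dumps({"type": event_type, "x": x, "y": y}).encode("utf-8")
    sock.sendall(len(payload).to_bytes(4, "big") + payload)
```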
  • The display method further includes: the second electronic device receives second encoded information from the first electronic device; the second encoded information includes second encoded data, which is obtained by encoding the i-th frame of interface image. The second electronic device decodes the second encoded data to obtain the i-th frame of interface image and displays a third screen-projection interface; the content of the third screen-projection interface is the mirror image of the i-th frame of interface image.
  • The second encoded information further includes a first timestamp of the i-th frame, where the first timestamp of the i-th frame records the time at which the first electronic device generated the i-th frame of interface image.
  • The display method further includes: the second electronic device determines that the second reference timestamp of the (i+1)-th frame is the first timestamp of the i-th frame, where the second reference timestamp is the reference time that the second electronic device records for screen projection.
  • Receiving the second encoded information of the i-th frame of interface image indicates that the area ratio of the N difference regions of the i-th frame in the i-th frame of interface image is greater than the preset value, which means that the i-th frame differs considerably from the (i-1)-th frame. The (i+1)-th frame then differs even more from the (i-1)-th frame; therefore, the first timestamp of the i-th frame in the second encoded information is used as the second reference timestamp of the (i+1)-th frame.
  • The first encoded information further includes the first timestamp of the i-th frame and the first reference timestamp of the i-th frame, where the first reference timestamp is the reference time that the first electronic device records for screen projection.
  • Before decoding the first encoded data, the display method further includes: the second electronic device determines that the time recorded by the first timestamp of the i-th frame is later than the time recorded by the first reference timestamp of the i-th frame, and that the time recorded by the first reference timestamp of the i-th frame is equal to the time recorded by the second reference timestamp of the i-th frame.
  • If the second electronic device receives the second encoded information of the (i-1)-th frame of interface image (referred to as the second encoded information of the (i-1)-th frame), it uses the first timestamp of the (i-1)-th frame in that encoded information as the second reference timestamp of the i-th frame (that is, the second reference timestamp of the i-th frame equals the first timestamp of the (i-1)-th frame), and displays the content of the (i-1)-th frame of interface image by using the second encoded information of the (i-1)-th frame.
  • Receiving the second encoded information of the (i-1)-th frame indicates that the area ratio of the N difference regions of the (i-1)-th frame in the (i-1)-th frame of interface image is greater than the preset value, which means that the (i-1)-th frame differs considerably from the (i-2)-th frame; the i-th frame then differs even more from the (i-2)-th frame.
  • If the second electronic device receives the first encoded information of the i-th frame of interface image (referred to as the first encoded information of the i-th frame), the area ratio of the N difference regions of the i-th frame in the i-th frame of interface image is less than the preset value, which means that the i-th frame differs only slightly from the (i-1)-th frame.
  • The first encoded information of the i-th frame of interface image may include the first reference timestamp of the i-th frame.
  • Because the first electronic device, upon determining that the area ratio of the N difference regions of the (i-1)-th frame in the (i-1)-th frame of interface image is greater than the preset value, uses the first timestamp of the (i-1)-th frame as the first reference timestamp of the i-th frame, the first reference timestamp of the i-th frame is equal to the first timestamp of the (i-1)-th frame. Combined with the fact that the second reference timestamp of the i-th frame is also equal to the first timestamp of the (i-1)-th frame, it follows that the first reference timestamp of the i-th frame is equal to the second reference timestamp of the i-th frame.
  • That the first reference timestamp of the i-th frame equals the second reference timestamp of the i-th frame means that the i-th frame differs only slightly from the (i-1)-th frame, and also indicates that the second electronic device has already displayed the content of the (i-1)-th frame of interface image. Therefore, when the first reference timestamp of the i-th frame equals the second reference timestamp of the i-th frame, the second electronic device uses the first encoded information of the i-th frame to display the content of the i-th frame of interface image.
  • This avoids the picture-misalignment problem that arises when the order in which the second electronic device receives the encoded information (i.e., the first encoded information or the second encoded information) corresponding to the M frames of interface images differs from the order in which the first electronic device generated those frames; a sketch of this receiver-side check follows.
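  • A sketch of the receiver-side ordering check described above. The packet fields and the deferral queue are assumptions; the patent specifies only the timestamp comparison.

```python
pending = []  # partial updates that arrived before their base frame

def can_apply_partial_update(info, second_ref_ts_i):
    """Apply the first encoded information of frame i only when its base
    frame has already been displayed, i.e. the sender's first reference
    timestamp equals the receiver's second reference timestamp, and the
    frame itself is newer than that reference."""
    return (info["ref_timestamp"] == second_ref_ts_i
            and info["timestamp"] > info["ref_timestamp"])

def on_first_encoded_info(info, second_ref_ts_i):
    if can_apply_partial_update(info, second_ref_ts_i):
        return "display"
    pending.append(info)  # wait until the missing earlier frame arrives
    return "defer"
```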
  • A third aspect provides a first electronic device, which includes devices, modules, or units for performing the method of the first aspect and any of its possible designs.
  • A fourth aspect provides a second electronic device, which includes devices, modules, or units for performing the method of the second aspect and any of its possible designs.
  • A fifth aspect provides a first electronic device, which includes a processor and a memory.
  • The memory is used for storing computer program code, which includes computer instructions; the processor is used for executing the computer instructions, so that the first electronic device performs the method of the first aspect and any of its possible designs.
  • A sixth aspect provides a second electronic device, which includes a processor and a memory.
  • The memory is used for storing computer program code, which includes computer instructions; the processor is used for executing the computer instructions, so that the second electronic device performs the method of the second aspect and any of its possible designs.
  • A seventh aspect provides a computer-readable storage medium storing computer instructions that, when executed on the first electronic device, cause the first electronic device to perform the method of the first aspect and any of its possible designs.
  • An eighth aspect provides a computer-readable storage medium storing computer instructions that, when executed on the second electronic device, cause the second electronic device to perform the method of the second aspect and any of its possible designs.
  • A ninth aspect provides a computer program product comprising one or more instructions that can be executed on the first electronic device to cause the first electronic device to perform the method of the first aspect and any of its possible designs.
  • A tenth aspect provides a computer program product comprising one or more instructions that can be executed on the second electronic device to cause the second electronic device to perform the method of the second aspect and any of its possible designs.
  • For the technical effects of the third aspect, the fifth aspect, the seventh aspect, and the ninth aspect and any of their possible designs, refer to the technical effects of the corresponding designs in the above-mentioned first aspect; for the technical effects of the fourth aspect, the sixth aspect, the eighth aspect, and the tenth aspect and any of their possible designs, refer to the technical effects of the corresponding designs in the above-mentioned second aspect. Details are not repeated here.
  • FIG. 1 is a schematic diagram of a multi-screen display in a multi-screen non-mirroring scenario provided by conventional technology;
  • FIG. 2 is a schematic diagram of a multi-screen display in a multi-screen extension scenario provided by conventional technology;
  • FIG. 3 is a schematic diagram of a multi-screen display in a multi-screen mirroring scenario provided by conventional technology;
  • FIG. 4 is a schematic diagram of a multi-screen display in a multi-screen collaboration scenario provided by conventional technology;
  • FIG. 5 is a schematic diagram of a multi-screen display after an interface image is updated in a multi-screen collaboration scenario provided by conventional technology;
  • FIG. 6 is a schematic diagram of a hardware structure of a first electronic device according to an embodiment of the present application;
  • FIG. 7 is a first flowchart of a multi-screen collaborative display method in a first usage scenario provided by an embodiment of the present application;
  • FIG. 8 is a first schematic diagram of a multi-screen display in a first usage scenario provided by an embodiment of the present application;
  • FIG. 9 is a second schematic diagram of a multi-screen display in a first usage scenario provided by an embodiment of the present application;
  • FIG. 10 is a schematic diagram of an interface on which a first electronic device acquires N difference regions according to an embodiment of the present application;
  • FIG. 11A is a sequence diagram of a first electronic device displaying interface images and updating timestamps according to an embodiment of the present application;
  • FIG. 11B is another sequence diagram of a first electronic device displaying interface images and updating timestamps according to an embodiment of the present application;
  • FIG. 12 is a sequence diagram of a first electronic device projecting a screen to a second electronic device according to an embodiment of the present application;
  • FIG. 13 is a second flowchart of a multi-screen collaborative display method in a first usage scenario provided by an embodiment of the present application;
  • FIG. 14 is a first flowchart of a multi-screen collaborative display method in a reverse-control scenario provided by an embodiment of the present application;
  • FIG. 15 is a first schematic diagram of a multi-screen display in a reverse-control scenario provided by an embodiment of the present application;
  • FIG. 16 is a second schematic diagram of a multi-screen display in a reverse-control scenario provided by an embodiment of the present application;
  • FIG. 17 is a second flowchart of a multi-screen collaborative display method in a reverse-control scenario provided by an embodiment of the present application;
  • FIG. 18 is a schematic structural diagram of a first electronic device according to an embodiment of the present application;
  • FIG. 19 is a schematic structural diagram of a second electronic device according to an embodiment of the present application.
  • The terms "first", "second", and the like are used for descriptive purposes only, and should not be understood as indicating or implying relative importance or implying the number of indicated technical features. A feature defined with "first", "second", and the like may explicitly or implicitly include one or more of that feature.
  • After the first electronic device establishes a connection (e.g., a wired connection or a wireless connection) with the second electronic device, the first electronic device can project its interface image onto the second electronic device for display.
  • The wireless connection may be a Bluetooth connection, a near field communication (NFC) connection, or a wireless fidelity (Wi-Fi) connection.
  • In the first usage scenario, the first electronic device can receive the user's update operation on the interface image; in response to the update operation, the first electronic device can update its own interface image and instruct (or trigger) the second electronic device to update the interface image of the second electronic device as well.
  • The first usage scenario may include a multi-screen extension scenario and a multi-screen non-mirroring scenario.
  • In the multi-screen extension scenario, the first electronic device generates and displays a first interface image; it also generates a second interface image and instructs (or triggers) the second electronic device to display a mirror image of the second interface image, while the first electronic device itself does not display the second interface image.
  • In the multi-screen non-mirroring scenario, the first electronic device displays the first interface image; it also displays the second interface image and instructs (or triggers) the second electronic device to display the mirror image of the second interface image.
  • For example, in the multi-screen non-mirroring scenario, the notebook computer 110 is connected to the tablet computer 120 through a wireless connection. The notebook computer 110 displays the first interface image 111 and the second interface image 112, and further instructs the tablet computer 120 to display the mirror image 113 of the second interface image 112.
  • The content of the first interface image 111 includes picture editing options such as "graffiti", "color", "thickness", and "eraser"; the content of the second interface image 112 includes document operation options such as "Title", "Start", "Insert", and "Page Layout".
  • In the multi-screen extension scenario, the notebook computer 110 is connected to the tablet computer 120 through a wireless connection. The notebook computer 110 generates and displays the first interface image 111; it also generates the second interface image 112 and instructs the tablet computer 120 to display the mirror image 113 of the second interface image 112, while the notebook computer 110 does not display the second interface image 112.
  • In the reverse-control scenario, the second electronic device can also receive the user's update operation on the interface image displayed by the second electronic device.
  • Specifically, the second electronic device can receive the user's update operation through an external device connected to it, such as a stylus, a remote control, or a mouse; it then sends an update instruction to the first electronic device to instruct (or trigger) the first electronic device to update the interface image in response to the update instruction. The first electronic device may then update its own interface image and instruct (or trigger) the second electronic device to update the interface image of the second electronic device as well.
  • The reverse-control scenario may include a multi-screen mirroring scenario and the above-mentioned multi-screen non-mirroring scenario.
  • In the reverse-control scenario, the second electronic device can display the content displayed by the first electronic device in a full-screen or non-full-screen manner. For example, in the multi-screen mirroring scenario, the second electronic device can display the content displayed by the first electronic device in full screen; alternatively, it may display that content in non-full screen without displaying any other content.
  • For example, the notebook computer 110 is connected to the tablet computer 120 through a wireless connection; the notebook computer 110 displays the second interface image 112 and instructs the tablet computer 120 to display the mirror image 113 of the second interface image 112, and the tablet computer 120 displays the mirror image 113 in full screen.
  • The first electronic device instructing (or triggering) the second electronic device to update the interface image may specifically include: the first electronic device encodes the updated interface image by using a codec standard such as H.262, H.263, or H.264 to obtain encoded data; the first electronic device transmits the encoded data to the second electronic device; the second electronic device receives the encoded data and decodes it by using the same codec standard as the first electronic device to obtain the updated interface image; and the second electronic device displays the updated interface image.
  • Encoding and decoding take a long time, and the encoded data is large; the larger encoded data increases its transmission time. The longer encoding/decoding time and transmission time make the second electronic device slower to update the interface image by using the encoded data sent by the first electronic device, which manifests as a noticeable freeze when the second electronic device updates the interface image.
  • The freezing problem in multi-screen collaboration scenarios is thus becoming more and more serious.
  • Moreover, in the reverse-control scenario, the second electronic device needs to receive the update operation and send an update instruction to the first electronic device, so that the first electronic device can update the interface image in response to the instruction and then instruct (or trigger) the second electronic device to update the interface image. This undoubtedly increases the time needed for the second electronic device to update the interface image, and further increases the possibility that the update freezes.
  • For example, the notebook computer 210 is connected to the tablet computer 220 through a wireless connection; the notebook computer 210 displays a frame of interface image (e.g., an editing interface image 211 for editing a picture) and projects the content of the editing interface image 211 onto the tablet computer 220, so that a mirror image 212 of the editing interface image 211 is displayed on the tablet computer 220. The content of the editing interface image 211 includes multiple picture editing options: "graffiti", "color", "thickness", and "eraser".
  • After the tablet computer 220 displays the content of the editing interface image 211, it receives the user's update operation on the editing interface image 211 (e.g., a drawing operation that draws a heart shape on the editing interface image). After receiving the drawing operation, the tablet computer 220 sends an update instruction to the notebook computer 210 to instruct the notebook computer 210 to display the drawn heart shape in response to the update instruction. In response to the update instruction, the notebook computer 210 sequentially displays the multiple frames of interface images produced in the process of drawing the heart shape; it also encodes each frame of interface image to obtain encoded data and transmits the encoded data to the tablet computer 220. However, because encoding and decoding take a long time and the amount of encoded data is large, the tablet computer 220 needs a long time after receiving the drawing operation to receive and decode each frame of interface image.
  • After a period of time, the notebook computer 210 has sequentially displayed all the interface images in the process of drawing the heart shape and displays an editing interface image 213 that includes the complete heart shape. Because receiving and decoding each frame of interface image takes a long time, at that moment the tablet computer 220 has used only part of the interface images in the process of drawing the heart shape and displays only part of the heart shape. Only after a further long period of time does the tablet computer 220 display the complete heart shape by using all the interface images. In other words, the tablet computer visibly freezes when displaying the updated interface image that includes the heart shape, which manifests as a long reverse-control delay in the interaction between the notebook computer and the tablet computer.
  • In view of this, a multi-screen collaborative display method is proposed, which can solve the freezing problem that exists when the second electronic device updates the interface image in the multi-screen collaboration scenario, and can reduce the delay with which the first electronic device is reverse-controlled through the second electronic device in the reverse-control scenario.
  • For example, the first electronic device in the embodiments of the present application may be a mobile phone, a PC, a tablet computer, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, a personal digital assistant (PDA), or the like.
  • The second electronic device in the first usage scenario may be any display device such as a monitor of a PC, a television, or a projector, or an electronic device including a display.
  • The second electronic device in the reverse-control scenario may be an electronic device including a display.
  • The above-mentioned electronic device including a display may be a PC, a tablet computer, a notebook computer, a UMPC, a netbook, a PDA, a large-screen device, a smart TV, or the like.
  • The embodiments of the present application do not impose any limitation on the specific forms of the first electronic device and the second electronic device.
  • The mobile phone 300 may include: a processor 310, an external memory interface 320, an internal memory 321, a universal serial bus (USB) interface 330, a charging management module 340, a power management module 341, a battery 342, antenna 1, antenna 2, a mobile communication module 350, a wireless communication module 360, an audio module 370, a speaker 370A, a receiver 370B, a microphone 370C, a headphone interface 370D, a sensor module 380, a key 390, a motor 391, an indicator 392, a camera 393 (which may include cameras 1-N), a display screen 394 (such as a touch screen), a subscriber identity module (SIM) card interface 395 (which may include SIM card interfaces 1-N), and the like.
  • The aforementioned sensor module 380 may include sensors such as a pressure sensor, a gyroscope sensor, an air pressure sensor, a magnetic sensor, an acceleration sensor, a distance sensor, a proximity light sensor, a fingerprint sensor, a temperature sensor, a touch sensor, an ambient light sensor, and a bone conduction sensor.
  • The mobile phone 300 may include more or fewer components than shown, or some components may be combined, or some components may be split, or a different arrangement of components may be used.
  • The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
  • The processor 310 may include one or more processing units. For example, the processor 310 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc. Different processing units may be independent devices or may be integrated in one or more processors.
  • The power management module 341 is used to connect the battery 342, the charging management module 340, and the processor 310.
  • The power management module 341 receives input from the battery 342 and/or the charging management module 340, and supplies power to the processor 310, the internal memory 321, the external memory, the display screen 394, the camera 393, and the wireless communication module 360.
  • The power management module 341 may also be provided in the processor 310.
  • The power management module 341 and the charging management module 340 may also be provided in the same device.
  • The GNSS may include the global positioning system (GPS), the global navigation satellite system (GLONASS), the BeiDou navigation satellite system (BDS), the quasi-zenith satellite system (QZSS), and/or satellite-based augmentation systems (SBAS), etc.
  • The mobile phone 300 may acquire its real-time location information by using a positioning technology such as GPS, BDS, or SBAS.
  • The method steps in the subsequent embodiments of the present application may be executed by the above-mentioned first electronic device and/or the above-mentioned second electronic device, or the execution subject of the method steps may be some functional modules (such as a central processing unit (CPU)) in the first electronic device and/or the second electronic device.
  • In the following, the multi-screen collaborative display method provided by the embodiments of the present application is described in detail, taking as an example the method performed by the first electronic device and/or the above-mentioned second electronic device.
  • In the display method, the first electronic device may compare each updated frame of interface image with the previous frame of interface image. If a frame of interface image changes only slightly relative to the previous frame, the first electronic device may encode only the content of that frame that differs from the previous frame to obtain encoded data, and transmit the encoded data to the second electronic device.
  • An embodiment of the present application provides a multi-screen collaborative display method, which is applied to the first electronic device and the second electronic device in the above-mentioned first usage scenario, where the first electronic device is connected to the second electronic device.
  • The display method may include S401-S415.
  • The first electronic device displays a first interface image and encodes it to obtain encoded data of the first interface image; it then generates encoded information of the first interface image, which includes the encoded data of the first interface image.
  • In addition to the encoded data of the first interface image, the encoded information of the first interface image may include a timestamp of the first interface image, which records the time at which the first electronic device generated the first interface image. The second timestamp described in the following embodiments is the timestamp of the first interface image.
  • The second electronic device receives the encoded information of the first interface image from the first electronic device and uses it to display the first screen-projection interface. Specifically, after receiving the encoded information of the first interface image, the second electronic device decodes the encoded data in it by using the codec standard that the first electronic device used when generating the encoded data, thereby obtaining the first interface image; the second electronic device then uses the first interface image to generate and display the first screen-projection interface.
  • In the multi-screen non-mirroring scenario, the second electronic device may display the first screen-projection interface in full screen, or may display it in non-full screen. When displaying the first screen-projection interface in non-full screen, the second electronic device may display other interface images in the areas outside the first screen-projection interface, where the other interface images are different from the first interface image and the first screen-projection interface.
  • In the multi-screen mirroring scenario, the second electronic device may likewise display the first screen-projection interface in full screen or in non-full screen; when it displays the interface in non-full screen, it does not display other content, and the remaining area may, for example, appear black.
  • The full-screen display of the screen-projection interface described in the embodiments of the present application means that the second electronic device displays the screen-projection interface and the status bar of the second electronic device on the display screen without any other content, or that the second electronic device displays the screen-projection interface on the display screen without displaying the status bar or other content.
  • S404: The first electronic device receives the user's update operation on the first interface image, where the update operation is used to trigger the first electronic device to sequentially display M frames of interface images.
  • The first electronic device may directly receive the user's update operation on the first interface image.
  • M is a positive integer (e.g., 1, 2, or 3), and the value of M depends on the update operation.
  • The update operation may include an operation of switching the first interface image, an operation of drawing on the first interface image, an operation of loading new content on the first interface image, or the like; this is not limited in the embodiments of the present application.
  • The first electronic device displays the i-th frame of interface image in response to the update operation, and acquires the N difference regions of the i-th frame of interface image compared with the (i-1)-th frame of interface image.
  • the first electronic device sequentially generates and displays M frames of interface images.
  • the first electronic device also acquires N difference regions in the ith frame of the interface image compared to the i-1 th frame of the interface image, and position information of the N difference regions in the ith frame of the interface image.
  • the position information is used to indicate (or characterize) the positions of the N difference regions in the ith frame of the interface image.
  • i takes values in sequence in {1, ..., M}; the 0-th frame interface image is the above-mentioned first interface image.
  • the pixel values of the pixels in the N difference regions are different from the pixel values of the corresponding pixels in the (i-1)-th frame interface image, and N is a positive integer.
  • the M frames of interface images may include the 1st frame interface image, the 2nd frame interface image, ..., the i-th frame interface image, ..., the M-th frame interface image.
  • the 1st frame interface image, the 2nd frame interface image, ..., the i-th frame interface image, ..., the M-th frame interface image are arranged in order of generation time.
  • the (i-1)-th frame interface image is the frame immediately preceding the i-th frame interface image.
  • N may be any positive integer. If N is greater than 1, any two of the N difference regions are independent and non-overlapping. In each of the N difference regions, either the pixel values of all pixels differ from those of the corresponding pixels in the (i-1)-th frame interface image, or only the pixel values of some pixels do.
  • the location information of the N difference areas may include N pieces of location information corresponding to the N difference areas one-to-one.
  • the first electronic device may encode the image contents of the N difference regions using a lossy encoding method (such as JPEG, the RM file format, or RealMedia variable bit rate) or a lossless encoding method (such as the Huffman algorithm or the LZW (Lempel-Ziv & Welch) compression algorithm) to obtain the first encoded data.
  • the first electronic device sends first encoded information to the second electronic device, where the first encoded information includes first encoded data.
  • the first electronic device may also acquire position information of the N difference areas in the ith frame of the interface image. Then, the first electronic device generates first encoded information by using the location information and the first encoded data, and then sends the first encoded information to the second electronic device.
  • the second electronic device receives the first encoded information from the first electronic device, and decodes the first encoded data to obtain image contents of N difference regions.
  • the second electronic device decodes the first encoded data in the first encoded information by using the encoding and decoding standards used by the first electronic device when generating the first encoded data, to obtain image contents of N difference regions.
  • the second electronic device displays a second screen projection interface.
  • the content of the second screen projection interface is a mirror image of the ith frame of the interface image.
  • the i-th frame interface image is obtained by the second electronic device updating the (i-1)-th frame interface image according to the image contents and position information of the N difference regions.
  • the second electronic device uses the image content of the N difference regions to update the i-1 th interface image according to the position information of the N difference regions in the first encoded information to obtain the ith frame of the interface image.
  • the second electronic device then generates a second screen projection interface according to the ith frame of the interface image, and displays the second screen projection interface in a full-screen or non-full-screen manner.
  • for example, the second electronic device may replace, in the (i-1)-th frame interface image, the content of the areas indicated by the position information of the N difference regions with the image contents of the N difference regions, to obtain the i-th frame interface image, as sketched below.
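As an illustrative sketch only (not part of the patent text), the receiver-side region replacement can be expressed as follows; the function name and the region tuple layout are assumptions, and the frames are assumed to be numpy-style arrays:

```python
def apply_difference_regions(frame_prev, regions):
    """Reconstruct the i-th frame from the cached (i-1)-th frame by pasting
    the decoded difference regions at the rectangles given by their position
    information (xmin, xmax, ymin, ymax are inclusive pixel coordinates)."""
    frame = frame_prev.copy()
    for (xmin, xmax, ymin, ymax), pixels in regions:
        # Replace the area indicated by the position information with the
        # decoded image content of that difference region.
        frame[ymin:ymax + 1, xmin:xmax + 1] = pixels
    return frame
```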
  • the display screen of the mobile phone 300 is a touch screen
  • the mobile phone 300 receives the user's update operation on the first interface image, such as a drawing operation of drawing a heart shape with the user's finger on the display screen of the mobile phone 300 .
  • in response to the drawing operation, the mobile phone 300 sequentially displays the M frames of interface images in the process of drawing the heart shape, and finally displays an interface image 503 including the heart shape.
  • the first electronic device may use encoding and decoding standards such as H.261, H.263, and H.264 to encode the ith frame of the interface image to obtain the second encoded data.
  • in this case, the first electronic device encodes the complete i-th frame interface image; the i-th frame interface image may then be an intra-coded picture frame (Intra-coded picture, I frame) or a predictive-coded picture frame (Predictive-coded Picture, P frame).
  • the second electronic device receives the second encoded information from the first electronic device.
  • the second electronic device may use the encoding and decoding standard used by the first electronic device when generating the second encoded data to decode the second encoded data in the second encoded information to obtain the ith frame of the interface image.
  • the second electronic device displays a third screen projection interface.
  • the first interface image 501 is an editing interface image for editing a picture, and the notebook computer 520 instructs (or triggers) the tablet computer 505 to display the first screen projection interface 502.
  • the notebook computer 520 is connected to the mouse 506 .
  • the notebook computer 520 receives the user's update operation on the first interface image through the mouse 506, for example, a drawing operation of drawing a heart shape with the mouse 506 on the display screen of the notebook computer 520.
  • the notebook computer 520 sequentially displays M frames of interface images in the process of drawing the heart shape, and displays an interface image 503 including the heart shape.
  • the notebook computer 520 sends the first encoded information or the second encoded information of each frame interface image to the tablet computer 505. When the change in a frame interface image relative to the previous frame is small, the notebook computer 520 can encode only the content that differs from the previous frame to obtain the first encoded data and transmit the first encoded information including the first encoded data, which reduces the encoding/decoding time and the transmission time of the encoded data. Therefore, the tablet computer 505 can use the received first encoded information or second encoded information to display the screen projection interface 504 including the heart shape relatively quickly.
  • the first electronic device displays the first interface image and projects the content of the first interface image to the second electronic device. If the first electronic device receives an update operation on the first interface image, it displays the M frames of interface images and acquires the N difference regions in the i-th frame interface image compared with the (i-1)-th frame interface image. If the area ratio of the N difference regions in the i-th frame interface image is less than the preset value, the first electronic device encodes only the N difference regions to obtain the first encoded data, and then sends the first encoded information including the first encoded data to the second electronic device.
  • the N difference regions consist of the pixels of the i-th frame interface image whose pixel values differ from those of the (i-1)-th frame interface image. Therefore, when the first electronic device encodes the N difference regions, it encodes only the content of the i-th frame interface image that differs from the (i-1)-th frame interface image, which can reduce the encoding and decoding time and the size of the first encoded data; transmitting smaller first encoded data also shortens its transmission duration.
  • accordingly, the second electronic device can use the first encoded information including the first encoded data to update the (i-1)-th frame interface image more quickly and display a mirror image of the i-th frame interface image. In this way, the freezing problem that exists when the second electronic device updates the interface image in a multi-screen collaboration scenario can be solved. The encoding decision is sketched below.
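A minimal sketch of the sender-side decision described above, assuming a numeric preset threshold (the patent does not fix a concrete value) and illustrative function names:

```python
PRESET_RATIO = 0.4  # assumed value of the preset threshold, for illustration

def choose_encoding(diff_regions, frame_width, frame_height):
    """Return 'first' to encode only the N difference regions (first encoded
    data) when their area ratio is below the preset value, otherwise 'second'
    to encode the complete i-th frame (second encoded data)."""
    diff_area = sum((xmax - xmin + 1) * (ymax - ymin + 1)
                    for xmin, xmax, ymin, ymax in diff_regions)
    ratio = diff_area / (frame_width * frame_height)
    return "first" if ratio < PRESET_RATIO else "second"
```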
  • specifically, the pixel value of each pixel in the i-th frame interface image may be compared with the pixel value of the corresponding pixel in the (i-1)-th frame interface image to obtain the difference pixels in the i-th frame interface image; then the N difference regions including the difference pixels are determined.
  • a difference pixel is a pixel whose pixel value differs from that of the corresponding pixel in the (i-1)-th frame interface image; the i-th frame interface image may include multiple difference pixels, as in the sketch below.
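A minimal sketch of the pixel-by-pixel comparison, assuming the two frames are numpy arrays of shape H x W x C; the function name is illustrative:

```python
import numpy as np

def find_difference_mask(frame_prev: np.ndarray, frame_curr: np.ndarray) -> np.ndarray:
    """Return an H x W boolean mask marking the difference pixels: pixels of
    the i-th frame whose value differs from the corresponding pixel of the
    (i-1)-th frame in at least one channel."""
    return np.any(frame_prev != frame_curr, axis=-1)
```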
  • after the first electronic device obtains the difference pixels, it can determine the minimum regions including the difference pixels according to a preset region shape; these minimum regions are the N difference regions.
  • the shape of the preset area may be a rectangle, a circle, a polygon, or the like.
  • the shape of the N difference regions is a preset region shape.
  • the first electronic device may use the areas where the difference pixels are located as the N difference areas.
  • for example, the first electronic device compares the pixel values of the pixels in the i-th frame interface image with the pixel values of the corresponding pixels in the (i-1)-th frame interface image to obtain the difference pixels in the i-th frame interface image.
  • the oblique line area in (c) of FIG. 10 includes all the difference pixels in the ith frame of the interface image.
  • the first electronic device determines the smallest rectangular region R1 including all the difference pixels; that is, the N difference regions here consist of the single region R1.
  • the position information of the region R1 includes the abscissa xmin of the upper-left vertex P1 (equal to the abscissa of the lower-left vertex P3), the abscissa xmax of the upper-right vertex P2 (equal to the abscissa of the lower-right vertex P4), the ordinate ymin of the upper-left vertex P1 (equal to the ordinate of the upper-right vertex P2), and the ordinate ymax of the lower-left vertex P3 (equal to the ordinate of the lower-right vertex P4).
  • alternatively, the position information of the region R1 may include the coordinates of its four vertices: the upper-left vertex P1 at (xmin, ymin), the upper-right vertex P2 at (xmax, ymin), the lower-left vertex P3 at (xmin, ymax), and the lower-right vertex P4 at (xmax, ymax); see the sketch below.
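Continuing the sketch above, the smallest rectangle R1 containing all difference pixels can be derived from the difference mask; the coordinates follow the xmin/xmax/ymin/ymax convention of the position information:

```python
import numpy as np

def min_bounding_rect(diff_mask: np.ndarray):
    """Return (xmin, xmax, ymin, ymax) of the smallest rectangle containing
    all difference pixels, or None if the two frames are identical."""
    ys, xs = np.nonzero(diff_mask)
    if xs.size == 0:
        return None
    return int(xs.min()), int(xs.max()), int(ys.min()), int(ys.max())
```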
  • the N difference regions may include multiple independent regions.
  • the N difference regions are taken as one region as an example, which is not limited in this embodiment of the present application.
  • the first electronic device generates a first timestamp of the ith frame while generating the ith frame of the interface image.
  • the first electronic device may carry the first timestamp of the i-th frame in the second encoded information and send it to the second electronic device.
  • the first electronic device sends the encoded information (i.e., the first encoded information or the second encoded information) corresponding to the M frames of interface images to the second electronic device in the order in which the M frames are generated; however, due to different encoded data sizes and network fluctuations, the order in which the second electronic device receives the multiple pieces of encoded information may differ from the order in which the M frames of interface images were generated.
  • for example, the first electronic device sends to the second electronic device, in sequence, the encoded information corresponding to the i-th frame interface image (that is, the first encoded information or the second encoded information) and the encoded information corresponding to the (i+1)-th frame interface image; however, the second electronic device may first receive the encoded information corresponding to the (i+1)-th frame interface image and then receive the encoded information corresponding to the i-th frame interface image. Therefore, both the first encoded information and the second encoded information may include the first timestamp of the i-th frame, so that the second electronic device can refer to the first timestamp to display the interface images in chronological order.
  • if the area ratio of the N difference regions in the i-th frame interface image is greater than or equal to the preset value, the first electronic device determines that the first reference timestamp of the (i+1)-th frame is the first timestamp of the i-th frame; if the area ratio of the N difference regions in the i-th frame interface image is less than the preset value, the first electronic device determines that the first reference timestamp of the (i+1)-th frame is the first reference timestamp of the i-th frame.
  • the first encoding information further includes: the first timestamp of the ith frame and the first reference timestamp of the ith frame.
  • the first electronic device may generate a second time stamp when displaying the first interface image.
  • the second time stamp may be used as the first reference time stamp of the first frame.
  • that is, the first electronic device may use the first timestamp of the i-th frame as the first reference timestamp of the (i+1)-th frame when the area ratio of the N difference regions in the i-th frame interface image is greater than or equal to the preset value.
  • the first electronic device uses the first reference timestamp of the i-th frame as the first reference timestamp of the (i+1)-th frame when the area ratio of the N difference regions in the i-th frame interface image is less than the preset value; that is, the (i+1)-th frame continues to use the first reference timestamp of the i-th frame.
  • the first electronic device may carry the first time stamp of the i-th frame and the first reference time-stamp of the i-th frame in the first encoded information and send it to the second electronic device.
  • the first reference time stamp of the ith frame is the reference time at which the first electronic device records and projects the interface image of the ith frame to the second electronic device.
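The sender-side update rule for the first reference timestamp condenses to a few lines; this is a sketch under the same assumed preset threshold as above:

```python
def next_base_timestamp(base_ts_i, first_ts_i, area_ratio, preset=0.4):
    """First reference timestamp of frame i+1: rebase onto the first
    timestamp of frame i after a large update (area ratio >= preset),
    otherwise keep the first reference timestamp of frame i."""
    return first_ts_i if area_ratio >= preset else base_ts_i
```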
  • the second time stamp is the time stamp of the first interface image in the above S401.
  • if the area ratio of the N difference regions in the i-th frame interface image is less than the preset value, the difference between the i-th frame interface image and the (i-1)-th frame interface image is small.
  • if the area ratio, in the (i-1)-th frame interface image, of the N difference regions of the (i-1)-th frame interface image compared with the (i-2)-th frame interface image (referred to as the N difference regions of the (i-1)-th frame) is greater than the preset value, the (i-1)-th frame interface image differs considerably from the (i-2)-th frame interface image. Even though the difference between the i-th frame interface image and the (i-1)-th frame interface image is relatively small, the difference between the i-th frame interface image and the (i-2)-th frame interface image is then relatively large.
  • similarly, if the area ratio, in the (i-k)-th frame interface image, of the N difference regions of the (i-k)-th frame interface image compared with the (i-k-1)-th frame interface image (referred to as the N difference regions of the (i-k)-th frame) is greater than the preset value, the (i-k)-th frame interface image differs considerably from the (i-k-1)-th frame interface image and from the interface images before it; k is a positive integer.
  • if, due to reasons such as different encoded data sizes and network fluctuations, the second electronic device receives the first encoded information of the i-th frame interface image before the encoded information of the (i-k)-th frame interface image, and displays the content of the i-th frame interface image based on the received first encoded information, the problem of picture misalignment occurs.
  • therefore, the first electronic device uses the first timestamp of the i-th frame as the first reference timestamp of the (i+1)-th frame when it determines that the area ratio of the N difference regions in the i-th frame interface image is greater than or equal to the preset value; otherwise, the first reference timestamp of the i-th frame is used as the first reference timestamp of the (i+1)-th frame.
  • in this way, the first reference timestamp of the (i-k+1)-th frame is equal to the first timestamp of the (i-k)-th frame interface image.
  • if the area ratio of the N difference regions in each frame from the (i-k+1)-th frame interface image to the i-th frame interface image is less than the preset value, the first reference timestamp of every frame from the (i-k+1)-th frame to the i-th frame is equal to the first reference timestamp of the (i-k+1)-th frame. Since the first reference timestamp of the (i-k+1)-th frame is equal to the first timestamp of the (i-k)-th frame interface image, the first reference timestamp of the i-th frame is equal to the first timestamp of the (i-k)-th frame interface image.
  • the first electronic device carries the first timestamp of the i-th frame and the first reference timestamp of the i-th frame in the first encoded information. Because the first reference timestamp of the i-th frame is equal to the first timestamp of the (i-k)-th frame interface image, it can be used to instruct the second electronic device to display the content of the i-th frame interface image after the (i-k)-th frame interface image. In this way, the picture misalignment that would result from displaying the content of the i-th frame interface image before the (i-k)-th frame interface image can be avoided.
  • the process of generating the first reference timestamp by the first electronic device is described.
  • the first electronic device first displays the first interface image, and generates a second time stamp T0.
  • the first electronic device receives the update operation and performs the following processes in sequence: displaying the 1st frame interface image and generating the first timestamp T1 of the 1st frame; displaying the 2nd frame interface image and generating the first timestamp T2 of the 2nd frame; displaying the 3rd frame interface image and generating the first timestamp T3 of the 3rd frame; displaying the 4th frame interface image and generating the first timestamp T4 of the 4th frame; and displaying the 5th frame interface image and generating the first timestamp T5 of the 5th frame.
  • the arrow direction of the time axis represents the time sequence.
  • the second time stamp T0 may also be used as the first reference time stamp T11-JZ of the first frame.
  • the first electronic device determines that the area ratio, in the 1st frame interface image, of the N difference regions of the 1st frame interface image compared with the first interface image (referred to as the N difference regions of the 1st frame) is greater than the preset value, and uses the first timestamp T1 of the 1st frame as the first reference timestamp T21-JZ of the 2nd frame.
  • the first electronic device determines that the area ratio, in the 2nd frame interface image, of the N difference regions of the 2nd frame interface image compared with the 1st frame interface image (referred to as the N difference regions of the 2nd frame) is less than the preset value, and uses the first reference timestamp T21-JZ of the 2nd frame as the first reference timestamp T31-JZ of the 3rd frame.
  • the first electronic device determines that the area ratio, in the 3rd frame interface image, of the N difference regions of the 3rd frame interface image compared with the 2nd frame interface image (referred to as the N difference regions of the 3rd frame) is less than the preset value, and uses the first reference timestamp T31-JZ of the 3rd frame as the first reference timestamp T41-JZ of the 4th frame.
  • the first electronic device determines that the area ratio, in the 4th frame interface image, of the N difference regions of the 4th frame interface image compared with the 3rd frame interface image (referred to as the N difference regions of the 4th frame) is greater than the preset value, and uses the first timestamp T4 of the 4th frame as the first reference timestamp T51-JZ of the 5th frame.
  • the first electronic device also determines that the area ratio, in the 5th frame interface image, of the N difference regions of the 5th frame interface image compared with the 4th frame interface image (referred to as the N difference regions of the 5th frame) is less than the preset value.
  • accordingly, the first electronic device sequentially encodes the complete 1st frame interface image (i.e., an I frame or a P frame), the N difference regions of the 2nd frame, the N difference regions of the 3rd frame, the complete 4th frame interface image (i.e., an I frame or a P frame), and the N difference regions of the 5th frame.
  • since the first electronic device encodes the complete 1st frame interface image and the complete 4th frame interface image, the 1st frame interface image may be an I frame or a P frame, and the 4th frame interface image may likewise be an I frame or a P frame.
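The timeline above can be checked against the update rule sketched earlier; the following illustrative script reproduces the reference timestamps T11-JZ through T51-JZ (the ratio flags are taken from the example):

```python
# True means the area ratio of that frame's N difference regions
# is greater than or equal to the preset value.
ratio_ge_preset = {1: True, 2: False, 3: False, 4: True, 5: False}
first_ts = {1: "T1", 2: "T2", 3: "T3", 4: "T4", 5: "T5"}

base_ts = {1: "T0"}  # the second timestamp T0 is the base of the 1st frame
for i in range(1, 5):
    base_ts[i + 1] = first_ts[i] if ratio_ge_preset[i] else base_ts[i]

print(base_ts)  # {1: 'T0', 2: 'T1', 3: 'T1', 4: 'T1', 5: 'T4'}
```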
  • the first encoded information may further include a first identifier representing the N difference regions, and the second encoded information may further include a second identifier representing the i-th frame interface image.
  • for example, the first identifier may be information related to the transmission path used for sending the first encoded information (such as a port number), and the second identifier may be information related to the transmission path used for sending the second encoded information (such as a port number).
  • the codec standard used by the first electronic device to encode the image contents of the N difference regions may be different from the codec standard used to encode the i-th frame interface image. The first identifier can be used to instruct the second electronic device to decode the first encoded data using the codec standard that the first electronic device used when generating the first encoded data; the second identifier can be used to instruct the second electronic device to decode the second encoded data using the codec standard that the first electronic device used when generating the second encoded data.
  • the data structure of the first encoded information may be as shown in Table 1 below. Different bytes in the first encoded information store different data. The first encoded information specifically includes: (1) the encoding type of the first encoded data, which can take the value 0 or 1, where 0 indicates that the first encoded data is raw data (that is, the pixel values of all pixels in the N difference regions) and 1 indicates JPEG or another encoding method; (2) the width videoWidth of the (i-1)-th frame interface image; (3) the height videoHeight of the (i-1)-th frame interface image, where the unit of width and height can be pixels; (4) the first reference timestamp Basetimestamp of the i-th frame; (5) the first timestamp Playtimestamp of the i-th frame; (6) the total number N of difference regions; (7) the total length of the first encoded data (that is, the length len of the payload field); (8) the position information of the first difference region among the N difference regions, such as xmin, xmax, ymin, and ymax of the first difference region; (9) the data length len1 of the first difference region in the first encoded data; (10) the encoded data data1 of the first difference region, of length len1; and so on for each remaining difference region, ending with the encoded data dataN of the N-th difference region, of length lenN.
  • unit8 indicates that the data is an unsigned 8-bit integer
  • unit32 indicates that the data is an unsigned 32-bit integer
  • unit64 indicates that the data is an unsigned 64-bit integer
  • unit32[4] indicates that the data is an array of unsigned 32-bit integers with length equal to 4
  • unit8[len1] indicates that the data is an array of unsigned 8-bit integers with length equal to len1
  • unit8[lenN] indicates that the data is an array of unsigned 8-bit integers with length equal to lenN
  • the first encoded information may be divided into a frame header and a payload field (payload) that store different data, as in the serialization sketch below.
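A sketch of serializing the frame header and payload laid out in Table 1; the field order follows the enumeration above, the field widths follow the unit8/unit32/unit64 annotations where given (uint32 is assumed for N and len), and big-endian byte order is an assumption, since the text does not specify one:

```python
import struct

def pack_first_encoded_info(enc_type, width, height, base_ts, play_ts, regions):
    """Build the first encoded information: a fixed frame header followed by
    a payload of (position info, length, data) entries, one per region."""
    payload = b""
    for (xmin, xmax, ymin, ymax), data in regions:
        payload += struct.pack(">4I", xmin, xmax, ymin, ymax)  # position info
        payload += struct.pack(">I", len(data))                # lenK
        payload += data                                        # dataK
    header = struct.pack(">BIIQQII", enc_type, width, height,
                         base_ts, play_ts, len(regions), len(payload))
    return header + payload
```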
  • the second encoding information further includes: a first timestamp of the ith frame, where the first timestamp of the ith frame is used to record the time when the first electronic device generates the ith frame of the interface image.
  • after receiving the second encoded information, the second electronic device determines that the second reference timestamp of the (i+1)-th frame is the first timestamp of the i-th frame; the second reference timestamp is the reference time at which the second electronic device records the screen projection.
  • the second encoded information may include the above-mentioned second identifier.
  • the second electronic device may determine that the second reference timestamp of the i+1 th frame is the first timestamp of the ith frame when it is determined that the second encoded information includes the above-mentioned second identifier.
  • the first encoding information further includes: a first time stamp of the ith frame and a first reference time stamp of the ith frame, where the first reference time stamp is the reference time at which the first electronic device records the screen projection.
  • the second electronic device determines that the time recorded by the first timestamp of the i-th frame is later than the time recorded by the first reference timestamp of the i-th frame, and that the time recorded by the first reference timestamp of the i-th frame is equal to the time recorded by the second reference timestamp of the i-th frame.
  • that is, after receiving the first encoded information from the first electronic device, the second electronic device decodes the first encoded data to obtain the image contents of the N difference regions only when the time recorded by the first timestamp of the i-th frame is later than the time recorded by the first reference timestamp of the i-th frame and the time recorded by the first reference timestamp of the i-th frame is equal to the time recorded by the second reference timestamp of the i-th frame, as in the check sketched below.
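The receiver-side admission check reduces to two timestamp comparisons; a sketch with illustrative names:

```python
def should_decode_first_info(play_ts_i, base_ts_i, local_base_ts_i):
    """Decode the first encoded data of frame i only when its first timestamp
    is later than its first reference timestamp, and that reference equals the
    receiver's second reference timestamp for frame i."""
    return play_ts_i > base_ts_i and base_ts_i == local_base_ts_i
```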
  • the second electronic device may further determine that the second reference timestamp of the i+1 th frame is the second reference timestamp of the ith frame.
  • if the second electronic device receives the second encoded information of the (i-1)-th frame interface image (referred to as the second encoded information of the (i-1)-th frame), it uses the first timestamp of the (i-1)-th frame in that information as the second reference timestamp of the i-th frame (that is, the second reference timestamp of the i-th frame is equal to the first timestamp of the (i-1)-th frame), and also uses the second encoded information of the (i-1)-th frame to display the content of the (i-1)-th frame interface image.
  • receiving the second encoded information of the (i-1)-th frame indicates that the area ratio of the N difference regions of the (i-1)-th frame in the (i-1)-th frame interface image is greater than the preset value; this means that the (i-1)-th frame interface image differs considerably from the (i-2)-th frame interface image, and the i-th frame interface image then differs even more from the (i-2)-th frame interface image.
  • if the second electronic device receives the first encoded information of the i-th frame interface image (referred to as the first encoded information of the i-th frame), the area ratio of the N difference regions of the i-th frame in the i-th frame interface image is less than the preset value, which indicates that the difference between the i-th frame interface image and the (i-1)-th frame interface image is small.
  • the first encoding information of the ith frame of the interface image may include the first reference timestamp of the ith frame.
  • the first electronic device uses the first timestamp of the (i-1)-th frame as the first reference timestamp of the i-th frame when it determines that the area ratio of the N difference regions of the (i-1)-th frame in the (i-1)-th frame interface image is greater than the preset value; therefore, the first reference timestamp of the i-th frame is equal to the first timestamp of the (i-1)-th frame. Combined with the fact that the second reference timestamp of the i-th frame is equal to the first timestamp of the (i-1)-th frame, it follows that the first reference timestamp of the i-th frame is equal to the second reference timestamp of the i-th frame.
  • the first reference timestamp of the i-th frame being equal to the second reference timestamp of the i-th frame means that the difference between the i-th frame interface image and the (i-1)-th frame interface image is small, and also indicates that the second electronic device has already displayed the content of the (i-1)-th frame interface image. Therefore, when the first reference timestamp of the i-th frame is equal to the second reference timestamp of the i-th frame, the second electronic device uses the first encoded information of the i-th frame to display the content of the i-th frame interface image.
  • this prevents the picture misalignment that would arise when the order in which the second electronic device receives the encoded information (that is, the first encoded information or the second encoded information) corresponding to the M frames of interface images differs from the order in which the first electronic device generates the M frames of interface images.
  • in some embodiments, the second electronic device may determine that the time recorded by the first timestamp of the i-th frame is not later than (that is, earlier than or equal to) the time recorded by the first reference timestamp of the i-th frame, or that the time recorded by the first reference timestamp of the i-th frame is earlier than the time recorded by the second reference timestamp of the i-th frame, and in either case not process, or delete, the first encoded information of the i-th frame interface image.
  • in the case where the time recorded by the first timestamp of the i-th frame is later than the time recorded by the first reference timestamp of the i-th frame, but the time recorded by the first reference timestamp of the i-th frame is later than the time recorded by the second reference timestamp of the i-th frame, the first encoded information of the i-th frame interface image is likewise not processed for the time being.
  • if, when the time recorded by the first timestamp of the i-th frame is later than the time recorded by the first reference timestamp of the i-th frame and the time recorded by the first reference timestamp of the i-th frame is later than the time recorded by the second reference timestamp of the i-th frame, the second electronic device then receives the second encoded information of the j-th frame interface image (referred to as the second encoded information of the j-th frame), it uses the first timestamp of the j-th frame as the second reference timestamp of the (j+1)-th frame and updates the interface image using the second encoded information of the j-th frame.
  • the second electronic device may then execute S409-S410 on the first encoded information of the i-th frame interface image when the time recorded by the first reference timestamp of the i-th frame is equal to the time recorded by the second reference timestamp of the (j+1)-th frame.
  • the first encoded information of the i-th frame received by the second electronic device includes the first timestamp of the i-th frame and the first reference timestamp of the i-th frame. If the time recorded by the first timestamp of the i-th frame is not later than the time recorded by the first reference timestamp of the i-th frame, the first encoded information of the i-th frame is erroneous; therefore, the second electronic device does not process the first encoded information.
  • if the time recorded by the first reference timestamp of the i-th frame is earlier than the time recorded by the second reference timestamp of the i-th frame, the second electronic device has already received the second encoded information of the (i+h)-th frame interface image (referred to as the second encoded information of the (i+h)-th frame) and has used the first timestamp of the (i+h)-th frame in that information as the second reference timestamp of the i-th frame.
  • the second electronic device has also used the second encoded information of the (i+h)-th frame to display the content of the (i+h)-th frame interface image, so the content of the i-th frame interface image differs considerably from the content of the currently displayed interface image; therefore, the second electronic device does not process the first encoded information of the i-th frame, that is, it does not display the content of the i-th frame interface image, which avoids the problem of picture misalignment.
  • h is a positive integer.
  • if the time recorded by the first timestamp of the i-th frame is later than the time recorded by the first reference timestamp of the i-th frame, and the time recorded by the first reference timestamp of the i-th frame is later than the time recorded by the second reference timestamp of the i-th frame, then the first reference timestamp of the i-th frame is equal to the first timestamp of the (i-k)-th frame, while the second reference timestamp of the i-th frame is equal to the first timestamp of the (i-k-q)-th frame.
  • the second reference timestamp of the i-th frame being equal to the first timestamp of the (i-k-q)-th frame means that the second electronic device has not received the second encoded information of the (i-k)-th frame and is still displaying the content of an interface image before the (i-k)-th frame; the first reference timestamp of the i-th frame being equal to the first timestamp of the (i-k)-th frame means that, compared with the interface images before the (i-k)-th frame, the i-th frame interface image differs less from the (i-k)-th frame interface image. It follows that the content of the i-th frame interface image differs considerably from the content of the interface image currently displayed by the second electronic device; therefore, the first encoded information of the i-th frame is not processed, that is, the content of the i-th frame interface image is not displayed. Then, if the second electronic device receives the second encoded information of the j-th frame, it uses the first timestamp of the j-th frame in that information as the second reference timestamp of the (j+1)-th frame and uses the second encoded information of the j-th frame to display the content of the j-th frame interface image.
  • if the first reference timestamp of the i-th frame is equal to the second reference timestamp of the (j+1)-th frame, the j-th frame is the (i-k)-th frame; that is, the second electronic device is now displaying the content of the (i-k)-th frame interface image, which differs little from the i-th frame interface image.
  • at this time, the second electronic device uses the first encoded information of the i-th frame to display the content of the i-th frame interface image, which prevents the picture misalignment caused when the order in which the second electronic device receives the encoded information (that is, the first encoded information or the second encoded information) corresponding to the M frames of interface images differs from the order in which the first electronic device generates the M frames of interface images.
  • k and q are positive integers, and k and q may be equal or unequal; the three receiver-side cases are summarized in the sketch below.
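This is a sketch only, with an assumed packet/state layout (play_ts, base_ts, and the receiver's current second reference timestamp); all names are illustrative:

```python
def handle_first_encoded_info(info, state):
    """Classify the first encoded information of frame i: 'discard' malformed
    or stale packets, 'defer' packets whose reference is ahead of the receiver
    until the matching second encoded information (full frame) arrives, and
    'display' packets whose timestamps line up."""
    if info.play_ts <= info.base_ts or info.base_ts < state.local_base_ts:
        return "discard"            # erroneous, or the display has moved past it
    if info.base_ts > state.local_base_ts:
        state.pending = info        # wait for the missing full frame (I/P frame)
        return "defer"
    return "display"                # S409-S410 can be executed
```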
  • for example, the first electronic device sequentially sends to the second electronic device the second encoded information of the 1st frame interface image (referred to as the second encoded information of the 1st frame), the first encoded information of the 2nd frame interface image (referred to as the first encoded information of the 2nd frame), the first encoded information of the 3rd frame interface image (referred to as the first encoded information of the 3rd frame), the second encoded information of the 4th frame interface image (referred to as the second encoded information of the 4th frame), and the first encoded information of the 5th frame interface image (referred to as the first encoded information of the 5th frame).
  • the second encoded information of the first frame includes the first time stamp T1 of the first frame.
  • the second encoded information of the 4th frame includes the first time stamp T4 of the 4th frame.
  • the first electronic device may send the first encoded information through the first transmission path, and send the second encoded information through the second transmission path.
  • due to different encoded data sizes and network fluctuations, the second electronic device may receive, in sequence, the second encoded information of the 1st frame interface image, the first encoded information of the 2nd frame interface image, the first encoded information of the 3rd frame interface image, the first encoded information of the 5th frame interface image, and then the second encoded information of the 4th frame interface image.
  • an embodiment of the present application provides a multi-screen collaborative display method, which is applied to the first electronic device and the second electronic device in the above-mentioned first usage scenario, where the first electronic device is connected to the second electronic device.
  • the display method may include S601.
  • the display method may include S602.
  • the display method may include S603.
  • the display method may include S604.
  • S408 in the display method may include S605.
  • the display method may include S606.
  • S410 in the display method may include S607.
  • S412 in the display method may include S608.
  • the display method may include S609.
  • the first electronic device generates a second time stamp, and determines that the second time stamp is the first reference time stamp of the first frame.
  • the second time stamp is used to record the time when the first electronic device generates the first interface image
  • the first reference time stamp is the reference time at which the first electronic device records the screen projection
  • the second electronic device determines that the second reference timestamp of the first frame is the second timestamp.
  • the first electronic device generates a first timestamp of the ith frame.
  • the first timestamp of the ith frame is used to record the time when the first electronic device generates the ith frame of the interface image.
  • the first electronic device determines that the first reference timestamp of the i+1 th frame is the first reference timestamp of the ith frame.
  • the first encoded information further includes position information of N difference regions in the ith frame of interface image compared to the i-1 th interface image (ie, position information of the N difference regions in the ith frame of the interface image).
  • the second electronic device determines that the second reference timestamp of the i+1 th frame is the second reference timestamp of the ith frame.
  • the second electronic device displays the second screen projection interface when the time recorded by the first timestamp of the i-th frame is later than the time recorded by the first reference timestamp of the i-th frame and the time recorded by the first reference timestamp of the i-th frame is equal to the time recorded by the second reference timestamp of the i-th frame.
  • the second electronic device determines that the second reference timestamp of the i+1 th frame is the first timestamp of the ith frame.
  • in the reverse control scenario, the first electronic device projects the content of the interface image displayed by the first electronic device to the second electronic device, and the interface image can also be updated through an operation received by the second electronic device.
  • the second electronic device receives an update operation of the interface image by the user.
  • the second electronic device instructs the first electronic device to update the interface image, receives the updated interface image from the first electronic device, and displays the content of the updated interface image.
  • in the display method for multi-screen collaboration applied to the first electronic device and the second electronic device, S701-S704 may be executed after S403 and before S406, in which case S404 is not executed.
  • the second electronic device receives an update operation of the first interface image by the user.
  • the second electronic device sends an update instruction to the first electronic device, where the update instruction is triggered by an update operation on the first screen projection interface.
  • the first electronic device receives an update instruction from the second electronic device.
  • the update instruction is used to trigger the first electronic device to sequentially display M frames of interface images.
  • the first interface image 501 displayed by the mobile phone 300 is an editing interface image for editing a picture, and the mobile phone 300 instructs (or triggers) the notebook computer 510 to display the first screen projection interface 502.
  • the notebook computer 510 is wirelessly connected to the stylus 507, and the notebook computer 510 receives the user's update operation on the first interface image through the stylus 507, for example, a drawing operation of drawing a heart shape with the stylus 507 on the first screen projection interface.
  • the first interface image 501 displayed by the notebook computer 520 is an editing interface image for editing a picture, and the notebook computer 520 instructs (or triggers) the tablet computer 505 to display the first screen projection interface 502.
  • the tablet computer 505 is wirelessly connected to the stylus 507, and the tablet computer 505 receives the user's update operation on the first interface image through the stylus 507, such as a drawing operation of drawing a heart shape with the stylus 507 on the first screen projection interface.
  • the notebook computer 520 sequentially displays M frames of interface images in the process of drawing the heart shape, and displays an interface image 503 including the heart shape.
  • the notebook computer 520 sends the first encoded information or the second encoded information of each frame interface image to the tablet computer 505. When the change in a frame interface image relative to the previous frame is small, the notebook computer 520 can encode only the content that differs from the previous frame to obtain the first encoded data and transmit the first encoded information including the first encoded data, which reduces the encoding/decoding time and the transmission time of the encoded data. Therefore, the tablet computer 505 can use the received first encoded information or second encoded information to display the screen projection interface 504 including the heart shape more quickly.
  • the first electronic device receives an update instruction from the second electronic device when displaying the first interface image.
  • the first electronic device adopts the conventional technology (such as H.264 encoding), displays the M frames of interface images in response to the update instruction, and instructs (or triggers) the second electronic device to display the M frames of interface images; in this process, the measured frame rate at which the second electronic device displays the M frames of interface images (i.e., the screen projection frame rate) is equal to 48 FPS, and the measured duration from the second electronic device receiving the update operation to displaying the M frames of interface images is 130 ms.
  • that is, the reverse-control latency between the second electronic device and the first electronic device is 130 ms.
  • the first electronic device adopts the method provided by the embodiments of the present application, displays the M frames of interface images in response to the update instruction, and instructs (or triggers) the second electronic device to display the M frames of interface images.
  • in this process, the measured frame rate at which the second electronic device displays the M frames of interface images (i.e., the screen projection frame rate) is equal to 60 FPS.
  • the measured duration from the second electronic device receiving the update operation to displaying the M frames of interface images is 58 ms, that is, the reverse-control latency between the second electronic device and the first electronic device is 58 ms.
  • it can be seen that, with the method provided by the embodiments of the present application, the reverse-control latency is smaller, and the frame rate at which the second electronic device displays the M frames of interface images is equal to 60 FPS, indicating that the second electronic device does not freeze when updating the interface image.
  • an embodiment of the present application provides a multi-screen collaborative display method, which is applied to the first electronic device and the second electronic device in the above-mentioned reverse control scenario, where the first electronic device is connected to the second electronic device.
  • the display method may include S801. After S403, the display method may include S802.
  • the display method may include S803.
  • the display method may include S804.
  • S408 in the display method may include S805.
  • the display method may include S806.
  • S410 in the display method may include S807.
  • S412 in the display method may include S808.
  • the display method may include S809.
  • the first electronic device generates a second time stamp, and determines that the second time stamp is the first reference time stamp of the first frame.
  • the second time stamp is used to record the time when the first electronic device generates the first interface image
  • the first reference time stamp is the reference time at which the first electronic device records the screen projection
  • the second electronic device determines that the second reference timestamp of the first frame is the second timestamp.
  • the first electronic device generates a first timestamp of the ith frame.
  • the first electronic device determines that the first reference timestamp of the i+1 th frame is the first reference timestamp of the ith frame.
  • the first electronic device sends the first encoding information to the second electronic device; the first encoding information includes: first encoded data, a first timestamp of the ith frame, and a first reference timestamp of the ith frame.
  • the first encoded information further includes position information of N difference regions in the ith frame of interface image compared to the i-1 th interface image (ie, position information of the N difference regions in the ith frame of the interface image).
  • the second electronic device determines that the second reference timestamp of the i+1 th frame is the second reference timestamp of the ith frame.
  • the second electronic device displays the second screen projection interface when the time recorded by the first timestamp of the i-th frame is later than the time recorded by the first reference timestamp of the i-th frame and the time recorded by the first reference timestamp of the i-th frame is equal to the time recorded by the second reference timestamp of the i-th frame.
  • the first electronic device sends second encoded information to the second electronic device; the second encoded information includes: second encoded data and the first timestamp of the i-th frame.
  • the second electronic device determines that the second reference timestamp of the i+1 th frame is the first timestamp of the ith frame.
  • the above method can be implemented by a multi-screen collaborative display device.
  • the multi-screen collaborative display device includes corresponding hardware structures and/or software modules for executing each function.
  • the embodiments of the present application can be implemented in hardware or a combination of hardware and computer software. Whether a function is performed by hardware or computer software driving hardware depends on the specific application and design constraints of the technical solution. Skilled persons may use different methods to implement the described functions for each specific application, but such implementations should not be considered beyond the scope of the embodiments of the present application.
  • each functional module may be divided corresponding to each function, or two or more functions may be integrated into one processing module.
  • the above-mentioned integrated modules can be implemented in the form of hardware, and can also be implemented in the form of software function modules. It should be noted that, the division of modules in the embodiments of the present application is schematic, and is only a logical function division, and there may be other division manners in actual implementation.
  • FIG. 18 shows a possible schematic structural diagram of the first electronic device involved in the above embodiment.
  • the first electronic device 900 is connected to the second electronic device.
  • the first electronic device 900 includes a display module 901, a sending module 902, a receiving module 903, and a cooperative processing module 904.
  • the display module 901 is used to display the first interface image; the cooperative processing module 904 is used to encode the first interface image to obtain the encoded data of the first interface image; the sending module 902 is used to send the encoded information of the first interface image to the second electronic device; the receiving module 903 is used to receive an update operation on the first interface image, where the update operation is used to trigger the first electronic device 900 to sequentially display M frames of interface images, M being a positive integer; the cooperative processing module 904 is further used to display the i-th frame interface image in response to the update operation, obtain the N difference regions in the i-th frame interface image compared with the (i-1)-th frame interface image, and, if the area ratio of the N difference regions in the i-th frame interface image is less than the preset value, encode the image contents of the N difference regions to obtain the first encoded data; the sending module 902 is further used to send the first encoded information to the second electronic device.
  • the encoded information of the first interface image includes encoded data of the first interface image.
  • the encoded information of the first interface image is used to trigger the second electronic device to display the first screen projection interface based on the encoded information of the first interface image, and the content of the first screen projection interface is a mirror image of the first interface image.
  • the first encoded information includes the first encoded data and the position information of the N difference regions in the i-th frame interface image; the first encoded information is used to trigger the second electronic device to update the (i-1)-th frame interface image based on the first encoded information to obtain and display the second screen projection interface.
  • the cooperative processing module 904 is further configured to: in response to the update operation, generate a first timestamp of the ith frame; the first timestamp of the ith frame is used to record that the first electronic device generates the first timestamp The time of the i frame of the interface image; wherein, the second encoded information further includes the first time stamp of the i th frame.
  • Embodiments of the present application further provide an electronic device, where the electronic device is a first electronic device.
  • the first electronic device may include a processor and a memory.
  • the memory is used to store computer program codes, and the computer program codes include computer instructions; the processor is used to run the computer instructions, so that the first electronic device performs various functions or steps performed by the mobile phone 300 or the notebook computer 520 in the above method embodiments.
  • the first encoded information includes the first encoded data and the position information of the N difference regions in the i-th frame interface image compared with the (i-1)-th frame interface image; the cooperative processing module 1003 is used to decode the first encoded data to obtain the image contents of the N difference regions; the display module 1001 is further used to display the second screen projection interface.
  • the receiving module 1002 is further configured to receive the second encoded information from the first electronic device, where the second encoded information includes the second encoded data obtained by encoding the i-th frame interface image.
  • the cooperative processing module 1003 is further used to decode the second encoded data to obtain the i-th frame interface image; the display module 1001 is further used to display the third screen projection interface, the content of which is a mirror image of the i-th frame interface image.
  • the computer instructions are stored in a second electronic device (such as the notebook computer 510 shown in 9 or the tablet computer 505 shown in FIG. 16, the second electronic device 1000 shown in Each function or step performed.
  • the computer-readable storage medium may be ROM, RAM, CD-ROM, magnetic tape, floppy disk, optical data storage device, and the like.
  • another embodiment of the present application provides a computer program product including one or more instructions; the one or more instructions may run on a second electronic device (such as the notebook computer 510 shown in FIG. 8 or FIG. 15, the tablet computer 505 shown in FIG. 9 or FIG. 16, or the second electronic device 1000 shown in FIG. 19), so that the second electronic device performs each function or step performed by the notebook computer 510, the tablet computer 505, or the second electronic device 1000 in the foregoing method embodiments.
  • the disclosed apparatus and method may be implemented in other manners.
  • the apparatus embodiments described above are only illustrative; for example, the division into modules or units is only a division by logical function, and other divisions may be used in actual implementation: multiple units or components may be combined or integrated into another device, and some features may be omitted or not implemented.
  • the mutual coupling, direct coupling, or communication connection shown or discussed may be implemented through some interfaces, or as an indirect coupling or communication connection between devices or units, and may be electrical, mechanical, or in other forms.
  • the integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a readable storage medium.
  • a readable storage medium includes several instructions that cause a device (which may be a single-chip microcomputer, a chip, or the like) or a processor to execute all or part of the steps of the methods described in the embodiments of the present application.
  • the aforementioned storage medium includes media that can store program code, such as a USB flash drive, a removable hard disk, a ROM, a RAM, a magnetic disk, or an optical disc.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Compression Or Coding Systems Of Tv Signals (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)

Abstract

The present application provides a multi-screen collaborative display method and an electronic device, relates to the field of distributed control technologies, and can solve the stuttering problem that occurs when a second electronic device updates an interface image in a multi-screen collaboration scenario. A specific solution applied to a first electronic device includes: displaying a first interface image, and encoding the first interface image to obtain encoded data of the first interface image; sending encoding information of the first interface image to the second electronic device; in response to an update operation on the first interface image, displaying the i-th frame interface image, and obtaining N difference regions of the i-th frame interface image relative to the (i-1)-th frame interface image; if the area ratio of the N difference regions in the i-th frame interface image is smaller than a preset value, encoding the image content of the N difference regions to obtain first encoded data; and sending first encoding information to the second electronic device, where the first encoding information includes the first encoded data and position information of the N difference regions in the i-th frame interface image.

Description

一种多屏协同的显示方法及电子设备
本申请要求于2020年11月30日提交国家知识产权局、申请号为202011381303.4、发明名称为“一种多屏协同的显示方法及电子设备”的中国专利申请的优先权,其全部内容通过引用结合在本申请中。
技术领域
本申请实施例涉及分布式控制技术领域,尤其涉及一种多屏协同的显示方法及电子设备。
背景技术
在多屏协同场景中,投屏设备(如手机)和被投屏设备(如个人计算机(Personal Computer,PC))建立连接(如有线连接或无线连接)后,手机可以将手机的界面图像投屏到PC上显示。
第一种使用场景下,在手机向PC投屏的过程中,手机可以接收用户对界面图像的更新操作;手机响应该更新操作,可以更新手机的界面图像,并指示PC也更新PC的界面图像。
第二种使用场景(称为反控场景)下,在手机向PC投屏的过程中,还可以由PC接收用户对PC显示的界面图像的更新操作。PC可以接收用户对PC的更新操作;然后向手机发送更新指令,以指示手机响应于该更新指令更新界面图像。手机响应该更新指令,可以更新手机的界面图像,并指示PC也更新PC的界面图像。
上述手机指示PC更新界面图像,具体可以包括:手机对更新的界面图像进行编码得到编码数据;手机向PC传输该编码数据;PC接收该编码数据并对该编码数据进行解码,得到更新的界面图像;PC显示更新的界面图像。
其中,编解码耗时较长以及网络波动等原因,在多屏协同场景下,PC更新界面图像会存在明显的卡顿。并且,随着用户对设备显示画面分辨率的要求越来越高,多屏协同场景的卡顿问题越来越严重。尤其在上述反控场景中,需要PC接收更新操作并向手机发送更新指令,手机才可以响应于更新指令更新界面图像,并指示PC更新界面图像。这样,无疑会增加PC更新界面图像的耗时,进一步增大PC更新界面图像出现卡顿现象的可能性。
发明内容
本申请提供一种多屏协同的显示方法及电子设备,可以解决多屏协同场景下第二电子设备更新界面图像存在的卡顿问题。
为达到上述目的,本申请采用如下技术方案:
第一方面提供一种多屏协同的显示方法,应用于第一电子设备,第一电子设备连接第二电子设备。该显示方法包括:第一电子设备先显示第一界面图像,并对第一界面图像编码得到第一界面图像的编码数据;再向第二电子设备发送第一界面图像的编码信息;然后,若第一电子设备接收对第一界面图像的更新操作,则响应于更新操作,显示第i帧界面图像,并获取第i帧界面图像中、相较于第i-1帧界面图像的N个差异区域;再然后,若N个差异区域在第i帧界面图像的面积占比小于预设值,第一电子设备对N个差异区域的图像内容进行编码,得到第一编码数据;最后,第一电子设备向第二电子设备发送第一编码信息。
其中,第一界面图像的编码信息包括第一界面图像的编码数据;第一界面图像的编码信息用于触发第二电子设备基于第一界面图像的编码信息显示第一投屏界面,第一投屏界面的内容是第一界面图像的镜像。更新操作用于触发第一电子设备依次显示M帧界面图像,M为正整数。i在{1,……,M}中依次取值;第0帧界面图像为第一界面图像;N个差异区域中像素点的像素值与第i-1帧界面图像中对应像素点的像素值不同,N为正整数。第一编码信息包括第一编码数据和N个差异区域在第i帧界面图像中的位置信息;第一编码信息用于触发第二电子设备基于第一编码信息更新第i-1帧界面图像得到第二投屏界面并显示第二投屏界面;第二投屏界面的内容是第i帧界面图像的镜像。
第一方面提供的多屏协同的显示方法,第一电子设备显示第一界面图像,并将第一界面图像的内容投屏到第二电子设备上。第一电子设备若接收到对第一界面图像的更新操作,则显示M帧界面图像,并获取第i帧界面图像中、相较于第i-1帧界面图像的N个差异区域。如果N个差异区域在第i帧界面图像的面积占比小于预设值,第一电子设备只对N个差异区域编码得到第一编码数据;然后,将包括第一编码数据的第一编码信息发送给第二电子设备。其中,N个差异区域包括第i帧界面图像相较于第i-1帧界面图像的像素值不同的像素点。那么,第一电子设备对N个差异区域编码,就是对第i帧界面图像中的与第i-1帧界面图像不同的内容进行编码,相较于对第i帧界面图像编码,可以减少编解码耗时以及第一编码数据的大小,而传输较小的第一编码数据也可以缩短第一编码数据的传输时长。进而,由于减小了编解码耗时和缩短了第一编码数据的传输时长,则第二电子设备可以更快地利用包括第一编码数据的第一编码信息更新第i-1帧界面图像,显示第i帧界面图像的镜像。如此,可以解决多屏协同场景下第二电子设备更新界面图像存在的卡顿问题。
在一种可能的设计方式中,上述第一电子设备接收对第一界面图像的更新操作,包括:第一电子设备接收来自第二电子设备的更新指令,更新指令是对第一投屏界面的更新操作触发的;或者,第一电子设备接收用户对第一电子设备显示的第一界面图像的更新操作。
在该设计方式中,描述了该显示方法的几种使用场景。在第一使用场景下,第一电子设备直接接收用户对第一电子设备显示的第一界面图像的更新操作,更新第一电子设备的界面图像,并指示(或触发)第二电子设备更新界面图像。在反控场景下,由第二电子设备接收用户对第一电子设备显示的第一界面图像的更新操作,然后向第一电子设备发送更新指令;第一电子设备再响应于该更新指令,更新第一电子设备的界面图像,并指示(或触发)第二电子设备更新界面图像。
其中,根据上述分析可知采用该显示方法,第二电子设备可以更快地利用包括第一编码数据的第一编码信息更新第i-1帧界面图像,显示第i帧界面图像的镜像。那么,在反控场景下,第二电子设备在接收用户的该更新操作后,快速地利用包括第一编码数据的第一编码信息更新第i-1帧界面图像,表示第一电子设备对第二电子设备的反控时延较小。也就是说,在反控场景下采用该显示方法,可以减小第一电子设备对第二电子设备的反控时延。
另一种可能的设计方式中,上述获取第i帧界面图像中、相较于第i-1帧界面图像的N个差异区域,包括:第一电子设备将第i帧界面图像中每个像素点的像素值,与第i-1帧界面图像中对应像素点的像素值进行对比,得到第i帧界面图像中的差异像素点;第一电子设备确定包括第i帧界面图像中的差异像素点的N个差异区域。其中,差异像素点的像素值与第i-1帧界面图像中对应像素点的像素值不同。
在该设计方式中,描述了第一电子设备获取第i帧界面图像中、相较于第i-1帧界面图像的N个差异区域的一种实现方式。
另一种可能的设计方式中,该显示方法还包括:若N个差异区域在第i帧界面图像的面积占比大于预设值,第一电子设备对第i帧界面图像进行编码,得到第二编码数据;第一电子设备向第二电子设备发送第二编码信息;其中,第二编码信息包括第二编码数据,第二编码信息用于触发第二电子设备基于第二编码信息显示第三投屏界面,第三投屏界面的内容是第i帧界面图像的镜像。
在该设计方式中,若N个差异区域在第i帧界面图像的面积占比大于预设值,则对N个差异区域编解码的耗时,和对第i帧界面图像编解码的耗时相差不大。也就是说,第一电子设备对N个差异区域编码后发送至第二电子设备,或者对第i帧界面图像编码后发送至第二电子设备,第二电子设备更新界面图像时的卡顿程度差别不大,则第一电子设备可以在N个差异区域在第i帧界面图像的面积占比大于预设值的情况下,采用常规技术对第i帧界面图像编码后传输。
另一种可能的设计方式中,该显示方法还包括:第一电子设备响应于更新操作,生成第i帧的第一时间戳;第i帧的第一时间戳用于记录第一电子设备生成第i帧界面图像的时间;其中,第二编码信息还包括第i帧的第一时间戳。
在该设计方式中,描述了第一电子设备生成第二编码信息的一种实现方式。考虑到虽然第一电子设备按照生成M帧界面图像的先后顺序,依次向第二电子设备发送M帧界面图像对应的编码信息(即第一编码信息或第二编码信息),但是由于编码数据大小不同和网络波动等原因,第二电子设备接收到多个编码信息(即第一编码信息或第二编码信息)的顺序可能与生成M帧界面图像的先后顺序不同。例如,第一电子设备依次向第二电子设备发送第i帧界面图像对应的编码信息(即第一编码信息或第二编码信息)、第i+1帧界面图像对应的编码信息(即第一编码信息或第二编码信息)时,由于编码数据大小不同和网络波动等原因,第二电子设备可能先接收到第i+1帧界面图像对应的编码信息,再接收到第i帧界面图像对应的编码信息。因此,第一编码信息和第二编码信息都可以包括第i帧的第一时间戳,以指示第二电子设备可以参考第一时间戳,按照时间先后顺序显示界面图像。
另一种可能的设计方式中,该显示方法还包括:第一电子设备生成第二时间戳,并保存第二时间戳,第二时间戳用于记录第一电子设备生成第一界面图像的时间,第二时间戳是第1帧的第一基准时间戳,第一基准时间戳是第一电子设备记录投屏的基准时间;若N个差异区域在第i帧界面图像的面积占比大于预设值,第一电子设备确定第i+1帧的第一基准时间戳为第i帧的第一时间戳;若N个差异区域在第i帧界面 图像的面积占比小于预设值,第一电子设备确定第i+1帧的第一基准时间戳为第i帧的第一基准时间戳。其中,第一编码信息还包括:第i帧的第一时间戳和第i帧的第一基准时间戳。
可以理解的是,如果第i帧界面图像中、相较于第i-1帧界面图像的N个差异区域(简称为第i帧的N个差异区域)在第i帧界面图像的面积占比小于预设值,则表示第i帧界面图像与第i-1帧界面图像的差异较小。其次,如果第i-1帧界面图像中、相较于第i-2帧界面图像的N个差异区域(简称为第i-1帧的N个差异区域)在第i-2帧界面图像的面积占比大于预设值,则表示第i-1帧界面图像相较于第i-2帧界面图像差异较大。而第i帧界面图像与第i-1帧界面图像的差异较小,则第i帧界面图像与该第i-2帧界面图像差异也较大。
同理可知,如果第i-k帧界面图像中、相较于第i-k-1帧界面图像的N个差异区域(简称为第i-k帧的N个差异区域)在第i-k帧界面图像的面积占比大于预设值,则表示第i-k帧界面图像与第i-k-1帧界面图像及其之前的界面图像的差异都较大,k为一个正整数。其次,如果从第i-k+1帧界面图像到第i帧界面图像中任一帧的N个差异区域在任一帧界面图像中的面积占比都小于预设值,则表示第i帧界面图像与第i-k帧界面图像及其到第i帧界面图像之间的界面图像的差异较小。因此,若因为编码数据大小不同和网络波动等原因,第二电子设备在第i-k帧界面图像之前先接收到第i帧界面图像的第一编码信息;并且,在显示第i-k帧界面图像之前利用接收到的第一编码信息显示第i帧界面图像的内容,则会出现画面错位的问题。
针对画面错位的问题,第一电子设备确定N个差异区域在第i帧界面图像的面积占比大于或等于预设值时,将第i帧的第一时间戳用作第i+1帧的第一基准时间戳,否则,将第i帧的第一基准时间戳用作第i+1帧的第一基准时间戳。结合上述“第i-k帧的N个差异区域在第i-k帧界面图像的面积占比大于预设值”可知,第i-k+1帧的第一基准时间戳等于第i-k帧界面图像的第一时间戳。结合上述“从第i-k+1帧界面图像到第i帧界面图像中任一帧的N个差异区域在任一帧界面图像中的面积占比都小于预设值”可知,从第i-k+1帧界面图像到第i帧中任一帧的第一基准时间戳等于第i-k+1帧的第一基准时间戳。由于第i-k+1帧的第一基准时间戳等于第i-k帧界面图像的第一时间戳,因此,第i帧的第一基准时间戳等于第i-k帧界面图像的第一时间戳。
其次,第一电子设备在第一编码信息中携带有第i帧的第一时间戳和第i帧的第一基准时间戳。因为第i帧的第一基准时间戳等于第i-k帧界面图像的第一时间戳,则第i帧的第一基准时间戳能够用于指示第二电子设备将第i帧界面图像的内容显示在上述第i-k帧界面图像之后。如此,就可以避免由于将第i帧界面图像的内容显示在第i-k帧界面图像之前所导致的画面错位的问题。
第二方面提供一种多屏协同的显示方法,应用于第二电子设备,第二电子设备连接第一电子设备,该显示方法包括:第二电子设备显示第一投屏界面,第一投屏界面的内容是第一电子设备显示的第一界面图像的镜像;第二电子设备接收来自第一电子设备的第一编码信息;其中,第一编码信息包括第一编码数据、以及第i帧界面图像中相较于第i-1帧界面图像的N个差异区域的位置信息;第二电子设备解码第一编码数据,得到N个差异区域的图像内容;第二电子设备显示第二投屏界面。
其中,第i帧界面图像是第一电子设备响应于更新操作生成的,更新操作用于触发第一电子设备依次显示M帧界面图像。M为正整数,i在{1,……,M}中依次取值,第0帧界面图像为第一界面图像。N个差异区域中像素点的像素值与第i-1帧界面图像中对应像素点的像素值不同,N为正整数。第一编码数据是对所述N个差异区域的图像内容编码得到的。第二投屏界面的内容是第i帧界面图像的镜像,第i帧界面图像是根据N个差异区域的图像内容和位置信息更新第i-1帧界面图像得到的。
第二方面提供的应用于第二电子设备的多屏协同的显示方法,第二电子设备显示第一电子设备投屏来的第一界面图像的内容。第二电子设备可以接收到来自第一电子设备的第一编码信息,对第一编码信息中的第一编码数据解码得到N个差异区域的图像内容;然后显示第二投屏界面。由于,第二投屏界面的内容是第i帧界面图像的镜像,第i帧界面图像是根据N个差异区域的图像内容和位置信息更新第i-1帧界面图像得到的,则可以知道,第二投屏界面利用解码得到的N个差异区域的图像内容,就可以显示更新界面图像。其中,由于N个差异区域是第i帧界面图像中的与第i-1帧界面图像不同的内容,则对N个差异区域编码得到的第一编码数据小于对第i帧界面图像编码得到的其他编码数据。进而,第二电子设备对第一编码数据解码的耗时少于对该其他编码数据解码的耗时,则第二电子设备可以更快地利用包括第一编码数据的第一编码信息更新第i-1帧界面图像,显示第i帧界面图像的镜像。如此,可以解决多屏协同场景下第二电子设备更新界面图像存在的卡顿问题。
在一种可能的设计方式中,在第二电子设备接收来自第一电子设备的第一编码信息之前,该显示方法还包括:第二电子设备接收用户对第一投屏界面的更新操作;响应于更新操作,向第一电子设备发送更新指令。
其中,更新指令用于触发第一电子设备依次显示M帧界面图像。第二电子设备通过与第二电子设备连接的外部设备接收对第一投屏界面的更新操作;外部设备包括第二电子设备的显示屏、遥控器、鼠标或者手写笔中的任一种。
在该设计方式中,描述了使用该显示方法的一种反控场景。在反控场景下,第二电子设备接收用户对第一电子设备显示的第一界面图像的更新操作,然后向第一电子设备发送更新指令,以指示(或触发)第一电子设备更新界面图像。
另一种可能的实施方式中,该显示方法还包括:第二电子设备接收来自第一电子设备的第二编码信息;第二编码信息包括第二编码数据,第二编码数据是对第i帧界面图像编码得到的;第二电子设备对第二编码数据解码,得到第i帧界面图像;第二电子设备显示第三投屏界面;第三投屏界面的内容是第i帧界面图像的镜像。
在该设计方式中,描述了第二电子设备更新界面图像的另一种实现方式。
另一种可能的设计方式中,第二编码信息还包括:第i帧的第一时间戳,第i帧的第一时间戳用于记录第一电子设备生成第i帧界面图像的时间。在第二电子设备接收来自第一电子设备的第二编码信息之后,该显示方法还包括第二电子设备确定第i+1帧的第二基准时间戳为第i帧的第一时间戳,第二基准时间戳是第二电子设备记录投屏的基准时间。
在该设计方式中,第二电子设备在接收到第i帧界面图像的第二编码信息后,表示第i帧的N个差异区域在第i帧界面图像的面积占比大于预设值,而第i帧的N个 差异区域在第i帧界面图像的面积占比大于预设值,表示第i帧界面图像与第i-1帧界面图像的差异较大。那么,第i+1帧界面图像与第i-1帧界面图像的差异更大,因此,将第二编码信息中的第i帧的第一时间戳用作第i+1帧的第二基准时间戳。
另一种可能的设计方式中,第一编码信息还包括:第i帧的第一时间戳和第i帧的第一基准时间戳,第一基准时间戳是第一电子设备记录投屏的基准时间。在第二电子设备接收来自第一电子设备的第一编码信息之后,第二电子设备解码第一编码数据,得到N个差异区域的图像内容之前,该显示方法还包括:第二电子设备确定第i帧的第一时间戳所记录的时间晚于第i帧的第一基准时间戳所记录的时间,并且第i帧的第一基准时间戳所记录的时间等于第i帧的第二基准时间戳所记录的时间。
在该设计方式中,第二电子设备如果接收到第i-1帧界面图像的第二编码信息(简称为第i-1帧的第二编码信息),则将第i-1帧的第二编码信息中的第i-1帧的第一时间戳用作第i帧的第二基准时间戳(即第i帧的第二基准时间戳等于第i-1帧的第一时间戳),还利用第i-1帧的第二编码信息显示第i-1帧界面图像的内容。其中,接收到第i-1帧的第二编码信息表示第i-1帧的N个差异区域在第i-1帧界面图像的面积占比大于预设值;而第i-1帧的N个差异区域在第i-1帧界面图像的面积占比大于预设值,表示第i-1帧界面图像与第i-2帧界面图像的差异较大。那么,第i帧界面图像与第i-2帧界面图像的差异更大。然后,如果第二电子设备再接收到第i帧界面图像的第一编码信息(简称为第i帧的第一编码信息),则表示第i帧的N个差异区域在第i帧界面图像的面积占比小于预设值;而第i帧的N个差异区域在第i帧界面图像的面积占比小于预设值,表示第i帧界面图像与第i-1帧界面图像的差异较小。其中,第i帧界面图像的第一编码信息可以包括第i帧的第一基准时间戳。第一电子设备在确定第i-1帧的N个差异区域在第i-1帧界面图像的面积占比大于预设值时,将第i-1帧的第一时间戳用作第i帧的第一基准时间戳,即第i帧的第一基准时间戳等于第i-1帧的第一时间戳;再结合上述第i帧的第二基准时间戳等于第i-1帧的第一时间戳,则可知第i帧的第一基准时间戳等于第i帧的第二基准时间戳。
综上可知,第i帧的第一基准时间戳等于第i帧的第二基准时间戳,表示第i帧界面图像与第i-1帧界面图像的差异较小,还表示第二电子设备已显示出第i-1帧界面图像的内容。那么,第二电子设备在第i帧的第一基准时间戳等于第i帧的第二基准时间戳时,利用第i帧的第一编码信息显示第i帧界面图像的内容。因为第i帧界面图像的内容与已显示出的第i-1帧界面图像的内容差异较小,就可以避免因为第二电子设备接收M帧界面图像对应的编码信息(即第一编码信息或第二编码信息)的顺序与第一电子设备生成M帧界面图像的先后顺序不同导致画面错位的问题。
第三方面提供了一种第一电子设备,该第一电子设备包括执行如第一方面和第一方面的各种可能的方法的装置,或者,模块,或者,单元。
第四方面提供了一种第二电子设备,该第二电子设备包括执行如第二方面和第二方面的各种可能的方法的装置,或者,模块,或者,单元。
第五方面提供了一种第一电子设备,第一电子设备包括:处理器和存储器。其中,存储器用于存储计算机程序代码,计算机程序代码包括计算机指令;处理器用于运行计算机指令,使得第一电子设备执行如第一方面及其任一种可能的设计方式的方法。
第六方面提供了一种第二电子设备,第二电子设备包括:处理器和存储器。其中,存储器用于存储计算机程序代码,计算机程序代码包括计算机指令;处理器用于运行计算机指令,使得第二电子设备执行如第二方面及其任一种可能的设计方式的方法。
第七方面提供了一种计算机可读存储介质,计算机可读存储介质上存储有计算机指令,当计算机指令在第一电子设备上运行时,使得第一电子设备执行如第一方面及其任一种可能的设计方式的方法。
第八方面提供了一种计算机可读存储介质,计算机可读存储介质上存储有计算机指令,当计算机指令在第二电子设备上运行时,使得第二电子设备执行如第二方面及其任一种可能的设计方式的方法。
第九方面还提供一种计算机程序产品,包括一条或多条指令,该一条或多条指令可以在第一电子设备上运行,使得第一电子设备执行如第一方面及其任一种可能的设计方式的方法。
第十方面还提供一种计算机程序产品,包括一条或多条指令,该一条或多条指令可以在第二电子设备上运行,使得第二电子设备执行如第二方面及其任一种可能的设计方式的方法。
本申请的第三方面及其任一种可能的设计方式,以及第五方面、第七方面和第九方面所带来的技术效果可参见上述第一方面中不同设计方式所带来的技术效果;本申请第四方面及其任一种可能的设计方式,以及第六方面、第八方面和第十方面所带来的技术效果可参见上述第二方面中不同设计方式所带来的技术效果,此处不再赘述。
附图说明
图1为常规技术提供的一种多屏非镜像场景下的多屏显示示意图;
图2为常规技术提供的一种多屏扩展场景下的多屏显示示意图;
图3为常规技术提供的一种多屏镜像场景下的多屏显示示意图;
图4为常规技术提供的一种多屏场景下的多屏显示示意图;
图5为常规技术提供的一种多屏场景下更新界面图像后的多屏显示示意图;
图6为本申请实施例提供的一种第一电子设备的硬件结构示意图;
图7为本申请实施例提供的一种第一使用场景下的多屏协同的显示方法流程图一;
图8为本申请实施例提供的一种第一使用场景下的多屏显示示意图一;
图9为本申请实施例提供的一种第一使用场景下的多屏显示示意图二;
图10为本申请实施例提供的一种第一电子设备获取N个差异区域的界面示意图;
图11A为本申请实施例提供的一种第一电子设备显示界面图像并更新时间戳的时序图;
图11B为本申请实施例提供的一种第一电子设备显示界面图像并更新时间戳的时序图;
图12为本申请实施例提供的一种第一电子设备向第二电子设备投屏的时序图;
图13为本申请实施例提供的一种第一使用场景下的多屏协同的显示方法流程图二;
图14为本申请实施例提供的一种反控场景下的多屏协同的显示方法流程图一;
图15为本申请实施例提供的一种反控场景下的多屏显示示意图一;
图16为本申请实施例提供的一种反控场景下的多屏显示示意图二;
图17为本申请实施例提供的一种反控场景下的多屏协同的显示方法流程图二;
图18为本申请实施例提供的一种第一电子设备的结构示意图;
图19为本申请实施例提供的一种第二电子设备的结构示意图。
具体实施方式
以下,术语“第一”、“第二”等仅用于描述目的,而不能理解为指示或暗示相对重要性或者隐含指明所指示的技术特征的数量。由此,限定有“第一”、“第二”等的特征可以明示或者隐含地包括一个或者更多个该特征。
目前,在多屏协同场景中,第一电子设备(如,手机)和第二电子设备(如,PC)建立连接(如有线连接或无线连接)后,第一电子设备可以将第一电子设备的界面图像投屏到第二电子设备上显示。其中,无线连接可以为蓝牙连接、近场通信(Near Field Communication,NFC)连接或者无线保真(Wireless Fidelity,WiFi)连接。
第一种使用场景下,在第一电子设备向第二电子设备投屏的过程中,第一电子设备可以接收用户对界面图像的更新操作;第一电子设备响应该更新操作,可以更新第一电子设备的界面图像,并指示(或触发)第二电子设备也更新第二电子设备的界面图像。
其中,第一使用场景可以包括多屏扩展场景和多屏非镜像场景。多屏扩展场景下,第一电子设备生成并显示第一界面图像;第一电子设备还生成第二界面图像,并指示(或触发)第二电子设备显示第二界面图像的镜像,第一电子设备不显示第二界面图像。多屏非镜像场景下,第一电子设备可以显示第一界面图像;第一电子设备还显示第二界面图像,并指示(或触发)第二电子设备显示第二界面图像的镜像。
示例性地,以第一电子设备为笔记本电脑、第二电子设备为平板电脑为例,如图1所示,在多屏非镜像场景下,笔记本电脑110通过无线连接的方式连接平板电脑120。笔记本电脑110显示第一界面图像111和第二界面图像112;笔记本电脑110还指示平板电脑120显示第二界面图像112的镜像113。其中,第一界面图像111的内容包括“涂鸦”、“颜色”、“粗细”、“橡皮檫”等图片编辑选项;第二界面图像112的内容包括“标题”,还包括“开始”、“插入”、“页面布局”等文档操作选项。
示例性地,以第一电子设备为笔记本电脑、第二电子设备为平板电脑为例,如图2所示,在多屏扩展场景下,笔记本电脑110通过无线连接的方式连接平板电脑120。笔记本电脑110生成并显示第一界面图像111;笔记本电脑110还生成第二界面图像112,并指示平板电脑120显示第二界面图像112的镜像113;笔记本电脑110不显示第二界面图像112。
第二种使用场景(称为反控场景)下,在第一电子设备向第二电子设备投屏的过程中,还可以由第二电子设备接收用户对第二电子设备显示的界面图像的更新操作。第二电子设备可以通过与第二电子设备存在连接的手写笔、遥控器或鼠标等外部设备接收用户对第二电子设备的更新操作;然后向第一电子设备发送更新指令,以指示(或触发)第一电子设备响应于该更新指令更新界面图像。第一电子设备响应该更新指令,可以更新第一电子设备的界面图像,并指示(或触发)第二电子设备也更新第二电子设备的界面图像。
其中,反控场景可以包括多屏镜像场景和上述多屏非镜像场景。多屏镜像场景下,第一电子设备向第二电子设备投屏后,第二电子设备可以采用全屏或非全屏的方式显示第一电子设备显示的内容。其中,在多屏镜像场景下,第二电子设备可以全屏显示第一电子设备所显示的内容。在多屏镜像场景下,第二电子设备可以非全屏显示第一电子设备所显示的内容,并且,第二电子设备不显示其他内容。
示例性地,以第一电子设备为笔记本电脑、第二电子设备为平板电脑为例,如图3所示,在多屏镜像场景下,笔记本电脑110通过无线连接的方式连接平板电脑120。笔记本电脑110显示第二界面图像112,并指示平板电脑120显示第二界面图像112的镜像113。其中,平板电脑120全屏显示第二界面图像112的镜像113。
上述方案中第一电子设备指示(或触发)第二电子设备更新界面图像,具体可以包括:第一电子设备采用H.262、H.263或H.264等编解码标准,对更新的界面图像进行编码得到编码数据;第一电子设备向第二电子设备传输该编码数据;第二电子设备接收该编码数据,并采用与第一电子设备相同的编解码标准对该编码数据进行解码,得到更新的界面图像;第二电子设备显示更新的界面图像。
然而,由于上述方案是对整张界面图像进行编码,则编解码耗时较长、编码数据较大。而编码数据较大又增加了编码数据的传输时间。较长的编解码耗时和较长的传输时间,都会导致第二电子设备利用第一电子设备发送的编码数据更新界面图像变慢。在多屏协同场景下,第二电子设备更新界面图像变慢,则表现为第二电子设备更新界面图像时发生明显的卡顿。进一步地,随着用户对设备显示画面分辨率的要求越来越高,多屏协同场景的卡顿问题越来越严重。尤其在上述反控场景中,需要第二电子设备接收更新操作并向第一电子设备发送更新指令,第一电子设备才可以响应于更新指令更新界面图像,并指示(或触发)第二电子设备更新界面图像。这样,无疑会增加第二电子设备更新界面图像的耗时,进一步增大第二电子设备更新界面图像出现卡顿现象的可能性。
示例性地,以第一电子设备为笔记本电脑、第二电子设备为平板电脑为例,说明上述方案更新界面图像的过程。如图4所示,笔记本电脑210通过无线连接的方式连接平板电脑220;笔记本电脑210显示一帧界面图像(如,编辑一张图片的编辑界面图像211),并将编辑界面图像211的内容投屏到平板电脑220上。平板电脑220上显示编辑界面图像211的镜像212。从图4中可以看出,该编辑界面图像211的内容包括:“涂鸦”、“颜色”、“粗细”和“橡皮檫”的多个图片编辑选项。
进一步地,平板电脑220显示编辑界面图像211的内容时,接收用户对编辑界面图像211的更新操作(如在编辑界面图像上绘制一个心形的绘画操作)。平板电脑220在接收该绘画操作后,向笔记本电脑210发送一个更新指令,以指示笔记本电脑210响应于该更新指令显示所绘制的心形。笔记本电脑210响应于更新指令,依次显示绘制心形过程中的多帧界面图像。笔记本电脑210还对该多帧界面图像中每一帧界面图像进行编码,得到编码数据,并将该编码数据传输至平板电脑220。然而,由于编解码耗时较长、编码数据较大等原因,导致平板电脑220在接收绘画操作后接收并解码每一帧界面图像的时间较长。
如图5所示,笔记本电脑210响应于更新指令,依次显示了绘制心形过程中的所有界面图像,显示出一个包括心形的编辑界面图像213。而平板电脑220在接收绘画操作后,由于接收再解码每一帧界面图像的时间较长,此刻,只利用绘制心形过程中的部分界面图像显示出心形的一部分。平板电脑220在接收绘画操作后经过较长的时间,利用绘制心形过程中的所有界面图像显示出一个心形。
可以理解的是,平板电脑在接收绘画操作后显示更新后的包括心形的界面图像出现卡顿。并且,在这种笔记本电脑反向控制平板电脑的反控场景下,平板电脑在接收绘画操作后显示更新后的包括心形的界面图像的卡顿情况更明显,表现为笔记本电脑对平板电脑的反控时延较长。
针对上述方案存在的多屏协同场景下第二电子设备更新界面图像的卡顿问题、以及反控场景下第一电子设备对第二电子设备的反控时延较长的问题,本申请实施例提出一种多屏协同的显示方法,可以解决多屏协同场景下第二电子设备更新界面图像存在的卡顿问题、以及减小反控场景下第一电子设备对第二电子设备的反控时延。
需要说明的是,本申请实施例中的第一电子设备可以为手机、PC、平板电脑、笔记本电脑、超级移动个人计算机(ultra-mobile personal computer,UMPC)、上网本、个人数字助理(personal digital assistant,PDA)等。第一种使用场景下的第二电子设备可以为PC的显示器、电视机或者投影仪等任一显示装置,或者为包括显示器的电子设备。反控场景下的第二电子设备可以为包括显示器的电子设备。其中,上述包括显示器的电子设备可以为PC、平板电脑、笔记本电脑、UMPC、上网本、PDA,大屏设备,智能电视等。本申请实施例对第一电子设备和第二电子设备的具体形态不作任何限制。
下面继续以第一电子设备是手机为例,介绍第一电子设备的硬件结构。如图6所示,手机300可以包括:处理器310,外部存储器接口320,内部存储器321,通用串行总线(universal serial bus,USB)接口330,充电管理模块340,电源管理模块341,电池342,天线1,天线2,移动通信模块350,无线通信模块360,音频模块370,扬声器370A,受话器370B,麦克风370C,耳机接口370D,传感器模块380,按键390,马达391,指示器392,摄像头393(可包括摄像头1-N),显示屏394(如触摸屏),以及用户标识模块(subscriber identification module,SIM)卡接口395(可包括SIM卡接口1-N)等。
其中,上述传感器模块380可以包括压力传感器,陀螺仪传感器,气压传感器,磁传感器,加速度传感器,距离传感器,接近光传感器,指纹传感器,温度传感器,触摸传感器,环境光传感器和骨传导传感器等传感器。
可以理解的是,本实施例示意的结构并不构成对手机300的具体限定。在另一些实施例中,手机300可以包括比图示更多或更少的部件,或者组合某些部件,或者拆分某些部件,或者不同的部件布置。图示的部件可以以硬件,软件或软件和硬件的组合实现。
处理器310可以包括一个或多个处理单元,例如:处理器310可以包括应用处理器(application processor,AP),调制解调处理器,图形处理器(graphics processing unit,GPU),控制器,存储器,视频编解码器,数字信号处理器(digital signal processor,DSP),基带处理器,和/或神经网络处理器(neural-network processing unit,NPU) 等。其中,不同的处理单元可以是独立的器件,也可以集成在一个或多个处理器中。
控制器可以是手机300的神经中枢和指挥中心。控制器可以根据指令操作码和时序信号,产生操作控制信号,完成取指令和执行指令的控制。
处理器310中还可以设置存储器,用于存储指令和数据。在一些实施例中,处理器310中的存储器为高速缓冲存储器。该存储器可以保存处理器310刚用过或循环使用的指令或数据。如果处理器310需要再次使用该指令或数据,可从存储器中直接调用。避免了重复存取,减少了处理器310的等待时间,因而提高了系统的效率。
在一些实施例中,处理器310可以包括一个或多个接口。可以理解的是,本实施例示意的各模块间的接口连接关系,只是示意性说明,并不构成对手机300的结构限定。在另一些实施例中,手机300也可以采用上述实施例中不同的接口连接方式,或多种接口连接方式的组合。
充电管理模块340用于从充电器接收充电输入(有线充电器的充电输入和/或无线充电输入),为电池342充电。其中,充电器可以是无线充电器,也可以是有线充电器。充电管理模块340为电池342充电的同时,还可以通过电源管理模块341为手机供电。
电源管理模块341用于连接电池342,充电管理模块340与处理器310。电源管理模块341接收电池342和/或充电管理模块340的输入,为处理器310,内部存储器321,外部存储器,显示屏394,摄像头393,和无线通信模块360等供电。在一些实施例中,电源管理模块341也可以设置于处理器310中。在另一些实施例中,电源管理模块341和充电管理模块340也可以设置于同一个器件中。
手机300的无线通信功能可以通过天线1,天线2,移动通信模块350,无线通信模块360,调制解调处理器以及基带处理器等实现。天线1和天线2用于发射和接收电磁波信号。在一些实施例中,手机300的天线1和移动通信模块350耦合,天线2和无线通信模块360耦合,使得手机300可以通过无线通信技术与网络以及其他设备通信。无线通信技术可以包括全球导航卫星系统(Global Navigation Satellite System,GNSS),无线局域网(wireless local area networks,WLAN)(如Wi-Fi网络)技术等。GNSS可以包括全球卫星定位系统(global positioning system,GPS),全球导航卫星系统(global navigation satellite system,GLONASS),北斗卫星导航系统(beidou navigation satellite system,BDS),准天顶卫星系统(quasi-zenith satellite system,QZSS)和/或星基增强系统(satellite based augmentation systems,SBAS)等。示例性的,手机300可以通过GPS、BDS或者SBAS等定位技术,获取手机300的实时位置信息。
移动通信模块350可以提供应用在手机300上的包括2G/3G/4G/5G等无线通信的解决方案。例如,手机300可以通过移动通信模块350向服务器发送电子邮件。
无线通信模块360可以提供应用在手机300上的包括WLAN(如Wi-Fi网络),蓝牙(bluetooth,BT),GNSS,近场通信(Near Field Communication,NFC)、红外(Infrared,IR)、调频(frequency modulation,FM)等无线通信的解决方案。例如,手机300可以通过GNSS定位技术,获取手机300的实时位置信息。
手机300通过GPU,显示屏394,以及应用处理器等实现显示功能。手机300可以通过ISP,摄像头393,视频编解码器,GPU,显示屏394以及应用处理器等实现拍摄功能。外部存储器接口320可以用于连接外部存储卡,例如Micro SD卡,实现扩展手机300的存储能力。内部存储器321可以用于存储计算机可执行程序代码,可执行程序代码包括指令。处理器310通过运行存储在内部存储器321的指令,从而执行手机300的各种功能应用以及数据处理。手机300可以通过音频模块370,扬声器370A,受话器370B,麦克风370C,耳机接口370D,以及应用处理器等实现音频功能。
可以理解,本申请后续实施例中的方法步骤可以由上述第一电子设备和/或上述第二电子设备执行,或者本申请实施例中的方法步骤的执行主体也可以是第一电子设备中的部分功能模块(如中央处理器(Central Processing Unit,CPU))和/或第二电子设备终端中的部分功能模块(如CPU),本申请实施例对此不做限制。其中,本申请实施例这里以第一电子设备和/或上述第二电子设备执行多屏协同的显示方法为例,对本申请实施例提供的多屏协同的显示方法进行详细说明。
本申请实施例中,第一电子设备将界面图像的内容投屏到第二电子设备上之后,若接收到用户对第一电子设备的界面图像的更新操作,更新界面图像。第一电子设备还指示(或触发)第二电子设备更新界面图像。为了减少编解码的耗时和编码数据的传输时长,第一电子设备可以对更新后的每一帧界面图像和其前一帧界面图像进行比较。若每一帧界面图像相较于其前一帧界面图像的变化较小,第一电子设备可以只对每一帧界面图像中的与其前一帧界面图像不同的内容进行编码得到编码数据,将编码数据传输至第二电子设备。以使得第二电子设备利用该编码数据更新界面图像。其中,每一帧界面图像的前一帧界面图像是生成时间在自身生成时间之前、且在生成时间上与自身连续的界面图像。由于第一电子设备只对每一帧界面图像中的与其前一帧界面图像不同的内容进行编码,可以减少编解码耗时以及编码数据的传输时长。进而可以解决了多屏协同场景下第二电子设备更新界面图像存在的卡顿问题,还可以减小反控场景下第一电子设备对第二电子设备的反控时延。
请参考图7,本申请实施例提供一种多屏协同的显示方法,应用于上述第一使用场景下的第一电子设备和第二电子设备中,第一电子设备连接第二电子设备。如图7所示,该显示方法可以包括S401-S415。
S401、第一电子设备显示第一界面图像,并对第一界面图像编码得到第一界面图像的编码数据。
第一电子设备显示第一界面图像,并对第一界面图像编码得到第一界面图像的编码数据;然后,生成第一界面图像的编码信息。其中,第一界面图像的编码信息包括第一界面图像的编码数据。可选的,第一界面图像的编码信息除了包括第一界面图像的编码数据,还可包括第一界面图像的时间戳;第一界面图像的时间戳用于记录第一电子设备生成第一界面图像的时间。其中,以下实施例中所述的第二时间戳是该第一界面图像的时间戳。
S402、第一电子设备向第二电子设备发送第一界面图像的编码信息。
其中,第一界面图像的编码信息用于触发第二电子设备基于第一界面图像的编码信息显示第一投屏界面,第一投屏界面的内容是第一界面图像的镜像。
第二电子设备接收第一界面图像的编码信息后,利用第一界面图像的编码信息,生成第一投屏界面,并显示第一投屏界面。
示例性地,以第一电子设备为手机、第二电子设备为笔记本电脑为例,如图8中的(a)所示,手机300显示的第一界面图像501为编辑一张图片的编辑界面图像,并指示(或触发)笔记本电脑510显示第一投屏界面502。第一界面图像501的内容包括:当前时间为“8:00”、待编辑的图片的采集时间为“2020年6月5日下午5:10”、以及包括“涂鸦”、“颜色”、“粗细”和“橡皮檫”的多个图片编辑选项。
S403、第二电子设备接收来自第一电子设备的第一界面图像的编码信息,并利用第一界面图像的编码信息显示第一投屏界面。
第二电子设备接收来自第一电子设备的第一界面图像的编码信息后,采用第一电子设备在生成第一界面图像的编码数据时所采用的编解码标准,对第一界面图像的编码信息中的编码数据解码,得到第一界面图像。第二电子设备再利用第一界面图像生成第一投屏界面,并显示第一投屏界面。
本申请实施例中,在上述多屏扩展场景下,第二电子设备可以全屏显示第一投屏界面。
本申请实施例中,在上述多屏非镜像场景下,第二电子设备可以全屏显示第一投屏界面,也可以非全屏显示第一投屏界面。其中,第二电子设备在多屏非镜像场景下非全屏显示第一投屏界面的同时,可以显示其他界面图像,例如,在除第一投屏界面之外的其他区域显示该其他界面图像。该其他界面图像不同于第一界面图像和第一投屏界面。
本申请实施例中,在上述多屏镜像场景下,第二电子设备可以全屏显示第一投屏界面,也可以非全屏显示第一投屏界面。其中,第二电子设备在多屏镜像场景下非全屏显示第一投屏界面时,不显示其他内容,例如,该其他区域可以表现为黑色。
需要说明的是,本申请实施例中所述的全屏显示投屏界面是指:第二电子设备可以在显示屏显示该投屏界面和第二电子设备的状态栏,不显示其他内容;或者,第二电子设备可以在显示屏显示该投屏界面,不显示第二电子设备的状态栏和其他内容。
S404、第一电子设备接收用户对第一界面图像的更新操作,更新操作用于触发第一电子设备依次显示M帧界面图像。
在上述第一种使用场景下,第一电子设备可以直接接收用户对第一界面图像的更新操作。其中,M为正整数(如,1、2或3)。M的取值取决于该更新操作。
示例性地,更新操作可以包括切换第一界面图像的操作、在第一界面图像上进行绘画的操作或在第一界面图像上加载新的内容的操作等,本申请实施例对此不做限制。
S405、第一电子设备响应于更新操作,显示第i帧界面图像,并获取第i帧界面图像中、相较于第i-1帧界面图像的N个差异区域。
第一电子设备响应于更新操作,依次生成M帧界面图像并显示。第一电子设备还获取第i帧界面图像中、相较于第i-1帧界面图像的N个差异区域,以及N个差异区域在第i帧界面图像中的位置信息。该位置信息用于指示(或表征)N个差异区域在第i帧界面图像中的位置。
其中,i在{1,……,M}中依次取值;第0帧界面图像为上述第一界面图像。N个差异区域中像素点的像素值与第i-1帧界面图像中对应像素点的像素值不同,N为正整数。
示例性地,M帧第一界面图像可以包括第1帧界面图像、第2帧界面图像、…、第i帧界面图像、…、第M帧界面图像。第1帧界面图像、第2帧界面图像、…、第i帧界面图像、…、第M帧界面图像为按生成时间先后排列的界面图像。第i-1帧界面图像就是第i帧界面图像的前一帧界面图像。
本申请实施例中,N可以等于任意一个正整数。若N大于1,则N个差异区域中的任意两个差异区域可以是独立的、不重叠的。N个差异区域中每个差异区域的所有像素点的像素值与第i-1帧界面图像中对应像素点的像素值不同,或者,每个差异区域的部分像素点的像素值与第i-1帧界面图像中对应像素点的像素值不同。N个差异区域的位置信息可以包括N个差异区域一一对应的N个位置信息。
S406、第一电子设备判断N个差异区域在第i帧界面图像的面积占比是否小于预设值。
第一电子设备可以先计算N个差异区域在第i帧界面图像的面积占比;再判断该面积占比是否小于预设值(如,1/4或1/2)。第一电子设备确定该面积占比小于预设值,执行S407-S410。第一电子设备确定该面积占比大于或等于预设值,执行S411-S415。
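The S406 decision reduces to an area comparison. The following is a minimal sketch, assuming the inclusive (xmin, xmax, ymin, ymax) rectangle convention used for the position information and non-overlapping regions; the 0.25 default merely stands in for one of the example preset values (1/4 or 1/2) named above.

```python
def should_encode_regions(regions, frame_w, frame_h, preset=0.25):
    """Decide the S406 branch: True -> encode only the N difference regions
    (S407); False -> encode the whole i-th frame (S411)."""
    # Inclusive pixel bounds, matching the (xmin, xmax, ymin, ymax)
    # convention used for each difference region's position information.
    diff_area = sum((xmax - xmin + 1) * (ymax - ymin + 1)
                    for xmin, xmax, ymin, ymax in regions)
    return diff_area / (frame_w * frame_h) < preset
```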
S407、第一电子设备对N个差异区域的图像内容进行编码,得到第一编码数据。
第一电子设备可以采用有损编码方式(如JPEG、RM文件格式、RealMedia可变比特率)或无损编码方式(如霍夫曼(Huffman)算法、LZW(Lenpel-Ziv&Welch)压缩算法),对N个差异区域的图像内容进行编码,得到第一编码数据。
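Since JPEG is named above as one admissible lossy codec, the per-region encoding of S407 can be sketched with OpenCV's imencode. This is an illustrative stand-in under that assumption, not a codec the method mandates.

```python
import cv2
import numpy as np

def encode_regions(frame: np.ndarray, regions):
    """Encode each difference region of an H x W x 3 BGR frame as JPEG.
    Returns a list of (position, jpeg_bytes) pairs forming the first
    encoded data."""
    encoded = []
    for xmin, xmax, ymin, ymax in regions:
        patch = frame[ymin:ymax + 1, xmin:xmax + 1]
        ok, buf = cv2.imencode(".jpg", patch)
        if not ok:
            raise RuntimeError("JPEG encoding failed")
        encoded.append(((xmin, xmax, ymin, ymax), buf.tobytes()))
    return encoded
```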
S408、第一电子设备向第二电子设备发送第一编码信息,第一编码信息包括第一编码数据。
第一电子设备在获取N个差异区域后,还可以获取N个差异区域在第i帧界面图像中的位置信息。然后,第一电子设备利用该位置信息和第一编码数据生成第一编码信息,再将第一编码信息发送至第二电子设备。
S409、第二电子设备接收来自第一电子设备的第一编码信息,解码第一编码数据得到N个差异区域的图像内容。
第二电子设备采用第一电子设备在生成第一编码数据时所采用的编解码标准,对第一编码信息中的第一编码数据解码,得到N个差异区域的图像内容。
S410、第二电子设备显示第二投屏界面。
其中,第二投屏界面的内容是第i帧界面图像的镜像。第i帧界面图像是第二电子设备根据N个差异区域的图像内容和位置信息更新第i-1帧界面图像得到的。
本申请实施例中,第二电子设备按照第一编码信息中的N个差异区域的位置信息,利用N个差异区域的图像内容,更新第i-1帧界面图像,得到第i帧界面图像。第二电子设备再根据第i帧界面图像,生成第二投屏界面,并采用全屏或非全屏的方式显示第二投屏界面。
需要说明的是,第二电子设备全屏显示或非全屏显示第二投屏界面的具体过程,可以参见上述S403中关于第二电子设备全屏显示或非全屏显示第一投屏界面的详细介绍,本申请实施例这里不予赘述。
示例性地,第二电子设备可以用N个差异区域的图像内容,对第i-1帧界面图像中N个差异区域的位置信息所指示的区域内容进行替换,得到第i帧界面图像。
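On the receiving side this replacement is a pure region paste. A minimal sketch matching the JPEG stand-in above; the uint8 BGR frame layout is an assumption.

```python
import cv2
import numpy as np

def apply_regions(prev_frame: np.ndarray, encoded_regions) -> np.ndarray:
    """Rebuild the i-th frame by replacing the N difference regions
    inside a copy of the (i-1)-th frame."""
    frame = prev_frame.copy()
    for (xmin, xmax, ymin, ymax), jpeg_bytes in encoded_regions:
        patch = cv2.imdecode(np.frombuffer(jpeg_bytes, np.uint8),
                             cv2.IMREAD_COLOR)
        frame[ymin:ymax + 1, xmin:xmax + 1] = patch
    return frame
```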
示例性地,在上述第一种使用场景下,继续以第一电子设备为手机、第二电子设备为笔记本电脑为例,如图8中的(b)所示,手机300的显示屏为触摸屏,手机300接收用户对第一界面图像的更新操作,如用户手指在手机300的显示屏上绘制一个心形的绘画操作。然后,手机300响应于该绘画操作,依次显示了绘制心形过程中的M帧界面图像,显示出一个包括心形的界面图像503。同时,手机300向笔记本电脑510发送M帧界面图像中每一帧界面图像的第一编码信息或第二编码信息。由于手机300可以在每一帧界面图像相较于其前一帧界面图像的变化较小时,只对每一帧界面图像中的与其前一帧界面图像不同的内容进行编码得到第一编码数据,将包括第一编码数据的第一编码信息传输至第二电子设备,减少了编解码耗时以及编码数据的传输时长。因此,笔记本电脑510也较快地利用接收到的第一编码信息或第二编码信息显示一个包括心形的投屏界面504。
S411、第一电子设备对第i帧界面图像进行编码,得到第二编码数据。
本申请实施例中,第一电子设备可以采用H.261、H.263、H.264等编解码标准,对第i帧界面图像进行编码,得到第二编码数据。
其中,第一电子设备对第i帧界面图像编码,就是对完整的第i帧界面图像编码;此时,该第i帧界面图像可以为帧内编码图像帧(Intra-coded picture,I帧)或预测编码图像帧(Predictive-coded Picture,P帧)。
需要说明的是,若N个差异区域在第i帧界面图像的面积占比大于或等于预设值,则对N个差异区域编解码的耗时,和对第i帧界面图像编解码的耗时相差不大。也就是说,第一电子设备对N个差异区域编码后传输至第二电子设备,或者对第i帧界面图像编码后传输至第二电子设备,第二电子设备更新界面图像时的卡顿程度差别不大,则第一电子设备可以在N个差异区域在第i帧界面图像的面积占比大于或等于预设值的情况下,直接对第i帧界面图像编码后传输。
S412、第一电子设备向第二电子设备发送第二编码信息,第二编码信息包括第二编码数据。
S413、第二电子设备接收来自第一电子设备的第二编码信息。
其中,第二编码数据是第一电子设备对第i帧界面图像编码后生成的。第二编码信息还可以包括第i帧的第一时间戳。第i帧的第一时间戳用于记录第一电子设备生成第i帧界面图像的时间。
S414、第二电子设备对第二编码数据解码,得到第i帧界面图像。
第二电子设备可以采用第一电子设备在生成第二编码数据时所采用的编解码标准,对第二编码信息中的第二编码数据解码,得到第i帧界面图像。
S415、第二电子设备显示第三投屏界面。
第二电子设备可以利用第i帧界面图像,生成第三投屏界面,并采用全屏或非全屏的方式显示第三投屏界面;第三投屏界面的内容是第i帧界面图像的镜像。
需要说明的是,第二电子设备全屏显示或非全屏显示第三投屏界面的具体过程,可以参见上述S403中关于第二电子设备全屏显示或非全屏显示第一投屏界面的详细介绍,本申请实施例这里不予赘述。
示例性地,在上述第一种使用场景下,以第一电子设备为笔记本电脑、第二电子 设备为平板电脑为例,如图9中的(a)所示,笔记本电脑520显示的第一界面图像501为编辑一张图片的编辑界面图像,并指示(或触发)平板电脑505显示第一投屏界面502。其中,笔记本电脑520连接鼠标506。如图9中的(b)所示,笔记本电脑520通过鼠标506接收用户对第一界面图像的更新操作,如用户使用鼠标506在笔记本电脑520的显示屏上绘制一个心形的绘画操作。然后,笔记本电脑520响应于该绘画操作,依次显示了绘制心形过程中的M帧界面图像,显示出一个包括心形的界面图像503。同时,笔记本电脑520向平板电脑505发送每一帧界面图像的第一编码信息或第二编码信息。由于笔记本电脑520可以在每一帧界面图像相较于其前一帧界面图像的变化较小时,只对每一帧界面图像中的与其前一帧界面图像不同的内容进行编码得到第一编码数据,将包括第一编码数据的第一编码信息传输至第二电子设备,减少了编解码耗时以及编码数据的传输时长。因此,平板电脑505也较快地利用接收到的第一编码信息或第二编码信息显示一个包括心形的投屏界面504。
可以理解的是,本申请提供的多屏协同的显示方法,第一电子设备显示第一界面图像,并将第一界面图像的内容投屏到第二电子设备上。第一电子设备若接收到对第一界面图像的更新操作,则显示M帧界面图像,并获取第i帧界面图像中、相较于第i-1帧界面图像的N个差异区域。如果N个差异区域在第i帧界面图像的面积占比小于预设值,第一电子设备只对N个差异区域编码得到第一编码数据;然后,将包括第一编码数据的第一编码信息发送给第二电子设备。其中,N个差异区域为第i帧界面图像相较于第i-1帧界面图像的像素值不同的像素点。那么,第一电子设备对N个差异区域编码,就是对第i帧界面图像中的与第i-1帧界面图像不同的内容进行编码,相较于对第i帧界面图像编码,可以减少编解码耗时以及第一编码数据的大小,而传输较小的第一编码数据也可以缩短第一编码数据的传输时长。进而,由于减小了编解码耗时和缩短了第一编码数据的传输时长,则第二电子设备可以更快地利用包括第一编码数据的第一编码信息更新第i-1帧界面图像,显示第i帧界面图像的镜像。如此,可以解决多屏协同场景下第二电子设备更新界面图像存在的卡顿问题。
本申请实施例中,第一电子设备在获取N个差异区域时,可以将第i帧界面图像中每个像素点的像素值,与第i-1帧界面图像中对应像素点的像素值进行对比,得到第i帧界面图像中的差异像素点;再确定包括第i帧界面图像中的差异像素点的N个差异区域。其中,差异像素点的像素值与第i-1帧界面图像中对应像素点的像素值不同。差异像素点可以包括多个像素点。
第一电子设备得到差异像素点后,可以按照预设区域形状,确定包括差异像素点的最小区域;该最小区域就是N个差异区域。其中,预设区域形状可以为矩形、圆形、多边形等等。N个差异区域的形状为预设区域形状。
或者,第一电子设备得到差异像素点后,可以将差异像素点所在的区域,作为N个差异区域。
示例性地,以预设区域的形状为矩形为例,说明第一电子设备确定包括差异像素点的最小矩形区域的过程,最小矩形区域就是N个差异区域。针对如图10中的(a)所示的第i-1帧界面图像,以及如图10中的(b)所示的第i帧界面图像,第一电子设备比较该第i帧界面图像中像素点的像素值和该第i-1帧界面图像中像素点的像素值, 得到第i帧界面图像中的差异像素点。如图10中的(c)中的斜线区域包括第i帧界面图像中的所有差异像素点。然后,第一电子设备确定包括所有差异像素点的最小矩形区域R1,即N个差异区域R1。
其中,N个差异区域R1的位置信息包括左上顶点P1的横坐标xmin(或左下顶点P3的横坐标xmin)、右上顶点P2的横坐标xmax(或右下顶点P4的横坐标xmax)、左上顶点P1的纵坐标ymin(或右上顶点P2的纵坐标ymin)、左下顶点P3的纵坐标ymax(或右下顶点P4的纵坐标ymax)。或者,N个差异区域R1的位置信息可以包括左上顶点P1的坐标(xmin,ymin),右上顶点P2的坐标是(xmax,ymin),左下顶点P1的坐标是(xmin,ymax),右下顶点P4的坐标是(xmax,ymax)。
其中,上述坐标值xmin、xmax、ymin、ymax可以是在第一坐标系下取得的坐标值。该第一坐标系的原点为第i帧界面图像的左上顶点,第一坐标系的X轴的正方向为从第i帧界面图像的左上顶点到右上顶点的方向,第一坐标系的Y轴的正方向为从第i帧界面图像的左上顶点到左下顶点的方向。
需要说明的是,N个差异区域可以包括多个独立的区域,上述图10中以N个差异区域为1个区域为例,本申请实施例不对此进行限制。
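Expressed with NumPy, the pixel comparison and minimal-rectangle step above takes a few lines. This is a minimal sketch, not the patent's implementation: it assumes uint8 color frames and the single-rectangle case (N = 1) illustrated in FIG. 10.

```python
import numpy as np

def diff_bounding_rect(prev_frame: np.ndarray, cur_frame: np.ndarray):
    """Minimal rectangle (xmin, xmax, ymin, ymax) covering every pixel whose
    value differs between two H x W x 3 uint8 frames; None if identical."""
    mask = (prev_frame != cur_frame).any(axis=-1)  # per-pixel difference test
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None  # no difference pixels
    # Origin at the top-left corner, x to the right, y downward, as in FIG. 10.
    return int(xs.min()), int(xs.max()), int(ys.min()), int(ys.max())
```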
本申请实施例中,第一电子设备响应于更新操作,可以生成第i帧的第一时间戳。第i帧的第一时间戳用于记录第一电子设备生成第i帧界面图像的时间。其中,第二编码信息还包括所述第i帧的第一时间戳。
第一电子设备在生成第i帧界面图像的同时,生成第i帧的第一时间戳。另外,第一电子设备可以将第i帧的第一时间戳携带在第二编码信息中发送给第二电子设备。
需要说明的是,虽然第一电子设备按照生成M帧界面图像的先后顺序,依次向第二电子设备发送M帧界面图像对应的编码信息(即第一编码信息或第二编码信息),但是由于编码数据大小不同和网络波动等原因,第二电子设备接收到多个编码信息(即第一编码信息或第二编码信息)的顺序可能与生成M帧界面图像的先后顺序不同。例如,第一电子设备依次向第二电子设备发送第i帧界面图像对应的编码信息(即第一编码信息或第二编码信息)、第i+1帧界面图像对应的编码信息(即第一编码信息或第二编码信息)时,由于编码数据大小不同和网络波动等原因,第二电子设备可能先接收到第i+1帧界面图像对应的编码信息,再接收到第i帧界面图像对应的编码信息。因此,第一编码信息和第二编码信息都可以包括第i帧的第一时间戳,以指示第二电子设备可以参考第一时间戳,按照时间先后顺序显示界面图像。
本申请实施例中,第一电子设备可以生成第二时间戳,并保存第二时间戳。第二时间戳用于记录第一电子设备生成第一界面图像的时间,第二时间戳是第1帧的第一基准时间戳,第一基准时间戳是第一电子设备记录投屏的基准时间。其次,若N个差异区域在第i帧界面图像的面积占比大于预设值,第一电子设备确定第i+1帧的第一基准时间戳为第i帧的第一时间戳;若N个差异区域在第i帧界面图像的面积占比小于预设值,第一电子设备确定第i+1帧的第一基准时间戳为第i帧的第一基准时间戳。第一编码信息还包括:第i帧的第一时间戳和第i帧的第一基准时间戳。
第一电子设备在显示第一界面图像的时候,可以生成第二时间戳。第一电子设备接收到上述更新操作时,可以将第二时间戳用作第1帧的第一基准时间戳。然后,第一电子设备可以在N个差异区域在第i帧界面图像中的面积占比大于或等于预设值的情况下,将第i帧的第一时间戳用作第i+1帧的第一基准时间戳。第一电子设备在N个差异区域在第i帧界面图像中的面积占比小于预设值的情况下,将第i帧的第一基准时间戳用作第i+1帧的第一基准时间戳,即在第i+1帧继续使用第i帧的第一基准时间戳。另外,第一电子设备可以将第i帧的第一时间戳和第i帧的第一基准时间戳携带在第一编码信息中发送给第二电子设备。
其中,第i帧的第一基准时间戳是第一电子设备记录向第二电子设备投屏第i帧界面图像的基准时间。第二时间戳就是上述S401中的第一界面图像的时间戳。
需要说明的是,如果第i帧界面图像中、相较于第i-1帧界面图像的N个差异区域(简称为第i帧的N个差异区域)在第i帧界面图像的面积占比小于预设值,则表示第i帧界面图像与第i-1帧界面图像的差异较小。其次,如果第i-1帧界面图像中、相较于第i-2帧界面图像的N个差异区域(简称为第i-1帧的N个差异区域)在第i-2帧界面图像的面积占比大于预设值,则表示第i-1帧界面图像相较于第i-2帧界面图像差异较大。而第i帧界面图像与第i-1帧界面图像的差异较小,则第i帧界面图像与该第i-2帧界面图像差异也较大。
同理可知,如果第i-k帧界面图像中、相较于第i-k-1帧界面图像的N个差异区域(简称为第i-k帧的N个差异区域)在第i-k帧界面图像的面积占比大于预设值,则表示第i-k帧界面图像与第i-k-1帧界面图像及其之前的界面图像的差异都较大,k为正整数。其次,如果从第i-k+1帧界面图像到第i帧界面图像中任一帧的N个差异区域在任一帧界面图像中的面积占比都小于预设值,则表示第i帧界面图像与第i-k帧界面图像及其到第i帧界面图像之间的界面图像的差异较小。因此,若因为编码数据大小不同和网络波动等原因,第二电子设备在第i-k帧界面图像之前先接收到第i帧界面图像的第一编码信息;并且,在显示第i-k帧界面图像之前利用接收到的第一编码信息显示第i帧界面图像的内容,则会出现画面错位的问题。
针对画面错位的问题,第一电子设备确定N个差异区域在第i帧界面图像的面积占比大于或等于预设值时,将第i帧的第一时间戳用作第i+1帧的第一基准时间戳,否则,将第i帧的第一基准时间戳用作第i+1帧的第一基准时间戳。结合上述“第i-k帧的N个差异区域在第i-k帧界面图像的面积占比大于预设值”可知,第i-k+1帧的第一基准时间戳等于第i-k帧界面图像的第一时间戳。结合上述“从第i-k+1帧界面图像到第i帧界面图像中任一帧的N个差异区域在任一帧界面图像中的面积占比都小于预设值”可知,从第i-k+1帧界面图像到第i帧中任一帧的第一基准时间戳等于第i-k+1帧的第一基准时间戳。由于第i-k+1帧的第一基准时间戳等于第i-k帧界面图像的第一时间戳,因此,第i帧的第一基准时间戳等于第i-k帧界面图像的第一时间戳。
其次,第一电子设备在第一编码信息中携带有第i帧的第一时间戳和第i帧的第一基准时间戳。因为第i帧的第一基准时间戳等于第i-k帧界面图像的第一时间戳,则第i帧的第一基准时间戳能够用于指示第二电子设备将第i帧界面图像的内容显示在上述第i-k帧界面图像之后。如此,就可以避免由于将第i帧界面图像的内容显示在第i-k帧界面图像之前所导致的画面错位的问题。
示例性地,以M=5为例,说明第一电子设备生成第一基准时间戳的过程。如图 11A所示,第一电子设备先显示第一界面图像,并生成第二时间戳T0。然后,第一电子设备接收到更新操作,依次执行以下过程:显示第1帧界面图像与生成第1帧的第一时间戳T1,显示第2帧界面图像与生成第2帧的第一时间戳T2,显示第3帧界面图像与生成第3帧的第一时间戳T3,显示第4帧界面图像与生成第4帧的第一时间戳T4,显示第5帧界面图像与生成第5帧的第一时间戳T5。其中,时间轴的箭头方向表示时间顺序。
如图11B所示,第一电子设备在接收到更新操作时,还可以将第二时间戳T0作为第1帧的第一基准时间戳T11-JZ。第一电子设备确定第1帧界面图像相较于第一界面图像的N个差异区域(简称为第1帧的N个差异区域)在第1帧界面图像的面积占比大于预设值,将第1帧的第一时间戳T1用作第2帧的第一基准时间戳T21-JZ。第一电子设备确定第2帧界面图像相较于第1帧界面图像的N个差异区域(简称为第2帧的N个差异区域)在第2帧界面图像的面积占比小于预设值,将第2帧的第一基准时间戳T21-JZ用作第3帧的第一基准时间戳T31-JZ。第一电子设备确定第3帧界面图像相较于第2帧界面图像的N个差异区域(简称为第3帧的N个差异区域)在第3帧界面图像的面积占比小于预设值,将第3帧的第一基准时间戳T31-JZ用作第4帧的第一基准时间戳T41-JZ。第一电子设备确定第4帧界面图像相较于第3帧界面图像的N个差异区域(简称为第4帧的N个差异区域)在第4帧界面图像的面积占比大于预设值,将第4帧的第一时间戳T4用作第5帧的第一基准时间戳T51-JZ。第一电子设备还确定第5帧界面图像相较于第4帧界面图像的N个差异区域(简称为第5帧的N个差异区域)在第5帧界面图像的面积占比小于预设值。
其次,如图11A所示,第一电子设备依次编码第1帧界面图像(即I帧或P帧)、第2帧的N个差异区域、第3帧的N个差异区域、第4帧界面图像(即I帧或P帧)、第5帧的N个差异区域。其中,第一电子设备分别对完整的第1帧界面图像和完整的第4帧界面图像编码,则第1帧界面图像可以是I帧或P帧,第4帧界面图像可以是I帧或P帧。
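The rule behind T11-JZ through T51-JZ fits in one line of bookkeeping. A minimal sketch with hypothetical names; the area ratios below are invented solely to reproduce the FIG. 11B pattern (frames 1 and 4 at or above the preset value, the rest below).

```python
def next_base_timestamp(cur_base, play_ts, area_ratio, preset=0.25):
    """First base timestamp of frame i+1, given frame i's values.
    At or above the preset ratio the frame is sent whole and its own first
    timestamp becomes the new base; below it the base is carried over."""
    return play_ts if area_ratio >= preset else cur_base

# Replaying the FIG. 11B example with T0..T5 as the integers 0..5.
base = 0                              # second timestamp T0 = base of frame 1
ratios = [0.6, 0.1, 0.1, 0.7, 0.1]    # hypothetical per-frame area ratios
history = []
for play_ts, ratio in zip([1, 2, 3, 4, 5], ratios):
    base = next_base_timestamp(base, play_ts, ratio)
    history.append(base)
print(history)  # [1, 1, 1, 4, 4] -> T21-JZ..T51-JZ, then the base for frame 6
```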
本申请实施例中,为了区分第一编码信息和第二编码信息,第一编码信息还可以包括表征N个差异区域的第一标识,第二编码信息还可以包括表征第i帧界面图像的第二标识。
示例性地,若第一电子设备向第二电子设备发送第一编码信息所采用的传输路径,和发送第二编码信息所采用的传输路径不同,则第一标识可以是指与发送第一编码信息所采用的传输路径相关的信息(如端口号),第二标识可以是指与发送第二编码信息所采用的传输路径相关的信息(如端口号)。
进一步地,第一电子设备对N个差异区域的图像内容编码所采用的编解码标准,与对第i帧界面图像编码所采用的编解码标准可以不同,则第一标识可以用于指示第二电子设备采用第一电子设备在生成第一编码数据时所采用的编解码标准,对第一编码数据解码;第二标识可以用于指示第二电子设备采用第一电子设备在生成第二编码数据时所采用的编解码标准,对第二编码数据解码。
本申请实施例中,第一编码信息的数据结构可以如下表1所示,第一编码信息中不同的字节存放不同的数据,第一编码信息具体包括:(1)表示第一编码数据的编码 类型的数据,可以取值0或1,0表示第一编码数据为原始数据(即N个差异区域内的所有像素点的像素值),1表示JPEG或其他的编码方式;(2)第i-1帧界面图像的宽度videoWidth;(3)第i-1帧界面图像的高度videoHeight,宽度和高度的单位都可以为像素;(4)第i帧的第一基准时间戳Basetimestamp;(5)第i帧的第一时间戳Playtimestamp;(6)差异区域的总个数N;(7)第一编码数据的总长度(或者说净荷域(payload)的长度len);(8)N个差异区域中第1个差异区域的位置信息,如第1个差异区域的xmin、xmax、ymin、ymax;(9)第1个差异区域在第一编码数据中所占的数据长度len1(或者说第1个差异区域的长度len1);(10)第1个差异区域的编码数据data1,len1就是data1的长度;(11)第N个差异区域的位置信息,如第N个差异区域的xmin、xmax、ymin、ymax;(12)第N个差异区域在第一编码数据中所占的数据长度lenN(或者说第N个差异区域的长度lenN);(13)第N个差异区域的编码数据dataN,lenN就是dataN的长度。
表1
字段 | 类型 | 说明
编码类型 | unit8 | 0表示原始数据,1表示JPEG或其他编码方式
videoWidth | unit32 | 第i-1帧界面图像的宽度(像素)
videoHeight | unit32 | 第i-1帧界面图像的高度(像素)
Basetimestamp | unit64 | 第i帧的第一基准时间戳
Playtimestamp | unit64 | 第i帧的第一时间戳
N | unit32 | 差异区域的总个数
len | unit32 | 第一编码数据的总长度(净荷域长度)
第1个差异区域的位置信息 | unit32[4] | xmin、xmax、ymin、ymax
len1 | unit32 | 第1个差异区域的编码数据长度
data1 | unit8[len1] | 第1个差异区域的编码数据
…… | …… | ……
第N个差异区域的位置信息 | unit32[4] | xmin、xmax、ymin、ymax
lenN | unit32 | 第N个差异区域的编码数据长度
dataN | unit8[lenN] | 第N个差异区域的编码数据
其中,unit8表示数据为无符号的8比特(bit)整数,unit32表示数据为无符号的32bit整数,unit64表示数据为无符号的64bit整数,unit32[4]表示数据为无符号的32bit整数且长度等于4,unit8[len1]表示数据为无符号的32bit整数且长度等于len1,unit8[lenN]表示数据为无符号的32bit整数且长度等于lenN。第一编码信息可以分为帧头和净荷域(payload),用于存放不同的数据。
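Table 1's header-plus-payload layout can be serialized with Python's struct module. A sketch under stated assumptions: big-endian byte order, and unit32 widths for videoWidth, videoHeight, N, len and the per-region lengths, which the legend implies but does not pin down field by field.

```python
import struct

def pack_first_encoding_info(enc_type, width, height, base_ts, play_ts, regions):
    """Serialize the Table 1 layout: fixed header, then one
    (position, length, data) entry per difference region.
    regions: list of ((xmin, xmax, ymin, ymax), encoded_bytes)."""
    payload = b"".join(
        struct.pack(">4I", *pos) + struct.pack(">I", len(data)) + data
        for pos, data in regions)
    header = struct.pack(">BIIQQII", enc_type, width, height,
                         base_ts, play_ts, len(regions), len(payload))
    return header + payload
```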
本申请实施例中,第二编码信息还包括:第i帧的第一时间戳,第i帧的第一时间戳用于记录第一电子设备生成第i帧界面图像的时间。第二电子设备在第二电子设备接收来自第一电子设备的第二编码信息之后,确定第i+1帧的第二基准时间戳为第i帧的第一时间戳,第二基准时间戳是第二电子设备记录投屏的基准时间。
示例性地,第二编码信息可以包括上述第二标识。第二电子设备可以在确定第二编码信息包括上述第二标识时,确定第i+1帧的第二基准时间戳为第i帧的第一时间戳。
本申请实施例中,第一编码信息还包括:第i帧的第一时间戳和第i帧的第一基准时间戳,第一基准时间戳是第一电子设备记录投屏的基准时间。第二电子设备在接收来自第一电子设备的第一编码信息之后,第二电子设备解码第一编码数据,得到N个差异区域的图像内容之前,第二电子设备确定第i帧的第一时间戳所记录的时间晚于第i帧的第一基准时间戳所记录的时间,并且第i帧的第一基准时间戳所记录的时间等于第i帧的第二基准时间戳所记录的时间。
第二电子设备接收来自第一电子设备的第一编码信息之后,确定第i帧的第一时间戳所记录的时间晚于第i帧的第一基准时间戳所记录的时间、且第i帧的第一基准时间戳所记录的时间等于第i帧的第二基准时间戳所记录的时间时,才解码第一编码数据,得到N个差异区域的图像内容。
进一步地,第二电子设备接收来自第一电子设备的第一编码信息之后,还可以确定第i+1帧的第二基准时间戳为第i帧的第二基准时间戳。
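The gate the second device applies before decoding first encoded information is just the two timestamp comparisons above; a minimal sketch with hypothetical names.

```python
def can_apply_first_info(play_ts, base_ts, local_base_ts):
    """Decode frame i's N difference regions only if its first timestamp is
    later than its first base timestamp AND that base equals the second
    device's own second base timestamp, i.e. the content of frame i-1 is
    already on screen."""
    return play_ts > base_ts and base_ts == local_base_ts
```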
可以理解的是,第二电子设备如果接收到第i-1帧界面图像的第二编码信息(简称为第i-1帧的第二编码信息),则将第i-1帧的第二编码信息中的第i-1帧的第一时间戳用作第i帧的第二基准时间戳(即第i帧的第二基准时间戳等于第i-1帧的第一时间戳),还利用第i-1帧的第二编码信息显示第i-1帧界面图像的内容。其中,接收到第i-1帧的第二编码信息表示第i-1帧的N个差异区域在第i-1帧界面图像的面积占比大于预设值;而第i-1帧的N个差异区域在第i-1帧界面图像的面积占比大于预设值,表示第i-1帧界面图像与第i-2帧界面图像的差异较大。那么,第i帧界面图像与第i-2帧界面图像的差异更大。然后,如果第二电子设备再接收到第i帧界面图像的第一编码信息(简称为第i帧的第一编码信息),则表示第i帧的N个差异区域在第i帧界面图像的面积占比小于预设值;而第i帧的N个差异区域在第i帧界面图像的面积占比小于预设值,表示第i帧界面图像与第i-1帧界面图像的差异较小。其中,第i帧界面图像的第一编码信息可以包括第i帧的第一基准时间戳。第一电子设备在确定第i-1帧的N个差异区域在第i-1帧界面图像的面积占比大于预设值时,将第i-1帧的第一时间戳用作第i帧的第一基准时间戳,即第i帧的第一基准时间戳等于第i-1帧的第一时间戳;再结合上述第i帧的第二基准时间戳等于第i-1帧的第一时间戳,则可知第i帧的第一基准时间戳等于第i帧的第二基准时间戳。
综上可知,第i帧的第一基准时间戳等于第i帧的第二基准时间戳,表示第i帧界面图像与第i-1帧界面图像的差异较小,还表示第二电子设备已显示出第i-1帧界面图像的内容。那么,第二电子设备在第i帧的第一基准时间戳等于第i帧的第二基准时间戳时,利用第i帧的第一编码信息显示第i帧界面图像的内容。因为第i帧界面图像的内容与已显示出的第i-1帧界面图像的内容差异较小,就可以避免因为第二电子设备接收M帧界面图像对应的编码信息(即第一编码信息或第二编码信息)的顺序与第一电子设备生成M帧界面图像的先后顺序不同导致画面错位的问题。
本申请实施例中,第二电子设备可以确定第i帧的第一时间戳所记录的时间不晚于(即早于或等于)第i帧的第一基准时间戳所记录的时间、或者第i帧的第一基准时间戳所记录的时间早于第i帧的第二基准时间戳所记录的时间,不处理第i帧界面图像的第一编码信息,或者删除第一编码信息。在第i帧的第一时间戳所记录的时间晚于第i帧的第一基准时间戳所记录的时间、且第i帧的第一基准时间戳所记录的时间晚于第i帧的第二基准时间戳所记录的时间的情况下,不处理第i帧界面图像的第一编码信息。
进一步地,第二电子设备在第i帧的第一时间戳所记录的时间晚于第i帧的第一基准时间戳所记录的时间、且第i帧的第一基准时间戳所记录的时间晚于第i帧的第二基准时间戳所记录的时间的情况下,又接收到第j帧界面图像的第二编码信息(简称为第j帧的第二编码信息);将第j帧的第一时间戳用作第j+1帧的第二基准时间戳,并利用第j帧的第二编码信息更新界面图像。第二电子设备还可以在第i帧的第一基准时间戳所记录的时间等于第j+1帧的第二基准时间戳所记录的时间时,对第i帧界面图像的第一编码信息执行S409-S410。
可以理解的是,一方面,第二电子设备接收到第i帧的第一编码信息包括第i帧的第一时间戳和第i帧的第一基准时间戳。如果第i帧的第一时间戳所记录的时间不晚于第i帧的第一基准时间戳所记录的时间,则第i帧的第一编码信息是错误的;因此,第二电子设备不处理该第一编码信息。如果第i帧的第一基准时间戳所记录的时间早于第i帧的第二基准时间戳所记录的时间,表示第二电子设备已经接收第i+h帧界面图像的第二编码信息(简称为第i+h帧的第二编码信息),并将第i+h帧的第二编码信息中的第i+h帧的第一时间戳用作第i帧的第二基准时间戳。第二电子设备还利用第i+h帧的第二编码信息显示出第i+h帧界面图像的内容,则第i帧界面图像的内容和当前显示出的界面图像的内容差异较大;因此,第二电子设备不处理第i帧的第一编码信息,即不显示第i帧界面图像的内容,避免了画面错位的问题。其中,h为一个正整数。
另一方面,第二电子设备在接收到第i帧的的第一编码信息包括第i帧的第一时间戳和第i帧的第一基准时间戳。如果第i帧的第一时间戳所记录的时间晚于第i帧的第一基准时间戳所记录的时间、且第i帧的第一基准时间戳所记录的时间晚于第i帧的第二基准时间戳所记录的时间,表示第i帧的第一基准时间戳等于第i-k帧的第一时间戳,第i帧的第二基准时间戳等于第i-k-q帧的第一时间戳。其中,第i帧的第二基准时间戳等于第i-k-q帧的第一时间戳,又表示第二电子设备还没有接收到第i-k帧的第二编码信息,第二电子设备只显示出第i-k帧之前的界面图像的内容。其次,第i帧的第一基准时间戳等于第i-k帧的第一时间戳,又表示相较于第i帧界面图像和第i-k帧之前的界面图像的差异,第i帧界面图像和第i-k帧界面图像的差异更小;进而,可以知道第i帧界面图像的内容和第二电子设备当前显示出的界面图像的内容差异较大。因此,不处理第i帧的第一编码信息,即不显示第i帧界面图像的内容。然后,如果第二电子设备又接收到第j帧的第二编码信息,将第j帧的第二编码信息中的第j帧的第一时间戳用作第j+1帧的第二基准时间戳,并利用第j帧的第二编码信息显示第j帧界面图像的内容。此时,如果第i帧的第一基准时间戳等于第j+1帧的第二基准时间戳,表示第j+1帧就是第i-k帧,即第二电子设备显示出与第i帧界面图像的差异较小的第i-k帧界面图像的内容。此时,第二电子设备再利用第i帧的第一编码信息显示第i帧界面图像的内容,就可以避免因为第二电子设备接收M帧界面图像对应的编码信息(即第一编码信息或第二编码信息)的顺序与第一电子设备生成M帧界面图像的先后顺序不 同导致画面错位的问题。其中,k、q为一个正整数,k和q可以相等或不相等。
示例性地,继续以图11A和图11B中的5帧界面图像为例,如图12中的(a)所示,第一电子设备依次向第二电子设备发送第1帧界面图像的第二编码信息(简称为第1帧的第二编码信息)、第2帧界面图像的第一编码信息(简称为第2帧的第一编码信息)、第3帧界面图像的第一编码信息(简称为第3帧的第一编码信息)、第4帧界面图像的第二编码信息(简称为第4帧的第二编码信息)、第5帧界面图像的第一编码信息(简称为第5帧的第一编码信息)。其中,第1帧的第二编码信息包括第1帧的第一时间戳T1。第2帧的第一编码信息包括:第2帧的第一时间戳T2、第2帧的第一基准时间戳T21-jz=T1。第3帧的第一编码信息包括:第3帧的第一时间戳T3、第3帧的第一基准时间戳T31-jz=T21-jz=T1。第4帧的第二编码信息包括第4帧的第一时间戳T4。第5帧的第一编码信息包括:第5帧的第一时间戳T5、第5帧的第一基准时间戳T51-jz=T4。第一电子设备可以通过第一传输路径发送第一编码信息,通过第二传输路径发送第二编码信息。然而,由于编码数据大小不同和网络波动等原因,第二电子设备可能依次接收第1帧界面图像的第二编码信息、第2帧界面图像的第一编码信息、第3帧界面图像的第一编码信息、第5帧界面图像的第一编码信息、第4帧界面图像的第二编码信息。
如图12中的(b)所示,第二电子设备可以接收第1帧的第二编码信息(包括:第1帧的第一时间戳T1),则显示第1帧界面图像的内容,并确定第2帧的第二基准时间戳T22-jz=T1。第二电子设备再接收第2帧的第一编码信息(包括:第2帧的第一时间戳T2、第2帧的第一基准时间戳T21-jz=T1),在T21-jz=T22-jz=T1时显示第2帧界面图像的内容,并确定第3帧的第二基准时间戳T32-jz=T22-jz(T32-jz=T22-jz=T1)。第二电子设备再接收第3帧的第一编码信息(包括:T3、T31-jz=T21-jz=T1),在T31-jz=T32-jz时显示第3帧界面图像的内容,并确定第4帧的第二基准时间戳T42-jz=T32-jz(T42-jz=T32-jz=T22-jz=T1)。第二电子设备再接收第5帧的第一编码信息(包括:T5、T51-jz=T4),在T51-jz≠T42-jz时不处理第5帧的第一编码信息。
最后,第二电子设备接收到第4帧的第二编码信息(包括T4),显示第4帧界面图像的内容,并确定第5帧的第二基准时间戳T52-jz=T4。然后,第二电子设备在T51-jz=T52-jz时显示第5帧界面图像的内容。
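The FIG. 12 walk-through can be replayed in a few lines, with T1..T5 as the integers 1..5. A minimal sketch: holding mismatched region frames in a pending list until a full frame advances the local base is one interpretation of the re-check described above, not a quoted mechanism.

```python
def receiver(messages, t0=0):
    """Replay arrival order. messages: ('full', play_ts) for second encoded
    information, ('regions', play_ts, base_ts) for first encoded information.
    Returns the play timestamps in display order."""
    local_base, shown, pending = t0, [], []

    def try_apply(play_ts, base_ts):
        if play_ts > base_ts and base_ts == local_base:
            shown.append(play_ts)
            return True
        return False

    for msg in messages:
        if msg[0] == 'full':
            shown.append(msg[1])
            local_base = msg[1]  # second base timestamp for the next frame
            pending = [p for p in pending if not try_apply(*p)]
        elif not try_apply(msg[1], msg[2]):
            pending.append((msg[1], msg[2]))
    return shown

# Arrival order from FIG. 12: frames 1, 2, 3, 5, 4.
msgs = [('full', 1), ('regions', 2, 1), ('regions', 3, 1),
        ('regions', 5, 4), ('full', 4)]
print(receiver(msgs))  # [1, 2, 3, 4, 5] -- frame 5 waits until frame 4 lands
```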
请参照图13,本申请实施例提供一种多屏协同的显示方法,应用于上述第一使用场景下的第一电子设备和第二电子设备中,第一电子设备连接第二电子设备。如图13所示,在S401之后、且S402之前,该显示方法可以包括S601。在S403之后,该显示方法可以包括S602。在S405之后,该显示方法可以包括S603。在S407之后、且S408之前,该显示方法可以包括S604。该显示方法中的S408可以包括S605。在S409之后,该显示方法可以包括S606。该显示方法中的S410可以包括S607。该显示方法中的S412可以包括S608。在S414之后,该显示方法可以包括S609。
S601、第一电子设备生成第二时间戳,确定第二时间戳是第1帧的第一基准时间戳。
其中,第二时间戳用于记录第一电子设备生成第一界面图像的时间;第一基准时间戳是所述第一电子设备记录投屏的基准时间。
S602、第二电子设备确定第1帧的第二基准时间戳为第二时间戳。
S603、第一电子设备生成第i帧的第一时间戳。
其中,第i帧的第一时间戳用于记录第一电子设备生成第i帧界面图像的时间。
S604、第一电子设备确定第i+1帧的第一基准时间戳为第i帧的第一基准时间戳。
S605、第一电子设备向第二电子设备发送第一编码信息;第一编码信息包括:第一编码数据、第i帧的第一时间戳、第i帧的第一基准时间戳。
其中,第一编码信息还包括第i帧界面图像中相较于第i-1帧界面图像的N个差异区域的位置信息(即N个差异区域在第i帧界面图像中的位置信息)。
S606、第二电子设备确定第i+1帧的第二基准时间戳为第i帧的第二基准时间戳。
S607、第二电子设备在第i帧的第一时间戳所记录的时间晚于第i帧的第一基准时间戳所记录的时间,并且第i帧的第一基准时间戳所记录的时间等于第i帧的第二基准时间戳所记录的时间的情况下,显示第二投屏界面。
S608、第一电子设备向第二电子设备发送第二编码信息;第二编码信息包括:第二编码数据、第i帧的第一时间戳。
S609、第二电子设备确定第i+1帧的第二基准时间戳为第i帧的第一时间戳。
本申请实施例中,在上述反控场景下,第一电子设备连接第二电子设备后,第一电子设备将第一电子设备显示的界面图像的内容投屏第二电子设备上,还可以由第二电子设备接收用户对该界面图像的更新操作。第二电子设备响应于该更新操作,指示第一电子设备更新该界面图像,并从第一电子设备接收更新后的界面图像,显示更新后的界面图像的内容。具体地,如图14所示,上述反控场景下,应用于第一电子设备和第二电子设备的多屏协同的显示方法在S403之后、且在S406之前,可以执行S701-S704,不执行S404-S405。
S701、第二电子设备接收用户对第一界面图像的更新操作。
其中,更新操作用于触发第一电子设备依次显示M帧界面图像。
S702、第二电子设备向第一电子设备发送更新指令,更新指令是对第一投屏界面的更新操作触发的。
S703、第一电子设备接收来自第二电子设备的更新指令。
其中,更新指令用于触发第一电子设备依次显示M帧界面图像。
S704、响应于更新指令,显示第i帧界面图像,并获取第i帧界面图像中、相较于第i-1帧界面图像的N个差异区域。
需要说明的是,S703中第一电子设备响应于更新指令显示第i帧界面图像的详情过程可以参见上述S404中关于第一电子设备响应于更新操作显示第i帧界面图像的介绍,本申请实施例这里不予赘述。
示例性地,在上述反控场景下,以第一电子设备为手机、第二电子设备为笔记本电脑为例,如图15中的(a)所示,手机300显示的第一界面图像501为编辑一张图片的编辑界面图像,并指示(或触发)笔记本电脑510显示第一投屏界面502。如图15中的(b)所示,笔记本电脑510无线连接手写笔507,笔记本电脑510通过手写笔507接收用户对第一界面图像的更新操作,如用户通过手写笔507在手机300的显示屏上绘制一个心形的绘画操作。然后,手机300响应于该绘画操作,依次显示了绘制 心形过程中的M帧界面图像,显示出一个包括心形的界面图像503。同时,手机300向笔记本电脑510发送每一帧界面图像的第一编码信息或第二编码信息。由于手机300可以在每一帧界面图像相较于其前一帧界面图像的变化较小时,只对每一帧界面图像中的与其前一帧界面图像不同的内容进行编码得到第一编码数据,将包括第一编码数据的第一编码信息传输至第二电子设备,减少了编解码耗时以及编码数据的传输时长。因此,笔记本电脑510也较快地利用接收到的第一编码信息或第二编码信息显示一个包括心形的投屏界面504。
示例性地,在上述反控场景下,以第一电子设备为笔记本电脑、第二电子设备为平板电脑为例,如图16中的(a)所示,笔记本电脑520显示的第一界面图像501为编辑一张图片的编辑界面图像,并指示(或触发)平板电脑505显示第一投屏界面502。如图16中的(b)所示,平板电脑505无线连接手写笔507,平板电脑505通过手写笔507接收用户对第一界面图像的更新操作,如用户通过手写笔507在笔记本电脑520的显示屏上绘制一个心形的绘画操作。然后,笔记本电脑520响应于该绘画操作,依次显示了绘制心形过程中的M帧界面图像,显示出一个包括心形的界面图像503。同时,笔记本电脑520向平板电脑505发送每一帧界面图像的第一编码信息或第二编码信息。由于笔记本电脑520可以在每一帧界面图像相较于其前一帧界面图像的变化较小时,只对每一帧界面图像中的与其前一帧界面图像不同的内容进行编码得到第一编码数据,将包括第一编码数据的第一编码信息传输至第二电子设备,减少了编解码耗时以及编码数据的传输时长。因此,平板电脑505也较快地利用接收到的第一编码信息或第二编码信息显示一个包括心形的投屏界面504。
示例性地,如下表2所示,反控场景下,第一电子设备显示第一界面图像时接收来自第二电子设备的更新指令。第一电子设备采用常规技术(如H264编码技术),响应于更新指令,显示M帧界面图像,并指示(或触发)第二电子设备显示M帧界面图像;在第二电子设备显示M帧界面图像的过程中,测量到第二电子设备显示M帧界面图像的帧率(即投屏帧率)等于48FPS;还测量到第二电子设备从接收到更新操作到显示出M帧界面图像所需时长为130ms,即第一电子设备对第二电子设备的反控时长为130ms。同样场景下,第一电子设备采用本申请实施例提供的方法,响应于更新指令,显示M帧界面图像,并指示(或触发)第二电子设备显示M帧界面图像;在第二电子设备显示M帧界面图像的过程中,测量到第二电子设备显示M帧界面图像的帧率(即投屏帧率)等于60FPS;还测量到第二电子设备从接收到更新操作到显示出M帧界面图像所需时长为58ms,即第一电子设备对第二电子设备的反控时长为58ms。
表2
  投屏帧率 反控时长
H264编码技术 48FPS 130ms
本申请实施例提供的方法 60FPS 58ms
可以看出,采用本申请实施例提供的方法,第一电子设备对第二电子设备的反控时延更小,并且,第二电子设备显示M帧界面图像的帧率等于60FPS,表示第二电子设备更新界面图像不卡顿。
请参照图17,本申请实施例提供一种多屏协同的显示方法,应用于上述反控场景下的第一电子设备和第二电子设备中,第一电子设备连接第二电子设备。如图17所示,在S401之后、且S402之前,该显示方法可以包括S801。在S403之后,该显示方法可以包括S802。在S704之后,该显示方法可以包括S803。在S407之后、且S408之前,该显示方法可以包括S804。该显示方法中的S408可以包括S805。在S409之后,该显示方法可以包括S806。该显示方法中的S410可以包括S807。该显示方法中的S412可以包括S808。在S414之后,该显示方法可以包括S809。
S801、第一电子设备生成第二时间戳,确定第二时间戳是第1帧的第一基准时间戳。
其中,第二时间戳用于记录第一电子设备生成第一界面图像的时间;第一基准时间戳是所述第一电子设备记录投屏的基准时间。
S802、第二电子设备确定第1帧的第二基准时间戳为第二时间戳。
S803、第一电子设备生成第i帧的第一时间戳。
S804、第一电子设备确定第i+1帧的第一基准时间戳为第i帧的第一基准时间戳。
S805、第一电子设备向第二电子设备发送第一编码信息;第一编码信息包括:第一编码数据、第i帧的第一时间戳、第i帧的第一基准时间戳。
其中,第一编码信息还包括第i帧界面图像中相较于第i-1帧界面图像的N个差异区域的位置信息(即N个差异区域在第i帧界面图像中的位置信息)。
S806、第二电子设备确定第i+1帧的第二基准时间戳为第i帧的第二基准时间戳。
S807、第二电子设备在第i帧的第一时间戳所记录的时间晚于第i帧的第一基准时间戳所记录的时间,并且第i帧的第一基准时间戳所记录的时间等于第i帧的第二基准时间戳所记录的时间的情况下,显示第二投屏界面。
S808、第一电子设备向第二电子设备发送第二编码信息;第二编码信息包括:第二编码数据、第i帧的第一时间戳。
S809、第二电子设备确定第i+1帧的第二基准时间戳为第i帧的第一时间戳。
可以理解的是,上述方法可以由多屏协同的显示装置实现。多屏协同的显示装置为了实现上述功能,其包含了执行各个功能相应的硬件结构和/或软件模块。本领域技术人员应该很容易意识到,结合本文中所公开的实施例描述的各示例的单元及算法步骤,本申请实施例能够以硬件或硬件和计算机软件的结合形式来实现。某个功能究竟以硬件还是计算机软件驱动硬件的方式来执行,取决于技术方案的特定应用和设计约束条件。专业技术人员可以对每个特定的应用来使用不同方法来实现所描述的功能,但是这种实现不应认为超出本申请实施例的范围。
本申请实施例可以根据上述方法示例对上述电子设备等进行功能模块的划分,例如,可以对应各个功能划分各个功能模块,也可以将两个或两个以上的功能集成在一个处理模块中。上述集成的模块既可以采用硬件的形式实现,也可以采用软件功能模块的形式实现。需要说明的是,本申请实施例中对模块的划分是示意性的,仅仅为一种逻辑功能划分,实际实现时可以有另外的划分方式。
在采用对应各个功能划分各个功能模块的情况下,图18示出了上述实施例中所涉及的第一电子设备一种可能的结构示意图,第一电子设备900连接第二电子设备,第 一电子设备900包括显示模块901、发送模块902、接收模块903和协同处理模块904。其中,显示模块901,用于显示第一界面图像;协同处理模块904,用于对第一界面图像编码得到第一界面图像的编码数据;发送模块902,用于向第二电子设备发送第一界面图像的编码信息;接收模块903,用于接收对第一界面图像的更新操作,更新操作用于触发第一电子设备900依次显示M帧界面图像,M为正整数;协同处理模块904,还用于响应于更新操作,显示第i帧界面图像,并获取第i帧界面图像中、相较于第i-1帧界面图像的N个差异区域;若N个差异区域在第i帧界面图像的面积占比小于预设值,对N个差异区域的图像内容进行编码,得到第一编码数据;发送模块902,还用于向第二电子设备发送第一编码信息。
其中,第一界面图像的编码信息包括第一界面图像的编码数据。第一界面图像的编码信息用于触发第二电子设备基于第一界面图像的编码信息显示第一投屏界面,第一投屏界面的内容是第一界面图像的镜像。第一编码信息包括第一编码数据和N个差异区域在第i帧界面图像中的位置信息;第一编码信息用于触发第二电子设备基于第一编码信息更新第i-1帧界面图像得到第二投屏界面并显示第二投屏界面,第二投屏界面的内容是第i帧界面图像的镜像;i在{1,……,M}中依次取值;第0帧界面图像为第一界面图像;N个差异区域中像素点的像素值与第i-1帧界面图像中对应像素点的像素值不同,N为正整数。
在一种可能的实施方式中,接收模块903,用于接收对第一界面图像的更新操作,包括:接收模块903,具体用于接收来自第二电子设备的更新指令,或者,接收用户对第一电子设备900显示的第一界面图像的更新操作;更新指令是对第一投屏界面的更新操作触发的。
另一种可能的实施方式中,协同处理模块904,用于获取第i帧界面图像中、相较于第i-1帧界面图像的N个差异区域,包括:协同处理模块904,具体用于将第i帧界面图像中每个像素点的像素值,与第i-1帧界面图像中对应像素点的像素值进行对比,得到第i帧界面图像中的差异像素点;确定包括第i帧界面图像中的差异像素点的N个差异区域。其中,差异像素点的像素值与第i-1帧界面图像中对应像素点的像素值不同。
另一种可能的实施方式中,协同处理模块904,还用于若N个差异区域在第i帧界面图像的面积占比大于预设值,对第i帧界面图像进行编码,得到第二编码数据;发送模块902,还用于向第二电子设备发送第二编码信息。其中,第二编码信息包括第二编码数据,第二编码信息用于触发第二电子设备基于第二编码信息显示第三投屏界面,第三投屏界面的内容是第i帧界面图像的镜像。
另一种可能的实施方式中,协同处理模块904,还用于:响应于更新操作,生成第i帧的第一时间戳;第i帧的第一时间戳用于记录第一电子设备生成第i帧界面图像的时间;其中,第二编码信息还包括第i帧的第一时间戳。
另一种可能的实施方式中,协同处理模块904,还用于:生成第二时间戳,并保存第二时间戳;若N个差异区域在第i帧界面图像的面积占比大于预设值,确定第i+1帧的第一基准时间戳为第i帧的第一时间戳;若N个差异区域在第i帧界面图像的面积占比小于预设值,确定第i+1帧的第一基准时间戳为第i帧的第一基准时间戳。其 中,第二时间戳用于记录第一电子设备生成第一界面图像的时间,第二时间戳是第1帧的第一基准时间戳,第一基准时间戳是第一电子设备记录投屏的基准时间。第一编码信息还包括:第i帧的第一时间戳和第i帧的第一基准时间戳。
本申请实施例还提供一种电子设备,该电子设备是第一电子设备。该第一电子设备可以包括处理器和存储器。其中,存储器用于存储计算机程序代码,计算机程序代码包括计算机指令;处理器用于运行计算机指令,使得第一电子设备执行上述方法实施例中手机300或笔记本电脑520执行的各个功能或者步骤。其中,第一电子设备的其他硬件结构可以参考上述实施例对图6所示的手机300的详细介绍,本申请实施例这里不予赘述。
在采用对应各个功能划分各个功能模块的情况下,图19示出了上述实施例中所涉及的第二电子设备一种可能的结构示意图,第二电子设备1000连接第一电子设备,第二电子设备1000包括:显示模块1001、接收模块1002和协同处理模块1003。其中,显示模块1001,用于显示第一投屏界面,第一投屏界面的内容是第一电子设备显示的第一界面图像的镜像;接收模块1002,用于接收来自第一电子设备的第一编码信息;其中,第一编码信息包括第一编码数据、以及第i帧界面图像中相较于第i-1帧界面图像的N个差异区域的位置信息;协同处理模块1003,用于解码第一编码数据,得到N个差异区域的图像内容;显示模块1001,还用于显示第二投屏界面。
其中,第i帧界面图像是第一电子设备响应于更新操作生成的,更新操作用于触发第一电子设备依次显示M帧界面图像,M为正整数,i在{1,……,M}中依次取值,第0帧界面图像为第一界面图像。N个差异区域中像素点的像素值与第i-1帧界面图像中对应像素点的像素值不同,N为正整数;第一编码数据是对N个差异区域的图像内容编码得到的。第二投屏界面的内容是第i帧界面图像的镜像,第i帧界面图像是根据N个差异区域的图像内容和位置信息更新第i-1帧界面图像得到的。
在一种可能的实施方式中,第二电子设备1000还包括发送模块1004。接收模块1002,还用于在接收来自第一电子设备的第一编码信息之前,接收用户对第一投屏界面的更新操作;发送模块1004,用于响应于更新操作,向第一电子设备发送更新指令;更新指令用于触发第一电子设备依次显示M帧界面图像。
其中,发送模块通过与第二电子设备1000连接的外部设备接收对第一投屏界面的更新操作;其中,外部设备包括第二电子设备1000的显示屏、遥控器、鼠标或者手写笔中的任一种。
另一种可能的实施方式中,接收模块1002,还用于接收来自第一电子设备的第二编码信息;其中,第二编码信息包括第二编码数据,第二编码数据是对第i帧界面图像编码得到的;协同处理模块1003,还用于对第二编码数据解码,得到第i帧界面图像;显示模块1001,还用于显示第三投屏界面;第三投屏界面的内容是第i帧界面图像的镜像。
另一种可能的实施方式中,第二编码信息还包括:第i帧的第一时间戳,第i帧的第一时间戳用于记录第一电子设备生成第i帧界面图像的时间。协同处理模块1003,还用于在接收模块接收来自第一电子设备的第二编码信息之后,确定第i+1帧的第二基准时间戳为第i帧的第一时间戳,第二基准时间戳是第二电子设备记录投屏的基准 时间。
另一种可能的实施方式中,第一编码信息还包括:第i帧的第一时间戳和第i帧的第一基准时间戳,第一基准时间戳是第一电子设备记录投屏的基准时间。协同处理模块1003,还用于在接收模块接收来自第一电子设备的第一编码信息之后,解码第一编码数据,得到N个差异区域的图像内容之前,确定第i帧的第一时间戳所记录的时间晚于第i帧的第一基准时间戳所记录的时间,并且第i帧的第一基准时间戳所记录的时间等于第i帧的第二基准时间戳所记录的时间。
本申请实施例还提供一种电子设备,该电子设备是第二电子设备。该第二电子设备包括处理器和存储器。其中,存储器用于存储计算机程序代码,计算机程序代码包括计算机指令;处理器用于运行计算机指令,使得第二电子设备执行上述方法实施例中笔记本电脑510或平板电脑505执行的各个功能或者步骤。其中,第二电子设备的其他硬件结构可以参考上述实施例对图6所示的手机300的详细介绍,本申请实施例这里不予赘述。
本申请另一实施例提供了一种计算机可读存储介质,计算机可读存储介质上存储有计算机指令,当计算机指令在第一电子设备(如图6、图8和图15中任一附图所示的手机300、图9或图16所示的笔记本电脑520、图18所示的第一电子设备900)上运行时,使得第一电子设备执行上述方法实施例中手机300、笔记本电脑520或第一电子设备900执行的各个功能或者步骤。例如,该计算机可读存储介质可以是只读存储器(Read-Only Memory,ROM)、随机存取存储器(Random Access Memory,RAM)、只读光盘(Compact Disc Read-Only Memory,CD-ROM)、磁带、软盘和光数据存储设备等。
本申请另一实施例提供了一种计算机程序产品,包括一条或多条指令,该一条或多条指令可以在第一电子设备(如图6、图8和图15中任一附图所示的手机300、图9或图16所示的笔记本电脑520,图18所示的第一电子设备900)上运行,使得第一电子设备执行上述方法实施例中手机300、笔记本电脑520或第一电子设备900执行的各个功能或者步骤。
本申请另一实施例提供了一种计算机可读存储介质,计算机可读存储介质上存储有计算机指令,当计算机指令在第二电子设备(如图8或图15所示的笔记本电脑510、图9或图16所示的平板电脑505、图19所示的第二电子设备1000)上运行时,使得第二电子设备执行上述方法实施例中笔记本电脑510、平板电脑505或第二电子设备1000执行的各个功能或者步骤。例如,该计算机可读存储介质可以是ROM、RAM、CD-ROM、磁带、软盘和光数据存储设备等。
本申请另一实施例提供了一种计算机程序产品,包括一条或多条指令,该一条或多条指令可以在第二电子设备(如图8或图15所示的笔记本电脑510、图9或图16所示的平板电脑505、图19所示的第二电子设备1000)上运行,使得第二电子设备执行上述方法实施例中笔记本电脑510、平板电脑505或第二电子设备1000执行的各个功能或者步骤。
通过以上实施方式的描述,所属领域的技术人员可以清楚地了解到,为描述的方便和简洁,仅以上述各功能模块的划分进行举例说明,实际应用中,可以根据需要而将上述功能分配由不同的功能模块完成,即将装置的内部结构划分成不同的功能模块,以完成以上描述的全部或者部分功能。
在本申请所提供的几个实施例中,应该理解到,所揭露的装置和方法,可以通过其它的方式实现。例如,以上所描述的装置实施例仅仅是示意性的,例如,模块或单元的划分,仅仅为一种逻辑功能划分,实际实现时可以有另外的划分方式,例如多个单元或组件可以结合或者可以集成到另一个装置,或一些特征可以忽略,或不执行。另一点,所显示或讨论的相互之间的耦合或直接耦合或通信连接可以是通过一些接口,装置或单元的间接耦合或通信连接,可以是电性,机械或其它的形式。
作为分离部件说明的单元可以是或者也可以不是物理上分开的,作为单元显示的部件可以是一个物理单元或多个物理单元,即可以位于一个地方,或者也可以分布到多个不同地方。可以根据实际的需要选择其中的部分或者全部单元来实现本实施例方案的目的。
另外,在本申请各个实施例中的各功能单元可以集成在一个处理单元中,也可以是各个单元单独物理存在,也可以两个或两个以上单元集成在一个单元中。上述集成的单元既可以采用硬件的形式实现,也可以采用软件功能单元的形式实现。
集成的单元如果以软件功能单元的形式实现并作为独立的产品销售或使用时,可以存储在一个可读取存储介质中。基于这样的理解,本申请实施例的技术方案本质上或者说对现有技术做出贡献的部分或者该技术方案的全部或部分可以以软件产品的形式体现出来,该软件产品存储在一个存储介质中,包括若干指令用以使得一个设备(可以是单片机,芯片等)或处理器(processor)执行本申请各个实施例所述方法的全部或部分步骤。而前述的存储介质包括:U盘、移动硬盘、ROM、RAM、磁碟或者光盘等各种可以存储程序代码的介质。
以上内容,仅为本申请的具体实施方式,但本申请的保护范围并不局限于此,任何在本申请揭露的技术范围内的变化或替换,都应涵盖在本申请的保护范围之内。因此,本申请的保护范围应以所述权利要求的保护范围为准。

Claims (15)

  1. 一种多屏协同的显示方法,其特征在于,应用于第一电子设备,所述第一电子设备连接第二电子设备,所述方法包括:
    所述第一电子设备显示第一界面图像,并对所述第一界面图像编码得到所述第一界面图像的编码数据;
    所述第一电子设备向所述第二电子设备发送所述第一界面图像的编码信息,所述第一界面图像的编码信息包括所述第一界面图像的编码数据,所述第一界面图像的编码信息用于触发所述第二电子设备基于所述第一界面图像的编码信息显示第一投屏界面,所述第一投屏界面的内容是所述第一界面图像的镜像;
    所述第一电子设备接收对所述第一界面图像的更新操作,所述更新操作用于触发所述第一电子设备依次显示M帧界面图像,M为正整数;
    所述第一电子设备响应于更新操作,显示第i帧界面图像,并获取所述第i帧界面图像中、相较于第i-1帧界面图像的N个差异区域;其中,i在{1,……,M}中依次取值;第0帧界面图像为所述第一界面图像;所述N个差异区域中像素点的像素值与所述第i-1帧界面图像中对应像素点的像素值不同,N为正整数;
    若所述N个差异区域在所述第i帧界面图像的面积占比小于预设值,所述第一电子设备对N个差异区域的图像内容进行编码,得到第一编码数据;
    所述第一电子设备向所述第二电子设备发送第一编码信息,所述第一编码信息包括所述第一编码数据和所述N个差异区域在所述第i帧界面图像中的位置信息;所述第一编码信息用于触发所述第二电子设备基于所述第一编码信息更新所述第i-1帧界面图像得到第二投屏界面并显示所述第二投屏界面,所述第二投屏界面的内容是所述第i帧界面图像的镜像。
  2. 根据权利要求1的方法,其特征在于,所述第一电子设备接收对所述第一界面图像的更新操作,包括:
    所述第一电子设备接收来自所述第二电子设备的更新指令,所述更新指令是对所述第一投屏界面的更新操作触发的;或者,
    所述第一电子设备接收用户对所述第一电子设备显示的所述第一界面图像的更新操作。
  3. 根据权利要求1或2的方法,其特征在于,所述获取所述第i帧界面图像中、相较于第i-1帧界面图像的N个差异区域,包括:
    所述第一电子设备将所述第i帧界面图像中每个像素点的像素值,与所述第i-1帧界面图像中对应像素点的像素值进行对比,得到所述第i帧界面图像中的差异像素点;其中,所述差异像素点的像素值与所述第i-1帧界面图像中对应像素点的像素值不同;
    所述第一电子设备确定包括所述第i帧界面图像中的差异像素点的所述N个差异区域。
  4. 根据权利要求1-3中任一项的方法,其特征在于,所述方法还包括:
    若所述N个差异区域在所述第i帧界面图像的面积占比大于所述预设值,所述第一电子设备对所述第i帧界面图像进行编码,得到第二编码数据;
    所述第一电子设备向所述第二电子设备发送第二编码信息;其中,所述第二编码信息包括所述第二编码数据,所述第二编码信息用于触发所述第二电子设备基于所述第二编码信息显示第三投屏界面,所述第三投屏界面的内容是所述第i帧界面图像的镜像。
  5. 根据权利要求1-4中任一项的方法,其特征在于,所述方法还包括:
    所述第一电子设备响应于所述更新操作,生成第i帧的第一时间戳;所述第i帧的第一时间戳用于记录所述第一电子设备生成所述第i帧界面图像的时间;
    其中,所述第二编码信息还包括所述第i帧的第一时间戳。
  6. 根据权利要求1-5中任一项所述的方法,其特征在于,所述方法还包括:
    所述第一电子设备生成第二时间戳,并保存所述第二时间戳,所述第二时间戳用于记录所述第一电子设备生成所述第一界面图像的时间,所述第二时间戳是第1帧的第一基准时间戳,所述第一基准时间戳是所述第一电子设备记录投屏的基准时间;
    若所述N个差异区域在所述第i帧界面图像的面积占比大于所述预设值,所述第一电子设备确定第i+1帧的第一基准时间戳为所述第i帧的第一时间戳;
    若所述N个差异区域在所述第i帧界面图像的面积占比小于所述预设值,所述第一电子设备确定所述第i+1帧的第一基准时间戳为所述第i帧的第一基准时间戳;
    其中,所述第一编码信息还包括:所述第i帧的第一时间戳和所述第i帧的第一基准时间戳。
  7. 一种多屏协同的显示方法,其特征在于,应用于第二电子设备,所述第二电子设备连接第一电子设备,所述方法包括:
    所述第二电子设备显示第一投屏界面,所述第一投屏界面的内容是所述第一电子设备显示的第一界面图像的镜像;
    所述第二电子设备接收来自所述第一电子设备的第一编码信息;其中,所述第一编码信息包括第一编码数据、以及第i帧界面图像中相较于第i-1帧界面图像的N个差异区域的位置信息;所述N个差异区域中像素点的像素值与所述第i-1帧界面图像中对应像素点的像素值不同,N为正整数;所述第一编码数据是对所述N个差异区域的图像内容编码得到的;所述第i帧界面图像是所述第一电子设备响应于更新操作生成的,所述更新操作用于触发所述第一电子设备依次显示M帧界面图像,M为正整数,i在{1,……,M}中依次取值,第0帧界面图像为所述第一界面图像;
    所述第二电子设备解码所述第一编码数据,得到所述N个差异区域的图像内容;
    所述第二电子设备显示所述第二投屏界面;其中,所述第二投屏界面的内容是所述第i帧界面图像的镜像,所述第i帧界面图像是根据所述N个差异区域的图像内容和所述位置信息更新所述第i-1帧界面图像得到的。
  8. 根据权利要求7的方法,其特征在于,在所述第二电子设备接收来自所述第一电子设备的第一编码信息之前,所述方法还包括:
    所述第二电子设备接收用户对所述第一投屏界面的所述更新操作;
    响应于所述更新操作,向所述第一电子设备发送更新指令;所述更新指令用于触发所述第一电子设备依次显示所述M帧界面图像;
    其中,所述第二电子设备通过与所述第二电子设备连接的外部设备接收对所述第一投屏界面的所述更新操作;其中,所述外部设备包括所述第二电子设备的显示屏、 遥控器、鼠标或者手写笔中的任一种。
  9. 根据权利要求7或8的方法,其特征在于,所述方法还包括:
    所述第二电子设备接收来自所述第一电子设备的第二编码信息;其中,所述第二编码信息包括第二编码数据,所述第二编码数据是对所述第i帧界面图像编码得到的;
    所述第二电子设备对所述第二编码数据解码,得到所述第i帧界面图像;
    所述第二电子设备显示第三投屏界面;所述第三投屏界面的内容是所述第i帧界面图像的镜像。
  10. 根据权利要求9的方法,其特征在于,所述第二编码信息还包括:第i帧的第一时间戳,所述第i帧的第一时间戳用于记录所述第一电子设备生成所述第i帧界面图像的时间;
    在所述第二电子设备接收来自所述第一电子设备的第二编码信息之后,所述方法还包括:
    所述第二电子设备确定第i+1帧的第二基准时间戳为所述第i帧的第一时间戳,所述第二基准时间戳是所述第二电子设备记录投屏的基准时间。
  11. 根据权利要求7-10中任一项所述的方法,其特征在于,所述第一编码信息还包括:第i帧的第一时间戳和所述第i帧的第一基准时间戳,所述第一基准时间戳是所述第一电子设备记录投屏的基准时间;
    在所述第二电子设备接收来自所述第一电子设备的第一编码信息之后,所述第二电子设备解码所述第一编码数据,得到所述N个差异区域的图像内容之前,所述方法还包括:
    所述第二电子设备确定所述第i帧的第一时间戳所记录的时间晚于所述第i帧的第一基准时间戳所记录的时间,并且所述第i帧的第一基准时间戳所记录的时间等于所述第i帧的第二基准时间戳所记录的时间。
  12. 一种电子设备,其特征在于,所述电子设备是第一电子设备,所述第一电子设备包括:处理器和存储器;其中,所述存储器用于存储计算机程序代码,所述计算机程序代码包括计算机指令;所述处理器用于运行所述计算机指令,使得所述第一电子设备执行如权利要求1-6中任一项所述的方法。
  13. 一种电子设备,其特征在于,所述电子设备是第二电子设备,所述第二电子设备包括:处理器和存储器;
    其中,所述存储器用于存储计算机程序代码,所述计算机程序代码包括计算机指令;所述处理器用于运行所述计算机指令,使得所述第二电子设备执行如权利要求7-11中任一项所述的方法。
  14. 一种计算机可读存储介质,其特征在于,所述计算机可读存储介质上存储有计算机指令,当所述计算机指令在第一电子设备上运行时,使得所述第一电子设备执行如权利要求1-6中任一项所述的方法。
  15. 一种计算机可读存储介质,其特征在于,所述计算机可读存储介质上存储有计算机指令,当所述计算机指令在第二电子设备上运行时,使得所述第二电子设备执行如权利要求7-11中任一项所述的方法。
PCT/CN2021/134390 2020-11-30 2021-11-30 一种多屏协同的显示方法及电子设备 WO2022111727A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP21897227.1A EP4231133A4 (en) 2020-11-30 2021-11-30 COLLABORATIVE MULTI-SCREEN DISPLAY METHOD AND ELECTRONIC DEVICE
US18/254,472 US11977810B2 (en) 2020-11-30 2021-11-30 Multi-screen collaborative display method and electronic device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202011381303.4 2020-11-30
CN202011381303.4A CN114579068A (zh) 2020-11-30 2020-11-30 一种多屏协同的显示方法及电子设备

Publications (1)

Publication Number Publication Date
WO2022111727A1 true WO2022111727A1 (zh) 2022-06-02

Family

ID=81754052

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/134390 WO2022111727A1 (zh) 2020-11-30 2021-11-30 一种多屏协同的显示方法及电子设备

Country Status (4)

Country Link
US (1) US11977810B2 (zh)
EP (1) EP4231133A4 (zh)
CN (1) CN114579068A (zh)
WO (1) WO2022111727A1 (zh)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5612744A (en) * 1993-12-29 1997-03-18 Electronics And Telecommunications Research Institute Image signal transmitting system using image frames differences
CN103281539A (zh) * 2013-06-07 2013-09-04 华为技术有限公司 一种图像编、解码处理的方法、装置及终端
CN109218731A (zh) * 2017-06-30 2019-01-15 腾讯科技(深圳)有限公司 移动设备的投屏方法、装置及系统
CN111625211A (zh) * 2019-12-03 2020-09-04 蘑菇车联信息科技有限公司 一种屏幕投屏方法、装置、安卓设备及显示设备

Family Cites Families (26)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR0159575B1 (ko) * 1994-10-31 1999-01-15 배순훈 영역 분할 부호화 방식의 인접 영역간 불연속 처리 장치
JP3870882B2 (ja) * 2002-09-12 2007-01-24 ソニー株式会社 情報通信システム、情報通信装置及び情報通信方法、並びにコンピュータ・プログラム
US7739038B2 (en) * 2004-12-17 2010-06-15 Information Patterns Llc Methods and apparatus for geo-collaboration
GB2481612A (en) * 2010-06-30 2012-01-04 Skype Ltd Updating image regions in a shared image system
KR101741551B1 (ko) * 2010-12-20 2017-06-15 엘지전자 주식회사 이동 단말기 및 이것의 애플리케이션 제어 방법
US9143536B2 (en) * 2011-01-31 2015-09-22 Telefonaktiebolaget L M Ericsson (Publ) Determining a location address for shared data
US9462466B2 (en) * 2011-09-29 2016-10-04 Israel L'Heureux Gateway router supporting session hand-off and content sharing among clients of a local area network
US20130147903A1 (en) * 2011-12-07 2013-06-13 Reginald Weiser Systems and methods for including video traffic from external sources into a video conferencing
US8854325B2 (en) * 2012-02-29 2014-10-07 Blackberry Limited Two-factor rotation input on a touchscreen device
US8930457B2 (en) * 2012-06-19 2015-01-06 International Business Machines Corporation Proximity initiated co-browsing sessions
US9176703B2 (en) * 2012-06-29 2015-11-03 Lg Electronics Inc. Mobile terminal and method of controlling the same for screen capture
KR20140125671A (ko) * 2013-04-19 2014-10-29 삼성전자주식회사 입력 제어 방법 및 이를 지원하는 전자 장치
WO2014188050A1 (en) * 2013-05-21 2014-11-27 Multitouch Oy App sharing
US9537908B2 (en) * 2013-06-11 2017-01-03 Microsoft Technology Licensing, Llc Collaborative mobile interaction
KR102227661B1 (ko) * 2014-01-08 2021-03-15 삼성전자주식회사 화면 미러링 방법 및 그에 따른 장치
KR102219861B1 (ko) * 2014-05-23 2021-02-24 삼성전자주식회사 화면 공유 방법 및 그 전자 장치
US20150346937A1 (en) * 2014-05-27 2015-12-03 Breezio Inc. Collaborative system and method with drag along browsing and reading position approximation on a display device
US9801219B2 (en) * 2015-06-15 2017-10-24 Microsoft Technology Licensing, Llc Pairing of nearby devices using a synchronized cue signal
US20180121663A1 (en) * 2016-11-01 2018-05-03 Microsoft Technology Licensing, Llc Sharing Protection for a Screen Sharing Experience
US10375125B2 (en) * 2017-04-27 2019-08-06 Cisco Technology, Inc. Automatically joining devices to a video conference
CN109996072B (zh) * 2018-01-03 2021-10-15 华为技术有限公司 视频图像的处理方法及装置
US11416205B2 (en) * 2019-04-16 2022-08-16 Apple Inc. Systems and methods for initiating and interacting with a companion-display mode for an electronic device with a touch-sensitive display
US11335187B2 (en) * 2019-06-20 2022-05-17 Here Global B.V. Dynamic privacy-sensitive operating modes
CN110865782B (zh) * 2019-09-29 2024-01-30 华为终端有限公司 数据传输方法、装置及设备
CN111459428B (zh) * 2020-02-28 2023-01-06 通彩视听科技(上海)有限公司 显示界面同步方法、装置、计算机设备及存储介质
CN111580765B (zh) * 2020-04-27 2024-01-12 Oppo广东移动通信有限公司 投屏方法、投屏装置、存储介质、被投屏设备与投屏设备

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5612744A (en) * 1993-12-29 1997-03-18 Electronics And Telecommunications Research Institute Image signal transmitting system using image frames differences
CN103281539A (zh) * 2013-06-07 2013-09-04 华为技术有限公司 一种图像编、解码处理的方法、装置及终端
CN109218731A (zh) * 2017-06-30 2019-01-15 腾讯科技(深圳)有限公司 移动设备的投屏方法、装置及系统
CN111625211A (zh) * 2019-12-03 2020-09-04 蘑菇车联信息科技有限公司 一种屏幕投屏方法、装置、安卓设备及显示设备

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP4231133A4

Also Published As

Publication number Publication date
EP4231133A1 (en) 2023-08-23
CN114579068A (zh) 2022-06-03
US11977810B2 (en) 2024-05-07
EP4231133A4 (en) 2024-04-24
US20240053946A1 (en) 2024-02-15

Similar Documents

Publication Publication Date Title
WO2021004381A1 (zh) 一种投屏显示方法及电子设备
WO2020216156A1 (zh) 投屏方法和计算设备
WO2021017836A1 (zh) 控制大屏设备显示的方法、移动终端及第一系统
WO2022052773A1 (zh) 多窗口投屏方法及电子设备
US20230162324A1 (en) Projection data processing method and apparatus
WO2019114724A1 (zh) 一种相机缩略图生成的方法、移动终端及存储介质
WO2022161227A1 (zh) 图像处理方法、装置、图像处理芯片和电子设备
CN108769738B (zh) 视频处理方法、装置、计算机设备和存储介质
CN112995727A (zh) 一种多屏协同方法、系统及电子设备
KR20170043324A (ko) 전자 장치 및 전자 장치의 영상 인코딩 방법
US20220358688A1 (en) Page drawing control method, apparatus, and device
WO2022237259A1 (zh) 终端设备的待机方法和终端设备
US20220414178A1 (en) Methods, apparatuses and systems for displaying alarm file
WO2022111727A1 (zh) 一种多屏协同的显示方法及电子设备
CN115857850A (zh) 投屏异常处理方法及电子设备
WO2019153286A1 (zh) 一种图像分类方法及设备
CN109714628B (zh) 播放音视频的方法、装置、设备、存储介质及系统
US10148874B1 (en) Method and system for generating panoramic photographs and videos
EP2726995B1 (en) Methods, apparatuses and computer program products for improving network transmission by reducing memory copy overhead by providing direct access to data
CN114697731A (zh) 投屏方法、电子设备及存储介质
WO2022199352A1 (zh) 一种息屏显示方法及电子设备
WO2024067428A1 (zh) 高分辨率高帧率摄像方法和图像处理装置
CN115543649B (zh) 一种数据获取方法及电子设备
WO2023035837A1 (zh) 数据同步方法及设备
CN115022310B (zh) 设备在线时长获取方法、系统、电子设备及存储介质

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21897227

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2021897227

Country of ref document: EP

Effective date: 20230519

NENP Non-entry into the national phase

Ref country code: DE