WO2023088211A1 - Display picture synchronization method, system and electronic device - Google Patents

Display picture synchronization method, system and electronic device

Info

Publication number
WO2023088211A1
WO2023088211A1 PCT/CN2022/131734 CN2022131734W WO2023088211A1 WO 2023088211 A1 WO2023088211 A1 WO 2023088211A1 CN 2022131734 W CN2022131734 W CN 2022131734W WO 2023088211 A1 WO2023088211 A1 WO 2023088211A1
Authority
WO
WIPO (PCT)
Prior art keywords
frame
picture frame
display
receiving end
picture
Prior art date
Application number
PCT/CN2022/131734
Other languages
English (en)
French (fr)
Inventor
黄宇
胡诗尧
吴志鹏
赵力学
Original Assignee
华为技术有限公司
Priority date
Filing date
Publication date
Application filed by 华为技术有限公司
Priority to EP22894745.3A (published as EP4408001A1)
Publication of WO2023088211A1

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14Digital output to display device ; Cooperation and interconnection of the display device with other functional units
    • G06F3/1423Digital output to display device ; Cooperation and interconnection of the display device with other functional units controlling a plurality of local displays, e.g. CRT and flat panel display
    • G06F3/1446Digital output to display device ; Cooperation and interconnection of the display device with other functional units controlling a plurality of local displays, e.g. CRT and flat panel display display composed of modules, e.g. video walls
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/4302Content synchronisation processes, e.g. decoder synchronisation
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2300/00Aspects of the constitution of display devices
    • G09G2300/02Composition of display devices
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2370/00Aspects of data communication
    • G09G2370/04Exchange of auxiliary data, i.e. other than image data, between monitor and graphics controller
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/12Synchronisation between the display unit and other units, e.g. other display units, video-disc players

Definitions

  • the present application belongs to the field of display technology, and in particular relates to a display screen synchronization method, system and electronic equipment.
  • the embodiments of the present application provide a display screen synchronization method, system, and electronic device, which can effectively avoid the occurrence of screen frame asynchrony during multi-screen display.
  • the first aspect of the embodiments of the present application provides a display screen synchronization method, which is applied to a receiving end device.
  • the receiver device includes frame loss information, and the frame loss information is used to record identifiers of picture frames not to be displayed.
  • the method may include the following steps: the receiving end device establishes a communication connection with the source end device; the receiving end device receives the first picture frame sent by the source end device and the first designated display time corresponding to the first picture frame; if the receiving end device determines that it can display the first picture frame according to the first designated display time and the identifier of the first picture frame is not in the frame loss information, the receiving end device displays the first picture frame according to the first designated display time; if the receiving end device determines that it cannot display the first picture frame according to the first designated display time, it adds the identifier of the first picture frame to the frame loss information and notifies the identifier of the first picture frame to other devices, where the other devices are devices that display picture frames synchronously with the receiving end device.
  • based on the first designated display time and the frame loss information, the receiving end device can determine whether the first picture frame can be displayed and at which time it should be displayed;
  • the receiver device may notify other devices of the identifier of the first picture frame when it has received the first picture frame but cannot display it. In this way, the synchronous display of the display picture is realized purely in software, without relying on a wired hardware connection, and is therefore low in cost and easy to implement.
  • the receiver device includes frame loss information, which can be understood as maintaining frame loss information in the receiver device, for example, allocating storage space for frame loss information, dynamically updating frame loss information, and the like.
  • the method further includes the following step: if the identifier of the first picture frame is included in the frame loss information, the receiving end device does not display the first picture frame.
  • if the identifier of the first picture frame is included in the frame loss information, this indicates that another device cannot display the first picture frame, so the receiving end device does not display it either. In this way, the situation where other devices skip the first picture frame while the receiving end device still displays it is avoided, which helps keep the displayed pictures synchronized.
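  • as an illustration only (the names below are assumptions, not taken from the application), the decision rule described in the preceding paragraphs can be sketched in Python as follows:

```python
# Minimal sketch of the receiving-end decision rule described above.
# frame_loss_ids, can_display_by(), display(), notify_peers() are assumed helpers.

def handle_picture_frame(frame_id, frame_data, designated_time,
                         frame_loss_ids, can_display_by, display, notify_peers):
    """Decide whether a received picture frame is displayed or dropped."""
    if frame_id in frame_loss_ids:
        # Some device already reported that it cannot show this frame: skip it.
        return
    if can_display_by(designated_time):
        # There is enough time to decode and present the frame on schedule.
        display(frame_data, designated_time)
    else:
        # Too late to display: record the frame and tell the peer devices,
        # so no device shows a frame that the others will skip.
        frame_loss_ids.add(frame_id)
        notify_peers(frame_id)
```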
  • the method further includes the following steps: the receiving end device determines a lost frame identifier according to the identifiers corresponding to the picture frames it has successfully received, where the lost frame identifier is the identifier corresponding to a picture frame that the receiving end device has not successfully received; the receiving end device adds the lost frame identifier to the frame loss information and notifies the lost frame identifier to other devices.
  • in an actual application scenario, the receiving end device may fail to receive some picture frames (that is, lost frames). In this case, the receiving end device adds the lost frame identifiers to the frame loss information and notifies them to other devices. In this way, the situation where the receiving end device does not display a missing frame while other devices still display it can be avoided, which helps keep the displayed pictures synchronized.
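  • the application does not prescribe a particular way of detecting lost frames; as one assumed realisation, gaps in consecutive frame numbers can be treated as lost frames, for example:

```python
def detect_lost_frames(last_received_id, new_id):
    """Return the frame numbers skipped between the previously received frame
    and the newly received one (assumes consecutively numbered frames)."""
    return list(range(last_received_id + 1, new_id))

# Example: frames 7 and 10 arrived, so 8 and 9 are treated as lost frames,
# added to the frame loss information and reported to the other devices.
lost = detect_lost_frames(7, 10)   # -> [8, 9]
```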
  • the step of the receiving end device determining that it can display the first picture frame according to the first designated display time specifically includes: the receiving end device obtains the estimated decoding delay required for decoding the first picture frame, and determines the estimated display time corresponding to the first picture frame according to the estimated decoding delay; if the estimated display time is earlier than or equal to the first designated display time, the receiving end device determines that it can display the first picture frame according to the first designated display time.
  • the step of the receiving end device determining that it cannot display the first picture frame according to the first designated display time specifically includes: the receiving end device obtains the estimated decoding delay required for decoding the first picture frame, and determines the estimated display time corresponding to the first picture frame according to the estimated decoding delay; if the estimated display time is later than the first designated display time, the receiving end device determines that it cannot display the first picture frame according to the first designated display time.
  • the receiving end device may first determine, after receiving the first picture frame, whether it has enough time to display it. If it judges that there is enough time, the receiving end device determines that the first picture frame can be displayed according to the first designated display time and, optionally, decodes the first picture frame. If it judges that it is too late to display the first picture frame, the receiving end device notifies other devices of the identifier of the first picture frame. In this way, the situation where the receiving end device does not display the first picture frame while other devices still display it can be avoided, which helps keep the displayed pictures synchronized.
  • optionally, the receiving end device may choose not to decode the first picture frame, so as to avoid wasting computing resources on decoding a picture frame that will not be displayed, which helps save the computing resources of the receiving end device.
  • this does not mean that the receiving end device never decodes the first picture frame. That is to say, although it is judged too late to display the first picture frame, if the first picture frame is a frame that other picture frames rely on for display, it still needs to be decoded to ensure that those other picture frames can be displayed normally.
  • the method further includes the following steps: the receiving end device performs clock synchronization with the source end device, and determines the clock offset between the system time of the receiving end device and the system time of the source end device; the step of determining that the first picture frame can be displayed according to the first designated display time if the estimated display time is earlier than or equal to the first designated display time specifically includes: correcting the estimated display time by using the clock offset to obtain a corrected estimated display time; if the corrected estimated display time is earlier than or equal to the first designated display time, determining that the first picture frame can be displayed according to the first designated display time.
  • in this way, the receiving end device displays the first picture frame according to the first designated display time corrected by using the clock offset, which helps avoid picture frames becoming out of sync due to the clock offset.
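  • as a rough sketch of the feasibility check described above (all names and the sign convention of the offset are assumptions, not taken from the application):

```python
import time

def can_display_on_time(designated_time, estimated_decode_delay, clock_offset=0.0):
    """Return True if the frame is expected to be ready no later than its
    designated display time (all values in seconds).

    The designated time is expressed in the source device's clock, so the local
    estimate is corrected by clock_offset (source time minus local time) before
    the comparison; the sign convention here is illustrative only.
    """
    estimated_display_time = time.time() + estimated_decode_delay
    corrected_estimate = estimated_display_time + clock_offset
    return corrected_estimate <= designated_time
```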
  • the step of the receiving end device obtaining the estimated decoding delay required for decoding the first picture frame specifically includes: the receiving end device obtains the estimated decoding delay required for the first picture frame according to the actual decoding delay consumed in decoding historical picture frames; wherein the historical picture frames include at least one picture frame whose identifier precedes the identifier of the first picture frame.
  • the decoding delay required by the receiving end device to decode picture frames is related to factors such as the processor performance of the receiving end device, the current processor occupancy rate, and the codec algorithm used, so it tends to be continuous over time. Therefore, the receiving end device can obtain (for example, estimate or predict) the estimated decoding delay it will take to decode the first picture frame according to the decoding delay actually consumed in decoding historical picture frames (that is, the actual decoding delay). In this way, the receiving end device can judge more accurately whether it has time to display the first picture frame.
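  • the application does not fix how this estimate is computed; a moving average over the most recent actual decoding delays is one plausible choice, sketched below (class and parameter names are assumptions):

```python
from collections import deque

class DecodeDelayEstimator:
    """Estimate the next decoding delay from recent actual decoding delays."""

    def __init__(self, window=30):
        self.history = deque(maxlen=window)   # actual delays of historical frames

    def record(self, actual_delay):
        """Record the actual decoding delay of a frame that was just decoded."""
        self.history.append(actual_delay)

    def estimate(self, default=0.020):
        """Return the estimated decoding delay for the next frame (seconds)."""
        if not self.history:
            return default        # no history yet: fall back to a default guess
        return sum(self.history) / len(self.history)
```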
  • the method further includes the following steps: the receiving end device determines its current decoding delay according to the actual decoding delay of at least one successfully received picture frame, and sends the current decoding delay to the source end device; correspondingly, the source end device receives the current decoding delay.
  • the method further includes the following steps: the receiving end device determines its current transmission delay according to the actual transmission delay of at least one successfully received picture frame, and sends the current transmission delay to the source end device; correspondingly, the source end device receives the current transmission delay.
  • the number of receiving end devices is at least one, and the number of current decoding delays and/or current transmission delays received by the source end device is also at least one.
  • the source device may determine the current delay according to at least one current decoding delay and/or current transmission delay, for example, determine the current delay according to a maximum value of at least one current decoding delay and/or current transmission delay.
  • the source end device specifies the designated display time for subsequent picture frames based on the current delay. Because the source end device determines the current delay based on the current decoding delay and/or the current transmission delay of the receiving end device, the current delay can change dynamically and matches the actual network transmission conditions of the application scenario and/or the actual decoding delay of the receiving end device.
  • in this way, the delay of the first frame does not need to be set to a relatively large value (such as 200 ms or 600 ms), but can be set to a relatively small value, which helps reduce the start-up delay and improve the user experience.
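  • as a sketch of how the source end could derive a designated display time from the reported delays (taking the worst reported values, as described later in the text; the safety margin is an assumed illustrative parameter):

```python
def designated_display_time(start_time, pts_i, reported_trans_delays,
                            reported_decode_delays, margin=0.005):
    """Compute playTime_i = startTime + pts_i + delayTime_i, where delayTime_i
    follows the worst reported delays plus a small assumed margin (seconds)."""
    delay_time_i = (max(reported_trans_delays, default=0.0)
                    + max(reported_decode_delays, default=0.0)
                    + margin)
    return start_time + pts_i + delay_time_i
```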
  • the method further includes the following steps: the receiving end device receives the lost frame identifier and/or the actively discarded frame identifier sent by other devices, and adds the lost frame identifier and/or the actively discarded frame identifier sent by other devices to the frame loss information. In some embodiments, the lost frame identifier sent by other devices may include the identifier corresponding to a picture frame that those devices have not successfully received, as determined by those devices according to the identifiers of the picture frames they have successfully received; the actively discarded frame identifier sent by other devices includes the identifier of a picture frame that those devices determine cannot be displayed according to the second designated display time. In some embodiments, the second designated display time may be the designated display time corresponding to the picture frame corresponding to the actively discarded frame identifier sent by the other devices.
  • the receiving end device will not display the picture frames that cannot be displayed by other devices (that is, lost frames and/or actively discarded frames). In this way, the situation that other devices do not display the above-mentioned lost frames and/or actively discarded frames, but the receiver device still displays the above-mentioned lost frames and/or actively discarded frames can be avoided, which helps to realize the synchronous display of the display screen.
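  • handling such a notification on the receiving end amounts to merging the reported identifiers into the local frame loss information, for example (illustrative sketch, representing the frame loss information as a set):

```python
def on_peer_drop_notification(frame_loss_ids, reported_ids):
    """Add lost / actively discarded frame identifiers reported by another
    device to the local frame loss information (a set of identifiers here)."""
    frame_loss_ids.update(reported_ids)
    # Any frame whose identifier is now in frame_loss_ids will be skipped,
    # keeping all devices that display synchronously in step.
```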
  • the step of the receiving end device receiving the first picture frame sent by the source end device and the first designated display time corresponding to the first picture frame specifically includes: the receiving end device receives a first message, wherein the first message carries the first picture frame and the first designated display time; optionally, the first message also carries the sending time of the first message and the identifier of the first picture frame.
  • the source end device packs the relevant information of the first picture frame (for example, the first picture frame, the first designated display time, and the identifier of the first picture frame) into the first message and sends it, which helps the receiving end device accurately determine the information corresponding to the first picture frame.
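  • a possible wire format for such a message is sketched below; the field names, ordering, and encoding are illustrative assumptions, since the exact arrangement is explicitly left open (see the remark about message M1 later in the text):

```python
import json
import time

def pack_frame_message(frame_id, designated_time, payload_bytes):
    """Pack a picture frame and its metadata into one message: a 4-byte
    length-prefixed JSON header followed by the raw frame payload."""
    header = {
        "frameID": frame_id,          # identifier of the picture frame
        "sendTime": time.time(),      # source system time when the message is sent
        "playTime": designated_time,  # designated display time for this frame
        "payloadLen": len(payload_bytes),
    }
    header_bytes = json.dumps(header).encode("utf-8")
    return len(header_bytes).to_bytes(4, "big") + header_bytes + payload_bytes
```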
  • the sink device and the source device are in the same local area network.
  • the receiving end device and the source end device can conveniently perform information exchange.
  • the identifier of the picture frame includes a frame number of the picture frame.
  • the identifier of the picture frame may also include other content than the frame number, which is not limited in this embodiment of the present application.
  • the second aspect of the embodiments of the present application provides a display screen synchronization method, which is applied to a source device.
  • the source device is used to obtain picture frames and send the obtained picture frames to the sink device.
  • the method may include the following steps: the source end device establishes a communication connection with the receiving end device; the source end device sends the first picture frame and the first designated display time corresponding to the first picture frame to the receiving end device; wherein the first designated display time is determined by the source end device according to the current transmission delay and/or the current decoding delay, the current transmission delay is related to the actual transmission delay of historical picture frames, the current decoding delay is related to the actual decoding delay of the receiving end device decoding historical picture frames, and the historical picture frames include at least one picture frame whose identifier precedes the identifier of the first picture frame.
  • the source end device can determine the current delay according to the current transmission delay and/or the current decoding delay, and then determine the first designated display time of the first picture frame according to the determined current delay.
  • the first designated display time of the first picture frame is therefore not a preset fixed value, but changes dynamically and is associated with the current network transmission status and/or the decoding capability of the current receiving end device. As a result, the first designated display time specified by the source end device for the first picture frame is an appropriate value that allows the receiving end device to display as many frames as possible, which helps prevent situations where the receiving end device cannot receive certain picture frames in time or cannot display certain picture frames.
  • in this way, the delay of the first frame does not need to be set to a relatively large value (such as 200 ms or 600 ms), but can be set to a relatively small value, which helps reduce the start-up delay and improve the user experience.
  • the source device is also used for synchronously displaying picture frames with the receiving device.
  • the source end device includes frame loss information, and the frame loss information is used to record the identifiers of the picture frames that are not displayed.
  • the method also includes the following steps: if the frame number of the first picture frame is not in the frame loss information, the source end device displays the first picture frame according to the first designated display time; if the frame number of the first picture frame is in the frame loss information, the source end device does not display the first picture frame.
  • if the frame number of the first picture frame is not included in the frame loss information, it means that each receiving end device has successfully received the first picture frame and has determined that it can display the first picture frame according to the first designated display time. In this case, if the source end device also participates in the synchronous display of picture frames, it can display the first picture frame.
  • if the frame number of the first picture frame is included in the frame loss information, it indicates that at least one receiving end device failed to receive the first picture frame, or successfully received the first picture frame but determined that it cannot be displayed according to the first designated display time. In this case, if the source end device also participates in the synchronous display of picture frames, it does not display the first picture frame.
  • the above-mentioned source device includes frame loss information, which can be understood as maintaining frame loss information in the source device, for example, allocating storage space for the frame loss list, and dynamically updating the frame loss information.
  • the method further includes the following steps: the source end device receives the lost frame identifier and/or the actively discarded frame identifier sent by the receiving end device, and adds the lost frame identifier and/or the actively discarded frame identifier sent by the receiving end device to the frame loss information; wherein the lost frame identifier sent by the receiving end device includes the identifier of a picture frame that the receiving end device has not successfully received, as determined by the receiving end device according to the identifiers of the picture frames it has successfully received; the actively discarded frame identifier sent by the receiving end device includes the identifier of a picture frame that the receiving end device determines cannot be displayed according to the second designated display time, where the second designated display time is the designated display time corresponding to the picture frame corresponding to the actively discarded frame identifier sent by the receiving end device.
  • the source device will not display picture frames that cannot be displayed by the sink device (that is, lost frames and/or actively discarded frames). In this way, the situation that the receiving device does not display the above-mentioned lost frames and/or actively discarded frames, but the source device still displays the above-mentioned lost frames and/or actively discarded frames can be avoided, which helps to realize the synchronous display of the display screen.
  • the method further includes the following steps: the source end device receives the current transmission delay and/or the current decoding delay sent by at least one receiving end device; the source end device determines the maximum value among the current transmission delays of the at least one receiving end device as the current transmission delay, and/or determines the maximum value among the current decoding delays of the at least one receiving end device as the current decoding delay.
  • the first designated display time is determined by the source end device according to the current transmission delay and/or the current decoding delay, and the current decoding delay and/or the current transmission delay is information sent by the receiving end device to the source end device. Determining the maximum value among them as the current decoding delay and/or the current transmission delay helps the first designated display time specified by the source end device be as appropriate as possible, and ensures that every receiving end device can receive the first picture frame and display it in time.
  • the current transmission delay of a receiving end device is determined by the receiving end device according to the actual transmission delay of the historical picture frames it has received, and the current decoding delay of a receiving end device is determined by the receiving end device according to the actual decoding delay of the historical picture frames it has received.
  • in this way, the current transmission delay and the current decoding delay can be made to match the actual network transmission conditions of the application scenario and/or the actual decoding delay of the receiving end device, which helps ensure that the first designated display time specified by the source end device is an appropriate time.
  • the method further includes the following step: the source end device and the receiving end device perform clock synchronization. Due to the inherent physical properties of hardware devices, there is inevitably a clock offset between the system time of the source end device and the system time of the receiving end device. Performing clock synchronization enables the receiving end device to obtain and save the clock offset, so that it can display the first picture frame according to the first designated display time corrected by using the clock offset, which helps avoid picture frames becoming out of sync due to the clock offset.
  • the source device and the sink device are in the same local area network.
  • the receiving end device and the source end device can conveniently perform information exchange.
  • the identifier of the picture frame includes a frame number of the picture frame. It should be understood that the identifier of the picture frame (such as the identifier of the first picture frame above) may also include other content than the frame number, which is not limited in this embodiment of the present application.
  • a third aspect of the embodiments of the present application provides a display screen synchronization method, which is applied to a system including a source device and at least one sink device.
  • the receiver device is a device that performs synchronous display of picture frames
  • the receiver device and the source device are devices that perform synchronous display of picture frames.
  • the method may include the following steps: the source end device sends the first picture frame and the first designated display time corresponding to the first picture frame to each receiving end device; if every receiving end device has successfully received the first picture frame and has determined that it can display the first picture frame according to the first designated display time, then each device that performs synchronous display of picture frames displays the first picture frame according to the first designated display time; if there is a first receiving end device that fails to receive the first picture frame, or that successfully receives the first picture frame but determines that it cannot be displayed according to the first designated display time, then none of the devices that perform synchronous display of picture frames displays the first picture frame.
  • in other words, only when every receiving end device can display the first picture frame according to the first designated display time do the devices that perform synchronous display of picture frames display it; as long as one receiving end device cannot display the first picture frame according to the first designated display time, none of the devices that perform synchronous display of picture frames displays it.
  • in this way, the synchronous display of the display picture is guaranteed.
  • each of the above-mentioned devices for synchronously displaying picture frames includes frame loss information, and the frame loss information is used to record the identifiers of picture frames that are not displayed;
  • if there is a first receiving end device that fails to receive the first picture frame, or that successfully receives the first picture frame but determines that it cannot be displayed according to the first designated display time, the step of each device that performs synchronous display of picture frames not displaying the first picture frame specifically includes: the first receiving end device adds the identifier of the first picture frame to its frame loss information, and notifies the identifier of the first picture frame to the other devices that perform synchronous display of picture frames.
  • each of the above-mentioned devices that perform synchronous display of picture frames includes frame loss information, which can be understood as each device maintaining frame loss information, for example, allocating storage space for the frame loss information, dynamically updating the frame loss information, and the like.
  • the method further includes the following steps: the source end device determines the first designated display time according to the current transmission delay and/or the current decoding delay; wherein the current transmission delay is related to the actual transmission delay of historical picture frames, the current decoding delay is related to the actual decoding delay of the receiving end device decoding historical picture frames, and the historical picture frames include at least one picture frame whose identifier precedes the identifier of the first picture frame.
  • the source device may determine the current delay according to the current transmission delay and/or the current decoding delay, and then determine the first designated display moment of the first picture frame according to the determined current delay.
  • the current delay is not a preset fixed value, but changes dynamically and is associated with the current network transmission status and/or the decoding capability of the current receiving end device. Therefore, the first designated display time specified by the source end device for the first picture frame is an appropriate value that allows the receiving end device to display as many frames as possible, which helps prevent situations where the receiving end device cannot receive certain picture frames in time or cannot display certain picture frames.
  • in this way, the delay of the first frame does not need to be set to a relatively large value (such as 200 ms or 600 ms), but can be set to a relatively small value, which helps reduce the start-up delay and improve the user experience.
  • the method further includes the following steps: the receiving end device determines its current transmission delay according to the actual transmission delay of the historical picture frames it has received, and/or determines its current decoding delay according to the actual decoding delay of the historical picture frames it has received; the receiving end device sends the determined current transmission delay and/or current decoding delay to the source end device.
  • the step of the source end device determining the first designated display time according to the current transmission delay and/or the current decoding delay specifically includes: the source end device determines the maximum value among the received current transmission delays of the at least one receiving end device as the current transmission delay, and/or determines the maximum value among the received current decoding delays of the at least one receiving end device as the current decoding delay.
  • the first designated display time is determined by the source end device according to the current transmission delay and/or the current decoding delay, and the current decoding delay and/or the current transmission delay is information sent by the receiving end device to the source end device. Determining the maximum value among them as the current decoding delay and/or the current transmission delay helps the first designated display time specified by the source end device be as appropriate as possible, and ensures that every receiving end device can receive the first picture frame and display it in time.
  • the method further includes the following step: the source end device and the receiving end device perform clock synchronization. Due to the inherent physical properties of hardware devices, there is inevitably a clock offset between the system time of the source end device and the system time of the receiving end device. Performing clock synchronization enables the receiving end device to obtain and save the clock offset, so that it can display the first picture frame according to the first designated display time corrected by using the clock offset, which helps avoid picture frames becoming out of sync due to the clock offset.
  • the source device and the sink device are in the same local area network.
  • the receiving end device and the source end device can conveniently perform information exchange.
  • the fourth aspect of the embodiments of the present application provides a receiving end device, where the receiving end device includes a memory, a processor, and a computer program stored in the memory and operable on the processor; when the processor executes the computer program, the receiving end device implements the method described in the first aspect or any possible implementation manner of the first aspect.
  • the fifth aspect of the embodiment of the present application provides a source device.
  • the source device includes a memory, a processor, and a computer program stored in the memory and operable on the processor.
  • the processor is configured to execute the computer program, so that the source device implements the method described in the second aspect or any possible implementation manner of the second aspect.
  • the sixth aspect of the embodiment of the present application provides a display screen synchronization system.
  • the display screen synchronization system includes a source device and at least one sink device, wherein the source device and the sink device are respectively configured to perform the steps performed by the source device and the receiving end device in the third aspect or any possible implementation manner of the third aspect.
  • the seventh aspect of the embodiments of the present application provides a computer-readable storage medium, where the computer-readable storage medium is configured to store a computer program, and when the computer program is executed by a processor, the method described in the first aspect or any possible implementation manner of the first aspect is implemented.
  • the eighth aspect of the embodiments of the present application provides a computer program product, which is configured to enable the receiving end device to execute the method described in the first aspect or any possible implementation manner of the first aspect when run on the receiving end device.
  • the ninth aspect of the embodiments of the present application provides a chip system, where the chip system includes a memory and a processor, and the processor is configured to execute the computer program stored in the memory, so as to implement the method described in the first aspect or any possible implementation manner of the first aspect, or implement the method described in the second aspect or any possible implementation manner of the second aspect.
  • FIG. 1 is a schematic diagram of an application scenario of display screen synchronization provided by an embodiment of the present application
  • FIG. 2 is a schematic diagram of another application scenario of display screen synchronization provided by an embodiment of the present application
  • Fig. 3 is a schematic diagram of a multi-screen system provided by an embodiment of the present application.
  • Fig. 4 is a schematic diagram of a designated display time provided by an embodiment of the present application.
  • Fig. 5 is a schematic diagram of device interaction of a display screen synchronization method provided by an embodiment of the present application.
  • Fig. 7 is a schematic diagram of another designated display time provided by an embodiment of the present application.
  • FIG. 8 is a flow chart of a method for obtaining and processing a lost frame number by a receiver device according to an embodiment of the present application
  • Fig. 9 is a flow chart of a method for obtaining and processing actively discarded frame numbers by a receiver device provided in an embodiment of the present application.
  • Fig. 10 is a flow chart of a method for synchronously displaying a display screen by a receiving device according to an embodiment of the present application.
  • the term “if” may be construed, depending on the context, as “when” or “once” or “in response to determining” or “in response to detecting”.
  • the phrase “if determined” or “if [the described condition or event] is detected” may be construed, depending on the context, to mean “once determined” or “in response to the determination” or “once [the described condition or event] is detected” or “in response to detection of [the described condition or event]”.
  • references to "one embodiment” or “some embodiments” or the like in the specification of the present application means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the present application.
  • appearances of the phrases “in one embodiment,” “in some embodiments,” “in other embodiments,” “in still other embodiments,” etc. in various places in this specification do not necessarily all refer to the same embodiment, but mean “one or more but not all embodiments” unless specifically stated otherwise.
  • the terms “including”, “comprising”, “having” and variations thereof mean “including but not limited to”, unless specifically stated otherwise.
  • frame number is used as a specific example of “frame identification”, but this does not constitute a limitation to the solution provided by the embodiment of the present application. Those skilled in the art may replace the frame number in the example with any other type of frame identifier, without going beyond the scope of the solutions provided by the embodiments of the present application.
  • FIG. 1 exemplarily shows a schematic diagram of an application scenario of display screen synchronization provided by an embodiment of the present application.
  • the scene shown in Figure 1 is the synchronization of the display screen of the splicing screen.
  • the splicing screen can also be called a split screen, a combined screen, etc., and can usually be used in scenes that require large-scale display, such as outdoor or shopping mall billboards, airport flight information display etc.
  • the splicing screen can include multiple display devices, such as 4, 6, or 9, each display device displays a part of the screen, and the display screens of multiple display devices are spliced to form a complete display screen.
  • the splicing screen includes display device A, display device B, display device C, and display device D, and the four display devices respectively display the upper left part, upper right part, lower left part, and lower right part of the display picture shown in Figure 1.
  • the multiple display devices in the splicing screen need to synchronize the display pictures.
  • FIG. 2 exemplarily shows another application scenario of display screen synchronization provided by the embodiment of the present application.
  • the scenario shown in FIG. 2 is the synchronization of the display images of multiple display devices displaying the same image, which can generally be applied to scenarios such as teaching and conferences.
  • the display device E and the display device F need to display the same picture at the same time, so the display device E and the display device F need to synchronize the display pictures.
  • FIG. 3 exemplarily shows a schematic diagram of a multi-screen system provided by an embodiment of the present application.
  • the multi-screen system 300 may include splicing screens, and/or, one or more display devices.
  • the multi-screen system 300 described in the embodiment of the present application may include only a splicing screen, where the splicing screen may include at least two display devices, each of which displays a part of the picture, and the display pictures of the display devices are spliced to form a complete display picture; or it may include only one or more display devices, where the one or more display devices display the same display picture; or it may include both a splicing screen and one or more display devices that display the same display picture.
  • the multi-screen system 300 described in the embodiment of the present application may be any system that includes at least two display devices and needs to synchronize display screens.
  • one display device can be arbitrarily selected from the multi-screen system as a source device, and other display devices as sink devices.
  • the source device is used to obtain the display picture, for example, from the network, locally, or from other devices, and send the obtained display picture to the receiving end device; the receiving end device is used to display the received picture.
  • display device A can be used as the source device, and display device B, display device C, and display device D can be used as receiving end devices; display device A can then establish wired connections with display device B, display device C, and display device D respectively, and transmit the display picture to them through the wired connections. Since the transmission delay of a wired connection is usually very small and can be ignored, time synchronization may not be performed, and display device B, display device C, and display device D can directly decode and display the content after receiving it.
  • the solution of realizing synchronization of display images through wired connection requires additional hardware cost and complicated wiring.
  • the display devices in the multi-screen system can all work as receiver devices.
  • the display devices in the multi-screen system are connected to a source device other than the multi-screen system, and the display devices obtain the display picture from that source device.
  • the source device may not necessarily display the display screen synchronously with the display device in the multi-screen system, but may work as a display screen providing device. If this implementation method adopts the hardware transmission solution of wired connection, there will also be problems of additional hardware cost and complicated wiring as mentioned in the previous paragraph.
  • the embodiment of the present application provides a display screen synchronization method implemented by software: the source end device specifies the display time of each frame of the display picture, and the receiving end device displays each frame of the display picture according to the display time specified by the source end device.
  • the designated display time of the first frame display screen is denoted as playTime 1
  • the designated display time of the second frame display screen is denoted as playTime 2 , . . . , and so on.
  • the source end device may designate the display time of each frame according to its own system time, and the receiving end device then displays each frame of the display picture at the designated display time, so as to realize the synchronization of the display picture.
  • the receiver device performing display according to the specified display time described in the embodiments of the present application may refer to the receiver device performing display according to the corrected specified display time.
  • the time when the source device acquires the first frame of the display screen is recorded as startTime.
  • startTime may be the system time of the source device when the source device obtains the first display frame from the network, locally, or from other devices.
  • the delay time of the i-th frame display picture relative to the first frame display picture is recorded as pts i , where "pts" may represent a presentation time stamp.
  • the time interval between two adjacent display frames is 1/60 ⁇ 16.67 milliseconds (ms).
  • that is, the i-th frame display picture is expected to be displayed at the time of startTime+pts i .
  • from the time the source end display device acquires the i-th frame display picture until the receiving end display device can display it, the i-th frame display picture needs to be transmitted and decoded, which takes a certain amount of time. If the display content is transmitted wirelessly between the source end display device and the receiving end display device, the transmission time is also affected by the transmission status of the wireless network and fluctuates to some extent.
  • assuming the time when the source device obtains the first frame display picture is startTime, the source device specifies the display time playTime 1 of the first frame display picture as startTime+pts 1 +delayTime, the display time playTime 2 of the second frame display picture as startTime+pts 2 +delayTime, the display time playTime 3 of the third frame display picture as startTime+pts 3 +delayTime, ..., and so on.
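  • as a concrete illustration (the numbers are assumed, not taken from the application): at 60 frames per second, pts i is roughly (i-1) × 16.67 ms, so with a fixed delayTime of 200 ms the tenth frame would be scheduled about 350 ms after startTime:

```python
FRAME_INTERVAL = 1 / 60      # ≈ 16.67 ms between adjacent display frames
DELAY_TIME = 0.200           # fixed delayTime of 200 ms (example value)

def play_time(start_time, i):
    """playTime_i = startTime + pts_i + delayTime, with pts_i = (i - 1) / 60."""
    pts_i = (i - 1) * FRAME_INTERVAL
    return start_time + pts_i + DELAY_TIME

# play_time(start_time, 10) is about start_time + 0.350 seconds
```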
  • the delayTime may be set to a relatively large value, for example 200 ms or 600 ms, so that when the network transmission status fluctuates, the synchronization of the display picture is affected as little as possible.
  • however, this implementation, which uses the same delayTime for each frame, still results in a large start-up delay. That is to say, there is a relatively large user-perceivable delay between, for example, the time when the user presses the play button and the time when the display devices in the multi-screen system start to synchronously display the first frame, which affects the user experience.
  • taking FIG. 1 as an example, it is assumed that display device A is used as the source device, and display device B, display device C, and display device D are used as receiving end display devices. Assuming that the network transmission status between display device B and display device A and between display device C and display device A is normal, while the network transmission status between display device D and display device A deteriorates, it may happen that display device B and display device C can display the i-th frame display picture at the designated display time but display device D cannot. In that case, display device D can only stay on the (i-1)-th frame display picture, causing the display pictures to be out of sync.
  • the embodiment of this application provides another display screen synchronization method implemented by software:
  • Each receiver device maintains frame loss information, which is used to record the frame numbers of the frames that are not displayed.
  • the form of the dropped frame information may be a "dropped frame list" (referred to as dropList j ).
  • the frame loss information may also exist in other forms, such as a frame loss pool, a frame loss queue, and a frame loss linked list, which are not specifically limited in this application.
  • this embodiment of the present application mainly uses a frame loss list as an example for description.
  • when a receiving end device determines that a certain frame cannot be displayed, and/or determines that a certain frame cannot be displayed at its designated display time (denoted as playTime i ), it adds the frame number of that frame to its own frame loss list and notifies the other receiving end devices, which also add the frame number to their frame loss lists, so that the frame loss lists maintained by all receiving end devices are the same.
  • each receiving end device does not display the frames recorded in its frame loss list and only displays frames not recorded in the frame loss list. As long as one receiving end device cannot display a certain frame, and/or cannot display it at the designated display time, the other receiving end devices do not display that frame either. This avoids different receiving end devices displaying different frames at the same time and realizes the synchronization of the display picture.
  • in some embodiments, if a receiving end device fails to receive a certain frame due to a fault in the transmission process, the receiving end device determines that the frame cannot be displayed.
  • in some embodiments, a receiving end device can estimate (or predict) the estimated decoding delay of a frame (recorded as decodeTime i_Sj_pred ) according to the decoding delays of historical frames, and then determine the estimated display time of the frame; if the estimated display time exceeds the designated display time of the frame, the receiving end device determines that it cannot display the frame at the designated display time.
  • the specified display time may be specifically determined by the source device for each display frame according to the current delay (denoted as delayTime i ). Wherein, the current delay is dynamically changed, rather than a preset fixed value.
  • the current delay may be determined by the source device according to the current transmission delay (denoted as transTime Sj ) and current decoding delay (denoted as decodeTime Sj ) sent to it by the sink device.
  • the current transmission delay and current decoding delay sent by the receiving end device to the source end device may be obtained by the receiving end device based on the transmission delays and decoding delays of the historical frames it has received.
  • in this way, each receiving end device maintains the same frame loss list, and the source end device determines the designated display time of each frame according to the dynamically changing current delay, thereby realizing the synchronization of the display picture.
  • the display screen synchronization method provided by the embodiment of the present application does not need to increase additional hardware costs, and can use a pure software method to realize the synchronization of display screens, which is easy to implement and provides good user experience.
  • FIG. 5 exemplarily shows a schematic diagram of device interaction in a display screen synchronization method provided by an embodiment of the present application.
  • the method involves a source device S 0 and at least one sink device S [1-N] .
  • display device A can be used as source device S 0
  • display device B, display device C, and display device D can be used as sink devices S 1 , S 2 , and S 3 respectively;
  • the device S 0 and the receiving device S [1-3] need to display the display screen synchronously.
  • display device A, display device B, display device C, and display device D may also be used as receiving end devices S 1 , S 2 , S 3 , and S 4.
  • the method may include steps 501 to 509, specifically:
  • Step 501 the source device S0 establishes a connection with the sink device S [1-N] .
  • the source device S 0 and the sink device S [1-N] are in the same local area network (local area network, LAN), and the local area network may be wired or wireless.
  • the source device S 0 and the sink device S [1-N] are both connected to the same router.
  • the connection established between the source device S 0 and the sink device S [1-N] may be, for example, a WiFi connection or the like.
  • the source device S0 can establish a dedicated transmission channel with each receiving device Sj , and perform data interaction with the receiving device Sj through the dedicated transmission channel; in some embodiments, a Mesh network is formed between the source end device S 0 and the receiving end devices S [1-N] , and data is exchanged by broadcasting in the Mesh network.
  • Step 502 the source device S 0 performs clock synchronization with the sink device S [1-N] .
  • Electronic devices are usually set with their own system time, and the electronic device executes specified functions according to the system time. Usually, there is inevitably a certain deviation between the system times of different electronic devices.
  • the source device S0 and the sink device S [1-N] can perform clock synchronization to determine the difference between the system time of each sink device Sj and the system time of the source device S0 The clock deviation between them, the embodiment of the present application records the clock deviation as offset Sj .
  • clock synchronization may be performed through a precision time protocol (precision time protocol, PTP) protocol.
  • each sink device Sj can locally record the clock offset offset Sj between the system time of the sink device S j and the system time of the source device S0 obtained through clock synchronization. Therefore, whenever the sink device Sj receives a frame of the display picture sent by the source device S0 , it can correct the designated display time of that frame according to the clock offset offset Sj , so as to display the picture synchronously according to the corrected designated display time.
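  • the embodiment mentions PTP as one option; a much simplified two-way offset estimate in the same spirit (not a PTP implementation, and the message timestamps are assumed) could look like this:

```python
def estimate_clock_offset(t1, t2, t3, t4):
    """Simplified two-way time-transfer estimate (PTP/NTP style).

    t1: sink system time when the request is sent
    t2: source system time when the request is received
    t3: source system time when the response is sent
    t4: sink system time when the response is received
    Returns the estimated offset (source time minus sink time), assuming a
    roughly symmetric network path; this plays the role of offset_Sj above.
    """
    return ((t2 - t1) + (t3 - t4)) / 2.0
```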
  • Step 503 the source device S 0 sends the display screen to the sink device S [1-N] , and accordingly, the sink device S [1-N] receives the display screen.
  • the source device S 0 may package the frame number of the i-th frame display picture (denoted as frameID i ), the system time of the source device S 0 when it sends this frame display picture (referred to as the "sending time", denoted as sendTime i ), the specified display time playTime i specified by the source device S 0 for this frame display picture, and this frame display picture itself (denoted as payload i ) into a message M1 and send it to each receiver device S j .
  • FIG. 6 is only an example of an implementation, and this embodiment of the present application does not limit the specific implementation of step 503, nor does it limit the specific arrangement order of the frame number, sending time, specified display time, and display picture in the message M1 .
  • Message M1 may include more or fewer fields than shown in FIG. 6 .
  • the specified display time playTime i of the i-th frame of the display screen may be specifically determined by the source device S 0 for the i-th frame of the display screen according to the current delayTime i .
  • the specific meanings of “startTime”, “pts i ”, and “delayTime i ” are the same as those described in the previous section, and will not be repeated here.
  • assuming the time when the source device S0 acquires the first frame of the display picture is startTime, the source device specifies the display time playTime 1 of the first frame display picture as startTime+pts 1 +delayTime 1 , the display time playTime 2 of the second frame display picture as startTime+pts 2 +delayTime 2 , the display time playTime 3 of the third frame display picture as startTime+pts 3 +delayTime 3 , ..., and so on.
  • The implementation shown in FIG. 7 differs from the implementation shown in FIG. 4 in that, in the implementation of FIG. 4, the source device S 0 uses the same delay time delayTime to determine the specified display time for every display picture frame, whereas in the implementation of FIG. 7 the source device S 0 determines the specified display time of each frame according to the dynamically changing current delay time delayTime i .
  • In some embodiments, the current delay time delayTime i may be determined by the source device S 0 according to the current transmission delays of the receiving devices S [1-N] (denoted as transTime S[1-N] ) and the current decoding delays of the receiving devices S [1-N] (denoted as decodeTime S[1-N] ).
  • Because the network transmission state and the processing capability of the receiving devices S [1-N] fluctuate, the delay generated while each display picture frame is transmitted and the delay generated while the receiving devices S [1-N] decode that frame change in real time. For example, if the network transmission state deteriorates, the current transmission delay becomes longer; if the processor load of the receiving devices S [1-N] increases, the current decoding delay becomes longer; and vice versa.
  • In view of this, in the display picture synchronization method provided by this embodiment of the present application, the source device S 0 can determine the current delay time delayTime i for the i-th display picture frame according to the current transmission delay transTime S[1-N] and the current decoding delay decodeTime S[1-N] , which change in real time, so that the specified display time playTime i specified by the source device S 0 for the i-th frame dynamically adapts to the current delay, which helps prevent the display pictures from becoming unsynchronized.
  • In some embodiments, the source device S 0 can determine the current transmission delay transTime i for the i-th display picture frame according to the received current transmission delays transTime S[1-N] of the N sink devices S [1-N] .
  • In some embodiments, transTime Sj may be obtained by the receiving end device S j through fitting and estimation based on the transmission delays of the historical frames it has received.
  • That is, after receiving the current transmission delays transTime S[1-N] sent to it by the N receiving devices S [1-N] , the source device S 0 may take the maximum value among them as the current transmission delay transTime i for the i-th display picture frame, i.e. transTime i = max(transTime S1 , ..., transTime SN ).
  • In some embodiments, the source device S 0 can determine the current decoding delay decodeTime i for the i-th display picture frame according to the received current decoding delays decodeTime S[1-N] of the N sink devices S [1-N] .
  • In some embodiments, decodeTime Sj may be obtained by the receiving end device S j through fitting and estimation based on the decoding delays of the historical frames it has received.
  • That is, after receiving the current decoding delays decodeTime S[1-N] sent to it by the N receiving devices S [1-N] , the source device S 0 may take the maximum value among them as the current decoding delay decodeTime i for the i-th display picture frame, i.e. decodeTime i = max(decodeTime S1 , ..., decodeTime SN ).
  • this embodiment of the present application does not limit the specific implementation manner in which the source device S 0 determines the current transmission delay transTime i and the current decoding delay decodeTime i for the i-th frame of the display screen.
  • In some embodiments, for the first display picture frame, the source device S 0 may set delayTime 1 to a smaller value, such as 30ms or 60ms, based on empirical values or tentatively. Subsequently, the source device S 0 dynamically adjusts the delayTime i of subsequent display picture frames according to whether the N receiver devices S [1-N] can successfully display the picture frames, and/or according to the current transmission delays transTime S[1-N] and current decoding delays decodeTime S[1-N] fed back by the N receiver devices S [1-N] .
  • For example, if a receiver device S j determines that it cannot display the first frame at the specified display time playTime 1 , none of the N receiver devices S [1-N] displays the first frame, and the source device S 0 can set the delayTime 2 of the second frame slightly larger than the delayTime 1 of the first frame, so that the N receiver devices S [1-N] can display the second frame as far as possible.
  • In the display picture synchronization method provided by this embodiment of the present application, the delayTime 1 of the first display picture frame can thus be set to a smaller value, and the delayTime i of subsequent frames can be dynamically adjusted according to the actual situation, instead of using a relatively large fixed value (such as 200ms or 600ms) as the delayTime of all display picture frames (including the first frame) as in the foregoing embodiment; in this way the method ensures display picture synchronization while also solving the problem of the large start-up delay in the solutions of the foregoing embodiments.
  • the value of the current delayTime i of the source device S 0 usually needs to be greater than handleTime+transTime i +decodeTime i .
  • handleTime is used to indicate the processing time of one frame of the display screen, and the fluctuation is relatively small, so it can take a fixed value, for example, 10ms.
  • The processing time may refer to the time consumed by the source device S 0 to process a display picture frame, for example the time consumed by encrypting the display picture frame, encapsulating the message containing the display picture frame according to the adopted transmission protocol, calling the encryption software, calling the message encapsulation software, and so on.
  • In some embodiments, exemplarily, the source device S 0 may calculate the current delay time as delayTime i = handleTime + transTime i + decodeTime i + 2 × framePlayTime, where framePlayTime denotes the duration for which one display picture frame remains displayed, i.e. the reciprocal of the frame rate (for a frame rate of 60 frames per second, framePlayTime = 1/60 s ≈ 16.67ms); adding twice framePlayTime as a reserved margin improves the fault tolerance of display picture synchronization.
  • Of course, the above is only an example and not a limitation; in implementation, any value (such as 20ms or 40ms) can also be set as the reserved margin, and an integer multiple of framePlayTime does not have to be used.
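  • The following sketch illustrates how the source device might combine the delays fed back by the N receivers into the current delay time delayTime i according to the example formula above; handleTime = 10ms and the 2 × framePlayTime margin are the example values from the text, and everything else is an illustrative assumption.
```python
HANDLE_TIME = 10.0                # ms, example per-frame processing time on the source device
FRAME_PLAY_TIME = 1000.0 / 60.0   # ms, assuming a 60 fps display frame rate

def current_delay_time(trans_times: list[float], decode_times: list[float]) -> float:
    """delayTime_i = handleTime + max(transTime_Sj) + max(decodeTime_Sj) + 2 * framePlayTime."""
    trans_time_i = max(trans_times)    # worst-case current transmission delay among receivers
    decode_time_i = max(decode_times)  # worst-case current decoding delay among receivers
    return HANDLE_TIME + trans_time_i + decode_time_i + 2 * FRAME_PLAY_TIME

# Example usage with three receivers reporting their current delays (ms):
# delay_time_i = current_delay_time([12.0, 18.5, 15.0], [6.0, 9.5, 7.0])
```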
  • Step 504: each receiving end device S j among the receiving end devices S [1-N] sends its own lost frame number to the other receiving end devices.
  • the lost frame number may also be sent to the source device S 0 .
  • each receiving end device S j may pack its own lost frame number into a message M2 and send it to other receiving end devices.
  • the message M2 may also be sent to the source device S 0 .
  • the lost frame number is used to indicate the sequence number of the display picture frame that is not successfully received by the receiving end device due to a failure in the transmission process.
  • For example, the receiving end device S j can judge whether it has failed to receive the display picture frame of a certain frame number, determine its own lost frame number, add that lost frame number to its own lost frame list, and notify the other devices performing synchronous display of the display pictures to also add that lost frame number to their lost frame lists.
  • As long as any one of the devices performing synchronous display of the display pictures fails to receive a certain frame, none of the devices performing synchronous display will display that frame, regardless of whether they themselves received it successfully. Thus, through step 504, the method provided by this embodiment of the present application avoids the situation in which the display pictures become unsynchronized because one or some receiver devices fail to receive one or some frames due to a fault in the transmission process.
  • a specific implementation manner of step 504 may be:
  • Step 5041: after the receiving end device S j receives the message M1, the receiving end device S j can obtain the frame number frameID i of the display picture frame from the message M1.
  • Step 5042 the receiving end device Sj judges whether the frame numbers are continuous, if the frame numbers are continuous, execute step 5046; if the frame numbers are not continuous, execute step 5043.
  • Exemplarily, assume the frame number received by the receiver device S j last time was 4 and the frame number received this time is 5; then the frame numbers are continuous. Assume the frame number received last time was 4 and the frame number received this time is 6; then the frame numbers are discontinuous.
  • the receiving device Sj may combine the preset frame sorting or frame transmission algorithm to determine whether the frame numbers are continuous.
  • Step 5043 the receiver device S j calculates the lost frame number.
  • the receiving end device Sj calculates the lost frame number according to the historical received frame numbers. Exemplarily, assuming that the historically received frame numbers of the receiving end device Sj are 1, 2, 3, and 5 respectively, the receiving end device Sj calculates that the display screen whose frame number is 4 is lost.
  • It should be understood that the above description is only an example of calculating the lost frame number, rather than a limitation. If a preset frame sorting or frame transmission algorithm and/or a preset lost-frame retransmission algorithm is adopted during the transmission of the display picture frames, the receiving end device S j can calculate the lost frame number in combination with the preset frame sorting or frame transmission algorithm and/or the preset lost-frame retransmission algorithm.
  • Step 5044 the receiver device S j adds the lost frame number to its own dropped frame list dropList j .
  • each receiver device S j maintains its own frame drop list dropList j , which is used to record the frame numbers of the display images that are not displayed.
  • each receiver device S j may have created the frame drop list dropList j before step 503 .
  • dropList j can be implemented with computer data structures such as arrays, vectors, lists, and linked lists, which is not specifically limited in this embodiment of the present application. Therefore, in step 5044, the receiver device S j may add the calculated lost frame number to its own frame drop list dropList j , for example by appending, inserting, or updating.
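  • As an illustrative sketch of steps 5042 to 5045, the following code shows one possible way a receiver could detect lost frame numbers from non-consecutive frame numbers and record them in its own drop list; the class and method names are assumptions, and the notification to peers (message M2) is only indicated by the returned list.
```python
class FrameDropTracker:
    """Tracks received frame numbers and maintains the local frame drop list dropList_j (a sketch)."""

    def __init__(self) -> None:
        self.last_frame_id = 0           # assumes frame numbers start at 1 and increase by 1
        self.drop_list: set[int] = set()

    def on_frame_received(self, frame_id: int) -> list[int]:
        """Return the lost frame numbers implied by a gap, after adding them to the drop list."""
        lost = list(range(self.last_frame_id + 1, frame_id))  # e.g. last = 4, now = 6 -> [5]
        self.drop_list.update(lost)
        self.last_frame_id = max(self.last_frame_id, frame_id)
        return lost                       # the caller would pack these into message M2 for peers

    def on_peer_report(self, frame_ids: list[int]) -> None:
        """Merge lost or actively discarded frame numbers reported by other devices."""
        self.drop_list.update(frame_ids)
```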
  • Step 5045: the receiving end device S j sends the lost frame number to the other receiving end devices; correspondingly, after receiving the lost frame numbers sent by the receiving end device S j , the other receiving end devices also add these lost frame numbers to their own frame drop lists dropList j' .
  • the sink device S j may also send the lost frame number to the source device S 0 .
  • For example, in the case that the source device S 0 also performs synchronous display of the display pictures, the source device S 0 also maintains its own frame drop list dropList 0 , and the sink device S j can also send the lost frame numbers to the source device S 0 ; correspondingly, after receiving the lost frame numbers, the source device S 0 adds them to its frame drop list dropList 0 .
  • Step 5046 the receiver device S j sends the display screen payload i corresponding to the frame number frameID i to the decoder.
  • the decoder may be a display picture decoding software installed in the receiving end device Sj , and may be preset with a required decoding algorithm for decoding the display picture.
  • Step 505: each receiving device S j among the receiving devices S [1-N] determines its own current transmission delay transTime Sj , and sends the determined current transmission delay to the source device S 0 .
  • each sink device S j may pack its determined current transmission delay into a message M3 and send it to the source device S 0 .
  • the receiving end device S j may obtain the sending time sendTime i of the i-th frame display picture from the message M1.
  • the sending time sendTime i is the system time of the source device S 0 when the source device S 0 packs and sends the message M1. Furthermore, there may be a clock offset offset Sj between the system time of the sink device S j and the system time of the source device S 0 .
  • Therefore, in one implementation, for the i-th display picture frame, the receiver device S j can calculate the actual transmission delay (denoted as transTime i_Sj ) of the i-th frame sent from the source device S 0 to the receiver device S j according to the clock offset offset Sj and the system time of the receiver device S j when it receives the i-th frame (denoted as rcvTime i_Sj ):
  • transTime i_Sj = rcvTime i_Sj - offset Sj - sendTime i .
  • the actual transmission delay transTime i_Sj can be understood as the time actually spent on the transmission path when the i-th display frame is sent from the source device S 0 to the sink device S j .
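  • A minimal sketch of this per-frame measurement, transTime i_Sj = rcvTime i_Sj - offset Sj - sendTime i , together with a small history buffer that the receiver can later use for estimation, is shown below; the class name and buffer length are illustrative assumptions.
```python
from collections import deque

class TransmissionDelayMeter:
    """Records the actual transmission delay of each received frame (all times in ms). A sketch."""

    def __init__(self, offset_sj: float, history_len: int = 128) -> None:
        self.offset_sj = offset_sj                      # receiver clock minus source clock
        self.history: deque[tuple[int, float]] = deque(maxlen=history_len)

    def record(self, frame_id: int, send_time_i: float, rcv_time_i_sj: float) -> float:
        """transTime_i_Sj = rcvTime_i_Sj - offset_Sj - sendTime_i."""
        trans_time = rcv_time_i_sj - self.offset_sj - send_time_i
        self.history.append((frame_id, trans_time))
        return trans_time
```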
  • the receiving end device Sj may record the actual transmission delays of the display images it has received in history. Therefore, the receiving end device S j can determine its own current transmission time delay transTime Sj by estimating (or predicting) according to the actual transmission time delays of these historically received display pictures.
  • (1) When the cumulative number of frames received by the receiver device S j in history is less than a preset number (for example, 30), the receiver device S j directly uses the actual transmission delay transTime i_Sj_real of the i-th display picture frame as its own current transmission delay transTime Sj ;
  • (2) When the cumulative number of frames received by the receiver device S j in history is greater than or equal to the preset number (for example, 30), the receiver device S j performs fitting based on the actual transmission delays of these historical frames, for example linear fitting by the least squares method, to obtain a linear relationship between the fitted transmission delay transTime Sj_fitting and the frame number: transTime Sj_fitting = a × frameID i + b, where a and b are hyperparameters that can be calculated through the linear fitting, a represents the slope of the fitted linear function in the coordinate system, and b represents the intercept. Substituting the frame number frameID i of the i-th display picture frame into this relationship, the calculated fitted transmission delay transTime Sj_fitting (frameID i ) can be used as the receiver device S j 's own current transmission delay transTime Sj .
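  • The two-case estimation described above can be sketched as follows: with fewer than the preset number of samples the latest actual delay is reused, otherwise a least-squares line transTime = a × frameID + b is fitted over the history and evaluated at the current frame number. The threshold of 30 is the example value from the text; the function name and the plain least-squares implementation are illustrative assumptions.
```python
def estimate_current_trans_time(history: list[tuple[int, float]], frame_id: int,
                                min_samples: int = 30) -> float:
    """Estimate transTime_Sj from a (frameID, actual transmission delay) history. A sketch."""
    if len(history) < min_samples:
        return history[-1][1]              # case (1): reuse the latest actual transmission delay

    # case (2): least-squares linear fit transTime = a * frameID + b over the history
    n = len(history)
    sum_x = sum(x for x, _ in history)
    sum_y = sum(y for _, y in history)
    sum_xy = sum(x * y for x, y in history)
    sum_xx = sum(x * x for x, _ in history)
    a = (n * sum_xy - sum_x * sum_y) / (n * sum_xx - sum_x * sum_x)
    b = (sum_y - a * sum_x) / n
    return a * frame_id + b                # fitted delay evaluated at the current frame number
```
  • The decoding-delay estimation described later in steps 5061 and 508 follows the same two-case pattern, with the history holding actual decoding delays instead of transmission delays.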
  • In addition to the above manners, the current transmission delay may also be determined by, for example, taking the average or mode of the actual transmission delays of historical frames. It should be understood that this embodiment of the present application does not limit the specific implementation manner in which the receiving end device S j determines its own current transmission delay transTime Sj .
  • By performing step 505, each receiving device S j among the receiving devices S [1-N] sends its determined current transmission delay transTime Sj to the source device S 0 ; thus, the source device S 0 obtains the N current transmission delays transTime S[1-N] sent to it by the N receiver devices S [1-N] .
  • In the next cycle (that is, during the synchronous display of the next display picture frame), when step 503 is executed next time, the source device S 0 can determine the current transmission delay transTime i+1 for the (i+1)-th display picture frame based on the current transmission delays transTime S[1-N] sent to it this time by the N receiver devices S [1-N] , for example by taking the maximum value among transTime S[1-N] .
  • the source device S 0 may determine the specified display time playTime i+1 for the (i+1)th frame display picture according to the current transmission time delay transTime i+1 for the (i+1)th frame display picture.
  • Therefore, the display picture synchronization method provided by this embodiment of the present application enables the source device S 0 to determine the specified display time playTime i of the i-th display picture frame according to the dynamically changing current transmission delay transTime i .
  • It should be understood that this embodiment of the present application does not limit the execution order of step 505. In some embodiments, step 505 only needs to be executed before step 503 is executed next time.
  • Step 506: each receiving end device S j among the receiving end devices S [1-N] sends its own actively discarded frame number (if any) to the other receiving end devices.
  • the actively discarded frame number may also be sent to the source device S 0 .
  • each receiving end device S j may package its own actively discarded frame number into a message M4 and send it to other receiving end devices.
  • the message M4 may also be sent to the source device S 0 , for example, in the case that the source device S 0 also performs synchronous display of the display screen.
  • the actively discarded frame number is used to indicate the serial number of the display screen frame that the receiver device cannot display at the specified display time because the estimated display time exceeds the specified display time.
  • For example, the receiver device S j can estimate (or predict) the decoding delay required for the currently received i-th display picture frame (referred to as the "estimated decoding delay", denoted as decodeTime i_Sj_pred ), and then obtain the estimated display time of the current i-th frame according to this estimated decoding delay. If the estimated display time exceeds the specified display time specified by the source device S 0 for the i-th frame, the sink device S j can tend to consider that, even if it decodes the i-th frame this time, the moment when decoding completes is likely to have already passed the specified display time, so it is too late to display the frame. Therefore, the receiving end device S j needs to actively discard the frame and also send the frame number of that frame to the other receiving end devices.
  • As long as any one of the devices performing synchronous display of the display pictures cannot display a frame at the specified display time, none of the devices performing synchronous display will display that frame. Thus, through step 506, the method provided by this embodiment of the present application avoids the situation in which the display pictures become unsynchronized because one or some receiver devices cannot successfully display one or some frames at the specified display time.
  • In some embodiments, considering that the current i-th display picture frame has already been sent to the decoder after step 504 is executed, the execution subject of step 506 may be the decoder.
  • In some embodiments, as shown in FIG. 9, a specific implementation of step 506 may be:
  • Step 5061 the receiving end device S j estimates the decoding delay required by the current i-th frame to display the payload i (that is, "estimated decoding delay", denoted as decodeTime i_Sj_pred ).
  • the receiving end device Sj may record the actual decoding delays of the display pictures it has received in history. Therefore, the receiving end device S j can estimate the decoding delay required for the current i-th frame display picture according to the actual decoding delay of the display pictures received in history.
  • (1) When the cumulative number of frames received by the receiver device S j in history is less than a preset number (for example, 30), the receiver device S j directly uses the actual decoding delay of the (i-1)-th display picture frame as the estimated decoding delay decodeTime i_Sj_pred of the current i-th frame;
  • (2) When the cumulative number of frames received in history is greater than or equal to the preset number (for example, 30), the receiver device S j performs fitting based on the actual decoding delays of these historical display picture frames, for example linear fitting by the least squares method, to obtain a linear relationship between the fitted decoding delay decodeTime Sj_fitting and the frame number: decodeTime Sj_fitting = c × frameID i + d, where c and d are hyperparameters that can be calculated through the linear fitting, c represents the slope of the fitted linear function in the coordinate system, and d represents the intercept. Substituting the frame number frameID i of the current i-th frame into this relationship, the calculated fitted decoding delay decodeTime Sj_fitting (frameID i ) can be used as the estimated decoding delay decodeTime i_Sj_pred of the current i-th frame.
  • In addition to the above manners, the estimated decoding delay may also be determined by, for example, taking the average or mode of the actual decoding delays of historical frames. It should be understood that this embodiment of the present application does not limit the specific implementation manner in which the receiving end device S j determines the estimated decoding delay of the i-th display picture frame.
  • Step 5062: the receiving end device S j judges, according to the estimated decoding delay decodeTime i_Sj_pred , whether decoding can be completed before the specified display time playTime i ; if yes, execute step 507 to perform decoding; if not, execute step 5063.
  • In some embodiments, if a preset condition is satisfied, the receiving device S j determines that decoding can be completed before the specified display time playTime i .
  • Exemplarily, the above preset condition may be:
  • systemTime Sj + decodeTime i_Sj_pred - offset Sj < playTime i + framePlayTime, where systemTime Sj is used to indicate the system time of the receiving end device S j ; the meanings of the other symbols are as described above and are not repeated here.
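  • A sketch of the decision in steps 5062 and 5063 follows: the receiver checks the preset condition systemTime Sj + decodeTime i_Sj_pred - offset Sj < playTime i + framePlayTime and either forwards the frame to the decoder or actively discards it; the function names are illustrative, and notifying the peers (message M4) is assumed to happen elsewhere.
```python
FRAME_PLAY_TIME = 1000.0 / 60.0   # ms, assuming a 60 fps display frame rate

def can_decode_in_time(system_time_sj: float, decode_time_pred: float,
                       offset_sj: float, play_time_i: float) -> bool:
    """Preset condition of step 5062 (all times in ms)."""
    return system_time_sj + decode_time_pred - offset_sj < play_time_i + FRAME_PLAY_TIME

def handle_received_frame(frame_id: int, system_time_sj: float, decode_time_pred: float,
                          offset_sj: float, play_time_i: float, drop_list: set[int]) -> bool:
    """Return True if the frame should be decoded (step 507); otherwise actively discard it."""
    if can_decode_in_time(system_time_sj, decode_time_pred, offset_sj, play_time_i):
        return True
    drop_list.add(frame_id)        # steps 5063/5064: record the actively discarded frame number
    return False                   # step 5065: the caller notifies the other receivers
```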
  • Step 5063 the receiving end device S j determines that the current frame needs to be actively discarded.
  • Since the judgment in step 5062 is "No", the sink device S j can tend to consider that if it decodes the current i-th frame, the moment when decoding completes is likely to have already passed the specified display time specified by the source device S 0 ; therefore, the receiving end device S j does not need to decode the i-th display picture frame and needs to actively discard the current frame.
  • Step 5064 the receiving end device S j adds the frame number of the current frame into its own frame drop list dropList j . That is to say, the frame number of the current frame is determined by the receiving end device Sj as the number of the actively discarded frame.
  • This step can be understood by analogy with the description of step 5044 above, and details are not repeated here.
  • Step 5065: the receiving end device S j sends the frame number of the current frame (the actively discarded frame number) to the other receiving end devices; correspondingly, after receiving the actively discarded frame number sent by the receiving end device S j , each of the other receiving end devices also adds the actively discarded frame number to its own frame drop list dropList j' .
  • the sink device S j may also send the actively discarded frame number to the source device S 0 .
  • For example, in the case that the source device S 0 also performs synchronous display of the display pictures, the source device S 0 also maintains its own frame drop list dropList 0 , and the receiver device S j can also send the actively discarded frame number to the source device S 0 ; correspondingly, after receiving the actively discarded frame number, the source device S 0 adds it to its frame drop list dropList 0 .
  • Step 507: the receiver devices S [1-N] decode the i-th display picture frame payload i .
  • the source device S 0 may also decode the i-th frame to display the payload i .
  • a decoder may be installed in each receiving end device S j for decoding the display picture according to a preset decoding algorithm.
  • In some embodiments, a decoder may also be installed in the source device S 0 so that the source device S 0 can also decode the i-th display picture frame payload i ; if the source device S 0 is used only as a device that provides the display pictures and does not perform synchronous display of the display pictures, the source device S 0 may skip decoding the i-th display picture frame payload i .
  • In the case that step 506 is implemented in the manner shown in FIG. 9, since the judgment in step 5062 is "Yes", the receiving end device S j can tend to consider that if it decodes the current i-th frame, the moment when decoding completes is likely to be before the specified display time specified by the source device S 0 ; therefore, the i-th display picture frame can be decoded.
  • the decoding algorithm preset in the decoder is used to decode the display picture payload i of the i-th frame.
  • Step 508: each receiving device S j among the receiving devices S [1-N] determines its own current decoding delay decodeTime Sj , and sends the determined current decoding delay to the source device S 0 .
  • each sink device S j may pack its determined current decoding delay into a message M5 and send it to the source device S 0 .
  • the receiving device S j can know the decoding time actually consumed for decoding the i-th frame of the display picture this time, that is, the "actual decoding delay", which is recorded as decodeTime i_Sj_real .
  • In some embodiments, the receiving end device S j can record the actual decoding delays of the display picture frames it has received in history (including the actual decoding delay decodeTime i_Sj_real of the i-th display picture frame), and estimate its own current decoding delay decodeTime Sj from them.
  • (1) When the cumulative number of frames received by the receiver device S j in history is less than a preset number (for example, 30), the receiver device S j directly uses the actual decoding delay decodeTime i_Sj_real of the i-th display picture frame as its own current decoding delay decodeTime Sj ;
  • (2) When the cumulative number of frames received in history is greater than or equal to the preset number (for example, 30), the receiver device S j performs fitting based on the actual decoding delays of these historical frames (including the i-th display picture frame), for example linear fitting by the least squares method, to obtain the linear relationship decodeTime Sj_fitting' = c' × frameID + d', where c' and d' are hyperparameters that can be calculated through the linear fitting, c' represents the slope of the fitted linear function in the coordinate system, and d' represents the intercept. Therefore, by substituting the frame number frameID i+1 of the next, (i+1)-th, frame into the above linear relationship, the calculated fitted decoding delay decodeTime Sj_fitting' (frameID i+1 ) can be used as the receiver device S j 's own current decoding delay decodeTime Sj .
  • In addition to the above manners in which the receiving end device S j determines its own current decoding delay decodeTime Sj , other manners may also be used. It should be understood that this embodiment of the present application does not limit the specific implementation manner in which the receiving end device S j determines its own current decoding delay decodeTime Sj .
  • the receiving end device S j may also use the estimated decoding delay decodeTime i_Sj_pred determined in step 5061 as the receiving end device S j 's own current decoding delay.
  • By performing step 508, each receiving device S j among the receiving devices S [1-N] sends its determined current decoding delay decodeTime Sj to the source device S 0 ; thus, the source device S 0 obtains the N current decoding delays decodeTime S[1-N] sent to it by the N receiver devices S [1-N] .
  • In the next cycle (that is, during the synchronous display of the next display picture frame), when step 503 is executed next time, the source device S 0 can determine the current decoding delay decodeTime i+1 for the (i+1)-th display picture frame based on the current decoding delays decodeTime S[1-N] sent to it this time by the N receiver devices S [1-N] , for example by taking the maximum value among decodeTime S[1-N] .
  • the source device S 0 may determine the specified display time playTime i+1 for the (i + 1)th frame display picture according to the current decoding delay decodeTime i+1 for the (i+1)th frame display picture.
  • Therefore, the display picture synchronization method provided by this embodiment of the present application enables the source device S 0 to determine the specified display time playTime i of the i-th display picture frame according to the dynamically changing current decoding delay decodeTime i .
  • It should be understood that this embodiment of the present application does not limit the execution order of step 508, as long as step 508 is executed before step 503 is executed next time.
  • Step 509: each receiving end device S j among the receiving end devices S [1-N] performs synchronous display of the display pictures.
  • the source device S0 may also perform synchronous display of the display screen.
  • each receiving device Sj can judge whether to display a certain frame according to its own frame drop list dropList j , so as to realize the synchronous display of the display screen.
  • For example, if the frame number of a display picture frame is in the frame drop list, that frame is not displayed; if the frame number of the display picture frame is not in the frame drop list, that frame is displayed.
  • In other words, the method provided by this embodiment of the present application realizes synchronous display of the display pictures by having each device that performs synchronous display maintain the same dynamically updated frame drop list, which records the frame numbers of the display picture frames that are not to be displayed.
  • the display screen synchronization is realized by pure software, which is easy to implement and has good user experience.
  • In some embodiments, as shown in FIG. 10, a specific implementation of step 509 may be:
  • Step 5091 the receiver device S j completes the decoding of the current display screen.
  • In step 507, the receiver device S j decodes the payload i of the current i-th display picture frame; in step 5091, it is determined that the payload i of the current i-th frame has been decoded, and the decoded i-th display picture frame is obtained.
  • the decoded i-th frame of display picture may refer to the i-th frame of display picture data that can be sent to a display for display.
  • the display can display the i-th frame of the display image according to the decoded i-th frame of the display image.
  • Step 5092: the receiving end device S j judges whether the frame number frameID i is not in its own frame drop list dropList j ; if the frame number is in dropList j (the judgment is "No"), execute step 5093; if the frame number is not in dropList j (the judgment is "Yes"), execute step 5094.
  • The frame drop list dropList j of each receiving end device S j may be continuously and dynamically updated; therefore, at any moment before the frame is sent for display, the frame number of a new display picture frame that is not to be displayed may be added to the frame drop list dropList j of each receiving end device S j . For this reason, after decoding is completed, step 5092 is still required to determine whether the frame number of the display picture frame decoded this time is not in the frame drop list.
  • Step 5093: the receiving end device S j does not send the decoded i-th display picture frame for display.
  • If the frame number is in the frame drop list, it means that, among all the devices that display the display pictures synchronously, at least one device failed to receive the frame or cannot display the frame at the specified display time. In order to ensure synchronous display, none of the devices performing synchronous display should display that frame at this time. Therefore, if the judgment in step 5092 is "No", step 5093 is executed.
  • Step 5094 the receiving end device S j judges whether the time to send the display has arrived, if yes, execute step 5096; if not, execute step 5095.
  • In some embodiments, if the following condition is satisfied, the receiving end device S j determines that the time to send the frame for display has arrived:
  • systemTime Sj - offset Sj > playTime i , where the left side is the system time of the receiving end device S j minus the clock offset between the system time of the receiving end device S j and the system time of the source device S 0 , and the right side is the specified display time playTime i specified by the source device S 0 for the i-th display picture frame;
  • equivalently, systemTime Sj > playTime i + offset Sj .
  • The above judging method can be understood as: judging whether the time to send the frame for display has arrived by judging whether the system time of the receiving device S j has reached the corrected specified display time.
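  • The gating in steps 5092 to 5096 can be sketched as follows: after decoding, the receiver keeps re-checking the frame drop list while waiting for its corrected system time to reach the specified display time, and only then sends the frame for display. The polling loop, the callables now_ms and send_to_display, and the 1ms sleep are illustrative simplifications.
```python
import time

def try_send_for_display(frame_id: int, play_time_i: float, offset_sj: float,
                         drop_list: set[int], now_ms, send_to_display) -> bool:
    """Display the decoded frame only if it never enters dropList_j before its display time.

    now_ms() returns the receiver's system time in ms; send_to_display(frame_id) pushes the
    decoded frame to the display. Both callables are placeholders for this sketch.
    """
    while now_ms() < play_time_i + offset_sj:    # steps 5094/5095: wait for the display time
        if frame_id in drop_list:                # the drop list may be updated at any moment
            return False                         # step 5093: do not send the frame for display
        time.sleep(0.001)
    if frame_id in drop_list:
        return False
    send_to_display(frame_id)                    # step 5096: send the frame for display
    return True
```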
  • Step 5095 the receiving end device S j waits for display, and judges in real time whether the frame number is in the lost frame list.
  • If the judgment in step 5094 is "No", the system time of the receiving end device S j has not yet reached the corrected specified display time, so the receiving end device S j needs to wait.
  • During the waiting in step 5095, the receiving end device S j still needs to judge in real time whether the frame number is in the frame drop list.
  • Step 5096 is executed only if the frame number has still not appeared in the frame drop list when the display time arrives.
  • Step 5096 the receiver device S j sends the display screen corresponding to the frame number for display.
  • If the judgment in step 5094 is "Yes", or if, after the waiting in step 5095, the frame number is still not in the frame drop list when the display time arrives, step 5096 is executed. Therefore, in the method provided by the present application, all devices that perform synchronous display of the display pictures display the display picture frames with the same frame number at the same time and do not display display picture frames with different frame numbers at the same time, which ensures the synchronization of the display pictures.
  • In some embodiments, the receiving end device S j initially judges that the i-th frame is lost and therefore notifies the other receiving end devices to add the frame number of the i-th frame to their frame drop lists; but later, due to lost-frame retransmission, the receiving end device S j receives the i-th frame after all. In this case, the display picture synchronization method provided by this embodiment of the present application may further include the following step: the receiving end device S j notifies the other receiving end devices to delete the frame number of the i-th frame from their frame drop lists. That is to say, the display picture synchronization method provided by this embodiment of the present application can not only add frame numbers to the frame drop list, but also delete frame numbers from it.
  • In other words, the frame drop list is dynamically updated and maintains the frame numbers of the display picture frames that are not to be displayed. Therefore, as many frames as possible can be displayed synchronously while the synchronization of the display pictures is ensured.
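  • A minimal sketch of the deletion path described above, assuming the frame initially reported as lost arrives after all through retransmission: the receiver removes the frame number from its own drop list and asks its peers to do the same; the notify_peers callback is a placeholder.
```python
def on_retransmitted_frame(frame_id: int, drop_list: set[int], notify_peers) -> None:
    """Undo an earlier loss report once the frame has been received after all (a sketch)."""
    if frame_id in drop_list:
        drop_list.discard(frame_id)   # update the local frame drop list dropList_j
        notify_peers(frame_id)        # ask the other devices to delete it from their lists
```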
  • the disclosed device/electronic equipment and method can be implemented in other ways.
  • the device/electronic device embodiments described above are only illustrative.
  • the division of the modules or units is only a logical function division.
  • the mutual coupling or direct coupling or communication connection shown or discussed may be through some interfaces, and the indirect coupling or communication connection of devices or units may be in electrical, mechanical or other forms.
  • the units described as separate components may or may not be physically separated, and the components shown as units may or may not be physical units, that is, they may be located in one place, or may be distributed to multiple network units. Part or all of the units can be selected according to actual needs to achieve the purpose of the solution of this embodiment.
  • each functional unit in each embodiment of the present application may be integrated into one processing unit, each unit may exist separately physically, or two or more units may be integrated into one unit.
  • the above-mentioned integrated units can be implemented in the form of hardware or in the form of software functional units.
  • If the integrated module/unit is realized in the form of a software functional unit and sold or used as an independent product, it can be stored in a computer-readable storage medium. Based on this understanding, all or part of the processes in the methods of the above embodiments of the present application can also be completed by instructing related hardware through a computer program.
  • The computer program can be stored in a computer-readable storage medium, and when the computer program is executed by a processor, the steps in the above-mentioned method embodiments can be realized.
  • the computer program includes computer program code, and the computer program code may be in the form of source code, object code, executable file or some intermediate form.
  • the computer-readable storage medium may include: any entity or device capable of carrying the computer program code, a recording medium, a U disk, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a read-only memory (ROM, Read-Only Memory) ), Random Access Memory (RAM, Random Access Memory), electrical carrier signal, telecommunication signal, and software distribution medium, etc.
  • the content contained in the computer-readable storage medium can be appropriately increased or decreased according to the requirements of legislation and patent practice in the jurisdiction.
  • For example, in some jurisdictions, according to legislation and patent practice, computer-readable storage media exclude electrical carrier signals and telecommunication signals.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)

Abstract

The present application provides a display picture synchronization method, which may include the following steps: a source device sends a first picture frame and a first specified display time corresponding to the first picture frame to each receiving end device among the receiving end devices; if every receiving end device successfully receives the first picture frame and determines that it can display the first picture frame at the first specified display time, then every device that performs synchronous display of picture frames displays the first picture frame at the first specified display time; if there is a first receiving end device among the receiving end devices that has not successfully received the first picture frame, or that has successfully received the first picture frame but determines that it cannot display the first picture frame at the first specified display time, then none of the devices that perform synchronous display of picture frames displays the first picture frame. Thus, as long as any first receiving end device among the receiving end devices cannot display the first picture frame at the first specified display time, none of the devices performing synchronous display of picture frames displays the first picture frame, which ensures synchronous display of the display pictures.

Description

一种显示画面同步方法、系统及电子设备
本申请要求在2021年11月22日提交中国国家知识产权局、申请号为202111383467.5的中国专利申请的优先权,发明名称为“一种显示画面同步方法、系统及电子设备”的中国专利申请的优先权,要求在2022年6月7日提交中国国家知识产权局、申请号为202210640453.5的中国专利申请的优先权,发明名称为“一种显示画面同步方法、系统及电子设备”的中国专利申请的优先权,其全部内容通过引用结合在本申请中。
技术领域
本申请属于显示技术领域,尤其涉及一种显示画面同步方法、系统及电子设备。
背景技术
随着直播、教育、会议等应用场景的发展,多屏显示的需求不断增长,分体屏、拼接屏技术也在不断进步。若想要使得多个显示设备同时显示相同的画面帧,需要进行显示画面同步。当前的显示画面同步主要是通过硬件来实现的,例如通过有线连接传输画面帧,接收端显示设备直接显示接收到的画面帧。然而,这种方式实现成本高且布线复杂。当前,尚无一种软件方案能够很好地实现画面帧的同步显示。
发明内容
有鉴于此,本申请实施例提供了一种显示画面同步方法、系统及电子设备,能够有效避免进行多屏显示时画面帧不同步现象的发生。
本申请实施例的第一方面提供了一种显示画面同步方法,应用于接收端设备。其中,接收端设备包括丢帧信息,丢帧信息用于记录不进行显示的画面帧的标识。该方法可以包括以下步骤:接收端设备与源端设备建立通信连接;接收端设备接收源端设备发送的第一画面帧,以及第一画面帧对应的第一指定显示时刻;若接收端设备确定能够按照第一指定显示时刻显示第一画面帧,且第一画面帧的帧号不在丢帧信息中,则接收端设备按照第一指定显示时刻显示第一画面帧;若接收端设备确定不能够按照第一指定显示时刻显示第一画面帧,则将第一画面帧的标识添加至丢帧信息,并将第一画面帧的标识通知给其他设备;其中,其他设备用于与接收端设备进行画面帧的同步显示。
本申请实施例提供的显示画面同步方法中,接收端设备可以依据第一指定显示时刻、丢帧信息,来确定什么情况下可以显示第一画面帧,以及应当在哪个时刻显示第一画面帧;接收端设备可以在它接收到了第一画面帧但不能够显示第一画面帧的情况下,将第一画面帧的标识通知给其他设备。从而,通过软件的方式实现了显示画面的同步显示,不需要依赖于硬件上的有线连接,成本低且易于实施。在一些场景下,上述接收端设备包括丢帧信息,可以理解为接收端设备中维护有丢帧信息,例如,为丢帧信息分配存储空间、对丢帧信息进行动态更新等。
在一种可能的实现方式中,该方法还包括以下步骤:若第一画面帧的标识在丢帧信息中,则接收端设备不显示第一画面帧。
在一些场景下,第一画面帧的标识在丢帧信息中,表明存在其他设备不能够显示第一画面帧,因此接收端设备也不显示第一画面帧。这样,就可以避免其他设备未显示第一画面帧但接收端仍显示第一画面帧的情况的发生,有助于实现显示画面的同步显示。
在一种可能的实现方式中,该方法还包括以下步骤:接收端设备根据已成功接收到的画面帧对应的标识,确定丢失帧标识;其中,丢失帧标识包括接收端设备未成功接收到的画面帧对应的标识;接收端设备将丢失帧标识添加至丢帧信息,并将丢失帧标识通知给其他设备。
在一些场景下,例如网络波动、网络不稳定、带宽被其他设备占用等场景下,会存在接收端设备未能成功接收到某些画面帧(即丢失帧)的情况,此时接收端设备将丢失帧标识添加至丢帧信息并通知其他设备。这样,就可以避免接收端设备未显示丢失帧但其他设备仍显示该丢失帧的情况的发生,有助于实现显示画面的同步显示。
在一种可能的实现方式中,上述接收端设备确定能够按照第一指定显示时刻显示第一画面帧的步骤,具体包括:接收端设备获取解码第一画面帧所需的估计解码时延,根据估计解码时延,确定第一画面帧对应的估计显示时刻;若估计显示时刻早于或等于第一指定显示时刻,则接收端设备确定能够按照第一指定显示时刻显示第一画面帧。
在一种可能的实现方式中,上述接收端设备确定不能够按照第一指定显示时刻显示第一画面帧的步骤,具体包括:接收端设备获取解码第一画面帧所需的估计解码时延,根据估计解码时延,确定第一画面帧对应的估计显示时刻;若估计显示时刻晚于第一指定显示时刻,则接收端设备确定不能够按照第一指定显示时刻显示第一画面帧。
由于接收端设备解码第一画面帧也需要花费一定的时间,因此,接收端设备在接收到第一画面帧后,可先判断是否来得及显示第一画面帧。在判断来得及显示第一画面帧的情况下,接收端设备确定能够按照第一指定时刻显示第一画面帧,可选地,接收端设备对第一画面帧进行解码。在判断来不及显示第一画面帧的情况下,接收端设备将第一画面帧的标识通知给其他设备。这样,可以避免接收端设备未显示第一画面帧但其他设备仍显示第一画面帧情况的发生,有助于实现显示画面的同步显示。
可选地,在判断来不及显示第一画面帧的情况下,接收端设备还可以不对第一画面帧进行解码处理,避免将运算资源浪费在解码不会进行显示的第一画面帧上,有助于节省接收端设备的运算资源。
可选地,在判断来不及显示第一画面帧的情况下,并且第一画面帧不是关键帧(即,其他画面帧进行显示所依赖的帧,通常可以称之为“I帧”)时,接收端设备才不对第一画面帧进行解码处理。也就是说,虽然判断来不及显示第一画面帧了,但若第一画面帧是其他画面帧进行显示所依赖的帧,那么仍旧需要对第一画面帧进行解码,以保证其他画面帧能够正常显示。
在一种可能的实现方式中,该方法还包括以下步骤:接收端设备与源端设备进行时钟同步,确定接收端设备的系统时间与源端设备的系统时间之间的时钟偏差;上述若估计显示时刻早于或等于第一指定显示时刻,则接收端设备确定能够按照第一指定显示时刻显示第一画面帧的步骤,具体包括:使用时钟偏差矫正估计显示时刻,得到矫正后的估计显示时刻,若矫正后的估计显示时刻早于或等于第一指定显示时刻,则确定能够按照第一指定显示时刻显示第一画面帧。
由于硬件器件的固有物理属性,接收端设备的系统时间与源端设备的系统时间之间通常不可避免地存在时钟偏差。接收端设备按照使用时钟偏差矫正后的第一指定显示时刻显示第一画面帧,有助于避免时钟偏差造成的画面帧不同步。
在一种可能的实现方式中,上述接收端设备获取解码第一画面帧所需的估计解码时延的步骤,具体包括:接收端设备根据解码历史画面帧所消耗的实际解码时延,获取解 码第一画面帧所需的估计解码时延;其中,历史画面帧包括帧的标识位于第一画面帧的标识之前的至少一个画面帧。
通常,接收端设备解码画面帧所需要的解码时延与接收端设备的处理器性能、当前处理器占用率、采用的编解码算法等因素相关,因此往往是具有连续性的。因此,接收端设备可以根据接收端设备解码历史画面帧所实际消耗的解码时延(即实际解码时延),来获取(例如估计、预测等)解码第一画面帧将要花费的估计解码时延。这样,有助于接收端设备更加准确地判断接收端设备是否来得及显示第一画面帧。
在一种可能的实现方式中,该方法还包括以下步骤:接收端设备根据成功接收到的至少一个画面帧的实际解码时延,确定接收端设备的当前解码时延;接收端设备将当前解码时延发送给源端设备。相应地,源端设备接收到该当前解码时延。
在一种可能的实现方式中,该方法还包括以下步骤:接收端设备根据成功接收到的至少一个画面帧的实际传输时延,确定接收端设备的当前传输时延;接收端设备将当前传输时延发送给源端设备。相应地,源端设备接收到该当前传输时延。
在一场景中,接收端设备的数量为至少一个,则源端设备接收到的当前解码时延和/或当前传输时延的数量也为至少一个。源端设备可以根据至少一个当前解码时延和/或当前传输时延,确定当前时延,例如,根据至少一个当前解码时延和/或当前传输时延中的最大值确定当前时延。进而,源端设备基于该当前时延来指定后续画面帧的指定显示时刻。从而,源端设备基于接收端设备的当前解码时延和/或当前传输时延来确定当前时延,可以使得当前时延是动态变化的,并且是与实际应用场景中实际的网络传输情况,和/或,接收端设备实际的解码时延相符合的。这样,有助于源端设备为画面帧所指定的指定显示时刻是一个合适的数值,尽可能让接收端设备都来得及显示,有助于避免接收端设备不能接收到某些画面帧、不能显示某些画面帧的情况的发生。并且,存在这种当前时延动态变化的机制,在一些实施例中,第一帧的时延就可以不需要设置为一个相对较大的数值(例如200ms、600ms)了,而可以设置为一个相对较小的数值,有助于降低起播时延,提升用户体验。
在一种可能的实现方式中,该方法还包括以下步骤:接收端设备接收其他设备发送的丢失帧标识和/或主动丢弃帧标识,并将其他设备发送的丢失帧标识和/或主动丢弃帧标识添加至丢帧信息中;在一些实施例中,其他设备发送的丢失帧标识可以包括其他设备根据其他设备已成功接收到的画面帧对应的标识,确定的其他设备未成功接收到的画面帧对应的标识;其他设备发送的主动丢弃帧标识包括其他设备确定不能够按照第二指定显示时刻显示的画面帧的帧号;在一些实施例中,第二指定显示时刻可以是其他设备发送的主动丢弃帧标识对应的画面帧对应的指定显示时刻。
从而,其他设备不能够显示的画面帧(即,丢失帧和/或主动丢弃帧),接收端设备也不会显示。这样,可以避免其他设备未显示上述丢失帧和/或主动丢弃帧,但接收端设备仍显示上述丢失帧和/或主动丢弃帧的情况的发生,有助于实现显示画面的同步显示。
在一种可能的实现方式中,上述接收端设备接收源端设备发送的第一画面帧,以及第一画面帧对应的第一指定显示时刻的步骤,具体包括:接收端设备接收第一报文,其中,第一报文携带有第一画面帧和第一指定显示时刻,可选地,第一报文还携带有第一报文的发送时刻和第一画面帧的标识。
源端设备将第一画面帧的相关信息(例如,第一画面帧、第一指定显示时刻、第一画面帧的标识)打包到第一报文中发送,有助于接收端设备准确地确定第一画面帧所对 应的信息。
在一种可能的实现方式中,接收端设备和源端设备处于同一局域网中。例如,处于同一个WiFi局域网,接入同一个路由器,处于同一个Mesh网络等。从而,接收端设备和源端设备可以方便地进行信息交互。
在一种可能的实现方式中,画面帧的标识包括画面帧的帧号。当然,应理解,画面帧的标识(如上述第一画面帧的标识)也可以包括帧号以外的其他内容,本申请实施例对此不做限定。
本申请实施例的第二方面提供了一种显示画面同步方法,应用于源端设备。其中,源端设备用于获取画面帧并将获取到的画面帧发送给接收端设备。该方法可以包括以下步骤:源端设备与接收端设备建立通信连接;源端设备向接收端设备发送第一画面帧,以及第一画面帧对应的第一指定显示时刻;其中,第一指定显示时刻是源端设备根据当前传输时延和/或当前解码时延确定的,当前传输时延与历史画面帧的实际传输时延有关,当前解码时延与接收端设备解码历史画面帧的实际解码时延有关,历史画面帧包括帧的标识位于第一画面帧的标识之前的至少一个画面帧。
本申请实施例提供的显示画面同步方法中,在一些实施例中,源端设备可以根据当前传输时延和/或当前解码时延来确定当前时延,然后根据所确定的当前时延来确定第一画面帧的第一指定显示时刻。其中,当前时延不是一个预先设定的固定数值,而是动态变化的、与当前网络传输状态和/或当前接收端设备的解码能力相关联的。从而,源端设备为第一画面帧所指定的第一指定显示时刻是一个合适的数值,尽可能让接收端设备都来得及显示,有助于避免接收端设备不能接收到某些画面帧、不能显示某些画面帧的情况的发生。并且,存在这种当前时延动态变化的机制,在一些实施例中,第一帧的时延就可以不需要设置为一个相对较大的数值(例如200ms、600ms)了,而可以设置为一个相对较小的数值,有助于降低起播时延,提升用户体验。
在一种可能的实现方式中,源端设备还用于与接收端设备进行画面帧的同步显示,此时,源端设备包括丢帧信息,丢帧信息用于记录不进行显示的画面帧的标识,该方法还包括以下步骤:若第一画面帧的帧号不在丢帧信息中,则源端设备按照第一指定显示时刻显示第一画面帧;若第一画面帧的帧号在丢帧信息中,则源端设备不显示第一画面帧。
在一些场景中,若第一画面帧的帧号不在丢帧信息中,则表明接收端设备中的每个接收端设备都成功接收到了第一画面帧,并且都确定能够按照第一指定显示时刻显示第一画面帧,此时,源端设备若也进行画面帧的同步显示,则可以显示第一画面帧。
在一些场景中,若第一画面帧的帧号在丢帧信息中,则表明接收端设备中至少存在一个接收端设备未能成功接收到了第一画面帧,或者,即使成功接收到了第一画面帧,但确定不能够按照第一指定显示时刻显示第一画面帧,此时,源端设备若也进行画面帧的同步显示,则不可以显示第一画面帧。
从而,保证了源端设备和接收端设备画面帧的同步显示。
在一些场景下,上述源端设备包括丢帧信息,可以理解为源端设备中维护有丢帧信息,例如,为丢帧列表分配存储空间、对丢帧信息进行动态更新等。
在一种可能的实现方式中,当源端设备用于与接收端设备进行画面帧的同步显示时,该方法还包括以下步骤:源端设备接收接收端设备发送的丢失帧标识和/或主动丢弃帧标识,并将接收端设备发送的丢失帧标识和/或主动丢弃帧标识添加至丢帧信息中;其中, 接收端设备发送的丢失帧标识包括接收端设备根据接收端设备已成功接收到的画面帧对应的标识确定的接收端设备未成功接收到的画面帧的标识;接收端设备发送的主动丢弃帧标识包括接收端设备确定不能够按照第二指定显示时刻显示的画面帧的标识;第二指定显示时刻是接收端设备发送的主动丢弃帧标识对应的画面帧对应的指定显示时刻。
从而,接收端设备不能够显示的画面帧(即,丢失帧和/或主动丢弃帧),源端设备也不会显示。这样,可以避免接收端设备未显示上述丢失帧和/或主动丢弃帧,但源端设备仍显示上述丢失帧和/或主动丢弃帧的情况的发生,有助于实现显示画面的同步显示。
在一种可能的实现方式中,该方法还包括以下步骤:源端设备接收至少一个接收端设备发送的至少一个接收端设备的当前传输时延和/或至少一个接收端设备的当前解码时延;源端设备将至少一个接收端设备的当前传输时延中的最大值确定为当前传输时延,和/或,将至少一个接收端设备的当前解码时延中的最大值确定为当前解码时延。
如前所述,第一指定显示时刻是源端设备根据当前传输时延和/或当前解码时延确定的,当前解码时延和/或当前传输时延是接收端设备发送给源端设备的。将其中的最大值确定为当前解码时延和/或当前传输时延,有助于源端设备指定的第一指定显示时刻尽可能是一个合适的时刻,尽可能保证全部接收端设备都能够接收到并且都能够来得及显示第一画面帧。
在一种可能的实现方式中,接收端设备的当前传输时延是接收端设备根据接收端设备接收到的历史画面帧的实际传输时延确定的,接收端设备的当前解码时延是接收端设备根据所述接收端设备接收到的历史画面帧的实际解码时延确定的。
这样,可以使得当前传输时延、当前解码实验是与实际应用场景中实际的网络传输情况,和/或,接收端设备实际的解码时延相符合的,有助于源端设备指定的第一指定显示时刻是一个合适的时刻。
在一种可能的实现方式中,该方法还包括以下步骤:源端设备与接收端设备进行时钟同步。由于硬件器件的固有物理属性,源端设备的系统时间与接收端设备的系统时间之间通常不可避免地存在时钟偏差。进行时钟同步可以使接收端设备获取到并保存该时钟偏差。进而,可以使得接收端设备按照使用时钟偏差矫正后的第一指定显示时刻显示第一画面帧,有助于避免时钟偏差造成的画面帧不同步。
在一种可能的实现方式中,源端设备和接收端设备处于同一局域网中。例如,处于同一个WiFi局域网,接入同一个路由器,处于同一个Mesh网络等。从而,接收端设备和源端设备可以方便地进行信息交互。
在一种可能的实现方式中,画面帧的标识包括画面帧的帧号。应理解,画面帧的标识(如上述第一画面帧的标识)也可以包括帧号以外的其他内容,本申请实施例对此不做限定。
本申请实施例的第三方面提供了一种显示画面同步方法,应用于包括源端设备和至少一个接收端设备的系统中。其中,接收端设备为进行画面帧同步显示的设备,或者,接收端设备以及源端设备为进行画面帧同步显示的设备。该方法可以包括以下步骤:源端设备向接收端设备中的每个接收端设备发送第一画面帧,以及第一画面帧对应的第一指定显示时刻;若接收端设备中的每个接收端设备都成功接收到第一画面帧,且都确定能够按照第一指定显示时刻显示第一画面帧,则进行画面帧同步显示的设备中的每个设备都按照第一指定显示时刻显示第一画面帧;若接收端设备中存在第一接收端设备未成功接收到第一画面帧,或者,成功接收到了第一画面帧但确定不能够按照第一指定显示时刻显示第一画面帧,则 进行画面帧同步显示的设备中的每个设备都不显示第一画面帧。
从而,本申请实施例提供的显示画面同步方法,在每个接收端设备都确定能够按照第一指定显示时刻显示第一画面帧的情况下,进行画面帧同步显示的设备才会显示第一画面帧;只要其中存在第一接收端设备不能够按照第一指定显示时刻显示第一画面帧,进行画面帧同步显示的设备就都不显示第一画面帧。保证了显示画面的同步显示。
在一种可能的实现方式中,上述进行画面帧同步显示的设备中的每个设备都包括丢帧信息,丢帧信息用于记录不进行显示的画面帧的标识;上述若接收端设备中存在第一接收端设备未成功接收到所述第一画面帧,或者,成功接收到了第一画面帧但确定不能够按照第一指定显示时刻显示第一画面帧,则进行画面帧同步显示的设备中的每个设备都不显示第一画面帧的步骤,具体包括:若接收端设备中存在第一接收端设备未成功接收到第一画面帧,或者,成功接收到了第一画面帧但确定不能够按照第一指定显示时刻显示第一画面帧,则第一接收端设备将第一画面帧的标识添加至第一接收端设备的丢帧信息中,并将第一画面帧的标识通知给进行画面帧同步显示的设备中的其他设备;进行画面帧同步显示的设备中的每个设备,根据所述第一画面帧的标识在丢帧信息中,不显示第一画面帧。
第一画面帧的标识在丢帧信息中,表明存在第一接收端设备不能够显示第一画面帧,因此,这种情况下进行画面帧同步显示的设备就不显示第一画面帧,有助于实现显示画面的同步显示。在一些场景下,上述进行画面帧同步显示的设备中的每个设备都包括丢帧信息,可以理解为每个设备中维护有丢帧信息,例如,为丢帧信息分配存储空间、对丢帧信息进行动态更新等。
在一种可能的实现方式中,该方法还包括以下步骤:源端设备根据当前传输时延和/或当前解码时延,确定第一指定显示时刻;其中,当前传输时延与历史画面帧的实际传输时延有关,当前解码时延与接收端设备解码历史画面帧的实际解码时延有关,历史画面帧包括帧的标识位于第一画面帧的标识之前的至少一个画面帧。
在一些实施例中,源端设备可以根据当前传输时延和/或当前解码时延来确定当前时延,然后根据所确定的当前时延来确定第一画面帧的第一指定显示时刻。其中,当前时延不是一个预先设定的固定数值,而是动态变化的、与当前网络传输状态和/或当前接收端设备的解码能力相关联的。从而,源端设备为第一画面帧所指定的第一指定显示时刻是一个合适的数值,尽可能让接收端设备都来得及显示,有助于避免接收端设备不能接收到某些画面帧、不能显示某些画面帧的情况的发生。并且,存在这种当前时延动态变化的机制,在一些实施例中,第一帧的时延就可以不需要设置为一个相对较大的数值(例如200ms、600ms)了,而可以设置为一个相对较小的数值,有助于降低起播时延,提升用户体验。
在一种可能的实现方式中,在上述源端设备根据当前传输时延和/或当前解码时延,确定第一指定显示时刻的步骤之前,该方法还包括以下步骤:接收端设备根据接收端设备接收到的历史画面帧的实际传输时延确定接收端设备的当前传输时延,和/或,根据接收端设备接收到的历史画面帧的实际解码时延确定接收端设备的当前解码时延;接收端设备将所确定的接收端设备的当前传输时延和/或接收端设备的当前解码时延发送给源端设备;上述源端设备根据当前传输时延和/或当前解码时延,确定第一指定显示时刻的步骤,具体包括:源端设备将接收到的至少一个接收端设备的当前传输时延中的最大值确定为当前传输时延,和/或,将接收到的至少一个接收端设备的当前解码时延中的最大值确定为当前解码时延。
如前所述,第一指定显示时刻是源端设备根据当前传输时延和/或当前解码时延确定的,当前解码时延和/或当前传输时延是接收端设备发送给源端设备的。将其中的最大值确定为当前解码时延和/或当前传输时延,有助于源端设备指定的第一指定显示时刻尽可能是一个合适的时刻,尽可能保证全部接收端设备都能够接收到并且都能够来得及显示第一画面帧。
在一种可能的实现方式中,该方法还包括以下步骤:源端设备与接收端设备进行时钟同步。由于硬件器件的固有物理属性,源端设备的系统时间与接收端设备的系统时间之间通常不可避免地存在时钟偏差。进行时钟同步可以使接收端设备获取到并保存该时钟偏差。进而,可以使得接收端设备按照使用时钟偏差矫正后的第一指定显示时刻显示第一画面帧,有助于避免时钟偏差造成的画面帧不同步。
在一种可能的实现方式中,源端设备和接收端设备处于同一个局域网中。例如,处于同一个WiFi局域网,接入同一个路由器,处于同一个Mesh网络等。从而,接收端设备和源端设备可以方便地进行信息交互。
本申请实施例第四方面提供了一种接收端设备,该接收端设备包括存储器、处理器以及存储在存储器中并可在处理器上运行的计算机程序,处理器被配置为执行计算机程序时,使上述接收端设备实现如第一方面或者第一方面中任一种可能的实现方式中所述的方法。
本申请实施例第五方面提供了一种源端设备,该源端设备包括存储器、处理器以及存储在存储器中并可在处理器上运行的计算机程序,处理器被配置为执行计算机程序时,使上述源端设备实现如第二方面或者第二方面中任一种可能的实现方式中所述的方法。
本申请实施例第六方面提供了一种显示画面同步系统,该显示画面同步系统包括源端设备和至少一个接收端设备,其中,源端设备、接收端设备分别被配置为用于执行如第三方面或者第三方面中任一种可能的实现方式中,源端设备、接收端设备分别执行的步骤。
本申请实施例第七方面提供了一种计算机可读存储介质,该计算机可读存储介质被配置为存储有计算机程序,当计算机程序被处理器执行时,实现如第一方面或者第一方面中任一种可能的实现方式中所述的方法,或者,实现如第二方面或者第二方面中任一种可能的实现方式中所述的方法。
本申请实施例第八方面提供了一种计算机程序产品,该计算机程序产品被配置为在接收端设备上运行时,使得接收端设备执行如第一方面或者第一方面中任一种可能的实现方式中所述的方法,或者,所述计算机程序产品被配置为在源端设备上运行时,使得源端设备执行如第二方面或者第二方面中任一种可能的实现方式中所述的方法。
本申请实施例第九方面提供了一种芯片系统,该芯片系统包括存储器和处理器,处理器被配置为执行存储器中存储的计算机程序,以实现如第一方面或者第一方面中任一种可能的实现方式中所述的方法,或者,实现如第二方面或者第二方面中任一种可能的实现方式中所述的方法。
附图说明
图1是本申请一实施例提供的一种显示画面同步的应用场景示意图;
图2是本申请一实施例提供的另一种显示画面同步的应用场景示意图;
图3是本申请一实施例提供的一种多屏系统示意图;
图4是本申请一实施例提供的一种指定显示时刻示意图;
图5是本申请一实施例提供的一种显示画面同步方法设备交互示意图;
图7是本申请一实施例提供的另一种指定显示时刻示意图;
图8是本申请一实施例提供的一种接收端设备获取和处理丢失帧号的方法流程图;
图9是本申请一实施例提供的一种接收端设备获取和处理主动丢弃帧号的方法流程图;
图10是本申请一实施例提供的一种接收端设备进行显示画面的同步显示的方法流程图。
具体实施方式
以下描述中,为了说明而不是为了限定,提出了诸如特定系统结构、技术之类的具体细节,以便透彻理解本申请实施例。然而,本领域的技术人员应当清楚,在没有这些具体细节的其它实施例中也可以实现本申请。在其它情况中,省略对众所周知的系统、装置、电路以及方法的详细说明,以免不必要的细节妨碍本申请的描述。
应当理解,当在本申请说明书和所附权利要求书中使用时,术语“包括”指示所描述特征、整体、步骤、操作、元素和/或组件的存在,但并不排除一个或多个其它特征、整体、步骤、操作、元素、组件和/或其集合的存在或添加。
还应当理解,在本申请说明书和所附权利要求书中使用的术语“和/或”是指相关联列出的项中的一个或多个的任何组合以及所有可能组合,并且包括这些组合。
如在本申请说明书和所附权利要求书中所使用的那样,术语“如果”可以依据上下文被解释为“当...时”或“一旦”或“响应于确定”或“响应于检测到”。类似地,短语“如果确定”或“如果检测到[所描述条件或事件]”可以依据上下文被解释为意指“一旦确定”或“响应于确定”或“一旦检测到[所描述条件或事件]”或“响应于检测到[所描述条件或事件]”。
另外,在本申请说明书和所附权利要求书的描述中,术语“第一”、“第二”、“第三”等仅用于区分描述,而不能理解为指示或暗示相对重要性。
在本申请说明书中描述的参考“一个实施例”或“一些实施例”等意味着在本申请的一个或多个实施例中包括结合该实施例描述的特定特征、结构或特点。由此,在本说明书中的不同之处出现的语句“在一个实施例中”、“在一些实施例中”、“在其他一些实施例中”、“在另外一些实施例中”等不是必然都参考相同的实施例,而是意味着“一个或多个但不是所有的实施例”,除非是以其他方式另外特别强调。术语“包括”、“包含”、“具有”及它们的变形都意味着“包括但不限于”,除非是以其他方式另外特别强调。
在本申请说明书中,所述“画面”、“画面帧”、“显示画面”、“显示内容”、“显示画面帧”、“帧”等词语,依据具体的上下文,多数情况下可以理解为具有相同的含义、指代同一类事物,除非是以其他方式另外特别强调。
在本申请说明书中,以“帧号”作为“帧标识”的一种具体示例进行讲解,但这并不构成对本申请实施例提供的方案的限定。本领域技术人员可以将示例中的帧号替换为任意一种其他类型的帧标识,而均不超出本申请实施例提供的方案的范围。
首先,结合附图介绍本申请实施例提供的显示画面同步方法、系统及电子设备的应用场景。
图1示例性展示了本申请实施例提供的一种显示画面同步的应用场景示意图。图1所示场景为拼接屏的显示画面同步,其中,拼接屏也可以叫做分体屏、组合屏等,通常可以用于需要大面积显示的场景中,例如户外或商场广告牌、机场航班信息显示等。拼接屏可包含多个显示设备,例如4个、6个、9个,每个显示设备分别显示画面的一部分,多个显示设备的显示画面拼接构成完整的显示画面。如图1所示,该拼接屏包含显示设备A、显示设备B、显示设备C和显示设备D,4个显示设备分别显示图1中所示显示画面的左上部分、右上部分、左下部分和右下部分。为了确保在某一时刻拼接屏中的多个显示设备显示的画面能够构成完整的显示画面,拼接屏中的多个显示设备需要进行显示画面同步。
图2示例性展示了本申请实施例提供的另一种显示画面同步的应用场景示意图。图2所示场景为显示相同画面的多个显示设备的显示画面同步,通常可应用于例如教学、会议等场景。如图2所示,显示设备E和显示设备F需要在相同的时刻显示相同的画面,因此显示设备E和显示设备F需要进行显示画面同步。
基于图1和图2,图3示例性展示了本申请实施例提供的一种多屏系统示意图。多屏系统300可以包括拼接屏,和/或,一个或多个显示设备。
也就是说,本申请实施例所述的多屏系统300,可仅包括拼接屏,其中,拼接屏可包括至少两个显示设备,至少两个显示设备中的每个显示设备分别显示显示画面的一部分,每个显示设备的显示画面拼接构成完整的显示画面;或者,也可仅包括一个或多个显示设备,该一个或多个显示设备显示相同的显示画面;或者,也可既包括拼接屏、又包括一个或多个显示相同的显示画面的显示设备。本申请实施例所述的多屏系统300可以为任意一种至少包括两个显示设备、需要进行显示画面同步的系统。
为了实现显示画面同步,如背景技术中所述,目前通常采用的是硬件方案,通过有线方式传输显示画面。例如,可以从多屏系统中任意选择一个显示设备作为源端(source)设备,其他显示设备作为接收端(sink)设备。其中,源端设备用于获取显示画面,例如从网络获取、本地获取或者从其他设备获取显示画面,并将获取的显示画面发送给接收端设备;接收端设备用于在接收到显示画面后进行显示。
以图1所示场景为例,可以将显示设备A作为源端设备,将显示设备B、显示设备C、显示设备D作为接收端设备,则显示设备A可以分别与显示设备B、显示设备C、显示设备D进行有线连接,通过该有线连接将显示画面分别传输给显示设备B、显示设备C、显示设备D。由于有线连接的传输时延通常非常小,可以忽略,因此可以不进行对时同步,显示设备B、显示设备C、显示设备D接收到显示内容就可以直接解码送显。然而,通过有线连接实现显示画面同步的方案需要增加额外的硬件成本,且布线复杂。
当然,在一些实现方式中,多屏系统中的显示设备也可以都作为接收端设备工作,多屏系统中的显示设备与多屏系统之外的一个源端设备连接,从源端设备获取显示画面。此时,源端设备可以不必须与多屏系统中的显示设备进行显示画面的同步显示,而是作为显示画面的提供设备工作。这种实现方式若采用有线连接的硬件传输方案,同样会存在如上一段所述增加额外的硬件成本、布线复杂的问题。
有鉴于此,本申请实施例提供一种通过软件的方式实现的显示画面同步方法:源端设备指定每一帧显示画面的显示时刻,接收端设备根据源端设备指定的每一帧显示画面的显示时刻显示每一帧显示画面。为了便于描述,本申请实施例中,
将源端设备指定的第i帧显示画面的显示时刻简称为“指定显示时刻”,记作 playTime i,其中,i用来表示帧号,i=1,2,…,M。例如,第1帧显示画面的指定显示时刻记作playTime 1,第2帧显示画面的指定显示时刻记作playTime 2,…,以此类推。
应理解,为了便于实施,源端设备可以是按照源端设备自身的系统时间来指定每一帧的显示时刻的。在一些实施例中,接收端设备的系统时间可能与源端设备的系统时间存在时钟偏差,则接收端设备可以根据该时钟偏差来矫正源端设备提供的指定显示时刻,从而接收端设备根据校正后的指定显示时刻显示每一帧显示画面,来实现显示画面同步。本申请实施例所述的接收端设备根据指定显示时刻进行显示,均可以指接收端设备根据矫正后的指定显示时刻进行显示。
将源端设备获取第1帧显示画面的时刻记作startTime。例如,可以是源端设备从网络获取、本地获取或者从其他设备获取第1帧显示画面时,源端设备的系统时间。
将第i帧显示画面相对于第1帧显示画面的延后时间记作pts i,其中,“pts”可以表示显示时间戳(presentation time stamp)。
假设显示帧率为60帧每秒(frame per second,fps),则相邻两帧显示画面之间的时间间隔为1/60≈16.67毫秒(ms)。第1帧显示画面对应的pts 1为0,因此有第2帧显示画面对应的pts 2=16.67×1=16.67,第3帧显示画面对应的pts 3=16.67×2=33.34ms,…,pts i=16.67×(i-1)ms,以此类推。
从而,第i帧显示画面理论上应当在startTime+pts i的时刻进行显示。为了便于描述,将startTime+pts i简称为“理论显示时刻”。在一些实施例中,源端设备可以将理论显示时刻作为指定显示时刻。即playTime i=startTime+pts i
但是,在实际应用场景中,源端显示设备获取到第i帧显示画面,直至接收端显示设备能够显示第i帧显示画面,这段时间内还需要进行第i帧显示画面的传输、解码等,都需要花费一定的时间。如果源端显示设备和接收端显示设备之间通过无线的方式传输显示内容,则传输时间还会受到无线网络传输状态的影响,存在一定的波动。
因此,在本申请实施例提供的显示画面同步方法中,源端显示设备可以在理论显示时刻基础上,加上一段时延(记作delayTime),用来表示显示画面延后的显示时间,作为一种预留的余量,尽可能保证接收端显示设备能够在指定显示时刻之前,获取到并解码完显示内容。避免发生指定显示时刻到了,接收端显示设备却还没有接收到显示画面,或者还没有完成显示画面解码的情况。即playTime i=startTime+pts i+delayTime。
如图4所示,源端设备获取第1帧显示画面的时刻为startTime,则源端设备针对第1帧显示画面的指定显示时刻playTime 1=startTime+pts 1+delayTime,源端设备针对第2帧显示画面的指定显示时刻playTime 2=startTime+pts 2+delayTime,源端设备针对第3帧显示画面的指定显示时刻playTime 3=startTime+pts 3+delayTime,……,以此类推。
在一些实施例中,为了应对网络传输状态不稳定的问题,可以将时延delayTime设置得相对大一些,例如设置为200ms、600ms。从而当网络传输状态产生波动时,尽可能使得显示画面同步不受影响。
然而,这种针对每一帧都采用相同的时延delayTime的实现方式仍旧会存在起播时延大的使用体验。也就是说,例如,用户从按下开始播放按键,直到多屏系统中的显示设备开始同步显示第1帧画面之间,存在一个较大的、用户可感知的时延,影响用户体验。
另外,如果不同的接收端设备与源端设备之间的网络传输状态变化不一致,仍旧会 存在显示画面无法同步的情况。示例性地,以图1所示场景为例,假设显示设备A作为源端设备,显示设备B、显示设备C、显示设备D作为接收端显示设备。假设显示设备B与显示设备A、显示设备C与显示设备A之间的网络传输状态正常,显示设备D与显示设备A之间的网络传输状态变差,则有可能会发生显示设备B、显示设备C能够在指定显示时刻显示第i帧显示画面,但显示设备D无法在指定显示时刻显示第i帧显示画面的情况,此时,显示设备D只能停留在显示第(i–1)帧显示画面,导致显示画面无法同步。
有鉴于此,本申请实施例提供另一种通过软件的方式实现的显示画面同步方法:
每个接收端设备都维护有丢帧信息,用于记录不进行显示的帧的帧号。示例性地,丢帧信息的形式可以是“丢帧列表”(记作dropList j)。丢帧信息也可以以其他形式存在,例如丢帧池、丢帧队列、丢帧链表等,本申请对此不作具体限定。为了便于描述,本申请实施例主要以丢帧列表为例进行描述。
当某个接收端设备确定无法显示某一帧,和/或,确定无法在指定显示时刻(记作playTime i)显示某一帧时,就将该帧的序号加入自己的丢帧列表,并通知其他接收端设备也将该帧的帧号加入它们的丢帧列表,从而使得每个接收端设备维护的丢帧列表是相同的。每个接收端设备都不显示丢帧列表中记录的帧、只显示丢帧列表中未记录的帧。只要有一个接收端设备无法显示某一帧,和/或,无法在指定显示时刻显示某一帧,则其他接收端设备也不显示该帧。避免了同一时刻不同接收端设备显示不同帧的情况,实现了显示画面的同步。
在一些实施例中,由于传输过程中的故障导致某个接收端设备未能成功接收到某一帧,则该接收端设备确定无法显示该帧。
在一些实施例中,某个接收端设备可以根据历史帧的解码时延,估计(或者说预测)该帧的估计解码时延(记作decodeTime i_Sj_pred),进而确定该帧的估计显示时刻,如果估计显示时刻超过了该帧的指定显示时刻,则该接收端设备确定该接收端设备无法在指定显示时刻显示该帧。
在一些实施例中,指定显示时刻可以是由源端设备根据当前时延(记作delayTime i),为每一帧显示画面针对性确定的。其中,当前时延是动态变化的,而不是预先设定的固定值。
在一些实施例中,当前时延可以是源端设备根据接收端设备发送给它的当前传输时延(记作transTime Sj)、当前解码时延(记作decodeTime Sj)确定的。
在一些实施例中,接收端设备发送给源端设备的当前传输时延、当前解码时延,可以是接收端设备根据其接收到的历史帧的传输时延、历史帧的解码时延估计得到的。
从而,本申请实施例提供的显示画面同步方法,通过接收端设备维护相同的丢帧列表,通过源端设备根据动态变化的当前时延针对性确定每一帧的指定显示时刻,实现了显示画面的同步。本申请实施例提供的显示画面同步方法不需要增加额外的硬件成本,能够使用纯软件的方法实现显示画面的同步,易于实施且用户体验良好。
接下来,结合附图详细介绍本申请实施例提供的显示画面同步方法。
图5示例性展示了本申请实施例提供的一种显示画面同步方法设备交互示意图。
该方法涉及源端设备S 0和至少一个接收端设备S [1-N]。为了便于描述,本申请实施例将接收端设备S [1-N]中的任意一个接收端设备记作S j,其中,j用来表示接收端设备编号,j=1,2,…,N。
例如,在图1所示场景中,可以将显示设备A作为源端设备S 0,将显示设备B、显示设备C、显示设备D分别作为接收端设备S 1、S 2、S 3;源端设备S 0和接收端设备S [1-3]需要进行显示画面的同步显示。当然,在另一些实现方式中,在图1所示场景中,也可以将显示设备A、显示设备B、显示设备C、显示设备D分别作为接收端设备S 1、S 2、S 3、S 4,将多屏系统之外的另外一个设备作为源端设备S 0;接收端设备S [1-4]需要进行显示画面的同步显示。应理解,本申请实施例不限定源端设备S 0是否必须进行显示画面的同步显示。
如图5所示,该方法可以包括步骤501至步骤509,具体地:
步骤501、源端设备S 0与接收端设备S [1-N]建立连接。
在一些实施例中,源端设备S 0与接收端设备S [1-N]处于同一个局域网(local area network,LAN)内,该局域网可以是有线或者无线的。例如,源端设备S 0与接收端设备S [1-N]都连接至同一个路由器。源端设备S 0与接收端设备S [1-N]建立的连接例如可以是WiFi连接等。
在一些实施例中,源端设备S 0可以分别与每个接收端设备S j建立专属的传输通道,通过该专属的传输通道与接收端设备S j进行数据交互;在一些实施例中,源端设备S 0与接收端设备S [1-N]之间组成Mesh网络,通过在该Mesh网络内广播的方式进行数据交互。
应理解,本申请实施例不限定源端设备S 0与接收端设备S [1-N]建立连接的方式和类型。
步骤502、源端设备S 0与接收端设备S [1-N]进行时钟同步。
电子设备通常都设置有自己的系统时间,电子设备根据该系统时间来执行指定的功能。通常,不同的电子设备的系统时间之间难以避免地存在一定的偏差。
因此,在一些实施例中,源端设备S 0与接收端设备S [1-N]可以进行时钟同步,以确定每个接收端设备S j的系统时间与源端设备S 0的系统时间之间的时钟偏差,本申请实施例将时钟偏差记作offset Sj
例如,可以通过精确时间协议(precision time protocol,PTP)协议进行时钟同步。应理解,本申请实施例不限定源端设备S 0与接收端设备S [1-N]之间具体采用何种方式进行时钟同步。
在一些实施例中,每个接收端设备S j可以将通过时钟同步得到的接收端设备S j的系统时间与源端设备S 0的系统时间之间的时钟偏差offset Sj记录在接收端设备S j本地。从而,每当接收端设备S j接收到源端设备S 0发送的一帧显示画面,都可以根据时钟偏差offset Sj来矫正该帧的指定显示时刻,以根据矫正后的指定显示时刻进行显示画面的同步显示。
步骤503、源端设备S 0向接收端设备S [1-N]发送显示画面,相应地,接收端设备S [1-N]接收显示画面。
在一些实施例中,如图6所示,源端设备S 0可以将第i帧显示画面的帧号(记作frameID i)、源端设备S 0发送该帧显示画面时源端设备S 0的系统时间(简称“发送时刻”,记作sendTime i)、源端设备S 0针对该帧显示画面指定的指定显示时刻playTime i、该帧显示画面(记作payload i),打包为消息M1发送给每个接收端设备S j。其中,i用来表示帧号,i=1,2,…,M。j用来表示接收端设备编号,j=1,2,…,N。
应理解,图6仅作为一种实现方式的示例,本申请实施例不限定步骤503的具体实现方式,也不限定帧号、发送时刻、指定显示时刻、显示画面在消息M1中的具体排列顺序。消息M1可以包括比图6所示更多或更少的字段。
在一些实施例中,第i帧显示画面的指定显示时刻playTime i可以是由源端设备S 0根据当前时延delayTime i,为第i帧显示画面针对性确定的。
例如,playTime i的计算方式可以为playTime i=startTime+pts i+delayTime i。其中,“startTime”、“pts i”、“delayTime i”的具体含义与前述部分的描述相同,此处不再赘述。
如图7所示,源端设备S 0获取第1帧显示画面的时刻为startTime,则源端设备针对第1帧显示画面的指定显示时刻playTime 1=startTime+pts 1+delayTime 1,源端设备针对第2帧显示画面的指定显示时刻playTime 2=startTime+pts 2+delayTime 2,源端设备针对第3帧显示画面的指定显示时刻playTime 3=startTime+pts 3+delayTime 3,……,以此类推。
图7所示实现方式,与图4所示实现方式的不同之处在于:图4所示实现方式中,针对每一帧显示画面,源端设备S 0都采用相同的时延delayTime来确定指定显示时刻;图7所示实现方式中,针对每一帧显示画面,源端设备S 0根据动态变化的当前时延delayTime i来确定指定显示时刻。
在一些实施例中,当前时延delayTime i可以是源端设备S 0根据接收端设备S [1-N]的当前传输时延(记作transTime S[1-N])、接收端设备S [1-N]的当前解码时延(记作decodeTime S[1-N])确定的。
由于网络传输状态、接收端设备S [1-N]的处理能力存在波动,因此,每一帧显示画面传输过程中产生的时延、接收端设备S [1-N]解码该帧显示画面过程中产生的时延是实时变化的。例如,若网络传输状态变差,则当前传输时延会变长;若接收端设备S [1-N]的处理器负载升高,则当前解码时延会变长;反之同理。
有鉴于此,本申请实施例提供的显示画面同步方法中,源端设备S 0可以根据实时变化的当前传输时延transTime S[1-N]、当前解码时延decodeTime S[1-N],来确定针对第i帧显示画面的当前时延delayTime i,使得源端设备S 0针对第i帧显示画面的指定显示时刻playTime i能够动态地适应于当前时延,有助于避免显示画面不同步情况的产生。
在一些实施例中,源端设备S 0可以根据接收到的N个接收端设备S [1-N]的当前传输时延transTime S[1-N],确定针对第i帧显示画面的当前传输时延transTime i
在一种可能的实现方式中,具体地,第i帧显示画面的当前传输时延的计算方式可以为:transTime i=max(transTime S1,…,transTime SN)。其中,transTime Sj(j=1,…,N)用于表示接收端设备S j发送给源端设备S 0的该接收端设备S j的当前传输时延。在一些实施例中,transTime Sj可以是接收端设备S j根据自己接收到的历史帧的传输时延进行拟合、估计得到的。用来计算transTime Sj的一些具体的实现方式,将在后续步骤505部分进行详细介绍,此处先不赘述。也就是说,在这种实现方式中,源端设备S 0可以在接收到N个接收端设备S [1-N]发送给它的N个接收端设备S [1-N]的当前传输时延transTime S[1-N]后,取其中的最大值,作为针对第i帧显示画面的当前传输时延transTime i
在一些实施例中,源端设备S 0可以根据接收到的N个接收端设备S [1-N]的当前解码时延decodeTime S[1-N],确定针对第i帧显示画面的当前解码时延decodeTime i
在一种可能的实现方式中,具体地,第i帧显示画面的当前解码时延的计算方式可以为:decodeTime i=max(decodeTime S1,…,decodeTime SN)。其中,decodeTime Sj(j=1,…,N)用于表示接收端设备S j发送给源端设备S 0的该接收端设备S j的当前解码时延。在一些实施例中,decodeTime Sj可以是接收端设备S j根据自己接收到的历史帧的解码时 延进行拟合、估计得到的。用来计算decodeTime Sj的一些具体的实现方式,将在后续步骤508部分进行详细介绍,此处先不赘述。也就是说,在这种实现方式中,源端设备S 0可以在接收到N个接收端设备S [1-N]发送给它的N个接收端设备S [1-N]的当前解码时延decoeTime S[1-N]后,取其中的最大值,作为针对第i帧显示画面的当前解码时延decodeTime i
应理解,本申请实施例不限定源端设备S 0确定针对第i帧显示画面的当前传输时延transTime i、当前解码时延decodeTime i的具体实现方式。
在一些实施例中,针对第1帧显示画面,源端设备S 0可以将delayTime 1依据经验值或尝试性地设置为一个较小的值,例如30ms、60ms。后续,再根据N个接收端设备S [1-N]是否能够成功显示画面帧,和/或根据N个接收端设备S [1-N]反馈的当前传输时延delayTime S[1-N]、当前解码时延decodeTime S[1-N]来动态调整后续显示画面帧的delayTime i。例如,如果存在接收端设备S j确定无法在指定显示时刻playTime 1显示第1帧,则N个接收端设备S [1-N]就都不显示第1帧,并且源端设备S 0可以将第2帧的delayTime 2设置得比第1帧的delayTime 1稍大一些,来尽可能使得N个接收端设备S [1-N]可以显示第2帧。
本申请实施例提供的这种显示画面同步方法,可以将第1帧显示画面帧的delayTime 1设置为一个较小的值,后续帧的delayTime i再根据实际情况动态调整;而不必如前述实施例方案那样,所有显示画面帧(包括第1帧)都采用一个较大的固定值(例如200ms、600ms)作为其delayTime;从而本申请实施例提供的这种显示画面同步方法在保证显示画面同步的同时,也解决了前述实施例方案起播时延大的问题。
在实施时,源端设备S 0对于当前时延delayTime i的取值通常需要大于handleTime+transTime i+decodeTime i。其中,handleTime用于表示一帧显示画面的处理时间,相对波动不大,因此可以取一固定值,例如取10ms。所述处理时间,可以指源端设备S 0处理显示画面帧所消耗的时间,例如将显示画面帧进行加密、将包含显示画面帧的报文按照所采用的传输协议进行报文封装、调用加密软件、调用报文封装软件等过程所消耗的时间。
在一些实施例中,示例性地,源端设备S 0对于当前时延delayTime i的具体计算方式可以为:delayTime i=handleTime+transTime i+decodeTime i+2×framePlayTime。
其中,framePlayTime用于表示一帧显示画面持续显示的时间,也就是帧率的倒数,例如若帧率为60帧每秒,则framePlayTime=1/60≈16.67ms。通过增加两倍的framePlayTime作为一种预留的余量,可以提高显示画面同步的容错率。当然,以上仅作为一种示例而非限定,在实施时,也可以设置任意值(例如20ms、40ms)作为一种预留的余量,不一定采用整数倍的framePlayTime。
步骤504、接收端设备S [1-N]中的每个接收端设备S j都将自己的丢失帧号发送给其他接收端设备。可选地,还可将丢失帧号发送给源端设备S 0
在一些实施例中,如图5所示,每个接收端设备S j可以将自己的丢失帧号打包为消息M2后发送给其他接收端设备。可选地,还可以将消息M2发送给源端设备S 0
在本申请实施例中,丢失帧号用于表示由于传输过程中的故障导致的接收端设备未成功接收到的显示画面帧的序号。
例如,接收端设备S j可以判断自己是否未能成功接收到某个帧号的显示画面,确定自己的丢失帧号,将自己的丢失帧号加入自己的丢帧列表,并通知其他进行显示画面同步的设备也将该丢失帧号加入它们的丢帧列表中。
只要进行显示画面同步显示的全部设备中,有任意一个设备未成功接收到某帧,则进行显示画面同步显示的全部设备就都不会显示该帧,不论它们是否成功接收到了该帧。
从而,本申请实施例提供的方法,可以通过步骤504,避免由于传输过程中的故障致使某个或某些接收端设备未成功接收到某个或某些帧,而导致的显示画面不同步的情况。
在一些实施例中,如图8所示,步骤504的一种具体实现方式可以为:
步骤5041、在接收端设备S j接收到消息M1后,接收端设备S j可以从消息M1中获取该帧显示画面的帧号framID i
步骤5042、接收端设备S j判断帧号是否连续,若帧号连续,则执行步骤5046;若帧号不连续,则执行步骤5043。
示例性地,假设接收端设备S j上一次接收到的帧号为4,此次接收到的帧号为5,则帧号连续;假设接收端设备S j上一次接收到的帧号为4,此次接收到的帧号为6,则帧号不连续。
应理解,以上描述仅作为判断帧号是否连续的一种示例,而非限定。如果显示画面帧在传输过程中采用了预设的帧排序或帧传输算法,则接收端设备Sj可以结合预设的帧排序或帧传输算法,来判断帧号是否连续。
步骤5043、接收端设备S j计算丢失帧号。
在一些实施例中,接收端设备S j根据历史接收到的帧号来计算丢失帧号。示例性地,假设接收端设备S j历史接收到的帧号分别为1、2、3、5,则接收端设备Sj计算得到丢失了帧号为4的显示画面。
应理解,以上描述仅作为计算丢失帧号的一种示例,而非限定。如果显示画面帧在传输过程中采用了预设的帧排序或帧传输算法,和/或采用了预设的丢帧重传算法,则接收端设备Sj可以结合预设的帧排序或帧传输算法、预设的丢帧重传算法,来计算丢失帧号。
步骤5044、接收端设备S j将丢失帧号加入自己的丢帧列表dropList j
在一些实施例中,每个接收端设备S j都维护有一个自己的丢帧列表dropList j,用于记录不进行显示的显示画面的帧号。在一些实施例中,每个接收端设备S j可以在步骤503之前创建好该丢帧列表dropList j。dropList j可以以数组、向量、列表、链表等计算机数据结构实现,本申请实施例对此具体不做限定。从而在步骤5044中,接收端设备S j可以将计算得到的丢失帧号加入到自己的丢帧列表drotList j中。例如,通过追加、插入、更新等方式,将丢失帧号加入dropList j
步骤5045、接收端设备S j将丢失帧号发送给其他接收端设备;相应地,其他接收端设备在接收到接收端设备S j发送的接收端设备S j的丢失帧号后,也将这些丢失帧号加入该其他接收端设备自己的丢帧列表dropList j’中。
可选地,在步骤5045中,接收端设备S j还可以将丢失帧号发送给源端设备S 0。例如,在源端设备S 0也进行显示画面的同步显示的情况下,源端设备S 0也维护有他自己的丢帧列表dropList 0,接收端设备S j也可以将丢失帧号发送给源端设备S 0;相应地,源端设备S 0在接收到丢失帧号后,将这些丢失帧号加入它的丢帧列表dropList 0中。
步骤5046、接收端设备S j将帧号frameID i对应的显示画面payload i送入解码器。其中,解码器可以是安装在接收端设备S j中的显示画面解码软件,可以预设有所需的解码算法,用于解码显示画面。
步骤505、接收端设备S [1-N]中的每个接收端设备S j都确定自己的当前传输时延transTime Sj,并将所确定的自己的当前传输时延发送给源端设备S 0
在一些实施例中,如图5所示,每个接收端设备S j可以将确定的自己的当前传输时延打包为消息M3后发送给源端设备S 0
在一些实施例中,接收端设备S j接收到消息M1后,接收端设备S j可以从消息M1中获取该第i帧显示画面的发送时间sendTime i
如前所述,发送时间sendTime i是源端设备S 0打包发送消息M1时源端设备S 0自己的系统时间。又由于接收端设备S j的系统时间与源端设备S 0的系统时间可能存在时钟偏差offset Sj
因此,在一种实现方式中,接收端设备S j可以针对该第i帧显示画面,根据时钟偏差offset Sj,以及接收端设备S j接收到该第i帧显示画面时接收端设备S j的系统时间(记作rcvTime i_Sj),计算得到第i帧显示画面由源端设备S 0发送至接收端设备S j的实际传输时延(记作transTime i_Sj):
即transTime i_Sj=rcvTime i_Sj-offset Sj-sendTime i
该实际传输时延transTime i_Sj可以理解为第i帧显示画面由源端设备S 0发送至接收端设备S j,在传输路径上实际消耗的时间。
在一些实施例中,接收端设备S j中可以记录有它历史上接收到的显示画面的实际传输时延。从而,接收端设备S j可以根据这些历史上接收到的显示画面的实际传输时延,通过估计(或者说预测)确定自己的当前传输时延transTime Sj
具体地,在一种可能的实现方式中:
(1)当接收端设备S j历史累计接收到的帧数小于预设数量(例如30)时,接收端设备S j直接用第i帧显示画面的实际传输时延transTime i_Sj作为接收端设备S j自己的当前传输时延transTime Sj 。
(2)当接收端设备S j历史累计接收到的帧数大于等于预设数量(例如30)时,接收端设备S j基于这些历史帧的实际传输时延进行拟合,例如通过最小二乘法进行线性拟合,得到接收端设备S j的拟合传输时延transTime Sj_fitting和帧号frameID i之间的线性关系:transTime Sj_fitting=a×frameID i+b。其中,a和b为拟合参数,可以是通过线性拟合计算得到的,a用于表示拟合得到的线性函数在坐标系中的斜率,b用于表示截距。从而,将第i帧显示画面的帧号frameID i带入transTime Sj_fitting=a×frameID i+b,就可以将计算得到的拟合传输时延transTime Sj_fitting(frameID i)作为接收端设备S j自己的当前传输时延transTime Sj 。
除了以上所述的接收端设备S j确定自己的当前传输时延transTime Sj的方式,还可以通过求历史帧的实际传输时延的平均数、众数等方式确定。应理解,本申请实施例不限定接收端设备S j确定自己的当前传输时延transTime Sj的具体实现方式。
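作为上述"样本不足时直接取最新实际传输时延、样本充足时做最小二乘线性拟合"这一思路的示意性Python草图,可以参考如下代码。其中history为按接收顺序记录的(帧号, 实际传输时延)样本,实际传输时延即按前述transTime i_Sj=rcvTime i_Sj-offset Sj-sendTime i得到;函数名与参数名均为本文假设。类似的拟合方式也可类比用于后文步骤508中解码时延的估计。

```python
def estimate_current_trans_time(history, latest_frame_id, min_samples=30):
    """history为[(帧号, 实际传输时延), ...]。样本数不足min_samples时直接返回最新一帧的实际传输时延;
    否则用最小二乘法拟合 transTime = a*frameID + b,再代入当前帧号得到当前传输时延transTime_Sj。"""
    if len(history) < min_samples:
        return history[-1][1]                      # 样本不足:直接取最新实际传输时延
    n = len(history)
    sum_x = sum(fid for fid, _ in history)
    sum_y = sum(t for _, t in history)
    sum_xy = sum(fid * t for fid, t in history)
    sum_xx = sum(fid * fid for fid, _ in history)
    denom = n * sum_xx - sum_x * sum_x
    if denom == 0:                                 # 帧号全部相同等退化情况,退回为最新样本
        return history[-1][1]
    a = (n * sum_xy - sum_x * sum_y) / denom       # 斜率a
    b = (sum_y - a * sum_x) / n                    # 截距b
    return a * latest_frame_id + b
```

样本较少时直接采用最新一帧的实际传输时延,是因为此时拟合不稳定;样本充足后再用拟合值,可以平滑掉单帧的网络抖动。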
通过执行步骤505,接收端设备S [1-N]中的每个接收端设备S j都将所确定的自己的当前传输时延transTime Sj发送给源端设备S 0,从而,源端设备S 0获取到N个接收端设备S [1-N]发送给它的N个当前传输时延transTime S[1-N]
在下一次循环过程中(即下一帧显示画面的同步显示过程中),当下一次执行步骤503时,根据前述部分所述,源端设备S 0可以根据此次N个接收端设备S [1-N]各自发送给它的当前传输时延transTime S[1-N],确定针对第(i+1)帧显示画面的当前传输时延transTime i+1,例如取transTime S[1-N]中的最大值。
进而,源端设备S 0可以根据针对第(i+1)帧显示画面的当前传输时延transTime i+1,确定针对第(i+1)帧显示画面的指定显示时刻playTime i+1
因此,通过本申请实施例提供的显示画面同步方法,源端设备S 0得以针对第i帧显示画面,根据动态变化的当前传输时延transTime i来确定指定显示时刻playTime i 。
应理解,本申请实施例不限定步骤505的执行顺序。在一些实施例中,步骤505只要在下一次执行步骤503之前执行完成即可。
步骤506、接收端设备S [1-N]中的每个接收端设备S j都将自己的主动丢弃帧号(若存在)发送给其他接收端设备。可选地,还可将主动丢弃帧号发送给源端设备S 0
在一些实施例中,如图5所示,每个接收端设备S j可以将自己的主动丢弃帧号打包为消息M4后发送给其他接收端设备。可选地,还可以将消息M4发送给源端设备S 0,例如在源端设备S 0也进行显示画面的同步显示的情况下。
在本申请实施例中,主动丢弃帧号用于表示由于估计显示时刻超过了指定显示时刻,而导致的接收端设备无法在指定显示时刻显示的显示画面帧的序号。
例如,接收端设备S j可以估计(或者说预测)当前接收到的第i帧显示画面所需的解码时延(称为“估计解码时延”,记作decodeTime i_Sj_pred),然后根据该估计解码时延,得到当前第i帧显示画面的估计显示时刻。如果估计显示时刻超过了源端设备S 0为该第i帧显示画面指定的指定显示时刻,则接收端设备S j可以倾向性认为,就算此次对该第i帧显示画面进行了解码,解码完成后的时刻也很可能早已经超过了指定显示时刻,来不及进行显示了。因此,接收端设备S j需要主动丢弃该帧,并将该帧的帧号也发送给其他接收端设备。
只要进行显示画面同步显示的全部设备中,有任意一个设备无法在指定显示时刻显示该帧,则进行显示画面同步显示的全部设备就都不会显示该帧。
从而,本申请实施例提供的方法,可以通过步骤506,避免由于某个或某些接收端设备无法在指定显示时刻成功显示某个或某些帧,而导致的显示画面不同步的情况。
在一些实施例中,考虑到步骤504执行完成后,当前第i帧显示画面已经被送入了解码器,因此步骤506的执行主体可以是该解码器。
在一些实施例中,如图9所示,步骤506的一种具体实现方式可以为:
步骤5061、接收端设备S j估计当前的第i帧显示画面payload i所需的解码时延(即“估计解码时延”,记作decodeTime i_Sj_pred)。
在一些实施例中,接收端设备S j中可以记录有它历史上接收到的显示画面的实际解码时延。从而,接收端设备S j可以根据这些历史上接收到的显示画面的实际解码时延,估计当前的第i帧显示画面所需的解码时延。
具体地,在一种可能的实现方式中:
(1)当接收端设备S j历史累计接收到的帧数小于预设数量(例如30)时,接收端设备Sj直接用第(i-1)帧显示画面的实际解码时延作为当前第i帧显示画面的估计解码时延decodeTime i_Sj_pred
(2)当接收端设备S j历史累计接收到的帧数大于等于预设数量(例如30)时,接收端设备S j基于这些历史显示画面的实际解码时延进行拟合,例如通过最小二乘法进行线性拟合,得到接收端设备S j的拟合解码时延decodeTime Sj_fitting和帧号frameID i之间的线性关系:decodeTime Sj_fitting=c×frameID i+d。其中,c和d为拟合参数,可以是通过线性拟合计算得到的,c用于表示拟合得到的线性函数在坐标系中的斜率,d用于表示截距。从而,将当前第i帧显示画面的帧号frameID i带入decodeTime Sj_fitting=c×frameID i+d,就可以将计算得到的拟合解码时延decodeTime Sj_fitting(frameID i)作为接收端设备S j对当前第i帧显示画面的估计解码时延decodeTime i_Sj_pred 。
除了以上所述的接收端设备S j确定第i帧显示画面的估计解码时延的方式,还可以通过求历史帧的实际解码时延的平均数、众数等方式确定。应理解,本申请实施例不限定接收端设备S j确定第i帧显示画面的估计解码时延的具体实现方式。
步骤5062、接收端设备S j根据估计解码时延decodeTime i_Sj_pred,判断是否能够在指定显示时刻playTime i之前完成解码,若能,则执行步骤507,进行解码;若不能,则执行步骤5063。
在一些实施例中,若满足预设条件,则接收端设备S j判断能够在指定显示时刻playTime i之前完成解码。示例性地,上述预设条件可以为:
systemTime Sj+decodeTime i_Sj_pred-offset Sj<playTime i+framePlayTime。
其中,“systemTime Sj”用于表示接收端设备S j的系统时间;其他符号的含义如前所述,此处不再赘述。
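上述判断条件可以用如下示意性的Python代码表达;其中should_decode、receiver.drop_list、peer_channel.broadcast_drop等名称均为本文假设的接口,所有时间量假设以毫秒为单位。

```python
def should_decode(system_time_sj, offset_sj, decode_time_pred, play_time_i, frame_play_time):
    """步骤5062的判断条件:systemTime_Sj + decodeTime_i_Sj_pred - offset_Sj < playTime_i + framePlayTime。"""
    return system_time_sj + decode_time_pred - offset_sj < play_time_i + frame_play_time

def handle_frame_before_decode(receiver, frame_id, system_time_sj, offset_sj,
                               decode_time_pred, play_time_i, frame_play_time):
    """能按时完成解码则进入步骤507;否则执行步骤5063~5065,主动丢弃该帧并通知其他设备。"""
    if should_decode(system_time_sj, offset_sj, decode_time_pred, play_time_i, frame_play_time):
        receiver.decoder.decode(frame_id)                    # 步骤507:解码
    else:
        receiver.drop_list.add(frame_id)                     # 步骤5064:加入丢帧列表
        receiver.peer_channel.broadcast_drop([frame_id])     # 步骤5065:发送主动丢弃帧号(消息M4)
```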
步骤5063、接收端设备S j确定需要主动丢弃当前帧。
由于步骤5062判断为“否”,因此接收端设备S j可以倾向性认为,如果对当前第i帧进行解码,则解码完成后的时刻,很可能已经超过了源端设备S 0指定的指定显示时刻,因此接收端设备S j没有必要再对该第i帧显示画面进行解码了,需要主动丢弃当前帧。
步骤5064、接收端设备S j将当前帧的帧号加入自己的丢帧列表dropList j。也就是说,当前帧的帧号被接收端设备Sj认定为主动丢弃帧号。
该步骤可以参考前述部分对步骤5044的描述进行类比,此处不做赘述。
步骤5065、接收端设备S j将当前帧的帧号(主动丢弃帧号)发送给其他接收端设备;相应地,其他接收端设备在接收到接收端设备S j发送的接收端设备S j的主动丢弃帧号后,也将该主动丢弃帧号加入该其他接收端设备自己的丢帧列表dropList j’中。
可选地,在步骤5065中,接收端设备S j还可以将主动丢弃帧号发送给源端设备S 0。例如,在源端设备S 0也进行显示画面的同步显示的情况下,源端设备S 0也维护有它自己的丢帧列表dropList 0,接收端设备S j也可以将主动丢弃帧号发送给源端设备S 0;相应地,源端设备S 0在接收到该主动丢弃帧号后,将该主动丢弃帧号加入它的丢帧列表dropList 0中。
步骤507、接收端设备S [1-N]解码第i帧显示画面payload i。可选地,源端设备S 0也可以解码第i帧显示画面payload i
在一些实施例中,每个接收端设备S j中可以安装有解码器,用于按照预设的解码算法解码显示画面。
在一些实施例中,若源端设备S 0也进行显示画面的同步显示,则源端设备S 0中也可以安装有解码器,也可解码第i帧显示画面payload i;若源端设备S 0仅作为显示画面的提供设备,而不进行显示画面的同步显示,则源端设备S 0可以不解码第i帧显示画面payload i
在一些实施例中,若步骤506是以图9所示的方式实现的,则由于步骤5062判断为“是”,因此接收端设备S j可以倾向性认为,如果对当前第i帧进行解码,则解码完成后的时刻,很可能在源端设备S 0指定的指定显示时刻之前。因此,可以对该第i帧显示画面进行解码。例如,使用解码器中预设的解码算法对第i帧显示画面payload i进行解码。
步骤508、接收端设备S [1-N]中的每个接收端设备S j都确定自己的当前解码时延decodeTime Sj,并将所确定的自己的当前解码时延发送给源端设备S 0
在一些实施例中,如图5所示,每个接收端设备S j可以将确定的自己的当前解码时延打包为消息M5后发送给源端设备S 0
在一些实施例中,在步骤507完成解码后,接收端设备S j可以得知此次针对第i帧显示画面进行解码实际消耗的解码时间,即“实际解码时延”,记作decodeTime i_Sj_real
则此时,接收端设备S j可以根据接收端设备S j中记录的它历史上接收到的显示画面的实际解码时延(其中,包括第i帧显示画面的实际解码时延decodeTime i_Sj_real),估计接收端设备S j自己的当前解码时延decodeTime Sj
具体地,在一种可能的实现方式中:
(1)当接收端设备S j历史累计接收到的帧数小于预设数量(例如30)时,接收端设备Sj直接用第i帧显示画面的实际解码时延decodeTime i_Sj_real作为接收端设备S j自己的当前解码时延decodeTime Sj
(2)当接收端设备S j历史累计接收到的帧数大于等于预设数量(例如30)时,接收端设备S j基于这些历史帧的实际解码时延(其中,包括第i帧显示画面的实际解码时延decodeTime i_Sj_real)进行拟合,例如通过最小二乘法进行线性拟合,得到接收端设备S j的拟合解码时延decodeTime Sj_fitting’和帧号frameID i之间的线性关系:decodeTime Sj_fitting’=c’×frameID i+d’。其中,c’和d’为拟合参数,可以是通过线性拟合计算得到的,c’用于表示拟合得到的线性函数在坐标系中的斜率,d’用于表示截距。从而,将下一帧即第(i+1)帧的帧号frameID i+1带入以上线性关系公式,就可以将计算得到的拟合解码时延decodeTime Sj_fitting’(frameID i+1)作为接收端设备S j自己的当前解码时延decodeTime Sj 。
除了以上所述的接收端设备S j确定自己的当前解码时延decodeTime Sj的方式,还可以通过求历史帧的实际解码时延的平均数、众数等方式确定。应理解,本申请实施例不限定接收端设备S j确定自己的当前解码时延decodeTime Sj的具体实现方式。
当然,在其他一些可能的实现方式中,接收端设备S j也可以将步骤5061中确定的估计解码时延decodeTime i_Sj_pred作为接收端设备S j自己的当前解码时延。
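例如,上文提到的"取历史实际解码时延的平均数"这一替代方式,可以用如下示意性的Python代码表达;函数名与参数名均为本文假设,仅作为一种可能实现的草图。

```python
from statistics import mean

def current_decode_time(actual_decode_times, min_samples=30):
    """actual_decode_times为历史帧的实际解码时延(毫秒),按接收顺序排列。
    样本不足min_samples时直接取最近一帧的实际解码时延,否则取历史样本的平均数;
    也可以替换为前述的最小二乘线性拟合或众数等方式。"""
    if len(actual_decode_times) < min_samples:
        return actual_decode_times[-1]
    return mean(actual_decode_times)
```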
通过执行步骤508,接收端设备S [1-N]中的每个接收端设备S j都将所确定的自己的当前解码时延decodeTime Sj发送给源端设备S 0,从而,源端设备S 0获取到N个接收端设备S [1-N]发送给它的N个当前解码时延decodeTime S[1-N]
在下一次循环过程中(即下一帧显示画面的同步显示过程中),当下一次执行步骤503时,根据前述部分所述,源端设备S 0可以根据此次N个接收端设备S [1-N]各自发送给它的当前解码时延decodeTime S[1-N],确定针对第(i+1)帧显示画面的当前解码时延decodeTime i+1,例如取decodeTime S[1-N]中的最大值。
进而,源端设备S 0可以根据针对第(i+1)帧显示画面的当前解码时延decodeTime i+1,确定针对第(i+1)帧显示画面的指定显示时刻playTime i+1
因此,通过本申请实施例提供的显示画面同步方法,源端设备S 0得以针对第i帧显示画面,根据动态变化的当前解码时延decodeTime i来确定指定显示时刻playTime i 。
应理解,本申请实施例不限定步骤508的执行顺序,步骤508只要在下一次执行步骤503之前执行完成即可。
步骤509、接收端设备S [1-N]中的每个接收端设备S j进行显示画面的同步显示。可选地,源端设备S 0也可以进行显示画面的同步显示。
在一些实施例中,经过前面的步骤504、步骤506,每个接收端设备S j中的丢帧列表dropList j中,所记录的不进行显示的显示画面的帧号都是相同的。因此,步骤509中,每个接收端设备Sj可以根据自己的丢帧列表dropList j来判断是否显示某一帧,实现显示画面的同步显示。
例如,如果该帧显示画面的帧号在丢帧列表中,则不显示该帧显示画面;如果该帧显示画面的帧号不在丢帧列表中,则显示该帧显示画面。
从而,本申请实施例提供的方法,通过在每个进行显示画面同步显示的设备中维护一个动态更新的、记录有相同的不进行显示的显示画面帧号的丢帧列表,实现了显示画面的同步显示。通过纯软件的方式实现了显示画面同步,易于实施且用户体验良好。
在一些实施例中,如图10所示,步骤509的一种具体实现方式可以为:
步骤5091、接收端设备S j解码完成当前显示画面。
在一些实施例中,如前所述,步骤507中接收端设备S j对当前第i帧显示画面payload i进行解码。则步骤5091确定当前第i帧显示画面payload i已经解码完成,得到解码完成的第i帧显示画面。
其中,解码完成的第i帧显示画面可以指能够送入显示器进行显示的第i帧显示画面数据。显示器能够根据解码完成的第i帧显示画面将第i帧显示画面显示出来。
步骤5092、接收端设备S j判断帧号frameID i是否不在自己的丢帧列表dropList j内:若帧号在丢帧列表dropList j内(即判断为“否”),则执行步骤5093;若帧号不在丢帧列表dropList j内(即判断为“是”),则执行步骤5094。
在实施时,每个接收端设备S j的丢帧列表dropList j都可能是不停地动态更新的。因此,直至送显之前的任意时刻,每个接收端设备S j的丢帧列表dropList j内随时都可能加入新的不进行显示的显示画面帧的帧号。因此,解码完成后,也需要步骤5092来判断此次解码完成的显示画面帧的帧号是否不在丢帧列表内。
步骤5093、不将解码完成的第i帧显示画面送显。
如果帧号在丢帧列表内,则表明进行显示画面同步显示的所有设备中,至少有一个设备未成功接收到或者不能够在指定显示时刻显示该帧显示画面。为了保证显示画面的同步显示,此时进行显示画面同步显示的所有设备都应当不显示该帧显示画面。因此,在步骤5092判断为“否”的情况下执行步骤5093。
步骤5094、接收端设备S j判断送显时刻是否已到,若是,则执行步骤5096;若否,则执行步骤5095。
在一些实施例中,若满足预设条件,则接收端设备S j判断送显时刻已到。示例性地,上述预设条件可以为:systemTime Sj-offset Sj>=playTime i
其中,每个符号的含义如前所述,此处不再赘述。
该预设条件中左侧为接收端设备S j的系统时间减去接收端设备S j的系统时间与源端设备S 0的系统时间之间的时钟偏差,右侧为源端设备S 0针对第i帧显示内容指定的指定显示时刻。
当然,上述预设条件也可以写为:systemTime Sj>=playTime i+offset Sj
以上判断方式可以理解为:通过判断此时接收端设备S j的系统时间,是否已到达矫正后的指定显示时刻来判断送显时刻是否已到。
步骤5095、接收端设备S j等待送显,并实时判断帧号是否在丢帧列表内。
在步骤5094判断为“否”的情况下,接收端设备S j的系统时间还没有到达矫正后的指定显示时刻,因此接收端设备S j需要进行等待。
在一些实施例中,有可能在这段等待时间内,接收端设备S j又接收到了其他接收端设备发送给它的需要将该帧加入丢帧列表的消息,因此,步骤5095中接收端设备S j仍旧需要实时判断帧号是否在丢帧列表内。只有在送显时刻到达之时,帧号仍旧未出现在丢帧列表内,才执行步骤5096。
步骤5096、接收端设备S j将帧号对应的显示画面送显。
在步骤5094判断为“是”的情况下,或者,在步骤5095执行完成、直至送显时刻到达且帧号仍旧未出现在丢帧列表内的情况下,执行步骤5096。从而,采用本申请提供的方法,所有进行显示画面同步显示的设备,都在相同的时刻显示相同帧号的显示画面,而不会在相同的时刻显示不同帧号的显示画面,可以保证显示画面的同步。
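下面给出步骤5092~5096的一种示意性Python草图:解码完成后,只有当帧号不在丢帧列表内、且接收端系统时间到达矫正后的指定显示时刻时才送显,等待期间持续检查丢帧列表。其中receiver.drop_list、receiver.display_frame等接口为本文假设的名称,系统时间以毫秒计也是一种假设。

```python
import time

def try_display(receiver, frame_id, decoded_frame, play_time_i, offset_sj, poll_interval=0.001):
    """解码完成后的送显判断(步骤5092~5096):返回True表示已送显,返回False表示放弃送显。"""
    while True:
        if frame_id in receiver.drop_list:                 # 步骤5092/5095:帧号在丢帧列表内
            return False                                   # 步骤5093:不送显
        now_ms = time.time() * 1000                        # 接收端系统时间(假设以毫秒计)
        if now_ms - offset_sj >= play_time_i:              # 步骤5094:送显时刻已到
            receiver.display_frame(decoded_frame)          # 步骤5096:送显
            return True
        time.sleep(poll_interval)                          # 步骤5095:等待送显,期间继续检查丢帧列表
```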
在一些实施例中,接收端设备S j一开始判断丢失了第i帧,于是通知了其他接收端设备将第i帧的帧号加入它们的丢帧列表;但后来由于丢帧重传,接收端设备S j又接收到了第i帧,则此时本申请实施例提供的显示画面同步方法还可以包括以下步骤:接收端设备S j通知其他接收端设备将第i帧的帧号从它们的丢帧列表中删除。也就是说,本申请实施例提供的显示画面同步方法,除了可以向丢帧列表中添加帧号,也可以从丢帧列表中删除帧号,丢帧列表是动态更新的,维护有不进行显示的显示画面帧的帧号。从而可以在保证显示画面同步的情况下,尽可能同步显示更多的帧。
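作为丢帧列表动态增删并通知其他设备的一种示意性实现(其中DropList、notify回调等均为本文假设的名称,并非对实现方式的限定),可以参考如下Python草图:

```python
import threading

class DropList:
    """丢帧列表dropList的一种示意实现:支持动态添加/删除帧号,并通过notify回调
    向其他进行同步显示的设备广播增删操作(notify为假设的接口)。"""
    def __init__(self, notify=None):
        self._ids = set()
        self._lock = threading.Lock()
        self._notify = notify

    def add(self, frame_id):
        with self._lock:
            self._ids.add(frame_id)
        if self._notify:
            self._notify("add", frame_id)       # 通知其他设备将该帧号加入它们的丢帧列表

    def remove(self, frame_id):
        """例如丢帧重传成功后,将帧号从丢帧列表中删除并通知其他设备删除。"""
        with self._lock:
            self._ids.discard(frame_id)
        if self._notify:
            self._notify("remove", frame_id)

    def __contains__(self, frame_id):
        with self._lock:
            return frame_id in self._ids
```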
本领域普通技术人员可以意识到,结合本文中所公开的实施例描述的各示例的单元及算法步骤,能够以电子硬件、或者计算机软件和电子硬件的结合来实现。这些功能究竟以硬件还是软件方式来执行,取决于技术方案的特定应用和设计约束条件。专业技术人员可以对每个特定的应用来使用不同方法来实现所描述的功能,但是这种实现不应认为超出本申请的范围。
在本申请所提供的实施例中,应该理解到,所揭露的装置/电子设备和方法,可以通过其它的方式实现。例如,以上所描述的装置/电子设备实施例仅仅是示意性的,例如,所述模块或单元的划分,仅仅为一种逻辑功能划分,实际实现时可以有另外的划分方式,例如多个单元或组件可以结合或者可以集成到另一个系统,或一些特征可以忽略,或不执行。另一点,所显示或讨论的相互之间的耦合或直接耦合或通讯连接可以是通过一些接口,装置或单元的间接耦合或通讯连接,可以是电性,机械或其它的形式。
所述作为分离部件说明的单元可以是或者也可以不是物理上分开的,作为单元显示的部件可以是或者也可以不是物理单元,即可以位于一个地方,或者也可以分布到多个网络单元上。可以根据实际的需要选择其中的部分或者全部单元来实现本实施例方案的目的。
另外,在本申请各个实施例中的各功能单元可以集成在一个处理单元中,也可以是各个单元单独物理存在,也可以两个或两个以上单元集成在一个单元中。上述集成的单元既可以采用硬件的形式实现,也可以采用软件功能单元的形式实现。
所述集成的模块/单元如果以软件功能单元的形式实现并作为独立的产品销售或使用时,可以存储在一个计算机可读存储介质中。基于这样的理解,本申请实现上述实施例方法中的全部或部分流程,也可以通过计算机程序来指令相关的硬件来完成,所述的计算机程序可存储于一计算机可读存储介质中,该计算机程序在被处理器执行时,可实现上述各个方法实施例的步骤。其中,所述计算机程序包括计算机程序代码,所述计算机程序代码可以为源代码形式、对象代码形式、可执行文件或某些中间形式等。所述计算机可读存储介质可以包括:能够携带所述计算机程序代码的任何实体或装置、记录介质、U盘、移动硬盘、磁碟、光盘、计算机存储器、只读存储器(ROM,Read-Only Memory)、随机存取存储器(RAM,Random Access Memory)、电载波信号、电信信号以及软件分发介质等。需要说明的是,所述计算机可读存储介质包含的内容可以根据司法管辖区内立法和专利实践的要求进行适当的增减,例如在某些司法管辖区,根据立法和专利实践,计算机可读存储介质不包括电载波信号和电信信号。
最后应说明的是:以上所述,仅为本申请的具体实施方式,但本申请的保护范围并不局限于此,任何在本申请揭露的技术范围内的变化或替换,都应涵盖在本申请的保护范围之内。因此,本申请的保护范围应以所述权利要求的保护范围为准。

Claims (27)

  1. 一种显示画面同步方法,应用于接收端设备,其特征在于,所述接收端设备包括丢帧信息,所述丢帧信息用于记录不进行显示的画面帧的标识,所述方法包括:
    与源端设备建立通信连接;
    接收所述源端设备发送的第一画面帧,以及所述第一画面帧对应的第一指定显示时刻;
    若确定能够按照所述第一指定显示时刻显示所述第一画面帧,且所述第一画面帧的帧号不在所述丢帧信息中,则按照所述第一指定显示时刻显示所述第一画面帧;
    若确定不能够按照所述第一指定显示时刻显示所述第一画面帧,则将所述第一画面帧的标识添加至所述丢帧信息,并将所述第一画面帧的标识通知给其他设备;其中,所述其他设备用于与所述接收端设备进行画面帧的同步显示。
  2. 根据权利要求1所述的方法,其特征在于,所述方法还包括:
    若所述第一画面帧的标识在所述丢帧信息中,则不显示所述第一画面帧。
  3. 根据权利要求1或2所述的方法,其特征在于,所述方法还包括:
    根据已成功接收到的画面帧对应的标识,确定丢失帧标识;其中,所述丢失帧标识包括未成功接收到的画面帧对应的标识;
    将所述丢失帧标识添加至所述丢帧信息,并将所述丢失帧标识通知给所述其他设备。
  4. 根据权利要求1-3中任一项所述的方法,其特征在于,所述确定能够按照所述第一指定显示时刻显示所述第一画面帧,具体包括:
    获取解码所述第一画面帧所需的估计解码时延,根据所述估计解码时延,确定所述第一画面帧对应的估计显示时刻;
    若所述估计显示时刻早于或等于所述第一指定显示时刻,则确定能够按照所述第一指定显示时刻显示所述第一画面帧。
  5. 根据权利要求4所述的方法,其特征在于,所述获取解码所述第一画面帧所需的估计解码时延,具体包括:
    根据解码历史画面帧所消耗的实际解码时延,获取解码所述第一画面帧所需的所述估计解码时延,其中,所述历史画面帧包括帧的标识位于所述第一画面帧的标识之前的至少一个画面帧。
  6. 根据权利要求1-5中任一项所述的方法,其特征在于,所述方法还包括:
    根据成功接收到的至少一个画面帧的实际解码时延,确定所述接收端设备的当前解码时延;
    将所述当前解码时延发送给所述源端设备。
  7. 根据权利要求1-6中任一项所述的方法,其特征在于,所述方法还包括:
    根据成功接收到的至少一个画面帧的实际传输时延,确定所述接收端设备的当前传输时延;
    将所述当前传输时延发送给所述源端设备。
  8. 根据权利要求1-7中任一项所述的方法,其特征在于,所述方法还包括:
    接收所述其他设备发送的丢失帧标识和/或主动丢弃帧标识,并将所述其他设备发送的丢失帧标识和/或主动丢弃帧标识添加至所述丢帧信息中。
  9. 根据权利要求1-8中任一项所述的方法,其特征在于,所述接收端设备和所述源端设备处于同一局域网中。
  10. 根据权利要求1-9中任一项所述的方法,其特征在于,所述画面帧的标识包括所述画面帧的帧号。
  11. 一种显示画面同步方法,应用于源端设备,其特征在于,所述源端设备用于获取画面帧并将所述画面帧发送给接收端设备,所述方法包括:
    与所述接收端设备建立通信连接;
    向所述接收端设备发送第一画面帧,以及所述第一画面帧对应的第一指定显示时刻;
    其中,所述第一指定显示时刻是所述源端设备根据当前传输时延和/或当前解码时延确定的,所述当前传输时延与历史画面帧的实际传输时延有关,所述当前解码时延与所述接收端设备解码所述历史画面帧的实际解码时延有关,所述历史画面帧包括帧的标识位于所述第一画面帧的标识之前的至少一个画面帧。
  12. 根据权利要求11所述的方法,其特征在于,所述源端设备还用于与所述接收端设备进行所述画面帧的同步显示,所述源端设备包括丢帧信息,所述丢帧信息用于记录不进行显示的画面帧的标识,所述方法还包括:
    若所述第一画面帧的帧号不在所述丢帧信息中,则按照所述第一指定显示时刻显示所述第一画面帧;
    若所述第一画面帧的帧号在所述丢帧信息中,则不显示所述第一画面帧。
  13. 根据权利要求12所述的方法,其特征在于,所述方法还包括:
    接收所述接收端设备发送的丢失帧标识和/或主动丢弃帧标识,并将所述接收端设备发送的丢失帧标识和/或主动丢弃帧标识添加至所述丢帧信息中。
  14. 根据权利要求11-13中任一项所述的方法,其特征在于,所述方法还包括:
    接收至少一个所述接收端设备发送的至少一个所述接收端设备的当前传输时延和/或至少一个所述接收端设备的当前解码时延;
    将至少一个所述接收端设备的当前传输时延中的最大值确定为所述当前传输时延,和/或,将至少一个所述接收端设备的当前解码时延中的最大值确定为所述当前解码时延。
  15. 根据权利要求14所述的方法,其特征在于,所述接收端设备的当前传输时延是所述接收端设备根据所述接收端设备接收到的所述历史画面帧的实际传输时延确定的,所述接收端设备的当前解码时延是所述接收端设备根据所述接收端设备接收到的所述历史画面帧的实际解码时延确定的。
  16. 根据权利要求11-15中任一项所述的方法,其特征在于,所述源端设备和所述接收端设备处于同一局域网中。
  17. 一种显示画面同步方法,应用于包括源端设备和至少一个接收端设备的系统中,其特征在于,所述接收端设备为进行画面帧同步显示的设备,或者,所述接收端设备以及所述源端设备为所述进行画面帧同步显示的设备,
    所述方法包括:
    所述源端设备向所述接收端设备中的每个接收端设备发送第一画面帧,以及所述第一画面帧对应的第一指定显示时刻;
    若所述接收端设备中的每个接收端设备都成功接收到所述第一画面帧,且都确定能够按照所述第一指定显示时刻显示所述第一画面帧,则所述进行画面帧同步显示的设备中的每个设备都按照所述第一指定显示时刻显示所述第一画面帧;
    若所述接收端设备中存在第一接收端设备未成功接收到所述第一画面帧,或者,成功接收到了所述第一画面帧但确定不能够按照所述第一指定显示时刻显示所述第一画面帧,则所述进行画面帧同步显示的设备中的每个设备都不显示所述第一画面帧。
  18. 根据权利要求17所述的方法,其特征在于,所述进行画面帧同步显示的设备中的每个设备都包括丢帧信息,所述丢帧信息用于记录不进行显示的画面帧的标识;
    所述若所述接收端设备中存在第一接收端设备未成功接收到所述第一画面帧,或者,成功接收到了所述第一画面帧但确定不能够按照所述第一指定显示时刻显示所述第一画面帧,则所述进行画面帧同步显示的设备中的每个设备都不显示所述第一画面帧,具体包括:
    若所述接收端设备中存在所述第一接收端设备未成功接收到所述第一画面帧,或者,成功接收到了所述第一画面帧但确定不能够按照所述第一指定显示时刻显示所述第一画面帧,则所述第一接收端设备将所述第一画面帧的标识添加至所述第一接收端设备的丢帧信息中,并将所述第一画面帧的标识通知给所述进行画面帧同步显示的设备中的其他设备;
    所述进行画面帧同步显示的设备中的每个设备,根据所述第一画面帧的标识在所述丢帧信息中,不显示所述第一画面帧。
  19. 根据权利要求17或18所述的方法,其特征在于,所述方法还包括:
    所述源端设备根据当前传输时延和/或当前解码时延,确定所述第一指定显示时刻;
    其中,所述当前传输时延与历史画面帧的实际传输时延有关,所述当前解码时延与所述接收端设备解码所述历史画面帧的实际解码时延有关,所述历史画面帧包括帧的标识位于所述第一画面帧的标识之前的至少一个画面帧。
  20. 根据权利要求19所述的方法,其特征在于,在所述源端设备根据当前传输时延和/或当前解码时延,确定所述第一指定显示时刻之前,所述方法还包括:
    所述接收端设备根据所述接收端设备接收到的所述历史画面帧的实际传输时延确定所述接收端设备的当前传输时延,和/或,根据所述接收端设备接收到的所述历史画面帧的实际解码时延确定所述接收端设备的当前解码时延;
    所述接收端设备将所确定的所述接收端设备的当前传输时延和/或所述接收端设备的当前解码时延发送给所述源端设备;
    所述源端设备根据当前传输时延和/或当前解码时延,确定所述第一指定显示时刻,具体包括:
    所述源端设备将接收到的至少一个所述接收端设备的当前传输时延中的最大值确定为所述当前传输时延,和/或,将接收到的至少一个所述接收端设备的当前解码时延中的最大值确定为所述当前解码时延。
  21. 根据权利要求17-20中任一项所述的方法,其特征在于,所述源端设备和所述接收端设备处于同一个局域网中。
  22. 一种接收端设备,包括存储器、处理器以及存储在所述存储器中并可在所述处理器上运行的计算机程序,其特征在于,所述处理器被配置为执行所述计算机程序时,使所述接收端设备实现如权利要求1-10中任一项所述的方法。
  23. 一种源端设备,包括存储器、处理器以及存储在所述存储器中并可在所述处理器上运行的计算机程序,其特征在于,所述处理器被配置为执行所述计算机程序时,使所述源端设备实现如权利要求11-16中任一项所述的方法。
  24. 一种显示画面同步系统,所述显示画面同步系统包括源端设备和至少一个接收端设备,其中,所述源端设备、所述接收端设备分别被配置为用于执行如权利要求17-21中任一项所述的方法中所述源端设备、所述接收端设备分别执行的步骤。
  25. 一种计算机可读存储介质,所述计算机可读存储介质被配置为存储有计算机程序,其特征在于,所述计算机程序被处理器执行时实现如权利要求1-10中任一项所述的方法,或者,权利要求11-16中任一项所述的方法。
  26. 一种计算机程序产品,其特征在于,所述计算机程序产品被配置为在接收端设备上运行时,使得所述接收端设备执行如权利要求1-10中任一项所述的方法,或者,所述计算机程序产品被配置为在源端设备上运行时,使得所述源端设备执行如权利要求11-16中任一项所述的方法。
  27. 一种芯片系统,其特征在于,所述芯片系统包括存储器和处理器,所述处理器被配置为执行所述存储器中存储的计算机程序,以实现如权利要求1-10中任一项所述的方法,或者,权利要求11-16中任一项所述的方法。