WO2021179931A1 - URL screen projection method and apparatus - Google Patents

URL screen projection method and apparatus

Info

Publication number
WO2021179931A1
WO2021179931A1 · PCT/CN2021/078480 · CN2021078480W
Authority
WO
WIPO (PCT)
Prior art keywords
code stream
frame
image frame
screen
switched
Prior art date
Application number
PCT/CN2021/078480
Other languages
English (en)
French (fr)
Other versions
WO2021179931A8 (zh)
Inventor
肖华熙
姚垚
Original Assignee
Huawei Technologies Co., Ltd. (华为技术有限公司)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co., Ltd.
Priority to CN202180017852.XA (published as CN115244944A)
Publication of WO2021179931A1
Publication of WO2021179931A8

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/85Assembly of content; Generation of multimedia applications
    • H04N21/858Linking data to content, e.g. by linking an URL to a video object, by creating a hotspot
    • H04N21/8586Linking data to content, e.g. by linking an URL to a video object, by creating a hotspot by using a URL
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/238Interfacing the downstream path of the transmission network, e.g. adapting the transmission rate of a video stream to network bandwidth; Processing of multiplex streams
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/4302Content synchronisation processes, e.g. decoder synchronisation
    • H04N21/4307Synchronising the rendering of multiple content streams or additional data on devices, e.g. synchronisation of audio on a mobile phone with the video output on the TV screen
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/433Content storage operation, e.g. storage operation in response to a pause request, caching operations
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/433Content storage operation, e.g. storage operation in response to a pause request, caching operations
    • H04N21/4331Caching operations, e.g. of an advertisement for later insertion during playback
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/436Interfacing a local distribution network, e.g. communicating with another STB or one or more peripheral devices inside the home
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/436Interfacing a local distribution network, e.g. communicating with another STB or one or more peripheral devices inside the home
    • H04N21/43615Interfacing a Home Network, e.g. for connecting the client to a plurality of peripherals
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/436Interfacing a local distribution network, e.g. communicating with another STB or one or more peripheral devices inside the home
    • H04N21/4363Adapting the video stream to a specific local network, e.g. a Bluetooth® network
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/436Interfacing a local distribution network, e.g. communicating with another STB or one or more peripheral devices inside the home
    • H04N21/4363Adapting the video stream to a specific local network, e.g. a Bluetooth® network
    • H04N21/43637Adapting the video stream to a specific local network, e.g. a Bluetooth® network involving a wireless protocol, e.g. Bluetooth, RF or wireless LAN [IEEE 802.11]
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/83Generation or processing of protective or descriptive data associated with content; Content structuring
    • H04N21/845Structuring of content, e.g. decomposing content into time segments
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/83Generation or processing of protective or descriptive data associated with content; Content structuring
    • H04N21/845Structuring of content, e.g. decomposing content into time segments
    • H04N21/8456Structuring of content, e.g. decomposing content into time segments by decomposing the content in the time domain, e.g. in time segments
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/85Assembly of content; Generation of multimedia applications
    • H04N21/858Linking data to content, e.g. by linking an URL to a video object, by creating a hotspot

Definitions

  • This application relates to screen projection technology, and in particular to a URL projection method and device.
  • DestPlayer will start playing the video from the beginning after the projection switch, and it cannot be synchronized with the playback progress of the SourcePlayer. On the one hand, this affects the efficiency of video playback; on the other hand, it degrades the user experience.
  • the present application provides a URL screen projection method and device that can improve the response speed of the projection switch and prevent video playback from stalling.
  • the present application provides a URL screen projection method, including: receiving a first code stream sent by a source device, where the first code stream is a code stream downloaded from a media server by the source device before the projection switch; receiving download instruction information sent by the source device; downloading a second code stream from the media server, where the start frame of the second code stream is indicated by the download instruction information and is related to the end frame of the first code stream; and playing the video according to the first code stream or the second code stream.
  • the source device sends the buffered code stream to the destination device.
  • the buffered code stream is transmitted over the local area network. The advantages of the local area network, including higher transmission bandwidth and better quality of service (QoS), enable fast and stable transmission of the buffered code stream.
  • the destination device receives the buffered code stream from the source device and downloads the subsequent code stream from the media server. It can therefore start playing the video quickly from the code stream it already has, which improves the response speed of the projection switch, and it can ensure that enough code stream is buffered while the video plays, preventing stalls.
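As a rough illustration of this two-source flow, the sketch below models a destination player that buffers the LAN-delivered first code stream and appends the second code stream fetched from the media server. All names here (`DestPlayer`, `FakeMediaServer`, `fetch`) are hypothetical stand-ins, not part of the patent.

```python
class FakeMediaServer:
    """Stand-in for the media server; holds an ordered list of frame ids."""
    def __init__(self, frames):
        self.frames = frames

    def fetch(self, start_frame):
        # return every frame from start_frame onward
        return [f for f in self.frames if f >= start_frame]


class DestPlayer:
    """Hypothetical destination-side flow: buffer the first code stream
    received over the LAN, then append the second code stream downloaded
    from the media server starting at the indicated frame."""
    def __init__(self):
        self.buffer = []

    def receive_first_stream(self, frames):
        # first code stream: frames the source device had already buffered
        self.buffer.extend(frames)

    def download_second_stream(self, server, start_frame):
        # start_frame comes from the source's download instruction info
        self.buffer.extend(server.fetch(start_frame))
```

Because `receive_first_stream` fills the buffer immediately over the LAN, playback can begin before the media-server download completes.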
  • the start frame of the first code stream is the key frame Fk corresponding to the image frame Fc being played by the source device when the screen is switched.
  • the destination device cannot decode the image corresponding to Fc from the information of the image frame Fc alone; it also needs the key frame Fk corresponding to Fc (Fk being the key frame that precedes Fc and is closest to it), because the image corresponding to Fc can only be fully decoded with the information of Fk. The same holds for the image frames after Fc. Therefore, even if Fk has already been played when the projection is switched, the key frame Fk corresponding to Fc must be sent to the destination device to preserve the information integrity of the first code stream, so the start frame of the first code stream can be Fk. This both reduces the amount of data transmitted between the source and destination devices and provides enough image information for effective decoding.
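The choice of Fk described above amounts to a nearest-preceding-key-frame lookup, sketched below; representing the stream as a boolean key-frame list is an assumption made only for illustration.

```python
def key_frame_for(is_key, fc):
    """Return the index Fk of the key frame for the frame Fc being played:
    the nearest key frame at or before position fc. `is_key` is a list of
    booleans marking which frames are key frames (illustrative stand-in
    for real stream metadata)."""
    for i in range(fc, -1, -1):
        if is_key[i]:
            return i
    raise ValueError("no key frame at or before Fc")
```

The first code stream would then start at `key_frame_for(is_key, fc)` even when that frame has already been played on the source side.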
  • the end frame of the first code stream is the last image frame Fd that has been downloaded by the source device when the screen is switched.
  • the source device sends the destination device the code streams of all image frames that have been downloaded from the media server but not yet played, that is, the image frames from Fc to Fd.
  • the code streams of image frames from Fc to Fd are transmitted to the destination device via the local area network as much as possible, so that the data volume of the second code stream downloaded by the destination device from the media server can be reduced.
  • the image frame being played by the source device when the projection is switched is Fc, the last image frame the source device has downloaded at that moment is Fd, and Fd and Fc belong to different media segments; each media segment contains multiple image frames, and the end frame of the first code stream is the last image frame of the media segment to which Fc belongs.
  • the first code stream sent by the source device to the destination device can end at the last image frame Fe of the media segment to which Fc belongs.
  • the start frame of the second code stream is related to the end frame of the first code stream, specifically: the start frame of the second code stream is the first image frame of the media segment to which the next image frame after the end frame of the first code stream belongs.
  • the start frame of the code stream downloaded by the source device from the media server is the first image frame of the first media segment of the video.
  • the start frame of the code stream (the second code stream) is the first image frame of the media segment to which the next image frame after the end frame of the first code stream belongs.
  • given the media segmentation performed on the media server, the start frame of the second code stream can be taken as the first image frame of the media segment to which Fn, the next image frame after Fd, belongs. In this case, the end part of the first code stream and the beginning of the second code stream may overlap.
  • the start frame of the second code stream can be the first image frame of the media segment to which the next image frame after Fe belongs.
  • if the end frame of the first code stream is the last image frame Fe of the media segment to which the image frame being played at the projection switch belongs, then the next image frame after Fe is the first image frame of the adjacent media segment, and that first image frame is the start frame of the second code stream, so the first and second code streams do not overlap. In this way the two code streams can be made contiguous without overlap, reducing data transmission redundancy.
  • the overlapping part is still part of a media segment; provided the projection time remains synchronized, redundant transmission of this part is acceptable.
  • the image frame being played by the source device when the projection is switched is Fc, the last image frame downloaded by the source device at that moment is Fd, the next image frame after Fd is Fn, and Fn and Fc belong to different media segments; the start frame of the second code stream is the first image frame F1 of the media segment to which Fn belongs, and the end frame of the first code stream is F0, the image frame immediately before F1.
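Under the assumption of fixed-length media segments and 0-based frame indices, the non-overlapping split described above could be computed as follows; the segment length and helper name are illustrative, not specified by the patent.

```python
SEG_LEN = 4  # image frames per media segment (illustrative value)

def split_at_segment_boundary(fd, seg_len=SEG_LEN):
    """Non-overlapping split: Fn is the frame after the last downloaded
    frame Fd; the second code stream starts at F1, the first frame of
    the media segment containing Fn, and the first code stream ends at
    F0 = F1 - 1."""
    fn = fd + 1
    f1 = (fn // seg_len) * seg_len   # first frame of Fn's segment
    f0 = f1 - 1                      # end frame of the trimmed first stream
    return f0, f1
```

Note that when Fd falls mid-segment (e.g. fd=9 with seg_len=4), F0 lands before Fd, which models the source trimming the frames that would otherwise overlap the second code stream.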
  • the source device removes from its buffered code stream the part that may overlap with the second code stream, then transmits the remaining code stream to the destination device over the local area network, which reduces the data volume of the second code stream that the destination device downloads from the media server.
  • playing the video according to the first code stream or the second code stream specifically includes: determining, from the first code stream or the second code stream, the first image frame to be played after the projection switch, and starting to play the video from that image frame.
  • before the video is played according to the first code stream or the second code stream, the method further includes: receiving playing time indication information sent by the source device, where the playing time indication information is used to indicate the image frame Fc being played by the source device when the projection is switched.
  • playing the video according to the first code stream or the second code stream includes: when the data amount of the first code stream is greater than a set threshold, determining from the playing time indication information that the first image frame to be played from the first code stream is Fc, playing the video from Fc to the end frame of the first code stream according to the first code stream, and then playing the video from Fm according to the second code stream, where Fm is the next image frame after the end frame of the first code stream.
  • receiving the first code stream and downloading the second code stream from the media server are two independent processes, so the destination device can do both at the same time. Since the first code stream as a whole precedes the second code stream, the destination device can start playback from the first code stream. If, while both streams are being received, the amount of first-code-stream data reaches what playback requires, the destination device can resume the video from the playback progress at the moment of the projection switch, which improves the response speed of the switch. By the time the first code stream has been played out, enough of the second code stream has accumulated, and playback continues from the second code stream, keeping the video smooth.
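One possible reading of this switching rule, with streams reduced to lists of consecutive frame ids and a hypothetical `playback_plan` helper (the threshold semantics are an assumption for illustration):

```python
def playback_plan(first_stream, second_stream, fc, threshold):
    """Sketch of the switching logic: if enough of the first code stream
    has arrived, play it from Fc to its end frame, then continue with
    the second code stream from Fm (the frame after the first stream's
    end frame)."""
    if len(first_stream) > threshold:
        start = first_stream.index(fc)
        fm = first_stream[-1] + 1
        tail = [f for f in second_stream if f >= fm]
        return first_stream[start:] + tail
    # otherwise fall back to playing the second code stream alone
    return list(second_stream)
```

The overlap filter (`f >= fm`) reflects that the tail of the first code stream and the head of the second may cover the same frames.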
  • receiving the first code stream sent by the source device includes: when the total data amount of the first code stream is not less than a set threshold, receiving the first code stream sent by the source device.
  • the method further includes: when the total data amount of the first code stream is less than the set threshold, receiving the data amount indication information sent by the source device, and downloading a third code stream from the media server, where the start frame of the third code stream is the first image frame of the media segment to which the image frame Fc being played by the source device at the projection switch belongs.
  • the method further includes: when the total data amount of the first code stream is less than the set threshold, receiving the data amount indication information and the first code stream sent by the source device, and downloading a fourth code stream from the media server, where the last image frame of the first code stream is Fd, the next image frame after Fd is Fn, and the start frame of the fourth code stream is the first frame of the media segment to which Fn belongs.
  • the source device can send data volume indication information to the destination device, where the data volume indication information notifies the destination that the total data volume of the first code stream is less than the set threshold. In this case, the source device may choose either to send the first code stream to the destination device or not to send it.
  • before receiving the first code stream sent by the source device, the method further includes: receiving control information sent by the source device, where the control information includes the encapsulation format of the first code stream. Receiving the first code stream sent by the source device then includes: receiving format data sent by the source device and decapsulating the format data according to the encapsulation format to obtain the first code stream.
  • the image frames included in the first code stream may span multiple media segments, and the source device can encapsulate the first code stream into a single media segment, such as a TS or fMP4 segment, to simplify the organization of the transmitted data.
  • the source device may also divide the first code stream into multiple media segments for transmission, which is not specifically limited in this application.
  • the object of encapsulation in this application is the buffered video stream downloaded from the media server. The encapsulation process improves projection time synchronization and does not involve encoding or decoding of the images, which reduces algorithm complexity and hardware requirements.
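A minimal sketch of the encapsulate/decapsulate round trip, assuming a length-prefixed JSON control block; this layout and the `encapsulation_format` field name are illustrative assumptions, not the patent's wire format.

```python
import json
import struct

def encapsulate(stream_bytes, fmt="fmp4"):
    """Wrap the buffered code stream with control info naming the
    encapsulation format, length-prefixed so it can be split off again."""
    control = json.dumps({"encapsulation_format": fmt}).encode()
    return struct.pack(">I", len(control)) + control + stream_bytes

def decapsulate(blob):
    """Recover (encapsulation format, code stream) from the wrapped data."""
    (clen,) = struct.unpack(">I", blob[:4])
    control = json.loads(blob[4:4 + clen].decode())
    return control["encapsulation_format"], blob[4 + clen:]
```

Since only the container is manipulated, the payload bytes pass through untouched, matching the claim that no image re-encoding is involved.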
  • the current playback status (including the playback position) is periodically reported to the source device. If the user adjusts the playback progress by dragging, it can also be reported to the source device.
  • a new message is defined to transmit the media information and the first code stream. The message may include a uniform fixed-length message header, a type and response code identifying the message, and an optional message body.
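Such a message could look like the following sketch; the field widths of the fixed-length header are assumptions, since the patent does not specify them.

```python
import struct

# fixed-length header: message type, response code, body length
# (field widths are illustrative, not specified by the patent)
HEADER = struct.Struct(">BBI")

def pack_msg(msg_type, resp_code, body=b""):
    """Build a message: fixed-length header followed by an optional body."""
    return HEADER.pack(msg_type, resp_code, len(body)) + body

def unpack_msg(data):
    """Split a received message back into (type, response code, body)."""
    msg_type, resp_code, body_len = HEADER.unpack_from(data)
    body = data[HEADER.size:HEADER.size + body_len]
    return msg_type, resp_code, body
```

A uniform header lets the receiver read a known number of bytes first and then decide, from the type field, how to interpret the optional body.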
  • this application provides a URL screen projection method, including: determining a first code stream and download instruction information according to the playback progress when the projection is switched, where the first code stream is the code stream downloaded from the media server before the projection switch; the download instruction information is used to indicate the start frame of a second code stream, the second code stream being the code stream to be downloaded by the destination device from the media server, and the start frame of the second code stream is related to the end frame of the first code stream; and sending the download instruction information and the first code stream to the destination device.
  • the start frame of the first code stream is the key frame Fk corresponding to the image frame Fc that is being played when the screen is switched.
  • the end frame of the first code stream is the last image frame Fd that has been downloaded when the projection screen is switched.
  • the image frame being played when the projection is switched is Fc, the last image frame that has been downloaded at that moment is Fd, and Fd and Fc belong to different media segments; each media segment contains multiple image frames, and the end frame of the first code stream is the last image frame of the media segment to which Fc belongs.
  • the start frame of the second code stream is related to the end frame of the first code stream, specifically: the start frame of the second code stream is the first image frame of the media segment to which the next image frame after the end frame of the first code stream belongs.
  • the image frame being played when the projection is switched is Fc, the last image frame that has been downloaded at that moment is Fd, the next image frame after Fd is Fn, and Fn and Fc belong to different media segments; the start frame of the second code stream is the first image frame F1 of the media segment to which Fn belongs, and the end frame of the first code stream is F0, the image frame immediately before F1.
  • before sending the download instruction information and the first code stream to the destination device, the method further includes: sending playback time indication information to the destination device, where the playback time indication information is used to indicate the image frame Fc being played when the projection is switched.
  • the method further includes: when the total data amount of the first code stream is less than a set threshold, sending data amount indication information to the destination device, where the data amount indication information notifies the destination that the total data amount of the first code stream is less than the set threshold.
  • before sending the download instruction information and the first code stream to the destination device, the method further includes: encapsulating the first code stream according to a set encapsulation format to obtain format data, and sending control information and the format data to the destination device, where the control information includes the encapsulation format.
  • an extended parameter can be carried in the projection switching request, indicating that the source device supports the URL projection method provided in this application. If the destination device also supports this method, it can recognize the extended parameter in the projection request.
  • a new message is defined to transmit the media information and the first code stream. The message may include a uniform fixed-length message header, a type and response code identifying the message, and an optional message body.
  • the present application provides a screen projection device, including: a receiving module, configured to receive a first code stream sent by a source device, the first code stream being downloaded by the source device from a media server before the projection switch; the receiving module is further configured to receive download instruction information sent by the source device and to download a second code stream from the media server according to the download instruction information, where the start frame of the second code stream is indicated by the download instruction information and is related to the end frame of the first code stream; and a playback module, configured to play a video according to the first code stream or the second code stream.
  • the start frame of the first code stream is the key frame Fk corresponding to the image frame Fc being played by the source device when the screen is switched.
  • the end frame of the first code stream is the last image frame Fd that has been downloaded by the source device when the screen is switched.
  • the image frame being played by the source device when the projection is switched is Fc, the last image frame the source device has downloaded at that moment is Fd, and Fd and Fc belong to different media segments; each media segment contains multiple image frames, and the end frame of the first code stream is the last image frame of the media segment to which Fc belongs.
  • the start frame of the second code stream is related to the end frame of the first code stream, specifically: the start frame of the second code stream is the first image frame of the media segment to which the next image frame after the end frame of the first code stream belongs.
  • the image frame being played by the source device when the projection is switched is Fc, the last image frame downloaded by the source device at that moment is Fd, the next image frame after Fd is Fn, and Fn and Fc belong to different media segments; the start frame of the second code stream is the first image frame F1 of the media segment to which Fn belongs, and the end frame of the first code stream is F0, the image frame immediately before F1.
  • the playback module is specifically configured to determine, from the first code stream or the second code stream, the first image frame to be played after the projection switch, and to start playing the video from that image frame.
  • the receiving module is further configured to receive the playback time indication information sent by the source device, and the playback time indication information is used to indicate the image frame Fc being played by the source device when the projection screen is switched.
  • the playback module is also configured to: when the data amount of the first code stream is greater than the set threshold, determine from the playback time indication information that the first image frame to be played from the first code stream is Fc, play the video from Fc to the end frame of the first code stream according to the first code stream, and then play the video from Fm according to the second code stream, where Fm is the next image frame after the end frame of the first code stream.
  • the receiving module is specifically configured to receive the first code stream sent by the source device when the total data amount of the first code stream is not less than a set threshold.
  • the receiving module is further configured to: when the total data amount of the first code stream is less than the set threshold, receive the data amount indication information sent by the source device, and download a third code stream from the media server, where the start frame of the third code stream is the first image frame of the media segment to which the image frame Fc being played by the source device at the projection switch belongs.
  • the receiving module is further configured to: when the total data amount of the first code stream is less than the set threshold, receive the data amount indication information and the first code stream sent by the source device, and download a fourth code stream from the media server, where the last image frame of the first code stream is Fd, the next image frame after Fd is Fn, and the start frame of the fourth code stream is the first frame of the media segment to which Fn belongs.
  • the receiving module is further configured to receive control information sent by the source device, where the control information includes the encapsulation format of the first code stream; receive format data sent by the source device;
  • the device further includes: a decoding module, configured to decapsulate the format data according to the encapsulation format to obtain the first code stream.
  • the present application provides a screen projection device, including: a processing module, configured to determine a first code stream and download instruction information according to the playback progress at the projection switch, where the first code stream is the code stream downloaded from the media server before the projection switch; the download instruction information is used to indicate the start frame of a second code stream, the second code stream being the code stream to be downloaded by the destination device from the media server, and the start frame of the second code stream is related to the end frame of the first code stream; and a sending module, configured to send the download instruction information and the first code stream to the destination device.
  • the start frame of the first code stream is the key frame Fk corresponding to the image frame Fc that is being played when the screen is switched.
  • the end frame of the first code stream is the last image frame Fd that has been downloaded when the projection screen is switched.
• the image frame being played when the screen is switched is Fc, and the last image frame that has been downloaded when the screen is switched is Fd, where Fd and Fc belong to different media segments.
• each media segment contains multiple image frames, and the end frame of the first code stream is the last image frame of the media segment to which Fc belongs.
• the start frame of the second code stream is related to the end frame of the first code stream, specifically including: the start frame of the second code stream is the first image frame of the media segment to which the image frame following the end frame of the first code stream belongs.
• the image frame being played when the screen is switched is Fc, the last image frame that has been downloaded when the screen is switched is Fd, and the image frame following Fd is Fn, where Fn and Fc belong to different media segments.
• the start frame of the second code stream is the first image frame F1 of the media segment to which Fn belongs, and the end frame of the first code stream is F0, the image frame immediately preceding F1.
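The frame relationships above (F0, F1, Fn relative to media segment boundaries) can be sketched as follows. This is a minimal illustration only: it assumes frames are identified by index and that every media segment holds a fixed number of frames, and the helper names `segment_of` and `split_streams` are hypothetical, not part of any specified interface.

```python
FRAMES_PER_SEGMENT = 150  # assumed: e.g. 5-second segments at 30 fps


def segment_of(frame_index: int) -> int:
    """Index of the media segment a frame belongs to."""
    return frame_index // FRAMES_PER_SEGMENT


def first_frame_of_segment(seg_index: int) -> int:
    """Index of the first image frame in a media segment."""
    return seg_index * FRAMES_PER_SEGMENT


def split_streams(fc: int, fd: int) -> tuple[int, int]:
    """Given Fc (frame being played when the screen is switched) and
    Fd (last frame already downloaded), return (F0, F1):
    F1 is the first frame of the segment containing Fn = Fd + 1, and
    F0, the frame just before F1, is the end of the first code stream."""
    fn = fd + 1                                   # next image frame after Fd
    f1 = first_frame_of_segment(segment_of(fn))   # start of second code stream
    f0 = f1 - 1                                   # end of first code stream
    return f0, f1
```

For example, with 150 frames per segment, Fc = 430 and Fd = 470 give Fn = 471 in segment 3, so F1 = 450 and F0 = 449: the first code stream is truncated at the boundary of Fn's segment even though frames up to 470 were already downloaded.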
  • the sending module is also used to send playback time indication information to the destination device, where the playback time indication information is used to indicate the image frame Fc being played when the projection screen is switched.
• the sending module is further configured to send data volume indication information to the destination device when the total data volume of the first code stream is less than a set threshold, where the data volume indication information is used to notify the destination device that the total data amount of the first code stream is less than the set threshold.
• the device further includes: an encapsulation module, configured to encapsulate the first code stream according to the set encapsulation format to obtain format data; the sending module is specifically configured to send control information and the format data to the destination device, where the control information includes the encapsulation format.
• the present application provides a screen projection device, including a processor and a transmission interface; the processor is configured to call program instructions stored in a memory to implement the method in any one of the above-mentioned first to second aspects.
• the present application provides a computer-readable storage medium comprising a computer program which, when executed on a computer or a processor, causes the computer or processor to execute the method in any one of the above-mentioned first to second aspects.
• the present application provides a computer program which, when executed by a computer or a processor, is used to execute the method in any one of the first to second aspects.
  • Fig. 1 shows an exemplary schematic diagram of a screen projection scene
  • FIG. 2 shows an exemplary structural diagram of the device 200
  • FIG. 3 is a flowchart of Embodiment 1 of the URL projection method of this application.
  • Fig. 4 shows an exemplary sequence diagram of image frames
  • Fig. 5 shows an exemplary sequence diagram of image frames
  • Fig. 6 shows an exemplary sequence diagram of image frames
  • FIG. 7 is a flowchart of Embodiment 2 of the URL projection method of this application.
  • Figure 8 shows a possible embodiment of the URL projection method of the present application
  • FIG. 9 is a schematic structural diagram of Embodiment 1 of a screen projection device according to this application.
  • FIG. 10 is a schematic structural diagram of Embodiment 2 of a screen projection device of this application.
  • At least one (item) refers to one or more, and “multiple” refers to two or more.
• “And/or” describes the association relationship between associated objects and indicates that three relationships can exist; for example, “A and/or B” can mean: only A, only B, or both A and B, where A and B can be singular or plural.
  • the character “/” generally indicates that the associated objects before and after are in an “or” relationship.
• “at least one of the following items” or similar expressions refers to any combination of these items, including any combination of a single item or a plurality of items.
• “At least one of a, b, or c” can mean: a, b, c, “a and b”, “a and c”, “b and c”, or “a and b and c”, where a, b, and c can each be single or multiple.
• URL projection: the projection initiator provides the URL address of the video to the projection receiver, and the projection receiver obtains the video stream from the media server according to the URL address and plays it.
• URL projection is different from mirroring: mirroring transmits the screen data of the projection initiating end to the receiving end in real time, so that the screen of the initiating end is displayed synchronously on the receiving end.
• DLNA projection is a typical URL projection technology.
  • the projection process includes: the user uses a terminal device (such as a mobile phone or tablet, etc.) to play a video, and clicks the projection button in the media player to initiate the projection.
  • the terminal device is the screen initiating end.
  • the projection initiator sends the URL address of the video to the projection receiver using the DLNA protocol.
  • Large-screen devices such as TVs and set-top boxes are the projection receivers.
  • the screen projection initiator uses the seek instruction in the DLNA protocol to indicate the current playback position to the screen projection receiver.
  • the projection receiver requests the media server to obtain the content corresponding to the URL address and play it.
  • the process of terminating the screen projection includes: the user terminates the screen projection by pressing a button or key on the screen initiating end or the screen receiving end.
  • the projection receiver terminates the video playback. If the media player on the screen initiating end has not exited, the screen initiating end requests the content from the media server and continues to play.
• Screen projection initiator (Sender): the initiator of URL projection. When starting the screencast, the Sender provides the Receiver with the video URL address.
  • mobile phones, tablets, computers and other devices can initiate URL projection through running media players.
• Screen projection receiver (Receiver): the receiver of URL projection. When projecting the screen, the Receiver obtains the video stream from the media server according to the URL address provided by the Sender.
• Televisions, set-top boxes, AV (audio/video) amplifiers, ChromeCast devices, computers and other equipment can receive screen projections through built-in receivers.
• Media server (Server): a server that provides video streams. The Receiver requests the corresponding video stream from the Server through the URL address of the video.
• The Server mainly uses streaming media protocols such as HTTP Live Streaming (HLS) and Fragmented MP4 (FMP4) to provide media services.
• Initial cache data: before starting playback, the device that plays the video needs to download and cache the code stream corresponding to a piece of video from the media server. Usually at least 2 to 3 seconds of video code stream must be cached, otherwise playback will easily stutter.
• Screen projection switch: when screen projection is started or terminated, the device that plays the video switches from one end to the other.
• When starting the screencast, the Sender generally stops playing, and the video is switched to being played by the Receiver.
• When the projection is terminated, the Receiver stops playing. If the Sender's video player has not been terminated, the video switches back to being played by the Sender.
• Destination device (DestPlayer): the end that should play the video after the screen projection is switched.
• When screen projection is started, video playback is switched from the Sender to the Receiver, so the Receiver is the DestPlayer; when screen projection is terminated, the video should be switched from the Receiver back to the Sender, so the Sender is the DestPlayer.
  • Source device (SourcePlayer): the end that plays the video before switching the projection screen. Before starting the projection, the Sender is the SourcePlayer; before terminating the projection, the Receiver is the SourcePlayer.
• Screen switching response speed: also referred to as the screen response speed or simply the response speed; the speed with which the video is switched to the DestPlayer after the user initiates the screen switching.
  • each frame represents a still image.
• A key frame refers to an I frame. An I frame adopts full-frame compression coding, that is, the full-frame image information is compressed and encoded, and the complete image can be reconstructed during decoding using only the data of the I frame.
• An I frame describes the details of the image background and the moving subject, does not need to refer to other image frames, and belongs to intra-frame compression. In contrast to key frames, ordinary image frames belong to inter-frame compression: when decoding, the residual information of the frame must be superimposed on the information of the corresponding key frame to decode the final picture.
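As an illustration of why playback must start from a key frame, the following sketch locates the key frame Fk for a given Fc by scanning backwards for the nearest I frame. The `Frame` metadata shape is an assumption made for this example, not a structure defined by this application.

```python
from dataclasses import dataclass


@dataclass
class Frame:
    index: int
    is_key: bool  # True for I frames (intra-coded), False for inter-coded frames


def key_frame_for(frames: list[Frame], fc_index: int) -> int:
    """Return the index of the nearest key frame Fk at or before Fc.

    Decoding must begin at Fk: inter-coded frames carry only residual
    information relative to a preceding I frame, so Fc cannot be
    reconstructed without first decoding from Fk."""
    for frame in reversed(frames[: fc_index + 1]):
        if frame.is_key:
            return frame.index
    raise ValueError("no key frame precedes Fc")
```

For example, with I frames every 30 frames, the key frame for Fc at index 45 is the I frame at index 30.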
  • Figure 1 shows an exemplary schematic diagram of a URL projection scene.
• the scene is that the source device sends the URL address of the video to the destination device, and the destination device requests the content corresponding to the URL address from the media server and plays it.
• URL projection has always had two problems: (1) the response speed of projection switching is not fast enough, and the video cannot be quickly switched to the destination device for playback; (2) it is difficult to ensure projection time synchronization while also maintaining a fast screen switching response speed, and it is difficult for the destination device to start playing from the video position that the source device is playing.
• current optimization directions mostly focus on compressing the time overhead of each link in the screen projection switching process, or on starting the destination device in advance. Although this can alleviate the problem to a certain extent, it cannot solve it fundamentally.
• the playback device needs to download and cache a video stream from the media server before it can start playing, usually at least 2 to 3 seconds of video stream. Because of unpredictable network jitter, the initial buffered data cannot be too small, otherwise playback will easily freeze. When the bandwidth capacity is not much greater than the bandwidth requirement, caching the initial data takes at least 2 to 3 seconds. Correspondingly, when the projection screen is switched, it takes 2 to 3 seconds before the destination device can start playing.
• if a device wants to play a video at time T, it must start downloading from the first frame of the media segment containing time T. For example, suppose the duration of each media segment is 5 seconds, and the source device initiates a screen projection switch when playback reaches the 14th second. After the destination device receives the screen switching request, it seeks to the third slice, whose corresponding time range is 10 to 15 seconds. To achieve time synchronization, in addition to the necessary initial cache (the 2 to 3 seconds after the 14th second), the destination device also downloads the already-played data from the 10th to the 13th second, which leads to a longer waiting process.
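The slice arithmetic in this example can be made concrete with a short sketch. The 5-second segment duration is taken from the example above; the function name is illustrative only.

```python
SEGMENT_DURATION = 5.0  # seconds, as in the example above


def seek_segment(play_time: float) -> tuple[int, float, float]:
    """Return (segment index, segment start time, segment end time)
    for the media segment containing a given playback time."""
    idx = int(play_time // SEGMENT_DURATION)
    start = idx * SEGMENT_DURATION
    return idx, start, start + SEGMENT_DURATION


# Playback reaches the 14th second: that falls in the third slice
# (index 2), covering 10-15 s, so the destination device must
# re-download 4 seconds of data (10-14 s) that was already played.
idx, start, end = seek_segment(14.0)
redundant = 14.0 - start
```

This redundant download, added on top of the 2-to-3-second initial cache, is exactly the waiting-time problem the paragraph above describes.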
• the related technology chooses one of the two between screen switching response speed and projection time synchronization: either sacrifice the screen switching response speed to enhance projection time synchronization, or sacrifice projection time synchronization to improve the screen switching response speed. At present, most implementations choose the latter, that is, they improve the screen switching response speed, and when switching screens the video is always played from the beginning.
• the Sender in the URL projection method can also be called User Equipment (UE), which can be deployed on land (indoor or outdoor, handheld or vehicle-mounted), on the water (such as on ships), or in the air (such as on aircraft, balloons, and satellites).
• Source devices can be mobile phones, tablets, wearable devices with wireless communication functions (such as smart watches), location trackers with positioning functions, computers with wireless transceiver functions, VR devices, AR devices, wireless devices in industrial control, wireless devices in self-driving, wireless devices in remote medical care, wireless devices in smart grids, wireless devices in transportation safety, wireless devices in smart cities, and wireless devices in smart homes; this application is not limited thereto.
  • the destination device in the URL projection method may be a smart TV, a TV box, a projection screen, etc., which is not limited in this application.
• the network between the source device and the destination device may be a communication network that supports a short-range communication technology, such as a communication network that supports Wireless-Fidelity (WiFi) technology, Bluetooth technology, or Near Field Communication (NFC) technology; or, the communication network may be a communication network that supports fourth generation (4G) access technology, such as Long Term Evolution (LTE) access technology; or, the communication network may be a communication network that supports fifth generation (5G) access technology, such as New Radio (NR) access technology; or, the communication network may be a communication network that supports third generation (3G) access technology, such as Universal Mobile Telecommunications System (UMTS) access technology; or, the communication network may support multiple wireless technologies, for example, a communication network supporting both LTE technology and NR technology; or, the communication network may also use a future-oriented communication technology, which is not specifically limited in this application.
  • FIG. 2 shows an exemplary structure diagram of the device 200.
  • the device 200 can be used as the aforementioned source device, and can also be used as the aforementioned destination device.
• the device 200 includes: an application processor 201, a microcontroller unit (MCU) 202, a memory 203, a modem 204, a radio frequency (RF) module 205, a wireless fidelity (Wi-Fi) module 206, a Bluetooth module 207, a sensor 208, an input/output (I/O) device 209, a positioning module 210, and other components.
  • These components can communicate through one or more communication buses or signal lines.
  • the aforementioned communication bus or signal line may be the CAN bus provided in this application.
  • the device 200 may include more or fewer components than shown, or combine certain components, or arrange different components.
  • the application processor 201 is the control center of the device 200, and uses various interfaces and buses to connect various components of the device 200.
  • the processor 201 may include one or more processing units.
  • the memory 203 stores computer programs, such as the operating system 211 and application programs 212 shown in FIG. 2.
  • the application processor 201 is configured to execute a computer program in the memory 203 to implement functions defined by the computer program.
  • the application processor 201 executes an operating system 211 to implement various functions of the operating system on the device 200.
  • the memory 203 also stores other data besides computer programs, such as data generated during the running of the operating system 211 and the application program 212.
  • the memory 203 is a non-volatile storage medium, and generally includes a memory and an external memory.
  • the memory includes but is not limited to random access memory (RAM), read-only memory (ROM), or cache.
  • External storage includes but is not limited to flash memory (Flash Memory), hard disks, optical disks, Universal Serial Bus (USB) disks, etc.
  • Computer programs are usually stored in external memory, and the processor loads the program from external memory to memory before executing the computer program.
  • the memory 203 may be independent and connected to the application processor 201 through a bus; the memory 203 may also be integrated with the application processor 201 into a chip subsystem.
  • MCU 202 is a co-processor used to acquire and process data from sensors 208.
• the processing capacity and power consumption of the MCU 202 are lower than those of the application processor 201, but the MCU 202 has an "always on" feature and can continuously collect and process sensor data while the application processor 201 is in sleep mode, ensuring normal operation of the sensors at extremely low power consumption.
  • the MCU 202 may be a sensor hub chip.
  • the sensor 208 may include a light sensor and a motion sensor.
  • the light sensor may include an ambient light sensor and a proximity sensor. The ambient light sensor can adjust the brightness of the display 2091 according to the brightness of the ambient light, and the proximity sensor can turn off the power of the display screen when the device 200 is moved to the ear.
• the accelerometer sensor can detect the magnitude of acceleration in various directions (usually three axes), and can detect the magnitude and direction of gravity when stationary; the sensor 208 may also include other sensors such as a gyroscope, barometer, hygrometer, thermometer, and infrared sensor, which will not be described here.
  • the MCU 202 and the sensor 208 may be integrated on the same chip, or may be separate components, connected by a bus.
• the modem 204 and the radio frequency module 205 constitute the communication subsystem of the device 200 and are used to implement the main functions of wireless communication standard protocols. The modem 204 is used for encoding and decoding, signal modulation and demodulation, and equalization.
  • the radio frequency module 205 is used for receiving and transmitting wireless signals.
  • the radio frequency module 205 includes but is not limited to an antenna, at least one amplifier, a coupler, a duplexer, and the like.
  • the radio frequency module 205 cooperates with the modem 204 to realize the wireless communication function.
  • the modem 204 can be used as a separate chip, or can be combined with other chips or circuits to form a system-level chip or integrated circuit. These chips or integrated circuits can be applied to all devices that implement wireless communication functions, including: mobile phones, computers, notebooks, tablets, routers, wearable devices, automobiles, home appliances, etc.
  • the device 200 may also use the Wi-Fi module 206, the Bluetooth module 207, etc. to perform wireless communication.
  • the Wi-Fi module 206 is used to provide the device 200 with network access that complies with Wi-Fi related standard protocols.
  • the device 200 can access a Wi-Fi access point through the Wi-Fi module 206, and then access the Internet.
  • the Wi-Fi module 206 can also be used as a Wi-Fi wireless access point, which can provide Wi-Fi network access for other devices.
  • the Bluetooth module 207 is used to implement short-distance communication between the device 200 and other devices (such as mobile phones, smart watches, etc.).
  • the Wi-Fi module 206 in the embodiment of the present application may be an integrated circuit or a Wi-Fi chip or the like, and the Bluetooth module 207 may be an integrated circuit or a Bluetooth chip or the like.
• the positioning module 210 is used to determine the geographic location of the device 200. It is understandable that the positioning module 210 may specifically be a receiver of a positioning system such as the Global Positioning System (GPS), the BeiDou satellite navigation system, or Russia's GLONASS.
  • the Wi-Fi module 206, the Bluetooth module 207, and the positioning module 210 may be separate chips or integrated circuits, respectively, or they may be integrated together.
  • the Wi-Fi module 206, the Bluetooth module 207 and the positioning module 210 may be integrated on the same chip.
  • the Wi-Fi module 206, the Bluetooth module 207, the positioning module 210, and the MCU 202 may also be integrated into the same chip.
  • the input/output device 209 includes, but is not limited to: a display 2091, a touch screen 2092, an audio circuit 2093, and so on.
  • the touch screen 2092 can collect the touch events of the user of the device 200 on or near it (for example, the user uses any suitable object such as a finger or a stylus to operate on the touch screen 2092 or near the touch screen 2092), and The collected touch events are sent to other devices (for example, the application processor 201).
  • the user's operation near the touch screen 2092 can be referred to as floating touch; through the floating touch, the user can select, move or drag objects (such as icons, etc.) without directly touching the touch screen 2092.
• the touch screen 2092 can be implemented using multiple types of technologies, such as resistive, capacitive, infrared, and surface acoustic wave.
  • the display (also called a display screen) 2091 is used to display information input by the user or information presented to the user.
  • the display can be configured in the form of liquid crystal display, organic light emitting diode, etc.
• the touch screen 2092 can be overlaid on the display 2091. When the touch screen 2092 detects a touch event, the event is sent to the application processor 201 to determine its type, and the application processor 201 can then provide the corresponding visual output on the display 2091 according to the type of the touch event.
• although in FIG. 2 the touch screen 2092 and the display 2091 are shown as two independent components that realize the input and output functions of the device 200, in some embodiments the touch screen 2092 and the display 2091 can be integrated to realize the input and output functions of the device 200.
  • the touch screen 2092 and the display 2091 may be configured on the front of the device 200 in the form of a full panel to realize a frameless structure.
  • the audio circuit 2093, the speaker 2094, and the microphone 2095 can provide an audio interface between the user and the device 200.
• the audio circuit 2093 can transmit the electrical signal converted from the received audio data to the speaker 2094, which converts it into a sound signal for output; conversely, the microphone 2095 converts the collected sound signal into an electrical signal, which the audio circuit 2093 receives and converts into audio data; the audio data is then sent to another device through the modem 204 and the radio frequency module 205, or output to the memory 203 for further processing.
  • the device 200 may also have a fingerprint recognition function.
  • a fingerprint collection device may be configured on the back of the device 200 (for example, below the rear camera), or a fingerprint collection device may be configured on the front of the device 200 (for example, below the touch screen 2092).
  • a fingerprint acquisition device may be configured in the touch screen 2092 to realize the fingerprint identification function, that is, the fingerprint acquisition device may be integrated with the touch screen 2092 to realize the fingerprint identification function of the device 200.
  • the fingerprint collection device is configured in the touch screen 2092, may be a part of the touch screen 2092, or may be configured in the touch screen 2092 in other ways.
  • the main component of the fingerprint acquisition device in the embodiments of the present application is a fingerprint sensor.
  • the fingerprint sensor can use any type of sensing technology, including but not limited to optical, capacitive, piezoelectric or ultrasonic sensing technology.
• the operating system 211 carried by the device 200 may be any of various operating systems; this embodiment of the present application does not impose any restriction on this.
• in the following, the device 200 carrying such an operating system is taken as an example.
  • the device 200 can be logically divided into a hardware layer, an operating system 211, and an application layer.
  • the hardware layer includes hardware resources such as the application processor 201, the MCU 202, the memory 203, the modem 204, the Wi-Fi module 206, the sensor 208, and the positioning module 210 as described above.
  • the application layer includes one or more applications, such as an application 212, and the application 212 may be any type of application such as a social application, an e-commerce application, or a browser.
• the operating system 211, as software middleware between the hardware layer and the application layer, is a computer program that manages and controls hardware and software resources.
  • the operating system 211 includes a kernel, a hardware abstraction layer (HAL), a library and runtime, and a framework.
  • the kernel is used to provide underlying system components and services, such as: power management, memory management, thread management, hardware drivers, etc.; hardware drivers include Wi-Fi drivers, sensor drivers, positioning module drivers, etc.
  • the hardware abstraction layer encapsulates the kernel driver, provides an interface to the framework, and shields low-level implementation details.
  • the hardware abstraction layer runs in user space, while the kernel driver runs in kernel space.
  • the library and runtime are also called runtime libraries, which provide the required library files and execution environment for executable programs at runtime.
  • the library and the runtime include the android runtime (ART), the library, and the scene package runtime.
  • ART is a virtual machine or virtual machine instance that can convert bytecode of an application into machine code.
  • a library is a program library that provides support for executable programs at runtime, including browser engines (such as webkit), script execution engines (such as JavaScript engines), graphics processing engines, and so on.
• the scene package runtime is the runtime environment of the scene package, which mainly includes the page execution environment (page context) and the script execution environment (script context).
• the page execution environment parses page code in formats such as HTML and CSS by calling the corresponding library.
• the script execution environment parses and executes code or executable files implemented in scripting languages such as JavaScript by calling the corresponding function library.
  • the framework is used to provide various basic public components and services for applications in the application layer, such as window management, location management, and so on.
  • the framework includes geofencing services, policy services, notification managers, etc.
  • the functions of the various components of the operating system 211 described above can all be implemented by the application processor 201 executing a program stored in the memory 203.
  • the device 200 may include fewer or more components than those shown in FIG. 2, and the device shown in FIG. 2 only includes components that are more relevant to the multiple implementations disclosed in the present application.
  • FIG. 3 is a flowchart of Embodiment 1 of the URL projection method of this application.
  • the process 300 can be executed by the source device, the destination device, and the media server.
  • the process 300 is described as a series of steps or operations. It should be understood that the process 300 may be executed in various orders and/or occur simultaneously, and is not limited to the execution order shown in FIG. 3.
  • the method of this embodiment may include:
  • Step 301 The source device determines the first code stream and download instruction information according to the playback progress when the screen is switched.
  • Sender refers to the originating end of URL projection, such as mobile phones, tablets, computers and other devices
• Receiver refers to the receiving end of URL projection, such as televisions, set-top boxes, AV amplifiers, ChromeCast devices, computers, and other equipment.
  • the URL projection process starts from the Sender sending a projection request to the Receiver.
  • the Sender can carry the URL address of the video being played in the projection request.
• the Receiver downloads the video stream corresponding to the URL address from the media server according to the URL address and plays it. Therefore, the Sender and Receiver roles are fixed.
  • the source device (SourcePlayer) refers to the end that plays the video before the projection screen is switched
  • the destination device (DestPlayer) refers to the end that should play the video after the projection screen is switched.
• when screen projection is started, the Sender initiates the transfer of the permission to play the video to the Receiver; at this time, the SourcePlayer is the Sender and the DestPlayer is the Receiver.
• when screen projection is terminated, the Receiver returns the permission to play the video to the Sender; at this time, the SourcePlayer is the Receiver and the DestPlayer is the Sender. That is, SourcePlayer and DestPlayer are relative roles that switch between Sender and Receiver according to the stage of the screen projection.
  • Network media playback usually uses streaming media technology, which uses streaming technology to continuously play media formats on the network in real time, such as audio, video, or multimedia files.
• Streaming media technology compresses continuous media data and stores it on a media server in the form of media fragments, where each media fragment includes the code stream corresponding to multiple image frames.
  • the media server transmits the media fragments of the video to the user's terminal device in sequence or in real time, and the terminal device is downloading and playing at the same time, without waiting for the completion of the download of the entire video file.
  • the terminal device will create a buffer area and download a video stream as a buffer before playing. When the actual network connection speed is lower than the speed consumed for playback, the player will use a small segment of the stream in the buffer area for playback. , To avoid video freezes.
  • The source device does two things at the same time while playing a video: it plays, and it downloads into the cache.
  • At any given moment, the image frame being played by the source device is earlier in the timeline than the image frame being downloaded; that is, for the same image frame, the time at which it is downloaded and cached is earlier than the time at which it is played.
  • Figure 4 shows an exemplary sequence diagram of image frames. As shown in Figure 4, Fc is the image frame being played by the source device when the screen is switched, Fd is the last image frame the source device has downloaded when the screen is switched, Fk is the key frame corresponding to Fc, and Fn is the image frame following Fd. In this example, Fk, Fc, Fd, and Fn all belong to the same media fragment Si.
  • Figure 5 shows an exemplary sequence diagram of image frames. As shown in Figure 5, Fc is the image frame being played by the source device when the screen is switched, Fd is the last image frame the source device has downloaded when the screen is switched, Fk is the key frame corresponding to Fc, and Fn is the image frame following Fd. In this example, Fk and Fc belong to the same media fragment Si, while Fd and Fn belong to the same media fragment Sj; Si and Sj are different media fragments. Si and Sj may be adjacent or separated by one or more other media fragments, which this application does not specifically limit.
  • Figure 6 shows an exemplary sequence diagram of image frames. As shown in Figure 6, Fc is the image frame being played by the source device when the screen is switched, Fd is the last image frame the source device has downloaded when the screen is switched, Fk is the key frame corresponding to Fc, and Fn is the image frame following Fd. In this example, Fk and Fc belong to the same media fragment Si, Fd belongs to media fragment Sl, and Fn belongs to media fragment Sj; Si, Sl, and Sj are different media fragments. Sl and Sj must be adjacent, while Si and Sl may be adjacent or separated by one or more other media fragments, which this application does not specifically limit.
  • It should be noted that although Fk and Fc belong to the same media fragment Si in the above examples, Fk and Fc can also belong to different media fragments: when media fragments are small, a media fragment does not always contain a key frame.
  • Figures 4 to 6 exemplarily show the timing relationship of the image frames that the source device has downloaded before the screen is switched, and the possible correspondences between image frames and media fragments, but they do not impose any restriction on the video code stream in this application.
  • the code stream downloaded by the source device from the media server includes the information of the image frame before Fc and the information of the image frame from Fc to Fd.
  • this application can also be implemented using a new message, such as time indication information, which indicates the image frame being played when the screen is switched.
  • For the destination device to start playing from image frame Fc, it must go through the same process as the source device: download a segment of the video code stream from the media server as a buffer before playing, start playing the video from that buffer, and download the cache of subsequent image frames while playing. It should be understood that since the destination device starts playing from image frame Fc, the buffer to be downloaded includes at least the code stream of the image frames from Fc to Fd.
  • The source device can send its own buffered code stream to the destination device through the communication network between them. Since URL projection usually occurs between two devices located in the same local area network, the source device can send the above code stream to the destination device through the local area network; data transmission through the local area network is faster and takes less time. To reduce the amount of data transmitted, the source device can send only the code stream of the image frames that have been downloaded from the media server but not yet played. This part of the code stream can be referred to as the first code stream.
  • In addition to the first code stream, the destination device needs to cache subsequent code stream to ensure normal playback of the video. This part of the code stream can be called the second code stream, and the destination device can download it from the media server. The destination device cannot determine on its own which image frame the second code stream should start from, so the source device determines download instruction information, which indicates the start frame of the second code stream.
  • Ideally, the start frame of the first code stream would be Fc. However, according to media codec technology and the role of key frames, if Fc is not a key frame, the destination device cannot decode the corresponding image based on the information of Fc alone; it also needs the key frame Fk corresponding to Fc (the key frame that is earlier than Fc and closest to Fc), based on whose information the image corresponding to Fc can be completely decoded. The same is true for the image frames after Fc.
  • Therefore, the key frame Fk corresponding to Fc must be sent to the destination device, and the start frame of the first code stream can be Fk.
  • As for the image frames before Fk, they have already been played on the source device when the projection is switched; since the destination device must synchronize the projection time and start playing from Fc, the image frames played before Fc are not needed by the destination device.
  • In this way, setting the start frame of the first code stream to Fk both reduces the amount of data transmitted between the source device and the destination device and provides sufficient image information to ensure effective image decoding.
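  The start-frame choice above can be sketched in a few lines, assuming frames are modeled as integer indices and the buffer's key-frame positions are known (neither representation is specified by this application):

```python
def first_stream_start(fc: int, key_frames: list[int]) -> int:
    """Return Fk: the latest key frame at or before Fc, the frame
    being played when the screen is switched."""
    candidates = [k for k in key_frames if k <= fc]
    if not candidates:
        raise ValueError("no key frame at or before Fc in the buffer")
    return max(candidates)

# Example: key frames at 0, 30, 60 and Fc = 47 give Fk = 30, so the
# first code stream carries frames 30..Fd and Fc remains decodable.
```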
  • The end frame of the first code stream has two possible cases: (1) The end frame of the first code stream is Fd, the last image frame the source device has downloaded when the screen is switched. In this case, the source device sends the destination device the code stream of all image frames that have been downloaded from the media server but not yet played, that is, the code stream of the image frames from Fc to Fd. (2) The end frame of the first code stream is Fe, the last image frame of the media fragment to which Fc belongs.
  • As mentioned above, continuous media data is stored uniformly in the form of media fragments after compression, and each media fragment contains multiple image frames. Based on this, the first code stream sent by the source device to the destination device may end at Fe.
  • This case requires that Fd and Fc belong to different media fragments, which ensures that Fe is earlier than Fd. Otherwise, if Fd and Fc belonged to the same media fragment, the source device would only have downloaded up to Fd; Fd and Fe are not necessarily the same image frame, and the source device would be unable to send Fe to the destination device.
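  The two end-frame options can be sketched as follows (a simplified model, assuming each media fragment is a list of consecutive integer frame indices; this is an illustrative assumption, not a format defined by the application):

```python
def first_stream_end(fc: int, fd: int, segments: list[list[int]],
                     to_segment_end: bool) -> int:
    """Option (1): end at Fd, the last downloaded frame.
    Option (2): end at Fe, the last frame of the fragment containing
    Fc, valid only when Fc and Fd lie in different fragments."""
    if not to_segment_end:
        return fd                       # option (1): end frame is Fd
    seg_fc = next(s for s in segments if fc in s)
    if fd in seg_fc:
        raise ValueError("option (2) needs Fc and Fd in different fragments")
    return seg_fc[-1]                   # option (2): end frame is Fe
```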
  • A device downloads the video code stream from the media server in units of media fragments, one fragment after another in chronological order. Therefore, whether for the source device or the destination device, the start frame of a code stream downloaded from the media server must be the first image frame of some media fragment.
  • For example, the start frame of the code stream downloaded by the source device from the media server is the first image frame of the first media fragment of the video. The start frame of the code stream that the destination device needs to download from the media server (the second code stream) is related to the end frame of the first code stream.
  • Optionally, the start frame of the second code stream may be the first image frame of the media fragment to which the image frame following the end frame of the first code stream belongs.
  • In case (1) above, the end frame of the first code stream is Fd, the last image frame the source device has downloaded when the screen is switched, so ideally the second code stream would start from Fn, the image frame following Fd. Considering the media fragmentation on the media server, however, the start frame of the second code stream can be the first image frame of the media fragment to which Fn belongs.
  • As a result, the end part of the first code stream may overlap with the beginning part of the second code stream.
  • As shown in Figure 5, if Fd and Fn belong to the same media fragment Sj, the end frame of the first code stream is Fd, and the start frame of the second code stream is the first image frame of the media fragment to which Fn belongs. That first image frame is either Fd itself or earlier than Fd, so the first code stream and the second code stream partially overlap.
  • As shown in Figure 6, if Fd belongs to media fragment Sl and Fn belongs to media fragment Sj, where Sl and Sj are different media fragments, then Fd is the image frame immediately before Fn, the end frame of the first code stream is exactly Fd, and Fn is exactly the first image frame of the adjacent media fragment Sj, so the first code stream and the second code stream do not overlap.
  • In case (2) above, the end frame of the first code stream is Fe, the last image frame of the media fragment to which Fc belongs, so ideally the second code stream would start from the image frame following Fe. Considering the media fragmentation on the media server, the start frame of the second code stream may be the first image frame of the media fragment to which the image frame following Fe belongs. Since the end frame of the first code stream is the last image frame Fe of its media fragment, the image frame following Fe is the first image frame of the adjacent media fragment, and that first image frame is the start frame of the second code stream, so the first code stream and the second code stream do not overlap.
  • Optionally, the source device can transmit as much of its buffered code stream as possible to the destination device through the local area network, which reduces the amount of data of the second code stream that the destination device downloads from the media server.
  • In this case, the start frame of the second code stream may be the first image frame F1 of the media fragment to which Fn belongs, where Fn is the image frame following the last image frame Fd the source device has downloaded when the screen is switched. This is the latest start frame that can be set for the second code stream: if downloading started from any later media fragment, projection time synchronization would inevitably fail. The end frame of the first code stream is then tied to the start frame of the second code stream; that is, the end frame of the first code stream is F0, the image frame immediately before F1. In this case, the first code stream and the second code stream do not overlap.
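  The start-frame rules above reduce to a single fragment lookup (same simplified fragment model as before; illustrative, not normative):

```python
def second_stream_start(fd: int, segments: list[list[int]]) -> int:
    """Start frame of the second code stream: the first frame of the
    media fragment containing Fn (= Fd + 1), the frame after the last
    frame the source device downloaded."""
    fn = fd + 1
    seg = next(s for s in segments if fn in s)
    return seg[0]

# Figure 6 style: Fd = 9 ends a fragment, so the streams do not overlap.
# Figure 5 style: Fd = 15 is mid-fragment, so frames 10..15 overlap.
```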
  • Step 302 The source device sends download instruction information to the destination device.
  • Step 303 The source device sends the first code stream to the destination device.
  • the source device can send the download instruction information and the first code stream to the destination device.
  • the source end device may send play time indication information to the destination end device, where the play time indication information is used to indicate the image frame being played when the projection screen is switched.
  • The purpose of sending the playback time indication information is to inform the destination device of the source device's playback progress when the screen is switched, so that the destination device can determine from which frame to start playing the video, thereby synchronizing the projection time on both sides.
  • Optionally, the source device can encapsulate the first code stream according to a set encapsulation format to obtain format data, and then send the format data and control information to the destination device, where the control information includes the aforementioned encapsulation format.
  • The image frames included in the first code stream may span multiple media fragments. The source device can encapsulate the first code stream into a single media fragment, such as a Transport Stream (TS) or FMP4 fragment, to simplify the organization of the transmitted data; the source device may also divide the first code stream into multiple media fragments for transmission, which this application does not specifically limit.
  • It should be noted that the object of encapsulation in this application is the video code stream downloaded and cached from the media server. The encapsulation process improves projection time synchronization and does not involve encoding or decoding of images, thereby reducing algorithm difficulty and hardware requirements.
  • Step 304 The destination device downloads the second code stream from the media server.
  • the download instruction information indicates the start frame of the second code stream, so the destination device downloads the code stream of continuous image frames starting from the start frame from the media server according to the download instruction information.
  • the start frame of the second code stream has been determined to be the first image frame of a certain media segment, so the destination device can request the media server to start downloading from the media segment.
  • the download instruction information may include information about the media segment to which the start frame of the second code stream belongs, and the destination device can determine the media segment to which the start frame belongs according to the information, and then determine the start frame.
  • the download instruction information may include information about the start frame of the second code stream, and the destination device may directly determine the start frame according to the information, and then determine the media segment to which the start frame belongs.
  • This application does not specifically limit the information included in the above download instruction information.
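  The destination device's handling of the two option forms above can be sketched like this (the field names are hypothetical; the application deliberately leaves the exact contents of the download instruction information open):

```python
def resolve_second_stream_start(indication: dict,
                                segments: list[list[int]]) -> int:
    """Resolve the start frame from either form of the download
    instruction information: an explicit start frame, or the index of
    the media fragment to which the start frame belongs."""
    if "start_frame" in indication:
        return indication["start_frame"]
    return segments[indication["start_segment_index"]][0]
```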
  • Step 305 The destination device plays the video according to the first code stream or the second code stream.
  • For the destination device, receiving the first code stream and downloading the second code stream from the media server are two independent processes, so the two can proceed at the same time. Since the first code stream as a whole precedes the second code stream, the destination device can start playing from the first code stream. While receiving the first code stream and the second code stream, once the amount of received first code stream data is sufficient for playback, the destination device can play the video according to the first code stream, from the playback progress at the time the screen was switched, which helps improve the response speed of screen switching. By the time the first code stream has been played, the destination device has accumulated enough of the second code stream, and playback continues according to the second code stream, ensuring smooth video playback.
  • the destination device in order to synchronize the screen projection time, can determine the first image frame to be played after the projection screen is switched from the first code stream or the second code stream, and start playing the video from the first image frame.
  • the first image frame is the key to the synchronization of the projection time.
  • the source device in order to synchronize the screen projection time, sends to the destination device the playback time indication information used to indicate the image frame Fc being played when the screen is switched.
  • The destination device can first determine, according to the playback time indication information, that the first image frame to be played is Fc; it then plays the video from Fc to the end frame of the first code stream according to the first code stream, and plays the video starting from Fm according to the second code stream, where Fm is the image frame following the end frame of the first code stream.
  • As mentioned above, the end part of the first code stream and the beginning part of the second code stream may overlap, or the two may not overlap at all. If they do not overlap, the destination device starts playing according to the second code stream directly after finishing the first code stream; since the end frame of the first code stream is the image frame immediately before the start frame of the second code stream, the destination device can switch seamlessly from the first code stream to the second code stream.
  • If they do overlap, the destination device can choose to play the overlapping part according to the first code stream, that is, play the video from Fc to the end frame of the first code stream according to the first code stream, and play the video starting from Fm according to the second code stream. Alternatively, the destination device can choose to play the overlapping part according to the second code stream, that is, play the video from Fc to the image frame immediately before the start frame of the second code stream according to the first code stream, discard the part of the first code stream from the start frame of the second code stream to the end frame of the first code stream, and play the video starting from the start frame of the second code stream according to the second code stream.
  • This application does not specifically limit this.
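  Both overlap-handling choices splice to the same overall frame sequence; a sketch, modeling frames as integers and each stream as a sorted list (a modeling assumption, not part of the application):

```python
def splice_playback(first: list[int], second: list[int],
                    prefer_first: bool) -> list[int]:
    """Merge the first and second code streams into one playback order.
    In the overlapping region, either play from the first stream
    (prefer_first) or discard the first stream's tail and play from
    the second stream."""
    if prefer_first:
        return first + [f for f in second if f > first[-1]]
    return [f for f in first if f < second[0]] + second

# first = frames 3..7, second = frames 6..11: either choice yields 3..11.
```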
  • Steps 301-305 mainly describe the screen switching process when the total data amount of the first code stream is greater than or equal to a set threshold, where the set threshold characterizes whether the total data amount of the first code stream is sufficient to support video playback.
  • the source end device may send data amount indication information to the destination end device, where the data amount indication information is used to notify that the total data amount of the first code stream is less than the set threshold.
  • the source device can choose to send the first code stream to the destination device, or it can choose not to send the first code stream. This process can be negotiated through high-level configuration or mutual information exchange between the two parties, and will not be repeated here.
  • In one case, the destination device can download a fourth code stream from the media server, where the start frame of the fourth code stream is the first image frame of the media fragment to which Fn belongs; that is, downloading starts from the first image frame of the media fragment to which the image frame following the end frame of the first code stream belongs.
  • In another case, the destination device can download a third code stream from the media server, where the start frame of the third code stream is the first image frame of the media fragment to which Fc, the image frame being played by the source device when the screen is switched, belongs. Since there is no first code stream in this case, in order to ensure normal playback of the video and achieve projection time synchronization, the destination device downloads starting from the first image frame of the media fragment to which Fc belongs.
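  The fallback without a first code stream is a one-line lookup in the same simplified fragment model used earlier (illustrative only):

```python
def third_stream_start(fc: int, segments: list[list[int]]) -> int:
    """With no first code stream, download from the first frame of the
    media fragment containing Fc, so that Fc (and its key frame, if
    the fragment holds one) is still available to the destination."""
    return next(s for s in segments if fc in s)[0]
```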
  • In this application, the source device sends its buffered code stream to the destination device, and the buffered code stream is transmitted through the local area network. The advantages of the local area network itself, including higher transmission bandwidth and better QoS, enable fast and stable transmission of the buffered code stream.
  • The destination device receives the buffered code stream from the source device and downloads the code stream that follows it from the media server. It can thus quickly start playing the video from the code stream it already has, improving the response speed of screen switching, while ensuring that enough code stream is stored during playback to prevent the video from stalling.
  • the foregoing method embodiments describe the process in which the source device sends the first code stream and download instruction information to the destination device when the screen is switched, so as to realize the process of screen switching.
  • This process occurs when the screen projection is started.
  • When projection is started, the source device can be a mobile phone, tablet, computer, or similar device acting as the Sender, and the destination device can be a TV, set-top box, AV amplifier, ChromeCast, computer, or similar device acting as the Receiver.
  • When projection is terminated, the source device can be a TV, set-top box, AV amplifier, ChromeCast, computer, or similar device acting as the Receiver, and the destination device can be a mobile phone, tablet, computer, or similar device acting as the Sender. The roles of Sender and Receiver can therefore switch at different stages of screen projection, which this application does not specifically limit.
  • FIG. 7 is a flowchart of Embodiment 2 of the URL projection method of this application.
  • the process 700 can be executed by the source device, the destination device, and the media server.
  • the process 700 is described as a series of steps or operations. It should be understood that the process 700 may be executed in various orders and/or occur simultaneously, and is not limited to the execution order shown in FIG. 7.
  • the method of this embodiment may include:
  • Step 701 The user initiates screen projection through Sender.
  • This process can use any technology such as application programs, video platforms, and player programs that support the URL projection protocol, which is not specifically limited in this application.
  • Step 702 Sender sends a screen projection request to Receiver.
  • the screencast request may include the URL address of the video to be played, and a seek instruction, etc., to inform the Receiver of the address of the current video being played, and the playback progress of the current video.
  • Step 703 Receiver sends a screencast response to Sender.
  • If the Receiver correctly receives the screencast request sent by the Sender, it can feed back an acknowledgement (ACK) response; otherwise, it can feed back a negative acknowledgement (NAK) response.
  • Step 704 The Sender determines the first code stream and download instruction information according to the current playback progress.
  • Step 704 can refer to the above step 301, which will not be repeated here.
  • Step 705 Sender sends media information to Receiver.
  • The media information determined by the Sender includes the start frame and end frame of the first code stream and the encapsulation format of the first code stream, and may also include the organization and storage mode of the first code stream. The Sender can send all of this information, together with the download instruction information described above, to the Receiver as media information.
  • The media information may be described in JavaScript Object Notation (JSON), Extensible Markup Language (XML), or a similar format. It should be understood that the media information may also use other description methods, which this application does not specifically limit.
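  A possible JSON shape for the media information (every field name here is an assumption for illustration; the application only requires that the start/end frames, encapsulation format, and download instruction information be expressible in JSON or XML):

```python
import json

media_info = {
    "first_stream": {
        "start_frame": "Fk",       # key frame for the frame being played
        "end_frame": "Fd",         # or "Fe", the end of Fc's fragment
        "container": "fmp4",       # set encapsulation format
    },
    "download_instruction": {
        "start_segment_index": 7,  # fragment holding the 2nd-stream start
    },
}
wire = json.dumps(media_info)      # what the Sender would transmit
```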
  • Step 706 The Sender sends the first code stream to the Receiver.
  • Step 706 can refer to the above step 303, which will not be repeated here.
  • Step 707 Receiver sends a download request to the media server.
  • the download request may include the start frame of the second code stream.
  • Step 708 Receiver downloads the second code stream from the media server.
  • Step 708 can refer to the above step 304, which will not be repeated here.
  • Step 709 When the amount of data in the first stream is sufficient, Receiver starts to play the video.
  • While receiving the first code stream and the second code stream, once the amount of received first code stream data is sufficient for playback, for example a 500 ms video code stream, the Receiver can play the video from the first code stream at the playback progress at the time the screen was switched, which helps improve the response speed of screen switching.
  • Step 709 can refer to the above step 305, which will not be repeated here.
  • Step 710 The user terminates screen projection through Sender.
  • This process can use any technology such as application programs, video platforms, and player programs that support the URL projection protocol, which is not specifically limited in this application.
  • the user can also terminate the screen projection through Receiver, and can also use any technology such as applications, video platforms, and player programs that support the URL projection protocol, which is not specifically limited in this application.
  • Step 711 The Sender sends a request to terminate the screen projection to the Receiver.
  • Step 712 The Receiver sends a response to terminate the screen projection to the Sender.
  • the response to terminate the screencast may include a seek instruction, etc., to inform the Sender of the current video playback progress.
  • Step 713 Receiver determines the first code stream and download instruction information according to the current playback progress.
  • step 713 refer to the above step 301, which will not be repeated here.
  • Step 714 Receiver sends media information to Sender.
  • step 714 refer to the above step 705, which will not be repeated here.
  • Step 715 Receiver sends the first code stream to Sender.
  • step 715 refer to the above step 303, which will not be repeated here.
  • Step 716 The Sender sends a download request to the media server.
  • the download request may include the start frame of the second code stream.
  • Step 717 Sender downloads the second code stream from the media server.
  • Step 717 can refer to the above step 304, which will not be repeated here.
  • Step 718 When the amount of data in the first code stream is sufficient, the Sender starts to play the video.
  • While receiving the first code stream and the second code stream, once the amount of received first code stream data is sufficient for playback, the Sender can play the video according to the first code stream at the playback progress at the time the screen was switched, which helps improve the response speed of screen switching.
  • Step 718 can refer to the above step 305, which will not be repeated here.
  • It is possible that the Sender or the Receiver does not support the URL projection method provided in this application. If the Receiver does not support it, the Receiver can notify the Sender through the projection response in step 703; if the Sender does not support it, the Sender simply does not send the first code stream to the Receiver. If either party does not support this URL projection method, the Sender and Receiver execute the projection process of the related technologies, achieving compatibility between this URL projection method and related projection methods.
  • After the Receiver starts playing the video, it periodically reports the current playback status (including the playback position) to the Sender. If the user adjusts the playback progress, for example by dragging, the Receiver can also report this to the Sender.
  • After the Receiver has started playing the video, the user may close the player, application, or the like on the Sender, so that the user cannot terminate the screen projection by operating the Sender. In this case, the user can terminate the screencast through the Receiver, or reopen the player, application, or the like on the Sender to terminate the screencast.
  • the first code stream may be actively sent by the source device to the destination device, or the destination device may request a pull from the source device after receiving a screencast switching request. This application does not specifically limit this.
  • the transmission of the first code stream may use Transmission Control Protocol (TCP), Hyper Text Transfer Protocol (HTTP), WebSocket or various private protocols, which are not specifically limited in this application.
  • the above-mentioned related protocols may include URL projection protocols such as DLNA, AirPlay, GoogleCast, etc.
  • The URL projection method provided in this application extends these protocols and is deployed at both the Sender and Receiver ends.
  • different extension schemes can be adopted according to the characteristics of the protocol to realize the compatibility extension of the protocol.
  • the following uses the WebSocket protocol as an example to describe the key points of transmitting the first code stream in the URL projection method provided by this application. Although it is easier to use TCP directly, more and more players currently choose to run in the browser as a web application (WebApp), and the browser does not directly access the TCP standard interface.
  • Sender always acts as a WebSocket client (Client), and actively initiates a WebSocket connection.
  • Receiver is always used as a WebSocket server to passively monitor connections.
  • Receiver dynamically selects the WebSocket Server port (port) and reports to the Sender.
  • This method relies on a premise that when Receiver responds to Sender's screencasting request, it can extend the screencasting response message with compatibility and carry its selected port.
  • DLNA initiates a screencast request through SOAP SetAVTransportURI, and the screencast response is in XML format, which can be extended for compatibility.
  • AirPlay initiates a screencast request through HTTP POST /play, and the screencast response has no HTTP body; an HTTP body or HTTP header can be added to achieve a compatible extension.
  • GoogleCast is special: after the Sender initiates a screencast request through the SDK, both ends can establish a WebSocket connection through the SDK and exchange messages, and this WebSocket connection can be directly reused to send the first code stream. In cases such as GoogleCast, it is not necessary to establish a WebSocket connection in the way described below; instead, a WebSocket connection is regarded as already established once the screen projection is started.
  • The Sender carries an extended parameter in the screencast request, indicating that it supports this solution. If the Receiver supports this solution, it recognizes the extended parameter in the screencast request.
  • Before responding to the Sender, the Receiver first starts the WebSocket Server, then sends the screencast response to the Sender, with the screencast response carrying the WebSocket Server port.
  • After the Sender recognizes the WebSocket Server port in the Receiver's response, it initiates a WebSocket connection request.
  • The WebSocket Server Port returned by the Receiver is valid for one complete screencast, during which the Receiver keeps listening on the Port, waiting for connections.
  • A complete screencasting process refers to the time range from when the Sender initiates the screencast request until the user terminates the screencast through the Sender or Receiver and the reverse transmission of the first code stream completes (or a timeout occurs, or the Sender initiates a new screencast request).
  • After the Receiver detects the request to terminate the screencast, it waits for the Sender to fetch the first code stream it has cached and starts a timer. When the transmission of the first code stream completes, the timer expires, or the same Sender is observed to initiate a new screencast request, the Receiver determines that the current screencast process is over, shuts down the WebSocket Server, and the corresponding Port becomes invalid. The Sender decides whether to maintain a long connection or a short connection. A long connection means the Sender does not disconnect the WebSocket connection after sending the first code stream; when the screencast is terminated, it uses this connection to fetch the first code stream back, then closes the connection on completion. A short connection means the Sender disconnects the WebSocket connection immediately after sending the first code stream; when the screencast is terminated, it re-establishes a WebSocket connection with the Receiver to fetch the first code stream back, then closes the connection on completion.
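  The dynamic-port handshake above can be sketched in a few lines. This is a minimal illustration only: it assumes a JSON-extensible screencast response with a hypothetical `ws_port` field, whereas a real DLNA or AirPlay response would carry the port in its own XML body or HTTP header as described above.

```python
import json
import socket

def receiver_build_screencast_response(ok: bool) -> dict:
    """Receiver side: dynamically pick a port for the WebSocket server
    and carry it in the (compatibly extended) screencast response."""
    if not ok:
        return {"status": "NAK"}
    # Binding port 0 asks the OS for a free ephemeral port.
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind(("0.0.0.0", 0))
    srv.listen(1)
    port = srv.getsockname()[1]
    # A real Receiver keeps `srv` listening for the whole screencast
    # session; the sketch closes it immediately to stay self-contained.
    srv.close()
    return {"status": "ACK", "ws_port": port}  # `ws_port` is an assumed extension field

def sender_parse_response(raw: str):
    """Sender side: extract the WebSocket Server Port, if present."""
    resp = json.loads(raw)
    if resp.get("status") != "ACK":
        return None
    return resp.get("ws_port")
```

  On a NAK, or when the extension field is absent (the Receiver does not support this solution), the Sender simply gets no port and falls back to ordinary URL projection.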
  • This application defines new messages to transmit the media information and the first code stream.
  • Each message may have a uniform fixed-length message header carrying the message type and a response code, followed by an optional message body.
  • The following exemplifies the functions of several new messages. It should be understood that these messages are examples of how the URL projection method provided by this application extends the relevant protocols; they do not constitute a limitation.
  • PutCacheMeta: when screen projection starts, after the Sender establishes a WebSocket connection with the Receiver, the Sender sends a PutCacheMeta message carrying the media information.
  • PutCacheMedia: after receiving the Receiver's ACK to PutCacheMeta, the Sender sends a PutCacheMedia message.
  • The PutCacheMedia message carries the first code stream.
  • GetCacheMeta: when the screencast is terminated, the Sender sends a GetCacheMeta message, and the Receiver carries the media information in its response.
  • GetCacheMedia: when the screencast is terminated, the Sender sends a GetCacheMedia message, and the Receiver replies with the first code stream.
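  The "fixed-length header + optional body" framing described above can be sketched as follows. The field widths and the numeric type codes are illustrative assumptions; the application does not fix a wire layout.

```python
import struct

# Hypothetical type codes for the four new messages (values are not
# specified by the application; chosen here for illustration only).
PUT_CACHE_META, PUT_CACHE_MEDIA, GET_CACHE_META, GET_CACHE_MEDIA = 1, 2, 3, 4

# Header: message type (1 byte) | response code (1 byte) | body length (4 bytes).
HEADER = struct.Struct(">BBI")

def pack_message(msg_type: int, resp_code: int = 0, body: bytes = b"") -> bytes:
    """Uniform fixed-length header followed by an optional message body."""
    return HEADER.pack(msg_type, resp_code, len(body)) + body

def unpack_message(data: bytes):
    """Split a framed message back into (type, response code, body)."""
    msg_type, resp_code, length = HEADER.unpack_from(data)
    body = data[HEADER.size:HEADER.size + length]
    return msg_type, resp_code, body
```

  A PutCacheMeta body would carry the media information (for example as JSON), while a PutCacheMedia body would carry the raw first code stream; the response code field lets the same header double as an ACK/NAK carrier.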
  • FIG. 8 shows a possible embodiment of the URL projection method of the present application.
  • FIG. 8 is a flowchart of Embodiment 3 of the URL projection method of this application.
  • the process 800 may be executed by the source device, the destination device, and the media server.
  • the process 800 is described as a series of steps or operations. It should be understood that the process 800 may be executed in various orders and/or occur simultaneously, and is not limited to the execution order shown in FIG. 8.
  • the method of this embodiment may include:
  • Step 801 The user initiates screen projection through Sender.
  • Step 802 Sender sends a screen projection request to Receiver.
  • Step 803 Receiver sends a screencast response to Sender.
  • If the Receiver accepts the screencast request, it can feed back an acknowledgement response (Acknowledge, ACK); otherwise, it can feed back a negative acknowledgement (Negative Acknowledge, NAK) response.
  • The Receiver can carry extended information such as the WebSocket Server Port in the ACK.
  • Step 804 The Sender requests the Receiver to establish a WebSocket connection.
  • Step 805 Receiver sends a connection establishment response to Sender.
  • If the connection is established successfully, the Receiver replies ACK; otherwise it replies NAK.
  • Step 806 The Sender determines the first code stream and download instruction information according to the current playback progress.
  • Step 807 The Sender sends a PutCacheMeta message to the Receiver.
  • the PutCacheMeta message carries media information.
  • Step 808 The Sender sends a PutCacheMedia message to the Receiver.
  • the PutCacheMedia message carries the first code stream.
  • Step 809 Receiver sends a download request to the media server.
  • Step 810 Receiver downloads the second code stream from the media server.
  • Step 811 When the data amount of the first code stream is sufficient, Receiver starts to play the video.
  • Step 812 The user terminates the screen projection through Sender.
  • Step 813 The Sender sends a request to terminate the screen projection to the Receiver.
  • Step 814 The Receiver sends a response to terminate the screen projection to the Sender.
  • Step 815 Receiver determines the first code stream and download instruction information according to the current playback progress.
  • Step 816 The Sender sends a GetCacheMeta message to the Receiver.
  • Step 817 Receiver carries media information in the response message.
  • Step 818 The Sender sends a GetCacheMedia message to the Receiver.
  • Step 819 Receiver sends the first code stream to Sender.
  • Step 820 Sender sends a download request to the media server.
  • Step 821 Sender downloads the second code stream from the media server.
  • Step 822 When the data amount of the first code stream is sufficient, the Sender starts to play the video.
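  Steps 811 and 822 both gate playback on the cached amount of the first code stream being "sufficient". A minimal sketch of that check, assuming the 2–3 second initial cache the description mentions and an illustrative constant-bitrate model (all parameter names are assumptions, not part of the protocol):

```python
def can_start_playback(buffered_bytes: int, bitrate_bps: int,
                       min_seconds: float = 2.0) -> bool:
    """Start playback only once the cached first code stream covers at
    least `min_seconds` of video. With variable-bitrate media a real
    player would inspect frame timestamps instead of dividing by an
    average bitrate."""
    return buffered_bytes * 8 >= bitrate_bps * min_seconds
```

  The same predicate serves both directions of the switch: the Receiver in step 811 and the Sender in step 822 each keep downloading until the check passes, then start the video from the switch-time position.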
  • the URL projection method of this application is introduced above, and the device of this application is introduced below.
  • The apparatus of this application includes a screen projection device applied to a source device and a screen projection device applied to a destination device.
  • It should be understood that the screen projection device applied to the source device has any function of the source device in the above method, and the screen projection device applied to the destination device has any function of the destination device in the above method.
  • the screen projection device applied to the destination device includes: a receiving module 901, a playing module 902, and a decoding module 903.
  • The receiving module 901 is used to receive the first code stream sent by the source device; the first code stream is the code stream the source device downloaded from the media server before the screencast switch. The receiving module 901 is also used to receive the download instruction information sent by the source device and to download the second code stream from the media server according to it.
  • The start frame of the second code stream is indicated by the download instruction information and is related to the end frame of the first code stream.
  • the playing module 902 is configured to play a video according to the first code stream or the second code stream.
  • the start frame of the first code stream is the key frame Fk corresponding to the image frame Fc being played by the source device when the screen is switched.
  • the end frame of the first code stream is the last image frame Fd that has been downloaded by the source device when the screen is switched.
  • When the image frame being played by the source device at the screencast switch is Fc, the last image frame the source device has downloaded is Fd, and Fd and Fc belong to different media segments (each media segment contains multiple image frames), the end frame of the first code stream is the last image frame of the media segment to which Fc belongs.
  • The start frame of the second code stream being related to the end frame of the first code stream specifically includes: the start frame of the second code stream is the first image frame of the media segment to which the image frame after the end frame of the first code stream belongs.
  • When the image frame being played by the source device at the screencast switch is Fc, the last image frame downloaded by the source device is Fd, the image frame after Fd is Fn, and Fn and Fc belong to different media segments, the start frame of the second code stream is F1, the first image frame of the media segment to which Fn belongs, and the end frame of the first code stream is F0, the image frame immediately before F1.
  • The playback module 902 is specifically configured to determine, from the first code stream or the second code stream, the first image frame to be played after the screencast switch, and to start playing the video from that frame.
  • the receiving module 901 is also used to receive the playback time indication information sent by the source device, and the playback time indication information is used to indicate the image frame that the source device is playing when the screen is switched.
  • The playback module is also used to, when the data amount of the first code stream is greater than a set threshold, determine from the first code stream, according to the playback time indication information, that the first image frame to be played is Fc; play video from Fc to the end frame of the first code stream according to the first code stream; and play video starting from Fm according to the second code stream, where Fm is the image frame after the end frame of the first code stream.
  • the receiving module 901 is specifically configured to receive the first code stream sent by the source device when the total data amount of the first code stream is not less than a set threshold.
  • The receiving module 901 is further configured to, when the total data amount of the first code stream is less than the set threshold, receive the data amount indication information sent by the source device and download the third code stream from the media server; the start frame of the third code stream is the first image frame of the media segment to which Fc, the image frame the source device was playing at the screencast switch, belongs.
  • The receiving module 901 is further configured to, when the total data amount of the first code stream is less than the set threshold, receive the data amount indication information and the first code stream sent by the source device, and download the fourth code stream from the media server; the last image frame of the first code stream is Fd, the image frame after Fd is Fn, and the start frame of the fourth code stream is the first frame of the media segment to which Fn belongs.
  • the receiving module 901 is further configured to receive control information sent by the source device, where the control information includes the encapsulation format of the first code stream; and receive format data sent by the source device;
  • the decoding module 903 is configured to decapsulate the format data according to the encapsulation format to obtain the first code stream.
  • the device of this embodiment can be used to implement the technical solution of any one of the method embodiments shown in FIG. 3 to FIG. 8. Its implementation principles and technical effects are similar, and will not be repeated here.
  • The screen projection device applied to the source device includes: a processing module 1001, a sending module 1002, and an encapsulation module 1003.
  • The processing module 1001 is configured to determine a first code stream and download instruction information according to the playback progress at the screencast switch. The first code stream is the code stream downloaded from the media server before the switch; the download instruction information is used to indicate the start frame of the second code stream, which is the code stream the destination device is to download from the media server; and the start frame of the second code stream is related to the end frame of the first code stream.
  • the sending module 1002 is configured to send the download instruction information and the first code stream to the destination device.
  • the start frame of the first code stream is the key frame Fk corresponding to the image frame Fc that is being played when the screen is switched.
  • the end frame of the first code stream is the last image frame Fd that has been downloaded when the projection screen is switched.
  • When the image frame being played at the screencast switch is Fc, the last image frame that has been downloaded is Fd, and Fd and Fc belong to different media segments (each media segment contains multiple image frames), the end frame of the first code stream is the last image frame of the media segment to which Fc belongs.
  • The start frame of the second code stream being related to the end frame of the first code stream specifically includes: the start frame of the second code stream is the first image frame of the media segment to which the image frame after the end frame of the first code stream belongs.
  • When the image frame being played at the screencast switch is Fc, the last image frame that has been downloaded is Fd, the image frame after Fd is Fn, and Fn and Fc belong to different media segments, the start frame of the second code stream is F1, the first image frame of the media segment to which Fn belongs, and the end frame of the first code stream is F0, the image frame immediately before F1.
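  The frame relationships above (Fn = Fd + 1, F1 the first frame of Fn's segment, F0 = F1 - 1) reduce to simple segment arithmetic. A minimal sketch, assuming fixed-size media segments and 0-based frame indices — both illustrative simplifications, since real HLS/FMP4 segments vary in length:

```python
FRAMES_PER_SEGMENT = 30  # illustrative: each media segment holds 30 frames

def second_stream_start(fd: int, frames_per_segment: int = FRAMES_PER_SEGMENT) -> int:
    """Given Fd, the last frame the source has downloaded, return F1:
    the first frame of the media segment containing Fn = Fd + 1."""
    fn = fd + 1
    return (fn // frames_per_segment) * frames_per_segment

def first_stream_end(fd: int, frames_per_segment: int = FRAMES_PER_SEGMENT) -> int:
    """F0, the frame just before F1, closes the first code stream so the
    two streams meet without overlap."""
    return second_stream_start(fd, frames_per_segment) - 1
```

  When Fd falls mid-segment, F1 lies before Fd, so the source trims the tail of its cache back to F0 before sending, which is exactly the overlap-removal behavior the description attributes to the source device.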
  • the sending module 1002 is also used to send play time indication information to the destination device, where the play time indication information is used to indicate the image frame Fc being played when the screen is switched.
  • The sending module 1002 is further configured to send data volume indication information to the destination device when the total data volume of the first code stream is less than a set threshold, to notify the destination device that the total data amount of the first code stream is less than the set threshold.
  • The encapsulation module 1003 is used to encapsulate the first code stream according to the set encapsulation format to obtain format data; the sending module 1002 is specifically used to send control information and the format data to the destination device, where the control information includes the encapsulation format.
  • the device of this embodiment can be used to implement the technical solution of any one of the method embodiments shown in FIG. 3 to FIG. 8. Its implementation principles and technical effects are similar, and will not be repeated here.
  • the steps of the foregoing method embodiments can be completed by hardware integrated logic circuits in the processor or instructions in the form of software.
  • the steps of the method disclosed in the present application can be directly embodied as being executed and completed by a hardware encoding processor, or executed and completed by a combination of hardware and software modules in the encoding processor.
  • The software module can be located in a mature storage medium in the field, such as random access memory, flash memory, read-only memory, programmable read-only memory, electrically erasable programmable memory, or registers.
  • the storage medium is located in the memory, and the processor reads the information in the memory and completes the steps of the above method in combination with its hardware.
  • the disclosed system, device, and method can be implemented in other ways.
  • the device embodiments described above are only illustrative.
  • The division of the units is only a logical function division; in actual implementation there may be other divisions, for example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not implemented.
  • the displayed or discussed mutual coupling or direct coupling or communication connection may be indirect coupling or communication connection through some interfaces, devices or units, and may be in electrical, mechanical or other forms.
  • If the function is implemented in the form of a software functional unit and sold or used as an independent product, it can be stored in a computer-readable storage medium.
  • The technical solution of the present application, in essence, or the part that contributes to the existing technology, or a part of the technical solution, can be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions for causing a computer device (a personal computer, server, network device, etc.) to execute all or part of the steps of the method described in each embodiment of the present application.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

This application provides a URL screen projection method and apparatus. The method includes: a source device determines a first code stream and download indication information according to the playback progress at the time of the screencast switch, where the first code stream is the code stream downloaded from a media server before the switch, and the download indication information indicates the start frame of a second code stream, which the destination device is to download from the media server; the start frame of the second code stream is related to the end frame of the first code stream; the source device sends the download indication information and the first code stream to the destination device; the destination device downloads the second code stream from the media server; and the destination device plays video according to the first code stream or the second code stream. This application can improve the responsiveness of screencast switching and prevent video playback from stalling.

Description

URL screen projection method and apparatus
This application claims priority to Chinese patent application No. 202010177999.2, filed with the China National Intellectual Property Administration on March 13, 2020 and entitled "URL screen projection method and apparatus", which is incorporated herein by reference in its entirety.
Technical Field
This application relates to screen projection technology, and in particular to a URL screen projection method and apparatus.
Background
With the popularity of smart mobile terminals and the rapid growth of Internet video services, users increasingly play video on smart terminals such as phones and tablets and, when desired, project it to large-screen devices such as TVs and set-top boxes. Screen projection based on the Digital Living Network Alliance (DLNA) protocol, a typical Uniform Resource Locator (URL) projection technology, serves exactly this need. In DLNA projection, the source device (SourcePlayer) sends the video's URL to the destination device (DestPlayer); the DestPlayer requests the content corresponding to the URL from the media server and plays it.
However, with the above URL projection technology, to ensure a fast switching response, the DestPlayer usually starts playing the video from the beginning after the switch and cannot stay synchronized with the SourcePlayer's playback progress, which hurts both playback efficiency and user experience.
Summary
This application provides a URL screen projection method and apparatus that can improve the responsiveness of screencast switching and prevent video playback from stalling.
In a first aspect, this application provides a URL screen projection method, including: receiving a first code stream sent by a source device, the first code stream being the code stream the source device downloaded from a media server before the screencast switch; receiving download indication information sent by the source device; downloading a second code stream from the media server, the start frame of the second code stream being indicated by the download indication information and related to the end frame of the first code stream; and playing video according to the first code stream or the second code stream.
At the screencast switch, the source device sends its cached code stream to the destination device. On one hand, the cached stream is transferred over the local area network, which offers higher transmission bandwidth and better Quality of Service (QoS), enabling fast and stable transfer of the cached stream. On the other hand, while receiving the cached stream from the source device, the destination device simultaneously downloads the subsequent stream from the media server, so it can quickly start playback from the stream it already has, improving switching responsiveness, while also accumulating enough data to prevent playback from stalling.
In a possible implementation, the start frame of the first code stream is the key frame Fk corresponding to the image frame Fc that the source device is playing at the screencast switch.
The destination device cannot decode the image of Fc from Fc's data alone; it also needs Fc's key frame Fk (the key frame nearest to and preceding Fc), whose information is required to fully decode Fc's image, and likewise for the frames after Fc. Therefore, even though Fk has already been played by the time of the switch, Fk must be sent to the destination device to keep the first code stream complete, so the start frame of the first code stream can be Fk. This both reduces the amount of data transferred between the source and destination devices and provides enough image information to ensure effective decoding.
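  Finding Fk is a lookup for the nearest key frame at or before Fc. A minimal sketch, assuming the source player can enumerate key-frame indices (the list representation is illustrative; a real player would read this from the container index):

```python
def key_frame_for(fc: int, key_frames: list[int]) -> int:
    """Return Fk: the key frame closest to and not after the frame Fc
    that was playing when the screencast switched."""
    candidates = [k for k in key_frames if k <= fc]
    if not candidates:
        raise ValueError("no key frame precedes Fc")
    return max(candidates)
```

  If Fc is itself an I-frame, Fk equals Fc and the first code stream starts right at the playback position.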
In a possible implementation, the end frame of the first code stream is Fd, the last image frame the source device has finished downloading at the screencast switch.
In this case, the source device sends the destination device the code stream of all image frames it has downloaded from the media server but not yet played, i.e., the frames from Fc to Fd. This embodiment transfers as much of the already-downloaded stream as possible over the local area network, which reduces the amount of the second code stream the destination device must download from the media server.
In a possible implementation, the image frame the source device is playing at the screencast switch is Fc, the last image frame it has finished downloading is Fd, Fd and Fc belong to different media segments, each media segment contains multiple image frames, and the end frame of the first code stream is the last image frame of the media segment to which Fc belongs.
To ensure the destination can keep projection time synchronized, and to match how media data is stored, the first code stream sent by the source device may end at Fe, the last image frame of the media segment to which Fc belongs.
In a possible implementation, the start frame of the second code stream being related to the end frame of the first code stream specifically includes: the start frame of the second code stream is the first image frame of the media segment to which the image frame after the end frame of the first code stream belongs.
Normally the code stream a source device downloads from the media server starts at the first image frame of the video's first media segment. In this application, the stream the destination device is to download from the media server (the second code stream) starts at the first image frame of the media segment containing the image frame after the end frame of the first code stream. This guarantees that the second code stream downloaded from the server joins up with the first code stream received from the source device, further ensuring projection time synchronization.
For example, when the end frame of the first code stream is Fd, the last image frame downloaded at the switch, then, considering the media segmentation on the server, the start frame of the second code stream may be the first image frame of the media segment containing Fn, the image frame after Fd; in this case the tail of the first code stream and the head of the second code stream may overlap. When the end frame of the first code stream is Fe, the last image frame of the media segment containing Fc, the start frame of the second code stream may be the first image frame of the media segment containing the frame after Fe. Since Fe is the last frame of its segment, the frame after Fe is the first frame of the adjacent segment, which is exactly the start frame of the second code stream, so the two streams do not overlap. Thus, on the basis of being contiguous, the two streams can avoid overlapping as much as possible, reducing redundant transmission; even when they do overlap, the overlap is only part of a media segment, and this redundancy is acceptable given the overriding goal of projection time synchronization.
In a possible implementation, the image frame the source device is playing at the screencast switch is Fc, the last image frame it has finished downloading is Fd, the image frame after Fd is Fn, Fn and Fc belong to different media segments, the start frame of the second code stream is F1, the first image frame of the media segment to which Fn belongs, and the end frame of the first code stream is F0, the image frame immediately before F1.
The source device removes from its cached stream the part that could overlap with the second code stream and transfers the remainder over the local area network, which reduces the amount of the second code stream the destination device must download from the media server.
In a possible implementation, playing video according to the first code stream or the second code stream specifically includes: determining, from the first or second code stream, the first image frame to be played after the screencast switch, and starting playback of the video from that frame.
In a possible implementation, before playing video according to the first or second code stream, the method further includes: receiving playback time indication information from the source device, indicating the image frame Fc the source device is playing at the switch. Playing the video then includes: when the data amount of the first code stream exceeds a set threshold, determining from the first code stream, according to the playback time indication information, that the first frame to play is Fc; playing from Fc to the end frame of the first code stream according to the first code stream; and playing from Fm onward according to the second code stream, Fm being the image frame after the end frame of the first code stream.
Receiving the first code stream and downloading the second code stream from the media server are two independent processes and can proceed simultaneously. Since the first code stream as a whole precedes the second, the destination device can start playback from the first code stream. While receiving both streams, once the received portion of the first code stream reaches an amount sufficient for playback, the destination device can start the video from the playback position at the switch, which helps improve switching responsiveness. By the time the first code stream finishes playing, the destination device has accumulated enough of the second code stream to continue playback from it, ensuring smooth video playback.
In a possible implementation, receiving the first code stream sent by the source device includes: receiving the first code stream from the source device when its total data amount is not less than a set threshold.
In a possible implementation, the method further includes: when the total data amount of the first code stream is less than the set threshold, receiving data amount indication information from the source device; and downloading a third code stream from the media server, whose start frame is the first image frame of the media segment containing Fc, the image frame the source device is playing at the screencast switch.
In a possible implementation, the method further includes: when the total data amount of the first code stream is less than the set threshold, receiving the data amount indication information and the first code stream from the source device; and downloading a fourth code stream from the media server, where the last image frame of the first code stream is Fd, the image frame after Fd is Fn, and the start frame of the fourth code stream is the first frame of the media segment containing Fn.
If, at the screencast switch, the cached stream the source device has downloaded from the media server is small (below the set threshold), then even if it is sent, the destination device still needs to download almost the entire stream from the media server, or cannot play video from the first code stream, or the bandwidth and latency cost of transferring the first code stream exceeds that of downloading directly from the media server. In that case, the source device can send the destination device data amount indication information, notifying it that the total data amount of the first code stream is below the set threshold. The source device may then choose whether or not to send the first code stream to the destination device.
In a possible implementation, before receiving the first code stream sent by the source device, the method further includes: receiving control information from the source device, the control information including the encapsulation format of the first code stream. Receiving the first code stream then includes: receiving format data from the source device and decapsulating it according to the encapsulation format to obtain the first code stream.
The image frames of the first code stream may span multiple media segments; the source device can encapsulate the first code stream as a single media segment, e.g. a TS or FMP4 segment, to simplify the organization of the transferred data. Optionally, the source device may also split the first code stream into multiple media segments for sending; this application does not specifically limit this. The object being encapsulated here is the video code stream downloaded and cached from the media server; this encapsulation improves the performance of projection time synchronization and does not involve image encoding or decoding, which lowers algorithmic complexity and hardware requirements.
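  The control-information/format-data split described above can be sketched as a tiny envelope. The JSON control layout and the `encapsulation` key are assumptions for illustration; real control information would also carry the media metadata, and the format data would be an actual TS or FMP4 segment handed to a demuxer.

```python
import json

def encapsulate(first_stream: bytes, container: str = "fmp4") -> tuple[bytes, bytes]:
    """Source side: produce control information announcing the container
    format, plus the format data. `container` is illustrative; the
    description mentions TS and FMP4 segments."""
    control = json.dumps({"encapsulation": container}).encode()
    return control, first_stream

def decapsulate(control: bytes, format_data: bytes) -> tuple[str, bytes]:
    """Destination side: read the announced format, then hand the data
    to the matching demuxer (not modeled here)."""
    fmt = json.loads(control)["encapsulation"]
    return fmt, format_data
```

  Because only repackaging is involved, no transcoding of the image data takes place at either end.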
In a possible implementation, after playback starts, the destination device periodically reports its current playback state (including the playback position) to the source device; if the user adjusts the playback progress by dragging, this can also be reported to the source device.
In a possible implementation, new messages are defined to transmit the media information and the first code stream; each message may include a uniform fixed-length header carrying the message type and a response code, followed by an optional message body.
In a second aspect, this application provides a URL screen projection method, including: determining a first code stream and download indication information according to the playback progress at the screencast switch, the first code stream being the code stream downloaded from a media server before the switch, the download indication information indicating the start frame of a second code stream that the destination device is to download from the media server, the start frame of the second code stream being related to the end frame of the first code stream; and sending the download indication information and the first code stream to the destination device.
In a possible implementation, the start frame of the first code stream is the key frame Fk corresponding to the image frame Fc playing at the screencast switch.
In a possible implementation, the end frame of the first code stream is Fd, the last image frame downloaded at the screencast switch.
In a possible implementation, the image frame playing at the switch is Fc, the last downloaded image frame is Fd, Fd and Fc belong to different media segments, each media segment contains multiple image frames, and the end frame of the first code stream is the last image frame of the media segment to which Fc belongs.
In a possible implementation, the start frame of the second code stream being related to the end frame of the first code stream specifically includes: the start frame of the second code stream is the first image frame of the media segment to which the image frame after the end frame of the first code stream belongs.
In a possible implementation, the image frame playing at the switch is Fc, the last downloaded image frame is Fd, the image frame after Fd is Fn, Fn and Fc belong to different media segments, the start frame of the second code stream is F1, the first image frame of the media segment to which Fn belongs, and the end frame of the first code stream is F0, the image frame before F1.
In a possible implementation, before sending the download indication information and the first code stream to the destination device, the method further includes: sending playback time indication information to the destination device, indicating the image frame Fc playing at the screencast switch.
In a possible implementation, the method further includes: when the total data amount of the first code stream is less than a set threshold, sending data amount indication information to the destination device, notifying it that the total data amount of the first code stream is below the set threshold.
In a possible implementation, before sending the download indication information and the first code stream to the destination device, the method further includes: encapsulating the first code stream according to a set encapsulation format to obtain format data; and sending control information and the format data to the destination device, the control information including the encapsulation format.
In a possible implementation, when the source device initiates projection, the screencast switch request may carry an extended parameter in addition to the URL, indicating that the source device supports the URL projection method provided by this application. A destination device that supports this method can recognize the extended parameter in the projection request.
In a possible implementation, new messages are defined to transmit the media information and the first code stream; each message may include a uniform fixed-length header carrying the message type and a response code, followed by an optional message body.
In a third aspect, this application provides a screen projection apparatus, including: a receiving module configured to receive a first code stream sent by a source device, the first code stream being the code stream the source device downloaded from a media server before the screencast switch; the receiving module further configured to receive download indication information sent by the source device and to download a second code stream from the media server according to it, the start frame of the second code stream being indicated by the download indication information and related to the end frame of the first code stream; and a playing module configured to play video according to the first code stream or the second code stream.
In a possible implementation, the start frame of the first code stream is the key frame Fk corresponding to the image frame Fc the source device is playing at the screencast switch.
In a possible implementation, the end frame of the first code stream is Fd, the last image frame the source device has finished downloading at the switch.
In a possible implementation, the image frame the source device is playing at the switch is Fc, the last image frame it has finished downloading is Fd, Fd and Fc belong to different media segments, each media segment contains multiple image frames, and the end frame of the first code stream is the last image frame of the media segment to which Fc belongs.
In a possible implementation, the start frame of the second code stream being related to the end frame of the first code stream specifically includes: the start frame of the second code stream is the first image frame of the media segment to which the image frame after the end frame of the first code stream belongs.
In a possible implementation, the image frame the source device is playing at the switch is Fc, the last downloaded image frame is Fd, the image frame after Fd is Fn, Fn and Fc belong to different media segments, the start frame of the second code stream is F1, the first image frame of the media segment to which Fn belongs, and the end frame of the first code stream is F0, the image frame before F1.
In a possible implementation, the playing module is specifically configured to determine, from the first or second code stream, the first image frame to be played after the switch, and to start playback of the video from that frame.
In a possible implementation, the receiving module is further configured to receive playback time indication information from the source device, indicating the image frame Fc the source device is playing at the switch; the playing module is further configured to, when the data amount of the first code stream exceeds a set threshold, determine from the first code stream according to the playback time indication information that the first frame to play is Fc, play from Fc to the end frame of the first code stream according to the first code stream, and play from Fm onward according to the second code stream, Fm being the image frame after the end frame of the first code stream.
In a possible implementation, the receiving module is specifically configured to receive the first code stream from the source device when its total data amount is not less than a set threshold.
In a possible implementation, the receiving module is further configured to, when the total data amount of the first code stream is less than the set threshold, receive the data amount indication information from the source device and download a third code stream from the media server, whose start frame is the first image frame of the media segment containing Fc, the image frame the source device is playing at the switch.
In a possible implementation, the receiving module is further configured to, when the total data amount of the first code stream is less than the set threshold, receive the data amount indication information and the first code stream from the source device and download a fourth code stream from the media server; the last image frame of the first code stream is Fd, the image frame after Fd is Fn, and the start frame of the fourth code stream is the first frame of the media segment containing Fn.
In a possible implementation, the receiving module is further configured to receive control information from the source device, the control information including the encapsulation format of the first code stream, and to receive format data from the source device; the apparatus further includes a decoding module configured to decapsulate the format data according to the encapsulation format to obtain the first code stream.
In a fourth aspect, this application provides a screen projection apparatus, including: a processing module configured to determine a first code stream and download indication information according to the playback progress at the screencast switch, the first code stream being the code stream downloaded from a media server before the switch, the download indication information indicating the start frame of a second code stream that the destination device is to download from the media server, the start frame of the second code stream being related to the end frame of the first code stream; and a sending module configured to send the download indication information and the first code stream to the destination device.
In a possible implementation, the start frame of the first code stream is the key frame Fk corresponding to the image frame Fc playing at the screencast switch.
In a possible implementation, the end frame of the first code stream is Fd, the last image frame downloaded at the switch.
In a possible implementation, the image frame playing at the switch is Fc, the last downloaded image frame is Fd, Fd and Fc belong to different media segments, each media segment contains multiple image frames, and the end frame of the first code stream is the last image frame of the media segment to which Fc belongs.
In a possible implementation, the start frame of the second code stream being related to the end frame of the first code stream specifically includes: the start frame of the second code stream is the first image frame of the media segment to which the image frame after the end frame of the first code stream belongs.
In a possible implementation, the image frame playing at the switch is Fc, the last downloaded image frame is Fd, the image frame after Fd is Fn, Fn and Fc belong to different media segments, the start frame of the second code stream is F1, the first image frame of the media segment to which Fn belongs, and the end frame of the first code stream is F0, the image frame before F1.
In a possible implementation, the sending module is further configured to send playback time indication information to the destination device, indicating the image frame Fc playing at the screencast switch.
In a possible implementation, the sending module is further configured to, when the total data amount of the first code stream is less than a set threshold, send data amount indication information to the destination device, notifying it that the total data amount of the first code stream is below the set threshold.
In a possible implementation, the apparatus further includes: an encapsulation module configured to encapsulate the first code stream according to a set encapsulation format to obtain format data; the sending module is specifically configured to send control information and the format data to the destination device, the control information including the encapsulation format.
In a fifth aspect, this application provides a screen projection apparatus, including a processor and a transmission interface; the processor is configured to invoke program instructions stored in a memory to implement the method of any one of the first or second aspects.
In a sixth aspect, this application provides a computer-readable storage medium including a computer program which, when executed on a computer or processor, causes the computer or processor to perform the method of any one of the first or second aspects.
In a seventh aspect, this application provides a computer program which, when executed by a computer or processor, performs the method of any one of the first or second aspects.
Brief Description of the Drawings
FIG. 1 is an exemplary schematic diagram of a screen projection scenario;
FIG. 2 is an exemplary structural diagram of a device 200;
FIG. 3 is a flowchart of Embodiment 1 of the URL projection method of this application;
FIG. 4 is an exemplary sequence diagram of image frames;
FIG. 5 is an exemplary sequence diagram of image frames;
FIG. 6 is an exemplary sequence diagram of image frames;
FIG. 7 is a flowchart of Embodiment 2 of the URL projection method of this application;
FIG. 8 shows a possible embodiment of the URL projection method of this application;
FIG. 9 is a structural diagram of Embodiment 1 of the projection apparatus of this application;
FIG. 10 is a structural diagram of Embodiment 2 of the projection apparatus of this application.
Detailed Description
To make the purpose, technical solutions, and advantages of this application clearer, the technical solutions in this application are described clearly and completely below with reference to the accompanying drawings. Evidently, the described embodiments are part of the embodiments of this application rather than all of them. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of this application without creative effort fall within the protection scope of this application.
The terms "first", "second", etc. in the specification, claims, and drawings of this application are used only to distinguish the described objects and are not to be understood as indicating or implying relative importance or order. Moreover, the terms "include" and "have" and any variants thereof are intended to cover non-exclusive inclusion, for example, inclusion of a series of steps or units. A method, system, product, or device need not be limited to the steps or units expressly listed, but may include other steps or units not expressly listed or inherent to the process, method, product, or device.
It should be understood that in this application, "at least one (item)" means one or more, and "multiple" means two or more. "And/or" describes an association between objects and indicates three possible relationships; for example, "A and/or B" can mean: only A, only B, or both A and B, where A and B may be singular or plural. The character "/" generally indicates an "or" relationship between the associated objects. "At least one of the following" or similar expressions refer to any combination of these items, including any combination of single or plural items. For example, at least one of a, b, or c can mean: a, b, c, "a and b", "a and c", "b and c", or "a and b and c", where a, b, and c may each be single or multiple.
Terms used in this application:
URL projection: a projection mode in which the projection initiator provides the video's URL to the projection receiver, and the receiver fetches the video code stream from the media server according to that URL and plays it. URL projection differs from mirror projection, which transmits the initiator's screen data to the receiver in real time so that the receiver displays the initiator's screen synchronously.
DLNA projection is a typical URL projection technology. Its process: a user plays video on a terminal device (e.g., a phone or tablet) and taps the cast button in the media player to initiate projection; the terminal device is the projection initiator. The initiator sends the video's URL to the projection receiver using the DLNA protocol; large-screen devices such as TVs and set-top boxes are receivers. Meanwhile, the initiator uses the DLNA seek instruction to indicate the current playback position to the receiver. The receiver requests the content corresponding to the URL from the media server and plays it.
The projection termination process: the user terminates projection via a button or key on the initiator or receiver, and the receiver stops playing the video. If the media player on the initiator has not exited, the initiator requests the content from the media server and continues playback.
Projection initiator (Sender): the party that initiates URL projection. When starting projection, the Sender provides the video URL to the Receiver. Devices such as phones, tablets, and computers can initiate URL projection through a running media player.
Projection receiver (Receiver): the receiving end of URL projection. During projection, the Receiver fetches the video code stream from the media server according to the URL provided by the Sender. Devices such as TVs, set-top boxes, audio-video (AV) amplifiers, ChromeCast, and computers can accept projection through a built-in receiving apparatus.
Media server (Server): the server providing video code streams. The Receiver requests the corresponding video code stream from the Server via the video's URL. Servers currently mainly provide media services via streaming protocols such as HTTP Live Streaming (HLS) and Fragmented MP4 (FMP4).
Initial cache data (Cache): the code stream of a stretch of video that a playing device must download and cache from the media server before starting playback, usually at least 2-3 seconds of video; otherwise playback tends to stall.
Screencast switch: when projection starts or ends, video playback switches from one end to the other. When projection starts, the Sender generally stops playing and the video switches to the Receiver. When projection ends, the Receiver stops playing; if the Sender's video player has not terminated, the video switches back to the Sender.
Destination device (DestPlayer): the end that should play the video after the switch. When projection starts, playback switches from the Sender to the Receiver, so the Receiver is the DestPlayer; when projection ends, the video should switch from the Receiver back to the Sender, so the Sender is the DestPlayer.
Source device (SourcePlayer): the end playing the video before the switch. Before projection starts, the Sender is the SourcePlayer; before projection ends, the Receiver is the SourcePlayer.
Screencast switching response speed: also called projection response speed or response speed; the speed with which the video switches to playing on the DestPlayer after the user starts a screencast switch.
Projection time synchronization: at the switch, the DestPlayer can start playing from the video position the SourcePlayer is playing.
Key frame: in video compression, each frame represents a still image. A key frame is an I-frame, which is compressed as a full frame, i.e., its entire frame image information is compression-coded, and a complete image can be reconstructed from the I-frame's data alone during decoding. An I-frame describes the image background and moving subject in detail and does not reference other image frames (intra-frame compression). Relative to key frames, ordinary image frames use inter-frame compression: decoding them requires combining the corresponding key frame's information with the frame's own residual information to obtain the final picture.
FIG. 1 is an exemplary schematic diagram of a URL projection scenario. As shown in FIG. 1, the source device sends the video's URL to the destination device, and the destination device requests the content corresponding to the URL from the media server and plays it. However, URL projection has long had two problems: (1) the switching response is not fast enough, so the video cannot quickly switch to playing on the destination device; (2) it is hard to ensure projection time synchronization while also maintaining switching responsiveness, so the destination device can hardly start playing from the position the source device is playing.
Current optimizations mostly focus on compressing the time cost of each stage of the switching process, or on pre-starting the destination device, and so on. This alleviates the problem to a degree but does not fundamentally solve it.
On one hand, improving the switching response speed is difficult. For networked video services, a playing device must download and cache a stretch of the video code stream before playback can start, typically at least 2-3 seconds. Because network jitter is unpredictable, the initial cache cannot be too small, or playback stalls easily. When bandwidth capacity is close to bandwidth demand, building the initial cache takes at least 2-3 seconds; accordingly, at the switch, the destination device needs 2-3 seconds before it can start playing. The tension between bandwidth demand and bandwidth capacity is long-standing, and video is a major bandwidth consumer: HD, UHD, 4K, 8K, Virtual Reality (VR)/Augmented Reality (AR), and so on quickly fill whatever new bandwidth users gain. In addition, downloading a code stream from a media server carries initial download overhead, e.g., the initial negotiation of a distribution network (such as a Content Delivery Network (CDN) plus a Peer-to-Peer (P2P) network). Media service providers optimize performance as much as possible, but the overhead always exists, typically takes hundreds of milliseconds, and is closely tied to the provider's Quality of Service (QoS). Clearly, without a more effective method, this problem cannot be fundamentally solved at the switch. On the other hand, providing projection time synchronization is also difficult. To play the video at time T, a device usually must download from the first frame of the media segment containing T. For example, suppose the media segment length is 5 seconds and the source device initiates a switch while playing at second 14. After receiving the switch request, the destination device seeks to segment 3, whose time range is seconds 10-15. To achieve time synchronization, besides the necessary initial cache (2-3 seconds after second 14), the destination device must also download the already-played data for seconds 10-13, which leads to an even longer wait.
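  The worked example above (5-second segments, a switch at second 14 landing in segment 3, seconds 10-15) is plain segment arithmetic. A minimal sketch, assuming fixed-length segments and the example's 1-based segment numbering:

```python
def segment_for_time(t_seconds: float, segment_seconds: float = 5.0):
    """Map a playback time to its media segment: (1-based index,
    segment start time, segment end time)."""
    index = int(t_seconds // segment_seconds) + 1
    start = (index - 1) * segment_seconds
    return index, start, start + segment_seconds
```

  The synchronization cost in the example follows directly: to play from second 14, a conventional destination device must fetch segment 3 from its start at second 10, redundantly downloading the already-played seconds 10-13 before it can even begin buffering.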
Hence, absent a more effective method, related technologies choose between switching response speed and projection time synchronization: either sacrifice response speed to strengthen time synchronization, or sacrifice time synchronization to improve response speed. Most currently choose the latter, improving switching response speed while always playing the video from the beginning at the switch.
This application provides a URL projection method aimed at solving the above problems. As shown in FIG. 1, the Sender in the URL projection method may also be called user equipment (UE) and may be deployed on land (indoor or outdoor, handheld or vehicle-mounted), on water (e.g., on ships), or in the air (e.g., on aircraft, balloons, or satellites). The source device may be a mobile phone, a tablet (pad), a wearable device with wireless communication capability (e.g., a smart watch), a location tracker with positioning capability, a computer with wireless transceiver capability, a VR device, an AR device, a wireless device in industrial control, in self driving, in remote medical, in a smart grid, in transportation safety, in a smart city, or in a smart home, etc.; this application does not limit this.
The destination device in the URL projection method may be a smart TV, a TV box, a projection screen, etc.; this application does not limit this.
The network between the source device and the destination device may be a communication network supporting short-range communication technologies, e.g., one supporting Wireless Fidelity (Wi-Fi), Bluetooth, or Near Field Communication (NFC); or a network supporting fourth-generation (4G) access technology such as Long Term Evolution (LTE); or one supporting fifth-generation (5G) access technology such as New Radio (NR); or one supporting third-generation (3G) access technology such as the Universal Mobile Telecommunications System (UMTS); or a network supporting multiple radio technologies, e.g., both LTE and NR; the communication network may also be a future-oriented communication technology. This application does not specifically limit this.
FIG. 2 is an exemplary structural diagram of a device 200. The device 200 can serve as the above source device or as the above destination device. As shown in FIG. 2, the device 200 includes: an application processor 201, a microcontroller unit (MCU) 202, a memory 203, a modem 204, a radio frequency (RF) module 205, a Wireless Fidelity (Wi-Fi) module 206, a Bluetooth module 207, sensors 208, input/output (I/O) devices 209, a positioning module 210, and other components. These components can communicate over one or more communication buses or signal lines. The aforementioned communication bus or signal line may be the CAN bus provided by this application. A person skilled in the art can understand that the device 200 may include more or fewer components than shown, combine certain components, or use a different component arrangement.
The components of the device 200 are described in detail below with reference to FIG. 2:
The application processor 201 is the control center of the device 200, connecting its components via various interfaces and buses. In some embodiments, the processor 201 may include one or more processing units.
The memory 203 stores computer programs, such as the operating system 211 and the applications 212 shown in FIG. 2. The application processor 201 is configured to execute the computer programs in the memory 203 to implement the functions they define; for example, the application processor 201 executes the operating system 211 to implement the operating system's functions on the device 200. The memory 203 also stores data other than the computer programs, such as data generated while the operating system 211 and applications 212 run. The memory 203 is a non-volatile storage medium and generally includes internal memory and external storage. Internal memory includes but is not limited to random access memory (RAM), read-only memory (ROM), and cache. External storage includes but is not limited to flash memory, hard disks, optical discs, Universal Serial Bus (USB) drives, etc. Computer programs are usually stored in external storage; the processor loads a program from external storage into internal memory before executing it.
The memory 203 may be independent and connected to the application processor 201 via a bus, or the memory 203 and the application processor 201 may be integrated into a chip subsystem.
The MCU 202 is a coprocessor for acquiring and processing data from the sensors 208. The MCU 202's processing capability and power consumption are lower than the application processor 201's, but it is "always on" and can keep collecting and processing sensor data while the application processor 201 is in sleep mode, guaranteeing normal sensor operation at extremely low power. In one embodiment, the MCU 202 may be a sensor hub chip. The sensors 208 may include a light sensor and a motion sensor. Specifically, the light sensor may include an ambient light sensor and a proximity sensor; the ambient light sensor can adjust the brightness of the display 2091 according to ambient light, and the proximity sensor can switch off the display power when the device 200 is moved to the ear. As one kind of motion sensor, an accelerometer can detect the magnitude of acceleration in all directions (generally three axes) and, when static, the magnitude and direction of gravity. The sensors 208 may further include a gyroscope, barometer, hygrometer, thermometer, infrared sensor, and other sensors, not detailed here. The MCU 202 and the sensors 208 may be integrated on the same chip or be separate components connected by a bus.
The modem 204 and the RF module 205 form the device 200's communication subsystem, implementing the main functions of wireless communication standard protocols. The modem 204 performs encoding/decoding, signal modulation/demodulation, equalization, etc. The RF module 205 receives and transmits wireless signals and includes but is not limited to an antenna, at least one amplifier, a coupler, and a duplexer. The RF module 205 cooperates with the modem 204 to implement wireless communication functions. The modem 204 may be a separate chip or be combined with other chips or circuits into a system-on-chip or integrated circuit. Such chips or integrated circuits can be used in all devices implementing wireless communication functions, including phones, computers, notebooks, tablets, routers, wearables, cars, home appliances, etc.
The device 200 may also use the Wi-Fi module 206, the Bluetooth module 207, etc. for wireless communication. The Wi-Fi module 206 provides the device 200 with network access compliant with Wi-Fi standard protocols; the device 200 may connect to a Wi-Fi access point through the Wi-Fi module 206 and thereby access the Internet. In some other embodiments, the Wi-Fi module 206 can also serve as a Wi-Fi wireless access point, providing Wi-Fi network access for other devices. The Bluetooth module 207 implements short-range communication between the device 200 and other devices (e.g., phones, smart watches). The Wi-Fi module 206 in the embodiments of this application may be an integrated circuit or Wi-Fi chip, and the Bluetooth module 207 an integrated circuit or Bluetooth chip.
The positioning module 210 determines the geographic location of the device 200. It can be understood that the positioning module 210 may specifically be a receiver for a positioning system such as the Global Positioning System (GPS), the BeiDou navigation satellite system, or Russia's GLONASS.
The Wi-Fi module 206, the Bluetooth module 207, and the positioning module 210 may each be separate chips or integrated circuits, or may be integrated together. For example, in one embodiment, the Wi-Fi module 206, the Bluetooth module 207, and the positioning module 210 may be integrated on the same chip. In another embodiment, the Wi-Fi module 206, the Bluetooth module 207, the positioning module 210, and the MCU 202 may also be integrated into the same chip.
输入/输出设备209包括但不限于:显示器2091、触摸屏2092,以及音频电路2093等等。
其中,触摸屏2092可采集设备200的用户在其上或附近的触摸事件(比如用户使用手指、触控笔等任何适合的物体在触摸屏2092上或在触控屏触摸屏2092附近的操作),并将采集到的触摸事件发送给其他器件(例如应用处理器201)。其中,用户在触摸屏2092附近的操作可以称之为悬浮触控;通过悬浮触控,用户可以在不直接接触触摸屏2092的情况下选择、移动或拖动目的(例如图标等)。此外,可以采用电阻式、电容式、红外线以及表面声波等多种类型来实现触摸屏2092。
显示器(也称为显示屏)2091用于显示用户输入的信息或展示给用户的信息。可以采用液晶显示屏、有机发光二极管等形式来配置显示器。触摸屏2092可以覆盖在显示器2091之上,当触摸屏2092检测到触摸事件后,传送给应用处理器201以确定触摸事件的类型,随后应用处理器201可以根据触摸事件的类型在显示器2091上提供相应的视觉输出。虽然在图2中,触摸屏2092与显示器2091是作为两个独立的部件来实现设备200的输入和输出功能,但是在某些实施例中,可以将触摸屏2092与显示器2091集成而实现设备200的输入和输出功能。另外,触摸屏2092和显示器2091可以以全面板的形式配置在设备200的正面,以实现无边框的结构。
音频电路2093、扬声器2094、麦克风2095可提供用户与设备200之间的音频接口。音频电路2093可将接收到的音频数据转换为电信号，传输到扬声器2094，由扬声器2094转换为声音信号输出；另一方面，麦克风2095将收集的声音信号转换为电信号，由音频电路2093接收后转换为音频数据，再通过modem 204和射频模块205将音频数据发送给另一设备，或者将音频数据输出至存储器203以便进一步处理。
另外，设备200还可以具有指纹识别功能。例如，可以在设备200的背面（例如后置摄像头的下方）配置指纹采集器件，或者在设备200的正面（例如触摸屏2092的下方）配置指纹采集器件。又例如，可以在触摸屏2092中配置指纹采集器件来实现指纹识别功能，即指纹采集器件可以与触摸屏2092集成在一起来实现设备200的指纹识别功能。在这种情况下，该指纹采集器件配置在触摸屏2092中，可以是触摸屏2092的一部分，也可以以其他方式配置在触摸屏2092中。本申请实施例中的指纹采集器件的主要部件是指纹传感器，该指纹传感器可以采用任何类型的感测技术，包括但不限于光学式、电容式、压电式或超声波传感技术等。
进一步地，设备200搭载的操作系统211可以为Android等操作系统，或者其它操作系统，本申请实施例对此不作任何限制。
以搭载Android操作系统的设备200为例，设备200从逻辑上可划分为硬件层、操作系统211，以及应用层。硬件层包括如上所述的应用处理器201、MCU 202、存储器203、modem 204、Wi-Fi模块206、传感器208、定位模块210等硬件资源。应用层包括一个或多个应用程序，比如应用程序212，应用程序212可以为社交类应用、电子商务类应用、浏览器等任意类型的应用程序。操作系统211作为硬件层和应用层之间的软件中间件，是管理和控制硬件与软件资源的计算机程序。
在一个实施例中,操作系统211包括内核,硬件抽象层(hardware abstraction layer,HAL)、库和运行时(libraries and runtime)以及框架(framework)。其中,内核用于提供底层系统组件和服务,例如:电源管理、内存管理、线程管理、硬件驱动程序等;硬件驱动程序包括Wi-Fi驱动、传感器驱动、定位模块驱动等。硬件抽象层是对内核驱动程序的封装,向框架提供接口,屏蔽低层的实现细节。硬件抽象层运行在用户空间,而内核驱动程序运行在内核空间。
库和运行时也叫做运行时库,它为可执行程序在运行时提供所需要的库文件和执行环境。在一个实施例中,库与运行时包括安卓运行时(android runtime,ART),库,以及场景包运行时。ART是能够把应用程序的字节码转换为机器码的虚拟机或虚拟机实例。库是为可执行程序在运行时提供支持的程序库,包括浏览器引擎(比如webkit)、脚本执行引擎(比如JavaScript引擎)、图形处理引擎等。场景包运行时是场景包的运行环境,主要包括页面执行环境(page context)和脚本执行环境(script context),其中,页面执行环境通过调用相应的库解析html、css等格式的页面代码,脚本执行环境通过调用相应的功能库解析执行JavaScript等脚本语言实现的代码或可执行文件。
框架用于为应用层中的应用程序提供各种基础的公共组件和服务,比如窗口管理、位置管理等等。在一个实施例中,框架包括地理围栏服务,策略服务,通知管理器等。
以上描述的操作系统211的各个组件的功能均可以由应用处理器201执行存储器203中存储的程序来实现。
所属领域的技术人员可以理解设备200可包括比图2所示的更少或更多的部件,图2所示的该设备仅包括与本申请所公开的多个实现方式更加相关的部件。
图3为本申请URL投屏方法实施例一的流程图。该过程300可由源端设备、目的端设备和媒体服务器执行。过程300描述为一系列的步骤或操作,应当理解的是,过程300可以以各种顺序执行和/或同时发生,不限于图3所示的执行顺序。如图3所示,本实施例的方法可以包括:
步骤301、源端设备根据投屏切换时的播放进度确定第一码流和下载指示信息。
需要说明的是,本申请中Sender是指URL投屏的发起端,例如手机、平板、计算机等设备,Receiver是指URL投屏的接收端,例如电视机、机顶盒、AV功放、ChromeCast、 计算机等设备。通常URL投屏过程是从Sender向Receiver发送投屏请求开始,Sender可以在该投屏请求中携带上正在播放的视频的URL地址,Receiver根据URL地址从媒体服务器下载与URL地址对应的视频码流并播放。因此Sender和Receiver是固定的。而源端设备(SourcePlayer)是指投屏切换前播放视频的一端,目的端设备(DestPlayer)是指投屏切换后应播放视频的一端。在启动投屏时,由Sender发起将播放视频的权限交给Receiver,此时SourcePlayer是Sender,DestPlayer是Receiver;在终止投屏时,Receiver将播放视频的权限交还给Sender,此时SourcePlayer是Receiver,DestPlayer是Sender。即SourcePlayer和DestPlayer是相对的,会根据投屏的不同阶段而在Sender和Receiver之间发生切换。
网络媒体播放通常采用流媒体技术,其采用流式传输技术在网络上连续实时播放的媒体格式,例如音频、视频或多媒体文件。流媒体技术就是把连续的媒体数据经过压缩处理后以媒体分片的方式统一存储于媒体服务器,每个媒体分片包括多个图像帧对应的码流。在播放时,媒体服务器向用户的终端设备顺序或实时地传送视频的媒体分片,终端设备一边下载一边播放,无需等待整个视频文件下载完成。终端设备会创建一个缓存区,在播放前预先下载一段视频码流作为缓存,当网路实际连线速度小于播放所耗的速度时,播放器就会取用一小段缓存区内的码流播放,避免视频卡顿。
由此可见,源端设备在播放视频的过程中,同时在做两件事:播放和下载缓存。为了保证视频的连贯性,通常在同一时刻上源端设备正在播放的图像帧的时序要早于正在下载的图像帧的时序,亦即针对同一个图像帧,下载缓存该图像帧的时间要早于播放该图像帧的时间。图4示出了图像帧的一个示例性的序列示意图,如图4所示,Fc为投屏切换时源端设备正在播放的图像帧,Fd为投屏切换时源端设备已经下载完成的最后一个图像帧,Fk为Fc对应的关键帧,Fn为Fd的下一个图像帧,该示例中,Fk、Fc、Fd和Fn属于同一个媒体分片Si。图5示出了图像帧的一个示例性的序列示意图,如图5所示,Fc为投屏切换时源端设备正在播放的图像帧,Fd为投屏切换时源端设备已经下载完成的最后一个图像帧,Fk为Fc对应的关键帧,Fn为Fd的下一个图像帧,该示例中,Fk和Fc属于同一个媒体分片Si,Fd和Fn属于同一个媒体分片Sj,Si和Sj是不同的媒体分片,Si和Sj可以相邻,也可以相隔一个或多个其他的媒体分片,本申请对此不作具体限定。图6示出了图像帧的一个示例性的序列示意图,如图6所示,Fc为投屏切换时源端设备正在播放的图像帧,Fd为投屏切换时源端设备已经下载完成的最后一个图像帧,Fk为Fc对应的关键帧,Fn为Fd的下一个图像帧,该示例中,Fk和Fc属于同一个媒体分片Si,Fd属于媒体分片Sl,Fn属于媒体分片Sj,Si、Sl和Sj是不同的媒体分片,Sl和Sj一定是相邻的,Si和Sl可以相邻,也可以相隔一个或多个其他的媒体分片,本申请对此不作具体限定。需要说明的是,上述示例中Fk和Fc属于同一个媒体分片Si,应理解的,Fk和Fc也可以属于不同的媒体分片,当媒体分片很小时,媒体分片中也不总是包含关键帧。另外,图4-图6示例性的示出了投屏切换前源端设备已经下载的图像帧的时序关系,以及图像帧和媒体分片可能存在的对应关系,但这并非对本申请涉及到的视频码流造成限定。
为了保证视频的连贯性,Fd是晚于Fc的,而且可以认为从Fc到Fd的图像帧还没有被源端设备播放。源端设备从媒体服务器下载的码流包括了Fc之前的图像帧的信息,以及从Fc到Fd的图像帧的信息。
投屏切换时源端设备正在播放图像帧Fc,那么在投屏时间同步这一需求下,投屏切换完成后目的端设备要从图像帧Fc开始向后播放。关于目的端设备从视频的哪个位置开始播放可以借助URL协议中的寻找(seek)指令实现。可选的,本申请也可以采用新的消息实现,例如时间指示信息,通过该消息指示投屏切换时正在播放的图像帧。
目的端设备要从图像帧Fc开始播放,同样需要经过和源端设备相同的过程,即在播放前预先从媒体服务器下载一段视频码流作为缓存,采用这段缓存开始播放视频,在播放视频的过程中,一边播放一边下载后续图像帧的缓存。应理解的,目的端设备从图像帧Fc开始播放,其要下载的缓存至少包括从Fc到Fd的图像帧的码流。
本申请中源端设备可以通过和目的端设备之间的通信网络，将己方已经缓存的码流发送给目的端设备。由于通常URL投屏发生在两个位于同一局域网内的设备之间，因此源端设备向目的端设备发送上述码流可以藉由局域网实现，通过局域网传输数据的速度会更快，所需时间更短。而源端设备发送给目的端设备的缓存码流中，为了减少传输的数据量，源端设备可以只把已经从媒体服务器下载、且尚未播放的图像帧的码流发送给目的端设备，这部分码流可以称作第一码流。
而目的端设备除了第一码流,还需要缓存后续码流以确保视频的正常播放,这部分码流可以称作第二码流,目的端设备可以从媒体服务器下载第二码流。本申请中目的端设备无法确定第二码流从哪个图像帧开始,因此源端设备可以确定一个下载指示信息,通过该下载指示信息指示第二码流的起始帧。
理论上讲,第一码流的起始帧应该是Fc,但是根据媒体的编解码技术和关键帧的作用,如果Fc不是关键帧,那么在目的端设备仅凭Fc的信息是无法解码出Fc对应的图像,必须还要有Fc对应的关键帧Fk(Fk是早于Fc、且距离Fc最接近的一个关键帧),藉由Fk的信息才能完整的解码出Fc对应的图像。Fc之后的图像帧亦是如此。因此即使投屏切换时Fk已经播放过了,为了确保第一码流的信息完整性,Fc对应的关键帧Fk是必须发送给目的端设备的,因此第一码流的起始帧可以为Fk。而Fk之前的图像帧,由于在投屏切换时这些图像帧已经在源端设备上播放过了,目的端设备要实现投屏时间同步,从Fc开始播放即可,在Fc之前已经播放过的图像帧对目的端设备来讲是不需要的。综上所述,第一码流的起始帧可以是Fk,既减少了源端设备和目的端设备之间的传输的数据量,又提供足够的图像信息,确保图像的有效解码。
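按上述描述，Fk是早于Fc、且距离Fc最近的一个关键帧。若源端已知升序排列的关键帧帧号列表，该查找可以示意如下（列表内容与变量名均为假设）：

```python
import bisect

def find_keyframe(keyframes, Fc):
    """在升序的关键帧帧号列表 keyframes 中，
    返回不晚于 Fc 且最接近 Fc 的关键帧帧号 Fk。"""
    i = bisect.bisect_right(keyframes, Fc)  # 第一个晚于Fc的位置
    if i == 0:
        raise ValueError("Fc 之前没有关键帧")
    return keyframes[i - 1]

keyframes = [0, 30, 60, 90]          # 假设的关键帧位置
assert find_keyframe(keyframes, 65) == 60   # Fc=65 对应 Fk=60
```

当Fc本身即为关键帧时，该函数返回Fc自身，与正文中"若Fc是关键帧则可直接从Fc开始"的含义一致。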
第一码流的结束帧可以包括两种可能的情况:(1)第一码流的结束帧为投屏切换时源端设备已经下载完成的最后一个图像帧Fd。该情况下表示源端设备会把已经从媒体服务器下载、且尚未播放的所有图像帧的码流全部发送给目的端设备,即从Fc到Fd的图像帧的码流。(2)第一码流的结束帧为Fc所属媒体分片的最后一个图像帧Fe。根据上述对流媒体技术的描述,连续的媒体数据经过压缩处理后是以媒体分片的方式统一存储,每个媒体分片包含多个图像帧。为了确保目的端可以实现投屏时间同步,也为了符合媒体数据的存储要求,源端设备发送给目的端设备的第一码流可以结束于Fe。而该情况还有一个前提即Fd和Fc属于不同的媒体分片,这样才能保证Fe早于Fd,否则若Fd和Fc属于同一个媒体分片,则在投屏切换时源端设备只下载到Fd,Fd和Fe不一定是同一个图像帧,源端设备也就无法把Fe发送给目的端设备。
基于视频的存储格式,设备从媒体服务器下载视频码流都是以媒体分片为单位,一个 分片一个分片的依时间先后逐个下载的,因此无论是源端设备,还是目的端设备,从媒体服务器下载码流的起始帧一定是某一个媒体分片的第一个图像帧。通常源端设备从媒体服务器下载的码流的起始帧是视频的第一个媒体分片的第一个图像帧,而目的端设备要从媒体服务器下载的码流(第二码流)的起始帧与第一码流的结束帧有关。第二码流的起始帧可以是第一码流的结束帧的下一个图像帧所属媒体分片的第一个图像帧。针对情况(1),第一码流的结束帧为投屏切换时源端设备已经下载完成的最后一个图像帧Fd,那么最理想的状态是第二码流从Fd的下一个图像帧Fn开始,但考虑到媒体服务器上的媒体分片处理,第二码流的起始帧可以是Fn所属媒体分片的第一个图像帧。此时第一码流的结束部分和第二码流的开始部分可能会存在重叠,例如,图5中Fd和Fn属于同一个媒体分片Sj,第一码流的结束帧为Fd,第二码流的起始帧为Fn所属媒体分片的第一个图像帧,该第一个图像帧要么就是Fd,要么就早于Fd,这样第一码流和第二码流就会有部分重叠。又例如,图6中Fd属于媒体分片Sl,Fn属于媒体分片Sj,Sl和Sj是不同的媒体分片,而Fd是Fn的上一个图像帧,因此第一码流的结束帧Fd刚好是媒体分片Sl的最后一个图像帧,Fn刚好是相邻媒体分片Sj的第一个图像帧,这样第一码流和第二码流不会有重叠。针对情况(2),第一码流的结束帧为Fc所属媒体分片的最后一个图像帧Fe,那么最理想的状态是第二码流从Fe的下一个图像帧开始,但考虑到媒体服务器上的媒体分片处理,第二码流的起始帧可以是Fe的下一个图像帧所属媒体分片的第一个图像帧。由于第一码流的结束帧是媒体分片的最后一个图像帧Fe,那么Fe的下一个图像帧是相邻媒体分片的第一个图像帧,而该第一个图像帧是第二码流的起始帧,因此第一码流和第二码流不会有重叠。
基于局域网传输的优势,本申请中源端设备可以将己方已经缓存的码流尽可能多的通过局域网传输给目的端设备,这样可以减少目的端设备从媒体服务器下载的第二码流的数据量。因此第二码流的起始帧可以是Fn所属媒体分片的第一个图像帧F1,Fn为投屏切换时源端设备已经下载完成的最后一个图像帧Fd的下一个图像帧。这是第二码流可以设置的最晚的一个起始帧,若再往后的媒体分片开始下载,必然无法实现投屏时间同步。因此第一码流的结束帧与第二码流的起始帧有关,即第一码流的结束帧为F1的上一个图像帧F0。这种情况下,第一码流和第二码流也不会有重叠。
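上述"F1为Fn所属分片首帧、F0为F1的上一帧"的边界计算可以示意如下（沿用前述每分片帧数固定的假设）：

```python
FRAMES_PER_SEGMENT = 30  # 假设每个媒体分片包含固定数量的图像帧

def stream_boundaries(Fd):
    """按正文所述规则计算：Fn=Fd+1，F1为Fn所属分片的第一个图像帧，F0=F1-1。
    返回(第一码流结束帧F0, 第二码流起始帧F1)。"""
    Fn = Fd + 1
    F1 = (Fn // FRAMES_PER_SEGMENT) * FRAMES_PER_SEGMENT
    F0 = F1 - 1
    return F0, F1

# Fd=95：Fn=96属分片3（首帧90），F1=90，F0=89，两段码流不重叠
assert stream_boundaries(95) == (89, 90)
# Fd=89：Fd恰为分片末帧，F1=90，F0=89=Fd，对应图6的情形
assert stream_boundaries(89) == (89, 90)
```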
步骤302、源端设备向目的端设备发送下载指示信息。
步骤303、源端设备向目的端设备发送第一码流。
通过源端设备和目的端设备之间的通信网络,源端设备可以将下载指示信息和第一码流发送给目的端设备。
源端设备可以向目的端设备发送播放时间指示信息,该播放时间指示信息用于指示投屏切换时正在播放的图像帧。源端设备发送播放时间指示信息的目的在于告知目的端设备投屏切换时己方的播放进度,以便于目的端设备可以根据该播放进度确定从哪一帧开始播放视频,实现两侧的投屏时间同步。
在一种可能的实现方式中,源端设备可以根据设定的封装格式对第一码流进行封装,得到格式数据,然后源端设备将该格式数据以及控制信息发送给目的端设备,控制信息中包括前述封装格式。第一码流包括的图像帧可能跨越多个媒体分片,源端设备可以将第一码流封装成一个媒体分片,例如传输流(Transport Stream,TS)、FMP4分片等,以简化 传输数据的组织。可选的,源端设备也可以将第一码流分成多个媒体分片发送,本申请对此不作具体限定。本申请中的封装对象是从媒体服务器下载缓存的视频码流,该封装处理可以提升投屏时间同步的性能,其并不涉及图像的编解码,从而可以降低算法的难度和对硬件的要求。
步骤304、目的端设备从媒体服务器下载第二码流。
下载指示信息指示了第二码流的起始帧,因此目的端设备根据该下载指示信息从媒体服务器下载从该起始帧开始的连续图像帧的码流。如前所述,第二码流的起始帧已经确定为是某一媒体分片的第一个图像帧,因此目的端设备可以向媒体服务器请求从该媒体分片开始下载。
可选的,下载指示信息可以包括第二码流的起始帧所属的媒体分片的信息,目的端设备可以根据该信息确定起始帧所属的媒体分片,进而确定起始帧。
可选的,下载指示信息可以包括第二码流的起始帧的信息,目的端设备可以根据该信息直接确定起始帧,进而确定该起始帧所属的媒体分片。
本申请对于上述下载指示信息包括的信息不做具体限定。
步骤305、目的端设备根据第一码流或者第二码流播放视频。
目的端设备接收第一码流和从媒体服务器下载第二码流是两个独立的过程,因此二者可以同时进行。由于第一码流整体位于第二码流之前,因此目的端设备可以从第一码流开始播放。而在接收第一码流和第二码流的过程中,如果接收到的第一码流的数据量到达足够用于播放的数据量,目的端设备就可以根据第一码流从投屏切换时的播放进度将视频播放起来,这样有利于提高投屏切换响应速度。当第一码流播放完后目的端设备也已经积累了足够的第二码流,根据第二码流接着往后播放视频,确保了视频播放的流畅。
上述过程中,为了实现投屏时间同步,目的端设备可以从第一码流或者第二码流中确定投屏切换后待播放的首帧图像帧,并从首帧图像帧开始播放视频。首帧图像帧正是投屏时间同步的关键。如上所述,源端设备为了投屏时间同步,向目的端设备发送用于指示投屏切换时正在播放的图像帧Fc的播放时间指示信息,目的端设备根据该播放时间指示信息,首先可以确定待播放的首帧图像帧为Fc,然后根据第一码流播放从Fc到第一码流的结束帧的视频,根据第二码流播放从Fm开始的视频,Fm为第一码流的结束帧的下一个图像帧。
如上所述,第一码流的结束部分和第二码流的开始部分可能会存在重叠,也可能第一码流和第二码流完全不重叠。如果是不重叠的情况,目的端设备播放完第一码流后,直接根据第二码流开始播放,第一码流的结束帧是第二码流的起始帧的上一个图像帧,因此目的端设备可以实现从第一码流到第二码流的无缝切换。如果是有重叠的情况,在重叠的部分,目的端设备可以选择根据第一码流播放,即根据第一码流播放从Fc到第一码流的结束帧的视频,根据第二码流播放从Fm开始的视频。需要说明的是,目的端设备也可以在有重叠的部分选择根据第二码流播放,即根据第一码流播放从Fc到第二码流的起始帧的上一个图像帧的视频,放弃掉第一码流中从第二码流的起始帧到第一码流的结束帧这部分码流,根据第二码流播放从第二码流的起始帧开始的视频。本申请对此不作具体限定。
步骤301-305主要描述第一码流的总数据量大于或等于设定阈值时的投屏切换过程,设定阈值用于表征第一码流的总数据量是否足够支持视频的播放。在一种可能的实现方式 中,如果投屏切换时,源端设备从媒体服务器下载的缓存码流的数据量较小,小于设定阈值,即使发送给目的端设备,目的端设备仍然需要从媒体服务器下载几乎全部的码流,或者根据第一码流无法播放视频,又或者传输第一码流所产生的带宽消耗、传输时延等代价高于直接从媒体服务器下载所产生的代价,此时源端设备可以向目的端设备发送数据量指示信息,该数据量指示信息用于通知第一码流的总数据量小于设定阈值。该情况下,源端设备可以选择将第一码流发送给目的端设备,也可以选择不发送第一码流,该过程可以通过高层配置或双方交互信息来协商,此处不再赘述。
对于源端设备将第一码流发送给目的端设备的情况，目的端设备可以从媒体服务器下载第四码流，该第四码流的起始帧为Fn所属媒体分片的第一帧。即从第一码流的结束帧的下一帧所属媒体分片的第一个图像帧开始下载第四码流。
对于源端设备不发送第一码流的情况，目的端设备可以从媒体服务器下载第三码流，该第三码流的起始帧为投屏切换时源端设备正在播放的图像帧Fc所属媒体分片的第一个图像帧。由于没有第一码流，因此为了保证视频的正常播放，同时为了实现投屏时间同步，目的端设备可以从Fc所属媒体分片的第一个图像帧开始下载第三码流。
本申请在投屏切换时,源端设备将已经缓存的码流发送给目的端设备,一方面缓存码流通过局域网传输,基于局域网自身的优势,包括具有更高的传输带宽和更优的QOS等,可以实现缓存码流的快速、稳定传输,另一方面目的端设备一边接收来自源端设备的缓存码流,一边从媒体服务器下载缓存码流之后的码流,既可以快速基于已有的码流开始播放视频,提高投屏切换响应速度,又可以确保视频播放时存储足够的码流,防止视频播放卡顿。
需要说明的是，上述方法实施例描述了投屏切换时，源端设备将第一码流和下载指示信息发送给目的端设备，进而实现投屏切换的过程。当该过程发生于启动投屏时，上述源端设备可以是作为Sender的手机、平板、计算机等设备，目的端设备可以是作为Receiver的电视机、机顶盒、AV功放、ChromeCast、计算机等设备。当该过程发生于终止投屏时，上述源端设备可以是作为Receiver的电视机、机顶盒、AV功放、ChromeCast、计算机等设备，目的端设备可以是作为Sender的手机、平板、计算机等设备。因此源端设备和目的端设备的角色可以在投屏的不同阶段在Sender和Receiver之间切换，本申请对此不作具体限定。
图7为本申请URL投屏方法实施例二的流程图。该过程700可由源端设备、目的端设备和媒体服务器执行。过程700描述为一系列的步骤或操作,应当理解的是,过程700可以以各种顺序执行和/或同时发生,不限于图7所示的执行顺序。如图7所示,本实施例的方法可以包括:
步骤701、用户通过Sender启动投屏。
该过程可以采用任意一种支持URL投屏协议的应用程序、视频平台、播放器程序等技术,本申请对此不作具体限定。
步骤702、Sender向Receiver发送投屏请求。
投屏请求中可以包括需要播放的视频的URL地址,以及seek指令等,告知Receiver当前播放视频的地址,当前视频的播放进度。
步骤703、Receiver向Sender发送投屏响应。
如果Receiver正确接收到了Sender发送的投屏请求，Receiver可以反馈确认响应（Acknowledge，ACK），否则Receiver可以反馈否认响应（Negative Acknowledge，NAK）。
步骤704、Sender根据当前播放进度确定第一码流和下载指示信息。
步骤704可以参考上述步骤301,此处不再赘述。
步骤705、Sender向Receiver发送媒体信息。
Sender确定的第一码流包括第一码流的起始帧和结束帧,以及第一码流的封装格式,还可以包括第一码流的组织和存储方式。Sender可以将这些信息,以及上述下载指示信息全都作为媒体信息发送给Receiver。可选的,媒体信息可以采用Java脚本对象简谱(Java Script Object Notation,JSON)或者可扩展标记语言(Extensible Markup Language,XML)等描述方式。应当理解的,媒体信息也可以采用其他信息描述方式,本申请对此不作具体限定。
步骤706、Sender向Receiver发送第一码流。
步骤706可以参考上述步骤303,此处不再赘述。
步骤707、Receiver向媒体服务器发送下载请求。
下载请求中可以包括第二码流的起始帧。
步骤708、Receiver从媒体服务器下载第二码流。
步骤708可以参考上述步骤304,此处不再赘述。
步骤709、当第一码流的数据量足够时,Receiver启动播放视频。
Receiver在接收第一码流和第二码流的过程中,如果接收到的第一码流的数据量到达足够用于播放的数据量,例如500ms的视频码流,可以根据第一码流从投屏切换时的播放进度将视频播放起来,这样有利于提高投屏切换响应速度。
步骤709可以参考上述步骤305,此处不再赘述。
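"数据量足够再启动播放"的判断可以示意如下（500ms为正文给出的示例阈值）：

```python
def can_start_playback(buffered_ms, threshold_ms=500.0):
    """当已接收的第一码流对应的可播放时长达到阈值时，启动播放。"""
    return buffered_ms >= threshold_ms

assert not can_start_playback(200)   # 缓存不足，继续接收
assert can_start_playback(500)       # 达到500ms，启动播放
```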
步骤710、用户通过Sender终止投屏。
该过程可以采用任意一种支持URL投屏协议的应用程序、视频平台、播放器程序等技术,本申请对此不作具体限定。
可选的,用户也可以通过Receiver终止投屏,同样也可以采用任意一种支持URL投屏协议的应用程序、视频平台、播放器程序等技术,本申请对此不作具体限定。
步骤711、Sender向Receiver发送终止投屏请求。
步骤712、Receiver向Sender发送终止投屏响应。
终止投屏响应中可以包括seek指令等,告知Sender当前视频的播放进度。
步骤713、Receiver根据当前播放进度确定第一码流和下载指示信息。
步骤713可以参考上述步骤301,此处不再赘述。
步骤714、Receiver向Sender发送媒体信息。
步骤714可以参考上述步骤705,此处不再赘述。
步骤715、Receiver向Sender发送第一码流。
步骤715可以参考上述步骤303,此处不再赘述。
步骤716、Sender向媒体服务器发送下载请求。
下载请求中可以包括第二码流的起始帧。
步骤717、Sender从媒体服务器下载第二码流。
步骤717可以参考上述步骤304,此处不再赘述。
步骤718、当第一码流的数据量足够时,Sender启动播放视频。
Sender也可以在接收第一码流和第二码流的过程中,如果接收到的第一码流的数据量到达足够用于播放的数据量,可以根据第一码流从投屏切换时的播放进度将视频播放起来,这样有利于提高投屏切换响应速度。
步骤718可以参考上述步骤305,此处不再赘述。
需要说明的是,图3和图7所示的实施例是本申请提供的URL投屏方法的常规流程,其可以作为对相关投屏协议的扩展,即在相关投屏协议的基础上做出扩展以实现本申请提供的URL投屏方法。应当理解的,相关协议中规定的异常情况、或其他本申请实施例中未涉及的流程也可以被本申请提供的URL投屏方法所支持和兼容。
例如,Sender和Receiver中一端或两端不支持本申请提供的URL投屏方法。如果是Receiver不支持该URL投屏方法,Receiver可以通过步骤703中的投屏响应通知Sender这一情况,如果是Sender不支持该URL投屏方法,那么Sender就不会向Receiver发送第一码流。任何一方不支持该URL投屏方法,Sender和Receiver就会执行相关技术中的投屏过程,实现该URL投屏方法和相关投屏方法的兼容。
又例如,Receiver开始播放视频后,向Sender定期报告当前播放状态(包括播放位置),如果用户通过拖动的方式调整了播放进度,Receiver也可以报告给Sender。
又例如,Sender启动投屏,Receiver已经开始播放视频,此时用户关闭了Sender上的播放器、应用程序等,这样用户就无法在Sender上操作终止投屏。该情况下,用户可以通过Receiver终止投屏,也可以在Sender上重新打开播放器、应用程序等来终止投屏。
又例如,第一码流可以是由源端设备主动发送给目的端设备,也可以是目的端设备在接收到投屏切换请求后,向源端设备请求拉取。本申请对此不作具体限定。
又例如,第一码流的传输可以采用传输控制协议(Transmission Control Protocol,TCP)、超文本传输协议(Hyper Text Transfer Protocol,HTTP)、WebSocket或各种私有协议,本申请对此不作具体限定。
又例如,上述相关协议可以包括DLNA、AirPlay、GoogleCast等URL投屏协议,本申请提供的URL投屏方法对这些协议做出的扩展要考虑在Sender和Receiver两端均有布署。而对不同的协议可以根据协议的特点采用不同的扩展方案,以实现协议的兼容性扩展。
以下以WebSocket协议为例，描述本申请提供的URL投屏方法中传输第一码流的关键点。虽然直接使用TCP更为简单，但目前越来越多的播放器选择以网页应用程序（WebApp）的方式运行于浏览器中，而浏览器没有提供直接访问TCP的标准接口。
Sender总是作为WebSocket的客户端（Client），主动发起WebSocket连接。Receiver总是作为WebSocket的服务器端（Server）被动监听连接。尽可能由Receiver动态选择WebSocket Server端口（port），向Sender报告。这种方式依赖一个前提：Receiver在响应Sender投屏请求时，能兼容性扩展投屏响应消息，携带其选择的port。DLNA通过SOAP SetAVTransportURI发起投屏请求，投屏响应为XML格式，可兼容性扩展。AirPlay通过HTTP POST /play发起投屏请求，投屏响应无HTTP body，可增加HTTP body或HTTP header，实现兼容性扩展。GoogleCast比较特别，Sender通过SDK发起投屏请求后，两端可以通过SDK建立WebSocket连接，交互消息，可以直接复用该WebSocket连接来发送第一码流。对GoogleCast这类情况，无须按下述方式建立WebSocket连接，而视为投屏启动后，WebSocket连接已经建立。
Sender发起投屏时,在投屏请求中携带扩展参数,表明Sender支持本方案。Receiver如果支持本方案,将能识别投屏请求中的扩展参数。在响应Sender前,先启动WebSocket Server,再向Sender发送投屏响应,投屏响应中携带WebSocket Server Port。Sender识别出Receiver响应的WebSocket Server Port后,发起WebSocket连接请求。Receiver返回的WebSocket Server Port在一次完整投屏过程中均有效,Receiver在该期间一直监听该Port,等待连接。一次完整投屏过程指:自Sender发起投屏请求后开始,至用户通过Sender或Receiver终止投屏,且完成第一码流的反向传输(或超时,或Sender发起新的投屏请求)为止的时间范围。
Receiver检测到终止投屏请求后,会等待Sender获取己方缓存的第一码流,并启动定时器。当第一码流传输完成,或定时器到期,或监听到相同Sender又发起新的投屏请求时,Receiver判定当前投屏过程结束,终止WebSocket Server,对应Port随即失效。由Sender决定维持长连接还是短连接。长连接指Sender完成第一码流发送后,不断开WebSocket连接。待投屏终止时,用该连接获取反向第一码流,完成后再终止连接。短连接指Sender完成第一码流发送后,立刻断开WebSocket连接。在投屏终止时,再重新建立与Receiver的WebSocket连接获取反向第一码流,完成后终止连接。
本申请定义新的消息传输媒体信息和第一码流,该消息可以具有统一的固定长度的消息头,用来携带标识消息的类型和响应码,后面紧跟可选的消息体。以下示例性的给出了几种新消息的作用,应当理解的,以下新消息为本申请提供的URL投屏方法对相关协议做出扩展所需要的几种示例,但并非构成限定。
PutCacheMeta:启动投屏时,Sender与Receiver建立WebSocket连接后,Sender发送PutCacheMeta消息,消息体中携带媒体信息。
PutCacheMedia:Sender收到Receiver对PutCacheMeta的ACK后,发送PutCacheMedia消息,PutCacheMedia消息中携带第一码流。
GetCacheMeta：终止投屏时，Sender发送GetCacheMeta消息，Receiver在响应消息中携带媒体信息。
GetCacheMedia：终止投屏时，Sender发送GetCacheMedia消息，Receiver回复第一码流。
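上述"统一的固定长度消息头携带消息类型和响应码，后跟可选消息体"的组织方式可以示意如下（字段宽度、字节序与类型编号均为假设）：

```python
import struct

# 假设的消息头布局：1字节消息类型 + 1字节响应码 + 4字节消息体长度（大端）
HEADER = struct.Struct(">BBI")
MSG_PUT_CACHE_META = 1    # 假设的类型编号
MSG_PUT_CACHE_MEDIA = 2   # 假设的类型编号

def pack_message(msg_type, resp_code, body=b""):
    """按固定长度消息头 + 可选消息体的格式组包。"""
    return HEADER.pack(msg_type, resp_code, len(body)) + body

def unpack_message(data):
    """解析消息，返回(消息类型, 响应码, 消息体)。"""
    msg_type, resp_code, length = HEADER.unpack_from(data)
    return msg_type, resp_code, data[HEADER.size:HEADER.size + length]

raw = pack_message(MSG_PUT_CACHE_META, 0, b'{"format":"TS"}')
t, r, body = unpack_message(raw)
assert (t, r, body) == (MSG_PUT_CACHE_META, 0, b'{"format":"TS"}')
```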
基于图7所示的流程图,结合上述几种新的消息,图8示出了本申请URL投屏方法的一个可能的实施例。图8为本申请URL投屏方法实施例三的流程图。该过程800可由源端设备、目的端设备和媒体服务器执行。过程800描述为一系列的步骤或操作,应当理解的是,过程800可以以各种顺序执行和/或同时发生,不限于图8所示的执行顺序。如图8所示,本实施例的方法可以包括:
步骤801、用户通过Sender启动投屏。
步骤802、Sender向Receiver发送投屏请求。
步骤803、Receiver向Sender发送投屏响应。
如果Receiver正确接收到了Sender发送的投屏请求,Receiver可以反馈确认响应(Acknowledge,ACK),否则Receiver可以反馈否认响应(Negative Acknowledge,NAK)。Receiver可以在ACK中携带WebSocket Server Port等扩展信息。
步骤804、Sender向Receiver请求建立WebSocket连接。
步骤805、Receiver向Sender发送建连响应。
如果Receiver建连成功,则回复ACK,否则回复NAK。
步骤806、Sender根据当前播放进度确定第一码流和下载指示信息。
步骤807、Sender向Receiver发送PutCacheMeta消息。
PutCacheMeta消息中携带媒体信息。
步骤808、Sender向Receiver发送PutCacheMedia消息。
PutCacheMedia消息携带第一码流。
步骤809、Receiver向媒体服务器发送下载请求。
步骤810、Receiver从媒体服务器下载第二码流。
步骤811、当第一码流的数据量足够时,Receiver启动播放视频。
步骤812、用户通过Sender终止投屏。
步骤813、Sender向Receiver发送终止投屏请求。
步骤814、Receiver向Sender发送终止投屏响应。
步骤815、Receiver根据当前播放进度确定第一码流和下载指示信息。
步骤816、Sender向Receiver发送GetCacheMeta消息。
步骤817、Receiver在响应消息中携带媒体信息。
步骤818、Sender向Receiver发送GetCacheMedia消息。
步骤819、Receiver向Sender发送第一码流。
步骤820、Sender向媒体服务器发送下载请求。
步骤821、Sender从媒体服务器下载第二码流。
步骤822、当第一码流的数据量足够时,Sender启动播放视频。
以上介绍了本申请的URL投屏方法，以下介绍本申请的装置。本申请的装置包括应用于源端设备的投屏装置和应用于目的端设备的投屏装置。应理解，所述应用于源端设备的投屏装置即为上述方法中的源端设备，其具有上述方法中源端设备的任意功能；所述应用于目的端设备的投屏装置即为上述方法中的目的端设备，其具有上述方法中目的端设备的任意功能。
如图9所示,应用于目的端设备的投屏装置,包括:接收模块901、播放模块902和解码模块903。其中,接收模块901,用于接收源端设备发送的第一码流,该第一码流为该源端设备在投屏切换前从媒体服务器下载的码流;该接收模块901,还用于接收该源端设备发送的下载指示信息,并根据该下载指示信息从该媒体服务器下载第二码流,该第二码流的起始帧由该下载指示信息指示,该第二码流的起始帧与该第一码流的结束帧有关;播放模块902,用于根据该第一码流或者该第二码流播放视频。
在一种可能的实现方式中,该第一码流的起始帧为投屏切换时该源端设备正在播放的图像帧Fc对应的关键帧Fk。
在一种可能的实现方式中,该第一码流的结束帧为投屏切换时该源端设备已经下载完成的最后一个图像帧Fd。
在一种可能的实现方式中，投屏切换时该源端设备正在播放的图像帧为Fc，投屏切换时该源端设备已经下载完成的最后一个图像帧为Fd，Fd和Fc属于不同的媒体分片，每个媒体分片包含多个图像帧，该第一码流的结束帧为Fc所属媒体分片的最后一个图像帧。
在一种可能的实现方式中,该第二码流的起始帧与该第一码流的结束帧有关,具体包括:该第二码流的起始帧为该第一码流的结束帧的下一个图像帧所属媒体分片的第一个图像帧。
在一种可能的实现方式中,投屏切换时该源端设备正在播放的图像帧为Fc,投屏切换时该源端设备已经下载完成的最后一个图像帧为Fd,Fd的下一个图像帧为Fn,Fn和Fc属于不同的媒体分片,该第二码流的起始帧为Fn所属媒体分片的第一个图像帧F1,该第一码流的结束帧为F1的上一个图像帧F0。
在一种可能的实现方式中,该播放模块902,具体用于从该第一码流或者该第二码流中确定投屏切换后待播放的首帧图像帧,并从该首帧图像帧开始播放该视频。
在一种可能的实现方式中,该接收模块901,还用于接收该源端设备发送的播放时间指示信息,该播放时间指示信息用于指示投屏切换时该源端设备正在播放的图像帧Fc;该播放模块,还用于当该第一码流的数据量大于设定阈值时,根据该播放时间指示信息从该第一码流中确定该待播放的首帧图像帧为Fc,并根据该第一码流播放从Fc到该第一码流的结束帧的视频,根据该第二码流播放从Fm开始的视频,Fm为该第一码流的结束帧的下一个图像帧。
在一种可能的实现方式中,该接收模块901,具体用于当该第一码流的总数据量不小于设定阈值时,接收该源端设备发送的该第一码流。
在一种可能的实现方式中,该接收模块901,还用于当该第一码流的总数据量小于该设定阈值时,接收该源端设备发送的该数据量指示信息;从该媒体服务器下载第三码流,该第三码流的起始帧为投屏切换时该源端设备正在播放的图像帧Fc所属媒体分片的第一个图像帧。
在一种可能的实现方式中,该接收模块901,还用于当该第一码流的总数据量小于该设定阈值时,接收该源端设备发送的该数据量指示信息和第一码流;从该媒体服务器下载第四码流,该第一码流的最后一个图像帧为Fd,Fd的下一个图像帧为Fn,该第四码流的起始帧为Fn所属媒体分片的第一帧。
在一种可能的实现方式中,该接收模块901,还用于接收该源端设备发送的控制信息,该控制信息包括该第一码流的封装格式;接收该源端设备发送的格式数据;解码模块903,用于根据该封装格式对该格式数据进行解封装得到该第一码流。
本实施例的装置,可以用于执行图3-图8任一所示方法实施例的技术方案,其实现原理和技术效果类似,此处不再赘述。
如图10所示，应用于源端设备的投屏装置，包括：处理模块1001、发送模块1002和封装模块1003。其中，处理模块1001，用于根据投屏切换时的播放进度确定第一码流和下载指示信息，该第一码流为在投屏切换前从媒体服务器下载的码流，该下载指示信息用于指示第二码流的起始帧，该第二码流为目的端设备要从该媒体服务器下载的码流，该第二码流的起始帧与该第一码流的结束帧有关；发送模块1002，用于向该目的端设备发送该下载指示信息和该第一码流。
在一种可能的实现方式中,该第一码流的起始帧为投屏切换时正在播放的图像帧Fc对应的关键帧Fk。
在一种可能的实现方式中,该第一码流的结束帧为投屏切换时已经下载完成的最后一个图像帧Fd。
在一种可能的实现方式中,投屏切换时正在播放的图像帧为Fc,投屏切换时已经下载完成的最后一个图像帧为Fd,Fd和Fc属于不同的媒体分片,每个媒体分片包含多个图像帧,该第一码流的结束帧为Fc所属媒体分片的最后一个图像帧。
在一种可能的实现方式中,该第二码流的起始帧与该第一码流的结束帧有关,具体包括:该第二码流的起始帧为该第一码流的结束帧的下一个图像帧所属媒体分片的第一个图像帧。
在一种可能的实现方式中,投屏切换时正在播放的图像帧为Fc,投屏切换时已经下载完成的最后一个图像帧为Fd,Fd的下一个图像帧为Fn,Fn和Fc属于不同的媒体分片,该第二码流的起始帧为Fn所属媒体分片的第一个图像帧F1,该第一码流的结束帧为F1的上一个图像帧F0。
在一种可能的实现方式中,该发送模块1002,还用于向该目的端设备发送播放时间指示信息,该播放时间指示信息用于指示投屏切换时正在播放的图像帧Fc。
在一种可能的实现方式中,该发送模块1002,还用于当该第一码流的总数据量小于设定阈值时,向该目的端设备发送数据量指示信息,该数据量指示信息用于通知该第一码流的总数据量小于设定阈值。
在一种可能的实现方式中,封装模块1003,用于根据设定的封装格式对该第一码流进行封装,得到格式数据;发送模块1002,具体用于向该目的端设备发送控制信息和该格式数据,该控制信息包括该封装格式。
本实施例的装置,可以用于执行图3-图8任一所示方法实施例的技术方案,其实现原理和技术效果类似,此处不再赘述。
在实现过程中,上述方法实施例的各步骤可以通过处理器中的硬件的集成逻辑电路或者软件形式的指令完成。本申请公开的方法的步骤可以直接体现为硬件编码处理器执行完成,或者用编码处理器中的硬件及软件模块组合执行完成。软件模块可以位于随机存储器,闪存、只读存储器,可编程只读存储器或者电可擦写可编程存储器、寄存器等本领域成熟的存储介质中。该存储介质位于存储器,处理器读取存储器中的信息,结合其硬件完成上述方法的步骤。
本领域普通技术人员可以意识到,结合本文中所公开的实施例描述的各示例的单元及算法步骤,能够以电子硬件、或者计算机软件和电子硬件的结合来实现。这些功能究竟以硬件还是软件方式来执行,取决于技术方案的特定应用和设计约束条件。专业技术人员可以对每个特定的应用来使用不同方法来实现所描述的功能,但是这种实现不应认为超出本申请的范围。
所属领域的技术人员可以清楚地了解到,为描述的方便和简洁,上述描述的系统、装置和单元的具体工作过程,可以参考前述方法实施例中的对应过程,在此不再赘述。
在本申请所提供的几个实施例中，应该理解到，所揭露的系统、装置和方法，可以通过其它的方式实现。例如，以上所描述的装置实施例仅仅是示意性的，例如，所述单元的划分，仅仅为一种逻辑功能划分，实际实现时可以有另外的划分方式，例如多个单元或组件可以结合或者可以集成到另一个系统，或一些特征可以忽略，或不执行。另一点，所显示或讨论的相互之间的耦合或直接耦合或通信连接可以是通过一些接口，装置或单元的间接耦合或通信连接，可以是电性，机械或其它的形式。
以上所述,仅为本申请的具体实施方式,但本申请的保护范围并不局限于此,任何熟悉本技术领域的技术人员在本申请揭露的技术范围内,可轻易想到变化或替换,都应涵盖在本申请的保护范围之内。因此,本申请的保护范围应以所述权利要求的保护范围为准。

Claims (44)

  1. 一种URL投屏方法,其特征在于,包括:
    接收源端设备发送的第一码流,所述第一码流为所述源端设备在投屏切换前从媒体服务器下载的码流;
    接收所述源端设备发送的下载指示信息;
    从所述媒体服务器下载第二码流,所述第二码流的起始帧由所述下载指示信息指示,所述第二码流的起始帧与所述第一码流的结束帧有关;
    根据所述第一码流或者所述第二码流播放视频。
  2. 根据权利要求1所述的方法,其特征在于,所述第一码流的起始帧为投屏切换时所述源端设备正在播放的图像帧Fc对应的关键帧Fk。
  3. 根据权利要求1或2所述的方法,其特征在于,所述第一码流的结束帧为投屏切换时所述源端设备已经下载完成的最后一个图像帧Fd。
  4. 根据权利要求1或2所述的方法，其特征在于，投屏切换时所述源端设备正在播放的图像帧为Fc，投屏切换时所述源端设备已经下载完成的最后一个图像帧为Fd，Fd和Fc属于不同的媒体分片，每个媒体分片包含多个图像帧，所述第一码流的结束帧为Fc所属媒体分片的最后一个图像帧。
  5. 根据权利要求1-4中任一项所述的方法,其特征在于,所述第二码流的起始帧与所述第一码流的结束帧有关,具体包括:
    所述第二码流的起始帧为所述第一码流的结束帧的下一个图像帧所属媒体分片的第一个图像帧。
  6. 根据权利要求1或2所述的方法,其特征在于,投屏切换时所述源端设备正在播放的图像帧为Fc,投屏切换时所述源端设备已经下载完成的最后一个图像帧为Fd,Fd的下一个图像帧为Fn,Fn和Fc属于不同的媒体分片,所述第二码流的起始帧为Fn所属媒体分片的第一个图像帧F1,所述第一码流的结束帧为F1的上一个图像帧F0。
  7. 根据权利要求1-6中任一项所述的方法,其特征在于,所述根据所述第一码流或者所述第二码流播放视频,具体包括:
    从所述第一码流或者所述第二码流中确定投屏切换后待播放的首帧图像帧,并从所述首帧图像帧开始播放所述视频。
  8. 根据权利要求7所述的方法，其特征在于，所述根据所述第一码流或者所述第二码流播放视频之前，还包括：
    接收所述源端设备发送的播放时间指示信息,所述播放时间指示信息用于指示投屏切换时所述源端设备正在播放的图像帧Fc;
    所述根据所述第一码流或者所述第二码流播放视频,包括:
    当所述第一码流的数据量大于设定阈值时,根据所述播放时间指示信息从所述第一码流中确定所述待播放的首帧图像帧为Fc,并根据所述第一码流播放从Fc到所述第一码流的结束帧的视频,根据所述第二码流播放从Fm开始的视频,Fm为所述第一码流的结束帧的下一个图像帧。
  9. 根据权利要求1-8中任一项所述的方法，其特征在于，所述接收源端设备发送的第一码流，包括：
    当所述第一码流的总数据量不小于设定阈值时,接收所述源端设备发送的所述第一码流。
  10. 根据权利要求9所述的方法,其特征在于,还包括:
    当所述第一码流的总数据量小于所述设定阈值时,接收所述源端设备发送的所述数据量指示信息;
    从所述媒体服务器下载第三码流,所述第三码流的起始帧为投屏切换时所述源端设备正在播放的图像帧Fc所属媒体分片的第一个图像帧。
  11. 根据权利要求9所述的方法,其特征在于,还包括:
    当所述第一码流的总数据量小于所述设定阈值时,接收所述源端设备发送的所述数据量指示信息和所述第一码流;
    从所述媒体服务器下载第四码流,所述第一码流的最后一个图像帧为Fd,Fd的下一个图像帧为Fn,所述第四码流的起始帧为Fn所属媒体分片的第一帧。
  12. 根据权利要求1-11中任一项所述的方法,其特征在于,所述接收所述源端设备发送的第一码流之前,还包括:
    接收所述源端设备发送的控制信息,所述控制信息包括所述第一码流的封装格式;
    所述接收所述源端设备发送的第一码流,包括:
    接收所述源端设备发送的格式数据,根据所述封装格式对所述格式数据进行解封装得到所述第一码流。
  13. 一种URL投屏方法,其特征在于,包括:
    根据投屏切换时的播放进度确定第一码流和下载指示信息,所述第一码流为在投屏切换前从媒体服务器下载的码流,所述下载指示信息用于指示第二码流的起始帧,所述第二码流为目的端设备要从所述媒体服务器下载的码流,所述第二码流的起始帧与所述第一码流的结束帧有关;
    向所述目的端设备发送所述下载指示信息和所述第一码流。
  14. 根据权利要求13所述的方法,其特征在于,所述第一码流的起始帧为投屏切换时正在播放的图像帧Fc对应的关键帧Fk。
  15. 根据权利要求13或14所述的方法,其特征在于,所述第一码流的结束帧为投屏切换时已经下载完成的最后一个图像帧Fd。
  16. 根据权利要求13或14所述的方法,其特征在于,投屏切换时正在播放的图像帧为Fc,投屏切换时已经下载完成的最后一个图像帧为Fd,Fd和Fc属于不同的媒体分片,每个媒体分片包含多个图像帧,所述第一码流的结束帧为Fc所属媒体分片的最后一个图像帧。
  17. 根据权利要求13-16中任一项所述的方法,其特征在于,所述第二码流的起始帧与所述第一码流的结束帧有关,具体包括:
    所述第二码流的起始帧为所述第一码流的结束帧的下一个图像帧所属媒体分片的第一个图像帧。
  18. 根据权利要求13或14所述的方法，其特征在于，投屏切换时正在播放的图像帧为Fc，投屏切换时已经下载完成的最后一个图像帧为Fd，Fd的下一个图像帧为Fn，Fn和Fc属于不同的媒体分片，所述第二码流的起始帧为Fn所属媒体分片的第一个图像帧F1，所述第一码流的结束帧为F1的上一个图像帧F0。
  19. 根据权利要求13-18中任一项所述的方法,其特征在于,所述向所述目的端设备发送所述下载指示信息和所述第一码流之前,还包括:
    向所述目的端设备发送播放时间指示信息,所述播放时间指示信息用于指示投屏切换时正在播放的图像帧Fc。
  20. 根据权利要求13-19中任一项所述的方法,其特征在于,还包括:
    当所述第一码流的总数据量小于设定阈值时,向所述目的端设备发送数据量指示信息,所述数据量指示信息用于通知所述第一码流的总数据量小于设定阈值。
  21. 根据权利要求13-20中任一项所述的方法,其特征在于,所述向所述目的端设备发送所述下载指示信息和所述第一码流之前,还包括:
    根据设定的封装格式对所述第一码流进行封装,得到格式数据;
    向所述目的端设备发送控制信息和所述格式数据,所述控制信息包括所述封装格式。
  22. 一种投屏装置,其特征在于,包括:
    接收模块,用于接收源端设备发送的第一码流,所述第一码流为所述源端设备在投屏切换前从媒体服务器下载的码流;
    所述接收模块,还用于接收所述源端设备发送的下载指示信息,并根据所述下载指示信息从所述媒体服务器下载第二码流,所述第二码流的起始帧与所述第一码流的结束帧有关;
    播放模块,用于根据所述第一码流或者所述第二码流播放视频。
  23. 根据权利要求22所述的装置,其特征在于,所述第一码流的起始帧为投屏切换时所述源端设备正在播放的图像帧Fc对应的关键帧Fk。
  24. 根据权利要求22或23所述的装置,其特征在于,所述第一码流的结束帧为投屏切换时所述源端设备已经下载完成的最后一个图像帧Fd。
  25. 根据权利要求22或23所述的装置,其特征在于,投屏切换时所述源端设备正在播放的图像帧为Fc,投屏切换时所述源端设备已经下载完成的最后一个图像帧为Fd,Fd和Fc属于不同的媒体分片,每个媒体分片包含多个图像帧,所述第一码流的结束帧为Fc所属媒体分片的最后一个图像帧。
  26. 根据权利要求22-25中任一项所述的装置,其特征在于,所述第二码流的起始帧与所述第一码流的结束帧有关,具体包括:
    所述第二码流的起始帧为所述第一码流的结束帧的下一个图像帧所属媒体分片的第一个图像帧。
  27. 根据权利要求22或23所述的装置,其特征在于,投屏切换时所述源端设备正在播放的图像帧为Fc,投屏切换时所述源端设备已经下载完成的最后一个图像帧为Fd,Fd的下一个图像帧为Fn,Fn和Fc属于不同的媒体分片,所述第二码流的起始帧为Fn所属媒体分片的第一个图像帧F1,所述第一码流的结束帧为F1的上一个图像帧F0。
  28. 根据权利要求22-27中任一项所述的装置,其特征在于,所述播放模块,具体用于从所述第一码流或者所述第二码流中确定投屏切换后待播放的首帧图像帧,并从所述首帧图像帧开始播放所述视频。
  29. 根据权利要求28所述的装置，其特征在于，所述接收模块，还用于接收所述源端设备发送的播放时间指示信息，所述播放时间指示信息用于指示投屏切换时所述源端设备正在播放的图像帧Fc；
    所述播放模块,还用于当所述第一码流的数据量大于设定阈值时,根据所述播放时间指示信息从所述第一码流中确定所述待播放的首帧图像帧为Fc,并根据所述第一码流播放从Fc到所述第一码流的结束帧的视频,根据所述第二码流播放从Fm开始的视频,Fm为所述第一码流的结束帧的下一个图像帧。
  30. 根据权利要求22-29中任一项所述的装置,其特征在于,所述接收模块,具体用于:
    当所述第一码流的总数据量不小于设定阈值时,接收所述源端设备发送的所述第一码流。
  31. 根据权利要求30所述的装置,其特征在于,所述接收模块,还用于:
    当所述第一码流的总数据量小于所述设定阈值时,接收所述源端设备发送的所述数据量指示信息;
    从所述媒体服务器下载第三码流,所述第三码流的起始帧为投屏切换时所述源端设备正在播放的图像帧Fc所属媒体分片的第一个图像帧。
  32. 根据权利要求30所述的装置，其特征在于，所述接收模块，还用于：
    当所述第一码流的总数据量小于所述设定阈值时,接收所述源端设备发送的所述数据量指示信息和所述第一码流;
    从所述媒体服务器下载第四码流,所述第一码流的最后一个图像帧为Fd,Fd的下一个图像帧为Fn,所述第四码流的起始帧为Fn所属媒体分片的第一帧。
  33. 根据权利要求22-32中任一项所述的装置,其特征在于,所述接收模块,还用于:
    接收所述源端设备发送的控制信息,所述控制信息包括所述第一码流的封装格式;
    接收所述源端设备发送的格式数据;
    所述装置还包括:解码模块,用于根据所述封装格式对所述格式数据进行解封装得到所述第一码流。
  34. 一种投屏装置,其特征在于,包括:
    处理模块,用于根据投屏切换时的播放进度确定第一码流和下载指示信息,所述第一码流为在投屏切换前从媒体服务器下载的码流,所述下载指示信息用于指示第二码流的起始帧,所述第二码流为目的端设备要从所述媒体服务器下载的码流,所述第二码流的起始帧与所述第一码流的结束帧有关;
    发送模块,用于向所述目的端设备发送所述下载指示信息和所述第一码流。
  35. 根据权利要求34所述的装置,其特征在于,所述第一码流的起始帧为投屏切换时正在播放的图像帧Fc对应的关键帧Fk。
  36. 根据权利要求34或35所述的装置,其特征在于,所述第一码流的结束帧为投屏切换时已经下载完成的最后一个图像帧Fd。
  37. 根据权利要求34或35所述的装置，其特征在于，投屏切换时正在播放的图像帧为Fc，投屏切换时已经下载完成的最后一个图像帧为Fd，Fd和Fc属于不同的媒体分片，每个媒体分片包含多个图像帧，所述第一码流的结束帧为Fc所属媒体分片的最后一个图像帧。
  38. 根据权利要求34-37中任一项所述的装置,其特征在于,所述第二码流的起始帧与所述第一码流的结束帧有关,具体包括:
    所述第二码流的起始帧为所述第一码流的结束帧的下一个图像帧所属媒体分片的第一个图像帧。
  39. 根据权利要求34或35所述的装置,其特征在于,投屏切换时正在播放的图像帧为Fc,投屏切换时已经下载完成的最后一个图像帧为Fd,Fd的下一个图像帧为Fn,Fn和Fc属于不同的媒体分片,所述第二码流的起始帧为Fn所属媒体分片的第一个图像帧F1,所述第一码流的结束帧为F1的上一个图像帧F0。
  40. 根据权利要求34-39中任一项所述的装置,其特征在于,所述发送模块,还用于向所述目的端设备发送播放时间指示信息,所述播放时间指示信息用于指示投屏切换时正在播放的图像帧Fc。
  41. 根据权利要求34-40中任一项所述的装置,其特征在于,所述发送模块,还用于当所述第一码流的总数据量小于设定阈值时,向所述目的端设备发送数据量指示信息,所述数据量指示信息用于通知所述第一码流的总数据量小于设定阈值。
  42. 根据权利要求34-41中任一项所述的装置,其特征在于,还包括封装模块,用于根据设定的封装格式对所述第一码流进行封装,得到格式数据;
    所述发送模块,具体用于向所述目的端设备发送控制信息和所述格式数据,所述控制信息包括所述封装格式。
  43. 一种投屏装置,其特征在于,包括:
    处理器和传输接口;
    所述处理器被配置为调用存储在存储器中的程序指令,以实现如权利要求1-12或13-21任一项所述的方法。
  44. 一种计算机可读存储介质,其特征在于,包括计算机程序,所述计算机程序在计算机或处理器上被执行时,使得所述计算机或所述处理器执行权利要求1-12或13-21中任一项所述的方法。
PCT/CN2021/078480 2020-03-13 2021-03-01 Url投屏方法和装置 WO2021179931A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202180017852.XA CN115244944A (zh) 2020-03-13 2021-03-01 Url投屏方法和装置

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010177999.2 2020-03-13
CN202010177999.2A CN113395606A (zh) 2020-03-13 2020-03-13 Url投屏方法和装置

Publications (2)

Publication Number Publication Date
WO2021179931A1 true WO2021179931A1 (zh) 2021-09-16
WO2021179931A8 WO2021179931A8 (zh) 2022-09-22

Family

ID=77616428

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/078480 WO2021179931A1 (zh) 2020-03-13 2021-03-01 Url投屏方法和装置

Country Status (2)

Country Link
CN (2) CN113395606A (zh)
WO (1) WO2021179931A1 (zh)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114157904A (zh) * 2021-12-02 2022-03-08 瑞森网安(福建)信息科技有限公司 基于移动终端的云视频投屏播放方法、系统及存储介质
CN114827690A (zh) * 2022-03-30 2022-07-29 北京奇艺世纪科技有限公司 一种网络资源显示方法、装置及系统
CN115103221A (zh) * 2022-06-28 2022-09-23 北京奇艺世纪科技有限公司 一种投屏方法、装置、电子设备及可读存储介质
CN115412758A (zh) * 2022-09-01 2022-11-29 北京奇艺世纪科技有限公司 一种视频处理方法及相关装置
CN115550498A (zh) * 2022-08-03 2022-12-30 阿波罗智联(北京)科技有限公司 投屏方法、装置、设备和存储介质
WO2024087815A1 (zh) * 2022-10-25 2024-05-02 广州视臻信息科技有限公司 一种传屏器配对方法、配对装置、电子设备及介质

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113395606A (zh) * 2020-03-13 2021-09-14 华为技术有限公司 Url投屏方法和装置
CN114567802B (zh) * 2021-12-29 2024-02-09 沈阳中科创达软件有限公司 一种数据显示方法和装置
CN114501120B (zh) * 2022-01-11 2023-06-09 烽火通信科技股份有限公司 多终端无线投屏切换方法与电子设备

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103248939A (zh) * 2012-02-03 2013-08-14 海尔集团公司 一种实现多屏同步显示的方法及系统
CN103281294A (zh) * 2013-04-17 2013-09-04 华为技术有限公司 一种数据共享方法及电子设备
CN106572383A (zh) * 2015-10-12 2017-04-19 中国科学院声学研究所 一种基于多屏互动的视频切换方法及系统
CN110087149A (zh) * 2019-05-30 2019-08-02 维沃移动通信有限公司 一种视频图像分享方法、装置及移动终端
US20190320219A1 (en) * 2018-04-13 2019-10-17 Koji Yoden Services over wireless communication with high flexibility and efficiency

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN100591134C (zh) * 2007-08-01 2010-02-17 神州亿品科技有限公司 防止黑屏和花屏的播放方法
US9749676B2 (en) * 2010-06-08 2017-08-29 Microsoft Technology Licensing, Llc Virtual playback speed modification
US9197685B2 (en) * 2012-06-28 2015-11-24 Sonic Ip, Inc. Systems and methods for fast video startup using trick play streams
CN102905188B (zh) * 2012-11-01 2015-09-30 北京奇艺世纪科技有限公司 一种视频码流切换方法及装置
US9412332B2 (en) * 2013-12-20 2016-08-09 Blackberry Limited Method for wirelessly transmitting content from a source device to a sink device
CN107241640B (zh) * 2017-06-26 2019-05-24 中广热点云科技有限公司 一种移动设备和电视设备同步播放的方法
CN107659712A (zh) * 2017-09-01 2018-02-02 咪咕视讯科技有限公司 一种投屏的方法、装置及存储介质
CN109982151B (zh) * 2017-12-28 2021-06-25 中国移动通信集团福建有限公司 视频点播方法、装置、设备及介质
CN113395606A (zh) * 2020-03-13 2021-09-14 华为技术有限公司 Url投屏方法和装置

Also Published As

Publication number Publication date
CN115244944A (zh) 2022-10-25
WO2021179931A8 (zh) 2022-09-22
CN113395606A (zh) 2021-09-14

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21767002

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21767002

Country of ref document: EP

Kind code of ref document: A1