CN113395606A - URL screen projection method and device

URL screen projection method and device

Info

Publication number
CN113395606A
Authority
CN
China
Prior art keywords
code stream
frame
image frame
screen
switching
Prior art date
Legal status
Pending
Application number
CN202010177999.2A
Other languages
Chinese (zh)
Inventor
肖华熙
姚垚
Current Assignee
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd
Priority to CN202010177999.2A
Priority to CN202180017852.XA (published as CN115244944A)
Priority to PCT/CN2021/078480 (published as WO2021179931A1)
Publication of CN113395606A
Legal status: Pending

Classifications

    • H04N 21/8586: Linking data to content by using a URL
    • H04N 21/238: Interfacing the downstream path of the transmission network, e.g. adapting the transmission rate of a video stream to network bandwidth; Processing of multiplex streams
    • H04N 21/43: Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations; Client middleware
    • H04N 21/4307: Synchronising the rendering of multiple content streams or additional data on devices, e.g. synchronisation of audio on a mobile phone with the video output on the TV screen
    • H04N 21/433: Content storage operation, e.g. storage operation in response to a pause request, caching operations
    • H04N 21/4331: Caching operations, e.g. of an advertisement for later insertion during playback
    • H04N 21/436: Interfacing a local distribution network, e.g. communicating with another STB or one or more peripheral devices inside the home
    • H04N 21/43615: Interfacing a Home Network, e.g. for connecting the client to a plurality of peripherals
    • H04N 21/4363: Adapting the video stream to a specific local network, e.g. a Bluetooth® network
    • H04N 21/43637: Adapting the video stream to a specific local network involving a wireless protocol, e.g. Bluetooth, RF or wireless LAN [IEEE 802.11]
    • H04N 21/845: Structuring of content, e.g. decomposing content into time segments
    • H04N 21/8456: Structuring of content by decomposing the content in the time domain, e.g. in time segments
    • H04N 21/858: Linking data to content, e.g. by linking an URL to a video object, by creating a hotspot

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

The application provides a URL screen projection method and apparatus. In the method, a source device determines, according to the playing progress at the time of screen projection switching, a first code stream and download indication information, where the first code stream is a code stream downloaded from a media server before the screen projection switching, the download indication information indicates a start frame of a second code stream, the second code stream is a code stream to be downloaded from the media server by a destination device, and the start frame of the second code stream is related to an end frame of the first code stream. The source device sends the download indication information and the first code stream to the destination device; the destination device downloads the second code stream from the media server and plays the video according to the first code stream or the second code stream. The method and apparatus can improve the response speed of screen projection switching and prevent video playback from stalling.

Description

URL screen projection method and device
Technical Field
The application relates to a screen projection technology, in particular to a URL screen projection method and device.
Background
With the popularization of intelligent mobile terminals and the rapid development of internet video services, there is a growing demand for playing videos on intelligent terminal devices such as mobile phones and tablets and, when needed, projecting them to large-screen devices such as televisions and set-top boxes for viewing. Screen projection based on the Digital Living Network Alliance (DLNA) protocol is a typical Uniform Resource Locator (URL) screen projection technology that meets this requirement. In DLNA screen projection, a source device (SourcePlayer) sends the URL address of a video to a destination device (DestPlayer), and the DestPlayer requests the media server for the content corresponding to the URL address and plays it.
However, with existing URL screen projection technology, in order to ensure the response speed of screen projection switching, the DestPlayer usually plays the video from the beginning after the switch and cannot stay synchronized with the playing progress of the SourcePlayer, which affects both playing efficiency and user experience.
Disclosure of Invention
The application provides a URL screen projection method and device, which can improve the screen projection switching response speed and prevent video playback from stalling.
In a first aspect, the present application provides a URL screen projection method, including: receiving a first code stream sent by a source end device, wherein the first code stream is a code stream downloaded from a media server by the source end device before screen-casting switching; receiving download indication information sent by the source end device; downloading a second code stream from the media server, wherein the starting frame of the second code stream is indicated by the downloading indication information, and the starting frame of the second code stream is related to the ending frame of the first code stream; and playing the video according to the first code stream or the second code stream.
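By way of a non-limiting illustration, the following Python sketch outlines the destination-side flow of the first aspect. The message types, field names and helper callables are assumptions made for readability; only the ordering of the steps comes from this application.

```python
# Illustrative sketch of the destination-side method of the first aspect.
# Message types and callable signatures are assumptions, not part of this application.

def destination_cast_switch(receive_from_source, download_from_server, play):
    """receive_from_source(): iterates messages arriving from the source device
    over the local area network; download_from_server(start_frame): fetches the
    second code stream from the media server; play(stream): plays a code stream."""
    first_stream = None
    second_start = None
    for msg in receive_from_source():
        if msg["type"] == "FIRST_CODE_STREAM":      # cached by the source before the switch
            first_stream = msg["data"]
        elif msg["type"] == "DOWNLOAD_INDICATION":  # start frame of the second code stream
            second_start = msg["start_frame"]
        if first_stream is not None and second_start is not None:
            break
    second_stream = download_from_server(second_start)
    # Play according to the first code stream or the second code stream
    # (the detailed decision is described in the implementations below).
    play(first_stream if first_stream else second_stream)
```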
In this application, at the time of screen projection switching, the source device sends its cached code stream to the destination device. On one hand, the cached code stream is transmitted over the local area network, and the advantages of the local area network, such as higher transmission bandwidth and better Quality of Service (QoS), allow fast and stable transmission of the cached code stream. On the other hand, the destination device receives the cached code stream from the source device and downloads only the code stream that follows it from the media server, so the video can be played quickly based on the code stream already available, which improves the screen projection switching response speed, and enough code stream is buffered during playback to prevent the video from stalling.
In a possible implementation manner, the start frame of the first code stream is a key frame Fk corresponding to an image frame Fc being played by the source device during screen projection switching.
The destination device cannot decode the image corresponding to Fc from the information of the image frame Fc alone; it also needs the key frame Fk corresponding to Fc (Fk is the key frame that precedes Fc and is closest to it). With the information of Fk, the image corresponding to Fc can be completely decoded, and the same holds for the image frames after Fc. Therefore, even though Fk has already been played at the time of screen projection switching, the key frame Fk corresponding to Fc must still be sent to the destination device to keep the information of the first code stream complete, so the start frame of the first code stream may be Fk. This reduces the amount of data transmitted between the source and destination devices while providing enough image information to ensure that the images can be decoded correctly.
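As a concrete illustration of this choice of start frame, the following sketch locates the key frame Fk for the frame Fc being played. The frame representation is an assumption made only for the example.

```python
# A minimal sketch, assuming the cached stream is a list of frames where each
# frame records whether it is a key frame (I-frame). This representation is an
# illustration, not the data format defined by this application.

def find_start_of_first_stream(frames, playing_index):
    """Return the index of Fk: the key frame closest to, and not later than,
    the frame Fc currently being played."""
    for i in range(playing_index, -1, -1):
        if frames[i]["is_key"]:
            return i
    raise ValueError("no key frame precedes the playing frame")

# Example: frames 0, 4 and 8 are key frames; the source is playing frame 6,
# so the first code stream sent to the destination starts at frame 4.
frames = [{"is_key": i % 4 == 0} for i in range(10)]
assert find_start_of_first_stream(frames, playing_index=6) == 4
```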
In a possible implementation manner, the ending frame of the first code stream is the last image frame Fd that has been downloaded by the source device during the screen-casting switching.
In this case, the source device transmits to the destination device the code stream of all image frames that have been downloaded from the media server but not yet played, that is, the code stream of the image frames from Fc to Fd. As much of the code stream downloaded by the source device as possible is transferred to the destination device over the local area network, which reduces the amount of data of the second code stream that the destination device must download from the media server.
In a possible implementation manner, an image frame being played by the source device during screen-casting switching is Fc, a last image frame downloaded by the source device during screen-casting switching is Fd, Fd and Fc belong to different media fragments, each media fragment includes a plurality of image frames, and an end frame of the first code stream is a last image frame of a media fragment to which Fc belongs.
In order to ensure that the destination can implement screen-casting time synchronization and meet the storage requirement of media data, the first code stream sent by the source device to the destination device may end at the last image frame Fe of the media fragment to which Fc belongs.
In a possible implementation manner, the starting frame of the second code stream is related to the ending frame of the first code stream, and specifically includes: the start frame of the second code stream is the first image frame of the media fragment to which the next image frame of the end frame of the first code stream belongs.
Usually, the start frame of the code stream downloaded by the source device from the media server is the first image frame of the first media segment of the video, and in this embodiment, the start frame of the code stream (the second code stream) to be downloaded by the destination device from the media server is the first image frame of the media segment to which the next image frame of the end frame of the first code stream belongs. Therefore, the second code stream downloaded from the server by the destination device and the first code stream received from the source device can be consistent in sequence, and screen projection time synchronization is further ensured.
For example, when the end frame of the first code stream is the last image frame Fd that has been downloaded by the source device at the time of screen projection switching, and considering how media segments are organized on the media server, the start frame of the second code stream may be the first image frame of the media segment to which Fn, the image frame following Fd, belongs. In this case, the tail of the first code stream and the head of the second code stream may overlap. When the end frame of the first code stream is the last image frame Fe of the media segment to which Fc belongs, the start frame of the second code stream may be the first image frame of the media segment to which the image frame following Fe belongs. Since Fe is the last image frame of the media segment containing the frame being played before the switch, the frame following Fe is the first image frame of the adjacent media segment, and that frame is the start frame of the second code stream, so the first and second code streams do not overlap. In this way, the first code stream and the second code stream remain consecutive while overlapping as little as possible, which reduces redundant data transmission; even when they do overlap, the overlap is only part of one media segment, and this limited redundancy is acceptable as the price of ensuring screen projection time synchronization.
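The overlap behaviour described above can be made concrete with a short sketch. Fixed-length segments and a constant frame rate are simplifying assumptions used only for the example; real HLS/FMP4 segments may vary in length.

```python
# Segment arithmetic for the start frame of the second code stream.
FRAMES_PER_SEGMENT = 125  # e.g. 5-second segments at 25 fps (illustrative values)

def segment_of(frame_index):
    return frame_index // FRAMES_PER_SEGMENT

def start_frame_of_second_stream(end_frame_of_first_stream):
    """First image frame of the media segment to which the image frame
    following the first code stream's end frame belongs."""
    next_frame = end_frame_of_first_stream + 1
    return segment_of(next_frame) * FRAMES_PER_SEGMENT

# If the first code stream ends at the last downloaded frame Fd = 330 (inside
# segment 2, frames 250-374), the second code stream starts at frame 250, so
# the tail of the first stream and the head of the second stream overlap.
assert start_frame_of_second_stream(330) == 250
# If the first code stream instead ends at the last frame of the segment that
# contains Fc (Fe = 249), the second stream starts at frame 250: no overlap.
assert start_frame_of_second_stream(249) == 250
```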
In a possible implementation manner, the image frame being played by the source device at the time of screen projection switching is Fc, the last image frame that has been downloaded by the source device at the time of screen projection switching is Fd, the next image frame of Fd is Fn, Fn and Fc belong to different media fragments, the start frame of the second code stream is the first image frame F1 of the media fragment to which Fn belongs, and the end frame of the first code stream is the image frame F0 immediately preceding F1.
The source device removes from its cached code stream the part that might overlap with the second code stream and transmits the remaining code stream to the destination device over the local area network, so that the amount of data of the second code stream that the destination device downloads from the media server can be reduced.
In a possible implementation manner, the playing a video according to the first code stream or the second code stream specifically includes: and determining a first frame image frame to be played after screen projection switching from the first code stream or the second code stream, and starting to play the video from the first frame image frame.
In a possible implementation manner, before playing a video according to the first bitstream or the second bitstream, the method further includes: receiving play time indication information sent by the source end equipment, wherein the play time indication information is used for indicating an image frame Fc played by the source end equipment during screen projection switching; the playing the video according to the first code stream or the second code stream includes: when the data volume of the first code stream is greater than a set threshold value, determining that the first frame image frame to be played is Fc from the first code stream according to the playing time indication information, playing a video from Fc to an end frame of the first code stream according to the first code stream, and playing a video starting from Fm according to the second code stream, wherein Fm is a next image frame of the end frame of the first code stream.
Receiving the first code stream and downloading the second code stream from the media server are two independent processes for the destination device, so they can run simultaneously. Since the first code stream is entirely located before the second code stream, the destination device can start playing from the first code stream. While receiving the two code streams, once the amount of received data of the first code stream is sufficient for playback, the destination device can play the video from the playing progress at the time of screen projection switching according to the first code stream, which helps improve the screen projection switching response speed. By the time the first code stream has been played, the destination device has also accumulated enough of the second code stream and continues playing the video according to the second code stream, ensuring smooth playback.
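A minimal sketch of this playback decision follows; the threshold value and the list-based stream representation are illustrative assumptions and are not values defined by this application.

```python
MIN_PLAYABLE_BYTES = 512 * 1024  # assumed "set threshold", for illustration only

def frames_to_play(first_stream, second_stream, fc_index, first_stream_bytes):
    """Return the display-ordered frames the destination plays.

    first_stream starts at the key frame Fk; fc_index is the offset of Fc inside
    first_stream; second_stream is assumed to start at Fm, the image frame
    following the end frame of first_stream (i.e. no overlap in this sketch)."""
    if first_stream_bytes > MIN_PLAYABLE_BYTES:
        # Enough cached data arrived over the LAN: resume at Fc so the
        # destination stays synchronised with the source's playing position,
        # then continue with the stream downloaded from the media server.
        return first_stream[fc_index:] + second_stream
    # Otherwise fall back to playing from the second code stream only.
    return second_stream
```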
In a possible implementation manner, the receiving a first code stream sent by a source end device includes: and when the total data volume of the first code stream is not less than a set threshold, receiving the first code stream sent by the source end device.
In one possible implementation manner, the method further includes: when the total data volume of the first code stream is smaller than the set threshold, receiving the data volume indication information sent by the source end device; and downloading a third code stream from the media server, wherein the starting frame of the third code stream is the first image frame of the media fragment to which the image frame Fc which is played by the source end equipment during screen projection switching belongs.
In one possible implementation manner, the method further includes: when the total data volume of the first code stream is smaller than the set threshold, receiving the data volume indication information and the first code stream sent by the source end device; and downloading a fourth code stream from the media server, wherein the last image frame of the first code stream is Fd, the next image frame of the Fd is Fn, and the initial frame of the fourth code stream is the first frame of the media fragment to which the Fn belongs.
If, at the time of screen projection switching, the amount of code stream the source device has downloaded and cached from the media server is smaller than the set threshold, then even if the cached code stream were sent to the destination device, the destination device would still need to download almost all of the code stream from the media server, or the video could not be played according to the first code stream, or the bandwidth consumption, transmission delay and other costs of transmitting the first code stream would exceed the cost of downloading it directly from the media server. In this case, the source device can send data amount indication information to the destination device to notify it that the total data amount of the first code stream is smaller than the set threshold. The source device may then choose either to send the first code stream to the destination device or not to send it.
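The source-side choice can be summarised with the following sketch; the message names are placeholders, not identifiers defined by this application.

```python
def plan_source_messages(cached_bytes, threshold, forward_small_cache=False):
    """Decide which messages the source device sends at screen projection
    switching. Message names are illustrative placeholders."""
    if cached_bytes < threshold:
        msgs = ["DATA_AMOUNT_INDICATION"]        # total data below the threshold
        if forward_small_cache:                  # forwarding the cache is optional here
            msgs += ["FIRST_CODE_STREAM", "DOWNLOAD_INDICATION"]
        return msgs
    return ["PLAY_TIME_INDICATION",              # which frame Fc was playing
            "FIRST_CODE_STREAM",                 # the cached code stream
            "DOWNLOAD_INDICATION"]               # start frame of the second stream
```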
In a possible implementation manner, before receiving the first code stream sent by the source device, the method further includes: receiving control information sent by the source end device, wherein the control information comprises a packaging format of the first code stream; the receiving of the first code stream sent by the source end device includes: and receiving format data sent by the source end equipment, and de-encapsulating the format data according to the encapsulation format to obtain the first code stream.
The first code stream may contain image frames spanning multiple media segments, and the source device may encapsulate the first code stream into a single media segment, such as a TS or FMP4 segment, to simplify the organization of the transmitted data. Optionally, the source device may instead split the first code stream into several media segments for transmission; this is not specifically limited in this application. What is encapsulated here is a video code stream that was downloaded and cached from the media server; the encapsulation improves screen projection time synchronization and does not involve encoding or decoding of images, so the algorithmic complexity and hardware requirements remain low.
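For illustration, the control information announcing the encapsulation format might be exchanged as sketched below. The JSON encoding and the field names are assumptions; this application does not prescribe a particular wire format.

```python
import json

def build_control_info(container_format: str, segment_count: int) -> bytes:
    """container_format: e.g. "ts" or "fmp4" (TS and FMP4 segments are the
    encapsulation formats mentioned above)."""
    return json.dumps({
        "encapsulation_format": container_format,
        "segment_count": segment_count,
    }).encode("utf-8")

def parse_control_info(payload: bytes) -> dict:
    info = json.loads(payload.decode("utf-8"))
    # The destination de-encapsulates the received format data accordingly.
    assert info["encapsulation_format"] in ("ts", "fmp4")
    return info
```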
In a possible implementation manner, after the video starts to be played, the current playing state (including the playing position) is periodically reported to the source device, and if the user adjusts the playing progress in a dragging manner, the current playing state may also be reported to the source device.
In one possible implementation, a new message is defined for transmitting the media information and the first codestream, and the message may include a uniform fixed-length header, a type and a response code for carrying the identification message, and an optional message body.
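A possible layout for such a message is sketched below. The field sizes and byte order are assumptions; the text above only requires a uniform fixed-length header carrying a type and a response code, plus an optional body.

```python
import struct

# type (1 byte), response code (1 byte), body length (4 bytes): assumed layout
HEADER = struct.Struct(">BBI")

def pack_message(msg_type: int, resp_code: int, body: bytes = b"") -> bytes:
    return HEADER.pack(msg_type, resp_code, len(body)) + body

def unpack_message(data: bytes):
    msg_type, resp_code, length = HEADER.unpack_from(data)
    body = data[HEADER.size:HEADER.size + length]
    return msg_type, resp_code, body

# Round-trip example with an arbitrary body.
raw = pack_message(msg_type=1, resp_code=0, body=b"first code stream chunk")
assert unpack_message(raw) == (1, 0, b"first code stream chunk")
```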
In a second aspect, the present application provides a URL screen projection method, including: determining a first code stream and downloading indication information according to a playing progress during screen projection switching, wherein the first code stream is a code stream downloaded from a media server before screen projection switching, the downloading indication information is used for indicating a starting frame of a second code stream, the second code stream is a code stream to be downloaded from the media server by destination equipment, and the starting frame of the second code stream is related to an ending frame of the first code stream; and sending the downloading indication information and the first code stream to the destination device.
In a possible implementation manner, the starting frame of the first code stream is a key frame Fk corresponding to the image frame Fc being played during screen projection switching.
In a possible implementation manner, the ending frame of the first code stream is the last image frame Fd that has been downloaded during screen-casting switching.
In a possible implementation manner, the image frame being played during screen-casting switching is Fc, the last image frame that has been downloaded during screen-casting switching is Fd, Fd and Fc belong to different media fragments, each media fragment includes a plurality of image frames, and the end frame of the first code stream is the last image frame of the media fragment to which Fc belongs.
In a possible implementation manner, the starting frame of the second code stream is related to the ending frame of the first code stream, and specifically includes: the start frame of the second code stream is the first image frame of the media fragment to which the next image frame of the end frame of the first code stream belongs.
In a possible implementation manner, the image frame being played during screen-casting switching is Fc, the last image frame that has been downloaded during screen-casting switching is Fd, the next image frame of Fd is Fn, Fn and Fc belong to different media segments, the start frame of the second code stream is the first image frame F1 of the media segment to which Fn belongs, and the end frame of the first code stream is the image frame F0 immediately preceding F1.
In a possible implementation manner, before sending the download indication information and the first code stream to the destination device, the method further includes: and sending playing time indication information to the destination device, wherein the playing time indication information is used for indicating the image frame Fc which is being played during screen projection switching.
In one possible implementation manner, the method further includes: and when the total data volume of the first code stream is smaller than a set threshold, sending data volume indicating information to the destination device, wherein the data volume indicating information is used for informing that the total data volume of the first code stream is smaller than the set threshold.
In a possible implementation manner, before sending the download indication information and the first code stream to the destination device, the method further includes: packaging the first code stream according to a set packaging format to obtain format data; and sending control information and the format data to the destination device, wherein the control information comprises the packaging format.
In a possible implementation manner, when the source device initiates screen projection, an extended parameter may be carried in the screen projection switching request in addition to the URL, where the extended parameter indicates that the source device supports the URL screen projection method provided by this application. If the destination device also supports the URL screen projection method provided by this application, it can recognize the extended parameter in the screen projection request.
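For illustration, carrying such an extended parameter alongside the URL might look like the following. The parameter name is purely hypothetical and is not defined by this application or by DLNA.

```python
def build_cast_request(video_url: str, supports_cached_relay: bool) -> dict:
    """Hypothetical cast-switch request; "X-CachedStreamRelay" is an invented
    placeholder for the extended parameter described above."""
    request = {"uri": video_url}
    if supports_cached_relay:
        # A destination that implements this method recognises the extension;
        # a legacy receiver simply ignores the unknown parameter.
        request["X-CachedStreamRelay"] = "1"
    return request
```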
In one possible implementation, a new message is defined for transmitting the media information and the first codestream, and the message may include a uniform fixed-length header, a type and a response code for carrying the identification message, and an optional message body.
In a third aspect, the present application provides a screen projection apparatus, comprising: the receiving module is used for receiving a first code stream sent by a source end device, wherein the first code stream is a code stream downloaded from a media server by the source end device before screen-casting switching; the receiving module is further configured to receive download indication information sent by the source end device, and download a second code stream from the media server according to the download indication information, where a start frame of the second code stream is indicated by the download indication information, and the start frame of the second code stream is related to an end frame of the first code stream; and the playing module is used for playing the video according to the first code stream or the second code stream.
In a possible implementation manner, the start frame of the first code stream is a key frame Fk corresponding to an image frame Fc being played by the source device during screen projection switching.
In a possible implementation manner, the ending frame of the first code stream is the last image frame Fd that has been downloaded by the source device during the screen-casting switching.
In a possible implementation manner, an image frame being played by the source device during screen-casting switching is Fc, a last image frame downloaded by the source device during screen-casting switching is Fd, Fd and Fc belong to different media fragments, each media fragment includes a plurality of image frames, and an end frame of the first code stream is a last image frame of a media fragment to which Fc belongs.
In a possible implementation manner, the starting frame of the second code stream is related to the ending frame of the first code stream, and specifically includes: the start frame of the second code stream is the first image frame of the media fragment to which the next image frame of the end frame of the first code stream belongs.
In a possible implementation manner, the image frame being played by the source device at the time of screen projection switching is Fc, the last image frame that has been downloaded by the source device at the time of screen projection switching is Fd, the next image frame of Fd is Fn, Fn and Fc belong to different media fragments, the start frame of the second code stream is the first image frame F1 of the media fragment to which Fn belongs, and the end frame of the first code stream is the image frame F0 immediately preceding F1.
In a possible implementation manner, the playing module is specifically configured to determine a first frame image frame to be played after screen switching from the first code stream or the second code stream, and start playing the video from the first frame image frame.
In a possible implementation manner, the receiving module is further configured to receive play time indication information sent by the source device, where the play time indication information is used to indicate an image frame Fc being played by the source device during screen switching; the playing module is further configured to determine, when the data volume of the first code stream is greater than a set threshold, that the first image frame to be played is Fc from the first code stream according to the playing time indication information, play a video from Fc to an end frame of the first code stream according to the first code stream, play a video starting from Fm according to the second code stream, where Fm is a next image frame of the end frame of the first code stream.
In a possible implementation manner, the receiving module is specifically configured to receive the first code stream sent by the source device when a total data amount of the first code stream is not less than a set threshold.
In a possible implementation manner, the receiving module is further configured to receive the data amount indication information sent by the source device when a total data amount of the first code stream is smaller than the set threshold; and downloading a third code stream from the media server, wherein the starting frame of the third code stream is the first image frame of the media fragment to which the image frame Fc which is played by the source end equipment during screen projection switching belongs.
In a possible implementation manner, the receiving module is further configured to receive the data volume indication information and the first code stream sent by the source end device when a total data volume of the first code stream is smaller than the set threshold; and downloading a fourth code stream from the media server, wherein the last image frame of the first code stream is Fd, the next image frame of the Fd is Fn, and the initial frame of the fourth code stream is the first frame of the media fragment to which the Fn belongs.
In a possible implementation manner, the receiving module is further configured to receive control information sent by the source device, where the control information includes a package format of the first code stream; receiving format data sent by the source end device; the device also includes: and the decoding module is used for de-encapsulating the format data according to the encapsulation format to obtain the first code stream.
In a fourth aspect, the present application provides a screen projection apparatus, comprising: the processing module is used for determining a first code stream and downloading indication information according to the playing progress during screen-casting switching, wherein the first code stream is downloaded from a media server before screen-casting switching, the downloading indication information is used for indicating a starting frame of a second code stream, the second code stream is a code stream to be downloaded from the media server by a destination device, and the starting frame of the second code stream is related to an ending frame of the first code stream; and the sending module is used for sending the downloading indication information and the first code stream to the destination device.
In a possible implementation manner, the starting frame of the first code stream is a key frame Fk corresponding to the image frame Fc being played during screen projection switching.
In a possible implementation manner, the ending frame of the first code stream is the last image frame Fd that has been downloaded during screen-casting switching.
In a possible implementation manner, the image frame being played during screen-casting switching is Fc, the last image frame that has been downloaded during screen-casting switching is Fd, Fd and Fc belong to different media fragments, each media fragment includes a plurality of image frames, and the end frame of the first code stream is the last image frame of the media fragment to which Fc belongs.
In a possible implementation manner, the starting frame of the second code stream is related to the ending frame of the first code stream, and specifically includes: the start frame of the second code stream is the first image frame of the media fragment to which the next image frame of the end frame of the first code stream belongs.
In a possible implementation manner, the image frame being played during screen-casting switching is Fc, the last image frame that has been downloaded during screen-casting switching is Fd, the next image frame of Fd is Fn, Fn and Fc belong to different media segments, the start frame of the second code stream is the first image frame F1 of the media segment to which Fn belongs, and the end frame of the first code stream is the image frame F0 immediately preceding F1.
In a possible implementation manner, the sending module is further configured to send playing time indication information to the destination device, where the playing time indication information is used to indicate an image frame Fc that is being played during screen switching.
In a possible implementation manner, the sending module is further configured to send data volume indication information to the destination device when a total data volume of the first code stream is smaller than a set threshold, where the data volume indication information is used to notify that the total data volume of the first code stream is smaller than the set threshold.
In one possible implementation, the apparatus further includes: the packaging module is used for packaging the first code stream according to a set packaging format to obtain format data; the sending module is specifically configured to send control information and the format data to the destination device, where the control information includes the encapsulation format.
In a fifth aspect, the present application provides a screen projection apparatus, comprising: a processor and a transmission interface; the processor is configured to invoke program instructions stored in the memory to implement the method of any of the first to second aspects described above.
In a sixth aspect, the present application provides a computer-readable storage medium storing a computer program which, when executed on a computer or processor, causes the computer or processor to perform the method of any of the first to second aspects.
In a seventh aspect, the present application provides a computer program for performing the method of any one of the first to second aspects when the computer program is executed by a computer or a processor.
Drawings
FIG. 1 illustrates an exemplary schematic diagram of a screen projection scenario;
FIG. 2 shows an exemplary schematic of a device 200;
FIG. 3 is a flowchart of a first embodiment of a URL screen projection method of the present application;
FIG. 4 illustrates an exemplary sequence diagram of image frames;
FIG. 5 illustrates an exemplary sequence diagram of image frames;
FIG. 6 illustrates an exemplary sequence diagram of image frames;
FIG. 7 is a flowchart of a second embodiment of a URL screen projection method of the present application;
FIG. 8 illustrates one possible embodiment of a URL screen projection method of the present application;
FIG. 9 is a schematic structural diagram of a first embodiment of a projection screen apparatus according to the present application;
FIG. 10 is a schematic structural diagram of a second embodiment of the screen projection device according to the present application.
Detailed Description
To make the purpose, technical solutions and advantages of the present application clearer, the technical solutions in the present application will be clearly and completely described below with reference to the drawings in the present application, and it is obvious that the described embodiments are some, but not all embodiments of the present application. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The terms "first," "second," and the like in the description examples and claims of this application and in the drawings are used for descriptive purposes only and are not to be construed as indicating or implying relative importance, nor order. Furthermore, the terms "comprises" and "comprising," as well as any variations thereof, are intended to cover a non-exclusive inclusion, such as a list of steps or elements. A method, system, article, or apparatus is not necessarily limited to those steps or elements explicitly listed, but may include other steps or elements not explicitly listed or inherent to such process, system, article, or apparatus.
It should be understood that in the present application, "at least one" means one or more, "a plurality" means two or more. "and/or" for describing an association relationship of associated objects, indicating that there may be three relationships, e.g., "a and/or B" may indicate: only A, only B and both A and B are present, wherein A and B may be singular or plural. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship. "at least one of the following" or similar expressions refer to any combination of these items, including any combination of single item(s) or plural items. For example, at least one (one) of a, b, or c, may represent: a, b, c, "a and b", "a and c", "b and c", or "a and b and c", wherein a, b, c may be single or plural.
Terms used in this application are described as follows:
URL screen projection: the screen projection initiating end provides the URL address of a video to the screen projection receiving end, and the receiving end obtains the video code stream from the media server according to the URL address and plays it. URL screen projection differs from mirror screen projection, in which the screen data of the initiating end is transmitted to the receiving end in real time so that the receiving end displays the initiating end's screen synchronously.
DLNA screen projection is a typical URL screen projection technology. The screen projection process is as follows: the user plays a video on a terminal device (such as a mobile phone or a tablet) and clicks the screen projection button in the media player to initiate screen projection; this terminal device is the screen projection initiating end. The initiating end sends the URL address of the video to the screen projection receiving end using the DLNA protocol; a large-screen device such as a television or a set-top box acts as the receiving end. Meanwhile, the initiating end indicates the current playing position to the receiving end using the seek command in the DLNA protocol. The receiving end then requests the media server for the content corresponding to the URL address and plays it.
The screen projection termination process is as follows: the user terminates screen projection at the initiating end or the receiving end through a button or key, and the receiving end stops playing the video. If the media player on the initiating end has not exited, the initiating end requests the content from the media server and continues playing.
Screen projection initiating end (Sender): the initiator of URL screen projection. When screen projection is started, the Sender provides the video URL address to the Receiver. Typically, devices such as mobile phones, tablets and computers can initiate URL screen projection through a running media player.
Screen projection receiving end (Receiver): the receiving end of URL screen projection. During screen projection, the Receiver obtains the video code stream from the media server according to the URL address provided by the Sender. Typically, devices such as televisions, set-top boxes, Audio Video (AV) power amplifiers, ChromeCast and computers can receive screen projection through a built-in receiving component.
Media Server (Server): the server that provides the video code stream. The Receiver requests the corresponding video code stream from the Server through the video's URL address. Currently, the Server mainly provides media services using streaming protocols such as HTTP Live Streaming (HLS) and Fragmented MP4 (FMP4).
Initial cache data (Cache): before starting playback, a video playing device needs to download and cache in advance the code stream corresponding to a section of the video from the media server; usually at least 2-3 seconds of video code stream must be cached, otherwise playback is prone to stalling.
Screen projection switching: when screen projection is started or terminated, the device playing the video switches from one end to the other. When screen projection is started, the Sender generally stops playing and the video is switched to be played by the Receiver. When screen projection is terminated, the Receiver stops playing, and if the Sender's video player has not exited, the video switches back to be played by the Sender.
Destination device (DestPlayer): the end that plays the video after screen projection switching. When screen projection is started, video playback switches from the Sender to the Receiver, and the Receiver is the DestPlayer; when screen projection is terminated, the video switches from the Receiver back to the Sender, and the Sender is the DestPlayer.
Source device (SourcePlayer): the end that plays the video before screen projection switching. Before screen projection is started, the Sender is the SourcePlayer; before screen projection is terminated, the Receiver is the SourcePlayer.
Screen projection switching response speed: also referred to as the screen projection response speed or simply the response speed; the speed at which the video switches to playing on the DestPlayer after the user initiates screen projection switching.
Screen projection time synchronization: at the time of screen projection switching, the DestPlayer can start playing from the position at which the SourcePlayer was playing.
Key frame: in video compression, each frame represents a still image. A key frame is an I-frame; an I-frame is coded with full-frame compression, that is, the entire frame of image information is compressed and coded, and during decoding the complete image can be reconstructed from the data of the I-frame alone. I-frames describe the details of the image background and the moving subject without referring to other image frames, and therefore belong to intra-frame compression. An ordinary image frame, by contrast, is inter-frame compressed: it carries only residual information, and the final image is obtained by decoding that residual information and superimposing it on the information of the key frame corresponding to the frame.
Fig. 1 is a schematic diagram of an exemplary URL screen projection scenario. As shown in fig. 1, the source device sends the URL address of a video to the destination device, and the destination device requests the media server for the content corresponding to the URL address and plays it. However, URL screen projection has two problems: (1) the screen projection switching response speed is not fast enough, so the video cannot be quickly switched to the destination device for playing; (2) it is difficult to achieve screen projection time synchronization while maintaining the switching response speed, so the destination device can hardly start playing from the position at which the source device was playing.
Current optimizations mostly focus on compressing the time overhead of each step in the screen projection switching process, or on starting the destination device in advance. While this alleviates the problem to some extent, it does not solve it fundamentally.
On the one hand, the screen projection switching response speed is difficult to improve. For network video services, the playing device needs to download and cache a section of the video code stream from the media server before starting playback, usually at least 2-3 seconds of it. Because of unpredictable network jitter, the initial cache data cannot be too small, otherwise playback stalls easily. When the available bandwidth is not much larger than the bandwidth requirement, accumulating the initial cache data takes at least 2-3 seconds; accordingly, after screen projection switching the destination device can start playing only after 2-3 seconds. The contradiction between bandwidth requirements and bandwidth capability has existed for a long time: video is a heavy consumer of bandwidth, and newly added bandwidth is quickly filled by more demanding content such as high definition, ultra high definition, 4K, 8K, and Virtual Reality (VR)/Augmented Reality (AR). In addition to downloading the code stream from the media server, there is also an initial download overhead, for example the initial negotiation with the distribution network (e.g., a Content Delivery Network (CDN) combined with a Peer-to-Peer (P2P) network). Although media service providers optimize this as much as possible, the overhead is always present; it typically takes hundreds of milliseconds and is closely related to the Quality of Service (QoS) of the media service provider. Therefore, at the time of screen projection switching, the problem cannot be fundamentally solved without a more effective method. On the other hand, it is difficult to provide screen projection time synchronization. To play the video at time T, a device usually has to start downloading from the first frame of the media segment that contains time T. For example, assume that the duration of a media segment is 5 seconds and the source device initiates screen projection switching when it has played for 14 seconds. After the destination device receives the screen projection switching request, it seeks to the 3rd segment, whose time range is 10-15 seconds. To achieve time synchronization, in addition to the necessary initial cache (2-3 seconds after the 14-second mark), the destination device also has to download the already-played data of the 10th to 13th seconds, which leads to a longer waiting process.
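The 5-second-segment example above can be written out as simple arithmetic; the 3-second initial cache is an assumed value within the 2-3 second range mentioned above.

```python
SEGMENT_SECONDS = 5
play_position = 14        # the source was playing at the 14-second mark
initial_cache = 3         # seconds of initial cache assumed for the example

segment_index = play_position // SEGMENT_SECONDS   # 2 (0-based), i.e. the 3rd segment
segment_start = segment_index * SEGMENT_SECONDS    # the segment covers 10-15 s
already_played = play_position - segment_start     # 4 s of already-played data (10th-13th s)
total_download = already_played + initial_cache    # 7 s must be fetched before playback starts

assert (segment_index, segment_start, already_played, total_download) == (2, 10, 4, 7)
```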
Therefore, in the absence of a more effective method, the related technologies have to choose between screen projection switching response speed and screen projection time synchronization: either sacrifice the switching response speed to achieve time synchronization, or sacrifice time synchronization to improve the switching response speed. The latter is chosen most often at present, that is, the switching response speed is improved but the video is always played from the beginning after screen projection switching.
The application provides a URL screen projection method, and aims to solve the problems. As shown in fig. 1, the Sender in the URL screen projection method may also be referred to as a User Equipment (UE), and may be deployed on land, including indoors or outdoors, handheld or vehicle-mounted; can also be deployed on the water surface (such as a ship and the like); and may also be deployed in the air (e.g., airplanes, balloons, satellites, etc.). The source device may be a mobile phone (mobile phone), a tablet computer (pad), a wearable device with a wireless communication function (e.g., a smart watch), a location tracker with a positioning function, a computer with a wireless transceiving function, a VR device, an AR device, a wireless device in industrial control (industrial control), a wireless device in self driving (self driving), a wireless device in remote medical (remote medical), a wireless device in smart grid (smart grid), a wireless device in transportation safety (transportation safety), a wireless device in smart city (smart city), a wireless device in smart home (smart home), etc., which is not limited in this application.
The destination device in the URL screen projection method may be a smart television, a television box, a projection screen, or the like, which is not limited in this application.
The network between the source device and the destination device may be a communication network supporting a short-distance communication technology, for example, a communication network supporting a Wireless-Fidelity (WIFI) technology; or a Communication network supporting bluetooth technology or a Communication network supporting Near Field Communication (NFC) technology; and the like; alternatively, the communication network may be a communication network supporting a Fourth Generation (4G) access technology, such as a Long Term Evolution (LTE) access technology; alternatively, the communication network may be a communication network supporting a Fifth Generation (5G) access technology, such as a New Radio (NR) access technology; alternatively, the communication network may be a communication network supporting a Third Generation (3G) access technology, such as a Universal Mobile Telecommunications System (UMTS) access technology; alternatively, the communication network may also be a communication network supporting a plurality of wireless technologies, such as a communication network supporting LTE technology and NR technology; alternatively, the communication network may be adapted to future-oriented communication technologies, which are not specifically limited in this application.
Fig. 2 shows an exemplary schematic configuration of a device 200. The device 200 may be used as the source device described above and may also be used as the destination device described above. As shown in fig. 2, the apparatus 200 includes: an application processor 201, a Microcontroller Unit (MCU) 202, a memory 203, a modem (modem)204, a Radio Frequency (RF) module 205, a Wireless-Fidelity (Wi-Fi) module 206, a bluetooth module 207, a sensor 208, an Input/Output (I/O) device 209, and a positioning module 210. These components may communicate over one or more communication buses or signal lines. The aforementioned communication bus or signal line may be a CAN bus as provided herein. Those skilled in the art will appreciate that the device 200 may include more or fewer components than illustrated, or some components may be combined, or a different arrangement of components.
The various components of the apparatus 200 are described in detail below with reference to fig. 2:
the application processor 201 is the control center of the device 200, and various components of the device 200 are connected using various interfaces and buses. In some embodiments, the processor 201 may include one or more processing units.
The memory 203 stores computer programs, such as the operating system 211 and the application program 212 shown in fig. 2. The application processor 201 is configured to execute the computer programs in the memory 203 to implement the functions defined by those programs; for example, the application processor 201 executes the operating system 211 to implement the various functions of the operating system on the device 200. The memory 203 also stores data other than computer programs, such as data generated while the operating system 211 and the application program 212 are running. The memory 203 generally includes internal memory and external storage (a non-volatile storage medium). The internal memory includes, but is not limited to, Random Access Memory (RAM), Read-Only Memory (ROM), or a cache. The external storage includes, but is not limited to, flash memory, a hard disk, an optical disc, a Universal Serial Bus (USB) flash drive, and the like. A computer program is typically stored in the external storage, and the processor loads it into the internal memory before executing it.
The memory 203 may be independent and connected to the application processor 201 through a bus; the memory 203 may also be integrated with the application processor 201 into a chip subsystem.
The MCU 202 is a co-processor configured to acquire and process data from the sensors 208. The processing power and power consumption of the MCU 202 are lower than those of the application processor 201, but the MCU 202 is "always on": it can continuously collect and process sensor data while the application processor 201 is in a sleep mode, so that normal operation of the sensors is guaranteed at extremely low power consumption. In one embodiment, the MCU 202 may be a sensor hub chip. The sensors 208 may include a light sensor and a motion sensor. Specifically, the light sensor may include an ambient light sensor, which adjusts the brightness of the display 2091 according to the ambient light level, and a proximity sensor, which turns off the power of the display when the device 200 is moved to the ear. As one type of motion sensor, an accelerometer can detect the magnitude of acceleration in all directions (generally three axes) and, when stationary, can detect the magnitude and direction of gravity. The sensors 208 may also include other sensors such as a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor, which are not described in detail herein. The MCU 202 and the sensors 208 may be integrated on the same chip or may be separate components connected by a bus.
The modem 204 and the radio frequency module 205 form the communication subsystem of the device 200 and are used to implement the main functions of wireless communication standard protocols. The modem 204 is used for encoding/decoding, signal modulation/demodulation, equalization, and the like. The radio frequency module 205 is used for receiving and transmitting wireless signals, and includes, but is not limited to, an antenna, at least one amplifier, a coupler, a duplexer, and the like. The radio frequency module 205 cooperates with the modem 204 to implement the wireless communication function. The modem 204 may be provided as a separate chip, or may be combined with other chips or circuits to form a system-on-chip or an integrated circuit. These chips or integrated circuits are applicable to all devices that implement a wireless communication function, including mobile phones, computers, notebooks, tablets, routers, wearable devices, automobiles, home appliances, and the like.
The device 200 may also use a Wi-Fi module 206, a bluetooth module 207, etc. for wireless communication. Wi-Fi module 206 is configured to provide device 200 with network access compliant with Wi-Fi-related standard protocols, where device 200 can access a Wi-Fi access point through Wi-Fi module 206 to access the Internet. In other embodiments, Wi-Fi module 206 can also act as a Wi-Fi wireless access point and can provide Wi-Fi network access to other devices. Bluetooth module 207 is used to enable short-range communication between device 200 and other devices (e.g., cell phones, smartwatches, etc.). The Wi-Fi module 206 in the embodiment of the present application can be an integrated circuit or a Wi-Fi chip, etc., and the Bluetooth module 207 can be an integrated circuit or a Bluetooth chip, etc.
The location module 210 is used to determine the geographic location of the device 200. It is understood that the positioning module 210 may specifically be a receiver of a Global Positioning System (GPS) or a positioning system such as the beidou satellite navigation system, russian GLONASS, and the like.
The Wi-Fi module 206, the bluetooth module 207, and the positioning module 210 may be separate chips or integrated circuits, respectively, or may be integrated together. For example, in one embodiment, the Wi-Fi module 206, the bluetooth module 207, and the positioning module 210 may be integrated onto the same chip. In another embodiment, the Wi-Fi module 206, the Bluetooth module 207, the positioning module 210 and the MCU 202 can also be integrated into the same chip.
Input/output devices 209 include, but are not limited to: a display 2091, a touch screen 2092, and an audio circuit 2093, etc.
The touch screen 2092 may capture touch events performed by a user on or near the device 200 (e.g., operations performed with a finger, a stylus, or any other suitable object on or near the touch screen 2092) and transmit the captured touch events to other components (e.g., the application processor 201). An operation performed by the user near the touch screen 2092 may be referred to as a floating touch; with a floating touch, the user may select, move, or drag an object (e.g., an icon) without directly contacting the touch screen 2092. In addition, the touch screen 2092 may be implemented using various types of technologies, such as resistive, capacitive, infrared, and surface acoustic wave.
The display 2091 is used to display information entered by the user or presented to the user. The display may be configured in the form of a liquid crystal display, an organic light emitting diode, or the like. The touch screen 2092 may be overlaid on the display 2091, and when a touch event is detected by the touch screen 2092, the touch event is transmitted to the application processor 201 to determine the type of touch event, and the application processor 201 may then provide a corresponding visual output on the display 2091 based on the type of touch event. Although in fig. 2, the touch screen 2092 and the display 2091 are shown as two separate components to implement the input and output functions of the device 200, in some embodiments, the touch screen 2092 may be integrated with the display 2091 to implement the input and output functions of the device 200. The touch screen 2092 and the display 2091 may be arranged on the front surface of the device 200 in a full panel configuration to realize a frameless structure.
The audio circuit 2093, a speaker 2094, and a microphone 2095 may provide an audio interface between the user and the device 200. On the one hand, the audio circuit 2093 may convert received audio data into an electrical signal and transmit it to the speaker 2094, which converts it into a sound signal for output; on the other hand, the microphone 2095 converts collected sound signals into electrical signals, which the audio circuit 2093 receives and converts into audio data, and the audio data is then transmitted to another device via the modem 204 and the radio frequency module 205, or output to the memory 203 for further processing.
In addition, the device 200 may also have a fingerprint recognition function. For example, the fingerprint acquisition device may be disposed on the back side of the device 200 (e.g., below the rear camera), or on the front side of the device 200 (e.g., below the touch screen 2092). Also for example, a fingerprint acquisition device may be configured within touch screen 2092 to perform a fingerprint identification function, i.e., the fingerprint acquisition device may be integrated with touch screen 2092 to perform a fingerprint identification function of device 200. In this case, the fingerprint acquisition device is disposed on the touch screen 2092, and may be a part of the touch screen 2092 or may be otherwise disposed on the touch screen 2092. The main component of the fingerprint acquisition device in the embodiments of the present application is a fingerprint sensor, which may employ any type of sensing technology, including but not limited to optical, capacitive, piezoelectric, or ultrasonic sensing technologies, etc.
Further, the operating system 211 carried by the device 200 may be the operating system shown in the original as an image [Figure BDA0002411452050000121] or another operating system, to which the embodiments of the present application do not impose any limitation.
Taking a device 200 carrying the operating system shown in the original as an image [Figure BDA0002411452050000122] as an example, the device 200 may be logically divided into a hardware layer, the operating system 211, and an application layer. The hardware layer includes hardware resources such as the application processor 201, the MCU 202, the memory 203, the modem 204, the Wi-Fi module 206, the sensors 208, and the positioning module 210 described above. The application layer includes one or more applications, such as the application 212, which may be any type of application, such as a social application, an e-commerce application, or a browser. The operating system 211, as software middleware between the hardware layer and the application layer, is a computer program that manages and controls hardware and software resources.
In one embodiment, the operating system 211 includes a kernel, a Hardware Abstraction Layer (HAL), libraries and runtimes, and a framework. The kernel is used to provide underlying system components and services, such as power management, memory management, thread management, and hardware drivers; the hardware drivers include a Wi-Fi driver, a sensor driver, a positioning module driver, and the like. The hardware abstraction layer encapsulates the kernel drivers, provides interfaces for the framework, and shields lower-layer implementation details. The hardware abstraction layer runs in user space, while the kernel drivers run in kernel space.
Libraries and runtimes, also called the runtime library, provide the library files and execution environment required by an executable program at runtime. In one embodiment, the libraries and runtimes include an Android Runtime (ART), libraries, and a scene package runtime. The ART is a virtual machine or virtual machine instance capable of converting the bytecode of an application into machine code. The libraries provide support for executable programs at runtime and include a browser engine (e.g., webkit), a script execution engine (e.g., a JavaScript engine), a graphics processing engine, and so on. The scene package runtime is the running environment of a scene package and mainly includes a page execution environment (page context) and a script execution environment (script context); the page execution environment parses page code in formats such as html and css by calling corresponding libraries, and the script execution environment parses and executes code or executable files implemented in scripting languages such as JavaScript by calling corresponding function libraries.
The framework is used to provide various underlying common components and services for applications in the application layer, such as window management, location management, and the like. In one embodiment, the framework includes a geo-fencing service, a policy service, a notification manager, and the like.
The functions of the various components of the operating system 211 described above may be implemented by the application processor 201 executing programs stored in the memory 203.
Those skilled in the art will appreciate that the apparatus 200 may include fewer or more components than those shown in fig. 2, and that the apparatus shown in fig. 2 includes only those components more pertinent to the various implementations disclosed herein.
Fig. 3 is a flowchart of a URL screen projection method according to a first embodiment of the present application. The process 300 may be performed by a source device, a destination device, and a media server. Process 300 is described as a series of steps or operations, it being understood that process 300 may be performed in various orders and/or concurrently, and is not limited to the order of execution shown in FIG. 3. As shown in fig. 3, the method of this embodiment may include:
step 301, the source device determines a first code stream and download indication information according to the play progress during screen projection switching.
In the present application, Sender refers to the initiating end of URL screen projection, such as a mobile phone, a tablet, or a computer, and Receiver refers to the receiving end of URL screen projection, such as a television, a set-top box, an AV amplifier, a ChromeCast, or a computer. Generally, the URL screen projection process starts when the Sender sends a screen-casting request to the Receiver; the Sender may carry the URL address of the video being played in the screen-casting request, and the Receiver downloads the video code stream corresponding to the URL address from the media server and plays it. The roles of Sender and Receiver are therefore fixed throughout the screen projection process. The source device (SourcePlayer) is the end that plays the video before screen-casting switching, and the destination device (DestPlayer) is the end that plays the video after screen-casting switching. When screen projection is started, the Sender hands over the right to play the video to the Receiver; in this case the SourcePlayer is the Sender and the DestPlayer is the Receiver. When screen projection is terminated, the Receiver returns the right to play the video to the Sender; in this case the SourcePlayer is the Receiver and the DestPlayer is the Sender. That is, SourcePlayer and DestPlayer are relative roles, and their mapping to Sender and Receiver switches according to the stage of screen projection.
Network media playback typically employs streaming media technology, in which media such as audio, video, or multimedia files are played continuously and in real time over a network. In streaming media technology, continuous media data is compressed and stored on the media server as media segments, and each media segment contains the code stream corresponding to a plurality of image frames. During playback, the media server transmits the media segments of the video to the user's terminal device sequentially or in real time, and the terminal device plays the video while downloading, without waiting for the entire video file to finish downloading. The terminal device creates a buffer area and downloads a section of the video code stream into the buffer before playback; when the actual network connection speed is lower than the playback speed, the player plays from the code stream in the buffer, thereby avoiding video stalling.
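As an illustration of the buffering behavior described above, the following is a minimal sketch; the class name, the 2-second pre-roll value, and the segment representation are assumptions made for this example and are not defined by this application.

```python
from collections import deque

class PlaybackBuffer:
    """A simple pre-roll buffer: playback starts only after enough seconds
    of code stream have been downloaded into the buffer."""
    def __init__(self, min_preroll_seconds: float = 2.0):
        self.segments = deque()              # (duration_seconds, payload) pairs
        self.min_preroll_seconds = min_preroll_seconds

    def push(self, duration_seconds: float, payload: bytes) -> None:
        self.segments.append((duration_seconds, payload))

    def buffered_seconds(self) -> float:
        return sum(d for d, _ in self.segments)

    def ready_to_play(self) -> bool:
        return self.buffered_seconds() >= self.min_preroll_seconds
```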
Therefore, in the process of playing a video, the source device does two things at the same time: playing and downloading to the cache. To ensure video continuity, the image frame being played by the source device is usually earlier in the timeline than the image frame being downloaded at the same moment; in other words, for the same image frame, the time at which it is downloaded and buffered is earlier than the time at which it is played. Fig. 4 shows an exemplary sequence diagram of image frames. As shown in fig. 4, Fc is the image frame being played by the source device at the time of screen-casting switching, Fd is the last image frame that the source device has downloaded at the time of screen-casting switching, Fk is the key frame corresponding to Fc, and Fn is the image frame following Fd. Fig. 5 shows another exemplary sequence diagram of image frames. As shown in fig. 5, Fc is the image frame being played by the source device at the time of screen-casting switching, Fd is the last image frame that the source device has downloaded at the time of screen-casting switching, Fk is the key frame corresponding to Fc, and Fn is the image frame following Fd; in this example, Fk and Fc belong to the same media segment Si, Fd and Fn belong to the same media segment Sj, Si and Sj are different media segments, and Si and Sj may be adjacent or separated by one or more other media segments, which is not specifically limited in this application. Fig. 6 shows yet another exemplary sequence diagram of image frames. As shown in fig. 6, Fc is the image frame being played by the source device at the time of screen-casting switching, Fd is the last image frame that the source device has downloaded at the time of screen-casting switching, Fk is the key frame corresponding to Fc, and Fn is the image frame following Fd; in this example, Fk and Fc belong to the same media segment Si, Fd belongs to a media segment Sl, Fn belongs to a media segment Sj, Si, Sl, and Sj are different media segments, Sl and Sj are necessarily adjacent, and Si and Sl may be adjacent or separated by one or more other media segments, which is not specifically limited in this application. It should be noted that, although Fk and Fc belong to the same media segment Si in the above examples, Fk and Fc may also belong to different media segments: when a media segment is small, it may not always contain a key frame. In addition, fig. 4-6 exemplarily show the timing relationship of the image frames that the source device has downloaded before screen-casting switching and the possible correspondence between image frames and media segments, but they are not a limitation on the video code stream referred to in this application.
To ensure video continuity, Fd is later than Fc, and it can be considered that the image frames from Fc to Fd have not been played by the source device. The code stream downloaded by the source device from the media server includes information of the image frame before Fc and information of the image frames from Fc to Fd.
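For illustration only, the following sketch shows how a source device might locate Fk, Fc, and Fd in its download buffer; the Frame structure and the helper function are assumptions made for this example and are not defined by this application.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class Frame:
    index: int        # position of the frame in the overall video timeline
    is_key: bool      # True if this frame is a key frame
    segment_id: int   # media segment the frame belongs to

def locate_switch_frames(downloaded: List[Frame], playing_index: int) -> Tuple[Frame, Frame, Frame]:
    """Return (Fk, Fc, Fd): the key frame corresponding to the frame being
    played, the frame being played, and the last downloaded frame."""
    fc = next(f for f in downloaded if f.index == playing_index)
    fd = downloaded[-1]  # last image frame already downloaded from the media server
    fk = max((f for f in downloaded if f.is_key and f.index <= fc.index),
             key=lambda f: f.index)
    return fk, fc, fd
```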
At the time of screen-casting switching, the source device is playing the image frame Fc; under the requirement of screen-casting time synchronization, the destination device should therefore start playing from the image frame Fc onward after the switch. The position from which the destination device starts playing the video can be specified by a seek instruction in the URL protocol. Alternatively, the present application may also use a new message, such as play time indication information, to indicate the image frame being played at the time of screen-casting switching.
The destination device starts playing from the image frame Fc, and also needs to go through the same process as the source device, that is, a video code stream is downloaded from the media server in advance as a cache before playing, the video is started to play by using the cache, and during the process of playing the video, the cache of the subsequent image frame is downloaded while playing. It should be understood that the destination device starts playing from the image frame Fc, and the buffer to be downloaded includes at least the code stream of the image frames from Fc to Fd.
In the application, the source end device can send the cached code stream of the source end device to the destination end device through a communication network between the source end device and the destination end device. Since the URL screen projection usually occurs between two devices located in the same lan, sending the code stream from the source device to the destination device can be realized by the lan, and the speed of transmitting data through the lan is faster and the time required is shorter. In the cache code stream sent by the source device to the destination device, in order to reduce the amount of transmitted data, the source device may send only the code stream of the image frame that has been downloaded from the media server and has not been played to the destination device, and this part of the code stream may be referred to as a first code stream.
In addition to the first code stream, the destination device needs to cache the subsequent code stream to ensure normal playback of the video; this part of the code stream may be referred to as the second code stream, and the destination device may download the second code stream from the media server. In the present application, the destination device cannot determine by itself from which image frame the second code stream should start, so the source device may determine download indication information by which it indicates the start frame of the second code stream.
Theoretically, the start frame of the first code stream should be Fc. However, according to media encoding and decoding technology and the role of key frames, if Fc is not a key frame, the destination device cannot decode the image corresponding to Fc from the information of Fc alone; it also needs the key frame Fk corresponding to Fc (Fk is the key frame that is earlier than and closest to Fc), and only with the information of Fk can the image corresponding to Fc be completely decoded. The same holds for the image frames after Fc. Therefore, even though Fk has already been played at the time of screen-casting switching, the key frame Fk corresponding to Fc must be sent to the destination device to ensure the information integrity of the first code stream, and the start frame of the first code stream may therefore be Fk. Furthermore, because the image frames before Fk have already been played on the source device at the time of screen-casting switching, and the destination device only needs to play from Fc to achieve screen-casting time synchronization, the image frames before Fk are not needed by the destination device. In summary, setting the start frame of the first code stream to Fk not only reduces the amount of data transmitted between the source device and the destination device, but also provides sufficient image information to ensure effective decoding.
The end frame of the first code stream may include two possible cases: (1) the end frame of the first code stream is the last image frame Fd that the source device has downloaded at the time of screen-casting switching. This means that the source device sends to the destination device all of the code stream that has been downloaded from the media server but not yet played, i.e., the code stream of the image frames from Fc to Fd. (2) The end frame of the first code stream is the last image frame Fe of the media segment to which Fc belongs. As described above for streaming media technology, continuous media data is compressed and stored as media segments, each containing a plurality of image frames. To ensure that the destination can achieve screen-casting time synchronization while matching the way the media data is stored, the first code stream sent by the source device to the destination device may end at Fe. Otherwise, if Fd and Fc belong to the same media segment and the source device has only downloaded up to Fd at the time of screen-casting switching, Fd and Fe are not necessarily the same image frame, and the source device would be unable to send Fe to the destination device.
Based on the storage format of the video, the device downloads the video code stream from the media server by taking the media fragments as units, and the video code stream is downloaded one by one in sequence according to time, so that no matter the device is a source end device or a destination end device, an initial frame of the downloaded code stream from the media server is necessarily the first image frame of a certain media fragment. Typically, the start frame of the bitstream downloaded by the source device from the media server is the first image frame of the first media segment of the video, and the start frame of the bitstream (second bitstream) to be downloaded by the destination device from the media server is related to the end frame of the first bitstream. The start frame of the second bitstream may be the first image frame of the media segment to which the next image frame of the end frame of the first bitstream belongs. For the case (1), the ending frame of the first code stream is the last image frame Fd that has been downloaded by the source device during screen-casting switching, and then the optimal state is that the second code stream starts from the next image frame Fn of Fd, but considering the media fragment processing on the media server, the starting frame of the second code stream may be the first image frame of the media fragment to which Fn belongs. At this time, there may be an overlap between the ending portion of the first code stream and the starting portion of the second code stream, for example, in fig. 5, Fd and Fn belong to the same media segment Sj, the ending frame of the first code stream is Fd, and the starting frame of the second code stream is the first image frame of the media segment to which Fn belongs, and the first image frame is either Fd or earlier than Fd, so that the first code stream and the second code stream have a partial overlap. For another example, in fig. 6, Fd belongs to a media slice Sl, Fn belongs to a media slice Sj, Sl and Sj are different media slices, and Fd is a previous image frame of Fn, so that an end frame Fd of a first code stream is exactly a last image frame of the media slice Sl, and Fn is exactly a first image frame of an adjacent media slice Sj, so that the first code stream and the second code stream do not overlap. For the case (2), the ending frame of the first code stream is the last image frame Fe of the media segment to which Fc belongs, and then the optimal state is that the second code stream starts from the next image frame of Fe, but considering the media segment processing on the media server, the starting frame of the second code stream may be the first image frame of the media segment to which the next image frame of Fe belongs. Since the ending frame of the first code stream is the last image frame Fe of the media segment, the next image frame of Fe is the first image frame of the adjacent media segment, and the first image frame is the starting frame of the second code stream, so the first code stream and the second code stream do not overlap.
Based on the advantages of local area network transmission, the source device in the present application transmits as much of its cached code stream as possible to the destination device over the local area network, so that the amount of data the destination device needs to download from the media server for the second code stream can be reduced. Therefore, the start frame of the second code stream may be the first image frame F1 of the media segment to which Fn belongs, where Fn is the image frame following the last image frame Fd that the source device has downloaded at the time of screen-casting switching. This is the latest possible start frame for the second code stream; if downloading started from a later media segment, screen-casting time synchronization could not be achieved. The end frame of the first code stream is related to the start frame of the second code stream, i.e., the end frame of the first code stream is the image frame F0 immediately preceding F1. In this case, the first code stream and the second code stream do not overlap.
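Continuing the sketch above, the following shows one possible way for the source device to select the first code stream and build the download indication information; the fixed number of frames per media segment and the field names are simplifying assumptions for illustration only and are not required by this application.

```python
def plan_streams(downloaded: List[Frame], fk: Frame, fd: Frame, frames_per_segment: int):
    """Select the first code stream (Fk..Fd) and point the download
    indication at the media segment containing Fn = Fd + 1, so that the
    second code stream starts from the first frame of that segment."""
    first_stream = [f for f in downloaded if fk.index <= f.index <= fd.index]
    fn_index = fd.index + 1                       # Fn, the frame after Fd
    fn_segment = fn_index // frames_per_segment   # media segment Fn belongs to
    download_indication = {
        "start_segment": fn_segment,
        "start_frame": fn_segment * frames_per_segment,  # first frame of that segment
    }
    return first_stream, download_indication
```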
Step 302, the source device sends download indication information to the destination device.
Step 303, the source device sends the first code stream to the destination device.
The source device may send the download indication information and the first code stream to the destination device through a communication network between the source device and the destination device.
The source device may send play time indication information to the destination device, where the play time indication information is used to indicate the image frame being played at the time of screen-casting switching. The purpose of sending the play time indication information is to inform the destination device of the source device's playback progress at the time of screen-casting switching, so that the destination device can determine from which frame to start playing the video according to that progress, thereby achieving screen-casting time synchronization on both sides.
In a possible implementation manner, the source device may encapsulate the first code stream according to a set encapsulation format to obtain format data, and then the source device sends the format data and control information to the destination device, where the control information includes the encapsulation format. The first bitstream may include image frames spanning multiple media slices, and the source device may encapsulate the first bitstream into one media slice, such as a Transport Stream (TS), a FMP4 slice, and so on, to simplify organization of the Transport data. Optionally, the source device may also divide the first code stream into a plurality of media segments for transmission, which is not specifically limited in this application. The packaging object in the application is a video code stream downloaded and cached from a media server, the packaging processing can improve the performance of screen projection time synchronization, and the packaging processing does not relate to encoding and decoding of images, so that the difficulty of an algorithm and the requirement on hardware can be reduced.
And step 304, the destination device downloads the second code stream from the media server.
The download instruction information indicates a start frame of the second code stream, so that the destination device downloads the code stream of the continuous image frames starting from the start frame from the media server according to the download instruction information. As previously mentioned, the starting frame of the second stream has been determined to be the first image frame of a certain media segment, so the destination device may request from the media server to start downloading from the media segment.
Optionally, the download indication information may include information of a media segment to which a start frame of the second code stream belongs, and the destination device may determine the media segment to which the start frame belongs according to the information, so as to determine the start frame.
Optionally, the download indication information may include information of a start frame of the second bitstream, and the destination device may directly determine the start frame according to the information, and further determine the media segment to which the start frame belongs.
The present application does not specifically limit the information included in the download instruction information.
And 305, the destination device plays the video according to the first code stream or the second code stream.
The destination device's reception of the first code stream and its download of the second code stream from the media server are two independent processes, so they can proceed simultaneously. Since the first code stream lies entirely before the second code stream in time, the destination device can start playing from the first code stream. While receiving the first code stream and the second code stream, once the amount of received first code stream data is sufficient for playback, the destination device can start playing the video from the playback progress at the time of screen-casting switching according to the first code stream, which helps improve the screen-casting switching response speed. By the time the first code stream has been played, the destination device has also accumulated enough of the second code stream, and it then continues playing the video according to the second code stream, ensuring smooth video playback.
In the above process, to achieve screen-casting time synchronization, the destination device may determine, in the first code stream or the second code stream, the first image frame to be played after screen-casting switching, and start playing the video from that image frame. This first image frame is the key to screen-casting time synchronization. As described above, for screen-casting time synchronization, the source device sends the destination device play time indication information indicating the image frame Fc being played at the time of screen-casting switching. According to the play time indication information, the destination device may first determine that the first image frame to be played is Fc, then play the video from Fc to the end frame of the first code stream according to the first code stream, and play the video from Fm onward according to the second code stream, where Fm is the image frame following the end frame of the first code stream.
As described above, the end portion of the first code stream and the start portion of the second code stream may overlap, or the two code streams may not overlap at all. If they do not overlap, the destination device starts playing according to the second code stream directly after playing the first code stream; since the end frame of the first code stream is the image frame immediately preceding the start frame of the second code stream, the destination device can switch seamlessly from the first code stream to the second code stream. If they overlap, the destination device may choose to play the overlapping portion according to the first code stream, i.e., play the video from Fc to the end frame of the first code stream according to the first code stream, and play the video from Fm onward according to the second code stream. It should be noted that the destination device may instead choose to play the overlapping portion according to the second code stream, i.e., play the video from Fc to the image frame immediately preceding the start frame of the second code stream according to the first code stream, discard the portion of the first code stream from the start frame of the second code stream to the end frame of the first code stream, and play the video from the start frame of the second code stream onward according to the second code stream. This is not specifically limited in the present application.
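A sketch of the playback handoff just described, preferring the first code stream in any overlapping portion (as noted above, the destination device may equally prefer the second code stream; this choice and the frame representation are assumptions for illustration):

```python
def merge_for_playback(first_stream: List[Frame], second_stream: List[Frame], fc_index: int) -> List[Frame]:
    """Build the destination device's playback sequence: start at Fc
    (frames Fk..Fc-1 are kept only for decoding), play the first code
    stream to its end frame, then continue from Fm in the second code stream."""
    end_of_first = first_stream[-1].index
    playback = [f for f in first_stream if f.index >= fc_index]
    # Fm is the frame after the end frame of the first code stream; frames of the
    # second code stream up to end_of_first are the overlapping portion and are skipped.
    playback += [f for f in second_stream if f.index > end_of_first]
    return playback
```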
Steps 301-305 mainly describe the screen-casting switching process when the total data amount of the first code stream is greater than or equal to a set threshold, where the set threshold indicates whether the total data amount of the first code stream is sufficient to support video playback. In a possible implementation, if at the time of screen-casting switching the amount of cached code stream that the source device has downloaded from the media server is small, i.e., less than the set threshold, then even if the cached code stream were sent to the destination device, the destination device would still need to download almost all of the code stream from the media server, or would be unable to play the video from the first code stream, or the cost in bandwidth consumption, transmission delay, and the like of transmitting the first code stream would be higher than the cost of downloading it directly from the media server. In this case, the source device may send data amount indication information to the destination device, where the data amount indication information is used to notify the destination device that the total data amount of the first code stream is less than the set threshold. The source device may then still choose to send the first code stream to the destination device or choose not to send it; this can be negotiated through higher-layer configuration or information exchange between the two parties, and is not described in detail here.
For the case that the source device sends the first code stream to the destination device, the destination device may download a fourth code stream from the media server, where an initial frame of the fourth code stream is a first frame of the media segment to which Fn belongs. Namely, the second code stream is downloaded from the first image frame of the media fragment to which the next frame of the ending frame of the first code stream belongs.
For the condition that the source device does not send the first code stream, the destination device may download a third code stream from the media server, where an initial frame of the third code stream is a first image frame of a media fragment to which an image frame Fc that the source device is playing during screen-casting switching belongs. Because the first code stream does not exist, in order to ensure the normal playing of the video and realize the screen-casting time synchronization, the destination device can start downloading the second code stream from the first image frame of the media fragment to which the Fc belongs.
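The branch just described can be summarized in a small sketch; the segment identifiers are illustrative only.

```python
def choose_download_start(first_stream_sent: bool, fc_segment: int, fn_segment: int) -> int:
    """If the source device sends its cached first code stream, the destination
    downloads from the media segment of Fn onward; otherwise it downloads from
    the media segment of Fc onward so that screen-casting time synchronization
    is still possible."""
    return fn_segment if first_stream_sent else fc_segment
```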
In the present application, at the time of screen-casting switching, the source device sends its cached code stream to the destination device. On the one hand, the cached code stream is transmitted over the local area network; based on the advantages of the local area network, including higher transmission bandwidth and better QOS, the cached code stream can be transmitted quickly and stably. On the other hand, the destination device receives the cached code stream from the source device and downloads the code stream that follows it from the media server, so it can quickly start playing the video based on the code stream it already has, improving the screen-casting switching response speed, while also ensuring a sufficient code stream reserve during playback and preventing the video from stalling.
It should be noted that, the above method embodiment describes a process in which, when switching a screen, the source device sends the first code stream and the download indication information to the destination device, so as to implement the screen switching. When the process is started to project a screen, the source end equipment can be a mobile phone, a tablet, a computer and the like which are used as a Sender, and the destination end equipment can be a television, a set-top box, an AV power amplifier, a ChromeCast, a computer and the like which are used as a Receiver. When the process is terminated, the source end device can be a television, a set-top box, an AV power amplifier, a ChromeCast, a computer and the like which are used as receivers, and the destination end device can be a mobile phone, a tablet, a computer and the like which are used as Senders. Therefore, the Sender and the Receiver can perform role switching at different stages of screen projection, and the application is not limited to this specifically.
Fig. 7 is a flowchart of a second embodiment of the URL screen projection method of the present application. The process 700 may be performed by a source device, a destination device, and a media server. Process 700 is described as a series of steps or operations, it being understood that process 700 may be performed in various orders and/or concurrently, and is not limited to the order of execution shown in FIG. 7. As shown in fig. 7, the method of this embodiment may include:
step 701, the user starts screen projection through the Sender.
The process may adopt any application program, video platform, player program and other technologies supporting the URL screen projection protocol, which is not specifically limited in this application.
Step 702, Sender sends screen projection request to Receiver.
The screen-casting request may include a URL address of a video to be played, a seek instruction, and the like, and informs the Receiver of an address of a currently played video and a playing progress of the current video.
And step 703, the Receiver sends a screen-casting response to the Sender.
If the Receiver correctly receives the screen casting request sent by the Sender, the Receiver can feed back an acknowledgement response (ACK), otherwise, the Receiver can feed back a Negative acknowledgement response (NAK).
And step 704, the Sender determines a first code stream and downloading indication information according to the current playing progress.
Step 704 may refer to step 301, which is not described herein.
Step 705, Sender sends media information to Receiver.
The information about the first code stream determined by the Sender includes its start frame and end frame, its encapsulation format, and its organization and storage manner. The Sender may send these pieces of information, together with the download indication information described above, to the Receiver as the media information. Optionally, the media information may be described in JavaScript Object Notation (JSON) or Extensible Markup Language (XML). It should be understood that the media information may also be described in other ways, which is not limited in this application.
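For illustration only, a possible JSON description of the media information is sketched below; all field names and values are hypothetical, since this application does not define a concrete schema.

```python
import json

media_info = {
    "firstStream": {
        "startFrame": 415,            # Fk, the key frame corresponding to Fc
        "endFrame": 523,              # end frame of the cached first code stream
        "container": "TS",            # encapsulation format of the first code stream
        "organization": "single-segment"
    },
    "downloadIndication": {
        "startSegment": 4             # media segment of Fn, where the second code stream starts
    },
    "playTime": {
        "playingFrame": 437           # Fc, used for screen-casting time synchronization
    }
}

print(json.dumps(media_info, indent=2))
```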
And 706, sending the first code stream to a Receiver by the Sender.
Step 706 may refer to step 303 above, and is not described herein again.
Step 707, the Receiver sends a download request to the media server.
The download request may include a start frame of the second code stream.
And step 708, the Receiver downloads the second code stream from the media server.
Step 708 may refer to step 304 described above, and is not described herein again.
And 709, when the data volume of the first code stream is enough, the Receiver starts to play the video.
While receiving the first code stream and the second code stream, once the amount of received first code stream data is sufficient for playback, for example 500 ms of video code stream, the Receiver can start playing the video from the playback progress at the time of screen-casting switching according to the first code stream, which helps improve the screen-casting switching response speed.
Step 709 may refer to step 305 described above, and is not described herein again.
And step 710, the user terminates the screen projection through the Sender.
The process may adopt any application program, video platform, player program and other technologies supporting the URL screen projection protocol, which is not specifically limited in this application.
Optionally, the user may also terminate screen projection through the Receiver, and similarly, any application program, video platform, player program and other technologies supporting the URL screen projection protocol may also be adopted, which is not specifically limited in this application.
And step 711, sending a screen projection termination request to the Receiver by the Sender.
And step 712, the Receiver sends a screen-projection termination response to the Sender.
The screen-casting termination response can include a seek instruction and the like, and informs the Sender of the playing progress of the current video.
And 713, determining a first code stream and downloading indication information by the Receiver according to the current playing progress.
Step 713 may refer to step 301 described above, and will not be described herein.
Step 714, Receiver sends media information to Sender.
Step 714 may refer to step 705 above, and will not be described herein.
And 715, the Receiver sends the first code stream to the Sender.
Step 715 may refer to step 303, which is not described herein again.
Step 716, Sender sends download request to media server.
The download request may include a start frame of the second code stream.
Step 717, Sender downloads the second code stream from the media server.
Step 717 may refer to step 304 above, and will not be described herein.
Step 718, when the data volume of the first code stream is enough, the Sender starts playing the video.
Similarly, while receiving the first code stream and the second code stream, once the amount of received first code stream data is sufficient for playback, the Sender can start playing the video from the playback progress at the time of screen-casting switching according to the first code stream, which helps improve the screen-casting switching response speed.
Step 718 can refer to step 305 described above, and is not described herein again.
It should be noted that the embodiments shown in fig. 3 and fig. 7 are a conventional flow of the URL screen projection method provided by the present application, and may be used as an extension to the relevant screen projection protocol, that is, an extension is made on the basis of the relevant screen projection protocol to implement the URL screen projection method provided by the present application. It should be understood that abnormal situations specified in the relevant protocol, or other processes not involved in the embodiments of the present application, may also be supported and compatible by the URL screen projection method provided by the present application.
For example, one or both of the Sender and the Receiver may not support the URL screen projection method provided by the present application. If the Receiver does not support it, the Receiver can notify the Sender through the screen-casting response in step 703; if the Sender does not support it, the Sender simply does not send the first code stream to the Receiver. If either party does not support the URL screen projection method, the Sender and the Receiver execute the screen projection process of the related technology, so the URL screen projection method provided by the present application is compatible with the related screen projection methods.
For another example, after the Receiver starts playing the video, the current playing state (including the playing position) is periodically reported to the Sender, and if the user adjusts the playing progress by dragging, the Receiver may also report to the Sender.
For another example, the Sender starts to cast the screen, the Receiver already starts to play the video, and at this time, the user closes the player, the application program and the like on the Sender, so that the user cannot operate on the Sender to terminate the screen casting. In this case, the user may terminate the screen-casting by the Receiver, or may terminate the screen-casting by re-opening the player, the application, or the like on the Sender.
For another example, the first code stream may be actively sent to the destination device by the source device, or the destination device may request to pull from the source device after receiving the screen-casting switching request. This is not a particular limitation of the present application.
For example, the transmission of the first code stream may adopt the Transmission Control Protocol (TCP), the Hypertext Transfer Protocol (HTTP), WebSocket, or various proprietary protocols, which is not specifically limited in this application.
For another example, the above related protocols may include URL screen projection protocols such as DLNA, AirPlay, GoogleCast, etc., and the URL screen projection method provided by the present application makes extensions to these protocols to allow for deployment at both Sender and Receiver. Different protocols can adopt different extension schemes according to the characteristics of the protocols so as to realize the compatibility extension of the protocols.
The following describes, using the WebSocket protocol as an example, the key points of transmitting the first code stream in the URL screen projection method provided by the present application. Although using TCP directly would be simpler, more and more players now run in browsers as web applications (WebApp), and a WebApp cannot directly access the standard TCP interface.
The Sender always acts as the WebSocket client and actively initiates the WebSocket connection. The Receiver always acts as the WebSocket server and passively listens for connections. The WebSocket server port is, as far as possible, selected dynamically by the Receiver and reported to the Sender. This approach relies on one premise: when responding to the Sender's screen-casting request, the Receiver must be able to extend the screen-casting response message in a compatible way and carry the port it selected. DLNA initiates a screen-casting request through SOAP SetAVTransportURI, and the screen-casting response is in XML format, which can be extended compatibly. AirPlay initiates a screen-casting request through HTTP POST /play, and the screen-casting response has no HTTP body, so an HTTP body or an HTTP header can be added to achieve a compatible extension. GoogleCast is special: after the Sender initiates a screen-casting request through the SDK, the two ends already establish a WebSocket connection and exchange messages through the SDK. That WebSocket connection can be reused directly to send the first code stream. In the GoogleCast case, the WebSocket connection therefore does not need to be established in the manner described below, but is already established once screen projection starts.
When the Sender initiates screen projection, the screen-casting request carries extension parameters indicating that the Sender supports this scheme. If the Receiver supports this scheme, it can recognize the extension parameters in the screen-casting request; before responding to the Sender, it starts the WebSocket server and then sends the screen-casting response to the Sender, carrying the WebSocket server port. After recognizing the WebSocket server port in the Receiver's response, the Sender initiates a WebSocket connection request. The WebSocket server port returned by the Receiver is valid for one complete screen projection process, during which the Receiver keeps listening on that port and waiting for connections. One complete screen projection process starts when the Sender initiates a screen-casting request and ends after the user terminates screen projection through the Sender or the Receiver and the reverse transmission of the first code stream is completed (or times out, or the same Sender initiates a new screen-casting request).
After detecting the screen-casting termination request, the Receiver waits for the Sender to fetch the first code stream cached on the Receiver and starts a timer. When the transmission of the first code stream is completed, or the timer expires, or a new screen-casting request initiated by the same Sender is detected, the Receiver determines that the current screen projection process has ended, terminates the WebSocket server, and the corresponding port immediately becomes invalid. Whether a long or short connection is maintained is decided by the Sender. A long connection means that the Sender does not disconnect the WebSocket connection after it finishes sending the first code stream; when screen projection is terminated, that connection is used to fetch the reverse first code stream, and the connection is terminated after completion. A short connection means that the Sender disconnects the WebSocket connection immediately after it finishes sending the first code stream; when screen projection is terminated, the Sender re-establishes a WebSocket connection with the Receiver to fetch the reverse first code stream, and terminates the connection after completion.
The present application defines new messages for transmitting the media information and the first code stream. These messages may have a uniform fixed-length message header carrying a message type identifier and a response code, followed by an optional message body. Several functions of the new messages are given below by way of example; it should be understood that these new messages are merely examples of the extensions to the related protocols required by the URL screen projection method provided by the present application, and do not constitute a limitation.
PutCacheMeta: when screen projection is started, after the Sender establishes the WebSocket connection with the Receiver, the Sender sends a PutCacheMeta message whose message body carries the media information.
PutCacheMedia: after receiving the Receiver's ACK to PutCacheMeta, the Sender sends a PutCacheMedia message, which carries the first code stream.
GetCacheMeta: when screen projection is terminated, the Sender sends a GetCacheMeta message, and the Receiver carries the media information in its response.
GetCacheMedia: when screen projection is terminated, the Sender sends a GetCacheMedia message, and the Receiver replies with the first code stream.
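To make the message framing and the start-of-screen-projection flow concrete, here is a minimal sketch; the header layout, the message type codes, the port number, and the use of the third-party Python websockets package are all assumptions for illustration and are not specified by this application.

```python
import asyncio
import json
import struct
import websockets

MSG_TYPES = {"PutCacheMeta": 1, "PutCacheMedia": 2, "GetCacheMeta": 3, "GetCacheMedia": 4}
HEADER = struct.Struct("!BBI")  # message type (1 byte), response code (1 byte), body length (4 bytes)

def pack_message(msg_type: str, body: bytes = b"", resp_code: int = 0) -> bytes:
    """Uniform fixed-length header followed by an optional message body."""
    return HEADER.pack(MSG_TYPES[msg_type], resp_code, len(body)) + body

async def put_cache(receiver_ip: str, ws_port: int, media_info: dict, first_stream: bytes) -> None:
    """Sender side at the start of screen projection: send PutCacheMeta with the
    media information, wait for the Receiver's ACK, then send PutCacheMedia with
    the cached first code stream."""
    async with websockets.connect(f"ws://{receiver_ip}:{ws_port}") as ws:
        await ws.send(pack_message("PutCacheMeta", json.dumps(media_info).encode()))
        await ws.recv()  # ACK (or NAK) to PutCacheMeta
        await ws.send(pack_message("PutCacheMedia", first_stream))

# Example call (all values are placeholders):
# asyncio.run(put_cache("192.168.1.20", 38080, {"downloadIndication": {"startSegment": 4}}, b"..."))
```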
Based on the flowchart shown in fig. 7, in conjunction with the above-mentioned several new messages, fig. 8 shows a possible embodiment of the URL screen projection method of the present application. Fig. 8 is a flowchart of a third embodiment of the URL screen projection method according to the present application. The process 800 may be performed by a source device, a destination device, and a media server. Process 800 is described as a series of steps or operations, it being understood that process 800 may be performed in various orders and/or concurrently, and is not limited to the order of execution shown in fig. 8. As shown in fig. 8, the method of this embodiment may include:
step 801, a user starts screen projection through a Sender.
And step 802, sending a screen projection request to a Receiver by the Sender.
And step 803, the Receiver sends a screen-casting response to the Sender.
If the Receiver correctly receives the screen casting request sent by the Sender, the Receiver can feed back an acknowledgement response (ACK), otherwise, the Receiver can feed back a Negative acknowledgement response (NAK). The Receiver can carry the extension information such as WebSocket Server Port in the ACK.
And step 804, the Sender requests the Receiver to establish the WebSocket connection.
Step 805, Receiver sends a connection establishing response to Sender.
And if the Receiver successfully establishes the connection, replying ACK, otherwise, replying NAK.
And 806, determining the first code stream and the downloading indication information by the Sender according to the current playing progress.
Step 807, Sender sends PutCacheMeta message to Receiver.
The PutCacheMeta message carries media information.
Step 808, Sender sends PutCacheMedia message to Receiver.
The PutCacheMedia message carries a first code stream.
Step 809, Receiver sends a download request to the media server.
Step 810, the Receiver downloads the second code stream from the media server.
Step 811, when the data volume of the first code stream is sufficient, the Receiver starts playing the video.
Step 812, the user terminates screen casting through the Sender.
Step 813, the Sender sends a screen-casting termination request to the Receiver.
Step 814, the Receiver sends a screen-casting termination response to the Sender.
Step 815, the Receiver determines the first code stream and the download indication information according to the current playing progress.
Step 816, Sender sends GetCacheMeta message to Receiver.
Step 817, the Receiver carries the media information in the response message.
Step 818, Sender sends GetCacheMedia message to Receiver.
Step 819, the Receiver sends the first code stream to the Sender.
Step 820, Sender sends download request to media server.
Step 821, Sender downloads the second code stream from the media server.
Step 822, when the data volume of the first code stream is sufficient, the Sender starts playing the video.
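A Sender-side sketch of the termination part of this flow (steps 816 to 822) is given below; it assumes the third-party websocket-client package and reuses the illustrative pack_message/unpack_message helpers from the header sketch above. The URL, the next_segment field in the media information, and the media_server object are assumptions made only for illustration.

import json
from websocket import create_connection  # third-party package: websocket-client

def take_back_playback(receiver_ws_url, media_server):
    """Illustrative Sender-side handling of screen-casting termination."""
    ws = create_connection(receiver_ws_url)  # re-established here if a short connection was used
    try:
        # Steps 816-817: request the media information from the Receiver.
        ws.send_binary(pack_message("GetCacheMeta"))
        _type, _code, meta_body = unpack_message(ws.recv())
        meta = json.loads(meta_body)

        # Steps 818-819: fetch the cached first code stream back from the Receiver.
        ws.send_binary(pack_message("GetCacheMedia"))
        _type, _code, first_stream = unpack_message(ws.recv())

        # Steps 820-821: download the second code stream from the media server,
        # starting at the segment indicated by the download indication information.
        second_stream = media_server.download(start_segment=meta["next_segment"])

        # Step 822: the Sender starts playback once enough of the first code stream is buffered.
        return first_stream, second_stream
    finally:
        ws.close()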
The URL screen projection method of this application has been described above; the apparatus of this application is described below. The apparatus of this application includes a screen projection apparatus applied to a source device and a screen projection apparatus applied to a destination device. It should be understood that the screen projection apparatus applied to the source device is the source device in the above method and has any function of the source device in the method, and that the screen projection apparatus applied to the destination device is the destination device in the above method and has any function of the destination device in the method.
As shown in fig. 9, the screen projection apparatus applied to the destination device includes: a receiving module 901, a playing module 902 and a decoding module 903. The receiving module 901 is configured to receive a first code stream sent by a source device, where the first code stream is a code stream downloaded from a media server by the source device before screen-casting switching; the receiving module 901 is further configured to receive download indication information sent by the source device, and download a second code stream from the media server according to the download indication information, where a start frame of the second code stream is indicated by the download indication information, and the start frame of the second code stream is related to an end frame of the first code stream; a playing module 902, configured to play a video according to the first code stream or the second code stream.
In a possible implementation manner, the start frame of the first code stream is a key frame Fk corresponding to an image frame Fc being played by the source device during screen projection switching.
In a possible implementation manner, the ending frame of the first code stream is the last image frame Fd that has been downloaded by the source device during the screen-casting switching.
In a possible implementation manner, an image frame being played by the source device during screen-casting switching is Fc, a last image frame downloaded by the source device during screen-casting switching is Fd, Fd and Fc belong to different media fragments, each media fragment includes a plurality of image frames, and an end frame of the first code stream is a last image frame of a media fragment to which Fc belongs.
In a possible implementation manner, the starting frame of the second code stream is related to the ending frame of the first code stream, and specifically includes: the start frame of the second code stream is the first image frame of the media fragment to which the next image frame of the end frame of the first code stream belongs.
In a possible implementation manner, the image frame being played by the source device at the time of screen projection switching is Fc, the last image frame that has been downloaded by the source device at the time of screen projection switching is Fd, the next image frame of Fd is Fn, Fn and Fc belong to different media fragments, the start frame of the second code stream is the first image frame F1 of the media fragment to which Fn belongs, and the end frame of the first code stream is the image frame F0 preceding F1.
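The frame relationships described in these implementations can be illustrated with simple index arithmetic over fixed-length media fragments, as in the Python sketch below; the fixed fragment length, the frame-index representation, and the function names are hypothetical.

def first_stream_range(fc, fd, keyframes, fragment_len):
    """Return (start, end) frame indices of the first code stream.

    fc: frame being played at switching time; fd: last downloaded frame;
    keyframes: sorted key-frame indices; fragment_len: frames per media fragment.
    """
    # Start frame: the key frame Fk corresponding to Fc (latest key frame not after Fc).
    fk = max(k for k in keyframes if k <= fc)
    if fd // fragment_len == fc // fragment_len:
        end = fd  # Fd is in the same fragment as Fc: end at the last downloaded frame.
    else:
        # Fd lies in a later fragment: end at the last frame of the fragment to which Fc belongs.
        end = (fc // fragment_len + 1) * fragment_len - 1
    return fk, end

def second_stream_start(first_stream_end, fragment_len):
    # The second code stream starts at the first frame of the fragment to which
    # the frame following the first code stream's end frame belongs.
    fn = first_stream_end + 1
    return (fn // fragment_len) * fragment_len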
In a possible implementation manner, the playing module 902 is specifically configured to determine, from the first code stream or the second code stream, a first frame image frame to be played after screen projection switching, and start playing the video from the first frame image frame.
In a possible implementation manner, the receiving module 901 is further configured to receive playing time indication information sent by the source device, where the playing time indication information is used to indicate the image frame Fc being played by the source device at the time of screen-casting switching; and the playing module is further configured to: when the data volume of the first code stream is greater than a set threshold, determine from the first code stream, according to the playing time indication information, that the first image frame to be played is Fc; play the video from Fc to the end frame of the first code stream according to the first code stream; and play the video from Fm according to the second code stream, where Fm is the image frame following the end frame of the first code stream.
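A minimal sketch of this playback decision on the destination device follows; the frame-count threshold, the in-memory frame lists, and the render placeholder are assumptions, and only the case in which the cached data exceeds the threshold is shown in full.

def render(frame):
    """Placeholder for an actual decode/render pipeline."""

def start_playback(first_stream, second_stream, fc_index, first_stream_start, threshold_frames=50):
    """Illustrative destination-side playback after screen-casting switching.

    first_stream / second_stream: frames in play order; fc_index: absolute index of
    the frame Fc named in the playing time indication information;
    first_stream_start: absolute index of the first frame of the first code stream.
    """
    if len(first_stream) > threshold_frames:
        # Enough cached data: start at Fc inside the first code stream, play it to its
        # end frame, then continue from Fm (the next frame) in the second code stream.
        offset = fc_index - first_stream_start
        playlist = first_stream[offset:] + second_stream
    else:
        # Below the threshold the later implementations download a third or fourth
        # code stream from the media server instead; only the simple case is shown here.
        playlist = second_stream
    for frame in playlist:
        render(frame)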
In a possible implementation manner, the receiving module 901 is specifically configured to receive the first code stream sent by the source device when a total data amount of the first code stream is not less than a set threshold.
In a possible implementation manner, the receiving module 901 is further configured to receive the data amount indication information sent by the source device when a total data amount of the first code stream is smaller than the set threshold; and downloading a third code stream from the media server, wherein the starting frame of the third code stream is the first image frame of the media fragment to which the image frame Fc which is played by the source end equipment during screen projection switching belongs.
In a possible implementation manner, the receiving module 901 is further configured to receive the data amount indication information and the first code stream sent by the source end device when a total data amount of the first code stream is smaller than the set threshold; and downloading a fourth code stream from the media server, wherein the last image frame of the first code stream is Fd, the next image frame of the Fd is Fn, and the initial frame of the fourth code stream is the first frame of the media fragment to which the Fn belongs.
In a possible implementation manner, the receiving module 901 is further configured to receive control information sent by the source device, where the control information includes an encapsulation format of the first code stream, and to receive format data sent by the source device; and the decoding module 903 is configured to decapsulate the format data according to the encapsulation format to obtain the first code stream.
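A sketch of how the encapsulation format named in the control information might drive de-encapsulation is shown below; the format identifiers and the trivial length-prefixed container are illustrative only and are not formats defined by this application.

import struct

def deencapsulate(format_name, format_data):
    """Illustrative de-encapsulation keyed by the format carried in the control information."""
    if format_name == "raw":
        return format_data  # no container: the format data is the first code stream itself
    if format_name == "length-prefixed":
        # Assumed trivial container: a 4-byte big-endian length followed by the payload.
        (length,) = struct.unpack(">I", format_data[:4])
        return format_data[4:4 + length]
    raise ValueError("unsupported encapsulation format: " + format_name)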
The apparatus of this embodiment may be configured to execute the technical solution of any one of the method embodiments shown in fig. 3 to fig. 8, and the implementation principle and the technical effect are similar, which are not described herein again.
As shown in fig. 10, the screen projection apparatus applied to the source device includes: a processing module 1001, a sending module 1002 and an encapsulating module 1003. The processing module 1001 is configured to determine a first code stream and download indication information according to a play progress at the time of screen-casting switching, where the first code stream is a code stream downloaded from a media server before screen-casting switching, the download indication information is used to indicate a start frame of a second code stream, the second code stream is a code stream to be downloaded from the media server by a destination device, and the start frame of the second code stream is related to an end frame of the first code stream; a sending module 1002, configured to send the download indication information and the first code stream to the destination device.
In a possible implementation manner, the starting frame of the first code stream is a key frame Fk corresponding to the image frame Fc being played during screen projection switching.
In a possible implementation manner, the ending frame of the first code stream is the last image frame Fd that has been downloaded during screen-casting switching.
In a possible implementation manner, the image frame being played during screen-casting switching is Fc, the last image frame that has been downloaded during screen-casting switching is Fd, Fd and Fc belong to different media fragments, each media fragment includes a plurality of image frames, and the end frame of the first code stream is the last image frame of the media fragment to which Fc belongs.
In a possible implementation manner, the starting frame of the second code stream is related to the ending frame of the first code stream, and specifically includes: the start frame of the second code stream is the first image frame of the media fragment to which the next image frame of the end frame of the first code stream belongs.
In a possible implementation manner, the image frame being played during the screen-casting switching is Fc, the last image frame that has been downloaded during the screen-casting switching is Fd, the next image frame of Fd is Fn, Fn and Fc belong to different media segments, the start frame of the second code stream is the first image frame F1 of the media segment to which Fn belongs, and the end frame of the first code stream is the image frame F0 preceding F1.
In a possible implementation manner, the sending module 1002 is further configured to send playing time indication information to the destination device, where the playing time indication information is used to indicate an image frame Fc being played during screen switching.
In a possible implementation manner, the sending module 1002 is further configured to send, when a total data amount of the first code stream is smaller than a set threshold, data amount indicating information to the destination device, where the data amount indicating information is used to notify that the total data amount of the first code stream is smaller than the set threshold.
In a possible implementation manner, the encapsulating module 1003 is configured to encapsulate the first code stream according to a set encapsulation format to obtain format data; the sending module 1002 is specifically configured to send control information and the format data to the destination device, where the control information includes the encapsulation format.
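The source-side counterpart can be sketched as follows; it reuses the hypothetical length-prefixed container above and the pack_message helper from the header sketch, and the choice to carry the control information in the PutCacheMeta body is purely illustrative.

import json
import struct

def encapsulate(first_stream):
    """Illustrative source-side encapsulation of the first code stream."""
    control_info = {"encapsulation_format": "length-prefixed"}  # assumed format name
    format_data = struct.pack(">I", len(first_stream)) + first_stream
    return control_info, format_data

def send_first_stream(ws, first_stream):
    control_info, format_data = encapsulate(first_stream)
    # Control information (including the encapsulation format) is sent before the data.
    ws.send_binary(pack_message("PutCacheMeta", body=json.dumps(control_info).encode()))
    ws.send_binary(pack_message("PutCacheMedia", body=format_data))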
The apparatus of this embodiment may be configured to execute the technical solution of any one of the method embodiments shown in fig. 3 to fig. 8, and the implementation principle and the technical effect are similar, which are not described herein again.
In implementation, the steps of the above method embodiments may be performed by integrated logic circuits of hardware in a processor or by instructions in the form of software. The steps of the methods disclosed in this application may be directly implemented by a hardware encoding processor, or implemented by a combination of hardware and software modules in an encoding processor. The software module may be located in a storage medium well known in the art, such as a random access memory (RAM), a flash memory, a read-only memory (ROM), a programmable ROM, an electrically erasable programmable ROM, or a register. The storage medium is located in a memory, and the processor reads the information in the memory and completes the steps of the above methods in combination with its hardware.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the several embodiments provided in the present application, it should be understood that the disclosed system, apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present application or portions thereof that substantially contribute to the prior art may be embodied in the form of a software product stored in a storage medium and including instructions for causing a computer device (personal computer, server, network device, or the like) to execute all or part of the steps of the method according to the embodiments of the present application.
The above description is only for the specific embodiments of the present application, but the scope of the present application is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present application, and shall be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (44)

1. A URL screen projection method is characterized by comprising the following steps:
receiving a first code stream sent by a source end device, wherein the first code stream is a code stream downloaded from a media server by the source end device before screen-casting switching;
receiving download indication information sent by the source end equipment;
downloading a second code stream from the media server, wherein the starting frame of the second code stream is indicated by the downloading indication information, and the starting frame of the second code stream is related to the ending frame of the first code stream;
and playing the video according to the first code stream or the second code stream.
2. The method of claim 1, wherein a start frame of the first code stream is a key frame Fk corresponding to an image frame Fc being played by the source device during screen-projection switching.
3. The method according to claim 1 or 2, wherein the end frame of the first code stream is the last image frame Fd that has been downloaded by the source device at the time of screen-casting switching.
4. The method according to claim 1 or 2, wherein the image frame being played by the source device at the time of screen switching is Fc, the last image frame that has been downloaded by the source device at the time of screen switching is Fd, Fd and Fc belong to different media segments, each media segment contains multiple image frames, and the end frame of the first bitstream is the last image frame of the media segment to which Fc belongs.
5. The method according to any one of claims 1 to 4, wherein the starting frame of the second code stream is related to the ending frame of the first code stream, and specifically comprises:
and the starting frame of the second code stream is the first image frame of the media fragment to which the next image frame of the ending frame of the first code stream belongs.
6. The method of claim 1 or 2, wherein the image frame being played by the source device at the time of screen switching is Fc, the last image frame that has been downloaded by the source device at the time of screen switching is Fd, the next image frame of Fd is Fn, Fn and Fc belong to different media segments, the starting frame of the second bitstream is a first image frame F1 of the media segment to which Fn belongs, and the ending frame of the first bitstream is the image frame F0 preceding F1.
7. The method according to any one of claims 1 to 6, wherein the playing the video according to the first bitstream or the second bitstream specifically includes:
and determining a first frame image frame to be played after screen projection switching from the first code stream or the second code stream, and starting to play the video from the first frame image frame.
8. The method according to claim 7, wherein before playing the video according to the first bitstream or the second bitstream, the method further comprises:
receiving play time indication information sent by the source end equipment, wherein the play time indication information is used for indicating an image frame Fc played by the source end equipment during screen projection switching;
the playing the video according to the first code stream or the second code stream includes:
when the data volume of the first code stream is larger than a set threshold value, determining that the first frame image frame to be played is Fc from the first code stream according to the playing time indication information, playing a video from Fc to an end frame of the first code stream according to the first code stream, and playing a video from Fm according to the second code stream, wherein Fm is a next image frame of the end frame of the first code stream.
9. The method according to any one of claims 1 to 8, wherein the receiving the first code stream sent by the source end device includes:
and when the total data volume of the first code stream is not less than a set threshold, receiving the first code stream sent by the source end device.
10. The method of claim 9, further comprising:
when the total data volume of the first code stream is smaller than the set threshold, receiving the data volume indication information sent by the source end device;
and downloading a third code stream from the media server, wherein a starting frame of the third code stream is a first image frame of a media fragment to which an image frame Fc being played by the source end device at the time of screen projection switching belongs.
11. The method of claim 9, further comprising:
when the total data volume of the first code stream is smaller than the set threshold, receiving the data volume indication information and the first code stream sent by the source end device;
and downloading a fourth code stream from the media server, wherein the last image frame of the first code stream is Fd, the next image frame of the Fd is Fn, and the initial frame of the fourth code stream is the first frame of the media fragment to which the Fn belongs.
12. The method according to any of claims 1-11, wherein before receiving the first codestream sent by the source device, further comprising:
receiving control information sent by the source end device, wherein the control information comprises a packaging format of the first code stream;
the receiving the first code stream sent by the source end device includes:
and receiving format data sent by the source end equipment, and de-encapsulating the format data according to the encapsulation format to obtain the first code stream.
13. A URL screen projection method is characterized by comprising the following steps:
determining a first code stream and downloading indication information according to a playing progress during screen projection switching, wherein the first code stream is downloaded from a media server before screen projection switching, the downloading indication information is used for indicating a starting frame of a second code stream, the second code stream is a code stream to be downloaded from the media server by destination equipment, and the starting frame of the second code stream is related to an ending frame of the first code stream;
and sending the downloading indication information and the first code stream to the destination terminal equipment.
14. The method of claim 13, wherein a starting frame of the first code stream is a key frame Fk corresponding to an image frame Fc being played during screen-casting switching.
15. The method according to claim 13 or 14, wherein the ending frame of the first code stream is the last image frame Fd that has been downloaded at the time of screen-casting switching.
16. The method of claim 13 or 14, wherein the image frame being played at the time of the screen switching is Fc, the last image frame that has been downloaded at the time of the screen switching is Fd, Fd and Fc belong to different media segments, each media segment contains a plurality of image frames, and the end frame of the first bitstream is the last image frame of the media segment to which Fc belongs.
17. The method according to any one of claims 13 to 16, wherein the starting frame of the second code stream is related to the ending frame of the first code stream, and specifically comprises:
and the starting frame of the second code stream is the first image frame of the media fragment to which the next image frame of the ending frame of the first code stream belongs.
18. The method of claim 13 or 14, wherein the image frame being played at the time of the screen switching is Fc, the last image frame that has been downloaded at the time of the screen switching is Fd, the next image frame of Fd is Fn, Fn and Fc belong to different media segments, the start frame of the second bitstream is the first image frame F1 of the media segment to which Fn belongs, and the end frame of the first bitstream is the image frame F0 preceding F1.
19. The method according to any one of claims 13 to 18, wherein before the sending the download indication information and the first codestream to the destination device, further comprising:
and sending playing time indication information to the destination terminal equipment, wherein the playing time indication information is used for indicating the image frame Fc which is being played during screen projection switching.
20. The method of any one of claims 13-19, further comprising:
and when the total data volume of the first code stream is smaller than a set threshold, sending data volume indicating information to the destination device, wherein the data volume indicating information is used for informing that the total data volume of the first code stream is smaller than the set threshold.
21. The method according to any one of claims 13 to 20, wherein before the sending the download indication information and the first code stream to the destination device, the method further includes:
packaging the first code stream according to a set packaging format to obtain format data;
and sending control information and the format data to the destination terminal equipment, wherein the control information comprises the packaging format.
22. A screen projection apparatus, comprising:
the receiving module is used for receiving a first code stream sent by a source end device, wherein the first code stream is a code stream downloaded from a media server by the source end device before screen-casting switching;
the receiving module is further configured to receive download indication information sent by the source end device, and download a second code stream from the media server according to the download indication information, where a start frame of the second code stream is related to an end frame of the first code stream;
and the playing module is used for playing the video according to the first code stream or the second code stream.
23. The apparatus of claim 22, wherein a start frame of the first code stream is a key frame Fk corresponding to an image frame Fc being played by the source device during screen-projection switching.
24. The apparatus of claim 22 or 23, wherein the end frame of the first bitstream is a last image frame Fd that has been downloaded by the source device at the time of screen-casting switching.
25. The apparatus of claim 22 or 23, wherein the image frame being played by the source device at the time of screen-casting switching is Fc, the last image frame that has been downloaded by the source device at the time of screen-casting switching is Fd, Fd and Fc belong to different media segments, each media segment contains multiple image frames, and the end frame of the first bitstream is the last image frame of the media segment to which Fc belongs.
26. The apparatus according to any one of claims 22 to 25, wherein the starting frame of the second code stream is related to the ending frame of the first code stream, and specifically comprises:
and the starting frame of the second code stream is the first image frame of the media fragment to which the next image frame of the ending frame of the first code stream belongs.
27. The apparatus of claim 22 or 23, wherein the image frame being played by the source device at the time of screen-casting switching is Fc, the last image frame that has been downloaded by the source device at the time of screen-casting switching is Fd, the next image frame of Fd is Fn, Fn and Fc belong to different media segments, the start frame of the second bitstream is a first image frame F1 of the media segment to which Fn belongs, and the end frame of the first bitstream is the image frame F0 preceding F1.
28. The apparatus according to any one of claims 22 to 27, wherein the playing module is specifically configured to determine, from the first code stream or the second code stream, a first image frame to be played after screen switching, and start playing the video from the first image frame.
29. The apparatus according to claim 28, wherein the receiving module is further configured to receive playing time indication information sent by the source device, where the playing time indication information is used to indicate an image frame Fc being played by the source device at the time of screen switching;
the playing module is further configured to determine, when the data volume of the first code stream is greater than a set threshold, that the first image frame to be played is Fc from the first code stream according to the playing time indication information, play a video from Fc to an end frame of the first code stream according to the first code stream, play a video starting from Fm according to the second code stream, where Fm is a next image frame of the end frame of the first code stream.
30. The apparatus according to any one of claims 22 to 29, wherein the receiving module is specifically configured to:
and when the total data volume of the first code stream is not less than a set threshold, receiving the first code stream sent by the source end device.
31. The apparatus of claim 30, wherein the receiving module is further configured to:
when the total data volume of the first code stream is smaller than the set threshold, receiving the data volume indication information sent by the source end device;
and downloading a third code stream from the media server, wherein a starting frame of the third code stream is a first image frame of a media fragment to which an image frame Fc being played by the source end device at the time of screen projection switching belongs.
32. The apparatus of claim 30, wherein the receiving module is further configured to:
when the total data volume of the first code stream is smaller than the set threshold, receiving the data volume indication information and the first code stream sent by the source end device;
and downloading a fourth code stream from the media server, wherein the last image frame of the first code stream is Fd, the next image frame of the Fd is Fn, and the initial frame of the fourth code stream is the first frame of the media fragment to which the Fn belongs.
33. The apparatus of any one of claims 22-32, wherein the receiving module is further configured to:
receiving control information sent by the source end device, wherein the control information comprises a packaging format of the first code stream;
receiving format data sent by the source end equipment;
the device further comprises: and the decoding module is used for de-encapsulating the format data according to the encapsulation format to obtain the first code stream.
34. A screen projection apparatus, comprising:
the processing module is used for determining a first code stream and downloading indication information according to a playing progress during screen-casting switching, wherein the first code stream is downloaded from a media server before screen-casting switching, the downloading indication information is used for indicating a starting frame of a second code stream, the second code stream is a code stream to be downloaded from the media server by a destination device, and the starting frame of the second code stream is related to an ending frame of the first code stream;
and the sending module is used for sending the downloading indication information and the first code stream to the destination terminal equipment.
35. The apparatus of claim 34, wherein the starting frame of the first code stream is a key frame Fk corresponding to an image frame Fc being played during screen-casting switching.
36. The apparatus of claim 34 or 35, wherein the ending frame of the first code stream is the last image frame Fd that has been downloaded at the time of screen-casting switching.
37. The apparatus of claim 34 or 35, wherein the image frame being played at the time of the screen switching is Fc, the last image frame that has been downloaded at the time of the screen switching is Fd, Fd and Fc belong to different media segments, each media segment contains a plurality of image frames, and the end frame of the first bitstream is the last image frame of the media segment to which Fc belongs.
38. The apparatus according to any one of claims 34 to 37, wherein the starting frame of the second code stream is related to the ending frame of the first code stream, and specifically comprises:
and the starting frame of the second code stream is the first image frame of the media fragment to which the next image frame of the ending frame of the first code stream belongs.
39. The apparatus of claim 34 or 35, wherein the image frame being played at the time of the screen switching is Fc, the last image frame that has been downloaded at the time of the screen switching is Fd, the next image frame of Fd is Fn, Fn and Fc belong to different media segments, the start frame of the second bitstream is a first image frame F1 of the media segment to which Fn belongs, and the end frame of the first bitstream is the image frame F0 preceding F1.
40. The apparatus according to any one of claims 34 to 39, wherein the sending module is further configured to send playing time indication information to the destination device, where the playing time indication information is used to indicate an image frame Fc being played during screen switching.
41. The apparatus according to any one of claims 34 to 40, wherein the sending module is further configured to send, to the destination device, data amount indication information when a total data amount of the first codestream is smaller than a set threshold, where the data amount indication information is used to notify that the total data amount of the first codestream is smaller than the set threshold.
42. The apparatus according to any one of claims 34 to 41, further comprising a packaging module, configured to package the first code stream according to a set packaging format to obtain format data;
the sending module is specifically configured to send control information and the format data to the destination device, where the control information includes the encapsulation format.
43. A screen projection apparatus, comprising:
a processor and a transmission interface;
the processor is configured to invoke program instructions stored in a memory to implement the method of any of claims 1-12 or 13-21.
44. A computer-readable storage medium, comprising a computer program which, when executed on a computer or processor, causes the computer or processor to perform the method of any of claims 1-12 or 13-21.
CN202010177999.2A 2020-03-13 2020-03-13 URL screen projection method and device Pending CN113395606A (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
CN202010177999.2A CN113395606A (en) 2020-03-13 2020-03-13 URL screen projection method and device
CN202180017852.XA CN115244944A (en) 2020-03-13 2021-03-01 URL screen projection method and device
PCT/CN2021/078480 WO2021179931A1 (en) 2020-03-13 2021-03-01 Url screen projection method and apparatus

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010177999.2A CN113395606A (en) 2020-03-13 2020-03-13 URL screen projection method and device

Publications (1)

Publication Number Publication Date
CN113395606A true CN113395606A (en) 2021-09-14

Family

ID=77616428

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202010177999.2A Pending CN113395606A (en) 2020-03-13 2020-03-13 URL screen projection method and device
CN202180017852.XA Pending CN115244944A (en) 2020-03-13 2021-03-01 URL screen projection method and device

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN202180017852.XA Pending CN115244944A (en) 2020-03-13 2021-03-01 URL screen projection method and device

Country Status (2)

Country Link
CN (2) CN113395606A (en)
WO (1) WO2021179931A1 (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114501120A (en) * 2022-01-11 2022-05-13 烽火通信科技股份有限公司 Multi-terminal wireless screen projection switching method and electronic equipment
CN114567802A (en) * 2021-12-29 2022-05-31 沈阳中科创达软件有限公司 Data display method and device
CN115103221A (en) * 2022-06-28 2022-09-23 北京奇艺世纪科技有限公司 Screen projection method and device, electronic equipment and readable storage medium
CN115244944A (en) * 2020-03-13 2022-10-25 华为技术有限公司 URL screen projection method and device
CN116193178A (en) * 2021-11-26 2023-05-30 杭州当虹科技股份有限公司 Method for seamlessly transferring multi-screen simultaneous-view home screen

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114157904B (en) * 2021-12-02 2024-05-28 瑞森网安(福建)信息科技有限公司 Cloud video projection playing method, system and storage medium based on mobile terminal
CN114827690B (en) * 2022-03-30 2023-07-25 北京奇艺世纪科技有限公司 Network resource display method, device and system
CN115550498B (en) * 2022-08-03 2024-04-02 阿波罗智联(北京)科技有限公司 Screen projection method, device, equipment and storage medium
CN115412758B (en) * 2022-09-01 2023-11-14 北京奇艺世纪科技有限公司 Video processing method and related device
CN117939088A (en) * 2022-10-25 2024-04-26 广州视臻信息科技有限公司 Screen transmitter pairing method, pairing device, electronic equipment and medium

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101102500A (en) * 2007-08-01 2008-01-09 神州亿品科技有限公司 Playing method for preventing form black screen and flicking screen
US20110302238A1 (en) * 2010-06-08 2011-12-08 Microsoft Corporation Virtual playback speed modification
CN102905188A (en) * 2012-11-01 2013-01-30 北京奇艺世纪科技有限公司 Video code stream switching method and device
CN103281294A (en) * 2013-04-17 2013-09-04 华为技术有限公司 Data sharing method and electronic equipment
US20140003516A1 (en) * 2012-06-28 2014-01-02 Divx, Llc Systems and methods for fast video startup using trick play streams
US20150179130A1 (en) * 2013-12-20 2015-06-25 Blackberry Limited Method for wirelessly transmitting content from a source device to a sink device
CN106572383A (en) * 2015-10-12 2017-04-19 中国科学院声学研究所 Video switching method and system based on multi-screen interaction
CN107241640A (en) * 2017-06-26 2017-10-10 中广热点云科技有限公司 The method that a kind of mobile device and television equipment are synchronously played
CN107659712A (en) * 2017-09-01 2018-02-02 咪咕视讯科技有限公司 A kind of method, apparatus and storage medium for throwing screen
CN109982151A (en) * 2017-12-28 2019-07-05 中国移动通信集团福建有限公司 VOD method, device, equipment and medium

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103248939B (en) * 2012-02-03 2017-11-28 海尔集团公司 A kind of method and system realized multi-screen synchronous and shown
US11477516B2 (en) * 2018-04-13 2022-10-18 Koji Yoden Services over wireless communication with high flexibility and efficiency
CN110087149A (en) * 2019-05-30 2019-08-02 维沃移动通信有限公司 A kind of video image sharing method, device and mobile terminal
CN113395606A (en) * 2020-03-13 2021-09-14 华为技术有限公司 URL screen projection method and device

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115244944A (en) * 2020-03-13 2022-10-25 华为技术有限公司 URL screen projection method and device
CN116193178A (en) * 2021-11-26 2023-05-30 杭州当虹科技股份有限公司 Method for seamlessly transferring multi-screen simultaneous-view home screen
CN114567802A (en) * 2021-12-29 2022-05-31 沈阳中科创达软件有限公司 Data display method and device
CN114567802B (en) * 2021-12-29 2024-02-09 沈阳中科创达软件有限公司 Data display method and device
CN114501120A (en) * 2022-01-11 2022-05-13 烽火通信科技股份有限公司 Multi-terminal wireless screen projection switching method and electronic equipment
CN114501120B (en) * 2022-01-11 2023-06-09 烽火通信科技股份有限公司 Multi-terminal wireless screen switching method and electronic equipment
CN115103221A (en) * 2022-06-28 2022-09-23 北京奇艺世纪科技有限公司 Screen projection method and device, electronic equipment and readable storage medium
CN115103221B (en) * 2022-06-28 2023-09-22 北京奇艺世纪科技有限公司 Screen projection method and device, electronic equipment and readable storage medium

Also Published As

Publication number Publication date
WO2021179931A8 (en) 2022-09-22
WO2021179931A1 (en) 2021-09-16
CN115244944A (en) 2022-10-25

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 20210914)