CN115604496A - Display device, live broadcast channel switching method and storage medium - Google Patents


Info

Publication number
CN115604496A
CN115604496A (application CN202211201571.2A)
Authority
CN
China
Prior art keywords
data
hls
live broadcast
broadcast source
display device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211201571.2A
Other languages
Chinese (zh)
Inventor
马立凯
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hisense Visual Technology Co Ltd
Original Assignee
Hisense Visual Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hisense Visual Technology Co Ltd filed Critical Hisense Visual Technology Co Ltd
Priority to CN202211201571.2A priority Critical patent/CN115604496A/en
Publication of CN115604496A publication Critical patent/CN115604496A/en
Pending legal-status Critical Current

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/21Server components or server architectures
    • H04N21/218Source of audio or video content, e.g. local disk arrays
    • H04N21/2187Live feed
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • H04N21/472End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/60Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client 
    • H04N21/63Control signaling related to video distribution between client, server and network components; Network processes for video distribution between server and clients or between remote clients, e.g. transmitting basic layer and enhancement layers over different transmission paths, setting up a peer-to-peer communication via Internet between remote STB's; Communication protocols; Addressing
    • H04N21/643Communication protocols

Abstract

Some embodiments of the present application provide a display device, a live channel switching method, and a storage medium. The display device includes a controller configured to respond to a channel switching instruction that switches a first live source to a second live source during live playback by simultaneously acquiring both the HLS data and the multicast data corresponding to the second live source, controlling the display to play the multicast data, and, once the data frames of the multicast data and the HLS data are aligned, stopping acquisition of the multicast data and continuing playback from the HLS data. Because the HLS data and the multicast data are acquired simultaneously at the moment of switching, and the multicast data is played until its frames align with those of the HLS data, the playback delay and stuttering otherwise caused by acquiring HLS data only after the channel switch are avoided, improving the user's viewing experience.

Description

Display device, live broadcast channel switching method and storage medium
Technical Field
The application relates to the technical field of live video, in particular to a display device, a live channel switching method and a storage medium.
Background
A display device is a device capable of outputting a display picture, and may include a smart television, a communication terminal, and the like. With the rapid development of the internet, the functions of display devices are becoming increasingly rich; for example, a display device can provide network live broadcast, network video-on-demand, and similar functions based on internet application technology. To cope with the limited bandwidth of the internet, the audio/video programs that media producers distribute on the internet mainly use streaming media technology; that is, a traditional audio/video file is converted into a streaming media file that flows like a stream of water, so that the display device can play while downloading rather than waiting for the entire file to finish downloading.
Streaming media files may be transmitted using HTTP Live Streaming (HLS), an adaptive-bitrate streaming protocol based on the Hypertext Transfer Protocol (HTTP). HLS divides the whole streaming media file into multiple Transport Stream (TS) segment files for the display device to download and play. However, when playing a live channel, in order to avoid stuttering caused by the latest TS segment file of the channel's HLS data not yet being ready, the display device usually starts acquiring data a preset number of TS segment files before the current time, so the live broadcast carries a certain delay. When a user switches channels, the display device must acquire the HLS data of the new channel from scratch, which wastes considerable time; the user therefore cannot switch channels smoothly, channel switching playback stutters, and the viewing experience suffers.
Disclosure of Invention
Some embodiments of the application provide a display device, a live channel switching method, and a storage medium. They address the problem in the related art that, when switching channels, the display device must reacquire and parse the HLS data of the new channel, wasting considerable time, so that channel switching playback stutters and the user's viewing experience suffers.
In a first aspect, some embodiments of the present application provide a display device, including:
a display configured to display the play data;
a controller configured to:
responding to a channel switching instruction for switching a first live broadcast source to a second live broadcast source, acquiring HTTP Live Streaming (HLS) data corresponding to the second live broadcast source while simultaneously acquiring multicast data corresponding to the second live broadcast source;
controlling the display to play multicast data corresponding to the second live broadcast source;
and, when the data frames of the multicast data and the HLS data are aligned, stopping acquisition of the multicast data corresponding to the second live broadcast source and controlling the display to play the HLS data.
In a second aspect, some embodiments of the present application provide a live broadcast channel switching method, which is applied to a display device, and the method includes:
responding to a channel switching instruction for switching a first live broadcast source to a second live broadcast source, acquiring HTTP Live Streaming (HLS) data corresponding to the second live broadcast source while simultaneously acquiring multicast data corresponding to the second live broadcast source;
playing multicast data corresponding to the second live broadcast source;
and, when the data frames of the multicast data and the HLS data are aligned, stopping acquisition of the multicast data corresponding to the second live broadcast source and playing the HLS data.
In a third aspect, some embodiments of the present application provide a computer-readable storage medium, on which a computer program is stored, where the computer program, when executed by a controller, implements the live channel switching method according to the second aspect.
According to the above technical scheme, the embodiments of the application provide a display device, a live channel switching method, and a storage medium. When the first live source is switched to the second live source, the HLS data and the multicast data corresponding to the second live source can be acquired simultaneously; the multicast data is played first, and once the data frames of the multicast data and the HLS data are aligned, acquisition of the multicast data stops and playback continues from the HLS data. Because both streams are acquired at the moment of switching, and the multicast data is played until its frames align with those of the HLS data, the playback delay and stuttering otherwise caused by acquiring HLS data only after the channel switch are avoided, improving the user's viewing experience.
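The handover summarized above can be sketched as a small simulation. The function and the presentation-timestamp (PTS) values below are illustrative inventions, not the patent's actual implementation: multicast frames play until one matches a frame the HLS stream can serve, after which playback continues from HLS.

```python
def switch_channel(multicast_pts, hls_pts):
    """Simulate switching to a second live source: bridge with multicast
    frames until a frame aligns with the HLS stream, then hand over.
    Timestamps are hypothetical stand-ins for real data frames.
    Returns the played timestamps and the source each came from."""
    played, source = [], []
    hls_set = set(hls_pts)
    for pts in multicast_pts:
        if pts in hls_set:                       # data frames aligned
            idx = hls_pts.index(pts)
            played.extend(hls_pts[idx:])         # continue playback from HLS
            source.extend(["hls"] * (len(hls_pts) - idx))
            return played, source
        played.append(pts)                       # multicast bridges the gap
        source.append("multicast")
    return played, source
```

Here the multicast stream covers timestamps the HLS stream cannot yet serve, so the viewer sees a continuous sequence across the switch.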
Drawings
In order to explain the technical solution of the present application more clearly, the drawings needed in the embodiments are briefly described below. It will be apparent to those skilled in the art that other drawings can be derived from these drawings without creative effort.
FIG. 1 illustrates a usage scenario diagram of a display device according to some embodiments of the present application;
fig. 2 shows a block configuration diagram of the control device 100 of some embodiments of the present application;
fig. 3 illustrates a block diagram of a hardware configuration of a display device 200 according to some embodiments of the present application;
fig. 4 shows a software configuration block diagram of the display device 200 of some embodiments of the present application;
FIG. 5 illustrates an icon control interface display of an application in the display device 200 according to some embodiments of the application;
FIG. 6 illustrates a structural schematic of HLS data of some embodiments of the present application;
fig. 7 illustrates a flow diagram of HLS data live broadcast of some embodiments of the present application;
FIG. 8 illustrates a flow diagram of a live channel switching method of some embodiments of the present application;
FIG. 9 illustrates a data flow diagram of a live channel switching method of some embodiments of the present application;
FIG. 10 is a schematic diagram illustrating the effect of a display device according to some embodiments of the present application playing data from a first live source;
FIG. 11 is a schematic diagram illustrating an effect of a display device of some embodiments of the present application playing data of a second live source;
fig. 12 is a schematic diagram illustrating an interaction flow between a display device and a server according to some embodiments of the present application.
Detailed Description
To make the purpose and embodiments of the present application clearer, the exemplary embodiments of the present application are described below clearly and completely with reference to the accompanying drawings. Obviously, the described exemplary embodiments are only some, not all, of the embodiments of the present application.
It should be noted that the brief descriptions of the terms in the present application are only for the convenience of understanding the embodiments described below, and are not intended to limit the embodiments of the present application. These terms should be understood in their ordinary and customary meaning unless otherwise indicated.
The terms "first," "second," "third," and the like in the description and claims of this application and in the foregoing drawings are used for distinguishing between similar or analogous objects or entities and are not necessarily intended to limit the order or sequence in which they are presented unless otherwise indicated. It is to be understood that the terms so used are interchangeable under appropriate circumstances.
The terms "comprises" and "comprising," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a product or apparatus that comprises a list of elements is not necessarily limited to all elements expressly listed, but may include other elements not expressly listed or inherent to such product or apparatus.
The display device provided by the embodiment of the present application may have various implementation forms, and for example, the display device may be a television, a smart television, a laser projection device, a display (monitor), an electronic whiteboard (electronic whiteboard), an electronic desktop (electronic table), and the like. The embodiments of the present application do not limit the specific form of the display device. In the embodiment of the present application, a display device is taken as a television as an example for schematic description. Fig. 1 and 2 are specific embodiments of a display device of the present application.
Fig. 1 illustrates a usage scenario diagram of a display device according to some embodiments of the present application. As shown in fig. 1, a user may operate the display apparatus 200 through the smart device 300 or the control device 100.
In some embodiments, the control apparatus 100 may be a remote controller. Communication between the remote controller and the display device may use infrared protocol communication, Bluetooth protocol communication, or other short-range communication methods, and may control the display device 200 wirelessly or by wire. The user may input user instructions through keys on the remote controller, voice input, control panel input, and the like, to control the display device 200.
In some embodiments, the smart device 300 (e.g., mobile terminal, tablet, computer, laptop, etc.) may also be used to control the display device 200. For example, the display device 200 is controlled using an application program running on the smart device.
In some embodiments, the display device 200 may receive the user's control through touch or gesture, etc., instead of receiving the instruction using the smart device 300 or the control apparatus 100 described above.
In some embodiments, the display device 200 may also be controlled in manners other than the control apparatus 100 and the smart device 300; for example, a module configured inside the display device 200 may directly receive the user's voice commands, or a voice control device provided outside the display device 200 may receive them.
In some embodiments, the display device 200 is also in data communication with a server 400. The display device 200 may be communicatively connected through a Local Area Network (LAN), a Wireless Local Area Network (WLAN), or other networks. The server 400 may provide various contents and interactions to the display device 200. The server 400 may be one cluster or multiple clusters, and may include one or more types of servers.
Fig. 2 shows a block configuration diagram of the control device 100 according to some embodiments of the present application. As shown in fig. 2, the control device 100 includes a controller 110, a communication interface 130, a user input/output interface 140, a memory, and a power supply. The control apparatus 100 may receive an input operation instruction from a user and convert the operation instruction into an instruction recognizable and responsive to the display device 200, thereby mediating interaction between the user and the display device 200.
Fig. 3 shows a hardware configuration block diagram of a display device 200 according to some embodiments of the present application. As shown in fig. 3, the display apparatus 200 includes at least one of a tuner demodulator 210, a communicator 220, a detector 230, an external device interface 240, a controller 250, a display 260, an audio output interface 270, a memory, a power supply, and a user interface.
In some embodiments, the controller 250 includes a processor, a video processor, an audio processor, a graphics processor, RAM, ROM, and first through nth input/output interfaces.
The display 260 includes a display screen component for presenting pictures and a driving component for driving image display. It receives the image signals output from the controller 250 and displays video content, image content, menu manipulation interfaces, and the user manipulation UI.
The display 260 may be a liquid crystal display, an OLED display, and a projection display, and may also be a projection device and a projection screen.
The communicator 220 is a component for communicating with an external device or a server according to various communication protocol types. For example: the communicator may include at least one of a Wifi module, a bluetooth module, a wired ethernet module, and other network communication protocol chips or near field communication protocol chips, and an infrared receiver. The display apparatus 200 may establish transmission and reception of a control signal and a data signal with the control device 100 or the server 400 through the communicator 220.
A user interface is provided for receiving control signals from the control apparatus 100 (e.g., an infrared remote control).
The detector 230 is used to collect signals of the external environment or interaction with the outside. For example, detector 230 includes a light receiver, a sensor for collecting ambient light intensity; alternatively, the detector 230 includes an image collector, such as a camera, which may be used to collect external environment scenes, attributes of the user, or user interaction gestures, or the detector 230 includes a sound collector, such as a microphone, which is used to receive external sounds.
The external device interface 240 may include, but is not limited to, the following: high Definition Multimedia Interface (HDMI), analog or data high definition component input interface (component), composite video input interface (CVBS), USB input interface (USB), RGB port, and the like. The interface may be a composite input/output interface formed by the plurality of interfaces.
The tuner demodulator 210 receives broadcast television signals by wired or wireless reception, and demodulates audio/video signals, as well as EPG data signals, from among a plurality of wireless or wired broadcast television signals.
In some embodiments, the controller 250 and the modem 210 may be located in different separate devices, that is, the modem 210 may also be located in an external device of the main device where the controller 250 is located, such as an external set-top box.
The controller 250 controls the operation of the display device and responds to the user's operation through various software control programs stored in the memory. The controller 250 controls the overall operation of the display apparatus 200. For example: in response to receiving a user command for selecting a UI object to be displayed on the display 260, the controller 250 may perform an operation related to the object selected by the user command.
In some embodiments, the controller 250 includes at least one of a Central Processing Unit (CPU), a video processor, an audio processor, a Graphics Processing Unit (GPU), a Random Access Memory (RAM), a Read-Only Memory (ROM), first through nth input/output interfaces, a communication bus (Bus), and the like.
The user may input a user command through a Graphical User Interface (GUI) displayed on the display 260, and the user input interface receives the user input command through the Graphical User Interface (GUI). Alternatively, the user may input the user command by inputting a specific sound or gesture, and the user input interface receives the user input command by recognizing the sound or gesture through the sensor.
A "user interface" is a media interface for interaction and information exchange between an application or operating system and a user that enables the conversion of the internal form of information to a form acceptable to the user. A commonly used presentation form of the User Interface is a Graphical User Interface (GUI), which refers to a User Interface related to computer operations and displayed in a graphical manner. It may be an interface element such as an icon, a window, a control, etc. displayed in the display screen of the electronic device, where the control may include a visual interface element such as an icon, a button, a menu, a tab, a text box, a dialog box, a status bar, a navigation bar, a Widget, etc.
Fig. 4 shows a software configuration block diagram of the display device 200 according to some embodiments of the present application. As shown in fig. 4, in some embodiments, the system is divided into four layers, which are, from top to bottom, an Application (Applications) layer (abbreviated as "Application layer"), an Application Framework (Application Framework) layer (abbreviated as "Framework layer"), an Android runtime (Android runtime) and system library layer (abbreviated as "system runtime library layer"), and a kernel layer.
In some embodiments, at least one application program runs in the application layer. The applications may be window (Window) programs carried by the operating system, system setting programs, clock programs, and the like, or applications developed by third-party developers. In particular implementations, the applications in the application layer are not limited to the above examples.
The framework layer provides an Application Programming Interface (API) and a programming framework for the applications. The application framework layer includes a number of predefined functions and acts as a processing center that schedules the actions of the applications in the application layer. Through the API interfaces, an application can access system resources and obtain system services during execution.
In some embodiments, the system runtime library layer provides support for the layer above it, i.e., the framework layer. When the framework layer is used, the Android operating system runs the C/C++ libraries included in the system runtime library layer to implement the functions required by the framework layer.
In some embodiments, the kernel layer is a layer between hardware and software. The kernel layer includes at least one of the following drivers: audio driver, display driver, Bluetooth driver, camera driver, Wi-Fi driver, USB driver, HDMI driver, sensor drivers (such as fingerprint sensor, temperature sensor, pressure sensor, etc.), microphone (MIC) driver, power driver, and so on.
FIG. 5 illustrates an icon control interface display of an application in display device 200 according to some embodiments of the present application. In some embodiments, as shown in fig. 5, the application layer containing at least one application may display a corresponding icon control in the display, such as: the system comprises a live television application icon control, a video on demand application icon control, a media center application icon control, an application center icon control, a game application icon control and the like.
In some embodiments, the live television application may provide live television via different signal sources. For example, a live television application may provide television signals using input from cable television, radio broadcasts, satellite services, or other types of live television sources. And, the live television application may display video of the live television signal on display device 200.
In some embodiments, a video-on-demand application may provide video from different storage sources. Unlike live television applications, video on demand provides video display from a storage source. For example, the video on demand may come from the server side of cloud storage, or from a local hard disk that stores video programs.
In some embodiments, the media center application may provide various applications for multimedia content playback. For example, a media center may provide services other than live television or video on demand, through which a user may access various images or audio via the media center application.
In some embodiments, an application center may provide storage for various applications. The applications may be games, utilities, or other applications associated with a computer system or other device that can run on the smart television. The application center may obtain these applications from different sources, store them in local storage, and then run them on the display device 200.
In some embodiments, when the display device performs live playback using its configured player, the live data is mainly transmitted using HTTP Live Streaming (HLS), an adaptive-bitrate streaming protocol based on the Hypertext Transfer Protocol (HTTP). The HLS protocol works by dividing the whole media stream into small HTTP-based files that are downloaded one at a time, so that the client can download and play simultaneously instead of downloading the entire media stream before playback. Compared with the Real-time Transport Protocol (RTP), HLS data can pass through any firewall or proxy server that allows HTTP traffic, giving it greater stability. HTTP itself is a simple request-response protocol that typically runs on top of the Transmission Control Protocol (TCP) and defines the process of exchanging data between a client and a server. After the client connects to the server, obtaining a resource from the server requires following a particular communication format, and the HTTP protocol is used to define that format between client and server.
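As a concrete illustration of the request-response format that HTTP defines, a minimal HTTP/1.1 GET request can be sketched as follows; the host and path in the example are hypothetical.

```python
def build_get_request(host, path):
    """Build a minimal HTTP/1.1 GET request string, illustrating the
    client side of the request-response exchange described above.
    A real player would send this over a TCP connection to the server."""
    return (
        f"GET {path} HTTP/1.1\r\n"
        f"Host: {host}\r\n"
        "Connection: close\r\n"
        "\r\n"                      # blank line terminates the request headers
    )
```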
Fig. 6 illustrates a structural schematic of HLS data of some embodiments of the present application. In some embodiments, as shown in fig. 6, the HLS data includes an index file and Transport Stream (TS) segment files. The index file is an M3U8 file, which is effectively a playlist (Media Playlist): it records a series of media segment resources which, played in order, present the multimedia resource in full. In some embodiments, as shown in fig. 6, the M3U8 file may record the addresses of one or more secondary index files, for example three of them: an Alternate-A index file, an Alternate-B index file, and an Alternate-C index file. Each secondary index file records the download addresses of the TS segment files of the multimedia resource, and the client can download the corresponding TS segment file through each download address. The secondary index files act as alternate sources: the client may choose to download the same resource from different alternate sources at different rates, allowing the streaming session between client and server to adapt to different data rates. In other embodiments, each secondary index file may itself be nested further, for example recording the address of at least one tertiary index file. In still other embodiments, the M3U8 file may directly record the download addresses of the TS segment files of the multimedia resource, without nested index files.
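The nested index structure can be illustrated with a small master playlist; the bandwidth values and URLs below are invented for illustration, and the parser is a simplified sketch rather than a complete M3U8 implementation.

```python
# Hypothetical master playlist listing three secondary index files
# (alternate sources at different data rates), as described above.
MASTER_M3U8 = """\
#EXTM3U
#EXT-X-STREAM-INF:BANDWIDTH=1280000
http://example.com/alternate_a/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=2560000
http://example.com/alternate_b/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=7680000
http://example.com/alternate_c/index.m3u8
"""

def parse_variants(text):
    """Return (bandwidth, uri) pairs, one per secondary index file."""
    variants, bandwidth = [], None
    for line in text.splitlines():
        if line.startswith("#EXT-X-STREAM-INF:"):
            for attr in line.split(":", 1)[1].split(","):
                key, _, value = attr.partition("=")
                if key == "BANDWIDTH":
                    bandwidth = int(value)
        elif line and not line.startswith("#"):
            variants.append((bandwidth, line))   # URI follows its INF tag
    return variants
```

A client would pick one of the returned variants according to its measured throughput, which is how the session adapts to different data rates.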
For example, the specific format of an M3U8 file may be:

#EXTM3U
#EXT-X-MEDIA-SEQUENCE:0
#EXT-X-TARGETDURATION:10
#EXTINF:10.0,
http://media.example.com/first.ts
#EXTINF:9.99,
http://media.example.com/second.ts
#EXT-X-ENDLIST

Here, #EXTM3U denotes the M3U8 header and is placed on the first line. #EXT-X-MEDIA-SEQUENCE indicates the sequence number of the first TS segment file, typically 0; in a live scenario this sequence number identifies the start position of the live segment. #EXT-X-TARGETDURATION:10 indicates that the maximum duration of each TS segment file is 10 seconds (s). #EXTINF indicates the duration of each TS segment file. #EXT-X-ENDLIST is the terminator of the M3U8 file; if an M3U8 file has no #EXT-X-ENDLIST tag, it can be considered live, with new TS segment files appended to the end of the playlist. When playing live, the client's display device must continuously refresh the M3U8 file to obtain the latest TS segment files for playback.
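The example playlist above can be parsed with a short sketch (a simplified illustration, not a complete M3U8 parser):

```python
# The media playlist from the example above, with the #EXT-X-ENDLIST
# terminator present, so it describes a finished (non-live) stream.
MEDIA_M3U8 = """\
#EXTM3U
#EXT-X-MEDIA-SEQUENCE:0
#EXT-X-TARGETDURATION:10
#EXTINF:10.0,
http://media.example.com/first.ts
#EXTINF:9.99,
http://media.example.com/second.ts
#EXT-X-ENDLIST
"""

def parse_media_playlist(text):
    """Return ([(duration_seconds, uri), ...], is_live)."""
    segments, duration, is_live = [], None, True
    for line in text.splitlines():
        if line.startswith("#EXTINF:"):
            duration = float(line[len("#EXTINF:"):].rstrip(","))
        elif line.startswith("#EXT-X-ENDLIST"):
            is_live = False                      # terminator present: not live
        elif line and not line.startswith("#"):
            segments.append((duration, line))    # TS segment download address
    return segments, is_live
```

Removing the #EXT-X-ENDLIST line would make `is_live` come back `True`, matching the rule that a playlist without the terminator is treated as live.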
Fig. 7 illustrates a flow diagram of HLS live broadcast of some embodiments of the present application. In some embodiments, as shown in fig. 7, audio/video input data (Audio/video inputs) provided by a media producer is transmitted to a server (Server), where it is converted into HLS data. A transcoding module on the server is responsible for transcoding the audio/video input data into a target encoding format; the data provided by the media producer may be in any audio/video encoding format. In some embodiments, the target encoding format may be MPEG2-TS. After the data has been transcoded into the target encoding format, a stream segmenter module slices it; the result is HLS data consisting of an index file and TS segment files. The HLS data is transmitted to the distribution component (Distribution), which is a generic HTTP file server. The client (Client) can play the whole audio/video stream provided by the media producer by sequentially acquiring and playing each TS segment file over the HTTP protocol.
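The client side of this flow can be sketched as a polling loop that refreshes the playlist and downloads only segments it has not yet played. All callables are hypothetical stand-ins for the real player components, and `polls` merely bounds the loop for illustration (a real player runs until stopped and honors segment durations).

```python
def live_play_loop(fetch_playlist, fetch_segment, play, polls):
    """Sketch of the HLS client loop: on each poll, re-fetch the M3U8
    index, then download and play any TS segment URIs not seen before."""
    seen = set()
    for _ in range(polls):
        for uri in fetch_playlist():        # refresh the live playlist
            if uri not in seen:
                seen.add(uri)
                play(fetch_segment(uri))    # download and play new segment
    return seen
```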
In some embodiments, the duration of a TS segment file may be an integer multiple of the duration of a complete Group of Pictures (GOP). A GOP is a group of consecutive pictures comprising multiple data frames. The first data frame of a GOP must be an I-frame, i.e., an intra-coded key frame, which can be understood as a complete picture. MPEG encoding divides data frames into three types: I-frames, P-frames, and B-frames. P-frames are forward-predicted frames and B-frames are bidirectionally interpolated frames; both record changes relative to the I-frame. A P-frame encodes the difference from the previous frame, while a B-frame encodes differences from both the previous and the next frame.
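The GOP structure described above can be sketched by splitting a frame-type sequence at each I-frame; this toy helper assumes, as the text does, that every GOP begins at an I-frame.

```python
def split_into_gops(frame_types):
    """Split a sequence of frame types ('I', 'P', 'B') into GOPs,
    each beginning at an I-frame (the key frame)."""
    gops, current = [], []
    for frame in frame_types:
        if frame == "I" and current:    # a new key frame opens a new GOP
            gops.append(current)
            current = []
        current.append(frame)
    if current:
        gops.append(current)
    return gops
```

Because a TS segment spans whole GOPs, every segment starts with a decodable key frame, which is what allows playback to begin at any segment boundary.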
In some embodiments, when the client uses the display device for live broadcast of any channel, in order to avoid playback stalls caused by the latest TS fragment file of the channel's HLS data not yet being ready, the display device usually starts acquiring data from the third TS fragment file from the end during the start-up stage of the live broadcast, that is, from a TS fragment file traced back a certain period from the current live broadcast time. Since HLS data is played by indexing and downloading files, the playback delay is affected by the size of the TS fragment file, and live broadcast is generally delayed by about the duration of three TS fragment files. For example: if each TS fragment file is about 10 seconds long, the live broadcast delay is about 30 seconds; if each TS fragment file is about 2 seconds long, the live broadcast delay can be reduced to about 6 seconds. Reducing the duration of the TS fragment files reduces the live broadcast delay, but increases the number of TS fragment files per unit time, which in turn causes performance loss from a large number of HTTP GET requests in a short time.
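The trade-off described above (shorter segments lower the live delay but raise the HTTP request rate) can be sketched as a small calculation. The three-segment backlog matches the start position described in this paragraph; the function name and parameters are illustrative, not part of any real player API:

```python
def hls_live_delay(segment_seconds, backlog_segments=3):
    """Estimate live delay and HTTP GET rate for a given segment length.

    Delay is roughly backlog_segments * segment duration, because playback
    starts several segments behind the live edge; the GET rate grows as
    segments shrink, since each segment costs one HTTP request.
    """
    delay = backlog_segments * segment_seconds
    gets_per_minute = 60 / segment_seconds
    return delay, gets_per_minute
```

hls_live_delay(10) gives a delay of about 30 seconds at 6 GETs per minute, while hls_live_delay(2) cuts the delay to about 6 seconds at the cost of 30 GETs per minute, mirroring the two examples in the text.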
When a user switches channels, that is, when the live broadcast source needs to be switched, the display device needs to acquire the HLS data of the new live broadcast source again. Due to the characteristics of the HLS protocol, the display device must parse the HLS data to obtain the M3U8 file, obtain the download addresses of the TS fragment files from the M3U8 file, and then download the corresponding TS fragment files. This process wastes time and prevents the user from switching channels smoothly, so channel switching and playback stall and the user's viewing experience is affected.
Based on this, in order to solve the playback problem of the display device during live channel switching, some embodiments of the present application provide a live channel switching method that, after receiving a channel switching instruction, parses the HLS data while playing multicast data, disconnects the multicast data once it is aligned with the parsed HLS data, and continues playing the HLS data. Because some embodiments of the present application parse the HLS data and play the multicast data at the same time, there is no stalling during channel switching; and because the multicast data is disconnected only after it is aligned with the parsed HLS data, after which the HLS data continues to play, the live data watched by the user remains continuous. In addition, HLS can pass through any firewall or proxy server that allows HTTP traffic, and media streams can easily be distributed over a content distribution network, which yields better live broadcast performance and a better user experience.
Fig. 8 illustrates a flow chart of a live channel switching method of some embodiments of the present application. As shown in fig. 8, the method specifically includes the following steps:
S101: responding to a channel switching instruction for switching the first live broadcast source to the second live broadcast source, acquiring HLS data corresponding to the second live broadcast source, and simultaneously acquiring multicast data corresponding to the second live broadcast source.
In some embodiments, the user may issue a channel switching instruction to the display device through a smart device or a control device. For example: the user can control the display device by inputting a channel switching instruction via keys, voice input, or control panel input on a control device such as a remote controller. The user can also issue a voice channel switching instruction through a module configured on the display device for acquiring voice instructions, or through an external voice control device. The user may also issue a channel switching instruction by performing a touch operation or a gesture operation on the display device.
Illustratively, the channel switching instruction may be received by a communicator in the display device, and after the channel switching instruction is received by the communicator, the controller in the display device may acquire HLS data corresponding to the second live source and simultaneously acquire multicast data corresponding to the second live source based on the channel switching instruction.
In some embodiments, the servers interacting with the display device may include a streaming media server and a multicast server. The streaming media server converts the live data of each live broadcast source into HLS data, and the multicast server sends the live data of each live broadcast source to the corresponding clients according to the multicast address of each live broadcast source; the multicast data represents real-time live data without delay. The display device may obtain the HLS data corresponding to the second live broadcast source from the streaming media server, and obtain the multicast data corresponding to the second live broadcast source from the multicast server based on the multicast address of the second live broadcast source. When acquiring the HLS data corresponding to the second live broadcast source from the streaming media server, since the live data grows continuously, the display device continuously acquires updated HLS data from the streaming media server to obtain the latest TS fragment files.
S102: and playing the multicast data corresponding to the second live broadcast source.
After the multicast data corresponding to the second live broadcast source is acquired, the display device decapsulates and decodes the multicast data to obtain data frames, which include I-frames, B-frames, P-frames, and the like. The decoded data frames are stored in a buffer, and the pictures corresponding to the data frames are displayed. For example: after the display device acquires the multicast data corresponding to the second live broadcast source at the current live broadcast time, it can start playing the corresponding pictures from the data frame at the current live broadcast time.
In some embodiments, while playing the multicast data corresponding to the second live broadcast source, the display device may simultaneously parse the HLS data corresponding to the second live broadcast source to obtain the index file in that HLS data, and then download the TS fragment files based on the index file. The display device traces back a certain period from the current live broadcast time and starts downloading TS fragment files from there until the newly generated TS fragment file has been downloaded. For example: the display device traces back three TS fragment files from the current live broadcast time and starts downloading.
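This choice of start position can be sketched as follows, assuming a simple single-bitrate playlist in which every non-comment line is a segment URI. A real M3U8 parser would also have to track #EXT-X-MEDIA-SEQUENCE and resolve relative URIs; the function name is illustrative:

```python
def segments_to_download(m3u8_text, backlog=3):
    """Return segment URIs from an M3U8 playlist, starting `backlog` from the end.

    Lines beginning with '#' are tags; the remaining non-empty lines are
    TS segment URIs. Taking the last `backlog` entries reproduces the
    "third TS fragment file from the last" start position.
    """
    uris = [line.strip() for line in m3u8_text.splitlines()
            if line.strip() and not line.lstrip().startswith("#")]
    return uris[-backlog:]
```

For a playlist listing segments a.ts through d.ts, segments_to_download returns ["b.ts", "c.ts", "d.ts"], i.e. downloading begins three segments behind the live edge.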
In some embodiments, the display device downloads the TS slice file based on the index file, including: the display device sends a first request message to the streaming media server based on the index file and receives a first response message from the streaming media server.
The first request message is used to request a TS fragment file and includes the download address of the TS fragment file. The first response message carries the TS fragment file. Both the first request message and the first response message are HTTP messages based on the HTTP protocol.
In some embodiments, the first request message is sent over a long TCP connection, and the first response message is sent in chunked transfer mode. The Connection field in the first request message is set to keep-alive to ensure that the streaming media server and the display device can exchange data over an HTTP long connection. With keep-alive enabled, the streaming media server does not close the TCP connection after returning a response, the display device does not close the TCP connection after receiving the response message, and the same TCP connection is reused for the next HTTP request. The Transfer-Encoding field in the first response message is set to chunked, which means the content length is not fixed and the display device keeps receiving data until the connection is closed. Because sending a separate request for every TS fragment file and re-establishing the TCP connection each time would cause a great performance loss, using a long TCP connection effectively reduces the number of GET requests and thus the resource consumption caused by the HLS protocol.
For example: the specific format of the first request packet may be:
{POST cctvx.m3u8 HTTP/1.1,…,Connection:Keep-Alive,…}
for example: the specific format of the first response packet may be:
{HTTP/1.1 206 Partial Content,…,Transfer-Encoding:chunked,…,Connection:Keep-Alive,…}
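The request format above can be sketched as a small helper that composes such a keep-alive request; the method, path, and host are placeholders, and this is only an illustration of the header layout, not a complete HTTP client:

```python
def build_segment_request(method, path, host):
    """Compose an HTTP/1.1 request that keeps the TCP connection open.

    The Connection: Keep-Alive header tells the streaming media server
    not to close the TCP connection after responding, so subsequent TS
    fragment requests can reuse the same connection.
    """
    return (f"{method} {path} HTTP/1.1\r\n"
            f"Host: {host}\r\n"
            "Connection: Keep-Alive\r\n"
            "\r\n")
```

For instance, build_segment_request("GET", "/cctvx.m3u8", "example.com") produces a request whose first line is "GET /cctvx.m3u8 HTTP/1.1" and which carries the Keep-Alive connection header.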
S103: and under the condition that the multicast data are aligned with the data frames of the HLS data, stopping acquiring the multicast data corresponding to the second live broadcast source, and playing the HLS data.
While the multicast data is being played, the HLS data is parsed and the TS fragment files are downloaded. Although downloading starts from a TS fragment file traced back a period before the current live broadcast time, the download speed usually exceeds the playback speed, so at some moment the display device will have downloaded the corresponding newly generated TS fragment file. For example: the display device starts playing real-time multicast data at 10:00 and simultaneously starts downloading from the TS fragment file at 9:59. After one minute, the display device has downloaded the latest TS fragment file corresponding to 10:01, that is, it has obtained the most recently generated data frame; at this time the multicast data has also played up to the most recently generated data frame, and the multicast data is aligned with the data frames of the HLS data.
In some embodiments, to detect whether the data frames of the multicast data and the HLS data are aligned, the data acquired in each preset time period may be compared for identity. The preset time period may be set according to the duration of a complete GOP, for example 2 seconds. Illustratively, starting from the current live broadcast time, every 2 seconds of multicast data and HLS data are acquired and compared; if they differ, the comparison continues with the next 2-second period until the data are completely identical, at which point the data frames of the multicast data and the HLS data are determined to be aligned.
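This window-by-window comparison can be sketched as follows, with each preset-period window represented abstractly as a byte string; how the windows are extracted from the two decoded streams is outside the scope of this sketch, and the function name is illustrative:

```python
def first_aligned_window(multicast_windows, hls_windows):
    """Scan successive preset-period windows from both sources.

    Returns the index of the first window where the multicast data and
    the HLS data are identical (the alignment point), or None if no
    window in the given sequences matches.
    """
    for i, (m, h) in enumerate(zip(multicast_windows, hls_windows)):
        if m == h:
            return i  # aligned: multicast can be disconnected here
    return None
```

Once an index is returned, the multicast stream can be disconnected and playback can continue from the corresponding HLS data frame.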
In some embodiments, when the multicast data is not aligned with the data frames of the HLS data, the display device continues to play the multicast data corresponding to the second live broadcast source, continues to analyze the HLS data corresponding to the second live broadcast source, and downloads the TS fragment file, and performs data comparison with the multicast data within a next preset time period until the multicast data is aligned with the data frames of the HLS data.
In some embodiments, after the multicast data is disconnected, the HLS data may be played from the aligned data frame or from the data frame following the aligned data frame, where the aligned data frame is the data frame at which the multicast data is aligned with the HLS data. In this way, after the multicast data is disconnected, the HLS data continues to play, the picture display remains coherent, the delay caused by the HLS protocol is avoided, and resources are saved.
In some embodiments, when the channel switching instruction for switching the first live source to the second live source is not received, the display device continues to acquire HLS data corresponding to the first live source and plays the HLS data corresponding to the first live source. The HLS data playing process is described in the related description, and is not described herein again.
Fig. 9 is a data flow diagram illustrating a live channel switching method according to some embodiments of the present application. As shown in fig. 9, if no channel switching instruction is received, the HLS data corresponding to the first live broadcast source is acquired and played. If a channel switching instruction is received, the HLS data and the multicast data corresponding to the second live broadcast source are acquired simultaneously: the HLS data corresponding to the second live broadcast source is parsed and its TS fragment files are downloaded, while the multicast data corresponding to the second live broadcast source is played. If the data frames of the multicast data and the HLS data are aligned, the multicast data is disconnected and the HLS data continues to play. If they are not aligned, the HLS data and the multicast data corresponding to the second live broadcast source continue to be acquired.
According to the live channel switching method provided by some embodiments of the present application, after a channel switching instruction is received, the HLS data can be parsed while the multicast data is played; the multicast data is disconnected once it is aligned with the parsed HLS data, and the HLS data continues to play. Because the multicast data is played while the HLS data is parsed, no stalling occurs during channel switching; and because the multicast data is disconnected only after it is aligned with the parsed HLS data, after which the HLS data continues to play, the live data watched by the user remains continuous. In addition, HLS can pass through any firewall or proxy server that allows HTTP traffic, and media streams can easily be distributed over a content distribution network, so the method offers better live broadcast performance and a better user experience.
Based on the live channel switching method provided in the above embodiments, some embodiments of the present application further provide a display device. The display device 200 includes: a display 260 and a controller 250, wherein the display 260 is configured to display the play data. In the case where the controller 250 normally plays the HLS data corresponding to the first live broadcast source and does not receive a channel switching instruction, fig. 10 illustrates an effect diagram of the display device playing data of the first live broadcast source according to some embodiments of the present application. As shown in fig. 10, the controller 250 continues to acquire the HLS data corresponding to the first live broadcast source and controls the display 260 to play the HLS data corresponding to the first live broadcast source.
In a case that the controller 250 normally plays HLS data corresponding to the first live source and receives a channel switching instruction for switching the first live source to the second live source, fig. 11 illustrates an effect diagram of the display device according to some embodiments of the present application playing data of the second live source. As shown in fig. 11, after receiving the channel switching instruction, the controller 250 may simultaneously acquire HLS data and multicast data corresponding to the second live broadcast source, and control the display 260 to play the multicast data corresponding to the second live broadcast source.
Fig. 12 is a schematic diagram illustrating an interaction flow between a display device and a server according to some embodiments of the present application. As shown in fig. 12, in a case where the controller 250 receives a channel switching instruction for switching the first live source to the second live source, the controller 250, the display 260, the streaming server and the multicast server are respectively configured to execute the following program steps:
S201: in response to the channel switching instruction for switching the first live broadcast source to the second live broadcast source, the controller 250 acquires multicast data corresponding to the second live broadcast source from the multicast server.
S202: the controller 250 controls the display 260 to play the multicast data corresponding to the second live source.
S203: the controller 250 obtains HLS data corresponding to the second live source from the streaming server.
The controller 250 may obtain the multicast data corresponding to the second live source from the multicast server, and may obtain the HLS data corresponding to the second live source from the streaming server.
S204: the controller 250 analyzes the HLS data corresponding to the second live source to obtain an index file in the HLS data corresponding to the second live source.
S205: the controller 250 downloads the TS slice file from the streaming server based on the index file.
Step S202 and steps S203 to S205 described above may be performed simultaneously.
S206: controller 250 detects whether the data frames of the multicast data and the HLS data are aligned.
S207: in the case that the data frames of the multicast data and the HLS data are not aligned, the controller 250 continues to acquire the multicast data corresponding to the second live broadcast source from the multicast server and play the multicast data.
S208: in the case where the multicast data is aligned with the data frame of the HLS data, the controller 250 stops acquiring the multicast data corresponding to the second live source from the multicast server.
S209: the controller 250 controls the display 260 to play the HLS data corresponding to the second live broadcast source.
Wherein step S208 and step S209 are performed simultaneously.
The display device provided by some embodiments of the present application can parse HLS data while playing multicast data after receiving a channel switching instruction, disconnect the multicast data after it is aligned with the parsed HLS data, and continue playing the HLS data. Because the multicast data is played while the HLS data is parsed, the display device does not stall during channel switching; and because the multicast data is disconnected only after it is aligned with the parsed HLS data, after which the HLS data continues to play, the live data watched by the user remains continuous. In addition, HLS can pass through any firewall or proxy server that allows HTTP traffic, and media streams can easily be distributed over a content distribution network, so the display device offers better live broadcast performance and a better user experience.
In some embodiments of the present application, a computer-readable storage medium is further provided, where the computer-readable storage medium may store a program, and the program may include some or all of the steps in the embodiments of the live channel switching method provided in the present application when executed. The storage medium may be a magnetic disk, an optical disk, a Read Only Memory (ROM), a Random Access Memory (RAM), or the like.
It can be seen from the foregoing technical solutions that, during live broadcasting, the display device provided in some embodiments of the present application can, in response to a channel switching instruction for switching a first live broadcast source to a second live broadcast source, simultaneously acquire the HLS data and the multicast data corresponding to the second live broadcast source, play the multicast data corresponding to the second live broadcast source, and, when the data frames of the multicast data and the HLS data are aligned, stop acquiring the multicast data and play the HLS data. Because the HLS data and the multicast data are acquired simultaneously when the channel is switched, and the multicast data is played until it is aligned with the data frames of the HLS data, the playback delay and stalling caused by the display device acquiring HLS data only after switching channels can be avoided, improving the user's viewing experience.
The embodiments provided in the present application are only a few examples of the general concept of the present application and do not limit its scope. Any other embodiments derived by a person skilled in the art from the solutions of the present application without inventive effort shall fall within the scope of protection of the present application.

Claims (10)

1. A display device, comprising:
a display configured to display the play data;
a controller configured to:
responding to a channel switching instruction for switching a first live broadcast source to a second live broadcast source, acquiring adaptive bitrate streaming media transport protocol (HLS) data corresponding to the second live broadcast source, and simultaneously acquiring multicast data corresponding to the second live broadcast source;
controlling the display to play multicast data corresponding to the second live broadcast source;
and under the condition that the multicast data are aligned with the data frames of the HLS data, stopping acquiring the multicast data corresponding to the second live broadcast source, and controlling the display to play the HLS data.
2. The display device of claim 1, wherein the controller is configured to control the display to play the HLS data, comprising:
controlling the display to play the HLS data from an aligned data frame or a next data frame of the aligned data frame, wherein the aligned data frame is a data frame when the multicast data is aligned with the HLS data.
3. The display device according to claim 1, wherein while the controller controls the display to play the multicast data corresponding to the second live source, the controller is further configured to:
analyzing HLS data corresponding to the second live broadcast source to obtain an index file in the HLS data corresponding to the second live broadcast source;
and downloading the transport stream TS slicing file based on the index file.
4. The display device according to claim 3, wherein the controller is configured to download a Transport Stream (TS) slice file based on the index file, comprising:
sending a first request message to a streaming media server based on the index file, wherein the first request message is used for requesting the TS fragment file and comprises a download address of the TS fragment file;
and receiving a first response message from the streaming media server, wherein the TS fragment file is carried in the first response message.
5. The display device according to claim 4, wherein the first request message is sent by means of a TCP long connection, and the first response message is sent by means of chunked transfer.
6. The display device according to claim 1, wherein in a case where the controller does not receive the channel switching instruction, the controller is further configured to:
obtaining HLS data corresponding to the first live broadcast source;
and controlling the display to play the HLS data corresponding to the first live broadcast source.
7. The display device according to claim 1, wherein the controller is configured to obtain adaptive bitrate streaming protocol (HLS) data corresponding to the second live source and simultaneously obtain multicast data corresponding to the second live source, and the method comprises:
obtaining the HLS data corresponding to the second live broadcast source from a streaming media server;
and simultaneously acquiring multicast data corresponding to the second live broadcast source from a multicast server based on the multicast address of the second live broadcast source.
8. A live broadcast channel switching method is applied to display equipment and is characterized by comprising the following steps:
responding to a channel switching instruction for switching a first live broadcast source to a second live broadcast source, acquiring adaptive bitrate streaming media transport protocol (HLS) data corresponding to the second live broadcast source, and simultaneously acquiring multicast data corresponding to the second live broadcast source;
playing multicast data corresponding to the second live broadcast source;
and under the condition that the multicast data are aligned with the data frames of the HLS data, stopping acquiring the multicast data corresponding to the second live broadcast source, and playing the HLS data.
9. The method of claim 8, wherein the playing the HLS data comprises:
and playing the HLS data from an aligned data frame or the next data frame of the aligned data frame, wherein the aligned data frame is the data frame when the multicast data is aligned with the HLS data.
10. A computer-readable storage medium, having stored thereon a computer program which, when executed by a controller, implements the live channel switching method as claimed in claim 8 or 9.
CN202211201571.2A 2022-09-29 2022-09-29 Display device, live broadcast channel switching method and storage medium Pending CN115604496A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211201571.2A CN115604496A (en) 2022-09-29 2022-09-29 Display device, live broadcast channel switching method and storage medium


Publications (1)

Publication Number Publication Date
CN115604496A true CN115604496A (en) 2023-01-13

Family

ID=84844163

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211201571.2A Pending CN115604496A (en) 2022-09-29 2022-09-29 Display device, live broadcast channel switching method and storage medium

Country Status (1)

Country Link
CN (1) CN115604496A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination