CN108924664B - Synchronous display method and terminal for program subtitles - Google Patents

Info

Publication number
CN108924664B
Authority
CN
China
Prior art keywords
program
event
code stream
program event
time axis
Prior art date
Legal status
Active
Application number
CN201810835679.4A
Other languages
Chinese (zh)
Other versions
CN108924664A (en)
Inventor
鲁好锦
张飞
胡玉
Current Assignee
Hisense Visual Technology Co Ltd
Original Assignee
Hisense Visual Technology Co Ltd
Priority date: 2018-07-26
Filing date: 2018-07-26
Publication date: 2021-06-08
Application filed by Hisense Visual Technology Co Ltd
Priority to CN201810835679.4A
Publication of CN108924664A
Application granted
Publication of CN108924664B

Classifications

    • H04N21/4884: Data services, e.g. news ticker, for displaying subtitles
    • H04N21/4307: Synchronising the rendering of multiple content streams or additional data on devices, e.g. synchronisation of audio on a mobile phone with the video output on the TV screen
    • H04N21/8547: Content authoring involving timestamps for synchronizing content

Abstract

The embodiment of the invention discloses a method and a terminal for synchronously displaying program subtitles, relates to the field of subtitle display, and solves the problem that external subtitles cannot be displayed in synchronization with the audio and video when the code stream contains no subtitle data. The specific scheme is as follows: the terminal obtains the name and time information of a program event by parsing the code stream, generates the actual time axis of the program event broadcast from that time information, determines that the code stream contains no subtitle data when the PMT of the code stream contains no subtitle descriptor, acquires the subtitle data of the program event according to the name of the program event, maps that subtitle data onto the actual time axis, and displays the subtitle data according to the mapped actual time axis. The embodiment of the invention is used in the process of displaying external subtitles in synchronization with the audio and video.

Description

Synchronous display method and terminal for program subtitles
Technical Field
The embodiment of the invention relates to the field of subtitle display, in particular to a method and a terminal for synchronously displaying program subtitles.
Background
Currently, in the Digital Video Broadcasting (DVB) standard, subtitle data can be packaged from the original elementary stream (ES) into a Packetized Elementary Stream (PES) and transmitted to the terminal, so that the terminal can parse and display the subtitle data for the user to watch, serving an auxiliary viewing function.
For example, when a user watches video through a terminal, the terminal may parse the subtitle descriptor (Subtitling Descriptor) contained in the Program Map Table (PMT) of the code stream to obtain subtitle-related information, parse the subtitle data out of the PES stream through the Digital Television (DTV) protocol stack according to that information, and synchronize the subtitle Presentation Time Stamps (PTS) with the video PTS and the audio PTS, thereby displaying the audio, video and subtitles in sync.
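To make this in-stream case concrete, the following Python sketch (hypothetical data structures, not the actual DVB/DTV protocol stack API) illustrates how a parsed subtitle segment can be gated on the current video PTS; MPEG PTS values count ticks of a 90 kHz clock.

    from dataclasses import dataclass

    PTS_CLOCK_HZ = 90_000  # MPEG PTS values count ticks of a 90 kHz clock

    @dataclass
    class SubtitleSegment:
        pts: int           # presentation time stamp parsed from the subtitle PES packet
        duration_s: float  # how long the rendered subtitle should stay on screen
        text: str

    def should_display(segment: SubtitleSegment, current_video_pts: int) -> bool:
        """Show the segment while the video clock sits inside its presentation window."""
        start = segment.pts
        end = segment.pts + int(segment.duration_s * PTS_CLOCK_HZ)
        return start <= current_video_pts < end

    seg = SubtitleSegment(pts=900_000, duration_s=2.0, text="Hello")  # stamped at 10 s
    print(should_display(seg, current_video_pts=945_000))    # True  (10.5 s into playback)
    print(should_display(seg, current_video_pts=1_100_000))  # False (about 12.2 s)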
However, because a video may now carry subtitles in multiple languages, some of the language subtitles may be carried in the code stream while the others are stored in a memory or on a subtitle server (external subtitles for short), or all of the multi-language subtitles may be stored in a memory or on a subtitle server. The synchronization method described above only applies when the code stream includes subtitle data; if the code stream does not include subtitle data, the terminal cannot display the external subtitles in synchronization.
Disclosure of Invention
The invention provides a method and a terminal for synchronously displaying program subtitles, which solve the problem that external subtitles cannot be displayed in synchronization with the audio and video when the code stream does not include subtitle data.
In order to achieve the above purpose, the invention adopts the following technical scheme:
In a first aspect, the present invention provides a method for synchronously displaying program subtitles, which may include: the terminal obtains the name and time information of a program event by parsing the code stream and generates the actual time axis of the program event broadcast according to that time information; when the terminal determines that the PMT of the code stream does not include a subtitle descriptor, it determines that the code stream does not include subtitle data, acquires the subtitle data of the program event according to the name of the program event, maps that subtitle data onto the actual time axis, and displays the subtitle data according to the mapped actual time axis. The time information includes the start time and duration of the program broadcast.
With reference to the first aspect, in a possible implementation manner, the method may further include: when the terminal determines that the PMT of the code stream comprises the caption descriptor, the terminal determines that the code stream comprises the caption data, displays a selection list, wherein the selection list comprises the code stream caption data and the external caption data, and acquires the caption data of the program event according to the name of the program event when the confirmation operation of the user on the external caption data is detected.
With reference to the first aspect and the foregoing possible implementation manners, in another possible implementation manner, the obtaining, by the terminal, the name and the time information of the program event by analyzing the code stream may specifically include: the terminal obtains channel Electronic Program Guide (EPG) information by analyzing an Event Information Table (EIT) of the code stream, wherein the channel EPG information includes names and time information of all Program events of the code stream; and when receiving the operation of the user on the program event, acquiring the name and the time information of the program event from the channel EPG information.
With reference to the first aspect and the foregoing possible implementation manners, in another possible implementation manner, the method may further include: the terminal respectively generates an actual time axis of corresponding event broadcasting according to the time information of other events except the program event in the channel EPG information, acquires the caption data of the corresponding event according to the names of the other events except the program event in the channel EPG information, and maps the caption data of the other events to the actual time axis of the corresponding event.
With reference to the first aspect and the foregoing possible implementation manners, in another possible implementation manner, the mapping, by the terminal, the subtitle data of the program event onto an actual time axis may specifically include: and the terminal generates a caption time axis according to the caption data of the program event, and associates the actual time axis with the caption time axis to obtain the mapped actual time axis.
In a second aspect, the present invention provides a terminal, which may include: the device comprises an analysis unit, a generation unit, a determination unit, an acquisition unit, a mapping unit and a display unit. The analysis unit is used for obtaining the name and the time information of the program event by analyzing the code stream, and the time information comprises the starting time and the duration of the program broadcast. And the generating unit is used for generating an actual time axis of the program event broadcast according to the time information of the program event. And the determining unit is used for determining that the code stream does not include the caption data when the PMT of the code stream is determined not to include the caption descriptor. And the acquisition unit is used for acquiring the caption data of the program event according to the name of the program event. And a mapping unit for mapping the caption data of the program event onto an actual time axis. And the display unit is used for displaying the subtitle data according to the mapped actual time axis.
With reference to the second aspect, in a possible implementation manner, the determining unit is further configured to determine that the code stream includes the subtitle data when it is determined that the PMT of the code stream includes the subtitle descriptor. And the display unit is also used for displaying a selection list, and the selection list comprises code stream subtitle data and external subtitle data. And the obtaining unit is further used for obtaining the caption data of the program event according to the name of the program event when the confirming operation of the user on the external caption data is detected.
With reference to the second aspect and the foregoing possible implementation manners, in another possible implementation manner, the parsing unit is specifically configured to: obtaining channel EPG information by analyzing an EIT table of the code stream, wherein the channel EPG information comprises names and time information of all program events of the code stream; and when receiving the operation of the user on the program event, acquiring the name and the time information of the program event from the channel EPG information.
With reference to the second aspect and the foregoing possible implementation manners, in another possible implementation manner, the generating unit is further configured to generate actual time axes of corresponding event broadcasts according to time information of other events except for the program event in the channel EPG information. And the acquisition unit is also used for acquiring the caption data of the corresponding event according to the names of other events except the program event in the channel EPG information. And the mapping unit is also used for mapping the subtitle data of other events to the actual time axis of the corresponding event.
With reference to the second aspect and the foregoing possible implementation manners, in another possible implementation manner, the mapping unit is specifically configured to: generating a caption time axis according to caption data of the program event; and associating the actual time axis with the subtitle time axis to obtain the mapped actual time axis.
The specific implementation manner may refer to the behavior function of the terminal in the method for synchronously displaying program subtitles provided in the first aspect or the possible implementation manners of the first aspect.
In a third aspect, a terminal is provided, which includes: at least one processor, a memory, a communication interface, and a communication bus. The processor is connected to the memory and the communication interface through a communication bus, the memory is used for storing computer-executable instructions, and when the terminal runs, the processor executes the computer-executable instructions stored in the memory, so that the terminal executes the method for synchronously displaying program subtitles according to the first aspect or any one of the possible implementation manners of the first aspect.
In a fourth aspect, there is provided a computer storage medium having stored thereon computer-executable instructions that, when executed on a computer, cause the computer to perform a method of synchronized display of program subtitles according to the first aspect or any one of its possible implementations.
According to the method for synchronously displaying program subtitles provided by the embodiment of the invention, the terminal obtains the name and time information of a program event by parsing the code stream and generates the actual time axis of the program event broadcast from that time information. When the terminal determines that the code stream does not include subtitle data, it maps the external subtitle data onto the actual time axis, so that the display time of the subtitles is controlled by the actual time and the audio, video and external subtitles are displayed in synchronization.
Drawings
Fig. 1 is a schematic composition diagram of a terminal according to an embodiment of the present invention;
fig. 2 is a flowchart of a method for synchronously displaying program subtitles according to an embodiment of the present invention;
fig. 3 is a schematic diagram of an actual time axis of a program event according to an embodiment of the present invention;
fig. 4 is a schematic diagram of a subtitle timeline according to an embodiment of the present invention;
FIG. 5 is a diagram illustrating a mapped real time axis according to an embodiment of the present invention;
fig. 6 is a schematic composition diagram of another terminal according to an embodiment of the present invention;
fig. 7 is a schematic composition diagram of another terminal according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Fig. 1 is a schematic composition diagram of a terminal according to an embodiment of the present invention, and in a specific implementation, the terminal may be a television. As shown in fig. 1, the terminal may include: at least one processor 11, a memory 12, a communication interface 13, and a communication bus 14.
The following describes the various components of the terminal in detail with reference to fig. 1:
the processor 11 is a control center of the terminal, and may be a single processor or a collective term for a plurality of processing elements. For example, the processor 11 is a Central Processing Unit (CPU), an Application Specific Integrated Circuit (ASIC), or one or more Integrated circuits configured to implement embodiments of the present invention, such as: one or more Digital Signal Processors (DSPs), or one or more Field Programmable Gate Arrays (FPGAs).
In particular implementations, processor 11 may include one or more CPUs, such as CPU0 and CPU1 shown in FIG. 1, for example, as one embodiment. Also, as an example, the terminal may include a plurality of processors, such as the processor 11 and the processor 15 shown in fig. 1. Each of these processors may be a Single-core processor (Single-CPU) or a Multi-core processor (Multi-CPU). A processor herein may refer to one or more devices, circuits, and/or processing cores for processing data (e.g., computer program instructions).
The memory 12 may be a Read-Only Memory (ROM) or other type of static storage device capable of storing static information and instructions, a Random Access Memory (RAM) or other type of dynamic storage device capable of storing information and instructions, an Electrically Erasable Programmable Read-Only Memory (EEPROM), a Compact Disc Read-Only Memory (CD-ROM) or other optical disc storage (including compact discs, laser discs, optical discs, digital versatile discs, Blu-ray discs, etc.), magnetic disk storage media or other magnetic storage devices, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer, but is not limited thereto. The memory 12 may be self-contained and coupled to the processor 11 via the communication bus 14. The memory 12 may also be integrated with the processor 11.
In a specific implementation, the memory 12 is used for storing data in the present invention and software programs for executing the present invention. The processor 11 may perform various functions of the terminal by running or executing software programs stored in the memory 12 and calling data stored in the memory 12.
The communication interface 13 is any device such as a transceiver for communicating with other devices or communication Networks, such as a Radio Access Network (RAN), a Wireless Local Area Network (WLAN), and the like. The communication interface 13 may include a receiving unit implementing a receiving function and a transmitting unit implementing a transmitting function.
The communication bus 14 may be an Industry Standard Architecture (ISA) bus, a Peripheral Component Interconnect (PCI) bus, an Extended ISA (EISA) bus, or the like. The bus may be divided into an address bus, a data bus, a control bus, etc. For ease of illustration, only one thick line is shown in FIG. 1, but it is not intended that there be only one bus or one type of bus.
In order to solve the problem that external subtitles cannot be displayed in synchronization with the audio and video when the code stream does not include subtitle data, an embodiment of the present invention provides a method for synchronously displaying program subtitles, which, as shown in fig. 2, may include:
201. The terminal obtains the name and time information of the program event by parsing the code stream.
The time information may include the start time and duration of the program broadcast. When a user watches audio and video through the terminal, the terminal initializes and allocates memory for subsequently used data and variables after startup, and then parses the EIT table of the code stream through the DTV protocol stack to obtain channel EPG information, which may include the names and time information of all program events within the validity period. Thus, when the terminal receives a user operation on a program event, it can acquire the name and time information of that program event from the channel EPG information.
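A minimal sketch of this lookup, assuming the DTV protocol stack has already turned the EIT into a list of event records; the field names and the program title are illustrative, not a real middleware API:

    from dataclasses import dataclass
    from datetime import datetime, timedelta
    from typing import List, Optional

    @dataclass
    class EpgEvent:
        name: str             # program event name taken from the EIT
        start_time: datetime  # broadcast start time
        duration: timedelta   # broadcast duration

    def find_event(channel_epg: List[EpgEvent], selected_name: str) -> Optional[EpgEvent]:
        """Return the name/time record of the program event the user operated on."""
        for event in channel_epg:
            if event.name == selected_name:
                return event
        return None

    channel_epg = [
        EpgEvent("Prince of Persia: The Sands of Time",
                 datetime(2018, 7, 15, 7, 30, 1), timedelta(hours=1, minutes=49)),
    ]
    selected = find_event(channel_epg, "Prince of Persia: The Sands of Time")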
202. The terminal generates the actual time axis of the program event broadcast according to the time information of the program event.
The terminal generates the actual time axis of the program event broadcast from the time information acquired from the channel EPG information; the actual time axis is the broadcast time of the program in the two-dimensional EPG program menu. For example, if the time information of the program event indicates that the broadcast starts at 2018-07-15 07:30:01 and lasts 01:49:00, the terminal can generate the actual time axis shown in fig. 3.
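The actual time axis follows directly from these two values; a sketch under the same illustrative representation as above:

    from datetime import datetime, timedelta
    from typing import Tuple

    def actual_time_axis(start_time: datetime, duration: timedelta) -> Tuple[datetime, datetime]:
        """The actual time axis is simply the real-world broadcast interval [start, end]."""
        return start_time, start_time + duration

    start, end = actual_time_axis(datetime(2018, 7, 15, 7, 30, 1),
                                  timedelta(hours=1, minutes=49))
    print(start, end)  # 2018-07-15 07:30:01  2018-07-15 09:19:01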
203. When the terminal determines that the PMT of the code stream does not include a subtitle descriptor, it determines that the code stream does not include subtitle data.
The terminal can parse the PMT of the code stream through the DTV protocol stack and determine whether the code stream includes subtitle data by judging whether the PMT includes a subtitle descriptor. If the PMT does not include a subtitle descriptor, the terminal determines that the code stream does not include subtitle data, at which point the following steps 204-206 may be performed. If the PMT includes a subtitle descriptor, the terminal determines that the code stream includes subtitle data. In this case, in one possible implementation, the terminal may parse the code stream according to the information in the subtitle descriptor to obtain the subtitle data, and synchronize it with the video PTS and the audio PTS according to the PTS of the subtitle data, so that the subtitles and the audio and video are displayed together. In another possible implementation, the terminal may display a selection list containing the code stream subtitle data and the external subtitle data; the terminal then performs the following steps 204-206 when it receives the user's confirmation of the external subtitle data, and parses the code stream to obtain the subtitle data for synchronous display when it receives the user's confirmation of the code stream subtitle data.
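A sketch of the check in step 203, assuming the PMT has been parsed into per-elementary-stream descriptor lists (in DVB, the subtitling_descriptor carries tag 0x59); the dictionary layout here is illustrative only:

    DVB_SUBTITLING_DESCRIPTOR_TAG = 0x59  # subtitling_descriptor tag defined in ETSI EN 300 468

    def stream_has_subtitles(pmt_es_entries) -> bool:
        """Return True if any elementary stream listed in the PMT carries a subtitling descriptor."""
        for es in pmt_es_entries:                 # each entry: {"pid": int, "descriptors": [...]}
            for descriptor in es["descriptors"]:  # each descriptor: {"tag": int, "data": bytes}
                if descriptor["tag"] == DVB_SUBTITLING_DESCRIPTOR_TAG:
                    return True
        return False

    # Example PMT with only audio/video streams: no subtitle data in the code stream.
    pmt = [{"pid": 0x100, "descriptors": []},
           {"pid": 0x101, "descriptors": [{"tag": 0x0A, "data": b"eng"}]}]
    print(stream_has_subtitles(pmt))  # False -> fall back to external subtitles (steps 204-206)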
204. The terminal acquires the subtitle data of the program event according to the name of the program event.
When it determines that the code stream does not include subtitle data, or when it receives the user's confirmation of the external subtitle data in the selection list, the terminal may acquire the SubRip Text (SRT) subtitle data of the program event according to the name of the program event. In a specific implementation, if the program event is a fixed program, the terminal may obtain the subtitle data of the program event from a pre-stored subtitle file; if the program event is a program updated in real time, the terminal may download the subtitle data of the program event from a subtitle server.
For example, assuming that the name of the program event is "Prince of Persia: The Sands of Time", the subtitle data of the program event acquired by the terminal may be as follows:
1
00:00:52,688 --> 00:00:55,173
It is said some lives are linked across time

2
00:00:57,384 --> 00:01:03,862
Connected by an ancient calling that echoes through the ages
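Entries like the ones above can be parsed into zero-based cues; a minimal sketch for the standard SRT layout (the helper names are illustrative, not part of the described terminal):

    import re
    from datetime import timedelta
    from typing import List, Tuple

    _TIME = re.compile(r"(\d{2}):(\d{2}):(\d{2}),(\d{3})")

    def _to_timedelta(stamp: str) -> timedelta:
        h, m, s, ms = map(int, _TIME.match(stamp).groups())
        return timedelta(hours=h, minutes=m, seconds=s, milliseconds=ms)

    def parse_srt(srt_text: str) -> List[Tuple[timedelta, timedelta, str]]:
        """Parse SRT blocks into (start, end, text) cues on the zero-based subtitle time axis."""
        cues = []
        for block in srt_text.strip().split("\n\n"):
            lines = block.splitlines()
            start, end = (part.strip() for part in lines[1].split("-->"))
            cues.append((_to_timedelta(start), _to_timedelta(end), "\n".join(lines[2:])))
        return cues

    cues = parse_srt("""1
    00:00:52,688 --> 00:00:55,173
    It is said some lives are linked across time

    2
    00:00:57,384 --> 00:01:03,862
    Connected by an ancient calling that echoes through the ages""".replace("\n    ", "\n"))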
205. The terminal maps the caption data of the program event onto the actual time axis.
After acquiring the subtitle data of the program event, the terminal may generate a subtitle time axis from that subtitle data (the start time of the subtitle time axis is usually zero) and associate the actual time axis generated in step 202 with the subtitle time axis to obtain the mapped actual time axis, so that the display time of the subtitles can be controlled according to the actual time.
Illustratively, following the example in step 204, assuming that the subtitle time axis generated by the terminal is as shown in fig. 4, the terminal may map the subtitle data onto the actual time axis of fig. 3 to obtain the mapped actual time axis shown in fig. 5. Optionally, the terminal may associate the parsed audio PTS and video PTS of the target program "Prince of Persia: The Sands of Time" with the subtitle data at the corresponding positions on the mapped actual time axis, so as to display the audio and video data and the subtitle data in synchronization.
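A sketch of this mapping: each zero-based cue is shifted by the broadcast start time of the actual time axis, and the shifted interval then decides which cue to show at the current real time (the structures follow the earlier sketches and are illustrative only):

    from datetime import datetime, timedelta
    from typing import List, Tuple

    Cue = Tuple[timedelta, timedelta, str]

    def map_to_actual_axis(cues: List[Cue], broadcast_start: datetime):
        """Shift zero-based subtitle cues onto the real broadcast (actual) time axis."""
        return [(broadcast_start + start, broadcast_start + end, text)
                for start, end, text in cues]

    def cue_for(mapped_cues, now: datetime):
        """Pick the cue to display at the current real time (this drives step 206)."""
        for start, end, text in mapped_cues:
            if start <= now < end:
                return text
        return None

    cues = [(timedelta(seconds=52, milliseconds=688),
             timedelta(seconds=55, milliseconds=173),
             "It is said some lives are linked across time")]
    mapped = map_to_actual_axis(cues, datetime(2018, 7, 15, 7, 30, 1))
    # The cue now spans 07:30:53.688 - 07:30:56.173 on the actual time axis.
    print(cue_for(mapped, datetime(2018, 7, 15, 7, 30, 55)))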
206. The terminal displays the subtitle data according to the mapped actual time axis.
Further, in the embodiment of the present invention, the terminal may also generate the actual time axis of each of the other events in the channel EPG information besides the program event according to their time information, acquire the subtitle data of each such event according to its name, and map that subtitle data onto the actual time axis of the corresponding event, so that when the user selects another program event the terminal can directly display its subtitle data according to the actual time, which improves the user experience.
According to the method for synchronously displaying program subtitles provided by the embodiment of the invention, the terminal obtains the name and time information of a program event by parsing the code stream and generates the actual time axis of the program event broadcast from that time information. When the terminal determines that the code stream does not include subtitle data, it maps the external subtitle data onto the actual time axis, so that the display time of the subtitles is controlled by the actual time and the audio, video and external subtitles are displayed in synchronization.
The above description mainly introduces the scheme provided by the embodiment of the present invention from the perspective of the terminal. It is understood that, in order to implement the above functions, the terminal includes corresponding hardware structures and/or software modules for performing the respective functions. Those skilled in the art will readily appreciate that, in conjunction with the exemplary algorithm steps described in connection with the embodiments disclosed herein, the present invention can be implemented in hardware or in a combination of hardware and computer software. Whether a function is performed by hardware or by computer software driving hardware depends upon the particular application and the design constraints of the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
The embodiment of the present invention may perform the division of the functional modules on the terminal according to the above method example, for example, each functional module may be divided corresponding to each function, or two or more functions may be integrated into one processing module. The integrated module can be realized in a hardware mode, and can also be realized in a software functional module mode. It should be noted that, the division of the modules in the embodiment of the present invention is schematic, and is only a logic function division, and there may be another division manner in actual implementation.
In the case of dividing each functional module by corresponding functions, fig. 6 shows another possible composition diagram of the terminal involved in the above embodiment, as shown in fig. 6, the terminal may include: the analysis unit 31, the generation unit 32, the determination unit 33, the acquisition unit 34, the mapping unit 35, and the display unit 36.
Wherein, the parsing unit 31 is configured to support the terminal to execute step 201 in the method for synchronously displaying program subtitles shown in fig. 2.
The generating unit 32 is configured to support the terminal to execute step 202 in the method for synchronously displaying program subtitles shown in fig. 2.
A determining unit 33, configured to support the terminal to execute step 203 in the method for synchronously displaying program subtitles shown in fig. 2.
The obtaining unit 34 is configured to support the terminal to execute step 204 in the method for synchronously displaying program subtitles shown in fig. 2.
A mapping unit 35, configured to support the terminal to execute step 205 in the method for synchronously displaying program subtitles shown in fig. 2.
The display unit 36 is configured to support the terminal to execute step 206 in the method for synchronously displaying program subtitles shown in fig. 2.
It should be noted that all relevant contents of each step related to the above method embodiment may be referred to the functional description of the corresponding functional module, and are not described herein again.
The terminal provided by the embodiment of the invention is used for executing the synchronous display method of the program subtitles, so that the same effect as the synchronous display method of the program subtitles can be achieved.
In the case of an integrated unit, fig. 7 shows another possible composition diagram of the terminal involved in the above-described embodiment. As shown in fig. 7, the terminal includes: a processing module 41, a communication module 42 and a storage module 43.
The processing module 41 is configured to control and manage the actions of the terminal, for example, the processing module 41 is configured to support the terminal to perform step 201, step 202, step 203, step 204, step 205, step 206 in fig. 2, and/or other processes for the technology described herein. The communication module 42 is used to support communication between the terminal and other network entities. A storage module 43 for storing program codes and data of the terminal.
The processing module 41 may be the processor in fig. 1. Which may implement or perform the various illustrative logical blocks, modules, and circuits described in connection with the disclosure. A processor may also be a combination of computing functions, e.g., comprising one or more microprocessors, a DSP and a microprocessor, or the like. The communication module 42 may be the communication interface of fig. 1. The storage module 43 may be the memory of fig. 1.
Through the above description of the embodiments, it is clear to those skilled in the art that, for convenience and simplicity of description, the foregoing division of the functional modules is merely used as an example, and in practical applications, the above function distribution may be completed by different functional modules according to needs, that is, the internal structure of the device may be divided into different functional modules to complete all or part of the above described functions.
In the embodiments provided in the present invention, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the above-described device embodiments are merely illustrative, and for example, the division of the modules or units is only one logical functional division, and there may be other divisions when actually implemented, for example, a plurality of units or components may be combined or may be integrated into another device, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may be one physical unit or a plurality of physical units, that is, may be located in one place, or may be distributed in a plurality of different places. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a readable storage medium. Based on such understanding, the technical solution of the embodiments of the present invention, in essence or in the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product, which is stored in a storage medium and includes several instructions to enable a device (which may be a single-chip microcomputer, a chip, or the like) or a processor to execute all or part of the steps of the method according to the embodiments of the present invention. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a ROM, a RAM, a magnetic disk, or an optical disk.
The above description is only an embodiment of the present invention, but the scope of the present invention is not limited thereto, and any changes or substitutions within the technical scope of the present invention are intended to be covered by the scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the appended claims.

Claims (6)

1. A method for synchronously displaying program subtitles, the method comprising:
obtaining the name and time information of the program event by analyzing the code stream, including: analyzing an event information table EIT of the code stream through a DTV protocol stack to obtain channel Electronic Program Guide (EPG) information, wherein the channel EPG information comprises names and time information of all program events of the code stream; when receiving the operation of a user on the program event, acquiring the name and time information of the program event from the channel EPG information; the time information comprises the starting time and duration of the program broadcast;
generating an actual time axis of the program event broadcast according to the time information of the program event;
when determining that the program mapping table PMT of the code stream does not include the caption descriptor, determining that the code stream does not include the caption data;
acquiring caption data of the program event according to the name of the program event;
mapping the caption data of the program event to the actual time axis, and displaying the caption data according to the mapped actual time axis;
respectively generating actual time axes of corresponding event broadcasts according to the time information of other events except the program events in the channel EPG information;
acquiring caption data of a corresponding event according to the names of other events except the program event in the channel EPG information;
and mapping the subtitle data of the other events to the actual time axis of the corresponding events.
2. The method for synchronously displaying program subtitles according to claim 1, wherein the method further comprises:
when the PMT of the code stream is determined to comprise the caption descriptor, determining that the code stream comprises caption data;
displaying a selection list, wherein the selection list comprises code stream subtitle data and external subtitle data;
and when the confirmation operation of the user on the external caption data is detected, obtaining the caption data of the program event according to the name of the program event.
3. The method for synchronously displaying program subtitles according to any one of claims 1-2, wherein the mapping subtitle data of the program event onto the actual time axis comprises:
generating a caption time axis according to the caption data of the program event;
and associating the actual time axis with the subtitle time axis to obtain the mapped actual time axis.
4. A terminal, characterized in that the terminal comprises: the device comprises an analysis unit, a generation unit, a determination unit, an acquisition unit, a mapping unit and a display unit;
the analysis unit is used for obtaining the name and the time information of the program event by analyzing the code stream, and comprises: analyzing an event information table EIT table of the code stream through a DTV protocol stack to obtain channel Electronic Program Guide (EPG) information, wherein the channel EPG information comprises names and time information of all program events of the code stream; when receiving the operation of a user on the program event, acquiring the name and time information of the program event from the channel EPG information; the time information comprises the starting time and duration of the program broadcast;
the generating unit is used for generating an actual time axis of the program event broadcast according to the time information of the program event;
the determining unit is configured to determine that the code stream does not include subtitle data when it is determined that a program map table PMT of the code stream does not include a subtitle descriptor;
the acquiring unit is used for acquiring the caption data of the program event according to the name of the program event;
the mapping unit is used for mapping the caption data of the program event to the actual time axis;
the display unit is used for displaying the subtitle data according to the mapped actual time axis;
the generating unit is further configured to generate actual time axes of corresponding event broadcasts according to time information of other events except the program event in the channel EPG information;
the acquiring unit is further configured to acquire subtitle data of a corresponding event according to names of other events except the program event in the channel EPG information;
the mapping unit is further configured to map the subtitle data of the other events to an actual time axis of the corresponding event.
5. The terminal of claim 4,
the determining unit is further configured to determine that the code stream includes subtitle data when it is determined that the PMT of the code stream includes the subtitle descriptor;
the display unit is also used for displaying a selection list, and the selection list comprises code stream subtitle data and external subtitle data;
the obtaining unit is further configured to obtain the subtitle data of the program event according to the name of the program event when a confirmation operation of the user on the external subtitle data is detected.
6. The terminal according to claim 4 or 5, wherein the mapping unit is specifically configured to:
generating a caption time axis according to the caption data of the program event;
and associating the actual time axis with the subtitle time axis to obtain the mapped actual time axis.
CN201810835679.4A 2018-07-26 2018-07-26 Synchronous display method and terminal for program subtitles Active CN108924664B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810835679.4A 2018-07-26 2018-07-26 Synchronous display method and terminal for program subtitles

Publications (2)

Publication Number Publication Date
CN108924664A CN108924664A (en) 2018-11-30
CN108924664B true CN108924664B (en) 2021-06-08

Family

ID=64417396

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109842815A * 2019-01-31 2019-06-04 Hisense Electronic Technology (Shenzhen) Co., Ltd. Subtitle status display method and device for a program
CN113873221B * 2021-08-27 2023-07-18 Shenzhen Jiuzhou Electric Co., Ltd. Subtitle distortion detection method, multimedia terminal, server, and storage medium

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010068016A (en) * 2008-09-08 2010-03-25 Q-Tec Inc Movie-subtitle synchronous display system
CN102630017A (en) * 2012-04-10 2012-08-08 中兴通讯股份有限公司 Method and system for synchronizing mobile multi-media broadcasting and subtitles
CN102802044A (en) * 2012-06-29 2012-11-28 华为终端有限公司 Video processing method, terminal and subtitle server
CN103067678A (en) * 2011-10-20 2013-04-24 四川长虹电器股份有限公司 Television subtitle display method and device
CN103297843A (en) * 2012-03-02 2013-09-11 三星电子(中国)研发中心 Program selection method for MPEG TS file playback
CN105898556A (en) * 2015-12-30 2016-08-24 乐视致新电子科技(天津)有限公司 Plug-in subtitle automatic synchronization method and device
CN105959772A (en) * 2015-12-22 2016-09-21 合网络技术(北京)有限公司 Streaming media and caption instant synchronization display and matching processing method, device and system

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8707381B2 (en) * 2009-09-22 2014-04-22 Caption Colorado L.L.C. Caption and/or metadata synchronization for replay of previously or simultaneously recorded live programs

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information
Address after: 266555 Qingdao economic and Technological Development Zone, Shandong, Hong Kong Road, No. 218
Applicant after: Hisense Video Technology Co., Ltd
Address before: 266555 Qingdao economic and Technological Development Zone, Shandong, Hong Kong Road, No. 218
Applicant before: HISENSE ELECTRIC Co.,Ltd.
GR01 Patent grant