CN114640874A - Subtitle synchronization method and device, set top box and computer readable storage medium - Google Patents

Subtitle synchronization method and device, set top box and computer readable storage medium

Info

Publication number
CN114640874A
Authority
CN
China
Prior art keywords
subtitle
caption
playing
current target
time
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210223615.5A
Other languages
Chinese (zh)
Inventor
杨连发
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hunan Goke Microelectronics Co Ltd
Original Assignee
Hunan Goke Microelectronics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hunan Goke Microelectronics Co Ltd filed Critical Hunan Goke Microelectronics Co Ltd
Priority to CN202210223615.5A priority Critical patent/CN114640874A/en
Publication of CN114640874A publication Critical patent/CN114640874A/en
Priority to PCT/CN2023/078379 priority patent/WO2023169240A1/en
Pending legal-status Critical Current

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/4302 Content synchronisation processes, e.g. decoder synchronisation
    • H04N21/4307 Synchronising the rendering of multiple content streams or additional data on devices, e.g. synchronisation of audio on a mobile phone with the video output on the TV screen
    • H04N21/43074 Synchronising the rendering of multiple content streams or additional data on devices, e.g. synchronisation of audio on a mobile phone with the video output on the TV screen of additional data with content streams on the same device, e.g. of EPG data or interactive icon with a TV program
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47 End-user applications
    • H04N21/488 Data services, e.g. news ticker
    • H04N21/4884 Data services, e.g. news ticker for displaying subtitles
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80 Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/85 Assembly of content; Generation of multimedia applications
    • H04N21/854 Content authoring
    • H04N21/8547 Content authoring involving timestamps for synchronizing content

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computer Security & Cryptography (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)

Abstract

An embodiment of the present application provides a subtitle synchronization method and apparatus, a set top box, and a computer-readable storage medium, where the method includes the following steps: receiving a code stream file name from the playing end, and acquiring the corresponding subtitle file according to the code stream file name; parsing the subtitle file to obtain a plurality of subtitle statements; instantiating the plurality of subtitle statements to obtain a plurality of subtitle objects; receiving the current video playing time from the playing end, and determining a current target subtitle object from the subtitle objects according to the current video playing time; and displaying the current target subtitle object. In the subtitle synchronization scheme provided by this embodiment, the subtitle statements of the subtitle file are instantiated into a plurality of subtitle objects, the current target subtitle object is determined from the plurality of subtitle objects based on the current video playing time, and the current target subtitle object is displayed, which reduces the number of cross-process communications and improves the subtitle synchronization effect.

Description

Subtitle synchronization method and device, set top box and computer readable storage medium
Technical Field
The present application relates to the field of multimedia technologies, and in particular, to a method and an apparatus for synchronizing subtitles, a set top box, and a computer readable storage medium.
Background
Existing subtitle technologies include embedded subtitles and external (plug-in) subtitles, and different technologies suit different application scenarios. For example, in foreign videos the dialogue is mostly in a foreign language, and external Chinese subtitles can be used to help users understand the video content.
An external subtitle is a subtitle file that is independent of the code stream file, and the external subtitle file must be parsed separately when the video is played before the subtitles can be displayed. Most external-subtitle functionality is extended independently by each player, so every player that needs to support external subtitles must implement the feature on its own. Synchronizing external subtitles with the video depends on the current video playing time: to achieve subtitle synchronization, subtitle timestamps must be compared at specific times, which requires frequent data interaction across threads or processes.
Disclosure of Invention
In order to solve the above technical problem, embodiments of the present application provide a subtitle synchronization method, apparatus, set-top box, and computer-readable storage medium.
In a first aspect, an embodiment of the present application provides a subtitle synchronization method, which is applied to a subtitle server, where the subtitle server is in communication connection with a playing end, and the method includes:
receiving a code stream file name from the playing end, and acquiring a corresponding subtitle file according to the code stream file name;
analyzing the caption file to obtain a plurality of caption sentences;
instantiating the plurality of caption sentences to obtain a plurality of caption objects;
receiving the current video playing time from the playing end, and determining a current target subtitle object from a plurality of subtitle objects according to the current video playing time;
and displaying the current target subtitle object.
In a second aspect, an embodiment of the present application provides a subtitle synchronization apparatus, which is applied to a subtitle server, where the subtitle server is in communication connection with a playing end, and the apparatus includes:
the acquisition module is used for receiving the code stream file name from the playing end and acquiring a corresponding subtitle file according to the code stream file name;
the analysis module is used for analyzing the caption file to obtain a plurality of caption sentences;
the processing module is used for performing instantiation processing on the plurality of caption sentences to obtain a plurality of caption objects;
the determining module is used for receiving the current video playing time from the playing end and determining a current target subtitle object from the subtitle objects according to the current video playing time;
and the display module is used for displaying the current target subtitle object.
In a third aspect, an embodiment of the present application provides a set top box, which includes a memory and a processor, where the memory stores a computer program that, when run on the processor, performs the subtitle synchronization method provided in the first aspect.
In a fourth aspect, an embodiment of the present application provides a computer-readable storage medium, which stores a computer program, and when the computer program runs on a processor, the computer program performs the subtitle synchronization method provided in the first aspect.
The subtitle synchronization method and apparatus, set top box, and computer-readable storage medium provided by the present application receive a code stream file name from the playing end and acquire the corresponding subtitle file according to the code stream file name; parse the subtitle file to obtain a plurality of subtitle statements; instantiate the plurality of subtitle statements to obtain a plurality of subtitle objects; receive the current video playing time from the playing end and determine a current target subtitle object from the plurality of subtitle objects according to the current video playing time; and display the current target subtitle object. In the subtitle synchronization scheme provided by this embodiment, the subtitle statements of the subtitle file are instantiated into a plurality of subtitle objects, the current target subtitle object is determined from the plurality of subtitle objects based on the current video playing time, and the current target subtitle object is displayed, which reduces the number of cross-process communications and improves the subtitle synchronization effect.
Drawings
In order to more clearly explain the technical solutions of the present application, the drawings needed to be used in the embodiments are briefly introduced below, and it should be understood that the following drawings only illustrate some embodiments of the present application and therefore should not be considered as limiting the scope of protection of the present application. Like components are numbered similarly in the various figures.
Fig. 1 shows a flow chart of a subtitle synchronization method provided by an embodiment of the present application;
fig. 2 shows a schematic structural diagram of a subtitle synchronization apparatus according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments.
The components of the embodiments of the present application, generally described and illustrated in the figures herein, can be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the embodiments of the present application, presented in the accompanying drawings, is not intended to limit the scope of the claimed application, but is merely representative of selected embodiments of the application. All other embodiments, which can be derived by a person skilled in the art from the embodiments of the present application without making any creative effort, shall fall within the protection scope of the present application.
Hereinafter, the terms "including", "having", and their derivatives, as used in the various embodiments of the present application, are intended to indicate only the specific features, numbers, steps, operations, elements, components, or combinations of the foregoing, and should not be construed as excluding the existence or addition of one or more other features, numbers, steps, operations, elements, components, or combinations of the foregoing.
Furthermore, the terms "first," "second," "third," and the like are used solely to distinguish one from another and are not to be construed as indicating or implying relative importance.
Unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which the various embodiments of the present application belong. The terms (such as those defined in commonly used dictionaries) should be interpreted as having a meaning that is consistent with their contextual meaning in the relevant art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein in various embodiments.
Example 1
The embodiment of the disclosure provides a subtitle synchronization method, which is applied to a subtitle server, and the subtitle server is in communication connection with a playing end.
Referring to fig. 1, the subtitle synchronization method of the present embodiment includes:
and step S101, receiving the code stream file name from the playing terminal, and acquiring a corresponding subtitle file according to the code stream file name.
In this embodiment, the playing end is a video player (MediaPlayer), which can be installed on the set top box. The subtitle server is an external-subtitle system that is independent of the playing end; it communicates with the playing end across processes for data interaction and can be migrated easily. The current video playing time can be acquired through the getCurrentPosition interface of the playing end.
Specifically, the subtitle server runs in the background as an independent service and registers the subtitle service with the service manager, and the playing end obtains the subtitle service through the binder mechanism. The playing end communicates with the subtitle server, acquires a subtitle service instance from it, creates a subtitle session object through the subtitle service instance, and displays the external subtitle by operating the subtitle session object. In one embodiment, the playing end creates a subtitle time provider object and provides the current video playing time to the subtitle server through this object, so that the subtitle server can conveniently perform subtitle synchronization.
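The cross-process contract just described can be sketched as a set of plain Java interfaces, shown below. This is a minimal illustrative model only: the names SubtitleService, SubtitleSession, SubtitleTimeProvider, and getCurrentVideoTimeMs are assumptions for the example and are not Android framework APIs or names taken from the patent.

// Hypothetical sketch of the contract between the playing end and the subtitle server.
public final class SubtitleIpcContract {
    public enum PlayControl { PAUSE, RESUME, SEEK }

    /** Implemented by the playing end; lets the subtitle server query the playback position. */
    public interface SubtitleTimeProvider {
        long getCurrentVideoTimeMs();
    }

    /** One external-subtitle session, driven by the playing end. */
    public interface SubtitleSession {
        void start(String streamFileName);            // look up and parse the subtitle file
        void onPlayControl(PlayControl instruction);  // pause / resume / seek notifications
        void stop();
    }

    /** Registered by the subtitle server; obtained by the playing end over binder. */
    public interface SubtitleService {
        SubtitleSession openSession(SubtitleTimeProvider timeProvider);
    }

    private SubtitleIpcContract() {}
}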
In one embodiment, step S101 includes the following steps:
searching a corresponding target code stream file according to the code stream file name;
acquiring a storage path of the target code stream file;
and reading the subtitle file corresponding to the code stream file name under the storage path.
Specifically, the subtitle server searches for, parses, and displays the external subtitle file. Based on the code stream file name sent by the playing end, the subtitle server searches the same directory for a subtitle file with the same name as the code stream file; the subtitle file shares the code stream file's name but has a different suffix.
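As an illustration of the lookup described above, the following Java sketch searches the code stream file's directory for a file with the same base name and a subtitle suffix. The suffix list and all class and method names are assumptions made for the example, not part of the patent.

import java.io.File;
import java.util.Optional;

public final class SubtitleLookup {
    // Subtitle suffixes this sketch recognises; the description mentions .srt and .idx/.sub.
    private static final String[] SUBTITLE_SUFFIXES = { ".srt", ".idx", ".sub" };

    /**
     * Searches the directory of the code stream file for a subtitle file that shares
     * its base name but carries a subtitle suffix, as described in step S101.
     */
    public static Optional<File> findSubtitleFile(File streamFile) {
        File dir = streamFile.getAbsoluteFile().getParentFile();   // storage path of the stream file
        String name = streamFile.getName();
        int dot = name.lastIndexOf('.');
        String baseName = (dot > 0) ? name.substring(0, dot) : name;
        for (String suffix : SUBTITLE_SUFFIXES) {
            File candidate = new File(dir, baseName + suffix);
            if (candidate.isFile()) {
                return Optional.of(candidate);
            }
        }
        return Optional.empty();   // no external subtitle available for this stream
    }

    private SubtitleLookup() {}
}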
Step S102, parsing the subtitle file to obtain a plurality of subtitle statements.
In this embodiment, the subtitle file may have one of several file formats. For example, a text subtitle in the srt format has the file suffix .srt and contains, for each subtitle, a sequence number, a start display time, an end display time, and the subtitle content. A graphic subtitle in the dvdsub format consists of two files, an .idx file and a .sub file: the .idx file is an index file containing the display times of the subtitles, and the .sub file stores the data of each subtitle picture. Subtitle files of different formats can be parsed according to their respective specifications to obtain a plurality of subtitle statements.
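The following Java sketch shows one possible way to parse subtitle statements from an .srt file of the form described above (sequence number, start and end display time, content). It is a simplified example that assumes a well-formed UTF-8 file; the class and record names are illustrative and it is not the parser defined by the patent.

import java.io.BufferedReader;
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.ArrayList;
import java.util.List;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public final class SrtParser {
    /** One parsed subtitle statement: display interval in milliseconds plus text. */
    public record SubtitleStatement(long startMs, long endMs, String text) {}

    private static final Pattern TIME_LINE = Pattern.compile(
            "(\\d{2}):(\\d{2}):(\\d{2}),(\\d{3})\\s*-->\\s*(\\d{2}):(\\d{2}):(\\d{2}),(\\d{3})");

    /** Parses an .srt file into subtitle statements (sequence-number lines are skipped). */
    public static List<SubtitleStatement> parse(Path srtFile) throws IOException {
        List<SubtitleStatement> statements = new ArrayList<>();
        try (BufferedReader reader = Files.newBufferedReader(srtFile, StandardCharsets.UTF_8)) {
            String line;
            long start = -1, end = -1;
            StringBuilder text = new StringBuilder();
            while ((line = reader.readLine()) != null) {
                Matcher m = TIME_LINE.matcher(line);
                if (m.find()) {                       // "00:01:02,345 --> 00:01:05,678"
                    start = toMs(m, 1);
                    end = toMs(m, 5);
                } else if (line.isBlank()) {          // a blank line ends the current block
                    if (start >= 0) {
                        statements.add(new SubtitleStatement(start, end, text.toString().trim()));
                    }
                    start = end = -1;
                    text.setLength(0);
                } else if (start >= 0) {              // content line (the sequence number precedes the time line)
                    text.append(line).append('\n');
                }
            }
            if (start >= 0) {                         // flush the last block if the file lacks a trailing blank line
                statements.add(new SubtitleStatement(start, end, text.toString().trim()));
            }
        }
        return statements;
    }

    private static long toMs(Matcher m, int group) {
        return Long.parseLong(m.group(group)) * 3_600_000L
                + Long.parseLong(m.group(group + 1)) * 60_000L
                + Long.parseLong(m.group(group + 2)) * 1_000L
                + Long.parseLong(m.group(group + 3));
    }

    private SrtParser() {}
}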
In one embodiment, step S102 may include the following steps:
parsing the subtitle file to obtain real subtitle statements and caption-free time periods;
and generating an empty subtitle statement for each caption-free time period.
In one embodiment, the subtitle server determines the file format of the subtitle file and processes it accordingly to obtain the real subtitle statements and the caption-free time periods. So that each time period of the video corresponds to a subtitle object, an empty subtitle statement whose display content is empty is generated for each caption-free time period, and the real subtitle statements and the empty subtitle statements together form the plurality of subtitle statements.
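One possible way to generate the empty subtitle statements for the caption-free time periods is sketched below in Java. The SubtitleStatement record mirrors the shape used in the parsing sketch above, and the videoDurationMs parameter is an assumption used only to close the final gap; none of the names come from the patent.

import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;

public final class GapFiller {
    /** Same statement shape as in the parsing sketch: display interval plus text. */
    public record SubtitleStatement(long startMs, long endMs, String text) {}

    /**
     * Inserts empty statements into every gap between real statements (and before the
     * first one), so that each time period of the video has a corresponding statement.
     */
    public static List<SubtitleStatement> fillGaps(List<SubtitleStatement> real, long videoDurationMs) {
        List<SubtitleStatement> sorted = new ArrayList<>(real);
        sorted.sort(Comparator.comparingLong(SubtitleStatement::startMs));

        List<SubtitleStatement> result = new ArrayList<>();
        long cursor = 0;                                   // end of the last covered interval
        for (SubtitleStatement s : sorted) {
            if (s.startMs() > cursor) {                    // caption-free interval before this statement
                result.add(new SubtitleStatement(cursor, s.startMs(), ""));
            }
            result.add(s);
            cursor = s.endMs();
        }
        if (cursor < videoDurationMs) {                    // trailing caption-free interval
            result.add(new SubtitleStatement(cursor, videoDurationMs, ""));
        }
        return result;
    }

    private GapFiller() {}
}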
Step S103, instantiating the plurality of subtitle statements to obtain a plurality of subtitle objects.
In one embodiment, the subtitle server instantiates each real subtitle statement and each empty subtitle statement into a corresponding subtitle object according to the display time and display content of the statement. It should be noted that a real subtitle statement is instantiated into a real subtitle object whose start display time is the start time of the statement, whose end display time is the end time of the statement, and whose subtitle content is the real text content; an empty subtitle statement is instantiated into an empty subtitle object whose start display time is the end time of the previous subtitle object, whose end display time is the start display time of the next subtitle object, and whose subtitle content is set to empty.
In one embodiment, step S103 includes:
instantiating each subtitle statement according to its start display time and end display time to obtain a plurality of subtitle objects;
and storing the plurality of subtitle objects in a subtitle queue in a preset display time order, where the start display time of each subtitle object in the subtitle queue is the end display time of the adjacent preceding subtitle object.
In this embodiment, each instantiated subtitle object includes its start display time and end display time. The plurality of subtitle objects are stored in the subtitle queue in the preset display time order, where the start display time of each subtitle object in the queue equals the end display time of the preceding subtitle object, so the subtitle object corresponding to the current video playing time can be determined from the subtitle queue based on the start and end display times of each subtitle object.
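The following Java sketch illustrates one way to instantiate the statements into subtitle objects and store them in a display-time-ordered queue whose objects join end to end. The class and record names are illustrative assumptions, not taken from the patent.

import java.util.ArrayDeque;
import java.util.Deque;
import java.util.List;

public final class SubtitleQueueBuilder {
    /** Instantiated subtitle object: start/end display time plus content ("" for empty objects). */
    public record SubtitleObject(long startDisplayMs, long endDisplayMs, String content) {}

    public record SubtitleStatement(long startMs, long endMs, String text) {}

    /**
     * Instantiates each statement into a subtitle object and stores the objects in a
     * queue ordered by display time. Each object's start display time is anchored to
     * the end display time of the preceding object, so the queue covers the timeline
     * without holes, matching the invariant described above.
     */
    public static Deque<SubtitleObject> build(List<SubtitleStatement> statements) {
        Deque<SubtitleObject> queue = new ArrayDeque<>();
        long previousEnd = 0;
        for (SubtitleStatement s : statements) {
            long start = Math.max(previousEnd, s.startMs());   // start = end of the previous object
            queue.addLast(new SubtitleObject(start, s.endMs(), s.text()));
            previousEnd = s.endMs();
        }
        return queue;
    }

    private SubtitleQueueBuilder() {}
}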
Step S104, receiving the current video playing time from the playing end, and determining the current target subtitle object from the plurality of subtitle objects according to the current video playing time.
Specifically, when the playing end starts playing the video, it establishes a communication connection with the subtitle server. The subtitle server then starts running, acquires the current video playing time from the playing end, and searches the plurality of subtitle objects for the current target subtitle object that matches the current video playing time, based on that time and the start and end display times of each subtitle object.
In one embodiment, step S104 includes:
and matching according to the current video playing time and the initial display time corresponding to each subtitle object, and determining a current target subtitle object matched with the current video playing time in the subtitle queue.
Specifically, after the subtitle server starts running, because the plurality of subtitle objects are stored in the subtitle queue in the preset display time order, the subtitle object whose start display time matches the current video playing time can be looked up in the subtitle queue and used as the current target subtitle object. Matching the current video playing time against the start display time of each subtitle object therefore allows the current target subtitle object to be found quickly, improving lookup efficiency.
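Because the queue is ordered by start display time and covers the timeline contiguously, the current target subtitle object can be located with a binary search over the start display times, as in the following illustrative Java sketch (all names are assumptions).

import java.util.List;

public final class SubtitleMatcher {
    public record SubtitleObject(long startDisplayMs, long endDisplayMs, String content) {}

    /**
     * Finds the index of the subtitle object whose display interval contains the current
     * video playing time. The objects are assumed sorted by start display time and
     * contiguous, so a binary search over start times is sufficient.
     */
    public static int findCurrentTarget(List<SubtitleObject> queue, long currentVideoTimeMs) {
        int lo = 0, hi = queue.size() - 1, match = -1;
        while (lo <= hi) {
            int mid = (lo + hi) >>> 1;
            if (queue.get(mid).startDisplayMs() <= currentVideoTimeMs) {
                match = mid;          // candidate: starts at or before the current time
                lo = mid + 1;
            } else {
                hi = mid - 1;
            }
        }
        // Reject a candidate whose end display time has already passed (time beyond coverage).
        if (match >= 0 && currentVideoTimeMs >= queue.get(match).endDisplayMs()) {
            return -1;
        }
        return match;
    }

    private SubtitleMatcher() {}
}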
Because the subtitle objects are stored in the subtitle queue in the preset display time order, when no pause, fast-forward, or rewind operation is received during video playing, the subtitle objects in the queue can be displayed sequentially in time order, so that subtitle object display stays synchronized with video playback.
Step S105, displaying the current target subtitle object.
Specifically, the current target subtitle object is displayed until its end display time is reached, at which point its display ends.
In one embodiment, after the displaying of the current target subtitle object is finished, the method further includes:
and sequentially acquiring the next subtitle object adjacent to the current target subtitle object from the subtitle queue as the current target subtitle object for displaying.
Specifically, when the end display time of the current target subtitle object is reached, the adjacent next subtitle object in the subtitle queue is taken as the new current target subtitle object and displayed. The subtitle server therefore only needs to obtain the current video playing time to determine the current target subtitle object the first time subtitles are displayed synchronously; afterwards, the adjacent next subtitle object can simply be taken from the subtitle queue in turn as the current target subtitle object, so the subtitle server does not need to repeatedly obtain the current video playing time, which reduces cross-process communication and saves system resources.
Therefore, when no pause, fast-forward, or rewind operation is received during video playing, that is, when the video is in a normal playing state, the next undisplayed subtitle object can be taken from the subtitle queue in turn as the current target subtitle object, keeping subtitle display synchronized with the continuously playing video. A sketch of this playout behaviour follows.
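The sequential display behaviour can be sketched as a simple playout loop: the current video playing time is obtained once to pick the starting object, and from then on the loop only waits for each object's end display time. The Java sketch below assumes normal 1x playback with no pause or seek (those cases are handled later in this embodiment), and all names, including the render placeholder, are illustrative assumptions.

import java.util.List;

public final class SubtitleRenderLoop {
    public record SubtitleObject(long startDisplayMs, long endDisplayMs, String content) {}

    /**
     * Plays out the queue from the first matched object onward. The playing time is
     * queried once to pick the starting object; afterwards the loop simply waits for
     * each object's end display time and moves to the adjacent next object, so no
     * further cross-process time queries are needed during normal playback.
     */
    public static void run(List<SubtitleObject> queue, int startIndex, long currentVideoTimeMs)
            throws InterruptedException {
        long clockOffset = System.currentTimeMillis() - currentVideoTimeMs; // wall clock vs. video time
        for (int i = startIndex; i < queue.size(); i++) {
            SubtitleObject target = queue.get(i);
            render(target.content());                            // show the current target object
            long wakeUpAt = clockOffset + target.endDisplayMs(); // end display time on the wall clock
            long sleepMs = wakeUpAt - System.currentTimeMillis();
            if (sleepMs > 0) {
                Thread.sleep(sleepMs);                           // display until the end display time
            }
        }
        render("");                                              // clear once the queue is exhausted
    }

    private static void render(String content) {
        // Placeholder for the actual on-screen drawing done by the subtitle server.
        System.out.println(content.isEmpty() ? "[no subtitle]" : content);
    }

    private SubtitleRenderLoop() {}
}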
The subtitle server can run as a single background service and can be readily extended to other media frameworks without depending on a specific player. Because subtitle synchronization on the subtitle server depends only lightly on the current video playing time, the number of cross-process communications is greatly reduced and system performance is improved.
In one embodiment, the displaying of the current target subtitle object in step S105 includes:
acquiring the ending display time of the current target subtitle object;
and ending the display of the current target subtitle object according to the ending display time.
Therefore, the display time of the current target subtitle object can be accurately controlled, and the subtitle display and the video playing progress are ensured to be consistent.
In one embodiment, after displaying the current target subtitle object, the method further includes:
and re-receiving the current video playing time from the playing end, re-determining the current target subtitle object from the subtitle objects according to the re-received current video playing time, and displaying the re-determined current target subtitle object.
Therefore, after a subtitle object has been displayed synchronously once, the subtitle server can reacquire the current video playing time and use the reacquired time to display the subtitle object synchronously again, which ensures that the video playing time and the subtitle display time remain synchronized and improves the precision of synchronous subtitle display.
It is noted that, when the video playing progress is adjusted, the subtitle server may adjust and display the subtitle object correspondingly according to the video playing progress. Specifically, the subtitle synchronization method of the present embodiment further includes:
receiving a playing control instruction from the playing end, and sending a video synchronization request to the playing end according to the playing control instruction;
and receiving playing time updating information from the playing end, and determining a current target subtitle object from the plurality of subtitle objects according to the playing time updating information, wherein the playing time updating information is generated by the playing end according to the video synchronization request.
In this embodiment, the playing control instruction may include a pause playing instruction, a resume playing instruction, and a time selection instruction. The following describes the process of adjusting the caption object by the caption service end according to the pause playing instruction, the resume playing instruction and the time selection instruction.
In one embodiment, the subtitle server receives a pause playing instruction from the playing end and, according to the pause playing instruction, stops updating the current subtitle object. When it then receives a resume playing instruction from the playing end, it sends a first video synchronization request to the playing end according to the resume playing instruction. The subtitle server receives first playing time update information from the playing end, determines the next target subtitle object from the plurality of subtitle objects according to the first playing time update information, and displays the next target subtitle object, where the first playing time update information is generated by the playing end according to the first video synchronization request.
For example, if a pause operation is invoked at the playing end, the playing end pauses video playback and sends a pause playing instruction to the subtitle server, synchronously notifying it to pause refreshing the subtitle object, and the subtitle server keeps displaying the currently displayed subtitle object. When the video resumes at the playing end, the playing end synchronously sends a resume playing instruction to the subtitle server. The subtitle server generates a first video synchronization request according to the resume playing instruction and sends it to the playing end; the playing end generates first playing time update information according to the first video synchronization request and returns it to the subtitle server. The subtitle server then determines the next target subtitle object from the plurality of subtitle objects according to the first playing time update information, displays it, and resumes the synchronous rendering of subtitle objects.
In another embodiment, the subtitle service end receives a time selection instruction from the playing end, and sends a second video synchronization request to the playing end according to the time selection instruction;
and receiving second playing time updating information from the playing end, determining a current target subtitle object from the plurality of subtitle objects according to the second playing time updating information, and displaying the current target subtitle object, wherein the second playing time updating information is generated by the playing end according to the second video synchronization request.
For example, if a time selection (seek) operation is invoked at the playing end, the current playback position changes and the playing end sends a time selection instruction to the subtitle server. The subtitle server generates a second video synchronization request according to the time selection instruction and sends it to the playing end. The playing end receives the second video synchronization request, generates second playing time update information according to it, and sends the information to the subtitle server. The subtitle server receives the second playing time update information, determines the next target subtitle object from the plurality of subtitle objects according to it, displays that subtitle object, and restarts the synchronous rendering of subtitle objects.
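The pause, resume, and seek handling described in the two examples above can be summarised in a small state holder such as the following Java sketch: pausing stops subtitle refreshing, while resume and seek trigger a video synchronization request whose returned playing time is used to re-select the current target subtitle object. The PlayingEnd callback and all other names are assumptions made for illustration, not part of the patent.

import java.util.List;

public final class PlayControlHandler {
    public record SubtitleObject(long startDisplayMs, long endDisplayMs, String content) {}

    /** Implemented by the playing end; answers the video synchronization request. */
    public interface PlayingEnd {
        long requestUpdatedPlayingTimeMs();
    }

    public enum Instruction { PAUSE, RESUME, SEEK }

    private final List<SubtitleObject> queue;
    private final PlayingEnd playingEnd;
    private boolean paused;
    private int currentIndex = -1;

    public PlayControlHandler(List<SubtitleObject> queue, PlayingEnd playingEnd) {
        this.queue = queue;
        this.playingEnd = playingEnd;
    }

    /** Reacts to a play control instruction received from the playing end. */
    public void onInstruction(Instruction instruction) {
        switch (instruction) {
            case PAUSE -> paused = true;          // keep showing the current object, stop refreshing
            case RESUME, SEEK -> {
                paused = false;
                // Video synchronization request: ask the playing end for the updated playing time,
                // then re-select the current target subtitle object from the queue.
                long updatedTimeMs = playingEnd.requestUpdatedPlayingTimeMs();
                currentIndex = findByTime(updatedTimeMs);
            }
        }
    }

    private int findByTime(long timeMs) {
        for (int i = 0; i < queue.size(); i++) {
            SubtitleObject o = queue.get(i);
            if (timeMs >= o.startDisplayMs() && timeMs < o.endDisplayMs()) {
                return i;
            }
        }
        return -1;
    }

    public boolean isPaused() { return paused; }
    public int currentIndex() { return currentIndex; }
}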
The subtitle server of this embodiment exists independently of the playing end. It can communicate and exchange information with the player media framework of the Android system, and it is independent of other modules: if other modules crash, the subtitle server continues to work normally without being affected. Because the subtitle server is designed as a background service that the player can invoke remotely, it can be extended very easily, covers more external-subtitle scenarios, and can conveniently be extended to graphic subtitles. The subtitle server achieves high subtitle synchronization accuracy with little interaction with other processes, improving system performance. In addition, in this embodiment the subtitle objects for the real subtitle statements and the caption-free time periods are obtained in continuously increasing time order, so the subtitle objects can be displayed sequentially in time order; this reduces time comparisons during subtitle synchronization, allows subtitles to be refreshed promptly, and improves the viewing experience.
The subtitle synchronization method provided by the present disclosure includes: receiving a code stream file name from the playing end, and acquiring the corresponding subtitle file according to the code stream file name; parsing the subtitle file to obtain a plurality of subtitle statements; instantiating the plurality of subtitle statements to obtain a plurality of subtitle objects; receiving the current video playing time from the playing end, and determining a current target subtitle object from the subtitle objects according to the current video playing time; and displaying the current target subtitle object. In the subtitle synchronization scheme provided by this embodiment, the subtitle statements of the subtitle file are instantiated into a plurality of subtitle objects, the current target subtitle object is determined from the plurality of subtitle objects based on the current video playing time, and the current target subtitle object is displayed, which reduces the number of cross-process communications and improves the subtitle synchronization effect.
Example 2
In addition, the embodiment of the disclosure provides a subtitle synchronization device, which is applied to a subtitle service end, and the subtitle service end is in communication connection with a playing end.
Specifically, as shown in fig. 2, the subtitle synchronization apparatus 200 includes:
an obtaining module 201, configured to receive a code stream file name from the playing end, and obtain a corresponding subtitle file according to the code stream file name;
the parsing module 202 is configured to parse the subtitle file to obtain a plurality of subtitle statements;
the processing module 203 is configured to perform instantiation processing on the multiple subtitle statements to obtain multiple subtitle objects;
a determining module 204, configured to receive a current video playing time from the playing end, and determine a current target subtitle object from the subtitle objects according to the current video playing time;
a display module 205, configured to display the current target subtitle object.
In an embodiment, the parsing module 202 is further configured to parse the subtitle file to obtain a real subtitle statement and a subtitle-free time period;
and generating an empty caption statement according to the caption-free time interval.
In an embodiment, the processing module 203 is further configured to perform instantiation processing on each caption statement according to a start display time and an end display time corresponding to each caption statement, so as to obtain a plurality of caption objects;
and storing the plurality of subtitle objects in a subtitle queue according to a preset display time sequence, wherein the starting display time of each subtitle object in the subtitle queue is the ending display time of the adjacent last subtitle object.
In an embodiment, the processing module 203 is further configured to perform matching according to the current video playing time and an initial display time corresponding to each subtitle object, and determine a current target subtitle object matched with the current video playing time in the subtitle queue.
In an embodiment, the processing module 203 is further configured to sequentially acquire, from the subtitle queue, an adjacent next subtitle object of the current target subtitle object as the current target subtitle object to display.
In an embodiment, the display module 205 is further configured to obtain an ending display time of the current target subtitle object;
and ending the display of the current target subtitle object according to the ending display time.
In an embodiment, the display module 205 is further configured to receive a play control instruction from the play end, and send a video synchronization request to the play end according to the play control instruction;
and receiving playing time updating information from the playing end, and determining a current target subtitle object from the plurality of subtitle objects according to the playing time updating information, wherein the playing time updating information is generated by the playing end according to the video synchronization request.
The subtitle synchronization apparatus 200 provided in this embodiment may implement the subtitle synchronization method provided in embodiment 1, and for avoiding repetition, details are not described herein again.
The subtitle synchronization apparatus provided in this embodiment receives a code stream file name from the playing end and acquires the corresponding subtitle file according to the code stream file name; parses the subtitle file to obtain a plurality of subtitle statements; instantiates the plurality of subtitle statements to obtain a plurality of subtitle objects; receives the current video playing time from the playing end and determines a current target subtitle object from the plurality of subtitle objects according to the current video playing time; and displays the current target subtitle object. In the subtitle synchronization scheme provided by this embodiment, the subtitle statements of the subtitle file are instantiated into a plurality of subtitle objects, the current target subtitle object is determined from the plurality of subtitle objects based on the current video playing time, and the current target subtitle object is displayed, which reduces the number of cross-process communications and improves the subtitle synchronization effect.
Example 3
In addition, an embodiment of the present disclosure provides a set top box, which includes a memory and a processor, where the memory stores a computer program, and the computer program executes the subtitle synchronization method provided in embodiment 1 when running on the processor.
The set top box provided in this embodiment may implement the subtitle synchronization method provided in embodiment 1, and is not described herein again to avoid repetition.
Example 4
The embodiments disclosed in the present application further provide a computer-readable storage medium, on which a computer program is stored, and when the computer program runs on a processor, the subtitle synchronization method provided in embodiment 1 is performed.
In this embodiment, the computer-readable storage medium may be a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk.
The computer-readable storage medium provided in this embodiment may implement the subtitle synchronization method provided in embodiment 1, and details are not described herein again to avoid repetition.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or terminal that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or terminal. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or terminal that comprises the element.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present application may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present application.
While the present embodiments have been described with reference to the accompanying drawings, it is to be understood that the invention is not limited to the precise embodiments described above, which are meant to be illustrative and not restrictive, and that various changes may be made therein by those skilled in the art without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (10)

1. A subtitle synchronization method is applied to a subtitle service end which is in communication connection with a playing end, and comprises the following steps:
receiving a code stream file name from the playing end, and acquiring a corresponding subtitle file according to the code stream file name;
analyzing the caption file to obtain a plurality of caption sentences;
instantiating the plurality of caption sentences to obtain a plurality of caption objects;
receiving the current video playing time from the playing end, and determining a current target subtitle object from a plurality of subtitle objects according to the current video playing time;
and displaying the current target subtitle object.
2. The method of claim 1, wherein parsing the subtitle file to obtain a plurality of subtitle sentences comprises:
analyzing the caption file to obtain a real caption sentence and a caption-free time interval;
and generating an empty caption statement according to the caption-free time interval.
3. The method of claim 1, wherein instantiating the plurality of caption statements to obtain a plurality of caption objects comprises:
instantiating each caption sentence according to the initial display time and the end display time corresponding to each caption sentence to obtain a plurality of caption objects;
and storing the plurality of subtitle objects in a subtitle queue according to a preset display time sequence, wherein the starting display time of each subtitle object in the subtitle queue is the ending display time of the adjacent last subtitle object.
4. The method of claim 3, wherein the determining a current target subtitle object from the plurality of subtitle objects according to the current video playing time comprises:
and matching according to the current video playing time and the initial display time corresponding to each subtitle object, and determining a current target subtitle object matched with the current video playing time in the subtitle queue.
5. The method of claim 3, further comprising, after the end of the display of the current target subtitle object:
and sequentially acquiring the next subtitle object adjacent to the current target subtitle object from the subtitle queue as the current target subtitle object for displaying.
6. The method of claim 1, wherein the displaying the current target subtitle object further comprises:
acquiring the ending display time of the current target subtitle object;
and ending the display of the current target subtitle object according to the ending display time.
7. The method of claim 1, wherein the displaying the current target subtitle object comprises:
receiving a playing control instruction from the playing end, and sending a video synchronization request to the playing end according to the playing control instruction;
and receiving playing time updating information from the playing end, and determining a current target subtitle object from the plurality of subtitle objects according to the playing time updating information, wherein the playing time updating information is generated by the playing end according to the video synchronization request.
8. A subtitle synchronization device is applied to a subtitle service end which is in communication connection with a playing end, and the device comprises:
the acquisition module is used for receiving the code stream file name from the playing end and acquiring a corresponding subtitle file according to the code stream file name;
the analysis module is used for analyzing the caption file to obtain a plurality of caption sentences;
the processing module is used for carrying out instantiation processing on the plurality of caption sentences to obtain a plurality of caption objects;
a determining module, configured to receive a current video playing time from the playing end, and determine a current target subtitle object from the subtitle objects according to the current video playing time;
and the display module is used for displaying the current target subtitle object.
9. A set-top box comprising a memory and a processor, the memory storing a computer program which, when executed by the processor, performs the method of synchronizing subtitles according to any one of claims 1 to 7.
10. A computer-readable storage medium, characterized in that it stores a computer program which, when run on a processor, performs the method of subtitle synchronization of any of claims 1-7.
CN202210223615.5A 2022-03-09 2022-03-09 Subtitle synchronization method and device, set top box and computer readable storage medium Pending CN114640874A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202210223615.5A CN114640874A (en) 2022-03-09 2022-03-09 Subtitle synchronization method and device, set top box and computer readable storage medium
PCT/CN2023/078379 WO2023169240A1 (en) 2022-03-09 2023-02-27 Subtitle synchronization method and apparatus, set-top box and computer readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210223615.5A CN114640874A (en) 2022-03-09 2022-03-09 Subtitle synchronization method and device, set top box and computer readable storage medium

Publications (1)

Publication Number Publication Date
CN114640874A true CN114640874A (en) 2022-06-17

Family

ID=81948161

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210223615.5A Pending CN114640874A (en) 2022-03-09 2022-03-09 Subtitle synchronization method and device, set top box and computer readable storage medium

Country Status (2)

Country Link
CN (1) CN114640874A (en)
WO (1) WO2023169240A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023169240A1 (en) * 2022-03-09 2023-09-14 湖南国科微电子股份有限公司 Subtitle synchronization method and apparatus, set-top box and computer readable storage medium

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117319738B (en) * 2023-12-01 2024-03-08 飞狐信息技术(天津)有限公司 Subtitle delay method and device, electronic equipment and storage medium

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150095938A1 (en) * 2013-09-30 2015-04-02 Hulu, LLC Queue to Display Additional Information for Entities in Captions
CN104506957A (en) * 2014-12-08 2015-04-08 广东欧珀移动通信有限公司 Method and device for displaying subtitles
CN105704582A (en) * 2015-05-11 2016-06-22 深圳Tcl数字技术有限公司 Browser-based subtitle displaying method and device
CN105898556A (en) * 2015-12-30 2016-08-24 乐视致新电子科技(天津)有限公司 Plug-in subtitle automatic synchronization method and device
KR20170047547A (en) * 2015-10-23 2017-05-08 엘지전자 주식회사 Display device and method for controlling the same
CN106804011A (en) * 2017-02-10 2017-06-06 深圳创维数字技术有限公司 The method and system of loading caption file during a kind of broadcasting video
WO2017191397A1 (en) * 2016-05-03 2017-11-09 Orange Method and device for synchronising subtitles
CN108259963A (en) * 2018-03-19 2018-07-06 成都星环科技有限公司 A kind of TV ends player
CN109525899A (en) * 2018-11-19 2019-03-26 青岛海信传媒网络技术有限公司 The method and device that subtitle and audio video synchronization are presented
US20200007902A1 (en) * 2018-06-29 2020-01-02 Alibaba Group Holding Limited Video subtitle display method and apparatus

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008141693A (en) * 2006-12-05 2008-06-19 Matsushita Electric Ind Co Ltd Content reproducing apparatus and content reproduction method
US20140003792A1 (en) * 2012-06-29 2014-01-02 Kourosh Soroushian Systems, methods, and media for synchronizing and merging subtitles and media content
CN106791494B (en) * 2016-12-19 2019-02-26 腾讯科技(深圳)有限公司 The generation method and device of video caption
CN111526414B (en) * 2020-04-30 2022-06-07 青岛海信传媒网络技术有限公司 Subtitle display method and display equipment
CN112601101B (en) * 2020-12-11 2023-02-24 北京有竹居网络技术有限公司 Subtitle display method and device, electronic equipment and storage medium
CN114640874A (en) * 2022-03-09 2022-06-17 湖南国科微电子股份有限公司 Subtitle synchronization method and device, set top box and computer readable storage medium

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150095938A1 (en) * 2013-09-30 2015-04-02 Hulu, LLC Queue to Display Additional Information for Entities in Captions
CN104506957A (en) * 2014-12-08 2015-04-08 广东欧珀移动通信有限公司 Method and device for displaying subtitles
CN105704582A (en) * 2015-05-11 2016-06-22 深圳Tcl数字技术有限公司 Browser-based subtitle displaying method and device
KR20170047547A (en) * 2015-10-23 2017-05-08 엘지전자 주식회사 Display device and method for controlling the same
CN105898556A (en) * 2015-12-30 2016-08-24 乐视致新电子科技(天津)有限公司 Plug-in subtitle automatic synchronization method and device
WO2017191397A1 (en) * 2016-05-03 2017-11-09 Orange Method and device for synchronising subtitles
CN106804011A (en) * 2017-02-10 2017-06-06 深圳创维数字技术有限公司 The method and system of loading caption file during a kind of broadcasting video
CN108259963A (en) * 2018-03-19 2018-07-06 成都星环科技有限公司 A kind of TV ends player
US20200007902A1 (en) * 2018-06-29 2020-01-02 Alibaba Group Holding Limited Video subtitle display method and apparatus
CN109525899A (en) * 2018-11-19 2019-03-26 青岛海信传媒网络技术有限公司 The method and device that subtitle and audio video synchronization are presented

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023169240A1 (en) * 2022-03-09 2023-09-14 湖南国科微电子股份有限公司 Subtitle synchronization method and apparatus, set-top box and computer readable storage medium

Also Published As

Publication number Publication date
WO2023169240A1 (en) 2023-09-14

Similar Documents

Publication Publication Date Title
CN114640874A (en) Subtitle synchronization method and device, set top box and computer readable storage medium
EP2978232A1 (en) Method and device for adjusting playback progress of video file
CN112601101B (en) Subtitle display method and device, electronic equipment and storage medium
CN110769265A (en) Simultaneous caption translation method, smart television and storage medium
US20130212514A1 (en) Method and Device for Displaying Start-Up Interface of Multimedia Terminal
KR20070020208A (en) Method and apparatus for locating content in a program
CN109348145B (en) Method and device for generating associated bullet screen based on subtitle and computer readable medium
CN1881415A (en) Information processing apparatus and method therefor
CN113035199B (en) Audio processing method, device, equipment and readable storage medium
AU2014259879B2 (en) Interactive viewing experiences by detecting on-screen text
CN111898388A (en) Video subtitle translation editing method and device, electronic equipment and storage medium
CN111698565B (en) Video playing method and device and electronic equipment
CN112616062A (en) Subtitle display method and device, electronic equipment and storage medium
CN106792114A (en) The changing method and device of captions
CN103517104A (en) Set top box and video captions composite method based on network broadcasting
CN108491178B (en) Information browsing method, browser and server
CN113886612A (en) Multimedia browsing method, device, equipment and medium
US20210073479A1 (en) Information processing apparatus and non-transitory computer readable medium
CN113986161A (en) Method and device for real-time word extraction in audio and video communication
CN114925656B (en) Rich text display method, device, equipment and storage medium
CN111327961A (en) Video subtitle switching method and system
CN112055262B (en) Method and system for displaying network streaming media subtitles
CN112711954B (en) Translation method, translation device, electronic equipment and storage medium
CN115495185A (en) Method and device for displaying page elements
KR101869053B1 (en) System of providing speech bubble or score, method of receiving augmented broadcasting contents and apparatus for performing the same, method of providing augmented contents and apparatus for performing the same

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination