WO2013152562A1 - Method and system for synchronizing mobile multimedia broadcast and subtitles - Google Patents

Method and system for synchronizing mobile multimedia broadcast and subtitles

Info

Publication number
WO2013152562A1
WO2013152562A1 PCT/CN2012/077420 CN2012077420W
Authority
WO
WIPO (PCT)
Prior art keywords
subtitle
time
audio
frame
relative
Prior art date
Application number
PCT/CN2012/077420
Other languages
English (en)
Chinese (zh)
Inventor
夏智海
黄泽武
陈志兵
Original Assignee
中兴通讯股份有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 中兴通讯股份有限公司 filed Critical 中兴通讯股份有限公司
Publication of WO2013152562A1 publication Critical patent/WO2013152562A1/fr

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/85Assembly of content; Generation of multimedia applications
    • H04N21/854Content authoring
    • H04N21/8547Content authoring involving timestamps for synchronizing content
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/4302Content synchronisation processes, e.g. decoder synchronisation
    • H04N21/4307Synchronising the rendering of multiple content streams or additional data on devices, e.g. synchronisation of audio on a mobile phone with the video output on the TV screen
    • H04N21/43074Synchronising the rendering of multiple content streams or additional data on devices, e.g. synchronisation of audio on a mobile phone with the video output on the TV screen of additional data with content streams on the same device, e.g. of EPG data or interactive icon with a TV program
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/414Specialised client platforms, e.g. receiver in car or embedded in a mobile appliance
    • H04N21/41407Specialised client platforms, e.g. receiver in car or embedded in a mobile appliance embedded in a portable device, e.g. video client on a mobile phone, PDA, laptop
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • H04N21/488Data services, e.g. news ticker
    • H04N21/4884Data services, e.g. news ticker for displaying subtitles
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/81Monomedia components thereof
    • H04N21/8126Monomedia components thereof involving additional data, e.g. news, sports, stocks, weather forecasts

Definitions

  • the present invention relates to the field of mobile multimedia broadcasting, and in particular, to a method and system for synchronizing mobile multimedia broadcast subtitles.
  • CMMB China Mobile Multimedia Broadcasting
  • In one existing approach, the subtitles are embedded in the audio data itself, so the subtitle form is fixed and single;
  • in another approach, the subtitle information is transmitted in the mobile multimedia broadcast data segment, which solves the problem of fixed, single-form subtitles; however, this method cannot guarantee that the broadcast subtitles and audio are synchronized, and subtitles cannot be inserted flexibly (for example, when programs A and B are played in sequence and only program B provides subtitles);
  • in another approach, the subtitles are sent to the terminal and the user manually adjusts the subtitle playing time to synchronize the subtitles with the audio, which gives a poor user experience;
  • in yet another approach, after the subtitle is sent to the terminal, synchronization information for the subtitle and audio is sent to the terminal, and the terminal adjusts the subtitle playing time according to that information. In this manner, the terminal must reserve a large storage space to buffer the entire subtitle file, and the subtitles must be sent long in advance, resulting in poor real-time performance.
  • Embodiments of the present invention provide a mobile multimedia subtitle synchronization method and system to solve the technical problem of how to implement subtitle and audio synchronization.
  • the character string of the subtitle, the audio start playing time, and the time offset of the subtitle relative to the audio start playing time are encapsulated into the data segment of the audio sub-frame corresponding to the subtitle;
  • the data segment is transmitted to a mobile multimedia broadcast terminal.
  • the step of determining a time offset of the subtitle relative to an audio start playing time includes:
  • a time offset of the subtitle relative to the start time of the audio playback is determined based on the synchronization flag.
  • the method further includes determining, according to the start time of the subtitle in a local time zone, the encapsulation time, where:
  • the start time of the subtitle in the local time zone is M
  • the current time (TOD) provided by the local GPS is N
  • the transmission time difference from receiving the subtitle data packet to transmitting the data segment is R
  • the encapsulation time is S
  • the units of M, N, R and S are seconds
  • the subtitle obtained from the subtitle data packet may be composed of multiple subtitle pieces
  • the character string of the subtitle piece, the audio start playing time, and the time offset of the subtitle piece relative to the audio start playing time are encapsulated into the data segment of the audio sub-frame corresponding to the subtitle.
  • the step of determining, according to the synchronization mark, a time offset of the subtitle piece relative to an audio start playing time includes:
  • the time offset of the subtitle piece relative to the audio start playing time is calculated.
  • a receiving module configured to receive a subtitle data packet sent by the subtitle server and an audio sub-frame sent by the audio encoder
  • An obtaining module configured to acquire a subtitle from the subtitle data packet, and obtain an audio starting time from the audio sub-frame
  • a time offset determination module configured to determine a time offset of the subtitle relative to an audio start playing time
  • a packaging module configured to encapsulate, when the encapsulation time arrives, the character string of the subtitle, the audio start playing time, and the time offset of the subtitle relative to the audio start playing time into the data segment of the audio sub-frame corresponding to the subtitle;
  • a sending module configured to send the data segment to the mobile multimedia broadcast terminal.
  • the time offset determining module is configured to determine a time offset of the subtitle relative to an audio start playing time in the following manner:
  • the encapsulating module is further configured to determine the encapsulation time according to a start time of the subtitle in a local time zone in the following manner:
  • the start time of the subtitle in the local time zone is M
  • the current time (TOD) provided by the local GPS is N
  • the transmission time difference from receiving the subtitle data packet to transmitting the data segment is R
  • the encapsulation time is S
  • the units of M, N, R and S are seconds
  • the acquiring module is configured to obtain a subtitle composed of a plurality of subtitle pieces from the subtitle data packet;
  • the time offset determining module is configured to determine, according to the synchronization flag, a time offset of the subtitle piece relative to the audio start playing time;
  • the encapsulating module is configured to, when the encapsulation time arrives, encapsulate the character string of the subtitle piece, the audio start playing time, and the time offset of the subtitle piece relative to the audio start playing time into the data segment of the audio sub-frame corresponding to the subtitle.
  • the time offset determining module is configured to determine, according to the synchronization flag, the time offset of the subtitle piece relative to the audio start playing time in the following manner:
  • the time offset of the subtitle piece relative to the audio start playing time is calculated.
  • In the above solution, the data segment of the encapsulated audio sub-frame includes the audio start playing time and the time offset of the subtitle relative to the audio start playing time, so that the mobile multimedia broadcast terminal receiving the audio sub-frame can determine the playing time of the subtitle from that time offset, achieving the technical effect of synchronizing subtitles with audio.
  • FIG. 1 is a flowchart of a mobile multimedia broadcast caption synchronization method according to an embodiment of the present invention
  • FIG. 2 is a schematic diagram of the encapsulation format used by the subtitle server of the application example to transmit a data stream to the multiplexer;
  • FIG. 3 is a schematic diagram of a format of a data segment of an audio sub-frame in which caption information is encapsulated according to an application example
  • FIG. 4 is a block diagram of the modules of the mobile multimedia broadcast subtitle synchronization system of the present embodiment.
  • FIG. 1 is a flowchart of a mobile multimedia broadcast caption synchronization method according to an embodiment of the present invention.
  • S101: receiving a subtitle data packet sent by the subtitle server and an audio sub-frame sent by the audio encoder; S102: acquiring the audio start playing time from the audio sub-frame;
  • The subtitle data packet may also include information indicating the timing system of the subtitle data packet (such as the Coordinated Universal Time, UTC, system), information indicating the time zone in which the subtitle data packet is located, and the start time of the subtitle in that time zone.
  • After receiving the data packet, the receiving end corrects the start time of the subtitle in the packet's time zone to the start time of the subtitle in the receiving end's local time zone, according to the information indicating the timing system of the subtitle data packet, the information indicating the time zone of the subtitle data packet, and the time zone in which the receiving end is located.
  • For example, if the timing system of the subtitle data packet is UTC, the time zone of the subtitle data packet is UTC+5, the start time of the subtitle in that time zone is 0 o'clock, and the local time zone of the receiving end is UTC+8, then the corrected start time of the subtitle in the local time zone is 3 o'clock.
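The time-zone correction described above amounts to shifting the start time by the difference between the two UTC offsets. A minimal sketch (the function name and whole-hour granularity are illustrative assumptions, not part of the patent):

```python
def to_local_start_time(start_hour, packet_utc_offset, local_utc_offset):
    """Correct a subtitle start time (in whole hours) from the packet's
    time zone to the receiver's local time zone; offsets are hours
    east of UTC."""
    return (start_hour - packet_utc_offset + local_utc_offset) % 24

# Example from the text: packet in UTC+5, subtitle starts at 0 o'clock,
# receiver in UTC+8 -> corrected local start time is 3 o'clock.
print(to_local_start_time(0, 5, 8))  # 3
```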
  • The subtitle data packet may further include other information according to different application requirements: for example, to clearly distinguish it from the audio sub-frame, information indicating that the data packet carries a subtitle is added; to enable the terminal to select the correct character set, the subtitle string encoding mode is added; and to let the terminal determine the subtitle display duration and end the subtitle display at an appropriate time, the subtitle duration is added to the packet.
  • S104: determining a time offset of the subtitle relative to the audio start playing time;
  • The encapsulation time is determined according to the start time of the subtitle in the local time zone. The determining step includes: letting the start time of the subtitle in the local time zone be M, the current time of day (TOD) provided by the local global positioning system (GPS) be N,
  • the transmission time difference from receiving the subtitle data packet to transmitting the data segment be R, and the encapsulation time be S; the units of M, N, R, and S are seconds;
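The formula relating M, N, R and S is partly garbled in this copy of the text; one plausible reading is that the timer length S is the whole-second interval until the subtitle must air, less the transmission delay. A hedged sketch under that assumption:

```python
import math

def encapsulation_timer_seconds(M, N, R):
    """Assumed form of the encapsulation-time computation:
    M = subtitle start time in the local time zone (s),
    N = current GPS time of day (s),
    R = measured receive-to-send transmission delay (s).
    Returns S, the timer length in whole seconds."""
    return abs(math.floor(M) - math.floor(N)) - math.floor(R)

# Subtitle airs at second 36030 of the day, current TOD is 36000 s,
# transmission delay 2.4 s -> run the timer for 28 s, then encapsulate.
print(encapsulation_timer_seconds(36030.0, 36000.0, 2.4))  # 28
```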
  • S106: sending the data segment to the mobile multimedia broadcast terminal.
  • In other embodiments, the subtitle string parsed from the subtitle data packet may be composed of a plurality of subtitle pieces, which facilitates transmission of long subtitle data and supports multi-language subtitles.
  • In this case, the time offset of each subtitle piece relative to the audio start playing time may be determined according to the synchronization mark; when the encapsulation time arrives, the audio start playing time, the time offset of the subtitle piece relative to the audio start playing time, and the character string of the subtitle piece are encapsulated into the data segment of the audio sub-frame corresponding to the subtitle.
  • After receiving the data segment of the audio sub-frame, the mobile multimedia broadcast terminal plays the character string of the subtitle according to the audio start playing time contained in the data segment and the time offset of the subtitle relative to that start time; if subtitle piece information is encapsulated in the data segment, the character string of each subtitle piece is played according to the audio start playing time contained in the data segment and the time offset of that subtitle piece relative to the audio start playing time.
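On the terminal side, the scheduling described above is simply "audio start time plus per-piece offset". A minimal sketch (function and field names are illustrative, not from the patent):

```python
def schedule_subtitle_pieces(audio_start, pieces):
    """pieces: (offset_s, duration_s, text) tuples as carried in the
    data segment; returns (show_at, hide_at, text) in audio-clock
    seconds for each subtitle piece."""
    return [(audio_start + off, audio_start + off + dur, text)
            for off, dur, text in pieces]

print(schedule_subtitle_pieces(36000, [(5, 3, "Hello"), (10, 4, "World")]))
# [(36005, 36008, 'Hello'), (36010, 36014, 'World')]
```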
  • the mobile multimedia broadcast caption synchronization method of the above embodiment is further described below with a specific application example.
  • the application example relates to a subtitle server, an audio encoder and a multiplexer, wherein the data stream transmitted between the subtitle server and the multiplexer is encapsulated in UDP, and the encapsulation format is as shown in FIG. 2.
  • the subtitle synchronization method of this application example includes the following steps:
  • Step 1: The subtitle server sends a UDP-encapsulated data stream to the multiplexer, where the body of each UDP message forms a subtitle data packet; the audio encoder sends audio sub-frames to the multiplexer. The subtitle data packet includes information indicating the time zone of the subtitle data packet, the subtitle update sequence number, the start time of the subtitle in its time zone, the encoding mode, the number of subtitle pieces, the subtitle piece 1 parameter, the subtitle piece 2 parameter, ..., the subtitle piece n parameter, and the subtitle content, wherein:
  • the subtitle update sequence number (4 bit) is incremented by 1 each time the subtitle is updated;
  • the multiplexer compares the sequence number carried in a newly received subtitle data packet with the sequence number of the previously received packet to determine whether the subtitle has been updated;
  • the start time of the subtitle in its time zone (32bit);
  • the subtitle piece n parameter (80bit), which further contains three fields, respectively: subtitle piece n play time (32bit), subtitle piece n end time (32bit) and subtitle piece n length (16bit);
  • Subtitle content: according to the number of subtitle pieces included in the subtitle data packet and the length field of each subtitle piece n, the corresponding subtitle piece content can be extracted from the subtitle content field.
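The 80-bit "subtitle piece n parameter" above is three fixed-width fields: play time (32 bit), end time (32 bit) and length (16 bit). A minimal packing sketch; the big-endian field order is an assumption, as the text specifies only the widths:

```python
import struct

# 32-bit play time + 32-bit end time + 16-bit length = 80 bits.
PIECE_PARAM = struct.Struct(">IIH")  # 4 + 4 + 2 bytes

def pack_piece_param(play_time, end_time, length):
    """Serialize one subtitle-piece parameter block."""
    return PIECE_PARAM.pack(play_time, end_time, length)

raw = pack_piece_param(36005, 36008, 10)
print(len(raw) * 8)             # 80
print(PIECE_PARAM.unpack(raw))  # (36005, 36008, 10)
```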
  • Step 2: After receiving the UDP-encapsulated data stream, the multiplexer extracts the message body from the UDP message to form the subtitle data packet; after receiving an audio sub-frame, the multiplexer extracts the audio start playing time from the audio sub-frame;
  • Step 3: The multiplexer parses, from the UDP-encapsulated data stream, the information identifying the relationship between the subtitle and the sub-frame, and determines the sub-frame corresponding to the subtitle according to that information;
  • Step 4: The multiplexer obtains, from the subtitle data packet, the information indicating the time zone of the subtitle data packet and the start time of the subtitle in that time zone, and compares the packet's time zone with the time zone in which the multiplexer is located (i.e., the local time zone). If they are consistent, the start time contained in the subtitle packet is taken as the start time of the subtitle in the local time zone; if not, the start time of the subtitle in the packet's time zone is corrected to the start time of the subtitle in the local time zone;
  • Step 5 The multiplexer sets a synchronization mark of the subtitle and the audio sub-frame according to the start time of the subtitle in the local time zone and the time stamp of the audio sub-frame corresponding to the subtitle;
  • Step 6: The multiplexer determines, according to the synchronization mark, the time offset of the subtitle piece relative to the audio start playing time;
  • Step 7: The multiplexer obtains the current time TOD from the local GPS as N, and obtains, from debugging statistics, the receive-to-send time difference as R; the units of N and R are seconds;
  • Step 8: The multiplexer starts a timer; when the timer expires, the subtitle information is encapsulated into the data segment of the audio sub-frame corresponding to the subtitle.
  • the format of the data segment is as shown in FIG. 3.
  • / _ 3 ⁇ 4or(
  • J floor(R)
  • ⁇ ( ⁇ ) means round
  • means absolute value
  • means the start time of the local time zone of the subtitle, The unit is seconds;
  • the data segment is composed of a data segment header, data unit 1, data unit 2, ..., data unit n; the data segment header is further composed of the number of data units, the data unit 1 parameter, the data unit 2 parameter, ..., the data unit n parameter, and a CRC; each data unit n parameter includes two fields: data unit type and data unit length;
  • data unit n is composed, in order, of the service ID (16 bit), a reserved field (16 bit), the encoding mode (32 bit), a reserved field, the audio start playing time (32 bit), the time offset of subtitle piece n relative to the audio start playing time (16 bit), the continuous play time of subtitle piece n (32 bit), the length of subtitle piece n (16 bit), and the subtitle content;
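The fixed-width head of data unit n can be serialized with standard struct packing. A hedged sketch: byte order and UTF-8 content encoding are assumptions, and the second reserved field (whose width the text does not give) is omitted here:

```python
import struct

# service ID (16 bit), reserved (16 bit), encoding mode (32 bit),
# audio start playing time (32 bit), piece time offset (16 bit),
# piece continuous play time (32 bit), piece length (16 bit),
# followed by the subtitle content bytes.
UNIT_HEAD = struct.Struct(">HHIIHIH")  # 20 bytes, no padding

def pack_data_unit(service_id, encoding, audio_start, offset,
                   duration, text):
    """Serialize one data unit: fixed head plus subtitle content."""
    content = text.encode("utf-8")
    return UNIT_HEAD.pack(service_id, 0, encoding, audio_start,
                          offset, duration, len(content)) + content

unit = pack_data_unit(0x0101, 1, 36000, 5, 3, "Hello")
print(len(unit))  # 20-byte head + 5 content bytes = 25
```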
  • the multiplexer encapsulates the subframe into a multiframe, and converts the multiframe into a PMS stream and sends it to the mobile multimedia broadcast terminal.
  • FIG. 4 is a block diagram showing the components of the mobile multimedia broadcast caption synchronization system of the embodiment.
  • the system includes:
  • a receiving module configured to receive a subtitle data packet sent by the subtitle server and an audio sub-frame sent by the audio encoder
  • an obtaining module configured to acquire a subtitle from the subtitle data packet, and acquire the audio start playing time from the audio sub-frame;
  • a time offset determining module configured to determine a time offset of the subtitle relative to an audio start playing time
  • the time offset determining module may be configured to: obtain information about a relationship between a subtitle and an audio sub-frame from a data stream that includes the subtitle data packet; and determine an audio sub-subtitle corresponding to the subtitle according to the information of the subtitle and the audio sub-frame relationship a frame; setting a synchronization flag of the subtitle and the audio sub-frame according to a start time of the subtitle in a local time zone and a time stamp of the audio sub-frame corresponding to the subtitle; determining, according to the synchronization flag, a subtitle relative to an audio start playing time Time offset
  • the encapsulating module is configured to, when the encapsulation time arrives, encapsulate a character string of the subtitle, an audio start playing time, and a time offset of the subtitle relative to the audio start playing time to the data of the audio sub-frame corresponding to the subtitle Paragraph
  • the start time of the subtitle in the local time zone is M
  • the current time TOD provided by the local GPS is N
  • the transmission time difference from receiving the subtitle data packet to the transmission data segment is R
  • the encapsulation time is S; units of M, N, R and S In seconds;
  • a sending module configured to send the data segment to the mobile multimedia broadcast terminal.
  • the subtitle string obtained by the obtaining module from the subtitle data packet in the foregoing embodiment may be composed of a plurality of subtitle pieces
  • the time offset determining module is configured to determine a time offset of the subtitle piece relative to the audio start playing time according to the synchronization mark;
  • the time offset determining module may start a timer whose timing length is set equal to the synchronization mark; when the timing length is reached, the time offset of the subtitle piece relative to the audio start playing time is calculated.
  • a packaging module configured to, when the encapsulation time arrives, encapsulate the audio start playing time, the time offset of the subtitle piece relative to the audio start playing time, and the character string of the subtitle piece into the data segment of the audio sub-frame corresponding to the subtitle.
  • Each module/unit in the foregoing embodiments may be implemented in the form of hardware or in the form of software function modules. The invention is not limited to any specific combination of hardware and software.
  • In summary, the data segment of the encapsulated audio sub-frame includes the audio start playing time and the time offset of the subtitle relative to the audio start playing time, so that the mobile multimedia broadcast terminal receiving the audio sub-frame can determine the playing time of the subtitle from that time offset, achieving the technical effect of synchronizing subtitles with audio.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Security & Cryptography (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
  • Mobile Radio Communication Systems (AREA)
  • Telephone Function (AREA)
  • Television Systems (AREA)

Abstract

The present invention relates to a method and system for synchronizing mobile multimedia broadcast and subtitles. The method comprises: receiving a subtitle data packet sent by a subtitle server and an audio sub-frame sent by an audio encoder; obtaining the audio start playing time from the audio sub-frame; obtaining a subtitle from the subtitle data packet; determining the time offset of the subtitle relative to the audio start playing time; when the encapsulation time arrives, encapsulating the character string of the subtitle, the audio start playing time, and the time offset of the subtitle relative to the audio start playing time into a data segment of the audio sub-frame corresponding to the subtitle; and sending the data segment to a mobile multimedia broadcast terminal.
PCT/CN2012/077420 2012-04-10 2012-06-25 Method and system for synchronizing mobile multimedia broadcast and subtitles WO2013152562A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201210104688.9 2012-04-10
CN201210104688.9A CN102630017B (zh) 2012-04-10 2012-04-10 一种移动多媒体广播字幕同步的方法和系统

Publications (1)

Publication Number Publication Date
WO2013152562A1 true WO2013152562A1 (fr) 2013-10-17

Family

ID=46588162

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2012/077420 WO2013152562A1 (fr) 2012-04-10 2012-06-25 Method and system for synchronizing mobile multimedia broadcast and subtitles

Country Status (2)

Country Link
CN (1) CN102630017B (fr)
WO (1) WO2013152562A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113259776A (zh) * 2021-04-14 2021-08-13 北京达佳互联信息技术有限公司 字幕与音源的绑定方法及装置

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104142989B (zh) * 2014-07-28 2017-10-17 广州酷狗计算机科技有限公司 一种匹配检测方法及装置
CN104506957A (zh) * 2014-12-08 2015-04-08 广东欧珀移动通信有限公司 一种显示字幕的方法及装置
CN108206966B (zh) * 2016-12-16 2020-07-03 杭州海康威视数字技术股份有限公司 一种视频文件同步播放方法及装置
CN109413475A (zh) * 2017-05-09 2019-03-01 北京嘀嘀无限科技发展有限公司 一种视频中字幕的调整方法、装置和服务器
CN108174264B (zh) * 2018-01-09 2020-12-15 武汉斗鱼网络科技有限公司 歌词同步显示方法、系统、装置、介质及设备
CN113992638B (zh) * 2018-05-02 2023-07-14 腾讯科技(上海)有限公司 多媒体资源的同步播放方法、装置、存储位置及电子装置
CN108924664B (zh) * 2018-07-26 2021-06-08 海信视像科技股份有限公司 一种节目字幕的同步显示方法及终端
CN113766342B (zh) * 2021-08-10 2023-07-18 安徽听见科技有限公司 字幕合成方法及相关装置、电子设备、存储介质

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20060000172A (ko) * 2004-04-27 2006-01-06 우종식 위치정보를 이용한 자막, 정지영상, 동영상의 동기화생성/재생방법 및 그 장치
CN101378356A (zh) * 2008-06-10 2009-03-04 中兴通讯股份有限公司 一种ip实时流媒体的播放方法
CN102196319A (zh) * 2010-03-17 2011-09-21 中兴通讯股份有限公司 一种流媒体直播业务系统及实现方法

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20060000172A (ko) * 2004-04-27 2006-01-06 우종식 위치정보를 이용한 자막, 정지영상, 동영상의 동기화생성/재생방법 및 그 장치
CN101378356A (zh) * 2008-06-10 2009-03-04 中兴通讯股份有限公司 一种ip实时流媒体的播放方法
CN102196319A (zh) * 2010-03-17 2011-09-21 中兴通讯股份有限公司 一种流媒体直播业务系统及实现方法

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113259776A (zh) * 2021-04-14 2021-08-13 北京达佳互联信息技术有限公司 字幕与音源的绑定方法及装置
CN113259776B (zh) * 2021-04-14 2022-11-22 北京达佳互联信息技术有限公司 字幕与音源的绑定方法及装置

Also Published As

Publication number Publication date
CN102630017B (zh) 2014-03-19
CN102630017A (zh) 2012-08-08

Similar Documents

Publication Publication Date Title
WO2013152562A1 (fr) Method and system for synchronizing mobile multimedia broadcast and subtitles
JP7386463B2 (ja) 送信方法、受信方法、送信装置及び受信装置
JP6877603B2 (ja) 送信方法、受信方法、送信装置、及び受信装置
US9998773B2 (en) Transmission device, transmission method of transmission stream, and processing device
JP7068526B2 (ja) 送信方法、受信方法、送信装置、及び受信装置
JP2020065295A (ja) 送信方法、受信方法、送信装置、及び受信装置
CN105493509B (zh) 传输装置、传输方法、接收装置和接收方法
JP2019110554A (ja) 送信方法、受信方法、送信装置、及び受信装置
US10305617B2 (en) Transmission apparatus, transmission method, reception apparatus, and reception method
US11343559B2 (en) Method and apparatus for receiving, sending and data processing information related to time such as leap second and daylight saving time (DST)
KR102675843B1 (ko) 송신 장치, 수신 장치 및 데이터 처리 방법
JP2024051039A (ja) 送信装置及び送信方法
KR101620776B1 (ko) 인코딩 장치, 디코딩 장치 및 이와 관련된 프로그램
JP2018182677A (ja) 情報処理装置、情報処理方法、プログラム、および記録媒体製造方法
JP7512471B2 (ja) 送信方法、及び、送信装置
WO2015045362A1 (fr) Procédé d'émission, procédé de réception, appareil émetteur, et appareil récepteur
WO2013040996A1 (fr) Extrémité d'envoi, terminal, système et procédé pour multiplexage en codage hiérarchique
JP2008245061A (ja) Ipストリーム伝送におけるpcr再生方式
EP3280147A1 (fr) Procédé et appareil permettant d'émettre et de recevoir un signal de diffusion

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 12874235

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 12874235

Country of ref document: EP

Kind code of ref document: A1