EP1884115A4 - METHOD AND DEVICE FOR SYNCHRONIZING A DATA SERVICE WITH A VIDEO SERVICE FOR DIGITAL MULTIMEDIA BROADCASTING - Google Patents

METHOD AND DEVICE FOR SYNCHRONIZING A DATA SERVICE WITH A VIDEO SERVICE FOR DIGITAL MULTIMEDIA BROADCASTING

Info

Publication number
EP1884115A4
EP1884115A4
Authority
EP
European Patent Office
Prior art keywords
data
trigger
time stamp
time
recited
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP06768652A
Other languages
German (de)
English (en)
French (fr)
Other versions
EP1884115A1 (en)
Inventor
Gwang-Soon Lee
Kyu-Tae Yang
Bong-Ho Lee
Young-Kwon Hahm
Chung-Hyun Ahn
Soo-In Lee
Do-Hyung Kim
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Electronics and Telecommunications Research Institute ETRI
Alticast Corp
Original Assignee
Electronics and Telecommunications Research Institute ETRI
Alticast Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Electronics and Telecommunications Research Institute ETRI and Alticast Corp
Publication of EP1884115A1
Publication of EP1884115A4
Legal status: Withdrawn

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/242Synchronization processes, e.g. processing of PCR [Program Clock References]
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/24Systems for the transmission of television signals using pulse code modulation
    • H04N7/52Systems for transmission of a pulse code modulated video signal with one or more other pulse code modulated signals, e.g. an audio signal or a synchronizing signal
    • H04N7/54Systems for transmission of a pulse code modulated video signal with one or more other pulse code modulated signals, e.g. an audio signal or a synchronizing signal the signals being synchronous
    • H04N7/56Synchronising systems therefor
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04HBROADCAST COMMUNICATION
    • H04H20/00Arrangements for broadcast or for distribution combined with broadcast
    • H04H20/28Arrangements for simultaneous broadcast of plural pieces of information
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04HBROADCAST COMMUNICATION
    • H04H60/00Arrangements for broadcast applications with a direct linking to broadcast information or broadcast space-time; Broadcast-related systems
    • H04H60/02Arrangements for generating broadcast information; Arrangements for generating broadcast-related information with a direct linking to broadcast information or to broadcast space-time; Arrangements for simultaneous generation of broadcast information and broadcast-related information
    • H04H60/06Arrangements for scheduling broadcast services or broadcast-related services
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/236Assembling of a multiplex stream, e.g. transport stream, by combining a video stream with other content or additional data, e.g. inserting a URL [Uniform Resource Locator] into a video stream, multiplexing software data into a video stream; Remultiplexing of multiplex streams; Insertion of stuffing bits into the multiplex stream, e.g. to obtain a constant bit-rate; Assembling of a packetised elementary stream
    • H04N21/23614Multiplexing of additional data and video streams
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/434Disassembling of a multiplex stream, e.g. demultiplexing audio and video streams, extraction of additional data from a video stream; Remultiplexing of multiplex streams; Extraction or processing of SI; Disassembling of packetised elementary stream
    • H04N21/4348Demultiplexing of additional data and video streams
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems

Definitions

  • the present invention relates to digital multimedia broadcasting (DMB); and more particularly, to a method and apparatus for synchronizing data transmitted based on the Eureka-147 with audio and video (AV) data transmitted after being coded and multiplexed into Moving Picture Experts Group (MPEG)-4 or MPEG-2.
  • a synchronized data service model provides a means for an application program executed in a terrestrial Digital Multimedia Broadcasting (DMB) middleware to make a performance in synchronization with other media, such as a DMB video service.
  • a video service means an audio/video (AV) service provided based on "Digital Audio Broadcasting (DAB); DMB video service; User Application Specification" (ETSI TS 102 428).
  • Fig. 1 is an exemplary block diagram showing a conventional terrestrial DMB transmitting system.
  • a DMB AV encoder 110 encodes the audio and video signals based on a DMB video transmission and reception interface standard to thereby create AV stream.
  • This process includes an MPEG-4 AV encoding procedure and a multiplexing procedure into MPEG-2 transport stream (TS).
  • the data signal source 130 generates diverse data unrelated to the AV data.
  • the data encoder 140 encodes the diverse data generated in the data signal source 130 to create data packets.
  • the AV stream and the data packets outputted from the DMB AV encoder 110 and the data encoder 140 are multiplexed into frames of the Ensemble Transport Interface (ETI), i.e., ETI frames, in an ensemble multiplexer 150.
  • ETI frames go through Coded Orthogonal Frequency Division Multiplexing (COFDM) encoding in a DMB transmitter 160 to be outputted in the form of radio frequency (RF) signals.
  • the ETI frames are basically composed of fast information channel (FIC) data and main service channel (MSC) data.
  • the FIC data and the MSC data are generated in an FIC unit 151 and an MSC unit 152 of the ensemble multiplexer 150, respectively.
  • the FIC is an information channel for fast access to multiplexing information and service information in a Eureka-147 system
  • the MSC is a channel for multiplexing service components each corresponding to each service according to a multiplexing structure set up through the FIC.
  • Basic audio data, AV stream and diverse additional data are multiplexed and transmitted through the MSC.
  • absolute time is added to FIG type 0 extension 10 (FIG 0/10) in the form of Coordinated Universal Time (UTC), and this provides a reference time based on which the MSC data are decoded and presented.
  • the UTC absolute time information may be used as a reference for timing information when data are encoded.
  • MPEG-4 and/or MPEG-2 systems use a clock reference and time stamps to synchronize AV data transmitted over elementary stream (ES) and transmit timing information.
  • a receiving terminal uses a decoding time stamp (DTS) to define a decoding time point of each access unit in a decoding buffer, and uses a composition time stamp (CTS) to accurately define a composition time point of each composition unit (CU).
  • An object clock reference (OCR) is used to transmit a time mark of a given stream to an ES decoder.
  • An OCR value corresponds to an object time base (OTB) value at a time when a transmitting terminal generates an OCR time stamp.
  • OCR values are included in an SL packet header and transmitted.
  • an MPEG-2 system uses a program clock reference (PCR) and a presentation time stamp (PTS), and an MPEG-4 system uses an object clock reference (OCR), a composition time stamp (CTS), and a decoding time stamp (DTS).
  • the MPEG-4 system is synchronized with the MPEG-2 system by mapping MPEG-4 SL packets to MPEG-2 Packetized Elementary Stream (PES) packets at a 1:1 ratio.
  • a PES header includes the MPEG-2 PTS only when the MPEG-4 SL packet header includes an OCR; otherwise, the MPEG-2 PTS is not used.
  • the object time base (OTB), which defines the time stamps of an MPEG-4 data stream, is thereby locked to the system time clock of MPEG-2.
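The clock references and time stamps above all count ticks of the MPEG system clock. As a side note (not stated in the patent, but standard for MPEG-2/MPEG-4 systems), PTS/DTS/CTS values have a 90 kHz resolution, so converting between ticks and seconds is a simple scaling; a minimal sketch:

```python
# Illustrative only: the 90 kHz tick rate is the standard resolution of
# MPEG-2 PTS/DTS (and of CTS values carried the same way); the function
# names are not from the patent.
MPEG_TICKS_PER_SECOND = 90_000

def ticks_to_seconds(ticks):
    """Convert a PTS/DTS/CTS tick count to wall-clock seconds."""
    return ticks / MPEG_TICKS_PER_SECOND

def seconds_to_ticks(seconds):
    """Convert seconds back to the nearest whole tick."""
    return round(seconds * MPEG_TICKS_PER_SECOND)
```

The same 90 kHz assumption is used by the later sketches that mix CTS ticks with wall-clock times.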
  • the data encoder 140 of Fig. 1 transmits timing information with reference to the UTC absolute time information transmitted over an FIC channel when it encodes data based on the Multimedia Object Transfer (MOT) protocol, which is a data transmission specification based on the DMB system specification, i.e., the Eureka-147.
  • the data encoder 140 objectizes data on a file or directory basis and then packetizes the objects.
  • a time stamp, which is timing information on when one data object is to be decoded and presented, is added to the header of the data object in the form of UTC.
  • this method, however, has a problem in that it is hard to exactly synchronize the data with the DMB AV stream.

Disclosure

Technical Problem
  • it is an object of the present invention to provide a method and apparatus for synchronizing data with audio/video (AV) data in Digital Multimedia Broadcasting (DMB).
  • a method for providing data synchronized with audio/video (AV) data in digital multimedia broadcasting includes the steps of: a) receiving an AV time stamp for the AV data; b) calculating a time stamp of the data, which is information on the time point when the data are to be presented in a user terminal (hereinafter referred to as the data time stamp), based on the AV time stamp; c) generating sync metadata including the calculated data time stamp; and d) encoding the sync metadata and transmitting the encoded sync metadata.
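The transmit-side steps a) to d) can be sketched as follows. This is an illustrative sketch only: the function names, the dictionary fields, and the use of JSON as a stand-in encoding are assumptions, not the patent's actual packet format.

```python
import json

def calculate_data_timestamp(av_timestamp, offset_ticks):
    # Step b): derive the data time stamp (the point at which the data
    # are to be presented in the terminal) from the received AV time stamp.
    return av_timestamp + offset_ticks

def generate_sync_metadata(identifier, data_timestamp, payload):
    # Step c): bundle the calculated data time stamp into sync metadata.
    return {"id": identifier, "data_time_stamp": data_timestamp, "data": payload}

def encode_sync_metadata(metadata):
    # Step d): stand-in encoder; a real system would packetize this for
    # the DMB data channel rather than use JSON.
    return json.dumps(metadata).encode("utf-8")

av_ts = 900_000                                    # AV time stamp in 90 kHz ticks
meta = generate_sync_metadata("t1", calculate_data_timestamp(av_ts, 180_000), "quiz")
packet = encode_sync_metadata(meta)
```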
  • a method for providing data synchronized with AV data in DMB includes the steps of: a) separating received signals into AV stream and data packets; b) receiving system reference time information from an AV stream decoder for decoding the AV stream; c) acquiring a data time stamp from sync metadata included in the data packets; and d) comparing the data time stamp with the system reference time, and decoding and presenting a data object file at the time point when the data time stamp coincides with the system reference time.
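Likewise, a minimal sketch of the receive-side steps; the tuple-based demultiplexing and the function names are illustrative assumptions, not the DMB channel format.

```python
def split_msc(received):
    # Step a): separate the received signals into AV stream and data packets.
    # Here 'received' is a simple list of (kind, payload) tuples for illustration.
    av = [payload for kind, payload in received if kind == "av"]
    data = [payload for kind, payload in received if kind == "data"]
    return av, data

def due_for_presentation(data_time_stamp, system_reference_time):
    # Steps b)-d): the data object is decoded and presented once the data
    # time stamp coincides with (has been reached by) the system reference
    # time supplied by the AV stream decoder.
    return system_reference_time >= data_time_stamp
```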
  • the present invention can synchronize data transmitted based on the Eureka-147 with video based on Moving Picture Experts Group (MPEG)-4 and MPEG-2.
  • Fig. 1 is an exemplary block diagram showing a conventional terrestrial Digital Multimedia Broadcasting (DMB) transmitting system
  • Fig. 2 is a block diagram illustrating a terrestrial DMB transmitting system for synchronizing data with video data in real-time in accordance with an embodiment of the present invention
  • Fig. 3 is a view describing a method of calculating a time point when data are to be presented based on a composition time stamp (CTS) value;
  • Fig. 4 is a view showing a structure of a data carousel shown in Fig. 3
  • Fig. 5 is a view showing a structure of sync metadata shown in Fig. 3;
  • Fig. 6 is a block diagram illustrating a terrestrial DMB transmitting system for synchronizing data with video data in real-time in accordance with another embodiment of the present invention
  • Fig. 7 is a block diagram illustrating a terrestrial DMB transmitting system for synchronizing data with video data in real-time in accordance with yet another embodiment of the present invention
  • Fig. 8 is a block diagram illustrating a terrestrial DMB transmitting system for synchronizing data with video data in real-time in accordance with still another embodiment of the present invention.
  • Fig. 9 is a block diagram showing a receiving system capable of providing data synchronized with video data in accordance with an embodiment of the present invention.
  • Fig. 2 is a block diagram illustrating a terrestrial DMB transmitting system for synchronizing data with video data in real-time in accordance with an embodiment of the present invention.
  • the terrestrial DMB transmitting system includes an audio/video (AV) signal source 200 for generating AV contents, a Digital Multimedia Broadcasting (DMB) AV encoder 210 for encoding the generated AV contents into AV stream based on a terrestrial DMB standard, a data server 230 for providing diverse data services, an ensemble multiplexer 240 for multiplexing the generated AV stream and data packets into ensembles, a DMB transmitter 250 for performing Orthogonal Frequency Division Multiplexing (OFDM) encoding and radio frequency (RF) transmission, and a Network Time Protocol (NTP) server 220 for synchronizing the above constituent elements temporally.
  • the data server 230 is composed of a data signal source 231, a data encoder 235 for encoding the generated data based on diverse DMB data transmission and reception standards, and a data management and controlling unit 233 for managing and controlling the data signal source 231 and the data encoder 235.
  • the DMB AV encoder 210 encodes the AV data according to a terrestrial DMB video standard based on MPEG-4 and MPEG-2.
  • the DMB AV encoder 210 inserts an Object Clock Reference (OCR) and a Composition Time Stamp (CTS) at the MPEG-4 layer, and inserts a Program Clock Reference (PCR) and a Presentation Time Stamp (PTS) at the MPEG-2 layer.
  • the DMB AV encoder 210 supplies Composition Time Stamp (CTS) of an initial period of a program to the data server 230.
  • to provide additional data based on the Eureka-147, the data signal source 231 generates and stores diverse detailed data by collecting and authoring Java-based application data, texts related to the application data, images, moving pictures and the like.
  • the additional data are encoded in the data encoder 235 and transmitted under the control and management of the data management and controlling unit 233.
  • the NTP server 220 temporally synchronizes the AV signal source 200, the AV encoder 210, and the data server 230.
  • the data management and controlling unit 233 manages the time points when the data from the data signal source 231 are inserted.
  • the calculated synchronization information is transmitted to the data encoder 235 either directly or in the form of metadata for synchronization between the video and the data.
  • Fig. 3 describes a method of calculating a time point when data are to be presented based on a time stamp value, e.g., a composition time stamp (CTS) value.
  • time information (V(b)) for a scene to which data synchronized with the video data are to be added can be exactly extracted in advance.
  • a time stamp, such as CTS, indicating the restoration time of each scene at the user terminal is added to the header of an SL packet.
  • data files are broadcasted in the form of data carousels prior to the restoration time in the user terminal. Then, the user terminal downloads the data carousels and performs data restoration at the predetermined time point.
  • the data are a downloadable application program and/or data related thereto. The data may be stored in a non-volatile memory in advance.
  • the data management and controlling unit 233 generates sync metadata to present data synchronized with a particular scene, and transmits the generated sync metadata based on an appropriate data protocol.
  • Fig. 4 is a view showing a structure of a data carousel shown in Fig. 3
  • Fig. 5 is a view showing a structure of sync metadata shown in Fig. 3.
  • the sync metadata may be composed of an identifier, a trigger time (a video time stamp, e.g., CTS), a related data indicator, and data.
  • the identifier identifies data
  • the trigger time includes a data decoding time, a data restoration time, and a data expiration time in a user terminal, and it is calculated in advance based on the time stamp of the video (hereinafter referred to as a video time stamp), e.g., CTS.
  • the related data indicator indicates data which are synchronized with a particular scene and presented in connection with an application program executed in the user terminal.
  • the data include information that is needed instantly.
  • the sync metadata are added prior to a particular video restoration time (V(b)).
  • the trigger time (Ts(b)) that constitutes the sync metadata should be estimated prior to the video restoration time (V(b)).
  • the trigger time (Ts(b)) can be calculated based on the known video restoration time (V(b)) and the video time stamp information of an initial period of the program, which includes video time information and time stamp information and is inputted from the DMB AV encoder 210.
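That calculation can be sketched as follows, assuming (not stated explicitly here) a 90 kHz CTS resolution and a video clock that advances linearly between the initial period and V(b); the function name is illustrative.

```python
TICKS_PER_SECOND = 90_000  # assumed CTS resolution

def trigger_time(cts_initial, ntp_initial, ntp_target):
    # Estimate Ts(b) for a scene whose restoration time V(b) = ntp_target
    # is known, given the CTS and NTP wall-clock time of the program's
    # initial period (all devices share time via the NTP server).
    # Assumes the video clock advances linearly between the two points.
    return cts_initial + round((ntp_target - ntp_initial) * TICKS_PER_SECOND)
```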
  • since the video time stamp information of the initial period of the program is temporally synchronized among all devices by the NTP server, it can be easily extracted.
  • alternatively, the video time stamp information may be directly added to the header of a data object repeatedly transmitted by a data carousel, without using the sync metadata, to thereby synchronize the restoration time of the data object with the video.
  • the sync metadata will be described in detail with reference to examples.
  • a trigger time, which is synchronized with the video and indicates when a particular event is executed in an application program, should be generated, and the generated trigger time should be transmitted from the data server of the transmitting part to the terrestrial DMB middleware within a short time.
  • the transmitting part should transmit the trigger time together with the data to be executed, or an indicator of the data, in order to execute an event synchronized with the video.
  • a message composed of the trigger time and the data to be executed, or the indicator of the data, is referred to as a trigger packet.
  • the trigger packet is an example of the sync metadata.
  • the trigger packet should be scheduled and transmitted from the data server of the transmitting part prior to the predetermined synchronized trigger time so that the application program can execute the event. When the trigger time arrives, the trigger packet is transmitted repeatedly so that the application program performs the predetermined action indicated by the event at a time synchronized with the video.
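A hedged sketch of that scheduling policy; the lead time, repeat count, and interval are free parameters a broadcaster would choose, and the function name is illustrative, not from the patent.

```python
def trigger_send_times(trigger_time, lead_time, repeats, interval):
    # Schedule one transmission ahead of the synchronized trigger time
    # (so the terminal has the packet in hand), then repeat the packet
    # from the trigger time onward to raise the reception probability.
    sends = [trigger_time - lead_time]
    sends += [trigger_time + i * interval for i in range(repeats)]
    return sends
```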
  • the terrestrial DMB uses the Transparent Data Channel (TDC) packet mode, which has a small overhead and, since it does not use a data group, a short waiting time.
  • the TDC packet mode is based on ETSI TS 101 759, i.e., the "Digital Audio Broadcasting (DAB); DAB Data Broadcasting - Transparent Data Channel (TDC)" standard, to transmit the trigger packet.
  • the trigger packet may be transmitted by using other transmission protocols.
  • Table 1 shows the format of the trigger packet.
  • TriggerId is an identifier for identifying a trigger in the application program
  • TriggerTime indicates a time point when an event is generated.
  • a video time stamp such as CTS is used at around the event generation time point to provide a link service.
  • a video service providing device and the data server need to cooperate to use the CTS at the trigger time.
  • privateDataByte indicates the data that the application program needs in order to execute an event at the trigger time.
  • the privateDataByte may be composed of a related data indicator and data, which is shown in Fig.
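Since Table 1 itself did not survive in this text, the byte layout below is purely hypothetical; only the three field names (TriggerId, TriggerTime, privateDataByte) come from the description above.

```python
import struct

def pack_trigger(trigger_id, trigger_time, private_data):
    # Hypothetical wire layout: 16-bit TriggerId, 32-bit TriggerTime in
    # CTS ticks (big-endian), then the variable-length privateDataByte
    # field. The real Table 1 format may differ.
    return struct.pack(">HI", trigger_id, trigger_time) + private_data

def unpack_trigger(packet):
    # Inverse of pack_trigger: fixed 6-byte header, then the payload.
    trigger_id, trigger_time = struct.unpack(">HI", packet[:6])
    return trigger_id, trigger_time, packet[6:]
```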
API Model

  • a terrestrial DMB middleware defines a trigger interface in a dmb.io package, which is defined for data reception through a DMB data channel, to provide a synchronized data service.
  • the API model extends javax.microedition.io.Datagram (CLDC 1.1 (JSR 139), at http://java.sun.com/products/cldc/index.jsp) and is a datagram including event information. A Trigger is used to transmit sync signals to other media.
  • a trigger is linked with its ID and with time information indicating when the event indicated by the trigger is to be executed. Although a plurality of triggers may be received over a broadcasting network, triggers having the same ID are all treated as the same trigger.
  • a trigger may be transmitted several times to increase the reception probability or to confirm the time indicated by the trigger, because the time may be changed due to discontinuity in system time clocks.
  • when the time indicated by a trigger has already passed, the trigger is ignored.
  • the ID of a trigger whose time has passed and whose processing is completed may be reused by another trigger.
  • when a trigger indicates a specific future execution time, doItNow() is false.
  • a trigger having a false doItNow() is transmitted to the application program only once, even if a trigger of the same ID is transmitted several times. However, when the trigger time is changed in the middle, triggers of the same ID may be transmitted to the application program several times.
  • when a trigger whose doItNow() is true is transmitted to the application program, the application program executes the operation indicated by the trigger instantly. After the execution, even if a trigger of the same ID is transmitted, it is treated as a trigger different from the one whose doItNow() was true.
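The receiver-side trigger rules above (same-ID duplicates delivered once, past triggers ignored, doItNow() triggers executed instantly) can be sketched as a small dispatcher; the class and method names are illustrative, not the dmb.io API.

```python
class TriggerDispatcher:
    # Sketch of the middleware-side rules described above; a real
    # implementation would also handle trigger-time changes and ID reuse.
    def __init__(self):
        self.delivered_ids = set()

    def handle(self, trigger_id, trigger_time, do_it_now, now):
        if do_it_now:
            return "execute"      # doItNow() is true: run the operation instantly
        if trigger_time < now:
            return "ignore"       # the indicated time is already past
        if trigger_id in self.delivered_ids:
            return "duplicate"    # same ID retransmitted: deliver only once
        self.delivered_ids.add(trigger_id)
        return "deliver"
```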
  • PrivateData transmitted in the form of trigger packets is read by using a method of Datagram, which is a superclass of Trigger.
  • Table 2 defines API for a synchronized service.
  • Table 2: public interface dmb.io.Trigger
  • Fig. 6 is a block diagram illustrating a terrestrial DMB transmitting system for synchronizing data with video data in real-time in accordance with another embodiment of the present invention.
  • the drawing shows an example of a system in which a data server 430 uses a time stamp extracting unit 437 when a DMB AV encoder 410 cannot directly output the time information of a video source and the corresponding time stamp information to the data server 430.
  • the time stamp extracting unit 437 extracts a video time stamp, e.g., the CTS of the video, from the AV stream outputted from the DMB AV encoder 410, and the extracted time stamp is inputted to the data management and controlling unit 433.
  • sync data may be serviced more easily when the AV data are encoded in advance and stored in the form of a stream, as shown in Figs. 7 and 8.
  • in this case, the time of a video to which data are to be added and the time stamp thereof can be acquired in advance.
  • Fig. 7 is a block diagram illustrating a terrestrial DMB transmitting system for synchronizing video with data in real-time in accordance with yet another embodiment of the present invention.
  • the AV stream is stored as an MPEG-2 file or a Forward Error Correction (FEC)-added file thereof.
  • a DMB transmitting system includes an AV signal source 500, an AV encoder 510 for encoding AV signals based on a terrestrial DMB standard, a storage 560 for storing the AV signals encoded in the form of a stream, a data server 530 for generating sync metadata to provide data synchronized with the AV data by using an AV time stamp supplied from the storage 560, and multiplexers 520, 540 and 550 for multiplexing the output signals of the AV encoder 510 and the output signals of the data server 530.
  • video time information of a video to which data are to be added and data time stamp information for data restoration are inputted to the data server 530 in advance.
  • the data management and controlling unit 533 determines when to add the data and generates sync metadata based on the time to add the data.
  • stream switching occurs in a switcher.
  • the switcher 520 performs re-stamping, which is a process for guaranteeing the continuity of time stamps.
  • the time stamp pre-added to a stream file is re-established in the switcher 520 so as to be continuous with the time stamp of the AV stream outputted from the DMB AV encoder 510.
  • that is, a predetermined value is added to the time stamp.
  • the data server 530 should receive the time stamp of the AV stream re-established in the switcher 520, together with information on the time point when the switching occurred, in order to reflect the re-establishment.
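Re-stamping amounts to adding one constant offset to every time stamp of the stored file; a minimal sketch, where the frame-gap parameter is my assumption about how the first re-stamped value is chosen to follow the live stream.

```python
def restamp(file_timestamps, last_live_ts, frame_gap):
    # Re-stamping in the switcher: shift the time stamps pre-added to the
    # stored stream file by one constant offset so they continue
    # seamlessly from the last time stamp of the live AV stream.
    offset = last_live_ts + frame_gap - file_timestamps[0]
    return [ts + offset for ts in file_timestamps]
```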
  • Fig. 8 is a block diagram illustrating a terrestrial DMB transmitting system for synchronizing video with data in real-time in accordance with still another embodiment of the present invention.
  • AV stream is encoded in the form of an MP4 file and directly added to the DMB AV encoder 610 in the transmitting system of Fig. 8.
  • when AV stream is encoded into an MPEG-4 format and an MP4 file, which is one of the storage file formats, is added to DMB, it should be packetized into MPEG-4 SL and an MPEG-2 transport stream (TS) in an M4onM2 processing module 620, as shown in Fig. 8.
  • This process may be carried out inside the DMB AV encoder 610 but it also may be performed in an additional device.
  • when an MP4 file is packetized into MPEG-4 SL and MPEG-2 TS, the relative time information inside the MP4 file is transformed into OCR or CTS values.
  • the data server 630 already includes time information of video to which data are to be added and a data time stamp for data restoration.
  • FIG. 9 is a block diagram showing a receiving system capable of providing a service where data are synchronized with video in accordance with an embodiment of the present invention.
  • the receiving system capable of providing video synchronized with data includes an RF receiving channel decoder 710, an MSC processor 730, a DMB AV decoder 740, a DMB data decoder 760, and a data presenting apparatus 770.
  • the RF receiving channel decoder 710 receives RF signals, demodulates the RF signals into baseband signals, performs channel decoding, and separates FIC data from MSC data.
  • an FIC analyzer 720 analyzes the FIC data including multiplexing information and service information and provides an analysis result to the MSC processor 730.
  • the MSC processor 730 separates data transmitted through an MSC channel into data packets and AV stream.
  • the AV stream and the data packet are inputted to the DMB AV decoder 740 and the DMB data decoder 760 to be decoded, respectively.
  • the DMB AV presenting apparatus 750 and the data presenting apparatus 770 present the AV stream and the data at the same restoration time, respectively.
  • the DMB data decoder 760 receives system reference time information, such as OCR, from the DMB AV decoder 740 and compares it with the CTS-based data time stamp information added to the header of the sync metadata or the header of a data object. The time point at which the two coincide becomes the restoration time of the data object file.
  • the data presenting apparatus 770 executes an application program directed by the sync metadata, and the related data indicated by the sync metadata and the instant data added to the sync metadata are presented at the extracted restoration time so as to be synchronized with the AV data.
  • the method of the present invention described above can be realized as a program and stored in a computer-readable recording medium such as a CD-ROM, RAM, ROM, floppy disk, hard disk, magneto-optical disk and the like. Since the process can be easily implemented by those skilled in the art to which the present invention pertains, a detailed description thereof will not be provided herein.
EP06768652A 2005-05-26 2006-05-26 METHOD AND DEVICE FOR SYNCHRONIZING A DATA SERVICE WITH A VIDEO SERVICE FOR DIGITAL MULTIMEDIA BROADCASTING Withdrawn EP1884115A4 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
KR20050044579 2005-05-26
KR20050080642 2005-08-31
PCT/KR2006/002011 WO2006126852A1 (en) 2005-05-26 2006-05-26 Method and apparatus for synchronizing data service with video service in digital multimedia broadcasting

Publications (2)

Publication Number Publication Date
EP1884115A1 EP1884115A1 (en) 2008-02-06
EP1884115A4 true EP1884115A4 (en) 2008-08-06

Family

ID=37452227

Family Applications (1)

Application Number Title Priority Date Filing Date
EP06768652A Withdrawn EP1884115A4 (en) 2005-05-26 2006-05-26 METHOD AND DEVICE FOR SYNCHRONIZING A DATA SERVICE WITH A VIDEO SERVICE FOR DIGITAL MULTIMEDIA RADIATION

Country Status (3)

Country Link
EP (1) EP1884115A4 (ko)
KR (1) KR100837720B1 (ko)
WO (1) WO2006126852A1 (ko)

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101226178B1 (ko) * 2007-03-27 2013-01-24 Samsung Electronics Co., Ltd. Method and apparatus for displaying video data
KR101382613B1 (ko) * 2007-06-20 2014-04-07 Electronics and Telecommunications Research Institute Method for frame multiplexing of digital broadcasting signal, frame multiplexing apparatus and transmitting apparatus
US7936790B2 (en) 2007-08-30 2011-05-03 Silicon Image, Inc. Synchronizing related data streams in interconnection networks
US8819749B2 (en) 2008-06-11 2014-08-26 Koninklijke Philips B.V. Synchronization of media stream components
KR101052480B1 (ko) * 2008-08-27 2011-07-29 Electronics and Telecommunications Research Institute Apparatus and method for transmitting and receiving broadcast signals
KR101349227B1 (ko) 2010-03-29 2014-02-11 Electronics and Telecommunications Research Institute Apparatus and method for providing object information in a multimedia system
KR20140008478A (ko) * 2010-07-19 2014-01-21 LG Electronics Inc. Method for transmitting and receiving media files and transmitting/receiving apparatus using the same
KR101236813B1 (ko) * 2011-01-13 2013-02-28 Alticast Corp. Open voice service system and service method in digital broadcasting
JP5948773B2 (ja) 2011-09-22 2016-07-06 Sony Corp. Receiving device, receiving method, program, and information processing system
US9118425B2 (en) * 2012-05-31 2015-08-25 Magnum Semiconductor, Inc. Transport stream multiplexers and methods for providing packets on a transport stream
WO2019164361A1 (ko) * 2018-02-23 2019-08-29 Starship Vending Machine Corp. Streaming device and streaming method
CN111611252B (zh) * 2020-04-01 2023-07-18 Petro-CyberWorks Information Technology Co., Ltd. Monitoring of security data during data synchronization, apparatus, device and storage medium

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0680216A2 (en) * 1994-04-28 1995-11-02 Thomson Consumer Electronics, Inc. Apparatus and method for formulating an interactive signal
WO2001026369A1 (en) * 1999-10-05 2001-04-12 Webtv Networks, Inc. Trigger having a time attribute
EP1343323A2 (en) * 2002-03-07 2003-09-10 Chello Broadband NV Display of enhanced content

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000244433A (ja) * 1999-02-17 2000-09-08 Sony Corp. Data multiplexing device and data multiplexing method
KR100392384B1 (ko) * 2001-01-13 2003-07-22 Electronics and Telecommunications Research Institute Apparatus and method for transmitting MPEG-4 data synchronized with MPEG-2 data
JP2002238048A (ja) * 2001-02-07 2002-08-23 Nec Corp. MPEG-2 transport stream transmission rate conversion method
EP1444839A2 (en) * 2001-07-27 2004-08-11 Matsushita Electric Industrial Co., Ltd. Digital broadcast system, sync information replacing apparatus and method
KR100438518B1 (ko) * 2001-12-27 2004-07-03 Electronics and Telecommunications Research Institute Apparatus and method for activating a specific region of MPEG-2 video using MPEG-4 scene descriptors
KR100406122B1 (ko) * 2002-03-29 2003-11-14 Electronics and Telecommunications Research Institute Apparatus and method for inserting synchronization data for digital data broadcasting
KR100646851B1 (ko) * 2004-11-03 2006-11-23 Electronics and Telecommunications Research Institute Terrestrial digital multimedia broadcasting transmission/reception system for synchronizing audio/video services and data services

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0680216A2 (en) * 1994-04-28 1995-11-02 Thomson Consumer Electronics, Inc. Apparatus and method for formulating an interactive signal
WO2001026369A1 (en) * 1999-10-05 2001-04-12 Webtv Networks, Inc. Trigger having a time attribute
EP1343323A2 (en) * 2002-03-07 2003-09-10 Chello Broadband NV Display of enhanced content

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
"Digital Audio Broadcasting (DAB); DMB video service; User Application Specification European Broadcasting Union Union Européenne de Radio-Télévision EBU·UER; ETSI TS 102 428", ETSI STANDARDS, LIS, vol. BC, no. V1.1.1, 1 January 2005 (2005-01-01), XP014030465, ISSN: 0000-0001 *
"Digital Audio Broadcasting (DAB); MOT Slide Show; User Application Specification; ETSI TS 101 499", ETSI STANDARDS, LIS, vol. BC, no. V1.1.1, 1 July 2001 (2001-07-01), XP014006407, ISSN: 0000-0001 *
"DRAFT VERSION 1.1R26 UPDATED 02/02/99; STATE OF THIS DOCUMENT", ADVANCED TELEVISION ENHANCEMENT FORUM SPECIFICATION (ATVEF), XX, XX, 1 February 1998 (1998-02-01), pages 1 - 37, XP002935044 *
See also references of WO2006126852A1 *

Also Published As

Publication number Publication date
WO2006126852A1 (en) 2006-11-30
KR100837720B1 (ko) 2008-06-13
EP1884115A1 (en) 2008-02-06
KR20060122784A (ko) 2006-11-30

Similar Documents

Publication Publication Date Title
WO2006126852A1 (en) Method and apparatus for synchronizing data service with video service in digital multimedia broadcasting
US10129609B2 (en) Method for transceiving media files and device for transmitting/receiving using same
US10820065B2 (en) Service signaling recovery for multimedia content using embedded watermarks
US7188353B1 (en) System for presenting synchronized HTML documents in digital television receivers
KR20030078354A (ko) Apparatus and method for inserting synchronization data for digital data broadcasting
CN102752669A (zh) Transmission processing method and system for multi-channel real-time streaming media files, and receiving device
CN108111872B (zh) Audio live-broadcast system
US10797811B2 (en) Transmitting device and transmitting method, and receiving device and receiving method
US9426506B2 (en) Apparatuses for providing and receiving augmented broadcasting service in hybrid broadcasting environment
CN108174264B (zh) Lyrics synchronized display method, system, apparatus, medium and device
CN101218819A (zh) Method and apparatus for synchronizing data service and video service in digital multimedia broadcasting
EP2814256B1 (en) Method and apparatus for modifying a stream of digital content
JP2024040224A (ja) Transmission method and transmitting device, and reception method and receiving device
CN105812961B (zh) Adaptive streaming media processing method and device
US20100205317A1 (en) Transmission, reception and synchronisation of two data streams
EP1487214A1 (en) A method and a system for synchronizing MHP applications in a data packet stream
JP6935843B2 (ja) Transmitting device and transmitting method, and receiving device and receiving method
JP6791344B2 (ja) Transmitting device and transmitting method, and receiving device and receiving method

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20071126

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LI LT LU LV MC NL PL PT RO SE SI SK TR

A4 Supplementary search report drawn up and despatched

Effective date: 20080704

DAX Request for extension of the european patent (deleted)

17Q First examination report despatched

Effective date: 20090113

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20141002