WO2013136754A1 - Display device and transmission device - Google Patents
Display device and transmission device
- Publication number
- WO2013136754A1 (PCT/JP2013/001538)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- stream
- unit
- video
- display
- timing
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/04—Synchronising
- H04N5/06—Generation of synchronising signals
- H04N5/067—Arrangements or circuits at the transmitter end
- H04N5/073—Arrangements or circuits at the transmitter end for mutually locking plural sources of synchronising signals, e.g. studios or relay stations
- H04N5/0736—Arrangements or circuits at the transmitter end for mutually locking plural sources of synchronising signals, e.g. studios or relay stations using digital storage buffer techniques
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/4302—Content synchronisation processes, e.g. decoder synchronisation
- H04N21/4307—Synchronising the rendering of multiple content streams or additional data on devices, e.g. synchronisation of audio on a mobile phone with the video output on the TV screen
- H04N21/43072—Synchronising the rendering of multiple content streams or additional data on devices, e.g. synchronisation of audio on a mobile phone with the video output on the TV screen of multiple content streams on the same device
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/433—Content storage operation, e.g. storage operation in response to a pause request, caching operations
- H04N21/4331—Caching operations, e.g. of an advertisement for later insertion during playback
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/45—Management operations performed by the client for facilitating the reception of or the interaction with the content or administrating data related to the end-user or to the client device itself, e.g. learning user preferences for recommending movies, resolving scheduling conflicts
- H04N21/462—Content or additional data management, e.g. creating a master electronic program guide from data received from the Internet and a Head-end, controlling the complexity of a video stream by scaling the resolution or bit-rate based on the client capabilities
- H04N21/4622—Retrieving content or additional data from different sources, e.g. from a broadcast channel and the Internet
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/60—Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client
- H04N21/61—Network physical structure; Signal processing
- H04N21/6106—Network physical structure; Signal processing specially adapted to the downstream path of the transmission network
- H04N21/6125—Network physical structure; Signal processing specially adapted to the downstream path of the transmission network involving transmission via Internet
Definitions
- the present invention relates to a technology for reproducing broadcast content and content accompanying it in synchronization.
- There is known a display device that receives video transmitted through broadcast waves (hereinafter "broadcast video") and video transmitted through a communication line (hereinafter "communication video"), and displays them in synchronization with each other.
- A broadcast station that transmits broadcast video adds display timing information (for example, a PTS (Presentation Time Stamp)) indicating the timing at which each of the video frames constituting the broadcast video is to be displayed, and transmits the broadcast video. A communication service providing station that transmits communication video likewise adds display timing information to each of the video frames constituting the communication video, so that the communication video is displayed in synchronization with the broadcast video at the timing intended by the video producer, and transmits the communication video. The display device then displays each of the video frames constituting the received broadcast video and each of the video frames constituting the received communication video at the timing indicated by the assigned display timing information.
- In this way, the display device displays the broadcast video and the communication video in synchronization at the timing intended by the video producer.
- However, when the broadcast time of a program is changed while the display device has already received and stored the communication video corresponding to the program transmitted from the communication service providing station, even if each of the video frames constituting the broadcast video of the program and each of the video frames constituting the communication video are displayed at the timing indicated by the given display timing information, these videos are not displayed in synchronization at the timing intended by the video producer.
- The present invention has been made in view of such problems, and an object of the present invention is to provide a display device that, even when the broadcast time of a program is changed after the communication video corresponding to the program has already been received and stored, can display the broadcast video and the communication video in synchronization at the timing intended by the video producer.
- In order to solve the above problem, a display device according to the present invention displays a separately acquired and stored image in synchronization with a received stream including a plurality of video frames. The display device comprises: a receiving unit that receives a stream including a plurality of video frames and first display timing information that determines display timing for each of the video frames; a display unit that displays the plurality of video frames included in the stream received by the receiving unit; a storage unit that stores an image and second display timing information that determines display timing for the image; and an acquisition unit that acquires correction information for determining a correction amount for correcting the display timing determined by the second display timing information stored in the storage unit, so that the image stored in the storage unit is displayed in synchronization with a video frame displayed on the display unit. The display unit further displays the image stored in the storage unit at a corrected timing obtained by correcting the display timing determined by the second display timing information by the correction amount determined by the correction information acquired by the acquisition unit.
- With the above configuration, when the broadcast time of a program is changed, the display device according to the present invention can display the communication video in synchronization with the broadcast video at the timing intended by the video producer, even when the communication video corresponding to the program has already been received and stored.
- FIG. 1: Conceptual diagram of the broadcasting system 100
- FIG. 2: Schematic diagram showing schematic configurations of the broadcasting station 120 and the communication service providing station 130
- FIG. 3: Schematic diagram showing an example of video handled by the broadcasting station 120 and the communication service providing station 130
- FIG. 4A: External view of the display device 110
- FIG. 4B: Schematic diagram showing a scene in which broadcast video is displayed on the screen
- FIG. 4C: Schematic diagram showing a scene in which broadcast video and communication video are superimposed and displayed on the screen in synchronization
- In the present embodiment, a display device will be described that receives a broadcast video transmitted from a broadcast station via a broadcast wave, receives a communication video transmitted from a communication service providing station via an Internet line, and displays the received broadcast video and the received communication video in synchronization at the timing intended by the video producer.
- This display device receives, from the communication service providing station, a communication stream including the communication video and a PTS indicating the display timing of each video frame constituting the communication video (hereinafter referred to as "communication video PTS").
- From the broadcast station, the display device receives a broadcast stream including the broadcast video, a PTS indicating the display timing of each video frame constituting the broadcast video (hereinafter referred to as "broadcast video PTS"), and an STC_offset (System Time Counter offset) indicating a correction amount for correcting the display timing indicated by the communication video PTS.
- Then, each of the video frames constituting the received broadcast video is displayed at the display timing indicated by the broadcast video PTS, and each of the video frames constituting the received communication video is displayed at a display timing obtained by correcting the display timing indicated by the communication video PTS by the correction amount indicated by STC_offset.
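The correction described above can be sketched as follows. This is an illustrative sketch, not taken from the patent text: the function name and the sign convention (adding STC_offset to the communication video PTS) are assumptions, and PTS values are assumed to be in 90 kHz clock ticks as in MPEG-2 systems.

```python
# Sketch (illustrative): how a receiver might correct the display timing of
# communication-video frames using STC_offset. PTS values are in 90 kHz ticks.

def corrected_display_time(communication_pts: int, stc_offset: int) -> int:
    """Return the corrected display timing for a communication-video frame.

    The broadcast-video frames are shown at their own PTS unchanged;
    only the stored communication video is shifted by STC_offset.
    """
    return communication_pts + stc_offset

# Example: the program was delayed by 2 seconds (2 s * 90000 ticks/s).
stc_offset = 2 * 90_000
print(corrected_display_time(450_000, stc_offset))  # 630000
```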
- As a result, even when the display timing of the video frames constituting the broadcast video indicated by the broadcast video PTS deviates for some reason from the display timing intended by the video producer, the received broadcast video and the received communication video can be displayed in synchronization at the timing intended by the video producer.
- FIG. 1 is a conceptual diagram of a broadcasting system 100 including a broadcasting station 120, a communication service providing station 130, and a display device 110.
- The broadcast station 120 generates a broadcast stream in MPEG-2 TS (Motion Picture Experts Group 2 Transport Stream) format including the broadcast video, the broadcast video PTS, and an STC_offset indicating a correction amount for correcting the display timing indicated by the communication video PTS, and broadcasts the generated broadcast stream from the broadcast antenna 121.
- STC_offset is stored in PSI / SI (Program Specific Information / Service Information) such as PMT (Program Map Table) and EIT (Event Information Table).
- The communication service providing station 130 generates a communication stream in MPEG-2 TS format including the communication video to be displayed in synchronization with the broadcast video and the communication video PTS, and transmits the generated communication stream to the Internet communication network 140.
- The display device 110 receives the broadcast stream transmitted from the broadcast station 120 via the reception antenna 111 and restores the broadcast video, and receives the communication stream transmitted from the communication service providing station 130 via the Internet communication network 140 and restores the communication video. Then, each video frame constituting the broadcast video is displayed at the display timing indicated by the broadcast video PTS, and each video frame constituting the communication video is displayed at a display timing obtained by correcting the display timing indicated by the communication video PTS by the correction amount indicated by STC_offset.
- FIG. 2 is a schematic diagram showing a schematic configuration of the broadcasting station 120 and the communication service providing station 130.
- the broadcast station 120 includes a broadcast video photographing unit 210, a broadcast video editing unit 211, a broadcast stream generation unit 212, a PTS timing determination unit 213, a broadcast stream storage unit 214, and an output unit 215.
- The output unit 215 further includes an STC offset adding unit 216.
- the communication service providing station 130 includes a communication video photographing unit 220, a communication video editing unit 221, a communication stream generation unit 222, a communication stream storage unit 224, and an output unit 225.
- Broadcast video photographing means 210 includes a photographing device such as a video camera and has a function of photographing video and audio.
- the broadcast video photographing means 210 is not necessarily fixed inside the broadcast station 120, and can capture video and audio outside the broadcast station 120 as necessary.
- the broadcast video editing unit 211 includes a computer system including a processor, a memory, and the like, and has a function of editing video and audio shot by the broadcast video shooting unit 210.
- the communication video photographing means 220 includes a photographing device such as a video camera and has a function of photographing video and audio.
- Similarly, the communication video photographing means 220 is not necessarily fixed inside the communication service providing station 130, and can capture video and audio outside the communication service providing station 130 as necessary.
- the communication video editing unit 221 includes a computer system including a processor, a memory, and the like, and has a function of editing video and audio shot by the communication video shooting unit 220.
- FIG. 3 is a schematic diagram showing an example of a video handled by the broadcasting station 120 and the communication service providing station 130.
- In the present embodiment, a case will be described as an example in which the broadcast video photographing means 210 captures a video with a relatively wide field of view, the communication video photographing means 220 captures a video from a different viewpoint from that of the broadcast video photographing means 210 (for example, a zoomed-up video of a specific player), and these videos are edited on the assumption that a video frame shot by the broadcast video photographing means 210 and a video frame shot by the communication video photographing means 220 at the same time are displayed at the same time.
- In the following, the description of audio is omitted and the description mainly concerns video, so that the explanation does not become more complicated than necessary.
- the broadcast video photographing means 210 shoots a photographed video 301 that is a video with a relatively wide field of view.
- the broadcast video editing unit 211 cuts out a broadcast scene from the shot video 301 shot by the broadcast video shooting unit 210 and superimposes graphics such as score information 303 on the cut out video.
- the communication video photographing means 220 shoots a video from a different viewpoint from that taken by the broadcast video photographing means 210.
- The communication video editing unit 221 cuts out, from the shot video 311 shot by the communication video photographing means 220, a scene in the same time zone as the scene cut out by the broadcast video editing unit 211, and superimposes graphics such as player information 313 on the cut-out video.
- the PTS timing determination unit 213 includes a computer system including a processor, a memory, and the like, and has a function of determining a value of a PTS to be assigned to each of video frames constituting a video edited by the broadcast video editing unit 211.
- The broadcast stream generation unit 212 includes a computer system including a processor, a memory, and the like, and has a function of generating, from the video and audio edited by the broadcast video editing unit 211 and the PTS values determined by the PTS timing determination unit 213, a broadcast stream in MPEG-2 TS format in which a video stream, an audio stream, a subtitle stream, and system packets are multiplexed.
- Here, the broadcast stream generation unit 212 generates the video stream by encoding the video with a video codec such as MPEG-2 or MPEG-4 AVC, and generates the audio stream by encoding the audio with an audio codec such as AC3 (Audio Code number 3) or AAC (Advanced Audio Coding).
- the broadcast stream storage unit 214 includes a storage device such as a hard disk drive, and has a function of storing the broadcast stream generated by the broadcast stream generation unit 212.
- The communication stream generation unit 222 includes a computer system including a processor, a memory, and the like, and has a function of generating, from the video edited by the communication video editing unit 221 and the PTS determined by the PTS timing determination unit 213, a communication stream in MPEG-2 TS format in which a video stream and an audio stream are multiplexed.
- Here, the communication stream generation unit 222 generates the communication stream so that each of the video frames constituting the video edited by the communication video editing unit 221 is given a PTS having the same value as the PTS given to the video frame captured at the same time among the video frames constituting the video edited by the broadcast video editing unit 211.
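The PTS assignment rule above can be sketched as follows. The data structures are hypothetical illustrations; the patent only states that frames captured at the same instant receive the same PTS value.

```python
# Illustrative sketch: giving each communication-video frame the same PTS as
# the broadcast-video frame captured at the same instant.

def assign_communication_pts(broadcast_frames, communication_frames):
    """broadcast_frames: list of (capture_time, pts) for the broadcast video.
    communication_frames: list of capture_time values for the communication video.
    Returns the PTS to assign to each communication-video frame."""
    pts_by_time = {t: pts for t, pts in broadcast_frames}
    return [pts_by_time[t] for t in communication_frames]

# 30 fps video on a 90 kHz clock: consecutive frames are 3000 ticks apart.
broadcast = [(0, 0), (1, 3000), (2, 6000)]   # (frame index as time key, PTS)
comm = [0, 1, 2]
print(assign_communication_pts(broadcast, comm))  # [0, 3000, 6000]
```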
- The communication stream generation unit 222 generates the video stream by encoding the video with a video codec such as MPEG-2 or MPEG-4 AVC, and generates the audio stream by encoding the audio with an audio codec such as AC3 or AAC.
- the communication stream storage unit 224 includes a storage device such as a hard disk drive, and has a function of storing the communication stream generated by the communication stream generation unit 222.
- The STC offset adding unit 216 includes a computer system including a processor, a memory, and the like, and has the following STC offset adding function.
- STC offset adding function: (1) when the broadcast time of the program corresponding to the video stream stored in the broadcast stream storage unit 214 is changed, a function of calculating the difference between the value of the broadcast video PTS for each video frame before the broadcast time change and the value of the broadcast video PTS for each video frame after the broadcast time change, setting the calculated difference amount as STC_offset, and storing it in the PSI/SI of the new broadcast stream; and (2) when the broadcast time is not changed, a function of storing an STC_offset whose difference value is "0" in the PSI/SI.
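The STC offset adding function above can be sketched as follows. The function shape and the 90 kHz tick units are illustrative assumptions; the patent states only that STC_offset is the PTS difference before and after the broadcast-time change, or 0 when the time is unchanged.

```python
# Sketch of the STC offset adding function: STC_offset is the difference
# between a frame's broadcast video PTS after the broadcast-time change and
# its PTS before the change; when the time is unchanged, STC_offset is 0.

def compute_stc_offset(pts_before: int, pts_after: int, time_changed: bool) -> int:
    if not time_changed:
        return 0
    return pts_after - pts_before

# A program postponed by 30 minutes: 30 min * 60 s * 90000 ticks/s.
print(compute_stc_offset(0, 30 * 60 * 90_000, True))   # 162000000
print(compute_stc_offset(0, 30 * 60 * 90_000, False))  # 0
```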
- The output unit 215 includes a computer system including a processor, a memory, an output amplifier, and the like, and has a function of modulating the broadcast stream in which STC_offset has been stored in the PSI/SI by the STC offset adding unit 216 into a broadcast signal of a predetermined format in a predetermined frequency band, and transmitting the modulated broadcast signal from the broadcast antenna 121.
- The output unit 225 includes a computer system including a processor, a memory, and the like, and outputs the communication stream stored in the communication stream storage unit 224 to the display device 110 via the Internet communication network 140.
- ATC_delay: Arrival Time Counter delay
- FIG. 4A is an external view of the display device 110, and FIG. 4B is a schematic diagram showing a scene in which the display device 110 restores the broadcast video from the received broadcast stream and displays the restored broadcast video on the screen.
- FIG. 4C is a schematic diagram showing a scene in which the display device 110 restores the broadcast video and the communication video from the received broadcast stream and communication stream, and displays the restored communication video superimposed on the restored broadcast video.
- a remote controller 410 is attached to the display device 110.
- a user who uses the display device 110 operates the display device 110 using the remote controller 410.
- In the present embodiment, when displaying the broadcast video and the communication video in synchronization, the display device 110 superimposes and displays the communication video on a part of the broadcast video.
- FIG. 5 is a configuration diagram showing the configuration of the display device 110.
- The display device 110 includes a tuner 501, a communication interface 502, an external device interface 503, a remote control interface 504, a broadcast stream decoding unit 510, a communication stream decoding unit 520, an application execution control unit 530, a synchronization start packet determining unit 540, an input start control unit 541, a buffer 542, an ATC delay unit 543, a subtitle plane 551, a first video plane 552, a background plane 553, a graphics plane 554, a second video plane 555, a plane composition processing unit 560, a speech synthesis processing unit 570, a display 580, and a speaker 590.
- The broadcast stream decoding unit 510 includes a first demultiplexing unit 511, a first audio decoder 512, a subtitle decoder 513, a first video decoder 514, and a system packet manager 515, and the communication stream decoding unit 520 includes a second demultiplexing unit 521, a second video decoder 523, and a second audio decoder 522.
- the tuner 501 has a function of receiving and demodulating the broadcast signal transmitted from the broadcast station 120 using the reception antenna 111 and outputting the broadcast stream obtained by the demodulation to the broadcast stream decoding means 510.
- The communication interface 502 includes, for example, a NIC (Network Interface Card), and has a function of receiving the communication stream output from the communication service providing station 130 via the Internet communication network 140 and outputting it to the buffer 542, and a function of acquiring an application from the communication service providing station 130.
- The buffer 542 has a function of delaying the communication stream sent from the communication interface 502 by the delay designated by the ATC delay unit 543 and outputting the delayed communication stream to the communication stream decoding unit 520.
- The ATC delay unit 543 has a function of causing the buffer 542 to delay the communication stream sent from the communication interface 502 by the delay value specified by ATC_delay and to output the delayed communication stream to the communication stream decoding unit 520.
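The buffer 542 / ATC delay unit 543 interaction above can be sketched as follows. The class shape is a hypothetical illustration; the patent only states that the communication stream is released after the delay specified by ATC_delay on the ATC time axis.

```python
# Hypothetical sketch: each arriving communication-stream packet is released
# to the decoder only after the ATC_delay has elapsed on the ATC time axis.
from collections import deque

class DelayBuffer:
    def __init__(self, atc_delay: int):
        self.atc_delay = atc_delay          # delay in ATC ticks (27 MHz)
        self.pending = deque()              # queue of (arrival_atc, packet)

    def push(self, arrival_atc: int, packet: bytes) -> None:
        self.pending.append((arrival_atc, packet))

    def pop_ready(self, current_atc: int):
        """Return packets whose delay has elapsed at current_atc, in order."""
        ready = []
        while self.pending and self.pending[0][0] + self.atc_delay <= current_atc:
            ready.append(self.pending.popleft()[1])
        return ready

buf = DelayBuffer(atc_delay=27_000_000)     # 1 second at 27 MHz
buf.push(0, b"pkt0")
buf.push(13_500_000, b"pkt1")
print(buf.pop_ready(27_000_000))            # [b'pkt0']
print(buf.pop_ready(40_500_000))            # [b'pkt1']
```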
- The external device interface 503 includes, for example, a USB (Universal Serial Bus) port, and has a function of receiving a signal from a connected external device (for example, a camera, a motion sensor, or a USB memory) and outputting the received signal to the application execution control unit 530.
- the remote control interface 504 has a function of receiving a signal sent from the remote control 410 and outputting the received signal to the application execution control unit 530.
- The first demultiplexing unit 511 has a function of separating the video stream, the audio stream, the subtitle stream, and the system packets from the broadcast stream sent from the tuner 501, and outputting the separated video stream to the first video decoder 514, the separated audio stream to the first audio decoder 512, the separated subtitle stream to the subtitle decoder 513, and the separated system packets to the system packet manager 515.
- The second demultiplexing unit 521 has a function of separating the video stream and the audio stream from the communication stream sent from the buffer 542, outputting the separated video stream to the second video decoder 523, and outputting the separated audio stream to the second audio decoder 522.
- The first video decoder 514 has a function of decoding the video stream sent from the first demultiplexing unit 511 to generate uncompressed video frames, and outputting each generated video frame to the first video plane 552 at the PTS timing associated with that frame.
- The second video decoder 523 has a function of decoding the video stream sent from the second demultiplexing unit 521 to generate uncompressed video frames, and outputting each generated video frame to the second video plane 555 at the PTS timing associated with that frame.
- The first audio decoder 512 has a function of decoding the audio stream sent from the first demultiplexing unit 511 to generate uncompressed LPCM (Linear Pulse Code Modulation) audio frames, and outputting each generated audio frame to the speech synthesis processing unit 570 at the PTS timing associated with that frame.
- The second audio decoder 522 has a function of decoding the audio stream sent from the second demultiplexing unit 521 to generate uncompressed LPCM audio frames, and outputting each generated audio frame to the speech synthesis processing unit 570 at the PTS timing associated with that frame.
- The subtitle decoder 513 has a function of decoding the subtitle stream sent from the first demultiplexing unit 511 to generate uncompressed image frames, and outputting each generated image frame to the subtitle plane 551 at the PTS timing associated with that frame.
- the system packet manager 515 has the following two functions.
- Data providing function: a function of analyzing the system packets sent from the first demultiplexing unit 511 and providing data required by the application execution control unit 530 in response to a request from the application execution control unit 530.
- The provided data includes, for example, program information stored in EIT (Event Information Table) packets, stream attribute information stored in PMT (Program Map Table) packets, and BML (Broadcast Markup Language) content information provided by DSM-CC (Digital Storage Media Command and Control).
- Application control function: a function of, when a system packet sent from the first demultiplexing unit 511 includes an application execution control signal (for example, an application execution start signal or an application execution end signal) for the application execution control unit 530, outputting the signal to the application execution control unit 530.
- the application execution control unit 530 has a function of executing an application acquired by the communication interface 502.
- Here, when the application is HTML (Hyper Text Markup Language) content, the application execution control unit 530 is, for example, a Web browser, and when the application is a Java (registered trademark) application, it is, for example, a Java (registered trademark) execution environment.
- the application acquires program information of a currently displayed broadcast program, stream attribute information, and the like from the system packet manager 515 via a broadcast resource access API (Application Programming Interface).
- the application controls the operation, stop, and the like of the broadcast stream decoding unit 510 and the communication stream decoding unit 520 via the playback control API.
- the application outputs the CG image to the background plane 553 and the graphics plane 554 via the graphics drawing API.
- The application controls the plane composition method in the plane composition processing unit 560. The plane composition method includes scaling such as enlargement and reduction, positioning, and the like with respect to the video planes.
- the application acquires data from the external device interface 503 and the remote control interface 504, and changes the display content of the image in accordance with the user's operation to realize a graphical user interface.
- the subtitle plane 551 is a buffer for storing the frame sent from the subtitle decoder 513
- the first video plane 552 is a buffer for storing the frame sent from the first video decoder 514
- the background plane 553 is a buffer for storing the background image sent from the application execution control unit 530
- the graphics plane 554 is a buffer for storing the CG image sent from the application execution control unit 530.
- the second video plane 555 is a buffer for storing the frames sent from the second video decoder 523.
- The plane composition processing unit 560 has a function of combining the images stored in the subtitle plane 551, the first video plane 552, the background plane 553, the graphics plane 554, and the second video plane 555 into one image, and outputting the generated image to the display 580.
- Here, the plane composition processing unit 560 performs image composition by superimposing, in order from the back, the background plane 553, the first video plane 552, the second video plane 555, the subtitle plane 551, and the graphics plane 554. At this time, scaling such as enlargement and reduction, positioning, and the like are performed on the video planes.
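The back-to-front plane composition above can be sketched as follows. The pixel representation (dicts with None for transparent pixels) is an illustrative assumption; the patent specifies only the superimposition order.

```python
# Sketch of back-to-front plane composition: later (front) planes overwrite
# earlier (back) planes wherever they have a non-transparent pixel.

def compose(planes):
    """planes: list ordered back-to-front; each is {pixel_index: value or None}."""
    out = {}
    for plane in planes:                     # front planes overwrite back planes
        for idx, value in plane.items():
            if value is not None:
                out[idx] = value
    return out

background   = {0: "bg", 1: "bg", 2: "bg"}
first_video  = {0: "broadcast", 1: "broadcast", 2: None}
second_video = {0: None, 1: "communication", 2: None}
subtitle     = {0: None, 1: None, 2: "subtitle"}
graphics     = {0: None, 1: None, 2: None}

print(compose([background, first_video, second_video, subtitle, graphics]))
# {0: 'broadcast', 1: 'communication', 2: 'subtitle'}
```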
- the speech synthesis processing unit 570 has a function of mixing the audio frame output from the first audio decoder 512 and the audio frame output from the second audio decoder 522 (superimposing sounds) and outputting the result to the speaker 590.
- FIG. 6 shows, extracted from FIG. 5, the first demultiplexing unit 511, the second demultiplexing unit 521, the first video decoder 514, the second video decoder 523, the synchronization start packet determining unit 540, and the input start control unit 541. Hereinafter, this part is referred to as the "main video decoding unit".
- The first demultiplexing unit 511 includes a data buffer 601, a decoder input unit 602, a PID (Packet ID) filter 603, and a first ATC counter 620; the second demultiplexing unit 521 includes a data buffer 611, a decoder input unit 612, a PID filter 613, and a second ATC counter 640; the first video decoder 514 includes a TB 604, an MB 605, an EB 606, and a video decoder 607; and the second video decoder 523 includes a TB 614, an MB 615, an EB 616, and a video decoder 617.
- In addition, the main video decoding unit includes a crystal resonator 660, a first STC counter 630, and a second STC counter 650, which are not shown in FIG. 5.
- The data buffer 601 and the data buffer 611, the decoder input unit 602 and the decoder input unit 612, the PID filter 603 and the PID filter 613, the first ATC counter 620 and the second ATC counter 640, the first STC counter 630 and the second STC counter 650, the TB 604 and the TB 614, the MB 605 and the MB 615, the EB 606 and the EB 616, and the video decoder 607 and the video decoder 617 have the same configurations as each other. Therefore, only the data buffer 601, the decoder input unit 602, the PID filter 603, the first ATC counter 620, the first STC counter 630, the TB 604, the MB 605, the EB 606, and the video decoder 607 will be described here.
- the crystal resonator 660 is a resonator that oscillates at a frequency of 27 MHz using the piezoelectric effect of crystal.
- The first ATC counter 620 is a counter that ticks the ATC time axis using the oscillation of the crystal resonator 660, and has a function of incrementing its value at a frequency of 27 MHz.
- The first STC counter 630 is a counter that ticks the STC time axis using the oscillation of the crystal resonator 660, and has a function of incrementing its value at a frequency of 90 kHz.
- the first STC counter 630 calibrates the STC time axis with the PCR value of the PCR packet at the arrival timing of the PCR (Program Clock Reference) packet output from the PID filter 603.
- Since PCR packets arrive continuously, the STC value of the first STC counter 630 is continuously calibrated with the PCR value each time a PCR packet arrives.
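- The calibration described above can be sketched as follows; this is an illustrative model, not the patent's implementation, and the class and method names are hypothetical.

```python
# Hypothetical sketch: an STC counter that ticks at 90 kHz and is
# recalibrated whenever a PCR packet arrives (names are illustrative,
# not from the patent text).

class StcCounter:
    def __init__(self):
        self.stc = 0  # current STC value, in 90 kHz ticks

    def tick(self, n=1):
        # Advance the counter by n ticks between PCR arrivals
        # (driven by the 27 MHz crystal, divided down to 90 kHz).
        self.stc += n

    def calibrate(self, pcr_value):
        # On PCR packet arrival, snap the STC time axis to the PCR value.
        self.stc = pcr_value

counter = StcCounter()
counter.tick(100)          # free-running: STC drifts with the local crystal
counter.calibrate(90_000)  # PCR packet arrives carrying value 90000
counter.tick(45)
print(counter.stc)         # 90045
```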
- the data buffer 601 is a buffer for temporarily storing a stream, and has a function of outputting the stored stream to the decoder input device 602 at a timing determined by the input start control unit 541.
- The data buffer 601 is implemented with, for example, DRAM (Dynamic Random Access Memory), a hard disk, or the like.
- The decoder input unit 602 has a function of inputting the TS packets constituting the stream stored in the data buffer 601 to the PID filter 603 at appropriate timings. If the stream stored in the data buffer 601 is a TTS (Timestamped TS) stream, each TS packet is input to the PID filter 603 at the timing when the ATS attached to that TS packet matches the value of the ATC counter. In the following description, it is assumed that both the broadcast stream and the communication stream are TTS streams.
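- The ATS-based pacing rule above can be sketched as follows; the function name and the packet values are illustrative assumptions.

```python
# Illustrative sketch (not the patent's implementation): a decoder input
# unit that releases each TS packet of a TTS stream when the ATS attached
# to the packet matches the current value of the ATC counter.

def release_schedule(tts_packets, atc_start, atc_end):
    """tts_packets: list of (ats, payload); returns (atc_time, payload)
    pairs in the order the packets would be sent to the PID filter."""
    out = []
    for atc in range(atc_start, atc_end):   # the ATC counter ticking
        for ats, payload in tts_packets:
            if ats == atc:                   # ATS matches the ATC value
                out.append((atc, payload))   # -> output to the PID filter now
    return out

packets = [(1300, "V1"), (1302, "A1"), (1305, "V2")]
print(release_schedule(packets, 1299, 1310))
# [(1300, 'V1'), (1302, 'A1'), (1305, 'V2')]
```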
- The PID filter 603 has a function of outputting each TS packet sent from the decoder input unit 602 to the decoder corresponding to its PID, to the system packet manager 515, or to the first STC counter 630, according to the PID of the TS packet. For example, if the PID value indicates an audio stream, the packet is output to the first audio decoder 512; if the PID value indicates a video stream, it is output to the first video decoder 514; and if the PID value indicates a PCR packet, it is output to the first STC counter 630.
- TB 604 is a buffer for accumulating TS packets sent from the PID filter 603.
- The MB 605 is a buffer for temporarily storing PES packets when outputting data from the TB 604 to the EB 606, and has a function of removing the TS header and the adaptation field from each TS packet when the TS packet is transmitted from the TB 604 to the MB 605.
- The EB 606 is a buffer for storing pictures in an encoded state, and has a function of removing the PES packet header from each PES packet when the PES packet is transmitted from the MB 605 to the EB 606.
- The video decoder 607 has a function of decoding each encoded picture stored in the EB 606 at its corresponding decoding time (DTS: Decode Time Stamp), and a function of outputting each decoded picture to the first video plane 552 at its corresponding display time (PTS: Presentation Time Stamp).
- The synchronization start packet determination unit 540 has a function of determining, from among the packets included in the stream stored in the data buffer 601 and the packets included in the stream stored in the data buffer 611, the packets from which synchronized display is to be started (hereinafter referred to as "synchronization start TS packets").
- FIG. 7A is a schematic diagram showing a state in which the TTS stream of the broadcast stream is buffered in the data buffer 601 and the TTS stream of the communication stream is buffered in the data buffer 611.
- Each box labeled V in FIG. 7A represents the first TS packet of a GOP (Group Of Pictures) included in the stream, and the number above each box indicates the memory address at which that packet is stored.
- the first frame of a GOP is an I picture.
- The box labeled SI in FIG. 7A represents the SI included in the stream; here, the STC_Offset stored in the SI is assumed to be "-2000".
- The synchronization start packet determination unit 540 acquires, from the broadcast stream stored in the data buffer 601, the memory address of the first TS packet of each GOP, the PTS value of that TS packet, and the STC_Offset stored in the SI, and generates a first GOP information table.
- FIG. 7B shows a first GOP information table 710 that is an example of the first GOP information table generated by the synchronization start packet determination unit 540.
- the first GOP information table 710 is a table in which an address 711, a PTS 712, and a PTS + STC_offset 713 are associated with each other.
- Address 711 is a memory address of the first TS packet of the GOP acquired by the synchronization start packet determination unit 540.
- PTS 712 is the PTS value of the first TS packet of the GOP stored in the corresponding address 711.
- PTS + STC_offset 713 is a value obtained by adding the value of STC_offset acquired by the synchronization start packet determining unit 540 to the PTS value of the corresponding PTS 712 (hereinafter referred to as “PTS + STC_offset value”).
- Similarly, the synchronization start packet determination unit 540 acquires the memory address of the first TS packet of each GOP and the PTS value of that TS packet from the communication stream stored in the data buffer 611, and generates a second GOP information table.
- FIG. 7B shows a second GOP information table 720 that is an example of the second GOP information table generated by the synchronization start packet determining unit 540.
- the second GOP information table 720 is a table in which an address 721 and a PTS 722 are associated with each other.
- The address 721 has the same meaning as the address 711 of the first GOP information table 710, and the PTS 722 has the same meaning as the PTS 712 of the first GOP information table 710.
- The synchronization start packet determination unit 540 compares the first GOP information table 710 and the second GOP information table 720, and finds, among the combinations in which the value of PTS + STC_offset 713 matches the value of PTS 722, the combination having the smallest value of PTS 712.
- The TS packet stored at the address 711 associated with that combination and the TS packet stored at the address 721 associated with that combination are determined as the synchronization start TS packet in the broadcast stream and the synchronization start TS packet in the communication stream, respectively.
- In the example of FIG. 7, the TS packet stored at memory address "500" of the data buffer 601 is determined as the synchronization start TS packet in the broadcast stream, and the TS packet stored at memory address "150" of the data buffer 611 is determined as the synchronization start TS packet in the communication stream.
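- The table comparison described above can be sketched as follows; the concrete PTS entries are illustrative assumptions, apart from the STC_Offset value of -2000 and the resulting addresses "500" and "150", which follow the FIG. 7 example.

```python
# Sketch of the synchronization start packet search over the two GOP
# information tables (FIG. 7). Table contents other than STC_OFFSET and
# the expected addresses 500 / 150 are assumptions for illustration.

STC_OFFSET = -2000

# first GOP information table: (address 711, PTS 712)
broadcast_table = [(100, 2500), (500, 3000), (900, 3500)]
# second GOP information table: (address 721, PTS 722)
comm_table = [(150, 1000), (550, 1500)]

def find_sync_start(bcast, comm, stc_offset):
    matches = [
        (b_pts, b_addr, c_addr)
        for b_addr, b_pts in bcast
        for c_addr, c_pts in comm
        if b_pts + stc_offset == c_pts  # PTS + STC_offset 713 == PTS 722
    ]
    if not matches:
        return None
    # among the matching combinations, take the one with the smallest PTS 712
    _, b_addr, c_addr = min(matches)
    return b_addr, c_addr

print(find_sync_start(broadcast_table, comm_table, STC_OFFSET))
# (500, 150): addresses of the broadcast / communication sync start TS packets
```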
- The input start control unit 541 has a function of determining the timing at which the broadcast stream stored in the data buffer 601 is output to the decoder input unit 602, and a function of determining the timing at which the communication stream stored in the data buffer 611 is output to the decoder input unit 612.
- FIG. 8 is a timing diagram illustrating the timing at which the decoder input unit 602 and the decoder input unit 612 output the TS packet sequence of the broadcast stream and the TS packet sequence of the communication stream to the PID filter 603 and the PID filter 613, respectively, and the timing at which the video frames obtained by decoding those TS packet sequences are output to the video planes.
- For simplicity, the broadcast stream and the communication stream are each assumed to consist only of a video stream, and the numerical values on the time axes such as ATC1, ATC2, STC1, and STC2 described below, the frame rate, and the like are set ignoring actual time units. In an actual calculation, the actual numerical units, frame rate, and the like must be taken into account.
- ATC 1 indicates an ATC time axis determined by the first ATC counter 620.
- a TS packet string 801 of the broadcast stream indicates the timing of each TS packet output from the decoder input unit 602 on the ATC1 time axis.
- The width of each box indicates the time over which that TS packet is transmitted from the decoder input unit 602 to the PID filter 603.
- the TS packet 820 is output from the decoder input device 602 at time 1300 on the ATC 1 time axis.
- the TS packet 820 denoted as V1 indicates the synchronization start TS packet of the broadcast stream
- the TS packet 810 denoted as PCR1 indicates the PCR packet nearest to the synchronization start TS packet of the broadcast stream.
- a video frame obtained by decoding the TS packet 820 is assumed to have a frame ID of 30.
- ATC2 indicates an ATC time axis determined by the second ATC counter 640.
- the TS packet sequence 802 of the communication stream indicates the timing of each TS packet output from the decoder input unit 612 on the ATC2 time axis, and the width of each box indicates that each TS packet is transmitted from the decoder input unit 612 to the PID filter. The time transmitted to 613 is shown.
- the TS packet 840 is output from the decoder input unit 612 at time 3400 on the ATC2 time axis.
- the TS packet 840 denoted by V2 indicates the synchronization start TS packet of the communication stream
- the TS packet 830 denoted by PCR2 indicates the PCR packet nearest to the synchronization start TS packet of the communication stream.
- the video frame obtained by decoding the TS packet 840 is assumed to have a frame ID of 30.
- STC1 indicates an STC time axis determined by the first STC counter 630.
- the broadcast stream frame ID 803 indicates the timing at which the video frame obtained by decoding the TS packet sequence of the broadcast stream is output to the first video plane 552.
- For example, it is shown that the video frame having frame ID 30, obtained by decoding the TS packet 820, is output to the first video plane 552 at time 2600 on the STC1 time axis, that is, time 1600 on the ATC1 time axis.
- D1 is the time taken from when a TS packet is output from the decoder input unit 602 until the corresponding decoded video frame is output to the first video plane 552. In this example, it takes 300 on the ATC1 time axis and 300 on the STC1 time axis. A formula for calculating D1 is described later.
- STC2 indicates an STC time axis determined by the second STC counter 650.
- The communication stream frame ID 804 indicates the timing at which the video frames obtained by decoding the TS packet sequence of the communication stream are output to the second video plane 555.
- a video frame having a frame ID of 30 obtained by decoding TS packet 840 is output to second video plane 555 at time 6600 on the STC2 time axis, that is, at time 3600 on the ATC2 time axis. It is shown.
- D2 is the time taken from when a TS packet is output from the decoder input unit 612 until the corresponding decoded video frame is output to the second video plane 555. In this example, it takes 200 on the ATC2 time axis and 200 on the STC2 time axis. A formula for calculating D2 is described later.
- In order for the video frame obtained by decoding the TS packet 820 and the video frame obtained by decoding the TS packet 840 to be output simultaneously to the first video plane 552 and the second video plane 555, respectively, the timing at which the TS packet 820 is output from the decoder input unit 602 and the timing at which the TS packet 840 is output from the decoder input unit 612 need to be shifted by D1 - D2.
- The input start control unit 541 therefore calculates the value of D1 - D2, and determines the timing for outputting the broadcast stream stored in the data buffer 601 to the decoder input unit 602 and the timing for outputting the communication stream stored in the data buffer 611 to the decoder input unit 612 such that the timing at which the synchronization start TS packet of the broadcast stream determined by the synchronization start packet determination unit 540 (the TS packet 820 in FIG. 8) is output from the decoder input unit 602 and the timing at which the synchronization start TS packet of the communication stream (the TS packet 840 in FIG. 8) is output from the decoder input unit 612 are shifted by the calculated value D1 - D2, so that the two video frames are output simultaneously to the first video plane 552 and the second video plane 555.
- Let ATS(X) be a function that returns the ATS value assigned to a TS packet X, PCR(X) a function that returns the PCR value carried by a TS packet X, SyncPTS1 the PTS of the video frame obtained by decoding the TS packet 820, and SyncPTS2 the PTS of the video frame obtained by decoding the TS packet 840. Then D1 and D2 are calculated as follows.
- D1 = SyncPTS1 - PCR(PCR1) - ATS(V1) + ATS(PCR1)
- D2 = SyncPTS2 - PCR(PCR2) - ATS(V2) + ATS(PCR2)
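- Using the timings of FIG. 8, these formulas can be evaluated numerically; the ATS and PCR values below are assumptions chosen to be consistent with that figure (units are ignored, as in the text).

```python
# Numeric sketch of the D1/D2 formulas. SyncPTS1/SyncPTS2 and the output
# times follow FIG. 8; the ATS and PCR values are illustrative assumptions
# chosen to be consistent with that figure.

def delay(sync_pts, pcr_of_pcr_packet, ats_of_v, ats_of_pcr):
    # D = SyncPTS - PCR(PCRn) - ATS(Vn) + ATS(PCRn)
    return sync_pts - pcr_of_pcr_packet - ats_of_v + ats_of_pcr

D1 = delay(sync_pts=2600, pcr_of_pcr_packet=2200, ats_of_v=1300, ats_of_pcr=1200)
D2 = delay(sync_pts=6600, pcr_of_pcr_packet=6300, ats_of_v=3400, ats_of_pcr=3300)
print(D1, D2, D1 - D2)  # 300 200 100 -> the two input timings are shifted by 100
```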
- Next, the output start process will be described. In this process, the output start timing to the decoder input unit 602 for the TS packet sequence of the broadcast stream and the output start timing to the decoder input unit 612 for the TS packet sequence of the communication stream are calculated so that the broadcast video included in the TS packet sequence of the broadcast stream and the communication video included in the TS packet sequence of the communication stream are displayed synchronously at the timing intended by the video producer, and output of the TS packet sequence of the broadcast stream and the TS packet sequence of the communication stream is started at the calculated timings.
- FIG. 9 is a flowchart of the output start process.
- the output start process is started when the input of the broadcast stream to the data buffer 601 and the input of the communication stream to the data buffer 611 are started.
- In the output start process, the synchronization start packet determination unit 540 first acquires, from the TS packet sequence of the broadcast stream stored in the data buffer 601, the memory address of the first TS packet of each GOP, the PTS value of that TS packet, and the STC_Offset stored in the SI (step S900), and generates a first GOP information table (step S910).
- Next, the synchronization start packet determination unit 540 acquires the memory address of the first TS packet of each GOP and the PTS value of that TS packet from the TS packet sequence of the communication stream stored in the data buffer 611 (step S920), and generates a second GOP information table (step S930).
- The synchronization start packet determination unit 540 then compares the first GOP information table 710 and the second GOP information table 720 to check whether there is any combination in which the value of PTS + STC_offset 713 matches the value of PTS 722 (step S940).
- When there is a matching combination (step S940: Yes), the synchronization start packet determination unit 540 selects, from the matching combinations, the combination having the smallest value of PTS 712. The TS packet stored at the address 711 of that combination and the TS packet stored at the address 721 of that combination are determined as the synchronization start TS packet in the broadcast stream and the synchronization start TS packet in the communication stream, respectively (step S950).
- Next, the input start control unit 541 calculates D1 - D2 (step S960), calculates the output start timing of the synchronization start TS packet in the broadcast stream to the decoder input unit 602 and the output start timing of the synchronization start TS packet in the communication stream to the decoder input unit 612, and starts output of the TS packet sequence of the broadcast stream and the TS packet sequence of the communication stream at the calculated output start timings (step S970).
- the display device 110 ends the output start process when the process of step S970 ends and when there is no matching combination in the process of step S940 (step S940: No).
- the display device 110 configured as described above displays each of the video frames constituting the communication stream at a timing obtained by adding the STC_Offset value to the PTS associated with each video frame.
- the received broadcast video and the received communication video are displayed synchronously at the timing as intended by the video producer.
- The display device 110 in the embodiment has a configuration including two ATC counters, the first ATC counter 620 and the second ATC counter 640, and two STC counters, the first STC counter 630 and the second STC counter 650.
- the first modified display device in the modified example 1 is an example of a configuration including one ATC counter and one STC counter.
- the first modified display device is modified from the display device 110 in the embodiment so that the main video decoding unit is changed to the first modified main video decoding unit.
- FIG. 10 is a block diagram of the first modified main video decoding unit.
- The first modified main video decoding unit is modified from the main video decoding unit in the embodiment (see FIG. 6) such that the second ATC counter 640 and the second STC counter 650 are deleted, and an ATC_Offset adder 1040 and an STC_Offset adder 1050 are added.
- The ATC_Offset adder 1040 has a function of calculating ATC_Offset, which indicates the ATC offset value of the communication stream with respect to the broadcast stream, adding the calculated ATC_Offset to the ATC time sent from the first ATC counter 620, and outputting the result.
- the STC_Offset adder 1050 has a function of adding the STC_Offset included in the broadcast stream to the STC time sent from the first STC counter 630 and outputting it.
- The first modified display device having the above configuration decodes the broadcast stream along the STC1 time axis, and decodes the communication stream along the STC2 time axis obtained by adding STC_Offset to the STC1 time axis.
- the received broadcast video and the received communication video are displayed synchronously at the timing as intended by the video producer.
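- The patent does not spell out the ATC_Offset calculation at this point; the following is one assumed way it could be derived from the quantities of FIG. 8, shifting the communication stream's ATS values onto the ATC1 axis.

```python
# Assumed sketch (the ATC_Offset formula is not given in the text here):
# ATC_Offset could be chosen so that the synchronization start TS packet of
# the communication stream (ATS 3400 on ATC2) is released D1 - D2 = 100
# after the broadcast one (ATS 1300 on ATC1), using FIG. 8's values.

ats_v1, ats_v2 = 1300, 3400   # ATS of the two synchronization start TS packets
d1, d2 = 300, 200             # decode-to-display delays from FIG. 8

atc_offset = ats_v2 - ats_v1 - (d1 - d2)
print(atc_offset)  # 2000

def atc2_time(atc1_time):
    # the ATC_Offset adder 1040: ATC2 = ATC1 + ATC_Offset
    return atc1_time + atc_offset

print(atc2_time(1400))  # 3400 -> V2 is released when ATC1 reaches 1400
```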
- the first modified display device in the first modification has a configuration corresponding to the case where the communication stream transmitted from the communication service providing station 130 is a TTS stream in the MPEG-2 TS format.
- the second modified display device in the modified example 2 is an example of a configuration corresponding to the case where the communication stream transmitted by the communication service providing station 130 is a PES stream configured by a PES packet sequence.
- the second modified display device is modified such that the first modified main video decoding unit is changed to the second modified main video decoding unit from the first modified display device in Modification 1.
- FIG. 11 is a block diagram of the second modified main video decoding unit.
- the decoder input unit 612, the PID filter 613, and the ATC_Offset adder 1040 are deleted from the first modified main video decoding unit (see FIG. 10) in Modification 1.
- the second video decoder 523 is modified to be changed to the second video decoder 1123.
- the second video decoder 1123 is modified such that the TB 614 and the MB 615 are deleted from the second video decoder 523 in the first modification, and the EB 616 is changed to the EB 1116.
- The EB 1116 is a buffer for storing pictures in an encoded state, and has a function of removing the PES packet header from each PES packet when the PES packet is transmitted from the data buffer 611 to the EB 1116.
- The input start control unit 541 may start the output of the broadcast stream from the data buffer 601 to the decoder input unit 602 after waiting until sufficient data is accumulated in the EB 1116 (for example, until one second's worth of data is accumulated in the EB 1116, or until the storage capacity of the EB 1116 is full). Regarding the output of the communication stream from the data buffer 611 to the EB 1116, data is output continuously so that the EB 1116 does not underflow.
- The second modified display device having the above configuration decodes the broadcast stream along the STC1 time axis, and decodes the communication stream along the STC2 time axis obtained by adding STC_Offset to the STC1 time axis.
- the received broadcast video and the received communication video are displayed synchronously at the timing as intended by the video producer.
- the display device 110 in the embodiment has a configuration in which the plane synthesis processing unit 560 synthesizes the video frame output from the first video decoder 514 and the video frame output from the second video decoder 523.
- the third modified display device is an example of a configuration in which the video frame output from the first video decoder 514 and the video frame output from the second video decoder 523 are multiplexed again.
- the third modified display device is modified such that the main video decoding unit is changed to the third modified main video decoding unit from the display device 110 in the embodiment.
- FIG. 12 is a block diagram of the third modified main video decoding unit.
- The third modified main video decoding unit is modified from the main video decoding unit in the embodiment (see FIG. 6) such that the plane synthesis processing unit 560, the video decoder 607, and the video decoder 617 are deleted, and a multiplexer 1260 and a TS player 1270 are added.
- The multiplexer 1260 generates a TS packet sequence (hereinafter referred to as the "second broadcast stream") from the encoded video frame sequence output from the EB 606, generates a TS packet sequence (hereinafter referred to as the "second communication stream") from the encoded video frame sequence output from the EB 616, and multiplexes the second broadcast stream and the second communication stream.
- the multiplexer 1260 doubles the system rate of the second broadcast stream and the system rate of the second communication stream.
- FIG. 13 (a) is a timing chart showing the timing of the TS packet sequence of the second broadcast stream and the TS packet sequence of the second communication stream before the system rate is doubled.
- The system rate before doubling is 24 Mbps.
- Each box indicates a TS packet constituting the TS packet sequence, and dt, the width of each box, indicates the time from when each TS packet is input to the PID filter until it is output; here, dt = 188 × 8 / 24,000,000 seconds.
- FIG. 13B is a timing diagram showing the timing of the TS packet sequence of the second broadcast stream and the TS packet sequence of the second communication stream after the system rate is doubled.
- The system rate is doubled from 24 Mbps to 48 Mbps; accordingly, dt = 188 × 8 / 48,000,000 seconds.
- FIG. 13C is a timing diagram showing the timing of the TS packet sequence of the composite stream.
- By doubling the system rate of the second broadcast stream and the system rate of the second communication stream in this way, the multiplexing of the second broadcast stream and the second communication stream can be performed relatively easily.
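- The effect of doubling the system rate on dt can be checked as follows; this is a direct computation of the per-packet transmission time for a 188-byte TS packet.

```python
# Per-packet transmission time dt for a 188-byte TS packet at a given
# system rate, as in FIGS. 13(a)/(b): doubling the rate halves dt, which
# frees alternate time slots for interleaving the two streams.

def packet_time(system_rate_bps, packet_bytes=188):
    return packet_bytes * 8 / system_rate_bps

dt_before = packet_time(24_000_000)  # 24 Mbps, before doubling
dt_after = packet_time(48_000_000)   # 48 Mbps, after doubling
print(dt_before, dt_after)
assert dt_after == dt_before / 2     # half-width slots after doubling
```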
- the conversion method from ATC2 to ATC1 includes, for example, a method of calculating ATC_Offset described in the first modification.
- The TS player 1270 has a function of separating the second broadcast stream and the second communication stream from the combined stream sent from the multiplexer 1260, and outputting the video frames included in the second broadcast stream and the video frames included in the second communication stream in synchronization with each other.
- the combined stream generated by the multiplexer 1260 is multiplexed with the STC_offset added.
- the composite stream generated by the multiplexer 1260 is the same as a normal TS stream. Therefore, the composite stream generated by the multiplexer 1260 can be stored on an optical disk or the like and played back by another playback device.
- the display device 110 in the embodiment is configured to perform synchronous display of broadcast video and communication video using four time axes of ATC1, ATC2, STC1, and STC2.
- the fourth modified display device in the modified example 4 is configured to perform synchronized display of broadcast video and communication video using a total of five time axes obtained by adding the AbsTime time axis to these four time axes. It is an example.
- the fourth modified display device is modified so that the main video decoding unit is changed to the fourth modified main video decoding unit from the display device 110 in the embodiment.
- FIG. 14 is a block diagram of the fourth modified main video decoding unit.
- the synchronization start packet determination unit 540 and the input start control unit 541 are deleted from the main video deformation unit (see FIG. 6) in the embodiment, and synchronization is performed.
- the start packet determination unit 1440, the synchronization start packet determination unit 1450, the input start control unit 1441, the input start control unit 1451, the crystal resonator 1460, the AbsTime counter 1410, and the AbsTime counter 1411 are modified.
- the components above the broken line are stored in a first housing (for example, a television receiver), and the components below the broken line are stored in a second housing (for example, a tablet terminal).
- the crystal resonator 1460 is an oscillator that oscillates at a frequency of 27 MHz by utilizing the piezoelectric effect of crystal, similarly to the crystal resonator 660.
- The AbsTime counter 1410 and the AbsTime counter 1411 are counters that tick the AbsTime time axis using the oscillation of the crystal resonator 660 and the crystal resonator 1460, respectively, and have a function of incrementing their values at a frequency of 27 MHz.
- AbsTime is an absolute time such as that of an RTC (Real Time Clock).
- The AbsTime counter 1410 and the AbsTime counter 1411 have a function of communicating with each other, and the AbsTime counted by the AbsTime counter 1410 and the AbsTime counted by the AbsTime counter 1411 are kept matched using NTP (Network Time Protocol) or the like.
- The synchronization start packet determination unit 1440 and the synchronization start packet determination unit 1450 have a function of communicating with each other, and the block composed of the synchronization start packet determination unit 1440 and the synchronization start packet determination unit 1450 realizes a function equivalent to that of the synchronization start packet determination unit 540 in the embodiment.
- the input start control unit 1441 and the input start control unit 1451 have a function of communicating with each other, and a block including the input start control unit 1441 and the input start control unit 1451 is equivalent to the input start control unit 541 in the embodiment. Realize the function.
- The input start control unit 1441 and the input start control unit 1451 realize the function of determining the timing to output the broadcast stream stored in the data buffer 601 to the decoder input unit 602, and the function of determining the timing to output the communication stream stored in the data buffer 611 to the decoder input unit 612.
- In doing so, a total of five time axes are used: the AbsTime time axis in addition to the four time axes ATC1, ATC2, STC1, and STC2.
- FIG. 15 is a timing diagram in which a time axis AbsTime is added to the timing diagram of FIG. This figure shows a case where SyncAbsTime, which is a value on the AbsTime time axis at which synchronous playback is started, is 5600.
- Let InputAbsTime1 be the input start timing, on the AbsTime time axis, of the synchronization start TS packet (V1) of the broadcast stream to the decoder input unit 602.
- Let InputAbsTime2 be the input start timing, on the AbsTime time axis, of the synchronization start TS packet (V2) of the communication stream to the decoder input unit 612.
- the input start control unit 1441 starts inputting the synchronization start TS packet (V1) of the broadcast stream to the decoder input device 602 at the timing of InputAbsTime1.
- the input start control unit 1451 starts input of the synchronization start TS packet (V2) of the communication stream to the decoder input device 612 at the timing of InputAbsTime2. Thereby, it becomes possible to start displaying each video frame in synchronization with each other at the timing of SyncAbsTime.
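- Although the text does not give the relation as a formula, a natural reading, assumed here, is that each synchronization start TS packet must be input its decode-to-display delay ahead of SyncAbsTime.

```python
# Assumed relation (not stated as a formula in the text): if a stream's
# decode-to-display delay is D, its synchronization start TS packet must be
# input D before SyncAbsTime so that display starts exactly at SyncAbsTime.

sync_abs_time = 5600        # SyncAbsTime from FIG. 15
d1, d2 = 300, 200           # delays D1 and D2 from FIG. 8

input_abs_time1 = sync_abs_time - d1  # input timing of V1 (first housing)
input_abs_time2 = sync_abs_time - d2  # input timing of V2 (second housing)
print(input_abs_time1, input_abs_time2)  # 5300 5400
```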
- The video stored in the first video plane 552 is displayed on the display provided in the first housing, and the video stored in the second video plane 555 is displayed on the display provided in the second housing.
- Thereby, the received broadcast video and the received communication video can be displayed on the two housings in synchronization with each other at the timing intended by the video producer.
- the first modified display device in the modified example 1 has a configuration corresponding to the case where the communication stream transmitted by the communication service providing station 130 is in the MPEG-2 TS format.
- the fifth modified display device in the modified example 5 is an example of a configuration corresponding to the case where the communication service providing station 130 transmits graphics data to be displayed in synchronization with the broadcast video.
- As the graphics data, for example, information on players, such as player names when the broadcast video is a soccer game, is conceivable.
- the fifth modified display device is modified such that the first modified main video decoding unit is changed to the fifth modified main video decoding unit from the first modified display device in Modification 1.
- FIG. 16 is a block diagram of the fifth modified main video decoding unit.
- The fifth modified main video decoding unit is modified from the first modified main video decoding unit in Modification 1 (see FIG. 10) such that the data buffer 611, the decoder input unit 612, the PID filter 613, the TB 614, the MB 615, the EB 616, the video decoder 617, the second video plane 555, and the ATC_Offset adder 1040 are deleted, and an application execution control unit 530 and a graphics plane 554 are added.
- The application execution control unit 530, by executing an application, draws a CG image from the CG image data acquired via the communication interface 502 and outputs the CG image to the graphics plane 554.
- FIG. 17A is a data configuration diagram showing an example of the data configuration of CG image data.
- the CG image data is drawing instruction data to which DTS and PTS are assigned.
- The drawing instruction data is, for example, script code; if the application format is HTML, it is JavaScript (registered trademark) code for drawing graphics.
- the application execution control unit 530 starts drawing at the DTS timing on the time axis of STC2 output from the STC_Offset adder 1050 by executing the application, and outputs the drawn CG image to the graphics plane 554.
- the application acquires the current time with a function such as GetCurrentTime (stc) and starts drawing at the timing indicated by DTS.
- In addition, the application displays the drawn image at a specified time (outputPTS) with a function such as SyncBuffer (outputPTS).
- the graphics plane 554 has a double buffer structure.
- FIG. 17B is a timing chart showing the timing of drawing and displaying a CG image.
- As shown in the figure, the application execution control unit 530 starts drawing into one of the double buffers of the graphics plane 554 at the timing indicated by the DTS, and the graphics plane 554 displays the drawn CG image at the timing indicated by the PTS.
- the graphics plane 554 has a double buffer structure, so that a smooth display of a CG image can be realized.
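- The double-buffer behavior can be sketched as follows; the class, method names, and image content are illustrative, not from the patent.

```python
# Minimal sketch of the double-buffered graphics plane: drawing begins in
# the back buffer at the DTS timing, and the buffers are swapped at the PTS
# timing so that only a finished CG image is ever displayed.

class GraphicsPlane:
    def __init__(self):
        self.front = None  # buffer currently displayed
        self.back = None   # buffer being drawn into

    def draw(self, cg_image):
        # called at the DTS timing: render into the hidden back buffer
        self.back = cg_image

    def flip(self):
        # called at the PTS timing: swap buffers to display the drawn image
        self.front, self.back = self.back, self.front

plane = GraphicsPlane()
plane.draw("player names overlay")  # DTS reached
plane.flip()                        # PTS reached
print(plane.front)                  # player names overlay
```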
- Note that the application execution control unit 530 may realize the functions of the synchronization start packet determination unit 540 and the input start control unit 541 by executing an application instead.
- In that case, a GOP information table is acquired by a function such as GetGOPTable (gop_table), a synchronization start TS packet is determined based on the PTS and the like included in the acquired GOP information table, and the synchronization start position is set by a function such as SetStartPTS (pts) or SetStartFrameID (frameID).
- The fifth modified display device having the above configuration decodes the broadcast stream along the STC1 time axis and, similarly to the first modified display device in Modification 1, draws and outputs the CG image along the STC2 time axis obtained by adding STC_Offset to the STC1 time axis. Thereby, the received broadcast video and the received CG image are displayed in synchronization at the timing intended by the video producer.
- the sixth modified display device in Modification 6 is an example of a configuration in which a graphics engine, a hardware accelerator for drawing a CG image, is added to the fifth modified display device in Modification 5.
- the sixth modified display device is modified from the fifth modified display device in Modification 5 such that the fifth modified main video decoding unit is changed to a sixth modified main video decoding unit.
- FIG. 18 is a block diagram of the sixth modified main video decoding unit.
- the sixth modified main video decoding unit is modified from the fifth modified main video decoding unit (see FIG. 16) in Modification 5 so that an EB 1840 and a graphics engine 1850 are added.
- EB 1840 is a buffer for storing CG image data.
- the application execution control unit 530 stores the CG image data acquired by the communication interface 502 in the EB 1840 by executing the application.
- the graphics engine 1850 has a function of drawing a CG image from the CG image data stored in the EB 1840 at the timing indicated by the DTS and outputting it to the graphics plane 554.
- <Modification 7> <Overview>
- a seventh modified display device obtained by modifying a part of the display device 110 in the embodiment will be described.
- the display device 110 in the embodiment corresponds to the case in which the frame ID of the first video frame of the GOP that starts synchronization in the broadcast stream matches the frame ID of the first video frame of the GOP that starts synchronization in the communication stream.
- in Modification 7, each video frame included in the broadcast stream and each video frame included in the communication stream is given a frame ID.
- each video frame is given a frame ID incremented from the head in display order.
- the frame IDs are assigned to the video frames included in the broadcast stream and to the video frames included in the communication stream so that video frames to be displayed in synchronization with each other are given the same frame ID.
- the seventh modified display device is modified from the display device 110 in the embodiment so that the main video decoding unit is changed to a seventh modified main video decoding unit.
- the seventh modified main video decoding unit will be described with reference to FIG. 6 showing the main video decoding unit in the embodiment, without showing the seventh modified main video decoding unit.
- the synchronization start packet determination unit 540 acquires, from the broadcast stream stored in the data buffer 601, the memory address of the first TS packet of each GOP, the PTS value of that TS packet, the frame ID of the first TS packet of the GOP, and the STC_Offset stored in the SI; it also acquires, from the communication stream stored in the data buffer 611, the memory address of the first TS packet of each GOP, the PTS value of that TS packet, and the frame ID of the first TS packet of the GOP, and generates a modified first GOP information table and a modified second GOP information table.
- FIG. 19 shows a modified first GOP information table 1910, which is an example of the modified first GOP information table generated by the synchronization start packet determination unit 540, and a modified second GOP information table 1920, which is an example of the modified second GOP information table generated by the same unit.
- the modified first GOP information table 1910 is a table in which an address 1911, a PTS 1912, a PTS + STC_offset 1913, and a frame ID 1914 are associated with each other.
- Address 1911, PTS 1912, and PTS + STC_offset 1913 are the same as address 711, PTS 712, and PTS + STC_offset 713 in the embodiment, respectively (see FIG. 7B).
- the frame ID 1914 is the frame ID of the first TS packet of the GOP stored in the corresponding address 1911.
- the modified second GOP information table 1920 is a table in which an address 1921, a PTS 1922, and a frame ID 1924 are associated with each other.
- Address 1921, PTS 1922, and frame ID 1924 are the same as address 1911, PTS 1912, and frame ID 1914, respectively (see FIG. 7B).
- the frame ID of the entry of address 500 in the GOP table information of the broadcast stream is 30, whereas the frame ID of the entry of address 150 in the GOP table information of the communication stream is 32.
- FIG. 20 is a timing chart showing, for the seventh modified display device, the timing at which the decoder input unit 602 and the decoder input unit 612 output the TS packet sequence of the broadcast stream and the TS packet sequence of the communication stream to the PID filter 603 and the PID filter 613, respectively, and the timing at which the video frames obtained by decoding those packet sequences are output to the video planes.
- the input start control unit 541 determines the input timing to the decoder as shown in FIG. First, D1 and D2 are calculated in the same manner as the method described in FIG. Next, a difference value (D3) is calculated when the broadcast stream synchronization start PTS (SyncPTS1) and the communication stream synchronization start PTS (SyncPTS2) are projected on the same time axis.
- the input start control unit 541 starts input to the V1 decoder, then delays by the value of D1 + D3-D2, and starts input to the V2 decoder. With this configuration, when the start time of the GOP at which synchronization is started does not match between the broadcast stream and the communication stream, the broadcast stream and the communication stream can be synchronously reproduced.
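The delay calculation above can be sketched as a small function. This is an illustration under stated assumptions: the values are hypothetical 90 kHz ticks, and projecting SyncPTS1 onto the common time axis by adding STC_Offset is an assumption carried over from the earlier modifications, not a formula quoted from the specification.

```javascript
// Sketch of the decoder input-delay calculation: D1 and D2 are the decode
// delays of the broadcast (V1) and communication (V2) streams, and D3 is
// the difference between the two synchronization start PTS values once
// both are projected onto the same time axis (assumed here to be
// STC2 = STC1 + STC_Offset). Input to the V2 decoder starts
// D1 + D3 - D2 after input to the V1 decoder starts.
function v2InputDelay(d1, d2, syncPts1, syncPts2, stcOffset) {
  const d3 = syncPts2 - (syncPts1 + stcOffset); // projected PTS difference
  return d1 + d3 - d2;
}
```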
- alternatively, display of the broadcast stream may be suppressed until the display time of the head of the GOP of the communication stream, which is the later of the two.
- the broadcast stream may be started first. Alternatively, the user may be able to select.
- the value of the frame ID may be a common PTS or TimeCode value for the broadcast stream and the communication stream.
- the display device 110 in the embodiment has a configuration corresponding to the case where the broadcast stream transmitted by the broadcast station 120 is a TTS stream in the MPEG-2 TS format.
- the eighth modified display device in Modification 8 is an example of a configuration corresponding to the case where the broadcast stream transmitted by the broadcast station 120 is a TS stream in the MPEG-2 TS format.
- the configuration of the eighth modified display device in Modification 8 will be described focusing on differences from the display device 110 in the embodiment with reference to the drawings.
- the eighth modified display device is modified such that a converter 2100 is added between the tuner 501 and the first demultiplexing unit 511 (see FIG. 5) in the display device 110 in the embodiment.
- FIG. 21 is a configuration diagram showing the configuration of the converter 2100.
- the converter 2100 includes a crystal resonator 2130, an ATC counter 2140, a TS packet filter 2110, and an ATS adder 2120.
- the crystal resonator 2130 and the ATC counter 2140 are the same as the crystal resonator 660 (see FIG. 6) and the first ATC counter 620 in the embodiment, respectively.
- the TS packet filter 2110 uses the EIT program information and the stream configuration information in the PMT packet to pass only the TS packets that make up the program selected by the user, and inputs them to the ATS adder 2120.
- the ATS adder 2120 refers to the ATC value of the ATC counter 2140 for each 188-byte TS packet input via the TS packet filter 2110, adds an ATS value to the head of the TS packet, and generates a 192-byte TS packet. Since the ATS field is 4 bytes, it takes a value from 0x0 to 0xFFFFFFFF; when the ATC value exceeds 0xFFFFFFFF, it wraps around to 0 again. In the case of Blu-ray, since the first 2 bits of the first 4 bytes of the TS packet are used for copy control information, the ATS value is 30 bits and the wrap-around is performed at 30 bits.
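The ATS addition and wrap-around can be sketched as follows. The function names are illustrative; only the packet sizes (188 in, 192 out), the 4-byte big-endian ATS field, and the wrap-around behavior come from the text.

```javascript
// Wrap-around: keep only the low `bits` bits of the ATC counter
// (32 by default; 30 in the Blu-ray case described in the text).
function wrapAtc(atc, bits = 32) {
  return atc % Math.pow(2, bits);
}

// Sketch of the ATS adder: prepends a 4-byte ATS to a 188-byte TS packet,
// producing a 192-byte (TTS) packet.
function addAts(tsPacket, atcValue) {
  if (tsPacket.length !== 188) throw new Error("not a 188-byte TS packet");
  const out = new Uint8Array(192);
  const ats = wrapAtc(atcValue);
  // Store the ATS big-endian in the first 4 bytes.
  out[0] = (ats >>> 24) & 0xff;
  out[1] = (ats >>> 16) & 0xff;
  out[2] = (ats >>> 8) & 0xff;
  out[3] = ats & 0xff;
  out.set(tsPacket, 4); // original TS packet follows the ATS
  return out;
}
```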
- the converter 2100 has a function of giving an ATS to the head of each TS packet of the broadcast stream by adopting the configuration shown in FIG.
- the converter 2100 is disposed between the tuner 501 and the first demultiplexing unit 511, so that the broadcast stream input to the first demultiplexing unit 511 becomes a TTS stream.
- FIG. 22 shows a problem during delayed reproduction.
- a timeline A indicates the display time (AbsTime), which is terminal-common time information such as an RTC synchronized by means such as NTP.
- a timeline B indicates the PTS of the video of the broadcast stream displayed at each display time (AbsTime) during normal playback.
- the timeline C indicates the PTS of the video of the broadcast stream displayed at the display time (AbsTime) when performing delayed playback when realizing synchronized playback.
- An arrow written between the timeline B and the timeline C indicates a reproduction route for synchronous reproduction. First, buffering is performed between AbsTime 100 and 200, and the data of the broadcast stream and the communication stream is stored in the data buffer.
- FIG. 23 shows a stream decoding means for dealing with this problem.
- a selector 2302 is added in front of the broadcast stream decoder input unit 602, and not only the broadcast stream but also the supplemental stream data can be input.
- the supplemental stream is a stream for making up for the lack of broadcast stream data due to delayed reproduction.
- the supplemental stream is preferably downloaded and stored in advance in a data buffer 2301 (for example, HDD).
- the supplemental stream is played back during the buffering period of the broadcast stream and the communication stream; the selector 2302 added in FIG. 23 is then switched, and the broadcast stream and the communication stream are played back synchronously as soon as the buffering is completed. If the content of PTS 1400-1500 of the broadcast stream is matched with the content of the supplemental stream, the user can view the video of the broadcast stream without loss.
- the display time (AbsTime) at which the synchronization is started can be set by an application API or the like.
- the information may be stored in communication data obtained from a broadcast stream, a communication stream, or an application.
- the method for controlling the display time for starting the synchronization may be realized by the method shown in FIG.
- the playback speed of the synchronous playback may be increased so that, at the end of the synchronous playback, the PTS of the broadcast stream at a given display time (AbsTime) matches that of normal playback.
- fast-forward playback is performed from the PTS 1100 to the PTS 1500 in the timeline C for delayed playback, and ends at the same timing as the timeline B for normal playback.
- information for identifying the delay stop prohibition period may be stored in a broadcast stream, a communication stream, communication data acquired from an application, or the like.
- a delay stop prohibition section is set in PTS1400 to PTS1500 in the broadcast stream.
- even if the user instructs the end of synchronized playback at AbsTime 500, the synchronized playback process is not terminated; delayed playback is continued until the delay stop prohibition period ends, and then normal playback is resumed.
- since the buffering period is 1000 to 1100 and its length is 100, the section between AbsTime 400 and 500, that is, during playback of broadcast stream PTS 1300 to 1400, is processed as a delay stop prohibition section.
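The deferral decision above can be sketched as a predicate. This is a hedged illustration: the mapping from AbsTime to the live (non-delayed) PTS position is an assumption made only to reproduce the example numbers, not a formula from the specification.

```javascript
// Stopping the delay at AbsTime t would jump playback to the live PTS
// position livePtsAt(t). The stop must therefore be deferred while that
// live position falls inside the delay stop prohibition section
// [pStart, pEnd) on the broadcast stream's PTS axis.
function mustDeferStop(absTime, livePtsAt, pStart, pEnd) {
  const livePts = livePtsAt(absTime);
  return livePts >= pStart && livePts < pEnd;
}
```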
- information for identifying a delay permitted / prohibited section for a broadcast stream may be stored in a broadcast stream, a communication stream, communication data acquired from an application, or the like.
- the PTS 1100 to PTS 1500 are set as a delay permission section, and the PTS 1500 and later are set as a delay prohibition section.
- delayed playback is executed from AbsTime200 to AbsTime600, and normal playback is resumed after AbsTime600.
- synchronization may be stopped when an STC discontinuity occurs in the middle of the broadcast stream, or when another video is inserted in the middle of the broadcast stream.
- a synchronization stop signal is stored in a broadcast stream, a communication stream, communication data, and the like, and when the signal is received, the stream decoding means stops synchronous reproduction and returns to normal reproduction.
- an API and an event are provided on the application side, and the application restarts synchronization as needed.
- the synchronous playback stream decoding means provides an API for acquiring a delay time to an application control unit for displaying graphics.
- the delay time may be calculated by calculating the difference between AbsTime of the decoding start time and the buffering start time.
- a rough delay time can be calculated by comparing the current time (AbsTime) with the time data of the TOT packet of the stream currently being decoded.
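The two delay-time estimates just described amount to simple differences; the sketch below only names them. The function names are illustrative.

```javascript
// Exact estimate: difference between the AbsTime at decoding start and
// the AbsTime at buffering start.
function exactDelay(decodeStartAbs, bufferingStartAbs) {
  return decodeStartAbs - bufferingStartAbs;
}

// Rough estimate: compare the current time (AbsTime) with the time data
// carried in the TOT packet of the stream currently being decoded.
function roughDelay(currentAbs, totTime) {
  return currentAbs - totTime;
}
```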
- this is used, for example, when displaying Internet information (for example, Twitter comments) as graphics.
- the data necessary for synchronous playback of the program should be downloaded in advance as much as possible. Therefore, when a viewing reservation or recording reservation for a broadcast program is made, the playback device may download the data necessary for synchronous playback to an HDD or the like. Note that for a recorded program that is to be played back synchronously, the recording process is performed without delay.
- the delayed playback may be stopped immediately and the normal broadcast playback may be resumed.
- this can be realized by always monitoring the broadcast stream stored in the data buffer for an emergency broadcast signal.
- FIG. 27 shows an API list of the synchronous playback control module provided by the application execution unit for realizing the synchronous playback in this embodiment.
- the application is HTML content.
- in FIG. 27, the first column is the method name, the second column is the API description, the third column is the arguments, and the fourth column is comments.
- AddVideo is an API that adds a stream entry including a video stream specified by an HTML5 video tag or the like.
- as arguments, a video object (VideoObject), a master flag (bMaster), a start PTS on the stream (StartPTS), and a PTS difference from the master (ptsOffset) are specified.
- AddAudio is an API that adds a stream entry including an audio stream specified by an HTML5 audio tag or the like. As an argument, an audio object (audioObject), a start PTS (StartPTS) on the stream, and a PTS difference (ptsOffset) from the master are specified. Other parameters include a flag indicating whether or not to return to the stream head at the end of the stream and loop playback, a mixing coefficient with the master audio, and the like.
- AddCanvas is an API that adds a canvas object that is a graphics drawing module of HTML5.
- a canvas object is specified as an argument.
- RemoveVideo, RemoveAudio, and RemoveCanvas are APIs that delete the added video object, audio object, and canvas object, respectively.
- Play is an API for instructing the start of synchronous playback.
- the argument specifies a playback delay time (StartUpDelay) and a playback mode (Mode).
- the playback mode includes a 3D playback mode with the broadcast stream as the left eye and the communication stream as the right eye.
- Stop is an API for stopping synchronous playback, and Pause is an API for pausing synchronous playback.
- GetStartUpDelay is an API that acquires a delay time.
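As a hedged illustration of how an HTML application might call the API of FIG. 27, the following mock stands in for the synchronous playback control module. The object `syncPlayer`, its internal state, and the string arguments are assumptions made only to show the call sequence; they are not the module's actual implementation.

```javascript
// Mock of the synchronous playback control module, showing the call
// sequence only (AddVideo, Play, GetStartUpDelay, Stop from FIG. 27).
const syncPlayer = {
  entries: [],
  startUpDelay: 0,
  AddVideo(videoObject, bMaster, startPTS, ptsOffset) {
    this.entries.push({ videoObject, bMaster, startPTS, ptsOffset });
  },
  Play(startUpDelay, mode) {
    this.startUpDelay = startUpDelay;
    this.mode = mode;
  },
  GetStartUpDelay() {
    return this.startUpDelay;
  },
  Stop() {
    this.entries = [];
  },
};

// Register the broadcast stream as master and the communication stream
// as slave, then start synchronized playback with a 2000 ms start-up delay.
syncPlayer.AddVideo("broadcastVideo", true, 1000, 0);
syncPlayer.AddVideo("commVideo", false, 1000, 32);
syncPlayer.Play(2000, "normal");
```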
- FIG. 28 shows an example of HTML5 content expansion when a broadcast stream (video stream + audio stream) and a communication stream (video stream) are reproduced in synchronization.
- the video tag defines three videos: the first is a supplemental stream, the second is a broadcast stream, and the third is a communication stream.
- FIG. 29 shows an example of HTML5 content extension when a broadcast stream (video stream + audio stream) and a communication stream (audio stream) are synchronously reproduced.
- a broadcast stream is specified by a video tag
- a communication stream is specified by an audio tag.
- FIG. 30 shows an example of HTML5 content extension when a broadcast stream (video stream + audio stream) and graphics data are played back synchronously. The broadcast stream is specified with the video tag and the graphics drawing area with the canvas tag; the JavaScript operation is explained by the comments described in FIG. 30.
- the method for synchronous playback of a broadcast stream and a communication stream has been described; however, it goes without saying that the same configuration can be realized, by changing the input streams, for synchronous playback of a broadcast stream with another broadcast stream, or of a communication stream with another communication stream.
- the encoded stream can be applied to a case where a video stream compressed with reference to a video stream of a broadcast stream (inter-view reference) is stored in a communication stream.
- the compressed video stream such as the inter-view reference can be synchronously reproduced.
- a digital stream in the MPEG-2 TS format is used for transmission on digital TV broadcast waves.
- MPEG-2 TS is a standard for multiplexing and transmitting various streams such as video and audio, standardized in ISO/IEC 13818-1 and ITU-T Recommendation H.222.0.
- FIG. 32 shows the structure of a digital stream in the MPEG-2 TS format.
- MPEG-2 TS is obtained by multiplexing a video stream, an audio stream, a subtitle stream, and the like.
- the video stream stores the main video of the program
- the audio stream stores the main audio portion and sub-audio of the program
- the subtitle stream stores the subtitle information of the program.
- the video stream is encoded and recorded using a method such as MPEG-2 or MPEG4-AVC.
- the audio stream is compressed and encoded and recorded by a method such as Dolby AC-3, MPEG-2 AAC, MPEG-4 AAC, HE-AAC.
- in moving picture compression coding such as MPEG-2, MPEG-4 AVC, and SMPTE VC-1, the data amount is compressed by using redundancy in the spatial and temporal directions of the moving image.
- inter-picture predictive coding is used as a method of using temporal redundancy.
- in inter-picture predictive coding, when a certain picture is coded, a picture that is forward or backward in display time order is used as a reference picture; the amount of motion from the reference picture is detected, and the data amount is compressed by removing the spatial redundancy from the difference value between the motion-compensated picture and the picture to be coded.
- FIG. 38 shows the picture reference structure of a general video stream. The arrows indicate the reference relationships used for compression.
- a picture that does not have a reference picture and performs intra-picture predictive coding using only a picture to be coded is called an I picture.
- a picture is a unit of encoding that includes both a frame and a field.
- a picture that is inter-picture prediction encoded with reference to one already processed picture is called a P picture
- a picture that is inter-picture predictively encoded with reference to two already processed pictures at the same time is called a B picture.
- among B pictures, a picture that is referred to by other pictures is called a Br picture.
- here, a picture having a frame structure or a field having a field structure is referred to as a video access unit.
- the video stream has a hierarchical structure as shown in FIG.
- a video stream is composed of a plurality of GOPs, and by using this as a basic unit for encoding processing, editing of a moving image and random access are possible.
- a GOP is composed of one or more video access units.
- the video access unit is a unit for storing coded data of a picture, and stores data of one frame in the case of a frame structure and one field in the case of a field structure.
- Each video access unit includes an AU identification code, a sequence header, a picture header, supplementary data, compressed picture data, padding data, a sequence end code, a stream end code, and the like.
- each data is stored in units called NAL units.
- AU identification code is a start code indicating the head of the access unit.
- the sequence header is a header that stores common information in a playback sequence composed of a plurality of video access units, and stores information such as resolution, frame rate, aspect ratio, and bit rate.
- the picture header is a header that stores information such as the coding method of the entire picture.
- the supplemental data is additional information that is not essential for decoding the compressed data, and stores, for example, closed caption character information displayed on the TV in synchronization with the video, GOP structure information, and the like.
- the compressed picture data stores compression-encoded picture data.
- the padding data stores meaningless data for formatting. For example, it is used as stuffing data for maintaining a predetermined bit rate.
- the sequence end code is data indicating the end of the reproduction sequence.
- the stream end code is data indicating the end of the bit stream.
- the contents of the AU identification code, sequence header, picture header, supplementary data, compressed picture data, padding data, sequence end code, and stream end code differ depending on the video encoding method.
- in the case of MPEG-4 AVC, the AU identification code is the AU delimiter (Access Unit Delimiter)
- the sequence header is SPS (Sequence Parameter Set)
- the picture header is PPS (Picture Parameter Set)
- supplemental data is SEI (Supplemental Enhancement Information)
- padding data is FillerData
- sequence end code is End of Sequence
- stream end code is End of Stream.
- in the case of MPEG-2, there is no AU identification code; the sequence header is sequence_header, sequence_extension, and group_of_picture_header; the picture header is picture_header and picture_coding_extension; the compressed picture data is a plurality of slices; the supplemental data is user_data; and the sequence end code is sequence_end_code.
- in this case, although there is no AU identification code, the break between access units can be determined by using the start code of each header.
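Locating access-unit breaks by start code can be sketched as a scan for the 3-byte start-code prefix. The function name is illustrative; the prefix bytes and the example code values (0x00 = picture_start_code, 0xB3 = sequence_header_code) are standard MPEG-2 values.

```javascript
// Sketch: every MPEG-2 header begins with the byte sequence
// 0x00 0x00 0x01 followed by a one-byte start code value. Scanning for
// this prefix locates the breaks between access units.
function findStartCodes(bytes) {
  const hits = [];
  for (let i = 0; i + 3 < bytes.length; i++) {
    if (bytes[i] === 0 && bytes[i + 1] === 0 && bytes[i + 2] === 1) {
      hits.push({ offset: i, code: bytes[i + 3] });
    }
  }
  return hits;
}
```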
- the sequence header may be necessary only in the video access unit at the head of the GOP and may not be present in other video access units.
- the picture header may refer to that of the previous video access unit in code order; in that case, there is no picture header in its own video access unit.
- in the video access unit at the head of a GOP, I picture data is stored as the compressed picture data; the AU identification code, sequence header, picture header, and compressed picture data are always stored, and the supplementary data, padding data, sequence end code, and stream end code are stored as necessary.
- video access units other than at the head of a GOP always store the AU identification code and compressed picture data, and store the supplementary data, padding data, sequence end code, and stream end code as necessary.
- Each stream included in the transport stream is identified by a stream identification ID called PID.
- by using the PID, the demultiplexing apparatus can extract the target stream.
- the correspondence between the PID and the stream is stored in the descriptor of the PMT packet described later.
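Extracting a target stream by PID can be sketched as follows. The function names are illustrative; the field layout (13-bit PID in the low 5 bits of byte 1 and all of byte 2, after the 0x47 sync byte) is the standard MPEG-2 TS header layout.

```javascript
// Read the 13-bit PID from a 188-byte TS packet header.
function pidOf(tsPacket) {
  return ((tsPacket[1] & 0x1f) << 8) | tsPacket[2];
}

// Keep only the packets belonging to the stream with the given PID,
// as a demultiplexer would.
function filterByPid(packets, pid) {
  return packets.filter((p) => pidOf(p) === pid);
}
```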
- FIG. 32 schematically shows how the transport stream is multiplexed.
- a video stream 3201 composed of a plurality of video frames and an audio stream 3204 composed of a plurality of audio frames are converted into PES packet sequences 3202 and 3205, respectively, and converted into TS packets 3203 and 3206.
- the data of the subtitle stream 3207 is converted into a PES packet sequence 3208 and further converted into a TS packet 3209.
- the MPEG-2 transport stream 3213 is configured by multiplexing these TS packets into one stream.
- FIG. 35 shows in more detail how the video stream is stored in the PES packet sequence.
- the first level in the figure shows a video frame sequence of the video stream.
- the second level shows a PES packet sequence.
- a plurality of Video Presentation Units in the video stream are divided for each picture, and stored in the payload of the PES packet.
- Each PES packet has a PES header, and a PTS that is a picture display time and a DTS that is a picture decoding time are stored in the PES header.
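Decoding the 33-bit PTS carried in a PES header can be sketched as follows. The function name is illustrative; the 5-byte layout with marker bits is the standard PES encoding. Plain arithmetic is used because a 33-bit value exceeds JavaScript's 32-bit bitwise operators.

```javascript
// Decode a 33-bit PTS from the 5 PES header bytes that carry it:
// byte 0: 4-bit prefix, PTS[32..30], marker bit
// byte 1: PTS[29..22]
// byte 2: PTS[21..15], marker bit
// byte 3: PTS[14..7]
// byte 4: PTS[6..0], marker bit
function decodePts(b) {
  return (((b[0] >> 1) & 0x07) * 0x40000000) + // PTS[32..30] << 30
         (b[1] * 0x400000) +                   // PTS[29..22] << 22
         (((b[2] >> 1) & 0x7f) * 0x8000) +     // PTS[21..15] << 15
         (b[3] * 0x80) +                       // PTS[14..7]  << 7
         ((b[4] >> 1) & 0x7f);                 // PTS[6..0]
}
```

With the usual 90 kHz clock, a decoded value of 90000 corresponds to one second.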
- FIG. 36 shows the data structure of TS packets constituting the transport stream.
- the TS packet is a 188-byte fixed-length packet composed of a 4-byte TS header, an adaptation field, and a TS payload.
- the TS header is composed of transport_priority, PID, adaptation_field_control, and the like.
- the PID is an ID for identifying the stream multiplexed in the transport stream as described above.
- the transport_priority is information for identifying the type of packet in TS packets having the same PID.
- Adaptation_field_control is information for controlling the configuration of the adaptation field and the TS payload. There are cases where only one of the adaptation field and the TS payload exists or both, and adaptation_field_control indicates the presence / absence thereof. When adaptation_field_control is 1, only the TS payload is present, when adaptation_field_control is 2, only the adaptation field is present, and when adaptation_field_control is 3, both the TS payload and the adaptation field are present.
- the adaptation field is a storage area for storing information such as PCR and stuffing data for making the TS packet a fixed length of 188 bytes.
- a PES packet is divided and stored in the TS payload.
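Locating the TS payload from adaptation_field_control can be sketched as follows. The function name is illustrative; the bit positions and the meaning of the values 1, 2, and 3 follow the description above.

```javascript
// adaptation_field_control occupies bits 5-4 of byte 3 of the TS header:
// 1 = payload only, 2 = adaptation field only, 3 = both (0 is reserved).
// When an adaptation field is present, its first byte gives its length,
// so the payload starts after the 4-byte header plus that field.
function payloadOffset(tsPacket) {
  const afc = (tsPacket[3] >> 4) & 0x03;
  if (afc === 0 || afc === 2) return -1; // reserved, or no payload
  if (afc === 1) return 4;               // payload directly after header
  return 4 + 1 + tsPacket[4];            // skip length byte + adaptation field
}
```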
- TS packets included in a transport stream include PAT (Program Association Table), PMT, PCR, and the like in addition to video, audio, and subtitle streams. These packets are called PSI.
- PAT indicates what the PID of the PMT used in the transport stream is, and the PID of the PAT itself is registered as 0.
- the PMT has the PIDs of the streams, such as video, audio, and subtitles, included in the transport stream and the stream attribute information corresponding to each PID, and also has various descriptors related to the transport stream.
- the descriptor includes copy control information for instructing permission / non-permission of copying of the AV stream.
- the PCR has STC time information corresponding to the timing at which the PCR packet is transferred to the decoder, in order to synchronize the arrival time of TS packets at the decoder with the STC, which is the time axis of the PTS/DTS.
- FIG. 37 is a diagram for explaining the data structure of the PMT in detail.
- a PMT header describing the length of data included in the PMT is arranged at the head of the PMT. After that, a plurality of descriptors related to the transport stream are arranged.
- the copy control information described above is described as a descriptor.
- a plurality of pieces of stream information regarding each stream included in the transport stream are arranged after the descriptor.
- the stream information includes a stream descriptor in which a stream type, a stream PID, and stream attribute information (frame rate, aspect ratio, etc.) are described to identify a compression codec of the stream.
- the transport stream shown in the lower part of FIG. 36 is a stream in which TS packets are arranged, and a stream generally used for a broadcast wave is in this format, and is hereinafter referred to as a TS stream.
- the transport stream shown in the lower part of FIG. 39 is a stream in which source packets each having a 4-byte time stamp are arranged at the head of a 188-byte TS packet, and a stream generally transmitted by communication is in this format.
- this format is hereinafter referred to as a TTS stream. The time stamp attached to the head of each TS packet is hereinafter referred to as an ATS, and the ATS indicates the transfer start time of the attached TS packet to the decoder of the stream.
- the number incremented for each source packet from the head of the TTS stream is called an SPN (Source Packet Number).
- a normal broadcast wave is transmitted as a full TS in which TSs for a plurality of channels are multiplexed.
- a full TS is a TS stream composed of a 188-byte fixed-length TS packet sequence.
- in a storage medium such as a BD-RE or HDD, only the necessary channel data is extracted from the full TS and recorded as a partial TS; the partial TS is a TTS stream.
- the display device according to the embodiment and the modifications has been described above as an example of the display device according to the present invention; however, the display device is not limited to those exemplified in the embodiment and the modifications.
- the display device 110 is an example of a configuration that receives the communication video transmitted from the communication service providing station 130 via the Internet communication network 140.
- however, the configuration need not necessarily receive the communication video via the Internet communication network 140.
- a configuration for receiving a communication video transmitted via a broadcast wave, a configuration for receiving a communication video transmitted via a dedicated line, and the like can be considered.
- in the embodiment, when the display timing of the communication video does not need to be corrected, the value of STC_Offset stored in the PSI/SI of the broadcast stream is "0". However, the configuration need not be such that the value of STC_Offset stored in the PSI/SI of the broadcast stream is necessarily "0". As an example, a configuration is conceivable in which, when it is not necessary to correct the display timing of the communication video included in the communication stream, STC_Offset is simply not stored in the PSI/SI of the broadcast stream. In such a configuration, when STC_Offset is not stored in the PSI/SI of the received broadcast stream, the display device 110 performs the same processing as when the value of STC_Offset is "0".
- FIG. 17A illustrates an example in which both the broadcast stream and the communication stream are stored in the data buffer.
- based on the GOP table information and the frame ID of the broadcast wave, the synchronization start PTS or frame ID of the communication stream to be synchronized may be determined, and the data from that point may be requested from the server to start buffering.
- the PCR value of the corresponding packet may be stored in the video packet at the head of each GOP as an adaptation field.
- D1 and D2 need not be calculated by the playback device; they may be transmitted as information in the SI or video stream of the broadcast wave or the communication stream, or may simply be provided as data by communication from a server.
- the communication stream may be an MP4 stream.
- in that case, management information indicating the PTS and DTS corresponding to each frame is separately prepared from the MP4 header information, and processing is performed using that timing information.
- the frame ID is unique within each video stream and has no relationship between streams (that is, even if the same frame ID is given, it does not indicate a synchronization relationship); however, if offset information between the frame IDs included in the broadcast stream and the frame IDs included in the communication stream is separately transmitted, the synchronization relationship between the frame ID of the broadcast stream and the frame ID of the communication stream can be identified.
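The offset-based pairing just described reduces to a single addition; the sketch below only names it. The function name and the example numbers (which match the entry comparison in Modification 7, frame ID 30 against 32) are illustrative.

```javascript
// Given the separately transmitted frame-ID offset between the two
// streams, find the communication-stream frame that synchronizes with a
// given broadcast-stream frame.
function syncedCommFrameId(broadcastFrameId, frameIdOffset) {
  return broadcastFrameId + frameIdOffset;
}
```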
- part or all of the information necessary for realizing the embodiment, modifications, and the like, such as STC_Offset, the GOP information table, frame IDs, D1, D2, D3, and ATC_Offset described therein, may be stored in a broadcast stream, a communication stream, or communication data acquired from an application.
- possible storage locations include a PSI/SI packet such as the PMT or EIT, a BML event message, the user data of each video frame, and the like.
- information regarding the frames in a GOP may be stored together only at the head of the GOP, or only the information for the GOP head may be stored.
- a display device according to one aspect is a display device that displays a separately acquired and stored image in synchronization with a received stream containing a plurality of video frames, and comprises: a receiving unit that receives a stream containing a plurality of video frames and first display timing information that determines a display timing for each of the video frames; a display unit that displays each of the plurality of video frames contained in the stream received by the receiving unit at the display timing determined by the first display timing information contained in that stream; a storage unit that stores an image and second display timing information that determines the display timing of the image; and an acquisition unit that acquires correction information determining a correction amount for displaying the image stored in the storage unit in synchronization with the video frames displayed by the display unit, by correcting the display timing determined by the second display timing information stored in the storage unit. The display unit further displays the image stored in the storage unit at a correction timing obtained by correcting the display timing determined by the second display timing information stored in the storage unit by the correction amount determined by the correction information acquired by the acquisition unit.
- the display device having the above configuration can display the communication video in synchronization with the broadcast video, at the timing intended by the video producer, even when the broadcast time of a program is changed after the communication video corresponding to the program has already been received and stored.
- FIG. 40 is a configuration diagram of the display device 4000 in the above modification.
- the display device 4000 includes a reception unit 4010, a display unit 4020, a storage unit 4030, and an acquisition unit 4040.
- the receiving unit 4010 has a function of receiving a stream including a plurality of video frames and first display timing information that determines display timing for each of the video frames.
- as an example, it is realized as the tuner 501 in the embodiment.
- the display unit 4020 has a function of displaying each of the plurality of video frames contained in the stream received by the receiving unit 4010 at the display timing determined by the first display timing information contained in that stream. As an example, in the embodiment it is realized as the block comprising the broadcast stream decoding unit 510, the communication stream decoding unit 520, the first video plane 552, the second video plane 555, the plane composition processing unit 560, the display 580, and the input start control unit 541.
- the storage unit 4030 has a function of storing an image and second display timing information that determines the display timing of the image. As an example, it is realized as the buffer 542 in the embodiment.
- the acquisition unit 4040 has a function of acquiring correction information that determines a correction amount for displaying the image stored in the storage unit 4030 in synchronization with the video frames displayed by the display unit 4020, by correcting the display timing determined by the second display timing information stored in the storage unit 4030. As an example, it is realized as the synchronization start packet determination unit 540 in the embodiment.
- the display unit 4020 further has a function of displaying the image stored in the storage unit 4030 at a correction timing obtained by correcting the display timing determined by the second display timing information stored in the storage unit 4030 by the correction amount determined by the correction information acquired by the acquisition unit 4040.
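The relationship between the second display timing information and the correction information described above can be sketched as follows (a minimal illustration; the names and the example values are hypothetical, not from the specification):

```python
def correction_timing(second_display_timing: int, correction_amount: int) -> int:
    """Timing at which the display unit 4020 shows the stored image:
    the display timing determined by the second display timing information,
    corrected by the correction amount determined by the correction
    information acquired by the acquisition unit 4040."""
    return second_display_timing + correction_amount

# Hypothetical example: a stored image timed at 2600 with a correction
# amount of -2000 is displayed at the correction timing 600.
print(correction_timing(2600, -2000))  # 600
```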
- the stream received by the receiving unit may further include the correction information, and the acquiring unit may acquire the correction information from the stream received by the receiving unit.
- the correction information can be acquired from the stream received by the receiving unit.
- the display unit may have a display for displaying each of the plurality of video frames contained in the stream received by the receiving unit and the image stored in the storage unit, and the display unit may superimpose the image on the video frame when displaying, on the display, a video frame contained in the stream received by the receiving unit and the image stored in the storage unit.
- the image stored in the storage unit can be superimposed and displayed on the video received by the reception unit.
- the display device may further comprise a sub-receiving unit that receives a stream containing a plurality of video frames and information determining a display timing for each of the video frames; the image stored in the storage unit may be the plurality of video frames received by the sub-receiving unit, and the second display timing information stored in the storage unit may be the information received by the sub-receiving unit.
- the video received by the sub-receiving unit can be displayed in synchronization with the video received by the receiving unit.
- the receiving unit may have a broadcast wave receiving unit that receives a broadcast wave, transmitted from a broadcast station, carrying data including a stream; the sub-receiving unit may have a transmission signal receiving unit that receives, from an external network, a transmission signal carrying data including a stream; the stream received by the receiving unit may be a stream carried by the broadcast wave received by the broadcast wave receiving unit; and the stream received by the sub-receiving unit may be a stream carried by the transmission signal received by the transmission signal receiving unit.
- the stream carried by the broadcast wave received by the broadcast wave receiving unit and the stream carried by the transmission signal received by the transmission signal receiving unit may be MPEG (Moving Picture Experts Group)-2 TS (Transport Stream) format streams.
- the receiving unit and the sub-receiving unit can be realized by a relatively general method.
- the display device may further comprise a sub-receiving unit that receives a stream containing a plurality of video frames and information determining a display timing for each of the video frames; the image stored in the storage unit may be the plurality of video frames received by the sub-receiving unit, and the second display timing information stored in the storage unit may be the information received by the sub-receiving unit, with both the stream received by the receiving unit and the stream received by the sub-receiving unit being MPEG-2 TS format streams. The display unit may comprise: a first decoder that restores video frames from the stream received by the receiving unit; a second decoder that restores video frames from the stream received by the sub-receiving unit; a first arrival time counter that assigns, to the stream received by the receiving unit, a first arrival time according to the timing at which that stream is input to the first decoder; a second arrival time counter that assigns, to the stream received by the sub-receiving unit, a second arrival time according to the timing at which that stream is input to the second decoder; and a delay input unit that uses the first arrival time assigned to the stream received by the receiving unit and the second arrival time assigned to the stream received by the sub-receiving unit to input the stream received by the sub-receiving unit to the second decoder at a timing delayed by a predetermined time with respect to the timing at which the stream received by the receiving unit is input to the first decoder.
- the video received by the sub-receiving unit can be displayed in synchronization with the video received by the receiving unit.
- a multiplexing unit that generates a stream by multiplexing the video frame restored by the first decoder and the video frame restored by the second decoder may be further provided.
- the display device may further comprise a sub-receiving unit that receives a stream containing a plurality of video frames and information determining a display timing for each of the video frames; the image stored in the storage unit may be the plurality of video frames received by the sub-receiving unit, and the second display timing information stored in the storage unit may be the information received by the sub-receiving unit, with both the stream received by the receiving unit and the stream received by the sub-receiving unit being MPEG-2 TS format streams. The display unit may comprise: a first decoder that restores video frames from the stream received by the receiving unit; a second decoder that restores video frames from the stream received by the sub-receiving unit; and an arrival time counter that assigns, to the stream received by the receiving unit, a first arrival time according to the timing at which that stream is input to the first decoder, and assigns, to the stream received by the sub-receiving unit, a second arrival time according to the timing at which that stream is input to the second decoder. The second decoder may output each restored video frame at an output timing such that the video frame is displayed by the display unit at the correction timing.
- the display device may further comprise a sub-receiving unit that receives a stream containing a plurality of video frames and information determining a display timing for each of the video frames; the image stored in the storage unit may be the plurality of video frames received by the sub-receiving unit, the second display timing information stored in the storage unit may be the information received by the sub-receiving unit, and the stream received by the receiving unit may be an MPEG-2 TS format stream. The display unit may comprise a first decoder that restores video frames from the stream received by the receiving unit and a second decoder that restores video frames from the stream received by the sub-receiving unit, and the second decoder may output each restored video frame at an output timing such that the video frame is displayed by the display unit at the correction timing.
- the display device may further comprise a sub-receiving unit that receives a stream containing a plurality of video frames and information determining a display timing for each of the video frames; the image stored in the storage unit may be the plurality of video frames received by the sub-receiving unit, and the second display timing information stored in the storage unit may be the information received by the sub-receiving unit, with both the stream received by the receiving unit and the stream received by the sub-receiving unit being MPEG-2 TS format streams. The display unit may comprise: a first decoder that restores video frames from the stream received by the receiving unit; a second decoder that restores video frames from the stream received by the sub-receiving unit; a first arrival time counter that assigns, to the stream received by the receiving unit, a first arrival time according to the timing at which that stream is input to the first decoder; and a second arrival time counter that assigns, to the stream received by the sub-receiving unit, a second arrival time according to the timing at which that stream is input to the second decoder. The second decoder may output each restored video frame at an output timing such that the video frame is displayed by the display unit at the correction timing, using the time on a reference time axis corresponding to the first arrival time assigned to the stream received by the receiving unit, the time on that reference time axis corresponding to the second arrival time assigned to the stream received by the sub-receiving unit, and the correction amount determined by the correction information.
- the image stored in the storage unit may be a computer graphics image, and the display unit may have a decoder that restores video frames from the stream received by the receiving unit and an image output unit that outputs the image stored in the storage unit at an output timing such that the image is displayed by the display unit at the correction timing.
- a transmission device according to one aspect comprises: a video storage unit that stores a plurality of video frames; a generation unit that generates a stream by multiplexing the plurality of video frames stored in the video storage unit, first display timing information that determines a display timing for each of the video frames, and correction information that determines a correction amount for causing an external device, which stores an image and second display timing information determining the timing at which the image is displayed, to display the image in synchronization with the video frames stored in the video storage unit by correcting the display timing determined by the second display timing information; and a transmission unit that transmits the stream generated by the generation unit.
- according to the transmission device of the present modification having the above configuration, it is possible to provide a transmission device capable of multiplexing and transmitting a plurality of video frames, first display timing information, and correction information.
- the present invention can be widely used in devices that play back broadcast content and accompanying content in synchronization.
Abstract
Description
<Overview>
Hereinafter, as one embodiment of a display device according to the present invention, a display device will be described that receives broadcast video transmitted from a broadcast station via a broadcast wave, receives communication video transmitted from a communication service providing station via an Internet connection, and displays the received broadcast video and the received communication video in synchronization, at the timing intended by the video producer.
FIG. 1 is a conceptual diagram of a broadcast system 100 including a broadcast station 120, a communication service providing station 130, and a display device 110.
[Equation 1]
D1 = SyncPTS1 − PCR(PCR1) − ATS(V1) + ATS(PCR1)
Further, if SyncPTS2 denotes the timing at which the video frame obtained by decoding TS packet 840 is output to the second video plane 555, D2 can be expressed by the following equation.
[Equation 2]
D2 = SyncPTS2 − PCR(PCR2) − ATS(V2) + ATS(PCR2)
Accordingly, D1 − D2 is expressed as
[Equation 3]
D1 − D2 = [ SyncPTS1 − PCR(PCR1) − ATS(V1) + ATS(PCR1) ]
 − [ SyncPTS2 − PCR(PCR2) − ATS(V2) + ATS(PCR2) ]
[Equation 4]
D1 − D2 = 2600 − 2200 − 1300 + 1200
 − [ 600 − 300 − 3400 + 3300 ]
 = 300 − 200
 = 100
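The decoder-input delays D1 and D2 defined by [Equation 1] and [Equation 2] can be checked numerically (a sketch, not part of the specification; it assumes ATS(PCR2) = 3300, the ATC2 value at PCR2 that appears in the later [Equation 7] substitution, which is what makes the stated result 100 come out):

```python
def decoder_delay(sync_pts: int, pcr: int, ats_v: int, ats_pcr: int) -> int:
    """D = SyncPTS - PCR(PCR) - ATS(V) + ATS(PCR): the offset between a
    stream's decoder-input clock (ATC) and its display timing (PTS/STC)."""
    return sync_pts - pcr - ats_v + ats_pcr

# Broadcast stream: SyncPTS1=2600, PCR(PCR1)=2200, ATS(V1)=1300, ATS(PCR1)=1200
d1 = decoder_delay(2600, 2200, 1300, 1200)   # 300
# Communication stream: SyncPTS2=600, PCR(PCR2)=300, ATS(V2)=3400, ATS(PCR2)=3300
d2 = decoder_delay(600, 300, 3400, 3300)     # 200
print(d1 - d2)                               # 100, matching [Equation 4]
```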
As its characteristic operation, the display device 110 calculates the output start timing of the broadcast-stream TS packet sequence to the decoder input unit 602 and the output start timing of the communication-stream TS packet sequence to the decoder input unit 612 so that the broadcast video contained in the broadcast-stream TS packet sequence and the communication video contained in the communication-stream TS packet sequence are displayed in synchronization at the timing intended by the video producer, and performs output start processing that starts outputting the broadcast-stream TS packet sequence and the communication-stream TS packet sequence at the calculated output start timings.
FIG. 9 is a flowchart of the output start processing.
The display device 110 configured as described above displays each of the video frames constituting the communication stream at the timing obtained by adding the value of STC_Offset to the PTS associated with that video frame. In this way, it displays the received broadcast video and the received communication video in synchronization, at the timing intended by the video producer.
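The display rule just described, in which each communication-stream frame is shown at its own PTS shifted by STC_Offset, can be sketched as follows (the names are hypothetical; the example value STC_Offset = −2000 is the one used later with FIG. 8):

```python
def communication_display_timing(pts: int, stc_offset: int) -> int:
    """Display timing of a communication-stream video frame on the
    broadcast stream's time axis: its own PTS plus STC_Offset."""
    return pts + stc_offset

# Illustration: with STC_Offset = -2000, a communication frame whose
# PTS is 2600 is displayed at time 600 on the broadcast time axis.
print(communication_display_timing(2600, -2000))  # 600
```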
<Modification 1>
<Overview>
Hereinafter, as one embodiment of the display device according to the present invention, a first modified display device obtained by modifying part of the display device 110 of the embodiment will be described.
The first modified display device is obtained from the display device 110 of the embodiment by replacing the main video decoding unit with a first modified main video decoding unit.
[Equation 5]
ATC_Offset = ATC2(SyncPTS2) − ATC1(SyncPTS1)
This equation can be transformed as follows.
[Equation 6]
ATC_Offset = [ SyncPTS2 + ATC2(PCR2) − PCR(PCR2) ]
 − [ SyncPTS1 + ATC1(PCR1) − PCR(PCR1) ]
 = STC_Offset + [ ATC2(PCR2) − PCR(PCR2) ]
 − [ ATC1(PCR1) − PCR(PCR1) ]
Substituting the values shown in FIG. 8 into this equation gives
[Equation 7]
ATC_Offset = −2000 + [ 3300 − 300 ] − [ 1200 − 2200 ]
 = 2000
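The transformed expression in [Equation 6] and the substitution in [Equation 7] can be reproduced directly (a sketch for checking the arithmetic only; the function name is hypothetical):

```python
def atc_offset(stc_offset: int, atc2_pcr2: int, pcr_pcr2: int,
               atc1_pcr1: int, pcr_pcr1: int) -> int:
    """ATC_Offset = STC_Offset + [ATC2(PCR2) - PCR(PCR2)]
                               - [ATC1(PCR1) - PCR(PCR1)]  ([Equation 6])."""
    return stc_offset + (atc2_pcr2 - pcr_pcr2) - (atc1_pcr1 - pcr_pcr1)

# Values from FIG. 8: STC_Offset = -2000, ATC2(PCR2) = 3300, PCR(PCR2) = 300,
# ATC1(PCR1) = 1200, PCR(PCR1) = 2200.
print(atc_offset(-2000, 3300, 300, 1200, 2200))  # 2000, matching [Equation 7]
```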
The first modified display device configured as described above decodes the broadcast stream along the STC1 time axis, and decodes the video stream along the STC2 time axis, obtained by adding STC_Offset to the STC1 time axis. In this way, it displays the received broadcast video and the received communication video in synchronization, at the timing intended by the video producer.
<Modification 2>
<Overview>
Hereinafter, as one embodiment of the display device according to the present invention, a second modified display device obtained by modifying part of the first modified display device of Modification 1 will be described.
The second modified display device is obtained from the first modified display device of Modification 1 by replacing the first modified main video decoding unit with a second modified main video decoding unit.
Like the first modified display device of Modification 1, the second modified display device configured as described above decodes the broadcast stream along the STC1 time axis, and decodes the video stream along the STC2 time axis, obtained by adding STC_Offset to the STC1 time axis. In this way, it displays the received broadcast video and the received communication video in synchronization, at the timing intended by the video producer.
<Modification 3>
<Overview>
Hereinafter, as one embodiment of the display device according to the present invention, a third modified display device obtained by modifying part of the display device 110 of the embodiment will be described.
The third modified display device is obtained from the display device 110 of the embodiment by replacing the main video decoding unit with a third modified main video decoding unit.
According to the third modified display device configured as described above, the combined stream generated by the multiplexer 1260 is multiplexed with STC_offset taken into account. As a result, the combined stream generated by the multiplexer 1260 is equivalent to an ordinary TS stream. It is therefore possible to store the combined stream generated by the multiplexer 1260 on an optical disc or the like and play it back on another playback device.
<Modification 4>
<Overview>
Hereinafter, as one embodiment of the display device according to the present embodiment, a fourth modified display device obtained by modifying part of the display device 110 of the embodiment will be described.
The fourth modified display device is obtained from the display device 110 of the embodiment by replacing the main video decoding unit with a fourth modified main video decoding unit.
According to the fourth modified display device configured as described above, the image stored in the first video plane 552 is displayed on a display provided in a first housing, and the image stored in the second video plane 555 is displayed on a display provided in a second housing, whereby the received broadcast video and the received communication video can be displayed on the first housing and the second housing, respectively, in synchronization at the timing intended by the video producer.
<Modification 5>
<Overview>
Hereinafter, as one embodiment of the display device according to the present invention, a fifth modified display device obtained by modifying part of the first modified display device of Modification 1 will be described.
The fifth modified display device is obtained from the first modified display device of Modification 1 by replacing the first modified main video decoding unit with a fifth modified main video decoding unit.
Like the first modified display device of Modification 1, the fifth modified display device configured as described above decodes the broadcast stream along the STC1 time axis, and renders and outputs CG images along the STC2 time axis, obtained by adding STC_Offset to the STC1 time axis. In this way, it displays the received broadcast video and the received CG images in synchronization, at the timing intended by the video producer.
<Modification 6>
<Overview>
Hereinafter, as one embodiment of the display device according to the present invention, a sixth modified display device obtained by modifying part of the fifth modified display device of Modification 5 will be described.
The sixth modified display device is obtained from the fifth modified display device of Modification 5 by replacing the fifth modified main video decoding unit with a sixth modified main video decoding unit.
<Modification 7>
<Overview>
Hereinafter, as one embodiment of the display device according to the present invention, a seventh modified display device obtained by modifying part of the display device 110 of the embodiment will be described.
The seventh modified display device is obtained from the display device 110 of the embodiment by replacing the main video decoding unit with a seventh modified main video decoding unit.
<Modification 8>
<Overview>
Hereinafter, as one embodiment of the display device according to the present invention, an eighth modified display device obtained by modifying part of the display device 110 of the embodiment will be described.
The eighth modified display device is obtained from the display device 110 of the embodiment by adding a converter 2100 between the tuner 501 and the first demultiplexing unit 511 (see FIG. 5).
<Other modifications>
To realize synchronized playback of a broadcast stream and a communication stream, it is necessary to buffer the broadcast stream and the communication stream and start playback once the data is available, and delayed playback is also required in consideration of the network delay of the communication network.
…shows the PTS of the broadcast-stream video displayed at that time. Timeline C shows the PTS of the broadcast-stream video displayed at each display time (AbsTime) when delayed playback is performed to realize synchronized playback. The arrows drawn between timeline B and timeline C indicate the playback route of the synchronized playback. First, buffering is performed between AbsTime 100 and 200, and the data of the broadcast stream and the communication stream are stored in the data buffer. Next, between AbsTime 200 and 500, synchronized playback of the broadcast stream and the communication stream is executed. The synchronized playback ends at AbsTime 500, and normal playback resumes from AbsTime 500. In such a case, a problem occurs in that the data from PTS 1400 to PTS 1500 of the broadcast stream is never displayed. For example, if this scene is a commercial, users viewing the synchronized display do not see the commercial, which raises the problem that the broadcast station's business model no longer holds.
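The skipped range in the timeline above can be modeled as follows. This is a sketch under the assumption of a simple linear mapping from AbsTime to broadcast PTS; the `pts_at` mapping and all names are hypothetical and chosen only so the example reproduces the figures in the text:

```python
def skipped_pts_range(buffer_start, buffer_end, sync_end, pts_at):
    """While buffering during [buffer_start, buffer_end], playback falls
    behind live by (buffer_end - buffer_start). When normal (live) playback
    resumes at sync_end, the broadcast PTS jumps forward by that delay, so
    the PTS interval (last_shown, first_live) is never displayed."""
    delay = buffer_end - buffer_start
    last_shown = pts_at(sync_end) - delay  # last PTS shown in delayed playback
    first_live = pts_at(sync_end)          # first PTS shown back on live
    return last_shown, first_live

# Illustration matching the example: buffering from AbsTime 100 to 200,
# synchronized playback until AbsTime 500, assuming PTS = AbsTime + 1000:
print(skipped_pts_range(100, 200, 500, lambda t: t + 1000))  # (1400, 1500)
```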
<Description of technologies used>
The technologies used in the embodiment, modifications, and the like described above are explained below.
<Supplement>
While embodiments of the display device according to the present invention have been described above using the display devices of the embodiment, modifications, and the like as examples, the following variations are also possible, and the present invention is of course not limited to the display devices exemplified in the embodiment, modifications, and the like described above.
502 Communication interface
510 Broadcast stream decoding unit
511 First demultiplexing unit
512 First audio decoder
513 Subtitle decoder
514 First video decoder
515 System packet manager
520 Communication stream decoding unit
521 Second demultiplexing unit
523 Second video decoder
522 Second audio decoder
530 Application execution control unit
540 Synchronization start packet determination unit
541 Input start control unit
552 First video plane
554 Graphics plane
555 Second video plane
560 Plane composition processing unit
Claims (13)
- A display device that displays a separately acquired and stored image in synchronization with a received stream containing a plurality of video frames, the display device comprising:
a receiving unit that receives a stream containing a plurality of video frames and first display timing information that determines a display timing for each of the video frames;
a display unit that displays each of the plurality of video frames contained in the stream received by the receiving unit at the display timing determined by the first display timing information contained in the stream received by the receiving unit;
a storage unit that stores an image and second display timing information that determines a display timing of the image; and
an acquisition unit that acquires correction information determining a correction amount for displaying the image stored in the storage unit in synchronization with the video frames displayed by the display unit, by correcting the display timing determined by the second display timing information stored in the storage unit,
wherein the display unit further displays the image stored in the storage unit at a correction timing obtained by correcting the display timing determined by the second display timing information stored in the storage unit by the correction amount determined by the correction information acquired by the acquisition unit.
- The display device according to claim 1, wherein the stream received by the receiving unit further contains the correction information, and
the acquisition unit acquires the correction information from the stream received by the receiving unit.
- The display device according to claim 2, wherein the display unit has a display for displaying each of the plurality of video frames contained in the stream received by the receiving unit and the image stored in the storage unit, and
when displaying on the display a video frame contained in the stream received by the receiving unit and the image stored in the storage unit, the display unit superimposes the image on the video frame.
- The display device according to claim 3, further comprising a sub-receiving unit that receives a stream containing a plurality of video frames and information determining a display timing for each of the video frames,
wherein the image stored in the storage unit is the plurality of video frames received by the sub-receiving unit, and
the second display timing information stored in the storage unit is the information received by the sub-receiving unit.
- The display device according to claim 4, wherein the receiving unit has a broadcast wave receiving unit that receives a broadcast wave, broadcast from a broadcast station, carrying data including a stream,
the sub-receiving unit has a transmission signal receiving unit that receives, from an external network, a transmission signal carrying data including a stream,
the stream received by the receiving unit is a stream carried by the broadcast wave received by the broadcast wave receiving unit, and
the stream received by the sub-receiving unit is a stream carried by the transmission signal received by the transmission signal receiving unit.
- The display device according to claim 5, wherein the stream carried by the broadcast wave received by the broadcast wave receiving unit and the stream carried by the transmission signal received by the transmission signal receiving unit are MPEG (Moving Picture Experts Group)-2 TS (Transport Stream) format streams.
- The display device according to claim 1, further comprising a sub-receiving unit that receives a stream containing a plurality of video frames and information determining a display timing for each of the video frames,
wherein the image stored in the storage unit is the plurality of video frames received by the sub-receiving unit,
the second display timing information stored in the storage unit is the information received by the sub-receiving unit,
the stream received by the receiving unit and the stream received by the sub-receiving unit are MPEG-2 TS format streams,
the display unit comprises:
a first decoder that restores video frames from the stream received by the receiving unit;
a second decoder that restores video frames from the stream received by the sub-receiving unit;
a first arrival time counter that assigns, to the stream received by the receiving unit, a first arrival time according to the timing at which that stream is input to the first decoder;
a second arrival time counter that assigns, to the stream received by the sub-receiving unit, a second arrival time according to the timing at which that stream is input to the second decoder; and
a delay input unit that uses the first arrival time assigned to the stream received by the receiving unit and the second arrival time assigned to the stream received by the sub-receiving unit to input the stream received by the sub-receiving unit to the second decoder at a timing delayed by a predetermined time with respect to the timing at which the stream received by the receiving unit is input to the first decoder, and
the second decoder outputs each restored video frame at an output timing such that the video frame is displayed by the display unit at the correction timing.
- The display device according to claim 7, further comprising a multiplexing unit that generates a stream by multiplexing the video frames restored by the first decoder and the video frames restored by the second decoder.
- The display device according to claim 1, further comprising a sub-receiving unit that receives a stream containing a plurality of video frames and information determining a display timing for each of the video frames,
wherein the image stored in the storage unit is the plurality of video frames received by the sub-receiving unit,
the second display timing information stored in the storage unit is the information received by the sub-receiving unit,
the stream received by the receiving unit and the stream received by the sub-receiving unit are MPEG-2 TS format streams,
the display unit comprises:
a first decoder that restores video frames from the stream received by the receiving unit;
a second decoder that restores video frames from the stream received by the sub-receiving unit;
an arrival time counter that assigns, to the stream received by the receiving unit, a first arrival time according to the timing at which that stream is input to the first decoder, and assigns, to the stream received by the sub-receiving unit, a second arrival time according to the timing at which that stream is input to the second decoder; and
a delay input unit that uses the first arrival time assigned to the stream received by the receiving unit and the second arrival time assigned to the stream received by the sub-receiving unit to input the stream received by the sub-receiving unit to the second decoder at a timing delayed by a predetermined time with respect to the timing at which the stream received by the receiving unit is input to the first decoder, and
the second decoder outputs each restored video frame at an output timing such that the video frame is displayed by the display unit at the correction timing.
- The display device according to claim 1, further comprising a sub-receiving unit that receives a stream containing a plurality of video frames and information determining a display timing for each of the video frames,
wherein the image stored in the storage unit is the plurality of video frames received by the sub-receiving unit,
the second display timing information stored in the storage unit is the information received by the sub-receiving unit,
the stream received by the receiving unit is an MPEG-2 TS format stream,
the display unit comprises:
a first decoder that restores video frames from the stream received by the receiving unit; and
a second decoder that restores video frames from the stream received by the sub-receiving unit, and
the second decoder outputs each restored video frame at an output timing such that the video frame is displayed by the display unit at the correction timing.
- The display device according to claim 1, further comprising a sub-receiving unit that receives a stream containing a plurality of video frames and information determining a display timing for each of the video frames,
wherein the image stored in the storage unit is the plurality of video frames received by the sub-receiving unit,
the second display timing information stored in the storage unit is the information received by the sub-receiving unit,
the stream received by the receiving unit and the stream received by the sub-receiving unit are MPEG-2 TS format streams,
the display unit comprises:
a first decoder that restores video frames from the stream received by the receiving unit;
a second decoder that restores video frames from the stream received by the sub-receiving unit;
a first arrival time counter that assigns, to the stream received by the receiving unit, a first arrival time according to the timing at which that stream is input to the first decoder; and
a second arrival time counter that assigns, to the stream received by the sub-receiving unit, a second arrival time according to the timing at which that stream is input to the second decoder, and
the second decoder outputs each restored video frame at an output timing such that the video frame is displayed by the display unit at the correction timing, using a first reference-time-axis time corresponding, on a reference time axis, to the first arrival time assigned to the stream received by the receiving unit, a second reference-time-axis time corresponding, on that reference time axis, to the second arrival time assigned to the stream received by the sub-receiving unit, and the correction amount determined by the correction information acquired by the acquisition unit.
- The display device according to claim 1, wherein the image stored in the storage unit is a computer graphics image, and
the display unit has:
a decoder that restores video frames from the stream received by the receiving unit; and
an image output unit that outputs the image stored in the storage unit at an output timing such that the image is displayed by the display unit at the correction timing.
- A transmission device comprising:
a video storage unit that stores a plurality of video frames;
a generation unit that generates a stream by multiplexing the plurality of video frames stored in the video storage unit, first display timing information that determines a display timing for each of the video frames, and correction information that determines a correction amount for causing an external device, which stores an image and second display timing information determining a timing at which the image is displayed, to display the image in synchronization with the video frames stored in the video storage unit by correcting the display timing determined by the second display timing information; and
a transmission unit that transmits the stream generated by the generation unit.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/115,967 US20140079368A1 (en) | 2012-03-12 | 2013-03-08 | Display device and transmission device |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201261609492P | 2012-03-12 | 2012-03-12 | |
US61/609,492 | 2012-03-12 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2013136754A1 true WO2013136754A1 (ja) | 2013-09-19 |
Family
ID=49160686
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2013/001538 WO2013136754A1 (ja) | 2012-03-12 | 2013-03-08 | 表示装置、及び送信装置 |
Country Status (3)
Country | Link |
---|---|
US (1) | US20140079368A1 (ja) |
JP (1) | JPWO2013136754A1 (ja) |
WO (1) | WO2013136754A1 (ja) |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2016192741A (ja) * | 2015-03-31 | 2016-11-10 | 日本放送協会 | 放送装置、マッピング情報送信装置、受信機、提示対象データ送信装置、放送通信データ同期システムおよびプログラム |
JP2017508327A (ja) * | 2013-12-23 | 2017-03-23 | エルジー エレクトロニクス インコーポレイティド | 一つ以上のネットワークで放送コンテンツを送受信する装置及び方法 |
JP2018129844A (ja) * | 2018-03-23 | 2018-08-16 | ソニー株式会社 | 送信装置、送信方法、受信装置および受信方法 |
JP2019146188A (ja) * | 2019-03-26 | 2019-08-29 | ソニー株式会社 | 送信装置、送信方法、受信装置および受信方法 |
JP2020188497A (ja) * | 2019-03-26 | 2020-11-19 | ソニー株式会社 | 送信方法 |
JP2021166396A (ja) * | 2020-07-29 | 2021-10-14 | ソニーグループ株式会社 | 送信方法 |
US11259070B2 (en) | 2019-04-11 | 2022-02-22 | Kabushiki Kaisha Toshiba | Packet generation apparatus and method |
US11405679B2 (en) | 2015-02-02 | 2022-08-02 | Maxell, Ltd. | Broadcast receiving apparatus, broadcast receiving method, and contents outputting method |
Families Citing this family (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP2713371A3 (en) * | 2012-09-27 | 2014-08-06 | JVC KENWOOD Corporation | Imaging device, image processing device, imaging method, image processing method, and computer program product |
EP3697100A1 (en) * | 2013-06-05 | 2020-08-19 | Sun Patent Trust | Data decoding method, data decoding apparatus, and data transmitting method |
KR20150033827A (ko) * | 2013-09-24 | 2015-04-02 | 삼성전자주식회사 | 영상표시장치, 서버 및 그 동작방법 |
KR20150057149A (ko) * | 2013-11-18 | 2015-05-28 | 한국전자통신연구원 | 재전송망에 기초한 3d 방송 서비스 제공 시스템 및 방법 |
EP2919458A1 (en) * | 2014-03-11 | 2015-09-16 | Axis AB | Method and system for playback of motion video |
US20150334471A1 (en) * | 2014-05-15 | 2015-11-19 | Echostar Technologies L.L.C. | Multiple simultaneous audio video data decoding |
GB201421304D0 (en) * | 2014-12-01 | 2015-01-14 | Pace Plc | Improvements to television service and system |
CN111711851B (zh) | 2015-07-24 | 2022-11-22 | 麦克赛尔株式会社 | 广播接收装置和接收装置 |
US11445223B2 (en) | 2016-09-09 | 2022-09-13 | Microsoft Technology Licensing, Llc | Loss detection for encoded video transmission |
US10979785B2 (en) * | 2017-01-20 | 2021-04-13 | Hanwha Techwin Co., Ltd. | Media playback apparatus and method for synchronously reproducing video and audio on a web browser |
US10834298B1 (en) * | 2019-10-14 | 2020-11-10 | Disney Enterprises, Inc. | Selective audio visual synchronization for multiple displays |
CN111510772B (zh) * | 2020-03-23 | 2022-03-29 | 珠海亿智电子科技有限公司 | 一种平衡视频帧率误差的方法、装置、设备及存储介质 |
CN115720292B (zh) * | 2021-08-23 | 2024-08-23 | 北京字跳网络技术有限公司 | 视频录制方法、设备、存储介质及程序产品 |
EP4199525A1 (de) * | 2021-12-20 | 2023-06-21 | The Color Grading Company GmbH | Computerimplementiertes verfahren zur wahlweisen oder gleichzeitigen anzeige zumindest zweier videos |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2012010009A (ja) * | 2010-06-23 | 2012-01-12 | Nippon Hoso Kyokai <Nhk> | 送信装置、サーバ装置、および受信装置 |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20090051759A1 (en) * | 2005-05-27 | 2009-02-26 | Adkins Sean M | Equipment and methods for the synchronization of stereoscopic projection displays |
KR20120107258A (ko) * | 2011-03-21 | 2012-10-02 | 삼성전자주식회사 | 디스플레이장치 및 그 제어방법과, 셔터 안경 및 그 제어방법 |
-
2013
- 2013-03-08 US US14/115,967 patent/US20140079368A1/en not_active Abandoned
- 2013-03-08 WO PCT/JP2013/001538 patent/WO2013136754A1/ja active Application Filing
- 2013-03-08 JP JP2014504687A patent/JPWO2013136754A1/ja active Pending
Patent Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2012010009A (ja) * | 2010-06-23 | 2012-01-12 | Nippon Hoso Kyokai <Nhk> | 送信装置、サーバ装置、および受信装置 |
Cited By (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2017508327A (ja) * | 2013-12-23 | 2017-03-23 | エルジー エレクトロニクス インコーポレイティド | 一つ以上のネットワークで放送コンテンツを送受信する装置及び方法 |
US10080055B2 (en) | 2013-12-23 | 2018-09-18 | Lg Electronics Inc. | Apparatuses and methods for transmitting or receiving a broadcast content via one or more networks |
US11871071B2 (en) | 2015-02-02 | 2024-01-09 | Maxell, Ltd. | Broadcast receiving apparatus, broadcast receiving method, and contents outputting method |
US11405679B2 (en) | 2015-02-02 | 2022-08-02 | Maxell, Ltd. | Broadcast receiving apparatus, broadcast receiving method, and contents outputting method |
JP2016192741A (ja) * | 2015-03-31 | 2016-11-10 | 日本放送協会 | 放送装置、マッピング情報送信装置、受信機、提示対象データ送信装置、放送通信データ同期システムおよびプログラム |
JP2018129844A (ja) * | 2018-03-23 | 2018-08-16 | ソニー株式会社 | 送信装置、送信方法、受信装置および受信方法 |
JP2019146188A (ja) * | 2019-03-26 | 2019-08-29 | ソニー株式会社 | 送信装置、送信方法、受信装置および受信方法 |
JP2020188497A (ja) * | 2019-03-26 | 2020-11-19 | ソニー株式会社 | 送信方法 |
US11259070B2 (en) | 2019-04-11 | 2022-02-22 | Kabushiki Kaisha Toshiba | Packet generation apparatus and method |
JP2021166396A (ja) * | 2020-07-29 | 2021-10-14 | ソニーグループ株式会社 | 送信方法 |
JP2022159374A (ja) * | 2020-07-29 | 2022-10-17 | ソニーグループ株式会社 | 送信方法 |
JP7371734B2 (ja) | 2020-07-29 | 2023-10-31 | ソニーグループ株式会社 | 送信方法 |
JP7120399B2 (ja) | 2020-07-29 | 2022-08-17 | ソニーグループ株式会社 | 送信方法 |
Also Published As
Publication number | Publication date |
---|---|
US20140079368A1 (en) | 2014-03-20 |
JPWO2013136754A1 (ja) | 2015-08-03 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2013136754A1 (ja) | 表示装置、及び送信装置 | |
JP5137345B2 (ja) | ローカル3次元ビデオを具現するためのエンコーディング/デコーディング方法及び装置 | |
KR100782835B1 (ko) | 캡션 정보의 출력시점 및 출력 우선순위를 조절하는 방법및 그 장치 | |
JP6184408B2 (ja) | 受信装置及びその受信方法 | |
KR100711328B1 (ko) | 데이터 처리 장치 및 방법 | |
US7266288B2 (en) | Video/audio playback apparatus and video/audio playback method | |
JP2008011404A (ja) | コンテンツ処理装置及びコンテンツ処理方法 | |
JP2014504083A (ja) | マルチメディアコンテンツを送受信する送信装置および受信装置、その再生方法 | |
US20060203287A1 (en) | Reproducing apparatus and method, and recording medium | |
WO2013175718A1 (ja) | 受信装置、送信装置、受信方法、及び送信方法 | |
WO2013099290A1 (ja) | 映像再生装置、映像再生方法、映像再生プログラム、映像送信装置、映像送信方法及び映像送信プログラム | |
US20080037956A1 (en) | Systems and Methods of Generating Encapsulated MPEG Program Streams | |
WO2013011696A1 (ja) | 送信装置、受信再生装置、送信方法及び受信再生方法 | |
JP4613860B2 (ja) | Mpeg符号化ストリーム復号装置 | |
WO2004086396A1 (en) | Reproducing apparatus and method, and recording medium | |
WO2015174207A1 (ja) | 受信装置、および送信装置、並びにデータ処理方法 | |
JP2009044282A (ja) | デジタル映像データ再生装置及び表示装置 | |
JPH099215A (ja) | データ多重方法、データ伝送方法、及び多重データ復号方法、多重データ復号装置 | |
JP3671969B2 (ja) | データ多重方法及び多重データ復号方法 | |
JP5016335B2 (ja) | 再生装置、および、再生方法 | |
KR100539731B1 (ko) | 전송스트림저장장치및방법 | |
KR100708377B1 (ko) | 디지털 방송 수신시 동시 화면을 위한 디코더 | |
JP2019186732A (ja) | レコーダおよび録画情報の再生方法 | |
JP2008066770A (ja) | コンテンツ受信装置、コンテンツ送受信システム、及びコンテンツ送受信方法 | |
JP2003284069A (ja) | 多重データ復号方法及び多重データ復号装置 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
ENP | Entry into the national phase |
Ref document number: 2014504687 Country of ref document: JP Kind code of ref document: A |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 13761815 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 14115967 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 13761815 Country of ref document: EP Kind code of ref document: A1 |