US20070053665A1 - Apparatus and method for image coding and decoding - Google Patents

Apparatus and method for image coding and decoding

Info

Publication number
US20070053665A1
Authority
US
United States
Prior art keywords
data
multimedia
video
coding
stream
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US11/593,388
Inventor
Motoki Kato
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sony Corp
Original Assignee
Sony Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Corp
Priority to US11/593,388
Publication of US20070053665A1
Assigned to SONY CORPORATION (assignment of assignors interest; assignor: KATO, MOTOKI)
Current legal status: Abandoned

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/01 Conversion of standards, e.g. involving analogue television standards or digital television standards processed at pixel level
    • H04N7/0117 involving conversion of the spatial resolution of the incoming video signal
    • H04N7/0122 the input and the output signals having different aspect ratios
    • H04N19/00 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/10 using adaptive coding
    • H04N19/102 characterised by the element, parameter or selection affected or controlled by the adaptive coding
    • H04N19/115 Selection of the code volume for a coding unit prior to coding
    • H04N19/134 characterised by the element, parameter or criterion affecting or controlling the adaptive coding
    • H04N19/146 Data rate or code amount at the encoder output
    • H04N19/149 by estimating the code amount by means of a model, e.g. mathematical model or statistical model
    • H04N19/162 User input
    • H04N19/169 characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding
    • H04N19/17 the unit being an image region, e.g. an object
    • H04N19/172 the region being a picture, frame or field
    • H04N19/177 the unit being a group of pictures [GOP]
    • H04N19/188 the unit being a video data packet, e.g. a network abstraction layer [NAL] unit
    • H04N19/40 using video transcoding, i.e. partial or full decoding of a coded input stream followed by re-encoding of the decoded output stream
    • H04N19/44 Decoders specially adapted therefor, e.g. video decoders which are asymmetric with respect to the encoder
    • H04N19/46 Embedding additional information in the video signal during the compression process
    • H04N19/50 using predictive coding
    • H04N19/59 involving spatial sub-sampling or interpolation, e.g. alteration of picture size or resolution
    • H04N19/70 characterised by syntax aspects related to video coding, e.g. related to compression standards
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20 Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23 Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/236 Assembling of a multiplex stream, e.g. transport stream, by combining a video stream with other content or additional data, e.g. inserting a URL [Uniform Resource Locator] into a video stream, multiplexing software data into a video stream; Remultiplexing of multiplex streams; Insertion of stuffing bits into the multiplex stream, e.g. to obtain a constant bit-rate; Assembling of a packetised elementary stream
    • H04N21/23614 Multiplexing of additional data and video streams
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/433 Content storage operation, e.g. storage operation in response to a pause request, caching operations
    • H04N21/4334 Recording operations
    • H04N21/434 Disassembling of a multiplex stream, e.g. demultiplexing audio and video streams, extraction of additional data from a video stream; Remultiplexing of multiplex streams; Extraction or processing of SI; Disassembling of packetised elementary stream
    • H04N21/4341 Demultiplexing of audio and video streams
    • H04N21/4348 Demultiplexing of additional data and video streams
    • H04N21/44 Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs
    • H04N21/4402 involving reformatting operations of video signals for household redistribution, storage or real-time display
    • H04N21/440263 by altering the spatial resolution, e.g. for displaying on a connected PDA
    • H04N21/60 Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client
    • H04N21/63 Control signaling related to video distribution between client, server and network components; Network processes for video distribution between server and clients or between remote clients, e.g. transmitting basic layer and enhancement layers over different transmission paths, setting up a peer-to-peer communication via Internet between remote STB's; Communication protocols; Addressing
    • H04N21/633 Control signals issued by server directed to the network components or client
    • H04N21/6332 directed to client
    • H04N21/637 Control signals issued by the client directed to the server or network components
    • H04N21/6373 for rate control, e.g. request to the server to modify its transmission rate
    • H04N21/6377 directed to server
    • H04N21/65 Transmission of management data between client and server
    • H04N21/654 Transmission by server directed to the client
    • H04N21/658 Transmission by the client directed to the server
    • H04N5/00 Details of television systems
    • H04N5/76 Television signal recording
    • H04N5/91 Television signal processing therefor
    • H04N9/00 Details of colour television systems
    • H04N9/79 Processing of colour television signals in connection with recording
    • H04N9/80 Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback
    • H04N9/82 the individual colour picture signal components being recorded simultaneously only
    • H04N9/8205 involving the multiplexing of an additional signal and the colour video signal
    • H04N5/78 Television signal recording using magnetic recording
    • H04N5/781 on disks or drums
    • H04N9/804 involving pulse code modulation of the colour picture signal components
    • H04N9/8042 involving data reduction
    • H04N9/8227 the additional signal being at least another television signal
    • H04N9/8233 the additional signal being a character code signal

Definitions

  • the present invention relates generally to an image coding apparatus and method, an image decoding apparatus and method, and a recording medium. More specifically, the present invention relates to an image coding apparatus and method, an image decoding apparatus and method, and a recording medium which are suitable for use in apparatus for re-encoding video streams and recording and reproducing the re-encoded video streams.
  • Digital television broadcasts such as European DVB (Digital Video Broadcast), American DTV (Digital Television) broadcast, and Japanese BS (Broadcast Satellite) digital broadcast use MPEG (Motion Picture Expert Group) 2 transport streams.
  • A transport stream consists of a continuous sequence of transport packets, each carrying video data or audio data, for example.
  • The data length of one transport packet is 188 bytes.
  • Digital television broadcasts are also capable of providing services that include multimedia coding data.
  • The multimedia coding data associates data such as video data, audio data, character graphics data, and still picture data with each other for transmission.
  • As the multimedia coding data, a coding method based on XML (Extensible Markup Language) is used in the Japanese BS digital broadcast, for example. The details of this method are disclosed in ARIB STD-B24, Data Coding and Transmission Specification for Digital Broadcasting, for example.
  • Data such as video data, audio data, character graphics data, and still picture data are each packetized into transport packets for transmission.
  • FIGS. 1A and 1B show an example of the data transferred between the sending and receiving sides and of a multimedia screen synthesized from that data.
  • As shown in FIG. 1A, the sending side sends to the receiving side video data, character graphics data for displaying buttons A through C, text data for displaying "XYZABC . . . ," and multimedia coding data for relating these data to each other.
  • The sending side generally denotes a television broadcast station, for example. Herein, however, it also covers a recording apparatus (the recording side) that receives and records data transmitted from broadcast stations, as in the example illustrated in FIG. 1A, which includes the data output from this recording apparatus.
  • The multimedia coding data includes data that allows the receiving side to synthesize the video data, character graphics data, and text data and to display the synthesized result.
  • As shown in FIG. 1B, the multimedia coding data includes size-related data such as the size of the multimedia plane (the display area for images on the television receiver, for example; plane_height and plane_width) and the video display size (video_height and video_width), together with data specifying the display positions at which the video, character graphics, and text are displayed.
  • The receiving side processes the video data, the character graphics data, and the text data accordingly to display the resultant image, as shown in FIG. 1B.
  • The user can thus receive services such as displaying desired information in the video section by clicking the button A corresponding to that information, or obtaining, from the text data displayed at the bottom of the screen, information associated with what is shown in the video section, for example.
  • If a television program carried by a transport stream transmitted as a digital television broadcast is recorded without change to a recording medium on the receiving side, the program can be recorded without any deterioration of its picture and audio quality.
  • However, in order to record as long a television program as possible to a recording medium having a limited recording capacity, accepting a certain degree of picture quality deterioration, the received video stream must be decoded and then encoded again to lower the bit rate of the transport stream.
  • The re-encoding of the video stream of a television program accompanied by multimedia coding data to lower its bit rate for recording may be implemented by sub-sampling the image so as to change the picture frame.
  • However, this approach causes a mismatch in the relationship between the video stream resulting from re-encoding and the multimedia coding data. The following describes an example of this mismatch with reference to FIGS. 2A and 2B.
  • At the time of re-encoding, the sending side (the recording side) converts the original picture frame of the video to a smaller picture frame. Therefore, as shown in FIG. 2B, on the receiving side (the reproducing side) the video display size and position change, resulting in a display screen different from the one intended by the sending side (the screen that would be displayed on the basis of the data before re-encoding).
  • an image coding apparatus receives a multiplexed transport stream that includes multimedia coding data.
  • a demultiplexer separates a video stream from the multiplexed transport stream.
  • a decoder reproduces the separated video stream as decoded video data.
  • a coding generator receives multimedia information associated with the multimedia coding data and generates display control information.
  • the display control information includes a mismatch flag which indicates whether a display mismatch condition exists between the video data and multimedia coding data.
  • An output unit outputs the decoded video data, the multimedia coding data and the mismatch flag.
  • an encoder may be coupled to the decoder and may reproduce the video stream based on the multimedia information associated with the multimedia coding data and the video data.
  • the output unit may comprise a writing unit that records the decoded video data, the multimedia coding data and the mismatch flag onto a recording medium.
  • a coding controller may be coupled between the selector and the coding generator and may generate the multimedia information associated with the multimedia coding data.
  • a data analyzer may be coupled between the selector and the coding controller and may detect at least a bit rate associated with the video stream.
  • the display control information may include a re-encode flag which indicates whether the video data is re-encoded.
  • the display control information may include a frame size change flag which indicates whether a size of a picture frame associated with the video data has been changed.
  • FIGS. 1A and 1B are schematic diagrams illustrating a display screen to be shown on the basis of multimedia coding information
  • FIGS. 2A and 2B are schematic diagrams illustrating a mismatch which takes place when a video stream is re-encoded
  • FIG. 3 is a block diagram illustrating a recording apparatus practiced as one embodiment of the present invention.
  • FIGS. 4A and 4B illustrate the operation of a multiplexer shown in FIG. 3 ;
  • FIGS. 5A, 5B and 5C illustrate the processing by an arrival timestamp adding block
  • FIG. 6 illustrates multimedia display sub-information
  • FIG. 7 illustrates an example of ProgramInfo( ) syntax
  • FIG. 8 illustrates an example of StreamCodingInfo( ) syntax
  • FIG. 9 illustrates the meaning of stream_coding_type
  • FIG. 10 illustrates the meaning of video_format
  • FIG. 11 illustrates the meaning of frame_rate
  • FIG. 12 illustrates the meaning of display_aspect_ratio
  • FIG. 13 is a flowchart describing the processing of coding AV stream and multimedia display sub-information
  • FIG. 14 is a flowchart describing the coding processing to be executed for restricting the re-encoding of a multiplexed stream video including multimedia coding data;
  • FIG. 15 illustrates an example of an input transport stream
  • FIG. 16 illustrates an example of a transport stream after the re-encoding of the video stream shown in FIG. 15 ;
  • FIG. 17 is a flowchart describing a recording rate control process by a recording apparatus shown in FIG. 3 ;
  • FIG. 18 is a flowchart describing another recording rate control process by the recording apparatus shown in FIG. 3 ;
  • FIG. 19 illustrates another example of a transport stream resulting from the re-encoding of the video stream
  • FIG. 20 illustrates another example of the input transport stream
  • FIG. 21 is a block diagram illustrating a configuration of a reproducing apparatus practiced as one embodiment of the present invention.
  • FIGS. 22A and 22B illustrate a display screen to be shown when multimedia display sub-information is added
  • FIG. 23 is a block diagram illustrating another configuration of the recording apparatus practiced as one embodiment of the present invention.
  • FIG. 24 is a flowchart describing the processing of reproducing an AV stream which uses multimedia display sub-information
  • FIG. 25 is a block diagram illustrating another configuration of the reproducing apparatus practiced as one embodiment of the present invention.
  • FIG. 26 illustrates recording media
  • Referring to FIG. 3, there is shown a block diagram illustrating an exemplary configuration of a recording apparatus 1 practiced as one embodiment of the invention.
  • A transport stream received by an antenna, not shown, is input to a selector 10.
  • a program number (a channel number) specified by the user is also input from a terminal 11 to the selector 10 .
  • the selector 10 extracts the specified program from the received transport stream and outputs a partial transport stream.
  • the partial transport stream is input in a demultiplexer 12 and an analyzing block 13 .
  • the partial transport stream input in the demultiplexer 12 is separated into a video stream and other streams (audio, still picture, character graphics, and multimedia coding data for example).
  • the video stream thus obtained is output to a decoder 14 .
  • the other streams are output to a multiplexer 16 .
  • The demultiplexer 12 also outputs to the multiplexer 16 timing information indicating the timing with which these transport packets occur in the input transport stream.
  • the decoder 14 applies a predetermined decoding scheme, for example, MPEG2 to the input video stream and outputs the decoded video data to an encoder 15 . Also, the decoder 14 outputs the stream information about the video stream obtained at decoding to a coding controller 18 .
  • the analyzing block 13 analyzes the input transport stream to obtain the stream information about the non-video streams, for example, a bit rate, and outputs it to the coding controller 18 .
  • the stream information about the non-video streams output from the analyzing block 13 , the video stream information output from decoder 14 , and a stream recording bit rate output from a terminal 19 are input in the coding controller 18 .
  • the coding controller 18 sets the video data coding conditions (coding control information) to be executed by the encoder 15 and outputs these coding conditions to the encoder 15 and a coding block 20 .
  • the coding controller 18 uses, as a bit rate to be allocated to the video data encoding, a value obtained by subtracting a total value (the data input from the analyzing block 13 ) of the bit rates of the non-video streams from a stream recording bit rate (the data input, via the terminal 19 , from a controller, not shown, for controlling the operation of the recording apparatus 1 , for example).
  • the coding controller 18 sets coding control information such as bit rate and picture frame such that an optimum picture quality can be achieved with the bit rate thus obtained and outputs this coding control information to the encoder 15 and the coding block 20 .
  • the details of the coding control information will be described later with reference to FIGS. 15 through 20 .
  • If a stream is recorded with a fixed bit rate, this stream recording bit rate is that fixed rate; if a stream is recorded with a variable bit rate, this stream recording bit rate is a mean bit rate per predetermined time.
  • the maximum value of the variable bit rate in this case needs to be lower than the maximum recording bit rate ensured by the recording medium concerned.
  • the encoder 15 encodes (on the basis of MPEG2, for example) the video data output from the decoder 14 on the basis of the coding control information output from the coding controller 18 and outputs the resultant video data to the multiplexer 16 .
  • the video stream from the encoder 15 , the transport stream packets other than video from the demultiplexer 12 , and the information about the occurrence timing of the transport stream packets other than video are input in the multiplexer 16 .
  • the multiplexer 16 multiplexes the video stream with the transport stream packets, other than video, and outputs the result to the arrival timestamp adding block 17 as a transport stream.
  • FIGS. 4A and 4B schematically illustrate the above-mentioned processing to be executed by the multiplexer 16 .
  • FIG. 4A shows the timing of the input transport stream packets.
  • the cross-hatched portions indicate the video packets while the white portions indicate the stream packets other than video.
  • the input transport stream packets are continuous; however, the data volume of the video data is reduced by the re-encoding of video data by the encoder 15 . Consequently, the number of video packets is reduced.
  • the multiplexer 16 does not change the timing of the stream packets other than video but causes only the timing of the video packets to be different from the original state (shown in FIG. 4A ).
  • the arrival timestamp adding block 17 adds a header (TP_extra_header) including an arrival timestamp to each of the packets ( FIG. 5A ) of the input transport stream to generate a source packet ( FIG. 5B ), arranges the generated source packets continuously ( FIG. 5C ), and outputs them to a writing block 21 .
  • the arrival timestamp is information indicative of the timing with which the transport stream packets occur in a transport stream.
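As a concrete illustration of the source-packet structure described above, the sketch below builds a source packet by prefixing each 188-byte transport packet with a TP_extra_header carrying an arrival timestamp. The 4-byte header size, the 30-bit timestamp field, and the exact field layout are assumptions for illustration only; the text above states only that a header containing an arrival timestamp is added to each packet.

```python
# Sketch of the arrival timestamp adding block (FIGS. 5A-5C).
# Assumption: TP_extra_header is 4 bytes holding a 30-bit arrival timestamp.

TS_PACKET_SIZE = 188  # bytes, as stated for one transport packet

def add_arrival_timestamp(ts_packet: bytes, arrival_time: int) -> bytes:
    """Prefix one 188-byte transport packet with a TP_extra_header."""
    if len(ts_packet) != TS_PACKET_SIZE:
        raise ValueError("expected a 188-byte transport packet")
    header = (arrival_time & 0x3FFFFFFF).to_bytes(4, "big")  # 30-bit timestamp
    return header + ts_packet  # one source packet (FIG. 5B)

def build_source_packet_stream(packets, arrival_times) -> bytes:
    """Arrange the generated source packets continuously (FIG. 5C)."""
    return b"".join(add_arrival_timestamp(p, t)
                    for p, t in zip(packets, arrival_times))
```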
  • The writing block 21 takes the input source packet stream consisting of continuous source packets and records it as a file to a recording medium 22. It should be noted that the recording medium 22 may be any type of recording medium.
  • the information output from the coding block 20 is also input in the writing block 21 .
  • On the basis of the video coding information from the coding controller 18, the coding block 20 generates multimedia display sub-information and outputs it to the writing block 21.
  • The multimedia display sub-information output to the writing block 21 is information for keeping the video display position and display size on the multimedia plane unchanged from those of the image intended by the sending side (the image that would be displayed without re-encoding), even if the picture frame size has been changed by transcoding the video stream (decoding by the decoder 14 followed by encoding by the encoder 15). This information is also used at the time of reproduction in combination with the multimedia coding data.
  • As shown in FIG. 6, the multimedia display sub-information consists of three flags, namely a mismatch flag (mismatch_MMinfo_flag), a re-encoded flag (Re_encoded_flag), and a frame size change flag (changed_frame_size_flag); two size values, namely an original horizontal size (original_horizontal_size) and an original vertical size (original_vertical_size); and an original screen aspect ratio (original_display_aspect_ratio).
  • the mismatch flag indicates whether there exists a mismatch in the relationship between video and multimedia coding data.
  • the re-encoded flag indicates whether the video has been re-encoded at the time of recording.
  • the frame size change flag indicates whether the picture frame of video has been changed by re-encoding, for example.
  • the original horizontal size indicates the horizontal size of a picture frame before re-encoding.
  • the original vertical size indicates the vertical size of a picture frame before re-encoding.
  • the original screen aspect ratio indicates the aspect ratio of a frame screen before re-encoding.
  • The multimedia display sub-information described above is illustrative only. Therefore, information other than that shown in FIG. 6 may be included in, or part of the information shown in FIG. 6 may be excluded from, the multimedia display sub-information.
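The multimedia display sub-information described above can be pictured as a small record. The sketch below is a minimal Python rendering of the fields named in FIG. 6; the field types, the string form of the aspect ratio, and the example values are assumptions, since the text does not specify a binary layout here.

```python
from dataclasses import dataclass

@dataclass
class MultimediaDisplaySubInfo:
    """Fields of the multimedia display sub-information (FIG. 6).

    Types and example values are illustrative assumptions; the text only
    names the fields and their meanings.
    """
    mismatch_MMinfo_flag: bool      # mismatch between video and multimedia coding data
    Re_encoded_flag: bool           # video was re-encoded at recording time
    changed_frame_size_flag: bool   # picture frame was changed by re-encoding
    original_horizontal_size: int   # horizontal size of the picture frame before re-encoding
    original_vertical_size: int     # vertical size of the picture frame before re-encoding
    original_display_aspect_ratio: str  # e.g. "16:9" before re-encoding

# Example: a 1080i 16:9 program re-encoded to a smaller picture frame.
sub_info = MultimediaDisplaySubInfo(
    mismatch_MMinfo_flag=True,
    Re_encoded_flag=True,
    changed_frame_size_flag=True,
    original_horizontal_size=1920,
    original_vertical_size=1080,
    original_display_aspect_ratio="16:9",
)
```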
  • the multimedia display sub-information is stored in a ProgramInfo( ) syntax shown in FIG. 7 .
  • the following describes the fields associated with the present invention in the ProgramInfo( ) syntax.
  • “length” indicates the number of bytes between the byte just after the length field and the last byte of ProgramInfo( ) inclusive.
  • number_of_program_sequences indicates the number of program sequences in the AV stream file.
  • A sequence of source packets over which the program contents specified by this format remain constant in the AV stream file is referred to as a program sequence.
  • SPN_program_sequences_start indicates the address at which the program sequence starts in the AV stream file. It is expressed in units of source packet number and is counted from an initial value of 0, starting with the first source packet of the AV stream file.
  • program_map_PID is the value of the PID of a transport packet having the PMT (Program Map Table) applicable to that program sequence.
  • number_of_streams_in_ps indicates the number of elementary streams defined in that program sequence.
  • stream_PID indicates the value of the PID for an elementary stream defined in the PMT which is referenced by the program_map_PID of that program sequence.
  • StreamCodingInfo( ) indicates the information about the elementary stream indicated by the above-mentioned stream_PID.
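To make the ProgramInfo( ) fields listed above easier to follow, here is a hypothetical in-memory representation. It mirrors the field names above; the nesting, the Python types, and the omission of the serialized length field are assumptions for illustration and do not reproduce the exact bit-level syntax of FIG. 7.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class StreamEntry:
    stream_PID: int             # PID of an elementary stream in this program sequence
    stream_coding_info: object  # a StreamCodingInfo( ) record; see the sketch further below

@dataclass
class ProgramSequence:
    SPN_program_sequences_start: int  # source packet number where the sequence starts
    program_map_PID: int              # PID of the transport packet carrying the PMT
    streams: List[StreamEntry] = field(default_factory=list)  # number_of_streams_in_ps entries

@dataclass
class ProgramInfo:
    program_sequences: List[ProgramSequence] = field(default_factory=list)
    # number_of_program_sequences == len(program_sequences); the on-disc length
    # field is computed at serialization time and is omitted in this sketch.
```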
  • FIG. 8 shows the syntax of StreamCodingInfo( ). “length” indicates the number of bytes between the byte just after this length field and the last byte of StreamCodingInfo( ) inclusive.
  • stream_coding_type indicates the coding type of the elementary stream indicated by the stream PID for this StreamCodingInfo( ). The meanings of the individual types are shown in FIG. 9 .
  • If the value of stream_coding_type is 0x02, it indicates that the elementary stream indicated by the stream_PID is a video stream.
  • If the value of stream_coding_type is 0x0A, 0x0B, or 0x0D, it indicates that the elementary stream indicated by the stream_PID is multimedia coding data.
  • If the value of stream_coding_type is 0x06, it indicates that the elementary stream indicated by the stream_PID is subtitles or teletext.
  • video_format indicates the video format of a video stream indicated by the stream PID for this StreamCodingInfo( ). The meanings of the individual video formats are shown in FIG. 10 .
  • 480i indicates the video display of NTSC standard TV (interlaced frame of 720 pixels × 480 lines).
  • 576i indicates the video display of PAL standard TV (interlaced frame of 720 pixels × 576 lines).
  • 480p indicates a video display with a progressive frame of 720 pixels × 480 lines.
  • 1080i indicates a video display with an interlaced frame of 1920 pixels × 1080 lines.
  • 720p indicates a video display with a progressive frame of 1280 pixels × 720 lines.
  • frame_rate indicates the frame rate of a video stream indicated by the stream PID for this StreamCodingInfo( ).
  • the meanings of the individual frame rates are shown in FIG. 11 .
  • display_aspect_ratio indicates the display aspect ratio of a video stream indicated by the stream_PID for this StreamCodingInfo( ). The meanings of the individual display aspect ratios are shown in FIG. 12.
  • original_video_format_flag indicates whether an original video format and an original display aspect ratio are present in this StreamCodingInfo( ).
  • original_video_format indicates the video format of the video stream indicated by the stream_PID for this StreamCodingInfo( ) before it was coded.
  • The meanings of the individual original video formats are the same as shown in FIG. 10.
  • original_display_aspect_ratio is the display aspect ratio of the video stream indicated by the stream_PID for this StreamCodingInfo( ) before it was coded.
  • the meanings of the individual aspect ratios are the same as shown in FIG. 12 .
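A compact sketch of the StreamCodingInfo( ) fields discussed above follows. The coding-type codes are taken from the text (0x02 for video, 0x06 for subtitles or teletext, 0x0A/0x0B/0x0D for multimedia coding data); everything else, such as the Python types and the string form of video_format values, is an illustrative assumption.

```python
from dataclasses import dataclass
from typing import Optional

# stream_coding_type values given in the text.
VIDEO_STREAM_TYPES = {0x02}
MULTIMEDIA_CODING_TYPES = {0x0A, 0x0B, 0x0D}
SUBTITLE_TELETEXT_TYPES = {0x06}

@dataclass
class StreamCodingInfo:
    stream_coding_type: int                       # e.g. 0x02 for a video stream
    video_format: Optional[str] = None            # e.g. "1080i", "480i" (FIG. 10)
    frame_rate: Optional[float] = None            # e.g. 29.97 (FIG. 11)
    display_aspect_ratio: Optional[str] = None    # e.g. "16:9" (FIG. 12)
    original_video_format_flag: bool = False      # are the original_* fields present?
    original_video_format: Optional[str] = None
    original_display_aspect_ratio: Optional[str] = None

def is_video_stream(info: StreamCodingInfo) -> bool:
    return info.stream_coding_type in VIDEO_STREAM_TYPES

def is_multimedia_coding_data(info: StreamCodingInfo) -> bool:
    return info.stream_coding_type in MULTIMEDIA_CODING_TYPES
```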
  • The re-encoding of the video stream changes its video format (for example, from 1080i to 480i), while the multimedia data stream retains its original stream contents.
  • Consequently, a mismatch in information may occur between the new video stream and the multimedia data stream.
  • Although the parameters associated with the display of the multimedia data stream are determined on the assumption of the video format of the original video stream, that video format may be changed by the re-encoding of the video stream.
  • The video format of the re-encoded video stream is indicated by video_format and display_aspect_ratio.
  • The video format of the original video stream is indicated by original_video_format and original_display_aspect_ratio.
  • If a stream_PID whose stream_coding_type indicates multimedia coding data or subtitles is included in ProgramInfo( ), it indicates that multimedia data is multiplexed in the AV stream file (a transport stream).
  • If ProgramInfo( ) indicates that a video format change was caused by the re-encoding of video at the time of recording and that multimedia data is multiplexed in the AV stream file, then it is determined that a display mismatch exists between the (re-encoded) video stream and the (original) multimedia data in the AV stream file.
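The determination just described can be expressed as a predicate over the StreamCodingInfo( ) records of one program sequence: a display mismatch is assumed when some video stream records a format change (video_format differs from original_video_format, or display_aspect_ratio differs from original_display_aspect_ratio) and multimedia coding data or subtitles are also multiplexed in the AV stream file. The sketch below is a hedged illustration under those assumptions; it expects records carrying the fields named above.

```python
VIDEO_TYPES = {0x02}                                # stream_coding_type values for video
MULTIMEDIA_OR_SUBTITLE_TYPES = {0x06, 0x0A, 0x0B, 0x0D}

def video_format_changed(v) -> bool:
    """True if re-encoding at recording time changed the video format."""
    if not v.original_video_format_flag:
        return False
    return (v.video_format != v.original_video_format
            or v.display_aspect_ratio != v.original_display_aspect_ratio)

def display_mismatch_exists(stream_coding_infos) -> bool:
    """Decide the display mismatch condition for one program sequence.

    stream_coding_infos -- records with the StreamCodingInfo( ) fields named above.
    """
    has_multimedia = any(s.stream_coding_type in MULTIMEDIA_OR_SUBTITLE_TYPES
                         for s in stream_coding_infos)
    video_changed = any(s.stream_coding_type in VIDEO_TYPES and video_format_changed(s)
                        for s in stream_coding_infos)
    return has_multimedia and video_changed
```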
  • the reproducing apparatus generates a display screen from the above-mentioned new video stream and multimedia data stream as follows.
  • the video stream is up-sampled to a video format indicated by the original video format and the original display aspect ratio.
  • the up-sampled image and the multimedia data stream are synthesized to form a correct display screen.
  • The multimedia display sub-information generated by the coding block 20 is recorded by the writing block 21 to the recording medium 22, but it is stored as a file separate from the source packet stream file output from the arrival timestamp adding block 17. When the multimedia display sub-information is recorded by the writing block 21 to the recording medium 22 as a file separate from the source packet stream file, the multimedia display sub-information arranged as that file is output from the coding block 20.
  • FIG. 13 is a flowchart describing the processing of coding an AV stream and multimedia display sub-information.
  • In step 50, a multiplexed stream including multimedia coding data is input to the recording apparatus 1.
  • In step 51, the demultiplexer 12 separates the video stream from the multiplexed stream.
  • In step 52, the encoder 15 re-encodes the video stream decoded by the decoder 14.
  • In step 53, the multiplexer 16 multiplexes the above-mentioned video stream and the multimedia coding data to generate a multiplexed stream.
  • In step 54, the coding block 20 generates the multimedia display sub-information.
  • the coding controller 18 generates the coding control information including bit rate and picture frame on the basis of the input data.
  • Alternatively, the coding controller 18 may generate the following coding control information. Namely, if the analyzing block 13 finds that the input transport stream includes multimedia coding data, the coding controller 18 may generate, for the encoding executed by the encoder 15, coding control information instructing the encoder 15 to execute the re-encoding with a picture frame of the same size as that of the picture frame of the original video (the picture frame before re-encoding), and output the generated coding control information to the encoder 15.
  • the encoder 15 re-encodes the video data supplied from the decoder 14 with the same value as that of the picture frame of the original video stream on the basis of the input coding control information. If such coding control information is generated and the re-encoding is executed on the basis of the coding control information, no picture frame change is caused by the re-encoding, thereby preventing a mismatch from occurring in the relationship between the video stream obtained by re-encoding and the multimedia coding data.
  • The following information may also be generated as the coding control information by the coding controller 18.
  • Namely, the coding controller 18 may generate, for the encoding executed by the encoder 15, coding control information instructing the encoder 15 to execute the re-encoding under the same conditions as the video format (shown in FIG. 10) and screen aspect ratio (shown in FIG. 12) of the original video, and output the coding control information to the encoder 15.
  • the encoder 15 re-encodes the video supplied from the decoder 14 under the same conditions as the video format (shown in FIG. 10 ) and screen aspect ratio (shown in FIG. 12 ) of the original video on the basis of the input coding control information. If such coding control information is generated and the re-encoding is executed on the basis of the coding control information, no video format and no screen aspect ratio change is caused by the re-encoding, thereby preventing a mismatch from occurring in the relationship between the video stream obtained by re-encoding and the multimedia coding data.
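The two alternatives just described (keep the original picture frame size, or keep the original video format and aspect ratio) amount to a simple rule when the coding control information is built. The sketch below is a hypothetical illustration of that decision; the CodingControlInfo record, the function name, and its parameters are assumptions, not the patent's actual data structures.

```python
from dataclasses import dataclass

@dataclass
class CodingControlInfo:
    bit_rate: int              # bit rate allocated to the re-encoded video
    horizontal_size: int       # picture frame width for re-encoding
    vertical_size: int         # picture frame height for re-encoding
    video_format: str          # e.g. "1080i" (FIG. 10)
    display_aspect_ratio: str  # e.g. "16:9" (FIG. 12)

def make_coding_control(original, video_bit_rate, stream_has_multimedia):
    """Illustrative coding-control decision for the encoder 15.

    original -- an object carrying the original stream's horizontal_size,
                vertical_size, video_format, and display_aspect_ratio.
    If multimedia coding data is multiplexed in the input stream, re-encode
    with the original picture frame, video format, and aspect ratio so that
    no mismatch with the multimedia coding data can arise.
    """
    if stream_has_multimedia:
        return CodingControlInfo(video_bit_rate,
                                 original.horizontal_size, original.vertical_size,
                                 original.video_format, original.display_aspect_ratio)
    # Otherwise the picture frame may be reduced, e.g. 720 x 480 -> 360 x 480.
    return CodingControlInfo(video_bit_rate,
                             original.horizontal_size // 2, original.vertical_size,
                             original.video_format, original.display_aspect_ratio)
```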
  • FIG. 14 is a flowchart describing the coding for restricting the re-encoding of the video of a multiplexed stream including multimedia coding data.
  • In step 70, a multiplexed stream is input to the recording apparatus 1.
  • In step 71, the demultiplexer 12 separates the video stream from the multiplexed stream.
  • In step 72, the analyzing block 13 checks whether multimedia coding data is included in the multiplexed stream. If it is, the analyzing block 13 sends coding control information to the encoder 15 instructing it to re-encode the video stream without changing the display format. On the basis of the supplied control information, the encoder 15 re-encodes the video stream.
  • In step 73, the multiplexer 16 generates a multiplexed stream including the above-mentioned video stream.
  • Assume, for example, that the transport stream input to the selector 10 has a constant bit rate R_I, as shown in FIG. 15.
  • In this transport stream, the video stream and the non-video streams are coded with variable bit rates.
  • In unit time (for example, one GOP) A, the bit rate of the video stream is R_VA and the bit rate of the non-video streams is R_OA.
  • In unit time B, the bit rate of the video stream is R_VB and the bit rate of the non-video streams is R_OB.
  • In unit time C, the bit rate of the video stream is R_VC and the bit rate of the non-video streams is R_OC.
  • the coding controller 18 executes the processing described by the flowchart shown in FIG. 17 .
  • In step S1, the coding controller 18 sets the bit rate S (the recording rate) of the transport stream to be output from the multiplexer 16 on the basis of a control signal input from a controller, not shown, via the terminal 19.
  • In step S2, the coding controller 18 determines the non-video streams to be recorded and computes the maximum total value D of the bit rates of the determined streams.
  • The maximum value D is determined from the stream specification of the input transport stream. For example, if two audio streams are to be recorded in addition to the video stream, the maximum value D is 384 × 2 Kbps, since the maximum bit rate of one audio stream is 384 Kbps according to the Japanese digital BS broadcast stream specification.
  • In step S4, the coding controller 18 analyzes coding information such as the video stream bit rate and picture frame from the video stream information output from the decoder 14.
  • In step S5, the coding controller 18 determines, on the basis of the value C computed in step S3 (the bit rate allocated to video, obtained by subtracting D from S) and the video stream coding information analyzed in step S4, video coding parameters (video coding control information) such that an optimum picture quality is achieved.
  • Assume that the value S is 1/2 of the value R_I.
  • The bit rate of the streams other than video is assumed to be the maximum value D, which is used without change as the bit rate of the non-video streams in the multiplexed stream after re-encoding.
  • The video coding parameters are determined such that an optimum picture quality can be achieved within the range of (S - D). If the picture frame is controlled, the horizontal direction of a picture frame of 720 × 480 pixels, for example, is sub-sampled by 1/2 into 360 × 480 pixels.
  • The determined coding parameters (bit rate and picture frame) are supplied to the encoder 15 as the video coding control information.
  • In step S6, on the basis of the video coding control information supplied from the coding controller 18, the encoder 15 re-encodes the video data of the unit time to be processed (in this example, unit time A).
  • The actual bit rate R_OA of the non-video streams is smaller than the maximum value D in unit time A; however, since the maximum value D is fixed, the bit rate allocated to video becomes (S - D).
  • Consequently, a wasted portion R_sa which cannot be used for video coding occurs because the maximum value D is fixed. The wasted portion is filled with stuffing bits.
  • In step S7, the coding controller 18 determines whether there remains any stream to be re-encoded. If any streams remain to be re-encoded, the procedure returns to step S4 to repeat the above-mentioned processes.
  • If, in step S7, no more streams remain to be re-encoded, this processing comes to an end.
  • In unit time B, the bit rate of the non-video streams is also D, and the bit rate allocated to the video stream is (S - D) because it is fixed.
  • Likewise, in unit time C, the bit rate of the non-video streams is D and the bit rate allocated to the video stream is (S - D). It should be noted that, in this example, the video stream is coded with a fixed bit rate.
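The fixed-allocation recording rate control of FIG. 17 can be summarized numerically: the bit rate allocated to video in every unit time is (S - D), and any unused part of the non-video budget is padded with stuffing bits. The sketch below is a simplified, hypothetical rendering of that loop; the function and variable names are assumptions.

```python
def fixed_allocation_plan(recording_rate_s, max_nonvideo_rate_d, nonvideo_rates):
    """Per-unit-time budgets under the fixed allocation of FIG. 17.

    recording_rate_s    -- S, the recording rate of the output transport stream
    max_nonvideo_rate_d -- D, the maximum total bit rate of the non-video streams
    nonvideo_rates      -- actual non-video bit rates per unit time (R_OA, R_OB, ...)
    """
    video_rate = recording_rate_s - max_nonvideo_rate_d  # fixed video budget (S - D)
    plan = []
    for actual_nonvideo in nonvideo_rates:
        stuffing = max_nonvideo_rate_d - actual_nonvideo  # wasted portion, e.g. R_sa
        plan.append({"video": video_rate,
                     "non_video": actual_nonvideo,
                     "stuffing": stuffing})
    return plan

# Illustrative call (numbers are hypothetical): two audio streams at most
# 384 Kbps each give D = 768 Kbps, as in the text's example.
# budgets = fixed_allocation_plan(12_000_000, 768_000, [500_000, 768_000, 650_000])
```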
  • FIG. 18 is a flowchart describing a processing example in which the video re-encoding allocated bit rate is variable.
  • In step S21, the coding controller 18 sets the recording rate S on the basis of the information supplied via the terminal 19.
  • In step S22, the coding controller 18 analyzes the coding information of the video stream on the basis of the video stream information supplied from the decoder 14.
  • The processes of steps S21 and S22 are the same as those of steps S1 and S4 of FIG. 17.
  • In step S23, the coding controller 18 computes, from the output of the analyzing block 13, the total bit rate B of the non-video streams in each unit time.
  • In step S25, the coding controller 18 determines, on the basis of the value C obtained in step S24 (the bit rate allocated to video in that unit time, obtained by subtracting B from S) and the result of the analysis of the video stream coding information obtained in step S22, video coding parameters such that an optimum picture quality is obtained.
  • The determined coding parameters are output to the encoder 15.
  • In step S27, the coding controller 18 determines whether any streams remain to be processed. If any streams remain to be processed, the procedure returns to step S22 to repeat the above-mentioned processes. If no more streams remain to be processed, this processing comes to an end.
  • the bit rate of the video stream is variable and, therefore, no stuffing bit is needed or the number of stuffing bits can be reduced, thereby coding the video stream more efficiently.
  • the input transport stream has a fixed bit rate.
  • the present invention also is applicable to an example in which the bit rate of the input transport stream is variable as shown in FIG. 20 .
  • the above-mentioned novel embodiment prevents the qualities of audio data, still picture and character graphics data, multimedia coding data, and other non-video data from being conspicuously deteriorated.
  • the non-video data is basically smaller in data volume than video data, so that reducing the bit rate of the non-video data in the same ratio as the bit rate of video data makes the effects on the non-video data relatively greater than those on video data.
  • the novel embodiment can prevent these effects from being caused.
  • FIG. 21 there is shown a block diagram illustrating the configuration of a reproducing apparatus practiced as one embodiment of the invention.
  • a source packet stream file recorded on the recording medium 22 is read by a reading block 31 .
  • the reading block 31 also reads multimedia display sub-information recorded on the recording medium 22 as a file separate from the source packet stream file.
  • the source packet stream read by the reading block 31 is output to a arrival timestamp separating block 32 and the multimedia display sub-information is output to a synthesizing block 36 .
  • the arrival timestamp separating block 32 incorporates a reference clock.
  • the arrival time stamp separating block 32 compares the reference clock with the value of the arrival timestamp added to the source packet of the input source packet stream and, when a match is found, removes the arrival timestamp from the source packet having the matching arrival timestamp, outputting the resultant packet to a demultiplexer 33 as a transport stream packet.
  • the demultiplexer 33 separates the input transport stream into a video/audio stream and data streams such as multimedia coding data, character graphics, text, and still picture. Of these separated data, the video/audio stream is output to an AV decoder 34 , the multimedia coding data is output to the synthesizing block 36 , and the data stream such as character graphics, text, and still picture is output to a character graphics/still picture decoder 35 .
  • the AV decoder 34 separates the input video/audio stream into video data and audio data, decodes each data, and outputs the decoded audio data to an audio reproducing device, not shown, and the decoded video data to the synthesizing block 36 .
  • the character graphics/still picture decoder 35 decodes the input data stream, such as character graphics, text, and still picture, and outputs the decoded character graphics data, text data, and still picture data to the synthesizing block 36 .
  • the synthesizing block 36 determines whether a mismatch exists in the relationship between the input video signal and the multimedia coding data.
  • the synthesizing block 36 determines that a video format change has been caused by the video re-encoding at the time of recording, detecting a mismatch in the relationship between the input video signal and the multimedia encoding data. If no mismatch exists between the value of video format and the value of original video format and no mismatch exists between the value of display aspect radio and the value of original display aspect ratio, the synthesizing block 36 determines that no mismatch exists in the relationship between the input video signal and the multimedia coding data.
  • the synthesizing block 36 further references the original horizontal size and vertical size of the multimedia display sub-information or references the original video format and the original display aspect ratio. Then, the synthesizing block 36 scale-converts the input video signal so that it can be displayed in a frame of the referenced size. On the basis of the multimedia coding data, the synthesizing block 36 outputs the video signal with the scale-converted video signal and the data, such as character graphics synthesized on a multimedia plane, to a television receiver, not shown, which serves as a display device.
  • the synthesizing block 36 synthesizes the input video signal with other data on a multimedia plane without scale conversion and outputs the synthesized data.
  • recording the multimedia display sub-information and using it at the time of reproduction allow the receiving side to display a screen as intended on the sending side.
  • the size reduction is recorded as multimedia display sub-information, which is referenced at the time of reproduction. Consequently, because there exists no mismatch between video data and other data, the receiving side (the reproduction side) can display the same screen as the original.
  • FIG. 24 is a flowchart describing AV stream reproduction processing which uses multimedia display sub-information.
  • step 60 a multiplexed stream including multimedia coding data is read from a recording medium and input in a reproduction device.
  • step 61 multimedia display sub-information is input. This information is read from the recording medium in the case of the reproducing device shown in FIG. 21 ; in the case of a reproducing device shown in FIG. 25 , this information is separated from the multiplexed stream.
  • step 62 a video stream is separated from the multiplexed stream.
  • step 63 the video stream is decoded.
  • step S 64 if a mismatch exists between the video data and the multimedia coding data, the synthesizing block 36 scale-converts the video data on the basis of the multimedia display sub-information.
  • step 65 the synthesizing block 36 synthesizes the processed image and the multimedia data to generate a display image.
  • the multimedia display sub-information may be recorded on the recording medium 22 as a file which is different from the source packet stream file containing character graphics data and video signals.
  • the mutlimedia display sub-information may be embedded in a source packet stream file and then recorded on the recording medium 22 .
  • FIG. 23 shows the configuration of the recording apparatus 1 in which the multimedia display sub-information is embedded in a source packet stream file.
  • the former In comparison between the configuration of the recording apparatus 1 shown in FIG. 23 and the configuration shown in FIG. 3 , the former outputs the multimedia display sub-information output from the coding block 20 and supplies this output to the multiplexer 16 .
  • the multiplexer 16 then generates a transport packet of the input multimedia display sub-information and embeds it into a source packet stream file, outputting the same to the arrival timestamp adding block 17 .
  • the multimedia display sub-information may be written to a user data area in an MPEG video stream.
  • video data may be re-encoded using other methods than that described above; for example, an input video stream may be converted in the DCT area to convert the coding parameters such as picture frame.
  • FIG. 25 shows the configuration of the reproducing apparatus 30 in which the multimedia display sub-information is embedded in a source packet stream file to be recorded on the recording medium 22 .
  • the former reads only the source packet stream through the reading block 31 .
  • the source packet stream read by the reading block 31 is input to the demultiplexer 33 via the arrival timestamp separating block 32 .
  • the demultiplexer 33 extracts the multimedia display sub-information from the input source packet stream file and outputs the extracted information to the synthesizing block 36 .
  • the further processing is the same as that of the configuration shown in FIG. 5 .
  • the receiving side can also obtain the video picture size and display position intended by the sending side.
  • a transport stream was used as an example.
  • the present invention also is applicable to multiplexed streams such as a program stream.
  • the recording apparatus 1 (and the reproducing apparatus 30 ) is constituted by a personal computer as shown in FIG. 26 .
  • a CPU (Central Processing Unit) 101 executes various processing operations as instructed by programs stored in a ROM (Read Only Memory) 102 or loaded from a storage block 108 into a RAM (Random Access Memory) 103 .
  • the RAM 103 also stores, as required, the data necessary for the CPU 101 to execute various processing operations.
  • the CPU 101 , the ROM 102 , and the RAM 103 are interconnected via a bus 104 .
  • the bus 104 also is connected to an input/output interface 105 .
  • the input/output interface 105 is connected to an input block 106 , such as a keyboard and a mouse, a display device such as a CRT or LCD, an output block 107 , such as a speaker, a storage block 108 such as hard disk, and a communication block 109 such as modem or terminal adapter.
  • the communication block 109 executes communication processing via a network.
  • the input/output interface 105 also is connected to a drive 110 , as required, in which a magnetic disc 121 , an optical disc 122 , a magneto-optical disc 123 , or a semiconductor memory 124 is loaded. Computer programs read from these storage media are installed in the storage block 108 as required.
  • The execution of the above-described sequence of processing operations by software requires either a computer whose dedicated hardware stores beforehand the programs constituting the software, or a general-purpose computer in which these programs are installed, as required, from a recording medium.
  • As shown in FIG. 26, the program recording medium storing the computer-readable and executable programs may be a package medium distributed to users to provide the programs, embodied by the magnetic disk 121 (including a floppy disk), the optical disc 122 (including CD-ROM (Compact Disc-Read Only Memory) and DVD (Digital Versatile Disc)), the magneto-optical disk 123 (including MD (Mini Disk)), or the semiconductor memory 124; alternatively, it may be the ROM 102 or a hard disk preinstalled in a personal computer and provided to users, on which the programs are stored temporarily or permanently.
  • A video stream is separated from a multiplexed stream containing multimedia coding data, a predetermined conversion process is performed on the separated video stream, and additional information is generated which indicates that a mismatch will occur when the converted video stream is displayed on the basis of the multimedia coding data.
  • The first recording medium stores the converted video stream, the multimedia coding data, and the additional information indicating that a mismatch will occur when the converted video stream is displayed on the basis of the above-mentioned multimedia coding data.
  • Consequently, the reproducing side can prevent a mismatch from occurring between the video stream and the multimedia coding data.
  • A video stream is separated from an input multiplexed stream and the separated video stream is decoded. On the basis of additional information indicating that a mismatch occurs when the decoded video stream is displayed on the basis of multimedia coding data, a predetermined conversion process is performed on the decoded video stream.
  • A video stream is separated from an input multiplexed stream, the input multiplexed stream is checked for multimedia coding data and, if multimedia coding data is found, coding control information giving an instruction not to change the display format of the separated video stream is generated, and a predetermined conversion process is performed on the separated video stream on the basis of the generated coding control information.
  • The second recording medium stores the above-mentioned coding control information, which gives an instruction not to change the display format of a video stream, and a multiplexed stream containing the video stream on which a predetermined conversion process has been performed on the basis of that coding control information.
  • Consequently, the reproduction side can prevent a mismatch from occurring between the video stream and the multimedia coding data.

Abstract

An image coding apparatus is provided. A selector receives a multiplexed transport stream that includes multimedia coding data. A demultiplexer separates a video stream from the multiplexed transport stream. A decoder reproduces the video stream as decoded video data. A coding generator receives multimedia information associated with the multimedia coding data and generates display control information. The display control information includes a mismatch flag which indicates whether a display mismatch condition exists between the video data and multimedia coding data. An output unit outputs the decoded video data, the multimedia coding data and the mismatch flag.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • The present application is a divisional of U.S. application Ser. No. 09/872,147, filed Jun. 1, 2001, which claims priority from Japanese Application No. P2000-165298, filed Jun. 2, 2000, and Japanese Application No. P2001-001031, filed Jan. 9, 2001, the disclosures of which are hereby incorporated by reference herein.
  • BACKGROUND OF THE INVENTION
  • The present invention relates generally to an image coding apparatus and method, an image decoding apparatus and method, and a recording medium. More specifically, the present invention relates to an image coding apparatus and method, an image decoding apparatus and method, and a recording medium which are suitable for use in apparatus for re-encoding video streams and recording and reproducing the re-encoded video streams.
  • Digital television broadcasts such as European DVB (Digital Video Broadcast), American DTV (Digital Television) broadcast, and Japanese BS (Broadcast Satellite) digital broadcast use MPEG (Motion Picture Expert Group) 2 transport streams. A transport stream consists of continuous transport packets, each packet carrying video data or audio data, for example. The data length of one transport packet is 188 bytes.
  • Unlike analog television broadcasts, digital television broadcasts are capable of providing services added with multimedia coding data. In these services, data such as video data, audio data, character graphics data, and still picture data, for example, are associated with each other for transmission by the multimedia coding data. For the multimedia coding data, a coding method based on XML (Extensible Markup Language) is used in the Japanese BS digital broadcast, for example. The details of this method are disclosed in ARIB STD-B24 Data Coding And Transmission Specification for Digital Broadcasting, for example.
  • Data such as video data, audio data, character graphics data, and still picture data are each packetized into a transport packet for transmission.
  • FIGS. 1A and 1B show an example of the data to be transferred between the sending and receiving sides and of the multimedia screen synthesized from it. As shown in FIG. 1A, the sending side sends to the receiving side video data, character graphics data for displaying buttons A through C, text data for displaying “XYZABC . . . ,” and multimedia coding data for relating these data to each other. The sending side generally denotes a television broadcast station, for example; herein, however, it also covers a recording apparatus (the recording side) which receives and records data transmitted from broadcast stations, so the example illustrated in FIG. 1A includes the data output from this recording apparatus.
  • The multimedia coding data includes data which allows the receiving side to synthesize the video data, character graphics data, and text data and to display the synthesized result. To be more specific, as shown in FIG. 1B, the multimedia coding data includes size-associated data such as the size of the multimedia plane (the display area of images on the television receiver, for example; plane_height and plane_width) and the video display size (video_height and video_width), together with the display positions at which the video, character graphics, and text are to be displayed.
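  • As a rough illustration of the kind of layout information carried by the multimedia coding data, the following Python sketch models the plane size, the video display size, and a video display position. Only the field names plane_width, plane_height, video_width, and video_height come from the description above; the container class, the position fields, and the example numbers are assumptions made for illustration.

```python
from dataclasses import dataclass

@dataclass
class MultimediaLayout:
    """Illustrative model of the layout data carried by multimedia coding data."""
    plane_width: int   # width of the multimedia plane (display area), from plane_width
    plane_height: int  # height of the multimedia plane, from plane_height
    video_width: int   # video display width on the plane, from video_width
    video_height: int  # video display height on the plane, from video_height
    video_x: int       # horizontal display position of the video (assumed field)
    video_y: int       # vertical display position of the video (assumed field)

    def video_fits_on_plane(self) -> bool:
        """Check that the video display window lies inside the multimedia plane."""
        return (self.video_x + self.video_width <= self.plane_width
                and self.video_y + self.video_height <= self.plane_height)

# Example: a 1920x1080 plane with the video shown in its upper-left quadrant.
layout = MultimediaLayout(1920, 1080, 960, 540, 0, 0)
print(layout.video_fits_on_plane())  # True
```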
  • On the basis of the multimedia coding data, the receiving side processes the video data, the character graphics data, and the text data to display a resultant image, as shown in FIG. 1B.
  • Through the screen on which the above-mentioned image is displayed, the user can receive services such as displaying desired information in the video section by clicking button A corresponding to that information and obtaining, from the text data displayed in the bottom of the screen, the information associated with the matter displayed in the video section, for example.
  • If a television program carried by a transport stream transmitted by a digital television broadcast is recorded without change to a recording medium on the receiving side, the program can be recorded without its picture and audio qualities being deteriorated at all. However, in order to record as long a television program as possible to a recording medium having a limited recording capacity, while accepting a certain degree of picture quality deterioration, the received video stream must be decoded and then encoded again to lower the bit rate of the transport stream.
  • For example, the re-encoding of the video stream of a television program attached with multimedia coding data to lower its bit rate for recording may be implemented by sub-sampling the image to reduce the picture frame. However, this approach presents a problem of causing a mismatch in the relationship between the video stream resulting from re-encoding and the multimedia coding data. The following describes an example of this mismatch with reference to FIGS. 2A and 2B.
  • In the example shown in FIG. 2A, the sending side (the recording side) converts the original video picture frame to a smaller picture frame at the time of re-encoding. Therefore, as shown in FIG. 2B, changes occur on the receiving side (the reproducing side) in the video display size and position, resulting in a display screen which is different from the one intended by the sending side (the display screen that would be shown on the basis of the data before re-encoding).
  • SUMMARY OF THE INVENTION
  • According to an aspect of the invention, an image coding apparatus is provided. A selector receives a multiplexed transport stream that includes multimedia coding data. A demultiplexer separates a video stream from the multiplexed transport stream. A decoder reproduces the separated video stream as decoded video data. A coding generator receives multimedia information associated with the multimedia coding data and generates display control information. The display control information includes a mismatch flag which indicates whether a display mismatch condition exists between the video data and multimedia coding data. An output unit outputs the decoded video data, the multimedia coding data and the mismatch flag.
  • In accordance with this aspect of the invention, an encoder may be coupled to the decoder and may reproduce the video stream based on the multimedia information associated with the multimedia coding data and the video data. The output unit may comprise a writing unit that records the decoded video data, the multimedia coding data and the mismatch flag onto a recording medium. A coding controller may be coupled between the selector and the coding generator and may generate the multimedia information associated with the multimedia coding data. A data analyzer may be coupled between the selector and the coding controller and may detect at least a bit rate associated with the video stream. The display control information may include a re-encode flag which indicates whether the video data is re-encoded. The display control information may include a frame size change flag which indicates whether a size of a picture frame associated with the video data has been changed.
  • The foregoing aspects, features and advantages of the present invention will be further appreciated when considered with reference to the following detailed description and accompanying drawings.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • These and other objects of the invention will be seen by reference to the description, taken in connection with the accompanying drawings, in which:
  • FIGS. 1A and 1B are schematic diagrams illustrating a display screen to be shown on the basis of multimedia coding information;
  • FIGS. 2A and 2B are schematic diagrams illustrating a mismatch which takes place when a video stream is re-encoded;
  • FIG. 3 is a block diagram illustrating a recording apparatus practiced as one embodiment of the present invention;
  • FIGS. 4A and 4B illustrate the operation of a multiplexer shown in FIG. 3;
  • FIGS. 5A, 5B and 5C illustrate the processing by an arrival timestamp adding block;
  • FIG. 6 illustrates multimedia display sub-information;
  • FIG. 7 illustrates an example of ProgramInfo( ) syntax;
  • FIG. 8 illustrates an example of StreamCodingInfo( ) syntax;
  • FIG. 9 illustrates the meaning of stream_coding_type;
  • FIG. 10 illustrates the meaning of video_format;
  • FIG. 11 illustrates the meaning of frame_rate;
  • FIG. 12 illustrates the meaning of display_aspect_ratio;
  • FIG. 13 is a flowchart describing the processing of coding AV stream and multimedia display sub-information;
  • FIG. 14 is a flowchart describing the coding processing to be executed for restricting the re-encoding of a multiplexed stream video including multimedia coding data;
  • FIG. 15 illustrates an example of an input transport stream;
  • FIG. 16 illustrates an example of a transport stream after the re-encoding of the video stream shown in FIG. 15;
  • FIG. 17 is a flowchart describing a recording rate control process by a recording apparatus shown in FIG. 3;
  • FIG. 18 is a flowchart describing another recording rate control process by the recording apparatus shown in FIG. 3;
  • FIG. 19 illustrates another example of a transport stream resulting from the re-encoding of the video stream;
  • FIG. 20 illustrates another example of the input transport stream;
  • FIG. 21 is a block diagram illustrating a configuration of a reproducing apparatus practiced as one embodiment of the present invention;
  • FIGS. 22A and 22B illustrate a display screen to be shown when multimedia display sub-information is added;
  • FIG. 23 is a block diagram illustrating another configuration of the recording apparatus practiced as one embodiment of the present invention;
  • FIG. 24 is a flowchart describing the processing of reproducing an AV stream which uses multimedia display sub-information;
  • FIG. 25 is a block diagram illustrating another configuration of the reproducing apparatus practiced as one embodiment of the present invention; and
  • FIG. 26 illustrates recording media.
  • DETAILED DESCRIPTION
  • This invention will be described in further detail by way of example with reference to the accompanying drawings. Now, referring to FIG. 3, there is shown a block diagram illustrating an exemplary configuration of a recording apparatus 1 practiced as one embodiment of the invention. A transport stream received at an antenna, not shown, is input in a selector 10. A program number (a channel number) specified by the user is also input from a terminal 11 to the selector 10. Referring to the received program number, the selector 10 extracts the specified program from the received transport stream and outputs a partial transport stream. The partial transport stream is input in a demultiplexer 12 and an analyzing block 13.
  • The partial transport stream input in the demultiplexer 12 is separated into a video stream and other streams (audio, still picture, character graphics, and multimedia coding data, for example). The video stream thus obtained is output to a decoder 14. The other streams are output to a multiplexer 16. Along with the non-video transport packets, the demultiplexer 12 outputs to the multiplexer 16 timing information indicating when these packets occur in the input transport stream.
  • The decoder 14 applies a predetermined decoding scheme, for example, MPEG2 to the input video stream and outputs the decoded video data to an encoder 15. Also, the decoder 14 outputs the stream information about the video stream obtained at decoding to a coding controller 18.
  • On the other hand, the analyzing block 13 analyzes the input transport stream to obtain the stream information about the non-video streams, for example, a bit rate, and outputs it to the coding controller 18. The stream information about the non-video streams output from the analyzing block 13, the video stream information output from the decoder 14, and a stream recording bit rate output from a terminal 19 are input in the coding controller 18. From these data, the coding controller 18 sets the video data coding conditions (coding control information) to be used by the encoder 15 and outputs these coding conditions to the encoder 15 and a coding block 20.
  • As the bit rate to be allocated to video data encoding, the coding controller 18 uses the value obtained by subtracting the total bit rate of the non-video streams (the data input from the analyzing block 13) from the stream recording bit rate (the data input via the terminal 19 from a controller, not shown, which controls the operation of the recording apparatus 1, for example). The coding controller 18 sets coding control information such as the bit rate and picture frame such that an optimum picture quality can be achieved with the bit rate thus obtained and outputs this coding control information to the encoder 15 and the coding block 20. The details of the coding control information will be described later with reference to FIGS. 15 through 20.
  • When a stream is recorded to a recording medium with a fixed rate, this stream recording bit rate becomes the fixed rate; if a stream is recorded with a variable bit rate, this stream recording bit rate is a mean bit rate per predetermined time. However, the maximum value of the variable bit rate in this case needs to be lower than the maximum recording bit rate ensured by the recording medium concerned.
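  • As a small, purely illustrative check of the constraint just stated, the sketch below takes per-unit-time bit rates of a variable-rate recording, uses their mean as the stream recording bit rate, and verifies that the peak stays below the maximum recording bit rate ensured by the medium. The function name, the list representation, and the example numbers are assumptions.

```python
def stream_recording_bit_rate(unit_rates: list, medium_max_rate: float) -> float:
    """Return the mean bit rate per predetermined time, used as the stream
    recording bit rate for variable-rate recording, after checking that the
    variable rate never exceeds the maximum rate ensured by the recording medium."""
    if max(unit_rates) > medium_max_rate:
        raise ValueError("peak bit rate exceeds the maximum ensured by the medium")
    return sum(unit_rates) / len(unit_rates)

# Example with arbitrary numbers: per-unit-time rates in Mbps on a 36 Mbps medium.
print(stream_recording_bit_rate([8.0, 12.0, 10.0], 36.0))  # 10.0
```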
  • The encoder 15 encodes (on the basis of MPEG2, for example) the video data output from the decoder 14 on the basis of the coding control information output from the coding controller 18 and outputs the resultant video data to the multiplexer 16. The video stream from the encoder 15, the transport stream packets other than video from the demultiplexer 12, and the information about the occurrence timing of the transport stream packets other than video are input in the multiplexer 16. On the basis of the input occurrence timing information, the multiplexer 16 multiplexes the video stream with the transport stream packets, other than video, and outputs the result to the arrival timestamp adding block 17 as a transport stream.
  • FIGS. 4A and 4B schematically illustrate the above-mentioned processing to be executed by the multiplexer 16. FIG. 4A shows the timing of the input transport stream packets. In these figures, the cross-hatched portions indicate the video packets while the white portions indicate the stream packets other than video. As shown in FIG. 4A, the input transport stream packets are continuous; however, the data volume of the video data is reduced by the re-encoding of video data by the encoder 15. Consequently, the number of video packets is reduced.
  • As shown in FIG. 4B, the multiplexer 16 does not change the timing of the stream packets other than video but causes only the timing of the video packets to be different from the original state (shown in FIG. 4A).
  • As shown in FIGS. 5A, 5B and 5C, the arrival timestamp adding block 17 adds a header (TP_extra_header) including an arrival timestamp to each of the packets (FIG. 5A) of the input transport stream to generate a source packet (FIG. 5B), arranges the generated source packets continuously (FIG. 5C), and outputs them to a writing block 21. The arrival timestamp is information indicative of the timing with which the transport stream packets occur in a transport stream. The writing block 21 records the input source packet stream, which consists of continuous source packets, as a file to a recording medium 22. It should be noted that the recording medium 22 may be any type of recording medium.
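  • The following Python sketch illustrates how an arrival timestamp might be prepended to each 188-byte transport packet to form a source packet, and how the source packets are then arranged continuously. The 4-byte, big-endian TP_extra_header layout is an assumption made for illustration; the description above only states that a header containing an arrival timestamp is added to each packet.

```python
import struct

TS_PACKET_SIZE = 188  # data length of one transport packet

def make_source_packet(ts_packet: bytes, arrival_timestamp: int) -> bytes:
    """Prepend a TP_extra_header carrying the arrival timestamp to a TS packet.
    The 4-byte, big-endian header layout is an assumption for illustration."""
    if len(ts_packet) != TS_PACKET_SIZE:
        raise ValueError("transport packet must be 188 bytes")
    header = struct.pack(">I", arrival_timestamp & 0xFFFFFFFF)  # TP_extra_header
    return header + ts_packet

def make_source_packet_stream(packets_with_times) -> bytes:
    """Arrange the generated source packets continuously, as the writing block does."""
    return b"".join(make_source_packet(p, t) for p, t in packets_with_times)

# Example: two dummy packets with increasing arrival timestamps.
dummy = bytes([0x47]) + bytes(187)  # 0x47 is the transport packet sync byte
stream = make_source_packet_stream([(dummy, 1000), (dummy, 2500)])
print(len(stream))  # 2 * (4 + 188) = 384
```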
  • The information output from the coding block 20 is also input in the writing block 21. On the basis of the video coding information from the coding controller 18, the coding block 20 generates multimedia display sub-information and outputs it to the writing block 21. The multimedia display sub-information is information for keeping the video display position and display size on the multimedia plane unchanged from those of the image intended by the sending side (the image which would be displayed without re-encoding), even if the picture frame size has been changed by transcoding the video stream (decoding by the decoder 14 and then encoding by the encoder 15). This information is used at the time of reproduction in combination with the multimedia coding data.
  • The following describes the multimedia display sub-information more specifically. As shown in FIG. 6, the multimedia display sub-information consists of three flags, namely a mismatch flag (mismatch_MMinfo_flag), a re-encoded flag (Re_encoded_flag), and a frame size change flag (changed_frame_size_flag); data indicative of two sizes, namely an original horizontal size (original_horizontal_size) and an original vertical size (original_vertical_size); and an original screen aspect ratio (original_display_aspect_ratio).
  • The mismatch flag indicates whether there exists a mismatch in the relationship between video and multimedia coding data. The re-encoded flag indicates whether the video has been re-encoded at the time of recording. The frame size change flag indicates whether the picture frame of video has been changed by re-encoding, for example. The original horizontal size indicates the horizontal size of a picture frame before re-encoding. The original vertical size indicates the vertical size of a picture frame before re-encoding. The original screen aspect ratio indicates the aspect ratio of a frame screen before re-encoding.
  • It should be noted that the above-mentioned multimedia display sub-information is illustrative only. Therefore, information other than that shown in FIG. 6 may be included in, or part of the information shown in FIG. 6 may be excluded from, the multimedia display sub-information.
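  • A minimal Python sketch of the multimedia display sub-information of FIG. 6 is shown below. The field names follow the description; representing them as a plain dataclass rather than a bit-packed syntax, and the example values, are assumptions made for readability.

```python
from dataclasses import dataclass

@dataclass
class MultimediaDisplaySubInfo:
    mismatch_MMinfo_flag: bool          # mismatch between video and multimedia coding data
    Re_encoded_flag: bool               # video was re-encoded at recording time
    changed_frame_size_flag: bool       # picture frame was changed by the re-encoding
    original_horizontal_size: int       # horizontal size of the picture frame before re-encoding
    original_vertical_size: int         # vertical size of the picture frame before re-encoding
    original_display_aspect_ratio: str  # aspect ratio before re-encoding, e.g. "16:9"

# Example: a 1920x1080 source whose picture frame was reduced at recording time.
sub_info = MultimediaDisplaySubInfo(
    mismatch_MMinfo_flag=True,
    Re_encoded_flag=True,
    changed_frame_size_flag=True,
    original_horizontal_size=1920,
    original_vertical_size=1080,
    original_display_aspect_ratio="16:9",
)
```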
  • The following describes another example of the multimedia display sub-information. In the following example, the multimedia display sub-information is stored in a ProgramInfo( ) syntax shown in FIG. 7. The following describes the fields associated with the present invention in the ProgramInfo( ) syntax.
  • “length” indicates the number of bytes between the byte just after the length field and the last byte of ProgramInfo( ) inclusive.
  • “num_of_program_sequences” indicates the number of program sequences in the AV stream file. A sequence of source packets in the AV stream file over which the program contents specified by this format remain constant is referred to as a program sequence.
  • “SPN_program_sequences_start” indicates the address at which the program sequence starts in the AV stream file. It is expressed in units of source packet number, counted from an initial value of 0 at the first source packet of the AV stream file.
  • “program_map_PID” is the value of the PID of a transport packet having the PMT (Program Map Table) applicable to that program sequence.
  • “num_of_streams_in_ps” indicates the number of elementary streams defined in that program sequence.
  • “stream_PID” indicates the value of the PID for the elementary stream defined in the PMT which is referenced by the program map PID of that program sequence.
  • “StreamCodingInfo( )” indicates the information about the elementary stream indicated by the above-mentioned stream PID.
  • FIG. 8 shows the syntax of StreamCodingInfo( ). “length” indicates the number of bytes between the byte just after this length field and the last byte of StreamCodingInfo( ) inclusive.
  • “stream_coding_type” indicates the coding type of the elementary stream indicated by the stream PID for this StreamCodingInfo( ). The meanings of the individual types are shown in FIG. 9.
  • If the value of stream_coding_type is 0x02, it indicates that the elementary stream indicated by the stream PID is a video stream.
  • If the value of stream_coding_type is 0x0A, 0x0B, or 0x0D, it indicates that the elementary stream indicated by the stream PID is multimedia coding data.
  • If the value of stream_coding_type is 0x06, it indicates that the elementary stream indicated by the stream PID is subtitles or teletext.
  • “video_format” indicates the video format of a video stream indicated by the stream PID for this StreamCodingInfo( ). The meanings of the individual video formats are shown in FIG. 10.
  • In FIG. 10, 480i indicates video display of NTSC standard TV (interlaced frame of 720 pixels×480 lines). 576i indicates video display of PAL standard TV (interlaced frame of 720 pixels×576 lines). 480p indicates video display of a progressive frame of 720 pixels×480 lines. 1080i indicates video display of an interlaced frame of 1920 pixels×1080 lines. 720p indicates video display of a progressive frame of 1280 pixels×720 lines.
  • “frame_rate” indicates the frame rate of a video stream indicated by the stream PID for this StreamCodingInfo( ). The meanings of the individual frame rates are shown in FIG. 11.
  • “display_aspect_ratio” indicates the display aspect ratio of a video stream indicated by the stream PID for this StreamCodingInfo( ). The meanings of the individual display aspect ratios are shown in FIG. 12.
  • “original_video_format_flag” indicates whether the original video format and the original display aspect ratio exist in this StreamCodingInfo( ).
  • “original_video_format” indicates a video format before a video stream indicated by the stream PID for this StreamCodingInfo( ) is coded. The meanings of the individual original video formats are the same as shown in FIG. 10.
  • “original_display_aspect_ratio” indicates the display aspect ratio before a video stream indicated by the stream PID for this StreamCodingInfo( ) is coded. The meanings of the individual aspect ratios are the same as shown in FIG. 12.
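  • The Python sketch below models the StreamCodingInfo( ) fields that matter for the mismatch handling described next. The stream_coding_type values follow the meanings given above; everything else about the representation (a plain dataclass with optional original-format fields) is an assumption made for illustration.

```python
from dataclasses import dataclass
from typing import Optional

VIDEO_STREAM = 0x02
MULTIMEDIA_CODING_TYPES = {0x0A, 0x0B, 0x0D}
SUBTITLES_TELETEXT = 0x06

@dataclass
class StreamCodingInfo:
    stream_coding_type: int
    video_format: Optional[str] = None          # e.g. "1080i", "480i"
    frame_rate: Optional[float] = None
    display_aspect_ratio: Optional[str] = None  # e.g. "16:9"
    original_video_format_flag: bool = False
    original_video_format: Optional[str] = None
    original_display_aspect_ratio: Optional[str] = None

    def is_video(self) -> bool:
        return self.stream_coding_type == VIDEO_STREAM

    def is_multimedia_or_subtitles(self) -> bool:
        return (self.stream_coding_type in MULTIMEDIA_CODING_TYPES
                or self.stream_coding_type == SUBTITLES_TELETEXT)
```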
  • It is assumed that, in transcoding a transport stream with a multimedia data stream (a BML stream or subtitles) multiplexed along with a video stream, the re-encoding of the video stream changes its video format (for example, from 1080i to 480i), while the multimedia data stream retains its original stream contents. In this case, a mismatch in information may occur between the new video stream and the multimedia data stream. For example, although the parameters associated with the display of the multimedia data stream are determined on the supposition of the video format of the original video stream, the video format may be changed by the re-encoding of the video stream.
  • The video format of the re-encoded video stream is indicated by the video format and the display aspect ratio. The video format of the original video stream is indicated by the original video format and the original display aspect ratio.
  • If a mismatch exists between the values of the video format and the original video format and/or between the display aspect ratio and the original display aspect ratio, it indicates that a video format change has been caused by the video re-encoding at the time of recording.
  • If ProgramInfo( ) includes a stream PID whose stream coding type indicates multimedia coding data or subtitles, it indicates that multimedia data is multiplexed in the AV stream file (a transport stream).
  • If ProgramInfo( ) indicates that a video format change has been caused by the re-encoding of video at the time of recording and multimedia data is multiplexed in the AV stream file, then it is determined that a mismatch exists in display between the video stream (re-encoded) and the multimedia data (the original multimedia data) in the AV stream file.
  • In such a case, the information about the original video stream, namely the original video format and the original display aspect ratio, becomes effective. The reproducing apparatus generates a display screen from the above-mentioned new video stream and multimedia data stream as follows.
  • The video stream is up-sampled to a video format indicated by the original video format and the original display aspect ratio.
  • The up-sampled image and the multimedia data stream are synthesized to form a correct display screen.
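  • Continuing the illustrative StreamCodingInfo sketch above, the following function puts the two rules together: it reports whether a display mismatch exists for the AV stream file and, if so, which format the re-encoded video stream should be up-sampled to before being synthesized with the multimedia data. The function name and return convention are assumptions.

```python
from typing import Optional, Tuple, List

def find_display_mismatch(streams: List[StreamCodingInfo]) -> Optional[Tuple[str, str]]:
    """Return (original_video_format, original_display_aspect_ratio) when the
    re-encoded video must be up-sampled, or None when video and multimedia
    data still match."""
    has_multimedia = any(s.is_multimedia_or_subtitles() for s in streams)
    for s in streams:
        if not s.is_video() or not s.original_video_format_flag:
            continue
        format_changed = (s.video_format != s.original_video_format
                          or s.display_aspect_ratio != s.original_display_aspect_ratio)
        if has_multimedia and format_changed:
            # Up-sample to the original format so the multimedia layout stays valid.
            return s.original_video_format, s.original_display_aspect_ratio
    return None

# Example: 1080i content re-encoded to 480i with a BML stream multiplexed alongside.
video = StreamCodingInfo(VIDEO_STREAM, "480i", 29.97, "16:9", True, "1080i", "16:9")
bml = StreamCodingInfo(0x0B)
print(find_display_mismatch([video, bml]))  # ('1080i', '16:9')
```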
  • The multimedia display sub-information generated by the coding block 20 is recorded by the writing block 21 to the recording medium 22, but it is stored as a file which is different from the source packet stream file output from the arrival timestamp adding block 17. In this case, the coding block 20 outputs the multimedia display sub-information in the form of such a file.
  • FIG. 13 is a flowchart describing the processing of coding an AV stream and multimedia display sub-information.
  • In step 50, a multiplexed stream including multimedia coding data is input in the recording apparatus 1.
  • In step 51, the demultiplexer 12 separates the video stream from the multiplexed stream.
  • In step 52, the encoder 15 re-encodes the video stream decoded by the decoder 14.
  • In step 53, the multiplexer 16 multiplexes the above-mentioned video stream and multimedia coding data to generate a multiplexed stream.
  • In step 54, the coding block 20 generates multimedia display sub-information.
  • In the above description, the coding controller 18 generates the coding control information, including the bit rate and picture frame, on the basis of the input data. Alternatively, the coding controller 18 may generate the following coding control information. Namely, if the analyzing block 13 finds that the input transport stream includes multimedia coding data, then, when encoding is executed by the encoder 15, the coding controller 18 may generate coding control information instructing the encoder 15 to execute the re-encoding with a picture frame of the same size as that of the original video (the picture frame before re-encoding), and output the generated coding control information to the encoder 15.
  • When the above-mentioned method is used, the encoder 15 re-encodes the video data supplied from the decoder 14, on the basis of the input coding control information, with a picture frame of the same size as that of the original video stream. If such coding control information is generated and the re-encoding is executed on the basis of it, no picture frame change is caused by the re-encoding, thereby preventing a mismatch from occurring in the relationship between the video stream obtained by re-encoding and the multimedia coding data.
  • Still alternatively, the coding controller 18 may generate the following coding control information. Namely, if the analyzing block 13 finds that the input transport stream includes multimedia coding data, then, when encoding is executed by the encoder 15, the coding controller 18 may generate coding control information instructing the encoder 15 to execute the re-encoding under the same conditions as the video format (shown in FIG. 10) and screen aspect ratio (shown in FIG. 12) of the original video, and output the coding control information to the encoder 15.
  • When the above-mentioned method is used, the encoder 15 re-encodes the video supplied from the decoder 14 under the same conditions as the video format (shown in FIG. 10) and screen aspect ratio (shown in FIG. 12) of the original video on the basis of the input coding control information. If such coding control information is generated and the re-encoding is executed on the basis of the coding control information, no video format and no screen aspect ratio change is caused by the re-encoding, thereby preventing a mismatch from occurring in the relationship between the video stream obtained by re-encoding and the multimedia coding data.
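  • The two alternatives just described can be summarized in a small sketch: when the analyzing block reports that multimedia coding data is present, the coding control information pins the picture frame, video format, and aspect ratio to the original values. The function and dictionary keys below are assumptions made for illustration, not the actual control interface of the coding controller 18.

```python
def make_coding_control_info(has_multimedia_data: bool,
                             original_frame: tuple,
                             original_video_format: str,
                             original_aspect_ratio: str,
                             target_bit_rate: int) -> dict:
    """Build illustrative coding control information for the encoder.
    When multimedia coding data is multiplexed in the input stream, the display
    format of the video is kept unchanged so that no mismatch can arise."""
    info = {"bit_rate": target_bit_rate}
    if has_multimedia_data:
        info["picture_frame"] = original_frame            # keep e.g. (720, 480)
        info["video_format"] = original_video_format      # keep e.g. "480i"
        info["display_aspect_ratio"] = original_aspect_ratio
        info["allow_frame_size_change"] = False
    else:
        info["allow_frame_size_change"] = True
    return info

print(make_coding_control_info(True, (720, 480), "480i", "4:3", 4_000_000))
```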
  • FIG. 14 is a flowchart describing the coding for restricting the re-encoding of the video of a multiplexed stream including multimedia coding data.
  • In step 70, a multiplexed stream is input in the recording apparatus 1.
  • In step 71, the demultiplexer 12 separates the video stream from the multiplexed stream.
  • In step 72, the analyzing block 13 checks whether multimedia coding data is included in the multiplexed stream. If the multimedia coding data is included, coding control information instructing the encoder 15 to re-encode the video stream without changing its display format is supplied to the encoder 15. On the basis of the supplied coding control information, the encoder 15 re-encodes the video stream.
  • In step 73, the multiplexer 16 generates a multiplexed stream including the above-mentioned video stream.
  • With reference to FIGS. 15 through 20, the following describes one example of control to be executed on the basis of the coding control information.
  • It is assumed here that a transport stream to be input to the selector 10 has a constant bit rate RI as shown in FIG. 15, for example. The video stream and the non-video streams are coded by variable bit rates. In the example shown in FIG. 15, in unit time (for example, GOP) A, the bit rate of the video stream is RVA and the bit rate of non-video streams is ROA. In unit time B, the bit rate of the video stream is RVB and the bit rate of non-video streams is ROB. In unit time C, the bit rate of the video stream is RVC and the bit rate of non-video streams is ROC.
  • If the transport stream as shown in FIG. 15 is re-encoded to output the transport stream having fixed bit rate S (S<RI) as shown in FIG. 16 from the multiplexer 16, the coding controller 18 executes the processing described by the flowchart shown in FIG. 17.
  • First, in step S1, the coding controller 18 sets the bit rate S (the recording rate) of the transport stream to be output from the multiplexer 16 on the basis of a control signal input from a controller, not shown, via the terminal 19. Next, in step S2, the coding controller 18 determines the non-video streams to be recorded and computes the maximum total value D of the bit rates of the determined streams.
  • The maximum value D is determined from the stream specification of the input transport stream. For example, if two audio streams are to be recorded in addition to the video stream, the maximum value D is 384×2 Kbps since the maximum value of the bit rate of one audio stream is 384 Kbps according to the Japanese digital BS broadcast stream specification.
  • In step S3, the coding controller 18 uses value C obtained by subtracting the maximum value D computed in step S2 from the recording bit rate set in step S1 (C=S−D), as a bit rate to be allocated to the re-encoding of the video data. In step S4, the coding controller 18 analyzes the coding information such as the video stream bit rate and picture frame from the video stream information output from the decoder 14.
  • In step S5, the coding controller 18 determines, on the basis of the value C computed in step S3 and the video stream coding information analyzed in step S4, a video coding parameter (video coding control information) such that an optimum picture quality is achieved.
  • For example, in the example shown in FIG. 16, value S is ½ of value RI. In the present example, the bit rate of streams other than video is the maximum value D, which is used without change as the bit rate of the non-video streams in the multiplexed stream after re-encoding.
  • Then, video coding parameters are determined such that an optimum picture quality can be achieved within the range of (S−D). If the picture frame is controlled, the horizontal direction of a picture frame of 720×480 pixels, for example, is sub-sampled by ½ into 360×480 pixels. The determined coding parameters (bit rate and picture frame) are supplied to the encoder 15 as video coding control information.
  • In step S6, on the basis of the video coding control information supplied from the coding controller 18, the encoder 15 re-encodes the video data of the unit time currently being processed (in this example, unit time A). In the example shown in FIG. 16, the actual bit rate ROA is smaller than the maximum value D in unit time A; however, since the maximum value D is fixed, the video allocated bit rate becomes (S−D). A wasted portion Rsa, which cannot be used for video coding, occurs because the maximum value D is fixed. This wasted portion is filled with stuffing bits.
  • In step S7, the coding controller 18 determines whether there remains any stream to be re-encoded. If any streams remain to be re-encoded, the procedure returns to step S4 to repeat the above-mentioned processes.
  • If, in step S7, no more streams remain to be re-encoded, this processing comes to an end.
  • Thus, in the example shown in FIG. 16, in unit time B, the bit rate of non-video streams also is D and the video stream allocated bit rate is S−D because it is fixed. Stuffing bits are inserted in value Rsb (Rsb=S−(S−D)−ROB=D−ROB).
  • In unit time C, too, the bit rate of non-video streams is D and the video stream allocated bit rate is S−D. It should be noted that, in unit time C, D=ROC, so that no stuffing bits exist.
  • Thus, in the example shown in FIG. 16, the video stream is coded with a fixed bit rate.
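  • The fixed-allocation scheme of FIG. 17 reduces to simple arithmetic: the video bit rate is pinned to S−D in every unit time, and whatever the non-video streams do not actually use is filled with stuffing bits. The Python sketch below renders that arithmetic; the unit of measurement and the example numbers are assumptions.

```python
def fixed_allocation(S: float, D: float, non_video_rates: list) -> list:
    """Per unit time: video gets S - D; stuffing fills D - R_O
    (non-negative because D is the maximum non-video bit rate)."""
    video_rate = S - D  # fixed for every unit time
    allocations = []
    for r_o in non_video_rates:  # actual non-video bit rate in each unit time
        stuffing = D - r_o       # wasted portion, filled with stuffing bits
        allocations.append({"video": video_rate, "non_video": r_o, "stuffing": stuffing})
    return allocations

# Example (arbitrary numbers): S = 10 Mbps, D = 1 Mbps, and non-video usage of
# 0.4, 0.7 and 1.0 Mbps in unit times A, B and C respectively.
for unit in fixed_allocation(10.0, 1.0, [0.4, 0.7, 1.0]):
    print(unit)
# Unit time C uses the full D, so its stuffing is 0, as in the text above.
```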
  • FIG. 18 is a flowchart describing a processing example in which the video re-encoding allocated bit rate is variable. First, in step S21, the coding controller 18 sets recording rate S on the basis of the information supplied via the terminal 19. Next, in step S22, the coding controller 18 analyzes the coding information of the video stream on the basis of the video stream information supplied from the decoder 14. The processes of steps S21 and S22 are the same as those of steps S1 and S4 of FIG. 17.
  • In step S23, the coding controller 18 computes, from the output of the analyzing block 13, the total bit rate B in each unit time of non-video streams.
  • In step S24, the coding controller 18 uses, as the video re-encoding allocated bit rate, value C (C=S−B) obtained by subtracting value B obtained in step S23 from value S set in step S21.
  • In step S25, the coding controller 18 determines, on the basis of value C obtained in step S24 and a result of analysis of the video stream coding information obtained in step S22, video coding parameters such that an optimum picture quality is obtained. The determined coding parameters are output to the encoder 15.
  • In step S26, the encoder 15 re-encodes the video data of the current unit time on the basis of the coding parameters determined in step S25. Consequently, as shown in FIG. 19, for example, after allocation of Roa (=ROA) as the bit rate in unit time of non-video streams, the bit rate of the video stream is set to bit rate Rva specified by (S−Roa).
  • In step S27, the coding controller 18 determines whether any streams remain to be processed. If any streams remain to be processed, the procedure returns to step S22 to repeat the above-mentioned processes. If no more streams remain to be processed, this processing comes to an end.
  • Thus, in unit time B, after allocation of bit rate Rob (=ROB) of non-video streams, the remaining Rvb (=S−Rob) is the bit rate of the video stream. In unit time C, the bit rate of the video stream is set to Rvc (=S−ROC), excluding bit rate Roc of non-video streams.
  • Thus, in the present processing example, the bit rate of the video stream is variable and, therefore, no stuffing bit is needed or the number of stuffing bits can be reduced, thereby coding the video stream more efficiently.
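  • For comparison, the variable-allocation scheme of FIG. 18 gives the video stream whatever remains after the non-video streams' measured bit rate B in each unit time, so the stuffing disappears. Again, the representation and the example numbers are assumptions made for illustration.

```python
def variable_allocation(S: float, non_video_rates: list) -> list:
    """Per unit time: video gets C = S - B, where B is the measured non-video bit rate."""
    return [{"video": S - b, "non_video": b, "stuffing": 0.0} for b in non_video_rates]

# Same example numbers as before: the video bit rate now varies from unit time to unit time.
for unit in variable_allocation(10.0, [0.4, 0.7, 1.0]):
    print(unit)
```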
  • In the above, the input transport stream has a fixed bit rate. The present invention also is applicable to an example in which the bit rate of the input transport stream is variable as shown in FIG. 20.
  • Consequently, a transport stream of longer content can be recorded to the recording medium 22 at a lower bit rate as required.
  • In addition, the above-mentioned novel embodiment prevents the qualities of audio data, still picture and character graphics data, multimedia coding data, and other non-video data from being conspicuously deteriorated. The non-video data is basically smaller in data volume than video data, so that reducing the bit rate of the non-video data in the same ratio as the bit rate of video data makes the effects on the non-video data relatively greater than those on video data. The novel embodiment can prevent these effects from being caused.
  • The following describes the reproduction of a source packet stream file recorded on the recording medium 22. Referring to FIG. 21, there is shown a block diagram illustrating the configuration of a reproducing apparatus practiced as one embodiment of the invention. A source packet stream file recorded on the recording medium 22 is read by a reading block 31. The reading block 31 also reads multimedia display sub-information recorded on the recording medium 22 as a file separate from the source packet stream file.
  • The source packet stream read by the reading block 31 is output to an arrival timestamp separating block 32 and the multimedia display sub-information is output to a synthesizing block 36. The arrival timestamp separating block 32 incorporates a reference clock. The arrival timestamp separating block 32 compares the reference clock with the value of the arrival timestamp added to each source packet of the input source packet stream and, when a match is found, removes the arrival timestamp from the source packet having the matching arrival timestamp, outputting the resultant packet to a demultiplexer 33 as a transport stream packet.
  • The demultiplexer 33 separates the input transport stream into a video/audio stream and data streams such as multimedia coding data, character graphics, text, and still picture. Of these separated data, the video/audio stream is output to an AV decoder 34, the multimedia coding data is output to the synthesizing block 36, and the data stream such as character graphics, text, and still picture is output to a character graphics/still picture decoder 35.
  • The AV decoder 34 separates the input video/audio stream into video data and audio data, decodes each data, and outputs the decoded audio data to an audio reproducing device, not shown, and the decoded video data to the synthesizing block 36. The character graphics/still picture decoder 35 decodes the input data stream, such as character graphics, text, and still picture, and outputs the decoded character graphics data, text data, and still picture data to the synthesizing block 36.
  • The video data from the AV decoder 34, the multimedia coding data from the demultiplexer 33, the data from the character graphics/still picture decoder 35, and the multimedia display sub-information from the reading block 31 are input in the synthesizing block 36. Checking the mismatch flag (FIG. 6) of the input multimedia display sub-information, the synthesizing block 36 determines whether a mismatch exists in the relationship between the input video signal and the multimedia coding data.
  • If a mismatch exists between the value of video format and the value of original video format shown in FIG. 8 and/or a mismatch exists between the value of display aspect ratio and the value of original display aspect ratio, the synthesizing block 36 determines that a video format change has been caused by the video re-encoding at the time of recording, detecting a mismatch in the relationship between the input video signal and the multimedia coding data. If no mismatch exists between the value of video format and the value of original video format and no mismatch exists between the value of display aspect ratio and the value of original display aspect ratio, the synthesizing block 36 determines that no mismatch exists in the relationship between the input video signal and the multimedia coding data.
  • If a mismatch is found in the relationship between the input video signal and the multimedia coding data, the synthesizing block 36 further references the original horizontal size and original vertical size of the multimedia display sub-information, or references the original video format and the original display aspect ratio. Then, the synthesizing block 36 scale-converts the input video signal so that it can be displayed in a frame of the referenced size. On the basis of the multimedia coding data, the synthesizing block 36 synthesizes the scale-converted video signal and the data such as character graphics on a multimedia plane and outputs the result to a television receiver, not shown, which serves as a display device.
  • On the other hand, if no mismatch is found in the relationship between the input video signal and the multimedia coding data, the synthesizing block 36 synthesizes the input video signal with other data on a multimedia plane without scale conversion and outputs the synthesized data.
  • Thus, recording the multimedia display sub-information and using it at the time of reproduction allow the receiving side to display a screen as intended on the sending side. Referring to FIGS. 22A and 22B, if the re-encoding on the sending side (the recording side) results in a smaller video picture frame than the original, the size reduction is recorded as multimedia display sub-information, which is referenced at the time of reproduction. Consequently, because no mismatch exists between the video data and the other data, the receiving side (the reproduction side) can display the same screen as the original.
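  • The decision made by the synthesizing block 36 can be sketched as follows, reusing the illustrative MultimediaDisplaySubInfo class (and the sub_info example) from the earlier sketch. The scale_convert helper is a stand-in for whatever scaler the apparatus actually uses; its name, signature, and the dictionary-based plane are assumptions.

```python
def scale_convert(video_frame, width: int, height: int) -> dict:
    """Placeholder scaler: resize the decoded frame to width x height (assumed helper)."""
    return {"frame": video_frame, "width": width, "height": height}

def synthesize(video_frame, sub_info, multimedia_plane: dict) -> dict:
    """Sketch of the synthesizing block: scale-convert only when a mismatch is flagged."""
    if sub_info.mismatch_MMinfo_flag:
        video_frame = scale_convert(video_frame,
                                    sub_info.original_horizontal_size,
                                    sub_info.original_vertical_size)
    # Place the (possibly rescaled) video together with the character graphics and
    # text on the multimedia plane according to the multimedia coding data.
    multimedia_plane["video"] = video_frame
    return multimedia_plane

plane = {"buttons": ["A", "B", "C"], "text": "XYZABC..."}
print(synthesize("decoded_frame", sub_info, plane))  # sub_info from the earlier sketch
```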
  • FIG. 24 is a flowchart describing AV stream reproduction processing which uses multimedia display sub-information.
  • In step S60, a multiplexed stream including multimedia coding data is read from a recording medium and input in a reproduction device.
  • In step S61, multimedia display sub-information is input. This information is read from the recording medium in the case of the reproducing device shown in FIG. 21; in the case of the reproducing device shown in FIG. 25, this information is separated from the multiplexed stream.
  • In step S62, a video stream is separated from the multiplexed stream.
  • In step S63, the video stream is decoded.
  • In step S64, if a mismatch exists between the video data and the multimedia coding data, the synthesizing block 36 scale-converts the video data on the basis of the multimedia display sub-information.
  • In step S65, the synthesizing block 36 synthesizes the processed image and the multimedia data to generate a display image.
  • As described, the multimedia display sub-information may be recorded on the recording medium 22 as a file which is different from the source packet stream file containing the character graphics data and video signals. Alternatively, the multimedia display sub-information may be embedded in a source packet stream file and then recorded on the recording medium 22. FIG. 23 shows the configuration of the recording apparatus 1 in which the multimedia display sub-information is embedded in a source packet stream file.
  • Compared with the configuration shown in FIG. 3, the recording apparatus 1 shown in FIG. 23 supplies the multimedia display sub-information output from the coding block 20 to the multiplexer 16. The multiplexer 16 then generates a transport packet from the input multimedia display sub-information and embeds it into the source packet stream file, outputting the same to the arrival timestamp adding block 17. Instead of embedding the multimedia display sub-information into a source packet stream file as a transport packet, the multimedia display sub-information may be written to a user data area in an MPEG video stream. A rough sketch of this embedding follows.
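  • The sketch below illustrates, under stated assumptions only, how such sub-information could be packed into the payload of a single 188-byte transport packet. The private PID value and the field layout are invented for illustration, and a real multiplexer would normally wrap the data in a PES packet or private section rather than writing a raw payload directly.

    import struct

    SUB_INFO_PID = 0x1FFE  # assumed private PID for the sub-information packet

    def pack_sub_info_packet(info, continuity_counter=0):
        # Illustrative payload: six 16-bit fields of the multimedia display
        # sub-information (video format, aspect ratio, and original frame size).
        payload = struct.pack(">6H",
                              info.video_format,
                              info.original_video_format,
                              info.display_aspect_ratio,
                              info.original_display_aspect_ratio,
                              info.original_horizontal_size,
                              info.original_vertical_size)
        # 4-byte transport packet header (simplified: no adaptation field,
        # no scrambling): sync byte, PID with payload_unit_start_indicator set,
        # and a payload-only continuity counter.
        header = bytes([0x47,
                        0x40 | ((SUB_INFO_PID >> 8) & 0x1F),
                        SUB_INFO_PID & 0xFF,
                        0x10 | (continuity_counter & 0x0F)])
        packet = header + payload
        return packet + b"\xff" * (188 - len(packet))  # stuff to 188 bytes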
  • In the present embodiment of the invention, video data may be re-encoded by methods other than that described above; for example, an input video stream may be converted in the DCT domain to change coding parameters such as the picture frame size.
  • FIG. 25 shows the configuration of the reproducing apparatus 30 used when the multimedia display sub-information is embedded in the source packet stream file recorded on the recording medium 22. Compared with the configuration shown in FIG. 21, the reproducing apparatus shown in FIG. 25 reads only the source packet stream through the reading block 31. The source packet stream read by the reading block 31 is input to the demultiplexer 33 via the arrival timestamp separating block 32.
  • The demultiplexer 33 extracts the multimedia display sub-information from the input source packet stream file and outputs the extracted information to the synthesizing block 36. The further processing is the same as that of the configuration shown in FIG. 5.
  • Thus, if the multimedia display sub-information is recorded as embedded in a source packet stream file, the receiving side can also obtain the video picture size and display position intended by the sending side.
  • In the present embodiment of the invention, a transport stream was used as an example. The present invention also is applicable to multiplexed streams such as a program stream.
  • The above-described sequence of processing operations can be executed by hardware as well as software. In the software approach, the recording apparatus 1 (and the reproducing apparatus 30) is constituted by a personal computer as shown in FIG. 26.
  • Referring to FIG. 26, a CPU (Central Processing Unit) 101 executes various processing operations as instructed by programs stored in a ROM (Read Only Memory) 102 or loaded from a storage block 108 into a RAM (Random Access Memory) 103. The RAM 103 also stores, as required, the data necessary for the CPU 101 to execute various processing operations.
  • The CPU 101, the ROM 102, and the RAM 103 are interconnected via a bus 104. The bus 104 also is connected to an input/output interface 105.
  • The input/output interface 105 is connected to an input block 106, such as a keyboard and a mouse; an output block 107, such as a display device (a CRT or an LCD) and a speaker; a storage block 108, such as a hard disk; and a communication block 109, such as a modem or a terminal adapter. The communication block 109 executes communication processing via a network.
  • The input/output interface 105 also is connected to a drive 110, as required, in which a magnetic disc 121, an optical disc 122, a magneto-optical disc 123, or a semiconductor memory 124 is loaded. Computer programs read from these storage media are installed in the storage block 108 as required.
  • Executing the above sequence of processing operations by software requires either a computer built into dedicated hardware in which the programs constituting the software are stored beforehand, or a general-purpose computer in which these programs are installed, as required, from a recording medium.
  • As shown in FIG. 26, the program recording medium for storing computer-readable and executable programs may be a package medium which is distributed to users to provide the programs and which is embodied by the magnetic disk 121 (including a floppy disk), the optical disc 122 (including a CD-ROM (Compact Disc-Read Only Memory) and a DVD (Digital Versatile Disc)), the magneto-optical disk 123 (including an MD (Mini Disc)), or the semiconductor memory 124; alternatively, it may be the ROM 102 or a hard disk which is preinstalled in a personal computer, provided to users, and on which the programs are stored temporarily or permanently.
  • It should be noted that the steps describing the programs to be stored on the program storage medium may be executed not only in a time-dependent manner in the order described, but also in parallel or individually.
  • As described, and according to the first image coding apparatus and method and the program stored in the first recording medium, a video stream is separated from a multiplexed stream containing multimedia coding data, a predetermined conversion process is performed on the separated video stream, and additional information is generated which indicates that a mismatch will occur when the converted video stream is displayed on the basis of the multimedia coding data.
  • The first recording medium stores the converted video stream, the multimedia coding data, and the additional information indicating that a mismatch will occur when the converted video stream is displayed on the basis of the above-mentioned multimedia coding data.
  • Consequently, in any case, the reproducing side can prevent a mismatch from occurring between the video stream and the multimedia coding data.
  • As described and according to the image decoding apparatus and method and the program stored in the second recording medium, a video stream is separated from an input multiplexed stream and the separated video stream is decoded. On the basis of additional information indicating that a mismatch will occur when the decoded video stream is displayed on the basis of the multimedia coding data, a predetermined conversion process is performed on the decoded video stream. This novel configuration prevents the mismatch from occurring between the video stream and the multimedia coding data.
  • As described and according to the second image coding apparatus and method and the program stored in the third recording medium, a video stream is separated from an input multiplexed stream, the input multiplexed stream is checked for multimedia coding data and, if the multimedia coding data is found, coding control information for giving an instruction not to change the display format of the separated video stream is generated, and a predetermined conversion process is performed on the separated video stream on the basis of the generated coding control information.
  • The second recording medium also stores the above-mentioned coding control information giving an instruction not to change the display format of a video stream, and a multiplexed stream containing the video stream on which a predetermined conversion process has been performed on the basis of the coding control information.
  • Consequently, in any case, the reproduction side can prevent a mismatch from occurring between the video stream and the multimedia coding data.
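  • As a final hedged illustration of the coding control information summarized above, the sketch below generates an instruction not to change the display format whenever multimedia coding data is detected. CodingControlInfo and contains_multimedia_coding_data are illustrative names, not structures or interfaces defined by the patent.

    from dataclasses import dataclass
    from typing import Callable, Optional

    @dataclass
    class CodingControlInfo:
        # Instruction to the re-encoder not to alter the display format
        # (picture frame size and display aspect ratio) of the video stream.
        keep_display_format: bool
        target_bit_rate: Optional[int] = None  # other parameters may still change

    def make_coding_control(multiplexed_stream,
                            contains_multimedia_coding_data: Callable[[object], bool],
                            target_bit_rate: Optional[int] = None) -> CodingControlInfo:
        # contains_multimedia_coding_data is an assumed detector, for example a
        # check of the program-specific information for a data-broadcast component.
        if contains_multimedia_coding_data(multiplexed_stream):
            # Character graphics are laid out against the original picture frame,
            # so the display format must be preserved during re-encoding.
            return CodingControlInfo(keep_display_format=True,
                                     target_bit_rate=target_bit_rate)
        return CodingControlInfo(keep_display_format=False,
                                 target_bit_rate=target_bit_rate)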
  • Although the invention herein has been described with reference to particular embodiments, it is to be understood that these embodiments are merely illustrative of the principles and applications of the present invention. It is therefore to be understood that numerous modifications may be made to the illustrative embodiments and that other arrangements may be devised without departing from the spirit and scope of the present invention as defined by the appended claims.

Claims (21)

1. An image coding apparatus, comprising:
a selector operable to receive a multiplexed transport stream that includes multimedia coding data;
a demultiplexer operable to separate a video stream from the multiplexed transport stream;
a decoder operable to reproduce the separated video stream as decoded video data;
a coding generator operable to receive multimedia information associated with the multimedia coding data and to generate display control information, the display control information including a mismatch flag which indicates whether a display mismatch condition exists between the video data and multimedia coding data; and
an output unit operable to output the decoded video data, the multimedia coding data and the mismatch flag.
2. The image coding apparatus of claim 1, further comprising an encoder which is coupled to said decoder and which is operable to reproduce the video stream based on the multimedia information associated with the multimedia coding data and the video data.
3. The image coding apparatus of claim 1, wherein said output unit comprises a writing unit operable to record the decoded video data, the multimedia coding data and the mismatch flag onto a recording medium.
4. The image coding apparatus of claim 1, further comprising a coding controller which is coupled between said selector and said coding generator and which is operable to generate the multimedia information associated with the multimedia coding data.
5. The image coding apparatus of claim 1, further comprising a data analyzer which is coupled between said selector and said coding controller and which is operable to detect at least a bit rate associated with the video stream.
6. The image coding apparatus of claim 1, wherein the display control information includes a re-encode flag which indicates whether the video data is re-encoded.
7. The image coding apparatus of claim 1, wherein the display control information includes a frame size change flag which indicates whether a size of a picture frame associated with the video data has been changed.
8. An image coding method, comprising:
receiving a multiplexed transport stream that includes multimedia coding data;
separating a video stream from the multiplexed transport stream;
reproducing the separated video stream as decoded video data;
receiving multimedia information associated with the multimedia coding data;
generating display control information that includes a mismatch flag which indicates whether a display mismatch condition exists between the video data and multimedia coding data; and
outputting the decoded video data, the multimedia coding data and the mismatch flag.
9. The image coding method of claim 8, further comprising: reproducing the video stream based on the multimedia information associated with the multimedia coding data and the video data.
10. The image coding method of claim 8, wherein said outputting step includes writing the decoded video data, the multimedia coding data and the mismatch flag onto a recording medium.
11. The image coding method of claim 8, further comprising: generating the multimedia information associated with the multimedia coding data.
12. The image coding method of claim 8, further comprising: detecting at least a bit rate associated with the video stream.
13. The image coding method of claim 8, wherein the display control information includes a re-encode flag which indicates whether the video data is re-encoded.
14. The image coding method of claim 8, wherein the display control information includes a frame size change flag which indicates whether a size of a picture frame associated with the video data has been changed.
15. A computer-readable medium having recorded instructions for carrying out an image coding method, said method comprising:
receiving a multiplexed transport stream that includes multimedia coding data;
separating a video stream from the multiplexed transport stream;
reproducing the separated video stream as decoded video data;
receiving multimedia information associated with the multimedia coding data;
generating display control information that includes a mismatch flag which indicates whether a display mismatch condition exists between the video data and multimedia coding data; and
outputting the decoded video data, the multimedia coding data and the mismatch flag.
16. The computer-readable medium of claim 15, wherein said image coding method further comprises: reproducing the video stream based on the multimedia information associated with the multimedia coding data and the video data.
17. The computer-readable medium of claim 15, wherein said outputting step includes writing the decoded video data, the multimedia coding data and the mismatch flag onto a recording medium.
18. The computer-readable medium of claim 15, wherein said image coding method further comprises: generating the multimedia information associated with the multimedia coding data.
19. The computer-readable medium of claim 15, wherein said image coding method further comprises: detecting at least a bit rate associated with the video stream.
20. The computer-readable medium of claim 15, wherein the display control information includes a re-encode flag which indicates whether the video data is re-encoded.
21. The computer-readable medium of claim 15, wherein the display control information includes a frame size change flag which indicates whether a size of a picture frame associated with the video data has been changed.
US11/593,388 2000-06-02 2006-11-06 Apparatus and method for image coding and decoding Abandoned US20070053665A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US11/593,388 US20070053665A1 (en) 2000-06-02 2006-11-06 Apparatus and method for image coding and decoding

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
JPP2000-165298 2000-06-02
JP2000165298 2000-06-02
JP2001001031 2001-01-09
JPP2001-001031 2001-01-09
US09/872,147 US7224890B2 (en) 2000-06-02 2001-06-01 Apparatus and method for image coding and decoding
US11/593,388 US20070053665A1 (en) 2000-06-02 2006-11-06 Apparatus and method for image coding and decoding

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US09/872,147 Division US7224890B2 (en) 2000-06-02 2001-06-01 Apparatus and method for image coding and decoding

Publications (1)

Publication Number Publication Date
US20070053665A1 true US20070053665A1 (en) 2007-03-08

Family

ID=26593196

Family Applications (5)

Application Number Title Priority Date Filing Date
US09/872,147 Expired - Fee Related US7224890B2 (en) 2000-06-02 2001-06-01 Apparatus and method for image coding and decoding
US11/593,388 Abandoned US20070053665A1 (en) 2000-06-02 2006-11-06 Apparatus and method for image coding and decoding
US11/708,774 Expired - Fee Related US8625958B2 (en) 2000-06-02 2007-02-21 Apparatus and method for image coding and decoding
US11/717,442 Expired - Fee Related US8625959B2 (en) 2000-06-02 2007-03-13 Apparatus and method for image coding and decoding
US11/717,443 Expired - Fee Related US8644672B2 (en) 2000-06-02 2007-03-13 Apparatus and method for image coding and decoding

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US09/872,147 Expired - Fee Related US7224890B2 (en) 2000-06-02 2001-06-01 Apparatus and method for image coding and decoding

Family Applications After (3)

Application Number Title Priority Date Filing Date
US11/708,774 Expired - Fee Related US8625958B2 (en) 2000-06-02 2007-02-21 Apparatus and method for image coding and decoding
US11/717,442 Expired - Fee Related US8625959B2 (en) 2000-06-02 2007-03-13 Apparatus and method for image coding and decoding
US11/717,443 Expired - Fee Related US8644672B2 (en) 2000-06-02 2007-03-13 Apparatus and method for image coding and decoding

Country Status (6)

Country Link
US (5) US7224890B2 (en)
EP (2) EP2262265B1 (en)
JP (1) JP5516488B2 (en)
KR (1) KR100827887B1 (en)
CN (1) CN1174607C (en)
TW (1) TW519840B (en)

Families Citing this family (66)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6808709B1 (en) * 1994-12-30 2004-10-26 The Regents Of The University Of California Immunoglobulins containing protection proteins and their use
KR100448452B1 (en) * 2000-06-09 2004-09-13 엘지전자 주식회사 Method for supporting menu of a high-density recording medium
DE10055937A1 (en) * 2000-11-10 2002-05-23 Grundig Ag Device for recording coded digital audiovisual data determines if residual storage capacity in storage medium is adequate for program data of program contribution based on mean data rate
US7577333B2 (en) * 2001-08-04 2009-08-18 Samsung Electronics Co., Ltd. Method and apparatus for recording and reproducing video data, and information storage medium in which video data is recorded by the same
KR100888587B1 (en) * 2001-09-27 2009-03-16 삼성전자주식회사 Method and apparatus for recording and reproducing video data and information storage medium thereby
TW578083B (en) * 2001-10-25 2004-03-01 Samsung Electronics Co Ltd Storage medium adaptable to changes in screen aspect ratio and reproducing method thereof
FR2840424B1 (en) * 2002-05-30 2004-09-03 Thomson Licensing Sa MULTIMEDIA DATA FRAGMENTATION METHOD AND DEVICE
CN101350214B (en) 2002-06-24 2015-07-01 Lg电子株式会社 Method and device for recording and reproducing data structure of reproduction for video data
KR20040000290A (en) 2002-06-24 2004-01-03 엘지전자 주식회사 Method for managing multi-path data stream of high density optical disc
US20040033821A1 (en) * 2002-08-16 2004-02-19 Visteon Global Technologies, Inc. In-vehicle entertainment system
CA2462198C (en) * 2002-09-05 2012-11-06 Lg Electronics Inc. Recording medium having data structure for managing reproduction of still images recorded thereon and recording and reproducing methods and apparatuses
KR100684411B1 (en) * 2002-09-06 2007-02-16 엘지전자 주식회사 Recording medium having data structure for managing reproduction of still images recorded thereon and recording and reproducing methods and apparatuses
AU2003260974B8 (en) * 2002-09-07 2010-06-03 Lg Electronics Inc. Recording medium having data structure for managing reproduction of still images from a clip file recorded thereon and recording and reproducing methods and apparatuses
CN101106729B (en) * 2002-10-02 2012-12-19 Lg电子株式会社 Recording and reproducing method for controlling image data reproduction data structure
EP1547080B1 (en) * 2002-10-04 2012-01-25 LG Electronics, Inc. Recording medium having a data structure for managing reproduction of graphic data and recording and reproducing methods and apparatuses
AU2003269521B2 (en) * 2002-10-15 2009-05-28 Lg Electronics Inc. Recording medium having data structure for managing reproduction of multiple graphics streams recorded thereon and recording and reproducing methods and apparatuses
JP4558498B2 (en) * 2002-11-20 2010-10-06 エルジー エレクトロニクス インコーポレイティド Recording medium having data structure for managing reproduction of recorded still image, and recording and reproduction method and apparatus therefor
EP1579411B1 (en) * 2002-12-20 2012-10-10 Trident Microsystems (Far East) Ltd. Apparatus for re-ordering video data for displays using two transpose steps and storage of intermediate partially re-ordered video data
JP4224690B2 (en) * 2002-12-27 2009-02-18 ソニー株式会社 Recording method, recording apparatus, reproducing method, reproducing apparatus, and imaging apparatus
WO2004064392A2 (en) * 2003-01-14 2004-07-29 Matsushita Electric Industrial Co., Ltd. Data-transceiving equipment, image processor, and image-processing method
ATE447229T1 (en) 2003-01-20 2009-11-15 Lg Electronics Inc RECORDING MEDIUM HAVING A DATA STRUCTURE FOR MANAGING THE PLAYBACK OF STILL IMAGE RECORDED THEREON AND RECORDING AND PLAYBACKING METHODS AND APPARATUS
JP4165895B2 (en) * 2003-01-20 2008-10-15 エルジー エレクトロニクス インコーポレーテッド RECORDING MEDIUM HAVING DATA STRUCTURE FOR MANAGING REPRODUCING RECORDED STILL VIDEO, RECORDING AND REPRODUCING METHOD AND DEVICE USING THE SAME
US8145033B2 (en) * 2003-02-05 2012-03-27 Lg Electronics Inc. Recording medium having data structure for managing reproducton duration of still pictures recorded thereon and recording and reproducing methods and apparatuses
US7734154B2 (en) * 2003-02-14 2010-06-08 Lg Electronics Inc. Recording medium having data structure for managing reproduction duration of still pictures recorded thereon and recording and reproducing methods and apparatuses
US8055117B2 (en) * 2003-02-15 2011-11-08 Lg Electronics Inc. Recording medium having data structure for managing reproduction duration of still pictures recorded thereon and recording and reproducing methods and apparatuses
US8041179B2 (en) * 2003-02-24 2011-10-18 Lg Electronics Inc. Methods and apparatuses for reproducing and recording still picture and audio data and recording medium having data structure for managing reproduction of still picture and audio data
US7802288B2 (en) * 2003-03-14 2010-09-21 Starz Entertainment, Llc Video aspect ratio manipulation
RU2388073C2 (en) * 2003-04-29 2010-04-27 Эл Джи Электроникс Инк. Recording medium with data structure for managing playback of graphic data and methods and devices for recording and playback
US7616865B2 (en) * 2003-04-30 2009-11-10 Lg Electronics Inc. Recording medium having a data structure for managing reproduction of subtitle data and methods and apparatuses of recording and reproducing
JP4068509B2 (en) * 2003-06-05 2008-03-26 アルパイン株式会社 Video playback apparatus and method
KR100617178B1 (en) 2003-06-13 2006-08-30 엘지전자 주식회사 Apparatus and method for zoom transfer of television system
KR20050005074A (en) * 2003-07-01 2005-01-13 엘지전자 주식회사 Method for managing grahics data of high density optical disc, and high density optical disc therof
KR20050004339A (en) * 2003-07-02 2005-01-12 엘지전자 주식회사 Method for managing grahics data of high density optical disc, and high density optical disc therof
KR20050064150A (en) * 2003-12-23 2005-06-29 엘지전자 주식회사 Method for managing and reproducing a menu information of high density optical disc
US20060075449A1 (en) * 2004-09-24 2006-04-06 Cisco Technology, Inc. Distributed architecture for digital program insertion in video streams delivered over packet networks
KR101150872B1 (en) * 2005-01-24 2012-06-13 톰슨 라이센싱 에스.에이.에스. Method, apparatus and system for visual inspection of transcoded video
US8677504B2 (en) * 2005-07-14 2014-03-18 Qualcomm Incorporated Method and apparatus for encrypting/decrypting multimedia content to allow random access
KR100743247B1 (en) * 2005-09-23 2007-07-27 엠텍비젼 주식회사 Method and apparatus for providing image ancilliary information using dummy data block and record media recorded program for realizing the same
US7680047B2 (en) * 2005-11-22 2010-03-16 Cisco Technology, Inc. Maximum transmission unit tuning mechanism for a real-time transport protocol stream
US20070212578A1 (en) * 2006-03-13 2007-09-13 More Energy Ltd. Direct liquid fuel cell comprising a hydride fuel and a gel electrolyte
US8326927B2 (en) * 2006-05-23 2012-12-04 Cisco Technology, Inc. Method and apparatus for inviting non-rich media endpoints to join a conference sidebar session
DE102006027441A1 (en) * 2006-06-12 2007-12-13 Attag Gmbh Method and apparatus for generating a digital transport stream for a video program
KR100754225B1 (en) * 2006-08-07 2007-09-03 삼성전자주식회사 Method and apparatus for recording and reproducing interactive service of digital broadcast
US8358763B2 (en) * 2006-08-21 2013-01-22 Cisco Technology, Inc. Camping on a conference or telephony port
US8121277B2 (en) * 2006-12-12 2012-02-21 Cisco Technology, Inc. Catch-up playback in a conferencing system
DE602008004665D1 (en) * 2007-08-09 2011-03-03 Inlet Technologies Raleigh STORAGE OF SUBTITLES BY VIDEO TRANSCRIPTION
US8457214B2 (en) * 2007-09-10 2013-06-04 Cisco Technology, Inc. Video compositing of an arbitrary number of source streams using flexible macroblock ordering
US8422411B2 (en) * 2007-10-07 2013-04-16 Motorola Mobility Llc Flexible frame structure in wireless communication system
CA2705676C (en) * 2007-11-15 2017-08-29 Thomson Licensing System and method for re-encoding video using version comparison data to determine re-encoding parameters
US8045827B2 (en) * 2007-12-14 2011-10-25 Xerox Corporation Image downsampling during processing
US8000562B2 (en) * 2007-12-14 2011-08-16 Xerox Corporation Image downsampling for print job processing
US9118465B2 (en) * 2008-02-21 2015-08-25 Google Technology Holdings LLC Method for supporting flexible frame structures in wireless communication systems
US8311101B2 (en) * 2008-02-28 2012-11-13 Lsi Corporation Rate control for real time transcoding of subtitles for application with limited memory
CN101355706B (en) * 2008-08-15 2010-06-16 中兴通讯股份有限公司 Method and apparatus for analyzing multiplexing code stream of multiplexer
GB2469528B (en) * 2009-04-18 2011-10-05 Saffron Digital Ltd Transcoding video data
JP5400009B2 (en) * 2010-09-27 2014-01-29 ルネサスエレクトロニクス株式会社 Transcoding device, transcoding method and program
US9001886B2 (en) 2010-11-22 2015-04-07 Cisco Technology, Inc. Dynamic time synchronization
US20120176540A1 (en) * 2011-01-10 2012-07-12 Cisco Technology, Inc. System and method for transcoding live closed captions and subtitles
JP5685969B2 (en) * 2011-02-15 2015-03-18 ソニー株式会社 Display control method and display control apparatus
JP2013055587A (en) * 2011-09-06 2013-03-21 Sony Corp Image processing apparatus, image processing method, and image processing system
US9532080B2 (en) 2012-05-31 2016-12-27 Sonic Ip, Inc. Systems and methods for the reuse of encoding information in encoding alternative streams of video data
JP5949204B2 (en) * 2012-06-21 2016-07-06 ソニー株式会社 Electronic device, stream transmission / reception method in electronic device, program, host device, and stream transmission / reception method in host device
CN103873888A (en) * 2012-12-12 2014-06-18 深圳市快播科技有限公司 Live broadcast method of media files and live broadcast source server
US9357210B2 (en) 2013-02-28 2016-05-31 Sonic Ip, Inc. Systems and methods of encoding multiple video streams for adaptive bitrate streaming
US9253490B2 (en) 2013-05-31 2016-02-02 Qualcomm Technologies International, Ltd. Optimizing video transfer
CN107242882A (en) * 2017-06-05 2017-10-13 上海瓴舸网络科技有限公司 A kind of B ultrasound shows auxiliary equipment and its control method

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2591437B2 (en) * 1993-09-13 1997-03-19 日本電気株式会社 High-definition video signal encoding / decoding device
JPH07202820A (en) 1993-12-28 1995-08-04 Matsushita Electric Ind Co Ltd Bit rate control system
JPH08321828A (en) * 1995-05-25 1996-12-03 Matsushita Electric Ind Co Ltd Encoding signal transmission device
JP3544105B2 (en) 1997-09-26 2004-07-21 富士ゼロックス株式会社 Received information recording device and received information recording method
DE69840427D1 (en) * 1997-11-04 2009-02-12 Avistar Comm Corp Scalable multimedia network system and application
JPH11196414A (en) 1997-11-06 1999-07-21 Thomson Broadcast Syst Device for processing encoded video data and system for distributing program using the device
JP3724205B2 (en) 1998-03-10 2005-12-07 ソニー株式会社 Decoding device and method, and recording medium
JP2000102007A (en) * 1998-09-28 2000-04-07 Matsushita Electric Ind Co Ltd Multi-media information synthesizer and compressed video signal generator
US6483851B1 (en) * 1998-11-13 2002-11-19 Tektronix, Inc. System for network transcoding of multimedia data flow

Patent Citations (34)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5231492A (en) * 1989-03-16 1993-07-27 Fujitsu Limited Video and audio multiplex transmission system
US5327235A (en) * 1992-02-17 1994-07-05 Sony United Kingdom Limited Video conversions of video signal formats
US5519446A (en) * 1993-11-13 1996-05-21 Goldstar Co., Ltd. Apparatus and method for converting an HDTV signal to a non-HDTV signal
US5754235A (en) * 1994-03-25 1998-05-19 Sanyo Electric Co., Ltd. Bit-rate conversion circuit for a compressed motion video bitstream
US6055270A (en) * 1994-04-20 2000-04-25 Thomson Cosumer Electronics, Inc. Multiplexer system using constant bit rate encoders
US5731847A (en) * 1995-03-20 1998-03-24 Sony Corporation Subtitle encoding/decoding method and apparatus
US5862300A (en) * 1995-07-07 1999-01-19 Sony Corporation Control of an image display based on a permission signal
US5848217A (en) * 1995-08-02 1998-12-08 Sony Corporation Subtitle encoding/decoding method and apparatus
US6424792B1 (en) * 1995-08-02 2002-07-23 Sony Corporation Subtitle encoding/decoding method and apparatus
US5847770A (en) * 1995-09-25 1998-12-08 Sony Corporation Apparatus and method for encoding and decoding a subtitle signal
US6011598A (en) * 1996-03-28 2000-01-04 Sanyo Electric Co., Ltd. Decoding start controller, decoder, and decoding system
US20020012530A1 (en) * 1996-04-17 2002-01-31 U.S. Philips Corporation Encoding device for encoding a program and recording device
US6393202B1 (en) * 1996-05-09 2002-05-21 Matsushita Electric Industrial Co. Ltd. Optical disc for which a sub-picture can be favorably superimposed on a main image and a disc reproduction apparatus and a disc reproduction method for the disc
US6785463B2 (en) * 1996-05-09 2004-08-31 Matsushita Electric Industrial Co., Ltd. Reproduction apparatus and a reproduction method for video objects received by digital broadcast
US6160954A (en) * 1996-10-31 2000-12-12 Sony Corporation Device for and method of reproducing recording medium
US6900845B1 (en) * 1996-12-18 2005-05-31 Thomson Licensing S.A. Memory architecture for a multiple format video signal processor
US5912710A (en) * 1996-12-18 1999-06-15 Kabushiki Kaisha Toshiba System and method for controlling a display of graphics data pixels on a video monitor having a different display aspect ratio than the pixel aspect ratio
US20010052135A1 (en) * 1996-12-30 2001-12-13 Philips Electronics North America Corporation Method and system for implementing interactive broadcast programs and commercials
US6788347B1 (en) * 1997-03-12 2004-09-07 Matsushita Electric Industrial Co., Ltd. HDTV downconversion system
US7072396B2 (en) * 1997-03-14 2006-07-04 Microsoft Corporation Motion video signal encoder and encoding method
US6801709B1 (en) * 1997-07-19 2004-10-05 Samsung Electronics Co., Ltd. Apparatus and method for synchronously decoding video data and sub-picture data in DVD player
US20020176506A1 (en) * 1997-09-26 2002-11-28 Dinei Afonso Ferreira Florencio Computational resource allocation in an information stream decoder
US5970072A (en) * 1997-10-02 1999-10-19 Alcatel Usa Sourcing, L.P. System and apparatus for telecommunications bus control
US6483945B1 (en) * 1998-02-02 2002-11-19 Sony Corporation Moving picture encoding method and apparatus
US6310915B1 (en) * 1998-11-20 2001-10-30 Harmonic Inc. Video transcoder with bitstream look ahead for rate control and statistical multiplexing
US7010032B1 (en) * 1999-03-12 2006-03-07 Kabushiki Kaisha Toshiba Moving image coding apparatus and decoding apparatus
US7995896B1 (en) * 1999-11-04 2011-08-09 Thomson Licensing System and user interface for a television receiver in a television program distribution system
US7020195B1 (en) * 1999-12-10 2006-03-28 Microsoft Corporation Layered coding and decoding of image data
US6915531B2 (en) * 2000-01-13 2005-07-05 Lg Electronics Inc. Open cable set-top box diagnosing system and method thereof
US6490320B1 (en) * 2000-02-02 2002-12-03 Mitsubishi Electric Research Laboratories Inc. Adaptable bitstream video delivery system
US20020057382A1 (en) * 2000-08-31 2002-05-16 Hideaki Yui Television signal reception apparatus
US20030001981A1 (en) * 2001-05-21 2003-01-02 Sony Corporation Modular digital television architecture
US7986846B2 (en) * 2004-10-26 2011-07-26 Samsung Electronics Co., Ltd Apparatus and method for processing an image signal in a digital broadcast receiver
US7872668B2 (en) * 2005-08-26 2011-01-18 Nvidia Corporation Video image processing with programmable scripting and remote diagnosis

Cited By (39)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9521420B2 (en) 2006-11-13 2016-12-13 Tech 5 Managing splice points for non-seamless concatenated bitstreams
US20080115176A1 (en) * 2006-11-13 2008-05-15 Scientific-Atlanta, Inc. Indicating picture usefulness for playback optimization
US20080260045A1 (en) * 2006-11-13 2008-10-23 Rodriguez Arturo A Signalling and Extraction in Compressed Video of Pictures Belonging to Interdependency Tiers
US8416859B2 (en) 2006-11-13 2013-04-09 Cisco Technology, Inc. Signalling and extraction in compressed video of pictures belonging to interdependency tiers
US8875199B2 (en) 2006-11-13 2014-10-28 Cisco Technology, Inc. Indicating picture usefulness for playback optimization
US20080115175A1 (en) * 2006-11-13 2008-05-15 Rodriguez Arturo A System and method for signaling characteristics of pictures' interdependencies
US9716883B2 (en) 2006-11-13 2017-07-25 Cisco Technology, Inc. Tracking and determining pictures in successive interdependency levels
US20090034633A1 (en) * 2007-07-31 2009-02-05 Cisco Technology, Inc. Simultaneous processing of media and redundancy streams for mitigating impairments
US20090034627A1 (en) * 2007-07-31 2009-02-05 Cisco Technology, Inc. Non-enhancing media redundancy coding for mitigating transmission impairments
US8804845B2 (en) 2007-07-31 2014-08-12 Cisco Technology, Inc. Non-enhancing media redundancy coding for mitigating transmission impairments
US8958486B2 (en) 2007-07-31 2015-02-17 Cisco Technology, Inc. Simultaneous processing of media and redundancy streams for mitigating impairments
US20090100482A1 (en) * 2007-10-16 2009-04-16 Rodriguez Arturo A Conveyance of Concatenation Properties and Picture Orderness in a Video Stream
US8873932B2 (en) 2007-12-11 2014-10-28 Cisco Technology, Inc. Inferential processing to ascertain plural levels of picture interdependencies
US8718388B2 (en) 2007-12-11 2014-05-06 Cisco Technology, Inc. Video processing with tiered interdependencies of pictures
US20090180546A1 (en) * 2008-01-09 2009-07-16 Rodriguez Arturo A Assistance for processing pictures in concatenated video streams
US8804843B2 (en) 2008-01-09 2014-08-12 Cisco Technology, Inc. Processing and managing splice points for the concatenation of two video streams
US20090220012A1 (en) * 2008-02-29 2009-09-03 Rodriguez Arturo A Signalling picture encoding schemes and associated picture properties
US8416858B2 (en) 2008-02-29 2013-04-09 Cisco Technology, Inc. Signalling picture encoding schemes and associated picture properties
US9819899B2 (en) 2008-06-12 2017-11-14 Cisco Technology, Inc. Signaling tier information to assist MMCO stream manipulation
US8886022B2 (en) 2008-06-12 2014-11-11 Cisco Technology, Inc. Picture interdependencies signals in context of MMCO to assist stream manipulation
US20090313668A1 (en) * 2008-06-17 2009-12-17 Cisco Technology, Inc. Time-shifted transport of multi-latticed video for resiliency from burst-error effects
US8971402B2 (en) 2008-06-17 2015-03-03 Cisco Technology, Inc. Processing of impaired and incomplete multi-latticed video streams
US20090313662A1 (en) * 2008-06-17 2009-12-17 Cisco Technology Inc. Methods and systems for processing multi-latticed video streams
US8699578B2 (en) 2008-06-17 2014-04-15 Cisco Technology, Inc. Methods and systems for processing multi-latticed video streams
US9723333B2 (en) 2008-06-17 2017-08-01 Cisco Technology, Inc. Output of a video signal from decoded and derived picture information
US20100003015A1 (en) * 2008-06-17 2010-01-07 Cisco Technology Inc. Processing of impaired and incomplete multi-latticed video streams
US9407935B2 (en) 2008-06-17 2016-08-02 Cisco Technology, Inc. Reconstructing a multi-latticed video signal
US9350999B2 (en) 2008-06-17 2016-05-24 Tech 5 Methods and systems for processing latticed time-skewed video streams
US8705631B2 (en) 2008-06-17 2014-04-22 Cisco Technology, Inc. Time-shifted transport of multi-latticed video for resiliency from burst-error effects
US20090323822A1 (en) * 2008-06-25 2009-12-31 Rodriguez Arturo A Support for blocking trick mode operations
US20100118979A1 (en) * 2008-11-12 2010-05-13 Rodriguez Arturo A Targeted bit appropriations based on picture importance
US20100118973A1 (en) * 2008-11-12 2010-05-13 Rodriguez Arturo A Error concealment of plural processed representations of a single video signal received in a video program
US8320465B2 (en) 2008-11-12 2012-11-27 Cisco Technology, Inc. Error concealment of plural processed representations of a single video signal received in a video program
US8681876B2 (en) 2008-11-12 2014-03-25 Cisco Technology, Inc. Targeted bit appropriations based on picture importance
US8761266B2 (en) 2008-11-12 2014-06-24 Cisco Technology, Inc. Processing latticed and non-latticed pictures of a video program
US8949883B2 (en) 2009-05-12 2015-02-03 Cisco Technology, Inc. Signalling buffer characteristics for splicing operations of video streams
US9609039B2 (en) 2009-05-12 2017-03-28 Cisco Technology, Inc. Splice signalling buffer characteristics
US9467696B2 (en) 2009-06-18 2016-10-11 Tech 5 Dynamic streaming plural lattice video coding representations of video
US20110222837A1 (en) * 2010-03-11 2011-09-15 Cisco Technology, Inc. Management of picture referencing in video streams for plural playback modes

Also Published As

Publication number Publication date
US8625958B2 (en) 2014-01-07
EP2262265A2 (en) 2010-12-15
US20070206930A1 (en) 2007-09-06
EP1182880B1 (en) 2014-01-29
JP5516488B2 (en) 2014-06-11
CN1336764A (en) 2002-02-20
US20070147789A1 (en) 2007-06-28
US8625959B2 (en) 2014-01-07
KR100827887B1 (en) 2008-05-07
US20070206932A1 (en) 2007-09-06
TW519840B (en) 2003-02-01
EP1182880A2 (en) 2002-02-27
EP1182880A3 (en) 2004-06-23
EP2262265A3 (en) 2012-05-02
US20020006165A1 (en) 2002-01-17
KR20010110147A (en) 2001-12-12
CN1174607C (en) 2004-11-03
US7224890B2 (en) 2007-05-29
US8644672B2 (en) 2014-02-04
JP2011166813A (en) 2011-08-25
EP2262265B1 (en) 2014-02-12

Similar Documents

Publication Publication Date Title
US8644672B2 (en) Apparatus and method for image coding and decoding
US6377309B1 (en) Image processing apparatus and method for reproducing at least an image from a digital data sequence
US7054539B2 (en) Image processing method and apparatus
US20110164673A1 (en) Preserving Captioning Through Video Transcoding
US7305173B2 (en) Decoding device and decoding method
JP4724919B2 (en) Recording apparatus and recording method, reproducing apparatus and reproducing method, and recording medium
JP4931034B2 (en) Decoding device, decoding method, program, and program recording medium
US20070274675A1 (en) Method and Apparatus for Transcoding Digital Audio/Video Streams
JP4765192B2 (en) Image encoding apparatus and method, image decoding apparatus and method, and recording medium
US6097439A (en) Omnibus closed captioning decoder for encoded video
KR101154743B1 (en) Encoder apparatus, encoding method, decoder apparatus, decoding method, recording medium, and playback apparatus
JP2002077789A (en) Method and apparatus for processing image
KR100998449B1 (en) Digital multimedia broadcasting receiver and the method for controlling buffer using the receiver
KR101158435B1 (en) System and method for multi-media broad casting using priority information on BIFS packet header in DMB mobile terminal
JP2001346162A (en) Multiplexed stream converter and method, and recording medium
KR20050050315A (en) Method of handling broadcasting channel and a/v device thereof
JP2008010997A (en) Information processing apparatus and method, and semiconductor integrated circuit
JP2002112221A (en) Data transmission method and device, data transmission system, transmission medium

Legal Events

Date Code Title Description
AS Assignment

Owner name: SONY CORPORATION, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:KATO, MOTOKI;REEL/FRAME:019236/0127

Effective date: 20010711

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION