WO2007063890A1 - Signal processing apparatus for digital broadcast reception, signal processing method, signal processing program, and digital broadcast receiving apparatus - Google Patents

Signal processing apparatus for digital broadcast reception, signal processing method, signal processing program, and digital broadcast receiving apparatus

Info

Publication number
WO2007063890A1
WO2007063890A1 (PCT/JP2006/323799)
Authority
WO
WIPO (PCT)
Prior art keywords
image
unit
code
syntax element
signal processing
Prior art date
Application number
PCT/JP2006/323799
Other languages
English (en)
Japanese (ja)
Inventor
Yoshitaka Tanaka
Kenji Mito
Original Assignee
Pioneer Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Pioneer Corporation filed Critical Pioneer Corporation
Priority to JP2007547967A priority Critical patent/JP4672734B2/ja
Publication of WO2007063890A1 publication Critical patent/WO2007063890A1/fr

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/438Interfacing the downstream path of the transmission network originating from a server, e.g. retrieving encoded video stream packets from an IP network
    • H04N21/4382Demodulation or channel decoding, e.g. QPSK demodulation
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/10Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
    • H04N19/102Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or selection affected or controlled by the adaptive coding
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/10Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
    • H04N19/102Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or selection affected or controlled by the adaptive coding
    • H04N19/103Selection of coding mode or of prediction mode
    • H04N19/105Selection of the reference unit for prediction within a chosen coding or prediction mode, e.g. adaptive choice of position and number of pixels used for prediction
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/10Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
    • H04N19/134Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or criterion affecting or controlling the adaptive coding
    • H04N19/164Feedback from the receiver or from the transmission channel
    • H04N19/166Feedback from the receiver or from the transmission channel concerning the amount of transmission errors, e.g. bit error rate [BER]
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/10Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
    • H04N19/169Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding
    • H04N19/187Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding the unit being a scalable video layer
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/30Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using hierarchical techniques, e.g. scalability
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/44Decoders specially adapted therefor, e.g. video decoders which are asymmetric with respect to the encoder
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/50Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding
    • H04N19/503Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding involving temporal prediction
    • H04N19/51Motion estimation or motion compensation
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/50Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding
    • H04N19/59Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding involving spatial sub-sampling or interpolation, e.g. alteration of picture size or resolution
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/60Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using transform coding
    • H04N19/61Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using transform coding in combination with predictive coding
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/70Methods or arrangements for coding, decoding, compressing or decompressing digital video signals characterised by syntax aspects related to video coding, e.g. related to compression standards
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/85Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using pre-processing or post-processing specially adapted for video compression
    • H04N19/89Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using pre-processing or post-processing specially adapted for video compression involving methods or arrangements for detection of transmission errors at the decoder
    • H04N19/895Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using pre-processing or post-processing specially adapted for video compression involving methods or arrangements for detection of transmission errors at the decoder in combination with error concealment
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04HBROADCAST COMMUNICATION
    • H04H20/00Arrangements for broadcast or for distribution combined with broadcast
    • H04H20/65Arrangements characterised by transmission systems for broadcast
    • H04H20/67Common-wave systems, i.e. using separate transmitters operating on substantially the same frequency

Definitions

  • Digital broadcast reception signal processing apparatus, signal processing method, signal processing program, and digital broadcast receiving apparatus
  • The present invention relates to a technique for processing a plurality of digital broadcast reception signals, and particularly to a technique for processing a plurality of digital broadcast reception signals transmitted by simulcast.
  • Terrestrial digital broadcasting in Japan is defined by the ISDB-T (Integrated Services Digital Broadcasting - Terrestrial) standard and employs OFDM (Orthogonal Frequency Division Multiplexing) modulation and a hierarchical transmission scheme.
  • ISDB-T: Integrated Services Digital Broadcasting - Terrestrial
  • OFDM: Orthogonal Frequency Division Multiplexing
  • The transmission bandwidth of one channel is divided into 13 subbands, i.e., OFDM segments S1 to S13.
  • These OFDM segments S1 to S13 can be further divided into a maximum of three groups, or layers.
  • Different transmission characteristics such as the carrier modulation scheme, inner code coding rate, and time interleave length can be set for each layer, and different broadcast programs can be transmitted for each layer.
  • For example, the 12 OFDM segments S1 to S6 and S8 to S13 can be used to provide high-quality high-definition television (HDTV) broadcasting for fixed receivers, or these 12 segments can be divided into three layers, each providing standard-definition television for fixed receivers, while digital broadcasting for mobile terminals (simple video broadcasting) can be provided using only the single segment S7 located at the center of the band.
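  • The following Python sketch illustrates, under assumed labels (segment numbering and service names are illustrative only, not part of the specification), how the 13 segments might be assigned in the simulcast scenario discussed in this document: HDTV for fixed receivers on 12 segments and the mobile service on the central segment S7.

```python
# Hypothetical illustration of ISDB-T segment usage; all names and labels are assumptions.
SEGMENTS = [f"S{i}" for i in range(1, 14)]        # OFDM segments S1..S13
CENTER = "S7"                                      # central segment carrying the mobile service
FIXED = [s for s in SEGMENTS if s != CENTER]       # remaining 12 segments for fixed receivers

def simulcast_plan() -> dict:
    """Segment usage when HDTV for fixed receivers is simulcast with the mobile service."""
    plan = {s: "fixed-receiver HDTV" for s in FIXED}
    plan[CENTER] = "mobile digital broadcasting (simple video)"
    return plan

def sdtv_plan() -> dict:
    """Alternative usage: the 12 fixed-receiver segments split into three SDTV layers."""
    return {s: f"fixed-receiver SDTV, layer {i // 4 + 1}" for i, s in enumerate(FIXED)}

if __name__ == "__main__":
    for seg, service in simulcast_plan().items():
        print(seg, "->", service)
```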
  • MPEG-2: Moving Picture Experts Group phase 2
  • H.264: MPEG-4 AVC
  • The received broadcast is switched to one of the digital broadcast for mobile terminals and the digital broadcast for fixed receivers.
  • At such a switch, the picture quality of the displayed video changes abruptly, so a visually unnatural video is displayed, which is disconcerting to viewers of the broadcast program.
  • In particular, image information such as subtitles, human faces, or patterns with high spatial frequency can suddenly become unidentifiable.
  • Patent Document 1: Japanese Unexamined Patent Application Publication No. 2004-312361
  • Patent Document 2: JP 2004-166173 A
  • Accordingly, an object of the present invention is to provide a signal processing apparatus, a signal processing method, a signal processing program, and a digital broadcast receiving apparatus capable of generating a display image that is as high in quality and as visually natural as possible.
  • A signal processing apparatus according to the present invention processes first and second encoded bitstreams generated from the received signals of both a first digital broadcast and a second digital broadcast that are simulcast.
  • The apparatus comprises: an error monitoring unit that detects an error in the first encoded bitstream; a first inverse transform processing unit that obtains a first syntax element by analyzing the first encoded bitstream, decodes an intra-frame coded image of the first encoded bitstream to reproduce a decoded image, and decodes a predictive-coded image of the first encoded bitstream to reproduce a first difference image; a first motion compensation unit that generates a first predicted image to be added to the first difference image by executing a first motion compensated prediction using the first syntax element with reference to the decoded image; a second inverse transform processing unit that obtains a second syntax element by analyzing the second encoded bitstream, decodes an intra-frame coded image of the second encoded bitstream to reproduce a decoded image, and decodes a predictive-coded image of the second encoded bitstream to reproduce a second difference image; a syntax element converting unit that converts the second syntax element into a syntax element conforming to the image format of the first encoded bitstream; and a second motion compensation unit that generates a second predicted image to be added to the second difference image by executing a second motion compensated prediction using the converted syntax element with reference to the decoded image.
  • The apparatus further comprises an output control unit that, when the error is not detected, outputs a first decoded image reproduced by adding the first predicted image to the first difference image, and, when the error is detected, outputs a second decoded image reproduced by adding the second predicted image to the second difference image instead of the first decoded image.
  • A digital broadcast receiving apparatus according to the present invention processes the received signals of both a first digital broadcast and a second digital broadcast that are simulcast.
  • It comprises a demodulating circuit that generates first and second encoded bitstreams from the received signals of the first and second digital broadcasts, respectively, and the above signal processing apparatus.
  • A signal processing method according to the present invention processes first and second encoded bitstreams generated from the received signals of both a first digital broadcast and a second digital broadcast that are simulcast.
  • The method comprises: (a) detecting an error in the first encoded bitstream; (b) analyzing the first encoded bitstream to obtain a first syntax element; and (c) decoding an intra-frame encoded image of the first encoded bitstream to reproduce a decoded image, and decoding a predictive encoded image of the first encoded bitstream to reproduce a first difference image.
  • A signal processing program according to the present invention causes a processor to process first and second encoded bitstreams generated from the received signals of both a high-quality first digital broadcast and a lower-quality second digital broadcast that are simulcast.
  • The processing includes: an error detection process for detecting an error in the first encoded bitstream; a first analysis process for analyzing the first encoded bitstream to obtain a first syntax element; a first inverse transform process for decoding an intra-frame encoded image of the first encoded bitstream to reproduce a decoded image and decoding a predictive encoded image of the first encoded bitstream to reproduce a first difference image; a first motion compensation process for generating a first predicted image to be added to the first difference image by executing a first motion compensated prediction using the first syntax element with reference to the decoded image; a second analysis process for analyzing the second encoded bitstream to obtain a second syntax element; a second inverse transform process for decoding an intra-frame encoded image of the second encoded bitstream to reproduce a decoded image and decoding a predictive encoded image of the second encoded bitstream to reproduce a second difference image; and a second motion compensation process for generating a second predicted image to be added to the second difference image by executing a second motion compensated prediction, with reference to the decoded image, using a syntax element converted so as to conform to the image format of the first encoded bitstream.
  • When the error is not detected, the first decoded image reproduced by adding the first predicted image to the first difference image is output; when the error is detected, the second decoded image reproduced by adding the second predicted image to the second difference image is output instead of the first decoded image.
  • FIG. 1 is a diagram showing 13 OFDM segments.
  • FIG. 2 is a functional block diagram showing a schematic configuration of the digital broadcast receiving apparatus according to the first embodiment of the present invention.
  • FIG. 3 is a functional block diagram schematically showing configuration examples of a first decoder and a second decoder.
  • FIG. 4 is a diagram for explaining an example of vector information conversion processing.
  • FIG. 5 is a diagram for explaining another example of vector information conversion processing.
  • FIG. 6 is a functional block diagram showing a schematic configuration of a digital broadcast receiving apparatus according to a second embodiment of the present invention.
  • FIG. 7 is a functional block diagram showing a schematic configuration of a decoder block according to the second embodiment.
  • DMUX: Demultiplexer
  • FIG. 2 is a functional block diagram showing a schematic configuration of the digital broadcast receiving device 1A according to the first embodiment of the present invention.
  • This digital broadcast receiving apparatus 1A includes an antenna 10, a receiving circuit (front end) 2, and a signal processing circuit 3A.
  • the reception circuit 2 includes a tuner 11 and a demodulation circuit 12.
  • the signal processing circuit 3A includes an error monitoring unit 20, a demultiplexer (DMUX) 21, a first decoder 22, a second decoder 23, a signal output unit 24, an image conversion unit 25, a syntax element conversion unit 26, a control unit 27, and An input unit 28 is provided.
  • the control unit (output control unit) 27 can individually control the operations of the error monitoring unit 20, the first decoder 22, the second decoder 23, and the signal output unit 24.
  • The tuner 11 selects a broadcast station to be received by the antenna 10 in accordance with an instruction from the control unit 27, frequency-converts the received signal of the broadcast wave, and generates an OFDM (Orthogonal Frequency Division Multiplexing) signal.
  • The demodulation circuit 12 performs an FFT (Fast Fourier Transform) on the OFDM signal from the tuner 11 to generate an OFDM demodulated signal, and further performs decoding processes such as deinterleaving, demapping, and error correction on the OFDM demodulated signal to generate a transport stream TS.
  • The tuner 11 has a function of receiving one or more of terrestrial digital broadcasts, satellite digital broadcasts, and cable digital broadcasts; however, it is not limited to these and may also have a function of receiving a digital broadcast transmitted via a computer network such as the Internet.
  • The demultiplexer 21 hierarchically divides the transport stream TS supplied from the demodulation circuit 12 via the error monitoring unit 20, and separates from the transport stream TS the elementary stream ES1 of the digital broadcast video for fixed receivers and the elementary stream ES2 of the digital broadcast video for mobile terminals. One separated elementary stream ES1 is supplied to the first decoder 22, and the other elementary stream ES2 is supplied to the second decoder 23.
  • One elementary stream ES1 is a high-quality compressed and encoded bit string, while the other elementary stream ES2 is a relatively low-quality compressed and encoded bit string.
  • For example, the elementary stream ES1 conforms to the MPEG-2 (Moving Picture Experts Group phase 2) standard and the elementary stream ES2 conforms to the H.264 standard. Note that although the error monitoring unit 20 and the control unit 27 are shown as separate blocks, together they can constitute the "error monitoring unit" of the present invention.
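  • As a rough, hypothetical illustration of the demultiplexing step (not the patent's implementation), the sketch below routes MPEG-2 transport stream packets to two elementary-stream buffers by PID; the PID values are assumptions, and a real demultiplexer would obtain them from the PAT/PMT tables and parse adaptation fields and PES headers before passing data to the decoders.

```python
TS_PACKET_SIZE = 188       # fixed MPEG-2 TS packet length in bytes
ES1_PID = 0x0111           # assumed PID of the fixed-receiver (MPEG-2) video stream
ES2_PID = 0x0581           # assumed PID of the mobile (H.264) video stream

def demux(ts: bytes) -> dict:
    """Split a transport stream into raw per-PID payload buffers for ES1 and ES2."""
    buffers = {ES1_PID: bytearray(), ES2_PID: bytearray()}
    for off in range(0, len(ts) - TS_PACKET_SIZE + 1, TS_PACKET_SIZE):
        pkt = ts[off:off + TS_PACKET_SIZE]
        if pkt[0] != 0x47:                       # sync byte check
            continue
        pid = ((pkt[1] & 0x1F) << 8) | pkt[2]    # 13-bit packet identifier
        if pid in buffers:
            buffers[pid] += pkt[4:]              # payload (adaptation fields ignored here)
    return buffers
```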
  • The error monitoring unit 20 constantly monitors the transport stream TS from the demodulation circuit 12, detects uncorrectable bit errors in the transport stream TS, and supplies error information such as the bit error occurrence position and the bit error rate to the control unit 27. Based on the error information from the error monitoring unit 20, the control unit 27 can determine that an error exists if the bit error rate Ber for the elementary stream ES1 exceeds an allowable value (threshold), and that no error exists if the bit error rate is below the allowable value.
  • The allowable value can be set as appropriate by the user operating the input unit 28.
  • Although the control unit 27 preferably determines whether or not an error exists based on the bit error rate Ber, the determination is not limited to this.
  • For example, when the receiving circuit 2 detects the C/N ratio (carrier power to noise power ratio) of the received digital broadcast signal, the control unit 27 can also determine that an error has occurred if the C/N ratio is less than a desired value.
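  • A minimal sketch of this error decision is given below; the threshold values are hypothetical, since the actual allowable value is whatever the user sets via the input unit 28.

```python
BER_ALLOWABLE = 2e-4      # assumed allowable bit error rate for the elementary stream ES1
CN_DESIRED_DB = 20.0      # assumed desired C/N ratio in dB

def error_detected(ber, cn_ratio_db=None):
    """Return True if reception of ES1 should be treated as erroneous."""
    if ber > BER_ALLOWABLE:                      # bit error rate exceeds the allowable value
        return True
    if cn_ratio_db is not None and cn_ratio_db < CN_DESIRED_DB:
        return True                              # alternative criterion based on the C/N ratio
    return False
```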
  • The first decoder 22 analyzes the elementary stream (first encoded bitstream) ES1 during entropy decoding to obtain syntax elements, that is, parameters describing the syntax (rules) of the elementary stream ES1.
  • Compression encoding/decoding standards such as MPEG-2 and H.264 specify syntax (rules) for describing and decoding the structure of an encoded stream or multiplexed bitstream.
  • Similarly, the second decoder 23 obtains the syntax elements of the elementary stream (second encoded bitstream) ES2 by analyzing it during entropy decoding.
  • Such syntax elements include motion vectors, quantization parameters, macroblock types, and reference frame indices.
  • The syntax element conversion unit 26 converts the syntax elements acquired by the second decoder 23 into a group of parameters that conform to the image format of the elementary stream ES1, and the converted syntax elements are given to the first decoder 22.
  • When the second decoder 23 operates in the motion compensated prediction mode (inter-frame prediction mode), it generates a predicted image by executing motion compensated prediction and also generates a difference image to be added to the predicted image.
  • The image conversion unit 25 can generate a difference image by interpolation, resolution conversion, or the like based on the difference image supplied from the second decoder 23, and provide the difference image to the first decoder 22.
  • FIG. 3 is a functional block diagram schematically showing each configuration example of the first decoder 22 and the second decoder 23.
  • When no error is detected, the first decoder 22 executes decoding based on the normal MPEG-2 standard.
  • When an error is detected, the first decoder 22 can execute decoding based on the H.264 standard using the converted syntax elements from the syntax element conversion unit 26 and the difference image from the image conversion unit 25.
  • As a general MPEG-2 decoder configuration, the first decoder 22 has an entropy decoding unit (ED) 30, an inverse quantization unit (IQ) 31, an inverse orthogonal transform unit (IOT) 32, a first motion compensation unit (MC1) 34A, a frame buffer memory 35, and an adder 36.
  • The hybrid motion compensation unit 34 includes the first motion compensation unit 34A and a second motion compensation unit 34B.
  • the entropy decoding unit 30, the inverse quantization unit 31, and the inverse orthogonal transform unit 32 may constitute the “first inverse transform processing unit” of the present invention.
  • the first switch 33A connects between the entropy decoding unit 30 and the hybrid motion compensation unit 34 in accordance with the control signal SW1.
  • the second switch 33B connects the inverse orthogonal transform unit 32 and the adder 36 according to the control signal SW1.
  • The first decoder 22 also includes a second motion compensation unit (MC2) 34B, an intra prediction unit (IP) 37, and a third switch 38 as a configuration conforming to the H.264 standard.
  • The hybrid motion compensation unit 34 includes the first motion compensation unit 34A, which executes motion compensated prediction based on the MPEG-2 standard in response to the control signal SW1 when the control unit 27 does not detect an error, and the second motion compensation unit 34B, which executes motion compensated prediction based on the H.264 standard in response to the control signal SW1 when the control unit 27 detects an error.
  • When the control unit 27 does not detect an error, the third switch 38 always connects the hybrid motion compensation unit 34 to the adder 36 in accordance with the control signal SW2 from the control unit 27; when the control unit 27 detects an error, the third switch 38 can selectively connect the adder 36 to either the intra prediction unit 37 or the hybrid motion compensation unit 34 in accordance with the operation mode conforming to the H.264 standard.
  • When the control unit 27 detects an error, the entropy decoding unit 30 switches the operation mode of the first decoder 22, based on the syntax elements, between a motion compensated prediction mode and an intra prediction mode conforming to the H.264 standard.
  • The third switch 38 switches between the hybrid motion compensation unit 34 and the intra prediction unit 37 in accordance with operation mode switching information (not shown) supplied from the entropy decoding unit 30.
  • The second decoder 23 is configured as an H.264 decoder with an entropy decoding unit (ED) 40, an inverse quantization unit (IQ) 41, an inverse orthogonal transform unit (IOT) 42, a motion compensation unit (MC) 43, an intra prediction unit (IP) 44, a switch 45, a frame buffer memory 46, an adder 47, and a deblocking filter 48.
  • the entropy decoding unit 40, the inverse quantization unit 41, and the inverse orthogonal transform unit 42 may constitute the “second inverse transform processing unit” of the present invention.
  • In accordance with an instruction from the control unit 27, the signal output unit 24 selects one of the decoded image signal DO1 from the first decoder 22 and the decoded image signal DO2 from the second decoder 23, and can output the selected signal as the output image signal CS.
  • Whether or not two digital broadcasts are simulcast can be determined, for example, from EPG (electronic program guide) information, and the control unit 27 can make the determination based on this information.
  • When the control unit 27 determines that a simulcast digital broadcast is not being received, it can cause the signal output unit 24 to selectively output either the decoded image signal DO1 or DO2 as the output image signal CS.
  • Also, in response to a user operation on the input unit 28, the control unit 27 can cause the signal output unit 24 to selectively output one of the decoded image signals DO1 and DO2 as the output image signal CS.
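  • The selection logic described above can be summarized by the following sketch; the function and argument names are illustrative only and do not appear in the patent.

```python
def select_output(error, simulcast, do1, do2, user_choice=None):
    """Choose which decoded image signal becomes the output image signal CS."""
    if user_choice is not None:                  # explicit selection via the input unit 28
        return do1 if user_choice == "DO1" else do2
    if not simulcast:                            # no simulcast partner: either signal may be chosen
        return do1 if do1 is not None else do2
    # simulcast case: output the high-quality signal DO1 unless an error was detected in ES1
    return do2 if error else do1
```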
  • the operation described below is based on the premise that the digital broadcast receiving device 1A is simultaneously receiving the simulcast digital broadcast for the fixed receiver and the digital broadcast for the mobile unit.
  • each operation of the first decoder 22 and the second decoder 23 when the control unit 27 does not detect an error will be described below.
  • In this case, the first switch 33A of the first decoder 22 connects the hybrid motion compensation unit 34 to the entropy decoding unit 30, and the second switch 33B connects the adder 36 to the inverse orthogonal transform unit 32.
  • the first decoder 22 decodes the elementary stream ES1 for each macroblock.
  • The macroblock consists of a 16 × 16 pixel luminance signal (Y) and two 8 × 8 pixel color difference signals (Cb, Cr) corresponding to the spatial position of the luminance signal.
  • The entropy decoding unit 30 performs entropy decoding such as variable length decoding on the elementary stream ES1 to generate quantized coefficients, and analyzes the elementary stream ES1 to obtain syntax elements.
  • The inverse quantization unit 31 generates DCT (Discrete Cosine Transform) coefficients by dequantizing the quantized coefficients according to the quantization parameter, which is one of the syntax elements, and the inverse orthogonal transform unit 32 subjects these coefficients to an inverse orthogonal transform (for example, an inverse DCT) to generate an image signal, which is supplied to the adder 36 via the second switch 33B.
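  • The following numpy sketch illustrates the inverse quantization and inverse DCT steps performed by units 31 and 32; the uniform dequantization rule is an assumption, since the exact MPEG-2 reconstruction formula (intra/non-intra weighting matrices, mismatch control) is not reproduced here.

```python
import numpy as np

def dequantize(qcoeffs, qp, weight):
    """Simplified uniform inverse quantization controlled by a quantization parameter qp."""
    return qcoeffs * weight * qp

def idct2(coeffs):
    """2-D inverse DCT (orthonormal) of an N x N coefficient block."""
    n = coeffs.shape[0]
    k = np.arange(n)
    # DCT-II basis matrix c such that the forward transform is c @ block @ c.T
    c = np.sqrt(2.0 / n) * np.cos(np.pi * (2 * k[None, :] + 1) * k[:, None] / (2 * n))
    c[0, :] = np.sqrt(1.0 / n)
    return c.T @ coeffs @ c          # inverse transform restores the pixel-domain block

# Example: reconstruct an 8x8 difference block whose only non-zero coefficient is the DC term.
q = np.zeros((8, 8)); q[0, 0] = 3
block = idct2(dequantize(q, qp=4.0, weight=np.ones((8, 8))))
```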
  • When the output of the entropy decoding unit 30 is an intra-coded image (intra-frame coded image), the first decoder 22 operates in the intra mode.
  • At this time, the third switch 38 is connected to the hybrid motion compensation unit 34, but the hybrid motion compensation unit 34 does not output data; therefore, the adder 36 outputs the image signal from the second switch 33B to the outside as it is, as the decoded image signal DO1.
  • The decoded image that is the output of the adder 36 is stored in the frame buffer memory 35.
  • When the output of the entropy decoding unit 30 is a predictive-coded image (inter-frame coded image) based on motion compensated prediction, the first decoder 22 operates in the motion compensated prediction mode.
  • Motion vector information is input to the first motion compensation unit 34A from the entropy decoding unit 30 via the first switch 33A.
  • The first motion compensation unit 34A reads a reference image from the frame buffer memory 35, generates from the reference image a predicted image of the region specified by the motion vector with 1/2 pixel accuracy (half-pel accuracy), and supplies the predicted image to the adder 36 through the third switch 38.
  • The adder 36 reproduces the decoded image by adding the predicted image to the difference image input from the inverse orthogonal transform unit 32 via the second switch 33B.
  • The reproduced decoded image is output to the outside as the decoded image signal DO1 and simultaneously stored in the frame buffer memory 35.
  • the second decoder 23 decodes the elementary stream ES2 for each macroblock.
  • As macroblock sizes, for example, four sizes of 16 × 16, 16 × 8, 8 × 16, and 8 × 8 pixels are specified, and as sub-block sizes, four sizes of 8 × 8, 8 × 4, 4 × 8, and 4 × 4 pixels can be specified.
  • The entropy decoding unit 40 performs entropy decoding such as variable length decoding on the elementary stream ES2 to generate quantized coefficients, and analyzes the elementary stream ES2 to obtain syntax elements.
  • The inverse quantization unit 41 dequantizes the quantized coefficients according to the quantization parameter, which is one of the syntax elements, to generate transform coefficients, and the inverse orthogonal transform unit 42 subjects these coefficients to an integer-precision inverse orthogonal transform to generate an image signal, which is supplied to the adder 47 and the image conversion unit 25.
  • When the output of the entropy decoding unit 40 is an intra-coded image (intra-frame encoded image), the second decoder 23 operates in an intra mode based on the H.264 standard. At this time, the switch 45 is connected to the motion compensation unit 43, but the motion compensation unit 43 does not output data; therefore, the adder 47 supplies the image signal from the inverse orthogonal transform unit 42 to the deblocking filter 48 as it is. The output of the adder 47 is also given to the intra prediction unit 44.
  • The deblocking filter 48 applies filtering to the output of the adder 47 in order to suppress so-called block noise (distortion that occurs near macroblock boundaries), and outputs the filtered signal to the outside as the decoded image signal DO2.
  • The filtered signal is also stored in the frame buffer memory 46 to be used as a reference image.
  • When the output of the entropy decoding unit 40 is a predictive-coded image (inter-frame encoded image) based on motion compensated prediction, the second decoder 23 operates in the motion compensated prediction mode.
  • the switch 45 is connected to the motion compensation unit 43.
  • the motion compensation unit 43 receives motion vector information from the entropy decoding unit 40.
  • The motion compensation unit 43 reads a reference image from the frame buffer memory 46, generates from the reference image a predicted image of the region specified by the motion vector with 1/4 pixel accuracy (quarter-pel accuracy), and supplies the predicted image signal to the adder 47 through the switch 45.
  • As the reference image, for example, an image three frames earlier can also be set.
  • The adder 47 adds the difference image signal to the predicted image signal input from the motion compensation unit 43 via the switch 45, and provides the sum to the deblocking filter 48.
  • The deblocking filter 48 outputs the filtered signal to the outside as the decoded image signal DO2 and also supplies it to the frame buffer memory 46.
  • When the output of the entropy decoding unit 40 is a predictive-coded image based on intra prediction, the second decoder 23 operates in the intra prediction mode.
  • At this time, the switch 45 is connected to the intra prediction unit 44.
  • In intra prediction, a predicted image is generated by interpolation within the same frame from pixels adjacent to the current macroblock.
  • the intra prediction unit 44 generates a prediction image according to the prediction pattern specified by the syntax element, and supplies the prediction image to the adder 47 via the switch 45.
  • the adder 47 adds the difference image signal to the prediction image signal input from the intra prediction unit 44 via the switch 45, and gives the addition signal to the deblocking filter 48.
  • the operation of the signal processing circuit 3A when the control unit 27 detects an error will be described below.
  • In this case, the first switch 33A of the first decoder 22 connects the syntax element conversion unit 26 to the second motion compensation unit 34B in accordance with the control signal SW1, and at the same time the second switch 33B connects the image conversion unit 25 to the adder 36 in accordance with the control signal SW1.
  • the second motion compensation unit 34B operates instead of the first motion compensation unit 34A.
  • In the motion compensated prediction mode, the second motion compensation unit 34B generates a predicted image from a high-quality reference image read from the frame buffer memory 35, using the syntax elements from the syntax element conversion unit 26 instead of the syntax elements from the entropy decoding unit 30. Furthermore, in the intra prediction mode, the intra prediction unit 37 generates a predicted image according to the prediction pattern specified by the syntax elements supplied from the syntax element conversion unit 26.
  • In the motion compensated prediction mode, the third switch 38 is connected to the hybrid motion compensation unit 34, and the second motion compensation unit 34B operates.
  • Motion vector information is input to the second motion compensation unit 34B from the syntax element conversion unit 26 via the first switch 33A.
  • The second motion compensation unit 34B reads the designated high-resolution reference image from the frame buffer memory 35, generates from this reference image a predicted image of the region specified by the motion vector, and supplies the predicted image to the adder 36 through the third switch 38.
  • the image conversion unit 25 supplies the difference image to be added to the predicted image to the adder 36 via the second switch 33B.
  • the adder 36 adds the difference image signal from the second switch 33B to the prediction image signal, and outputs the addition signal to the outside as the decoded image signal DO1.
  • the image conversion unit 25 generates a difference image suitable for the image format of the elementary stream ES1 based on the difference image from the inverse orthogonal transform unit 42.
  • For example, in the case of Japanese terrestrial digital broadcasting, the resolution of high-definition broadcasting (1920 × 1080 pixels) is larger than that of digital broadcasting for mobile terminals (320 × 180 pixels).
  • When the block size of the difference image input to the image conversion unit 25 is 4 × 4 pixels conforming to the H.264 standard, the image conversion unit 25 can generate a difference image with a block size of 24 × 24 pixels according to the resolution ratio of the two digital broadcasts.
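  • A minimal sketch of this block-size conversion is shown below; the nearest-neighbour replication is only one possible interpolation, since the description merely states that the difference image is converted (by interpolation, resolution conversion, or the like) to fit the image format of ES1.

```python
import numpy as np

HD_RES = (1920, 1080)       # fixed-receiver high-definition broadcast
MOBILE_RES = (320, 180)     # mobile broadcast
SCALE = HD_RES[0] // MOBILE_RES[0]      # = 6, and the vertical ratio is the same

def upscale_block(diff_block, scale=SCALE):
    """Expand an H.264 4x4 difference block to the 24x24 block used for ES1."""
    return np.kron(diff_block, np.ones((scale, scale), dtype=diff_block.dtype))

block_4x4 = np.arange(16, dtype=np.int16).reshape(4, 4)
block_24x24 = upscale_block(block_4x4)   # shape (24, 24)
```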
  • Since the decoded image stored in the frame buffer memory 35 of the first decoder 22 has a higher resolution than the decoded image stored in the frame buffer memory 46 of the second decoder 23, the second motion compensation unit 34B can execute motion compensated prediction with 1/2 pixel accuracy or a coarser accuracy.
  • As described above, the resolution of high-definition broadcasting (1920 × 1080 pixels) is larger than that of digital broadcasting for mobile terminals (320 × 180 pixels), so a high-quality decoded image can be reproduced even if motion compensated prediction is performed with an accuracy coarser than 1/4 pixel.
  • FIG. 4 is a diagram illustrating a method of generating the motion vector MV1 used in the second motion compensation unit 34B.
  • The entropy decoding unit 40 of the second decoder 23 gives the motion vector mv1 of the encoded macroblock mb1 to the syntax element conversion unit 26, together with the macroblock type information that specifies the prediction method of the macroblock mb1.
  • The syntax element conversion unit 26 generates the motion vector MV1 by converting the motion vector mv1 according to the format of the encoded macroblock MB1 to be processed by the first decoder 22.
  • Since the image H1 to which the encoded macroblock mb1 belongs has a resolution (horizontal and vertical) different from that of the image M1 to which the encoded macroblock MB1 belongs, the motion vector mv1 is converted into the motion vector MV1 so as to match the resolution of the image M1.
  • As shown in FIG. 5, the syntax element conversion unit 26 can also generate a motion vector mv2 by interpolation from the motion vectors mv1 and mv3 of the temporally preceding and following images H1 and H2, and further generate a motion vector MV2 by converting the motion vector mv2 according to the format of the encoded macroblock MB2 belonging to the image M2.
  • The syntax element conversion unit 26 interpolates not only motion vectors but also other syntax elements such as the macroblock type.
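  • The sketch below illustrates the kind of conversion described for FIGS. 4 and 5; scaling by the horizontal and vertical resolution ratios and linear temporal interpolation are assumptions consistent with the description, and the exact formulas are not specified in the patent.

```python
def scale_mv(mv, src_res=(320, 180), dst_res=(1920, 1080)):
    """Convert a motion vector of image H1 (e.g. mv1) to match the resolution of image M1."""
    return (mv[0] * dst_res[0] / src_res[0], mv[1] * dst_res[1] / src_res[1])

def interpolate_mv(mv1, mv3, t=0.5):
    """Generate mv2 for an intermediate picture from the temporally adjacent mv1 and mv3."""
    return (mv1[0] + t * (mv3[0] - mv1[0]), mv1[1] + t * (mv3[1] - mv1[1]))

# Example: mv2 is interpolated from mv1 and mv3 and then scaled to obtain MV2 for macroblock MB2.
mv1, mv3 = (2.0, -1.0), (6.0, 3.0)
MV2 = scale_mv(interpolate_mv(mv1, mv3))
```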
  • In the intra prediction mode, the intra prediction unit 37 operates instead of the second motion compensation unit 34B, and the third switch 38 is connected to the intra prediction unit 37.
  • A syntax element indicating the prediction pattern (prediction mode) used for encoding is given to the intra prediction unit 37 from the syntax element conversion unit 26.
  • The intra prediction unit 37 generates a predicted image in accordance with the prediction pattern specified by the syntax element, and provides the predicted image signal to the adder 36 via the third switch 38. The adder 36 then adds the difference image signal given from the image conversion unit 25 via the second switch 33B to the predicted image signal from the third switch 38, and outputs the sum as the decoded image signal DO1.
  • the first decoder 22 may include a filter corresponding to the deblocking filter 48 of the second decoder 23.
  • the intra prediction unit 37 executes intra prediction conforming to the H.264 standard in accordance with the image format of the elementary stream ES1.
  • a prediction image can be generated in units of blocks of 4 ⁇ 4 pixels or 16 ⁇ 16 pixels by intra prediction.
  • As described above, the resolution of high-definition broadcasting (1920 × 1080 pixels) is greater than that of digital broadcasting for mobile terminals (320 × 180 pixels), so the intra prediction unit 37 can generate a predicted image with a block size of 24 × 24 pixels according to the resolution ratio between the two digital broadcasts.
  • When the hybrid motion compensation unit 34 and the intra prediction unit 37 do not operate, the hybrid motion compensation unit 34 does not output data.
  • In this case, the image conversion unit 25 converts the image signal supplied from the inverse orthogonal transform unit 42 of the second decoder 23 in accordance with the image format of the elementary stream ES1, and the converted image signal can be output to the outside through the second switch 33B and the adder 36.
  • As described above, in the digital broadcast receiving device 1A, even if an error occurs in one elementary stream ES1 of the two simulcast digital broadcasts, the syntax elements acquired from the other elementary stream ES2 are converted according to the image format of the elementary stream ES1, and a predicted image can be generated by performing motion compensated prediction or intra prediction using the converted syntax elements. Therefore, an image that is as visually natural as possible can be provided.
  • In addition, the second motion compensation unit 34B can generate a high-quality predicted image by executing motion compensated prediction using the converted syntax elements with reference to the high-quality decoded image stored in the frame buffer memory 35.
  • Furthermore, the image conversion unit 25 generates the difference image to be added to the predicted image based on the difference image supplied from the inverse orthogonal transform unit 42 of the second decoder 23, so images that are as high in quality and as visually natural as possible can be provided.
  • In a mobile environment such as a vehicle or a PDA (Personal Digital Assistant), there are expected to be many cases where the digital broadcast for mobile terminals can still be received even when the reception condition of the digital broadcast for fixed receivers deteriorates.
  • Even in such a case, the digital broadcast receiving device 1A of the present embodiment can provide an image that is as high in quality and as visually natural as possible.
  • FIG. 6 is a functional block diagram showing a schematic configuration of the digital broadcast receiving device 1B of the second embodiment.
  • FIG. 7 is a functional block diagram showing a schematic configuration of the decoder block 29 of the digital broadcast receiving device 1B. It should be noted that components denoted by the same reference numerals between FIG. 2 and FIG. 6, and between FIG. 3 and FIG. 7, have substantially the same function and configuration and will not be described in detail.
  • digital broadcast receiving apparatus 1B includes antenna 10, receiving circuit (front end) 2, and signal processing circuit 3B.
  • the reception circuit 2 includes a tuner 11 and a demodulation circuit 12.
  • the signal processing circuit 3B includes an error monitoring unit 20, a demultiplexer (DMUX) 21, a decoder block 29, a control unit (output control unit) 27, and an input unit 28.
  • the control unit 27 can individually control the operations of the error monitoring unit 20 and the decoder block 29.
  • In this embodiment as well, one elementary stream ES1 output from the demultiplexer 21 is a high-quality compressed and encoded bit string of the digital broadcast for fixed receivers, and the other elementary stream ES2 is a relatively low-quality compressed and encoded bit string of the digital broadcast for mobile terminals, although the invention is not limited to this.
  • decoder block 29 includes a first decoder 22B, a second decoder 23B, a frame buffer memory 46, and a signal switching unit 50.
  • The first decoder 22B has a general MPEG-2 decoder configuration consisting of an entropy decoding unit (ED) 30, an inverse quantization unit (IQ) 31, an inverse orthogonal transform unit (IOT) 32, a first motion compensation unit (MC) 34A, and an adder 36.
  • the operations of these components 30 to 32 and 36 are the same as those of the corresponding components shown in FIG.
  • the first motion compensation unit 34A shown in FIG. 7 performs the same operation as the first motion compensation unit 34A shown in FIG.
  • the entropy decoding unit 30, the inverse quantization unit 31, and the inverse orthogonal transform unit 32 may constitute the “first inverse transform processing unit” of the present invention.
  • The second decoder 23B has a configuration conforming to the H.264 standard and includes an entropy decoding unit (ED) 40, an inverse quantization unit (IQ) 41, an inverse orthogonal transform unit (IOT) 42, a second motion compensation unit (MC) 43B, an intra prediction unit (IP) 44B, a switch 45B, an adder 47B, and a deblocking filter 48.
  • the entropy decoding unit 40, the inverse quantization unit 41, and the inverse orthogonal transform unit 42 may constitute the “second inverse transform processing unit” of the present invention.
  • The second motion compensation unit 43B has the same configuration and function as the second motion compensation unit 34B of the first embodiment (FIG. 3), and the intra prediction unit 44B has the same configuration and function as the intra prediction unit 37 of the first embodiment (FIG. 3).
  • The second decoder 23B further includes an image conversion unit 25 that converts the resolution of the output image of the inverse orthogonal transform unit 42 so as to match the resolution of the elementary stream ES1, and a syntax element conversion unit 26 that converts the syntax elements acquired by the entropy decoding unit 40 so as to conform to the resolution of the elementary stream ES1.
  • The image conversion unit 25 and the syntax element conversion unit 26 shown in FIG. 7 also have substantially the same configuration and function as the image conversion unit 25 and the syntax element conversion unit 26 of the first embodiment (FIG. 3), respectively.
  • The first motion compensation unit 34A of the first decoder 22B and the motion compensation unit 43B of the second decoder 23B both refer to the same frame buffer memory 46, and read the reference image from the frame buffer memory 46 when performing motion compensated prediction.
  • The signal switching unit 50 selects one of the decoded image signal DO1 supplied from the first decoder 22B and the decoded image signal DO2 supplied from the second decoder 23B, and supplies the selected signal as the output image signal CS to an external device (not shown). As will be described later, when the control unit 27 does not detect an error, the signal switching unit 50 selects the decoded image signal DO1, and when the control unit 27 detects an error, it selects the decoded image signal DO2.
  • The operation of the digital broadcast receiving apparatus 1B having the above configuration will be described below.
  • the operations described below are based on the assumption that the digital broadcast receiver 1B is simultaneously receiving digital broadcasts for simulcast fixed receivers and mobile digital broadcasts.
  • The entropy decoding unit 30 analyzes the elementary stream ES1 to obtain syntax elements, and supplies these syntax elements to the first motion compensation unit 34A and the inverse quantization unit 31.
  • At the same time, the entropy decoding unit 30 performs entropy decoding on the elementary stream ES1 to generate quantized coefficients.
  • The inverse quantization unit 31 dequantizes the quantized coefficients according to the quantization parameter, which is one of the syntax elements, to generate DCT coefficients, and the inverse orthogonal transform unit 32 subjects these coefficients to an inverse orthogonal transform (for example, an inverse DCT) to generate an image signal, which is supplied to the adder 36.
  • The adder 36 gives the image signal from the inverse orthogonal transform unit 32 to the signal switching unit 50 as it is, as the decoded image signal DO1.
  • The signal switching unit 50 selects the decoded image signal DO1 as the output image signal CS in accordance with the switching control signal SW.
  • The decoded image signal DO1 is supplied from the signal switching unit 50 to the external device and is simultaneously stored in the frame buffer memory 46.
  • In the motion compensated prediction mode, the first motion compensation unit 34A reads the decoded image stored in the frame buffer memory 46 as a reference image, generates from the reference image a predicted image of the region specified by the motion vector with 1/2 pixel accuracy, and gives the predicted image to the adder 36.
  • The adder 36 reproduces the decoded image by adding the predicted image to the difference image from the inverse orthogonal transform unit 32.
  • The reproduced decoded image is given to the signal switching unit 50 as the decoded image signal DO1.
  • The decoded image signal DO1 is supplied from the signal switching unit 50 to an external device (not shown) and simultaneously stored in the frame buffer memory 46.
  • Meanwhile, the entropy decoding unit 40 applies entropy decoding such as variable length decoding to the elementary stream ES2 to generate quantized coefficients, and at the same time analyzes the elementary stream ES2 to obtain syntax elements. These syntax elements are supplied to the syntax element conversion unit 26 and the inverse quantization unit 41. The inverse quantization unit 41 dequantizes the quantized coefficients according to the quantization parameter, which is one of the syntax elements, to generate transform coefficients, and the inverse orthogonal transform unit 42 subjects these coefficients to an integer-precision inverse orthogonal transform to reproduce a decoded image or a difference image, which is given to the image conversion unit 25.
  • Based on the image from the inverse orthogonal transform unit 42, the image conversion unit 25 generates an image that conforms to the image format of the elementary stream ES1. For example, in the case of Japanese terrestrial digital broadcasting, as described above, the resolution of high-definition broadcasting (1920 × 1080 pixels) is greater than that of digital broadcasting for mobile terminals (320 × 180 pixels). In such a case, the image conversion unit 25 can convert, for example, an image with a block size of 4 × 4 pixels into an image with a block size of 24 × 24 pixels in accordance with the resolution of the high-definition broadcast.
  • the adder 47B provides the decoded image from the image conversion unit 25 to the deblocking filter 48 as it is.
  • The deblocking filter 48 performs filtering on the decoded image signal from the adder 47B and provides the filtered signal to the signal switching unit 50 as the decoded image signal DO2.
  • The signal switching unit 50 selects the decoded image signal DO2 as the output image signal CS.
  • the output image signal CS is stored in the frame buffer memory 46 at the same time as being supplied from the signal switching unit 50 to an external device (not shown).
  • In the motion compensated prediction mode, the switch 45B is connected to the second motion compensation unit 43B.
  • The second motion compensation unit 43B performs motion compensated prediction to generate a predicted image, using the syntax elements (motion vector information) converted by the syntax element conversion unit 26 and referring to the frame buffer memory 46, in which the decoded images decoded by the first decoder 22B are stored.
  • The adder 47B adds the predicted image to the difference image from the image conversion unit 25 to generate a decoded image.
  • The decoded image is supplied to the signal switching unit 50 after being filtered by the deblocking filter 48.
  • The signal switching unit 50 selects the decoded image signal DO2 from the deblocking filter 48 as the output image signal CS, and the output image signal CS is supplied to an external device (not shown) and simultaneously stored in the frame buffer memory 46.
  • In the intra prediction mode, the switch 45B is connected to the intra prediction unit 44B.
  • The intra prediction unit 44B performs intra prediction using the syntax elements (information indicating the prediction mode) converted by the syntax element conversion unit 26, generates a predicted image, and supplies the predicted image to the adder 47B via the switch 45B.
  • the adder 47B adds the predicted image to the difference image from the image conversion unit 25 to generate a decoded image.
  • This decoded image is supplied to the signal switching unit 50 after being filtered by the deblocking filter 48.
  • The signal switching unit 50 selects the decoded image signal DO2 from the deblocking filter 48 as the output image signal CS, and the output image signal CS is supplied to an external device (not shown) and simultaneously stored in the frame buffer memory 46.
  • As in the first embodiment, even if an error occurs in one elementary stream ES1 of the two simulcast digital broadcasts, the syntax elements acquired from the elementary stream ES2 of the other digital broadcast are converted in accordance with the image format of the elementary stream ES1, and a predicted image can be generated by executing motion compensated prediction or intra prediction using the converted syntax elements. Therefore, images that are as visually natural as possible can be provided.
  • In addition, the second motion compensation unit 43B can generate a high-quality predicted image by performing motion compensated prediction using the converted syntax elements with reference to the high-quality decoded image decoded by the first decoder 22B and stored in the frame buffer memory 46.
  • Furthermore, the image conversion unit 25 generates the difference image to be added to the predicted image based on the difference image supplied from the inverse orthogonal transform unit 42, so images that are as high in quality and as visually natural as possible can be provided.
  • All or part of the configuration of the signal processing circuits 3A and 3B of the first and second embodiments may be realized by hardware, or may be realized by a program (or program code) recorded on a recording medium such as a nonvolatile memory or an optical disc. Such a program (or program code) can cause a processor such as a CPU to execute all or part of the processing of the signal processing circuits 3A and 3B.
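  • As a highly simplified sketch of how such a program might tie the steps together (the structure and names below are assumptions, not the patent's code), the error decision selects between the normal MPEG-2 decoding path and the path that reuses the converted syntax elements and difference image:

```python
def process(ts_packets, demux, monitor, decoder1, decoder2, convert_syntax, convert_image):
    """One frame of the simulcast processing: decode ES1 normally, or substitute ES2 data on error."""
    es1, es2 = demux(ts_packets)                       # separate the two simulcast streams
    error = monitor(es1)                               # error detection processing
    syn2, diff2 = decoder2.analyze_and_inverse_transform(es2)
    if not error:
        frame = decoder1.decode(es1)                   # normal MPEG-2 decoding path
    else:
        syn = convert_syntax(syn2)                     # adapt H.264 syntax elements to the ES1 format
        diff = convert_image(diff2)                    # adapt the difference image resolution
        frame = decoder1.predict_and_add(syn, diff)    # motion compensated or intra prediction
    return frame                                       # output image signal CS
```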

Abstract

A signal processing device for producing a video display that is as natural and as high in quality as possible from a received digital broadcast signal delivered by simulcast. The signal processing device comprises a first decoder that performs motion compensated prediction by analyzing a first encoded bitstream, acquiring a syntax element, and using that syntax element for the motion compensated prediction; a second decoder that performs motion compensated prediction by analyzing a second encoded bitstream, acquiring a syntax element, and using that syntax element for the motion compensated prediction; and a syntax element conversion section for converting the syntax element acquired by the second decoder into a syntax element suited to the image format of the first encoded bitstream. In response to the detection of an error, the first decoder performs the motion compensated prediction using the syntax element obtained by the conversion in the syntax element conversion section instead of the syntax element acquired by the first decoder.
PCT/JP2006/323799 2005-12-02 2006-11-29 Dispositif de traitement de signal de réception d’émission numérique, méthode de traitement de signal, programme de traitement de signal et récepteur d’émissions numériques WO2007063890A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2007547967A JP4672734B2 (ja) 2005-12-02 2006-11-29 デジタル放送受信用の信号処理装置、信号処理方法および信号処理プログラム、並びにデジタル放送受信装置

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2005349352 2005-12-02
JP2005-349352 2005-12-02

Publications (1)

Publication Number Publication Date
WO2007063890A1 (fr)

Family

ID=38092223

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2006/323799 WO2007063890A1 (fr) 2005-12-02 2006-11-29 Dispositif de traitement de signal de réception d’émission numérique, méthode de traitement de signal, programme de traitement de signal et récepteur d’émissions numériques

Country Status (2)

Country Link
JP (1) JP4672734B2 (fr)
WO (1) WO2007063890A1 (fr)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009100424A (ja) * 2007-10-19 2009-05-07 Fujitsu Ltd 受信装置、受信方法
JP2009130415A (ja) * 2007-11-20 2009-06-11 Kenwood Corp デジタル放送受信装置およびコンピュータプログラム
RU2472294C2 (ru) * 2007-06-20 2013-01-10 Моторола Мобилити, Инк. Сигнал и устройство широковещательного канала, предназначенные для управления передачей и приемом информации широковещательного канала

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH1093981A (ja) * 1996-07-17 1998-04-10 Sony Corp 画像符号化装置および画像符号化方法、画像復号化装置および画像復号化方法、伝送方法、並びに記録媒体
JP2003134064A (ja) * 2001-10-26 2003-05-09 Hitachi Ltd デジタル放送補完方法およびデジタル放送受信システム
JP2003274303A (ja) * 2002-03-18 2003-09-26 Sony Corp ディジタル放送受信装置、車載装置、ディジタル放送受信の案内方法、ディジタル放送受信の案内方法のプログラム及びディジタル放送受信の案内方法のプログラムを記録した記録媒体
JP2005260606A (ja) * 2004-03-11 2005-09-22 Fujitsu Ten Ltd デジタル放送受信装置
JP2005311435A (ja) * 2004-04-16 2005-11-04 Denso Corp 移動体用放送受信装置およびプログラム

Also Published As

Publication number Publication date
JP4672734B2 (ja) 2011-04-20
JPWO2007063890A1 (ja) 2009-05-07

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
ENP Entry into the national phase

Ref document number: 2007547967

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 06833603

Country of ref document: EP

Kind code of ref document: A1