EP0881835B1 - Coding and decoding method for a video signal using inter-picture prediction, with conversion of periodically selected video fields into progressively scanned video frames - Google Patents

Coding and decoding method for a video signal using inter-picture prediction, with conversion of periodically selected video fields into progressively scanned video frames

Info

Publication number
EP0881835B1
EP0881835B1 (application EP98106084A)
Authority
EP
European Patent Office
Prior art keywords
scanning
fields
frames
progressive scanning
interlaced
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Lifetime
Application number
EP98106084A
Other languages
English (en)
French (fr)
Other versions
EP0881835A3 (de)
EP0881835A2 (de)
Inventor
Kenji Sugiyama
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Victor Company of Japan Ltd
Original Assignee
Victor Company of Japan Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Family has litigation
First worldwide family litigation filed: https://patents.darts-ip.com/?family=26485127&utm_source=google_patent&utm_medium=platform_link&utm_campaign=public_patent_search&patent=EP0881835(B1) ("Global patent litigation dataset" by Darts-ip is licensed under a Creative Commons Attribution 4.0 International License.)
Application filed by Victor Company of Japan Ltd
Publication of EP0881835A2
Publication of EP0881835A3
Application granted
Publication of EP0881835B1
Anticipated expiration
Expired - Lifetime

Links

Images

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/50Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding
    • H04N19/503Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding involving temporal prediction
    • H04N19/51Motion estimation or motion compensation
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/10Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
    • H04N19/102Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or selection affected or controlled by the adaptive coding
    • H04N19/103Selection of coding mode or of prediction mode
    • H04N19/105Selection of the reference unit for prediction within a chosen coding or prediction mode, e.g. adaptive choice of position and number of pixels used for prediction
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/10Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
    • H04N19/102Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or selection affected or controlled by the adaptive coding
    • H04N19/103Selection of coding mode or of prediction mode
    • H04N19/112Selection of coding mode or of prediction mode according to a given display mode, e.g. for interlaced or progressive display mode
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/10Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
    • H04N19/134Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or criterion affecting or controlling the adaptive coding
    • H04N19/136Incoming video signal characteristics or properties
    • H04N19/137Motion inside a coding unit, e.g. average field, frame or block difference
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/44Decoders specially adapted therefor, e.g. video decoders which are asymmetric with respect to the encoder
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/50Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding
    • H04N19/503Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding involving temporal prediction
    • H04N19/51Motion estimation or motion compensation
    • H04N19/577Motion compensation with bidirectional frame interpolation, i.e. using B-pictures
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/01Conversion of standards, e.g. involving analogue television standards or digital television standards processed at pixel level
    • H04N7/0117Conversion of standards, e.g. involving analogue television standards or digital television standards processed at pixel level involving conversion of the spatial resolution of the incoming video signal
    • H04N7/012Conversion between an interlaced and a progressive signal

Definitions

  • the present invention relates to a method and apparatus for high-efficiency encoding to efficiently convert an interlaced type of video signal into a stream of compressed code, for the purpose of transmission or storage.
  • the invention relates to encoding processing which uses bidirectional motion prediction encoding, applied to an interlaced type of video signal.
  • the term "picture" will be used as a general term for referring to the contents of a field of an interlaced video signal, or a frame of a progressive scanning (i.e., non-interlaced) type of video signal.
  • a method of high-efficiency video encoding for an interlaced type of video signal whereby one in every m successive frames (where m is an integer of 2 or more) is encoded either independently by internal encoding or by unidirectional predictive encoding, while the remaining frames (referred to as the B frames) are encoded by bidirectional predictive encoding using preceding and succeeding ones of the aforementioned specific frames (i.e., I or P frames).
  • Such predictive encoding of a video signal is now well known in the art, being described for example in Japanese patent laid-open number HEI 2-192378, of the assignee of the present invention, etc.
  • the technique is also used with the MPEG-1 system (ISO/IEC-11172), and the MPEG-2 system (ISO/IEC-13818).
  • the first and second fields of each frame of the interlaced video signal are time-displaced by 1/60 second, and are also mutually displaced by one scanning line position, in the vertical direction of the picture.
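  • (As a minimal illustration of this field structure, the following Python sketch splits a frame array into its two interlaced fields; the array shapes and the helper name are illustrative assumptions and are not part of the patent.)

```python
import numpy as np

def split_into_fields(frame: np.ndarray):
    """Split a frame (rows x cols) into its two interlaced fields.

    The first field carries the even-numbered scanning lines (0, 2, 4, ...),
    the second field the odd-numbered lines (1, 3, 5, ...); in a real
    interlaced source the two fields are also one field period apart in time.
    """
    first_field = frame[0::2, :]   # even lines
    second_field = frame[1::2, :]  # odd lines
    return first_field, second_field

# Example: a 480-line frame yields two 240-line fields.
frame = np.arange(480 * 8).reshape(480, 8)
top, bottom = split_into_fields(frame)
assert top.shape == (240, 8) and bottom.shape == (240, 8)
```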
  • a method is used such as with the MPEG-2 standard etc., whereby processing is performed in units of fields, with a plurality of fields being used to constitute a reference picture, or whereby processing is basically performed in units of frames, but with prediction being switched to perform local prediction in units of fields when necessary.
  • each of the aforementioned picture types (i.e., I, P, B) is established in units of frames, so that the I-pictures and B-pictures must each be set as respective consecutive pairs of fields.
  • Fig. 5 shows an example of the configuration of a prior art type of video encoding apparatus which uses bidirectional prediction for encoding the B fields. It will be assumed that prediction is performed in units of fields, but that the I, P and B picture types are established in units of interlaced frames as described above.
  • the interlaced video signal which is input to the video input terminal 7 is supplied to the input signal selection switch 56 which is controlled to operate in synchronism with successive fields of the input video signal such that the I and P frames are supplied to a subtractor 51 while the B frames are supplied to a frame delay element 61.
  • in the following, the term "video signal" signifies a digital video signal.
  • the subtractor 51 subtracts an inter-picture prediction signal (i.e., consisting of successive predicted values for respective pixels of a frame) that is produced by an inter-picture prediction section 57 from the I or P frame signal which is supplied thereto, and supplies the resultant difference values, i.e. prediction error values, to a DCT section 52.
  • the DCT section 52 performs DCT (Discrete Cosine Transform) conversion processing on successive sets of prediction error values which correspond to respective blocks of 8 x 8 (or 16 x 16) pixels of a picture, and the transform coefficients thereby obtained are supplied to a quantizer 53.
  • the quantizer 53 performs quantization of the coefficients, using a predetermined quantization step size, and the resultant fixed-length encoded coefficients are supplied to a variable-length encoder 54 and to a dequantizer 55.
  • the variable-length encoder 54 performs array conversion of the 2-dimensional 8 x 8 sets of coefficients into a 1-dimensional sequence, using zig-zag sequence processing, and encodes the result by Huffman encoding, i.e. using the numbers of runs of coefficient values of zero or of coefficient values other than zero.
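  • (To make the transform, quantization and zig-zag stages described above concrete, the following is a hedged numpy-only sketch of an 8 x 8 DCT, uniform quantization with a fixed step size, and zig-zag reordering into a 1-dimensional sequence; the step size and helper names are illustrative assumptions and are not taken from the patent or from any MPEG quantization table.)

```python
import numpy as np

def dct_matrix(n: int = 8) -> np.ndarray:
    """Orthonormal DCT-II basis matrix of size n x n."""
    k = np.arange(n)
    c = np.sqrt(2.0 / n) * np.cos(np.pi * (2 * k[None, :] + 1) * k[:, None] / (2 * n))
    c[0, :] = np.sqrt(1.0 / n)
    return c

def zigzag_order(n: int = 8):
    """Return (row, col) index pairs of an n x n block in zig-zag order."""
    return sorted(((r, c) for r in range(n) for c in range(n)),
                  key=lambda rc: (rc[0] + rc[1],
                                  rc[0] if (rc[0] + rc[1]) % 2 else rc[1]))

def encode_block(block: np.ndarray, step: float = 16.0) -> np.ndarray:
    """DCT, quantize with a fixed step size, and zig-zag scan one 8 x 8 block."""
    c = dct_matrix(block.shape[0])
    coeffs = c @ block @ c.T                       # 2-D DCT
    quantized = np.round(coeffs / step).astype(int)
    return np.array([quantized[r, col] for r, col in zigzag_order(block.shape[0])])

block = np.random.default_rng(0).integers(-128, 128, size=(8, 8)).astype(float)
sequence = encode_block(block)   # 64 coefficients, ready for run-length / Huffman coding
```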
  • the resultant code sequences into which the I and P frames have been respectively converted are multiplexed with the code sequences which are obtained for the B frames, by the multiplexer 13, and the resultant code stream is supplied to the code output terminal 14.
  • the dequantizer 55 and the inverse DCT section 60 perform the inverse processing to that of the quantizer and the DCT section 52, to thereby reproduce the inter-picture prediction error values, and the values thus obtained are added to the prediction signal by the adder 59, to obtain values expressing successive reconstructed pictures, which are supplied to the picture memory 58.
  • the reconstructed pictures which are thus stored in the picture memory 58 are thereafter read out and supplied to the inter-picture prediction section 57 at appropriate timings.
  • the inter-picture prediction section 57 generates different prediction signals in accordance with respective types of picture (i.e., I, P or B), supplies the prediction signals derived for the I and P frames to the subtractor 51, and supplies the prediction signals derived for the B frames to one input of the subtractor 17.
  • for an I frame, the prediction signal values are always zero.
  • for a P frame, the prediction signal is obtained based on a preceding I or P frame.
  • for a B frame, the prediction signal is obtained based on preceding and succeeding I or P frames.
  • the frame delay section 61 applies a delay of (m - 1) frames, and the delayed B frame signal is then supplied to the subtractor 17. Since the picture type is established in units of frames, the delay must be established in units of frame periods.
  • the resultant delayed picture signal (i.e. successive pixel values) is input to the subtractor 17 in synchronism with predicted values supplied from the inter-picture prediction section 57, to obtain respective prediction error values for the B frame, which are encoded by the DCT section 18, quantizer 19 and variable-length encoder 20, in the same way as for the DCT section 52, quantizer 53 and variable-length encoder 54.
  • the encoding system for the B frames does not contain any local decoding section.
  • the code sequences obtained for the B frames are multiplexed by the multiplexer 13 with the code sequences derived for the I and P frames in a different order from that of the order of the frames of the original input video signal. That is to say, the order must be changed such as to ensure that the code sequence for each B frame will not be transmitted from the code output terminal 14 until after the code sequences for the I or P frames which were used in predictive encoding of that B frame have been transmitted.
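  • (The reordering constraint just described, namely that the code for a B picture may only be emitted after the code for both of its reference pictures, can be sketched as follows; the picture pattern used in the example is an illustrative assumption.)

```python
def display_to_coding_order(picture_types):
    """Reorder pictures so every B picture follows both of its references.

    `picture_types` is the display-order list, e.g. ['I','B','B','P','B','B','P'].
    B pictures are buffered until the next I or P reference has been emitted.
    """
    coding_order, pending_b = [], []
    for index, ptype in enumerate(picture_types):
        if ptype in ('I', 'P'):
            coding_order.append((index, ptype))  # reference goes out first
            coding_order.extend(pending_b)       # then the B pictures it closes
            pending_b = []
        else:
            pending_b.append((index, ptype))     # hold B until the next reference
    coding_order.extend(pending_b)               # trailing Bs with no later reference
    return coding_order

print(display_to_coding_order(['I', 'B', 'B', 'P', 'B', 'B', 'P']))
# -> [(0,'I'), (3,'P'), (1,'B'), (2,'B'), (6,'P'), (4,'B'), (5,'B')]
```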
  • a video decoding apparatus corresponding to the video encoding apparatus of Fig. 5 will be described in the following, referring to the system block diagram of Fig. 6.
  • an input code stream i.e., consisting of successive code sequences for respective I, P, B pictures which have been generated by the video encoding apparatus of Fig. 5 is supplied to a code input terminal 33, to be separated by a demultiplexer 34 into the code sequences for the I and P frames and the code sequences for the B frames.
  • the code sequences for the I and P frames are supplied to a variable-length decoder 62, to be restored to fixed code length format, then reconstructed prediction error values for a frame are derived by a dequantizer 75 and inverse DCT section 80, and added to predicted values for that frame by an adder 79, to obtain pixel values expressing reconstructed pictures, which are stored in a picture memory 63.
  • the inter-picture prediction section 64 generates prediction signals and supplies these to the adder 79 (in the case of the I and P frames) and to the adder 41 (in the case of the B frames).
  • the inter-picture prediction section 64 differs in operation from the inter-picture prediction section 57 in that it does not perform motion estimation or prediction mode selection, and operates only in accordance with the transmitted information, so that the amount of processing which is executed is much less than that executed by the inter-picture prediction section 57.
  • the code sequences for the B frames are decoded by the variable-length decoder 38, dequantizer 39, and inverse DCT section 40 to obtain reconstructed prediction error values, which are added to the corresponding predicted pixel values by an adder 41, to thereby obtain reconstructed B frames.
  • the output selection switch 42 selects the values for the reconstructed I and P frames, read out from the picture memory 63, and the reconstructed B frames, produced from the adder 41, to be supplied to the picture output terminal 43. This is executed such that the order in which the sets of values for respective frames are supplied to the picture output terminal 43 is identical to the picture sequence of the original video signal (prior to encoding), rather than the order in which the encoded data sequences for the frames are output from the video encoding apparatus.
  • the pixel values for each frame are supplied to the output selection switch 42 from the picture memory 63, or obtained from the adder 41, as a set of values for the first field followed by a set of values for the second field of the frame, so that an interlaced video signal is obtained from output terminal 43.
  • a further problem which arises in the prior art is that when an interlaced video signal is obtained by reproduction from a recorded medium, or from a transmission source, the reconstructed interlaced pictures are not suitable for display by a progressive scanning type of monitor, such as is generally used to display text, images, etc., in the field of computers and data processing.
  • the present invention provides a video encoding apparatus and method, and corresponding video decoding apparatus and method whereby an interlaced video signal can be encoded and subsequently decoded as a series of code sequences expressing pictures which have been respectively encoded either by intra-picture encoding (i.e., I pictures), by unidirectional predictive encoding (i.e., P pictures) or by bidirectional predictive encoding (i.e. the B pictures), which differs from the prior art in that:
  • improved prediction is achieved by periodically selecting specific fields of an interlaced video signal to be converted to respective progressive scanning frames, performing encoding and decoding of each such progressive scanning frame by independent encoding or by unidirectional predictive encoding, while leaving the remaining fields unchanged as interlaced scanning fields, and performing bidirectional predictive encoding and decoding of such interlaced scanning fields by using preceding and succeeding progressive scanning frames as reference frames.
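  • (As a hedged sketch of this selection scheme, the following shows one way of tagging every m-th field for conversion to a progressive scanning I or P frame while the remaining fields stay interlaced B fields; the concrete values of m and of the I-picture spacing are illustrative assumptions only.)

```python
def assign_picture_types(num_fields: int, m: int = 4, gop_length: int = 12):
    """Label each interlaced field of the input video signal.

    Every m-th field is converted to a progressive-scanning frame and coded as
    an I or P picture (here: an I picture every `gop_length` fields, otherwise P);
    all remaining fields stay interlaced and are coded as B pictures.
    """
    plan = []
    for n in range(num_fields):
        if n % m == 0:
            ptype = 'I' if n % gop_length == 0 else 'P'
            plan.append((n, ptype, 'progressive frame'))
        else:
            plan.append((n, 'B', 'interlaced field'))
    return plan

for entry in assign_picture_types(8, m=4):
    print(entry)
# (0, 'I', 'progressive frame'), (1, 'B', 'interlaced field'), ...,
# (4, 'P', 'progressive frame'), (5, 'B', 'interlaced field'), ...
```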
  • the invention provides a video encoding apparatus and method, a video decoding apparatus and method and an encoded video recorded medium as defined in the attached claims.
  • the present invention can provide a video encoding method and apparatus and video decoding method and apparatus whereby an interlaced video signal can be encoded with a very high level of encoding efficiency, and furthermore whereby the resultant code, after having been transmitted and received, or recorded and reconstructed, can be decoded to recover the original video signal as an interlaced signal, or as a progressive scanning video signal in which each field of the originally encoded video signal has been converted to a progressive scanning frame (having double the number of scanning lines of an interlaced field) which can be directly displayed by various types of data processing display apparatus, etc., that can only utilize a progressive scanning video signal.
  • an embodiment of a motion-compensation encoding apparatus according to the present invention will be described in the following, referring to the system block diagram of Fig. 1.
  • elements that are identical in function and operation to elements of the prior art video encoding apparatus example of Fig. 5 are designated by identical numerals to those of Fig. 5.
  • the apparatus of Fig. 1 differs from that of Fig. 5 by including a progressive scanning conversion section 1 and scanning line decimation section 15.
  • the processing executed with this embodiment will be described by comparison with that of the prior art video encoding apparatus example of Fig. 5.
  • the picture processing units are respective fields of the input interlaced video signal which is supplied to the video signal input terminal 7, and the picture type (i.e., I, P, B) is also established in units of these interlaced fields.
  • the time-axis separation between successive I, P or B pictures is the field period, e.g., 1/60 second.
  • the major features of this embodiment are as follows.
  • the fields of the input video signal which are to be processed as I and P type pictures are each subjected to conversion processing to increase the scanning line density, i.e. interpolation of scanning lines is performed to double the number of scanning lines per field and thereby achieve conversion to progressive scanning frames.
  • the fields of the input video signal which are to be processed as B type pictures are left unchanged, i.e. are encoded in units of fields, in a similar manner to that described for the B type fields of the prior art example of Fig. 5.
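  • (The line-density doubling applied to the I and P fields can be illustrated by the following simple intra-field sketch, which fills each missing line with the average of its two neighbours; the patent itself refers to motion-compensated inter-field interpolation, so this averaging is only an assumed simplification.)

```python
import numpy as np

def field_to_progressive_frame(field: np.ndarray, top_field: bool = True) -> np.ndarray:
    """Double the scanning-line density of one interlaced field.

    Simple intra-field sketch: missing lines are the average of the two nearest
    field lines (the method referenced in the patent uses motion-compensated
    inter-field interpolation, which also needs the neighbouring fields).
    """
    rows, cols = field.shape
    frame = np.zeros((2 * rows, cols), dtype=field.dtype)
    own = 0 if top_field else 1                        # lines the field contains
    frame[own::2, :] = field
    neighbours = 0.5 * (field[:-1, :] + field[1:, :])  # interpolate between lines
    if top_field:
        frame[1:-1:2, :] = neighbours
        frame[-1, :] = field[-1, :]                    # repeat last line at frame edge
    else:
        frame[2::2, :] = neighbours
        frame[0, :] = field[0, :]                      # repeat first line at frame edge
    return frame

field = np.arange(240 * 4, dtype=float).reshape(240, 4)
frame = field_to_progressive_frame(field)   # 480 x 4 progressive frame
```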
  • Fig. 2 shows the arrangement of scanning lines of the resultant picture types.
  • the effective number of scanning lines is 480 lines per frame, 240 lines per field. Since each of the I and P type pictures is converted to have twice the number of scanning lines of the original fields, the amount of encoding processing which must be executed for each of these is accordingly doubled.
  • Encoding of the I and P progressive scanning frames is not executed as for the prior art example, i.e. based on the two fields/frame configuration. Instead, as can be clearly understood from the scanning line configuration illustrated in Fig. 2, prediction is executed for each P type progressive scanning frame based on a preceding I type or P type progressive scanning frame, and is executed for each B type field based on preceding and succeeding ones of the I type or P type progressive scanning frames. It can thus be seen that the encoding processing can be simplified by comparison with the prior art method of the example of Fig. 5.
  • a set of prediction signal values is generated by the inter-picture prediction section 9, (in synchronism with input to the subtractor 17 of the pixel values for the field which is to be encoded) based upon specific I or P progressive scanning frames which succeed and precede that interlaced field. Since that set has been derived using progressive scanning frames, it contains twice the number of prediction signal values that are required for encoding an interlaced field, i.e. the set includes respective sub-sets of values which correspond to scanning lines that are omitted from the interlaced field. For that reason, each such set of prediction signal values is subjected to decimation processing by the scanning line decimation section 15, to eliminate each of these sub-sets of prediction signal values corresponding to respective scanning lines which do not appear in the field which is being encoded.
  • the scanning line decimation section 15 must be controlled in accordance with whether the field which is being encoded contains the even-numbered or odd-numbered scanning lines of an interlaced scanning frame, so that appropriate sub-sets of prediction signal values will be eliminated. That is to say the scanning line decimation section 15 must extract, from a set of prediction signal values supplied from the inter-picture prediction section 9, those values which match the scanning lines of the field which is to be encoded.
  • assuming that the I and P progressive scanning frames designated by numerals 200, 203 are used to derive prediction signal values for encoding the interlaced scanning field 201 as a B field (the first scanning lines of the fields 201, 202 being indicated as 201a, 202a), the result of the operation of the inter-picture prediction section 9 and scanning line decimation section 15 will be as follows.
  • from a complete set of prediction signal values which are generated by the inter-picture prediction section 9 (i.e., a set which would be appropriate for encoding a progressive scanning frame), all sub-sets which correspond to scanning lines that do not occur in the field 201 are eliminated by the scanning line decimation section 15.
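  • (The decimation performed by the scanning line decimation section 15 amounts to keeping, from the progressive scanning prediction frame, only those lines whose positions coincide with the scanning lines of the field being encoded; the following sketch assumes a simple top/bottom parity convention, which is an illustrative assumption.)

```python
import numpy as np

def decimate_prediction_to_field(prediction_frame: np.ndarray,
                                 field_is_top: bool) -> np.ndarray:
    """Keep only the prediction lines that coincide with the field's scanning lines.

    No low-pass filtering is applied before the decimation: the prediction frame
    is itself derived from interlaced material whose vertical bandwidth is
    already limited (cf. the decoder-side decimation sections).
    """
    start = 0 if field_is_top else 1     # assumed parity convention
    return prediction_frame[start::2, :]

prediction_frame = np.random.default_rng(1).random((480, 16))
prediction_for_top_field = decimate_prediction_to_field(prediction_frame, True)      # 240 x 16
prediction_for_bottom_field = decimate_prediction_to_field(prediction_frame, False)  # 240 x 16
```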
  • the operation of the input selection switch 8 is synchronized with successive fields of the input video signal.
  • the technology relating to such control is very well known, so that detailed description of specific switch control apparatus is omitted.
  • values for m in the range 3 to 6 are appropriate, i.e., values which are larger than those used in the prior art, and the proportion of B pictures in the overall stream of I, P, B pictures can be made accordingly greater than is possible with prior art methods, without making the prediction distance excessively large, so that the amount of encoded data can be substantially reduced by comparison with the prior art.
  • for the same value of m, the prediction distance would be halved by comparison with the prior art example, so that a corresponding increase in motion prediction accuracy would be obtained.
  • the progressive scanning conversion section 1 can be configured as shown in Japanese Patent Laid-open No. HEI 8-130716, whereby motion prediction is performed in units of small blocks, using preceding and succeeding fields, with interpolation of scanning lines which have been omitted from a field due to the interlaced scanning. Since the contents (i.e., pixel values) of preceding and succeeding interlaced fields are required for this interpolation operation, it will be understood that the progressive scanning conversion section 1 includes any necessary delay elements for achieving this, such as a field memory.
  • the operation of each of the subtractor 2, the DCT section 3, the quantizer 4, and the variable-length encoder 5 is basically identical to that of the corresponding element of the prior art example. However, since there is one progressive scanning frame in each field period of the original video signal, e.g., 1/60 second, it is necessary to execute processing at twice the speed of the prior art example if real-time processing is to be achieved.
  • the operating speed of the picture memory 10 is the same as that of the inverse DCT section 12; however, the memory capacity is the same as that of the prior art example. That is to say, with the prior art example there are two interlaced fields per I frame or P frame, whereas with the present invention each I frame or P frame is a single progressive scanning frame having twice the number of scanning lines of an interlaced field.
  • since the inter-picture prediction section 9 can execute processing simply in units of progressive scanning frames, the operation can be simpler than for the prior art example. Specifically, motion estimation is performed within the inter-picture prediction section 9 by operating on blocks of 16 x 16 pixels or 8 x 8 pixels, and motion compensation is executed in accordance with the detected motion vectors. In general, the accuracy of motion compensation is to within 1/2 of a pixel.
  • a delay of (m - 1) fields must be applied to the picture signal by the field delay element 16. Since the picture types (i.e., I, P, B) are established in units of interlaced scanning fields of the input video signal as described above, the delay applied by the field delay element 16 is set in units of field periods. However, in addition, for the progressive scanning conversion section 1 to perform inter-field interpolation for generating the I and P progressive scanning frames, a delay of one field must occur between inputting video signal values for a field to the progressive scanning conversion section 1 and output of resultant video signal values for a progressive scanning frame. Thus, it is necessary to apply a corresponding amount of delay to the B fields to compensate for this. As a result, the total delay applied by the field delay element 16 must be m field periods (e.g., m/60 seconds).
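  • (A short worked check of this delay budget, assuming a 1/60 second field period as in the text:)

```python
def b_field_delay(m: int, field_period: float = 1.0 / 60.0):
    """Total delay applied to a B field by the field delay element 16.

    (m - 1) field periods for picture reordering, plus one field period for the
    inter-field interpolation latency of the progressive scanning conversion,
    gives m field periods in total.
    """
    reorder_delay = m - 1
    conversion_latency = 1
    total_fields = reorder_delay + conversion_latency
    return total_fields, total_fields * field_period

print(b_field_delay(4))   # -> (4, 0.0666...) i.e. 4 field periods = 4/60 seconds
```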
  • each of the subtractor 17, DCT section 18, quantizer 19 and variable-length encoder 20 is basically identical to that of the prior art example, but with processing being executed in units of fields. That is to say, as described above, the prediction signal which is supplied to the subtractor 17 to derive difference values (i.e., values of prediction error) for a B field is derived from preceding and succeeding I or P progressive scanning frames which are separated from that B field by one or more field periods.
  • the processing system that is constituted by the subtractor 2, the DCT section 3, the quantizer 4 and the variable-length encoder 5 is configured separately from the processing system that is constituted by the subtractor 17, DCT section 18, quantizer 19 and variable-length encoder 20.
  • since the processing executed by these two systems is basically the same, they could be combined into a single system through use of time-sharing operation.
  • the relationships between the processing timings for encoding the I, P and B fields could be as shown conceptually in the example designated as "Processing a", in Fig. 4.
  • since the processing time interval required for each of the I and P frames is twice that required for a B field, it is necessary to make the processing time for a B field shorter than a field period (e.g., shorter than 1/60 second).
  • a subsampler operating along the horizontal direction of each B field can be inserted between the subtractor and the DCT section of such a combined configuration (i.e., the elements which respectively perform the functions of the subtractors 2, 17 and the DCT sections 3, 18 in the embodiment of Fig. 1).
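  • (The horizontal subsampler mentioned for such a combined configuration can be sketched as a simple 2:1 horizontal decimation of the B-field prediction error values; the two-tap averaging pre-filter used here is an illustrative assumption.)

```python
import numpy as np

def subsample_horizontally(prediction_error: np.ndarray) -> np.ndarray:
    """2:1 horizontal subsampling of B-field prediction error values.

    A simple two-tap average acts as the pre-filter before dropping every second
    column, roughly halving the amount of data (and hence processing time)
    needed by the subsequent DCT stage.
    """
    return 0.5 * (prediction_error[:, 0::2] + prediction_error[:, 1::2])

error_field = np.random.default_rng(2).standard_normal((240, 720))
subsampled = subsample_horizontally(error_field)   # 240 x 360
```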
  • Fig. 3 shows an embodiment of a video decoding apparatus corresponding to the video encoding apparatus of Fig. 1 for decoding the output code generated by that video encoding apparatus, e.g., when transmitted code is received or recorded code is reconstructed.
  • Fig. 3 elements that are identical to elements of the prior art example of Fig. 6 are designated by identical numerals.
  • the configuration of Fig. 3 differs from that of the prior art example of Fig. 6 by including scanning line decimation sections 81 and 36.
  • the operation of the inter-picture prediction section 35 differs from that of the prior art example of Fig. 6, while in addition the processing executed by each of the variable-length decoder 31, the dequantizer 6, the inverse DCT section 12 and the adder 11 with respect to the I and P pictures is in units of progressive scanning frames.
  • the code which is input from the code input terminal 33 is separated by the demultiplexer 34 into the I and P frame code sequences and the B frame code sequences.
  • the I and P frame code sequences are subjected to decoding processing by the variable-length decoder 31, the dequantizer 6, the inverse DCT section 12 and the adder 11, in the same way as for the prior art example of Fig. 6, to thereby obtain sets of pixel values for reconstructed pictures expressed as respective progressive scanning frames, which are then stored in a picture memory 32.
  • it is necessary that processing of each progressive scanning frame be completed within one field period, e.g., 1/60 second.
  • the inter-picture prediction section 35 generates a prediction signal (expressing successive predicted pixel values) based on the decoded progressive scanning frames stored in the picture memory 32, supplies this to the adder 11 for deriving pixel values for reconstructed I and P progressive scanning frames, and supplies a prediction signal to scanning line decimation section 81 for use in decoding the B fields.
  • the inter-picture prediction section 35 operates in a basically similar manner to the inter-picture prediction section 9 of Fig. 1. However the inter-picture prediction section 35 differs from the inter-picture prediction section 9 in that it does not perform motion estimation or prediction mode selection. Thus the amount of processing performed by the inter-picture prediction section 35 is substantially less than that executed by the inter-picture prediction section 9.
  • the B-picture code sequences supplied from the demultiplexer 34 (which correspond to respective interlaced-scanning fields, as described above referring to Fig. 1) are decoded by the variable-length decoder 38, the dequantizer 39, the inverse DCT section 40 and by addition of prediction signal values in the adder 41, to obtain respective reconstructed B field signals (i.e., successive reconstructed pixel values for the B fields).
  • the operation of the scanning line decimation section 81 of this embodiment in selecting appropriate sets of prediction signal values from those generated by the inter-picture prediction section 35, in relation to the scanning lines of the B fields which are to be decoded, is identical to that of the scanning line decimation section 15 of Fig. 1, which has been described in detail hereinabove.
  • respective sets of reconstructed pixel values expressing these B fields are obtained, i.e. the corresponding interlaced fields of the original video signal are thereby obtained from the adder 41, and supplied to the output selection switch 42.
  • sets of reconstructed pixel values corresponding to respective I, P progressive scanning frames are read out from the picture memory 32, via the output lines designated as 32b in Fig. 3.
  • Each of these sets is supplied to the scanning line decimation section 36, which eliminates specific sub-sets of pixel values which correspond to scanning lines that must be omitted in order to convert a progressive scanning frame to an appropriate (i.e. odd-numbered or even-numbered) interlaced scanning field.
  • the values expressing that field are then supplied to the output selection switch 42.
  • the output selection switch 42 is controlled, in conjunction with control of read out of values from the picture memory 32, to transfer the sets of values for reconstructed fields which have been derived from the I and P progressive scanning frames and the sets of values for the reconstructed B fields to the video output terminal 43 in the same order of interlaced scanning fields as that of the original video signal prior to encoding.
  • a reconstructed output interlaced video signal is thereby obtained, with the fields having been restored to the time-axis order of the original video signal, rather than that of the code sequences representing respective I, P and B pictures which are output from the video encoding apparatus of Fig. 1.
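  • (The role of the output selection switch 42 in restoring the original field order can be sketched as follows; carrying the original field index alongside each decoded picture is illustrative bookkeeping, not a mechanism described in the patent.)

```python
def restore_display_order(decoded_pictures):
    """Put decoded pictures back into the time-axis order of the original fields.

    `decoded_pictures` is a list of (original_field_index, picture) pairs in the
    order the code sequences arrive (I/P references before the B fields that
    depend on them); sorting by the original index restores display order.
    """
    return [picture for _, picture in sorted(decoded_pictures, key=lambda p: p[0])]

arrival_order = [(0, 'I0'), (3, 'P3'), (1, 'B1'), (2, 'B2'), (6, 'P6'), (4, 'B4'), (5, 'B5')]
print(restore_display_order(arrival_order))
# -> ['I0', 'B1', 'B2', 'P3', 'B4', 'B5', 'P6']
```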
  • the scanning line decimation sections 81 and 36 perform processing which differs from subsampling processing in that no filter processing is executed prior to the decimation operation. This is because the reconstructed progressive scanning pictures are derived from original interlaced pictures which have been converted to progressive scanning form, so that the vertical scanning frequency characteristic is already limited to a degree that is appropriate for an interlaced signal, prior to the decimation processing.
  • the video decoding apparatus embodiment of Fig. 3 in combination with the video encoding apparatus embodiment of Fig. 1, enables a system for transmitting or recording an interlaced video signal as encoded data with a very high encoding efficiency.
  • the apparatus could be configured to perform sub-sampling of the reconstructed prediction error values for the B fields, by inserting a subsampler operating along the horizontal direction of each field, i.e. with the subsampler being inserted (during each interval in which a B field is being decoded) between the inverse DCT section that would be used in common for decoding I, P and B pictures and the adder which receives the output values from that inverse DCT section. In that way it would become possible to increase the time available for decoding each of the I and P progressive scanning frames, in the same manner as described above referring to Fig. 4 for the case of encoding processing.
  • Fig. 7 shows a second embodiment of a decoding apparatus for operating on the encoded video data generated by the encoding apparatus of Fig. 1.
  • elements that are identical to elements of the prior art example of Fig. 6 are designated by identical numerals.
  • the configuration of Fig. 7 differs from that of the prior art example of Fig. 6 by including a scanning line interpolation section 44, while in addition the operation of the inter-picture prediction section 35 differs from that of the prior art example of Fig. 6, and furthermore the processing executed by each of the variable-length decoder 31, the dequantizer 6, the inverse DCT section 12 and the adder 11 with respect to the I and P pictures is in units of progressive scanning frames, in the same way as for the video decoding apparatus embodiment of Fig. 3 described above.
  • the code which is input from the code input terminal 33 is separated by the demultiplexer 34 into the I and P frame code sequences, i.e., for respective progressive scanning frames, and the B field code sequences, for respective interlaced scanning fields.
  • the I and P frame code sequences are subjected to decoding processing by the variable-length decoder 31, the dequantizer 6, the inverse DCT section 12 and the adder 11, in the same way as for the prior art example of Fig. 6, to thereby obtain sets of reconstructed pixel values expressing respective I and P progressive scanning frames, which are stored in the picture memory 32.
  • it is necessary that decoding of each progressive scanning frame be completed in one field period (e.g., 1/60 second).
  • the inter-picture prediction section 35 generates a prediction signal in the same way as described for the embodiment of Fig. 3, based on the progressive scanning frames stored in the memory 32, and supplies this to the adder 11 for deriving pixel values for the reconstructed I and P progressive scanning frames.
  • the inter-picture prediction section 35 supplies a prediction signal to the adder 41, for deriving pixel values for reconstructed progressive scanning frames which are obtained from decoded B fields as described in the following.
  • the B field code sequences are decoded by the variable-length decoder 38, the dequantizer 39 and the inverse DCT section 40 to obtain respective sets of reconstructed prediction error values for the B fields, i.e., this processing is performed in units of interlaced scanning fields.
  • Each set of reconstructed prediction error values thereby obtained for a field is then subjected to scanning line interpolation (oversampling) in the vertical direction by the scanning line interpolation section 44, to obtain a corresponding set of reconstructed prediction error values for a progressive scanning frame.
  • the set of prediction error values corresponding to a 240 line field is converted to a set of prediction error values corresponding to a 480 line progressive scanning frame, by generation and insertion of 240 sub-sets of interpolated prediction error values which correspond to respective interpolated scanning lines, before being supplied to the adder 41.
  • This processing is executed within each field by oversampling, and is a simple form of processing by comparison with that executed by the progressive scanning conversion section 1 of the video encoding apparatus of Fig. 1.
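  • (The oversampling performed by the scanning line interpolation section 44 can be sketched as follows; the simplest possible intra-field interpolation, repetition of the nearest decoded line, is used here purely as an illustrative assumption.)

```python
import numpy as np

def oversample_error_field(error_field: np.ndarray, top_field: bool = True) -> np.ndarray:
    """Convert a 240-line set of B-field prediction error values to 480 lines.

    Simplest intra-field oversampling: each missing line repeats its nearest
    decoded line, so the result aligns with the 480-line prediction signal
    produced from the I and P progressive scanning frames.
    """
    rows, cols = error_field.shape
    frame = np.empty((2 * rows, cols), dtype=error_field.dtype)
    own = 0 if top_field else 1
    frame[own::2, :] = error_field        # decoded lines keep their positions
    frame[1 - own::2, :] = error_field    # missing lines repeat the nearest line
    return frame

error_240 = np.random.default_rng(3).standard_normal((240, 720))
error_480 = oversample_error_field(error_240)   # 480 x 720, added to the prediction by adder 41
```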
  • the prediction signal values which are output from the inter-picture prediction section 35 are added, in the adder 41, to the prediction error values which have been converted to progressive scanning frame format by the scanning line interpolation section 44, to thereby obtain sets of reconstructed pixel values for respective progressive scanning frames that have been derived from decoded B fields.
  • the output selection switch 42 receives the pictures expressed as respective reconstructed I and P progressive scanning frames, which are read out from the picture memory 32 at appropriate timings (as described for the embodiment of Fig. 3), and the pictures expressed as respective progressive scanning frames derived from the B fields, which are output from the adder 41, and outputs these progressive scanning frames to the video output terminal 43 in the appropriate order for producing a progressive scanning video signal in which the sequence of the progressive scanning frames is identical to that of the originally encoded interlaced video signal and in which the frame period is half of the frame period of that original video signal.
  • that is to say, in the same way as described for the embodiment of Fig. 3, the order in which data expressing respective reconstructed pictures are supplied to the video output terminal 43 is identical to the time-axis order of the respectively corresponding pictures (expressed by interlaced fields) of the original video signal, rather than that of the I, P, B code sequences which are received by the video decoding apparatus.
  • Fig. 8 is a diagram for conceptually illustrating the successive progressive scanning frames which are thereby supplied to the video output terminal 43. Respectively different symbols are used in Fig. 8 to indicate scanning lines which are derived in each of three different ways, i.e. scanning lines (indicated by circles) which are reconstructed from original scanning lines of the interlaced fields of the video signal, scanning lines (indicated by diamonds) which are inserted by the operation of the progressive scanning conversion section 1 of the video encoding apparatus of Fig. 1, and scanning lines (indicated by stars) which are inserted as a result of the operation of the scanning line interpolation section 44 in conjunction with the adder 41 in the video decoding apparatus of Fig. 7.
  • a further aspect of the present invention is that it enables a digital encoded video recorded medium having a high efficiency of encoding to be realized. Specifically, this can be achieved by multiplexing the picture code sequences which are generated by the video encoding apparatus of Fig. 1 with code sequences for audio and control information using the MPEG system standards, adding error correction codes to the resultant code, modulating a recording signal with the resultant code and then recording the modulated recording signal on a recorded medium.
  • this aspect of the invention can provide an encoded video recorded medium implemented as a recorded medium having recorded thereon interlaced video information which has been subjected to high-efficiency encoding.
  • This is achieved by recording on the recorded medium respective code sequences each of which is derived from a corresponding field of a field-interlaced digital video signal, with the code sequences consisting of first and second code sequences, where each of the first code sequences is generated by converting one out of every m fields of the interlaced video signal into a progressive scanning frame having double the number of scanning lines of an interlaced field (where m is an integer of 2 or more), and performing encoding of each such progressive scanning frame either by intra-frame encoding or by unidirectional inter-frame predictive encoding using a progressive scanning frame as a reference frame, and where each of the second code sequences is generated by bidirectional inter-frame predictive encoding of one of the remaining fields of the video signal (i.e., which has been left unchanged as an interlaced scanning field) using preceding and succeeding progressive scanning frames as reference frames.
  • the recording can be executed at high speed, in the case of a read-only type of recorded medium, by using a stamper, etc.
  • Reproduced code which is read from such an encoded recorded medium can be decoded to obtain either the original interlaced video signal, by using the first video decoding apparatus embodiment of the invention, or to obtain a progressive scanning video signal, by using the second video decoding apparatus embodiment as described above.
  • one in m fields of a digital interlaced video signal (where m is an integer of 2 or greater) is converted to a progressive scanning frame and is encoded and decoded either independently within the frame or by unidirectional prediction, while other fields which are left unchanged as interlaced fields are encoded and decoded by bidirectional prediction using preceding and succeeding progressive scanning frames. That is to say, all of the reference pictures which are utilized for inter-picture prediction consist of progressive scanning frames. As a result, the time-axis deviations which occur when interlaced fields (i.e., fields which are mutually displaced by one field period along the time axis) are used as reference pictures are eliminated.
  • Another advantage is as follows.
  • in the prior art, when intra-frame encoding of an I frame is performed, the encoding must utilize two successive fields, which are mutually displaced by one field period along the time axis and also by one scanning line along the vertical picture direction.
  • aliasing components are generated by such intra-frame encoding.
  • with the present invention, each I picture to which intra-frame encoding is applied consists of a single progressive scanning frame. Hence, the problem of generation of aliasing components is eliminated by the method of the present invention.
  • since the reference pictures for inter-frame encoding always consist of progressive scanning frames, i.e., with the scanning line density being twice that of a field, the accuracy of motion estimation in the vertical direction of a picture is accordingly doubled, so that motion compensation is more accurate than is possible with the prior art method.
  • if the same value of m were used as in the prior art video encoding apparatus example, the inter-picture prediction distance would be halved in the case of a video encoding apparatus according to the present invention, so that the prediction error values would accordingly be made smaller.
  • if the inter-picture prediction distance were to be made the same as that of the prior art video encoding apparatus example, then by comparison with the prior art video encoding apparatus example, the value of m could be doubled and hence the proportion of B frames could be made accordingly greater, with the apparatus of the present invention. Since the B frames are encoded by bidirectional prediction, and so require a smaller amount of code than the P and I frames, a correspondingly greater degree of code compression can be achieved.
  • the conversion processing for converting interlaced scanning fields to progressive scanning frames is performed on only one in m of the total number of fields, it is possible with the present invention (as described hereinabove referring to the video decoding apparatus embodiment of Fig. 7) to configure the decoding system such that although a field-interlaced video signal is encoded, the finally decoded video signal is a progressive scanning video signal in which each frame corresponds to a field of the original interlaced video signal, so that the frame frequency is twice that of the original video signal.
  • the decoded video signal can be in a form which can be directly supplied to various types of display apparatus such as computer monitors, liquid crystal displays, PDP displays, etc., i.e., to those types of apparatus which can display only progressive scanning pictures, and high display quality can be achieved.
  • An encoding apparatus includes a selector (8) for periodically selecting fields of an interlaced video signal to be converted to respective progressive scanning frames, by a scanning converter (1) which doubles the number of scanning lines per field.
  • the apparatus encodes these frames by intra-frame encoding or unidirectional predictive encoding using preceding ones of the frames, and encodes the remaining fields of the video signal by bidirectional prediction using preceding and succeeding ones of the progressive scanning frames for reference.
  • the resultant code can be decoded by an inverse process to recover the interlaced video signal, or each decoded field can be converted to a progressive scanning frame to thereby enable output of a progressive scanning video signal.
  • Enhanced accuracy of motion prediction for inter-frame encoding can thereby be achieved, and generation of encoded aliasing components suppressed, since all reference pictures are progressive scanning frames rather than pairs of fields constituting interlaced scanning frames.

Claims (7)

  1. A video encoding apparatus for encoding an interlaced video signal, comprising:
    progressive scanning conversion means (1) for converting one field in every m fields of the interlaced video signal into a single progressive scanning frame having double the scanning line density of an interlaced field, while the remaining fields are left unchanged as interlaced scanning fields, where m is an integer of 2 or more,
    first encoding means (2, 3, 4, 5, 6, 9, 10, 11, 12) for encoding each progressive scanning frame either by independent intra-frame encoding or by unidirectional predictive encoding based on progressive scanning frames which have already been encoded, and
    second encoding means (15, 17, 18, 19, 20) for applying a prediction signal to execute predictive encoding of each remaining field of the video signal, other than the fields which are converted to progressive scanning frames;
       wherein the second encoding means generates the prediction signal by deriving a set of progressive scanning prediction signal values from selected progressive scanning frames which precede and succeed each remaining field along the time axis, and by applying decimation processing to convert the set of progressive scanning prediction signal values into a plurality of sub-sets corresponding to respective scanning lines which are located identically to the scanning lines of each of the remaining fields, the plurality of sub-sets constituting the prediction signal.
  2. A video decoding apparatus for decoding successive code sequences which have been generated by encoding respective progressive scanning frames, derived by doubling a scanning line density of one in every m fields of an original interlaced video signal, where m is an integer of 2 or greater, with the encoding processing using intra-frame encoding or unidirectional predictive encoding, and by applying a prediction signal to encode each remaining field of the video signal directly as an interlaced scanning field, the prediction signal being obtained by deriving a set of progressive scanning prediction signal values from selected progressive scanning frames which precede and succeed each remaining field along the time axis, and executing decimation processing to convert the set of progressive scanning prediction signal values into a plurality of sub-sets corresponding to respective scanning lines which are located identically to the scanning lines of each remaining field, the plurality of sub-sets constituting the prediction signal, the apparatus comprising:
    first decoding means (6, 11, 12, 31, 32, 35) for decoding each of the code sequences corresponding to the progressive scanning frames by intra-frame decoding or by unidirectional predictive decoding using decoded progressive scanning frames, to thereby obtain a series of reconstructed pictures,
    second decoding means (38, 39, 40, 41, 81) for executing bidirectional predictive decoding of respective code sequences corresponding to each of the fields which are encoded as interlaced scanning fields, comprising inter-picture prediction means (35) for deriving sets of progressive scanning prediction signal values corresponding to each of the fields which are encoded as interlaced scanning fields, using a preceding or succeeding picture of the series of reconstructed pictures as reference frames, decimation means (81) for deriving, from the progressive scanning prediction signal values, sub-sets of prediction signal values respectively corresponding to the scanning lines of the fields which are encoded as interlaced scanning fields, means (38, 39, 40) for deriving sets of reconstructed prediction error values corresponding to each of the fields which are encoded as interlaced scanning fields, and adding means (41) for adding the prediction signal values to the reconstructed prediction error values, to obtain reconstructed fields, and
    picture reconfiguration means (32, 36, 42) for executing decimation of scanning lines of each of the progressive scanning frames decoded by the first decoding means, to obtain converted fields each having the same number of scanning lines as an interlaced scanning field, and for executing time-axis combination of respective ones of the converted fields with the reconstructed fields into an appropriate sequence for reproducing the original interlaced video signal.
  3. A video decoding apparatus for decoding successive code sequences generated by encoding respective progressive scanning frames, derived by doubling the scanning line density of one in every m fields of an original interlaced video signal, where m is an integer of 2 or more, with the encoding process using intra-frame encoding or unidirectional predictive encoding, and by applying a prediction signal to encode each remaining field of the video signal directly as an interlaced scanning field, the prediction signal being obtained by deriving a set of progressive scanning prediction signal values from selected progressive scanning frames which precede or succeed each remaining field along the time axis, and executing decimation processing to convert the set of progressive scanning prediction signal values into a plurality of sub-sets corresponding to respective scanning lines which are located identically to the scanning lines of each remaining field, the plurality of sub-sets constituting the prediction signal, the apparatus comprising:
    decoding means (6, 11, 12, 31, 32, 35) for decoding each of the encoded progressive scanning frames by intra-frame decoding or by unidirectional predictive decoding using the encoded progressive scanning frames, to thereby produce a first series of reconstructed pictures,
    prediction error decoding means (38, 39, 40) for decoding prediction error values for each of the remaining fields which are encoded as interlaced scanning fields,
    interpolation means (44) for executing oversampling of the prediction error values in the vertical scanning direction of a field, to derive reconstructed prediction error values corresponding to a progressive scanning frame,
    inter-picture prediction means (35) for deriving sets of progressive scanning prediction signal values corresponding to each of the fields which are encoded unchanged as interlaced scanning fields, using a preceding or succeeding picture of the first series of reconstructed pictures as reference frames,
    adding means (41) for adding the progressive scanning prediction signal values to the reconstructed prediction error values corresponding to the progressive scanning lines, to thereby obtain a second series of reconstructed pictures, and
    picture reconfiguration means (32, 42) for inserting the second series of reconstructed pictures into the first series of reconstructed pictures, to obtain a reconstructed video signal in which all pictures are expressed as progressive scanning frames.
  4. An encoded video recorded medium having recorded thereon code expressing interlaced video information which has been encoded by high-efficiency encoding of an interlaced video signal, characterized in that
       one in every m fields of the interlaced video signal, where m is an integer of 2 or more, is converted into a progressive scanning frame having double the number of scanning lines, while the remaining fields are left unchanged as interlaced scanning fields,
       each of the progressive scanning frames is encoded either by independent intra-frame encoding or by unidirectional predictive encoding using, as reference frames, respective progressive scanning frames which have already been encoded, to thereby obtain first code sequences corresponding to the progressive scanning frames,
       each of the remaining fields which have been left unchanged as interlaced scanning fields is encoded by bidirectional prediction using a prediction signal, to obtain respective second code sequences corresponding to the remaining fields, the prediction signal being generated by deriving a set of progressive scanning prediction signal values from selected progressive scanning frames which precede and succeed each such field along the time axis, and executing decimation processing to convert the set of progressive scanning prediction signal values into a plurality of sub-sets corresponding to respective scanning lines which are located identically to the scanning lines of each remaining field, the plurality of sub-sets constituting the prediction signal, and in that
       the first and second code sequences are recorded to constitute the encoded video recorded medium.
  5. A method of coding an interlaced video signal, comprising the steps of:
    converting one field in every m fields of the interlaced video signal into a single progressive scanning frame having twice the scanning line density of an interlaced field, while leaving the remaining fields unchanged as interlaced scanning fields, where m is an integer of value 2 or higher,
    coding each progressive scanning frame either by independently coding the frame internally or by unidirectional predictive coding based on progressive scanning frames which have already been coded,
    deriving a set of progressive scanning prediction signal values from selected ones of the progressive scanning frames which precede or succeed each remaining field along the time axis, and applying decimation processing for converting each of the sets of progressive scanning prediction signal values into a plurality of subsets corresponding to respective scanning lines located identically to the scanning lines of each remaining field, and
    applying the plurality of subsets as a prediction signal for executing predictive coding of each remaining field.
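To make the coding structure of claim 5 concrete, here is a minimal numpy sketch under stated assumptions: the helper names field_to_progressive_frame and decimate_to_field_lines are illustrative inventions, the line-doubling filter is a simple intra-field average rather than the motion-adaptive conversion a real encoder would use, and the toy scheduler predicts each remaining field only from the nearest preceding converted frame.

```python
import numpy as np

def field_to_progressive_frame(field, top_field=True):
    """Line-double an interlaced field (H/2 x W) into a progressive frame (H x W).
    Assumption for illustration only: missing lines are filled by averaging the
    neighbouring field lines (simple intra-field interpolation)."""
    h, w = field.shape
    frame = np.zeros((2 * h, w), dtype=np.float64)
    offset = 0 if top_field else 1
    frame[offset::2] = field                          # existing field lines
    neighbour = np.vstack([field[1:], field[-1:]])    # line below (edge repeated)
    frame[1 - offset::2] = 0.5 * (field + neighbour)  # interpolated missing lines
    return frame

def decimate_to_field_lines(progressive_prediction, top_field=True):
    """Keep only the lines of a progressive prediction frame that coincide with
    the scanning lines of the field being predicted (the claimed decimation step)."""
    offset = 0 if top_field else 1
    return progressive_prediction[offset::2]

def encode_sequence(fields, m=2):
    """Toy scheduler: every m-th field becomes a progressive frame, the remaining
    fields are coded as prediction errors against the decimated prediction."""
    coded = []
    last_progressive = None
    for i, field in enumerate(fields):
        top = (i % 2 == 0)                            # assumed field parity
        if i % m == 0:                                # periodically selected field
            last_progressive = field_to_progressive_frame(field, top_field=top)
            coded.append(("progressive_frame", last_progressive))
        else:                                         # remaining field: predictive coding
            prediction = decimate_to_field_lines(last_progressive, top_field=top)
            residual = field - prediction             # prediction error to be coded
            coded.append(("interlaced_field_residual", residual))
    return coded

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    fields = [rng.random((4, 8)) for _ in range(6)]   # six tiny 4-line fields
    for kind, data in encode_sequence(fields, m=2):
        print(kind, data.shape)
```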
  6. A method of decoding successive code sequences generated by a first coding process applied to respective progressive scanning frames derived by doubling the scanning line density of one in every m fields of an original interlaced video signal, where m is an integer of value 2 or higher, the first coding process using intra-frame predictive coding or unidirectional predictive coding, and by a second coding process which applies a prediction signal for coding each remaining field of the video signal directly as an interlaced scanning field, the prediction signal being obtained by deriving a set of progressive scanning prediction signal values from selected ones of the progressive scanning frames which precede and succeed each remaining field along the time axis and executing decimation processing for converting the set of progressive scanning prediction signal values into a plurality of subsets corresponding to respective scanning lines located identically to the scanning lines of each of the remaining fields, the plurality of subsets establishing the prediction signal, the method comprising the steps of:
    decoding each code sequence corresponding to the progressive scanning frames by intra-frame decoding or by unidirectional predictive decoding using decoded progressive scanning frames, to thereby produce a series of reconstructed pictures,
    executing predictive decoding of the respective code sequences corresponding to each of the fields coded as interlaced scanning fields,
    deriving sets of progressive scanning prediction values for each of the fields coded as interlaced scanning fields, using a preceding or succeeding reconstructed picture of the series of reconstructed pictures as reference frames,
    deriving, from the progressive scanning prediction signal values, subsets of prediction signal values respectively corresponding to the scanning lines of the fields coded as interlaced scanning fields,
    deriving sets of reconstructed prediction error values for each of the fields coded as interlaced scanning fields,
    adding the prediction signal values to the reconstructed prediction error values to obtain reconstructed fields,
    decimating the scanning lines of each of the progressive scanning frames decoded in the first decoding step, to obtain converted fields each having the same number of scanning lines as an interlaced scanning field, and
    combining respective ones of the converted fields with the reconstructed fields in an appropriate sequence to reproduce the original interlaced video signal.
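As a rough illustration of the final two steps of claim 6, the sketch below (assumed helper and item names, not taken from the patent text) decimates each decoded progressive scanning frame back to a field with the original number of scanning lines and merges those converted fields with the predictively decoded fields in presentation order.

```python
import numpy as np

def progressive_frame_to_field(frame, top_field=True):
    """Decimate the scanning lines of a decoded progressive frame so that the
    result has the same number of lines as an interlaced scanning field."""
    offset = 0 if top_field else 1
    return frame[offset::2]

def rebuild_interlaced_sequence(decoded_items, m=2):
    """decoded_items: one entry per original field, in display order, either
    ("progressive_frame", frame) or ("reconstructed_field", field).
    Returns the reconstructed interlaced field sequence."""
    fields = []
    for i, (kind, data) in enumerate(decoded_items):
        top = (i % 2 == 0)                      # assumed field parity
        if kind == "progressive_frame":
            fields.append(progressive_frame_to_field(data, top_field=top))
        else:                                   # field rebuilt from prediction + error
            fields.append(data)
    return fields

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    items = []
    for i in range(6):
        if i % 2 == 0:
            items.append(("progressive_frame", rng.random((8, 8))))
        else:
            items.append(("reconstructed_field", rng.random((4, 8))))
    for f in rebuild_interlaced_sequence(items):
        print(f.shape)   # every entry is a 4-line field again
```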
  7. A method of decoding successive code sequences generated by a first coding process applied to respective progressive scanning frames derived by doubling the scanning line density of one in every m fields of an original interlaced video signal, where m is an integer of value 2 or higher, the first coding process using intra-frame predictive coding or unidirectional predictive coding, and by a second coding process which generates a prediction signal and applies the prediction signal for coding each remaining field of the video signal directly as an interlaced scanning field, the prediction signal being obtained by deriving a set of progressive scanning prediction signal values from selected ones of the progressive scanning frames which precede and succeed each remaining field along the time axis and executing decimation processing for converting the set of progressive scanning prediction signal values into a plurality of subsets corresponding to respective scanning lines located identically to the scanning lines of each remaining field, the plurality of subsets establishing the prediction signal, the method comprising the steps of:
    decoding each of the progressive scanning frames by intra-frame decoding or by unidirectional predictive decoding using decoded progressive scanning frames, to thereby produce a first series of reconstructed pictures,
    decoding prediction error values for each of the remaining fields coded as interlaced scanning fields,
    oversampling the prediction error values in the vertical scanning direction of a field, to produce reconstructed prediction error values corresponding to the scanning lines of a progressive scanning frame,
    deriving, for each field coded as an interlaced scanning field, prediction signal values on the basis of one of the first series of reconstructed pictures which precedes or succeeds it along the time axis, used as a reference frame,
    adding the prediction signal values to the reconstructed prediction error values, to thereby produce a second series of reconstructed pictures, and
    inserting the second series of reconstructed pictures into the first series of reconstructed pictures, to produce a video signal in which all pictures are expressed as progressive scanning frames.
EP98106084A 1997-05-30 1998-04-02 Kodier- und Dekodierverfahren für Videosignal mit Zwischenbild mit Konvertierung periodisch ausgewählter Videohalbbilder zu Videobildern mit progressiver Abtastung Expired - Lifetime EP0881835B1 (de)

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
JP157806/97 1997-05-30
JP15780697 1997-05-30
JP15780697 1997-05-30
JP25929197 1997-09-08
JP25929197A JP3164031B2 (ja) 1997-05-30 1997-09-08 動画像符号化復号化装置、動画像符号化復号化方法、及び動画像符号化記録媒体
JP259291/97 1997-09-08

Publications (3)

Publication Number Publication Date
EP0881835A2 EP0881835A2 (de) 1998-12-02
EP0881835A3 EP0881835A3 (de) 1999-06-30
EP0881835B1 true EP0881835B1 (de) 2004-06-16

Family

ID=26485127

Family Applications (1)

Application Number Title Priority Date Filing Date
EP98106084A Expired - Lifetime EP0881835B1 (de) 1997-05-30 1998-04-02 Kodier- und Dekodierverfahren für Videosignal mit Zwischenbild mit Konvertierung periodisch ausgewählter Videohalbbilder zu Videobildern mit progressiver Abtastung

Country Status (6)

Country Link
US (1) US6188725B1 (de)
EP (1) EP0881835B1 (de)
JP (1) JP3164031B2 (de)
KR (1) KR100285175B1 (de)
CN (1) CN1151684C (de)
DE (1) DE69824486T2 (de)

Families Citing this family (58)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9286294B2 (en) 1992-12-09 2016-03-15 Comcast Ip Holdings I, Llc Video and digital multimedia aggregator content suggestion engine
US7168084B1 (en) 1992-12-09 2007-01-23 Sedna Patent Services, Llc Method and apparatus for targeting virtual objects
JP4014263B2 (ja) * 1997-10-01 2007-11-28 松下電器産業株式会社 映像信号変換装置及び映像信号変換方法
JP3164056B2 (ja) * 1998-03-19 2001-05-08 日本ビクター株式会社 動画像符号化復号化装置、動画像符号化復号化方法及び動画像符号記録媒体
US6408029B1 (en) 1998-04-02 2002-06-18 Intel Corporation Method and apparatus for simplifying real-time data encoding
US7046734B2 (en) * 1998-04-02 2006-05-16 Intel Corporation Method and apparatus for performing real-time data encoding
US6904174B1 (en) * 1998-12-11 2005-06-07 Intel Corporation Simplified predictive video encoder
US9924234B2 (en) 1998-07-23 2018-03-20 Comcast Ip Holdings I, Llc Data structure and methods for providing an interactive program
US6754905B2 (en) 1998-07-23 2004-06-22 Diva Systems Corporation Data structure and methods for providing an interactive program guide
EP1097587A1 (de) 1998-07-23 2001-05-09 Diva Systems Corporation Interaktive benutzerschnittstelle
WO2000024194A1 (fr) * 1998-10-20 2000-04-27 Sony Corporation Dispositif et procede de traitement d'images
US7096487B1 (en) * 1999-10-27 2006-08-22 Sedna Patent Services, Llc Apparatus and method for combining realtime and non-realtime encoded content
US6754271B1 (en) 1999-04-15 2004-06-22 Diva Systems Corporation Temporal slice persistence method and apparatus for delivery of interactive program guide
US6904610B1 (en) 1999-04-15 2005-06-07 Sedna Patent Services, Llc Server-centric customized interactive program guide in an interactive television environment
AU1576801A (en) 1999-10-27 2001-05-08 Diva Systems Corporation Picture-in-picture and multiple video streams using slice-based encoding
FR2804274B1 (fr) 2000-01-25 2002-04-12 St Microelectronics Sa Decodeur mpeg d'images de sequences multiples
US6940911B2 (en) * 2000-03-14 2005-09-06 Victor Company Of Japan, Ltd. Variable picture rate coding/decoding method and apparatus
JP2001285863A (ja) * 2000-03-30 2001-10-12 Sony Corp 画像情報変換装置及び方法
EP1744563A3 (de) * 2000-07-21 2007-02-28 Matsushita Electric Industrial Co., Ltd. Signalübertragungssystem
US7023491B2 (en) * 2001-02-28 2006-04-04 Thomson Licensing Method and device for displaying frozen pictures on video display device
JP3715249B2 (ja) * 2001-04-27 2005-11-09 シャープ株式会社 画像処理回路、画像表示装置、並びに画像処理方法
US7793326B2 (en) 2001-08-03 2010-09-07 Comcast Ip Holdings I, Llc Video and digital multimedia aggregator
US7908628B2 (en) 2001-08-03 2011-03-15 Comcast Ip Holdings I, Llc Video and digital multimedia aggregator content coding and formatting
JP3797208B2 (ja) 2001-11-30 2006-07-12 日本ビクター株式会社 カラー動画像符号化装置、復号化装置、符号化方法、復号化方法及びカラー動画像符号列伝送方法
US7003035B2 (en) 2002-01-25 2006-02-21 Microsoft Corporation Video coding methods and apparatuses
US20040001546A1 (en) 2002-06-03 2004-01-01 Alexandros Tourapis Spatiotemporal prediction for bidirectionally predictive (B) pictures and motion vector prediction for multi-picture reference motion compensation
US7280700B2 (en) * 2002-07-05 2007-10-09 Microsoft Corporation Optimization techniques for data compression
US7088776B2 (en) * 2002-07-15 2006-08-08 Apple Computer, Inc. Method and apparatus for variable accuracy inter-picture timing specification for digital video encoding
US7154952B2 (en) 2002-07-19 2006-12-26 Microsoft Corporation Timestamp-independent motion vector prediction for predictive (P) and bidirectionally predictive (B) pictures
US6728315B2 (en) 2002-07-24 2004-04-27 Apple Computer, Inc. Method and apparatus for variable accuracy inter-picture timing specification for digital video encoding with reduced requirements for division operations
US7447264B2 (en) * 2002-11-07 2008-11-04 Victor Company Of Japan, Ltd. Moving-picture temporal scalable coding method, coding apparatus, decoding method, decoding apparatus, and computer program therefor
EP1455534A1 (de) * 2003-03-03 2004-09-08 Thomson Licensing S.A. Skalierbare Kodierung und Dekodierung von digitalen Zeilensprungvideosignalen
US7379656B2 (en) * 2003-05-05 2008-05-27 Thomson Licensing Forward trick modes on progressive video using special groups of pictures
US20050013498A1 (en) 2003-07-18 2005-01-20 Microsoft Corporation Coding of motion vector information
US10554985B2 (en) 2003-07-18 2020-02-04 Microsoft Technology Licensing, Llc DC coefficient signaling at small quantization step sizes
US7499495B2 (en) * 2003-07-18 2009-03-03 Microsoft Corporation Extended range motion vectors
US7426308B2 (en) * 2003-07-18 2008-09-16 Microsoft Corporation Intraframe and interframe interlace coding and decoding
US7609763B2 (en) * 2003-07-18 2009-10-27 Microsoft Corporation Advanced bi-directional predictive coding of video frames
US7738554B2 (en) 2003-07-18 2010-06-15 Microsoft Corporation DC coefficient signaling at small quantization step sizes
US7724827B2 (en) * 2003-09-07 2010-05-25 Microsoft Corporation Multi-layer run level encoding and decoding
US8064520B2 (en) 2003-09-07 2011-11-22 Microsoft Corporation Advanced bi-directional predictive coding of interlaced video
US7317839B2 (en) * 2003-09-07 2008-01-08 Microsoft Corporation Chroma motion vector derivation for interlaced forward-predicted fields
US7567617B2 (en) * 2003-09-07 2009-07-28 Microsoft Corporation Predicting motion vectors for fields of forward-predicted interlaced video frames
US7599438B2 (en) * 2003-09-07 2009-10-06 Microsoft Corporation Motion vector block pattern coding and decoding
EP1530373A2 (de) * 2003-11-06 2005-05-11 Matsushita Electric Industrial Co., Ltd. Speicheranordung für einen schnellen Zugriff auf Bildblockdaten gemäss einer verschiedenen Scanreihenfolge
EP1633128A1 (de) * 2004-09-02 2006-03-08 Deutsche Thomson-Brandt Gmbh Verfahren und Vorrichtung zur Dekodierung von kodierten Bildgruppen einer Videosequenz und zur Präsentation der besagten Videosequenz und der besagten Bildgruppen in einer zeitlich umgekehrten Reihenfolge
KR100694059B1 (ko) * 2004-09-30 2007-03-12 삼성전자주식회사 멀티 타임 스캔 방식에 기초한 인터 모드 인코딩 및디코딩 방법 및 장치
US8031777B2 (en) * 2005-11-18 2011-10-04 Apple Inc. Multipass video encoding and rate control using subsampling of frames
US8295343B2 (en) 2005-11-18 2012-10-23 Apple Inc. Video bit rate control method
US8233535B2 (en) * 2005-11-18 2012-07-31 Apple Inc. Region-based processing of predicted pixels
US20070116117A1 (en) * 2005-11-18 2007-05-24 Apple Computer, Inc. Controlling buffer states in video compression coding to enable editing and distributed encoding
US8780997B2 (en) * 2005-11-18 2014-07-15 Apple Inc. Regulation of decode-side processing based on perceptual masking
US7889789B2 (en) * 2006-04-07 2011-02-15 Microsoft Corporation Making interlace frame level coding mode decisions
US8254455B2 (en) * 2007-06-30 2012-08-28 Microsoft Corporation Computing collocated macroblock information for direct mode macroblocks
US8189666B2 (en) 2009-02-02 2012-05-29 Microsoft Corporation Local picture identifier and computation of co-located information
RU2467499C2 (ru) * 2010-09-06 2012-11-20 Государственное образовательное учреждение высшего профессионального образования "Поволжский государственный университет телекоммуникаций и информатики" (ГОУВПО ПГУТИ) Способ сжатия цифрового потока видеосигнала в телевизионном канале связи
CN103392341A (zh) 2010-12-23 2013-11-13 三星电子株式会社 用于对图像预测单元的帧内预测模式进行编码的方法和装置,以及用于对图像预测单元的帧内预测模式进行解码的方法和装置
US9154813B2 (en) 2011-06-09 2015-10-06 Comcast Cable Communications, Llc Multiple video content in a composite video stream

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4709394A (en) * 1985-08-23 1987-11-24 Rca Corporation Multiplexed real-time pyramid signal processing system
US4985768A (en) 1989-01-20 1991-01-15 Victor Company Of Japan, Ltd. Inter-frame predictive encoding system with encoded and transmitted prediction error
JP2530217B2 (ja) 1989-01-20 1996-09-04 日本ビクター株式会社 フレ―ム間予測符号化装置及び復号装置
JPH03132278A (ja) 1989-10-18 1991-06-05 Victor Co Of Japan Ltd 映像信号変換装置
US5353119A (en) * 1990-11-15 1994-10-04 Sony United Kingdom Limited Format conversion of digital video signals, integration of digital video signals into photographic film material and the like, associated signal processing, and motion compensated interpolation of images
US6101313A (en) * 1992-06-29 2000-08-08 Sony Corporation High efficiency encoding and decoding of picture signals and recording medium containing same
US5305104A (en) * 1992-07-27 1994-04-19 The Trustees Of Columbia University In The City Of New York Digitally assisted motion compensated deinterlacing for enhanced definition television
US5337089A (en) * 1993-06-07 1994-08-09 Philips Electronics North America Corporation Apparatus for converting a digital video signal which corresponds to a first scan line format into a digital video signal which corresponds to a different scan
CA2126467A1 (en) * 1993-07-13 1995-01-14 Barin Geoffry Haskell Scalable encoding and decoding of high-resolution progressive video
US5530482A (en) * 1995-03-21 1996-06-25 Texas Instruments Incorporated Pixel data processing for spatial light modulator having staggered pixels
WO1997013376A1 (en) * 1995-10-05 1997-04-10 Faroudja Y C Method and apparatus for producing from a standard-bandwidth color television signal a color video signal with extended vertical definition

Also Published As

Publication number Publication date
KR100285175B1 (ko) 2001-03-15
DE69824486T2 (de) 2005-07-07
DE69824486D1 (de) 2004-07-22
US6188725B1 (en) 2001-02-13
JP3164031B2 (ja) 2001-05-08
EP0881835A3 (de) 1999-06-30
CN1201332A (zh) 1998-12-09
KR19980087538A (ko) 1998-12-05
EP0881835A2 (de) 1998-12-02
CN1151684C (zh) 2004-05-26
JPH1146365A (ja) 1999-02-16

Similar Documents

Publication Publication Date Title
EP0881835B1 (de) Kodier- und Dekodierverfahren für Videosignal mit Zwischenbild mit Konvertierung periodisch ausgewählter Videohalbbilder zu Videobildern mit progressiver Abtastung
EP0576290B1 (de) Kodierung und Dekodierung von Bildsignalen
USRE44091E1 (en) Picture encoding method, picture encoding apparatus and picture recording medium
USRE35158E (en) Apparatus for adaptive inter-frame predictive encoding of video signal
JP3381855B2 (ja) 画像信号符号化方法および画像信号符号化装置、並びに画像信号復号化方法および画像信号復号化装置
KR100326641B1 (ko) 영상신호의전송,부호화,복호화방법및장치와광디스크의기록및재생방법
EP0910213A1 (de) Kodierung und Dekodierung von digitalen Videosignalen
WO1994023535A1 (en) Method and apparatus for coding video signal, and method and apparatus for decoding video signal
EP0933942B1 (de) Überträger, empfänger und medium für progressives bildsignal
JPH11355803A (ja) 立体映像再生方法
JPS61118085A (ja) 画像信号の符号化方式およびその装置
JP3540447B2 (ja) 動画像符号化装置及び復号装置
US5991494A (en) Digital image data processing apparatus and method
JP2006525735A (ja) 適応的な走査順序に基づいてブロックを使用するビデオ情報の符号化
JPH08237666A (ja) フレーム間帯域圧縮信号処理装置
JP3900534B2 (ja) 動画像の符号化装置および符号化方法
US5303060A (en) Apparatus for recording and/or reproducing HDTV signals
US6490321B1 (en) Apparatus and method of encoding/decoding moving picture using second encoder/decoder to transform predictive error signal for each field
US6904093B1 (en) Horizontal/vertical scanning frequency converting apparatus in MPEG decoding block
JP3946177B2 (ja) 動画像符号化装置及び復号装置
JP3552045B2 (ja) 画像信号記録媒体の記録方法、画像信号記録装置、および、画像信号再生装置
JPH0759092A (ja) 画像信号の伝送装置
JP2004088795A (ja) 画像信号生成装置および方法、並びに、画像信号再生装置および方法
JPH06311498A (ja) 画像信号の符号化・復号化装置
JP2000333203A (ja) 圧縮符号化方法、圧縮復号化方法、圧縮符号化装置及び圧縮復号化装置

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

AK Designated contracting states

Kind code of ref document: A2

Designated state(s): DE FR GB

AX Request for extension of the european patent

Free format text: AL;LT;LV;MK;RO;SI

PUAL Search report despatched

Free format text: ORIGINAL CODE: 0009013

AK Designated contracting states

Kind code of ref document: A3

Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LI LU MC NL PT SE

AX Request for extension of the european patent

Free format text: AL;LT;LV;MK;RO;SI

RIC1 Information provided on ipc code assigned before grant

Free format text: 6H 04N 7/26 A, 6H 04N 5/44 B

17P Request for examination filed

Effective date: 19990825

AKX Designation fees paid

Free format text: DE FR GB

17Q First examination report despatched

Effective date: 20020809

GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

GRAS Grant fee paid

Free format text: ORIGINAL CODE: EPIDOSNIGR3

GRAA (expected) grant

Free format text: ORIGINAL CODE: 0009210

AK Designated contracting states

Kind code of ref document: B1

Designated state(s): DE FR GB

REG Reference to a national code

Ref country code: GB

Ref legal event code: FG4D

REF Corresponds to:

Ref document number: 69824486

Country of ref document: DE

Date of ref document: 20040722

Kind code of ref document: P

ET Fr: translation filed
PLAQ Examination of admissibility of opposition: information related to despatch of communication + time limit deleted

Free format text: ORIGINAL CODE: EPIDOSDOPE2

PLBI Opposition filed

Free format text: ORIGINAL CODE: 0009260

PLBQ Unpublished change to opponent data

Free format text: ORIGINAL CODE: EPIDOS OPPO

26 Opposition filed

Opponent name: INTERESSENGEMEINSCHAFT FUER RUNDFUNKSCHUTZRECHTE E.

Effective date: 20050316

PLAQ Examination of admissibility of opposition: information related to despatch of communication + time limit deleted

Free format text: ORIGINAL CODE: EPIDOSDOPE2

PLAR Examination of admissibility of opposition: information related to receipt of reply deleted

Free format text: ORIGINAL CODE: EPIDOSDOPE4

PLAX Notice of opposition and request to file observation + time limit sent

Free format text: ORIGINAL CODE: EPIDOSNOBS2

PLBQ Unpublished change to opponent data

Free format text: ORIGINAL CODE: EPIDOS OPPO

PLAB Opposition data, opponent's data or that of the opponent's representative modified

Free format text: ORIGINAL CODE: 0009299OPPO

R26 Opposition filed (corrected)

Opponent name: INTERESSENGEMEINSCHAFT FUER RUNDFUNKSCHUTZRECHTE E.

Effective date: 20050316

PLBB Reply of patent proprietor to notice(s) of opposition received

Free format text: ORIGINAL CODE: EPIDOSNOBS3

PLCK Communication despatched that opposition was rejected

Free format text: ORIGINAL CODE: EPIDOSNREJ1

PLBN Opposition rejected

Free format text: ORIGINAL CODE: 0009273

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: OPPOSITION REJECTED

27O Opposition rejected

Effective date: 20070322

REG Reference to a national code

Ref country code: DE

Ref legal event code: R082

Ref document number: 69824486

Country of ref document: DE

Representative=s name: TBK, DE

REG Reference to a national code

Ref country code: DE

Ref legal event code: R082

Ref document number: 69824486

Country of ref document: DE

Representative=s name: TBK, DE

Effective date: 20120430

Ref country code: DE

Ref legal event code: R081

Ref document number: 69824486

Country of ref document: DE

Owner name: RAKUTEN, INC., JP

Free format text: FORMER OWNER: VICTOR COMPANY OF JAPAN, LTD., YOKOHAMA, KANAGAWA, JP

Effective date: 20120430

Ref country code: DE

Ref legal event code: R081

Ref document number: 69824486

Country of ref document: DE

Owner name: JVC KENWOOD CORPORATION, YOKOHAMA-SHI, JP

Free format text: FORMER OWNER: VICTOR COMPANY OF JAPAN, LTD., YOKOHAMA, KANAGAWA, JP

Effective date: 20120430

Ref country code: DE

Ref legal event code: R081

Ref document number: 69824486

Country of ref document: DE

Owner name: JVC KENWOOD CORPORATION, JP

Free format text: FORMER OWNER: VICTOR COMPANY OF JAPAN, LTD., YOKOHAMA, JP

Effective date: 20120430

REG Reference to a national code

Ref country code: FR

Ref legal event code: TP

Owner name: JVC KENWOOD CORPORATION, JP

Effective date: 20120705

REG Reference to a national code

Ref country code: FR

Ref legal event code: PLFP

Year of fee payment: 18

REG Reference to a national code

Ref country code: DE

Ref legal event code: R082

Ref document number: 69824486

Country of ref document: DE

Representative=s name: TBK, DE

Ref country code: DE

Ref legal event code: R081

Ref document number: 69824486

Country of ref document: DE

Owner name: RAKUTEN, INC., JP

Free format text: FORMER OWNER: JVC KENWOOD CORPORATION, YOKOHAMA-SHI, KANAGAWA, JP

REG Reference to a national code

Ref country code: GB

Ref legal event code: 732E

Free format text: REGISTERED BETWEEN 20160114 AND 20160120

REG Reference to a national code

Ref country code: FR

Ref legal event code: PLFP

Year of fee payment: 19

REG Reference to a national code

Ref country code: FR

Ref legal event code: TP

Owner name: JVC KENWOOD CORPORATION, JP

Effective date: 20160226

REG Reference to a national code

Ref country code: FR

Ref legal event code: PLFP

Year of fee payment: 20

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: FR

Payment date: 20170313

Year of fee payment: 20

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: GB

Payment date: 20170329

Year of fee payment: 20

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: DE

Payment date: 20170329

Year of fee payment: 20

REG Reference to a national code

Ref country code: DE

Ref legal event code: R071

Ref document number: 69824486

Country of ref document: DE

REG Reference to a national code

Ref country code: GB

Ref legal event code: PE20

Expiry date: 20180401

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: GB

Free format text: LAPSE BECAUSE OF EXPIRATION OF PROTECTION

Effective date: 20180401