GB2268659A - Interlace to non-interlace conversion for electron beam film recording - Google Patents
- Publication number: GB2268659A
- Application number: GB9214499A
- Authority
- GB
- United Kingdom
- Legal status
- Granted
Classifications
- H04N7/014 — Standards conversion at pixel level involving interpolation processes using motion vectors
- H04N5/91 — Television signal recording; signal processing therefor
- H04N7/0112 — Standards conversion where one of the standards corresponds to a cinematograph film standard
- H04N7/012 — Conversion between an interlaced and a progressive signal
- H04N5/87 — Producing a motion picture film from a television signal
Abstract
A method of producing an output 1:1 format video signal from an input 2:1 interlace format video signal comprises estimating the degree of motion in areas of the image between fields of the input signal, and producing output frames at temporal positions offset from the input field positions, output frame pixels each being produced from one or a combination of: (i) a corresponding pixel obtained by one-dimensional interframe interpolation between frames of the input signal, or the corresponding pixel in a temporally adjacent field of the input signal; and (ii) a corresponding pixel obtained by two-dimensional vertical/temporal interpolation between fields of the input signal in dependence upon the estimated degree of motion corresponding to that output frame pixel. The output signal can be used to drive an electron beam recorder or the like for recording the image content thereof on film.
Description
VIDEO TO FILM CONVERSION
This invention relates to video to film conversion.
For some years now it has been possible to transfer video to film using an electron beam recording (EBR) system. When converting HDVS 60
Hz (60 fields/s) 2:1 interlace format video to 24 Hz (24 frames/s) film using this system, a video signal corresponding to 24 frames/s is produced from the 60 Hz video signal, and the 24 Hz video signal is used to drive an electron beam recorder for recording the 24 Hz frames on film. The frame rate conversion is achieved in this system by a "drop field" process which is illustrated in Figure 1 of the accompanying drawings.
Figure 1 illustrates the temporal relationship between four 24 Hz film frames, A, B, C and D, and ten 60 Hz video fields, numbered 1 to 10, from which the four film frames are produced. As indicated by the arrows in Figure 1, video field pairs 1, 2 and 3, 4 are combined to produce frames A and B respectively, and video field 5, an odd field, is discarded. This process is then repeated with video fields 6 and 7 being combined to produce frame C, and fields 8 and 9 being combined to produce frame D so that video field 10, an even field, is discarded, thus producing a complete cycle.
It will be seen that, in frames C and D, the timing of the video fields is essentially reversed as compared with frames A and B due to the video requirement to always alternate odd and even fields.
However, because the fields of each pair are written to film and displayed at the same time, this temporal reversal makes no difference in the final product (though it can cause problems in monitoring of the 24 Hz video prior to recording on film).
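The drop field cadence described above can be sketched in code. This is an illustrative helper only (the function name and representation are my own, not from the patent): ten 60 Hz fields map to four 24 Hz film frames, and every fifth field is discarded.

```python
def drop_field_cadence(fields):
    """fields: list of 10 field identifiers (fields 1..10 of one cycle).
    Returns the 4 (field, field) pairs written to film frames A..D."""
    assert len(fields) == 10
    return [
        (fields[0], fields[1]),  # frame A <- fields 1, 2
        (fields[2], fields[3]),  # frame B <- fields 3, 4
        # field 5 (odd) is dropped
        (fields[5], fields[6]),  # frame C <- fields 6, 7
        (fields[7], fields[8]),  # frame D <- fields 8, 9
        # field 10 (even) is dropped
    ]

print(drop_field_cadence(list(range(1, 11))))
# -> [(1, 2), (3, 4), (6, 7), (8, 9)]; fields 5 and 10 never reach film
```

The periodic loss of fields 5 and 10 is exactly the source of the 12 Hz judder component discussed below.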
The drop field process shown in Figure 1 is a very crude form of standards conversion which introduces a 12 Hz judder component due to the loss of every fifth field of the original 60 Hz video. There is also a reduction in dynamic resolution since temporally offset fields are combined to produce each frame of the film so that the outline of moving objects can appear blurred. Thus, while the drop field system offers picture quality which is adequate for the majority of the time, motion rendition is poor due to the 12 Hz judder in addition to the 24
Hz strobing introduced by the film projector, and there is reduced dynamic resolution.
Another known technique used in video standards conversion is that of two-dimensional vertical/temporal interpolation. This technique is illustrated schematically in Figure 2 for the case of a 625/50/2:1 (625 lines/frame, 50 Hz, 2:1 interlace) to 525/60/2:1 video standards conversion.
Figure 2 shows pixels of the 625/50 format in the vertical/temporal (V/T) plane such that successive vertical columns of pixels along the time axis represent pixels corresponding to a given column of the display in alternate odd and even fields. Pixels of the 525/60 format which are to be produced are also shown schematically. As indicated by the broken lines in the figure, each required output pixel can be produced from the four input pixels which surround the output pixel position in the vertical/temporal plane. This is achieved by multiplying the value of each of the four input pixels by a respective weighting factor and summing the weighted values to produce the output pixel value. (A larger pixel array could be used to produce each output pixel to improve the interpolation characteristic. For example, the sixteen input pixels bounded by the solid line in Figure 2 could be used.) Each weighting factor depends on the vertical and temporal displacement of the corresponding input pixel from the output pixel position, so that pixels which are vertically or temporally closer to the output pixel position contribute more to the output pixel than those which are further away. Such two-dimensional vertical/temporal interpolation can be carried out using a two-dimensional digital filter the coefficients of which correspond to the required weighting factors (and therefore vary with output pixel position as previously described).
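The four-point weighted sum can be sketched as follows. The bilinear weighting (each weight the product of one vertical and one temporal factor) is an illustrative assumption; a real converter would use designed filter coefficients, possibly over the larger 16-sample aperture.

```python
def vt_interpolate(p, dv, dt):
    """Two-dimensional vertical/temporal interpolation from 4 neighbours.
    p: 2x2 nested list [[upper_prev, upper_next], [lower_prev, lower_next]]
       of the input pixels surrounding the output position;
    dv, dt: fractional vertical/temporal offsets (0..1) of the output
       pixel from the upper/previous input pixel."""
    # Closer input pixels (smaller offset) get larger weights; the four
    # weights always sum to one, preserving dc gain.
    w = [
        [(1 - dv) * (1 - dt), (1 - dv) * dt],  # weights for upper pixels
        [dv * (1 - dt),       dv * dt],        # weights for lower pixels
    ]
    return sum(w[i][j] * p[i][j] for i in range(2) for j in range(2))

# An output pixel midway both vertically and temporally is the plain
# average of its four neighbours:
print(vt_interpolate([[100, 100], [200, 200]], 0.5, 0.5))  # 150.0
```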
While the two-dimensional vertical/temporal interpolation described above is well known in video standards conversion, such a linear interpolation technique cannot produce "ideal" results because of the vertical and temporal alias which is present in the input video signal. Thus, the linear interpolation still introduces undesirable artifacts in the resulting picture such as temporal judder and loss of spatial resolution.
In order to overcome the above problems, an elaborate conversion system has been developed involving "motion compensated temporal interpolation". To convert 60 Hz 2:1 format video to 24 Hz film, the process involves: (i) producing progressive scan frames at 60 frames/s from one or from three of the input 60 fields/s fields depending upon motion in the input fields; (ii) detecting motion in areas of the image between pairs of temporally adjacent progressive scan frames; (iii) producing output frames at 24 frames/s, four for every ten progressive scan frames, with each pixel of each output frame derived from pixels in a respective pair of the progressive scan frames spatially displaced from the output pixel in dependence upon the detected motion and the temporal misalignment between the output frame and the pair of frames from which it is formed; and (iv) recording the output frames on film.
Thus, the motion in the image is analysed and interpolation of each pixel is performed along the direction of motion.
Such a system for converting 60 Hz 2:1 interlace format video to 24 Hz film is described in detail in GB2231228A. The system gives output pictures of extremely high quality in which dynamic resolution is preserved and motion is very smooth. However, the process can be labour intensive, and the equipment is complex, bulky and expensive.
It will be seen from the above that there is a need for a conversion process which provides high quality pictures but which involves only modest processing complexity.
According to one aspect of the present invention there is provided a method of producing an output 1:1 format video signal from an input 2:1 interlace format video signal, the method comprising estimating the degree of motion in areas of the image between fields of the input signal, and producing output frames at temporal positions offset from the input field positions, output frame pixels each being produced from one or a combination of:
(i) a corresponding pixel obtained by one-dimensional inter-frame interpolation between frames of the input signal, or the corresponding pixel in a temporally adjacent field of the input signal; and
(ii) a corresponding pixel obtained by two-dimensional vertical/temporal interpolation between fields of the input signal in dependence upon the estimated degree of motion corresponding to that output frame pixel.
Thus, a motion adaptive combination of two different methods of producing output pixels is used to produce pixels of output frames at temporal sites displaced from the input field positions. Because the output frames are temporally offset from the input fields, good results can be achieved by using one or a combination of the same two methods to produce all output pixels. Particularly good results are obtained when all output frames are offset from their temporally closest input fields by the same amount since fluctuations in picture quality between output frames due to differences in temporal offsets are thereby avoided. Thus, in the case of converting a 60 Hz 2:1 interlace format signal to an output 24 Hz 1:1 format signal for recording on film, it is preferred that the output frames are produced at respective temporal positions offset by one quarter of the input field period from the temporally nearest input field.
The relative contributions to each said output pixel of the two methods of producing an output pixel (ie one-dimensional inter-frame interpolation and two-dimensional vertical/temporal interpolation, or copying of the corresponding pixel in a temporally adjacent input field and two-dimensional vertical/temporal interpolation) depend on the estimated degree of motion in the area of the image corresponding to that output pixel. For totally static picture areas, producing output pixels solely by one-dimensional inter-frame interpolation, or copying of the corresponding pixel in a temporally adjacent input field, preferably the closest input field ("nearest field replacement"), gives the best results, avoiding the reduction in vertical resolution introduced by two-dimensional vertical/temporal interpolation.
Conversely, for areas of extreme motion, producing output pixels solely by two-dimensional vertical/temporal interpolation provides better motion portrayal. For motion between these two extremes, a weighted combination of pixels obtained by both methods may be used to produce the output pixel, the weighting depending upon the estimated degree of motion. For example, the value of each said output pixel may be given by (1-K)p1 + Kp2, or (1-K)p0 + Kp2, where p1 is the value of the corresponding pixel obtained by one-dimensional inter-frame interpolation, p2 is the value of the corresponding pixel obtained by two-dimensional vertical/temporal interpolation, p0 is the value of the corresponding pixel in a temporally adjacent input frame, and K ranges from 0, when the estimated degree of motion is zero, to 1 when the estimated degree of motion is above a predetermined level.
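The motion-adaptive blend above is a one-line computation. A minimal sketch (function name is illustrative):

```python
def blend(p1, p2, k):
    """Motion-adaptive output pixel value (1-K)p1 + Kp2.
    p1: pixel from inter-frame interpolation (or nearest field
        replacement); p2: pixel from vertical/temporal interpolation;
    k: motion estimate in [0, 1] for this picture area."""
    return (1 - k) * p1 + k * p2

print(blend(120, 80, 0.0))   # static area: 120 (purely inter-frame)
print(blend(120, 80, 1.0))   # extreme motion: 80 (purely V/T)
print(blend(120, 80, 0.25))  # intermediate motion: 110.0
```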
Although nearest field replacement may be used as an alternative to one-dimensional inter-frame interpolation, the latter is preferred since interpolation reduces noise in the input. Thus, it is preferred that each said output frame pixel is produced from one or a combination of:
(i) a corresponding pixel obtained by one-dimensional inter-frame interpolation between frames of the input signal; and
(ii) a corresponding pixel obtained by two-dimensional vertical/temporal interpolation between fields of the input signal.
The one-dimensional inter-frame interpolation may be effected simply by averaging the values of input frame pixels corresponding to the desired output pixel position in input frames temporally adjacent to the output frame position. However, it is preferred that the input pixel values are weighted in dependence upon their temporal offset from the output pixel position.
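The weighted variant of this inter-frame interpolation can be sketched as follows. The linear weighting by temporal offset is one plausible choice for illustration, not a scheme the patent prescribes:

```python
def interframe_interpolate(prev_pixel, next_pixel, alpha):
    """One-dimensional inter-frame interpolation.
    alpha: fractional temporal position of the output frame between the
    previous input frame (alpha=0) and the next (alpha=1).  alpha=0.5
    reduces to the plain average mentioned in the text."""
    return (1 - alpha) * prev_pixel + alpha * next_pixel

print(interframe_interpolate(100, 200, 0.5))   # plain average: 150.0
print(interframe_interpolate(100, 200, 0.25))  # offset-weighted: 125.0
```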
Use of the method described above in converting video to film is particularly advantageous when the resulting film is to be displayed as a moving picture. However, when, for example, publicity shots, or "stills", are to be produced from film, fluctuations in picture quality from frame to frame, and smooth motion portrayal, are not of particular importance since each frame will be used in isolation. In this case, the interpolation method used to produce each frame can be selected to give good results for that frame without undue regard to whether the method used to produce the next frame will give good motion rendition or consistent picture quality.
Thus, in accordance with another aspect of the invention there is provided a method of producing an output 1:1 format video signal from an input 2:1 interlace format video signal, the method comprising:
estimating the degree of motion in areas of the image between fields of the input signal;
producing some output frames at temporal positions coincident with respective input field positions, pixels of such an output frame which correspond to missing pixels in the temporally coincident input field each being produced from one or a combination of a corresponding pixel obtained by intra-field interpolation of the temporally coincident input field and a corresponding pixel obtained by one-dimensional inter-frame interpolation between frames of the input signal, in dependence upon the estimated degree of motion corresponding to the output frame pixel; and
producing other output frames at temporal positions offset from the input field positions by two-dimensional vertical/temporal interpolation between fields of the input signal.
In the case of converting an input 60 Hz 2:1 interlace format signal to a 24 Hz 1:1 format signal for recording on film, for example, alternate output frames can be produced at temporal positions coincident with respective input field positions, and the remaining output frames produced at temporal positions midway between temporally adjacent input fields. For the output frames which are offset from the input field positions, good results are achieved using two-dimensional vertical/temporal interpolation. However, for output frames which are coincident with input field positions, producing pixels corresponding to missing lines of the input fields by a motion adaptive combination of intra-field and one-dimensional inter-frame interpolation gives better results than two-dimensional vertical/temporal interpolation. For lines of pixels in these output frames which correspond to lines of the coincident input fields the input field pixels can be used directly.
However, some pre-filtering of the input field pixels can be performed if desired.
For frames which are coincident with input fields, the proportional contribution to an output pixel by each type of interpolation depends on the estimated degree of motion. For wholly static picture areas, producing the output pixels solely by one-dimensional inter-frame interpolation is preferred to retain as much vertical information as possible. Conversely, for areas of extreme motion, producing output pixels solely by intra-field interpolation of the temporally coincident input field gives better results. A combination of the two types of interpolation may be used for motion between these two extremes. For example, the value of an output pixel in such a frame which corresponds to a missing pixel in the coincident input field may be given by (1-K)p1 + Kp3, where p3 is the value of a corresponding pixel produced by intra-field interpolation, and K and p1 are as described above.
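This coincident-frame blend can be sketched as below. The simple two-tap averages stand in for the inter-frame and intra-field interpolators (larger filter apertures are equally possible), and all names are illustrative:

```python
def coincident_missing_pixel(above, below, prev_frame, next_frame, k):
    """Output pixel for a missing line in a frame coincident with an
    input field.
    above, below: pixels on the lines above/below in the coincident field;
    prev_frame, next_frame: pixels at this position in the adjacent frames;
    k: motion estimate in [0, 1]."""
    p1 = (prev_frame + next_frame) / 2   # one-dimensional inter-frame
    p3 = (above + below) / 2             # intra-field (vertical average)
    return (1 - k) * p1 + k * p3

# Static area keeps full vertical detail from the neighbouring frames;
# a fast-moving area falls back on the current field alone.
print(coincident_missing_pixel(90, 110, 100, 104, 0.0))  # 102.0
print(coincident_missing_pixel(90, 110, 100, 104, 1.0))  # 100.0
```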
It will be appreciated that the invention extends to apparatus arranged to perform a method as hereinbefore described. In general, where features have been described with reference to a method of the invention, corresponding features may be provided in accordance with an apparatus of the invention and vice versa.
Embodiments of the invention will now be described, by way of example, with reference to the accompanying drawings in which:
Figure 1 is a schematic illustration of the known drop field conversion process for converting 60 Hz 2:1 interlace format video to 24 Hz film;
Figure 2 is a schematic illustration of two-dimensional vertical/temporal interpolation in a 625/50/2:1 to 525/60/2:1 video standards conversion;
Figure 3 illustrates the vertical/temporal relationship between pixels of 60 Hz 2:1 interlace format video and the required pixels of 24 Hz 1:1 format video in converting 60 Hz video to 24 Hz film by a method embodying the invention;
Figure 4 is a diagram illustrating part of the motion estimation process in a conversion method embodying the invention;
Figure 5 shows a normalising function used in the motion estimation process of Figure 4;
Figure 6 shows a non-linear function used in the motion estimation process of Figure 4;
Figure 7 is a schematic block diagram of apparatus embodying the invention for use in converting 60 Hz 2:1 interlace format video to 24
Hz film;
Figure 8a illustrates schematically the one-dimensional inter-frame interpolation performed by part of the apparatus of Figure 7;
Figure 8b illustrates schematically the intra-field interpolation performed by another part of the apparatus of Figure 7;
Figure 8c illustrates schematically the two-dimensional vertical/temporal interpolation performed by a further part of the apparatus of Figure 7;
Figure 9 illustrates the vertical/temporal relationship between pixels of 60 Hz 2:1 interlace format video and the required pixels of 24 Hz 1:1 format video in converting 60 Hz video to 24 Hz film by another method embodying the invention;
Figure 10 is a schematic block diagram of another embodiment of apparatus for use in converting 60 Hz 2:1 interlace format video to 24 Hz film;
Figure 11a illustrates schematically the one-dimensional inter-frame interpolation performed by part of the apparatus of Figure 10; and
Figure 11b illustrates schematically the two-dimensional vertical/temporal interpolation performed by another part of the apparatus of Figure 10.
The upper section of Figure 3 shows the temporal relationship between four 24 Hz film frames, A, B, C and D, and ten fields of a 60
Hz 2:1 interlace format video signal from which the four film frames are to be produced. The video fields, which are alternately odd (0) and even (E), are numbered 1 to 10 in the figure. The middle section of Figure 3 shows pixels of the fields 1 to 10 of the 60 Hz video in the vertical/temporal (V/T) plane. Similarly, the lower section of
Figure 3 shows the required pixels, in the vertical/temporal plane, of the 24 Hz 1:1 format video which is to be written to film to form the film frames A, B, C and D.
As can be seen from Figure 3, alternate film frames A, C etc "line up" with respective 60 Hz video fields. That is to say, alternate film frames are temporally coincident with respective input fields, the temporal position of frame A corresponding to that of video field 1, the temporal position of frame C corresponding to that of field 6, etc. Alternate lines of pixels in these output frames correspond to lines of the coincident input fields, so the input field pixels can be used directly as these output frame pixels. The other lines of pixels in these output frames correspond to missing lines of the coincident input fields, so that interpolation is required to produce these output pixels. The remaining film frames B, D etc have temporal positions mid-way between video fields 3 and 4, and fields 8 and 9, etc.
Figure 7 shows an embodiment of apparatus used to generate output pixels in accordance with the temporal configuration shown in Figure 3 from an input 60 Hz 2:1 video signal. This temporal configuration, and the conversion method carried out by the apparatus of Figure 7, is particularly advantageous when the resulting 24 Hz film is to be used to produce an image sequence from which stills are to be selected. The apparatus of Figure 7 comprises a motion estimator 1, a one-dimensional inter-frame interpolator 2, an intra-field interpolator 3, a two-dimensional vertical/temporal interpolator 4, two multipliers 5a and 5b, an adder 6, and two selectors 8 and 9, all connected as shown. The selectors 8 and 9 and the interpolator 4 operate under the control of a controller 7. Control links (indicated by the broken lines in the figure) may also be provided between the controller 7 and the interpolators 2 and 3, though this is not essential, at least for the conversion to be described, as explained below.
Since moving picture quality is not especially important for this application, each output frame is produced so as to give good results for that frame without undue regard to picture quality fluctuations or motion portrayal from frame to frame. Those output frames which are midway between input field positions, as shown in Figure 3, are generated by two-dimensional vertical/temporal interpolation performed by the interpolator 4 in Figure 7. Alternate lines of pixels of those output frames which are coincident with input fields, as shown in
Figure 3, are produced by a motion-adaptive combination of one-dimensional inter-frame interpolation, performed by the interpolator 2 in Figure 7, and intra-field interpolation performed by the interpolator 3, the input field lines being used directly as the other lines of these output frames as indicated above.
The proportional contributions to an output pixel by one-dimensional inter-frame interpolation and intra-field interpolation depend upon the estimated degree of motion in the area of the image corresponding to the output pixel to be produced. The concept is to use inter-frame interpolation in wholly static picture areas to retain as much vertical information as possible, and to use intra-field interpolation when significant motion is present. In between these two extremes, a weighted combination of the two types of interpolation is used, the weighting depending upon the estimated degree of motion. The degree of motion is determined by the motion estimator 1 as will now be described with reference to Figures 4 to 6.
The input 60 Hz 2:1 interlace format signal is supplied to the motion estimator 1. For each input field, an array of motion estimates
K is generated, the motion estimates K corresponding to respective pixel positions in the missing lines of the field. As illustrated in
Figure 4, the modulus of the inter-frame difference between the previous and next fields is first generated. To generate the required estimates, the modulus inter-frame difference array is generated at each point:

ΔU(pixel, current line, current field) = |Y(pixel, current line, next field) − Y(pixel, current line, previous field)|

where: ΔU is the unnormalized modulus difference array, and
Y is the luminance array corresponding to the 3D picture.
The modulus of difference is then normalized to adjust for the significance of changes in lower luminance areas:

ΔN(pixel, current line, current field) = F(Ȳ(pixel, current line)) * ΔU(pixel, current line, current field)

where: ΔN is the normalized modulus difference array, and
Ȳ is the inter-frame average luminance value:

Ȳ(pixel, current line) = (Y(pixel, current line, previous field) + Y(pixel, current line, next field))/2

F(Ȳ) (the normalizing function) is derived, for example, as indicated in Figure 5, from which it can be seen that luminance differences in light areas (above the breakpoint of, for example, 50) are scaled down relative to luminance differences in dark areas (below the breakpoint), thus normalising the effect over the total range of luminance.
The difference array ΔN is then vertically filtered together with the previous field difference by a three-tap filter (examples of coefficients are: a quarter, a half, a quarter; or zero, unity, zero) to reduce vertical alias problems, and in particular to minimize the problems encountered with temporal alias. Thus:

ΔF(pixel, current line, current field) = ΔN(pixel, current line−1, previous field)*C1 + ΔN(pixel, current line, current field)*C2 + ΔN(pixel, current line+1, previous field)*C1

where: ΔF is the filtered normalized difference array, and
C1 and C2 are filter coefficients, and 2C1+C2=1 so that unity dc gain is maintained.
A vertical and horizontal intra-field filter of up to five taps by fifteen taps is then used to smooth the difference values within the current field. In practice, a filter of three taps by three taps is satisfactory. Finally, in order to produce the actual motion estimates, a non-linear mapping function g is applied to provide the motion estimates (K):

K(pixel, current line) = g(spatially filtered ΔF(pixel, current line))

The non-linear function g is derived as shown in Figure 6. For a static picture (ΔF below breakpoint 1 is treated as static) K is zero, for full motion (ΔF above breakpoint 2) K is one, and for intermediate motion a controlled transition occurs.
Thus, the motion estimator 1 generates an array of motion estimates for each input field, each individual motion estimate K corresponding to the position of a missing pixel in that field as shown in Figure 4.
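The motion-estimation pipeline of Figures 4 to 6 can be sketched end to end as below. All parameter values (luminance breakpoint, filter taps, non-linear breakpoints) are assumptions chosen from the examples in the text, the normalizing function F and the edge handling are simplified, and the names are mine, not the patent's:

```python
import numpy as np

def motion_estimates(y_prev, y_next, dn_prev, c1=0.25, c2=0.5,
                     lum_break=50, bp1=4.0, bp2=32.0):
    """y_prev, y_next: luminance arrays for the previous and next fields;
    dn_prev: normalized difference array ΔN from the previous field.
    Returns (K, dn): the motion-estimate array and this field's ΔN."""
    # 1. Modulus inter-frame difference ΔU.
    du = np.abs(y_next - y_prev)
    # 2. Normalize: scale down differences in light areas (inter-frame
    #    average luminance above the breakpoint of 50).
    y_avg = (y_prev + y_next) / 2
    f = np.where(y_avg > lum_break, lum_break / np.maximum(y_avg, 1), 1.0)
    dn = f * du
    # 3. Three-tap vertical filter with the previous field difference
    #    (coefficients C1, C2, C1 with 2*C1 + C2 = 1; wrap-around edges
    #    via np.roll are a simplification).
    df = (c1 * np.roll(dn_prev, 1, axis=0) + c2 * dn
          + c1 * np.roll(dn_prev, -1, axis=0))
    # 4. Intra-field spatial smoothing (3x3 box filter, edge-replicated).
    pad = np.pad(df, 1, mode='edge')
    smoothed = sum(pad[i:i + df.shape[0], j:j + df.shape[1]] / 9.0
                   for i in range(3) for j in range(3))
    # 5. Non-linear mapping g: K = 0 below breakpoint 1, K = 1 above
    #    breakpoint 2, controlled (here linear) transition in between.
    k = np.clip((smoothed - bp1) / (bp2 - bp1), 0.0, 1.0)
    return k, dn
```

For identical previous and next fields every estimate is 0 (static); for a full black-to-white change every estimate saturates at 1.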
Referring again to Figure 7, as previously described, pixels in alternate lines of output frames which are coincident with respective input fields (ie pixels corresponding to missing pixels in the coincident input field) are produced by a combination of one-dimensional inter-frame interpolation and intra-field interpolation, performed by the interpolators 2 and 3 respectively, in dependence upon the estimated degree of motion corresponding to the output pixel to be produced. The value of K for such an output pixel is taken as the K value corresponding to the output pixel position in the temporally coincident input field.
The fields of the 60 Hz video signal applied at the input to the apparatus of Figure 7 are supplied to the selector 8, the interpolators 2, 3 and 4 and the motion estimator 1 which outputs the motion estimates K derived as previously described. The selector 8, under control of the controller 7, outputs frames corresponding to input field positions which are supplied to one input of the selector 9. The interpolator 4 produces frames corresponding to positions mid-way between input fields and supplies these to the other input of the selector 9. The controller 7 controls the selector 9 to output alternately frames supplied by the selector 8 and the interpolator 4.
The controller 7 receives a SYNC input synchronised with the input 60
Hz fields for controlling the timing of operation of the selectors 8 and 9 and the interpolation position in the interpolator 4. The output of the selector 9 is a 24 Hz 1:1 format video signal.
The operation of the interpolators 2, 3 and 4 will now be described in more detail.
Figure 8a illustrates the process of one-dimensional inter-frame interpolation performed by the interpolator 2. Figure 8a shows an array of input pixels in the vertical/temporal plane and one column of the pixels required for an output 24 Hz frame to be produced at a temporal position coincident with an input field. Each required output pixel is interpolated between the two pixels in the corresponding position in the previous and next fields as indicated by the arrows in
Figure 8a. The interpolated pixel value is obtained by summing half the values of the two pixels in the previous and next fields. The output pixels can be produced continuously in this way using known digital filter techniques.
Thus, the interpolator 2 generates pixels, by one-dimensional inter-frame interpolation, corresponding to the required output pixels of frames which are coincident with input fields. Since the interpolation ratio used to generate output pixels from input pixels is ½:½ for all pixels produced by the interpolator 2, control of this interpolator by the controller 7 is not strictly required although a control link may be provided to allow flexibility of operation.
Similarly, the intra-field interpolator 3 receives the input 60
Hz fields and generates pixels corresponding to the required output pixels by intra-field interpolation for output frames coincident with input field positions. Intra-field interpolation involves combining proportions of the pixels surrounding the output pixel positions, again by means of a digital filter, to produce interpolated pixels. For example, Figure 8b shows pixels in a single input field, i.e. in the vertical/horizontal (V/H) plane, together with one column of the required output pixels for the frame coincident with that input field.
The required output pixels can be interpolated by combining, in the ratio ½:½, the input pixels immediately above and below the output pixel position. Again, since the interpolation ratio is consistently ½:½, a control link with the controller 7 is not strictly required although one may be provided if desired.
Of course, it will be appreciated that larger sample arrays can be used to interpolate each output pixel than the two pixels used in the examples of Figure 8a and Figure 8b. This can be achieved simply by increasing the number of taps in the digital filter of the interpolator and adjusting the filter coefficients accordingly in dependence on the spatial or temporal offsets of the input pixels from the output pixel position.
To produce the final output pixels of alternate lines of frames which are coincident with input field positions (see Figure 3), the pixels output by the interpolators 2 and 3 corresponding to the desired output pixel are supplied to the multipliers 5a and 5b respectively.
The pixel values are then weighted in the ratio (1 - K) : K where K corresponds to the motion estimate for the required output pixel as supplied by the motion estimator 1. The outputs of the multipliers 5a and 5b are supplied to the adder 6 in which they are summed to produce the final output pixel. Since K varies between 0, for wholly static picture areas, and 1, for extreme motion, i.e. a degree of motion above a predetermined threshold, the final output pixels are produced solely by one-dimensional inter-frame interpolation in wholly static picture areas, and solely by intra-field interpolation in areas of extreme motion. Between these two extremes, a controlled transition occurs.
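The weighting and summing performed by the multipliers 5a, 5b and the adder 6 can be sketched as follows (Python; the function name and sample values are illustrative only):

```python
def motion_adaptive_pixel(p_inter: float, p_intra: float, k: float) -> float:
    """Blend the inter-frame and intra-field results in the ratio (1 - K):K.

    K = 0 (wholly static area)  -> inter-frame result only;
    K = 1 (extreme motion)      -> intra-field result only;
    intermediate K gives a controlled transition between the two.
    """
    assert 0.0 <= k <= 1.0
    return (1.0 - k) * p_inter + k * p_intra

print(motion_adaptive_pixel(110.0, 104.0, 0.0))   # static area: 110.0
print(motion_adaptive_pixel(110.0, 104.0, 1.0))   # extreme motion: 104.0
print(motion_adaptive_pixel(110.0, 104.0, 0.25))  # transition: 108.5
```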
The pixels output by the adder 6 are supplied to one input of the selector 8 which, at its other input, receives the pixels of the 60 Hz input fields. Under control of the controller 7, the selector 8 outputs alternately lines of input pixels and lines of interpolated pixels from the adder 6 to produce a complete frame which is supplied to an input of the selector 9.
Referring again to Figure 3, the motion adaptive combination of one-dimensional inter-frame interpolation and intra-field interpolation described above is used only in the production of output frames which are coincident with input field positions. However, as shown in Figure 3, alternate output frames are sited midway between two input fields.
These output frames are generated by two-dimensional vertical/temporal interpolation in the interpolator 4 in Figure 7. This is illustrated in Figure 8c where the relationship between one column of required output pixels and the input pixels is shown in the vertical/temporal (V/T) plane. Each output pixel is generated by combining proportions of the pixels, from three temporally successive fields, linked by the dashed lines surrounding the output pixel position in the figure. The proportions depend upon the vertical and temporal displacement of each input pixel from the output pixel position. Of course, a larger sample array than the four pixels shown in Figure 8c can be used to produce each output pixel. For example, a 16-sample array, spanning four temporally successive input fields (see Figure 2), could be used. The temporal positions in which the output pixels are to be interpolated, which determine the filter coefficients, are controlled in the interpolator 4 by the controller 7.
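A minimal sketch of two-dimensional vertical/temporal interpolation over a four-pixel aperture follows (Python). The inverse-distance weighting is purely illustrative; the patent specifies only that the proportions depend on the vertical and temporal displacements, leaving the actual coefficients to the filter design:

```python
def vt_interpolate(pixels, offsets):
    """Combine input pixels according to their (vertical, temporal)
    displacement from the output pixel position.

    `offsets` holds (dv, dt) pairs; nearer pixels receive larger weights
    (an assumed weighting law, not the patent's filter coefficients).
    """
    raw = [1.0 / (1.0 + abs(dv) + abs(dt)) for dv, dt in offsets]
    total = sum(raw)
    return sum(p * w / total for p, w in zip(pixels, raw))

# Four pixels placed symmetrically about the output position reduce, under
# this weighting, to a plain average of the four values:
print(vt_interpolate([100.0, 104.0, 96.0, 100.0],
                     [(-1, -0.5), (1, -0.5), (-1, 0.5), (1, 0.5)]))  # 100.0
```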
The output of the interpolator 4 is connected to the second input of the selector 9. The controller 7 controls the selector 9 to select alternately a frame supplied by the selector 8 and a frame supplied by the interpolator 4. Thus, 24 Hz 1:1 format output frames which are alternately coincident with input field positions, and midway between input field positions, are supplied to the output of the selector 9.
The output signal can be digital-to-analogue converted and the analogue signal used to drive a film recording device such as an electron beam recorder (not shown) for writing the 24 Hz signal to film.
The conversion method carried out by the apparatus of Figure 7 yields a rather indifferent moving picture quality due to the change in spatial resolution from frame to frame resulting from the different interpolation methods used to produce alternate frames. However, good picture quality is achieved for each individual frame, particularly those, half the total number, which correspond to input field positions, and since frames are to be selected from these to produce publicity stills, the fluctuations from frame to frame are not of significance. Good quality stills are thereby achieved by means of a practical apparatus using only relatively simple processing techniques.
Where the film is to be displayed as a motion picture, smooth motion portrayal and consistent picture quality from frame to frame are priority requirements. In this case, it is desirable for the output frames to be interpolated at positions each of which is displaced by an equal amount from the temporally closest input field. This can be achieved as shown in Figure 9, which corresponds generally to Figure 3 but in which the output frames have been shifted by one quarter of the input field period (f) along the temporal axis. As shown in Figure 9, the required output pixels are then each offset by ±f/4 from the temporally closest input field. For this configuration, good results are achieved using a motion adaptive combination of one-dimensional inter-frame interpolation and two-dimensional vertical/temporal interpolation to generate all output frames. Figure 10 shows an embodiment of apparatus for carrying out this process.
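The temporal geometry of Figure 9 can be checked numerically (a Python sketch; `f` is the 60 Hz input field period and the 24 Hz output frame period is 2.5 f):

```python
f = 1.0 / 60.0             # input field period (seconds)
frame_period = 1.0 / 24.0  # output frame period = 2.5 * f

# Output frames shifted by a quarter field period along the temporal axis:
# each frame lands a quarter field period from its temporally closest field,
# the sign of the offset alternating from frame to frame.
for n in range(4):
    t = n * frame_period + f / 4.0
    nearest_field = round(t / f) * f
    offset = (t - nearest_field) / f
    print(f"frame {n}: offset {offset:+.2f} f from the closest input field")
```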
The apparatus of Figure 10 comprises a motion estimator 1, a one-dimensional inter-frame interpolator 2, a two-dimensional vertical/temporal interpolator 4, two multipliers 5a and 5b, and an adder 6, all connected as shown in the figure. The interpolators 2 and 4 operate under the control of a controller 7.
The fields of the 60 Hz video signal applied at the input to the apparatus of Figure 10 are supplied to the motion estimator 1 and the interpolators 2 and 4. The interpolators 2 and 4 generate pixels by one-dimensional inter-frame interpolation and two-dimensional vertical/temporal interpolation respectively, the pixels corresponding to pixels of the required 24 Hz output frames. The timing of the operation of the interpolators 2, 4 relative to the input 60 Hz fields is controlled by the controller 7 which receives a SYNC input synchronised with the 60 Hz fields. Pixels produced by the interpolators 2 and 4 are weighted in dependence upon the estimated degree of motion for the corresponding output pixel, as determined by the motion estimator 1, and summed to produce the final output pixel.
The output of the apparatus is a 24 Hz 1:1 format signal.
The motion estimator 1 operates as hereinbefore described to derive an array of motion estimates for each input field, each individual motion estimate K corresponding to the position of a missing pixel in that field. The value of K for an output pixel is taken as the K value corresponding to the output pixel position in the temporally closest input field having a K value at the required position. (These values may be further smoothed, for example, by a filter with coefficients as previously described.) In this way, a motion estimate K corresponding to each output pixel is obtained. The motion estimates are used to combine proportions of pixels produced by the interpolators 2 and 4 as will be described below.
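A sketch of this selection step follows (Python; the data layout and function name are assumptions, and any smoothing filter is omitted):

```python
def k_for_output_pixel(k_arrays, field_times, t_out, pos):
    """Take K from the temporally closest input field that holds a motion
    estimate at the required pixel position.

    `k_arrays` maps each field time to a dict of position -> K; fields with
    no estimate at `pos` are skipped when choosing the closest field.
    """
    candidates = [t for t in field_times if pos in k_arrays[t]]
    nearest = min(candidates, key=lambda t: abs(t - t_out))
    return k_arrays[nearest][pos]

# Two fields at t = 0 and t = 1; the estimate at position (12, 7) exists
# only in the later field, so it is used even though the output pixel at
# t = 0.25 is temporally closer to the earlier field:
k_arrays = {0: {(12, 8): 0.2}, 1: {(12, 7): 0.6, (12, 8): 0.3}}
print(k_for_output_pixel(k_arrays, [0, 1], 0.25, (12, 7)))  # 0.6
```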
The operation of the interpolator 2 is illustrated in Figure 11a.
Figure 11a shows the input pixels in the vertical/temporal (V/T) plane together with the required output pixels offset by one quarter of the input field period in accordance with the configuration of Figure 9.
Each output pixel is interpolated between the temporally preceding input pixel corresponding to the output pixel position and the temporally next input pixel corresponding to the output pixel position, as shown by the arrows in Figure 11a. Thus, each pixel is interpolated between two temporally adjacent input frames. The value of each interpolated pixel is obtained by combining proportions of the previous and next input pixels in accordance with the temporal offset of the output pixel position relative to these pixels. Thus, alternate output pixels in the columns shown in Figure 11a are produced by summing 3/8 of the value of the temporally preceding input pixel and 5/8 of the value of the temporally next input pixel. The other output pixels are produced by summing 7/8 of the value of the temporally preceding input pixel and 1/8 of the value of the temporally next input pixel. The output pixels can be produced continuously in this way using known digital filter techniques. Of course, a larger sample array than the two pixels shown can be used to produce each output pixel, the weighting factors being adjusted accordingly. The appropriate interpolation ratios, which determine the coefficients of the digital filter in the interpolator 2, are controlled by the controller 7, which also provides timing control relative to the input 60 Hz fields.
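The interpolation ratios can be derived from the temporal geometry alone (a Python sketch; times are in units of the input field period, and the placement of the output frame a quarter field period after an input field is assumed):

```python
def interframe_weights(t_out, t_prev, t_next):
    """Linear weights for the temporally preceding and next input pixels."""
    w_next = (t_out - t_prev) / (t_next - t_prev)
    return 1.0 - w_next, w_next

# An output frame a quarter field period after the field at t = 1. Input
# pixels on a given line recur every two field periods, so alternate lines
# bracket the output pixel at t = 0, 2 and at t = 1, 3 respectively:
print(interframe_weights(1.25, 0.0, 2.0))  # (0.375, 0.625)
print(interframe_weights(1.25, 1.0, 3.0))  # (0.875, 0.125)
```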
Thus, the interpolator 2 generates pixels corresponding to the required output pixel positions by one-dimensional inter-frame interpolation and supplies these to the multiplier 5a. Similarly, the interpolator 4 generates pixels corresponding to the required output pixel positions by two-dimensional vertical/temporal interpolation at timings controlled by the controller 7. This is illustrated in the diagram of Figure 11b where, again, the required output pixels are shown in the vertical/temporal (V/T) plane together with the input pixels. Each output pixel is generated by weighting and summing the values of the four pixels, from three temporally successive fields, linked by the broken lines surrounding the output pixel position, though, as previously mentioned, a larger sample array, for example 16 samples spanning four successive fields, could be used to produce each output pixel. Again, the filter coefficients in the interpolator 4 are such as to combine proportions of input pixels in dependence upon the vertical and temporal displacement of each input pixel from the output pixel position as controlled by the controller 7.
The output of the interpolator 4 is supplied to the multiplier 5b. To generate each output pixel, the corresponding pixels produced by the interpolators 2 and 4 are multiplied by weighting factors (1 - K) and K in the multipliers 5a and 5b respectively, where K is the motion estimate for the output pixel as previously described. The weighted outputs of the multipliers are supplied to the adder 6 and summed to generate the final output pixel. Since K varies between 0, for zero motion, and 1, for extreme motion, i.e. a degree of motion above a predetermined threshold, the final output pixels are produced solely by one-dimensional inter-frame interpolation in wholly static picture areas, and solely by two-dimensional vertical/temporal interpolation in areas of extreme motion.
Thus, the output of the adder 6 is a 24 Hz 1:1 format signal, each output frame being temporally offset by one quarter of the input field period from the temporally closest input field. The 24 Hz output signal is then digital-to-analogue converted and the analogue signal used to drive an electron beam recorder (not shown) for writing to film.
The above method of converting 60 Hz video to 24 Hz film provides consistent picture quality from frame to frame, and smooth motion portrayal. Highly complex processing techniques are avoided, allowing relatively simple apparatus to be used.
It will of course be appreciated that the embodiments of Figures 7 and 10 could be combined to form a single system, switchable under control of the controller 7, for performing the operations of both of the individual systems described above on a selectable basis.
Claims (16)
1. A method of producing an output 1:1 format video signal from an input 2:1 interlace format video signal, the method comprising estimating the degree of motion in areas of the image between fields of the input signal, and producing output frames at temporal positions offset from the input field positions, output frame pixels each being produced from one or a combination of:
(i) a corresponding pixel obtained by one-dimensional inter-frame interpolation between frames of the input signal, or the corresponding pixel in a temporally adjacent field of the input signal; and
(ii) a corresponding pixel obtained by two-dimensional vertical/temporal interpolation between fields of the input signal,
in dependence upon the estimated degree of motion corresponding to that output frame pixel.
2. A method as claimed in claim 1, wherein the value of each said output pixel is given by (1-K)p1 + Kp2, or (1-K)p0 + Kp2, where p1 is the value of the corresponding pixel obtained by one-dimensional inter-frame interpolation, p2 is the value of the corresponding pixel obtained by two-dimensional vertical/temporal interpolation, p0 is the value of the corresponding pixel in the said temporally adjacent input field, and K ranges from 0, when the estimated degree of motion is zero, to 1 when the estimated degree of motion is above a predetermined level.
3. A method as claimed in claim 1, wherein each said output frame pixel is produced from one or a combination of:
(i) a corresponding pixel obtained by one-dimensional inter-frame interpolation between frames of the input signal; and
(ii) a corresponding pixel obtained by two-dimensional vertical/temporal interpolation between fields of the input signal.
4. A method as claimed in claim 3, wherein the value of each said output pixel is given by (1-K)p1 + Kp2, where p1 is the value of the corresponding pixel obtained by one-dimensional inter-frame interpolation, p2 is the value of the corresponding pixel obtained by two-dimensional vertical/temporal interpolation, and K ranges from 0, when the estimated degree of motion is zero, to 1 when the estimated degree of motion is above a predetermined level.
5. A method as claimed in any preceding claim, wherein the input signal is a 60 Hz 2:1 interlace format signal and the output signal is a 24 Hz 1:1 format signal.
6. A method as claimed in claim 5, wherein the output frames are produced at respective temporal positions offset by one quarter of the input field period from the temporally nearest input field.
7. A method of producing an output 1:1 format video signal from an input 2:1 interlace format video signal, the method comprising:
estimating the degree of motion in areas of the image between fields of the input signal;
producing some output frames at temporal positions coincident with respective input field positions, pixels of such an output frame which correspond to missing pixels in the temporally coincident input field each being produced from one or a combination of a corresponding pixel obtained by intra-field interpolation of the temporally coincident input field and a corresponding pixel obtained by one-dimensional inter-frame interpolation between frames of the input signal, in dependence upon the estimated degree of motion corresponding to the output frame pixel; and
producing other output frames at temporal positions offset from the input field positions by two-dimensional vertical/temporal interpolation between fields of the input signal.
8. A method as claimed in claim 7, wherein the value of each said pixel of an output frame which is temporally coincident with an input field is given by (1-K)p1 + Kp3, where p1 is the value of the corresponding pixel obtained by one-dimensional inter-frame interpolation, p3 is the value of the corresponding pixel obtained by intra-field interpolation of the temporally coincident input field, and K ranges from 0, when the estimated degree of motion is zero, to 1 when the estimated degree of motion is above a predetermined level.
9. A method as claimed in claim 7 or claim 8, wherein the input signal is a 60 Hz 2:1 interlace format signal and the output signal is a 24 Hz 1:1 format signal.
10. A method as claimed in claim 9, wherein alternate output frames are produced at temporal positions coincident with respective input field positions, and the remaining output frames are produced at temporal positions mid-way between temporally adjacent input fields.
11. A method as claimed in any preceding claim, wherein the two-dimensional vertical/temporal interpolation is between three temporally successive fields of the input signal.
12. A method as claimed in any preceding claim, wherein the two-dimensional vertical/temporal interpolation is between four temporally successive fields of the input signal.
13. A method as claimed in any preceding claim including digital-toanalogue converting the output video signal and supplying the analogue signal to a film recording device for recording the image content thereof on film.
13. A method as claimed in any preceding claim including digital-to-analogue converting the output video signal and supplying the analogue signal to a film recording device for recording the image content thereof on film.
15. A method of producing an output 1:1 format video signal from an input 2:1 interlace format video signal substantially as hereinbefore described with reference to Figures 4 to 6, 9, 10, 11a and 11b of the accompanying drawings.
16. Apparatus arranged to perform the method of any one of the preceding claims.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GB9214499A GB2268659B (en) | 1992-07-08 | 1992-07-08 | Video to film conversion |
JP5168981A JPH077660A (en) | 1992-07-08 | 1993-07-08 | Method for conversion of video signal system |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GB9214499A GB2268659B (en) | 1992-07-08 | 1992-07-08 | Video to film conversion |
Publications (3)
Publication Number | Publication Date |
---|---|
GB9214499D0 GB9214499D0 (en) | 1992-08-19 |
GB2268659A true GB2268659A (en) | 1994-01-12 |
GB2268659B GB2268659B (en) | 1996-03-27 |
Family
ID=10718394
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
GB9214499A Expired - Fee Related GB2268659B (en) | 1992-07-08 | 1992-07-08 | Video to film conversion |
Country Status (2)
Country | Link |
---|---|
JP (1) | JPH077660A (en) |
GB (1) | GB2268659B (en) |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB2231228A (en) * | 1989-04-27 | 1990-11-07 | Sony Corp | Video signal to photographic film conversion |
GB2249909A (en) * | 1990-11-15 | 1992-05-20 | Sony Broadcast & Communication | Format conversion of digital video signals |
Cited By (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP0735748A2 (en) * | 1995-03-27 | 1996-10-02 | AT&T Corp. | Method and apparatus for converting an interlaced video frame sequence into a progressively-scanned sequence |
EP0735748A3 (en) * | 1995-03-27 | 1997-05-28 | At & T Corp | Method and apparatus for converting an interlaced video frame sequence into a progressively-scanned sequence |
GB2318243A (en) * | 1996-10-09 | 1998-04-15 | British Broadcasting Corp | Motion compensated video line interpolation for interlace-to-progressive conversion |
GB2318243B (en) * | 1996-10-09 | 2000-09-13 | British Broadcasting Corp | Video signal processing |
WO2001078388A1 (en) * | 2000-04-07 | 2001-10-18 | Snell & Wilcox Limited | Method of conversion from an interlaced format to a progressive format having a lower frame rate |
CN1322749C (en) * | 2001-11-01 | 2007-06-20 | 皇家飞利浦电子股份有限公司 | Edge oriented interpolation of video data |
WO2003039147A1 (en) * | 2001-11-01 | 2003-05-08 | Koninklijke Philips Electronics N.V. | Edge oriented interpolation of video data |
EP1931141A1 (en) * | 2005-09-30 | 2008-06-11 | Sharp Kabushiki Kaisha | Image display device and method |
EP1931141A4 (en) * | 2005-09-30 | 2010-11-03 | Sharp Kk | Image display device and method |
US9881535B2 (en) | 2005-09-30 | 2018-01-30 | Sharp Kabushiki Kaisha | Image display device and method |
GB2505872A (en) * | 2012-07-24 | 2014-03-19 | Snell Ltd | Interpolation of images |
US8860880B2 (en) | 2012-07-24 | 2014-10-14 | Snell Limited | Offset interpolation of a sequence of images to obtain a new image |
GB2505872B (en) * | 2012-07-24 | 2019-07-24 | Snell Advanced Media Ltd | Interpolation of images |
Also Published As
Publication number | Publication date |
---|---|
JPH077660A (en) | 1995-01-10 |
GB9214499D0 (en) | 1992-08-19 |
GB2268659B (en) | 1996-03-27 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US4992869A (en) | Motion dependent video signal processing | |
US5329309A (en) | Method of integrating format material and an interlace scan format signal | |
AU668196B2 (en) | Motion compensated video processing | |
JP2832927B2 (en) | Scanning line interpolation apparatus and motion vector detection apparatus for scanning line interpolation | |
US5444493A (en) | Method and apparatus for providing intra-field interpolation of video signals with adaptive weighting based on gradients of temporally adjacent fields | |
US5610662A (en) | Method and apparatus for reducing conversion artifacts | |
KR920004561B1 (en) | Moving detecting circuit | |
EP0395276B1 (en) | Video signal to photographic film conversion | |
US5504531A (en) | Method and apparatus for field rate up-conversion | |
JPH02289894A (en) | Video signal interpolating device | |
JPH02290385A (en) | Television system converter with moving correction | |
US5579053A (en) | Method for raster conversion by interpolating in the direction of minimum change in brightness value between a pair of points in different raster lines fixed by a perpendicular interpolation line | |
JPH02290387A (en) | Television signal system converter with moving correction | |
US5386237A (en) | Method and apparatus for adaptive progressive scan conversion | |
US4539592A (en) | Double-scanning non-interlace television receiver | |
JPH02290384A (en) | Television signal system converter with moving correction | |
US6219102B1 (en) | Weighted median filter interpolator | |
JPH02290381A (en) | Television signal system converter with movement correction | |
US6151363A (en) | Video image processing | |
GB2268659A (en) | Interlace to non-interlace conversion for electron beam film recording | |
US7391476B2 (en) | Method and device for interpolating a pixel of an interline of a field | |
EP0817478A1 (en) | Process for interpolating frames for film mode compatibility | |
JPS5940772A (en) | Double scanning television receiver | |
GB2195216A (en) | Video transmission system | |
Thomas et al. | Generation of high quality slow-motion replay using motion compensation |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
730A | Proceeding under section 30 patents act 1977 | ||
PCNP | Patent ceased through non-payment of renewal fee |
Effective date: 20060708 |