GB2264414A - Motion compensated noise reduction - Google Patents
- Publication number
- GB2264414A GB2264414A GB9202939A GB9202939A GB2264414A GB 2264414 A GB2264414 A GB 2264414A GB 9202939 A GB9202939 A GB 9202939A GB 9202939 A GB9202939 A GB 9202939A GB 2264414 A GB2264414 A GB 2264414A
- Authority
- GB
- United Kingdom
- Prior art keywords
- frame
- frames
- output
- pixel
- input
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Links
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/14—Picture signal circuitry for video frequency region
- H04N5/21—Circuitry for suppressing or minimising disturbance, e.g. moiré or halo
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Television Systems (AREA)
- Picture Signal Circuits (AREA)
Abstract
In order to reduce motion blur artifacts in areas of picture motion, motion compensated temporal interpolation is employed so that the fields or frames used to produce each filtered frame are generally temporally aligned. Noise reduction is achieved by averaging an input field/frame IF2 with at least one other motion compensated temporally interpolated field/frame IF1. <IMAGE>
Description
VIDEO SIGNAL PROCESSING This invention is concerned with a method of and an apparatus for processing a digital video signal and in particular is concerned with reducing noise in the video signal.
Present noise reduction systems are normally based on one of two techniques:
i) low pass filtering of an image or image sequence; and
ii) detection and cancellation of noise.
The filtering technique is perhaps the easiest technique to implement. It relies on the fact that the power spectral density of an average real image has the majority of the picture energy at the lower end of the spectrum, whereas random (Gaussian white) noise has a flat power spectrum. By filtering the image with a suitable low pass filter, the majority of the picture energy remains unaffected whilst a dramatic attenuation of the noise can be obtained. For example, a low pass filter at half the picture bandwidth will reduce the noise by 3dB.
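The 3dB figure can be checked numerically. The short sketch below (an illustration added here, not part of the original disclosure) applies an ideal half-band low-pass filter to Gaussian white noise and measures the remaining noise power; halving the variance corresponds to a reduction of about 3dB.

```python
import numpy as np

rng = np.random.default_rng(0)
noise = rng.standard_normal(4096)          # Gaussian white noise, flat spectrum

# Ideal half-band low-pass filter: zero all bins above half the bandwidth.
spectrum = np.fft.rfft(noise)
spectrum[len(spectrum) // 2:] = 0
filtered = np.fft.irfft(spectrum, n=len(noise))

ratio = filtered.var() / noise.var()       # close to 0.5, i.e. about 3dB
```

The picture energy, being concentrated in the kept lower bins, would pass largely unaffected, which is the premise of the filtering technique.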
The degradation of the picture caused by the filter is dependent on both the filter and the image.
The use of an intra-field or intra-frame low pass filter will soften the image. Sharpness will be lost and edges will become blurred. The degree of degradation will depend on how sharp the original image was and the bandwidth of the filter.
The degradation caused by an inter-field or inter-frame filter is often less disturbing. In static areas of an image the signal has zero temporal frequency. It is therefore possible in theory to use a filter of infinitesimally small bandwidth to remove all the noise without degradation of the image. However, in areas of motion higher frequencies do exist. Temporal filtering of a moving image will therefore manifest itself as motion blur.
Motion adaptive filtering systems are known in which elaborate switching of filters is carried out according to the degree of motion in the image. However, these systems will only be able to work effectively in static picture areas.
Noise reduction based on detection and cancellation of noise strives to reduce noise without significant degradation of the image.
An example of a commercially available system is the "BKU904" produced by Sony Corporation of Japan, which uses an inter-field difference signal, a form of high-pass filtered image. This signal is the opposite of the filtered signal described previously in that it ideally represents just noise and not image. This noise signal is then subtracted from the source image (picture and noise), revealing the picture. However, the estimated noise signal is corrupted by movement within the image which, like noise, has a high frequency content. This system would again cause motion blur in moving areas of the picture. The BKU904 applies a Hadamard transform to the difference signal followed by a threshold to attempt to remove the motion information from the difference signal.
This relies on the noise and motion components generally having different characteristics which are isolated by the threshold. However, this system again is not perfect and some picture impairment will be seen.
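The detect-and-cancel idea can be illustrated with a much simplified, pixel-domain stand-in (the actual BKU904 thresholds in the Hadamard domain; all names and values below are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)
clean = np.tile(np.linspace(0.0, 1.0, 64), (64, 1))   # static test scene
f1 = clean + 0.05 * rng.standard_normal(clean.shape)  # two noisy fields
f2 = clean + 0.05 * rng.standard_normal(clean.shape)

# Inter-field difference: for a static scene this is (ideally) pure noise.
diff = f2 - f1

# Threshold: large differences are assumed to be motion, not noise,
# and are excluded from the noise estimate.
noise_estimate = np.where(np.abs(diff) < 0.5, diff, 0.0) / 2.0

out = f2 - noise_estimate     # subtract estimated noise from the source field
```

For this static scene the residual noise power of `out` is roughly half that of `f2`; in moving areas it is the threshold that keeps (most of) the motion out of the noise estimate, which is where the residual impairment comes from.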
In summary, therefore, the problem with the known systems is their generally poor performance with moving images. Only a small noise reduction can be achieved if artifacts such as motion blur are to be kept within acceptable limits.
The present invention is a development of the inter-field or inter-frame low-pass filtering technique described above. However, in accordance with one aspect of the present invention, rather than simply filtering two or more input fields or frames to produce an output field or frame, at least one of those input fields or frames is temporally shifted using motion compensated temporal shifting so that the fields or frames to be filtered in producing each output field or frame are generally temporally aligned. Thus, a degree of noise reduction can be achieved comparable with that of the known filtering arrangement described above, but without or with less motion blur artifacts in regions of picture motion.

In accordance with another aspect of the present invention, there is provided a method of processing an input digital video signal representing a series of input fields or frames to produce an output video signal representing a series of output fields or frames having the same frame rate (and if appropriate the same field rate) as the input signal, in which each output field or frame is derived directly or indirectly from at least two of the input fields or frames by motion compensated temporal shifting.
In one example of the method, a motion vector is developed for each pixel in each output field or frame indicative of the motion of that pixel in the picture, and the value of each pixel at a respective location in the output field or frame is determined in part from the value of a pixel in one of the respective input fields or frames at a location offset from the output pixel location by an amount dependent upon the respective motion vector. The value of each pixel in each output field or frame may also be determined in part from the value of a pixel in another of the respective input fields or frames at the same location as the output pixel. Alternatively or additionally, the value of each pixel in each output field or frame may be determined in part from the value of a pixel in a further one of the respective input fields or frames at a location offset from the output pixel location by an amount dependent upon the respective motion vector. If this latter feature is additionally provided, preferably said one of the respective input fields or frames and said further one of the respective input fields or frames are temporally positioned to either side of said other of the respective input fields or frames, and the offset of said location in said one of the respective input fields or frames is in the opposite direction to the offset of said location in said further one of the respective input fields or frames.
In accordance with a further aspect of the present invention, there is provided an apparatus arranged to perform the above mentioned method.
A method of and an apparatus for producing temporally shifted frames using motion compensated temporal interpolation are known from published patent application GB2231228A, the content of which is incorporated herein by reference as if printed in full below. However, in that case there is a change of frame and field rate from 60Hz, 2:1 interlaced format to 24Hz, 1:1 progressive scan format. The apparatus described in that application can be used, with some modification, to perform the present invention.
Specific embodiments of the present invention will now be described, by way of example, with reference to the accompanying drawings, in which:
Figure 1 is a temporal/process diagram illustrating the relationship between input progressive scan format frames and corresponding output frames employing two-frame noise reduction filtering;
Figure 2 is a block diagram of an apparatus for performing the process of Figure 1;
Figure 3 illustrates a modification to the Figure 1 process;
Figure 4 is a block diagram of an apparatus for performing the process of Figure 3;
Figure 5 is a temporal/process diagram illustrating the relationship between input progressive scan format frames and corresponding output frames employing three-frame noise reduction filtering;
Figure 6 is a block diagram of an apparatus for performing the process of Figure 5.
Referring to Figure 1, the diagram represents time in the left to right direction and process steps in the top to bottom direction. The diagram shows three input 1:1 progressive scan format frames IF1 to IF3 (for example, 30 frame/s 1:1 frames) which are part of a series of input frames IFi, and two output frames OF1, OF2, which are part of a series of resulting output frames OFi, each temporally aligned with a respective input frame IFi+1. For each pixel at location (x,y) in each output frame OFi, a respective motion vector (m(x,y), n(x,y)) is developed indicative of the inter-frame motion of that pixel in the picture. For each output frame OFi, a temporally shifted frame SFi is produced such that the value SFi(x,y) of a pixel at a location (x,y) in the frame SFi is equal to the value of the pixel at a location (x-m(x,y), y-n(x,y)) in the temporally preceding input frame IFi, that is offset from the position (x,y) by the respective motion vector (m(x,y), n(x,y)). Otherwise stated:

SFi(x,y) = IFi(x-m(x,y), y-n(x,y)) - (1)
Each output frame OFi is then produced by averaging the respective temporally aligned input frame IFi+1 with the respective temporally shifted frame SFi, so that the value OFi(x,y) of a pixel at location (x,y) in the output frame OFi is given by:

OFi(x,y) = 1/2(SFi(x,y) + IFi+1(x,y)) - (2)

which can be otherwise stated as:

OFi(x,y) = 1/2(IFi(x-m(x,y), y-n(x,y)) + IFi+1(x,y)) - (3)
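Formula (3) can be sketched in a few lines of NumPy (an illustrative sketch added here, not part of the original disclosure; integer motion vectors and clamping at the frame edges are simplifying assumptions):

```python
import numpy as np

def mc_two_frame_filter(prev_frame, curr_frame, mvx, mvy):
    """Formula (3): OFi(x,y) = 1/2(IFi(x-m, y-n) + IFi+1(x,y)).

    prev_frame, curr_frame: 2-D arrays holding IFi and IFi+1.
    mvx, mvy: per-pixel integer motion vectors m(x,y), n(x,y).
    """
    h, w = curr_frame.shape
    ys, xs = np.mgrid[0:h, 0:w]
    # Source coordinates in the preceding frame, offset by the motion
    # vector and clamped at the frame edges.
    src_x = np.clip(xs - mvx, 0, w - 1)
    src_y = np.clip(ys - mvy, 0, h - 1)
    shifted = prev_frame[src_y, src_x]        # SFi(x,y), formula (1)
    return 0.5 * (shifted + curr_frame)       # formula (3)
```

For a noise-free scene translating by exactly (m, n) per frame, the shifted frame coincides with the current frame and the output is unchanged; with independent noise on each input, the averaging halves the noise power without motion blur.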
Figure 2 illustrates an apparatus for performing the method of Figure 1. The input video frames are supplied on line 10 to a motion vector producer 12, which comprises a direct block matcher, motion vector estimator, motion vector reducer, motion vector selector and motion vector post processor as shown in Figure 4 of GB2231228A and described with reference to Figures 15 to 48 thereof, to which reference is directed for further detail. The motion vector producer 12 supplies frames of motion vectors to a vector offset input 14 of a frame store 16.
The input video frames on line 10 are also supplied via a delay 18 (to take account of the time taken in producing the motion vectors) to a raster input 20 of the frame store 16. The frame store 16 receives address data (or alternatively, for example, synchronisation and clock signals so that the frame store can determine addresses) at an input 22 from a system controller 24 and has an offset output 26. The frame store 16 operates not only to delay a frame by one frame period, but also to output pixel data which is offset from the current address (x,y) by the motion vector (m(x,y), n(x,y)). Therefore, for pixel data IFi+1(x,y) supplied to the raster input 20, the pixel data output from the offset output 26 is IFi(x-m(x,y), y-n(x,y)). The output of the delay 18 is connected to one input of a low-pass filter 28, and the offset output 26 of the frame store 16 is connected to another input of the filter 28.
The filter 28 may simply serve to average the pixel data at its two inputs and thus produce output pixel data OFi(x,y) in accordance with formula (3) above, or additionally it may apply some degree of spatial filtering to the input data.
Referring to Figure 3, there is no need for each output frame OFi to be temporally aligned with a respective input frame IFi+1, and as shown in Figure 3, each output frame OFi may be temporally interpolated, for example half-way, between a respective pair IFi, IFi+1 of input frames using motion compensated temporal interpolation. Thus for each output frame OFi a first temporally shifted frame SFi' is produced such that:
SFi'(x,y) = IFi(x-1/2m(x,y), y-1/2n(x,y)) - (4)

and a second temporally shifted frame SFi" is produced such that:

SFi"(x,y) = IFi+1(x+1/2m(x,y), y+1/2n(x,y)) - (5)

The output frame OFi is then produced by averaging the respective temporally shifted frames SFi' and SFi", such that:

OFi(x,y) = 1/2(IFi(x-1/2m(x,y), y-1/2n(x,y)) + IFi+1(x+1/2m(x,y), y+1/2n(x,y))) - (6)
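Formula (6) can be sketched in the same style as before (again an added illustration, not part of the disclosure; the half-pel offsets are rounded to the nearest pixel here, whereas a real apparatus would interpolate sub-pixel positions):

```python
import numpy as np

def mc_interpolated_filter(prev_frame, next_frame, mvx, mvy):
    """Formula (6): the output frame sits half-way between IFi and IFi+1,
    each input being shifted half a motion vector towards that instant."""
    h, w = prev_frame.shape
    ys, xs = np.mgrid[0:h, 0:w]
    hx = np.rint(mvx / 2).astype(int)     # rounded half offsets (assumption)
    hy = np.rint(mvy / 2).astype(int)
    sf1 = prev_frame[np.clip(ys - hy, 0, h - 1), np.clip(xs - hx, 0, w - 1)]  # (4)
    sf2 = next_frame[np.clip(ys + hy, 0, h - 1), np.clip(xs + hx, 0, w - 1)]  # (5)
    return 0.5 * (sf1 + sf2)                                                  # (6)
```

For a scene translating uniformly, both half-shifted inputs agree with the scene at the interpolated instant, so the moving detail is preserved while the independent noise contributions are averaged.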
Figure 4 illustrates a modification to the Figure 2 apparatus for performing the method of Figure 3. The delay 18' produces a delay of one frame period less than the delay 18 of Figure 2, and the delayed frames are input to a raster input 30 of a frame store 32 which is similar to the frame store 16 of Figure 2 except that it also has a raster output 34 producing delayed frames with no offset. The raster output 34 supplies a second frame store 16 similar to the frame store 16 of Figure 2. The motion vectors supplied by the motion vector producer 12 are multiplied by -1/2 by a multiplier 36 and supplied to the vector offset input 38 of the frame store 32, and are also multiplied by 1/2 by a multiplier 40 and supplied to the vector offset input 14 of the frame store 16. The offset outputs 26, 42 of the frame stores 16, 32 are supplied to the low pass filter 28. It will therefore be appreciated that while input pixel data IFi+2(x,y) is being supplied to the raster input 30 of the frame store 32, the pixel data IFi+1(x,y) of the preceding frame is supplied from the raster output 34 of the frame store 32 to the raster input 20 of the frame store 16, and the temporally shifted pixel data IFi(x-1/2m(x,y), y-1/2n(x,y)) and IFi+1(x+1/2m(x,y), y+1/2n(x,y)) are supplied from the offset outputs 26, 42 of the frame stores 16, 32 to the low-pass filter 28, so that the filter 28 can produce the output frames OFi(x,y) in accordance with formula (6) above.
In the arrangements described with reference to Figures 1 to 4, each output frame is derived from a respective pair of input frames, at least one of which is temporally shifted. More than two input frames may be used to form each output frame, and Figure 5 illustrates a method in which each output frame is derived from three input frames.
Referring to Figure 5, each output frame OFi is temporally aligned with a respective input frame IFi+1 and is produced by averaging: (a) that input frame IFi+1; (b) a temporally shifted frame SFi' produced by motion compensated temporal shifting from the preceding input frame IFi in accordance with the motion vectors (m(x,y), n(x,y)) of the pixels in the output frame OFi indicative of the inter-frame motion; and (c) a frame SFi" produced by temporal shifting from the succeeding input frame IFi+2 in accordance with the motion vectors (m(x,y), n(x,y)). More specifically:

SFi'(x,y) = IFi(x-m(x,y), y-n(x,y)); - (7)

SFi"(x,y) = IFi+2(x+m(x,y), y+n(x,y)); and - (8)

OFi(x,y) = 1/3(SFi'(x,y) + IFi+1(x,y) + SFi"(x,y)) - (9)

so that:

OFi(x,y) = 1/3(IFi(x-m(x,y), y-n(x,y)) + IFi+1(x,y) + IFi+2(x+m(x,y), y+n(x,y))) - (10)
It may be appreciated that errors may occur in determining the motion vectors (m(x,y), n(x,y)), which may cause degradation of the output picture. In order to lessen this problem, but at the expense of reducing the noise reduction effect, a weighting may be employed in favour of the input frame which is temporally aligned with the respective output frame, for example so that:

OFi(x,y) = 1/4IFi(x-m(x,y), y-n(x,y)) + 1/2IFi+1(x,y) + 1/4IFi+2(x+m(x,y), y+n(x,y)) - (11)
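Formulae (10) and (11) differ only in the weights, so one illustrative sketch covers both (an added example with assumed names; `centre_weight=1/3` gives the plain average of formula (10), the default `centre_weight=1/2` the weighting of formula (11)):

```python
import numpy as np

def mc_three_frame_filter(prev_f, curr_f, next_f, mvx, mvy, centre_weight=0.5):
    """Weighted three-frame average: formula (11) with centre_weight=1/2,
    or formula (10) with centre_weight=1/3."""
    h, w = curr_f.shape
    ys, xs = np.mgrid[0:h, 0:w]
    sf1 = prev_f[np.clip(ys - mvy, 0, h - 1), np.clip(xs - mvx, 0, w - 1)]  # (7)
    sf2 = next_f[np.clip(ys + mvy, 0, h - 1), np.clip(xs + mvx, 0, w - 1)]  # (8)
    side = (1.0 - centre_weight) / 2.0       # remaining weight split evenly
    return side * sf1 + centre_weight * curr_f + side * sf2
```

Raising `centre_weight` makes the output less sensitive to motion vector errors, at the cost of less noise reduction, exactly as the text describes.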
An apparatus for performing the method of Figure 5 will now be described with reference to Figure 6. The apparatus is similar to that of Figure 4, except that: the -1/2 multiplier 36 of Figure 4 is replaced by a -1 multiplier 36'; the +1/2 multiplier 40 of Figure 4 is not employed; an additional frame store 44 is placed between the raster output 34 of the frame store 32 and the raster input 20 of the frame store 16 to provide an additional one frame period delay; and the low pass filter 28' has an additional input from the raster output 46 of the frame store 44.
It will therefore be appreciated that, while input pixel data IFi+3(x,y) is being supplied to the raster input 30 of the frame store 32: (a) the temporally shifted pixel data IFi+2(x+m(x,y), y+n(x,y)) of the preceding frame is supplied from the offset output 42 of the frame store 32 to one input of the low pass filter 28'; (b) the unshifted pixel data IFi+2(x,y) of the preceding frame is supplied from the raster output 34 of the frame store 32 to the raster input 48 of the frame store 44; (c) the unshifted pixel data IFi+1(x,y) of the preceding frame but one is supplied from the raster output 46 of the frame store 44 both to the raster input 20 of the frame store 16 and to another input of the low pass filter 28'; and (d) the temporally shifted pixel data IFi(x-m(x,y), y-n(x,y)) of the preceding frame but two is supplied to a further input of the low pass filter 28'. The filter 28' can therefore produce the output frame pixel data OFi(x,y) in accordance with formula (10) or (11) above.
The methods and apparatuses described with reference to Figures 1 to 6 above are concerned with progressive scan format video signals.
The invention is also applicable to interlaced video signals, such as 60 field/s, 30 frame/s, 2:1 interlaced signals or 50 field/s, 25 frame/s, 2:1 interlaced signals. One simple way of doing this involves using the apparatuses of Figures 2, 4 and 6 on a field-by-field basis rather than a frame-by-frame basis, by combining odd fields to produce each odd noise-reduced output field and by combining even fields to produce each even noise-reduced output field. Alternatively and preferably, a progressive scan converter may be included at the input to the motion vector producer 12. Such a progressive scan converter is shown in Figure 4 of GB2231228A and described with reference to Figures 5 to 14 thereof, to which reference is directed. Alternatively, such a progressive scan converter may be included in the input line 10 of the apparatus of Figure 2, 4 or 6, so that the apparatus then performs as described above with reference to Figures 1 and 2, 3 and 4, or 5 and 6, and only the odd or even lines, as appropriate, of the output frames OFi would then be output in order to provide the noise reduced output fields.
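The field-by-field option amounts to de-interleaving each frame into its two fields, filtering each field sequence independently, and re-interleaving. The split and weave steps can be sketched as follows (helper names are illustrative, not from the disclosure):

```python
import numpy as np

def split_fields(frame):
    """De-interleave a 2:1 interlaced frame into its two fields."""
    return frame[0::2], frame[1::2]       # top-field lines, bottom-field lines

def weave_fields(top, bottom):
    """Re-interleave two fields into one frame."""
    frame = np.empty((top.shape[0] + bottom.shape[0], top.shape[1]),
                     dtype=top.dtype)
    frame[0::2] = top
    frame[1::2] = bottom
    return frame
```

Each field sequence (the `top` fields of successive frames, and likewise the `bottom` fields) would then be passed through the noise reducer exactly as the progressive frames above.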
It will be appreciated that many modifications and developments may be made to the methods and apparatuses described above. For example, more than three frames or fields may be used to produce each output frame or field.
Claims (8)
1. A method of processing an input digital video signal representing a series of input fields or frames to produce an output video signal representing a series of output fields or frames having the same frame rate (and if appropriate the same field rate) as the input signal, in which each output field or frame is derived directly or indirectly from at least two of the input fields or frames by motion compensated temporal shifting.
2. A method as claimed in claim 1, wherein a motion vector is developed for each pixel in each output field or frame indicative of the motion of that pixel in the picture, and wherein the value of each pixel at a respective location in the output field or frame is determined in part from the value of a pixel in one of the respective input fields or frames at a location offset from the output pixel location by an amount dependent upon the respective motion vector.
3. A method as claimed in claim 2, wherein the value of each pixel in each output field or frame is determined in part from the value of a pixel in another of the respective input fields or frames at the same location as the output pixel.
4. A method as claimed in claim 2 or 3, wherein the value of each pixel in each output field or frame is determined in part from the value of a pixel in a further one of the respective input fields or frames at a location offset from the output pixel location by an amount dependent upon the respective motion vector.
5. A method as claimed in claim 4 when dependent on claim 3, wherein said one of the respective input fields or frames and said further one of the respective input fields or frames are temporally positioned to either side of said other of the respective input fields or frames, and wherein the offset of said location in said one of the respective input fields or frames is in the opposite direction to the offset of said location in said further one of the respective input fields or frames.
6. A method of processing a digital video signal substantially as described with reference to Figures 1, 3 or 5 of the drawings.
7. An apparatus arranged to perform the method of any preceding claim.
8. An apparatus for processing a digital video signal substantially as described with reference to the drawings.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GB9202939A GB2264414B (en) | 1992-02-12 | 1992-02-12 | Video signal processing |
Publications (3)
Publication Number | Publication Date |
---|---|
GB9202939D0 GB9202939D0 (en) | 1992-03-25 |
GB2264414A true GB2264414A (en) | 1993-08-25 |
GB2264414B GB2264414B (en) | 1995-07-12 |
Family
ID=10710246
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
GB9202939A Expired - Fee Related GB2264414B (en) | 1992-02-12 | 1992-02-12 | Video signal processing |
Country Status (1)
Country | Link |
---|---|
GB (1) | GB2264414B (en) |
Cited By (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB2280812A (en) * | 1993-08-05 | 1995-02-08 | Sony Uk Ltd | Deblurring image data using motion vector dependent deconvolution |
GB2280812B (en) * | 1993-08-05 | 1997-07-30 | Sony Uk Ltd | Image enhancement |
EP0722250A3 (en) * | 1995-01-10 | 1999-10-20 | Sony Corporation | System for and method of processing image signal |
WO1996026606A2 (en) * | 1995-02-20 | 1996-08-29 | Snell & Wilcox Limited | Moving image reproduction system |
WO1996026606A3 (en) * | 1995-02-20 | 1996-10-31 | Snell & Wilcox Ltd | Moving image reproduction system |
GB2333413A (en) * | 1998-01-20 | 1999-07-21 | Snell & Wilcox Ltd | Moving image restoration |
GB2333413B (en) * | 1998-01-20 | 2002-05-15 | Snell & Wilcox Ltd | Moving image restoration |
US8204334B2 (en) | 2006-06-29 | 2012-06-19 | Thomson Licensing | Adaptive pixel-based filtering |
US8467626B2 (en) | 2006-09-29 | 2013-06-18 | Thomson Licensing | Automatic parameter estimation for adaptive pixel-based filtering |
EP1995948A3 (en) * | 2007-05-23 | 2011-01-12 | Sony Corporation | Image processing |
US8243150B2 (en) | 2007-05-23 | 2012-08-14 | Sony Corporation | Noise reduction in an image processing method and image processing apparatus |
Also Published As
Publication number | Publication date |
---|---|
GB2264414B (en) | 1995-07-12 |
GB9202939D0 (en) | 1992-03-25 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
730A | Proceeding under section 30 patents act 1977 | ||
PCNP | Patent ceased through non-payment of renewal fee | Effective date: 20110212 ||