WO2009026966A1 - Method of estimating illumination change of images for object detection - Google Patents

Method of estimating illumination change of images for object detection

Info

Publication number
WO2009026966A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
luminance
illumination change
images
difference
Prior art date
Application number
PCT/EP2007/059089
Other languages
French (fr)
Inventor
Antony Louis Piriyakumar Douglas
Ashish Sharma
Original Assignee
Siemens Building Technologies Fire & Security Products GmbH & Co. oHG
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Siemens Building Technologies Fire & Security Products GmbH & Co. oHG
Priority to PCT/EP2007/059089 priority Critical patent/WO2009026966A1/en
Publication of WO2009026966A1 publication Critical patent/WO2009026966A1/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/50 Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/90 Dynamic range modification of images or parts thereof
    • G06T5/92 Dynamic range modification of images or parts thereof based on global image properties


Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)

Abstract

The present invention provides a method of estimating illumination change of images for object detection. The method involves estimating illumination change for images and further changing the luminance of a corresponding background image. The method first involves determining the difference in luminance of a current image and a corresponding background image and also determining the difference in chrominance of the current image and the corresponding background image. Then the ratio between the luminance of the current image and the luminance of the background image is determined if the difference in luminance is larger than a first predetermined value and the difference in chrominance is smaller than a second predetermined value. The determined ratio is a measure of the illumination change between the image and the background image. Further the invention involves changing the luminance of the background image using the determined ratio.

Description

Method of estimating illumination change of images for object detection
The present invention relates to digital image processing and in particular to estimating illumination change of images for object detection. Possible applications of the invention are for example in surveillance, intrusion detection in indoor and outdoor environments and traffic monitoring systems.
Background subtraction is a commonly used technique for detecting objects of interest in an image. It involves comparing a current image with a background image which does not contain any object of interest. The areas of the image plane where there is a significant difference between the current image and the background image indicate the location of the objects of interest. Background subtraction uses the technique of subtracting the background image from the current image and thresholding the result to detect the objects of interest.
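As a minimal illustration of the background subtraction technique just described (not of the claimed estimation method itself), the following Python/NumPy sketch subtracts a stored background image from the current image and thresholds the absolute difference; the function name and the threshold value of 30 are illustrative assumptions.

    import numpy as np

    def subtract_background(current, background, threshold=30):
        """Plain background subtraction: pixels whose absolute difference
        from the background exceeds `threshold` are marked as foreground."""
        diff = np.abs(current.astype(np.int16) - background.astype(np.int16))
        if diff.ndim == 3:
            # For colour images, take the largest per-channel difference.
            diff = diff.max(axis=2)
        return diff > threshold  # boolean foreground mask

Pixels flagged True form the detected objects of interest; everything else is treated as background.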
However, various environmental factors, such as shadows and variations in natural light, make object detection difficult and error-prone, since they can bring a gradual or sudden illumination change into the current image, which in turn may lead to erroneously detected objects of interest in the final output, i.e. the background subtracted image. Effective background subtraction therefore requires that the background image be accurately updated for illumination changes so that object detection can be performed reliably. If the background is not updated accordingly, the output may contain portions of the background that have changed because of the illumination change and are falsely detected as objects of interest. It is an object of the present invention to provide a more accurate method of estimating illumination change of images for object detection.
The above object is achieved by a method of estimating illumination change of images for object detection comprising the steps of:
(a) determining the difference in luminance of a current image and a corresponding background image;
(b) determining the difference in chrominance of said current image and said corresponding background image; and
(c) determining the ratio between the luminance of the current image and the luminance of the background image if the difference in luminance is larger than a first predetermined value and the difference in chrominance is smaller than a second predetermined value, the determined ratio being a measure of the illumination change between the current image and the background image.
Illumination changes usually cause changes in the brightness of an image while the colours of the image remain basically unchanged. Usually only the brightness of colours is affected by illumination changes, not the colours as such. In other words, illumination changes usually cause a substantial change in the luminance of an image with no or minimal change in its chrominance. The invention makes use of this effect in an unexpected way, exploiting it to improve the accuracy of the estimation of illumination changes of images for object detection. To determine the change in luminance during an illumination change, the ratio of the luminance of the current image to the luminance of the corresponding background image is determined, but only if the difference in luminance is larger than a first predetermined value and the difference in chrominance is smaller than a second predetermined value. This reduces the chance that a change in luminance caused by brighter or darker objects in the current image is erroneously taken as a measure of an illumination change, since such objects usually also cause a significant difference in chrominance because their colour differs from the background colour.
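The decision rule of steps (a) to (c) can be sketched as follows. The patent does not fix how the luminance and chrominance differences are aggregated, so the use of mean absolute channel differences, the concrete threshold values and the function name are assumptions made for illustration only.

    import numpy as np

    def illumination_ratio(cur_y, cur_cr, cur_cb, bg_y, bg_cr, bg_cb,
                           lum_thresh=25.0, chrom_thresh=10.0):
        """Steps (a)-(c): return the luminance ratio of the current image to the
        background image if the luminance changed strongly while the chrominance
        stayed nearly constant; return None otherwise (no illumination change)."""
        cur_y = cur_y.astype(np.float64)
        # (a) difference in luminance (mean absolute Y difference)
        d_lum = np.mean(np.abs(cur_y - bg_y))
        # (b) difference in chrominance (mean absolute Cr/Cb difference)
        d_chrom = 0.5 * (np.mean(np.abs(cur_cr.astype(np.float64) - bg_cr)) +
                         np.mean(np.abs(cur_cb.astype(np.float64) - bg_cb)))
        # (c) the ratio is determined only under the two threshold conditions
        if d_lum > lum_thresh and d_chrom < chrom_thresh:
            return float(np.mean(cur_y) / max(np.mean(bg_y), 1e-6))
        return None

A ratio greater than one indicates that the scene has become brighter; multiplying the luminance of the background image by this ratio compensates for the change, as described in the preferred embodiment below.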
In a preferred embodiment of the invention, the method comprises the step of changing the luminance of the background image using the determined ratio. The determined ratio is a measure of the illumination change between the current image and the background image. If the background is not changed accordingly, the output of any background subtraction method used thereafter might have portions of the background which may have changed because of the illumination change and may be falsely detected as objects of interest. Hence, the above feature facilitates accurate updating of the background image, which is particularly advantageous for an effective background subtraction.
In a further preferred embodiment, the current image is a part of a frame, said frame comprising a plurality of images, and the steps (a), (b) and (c) are repeated for the plurality of images. This is especially advantageous for estimating illumination changes in complex scenes with several foreground objects, which can be captured more easily using a plurality of smaller images forming a frame instead of one large image. By repeating the steps for the plurality of images, the ratio, being a measure of the illumination change between the current image and the background image, can be determined only for images containing mainly background. Images containing mainly foreground objects will not be considered. Hence, the accuracy of illumination change estimation will be improved. In an alternative embodiment, said method comprises the step of calculating an average illumination change of the frame from the ratios of the images for which the ratio has been determined. This results in a better approximation of the actual illumination change in the frame.
In another alternative embodiment, an illumination change is detected if the number of images for which the ratio has been determined exceeds a predetermined threshold, making the method less error-prone with regard to false detection of illumination changes.
Another alternative embodiment comprises the step of updating the luminance of the background images using the average illumination change of the frame. This provides a corrected background of the image that can be used for background subtraction. This facilitates a proper background subtraction since the updated background is devoid of errors due to illumination change.
In another alternative embodiment, the image comprises pixels, the luminance of each pixel being determined by one luminance value and the chrominance of each pixel being determined by two chrominance values. This eases determining the differences in luminance and in chrominance of the current image and the corresponding background image, since both luminance and chrominance are explicitly defined by their respective values.
The present invention is further described hereinafter with reference to illustrated embodiments shown in the accompanying drawings, in which:
FIG 1 is a flowchart illustrating a method for estimating illumination change of a current image with respect to a corresponding background image,
FIG 2 is a flowchart illustrating a method for estimating illumination change of a frame comprising a plurality of images,
FIG 3 is a flowchart illustrating an embodiment of the present invention for estimating illumination change of a frame comprising a plurality of images in YCrCb color space,
FIG 4 shows a background image of a road in a traffic monitoring method,
FIG 5 shows an input image of a traffic monitoring method affected by illumination change,
FIG 6 shows a background subtracted image affected by illumination change in a traffic monitoring method, and
FIG 7 shows an illumination corrected output image in a traffic monitoring method.
Before describing the method of object detection, some of the terminology used herein will be explained. Still photographs or digital video signals may be represented as a matrix of pixel values and stored in a storage device, such as that of a computer, an embedded system, or other digital processing device. In the context of the present discussion, a pixel value or the matrix of pixel values is generally referred to as an 'image'. Similarly, an 'image' can be a pixel or a combination of pixels, where the 'pixel' is the smallest resolved unit of an image that has a specific luminance (brightness) and chrominance (colour). A 'frame' is one of the many single photographic images in a video. The 'frame' may consist of a single 'image' or a combination of such 'images'. An 'object' or 'object of interest' is defined as an identifiable entity in a scene in a still image, or a moving or non-moving entity in a video image.
FIG 1 illustrates a method 100 for estimating illumination change of a current image with respect to a corresponding background image. To estimate the illumination change, a background image is initially stored in an image processing device (step 10). A current image is then received in the image processing device in a required format compatible for processing, i.e. for example in a luminance-chrominance color space. For example, in case the image is in the RGB (Red, Green, Blue) colour space, the image is converted to the YCrCb format (Y is the luminance component; Cr and Cb represent the chroma red and chroma blue components, which are the colour components). At step 20, the current image received is converted to the required format compatible for processing. This is an optional step and is not performed if the current image is received directly in the format compatible for processing. The required format has at least one luminance component and at least one chrominance component. At step 30, the difference in luminance of the current image and a corresponding background image is determined. At step 40, the difference in chrominance of the current image and the corresponding background image is determined. Steps 30 and 40 can be performed in parallel or one after the other. At step 50, a check is made to determine whether the difference in luminance is larger than a first predetermined value and the difference in chrominance is smaller than a second predetermined value. This reduces the chance that a change in luminance caused by brighter or darker objects in the image is erroneously taken as a measure of an illumination change, since such objects usually also cause a significant difference in chrominance because their colour differs from the background colour. If the above condition is not satisfied, an illumination change is not detected (step 60). If the above condition is satisfied, the ratio between the luminance of the current image and the luminance of the background image is determined (step 70). An illumination change affects the luminance of the image in a multiplicative way; so, to obtain the multiplicative factor, the ratio between the luminance of the image and the luminance of the corresponding background image is determined. The determined ratio is a measure of the illumination change between the image and the background image. At step 80, the illumination change is determined using the ratio. The background image is changed accordingly with respect to the measured illumination change (step 90).
FIG 2 illustrates a method 200 for estimating the illumination change of a frame comprising a plurality of images. To estimate the illumination change, a background image is initially stored in an image processing device (step 205). At step 210, the current image received by the processing device is converted to a required format compatible for processing. This is an optional step and is not performed if the current image is received directly in the format compatible for processing. Here the current image is part of a frame, which comprises a plurality of images. At step 215, the difference in luminance and the difference in chrominance of a current image and a corresponding background image are determined. At step 220, a check is made to determine whether the difference in luminance is larger than a first predetermined value and the difference in chrominance is smaller than a second predetermined value. If the condition is satisfied, the ratio between the luminance of the current image and the luminance of the corresponding background image is determined (step 225). The process of determining the difference in luminance, determining the difference in chrominance and determining the ratio is repeated for all the images in the frame (step 230). The illumination change may not affect all the images of a frame equally, since the frame can contain both background and foreground objects; for example, an image can contain only background, only a foreground object, or parts of both. An illumination change will typically bring a high level of change in luminance in an image containing background alone, while the change in chrominance is usually small or nearly nil. However, there may be more chrominance change than luminance change in an image which contains at least portions of a foreground object, since the colour of the foreground object usually differs from that of the background, which affects the chrominance of the image rather than the luminance. At step 235, the average illumination change of the whole frame is calculated using the determined ratio of each image. At step 240, a check is made to determine whether the total number of images for which the ratio has been determined exceeds a predetermined threshold (a third threshold). If this condition is not satisfied, an illumination change is not detected (step 245). At step 250, an illumination change is detected if the total number of images for which the ratio has been determined exceeds the third predetermined threshold. The determined average illumination change is a measure of the illumination change of the frame. At step 255, the background images are changed accordingly with respect to the measured illumination change.
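A sketch of the frame-level procedure of FIG 2, assuming the frame is divided into a regular grid of equally sized sub-images and reusing the illustrative illumination_ratio helper from above; the block size, the value of the third threshold and the clipping to 8-bit values are likewise assumptions.

    import numpy as np

    def estimate_frame_illumination(frame_y, frame_cr, frame_cb,
                                    bg_y, bg_cr, bg_cb,
                                    block=32, third_threshold=20):
        """FIG 2 sketch (steps 215-255): compute the luminance ratio for every
        sub-image that passes the luminance/chrominance test, then average the
        ratios if enough sub-images qualified, and rescale the background."""
        ratios = []
        h, w = frame_y.shape
        for y0 in range(0, h - block + 1, block):
            for x0 in range(0, w - block + 1, block):
                sl = (slice(y0, y0 + block), slice(x0, x0 + block))
                r = illumination_ratio(frame_y[sl], frame_cr[sl], frame_cb[sl],
                                       bg_y[sl], bg_cr[sl], bg_cb[sl])
                if r is not None:
                    ratios.append(r)
        if len(ratios) <= third_threshold:
            return None                       # step 245: no illumination change
        avg = float(np.mean(ratios))          # step 235: average change of the frame
        bg_y_new = np.clip(bg_y.astype(np.float64) * avg,
                           0, 255).astype(np.uint8)   # step 255: update background
        return avg, bg_y_new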
FIG 3 is a flowchart illustrating an embodiment of the present invention for estimating illumination change of a frame comprising a plurality of images in YCrCb color space. A scheme for describing colour numerically is generally called a colour space. The most widely used colour spaces are RGB (Red, Green, Blue) for scanners and displays, CMYK (cyan, magenta, yellow, and black) for colour printing, and YUV for video and TV. The Y in YUV stands for "luma", which is brightness; U and V provide colour information and are "colour difference" signals of blue minus luma (B-Y) and red minus luma (R-Y) respectively. YUV is most commonly encountered in its digital form, YCrCb, which is widely used in video and image processing schemes. Y is the luminance component; Cr and Cb are the chrominance components. Cr is a colour difference signal, red minus luma (R-Y), and Cb is another colour difference signal, blue minus luma (B-Y). As applied to video signals, luminance represents the brightness of an image. Luminance is typically paired with chrominance: luminance represents the achromatic image without any colour, while the chrominance components represent the colour information. YCrCb domain processing also allows Cr and Cb to be stored in a sub-sampled form, which reduces storage requirements compared with other colour spaces such as RGB. Since both luminance and chrominance are explicitly defined by their respective values, estimation of illumination change is easier in the YCrCb colour space.
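As an illustration, a commonly used conversion from 8-bit RGB to Y, Cr, Cb is sketched below using the JPEG/BT.601 full-range coefficients; the patent does not prescribe particular conversion constants, so these values are one conventional choice rather than part of the claimed method.

    import numpy as np

    def rgb_to_ycrcb(rgb):
        """Convert an 8-bit RGB image (H x W x 3) to separate Y, Cr and Cb planes."""
        rgb = rgb.astype(np.float64)
        r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
        y = 0.299 * r + 0.587 * g + 0.114 * b   # luma (brightness)
        cr = 0.713 * (r - y) + 128.0            # red colour difference (R-Y)
        cb = 0.564 * (b - y) + 128.0            # blue colour difference (B-Y)
        return y, cr, cb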
A method 300 for estimating illumination change of a frame comprising a plurality of images in YCrCb color space is illustrated in FIG 3. During an illumination change there is a substantial change in the Y component of the image with no or minimal change in the Cr and Cb components of an image at the same position in two consecutive frames. To estimate the illumination change, the ratio of the current Y component to the previous Y component at the same image position is calculated, since an illumination change affects the Y component multiplicatively.
To estimate the illumination change, a background image is initially stored in an image processing device (step 305). The current image is received in the YCrCb format (step 310). In case the image is not in YCrCb format, the current image is converted to the YCrCb format. Here the current image is part of a frame, which comprises a plurality of images. At step 315, the difference in luminance and the difference in chrominance of an image in the frame and a corresponding background image are determined. If the difference in luminance is larger than a first predetermined value and the difference in chrominance is smaller than a second predetermined value (step 320), then that particular image is marked (step 325). The process of determining the difference in luminance, determining the difference in chrominance and marking of the image is repeated for all the images in the frame (step 330). At step 335, a check is made to determine whether the ratio of the total number of marked images to the total number of images in the frame exceeds a predetermined threshold. If this condition is not satisfied, an illumination change is not detected (step 340). At step 345, an illumination change is detected if the said condition is satisfied. At step 350, the illumination ratio of each marked image is determined. At step 355, the average illumination change of the whole frame is calculated using the determined illumination ratio of each marked image. The determined average illumination change is a measure of the illumination change of the frame. The background image is changed accordingly with respect to the measured illumination change (step 360).
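The FIG 3 variant differs from FIG 2 mainly in that qualifying sub-images are first only marked, and an illumination change is declared only when the fraction of marked sub-images in the frame is large enough. The sketch below folds marking and ratio computation together and assumes a 10% fraction threshold and the illustrative illumination_ratio helper defined earlier.

    def estimate_frame_illumination_marked(blocks, marked_fraction=0.10):
        """FIG 3 sketch: `blocks` is a list of
        (cur_y, cur_cr, cur_cb, bg_y, bg_cr, bg_cb) tuples, one per sub-image."""
        ratios = [illumination_ratio(*b) for b in blocks]    # steps 315-330
        marked = [r for r in ratios if r is not None]        # marked sub-images
        if len(marked) <= marked_fraction * len(blocks):     # step 335
            return None                                      # step 340: no change
        return sum(marked) / len(marked)                     # steps 345-355: average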
The above thresholds are initialized with some values, say a 40% change or a 25% change. The performance of the algorithm is evaluated on a representative set of images and, depending upon the results, the thresholds are increased or decreased.
The invention is explained with respect to an example. The invention can be implemented in a traffic monitoring system, where the task is to detect foreground objects, such as vehicles, against a background, such as a road. The background subtraction technique involves comparing the image that has undergone an illumination change with a corresponding background image captured prior to the illumination change.
FIG 4 shows a background image 400 of the road in the traffic monitoring method. The road is divided for traffic using well defined traffic lanes. Each lane of the road is delimited by painted strips. Two sequences of strips divide the road into three parallel lanes as shown in the image. The first sequence is constituted by strips 401, 402 and 403. The second sequence is constituted by strips 404, 405 and 406.
FIG 5 shows an input image 500 of a traffic monitoring method affected by illumination change. The input image is received in the required format compatible for processing. 501, 502 and 503 are the foreground objects, i.e. the vehicles that need to be detected. Due to illumination changes, for example caused by sunlight, the brightness of the road changes, and the luminance of the strips changes as well. The presence of a vehicle in the captured image, on the other hand, changes the colour of the background. The brightness of the background as well as of the foreground changes because of the illumination change: the illumination change brings a brightness change in the road, the strips and also in the vehicles. The change in chrominance is higher in places where a vehicle is present, whereas in the background, where there is no vehicle, the luminance change dominates, since the illumination change affects the brightness of the background and not its colour. For illustration of this example, a high level of luminance change in the strips is assumed. 504, 505, 506, 507, 508 and 509 indicate the strips after the luminance change.
FIG 6 shows a background subtracted image 600 affected by illumination change in a traffic monitoring method. The background subtracted image has portions of the strips 601, 602, 603, 604, 605, and 606 which are erroneously detected as foreground objects. Thus the luminance change has resulted in an erroneous detection of the foreground objects.
FIG 7 illustrates an illumination corrected output image 700. Since the scene has several foreground objects, the illumination change is determined by using a plurality of smaller images forming a frame instead of one large image. The portions where there is a chrominance change are identified as parts of the image occupied by a vehicle, since the vehicles usually cause a significant difference in chrominance because their colour differs from the background colour. To determine the change in luminance during an illumination change, the ratio of the luminance of the image to the luminance of the corresponding background image is determined, but only if the difference in luminance is larger than a first predetermined value and the difference in chrominance is smaller than a second predetermined value. This reduces the chance that a change in luminance caused by brighter or darker vehicles in the image is erroneously taken as a measure of an illumination change. By repeating the same method steps for the plurality of images, the ratio, being a measure of the illumination change between the image and the background image, can be determined only for images containing mainly background. Images containing mainly foreground objects will not be considered. Hence the accuracy of illumination change estimation will be improved.
Once the ratio of luminance change is obtained, the average illumination change in the image or frame can be determined. The background of the image is then updated using the determined average illumination change so that accurate vehicle detection can be carried out. In a practical situation where the luminance of the image keeps changing, the ratio between the luminance of the image and the luminance of the background image can be determined periodically to update the background image and obtain more accurate results. The illumination corrected output image will then contain only the vehicles 501, 502 and 503 and no portion of the background.
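A minimal sketch of such a periodic update, reusing the illustrative helpers from the earlier sketches; the update interval of 25 frames and the generator-style interface are assumptions, not part of the patent.

    def run_with_periodic_update(frames, bg_y, bg_cr, bg_cb, every=25):
        """Re-estimate the illumination change every `every` frames, rescale the
        background luminance accordingly, and yield a foreground mask per frame."""
        for i, (cur_y, cur_cr, cur_cb) in enumerate(frames):
            if i % every == 0:
                result = estimate_frame_illumination(cur_y, cur_cr, cur_cb,
                                                     bg_y, bg_cr, bg_cb)
                if result is not None:
                    _, bg_y = result     # adopt the illumination-corrected background
            yield subtract_background(cur_y, bg_y)  # mask of vehicles for this frame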
Summarizing, the present invention provides a method of estimating illumination change of images for object detection. The method involves estimating illumination change for images and further changing the luminance of a corresponding background image. The method first involves determining the difference in luminance of a current image and a corresponding background image and also determining the difference in chrominance of the current image and the corresponding background image. Then the ratio between the luminance of the current image and the luminance of the background image is determined if the difference in luminance is larger than a first predetermined value and the difference in chrominance is smaller than a second predetermined value, the determined ratio being a measure of the illumination change between the image and the background image. Further the invention involves changing the luminance of the background image using the determined ratio.
Although the invention has been described with reference to specific embodiments, this description is not meant to be construed in a limiting sense. Various modifications of the disclosed embodiments, as well as alternate embodiments of the invention, will become apparent to persons skilled in the art upon reference to the description of the invention. It is therefore contemplated that such modifications can be made without departing from the spirit or scope of the present invention as defined.

Claims

Patent claims:
1. A method of estimating illumination change of images for object detection comprising the steps of:
(a) determining the difference in luminance of a current image and a corresponding background image;
(b) determining the difference in chrominance of said current image and said corresponding background image; and
(c) determining the ratio between the luminance of the current image and the luminance of the background image if the difference in luminance is larger than a first predetermined value and the difference in chrominance is smaller than a second predetermined value, the determined ratio being a measure of the illumination change between the current image and the background image.
2. The method according to claim 1, further comprising the step of changing the luminance of the background image using the determined ratio.
3. The method according to claim 1, wherein the current image is a part of a frame, said frame comprising a plurality of images, and wherein the steps (a), (b) and (c) are repeated for the plurality of images.
4. The method according to claim 3, further comprising the step of calculating an average illumination change of the frame out of the ratios of the images for which the ratio has been determined.
5. The method according to claim 3 or 4, wherein an illumination change is detected if the number of images for which the ratio has been determined exceeds a predetermined threshold.
6. The method according to any of claims 3 to 5, further comprising the step of updating the luminance of the background images using the average illumination change of the frame.
7. The method according to any of the preceding claims, wherein the image comprises pixels, the luminance of each pixel being determined by one luminance value and the chrominance of each pixel being determined by two chrominance values.
PCT/EP2007/059089 2007-08-31 2007-08-31 Method of estimating illumination change of images for object detection WO2009026966A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/EP2007/059089 WO2009026966A1 (en) 2007-08-31 2007-08-31 Method of estimating illumination change of images for object detection

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/EP2007/059089 WO2009026966A1 (en) 2007-08-31 2007-08-31 Method of estimating illumination change of images for object detection

Publications (1)

Publication Number Publication Date
WO2009026966A1 true WO2009026966A1 (en) 2009-03-05

Family

ID=39389209

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2007/059089 WO2009026966A1 (en) 2007-08-31 2007-08-31 Method of estimating illumination change of images for object detection

Country Status (1)

Country Link
WO (1) WO2009026966A1 (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2011000226A1 (en) * 2009-07-03 2011-01-06 深圳泰山在线科技有限公司 Target detecting method and device and used image collecting apparatus thereof
US20120008858A1 (en) * 2008-12-16 2012-01-12 Staffordshire University Image processing
WO2013064891A1 (en) * 2011-11-01 2013-05-10 Canon Kabushiki Kaisha Method and system for luminance adjustment of images in an image sequence
EP3846440A1 (en) * 2019-12-30 2021-07-07 Axis AB A low-light imaging system

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5748775A (en) * 1994-03-09 1998-05-05 Nippon Telegraph And Telephone Corporation Method and apparatus for moving object extraction based on background subtraction
JP2005018401A (en) * 2003-06-26 2005-01-20 Hitachi Eng Co Ltd Image processor corresponding to illumination change

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5748775A (en) * 1994-03-09 1998-05-05 Nippon Telegraph And Telephone Corporation Method and apparatus for moving object extraction based on background subtraction
JP2005018401A (en) * 2003-06-26 2005-01-20 Hitachi Eng Co Ltd Image processor corresponding to illumination change

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
CUCCHIARA R ET AL: "Detecting objects, shadows and ghosts in video streams by exploiting color and motion information", IMAGE ANALYSIS AND PROCESSING, 2001. PROCEEDINGS. 11TH INTERNATIONAL CONFERENCE ON, SEPT. 26-28, 2001, PISCATAWAY, NJ, USA, IEEE, 26 September 2001 (2001-09-26), pages 360 - 365, XP010561273, ISBN: 978-0-7695-1183-2 *
IKEUCHI K ET AL: "Illumination Normalization with Time-Dependent Intrinsic Images for Video Surveillance", IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, IEEE SERVICE CENTER, LOS ALAMITOS, CA, US, vol. 26, no. 10, 1 October 2004 (2004-10-01), pages 1336 - 1347, XP011116536, ISSN: 0162-8828 *
RAE-HONG PARK ET AL: "Fast local motion-compensation algorithm for video sequences with brightness variations", IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS FOR VIDEO TECHNOLOGY, IEEE SERVICE CENTER, PISCATAWAY, NJ, US, vol. 13, no. 4, 1 April 2003 (2003-04-01), pages 289 - 299, XP011096132, ISSN: 1051-8215 *
YINGHUA HE ET AL: "Background updating in illumination variant scenes", INTELLIGENT TRANSPORTATION SYSTEMS, 2003. PROCEEDINGS. 2003 IEEE OCT. 12-15, 2003, PISCATAWAY, NJ, USA,IEEE, vol. 1, 12 October 2003 (2003-10-12), pages 515 - 519, XP010673621, ISBN: 978-0-7803-8125-4 *

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120008858A1 (en) * 2008-12-16 2012-01-12 Staffordshire University Image processing
US9047677B2 (en) * 2008-12-16 2015-06-02 Staffordshire University Enterprise and Commercial Development Image processing
WO2011000226A1 (en) * 2009-07-03 2011-01-06 深圳泰山在线科技有限公司 Target detecting method and device and used image collecting apparatus thereof
AU2011244921B2 (en) * 2011-11-01 2014-03-27 Canon Kabushiki Kaisha Method and system for luminance adjustment of images in an image sequence
AU2011244921B8 (en) * 2011-11-01 2014-04-03 Canon Kabushiki Kaisha Method and system for luminance adjustment of images in an image sequence
AU2011244921A8 (en) * 2011-11-01 2014-04-03 Canon Kabushiki Kaisha Method and system for luminance adjustment of images in an image sequence
WO2013064891A1 (en) * 2011-11-01 2013-05-10 Canon Kabushiki Kaisha Method and system for luminance adjustment of images in an image sequence
US9609233B2 (en) 2011-11-01 2017-03-28 Canon Kabushiki Kaisha Method and system for luminance adjustment of images in an image sequence
EP3846440A1 (en) * 2019-12-30 2021-07-07 Axis AB A low-light imaging system
KR20210086454A (en) * 2019-12-30 2021-07-08 엑시스 에이비 A low-light imaging system
JP2021114762A (en) * 2019-12-30 2021-08-05 アクシス アーベー Low-light imaging system
US11153546B2 (en) 2019-12-30 2021-10-19 Axis Ab Low-light imaging system
KR102333707B1 (en) 2019-12-30 2021-12-01 엑시스 에이비 A low-light imaging system
JP7057818B2 (en) 2019-12-30 2022-04-20 アクシス アーベー Low light imaging system

Similar Documents

Publication Publication Date Title
US6842536B2 (en) Image processing apparatus, image processing method and computer program product for correcting image obtained by shooting subject
US7599550B1 (en) Method for accurate real-time compensation for changing illumination spectra in digital video cameras
US5245436A (en) Method and apparatus for detecting fades in digital video sequences
CN100539692C (en) From digital moving picture data, remove the method for noise
US8164649B2 (en) White balance adjusting device, imaging apparatus, and recording medium storing white balance adjusting program
US20070047803A1 (en) Image processing device with automatic white balance
US7564491B2 (en) Pixel defect detecting/correcting device and pixel defect detecting/correcting method
JP5205007B2 (en) Light source estimation method and apparatus
US8023010B2 (en) Defective pixel correction device
US20050058341A1 (en) Image quality correction apparatus and image quality correction method
JP5352859B2 (en) Method and apparatus for maintaining video background using multiple Gaussian distribution models
US20080112639A1 (en) Method and apparatus for removing noise in dark area of image
WO2009026966A1 (en) Method of estimating illumination change of images for object detection
CN112652027A (en) Pseudo-color detection algorithm and system
US20070268503A1 (en) Image processing system
US20070159536A1 (en) White balance control method
TWI736599B (en) Method for detection of saturated pixels in an image
US5815159A (en) Spatially mapped monochrome display in a color image
US10567724B2 (en) Dynamic demosaicing of camera pixels
US7688357B2 (en) Method and apparatus for color temperature correction in a built-in camera of a portable terminal
US20210377425A1 (en) Black level drift
CN109726708A (en) A kind of Lane detection method and device
US20190057492A1 (en) Method and image processing device for detecting a portion of an image
US8654220B2 (en) Image processing apparatus and control method for the same
CN113194317A (en) Image processing method and device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 07819997

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 07819997

Country of ref document: EP

Kind code of ref document: A1