WO2007141858A1 - Dirt Detection Method - Google Patents
Dirt Detection Method
- Publication number
- WO2007141858A1 (PCT/JP2006/311527)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- contamination
- pixel
- dirt
- unit
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/40—Analysis of texture
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0004—Industrial image inspection
- G06T7/001—Industrial image inspection using an image reference approach
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/80—Camera processing pipelines; Components thereof
- H04N23/81—Camera processing pipelines; Components thereof for suppressing or minimising disturbance in the image signal generation
- H04N23/811—Camera processing pipelines; Components thereof for suppressing or minimising disturbance in the image signal generation by dust removal, e.g. from surfaces of the image sensor or processing of the image signal output by the electronic image sensor
Definitions
- the present invention relates to a technique for detecting dirt and foreign matter adhering to the externally exposed surface of an image sensor in a device that captures images, such as a surveillance camera or inspection device installed at a fixed point.
- the technique for detecting dirt and foreign matter adhering to the sensor surface described in Patent Document 1 installs a mirror so that the sensor surface of the surveillance camera is reflected at the edge of the captured image, and detects dirt by image processing of the captured image.
- This conventional technique has problems in that installing a mirror outside the sensor increases the size and cost of the apparatus, and performance as a surveillance camera decreases because part of the captured image is hidden.
- the technique for detecting dirt and foreign matter adhering to the sensor surface described in Patent Document 2 drives an imaging system, such as a lens diaphragm, to create different shooting conditions, captures a plurality of images, and determines invariant portions between the images to be dirt or foreign matter.
- This conventional technique increases cost because of movable parts such as the lens diaphragm, and because portions that do not change between the multiple images are determined to be dirt, an unchanging background may also be erroneously determined to be dirt.
- Patent Document 1: Japanese Patent Laid-Open No. 2001-8193
- Patent Document 2: Japanese Patent Laid-Open No. 2004-172820
- the present invention automatically detects a subject to be compared, using a subject region extraction unit, from among the images captured by an image photographing unit that has no movable system and is fixed at a fixed point, and extracts the image region inside the subject. This prevents the background from being mistakenly judged as dirt. A plurality of recent region extraction images, obtained by extracting the image region inside the subject, are held in the region extraction image storage unit. The contamination level calculation unit then compares the stored region extraction images pixel by pixel and, for pixels that exhibit the situation likely to occur when contamination is attached, increases the value of the contamination degree stored for that pixel in the contamination degree storage unit.
- the dirt determination unit outputs, as a determination result, whether or not dirt is attached, or the degree of dirt adhesion, based on the dirt degree information stored in the dirt degree storage unit.
- only contamination that overlaps the subject to be captured and interferes with processing is reliably detected. Unlike the conventional technology, the background is not erroneously determined to be dirt.
- the contamination level calculation unit repeatedly compares multiple images to extract pixels with a high probability of contamination, and based on this probability information, the contamination determination unit determines whether contamination has occurred. As a result, patterns in the captured image that merely look like dirt at first glance are not erroneously judged as dirt.
- initial settings such as capturing a reference image (a reference sheet or the like) in advance before operation become unnecessary, so operation can be started very simply. By checking the output of the dirt determination unit and cleaning or replacing the device, the user can prevent degradation of the performance of the monitoring device caused by continued operation with dirt attached. In addition, the determination result can be used to stop use of the device automatically, preventing the disadvantages of operating with dirt attached.
- FIG. 1 is a block diagram showing a configuration of a dirt detection method according to a first embodiment of the present invention.
- FIG. 2A is a flowchart for explaining the operation of the dirt detection method according to the first embodiment of the present invention.
- FIG. 2B is a flowchart for explaining the operation of the dirt detection method according to the first embodiment of the present invention.
- FIG. 3 is a flowchart for explaining the conceptual operation of the dirt detection method according to the first embodiment of the present invention.
- FIG. 4 is a graph showing an example of a contamination degree function used in the embodiment of the present invention.
- FIG. 5 is a table for calculating the degree of contamination used in the embodiment of the present invention.
- FIG. 6 is an explanatory diagram for calculating a contamination level with reference to the vicinity of a specific pixel used in the embodiment of the present invention.
- FIG. 7 is a diagram for explaining a first dirt determination method performed by a dirt judgment unit according to an embodiment of the present invention.
- FIG. 8 is a flowchart for explaining the operation of the first dirt determination method shown in FIG.
- FIG. 9 is a diagram for explaining a second stain determination method performed by the stain determination unit according to the embodiment of the present invention.
- FIG. 10 is a flowchart for explaining the operation of the second dirt determination method shown in FIG. 9.
- FIG. 11 is a block diagram showing the configuration of the dirt detection method according to the second embodiment of the present invention.
- FIG. 12A is a flowchart for explaining the operation of the dirt detection method according to the second embodiment of the present invention.
- FIG. 12B is a flowchart for explaining the operation of the dirt detection method according to the second embodiment of the present invention.
- FIG. 13 is a block diagram showing a configuration of a dirt detection method according to a third embodiment of the present invention.
- FIG. 14A is a flowchart for explaining the operation of the dirt detection method according to the third embodiment of the present invention.
- FIG. 14B is a flowchart for explaining the operation of the dirt detection method according to the third embodiment of the present invention.
- FIG. 14C is a flowchart for explaining the operation of the dirt detection method according to the third embodiment of the present invention.
- FIG. 15 is a block diagram showing a configuration of a dirt detection method according to a fourth embodiment of the present invention.
- FIG. 16A is a flowchart for explaining the operation of the dirt detection method according to the fourth embodiment of the present invention.
- FIG. 16B is a flowchart for explaining the operation of the dirt detection method according to the fourth embodiment of the present invention.
- FIG. 17 is a flowchart for explaining an operation of a dirt degree storage initialization unit according to the fourth embodiment of the present invention.
- FIG. 1 is a block diagram showing a configuration of a dirt detection method according to the first embodiment of the present invention.
- the dirt detection method according to the first embodiment of the present invention includes an image photographing unit 11 that has no movable system, is fixed at a fixed point for monitoring or the like, and continuously captures images;
- a captured image storage unit 12 for storing images captured by the image photographing unit 11; a subject region extraction unit 13 that detects that a target subject (for example, a vehicle) appears in the captured image stored in the captured image storage unit 12,
- extracts the region of the subject, and generates a region extraction image; a region extraction image storage unit 14 that accumulates two or more recent extracted images from the subject region extraction unit 13;
- a contamination degree calculation unit 15 that compares the plurality of region extraction images stored in the region extraction image storage unit 14 for each pixel and calculates the contamination degree; a contamination degree storage unit 16 that stores the calculated contamination degree for each pixel;
- and a contamination determination unit 17 that evaluates the contamination degree of each pixel read from the contamination degree storage unit 16 and outputs the presence or absence of contamination or the degree of contamination.
- In step S11 of FIG. 2A, the image photographing unit 11 captures the latest image I.
- In step S12, the captured image I is stored in the captured image storage unit 12.
- In step S13, the subject region extraction unit 13 performs image processing on the image I stored in the captured image storage unit 12, extracts the region of the subject to be photographed, and generates a region extraction image E1.
- In step S14, the extracted subject-region image E1 is stored in the region extraction image storage unit 14. If an image is already stored as E1 in the region extraction image storage unit 14, that image is kept as the previous image E0.
- In step S15, the dirt degree calculation unit 15 initializes a pixel coordinate variable (x, y), which designates one arbitrary pixel in the image, to point to the first pixel (such as the upper left corner).
- In step S16, the pixel value A1 of the pixel indicated by (x, y) is read from the latest image E1 stored in the region extraction image storage unit 14.
- In step S17, the pixel value A0 of the pixel indicated by (x, y) is read from the previous image E0 stored in the region extraction image storage unit 14.
- In step S18, an instantaneous contamination level p for the pixel is calculated from the pixel values A1 and A0, indicating how likely it is that the pixel is contaminated.
- In step S19 of FIG. 2B, the stored contamination degree P of the pixel indicated by (x, y) is read from the contamination degree storage unit 16.
- In step S20, a new contamination level P' is calculated from the contamination levels P and p, and in step S21 the value P' is written back to the contamination degree storage unit 16 for the pixel (x, y).
- In step S22, it is determined whether the pixel coordinate variable (x, y) points to the last pixel in the image, that is, whether all pixels have been processed. If not, the process proceeds to step S23, where (x, y) is updated to point to the next pixel, and the process returns to step S16 in FIG. 2A. If all pixels have been processed, the process proceeds to step S24, where the dirt determination unit 17 obtains the value P_sum, the sum of the dirt degrees of all pixels stored in the dirt degree storage unit 16, and compares it with the dirt determination threshold P_th. In step S25, it is determined whether P_sum is greater than the threshold P_th.
- If so, in step S26 it is determined that "dirt is attached", the determination result is output, and the process ends.
- Otherwise, in step S27 it is determined that "dirt is not attached", and the process returns to step S11 in FIG. 2A.
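The per-pixel loop of steps S11 through S27 can be sketched as follows. The patent leaves the contamination function open, so the specific shapes below (exponential agreement and darkness terms, the blending gain, the scale constants, and the threshold values) are illustrative assumptions, not the claimed method:

```python
import numpy as np

def instantaneous_dirt(a1, a0, diff_scale=16.0, dark_scale=64.0):
    """Per-pixel dirt likelihood p (step S18, sketched): high when the two
    frames agree at this pixel (small |a1 - a0|) and the pixel is dark,
    the way adhering dirt tends to appear."""
    agree = np.exp(-np.abs(a1 - a0) / diff_scale)
    dark = np.exp(-np.minimum(a1, a0) / dark_scale)
    return agree * dark  # values in (0, 1]

def update_dirt_degree(P, a1, a0, gain=0.1):
    """Steps S19-S21, sketched: blend the stored per-pixel dirt degree P
    toward the instantaneous estimate; repeated agreement raises P,
    while change between frames lowers it."""
    return (1.0 - gain) * P + gain * instantaneous_dirt(a1, a0)

def dirt_attached(P, p_sum_thresh):
    """Steps S24-S25: compare the summed dirt degree with a threshold."""
    return float(P.sum()) > p_sum_thresh
```

With a dark, unchanging patch pasted into two otherwise different region images, repeated updates drive the patch pixels' dirt degree toward 1 while the changing background stays low.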
- FIG. 3 is a flowchart for explaining the conceptual operation of the dirt detection method according to the first embodiment of the present invention described above.
- the image capturing unit 11 is an example of a fixed point camera, and images are continuously captured by the camera, and the captured images are continuously stored in the captured image storage unit 12.
- the subject region extraction unit 13 extracts only the subject region and stores it in the region extraction image storage unit 14.
- only the region of the subject to be photographed is set as the target region for dirt detection, so that only dirt and foreign objects overlapping the subject are detected, and the background appearing around the subject is not erroneously detected as dirt or a foreign object.
- the degree-of-dirt calculation unit 15 compares each pixel within the region extracted from each of two or more photographed images.
- utilizing the property that attached dirt produces the same pixel value at the same position even when different subjects are photographed, such pixels are recorded in the dirt degree storage unit 16 as dirt candidates.
- the contamination level is not determined by a single comparison; instead, it is increased or decreased based on each comparison result.
- Let the contamination level be P, let A(x, y) be the pixel value at an arbitrary coordinate (x, y) of the latest captured image, and let B(x, y) be the pixel value at the same coordinate of the previous captured image. The content of the function P = f(A(x, y), B(x, y)) is arbitrary; as shown in FIG. 4, a function is used whose characteristic is that the contamination degree P becomes maximum when the absolute value of the pixel-value difference |A(x, y) - B(x, y)| is small.
- FIG. 5 is a table for a dirt detection filter process in which the dirt level P increases when the pixel is close to black and A(x, y) and B(x, y) are close in value.
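A FIG. 5 style table can be sketched as a small lookup over quantized pixel values. The table entries and the four-band quantization below are illustrative assumptions; the patent only requires that the score be highest when both pixels are dark and close in value:

```python
import numpy as np

# Hypothetical 4x4 lookup table: rows index the latest value A, columns the
# previous value B, each quantized into 4 bands (0-63, 64-127, 128-191,
# 192-255). Scores peak in the darkest band and along the diagonal (A near B).
DIRT_TABLE = np.array([
    [1.0, 0.2, 0.0, 0.0],
    [0.2, 0.6, 0.2, 0.0],
    [0.0, 0.2, 0.3, 0.1],
    [0.0, 0.0, 0.1, 0.1],
])

def dirt_score(a, b):
    """Quantize both 8-bit pixel values into 4 bands and look up the score."""
    return DIRT_TABLE[min(int(a) // 64, 3), min(int(b) // 64, 3)]
```

A table lookup avoids per-pixel floating-point evaluation, which suits embedded camera pipelines.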
- Fig. 6 illustrates a third method for calculating the degree of contamination.
- the degree of contamination is judged to be large only when the central pixel has the same pixel value in both images and the neighboring pixels are also similar, matching the characteristic image pattern seen when dirt is attached.
- if the image pattern is not characteristic of dirt adhesion, the degree of contamination is judged to be small even if the surrounding patterns are similar.
- even if the central pixel has the same pixel value, the degree of contamination is judged to be small if the pattern is not the characteristic one seen when dirt is attached. As shown in Fig. 6(b), if the central pixel has the same pixel value but the neighboring pixels are not similar, the degree of contamination is judged to be small.
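The neighborhood check of the third method can be sketched as follows. The tolerances and patch radius are assumptions; the point is that an unchanged center inside a changing neighborhood (the Fig. 6(b) case) scores low:

```python
import numpy as np

def neighborhood_dirt(a_img, b_img, x, y, r=1, center_tol=2.0, patch_tol=4.0):
    """Third-method sketch (FIG. 6): return a high dirt score only when the
    center pixel is unchanged between the two frames AND the surrounding
    patch is also similar. Assumes (x, y) is at least r pixels from the
    image border."""
    if abs(float(a_img[y, x]) - float(b_img[y, x])) > center_tol:
        return 0.0  # center changed: not a stable dirt candidate
    pa = a_img[y - r:y + r + 1, x - r:x + r + 1].astype(float)
    pb = b_img[y - r:y + r + 1, x - r:x + r + 1].astype(float)
    if np.abs(pa - pb).mean() > patch_tol:
        return 0.0  # neighborhood differs: likely real scene content
    return 1.0
```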
- the first dirt determination method obtains the total dirt degree of the entire image stored in the dirt degree storage unit, and when the total exceeds a predetermined threshold P_sum_thresh, it is determined that dirt is attached. With this method, even thin, faint dirt spread over the entire sensor surface can be determined to be dirt.
- FIG. 8 is a flow chart for explaining the first dirt determination method according to the embodiment of the present invention.
- In step S31 of FIG. 8, the total value P_sum of the dirt degree of the entire image is initialized.
- In step S32, a pixel coordinate variable (x, y) indicating an arbitrary pixel in the image is initialized to point to the first pixel (such as the upper left corner).
- In step S33, the dirt degree P(x, y) of the pixel indicated by (x, y) is read from the dirt degree storage unit.
- In step S34, the read dirt degree P(x, y) is added to the total value P_sum of the dirt degree of the entire image, updating P_sum.
- In step S35, it is determined whether the pixel coordinate variable (x, y) points to the last pixel in the image, that is, whether all pixels have been processed.
- If all pixels have not been processed, the process proceeds to step S36, where the pixel coordinate variable (x, y) is updated to point to the next pixel, and the process returns to step S33.
- If all pixels have been processed, it is determined in step S37 whether the total value P_sum of the dirt degree of the entire image exceeds the threshold P_sum_thresh. If it does, the process proceeds to step S38, where it is determined that "dirt is attached" and the dirt determination ends. Otherwise, the process proceeds to step S39, where it is determined that "dirt is not attached" and the dirt determination ends.
- the second dirt determination method sequentially checks the dirt degree of the entire image stored in the dirt degree storage unit, counts the number N of pixels whose dirt degree exceeds a predetermined threshold P_thresh, and determines that dirt is attached when N exceeds a predetermined threshold N_thresh. With this determination method, dirt can be detected when there is a dark, clearly identifiable dirty part. Note that instead of the total number N of pixels whose dirt degree exceeds the threshold, the determination may be made based on the total area of such pixels.
- FIG. 10 is a flow chart for explaining the second dirt determination method according to the embodiment of the present invention.
- In step S41 of FIG. 10, the count N of pixels having a dirt degree larger than the threshold P_thresh is initialized.
- In step S42, a pixel coordinate variable (x, y) indicating an arbitrary pixel in the image is initialized to point to the first pixel (such as the upper left corner).
- In step S43, the dirt degree P(x, y) of the pixel indicated by (x, y) is read from the dirt degree storage unit.
- In step S44, it is determined whether the dirt degree P(x, y) is larger than the threshold P_thresh.
- If it is not, the process proceeds to step S45, where the pixel coordinate variable (x, y) is updated to point to the next pixel, and the process returns to step S43. If it is larger, the process proceeds to step S46, where the count N is incremented. In step S47, it is determined whether the pixel coordinate variable (x, y) points to the last pixel in the image, that is, whether all pixels have been processed. If all pixels have not been processed, the process proceeds to step S45, where (x, y) is updated to point to the next pixel, and the process returns to step S43.
- If all pixels have been processed, it is determined in step S48 whether the count N exceeds the predetermined threshold N_thresh. If it does, the process proceeds to step S49, where it is determined that "dirt is attached" and the dirt determination ends. Otherwise, the process proceeds to step S50, where it is determined that "dirt is not attached" and the dirt determination ends.
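The second determination method, and the area-based variant the text mentions, can be sketched directly over the stored dirt-degree map. The threshold values in the test are illustrative assumptions:

```python
import numpy as np

def dirt_by_pixel_count(P, p_thresh, n_thresh):
    """Second determination method (steps S41-S50): count pixels whose dirt
    degree exceeds p_thresh; report dirt when the count exceeds n_thresh."""
    n = int((P > p_thresh).sum())
    return n > n_thresh

def dirt_by_area(P, p_thresh, area_thresh, pixel_area=1.0):
    """Variant noted in the text: decide on the total area covered by the
    over-threshold pixels instead of their count."""
    return (P > p_thresh).sum() * pixel_area > area_thresh
```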
- When the dirt determination unit determines dirt as described above, it outputs the degree of dirt adhesion along with the presence or absence of dirt. Because both are output, a warning can be reliably issued before dirt is judged to be attached, and by logging these outputs, changes in the sensor state can be tracked.
- FIG. 11 is a block diagram showing the configuration of the dirt detection method according to the second embodiment of the present invention.
- the dirt detection method according to the second embodiment of the present invention includes an image capturing unit 21 that has no movable system, is fixed at a fixed point for monitoring or the like, and continuously captures images;
- a captured image storage unit 22 for storing images captured by the image capturing unit 21; a subject region extraction unit 23 that detects that a target subject appears in the captured image stored in the captured image storage unit 22 and generates a region extraction image;
- a region extraction image storage unit 24 that accumulates the extracted images, and a region extraction image correction unit 25 that performs luminance correction on the region extraction images and re-stores them in the region extraction image storage unit 24; the luminance-corrected region extraction images accumulated in the region extraction image storage unit 24 are compared for each pixel by
- a contamination degree calculation unit 26 that calculates the degree of contamination; a contamination degree storage unit 27 stores the calculated degree for each pixel, and a contamination determination unit 28 evaluates the contamination degree of each pixel read from the contamination degree storage unit 27 to determine dirt adhesion.
- FIGS. 12A and 12B are flowcharts for explaining the operation of the dirt detection method according to the second embodiment of the present invention.
- In step S51 of FIG. 12A, the image capturing unit 21 captures the latest image I.
- In step S52, the captured image I is stored in the captured image storage unit 22.
- In step S53, the subject region extraction unit 23 performs image processing on the image I stored in the captured image storage unit 22, extracts the region of the subject to be photographed, and generates a region extraction image E1.
- In step S54, the extracted subject-region image E1 is stored in the region extraction image storage unit 24. If an image is already stored as E1 in the region extraction image storage unit 24, that image is kept as the previous image E0.
- In step S55, the region extraction image correction unit 25 calculates the total luminance V of all pixels of the region extraction image E1, corrects the brightness by multiplying every pixel of the region extraction image by the ratio of a reference luminance total to V, and stores the result again in the region extraction image storage unit 24.
- In step S56, the dirt degree calculation unit 26 initializes a pixel coordinate variable (x, y), which designates one arbitrary pixel in the image, to point to the first pixel (such as the upper left corner).
- In step S57, the pixel value A1 of the pixel indicated by (x, y) is read from the latest image E1 stored in the region extraction image storage unit 24.
- In step S58, the pixel value A0 of the pixel indicated by (x, y) is read from the previous image E0 stored in the region extraction image storage unit 24.
- In step S59 of FIG. 12B, an instantaneous contamination level p for the pixel is calculated from the pixel values A1 and A0.
- In step S60, the stored contamination degree P of the pixel indicated by (x, y) is read from the contamination degree storage unit 27.
- In step S61, a new contamination level P' is calculated from P and p.
- In step S62, the value P' is written to the contamination degree storage unit 27 as the dirt degree of the pixel indicated by (x, y).
- In step S63, it is determined whether (x, y) points to the last pixel in the image, that is, whether all pixels have been processed. If not, the process proceeds to step S64, where (x, y) is updated to point to the next pixel, and the process returns to step S57 in FIG. 12A. If all pixels have been processed, the process proceeds to step S65, where the dirt determination unit 28 obtains the sum P_sum of the dirt degrees of all pixels stored in the dirt degree storage unit 27 and compares it with the dirt determination threshold P_th. In step S66, it is determined whether P_sum is greater than the threshold P_th.
- If so, in step S67 it is determined that "dirt is attached", the determination result is output, and the process ends.
- Otherwise, in step S68 it is determined that "dirt is not attached", and the process returns to step S51 in FIG. 12A.
- As described above, by correcting the pixel values to be compared based on the brightness level of the region extraction image before calculating the dirt level, an appropriate dirt level can be calculated even when the shooting conditions are not identical, since the compared pixels become close to those taken under the same shooting conditions. In addition, before calculating the dirt level, the accuracy of the dirt determination can be improved by lowering the dirt level of pixels whose values clearly indicate that they are unaffected by dirt, such as high-brightness pixels.
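The step S55 luminance correction can be sketched as a simple total-luminance normalization. The choice of reference total is an assumption here; the patent only requires that the compared images be brought to a common brightness level:

```python
import numpy as np

def normalize_luminance(img, ref_total):
    """Second-embodiment correction (step S55, sketched): scale the region
    image so its total luminance matches a reference total, making frames
    taken under different lighting comparable pixel by pixel."""
    v = float(img.sum())
    if v == 0.0:
        return img.astype(float)  # avoid division by zero on an empty region
    return img.astype(float) * (ref_total / v)
```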
- FIG. 13 is a block diagram showing a configuration of a dirt detection method according to the third embodiment of the present invention.
- the dirt detection method according to the third embodiment of the present invention includes an image photographing unit 31 that has no movable system, is fixed at a fixed point for monitoring or the like, and continuously captures images;
- a captured image storage unit 32 for storing images captured by the image photographing unit 31; a subject region extraction unit 33 that detects that a target subject appears in the captured image stored in the captured image storage unit 32,
- generates a region extraction image, and also generates extraction region mask information that distinguishes whether each pixel near the boundary of the extracted region lies inside the subject region;
- a region extraction image storage unit 34 that stores a plurality of recent region extraction images together with the extraction region mask information; a dirt degree calculation unit 35 that compares the stored region extraction images for each pixel and calculates the dirt degree; a dirt degree storage unit 36 that stores the calculated dirt degree for each pixel; and a dirt determination unit 37 that, when referring to the dirt degree of each pixel stored in the dirt degree storage unit 36, consults the extraction region mask information stored in the region extraction image storage unit 34 to determine whether the pixel lies inside the subject region and, if it does, determines dirt adhesion while giving greater weight to the dirt degree of that pixel.
- FIG. 14A, FIG. 14B, and FIG. 14C are flowcharts for explaining the operation of the dirt detection method according to the third embodiment of the present invention.
- In step S71 of FIG. 14A, the image photographing unit 31 captures the latest image I.
- In step S72, the captured image I is stored in the captured image storage unit 32.
- In step S73, the subject region extraction unit 33 performs image processing on the image I stored in the captured image storage unit 32, extracts the subject region to be photographed, and generates a region extraction image E1 and extraction region mask information M1.
- In step S74, the extracted subject-region image E1 and the extraction region mask information M1 are stored in the region extraction image storage unit 34.
- Any image and mask information already stored as E1 and M1 in the region extraction image storage unit 34 are kept as the previous image and mask information E0 and M0.
- In step S75, the dirt degree calculation unit 35 initializes a pixel coordinate variable (x, y), which designates an arbitrary pixel in the image, to point to the first pixel (e.g., the upper left corner).
- In step S76, the pixel value A1 of the pixel indicated by (x, y) is read from the latest image E1 stored in the region extraction image storage unit 34.
- In step S77, the pixel value A0 of the pixel indicated by (x, y) is read from the previous image E0 stored in the region extraction image storage unit 34.
- In step S78, an instantaneous contamination level p for the pixel is calculated from the pixel values A1 and A0.
- In step S79 of FIG. 14B, the stored contamination degree P of the pixel indicated by (x, y) is read from the contamination degree storage unit 36.
- In step S80, a new contamination level P' is calculated from P and p.
- In step S81, P' is written as the dirt degree of the pixel indicated by (x, y) in the contamination degree storage unit 36.
- In step S82, it is determined whether (x, y) points to the last pixel in the image, that is, whether all pixels have been processed. If not, the process proceeds to step S83, where (x, y) is updated to point to the next pixel, and the process returns to step S76 in FIG. 14A.
- If all pixels have been processed, the process proceeds to step S84, where the pixel coordinate variable (x, y) is initialized again.
- step S85 the contamination level P (x, y) of the pixel coordinate variable (x, y) is read from the contamination level storage unit.
- step S86 the extraction area mask information M (x, y) of the pixel coordinate variable (x, y) is read from the area extraction storage unit 34.
- step S87 of FIG. 14C if the extracted area mask information M (x, y) is greater than 0, the degree of contamination P (x, y) is multiplied by a factor K to obtain a new degree of contamination P (x, y). If the extracted area mask information M (x, y) is 0, the contamination level P (x, y) is multiplied by K to obtain a new contamination level P (x, y).
- coefficient K is a coefficient
- step S88 it is determined whether the pixel coordinate variable (x, y) is the last pixel in the image, that is, whether all pixels have been processed. If all the pixels have not been processed, the process proceeds to step S89. In step S89, the pixel coordinate variable (x, y) is updated to indicate the next pixel, and the process returns to step S85 in FIG. 14B. On the other hand, if all pixels have been processed, the process proceeds to step S90. In step 90, the contamination level P (x, y) of all pixels is summed to calculate a value P, and the contamination determination threshold P
- in step S91, it is determined whether the value P is larger than the threshold Pth. If it is, the process proceeds to step S92, where it is determined that "dirt is adhered", and the determination result is output. Otherwise, it is determined in step S93 that "dirt is not adhered", and the process returns to step S71 in FIG. 14A.
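The weighting of steps S85-S93 can be sketched as follows. The array shapes, concrete coefficient values, and the function name are illustrative assumptions; the grounded behavior is only that pixels inside the extracted subject mask receive the larger coefficient before the levels are summed and compared with the threshold.

```python
import numpy as np

def weight_and_judge(P, M, k_subject, k_background, p_th):
    """Sketch of steps S85-S93: weight each pixel's contamination level
    by whether it falls inside the extracted subject region (mask M > 0),
    sum the result, and compare the total against a threshold."""
    # Pixels overlapping the subject get the larger coefficient, so dirt
    # over the subject dominates the decision (step S87).
    weighted = np.where(M > 0, P * k_subject, P * k_background)
    total = weighted.sum()            # step S90: sum over all pixels
    return total > p_th, weighted     # step S91: "dirt is adhered"?

# Hypothetical 3x3 example: one strongly contaminated pixel inside the mask.
P = np.array([[0.0, 0.0, 0.0],
              [0.0, 5.0, 0.0],
              [0.0, 0.0, 1.0]])
M = np.array([[0, 0, 0],
              [0, 1, 0],
              [0, 0, 0]])
dirty, _ = weight_and_judge(P, M, k_subject=2.0, k_background=0.5, p_th=8.0)
print(dirty)  # True: 5.0*2.0 + 1.0*0.5 = 10.5 > 8.0
```

With a larger subject coefficient, the same contamination at the image edge (outside the mask) would contribute only 0.5 per unit and could stay below the threshold, matching the embodiment's emphasis on dirt overlapping the subject.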
- as described above, in the third embodiment the subject region extraction unit extracts the subject region, so that dirt which does not overlap the subject (for example, dirt at the edge of the image) is given less weight in the determination. This makes it possible to focus detection on dirt that overlaps the subject rather than on dirt adhering to an arbitrary part of the image.
- FIG. 15 is a block diagram showing a configuration of a dirt detection method according to the fourth embodiment of the present invention.
- the dirt detection method according to the fourth embodiment of the present invention includes an image capturing unit 41 that is fixed at a fixed point, has no movable system, and continuously captures images for monitoring or the like,
- a captured image storage unit 42 that stores the images captured by the image capturing unit 41,
- a subject region extraction unit 43 that detects that a target subject (for example, a vehicle) appears in a captured image stored in the captured image storage unit 42 and extracts the region of the subject as a region extraction image,
- a region extraction image storage unit 44 that stores the two or more most recent region extraction images produced by the subject region extraction unit 43,
- a contamination level calculation unit 45 that compares, pixel by pixel, the region extraction images accumulated in the region extraction image storage unit 44 to calculate a contamination level, a contamination level storage unit 46 that stores the calculated contamination level for each pixel, a stain determination unit 47 that reads the stored contamination levels and determines whether dirt is adhered, and a contamination degree storage initialization unit 48 that periodically initializes the contamination level storage unit 46.
- the timer 49 outputs a periodic timer output to the contamination degree storage initialization unit 48.
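As a reading aid, the block diagram of FIG. 15 can be condensed into a small pipeline. Every name below, the two-image buffer depth, and the scalar contamination store are illustrative assumptions rather than the patent's implementation.

```python
from collections import deque

class StainDetectionPipeline:
    """Wiring sketch of FIG. 15: capture (41) -> captured image storage (42)
    -> subject region extraction (43) -> two-deep region image storage (44)
    -> contamination calculation (45) feeding a contamination store (46)."""
    def __init__(self, capture, extract, calc_contamination):
        self.capture = capture                # image capturing unit 41
        self.extract = extract                # subject region extraction unit 43
        self.calc = calc_contamination        # contamination level calculation unit 45
        self.region_images = deque(maxlen=2)  # storage unit 44: previous + latest
        self.levels = 0.0                     # unit 46, reduced to a scalar here

    def step(self):
        image = self.capture()                # stored in unit 42 (elided here)
        self.region_images.append(self.extract(image))
        if len(self.region_images) == 2:      # need a previous image to compare
            prev, latest = self.region_images
            self.levels += self.calc(latest, prev)
        return self.levels

# Toy run with stub units: each "frame" is a number, and contamination
# accrues when consecutive extracted frames are identical.
frames = iter([5, 5, 9])
pipe = StainDetectionPipeline(
    capture=lambda: next(frames),
    extract=lambda img: img,
    calc_contamination=lambda a, b: 1.0 if a == b else 0.0,
)
print([pipe.step() for _ in range(3)])  # [0.0, 1.0, 1.0]
```

The two-deep buffer mirrors the storage unit 44 holding "the latest image E" and "the previous image E"; a real implementation would compare full pixel arrays rather than scalars.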
- FIG. 16A and FIG. 16B are flowcharts for explaining the operation of the dirt detection method according to the fourth embodiment of the present invention.
- the image capturing unit 41 captures the latest image I.
- the captured image I is stored in the captured image storage unit 42.
- the subject region extraction unit 43 performs image processing on the image I stored in the photographed image storage unit 42, extracts the region of the subject to be photographed, and generates a region extraction image E.
- the extracted image E of the subject region is stored in the region extraction image storage unit 44. If an image is already stored there as the latest image E, that image is retained as the previous image E.
- in step S105, it is checked whether the contamination level storage unit 46 is locked. After confirming that the unit 46 is not locked, the process proceeds to step S106.
- in step S106, the contamination level storage unit 46 is locked.
- the contamination level calculation unit 45 initializes a pixel coordinate variable (x, y), which can point to an arbitrary pixel in the image, so that it points to the first pixel (for example, the upper left corner).
- in step S108, the pixel value of the pixel indicated by the pixel coordinate variable (x, y) is read from the latest image E stored in the region extraction image storage unit 44.
- in step S109, the pixel value at the same pixel coordinates is read from the previous image E stored in the region extraction image storage unit 44.
- in step S110 of FIG. 16B, the contamination level P of the pixel is calculated from the two pixel values read in steps S108 and S109; the contamination level P is a function F of the pixel value of the latest image and the pixel value of the previous image.
- in step S111, the contamination level P of the pixel indicated by the pixel coordinate variable (x, y) is read from the contamination level storage unit 46.
- in step S112, a new contamination level P' is calculated from the stored contamination level and the value obtained in step S110.
- in step S113, P' is written as the contamination level of the pixel indicated by the pixel coordinate variable (x, y) in the contamination level storage unit 46.
- in step S114, it is determined whether the pixel coordinate variable (x, y) points to the last pixel in the image, that is, whether all pixels have been processed. If not, the process proceeds to step S115, where the pixel coordinate variable (x, y) is updated to point to the next pixel, and the process returns to step S108 in FIG. 16A.
- on the other hand, if all pixels have been processed, the process proceeds to step S116, where the stain determination unit 47 calculates the value P, the sum of the contamination levels of all pixels stored in the contamination level storage unit 46, for comparison with the stain determination threshold Pth.
- in step S117, the lock on the contamination level storage unit 46 is released.
- in step S118, it is determined whether the value P is larger than the threshold Pth. If it is larger, the process proceeds to step S119.
- in step S119, it is determined that "dirt is adhered"; the determination result is output, and the processing is terminated.
- otherwise, the process proceeds to step S120, where it is determined that "dirt is not adhered", and the process returns to step S101 in FIG. 16A.
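One plausible shape for the function F of step S110 is to reward pixels that barely change between two images of different subjects, since a pixel occluded by dirt adhered to the sensor stays nearly constant regardless of the scene. The threshold-style F below and its constants are assumptions, not the patent's definition.

```python
import numpy as np

def contamination_step(levels, latest, previous, diff_floor=8):
    """One pass of steps S108-S113 over the whole image: evaluate
    F(latest pixel, previous pixel) per pixel and add it to the
    stored contamination levels."""
    # Assumed F: contributes 1 where the pixel barely changed between two
    # different subject images (suspicious for adhered dirt), 0 elsewhere.
    F = np.abs(latest.astype(int) - previous.astype(int)) < diff_floor
    return levels + F.astype(float)

def judge(levels, p_th):
    """Steps S116-S119: sum the per-pixel levels and compare to a threshold."""
    return levels.sum() > p_th

levels = np.zeros((2, 2))
a = np.array([[10, 200], [30, 40]], dtype=np.uint8)   # latest subject image
b = np.array([[12, 90], [31, 200]], dtype=np.uint8)   # previous, different subject
levels = contamination_step(levels, a, b)
print(judge(levels, p_th=1.5))  # True: two pixels barely changed, sum = 2.0
```

In continuous operation the update would run on every new subject image, so a genuinely dirty pixel accumulates contamination across many frame pairs while scene pixels average out, which is why the periodic initialization of the storage unit matters.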
- FIG. 17 is a flowchart for explaining the operation of the contamination degree storage initialization unit according to the fourth embodiment of the present invention.
- the processing of FIG. 17 is started by the periodic timer output of the timer 49.
- in step S121, it is confirmed that the contamination level storage unit 46 is not locked, and the process proceeds to step S122.
- step S122 the contamination degree storage unit 46 is locked.
- step S123 the contamination level values of all the pixels in the contamination level storage unit 46 are initialized to zero.
- in step S124, the lock on the contamination level storage unit 46 is released. This completes the contamination degree storage initialization operation.
- as described above, the stored contents of the contamination level storage unit are periodically initialized, which prevents stale information from accumulating in the contamination level storage unit during long-term operation and maintains the accuracy of the dirt determination.
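The lock-then-clear protocol of FIG. 17, together with the matching lock handling in steps S105-S106 and S117, can be sketched with a shared lock guarding the contamination store. The class name, storage shape, and timer period are assumptions.

```python
import threading
import numpy as np

class ContaminationStore:
    """Sketch of the contamination level storage unit (46) with the locking
    used by both the calculation path (steps S105-S106, S117) and the
    periodic initializer (steps S121-S124)."""
    def __init__(self, shape):
        self.levels = np.zeros(shape)
        self.lock = threading.Lock()   # the "locked" check of steps S105/S121

    def add(self, delta):
        with self.lock:                # steps S105-S106: wait, then lock
            self.levels += delta       # steps S108-S113, condensed
                                       # step S117: lock released on exit

    def initialize(self):
        with self.lock:                # steps S121-S122
            self.levels[:] = 0.0       # step S123: clear all pixel levels
                                       # step S124: lock released on exit

store = ContaminationStore((2, 2))
store.add(np.ones((2, 2)))
# A periodic reset as driven by timer 49 (the period here is an assumed value):
timer = threading.Timer(0.01, store.initialize)
timer.start(); timer.join()
print(store.levels.sum())  # 0.0 after the timed initialization
```

The `with self.lock:` blocks serialize the calculation loop and the timer-driven reset, which is exactly the race the flowcharts guard against: without the lock, the initializer could clear levels halfway through a per-pixel update pass.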
- although the embodiments have been described for use in a surveillance device, the present invention is not limited to that application; it can be applied in various fields, such as product inspection devices and biometric authentication devices that use an image sensor.
Landscapes
- Engineering & Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Quality & Reliability (AREA)
- Image Analysis (AREA)
- Studio Devices (AREA)
- Image Processing (AREA)
- Closed-Circuit Television Systems (AREA)
Priority Applications (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP06766500.0A EP2031557B1 (en) | 2006-06-08 | 2006-06-08 | Uncleanness detecting device |
JP2008520094A JP4644283B2 (ja) | 2006-06-08 | 2006-06-08 | 汚れ検出方式 |
CN2006800548522A CN101460971B (zh) | 2006-06-08 | 2006-06-08 | 污物检测装置 |
PCT/JP2006/311527 WO2007141858A1 (ja) | 2006-06-08 | 2006-06-08 | 汚れ検出方式 |
KR1020087029606A KR101078474B1 (ko) | 2006-06-08 | 2006-06-08 | 오염 검출 장치 |
US12/329,860 US8098302B2 (en) | 2006-06-08 | 2008-12-08 | Stain detection system |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2006/311527 WO2007141858A1 (ja) | 2006-06-08 | 2006-06-08 | 汚れ検出方式 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/329,860 Continuation US8098302B2 (en) | 2006-06-08 | 2008-12-08 | Stain detection system |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2007141858A1 true WO2007141858A1 (ja) | 2007-12-13 |
Family
ID=38801130
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2006/311527 WO2007141858A1 (ja) | 2006-06-08 | 2006-06-08 | 汚れ検出方式 |
Country Status (6)
Country | Link |
---|---|
US (1) | US8098302B2 (ja) |
EP (1) | EP2031557B1 (ja) |
JP (1) | JP4644283B2 (ja) |
KR (1) | KR101078474B1 (ja) |
CN (1) | CN101460971B (ja) |
WO (1) | WO2007141858A1 (ja) |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2010016650A (ja) * | 2008-07-03 | 2010-01-21 | Canon Inc | 撮像装置及びその制御方法及びプログラム |
EP3493111A1 (en) | 2017-12-01 | 2019-06-05 | Fujitsu Limited | Biometric image processing apparatus, method and program |
CN111275022A (zh) * | 2020-03-19 | 2020-06-12 | 山东宜佳成新材料有限责任公司 | 基于遗忘因子型经验模态分解的污渍检测分析方法及应用 |
JP2020139730A (ja) * | 2019-02-27 | 2020-09-03 | ダイキン工業株式会社 | 情報提供システム |
JP2021052233A (ja) * | 2019-09-20 | 2021-04-01 | 株式会社デンソーテン | 付着物検出装置および付着物検出方法 |
Families Citing this family (31)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101964873B (zh) * | 2009-07-21 | 2014-08-20 | 株式会社尼康 | 图像处理装置、图像处理程序及摄像装置 |
JP2011078047A (ja) * | 2009-10-02 | 2011-04-14 | Sanyo Electric Co Ltd | 撮像装置 |
JP5725012B2 (ja) * | 2010-03-04 | 2015-05-27 | 日本電気株式会社 | 異物判定装置、異物判定方法および異物判定プログラム |
JP5953658B2 (ja) * | 2011-05-25 | 2016-07-20 | ソニー株式会社 | ロボット制御装置及びロボット装置の制御方法、コンピューター・プログラム、プログラム記憶媒体、並びにロボット装置 |
JP2014011785A (ja) * | 2012-07-03 | 2014-01-20 | Clarion Co Ltd | 車載カメラ汚れ除去装置の診断装置、診断方法及び車両システム |
CN104509102B (zh) * | 2012-07-27 | 2017-12-29 | 日产自动车株式会社 | 三维物体检测装置和异物检测装置 |
JP6102213B2 (ja) * | 2012-11-22 | 2017-03-29 | 富士通株式会社 | 画像処理装置、画像処理方法および画像処理プログラム |
US10823592B2 (en) | 2013-09-26 | 2020-11-03 | Rosemount Inc. | Process device with process variable measurement using image capture device |
US11076113B2 (en) | 2013-09-26 | 2021-07-27 | Rosemount Inc. | Industrial process diagnostics using infrared thermal sensing |
US10638093B2 (en) * | 2013-09-26 | 2020-04-28 | Rosemount Inc. | Wireless industrial process field device with imaging |
CN105389577A (zh) * | 2014-08-25 | 2016-03-09 | 中兴通讯股份有限公司 | 一种污物的检测方法、装置及终端 |
US10914635B2 (en) | 2014-09-29 | 2021-02-09 | Rosemount Inc. | Wireless industrial process monitor |
KR101672116B1 (ko) * | 2015-02-02 | 2016-11-02 | 울산대학교 산학협력단 | 세차시스템 장치 및 그 세차 방법 |
CN104867159B (zh) * | 2015-06-05 | 2018-04-10 | 北京大恒图像视觉有限公司 | 一种数字相机传感器污点检测及分级方法与装置 |
US10311314B2 (en) | 2016-11-23 | 2019-06-04 | Ford Global Technologies, Llc | Detection of lane-splitting motorcycles |
CN106803252A (zh) * | 2017-01-16 | 2017-06-06 | 广东容祺智能科技有限公司 | 一种输电线路塔号牌污浊定位与自动检测方法 |
US10290158B2 (en) | 2017-02-03 | 2019-05-14 | Ford Global Technologies, Llc | System and method for assessing the interior of an autonomous vehicle |
US10509974B2 (en) * | 2017-04-21 | 2019-12-17 | Ford Global Technologies, Llc | Stain and trash detection systems and methods |
US10304165B2 (en) | 2017-05-12 | 2019-05-28 | Ford Global Technologies, Llc | Vehicle stain and trash detection systems and methods |
US10795618B2 (en) | 2018-01-05 | 2020-10-06 | Datamax-O'neil Corporation | Methods, apparatuses, and systems for verifying printed image and improving print quality |
US10546160B2 (en) | 2018-01-05 | 2020-01-28 | Datamax-O'neil Corporation | Methods, apparatuses, and systems for providing print quality feedback and controlling print quality of machine-readable indicia |
US10834283B2 (en) | 2018-01-05 | 2020-11-10 | Datamax-O'neil Corporation | Methods, apparatuses, and systems for detecting printing defects and contaminated components of a printer |
US10803264B2 (en) | 2018-01-05 | 2020-10-13 | Datamax-O'neil Corporation | Method, apparatus, and system for characterizing an optical system |
CN109936676B (zh) * | 2019-03-29 | 2020-12-18 | 富士施乐实业发展(中国)有限公司 | 一种用于复合机的输稿器的控制方法及装置 |
CN109862202B (zh) * | 2019-03-29 | 2021-11-30 | 富士施乐实业发展(中国)有限公司 | 一种用于复合机的输稿器的控制方法及装置 |
CN110166769B (zh) * | 2019-06-27 | 2020-10-02 | 信利光电股份有限公司 | 检测摄像模组输出错位的方法、装置、系统及存储介质 |
KR102157005B1 (ko) * | 2019-12-12 | 2020-09-16 | 주식회사 제이시스 | 영상 필터링 기법을 적용한 딥러닝 결과영상의 정확성 향상방법 |
CN111739012A (zh) * | 2020-06-30 | 2020-10-02 | 重庆盛泰光电有限公司 | 基于转盘的摄像头模组白斑检测系统 |
KR20220006895A (ko) * | 2020-07-09 | 2022-01-18 | 현대자동차주식회사 | 자동차 및 그를 위한 실내 청결 관리 방법 |
CN111812341A (zh) * | 2020-07-22 | 2020-10-23 | 英华达(上海)科技有限公司 | 自动化设备检测系统和检测自动化设备内部运作的方法 |
CN113458072B (zh) * | 2021-07-06 | 2022-01-18 | 广东固特超声股份有限公司 | 一种智能终端控制的眼镜超声波清洗方法及清洗机 |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH08202998A (ja) * | 1995-01-31 | 1996-08-09 | Isuzu Motors Ltd | 車線逸脱警報装置 |
JP2001008193A (ja) | 1999-06-24 | 2001-01-12 | Secom Co Ltd | 画像センサ |
JP2002094978A (ja) * | 2000-09-18 | 2002-03-29 | Toyota Motor Corp | レーン検出装置 |
JP2003259358A (ja) * | 2002-03-06 | 2003-09-12 | Nissan Motor Co Ltd | カメラの汚れ検出装置およびカメラの汚れ検出方法 |
JP2004172820A (ja) | 2002-11-19 | 2004-06-17 | Minolta Co Ltd | 撮像装置 |
JP2005117262A (ja) * | 2003-10-06 | 2005-04-28 | Fujitsu Ltd | レンズの汚れ判定方法及び装置 |
Family Cites Families (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS60124783A (ja) | 1983-12-10 | 1985-07-03 | Meidensha Electric Mfg Co Ltd | 画像処理装置 |
JPH11195121A (ja) | 1997-12-29 | 1999-07-21 | Canon Inc | 画像評価装置および方法 |
JP2002290994A (ja) | 2001-03-26 | 2002-10-04 | Sharp Corp | 小型カメラモジュールの異物検査方法およびその異物検査装置 |
JP2003209749A (ja) * | 2002-01-11 | 2003-07-25 | Olympus Optical Co Ltd | 撮像装置 |
JP2003295281A (ja) * | 2002-04-03 | 2003-10-15 | Canon Inc | 撮像装置及び動作処理方法及びプログラム及び記憶媒体 |
JP4179079B2 (ja) | 2002-08-30 | 2008-11-12 | 株式会社ニコン | 電子カメラ及びその制御プログラム |
JP2004153422A (ja) * | 2002-10-29 | 2004-05-27 | Toshiba Corp | 撮影装置、顔照合装置、撮影装置の汚れ検知方法、及び顔照合方法 |
US7676110B2 (en) * | 2003-09-30 | 2010-03-09 | Fotonation Vision Limited | Determination of need to service a camera based on detection of blemishes in digital images |
JP2007215151A (ja) * | 2006-01-12 | 2007-08-23 | Canon Inc | 撮像装置及びその制御方法及びプログラム |
JP4764265B2 (ja) * | 2006-06-20 | 2011-08-31 | キヤノン株式会社 | 撮像装置 |
JP4166253B2 (ja) * | 2006-07-10 | 2008-10-15 | トヨタ自動車株式会社 | 物体検出装置、物体検出方法、および物体検出用プログラム |
- 2006
- 2006-06-08 EP EP06766500.0A patent/EP2031557B1/en not_active Ceased
- 2006-06-08 KR KR1020087029606A patent/KR101078474B1/ko active IP Right Grant
- 2006-06-08 CN CN2006800548522A patent/CN101460971B/zh not_active Expired - Fee Related
- 2006-06-08 WO PCT/JP2006/311527 patent/WO2007141858A1/ja active Application Filing
- 2006-06-08 JP JP2008520094A patent/JP4644283B2/ja not_active Expired - Fee Related
- 2008
- 2008-12-08 US US12/329,860 patent/US8098302B2/en not_active Expired - Fee Related
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH08202998A (ja) * | 1995-01-31 | 1996-08-09 | Isuzu Motors Ltd | 車線逸脱警報装置 |
JP2001008193A (ja) | 1999-06-24 | 2001-01-12 | Secom Co Ltd | 画像センサ |
JP2002094978A (ja) * | 2000-09-18 | 2002-03-29 | Toyota Motor Corp | レーン検出装置 |
JP2003259358A (ja) * | 2002-03-06 | 2003-09-12 | Nissan Motor Co Ltd | カメラの汚れ検出装置およびカメラの汚れ検出方法 |
JP2004172820A (ja) | 2002-11-19 | 2004-06-17 | Minolta Co Ltd | 撮像装置 |
JP2005117262A (ja) * | 2003-10-06 | 2005-04-28 | Fujitsu Ltd | レンズの汚れ判定方法及び装置 |
Non-Patent Citations (1)
Title |
---|
See also references of EP2031557A4 |
Cited By (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2010016650A (ja) * | 2008-07-03 | 2010-01-21 | Canon Inc | 撮像装置及びその制御方法及びプログラム |
EP3493111A1 (en) | 2017-12-01 | 2019-06-05 | Fujitsu Limited | Biometric image processing apparatus, method and program |
US10936848B2 (en) | 2017-12-01 | 2021-03-02 | Fujitsu Limited | Biometric image processing apparatus, biometric image processing method, and biometric image processing program |
JP2020139730A (ja) * | 2019-02-27 | 2020-09-03 | ダイキン工業株式会社 | 情報提供システム |
WO2020175589A1 (ja) * | 2019-02-27 | 2020-09-03 | ダイキン工業株式会社 | 情報提供システム |
US11538204B2 (en) | 2019-02-27 | 2022-12-27 | Daikin Industries, Ltd. | Information providing system |
JP2021052233A (ja) * | 2019-09-20 | 2021-04-01 | 株式会社デンソーテン | 付着物検出装置および付着物検出方法 |
JP7156224B2 (ja) | 2019-09-20 | 2022-10-19 | 株式会社デンソーテン | 付着物検出装置および付着物検出方法 |
CN111275022A (zh) * | 2020-03-19 | 2020-06-12 | 山东宜佳成新材料有限责任公司 | 基于遗忘因子型经验模态分解的污渍检测分析方法及应用 |
CN111275022B (zh) * | 2020-03-19 | 2023-05-16 | 山东宜佳成新材料有限责任公司 | 基于遗忘因子型经验模态分解的污渍检测分析方法及应用 |
Also Published As
Publication number | Publication date |
---|---|
CN101460971A (zh) | 2009-06-17 |
JPWO2007141858A1 (ja) | 2009-10-15 |
EP2031557A1 (en) | 2009-03-04 |
US20090087022A1 (en) | 2009-04-02 |
CN101460971B (zh) | 2012-06-27 |
KR20090009944A (ko) | 2009-01-23 |
EP2031557A4 (en) | 2014-04-16 |
EP2031557B1 (en) | 2017-12-27 |
US8098302B2 (en) | 2012-01-17 |
KR101078474B1 (ko) | 2011-10-31 |
JP4644283B2 (ja) | 2011-03-02 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2007141858A1 (ja) | 汚れ検出方式 | |
JP4186699B2 (ja) | 撮像装置および画像処理装置 | |
US7889890B2 (en) | Image capture apparatus and control method therefor | |
JP4533836B2 (ja) | 変動領域検出装置及びその方法 | |
JP4466015B2 (ja) | 画像処理装置および画像処理プログラム | |
JP4196124B2 (ja) | 撮像系診断装置、撮像系診断プログラム、撮像系診断プログラム製品、および撮像装置 | |
Andreopoulos et al. | On sensor bias in experimental methods for comparing interest-point, saliency, and recognition algorithms | |
US20150363920A1 (en) | Method, electronic apparatus, and computer readable medium for processing reflection in image | |
WO2019146097A1 (ja) | 欠陥のある撮影データの検知装置及び検知システム | |
JP5207057B2 (ja) | 撮影装置の汚れを検知する汚れ検知装置及び同検知方法 | |
JP2011015299A (ja) | 撮像装置、画像処理装置、制御方法、及びプログラム | |
JP4419479B2 (ja) | 画像処理装置および画像処理プログラム | |
JP4466017B2 (ja) | 画像処理装置および画像処理プログラム | |
JP4438363B2 (ja) | 画像処理装置および画像処理プログラム | |
Premachandran et al. | Measuring the effectiveness of bad pixel detection algorithms using the ROC curve | |
JP2008042227A (ja) | 撮像装置 | |
JP6348883B2 (ja) | 画像撮像装置、画像撮像方法及びコンピュータプログラム | |
JP2009088884A (ja) | 撮像データにおける動きベクトル検出方法と装置 | |
JP4466016B2 (ja) | 画像処理装置および画像処理プログラム | |
CN107147845B (zh) | 对焦方法、装置和终端设备 | |
JP3957495B2 (ja) | 画像センサ | |
JP5213493B2 (ja) | 動き検出装置 | |
JP2006203688A5 (ja) | ||
JP6617124B2 (ja) | 物体検出装置 | |
JP6727890B2 (ja) | 画像認識装置、画像認識方法、及び画像認識プログラム |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 200680054852.2 Country of ref document: CN |
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 06766500 Country of ref document: EP Kind code of ref document: A1 |
WWE | Wipo information: entry into national phase |
Ref document number: 2008520094 Country of ref document: JP |
WWE | Wipo information: entry into national phase |
Ref document number: 1020087029606 Country of ref document: KR |
WWE | Wipo information: entry into national phase |
Ref document number: 2006766500 Country of ref document: EP |