WO2006109398A1 - Image processing device and method, program, and recording medium - Google Patents
Image processing device and method, program, and recording medium
- Publication number
- WO2006109398A1 (application PCT/JP2006/305113)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- pixel
- image data
- data
- correction
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/222—Studio circuitry; Studio devices; Studio equipment
- H04N5/262—Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
- H04N5/272—Means for inserting a foreground image in a background image, i.e. inlay, outlay
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R1/00—Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
- B60R1/20—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
- B60R1/22—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle
- B60R1/23—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view
- B60R1/24—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view in front of the vehicle
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/50—Image enhancement or restoration using two or more images, e.g. averaging or subtraction
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/77—Retouching; Inpainting; Scratch removal
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/20—Image preprocessing
- G06V10/26—Segmentation of patterns in the image field; Cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; Detection of occlusion
- G06V10/273—Segmentation of patterns in the image field; Cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; Detection of occlusion removing elements interfering with the pattern to be recognised
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/58—Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/30—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing
- B60R2300/304—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing using merged images, e.g. merging camera image with stored images
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/80—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement
- B60R2300/8053—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement for bad weather conditions or night vision
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/80—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement
- B60R2300/8093—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement for obstacle warning
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20021—Dividing image into blocks, subimages or windows
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20172—Image enhancement details
- G06T2207/20208—High dynamic range [HDR] image processing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30248—Vehicle exterior or interior
- G06T2207/30252—Vehicle exterior; Vicinity of vehicle
Definitions
- The present invention relates to an image processing device and method, a program, and a recording medium, and more particularly to an image processing device and method, a program, and a recording medium that can remove an obstructing object blocking the field of view and provide a comfortable visual field image.
- The present invention has been made in view of such a situation, and aims to remove an obstructing object that blocks the field of view and to provide a comfortable visual field image.
- A first image processing apparatus to which the present invention is applied includes: imaging means for capturing an image and outputting data of the captured image; correction necessity judgment means for judging whether or not to correct the image data output from the imaging means; detection means for detecting, when it is judged that the image data should be corrected, a pixel corresponding to an obstructing object, which is a predetermined object floating or falling in the air, in the image data; replacing means for replacing the pixel of the obstructing object detected by the detection means with another pixel; and output means for outputting the image data in which the pixel of the obstructing object has been replaced with another pixel.
- In the first image processing apparatus of the present invention, an image is captured and the data of the captured image is output; whether or not to correct the output image data is determined; if it is determined that the image data should be corrected, a pixel corresponding to an obstructing object, which is a predetermined object floating or falling in the air, is detected in the image data; the detected pixel of the obstructing object is replaced with another pixel; and the image data in which the pixel of the obstructing object has been replaced with another pixel is output.
- The imaging means can convert the charge generated in response to the captured light into an analog electrical signal having, for each pixel, a voltage value proportional to the logarithm of the number of charges, convert the analog electrical signal into digital data, and output the result as image data.
- The photographing means is constituted by, for example, an HDRC camera.
- Alternatively, the imaging means can convert the current generated in response to the captured light into an analog electrical signal having, for each pixel, a voltage value proportional to the logarithm of the magnitude of the current, convert the analog electrical signal into digital data, and output the result as image data.
- The detection means can detect a pixel corresponding to the obstructing object based on the luminance values of the pixels of the image data and preset threshold values.
- The threshold values are an upper limit and a lower limit of the luminance value for distinguishing, in the image data, pixels corresponding to the obstructing object from pixels corresponding to the background.
- A pixel having a luminance value within this range can be detected as a pixel corresponding to the obstructing object.
- The detection means can also divide the image into a plurality of regions and, only if pixels having luminance values within the threshold range exist in the image data of all of the divided regions, detect those pixels as pixels corresponding to the obstructing object.
- This prevents an object that exists in only a part of the image from being erroneously detected as an obstructing object.
- Likewise, the detection means can detect pixels having luminance values within the threshold range as pixels corresponding to the obstructing object only if such pixels exist in the image data of all of a plurality of frames captured by the imaging means.
- The detection means can calculate a feature amount of the data of a block centered on a pixel having a luminance value within the threshold range, calculate the difference between the calculated feature amount and a previously stored feature amount of pixel data corresponding to the obstructing object, and, if the difference is equal to or less than a preset value, detect the block centered on that pixel as a block of pixels corresponding to the obstructing object.
- In this way, the obstructing object can be reliably detected regardless of how much of the image it occupies.
- The replacement means can replace the pixels detected by the detection means in the image of a frame captured by the imaging means with the corresponding pixels in the image of a frame temporally prior to the frame whose pixels are to be replaced.
- The apparatus may further include specifying means for specifying the pixels to be used for replacement, and the replacement means can replace the pixels detected by the detection means with the pixels specified by the specifying means.
- The image processing apparatus may further include another photographing unit, and the replacement means can replace the pixels detected by the detection means with the corresponding pixels of an image captured by the other photographing unit at the same timing as the image whose pixels are to be replaced.
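As an illustrative sketch (not the patent's own implementation) of this temporal replacement, the following Python fragment swaps the pixels flagged as the obstructing object for the pixels at the same positions in the preceding frame; the simple brightness-based mask and all names here are assumptions for the demo.

```python
import numpy as np

def replace_with_previous_frame(current: np.ndarray,
                                previous: np.ndarray,
                                mask: np.ndarray) -> np.ndarray:
    """Replace the pixels flagged in `mask` (True = obstructing object)
    with the pixels at the same positions in the previous frame."""
    corrected = current.copy()
    corrected[mask] = previous[mask]
    return corrected

# Tiny demo: an 8-bit frame pair and a detection mask.
current = np.full((4, 4), 50, dtype=np.uint8)
current[1, 2] = 230                    # bright pixel standing in for a snowflake
previous = np.full((4, 4), 48, dtype=np.uint8)
mask = current >= 200                  # assumed detection result
print(replace_with_previous_frame(current, previous, mask))
```

The idea relies on the snow moving between frames, so that the occluded background is visible at the same position in an earlier (or second-camera) image.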
- In a first image processing method to which the present invention is applied, a correction necessity determination step determines whether or not to correct the image data output from an imaging unit that captures an image and outputs the data of the captured image; if the correction necessity determination step determines that the image data should be corrected, a detection step detects, in the image data, a pixel corresponding to an obstructing object, which is a predetermined object floating or falling in the air; a replacement step replaces the pixel of the obstructing object detected by the processing of the detection step with another pixel; and an output step outputs the image data in which the pixel of the obstructing object has been replaced with another pixel.
- That is, in the first image processing method of the present invention, whether or not to correct the image data output from the imaging unit that captures an image and outputs the data of the captured image is determined; if it is determined that the image data should be corrected, a pixel corresponding to the obstructing object, which is a predetermined object floating or falling in the air, is detected in the image data; the detected pixel of the obstructing object is replaced with another pixel; and the image data in which the pixel of the obstructing object has been replaced with another pixel is output.
- A first program to which the present invention is applied is a program that causes an image processing apparatus to perform image processing. It causes a computer to execute: a correction necessity determination control step that controls the determination of whether or not to correct the image data output from an imaging unit that captures an image and outputs the data of the captured image; a detection control step that, when it is determined that the image data should be corrected, controls the detection of a pixel corresponding to an obstructing object, which is a predetermined object floating or falling in the air; a replacement control step that controls the replacement of the obstructing object pixel detected by the processing of the detection control step with another pixel; and an output control step that controls the output of the image data in which the obstructing object pixel has been replaced with another pixel.
- A first recording medium to which the present invention is applied is a recording medium on which a program for causing an image processing apparatus to perform image processing is recorded.
- The recorded program causes a computer to execute: a correction necessity determination control step that controls the determination of whether or not to correct the image data output from photographing means that captures an image and outputs the data of the captured image; a detection control step that, when it is determined that the image data should be corrected, controls the detection of a pixel corresponding to an obstructing object, which is a predetermined object floating or falling in the air; a replacement control step that controls the replacement of the detected obstructing object pixel with another pixel; and an output control step that controls the output of the image data in which the pixel has been replaced.
- A second image processing apparatus to which the present invention is applied includes: photographing means for capturing an image when illumination that illuminates the subject is turned on and an image when the illumination is turned off, and outputting the data of the captured images; correction necessity judgment means for judging whether or not to correct the image data output from the photographing means; correction means for correcting the image data based on the image data captured when the illumination is turned on and the image data captured when the illumination is turned off; and output means for outputting the image data corrected by the correction means.
- In the second image processing apparatus of the present invention, an image when the illumination that illuminates the subject is turned on and an image when the illumination is turned off are captured and the data of the captured images is output; whether or not to correct the output image data is determined; the image data is corrected based on the image data captured when the illumination is turned on and the image data captured when the illumination is turned off; and the corrected image data is output.
- Using the data of the image captured when the illumination that illuminates the subject photographed by the photographing means is turned on and the data of the image captured when the illumination is turned off, the correction means can correct the image data so that the image data captured when the illumination is turned off is output to the output means.
- The apparatus may further include detection means for detecting, in the image data, a pixel corresponding to an obstructing object, which is a predetermined object floating or falling in the air. The detection means calculates, for each corresponding pixel, the difference between the luminance values of the image data captured when the illumination that illuminates the subject is turned on and the image data captured when the illumination is turned off, and detects a pixel whose luminance difference exceeds a preset value as a pixel corresponding to the obstructing object.
- The correction means can then replace the pixel of the obstructing object detected by the detection means with another pixel.
- In a second image processing method to which the present invention is applied, an image when the illumination that illuminates the subject is turned on and an image when the illumination is turned off are photographed, and the data of the photographed images is output. A correction necessity determination step determines whether or not to correct the output image data; if it is determined that the image data should be corrected, a correction step corrects the image data based on the image data captured when the illumination is turned on and the image data captured when the illumination is turned off; and an output step outputs the image data corrected by the processing of the correction step.
- That is, in the second image processing method of the present invention, an image when the illumination that illuminates the subject is turned on and an image when the illumination is turned off are captured, whether or not to correct the image data output from the imaging means is determined, and if it is determined that correction should be performed, the image data is corrected based on the image data captured when the illumination is turned on and the image data captured when the illumination is turned off, and the corrected image data is output.
- A second program to which the present invention is applied is a program for causing an image processing apparatus to perform image processing. It causes a computer to execute: a correction necessity determination control step that controls the determination of whether or not to correct the image data output from photographing means that captures an image when the illumination for illuminating a subject is turned on and an image when the illumination is turned off, and outputs the data of the captured images; a correction control step that, when it is determined that the image data should be corrected, controls the correction of the image data based on the image data captured when the illumination is turned on and the image data captured when the illumination is turned off; and an output control step that controls the output of the corrected image data.
- A second recording medium to which the present invention is applied is a recording medium on which a program for causing an image processing apparatus to perform image processing is recorded. The recorded program causes a computer to execute: a correction necessity determination control step that controls the determination of whether or not to correct the image data output from photographing means that captures an image when the illumination for illuminating the subject is turned on and an image when the illumination is turned off, and outputs the data of the captured images; a correction control step that controls the correction of the image data based on the image data captured when the illumination is turned on and the image data captured when the illumination is turned off; and an output control step that controls the output of the corrected image data.
- According to the present invention, it is possible to remove an obstructing object that blocks the field of view and to provide a comfortable visual field image.
- FIG. 1 is a block diagram showing a configuration example of a monitoring device to which the present invention is applied.
- FIG. 2 is a diagram illustrating a configuration example of an imaging unit in FIG.
- FIG. 3 is a diagram illustrating sensitivity characteristics of an imaging unit.
- FIG. 4 is a block diagram illustrating a configuration example of a control unit in FIG. 1.
- FIG. 5 is a flowchart illustrating an example of image correction processing.
- FIG. 6 is a flowchart illustrating an example of correction necessity determination processing.
- FIG. 7 is a flowchart for explaining an example of obstructing object detection processing.
- FIG. 8 is a diagram illustrating an example of an image in which an obstructing object is captured.
- FIG. 9 is a diagram showing an example of dividing the image of FIG. 8 into a plurality of regions.
- FIG. 10 is a diagram showing an example of a pixel histogram.
- FIG. 11 is a flowchart illustrating an example of mode A processing.
- FIG. 12 is a flowchart illustrating an example of mode B processing.
- FIG. 13 is a diagram showing an example of continuous frames.
- FIG. 14 is a diagram showing an example of a pixel histogram.
- FIG. 15 is a diagram showing an example of a pixel histogram.
- FIG. 16 is a diagram illustrating an example of mode C processing.
- FIG. 17 is a flowchart illustrating an example of feature determination processing.
- FIG. 18 is a flowchart for explaining another example of disturbing object detection processing.
- FIG. 19 is a diagram showing an example of an image taken when lighting is turned on.
- FIG. 20 is a diagram showing an example of an image taken when the illumination is turned off.
- FIG. 21 is a diagram showing an example of an image from which an obstructing object is removed.
- FIG. 22 is a flowchart for explaining an example of obstructing object removal processing.
- FIG. 23 is a diagram showing an example of a frame image to be corrected.
- FIG. 24 is a diagram showing an example of an image of the previous frame in time.
- FIG. 25 is a diagram showing an example of an image in which pixels are replaced.
- FIG. 26 is a diagram showing another example of an image of a frame to be corrected.
- FIG. 27 is a diagram showing another example of the image of the previous frame in time.
- FIG. 28 is a diagram showing another example of an image in which pixels are replaced.
- FIG. 29 is a block diagram showing another configuration example of the monitoring device to which the present invention is applied.
- FIG. 30 is a flowchart illustrating an example of obstructing object removal processing by the monitoring device of FIG. 29.
- FIG. 31 is a block diagram illustrating a configuration example of a personal computer.
- FIG. 1 is a block diagram showing a configuration example of an embodiment of a monitoring device 100 to which the present invention is applied.
- The monitoring device 100 is a device that is mounted on, for example, an automobile and presents a comfortable field-of-view image to the user by photographing the scene in front of the vehicle; it includes an imaging unit 101, a control unit 102, and a display unit 103.
- The imaging unit 101 is configured by, for example, a camera or the like; it captures an image (either a moving image or a still image) based on the light input through the lens 101a and outputs the data of the captured image to the control unit 102.
- When the imaging unit 101 captures a moving image, the captured image data is output as digital data encoded in units of frames.
- The control unit 102 performs predetermined processing on the image data supplied from the imaging unit 101, corrects it by removing the obstructing object, and outputs a signal corresponding to the corrected image data to the display unit 103.
- The obstructing object is an object that exists in the air, such as one floating in the air, one falling through the air like snow or rain, or one flying like an insect, and it hinders human visibility.
- The control unit 102 is connected to an external information device such as the electronic control unit (a microcomputer or the like) of the automobile, and acquires the output status of the various sensors connected to that information device as necessary.
- The display unit 103 is configured by, for example, an LCD (Liquid Crystal Display) or the like, and displays an image corresponding to the signal supplied from the control unit 102.
- FIG. 2 is a block diagram illustrating a configuration example of the imaging unit 101.
- The imaging unit 101 is configured such that the light from the lens 101a is output to the imaging control unit 121.
- The imaging control unit 121 is, for example, a logarithmic-conversion-type imaging device such as an HDRC (High Dynamic Range CMOS (Complementary Metal Oxide Semiconductor)) sensor, and includes a light detection unit 141, a logarithmic conversion unit 142, an A/D conversion unit 143, and an imaging timing control unit 144.
- The light of the subject incident through the lens 101a forms an image on a light detection surface (not shown) of the light detection unit 141 of the imaging control unit 121.
- The light detection unit 141 includes, for example, a plurality of light receiving elements such as photodiodes; it converts the light of the subject imaged by the lens 101a into charges corresponding to the intensity (light quantity) of the light, and accumulates the converted charges.
- The light detection unit 141 supplies the accumulated charges to the logarithmic conversion unit 142 in synchronization with the control signal supplied from the imaging timing control unit 144. The light detection unit 141 may also supply the converted charges to the logarithmic conversion unit 142 as they are, without accumulating them.
- The logarithmic conversion unit 142 includes, for example, a plurality of MOSFETs (Metal Oxide Semiconductor Field Effect Transistors). Using the subthreshold characteristic of the MOSFETs, it converts the charge (or current) supplied from the light detection unit 141 into an analog electrical signal having, for each pixel, a voltage value proportional to the logarithm of the number of charges (or the current intensity), and supplies the converted analog electrical signal to the A/D conversion unit 143.
- The A/D conversion unit 143 converts the analog electrical signal into digital image data in synchronization with the control signal supplied from the imaging timing control unit 144, and supplies the converted image data to the control unit 102. Accordingly, the pixel value of each pixel of the image data output from the imaging control unit 121 is proportional to the logarithm of the amount of light of the subject incident on the light detection unit 141.
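The logarithmic response described here can be sketched numerically. The following Python model (not the HDRC circuit itself) maps incident illuminance onto a digital code proportional to its logarithm over roughly the 1 millilux to 500 kilolux span quoted in the surrounding text; the bit depth and normalization are assumptions.

```python
import numpy as np

def logarithmic_pixel_response(illuminance_lux: np.ndarray,
                               bits: int = 10,
                               min_lux: float = 1e-3,
                               max_lux: float = 5e5) -> np.ndarray:
    """Map illuminance (lux) to a digital code proportional to its
    logarithm, covering about 1 mlx .. 500 klx (~170 dB)."""
    clipped = np.clip(illuminance_lux, min_lux, max_lux)
    log_norm = ((np.log10(clipped) - np.log10(min_lux))
                / (np.log10(max_lux) - np.log10(min_lux)))
    return np.round(log_norm * (2 ** bits - 1)).astype(np.uint16)

# Nine decades of illuminance fall onto one linear code scale.
print(logarithmic_pixel_response(np.array([1e-3, 1.0, 1e3, 5e5])))
# -> [   0  353  706 1023]
```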
- FIG. 3 is a graph showing the sensitivity characteristics of the HDRC imaging control unit 121, a CCD (Charge Coupled Device) imaging device, a silver halide film, and the human eye.
- The horizontal axis indicates the logarithm of the illuminance (in lux) of the incident light, and the vertical axis indicates the sensitivity.
- Line 151 indicates the sensitivity characteristic of the imaging control unit 121, line 152 that of the human eye, line 153 that of silver halide film, and line 154 that of the CCD image sensor.
- The sensitivity characteristic of a conventional CMOS image sensor is similar to that of the CCD image sensor indicated by line 154.
- Since the imaging control unit 121 outputs image data whose pixel values are approximately proportional to the logarithm of the amount of light of the incident subject, the capacitances of the photodiodes, MOSFETs, and other elements constituting the imaging control unit 121 do not saturate, and the unit has a dynamic range of about 170 dB, from about 1 millilux to about 500 kilolux (brighter than sunlight), which is wider than that of a CCD image sensor, silver halide film, or the human eye.
- In other words, because the logarithmic conversion unit 142 outputs data consisting of luminance values (or pixel values) substantially proportional to the logarithm of the amount of incident light, even when the amount of incident light becomes large, the capacitances of elements such as the photodiodes and MOSFETs constituting the imaging control unit 121 do not saturate, and the currents flowing through the elements and the applied voltages remain within the range over which each element can produce an output according to its input. Accordingly, luminance values (or pixel values) that follow fluctuations in the amount of incident light can be obtained almost accurately within the imageable luminance range.
- The dynamic range of the imaging control unit 121 is not limited to the 170 dB mentioned above; an imaging device having a dynamic range suited to the intended use, for example about 100 dB or 200 dB, may be used.
- Unlike imaging devices in which the pixel values corresponding to bright portions of the subject are clipped at the maximum pixel value the image sensor can output, or the pixel values corresponding to dark portions are clipped at the minimum pixel value, the imaging unit 101 using the imaging control unit 121 does not need to adjust the amount of incident light by means of the aperture, shutter speed, and the like.
- As a result, the imaging unit 101 can faithfully capture detailed changes in the luminance of the subject, without bright portions of the subject being washed out to white or dark portions being crushed to black.
- For example, when the imaging unit 101 using the imaging control unit 121 photographs the area in front of the vehicle from inside the vehicle during the daytime, even if the sun is within the angle of view, it can capture images that faithfully reproduce both the sun and the state of the road ahead without adjusting the amount of incident light.
- Similarly, even when the headlights of an oncoming vehicle shine in from the front at night, the imaging unit 101 can, without adjusting the amount of incident light, capture an image that faithfully reproduces everything from the headlights of the oncoming vehicle to the dark portions illuminated only by the headlights of the own vehicle.
- Because the imaging unit 101 using the imaging control unit 121 does not need to adjust the amount of incident light, in the image data it outputs, the pixel values corresponding to areas whose luminance changes between, for example, two frames fluctuate, while the pixel values corresponding to areas whose luminance does not change hardly fluctuate at all. Therefore, the pixel value of each pixel of the data obtained by taking the inter-frame difference of the image data (hereinafter referred to as difference data, and its pixel values as difference values) reflects the change in luminance of the subject almost faithfully.
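A small numeric check of this property, under the assumption of an ideal log-response sensor with an arbitrary gain constant k: the inter-frame difference is zero wherever the luminance is unchanged and depends only on the luminance ratio elsewhere.

```python
import numpy as np

def difference_data(frame_a: np.ndarray, frame_b: np.ndarray) -> np.ndarray:
    """Per-pixel difference of log-domain pixel values; proportional to
    log(luminance ratio), i.e. 0 wherever the luminance is unchanged."""
    return frame_a.astype(np.int32) - frame_b.astype(np.int32)

k = 100.0                                       # assumed sensor gain
scene = np.array([10.0, 100.0, 1000.0])         # luminances in frame 1
frame1 = np.round(k * np.log10(scene)).astype(np.int32)
frame2 = np.round(k * np.log10(scene * [1.0, 2.0, 1.0])).astype(np.int32)
print(difference_data(frame2, frame1))          # -> [ 0 30  0]
```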
- On the other hand, an imaging apparatus using a CCD image sensor, whose dynamic range is narrower than that of the human eye, needs to adjust the amount of incident light in accordance with the brightness of the subject. If the subject contains both areas whose luminance fluctuates and areas whose luminance does not, the pixel values corresponding to the areas whose luminance does not fluctuate may also vary during shooting. In that case, the difference value of each pixel of the difference data does not necessarily reflect the change in luminance of the subject faithfully.
- In addition, because the pixel values of the image data output from the imaging unit 101 are substantially proportional to the logarithm of the amount of light from the subject, regardless of the brightness (illuminance) of the illumination irradiating the subject, the histogram showing the distribution of pixel values of the image data obtained by photographing the subject has almost the same shape as the histogram showing the distribution of the reflectance of the subject.
- By contrast, with an image sensor whose output is proportional to the illuminance, if the same subject is photographed under illuminations whose illuminance differs by about 100 times, the widths of the histograms showing the pixel value distributions of the first image data and the second image data also differ by about 100 times.
- Furthermore, the sensitivities of CCD image sensors and silver halide films are not proportional to the illuminance of the incident light, owing to factors such as their gamma characteristics. Therefore, the histogram showing the distribution of pixel values of image data obtained with a CCD image sensor or silver halide film changes its shape depending on the magnitude of the illuminance, even if the distribution of the amount of incident light (illuminance) is the same.
- FIG. 4 is a block diagram illustrating a configuration example of the control unit 102.
- The obstruction state detection unit 161 detects whether there is an obstructing object (snow) to be removed from the image, based on information acquired from, for example, the microcomputer of the automobile.
- The obstructing object detection unit 162 detects obstructing objects in the image supplied from the imaging unit 101.
- The moving state control unit 163 detects the moving state of the automobile and the moving state of the obstructing object, detects from both moving states the positional relationship between the obstructing object and the background in the image, and determines, based on that positional relationship, the frame containing the pixels to be used for replacement in the correction and the pixels to be replaced.
- The obstructing object registration unit 165 stores feature amount data of obstructing objects in advance and, as necessary, evaluates the degree of coincidence between the feature amount of the obstructing object detected by the obstructing object detection unit 162 and the stored feature amounts.
- Based on the processing results of the obstructing object detection unit 162, the moving state control unit 163, and the obstructing object registration unit 165, the obstructing object removal unit 164 replaces the pixels corresponding to the obstructing object (removes the obstructing object), corrects the image, and outputs a signal corresponding to the corrected image data to the display unit 103.
- Each unit constituting the control unit 102 may be configured by hardware, such as a semiconductor integrated circuit incorporating a logical operation unit and a storage unit for realizing the functions described above. Alternatively, the control unit 102 may be configured by a computer, and each of the units described above may be realized as a functional block by software processed by the computer.
- In the following, it is assumed that the monitoring device 100 is mounted on a car, that the imaging unit 101 captures an image of the scene ahead and displays it on the display unit 103, and that snow, as an obstructing object, is removed from the captured image.
- In step S101, the control unit 102 executes the correction necessity determination processing described later with reference to FIG. 6. Thereby, it is determined whether or not the image data supplied from the imaging unit 101 needs to be corrected.
- In step S102, the control unit 102 determines whether or not it was determined that correction is necessary as a result of the processing in step S101. If it is determined that correction is necessary, the process proceeds to step S103.
- In step S103, the control unit 102 executes the obstructing object detection processing described later with reference to FIG. 7. Thereby, the pixels (or blocks of pixels) corresponding to the obstructing object in the image data supplied from the imaging unit 101 are detected.
- In step S104, the control unit 102 executes the obstructing object removal processing described later with reference to FIG. 22. Thereby, the obstructing object detected by the processing of step S103 is removed from the image.
- In step S105, the control unit 102 outputs a signal corresponding to the image data to the display unit 103 to display the image.
- If it is determined in step S102 that correction is not necessary, the processing of steps S103 and S104 is skipped, and the image captured by the imaging unit 101 is displayed without correction.
- Next, the details of the correction necessity determination processing in step S101 of FIG. 5 will be described with reference to the flowchart of FIG. 6.
- In step S121, the obstruction state detection unit 161 obtains the output information of the raindrop sensor from the microcomputer of the automobile or the like and determines whether the sensor has detected an object (such as snow or rain). If it is determined that an object has been detected, the process proceeds to step S122.
- In step S122, the obstruction state detection unit 161 determines, based on information acquired from the microcomputer of the vehicle, whether or not the wiper has been operating for a preset time (for example, 1 minute). If it is determined that the wiper has been operating for that time, the process proceeds to step S123. Even if it is determined in step S121 that the raindrop sensor has detected an object, the detection may be only temporary, for example due to splashed water, and it is not necessarily raining (or snowing). Therefore, it is further determined whether or not the wiper has been operating for the predetermined time.
- In step S123, the obstruction state detection unit 161 determines, based on information acquired from the microcomputer of the vehicle, whether or not the vehicle speed is equal to or lower than a threshold value. If it is determined that the vehicle speed is equal to or lower than the threshold value, the process proceeds to step S125. When it is snowing, the vehicle speed is normally expected to be low, so it is further determined whether or not the vehicle speed is at or below the threshold value.
- In step S125, the obstruction state detection unit 161 determines that image correction is necessary and sets the correction-required flag to ON. In the processing of step S102 of FIG. 5, it is determined whether or not this flag is ON; if the correction-required flag is ON, it is determined that correction is necessary.
- If it is determined in step S121 that the sensor has not detected an object, if it is determined in step S122 that the wiper has not been operating for the predetermined time, or if it is determined in step S123 that the vehicle speed is not equal to or lower than the threshold value, the process proceeds to step S124.
- In step S124, the obstruction state detection unit 161 determines whether or not correction has been set manually. If it is determined that correction has been set manually, the process proceeds to step S125. For example, when the user (driver) presses an operation button (not shown) to command image correction, the correction-required flag is also set to ON. If it is determined in step S124 that correction has not been set manually, the processing of step S125 is skipped and the process ends.
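The decision logic of steps S121 to S125 can be condensed as follows. This is a hedged sketch: the patent gives a wiper time of about one minute but no concrete speed threshold, so speed_threshold_kmh and the other names are hypothetical.

```python
def correction_needed(raindrop_detected: bool,
                      wiper_on_duration_s: float,
                      vehicle_speed_kmh: float,
                      manual_request: bool,
                      wiper_time_threshold_s: float = 60.0,
                      speed_threshold_kmh: float = 40.0) -> bool:
    """Steps S121-S123 in sequence, with the manual override of S124."""
    if (raindrop_detected
            and wiper_on_duration_s >= wiper_time_threshold_s
            and vehicle_speed_kmh <= speed_threshold_kmh):
        return True            # S125: correction-required flag ON
    return manual_request      # S124: driver commanded correction

print(correction_needed(True, 90.0, 30.0, False))   # True (automatic)
print(correction_needed(False, 0.0, 80.0, True))    # True (manual)
```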
- Next, the details of the obstructing object detection processing in step S103 of FIG. 5 will be described with reference to the flowchart of FIG. 7.
- In step S141, the obstructing object detection unit 162 divides the image captured by the imaging unit 101 into predetermined regions. For example, an image such as the one shown in FIG. 8 is divided as shown in FIG. 9; the parts indicated by white circles in the figure are snow, which is the obstructing object, and the image is divided into eight regions, A to H.
- In step S142, the obstructing object detection unit 162 detects the pixels whose values lie within the threshold range in the image data.
- The graph shown in FIG. 10 shows the relationship between the pixel value (pixel luminance value) and the number of pixels for an image outside the vehicle when it is snowing. The horizontal axis represents the output value (pixel value), the vertical axis represents the number of pixels, and the pixel distribution (histogram) is indicated by line 201.
- Peaks of line 201 are formed in the portion where the output value (pixel value) is low, on the left side of the figure, and in the portion where the output value is high, on the right side. The peak on the left is due to pixels corresponding to the low-luminance background of the image, and the peak on the right is due to pixels corresponding to the snow, which is the obstructing object.
- Threshold a and threshold b are, respectively, the lower and upper limits of the pixel values corresponding to the snow, and are preset to values suitable for distinguishing the background from the obstructing object. A pixel whose value is greater than or equal to threshold a and less than or equal to threshold b (a pixel within the threshold range) is therefore considered highly likely to be a pixel of the obstructing object.
- Threshold a and threshold b are set based on, for example, a histogram of pixel values created from image data acquired in advance by photographing falling snow.
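A minimal sketch of the threshold-range detection of step S142, assuming an 8-bit grayscale frame; the concrete values of threshold a and threshold b below are illustrative stand-ins for values derived from the pre-recorded snow histograms.

```python
import numpy as np

def detect_candidates(image: np.ndarray,
                      threshold_a: int,
                      threshold_b: int) -> np.ndarray:
    """Boolean mask of pixels whose luminance lies in [a, b],
    the band attributed to snow in the histogram of FIG. 10."""
    return (image >= threshold_a) & (image <= threshold_b)

rng = np.random.default_rng(0)
frame = rng.integers(0, 120, size=(240, 320), dtype=np.uint8)  # dark background
frame[rng.random(frame.shape) < 0.01] = 210                    # sprinkle "snow"
mask = detect_candidates(frame, threshold_a=180, threshold_b=240)
print(int(mask.sum()), "candidate pixels")
```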
- The threshold values need not be fixed; they may be set dynamically according to the weather or the like. For example, since the intensity of sunlight differs between a sunny day and a cloudy day (or between day and night), the luminance values of the pixels of the same object may differ. In such cases, thresholds suitable for distinguishing between the background and the obstructing object may be selected (set dynamically) based on the luminance value of an object that is always observed in the image and whose reflectance is stored in advance (for example, the road surface).
- For example, when the imaging unit 101 is attached to the front of a car, the road surface (asphalt) always appears at the lower end of the captured image. Therefore, the relationship between the luminance values of the road surface and of snow (for example, the difference in luminance value) can be stored from images captured in advance under a plurality of different weather conditions, and if the brightness of the captured image changes with the weather, pixels corresponding to snow can be detected by referring to the luminance value of the pixels corresponding to the road surface.
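A sketch of such dynamic threshold setting, assuming the road surface occupies a known strip at the bottom of the frame; the stored road-to-snow luminance offset and the band width are hypothetical values standing in for the pre-recorded relationship.

```python
import numpy as np

def dynamic_thresholds(image: np.ndarray,
                       road_region: tuple,
                       stored_snow_offset: int = 80,
                       band_width: int = 40) -> tuple:
    """Derive (threshold_a, threshold_b) from the current road-surface
    luminance, using a stored road-vs-snow luminance difference."""
    top, bottom, left, right = road_region
    road_luminance = int(np.median(image[top:bottom, left:right]))
    threshold_a = road_luminance + stored_snow_offset
    return threshold_a, threshold_a + band_width

dusk_frame = np.full((240, 320), 90, dtype=np.uint8)   # dim road at dusk
print(dynamic_thresholds(dusk_frame, road_region=(200, 240, 0, 320)))
# -> (170, 210): the thresholds track the current road brightness
```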
- The pixels within the threshold range detected in the processing of step S142 may simply be treated as the pixels corresponding to the obstructing object as they are; in that case, the processing of steps S143 to S146 described later may be omitted.
- In step S143, the obstructing object detection unit 162 checks the mode set in the monitoring device 100.
- The mode is set in advance by the user, for example, to select the detection method for the obstructing object, and is chosen appropriately according to how the snow falls, the characteristics of the imaging unit 101, and the like. If it is determined in step S143 that mode A is set, the process proceeds to step S144, and the obstructing object detection unit 162 executes the mode A processing.
- The details of the mode A processing in step S144 of FIG. 7 will now be described with reference to the flowchart of FIG. 11.
- In step S161, the obstructing object detection unit 162 determines whether or not pixels within the threshold range exist in all the regions. At this time, it is determined, for example, whether a pixel having a value within the threshold range exists in every one of the regions A to H described above with reference to FIG. 9.
- If it is determined in step S161 that pixels within the threshold range exist in all the regions, the process proceeds to step S162, and the obstructing object detection unit 162 sets the pixels within the threshold range as pixels of the obstructing object.
- A pixel having a value within the threshold range is a pixel of a relatively bright image with a high luminance value, and is considered to correspond to a white object. If such pixels exist not in just a part of the image but in all of the regions A to H of FIG. 9 (that is, they are spread widely), the image corresponding to those pixels is highly likely to be snow, so pixels having values within the threshold range are regarded as the obstructing object.
- If it is determined in step S161 that pixels within the threshold range do not exist in all the regions, the processing of step S162 is skipped. In this case, the pixels with high luminance values exist in only a part of the image; the corresponding image is likely to be, for example, a building, so the pixels having values within the threshold range are not set as the obstructing object.
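A compact sketch of the mode A test, assuming the eight regions A to H of FIG. 9 form a 2x4 grid and that a boolean candidate mask comes from the threshold detection of step S142; the function and parameter names are illustrative.

```python
import numpy as np

def mode_a_detect(mask: np.ndarray, grid: tuple = (2, 4)) -> bool:
    """Mode A: treat the candidates as an obstructing object only if
    every region of the grid contains at least one candidate pixel."""
    rows, cols = grid
    h, w = mask.shape
    for r in range(rows):
        for c in range(cols):
            region = mask[r * h // rows:(r + 1) * h // rows,
                          c * w // cols:(c + 1) * w // cols]
            if not region.any():
                return False   # candidates confined to part of the image
    return True

mask = np.zeros((240, 320), dtype=bool)
mask[::17, ::13] = True                # candidates spread over the frame
print(mode_a_detect(mask))             # True -> set the pixels as snow
```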
- With detection of the obstructing object by the mode A processing described above, when, for example, a white truck is running in parallel in front of the vehicle equipped with the monitoring device 100, it may be determined that high-luminance pixels exist in all the regions, and the truck may be erroneously set as an obstructing object (snow). In particular, when the imaging unit 101 is configured with a high-speed camera or the like, detection by the mode A processing alone may erroneously detect an obstructing object, so more reliable detection is needed. For this reason, when the imaging unit 101 is configured using a high-speed camera or the like, the mode B processing is executed instead of the mode A processing. That is, in step S143 of FIG. 7, it is determined that mode B is set, the process proceeds to step S145, and the mode B processing is executed.
- The processing in step S181 is the same as that in step S161 of FIG. 11, so a detailed description is omitted. If it is determined in step S181 that pixels within the threshold range exist in all the regions, the process proceeds to step S182.
- In step S182, the obstructing object detection unit 162 determines whether or not the state in which pixels within the threshold range exist in all the regions has continued for a predetermined number of frames (for example, several tens to several hundreds of frames). For example, as shown in FIG. 13, when snow appears in all the frames from the n-th frame to the (n+101)-th frame, it is determined in step S182 that the state has continued for the predetermined number of frames, and the process proceeds to step S183.
- If it is determined that the state in which pixels within the threshold range exist in all the regions has not continued for the predetermined number of frames, the processing of step S183 is skipped.
- The processing in step S183 is the same as that in step S162 of FIG. 11, so a detailed description is omitted.
- Since it is determined whether or not the state in which pixels within the threshold range exist in all the regions continues for a predetermined number of frames, even when the imaging unit 101 uses a high-speed camera or the like, it is possible to prevent a bright object (for example, a white truck) that temporarily blocks the field of view in front of the automobile on which the monitoring device 100 is mounted from being erroneously detected as an obstructing object.
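The mode B persistence check can be sketched as a small stateful detector; the class name is invented, and the demo uses a tiny frame count in place of the several tens to several hundreds of frames mentioned above.

```python
from collections import deque

class ModeBDetector:
    """Mode B: declare an obstructing object only when mode A's
    all-regions condition has held for N consecutive frames."""
    def __init__(self, required_frames: int = 100):
        self.required = required_frames
        self.history = deque(maxlen=required_frames)

    def update(self, all_regions_hit: bool) -> bool:
        self.history.append(all_regions_hit)
        return len(self.history) == self.required and all(self.history)

detector = ModeBDetector(required_frames=3)   # small N for the demo
for frame_result in [True, True, False, True, True, True]:
    print(detector.update(frame_result), end=" ")
# -> False False False False False True
```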
- FIG. 14 is a diagram showing a histogram of the pixels of an image in the case of heavy snowfall. The horizontal axis indicates the output value (pixel value), the vertical axis indicates the number of pixels, and the pixel distribution (histogram) is indicated by line 221.
- A peak of line 221 due to the obstructing object (snow) is formed in the center of the figure. When the snowfall is heavy, the output values of the pixels corresponding to snow are concentrated, and the peak of line 221 is highly likely to be formed within the threshold range (output values between threshold a and threshold b).
- FIG. 15 is a diagram showing a histogram of the pixels of an image in the case of light snowfall. The horizontal axis indicates the output value (pixel value), the vertical axis indicates the number of pixels, and the pixel distribution (histogram) is indicated by line 222.
- A peak of line 222 due to the low-luminance background is formed in the low-luminance portion on the left side of the figure, a peak due to the obstructing object (snow) is formed near the center, and a peak due to the high-luminance background is formed in the high-luminance portion.
- In this case, the threshold range is more likely to contain pixels of the high-luminance background image. When the outputs of the pixels corresponding to snow do not concentrate at a certain level in this way, the threshold range must be widened, and it becomes difficult to set an appropriate threshold (for example, threshold b) for distinguishing between the background and the obstructing object.
- In such a case, the mode C processing is used. That is, in step S143 of FIG. 7, it is determined that mode C is set, the process proceeds to step S146, and the mode C processing is executed.
- In steps S201 and S202, as in the mode B processing, it is determined whether pixels within the threshold range exist in all the regions and whether that state has continued for a predetermined number of frames. If so, the process proceeds to step S203, and the feature determination processing is executed.
- In step S221, the obstructing object detection unit 162 extracts, from the image, blocks composed of pixels within the threshold range.
- In step S222, the obstructing object detection unit 162 calculates the feature amount of each block extracted in step S221. At this time, for example, by applying a Laplacian transformation to the block of pixels, how close the shape of the block is to a particle shape is calculated as a numerical value. A reference value for determining shapes close to a particle shape is stored in advance in the obstructing object registration unit 165.
- It may also be determined whether the area corresponding to the block is equal to or less than a predetermined proportion of the entire image (that is, whether the size it occupies in the image is small). For example, based on the results of analyzing images taken in advance, a fixed value is set for the proportion of a snow particle to the entire image according to the angle of view of the lens 101a, and how close the proportion of the area of the block extracted in step S221 is to this preset value is calculated as a numerical value. Furthermore, how close the color of the pixel block is to the white color of snow may be calculated as a numerical value. The values, such as thresholds, necessary for calculating these numerical values are assumed to be stored in advance in the obstructing object registration unit 165.
- In step S223, the obstructing object detection unit 162 calculates the difference between the feature amount calculated in step S222 and the preset feature amount stored in the obstructing object registration unit 165, and determines whether or not the difference is equal to or less than a threshold value. This threshold is for judging the degree of coincidence between the feature amount of the pixel block of interest and the feature amount of the obstructing object; it is set in advance and stored in the obstructing object registration unit 165.
- If it is determined in step S223 that the difference between the feature amount calculated in step S222 and the preset feature amount stored in the obstructing object registration unit 165 is equal to or less than the threshold value, the block extracted in step S221 is considered close to the features of snow. The process therefore proceeds to step S224, and the obstructing object detection unit 162 sets the feature-amount-match flag, indicating a feature match, to ON for the block extracted in step S221.
- On the other hand, if it is determined in step S223 that the difference between the feature amount calculated in step S222 and the preset feature amount stored in the disturbing object registration unit 165 exceeds the threshold value, the block extracted in step S221 is considered not to have the features of snow, so the process proceeds to step S224, and the obstruction detection unit 162 sets the feature amount match flag to OFF for the block extracted in step S221.
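- Combining the feature values, the decision of steps S223 and S224 might look like the following sketch, with the comparison threshold standing in for the value stored in the registration unit 165.

```python
import numpy as np

MATCH_THRESHOLD = 0.3  # assumed; the actual value is stored in unit 165

def feature_match_flag(block_features, registered_features):
    """Steps S223/S224 in outline: the flag is ON (True) when the total
    difference between the block's feature amounts and the registered
    disturbing-object feature amounts is at or below the threshold."""
    diff = np.abs(np.asarray(block_features, dtype=float)
                  - np.asarray(registered_features, dtype=float)).sum()
    return diff <= MATCH_THRESHOLD

# Example: (particulate score, area-ratio score, whiteness score)
print(feature_match_flag((0.9, 0.95, 0.88), (1.0, 1.0, 1.0)))  # True -> snow
```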
- In step S204, the disturbing object detection unit 162 determines whether or not each block whose features were determined in step S203 matches the features of the disturbing object.
- Whether or not a block matches the features of the obstructing object is determined based on the feature amount match flag described above.
- If it is determined in step S204 that the block matches the features of the disturbing object, the process proceeds to step S205, and the disturbing object detection unit 162 sets the pixels corresponding to the block as the disturbing object. On the other hand, if it is determined in step S204 that the block does not match the features of the obstructing object, the processing in step S205 is skipped.
- Note that step S201 or S202 may be omitted, and the disturbing object may be detected based only on the result of the feature determination.
- Further, the obstructing object may be detected by a process different from the cases described above with reference to FIGS.
- For example, a user who is actually driving a car does not always need all the snow in the image to be removed; it may be possible to secure a sufficient field of view by removing only the snow lit up by the headlights in the image.
- In this case, the brightness of snow that significantly interferes with the field of view is identified in advance by analyzing images of snow lit up by the headlights, and a threshold value slightly higher than that brightness (for example, slightly higher than the threshold value a in FIG.) can be set so that all pixels with a luminance higher than the threshold value are detected as disturbing objects.
- That is, the disturbing object detection process of FIG. 7 may be simplified to a process in which, for example, pixels having a luminance equal to or higher than the threshold value are detected in step S142, and all detected pixels are set as disturbing objects.
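- A minimal sketch of this simplified detection, assuming a grayscale frame and an illustrative threshold value:

```python
import numpy as np

def detect_bright_snow(gray_frame, threshold=200):
    """Simplified detection: every pixel whose luminance is at or above the
    threshold (an assumed value) is marked as a disturbing-object pixel."""
    return gray_frame >= threshold  # boolean mask

frame = np.random.randint(0, 256, (480, 640), dtype=np.uint8)
mask = detect_bright_snow(frame)
print(int(mask.sum()), "pixels flagged as disturbing object")
```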
- In step S261, the disturbing object detection unit 162 acquires an image captured by the imaging unit 101 while illumination such as a headlight is on.
- In step S262, the obstruction object detection unit 162 acquires an image captured by the imaging unit 101 while illumination such as a headlight is off.
- At this time, the headlight may be controlled to turn on and off in accordance with the shooting timing. For example, if a headlight composed of LEDs (Light Emitting Diodes) is used, the LEDs repeatedly turn on and off at a predetermined interval, so it is not necessary to control the headlight separately if images are acquired from the imaging unit 101 in synchronization with that interval.
- In step S263, the disturbing object detection unit 162 compares the image acquired in step S261 with the image acquired in step S262: for example, after performing processing such as equalizing the average overall brightness of the two images in order to exclude the influence of the illumination being on or off, it calculates the difference between them and detects blocks of pixels whose difference exceeds a threshold value.
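- The comparison of steps S261 to S264 could be sketched as follows; the brightness equalization by mean scaling and the difference threshold are assumptions for illustration.

```python
import numpy as np

def detect_by_headlight_difference(img_on, img_off, diff_threshold=60.0):
    """Steps S261-S264 in outline, on grayscale frames:
    1. scale the headlight-off image so both frames share the same mean
       brightness (a simple stand-in for the equalization in the text),
    2. take the pixel-wise difference,
    3. flag pixels whose difference exceeds the threshold (assumed value);
    snow lit by the headlights stands out only in the lit frame."""
    img_on = img_on.astype(float)
    img_off = img_off.astype(float)
    equalized_off = img_off * (img_on.mean() / (img_off.mean() + 1e-6))
    return (img_on - equalized_off) > diff_threshold
```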
- FIGS. 19 and 20 are diagrams showing examples of images acquired in steps S261 and S262.
- It is assumed that, in step S261, an image as shown in FIG. 19 is acquired as the image captured by the imaging unit 101 while illumination such as a headlight is on, and that, in step S262, an image as shown in FIG. 20 is acquired as the image captured by the imaging unit 101 while the illumination is off.
- In FIG. 19, the snow lit up by the headlights is clearly visible throughout the image.
- In FIG. 20, since the snow is not lit up by the headlights, objects such as oncoming cars, street lights, and pedestrians are displayed clearly in comparison with the case of FIG. 19.
- For example, if all the pixel values (luminance values) of the image in FIG. 20 are uniformly raised so that the average overall brightness becomes the same in the image in FIG. 19 and the image in FIG. 20, and the comparison is then performed by calculating the difference in pixel values, the block of pixels corresponding to the snow in FIG. 19 is detected as a significant difference (for example, a difference exceeding the threshold).
- When the headlight is switched between on and off, the amount of light radiated onto the subject (the area in front of the car) varies greatly.
- With a camera using an imaging element with a narrow dynamic range, such as a CCD, the bright parts of the subject would be blown out to white in one image, and the dark parts of the subject would be crushed to black in the other.
- In contrast, the imaging unit 101 using the HDRC imaging control unit 121 can output an image signal without adjusting the amount of incident light by means of the aperture, the shutter speed, and the like.
- That is, since clipping, in which the pixel value corresponding to a bright part of the subject is fixed at the maximum pixel value that the image sensor can output, or the pixel value corresponding to a dark part of the subject is fixed at the minimum pixel value, does not occur, detailed changes in the brightness of the subject can be captured faithfully. As a result, the snow pixels, which are lit by the headlight and have significantly high luminance in the image of FIG. 19, can be detected as a significant difference from the image of FIG. 20.
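- The following toy calculation illustrates the point with simplified response curves (stand-ins, not the HDRC's actual characteristics): a clipped linear sensor maps two very bright radiances to the same output, while a logarithmic response keeps them apart.

```python
import numpy as np

def linear_sensor(radiance, full_scale=1000.0):
    """Narrow-dynamic-range sensor: linear response, clipped at full scale."""
    return float(np.clip(radiance / full_scale * 255.0, 0.0, 255.0))

def log_sensor(radiance):
    """Simplified logarithmic response spanning about six decades."""
    return 255.0 * np.log10(1.0 + radiance) / np.log10(1.0 + 1e6)

# Two differently bright points, both beyond the linear sensor's range:
bright_a, bright_b = 5000.0, 50000.0
print(linear_sensor(bright_b) - linear_sensor(bright_a))      # 0.0 (clipped)
print(round(log_sensor(bright_b) - log_sensor(bright_a), 1))  # 42.5 (kept)
```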
- In step S264, the disturbing object detection unit 162 sets the blocks detected by the process of step S263 (that is, the blocks of pixels corresponding to the snow in FIG. 19) as the disturbing object.
- In this way, the driver can be prevented from dangerously turning off the headlights in an attempt to improve the forward field of view.
- In the dark, the driver may feel that the snow lit up by the headlights is dazzling while the headlights are on, even though the road ahead is visible thanks to the brightness of the sky and surrounding lighting. This is especially true when the surroundings are dim, such as at dusk, and when the snowfall is heavy and there are many snow grains. In such cases, turning off the headlights would improve forward visibility, but the driver should be warned not to turn them off.
- For example, it is also possible for the control unit 102 to output to the speaker in the car an audio message that conveys the danger, such as "It is dangerous to turn off the headlights because the surroundings are dark. Please look at the image on the display 103 instead," thereby prompting the driver to keep the headlights on.
- Alternatively, of the image data output from the imaging unit 101, the control unit 102 may display on the display unit 103 only the images captured while the headlight is off, without removing the snow, excluding the images captured at the moments when the headlight is on.
- Whether or not to remove the obstructing object may be selected by the driver each time, or, if the brightness of the disturbing object does not differ significantly from the surrounding brightness while the headlight is off, the image may be displayed without automatically removing the disturbing object.
- Various methods for detecting the obstructing object have been described above. The pixels corresponding to the obstructing object detected by the processing described above with reference to FIG. 7 and the like are each specified individually by pixel-specific information, for example a two-dimensional coordinate value in the image, and the information on the specified pixels is output to the movement state control unit 163 or the obstructing object removal processing unit 164.
- Next, the details of the obstructing object removal process in step S104 in FIG. 5 will be described with reference to the flowchart in FIG.
- In step S301, the disturbing object removal processing unit 164 acquires an image of a frame temporally prior to the frame of the image to be corrected.
- In step S302, in the image of the previous frame acquired in step S301, the obstruction object detection unit 162 detects the portion (block) corresponding to the block of pixels set as the obstructing object in the image of the frame to be corrected, as the portion to be replaced.
- In step S303, the disturbing object removal processing unit 164 replaces the block of pixels set as the disturbing object in the image of the frame to be corrected with the block of pixels detected in the process of step S302.
- For example, suppose that the frame of the image to be corrected is the nth frame as shown in FIG. 23, and that the pixels corresponding to the obstructing object (snow) in this image form a block composed of the pixels surrounding the pixel (x1, y1). Here, (x1, y1) represents the x-axis and y-axis coordinates in the image.
- In step S301, for example, an image of a frame as shown in FIG. 24 is acquired as a frame temporally prior to the nth frame.
- In step S302, the portion of the image in FIG. 24 corresponding to the block of pixels set as the obstruction in the image of the frame to be corrected (FIG. 23), that is, the block centered on the pixel (x1, y1), is detected as the part to be replaced. Note that it is confirmed in advance that the block centered on the pixel (x1, y1) in FIG. 24 does not contain snow before it is detected as the part to be replaced.
- Then, in step S303, the snow in FIG. 23 is replaced with the block centered on the pixel (x1, y1) in FIG. 24 and thereby removed.
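- For the stationary case, steps S301 to S303 thus reduce to copying a same-position block from the previous frame, as in the sketch below; the block size is an assumed value, and the block is assumed to lie inside the image.

```python
import numpy as np

def replace_block(current, previous, x1, y1, half=4):
    """Steps S301-S303 for the stationary case: overwrite the block of
    (2*half+1)^2 pixels centered on (x1, y1) in the frame to be corrected
    with the same-position block of the previous frame. Assumes the block
    lies inside the image and was confirmed snow-free in the previous
    frame (as the description requires)."""
    corrected = current.copy()
    corrected[y1-half:y1+half+1, x1-half:x1+half+1] = \
        previous[y1-half:y1+half+1, x1-half:x1+half+1]
    return corrected
```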
- When the automobile is moving (running), the movement state control unit 163 detects the part to be replaced while taking the movement of the image into account. For example, when the car is moving forward, an image as shown in FIG. 26 is captured as the nth frame, and an image as shown in FIG. 27 is then captured as the (n+10)th frame. Since the car is moving forward, the objects displayed near the center of the vertical axis in FIG. 26 (for example, the trees on both sides of the road) come closer as the car moves, and in FIG. 27 they are displayed slightly lower on the vertical axis than in FIG. 26.
- Suppose that the frame of the image to be corrected is the (n+10)th frame of FIG. 27, and that the image of the previous frame acquired in step S301 is the nth frame of FIG. 26.
- In this case, the pixel (x11, y11) set as the disturbing object in FIG. 27 cannot simply be replaced with the pixel at the same position (x11, y11) in the image of FIG. 26.
- Therefore, the movement state control unit 163 extracts a predetermined block from the image and calculates a motion vector, thereby detecting that the pixel (x21, y21) of the image in FIG. 26 corresponds to the pixel (x11, y11) of the image in FIG. 27, and notifies the obstructing object removal processing unit 164 of the result.
- As a result, in step S303, as shown in FIG. 28, the block centered on the pixel (x11, y11) set as the obstructing object in FIG. 27 is replaced with the block centered on the pixel (x21, y21) in FIG. 26.
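- The motion-vector detection could be realized, for example, by sum-of-absolute-differences block matching, as sketched below. The search range and block size are assumptions; a real implementation would also exclude the snow pixels themselves from the comparison, which this sketch omits for brevity.

```python
import numpy as np

def find_source_block(prev, curr, x11, y11, half=4, search=16):
    """Sum-of-absolute-differences block matching: find the pixel (x21, y21)
    in the previous frame whose surrounding block best matches the block
    around (x11, y11) in the current frame. Assumes (x11, y11) lies at
    least `half` pixels inside the image."""
    template = curr[y11-half:y11+half+1, x11-half:x11+half+1].astype(float)
    best_sad, best_xy = np.inf, (x11, y11)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            y, x = y11 + dy, x11 + dx
            if y - half < 0 or x - half < 0:
                continue  # candidate block falls off the top/left edge
            cand = prev[y-half:y+half+1, x-half:x+half+1].astype(float)
            if cand.shape != template.shape:
                continue  # candidate block falls off the bottom/right edge
            sad = float(np.abs(cand - template).sum())
            if sad < best_sad:
                best_sad, best_xy = sad, (x, y)
    return best_xy  # (x21, y21): center of the replacement block in `prev`
```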
- In step S304, the obstructing object removal processing unit 164 generates a corrected image signal based on the image in which the blocks have been replaced, and outputs it to the display unit 103.
- In this way, the snow as an obstructing object is removed from the image shown in FIG. 19, and a corrected image as shown in FIG. 21 is displayed. That is, an image (FIG. 21) with the snow removed from the image shown in FIG. 19 is virtually generated.
- the monitoring device 100 may be installed in a snowy place such as a ski resort.
- When the monitoring device 100 is installed at a ski resort or the like, the monitoring device 100 does not move, so the movement state control unit 163 may be omitted.
- Further, the imaging control unit 121 of the imaging unit 101 can also be configured with a CCD imaging device, a CMOS imaging device, or the like. That is, the monitoring device 100 can be configured without using a logarithmic conversion type imaging device such as the HDRC. In this case as well, whether or not each pixel corresponds to an obstructing object is determined by the obstructing object detection process described above with reference to FIG. 7.
- FIG. 29 is a block diagram showing another configuration example of the monitoring apparatus to which the present invention is applied.
- In the monitoring apparatus 200 in the figure, blocks denoted by the same reference numerals as those of the monitoring apparatus 100 in FIG. 1 are the same as in FIG. 1, and detailed description thereof is omitted.
- the imaging unit 101-1 and the imaging unit 101-2 are provided as imaging units.
- The imaging unit 101-1 and the imaging unit 101-2 are attached to the front grille of the automobile at the same height from the ground, at positions separated from each other by a predetermined horizontal interval.
- The imaging unit 101-1 and the imaging unit 101-2 are attached so that the image corresponding to the light incident through the lens 101-1a of the imaging unit 101-1 and the image corresponding to the light incident through the lens 101-2a of the imaging unit 101-2 have parallax. Note that, as long as the images captured by the imaging unit 101-1 and the imaging unit 101-2 can have an appropriate parallax, the two imaging units may be installed at positions different from those described above.
- In the example described above, an image of a frame temporally prior to the frame of the image to be corrected is acquired, and the obstructing object is removed using a block of that temporally previous frame.
- In that example, the movement state control unit 163 detects the block to be used (the part to be replaced) in the temporally previous frame while taking the movement of the image into account. However, if the car is driving on a winding road with sharp curves, for example, the direction of the car changes greatly over time, and the image captured by the imaging unit 101 changes greatly in a relatively short time.
- In such a case, the image of the frame a predetermined time before the frame of the image to be corrected may, for example, include a subject different from that of the image of the frame to be corrected and may no longer be a similar image (one that gives the viewer the same impression), so it may not be appropriate to remove the obstructing object by replacement with a block of pixels of the temporally previous frame.
- In the monitoring apparatus 200, two different images (having parallax) taken by the two imaging units are acquired at the same time, so that an image captured by one imaging unit can be corrected based on the image captured by the other imaging unit. In this way, the obstructing object can be removed appropriately even when the vehicle is traveling on a winding road, for example.
- FIG. 30 shows an example of the obstruction removal process in which the monitoring device 200 corrects an image captured by one imaging unit using an image captured by the other imaging unit at the same timing. That is, FIG. 30 is a flowchart illustrating another example of the obstructing object removal process, executed by the monitoring apparatus 200 described above.
- Here, it is assumed that, in the monitoring apparatus 200, the image captured by the imaging unit 101-1 is mainly displayed on the display unit 103.
- In step S361 in the figure, the obstructing object removal processing unit 164 acquires an image captured by the other imaging unit (in this case, the imaging unit 101-2). Note that this image is captured by the imaging unit 101-2 at the same timing as the image captured by the imaging unit 101-1 (the image to be corrected).
- In step S362, the disturbing object detection unit 162 detects, in the image obtained by the process of step S361, the portion (block) corresponding to the block of pixels set as the obstructing object, as the portion to be replaced.
- The image acquired in step S361 is captured at the same timing as the image to be corrected, and is an image having parallax with respect to the image to be corrected.
- That is, the image acquired in the process of step S361 includes the same objects as the image to be corrected and gives the observer almost the same impression, but shows those objects at slightly different positions (coordinates). In other words, when removing a sufficiently small obstructing object such as falling snow, it is very unlikely that snow also appears, in the image captured by the imaging unit 101-2, at the same coordinate position as the part where snow appears in the image to be corrected captured by the imaging unit 101-1. It is likewise considered very unlikely that some other object appears in the image acquired in step S361 in the vicinity of the part where the snow appears in the image to be corrected.
- Therefore, assuming that the portion where snow appears in the image to be corrected is a block composed of the pixels surrounding the pixel (x1, y1), a natural image in which only the snow, which is the obstructing object, has been removed from the image to be corrected can be generated by replacing the sufficiently small block of surrounding pixels centered on the pixel (x1, y1) in the image to be corrected with the block of the same area composed of the surrounding pixels centered on the pixel (x1, y1) in the image acquired in step S361.
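- A sketch of this same-coordinate replacement between the two views (steps S361 to S363); the block size is an assumed value.

```python
import numpy as np

def replace_from_other_camera(img_main, img_other, snow_mask, half=3):
    """Steps S361-S363 in outline: for each pixel flagged as snow in the
    image from imaging unit 101-1, copy the block of the same coordinates
    and the same area from the simultaneous image of imaging unit 101-2.
    Parallax makes it very unlikely that snow occupies the same position
    in both views, so the copied block shows the unobstructed background."""
    corrected = img_main.copy()
    ys, xs = np.nonzero(snow_mask)
    for y, x in zip(ys, xs):
        y0, y1 = max(0, y - half), y + half + 1
        x0, x1 = max(0, x - half), x + half + 1
        corrected[y0:y1, x0:x1] = img_other[y0:y1, x0:x1]
    return corrected
```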
- In step S363, the block of the image corresponding to the pixels of the disturbing object is replaced as described above.
- In step S364, a corrected image from which the disturbing object has been removed by the process of step S363 is generated.
- A CPU (Central Processing Unit) 501 executes various processes according to programs stored in a ROM (Read Only Memory) 502 or programs loaded from a storage unit 508 into a RAM (Random Access Memory) 503.
- the RAM 503 also appropriately stores data necessary for the CPU 501 to execute various processes.
- CPU 501, ROM 502, and RAM 503 are connected to each other via a bus 504.
- An input/output interface 505 is also connected to the bus 504.
- Connected to the input/output interface 505 are an input unit 506 including a keyboard and a mouse, an output unit 507 including a display such as a CRT (Cathode Ray Tube) or an LCD (Liquid Crystal Display) and a speaker, a storage unit 508 including a hard disk, and a communication unit 509 including a network interface card such as a modem or a LAN card.
- The communication unit 509 performs communication processing via networks including the Internet.
- A drive 510 is also connected to the input/output interface 505 as necessary, a removable medium 511 such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory is mounted on it as appropriate, and a computer program read from the medium is installed in the storage unit 508 as necessary.
- This recording medium is constituted not only by the removable medium 511, which is distributed separately from the apparatus main body shown in FIG. in order to deliver the program to the user and on which the program is recorded, such as a magnetic disk (including a floppy disk (registered trademark)), an optical disk (including a CD-ROM (Compact Disc - Read Only Memory) and a DVD (Digital Versatile Disc)), a magneto-optical disk (including an MD (Mini-Disc) (registered trademark)), or a semiconductor memory, but also by the ROM 502 on which the program is recorded and the hard disk included in the storage unit 508, which are delivered to the user pre-installed in the apparatus main body.
- Note that, in this specification, the steps describing the series of processing include not only processing performed in time series in the described order, but also processing that is not necessarily performed in time series and is executed in parallel or individually.