WO2004008743A1 - Imaging System - Google Patents
Imaging System
- Publication number
- WO2004008743A1, PCT/JP2003/008778
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- signal
- image processing
- imaging
- field
- Prior art date
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R1/00—Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
- B60R1/20—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
- B60R1/30—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles providing vision in the non-visible spectrum, e.g. night or infrared vision
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R1/00—Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
- B60R1/20—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
- B60R1/22—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle
- B60R1/23—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view
- B60R1/24—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view in front of the vehicle
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/20—Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from infrared radiation only
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/70—Circuitry for compensating brightness variation in the scene
- H04N23/73—Circuitry for compensating brightness variation in the scene by influencing the exposure time
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/30—Transforming light or analogous information into electric information
- H04N5/33—Transforming infrared radiation
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/10—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used
- B60R2300/103—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used using camera systems provided with artificial illumination device, e.g. IR light source
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/10—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used
- B60R2300/106—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used using night vision cameras
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/20—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of display used
- B60R2300/205—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of display used using a head-up display
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/30—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/80—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement
- B60R2300/8053—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement for bad weather conditions or night vision
Definitions
- the present invention relates to an imaging system using a CCD camera or the like.
- a CCD camera 101 is provided as an image pickup means, and a DSP (Digital Signal Processor) 103 and a CPU 105 are provided as an image processing unit.
- the CPU 105 and the DSP 103 are connected via a multiplex circuit 107, and a signal from a shutter speed setting switch 109 is input to the CPU 105.
- the shutter speed setting switch 109 allows the user to set the shutter speed for the ODD (odd) field and the shutter speed for the EVEN (even) field, respectively.
- the CPU 105 reads the setting state of the shutter speed setting switch 109 and outputs the shutter speed setting value for each field.
- the field pulse signal shown in FIG. 17 is output from the DSP 103; when the signal is high, the EVEN-side shutter speed setting value is output, and when it is low, the ODD-side setting value is output.
- the selected value is fed to the shutter speed setting input terminal of the DSP 103 by the multiplex circuit 107. Therefore, a different shutter speed can be set for each field by the imaging system shown in FIG. 16.
- FIG. 18 is an image obtained by irradiating infrared light forward with an IR lamp, which is an infrared light irradiating means, while the vehicle is driving at night, and capturing an image of the front of the vehicle with a CCD camera mounted on the vehicle.
- the area around bright light sources, such as the headlamps of oncoming vehicles or gas station lighting, becomes invisible due to blooming. This is because the automatic shutter speed control averages the brightness of the entire screen. Blooming (halation) can be reduced by increasing the shutter speed, but in that case the background becomes completely invisible, as shown in FIG. 19.
- the control of FIG. 16 for changing the shutter speed for each field is so-called double exposure control, in which a different shutter speed is set for each field.
- each field image is output alternately and can be displayed on a monitor as a clear image as shown in FIG.
- This imaging device has a camera unit 113 equipped with an imaging device 111, and a processing unit 115.
- FIG. 22 is a conceptual diagram of image processing by the image pickup device of FIG. 21.
- a through image in the figure refers to a direct output of the image pickup device 111 of the camera unit 113.
- the memory image refers to the signal of the immediately preceding field once stored in the image memory 117.
- the main subject at the time of light emission is crushed to black in each ODD field set at a high shutter speed, and the background is overexposed in each EVEN field set at a slow shutter speed.
- since the memory image is delayed by one field period, overexposure and underexposure occur in a different field from the through image. Therefore, the output image at the bottom of FIG. 22 can be obtained by appropriately combining the through image and the memory image.
- the synthesis of the through image and the memory image superimposes partially selected regions of each, so images with different exposure amounts are joined. Therefore, while the whole-screen flicker seen in simple double exposure control is eliminated, there is a problem that the boundary between the through image and the memory image becomes unnatural.
Disclosure of the Invention
- An object of the present invention is to provide an imaging system capable of outputting a clearer image.
- This object is achieved by an imaging system comprising an infrared light irradiating unit for irradiating infrared light, an imaging unit for imaging a place illuminated by the infrared light irradiating unit and converting the image into an electric signal, and an image processing unit that changes the signal accumulation time of the imaging unit at a predetermined cycle and continuously and periodically outputs images with different exposure amounts, wherein the image processing unit expands the images with different exposure amounts in the vertical direction and, after expansion, averages the signal levels of both images to form a combined image.
- infrared light can be irradiated by the infrared light irradiation means.
- the imaging means can image the place illuminated by the infrared light irradiating means and convert it into an electric signal.
- the signal accumulation time of the image pickup means can be changed at a predetermined cycle, and images having different exposure amounts can be output continuously and periodically.
- the image processing unit can expand the images having different exposure amounts in the vertical direction, and average the signal levels of both images after the expansion to form a combined image.
- the double exposure control can show both dark, otherwise invisible parts in the bright image and parts hidden by blooming (halation) in the dark image, while the boundary and flicker caused by the difference in exposure amount can be suppressed on the output image, so a clearer image can be output.
- the image processing unit performs the expansion by inserting, between vertically adjacent pixels, the average value of their signal levels.
- since the image processing section performs the expansion by inserting the average of the signal levels of vertically adjacent pixels between them, the image can be expanded naturally and a clearer image can be output.
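As a minimal sketch of this expansion (illustrative Python; the function name and the handling of the last row are assumptions, since the patent gives no code), a field image is taken as a list of pixel rows and an averaged row is inserted after each original row:

```python
def expand_vertically(field_rows):
    """Double a field image's height: after each row, insert a row whose
    pixels are the average of the vertically adjacent pixels above and
    below it. The last row is repeated, having no lower neighbor."""
    expanded = []
    for i, row in enumerate(field_rows):
        expanded.append(row)
        if i + 1 < len(field_rows):
            below = field_rows[i + 1]
            expanded.append([(a + b) // 2 for a, b in zip(row, below)])
        else:
            expanded.append(row[:])  # repeat the final row
    return expanded

odd_field = [[100, 120], [140, 160]]   # a 2-line field image
print(expand_vertically(odd_field))
# [[100, 120], [120, 140], [140, 160], [140, 160]]
```

Because each inserted row interpolates its neighbors, the doubled image avoids the staircase look of simply duplicating lines.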
- the image processing unit presets a target value of the exposure amount and controls the signal accumulation time according to the target value.
- the image processing unit can set the target value of the exposure amount in advance and control the signal accumulation time according to the target value. Therefore, a dark area can be projected more brightly and strong incident light can be suppressed, so blooming (halation) is further suppressed and a clearer image can be output. Further, the image processing unit integrates the electric signal of the imaging means and controls the signal accumulation time by comparing the integrated electric signal with a reference value set in advance according to the target value.
- the image processing unit can control the signal accumulation time by integrating the electric signal of the imaging unit and comparing the integrated electric signal with a reference value set in advance according to the target value. Therefore, the signal accumulation time can be controlled more accurately, and a clearer image can be output.
- the image processing unit may control the signal accumulation time by comparing the number of pixels whose electric signal from the imaging unit exceeds a reference value with a reference pixel number set in advance according to the target value.
- since the image processing unit can control the signal accumulation time by comparing the number of pixels whose electric signal exceeds a reference value with a reference pixel number preset according to the target value, the signal accumulation time can be controlled more accurately and a clearer image can be output.
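A hedged sketch of the two accumulation-time control strategies just described — comparing either an integrated signal or an over-threshold pixel count against a reference preset from the target value. The function names, step-wise speed table, and thresholds are illustrative assumptions, not values from the patent:

```python
SPEEDS = [1 / 60, 1 / 250, 1 / 1000]  # slowest to fastest accumulation times, in seconds

def step_faster(idx):
    return min(idx + 1, len(SPEEDS) - 1)

def step_slower(idx):
    return max(idx - 1, 0)

def adjust_by_integration(pixels, ref_value, idx):
    """Integrate the field's electric signal and compare it with a
    reference value preset from the exposure target."""
    integrated = sum(pixels)
    if integrated > ref_value:
        return step_faster(idx)   # scene too bright: shorten accumulation
    if integrated < ref_value:
        return step_slower(idx)   # scene too dark: lengthen accumulation
    return idx

def adjust_by_peak_count(pixels, threshold, ref_count, idx):
    """Count pixels exceeding a signal threshold and compare that count
    with a reference pixel number preset from the target value."""
    bright = sum(1 for v in pixels if v > threshold)
    if bright > ref_count:
        return step_faster(idx)
    if bright < ref_count:
        return step_slower(idx)
    return idx
```

Either evaluation yields a per-field decision, so the ODD and EVEN fields can converge on different shutter speeds under the same control loop.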
- the imaging system of the present invention is characterized in that the infrared light irradiating unit, the imaging unit, and the image processing unit are provided in a vehicle, the infrared light irradiating means irradiates infrared light to the outside of the vehicle, and the imaging means captures an image of the outside of the vehicle.
- since the infrared light irradiating means, the imaging means, and the image processing unit are provided in the vehicle, infrared light can be irradiated to the outside of the vehicle and images of the outside can be taken. Therefore, while suppressing blooming (halation) due to the headlamps of oncoming vehicles, dark areas can be projected brightly and clearly, and the outside of the vehicle can be confirmed through a clear image output.
- FIG. 1 is a conceptual diagram of an automobile to which a first embodiment of the present invention is applied.
- FIG. 2 is a block diagram of an imaging unit and an image processing unit according to the first embodiment of the present invention.
- FIG. 3 is a flowchart of the first embodiment of the present invention.
- FIG. 4 is an output image diagram of simple double exposure control according to the first embodiment of the present invention.
- FIG. 5 relates to the first embodiment of the present invention, wherein (a) is a divided image diagram of the ODD field, and (b) is a divided image diagram of the EVEN field.
- FIG. 6 relates to the first embodiment of the present invention, wherein (a) is an expanded image of an ODD field, and (b) is an expanded image of an EVEN field.
- FIG. 7 relates to the first embodiment of the present invention, wherein (a) is an image quality adjustment diagram of the ODD field, and (b) is an image quality adjustment diagram of the EVEN field.
- FIG. 8 is a diagram of an average combined image according to the first embodiment of the present invention.
- FIG. 9 is a block diagram of an imaging unit and an image processing unit according to the second embodiment of the present invention.
- FIG. 10 is a timing chart of a process according to the second embodiment of the present invention.
- FIG. 11 is an explanatory diagram showing an example of a CCD area according to the second embodiment of the present invention.
- FIG. 12 is a flowchart according to the second embodiment of the present invention.
- FIG. 13 is a flowchart showing switching of the shutter speed in the integrated average value detection method according to the second embodiment of the present invention.
- FIG. 14 is a flowchart showing shutter speed switching in the peak value detection method according to the second embodiment of the present invention.
- FIG. 15 relates to the second embodiment of the present invention, in which (a) is a divided image diagram of an ODD field, and (b) is a divided image diagram of an EVEN field.
- FIG. 16 is a block diagram according to a conventional example.
- FIG. 17 is an output diagram of a field pulse according to the conventional example.
- FIG. 18 is an output image diagram at a normal shutter speed according to the conventional example.
- FIG. 19 is an output image diagram at a high shutter speed according to the conventional example.
- Fig. 20 is an output image showing the blooming (halation) phenomenon.
- FIG. 21 is a block diagram according to another conventional example.
- FIG. 22 is an image forming diagram according to another conventional example.
BEST MODE FOR CARRYING OUT THE INVENTION
- FIG. 1 is a conceptual diagram of an automobile to which the first embodiment of the present invention is applied
- FIG. 2 is a block diagram of an imaging system according to the first embodiment
- FIG. 3 is a flowchart according to the first embodiment.
- FIG. 4 relates to the first embodiment, and is an output image diagram by simple double exposure control
- FIG. 5 is a field division image
- (a) is a division image diagram of an ODD field
- (b) is an EVEN field
- Fig. 6 shows a field decompressed image
- (a) is an ODD field decompressed image
- (b) is an EVEN field decompressed image
- Fig. 7 is a field image quality adjustment image
- (a) is an image quality adjustment image of the ODD field
- (b) is an image quality adjustment image of the EVEN field
- Fig. 8 is an average composite image.
- an imaging system according to a first embodiment of the present invention is applied to an automobile.
- An automobile 1 is provided with an IR lamp 3 as an infrared light irradiating unit, a CCD camera 5 as an imaging unit, an image processing unit 7 as an image processing section, and a head-up display 9.
- the IR lamp 3 is automatically turned on to enable imaging in dark places such as at night, and irradiates infrared light ahead of the automobile 1 in its traveling direction.
- the CCD camera 5 captures an image of the front of the vehicle 1 in the traveling direction irradiated with the infrared light, and converts the image into an electric signal.
- the electric signal in this case is a signal converted by the photodiode of the photosensitive section in the CCD camera 5.
- the image processing unit 7 changes the signal accumulation time of the CCD camera 5 at a predetermined cycle, and continuously and periodically outputs images having different exposure amounts.
- the signal accumulation time is the signal accumulation time of each pixel.
- Changing the signal accumulation time in a predetermined cycle means changing the number of pulses for discharging unnecessary charges accumulated in each pixel to change the resulting accumulation time.
- Outputting images with different exposure amounts continuously and periodically means that a shutter speed is set for each of the ODD and EVEN fields by the electronic shutter operation, and the image of each field, read at its shutter speed, is output continuously and alternately every 1/60 second.
- the image processing unit 7 expands the images having different exposure amounts in the vertical direction, and averages the signal levels of the expanded images to form a combined image.
- Expanding the images with different exposure amounts in the vertical direction means that the divided image of the ODD field and the divided image of the EVEN field, obtained as images with different exposure amounts by changing the shutter speed in this embodiment, are each stretched twofold vertically. Averaging the signal levels of both images after expansion to form a composite image means that the signal levels of corresponding pixels in the divided images of each field are averaged to output one image.
- the image processing unit 7 includes, in addition to the CPU 11 and the DSP 13, an image memory 15, an operation memory 17, a video output memory 19, a D/A converter 21, and the like.
- the CPU 11 performs various operations and can control the shutter speed for each of the ODD field and the EVEN field by a configuration similar to that described with reference to FIG. 16; that is, a shutter speed control signal is sent from the CPU 11 to the DSP 13.
- the DSP 13 converts a signal from the CCD camera 5 into a digital signal and processes the digital signal.
- the image memory 15 stores the image data for one frame output from the DSP 13.
- the CPU 11 divides the frame image data captured in the image memory 15 into ODD fields and EVEN fields, and writes the divided data into the operation memory 17.
- the CPU 11 expands each field image written in the arithmetic memory 17 twice in the vertical direction, and performs image quality adjustment such as gamma correction and contrast adjustment on each expanded screen.
- these two sets of image data are averaged, and the averaged image data is transferred to the video output memory 19, D/A-converted by the D/A converter 21, and output as, for example, an NTSC signal.
- FIG. 3 shows a flowchart of the first embodiment.
- the imaging system according to the first embodiment is also basically a double exposure control, and the process of “initial setting of shutter speed” is first executed in step S1 according to the flowchart of FIG.
- in step S1, for example, the ODD field side is set to a low shutter speed and the EVEN field side is set to a high shutter speed, as described above.
- here, the shutter speed on the ODD field side is set to 1/60 second and the shutter speed on the EVEN field side is set to 1/1000 second, and the process proceeds to step S2. It is also possible to select other values for each shutter speed.
- the ODD field can be set to a high shutter speed, and the EVEN field can be set to a low shutter speed.
- in step S2, the process of “CCD imaging” is executed.
- the CPU 11 outputs the ODD-field-side and EVEN-field-side shutter speed control signals set in step S1 to the DSP 13.
- imaging is performed by the CCD camera 5 according to the drive signal, and signal charges are accumulated in all pixels of the photodiodes in the photosensitive section of the CCD camera 5.
- the signal charges of the odd-numbered lines of the photodiodes in the photosensitive section are read out with an accumulation time of 1/60 second.
- the signal charges of the even-numbered lines are read out with an accumulation time of 1/1000 second, and the flow proceeds to step S3.
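As an illustrative model (not from the patent text) of why the two accumulation times yield complementary fields, each pixel's accumulated signal can be approximated as radiance integrated over the accumulation time and clipped at saturation; the radiance values and full-well level below are arbitrary assumptions:

```python
def simulate_field_capture(scene_radiance, shutter_denom, full_well=255):
    """Approximate each pixel's accumulated signal as radiance divided by
    the shutter denominator (an exposure of 1/shutter_denom second),
    clipped at the sensor's full-well level (saturation)."""
    return [min(r // shutter_denom, full_well) for r in scene_radiance]

scene = [600, 60000, 150000]                # dark area, lit area, headlamp glare
odd = simulate_field_capture(scene, 60)     # 1/60 s: dark area visible, glare saturates
even = simulate_field_capture(scene, 1000)  # 1/1000 s: glare stays on-scale
print(odd, even)  # [10, 255, 255] [0, 60, 150]
```

The slow field preserves the dark region but blooms at the glare, while the fast field keeps the glare on-scale but crushes the dark region — exactly the pair the later averaging step combines.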
- in step S3, “DSP processing” is executed.
- in step S3, the signal charges read out by the CCD camera 5 are taken in, converted into digital signals by the A/D converter, subjected to signal processing, and output, and the process proceeds to step S4.
- in step S4, the process of “memory storage” is executed: the processed signal output from the DSP 13 is stored in the image memory 15, and the process proceeds to step S5.
- in step S5, the process of “whether or not one-frame capture is completed” is executed, and it is determined whether one frame of the processed signal output from the DSP 13 has been captured into the image memory 15. If one frame has not yet been captured, the process returns to step S2, and the processes of steps S3, S4, and S5 are repeated. When it is determined in step S5 that the capture of one frame of the processed signal is complete, the process proceeds to step S6.
- in step S6, the process of “writing to field-division operation memory” is executed.
- the frame image data fetched into the image memory 15 is divided by the CPU 11 into the ODD field and the EVEN field, the divided data is written into the operation memory 17, and the process proceeds to step S7.
- since the image data for each of the ODD field and the EVEN field written in the arithmetic memory 17 contains only every other line in the vertical direction, it is image data compressed by a factor of two vertically.
- in step S7, the process of “double expansion” is executed, and the image data of the ODD field and the EVEN field written in the operation memory 17 are expanded twofold in the vertical direction.
- the expansion method is either to stretch each pixel of a field into two vertically adjacent pixels, or to take the average of the signal levels of two vertically adjacent pixels and insert it between them.
- in step S8, the process of “gamma correction and contrast adjustment” is performed: image quality adjustment such as gamma correction and contrast adjustment is applied to each screen expanded in step S7, and the process proceeds to step S9.
- in step S9, the “two-screen averaging” process is performed, and the average of the ODD-field and EVEN-field image data expanded twofold in the vertical direction is calculated.
- the signal levels of the corresponding pixels in the ODD field and the EVEN field are averaged by simple averaging, and a new frame of image data is formed from the averaged signal levels.
- in other words, image formation is performed by synthesizing the image data of each field after the double expansion, and the process proceeds to step S10.
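The step-S9 two-screen averaging can be sketched as follows (illustrative Python, assuming both expanded field images are equally sized lists of pixel rows; the sample values are made up):

```python
def average_two_screens(odd_img, even_img):
    """Form the composite frame by simple averaging of the signal levels
    of corresponding pixels in the two expanded field images."""
    return [[(a + b) // 2 for a, b in zip(row_o, row_e)]
            for row_o, row_e in zip(odd_img, even_img)]

odd_img = [[10, 255], [12, 255]]   # slow-shutter field: glare saturated
even_img = [[0, 150], [1, 148]]    # fast-shutter field: dark areas crushed
print(average_two_screens(odd_img, even_img))
# [[5, 202], [6, 201]]
```

Because every output pixel blends both exposures, the composite has no seam between differently exposed regions, unlike the through-image/memory-image joining of the conventional example.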
- in step S10, the process of “transfer to video output memory” is executed: the synthesized image data is transferred to the video output memory 19, and the process proceeds to step S11. Note that the above processing is not strictly time-divided; for example, output continues from the output memory even while capturing into the image memory, and while image data is being processed in the image memory, the image signal of the next frame is continuously captured.
- in step S11, the process of “D/A conversion, NTSC output” is executed, and the digital image data is converted into an analog signal by the D/A converter 21 and output as, for example, an NTSC signal.
- the signal output from the image processing unit 7 is output to the head-up display 9 shown in FIG.
- the head-up display 9 displays an image on the windshield, and the driver of the car 1 can accurately grasp the situation in front of the vehicle even in a dark place such as at night by checking the image.
- image data processing as shown in FIGS. 4 to 7 is performed by the processing according to the flowchart of FIG. 3, and an image as shown in FIG. 8 can be displayed on the head-up display 9.
- the image shown in FIG. 4 is image data for one frame which is taken into the image memory 15 by the double exposure control by the processing of the steps S1 to S5.
- the image data of FIG. 4 is divided by the field division in step S6 into the ODD-field image data of FIG. 5(a) and the EVEN-field image data of FIG. 5(b). In the ODD field, where the shutter speed was slowed, the bright parts are saturated and blown out while the dark parts can be seen clearly and sharply.
- the divided image data of FIG. 5 is expanded twofold in step S7 to obtain the ODD-field expanded image of FIG. 6(a) and the EVEN-field expanded image of FIG. 6(b).
- the gamma correction and contrast adjustment of step S8 are performed on each expanded image to obtain the image-quality-adjusted data of the ODD field in FIG. 7(a) and of the EVEN field in FIG. 7(b).
- by averaging the signal levels of the two expanded images through the two-screen averaging of step S9, a combined image is formed and an image output as shown in FIG. 8 is obtained.
- the output image of FIG. 8 is much clearer than the output image obtained by the simple double exposure control of FIG. 4: by properly suppressing the blooming (halation) caused by strong light such as the headlights of oncoming vehicles, not only can the information around the light source be seen, but dark areas can also be seen more clearly overall.
- the simple double exposure control of FIG. 4 merely outputs images with different exposure amounts continuously and periodically, so the output image flickers.
- in this embodiment, by contrast, the image is divided into the ODD field and the EVEN field and the combined image is formed by averaging the signal levels of the two expanded images, so a clearer image as shown in FIG. 8 can be output.
- FIG. 9 is a block diagram of the imaging system according to the second embodiment
- FIG. 10 is a time chart showing the processing timing of the CPU
- FIG. 11 is a conceptual diagram showing the size of the image data of each field for calculating the shutter speed.
- FIG. 12 is a flowchart of the second embodiment
- FIG. 13 is a flowchart of shutter speed calculation using the integrated average value detection method
- FIG. 14 is a flowchart of shutter speed calculation using the peak value detection method.
- FIG. 15 shows the divided images, (a) is a divided image diagram of the ODD field, and (b) is a divided image diagram of the EVEN field. Note that components corresponding to those in the first embodiment are denoted by the same reference numerals and described.
- an analog front end IC (CDS / AGC / ADC) 23 is shown in addition to the CCD camera 5, the DSP 13, and the CPU 11.
- This analog front-end IC 23 captures the signal charge of the CCD camera 5, removes noise from the signal, performs auto gain control, and then performs A/D conversion.
- the analog front end IC 23 is a generally provided circuit.
- The image processing unit 7A changes the signal accumulation time of the CCD camera 5 at a predetermined cycle and continuously outputs images having different exposure amounts periodically. A target value of the exposure amount is set in advance, and the signal accumulation time, that is, the shutter speed, is controlled according to the target value.
- Data from all pixels of the CCD camera 5 is received via the analog front-end IC 23; the shutter speed is controlled by the CPU 11 according to the target value of the exposure amount and is fed back to the DSP 13 and the analog front-end IC 23.
- the operation in the CPU 11 is performed for each of the ODD field and the EVEN field.
- The shutter speeds of the ODD field and the EVEN field are initially set by the CPU 11 in the same manner as in the first embodiment, and the signal charges are read out at those shutter speeds as in the first embodiment. After the signal charges are read out, the shutter speeds of the ODD field and the EVEN field are further switched and controlled according to the target value of the exposure amount. This switching control of the shutter speed is performed, for example, at the processing timing shown in FIG. 10.
- "AE" in FIG. 10 refers to exposure control of the CCD camera, and is a general term for controlling the aperture of the lens, the speed of the electronic shutter, the analog or digital gain of the CCD camera, and the like.
- Here, the focus is on the electronic shutter speed of the CCD camera.
- The processing timing of the EVEN field and the ODD field is different, but both fields perform the same operation. That is, switching of V-sync (the vertical synchronization signal) switches the field pulse between the ODD field and the EVEN field.
- V-sync: vertical synchronization signal
- When the DSP 13 reads a charge signal from the CCD camera 5 via the analog front-end IC 23, the operation of integrating and averaging the charge amount for each predetermined area of the CCD camera 5 (for example, an area obtained by dividing the CCD surface into six) is repeated. Alternatively, the operation of counting the number of pixels whose value exceeds the set charge amount is repeatedly performed. This operation is continued until each ODD/EVEN field is completed. At the end of the operation, the CPU 11 reads the integrated average value or the peak count value for each area from the DSP 13 ((1) in FIG. 10), calculates the shutter speed based on it ((2)), and outputs the shutter speed for that field to the DSP 13 before the field at the next timing ((3)).
- Thereafter, the same operation is repeated each time the ODD field and the EVEN field are switched.
- the charge accumulation amount of the CCD camera 5 is compared with the target luminance level which is the target value of the exposure amount, and the shutter speed is determined so that the charge accumulation amount in the CCD camera converges to the target luminance level.
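The convergence to the target luminance level can be illustrated with a small feedback loop. This is only a sketch: `charge_for` is an assumed model of charge accumulation versus exposure time, and the one-step-per-field update is inferred from the flowcharts described later, not stated verbatim at this point.

```python
def converge_shutter(charge_for, target, exposures, idx, max_steps=16):
    """Step the shutter toward the target luminance level (sketch).
    `exposures` is ordered from slowest to fastest shutter, so a
    higher index means a shorter signal accumulation time."""
    for _ in range(max_steps):
        charge = charge_for(exposures[idx])
        if charge > target and idx + 1 < len(exposures):
            idx += 1            # too bright: one step faster
        elif charge < target and idx > 0:
            idx -= 1            # too dark: one step slower
        else:
            break               # on target, or at the ladder's end
    return idx
```

With a simple linear model of charge versus exposure time, the loop walks the index until the measured charge reaches the target, mirroring the per-field switching control performed by the CPU 11.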
- the optimal parameters are determined after experimental evaluation as adjustment items.
- The target brightness level is set so that, in the field with the slower shutter speed, even if strong spot light such as the headlight of an oncoming vehicle enters the image, it is ignored and the white portion is allowed to saturate so that the dark surroundings remain clearly visible, while in the field with the faster shutter speed, which yields a dark image, the spot light is narrowed down so that the surrounding image is easy to see.
- Four combinations are possible: using the integrated average value detection method for both the EVEN field and the ODD field; using the peak value detection method for both the EVEN field and the ODD field; using the integrated average value detection method for the EVEN field and the peak value detection method for the ODD field; or the reverse.
- The integrated average value detection method switches the shutter speed by scanning the image data and comparing the total integrated value of the luminance values of the pixels with a reference value.
- The peak value detection method scans the image data, counts each pixel whose luminance value is equal to or greater than the peak charge reference value, and switches the shutter speed by comparing the number of such pixels with a reference pixel number.
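The two detection metrics can be sketched as follows. This is a minimal illustration; the function names are assumptions, and a real implementation would read the DSP's per-area integration results rather than scan a Python list.

```python
def integrated_average(field_image):
    """Integrated average value detection: scan the field image and
    return the average of the accumulated luminance values."""
    total = sum(px for row in field_image for px in row)
    pixels = len(field_image) * len(field_image[0])
    return total / pixels

def peak_pixel_count(field_image, peak_ref):
    """Peak value detection: count the pixels whose luminance value is
    equal to or greater than the peak charge reference value."""
    return sum(1 for row in field_image for px in row if px >= peak_ref)
```

The first metric reacts to the overall brightness level of the field; the second reacts specifically to the area occupied by strong light sources such as headlights.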
- The size of the image to be scanned for determining the shutter speed is obtained by dividing the image data into fields; accordingly, an example of 512 pixels × 256 lines as shown in FIG. 11 is used.
- FIG. 12 shows a flowchart of the second embodiment.
- The microcomputer reads the charge integrated average value and the peak count value calculated inside the DSP 13 and calculates the shutter speed for the next field; in the flowchart, the two calculations are described in a mixed manner.
- the flowchart of FIG. 12 is basically the same as the flowchart of FIG. 3 of the first embodiment, and the corresponding steps are denoted by the same step numbers.
- Step S12 is added between step S9 and step S10; in contrast to the initial setting of the shutter speed at step S1, the shutter speed is switched at step S12.
- CHARGE: charge integrated value
- step S22 a charge integration value (CHARGE) is calculated.
- In step S27, it is determined whether or not the X value has reached the 511th pixel (the end of the 512-pixel line). If it has not, step S22, step S23, step S24, step S25, step S26, and step S27 are repeated, and charge integration is performed.
- step S28 When the charge integration of all pixels of 512 pixels ⁇ 256 lines is performed by the processing of steps S22 to S27, the process proceeds to step S28.
- In step S28, if the accumulated charge value exceeds the reference value 1 (CHARGE 1 (REF)) (YES), the process shifts to step S29 and the shutter speed (SHUTTER) is set one step faster (SHUTTER − 1).
- In step S28, when the charge integrated value does not exceed the reference value 1, the process proceeds to step S30, where it is determined whether the charge integrated value is lower than the reference value 2 (CHARGE 2 (REF)). If it is lower (YES), the process proceeds to step S31; otherwise (NO), the process proceeds to step S10.
- The reference values are optimum parameters determined in advance after experimental evaluation so as to maintain the target luminance level of each field.
- step S31 the shutter speed (SHUTTER) is set one step slower (SHUTTER + 1).
- The range of the shutter speed is set to 1/60 to 1/10000 seconds; for example, 1/60, 1/100, 1/250, and 1/500 seconds are set for the ODD field, and 1/1000, 1/2000, 1/4000, and 1/10000 seconds are set for the EVEN field.
- One step faster than 1/60 seconds means that the shutter speed is 1/100 seconds.
- One step slower than 1/4000 seconds means that the shutter speed is 1/2000 seconds.
- The above shutter speeds are merely examples; it is possible to freely set a wider range of shutter speeds for each field or to set smaller intervals between the shutter speeds.
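Under the example ladders above, "one step faster/slower" amounts to index arithmetic on a per-field table. This is a sketch; the clamping behavior at the ends of each ladder is an assumption, since the description does not say what happens beyond 1/60 or 1/10000 seconds.

```python
# Example shutter-speed ladders from the description (seconds);
# the ODD field holds the slow exposures, the EVEN field the fast ones.
ODD_SPEEDS = [1/60, 1/100, 1/250, 1/500]
EVEN_SPEEDS = [1/1000, 1/2000, 1/4000, 1/10000]

def one_step_faster(ladder, idx):
    # Move toward a shorter exposure, clamped at the ladder's end.
    return min(idx + 1, len(ladder) - 1)

def one_step_slower(ladder, idx):
    # Move toward a longer exposure, clamped at the ladder's start.
    return max(idx - 1, 0)
```

This matches the worked examples in the text: one step faster than 1/60 seconds is 1/100 seconds, and one step slower than 1/4000 seconds is 1/2000 seconds.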
- If the charge integrated value exceeds the reference value 1, strong light such as the headlights of oncoming vehicles is present beyond a certain amount and the overall brightness level is high, so the shutter speed is made one step faster to reduce the brightness level. If the charge integrated value is lower than the reference value 2, the overall brightness level is low, so the shutter speed is made one step slower to increase the brightness level. Here, the reference value 1 is set at a higher luminance level than the reference value 2. If the charge integrated value lies between the reference values 1 and 2, the shutter speed is maintained as it is.
- In FIG. 13, the EVEN field and the ODD field are not particularly distinguished, but the shutter speeds of the EVEN field and the ODD field are changed by the initial setting of the shutter speed in step S1.
- The reference values 1 and 2 of each field therefore differ according to the initial setting, and the respective shutter speeds are switched according to the flowchart of FIG. 13.
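The FIG. 13 decision reduces to a three-way comparison. This is a sketch only; the index-based speed ladder is an assumption carried over from the example shutter speeds above, not part of the flowchart itself.

```python
def switch_by_integrated_value(charge, ref1, ref2, idx, ladder_len):
    """Sketch of the FIG. 13 decision: ref1 is the higher luminance
    reference. Returns the new shutter-speed index (higher = faster)."""
    assert ref1 > ref2, "reference value 1 must be the higher level"
    if charge > ref1:                     # too bright: one step faster
        return min(idx + 1, ladder_len - 1)
    if charge < ref2:                     # too dark: one step slower
        return max(idx - 1, 0)
    return idx                            # within the band: hold speed
```

The band between the two references gives the control loop hysteresis, so the shutter speed does not oscillate when the scene brightness sits near a single threshold.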
- The peak counter (i) is reset to zero.
- The peak count is obtained by scanning the image data as described above and counting the pixels whose luminance value is equal to or higher than the peak charge reference value.
- step S42 the coordinate value X of the pixel in the horizontal direction is reset to 0, and the coordinate value Y of the pixel in the vertical direction is reset to 0, and the process proceeds to step S43.
- In step S43, it is determined whether or not the integrated charge value (CHARGE) of the pixel at the coordinates (X, Y) is equal to or greater than a preset peak charge reference value (PEAK (REF)).
- PEAK (REF): preset peak charge reference value
- The process then proceeds to step S46 and is repeated.
- In step S49, it is determined whether or not the X value has reached the 511th pixel. If the X value has not reached the end of the 512-pixel line, step S43, step S44, step S45, step S46, step S47, step S48, and step S49 are repeated, and pixels equal to or higher than the peak charge reference value are counted.
- In step S50, it is determined whether or not the peak charge pixel count exceeds the peak charge reference pixel number 1 (COUNTER 1 (REF)).
- In step S51, the shutter speed (SHUTTER) is set one step faster (SHUTTER − 1).
- If the peak charge pixel count does not exceed the peak charge reference pixel number 1 in step S50, the process proceeds to step S52, where it is determined whether the peak charge pixel count is less than the peak charge reference pixel number 2 (COUNTER 2 (REF)). If it is less (YES), the process proceeds to step S53; otherwise (NO), the process proceeds to step S10.
- the optimum parameters are determined so as to maintain the target luminance level of each field after performing an experimental evaluation in advance as described above.
- In step S53, the shutter speed (SHUTTER) is set one step slower (SHUTTER + 1).
- If the peak charge pixel count exceeds the peak charge reference pixel number 1, strong light such as the headlight of an oncoming vehicle is present beyond a certain level and the overall brightness level is high, so the shutter speed is made one step faster to reduce the brightness level. If the peak charge pixel count is less than the peak charge reference pixel number 2, the overall brightness level is low, so the shutter speed is made one step slower to increase the brightness level. Here, the peak charge reference pixel number 1 is set at a higher luminance level than the peak charge reference pixel number 2. If the peak charge pixel count lies between the reference pixel numbers 1 and 2, the shutter speed is maintained as it is.
- In FIG. 14, the EVEN field and the ODD field are not particularly distinguished, but since the shutter speeds of the EVEN field and the ODD field are changed by the initial setting of the shutter speed in step S1, the peak charge reference pixel numbers of each field differ according to the initial setting, and each shutter speed is switched according to the flowchart of FIG. 14.
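The FIG. 14 decision differs from the FIG. 13 one only in the quantity compared: a peak-pixel count instead of the charge integrated value. A minimal sketch, with the same assumed index-based speed ladder:

```python
def switch_by_peak_count(field_image, peak_ref, count_ref1, count_ref2,
                         idx, ladder_len):
    """Sketch of the FIG. 14 decision: count pixels at or above the
    peak charge reference value, then compare the count against two
    reference pixel numbers (count_ref1 > count_ref2)."""
    peak_pixels = sum(1 for row in field_image
                      for px in row if px >= peak_ref)
    if peak_pixels > count_ref1:          # strong light: one step faster
        return min(idx + 1, ladder_len - 1)
    if peak_pixels < count_ref2:          # too dark: one step slower
        return max(idx - 1, 0)
    return idx                            # within the band: hold speed
```

Because it counts only near-saturated pixels, this variant reacts to the size of a headlight's bright region rather than to the average scene brightness.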
- By appropriately changing in step S12 the shutter speed initially set in step S1, the divided image data of each field as shown in FIG. 15 can be obtained.
- Fig. 15 corresponds to Fig. 5, where (a) is the ODD field and (b) is the EVEN field.
- the spot light is further narrowed down, and the surrounding images are easier to see. Therefore, a sharper image can be obtained as a whole.
- the integrated average value detection method or the peak value detection method is used for both the ODD field and the EVEN field.
- the integrated average value detection method is used for the EVEN field
- the peak value detection method is used for the ODD field.
- the peak value detection method can be used in the EVEN field
- the integrated average value detection method can be used in the ODD field.
- the reference value is set so that the brightness of the entire screen is 50% gray.
- the number of reference pixels is set so that the highest light intensity part in the screen is 100% white.
- The electric charge readout is not limited to the readout of a single pixel; charges can also be read out and handled as a cluster of several pixels.
- the output image is displayed on the head-up display 9, but may be displayed on a display provided in a vehicle interior or the like.
- the IR lamp 3 illuminates the front of the vehicle in the traveling direction, it may illuminate the rear or side.
- the imaging system is not limited to an automobile, and may be configured as another vehicle such as a motorcycle, a ship, or the like, or as an imaging system independent of the vehicle.
- The imaging system is suitable for grasping the situation in front of a vehicle while driving at night, by irradiating the area ahead with infrared light and checking the image captured by an in-vehicle CCD camera or the like.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Mechanical Engineering (AREA)
- Studio Devices (AREA)
- Transforming Light Signals Into Electric Signals (AREA)
- Measuring And Recording Apparatus For Diagnosis (AREA)
Abstract
Description
Claims
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CA002494095A CA2494095A1 (en) | 2002-07-11 | 2003-07-10 | Image pickup system |
AU2003281132A AU2003281132A1 (en) | 2002-07-11 | 2003-07-10 | Image pickup system |
EP03741324A EP1530367A4 (en) | 2002-07-11 | 2003-07-10 | SHOOTING SYSTEM |
US10/520,407 US20050258370A1 (en) | 2002-07-11 | 2003-07-10 | Imaging system |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2002202569A JP2004048345A (ja) | 2002-07-11 | 2002-07-11 | 撮像システム |
JP2002-202569 | 2002-07-11 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2004008743A1 true WO2004008743A1 (ja) | 2004-01-22 |
Family
ID=30112631
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2003/008778 WO2004008743A1 (ja) | 2002-07-11 | 2003-07-10 | 撮像システム |
Country Status (7)
Country | Link |
---|---|
US (1) | US20050258370A1 (ja) |
EP (1) | EP1530367A4 (ja) |
JP (1) | JP2004048345A (ja) |
CN (1) | CN1675920A (ja) |
AU (1) | AU2003281132A1 (ja) |
CA (1) | CA2494095A1 (ja) |
WO (1) | WO2004008743A1 (ja) |
Families Citing this family (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN100464572C (zh) * | 2007-07-25 | 2009-02-25 | 北京中星微电子有限公司 | 一种图像合成方法和装置 |
WO2012155951A1 (en) * | 2011-05-13 | 2012-11-22 | Valeo Schalter Und Sensoren Gmbh | Camera arrangement for a vehicle and method for calibrating a camera and for operating a camera arrangement |
JP5680573B2 (ja) * | 2012-01-18 | 2015-03-04 | 富士重工業株式会社 | 車両の走行環境認識装置 |
JP6330383B2 (ja) * | 2014-03-12 | 2018-05-30 | 株式会社デンソー | 合成画像生成装置、および合成画像生成プログラム |
JP6537231B2 (ja) * | 2014-08-11 | 2019-07-03 | キヤノン株式会社 | 画像処理装置、画像処理方法及びプログラム |
CN105991933B (zh) * | 2015-02-15 | 2019-11-08 | 比亚迪股份有限公司 | 图像传感器 |
CN105991934B (zh) * | 2015-02-15 | 2019-11-08 | 比亚迪股份有限公司 | 成像系统 |
CN105991935B (zh) * | 2015-02-15 | 2019-11-05 | 比亚迪股份有限公司 | 曝光控制装置及曝光控制方法 |
CN110869804B (zh) * | 2017-07-13 | 2023-11-28 | 苹果公司 | 用于光发射深度传感器的提前-滞后脉冲计数 |
US10593029B2 (en) | 2018-03-21 | 2020-03-17 | Ford Global Technologies, Llc | Bloom removal for vehicle sensors |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0556343A (ja) * | 1991-08-22 | 1993-03-05 | Olympus Optical Co Ltd | 電子的撮像装置 |
JPH1198418A (ja) * | 1997-09-24 | 1999-04-09 | Toyota Central Res & Dev Lab Inc | 撮像装置 |
JPH11201741A (ja) * | 1998-01-07 | 1999-07-30 | Omron Corp | 画像処理方法およびその装置 |
JP2000228747A (ja) * | 1998-12-03 | 2000-08-15 | Olympus Optical Co Ltd | 画像処理装置 |
JP2001148805A (ja) * | 1999-11-22 | 2001-05-29 | Matsushita Electric Ind Co Ltd | 固体撮像装置 |
Family Cites Families (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6498620B2 (en) * | 1993-02-26 | 2002-12-24 | Donnelly Corporation | Vision system for a vehicle including an image capture device and a display system having a long focal length |
US6806907B1 (en) * | 1994-04-12 | 2004-10-19 | Canon Kabushiki Kaisha | Image pickup apparatus having exposure control |
US5880777A (en) * | 1996-04-15 | 1999-03-09 | Massachusetts Institute Of Technology | Low-light-level imaging and image processing |
JP3630905B2 (ja) * | 1997-02-28 | 2005-03-23 | キヤノン株式会社 | 撮像方法及び撮像装置 |
US6133960A (en) * | 1998-06-26 | 2000-10-17 | Lsi Logic Corporation | Field-based upsampling method for performing zoom in a digital video system |
US6753914B1 (en) * | 1999-05-26 | 2004-06-22 | Lockheed Martin Corporation | Image correction arrangement |
JP2001281529A (ja) * | 2000-03-29 | 2001-10-10 | Minolta Co Ltd | デジタルカメラ |
JP3649992B2 (ja) * | 2000-05-12 | 2005-05-18 | 三洋電機株式会社 | 画像信号処理装置及び画像信号処理方法 |
US20020067413A1 (en) * | 2000-12-04 | 2002-06-06 | Mcnamara Dennis Patrick | Vehicle night vision system |
JP3949903B2 (ja) * | 2001-04-09 | 2007-07-25 | 東芝エルエスアイシステムサポート株式会社 | 撮像装置及び撮像信号処理方法 |
US6560029B1 (en) * | 2001-12-21 | 2003-05-06 | Itt Manufacturing Enterprises, Inc. | Video enhanced night vision goggle |
-
2002
- 2002-07-11 JP JP2002202569A patent/JP2004048345A/ja active Pending
-
2003
- 2003-07-10 EP EP03741324A patent/EP1530367A4/en not_active Withdrawn
- 2003-07-10 CA CA002494095A patent/CA2494095A1/en not_active Abandoned
- 2003-07-10 WO PCT/JP2003/008778 patent/WO2004008743A1/ja not_active Application Discontinuation
- 2003-07-10 AU AU2003281132A patent/AU2003281132A1/en not_active Abandoned
- 2003-07-10 US US10/520,407 patent/US20050258370A1/en not_active Abandoned
- 2003-07-10 CN CN03816544.9A patent/CN1675920A/zh active Pending
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0556343A (ja) * | 1991-08-22 | 1993-03-05 | Olympus Optical Co Ltd | 電子的撮像装置 |
JPH1198418A (ja) * | 1997-09-24 | 1999-04-09 | Toyota Central Res & Dev Lab Inc | 撮像装置 |
JPH11201741A (ja) * | 1998-01-07 | 1999-07-30 | Omron Corp | 画像処理方法およびその装置 |
JP2000228747A (ja) * | 1998-12-03 | 2000-08-15 | Olympus Optical Co Ltd | 画像処理装置 |
JP2001148805A (ja) * | 1999-11-22 | 2001-05-29 | Matsushita Electric Ind Co Ltd | 固体撮像装置 |
Non-Patent Citations (1)
Title |
---|
See also references of EP1530367A4 * |
Also Published As
Publication number | Publication date |
---|---|
EP1530367A4 (en) | 2007-10-17 |
AU2003281132A1 (en) | 2004-02-02 |
CN1675920A (zh) | 2005-09-28 |
EP1530367A1 (en) | 2005-05-11 |
US20050258370A1 (en) | 2005-11-24 |
JP2004048345A (ja) | 2004-02-12 |
CA2494095A1 (en) | 2004-01-22 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP4154157B2 (ja) | 撮像装置 | |
JP3892172B2 (ja) | 固体撮像装置 | |
WO2008056789A1 (fr) | Dispositif de traitement d'image embarqué et procédé de commande pour un dispositif de traitement d'image embarqué | |
JP2004072237A (ja) | ストロボ装置及びカメラ | |
JP2006245784A (ja) | 固体撮像装置及びその駆動方法並びに撮像システム | |
JP2011234318A (ja) | 撮像装置 | |
JP2011193065A (ja) | 撮像装置 | |
JP3970903B2 (ja) | 撮像システム | |
JP3793487B2 (ja) | 撮像システム | |
JP2014230708A (ja) | 内視鏡 | |
WO2004008743A1 (ja) | 撮像システム | |
JP2004048455A (ja) | 撮像システム | |
CN107431759B (zh) | 摄像装置、摄像方法、电子设备以及车载电子设备 | |
KR20090091597A (ko) | 카메라의 백라이트 보정 장치 및 방법 | |
JP2005191954A (ja) | 撮像システム | |
JP2008230464A (ja) | 車載カメラ用自動露出装置 | |
JP5440245B2 (ja) | 撮像装置 | |
JP4652105B2 (ja) | 撮像素子を使用する監視カメラを用いる監視システム | |
JP5381541B2 (ja) | 撮像装置 | |
KR101601324B1 (ko) | 차량용 카메라 시스템의 영상 획득 방법 | |
KR20050019844A (ko) | 촬상시스템 | |
JP2005033709A (ja) | 車両用周辺監視装置 | |
JP5360589B2 (ja) | 画像撮像装置 | |
JP6338062B2 (ja) | 画像処理装置および撮像装置並びに画像処理方法 | |
JP2006109171A (ja) | 車両用表示装置 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states |
Kind code of ref document: A1 Designated state(s): AU CA CN KR US |
|
AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): DE FR SE |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
WWE | Wipo information: entry into national phase |
Ref document number: 2003741324 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 10520407 Country of ref document: US |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2003281132 Country of ref document: AU |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2494095 Country of ref document: CA |
|
WWE | Wipo information: entry into national phase |
Ref document number: 1020057000552 Country of ref document: KR Ref document number: 20038165449 Country of ref document: CN |
|
WWP | Wipo information: published in national office |
Ref document number: 1020057000552 Country of ref document: KR |
|
WWP | Wipo information: published in national office |
Ref document number: 2003741324 Country of ref document: EP |
|
WWW | Wipo information: withdrawn in national office |
Ref document number: 2003741324 Country of ref document: EP |