WO2024176953A1 - Image processing device, image processing method, and program - Google Patents
- Publication number
- WO2024176953A1 (PCT/JP2024/005412)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- photometry
- area
- unit
- types
Classifications
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B7/00—Control of exposure by setting shutters, diaphragms or filters, separately or conjointly
- G03B7/08—Control effected solely on the basis of the response, to the intensity of the light received by the camera, of a built-in light-sensitive device
- G03B7/099—Arrangement of photoelectric elements in or on the camera
- G03B7/0993—Arrangement of photoelectric elements in or on the camera in the camera
- G03B7/0997—Through the lens [TTL] measuring
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/45—Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from two or more image sensors being of different type or operating in different modes, e.g. with a CMOS sensor for moving images in combination with a charge-coupled device [CCD] for still images
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/70—Circuitry for compensating brightness variation in the scene
- H04N23/71—Circuitry for evaluating the brightness variation
Definitions
- This disclosure relates to image processing devices, etc.
- Patent document 1 discloses a technique for setting exposure conditions for a shooting area.
- However, when an image captured using the technique disclosed in Patent Document 1 is used for object detection, the target object may not be detected correctly. Specifically, if the captured image contains bright objects that are not the target object, such as direct sunlight, the sky, or a distant white object (note that here, direct sunlight and the sky are also referred to as objects), automatic exposure control darkens the entire image under the influence of that brightness; the area of the target object then becomes dark, and the target object may not be detected correctly.
- Therefore, the present disclosure provides an image processing device and the like that can perform automatic exposure control suitable for detecting an object to be detected.
- The image processing device according to the present disclosure includes an acquisition unit that acquires two types of images showing the same area, obtained by two types of imaging means having different wavelengths, and a determination unit that determines, based on one of the two types of images, a photometric area in which photometry is performed in the other of the two types of images.
- The image processing method according to the present disclosure is an image processing method executed by an image processing device, and includes an acquisition step of acquiring two types of images showing the same area, obtained by two types of imaging means having different wavelengths, and a determination step of determining, based on one of the two types of images, a photometric area in which photometry is performed in the other of the two types of images.
- The program according to the present disclosure is a program for causing a computer to execute the image processing method described above.
- An image processing device can perform automatic exposure control suitable for detecting an object to be detected.
- FIG. 1 is a block diagram showing an example of an image processing device according to an embodiment.
- FIG. 2 is a diagram for explaining a method of calculating the intensity of the reflected light component.
- FIG. 3 is a block diagram showing an example of an exposure time setting unit according to the embodiment.
- FIG. 4 is a flowchart showing an example of the operation of a photometry unit according to the embodiment.
- FIG. 5 is a diagram showing an example of an acquired visible light image.
- FIG. 6 is a diagram showing an example of an acquired infrared image.
- FIG. 7 is a diagram showing an example of an area where photometry is not performed.
- FIG. 8 is a flowchart showing an example of the operation of an exposure time calculation unit according to the embodiment.
- FIG. 9 is a diagram showing an example of a visible light image obtained by performing automatic exposure control using a conventional method.
- FIG. 10 is a diagram showing an example of a visible light image obtained by performing automatic exposure control using the image processing device according to the embodiment.
- FIG. 11 is a flowchart showing another example of the operation of the photometry unit according to the embodiment.
- FIG. 12 is a diagram for explaining blocks on which photometry is not performed.
- FIG. 13 is a block diagram showing an example of an image synthesis unit according to the embodiment.
- FIG. 14 is a flowchart showing an example of the operation of an offset correction unit according to the embodiment.
- FIG. 15 is a flowchart showing an example of the operation of an output synthesis unit according to the embodiment.
- FIG. 16 is a block diagram showing an example of a gain correction unit according to the embodiment.
- FIG. 17 is a flowchart showing an example of an image processing method according to another embodiment.
- FIG. 1 is a block diagram showing an example of an image processing device 100 according to an embodiment.
- FIG. 1 also shows an imaging unit 200 that captures images to be processed by the image processing device 100.
- Note that the image processing device 100 may also include the imaging unit 200.
- The imaging unit 200 is, for example, a camera that has two types of imaging means with different wavelengths and can capture two types of images using those two types of imaging means. For example, the imaging unit 200 is a BW-ToF camera.
- A BW-ToF camera can simultaneously capture two types of images: a visible light image (for example, a BW (black and white) image) obtained by receiving visible light over the entire wavelength range, and an infrared image obtained by a ToF (Time of Flight) sensor receiving the reflected light of emitted infrared light.
- In other words, the imaging unit 200 can simultaneously capture two types of images, a visible light image and an infrared image, showing the same area.
- Note that the imaging unit 200 does not have to be a camera that can capture visible light images and infrared images with a single camera, such as a BW-ToF camera, as long as it can capture two types of images showing the same area.
- For example, the imaging unit 200 may have two separate cameras, specifically, a camera capable of capturing visible light images and a camera capable of capturing infrared images.
- Note that the same area is captured in the two types of images, but that area may be the entire image area or a portion of it. That is, the entire image areas of the two types of images may match, or only portions of them may match.
- For example, when the imaging unit 200 is a camera that can capture the two types of images simultaneously, such as a BW-ToF camera, the entire image areas of the two types of images can match. On the other hand, when the imaging unit 200 has two separate cameras, only portions of the image areas of the two types of images may match.
- The imaging unit 200 can measure (range) the distance to an object shown in an infrared image based on the intensity of the reflected light component of the emitted infrared light. In other words, the infrared image contains distance information (depth information) about the objects it shows.
- Here, the method for calculating the intensity of the reflected light component is explained using FIG. 2, a diagram for explaining how the intensity of the reflected light component is calculated.
- For example, when an object is present within the imaging range of the imaging unit 200, reflected light from the object is received in response to the emitted light shown in FIG. 2. The intensity of the reflected light component can be calculated from the amounts of the received light signal in at least three sections after the light is emitted: sections A0 and A1, in which the signal amount is large because of the reflected light, and section A2, in which the signal amount is small. Sections A0 and A1 each contain a portion of the IR reflected light component plus a background component (light other than the reflected light), while section A2 contains only the background component.
- Therefore, the intensity of the reflected light component is the total signal amount of A0 and A1 minus the background component. Letting A0, A1, and A2 denote the respective signal amounts, the reflected light component can be calculated as (A0 - A2) + (A1 - A2).
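As a minimal sketch of this computation (NumPy is assumed; the function and argument names are illustrative, not from the patent):

```python
import numpy as np

def reflected_light_component(a0, a1, a2):
    """Intensity of the IR reflected-light component, per pixel.

    a0, a1: received signal amounts in the two sections that contain part of
            the reflected light plus the background component.
    a2:     received signal amount in the section that contains only the
            background component.
    """
    a0, a1, a2 = (np.asarray(x, dtype=np.float64) for x in (a0, a1, a2))
    # (A0 - A2) + (A1 - A2): total signal of A0 and A1 minus the background
    return (a0 - a2) + (a1 - a2)
```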
- The image processing device 100 is a device that performs image processing on the two types of images (a visible light image and an infrared image) showing the same area that are obtained by the two types of imaging means with different wavelengths in the imaging unit 200. For example, the image processing device 100 performs image processing so as to output an image suitable for detecting an object (detection target) that appears in the captured image.
- The image processing device 100 includes an exposure time setting unit 10, an image synthesis unit 20, and a gain correction unit 30. The image processing device 100 is a computer including a processor (microprocessor) and memory. The memory is, for example, ROM (Read Only Memory) and RAM (Random Access Memory), and can store programs executed by the processor. The exposure time setting unit 10, the image synthesis unit 20, and the gain correction unit 30 are realized by, for example, the processor executing the programs stored in the memory.
- The exposure time setting unit 10 measures the luminance of the scene from the captured visible light image, sets the exposure time so that the area of the detection target captured in the image reaches the desired brightness, and outputs the exposure time to the imaging unit 200. Details of the exposure time setting unit 10 will be described later.
- The image synthesis unit 20 synthesizes, from multiple visible light images captured by the imaging unit 200 with different exposure times, an image in which the detection target has neither blown-out highlights nor crushed shadows. Details of the image synthesis unit 20 will be described later.
- The gain correction unit 30 measures the luminance of the scene in the current frame from the synthesized image and applies a digital gain correction so that the area of the detection target reaches the desired brightness. Specifically, there is a delay before the exposure time set by the exposure time setting unit 10 is reflected in the output image, and the gain correction unit 30 performs the digital gain correction to compensate for the output level fluctuation caused by this delay. The gain correction unit 30 will be described in detail later.
- FIG. 3 is a block diagram showing an example of an exposure time setting unit 10 according to an embodiment.
- The exposure time setting unit 10 includes a photometry unit 11 and an exposure time calculation unit 12.
- The photometry unit 11 is an example of an acquisition unit that acquires two types of images showing the same area, obtained by two types of imaging means with different wavelengths. The photometry unit 11 is also an example of a determination unit that determines, based on one of the two types of images, the photometry area in which photometry is performed in the other of the two types of images.
- Specifically, the photometry unit 11 acquires a visible light image and an infrared image as the two types of images, and determines the photometry area in the visible light image based on the infrared image. The photometry area is the group of pixels to be metered, and the photometry unit 11 measures the brightness of the determined photometry area. An example of a method for determining the photometry area will be described with reference to FIG. 4.
- FIG. 4 is a flowchart showing an example of the operation of the photometry unit 11 according to the embodiment. It is assumed that the photometry unit 11 acquires, for example, a visible light image as shown in FIG. 5 and an infrared image as shown in FIG. 6.
FIG. 5 shows an example of an acquired visible light image.
FIG. 6 shows an example of an acquired infrared image.
- As shown in FIGS. 5 and 6, the same area is captured in the visible light image and the infrared image, and the photometry unit 11 knows, for each pixel in the visible light image, the pixel in the infrared image that captures the same position. A pixel in the visible light image and a pixel in the infrared image that capture the same position are referred to as corresponding pixels.
- Note that when the imaging unit 200 has a separately provided camera for visible light images and camera for infrared images, and only portions of the image areas of the visible light image and the infrared image match rather than the entire areas, a known matching process between the two images can be performed to determine the corresponding pixels that capture the same position in the visible light image and the infrared image.
- The photometry unit 11 determines the photometry area by performing the processes from step S11 to step S14 shown in FIG. 4 for each pixel in the visible light image. Note that the area where photometry is not performed is simply the area excluding the photometry area, so determining the photometry area also determines the area where photometry is not performed.
- The photometry unit 11 determines whether the BW intensity (the pixel value (brightness value) of a pixel in the visible light image) is greater than a threshold for the BW intensity (step S11). The threshold for the BW intensity is set, for example, to a value somewhat smaller than the brightness value corresponding to the sky or a white object, but it is not particularly limited and may be set appropriately depending on the space in which the imaging unit 200 is used, the purpose, and so on.
- If the photometry unit 11 determines that the BW intensity of a pixel in the visible light image is greater than the threshold for the BW intensity (Yes in step S11), that is, if the pixel depicts a bright object such as the sky, direct sunlight, or a white object, it then determines whether the IR intensity (the intensity of the reflected light component relative to the emitted infrared light) of the corresponding pixel in the infrared image is smaller than a threshold for the IR intensity (step S12). When objects within a certain range from the imaging unit 200 are to be detected, the threshold for the IR intensity is set, for example, to a value somewhat smaller than the intensity expected for the reflected light component from an object within that range, but it is not particularly limited and may be set appropriately depending on the space in which the imaging unit 200 is used, the purpose, and so on.
- If the photometry unit 11 determines that the IR intensity of the pixel in the infrared image is smaller than the threshold for the IR intensity (Yes in step S12), that is, if the pixel does not capture an object (detection target) within the certain range from the imaging unit 200, it excludes the corresponding pixel in the visible light image from the photometry target (step S13). In this way, the photometry unit 11 does not determine as a photometry area a pixel that captures a bright object such as the sky, direct sunlight, or a distant white object and does not capture the detection target; it excludes such pixels from the photometry area, regarding them as pixels that capture a high-luminance object (direct sunlight, the sky, a distant white object, or the like) unrelated to the detection target. For example, the white building surrounded by the dashed line in FIGS. 5 and 6 is bright but distant, so the pixels that capture it are excluded from the photometry area.
- If the photometry unit 11 determines that the IR intensity of the pixel in the infrared image is equal to or greater than the threshold for the IR intensity (No in step S12), that is, if the pixel captures an object (detection target) within the certain range from the imaging unit 200, it includes the corresponding pixel in the visible light image in the photometry target (step S14). In this way, the photometry unit 11 determines as the photometry area the area in the visible light image that corresponds to the area in the infrared image where the reflected light component acquired by the imaging unit 200 (the ToF sensor) is equal to or greater than a predetermined intensity (the threshold for the IR intensity). In other words, even if a pixel captures a high-luminance object, the photometry unit 11 does not exclude it from the photometry area if that object is the detection target.
- If the photometry unit 11 determines that the BW intensity of a pixel in the visible light image is equal to or less than the threshold for the BW intensity (No in step S11), that is, if the pixel does not show a bright object such as the sky, direct sunlight, or a white object, it includes the pixel in the photometry area (step S14). In this way, for pixels that do not show a bright object, the photometry unit 11 determines the pixel as a photometry area without judging the IR intensity of the corresponding pixel in the infrared image.
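A minimal sketch of the per-pixel decision in steps S11 to S14 (NumPy arrays and the helper name are assumptions; the thresholds are application-dependent, as noted above):

```python
import numpy as np

def photometry_mask(bw, ir, bw_threshold, ir_threshold):
    """Boolean photometry-area mask for a visible light image (steps S11-S14).

    bw: pixel values (brightness) of the visible light image.
    ir: IR reflected-light component intensity of the corresponding pixels.
    Returns True where the pixel belongs to the photometry area.
    """
    bw, ir = np.asarray(bw), np.asarray(ir)
    bright = bw > bw_threshold     # S11: bright object (sky, sunlight, white object)?
    no_target = ir < ir_threshold  # S12: weak reflected light -> no nearby target
    excluded = bright & no_target  # S13: exclude bright pixels without a target
    return ~excluded               # S14: every other pixel is metered
```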
- FIG. 7 shows an example of an area where no photometry is performed.
- The white areas in FIG. 7 indicate areas where photometry is not performed, and the black areas indicate photometry areas. Note that in FIG. 7, the threshold for the BW intensity is set to be greater than the BW intensity of pixels that show the sky, so pixels that show the sky are not excluded from the photometry area.
- The photometry unit 11 measures the brightness of the photometry area determined in this manner. For example, the photometry unit 11 measures a representative value (e.g., the average, median, or mode) of the pixel values of the metered pixel group as the brightness of the photometry area, and outputs this brightness to the exposure time calculation unit 12.
- FIG. 8 is a flowchart showing an example of the operation of the exposure time calculation unit 12 according to the embodiment.
- The exposure time calculation unit 12 calculates the exposure time from the ratio of the photometry result of the current frame (the measured brightness of the photometry area) to the target pixel value (step S31). Specifically, the exposure time calculation unit 12 calculates the exposure time as (target pixel value / photometry result of the current frame) × (current exposure time). That is, if the photometry result of the current frame is greater than the target pixel value (i.e., the scene is bright), the exposure time calculation unit 12 calculates an exposure time shorter than the current one, and if the photometry result is smaller than the target pixel value (i.e., the scene is dark), it calculates an exposure time longer than the current one. The target pixel value is not particularly limited, but may be, for example, an intermediate value between the blown-out highlight value and the crushed shadow value.
- The exposure time calculation unit 12 outputs the calculated exposure time to the imaging unit 200. The imaging unit 200 performs exposure using the acquired exposure time and captures a visible light image. If an exposure time shorter than the current one was calculated, a visible light image darker than the current frame is captured; if a longer exposure time was calculated, a brighter visible light image is captured.
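Expressed as code, the update of step S31 might look like this sketch (names are illustrative; clamping the result to the exposure range supported by the sensor is assumed to happen elsewhere):

```python
def next_exposure_time(target_value, metered_value, current_exposure):
    """Step S31: new exposure = (target pixel value / photometry result) * current.

    A metered value above the target (bright scene) yields a shorter exposure;
    a metered value below the target (dark scene) yields a longer one.
    """
    return (target_value / metered_value) * current_exposure
```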
- FIG. 9 shows an example of a visible light image obtained by performing automatic exposure control using a conventional method.
- FIG. 10 shows an example of a visible light image obtained by performing automatic exposure control using the image processing device 100 according to the embodiment.
- In this way, in the infrared image the intensity of the reflected light component from nearby objects is high and that from distant objects is low, so it is possible to obtain an infrared image in which nearby objects, with a high reflected light component intensity, are easily captured, and distant objects, with a low intensity, are not. Areas with a high reflected light component intensity are areas in which the detection target may be captured, and can therefore be determined as photometry areas.
- FIG. 11 is a flowchart showing another example of the operation of the photometry unit 11 according to the embodiment.
- The photometry unit 11 determines the photometry area by performing the processes from step S21 to step S24 shown in FIG. 11 for each pixel in the visible light image. As before, determining the photometry area also determines the area where photometry is not performed.
- The photometry unit 11 determines whether the BW intensity (the pixel value (brightness value) of a pixel in the visible light image) is greater than the threshold for the BW intensity (step S21). The threshold for the BW intensity is set, for example, to a value somewhat smaller than the brightness value corresponding to the sky or a white object, but it is not particularly limited and may be set appropriately depending on the space in which the imaging unit 200 is used, the purpose, and so on.
- If the photometry unit 11 determines that the BW intensity of a pixel in the visible light image is greater than the threshold for the BW intensity (Yes in step S21), that is, if the pixel depicts a bright object such as the sky, direct sunlight, or a white object, it then determines whether the IRBG intensity (the intensity of the light components other than the reflected light, i.e., the background component) of the corresponding pixel in the infrared image is greater than a threshold for the IRBG intensity (step S22). The threshold for the IRBG intensity is set, for example, to a value somewhat smaller than the saturation level reached when receiving direct sunlight, but it is not particularly limited and may be set appropriately depending on the space in which the imaging unit 200 is used, the purpose, and so on.
- If the photometry unit 11 determines that the IRBG intensity of the pixel in the infrared image is greater than the threshold for the IRBG intensity (Yes in step S22), that is, if the pixel captures direct sunlight or the like, it excludes the corresponding pixel in the visible light image from the photometry target (step S23). In this way, the photometry unit 11 does not determine as a photometry area the area in the visible light image corresponding to the area in the infrared image where the components of light other than the reflected light (the background component) acquired by the imaging unit 200 (the ToF sensor) are greater than a predetermined intensity (the threshold for the IRBG intensity).
- If the photometry unit 11 determines that the IRBG intensity of the pixel in the infrared image is equal to or lower than the threshold for the IRBG intensity (No in step S22), that is, if the pixel does not capture direct sunlight or the like, it includes the corresponding pixel in the visible light image in the photometry target (step S24). In this way, even if a pixel captures a highly luminous object, the photometry unit 11 does not exclude it from the photometry target if it does not capture direct sunlight or the like.
- If the photometry unit 11 determines that the BW intensity of a pixel in the visible light image is equal to or less than the threshold for the BW intensity (No in step S21), that is, if the pixel does not show a bright object such as the sky, direct sunlight, or a white object, it includes the pixel in the photometry target (step S24). In this way, for pixels that do not show a bright object, the photometry unit 11 determines the pixel as a photometry area without judging the IRBG intensity of the corresponding pixel in the infrared image.
- The subsequent operation of the photometry unit 11 is the same as when the photometry area is determined using the intensity of the reflected light component, so a detailed explanation is omitted.
- In this way, the intensity of the light components other than the reflected light is very high for high-luminance objects such as direct sunlight, and areas where this intensity is high are areas where such high-luminance objects may be captured; these areas can therefore be kept out of the photometry area. That is, an area in the visible light image corresponding to an area in the infrared image where the light components other than the reflected light are greater than a predetermined intensity need not be determined as the photometric area. A sketch of this variant is given below.
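The following sketch parallels the earlier mask, this time driven by the IR background component (NumPy arrays assumed; the helper name and thresholds are illustrative):

```python
import numpy as np

def photometry_mask_by_background(bw, ir_bg, bw_threshold, irbg_threshold):
    """Boolean photometry-area mask using the IR background component (S21-S24).

    ir_bg: intensity of the light components other than the reflected light
           (background component) of the corresponding infrared pixels.
    """
    bw, ir_bg = np.asarray(bw), np.asarray(ir_bg)
    bright = bw > bw_threshold         # S21: bright object in the visible image?
    sunlight = ir_bg > irbg_threshold  # S22: background near saturation -> direct sunlight, etc.
    return ~(bright & sunlight)        # S23/S24: exclude only bright sunlight pixels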
- Note that the photometry unit 11 may also divide the visible light image into multiple blocks and exclude from the photometry target all pixels in any block in which the proportion of non-photometry area is equal to or greater than a predetermined proportion. This will be explained using FIG. 12, with a sketch after the explanation.
- FIG. 12 is a diagram for explaining blocks where photometry is not performed.
- The white areas in FIG. 12 indicate areas where photometry is not performed, and the black areas indicate photometry areas.
- The photometry unit 11 divides the visible light image into multiple blocks (the blocks delimited by white lines) as shown in FIG. 12. Next, for each block, the photometry unit 11 calculates the proportion of the area where photometry is not performed (the white area in FIG. 12), specifically the ratio of the number of non-photometry pixels to the number of pixels in the block. The photometry unit 11 then excludes from the photometry target all pixels in any block where this proportion is equal to or greater than a predetermined proportion. In other words, in a block containing many pixels that are not photometry targets, all pixels are excluded from the photometry target. This makes it possible to also exclude the surroundings of non-photometry areas from the photometry target. The predetermined proportion is, for example, 50%, but is not particularly limited.
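The block-wise exclusion might be sketched as follows (square blocks and an image size that is a multiple of the block size are simplifying assumptions):

```python
import numpy as np

def exclude_sparse_blocks(mask, block_size, max_excluded_ratio=0.5):
    """Exclude every pixel of a block whose non-photometry proportion is at or
    above max_excluded_ratio (50% in the example above).

    mask: boolean photometry mask (True = photometry area).
    """
    out = mask.copy()
    h, w = mask.shape
    for y in range(0, h, block_size):
        for x in range(0, w, block_size):
            block = mask[y:y + block_size, x:x + block_size]
            if (~block).mean() >= max_excluded_ratio:  # proportion of excluded pixels
                out[y:y + block_size, x:x + block_size] = False
    return out
```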
- FIG. 13 is a block diagram showing an example of the image synthesis unit 20 according to the embodiment.
- The image synthesis unit 20 acquires from the imaging unit 200 visible light images captured with three different exposure times. The ratio of the three exposure times is constant: if the exposure time setting unit 10 calculates an exposure time shorter than the current one, all three exposure times become shorter while their ratio stays constant, and if it calculates a longer exposure time, all three become longer while their ratio stays constant.
- The image synthesis unit 20 is a processing unit for realizing WDR (Wide Dynamic Range) and includes, for example, offset correction units 21a, 21b, and 21c and an output synthesis unit 22.
- The offset correction unit 21a receives the visible light image captured with the longest of the three exposure times (called the L image), the offset correction unit 21b receives the image captured with the second longest exposure time (called the M image), and the offset correction unit 21c receives the image captured with the shortest exposure time (called the S image). Thus, the L image is the brightest, the M image the next brightest, and the S image the darkest.
- The offset correction unit 21a will be explained using FIG. 14.
- FIG. 14 is a flowchart showing an example of the operation of the offset correction unit 21a according to the embodiment. The offset correction unit 21a performs the processes of steps S41 and S42 shown in FIG. 14 for each pixel in the L image: it subtracts the offset level from the input image (the L image) (step S41) and replaces pixel values equal to or greater than the clip level with the clip level (step S42).
- The offset correction units 21b and 21c operate in the same way as the offset correction unit 21a except for the input image, so their description is omitted.
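Steps S41 and S42 amount to a per-pixel subtract-and-clip, roughly as in this sketch (NumPy assumed; the offset and clip levels are sensor-dependent values):

```python
import numpy as np

def offset_correct(img, offset_level, clip_level):
    """Offset correction for one input image (L, M, or S)."""
    out = img.astype(np.int64) - offset_level  # S41: subtract the offset level
    return np.minimum(out, clip_level)         # S42: values >= clip level become clip level
```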
- FIG. 15 is a flowchart showing an example of the operation of the output synthesis unit 22 according to the embodiment.
- The output synthesis unit 22 performs the processes from step S51 to step S59 shown in FIG. 15 for each set of pixels at the same position in the L image, M image, and S image.
- The output synthesis unit 22 determines whether the pixel value of the pixel in the L image is smaller than a first threshold (referred to as threshold 1) (step S51). If it determines that the pixel value in the L image is smaller than threshold 1 (Yes in step S51), it outputs the pixel value of the pixel in the L image (step S52).
- If it determines that the pixel value in the L image is equal to or greater than threshold 1 (No in step S51), it determines whether that pixel value is smaller than a second threshold (referred to as threshold 2), which is greater than threshold 1 (step S53). If the pixel value in the L image is smaller than threshold 2 (Yes in step S53), the output synthesis unit 22 outputs a weighted sum of the pixel value in the L image and the pixel value in the M image (step S54).
- If it determines that the pixel value in the L image is equal to or greater than threshold 2 (No in step S53), it determines whether the pixel value of the pixel in the M image is smaller than a third threshold (referred to as threshold 3) (step S55). If the pixel value in the M image is smaller than threshold 3 (Yes in step S55), it outputs the pixel value of the pixel in the M image (step S56).
- If it determines that the pixel value in the M image is equal to or greater than threshold 3 (No in step S55), it determines whether that pixel value is smaller than a fourth threshold (referred to as threshold 4), which is greater than threshold 3 (step S57). If the pixel value in the M image is smaller than threshold 4 (Yes in step S57), it outputs a weighted sum of the pixel value in the M image and the pixel value in the S image (step S58).
- If it determines that the pixel value in the M image is equal to or greater than threshold 4 (No in step S57), it outputs the pixel value of the pixel in the S image (step S59).
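The per-pixel selection of steps S51 to S59 might be sketched as below. The linear cross-fade in the two blend regions is an assumption; the text only states that the two pixel values are weighted and added:

```python
def synthesize_pixel(l, m, s, t1, t2, t3, t4):
    """WDR synthesis for one pixel position (t1 < t2 and t3 < t4)."""
    if l < t1:                          # S51: Yes
        return l                        # S52: L image alone
    if l < t2:                          # S53: Yes
        w = (l - t1) / (t2 - t1)
        return (1 - w) * l + w * m      # S54: weighted sum of L and M
    if m < t3:                          # S55: Yes
        return m                        # S56: M image alone
    if m < t4:                          # S57: Yes
        w = (m - t3) / (t4 - t3)
        return (1 - w) * m + w * s      # S58: weighted sum of M and S
    return s                            # S59: S image alone
```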
- FIG. 16 is a block diagram showing an example of a gain correction unit 30 according to an embodiment.
- The gain correction unit 30 includes a photometry unit 31, a gain calculation unit 32, and a gain application unit 33.
- The photometry unit 31 acquires the composite image synthesized by the image synthesis unit 20 and measures its brightness; for example, it measures a representative value (e.g., the average, median, or mode) of the pixel values as the brightness.
- The gain calculation unit 32 calculates the correction gain from the ratio between the photometry result (the measured brightness) and the target pixel value, specifically as (target pixel value / photometry result of the current frame). That is, if the photometry result of the current frame is greater than the target pixel value (i.e., brighter), the gain calculation unit 32 calculates a small correction gain, and if it is smaller than the target pixel value (i.e., darker), the gain calculation unit 32 calculates a large correction gain. The target pixel value is not particularly limited, but may be, for example, an intermediate value between the blown-out highlight value and the crushed shadow value.
- The gain application unit 33 applies the calculated correction gain to each pixel of the composite image synthesized by the image synthesis unit 20. When a small correction gain is calculated, the composite image is corrected toward a darker image, and when a large correction gain is calculated, it is corrected toward a brighter image.
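As a rough sketch of the gain calculation and application (NumPy assumed; the 8-bit output range and clipping are assumptions not stated in the text):

```python
import numpy as np

def gain_correct(img, target_value, metered_value, max_value=255):
    """Correction gain = target pixel value / photometry result of the frame."""
    gain = target_value / metered_value  # < 1 for a bright frame, > 1 for a dark one
    return np.clip(img * gain, 0, max_value)
```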
- As described above, one of the two types of images obtained by two types of imaging means with different wavelengths can be an image in which the object to be detected is captured more clearly than in the other image, and in which bright objects that are not the object to be detected are captured less easily than in the other image. Therefore, by not determining as the photometric area the area that is bright in the other image and difficult to capture in the one image, and by determining the remaining area as the photometric area, the area in which a bright object that is not the object to be detected appears is excluded from the photometric area, and automatic exposure control suitable for detecting the object to be detected can be performed for the other image.
- Specifically, when the two types of images are a visible light image and an infrared image, the object to be detected is more clearly visible in the infrared image than in the visible light image, and bright areas that are not the object to be detected are less visible than in the visible light image. Therefore, by not determining as the photometric area the areas that are bright in the visible light image and less visible in the infrared image (i.e., where the intensity of the reflected light component is low), and by determining the other areas as the photometric area, areas that contain bright objects that are not the object to be detected are excluded from the photometric area, and automatic exposure control suitable for detecting the object to be detected can be performed for the visible light image.
- Note that the present disclosure can be realized not only as the image processing device 100, but also as an image processing method including the steps (processes) performed by the components that make up the image processing device 100.
- FIG. 17 is a flowchart showing an example of an image processing method according to another embodiment.
- The image processing method is an image processing method executed by the image processing device 100 and, as shown in FIG. 17, includes an acquisition step (step S101) of acquiring two types of images showing the same area, obtained by two types of imaging means having different wavelengths, and a determination step (step S102) of determining, based on one of the two types of images, a photometric area in which photometry is performed in the other of the two types of images.
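Tying the earlier sketches together, one frame of this method could look roughly like the following (reusing the hypothetical helpers sketched above; the mean as the representative value is one of the options mentioned):

```python
def process_frame(bw, ir, current_exposure, target_value,
                  bw_threshold, ir_threshold):
    """Acquire (S101), determine the photometry area (S102), then meter it
    and update the exposure time."""
    mask = photometry_mask(bw, ir, bw_threshold, ir_threshold)  # S102
    metered = bw[mask].mean()  # representative brightness of the photometry area
    return next_exposure_time(target_value, metered, current_exposure)
```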
- The present disclosure can also be realized as a program for causing a computer (processor) to execute the steps included in the image processing method, and as a non-transitory computer-readable recording medium, such as a CD-ROM, on which the program is recorded.
- For example, each step is executed by running the program using hardware resources such as a computer's CPU, memory, and input/output circuits. That is, each step is executed by the CPU acquiring data from the memory, the input/output circuits, or the like, performing a calculation, and outputting the calculation result to the memory, the input/output circuits, or the like.
- Furthermore, each component included in the image processing device 100 may be configured with dedicated hardware, or may be realized by executing a software program suitable for that component. Each component may be realized by a program execution unit, such as a CPU or a processor, reading and executing a software program recorded on a recording medium such as a hard disk or semiconductor memory.
- Each component included in the image processing device 100 may also be realized as an LSI, which is an integrated circuit. The components may be individually integrated into single chips, or some or all of them may be integrated into a single chip. Furthermore, the integrated circuit is not limited to an LSI; it may be realized using a dedicated circuit or a general-purpose processor. An FPGA (Field Programmable Gate Array) that can be programmed after the LSI is manufactured, or a reconfigurable processor in which the connections and settings of the circuit cells inside the LSI can be reconfigured, may also be used. Moreover, if a circuit integration technology that replaces LSI emerges through advances in semiconductor technology or another derived technology, each component included in the image processing device 100 may be integrated using that technology.
- In addition, this disclosure also includes forms obtained by applying various modifications that a person skilled in the art may conceive to the embodiments, and forms realized by arbitrarily combining the components and functions of the embodiments, within a scope that does not deviate from the spirit of this disclosure.
- An image processing device comprising: an acquisition unit that acquires two types of images showing the same area, obtained by two types of imaging means having different wavelengths; and a determination unit that determines, based on one of the two types of images, a photometric area in which photometry is performed in the other of the two types of images.
- According to this, one of the two types of images obtained by two types of imaging means with different wavelengths can be an image in which the object to be detected is captured more clearly than in the other image, and in which bright objects that are not the object to be detected are captured less easily than in the other image. Therefore, by not determining as the photometric area the area that is bright in the other image and difficult to capture in the one image, and by determining the remaining area as the photometric area, the area in which a bright object that is not the object to be detected appears is excluded from the photometric area, and automatic exposure control suitable for detecting the object to be detected can be performed on the other image.
- Infrared images tend to show the object being detected more clearly than visible light images, and bright areas that are not the object being detected are less visible than in visible light images. Therefore, by not determining areas that are bright in the visible light image but are difficult to see in the infrared image as the photometry area, and instead determining other areas as the photometry area, areas that show bright objects that are not the object being detected are excluded from the photometry area, making it possible to perform automatic exposure control for the visible light image that is suitable for detecting the object being detected.
- ToF sensors can capture infrared images that clearly show nearby objects and not so clearly show distant objects.
- Since the intensity of the reflected light component from nearby objects is high and that from distant objects is low, it is possible to obtain an infrared image in which nearby objects, with a high reflected light component intensity, are easily captured, and distant objects, with a low intensity, are not. Areas with a high reflected light component intensity are areas where the detection target may be captured, and can therefore be determined as photometry areas.
- Also, the intensity of the light components other than the reflected light is very high for high-luminance objects such as direct sunlight, and areas where this intensity is high are areas where such high-luminance objects may be captured; such areas can therefore be kept from being determined as photometry areas.
- An image processing method executed by an image processing device, the method comprising: an acquisition step of acquiring two types of images showing the same area, obtained by two types of imaging means having different wavelengths; and a determination step of determining, based on one of the two types of images, a photometric area in which photometry is performed in the other of the two types of images.
- This disclosure can be applied to devices for detecting objects to be detected.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Studio Devices (AREA)
Abstract
An image processing device (100) comprising: an acquisition unit (exposure time setting unit (10)) which acquires two types of images that show the same area and are obtained by two types of imaging means differing in wavelength from each other; and a determination unit (exposure time setting unit (10)) which determines, on the basis of one of the two types of images, a photometric area which is in the other of the two types of images and in which photometry is performed.
Description
This disclosure relates to image processing devices, etc.
Patent document 1 discloses a technique for setting exposure conditions for a shooting area.
However, when an image captured using the technology disclosed in Patent Document 1 is used for object detection, the target object may not be detected correctly. Specifically, if the captured image contains bright objects that are not the target object, such as direct sunlight, the sky, or a distant white object (note that here, direct sunlight and the sky are also referred to as objects), automatic exposure control darkens the entire image under the influence of that brightness; the area of the target object then becomes dark, and the target object may not be detected correctly.
The present disclosure provides an image processing device and the like that can perform automatic exposure control suitable for detecting an object to be detected.
The image processing device according to the present disclosure includes an acquisition unit that acquires two types of images showing the same area obtained by two types of imaging means having different wavelengths, and a determination unit that determines a photometric area in which photometry is performed in one of the two types of images based on the other of the two types of images.
The image processing method disclosed herein is an image processing method executed by an image processing device, and includes an acquisition step of acquiring two types of images showing the same area obtained by two types of imaging means having different wavelengths, and a determination step of determining, based on one of the two types of images, a photometric area in which photometry is performed in the other of the two types of images.
The program disclosed herein is a program for causing a computer to execute the image processing method described above.
These comprehensive or specific aspects may be realized as a system, method, integrated circuit, computer program, or computer-readable recording medium such as a CD-ROM, or may be realized as any combination of a system, method, integrated circuit, computer program, and recording medium.
An image processing device according to one aspect of the present disclosure can perform automatic exposure control suitable for detecting an object to be detected.
The following describes the embodiment in detail with reference to the drawings.
The embodiments described below are all comprehensive or specific examples. The numerical values, shapes, materials, components, the arrangement and connection of the components, steps, and order of steps shown in the following embodiments are merely examples and are not intended to limit the present disclosure.
(Embodiment)
An image processing device according to an embodiment will be described below.
FIG. 1 is a block diagram showing an example of an image processing device 100 according to an embodiment. In addition to the image processing device 100, FIG. 1 also shows an imaging unit 200 that captures images to be processed by the image processing device 100. Note that the image processing device 100 may also include the imaging unit 200.
The imaging unit 200 is, for example, a camera that has two types of imaging means with different wavelengths and can capture two types of images using the two types of imaging means. For example, the imaging unit 200 is a BW-ToF camera. A BW-ToF camera can simultaneously capture two types of images: a visible light image (e.g., a BW (black and white) image) obtained by receiving visible light in the entire wavelength range, and an infrared image obtained by receiving, using a ToF (Time of Flight) type sensor, the reflected light of emitted infrared light in the infrared range. In other words, the imaging unit 200 can simultaneously capture two types of images, a visible light image and an infrared image, showing the same area. Note that the imaging unit 200 does not have to be a camera that can capture visible light images and infrared images with one camera, such as a BW-ToF camera, as long as it can capture two types of images showing the same area. For example, the imaging unit 200 may have two separate cameras, specifically, a camera capable of capturing visible light images and a camera capable of capturing infrared images.
Note that the same area is captured in the two types of images, but that area may be the entire image area or a portion of the entire image area. In other words, the entire image area of the two types of images may match, or a portion of the entire image area of the two types of images may match. For example, if the imaging unit 200 is a camera such as a BW-ToF camera that can capture two types of images simultaneously, the entire image area of the two types of images may match. On the other hand, if the imaging unit 200 has two separate cameras, a portion of the entire image area of the two types of images may match.
The imaging unit 200 can measure (range) the distance to an object shown in an infrared image based on the intensity of the reflected light component of the emitted infrared light. In other words, the infrared image contains distance information (depth information) of the object shown in the infrared image. Here, the method for calculating the intensity of the reflected light component is explained using Figure 2.
Figure 2 is a diagram to explain how to calculate the intensity of the reflected light components.
For example, when an object is present within the imaging range of the imaging unit 200, reflected light from the object is received in response to the irradiated light shown in FIG. 2. The intensity of the reflected light component can be calculated according to the amount of received light signal in at least three sections after the irradiated light is applied. Specifically, the at least three sections are sections A0 and A1 in which the signal amount is large due to the reflected light shown in FIG. 2, and section A2 in which the signal amount is small. Sections A0 and A1 each contain a portion of the IR reflected light component and a background component, which is a component of light other than the reflected light, and section A2 contains only the background component. Therefore, the intensity of the reflected light component (IR reflected light component) is the total signal amount of A0 and A1 minus the background component. If A0, A1, and A2 are each the signal amount, the reflected light component can be calculated by (A0-A2)+(A1-A2).
The image processing device 100 is a device that performs image processing on two types of images (a visible light image and an infrared image) that show the same area and are obtained by two types of imaging means with different wavelengths in the imaging unit 200. For example, the image processing device 100 performs image processing to output an image suitable for detecting an object (detection target) that appears in the captured image.
The image processing device 100 includes an exposure time setting unit 10, an image synthesis unit 20, and a gain correction unit 30. The image processing device 100 is a computer including a processor (microprocessor) and a memory. The memory is a ROM (Read Only Memory) and a RAM (Random Access Memory), etc., and can store programs executed by the processor. The exposure time setting unit 10, the image synthesis unit 20, and the gain correction unit 30 are realized by a processor that executes programs stored in the memory, etc.
The exposure time setting unit 10 measures the luminance of the scene from the captured visible light image, sets the exposure time so that the brightness of the area of the detected object captured in the image becomes the desired brightness, and outputs the exposure time to the imaging unit 200. Details of the exposure time setting unit 10 will be described later.
The image synthesis unit 20 synthesizes an image from multiple visible light images captured with different exposure times from the imaging unit 200, so that the detected object does not have blown-out highlights or crushed shadows. Details of the image synthesis unit 20 will be described later.
The gain correction unit 30 measures the luminance of the scene in the current frame from the synthesized image, and performs digital gain correction on the image so that the brightness of the area of the detection target becomes the desired brightness. Specifically, there is a delay time until the exposure time set by the exposure time setting unit 10 is reflected in the output image, and the gain correction unit 30 performs digital gain correction to compensate for the output level fluctuation that accompanies this delay time. The gain correction unit 30 will be described in detail later.
First, we will explain the details of the exposure time setting unit 10.
FIG. 3 is a block diagram showing an example of an exposure time setting unit 10 according to an embodiment.
The exposure time setting unit 10 includes a photometry unit 11 and an exposure time calculation unit 12.
The photometry unit 11 is an example of an acquisition unit that acquires two types of images showing the same area, obtained by two types of imaging means with different wavelengths. The photometry unit 11 is also an example of a determination unit that determines, based on one of the two types of images, the photometry area in which photometry is performed in the other of the two types of images.
Specifically, the photometry unit 11 acquires two types of images, a visible light image and an infrared image, and determines a photometry area in the visible light image based on the infrared image. The photometry area is a group of pixels to be measured, and the photometry unit 11 measures the brightness of the determined photometry area. Here, an example of a method for determining the photometry area will be described with reference to FIG. 4.
FIG. 4 is a flowchart showing an example of the operation of the photometry unit 11 according to the embodiment. It is assumed that the photometry unit 11 acquires, for example, a visible light image as shown in FIG. 5 and an infrared image as shown in FIG. 6.
FIG. 5 is a diagram showing an example of an acquired visible light image.
FIG. 6 is a diagram showing an example of an acquired infrared image.
As shown in Figures 5 and 6, the same area is captured in the visible light image and the infrared image, and the photometry unit 11 recognizes pixels in the infrared image that capture the same positions as each pixel in the visible light image. Note that pixels in the visible light image and pixels in the infrared image that capture the same positions are referred to as corresponding pixels.
When the imaging unit 200 has separately provided cameras, one capable of capturing visible light images and one capable of capturing infrared images, the image areas of the two images may only partially coincide rather than match entirely. In that case, matching between the two images can be performed by known means to determine the corresponding pixels that capture the same position in the visible light image and the infrared image.
The photometry unit 11 determines the photometry area by performing the processes from step S11 to step S14 shown in FIG. 4 for each pixel in the visible light image. Note that the area where photometry is not performed is an area excluding the photometry area where photometry is performed, so when the photometry area is determined, the area where photometry is not performed is also determined. In other words, determining the photometry area also means determining the area where photometry is not performed.
First, the photometry unit 11 determines, for a given pixel in the visible light image, whether the BW intensity (the pixel value, that is, the luminance value, of the pixel) is greater than a threshold for the BW intensity (step S11). The threshold for the BW intensity is set, for example, to a value somewhat smaller than the luminance value corresponding to the sky, a white object, or the like, but is not particularly limited and may be set appropriately depending on the space in which the imaging unit 200 is used, its purpose, and so on.
When the photometry unit 11 determines that the BW intensity of a pixel in the visible light image is greater than the threshold for the BW intensity (Yes in step S11), that is, when the pixel in question is a pixel that depicts a bright object such as the sky, direct sunlight, or a white object, it determines whether the IR intensity (intensity of the reflected light component relative to the emitted infrared light) of the pixel in the infrared image corresponding to the pixel in the visible light image is less than the threshold for the IR intensity (step S12). The threshold for the IR intensity is set to a value that is somewhat smaller than the value expected as the intensity of the reflected light component from an object within a certain range when it is desired to detect an object within the certain range from the imaging unit 200, but is not particularly limited and may be set appropriately depending on the space in which the imaging unit 200 is used, the purpose, etc.
When the photometry unit 11 determines that the IR intensity of a pixel in the infrared image is smaller than the threshold for the IR intensity (Yes in step S12), that is, when the pixel does not capture an object (detection target) within a certain range from the imaging unit 200, the photometry unit 11 excludes the corresponding pixel in the visible light image from the photometry target (step S13). In this way, a pixel that captures a bright object such as the sky, direct sunlight, or a distant white object, and that does not capture the detection target, is not determined to be part of the photometry area. In other words, the photometry unit 11 regards such a pixel as capturing a high-luminance object unrelated to the detection target (direct sunlight, the sky, a distant white object, etc.) and excludes it from the photometry area. For example, the white building surrounded by the dashed line in FIGS. 5 and 6 is a bright but distant object, so the pixels that capture this building are excluded from the photometry area.
Furthermore, if the photometry unit 11 determines that the IR intensity of a pixel in the infrared image is equal to or greater than the threshold for IR intensity (No in step S12), that is, if the pixel is a pixel that captures an object (detection target) within a certain range from the imaging unit 200, the photometry unit 11 includes the pixel in the visible light image corresponding to the pixel in the photometry target (step S14). In this way, the photometry unit 11 determines, as the photometry region, an area in the visible light image that corresponds to an area in the infrared image where the reflected light component acquired by the imaging unit 200 (ToF type sensor) is equal to or greater than a predetermined intensity (threshold for IR intensity). In other words, even if the pixel captures a high-luminance object, the photometry unit 11 does not exclude the pixel from the photometry region if the object is the detection target.
On the other hand, if the photometry unit 11 determines that the BW intensity of a pixel in the visible light image is equal to or less than the threshold for the BW intensity (No in step S11), that is, if the pixel does not capture a bright object such as the sky, direct sunlight, or a white object, the photometry unit 11 includes the pixel in the photometry target (step S14). In this way, for a pixel that does not capture a bright object, the photometry unit 11 assigns the pixel to the photometry area without judging the IR intensity of the corresponding pixel in the infrared image.
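As a sketch only, the per-pixel decision of steps S11 to S14 might be expressed as below. The function name, the threshold values, and the assumption that the two images are NumPy arrays of mutually corresponding pixels are all illustrative, not part of the disclosure.

```python
import numpy as np

def photometry_mask(bw: np.ndarray, ir: np.ndarray,
                    bw_threshold: float = 200.0,
                    ir_threshold: float = 50.0) -> np.ndarray:
    """Return a boolean mask of pixels included in the photometry area.

    bw holds the visible-light (BW) intensities and ir the IR reflected
    light intensities of the corresponding pixels; the thresholds are
    placeholder values.
    """
    bright = bw > bw_threshold        # step S11: bright object?
    no_target = ir < ir_threshold     # step S12: no nearby object?
    excluded = bright & no_target     # step S13: exclude from photometry
    return ~excluded                  # step S14: all other pixels are metered
```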
FIG. 7 is a diagram showing an example of an area where photometry is not performed.
The white areas in Figure 7 indicate areas where no photometry is performed, and the black areas indicate photometry areas. Note that in this example, the threshold for the BW intensity is set to be greater than the BW intensity of pixels that show the sky, and pixels that show the sky are not excluded from the photometry area.
The photometry unit 11 measures the brightness of the photometry area determined in this manner. For example, the photometry unit 11 measures a representative value (e.g., average, median, or mode) of the pixel values of the pixel group to be metered as the brightness of the photometry area. The photometry unit 11 outputs the brightness of the photometry area to the exposure time calculation unit 12.
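Continuing the sketch above, the metered brightness could then be taken as, say, the mean over the masked pixels; the choice of representative value is a design parameter, as the text notes.

```python
# Hypothetical usage of photometry_mask(); bw and ir are the corresponding
# visible-light and IR intensity arrays from the earlier sketch.
brightness = float(bw[photometry_mask(bw, ir)].mean())
```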
Next, the exposure time calculation unit 12 will be explained using Figure 8.
FIG. 8 is a flowchart showing an example of the operation of the exposure time calculation unit 12 according to the embodiment.
The exposure time calculation unit 12 calculates the exposure time from the ratio of the photometry result of the current frame (the measured brightness of the photometry area) to the target pixel value (step S31). Specifically, the exposure time calculation unit 12 calculates the exposure time as (target pixel value / photometry result of the current frame) × (current exposure time). That is, if the photometry result of the current frame is greater than the target pixel value (i.e., the frame is bright), the exposure time calculation unit 12 calculates an exposure time shorter than the current exposure time. Conversely, if the photometry result of the current frame is smaller than the target pixel value (i.e., the frame is dark), the exposure time calculation unit 12 calculates an exposure time longer than the current exposure time. The target pixel value is not particularly limited, but may be, for example, a value intermediate between the blown-out highlight value and the crushed shadow value.
Then, the exposure time calculation unit 12 outputs the calculated exposure time to the imaging unit 200. As a result, the imaging unit 200 performs exposure using the acquired exposure time and captures a visible light image. If an exposure time shorter than the current exposure time is calculated, a visible light image that is darker than the current frame can be captured, and if an exposure time longer than the current exposure time is calculated, a visible light image that is brighter than the current frame can be captured.
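A minimal sketch of step S31, assuming the photometry result and the exposure time are simple scalar values:

```python
def next_exposure_time(metered_brightness: float,
                       current_exposure: float,
                       target_pixel_value: float = 128.0) -> float:
    """Scale the exposure time by the target-to-metered ratio (step S31).

    A brighter-than-target frame yields a shorter exposure time and a
    darker frame a longer one; the target value is a placeholder.
    """
    return (target_pixel_value / metered_brightness) * current_exposure
```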
FIG. 9 is a diagram showing an example of a visible light image obtained when automatic exposure control is performed by a conventional method.
FIG. 10 is a diagram showing an example of a visible light image obtained when automatic exposure control is performed by the image processing device 100 according to the embodiment.
In the conventional method, as shown in FIG. 9, automatic exposure control darkens the entire image under the influence of the bright white building in the background; the area containing the detection target (e.g., a vehicle), surrounded by a dashed line, therefore becomes dark, and the detection target may not be detected correctly. With the image processing device 100 according to the embodiment, by contrast, the white building in the background is excluded from the photometry area, as shown in FIG. 10, so automatic exposure control is performed without being influenced by its brightness; the area containing the detection target, surrounded by a dashed line, does not become dark, making the detection target easier to detect.
In this way, the intensity of the reflected light component from nearby objects is high and the intensity of the reflected light component from distant objects is low, making it possible to obtain an infrared image in which nearby objects with high reflected light component intensity are easily captured and distant objects with low reflected light component intensity are not easily captured. Areas with high reflected light component intensity are areas in which the detected object may be captured, and can therefore be determined as photometry areas.
Although a method for determining the photometry area using the intensity of the reflected light component has been described, there is also a method for determining the photometry area without using it. Such a method is described below with reference to FIG. 11.
FIG. 11 is a flowchart showing another example of the operation of the photometry unit 11 according to the embodiment.
The photometry unit 11 determines the photometry area by performing the processes from step S21 to step S24 shown in FIG. 11 for each pixel in the visible light image. Note that the area where photometry is not performed is an area excluding the photometry area where photometry is performed, so when the photometry area is determined, the area where photometry is not performed is also determined. In other words, determining the photometry area also means determining the area where photometry is not performed.
First, the photometry unit 11 determines, for a given pixel in the visible light image, whether the BW intensity (the pixel value, that is, the luminance value, of the pixel) is greater than a threshold for the BW intensity (step S21). The threshold for the BW intensity is set, for example, to a value somewhat smaller than the luminance value corresponding to the sky, a white object, or the like, but is not particularly limited and may be set appropriately depending on the space in which the imaging unit 200 is used, its purpose, and so on.
When the photometry unit 11 determines that the BW intensity of a pixel in the visible light image is greater than the threshold for the BW intensity (Yes in step S21), that is, when the pixel in question is a pixel that depicts a bright object such as the sky, direct sunlight, or a white object, it determines whether the IRBG intensity (intensity of light components other than reflected light: intensity of background components) of the pixel in the infrared image that corresponds to the pixel in the visible light image is greater than the threshold for the IRBG intensity (step S22). The threshold for the IRBG intensity is set to a value that is somewhat smaller than the saturation level obtained when receiving direct sunlight, for example, but is not particularly limited and may be set appropriately depending on the space in which the imaging unit 200 is used, the purpose, etc.
When the photometry unit 11 determines that the IRBG intensity of a pixel in the infrared image is greater than the threshold for the IRBG intensity (Yes in step S22), that is, when the pixel captures direct sunlight or the like, the photometry unit 11 excludes the corresponding pixel in the visible light image from the photometry target (step S23). In this way, the photometry unit 11 does not determine, as the photometry area, an area in the visible light image corresponding to an area in the infrared image where the component of light other than the reflected light (background component) acquired by the imaging unit 200 (ToF sensor) is greater than a predetermined intensity (the threshold for the IRBG intensity).
Furthermore, if the photometry unit 11 determines that the IRBG intensity of a pixel in the infrared image is equal to or lower than the threshold for the IRBG intensity (No in step S22), that is, if the pixel does not capture direct sunlight or the like, the photometry unit 11 includes the corresponding pixel in the visible light image in the photometry target (step S24). In this way, even for a pixel that captures a high-luminance object, the photometry unit 11 does not exclude the pixel from the photometry target as long as the pixel does not capture direct sunlight or the like.
On the other hand, if the photometry unit 11 determines that the BW intensity of a pixel in the visible light image is equal to or less than the threshold for the BW intensity (No in step S21), that is, if the pixel does not capture a bright object such as the sky, direct sunlight, or a white object, the photometry unit 11 includes the pixel in the photometry target (step S24). In this way, for a pixel that does not capture a bright object, the photometry unit 11 assigns the pixel to the photometry area without judging the IRBG intensity of the corresponding pixel in the infrared image.
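A sketch of this variant, under the same illustrative assumptions as the earlier mask (NumPy arrays of corresponding pixels, placeholder thresholds):

```python
import numpy as np

def photometry_mask_bg(bw: np.ndarray, ir_bg: np.ndarray,
                       bw_threshold: float = 200.0,
                       bg_threshold: float = 900.0) -> np.ndarray:
    """Mask variant using the background (non-reflected) IR component.

    Pixels that are bright in the visible image (step S21) and whose IR
    background component exceeds the threshold (step S22), e.g. direct
    sunlight, are excluded (step S23); all others are metered (step S24).
    """
    excluded = (bw > bw_threshold) & (ir_bg > bg_threshold)
    return ~excluded
```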
The subsequent operation of the photometry unit 11 is the same as when the photometry area is determined using the intensity of the reflected light components, so a detailed explanation is omitted.
In this way, the intensity of light components other than reflected light from high-brightness objects such as direct sunlight (background components) is very high, and areas where the intensity of light components other than reflected light is high are areas where high-brightness objects such as direct sunlight may be captured, so they can be prevented from being determined as photometry areas.
When determining the photometric area using the intensity of the reflected light components, an area in the visible light image corresponding to an area in the infrared image where the light components other than the reflected light are greater than a predetermined intensity may not be determined as the photometric area.
The photometry unit 11 may divide the visible light image into a number of blocks, and exclude from the photometry target those areas in the blocks in which the proportion of areas that are not photometry areas is equal to or greater than a predetermined proportion. This will be explained using FIG. 12.
FIG. 12 is a diagram for explaining blocks where photometry is not performed. The white areas in FIG. 12 indicate areas where photometry is not performed, and the black areas indicate photometry areas.
For example, the photometry unit 11 divides the visible light image into multiple blocks (multiple blocks surrounded by white lines) as shown in FIG. 12. Next, the photometry unit 11 calculates the proportion of the area where photometry is not performed (white area in FIG. 12) for each block. Specifically, the photometry unit 11 calculates the proportion of the number of pixels in the area where photometry is not performed to the number of pixels included in one block for each block. Then, the photometry unit 11 excludes all pixels included in blocks where the proportion of the area that is not a photometry area is equal to or greater than a predetermined proportion from the photometry target. In other words, in a block that includes many pixels that are not a photometry target, all pixels are excluded from the photometry target. This makes it possible to exclude the areas surrounding the areas where photometry is not performed from the photometry target. For example, the predetermined proportion is 50%, but is not particularly limited.
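A possible sketch of this block-wise exclusion, assuming the boolean mask from the earlier sketches and an illustrative block size:

```python
import numpy as np

def exclude_sparse_blocks(mask: np.ndarray,
                          block_size: int = 32,
                          min_ratio: float = 0.5) -> np.ndarray:
    """Clear whole blocks whose non-metered fraction is >= min_ratio.

    mask is a boolean photometry mask (True = metered); the 50% ratio
    follows the example in the text, and both parameters are adjustable.
    """
    out = mask.copy()
    h, w = mask.shape
    for y in range(0, h, block_size):
        for x in range(0, w, block_size):
            block = mask[y:y + block_size, x:x + block_size]
            if (~block).mean() >= min_ratio:
                out[y:y + block_size, x:x + block_size] = False
    return out
```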
Next, we will explain the details of the image synthesis unit 20.
FIG. 13 is a block diagram showing an example of the image synthesis unit 20 according to the embodiment. It is assumed that the image synthesis unit 20 acquires, from the imaging unit 200, visible light images captured with three different exposure times. For example, the ratio between the three exposure times is constant: when the exposure time setting unit 10 calculates an exposure time shorter than the current one, all three exposure times become shorter while their ratio remains constant, and when it calculates an exposure time longer than the current one, all three exposure times become longer while their ratio remains constant.
The image synthesis unit 20 is a processing unit for realizing WDR (Wide Dynamic Range) and includes, for example, offset correction units 21a, 21b, and 21c and an output synthesis unit 22.
The offset correction unit 21a receives a visible light image (called the L image) taken with the longest exposure time of the three types of exposure times, the offset correction unit 21b receives a visible light image (called the M image) taken with the second longest exposure time of the three types of exposure times, and the offset correction unit 21c receives a visible light image (called the S image) taken with the shortest exposure time of the three types of exposure times. Of the L, M, and S images, the L image is the brightest image, the M image is the next brightest image, and the S image is the darkest image. Here, the offset correction unit 21a will be explained using Figure 14.
FIG. 14 is a flowchart showing an example of the operation of the offset correction unit 21a according to the embodiment.
The offset correction unit 21a performs the processes of steps S41 and S42 shown in FIG. 14 for each pixel in the L image.
First, the offset correction unit 21a subtracts the offset level from the input image (L image) (step S41).
Next, the offset correction unit 21a replaces pixel values equal to or greater than the clip level with the clip level (step S42).
Note that offset correction units 21b and 21c are the same as offset correction unit 21a except for the input image, so a description will be omitted.
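A minimal sketch of the two offset-correction steps, with the offset and clip levels as illustrative parameters of a hypothetical sensor pipeline:

```python
import numpy as np

def offset_correct(image: np.ndarray,
                   offset_level: float,
                   clip_level: float) -> np.ndarray:
    """Apply steps S41 and S42 to one input image (L, M, or S)."""
    corrected = image.astype(np.float64) - offset_level  # step S41: subtract offset
    return np.minimum(corrected, clip_level)             # step S42: clip at clip level
```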
Next, the output synthesis unit 22 will be explained using FIG. 15.
FIG. 15 is a flowchart showing an example of the operation of the output synthesis unit 22 according to the embodiment.
The output synthesis unit 22 performs the processes from step S51 to step S59 shown in FIG. 15 for each pixel at the same position in the L image, M image, and S image.
First, for a pixel at the same position in the L image, the M image, and the S image, the output synthesis unit 22 determines whether the pixel value of that pixel in the L image is smaller than a first threshold (referred to as threshold 1) (step S51).
If the output synthesis unit 22 determines that the pixel value of the pixel in the L image is smaller than threshold 1 (Yes in step S51), it outputs the pixel value of the pixel in the L image (step S52).
If the output synthesis unit 22 determines that the pixel value of the pixel in the L image is equal to or greater than threshold 1 (No in step S51), it determines whether the pixel value of the pixel in the L image is smaller than a second threshold (referred to as threshold 2) (step S53). Threshold 2 is a value greater than threshold 1.
If the output synthesis unit 22 determines that the pixel value of the pixel in the L image is smaller than threshold 2 (Yes in step S53), it outputs a value obtained by weighting and adding the pixel value of the pixel in the L image and the pixel value of the pixel in the M image (step S54).
If the output synthesis unit 22 determines that the pixel value of the pixel in the L image is equal to or greater than threshold value 2 (No in step S53), it determines whether the pixel value of the pixel in the M image is smaller than a third threshold value (hereinafter referred to as threshold value 3) (step S55).
If the output synthesis unit 22 determines that the pixel value of the pixel in the M image is smaller than threshold 3 (Yes in step S55), it outputs the pixel value of the pixel in the M image (step S56).
If the output synthesis unit 22 determines that the pixel value of the pixel in the M image is equal to or greater than threshold value 3 (No in step S55), it determines whether the pixel value of the pixel in the M image is smaller than a fourth threshold value (hereinafter referred to as threshold value 4) (step S57). Threshold value 4 is a value greater than threshold value 3.
If the output synthesis unit 22 determines that the pixel value of the pixel in the M image is smaller than threshold value 4 (Yes in step S57), it outputs a value obtained by weighting and adding the pixel value of the pixel in the M image and the pixel value of the pixel in the S image (step S58).
If the output synthesis unit 22 determines that the pixel value of the pixel in the M image is equal to or greater than threshold value 4 (No in step S57), it outputs the pixel value of the pixel in the S image (step S59).
In this way, WDR is achieved.
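Putting steps S51 to S59 together for a single pixel position, a sketch could look as follows; the blend weight is an assumption, since the text does not specify the weighting scheme.

```python
def synthesize_pixel(l: float, m: float, s: float,
                     t1: float, t2: float, t3: float, t4: float,
                     w: float = 0.5) -> float:
    """Per-pixel WDR synthesis following steps S51 to S59.

    l, m, and s are the pixel values of the L, M, and S images at one
    position; t1 < t2 and t3 < t4 are the four thresholds.
    """
    if l < t1:
        return l                      # step S52: L image only
    if l < t2:
        return w * l + (1 - w) * m    # step S54: weighted L/M sum
    if m < t3:
        return m                      # step S56: M image only
    if m < t4:
        return w * m + (1 - w) * s    # step S58: weighted M/S sum
    return s                          # step S59: S image only
```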
Next, we will explain the details of the gain correction unit 30.
FIG. 16 is a block diagram showing an example of a gain correction unit 30 according to an embodiment.
The gain correction unit 30 includes a photometry unit 31, a gain calculation unit 32, and a gain application unit 33.
The photometry unit 31 acquires the composite image synthesized by the image synthesis unit 20 and measures its brightness (pixel values). For example, the photometry unit 31 measures a representative value (e.g., the average, median, or mode) of the pixel values as the brightness.
The gain calculation unit 32 calculates the correction gain from the ratio between the photometry result (measured brightness) and the target pixel value. Specifically, the gain calculation unit 32 calculates the correction gain as (target pixel value / photometry result of the current frame). That is, if the photometry result of the current frame is greater than the target pixel value (i.e., the frame is bright), the gain calculation unit 32 calculates a small correction gain. Conversely, if the photometry result of the current frame is smaller than the target pixel value (i.e., the frame is dark), the gain calculation unit 32 calculates a large correction gain. The target pixel value is not particularly limited, but may be, for example, a value intermediate between the blown-out highlight value and the crushed shadow value.
The gain application unit 33 applies the calculated correction gain to each pixel of the composite image synthesized by the image synthesis unit 20. When a small correction gain is calculated, the composite image can be corrected to a dark image, and when a large correction gain is calculated, the composite image can be corrected to a bright image.
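A minimal sketch of the gain computation and its application, with the target pixel value again a placeholder:

```python
import numpy as np

def apply_gain_correction(image: np.ndarray,
                          metered_brightness: float,
                          target_pixel_value: float = 128.0) -> np.ndarray:
    """Compute the correction gain (target / metered) and apply it per pixel."""
    gain = target_pixel_value / metered_brightness
    return image * gain
```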
As explained above, depending on the wavelength of the imaging means, it is possible to capture an image in which nearby objects are easily captured and distant objects are not easily captured. Note that the object to be detected is often a nearby object, and bright objects that are not the object to be detected, such as direct sunlight, the sky, and distant white objects, are often distant objects. For this reason, one of the two types of images obtained by two types of imaging means with different wavelengths can be an image in which the object to be detected is more clearly captured than the other image, and in which bright objects that are not the object to be detected are less easily captured than the other image. Therefore, by not determining the area that is bright in the other of the two types of images and is difficult to capture in the one image as the photometric area, and determining the other area as the photometric area, the area in which a bright object that is not the object to be detected is captured is excluded from the photometric area, and automatic exposure control suitable for detecting the object to be detected can be performed for the other image.
For example, the two types of images are a visible light image and an infrared image, and the object to be detected is more clearly visible in the infrared image than in the visible light image, and bright areas that are not the object to be detected are less visible than in the visible light image. Therefore, by not determining areas that are bright in the visible light image and less visible in the infrared image (i.e., the intensity of the reflected light components is reduced) as the photometric area, and instead determining other areas as the photometric area, areas that contain bright objects that are not the object to be detected are excluded from the photometric area, and automatic exposure control suitable for detecting the object to be detected can be performed for the visible light image.
(Other embodiments)
As described above, an embodiment has been described as an illustration of the technology according to the present disclosure. However, the technology according to the present disclosure is not limited to this embodiment, and is also applicable to embodiments in which changes, substitutions, additions, omissions, and the like are made as appropriate. For example, the following variations are also included in an embodiment of the present disclosure.
For example, the present disclosure can be realized not only as an image processing device 100, but also as an image processing method including steps (processing) performed by components that make up the image processing device 100.
FIG. 17 is a flowchart showing an example of an image processing method according to another embodiment.
The image processing method is an image processing method executed by the image processing device 100, and includes an acquisition step (step S101) of acquiring two types of images showing the same area obtained by two types of imaging means having different wavelengths, as shown in FIG. 17, and a determination step (step S102) of determining, based on one of the two images, a photometric area in which photometry is performed in the other of the two images.
For example, the present disclosure can be realized as a program for causing a computer (processor) to execute the steps included in the image processing method. Furthermore, the present disclosure can be realized as a non-transitory computer-readable recording medium, such as a CD-ROM, on which the program is recorded.
For example, when the present disclosure is realized as a program (software), each step is performed by running the program using hardware resources such as a computer's CPU, memory, and input/output circuits. In other words, each step is performed by the CPU obtaining data from memory or input/output circuits, etc., performing calculations, and outputting the results of the calculations to memory or input/output circuits, etc.
In the above embodiment, each component included in the image processing device 100 may be configured with dedicated hardware, or may be realized by executing a software program suitable for each component. Each component may be realized by a program execution unit such as a CPU or processor reading and executing a software program recorded on a recording medium such as a hard disk or semiconductor memory.
Some or all of the functions of the image processing device 100 according to the above embodiment are typically realized as an LSI, which is an integrated circuit. These may be individually integrated into a single chip, or may be integrated into a single chip that includes some or all of the functions. Furthermore, the integrated circuit is not limited to an LSI, and may be realized using a dedicated circuit or a general-purpose processor. An FPGA (Field Programmable Gate Array) that can be programmed after the LSI is manufactured, or a reconfigurable processor that can reconfigure the connections and settings of circuit cells inside the LSI may also be used.
Furthermore, if an integrated circuit technology that can replace LSIs emerges due to advances in semiconductor technology or other derived technologies, it is natural that each component included in the image processing device 100 may be integrated into an integrated circuit using that technology.
In addition, this disclosure also includes forms obtained by applying various modifications to the embodiments that a person skilled in the art may conceive, and forms realized by arbitrarily combining the components and functions of each embodiment within the scope that does not deviate from the spirit of this disclosure.
(Additional Note)
The above description of the embodiments discloses the following techniques.
(Technology 1) An image processing device comprising: an acquisition unit that acquires two types of images showing the same area, obtained by two types of imaging means having different wavelengths; and a determination unit that determines, based on one of the two types of images, a photometric area in which photometry is performed in the other of the two types of images.
Depending on the wavelength of the imaging means, it is possible to capture an image in which nearby objects are easily captured and distant objects are not easily captured. Note that the object to be detected is often a nearby object, and bright objects that are not the object to be detected, such as direct sunlight, the sky, or distant white objects, are often distant objects. For this reason, one of the two types of images obtained by two types of imaging means with different wavelengths can be an image in which the object to be detected is more clearly captured than in the other image, and in which bright objects that are not the object to be detected are less easily captured than in the other image. Therefore, by not determining the area that is bright in the other of the two types of images and is difficult to capture in the one image as the photometric area, and determining the other area as the photometric area, the area in which a bright object that is not the object to be detected is captured is excluded from the photometric area, and automatic exposure control suitable for detecting the object to be detected can be performed on the other image.
(Technology 2) The image processing device described in Technology 1, in which the two types of images are a visible light image and an infrared image, one of the images being the infrared image and the other image being the visible light image.
Infrared images tend to show the object being detected more clearly than visible light images, and bright areas that are not the object being detected are less visible than in visible light images. Therefore, by not determining areas that are bright in the visible light image but are difficult to see in the infrared image as the photometry area, and instead determining other areas as the photometry area, areas that show bright objects that are not the object being detected are excluded from the photometry area, making it possible to perform automatic exposure control for the visible light image that is suitable for detecting the object being detected.
(Technology 3) The image processing device described in Technology 2, in which the infrared image is an image obtained by a ToF sensor.
A ToF sensor makes it possible to capture an infrared image in which nearby objects are easily captured and distant objects are not easily captured.
(Technology 4) The image processing device described in Technology 3, in which the determination unit determines, as the photometric area, an area in the visible light image corresponding to an area in the infrared image in which the reflected light component acquired by the ToF sensor is equal to or greater than a predetermined intensity.
Since the intensity of the reflected light components from nearby objects is high and the intensity of the reflected light components from distant objects is low, it is possible to obtain an infrared image in which nearby objects with high reflected light component intensity are easily captured and distant objects with low reflected light component intensity are not easily captured. Areas with high reflected light component intensity are areas where the detected object may be captured, and can therefore be determined as photometry areas.
(Technology 5) The image processing device according to Technology 3 or 4, in which the determination unit does not determine, as the photometric area, an area in the visible light image corresponding to an area in the infrared image in which light components other than reflected light acquired by the ToF sensor are greater than a predetermined intensity.
The intensity of light components other than reflected light from high-brightness objects such as direct sunlight (background components) is very high, and areas where the intensity of light components other than reflected light is high are areas where high-brightness objects such as direct sunlight may be captured, so they can be prevented from being determined as photometry areas.
(Technology 6) An image processing method executed by an image processing device, the image processing method including: an acquisition step of acquiring two types of images showing the same area obtained by two types of imaging means having different wavelengths; and a determination step of determining, based on one of the two types of images, a photometric area in which photometry is performed in the other of the two types of images.
This makes it possible to provide an image processing method that can perform automatic exposure control suitable for detecting an object to be detected.
(Technology 7) A program for causing a computer to execute the image processing method described in Technology 6.
This makes it possible to provide a program that can perform automatic exposure control suitable for detecting the object to be detected.
This disclosure can be applied to devices for detecting objects to be detected.
REFERENCE SIGNS LIST
10 Exposure time setting unit
11, 31 Photometry unit
12 Exposure time calculation unit
20 Image synthesis unit
21a, 21b, 21c Offset correction unit
22 Output synthesis unit
30 Gain correction unit
32 Gain calculation unit
33 Gain application unit
100 Image processing device
200 Imaging unit
Claims (7)
- An image processing device comprising: an acquisition unit that acquires two types of images of the same area obtained by two types of imaging means having different wavelengths; and a determination unit that determines, based on one of the two types of images, a photometric area in which photometry is performed in the other of the two types of images.
- The image processing device according to claim 1, wherein the two types of images are a visible light image and an infrared image, the one image is the infrared image, and the other image is the visible light image.
- The image processing device according to claim 2, wherein the infrared image is an image obtained by a ToF (Time of Flight) sensor.
- The image processing device according to claim 3, wherein the determination unit determines, as the photometric area, an area in the visible light image corresponding to an area in the infrared image in which a component of reflected light acquired by the ToF sensor has a predetermined intensity or more.
- The image processing device according to claim 3 or 4, wherein the determination unit does not determine, as the photometric area, an area in the visible light image corresponding to an area in the infrared image in which a light component other than the reflected light acquired by the ToF sensor is greater than a predetermined intensity.
- An image processing method executed by an image processing device, the method comprising: an acquisition step of acquiring two types of images of the same area obtained by two types of imaging means having different wavelengths; and a determination step of determining, based on one of the two types of images, a photometric area in which photometry is performed in the other of the two types of images.
- A program for causing a computer to execute the image processing method according to claim 6.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2023026071 | 2023-02-22 | | |
JP2023-026071 | 2023-02-22 | | |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2024176953A1 (en) | 2024-08-29 |
Family
ID=92501124
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2024/005412 (WO2024176953A1) | Image processing device, image processing method, and program | 2023-02-22 | 2024-02-16 |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2024176953A1 (en) |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2009100252A (en) * | 2007-10-17 | 2009-05-07 | Mitsubishi Electric Corp | Imaging device |
WO2018150768A1 (en) * | 2017-02-20 | 2018-08-23 | Sony Corporation | Photometry device, photometry method, program, and imaging device |
Similar Documents
Publication | Title |
---|---|
US10574961B2 | Image processing apparatus and image processing method thereof |
US9516290B2 | White balance method in multi-exposure imaging system |
RU2528590C2 | Image capturing device, method of controlling same and storage medium |
US20200043225A1 | Image processing apparatus and control method thereof |
JP6020199B2 | Image processing apparatus, method, program, and imaging apparatus |
JP6160004B2 | Scene recognition method and apparatus |
US20070047803A1 | Image processing device with automatic white balance |
WO2020059565A1 | Depth acquisition device, depth acquisition method and program |
JP2007074163A | Imaging device and imaging method |
US20110150357A1 | Method for creating high dynamic range image |
US7486884B2 | Imaging device and imaging method |
US20120127336A1 | Imaging apparatus, imaging method and computer program |
US10397473B2 | Image processing apparatus having an image synthesis unit that generates synthesized image data depending on an object brightness, and related image-pickup apparatus |
JP5988093B2 | Image processing apparatus, object identification apparatus, and program |
JP2018041380A | Image processing apparatus, image processing method, and program |
JP6025472B2 | Image processing apparatus and image processing method |
JP2015144475A | Imaging apparatus, control method of the same, program and storage medium |
JP4872277B2 | Imaging apparatus and imaging method |
JP4841582B2 | Image correction program and image correction apparatus |
WO2024176953A1 (en) | Image processing device, image processing method, and program |
KR101710630B1 | Photographing apparatus, photographing method and recording medium |
KR20110067700A | Image acquisition method and digital camera system |
CN114143419B | Dual-sensor camera system and depth map calculation method thereof |
JP7446080B2 | Image processing device, imaging device, control method, program and imaging system |
JP6348883B2 | Image capturing apparatus, image capturing method, and computer program |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 24760253; Country of ref document: EP; Kind code of ref document: A1 |