WO2015133098A1 - Image processing apparatus, imaging apparatus, image processing method, and storage medium storing a program
- Publication number
- WO2015133098A1 (PCT/JP2015/001000)
- Authority
- WO
- WIPO (PCT)
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
- G06V10/60—Extraction of image or video features relating to illumination properties, e.g. using a reflectance or lighting model
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/70—Denoising; Smoothing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/90—Determination of colour characteristics
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/56—Cameras or camera modules comprising electronic image sensors; Control thereof provided with illuminating means
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/70—Circuitry for compensating brightness variation in the scene
- H04N23/73—Circuitry for compensating brightness variation in the scene by influencing the exposure time
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10024—Color image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30181—Earth observation
- G06T2207/30192—Weather; Meteorology
Definitions
- The present invention relates to an image processing apparatus, an imaging apparatus, an image processing method, and a storage medium storing a program.
- The outdoor shooting environment may contain fine particles floating in the atmosphere, such as water droplets, smoke, sand, and dust, under bad weather conditions such as fog, mist, and haze (hereinafter sometimes collectively referred to as "haze").
- Under haze, reflected light from an object to be imaged is scattered by particles in the atmosphere while traveling along the path to the camera, which is the imaging device.
- As a result, the reflected light from the object is attenuated before it reaches the camera sensor.
- At the same time, ambient light is scattered by the particles in the atmosphere and also reaches the camera sensor. Therefore, the light arriving at the camera sensor (the observation light) is a mixture of the attenuated reflected light from the object and the scattered ambient light.
- Consequently, the image captured by the camera sensor contains a degradation component that appears whitish.
- The observation light I(x, λ) at wavelength λ at pixel position x of the camera sensor is expressed by Equation (1) using the reflected light J(x, λ) and the ambient light A(λ) at the same position.
- Here, t(x, λ) represents the transmittance of the reflected light. If the atmospheric condition of the environment is uniform, t(x, λ) is expressed by Equation (2) using the diffusion coefficient per unit distance k(λ) and the distance d(x) from the camera sensor to the object.
- If the diffusion coefficient is assumed to be independent of wavelength, the observation light I(x, λ) and the transmittance t(x) can be expressed as in Equations (3) and (4).
- Image restoration (estimation) techniques that remove image degradation caused by particles in the atmosphere (the influence of haze) from images taken in such an environment estimate, from the observation light I(x, λ), the unattenuated reflected light J(x, λ) from the object.
- Specifically, the transmittance t(x) of the reflected light is estimated, and the reflected light J(x, λ) is calculated as shown in Equation (5).
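Equations (1)-(5) themselves are not reproduced in this text. A standard formulation of the atmospheric scattering model that is term-by-term consistent with the description above (reconstructed, not quoted from the publication) is:

```latex
% Eq. (1): observation light as a mix of attenuated reflection and scattered ambient light
I(x,\lambda) = J(x,\lambda)\,t(x,\lambda) + A(\lambda)\bigl(1 - t(x,\lambda)\bigr)
% Eq. (2): transmittance under a uniform atmosphere
t(x,\lambda) = e^{-k(\lambda)\,d(x)}
% Eqs. (3)-(4): with a wavelength-independent diffusion coefficient k
I(x,\lambda) = J(x,\lambda)\,t(x) + A(\lambda)\bigl(1 - t(x)\bigr),
\qquad t(x) = e^{-k\,d(x)}
% Eq. (5): restoration of the reflected light from an estimated t(x)
J(x,\lambda) = \frac{I(x,\lambda) - A(\lambda)\bigl(1 - t(x)\bigr)}{t(x)}
```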
- Such an image restoration (estimation) technique must estimate two pieces of information, the reflected light J(x, λ) and the transmittance t(x), from the single observation light I(x, λ) at each pixel. Such an image restoration technique is therefore an ill-posed problem with no unique solution. For this reason, estimating the optimal reflected light J(x, λ) and transmittance t(x) in such an image restoration technique requires some prior knowledge about the environment.
- Several techniques have been proposed so far that estimate the reflected light or the transmittance and remove the effects of haze-based image degradation. Among them, techniques that perform the correction process from a single image are described in Non-Patent Document 1 and Non-Patent Document 2.
- Non-Patent Document 1 uses statistical knowledge as the prior knowledge: in general, a natural image free of haze contains, around any pixel of interest, a pixel whose value is 0 in at least one of the RGB color channels.
- The technique described in Non-Patent Document 1 generates a restored image based on this statistical knowledge. When no pixel around the pixel of interest has a value of 0 in any channel, the technique described in Non-Patent Document 1 regards the nonzero value as being caused by haze.
- The method described in Non-Patent Document 1 therefore calculates the transmittance based on the channel values of the pixels around the target pixel.
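Non-Patent Document 1 is not identified in this text, but the statistical prior described above matches the widely known dark channel prior. The following is a minimal sketch of transmittance estimation under that assumption; the function name, the naive windowed minimum, and the parameter values are illustrative, not taken from the publication.

```python
import numpy as np

def transmittance_dark_channel(image, ambient, omega=0.95, patch=3):
    """Estimate transmittance t(x) from the dark-channel statistic (a sketch).

    image:   H x W x 3 array in [0, 1]
    ambient: length-3 ambient light color A
    omega:   fraction of haze removed (keeping a little preserves depth cues)
    patch:   side of the local window around each pixel
    """
    # Normalize each channel by the ambient light, then take the per-pixel minimum.
    norm = np.min(image / np.asarray(ambient, dtype=float), axis=2)
    h, w = norm.shape
    r = patch // 2
    # Dark channel: minimum over a local window (naive O(h*w*patch^2) version).
    padded = np.pad(norm, r, mode='edge')
    dark = np.empty_like(norm)
    for i in range(h):
        for j in range(w):
            dark[i, j] = padded[i:i + patch, j:j + patch].min()
    # A haze-free natural image has a near-zero dark channel, so any residual
    # dark-channel value is attributed to haze.
    return 1.0 - omega * dark
```

A pure-haze region (image equal to the ambient color) yields a dark channel of 1 and hence a low transmittance, while a region with a zero channel yields full transmittance.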
- Non-Patent Document 2 uses, as the prior knowledge, the decorrelation between the texture of an object and the distance to the object (i.e., the degree to which ambient light is superimposed through degradation processes such as haze).
- The technique described in Non-Patent Document 2 separates the reflected light and the ambient light by exploiting this decorrelation.
- The methods for removing degradation components such as haze described in Non-Patent Document 1 and Non-Patent Document 2 assume that the ambient light is irradiated uniformly, that is, that the amount of ambient light is the same at every position in the shooting environment. However, when shooting with illumination light such as a light, the amount of illumination at each position in the shooting environment is not the same. Therefore, when photographing with illumination light, the methods described in Non-Patent Document 1 and Non-Patent Document 2 have the problem that they do not operate correctly when removing the image degradation component from a captured image and restoring the image.
- That is, Non-Patent Document 1 and Non-Patent Document 2 have the problem that an image captured using illumination light cannot be corrected correctly.
- An object of the present invention is to provide an image processing apparatus, an imaging apparatus, an image processing method, and a storage medium storing a program that are capable of appropriately correcting image degradation in an image photographed in an environment where the illumination light is not irradiated uniformly at each position in the shooting environment.
- An image processing apparatus according to an aspect of the present invention includes: reflected light restoration means for restoring the reflected light on the surface of the photographic subject based on a captured image, an illumination superposition rate indicating the degree of the influence of attenuation or diffusion of the illumination light caused by particles in the atmosphere in the captured image, and an illumination light color that is color information of the illumination light; and illumination light restoration means for restoring the illumination light based on the restored reflected light and generating an output image in which the captured image is restored based on the restored illumination light and the captured image.
- An imaging device according to an aspect of the present invention includes the above-described image processing apparatus, a receiving unit that captures or receives a captured image, and an output unit that outputs the output image.
- An image processing method according to an aspect of the present invention restores the reflected light on the surface of the photographic subject based on a captured image, an illumination superposition rate indicating the degree of the influence of attenuation or diffusion of the illumination light caused by particles in the atmosphere in the captured image, and an illumination light color that is color information of the illumination light; restores the illumination light based on the restored reflected light; and generates an output image in which the captured image is restored based on the restored illumination light and the captured image.
- A storage medium according to an aspect of the present invention stores a program for causing a computer to execute: a process of restoring the reflected light on the surface of the photographic subject based on a captured image, an illumination superposition rate indicating the degree of the influence of attenuation or diffusion of the illumination light caused by particles in the atmosphere in the captured image, and an illumination light color that is color information of the illumination light; a process of restoring the illumination light based on the restored reflected light; and a process of generating an output image in which the captured image is restored based on the restored illumination light and the captured image.
- The present invention provides the effect of appropriately correcting image degradation in an image taken in an environment where the illumination light is not irradiated uniformly.
- FIG. 1 is a block diagram showing an example of the configuration of the imaging apparatus according to the first embodiment of the present invention.
- FIG. 2 is a block diagram illustrating an example of the configuration of the image processing apparatus according to the first embodiment.
- FIG. 3 is a block diagram illustrating an example of the configuration of the haze removal unit according to the first embodiment.
- FIG. 4 is a block diagram illustrating an example of the configuration of the image processing apparatus according to the second embodiment.
- FIG. 5 is a block diagram illustrating an example of the configuration of the image processing apparatus according to the third embodiment.
- FIG. 6 is a block diagram illustrating an example of a configuration of an imaging apparatus according to the fourth embodiment.
- FIG. 7 is a model diagram illustrating an example of a shooting environment irradiated with ambient light.
- FIG. 8 is a model diagram illustrating an example of a shooting environment irradiated with illumination light.
- FIG. 9 is a block diagram illustrating an example of the configuration of the information processing apparatus according to the modification.
- FIG. 1 is a block diagram showing an example of the configuration of the imaging device 4 according to the first embodiment of the present invention.
- The imaging device 4 includes an imaging unit 1, an image processing apparatus 2, and an output unit 3.
- The imaging unit 1 captures a captured image I(x, λ) of the imaging target.
- The imaging unit 1 includes, for example, an image sensor using a CCD (Charge Coupled Device) or a CMOS (Complementary Metal Oxide Semiconductor). Note that the imaging unit 1 may instead receive an image to be processed from an imaging device (not shown). Therefore, the imaging unit 1 is also called a receiving unit.
- The captured image I(x, λ) is generated from the light detected by the image sensor, and thus corresponds to the observation light I(x, λ) in the background art.
- The image processing apparatus 2 corrects image degradation of the captured image I(x, λ) (for example, a degradation component caused by haze) resulting from at least one of attenuation and diffusion of the illumination light irradiated onto the imaging target by particles (for example, haze) in the atmosphere. Specifically, the image processing apparatus 2 restores the attenuation of the reflected light from the imaging target caused by the light diffused by particles in the atmosphere. The image processing apparatus 2 then restores the attenuation of the illumination light based on the diffused light and the restored reflected light.
- The image processing apparatus 2 corrects (restores) the captured image I(x, λ) based on the restored illumination light and generates an output image O(x, λ). Therefore, the image processing apparatus 2 may be called a correction unit.
- The output image O(x, λ) is thus a corrected captured image.
- The output image O(x, λ) is also a degradation-removed image.
- The output unit 3 outputs the output image O(x, λ) generated by the image processing apparatus 2, that is, the corrected captured image I(x, λ).
- the output unit 3 is, for example, a display or a printer.
- FIG. 2 is a block diagram of the image processing apparatus 2 according to the first embodiment.
- The image processing apparatus 2 includes an illumination light color estimation unit 11, a skeleton component extraction unit 12, an illumination superposition rate estimation unit 13, and a haze removal unit 14.
- The illumination light color estimation unit 11 estimates the illumination light color A(λ), which is the color information of the illumination light acting as the ambient light in the shooting environment.
- The method of estimating the illumination light color A(λ) in the present embodiment is not particularly limited.
- One method of estimating the illumination light color A(λ) is to generate an intensity histogram for each wavelength and, using a preset parameter (written here as β), take the intensity value at the top β% of the histogram at each wavelength as the illumination light color A(λ).
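The top-β% estimate above can be sketched as a per-channel percentile; the function name and the interpretation of the parameter as a percentage are assumptions for illustration.

```python
import numpy as np

def estimate_illumination_color(image, beta=1.0):
    """Estimate the illumination light color A(lambda) per channel (a sketch).

    For each color channel (wavelength), take the intensity value below
    which the top beta percent of that channel's histogram lies.
    image: H x W x C array; beta: percentage (e.g. 1.0 for the top 1%).
    """
    flat = image.reshape(-1, image.shape[-1])
    # The (100 - beta)-th percentile marks the start of the top beta%.
    return np.percentile(flat, 100.0 - beta, axis=0)
```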
- The skeleton component extraction unit 12 removes fine fluctuations from the captured image I(x, λ) and extracts the global structure of the image (for example, the color and brightness of its flat regions), that is, an image consisting of flat regions with small pixel-value fluctuation and strong edges with large fluctuation.
- Hereinafter, this global structure is referred to as the skeleton component B(x, λ).
- The method for extracting the skeleton component B(x, λ) in this embodiment is not particularly limited. One example is a method using total variation norm minimization.
- The method using total variation norm minimization is a technique for removing the oscillating component of an image.
- This method takes an image (here, the captured image I(x, λ)) and extracts its skeleton component B(x, λ) from the solution of the minimization problem expressed by Equation (6).
- Here, μ is a preset parameter that controls the amount of oscillation to be removed.
- The method using total variation norm minimization removes not only fine oscillating components but also oscillations with relatively long periods (low frequencies) when combined with multi-resolution analysis.
- The integral of the first term in the parentheses of Equation (6) is the integral over the x-y plane of the total variation of the skeleton component B(x, λ).
- The second term is the square of the L2 norm of the difference between the captured image I(x, λ) and the skeleton component B(x, λ), multiplied by μ/2.
- In Equation (6), the notation "(x, λ)" is omitted, and the symbol under "min" denotes the skeleton component B over which the minimization is taken; that is, Equation (6) takes the minimum over all candidate skeleton components.
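Equation (6) is not reproduced in this text; total variation norm minimization is conventionally written as min over B of ∫|∇B| + (μ/2)‖I − B‖². A small gradient-descent sketch on a smoothed version of that functional follows; the step size, iteration count, smoothing epsilon, and boundary handling are all illustrative choices.

```python
import numpy as np

def skeleton_tv(image, mu=0.1, step=0.1, iters=200, eps=1e-3):
    """Extract a skeleton component B by (smoothed) total-variation minimization.

    Gradient descent on  integral |grad B| + (mu/2) * ||I - B||^2 ,
    a simple stand-in for the minimization described as Equation (6).
    image: 2-D array (one channel); returns the skeleton component B.
    """
    B = image.astype(float).copy()
    for _ in range(iters):
        # Forward differences of B (last row/column difference is zero).
        gx = np.diff(B, axis=1, append=B[:, -1:])
        gy = np.diff(B, axis=0, append=B[-1:, :])
        mag = np.sqrt(gx ** 2 + gy ** 2 + eps ** 2)  # eps smooths |grad B| at 0
        # Divergence of the normalized gradient (backward differences;
        # np.roll gives periodic boundaries, acceptable for a sketch).
        px, py = gx / mag, gy / mag
        div = (px - np.roll(px, 1, axis=1)) + (py - np.roll(py, 1, axis=0))
        # Curvature term flattens oscillations; fidelity term keeps B near I.
        B += step * (div - mu * (B - image))
    return B
```

A flat image is a fixed point of the iteration, while an isolated spike is smoothed down toward its surroundings.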
- The illumination superposition rate estimation unit 13 estimates, from the illumination light color A(λ) and the skeleton component B(x, λ), the illumination superposition rate c(x): the ratio, at each pixel, of the illumination light that reaches the camera sensor after being diffused by particles in the atmosphere to the illumination light at the time of emission. That is, the illumination superposition rate estimation unit 13 estimates the degree of the influence of the attenuation or diffusion of the illumination light caused by particles in the atmosphere.
- The illumination superposition rate c(x) is thus a value indicating the degree of the influence of the attenuation or diffusion of the illumination light caused by particles in the atmosphere.
- Here, k1 is a parameter representing a predetermined ratio.
- The ratio k1 may be varied as shown in Equation (8) using, for example, the lightness lumi(x) around the target pixel.
- Here, k1max and th1 are preset parameters.
- Two examples of methods for calculating the lightness lumi(x) are shown in Equation (9) and Equation (10).
- If the illumination superposition rate c(x) exceeds a preset maximum value th2, it may be clipped as in Equation (11) so as not to exceed that maximum value.
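Equations (7)-(11) themselves are not reproduced here. The helpers below sketch only the pieces the text describes: a lightness-dependent ratio k1 saturating at k1max (one plausible reading of Equation (8)), a luma-style lightness (one conventional choice for Equations (9)-(10)), and the clipping of Equation (11). All functional forms and default values are assumptions, not quotations from the publication.

```python
import numpy as np

def clip_superposition_rate(c, th2=0.9):
    """Clip the illumination superposition rate so it never exceeds the
    preset maximum th2, as in the clipping step of Equation (11)."""
    return np.minimum(c, th2)

def adaptive_k1(lumi, k1max=0.5, th1=0.25):
    """Illustrative stand-in for Equation (8): k1 grows with the lightness
    lumi(x) around the target pixel and saturates at k1max once lumi
    reaches th1."""
    return k1max * np.minimum(lumi / th1, 1.0)

def lightness_luma(rgb):
    """One plausible lightness measure lumi(x) (cf. Equations (9)-(10)):
    the Rec.601 luma of the pixel."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    return 0.299 * r + 0.587 * g + 0.114 * b
```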
- The haze removal unit 14 generates the output image O(x, λ), an image corrected by removing the haze-based degradation components, from the captured image I(x, λ), the illumination light color A(λ), and the illumination superposition rate c(x).
- That is, the haze removal unit 14 removes from the captured image I(x, λ) the light diffused by particles in the atmosphere out of the illumination light irradiated onto the imaging target, and restores the attenuation of the reflected light from the imaging target. Furthermore, the haze removal unit 14 generates the output image O(x, λ) by restoring the attenuation of the illumination light irradiated onto the imaging target.
- As shown in FIG. 3, the haze removal unit 14 includes a reflected light restoration unit 21 and an illumination light restoration unit 22.
- The reflected light restoration unit 21 removes the diffused illumination light from the captured image I(x, λ) and further restores the attenuation of the reflected light caused by particles in the atmosphere along the path from the imaging target to the camera sensor. Based on this processing, the reflected light restoration unit 21 restores the reflected light D1(x, λ) on the surface of the photographic subject.
- As an example of a specific method for restoring the reflected light D1(x, λ), assuming that the relationship among the reflected light D1(x, λ), the captured image I(x, λ), the illumination light color A(λ), and the illumination superposition rate c(x) is well approximated by the environment represented by Equation (1), there is a method of calculating D1(x, λ) as in Equation (12).
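Equation (12) is not reproduced here. If, as the text assumes, the captured image approximately follows the form of Equation (1) with the illumination superposition rate c(x) playing the role of 1 − t(x), the restoration would take the following shape; the decomposition I = D1·(1 − c) + A·c is an inference from the description, not a quotation.

```python
import numpy as np

def restore_reflected_light(I, A, c, eps=1e-6):
    """Restore the surface reflected light D1(x, lambda) (a sketch of
    Equation (12)), assuming the decomposition
        I(x, l) = D1(x, l) * (1 - c(x)) + A(l) * c(x).

    I: H x W x C captured image
    A: length-C illumination light color
    c: H x W illumination superposition rate in [0, 1)
    """
    c3 = c[..., None]  # broadcast c(x) over the color channels
    # Subtract the superimposed diffused illumination, then undo attenuation.
    return (I - c3 * np.asarray(A)) / np.maximum(1.0 - c3, eps)
```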
- Alternatively, the reflected light restoration unit 21 may calculate the reflected light D1(x, λ) as shown in Equation (13), using a predetermined parameter k2 to reduce the influence of differences from past imaging environments or of estimation errors in the illumination superposition rate c(x).
- The reflected light restoration unit 21 may also calculate the reflected light D1(x, λ) as shown in Equation (15), using the exponent value γ calculated with the predetermined parameter k3 shown in Equation (14).
- As a mixture of the calculation methods of Equation (13) and Equation (15), the reflected light restoration unit 21 may use the minimum value cmin of the illumination superposition rate c(x), calculated as in Equation (16).
- The reflected light restoration unit 21 may also calculate a provisional correction result D'1(x, λ) as in Equation (17) and then calculate the reflected light D1(x, λ) by correcting D'1(x, λ) as in Equation (19), using the exponent value γ' determined in Equation (18).
- The illumination light restoration unit 22 restores the diffusion or attenuation of the illumination light irradiated onto the photographic subject based on the reflected light D1(x, λ) on the subject surface generated by the reflected light restoration unit 21. The illumination light restoration unit 22 then generates the output image O(x, λ) from the captured image I(x, λ) based on the illumination light whose diffusion or attenuation has been restored.
- As methods for generating the output image O(x, λ), there is a method of calculating it using a predetermined parameter k4 as shown in Equation (20), and a method of calculating it as shown in Equation (22) using the exponent value γ2(x) calculated with a predetermined parameter k5 as shown in Equation (21).
- In the first embodiment, for an image shot in a dark environment, such as at night or in a tunnel, under a light placed close to the imaging device 4 (for example, a camera), the image degradation caused by particles in the atmosphere (for example, haze) is removed, and the influence of the attenuation of the illumination light is restored. Therefore, the first embodiment provides the effect of generating a high-quality image even when shooting is performed using illumination light such as a light.
- The illumination light color estimation unit 11 estimates the illumination light color A(λ).
- The skeleton component extraction unit 12 extracts the skeleton component B(x, λ) of the captured image.
- The illumination superposition rate estimation unit 13 estimates the illumination superposition rate c(x).
- This is because the haze removal unit 14 corrects image degradation factors such as haze and generates the output image O(x, λ) based on the captured image I(x, λ), the illumination light color A(λ), and the illumination superposition rate c(x).
- FIG. 4 is a block diagram showing an example of the configuration of the image processing apparatus 2 according to the second embodiment of the present invention.
- The image processing apparatus 2 according to the second embodiment includes an illumination light color estimation unit 11, a skeleton component extraction unit 12, an illumination superposition rate estimation unit 13, a haze removal unit 14, and an exposure correction unit 15.
- The image processing apparatus 2 according to the second embodiment differs from the image processing apparatus 2 according to the first embodiment in that it includes the exposure correction unit 15.
- The other configurations of the image processing apparatus 2 according to the second embodiment are the same as those of the image processing apparatus 2 according to the first embodiment. The same applies to the imaging unit 1 and the output unit 3 in the imaging device 4. Therefore, the description of the identical configurations is omitted, and the operation of the exposure correction unit 15, which is specific to the present embodiment, is described below.
- The exposure correction unit 15 generates an output image O2(x, λ) (referred to as a second output image or exposure-corrected image) by adjusting the overall brightness of the degradation-removed output image O(x, λ) (the first output image) produced by the haze removal unit 14.
- In general, an image is shot with the dynamic range of the amount of light received by the camera sensor set appropriately for the shooting environment.
- The correction executed by the haze removal unit 14 virtually changes the shooting environment: it converts the captured image I(x, λ) taken in a hazy environment into the output image O(x, λ) as if taken in a haze-free environment.
- Therefore, the dynamic range of the degradation-removed first output image O(x, λ) may differ from the dynamic range set by the imaging device 4 at the time of shooting.
- That is, the degradation-removed output image O(x, λ) may be too bright or too dark. The exposure correction unit 15 therefore corrects the degradation-removed first output image O(x, λ) so as to set an appropriate dynamic range, and generates the second output image O2(x, λ).
- The second output image O2(x, λ) is a corrected captured image, and in particular an image whose exposure has been corrected so as to have an appropriate dynamic range.
- For example, the exposure correction unit 15 may normalize the first output image O(x, λ) using its maximum value, as shown in Equation (23).
- Alternatively, the exposure correction unit 15 may use the average luminance value ave of the first output image O(x, λ) and a preset target average luminance value tar. That is, the exposure correction unit 15 calculates the average luminance value ave of the first output image O(x, λ) and calculates, as shown in Equation (24), the exponent value γ3 that maps the average luminance value ave onto the preset target value tar. The exposure correction unit 15 may then correct the first output image O(x, λ) using the exponent value γ3 as shown in Equation (25) to generate the second output image O2(x, λ).
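The exact forms of Equations (24)-(25) are not reproduced here. A conventional exponent-based correction consistent with the description, which maps the average luminance ave onto the target tar for values normalized to (0, 1), is:

```python
import numpy as np

def exposure_correct(O, tar=0.5):
    """Exposure correction via an exponent (a sketch of Eqs. (24)-(25)).

    Choose gamma3 so that ave ** gamma3 == tar, i.e.
        gamma3 = log(tar) / log(ave),
    then apply O2 = O ** gamma3.  Assumes 0 < ave < 1 and 0 < tar < 1.
    """
    ave = float(np.mean(O))
    gamma3 = np.log(tar) / np.log(ave)
    return np.clip(O, 0.0, 1.0) ** gamma3
```

By construction, a uniform image at brightness ave is mapped exactly onto the target brightness tar.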
- the second embodiment can produce an effect of obtaining an image with an appropriate dynamic range in addition to the effect of the first embodiment.
- The reason is that the exposure correction unit 15 generates the second output image O2(x, λ) in which the dynamic range of the first output image O(x, λ) is appropriately corrected.
- FIG. 5 is a block diagram showing an example of the configuration of the image processing apparatus 2 according to the third embodiment.
- The image processing apparatus 2 includes an illumination light color estimation unit 11, a skeleton component extraction unit 12, an illumination superposition rate estimation unit 13, a haze removal unit 14', an exposure correction unit 15', a texture component calculation unit 16, and a texture component correction unit 17.
- Compared with the image processing apparatus 2 according to the second embodiment, the image processing apparatus 2 according to the third embodiment differs in that it includes the texture component calculation unit 16 and the texture component correction unit 17. Furthermore, the image processing apparatus 2 according to the third embodiment differs in that it includes the haze removal unit 14' and the exposure correction unit 15' instead of the haze removal unit 14 and the exposure correction unit 15.
- The other configurations of the image processing apparatus 2 according to the third embodiment are the same as those of the image processing apparatus 2 according to the first or second embodiment. The same applies to the imaging unit 1 and the output unit 3 in the imaging device 4. Therefore, the description of the identical configurations is omitted, and the operations of the texture component calculation unit 16, the texture component correction unit 17, the haze removal unit 14', and the exposure correction unit 15' are described below.
- The texture component calculation unit 16 calculates, as in Equation (26), the texture component T(x, λ): the component representing fine patterns in the image (the texture or noise component), given by the difference (residual) between the captured image I(x, λ) and the skeleton component B(x, λ).
- Like the haze removal unit 14, the haze removal unit 14' generates the first output image O(x, λ) by removing the image degradation from the captured image I(x, λ). Furthermore, the haze removal unit 14' applies the same processing to correct the skeleton component B(x, λ) (the first skeleton component) and generates the degradation-removed skeleton component B1(x, λ) (the second skeleton component). That is, the second skeleton component B1(x, λ) is a skeleton component from which the degradation component has been removed. More specifically, the illumination light restoration unit 22 performs the above processing based on the restored illumination light.
- The exposure correction unit 15' generates the second output image O2(x, λ) from the first output image O(x, λ) in the same manner as the exposure correction unit 15. Furthermore, the exposure correction unit 15' applies the same processing to the degradation-removed second skeleton component B1(x, λ) and generates the exposure-corrected skeleton component B2(x, λ) (the third skeleton component).
- The texture component correction unit 17 suppresses excessive texture enhancement and noise amplification in the second output image O2(x, λ) generated by the processing of the haze removal unit 14′ and the exposure correction unit 15′, and generates a third output image O3(x, λ) with a corrected texture component. The third output image O3(x, λ) is thus also a corrected captured image.
- The texture component T2(x, λ) (second texture component) in the third output image O3(x, λ) is calculated from the second output image O2(x, λ) and the exposure-corrected skeleton component B2(x, λ) (third skeleton component), as in Equation (27).
- Next, the texture amplification factor r(x, λ) due to the correction processing is calculated as in Equation (28).
- Then, using a preset upper limit r_max of the amplification factor, a texture component T3(x, λ) (third texture component) in which excessive enhancement is suppressed is calculated as in Equation (29).
- Equation (30) removes oscillation due to noise from the third texture component T3(x, λ), using the noise standard deviation σ calculated from the camera characteristics and the texture amplification factor, and generates a texture component T4(x, λ) (fourth texture component) in which noise is suppressed.
- Here, sgn(·) denotes the sign function.
- Finally, the texture component correction unit 17 combines the third skeleton component B2(x, λ) and the fourth texture component T4(x, λ) as in Equation (31) to generate the third output image O3(x, λ).
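Since Equations (27) to (31) themselves are not reproduced in this text, the sketch below illustrates only one plausible reading of the chain: compute the corrected texture, cap its amplification at r_max, soft-threshold oscillations below the amplified noise level σ, and recombine with the skeleton. The clamp and threshold forms are assumptions, not the patent's exact formulas.

```python
import numpy as np

def correct_texture(O2, B2, T, r_max=3.0, sigma=0.01):
    """Sketch of the texture-correction chain (Eqs. 27-31, forms assumed)."""
    eps = 1e-8
    T2 = O2 - B2                              # Eq. (27): texture after haze/exposure correction
    r = np.abs(T2) / (np.abs(T) + eps)        # Eq. (28): texture amplification factor
    T3 = np.where(r > r_max, r_max * T, T2)   # Eq. (29): suppress excessive enhancement
    # Eq. (30): sgn-based soft threshold removes oscillation below the
    # amplified noise level r * sigma (sigma from camera characteristics).
    T4 = np.sign(T3) * np.maximum(np.abs(T3) - r * sigma, 0.0)
    return B2 + T4                            # Eq. (31): third output image O3
```

For a pixel whose texture was amplified five-fold by the earlier stages (T = 0.1 becoming T2 = 0.5), the cap at r_max = 3 and the noise threshold bring the restored detail back to 0.25 above the skeleton.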
- As described above, the third embodiment achieves the effect of obtaining an image in which excessive texture enhancement and noise amplification are suppressed.
- This is because the texture component calculation unit 16 calculates the first texture component T(x, λ).
- In addition to the first output image O(x, λ), the haze removal unit 14′ generates a second skeleton component B1(x, λ) in which image degradation is corrected.
- In addition to the second output image O2(x, λ), the exposure correction unit 15′ generates a third skeleton component B2(x, λ) in which the exposure is corrected based on the second skeleton component.
- The texture component correction unit 17 calculates a second texture component T2(x, λ) based on the second output image O2(x, λ) and the third skeleton component B2(x, λ). Furthermore, to suppress excessive enhancement, the texture component correction unit 17 calculates a third texture component T3(x, λ) using the first texture component T(x, λ) and the second texture component T2(x, λ). The texture component correction unit 17 then generates, from the third texture component T3(x, λ), a fourth texture component T4(x, λ) in which oscillation due to noise is suppressed.
- Finally, the texture component correction unit 17 generates the third output image O3(x, λ), in which excessive texture enhancement and noise amplification are suppressed, based on the third skeleton component B2(x, λ) and the fourth texture component T4(x, λ).
- FIG. 6 is a block diagram illustrating an example of the configuration of the imaging device 4 according to the fourth embodiment.
- The imaging device 4 according to the fourth embodiment differs from the imaging devices 4 according to the first to third embodiments in that it includes an illumination device 30 and a setting unit 31. Since the other components of the imaging device 4 according to the fourth embodiment are the same as in the first to third embodiments, their description is omitted; the illumination device 30 and the setting unit 31 are described below.
- The illumination device 30 is provided at a position close to the imaging unit 1 and irradiates the imaging target with illumination light when the imaging unit 1 starts imaging.
- the illumination device 30 is, for example, a flash.
- The setting unit 31 switches between execution and stop of the correction processing for image degradation (for example, haze) in the image processing apparatus 2. Even when shooting in a hazy environment, the user may wish to intentionally retain the haze in the captured image. In such a case, the user of the imaging device 4 can use the setting unit 31 to stop the image degradation correction processing in the image processing apparatus 2.
- In the imaging device 4 according to the fourth embodiment, the illumination device 30 is provided at a position close to the imaging unit 1. Therefore, a captured image based on the illumination light of the illumination device 30 is easily affected by particles in the atmosphere.
- Nevertheless, the image processing apparatus 2 according to the fourth embodiment achieves the effect of appropriately correcting the influence of haze in the captured image.
- This is because the image processing device 2 included in the imaging device 4 can generate images in which the influence of particles in the atmosphere is corrected (the first output image O(x, λ) to the third output image O3(x, λ)), based on the operations described in the first to third embodiments.
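For context, dehazing methods of this family typically invert the standard atmospheric scattering model, with the illumination superimposition rate c(x) playing the role of the airlight weight. The sketch below uses that common model; the patent's own equations may differ in detail.

```python
import numpy as np

def remove_haze(I, c, A):
    """Invert the common haze model I = (1 - c) * R + c * A, where
    c is the illumination superimposition rate and A the illumination
    light (airlight) color. Illustrative only."""
    t = np.clip(1.0 - c, 0.05, 1.0)   # transmittance, floored for stability
    return (I - c * A) / t            # restored surface reflected light R

I = np.array([0.6, 0.6, 0.7])   # hazy RGB pixel
A = np.array([0.9, 0.9, 1.0])   # estimated illumination light color A(lambda)
R = remove_haze(I, 0.5, A)      # pixel receiving half of its light as airlight
```

When c(x) approaches 1 the scene is fully veiled and the division becomes ill-conditioned, which is why the transmittance is floored here.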
- Furthermore, the imaging device 4 according to the fourth embodiment can also produce the effect of generating an image that retains the influence of haze and the like.
- This is because the imaging device 4 according to the fourth embodiment includes the setting unit 31, which stops the image degradation correction processing in the image processing device 2. The user can therefore use the setting unit 31 to stop the correction processing and intentionally retain image degradation such as haze in the captured image.
- the first to fourth embodiments described above are applicable not only to still images but also to moving images.
- the image processing device 2 in the first to fourth embodiments can be incorporated as an image processing engine in various photographing devices or devices that process images.
- the image processing apparatus 2 or the imaging apparatus 4 according to the first to fourth embodiments described above is configured as follows.
- each component of the image processing device 2 or the imaging device 4 may be configured with a hardware circuit.
- the image processing device 2 or the imaging device 4 may be configured using a plurality of devices in which each component is connected via a network.
- For example, the image processing apparatus 2 illustrated in FIG. 2 may be configured such that a device including the haze removal unit 14 illustrated in FIG. 3 is connected, via a network, to a device including the illumination light color estimation unit 11, a device including the skeleton component extraction unit 12, and a device including the illumination superimposition rate estimation unit 13.
- In this case, the image processing apparatus 2 receives the captured image I(x, λ), the illumination superimposition rate c(x), and the illumination light color A(λ) via the network, and generates the first output image O(x, λ) based on the operations described above.
- In this sense, the haze removal unit 14 illustrated in FIG. 3 constitutes the minimum configuration of the image processing apparatus 2.
- the image processing device 2 or the imaging device 4 may be configured by a single piece of hardware.
- the image processing device 2 or the imaging device 4 may be realized as a computer device including a CPU (Central Processing Unit), a ROM (Read Only Memory), and a RAM (Random Access Memory).
- Alternatively, the image processing device 2 or the imaging device 4 may be realized as a computer device that further includes an input/output connection circuit (IOC: Input/Output Circuit) and a network interface circuit (NIC: Network Interface Circuit).
- FIG. 9 is a block diagram showing an example of the configuration of the information processing apparatus 600 as the image processing apparatus 2 or the imaging apparatus 4 according to this modification.
- the information processing apparatus 600 includes a CPU 610, a ROM 620, a RAM 630, an internal storage device 640, an IOC 650, and a NIC 680, and constitutes a computer device.
- CPU 610 reads a program from ROM 620.
- the CPU 610 controls the RAM 630, the internal storage device 640, the IOC 650, and the NIC 680 based on the read program.
- the computer including the CPU 610 controls these components and implements the functions as the components shown in FIGS.
- the CPU 610 may use the RAM 630 or the internal storage device 640 as a temporary storage of a program when realizing each function.
- The CPU 610 may use a storage medium reading device (not shown) to read a program from the storage medium 700, which stores the program in a computer-readable manner. Alternatively, the CPU 610 may receive a program from an external device (not shown) via the NIC 680, store the program in the RAM 630, and operate based on the stored program.
- ROM 620 stores programs executed by CPU 610 and fixed data.
- the ROM 620 is, for example, a P-ROM (Programmable-ROM) or a flash ROM.
- the RAM 630 temporarily stores programs executed by the CPU 610 and data.
- the RAM 630 is, for example, a D-RAM (Dynamic-RAM).
- the internal storage device 640 stores data and programs stored in the information processing device 600 for a long period of time. Further, the internal storage device 640 may operate as a temporary storage device for the CPU 610.
- the internal storage device 640 is, for example, a hard disk device, a magneto-optical disk device, an SSD (Solid State Drive), or a disk array device.
- the ROM 620 and the internal storage device 640 are non-transitory storage media.
- the RAM 630 is a volatile storage medium.
- the CPU 610 can operate based on a program stored in the ROM 620, the internal storage device 640, or the RAM 630. That is, the CPU 610 can operate using a nonvolatile storage medium or a volatile storage medium.
- The IOC 650 mediates data exchange among the CPU 610, the input device 660, and the display device 670.
- the IOC 650 is, for example, an IO interface card or a USB (Universal Serial Bus) card.
- the input device 660 is a device that receives an input instruction from an operator of the information processing apparatus 600.
- the input device 660 is, for example, a keyboard, a mouse, or a touch panel.
- the display device 670 is a device that displays information to the operator of the information processing apparatus 600.
- the display device 670 is a liquid crystal display, for example.
- the NIC 680 relays data exchange with an external device (not shown) via the network.
- the NIC 680 is, for example, a LAN (Local Area Network) card.
- the information processing apparatus 600 configured as described above can obtain the same effects as those of the image processing apparatus 2 or the imaging apparatus 4.
- the CPU 610 of the information processing apparatus 600 can realize the same function as the image processing apparatus 2 or the imaging apparatus 4 based on the program.
- An image processing apparatus comprising: reflected light restoration means for restoring reflected light on the surface of a subject, based on a captured image of the subject, an illumination superimposition rate indicating the degree of influence of attenuation or diffusion of the illumination light due to particles in the atmosphere, and an illumination light color that is color information of the illumination light; and an illumination light restoration unit that restores the illumination light based on the restored reflected light and generates an output image in which the captured image is restored based on the restored illumination light and the captured image.
- The image processing apparatus further comprising an illumination superimposition rate estimation unit that estimates the illumination superimposition rate based on the estimated illumination light color and the skeleton component.
- Appendix 3 The image processing apparatus according to appendix 2, including an exposure correction unit that adjusts and corrects the brightness of the output image.
- Texture component calculation means for calculating a texture component that is the difference between the captured image and the skeleton component is included,
- the illumination light restoration means restores the skeleton component based on the restored illumination light,
- the exposure correction means corrects the restored skeleton component, and
- the image processing apparatus further comprises texture component correction means for correcting the texture component in the corrected output image based on the texture component and the corrected skeleton component.
- Appendix 5 An imaging device comprising: the image processing apparatus according to any one of appendices 1 to 4; receiving means for capturing or receiving a captured image; and output means for outputting the output image.
- Appendix 6 Illuminating means for irradiating the illumination light;
- An image processing method comprising restoring reflected light on the surface of a subject, based on a captured image of the subject, an illumination superimposition rate indicating the degree of influence of attenuation or diffusion of the illumination light due to particles in the atmosphere, and an illumination light color that is color information of the illumination light;
- A storage medium storing a program that causes a computer to execute: a process of restoring reflected light on the surface of a subject, based on a captured image of the subject, an illumination superimposition rate indicating the degree of influence of attenuation or diffusion of the illumination light due to particles in the atmosphere, and an illumination light color that is color information of the illumination light; and a process of restoring the illumination light based on the restored reflected light and generating an output image in which the captured image is restored based on the restored illumination light and the captured image.
Abstract
Description
First, the imaging device 4 according to the first embodiment of the present invention will be described.
The second embodiment will be described.
The third embodiment will be described.
The fourth embodiment will be described.
The image processing apparatus 2 or the imaging device 4 according to the first to fourth embodiments described above is configured as follows.
illumination light restoration means for restoring the illumination light based on the restored reflected light and generating an output image in which the captured image is restored based on the restored illumination light and the captured image;
an image processing apparatus comprising the above.
skeleton component extraction means for extracting a skeleton component representing the global structure of the captured image; and
illumination superimposition rate estimation means for estimating the illumination superimposition rate based on the estimated illumination light color and the skeleton component;
the image processing apparatus according to supplementary note 1, comprising the above.
the illumination light restoration means restores the skeleton component based on the restored illumination light,
the exposure correction means corrects the restored skeleton component, and
texture component correction means corrects the texture component in the corrected output image based on the texture component and the corrected skeleton component;
the image processing apparatus according to supplementary note 3, comprising the above.
receiving means for capturing or receiving a captured image; and
output means for outputting the output image;
an imaging device comprising the above.
setting means for switching between execution and stop of correction processing for the captured image in the image processing apparatus;
the imaging device according to supplementary note 5, comprising the above.
restoring the illumination light based on the restored reflected light and generating an output image in which the captured image is restored based on the restored illumination light and the captured image;
an image processing method.
a process of restoring the illumination light based on the restored reflected light and generating an output image in which the captured image is restored based on the restored illumination light and the captured image;
a storage medium storing a program that causes a computer to execute the above.
2 Image processing apparatus
3 Output unit
4 Imaging device
11 Illumination light color estimation unit
12 Skeleton component extraction unit
13 Illumination superimposition rate estimation unit
14 Haze removal unit
14' Haze removal unit
15 Exposure correction unit
15' Exposure correction unit
16 Texture component calculation unit
17 Texture component correction unit
21 Reflected light restoration unit
22 Illumination light restoration unit
30 Illumination device
31 Setting unit
600 Information processing apparatus
610 CPU
620 ROM
630 RAM
640 Internal storage device
650 IOC
660 Input device
670 Display device
680 NIC
700 Storage medium
Claims (8)
- Reflected light restoration means for restoring reflected light on the surface of a subject, based on a captured image of the subject, an illumination superimposition rate indicating the degree of influence of attenuation or diffusion of illumination light in the captured image due to particles in the atmosphere, and an illumination light color that is color information of the illumination light; and
illumination light restoration means for restoring the illumination light based on the restored reflected light and generating a first output image in which the captured image is restored based on the restored illumination light and the captured image;
an image processing apparatus comprising the above. - Illumination light color estimation means for estimating the illumination light color;
skeleton component extraction means for extracting a first skeleton component representing the global structure of the captured image; and
illumination superimposition rate estimation means for estimating the illumination superimposition rate based on the estimated illumination light color and the first skeleton component;
the image processing apparatus according to claim 1, comprising the above. - The image processing apparatus according to claim 2, comprising exposure correction means for generating a second output image based on a correction that adjusts the brightness of the first output image.
- Texture component calculation means for calculating a first texture component that is the difference between the captured image and the first skeleton component, wherein
the illumination light restoration means generates a second skeleton component by correcting the first skeleton component based on the restored illumination light,
the exposure correction means corrects the exposure of the second skeleton component to generate a third skeleton component, and
texture component correction means calculates a second texture component based on the second output image and the third skeleton component, calculates a third texture component in which excessive enhancement is suppressed based on the first texture component and the second texture component, calculates a fourth texture component in which oscillation of the third texture component is suppressed, and corrects the second output image based on the fourth texture component and the third skeleton component to generate a third output image;
the image processing apparatus according to claim 3, comprising the above. - The image processing apparatus according to any one of claims 1 to 4;
receiving means for capturing or receiving the captured image; and
output means for outputting the first to third output images;
an imaging device comprising the above. - Illumination means for irradiating the illumination light; and
setting means for switching between execution and stop of correction processing for the captured image in the image processing apparatus;
the imaging device according to claim 5, comprising the above. - Restoring reflected light on the surface of a subject, based on a captured image of the subject, an illumination superimposition rate indicating the degree of influence of attenuation or diffusion of illumination light in the captured image due to particles in the atmosphere, and an illumination light color that is color information of the illumination light; and
restoring the illumination light based on the restored reflected light and generating an output image in which the captured image is restored based on the restored illumination light and the captured image;
an image processing method comprising the above. - A process of restoring reflected light on the surface of a subject, based on a captured image of the subject, an illumination superimposition rate indicating the degree of influence of attenuation or diffusion of illumination light in the captured image due to particles in the atmosphere, and an illumination light color that is color information of the illumination light; and
a process of restoring the illumination light based on the restored reflected light and generating an output image in which the captured image is restored based on the restored illumination light and the captured image;
a storage medium storing a program that causes a computer to execute the above.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/119,886 US20170053384A1 (en) | 2014-03-06 | 2015-02-26 | Image-processing device, image-capturing device, image-processing method, and storage medium |
JP2016506122A JP6436158B2 (ja) | 2015-02-26 | Image processing apparatus, imaging apparatus, image processing method, and program
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2014-044438 | 2014-03-06 | ||
JP2014044438 | 2014-03-06 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2015133098A1 true WO2015133098A1 (ja) | 2015-09-11 |
Family
ID=54054915
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2015/001000 WO2015133098A1 (ja) | 2015-02-26 | Image processing apparatus, imaging apparatus, image processing method, and storage medium storing program |
Country Status (4)
Country | Link |
---|---|
US (1) | US20170053384A1 (ja) |
JP (1) | JP6436158B2 (ja) |
AR (1) | AR099579A1 (ja) |
WO (1) | WO2015133098A1 (ja) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2019500793A (ja) * | 2015-12-16 | 2019-01-10 | B Com | Method for processing a digital image, associated device, terminal device, and computer program |
JP2019200525A (ja) * | 2018-05-15 | 2019-11-21 | Hitachi-GE Nuclear Energy, Ltd. | Image processing system and image processing method for visual inspection |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP6551048B2 (ja) * | 2015-08-24 | 2019-07-31 | JVCKenwood Corporation | Underwater imaging device, method for controlling underwater imaging device, and control program for underwater imaging device |
CN110232666B (zh) * | 2019-06-17 | 2020-04-28 | China University of Mining and Technology (Beijing) | Fast dehazing method for underground pipeline images based on dark channel prior |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2010287183A (ja) * | 2009-06-15 | 2010-12-24 | Denso Corp | Fog image restoration device and driving support system |
JP2013142984A (ja) * | 2012-01-10 | 2013-07-22 | Toshiba Corp | Image processing apparatus, image processing method, and image processing program |
-
2015
- 2015-02-26 AR ARP150100573A patent/AR099579A1/es active IP Right Grant
- 2015-02-26 US US15/119,886 patent/US20170053384A1/en not_active Abandoned
- 2015-02-26 JP JP2016506122A patent/JP6436158B2/ja active Active
- 2015-02-26 WO PCT/JP2015/001000 patent/WO2015133098A1/ja active Application Filing
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2010287183A (ja) * | 2009-06-15 | 2010-12-24 | Denso Corp | Fog image restoration device and driving support system |
JP2013142984A (ja) * | 2012-01-10 | 2013-07-22 | Toshiba Corp | Image processing apparatus, image processing method, and image processing program |
Non-Patent Citations (1)
Title |
---|
MASATO TODA: "Color image dehazing based on visual features decomposition", FIT2013 DAI 12 KAI FORUM ON INFORMATION TECHNOLOGY KOEN RONBUNSHU, vol. 3, 20 August 2013 (2013-08-20), pages 99 - 100 * |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2019500793A (ja) * | 2015-12-16 | 2019-01-10 | B Com | Method for processing a digital image, associated device, terminal device, and computer program |
JP2019200525A (ja) * | 2018-05-15 | 2019-11-21 | Hitachi-GE Nuclear Energy, Ltd. | Image processing system and image processing method for visual inspection |
JP7013321B2 (ja) | 2018-05-15 | 2022-01-31 | Hitachi-GE Nuclear Energy, Ltd. | Image processing system and image processing method for visual inspection |
Also Published As
Publication number | Publication date |
---|---|
JPWO2015133098A1 (ja) | 2017-04-06 |
US20170053384A1 (en) | 2017-02-23 |
AR099579A1 (es) | 2016-08-03 |
JP6436158B2 (ja) | 2018-12-12 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
Hasinoff et al. | Burst photography for high dynamic range and low-light imaging on mobile cameras | |
US9275445B2 (en) | High dynamic range and tone mapping imaging techniques | |
KR101901602B1 (ko) | Apparatus and method for removing noise from digital photographs | |
US20150350509A1 (en) | Scene Motion Correction In Fused Image Systems | |
US8340417B2 (en) | Image processing method and apparatus for correcting skin color | |
JP6485078B2 (ja) | Image processing method and image processing apparatus | |
Tico et al. | Motion-blur-free exposure fusion | |
Fang et al. | Single image dehazing and denoising: a fast variational approach | |
JP6390847B2 (ja) | Image processing apparatus, image processing method, and program | |
Galdran et al. | A variational framework for single image dehazing | |
WO2015184408A1 (en) | Scene motion correction in fused image systems | |
JP6677172B2 (ja) | Image processing apparatus, image processing method, and program | |
JP6436158B2 (ja) | Image processing apparatus, imaging apparatus, image processing method, and program | |
KR102106537B1 (ko) | High dynamic range image generation method, and apparatus and system therefor | |
US9355439B1 (en) | Joint contrast enhancement and turbulence mitigation method | |
WO2016189901A1 (ja) | Image processing device, image processing method, program, recording medium recording the same, video capture device, and video recording/playback device | |
TW201830330A | Image processing method and image processing system | |
JP2014232938A (ja) | Image processing apparatus, image processing method, and program | |
KR20140008623A (ko) | Image processing method and apparatus | |
US9338354B2 (en) | Motion blur estimation and restoration using light trails | |
KR102395305B1 (ko) | Low-light image enhancement method | |
KR101468433B1 (ko) | Apparatus and method for dynamic range extension using a combined color channel transform map | |
Shirai et al. | Noiseless no-flash photo creation by color transform of flash image | |
KR20100027888A (ko) | Image processing method using the Retinex technique | |
KR101004623B1 (ko) | Apparatus and method for improving image quality using a flash device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 15759198 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 15119886 Country of ref document: US |
|
ENP | Entry into the national phase |
Ref document number: 2016506122 Country of ref document: JP Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 15759198 Country of ref document: EP Kind code of ref document: A1 |