WO2015190184A1 - Image processing system and computer-readable recording medium - Google Patents
Image processing system and computer-readable recording medium
- Publication number
- WO2015190184A1 (PCT/JP2015/062729)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- frame
- haze
- unit
- value
- density
- Prior art date
Classifications
- G06T5/00—Image enhancement or restoration
- G06T5/20—Image enhancement or restoration using local operators
- G06T5/73—Deblurring; Sharpening
- G06T5/77—Retouching; Inpainting; Scratch removal
- G06T7/11—Region-based segmentation
- G06T7/136—Segmentation; Edge detection involving thresholding
- G06T7/20—Analysis of motion
- G06T7/50—Depth or shape recovery
- G06T7/529—Depth or shape recovery from texture
- H04N1/407—Control or modification of tonal gradation or of extreme levels, e.g. background level
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- G06T2207/10004—Still image; Photographic image
- G06T2207/10024—Color image
- G06T2207/20012—Locally adaptive
- G06T2207/20021—Dividing image into blocks, subimages or windows
- G06T2207/20192—Edge enhancement; Edge preservation
Definitions
- the present invention relates to an image processing system and a computer-readable recording medium.
- This application is a continuation-in-part of international application PCT/JP2014/003131 (filed June 12, 2014) and international application PCT/JP2015/056086 (filed March 2, 2015).
- A technique for removing fog in an image based on an atmospheric model is known (see, for example, Patent Document 1).
- [Patent Literature]
- Patent Document 1: JP 2012-168936 A
- an image processing system may include a haze density estimation unit that derives an estimated value of the haze density of one of a plurality of frames included in the moving image.
- The image processing system may include a parameter adjustment unit that adjusts a parameter used for the haze removal processing of the one frame based on the estimated value of the haze density of the one frame and the relationship between the one frame and a past frame of the one frame.
- the image processing system may include a haze removal unit that performs a haze removal process on the one frame based on the parameter adjusted by the parameter adjustment unit.
- The parameter adjustment unit may set the estimated value of the haze density of the one frame as the haze density target value of the one frame, and may adjust the parameter used for the haze removal processing of the one frame so that it gradually approaches the haze density target value from the parameter adjusted for the past frame.
- The image processing system may further include a scene change determination unit that determines, from the relationship between the one frame and the past frame, whether the one frame is a scene-changed frame. When the one frame is determined to be a scene-changed frame, the parameter adjustment unit may increase the step width by which the parameter approaches the haze density target value.
- The image processing system may further include a haze reliability estimation unit that estimates the reliability of the estimated value of the haze density, a target value acquisition unit that acquires the haze density target value of the past frame, a target value determination unit that determines which of the haze density target value of the past frame acquired by the target value acquisition unit and the estimated value of the haze density of the one frame derived by the haze density estimation unit is used as the haze density target value of the one frame, and a difference absolute value deriving unit that derives the absolute value of the difference between the haze density target value of the one frame determined by the target value determination unit and the parameter used for the haze removal processing of the past frame of the one frame. The parameter adjustment unit may adjust the parameter used for the haze removal processing of the one frame so that it gradually approaches the haze density target value determined by the target value determination unit, starting from the parameter used for the haze removal processing of the past frame.
- When the one frame is determined to be a scene-changed frame, the parameter adjustment unit may adjust the parameter by a first adjustment amount. When the one frame is determined not to be a scene-changed frame, the reliability of the estimated value is greater than a first threshold, and the difference absolute value is greater than a second threshold, the parameter adjustment unit may adjust the parameter by a second adjustment amount smaller than the first adjustment amount. When the one frame is determined not to be a scene-changed frame and the reliability of the estimated value is smaller than the first threshold or the difference absolute value is smaller than the second threshold, the parameter adjustment unit may adjust the parameter by a third adjustment amount smaller than the second adjustment amount.
- The image processing system may further include a haze reliability estimation unit that estimates the reliability of the estimated value of the haze density. When the haze reliability estimation unit determines that the estimated value of the haze density of the one frame is not reliable, the parameter adjustment unit may adjust the parameter used for the haze removal processing of the one frame so that it gradually approaches the target value set at the time of the past frame.
- The image processing system may further include a haze reliability estimation unit that estimates the reliability of the estimated value of the haze density, a target value acquisition unit that acquires the haze density target value of the past frame, and a target value determination unit that determines, based on the reliability estimated by the haze reliability estimation unit, which of the haze density target value of the past frame acquired by the target value acquisition unit and the estimated value of the haze density of the one frame derived by the haze density estimation unit is used to adjust the parameter used for the haze removal processing of the one frame.
- The image processing system may further include a difference absolute value deriving unit that derives the absolute value of the difference between the haze density target value of the past frame acquired by the target value acquisition unit and the estimated value of the haze density of the one frame derived by the haze density estimation unit.
- In a case where the one frame is a scene-changed frame, and in a case where the one frame is not a scene-changed frame, the reliability is greater than a first threshold, and the difference absolute value is greater than a second threshold, the target value determination unit may determine that the estimated value of the haze density of the one frame is used to adjust the parameter used for the haze removal processing of the one frame.
- When the one frame is not a scene-changed frame and the reliability of the estimated value is smaller than the first threshold or the difference absolute value is smaller than the second threshold, the target value determination unit may determine that the haze density target value of the past frame is used to adjust the parameter used for the haze removal processing of the one frame.
- an image processing system may include a luminance evaluation value deriving unit that derives luminance evaluation values of at least a partial region of the image.
- the image processing system may include a saturation evaluation value deriving unit that derives a saturation evaluation value of at least a partial region of the image.
- the image processing system may include a contrast evaluation value deriving unit that derives a contrast evaluation value of at least a partial region of the image.
- the image processing system may include a haze density estimation unit that derives an estimated value of the haze density of the image based on the luminance evaluation value, the saturation evaluation value, and the contrast evaluation value.
- The image processing system may include a first pixel extraction unit that extracts, from the image, pixels that are neither flat nor strong edges, and the haze density estimation unit may derive the estimated value of the haze density based on the luminance evaluation value, the saturation evaluation value, and the contrast evaluation value of the pixels that are neither flat nor strong edges extracted by the first pixel extraction unit.
- An image processing system may include a first pixel extraction unit that extracts, from the image, pixels that are neither flat nor strong edges, and a haze density estimation unit that derives an estimated value of the haze density of the image based on at least two of the luminance evaluation value, the saturation evaluation value, and the contrast evaluation value of the pixels that are neither flat nor strong edges extracted by the first pixel extraction unit.
- the luminance evaluation value may be an average luminance value of the region.
- the saturation evaluation value may be an average saturation value of the region.
- the contrast evaluation value may be a contrast value of the region.
- the haze density estimation unit may derive an estimated value of the haze density that is higher as the average luminance value is higher.
- the haze density estimation unit may derive an estimated value of the haze density that is higher as the average saturation value is lower.
- the haze density estimation unit may derive an estimated value of the haze density that is higher as the contrast value is lower.
- The image may be a moving image including a plurality of frames, and the image processing system may include a high saturation pixel extraction unit that extracts, from one of the plurality of frames, high saturation pixels whose saturation is higher than a predetermined threshold, a high saturation pixel rate deriving unit that derives a high saturation pixel rate indicating the ratio of the high saturation pixels in the one frame, and a scene change determination unit that determines whether a scene change is included in the image based on whether the high saturation pixel rate is higher than a predetermined threshold.
- The image processing system may include a reliability deriving unit that derives the reliability of the estimated value of the haze density of the one frame based on the high saturation pixel rate in the one frame and the average luminance value of the one frame.
- The image processing system may include a parameter adjustment unit that adjusts the parameter used for the haze removal processing of the one frame based on the reliability of the estimated value of the haze density of the one frame derived by the reliability deriving unit and a scene change flag indicating whether the one frame is a scene-changed frame.
- The image processing system may include a transmittance deriving unit that derives a transmittance corresponding to the haze density for each of a plurality of pixels of the image, and a haze removal unit that performs the haze removal process on the image based on the estimated value of the haze density and the transmittance.
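As a rough illustration of haze removal driven by an estimated haze density, the sketch below inverts the standard atmospheric scattering model with a crude per-pixel transmittance; the function name, the fixed airlight value, and the way Strength scales the transmittance are illustrative assumptions, not definitions from this application.

```python
import numpy as np

def remove_haze(img, strength, airlight=0.95, t_min=0.1):
    """Hypothetical haze removal using the atmospheric model I = J*t + A*(1 - t).

    img      : float32 array in [0, 1], shape (H, W, 3)
    strength : estimated haze density in [0, 1] (e.g. the Strength value)
    airlight : assumed global atmospheric light A
    t_min    : lower bound on transmittance to avoid over-amplification
    """
    # Assume a crude per-pixel transmittance that decreases with haze density
    # and with pixel brightness (brighter pixels are treated as hazier here).
    luminance = img.mean(axis=2, keepdims=True)
    t = np.clip(1.0 - strength * luminance, t_min, 1.0)  # stand-in transmittance map

    # Invert the atmospheric model: J = (I - A) / t + A
    dehazed = (img - airlight) / t + airlight
    return np.clip(dehazed, 0.0, 1.0)

# Usage: dehazed = remove_haze(frame.astype(np.float32) / 255.0, strength=0.6)
```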
- The image processing system may further include a second pixel extraction unit that extracts, from the image, pixels that are neither flat nor strong edges, and the haze removal unit may determine whether to perform the haze removal process based on the ratio, in the image, of the pixels that are neither flat nor strong edges extracted by the second pixel extraction unit.
- an image processing system may include a high saturation pixel extraction unit that extracts a high saturation pixel whose saturation is higher than a predetermined threshold from one frame among a plurality of frames included in the moving image.
- the image processing system may include a high saturation pixel rate deriving unit that derives a high saturation pixel rate indicating a ratio of the high saturation pixels in the one frame.
- The image processing system may include a scene change determination unit that determines whether a scene change is included in the image based on different criteria depending on whether the high saturation pixel rate in the one frame is higher than a predetermined threshold.
- When the high saturation pixel rate is higher than the threshold, the scene change determination unit may determine whether a scene change is included in the image based on the hue of the one frame and the hue of a past frame of the one frame. When the high saturation pixel rate is equal to or lower than the threshold, the scene change determination unit may determine whether a scene change is included in the image based on the high saturation pixel rate of the one frame and the high saturation pixel rate of the past frame of the one frame.
- The image processing system may include a haze density estimated value acquisition unit that acquires the estimated value of the haze density of the one frame, and a reliability deriving unit that derives the reliability of the estimated value of the haze density based on the high saturation pixel rate in the one frame and the average luminance value of the one frame. When the estimated value is higher than a predetermined threshold, the reliability deriving unit may derive a higher reliability as the average luminance value is larger, and a higher reliability as the high saturation pixel rate is lower. When the estimated value is equal to or less than the predetermined threshold, the reliability deriving unit may derive a higher reliability as the average luminance value is smaller, and a higher reliability as the high saturation pixel rate is higher.
- The image processing system may include a target value acquisition unit that acquires a haze density target value, and a target value determination unit that determines, based on the reliability derived by the reliability deriving unit, which of the haze density target value of the past frame of the one frame acquired by the target value acquisition unit and the estimated value of the haze density acquired by the haze density estimated value acquisition unit is used to adjust the parameter used for the haze removal process.
- The image processing system may include a difference absolute value deriving unit that derives the absolute value of the difference between the haze density target value acquired by the target value acquisition unit and the estimated value of the haze density acquired by the haze density estimated value acquisition unit. The target value determination unit may determine, based on the reliability of the estimated value and the difference absolute value, or based on a scene change flag indicating whether the one frame is a scene-changed frame, which of the haze density target value acquired by the target value acquisition unit and the estimated value of the haze density acquired by the haze density estimated value acquisition unit is used to adjust the parameter used for the haze removal process.
- The image processing system may include a parameter adjustment unit that adjusts the parameter used for the haze removal processing of the one frame so that it gradually approaches a value corresponding to the haze density target value or the estimated value of the haze density determined by the target value determination unit. The parameter adjustment unit may derive the absolute value of the difference between the haze density target value or the estimated value of the haze density determined by the target value determination unit and the parameter used for the haze removal processing of the past frame of the one frame, and may adjust the parameter stepwise toward the determined value in accordance with that difference absolute value.
- an image processing system may include a haze density acquisition unit that acquires the haze density of an image.
- the image processing system may include a removal processing unit that performs a haze removal process with different degrees of haze removal on the reflectance component of the image and the illumination light component of the image based on the haze density.
- The image processing system may include a combining unit that combines the reflectance component and the illumination light component that have been subjected to the haze removal process. On the assumption that the atmospheric light can be approximated as containing no reflectance component, the removal processing unit may use the atmospheric model of the haze image and Retinex theory to perform haze removal processes with different degrees of haze removal on the reflectance component of the image and the illumination light component of the image.
- The removal processing unit may further assume that the atmospheric model of the haze image can be applied only to the illumination light component, and may use the atmospheric model of the haze image and Retinex theory to perform haze removal processes with different degrees of haze removal on the reflectance component of the image and the illumination light component of the image.
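The split just described can be pictured with a minimal Retinex-style sketch: the illumination light component is approximated by a smoothed version of the image, the reflectance component is the remainder, and a stronger correction is applied to the illumination component than to the reflectance component. The Gaussian smoothing, the strength ratio, and the multiplicative recombination are illustrative assumptions rather than the method defined above.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def split_and_dehaze(img, haze_density, airlight=0.95, t_min=0.1):
    """Illustrative Retinex-style split with different haze-removal degrees.

    img is a float32 array in [0, 1] of shape (H, W, 3); haze_density is a scalar in [0, 1].
    """
    eps = 1e-6
    # Smooth spatially only (not across channels) to estimate the illumination light.
    illumination = gaussian_filter(img, sigma=(15, 15, 0)) + eps
    reflectance = img / illumination                      # Retinex: I = R * L

    # Apply the atmospheric model mainly to the illumination component,
    # and only a weak correction to the reflectance component.
    t_illum = np.clip(1.0 - haze_density, t_min, 1.0)
    illum_dehazed = (illumination - airlight) / t_illum + airlight

    t_refl = np.clip(1.0 - 0.3 * haze_density, t_min, 1.0)  # weaker degree for reflectance
    refl_dehazed = (reflectance - 1.0) / t_refl + 1.0

    # Recombine the two processed components.
    return np.clip(refl_dehazed * np.clip(illum_dehazed, 0.0, 1.0), 0.0, 1.0)
```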
- a computer-readable recording medium that records a program for causing a computer to function as the image processing system.
- An example of the functional configuration of the image processing system 100 is shown schematically.
- An example of the functional configuration of the haze density estimation unit 200 is shown schematically.
- A diagram for explaining the extraction process of flat & strong edge pixels.
- An example of the weighting graph 240 is shown schematically.
- An example of the weighting graph 242 is shown schematically.
- A diagram for explaining the histogram values.
- An example of the functional configuration of the scene control unit 300 is shown schematically.
- An example of the functional configuration of the scene change determination unit 310 is shown schematically.
- An example of the functional configuration of the haze reliability estimation unit 330 is shown schematically.
- An example of the weighting graph 352 is shown schematically.
- An example of the weighting graph 354 is shown schematically.
- An example of the functional configuration of the haze removal parameter adjustment unit 360 is shown schematically.
- An example of the functional configuration of the haze removal unit 400 is shown schematically.
- FIG. 1 schematically shows an example of a functional configuration of the image processing system 100.
- the image processing system 100 may be a display device that displays images with the haze included in the input image removed.
- Haze includes all atmospheric phenomena in which visibility is deteriorated by fine particles.
- Haze includes, for example, fog, hail, smoke, dust, sand dust, rain, and snow.
- the display device may be a liquid crystal display, a plasma display, an organic EL display, or the like.
- the image processing system 100 includes an image input unit 110, a display unit 120, a haze density estimation unit 200, a scene control unit 300, and a haze removal unit 400.
- the image input unit 110 inputs an image.
- the image may be a moving image or a still image, and may be a frame included in the moving image.
- the image input unit 110 may input RGB data, YUV data, or HSV data.
- the image processing system 100 may convert the input YUV data into RGB data.
- the haze density estimation unit 200 derives an estimated value of haze density for each input image.
- the haze density of the image is the density of haze contained in the image. For example, when the same space is imaged, the haze density of the image is higher when the fog density in the space is higher than when the fog density in the space is low.
- The scene control unit 300 determines whether or not a scene change is included in the input moving image.
- the scene control unit 300 may generate a parameter used for the haze removal process based on whether or not a scene change is included in the input moving image.
- the haze removing unit 400 removes haze from the input image.
- the haze removal unit 400 may remove haze from the input image using the parameters generated by the scene control unit 300.
- the display unit 120 displays the image from which the haze is removed by the haze removing unit 400.
- When no scene change is detected in the moving image, the scene control unit 300 generates a parameter used for the haze removal process so as to change the strength of haze removal step by step over a plurality of frames, and the haze removal unit 400 uses the parameter generated by the scene control unit 300 to change the strength of haze removal step by step over the plurality of frames. This prevents the image from changing suddenly due to haze removal and suppresses phenomena such as so-called flicker.
- When a scene change is detected in the moving image, the scene control unit 300 generates a parameter used for the haze removal process so as to change the strength of haze removal step by step over a smaller number of frames than when no scene change is detected, and the haze removal unit 400 uses the parameter generated by the scene control unit 300 to change the strength of haze removal step by step over that smaller number of frames.
- the image processing system 100 may not include the scene control unit 300.
- In this case, the haze removal unit 400 removes haze from the input image based on the estimated value of the haze density of the image derived by the haze density estimation unit 200. This realizes a highly accurate haze removal process based on the haze density estimated by the haze density estimation unit 200.
- FIG. 2 schematically shows an example of the functional configuration of the haze density estimation unit 200.
- The haze density estimation unit 200 includes a flat & strong edge pixel extraction unit 202, an average luminance calculation unit 204, an average saturation calculation unit 206, a contrast calculation unit 208, a maximum saturation acquisition unit 210, a weighting acquisition unit 212, a haze density calculation unit 214, a tool screen determination unit 216, and a selector 218.
- the flat & strong edge pixel extraction unit 202 extracts pixels that are neither flat nor strong edge from the image input by the image input unit 110.
- the flat & strong edge pixel extraction unit 202 extracts pixels that are flat or strong edge from the image, and excludes the extracted pixels from the image, thereby extracting pixels that are neither flat nor strong edge.
- the flat & strong edge pixel extraction unit 202 may be an example of a first pixel extraction unit.
- the average luminance calculation unit 204 calculates an average luminance value (may be described as AVE Y ) of pixels that are neither flat nor strong edge.
- the average luminance value may be an example of a luminance evaluation value.
- the average luminance calculation unit 204 may be an example of a luminance evaluation value deriving unit.
- the average saturation calculation unit 206 calculates an average saturation value (may be referred to as AVE S ) of pixels that are neither flat nor strong edge.
- the average saturation value may be an example of a saturation evaluation value.
- the average saturation calculation unit 206 may be an example of a saturation evaluation value deriving unit.
- the contrast calculation unit 208 calculates the contrast value of a pixel that is neither flat nor a strong edge.
- the contrast value may be an example of a contrast evaluation value.
- the contrast calculation unit 208 may be an example of a contrast evaluation value deriving unit.
- the contrast calculation unit 208 may generate a histogram of pixels that are neither flat nor strong edges.
- the contrast calculation unit 208 may generate a histogram with an arbitrary number of bins. Then, the contrast calculation unit 208 may calculate a histogram width (may be described as HIST WIDTH ) by subtracting the minimum value from the maximum value of the generated histogram. At this time, the contrast calculation unit 208 may subtract the minimum value from the maximum value of the bins whose values exceed the threshold among the plurality of bins.
- HIST WIDTH may be an example of a contrast value.
- the contrast calculation unit 208 may output the number of bins of the histogram as the maximum width of the histogram (may be described as MAX WIDTH ).
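As an illustration of the HIST WIDTH computation just described, the sketch below builds a histogram, keeps only the bins whose counts exceed a threshold, and measures the distance between the lowest and highest surviving bins; the bin count, value range, and count threshold are placeholder assumptions.

```python
import numpy as np

def hist_width(pixel_values, n_bins=16, count_threshold=8):
    """Contrast value: distance between the lowest and highest bins whose
    counts exceed a threshold, as described for the contrast calculation unit."""
    counts, _ = np.histogram(pixel_values, bins=n_bins, range=(0, 1024))  # 10-bit input assumed
    occupied = np.nonzero(counts > count_threshold)[0]
    if occupied.size == 0:
        return 0
    return int(occupied.max() - occupied.min())

# MAX WIDTH, the maximum possible width, is simply the number of bins (16 here).
```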
- the maximum saturation acquisition unit 210 acquires the maximum saturation (may be described as MAX S ) in the image processing system 100.
- the weighting acquisition unit 212 acquires a weighting value (may be described as coef) used when calculating the haze density of an image. For example, the weight acquisition unit 212 acquires coef designated by the manufacturer or user of the image processing system 100.
- the haze density calculation unit 214 calculates an estimated value (sometimes referred to as “Strength”) of the haze density of the image.
- the haze density calculation unit 214 may calculate Strength based on the luminance evaluation value, the saturation evaluation value, and the contrast evaluation value of a pixel that is neither flat nor a strong edge.
- The haze density calculation unit 214 may calculate Strength based on the average luminance value calculated by the average luminance calculation unit 204, the average saturation value calculated by the average saturation calculation unit 206, and the contrast value calculated by the contrast calculation unit 208. The haze density calculation unit 214 may calculate Strength by multiplying the average luminance value, the average saturation value, and the contrast value. Alternatively, the haze density calculation unit 214 may multiply the average luminance value, a value obtained by subtracting the average saturation value from the maximum saturation, and a value obtained by subtracting the contrast value from the maximum width of the histogram.
- The haze density calculation unit 214 may weight the average luminance value, for example so that a higher average luminance value yields a higher weighted value and a lower average luminance value yields a lower weighted value. The haze density calculation unit 214 may likewise weight the average saturation value so that a higher average saturation value yields a higher weighted value and a lower average saturation value yields a lower weighted value.
- The haze density calculation unit 214 calculates Strength by, for example, Equation 1 below.
- Thereby, the haze density calculation unit 214 can derive a higher estimated value of the haze density as the average luminance value is higher, a higher estimated value as the average saturation value is lower, and a higher estimated value as the contrast value is lower. Since these tendencies reflect the characteristics of haze, the haze density calculation unit 214 can calculate a highly accurate estimated value of the haze density.
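Equation 1 itself is not reproduced in this text, so the following is only a sketch consistent with the description above: Strength grows with the average luminance, with the gap between the maximum and average saturation, and with the gap between the maximum histogram width and the contrast value. The simple normalizations and the use of coef as a plain scale factor are assumptions.

```python
def estimate_strength(ave_y, ave_s, hist_width, max_s=1023, max_width=16, coef=1.0):
    """Sketch of the haze density estimate described for the haze density
    calculation unit 214 (10-bit luminance/saturation, 16-bin histogram assumed)."""
    w_y = ave_y / max_s                         # higher average luminance -> higher value
    w_s = (max_s - ave_s) / max_s               # lower average saturation -> higher value
    w_c = (max_width - hist_width) / max_width  # lower contrast -> higher value
    return coef * w_y * w_s * w_c               # higher result = denser haze

# Example: estimate_strength(ave_y=700, ave_s=150, hist_width=4) is roughly 0.44
```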
- the tool screen determination unit 216 determines whether or not the input image is a tool screen.
- the tool screen is, for example, a screen for setting display parameters of the display unit 120 and a screen for setting image display parameters.
- For example, suppose that, while the display unit 120 is displaying a monitoring camera image, the viewer of the monitoring camera image displays the tool screen in order to change a display parameter setting. If the haze removal unit 400 executes the haze removal process on the tool screen, the screen may become unnecessarily dark or may flicker.
- Therefore, when the input image is determined to be a tool screen, the haze density estimation unit 200 according to the present embodiment outputs 0 as the estimated value of the haze density, and when the input image is determined not to be a tool screen, it outputs the Strength calculated by the haze density calculation unit 214.
- Specifically, the selector 218 receives the Strength calculated by the haze density calculation unit 214 and the tool screen determination result from the tool screen determination unit 216, outputs the Strength to the scene control unit 300 or the haze removal unit 400 if the input image is not a tool screen, and outputs 0 if the input image is a tool screen. Thereby, the haze removal unit 400 can distinguish whether the input image is a target of the haze removal process.
- the tool screen determination unit 216 may determine whether the input image is a tool screen based on pixels that are neither flat nor strong edge extracted by the flat & strong edge pixel extraction unit 202.
- The flat & strong edge pixel extraction unit 202 may extract the pixels that are neither flat nor strong edges to be output to the average luminance calculation unit 204, the average saturation calculation unit 206, and the contrast calculation unit 208 according to a first criterion, and may extract the pixels that are neither flat nor strong edges to be output to the tool screen determination unit 216 according to a second criterion.
- The second criterion may be a criterion under which a pixel is less likely to be determined to be neither flat nor a strong edge than under the first criterion.
- the flat & strong edge pixel extraction unit 202 may be an example of a second pixel extraction unit.
- The tool screen determination unit 216 may determine that the input image is not a tool screen when the ratio of the pixels that are neither flat nor strong edges received from the flat & strong edge pixel extraction unit 202 to all the pixels of the input image is equal to or less than a predetermined threshold, and may determine that it is a tool screen when the ratio is greater than the threshold.
- Alternatively, the tool screen determination unit 216 may determine that the input image is a tool screen when the ratio of the pixels that are flat or strong edges extracted according to the second criterion to the pixels that are flat or strong edges extracted according to the first criterion is equal to or less than a predetermined threshold, and may determine that it is not a tool screen when the ratio is greater than the threshold.
- The tool screen determination unit 216 can determine not only tool screens but also other types of screens in which the ratio of pixels that are neither flat nor strong edges is small. For example, the tool screen determination unit 216 can determine whether the input image is an image in which the area other than the image display area occupies a high proportion of the entire display area of the display unit 120.
- the tool screen determination unit 216 may be an example of a haze removal process target determination unit that determines whether or not an input image is a target for executing a haze removal process.
- The case where the average luminance calculation unit 204, the average saturation calculation unit 206, and the contrast calculation unit 208 calculate the average luminance value, the average saturation value, and the contrast value of the pixels that are neither flat nor strong edges has been described, but the configuration is not limited to this.
- the average luminance calculation unit 204, the average saturation calculation unit 206, and the contrast calculation unit 208 may calculate the average luminance value, average saturation value, and contrast value of the entire input image. Further, the average luminance calculation unit 204, the average saturation calculation unit 206, and the contrast calculation unit 208 may calculate an average luminance value, an average saturation value, and a contrast value of a part of the input image.
- FIG. 3 is a diagram for explaining an example of flat & strong edge pixel extraction processing.
- To determine whether the target pixel 230 is a flat or strong edge pixel, the flat & strong edge pixel extraction unit 202 first obtains the maximum value and the minimum value of the pixel values of seven pixels in each of the vertical and horizontal directions centered on the target pixel 230.
- The flat & strong edge pixel extraction unit 202 then calculates, for each of the seven vertical pixels and the seven horizontal pixels, the value obtained by subtracting the minimum value from the maximum value. When, in at least one of the vertical and horizontal directions, the value obtained by subtracting the minimum value from the maximum value is equal to or smaller than a first threshold, or is equal to or greater than a second threshold that is larger than the first threshold, the flat & strong edge pixel extraction unit 202 determines the target pixel 230 to be a flat or strong edge pixel.
- The flat & strong edge pixel extraction unit 202 may use the first threshold and the second threshold when extracting flat or strong edge pixels according to the first criterion, and may use a third threshold that is larger than the first threshold and smaller than the second threshold and a fourth threshold that is larger than the third threshold and smaller than the second threshold when extracting flat or strong edge pixels according to the second criterion.
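The per-pixel test described above can be sketched as follows; the two thresholds are placeholder values, the input is assumed to be a single-channel array, and pixels within three pixels of the border are simply skipped.

```python
import numpy as np

def is_flat_or_strong_edge(img, y, x, th_flat=8, th_edge=200):
    """True if the 7-pixel vertical or horizontal span around (y, x) is either
    flat (max - min <= th_flat) or a strong edge (max - min >= th_edge)."""
    h, w = img.shape
    if y < 3 or x < 3 or y >= h - 3 or x >= w - 3:
        return False                       # simplified boundary handling
    vert = img[y - 3:y + 4, x]
    horiz = img[y, x - 3:x + 4]
    for span in (vert, horiz):
        diff = int(span.max()) - int(span.min())
        if diff <= th_flat or diff >= th_edge:
            return True
    return False
```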
- FIG. 4A schematically shows an example of the weighting graph 240.
- FIG. 4B schematically shows an example of the weighting graph 242.
- the weighting graph 240 shows an example of a weighting value used when the haze density calculation unit 214 weights the average luminance value.
- FIGS. 4A and 4B illustrate a case where the input signal is 10 bits.
- By weighting the average luminance value according to the weighting graph 240, the haze density calculation unit 214 can map a higher average luminance value to a higher weighted value and a lower average luminance value to a lower weighted value.
- the haze density calculation unit 214 may also use the weighting graph 240 when weighting the average saturation value.
- The haze density calculation unit 214 may use a weighting graph 240 with the same values when weighting the average luminance value and when weighting the average saturation value, or may use weighting graphs with different values. For example, when the average luminance value is to be weighted more heavily than the average saturation value, the haze density calculation unit 214 may weight the average luminance value using the weighting graph 242, in which the weighting is heavier.
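The exact shapes of the weighting graphs 240 and 242 are given only as figures, so the curves below are stand-ins that merely reproduce the stated monotonic behavior: a higher input gives a higher weight, and the "heavier" curve gives larger weights than the linear one.

```python
import numpy as np

def apply_weighting(value, heavier=False, max_in=1023):
    """Monotonically increasing weighting for a 10-bit input value.
    heavier=True mimics a curve like graph 242 that assigns larger weights."""
    x = np.clip(value / max_in, 0.0, 1.0)
    return x ** 0.5 if heavier else x      # placeholder curves, not the actual graphs
```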
- FIG. 5 is a diagram for explaining the histogram value.
- FIG. 5 illustrates a case where the number of histogram bins is 16.
- the contrast calculation unit 208 may calculate the HIST WIDTH by subtracting the minimum value from the maximum value of the bins whose values exceed the threshold value among the plurality of bins.
- FIG. 6 schematically shows an example of the functional configuration of the scene control unit 300.
- the scene control unit 300 includes a scene change determination unit 310, a haze reliability estimation unit 330, and a haze removal parameter adjustment unit 360.
- the scene change determination unit 310 determines whether a scene change is included in the moving image input by the image input unit 110.
- The scene change determination unit 310 may associate, with each of a plurality of frames included in the moving image, a scene change flag indicating whether that frame is a scene-changed frame.
- the haze reliability estimation unit 330 estimates the reliability of the strength output from the haze density estimation unit 200 for the frames included in the moving image input by the image input unit 110.
- the haze removal parameter adjustment unit 360 adjusts parameters used for the haze removal processing for a plurality of frames included in the moving image and outputs the adjusted parameters to the haze removal unit 400.
- the haze removal parameter adjustment unit 360 may adjust a parameter used for the haze removal processing of the one frame from the relationship between the one frame and the past frame of the one frame.
- The haze removal parameter adjustment unit 360 may adjust the parameters used for the haze removal processing of the plurality of frames included in the moving image based on the reliability estimated by the haze reliability estimation unit 330 and the scene change flag generated by the scene change determination unit 310, and may output the adjusted parameters to the haze removal unit 400.
- FIG. 7 schematically shows an example of the functional configuration of the scene change determination unit 310.
- The scene change determination unit 310 includes a high saturation pixel extraction unit 312, a hue histogram generation unit 314, a high saturation pixel rate measurement unit 316, a flat & strong edge pixel extraction unit 318, an average luminance calculation unit 320, an average saturation calculation unit 322, and a determination processing unit 324.
- the high saturation pixel extraction unit 312 extracts a high saturation pixel from one frame among a plurality of frames included in the moving image input by the image input unit 110.
- the high saturation pixel may be a pixel whose saturation is higher than a predetermined threshold.
- When the high saturation pixel extraction unit 312 receives RGB data, it may extract, for each of the plurality of pixels included in the frame, a pixel in which the difference between the maximum value and the minimum value of the R component, the G component, and the B component is equal to or greater than a predetermined threshold as a high saturation pixel.
- When the high saturation pixel extraction unit 312 receives HSV data, it may extract, as a high saturation pixel, a pixel whose S component is equal to or greater than a predetermined threshold.
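The two extraction variants just described (RGB max-minus-min, or the HSV S component) can be sketched as follows; the threshold of 64 is a placeholder.

```python
import numpy as np

def high_saturation_mask(frame_rgb, threshold=64):
    """Boolean mask of high saturation pixels for an (H, W, 3) uint8 RGB frame:
    a pixel is high saturation if max(R, G, B) - min(R, G, B) >= threshold."""
    spread = frame_rgb.max(axis=2).astype(int) - frame_rgb.min(axis=2).astype(int)
    return spread >= threshold

def high_saturation_mask_hsv(frame_hsv, threshold=64):
    """Same idea for HSV input: compare the S channel against the threshold."""
    return frame_hsv[..., 1] >= threshold

# HighSatRate is then the fraction of True entries, e.g. high_saturation_mask(frame).mean()
```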
- the hue histogram generation unit 314 generates a hue histogram (may be referred to as “HueHIST”) for the high saturation pixels extracted by the high saturation pixel extraction unit 312.
- The hue histogram generation unit 314 may also generate a HueHIST for a frame included in the moving image input by the image input unit 110.
- the high saturation pixel rate measuring unit 316 measures a high saturation pixel rate (may be referred to as HighSatRate) indicating the ratio of high saturation pixels in one frame.
- The high saturation pixel rate measurement unit 316 may set the ratio of the high saturation pixels to all the pixels in one frame as HighSatRate.
- the high saturation pixel rate measuring unit 316 may be an example of a high saturation pixel rate deriving unit that derives a high saturation pixel rate in one frame.
- the flat & strong edge pixel extracting unit 318 extracts pixels that are neither flat nor strong edge from one of a plurality of frames included in the moving image input by the image input unit 110.
- the flat & strong edge pixel extraction unit 318 may extract pixels that are neither flat nor strong edge, similar to the flat & strong edge pixel extraction unit 202.
- The flat & strong edge pixel extraction unit 318 may extract pixels that are neither flat nor strong edges according to the first criterion, according to the second criterion, or according to another criterion.
- the average luminance calculation unit 320 calculates AVE Y of pixels that are neither flat nor strong edge extracted by the flat & strong edge pixel extraction unit 318.
- The case where the average luminance calculation unit 320 calculates AVE Y of the pixels that are neither flat nor strong edges extracted by the flat & strong edge pixel extraction unit 318 will be described as an example, but the present invention is not limited thereto.
- the average luminance calculation unit 320 may calculate AVE Y for a frame included in the moving image input by the image input unit 110.
- the average saturation calculation unit 322 calculates AVE S of pixels that are neither flat nor strong edge extracted by the flat & strong edge pixel extraction unit 318.
- the average saturation calculation unit 322 may calculate AVE S for frames included in the moving image input by the image input unit 110.
- the determination processing unit 324 executes determination processing for determining whether or not a scene change is included in the moving image input by the image input unit 110.
- the determination processing unit 324 generates a scene change flag indicating whether the frame is a scene-changed frame, and outputs the scene change flag to the haze removal parameter adjustment unit 360.
- The determination processing unit 324 may determine whether the moving image includes a scene change based on different criteria depending on whether the high saturation pixel rate measured by the high saturation pixel rate measurement unit 316 is higher than a predetermined threshold.
- When the high saturation pixel rate is higher than the predetermined threshold, the determination processing unit 324 may determine whether the moving image includes a scene change based on the hue of the one frame and the hue of the past frame of the one frame.
- the past frame of the one frame is, for example, a frame before the one frame.
- Specifically, the determination processing unit 324 may determine whether the moving image includes a scene change based on the SAD (Sum of Absolute Differences, which may be described as HueHISTSAD) between the HueHIST of the one frame generated by the hue histogram generation unit 314 and the hue histogram of the past frame of the one frame (which may be described as HueHIST_dl).
- For example, when HueHISTSAD is lower than a fifth threshold, the absolute difference between the AVE Y of the one frame and the average luminance value of the past frame of the one frame (which may be described as AVE Y _dl) is lower than a sixth threshold, and the absolute difference between the AVE S of the one frame and the average saturation value of the past frame of the one frame (which may be described as AVE S _dl) is lower than a seventh threshold, the determination processing unit 324 determines that the one frame is not a scene-changed frame; otherwise, it determines that the one frame is a scene-changed frame.
- When the determination processing unit 324 determines that the one frame is not a scene-changed frame, it may set the scene change flag of the one frame to False, and when it determines that the one frame is a scene-changed frame, it may set the scene change flag of the one frame to True.
- When the high saturation pixel rate is equal to or lower than the predetermined threshold, the determination processing unit 324 may determine whether the moving image includes a scene change based on the HighSatRate of the one frame and the high saturation pixel rate of the past frame of the one frame (which may be described as HighSatRate_dl).
- For example, when the absolute difference between HighSatRate and HighSatRate_dl is lower than an eighth threshold, the absolute difference between AVE Y and AVE Y _dl is lower than the sixth threshold, and the absolute difference between AVE S and AVE S _dl is lower than the seventh threshold, the determination processing unit 324 determines that the one frame is not a scene-changed frame; otherwise, it determines that the one frame is a scene-changed frame.
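Putting the two branches together, the decision described above can be sketched as follows; the threshold values and the dictionary-based frame statistics are placeholder assumptions.

```python
import numpy as np

def is_scene_change(cur, prev, th_sad=0.2, th_y=32, th_s=32, th_rate=0.1, th_high_sat=0.3):
    """cur/prev are dicts with keys 'hue_hist' (normalized np.array), 'ave_y',
    'ave_s', and 'high_sat_rate' for the current frame and the past frame."""
    same_y = abs(cur['ave_y'] - prev['ave_y']) < th_y
    same_s = abs(cur['ave_s'] - prev['ave_s']) < th_s

    if cur['high_sat_rate'] > th_high_sat:
        # Many high saturation pixels: compare hue histograms (SAD of HueHIST).
        hue_sad = np.abs(cur['hue_hist'] - prev['hue_hist']).sum()
        return not (hue_sad < th_sad and same_y and same_s)

    # Few high saturation pixels: compare the high saturation pixel rates instead.
    same_rate = abs(cur['high_sat_rate'] - prev['high_sat_rate']) < th_rate
    return not (same_rate and same_y and same_s)
```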
- FIG. 8 schematically shows an example of a functional configuration of the haze reliability estimation unit 330.
- The haze reliability estimation unit 330 includes a flat & strong edge pixel extraction unit 332, an average luminance calculation unit 334, a high saturation pixel extraction unit 336, a high saturation pixel rate measurement unit 338, a haze density estimated value acquisition unit 340, and a reliability calculation unit 342.
- the flat & strong edge pixel extracting unit 332 extracts pixels that are neither flat nor strong edge from one frame among a plurality of frames included in the moving image input by the image input unit 110.
- the flat & strong edge pixel extraction unit 332 may extract pixels that are neither flat nor strong edge, similar to the flat & strong edge pixel extraction unit 202.
- The flat & strong edge pixel extraction unit 332 may extract pixels that are neither flat nor strong edges according to the first criterion, according to the second criterion, or according to another criterion.
- the average luminance calculation unit 334 calculates AVE Y of pixels that are neither flat nor strong edge extracted by the flat & strong edge pixel extraction unit 332.
- The case where the average luminance calculation unit 334 calculates AVE Y of the pixels that are neither flat nor strong edges extracted by the flat & strong edge pixel extraction unit 332 will be described as an example, but the present invention is not limited thereto.
- the average luminance calculation unit 334 may calculate AVE Y for a frame included in the moving image input by the image input unit 110.
- the high saturation pixel extraction unit 336 extracts high saturation pixels from one frame among a plurality of frames included in the moving image input by the image input unit 110.
- For example, the high saturation pixel extraction unit 336 may extract, as a high saturation pixel, each pixel in the frame in which the difference between the maximum value and the minimum value of the R component, the G component, and the B component is equal to or greater than a predetermined threshold.
- the high saturation pixel rate measuring unit 338 measures HighSatRate in one frame.
- The haze density estimated value acquisition unit 340 acquires the Strength output by the haze density estimation unit 200.
- The reliability calculation unit 342 calculates the reliability of the Strength acquired by the haze density estimated value acquisition unit 340 based on the AVE Y calculated by the average luminance calculation unit 334 and the HighSatRate measured by the high saturation pixel rate measurement unit 338, and outputs it to the haze removal parameter adjustment unit 360.
- The reliability calculation unit 342 may be an example of a reliability deriving unit that derives the reliability of Strength based on AVE Y and HighSatRate.
- The reliability calculation unit 342 may calculate the reliability of Strength based on different criteria depending on whether the Strength acquired by the haze density estimated value acquisition unit 340 is larger than a predetermined threshold.
- When Strength is larger than the predetermined threshold, the reliability calculation unit 342 may calculate a higher reliability as AVE Y is larger and a higher reliability as HighSatRate is lower.
- When Strength is equal to or less than the predetermined threshold, the reliability calculation unit 342 may calculate a higher reliability as AVE Y is smaller and a higher reliability as HighSatRate is higher.
- The reliability calculation unit 342 weights each of the Strength acquired by the haze density estimated value acquisition unit 340, the AVE Y calculated by the average luminance calculation unit 334, and the HighSatRate measured by the high saturation pixel rate measurement unit 338.
- The reliability calculation unit 342 weights Strength and HighSatRate so that a higher value yields a higher weighted value and a lower value yields a lower weighted value.
- The reliability calculation unit 342 weights AVE Y so that a higher value yields a lower weighted value and a lower value yields a higher weighted value.
- the weighted Strength is sometimes referred to as StrengthWeight.
- The weighted AVE Y may be described as AVE Y Weight.
- the weighted HighSatRate may be described as HighSatRateWeight.
- the reliability calculation unit 342 sets EvalMax as the larger one of AVE Y Weight and HighSatRateWeight. Then, the reliability calculation unit 342 calculates the absolute difference between EvalMax and StrengthWeight as the reliability of the estimated value of haze density.
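A sketch of the reliability computation just described, assuming Strength and HighSatRate are already normalized to [0, 1]; the weighting functions are simple placeholders with the stated monotonic directions.

```python
def estimate_reliability(strength, ave_y, high_sat_rate, max_y=1023):
    """Reliability = |EvalMax - StrengthWeight|, where EvalMax is the larger of
    the weighted AVE_Y and the weighted HighSatRate (placeholder weightings)."""
    strength_weight = strength                  # higher Strength -> higher weight
    high_sat_weight = high_sat_rate             # higher HighSatRate -> higher weight
    ave_y_weight = 1.0 - ave_y / max_y          # higher AVE_Y -> lower weight

    eval_max = max(ave_y_weight, high_sat_weight)
    return abs(eval_max - strength_weight)
```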
- FIG. 9 schematically shows an example of the weighting graph 352.
- the weighting graph 352 shows an example of a weighting value used when the reliability calculation unit 342 weights Strength.
- the reliability calculation unit 342 weights the strength according to the weighting graph 352, so that the higher the strength, the higher the value, and the lower the strength, the lower the value.
- the weighting graph 352 may be used when the reliability calculation unit 342 weights HighSatRate.
- FIG. 10 schematically shows an example of the weighting graph 354.
- the weighting graph 354 shows an example of a weighting value used when the reliability calculation unit 342 weights AVE Y.
- By weighting AVE Y according to the weighting graph 354, the reliability calculation unit 342 obtains a lower weighted value as AVE Y is higher and a higher weighted value as AVE Y is lower.
- FIG. 11 schematically illustrates an example of a functional configuration of the haze removal parameter adjustment unit 360.
- the haze removal parameter adjustment unit 360 includes a haze density target value calculation unit 362 and a parameter adjustment unit 364.
- The haze density target value calculation unit 362 calculates a haze density target value (which may be described as Target Depth) for one frame.
- Target Depth indicates the haze removal parameter to which the parameter should converge when the content of the one frame and of a plurality of frames following the one frame does not change.
- When the haze reliability estimation unit 330 determines that the estimated value of the haze density of the one frame is not reliable, the haze density target value calculation unit 362 may use the Target Depth set at the time of the past frame of the one frame (which may be described as TargetDepth_dl) as the Target Depth of the one frame.
- The haze density target value calculation unit 362 may determine the Target Depth of the one frame based on the Target Depth of the past frame of the one frame (TargetDepth_dl), the Strength of the one frame received from the haze density estimation unit 200, the reliability of the Strength received from the reliability calculation unit 342, and the scene change flag of the one frame received from the determination processing unit 324.
- For example, the haze density target value calculation unit 362 first calculates the absolute difference between Strength and TargetDepth_dl (which may be described as DiffDepth). The haze density target value calculation unit 362 then sets Strength as the Target Depth when DiffDepth is larger than a ninth threshold and the reliability of Strength is larger than a tenth threshold, and also when the scene change flag is True; otherwise, it sets TargetDepth_dl as the Target Depth. The other case is the case where the scene change flag is False and DiffDepth is smaller than the ninth threshold or the reliability of Strength is smaller than the tenth threshold.
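The Target Depth selection just described can be sketched compactly as follows; th9 and th10 stand in for the ninth and tenth thresholds, and the numeric defaults are placeholders.

```python
def decide_target_depth(strength, reliability, target_depth_dl, scene_change, th9=0.1, th10=0.5):
    """Return the haze density target value for the current frame."""
    diff_depth = abs(strength - target_depth_dl)
    if scene_change or (diff_depth > th9 and reliability > th10):
        return strength          # trust the new estimate
    return target_depth_dl       # keep the previous frame's target
```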
- the parameter adjustment unit 364 adjusts the parameter used for the haze removal processing of the one frame (may be described as HazeRemovalStrength) so that it gradually approaches the value corresponding to the Strength or TargetDepth_dl that the haze density target value calculation unit 362 determined as the TargetDepth of the one frame. The strength of haze removal thereby changes gradually rather than abruptly between frames.
- the parameter adjustment unit 364 may adjust HazeRemovalStrength based on the TargetDepth determined by the haze density target value calculation unit 362 and on the relationship between the one frame and the past frame of the one frame.
- the parameter adjustment unit 364 may adjust HazeRemovalStrength so as to gradually approach the Target Depth determined by the haze density target value calculation unit 362 from the parameter adjusted in the past frame of the one frame.
- the parameter adjustment unit 364 may change the step width with which HazeRemovalStrength approaches the TargetDepth based on whether or not the one frame is a scene-changed frame.
- for example, when the one frame is a scene-changed frame, the parameter adjustment unit 364 increases the step width with which HazeRemovalStrength approaches the TargetDepth.
- as a result, the strength of haze removal can be changed by a larger amount than when the one frame is not a scene-changed frame.
- the parameter adjustment unit 364 calculates the absolute difference between the TargetDepth determined by the haze density target value calculation unit 362 and the parameter used for the haze removal process of the past frame of the one frame (the latter may be described as HazeRemovalStrength_dl).
- this absolute difference may be described as DiffStrength.
- the parameter adjustment unit 364 may be an example of a difference absolute value derivation unit that derives DiffStrength. The parameter adjustment unit 364 may then adjust HazeRemovalStrength, with an adjustment amount according to DiffStrength, the reliability of Strength, and the scene change flag, so that it approaches in steps the value corresponding to the Strength or TargetDepth_dl determined as the TargetDepth of the one frame by the haze density target value calculation unit 362.
- when the scene change flag is True, the parameter adjustment unit 364 adjusts HazeRemovalStrength with the first adjustment amount so that it approaches step by step the value corresponding to the Strength or TargetDepth_dl determined as the TargetDepth of the one frame by the haze density target value calculation unit 362. Further, when the scene change flag is False, DiffStrength is larger than the eleventh threshold, and the reliability of Strength is larger than the twelfth threshold, the parameter adjustment unit 364 adjusts HazeRemovalStrength with the second adjustment amount so that it gradually approaches the value corresponding to the Strength or TargetDepth_dl determined as the TargetDepth of the one frame.
- in other cases, the parameter adjustment unit 364 adjusts HazeRemovalStrength with the third adjustment amount so that it gradually approaches the value corresponding to the Strength or TargetDepth_dl determined as the TargetDepth of the one frame.
- the other cases are those in which the scene change flag is False and DiffStrength is smaller than the eleventh threshold or the reliability of Strength is smaller than the twelfth threshold.
- the first adjustment amount is larger than the second adjustment amount and the third adjustment amount, and the second adjustment amount is larger than the third adjustment amount.
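- taken together, the adjustment logic above can be sketched as follows. The text only ranks the three adjustment amounts and names the thresholds, so the numeric values are placeholders and the stepping scheme is one plausible reading.

```python
# Placeholder values: only the ranking first > second > third is given.
FIRST_AMOUNT, SECOND_AMOUNT, THIRD_AMOUNT = 0.20, 0.10, 0.02
ELEVENTH_THRESHOLD, TWELFTH_THRESHOLD = 0.1, 0.5

def adjust_haze_removal_strength(haze_removal_strength_dl, target_depth,
                                 strength_reliability, scene_change_flag):
    """Sketch of how unit 364 steps HazeRemovalStrength toward TargetDepth."""
    diff_strength = abs(target_depth - haze_removal_strength_dl)
    if scene_change_flag:
        amount = FIRST_AMOUNT
    elif diff_strength > ELEVENTH_THRESHOLD and strength_reliability > TWELFTH_THRESHOLD:
        amount = SECOND_AMOUNT
    else:
        amount = THIRD_AMOUNT
    # Move at most `amount` per frame, never overshooting the target.
    step = max(-amount, min(amount, target_depth - haze_removal_strength_dl))
    return haze_removal_strength_dl + step
```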
- the haze density target value calculation unit 362 may determine, based on the reliability of Strength, the absolute value of the difference between Strength and TargetDepth_dl, and the scene change flag, which of TargetDepth_dl and Strength is used to adjust the parameter used for the haze removal processing.
- the haze density target value calculation unit 362 that acquires TargetDepth_dl may be an example of a target value acquisition unit. Further, the haze density target value calculation unit 362 may be an example of a difference absolute value derivation unit that calculates a difference absolute value between Strength and TargetDepth_dl. Further, the haze density target value calculation unit 362 may be an example of a target value determination unit that determines the TargetDepth for the one frame.
- FIG. 12 schematically shows an example of the functional configuration of the haze removing unit 400.
- the haze removal unit 400 includes an illumination light separation unit 402, a parameter acquisition unit 410, a removal processing unit 420, and a synthesis unit 426.
- the illumination light separation unit 402 separates the illumination light component I L from the image I input by the image input unit 110. The illumination light separation unit 402 may perform any process as long as it can separate the illumination light component I L from the image I.
- the illumination light separation unit 402 uses an edge-preserving low-pass filter to separate the illumination light component I L from the image I.
- An edge preserving low-pass filter is a filter that performs smoothing while preserving edges.
- the illumination light separation unit 402 uses, for example, a bilateral filter as an edge-preserving low-pass filter.
- the illumination light separation unit 402 may output the illumination light component I L and the image I to the parameter acquisition unit 410.
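- a minimal sketch of this separation step, assuming an 8-bit BGR input and using OpenCV's bilateral filter with illustrative parameters, might look as follows; the Retinex-style division that yields the reflectance component is included only to show how I and I L relate.

```python
import cv2
import numpy as np

def separate_illumination(image_bgr):
    """Sketch of the illumination-light separation in unit 402.

    An edge-preserving low-pass filter (here OpenCV's bilateral filter with
    illustrative parameters) gives the illumination component I_L; under the
    Retinex decomposition I = I_L * I_R, the reflectance follows by division.
    """
    illumination = cv2.bilateralFilter(image_bgr, d=9, sigmaColor=75, sigmaSpace=75)
    reflectance = image_bgr.astype(np.float32) / (illumination.astype(np.float32) + 1e-6)
    return illumination, reflectance
```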
- the parameter acquisition unit 410 acquires parameters used for haze removal.
- the parameter acquisition unit 410 includes an atmospheric light calculation unit 412, a transmittance calculation unit 414, and a haze density estimated value acquisition unit 416.
- the atmospheric light calculation unit 412 calculates the atmospheric light A of the image I.
- the atmospheric light calculation unit 412 may perform any process as long as the atmospheric light A of the image I can be calculated. For example, the atmospheric light calculation unit 412 first calculates, for each pixel of the image I, the minimum value among the RGB values of that pixel and its surrounding pixels. Next, the atmospheric light calculation unit 412 extracts from the image I the pixels whose calculated minimum values are within the top 0.1%. Then, the atmospheric light calculation unit 412 sets the value of the pixel having the highest luminance among the extracted pixels as the atmospheric light A.
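- the atmospheric light estimate just described can be sketched as follows; the window size and the use of the channel mean as a luminance proxy are assumptions, not values taken from the text.

```python
import numpy as np
from scipy.ndimage import minimum_filter

def estimate_atmospheric_light(image, patch=15):
    """Sketch of the atmospheric-light estimate by unit 412.

    Per-pixel minimum over RGB and a local window, keep the brightest 0.1%
    of those values, and return the highest-luminance pixel among them as A.
    """
    dark = minimum_filter(image.min(axis=2), size=patch)      # min over RGB, then local window
    n = max(1, int(dark.size * 0.001))                        # top 0.1% of pixels
    rows, cols = np.unravel_index(np.argsort(dark, axis=None)[-n:], dark.shape)
    candidates = image[rows, cols]                            # shape (n, 3)
    brightest = np.argmax(candidates.mean(axis=1))            # channel mean as luminance proxy
    return candidates[brightest]                              # atmospheric light A
```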
- the transmittance calculating unit 414 calculates a transmittance t corresponding to the haze density for each of a plurality of pixels of the image input by the image input unit 110.
- the transmittance calculating unit 414 may be an example of a transmittance deriving unit.
- the transmittance calculation unit 414 may perform any process as long as the transmittance t can be calculated. For example, the transmittance calculation unit 414 calculates the transmittance t based on a dark channel prior (sometimes referred to as DCP) expressed by the following mathematical formula 2.
- I C is the color channel of I
- ⁇ (x) is a local region centered on x.
- the transmittance calculator 414 may calculate the transmittance t from the value of DCP based on the assumption that the DCP in Equation 2 represents the transmittance t. For example, the transmittance calculation unit 414 may calculate the transmittance t according to the following mathematical formula 3.
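- since Formulas 2 and 3 are not reproduced in this text, the sketch below substitutes the widely used dark-channel-prior relation t = 1 - ω · DCP(I / A); ω and the patch size are assumptions.

```python
import numpy as np
from scipy.ndimage import minimum_filter

def transmittance_from_dcp(image, atmospheric_light, patch=15, omega=0.95):
    """Sketch only: Formulas 2 and 3 are not reproduced in the text, so the
    common dark-channel-prior form t = 1 - omega * DCP(I / A) stands in here.
    """
    normalized = image.astype(np.float32) / np.maximum(
        atmospheric_light.astype(np.float32), 1e-6)           # I^C(y) / A^C
    dcp = minimum_filter(normalized.min(axis=2), size=patch)  # min over channels and Omega(x)
    return np.clip(1.0 - omega * dcp, 0.05, 1.0)              # per-pixel transmittance t
```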
- the haze density estimated value acquisition unit 416 acquires the Strength output by the haze density estimation unit 200.
- the parameter acquisition unit 410 outputs the image I, the illumination light component I L, the atmospheric light A calculated by the atmospheric light calculation unit 412, the transmittance t calculated by the transmittance calculation unit 414, and the Strength acquired by the haze density estimated value acquisition unit 416 to the removal processing unit 420.
- the removal processing unit 420 executes a haze removal process on the image I based on the Retinex theory represented by the following formula 4 and the atmospheric model of the haze image represented by the following formula 5.
- the removal processing unit 420 may execute the haze removal processing using the atmospheric model of the haze image and the Retinex theory on the assumption that the reflectance component included in the atmospheric light is small or can be approximated as negligible.
- when the reflectance component included in the atmospheric light is minimal, it can be considered that the following Expression 8 holds.
- the removal processing unit 420 may further perform the haze removal process using the atmospheric model of the haze image and the Retinex theory on the premise that the atmospheric model can be applied only to the respective illumination light components of the atmospheric light, the original image, and the input image. Under this premise, the following Formula 9 holds.
- the removal processing unit 420 may perform haze removal processing with different degrees of haze removal on the reflectance component and the illumination light component. For example, the removal processing unit 420 may vary the degree of haze removal by varying the transmittance t with respect to Equation 9 and Equation 10. Further, the removal processing unit 420 may vary the degree of haze removal by applying different weights to the results of Equation 9 and Equation 10.
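- as a rough illustration of processing the two components with different degrees of haze removal, the sketch below applies the familiar atmospheric-model inversion to each component; Formulas 9 and 10 are not reproduced here, so this is only one plausible reading, with the atmospheric-light reflectance approximated as 1 and the scale factors standing in for the per-component adjustment of t.

```python
import numpy as np

def dehaze_components(illumination, reflectance, atmospheric_light, t,
                      t_scale_illum=1.0, t_scale_refl=1.0):
    """Sketch only: Formulas 9 and 10 are not reproduced in the text.

    The illumination component is dehazed with the atmospheric-model
    inversion J = (I - A) / t + A; the reflectance component is treated
    the same way with the atmospheric-light reflectance taken as 1.
    """
    t_l = np.clip(t[..., None] * t_scale_illum, 0.05, 1.0)
    t_r = np.clip(t[..., None] * t_scale_refl, 0.05, 1.0)
    j_illum = (illumination - atmospheric_light) / t_l + atmospheric_light
    j_refl = (reflectance - 1.0) / t_r + 1.0
    return j_illum, j_refl
```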
- the removal processing unit 420 includes an illumination light component processing unit 422 and a reflectance component processing unit 424.
- the illumination light component processing unit 422 performs haze removal processing on the illumination light component of the image.
- the reflectance component processing unit 424 performs haze removal processing on the reflectance component of the image.
- the illumination light component processing unit 422 may perform haze removal processing using the HazeRemovalStrength received from the parameter adjustment unit 364. For example, the illumination light component processing unit 422 may calculate the illumination light component J L subjected to the haze removal process using the illumination light component I L , the atmospheric light A, the transmittance t, the HazeRemovalStrength, and the above formula 9. The illumination light component processing unit 422 may adjust the transmittance t using HazeRemovalStrength. Also, the illumination light component processing unit 422 may use HazeRemovalStrength instead of the transmittance t. Further, the illumination light component processing unit 422 may use a value based on HazeRemovalStrength instead of the transmittance t.
- the parameter acquisition unit 410 may not have the haze density estimated value acquisition unit 416.
- the haze density estimated value acquisition unit 416 may receive the HazeRemovalStrength from the parameter adjustment unit 364 and transmit the received HazeRemovalStrength to the removal processing unit 420.
- when a still image is input by the image input unit 110 or when the image processing system 100 does not include the scene control unit 300, the illumination light component processing unit 422 may calculate the illumination light component J L subjected to the haze removal process using the illumination light component I L received from the parameter acquisition unit 410, the atmospheric light A, the transmittance t, the Strength, and Formula 9 above.
- the illumination light component processing unit 422 may calculate the illumination light component J L by applying Strength to the transmittance t. For example, the illumination light component processing unit 422 multiplies the transmittance t by Strength. Further, for example, the illumination light component processing unit 422 may perform weighting on the transmittance t according to the value of Strength. As a result, a more accurate haze removal process using the Strength estimated by the haze density estimation unit 200 can be realized.
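- one possible form of such a weighting is sketched below; the text does not fix the exact formula, so the blend toward t = 1 (no removal) as Strength falls is an assumption.

```python
def apply_strength(t, strength):
    """Sketch of weighting the transmittance t by Strength before dehazing.

    Strength is assumed to lie in [0, 1]; as Strength falls, the effective
    transmittance approaches 1 and the haze removal weakens.
    """
    return 1.0 - strength * (1.0 - t)
```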
- the reflectance component processing unit 424 may calculate the reflectance component I R from the image I and the illumination light component I L received from the parameter acquisition unit 410.
- the reflectance component processing unit 424 may perform haze removal processing using the HazeRemovalStrength received from the parameter adjustment unit 364. For example, the reflectance component processing unit 424 may calculate the reflectance component J R subjected to the haze removal process using the illumination light component I L, the reflectance component I R, the transmittance t, the HazeRemovalStrength, and Formula 10 above.
- the reflectance component processing unit 424 may adjust the transmittance t by using HazeRemovalStrength.
- the reflectance component processing unit 424 may use HazeRemovalStrength instead of the transmittance t. In addition, the reflectance component processing unit 424 may use a value based on HazeRemovalStrength instead of the transmittance t.
- the parameter acquisition unit 410 may not have the haze density estimated value acquisition unit 416.
- the haze density estimated value acquisition unit 416 may receive the HazeRemovalStrength from the parameter adjustment unit 364 and transmit the received HazeRemovalStrength to the removal processing unit 420.
- when a still image is input by the image input unit 110 or when the image processing system 100 does not include the scene control unit 300, the reflectance component processing unit 424 may calculate the J R subjected to the haze removal process using the illumination light component I L, the reflectance component I R, the transmittance t, the Strength, and Formula 10 above.
- the reflectance component processing unit 424 may calculate J R by applying Strength to the transmittance t. For example, the reflectance component processing unit 424 multiplies the transmittance t by Strength. Further, for example, the reflectance component processing unit 424 may perform weighting on the transmittance t according to the value of Strength. As a result, a more accurate haze removal process using the Strength estimated by the haze density estimation unit 200 can be realized.
- the synthesis unit 426 synthesizes the illumination light component J L on which the haze removal process has been performed by the illumination light component processing unit 422 and the reflectance component J R on which the haze removal process has been performed by the reflectance component processing unit 424.
- the synthesis unit 426 generates an output image J by combining J L and J R.
- the output image J generated by the synthesis unit 426 may be displayed by the display unit 120.
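- assuming the multiplicative Retinex decomposition used above, the recomposition step is a single multiplication, as sketched below.

```python
def synthesize_output(j_illum, j_refl):
    """Sketch of the synthesis by unit 426: the output image J is recomposed
    by multiplying the dehazed illumination and reflectance components,
    assuming the decomposition I = I_L * I_R used earlier.
    """
    return j_illum * j_refl
```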
- although the image processing system 100 has been described by way of example as a display device including the haze density estimation unit 200, the scene control unit 300, and the haze removal unit 400, the present invention is not limited thereto.
- the image processing system 100 may be a display device that includes at least one of the haze density estimation unit 200, the scene control unit 300, and the haze removal unit 400.
- the image processing system 100 may be a display device that includes, of the scene change determination unit 310, the haze reliability estimation unit 330, and the haze removal parameter adjustment unit 360 included in the scene control unit 300, only the scene change determination unit 310.
- the image processing system 100 may be a display device that includes only the haze reliability estimation unit 330.
- the haze density estimated value acquisition unit 340 may acquire a haze density estimated value estimated by another device or the like instead of the Strength output by the haze density estimation unit 200.
- similarly, the haze density estimated value acquisition unit 416 may acquire a haze density estimated value estimated by another device or the like.
- although the case where the image processing system 100 is a display device has been described, the present invention is not limited to this; any other type of device may be used as long as it is a device that processes images.
- the image processing system 100 may be a mobile phone such as a smartphone, a tablet terminal, a personal computer, an information home appliance, or the like.
- the image processing system 100 may be an apparatus that does not have the display unit 120 and displays an image on an external display unit.
- each unit of the image processing system 100 may be realized by hardware, by software, or by a combination of hardware and software. Further, a computer may function as the image processing system 100 by executing a program.
- the program may be installed in a computer constituting at least a part of the image processing system 100 from a computer-readable medium or a storage device connected to a network.
- a program that is installed in a computer and causes the computer to function as the image processing system 100 according to the present embodiment acts on a CPU or the like to cause the computer to function as each unit of the image processing system 100.
- the information processing described in these programs, when read by the computer, functions as concrete means in which software and the hardware resources of the image processing system 100 cooperate.
- 100 image processing system, 110 image input unit, 120 display unit, 200 haze density estimation unit, 202 flat & strong edge pixel extraction unit, 204 average luminance calculation unit, 206 average saturation calculation unit, 208 contrast calculation unit, 210 maximum saturation acquisition unit, 212 weight acquisition unit, 214 haze density calculation unit, 216 tool screen determination unit, 218 selector, 230 pixel of interest, 240 weighting graph, 242 weighting graph, 300 scene control unit, 310 scene change determination unit, 312 high saturation pixel extraction unit, 314 hue histogram generation unit, 316 high saturation pixel rate measurement unit, 318 flat & strong edge pixel extraction unit, 320 average luminance calculation unit, 322 average saturation calculation unit, 324 determination processing unit, 330 haze reliability estimation unit, 332 flat & strong edge pixel extraction unit, 334 average luminance calculation unit, 336 high saturation pixel extraction unit, 338 high saturation pixel rate measurement unit, 340 haze density estimated value acquisition unit, 342 reliability calculation unit
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Image Processing (AREA)
- Studio Devices (AREA)
- Image Analysis (AREA)
- Facsimile Image Signal Circuits (AREA)
- Investigating Or Analysing Materials By Optical Means (AREA)
- Processing Or Creating Images (AREA)
Abstract
Description
[Prior Art Documents]
[Patent Documents]
[Patent Document 1] JP 2012-168936 A
Claims (9)
- An image processing system comprising: a haze density estimation unit that derives an estimated value of a haze density of one frame among a plurality of frames included in a moving image; a parameter adjustment unit that adjusts a parameter used for a haze removal process on the one frame based on the estimated value of the haze density of the one frame and a relationship between the one frame and a past frame of the one frame; and a haze removal unit that executes the haze removal process on the one frame based on the parameter adjusted by the parameter adjustment unit.
- The image processing system according to claim 1, wherein the parameter adjustment unit adjusts the parameter used for the haze removal process on the one frame so that, with the estimated value of the haze density of the one frame taken as a haze density target value of the one frame, the parameter approaches the haze density target value stepwise from the parameter adjusted in the past frame.
- The image processing system according to claim 2, further comprising a scene change determination unit that determines whether or not the one frame is a scene-changed frame, wherein, when the scene change determination unit determines, based on the relationship between the one frame and the past frame, that the one frame is a scene-changed frame, the parameter adjustment unit increases the step width with which the parameter approaches the haze density target value.
- The image processing system according to claim 1, further comprising: a haze reliability estimation unit that estimates a reliability of the estimated value of the haze density; a target value acquisition unit that acquires a haze density target value of the past frame; a target value determination unit that determines, as a haze density target value of the one frame, either the haze density target value of the past frame acquired by the target value acquisition unit or the estimated value of the haze density of the one frame derived by the haze density estimation unit; and a difference absolute value derivation unit that derives a difference absolute value between the haze density target value of the one frame determined by the target value determination unit and the parameter used for the haze removal process on the past frame of the one frame, wherein the parameter adjustment unit adjusts the parameter used for the haze removal process on the one frame, with an adjustment amount according to the reliability, the difference absolute value, and whether or not the one frame is a scene-changed frame, so that the parameter approaches stepwise, from the parameter used for the haze removal process on the past frame, the haze density target value of the one frame determined by the target value determination unit.
- The image processing system according to claim 4, wherein the parameter adjustment unit adjusts the parameter with a first adjustment amount when the one frame is determined to be a scene-changed frame; adjusts the parameter with a second adjustment amount smaller than the first adjustment amount when the one frame is determined not to be a scene-changed frame, the reliability of the estimated value is larger than a first threshold, and the difference absolute value is larger than a second threshold; and adjusts the parameter with a third adjustment amount smaller than the second adjustment amount when the one frame is determined not to be a scene-changed frame and the reliability of the estimated value is smaller than the first threshold or the difference absolute value is smaller than the second threshold.
- The image processing system according to claim 2 or 3, further comprising a haze reliability estimation unit that estimates a reliability of the estimated value of the haze density, wherein, when the haze reliability estimation unit determines that the estimated value of the haze density is not reliable for the one frame, the parameter adjustment unit adjusts the parameter used for the haze removal process on the one frame so that it approaches stepwise the target value set at the time of the past frame.
- The image processing system according to any one of claims 1 to 3, further comprising: a haze reliability estimation unit that estimates a reliability of the estimated value of the haze density; a target value acquisition unit that acquires a haze density target value of the past frame; and a target value determination unit that determines, based on the reliability estimated by the haze reliability estimation unit, which of the haze density target value of the past frame acquired by the target value acquisition unit and the estimated value of the haze density of the one frame derived by the haze density estimation unit is used to adjust the parameter used for the haze removal process on the one frame.
- The image processing system according to claim 7, further comprising a difference absolute value derivation unit that derives a difference absolute value between the haze density target value of the past frame acquired by the target value acquisition unit and the estimated value of the haze density of the one frame derived by the haze density estimation unit, wherein the target value determination unit determines that the estimated value of the haze density of the one frame is used to adjust the parameter used for the haze removal process on the one frame when the one frame is a scene-changed frame, or when the one frame is not a scene-changed frame and the reliability of the estimated value is larger than a first threshold and the difference absolute value is larger than a second threshold, and determines that the haze density target value of the past frame is used to adjust the parameter used for the haze removal process on the one frame when the one frame is not a scene-changed frame and the reliability of the estimated value is smaller than the first threshold or the difference absolute value is smaller than the second threshold.
- A computer-readable recording medium having recorded thereon a program for causing a computer to function as the image processing system according to any one of claims 1 to 8.
Priority Applications (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP15806110.1A EP3156970B1 (en) | 2014-06-12 | 2015-04-27 | Haze removal |
JP2016527685A JP6228671B2 (ja) | 2014-06-12 | 2015-04-27 | 画像処理システム及びプログラム |
RU2017100023A RU2648955C1 (ru) | 2014-06-12 | 2015-04-27 | Система обработки изображений и машиночитаемый записывающий носитель |
AU2015272799A AU2015272799B2 (en) | 2014-06-12 | 2015-04-27 | Image processing system and computer-readable storage medium |
CN201580031509.5A CN106462954B (zh) | 2014-06-12 | 2015-04-27 | 图像处理系统 |
US15/372,402 US10096092B2 (en) | 2014-06-12 | 2016-12-08 | Image processing system and computer-readable recording medium |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2014/003131 WO2015189874A1 (ja) | 2014-06-12 | 2014-06-12 | 霧除去装置および画像生成方法 |
JPPCT/JP2014/003131 | 2014-06-12 | ||
JPPCT/JP2015/056086 | 2015-03-02 | ||
PCT/JP2015/056086 WO2015190136A1 (ja) | 2014-06-12 | 2015-03-02 | 画像処理システム及びコンピュータ読み取り可能な記録媒体 |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2015/056086 Continuation-In-Part WO2015190136A1 (ja) | 2014-06-12 | 2015-03-02 | 画像処理システム及びコンピュータ読み取り可能な記録媒体 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/372,402 Continuation US10096092B2 (en) | 2014-06-12 | 2016-12-08 | Image processing system and computer-readable recording medium |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2015190184A1 true WO2015190184A1 (ja) | 2015-12-17 |
Family
ID=54833008
Family Applications (4)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2014/003131 WO2015189874A1 (ja) | 2014-06-12 | 2014-06-12 | 霧除去装置および画像生成方法 |
PCT/JP2015/056086 WO2015190136A1 (ja) | 2014-06-12 | 2015-03-02 | 画像処理システム及びコンピュータ読み取り可能な記録媒体 |
PCT/JP2015/062729 WO2015190184A1 (ja) | 2014-06-12 | 2015-04-27 | 画像処理システム及びコンピュータ読み取り可能な記録媒体 |
PCT/JP2015/062728 WO2015190183A1 (ja) | 2014-06-12 | 2015-04-27 | 画像処理システム及びコンピュータ読み取り可能な記録媒体 |
Family Applications Before (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2014/003131 WO2015189874A1 (ja) | 2014-06-12 | 2014-06-12 | 霧除去装置および画像生成方法 |
PCT/JP2015/056086 WO2015190136A1 (ja) | 2014-06-12 | 2015-03-02 | 画像処理システム及びコンピュータ読み取り可能な記録媒体 |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2015/062728 WO2015190183A1 (ja) | 2014-06-12 | 2015-04-27 | 画像処理システム及びコンピュータ読み取り可能な記録媒体 |
Country Status (8)
Country | Link |
---|---|
US (4) | US10102614B2 (ja) |
EP (4) | EP3156968B1 (ja) |
JP (4) | JP6228670B2 (ja) |
CN (4) | CN106462947B (ja) |
AU (4) | AU2014397095B2 (ja) |
ES (3) | ES2727929T3 (ja) |
RU (4) | RU2658874C1 (ja) |
WO (4) | WO2015189874A1 (ja) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2019165832A (ja) * | 2018-03-22 | 2019-10-03 | 上銀科技股▲分▼有限公司 | 画像処理方法 |
JP2022103003A (ja) * | 2020-12-25 | 2022-07-07 | 財團法人工業技術研究院 | 画像ヘイズ除去方法及びその方法を用いる画像ヘイズ除去装置 |
Families Citing this family (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP3180454B1 (en) * | 2015-06-30 | 2018-06-06 | Heraeus Deutschland GmbH & Co. KG | Process for the production of a pgm-enriched alloy |
CN107872608B (zh) * | 2016-09-26 | 2021-01-12 | 华为技术有限公司 | 图像采集设备及图像处理方法 |
KR101942198B1 (ko) * | 2017-12-27 | 2019-04-11 | 중앙대학교 산학협력단 | 레티넥스 모델을 기반으로 하여 영상을 개선하는 단말 장치 및 방법과 이를 수행하는 기록 매체 |
CN110189259B (zh) * | 2018-02-23 | 2022-07-08 | 荷兰移动驱动器公司 | 图像去雾霾方法、电子设备及计算机可读存储介质 |
CN108416316B (zh) * | 2018-03-19 | 2022-04-05 | 中南大学 | 一种黑烟车的检测方法及系统 |
JP7109317B2 (ja) | 2018-09-06 | 2022-07-29 | 株式会社クボタ | 水田作業機 |
CN109028234B (zh) * | 2018-09-29 | 2020-11-10 | 佛山市云米电器科技有限公司 | 一种能够对烟雾等级进行标识的油烟机 |
CN109028233B (zh) * | 2018-09-29 | 2020-11-10 | 佛山市云米电器科技有限公司 | 厨房油烟浓度划分方法及油烟图像识别系统及油烟机 |
CN109242805B (zh) * | 2018-10-24 | 2021-09-28 | 西南交通大学 | 一种基于独立分量分析的单幅图像雾霾快速去除方法 |
JP7421273B2 (ja) * | 2019-04-25 | 2024-01-24 | キヤノン株式会社 | 画像処理装置及びその制御方法及びプログラム |
CN110223258A (zh) * | 2019-06-12 | 2019-09-10 | 西南科技大学 | 一种多模式快速视频图像去雾方法及装置 |
CN112419162B (zh) * | 2019-08-20 | 2024-04-05 | 浙江宇视科技有限公司 | 图像去雾方法、装置、电子设备及可读存储介质 |
CN111192210B (zh) * | 2019-12-23 | 2023-05-26 | 杭州当虹科技股份有限公司 | 一种自适应增强的视频去雾方法 |
CN113674158A (zh) * | 2020-05-13 | 2021-11-19 | 浙江宇视科技有限公司 | 图像处理方法、装置、设备及存储介质 |
CN112011696B (zh) | 2020-08-19 | 2021-05-18 | 北京科技大学 | 一种火法富集铝基废催化剂中铂族金属的方法 |
US11641456B2 (en) | 2020-09-14 | 2023-05-02 | Himax Technologies Limited | Image rendering method and apparatus |
US11790545B2 (en) * | 2020-09-14 | 2023-10-17 | Himax Technologies Limited | Method and apparatus to control light source in structured light imaging |
CN114519683A (zh) * | 2020-11-20 | 2022-05-20 | 北京晶视智能科技有限公司 | 图像处理方法及应用其的图像处理装置 |
TWI792454B (zh) * | 2021-07-28 | 2023-02-11 | 瑞昱半導體股份有限公司 | 自適應的圖像陰影校正方法及圖像陰影校正系統 |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2010074012A1 (ja) * | 2008-12-22 | 2010-07-01 | ローム株式会社 | 画像補正処理回路、半導体装置、画像補正処理装置 |
JP2013141210A (ja) * | 2011-12-30 | 2013-07-18 | Hitachi Ltd | 画像霧除去装置、画像霧除去方法及び画像処理システム |
JP2013156983A (ja) * | 2012-01-31 | 2013-08-15 | Hitachi Ltd | 画像霧除去装置、画像霧除去方法及び画像処理システム |
Family Cites Families (58)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6085152A (en) * | 1997-09-19 | 2000-07-04 | Cambridge Management Advanced Systems Corporation | Apparatus and method for monitoring and reporting weather conditions |
DE60014420T2 (de) | 1999-11-26 | 2005-10-13 | Sanyo Electric Co., Ltd., Moriguchi | Verfahren zur 2d/3d videoumwandlung |
JP2001160146A (ja) * | 1999-12-01 | 2001-06-12 | Matsushita Electric Ind Co Ltd | 画像認識方法および画像認識装置 |
US9149175B2 (en) | 2001-07-26 | 2015-10-06 | Given Imaging Ltd. | Apparatus and method for light control in an in-vivo imaging device |
JP2003187248A (ja) * | 2001-12-21 | 2003-07-04 | Mitsubishi Electric Corp | 画像処理システムおよび画像処理装置 |
KR100512976B1 (ko) * | 2003-08-09 | 2005-09-07 | 삼성전자주식회사 | 화면의 콘트라스트를 향상시키는 블랙/화이트 스트레칭시스템 및 그의 스트레칭 방법 |
JP2006155491A (ja) * | 2004-12-01 | 2006-06-15 | Samsung Yokohama Research Institute Co Ltd | シーンチェンジ検出方法 |
JP2007266838A (ja) * | 2006-03-28 | 2007-10-11 | Sharp Corp | 記録再生装置、記録再生方法、及び、記録再生プログラムを記録した記録媒体 |
JP4784452B2 (ja) | 2006-09-12 | 2011-10-05 | 株式会社デンソー | 車載霧判定装置 |
JP4475268B2 (ja) * | 2006-10-27 | 2010-06-09 | セイコーエプソン株式会社 | 画像表示装置、画像表示方法、画像表示プログラム、及び画像表示プログラムを記録した記録媒体、並びに電子機器 |
JP4241834B2 (ja) | 2007-01-11 | 2009-03-18 | 株式会社デンソー | 車載霧判定装置 |
RU2365993C1 (ru) * | 2008-01-30 | 2009-08-27 | Корпорация "САМСУНГ ЭЛЕКТРОНИКС Ко., Лтд." | Способ адаптивного улучшения факсимильных изображений документов |
US8619071B2 (en) * | 2008-09-16 | 2013-12-31 | Microsoft Corporation | Image view synthesis using a three-dimensional reference model |
US8290294B2 (en) | 2008-09-16 | 2012-10-16 | Microsoft Corporation | Dehazing an image using a three-dimensional reference model |
JP5325562B2 (ja) * | 2008-12-22 | 2013-10-23 | ローム株式会社 | 画像補正処理回路及びこれを集積化して成る半導体装置 |
US8350933B2 (en) * | 2009-04-08 | 2013-01-08 | Yissum Research Development Company Of The Hebrew University Of Jerusalem, Ltd. | Method, apparatus and computer program product for single image de-hazing |
WO2010115228A1 (en) * | 2009-04-09 | 2010-10-14 | National Ict Australia Limited | Enhancing image data |
JP2010276691A (ja) * | 2009-05-26 | 2010-12-09 | Toshiba Corp | 画像処理装置および画像処理方法 |
JP4807439B2 (ja) * | 2009-06-15 | 2011-11-02 | 株式会社デンソー | 霧画像復元装置及び運転支援システム |
JP5402504B2 (ja) * | 2009-10-15 | 2014-01-29 | 株式会社Jvcケンウッド | 擬似立体画像作成装置及び擬似立体画像表示システム |
TWI423166B (zh) * | 2009-12-04 | 2014-01-11 | Huper Lab Co Ltd | 判斷輸入影像是否為霧化影像之方法、判斷輸入影像的霧級數之方法及霧化影像濾清方法 |
US8284998B2 (en) * | 2010-07-01 | 2012-10-09 | Arcsoft Hangzhou Co., Ltd. | Method of estimating depths from a single image displayed on display |
JP2012028987A (ja) * | 2010-07-22 | 2012-02-09 | Toshiba Corp | 画像処理装置 |
CN102637293B (zh) * | 2011-02-12 | 2015-02-25 | 株式会社日立制作所 | 运动图像处理装置及运动图像处理方法 |
US20120213436A1 (en) * | 2011-02-18 | 2012-08-23 | Hexagon Technology Center Gmbh | Fast Image Enhancement and Three-Dimensional Depth Calculation |
JP5488530B2 (ja) * | 2011-05-23 | 2014-05-14 | 富士ゼロックス株式会社 | 画像処理装置及び画像処理プログラム |
JP5810628B2 (ja) * | 2011-05-25 | 2015-11-11 | 富士ゼロックス株式会社 | 画像処理装置及び画像処理プログラム |
US8582915B2 (en) * | 2011-06-27 | 2013-11-12 | Wuxi Jinnang Technology Development Ltd. | Image enhancement for challenging lighting conditions |
KR101568971B1 (ko) * | 2011-08-03 | 2015-11-13 | 인디안 인스티튜트 오브 테크놀로지, 카라그푸르 | 화상 및 동영상에서 안개를 제거하는 방법 및 시스템 |
US8970691B2 (en) * | 2011-08-26 | 2015-03-03 | Microsoft Technology Licensing, Llc | Removal of rayleigh scattering from images |
CN103034977B (zh) * | 2011-09-30 | 2015-09-30 | 株式会社日立制作所 | 图像除雾方法和相应的图像除雾装置 |
CN103164845B (zh) * | 2011-12-16 | 2016-08-03 | 中国科学院沈阳自动化研究所 | 一种实时图像去雾装置及方法 |
CN103188433B (zh) * | 2011-12-30 | 2016-01-20 | 株式会社日立制作所 | 图像除雾装置和图像除雾方法 |
JP2013152334A (ja) * | 2012-01-25 | 2013-08-08 | Olympus Corp | 顕微鏡システムおよび顕微鏡観察方法 |
US20130237317A1 (en) * | 2012-03-12 | 2013-09-12 | Samsung Electronics Co., Ltd. | Method and apparatus for determining content type of video content |
JP5247910B1 (ja) * | 2012-03-30 | 2013-07-24 | Eizo株式会社 | 画像表示装置またはその方法 |
JP5470415B2 (ja) * | 2012-03-30 | 2014-04-16 | Eizo株式会社 | イプシロンフィルタの閾値決定方法およびローパスフィルタの係数決定方法 |
US8885962B1 (en) * | 2012-07-23 | 2014-11-11 | Lockheed Martin Corporation | Realtime long range imaging scatter reduction |
CN103632339A (zh) * | 2012-08-21 | 2014-03-12 | 张晓光 | 一种基于变分Retinex的单幅图像去雾方法及装置 |
CN202872972U (zh) * | 2012-08-24 | 2013-04-10 | 中国人民解放军理工大学气象学院 | 一种图像监测处理装置 |
US9659237B2 (en) * | 2012-10-05 | 2017-05-23 | Micro Usa, Inc. | Imaging through aerosol obscurants |
KR101958910B1 (ko) | 2012-10-26 | 2019-03-15 | 에스케이 텔레콤주식회사 | 영상보정의 가속화를 위한 영상보정 장치 및 그 방법 |
CN102982537B (zh) * | 2012-11-05 | 2015-09-02 | 安维思电子科技(广州)有限公司 | 一种检测场景变换的方法和系统 |
KR101736468B1 (ko) | 2012-12-24 | 2017-05-29 | 한화테크윈 주식회사 | 영상 처리 장치 및 방법 |
KR101445577B1 (ko) | 2013-03-11 | 2014-11-04 | 주식회사 브이아이티시스템 | 안개제거 추정 모델을 이용한 안개 낀 휘도영상 개선 시스템 |
JP2014212513A (ja) | 2013-04-01 | 2014-11-13 | パナソニック株式会社 | 投写型映像表示装置、映像投影制御装置、映像投影制御方法、及び映像投影制御プログラム |
CN103218622B (zh) * | 2013-04-22 | 2016-04-13 | 武汉大学 | 一种基于计算机视觉的雾霾监测方法 |
KR101470831B1 (ko) | 2013-05-28 | 2014-12-10 | 전남대학교산학협력단 | 사용자 제어가 가능한 거듭제곱근 연산자를 이용한 안개영상 개선 장치 |
CN103337054A (zh) * | 2013-06-17 | 2013-10-02 | 西安理工大学 | 基于单图像的二阶段图像去雾方法 |
JP2017502429A (ja) * | 2014-01-10 | 2017-01-19 | 富士通株式会社 | 画像処理装置、電子機器及び方法 |
US20170178297A1 (en) * | 2014-02-19 | 2017-06-22 | Yissum Research Development Company Of The Hebrew University Of Jerusalem Ltd. | Method and system for dehazing natural images using color-lines |
KR102207939B1 (ko) | 2014-03-27 | 2021-01-26 | 한화테크윈 주식회사 | 안개 제거 시스템 및 안개 제거 방법 |
JP6284408B2 (ja) | 2014-04-03 | 2018-02-28 | オリンパス株式会社 | 画像処理装置、撮像装置、判定方法、駆動方法、撮像方法およびプログラム |
US9305339B2 (en) * | 2014-07-01 | 2016-04-05 | Adobe Systems Incorporated | Multi-feature image haze removal |
US9177363B1 (en) | 2014-09-02 | 2015-11-03 | National Taipei University Of Technology | Method and image processing apparatus for image visibility restoration |
JP6469448B2 (ja) | 2015-01-06 | 2019-02-13 | オリンパス株式会社 | 画像処理装置、撮像装置、画像処理方法、および記録媒体 |
US9870511B2 (en) * | 2015-10-14 | 2018-01-16 | Here Global B.V. | Method and apparatus for providing image classification based on opacity |
US9508129B1 (en) | 2015-11-06 | 2016-11-29 | Adobe Systems Incorporated | Dehazing photos and videos using visual artifact suppression |
-
2014
- 2014-06-12 AU AU2014397095A patent/AU2014397095B2/en not_active Ceased
- 2014-06-12 WO PCT/JP2014/003131 patent/WO2015189874A1/ja active Application Filing
- 2014-06-12 CN CN201480079760.4A patent/CN106462947B/zh active Active
- 2014-06-12 ES ES14894614T patent/ES2727929T3/es active Active
- 2014-06-12 EP EP14894614.8A patent/EP3156968B1/en active Active
- 2014-06-12 JP JP2016527494A patent/JP6228670B2/ja active Active
- 2014-06-12 RU RU2017100018A patent/RU2658874C1/ru not_active IP Right Cessation
-
2015
- 2015-03-02 AU AU2015272846A patent/AU2015272846B2/en active Active
- 2015-03-02 CN CN201580031438.9A patent/CN106462953B/zh active Active
- 2015-03-02 ES ES15805972T patent/ES2712452T3/es active Active
- 2015-03-02 WO PCT/JP2015/056086 patent/WO2015190136A1/ja active Application Filing
- 2015-03-02 JP JP2016527661A patent/JP6225255B2/ja active Active
- 2015-03-02 RU RU2017100022A patent/RU2664415C2/ru active
- 2015-03-02 EP EP15805972.5A patent/EP3156969B1/en active Active
- 2015-04-27 RU RU2017100023A patent/RU2648955C1/ru not_active IP Right Cessation
- 2015-04-27 AU AU2015272798A patent/AU2015272798B2/en not_active Ceased
- 2015-04-27 JP JP2016527684A patent/JP6225256B2/ja active Active
- 2015-04-27 CN CN201580031509.5A patent/CN106462954B/zh active Active
- 2015-04-27 WO PCT/JP2015/062729 patent/WO2015190184A1/ja active Application Filing
- 2015-04-27 EP EP15806110.1A patent/EP3156970B1/en active Active
- 2015-04-27 WO PCT/JP2015/062728 patent/WO2015190183A1/ja active Application Filing
- 2015-04-27 CN CN201580031412.4A patent/CN106663326B/zh active Active
- 2015-04-27 ES ES15807220.7T patent/ES2681294T3/es active Active
- 2015-04-27 AU AU2015272799A patent/AU2015272799B2/en not_active Ceased
- 2015-04-27 EP EP15807220.7A patent/EP3156971B1/en active Active
- 2015-04-27 JP JP2016527685A patent/JP6228671B2/ja active Active
- 2015-04-27 RU RU2017100021A patent/RU2654159C1/ru not_active IP Right Cessation
-
2016
- 2016-12-07 US US15/371,228 patent/US10102614B2/en active Active
- 2016-12-07 US US15/371,230 patent/US10157451B2/en active Active
- 2016-12-08 US US15/372,402 patent/US10096092B2/en active Active
- 2016-12-08 US US15/372,400 patent/US9972074B2/en active Active
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2010074012A1 (ja) * | 2008-12-22 | 2010-07-01 | ローム株式会社 | 画像補正処理回路、半導体装置、画像補正処理装置 |
JP2013141210A (ja) * | 2011-12-30 | 2013-07-18 | Hitachi Ltd | 画像霧除去装置、画像霧除去方法及び画像処理システム |
JP2013156983A (ja) * | 2012-01-31 | 2013-08-15 | Hitachi Ltd | 画像霧除去装置、画像霧除去方法及び画像処理システム |
Non-Patent Citations (1)
Title |
---|
See also references of EP3156970A4 * |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2019165832A (ja) * | 2018-03-22 | 2019-10-03 | 上銀科技股▲分▼有限公司 | 画像処理方法 |
JP2022103003A (ja) * | 2020-12-25 | 2022-07-07 | 財團法人工業技術研究院 | 画像ヘイズ除去方法及びその方法を用いる画像ヘイズ除去装置 |
JP7178438B2 (ja) | 2020-12-25 | 2022-11-25 | 財團法人工業技術研究院 | 画像ヘイズ除去方法及びその方法を用いる画像ヘイズ除去装置 |
US11528435B2 (en) | 2020-12-25 | 2022-12-13 | Industrial Technology Research Institute | Image dehazing method and image dehazing apparatus using the same |
Also Published As
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6228671B2 (ja) | 画像処理システム及びプログラム | |
US10521885B2 (en) | Image processing device and image processing method | |
CN111915535B (zh) | 一种图像去噪的方法及装置 | |
EP3944603A1 (en) | Video denoising method and apparatus, and computer-readable storage medium | |
CN109982012B (zh) | 图像处理方法及装置、存储介质、终端 | |
CN110023957B (zh) | 用于估计图像中的投射阴影区域和/或加亮区域的方法和设备 | |
JP6161847B1 (ja) | 画像処理装置及び画像処理方法 | |
CN116977228B (zh) | 图像降噪方法、电子设备及存储介质 | |
KR101024731B1 (ko) | 디지털 이미지의 모스키토 노이즈를 감소시키기 위한 방법 및 시스템 | |
KR101067516B1 (ko) | 정규 상관도와 누적영상을 이용한 고속 그림자 제거 방법 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 15806110 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2016527685 Country of ref document: JP Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
ENP | Entry into the national phase |
Ref document number: 2017100023 Country of ref document: RU Kind code of ref document: A |
|
REEP | Request for entry into the european phase |
Ref document number: 2015806110 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2015806110 Country of ref document: EP |
|
ENP | Entry into the national phase |
Ref document number: 2015272799 Country of ref document: AU Date of ref document: 20150427 Kind code of ref document: A |