WO2022163423A1 - Image processing device, image processing method, and image processing program - Google Patents
Image processing device, image processing method, and image processing program
- Publication number
- WO2022163423A1 (PCT/JP2022/001512)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- light source
- image
- information
- spectral information
- image processing
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/90—Determination of colour characteristics
-
- G06T5/94—
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/20—Image preprocessing
- G06V10/25—Determination of region of interest [ROI] or a volume of interest [VOI]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
- G06V10/56—Extraction of image or video features relating to colour
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
- G06V10/60—Extraction of image or video features relating to illumination properties, e.g. using a reflectance or lighting model
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/80—Camera processing pipelines; Components thereof
- H04N23/84—Camera processing pipelines; Components thereof for processing colour signals
- H04N23/88—Camera processing pipelines; Components thereof for processing colour signals for colour balance, e.g. white-balance circuits or colour temperature control
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10024—Color image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10032—Satellite or aerial image; Remote sensing
- G06T2207/10036—Multispectral image; Hyperspectral image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V2201/00—Indexing scheme relating to image or video recognition or understanding
- G06V2201/07—Target detection
Definitions
- the present disclosure relates to an image processing device, an image processing method, and an image processing program.
- a method using a reflection model of an object has been proposed as a method for detecting the light source component from the reflected light from the object.
- This reflection model is called a dichroic reflection model, and is a model that assumes that reflected light from an object is composed of a specular reflection component and a diffuse reflection component.
- the specular reflection component reflects the color of the light source as it is, while the diffuse reflection component reflects light from the light source after it has been modified by the original color of the object.
- An illumination estimation apparatus has been proposed that detects the area of specular reflection from an image, estimates the distribution feature of how the image signal of the area spreads in the spectral space, and detects an illumination model based on the distribution feature (see Patent Document 1, for example).
- the lighting estimator can output spectral characteristics of lighting based on the detected lighting model.
- However, the conventional technology described above has the problem that the method of detecting the spectral characteristics of the light source (illumination) is complicated: detecting the spectral characteristics requires calculating distribution features, which in turn requires principal component analysis and matrix calculations on multi-dimensional spectral information.
- the present disclosure proposes an image processing device, an image processing method, and an image processing program for simply detecting the spectral distribution of a light source.
- An image processing apparatus has a specific area detection unit, a chromaticity information detection unit, and a light source spectral information generation unit.
- the specific area detection unit detects a specific area, which is the area of the image of the specific object, from the image of the subject illuminated by the light from the light source.
- the chromaticity information detection unit detects light source color information based on the plurality of image signals of the detected specific region.
- the light source spectral information generation unit generates light source spectral information, which is spectral information of the light source, based on the detected light source color information.
- FIG. 1 is a diagram illustrating a configuration example of an image processing device according to an embodiment of the present disclosure.
- FIG. 2 is a diagram explaining a specular reflection component and a diffuse reflection component.
- FIG. 3 is a diagram showing an example of the specific area in the present disclosure.
- FIG. 4 is a diagram showing an example of spectral sensitivity characteristics of an imaging device used for detection of a specific region in the present disclosure.
- FIG. 5 is a diagram illustrating an example of detection of light source chromaticity information according to the first embodiment of the present disclosure.
- FIG. 6 is a diagram showing an example of light source spectral information according to the first embodiment of the present disclosure.
- Three further diagrams illustrate examples of removing the influence of a light source according to the first embodiment of the present disclosure.
- A diagram illustrates an example of image processing according to the first embodiment of the present disclosure.
- A diagram illustrates an example of specific area detection processing according to the first embodiment of the present disclosure.
- A diagram illustrates an example of light source color information detection processing according to the first embodiment of the present disclosure.
- A diagram illustrates an example of light source spectral information generation processing according to the first embodiment of the present disclosure.
- FIG. 12 is a diagram illustrating an example of detection of light source color information according to the second embodiment of the present disclosure.
- A diagram illustrates a configuration example of an imaging device to which the technology according to the present disclosure may be applied.
- FIG. 14 is a diagram showing a configuration example of an image sensor according to an embodiment of the present disclosure.
- FIG. 15 is a diagram showing a configuration example of a pixel according to an embodiment of the present disclosure.
- FIG. 1 is a diagram illustrating a configuration example of an image processing device according to an embodiment of the present disclosure. This figure is a block diagram showing a configuration example of the image processing apparatus 200 .
- the image processing device 200 is a device that generates and outputs an image from an image signal output from an imaging device or the like. In addition, the image processing apparatus 200 further performs processing for removing the influence of the light source from the generated image.
- the image generator 210 generates an image from the input image signal.
- the image generating section 210 generates an image for one screen from an image signal generated by an imaging device or the like, and outputs it as image data.
- the imaging device includes a pixel array section (pixel array section 10) composed of a plurality of pixels (pixels 100), and generates an image signal for each pixel during imaging. After that, the imaging element sequentially outputs the generated image signals in the order of pixel arrangement, for example.
- the image generator 210 generates an image from the image signal for one screen that is sequentially output. This image is called a RAW image.
- the image generator 210 outputs the generated image to the spectral image generator 220 and the specific area detector 230 .
- the spectral image generation unit 220 generates spectral images from RAW images.
- a spectral image can be generated by performing reconstruction processing on an input image. This reconstruction processing is processing for generating each wavelength image from the input image signal.
- the pixels of the imaging device described above output image signals corresponding to light in a specific wavelength band among light from a subject.
- a spectral image for each wavelength can be generated from the image signal corresponding to each wavelength band. Note that the pixels of the imaging element used in the present disclosure generate a spectral image from image signals corresponding to light in eight wavelength bands, for example, as described later.
- the spectral image generation section 220 outputs the image that has undergone the reconstruction processing to the image adjustment section 260 and the chromaticity information detection section 240 .
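The reconstruction step described above can be illustrated with a minimal sketch. The 2×4 mosaic layout and the band ordering below are assumptions for illustration only (the actual filter arrangement of the imaging element is not specified in this text); the sketch simply gathers, for each of the eight wavelength bands, the pixels that carry that band:

```python
import numpy as np

# Hypothetical sketch: the sensor is assumed to use a repeating 2x4 mosaic
# of 8 band filters; "reconstruction" is simplified here to gathering each
# band's pixels into its own (subsampled) wavelength image.
N_BANDS = 8
PATTERN = np.arange(N_BANDS).reshape(2, 4)  # band index at each mosaic cell

def reconstruct_spectral_images(raw: np.ndarray) -> list:
    """Split a RAW mosaic image into one subsampled image per band."""
    band_images = []
    for band in range(N_BANDS):
        rows, cols = np.where(PATTERN == band)
        r, c = rows[0], cols[0]
        band_images.append(raw[r::2, c::4])  # every pixel carrying this band
    return band_images

raw = np.random.rand(8, 16)
bands = reconstruct_spectral_images(raw)
print(len(bands), bands[0].shape)  # 8 images, each 4x4
```

An actual implementation would more likely interpolate the missing pixels of each band to full resolution rather than subsample, but the per-band separation is the same.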
- Chromaticity information detection section 240 generates chromaticity information using the specific area output from specific area detection section 230 and the image output from spectral image generation section 220 . That is, the chromaticity information detection section 240 detects the light source color information using the image signal of the area corresponding to the specific area in the image output from the spectral image generation section 220 . Chromaticity information detection section 240 outputs the detected light source color information to light source spectral information generation section 250 .
- the light source spectral information generation unit 250 generates light source spectral information based on the light source color information output by the chromaticity information detection unit 240 .
- This light source spectral information is spectral information of the light source, and for example, the spectral distribution of the light source can be applied.
- the spectral distribution of the light source can be represented, for example, by the ratio of light from the light source for each wavelength band.
- For existing light sources it is possible to measure the spectral distribution and generate light source spectral information.
- the light source spectral information generation unit 250 can adopt a configuration that holds light source spectral information of a plurality of existing light sources.
- the image adjusting section 260 adjusts the image output by the spectral image generating section 220 based on the light source spectral information output by the light source spectral information generating section 250 . This adjustment can remove the effect of the light source on the image.
- the image adjuster 260 can adjust the image signals forming the image for each wavelength band. For example, the image adjustment unit 260 can perform adjustment by dividing the image signal for each wavelength band by the spectrum of the same wavelength band. As a result, the level of the image signal forming the image can be adjusted according to the spectrum of the light source, and the influence of the light source can be removed. The details of image adjustment by the image adjustment unit 260 will be described later.
- the image adjuster 260 outputs the adjusted image. This output image corresponds to the output image of the image processing apparatus 200 .
- FIG. 2 is a diagram for explaining specular reflection components and diffuse reflection components. This figure shows an example in which an object 301 is irradiated with light 305 from a light source 302 . When the object 301 is irradiated with the light 305, reflected light is generated. This reflected light has a specular reflection component 307 and a diffuse reflection component 306 .
- a specular reflection component 307 is a component in which the light 305 from the light source 302 is reflected at the interface between the object 301 and the medium of the light 305, and is a component in which the color of the light 305 from the light source 302 is reflected as it is.
- I_specular represents the specular reflection component in vector notation.
- I_diffuse represents the diffuse reflection component in vector notation.
- α and β represent the coefficients of the respective components.
- L represents the light source vector.
- ρ represents the diffuse reflectance vector.
- α and β change depending on geometric relationships such as the position of the light source 302, the orientation of the surface of the object 301, and the direction of observation.
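The dichroic model equation itself is not reproduced in this text; one standard formulation consistent with the variable definitions above (a reconstruction, not quoted from the source) is:

```latex
I = \alpha\, I_{\mathrm{specular}} + \beta\, I_{\mathrm{diffuse}}
  = \alpha\, L + \beta\, (\rho \odot L)
```

where ⊙ denotes per-wavelength multiplication of the diffuse reflectance by the light source spectrum, so the specular term carries the light source color unchanged while the diffuse term is shaped by the object color.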
- FIG. 3 is a diagram showing an example of a specific area in the present disclosure.
- This figure is a diagram showing an example of the specific area detected by the specific area detection unit 230.
- the specific region will be described by taking the case of applying a plant to a specific object as an example.
- An image 310 in the figure is an image of soil 311 in which a plant 312 is planted.
- a region including the plant 312 in the image 310 is the specific region.
- a region 320 in the figure corresponds to the specific region.
- the specific area detection unit 230 detects and outputs this area 320 as a specific area.
- pixels having spectral characteristics at wavelengths of 600 nm and 750 nm are pixels having spectral sensitivity characteristics of graphs 334 and 336, respectively.
- the feature quantity can be calculated by the following equation.
- a specific region can be detected by comparing the calculated feature quantity with a threshold value.
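The feature-quantity equation itself is not reproduced here; the sketch below uses an NDVI-style normalized difference between the 750 nm and 600 nm bands as a plausible stand-in, since vegetation reflects strongly at 750 nm and weakly at 600 nm. The function names and the threshold value are hypothetical:

```python
import numpy as np

def vegetation_feature(i_600: np.ndarray, i_750: np.ndarray) -> np.ndarray:
    """NDVI-style contrast between the 750 nm and 600 nm band signals."""
    return (i_750 - i_600) / (i_750 + i_600 + 1e-12)  # avoid divide-by-zero

def detect_specific_region(i_600, i_750, threshold=0.3):
    """Boolean mask of pixels whose feature quantity exceeds the threshold."""
    return vegetation_feature(i_600, i_750) > threshold

i_600 = np.array([[0.40, 0.10]])  # bright at 600 nm -> soil-like pixel
i_750 = np.array([[0.45, 0.60]])  # strong 750 nm reflectance -> plant-like
print(detect_specific_region(i_600, i_750))  # [[False  True]]
```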
- the blackbody locus represents the change in the color of the blackbody with respect to the change in the temperature of the blackbody.
- Curve 343 in the figure represents the black body locus.
- An intersection point 344 between the curve 343 and the straight line 342 can be detected as the color temperature.
- FIG. 6 is a diagram showing an example of light source spectral information according to the first embodiment of the present disclosure.
- This figure is a diagram showing an example of the light source spectral information held by the chromaticity information detection unit 240, and is a diagram showing the arrangement of the light source spectral information.
- "Light source" in the figure is the number that identifies each set of light source spectral information.
- Color temperature is the color temperature [K] of the light source.
- the light source spectral information is represented by spectral light for each wavelength.
- “Wavelength” in the figure represents the central wavelength [nm] of each spectrum.
- wavelengths (61 wavelengths) in increments of 10 nm from 350 nm to 950 nm can be applied.
- the light source spectral information in the figure represents an example of holding spectral information for each of these 61 wavelengths.
- the chromaticity information detection unit 240 can hold an array of light source spectral information corresponding to such a plurality of color temperatures.
- the light source spectral information in this array can be generated by pre-measuring the color temperature and spectral information of an existing light source.
- the chromaticity information detection unit 240 can use the array shown in the figure as a lookup table for inputting the detected color temperature and outputting the light source spectral information.
- the chromaticity information detection unit 240 can select, from the array in FIG. 6, the one light source whose color temperature is closest to the detected color temperature. Next, the chromaticity information detection unit 240 can output the spectral information of the selected light source as the generated light source spectral information.
- the chromaticity information detection unit 240 selects two pieces of light source spectral information having a color temperature close to the detected color temperature, and mixes the selected two pieces of light source spectral information to generate light source spectral information.
- Mixing of light source spectral information will be described assuming that two light sources, a and b, with color temperatures T_a and T_b and light source spectral information LUT_a(λ) and LUT_b(λ), have been selected. Mixing can be performed, for example, by the following formula.
- L_interp(λ) represents the light source spectral information after mixing.
- T represents the detected color temperature.
- T_a, T_b, LUT_a(λ), and LUT_b(λ) represent the color temperature and light source spectral information of the selected light sources a and b, respectively. In this way, the light source spectral information can be mixed in a ratio corresponding to the difference from the detected color temperature.
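The mixing formula is not reproduced in this text; a sketch consistent with the description (linear interpolation weighted by the distance of the detected color temperature T from T_a and T_b) might look like the following. The LUT contents are illustrative only:

```python
import numpy as np

# Hypothetical LUT: color temperature [K] -> spectral distribution samples.
LUT = {
    3000.0: np.array([0.2, 0.5, 1.0, 0.8]),
    5000.0: np.array([0.6, 0.9, 1.0, 0.7]),
    6500.0: np.array([1.0, 1.0, 0.9, 0.6]),
}

def interpolate_spectrum(t: float) -> np.ndarray:
    """Mix the two LUT entries bracketing color temperature t, weighted
    by the distance of t from each entry (linear interpolation)."""
    temps = sorted(LUT)
    if t <= temps[0]:
        return LUT[temps[0]]
    if t >= temps[-1]:
        return LUT[temps[-1]]
    t_a = max(x for x in temps if x <= t)
    t_b = min(x for x in temps if x > t)
    w = (t - t_a) / (t_b - t_a)
    return (1.0 - w) * LUT[t_a] + w * LUT[t_b]

print(interpolate_spectrum(4000.0))  # halfway between the 3000 K and 5000 K entries
```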
- I_wb(x, y, λ) represents the image signal after adjustment.
- I(x, y, λ) represents an input image signal.
- LUT(λ) represents light source spectral information.
- the calculation of I(x, y, λ)/LUT(λ) in the first half of equation (5) corresponds to image adjustment.
- the latter half of the calculation in equation (5) is a calculation part for compensating for changes in brightness due to image adjustment.
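Equation (5) itself is not reproduced here; a sketch of the adjustment it describes, dividing each wavelength band by the light source spectrum and then compensating brightness, might look like this. The choice of the mean of the spectrum as the compensation factor is an assumption:

```python
import numpy as np

def adjust_image(i_img: np.ndarray, lut: np.ndarray) -> np.ndarray:
    """Divide each wavelength band of an (H, W, bands) image by the light
    source spectrum, then rescale so overall brightness is preserved.
    The compensation term (mean of the spectrum) is an assumption; the
    patent's equation (5) is not reproduced in this text."""
    wb = i_img / lut[None, None, :]  # per-band light source removal
    return wb * lut.mean()           # brightness compensation

img = np.ones((2, 2, 4))                  # flat test image, 4 bands
lut = np.array([0.5, 1.0, 1.0, 1.5])      # illustrative light source spectrum
out = adjust_image(img, lut)
print(out[0, 0])  # bands weak in the light source are boosted
```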
- the specific area detection unit 230 compares the feature amount of the selected area with a threshold (step S116). If the feature amount is not greater than the threshold (step S116, No), the specific area detection unit 230 repeats the process from step S115. On the other hand, if the feature amount is larger than the threshold (step S116, Yes), the specific area detection unit 230 proceeds to the process of step S117.
- step S117 the specific area detection unit 230 outputs the selected area as the specific area (step S117). Through the above procedure, the specific area detection unit 230 can detect and output the specific area.
- the chromaticity information detection unit 240 calculates an approximate straight line in the chromaticity space based on the plurality of image signals (step S123). Next, the chromaticity information detection unit 240 detects the intersection of the approximate straight line and the black body locus (step S124). Next, the chromaticity information detection unit 240 detects and outputs the color temperature or chromaticity based on the detected intersection (step S125).
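Steps S123 to S125 can be sketched as follows. The blackbody-locus table values are illustrative only, and representing the locus as piecewise-linear segments is a simplification of the actual curve geometry:

```python
import numpy as np

# Coarse, hypothetical blackbody locus: (color temperature [K], x, y).
LOCUS = [
    (3000.0, 0.44, 0.40),
    (5000.0, 0.35, 0.35),
    (6500.0, 0.31, 0.32),
]

def fit_line(xs, ys):
    """S123: least-squares line y = a*x + b through the chromaticity samples."""
    a, b = np.polyfit(xs, ys, 1)
    return a, b

def color_temperature(xs, ys):
    """S124-S125: intersect the fitted line with the locus segments and
    return the interpolated color temperature at the crossing."""
    a, b = fit_line(xs, ys)
    for (t0, x0, y0), (t1, x1, y1) in zip(LOCUS, LOCUS[1:]):
        d0 = y0 - (a * x0 + b)  # signed offset of each endpoint from the line
        d1 = y1 - (a * x1 + b)
        if d0 * d1 <= 0:  # endpoints on opposite sides -> intersection
            w = abs(d0) / (abs(d0) + abs(d1) + 1e-12)
            return t0 + w * (t1 - t0)
    return None

print(color_temperature([0.30, 0.40], [0.375, 0.375]))
```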
- the light source spectral information generation unit 250 searches the array of light source spectral information based on the color temperature input as the light source color information, and can select the two entries of light source spectral information whose color temperatures bracket the input color temperature.
- the image processing apparatus 200 generates light source spectral information based on light source color information detected from a specific region from an array of light source spectral information prepared in advance. This simplifies the generation of light source spectral information for removing the influence of the light source on the image.
- the image processing apparatus 200 of the first embodiment described above detected the color temperature, in the chromaticity information detection unit 240, from the intersection of the blackbody locus and the approximate straight line generated based on the distribution of the image signals in the chromaticity space.
- an image processing apparatus 200 according to the second embodiment of the present disclosure differs from the above-described first embodiment in that the color temperature is detected without using the blackbody locus.
- FIG. 12 is a diagram illustrating an example of detection of light source color information according to the second embodiment of the present disclosure. Similar to FIG. 5, this figure shows an example of detection of light source color information by the chromaticity information detection unit 240.
- the method of detecting the light source color information in this figure differs from the method of FIG. 5 in that the curve of the blackbody locus is not used.
- a straight line 342 is generated based on the distribution of image signals 341 in this figure as well.
- a light source having a chromaticity closest to the straight line 342 in the xy chromaticity diagram is detected. That is, the color temperature of the light source with the shortest straight-line distance d to the straight line 342 is output as the detected color temperature (chromaticity information).
- the image processing apparatus 200 can detect light source color information without using the blackbody locus.
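A minimal sketch of this nearest-light-source selection, assuming the approximate straight line has already been fitted as y = a·x + b; the candidate chromaticity points below are illustrative, not taken from the source:

```python
import numpy as np

# Known light sources: (color temperature [K], x, y) -- illustrative values.
LIGHT_SOURCES = [
    (3000.0, 0.44, 0.40),
    (5000.0, 0.35, 0.35),
    (6500.0, 0.31, 0.32),
]

def nearest_color_temperature(a: float, b: float) -> float:
    """Given the fitted line y = a*x + b, return the color temperature of
    the light source whose chromaticity point has the smallest
    perpendicular distance d to the line."""
    def dist(x, y):
        return abs(a * x - y + b) / np.hypot(a, 1.0)
    return min(LIGHT_SOURCES, key=lambda s: dist(s[1], s[2]))[0]

print(nearest_color_temperature(0.0, 0.351))  # closest to the 5000 K point
```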
- the technology according to the present disclosure can be applied to various products.
- the technology according to the present disclosure can be applied to imaging devices such as cameras.
- An imaging device 1001 is a device that takes an image of a subject.
- a plurality of pixels each having a photoelectric conversion unit for performing photoelectric conversion of light from an object are arranged on the light receiving surface of the image sensor 1001 . These pixels each generate an image signal based on charges generated by photoelectric conversion.
- the image sensor 1001 converts image signals generated by pixels into digital image signals and outputs the digital image signals to the image processing unit 1003 .
- An image signal for one screen is called a frame.
- the imaging device 1001 can also output an image signal on a frame-by-frame basis.
- the control unit 1002 controls the image pickup device 1001 and the image processing unit 1003 .
- the control unit 1002 can be configured by an electronic circuit using a microcomputer or the like, for example.
- the image processing unit 1003 processes the image signal from the imaging device 1001 .
- the image processing unit 1003 can be configured by, for example, an electronic circuit using a microcomputer or the like.
- the display unit 1004 displays an image based on the image signal processed by the image processing unit 1003 .
- the display unit 1004 can be configured by, for example, a liquid crystal monitor.
- the recording unit 1005 records an image (frame) based on the image signal processed by the image processing unit 1003 .
- the recording unit 1005 can be composed of, for example, a hard disk or a semiconductor memory.
- the imaging device to which the present disclosure can be applied has been described above.
- the present technology can be applied to the image processing unit 1003 among the components described above.
- the image processing apparatus 200 described with reference to FIG. 1 can be applied to the image processing unit 1003 .
- FIG. 14 is a diagram illustrating a configuration example of an imaging element according to an embodiment of the present disclosure; This figure is a block diagram showing a configuration example of the imaging device 1001 .
- An imaging device 1001 is a semiconductor device that generates image data of a subject.
- the imaging device 1001 includes a pixel array section 10 , a vertical driving section 20 , a column signal processing section 30 and a control section 40 .
- the pixel array section 10 is configured by arranging a plurality of pixels 100 .
- a pixel array section 10 in the figure represents an example in which a plurality of pixels 100 are arranged in a two-dimensional matrix.
- the pixel 100 includes a photoelectric conversion unit that photoelectrically converts incident light, and generates an image signal of a subject based on the irradiated incident light.
- a photodiode for example, can be used for this photoelectric conversion unit.
- Signal lines 11 and 12 are wired to each pixel 100 .
- the pixels 100 generate image signals under the control of control signals transmitted by the signal lines 11 and output the generated image signals via the signal lines 12 .
- the signal line 11 is arranged for each row in a two-dimensional matrix and is commonly wired to the plurality of pixels 100 arranged in one row.
- the signal line 12 is arranged for each column in the shape of a two-dimensional matrix and is commonly wired to a plurality of pixels 100 arranged in one column.
- the vertical driving section 20 generates control signals for the pixels 100 described above.
- the vertical drive unit 20 in the figure generates a control signal for each row of the two-dimensional matrix of the pixel array unit 10 and sequentially outputs the control signals via the signal line 11.
- the column signal processing unit 30 processes image signals generated by the pixels 100 .
- a column signal processing unit 30 shown in the figure simultaneously processes image signals from a plurality of pixels 100 arranged in one row of the pixel array unit 10 and transmitted through the signal line 12 .
- As this processing, for example, analog-to-digital conversion for converting the analog image signals generated by the pixels 100 into digital image signals, and correlated double sampling (CDS) for removing offset errors from the image signals, can be performed.
- the processed image signal is output to a circuit or the like outside the imaging element 1001 .
- the control unit 40 controls the vertical driving unit 20 and the column signal processing unit 30.
- a control unit 40 shown in the figure outputs control signals through signal lines 41 and 42 to control the vertical driving unit 20 and the column signal processing unit 30 .
- FIG. 15 is a diagram illustrating a configuration example of a pixel according to an embodiment of the present disclosure; This figure is a diagram showing a simplified cross-sectional configuration of the pixel 100 , and is a diagram schematically showing a part of the pixel 100 .
- the pixel 100 includes a semiconductor substrate 120 , a wiring area 130 , a protective film 140 , a color filter 150 and an on-chip lens 180 .
- the semiconductor substrate 120 is a semiconductor substrate on which elements of the pixel 100 are formed.
- the semiconductor substrate 120 can be made of silicon (Si).
- In the figure, a photoelectric conversion unit 101 among the elements forming the pixel 100 is shown.
- the semiconductor substrate 120 in the figure constitutes a p-type well region.
- Outlined rectangles in the semiconductor substrate 120 represent n-type semiconductor regions.
- an n-type semiconductor region 121 is shown.
- the photoelectric conversion unit 101 is composed of an n-type semiconductor region 121 .
- the photoelectric conversion unit 101 corresponds to a photodiode composed of a pn junction at the interface between the n-type semiconductor region 121 and the surrounding p-type well region. Electrons among charges generated by photoelectric conversion of the photoelectric conversion unit 101 during the exposure period are accumulated in the n-type semiconductor region 121 . After the exposure period has elapsed, an image signal is generated and output based on the charges accumulated in the n-type semiconductor region 121 .
- the wiring region 130 is a region in which wiring for transmitting signals to elements is arranged.
- This wiring region 130 includes a wiring 132 and an insulating layer 131 .
- the wiring 132 transmits the signal of the element.
- This wiring 132 can be made of a metal such as copper (Cu) or tungsten (W).
- the insulating layer 131 insulates the wiring 132 and the like.
- This insulating layer 131 can be composed of an insulating material such as silicon oxide (SiO 2 ).
- the protective film 140 is a film that protects the back surface of the semiconductor substrate 120 .
- This protective film 140 can be made of, for example, SiO 2 .
- the color filter 150 is an optical filter that transmits light of a predetermined wavelength out of incident light.
- the on-chip lens 180 is a lens that collects incident light onto the photoelectric conversion section 101 .
- the on-chip lens 180 shown in the figure is an example configured in a hemispherical shape.
- the on-chip lens 180 can be made of an inorganic material such as silicon nitride (SiN) or an organic material such as acrylic resin.
- the image processing apparatus 200 has a specific area detection section 230 , a chromaticity information detection section 240 and a light source spectral information generation section 250 .
- the specific area detection unit 230 detects a specific area, which is the area of the image of the specific object, from the image of the subject irradiated with the light from the light source.
- Chromaticity information detection section 240 detects light source color information based on a plurality of image signals of the detected specific region.
- the light source spectral information generation unit 250 generates light source spectral information, which is spectral information of the light source, based on the detected light source color information. Thereby, the spectrum of the light source can be detected from the light source color information estimated from the specific area.
- the light source spectral information may be information representing the relationship between the wavelength and the spectrum. Thereby, the spectrum for each wavelength of the light source can be detected.
- the light source spectral information generation unit 250 may generate the light source spectral information by mixing the respective spectra for each wavelength in a plurality of pieces of light source spectral information based on the detected light source color information. This simplifies generation of light source spectral information.
- the light source spectral information generation unit 250 may generate light source spectral information by selecting from a plurality of held light source spectral information. This simplifies generation of light source spectral information.
- the chromaticity information detection unit 240 may detect an approximate straight line generated based on a plurality of image signals in the chromaticity space as the light source color information. This simplifies the detection of the light source color information.
- the chromaticity information detection unit 240 may detect the color temperature detected based on the approximate straight line as the light source color information. This simplifies the detection of the light source color information.
- the chromaticity information detection unit 240 may detect the color in the chromaticity space at the intersection of the approximate straight line and the black body locus as the color temperature. This simplifies color temperature detection.
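As a rough sketch of this intersection-based color temperature detection, one can hold a small table of blackbody-locus chromaticities and pick the entry nearest the approximate straight line. The table values below are approximate published blackbody chromaticities and the nearest-sample search is an illustrative simplification, not the method prescribed by the disclosure:

```python
import numpy as np

# Approximate blackbody-locus xy chromaticities (illustrative sample
# values, not taken from the disclosure).
PLANCKIAN_SAMPLES = [
    (2000.0, 0.527, 0.413),
    (3000.0, 0.437, 0.404),
    (4000.0, 0.380, 0.377),
    (5000.0, 0.345, 0.352),
    (6500.0, 0.313, 0.324),
    (10000.0, 0.281, 0.288),
]

def color_temperature_from_line(point, direction):
    """Return the color temperature of the locus sample closest to the
    approximate straight line given as point + t * direction.

    A dense locus table or an analytic locus model would refine the
    intersection; the sparse table above keeps the sketch short.
    """
    d = np.asarray(direction, dtype=float)
    d = d / np.linalg.norm(d)
    p0 = np.asarray(point, dtype=float)
    best_cct, best_dist = None, float("inf")
    for cct, x, y in PLANCKIAN_SAMPLES:
        v = np.array([x, y]) - p0
        dist = abs(d[0] * v[1] - d[1] * v[0])  # perpendicular distance to line
        if dist < best_dist:
            best_cct, best_dist = cct, dist
    return best_cct
```

A line passing through a locus sample returns that sample's temperature directly; in practice the approximate line would come from the chromaticity information detection unit.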
- the specific area detection unit 230 may detect the specific area based on the relationship between the wavelength and the reflectance of the specific object.
- the specific area detection unit 230 may detect a specific area using a plant as a specific object.
- The image processing device may further include an image adjustment unit that adjusts the image of the subject based on the generated light source spectral information. Thereby, the influence of the detected light source can be removed.
- the image processing method includes detecting a specific area, which is an image area of a specific object, from an image of a subject irradiated with light from a light source; detecting light source color information based on a plurality of image signals of the detected specific area; and generating light source spectral information, which is spectral information of the light source, based on the detected light source color information.
- the spectrum of the light source can be detected from the light source color information estimated from the specific area.
- the image processing program causes a computer to detect a specific region, which is an image region of a specific object, from the image of a subject illuminated by light from a light source, to detect light source color information based on a plurality of image signals of the detected specific region, and to generate light source spectral information, which is spectral information of the light source, based on the detected light source color information.
- the spectrum of the light source can be detected from the light source color information estimated from the specific area.
- the processing procedures described in the above embodiments may be regarded as a method having a series of procedures, and may also be regarded as a program for causing a computer to execute the series of procedures, or as a recording medium storing the program.
- As this recording medium, for example, a CD (Compact Disc), MD (MiniDisc), DVD (Digital Versatile Disc), memory card, Blu-ray (registered trademark) Disc, or the like can be used.
- the present technology can also take the following configuration.
- a specific area detection unit that detects a specific area, which is the area of the image of the specific object, from the image of the subject irradiated with the light from the light source; a chromaticity information detection unit that detects light source color information based on a plurality of image signals of the detected specific region; and a light source spectral information generation unit that generates light source spectral information, which is spectral information of the light source, based on the detected light source color information.
- the light source spectral information is information representing a relationship between a wavelength and a spectrum.
- (6) The image processing device according to (5), wherein the chromaticity information detection unit detects a color temperature detected based on the approximate straight line as the light source color information.
(7) The image processing device according to (6), wherein the chromaticity information detection unit detects, as the color temperature, a color in a chromaticity space at an intersection of the approximate straight line and a blackbody locus.
(8) The image processing device according to any one of (1) to (7), wherein the specific area detection unit detects the specific area based on the relationship between the wavelength and the reflectance of the specific object.
(9) The image processing device according to any one of (1) to (8), wherein the specific area detection unit detects the specific area using a plant as the specific object.
- Detecting a specific area which is an image area of a specific object, from an image of a subject irradiated with light from a light source; detecting light source color information based on a plurality of image signals of the detected specific region; An image processing program for generating light source spectral information, which is spectral information of the light source, based on the detected light source color information.
Abstract
Description
1. First embodiment
2. Second embodiment
3. Configuration of the imaging device
4. Configuration of the imaging element
[Configuration of the image processing device]
FIG. 1 is a diagram illustrating a configuration example of an image processing device according to an embodiment of the present disclosure. The figure is a block diagram showing a configuration example of the image processing device 200. The image processing device 200 is a device that generates and outputs an image from image signals output from an imaging element or the like. The image processing device 200 further performs processing for removing the influence of the light source from the generated image. The image processing device 200 in the figure includes an image generation unit 210, a spectral image generation unit 220, a specific area detection unit 230, a chromaticity information detection unit 240, a light source spectral information generation unit 250, and an image adjustment unit 260.
FIG. 2 is a diagram explaining the specular reflection component and the diffuse reflection component. The figure shows an example in which an object 301 is irradiated with light 305 from a light source 302. When the object 301 is irradiated with the light 305, reflected light is produced. This reflected light contains a specular reflection component 307 and a diffuse reflection component 306. The specular reflection component 307 is the component of the light 305 from the light source 302 reflected at the interface between the object 301 and the medium of the light 305, and retains the color of the light 305 from the light source 302 as-is. In contrast, the diffuse reflection component 306 is the component of the light 305 from the light source 302 reflected inside the object 301, in which the light 305 from the light source 302 is changed by the intrinsic color of the object 301.
I = Ispecular + Idiffuse = α × L + β × L × ρ ... (1)
Here, I represents the observed value in vector notation. Ispecular represents the specular reflection component in vector notation. Idiffuse represents the diffuse reflection component in vector notation. α and β represent the coefficients of the respective components. L represents the light source vector. ρ represents the diffuse reflection vector. α and β each vary depending on geometric relationships such as the position of the light source 302, the orientation of the surface of the object 301, and the observation direction.
I(x, y) = α × L(x, y) + β × D(x, y) ... (2)
Here, L(x, y) represents the chromaticity of the light source (the specular reflection component), and D(x, y) represents the chromaticity of the diffuse reflection component. As expressed in equation (2), the observed value I(x, y) is a mixture of L(x, y) and D(x, y) weighted by the coefficients α and β. In the chromaticity space, such a mixture lies on the straight line connecting the two chromaticities on the chromaticity diagram. The observed values I(x, y) within the region are therefore also distributed along a straight line.
FIG. 3 is a diagram showing an example of a specific area in the present disclosure. The figure shows an example of the specific area detected by the specific area detection unit 230. The specific area is described taking as an example the case where a plant is used as the specific object. The image 310 in the figure is an image of soil 311 in which plants 312 are planted. The region of this image 310 containing the plants 312 is the specific area. The region 320 in the figure corresponds to the specific area. The specific area detection unit 230 detects and outputs this region 320 as the specific area.
FIG. 4 is a diagram showing an example of the spectral sensitivity characteristics of an imaging element used for detecting the specific area in the present disclosure. The figure shows the spectral sensitivity characteristics of an imaging element having spectral sensitivities in different wavelength bands, for example an imaging element capable of separating light into eight different wavelength bands. The imaging element used is configured, for example, with pixels corresponding to each of the eight wavelength bands, and generates an image signal for each wavelength band. The horizontal axis of the figure represents wavelength, and the vertical axis represents sensitivity. Graphs 331-338 in the figure represent the spectral sensitivity characteristics of the respective pixels.
|IRch - IIRch| / (IRch + IIRch) ... (3)
The specific area can be detected by comparing this calculated feature quantity with a threshold.
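A minimal sketch of computing the feature quantity of equation (3) and comparing it with a threshold might look as follows; the function name, the threshold value, and the zero-division guard are illustrative assumptions, not values taken from the disclosure:

```python
import numpy as np

def detect_specific_area(ir_ch: np.ndarray, iir_ch: np.ndarray,
                         threshold: float = 0.3) -> np.ndarray:
    """Compute the per-pixel feature quantity of equation (3) and
    threshold it to obtain a specific-area mask.

    ir_ch and iir_ch are per-pixel band images of the same shape;
    the threshold is an illustrative value.
    """
    eps = 1e-12  # guard against division by zero in dark pixels
    feature = np.abs(ir_ch - iir_ch) / (ir_ch + iir_ch + eps)
    return feature > threshold
```

In the flow of FIG. 9, such a mask would be produced after smoothing and normalization, then combined with the region division of step S114.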
FIG. 5 is a diagram showing an example of detection of light source color information according to the first embodiment of the present disclosure. The figure shows an example of detection of light source color information in the chromaticity information detection unit 240, representing the relationship between the image signals of the specific area and the chromaticity information on the xy chromaticity diagram. In the figure, the horizontal and vertical axes represent the x and y coordinates, respectively. The hatched circles in the figure represent chromaticities generated from the image signals. As described above, the chromaticities generated from the observed image signals are distributed along a straight line extended in a specific direction. This straight line can represent the distribution of the image signals. The straight line 342 in the figure represents an approximate straight line generated based on the distribution of the image signals. The approximate straight line can be generated by regression analysis or principal component analysis.
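The principal component analysis mentioned above can be sketched as follows; the SVD-based formulation is one standard way of taking the first principal component and is an illustrative choice, not the disclosure's prescribed implementation:

```python
import numpy as np

def fit_approximate_line(xy: np.ndarray):
    """Fit an approximate straight line to (x, y) chromaticity samples
    by principal component analysis.

    xy: array of shape (N, 2). Returns (mean_point, direction), where
    the line is mean_point + t * direction; direction is the first
    principal component (largest-variance direction) of the samples.
    """
    mean = xy.mean(axis=0)
    centered = xy - mean
    # Rows of vt are the principal directions; vt[0] has largest variance.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return mean, vt[0]
```

Equivalently, an ordinary regression of y on x could be used, as the text notes; PCA avoids the asymmetry between the two axes.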
FIG. 6 is a diagram showing an example of light source spectral information according to the first embodiment of the present disclosure. The figure shows an example of the light source spectral information held by the chromaticity information detection unit 240, represented as an array of light source spectral information. "Light source" in the figure is the number of the light source identifying the light source spectral information. "Color temperature" is the color temperature [K] of the light source. The light source spectral information is represented by a spectral value for each wavelength. "Wavelength" in the figure represents the center wavelength [nm] of each spectral value. For example, wavelengths from 350 nm to 950 nm in 10 nm steps (61 wavelengths) can be applied. The light source spectral information in the figure represents an example holding spectral information for each of these 61 wavelengths. The chromaticity information detection unit 240 can hold an array of light source spectral information corresponding to a plurality of such color temperatures. The light source spectral information in this array can be generated by measuring in advance the color temperatures and spectral information of real light sources. The chromaticity information detection unit 240 can use the array in the figure as a lookup table that takes the detected color temperature as input and outputs the light source spectral information.
The image adjustment unit 260 adjusts the image output from the spectral image generation unit 220 based on the light source spectral information output from the chromaticity information detection unit 240. This adjustment can be performed, for example, according to the following equation.
FIGS. 7A-7C are diagrams showing an example of removal of the influence of the light source according to the first embodiment of the present disclosure. FIGS. 7A-7C represent the spectral distributions of images and illustrate the effect of the removal of the influence of the light source according to the present disclosure. The horizontal axes of FIGS. 7A-7C represent wavelength, and the vertical axes represent brightness (arbitrary units). Graph 351 in FIG. 7A represents the spectral distribution of the image output from the spectral image generation unit 220. Graph 352 in FIG. 7B represents the spectral distribution of the light source based on the light source spectral information output from the light source spectral information generation unit 250. Graph 353 in FIG. 7C represents the image adjusted by the image adjustment unit 260. The image of graph 353 in FIG. 7C can be generated by removing the spectral distribution of the light source in graph 352 of FIG. 7B from graph 351 of FIG. 7A. Graph 353 in FIG. 7C represents a spectral distribution corresponding to the original color of the subject.
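One plausible reading of removing the spectral distribution of the light source (graph 352) from the observed distribution (graph 351) is a per-wavelength division; the disclosure's actual adjustment equation is not reproduced in this text, so the following is only an illustrative sketch under that assumption:

```python
import numpy as np

def remove_light_source(image_spectrum: np.ndarray,
                        light_spectrum: np.ndarray) -> np.ndarray:
    """Divide the observed per-wavelength values by the light source
    spectrum, yielding a distribution corresponding to the subject's
    original color (graph 353).

    Both arrays are sampled on the same wavelength grid (e.g. 61 values
    for 350-950 nm in 10 nm steps). The epsilon guard is an
    illustrative addition for numerically empty bands.
    """
    eps = 1e-12
    return image_spectrum / (light_spectrum + eps)
```

Under this reading, a flat result across wavelengths would indicate that the observed image was dominated by the light source's own spectrum.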
FIG. 8 is a diagram showing an example of image processing according to the first embodiment of the present disclosure. The figure shows an example of the image processing in the image processing device 200. First, the image generation unit 210 generates an image from the input image signals (step S101). Next, the spectral image generation unit 220 performs reconstruction processing to generate a spectral image (step S102). Next, specific area detection processing is performed (step S110); this is processing for detecting the specific area, performed by the specific area detection unit 230. Next, chromaticity information detection processing is performed (step S120); this is processing for detecting light source color information based on the detected specific area, performed by the chromaticity information detection unit 240. Next, light source spectral information generation processing is performed (step S130); this is processing for generating light source spectral information based on the detected light source color information, performed by the light source spectral information generation unit 250. Next, adjustment processing is performed (step S103); this is processing for adjusting the spectral image based on the generated light source spectral information, performed by the image adjustment unit 260.
FIG. 9 is a diagram showing an example of the specific area detection processing according to the first embodiment of the present disclosure. The figure shows an example of the specific area detection processing in the specific area detection unit 230. First, the specific area detection unit 230 smooths the input image (step S111). This processing removes noise from the image, and can be performed, for example, with a Gaussian filter. Next, the specific area detection unit 230 normalizes the image (step S112). Next, the specific area detection unit 230 calculates the feature quantity (step S113); this can be done using equation (3) above. Next, the specific area detection unit 230 divides the image into a plurality of regions (step S114). Next, the specific area detection unit 230 selects one not-yet-selected region among the divided regions (step S115).
FIG. 10 is a diagram showing an example of the chromaticity information detection processing according to the first embodiment of the present disclosure. The figure shows an example of the chromaticity information detection processing in the chromaticity information detection unit 240. The chromaticity information detection processing is described taking as an example the case of detecting a color temperature as the light source color information. First, the chromaticity information detection unit 240 extracts image signals from the specific area (step S121). This can be done by selecting the image signals included in the region corresponding to the specific area in the image (spectral image) output from the spectral image generation unit 220. At this time, the chromaticity information detection unit 240 extracts a predetermined number of image signals. Next, the chromaticity information detection unit 240 converts the plurality of extracted image signals into the chromaticity space (xy chromaticity) (step S122).
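The conversion into xy chromaticity in step S122 can be sketched as below, assuming the multi-band image signals have first been reduced to CIE XYZ tristimulus values (that preceding conversion is not shown here and its details are an assumption):

```python
def to_xy_chromaticity(X: float, Y: float, Z: float):
    """Convert CIE XYZ tristimulus values to xy chromaticity
    coordinates: x = X/(X+Y+Z), y = Y/(X+Y+Z).

    Returns (0.0, 0.0) for an all-zero input as an illustrative
    convention for empty pixels.
    """
    s = X + Y + Z
    if s == 0.0:
        return 0.0, 0.0
    return X / s, Y / s
```

Applying this to each extracted image signal yields the cloud of hatched points of FIG. 5, to which the approximate straight line is then fitted.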
FIG. 11 is a diagram showing an example of the light source spectral information generation processing according to the first embodiment of the present disclosure. The figure shows an example of the light source spectral information generation processing in the light source spectral information generation unit 250. First, the light source spectral information generation unit 250 selects light source spectral information from the array of light source spectral information based on the light source color information (step S131). As described above with reference to FIG. 10, the chromaticity information detection unit 240 outputs a color temperature or a chromaticity as the light source color information. The light source spectral information generation unit 250 therefore selects light source spectral information based on the color temperature or the chromaticity. For example, the light source spectral information generation unit 250 can search the array of light source spectral information by the color temperature input as the light source color information, and select the two pieces of light source spectral information adjacent on either side of the input color temperature.
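Step S131 together with the per-wavelength mixing described earlier can be sketched as follows; the linear blending weight and the clamping at the ends of the table are illustrative assumptions, not choices stated in the disclosure:

```python
import numpy as np

def generate_light_source_spectrum(cct: float, lut: list) -> np.ndarray:
    """Select the two held spectra whose color temperatures bracket the
    detected color temperature, then blend them wavelength by wavelength.

    lut: list of (color_temperature, spectrum) pairs sorted by color
    temperature; each spectrum is e.g. 61 values for 350-950 nm in
    10 nm steps.
    """
    temps = [t for t, _ in lut]
    # Clamp outside the table range (illustrative convention).
    if cct <= temps[0]:
        return np.asarray(lut[0][1], dtype=float)
    if cct >= temps[-1]:
        return np.asarray(lut[-1][1], dtype=float)
    for (t0, s0), (t1, s1) in zip(lut, lut[1:]):
        if t0 <= cct <= t1:
            w = (cct - t0) / (t1 - t0)  # linear weight between neighbors
            return (1.0 - w) * np.asarray(s0, dtype=float) \
                 + w * np.asarray(s1, dtype=float)
```

With a sufficiently dense table, selecting the single nearest entry (the simpler alternative the text also describes) gives similar results without the blend.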
The image processing device 200 of the first embodiment described above detects the color temperature from the intersection of the blackbody locus and the approximate straight line generated by the chromaticity information detection unit 240 based on the distribution of the image signals in the chromaticity space. In contrast, the image processing device 200 of the second embodiment of the present disclosure differs from the first embodiment in that it detects the color temperature without using the blackbody locus.
FIG. 12 is a diagram showing an example of detection of light source color information according to the second embodiment of the present disclosure. Like FIG. 5, the figure shows an example of detection of light source color information in the chromaticity information detection unit 240. The detection method of the figure differs from that of FIG. 5 in that it does not use the curve of the blackbody locus.
The technology according to the present disclosure can be applied to various products. For example, the technology according to the present disclosure can be applied to an imaging device such as a camera.
[Configuration of the imaging element]
FIG. 14 is a diagram illustrating a configuration example of an imaging element according to an embodiment of the present disclosure. The figure is a block diagram showing a configuration example of the imaging element 1001. The imaging element 1001 is a semiconductor element that generates image data of a subject. The imaging element 1001 includes a pixel array unit 10, a vertical driving unit 20, a column signal processing unit 30, and a control unit 40.
FIG. 15 is a diagram illustrating a configuration example of a pixel according to an embodiment of the present disclosure. The figure shows a simplified cross-sectional configuration of the pixel 100, schematically representing a part of the pixel 100. The pixel 100 includes a semiconductor substrate 120, a wiring region 130, a protective film 140, a color filter 150, and an on-chip lens 180.
The image processing device 200 has a specific area detection unit 230, a chromaticity information detection unit 240, and a light source spectral information generation unit 250. The specific area detection unit 230 detects a specific area, which is the image area of a specific object, from an image of a subject irradiated with light from a light source. The chromaticity information detection unit 240 detects light source color information based on a plurality of image signals of the detected specific area. The light source spectral information generation unit 250 generates light source spectral information, which is spectral information of the light source, based on the detected light source color information. As a result, the spectrum of the light source can be detected from the light source color information estimated from the specific area.
(1)
An image processing device comprising:
a specific area detection unit that detects a specific area, which is an image area of a specific object, from an image of a subject irradiated with light from a light source;
a chromaticity information detection unit that detects light source color information based on a plurality of image signals of the detected specific area; and
a light source spectral information generation unit that generates light source spectral information, which is spectral information of the light source, based on the detected light source color information.
(2)
The image processing device according to (1), wherein the light source spectral information is information representing a relationship between wavelength and spectrum.
(3)
The image processing device according to (2), wherein the light source spectral information generation unit generates the light source spectral information by mixing the respective spectra for each wavelength in a plurality of pieces of the light source spectral information based on the detected chromaticity information.
(4)
The image processing device according to (1) or (2), wherein the light source spectral information generation unit generates the light source spectral information by selecting from a plurality of pieces of held light source spectral information.
(5)
The image processing device according to any one of (1) to (4), wherein the chromaticity information detection unit detects, as the light source color information, an approximate straight line generated based on the plurality of image signals in a chromaticity space.
(6)
The image processing device according to (5), wherein the chromaticity information detection unit detects, as the light source color information, a color temperature detected based on the approximate straight line.
(7)
The image processing device according to (6), wherein the chromaticity information detection unit detects, as the color temperature, the color in the chromaticity space at the intersection of the approximate straight line and the blackbody locus.
(8)
The image processing device according to any one of (1) to (7), wherein the specific area detection unit detects the specific area based on the relationship between the wavelength and the reflectance of the specific object.
(9)
The image processing device according to any one of (1) to (8), wherein the specific area detection unit detects the specific area using a plant as the specific object.
(10)
The image processing device according to any one of (1) to (9), further comprising an image adjustment unit that adjusts the image of the subject based on the generated light source spectral information.
(11)
An image processing method comprising:
detecting a specific area, which is an image area of a specific object, from an image of a subject irradiated with light from a light source;
detecting light source color information based on a plurality of image signals of the detected specific area; and
generating light source spectral information, which is spectral information of the light source, based on the detected light source color information.
(12)
An image processing program for causing a computer to execute:
detecting a specific area, which is an image area of a specific object, from an image of a subject irradiated with light from a light source;
detecting light source color information based on a plurality of image signals of the detected specific area; and
generating light source spectral information, which is spectral information of the light source, based on the detected light source color information.
100 Pixel
200 Image processing device
210 Image generation unit
220 Spectral image generation unit
230 Specific area detection unit
240 Chromaticity information detection unit
250 Light source spectral information generation unit
260 Image adjustment unit
1000 Imaging device
1001 Imaging element
1003 Image processing unit
Claims (12)
- An image processing device comprising:
a specific area detection unit that detects a specific area, which is an image area of a specific object, from an image of a subject irradiated with light from a light source;
a chromaticity information detection unit that detects light source color information based on a plurality of image signals of the detected specific area; and
a light source spectral information generation unit that generates light source spectral information, which is spectral information of the light source, based on the detected light source color information.
- The image processing device according to claim 1, wherein the light source spectral information is information representing a relationship between wavelength and spectrum.
- The image processing device according to claim 2, wherein the light source spectral information generation unit generates the light source spectral information by mixing the respective spectra for each wavelength in a plurality of pieces of the light source spectral information based on the detected light source color information.
- The image processing device according to claim 1, wherein the light source spectral information generation unit generates the light source spectral information by selecting from a plurality of pieces of held light source spectral information.
- The image processing device according to claim 1, wherein the chromaticity information detection unit detects, as the light source color information, an approximate straight line generated based on the plurality of image signals in a chromaticity space.
- The image processing device according to claim 5, wherein the chromaticity information detection unit detects, as the light source color information, a color temperature detected based on the approximate straight line.
- The image processing device according to claim 6, wherein the chromaticity information detection unit detects, as the color temperature, the color in the chromaticity space at the intersection of the approximate straight line and the blackbody locus.
- The image processing device according to claim 1, wherein the specific area detection unit detects the specific area based on the relationship between the wavelength and the reflectance of the specific object.
- The image processing device according to claim 1, wherein the specific area detection unit detects the specific area using a plant as the specific object.
- The image processing device according to claim 1, further comprising an image adjustment unit that adjusts the image of the subject based on the generated light source spectral information.
- An image processing method comprising:
detecting a specific area, which is an image area of a specific object, from an image of a subject irradiated with light from a light source;
detecting light source color information based on a plurality of image signals of the detected specific area; and
generating light source spectral information, which is spectral information of the light source, based on the detected light source color information.
- An image processing program for causing a computer to execute:
detecting a specific area, which is an image area of a specific object, from an image of a subject irradiated with light from a light source;
detecting light source color information based on a plurality of image signals of the detected specific area; and
generating light source spectral information, which is spectral information of the light source, based on the detected light source color information.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2022578257A JPWO2022163423A1 (ja) | 2021-01-29 | 2022-01-18 | |
EP22745646.4A EP4287108A1 (en) | 2021-01-29 | 2022-01-18 | Image processing device, image processing method, and image processing program |
CN202280008867.4A CN116670705A (zh) | 2021-01-29 | 2022-01-18 | 图像处理装置、图像处理方法和图像处理程序 |
US18/261,354 US20240070920A1 (en) | 2021-01-29 | 2022-01-18 | Image processing device, image processing method, and image processing program |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2021013992 | 2021-01-29 | ||
JP2021-013992 | 2021-01-29 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022163423A1 true WO2022163423A1 (ja) | 2022-08-04 |
Family
ID=82653334
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2022/001512 WO2022163423A1 (ja) | 2021-01-29 | 2022-01-18 | 画像処理装置、画像処理方法及び画像処理プログラム |
Country Status (6)
Country | Link |
---|---|
US (1) | US20240070920A1 (ja) |
EP (1) | EP4287108A1 (ja) |
JP (1) | JPWO2022163423A1 (ja) |
CN (1) | CN116670705A (ja) |
TW (1) | TW202303511A (ja) |
WO (1) | WO2022163423A1 (ja) |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2004165932A (ja) * | 2002-11-12 | 2004-06-10 | Sony Corp | 光源推定装置、光源推定方法、撮像装置および画像処理方法 |
JP2006173799A (ja) * | 2004-12-13 | 2006-06-29 | Canon Inc | 画像処理装置およびその方法 |
JP2008236437A (ja) * | 2007-03-20 | 2008-10-02 | Toshiba Corp | 画像処理装置、画像処理方法および画像処理用プログラム |
JP2009272709A (ja) * | 2008-04-30 | 2009-11-19 | Canon Inc | 画像処理装置、方法及びプログラム |
WO2014203453A1 (ja) | 2013-06-19 | 2014-12-24 | 日本電気株式会社 | 照明推定装置、照明推定方法および照明推定プログラム |
JP2016004134A (ja) * | 2014-06-16 | 2016-01-12 | キヤノン株式会社 | 撮像装置及びその制御方法、プログラム、記憶媒体 |
JP2017215851A (ja) * | 2016-06-01 | 2017-12-07 | キヤノン株式会社 | 画像処理装置および画像処理方法、造形システム |
Worldwide applications
- 2021-12-30: TW TW110149531A (TW202303511A), status unknown
- 2022-01-18: US 18/261,354 (US20240070920A1), pending
- 2022-01-18: EP 22745646.4 (EP4287108A1), pending
- 2022-01-18: WO PCT/JP2022/001512 (WO2022163423A1), application filing
- 2022-01-18: JP 2022578257 (JPWO2022163423A1), pending
- 2022-01-18: CN 202280008867.4 (CN116670705A), pending
Also Published As
Publication number | Publication date |
---|---|
JPWO2022163423A1 (ja) | 2022-08-04 |
TW202303511A (zh) | 2023-01-16 |
US20240070920A1 (en) | 2024-02-29 |
CN116670705A (zh) | 2023-08-29 |
EP4287108A1 (en) | 2023-12-06 |
Legal Events
Date | Code | Title | Description
---|---|---|---
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 22745646; Country of ref document: EP; Kind code of ref document: A1
| ENP | Entry into the national phase | Ref document number: 2022578257; Country of ref document: JP; Kind code of ref document: A
| WWE | Wipo information: entry into national phase | Ref document number: 202280008867.4; Country of ref document: CN
| WWE | Wipo information: entry into national phase | Ref document number: 18261354; Country of ref document: US
| WWE | Wipo information: entry into national phase | Ref document number: 2022745646; Country of ref document: EP
| NENP | Non-entry into the national phase | Ref country code: DE
| ENP | Entry into the national phase | Ref document number: 2022745646; Country of ref document: EP; Effective date: 20230829