WO2017051909A1 - Image processing device, image processing program, imaging device, and imaging program - Google Patents

Image processing device, image processing program, imaging device, and imaging program

Info

Publication number
WO2017051909A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
wavelength
imaging data
unit
data
Prior art date
Application number
PCT/JP2016/078121
Other languages
English (en)
Japanese (ja)
Inventor
潤弥 萩原
Original Assignee
株式会社ニコン
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社ニコン filed Critical 株式会社ニコン
Priority to JP2017540934A priority Critical patent/JP6645504B2/ja
Publication of WO2017051909A1 publication Critical patent/WO2017051909A1/fr

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01J - MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J 3/00 - Spectrometry; Spectrophotometry; Monochromators; Measuring colours
    • G01J 3/28 - Investigating the spectrum
    • G01J 3/42 - Absorption spectrometry; Double beam spectrometry; Flicker spectrometry; Reflection spectrometry
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01N - INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N 21/00 - Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N 21/17 - Systems in which incident light is modified in accordance with the properties of the material investigated
    • G01N 21/25 - Colour; Spectral properties, i.e. comparison of effect of material on the light at two or more different wavelengths or wavelength bands
    • G01N 21/27 - Colour; Spectral properties, i.e. comparison of effect of material on the light at two or more different wavelengths or wavelength bands, using photo-electric detection; circuits for computing concentration

Definitions

  • the present invention relates to an image processing device, an image processing program, an imaging device, and an imaging program.
  • Patent Document 1: Japanese Patent No. 5034953
  • each object has a different absorption spectrum in the invisible wavelength band. For this reason, when the combination of bandpass filters is not appropriate for generating a visible image, it has been impossible for an observer to obtain a color image with discrimination good enough to distinguish and recognize a plurality of objects.
  • according to a first aspect, the image processing apparatus includes: a data acquisition unit that acquires invisible imaging data captured by light from a subject that has passed through each of n types (n is a natural number of 3 or more) of wavelength band filters whose pass bands are invisible wavelengths; a setting unit that sets a plurality of partial areas in the image indicated by the invisible imaging data; an analysis unit that analyzes, from the invisible imaging data, the light intensity for each of the pass bands in each of the plurality of partial areas; a selection unit that selects m types of invisible imaging data (m is a natural number of 2 or more and less than n) based on the analysis result of the analysis unit; and an image generation unit that generates visible image data by associating each piece of the selected invisible imaging data with visible wavelength bands that differ from one another in at least part of the band.
  • according to a second aspect, the image processing program causes a computer to execute: a data acquisition step of acquiring invisible imaging data captured by light from a subject that has passed through each of n types (n is a natural number of 3 or more) of wavelength band filters whose pass bands are invisible wavelengths; a setting step of setting a plurality of partial areas in the image indicated by the invisible imaging data; an analysis step of analyzing, from the invisible imaging data, the light intensity for each of the pass bands in each of the plurality of partial areas; a selection step of selecting m types of invisible imaging data (m is a natural number of 2 or more and less than n) based on the analysis result of the analysis step; and an image generation step of generating visible image data by associating each piece of the invisible imaging data selected in the selection step with visible wavelength bands that differ from one another in at least part of the band.
  • according to a third aspect, the imaging device includes: a data generation unit that generates invisible imaging data captured by light from a subject that has passed through each of n types (n is a natural number of 3 or more) of wavelength band filters whose pass bands are invisible wavelengths; a setting unit that sets a plurality of partial areas in the image indicated by the invisible imaging data; an analysis unit that analyzes, from the invisible imaging data, the light intensity for each of the pass bands in each of the plurality of partial areas; a selection unit that selects m types of invisible imaging data (m is a natural number of 2 or more and less than n) based on the analysis result of the analysis unit; and a processing unit that processes the invisible imaging data based on information on the invisible imaging data selected by the selection unit.
  • according to the fourth aspect of the present invention, the imaging program causes a computer to execute: a data generation step of generating invisible imaging data captured by light from a subject that has passed through each of n types (n is a natural number of 3 or more) of wavelength band filters whose pass bands are invisible wavelengths; a setting step of setting a plurality of partial areas in the image indicated by the invisible imaging data; an analysis step of analyzing, from the invisible imaging data, the light intensity for each of the pass bands in each of the plurality of partial areas; a selection step of selecting m types of invisible imaging data (m is a natural number of 2 or more and less than n) based on the analysis result of the analysis step; and a processing step of processing the invisible imaging data based on information on the invisible imaging data selected in the selection step.
  • FIG. 1 is a diagram illustrating a configuration of a digital camera 10 as an example of an image processing apparatus according to the first embodiment.
  • the digital camera 10 can capture an image of a subject light flux in a band outside the visible band.
  • the digital camera 10 includes a photographic lens 20 as a photographic optical system and an image sensor 100.
  • the photographic lens 20 guides the subject light flux incident along the optical axis 21 to the image sensor 100.
  • the digital camera 10 includes a control unit 201, an A/D conversion circuit 202, a work memory 203, a drive unit 204, an image processing unit 205, a system memory 206, a memory card IF 207, an operation unit 208, a display unit 209, an LCD drive circuit 210, and a communication unit 211.
  • the photographic lens 20 is composed of a plurality of optical lens groups and forms an image of the subject light flux from a scene in the vicinity of its focal plane.
  • the photographic lens 20 may be an interchangeable lens that can be attached to and detached from the digital camera 10.
  • the camera body functions as an image processing device.
  • the photographic lens 20 is represented by a single virtual lens arranged in the vicinity of the pupil.
  • the image sensor 100 is disposed in the vicinity of the focal plane of the photographic lens 20.
  • the image sensor 100 is a near-infrared wavelength band image sensor having light receiving sensitivity in a non-visible band.
  • the image sensor 100 has light receiving sensitivity in the range of 800 nm to 2000 nm within the near-infrared band of 800 nm to 2500 nm.
  • the near-infrared band and the range of light receiving sensitivity are not limited to this example.
  • the near-infrared band may be widened and the lower limit may be set to 700 nm.
  • the upper limit of the near infrared band may be 3000 nm.
  • the image sensor 100 includes a plurality of pixels arranged two-dimensionally.
  • Each of the plurality of pixels includes a photoelectric conversion unit and a band-pass filter provided corresponding to the photoelectric conversion unit.
  • there are six types of bandpass filters, and each of the plurality of photoelectric conversion units is provided with any one of the NIR1 filter, NIR2 filter, NIR3 filter, NIR4 filter, NIR5 filter, and NIR6 filter.
  • the image sensor 100 is timing-controlled by the drive unit 204, converts the subject image formed on the light receiving surface into a pixel signal, and outputs the pixel signal to the A / D conversion circuit 202.
  • the A/D conversion circuit 202 converts the pixel signal output from the image sensor 100 into a digital signal, and outputs the digitally converted imaging data to the work memory 203.
  • each of the plurality of photoelectric conversion units is provided with one of six types of band-pass filters.
  • the imaging data includes components obtained from the subject light flux that has passed through each of the six types of bandpass filters.
  • a specific example of such a component is image layer data composed of the pixel value data of the pixels in which the NIR1 filter is arranged. That is, in the present embodiment, the imaging data includes six types of image layer data, one for the invisible wavelength band of each bandpass filter. Details of the image layer data will be described later.
  • the image processing unit 205 uses the work memory 203 as a work space, and performs various processes such as a brightness correction process on the captured image data.
  • the image processing unit 205 plays a role as an acquisition unit that acquires invisible imaging data.
  • the image processing unit 205 extracts six image layer data from the image data captured by the light from the subject that has passed through each of the six wavelength band filters whose pass bands are invisible wavelength bands. Then, the image processing unit 205 performs interpolation processing described later on each of the six image layer data, and generates image layer data as invisible imaging data.
  • the pass band of the wavelength in the wavelength band filter may be referred to as a transmission wavelength band.
  • the control unit 201 acts together with an operation unit 208 described later to play a role as a setting unit.
  • the setting unit receives a user instruction and sets a plurality of partial areas as analysis areas in the image indicated by the image layer data. The analysis area will be described later.
  • control unit 201 acts together with the image processing unit 205 to play a role as an analysis unit.
  • the analysis unit analyzes, from the intensity values of the image layer data, the intensity of light from the subject for the pass band of each bandpass filter in each of the set analysis regions.
  • the intensity value is, for example, an integrated value of luminance values (pixel values) of pixels included in each analysis region.
  • the intensity value may be referred to as spectral intensity.
  • control unit 201 acts together with the image processing unit 205 to play a role as a selection unit.
  • the selection unit selects three types of image layer data according to the analysis result by the analysis unit.
  • the image processing unit 205 plays a role as an image generation unit.
  • the image generation unit generates visible image data by associating each piece of the invisible imaging data selected by the selection unit with visible wavelength bands that differ from one another in at least part of the band.
  • the image processing unit 205 records the generated color image data on the memory card 220 attached to the memory card IF 207.
  • the generated visible image data is converted into a display signal by the LCD driving circuit 210 and displayed on the display unit 209 as a visible image.
  • the display unit 209 also displays a menu screen for various settings. For example, a menu screen relating to the setting of the analysis region for the observation object described later is displayed.
  • the system memory 206 records a program for controlling the digital camera 10, various parameters, and the like.
  • the system memory 206 records spectral transmittance data of each of the six bandpass filters.
  • the spectral transmittance data includes the peak wavelength of transmittance and the transmittance for each wavelength.
  • the spectral transmittance data may be recorded in the form of a function with the wavelength as a variable, or may be recorded in the form of a data table indicating the relationship between the wavelength and the transmittance.
  • the spectral transmittance data is recorded in association with the identifier of the corresponding bandpass filter.
  • the operation unit 208 accepts user operations.
  • the operation unit 208 outputs an operation signal corresponding to the received user operation to the control unit 201. For example, when a menu screen related to the setting of the analysis region in the observation target is displayed on the display unit 209, an operation signal related to the setting of the analysis region in the observation target is output to the control unit 201 in accordance with a user operation.
  • the operation unit 208 includes operation members such as a release switch, a cross key, and an OK key.
  • the release switch is a push button whose depression can be detected in two stages.
  • by detecting SW1, the first-stage depression, the control unit 201 executes shooting preparation operations such as AF and AE; by detecting SW2, the second-stage depression, it performs the operation of acquiring a subject image with the image sensor 100.
  • AF is executed so that the subject image is focused in the infrared wavelength band.
  • the communication unit 211 communicates with other devices.
  • the communication unit 211 transmits imaging data to another device in response to a user operation via the operation unit 208.
  • Examples of other devices include a device having a display unit such as a personal computer, a smartphone, and a tablet terminal, and a server device on the Internet.
  • FIG. 2 is an explanatory diagram of a band-pass filter disposed on each photoelectric conversion unit of the image sensor 100.
  • Each of the six types of band-pass filters passes a part of the continuous near-infrared band of the subject light flux.
  • the band to be passed is different among the six types of bandpass filters.
  • NIR1 filter, NIR2 filter, NIR3 filter, NIR4 filter, NIR5 filter, and NIR6 filter are provided as six types of bandpass filters.
  • each of the six types of bandpass filters may be simply referred to as NIR1, NIR2, NIR3, NIR4, NIR5, and NIR6.
  • NIR1 to NIR6 are assigned to six-pixel units 101, each consisting of 3 × 2 pixels (x × y).
  • NIR1 is assigned to the upper left pixel
  • NIR2 is assigned to the upper middle pixel
  • NIR3 is assigned to the upper right pixel
  • NIR4 is assigned to the lower left pixel
  • NIR5 is assigned to the lower middle pixel
  • NIR6 is assigned to the lower right pixel.
  • the arrangement of the bandpass filter is not limited to this example.
  • each of the plurality of two-dimensionally arranged pixels is provided with any one of NIR1 to NIR6, arranged discretely. Therefore, the image sensor 100 detects the incident subject light flux separated into the respective wavelength bands. In other words, the image sensor 100 photoelectrically converts the subject image formed on the light receiving surface in six mutually different wavelength bands within the infrared wavelength band.
  • the image processing unit 205 separates image layer data composed of pixel value data corresponding to each bandpass filter from the imaging data. Specifically, the image processing unit 205 extracts image layer data including only pixel value data in the pixel where NIR1 is arranged from the imaging data. Image layer data is similarly extracted for each of NIR2 to NIR6. In this way, the image processing unit 205 extracts six image layer data corresponding to each bandpass filter from the imaging data.
  • one of the six bandpass filters is arranged on each pixel. Therefore, for example, in the image layer data obtained by extracting only the pixel value data of the pixels where NIR1 is arranged, pixel values are not stored for all pixels; there exist pixels in which no pixel value data is stored.
  • hereinafter, a pixel in which no pixel value is stored may be referred to as a defective pixel.
  • the image processing unit 205 calculates the pixel value of each defective pixel of each image layer data from the pixel values of surrounding pixels by interpolation processing, and generates image layer data having pixel value data for all pixels. Specifically, the pixel value is calculated by, for example, bilinear interpolation or bicubic interpolation using the pixel values of the pixels surrounding the defective pixel to be interpolated.
  • the interpolation processing method is not limited to these methods, and a known interpolation processing method can be adopted. In the following description, when simply referred to as image layer data, it refers to the image layer data after the above-described interpolation processing unless otherwise specified. In the present embodiment, the invisible imaging data is image layer data after the interpolation processing.
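  • As an illustration of the extraction and interpolation described above, the following Python sketch splits a raw mosaic into six sparse layers and fills the defective pixels by neighborhood averaging. It is a minimal sketch, assuming the 3 × 2 filter layout of FIG. 2 and a numpy array input; the function and constant names are hypothetical, and the averaging is a simple stand-in for the bilinear or bicubic interpolation mentioned in the text.

```python
import numpy as np

# Hypothetical mapping of each filter to its (row, column) offset within
# the 2-row x 3-column pixel unit of FIG. 2 (NIR1-NIR3 top, NIR4-NIR6 bottom).
FILTER_OFFSETS = {
    "NIR1": (0, 0), "NIR2": (0, 1), "NIR3": (0, 2),
    "NIR4": (1, 0), "NIR5": (1, 1), "NIR6": (1, 2),
}

def extract_layers(mosaic):
    """Split a raw mosaic frame into six sparse layers; NaN marks the
    defective pixels where the corresponding filter is absent."""
    layers = {}
    for name, (dy, dx) in FILTER_OFFSETS.items():
        layer = np.full(mosaic.shape, np.nan)
        layer[dy::2, dx::3] = mosaic[dy::2, dx::3]
        layers[name] = layer
    return layers

def interpolate_layer(layer, iterations=8):
    """Fill defective pixels from the mean of valid 3x3 neighbors; a simple
    stand-in for the bilinear/bicubic interpolation mentioned in the text."""
    out = layer.copy()
    h, w = out.shape
    for _ in range(iterations):
        nan_mask = np.isnan(out)
        if not nan_mask.any():
            break
        padded = np.pad(out, 1, mode="edge")
        # Stack the 8 shifted neighbor views and average the valid ones.
        neighbors = np.stack([padded[y:y + h, x:x + w]
                              for y in range(3) for x in range(3)
                              if (y, x) != (1, 1)])
        valid = ~np.isnan(neighbors)
        sums = np.where(valid, neighbors, 0.0).sum(axis=0)
        counts = valid.sum(axis=0)
        fill = np.where(counts > 0, sums / np.maximum(counts, 1), np.nan)
        out[nan_mask] = fill[nan_mask]
    return out
```

  For example, `layers = {k: interpolate_layer(v) for k, v in extract_layers(raw_mosaic).items()}` yields six dense image layer data arrays from one raw frame.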
  • FIG. 3 is a diagram for explaining the spectral transmittance characteristics of the band-pass filter.
  • the spectral transmittance curve of each bandpass filter is shown.
  • the horizontal axis indicates the wavelength [nm]
  • the vertical axis indicates the transmittance [%].
  • the shapes of the spectral transmittance curves NIR1 to NIR6 are substantially the same as a whole.
  • the spectral transmittances of NIR1 to NIR6 are unimodal distributions, and the central wavelength of the transmission wavelength band and the peak wavelength of the transmittance are substantially the same.
  • the transmission wavelength bands of the wavelength band filters (NIR1 and NIR2, NIR2 and NIR3, etc.) having the closest center wavelengths (or peak wavelengths) partially overlap each other.
  • the spectral transmittance is not limited to a single peak and can have various distributions. Whether the spectral transmittance has a unimodal distribution or another distribution, the center wavelength can be determined in accordance with the shape of the distribution. For example, the midpoint of the width of the spectral transmittance curve at a transmittance equal to 50% of the maximum transmittance, or at a transmittance of 50%, may be set as the center wavelength.
  • the peak wavelength may be the center wavelength as described above.
  • alternatively, the spectral transmittance may be fitted with a single-peaked function such as a quadratic function or a Gaussian function, and the peak wavelength of the transmittance curve obtained by the fitting may be used as the center wavelength.
  • the center of the wavelength band having transmittance may be the center wavelength.
  • the center wavelength is not limited to the above definition as long as it substantially indicates the center of the spectral transmittance distribution.
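  • The following sketch illustrates one of the center-wavelength definitions above: the midpoint of the wavelength span where the transmittance reaches 50% of its peak. The sampled Gaussian band is an invented stand-in for a measured spectral transmittance curve, and the function name is hypothetical.

```python
import numpy as np

def center_wavelength(wl, trans):
    """Midpoint of the span where transmittance is at least 50% of its
    maximum -- one of the center-wavelength definitions in the text."""
    above = wl[trans >= trans.max() / 2.0]
    return (above.min() + above.max()) / 2.0

# Invented example: a Gaussian-shaped band roughly like NIR1 (peak ~900 nm).
wl = np.linspace(700.0, 1100.0, 401)                      # wavelength [nm]
trans = 90.0 * np.exp(-0.5 * ((wl - 900.0) / 80.0) ** 2)  # transmittance [%]
print(center_wavelength(wl, trans))                       # -> 900.0
```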
  • NIR1 has a transmittance from about 700 nm to about 1100 nm, and the peak wavelength ⁇ a of the transmittance is 900 nm.
  • NIR2 has a transmittance from about 800 nm to about 1250 nm, and the peak wavelength ⁇ b of the transmittance is 1050 nm.
  • NIR3 has a transmittance from about 950 nm to about 1400 nm, and the peak wavelength ⁇ c of the transmittance is 1150 nm.
  • NIR4 has a transmittance from about 1100 nm to about 1500 nm, and the peak wavelength ⁇ d of the transmittance is 1300 nm.
  • NIR5 has a transmittance from about 1200 nm to about 1700 nm, and the peak wavelength ⁇ e of the transmittance is 1450 nm.
  • NIR6 has a transmittance from about 1350 nm to about 1850 nm, and the peak wavelength ⁇ f of the transmittance is 1600 nm.
  • each of NIR1, NIR2, NIR3, NIR4, NIR5, and NIR6 has transmittance at the peak wavelength of the transmittance of the bandpass filter whose center wavelength is adjacent to its own.
  • NIR1 has transmittance at the peak wavelength of NIR2.
  • NIR3 has a transmittance at the peak wavelength of NIR2.
  • NIR2 has transmittance at each of the peak wavelengths of NIR1 and NIR3.
  • the characteristics of the pixel signal output from the image sensor 100 are determined by the combination of the spectral transmittance characteristics of NIR1 to NIR6, the spectral sensitivity characteristics of the image sensor 100, the spectral characteristics of the light source, the spectral characteristics of the imaging lens, and so on.
  • for simplicity, the spectral transmittance characteristics of the bandpass filters are described here as the output characteristics of the image sensor 100. If the spectral sensitivity characteristics of the image sensor 100 have a distribution, the spectral transmittances of NIR1 to NIR6 are set in consideration of this, so that the characteristics of the pixel signal output from the image sensor 100 become the curves shown in FIG. 3. That is, the "wavelength band filter" may include not only the bandpass filter itself but also the filtering effect of spectral characteristics such as those of the light source and the imaging lens and the spectral sensitivity characteristics of the image sensor 100.
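  • As a hedged formalization of the combination just described (the patent states the idea in prose, not as an equation), the signal $S_k$ of a pixel behind bandpass filter $k$ can be modeled as

  $$S_k \propto \int T_k(\lambda)\, Q(\lambda)\, L(\lambda)\, O(\lambda)\, \rho(\lambda)\, d\lambda,$$

  where $T_k$ is the spectral transmittance of the filter, $Q$ the spectral sensitivity of the image sensor 100, $L$ the spectrum of the light source, $O$ the transmittance of the imaging lens, and $\rho$ the spectral reflectance of the subject. The symbols are labels introduced here, not notation from the patent.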
  • FIG. 4 is a diagram for explaining an object of color discrimination processing according to an example in the first embodiment.
  • FIG. 4 shows an image 300 that includes an object of color discrimination processing.
  • the object of the color discrimination process is “water”.
  • Different amounts of water are contained in containers 1 to 6, which are six containers of the same shape.
  • the regions where water exists are indicated as regions 1 to 6. The digital camera 10 calculates the intensity value of each of regions 1 to 6 in the image layer data corresponding to each bandpass filter, and stores it in the work memory 203 in association with the analysis region and the peak wavelength of the corresponding bandpass filter.
  • data in which the intensity value, the analysis region, and the peak wavelength of the bandpass filter are associated with each other may be referred to as spectral spectrum data.
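  • A minimal sketch of building the spectral spectrum data, assuming interpolated layers from the earlier sketch and analysis regions given as (row slice, column slice) pairs. The normalization to the largest band value per region is an assumption; the patent only says the plotted intensities are normalized. Names are hypothetical.

```python
import numpy as np

PEAK_WAVELENGTHS = {"NIR1": 900, "NIR2": 1050, "NIR3": 1150,
                    "NIR4": 1300, "NIR5": 1450, "NIR6": 1600}  # [nm]

def spectral_spectrum_data(layers, regions):
    """Per analysis region: integrated pixel value of every interpolated
    layer, keyed by the filter's peak wavelength and normalized to the
    region's largest band intensity (normalization is an assumption)."""
    data = {}
    for region_name, (rows, cols) in regions.items():
        raw = {PEAK_WAVELENGTHS[name]: float(layer[rows, cols].sum())
               for name, layer in layers.items()}
        top = max(raw.values())
        data[region_name] = {wl: v / top for wl, v in sorted(raw.items())}
    return data
```

  For example, `spectral_spectrum_data(layers, {"region1": (slice(10, 40), slice(20, 60))})` returns one normalized spectrum per analysis region.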
  • the digital camera 10 selects the image layer data used to synthesize visible image data according to the analysis result of the spectral spectrum data created using the pixel value information of the pixels included in regions 1 to 6. At this time, the digital camera 10 selects three image layer data captured by light that has passed through wavelength band filters whose pass bands at least partially overlap between filters with the closest center wavelengths. Details will be described later.
  • FIGS. 5A to 5F are diagrams showing the layer images corresponding to the six bandpass filters. These layer images are generated from the image layer data after the interpolation processing described above.
  • FIG. 5A shows a layer image corresponding to NIR1.
  • FIG. 5B shows a layer image corresponding to NIR2.
  • FIG. 5C shows a layer image corresponding to NIR3.
  • FIG. 5D shows a layer image corresponding to NIR4.
  • FIG. 5E shows a layer image corresponding to NIR5.
  • FIG. 5F shows a layer image corresponding to NIR6.
  • FIG. 6 is a diagram showing an intensity distribution for each wavelength in the analysis region.
  • the horizontal axis represents wavelength [nm]
  • the vertical axis represents normalized spectral intensity.
  • the spectral intensities in the regions 1 to 6 were calculated.
  • FIG. 6 shows the results of plotting the spectral intensities calculated in each of the regions 1 to 6 in association with the peak wavelengths ⁇ a to ⁇ f of the transmittance of each bandpass filter.
  • the spectral intensities of regions 1 to 6 calculated from the image layer data corresponding to NIR1 are shown at the position of wavelength λa (900 nm); all were 1.0.
  • the spectral intensities of regions 1 to 6 calculated from the image layer data corresponding to NIR2 are shown at the position of wavelength λb (1050 nm); the maximum value is 1.0 in region 1 and the minimum value is 0.88 in region 6.
  • the spectral intensities of regions 1 to 6 calculated from the image layer data corresponding to NIR3 are shown at the position of wavelength λc (1150 nm); the maximum value is 0.80 in region 1 and the minimum value is 0.52 in region 6.
  • the spectral intensities of regions 1 to 6 calculated from the image layer data corresponding to NIR6 are shown at the position of wavelength λf (1600 nm); the maximum value is 0.11 in region 1 and the minimum value is 0.06 in region 6.
  • the analysis unit calculates, for each of the peak wavelengths λa to λf, the difference between the maximum and minimum spectral intensities over all analysis regions. The selection unit then takes as the reference wavelength band filter the bandpass filter whose transmission wavelength band includes the wavelength at which this difference is largest, and selects the corresponding image layer data.
  • the reference wavelength band filter may be referred to as a reference filter
  • image layer data corresponding to the reference wavelength band filter may be referred to as first layer data.
  • the selection unit selects image layer data corresponding to the bandpass filter whose center wavelength is located on the longer wavelength side than the center wavelength of the selected reference filter.
  • image layer data corresponding to a bandpass filter whose center wavelength is located on the longer wavelength side than the center wavelength of the reference filter may be referred to as second layer data.
  • specifically, the selection unit selects the image layer data corresponding to the bandpass filter whose spectral transmittance curve overlaps most with the spectral transmittance curve of the reference filter. As the second layer data, it selects the image layer data corresponding to the bandpass filter that has its center wavelength on the longer wavelength side of the reference filter and the largest overlapping area between its spectral transmittance curve and that of the reference filter. As the third layer data, it selects the image layer data corresponding to the bandpass filter that has its center wavelength on the shorter wavelength side of the reference filter and the largest overlapping area between its spectral transmittance curve and that of the reference filter.
  • the selection unit may select the second layer data and the third layer data not based on the overlapping amount of the spectral transmittance curves but based on the magnitude relationship of the center wavelength (or peak wavelength).
  • in that case, the image layer data corresponding to the wavelength band filter whose center wavelength is on the longer wavelength side of and closest to that of the reference filter is selected as the second layer data, and the image layer data corresponding to the wavelength band filter whose center wavelength is on the shorter wavelength side of and closest to that of the reference filter is selected as the third layer data.
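  • The selection logic above can be sketched as follows. This version uses the nearest-center-wavelength variant for the second and third layer data (the overlap-area variant would need the transmittance curves), and it falls back to a neighboring band when the largest spread occurs at either end of the filter set, mirroring the handling of NIR6 described in the second embodiment below. All names are hypothetical.

```python
def select_layers(spectra):
    """Pick the peak wavelengths of the (third, first/reference, second)
    layer bands from spectral_spectrum_data() output."""
    wavelengths = sorted(next(iter(spectra.values())).keys())

    # Reference band: largest max-min spread across all analysis regions.
    def spread(wl):
        vals = [region[wl] for region in spectra.values()]
        return max(vals) - min(vals)

    i = wavelengths.index(max(wavelengths, key=spread))
    i = min(max(i, 1), len(wavelengths) - 2)  # keep a neighbor on each side
    return wavelengths[i - 1], wavelengths[i], wavelengths[i + 1]
```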
  • based on the analysis result of FIG. 6, the digital camera 10 selects the image layer data corresponding to NIR4 as the first layer data.
  • among NIR5 and NIR6, whose center wavelengths are larger than that of NIR4, the digital camera 10 selects NIR5, the bandpass filter whose spectral transmittance curve has the largest overlapping area with the spectral transmittance curve of NIR4. The digital camera 10 then selects the image layer data corresponding to NIR5 as the second layer data.
  • among NIR1, NIR2, and NIR3, whose center wavelengths are smaller than that of NIR4, the digital camera 10 selects NIR3, the bandpass filter whose spectral transmittance curve has the largest overlapping area with the spectral transmittance curve of NIR4. The digital camera 10 then selects the image layer data corresponding to NIR3 as the third layer data.
  • the digital camera 10 selects the first layer data, the second layer data, and the third layer data.
  • the digital camera 10 selects three image layer data and then assigns RGB color channels to each of them to generate visible image data.
  • FIG. 7A is a diagram showing a visible image obtained by combining selected image layer data.
  • a visible image synthesized from image layer data corresponding to NIR3, NIR4, and NIR5 is shown.
  • a green (G) color channel is assigned to the image layer data corresponding to NIR4 which is the first layer data.
  • a red (R) color channel is assigned to the image layer data corresponding to the second layer data NIR5
  • a blue (B) color channel is assigned to the image layer data corresponding to the third layer data NIR3.
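  • A sketch of the channel assignment just described: red for the second layer data, green for the first (reference) layer data, blue for the third. Scaling the result to 8 bits is an assumption made for display; the patent does not specify the encoding.

```python
import numpy as np

def synthesize_visible(third, first, second):
    """Stack the selected layers as (R <- second, G <- first, B <- third)
    and scale jointly to 8-bit for display (scaling is an assumption)."""
    rgb = np.stack([second, first, third], axis=-1).astype(np.float64)
    rgb -= rgb.min()
    if rgb.max() > 0:
        rgb *= 255.0 / rgb.max()
    return rgb.astype(np.uint8)
```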
  • FIG. 7B is a diagram showing an image for comparison.
  • a near-infrared layer image corresponding to NIR1 which is one of the wavelength band filters is shown.
  • a region indicated by a broken line indicates a region where water exists.
  • in FIG. 7A, a blue hue is obtained in each region where water exists, and the gradation can be seen to change gradually from container 1 toward container 6. That is, the image provides excellent discrimination of the depth of the water.
  • in FIG. 7B, almost no change in gradation can be confirmed in the water regions of containers 1 to 6.
  • water has characteristic absorption bands at 970 nm, 1450 nm, and 1940 nm in the near-infrared wavelength band.
  • as described with reference to FIG. 3, NIR1 has transmittance from about 700 nm, a range including the 970 nm water absorption band, to about 1100 nm; however, 970 nm corresponds to the second overtone of the OH bond, and its absorption is small compared with that at 1450 nm and 1940 nm. Therefore, since little of the near-infrared light in the wavelength band passing through NIR1 is absorbed by water, the difference in the depth of the water in containers 1 to 6 is not detected as a difference in pixel value.
  • in this way, using the spectral intensity information obtained from the image layer data corresponding to each bandpass filter, the wavelength at which the difference between the maximum and minimum spectral intensities over all analysis regions is largest is identified. Then, by selecting three image layer data, including the image layer data corresponding to a bandpass filter having transmittance at that wavelength, and generating color image data, a color image with high color discrimination can be obtained.
  • FIG. 8 is a diagram for explaining an object of color discrimination processing according to an example in the second embodiment.
  • “salt”, “sugar”, and “synthetic sweetener” are indicated by regions surrounded by broken lines, respectively.
  • “sugar” and “synthetic sweetener” are color discrimination targets.
  • region a and region b, rectangular areas set on the color discrimination targets, are used as the analysis regions.
  • FIGS. 9A to 9F are diagrams showing the layer images for each of the six bandpass filters.
  • FIG. 9A shows a layer image corresponding to NIR1.
  • FIG. 9B shows a layer image corresponding to NIR2.
  • FIG. 9C shows a layer image corresponding to NIR3.
  • FIG. 9D shows a layer image corresponding to NIR4.
  • FIG. 9E shows a layer image corresponding to NIR5.
  • FIG. 9F shows a layer image corresponding to NIR6.
  • the luminance reduction of the “synthetic sweetener” image area and the luminance reduction of the “sugar” image area are substantially the same.
  • in other layer images, the luminance decrease of the “sugar” image region is more moderate than that of the “synthetic sweetener” image region.
  • FIG. 10 is a diagram showing an intensity distribution for each wavelength in the analysis region.
  • the horizontal axis represents wavelength [nm]
  • the vertical axis represents normalized spectral intensity.
  • the spectral intensities in the region a and the region b were calculated.
  • FIG. 10 shows the result of plotting the calculated spectral intensity in association with the peak wavelength of the transmittance of each bandpass filter.
  • the spectral intensity in the region a and the region b calculated from the image layer data corresponding to NIR1 is shown at the position of the wavelength ⁇ a (900 nm), the region a is 1.0 and the region b is 0.96.
  • the spectral intensities in the region a and the region b calculated from the image layer data corresponding to NIR2 are shown at the position of the wavelength ⁇ b (1050 nm), the region a is 0.98, and the region b is 1.0.
  • the spectral intensities in region a and region b calculated from the image layer data corresponding to NIR3 are shown at the position of the wavelength λc (1150 nm); region a is 0.92 and region b is 0.92.
  • the spectral intensity in the region a and the region b calculated from the image layer data corresponding to NIR4 is shown at the position of the wavelength ⁇ d (1300 nm), the region a is 0.78, and the region b is 0.8.
  • the spectral intensity in the region a and the region b calculated from the image layer data corresponding to NIR5 is shown at the position of the wavelength ⁇ e (1450 nm), the region a is 0.4, and the region b is 0.5.
  • the spectral intensities in the region a and the region b calculated from the image layer data corresponding to NIR6 are shown at the position of the wavelength ⁇ f (1600 nm), where the region a is 0.3 and the region b is 0.18.
  • the digital camera 10 selects image layer data captured by light from the subject that has passed through the wavelength band filters corresponding to wavelengths at which the magnitude relationship of the spectral intensities between at least two of the plurality of partial regions analyzed by the analysis unit is reversed.
  • the digital camera 10 selects, as two of the three image layer data, the image layer data corresponding to the bandpass filters whose center wavelengths are λe and λf. Specifically, the digital camera 10 selects three image layer data including the image layer data corresponding to NIR5 and NIR6, the bandpass filters whose transmittance peaks are at 1450 nm and 1600 nm.
  • either NIR5 or NIR6 can be selected as the reference filter described with reference to FIG. 6.
  • here, because no bandpass filter has its center wavelength on the longer wavelength side of NIR6, the digital camera 10 selects NIR5 as the reference filter. Therefore, the digital camera 10 selects the image layer data corresponding to NIR5 as the first layer data, and then selects the image layer data corresponding to NIR6 as the second layer data.
  • as the third layer data, the digital camera 10 selects, as described with reference to FIG. 6, the image layer data corresponding to the bandpass filter whose spectral transmittance curve has the largest overlapping area with that of the reference filter. Specifically, NIR4 is selected from among the bandpass filters whose center wavelengths are smaller than that of NIR5, and the image layer data corresponding to NIR4 is selected as the third layer data. RGB color channels are then assigned to the image layer data corresponding to NIR4, NIR5, and NIR6 by the same method described with reference to FIG. 7A to generate visible image data.
  • in the above description, NIR5 is selected as the reference filter, but NIR6 can also be selected. Which bandpass filter to select as the reference filter may be determined by which side has the larger difference in spectral intensity between region a and region b. For example, when the difference in spectral intensity on the long wavelength side, at NIR6, is larger than the difference on the short wavelength side, at NIR5, NIR6 is selected as the reference filter.
  • FIG. 11 is a diagram showing the obtained image.
  • a color image having a blue hue in the “sugar” region and a blue-green (cyan) hue in the “synthetic sweetener” region was obtained.
  • color image data with improved discrimination ability can be generated by changing the hue.
  • FIG. 12 is a diagram illustrating an example of an interface for setting an analysis region.
  • FIG. 12A shows an example of display on the display unit 209 in the initial state of setting the analysis region.
  • the display unit 209 displays an area selection image 301 and a selection cursor 302 for setting an analysis area.
  • the display unit 209 displays messages such as “Analysis area setting”, “Move the selection cursor to select the analysis area”, “After selecting the analysis area, press the OK button”, and “When all analysis areas have been specified, press the Finish button”, together with a “Finish” button.
  • FIG. 12B shows an example of display on the display unit 209 in a state where the analysis region is being set.
  • the display unit 209 displays a selection determination display 303 indicating the region for which selection has been determined.
  • the image generation unit generates, from the six image layer data after the interpolation processing described above, the image data of a region selection image 301 that is displayed for the user to set analysis regions. Specifically, the image generation unit generates image data in which the pixel values of the image layer data are averaged for each pixel. In the following description, image data obtained by averaging pixel values for each pixel across the image layer data may be referred to as averaged image data. As shown in FIGS. 12A and 12B, the control unit 201 displays the generated image data on the display unit 209 as the region selection image 301.
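  • A sketch of the averaged image data: a per-pixel mean over the six interpolated layers, reusing the hypothetical `layers` dictionary from the earlier sketches.

```python
import numpy as np

def averaged_image(layers):
    """Per-pixel mean over the interpolated layers, usable as the
    region selection image 301."""
    return np.mean(np.stack(list(layers.values())), axis=0)
```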
  • the user operates the cross key included in the operation unit 208 to move the selection cursor 302 to a region to be subjected to color discrimination.
  • the user presses an OK key included in the operation unit 208.
  • a selection determination display 303 displayed with a solid line is displayed at the position of the selection cursor 302 displayed with a broken line, and the designation of the analysis region is confirmed.
  • a plurality of analysis areas are designated using the cross key and the OK button of the operation unit 208.
  • the control unit 201 sets a plurality of areas designated by the user as analysis areas. This completes the setting of the analysis area.
  • a plurality of areas in a predetermined arrangement may be set as analysis areas in a wide area designated by the user.
  • as the predetermined arrangement, an arrangement of one or more pixels placed at predetermined intervals in the two-dimensional directions can be cited.
  • in that case, the control unit 201 determines the target region and sets a plurality of areas in the predetermined arrangement within it as analysis areas. Note that the target region need not be designated by the user; a subject region extracted from the image by object recognition in the image processing unit 205 may be used. Further, when the entire image is the color discrimination target range, the control unit 201 may set a plurality of areas in a predetermined arrangement over the entire image as analysis areas.
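  • One plausible reading of the predetermined arrangement, as a sketch: tile the target region with small analysis regions spaced at fixed two-dimensional intervals. The helper and its parameters are invented for illustration.

```python
def grid_regions(rows, cols, step=16, size=4):
    """Tile a target area (given as row/column slices) with size x size
    analysis regions spaced `step` pixels apart in both directions."""
    regions = {}
    for r in range(rows.start, rows.stop - size + 1, step):
        for c in range(cols.start, cols.stop - size + 1, step):
            regions[f"r{r}_c{c}"] = (slice(r, r + size), slice(c, c + size))
    return regions

# e.g. grid_regions(slice(0, 480), slice(0, 640)) covers a whole VGA frame.
```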
  • FIG. 13 is a flowchart showing a processing procedure in the digital camera 10.
  • the program shown in this flow is stored in the system memory 206, and is realized by the control unit 201 calling and executing the program.
  • This flow is started when, for example, the digital camera 10 is turned on and becomes ready for photographing.
  • all elements constituting the digital camera 10 operate under the control of the control unit 201. Further, this flow will be described using the first embodiment.
  • the control unit 201 controls the digital camera 10 to acquire image data of the subject image and store it in the work memory 203 (step S101).
  • the imaging data includes image layer data including missing pixels corresponding to the six bandpass filters.
  • the image processing unit 205 reads imaging data from the work memory 203. Then, the image processing unit 205 extracts image layer data corresponding to each bandpass filter from the imaging data. Then, the image processing unit 205 performs interpolation processing of missing pixels in the pixel value data for each of the extracted image layer data, and generates image layer data having pixel value data in all pixels (step S102). The generated image layer data is stored in the work memory 203.
  • the image processing unit 205 generates image data of a region selection image 301 as an image for the user to set an analysis region from the image layer data generated in step S102 (step S103). Then, the image data of the area selection image 301 is stored in the work memory 203.
  • the control unit 201 calls the image data of the region selection image 301 from the work memory 203 and displays the region selection image 301 on the display unit 209. Then, as described with reference to FIG. 12, the control unit 201 receives an instruction from the user, and sets a plurality of partial areas as analysis areas on the area selection image 301 (step S104).
  • control unit 201 calculates the integral value of the pixel value of the pixel in the analysis region for each image layer data, creates spectral spectrum data, and stores it in the work memory 203.
  • the control unit 201 analyzes the spectral data (step S105). Specifically, as described with reference to FIG. 6, the control unit 201 determines the wavelength at which the difference between the maximum value and the minimum value of the spectrum intensity is the largest in all set analysis regions.
  • the control unit 201 selects, as a reference filter, a bandpass filter that includes the wavelength determined by analyzing the spectral data in step S105 in the transmission wavelength band. Then, using the image layer data corresponding to the reference filter as a reference, three image layer data are selected from the six image layer data (step S106). Specifically, as described with reference to FIG. 6, image layer data corresponding to a bandpass filter having the largest overlap of spectral transmittance curves with respect to the spectral transmittance curve of the reference filter is selected.
  • under the control of the control unit 201, the image processing unit 205 generates color image data from the three selected image layer data (step S107). Specifically, color image data is generated by assigning RGB channels to the three image layer data. The image processing unit 205 then stores the generated color image data in the work memory 203. Finally, the control unit 201 causes the display unit 209 to display the generated color image data as a color image via the LCD drive circuit 210 (step S108), and this flow ends.
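  • Tying the earlier sketches together, steps S102 to S107 of FIG. 13 could look as follows, with the interactive region setting of S104 replaced by a `regions` argument. This is an illustrative composition of the hypothetical helpers above, not the patent's implementation.

```python
def color_discrimination_pipeline(mosaic, regions):
    """S102: extract and interpolate layers; S105: analyze spectra;
    S106: select three layers; S107: synthesize a color image."""
    layers = {name: interpolate_layer(layer)
              for name, layer in extract_layers(mosaic).items()}
    spectra = spectral_spectrum_data(layers, regions)
    third_wl, first_wl, second_wl = select_layers(spectra)
    by_wl = {wl: name for name, wl in PEAK_WAVELENGTHS.items()}
    return synthesize_visible(layers[by_wl[third_wl]],
                              layers[by_wl[first_wl]],
                              layers[by_wl[second_wl]])
```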
  • the digital camera 10 having six bandpass filters has been described as an example, but it is only necessary to have three or more bandpass filters.
  • in the above, an example was described in which three image layer data are selected from the image layer data corresponding to the six bandpass filters to generate visible image data. However, it suffices that the number of selected image layer data is smaller than the number of types of bandpass filters provided. For example, to generate visible image data, four or five image layer data may be selected from the image layer data corresponding to the six bandpass filters. When four or more image layer data are selected, other color channels such as yellow (Y) and cyan (C) may be assigned to the fourth and subsequent image layer data. On the other hand, two image layer data may be selected and a color channel assigned to each.
  • that is, the selection unit can select, from the invisible imaging data (image layer data) corresponding to n types (n is a natural number of 3 or more) of bandpass filters, the invisible imaging data (image layer data) corresponding to m types (m is a natural number of 2 or more and less than n) of bandpass filters.
  • green (G) is assigned to the first layer data corresponding to the reference filter
  • red (R) is assigned to the second layer data
  • blue (B) is assigned to the third layer data.
  • the color channel assignment is not limited to this example, and the color channel assignment can be changed as appropriate from the viewpoint of realizing good color discrimination.
  • color channels other than RGB, such as cyan (C), magenta (M), and yellow (Y), may also be assigned.
  • in the above description, a multiband configuration in which one type of bandpass filter is arranged on each pixel of the image sensor 100 was used as an example; alternatively, one type of bandpass filter may be applied to all pixels for each single shot.
  • the setting unit receives an instruction from the user and sets the analysis region.
  • the setting unit may be configured to set, as analysis regions, regions in which a predetermined feature amount satisfies a predetermined condition. For example, by calculating difference data between each image layer data and the averaged image data, regions in which the change in pixel value, as the feature amount, exceeds a predetermined threshold across all image layer data may be set automatically as analysis regions.
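  • A sketch of that automatic variant, using the hypothetical helpers above: pixels whose deviation from the averaged image exceeds a threshold in every layer become analysis-area candidates. The thresholding rule is one reading of the text, not a specified algorithm.

```python
import numpy as np

def auto_analysis_mask(layers, threshold):
    """Boolean mask of pixels whose absolute difference from the averaged
    image exceeds `threshold` in all layers."""
    avg = averaged_image(layers)
    diffs = np.stack([np.abs(layer - avg) for layer in layers.values()])
    return (diffs > threshold).all(axis=0)
```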
  • further, the selection may be made so as to include the image layer data corresponding to each bandpass filter whose transmission wavelength band includes a wavelength at which the difference between the maximum and minimum spectral intensities exceeds a predetermined threshold. In this case, a plurality of sets of three image layer data can be determined from the image data acquired in one shot, so that a plurality of color images can be generated.
  • alternatively, the reference may be the image layer data corresponding to a bandpass filter whose transmission wavelength band includes a wavelength at which the variance of the spectral intensity exceeds a predetermined threshold.
  • the digital camera 10, as an example of the image processing apparatus, includes the photographic lens 20, the image sensor 100, the image processing unit 205, and so on as a configuration for obtaining invisible imaging data, but need not include these components. Invisible imaging data generated by another imaging device, such as another digital camera, may be acquired instead.
  • the digital camera 10 has been described using an example in which it performs the processing through to the generation of visible image data from the selected invisible imaging data. Alternatively, the digital camera 10 may be configured to execute the processing only up to the selection of invisible imaging data, with steps S107 and S108 described with reference to FIG. 13 executed by an external image processing apparatus.
  • in this case, the digital camera 10 adds, as metadata in the header of the selected invisible imaging data, information indicating that the data was selected by the selection unit.
  • the processing unit adds the metadata to the invisible imaging data.
  • the image processing unit 205 functions as a processing unit.
  • the image processing unit 205 adds, as metadata, information that can identify the first layer data, the second layer data, and the third layer data to the three selected image layer data.
  • the image processing unit 205 stores the image layer data to which the metadata is added in the memory card 220.
  • the user can read the image layer data stored in the memory card 220 with an external image processing apparatus, perform color channel assignment processing, and generate visible image data.
  • the image layer data may be exchanged with an external image processing apparatus via the communication unit 211.
  • the image processing unit 205 may discard the invisible imaging data that was not selected, thereby saving the storage capacity of the memory card 220.
  • the near-infrared wavelength band is described as an example of the invisible wavelength band, but the same processing can be applied to other wavelength bands.
  • the same processing can be applied in the ultraviolet wavelength band of 200 nm to 380 nm.

Landscapes

  • Physics & Mathematics (AREA)
  • Spectroscopy & Molecular Physics (AREA)
  • General Physics & Mathematics (AREA)
  • Analytical Chemistry (AREA)
  • Theoretical Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Chemical & Material Sciences (AREA)
  • Mathematical Physics (AREA)
  • Biochemistry (AREA)
  • General Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Immunology (AREA)
  • Pathology (AREA)
  • Investigating Or Analysing Materials By Optical Means (AREA)
  • Spectrometry And Color Measurement (AREA)

Abstract

The present invention provides an image processing device comprising: a data acquisition unit that acquires invisible imaging data captured by light from a subject that has passed through each of n types (n being a natural number of 3 or more) of wavelength band filters whose pass bands lie at invisible wavelengths; a setting unit that sets a plurality of partial areas in an image represented by the invisible imaging data; an analysis unit that analyzes, from the invisible imaging data, the light intensity in each pass band for each of the plurality of partial areas; a selection unit that selects m types (m being a natural number of 2 or more and less than n) of invisible imaging data on the basis of the analysis results of the analysis unit; and an image generation unit that associates visible wavelength bands, at least partially different from one another, with each of the selected types of invisible imaging data to generate visible image data.
PCT/JP2016/078121 2015-09-25 2016-09-23 Image processing device, image processing program, imaging device, and imaging program WO2017051909A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2017540934A JP6645504B2 (ja) 2015-09-25 2016-09-23 Image processing device, image processing program, imaging device, and imaging program

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2015-188692 2015-09-25
JP2015188692 2015-09-25

Publications (1)

Publication Number Publication Date
WO2017051909A1 true WO2017051909A1 (fr) 2017-03-30

Family

ID=58386047

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2016/078121 WO2017051909A1 (fr) 2015-09-25 2016-09-23 Image processing device, image processing program, imaging device, and imaging program

Country Status (2)

Country Link
JP (1) JP6645504B2 (fr)
WO (1) WO2017051909A1 (fr)


Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004003878A (ja) * 2002-04-30 2004-01-08 Kobe Steel Ltd Plant activity measuring device and method
JP2013113802A (ja) * 2011-11-30 2013-06-10 Sumitomo Electric Ind Ltd Object detection device and object detection method
WO2014065121A1 (fr) * 2012-10-25 2014-05-01 コニカミノルタ株式会社 Two-dimensional image data processing device, two-dimensional color luminance measuring device, two-dimensional image data processing method, and two-dimensional image data processing program

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
MOTOHARU FUJIGAKI ET AL.: "Trial of False-color Infrared Camera System for Evaluation of Plant Activity", HEISEI 12 NENDO SHUKI TAIKAI KOEN GAIYOSHU, 8 November 2000 (2000-11-08), pages 101 - 104 *

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2019128295A (ja) * 2018-01-25 2019-08-01 国立研究開発法人産業技術総合研究所 Imaging device, imaging system, and imaging method

Also Published As

Publication number Publication date
JP6645504B2 (ja) 2020-02-14
JPWO2017051909A1 (ja) 2018-06-14

Similar Documents

Publication Publication Date Title
JP6455604B2 (ja) Imaging device, imaging program, and imaging method
US9494768B2 (en) Image capturing module and image capturing apparatus
JP5976676B2 (ja) Imaging system using longitudinal chromatic aberration of lens unit and method of operating the same
US10645268B2 (en) Image processing method and apparatus of terminal, and terminal
US9288457B2 (en) Image processing device, method of processing image, and image processing program including false color correction
JP2007318753A (ja) Image capturing apparatus and method of operating the same
WO2012164934A1 (fr) Image processing device, image processing method, and program
CN107534759A (zh) Imaging device, imaging method, and program
WO2016052101A1 (fr) Imaging device and image data generation program
JP2017011633A (ja) Imaging device
US10334185B2 (en) Image capturing device, signal separation device, and image capturing method
US20110149126A1 (en) Multiband image pickup method and device
JP6645504B2 (ja) Image processing device, image processing program, imaging device, and imaging program
JP5108013B2 (ja) Color imaging element, imaging device using the same, and filter
JP2007043312A (ja) Photographing apparatus
US10602112B2 (en) Image processing apparatus
JP5874334B2 (ja) Image processing device, imaging device, image processing program, and control program for imaging device
JP6384595B2 (ja) Imaging device, data generation device, and image processing device
CN108024035B (zh) Imaging device and method
CN112335233B (zh) Image generation device and imaging device
JP6019611B2 (ja) Imaging device
JP6896053B2 (ja) System and method for generating HDR monochrome images of fluorescing fluorophores, in particular for microscopes and endoscopes
JP5028119B2 (ja) Image processing device, image processing method, and imaging device
JP2017118293A (ja) Image processing device, imaging device, image processing method, image processing program, and storage medium
JP5918956B2 (ja) Image display device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16848697

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2017540934

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16848697

Country of ref document: EP

Kind code of ref document: A1