WO2022163671A1 - データ処理装置、方法及びプログラム並びに光学素子、撮影光学系及び撮影装置 - Google Patents
データ処理装置、方法及びプログラム並びに光学素子、撮影光学系及び撮影装置 Download PDFInfo
- Publication number
- WO2022163671A1 WO2022163671A1 PCT/JP2022/002756 JP2022002756W WO2022163671A1 WO 2022163671 A1 WO2022163671 A1 WO 2022163671A1 JP 2022002756 W JP2022002756 W JP 2022002756W WO 2022163671 A1 WO2022163671 A1 WO 2022163671A1
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- wavelength
- data
- map
- subject
- data processing
- Prior art date
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01J—MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
- G01J3/00—Spectrometry; Spectrophotometry; Monochromators; Measuring colours
- G01J3/28—Investigating the spectrum
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/10—Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01J—MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
- G01J3/00—Spectrometry; Spectrophotometry; Monochromators; Measuring colours
- G01J3/02—Details
- G01J3/0205—Optical elements not provided otherwise, e.g. optical manifolds, diffusers, windows
- G01J3/0224—Optical elements not provided otherwise, e.g. optical manifolds, diffusers, windows using polarising or depolarising elements
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01J—MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
- G01J3/00—Spectrometry; Spectrophotometry; Monochromators; Measuring colours
- G01J3/28—Investigating the spectrum
- G01J3/2803—Investigating the spectrum using photoelectric array detector
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01J—MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
- G01J3/00—Spectrometry; Spectrophotometry; Monochromators; Measuring colours
- G01J3/46—Measurement of colour; Colour measuring devices, e.g. colorimeters
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/17—Systems in which incident light is modified in accordance with the properties of the material investigated
- G01N21/25—Colour; Spectral properties, i.e. comparison of effect of material on the light at two or more different wavelengths or wavelength bands
- G01N21/27—Colour; Spectral properties, i.e. comparison of effect of material on the light at two or more different wavelengths or wavelength bands using photo-electric detection ; circuits for computing concentration
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/84—Systems specially adapted for particular applications
- G01N21/88—Investigating the presence of flaws or contamination
- G01N21/8851—Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/84—Systems specially adapted for particular applications
- G01N21/88—Investigating the presence of flaws or contamination
- G01N21/94—Investigating contamination, e.g. dust
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/10—Image acquisition
- G06V10/12—Details of acquisition arrangements; Constructional details thereof
- G06V10/14—Optical characteristics of the device performing the acquisition or on the illumination arrangements
- G06V10/143—Sensing or illuminating at different wavelengths
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
- G06V10/58—Extraction of image or video features relating to hyperspectral data
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/10—Terrestrial scenes
- G06V20/188—Vegetation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/60—Type of objects
- G06V20/68—Food, e.g. fruit or vegetables
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01J—MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
- G01J3/00—Spectrometry; Spectrophotometry; Monochromators; Measuring colours
- G01J3/28—Investigating the spectrum
- G01J3/2823—Imaging spectrometer
- G01J2003/2826—Multispectral imaging, e.g. filter imaging
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/84—Systems specially adapted for particular applications
- G01N21/88—Investigating the presence of flaws or contamination
- G01N21/8851—Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges
- G01N2021/8887—Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges based on image processing techniques
Definitions
- the present invention relates to a data processing device, method, program, optical element, imaging optical system, and imaging device, and more particularly to selection of wavelengths suitable for detection of objects to be detected.
- There are hyperspectral cameras that can perform spectral sensing using more than 100 wavelengths.
- Since this type of hyperspectral camera measures many wavelengths, it is common to sense objects by searching for wavelengths at which reflection and absorption change rapidly (for example, by finding peak wavelengths from the second derivative of the spectral reflectance in the wavelength direction).
- Patent Document 1 describes a foreign matter inspection device that inspects an object to be inspected, such as food, cosmetics, or medicine, for foreign matter such as hair.
- This foreign matter inspection apparatus irradiates the inspection object with light in a first wavelength band and light in a second wavelength band, in which the relative relationship between the luminance value of the foreign matter and the luminance value of the inspection object differs, acquires a first spectral image based on the luminance values of the first wavelength band and a second spectral image based on the luminance values of the second wavelength band, and performs an arithmetic operation on the acquired first spectral image and second spectral image to acquire a foreign matter extraction image in which the foreign matter contained in the inspection object can be distinguished from the inspection object.
- One embodiment according to the technology of the present disclosure provides a data processing device, method, program, optical element, imaging optical system, and imaging apparatus that can easily identify a wavelength combination of a first wavelength and a second wavelength suitable for detecting a detection target among a plurality of subjects including the detection target.
- A data processing device according to one aspect of the present invention comprises a processor, wherein the processor performs: a data acquisition process of acquiring spectral data of a plurality of subjects; a calculation process of calculating intensity characteristics at a first wavelength and a second wavelength, selected from the wavelength range of the acquired spectral data of the plurality of subjects, based on the relationship between the two wavelengths; a data conversion process of converting the intensity characteristics calculated by the calculation process into identification data of a specific subject with respect to the wavelength range; and an output process of outputting the identification data to the outside.
- the intensity characteristics are preferably intensity differences and/or intensity ratios.
- Preferably, where A(λ) denotes the spectral data of a first subject among the spectral data of the plurality of subjects, B(λ) denotes the spectral data of a second subject, λ1 denotes the first wavelength to be selected, and λ2 denotes the second wavelength, the intensity difference and/or the intensity ratio are calculated by the following [Formula 1] and/or [Formula 2].
- the identification data is a first map representing changes in intensity characteristics with wavelength as a variable.
- the first map is a two-dimensional map
- the coordinate axes of the two-dimensional map are the first wavelength and the second wavelength.
- Preferably, the destination of the external output of the identification data is a display device, and the processor performs a process of accepting a specific position on the first map displayed on the display device designated by a user instruction, and a process of specifying, based on the accepted specific position, a wavelength combination of the first wavelength and the second wavelength to be used for detecting the detection target among the plurality of subjects.
- Preferably, the processor performs a process of extracting, from the first map, one or more positions whose intensity characteristics exceed a threshold, and a process of specifying, based on the extracted positions, one or more wavelength combinations of the first wavelength and the second wavelength to be used for detecting the detection target among the plurality of subjects.
- Preferably, the identification data is output externally to a display device, and the processor superimposes, on the first map displayed on the display device, candidates for specific positions corresponding to one or more wavelength combinations of the first wavelength and the second wavelength, accepts a specific position selected by a user instruction from among the candidates, and specifies the wavelength combination of the first wavelength and the second wavelength based on the accepted specific position.
- Preferably, the plurality of subjects includes a first subject, a second subject, and a third subject; the data acquisition process acquires spectral data of the first subject, spectral data of the second subject, and spectral data of the third subject; the calculation process calculates two or more intensity characteristics, including the intensity characteristics at the first wavelength and the second wavelength between the spectral data of the first subject and the spectral data of the second subject and between the spectral data of the second subject and the spectral data of the third subject; and the data conversion process converts the two or more intensity characteristics into two or more identification data.
- The invention according to a tenth aspect is an optical element having a first wavelength selection element and a second wavelength selection element, wherein the first wavelength selection element transmits the wavelength band of the first wavelength specified by the data processing device according to any one of the sixth to eighth aspects, and the second wavelength selection element transmits the wavelength band of the second wavelength specified by the data processing device according to any one of the sixth to eighth aspects.
- An invention according to an eleventh aspect is a photographing optical system in which the optical element of the tenth aspect is arranged at or near the pupil position.
- The invention according to a twelfth aspect is an imaging device that comprises the imaging optical system of the eleventh aspect and captures the first optical image transmitted through the first wavelength selection element and the second optical image transmitted through the second wavelength selection element, which are formed by the imaging optical system.
- Another aspect of the present invention is a data processing method including: a data acquisition step of acquiring spectral data of a plurality of subjects; a calculation step of calculating intensity characteristics at a first wavelength and a second wavelength, selected from the wavelength range of the acquired spectral data, based on the relationship between the two wavelengths; a data conversion step of converting the intensity characteristics calculated in the calculation step into identification data of a specific subject with respect to the wavelength range; and an output step of outputting the identification data to the outside, wherein a processor executes the processing of each step.
- the intensity characteristics are preferably intensity differences and/or intensity ratios.
- Preferably, where A(λ) denotes the spectral data of the first subject among the spectral data of the plurality of subjects, B(λ) denotes the spectral data of the second subject, λ1 denotes the first wavelength to be selected, and λ2 denotes the second wavelength, the intensity difference and/or the intensity ratio are calculated by the following [Formula 1] and/or [Formula 2].
- the identification data is preferably a first map representing changes in intensity characteristics with wavelength as a variable.
- the first map is a two-dimensional map and the coordinate axes of the two-dimensional map are the first wavelength and the second wavelength.
- Preferably, the destination of the external output of the identification data is a display device, and the method includes a step of accepting a specific position on the first map displayed on the display device designated by a user instruction, and a specifying step of specifying, based on the accepted position, a wavelength combination of the first wavelength and the second wavelength to be used for detecting the detection target among the plurality of subjects.
- Preferably, the destination of the external output of the identification data is a display device, and the method includes a step of superimposing, on the first map displayed on the display device, candidates for specific positions corresponding to one or more wavelength combinations of the first wavelength and the second wavelength, a step of accepting a specific position selected by a user instruction from among the specific position candidates, and a step of specifying the wavelength combination of the first wavelength and the second wavelength based on the accepted specific position.
- Preferably, the plurality of subjects includes a first subject, a second subject, and a third subject; the data acquisition step acquires spectral data of the first subject, spectral data of the second subject, and spectral data of the third subject; the calculation step calculates two or more intensity characteristics, including the intensity characteristics at the first wavelength and the second wavelength between the spectral data of the first subject and the spectral data of the second subject and between the spectral data of the second subject and the spectral data of the third subject; and the data conversion step converts the two or more intensity characteristics into two or more identification data.
- Preferably, the specifying step specifies, based on the first map, two or more wavelength combinations of the first wavelength and the second wavelength to be used for detecting the detection target among the plurality of subjects; the image acquisition step acquires, for each of the two or more wavelength combinations, a first image of a wavelength band including the first wavelength and a second image of a wavelength band including the second wavelength; and the second-map creating step creates a second map for each of the two or more wavelength combinations and synthesizes the two or more created second maps into one second map.
- the data processing method preferably includes the step of detecting the detection target based on the created second map.
- A twenty-fifth aspect of the invention is a data processing program that causes a computer to realize: a function of acquiring spectral data of a plurality of subjects; a function of calculating intensity characteristics at a first wavelength and a second wavelength, selected from the wavelength range of the acquired spectral data, based on the relationship between the two wavelengths; a function of converting the calculated intensity characteristics into identification data of a specific subject with respect to the wavelength range; and a function of outputting the identification data to an external device.
- FIG. 1 is a functional block diagram showing a first embodiment of a data processing device according to the present invention.
- FIG. 2 is a diagram showing a plurality of subjects including a detection target photographed by a hyperspectral camera.
- FIG. 3 is a graph showing spectral data for paper, leaf, and insect, respectively.
- FIG. 4 is a diagram showing an example of a first map showing intensity distribution of intensity characteristics calculated from spectral data of paper and leaf.
- FIG. 5 is a diagram showing an example of a first map showing an intensity distribution of intensity characteristics calculated from spectral data of leaves and insects.
- FIG. 6 is a diagram showing an example of a second map showing the difference or ratio between the first image and the second image acquired from the multispectral camera that captured the subjects shown in FIG. 2.
- FIG. 7 is a diagram showing a plurality of subjects including other detection targets photographed by a hyperspectral camera.
- FIG. 8 is a graph showing spectral data of the paper and the two types of powdered soil, respectively.
- FIG. 9 is a diagram showing an example of a first map showing an intensity distribution of intensity characteristics calculated from spectral data of two types of soil.
- FIG. 10 is a diagram showing an example of a second map showing the difference or ratio between the first image and the second image acquired from the multispectral camera that captured the subjects shown in FIG. 7.
- FIG. 11 is a diagram showing a plurality of subjects including still another detection target photographed by the hyperspectral camera.
- FIG. 12 is a graph showing spectral data for paper, peanut skins, and peanut kernels, respectively.
- FIG. 13 is a diagram showing an example of a first map showing intensity distribution of intensity characteristics calculated from spectral data of peanut skin and fruit.
- FIG. 14 is a diagram showing an example of one second map obtained by synthesizing three second maps created from the first, second, and third images acquired from the multispectral camera that captured the subjects shown in FIG. 11.
- FIG. 15 is a diagram showing how three second maps are combined to create one second map.
- FIG. 16 is a functional block diagram showing a second embodiment of the data processing device according to the present invention.
- FIG. 17 is a schematic diagram showing an example of a multispectral camera.
- FIG. 18 is a flow chart showing the first embodiment of the data processing method according to the present invention.
- FIG. 19 is a flow chart showing a second embodiment of the data processing method according to the invention.
- FIG. 20 is a flow chart showing a third embodiment of the data processing method according to the invention.
- FIG. 1 is a functional block diagram showing a first embodiment of a data processing device according to the present invention.
- the data processing device 10-1 of the first embodiment can be configured by a personal computer, workstation, or the like having hardware such as a processor, memory, and input/output interface.
- The processor is composed of a CPU (Central Processing Unit) and the like, controls each part of the data processing device 10-1, and functions as a data acquisition unit 12, an intensity characteristic calculation unit 14, a data conversion unit 16, an output unit 18, a user instruction reception unit 20, and a wavelength combination specifying unit 22.
- the data acquisition unit 12 performs data acquisition processing for acquiring spectral data of a plurality of subjects.
- The data acquisition unit 12 acquires the spectral data of a plurality of subjects directly from the hyperspectral camera 1 that has photographed a plurality of subjects with different spectral reflectances, or indirectly via a recording medium, network, or the like.
- FIG. 2 is a diagram showing a plurality of subjects including a detection target photographed by a hyperspectral camera.
- The plurality of subjects shown in FIG. 2 are paper 2A as the first subject, a leaf 2B as the second subject, and an insect 2C as the third subject; the leaf 2B is placed on the paper 2A, and the insect 2C is on the leaf 2B.
- When acquiring spectral data of a plurality of subjects from the hyperspectral camera 1, for example, the plurality of subjects shown in FIG. 2 are photographed by the hyperspectral camera 1.
- When the user designates the area of the paper 2A, the area of the leaf 2B, and the area of the insect 2C on the monitor screen of the hyperspectral camera 1, spectral data indicating the spectral reflectance of the paper 2A, spectral data indicating the spectral reflectance of the leaf 2B, and spectral data indicating the spectral reflectance of the insect 2C are acquired from the hyperspectral camera 1.
- In this example, the insect 2C is the object to be detected, and the present invention relates to a technique for finding a combination of two wavelengths at which the changes in reflection and absorption of the detection target (insect 2C) relative to the background objects (paper 2A and leaf 2B) are large.
- Fig. 3 is a graph showing the spectral data of paper, leaf, and insect, respectively.
- The vertical axis of the graph shows the spectral reflectance normalized so that the reflectance of a reference plate is 1.
- the spectral reflectance is the average value or median value of the reflectance over the entire or partial region of the detection target.
- the data acquisition unit 12 acquires the spectral data A( ⁇ ) of the paper 2A, the spectral data B( ⁇ ) of the leaf 2B, and the spectral data C( ⁇ ) of the insect 2C shown in FIG.
- The spectral data of each subject is not limited to being obtained from the hyperspectral camera 1; for example, if the spectral data of a subject (including some of the subjects) is already known, the known spectral data may be used.
- The intensity characteristic calculation unit 14 performs a calculation process of calculating the intensity characteristics at a first wavelength and a second wavelength, which are selected from the wavelength range of the spectral data of the plurality of subjects acquired by the data acquisition unit 12, based on the relationship between the two wavelengths.
- the intensity characteristics at the first and second wavelengths are used to evaluate the sensing performance of the object, and the larger the intensity characteristics, the easier it is to identify the object. Note that the first wavelength and the second wavelength may have a width.
- the intensity characteristic at the first and second wavelengths is the intensity difference and/or intensity ratio of the spectra at the first and second wavelengths.
- Let A(λ) be the spectral data of the first subject (paper 2A), B(λ) be the spectral data of the second subject (leaf 2B), λ1 be the first wavelength to be combined, and λ2 be the second wavelength.
- The intensity characteristic calculation unit 14 obtains the spectral reflectance data A(λ1) and A(λ2) at the first wavelength λ1 and the second wavelength λ2 from the spectral data A(λ) of the paper 2A, and obtains the spectral reflectance data B(λ1) and B(λ2) at the first wavelength λ1 and the second wavelength λ2 from the spectral data B(λ) of the leaf 2B.
- When calculating the intensity difference as the intensity characteristic, the intensity characteristic calculation unit 14 calculates the difference between the spectral reflectance data A(λ1) and A(λ2) of the paper 2A at the first wavelength λ1 and the second wavelength λ2 and the difference between the spectral reflectance data B(λ1) and B(λ2) of the leaf 2B at the first wavelength λ1 and the second wavelength λ2 (that is, it compares the paper 2A and the leaf 2B at the two wavelengths λ1 and λ2). In this case, it is preferable to divide each difference by the sum of the spectral reflectance data at the first wavelength λ1 and the second wavelength λ2 in order to cancel the effects of illumination unevenness, shadows, and the like.
- That is, the intensity characteristic calculation unit 14 calculates the intensity difference from the spectral reflectance data A(λ1) and A(λ2) of the paper 2A and the spectral reflectance data B(λ1) and B(λ2) of the leaf 2B at the first wavelength λ1 and the second wavelength λ2 by the following [Equation 1].
- Alternatively, the intensity characteristic calculation unit 14 can calculate the intensity difference by the following [Equation 2].
- The intensity characteristic calculation unit 14 calculates an intensity difference and/or an intensity ratio for each possible wavelength combination of the first wavelength λ1 and the second wavelength λ2.
- In this way, the intensity difference and/or intensity ratio for evaluating the sensing performance between the two subjects, the paper 2A and the leaf 2B, is calculated, and likewise the intensity difference and/or intensity ratio for evaluating the sensing performance between the leaf 2B and the insect 2C is calculated. Furthermore, when it is necessary to distinguish between the paper 2A and the insect 2C, the intensity difference and/or intensity ratio for evaluating the sensing performance between the paper 2A and the insect 2C is also calculated.
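- As an illustration of this calculation, the sketch below evaluates a normalized intensity difference and an intensity ratio for every combination of two wavelengths and collects the results into a two-dimensional array. It is a minimal Python sketch assuming spectral data sampled on a common wavelength grid; the exact forms of the patent's [Equation 1] and [Equation 2] are not reproduced in this text, so the formulas used here (a normalized difference and a ratio of normalized differences) and all function names are illustrative assumptions.
```python
import numpy as np

def normalized_diff(s, i1, i2, eps=1e-9):
    # (S(lambda1) - S(lambda2)) / (S(lambda1) + S(lambda2)); dividing by the
    # sum is intended to cancel illumination unevenness and shadows.
    return (s[i1] - s[i2]) / (s[i1] + s[i2] + eps)

def intensity_difference(a, b, i1, i2):
    # Intensity difference between subject A (e.g. paper 2A) and subject B
    # (e.g. leaf 2B) at the wavelength pair (lambda1, lambda2); illustrative form.
    return abs(normalized_diff(a, i1, i2) - normalized_diff(b, i1, i2))

def intensity_ratio(a, b, i1, i2, eps=1e-9):
    # Intensity ratio at the same wavelength pair; illustrative form.
    return abs(normalized_diff(a, i1, i2) / (normalized_diff(b, i1, i2) + eps))

def first_map(a, b, characteristic=intensity_difference):
    # Evaluate the intensity characteristic for every combination of the
    # first wavelength lambda1 and the second wavelength lambda2, yielding
    # the two-dimensional "first map" whose coordinate axes are the wavelengths.
    n = len(a)
    m = np.zeros((n, n))
    for i in range(n):          # index of lambda1
        for j in range(n):      # index of lambda2
            m[i, j] = characteristic(a, b, i, j)
    return m
```
- For the subjects of FIG. 2, such a map would be computed once for the paper/leaf pair and once for the leaf/insect pair, corresponding to FIGS. 4 and 5.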
- The intensity characteristics calculated by the intensity characteristic calculation unit 14 are output to the data conversion unit 16.
- the data conversion unit 16 performs data conversion processing to convert the calculated intensity characteristics into identification data of the specific subject for the wavelength range.
- the identification data is a map (first map) representing changes in intensity characteristics with wavelength as a variable.
- the first map is a two-dimensional map, and the coordinate axes of the two-dimensional map are the first wavelength and the second wavelength.
- FIG. 4 is a diagram showing an example of a first map showing intensity distribution of intensity characteristics calculated from spectral data of paper and leaves.
- The first map shown in FIG. 4(A) is a heat map in which color and density vary with the magnitude of the intensity characteristic, and the first map shown in FIG. 4(B) is a contour map corresponding to the magnitude of the intensity characteristic. In the heat map shown in FIG. 4(A), the intensity characteristic is large in areas with high brightness (white), and in the contour map shown in FIG. 4(B), the intensity characteristic is large in the regions enclosed by contour lines with large values.
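- The following is a small sketch of how such a first map could be rendered in the two presentations of FIG. 4, assuming matplotlib and a two-dimensional array such as the one produced by the earlier sketch; the axis orientation and color map are illustrative choices.
```python
import matplotlib.pyplot as plt

def show_first_map(m, wavelengths):
    # Left: heat map (brighter = larger intensity characteristic),
    # right: contour map, both with lambda1 / lambda2 as coordinate axes.
    fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(10, 4))
    extent = [wavelengths[0], wavelengths[-1], wavelengths[0], wavelengths[-1]]
    im = ax1.imshow(m, origin="lower", extent=extent, cmap="gray")
    fig.colorbar(im, ax=ax1)
    cs = ax2.contour(wavelengths, wavelengths, m)
    ax2.clabel(cs, inline=True, fontsize=8)
    for ax in (ax1, ax2):
        ax.set_xlabel("second wavelength lambda2 [nm]")
        ax.set_ylabel("first wavelength lambda1 [nm]")
    plt.show()
```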
- FIG. 5 is a diagram showing an example of a first map showing the intensity distribution of intensity characteristics calculated from spectral data of leaves and insects.
- The first map shown in FIG. 5(A) is a heat map in which color and density vary with the magnitude of the intensity characteristic, and the first map shown in FIG. 5(B) is a contour map corresponding to the magnitude of the intensity characteristic.
- The two first maps (the first maps shown in FIGS. 4 and 5), which are the identification data converted by the data conversion unit 16, are output to the output unit 18.
- the output unit 18 is an output processing unit that outputs the identification data (first map) input from the data conversion unit 16 to the outside.
- the destination of the external output in this example is the display device 30 .
- As a result, the first maps shown in FIGS. 4 and 5 are displayed on the display device 30.
- The user instruction reception unit 20 receives a specific position on the first map designated by the user through the operation unit 32, which is a pointing device such as a mouse operated by the user.
- On the first map shown in FIG. 4, positions at which the paper 2A and the leaf 2B are not distinguished (positions with lower intensity characteristics) are preferable, while on the first map shown in FIG. 5, positions that discriminate between the leaf 2B and the insect 2C (positions with higher intensity characteristics) are preferable.
- The user finds, from the two first maps displayed on the display device 30, a specific position that does not distinguish between the paper 2A and the leaf 2B but distinguishes between the leaf 2B and the insect 2C, and designates it using the operation unit 32.
- the user instruction receiving unit 20 receives the specific position on the first map designated by the user in this way.
- The wavelength combination specifying unit 22 has the wavelength information of each coordinate axis of the first map, and based on the specific position received by the user instruction reception unit 20, specifies the wavelength combination of the first wavelength λ1 and the second wavelength λ2 to be used for detecting the detection target.
- Using a multispectral camera having a first wavelength selection element (first bandpass filter) that transmits light in a wavelength band including the first wavelength λ1 and a second wavelength selection element (second bandpass filter) that transmits light in a wavelength band including the second wavelength λ2, the plurality of subjects are photographed, and a first image in the wavelength band including the first wavelength λ1 and a second image in the wavelength band including the second wavelength λ2 are acquired simultaneously.
- FIG. 6 is a diagram showing an example of a second map showing the difference or ratio between the first image and the second image acquired from the multispectral camera that captured the subjects shown in FIG. 2.
- the contrast between the insect 2C and the background paper 2A and leaf 2B becomes clear, and the insect 2C can be detected with high accuracy by using the second map.
- FIG. 7 is a diagram showing a plurality of subjects including other detection targets photographed by a hyperspectral camera.
- the plurality of subjects shown in FIG. 7 are two types of powdered soil 4B and 4C placed on a background paper 4A.
- Soil 4B is soil of OK quality
- soil 4C is soil of NG quality.
- Each spectral data of two types of soil 4B and 4C is acquired from the hyperspectral camera 1 by the data acquisition unit 12 (Fig. 1).
- Fig. 8 is a graph showing the spectral data of the paper and the two types of powdered soil, respectively.
- The vertical axis of the graph shows the spectral reflectance normalized so that the reflectance of a reference plate is 1.
- the spectral reflectance is the average value or median value of the reflectance over the entire or partial region of the detection target.
- D( ⁇ ) indicates spectral data of paper 4A
- E( ⁇ ) indicates spectral data of soil 4B
- F( ⁇ ) indicates spectral data of soil 4C.
- the spectral data E( ⁇ ) and F( ⁇ ) of the two types of soils 4B and 4C are nearly identical, making it difficult to distinguish between the two types of soils 4B and 4C with the naked eye.
- The intensity characteristic calculation unit 14 calculates the intensity characteristics at the first wavelength λ1 and the second wavelength λ2, selected from the wavelength range of the spectral data E(λ) and F(λ) of the soils 4B and 4C, based on the relationship between the two wavelengths, and the data conversion unit 16 converts the calculated intensity characteristics into a first map.
- FIG. 9 is a diagram showing an example of a first map showing intensity distribution of intensity characteristics calculated from spectrum data of two types of soil.
- The first map shown in FIG. 9(A) is a heat map in which color and density vary with the magnitude of the intensity characteristic, and the first map shown in FIG. 9(B) is a contour map corresponding to the magnitude of the intensity characteristic.
- the user designates a specific position with a large intensity characteristic.
- the specific position designated by the user is the position marked with the asterisk marker M2.
- The subjects are then photographed by a multispectral camera corresponding to the specified wavelength combination, and a first image of a wavelength band including the first wavelength λ1 and a second image of a wavelength band including the second wavelength λ2 are acquired simultaneously.
- A difference or ratio between the first image and the second image of the respective wavelength bands acquired by the multispectral camera is calculated, and a second map showing the calculated difference or ratio is created.
- FIG. 10 is a diagram showing an example of a second map showing the difference or ratio between the first image and the second image acquired from the multispectral camera that captured the subject shown in FIG.
- For the second map, there are the following methods of expressing the difference between the first image and the second image.
- the difference image is (wavelength image ( ⁇ 1) ⁇ wavelength image ( ⁇ 2)) ⁇ (wavelength image ( ⁇ 1)+ wavelength image ( ⁇ 2)), and the difference image is displayed as a heat map.
- Alternatively, the wavelength image (λ1) is assigned to the R channel of an R (red), G (green), B (blue) color image and the wavelength image (λ2) is assigned to the G channel to represent a pseudo-color image. Note that the assignment of the wavelength image (λ1) and the wavelength image (λ2) to the R, G, and B channels of the color image is not limited to the above example.
- the heat map generated from the wavelength image ( ⁇ 1) and the wavelength image ( ⁇ 2) or the pseudo-color image can be used to easily distinguish between the subject of the detection target and other subjects.
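- A minimal sketch of both presentations follows, assuming two co-registered wavelength images held as NumPy arrays; the normalized-difference form is the one given above, the channel assignment (λ1 to R, λ2 to G) is the example assignment just described, and names and normalization constants are illustrative.
```python
import numpy as np

def second_map(img_l1, img_l2, eps=1e-6):
    # Normalized difference image:
    # (wavelength image(lambda1) - wavelength image(lambda2)) divided by
    # (wavelength image(lambda1) + wavelength image(lambda2)).
    img_l1 = img_l1.astype(np.float64)
    img_l2 = img_l2.astype(np.float64)
    return (img_l1 - img_l2) / (img_l1 + img_l2 + eps)

def pseudo_color(img_l1, img_l2):
    # Assign the lambda1 image to the R channel and the lambda2 image to the
    # G channel (B left at zero); other channel assignments are possible.
    h, w = img_l1.shape
    rgb = np.zeros((h, w, 3), dtype=np.float64)
    rgb[..., 0] = img_l1 / (img_l1.max() + 1e-6)
    rgb[..., 1] = img_l2 / (img_l2.max() + 1e-6)
    return rgb
```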
- FIG. 11 is a diagram showing a plurality of subjects including still another detection target photographed by a hyperspectral camera.
- The plurality of subjects shown in FIG. 11 are peanut skins 6B and peanut kernels 6C placed on background paper 6A.
- the spectral data of the peanut skin 6B and the peanut fruit 6C are acquired from the hyperspectral camera 1 by the data acquisition unit 12 (Fig. 1).
- FIG. 12 is a graph showing spectral data of paper, peanut skin, and peanut kernel, respectively.
- The vertical axis of the graph shows the spectral reflectance normalized so that the reflectance of a reference plate is 1.
- the spectral reflectance is the average value or median value of the reflectance over the entire or partial region of the detection target.
- G( ⁇ ) indicates spectral data of paper 6A
- H( ⁇ ) indicates spectral data of peanut skin 6B
- I( ⁇ ) indicates spectral data of peanut kernel 6C.
- the peanut skin 6B and the peanut fruit 6C have substantially the same color and are difficult to distinguish with the naked eye.
- The intensity characteristic calculation unit 14 calculates the intensity characteristics at the first wavelength λ1 and the second wavelength λ2, selected from the wavelength range of the spectral data H(λ) and I(λ) of the peanut skin 6B and the peanut kernel 6C, based on the relationship between the two wavelengths, and the data conversion unit 16 converts the calculated intensity characteristics into a first map.
- FIG. 13 is a diagram showing an example of a first map showing intensity distribution of intensity characteristics calculated from spectral data of peanut skin and fruit.
- The first map shown in FIG. 13(A) is a heat map in which color and density vary with the magnitude of the intensity characteristic, and the first map shown in FIG. 13(B) is a contour map corresponding to the magnitude of the intensity characteristic.
- the user indicates a plurality of specific positions in order to enhance the sensing performance.
- the plurality of (three) specific positions specified by the user are the positions marked with star markers M3, M4, and M5.
- The wavelengths specified by the wavelength combinations are three wavelengths (570 nm, 690 nm, and 930 nm). Normally, three wavelength combinations would involve six wavelengths (2 wavelengths × 3), but because the combinations share common wavelengths, only three wavelengths (570 nm, 690 nm, and 930 nm) are specified.
- A first image of a wavelength band including the wavelength 570 nm, a second image of a wavelength band including the wavelength 690 nm, and a third image of a wavelength band including the wavelength 930 nm are acquired simultaneously by a multispectral camera having three corresponding wavelength selection elements.
- FIG. 14 is a diagram showing an example of one second map obtained by synthesizing three second maps created from the first, second, and third images acquired from the multispectral camera that captured the subjects shown in FIG. 11.
- FIG. 15 is a diagram showing how three second maps are combined to create one second map.
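- The text does not spell out how the three second maps are synthesized into one; purely as an assumption, a simple pixel-wise combination such as the maximum of the individual maps could be used, so that a region standing out in any one wavelength pair remains visible in the combined map.
```python
import numpy as np

def synthesize_second_maps(maps):
    # Combine several second maps into one by taking the pixel-wise maximum
    # of their absolute values; an illustrative choice, not the patent's method.
    stack = np.stack([np.abs(m) for m in maps], axis=0)
    return stack.max(axis=0)
```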
- FIG. 16 is a functional block diagram showing a second embodiment of the data processing device according to the present invention.
- The data processing device 10-2 of the second embodiment shown in FIG. 16 differs from the data processing device 10-1 of the first embodiment in that it is provided with a position extraction unit 24 instead of the user instruction reception unit 20.
- The first map indicating the intensity distribution of the intensity characteristics is supplied from the data conversion unit 16 to the position extraction unit 24, and the position extraction unit 24 performs a process of extracting, from the first map, one or more positions whose intensity characteristics exceed a threshold.
- The position extraction unit 24 can extract one or more positions in the first map by detecting one or more regions whose intensity characteristics exceed the threshold in the first map and obtaining the centroid position of each detected region. Alternatively, within each region exceeding the threshold, the position having the highest intensity characteristic may be used as the position of that region.
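- A minimal sketch of this threshold-and-centroid extraction, assuming SciPy's connected-component labelling, is shown below; the threshold value and data layout are illustrative.
```python
import numpy as np
from scipy import ndimage

def extract_positions(first_map, threshold):
    # Detect regions whose intensity characteristic exceeds the threshold
    # and return the centroid (row, column) of each region.
    mask = first_map > threshold
    labels, n_regions = ndimage.label(mask)
    return ndimage.center_of_mass(first_map, labels, range(1, n_regions + 1))
```
- Each returned (row, column) index can then be mapped through the wavelength axes of the first map to a (λ1, λ2) combination, which is what the wavelength combination specifying unit 22 does next.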
- the position information (coordinate information on the first map) extracted by the position extraction unit 24 is output to the wavelength combination identification unit 22 .
- The wavelength combination specifying unit 22 has the wavelength information of each coordinate axis of the first map, and based on the extracted position information, specifies the wavelength combination of the first wavelength λ1 and the second wavelength λ2 to be used for detecting the detection target.
- the wavelength combination (first wavelength ⁇ 1, second wavelength ⁇ 2) specified by the wavelength combination specifying unit 22 is output and displayed on the display device 30, and can also be output to a recording device and other external devices. When two or more wavelength combinations are specified, output is provided for each specified wavelength combination.
- The data processing device 10-2 of the second embodiment may also automatically determine a plurality of candidates for specific positions (wavelength combinations), superimpose the candidates on the first map, and have the user designate one or more specific positions from among the candidates superimposed and displayed on the first map.
- FIG. 17 is a schematic diagram showing an example of a multispectral camera.
- The multispectral camera (imaging device) 100 shown in FIG. 17 includes an imaging optical system 110 having lenses 110A and 110B, a filter unit 120, an image sensor 130, and a signal processing unit 140.
- the filter unit 120 is composed of a polarizing filter unit 122 and a bandpass filter unit 124 and is preferably arranged at or near the pupil position of the imaging optical system 110 .
- the polarizing filter unit 122 includes a first polarizing filter 122A and a second polarizing filter 122B that linearly polarize light passing through the first pupil region and the second pupil region of the imaging optical system 110, respectively.
- The polarization directions of the first polarizing filter 122A and the second polarizing filter 122B differ from each other by 90°.
- The bandpass filter unit 124 consists of a first bandpass filter (first wavelength selection element) 124A and a second bandpass filter (second wavelength selection element) 124B that select the wavelength bands of the light passing through the first pupil region and the second pupil region of the imaging optical system 110, respectively; the first bandpass filter 124A selects a wavelength band including one wavelength (the first wavelength) of the specified wavelength combination, and the second bandpass filter 124B selects a wavelength band including the other wavelength (the second wavelength) of the specified wavelength combination.
- the light transmitted through the first pupil region of the imaging optical system 110 is linearly polarized by the first polarizing filter 122A, and only the light in the wavelength band including the first wavelength is transmitted by the first bandpass filter 124A.
- The light passing through the second pupil region of the imaging optical system 110 is linearly polarized by the second polarizing filter 122B (in a direction differing by 90° from that of the first polarizing filter 122A), and only the light in the wavelength band including the second wavelength is transmitted by the second bandpass filter 124B.
- The image sensor 130 is configured by regularly arranging a first polarizing filter and a second polarizing filter, whose polarization directions differ from each other by 90°, over a plurality of pixels composed of two-dimensionally arranged photoelectric conversion elements.
- the first polarizing filter 122A and the first polarizing filter of the image sensor 130 have the same polarizing direction
- the second polarizing filter 122B and the second polarizing filter of the image sensor 130 have the same polarizing direction.
- The signal processing unit 140 acquires the first image in the wavelength band selected by the first bandpass filter 124A by reading pixel signals from the pixels of the image sensor 130 in which the first polarizing filter is arranged, and acquires the second image in the wavelength band selected by the second bandpass filter 124B by reading pixel signals from the pixels in which the second polarizing filter is arranged.
- the first image and the second image acquired by the signal processing unit 140 are used for detection of the detection target as described above.
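- As an illustration of this read-out, the sketch below separates the two polarization channels of such a sensor. The column-by-column alternation of the first and second polarizing filters is purely an assumption made for the sketch; the actual pixel arrangement of the image sensor 130 is not specified here.
```python
import numpy as np

def split_polarization_channels(raw):
    # Assumed layout: even columns carry the first polarizing filter
    # (first image, lambda1 band), odd columns the second polarizing filter
    # (second image, lambda2 band).
    first_image = raw[:, 0::2]
    second_image = raw[:, 1::2]
    return first_image, second_image
```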
- The optical element according to the present invention is an optical element fabricated according to the wavelength combination of the first wavelength and the second wavelength specified by the data processing device 10-1 of the first embodiment shown in FIG. 1 or the data processing device 10-2 of the second embodiment shown in FIG. 16.
- The optical element corresponds to the bandpass filter unit 124 arranged in the multispectral camera 100 shown in FIG. 17, and has a first wavelength selection element (first bandpass filter) that transmits light in a wavelength band including the first wavelength specified by the data processing device and a second wavelength selection element (second bandpass filter) that transmits light in a wavelength band including the second wavelength specified by the data processing device.
- the first band-pass filter and the second band-pass filter have the first wavelength and the second wavelength as the center wavelengths, respectively, and have a band width in which the wavelength bands of the transmission wavelengths do not overlap each other.
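- As a small sketch of this non-overlap condition, assuming each bandpass filter is described by a center wavelength and a full bandwidth (the values below are illustrative):
```python
def passbands_overlap(center1, width1, center2, width2):
    # Two passbands overlap when the distance between their center wavelengths
    # is smaller than the sum of their half-widths.
    return abs(center1 - center2) < (width1 + width2) / 2.0

# Example: 750 nm and 950 nm centers with 50 nm bandwidths do not overlap.
assert not passbands_overlap(750, 50, 950, 50)
```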
- the imaging optical system according to the present invention corresponds to the imaging optical system 110 of the multispectral camera 100 shown in FIG.
- This imaging optical system is configured by arranging, at or near the pupil position of the lenses 110A and 110B, an optical element corresponding to the bandpass filter unit 124, which has a first wavelength selection element (first bandpass filter) that transmits light in a wavelength band including the first wavelength specified by the data processing device and a second wavelength selection element (second bandpass filter) that transmits light in a wavelength band including the second wavelength specified by the data processing device.
- the imaging device according to the present invention corresponds to, for example, the multispectral camera 100 shown in FIG.
- the first optical image is an optical image transmitted through the first wavelength selection element of the optical element
- the second optical image is an optical image transmitted through the second wavelength selection element of the optical element
- The first optical image and the second optical image are pupil-divided by the polarizing filter unit 122 (the first polarizing filter 122A and the second polarizing filter 122B), which functions as a pupil dividing section, together with the first polarizing filter and the second polarizing filter on the pixels of the image sensor 130 that correspond to the first polarizing filter 122A and the second polarizing filter 122B, and are captured by the image sensor 130.
- As a result, the multispectral camera 100 can simultaneously acquire a first image corresponding to the first optical image and a second image corresponding to the second optical image, which have different wavelength bands.
- The imaging device is not limited to one having the configuration of the multispectral camera 100 shown in FIG. 17, such as its pupil dividing section; any device may be used as long as it captures the first optical image transmitted through the first wavelength selection element and the second optical image transmitted through the second wavelength selection element, and acquires the first image and the second image corresponding to the first optical image and the second optical image.
- The data processing method according to the present invention is a method for specifying a wavelength combination of a first wavelength and a second wavelength suitable for detecting a desired detection target, and is executed by the processor that performs the processing of each unit of the data processing devices 10-1 and 10-2 shown in FIGS. 1 and 16.
- FIG. 18 is a flow chart showing the first embodiment of the data processing method according to the present invention.
- the processor acquires spectral data of a plurality of subjects (step S10, data acquisition step).
- In step S10, for example, spectral data of a plurality of subjects are acquired from a hyperspectral camera that has photographed the plurality of subjects with different spectral reflectances.
- the plurality of objects are paper 2A, leaf 2B, and insect 2C.
- leaf 2B is placed on paper 2A and insect 2C is on leaf 2B
- Three spectral data A( ⁇ ), B( ⁇ ), and C( ⁇ ) are obtained for paper 2A, leaf 2B, and insect 2C (see FIG. 3).
- The processor calculates the intensity characteristics at the first wavelength λ1 and the second wavelength λ2, which are selected from the wavelength range of the spectral data of the plurality of subjects acquired in step S10, based on the relationship between the two wavelengths (step S12, calculation step).
- the intensity characteristic at the first and second wavelengths is the intensity difference and/or intensity ratio of the spectra at the first and second wavelengths.
- In step S12, the spectral reflectance data A(λ1) and A(λ2) at the first wavelength λ1 and the second wavelength λ2 are obtained from the spectral data A(λ) of the paper 2A, and the spectral reflectance data B(λ1) and B(λ2) at the first wavelength λ1 and the second wavelength λ2 are obtained from the spectral data B(λ) of the leaf 2B. When the intensity difference is calculated as the intensity characteristic, the absolute value of the difference between the spectral reflectance data of the paper 2A and those of the leaf 2B at the first wavelength λ1 and the second wavelength λ2 is calculated, and when the intensity ratio is calculated as the intensity characteristic, the corresponding ratio at the first wavelength λ1 and the second wavelength λ2 is calculated.
- the processor converts the intensity characteristics calculated in step S12 into identification data of the specific subject for the wavelength range (step S14, data conversion step).
- the identification data is a map (first map) representing changes in intensity characteristics with wavelength as a variable.
- the first map is a two-dimensional map, and the coordinate axes of the two-dimensional map are the first wavelength and the second wavelength.
- FIG. 4 is a diagram showing an example of a first map showing intensity distribution of intensity characteristics calculated from spectral data of paper and leaves.
- The first map shown in FIG. 4(A) is a heat map in which color and density vary with the magnitude of the intensity characteristic, and the first map shown in FIG. 4(B) is a contour map corresponding to the magnitude of the intensity characteristic.
- FIG. 5 is a diagram showing an example of a first map showing intensity distribution of intensity characteristics calculated from spectral data of leaves and insects.
- The first map shown in FIG. 5(A) is a heat map in which color and density vary with the magnitude of the intensity characteristic, and the first map shown in FIG. 5(B) is a contour map corresponding to the magnitude of the intensity characteristic.
- the processor outputs the first map, which is the identification data converted in step S14, to the display device 30 (FIG. 1), which is the destination of the external output (step S16, output step).
- the first map shown in FIGS. 4 and 5 is displayed on the display device 30 .
- the processor determines whether or not a specific position on the first map has been designated by the user (step S18). While viewing the first map displayed on the display device 30, the user can designate a specific position (for example, a position with a large intensity characteristic) using a pointing device such as a mouse.
- Preferably, the user finds, from the two first maps displayed on the display device 30, a specific position that does not identify the paper 2A and the leaf 2B but identifies the leaf 2B and the insect 2C, and designates that specific position.
- When the specific position is designated, the processor specifies, based on the specific position, the wavelength combination of the first wavelength and the second wavelength to be used for detecting the detection target among the plurality of subjects (step S20).
- The wavelength combination (750 nm, 950 nm) specified in this way is suitable for detecting the insect 2C, which is the detection target, and information indicating the wavelength combination is output to and displayed on the display device 30, and can also be output to a recording device and other external devices (step S22).
- FIG. 19 is a flow chart showing a second embodiment of the data processing method according to the invention.
- steps common to the data processing method of the first embodiment shown in FIG. 18 are given the same step numbers, and detailed description thereof will be omitted.
- The data processing method of the second embodiment shown in FIG. 19 differs from the data processing method of the first embodiment shown in FIG. 18 in that the process of step S30 is performed instead of the process of step S18 shown in FIG. 18.
- In step S30 shown in FIG. 19, a process of extracting one or more positions where the intensity characteristic exceeds a threshold is performed on the first map showing the intensity distribution of the intensity characteristics.
- In step S30, one or more regions whose intensity characteristics exceed the threshold are detected in the first map, and the centroid position of each detected region is obtained, whereby one or more positions in the first map can be extracted. Alternatively, within each region exceeding the threshold, the position having the highest intensity characteristic may be used as the position of that region.
- the position automatically extracted in this way is set as the specific position on the first map. That is, the automatically extracted specific position can be used instead of the user-instructed specific position of the first embodiment.
- A plurality of specific position candidates may be automatically determined by the method of the second embodiment, the specific position candidates may be superimposed on the first map, and one or more specific positions may be designated by a user instruction from among the specific position candidates superimposed and displayed on the first map.
- FIG. 20 is a flow chart showing a third embodiment of the data processing method according to the invention.
- Using a wavelength combination (for example, 750 nm and 950 nm) specified by the data processing method of the first or second embodiment, a multispectral camera 100 having a first wavelength selection element (first bandpass filter) that transmits a wavelength band including the first wavelength and a second wavelength selection element (second bandpass filter) that transmits a wavelength band including the second wavelength photographs a plurality of subjects including the detection target simultaneously (step S40).
- the processor acquires the first image in the wavelength band including the first wavelength and the second image in the wavelength band including the second wavelength from the multispectral camera 100 (step S42, image acquisition step).
- the processor calculates the difference or ratio between the acquired first image and the second image (step S44), and creates a map (second map) showing the calculated difference or ratio (step S46, second map-making step).
- FIG. 6 is a diagram showing an example of a second map showing the difference or ratio between the first image and the second image acquired from the multispectral camera that captured the subjects shown in FIG. 2.
- the processor detects the detection target based on the created second map (step S48).
- In the second map shown in FIG. 6, the contrast between the insect 2C and the background (paper 2A and leaf 2B) is clear, so the presence and number of insects 2C, for example, can be detected with high accuracy.
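- A sketch of such a detection step, thresholding the second map and counting connected regions (for example, to count the insects 2C), is shown below; the threshold and minimum region size are illustrative assumptions.
```python
import numpy as np
from scipy import ndimage

def detect_targets(second_map, threshold=0.2, min_pixels=10):
    # Threshold the second map, label connected regions, and keep regions
    # larger than min_pixels as detected targets; returns their centroids.
    mask = np.abs(second_map) > threshold
    labels, n = ndimage.label(mask)
    sizes = ndimage.sum(mask, labels, range(1, n + 1))
    keep = [i + 1 for i, s in enumerate(sizes) if s >= min_pixels]
    return ndimage.center_of_mass(mask, labels, keep)
```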
- the plurality of subjects including the detection target is not limited to those of this embodiment, and various other subjects are conceivable.
- two or more wavelength combinations may be specified, in which case a multispectral camera having three or more wavelength selection elements is used.
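When two or more wavelength combinations are specified, a second map can be created for each combination and the maps synthesized into one, as also recited in the claims below; the following minimal sketch assumes the maps are same-shaped NumPy arrays, and the pixel-wise maximum/mean rule is an assumption rather than something stated in the text.

```python
import numpy as np

def synthesize_second_maps(second_maps, how="max"):
    """Sketch of combining per-combination second maps into a single second map
    when two or more wavelength combinations are specified. The synthesis rule
    (pixel-wise maximum or mean) is an illustrative assumption; the text only
    states that the created second maps are synthesized into one."""
    stack = np.stack([np.asarray(m, dtype=np.float64) for m in second_maps])
    return stack.max(axis=0) if how == "max" else stack.mean(axis=0)
```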
- the hardware structure of the processing units that execute the various kinds of processing of the processor constituting the data processing apparatus is any of the following various processors.
- the various processors include a CPU (Central Processing Unit), which is a general-purpose processor that executes software (a program) to function as various processing units; a programmable logic device (PLD), such as an FPGA (Field Programmable Gate Array), which is a processor whose circuit configuration can be changed after manufacture; and a dedicated electric circuit, such as an ASIC (Application Specific Integrated Circuit), which is a processor having a circuit configuration designed exclusively for executing specific processing.
- One processing unit may be configured by one of these various processors, or by a combination of two or more processors of the same type or different types (for example, a plurality of FPGAs, or a combination of a CPU and an FPGA).
- A plurality of processing units may also be configured by one processor; for example, as typified by a system on chip (SoC), a single processor that implements the functions of an entire system on one IC chip may function as the plurality of processing units.
- the various processing units are configured using one or more of the above various processors as a hardware structure.
- the hardware structure of these various processors is, more specifically, an electrical circuit that combines circuit elements such as semiconductor elements.
- the present invention also includes a data processing program that, when installed in a computer, causes the computer to function as a data processing apparatus according to the present invention, and a nonvolatile storage medium that stores this data processing program.
Description
FIG. 1 is a functional block diagram showing a first embodiment of the data processing apparatus according to the present invention.
FIG. 7 is a diagram showing a plurality of subjects, including another detection target, photographed by a hyperspectral camera.
FIG. 16 is a functional block diagram showing a second embodiment of the data processing apparatus according to the present invention.
FIG. 17 is a schematic diagram showing an example of a multispectral camera.
The optical element according to the present invention is an optical element manufactured in accordance with the wavelength combination of the first wavelength and the second wavelength specified by the data processing apparatus 10-1 of the first embodiment shown in FIG. 1 or the data processing apparatus 10-2 of the second embodiment shown in FIG. 16.
The imaging optical system according to the present invention corresponds to the imaging optical system 110 of the multispectral camera 100 shown in FIG. 17. In this imaging optical system, an optical element corresponding to the band-pass filter unit 124, which has a first wavelength selection element (first band-pass filter) that transmits light in a wavelength band including the first wavelength specified by the data processing apparatus and a second wavelength selection element (second band-pass filter) that transmits light in a wavelength band including the second wavelength specified by the data processing apparatus, is disposed at or near the pupil position of the lenses 110A and 110B.
The imaging apparatus according to the present invention corresponds to, for example, the multispectral camera 100 shown in FIG. 17.
The data processing method according to the present invention is a method for specifying a wavelength combination of a first wavelength and a second wavelength suitable for detecting a desired detection target, and is executed by the processor that performs the processing of each unit of the data processing apparatuses 10-1 and 10-2 shown in FIGS. 1 and 16.
FIG. 18 is a flowchart showing the first embodiment of the data processing method according to the present invention.
FIG. 19 is a flowchart showing the second embodiment of the data processing method according to the present invention.
FIG. 20 is a flowchart showing the third embodiment of the data processing method according to the present invention.
2A Paper
2B Leaf
2C Insect
4A, 6A Paper
4B, 4C Soil
6B Peanut shell
6C Peanut kernel
10-1, 10-2 Data processing apparatus
12 Data acquisition unit
14 Intensity characteristic calculation unit
16 Data conversion unit
18 Output unit
20 User instruction reception unit
22 Wavelength combination specifying unit
24 Position extraction unit
30 Display device
32 Operation unit
100 Multispectral camera
110 Imaging optical system
110A Lens
110B Lens
120 Filter unit
122 Polarization filter unit
122A First polarizing filter
122B Second polarizing filter
124 Band-pass filter unit
124A First band-pass filter
124B Second band-pass filter
130 Image sensor
140 Signal processing unit
200 Wavelength selection element design apparatus
M1, M2, M3, M4, M5 Markers
S10 to S22, S30, S40 to S46 Steps
Claims (26)
- A data processing apparatus comprising a processor, wherein the processor performs: data acquisition processing for acquiring spectral data of a plurality of subjects; calculation processing for calculating an intensity characteristic at a first wavelength and a second wavelength, which are selected from a wavelength range of the acquired spectral data of the plurality of subjects, based on a relationship between the two wavelengths of the first wavelength and the second wavelength; data conversion processing for converting the intensity characteristic calculated in the calculation processing into identification data of a specific subject with respect to the wavelength range; and output processing for externally outputting the identification data.
- The data processing apparatus according to claim 1, wherein the intensity characteristic is an intensity difference and/or an intensity ratio.
- The data processing apparatus according to any one of claims 1 to 3, wherein the identification data is a first map representing a change in the intensity characteristic with wavelength as a variable.
- The data processing apparatus according to claim 4, wherein the first map is a two-dimensional map, and coordinate axes of the two-dimensional map are the first wavelength and the second wavelength.
- The data processing apparatus according to claim 5, wherein the destination of the external output of the identification data is a display device, and the processor performs processing of receiving, by a user instruction, a specific position on the first map displayed on the display device, and processing of specifying, based on the specific position, a wavelength combination of the first wavelength and the second wavelength to be used for detecting a detection target among the plurality of subjects.
- The data processing apparatus according to claim 5, wherein the processor performs processing of extracting, from the first map, one or more positions in the first map where the intensity characteristic exceeds a threshold, and processing of specifying, based on the extracted positions, one or more wavelength combinations of the first wavelength and the second wavelength to be used for detecting a detection target among the plurality of subjects.
- The data processing apparatus according to claim 7, wherein the destination of the external output of the identification data is a display device, and the processor performs processing of superimposing and displaying, on the first map displayed on the display device, candidates for a specific position in the one or more wavelength combinations of the first wavelength and the second wavelength, processing of receiving a specific position from the candidates by a user instruction, and processing of specifying the wavelength combination of the first wavelength and the second wavelength based on the received specific position.
- The data processing apparatus according to any one of claims 1 to 8, wherein the plurality of subjects include a first subject, a second subject, and a third subject; the data acquisition processing acquires spectral data of the first subject, spectral data of the second subject, and spectral data of the third subject; the calculation processing calculates two or more of the intensity characteristic at the first wavelength and the second wavelength of the spectral data of the first subject and the spectral data of the second subject, the intensity characteristic at the first wavelength and the second wavelength of the spectral data of the second subject and the spectral data of the third subject, and the intensity characteristic at the first wavelength and the second wavelength of the spectral data of the first subject and the spectral data of the third subject; and the data conversion processing converts the two or more intensity characteristics into two or more pieces of the identification data.
- An optical element having a first wavelength selection element and a second wavelength selection element, wherein the first wavelength selection element transmits the wavelength band of the first wavelength specified by the data processing apparatus according to any one of claims 6 to 8, and the second wavelength selection element transmits the wavelength band of the second wavelength specified by the data processing apparatus according to any one of claims 6 to 8.
- An imaging optical system in which the optical element according to claim 10 is disposed at or near a pupil position.
- An imaging apparatus comprising: the imaging optical system according to claim 11; and an imaging element that captures a first optical image transmitted through the first wavelength selection element and a second optical image transmitted through the second wavelength selection element, the images being formed by the imaging optical system.
- A data processing method comprising: a data acquisition step of acquiring spectral data of a plurality of subjects; a calculation step of calculating an intensity characteristic at a first wavelength and a second wavelength, which are selected from a wavelength range of the acquired spectral data of the plurality of subjects, based on a relationship between the two wavelengths of the first wavelength and the second wavelength; a data conversion step of converting the intensity characteristic calculated in the calculation step into identification data of a specific subject with respect to the wavelength range; and an output step of externally outputting the identification data, wherein a processor executes the processing of each step.
- The data processing method according to claim 13, wherein the intensity characteristic is an intensity difference and/or an intensity ratio.
- The data processing method according to any one of claims 13 to 15, wherein the identification data is a first map representing a change in the intensity characteristic with wavelength as a variable.
- The data processing method according to claim 16, wherein the first map is a two-dimensional map, and coordinate axes of the two-dimensional map are the first wavelength and the second wavelength.
- The data processing method according to claim 17, wherein the destination of the external output of the identification data is a display device, the method comprising: a step of receiving, by a user instruction, a specific position on the first map displayed on the display device; and a specifying step of specifying, based on the specific position, a wavelength combination of the first wavelength and the second wavelength to be used for detecting a detection target among the plurality of subjects.
- The data processing method according to claim 17, comprising: a step of extracting, from the first map, one or more positions in the first map where the intensity characteristic exceeds a threshold; and a specifying step of specifying, based on the extracted positions, one or more wavelength combinations of the first wavelength and the second wavelength to be used for detecting a detection target among the plurality of subjects.
- The data processing method according to claim 19, wherein the destination of the external output of the identification data is a display device, the method comprising: a step of superimposing and displaying, on the first map displayed on the display device, candidates for a specific position in the one or more wavelength combinations of the first wavelength and the second wavelength; a step of receiving a specific position from the candidates by a user instruction; and a step of specifying the wavelength combination of the first wavelength and the second wavelength based on the received specific position.
- The data processing method according to any one of claims 13 to 20, wherein the plurality of subjects include a first subject, a second subject, and a third subject; the data acquisition step acquires spectral data of the first subject, spectral data of the second subject, and spectral data of the third subject; the calculation step calculates two or more of the intensity characteristic at the first wavelength and the second wavelength of the spectral data of the first subject and the spectral data of the second subject, the intensity characteristic at the first wavelength and the second wavelength of the spectral data of the second subject and the spectral data of the third subject, and the intensity characteristic at the first wavelength and the second wavelength of the spectral data of the first subject and the spectral data of the third subject; and the data conversion step converts the two or more intensity characteristics into two or more pieces of the identification data.
- The data processing method according to any one of claims 18 to 20, comprising: an image acquisition step of acquiring a first image in a wavelength band including the first wavelength and a second image in a wavelength band including the second wavelength of the specified wavelength combination; a step of calculating a difference or ratio between the acquired first image and second image; and a second map creation step of creating a second map showing the calculated difference or ratio.
- The data processing method according to claim 22, wherein the specifying step specifies, based on the first map, two or more wavelength combinations of the first wavelength and the second wavelength to be used for detecting the detection target among the plurality of subjects; the image acquisition step acquires, for each of the two or more wavelength combinations, a first image in a wavelength band including the first wavelength and a second image in a wavelength band including the second wavelength; and the second map creation step creates the second map for each of the two or more wavelength combinations, the method comprising a step of synthesizing the two or more created second maps to create one second map.
- The data processing method according to claim 23, comprising a step of detecting the detection target based on the created second map.
- A data processing program causing a computer to implement: a function of acquiring spectral data of a plurality of subjects; a function of calculating an intensity characteristic at a first wavelength and a second wavelength, which are selected from a wavelength range of the acquired spectral data of the plurality of subjects, based on a relationship between the two wavelengths of the first wavelength and the second wavelength; a function of converting the calculated intensity characteristic into identification data of a specific subject with respect to the wavelength range; and a function of externally outputting the identification data.
- A non-transitory computer-readable recording medium on which the program according to claim 25 is recorded.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2022578427A JPWO2022163671A1 (ja) | 2021-01-29 | 2022-01-26 | |
CN202280011204.8A CN116806304A (zh) | 2021-01-29 | 2022-01-26 | 数据处理装置、方法及程序以及光学元件、摄影光学系统及摄影装置 |
EP22745891.6A EP4296638A4 (en) | 2021-01-29 | 2022-01-26 | DATA PROCESSING DEVICE, METHOD AND PROGRAM, OPTICAL ELEMENT, OPTICAL IMAGING SYSTEM AND IMAGING DEVICE |
US18/359,856 US20230370700A1 (en) | 2021-01-29 | 2023-07-26 | Data processing apparatus, data processing method, data processing program, optical element, imaging optical system, and imaging apparatus |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2021013412 | 2021-01-29 | ||
JP2021-013412 | 2021-01-29 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US18/359,856 Continuation US20230370700A1 (en) | 2021-01-29 | 2023-07-26 | Data processing apparatus, data processing method, data processing program, optical element, imaging optical system, and imaging apparatus |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022163671A1 true WO2022163671A1 (ja) | 2022-08-04 |
Family
ID=82653568
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2022/002756 WO2022163671A1 (ja) | 2021-01-29 | 2022-01-26 | データ処理装置、方法及びプログラム並びに光学素子、撮影光学系及び撮影装置 |
Country Status (5)
Country | Link |
---|---|
US (1) | US20230370700A1 (ja) |
EP (1) | EP4296638A4 (ja) |
JP (1) | JPWO2022163671A1 (ja) |
CN (1) | CN116806304A (ja) |
WO (1) | WO2022163671A1 (ja) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2024047944A1 (ja) * | 2022-08-29 | 2024-03-07 | 富士フイルム株式会社 | 校正用部材、筐体装置、校正装置、校正方法、及びプログラム |
WO2024090133A1 (ja) * | 2022-10-27 | 2024-05-02 | 富士フイルム株式会社 | 処理装置、検査装置、処理方法、及びプログラム |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5557414A (en) * | 1993-04-29 | 1996-09-17 | Centre De Recherche Industrielle Du Quebec | Method and apparatus for classifying articles according to their color |
JP2000249661A (ja) * | 1999-03-01 | 2000-09-14 | Topcon Corp | 光学測定装置 |
JP2007178407A (ja) | 2005-12-28 | 2007-07-12 | Yamatake Corp | 検査対象物の異物混入検査方法及びこれに用いる異物混入検査装置 |
JP2010117171A (ja) * | 2008-11-11 | 2010-05-27 | Shimadzu Corp | 分光光度計 |
JP2017053699A (ja) * | 2015-09-09 | 2017-03-16 | 国立大学法人岐阜大学 | 物質判別に用いる近赤外画像撮像用の波長決定方法および近赤外画像を用いた物質判別方法 |
JP2017064405A (ja) * | 2015-09-29 | 2017-04-06 | 住友電気工業株式会社 | 光学測定装置及び光学測定方法 |
JP2018125770A (ja) * | 2017-02-02 | 2018-08-09 | パイオニア株式会社 | 撮像装置、撮像方法、プログラム及び記録媒体 |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5781021B2 (ja) * | 2012-06-14 | 2015-09-16 | キヤノン株式会社 | レンズ装置および撮像装置 |
JP6908793B2 (ja) * | 2018-10-09 | 2021-07-28 | 富士フイルム株式会社 | 撮像装置 |
EP3716136A1 (en) * | 2019-03-26 | 2020-09-30 | Koninklijke Philips N.V. | Tumor boundary reconstruction using hyperspectral imaging |
JP7135109B2 (ja) * | 2018-11-26 | 2022-09-12 | 富士フイルム株式会社 | 撮像装置及び撮像方法 |
CN113966605B (zh) * | 2019-06-11 | 2023-08-18 | 富士胶片株式会社 | 摄像装置 |
Non-Patent Citations (1)
Title |
---|
See also references of EP4296638A4 |
Also Published As
Publication number | Publication date |
---|---|
US20230370700A1 (en) | 2023-11-16 |
JPWO2022163671A1 (ja) | 2022-08-04 |
EP4296638A4 (en) | 2024-07-24 |
CN116806304A (zh) | 2023-09-26 |
EP4296638A1 (en) | 2023-12-27 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11092725B2 (en) | Single-sensor hyperspectral imaging device | |
US8155413B2 (en) | Method and system for analyzing skin conditions using digital images | |
WO2022163671A1 (ja) | データ処理装置、方法及びプログラム並びに光学素子、撮影光学系及び撮影装置 | |
US8810658B2 (en) | Estimating a visible vector representation for pixels in an infrared image | |
JP6322939B2 (ja) | 撮像システム及び色検査システム | |
CN101124462A (zh) | 多光谱和超光谱成像系统 | |
JP2006084425A (ja) | マルチスペクトル画像処理方法及びマルチスペクトル皮膚画像による診断方法 | |
JP2010264276A (ja) | マルチスペクトル皮膚画像による診断方法 | |
WO2021246192A1 (ja) | 信号処理方法、信号処理装置、および撮像システム | |
Nouri et al. | Calibration and test of a hyperspectral imaging prototype for intra-operative surgical assistance | |
JP2016164559A (ja) | 画像色分布検査装置および画像色分布検査方法 | |
JP2014187558A (ja) | 画像色分布検査装置および画像色分布検査方法 | |
JPWO2022163671A5 (ja) | ||
US20230393059A1 (en) | Data processing apparatus, data processing method, data processing program, optical element, imaging optical system, and imaging apparatus | |
KR102350164B1 (ko) | 멀티스펙트럴 이미징 변환 방법 | |
TWI843820B (zh) | 使用多種波長之光單色成像的顏色檢查方法 | |
JP2009076012A (ja) | 物体識別装置 | |
TWI739185B (zh) | 光譜照相裝置與方法 | |
US20240331108A1 (en) | Display condition decision method, display condition decision apparatus, and program | |
WO2022270355A1 (ja) | 撮像システム、撮像システムに用いられる方法、および撮像システムに用いられるコンピュータプログラム | |
US20240129603A1 (en) | Imaging method and program | |
WO2024047944A1 (ja) | 校正用部材、筐体装置、校正装置、校正方法、及びプログラム | |
WO2021075215A1 (ja) | 皮膚外用剤の塗布支援を行うための方法およびシステム | |
US11867615B2 (en) | Field calibration for near real-time Fabry Perot spectral measurements | |
WO2024090133A1 (ja) | 処理装置、検査装置、処理方法、及びプログラム |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 22745891; Country of ref document: EP; Kind code of ref document: A1 |
| WWE | Wipo information: entry into national phase | Ref document number: 202280011204.8; Country of ref document: CN |
| ENP | Entry into the national phase | Ref document number: 2022578427; Country of ref document: JP; Kind code of ref document: A |
| WWE | Wipo information: entry into national phase | Ref document number: 2022745891; Country of ref document: EP |
| NENP | Non-entry into the national phase | Ref country code: DE |
| ENP | Entry into the national phase | Ref document number: 2022745891; Country of ref document: EP; Effective date: 20230829 |