WO2015133476A1 - Spectroradiometer - Google Patents

Spectroradiometer

Info

Publication number
WO2015133476A1
WO2015133476A1 (PCT/JP2015/056218)
Authority
WO
WIPO (PCT)
Prior art keywords
spectral
dimensional
image
color
wavelength
Prior art date
Application number
PCT/JP2015/056218
Other languages
French (fr)
Japanese (ja)
Inventor
直樹 野呂
洋平 高良
史識 安藤
雄大 藤森
Original Assignee
エバ・ジャパン株式会社
直樹 野呂
Priority date
Filing date
Publication date
Application filed by エバ・ジャパン株式会社, 直樹 野呂 filed Critical エバ・ジャパン株式会社
Publication of WO2015133476A1 publication Critical patent/WO2015133476A1/en

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01J MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J3/00 Spectrometry; Spectrophotometry; Monochromators; Measuring colours
    • G01J3/02 Details
    • G01J3/0205 Optical elements not provided otherwise, e.g. optical manifolds, diffusers, windows
    • G01J3/0208 Optical elements not provided otherwise, e.g. optical manifolds, diffusers, windows using focussing or collimating elements, e.g. lenses or mirrors; performing aberration correction
    • G01J3/06 Scanning arrangements; arrangements for order-selection
    • G01J2003/064 Use of other elements for scan, e.g. mirror, fixed grating
    • G01J3/12 Generating the spectrum; Monochromators
    • G01J3/14 Generating the spectrum; Monochromators using refracting elements, e.g. prisms
    • G01J3/28 Investigating the spectrum
    • G01J3/2823 Imaging spectrometer

Definitions

  • the present invention relates to a spectral radiance meter, and more particularly to a spectral radiance meter that measures a region including an object to be measured and acquires color information of the region.
  • color information has conventionally been expressed in color coordinate notations such as CIE XYZ, or as RGB values obtained by multiplying spectral information measured with a spectral radiance meter by the experimentally determined human luminous efficiency (visibility) function.
  • however, these color coordinate notations are based on data obtained by statistically processing the influence of psychological states and individual differences, and therefore cannot accommodate the diversity of individual observers.
  • the color matching functions on which all color spaces are based are characteristics of a virtual observer, the so-called standard observer, derived by statistical processing of psychophysical experiments. In reality there are inter-individual differences, for example due to age-related yellowing of the crystalline lens, and intra-individual variations, for example due to the macular pigment on the retina. When such variations in the color matching functions cannot be ignored, two presented stimuli that the standard observer perceives as the same color may appear as different colors to another observer (observer metamerism).
  • such a technique is disclosed, for example, in Non-Patent Document 1: by reproducing the spectral distribution of the light reflected from a real object on a corresponding multi-primary display using spectral image information (spectral color reproduction), an observer can recognize the real object and the image on the display as the same color.
  • Such technology using spectral image information is expected to be applied to fields that require high-precision color matching between different media, such as color matching for telemedicine, artwork archives, clothing / interior / prints, etc.
  • conventional color analysis displays use different methods depending on the industry and the country, so results are difficult to compare with one another, the display methods are poorly compatible with each other, and they are difficult to handle.
  • a conventional spectral radiance meter can only measure a single point, or the average value over a fixed spatial region (for example, an angle of view of 10° or less), of an object to be measured (hereinafter referred to as the object).
  • in order to obtain spectral information over the two-dimensional space of the object surface, either the spectral radiance meter or the object must therefore be scanned so that an enormous number of points on the surface are measured. In other words, obtaining color information of the object surface takes a very long time, and it has been pointed out that in practice it is essentially impossible.
  • furthermore, because the spectral information within the captured angle of view is averaged, spectral information of the object surface created in this way cannot resolve differences in color information at the level of spatial detail (one pixel), so the object surface cannot be evaluated accurately.
  • the present invention has been made in view of the various problems of the conventional techniques described above, and its object is to provide a spectral radiance meter capable of color display in, and mutual conversion between, arbitrary color systems without loss of information.
  • a further object of the present invention is to provide a spectral radiance meter capable of acquiring spectral information for each pixel of a two-dimensional space wider than the region measurable by a conventional spectral radiance meter, easily and in a short time.
  • in order to achieve the above objects, the present invention captures the entire predetermined measurement region including the object while moving the imaging position by moving the configuration downstream of the objective lens, and acquires, with a two-dimensional imaging detector (area sensor), two-dimensional spectral data having one-dimensional spatial information and one-dimensional wavelength information at each imaging position.
  • the one-dimensional wavelength information is high-wavelength-resolution spectroscopic information (hyperspectral data) obtained by splitting the wavelength range from ultraviolet to infrared (e.g., 200 nm to 13 μm), including visible light, into several tens to several hundreds of bands at a wavelength resolution of 0.1 nm to 100 nm.
  • the acquired two-dimensional spectral data at the individual imaging positions are integrated to create three-dimensional spectral data having two-dimensional spatial information and one-dimensional wavelength information, and a spectral image (hyperspectral image) for each wavelength is created from the three-dimensional spectral data.
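The following is a minimal illustrative sketch (not taken from the patent) of this line-scan integration step: each photographing position contributes one two-dimensional (space × wavelength) frame, and stacking the frames along the scan axis yields the three-dimensional spectral data. Array names and shapes are assumptions for illustration only.

```python
# Illustrative sketch of assembling three-dimensional spectral data from the
# two-dimensional (space x wavelength) frames captured at successive scan positions.
# Names and shapes are assumptions, not the patent's implementation.
import numpy as np

def assemble_cube(frames):
    """frames: list of 2D arrays, one per scan (Y) position, each of shape
    (n_space_z, n_bands). Returns a cube of shape (n_scan_y, n_space_z, n_bands):
    two-dimensional spatial information plus one-dimensional wavelength information."""
    return np.stack(frames, axis=0)

def spectral_image(cube, band_index):
    """Extract the spectral image for one wavelength band as a 2D spatial image."""
    return cube[:, :, band_index]
```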
  • the spectral radiance, spectral luminance, spectral reflectance or spectral transmittance of each pixel of the created spectral image is then calculated, the spectral radiance value of each pixel is converted into stimulus values of a predetermined color system to create an image in which each pixel of the two-dimensional space holds those stimulus values (hereinafter referred to as a "color space image"), and a color analysis image obtained by color calculation processing of the color space image is displayed on the display unit by a predetermined display method.
  • imaging of the entire predetermined measurement region including the object and acquisition of spectral radiance for each pixel of the spectral image acquired by imaging can be executed in a short time.
  • for example, the spectral radiance of each pixel in a wide two-dimensional space can be acquired in a few seconds for a VGA-size image.
  • according to the present invention, by treating the spectral radiance as a common basis for color spaces, color display in, and mutual conversion between, various existing color systems can be performed without losing color information, and accurate color reproduction and information transmission are possible regardless of the illumination light, the display environment, inter-individual differences in visual characteristics, and intra-individual variations.
  • uniform and objective color analysis display can be performed.
  • that is, the present invention is a spectral radiance meter comprising: a two-dimensional imaging detector on which a light beam from the object, having entered through an objective lens, is incident after being dispersed by a dispersion optical element in a direction orthogonal to a predetermined direction, and which acquires a signal based on the incident light beam by photographing; moving means for integrally moving the two-dimensional imaging detector and the configuration downstream of the objective lens in the predetermined direction, so that the imaging position photographed by the two-dimensional imaging detector moves in the predetermined direction; first control means for controlling the timing of photographing of the two-dimensional imaging detector; second control means for controlling the movement of the moving means; spectral data creation means for creating, from the signal at each photographing position, two-dimensional spectral data having one-dimensional spatial information and one-dimensional wavelength information, and for creating, from the two-dimensional spectral data at the respective photographing positions, three-dimensional spectral data having two-dimensional spatial information and one-dimensional wavelength information; first image creation means for creating a spectral image for each wavelength from the three-dimensional spectral data; acquisition means for acquiring the spectral radiance at each pixel of the spectral image; and second image creation means for performing color space conversion processing of the spectral image to create a color space image in a predetermined color system, performing color calculation processing of the color space image to create a color analysis image, and converting the color analysis image into a display by a predetermined display method.
  • the present invention is also the above-described invention, wherein the two-dimensional imaging detector is detachable together with the optical elements including the objective lens and the dispersion optical element, and by exchanging the two-dimensional imaging detector and the optical elements, the wavelength range, the wavelength resolution and the number of spatial pixels that can be photographed can be changed.
  • the moving means is a precision linear motion stage.
  • the present invention is also a spectral radiance meter comprising: a two-dimensional imaging detector on which a light beam from the object, having entered through an objective lens, is incident after being dispersed by a dispersion optical element in a direction orthogonal to a predetermined direction, and which acquires a signal based on the incident light beam by photographing; spectral data creation means for creating two-dimensional spectral data having one-dimensional spatial information and one-dimensional wavelength information, and for creating three-dimensional spectral data having two-dimensional spatial information and one-dimensional wavelength information from the two-dimensional spectral data at each imaging position; first image creation means for creating a spectral image for each wavelength from the three-dimensional spectral data; acquisition means for acquiring the spectral radiance at each pixel of the spectral image; and means for performing color space conversion processing of the spectral image.
  • since the present invention is configured as described above, it has the excellent effect that color display in, and mutual conversion between, arbitrary color systems can be performed without loss of information.
  • likewise, since the present invention is configured as described above, it has the excellent effect that spectral information for each pixel of a wide two-dimensional space can be acquired easily and in a short time.
  • FIG. 1A is a schematic configuration explanatory view of a spectral radiance meter according to the present invention
  • FIG. 1B is a view as seen in the direction of arrow A in FIG. 1A.
  • FIG. 2 is an explanatory diagram of a block configuration of the control unit.
  • FIG. 3 is a flowchart showing a processing routine of imaging processing in the spectral radiance meter.
  • FIG. 4 is a flowchart showing a processing routine of measurement processing in the spectral radiance meter.
  • FIG. 5 is a flowchart showing the processing routine of the calibration process.
  • FIG. 6A is an RGB image of a leaf bundle photographed by the spectral radiance meter according to the present invention
  • FIG. 6B is a spectral image at a predetermined wavelength (band) created by the spectral radiance meter according to the present invention
  • FIG. 6C is a graph showing the spectral reflectance
  • FIG. 6D is an image obtained by converting the spectral image shown in FIG. 6B into a stimulus value (X value) of the XYZ color system
  • FIG. 6E is an image obtained by converting the spectral image shown in FIG. 6B into xy chromaticity coordinates (x value) and displaying it.
  • FIG. 7 is a schematic configuration explanatory diagram of a modification of the spectral radiance meter according to the present invention.
  • FIG. 8 is a schematic configuration explanatory diagram of a modification of the spectral radiance meter according to the present invention.
  • FIG. 9 is a schematic configuration explanatory diagram of a modification of the spectral radiance meter according to the present invention.
  • FIG. 10A is an explanatory diagram of a pair of achromatic prisms arranged in the initial state
  • FIG. 10B is a view taken in the direction of arrow B in FIG. 10A, and FIG. 10C is an explanatory diagram of the pair of achromatic prisms in the state in which they function as a plane-parallel plate
  • FIG. 10D is a view taken in the direction of arrow C in FIG. 10C.
  • FIGS. 11A and 11B are schematic configuration explanatory views of modifications of the spectral radiance meter according to the present invention.
  • FIG. 1A is a schematic structural explanatory view of a spectral radiance meter according to the present invention
  • FIG. 1B is a view as seen in the direction of arrow A in FIG. 1A.
  • FIG. 2 is a block diagram illustrating the control unit.
  • the spectral radiance meter 10 shown in FIG. 1 includes an objective lens 12 that receives a light beam from a predetermined measurement region including an object 200, a precision linear motion stage 14 that moves in the Y-axis direction of an XYZ orthogonal coordinate system, a slit plate 16 disposed so that a slit opening 16a extending in the Z-axis direction is positioned on the image plane of the objective lens 12, a collimating lens 18 that collimates the light beam that has passed through the slit opening 16a, a dispersion optical element 20 that disperses the parallel light from the collimating lens 18 in the Y-axis direction, an imaging lens 22 that forms an image of the light beam emitted from the dispersion optical element 20, a two-dimensional imaging detector 24 arranged so that its detection unit 24a is positioned on the image plane of the imaging lens 22, and a control unit 26 that controls the precision linear motion stage 14 and the two-dimensional imaging detector 24 and processes the information output from the two-dimensional imaging detector 24.
  • the slit plate 16, the collimating lens 18, the dispersion optical element 20, the imaging lens 22 and the two-dimensional imaging detector 24 are fixedly disposed on the precision linear motion stage 14, and therefore move integrally in the Y-axis direction as the precision linear motion stage 14 moves in the Y-axis direction.
  • the slit plate 16 is disposed so that the light beam from the objective lens 12 passes through a slit opening 16a extending in the Z-axis direction.
  • the dispersion optical element 20 can use, for example, a diffraction grating, a prism, or a grism.
  • the grism is a direct-view diffraction grating that combines a transmission diffraction grating and a prism.
  • the diffraction grating, prism, or grism is disposed so as to disperse the incident light beam in the X-axis direction perpendicular to the Z-axis direction that is the extension direction of the slit opening 16a.
  • the objective lens 12, the slit plate 16, the collimating lens 18, the dispersion optical element 20, and the imaging lens 22 are configured to be exchangeable as appropriate.
  • the two-dimensional imaging detector 24 is an area sensor in which the detection unit 24a is arranged in parallel with the Z axis and the Y axis (that is, parallel to the primary image plane and parallel to the YZ plane).
  • the two-dimensional imaging detector 24 has a replaceable configuration, and the two-dimensional imaging detector 24 used in the spectral radiance meter 10 according to the present invention has a spatial pixel number of 1 to 29 million pixels.
  • the obtainable wavelength range is 200 nm to 13 ⁇ m, and the wavelength resolution is 0.1 nm to 100 nm.
  • the wavelength range, wavelength resolution, and number of spatial pixels in the acquired spectral image can be changed.
  • by replacing the two-dimensional imaging detector 24 with one having a different wavelength range, wavelength resolution, number of spatial pixels and the like, and by appropriately replacing the objective lens 12, the slit plate 16, the collimating lens 18, the dispersion optical element 20 and the imaging lens 22, the wavelength range, the wavelength resolution, the number of spatial pixels and the like that can be photographed can be changed.
  • the control unit 26 is connected to the precision linear motion stage 14 and the two-dimensional imaging detector 24, and is configured by, for example, a microcomputer or a general-purpose personal computer.
  • the control unit 26 includes an imaging control unit 30 that controls the timing at which the two-dimensional imaging detector 24 acquires an electrical signal (that is, the timing of imaging), a movement control unit 32 that controls the direction, amount and timing of the movement of the precision linear motion stage 14 in the Y-axis direction, a spectral data creation unit 34 that creates spectral data based on the electrical signal from the two-dimensional imaging detector 24, and an image creation unit 36 that creates a spectral image based on the spectral data created by the spectral data creation unit 34, converts the created spectral image into a predetermined color system, performs color calculation processing to obtain a color analysis image, and outputs the image processed by a predetermined display method to a display unit (not shown).
  • the imaging control unit 30 causes the two-dimensional imaging detector 24 to perform imaging (that is, to acquire an electrical signal), and outputs information indicating that photographing has been performed to the movement control unit 32.
  • the imaging control unit 30 also determines whether the precision linear motion stage 14 has been moved until the imaging position in the predetermined measurement region including the object 200 reaches the imaging end position; when the movement control unit 32 outputs information indicating that the stage has moved to the imaging end position, the imaging control unit 30 performs a final photographing and then terminates the photographing process.
  • the movement control unit 32 sequentially moves the precision linear motion stage 14 in the Y-axis direction at a predetermined interval corresponding to the slit width, within a predetermined imaging region including the one end 200c and the other end 200d of the object 200 in the Y-axis direction; the photographing position in the predetermined measurement region is thereby moved in the Y-axis direction.
  • specifically, when the operator gives an instruction to start photographing, the movement control unit 32 moves the precision linear motion stage 14 so that the photographing position of the two-dimensional imaging detector 24 becomes the photographing start position, and outputs information indicating that the precision linear motion stage 14 has been moved to the imaging control unit 30.
  • the movement control unit 32 then moves the precision linear motion stage 14 at a predetermined interval corresponding to the slit width of the slit opening 16a, and outputs information indicating that the precision linear motion stage 14 has been moved to the imaging control unit 30.
  • when the stage has been moved to the photographing end position, the movement control unit 32 outputs information indicating that it has moved to the photographing end position to the imaging control unit 30.
  • the spectral data creation unit 34 creates two-dimensional spectral data having one-dimensional spatial information and one-dimensional wavelength information based on the electrical signal output from the two-dimensional imaging detector 24, and the created two-dimensional spectral data is output to and stored in the storage unit 52 provided in the control unit 26.
  • the one-dimensional wavelength information created based on the electrical signal output from the two-dimensional imaging detector 24 is high-wavelength-resolution spectroscopic information in which a predetermined wavelength range within 200 nm to 13 μm is split into several hundred bands at a predetermined wavelength resolution within 0.1 nm to 100 nm.
  • when imaging at all imaging positions is completed, the spectral data creation unit 34 uses the two-dimensional spectral data stored in the storage unit 52 to generate three-dimensional spectral data having two-dimensional spatial information and one-dimensional wavelength information (high-wavelength-resolution spectroscopic information), and the generated three-dimensional spectral data is output to the storage unit 52 and stored therein.
  • the image creation unit 36 is composed of a spectral image creation unit 38 that creates a spectral image for each wavelength based on the three-dimensional spectral data created by the spectral data creation unit 34, and a color analysis image creation unit 40 that converts the spectral image created by the spectral image creation unit 38 into a predetermined color system, performs color calculation processing to create a color analysis image, and performs processing for displaying the created color analysis image by a predetermined display method.
  • the spectral image creation unit 38 acquires a spectral image for each band (each wavelength) based on the three-dimensional spectral data, which is spectrally divided into several hundred bands over a predetermined wavelength range within 200 nm to 13 μm at a predetermined wavelength resolution within 0.1 nm to 100 nm.
  • the color analysis image creation unit 40 is composed of a calibration processing unit 42 that performs, on the spectral image created by the spectral image creation unit 38, calibration processing consisting of dark noise correction processing, inter-pixel sensitivity deviation correction processing, luminance calibration processing and light source correction processing; a calculation unit 44 that calculates the spectral luminance (cd/(m²·nm)) from the spectral radiance (W/(sr·m²·nm)) obtained in the luminance calibration processing and also calculates the sensitivity correction coefficient, the luminance calibration coefficient, the spectral reflectance and the like described later; and a color analysis image acquisition unit 50 that converts the spectral image for each wavelength, for which the spectral radiance of each pixel has been calculated, into a predetermined color system, performs color calculation processing to generate a color analysis image, and performs processing for displaying the generated color analysis image by a predetermined display method.
  • the calibration processing unit 42 performs dark noise correction (dark correction) processing for removing noise caused by dark current in the two-dimensional imaging detector 24. In order to correct the sensitivity deviation of each pixel of the two-dimensional imaging detector 24, it then performs inter-pixel sensitivity deviation correction processing on the dark-corrected spectral image using the sensitivity correction coefficient. It further performs luminance calibration processing, using the luminance calibration coefficient, on the spectral image that has undergone the inter-pixel sensitivity deviation correction processing. Finally, it performs light source correction processing for correcting the spatial unevenness of the illumination by the light source light in the dark-corrected spectral image and obtaining the spectral reflectance or the spectral transmittance of the object 200.
  • the dark noise correction data used in the dark noise correction processing is spectral image data for each wavelength created from data obtained by photographing with the two-dimensional imaging detector 24 in a light-blocked state before the measurement of the object 200. The photographing procedure at this time is the same as the photographing process described later.
  • the sensitivity correction coefficient is created from three-dimensional spectral data obtained by photographing, before the measurement of the object 200, light source light whose spatial radiance distribution has been made uniform by an integrating sphere or the like (hereinafter referred to as a "uniform standard light source"). The photographing procedure at this time is the same as the photographing process described later.
  • specifically, in the spectral image for each wavelength based on the three-dimensional spectral data obtained by photographing the uniform standard light source with the two-dimensional imaging detector 24, the output value of a designated pixel (a single pixel or the average of a plurality of pixels) is taken as a reference value, and the correction coefficient for each pixel is calculated by dividing this reference value by the output value of that pixel; the sensitivity correction coefficient is thus created as spectral data having a correction coefficient for each pixel of the spectral image for each wavelength. The calculation of the sensitivity correction coefficient is performed by the calculation unit 44, and the result is output to and stored in the storage unit 52.
  • the luminance calibration coefficient is one-dimensional data created from data obtained by photographing, before the measurement of the object 200, light source light whose spectral radiance values are known (hereinafter referred to as a "spectral radiance standard light source"). The photographing procedure at this time is the same as the photographing process described later.
  • specifically, the luminance calibration coefficient is one-dimensional data corresponding to each wavelength: the known spectral radiance value of the spectral radiance standard light source at each wavelength is divided by the output value of a designated pixel (a single pixel or the average of a plurality of pixels) in the spectral image for that wavelength obtained by photographing the spectral radiance standard light source, thereby creating a luminance calibration coefficient having a conversion coefficient for each wavelength. The calculation of the luminance calibration coefficient is performed by the calculation unit 44, and the result is output to and stored in the storage unit 52.
  • the light source data is created as three-dimensional spectral data acquired by photographing, before the observation of the object 200, the light reflected when the light source light irradiates a reflection standard such as a standard white plate.
  • here, the light source light is light from a light source that is provided separately from the spectral radiance meter and is used for measurement with the spectral radiance meter.
  • the spectral reflectance or spectral transmittance R(λ) of each pixel in the spectral image for each wavelength (λ nm) is expressed by the following equation:
  • R(λ) = C(λ) / E(λ)
  • where C(λ) is the output value of the pixel in the spectral image of the object and E(λ) is the output value of the corresponding pixel in the light source data.
  • the calculation unit 44 calculates the spectral luminance L(λ) (cd/(m²·nm)) from the spectral radiance Le(λ) (W/(sr·m²·nm)) in the spectral image for each wavelength processed by the calibration processing unit 42, using the following equation:
  • L ( ⁇ ) Km ⁇ Le ( ⁇ ) ⁇ V ( ⁇ )
  • L ( ⁇ ) Spectral luminance (cd / m 2 ⁇ nm) for each pixel of the spectral image
  • Km Maximum luminous efficacy (683lm ⁇ W ⁇ 1 )
  • the color analysis image acquisition unit 50 performs color space conversion processing for converting the spectral radiance, spectral luminance, spectral reflectance and the like of each pixel of the spectral image for each wavelength, acquired by the measurement of the object 200 and processed by the calibration processing unit 42, into the specified color system.
  • spectral luminance may be used in the process, but basically processing is performed using spectral radiance.
  • conversion to an arbitrary color space can be performed by changing the color matching function or constant according to the color system.
  • for example, conversion to the XYZ color system is as follows.
  • X = k Σ S(λ) x̄(λ) Δλ,  Y = k Σ S(λ) ȳ(λ) Δλ,  Z = k Σ S(λ) z̄(λ) Δλ
  • where X, Y and Z are the tristimulus values, x̄(λ), ȳ(λ) and z̄(λ) are the color matching functions, Δλ is the wavelength interval, k is a coefficient for matching with the corresponding photometric unit, and S(λ) is the relative spectral radiance of the light source light.
  • the color analysis image acquisition unit 50 performs color calculation processing on the acquired color space image using a conversion function stored in advance in the storage unit, so that a color analysis image having, at each pixel, color analysis values corresponding to the purpose (for example, chromaticity coordinate values or a color rendering evaluation number) is acquired.
  • for example, conversion to xy chromaticity coordinates is as follows.
  • x = X / (X + Y + Z),  y = Y / (X + Y + Z),  z = Z / (X + Y + Z)
  • where x, y and z are the chromaticity coordinates and X, Y and Z are the tristimulus values obtained with the color matching functions.
  • the color analysis image acquisition unit 50 performs a process of converting the acquired color analysis image into a display by a predetermined display method.
  • various display methods can be used as the predetermined display method; for example, there is a display method in which a color gradation is assigned to the intensity of each value on the color coordinates to form a gradation image, as sketched below.
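A minimal sketch of such a gradation display, assuming one color-analysis value per pixel (for example the x chromaticity value) and a standard colormap; this is one possible rendering, not the patent's display method.

```python
# Illustrative gradation display: map the intensity of one color-coordinate image
# (e.g. the per-pixel x chromaticity value) onto a color gradation.
import numpy as np
from matplotlib import cm

def gradation_image(values, cmap=cm.viridis):
    """values: 2D array of one color-analysis value per pixel.
    Returns an (ny, nz, 4) RGBA image with the value mapped to a color gradation."""
    v = (values - np.nanmin(values)) / (np.nanmax(values) - np.nanmin(values) + 1e-12)
    return cmap(v)
```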
  • the color system standards include, for example, JIS standards (Japanese Industrial Standards), NDS standards (Japanese Ministry of Defense standards), ISO standards (International Organization for Standardization), CIE standards (International Commission on Illumination), ASTM standards (ASTM International), IEC standards (International Electrotechnical Commission), ANSI C78.377 (American National Standards Institute), the NTSC color standard, Pantone, RAL, NCS (Natural Color System), the XYZ (Yxy) color system (CIE color system), the RGB color system, the L*u*v* color system, the L*a*b* color system, the L*c*h* color system, the Hunter Lab color system, the Munsell color system, the Ostwald color system, the DIN color system, the PCCS color system, the DIC color guide, Japan Paint Manufacturers Association standard colors, the HSV color space, the HLS color space, the CMYK color space, the CMY color space, the RGBA color space, the RGB color space, the sRGB/Adobe RGB color spaces, the sYCC color space, the xvYCC color space, the YCbCr color space, architectural design color charts, paint color sample books, and the like.
  • the display is not limited to the above, and color display by the color systems of various existing standards is possible.
  • the control unit 26 starts the imaging process.
  • the flowchart of FIG. 3 shows the detailed processing contents of the imaging process in the spectral radiance meter 10 according to the present invention.
  • in this imaging process, first, the precision linear motion stage 14 is moved so that the imaging position of the two-dimensional imaging detector 24 coincides with the imaging start position (step S302).
  • This imaging start position is set in advance, and is, for example, one end in the Y-axis direction in a predetermined measurement area including the object 200.
  • specifically, the movement control unit 32 moves the precision linear motion stage 14 to the position where the imaging position of the two-dimensional imaging detector 24 coincides with the imaging start position, and outputs information indicating that the precision linear motion stage 14 has been moved to the imaging control unit 30.
  • the imaging control unit 30 executes processing for imaging in the two-dimensional imaging detector 24 (step S304).
  • the imaging control unit 30 performs imaging (acquisition of electric signals) in the two-dimensional imaging detector 24 and outputs information indicating that imaging has been performed to the movement control unit 32.
  • the two-dimensional imaging detector 24 converts the intensity distribution of the incident light into an electrical signal, and this electrical signal is output to the spectral data creation unit 34.
  • the spectral data creating unit 34 creates two-dimensional spectral data having one-dimensional spatial information and one-dimensional wavelength information based on the electrical signal output from the two-dimensional imaging detector 24.
  • the created two-dimensional spectroscopic data is output to and stored in the storage unit 52.
  • the movement control unit 32 moves the precision linear motion stage 14 at a predetermined interval (step S306).
  • specifically, the movement control unit 32 moves the precision linear motion stage 14 by the predetermined interval and outputs information indicating that the precision linear motion stage 14 has moved to the imaging control unit 30.
  • the movement control unit 32 outputs information indicating that it has moved to the photographing end position.
  • the imaging control unit 30 determines whether or not the precise linear motion stage 14 has moved until the imaging position of the two-dimensional imaging detector 24 reaches the imaging end position (step S308).
  • this photographing end position is set in advance, and is, for example, the other end in the Y-axis direction in a predetermined photographing region.
  • the imaging control unit 30 determines whether information indicating that the movement control unit 32 has moved to the imaging end position has been output.
  • in step S308, if the information indicating that the movement control unit 32 has moved to the photographing end position has not been output, the imaging control unit 30 determines that the precision linear motion stage 14 has not yet been moved until the photographing position of the two-dimensional imaging detector 24 reaches the photographing end position. On the other hand, when that information has been output, the imaging control unit 30 determines that the precision linear motion stage 14 has been moved until the photographing position of the two-dimensional imaging detector 24 reaches the photographing end position.
  • if it is determined in step S308 that the precision linear motion stage 14 has not been moved until the imaging position of the two-dimensional imaging detector 24 reaches the imaging end position, the process returns to step S304 and the processing from step S304 onward is repeated.
  • on the other hand, if it is determined in step S308 that the precision linear motion stage 14 has been moved until the imaging position of the two-dimensional imaging detector 24 reaches the imaging end position, photographing (acquisition of an electrical signal) is performed by the two-dimensional imaging detector 24 at the imaging end position (step S310), and the imaging process is terminated.
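The imaging loop of FIG. 3 can be summarized by the following hedged sketch; `stage` and `detector` are hypothetical driver objects with assumed interfaces, introduced only to show the alternation of capture (steps S304, S310) and stage movement (step S306) up to the imaging end position.

```python
# Illustrative imaging loop (assumed interfaces, not the patent's software).
def run_imaging_process(stage, detector, y_start, y_end, step):
    frames = []
    stage.move_to(y_start)                 # step S302: move to the imaging start position
    y = y_start
    while y < y_end:
        frames.append(detector.capture())  # step S304: acquire one (space x wavelength) frame
        stage.move_by(step)                # step S306: advance by one slit-width interval
        y += step
    frames.append(detector.capture())      # step S310: final frame at the imaging end position
    return frames
```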
  • in the measurement process, the spectral data creation unit 34 first integrates the two-dimensional spectral data having one-dimensional spatial information and one-dimensional wavelength information (high-wavelength-resolution spectroscopic information) acquired at each photographing position, and creates three-dimensional spectral data having two-dimensional spatial information representing the predetermined measurement region and one-dimensional wavelength information (step S402).
  • next, the spectral image creation unit 38 creates a spectral image (hyperspectral image) for each wavelength from the created three-dimensional spectral data (step S404), and calibration processing is then performed on the created spectral image for each wavelength (step S406).
  • FIG. 5 shows a flowchart showing the detailed processing contents of the calibration processing.
  • dark noise correction processing is performed (step S502).
  • in step S502, the dark noise correction processing is performed using the dark noise correction data created in advance and stored in the storage unit 52: the output value of the corresponding pixel in the dark noise correction data is subtracted from the output value of each pixel of the spectral image for each wavelength.
  • next, inter-pixel sensitivity deviation correction processing is performed on the spectral image for each wavelength that has undergone the dark noise correction processing (step S504).
  • in step S504, the calibration processing unit 42 multiplies each pixel of the dark-corrected spectral image for each wavelength by the sensitivity correction coefficient created in advance and stored in the storage unit 52.
  • a luminance calibration process is performed on the spectral image for each wavelength subjected to the inter-pixel sensitivity deviation correction process (step S506).
  • in step S506, the calibration processing unit 42 multiplies the output value of each pixel of the spectral image for each wavelength that has undergone the inter-pixel sensitivity deviation correction processing by the luminance calibration coefficient created in advance and stored in the storage unit 52; the value obtained by this multiplication indicates the spectral radiance at each pixel of the spectral image for each wavelength.
  • next, the calculation unit 44 calculates the spectral luminance at each pixel of the spectral image for each wavelength from the acquired spectral radiance at each pixel (step S508), and the process proceeds to step S408.
  • after step S508, the light source correction processing (step S408) is performed.
  • in the light source correction processing, the output value of each pixel of the spectral image for each wavelength is divided by the output value of the corresponding pixel of the light source data for the corresponding wavelength, so that the spectral reflectance (or spectral transmittance) for each pixel of the spectral image for each wavelength of the object 200 is obtained.
  • the color analysis image acquisition unit 50 acquires a color space image (step S410).
  • in step S410, using the spectral radiance, spectral luminance and spectral reflectance or spectral transmittance of each pixel of the spectral images for each wavelength, a color space image represented by a color system of a preset standard is acquired using the color matching functions, stored in the storage unit 52, corresponding to that color system.
  • next, the obtained color space image is subjected to color calculation processing using a conversion function stored in advance in the storage unit 52 to obtain a color analysis image (step S412).
  • then, processing for converting the color analysis image into a display by the predetermined display method is performed, the result is output to a display unit (not shown) (step S414), and the measurement process is terminated.
  • a display unit (not shown) performs display based on the information output by the process of step S414.
  • this leaf bundle was placed under halogen light (light from a light source provided separately from the spectral radiance meter 10).
  • halogen light was irradiated obliquely from above, and measurement was performed with the spectral radiance meter 10 from a distance of 45 cm in front.
  • color analysis was performed based on the spectral information after the calibration processing, and the color information in the two-dimensional space of the predetermined imaging region including the object 200 is displayed as images in FIG. 6 (b), (c), (d) and (e).
  • FIG. 6(a) shows an RGB image of the leaf bundle photographed by the spectral radiance meter 10, FIG. 6(b) shows a spectral image at a predetermined wavelength (band) created by the spectral radiance meter 10, FIG. 6(c) shows a graph of the spectral reflectance of the spectral image, FIG. 6(d) shows an image obtained by converting the spectral image shown in FIG. 6(b) into a stimulus value (X value) of the XYZ color system, and FIG. 6(e) shows an image obtained by converting the spectral image shown in FIG. 6(b) into xy chromaticity coordinates (x value).
  • in FIG. 6(c), the spectral reflectance R(λ) is calculated for each pixel of the spectral image, and the spectral reflectance of a pixel designated by the operator (which may be an average of a plurality of pixels) is displayed as a graph.
  • FIG. 6(d) shows a color space image in which the spectral radiance of each pixel of the spectral image for each wavelength has been converted, using the color matching functions, into the stimulus values of the XYZ color system for each pixel; of these, only the X value is displayed with a color gradation according to its intensity.
  • FIG. 6(e) is obtained by performing color calculation processing on the acquired XYZ color space image using a conversion function to acquire a color analysis image having xy chromaticity coordinate values at each pixel; of these, the x value is displayed with a color gradation according to its intensity.
  • as described above, the spectral radiance meter 10 includes, after the objective lens 12, the slit plate 16, the collimating lens 18, the dispersion optical element 20, the imaging lens 22 and the two-dimensional imaging detector 24, and acquires two-dimensional spectral data having one-dimensional spatial information extending in the Z-axis direction and one-dimensional wavelength information (high-wavelength-resolution spectroscopic information) extending in the Y-axis direction.
  • furthermore, since the slit plate 16, the collimating lens 18, the dispersion optical element 20, the imaging lens 22 and the two-dimensional imaging detector 24 are disposed on the precision linear motion stage 14, which is movable in the Y-axis direction, the photographing position in the measurement region is changed in the Y-axis direction.
  • three-dimensional spectral data having two-dimensional spatial information and one-dimensional wavelength information is created by integrating the two-dimensional spectral data acquired at the individual photographing positions, and a spectral image (hyperspectral image) for each wavelength is created from the created three-dimensional spectral data.
  • imaging of a predetermined measurement region including the object can be performed in a short time.
  • the spectral radiance meter 10 calculates the spectral radiance, spectral luminance, spectral reflectance or spectral transmittance of each pixel of the acquired spectral image, and acquires a color space image represented by a predetermined color system by performing color space conversion processing on this spectral image.
  • the obtained color space image is subjected to color calculation processing to obtain a color analysis image, and thereafter, processing for converting into a display by a predetermined display method is performed.
  • the spectral radiance meter 10 can acquire the spectral radiance for each pixel in a wide two-dimensional space in a short time.
  • with the spectral radiance meter 10, display by various existing color systems can be performed, and a unified and objective color analysis display can be achieved.
  • objective and effective color analysis therefore becomes possible in applications that require various kinds of spatial color analysis, for example, color development inspection of display panels, light source inspection, color management of industrial products, and color information used as basic data on human visual information.
  • in the above embodiment, the slit plate 16, the collimating lens 18, the dispersion optical element 20, the imaging lens 22 and the two-dimensional imaging detector 24 are disposed on the precision linear motion stage 14, which can move in the Y-axis direction, but the arrangement is not limited to this.
  • for example, the configuration downstream of the slit plate 16 may be moved integrally in the Y-axis direction by any mechanism capable of linear motion, such as a fluid-type (hydraulic or pneumatic), electromagnetic, ultrasonic or piezo actuator.
  • in the above embodiment, the calibration processing is performed after the spectral image for each wavelength acquired by photographing the object 200 has been created, but the present invention is not limited to this; for example, the dark noise correction processing and the light source correction processing in the calibration processing may be performed in response to an instruction from the operator.
  • furthermore, in the above embodiment the imaging position is moved in the Y-axis direction by moving the configuration downstream of the objective lens 12 in the Y-axis direction with the precision linear motion stage 14 provided in the spectral radiance meter 10, but the present invention is of course not limited to this; the objective lens 12, the slit plate 16, the collimating lens 18, the dispersion optical element 20, the imaging lens 22 and the two-dimensional imaging detector 24 may all be fixedly disposed, and the imaging position may be moved in the Y-axis direction by changing the positional relationship with the object 200 in the Y-axis direction.
  • that is, when the spectral radiance meter 10 itself is moved, for example, the precision linear motion stage 14 is not provided in the spectral radiance meter 10, the objective lens 12, the slit plate 16, the collimating lens 18, the dispersion optical element 20, the imaging lens 22 and the two-dimensional imaging detector 24 are fixedly arranged, and the spectral radiance meter 10 is moved in the Y-axis direction relative to the fixed object 200 by moving means controlled by a control unit provided separately from the spectral radiance meter 10 (see FIG. 11A); the photographing position is thereby moved in the Y-axis direction.
  • the control unit for controlling the moving unit and the control unit 26 of the spectral radiance meter 10 are connected, and the timing of movement and the timing of photographing are controlled.
  • alternatively, when the object 200 is moved, the precision linear motion stage 14 is likewise not provided in the spectral radiance meter 10, the objective lens 12, the slit plate 16, the collimating lens 18, the dispersion optical element 20, the imaging lens 22 and the two-dimensional imaging detector 24 are fixedly arranged, and the object 200 is moved in the Y-axis direction with respect to the fixed spectral radiance meter 10 by moving means controlled by a control unit (see FIG. 11B).
  • the control unit for controlling the moving unit and the control unit 26 of the spectral radiance meter 10 are connected, and the timing of movement and the timing of photographing are controlled.
  • furthermore, although in the above embodiment the photographing position is moved in the Y-axis direction by the precision linear motion stage 14, the present invention is not limited to this; the mechanism for moving the photographing position may also adopt the following three configurations.
  • FIG. 7 is a schematic configuration explanatory diagram of a modification of the spectral radiance meter according to the present invention.
  • the spectral radiance meter 100 shown in FIG. 7 includes an objective lens 102 that receives a light beam from a predetermined imaging region including the object 200, a precision linear motion stage 104 on which the objective lens 102 is mounted and which moves in the Y-axis direction, a slit plate 16 disposed so that a slit opening 16a extending in the Z-axis direction is positioned on the image plane of the objective lens 102, a collimating lens 18 that collimates the light beam that has passed through the slit opening 16a, the dispersion optical element 20, the imaging lens 22, the two-dimensional imaging detector 24 arranged so that its detection unit 24a is positioned on the image plane of the imaging lens 22, and a control unit 26 that controls the precision linear motion stage 104 and the two-dimensional imaging detector 24 and processes the information output from the two-dimensional imaging detector 24.
  • the slit plate 16, the collimating lens 18, the dispersion optical element 20, the imaging lens 22, and the two-dimensional imaging detector 24 are fixedly disposed.
  • the movement control unit 32 sequentially moves the precision linear motion stage 104 in the Y-axis direction, at a predetermined interval corresponding to the width of the slit opening 16a in the Y-axis direction, within a predetermined imaging region including the one end 200c and the other end 200d of the object 200 in the Y-axis direction.
  • specifically, when the operator instructs the start of photographing, the movement control unit 32 moves the precision linear motion stage 104 so that the photographing position of the two-dimensional imaging detector 24 becomes the photographing start position, and outputs information indicating that the precision linear motion stage 104 has been moved to the imaging control unit 30.
  • the movement control unit 32 then moves the precision linear motion stage 104 at a predetermined interval corresponding to the width of the slit opening 16a in the Y-axis direction, and outputs information indicating that the precision linear motion stage 104 has been moved to the imaging control unit 30.
  • the movement control unit 32 outputs information to the photographing control unit 30 that it has moved to the photographing end position.
  • in this way, the movement control unit 32 moves the precision linear motion stage 104 to move the photographing position, the imaging control unit 30 performs photographing, and the spectral data creation unit 34 creates two-dimensional spectral data having one-dimensional spatial information and one-dimensional wavelength information (high-wavelength-resolution spectroscopic information) based on the electrical signal from the two-dimensional imaging detector 24.
  • FIG. 8 shows a schematic configuration explanatory diagram of a modification of the spectral radiance meter according to the present invention.
  • the imaging unit 110 shown in FIG. 8 includes an F-θ lens 112 that receives a light beam from a predetermined imaging region including the object 200, a galvano mirror 114 provided downstream of the F-θ lens 112, an imaging lens 116 that forms an image of the light beam reflected by the galvano mirror 114, a slit plate 16 disposed so that a slit opening 16a extending in the Z-axis direction is positioned on the image plane of the imaging lens 116, a collimating lens 18 that collimates the light beam that has passed through the slit opening 16a, a dispersion optical element 20 that disperses the parallel light from the collimating lens 18 in the X-axis direction, an imaging lens 22 that forms an image of the light beam emitted from the dispersion optical element 20, and a two-dimensional imaging detector 24 disposed so that its detection unit 24a is positioned on the image plane of the imaging lens 22.
  • the F- ⁇ lens 114, the imaging lens 116, the slit plate 16, the collimating lens 18, the dispersion optical element 20, the imaging lens 22, and the two-dimensional imaging detector 24 are fixedly disposed.
  • the galvano mirror 114 is arranged so that its reflecting surface reflects the light substantially collimated by the F-θ lens 112 into the imaging lens 116, and so that the reflecting surface rotates about the Z axis.
  • the movement control unit 32 controls the rotation angle and rotation direction of the reflecting surface. Note that such a rotation angle and a rotation direction are set before photographing processing.
  • in the following, "rotating the galvano mirror 114" means "rotating the reflecting surface of the galvano mirror 114".
  • the galvano mirror 114 rotates to rotate the reflection surface, and changes the reflection angle of the light collimated by the F- ⁇ lens 112.
  • the movement control unit 32 sequentially rotates the galvano mirror 114 about the Z axis, at a predetermined rotation angle corresponding to the width of the slit opening 16a in the Y-axis direction, so that the imaging position moves in the Y-axis direction within a predetermined imaging region including the one end 200c and the other end 200d of the object 200 in the Y-axis direction.
  • the movement control unit 32 rotates the galvano mirror 114 so that the photographing position of the two-dimensional imaging detector 24 becomes the photographing start position, and outputs information indicating the rotation to the imaging control unit 30.
  • the movement control unit 32 then rotates the galvano mirror 114 by a predetermined rotation angle corresponding to the width of the slit opening 16a in the Y-axis direction, and outputs information indicating that the galvano mirror 114 has been rotated to the imaging control unit 30.
  • the movement control unit 32 outputs information indicating that the movement to the shooting end position is performed to the shooting control unit 30.
  • in this way, the movement control unit 32 controls the rotation of the galvano mirror 114 to move the imaging position, the imaging control unit 30 performs imaging, and the spectral data creation unit 34 creates two-dimensional spectral data having one-dimensional spatial information and one-dimensional wavelength information (high-wavelength-resolution spectroscopic information) based on the electrical signal from the two-dimensional imaging detector 24.
  • FIG. 9 shows a schematic configuration explanatory diagram of a modification of the spectral radiance meter according to the present invention.
  • the spectral radiance meter 120 shown in FIG. 9 includes a pair of achromatic prisms 122 disposed in front of the objective lens 12 on which the light beam from the object 200 is incident, a slit plate 16 disposed so that a slit opening 16a extending in the Z-axis direction is positioned on the image plane of the objective lens 12, a collimating lens 18 that collimates the light beam that has passed through the slit opening 16a, a dispersion optical element 20 that disperses the parallel light from the collimating lens 18 in the Y-axis direction, an imaging lens 22 that forms an image of the light beam emitted from the dispersion optical element 20, and a two-dimensional imaging detector 24 disposed so that its detection unit 24a is positioned on the image plane of the imaging lens 22.
  • the objective lens 12, the slit plate 16, the collimating lens 18, the dispersion optical element 20, the imaging lens 22, and the two-dimensional imaging detector 24 are fixedly disposed.
  • the pair of achromatic prisms 122 is constituted by achromatic prisms 122-1 and 122-2, which are arranged side by side in the X-axis direction with their apex angles facing the same direction.
  • the state of the pair of achromatic prisms shown in FIGS. 10(a) and 10(b) is referred to as the "initial state".
  • under the control of the movement control unit 32, the achromatic prisms 122-1 and 122-2 are each rotated from the initial state by the same angle in opposite directions.
  • when the achromatic prisms 122-1 and 122-2 rotate in opposite directions so that their apex angle directions are reversed with respect to the Z-axis direction (see FIGS. 10(c) and 10(d)), the pair functions as a plane-parallel plate.
  • FIG. 10(c) shows the state in which the achromatic prism 122-1 of FIG. 10(a) has been rotated by 90° in the direction of arrow I and the achromatic prism 122-2 has been rotated by 90° in the direction of arrow II.
  • the achromatic prisms 122-1 and 122-2 are configured to rotate in opposite directions around a common central axis O; for example, the achromatic prism 122-1 rotates in the direction of arrow I and the achromatic prism 122-2 rotates in the direction of arrow II (see FIG. 10(a)).
  • each of the achromatic prisms 122-1 and 122-2 is rotatable in the range of 0° to 180°.
  • the pair of achromatic prisms 122 having such a configuration makes it possible to change the position at which the light beam is incident on the two-dimensional imaging detector 24 disposed downstream of the pair of achromatic prisms 122, and thereby to move the photographing position in the Y-axis direction.
  • the movement control unit 32 sequentially rotates the achromatic prisms 122-1 and 122-2 around the X axis by a predetermined rotation angle corresponding to the width of the slit opening 16a in the Y-axis direction, so that the photographing position moves in the Y-axis direction within a predetermined photographing region extending from one end 200c to the other end 200d of the object 200 in the Y-axis direction.
  • when the operator gives an instruction to start photographing, the movement control unit 32 rotates the achromatic prisms 122-1 and 122-2 in opposite directions so that the photographing position of the two-dimensional imaging detector 24 becomes the photographing start position, and outputs information indicating that the achromatic prisms 122-1 and 122-2 have been rotated to the imaging control unit 30.
  • when information indicating that photographing has been performed is output from the imaging control unit 30, the movement control unit 32 rotates the achromatic prisms 122-1 and 122-2 by the predetermined rotation angle corresponding to the width of the slit opening 16a in the Y-axis direction, and outputs information indicating that the achromatic prisms 122-1 and 122-2 have been rotated to the imaging control unit 30.
  • when the achromatic prisms 122-1 and 122-2 have been rotated until the photographing position of the two-dimensional imaging detector 24 reaches the photographing end position, the movement control unit 32 outputs information indicating that the movement to the photographing end position has been performed to the imaging control unit 30.
  • in this way, the movement control unit 32 rotates the achromatic prisms 122-1 and 122-2 to move the photographing position, the imaging control unit 30 performs photographing, and, based on the electrical signal from the two-dimensional imaging detector 24, the spectral data creating unit 34 creates two-dimensional spectral data having one-dimensional spatial information and one-dimensional wavelength information (high wavelength resolution spectral information).
  • in the embodiments described above, the wavelength range, wavelength resolution and number of spatial pixels are changed by exchanging hardware components such as the objective lens 12, the slit plate 16, the collimating lens 18, the dispersion optical element 20, the imaging lens 22 and the two-dimensional imaging detector 24, but the present invention is not limited to this.
  • by reading an externally defined file or by changing parameters, the operator may specify not only the wavelength range, wavelength resolution and number of spatial pixels, but also the imaging speed, exposure time and gain, as illustrated by the sketch below.
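  As an illustration only (not part of the original disclosure), the following Python sketch shows how such an externally defined parameter file might look and be read; the key names and example values are assumptions chosen for this example.

    import json

    # Hypothetical parameter file; key names and values are illustrative only.
    EXAMPLE_PARAMETERS = """
    {
      "wavelength_range_nm": [380.0, 780.0],
      "wavelength_resolution_nm": 1.0,
      "spatial_pixels": [640, 480],
      "imaging_speed_fps": 30,
      "exposure_time_ms": 10.0,
      "gain_db": 0.0
    }
    """

    def load_measurement_parameters(text):
        # Parse the externally defined file and perform a basic sanity check.
        params = json.loads(text)
        lo, hi = params["wavelength_range_nm"]
        if not lo < hi:
            raise ValueError("wavelength range must be increasing")
        return params

    if __name__ == "__main__":
        print(load_measurement_parameters(EXAMPLE_PARAMETERS))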
  • the present invention is suitable for use as a spectral radiance meter that measures a predetermined area including a target object and acquires detailed color information of the area.

Abstract

[Problem] To provide a spectroradiometer capable of acquiring, in a short time, spectral information for each pixel over a wide area, and of interconverting color displays in any color system. [Solution] The device has: a two-dimensional image pickup detector on which, during image pickup, a beam of light traveling from a subject and dispersed by optical elements is incident, and which acquires an electrical signal based on the incident light; a moving means for integrally moving the arrangement downstream of the objective lens in a prescribed direction, thereby moving the image pickup location in the prescribed direction; a means for controlling image pickup and the movement of the moving means in synchronization; a means for creating, on the basis of the signal, first data that includes one-dimensional spatial information pertaining to the image pickup location and wavelength information, and for creating, from the first data at each image pickup location, second data that includes two-dimensional spatial information and one-dimensional wavelength information; a means for creating a spectral image for each wavelength from the second data; a means for acquiring the spectral radiance at each pixel of the spectral images; and a means for creating an image according to a prescribed color system from a spectral image and performing color computation processing on this image to create an image of a prescribed display method.

Description

Spectral radiance meter
 The present invention relates to a spectral radiance meter, and more particularly to a spectral radiance meter that measures a region including an object to be measured and acquires color information of the region.
 In conventional color analysis display, color coordinate notations based on RGB (such as CIEXYZ), obtained by multiplying spectral information measured with a spectral radiance meter by an experimentally obtained human luminous efficiency, are widely used. However, these color coordinate notations are based on data obtained by statistically processing the influence of psychological states and individual differences, and cannot cope with individual diversity.
 Specifically, the color matching functions that are the basis of all color spaces are the characteristics of a virtual observer, called the standard observer, based on statistical processing of psychological experiment results. In reality, however, there are individual differences due to, for example, yellowing of the crystalline lens with aging, and intra-individual variations due to the influence of the macular pigment on the retina. When such variations in the color matching functions cannot be ignored, a phenomenon occurs in which, when two kinds of stimuli are presented, they are perceived as the same color by the standard observer but appear as different colors to another person (observer metamerism).
 Here, in order to avoid such observer metamerism, a technique using spectral image information, for example, is known.
 Such a technique is disclosed, for example, in Non-Patent Document 1: by reproducing the spectral distribution of the light reflected from the real object using spectral image information and a corresponding multi-primary-color display (spectral color reproduction), every observer can recognize the real object and the image on the display as having the same color.
 Techniques using such spectral image information are expected to be applied to fields that require high-precision color matching between different media, such as telemedicine, artwork archives, and color matching for clothing, interiors and printed matter.
 In addition, conventional color analysis display uses different color analysis display methods depending on the industry and country, so mutual comparison is difficult; it has been pointed out that such displays are hard to handle because of the poor compatibility between the color analysis display methods.
 Further, a spectral radiance meter according to the conventional technique can only measure a single point or the average value of a fixed spatial region (for example, an angle of view of 10° or less). Therefore, in order to acquire spectral information of the two-dimensional space of the surface of an object to be measured (hereinafter referred to as the "object"), the spectral radiance meter or the object must be scanned and an enormous number of points on the object surface must be measured. Acquiring the color information of the object surface therefore takes a great deal of time, and it has been pointed out as a problem that doing so is practically impossible.
 Furthermore, when spectral information of a fixed spatial region is acquired, the spectral information within the captured angle of view is averaged, so that the spectral information of the object surface created from such measurements cannot capture differences in color information for each spatial detail (each pixel), and the object surface cannot be evaluated accurately.
 The present invention has been made in view of the various problems of the conventional techniques described above, and an object thereof is to provide a spectral radiance meter capable of performing color display in, and mutual conversion between, arbitrary color systems without loss of information.
 Another object of the present invention is to provide a spectral radiance meter capable of easily acquiring, in a short time, spectral information for each pixel in a two-dimensional space wider than the region measurable by a conventional spectral radiance meter.
 In order to achieve the above objects, according to the present invention, the entire predetermined measurement region including the object is photographed while the photographing position is moved by moving the components downstream of the objective lens, and two-dimensional spectral data having one-dimensional spatial information and one-dimensional wavelength information are acquired by a two-dimensional imaging detector (area sensor).
 This one-dimensional wavelength information is high wavelength resolution spectral information (hyperspectral data) obtained by splitting a wavelength range from the ultraviolet to the infrared region (for example, 200 nm to 13 μm), including visible light, into several tens to several hundreds of bands with a wavelength resolution of 0.1 nm to 100 nm.
 Then, the acquired two-dimensional spectral data at the respective photographing positions are integrated to create three-dimensional spectral data having two-dimensional spatial information and one-dimensional wavelength information, and a spectral image (hyperspectral image) for each wavelength is created from the created three-dimensional spectral data.
 Thereafter, the spectral radiance, the spectral luminance, and the spectral reflectance or spectral transmittance of each pixel in the created spectral images are calculated; furthermore, the spectral radiance value of each pixel is converted into stimulus values of a predetermined color system to obtain an image in which each pixel of the two-dimensional spatial image holds the stimulus values of that color system (hereinafter referred to as a "color space image"), and a color analysis image obtained by subjecting the color space image converted into the predetermined color system to color calculation processing is displayed on the display unit by a predetermined display method.
 Thus, in the present invention, imaging of the entire predetermined measurement region including the object, and acquisition of the spectral radiance and the like for each pixel of the spectral images obtained by the imaging, can be executed in a short time.
 Specifically, the spectral radiance of each pixel in a wide two-dimensional space can be acquired in a few seconds for a VGA-size image, for example.
 Furthermore, in the present invention, by using the spectral radiance as a common color space, color display in and mutual conversion between various existing color systems can be performed without loss of color information, and accurate color reproduction and information transmission can be performed regardless of the illumination light, the display environment, or individual differences and intra-individual variations in the visual characteristics of observers.
 Thus, according to the present invention, a uniform and objective color analysis display can be performed.
 That is, the present invention comprises: a two-dimensional imaging detector on which, during photographing, the light beam from the object entering through an objective lens is incident after being dispersed by a dispersion optical element in a direction orthogonal to a predetermined direction, and which acquires a signal based on the incident light beam; a moving means for integrally moving the components provided downstream of the objective lens in the predetermined direction, thereby moving the photographing position photographed by the two-dimensional imaging detector in the predetermined direction; a first control means for controlling the photographing timing of the two-dimensional imaging detector; a second control means for controlling the movement of the moving means; a spectral data creating means for creating, on the basis of the signal, two-dimensional spectral data having one-dimensional spatial information and one-dimensional wavelength information at each photographing position, and for creating, from the two-dimensional spectral data at the respective photographing positions, three-dimensional spectral data having two-dimensional spatial information and one-dimensional wavelength information; a first image creating means for creating a spectral image for each wavelength from the three-dimensional spectral data; an acquiring means for acquiring the spectral radiance at each pixel of the spectral images; and a second image creating means for performing color space conversion processing on the spectral images to create a color space image according to a predetermined color system, performing color calculation processing on the color space image to create a color analysis image, and converting the color analysis image into a display according to a predetermined display method.
 Further, according to the present invention, in the invention described above, the two-dimensional imaging detector is detachable together with the optical elements including the objective lens and the dispersion optical element, and by exchanging the two-dimensional imaging detector and the optical elements, the photographable wavelength range, the wavelength resolution and the number of spatial pixels can be changed.
 Further, according to the present invention, in the invention described above, the moving means is a precision linear motion stage.
 Further, the present invention comprises: a two-dimensional imaging detector on which, during photographing, the light beam from the object entering through an objective lens is incident after being dispersed by a dispersion optical element in a direction orthogonal to a predetermined direction, and which acquires a signal based on the incident light beam; a control means for controlling the photographing timing of the two-dimensional imaging detector; a spectral data creating means for creating, on the basis of the signal, two-dimensional spectral data having one-dimensional spatial information and one-dimensional wavelength information at each photographing position, and for creating, from the two-dimensional spectral data at the respective photographing positions, three-dimensional spectral data having two-dimensional spatial information and one-dimensional wavelength information; a first image creating means for creating a spectral image for each wavelength from the three-dimensional spectral data; an acquiring means for acquiring the spectral radiance at each pixel of the spectral images; and a second image creating means for performing color space conversion processing on the spectral images to create a color space image according to a predetermined color system, performing color calculation processing on the color space image to create a color analysis image, and converting the color analysis image into a display according to a predetermined display method; wherein the object is photographed while the photographing position is changed in the predetermined direction by moving the positional relationship with the object in the predetermined direction.
 Since the present invention is configured as described above, it has the excellent effect that color display in and mutual conversion between arbitrary color systems can be performed without loss of information.
 Further, since the present invention is configured as described above, it has the excellent effect that spectral information for each pixel in a wide two-dimensional space can be acquired easily and in a short time.
FIG. 1(a) is a schematic configuration explanatory view of a spectral radiance meter according to the present invention, and FIG. 1(b) is a view taken in the direction of arrow A in FIG. 1(a).
FIG. 2 is an explanatory diagram of the block configuration of the control unit.
FIG. 3 is a flowchart showing the processing routine of the photographing process in the spectral radiance meter.
FIG. 4 is a flowchart showing the processing routine of the measurement process in the spectral radiance meter.
FIG. 5 is a flowchart showing the processing routine of the calibration process.
FIG. 6(a) is an RGB image of a leaf bundle photographed by the spectral radiance meter according to the present invention; FIG. 6(b) is a spectral image at a predetermined wavelength (band) created by the spectral radiance meter according to the present invention; FIG. 6(c) is a graph showing the spectral reflectance; FIG. 6(d) is an image obtained by converting the spectral image shown in FIG. 6(b) into stimulus values of the XYZ color system (X value); and FIG. 6(e) is an image obtained by converting the spectral image shown in FIG. 6(b) into xy chromaticity coordinates (x value).
FIG. 7 is a schematic configuration explanatory view of a modification of the spectral radiance meter according to the present invention.
FIG. 8 is a schematic configuration explanatory view of a modification of the spectral radiance meter according to the present invention.
FIG. 9 is a schematic configuration explanatory view of a modification of the spectral radiance meter according to the present invention.
FIG. 10(a) is an explanatory view of the pair of achromatic prisms arranged in the initial state; FIG. 10(b) is a view taken in the direction of arrow B in FIG. 10(a); FIG. 10(c) is an explanatory view of the pair of achromatic prisms in the state functioning as a plane-parallel plate; and FIG. 10(d) is a view taken in the direction of arrow C in FIG. 10(c).
FIGS. 11(a) and 11(b) are schematic configuration explanatory views of a modification of the spectral radiance meter according to the present invention.
 Hereinafter, an example of an embodiment of a spectral radiance meter according to the present invention will be described in detail with reference to the accompanying drawings.
 FIG. 1(a) is a schematic configuration explanatory view of a spectral radiance meter according to the present invention, FIG. 1(b) is a view taken in the direction of arrow A in FIG. 1(a), and FIG. 2 is an explanatory diagram of the block configuration of the control unit.
 The spectral radiance meter 10 shown in FIG. 1 comprises: an objective lens 12 on which the light beam from a predetermined measurement region including the object 200 is incident; a precision linear motion stage 14 that moves in the Y-axis direction of an XYZ orthogonal coordinate system; a slit plate 16 disposed so that a slit opening 16a extending in the Z-axis direction is positioned on the image plane of the objective lens 12; a collimating lens 18 that collimates the light beam that has passed through the slit opening 16a; a dispersion optical element 20 that disperses the parallel light from the collimating lens 18 in the Y-axis direction; an imaging lens 22 that forms an image of the light beam emitted from the dispersion optical element 20; a two-dimensional imaging detection unit 24 disposed so that its detection unit 24a is positioned on the image plane of the imaging lens 22; and a control unit 26 that controls the precision linear motion stage 14 and the two-dimensional imaging detection unit 24 and processes the information output from the two-dimensional imaging detection unit 24.
 More specifically, the slit plate 16, the collimating lens 18, the dispersion optical element 20, the imaging lens 22 and the two-dimensional imaging detection unit 24 are fixedly disposed on the precision linear motion stage 14, and move integrally in the Y-axis direction as the precision linear motion stage 14 moves in the Y-axis direction.
 Since a conventionally known technique can be used for such a precision linear motion stage 14, a detailed description thereof is omitted.
 The slit plate 16 is disposed so that the light beam from the objective lens 12 passes through the slit opening 16a extending in the Z-axis direction.
 For the dispersion optical element 20, for example, a diffraction grating, a prism or a grism can be used. A grism is a direct-vision diffraction grating that combines a transmission diffraction grating and a prism.
 The diffraction grating, prism or grism is disposed so as to disperse the incident light beam in the X-axis direction perpendicular to the Z-axis direction, which is the direction in which the slit opening 16a extends.
 The objective lens 12, the slit plate 16, the collimating lens 18, the dispersion optical element 20 and the imaging lens 22 are configured to be exchangeable as appropriate.
 The two-dimensional imaging detector 24 is an area sensor whose detection unit 24a is arranged parallel to the Z axis and the Y axis (that is, parallel to the primary image plane and parallel to the YZ plane).
 The two-dimensional imaging detector 24 is exchangeable; the two-dimensional imaging detector 24 used in the spectral radiance meter 10 according to the present invention has a number of spatial pixels of 1 pixel to 29 million pixels, an obtainable wavelength range of 200 nm to 13 μm, and a wavelength resolution of 0.1 nm to 100 nm.
 Therefore, in the spectral radiance meter 10, the wavelength range, wavelength resolution and number of spatial pixels of the acquired spectral images can be changed.
 That is, in the spectral radiance meter 10, by replacing the two-dimensional imaging detector 24 with one having a different wavelength range, wavelength resolution, number of spatial pixels or the like, and by appropriately exchanging the objective lens 12, the slit plate 16, the collimating lens 18, the dispersion optical element 20 and the imaging lens 22, the photographable wavelength range, wavelength resolution, number of spatial pixels and the like can be changed.
 The control unit 26 is connected to the precision linear motion stage 14 and the two-dimensional imaging detector 24, and is constituted by, for example, a microcomputer or a general-purpose personal computer.
 The control unit 26 comprises: an imaging control unit 30 that controls the timing at which the two-dimensional imaging detector 24 acquires an electrical signal (that is, the photographing timing); a movement control unit 32 that controls the movement direction, movement amount and movement timing of the precision linear motion stage 14 in the Y-axis direction; a spectral data creating unit 34 that creates spectral data based on the electrical signal from the two-dimensional imaging detector 24; and an image creating unit 36 that creates spectral images based on the spectral data created by the spectral data creating unit 34, converts the created spectral images into a predetermined color system, then performs color calculation processing to obtain a color analysis image, and outputs an image obtained by processing this color analysis image by a predetermined display method to a display unit (not shown).
 More specifically, when information indicating that the precision linear motion stage 14 has been moved is output from the movement control unit 32, the imaging control unit 30 performs photographing with the two-dimensional imaging detector 24 (that is, acquires an electrical signal from the two-dimensional imaging detector) and outputs information indicating that photographing has been performed to the movement control unit 32.
 Further, the imaging control unit 30 determines whether or not the precision linear motion stage 14 has been moved until the photographing position in the predetermined measurement region including the object 200 reaches the photographing end position; when information indicating that the stage has been moved to the photographing end position is output from the movement control unit 32, the imaging control unit 30 performs photographing and then ends the photographing process.
 The movement control unit 32 sequentially moves the precision linear motion stage 14 in the Y-axis direction, at predetermined intervals corresponding to the slit width of the slit opening 16a, within a predetermined imaging area extending from one end 200c to the other end 200d of the object 200 in the Y-axis direction. As a result, the photographing position in the predetermined measurement region is moved in the Y-axis direction.
 More specifically, when the operator gives an instruction to start photographing, the movement control unit 32 moves the precision linear motion stage 14 so that the photographing position of the two-dimensional imaging detector 24 becomes the photographing start position, and outputs information indicating that the precision linear motion stage 14 has been moved to the imaging control unit 30.
 Then, when information indicating that photographing has been performed is output from the imaging control unit 30, the movement control unit 32 moves the precision linear motion stage 14 by the predetermined interval corresponding to the slit width of the slit opening 16a, and outputs information indicating that the precision linear motion stage 14 has been moved to the imaging control unit 30.
 When the precision linear motion stage 14 has been moved until the photographing position of the two-dimensional imaging detector 24 reaches the photographing end position, the movement control unit 32 outputs information indicating that the stage has been moved to the photographing end position to the imaging control unit 30.
 The spectral data creating unit 34 creates two-dimensional spectral data having one-dimensional spatial information and one-dimensional wavelength information based on the electrical signal output from the two-dimensional imaging detector 24, and the created two-dimensional spectral data are output to and stored in a storage unit 52 provided in the control unit 26.
 The one-dimensional wavelength information created based on the electrical signal output from the two-dimensional imaging detector 24 is high wavelength resolution spectral information split into several hundred bands, over a predetermined wavelength range within 200 nm to 13 μm, with a predetermined wavelength resolution within 0.1 nm to 100 nm.
 When photographing at all photographing positions has been completed, the spectral data creating unit 34 creates three-dimensional spectral data (a hyperspectral image) having two-dimensional spatial information and one-dimensional wavelength information (high wavelength resolution spectral information) using the two-dimensional spectral data stored in the storage unit 52, and the created three-dimensional spectral data are output to and stored in the storage unit 52.
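 As a minimal sketch (not part of the original disclosure) of how the stored two-dimensional spectral data could be integrated into three-dimensional spectral data, assuming each photographing position yields a (spatial pixels × bands) array:

    import numpy as np

    def assemble_three_dimensional_data(slices):
        # `slices` is a list of 2-D arrays (spatial pixels x wavelength bands),
        # one per photographing position; stacking them along the scan axis
        # yields 3-D spectral data (scan position x spatial pixel x band).
        return np.stack(slices, axis=0)

    # Example with synthetic data: 48 positions, 64 spatial pixels, 30 bands.
    slices = [np.random.rand(64, 30) for _ in range(48)]
    cube = assemble_three_dimensional_data(slices)
    print(cube.shape)  # (48, 64, 30)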
 The image creating unit 36 comprises: a spectral image creating unit 38 that creates a spectral image for each wavelength based on the three-dimensional spectral data created by the spectral data creating unit 34; and a color analysis image creating unit 40 that converts the spectral images created by the spectral image creating unit 38 into a predetermined color system, performs color calculation processing to create a color analysis image, and performs processing for displaying the created color analysis image by a predetermined display method.
 More specifically, the spectral image creating unit 38 acquires a spectral image for each band (each wavelength) based on the three-dimensional spectral data split into several hundred bands over a predetermined wavelength range within 200 nm to 13 μm with a predetermined wavelength resolution within 0.1 nm to 100 nm.
 The color analysis image creating unit 40 comprises: a calibration processing unit 42 that performs dark noise correction processing, inter-pixel sensitivity deviation correction processing, luminance calibration processing and light source correction processing on the spectral images created by the spectral image creating unit 38; a calculation unit 44 that calculates the spectral luminance (cd/m²·nm) from the spectral radiance (W/sr·m²·nm) calculated in the luminance calibration processing, and that also calculates the sensitivity correction coefficients, luminance calibration coefficients, spectral reflectance and the like described later; and a color analysis image acquisition unit 50 that converts the spectral images for each wavelength, in which the spectral radiance of each pixel has been calculated, into a predetermined color system, performs color calculation processing to create a color analysis image, and performs processing for displaying the created color analysis image by a predetermined display method.
 The calibration processing unit 42 performs dark noise correction (dark correction) processing for removing noise caused by dark current in the two-dimensional imaging detector 24. Further, in order to correct the sensitivity deviation of each pixel of the two-dimensional imaging detector 24, it performs inter-pixel sensitivity deviation correction processing on the dark-corrected spectral images using the sensitivity correction coefficients. Furthermore, it performs luminance calibration processing on the spectral images that have undergone the inter-pixel sensitivity deviation correction processing, using the luminance calibration coefficients. Furthermore, it performs light source correction processing that corrects illumination unevenness of the light source light within the space in the dark-corrected spectral images and acquires the spectral reflectance or spectral transmittance of the object 200.
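 A minimal sketch of applying this correction chain (assumed array shapes; not the original implementation) is as follows:

    import numpy as np

    def correct_to_radiance(cube, dark, sensitivity, luminance):
        # cube, dark, sensitivity: (scan, pixel, band) arrays; luminance: (band,) array.
        corrected = (cube - dark) * sensitivity          # dark noise and sensitivity deviation
        return corrected * luminance[None, None, :]      # luminance calibration -> spectral radiance

    def light_source_correction(object_data, light_source_data):
        # Divides the measured object data by the light-source data pixel by pixel
        # and band by band, as in R(lambda) = C(lambda) / E(lambda) described below.
        return object_data / light_source_data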
 Here, the dark noise correction data used in the dark noise correction processing are spectral image data for each wavelength created based on three-dimensional spectral data acquired by photographing with the two-dimensional imaging detector 24 in a light-blocked state before the object 200 is measured. The photographing procedure at this time is the same as the photographing process described later.
 The sensitivity correction coefficients are obtained from three-dimensional spectral data acquired by photographing, before the object 200 is measured, light source light whose spatial distribution of radiance has been made uniform by an integrating sphere or the like (hereinafter referred to as the "uniform standard light source"). The photographing procedure at this time is the same as the photographing process described later.
 That is, in the spectral image for each wavelength based on the three-dimensional spectral data obtained by photographing the uniform standard light source with the two-dimensional imaging detector 24, the output value of a designated pixel (a single pixel or the average of a plurality of pixels) is used as a reference value, the reference value is divided by the output value of each pixel to calculate a correction coefficient for each pixel, and the sensitivity correction coefficients are created as three-dimensional spectral data in which each pixel of the spectral image for each wavelength holds a correction coefficient. The calculation of these sensitivity correction coefficients is performed in the calculation unit 44, and the result is output to and stored in the storage unit 52.
 The luminance calibration coefficients are one-dimensional data created based on three-dimensional data acquired by photographing, before the object 200 is measured, light source light whose spectral radiance has been assigned values (hereinafter referred to as the "spectral radiance standard light source"). The photographing procedure at this time is the same as the photographing process described later.
 That is, the luminance calibration coefficients are one-dimensional data corresponding to each wavelength: the spectral radiance value assigned to each wavelength of the spectral radiance standard light source is divided by the output value of a designated pixel (a single pixel or the average of a plurality of pixels) in the spectral image for each wavelength obtained by photographing the spectral radiance standard light source, thereby creating luminance calibration coefficients having a conversion coefficient for each wavelength. The calculation of these luminance calibration coefficients is performed in the calculation unit 44, and the result is output to and stored in the storage unit 52.
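 The two coefficient sets described above could be derived as in the following sketch; this is illustrative only, and the choice of a single reference pixel is an assumption.

    import numpy as np

    def sensitivity_correction_coefficients(uniform_cube, ref=(0, 0)):
        # Uniform standard light source data, shape (scan, pixel, band):
        # reference-pixel output divided by every pixel's output, per band.
        reference = uniform_cube[ref[0], ref[1], :]
        return reference[None, None, :] / uniform_cube

    def luminance_calibration_coefficients(standard_cube, standard_radiance, ref=(0, 0)):
        # Tabulated spectral radiance of the standard light source divided by
        # the measured output of a designated pixel, per wavelength band.
        measured = standard_cube[ref[0], ref[1], :]
        return np.asarray(standard_radiance) / measured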
 The light source data are created as three-dimensional spectral data acquired by photographing, before the object 200 is observed, the reflected light obtained by irradiating a reflection standard such as a standard white plate with the light source light. The photographing procedure at this time is the same as the photographing process described later. This light source light is light from a light source provided separately from the spectral radiance meter used for the measurement.
 Furthermore, the spectral reflectance or spectral transmittance R(λ) of each pixel in the spectral image for each wavelength (λ nm) is expressed by the following equation.
 R(λ) = C(λ) / E(λ)
 R(λ): spectral reflectance or spectral transmittance of each pixel of the spectral image
 C(λ): output value of each pixel of the spectral image for each wavelength acquired by measuring the object 200
 E(λ): output value of the pixel of the spectral image for each wavelength of the light source data corresponding to C(λ)
 The calculation of the light source data and of the spectral reflectance or spectral transmittance is performed in the calculation unit 44, and the results are output to and stored in the storage unit 52.
 The calculation unit 44 calculates the spectral luminance L(λ) (cd/m²·nm) from the spectral radiance Le(λ) (W/sr·m²·nm) of the spectral image for each wavelength processed by the calibration processing unit 42.
 Specifically, it is calculated by the following equation.
 L(λ) = Km × Le(λ) × V(λ)
 L(λ): spectral luminance of each pixel of the spectral image (cd/m²·nm)
 Le(λ): spectral radiance of each pixel of the spectral image (W/sr·m²·nm)
 V(λ): CIE standard spectral luminous efficiency
 Km: maximum luminous efficacy (683 lm·W⁻¹)
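 A one-line illustration of this relation in Python (array shapes assumed, not part of the original disclosure):

    import numpy as np

    KM = 683.0  # maximum luminous efficacy, lm/W

    def spectral_luminance(spectral_radiance, v_lambda):
        # L(lambda) = Km * Le(lambda) * V(lambda), evaluated per pixel and per band;
        # v_lambda is the CIE standard spectral luminous efficiency sampled at the bands.
        return KM * spectral_radiance * np.asarray(v_lambda)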
 Since a conventionally known technique (for example, the Color Science Association of Japan (ed.), "Shinpen Shikisai Kagaku Handbook (New Handbook of Color Science), 3rd edition", University of Tokyo Press, pp. 51-53, 2011) is used as the method for calculating the spectral luminance from the spectral radiance, a detailed description thereof is omitted.
 The color analysis image acquisition unit 50 performs color space conversion processing for converting the spectral images for each wavelength, acquired by measuring the object 200 and processed by the calibration processing unit 42, into a color system of a set standard, using the spectral radiance, spectral luminance, spectral reflectance and the like of each pixel.
 Specifically, using the color matching functions corresponding to the respective color systems stored in advance in the storage unit 52, color calculation is performed on the spectral images for each wavelength in which the spectral radiance of each pixel has been calculated, and a color space image in which each pixel holds the stimulus values of a color system of a predetermined standard is acquired. In this color space conversion processing, the spectral luminance may be used in the course of the processing, but the processing is basically performed using the spectral radiance.
 That is, in this color space conversion processing, conversion to an arbitrary color space is possible by changing the color matching functions and constants according to the color system. For example, the conversion to the XYZ color system is as follows.
 X = Km × Σ Le(λ) × x̄(λ) × Δλ
 Y = Km × Σ Le(λ) × ȳ(λ) × Δλ
 Z = Km × Σ Le(λ) × z̄(λ) × Δλ
 Here, X, Y and Z are the tristimulus values, x̄(λ), ȳ(λ) and z̄(λ) are the color matching functions, and Δλ is the wavelength interval.
 In the case of an object color, the calculation is as follows.
 X = k × Σ S(λ) × R(λ) × x̄(λ) × Δλ
 Y = k × Σ S(λ) × R(λ) × ȳ(λ) × Δλ
 Z = k × Σ S(λ) × R(λ) × z̄(λ) × Δλ
 k = 100 / Σ S(λ) × ȳ(λ) × Δλ
 Here, X, Y and Z are the tristimulus values, x̄(λ), ȳ(λ) and z̄(λ) are the color matching functions, Δλ is the wavelength interval, k is a coefficient for matching the corresponding photometric unit, and S(λ) is the relative spectral radiance of the light source light.
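 As an illustrative sketch (not the original implementation; the factor Km follows the light-source-color summation given above, and the array shapes are assumptions), the per-pixel tristimulus values can be computed as a weighted sum over bands:

    import numpy as np

    def tristimulus_values(spectral_radiance, cmf, delta_lambda, km=683.0):
        # spectral_radiance: (..., band) array; cmf: (band, 3) color matching
        # functions sampled at the same bands; delta_lambda: wavelength interval (nm).
        # Returns an (..., 3) array holding X, Y, Z for every pixel.
        return km * np.tensordot(spectral_radiance, cmf, axes=([-1], [0])) * delta_lambda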
 Further, the color analysis image acquisition unit 50 performs color calculation processing on the acquired color space image using conversion functions stored in advance in the storage unit, thereby acquiring a color analysis image in which each pixel holds color analysis values according to the purpose (for example, chromaticity coordinate values or color rendering indices).
 For example, the conversion to xy chromaticity coordinates is as follows.
 x(λ) = x̄(λ) / (x̄(λ) + ȳ(λ) + z̄(λ))
 y(λ) = ȳ(λ) / (x̄(λ) + ȳ(λ) + z̄(λ))
 z(λ) = z̄(λ) / (x̄(λ) + ȳ(λ) + z̄(λ))
 Here, x(λ), y(λ) and z(λ) are the xy chromaticity coordinates, and x̄(λ), ȳ(λ) and z̄(λ) are the color matching functions.
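 For the per-pixel color analysis values, a minimal sketch of the standard colorimetric normalization from tristimulus values to chromaticity coordinates (a general relation shown here for illustration, not taken verbatim from the original disclosure):

    def chromaticity_coordinates(X, Y, Z):
        # x = X / (X + Y + Z), y = Y / (X + Y + Z); z = 1 - x - y follows by definition.
        total = X + Y + Z
        return X / total, Y / total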
 Since conventionally known techniques (for example, the Color Science Association of Japan (ed.), "Shinpen Shikisai Kagaku Handbook (New Handbook of Color Science), 3rd edition", University of Tokyo Press, pp. 81-86, 2011) are used for these methods, a detailed description thereof is omitted.
 Then, the color analysis image acquisition unit 50 performs processing for converting the acquired color analysis image into a display according to a predetermined display method. Various display methods can be used as this predetermined display method; for example, there is a display method in which color gradations are assigned according to the intensity on the color coordinates to form a gradation image.
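 A sketch of one such display method (assuming matplotlib is available; not part of the original disclosure), assigning a color gradation to the per-pixel analysis value:

    import numpy as np
    import matplotlib.pyplot as plt

    def show_as_gradation(analysis_image, label="analysis value"):
        # `analysis_image` is a 2-D array holding one color-analysis value per pixel
        # (for example, the x chromaticity coordinate); the intensities are mapped
        # to a color gradation and shown with a color bar.
        plt.imshow(analysis_image, cmap="viridis")
        plt.colorbar(label=label)
        plt.show()

    show_as_gradation(np.random.rand(64, 64))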
 Examples of color system standards include JIS standards (Japanese Industrial Standards), NDS standards (Japanese Ministry of Defense standards), ISO standards (International Organization for Standardization), CIE standards (International Commission on Illumination), ASTM standards (ASTM International), IEC standards (International Electrotechnical Commission), ANSI C78.377 (American National Standards Institute), the NTSC color standard, Pantone, RAL, NCS (Natural Color System), the XYZ (Yxy) color system (CIE color system), the RGB color system, the L*u*v* color system, the L*a*b* color system, the L*c*h* color system, the Hunter Lab color system, the Munsell color system, the Ostwald color system, the DIN color system, the PCCS color system (Japan Color Research Institute color system), the DIC color guide, Japan Paint Manufacturers Association standard colors, the HSV color space, the HLS color space, the CMYK color space, the CMY color space, the RGBA color space, the RGB color space, the sRGB/Adobe RGB color spaces, the sYCC color space, the xvYCC color space, the YCbCr color space, architectural design color charts, paint color sample books and the like; however, the invention is not limited to the listed standards, and color display by the color systems of various existing standards is possible.
 In the above configuration, when the color information of the object 200 is measured with the spectral radiance meter 10 according to the present invention, the operator first gives an instruction to start measurement in a state in which one end 200a and the other end 200b of the object 200 in the Z-axis direction can be photographed by the two-dimensional imaging detector 24.
 When the operator gives an instruction to start measurement, the control unit 26 starts the photographing process.
 Here, the flowchart of FIG. 3 shows the detailed contents of the photographing process in the spectral radiance meter 10 according to the present invention. In this photographing process, first, the precision linear motion stage 14 is moved so that the photographing position of the two-dimensional imaging detector 24 coincides with the photographing start position (step S302).
 This photographing start position is set in advance and is, for example, one end in the Y-axis direction of the predetermined measurement region including the object 200.
 That is, in the processing of step S302, the movement control unit 32 moves the precision linear motion stage 14 to the position at which the photographing position of the two-dimensional imaging detector 24 coincides with the photographing start position, and outputs information indicating that the precision linear motion stage 14 has been moved to the imaging control unit 30.
 Next, the imaging control unit 30 executes processing for photographing with the two-dimensional imaging detector 24 (step S304).
 That is, in the processing of step S304, the imaging control unit 30 performs photographing (acquisition of an electrical signal) with the two-dimensional imaging detector 24 and outputs information indicating that photographing has been performed to the movement control unit 32.
 Specifically, when, for example, the light beam from the other end 200d of the object 200 in the Y-axis direction is incident on the two-dimensional imaging detector 24 by the photographing, the two-dimensional imaging detector 24 converts the light intensity distribution of the incident light into an electrical signal and outputs this electrical signal to the spectral data creating unit 34.
 Thereafter, the spectral data creating unit 34 creates two-dimensional spectral data having one-dimensional spatial information and one-dimensional wavelength information based on the electrical signal output from the two-dimensional imaging detector 24. The created two-dimensional spectral data are output to and stored in the storage unit 52.
Then, once imaging by the two-dimensional imaging detector 24 has been performed, the movement control unit 32 moves the precision linear motion stage 14 by a predetermined interval (step S306).
That is, in the processing of step S306, the movement control unit 32 moves the precision linear motion stage 14 by the predetermined interval and outputs information indicating that the precision linear motion stage 14 has moved to the imaging control unit 30.
At this time, when the precision linear motion stage 14 has been moved until the imaging position of the two-dimensional imaging detector 24 reaches the imaging end position, the movement control unit 32 outputs information indicating that the stage has moved to the imaging end position.
Thereafter, the imaging control unit 30 determines whether or not the precision linear motion stage 14 has moved until the imaging position of the two-dimensional imaging detector 24 reaches the imaging end position (step S308).
This imaging end position is set in advance and is, for example, the other end, in the Y-axis direction, of the predetermined imaging region.
That is, in the determination processing of step S308, the imaging control unit 30 determines whether or not the information indicating movement to the imaging end position has been output from the movement control unit 32.
In other words, in the determination processing of step S308, when the information indicating movement to the imaging end position has not been output from the movement control unit 32, the imaging control unit 30 determines that the precision linear motion stage 14 has not yet moved to the point at which the imaging position of the two-dimensional imaging detector 24 reaches the imaging end position. When that information has been output from the movement control unit 32, the imaging control unit 30 determines that the precision linear motion stage 14 has moved to the point at which the imaging position reaches the imaging end position.
If it is determined in step S308 that the precision linear motion stage 14 has not moved to the point at which the imaging position of the two-dimensional imaging detector 24 reaches the imaging end position, the processing returns to step S304 and the processing from step S304 onward is repeated.
If, on the other hand, it is determined in step S308 that the precision linear motion stage 14 has moved to the point at which the imaging position of the two-dimensional imaging detector 24 reaches the imaging end position, imaging (acquisition of an electrical signal) is performed by the two-dimensional imaging detector 24 at the imaging end position (step S310), and the imaging process ends.
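The imaging process of steps S302 to S310 amounts to a move-and-capture loop coordinated between the movement control and the imaging control. The Python sketch below mirrors that loop under the assumption of hypothetical stage and detector driver objects exposing move_to() and capture(); actual hardware interfaces will differ.

```python
import numpy as np

def scan(stage, detector, start_pos_mm, end_pos_mm, step_mm):
    """Step the stage across the measurement region and grab one frame
    per position, mirroring steps S302-S310 of the imaging process.

    `stage` and `detector` are hypothetical driver objects exposing
    move_to(pos_mm) and capture() -> 2-D ndarray; real hardware APIs differ.
    """
    frames, positions = [], []
    pos = start_pos_mm
    stage.move_to(pos)                      # S302: go to the imaging start position
    while True:
        frames.append(detector.capture())   # S304 / S310: acquire a frame
        positions.append(pos)
        if pos >= end_pos_mm:               # S308: end position reached?
            break
        pos = min(pos + step_mm, end_pos_mm)
        stage.move_to(pos)                  # S306: advance by the predetermined interval
    return np.stack(frames), np.array(positions)
```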
When the imaging process is completed, the measurement process is started.
Here, the flowchart of Fig. 4 shows the detailed content of the measurement process in the spectral radiance meter 10 according to the present invention. In this measurement process, the spectral data creation unit 34 first integrates the two-dimensional spectral data, each having one-dimensional spatial information and one-dimensional wavelength information (high-wavelength-resolution spectral information), acquired at the respective imaging positions, and creates three-dimensional spectral data having two-dimensional spatial information and one-dimensional wavelength information representing the predetermined measurement region (step S402).
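Step S402 can be pictured as stacking the per-position two-dimensional spectral data into a single datacube with two spatial axes and one wavelength axis. A minimal sketch, assuming the frames are already registered and share the same shape:

```python
import numpy as np

def build_datacube(frames):
    """Integrate the per-position 2-D spectral data into 3-D spectral data.

    frames: sequence of shape (n_positions, n_spatial, n_bands), one entry
            per Y-axis imaging position.
    Returns a cube indexed as (y, z, wavelength) -- two spatial axes plus one
    wavelength axis, i.e. a hyperspectral datacube.
    """
    cube = np.stack(frames, axis=0)     # (Y, Z, lambda)
    return cube

# With frames collected by the scan sketch above this would yield, for
# example, a (100, 512, 256) cube for 100 stage positions.
```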
Next, the spectral image creation unit 38 creates a spectral image (hyperspectral image) for each wavelength from the created three-dimensional spectral data (step S404), and then calibration processing is performed on the created spectral image for each wavelength (step S406).
Here, Fig. 5 is a flowchart showing the detailed content of the calibration processing. In this calibration processing, dark-noise correction is performed first (step S502).
That is, in the processing of step S502, dark-noise correction is performed using dark-noise correction data created in advance and stored in the storage unit 52.
Specifically, the calibration processing unit 42 subtracts, from the output value of each pixel of the spectral image for each wavelength acquired by imaging the object 200 and created in step S404, the output value of the corresponding pixel of the dark-noise correction data for the corresponding wavelength.
Next, inter-pixel sensitivity deviation correction is performed on the dark-noise-corrected spectral image for each wavelength (step S504).
That is, in the processing of step S504, the calibration processing unit 42 multiplies each pixel of the dark-noise-corrected spectral image for each wavelength by a sensitivity correction coefficient created in advance and stored in the storage unit 52.
Thereafter, luminance calibration is performed on the spectral image for each wavelength that has undergone the inter-pixel sensitivity deviation correction (step S506).
That is, in the processing of step S506, the calibration processing unit 42 multiplies the pixel output values of the spectral image for each wavelength that has undergone the inter-pixel sensitivity deviation correction by a luminance calibration coefficient created in advance and stored in the storage unit 52. The values obtained by multiplying by this luminance calibration coefficient represent the spectral radiance at each pixel of the spectral image for each wavelength.
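Steps S502 to S506 thus form a per-band calibration chain: subtract the dark-noise image, multiply by the per-pixel sensitivity correction coefficients, and multiply by the luminance calibration coefficients. A minimal sketch of that chain is shown below; the argument names are illustrative, and the calibration data themselves are assumed to have been prepared beforehand as described above.

```python
import numpy as np

def calibrate_band_image(raw, dark, gain, luminance_coeff):
    """Apply the three calibration steps to one per-wavelength band image.

    raw             : band image of the object (counts)
    dark            : dark-noise image for the same wavelength (S502)
    gain            : per-pixel sensitivity-deviation coefficients (S504)
    luminance_coeff : per-pixel luminance calibration coefficients that map
                      corrected counts to spectral radiance (S506)
    All arrays share the same 2-D shape; the names are illustrative only.
    """
    corrected = raw.astype(np.float64) - dark     # dark-noise correction
    corrected *= gain                             # inter-pixel sensitivity correction
    radiance = corrected * luminance_coeff        # luminance calibration -> spectral radiance
    return radiance
```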
Then, the calculation unit 44 calculates the spectral luminance at each pixel of the spectral image for each wavelength from the acquired spectral radiance at each pixel of the spectral image for each wavelength (step S508), and the processing proceeds to step S408 of the analysis processing.
In the processing of step S408, light source correction is performed: the calibration processing unit 42 divides the output value of each pixel of the dark-noise-corrected spectral image for each wavelength obtained in step S502 by the output value of the corresponding pixel of the light source data spectral image for the corresponding wavelength, thereby obtaining the spectral reflectance or spectral transmittance of each pixel of the spectral image for each wavelength of the object 200.
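The light source correction of step S408 is, per band, a pixel-wise division of the dark-corrected object image by the corresponding light-source reference image. A minimal sketch, with a small epsilon added as a guard against division by zero (an implementation detail assumed here, not stated in the specification):

```python
import numpy as np

def light_source_correction(object_band, source_band, eps=1e-12):
    """Divide the dark-corrected object band image by the corresponding band
    of the light-source reference data (step S408), giving spectral
    reflectance (reflection setup) or spectral transmittance (transmission
    setup) per pixel.  `eps` avoids division by zero in dark reference pixels.
    """
    return object_band / np.maximum(source_band, eps)
```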
Next, the color analysis image acquisition unit 50 acquires a color space image (step S410).
That is, in the processing of step S410, a color space image expressed in a color system of a preset standard is acquired from the spectral images for each wavelength, for which the spectral radiance, the spectral luminance, and the spectral reflectance or spectral transmittance have been calculated for each pixel, using the color matching functions corresponding to the respective color systems stored in the storage unit 52.
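Step S410 can be understood as integrating the per-pixel spectra against the color matching functions of the chosen color system. The sketch below does this for the CIE XYZ case; obtaining and resampling the color matching function table is left to the caller (assumed here), and the absence of any normalization constant is a simplification.

```python
import numpy as np

def spectra_to_xyz(cube, wavelengths_nm, cmf):
    """Convert a (y, z, lambda) spectral radiance cube into a per-pixel XYZ
    color-space image (step S410).

    cmf: array of shape (n_bands, 3) holding the x-bar, y-bar, z-bar color
         matching functions sampled at `wavelengths_nm`.
    """
    d_lambda = np.gradient(wavelengths_nm)               # band widths in nm
    weighted = cube * d_lambda[None, None, :]             # radiance * d(lambda)
    xyz = np.tensordot(weighted, cmf, axes=([2], [0]))    # -> (y, z, 3)
    return xyz
```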
Thereafter, a color analysis image is acquired by applying color calculation processing to the acquired color space image using a conversion function stored in advance in the storage unit 52 (step S412); the acquired color analysis image is converted into a display according to a predetermined display method and output to a display unit (not shown) (step S414), and the measurement process ends. The display unit (not shown) performs display based on the information output in step S414.
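As an example of the color calculation (step S412) and display conversion (step S414), the sketch below converts an XYZ color space image into xy chromaticity coordinates and scales one analysis channel to an 8-bit gradation for display; this particular conversion and scaling are illustrative choices, not the only ones the stored conversion function may implement.

```python
import numpy as np

def xyz_to_xy(xyz, eps=1e-12):
    """Color-calculation step (S412): XYZ color-space image -> xy chromaticity
    image, one (x, y) pair per pixel."""
    s = xyz.sum(axis=-1, keepdims=True)
    return xyz[..., :2] / np.maximum(s, eps)

def to_display(channel):
    """Display-conversion step (S414): scale one analysis channel (e.g. the
    x chromaticity value) to 0-255 so it can be shown as a color gradation."""
    lo, hi = np.nanmin(channel), np.nanmax(channel)
    return np.uint8(255 * (channel - lo) / max(hi - lo, 1e-12))
```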
Here, the results of an experiment in which the object 200 was measured using the spectral radiance meter 10 according to the present invention will be described.
A leaf bundle in which fresh leaves (natural leaves) and artificial leaves (fabricated leaves) were mixed was used as the object, and this leaf bundle was placed under halogen light (light from a light source provided separately from the spectral radiance meter 10).
The halogen light was irradiated obliquely from above, and measurement was performed with the spectral radiance meter 10 from a distance of 45 cm in front of the object.
In the measurement process, color analysis was performed on the basis of the spectral information after the calibration processing, and the color information in the two-dimensional space of a predetermined imaging region including the object 200 was displayed as images, which are shown in Figs. 6(a) to 6(e).
Fig. 6(a) shows an RGB image of the leaf bundle captured by the spectral radiance meter 10; Fig. 6(b) shows a spectral image at a predetermined wavelength (band) created by the spectral radiance meter 10; Fig. 6(c) is a graph showing the spectral reflectance of the spectral image; Fig. 6(d) shows an image obtained by converting the spectral image of Fig. 6(b) into tristimulus values of the XYZ color system (X value); and Fig. 6(e) shows an image obtained by converting the spectral image of Fig. 6(b) into xy chromaticity coordinates (x value).
In Fig. 6(c), the spectral reflectance R(λ) was calculated for each pixel of the spectral image, and the spectral reflectance of a pixel designated by the operator (an average of a plurality of pixels may also be used) is displayed as a graph.
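A plot such as Fig. 6(c) can be produced by extracting the reflectance spectrum of an operator-designated pixel (or the average of a small neighbourhood) from the reflectance datacube. A minimal sketch using matplotlib, with hypothetical argument names:

```python
import numpy as np
import matplotlib.pyplot as plt

def plot_pixel_reflectance(reflectance_cube, wavelengths_nm, y, z, half_window=0):
    """Plot R(lambda) for the pixel (y, z), optionally averaged over a small
    (2*half_window+1)^2 neighbourhood, as in Fig. 6(c)."""
    region = reflectance_cube[y - half_window:y + half_window + 1,
                              z - half_window:z + half_window + 1, :]
    spectrum = region.mean(axis=(0, 1))
    plt.plot(wavelengths_nm, spectrum)
    plt.xlabel("wavelength [nm]")
    plt.ylabel("spectral reflectance")
    plt.show()
```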
Fig. 6(d) is an image in which the spectral radiance of each pixel of the spectral image for each wavelength was converted, by color calculation using the color matching functions, into a color space image having the XYZ tristimulus values at each pixel, and only the X value is displayed as a color gradation according to intensity.
Further, Fig. 6(e) is an image in which color calculation processing was applied to the acquired XYZ color space image using a conversion function to obtain a color analysis image having xy chromaticity coordinate values at each pixel, and the x value is displayed as a color gradation according to intensity.
As described above, in the spectral radiance meter 10 according to the present invention, the slit plate 16, the collimating lens 18, the dispersive optical element 20, the imaging lens 22, and the two-dimensional imaging detector 24 are provided downstream of the objective lens 12, so that two-dimensional spectral data having one-dimensional spatial information extending in the Z-axis direction and one-dimensional wavelength information (high-wavelength-resolution spectral information) extending in the Y-axis direction is acquired.
The slit plate 16, the collimating lens 18, the dispersive optical element 20, the imaging lens 22, and the two-dimensional imaging detector 24 are arranged on the precision linear motion stage 14, which is movable in the Y-axis direction, so that the imaging position within the measurement region is changed in the Y-axis direction.
The acquired two-dimensional spectral data of the respective imaging positions are integrated to create three-dimensional spectral data having two-dimensional spatial information and one-dimensional wavelength information, and a spectral image (hyperspectral image) for each wavelength is created from the created three-dimensional spectral data.
As a result, the spectral radiance meter 10 according to the present invention can image a predetermined measurement region including the object in a short time.
In addition, the spectral radiance meter 10 according to the present invention calculates the spectral radiance, the spectral luminance, and the spectral reflectance or spectral transmittance for each pixel of the acquired spectral image, and applies color space conversion processing to the spectral image to acquire a color space image expressed in a predetermined color system.
Furthermore, the acquired color space image is subjected to color calculation processing to acquire a color analysis image, which is then converted into a display according to a predetermined display method.
As a result, the spectral radiance meter 10 according to the present invention can acquire the spectral radiance of each pixel over a wide two-dimensional space in a short time.
Furthermore, the spectral radiance meter 10 according to the present invention can display results in various existing color systems, enabling unified and objective color analysis display.
Therefore, the spectral radiance meter 10 according to the present invention enables objective and effective color analysis in situations where various kinds of spatial color analysis are required (for example, color inspection of display panels, inspection of light sources, color management of industrial products, and color information as basic data for human visual perception).
The embodiment described above may be modified as shown in the following (1) to (6).
(1) In the embodiment described above, the slit plate 16, the collimating lens 18, the dispersive optical element 20, the imaging lens 22, and the two-dimensional imaging detector 24 are arranged on the precision linear motion stage 14, which is movable in the Y-axis direction; however, the present invention is of course not limited to this.
That is, the components from the slit plate 16 onward may be moved integrally in the Y-axis direction by any mechanism capable of controlled movement along a straight line, for example, a fluid (hydraulic or pneumatic), electromagnetic, ultrasonic, or piezoelectric actuator.
(2) In the embodiment described above, the calibration processing is performed after the spectral image for each wavelength acquired by imaging the object 200 is created; however, the present invention is of course not limited to this, and the dark-noise correction and the light source correction in the calibration processing may be performed, for example, after an instruction from the operator.
(3) In the embodiment described above, the imaging position is moved in the Y-axis direction by moving the components downstream of the objective lens 12 in the Y-axis direction with the precision linear motion stage 14 provided inside the spectral radiance meter 10; however, the present invention is of course not limited to this. In the spectral radiance meter 10, the objective lens 12, the slit plate 16, the collimating lens 18, the dispersive optical element 20, the imaging lens 22, and the two-dimensional imaging detector 24 may be fixed, and the imaging position may be moved in the Y-axis direction by changing the positional relationship with the object 200 in the Y-axis direction.
That is, either the spectral radiance meter 10 or the object 200 is fixed and the other is moved in the Y-axis direction, thereby moving the imaging position in the Y-axis direction.
Specifically, when the spectral radiance meter 10 is moved, the objective lens 12, the slit plate 16, the collimating lens 18, the dispersive optical element 20, the imaging lens 22, and the two-dimensional imaging detector 24 may, for example, be fixedly arranged without providing the precision linear motion stage 14 inside the spectral radiance meter 10, and the spectral radiance meter 10 may be moved in the Y-axis direction relative to the fixed object 200 by moving means controlled by a control unit provided separately from the spectral radiance meter 10 (see Fig. 11(a)). In this case, the control unit that controls the moving means is connected to the control unit 26 of the spectral radiance meter 10, and the timing of movement and the timing of imaging are controlled.
When the object 200 is moved, the objective lens 12, the slit plate 16, the collimating lens 18, the dispersive optical element 20, the imaging lens 22, and the two-dimensional imaging detector 24 may likewise be fixedly arranged without providing the precision linear motion stage 14 inside the spectral radiance meter 10, and the object 200 may be moved in the Y-axis direction relative to the fixed spectral radiance meter 10 by moving means controlled by a control unit (see Fig. 11(b)). In this case as well, the control unit that controls the moving means is connected to the control unit 26 of the spectral radiance meter 10, and the timing of movement and the timing of imaging are controlled.
(4) In the embodiment described above, the components from the slit plate 16 onward are arranged on the precision linear motion stage 14, which is movable in the Y-axis direction, and the imaging position within the predetermined imaging region including the object 200 is moved in the Y-axis direction; however, the present invention is of course not limited to this, and the mechanism for moving the imaging position may instead take any of the following three configurations.
(I) A configuration in which the objective lens 12 is arranged on a precision linear motion stage
(II) A configuration using a galvano mirror
(III) A configuration using a pair of achromatic prisms
Each of configurations (I) to (III) is described below.
(I) Configuration in which the objective lens 12 is arranged on a precision linear motion stage
Fig. 7 is a schematic configuration diagram of a modification of the spectral radiance meter according to the present invention.
The spectral radiance meter 100 shown in Fig. 7 comprises an objective lens 102 that receives a light beam from a predetermined imaging region including the object 200; a precision linear motion stage 104 on which the objective lens 102 is arranged and which moves in the Y-axis direction; a slit plate 16 arranged so that its slit opening 16a, extending in the Z-axis direction, is located on the image plane of the objective lens 102; a collimating lens 18 that collimates the light beam that has passed through the slit opening 16a; a dispersive optical element 20 that disperses the parallel light from the collimating lens 18 in the Y-axis direction; an imaging lens 22 that focuses the light beam emitted from the dispersive optical element 20; a two-dimensional imaging detector 24 arranged so that its detection section 24a is located on the image plane of the imaging lens 22; and a control unit 26 that controls the precision linear motion stage 104 and the two-dimensional imaging detector 24 and processes the information output from the two-dimensional imaging detector 24.
The slit plate 16, the collimating lens 18, the dispersive optical element 20, the imaging lens 22, and the two-dimensional imaging detector 24 are each fixedly arranged.
The movement control unit 32 sequentially moves the precision linear motion stage 104 in the Y-axis direction, within a predetermined imaging region containing the object 200 from its one end 200c to its other end 200d in the Y-axis direction, by a predetermined interval corresponding to the width of the slit opening 16a in the Y-axis direction.
More specifically, when the operator instructs the start of imaging, the movement control unit 32 moves the precision linear motion stage 104 so that the imaging position of the two-dimensional imaging detector 24 becomes the imaging start position, and outputs information indicating that the precision linear motion stage 104 has moved to the imaging control unit 30.
Then, when information indicating that imaging has been performed is output from the imaging control unit 30, the movement control unit 32 moves the precision linear motion stage 104 by the predetermined interval corresponding to the width of the slit opening 16a in the Y-axis direction and outputs information indicating that the precision linear motion stage 104 has moved to the imaging control unit 30.
When the movement control unit 32 has moved the precision linear motion stage 104 until the imaging position of the two-dimensional imaging detector 24 reaches the imaging end position, it outputs information indicating that the stage has moved to the imaging end position to the imaging control unit 30.
Then, while the movement control unit 32 moves the precision linear motion stage 104 to shift the imaging position, the imaging control unit 30 performs imaging, and the spectral data creation unit 34 creates two-dimensional spectral data having one-dimensional spatial information and one-dimensional wavelength information (high-wavelength-resolution spectral information) on the basis of the electrical signal from the two-dimensional imaging detector 24.
(II) Configuration using a galvano mirror
Fig. 8 is a schematic configuration diagram of a modification of the spectral radiance meter according to the present invention.
The imaging unit 110 shown in Fig. 8 comprises an F-θ lens 112 that receives a light beam from a predetermined imaging region including the object 200; a galvano mirror 114 provided downstream of the F-θ lens 112; an imaging lens 116 that focuses the light beam reflected by the galvano mirror 114; a slit plate 16 arranged so that its slit opening 16a, extending in the Z-axis direction, is located on the image plane of the imaging lens 116; a collimating lens 18 that collimates the light beam that has passed through the slit opening 16a; a dispersive optical element 20 that disperses the parallel light from the collimating lens 18 in the X-axis direction; an imaging lens 22 that focuses the light beam emitted from the dispersive optical element 20; and a two-dimensional imaging detector 24 arranged so that its detection section 24a is located on the image plane of the imaging lens 22.
The F-θ lens 112, the imaging lens 116, the slit plate 16, the collimating lens 18, the dispersive optical element 20, the imaging lens 22, and the two-dimensional imaging detector 24 are each fixedly arranged.
The galvano mirror 114 reflects the light, substantially collimated by the F-θ lens 112, at its reflecting surface so that it enters the imaging lens 116, and this reflecting surface is arranged so as to rotate about the Z axis.
The movement control unit 32 controls the rotation angle and rotation direction of the reflecting surface. The rotation angle and rotation direction are set before the imaging process.
In the following description, "rotating the galvano mirror 114" means "rotating the reflecting surface of the galvano mirror 114".
That is, by rotating, the galvano mirror 114 turns its reflecting surface and thereby changes the reflection angle of the light collimated by the F-θ lens 112.
The movement control unit 32 sequentially rotates the galvano mirror 114 about the Z axis by a predetermined rotation angle corresponding to the width of the slit opening 16a in the Y-axis direction, so that the imaging position moves in the Y-axis direction within a predetermined imaging region containing the object 200 from its one end 200c to its other end 200d in the Y-axis direction.
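Under an idealized F-θ relation (image height proportional to field angle) and the usual factor of two between mirror rotation and reflected-beam deflection, the mirror increment corresponding to one slit width on the intermediate image plane can be estimated as sketched below. This small-angle model is an assumption for illustration only; a real instrument would be calibrated empirically.

```python
import math

def mirror_step_rad(slit_width_mm, f_theta_focal_mm):
    """Mirror rotation increment that shifts the image on the slit plane by
    one slit width, under an idealized F-theta relation y = f * theta and the
    usual factor of two between mirror angle and reflected-beam angle.
    Simplified small-angle model with hypothetical parameter values.
    """
    beam_angle = slit_width_mm / f_theta_focal_mm   # required change in beam angle
    return beam_angle / 2.0                         # mirror rotates half of that

print(math.degrees(mirror_step_rad(0.05, 100.0)))   # e.g. ~0.014 degrees per step
```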
More specifically, when the operator instructs the start of imaging, the movement control unit 32 rotates the galvano mirror 114 so that the imaging position of the two-dimensional imaging detector 24 becomes the imaging start position, and outputs information indicating that the galvano mirror 114 has rotated to the imaging control unit 30.
Then, when information indicating that imaging has been performed is output from the imaging control unit 30, the movement control unit 32 rotates the galvano mirror 114 by the predetermined rotation angle corresponding to the width of the slit opening 16a in the Y-axis direction and outputs information indicating that the galvano mirror 114 has rotated to the imaging control unit 30.
When the movement control unit 32 has rotated the galvano mirror 114 until the imaging position of the two-dimensional imaging detector 24 reaches the imaging end position, it outputs information indicating that the imaging position has moved to the imaging end position to the imaging control unit 30.
Then, while the movement control unit 32 controls the rotation of the galvano mirror 114 to shift the imaging position, the imaging control unit 30 performs imaging, and the spectral data creation unit 34 creates two-dimensional spectral data having one-dimensional spatial information and one-dimensional wavelength information (high-wavelength-resolution spectral information) on the basis of the electrical signal from the two-dimensional imaging detector 24.
(III) Configuration using a pair of achromatic prisms
Fig. 9 is a schematic configuration diagram of a modification of the spectral radiance meter according to the present invention.
The spectral radiance meter 120 shown in Fig. 9 comprises a pair of achromatic prisms 122 arranged in front of the objective lens 12 on which the light beam from the object 200 is incident; a slit plate 16 arranged so that its slit opening 16a, extending in the Z-axis direction, is located on the image plane of the objective lens 12; a collimating lens 18 that collimates the light beam that has passed through the slit opening 16a; a dispersive optical element 20 that disperses the parallel light from the collimating lens 18 in the Y-axis direction; an imaging lens 22 that focuses the light beam emitted from the dispersive optical element 20; and a two-dimensional imaging detector 24 arranged so that its detection section 24a is located on the image plane of the imaging lens 22.
The objective lens 12, the slit plate 16, the collimating lens 18, the dispersive optical element 20, the imaging lens 22, and the two-dimensional imaging detector 24 are fixedly arranged.
The pair of achromatic prisms 122 consists of achromatic prisms 122-1 and 122-2, which are arranged side by side along the X-axis direction with their apex angles pointing in the same direction. In the following description, the state of the pair of achromatic prisms shown in Figs. 10(a) and 10(b) is referred to as the "initial state".
Under the control of the movement control unit 32, the achromatic prisms 122-1 and 122-2 of the pair 122 are rotated from the initial state in mutually opposite directions by the same angle.
When the achromatic prisms 122-1 and 122-2 have rotated in mutually opposite directions so that their apex angles point in opposite directions and lie parallel to the Z-axis direction (see Figs. 10(c) and 10(d)), the pair of achromatic prisms 122 functions as a plane-parallel plate.
Fig. 10(c) shows the state in which the achromatic prism 122-1 of Fig. 10(a) has rotated 90° in the direction of arrow I and the achromatic prism 122-2 has rotated 90° in the direction of arrow II.
That is, the achromatic prisms 122-1 and 122-2 rotate in mutually opposite directions about their common central axis O; for example, when the achromatic prism 122-1 rotates in the direction of arrow I, the achromatic prism 122-2 rotates in the direction of arrow II (see Fig. 10(a)). The achromatic prisms 122-1 and 122-2 are each rotatable within a range of 0° to 180°.
With the pair of achromatic prisms 122 configured in this way, the position of the light beam incident on the two-dimensional imaging detector 24 arranged downstream of the pair of achromatic prisms 122 can be changed, and the imaging position imaged by the two-dimensional imaging detector 24 is thereby moved in the Y-axis direction.
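In a thin-prism, small-angle approximation of such a counter-rotated pair (often called Risley prisms), the net deviation stays along one axis and falls off roughly as the cosine of the rotation angle, vanishing at 90° where the pair behaves as a plane-parallel plate. The sketch below maps the rotation angle to an approximate shift of the imaging position; the single-prism deviation value, the cosine law, and the working distance are assumptions for illustration, not figures from this specification.

```python
import math

def scan_offset_mm(rotation_deg, single_prism_deviation_deg, working_distance_mm):
    """Approximate Y-axis shift of the imaging position for a pair of thin
    wedge prisms counter-rotated by +/-rotation_deg from the aligned state.

    Thin-prism, small-angle model: the two deviations add along the original
    axis as 2*delta*cos(theta), giving the full 2*delta deviation at 0 degrees
    and zero deviation (plane-parallel-plate behaviour) at 90 degrees.
    """
    theta = math.radians(rotation_deg)
    delta = math.radians(single_prism_deviation_deg)
    net_deviation = 2.0 * delta * math.cos(theta)
    return working_distance_mm * math.tan(net_deviation)

for ang in (0, 30, 60, 90):
    print(ang, round(scan_offset_mm(ang, 2.0, 450.0), 2))
```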
The movement control unit 32 sequentially rotates the achromatic prisms 122-1 and 122-2 about the X axis, each by a predetermined rotation angle corresponding to the width of the slit opening 16a in the Y-axis direction, so that the imaging position moves in the Y-axis direction within a predetermined imaging region containing the object 200 from its one end 200c to its other end 200d in the Y-axis direction.
More specifically, when the operator instructs the start of imaging, the movement control unit 32 rotates the achromatic prisms 122-1 and 122-2 in mutually opposite directions so that the imaging position of the two-dimensional imaging detector 24 becomes the imaging start position, and outputs information indicating that the achromatic prisms 122-1 and 122-2 have rotated to the imaging control unit 30.
Then, when information indicating that imaging has been performed is output from the imaging control unit 30, the movement control unit 32 rotates the achromatic prisms 122-1 and 122-2 each by the predetermined rotation angle corresponding to the width of the slit opening 16a in the Y-axis direction and outputs information indicating that the achromatic prisms 122-1 and 122-2 have rotated to the imaging control unit 30.
When the movement control unit 32 has rotated the achromatic prisms 122-1 and 122-2 until the imaging position of the two-dimensional imaging detector 24 reaches the imaging end position, it outputs information indicating that the imaging position has moved to the imaging end position to the imaging control unit 30.
Then, while the movement control unit 32 rotates the achromatic prisms 122-1 and 122-2 to shift the imaging position, the imaging control unit 30 performs imaging, and the spectral data creation unit 34 creates two-dimensional spectral data having one-dimensional spatial information and one-dimensional wavelength information (high-wavelength-resolution spectral information) on the basis of the electrical signal from the two-dimensional imaging detector 24.
(5) In the embodiment described above, the wavelength range, the wavelength resolution, and the number of spatial pixels are changed by exchanging hardware components such as the objective lens 12, the slit plate 16, the collimating lens 18, the dispersive optical element 20, the imaging lens 22, and the two-dimensional imaging detector 24; however, the present invention is of course not limited to this. Using the functions of the imaging software, the operator may freely set not only the wavelength range, the wavelength resolution, and the number of spatial pixels but also the imaging speed, the exposure time, and the gain, by reading an externally defined file or by specifying parameters, as sketched below.
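Such an externally defined acquisition profile could look like the following sketch; the file format (JSON) and the field names are hypothetical, chosen only to illustrate the idea of software-selectable wavelength range, resolution, spatial pixel count, imaging speed, exposure time, and gain.

```python
import json
from dataclasses import dataclass

@dataclass
class AcquisitionProfile:
    wavelength_min_nm: float
    wavelength_max_nm: float
    wavelength_step_nm: float
    spatial_pixels: int
    frame_rate_hz: float
    exposure_ms: float
    gain_db: float

def load_profile(path):
    """Read an externally defined acquisition profile (JSON assumed here)."""
    with open(path) as f:
        return AcquisitionProfile(**json.load(f))

# Example file contents (hypothetical field names):
# {"wavelength_min_nm": 380, "wavelength_max_nm": 780, "wavelength_step_nm": 2,
#  "spatial_pixels": 512, "frame_rate_hz": 30, "exposure_ms": 10, "gain_db": 0}
```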
(6) The embodiment described above and the modifications shown in (1) to (5) above may be combined as appropriate.
The present invention is suitable for use as a spectral radiance meter that measures a predetermined region including a target object and acquires detailed color information of that region.
10, 100, 110, 120    Spectral radiance meter
12, 102    Objective lens
14    Precision linear motion stage
16    Slit plate
18    Collimating lens
20    Dispersive optical element
22, 116    Imaging lens
24    Two-dimensional imaging detector
26    Control unit
30    Imaging control unit
32    Movement control unit
34    Spectral data creation unit
36    Image creation unit
38    Spectral image creation unit
40    Color analysis image creation unit
42    Calibration processing unit
44    Calculation unit
50    Color analysis image acquisition unit
52    Storage unit
112    F-θ lens
114    Galvano mirror
122    Pair of achromatic prisms
122-1, 122-2    Achromatic prism
200    Object

Claims (4)

1.  A spectral radiance meter comprising:
    a two-dimensional imaging detector into which a light beam from an object, incident through an objective lens, enters after being dispersed by a dispersive optical element in a direction orthogonal to a predetermined direction, and which acquires, by imaging, a signal based on the incident light beam;
    moving means for moving an imaging position imaged by the two-dimensional imaging detector in the predetermined direction by integrally moving, in the predetermined direction, the components provided downstream of the objective lens;
    first control means for controlling the imaging timing of the two-dimensional imaging detector;
    second control means for controlling the movement of the moving means;
    spectral data creation means for creating, on the basis of the signal, two-dimensional spectral data having one-dimensional spatial information and one-dimensional wavelength information at each imaging position, and for creating, from the two-dimensional spectral data at the respective imaging positions, three-dimensional spectral data having two-dimensional spatial information and one-dimensional wavelength information;
    first image creation means for creating a spectral image for each wavelength from the three-dimensional spectral data;
    acquisition means for acquiring the spectral radiance at each pixel of the spectral image; and
    second image creation means for applying color space conversion processing to the spectral image to create a color space image in a predetermined color system, applying color calculation processing to the color space image to create a color analysis image, and converting the color analysis image into a display according to a predetermined display method.
2.  The spectral radiance meter according to claim 1, wherein the two-dimensional imaging detector is detachable together with optical elements including the objective lens and the dispersive optical element, and the wavelength range, the wavelength resolution, and the number of spatial pixels that can be imaged can be changed by exchanging the two-dimensional imaging detector and the optical elements.
3.  The spectral radiance meter according to claim 1 or 2, wherein the moving means is a precision linear motion stage.
4.  A spectral radiance meter comprising:
    a two-dimensional imaging detector into which a light beam from an object, incident through an objective lens, enters after being dispersed by a dispersive optical element in a direction orthogonal to a predetermined direction, and which acquires, by imaging, a signal based on the incident light beam;
    control means for controlling the imaging timing of the two-dimensional imaging detector;
    spectral data creation means for creating, on the basis of the signal, two-dimensional spectral data having one-dimensional spatial information and one-dimensional wavelength information at each imaging position, and for creating, from the two-dimensional spectral data at the respective imaging positions, three-dimensional spectral data having two-dimensional spatial information and one-dimensional wavelength information;
    first image creation means for creating a spectral image for each wavelength from the three-dimensional spectral data;
    acquisition means for acquiring the spectral radiance at each pixel of the spectral image; and
    second image creation means for applying color space conversion processing to the spectral image to create a color space image in a predetermined color system, applying color calculation processing to the color space image to create a color analysis image, and converting the color analysis image into a display according to a predetermined display method,
    wherein the object is imaged while the imaging position is moved in the predetermined direction by changing the positional relationship with the object in the predetermined direction.
PCT/JP2015/056218 2014-03-03 2015-03-03 Spectroradiometer WO2015133476A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2014-040711 2014-03-03
JP2014040711A JP6068375B2 (en) 2014-03-03 2014-03-03 Spectral radiance meter

Publications (1)

Publication Number Publication Date
WO2015133476A1 true WO2015133476A1 (en) 2015-09-11

Family

ID=54055282

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2015/056218 WO2015133476A1 (en) 2014-03-03 2015-03-03 Spectroradiometer

Country Status (2)

Country Link
JP (1) JP6068375B2 (en)
WO (1) WO2015133476A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7450960B2 (en) 2021-11-17 2024-03-18 國立中正大學 Color measurement method and device

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018212078A1 (en) 2017-05-17 2018-11-22 エバ・ジャパン株式会社 Information retrieval system and method, and information retrieval program
JP6843439B2 (en) 2017-05-17 2021-03-17 エバ・ジャパン 株式会社 Information retrieval system and method, information retrieval program
JP6568245B2 (en) * 2018-01-24 2019-08-28 Ckd株式会社 Inspection device, PTP packaging machine, and calibration method for inspection device
JP6951753B2 (en) 2018-03-27 2021-10-20 エバ・ジャパン 株式会社 Information search system and program
JP2020193928A (en) * 2019-05-30 2020-12-03 株式会社分光応用技術研究所 Two-dimensional spectroscopic measurement system and data processing method

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6495818B1 (en) * 1998-07-21 2002-12-17 The Institute For Technology Development Microscopic hyperspectral imaging scanner
JP2004005566A (en) * 2002-04-23 2004-01-08 Matsushita Electric Ind Co Ltd Color compiling apparatus and color compiling method
JP2004045189A (en) * 2002-07-11 2004-02-12 Matsushita Electric Ind Co Ltd Color correction device and color correction method
JP2009039280A (en) * 2007-08-08 2009-02-26 Arata Satori Endoscopic system and method of detecting subject using endoscopic system
JP2011089895A (en) * 2009-10-22 2011-05-06 Arata Satori Device and method of hyperspectral imaging
WO2013009189A1 (en) * 2011-07-08 2013-01-17 Norsk Elektro Optikk As Hyperspectral camera and method for acquiring hyperspectral data

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4400450B2 (en) * 2004-12-22 2010-01-20 コニカミノルタセンシング株式会社 Stray light correction method and two-dimensional spectral luminance meter using the same
US7616314B2 (en) * 2006-01-30 2009-11-10 Radiant Imaging, Inc. Methods and apparatuses for determining a color calibration for different spectral light inputs in an imaging apparatus measurement


Also Published As

Publication number Publication date
JP6068375B2 (en) 2017-01-25
JP2015166682A (en) 2015-09-24


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15758100

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15758100

Country of ref document: EP

Kind code of ref document: A1