WO2018143340A1 - Spectroscopic camera, image capturing method, program, and recording medium - Google Patents

Spectroscopic camera, image capturing method, program, and recording medium

Info

Publication number
WO2018143340A1
Authority
WO
WIPO (PCT)
Prior art keywords
wavelength
image
light
spectroscopic camera
uniformity
Prior art date
Application number
PCT/JP2018/003400
Other languages
French (fr)
Japanese (ja)
Inventor
佐藤 充
加園 修
Original Assignee
Pioneer Corporation (パイオニア株式会社)
Priority date
Filing date
Publication date
Application filed by Pioneer Corporation
Priority to JP2018565643A (JPWO2018143340A1)
Publication of WO2018143340A1

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01J MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J 3/00 Spectrometry; Spectrophotometry; Monochromators; Measuring colours
    • G01J 3/02 Details
    • G01J 3/12 Generating the spectrum; Monochromators
    • G01J 3/26 Generating the spectrum; Monochromators using multiple reflection, e.g. Fabry-Perot interferometer, variable interference filters
    • G01J 3/46 Measurement of colour; Colour measuring devices, e.g. colorimeters
    • G01J 3/50 Measurement of colour using electric radiation detectors
    • G01J 3/51 Measurement of colour using electric radiation detectors using colour filters

Definitions

  • the present invention relates to a spectroscopic camera, an imaging method, a program, and a recording medium.
  • a plenoptic camera is used as a spectroscopic camera that can simultaneously acquire a plurality of spectroscopic information and generate an image based on the acquired spectroscopic information.
  • for a plenoptic camera, a measuring apparatus has been proposed in which a plurality of bandpass filters having different spectral transmittances are arranged near the stop position of the main lens in order to measure a spectrum (for example, Patent Document 1).
  • the above-described conventional technique has a problem that a continuous spectrum cannot be acquired because a multispectral image is acquired by a finite number of fixed bandpass filters.
  • one example of the problem to be solved by the present invention is the deterioration in wavelength selectivity caused by manufacturing variation and the like in a spectroscopic camera that selects the transmission wavelength by changing the gap between a pair of reflecting surfaces.
  • the invention according to claim 1 comprises: an optical system composed of a pair of lenses; a wavelength selection unit that has a pair of reflecting surfaces arranged facing each other at the position of the pupil of the optical system, changes the gap between the pair of reflecting surfaces according to an applied voltage, and selectively transmits light of a wavelength corresponding to the gap; a lens array having n lenses (n being a natural number), each of which condenses light that has passed through the optical system and the wavelength selection unit; and an imaging element that is provided at a position conjugate with the wavelength selection unit and has n divided imaging regions for receiving the light condensed by the n lenses.
  • the invention according to claim 10 is an imaging method using the spectroscopic camera according to claim 1, comprising the steps of: making light of a predetermined wavelength enter the optical system; causing the imaging element to receive the light of the predetermined wavelength while changing the applied voltage; obtaining information on the non-uniformity of the transmission wavelength of the wavelength selection unit based on the luminance distribution of each pixel in the n divided imaging regions of the imaging element; imaging a subject to obtain a captured image; and performing image correction on the captured image based on the information on the non-uniformity of the transmission wavelength.
  • the invention according to claim 11 causes a computer to execute, in the spectroscopic camera according to claim 1, the steps of: making light of a predetermined wavelength enter the optical system; causing the imaging element to receive the light of the predetermined wavelength while changing the applied voltage; obtaining information on the non-uniformity of the transmission wavelength of the wavelength selection unit based on the luminance distribution of each pixel in the n divided imaging regions of the imaging element; imaging a subject to obtain a captured image; and performing image correction on the captured image based on the information on the non-uniformity of the transmission wavelength.
  • FIG. 1 shows the configuration of the spectroscopic camera 10 of the first embodiment; FIG. 2 schematically shows how light from an object point is received by the image sensor; FIG. 3 schematically shows how light from a point on the pupil EYE of the optical system 11 is received by the image sensor 13; and FIG. 4 shows an example in which a physical aperture is placed at the pupil.
  • FIG. 13 shows the configuration of the spectroscopic camera 20 according to the second embodiment; FIGS. 14 to 16 are sectional views of configurations provided with a plurality of bandpass filters having the same transmission wavelength, different transmission wavelengths, and different half-value widths, respectively; and FIG. 17 schematically shows imaging when dust adheres to the wavelength selection filter.
  • FIG. 1 is a cross-sectional view illustrating the configuration of the spectroscopic camera 10 according to the first embodiment.
  • the spectroscopic camera 10 irradiates the object X (subject) with light and receives light reflected from the object X or transmitted through the object X to perform imaging.
  • the spectroscopic camera 10 includes an optical system 11, a camera 14 including a microlens array 12 and an image sensor 13, and an image processing unit 15.
  • the microlens array 12 is disposed at a position conjugate with the object X.
  • the image sensor 13 is disposed at a position conjugate with the pupil of the optical system 11.
  • the optical system 11 includes a first lens L1 and a second lens L2 made of, for example, a convex lens.
  • a wavelength selection filter FF is disposed at the position of the pupil of the optical system 11 between the first lens L1 and the second lens L2.
  • the optical system 11 has a mechanism (focusing mechanism) that can adjust the distance according to the position of the object X, and is adjusted so as to maintain the conjugate relationship between the object X and the microlens array 12.
  • the first lens L1 is disposed on the side on which light from the object X is incident (hereinafter referred to as the incident side), and the second lens L2 is disposed on the side that emits light toward the microlens array 12 (hereinafter referred to as the exit side).
  • the light from the object X is condensed on the microlens array 12 through the first lens L1, the wavelength selection filter FF, and the second lens L2.
  • the wavelength selection filter FF is composed of, for example, a Fabry-Perot interference filter.
  • the wavelength selection filter FF includes reflection films RF1 and RF2 that are arranged to face each other, and transmits light having a wavelength corresponding to the interval between the reflection films.
  • the wavelength selection filter FF receives an applied voltage V and changes the distance d between the opposing surfaces of the reflection films RF1 and RF2 (hereinafter referred to as the gap d between the reflecting surfaces), so that the wavelength of the transmitted light can be set selectively.
  • ideally, the gap d between the reflecting surfaces of the wavelength selection filter FF is formed with a uniform spacing; in practice, however, non-uniform portions exist due to variations in manufacturing and assembly (hereinafter referred to as the non-uniformity of the gap d between the reflecting surfaces).
  • the spectroscopic camera 10 of this embodiment has a function of correcting a captured image in accordance with the non-uniformity of the gap d between the reflecting surfaces.
  • the microlens array 12 is composed of n (n is a natural number) microlenses arranged vertically and horizontally. Light from different points on the object X is incident on different microlenses. Each of the micro lenses corresponds to “pixels” in a normal camera, and the number of micro lenses corresponds to “number of pixels” in a normal camera. Each microlens of the microlens array 12 receives light that has passed through the optical system 11 and focuses it on the image sensor 13.
  • the image sensor 13 is an image sensor that receives light collected by the microlens array 12 and converts it into electronic information to obtain a captured image.
  • the image sensor 13 is composed of n divided imaging areas corresponding to the microlenses of the microlens array 12. Light incident on each microlens of the microlens array 12 is received by the corresponding divided imaging region of the image sensor 13. Lights that have entered different microlenses are received by different divided imaging regions, and the received divided imaging regions do not overlap.
  • FIG. 2 is a diagram schematically showing a state in which light from one point of object X (referred to as object point P) is received by image sensor 13.
  • the light from the object point P passes through different regions on the first lens L1, the pupil EYE, and the second lens L2, enters the microlens ML2 of the microlens array 12, and is received by the divided imaging region R2.
  • the luminance value of the image of the object point P is represented by the sum of the luminance values of all the pixels in the divided imaging region R2. Therefore, by acquiring the sum of the luminance values of all the pixels in each divided imaging region, the same imaging result (captured image) as that obtained by an ordinary camera in which an image sensor is arranged at the position of the microlens can be obtained.
  • FIG. 3 is a diagram schematically showing how the image sensor 13 receives light from one point on the pupil EYE of the optical system 11.
  • Light from one point on the pupil EYE (that is, light incident on one point of the pupil EYE) passes through different regions of the second lens L2 and different microlenses of the microlens array 12 and is condensed onto the same location within each divided imaging region of the image sensor 13.
  • the image processing unit 15 is composed of, for example, a CPU (Central Processing Unit) and performs image correction processing on the image of the object X acquired by the image sensor 13. Specifically, the image processing unit 15 selects pixel areas according to the luminance distribution in each divided imaging region of the image sensor 13 obtained by the pre-photographing, and constructs the captured image of the object X using only the information of the selected pixel areas.
  • in the pre-photographing, the image processing unit 15 records the signals received by the image sensor 13 in a storage unit (not shown) such as a memory while sweeping the voltage applied to the wavelength selection filter FF, and obtains, based on the plurality of recorded signals, a luminance distribution indicating the non-uniformity of the transmission wavelength of the wavelength selection filter FF (that is, the non-uniformity of the gap d between the reflecting surfaces). In other words, the image processing unit 15 functions as a detection unit that detects the non-uniformity of the transmission wavelength of the wavelength selection filter FF.
  • the luminance distribution in a divided imaging region of the image sensor 13 represents the luminance distribution that the light incident on the corresponding microlens of the microlens array 12 had when it passed through the pupil. Therefore, if the luminance value of light incident on a certain microlens is computed as the sum over a specific pixel area of the corresponding divided imaging region, the intensity of the light that passed through the corresponding specific area of the pupil is obtained. Furthermore, if the luminance values of all the microlenses are computed as the sum over the same pixel area of their respective divided imaging regions, an image formed by light that passed through that specific area of the pupil is obtained. In this way, for example, an image similar to the case where a physical aperture of a predetermined shape is placed at the pupil of the optical system 11 can be obtained.
  • FIG. 4 is a diagram showing an example in which a physical aperture AP having a heart shape is arranged at the pupil position of the optical system.
  • in each divided imaging region of the image sensor 13, an image of the heart shape (that is, the shape of the aperture AP) inverted top to bottom is obtained.
  • in the spectroscopic camera 10 of this embodiment, by contrast, computing the luminance values using only the selected pixel areas yields an image similar to the case where a physical aperture (in the figure, a heart-shaped aperture) is placed at the pupil of the optical system 11 (in the figure, an image with the heart shape inverted top to bottom), without arranging any physical aperture.
  • the Fabry-Perot interference filter constituting the wavelength selection filter FF is an interference filter composed of a pair of parallel reflection films.
  • changing the gap between its reflecting surfaces changes the wavelength of the transmitted light.
  • the transmittance T(λ) of the Fabry-Perot filter is expressed by the following equation (1).
  • the full width at half maximum (FWHM) is expressed by the following equation (2). In the limit R → 1, FWHM → 0.
  • FIGS. 6(a) and 6(b) compare, with respect to the intensity and half-value width of the transmitted light, the ideal case where the gap between the reflecting surfaces is almost uniform with the case where the gap is non-uniform near the edge.
  • to improve this broadening of the half-value width, an aperture stop is generally arranged immediately before or after the filter. Placing an aperture stop immediately before the filter transmits light only through the region around the optical axis where the gap is uniform, which improves the half-value width; however, although this method improves the half-value width, it reduces the amount of transmitted light.
  • when the gap between the reflecting surfaces has a non-uniformity that is asymmetric with respect to the optical axis, an aperture stop symmetric about the optical axis, as shown in FIG. 7(b), reduces the amount of transmitted light even further, so an aperture stop matched to the gap non-uniformity, as shown in FIG. 7(c), would be required. In practice, however, which parts are non-uniform varies from filter to filter, making it difficult to arrange an appropriate aperture stop.
  • in the spectroscopic camera 10 of this embodiment, by contrast, an image can be constructed by selectively using only specific pixel areas in each divided imaging region of the image sensor 13. It is therefore possible to obtain an image using only light that passed through regions where the gap between the reflection films RF1 and RF2 of the wavelength selection filter FF is uniform, without providing a physical aperture stop.
  • the screen SC is irradiated with light of a single wavelength λ1 emitted from the laser LR via the lens L3 (step S101).
  • the spectroscopic camera 10 takes an image of the screen SC while sweeping the voltage V applied to the wavelength selection filter FF.
  • the image processing unit 15 records the signals received by the image sensor 13, which is the imaging element (step S102).
  • changing the voltage V applied to the wavelength selection filter FF changes the gap between the reflection films RF1 and RF2. The image in each divided imaging region of the image sensor 13 becomes brightest when the gap passes the most light of wavelength λ1. At that point, the high-luminance regions in the image VD obtained in a divided imaging region correspond to regions of the pupil that pass light of wavelength λ1, that is, regions where the gap d between the reflecting surfaces (the spacing between the reflection films RF1 and RF2) is uniform.
  • based on the plurality of recorded signals, the image processing unit 15 acquires the luminance distribution of the divided imaging regions, which reflects the non-uniformity of the gap d between the reflecting surfaces of the wavelength selection filter (the non-uniformity of the transmission wavelength) (step S103).
  • when performing the pre-photographing for pixel-area selection, instead of using a screen SC illuminated with single-wavelength laser light, a screen SC illuminated with sunlight, halogen light, or the like may be photographed through a bandpass filter that passes light of wavelength λ1. For example, during the pre-photographing a bandpass filter BPF that transmits light of wavelength λ1 is placed on the optical path in front of the optical system 11 (that is, in front of the first lens L1) as seen from the incident side, the illumination light WL is applied, and imaging is performed; the bandpass filter BPF is removed for the actual imaging. Alternatively, the pre-photographing may be performed with the bandpass filter BPF placed on the optical path immediately in front of the wavelength selection filter; that is, the bandpass filter BPF only needs to be disposed somewhere on the optical path.
  • the spectroscopic camera 10 is set on the subject (object X) (step S201).
  • the spectroscopic camera 10 captures the object X while sweeping the voltage V applied to the wavelength selection filter FF (step S202).
  • the subject is photographed using illumination light such as sunlight or white light instead of light having a single wavelength.
  • the image processing unit 15 records the signals received by the image sensor 13, which is the imaging element.
  • based on the information obtained by the pre-photographing that indicates the non-uniformity of the gap d between the reflecting surfaces of the wavelength selection filter (that is, the luminance-distribution information of the divided imaging regions), the image processing unit 15 calculates, from the actual shooting, first image information obtained from light that passed through regions where the gap d is uniform and second image information obtained from light that passed through the non-uniform regions (step S203). The image processing unit 15 then obtains the final captured image based on the first image information and the second image information (step S204).
  • image information indicating the luminance distribution corresponding to the gap d between the reflecting surfaces of the wavelength selection filter FF is acquired in the pre-photographing, and by selectively using only the pixel areas that showed high luminance values in the pre-photographing, a captured image composed only of light that passed through regions where the gap d is uniform can be obtained.
  • FIG. 13 is a cross-sectional view illustrating the configuration of the spectroscopic camera 20 according to the second embodiment.
  • the spectroscopic camera 20 of this embodiment differs from the spectroscopic camera 10 of the first embodiment in that a bandpass filter BPF is fixedly provided on the exit-side surface of one of the microlenses constituting the microlens array 12.
  • a bandpass filter BPF with transmission wavelength λ1 (passband center wavelength λ1) is provided on the exit-side surface (that is, the surface facing the image sensor 13) of the microlens ML1 located at the end of the microlens array 12.
  • the micro lens ML1 functions as a pixel selecting micro lens, and the other micro lenses function as imaging micro lenses.
  • the image sensor 13 has a divided imaging region R1 corresponding to the microlens ML1 of the microlens array 12.
  • the divided imaging region R1 is provided at a position corresponding to the microlens ML1 (for example, an end portion of the image sensor).
  • the divided imaging area R1 functions as a divided imaging area for pixel selection, and the other divided imaging areas function as divided imaging areas for imaging.
  • the light that has passed through the area other than the microlens ML1 of the microlens array 12 is received by the other divided imaging areas of the image sensor 13 without passing through the bandpass filter BPF. Therefore, in a divided imaging region other than the divided imaging region R1 of the image sensor 13, a normal captured image obtained by photographing the object X is obtained.
  • the image processing unit 15 performs image correction on the captured image obtained in the divided imaging regions other than R1, based on the image information of the pixel-selection image VD obtained in the divided imaging region R1. Specifically, the image processing unit 15 selects pixel areas according to the luminance distribution shown in the pixel-selection image VD and constructs the captured image of the object X using only the selected pixel areas.
  • unlike the first embodiment, which requires two shooting steps (pre-shooting and actual shooting), the spectroscopic camera 20 of this embodiment can obtain a captured image that uses only high-luminance pixel areas with a single shooting.
  • a configuration in which a plurality of band pass filters BPF are provided may be employed.
  • for example, two bandpass filters BPF, each with transmission wavelength (passband center wavelength) λ1, are provided on the exit-side surfaces of microlenses of the microlens array 12, and the two corresponding divided imaging regions (divided imaging regions R1 and Rx) are used as divided imaging regions for pixel selection. Since a plurality of pixel-selection images VD are then obtained, pixel selection can be performed with high accuracy.
  • a plurality of bandpass filters BPF1 (transmission wavelength ⁇ 1 ) and BPF2 (transmission wavelength ⁇ 2 ) having different transmission wavelengths may be provided.
  • the pixel selection image in the divided imaging region R1 and the pixel selection image in the divided imaging region Rx are acquired when different voltages V are applied to the wavelength selection filter FF, respectively.
  • since pixels can be selected based on a plurality of pixel-selection images, the pixel selection can be performed with high accuracy.
  • alternatively, a bandpass filter BPFA whose transmission center wavelength is λ1 and whose half-value width is W1, and a bandpass filter BPFB whose transmission center wavelength is λ1 and whose half-value width is W2 (with W2 > W1), may be provided on the exit-side surfaces of the microlenses at both ends of the microlens array 12. The bandpass filter BPFA with the small half-value width passes less light, while the bandpass filter BPFB with the large half-value width passes more light, so pixel selection can be performed while switching between half-value-width-priority selection and transmitted-light-amount-priority selection.
  • the embodiment of the present invention is not limited to those shown in the first and second embodiments.
  • in the embodiments above, a configuration was described in which image processing is performed by selecting high-luminance pixel areas in the divided imaging regions of the image sensor 13 while a certain voltage V is applied to the wavelength selection filter FF. That is, when an image such as that shown in FIG. 9 is obtained in a divided imaging region under incidence of certain monochromatic light, only the pixels in the high-luminance area (the white area in the figure) were treated as effective pixels, and the information of the pixels in the other areas (the gray and black areas in the figure) was discarded. However, the image information may instead be obtained by adding together information acquired at a plurality of voltages. Since the gray area at applied voltage V1 carries information of another wavelength λ2 and the black area carries information of yet another wavelength λ3, information on a plurality of wavelengths can be obtained for each applied voltage. By storing this information in a memory (not shown) or the like while sweeping the applied voltage V and finally computing the luminance value of each microlens by adding together the information of the same wavelength, a similar result can be obtained (see the sketch after this list).
  • in the spectroscopic cameras described above, since the wavelength selection filter FF is disposed at the pupil of the optical system 11, light passing through a given coordinate on the pupil is incident uniformly on all the microlenses of the microlens array 12. Therefore, even when there is a non-uniform portion (unevenness) in the gap d between the reflecting surfaces of the wavelength selection filter FF, the light transmitted through the filter enters each microlens of the microlens array 12 in the same manner, so the image obtained by the image sensor 13 shows no unevenness. Likewise, when there is dirt or a defect on the filter, the luminance is reduced in proportion to the area of the dirt or defect, but no pixel defect appears in the image obtained by the image sensor 13.
  • furthermore, with the spectroscopic camera of the present invention, even when dust or the like adheres to the wavelength selection filter, the subject can be photographed while avoiding it.
  • when dust GB adheres to the wavelength selection filter FF and a screen SC illuminated with illumination light WL such as white light is photographed, the dust GB appears in every divided imaging region of the image sensor 13. By selecting pixel areas that avoid the areas where the dust GB appears, shooting can be performed while avoiding the dust GB. Besides dust, this also copes with foreign matter whose location cannot be predicted, such as dirt and scratches.
  • in the embodiments above, the spectroscopic cameras 10 and 20 were described as having a microlens array 12 in which n microlenses are arranged vertically and horizontally; however, the type and arrangement of the lenses are not limited to this, and the spectroscopic camera only needs to have a lens array with n lenses. Likewise, the position of the bandpass filter is not limited to the examples above; it only needs to be arranged on the optical path of the light condensed onto the microlenses.
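As referenced in the list above, here is a rough Python sketch of accumulating same-wavelength information over the voltage sweep. The mapping from applied voltage and pixel position to transmitted wavelength is assumed to come from a calibration such as the pre-photographing step; the array layout and all names are illustrative assumptions, not part of the patent.

```python
import numpy as np
from collections import defaultdict

def accumulate_by_wavelength(frames, voltages, wavelength_of):
    """Accumulate per-microlens luminance separately for each transmitted wavelength.

    frames        : list of sensor frames, one per applied voltage, each shaped
                    [ml_row, ml_col, sub_row, sub_col] (one divided imaging
                    region per microlens)
    voltages      : applied voltage for each frame
    wavelength_of : callable (voltage, sub_row, sub_col) -> wavelength bin,
                    assumed known from calibration
    Returns a dict mapping wavelength bin -> per-microlens image, so that the
    gray/black areas contribute to the images of their own wavelengths instead
    of being discarded.
    """
    images = defaultdict(float)
    for frame, v in zip(frames, voltages):
        n_sub_r, n_sub_c = frame.shape[2], frame.shape[3]
        for r in range(n_sub_r):
            for c in range(n_sub_c):
                lam = wavelength_of(v, r, c)
                images[lam] = images[lam] + frame[:, :, r, c]
    return dict(images)
```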

Landscapes

  • Physics & Mathematics (AREA)
  • Spectroscopy & Molecular Physics (AREA)
  • General Physics & Mathematics (AREA)
  • Spectrometry And Color Measurement (AREA)
  • Optical Filters (AREA)
  • Blocking Light For Cameras (AREA)
  • Microscopes, Condenser (AREA)
  • Studio Devices (AREA)

Abstract

This spectroscopic camera includes: an optical system comprising a pair of lenses; a wavelength selecting unit which includes a pair of reflecting surfaces disposed facing one another at the position of a pupil of the optical system, which causes a gap between the pair of reflecting surfaces to vary in accordance with an applied voltage, and which selectively allows light having a wavelength corresponding to the gap to pass; a lens array having n (where n is a natural number) lenses each of which condenses the light that has passed through the optical system and the wavelength selecting unit; and an image capturing element which is provided in a position that is conjugate to the wavelength selecting unit and has n divided image capturing regions which receive the light condensed by the n lenses.

Description

Spectroscopic camera, imaging method, program, and recording medium
The present invention relates to a spectroscopic camera, an imaging method, a program, and a recording medium.
A plenoptic camera is used as a spectroscopic camera that can simultaneously acquire a plurality of pieces of spectral information and generate an image based on the acquired spectral information. For a plenoptic camera, a measuring apparatus has been proposed in which a plurality of bandpass filters with different spectral transmittances are arranged near the stop position of the main lens in order to measure a spectrum (for example, Patent Document 1).
JP 2015-132594 A
In the conventional technique described above, a multispectral image is acquired with a finite number of fixed bandpass filters, so there is the problem that a continuous spectrum cannot be acquired.
A technique that continuously changes the wavelength of light transmitted through a pair of reflecting surfaces by continuously changing the gap between them has also long been known. This technique has the problem that the half-value width of the transmission wavelength deteriorates when the parallelism or surface accuracy of the reflecting surfaces is poor.
One example of the problem to be solved by the present invention is this deterioration in wavelength selectivity, caused by manufacturing variation and the like, in a spectroscopic camera that selects the transmission wavelength by changing the gap between a pair of reflecting surfaces.
The invention according to claim 1 is characterized by comprising: an optical system composed of a pair of lenses; a wavelength selection unit that has a pair of reflecting surfaces arranged facing each other at the position of the pupil of the optical system, changes the gap between the pair of reflecting surfaces according to an applied voltage, and selectively transmits light of a wavelength corresponding to the gap; a lens array having n lenses (n being a natural number), each of which condenses light that has passed through the optical system and the wavelength selection unit; and an imaging element that is provided at a position conjugate with the wavelength selection unit and has n divided imaging regions for receiving the light condensed by the n lenses.
The invention according to claim 10 is an imaging method using the spectroscopic camera according to claim 1, characterized by comprising the steps of: making light of a predetermined wavelength enter the optical system; causing the imaging element to receive the light of the predetermined wavelength while changing the applied voltage; obtaining information on the non-uniformity of the transmission wavelength of the wavelength selection unit based on the luminance distribution of each pixel in the n divided imaging regions of the imaging element; imaging a subject to obtain a captured image; and performing image correction on the captured image based on the information on the non-uniformity of the transmission wavelength.
The invention according to claim 11 is characterized in that, in the spectroscopic camera according to claim 1, a computer is caused to execute the steps of: making light of a predetermined wavelength enter the optical system; causing the imaging element to receive the light of the predetermined wavelength while changing the applied voltage; obtaining information on the non-uniformity of the transmission wavelength of the wavelength selection unit based on the luminance distribution of each pixel in the n divided imaging regions of the imaging element; imaging a subject to obtain a captured image; and performing image correction on the captured image based on the information on the non-uniformity of the transmission wavelength.
FIG. 1 is a diagram showing the configuration of the spectroscopic camera 10 of the first embodiment.
FIG. 2 is a diagram schematically showing how light from an object point is received by the image sensor 13.
FIG. 3 is a diagram schematically showing how light from a point on the pupil EYE of the optical system 11 is received by the image sensor 13.
FIG. 4 is a diagram showing an example in which a physical aperture is placed at the pupil.
FIG. 5 is a diagram schematically showing that selecting pixel areas yields an image similar to placing a physical aperture at the pupil.
FIG. 6 is a diagram comparing, with respect to the intensity and half-value width of the transmitted light, the case where the gap between the reflecting surfaces is almost uniform with the case where the gap is non-uniform near the edge.
FIG. 7 is a diagram showing an example in which an aperture stop is placed immediately before the filter.
FIG. 8 is a flowchart showing the pre-photographing operation.
FIG. 9 is a diagram schematically showing an example of irradiating single-wavelength light in the pre-photographing.
FIG. 10 is a diagram schematically showing an example in which a bandpass filter is attached in front of the optical system 11 and illumination light is applied in the pre-photographing.
FIG. 11 is a diagram schematically showing an example in which a bandpass filter is attached in front of the wavelength selection filter FF and illumination light is applied in the pre-photographing.
FIG. 12 is a flowchart showing the actual shooting operation.
FIG. 13 is a diagram showing the configuration of the spectroscopic camera 20 of the second embodiment.
FIG. 14 is a sectional view showing an example of a configuration provided with a plurality of bandpass filters having the same transmission wavelength.
FIG. 15 is a sectional view showing an example of a configuration provided with a plurality of bandpass filters having different transmission wavelengths.
FIG. 16 is a sectional view showing an example of a configuration provided with a plurality of bandpass filters having different half-value widths.
FIG. 17 is a diagram schematically showing imaging when dust adheres to the wavelength selection filter.
Embodiments of the present invention will now be described with reference to the drawings. In the description of the following embodiments and in the accompanying drawings, substantially identical or equivalent parts are denoted by the same reference numerals.
FIG. 1 is a cross-sectional view showing the configuration of the spectroscopic camera 10 according to the first embodiment. The spectroscopic camera 10 performs imaging by, for example, irradiating an object X (the subject) with light and receiving light reflected from or transmitted through the object X. The spectroscopic camera 10 includes an optical system 11, a camera 14 composed of a microlens array 12 and an image sensor 13, and an image processing unit 15. The microlens array 12 is disposed at a position conjugate with the object X, and the image sensor 13 is disposed at a position conjugate with the pupil of the optical system 11.
The optical system 11 includes a first lens L1 and a second lens L2, each of which is, for example, a convex lens. A wavelength selection filter FF is disposed at the position of the pupil of the optical system 11, between the first lens L1 and the second lens L2. The optical system 11 has a mechanism (a focusing mechanism) that can adjust distances according to the position of the object X and is adjusted so as to maintain the conjugate relationship between the object X and the microlens array 12.
The first lens L1 is disposed on the side where light from the object X enters (hereinafter, the incident side), and the second lens L2 is disposed on the side that emits light toward the microlens array 12 (hereinafter, the exit side). Light from the object X passes through the first lens L1, the wavelength selection filter FF, and the second lens L2 and is condensed onto the microlens array 12.
The wavelength selection filter FF is composed of, for example, a Fabry-Perot interference filter. It consists of reflection films RF1 and RF2 arranged facing each other and transmits light of a wavelength corresponding to the gap between the reflection films. By applying a voltage V and thereby changing the distance d between the opposing surfaces of the reflection films RF1 and RF2 (hereinafter, the gap d between the reflecting surfaces), the wavelength of the transmitted light can be set selectively.
Ideally, the gap d between the reflecting surfaces of the wavelength selection filter FF would be formed with a uniform spacing; in practice, however, non-uniform portions exist due to variations in manufacturing and assembly (hereinafter, the non-uniformity of the gap d between the reflecting surfaces). The spectroscopic camera 10 of this embodiment has a function of correcting the captured image according to this non-uniformity of the gap d between the reflecting surfaces.
The microlens array 12 is composed of n microlenses (n being a natural number) arranged vertically and horizontally. Light from different points on the object X enters different microlenses. Each microlens corresponds to a "pixel" of an ordinary camera, and the number of microlenses corresponds to the "number of pixels" of an ordinary camera. Each microlens of the microlens array 12 receives light that has passed through the optical system 11 and condenses it onto the image sensor 13.
The image sensor 13 is an imaging element that receives the light condensed by the microlens array 12 and converts it into electronic information to obtain a captured image. The image sensor 13 is composed of n divided imaging regions, one corresponding to each microlens of the microlens array 12. Light entering a given microlens is received by the corresponding divided imaging region of the image sensor 13; light entering different microlenses is received by different divided imaging regions, and these regions do not overlap.
FIG. 2 schematically shows how light from one point of the object X (the object point P) is received by the image sensor 13. Light from the object point P passes through different regions of the first lens L1, the pupil EYE, and the second lens L2, enters the microlens ML2 of the microlens array 12, and is received by the divided imaging region R2. The luminance value of the image of the object point P is represented by the sum of the luminance values of all the pixels in the divided imaging region R2. Therefore, by taking the sum of the luminance values of all the pixels in each divided imaging region, the same imaging result (captured image) as that of an ordinary camera with an image sensor placed at the position of the microlens array is obtained.
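As an illustration of this summation, the following Python sketch (with an assumed array layout and names; not code from the patent) reconstructs the conventional captured image from the plenoptic sensor data:

```python
import numpy as np

def reconstruct_image(sensor):
    """Rebuild the conventional captured image from plenoptic sensor data.

    sensor is assumed to be a 4-D array indexed as
    [microlens_row, microlens_col, sub_row, sub_col], i.e. one divided
    imaging region per microlens.  Summing all pixels of each divided
    imaging region gives the luminance of the corresponding "pixel" of an
    ordinary camera whose sensor sits at the microlens array position.
    """
    return sensor.sum(axis=(2, 3))

# Example with 100x100 microlenses, each mapped to a 16x16 divided region.
sensor = np.random.rand(100, 100, 16, 16)
image = reconstruct_image(sensor)   # shape (100, 100)
```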
FIG. 3 schematically shows how light from one point on the pupil EYE of the optical system 11 is received by the image sensor 13. Light from one point on the pupil EYE (that is, light incident on one point of the pupil EYE) passes through different regions of the second lens L2 and through different microlenses of the microlens array 12 and is condensed onto the same location within each divided imaging region of the image sensor 13.
Referring again to FIG. 1, the image processing unit 15 is composed of, for example, a CPU (Central Processing Unit) or the like and performs image correction processing on the image of the object X acquired by the image sensor 13. Specifically, the image processing unit 15 selects pixel areas according to the luminance distribution in each divided imaging region of the image sensor 13 obtained in a pre-photographing step and constructs the captured image of the object X using only the information of the selected pixel areas. In the pre-photographing, the image processing unit 15 records the signals received by the image sensor 13 in a storage unit (not shown) such as a memory while sweeping the voltage applied to the wavelength selection filter FF, and from the plurality of recorded signals obtains a luminance distribution indicating the non-uniformity of the transmission wavelength of the wavelength selection filter FF (that is, the non-uniformity of the gap d between the reflecting surfaces). In other words, the image processing unit 15 functions as a detection unit that detects the non-uniformity of the transmission wavelength of the wavelength selection filter FF.
The luminance distribution within a divided imaging region of the image sensor 13 represents the luminance distribution that the light incident on the corresponding microlens of the microlens array 12 had when it passed through the pupil. Therefore, if the luminance value of the light incident on a certain microlens is computed as the sum over a specific pixel area of the corresponding divided imaging region, the intensity of the light that passed through the corresponding specific area of the pupil is obtained. Furthermore, if the luminance values of all the microlenses are computed as the sum over the same pixel area of their respective divided imaging regions, an image formed by light that passed through that specific area of the pupil is obtained. In this way, for example, an image similar to the case where a physical aperture of a predetermined shape is placed at the pupil of the optical system 11 can be obtained.
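Extending the previous sketch, summing only a chosen pixel area of every divided imaging region acts like a synthetic aperture at the pupil. The array layout and the circular mask below are illustrative assumptions, not taken from the patent:

```python
import numpy as np

def reconstruct_with_pupil_mask(sensor, pupil_mask):
    """Sum only the selected pixel area of every divided imaging region.

    pupil_mask is a boolean array with the shape of one divided imaging
    region; True marks the pixel area (i.e. the pupil region) whose light
    is kept.  Applying the same mask to every microlens is equivalent to
    placing an aperture of that shape at the pupil of the optical system.
    """
    return (sensor * pupil_mask).sum(axis=(2, 3))

# Example: keep only a centred circular pupil region.
sub = 16
yy, xx = np.mgrid[0:sub, 0:sub]
mask = (yy - sub / 2) ** 2 + (xx - sub / 2) ** 2 < (sub / 4) ** 2
sensor = np.random.rand(100, 100, sub, sub)
image = reconstruct_with_pupil_mask(sensor, mask)   # shape (100, 100)
```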
FIG. 4 shows an example in which a physical aperture AP with a heart shape is placed at the pupil position of the optical system. In each divided imaging region of the image sensor 13, an image of the heart shape (that is, the shape of the aperture AP) inverted top to bottom is obtained.
In the spectroscopic camera 10 of this embodiment, by contrast, computing the luminance values using only the selected pixel areas yields, as shown in FIG. 5, an image similar to the case where a physical aperture (in the figure, a heart-shaped aperture) is placed at the pupil of the optical system 11 (in the figure, an image with the heart shape inverted top to bottom), without arranging any physical aperture.
Incidentally, the Fabry-Perot interference filter constituting the wavelength selection filter FF is an interference filter composed of a pair of parallel reflection films, and changing the gap between its reflecting surfaces changes the wavelength of the transmitted light. Denoting the reflectance of the reflecting surfaces by R, the gap between them by h0, and the wavelength of the incident light by λ, the transmittance T(λ) of the Fabry-Perot filter is expressed by the following equation (1):

  T(λ) = 1 / {1 + [4R / (1 - R)²] sin²(2πh0 / λ)}   (1)

A reflecting surface with R = 1 would transmit only λ = 2h0 and block all other wavelengths, i.e. it would be a perfect monochromatic filter. In practice, however, R < 1, so a range of wavelengths centered on λ = 2h0 is transmitted. Its full width at half maximum (FWHM) is expressed by the following equation (2):

  FWHM = 2h0 (1 - R) / (π√R)   (2)

In the limit R → 1, FWHM → 0. However, even if the reflectance R were 1, the gap between the reflecting surfaces of a Fabry-Perot filter as actually manufactured is not uniform, so the transmitted light is not perfectly monochromatic.
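These relations can be checked numerically. The sketch below is not from the patent; it assumes the standard Airy-function form and first-order operation at λ = 2h0, as written above:

```python
import numpy as np

def fp_transmittance(lam, h0, R):
    """Standard Fabry-Perot (Airy) transmittance for gap h0 and reflectance R."""
    return 1.0 / (1.0 + (4.0 * R / (1.0 - R) ** 2)
                  * np.sin(2.0 * np.pi * h0 / lam) ** 2)

def fp_fwhm(lam_peak, R):
    """Full width at half maximum of the transmission peak at lam_peak."""
    return lam_peak * (1.0 - R) / (np.pi * np.sqrt(R))

h0 = 275e-9                         # 275 nm gap -> first-order peak at 550 nm
lam = np.linspace(400e-9, 700e-9, 2001)
T = fp_transmittance(lam, h0, 0.95)  # T = 1 exactly at lam = 2 * h0
print(fp_fwhm(2 * h0, 0.95))         # about 9 nm; shrinks toward 0 as R -> 1
```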
FIGS. 6(a) and 6(b) compare, with respect to the intensity and half-value width of the transmitted light, the ideal case where the gap between the reflecting surfaces is almost uniform with the case where the gap is non-uniform near the edge.
As shown in FIG. 6(a), when the gap has a uniform value (d = h0) over the entire reflecting surface of the filter, the wavelength of the transmitted light is 2h0 at every point on the reflecting surface and the half-value width is nearly zero.
By contrast, as shown in FIG. 6(b), when the gap is non-uniform, light of wavelengths other than λ = 2h0 is transmitted in the regions where d ≠ h0, and the half-value width broadens.
To improve this broadening of the half-value width, it is generally conceivable to place an aperture stop immediately before or after the filter. For example, as shown in FIG. 7(a), placing an aperture stop immediately before the filter transmits light only through the region around the optical axis where the gap is uniform, improving the half-value width. Although this method improves the half-value width, however, it reduces the amount of transmitted light.
Moreover, when the gap between the reflecting surfaces has a non-uniformity that is asymmetric with respect to the optical axis, an aperture stop symmetric about the optical axis, as in FIG. 7(b), reduces the amount of transmitted light even further, so an aperture stop matched to the gap non-uniformity, as in FIG. 7(c), would be required. In practice, however, which parts are non-uniform varies from filter to filter, making it difficult to arrange an appropriate aperture stop.
In the spectroscopic camera 10 of this embodiment, by contrast, an image can be constructed by selectively using only specific pixel areas in each divided imaging region of the image sensor 13. It is therefore possible to obtain an image using only light that passed through regions where the gap between the reflection films RF1 and RF2 of the wavelength selection filter FF is uniform, without providing a physical aperture stop.
Next, imaging with the spectroscopic camera 10 of this embodiment and the method of selecting pixel areas will be described. First, the pre-photographing performed before shooting the subject (the actual shooting) is described with reference to the flowchart of FIG. 8 and to FIG. 9.
In the pre-photographing, as shown in FIG. 9, the screen SC is irradiated with light of a single wavelength λ1 emitted from the laser LR via the lens L3 (step S101).
The spectroscopic camera 10 photographs the screen SC while sweeping the voltage V applied to the wavelength selection filter FF, and the image processing unit 15 records the signals received by the image sensor 13, which is the imaging element (step S102).
Changing the voltage V applied to the wavelength selection filter FF changes the gap between the reflection films RF1 and RF2. The image in each divided imaging region of the image sensor 13 becomes brightest when the gap passes the most light of wavelength λ1. At that point, the high-luminance regions in the image VD obtained in a divided imaging region correspond to regions of the pupil that pass light of wavelength λ1, that is, regions where the gap d between the reflecting surfaces (the spacing between the reflection films RF1 and RF2) is uniform.
Based on the plurality of recorded signals, the image processing unit 15 acquires the luminance distribution of the divided imaging regions, which reflects the non-uniformity of the gap d between the reflecting surfaces of the wavelength selection filter (the non-uniformity of the transmission wavelength) (step S103).
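The pre-photographing loop (steps S101 to S103) can be sketched as follows in Python. The camera-control helpers `set_filter_voltage` and `capture`, the array layout, and the brightness threshold are assumptions made for illustration and are not part of the patent:

```python
import numpy as np

def preshoot_pixel_mask(voltages, capture, set_filter_voltage, threshold=0.8):
    """Derive a pixel-selection mask from a voltage sweep under monochromatic light.

    For each applied voltage a sensor frame (shape [ml_row, ml_col, sub_row,
    sub_col]) is recorded.  The per-pixel maximum over the sweep reflects how
    strongly the corresponding pupil region passes the calibration wavelength;
    pixels whose peak brightness is close to the overall maximum correspond to
    regions where the gap d between the reflecting surfaces is uniform.
    """
    frames = []
    for v in voltages:
        set_filter_voltage(v)        # changes the gap d, hence the transmitted wavelength
        frames.append(capture())     # step S102: record the sensor signal
    stack = np.stack(frames)         # [voltage, ml_row, ml_col, sub_row, sub_col]
    brightness = stack.max(axis=0)   # step S103: luminance distribution over the sweep
    # Average over all microlenses to get one pupil-plane brightness map,
    # then keep the pixel area whose brightness is near the maximum.
    pupil_map = brightness.mean(axis=(0, 1))
    return pupil_map > threshold * pupil_map.max()
```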
When performing the pre-photographing for pixel-area selection, instead of using a screen SC illuminated with single-wavelength laser light, a screen SC illuminated with sunlight, halogen light, or the like may be photographed through a bandpass filter that passes light of wavelength λ1.
For example, as shown in FIG. 10, during the pre-photographing a bandpass filter BPF that transmits light of wavelength λ1 is placed on the optical path in front of the optical system 11 (that is, in front of the first lens L1) as seen from the incident side, the illumination light WL is applied, and imaging is performed; the bandpass filter BPF is removed for the actual imaging. In this way, information on the pixel areas showing the luminance distribution corresponding to the gap between the reflecting surfaces of the wavelength selection filter FF can be acquired, and an image can be constructed using only the high-luminance pixel areas. Alternatively, as shown in FIG. 11, the pre-photographing may be performed with the bandpass filter BPF placed on the optical path immediately in front of the wavelength selection filter; that is, the bandpass filter BPF only needs to be disposed somewhere on the optical path.
Next, the actual shooting, in which the subject is photographed, will be described with reference to the flowchart of FIG. 12.
First, the spectroscopic camera 10 is set up facing the subject (the object X) (step S201). The spectroscopic camera 10 photographs the object X while sweeping the voltage V applied to the wavelength selection filter FF (step S202). Unlike the pre-photographing, the subject is photographed using illumination light such as sunlight or white light rather than single-wavelength light. The image processing unit 15 then records the signals received by the image sensor 13, which is the imaging element.
 Based on the information indicating the non-uniformity of the gap d between the reflecting surfaces of the wavelength selection filter obtained in the preliminary shooting (that is, the luminance-distribution information within the divided imaging regions), the image processing unit 15 calculates, for the actual shooting, first image information obtained by receiving light that passed through areas where the gap d between the reflecting surfaces is uniform, and second image information obtained by receiving light that passed through the non-uniform areas (step S203). The image processing unit 15 then obtains the final captured image based on the first image information and the second image information (step S204).
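 Continuing the same illustrative sketch, one possible reading of steps S203 and S204 is given below; the application only states that the final image is obtained from the first and second image information, so the specific weighting of the non-uniform-gap pixels used here is an assumption:

import numpy as np

def compose_region_value(sub_image, mask_uniform, nonuniform_weight=0.0):
    """Produce one corrected value for one divided imaging region of the actual shot.

    sub_image        : 2-D array, the region captured in the actual shooting.
    mask_uniform     : boolean mask from the pre-shoot (see gap_uniformity_map).
    nonuniform_weight: 0.0 keeps only light that passed the uniform part of the
                       gap d; a small positive value folds the second image
                       information back in (illustrative choice, not from the text).
    """
    sub = np.asarray(sub_image, dtype=float)
    first_info = sub[mask_uniform]          # light through uniform gap d (step S203)
    second_info = sub[~mask_uniform]        # light through non-uniform gap d
    if second_info.size == 0 or nonuniform_weight == 0.0:
        return float(first_info.mean())
    return float((1.0 - nonuniform_weight) * first_info.mean()
                 + nonuniform_weight * second_info.mean())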
 As described above, according to the spectroscopic camera 10 of the present embodiment, image information showing the luminance distribution corresponding to the gap d between the reflecting surfaces of the wavelength selection filter FF is acquired in the preliminary shooting, and only the pixel regions that showed high luminance values in the preliminary shooting are used selectively. A captured image can thus be composed using only light that passed through areas where the gap d between the reflecting surfaces of the wavelength selection filter FF is uniform. Consequently, a highly accurate image can be acquired even when the filter contains areas where the transmission wavelength is non-uniform.
 FIG. 13 is a cross-sectional view showing the configuration of a spectroscopic camera 20 according to a second embodiment. The spectroscopic camera 20 of this embodiment differs from the spectroscopic camera 10 of the first embodiment in that a bandpass filter BPF is fixed to the exit-side surface of one of the microlenses constituting the microlens array 12.
 A bandpass filter BPF with transmission wavelength λ1 (that is, a passband centered at λ1) is provided on the exit-side surface (the surface facing the image sensor 13) of the microlens ML1 located at the end of the microlens array 12. The microlens ML1 functions as a microlens for pixel selection, and the other microlenses function as microlenses for imaging.
 The image sensor 13 has a divided imaging region R1 corresponding to the microlens ML1 of the microlens array 12. The divided imaging region R1 is located at the position corresponding to the microlens ML1 (for example, at an end of the image sensor). The divided imaging region R1 functions as a divided imaging region for pixel selection, and the other divided imaging regions function as divided imaging regions for imaging.
 When the object X is photographed, only the component of wavelength λ1 of the light incident on the microlens ML1 passes through the bandpass filter BPF and is received in the divided imaging region R1 of the image sensor 13. In the divided imaging region R1, therefore, a pixel-selection image VD is acquired whose luminance distribution reflects the gap d between the reflecting surfaces of the wavelength selection filter FF.
 On the other hand, the light that passes through the microlens array 12 outside the microlens ML1 does not pass through the bandpass filter BPF and is received in the other divided imaging regions of the image sensor 13. In the divided imaging regions other than R1, ordinary captured images of the object X are therefore obtained.
 Based on the image information of the pixel-selection image VD obtained in the divided imaging region R1, the image processing unit 15 performs image-correction processing on the captured images obtained outside the divided imaging region R1. Specifically, the image processing unit 15 selects pixel regions according to the luminance distribution shown in the pixel-selection image VD, and composes the captured image of the object X using only the selected pixel regions.
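 As an illustrative sketch only (array layout, names, and threshold are assumptions), the single-exposure correction of this embodiment could be organized as follows: the sub-image of region R1 supplies the selection mask, and each of the remaining regions is reduced to one corrected value.

import numpy as np

def single_shot_correction(regions, r1_index=0, rel_threshold=0.8):
    """regions : array of shape (n, h, w) holding the n divided imaging regions of
    one exposure; regions[r1_index] is the pixel-selection image VD recorded behind
    the lambda_1 bandpass filter under microlens ML1."""
    regions = np.asarray(regions, dtype=float)
    vd = regions[r1_index]
    mask = vd >= rel_threshold * vd.max()     # pixels behind a uniform gap d
    if not mask.any():                        # degenerate case: fall back to all pixels
        mask = np.ones_like(mask, dtype=bool)
    values = []
    for i in range(regions.shape[0]):
        if i == r1_index:
            continue                          # R1 is used for selection only
        values.append(float(regions[i][mask].mean()))
    return np.array(values)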
 Therefore, according to the spectroscopic camera 20 of this embodiment, unlike the first embodiment, which requires the two steps of preliminary shooting and actual shooting, a captured image that uses only the high-luminance pixel regions can be obtained from a single exposure.
 Unlike the configuration described above, a plurality of bandpass filters BPF may be provided. For example, as shown in FIG. 14, two bandpass filters BPF, both with transmission wavelength (passband center wavelength) λ1, are provided on the exit-side surfaces of the microlenses at both ends of the microlens array 12, and the two corresponding divided imaging regions (divided imaging regions R1 and Rx) serve as divided imaging regions for pixel selection. Since a plurality of pixel-selection images VD are then obtained, pixel selection can be performed with higher accuracy.
 Alternatively, as shown in FIG. 15, a plurality of bandpass filters with different transmission wavelengths, BPF1 (transmission wavelength λ1) and BPF2 (transmission wavelength λ2), may be provided. The pixel-selection image in the divided imaging region R1 and the pixel-selection image in the divided imaging region Rx are acquired when different voltages V are applied to the wavelength selection filter FF. For example, a pixel-selection image VD1 is obtained in the divided imaging region R1 when the voltage V = V1, and a pixel-selection image VD2 is obtained in the divided imaging region Rx when the voltage V = V2. In this case as well, the pixels can be selected based on a plurality of pixel-selection images, so pixel selection can be performed with high accuracy.
 A plurality of bandpass filters with different half-widths may also be provided. For example, as shown in FIG. 16(a), a bandpass filter BPFA whose transmission band is centered at λ1 with half-width W1 and a bandpass filter BPFB whose transmission band is centered at λ1 with half-width W2 are provided on the exit-side surfaces of the microlenses at both ends of the microlens array 12. If half-width W2 > half-width W1, then, as shown in FIG. 16(b), the bandpass filter BPFA with the smaller half-width transmits less light, while the bandpass filter BPFB with the larger half-width transmits more light. Therefore, based on the pixel-selection image VD1 obtained in divided imaging region A and the pixel-selection image VD3 obtained in divided imaging region B, pixel selection can be performed while switching between half-width-priority selection and transmitted-light-priority selection.
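 (As an approximate relation, not stated in the application: for broadband illumination the flux behind such a filter scales roughly with its half-width, Φ ≈ T_peak · S(λ1) · W, where S(λ1) is the spectral irradiance near λ1. This is why BPFB with W2 > W1 transmits more light, while BPFA with the narrower W1 gives the better wavelength selectivity.)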
 The embodiments of the present invention are not limited to those shown in the first and second embodiments above. For example, the above embodiments describe a configuration in which, with a certain voltage V applied to the wavelength selection filter FF, pixel regions with high luminance values are selected within the divided imaging regions of the image sensor 13 and image processing is then performed. When a certain monochromatic light is incident and an image such as that of FIG. 9 is obtained in a divided imaging region, only the pixels of the high-luminance pixel regions (the areas shown in white in the figure) are treated as effective pixels, and the information of the pixels in the other areas (the areas shown in gray and black) is discarded. However, since the high-luminance pixel regions change as the voltage applied to the wavelength selection filter FF changes, the image information may instead be obtained by adding together the information acquired at a plurality of voltages.
 For example, suppose that the voltage applied to the wavelength selection filter FF (hereinafter, the applied voltage) is varied around V = V1, that the luminance values of the pixel regions shown in gray at the applied voltage V1 reach their maximum at the applied voltage V2, and that the luminance values of the pixel regions shown in black at the applied voltage V1 reach their maximum at the applied voltage V3. In this case, by adding together the pixel-region information obtained at the respective voltage values and calculating the luminance value of the corresponding microlens, an image at wavelength λ1 can be obtained from the information of the entire surface of the wavelength selection filter FF, and the loss of light can be further reduced. By using a plurality of applied voltages, and the pixels (pixel regions) in the divided imaging regions corresponding to those applied voltages, to obtain the image information for one wavelength in this way, the half-width can be improved while securing the maximum light-utilization efficiency.
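 A minimal sketch of this voltage-accumulation idea, again with illustrative names and array layout (the per-pixel "take each pixel at its brightest voltage" rule is one straightforward reading of the text, not a prescribed algorithm):

import numpy as np

def accumulate_wavelength(frames):
    """Build one lambda_1 luminance value for a divided imaging region (microlens)
    from a sweep of applied voltages around V1, letting every pixel contribute at
    the voltage where it is brightest instead of being discarded.

    frames : array of shape (n_voltages, h, w) for one divided imaging region.
    Returns (accumulated luminance, per-pixel index of the peak voltage).
    """
    frames = np.asarray(frames, dtype=float)
    best_v = np.argmax(frames, axis=0)          # per-pixel peak-voltage index (V2, V3, ...)
    rows, cols = np.indices(best_v.shape)
    per_pixel = frames[best_v, rows, cols]      # each pixel taken at its own peak voltage
    # Summing over the region uses the whole filter aperture for lambda_1,
    # reducing the light loss compared with keeping a single voltage's pixels.
    return float(per_pixel.sum()), best_v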
 Moreover, since the gray areas at the applied voltage V1 can be regarded as carrying information at another wavelength λ2, and the black areas as carrying information at yet another wavelength λ3, information on a plurality of wavelengths can be acquired for each applied voltage. Therefore, the same result can be obtained by holding this information in a memory (not shown) or the like while the applied voltage V is swept, and finally calculating the luminance value of each microlens by adding together the information belonging to the same wavelength.
 In the present invention, because the wavelength selection filter FF is placed at the pupil of the optical system 11, light that passes through a given coordinate of the pupil is incident uniformly on all the microlenses of the microlens array 12. Therefore, even if there are non-uniform portions (unevenness) in the gap d between the reflecting surfaces of the wavelength selection filter FF, the part of the filter through which the light reaching the microlens array 12 has passed varies with time, but the timing at which light enters each microlens is the same for all microlenses, so no unevenness appears in the image obtained by the image sensor 13. Likewise, if there is dirt or a defect on the filter, the luminance decreases by an amount corresponding to the area of the dirt or defect, but no pixel defects appear in the image obtained by the image sensor 13.
 Furthermore, with the spectroscopic camera of the present invention, even when dust or the like adheres to the wavelength selection filter, the subject can be photographed while avoiding it. For example, as shown in FIG. 17, when dust GB adheres to the wavelength selection filter FF and a screen SC illuminated with illumination light WL such as white light is photographed, the dust GB appears in every divided imaging region of the image sensor 13. By selecting pixel regions so as to avoid the areas where the dust GB appears, shooting can be performed while avoiding the dust GB. Besides dust, this also makes it possible to cope with foreign matter whose location cannot be predicted, such as dirt and scratches.
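 A sketch of how such dust or defect areas could be excluded automatically (the white-light flat-field capture and the relative threshold are assumptions for illustration):

import numpy as np

def usable_pixel_mask(flat_field, rel_threshold=0.6):
    """flat_field : 2-D array of one divided imaging region recorded while the
    screen SC is illuminated with white light WL.  Pixels that stay well below
    the median brightness are assumed to be shadowed by dust GB, dirt or
    scratches on the wavelength selection filter FF and are excluded from
    subsequent pixel-region selection."""
    flat = np.asarray(flat_field, dtype=float)
    return flat >= rel_threshold * np.median(flat)   # True = usable pixel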
 In the above embodiments, the spectroscopic cameras 10 and 20 have the microlens array 12 in which n microlenses are arranged in rows and columns. However, the type and arrangement of the lenses are not limited to this; the spectroscopic camera need only have a lens array with n lenses.
 In the second embodiment, the bandpass filter is provided on the exit-side surface of a microlens, but the position of the bandpass filter is not limited to this; it need only be placed on the optical path of the light focused onto that microlens. Moreover, it need not be associated with a microlens at an end of the array; it may be placed at a position corresponding to any of the n microlenses.
 The series of processes described in each of the above embodiments can be performed by computer processing in accordance with a program stored in a recording medium such as a ROM.
10, 20 Spectroscopic camera
11 Optical system
12 Microlens array
13 Image sensor
14 Camera
15 Image processing unit

Claims (12)

  1.  A spectroscopic camera comprising:
     an optical system consisting of a pair of lenses;
     a wavelength selection unit having a pair of reflecting surfaces arranged facing each other at the position of the pupil of the optical system, the wavelength selection unit changing the spacing between the pair of reflecting surfaces according to an applied voltage and selectively transmitting light of a wavelength corresponding to the spacing;
     a lens array having n lenses (n being a natural number), each of which focuses light that has passed through the optical system and the wavelength selection unit; and
     an imaging element provided at a position conjugate with the wavelength selection unit and having n divided imaging regions that receive the light focused by the n lenses.
  2.  The spectroscopic camera according to claim 1, further comprising a detection unit that detects non-uniformity of the transmission wavelength of the wavelength selection unit based on an image captured by the imaging element.
  3.  The spectroscopic camera according to claim 2, wherein the detection unit detects the non-uniformity of the transmission wavelength of the wavelength selection unit based on a luminance value of each pixel of the image.
  4.  The spectroscopic camera according to claim 2 or 3, further comprising an image processing unit that performs image correction on the image according to the non-uniformity of the transmission wavelength of the wavelength selection unit detected by the detection unit.
  5.  The spectroscopic camera according to claim 4, wherein the image processing unit performs the image correction based on luminance values of pixels, in at least one of the n divided imaging regions of the imaging element, that received light having passed through a portion of the wavelength selection unit where the spacing between the pair of reflecting surfaces is uniform, and luminance values of pixels that received light having passed through a portion where the spacing between the pair of reflecting surfaces is non-uniform.
  6.  The spectroscopic camera according to any one of claims 2 to 5, further comprising at least one bandpass filter on the optical path of the optical system,
     wherein the detection unit detects the non-uniformity of the transmission wavelength of the wavelength selection unit based on an image captured by a divided imaging region, among the n divided imaging regions, that received light having passed through the bandpass filter.
  7.  The spectroscopic camera according to claim 6, wherein the bandpass filter is provided on the optical path of the light focused by one of the n lenses of the lens array.
  8.  The spectroscopic camera according to claim 6, comprising a plurality of said bandpass filters having different transmission wavelengths,
     wherein each of the plurality of bandpass filters is provided on the optical path of light focused by a different one of the n lenses of the lens array.
  9.  The spectroscopic camera according to claim 1, further comprising:
     a storage unit that stores information on the non-uniformity of the transmission wavelength of the wavelength selection unit; and
     an image processing unit that performs image correction on an image captured by the imaging element based on the information on the non-uniformity stored in the storage unit.
  10.  An imaging method using the spectroscopic camera according to claim 1, comprising the steps of:
     causing light of a predetermined wavelength to enter the optical system;
     causing the imaging element to receive the light of the predetermined wavelength while changing the applied voltage;
     obtaining information on the non-uniformity of the transmission wavelength of the wavelength selection unit based on information on the luminance distribution of each pixel in the n divided imaging regions of the imaging element;
     imaging a subject to obtain a captured image; and
     performing image correction on the captured image based on the information on the non-uniformity of the transmission wavelength.
  11.  A program for causing a computer, in the spectroscopic camera according to claim 1, to execute the steps of:
     causing light of a predetermined wavelength to enter the optical system;
     causing the imaging element to receive the light of the predetermined wavelength while changing the applied voltage;
     obtaining information on the non-uniformity of the transmission wavelength of the wavelength selection unit based on information on the luminance distribution of each pixel in the n divided imaging regions of the imaging element;
     imaging a subject to obtain a captured image; and
     performing image correction on the captured image based on the information on the non-uniformity of the transmission wavelength.
  12.  A recording medium on which the program according to claim 11 is recorded.
PCT/JP2018/003400 2017-02-02 2018-02-01 Spectroscopic camera, image capturing method, program, and recording medium WO2018143340A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2018565643A JPWO2018143340A1 (en) 2017-02-02 2018-02-01 Spectral camera, imaging method, program, and recording medium

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017017768 2017-02-02
JP2017-017768 2017-08-18

Publications (1)

Publication Number Publication Date
WO2018143340A1 (en)

Family ID=63039832

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/003400 WO2018143340A1 (en) 2017-02-02 2018-02-01 Spectroscopic camera, image capturing method, program, and recording medium

Country Status (2)

Country Link
JP (5) JPWO2018143340A1 (en)
WO (1) WO2018143340A1 (en)


Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPWO2018143340A1 (en) * 2017-02-02 2020-01-09 パイオニア株式会社 Spectral camera, imaging method, program, and recording medium


Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2014089075A (en) * 2012-10-29 2014-05-15 Ricoh Co Ltd Spectral reflectance measuring system
JP6536877B2 (en) * 2014-07-31 2019-07-03 パナソニックIpマネジメント株式会社 Imaging device and imaging system
JPWO2018143340A1 (en) * 2017-02-02 2020-01-09 パイオニア株式会社 Spectral camera, imaging method, program, and recording medium

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000028931A (en) * 1998-07-09 2000-01-28 Tdk Corp Multiple wavelength filter array
US20080266655A1 (en) * 2005-10-07 2008-10-30 Levoy Marc S Microscopy Arrangements and Approaches
US20160299339A1 (en) * 2013-12-05 2016-10-13 B.G. Negev Technologies And Applications Ltd., At Ben-Gurion University Method for extended depth of field imaging
JP2016090251A (en) * 2014-10-30 2016-05-23 セイコーエプソン株式会社 Spectrometer and spectrometric method
JP2016093509A (en) * 2014-11-14 2016-05-26 株式会社リコー Simultaneous capture of filter processing image of eye

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20210293723A1 (en) * 2020-03-18 2021-09-23 Kabushiki Kaisha Toshiba Optical inspection device
US20230061395A1 (en) * 2021-08-31 2023-03-02 Shimadzu Corporation Spectroscopic detector
US11879776B2 (en) * 2021-08-31 2024-01-23 Shimadzu Corporation Spectroscopic detector assembly
CN113905164A (en) * 2021-10-09 2022-01-07 奕目(上海)科技有限公司 Light field imaging system and method for acquiring light field information through light field imaging system

Also Published As

Publication number Publication date
JPWO2018143340A1 (en) 2020-01-09
JP2021063824A (en) 2021-04-22
JP2022050396A (en) 2022-03-30
JP2024028237A (en) 2024-03-01
JP2023010706A (en) 2023-01-20
JP7164694B2 (en) 2022-11-01


Legal Events

Date Code Title Description
121 EP: the EPO has been informed by WIPO that EP was designated in this application (Ref document number: 18748277; Country of ref document: EP; Kind code of ref document: A1)
ENP Entry into the national phase (Ref document number: 2018565643; Country of ref document: JP; Kind code of ref document: A)
NENP Non-entry into the national phase (Ref country code: DE)
122 EP: PCT application non-entry in European phase (Ref document number: 18748277; Country of ref document: EP; Kind code of ref document: A1)