US20090231983A1 - Image pickup apparatus for capturing spectral images of an object and observation system including the same - Google Patents

Info

Publication number
US20090231983A1
Authority
US
United States
Prior art keywords
image pickup
light
image
illumination device
zero
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US12/471,045
Inventor
Susumu Takahashi
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Olympus Corp
Original Assignee
Olympus Corp
Application filed by Olympus Corp
Priority to US12/471,045
Publication of US20090231983A1
Legal status: Abandoned

Classifications

    • G02B27/144: Beam splitting or combining systems operating by reflection only, using partially transparent surfaces without spectral selectivity
    • G01J3/02: Spectrometry; spectrophotometry; monochromators; measuring colours (details)
    • G01J3/0264: Electrical interface; user interface
    • G01J3/2823: Imaging spectrometer
    • G01J3/30: Measuring the intensity of spectral lines directly on the spectrum itself
    • G01J3/36: Investigating two or more bands of a spectrum by separate detectors
    • G01N21/31: Investigating relative effect of material at wavelengths characteristic of specific elements or molecules, e.g. atomic absorption spectrometry
    • G02B23/2423: Instruments for viewing the inside of hollow bodies, e.g. fibrescopes; optical details of the distal end
    • G02B27/1086: Beam splitting or combining systems operating by diffraction only
    • G02B27/145: Beam splitting or combining systems operating by reflection only, having sequential partially reflecting surfaces
    • G01N2021/1793: Remote sensing
    • G01N2021/3129: Determining multicomponents by multiwavelength light
    • G01N2021/3137: Determining multicomponents by multiwavelength light with selection of wavelengths after the sample
    • G01N2021/317: Special constructive features (comparison of measurements at specific and non-specific wavelengths)
    • G01N21/49: Scattering, i.e. diffuse reflection within a body or fluid

Definitions

  • the present invention relates to an image pickup apparatus for obtaining spectral images of an object. In particular, it relates to an image pickup unit in a medical endoscope for obtaining spectral images of fluorescence emitted by living tissue, and to an analysis device for analyzing the obtained spectral images, for example by measuring fluorescent wavelengths for living-tissue diagnosis. It is also concerned with an image pickup unit in an industrial inspection device for obtaining spectral images of luminous surfaces, such as those of LEDs, and with an analysis device for analyzing the obtained spectral images, for example by measuring the spectroscopic properties of an object surface for quality control on a production line.
  • Japanese Laid Open Patent Application No. H02-104332 discloses a spectroscopic endoscope in which illumination light is separated into multiple wave bands in a time-division manner via a rotating filter provided in the light source while continuously illuminating an object, thereby obtaining spectral images separated in a time-division manner.
  • Japanese Laid Open Patent Application No. S63-271308 discloses an endoscope having a variable transmittance element in the observation optical system wherein the wavelengths transmitted by the variable transmittance element may be successively changed, thereby enabling spectral images to be obtained for different wave bands.
  • U.S. Pat. No. 5,782,770 discloses a detection device in which a ribbon-shaped beam is used to illuminate an object, and spectral content of the image of the object is then detected via a dispersion element.
  • the Internet publication entitled “Spectral Camera” mentioned above discloses a spectral camera having a slit opening, a spectroscope, and a two-dimensional CCD camera in which light from an object is received via the slit opening and the light is then separated according to wavelength, thereby enabling the spectrum of an object to be obtained.
  • the present invention provides a small image pickup apparatus that allows high-resolution spectral images and color images of an object to be observed simultaneously.
  • FIGS. 1( a ) and 1 ( b ) show the construction of the image pickup apparatus of Embodiment 1, with FIG. 1( a ) being a cross-section containing the optical axis of the observation optical system that shows the basic construction of an image pickup apparatus 5 , and with FIG. 1( b ) being a view along the optical axis that shows the imaging areas of minus first order light (hereinafter referred to as −1st-order light) and zero-order light on the image pickup surface of a solid-state image pickup element;
  • FIGS. 2( a ) and 2 ( b ) are illustrations to explain the relationship between the imaging position of zero-order light and the spectral position of −1st-order light of a diffraction grating used in the spectroscopic element of the present invention, with FIG. 2( a ) showing the diffraction of the principal ray of an imaging light flux when the diffraction element is a transmission-type diffraction grating, and with FIG. 2( b ) showing the different imaging positions of different wavelengths due to dispersion of the diffraction grating;
  • FIGS. 3( a )- 3 ( c ) are illustrations to schematically show the relationship between the imaging positions of the zero-order light and the −1st-order light on the image pickup surface of a solid-state image pickup element; with the incident light having an angle of incidence of +θI degrees in FIG. 3( a ), 0 degrees in FIG. 3( b ), and −θI degrees in FIG. 3( c ).
  • FIGS. 4( a ) and 4 ( b ) are illustrations to explain the construction of the image pickup apparatus of Embodiment 2 of the present invention, with FIG. 4( a ) being a cross-section containing the optical axis of the observation optical system that shows the basic construction of an image pickup apparatus 5 , and with FIG. 4( b ) being a view in the direction of the optical axis that shows the imaging areas of −1st-order light and zero-order light on the image pickup surface of a solid-state image pickup element 4 ;
  • FIG. 5 is an illustration showing the construction of an endoscope system in which the image pickup apparatus 5 is mounted
  • FIGS. 6( a ) and 6 ( b ) are illustrations showing an exemplary construction of a first illumination device 112 , with FIG. 6( a ) being a cross-section containing the optical axis and showing a construction of the first illumination device 112 , and with FIG. 6( b ) being a perspective view showing an exemplary construction of a slit glass 112 b;
  • FIGS. 7( a ) and 7 ( b ) are illustrations showing an exemplary construction of the optical system of the light source (item 55 , shown in FIG. 5) , with FIG. 7( a ) being a cross-section containing the center lines of the light guides 113 a and 113 b of the optical system of the light source as seen, for example, in a top view, and FIG. 7( b ) being a composite cross-sectional view that includes the center line of one light guide of the optical system of the light source as seen, for example, from one side;
  • FIGS. 8( a ) and 8 ( b ) are illustrations to explain the construction of the image pickup apparatus of Embodiment 3 of the present invention, with FIG. 8( a ) being a cross-section containing the optical axis of the observation optical system that shows the basic construction of an image pickup apparatus 5 , and FIG. 8( b ) being a view in the direction of the optical axis that shows the imaging areas of −1st-order light and zero-order light on the image pickup surface of a solid-state image pickup element 4 ;
  • FIGS. 9( a ) and 9 ( b ) are illustrations to explain the construction of the image pickup apparatus of Embodiment 4 of the present invention, with FIG. 9( a ) being a cross-section containing the optical axis of the observation optical system that shows the basic construction of an image pickup apparatus 5 , and FIG. 9( b ) being a view in the direction of the optical axis that shows the imaging areas of plus first-order light (hereinafter +1st-order light) and zero-order light on the image pickup surfaces of the solid-state image pickup elements 4 b and 4 a , respectively;
  • FIGS. 10( a )- 10 ( c ) are illustrations to explain the construction of the observation system 201 of Embodiment 5 of the present invention, with FIG. 10( a ) being a cross-section containing the optical axis of the observation optical system 2 , with FIG. 10( b ) being a view in the direction of the optical axis that shows the imaging areas of −1st-order light and zero-order light on the image pickup surface of a solid-state image pickup element 4 when a second illumination device 7 extensively illuminates the object surface, and with FIG. 10( c ) being a view in the direction of the optical axis that shows the imaging areas of −1st-order light and zero-order light on the image pickup surface of a solid-state image pickup element 4 when a first illumination device 6 illuminates a narrow region of the object surface;
  • FIGS. 11( a ) and 11 ( b ) are illustrations showing another possible construction of the illumination device 6 , with FIG. 11( a ) being a partial cross-section containing the optical axis of the illumination device 6 , and FIG. 11( b ) being a view along the optical axis showing the arrangement of a set of LEDs 6 e;
  • FIGS. 12( a )- 12 ( c ) are illustrations to schematically show an image display obtained using the observation system for measuring spectral reflectance according to Embodiment 5 of the present invention
  • FIG. 13 is an illustration to schematically show another image display obtained using the observation system for measuring spectral reflectance according to Embodiment 5 of the present invention.
  • FIGS. 14( a )- 14 ( c ) are illustrations to explain the construction of an observation system for measuring spectral reflectance according to Embodiment 6 of the present invention, with FIG. 14( a ) being a cross-section containing the optical axis of the objective lens 15 showing the basic construction of an observation system for measuring spectral reflectance, with FIG. 14( b ) showing an illumination light flux of an illumination device 16 as viewed in the direction z, and with FIG. 14( c ) showing an illumination light flux of the illumination device 16 as viewed in the direction x;
  • FIGS. 15( a ) and 15 ( b ) are illustrations to schematically show an image display obtained using the observation system for measuring spectral reflectance according to Embodiment 6 of the present invention.
  • FIG. 16 is an illustration showing the construction of an observation system for inspecting the quality of LEDs according to Embodiment 7 of the present invention.
  • FIG. 17 is an illustration to schematically show an image display obtained using the observation system for inspecting the quality of LEDs according to Embodiment 7 of the present invention.
  • the image pickup apparatus of the present invention includes a diffraction element in the optical path of an observation optical system whereby zero-order light transmitted through the diffraction element and −1st-order light diffracted by the diffraction element are imaged on the image pickup surface of an image pickup element, and the imaging areas of the zero-order light and −1st-order light do not overlap on the image pickup surface of the image pickup element. Therefore, a color image of an object can be observed within the area that receives the zero-order light, and the spectrum of the object can be observed within the respective areas that receive the +1st-order light and the −1st-order light, while preventing flare from occurring in these light-receiving areas.
  • a diffraction element is provided in the optical path of an observation optical system and both color image observation and spectrum detection of an object can be achieved. Therefore, for example, a small image pickup apparatus can be realized that can be provided at the insertion end of an endoscope. Further, morphological information of an object obtained from a color image, and spectral information of each part of the object can be analyzed and associated with each other.
  • FIG. 1( a ) is a cross-section containing the optical axis of the observation optical system that shows the basic construction of an image pickup apparatus 5
  • FIG. 1( b ) is a view along the optical axis that shows the imaging areas of −1st-order light and zero-order light on the image pickup surface of a solid-state image pickup element.
  • the image pickup apparatus 5 of the present invention is formed of an observation optical system 2 that includes a diffraction element 3 located in the optical path, and a solid-state image pickup element 4 .
  • the observation optical system 2 includes a lens 2 a provided on the object surface side, and a collimating lens 2 b for collimating a light flux from the lens 2 a .
  • the observation optical system 2 includes a diaphragm S and the diffraction element 3 , each of which is located in the collimated light flux from the collimating lens 2 b , and a lens 2 c for imaging zero-order light that passes straight through the diffraction element 3 and −1st-order light that is diffracted by the diffraction element 3 onto off-axis positions on the image pickup surface of the solid-state image pickup element 4 .
  • the diffraction element 3 is a transmission-type DOE (Diffractive Optical Element) such as a diffraction grating, or an HOE (Holographic Optical Element) such as a holographic film.
  • Upon entering the diffraction element 3 , the light flux is separated into zero-order light, −1st-order light, and +1st-order light.
  • the 1st-order light emerges from the diffraction element 3 at equal (plus and minus) angles to the optical axis, with the specific angle amount depending upon the wavelength of the incident light.
  • the solid-state image pickup element 4 is provided in a manner such that the center of the image pickup surface is not aligned with the optical axis of the observation optical system 2 .
  • the zero-order light that passes straight through the diffraction element 3 and the −1st-order light that is diffracted by the diffraction element 3 are imaged on the image pickup surface by the lens 2 c .
  • the imaging areas of the zero-order light and −1st-order light do not overlap on the image pickup surface.
  • the spectrum of the −1st-order light from an object can be obtained by using the positional relationship between the imaging position of the −1st-order light and that of the zero-order light on the image pickup surface of the solid-state image pickup element 4 .
  • a light beam emitted from a point on the object surface of an object is imaged on the image pickup surface of the solid-state image pickup element 4 via the observation optical system.
  • the relationship between the imaging position of the zero-order light and the spectrum of the −1st-order light on the image pickup surface of the solid-state image pickup element 4 will now be explained with reference to FIGS. 2( a ) through 3 ( c ).
  • FIG. 2( a ) is an illustration showing the diffraction of the principal ray of a light flux when the diffraction element 3 is a transmission-type diffraction grating.
  • When the incident angle of light having a wavelength λ relative to the optical axis of the diffraction grating is θI and the diffraction angle is θI′, as shown in FIG. 2( a ), Equation (1) is obtained, where N is the order of diffraction and d is the diffraction grating pitch.
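  • By way of an illustrative sketch (not the patent's verbatim Equation (1)), the quantities defined above are consistent with the standard transmission-grating relation Nλ = d(sin θI + sin θI′); the relation, the sign convention, and the numerical values below are assumptions for illustration only.

```python
# Minimal sketch, assuming the standard grating relation N*lam = d*(sin(thetaI) + sin(thetaI')),
# solved here for the diffraction angle of the -1st order. Pitch and wavelengths are hypothetical.
import math

def diffraction_angle_deg(lam_nm: float, theta_i_deg: float,
                          pitch_nm: float, order: int = -1) -> float:
    """Return the diffraction angle (degrees) for wavelength lam_nm at incidence theta_i_deg."""
    s = order * lam_nm / pitch_nm - math.sin(math.radians(theta_i_deg))
    if abs(s) > 1.0:
        raise ValueError("this order/wavelength does not propagate for the given grating")
    return math.degrees(math.asin(s))

if __name__ == "__main__":
    for lam in (450.0, 550.0, 650.0):  # nm
        print(f"{lam:.0f} nm -> {diffraction_angle_deg(lam, 0.0, 2000.0):+.2f} deg")
```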
  • FIG. 2( b ) schematically shows the different imaging positions for different wavelengths as a result of the dispersion of the diffraction grating.
  • When the imaging lens 2 c is a lens having distortion of the sin θ type, the following Equations (2) and (3) are obtained for the diffraction angle θλ of each wavelength when the principal ray enters the diffraction grating at a right angle:
  • Hλ is the image height on the imaging surface of the diffracted light having a wavelength λ
  • F is the focal length of the imaging lens 2 c .
  • N and d are as previously defined.
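  • As a minimal sketch of this normal-incidence case, under the assumption that Equations (2) and (3) take the standard forms sin θλ = Nλ/d and Hλ = F·sin θλ (consistent with the definitions above); the exact equations are not reproduced in this text, and the focal length and pitch below are hypothetical values.

```python
# Minimal sketch, assuming H_lambda = F*sin(theta_lambda) with sin(theta_lambda) = N*lam/d
# for a principal ray entering the grating at a right angle. All numbers are hypothetical.
def image_height_mm(lam_nm: float, pitch_nm: float, focal_mm: float, order: int = -1) -> float:
    """Image height of the diffracted light of wavelength lam_nm on the image pickup surface."""
    return focal_mm * (order * lam_nm / pitch_nm)

if __name__ == "__main__":
    F, d = 5.0, 2000.0  # hypothetical focal length (mm) and grating pitch (nm)
    for lam in range(400, 701, 100):
        print(f"{lam} nm -> H = {image_height_mm(lam, d, F):+.3f} mm")
```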
  • When the incident angle θI of the principal ray of an imaging light flux entering the diffraction grating is changed, the imaging positions of the zero-order light and the −1st-order light shift accordingly.
  • the diffraction grating causes dispersion, but does not cause refraction or imaging.
  • Such a diffraction grating allows the zero-order light to pass straight through the diffraction grating so that the exit angle is the same as the incident angle.
  • FIGS. 3( a )- 3 ( c ) show the positional relationship between the imaging positions of the zero-order light and the −1st-order diffracted light on the image pickup surface of the solid-state image pickup element 4 for three different incidence angles of incident light onto the diffraction grating.
  • the position on the image pickup surface that corresponds to the optical axis of the imaging optical system is marked with an “x” in each figure.
  • the image position on the image pickup surface labeled “zero-order light” is the image position of light from an object point located off the optical axis such that a principal ray enters the diffraction grating at the angle +θI (see FIG. 2( a )), and this position is at a distance H0 to the right of the optical axis “x” position of the imaging optical system.
  • the position of the −1st-order diffracted light is at a distance Hλ from the optical axis; in this case Hλ is relatively large.
  • FIG. 3( b ) illustrates the situation where the angle of incidence of the incident light is such that the zero-order diffracted light travels along the optical axis and thus the image of the zero-order light lies at the position “x”.
  • Zero-order light from the point on the object at zero image height, i.e., the point on the object that corresponds to the optical axis of the imaging optical system, passes straight through the diffraction grating and thus is imaged at the point “S” on the image pickup surface.
  • the position of the −1st-order diffracted light is at a distance Hλ from the optical axis; in this case Hλ is intermediate.
  • FIG. 3( c ) illustrates the situation where the angle of incidence of the incident light is such that a principal ray enters the diffraction grating at the angle −θI, and this light is imaged onto the image pickup surface at a position to the left of the optical axis at a distance H0.
  • the position of the −1st-order diffracted light is at a distance Hλ from the optical axis; in this case Hλ is relatively small.
  • The relationship between the imaging position of the zero-order light and the position of the spectrum of the −1st-order light is given by the following Equation (4):
  • θI is the incident angle of light having a wavelength λ relative to the optical axis of the diffraction element
  • θI′λ is the diffraction angle of light having a wavelength λ relative to the optical axis of the diffraction element
  • N and d are as previously defined.
  • the image height Hλ on the image pickup surface of the −1st-order light having a wavelength λ is obtained using the following Equation (5):
  • Hλ, F, and θI′λ are as previously defined.
  • the imaging lens 2 c behind the diffraction grating is a lens having distortion that varies according to sin θ. Practically speaking, a similar calculation may be applied to a lens having any type of distortion provided that its aberration properties are previously measured and known.
  • the imaging position of the spectrum of the −1st-order light is identified as follows.
  • the image height H0 of the zero-order light (i.e., the imaging position of the zero-order light) is measured based on an image acquired by the solid-state image pickup element 4 ;
  • the diffraction angle θI′λ of the −1st-order light having a wavelength λ is calculated using the incident angle θI (as determined in step (2) above), the diffraction grating pitch d, the order of diffraction N, and the wavelength λ;
  • the image height Hλ of the −1st-order light having a wavelength λ is calculated from the diffraction angle θI′λ obtained in step (3), the previously measured focal length F, and the distortion properties of the imaging lens 2 c.
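  • A minimal sketch of steps (1)-(4) above, assuming the relations H0 = F·sin θI for the zero-order image height, the standard grating relation for Equation (4), and Hλ = F·sin θI′λ for Equation (5); the sign convention and all numerical parameters are hypothetical.

```python
# Sketch of steps (1)-(4): from the measured zero-order image height, recover the incident
# angle, compute the -1st-order diffraction angle for a wavelength, and return its image
# height. Assumed relations: H0 = F*sin(thetaI); N*lam = d*(sin(thetaI) + sin(thetaI'));
# H_lambda = F*sin(thetaI'). All parameter values are hypothetical.
import math

def spectral_image_height_mm(h0_mm: float, lam_nm: float, focal_mm: float,
                             pitch_nm: float, order: int = -1) -> float:
    sin_theta_i = h0_mm / focal_mm                          # step (2): incident angle from H0
    sin_theta_d = order * lam_nm / pitch_nm - sin_theta_i   # step (3): grating relation
    if abs(sin_theta_d) > 1.0:
        raise ValueError("no propagating diffracted order for these parameters")
    return focal_mm * sin_theta_d                           # step (4): H_lambda

if __name__ == "__main__":
    F, d = 5.0, 2000.0   # hypothetical focal length (mm) and grating pitch (nm)
    h0 = 0.4             # step (1): measured zero-order image height (mm)
    for lam in (450.0, 550.0, 650.0):
        print(f"{lam:.0f} nm -> H_lambda = {spectral_image_height_mm(h0, lam, F, d):+.3f} mm")
```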
  • the image pickup apparatus 5 is provided with an image analysis circuit 101 for analyzing the position of an obtained image and an arithmetic operation circuit 102 for executing the calculations above to acquire the spectral properties of the obtained image on a real time basis.
  • a diffraction grating that performs imaging in addition to providing a diffraction function can be provided.
  • the imaging lens 2 c behind the diffraction grating can be eliminated.
  • the focal length F and the distortion property of the diffraction grating may be measured and stored in a memory 103 as a calculation parameter.
  • An arithmetic operation circuit may perform the calculations above by using the stored parameter, thereby enabling the arithmetic operations to be repeatedly executed.
  • the image pickup apparatus 5 of the present invention is provided with an illumination device. It is desirable that the illumination device have a function to extensively illuminate an object and also a function to spotlight a small part of the object.
  • the entire image of the object is obtained in the imaging area of the zero-order light on the image pickup surface of the solid-state image pickup element 4 .
  • spectral images of the entire object are obtained in the imaging area of the −1st-order light on the image pickup surface.
  • When the illumination light is white light, a color object image and respective object images for different wavelengths are obtained in a successively overlapping manner.
  • the spectral information for each point of the object is superimposed. Therefore, it is difficult to analyze the spectral information based on the calculations described above.
  • the illumination device in this embodiment has the function of spotlighting a small portion of the object, thereby enabling the spectral information for each point of the object to be separately obtained in the imaging area of the −1st-order light on the image pickup surface of the image pickup element 4 .
  • an object may be extensively illuminated in one instance and only a small part of the object may be illuminated (i.e., spotlighted) in another instance. Then, a color image obtained by extensively illuminating the object and a spectral image obtained by spotlighting a small part of the object may be merged and displayed by a display. In this manner, the spectrum at a specific point of the object can be observed while examining the appearance of the object.
  • the spot image obtained in the imaging area of the zero-order light on the image pickup surface of the solid-state image pickup element 4 during the spotlighting can be analyzed according to its coordinates. Then, the color object image obtained by extensively illuminating the object can be marked at the position where the spot image is taken, thereby making it known from which positions on the object the spectral information is obtained.
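  • As an illustrative sketch of this coordinate-based marking (the frame layout, array names, and marker style are assumptions, not the patent's data format), the spot centroid can be located in the zero-order area of the spotlighted frame and drawn onto the color image captured under extensive illumination:

```python
# Sketch: locate the spotlight return in the zero-order area and mark that position on the
# color image taken under extensive illumination. Arrays and coordinates are hypothetical.
import numpy as np

def spot_centroid(zero_order_gray: np.ndarray) -> tuple[int, int]:
    """Intensity-weighted centroid (row, col) of the bright spot in the zero-order area."""
    img = zero_order_gray.astype(float)
    img -= img.min()
    total = img.sum()
    if total == 0:
        raise ValueError("no signal in the zero-order area")
    rows, cols = np.indices(img.shape)
    return int((rows * img).sum() / total), int((cols * img).sum() / total)

def mark_position(color_img: np.ndarray, rc: tuple[int, int], half: int = 5) -> np.ndarray:
    """Return a copy of the color image with a small green cross at (row, col)."""
    out = color_img.copy()
    r, c = rc
    out[max(r - half, 0):r + half + 1, c] = (0, 255, 0)
    out[r, max(c - half, 0):c + half + 1] = (0, 255, 0)
    return out

if __name__ == "__main__":
    spot_frame = np.zeros((120, 160))
    spot_frame[40:43, 90:93] = 200.0                      # synthetic spotlight return
    color_frame = np.zeros((120, 160, 3), dtype=np.uint8)
    marked = mark_position(color_frame, spot_centroid(spot_frame))
    print("spot centroid:", spot_centroid(spot_frame))
```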
  • the image pickup surface of the solid-state image pickup element 4 receives the zero-order light that is transmitted straight through the diffraction element 3 and the −1st-order light that is diffracted by the diffraction element 3 .
  • the +1st-order light diffracted by the diffraction element 3 and higher-order diffractive components are unnecessary for image pickup. Therefore, it is desirable to eliminate those light beams by providing optical members, in combination, that reflect or absorb the light beams between the diffraction element 3 and the image pickup surface.
  • a light absorbing member 104 can be provided at the imaging area of +1st-order light along with the solid-state image pickup element 4 , thereby preventing the +1st-order light from being detected by the solid-state image pickup element 4 .
  • FIG. 4( a ) is a cross-section containing the optical axis of the observation optical system showing the basic construction of the image pickup apparatus 5 .
  • FIG. 4( b ) is an illustration showing the imaging areas of the −1st-order light and zero-order light on the image pickup surface of a solid-state image pickup element 4 , as viewed from a position on the optical axis facing the image pickup surface.
  • the image pickup apparatus 5 of Embodiment 2 of the present invention is formed of an observation optical system 2 which may include a diffraction element 3 located in the optical path, and a solid-state image pickup element 4 .
  • the observation optical system 2 is formed of a lens 2 a located on the object surface side of the observation optical system 2 , a collimating lens 2 b for collimating a light beam from the lens 2 a , and a diaphragm S and the diffraction element 3 located within the collimated light flux.
  • the diffraction element 3 is a transmission-type diffraction grating that provides an imaging function in addition to providing a diffraction function. Upon entering the diffraction element 3 , a light flux is separated into zero-order light and 1st-order light due to diffraction. The ±1st-order light emerges from the diffraction element 3 at equal but opposite angles to the optical axis, with the angle depending on the wavelength of the light, and the −1st-order light is then incident onto the image pickup surface of the image pickup element 4 .
  • the solid-state image pickup element 4 is positioned in a manner such that the center of the image pickup surface is not aligned with the optical axis of the observation optical system 2 . It thereby receives the light fluxes of the zero-order light and −1st-order light from the diffraction element 3 . As shown in FIG. 4( b ), the imaging areas of the zero-order light and −1st-order light do not overlap on the image pickup surface.
  • the image pickup apparatus 5 can be downsized so that it can be mounted at, for example, the insertion end of an endoscope.
  • FIG. 5 shows an example of an endoscope in which the image pickup apparatus 5 is mounted at the insertion end of the endoscope.
  • the image pickup apparatus 5 is provided at the leading end 51 with a first illumination device 112 for illuminating a relatively narrow area of an object via a slit opening, and a second illumination device 111 for extensively illuminating an object.
  • Image signals obtained by the image pickup apparatus 5 are transferred to an image processing device 54 via a universal cable 53 extended from an operation part 52 of the endoscope.
  • the image processing device 54 includes an image analysis circuit 101 (not shown in FIG. 5 ), an arithmetic operation circuit 102 (not shown in FIG. 5 ), and a memory as described above to analyze and merge obtained images. Images that are processed by the image processing device 54 may be displayed by a display device 56 .
  • FIGS. 6( a ) and 6 ( b ) show an exemplary construction of the first illumination device 112 .
  • FIG. 6( a ) is a cross-section showing the construction of the first illumination device 112 .
  • Illumination light supplied by a light source 55 is transferred to the first illumination device 112 via a light guide 113 a provided in the universal cable 53 .
  • Light emerging from the exit end of the light guide 113 a enters a single fiber rod lens 112 a .
  • the single fiber rod lens 112 a allows the light to emerge from the exit end thereof with a nearly uniform intensity.
  • Light passing through the linear opening in a slit glass 112 b illuminates an object surface via a projection lens 112 c .
  • the slit glass 112 b has a linear slit printed on one surface of a parallel flat glass and an anti-reflection coating deposited on the other surface.
  • a super-thin metal plate with a linear slit can be used in place of the slit glass 112 b .
  • the second illumination device 111 can have any construction that allows extensive illumination of an object. For example, a plano-convex lens with the flat surface on the object side can be provided immediately after the exit end of a light guide 113 b ( FIG. 7( a )), thereby allowing light emerging from the light guide 113 b to illuminate the object in a diffuse manner.
  • FIGS. 7( a ) and 7 ( b ) show an exemplary construction of the light source 55 .
  • FIG. 7( a ) is a cross-section containing the center lines of the light guides 113 a and 113 b of the optical system of the light source 55 shown in FIG. 5 as seen, for example, in a top view
  • FIG. 7( b ) is a composite cross-sectional view that includes the center line of one light guide of the optical system of the light source 55 as seen, for example, from one side.
  • Light that emerges from a discharge lamp 117 is collected on the entrance end of the light guide 113 a , 113 b by a relay optical system 116 and collection lenses 114 a , 114 b .
  • Collection lens 114 a supplies light to the first illumination device 112 and collection lens 114 b supplies light to the second illumination device 111 .
  • An optical path-switching mirror 115 is provided in the optical path of the relay optical system 116 . The mirror is rotated about a line through its intersection with the optical path of the relay optical system 116 in order to switch between the optical paths of the collection lenses 114 a and 114 b .
  • When the optical path of the collection lens 114 a is selected, light from the light source 117 is directed to the light guide 113 a so that the first illumination device 112 illuminates a relatively narrow area of an object.
  • When the optical path of the collection lens 114 b is selected, light from the light source 117 is directed to the light guide 113 b so that the second illumination device 111 extensively illuminates an object.
  • a glass rod is provided at the entrance end of each of the light guides 113 a , 113 b , thereby preventing the entrance ends of the light guides 113 a , 113 b from being damaged by the thermal energy of the collected light.
  • the image processing device 54 is electrically connected to the light source 55 and controls the switching between extensive illumination of an object and linear illumination of a small part of the object.
  • the optical path switching mirror 115 provided in the optical system of the light source 55 is controlled based on control signals transmitted from a control circuit built in the image processing device 54 .
  • When the second illumination device 111 extensively illuminates an object, the entire image of the object is obtained in the imaging area of the zero-order light on the image pickup surface of the solid-state image pickup element 4 . Concurrently, spectral images of the entire object are obtained in the imaging area of the −1st-order light on the image pickup surface.
  • When the illumination device emits white light, a color object image and respective object images for different wavelengths are obtained in a successively overlapping manner. It is difficult to analyze the spectral information using the latter images because the spectral information at each point of the object is overlapped. Therefore, in this embodiment, the first illumination device 112 serves to illuminate a relatively narrow area of the object.
  • the object surface is divided into segments in the imaging area of the −1st-order light on the image pickup surface of the solid-state image pickup element 4 , yielding separate spectral information.
  • FIG. 8( a ) is a cross-section containing the optical axis of the observation optical system that shows the basic construction of an image pickup apparatus 5
  • FIG. 8( b ) is a view in the direction of the optical axis that shows the imaging areas of −1st-order light and zero-order light on the image pickup surface of a solid-state image pickup element 4 .
  • the image pickup apparatus 5 of this embodiment is formed of an observation optical system 2 which includes a diffraction element 3 located in the optical path, and a solid-state image pickup element 4 .
  • the observation optical system 2 is formed of a lens 2 a on the object surface side of the observation optical system, a collimating lens 2 b for collimating a light flux from the lens 2 a , a diaphragm S and a diffraction element 3 provided in the collimated light flux, and an imaging lens 2 c for imaging the zero-order light that is transmitted straight through the diffraction element 3 and the −1st-order light that is diffracted by the diffraction element 3 .
  • the zero-order light and the −1st-order light are received onto the image pickup surface of the solid-state image pickup element 4 .
  • the diffraction element 3 is a transmission-type diffraction grating that provides a refraction function in addition to performing a diffraction function.
  • a light flux that enters the diffraction element 3 is separated into zero-order light, +1st-order light, and −1st-order light due to diffraction.
  • the zero-order light, the +1st-order light, and the −1st-order light emerge at different angles due to refraction.
  • the refraction directions can be controlled to adjust the imaging positions of the zero-order light, the +1st-order light, and the −1st-order light.
  • the solid-state image pickup element 4 may be positioned in a manner such that the center of the image pickup surface coincides with the optical axis of the observation optical system 2 and such that the image pickup surface receives the imaging light fluxes of the zero-order light and the −1st-order light that emerge from the diffraction element 3 .
  • the imaging areas of the zero-order light and −1st-order light do not overlap on the image pickup surface.
  • the refraction angle and direction of the diffraction element 3 can be pre-measured and stored as parameters to be used in calculations (in addition to the focal length F and distortion property of the imaging lens 2 c ) in order to identify the imaging positions of the zero-order light and the spectrum of the −1st-order light.
  • the image pickup apparatus 5 having the construction above in which the center of the image pickup surface coincides with the optical axis of the observation optical system 2 is preferably provided at the insertion end of an endoscope because it allows the insertion end of the endoscope to have a smaller outer diameter.
  • This embodiment is nearly the same as that of Embodiment 1 in terms of its construction and its efficacy.
  • FIG. 9( a ) is a cross-section containing the optical axis of the observation optical system that shows the basic construction of an image pickup apparatus 5
  • FIG. 9( b ) is a view in the direction of the optical axis that shows the imaging areas of plus first-order light (hereinafter +1st-order light) and zero-order light on the image pickup surfaces of the solid-state image pickup elements 4 b and 4 a , respectively.
  • the image pickup apparatus 5 of this embodiment is formed of an observation optical system 2 , which includes a diffraction element 3 located in the optical path of the observation optical system 2 , and two solid-state image pickup elements 4 a and 4 b .
  • the observation optical system 2 is formed of a lens 2 a on the object surface side of the observation optical system 2 , a lens 2 b for imaging a light flux from the lens 2 a on the image pickup surfaces of the solid-state image pickup elements 4 a and 4 b , and a diffraction element 3 that is provided at a specific angle in the imaging light flux.
  • the diffraction element 3 is a reflection-type diffraction grating that has a reflecting function in addition to having a diffraction function.
  • the solid-state image pickup elements 4 a and 4 b are provided at the image plane of the zero-order light and the +1st-order light, respectively, in such a manner that the centers of their image pickup surfaces each coincide with the optical axis of the observation optical system.
  • An image analysis circuit 101 (not shown in FIG. 9( a )) for analyzing the position of an obtained image and an arithmetic operation circuit 102 (not shown in FIG. 9( a )) for executing calculations required to analyze spectral information and to create an image may be provided.
  • the spectral properties of the obtained image may be acquired on a real time basis.
  • the focal length F and distortion properties of the lenses 2 a and 2 b may be previously measured and stored in a memory 103 (not shown in FIG. 9( a )).
  • An arithmetic operation circuit may use these as parameters in calculations in order to identify the imaging positions of the zero-order light and the spectrum of the +1st-order light on the image pickup surface.
  • These circuits have the same construction as those in Embodiment 1 and, therefore, are not further illustrated.
  • the image pickup apparatus of this embodiment wherein solid-state image pickup elements 4 a and 4 b are provided at the image plane of the zero-order light and the +1st-order light, respectively, allows for larger imaging areas on the image pickup surface as compared with the case where a single solid-state image pickup element is used to pick up images. Therefore, spectral information can be analyzed with higher resolution for the same object image range.
  • FIG. 10( a ) is a cross-section containing the optical axis of the observation optical system 2 showing the basic construction of the observation device 201 of this embodiment.
  • FIG. 10( b ) is an illustration showing the imaging areas of −1st-order light and zero-order light on the image pickup surface of a solid-state image pickup element 4 when the object surface is illuminated by the second illumination device 7 .
  • FIG. 10( c ) is an illustration showing the imaging areas of −1st-order light and zero-order light on the image pickup surface of a solid-state image pickup element 4 when the object surface is illuminated by the first illumination device 6 .
  • the object surface is shown in a perspective view for greater clarity of explanation.
  • the observation system 201 of this embodiment is formed of an image pickup apparatus 5 that has the same construction as in Embodiment 1, a first illumination device 6 for illuminating a narrow linear region of an object surface, and a second illumination device 7 for extensively illuminating the object surface beyond the field of view of the image pickup apparatus 5 .
  • the first illumination device 6 is formed of a set of LEDs 6 a , a collimating lens 6 b for collimating light from the LEDs, a diffusing element 6 c located in the collimated light flux for diffusing light in the direction x, and a cylindrical lens 6 d for collecting in one direction the light diffused by the diffusing element 6 c on the surface of the object.
  • an LED corresponding to a point to be measured on the object surface is energized.
  • Light from the LED passes through the collimating lens 6 b , is diffused in the direction x by the diffusing element 6 c , and forms a light beam that illuminates a linear region L for spectroscopic measurement on the object surface via the cylindrical lens 6 d .
  • Each LED of the LED set 6 a is individually energized.
  • FIGS. 11( a ) and 11 ( b ) are illustrations showing another possible construction of the first illumination device 6 , with FIG. 11( a ) being a partial cross-section containing the optical axis of the first illumination device 6 , and FIG. 11( b ) being a view along the optical axis showing the arrangement of a set of LEDs 6 e .
  • the object surface is shown in a perspective view for greater clarity of explanation.
  • the first illumination device 6 comprises a set of LEDs 6 e formed of multiple LEDs arranged in a two dimensional plane x-y, a collimating lens 6 f , and a cylindrical lens 6 g .
  • the object surface can be scanned in the direction y by illuminating a linear region with the optical axis of the first illumination device 6 being fixed relative to the object surface.
  • It is desirable that a diffusion agent for mixing the color lights emitted from the LEDs, an agent that produces fluorescence using the light emitted from the LEDs as an excitation source to produce white light, and an intensity correction filter for homogenizing the light intensity among wavelengths in a desired range be provided in front of the light exit surface of each LED.
  • the second illumination device 7 is formed of a light guide 7 a that guides light from a light source (not illustrated) to the insertion end of an endoscope, and a plano-concave lens 7 b that diffuses light emitted from the light guide 7 a and illuminates the object surface so that the second illumination device 7 extensively illuminates the object surface beyond the field of view of the image pickup apparatus 5 .
  • illumination devices are not limited to those specifically described above. Any illumination device can be used in combination with the image pickup apparatus 5 provided that it illuminates a linear region of the object surface and/or that it extensively illuminates the object surface.
  • An image processing device and a display device can be used to merge and display images obtained using different illumination devices, as shown in FIGS. 12( a )- 12 ( c ) by way of example.
  • a zero-order light image obtained using the second illumination device 7 is displayed along with lines indicating where the illuminated region L (shown in FIG. 11( a )) is on the object surface, calculated on the basis of the zero-order light image obtained using the second illumination device 7 .
  • spectral images of the illuminated regions obtained using the first illumination device 6 are shown.
  • the arrangement of these images allows the observer to acquire the morphological information of the object surface and the positional information of the linear illumination region from the images displayed in the lower area, and to acquire the spectral information of the object surface from the images displayed in the upper area.
  • the first illumination device 6 scans the illuminated region L over the object surface and individually obtained spectral images are stored in an image processing device.
  • the image of a specific illuminated region L can be retrieved and displayed.
  • the spectral images over the entire object surface can be individually obtained for different wavelength bands as shown in FIGS. 12( a )- 12 ( c ).
  • the observer can screen out unnecessary spectral images while viewing these images.
  • the image processing device 54 can serve to extract image information for a specific wavelength among the spectral images of the illuminated regions L and merge extracted image information in order to display an image of the entire object surface.
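  • As an illustrative sketch of this merging step (the data layout, one spectrum array per illuminated line position together with a column-to-wavelength calibration, is an assumption for illustration), a single-wavelength image of the whole surface can be assembled from the stored line spectra:

```python
# Sketch: rebuild an image of the entire object surface for one selected wavelength from
# the spectral images recorded for each linearly illuminated region L. Data are synthetic.
import numpy as np

def single_wavelength_image(line_spectra: list[np.ndarray],
                            wavelengths_nm: np.ndarray,
                            target_nm: float) -> np.ndarray:
    """For every scanned line, keep the intensity column nearest target_nm and stack them."""
    col = int(np.argmin(np.abs(wavelengths_nm - target_nm)))
    return np.stack([frame[:, col] for frame in line_spectra], axis=0)

if __name__ == "__main__":
    wl = np.linspace(400, 700, 61)                             # hypothetical wavelength axis
    scans = [np.random.rand(128, wl.size) for _ in range(50)]  # 50 line positions x 128 points
    img_550 = single_wavelength_image(scans, wl, 550.0)
    print("reconstructed image shape:", img_550.shape)         # (lines, points along line)
```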
  • FIG. 13 shows object surface images obtained based on individual image information that is displayed on a display screen divided into four sections.
  • the display screen A displays a zero-order light image obtained using the illumination device 7 .
  • the illumination device 7 emits white light
  • a color image is displayed on the display screen A.
  • the display screen B displays an image based on image information for a specific wavelength.
  • the display screen C displays an image based on image information for another specific wavelength.
  • the display screen D displays a differential image of the images displayed on the display screens B and C.
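  • As a brief sketch of the differential display (the scaling to the display range is an assumption), screen D can be computed from the two single-wavelength images shown on screens B and C:

```python
# Sketch: signed difference of the images shown on screens B and C, rescaled for display.
import numpy as np

def differential_image(img_b: np.ndarray, img_c: np.ndarray) -> np.ndarray:
    """Return (B - C) mapped to the 0-255 display range, with zero difference at mid-gray."""
    diff = img_b.astype(float) - img_c.astype(float)
    span = float(np.abs(diff).max()) or 1.0
    return ((diff / span + 1.0) * 127.5).astype(np.uint8)

if __name__ == "__main__":
    b = np.random.rand(50, 128)   # image at one specific wavelength (screen B)
    c = np.random.rand(50, 128)   # image at another specific wavelength (screen C)
    print(differential_image(b, c).dtype)
```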
  • FIGS. 14( a )- 14 ( c ) are illustrations to explain the construction of an observation system according to Embodiment 6 for measuring spectral reflectance in which the image pickup apparatus of the present invention is mounted, with FIG. 14( a ) schematically showing the various components of the observation system and its construction, with FIG. 14( b ) showing an illumination light flux in the illumination device 16 as seen in the direction z, and with FIG. 14( c ) showing an illumination light flux in the illumination device 16 as seen in the direction x.
  • FIGS. 15( a ) and 15 ( b ) are illustrations showing information obtained by a light receiving element 14 of the observation system after the information has been processed by an image processing device (not shown), and displayed on display screens.
  • the observation system for measuring spectral reflectance in this embodiment is formed of an objective lens 15 , a diffraction element 13 , an imaging lens 12 , a light receiving element 14 , an illumination device 16 for illuminating a linear region of an object surface with white light, an illumination device 17 for illuminating the object surface with infrared light, and half mirrors 18 and 19 .
  • the illumination device 16 is formed of a white light source 16 a , a collimating lens 16 b , and a diffusing element 16 c . As shown in FIGS. 14( b ) and 14 ( c ), the diffusing element 16 c simply transmits light in the direction z and diffuses light in the direction y.
  • the illumination device 17 is formed of an infrared light source 17 a , a diffusing element 17 b , and a collimating lens 17 c .
  • the diffusing element 17 b diffuses incident light evenly in the directions x and y.
  • the observation system for measuring spectral reflectance of this embodiment further includes a stage 20 on which an object is placed and a stage driving mechanism (not shown) for moving the stage in the direction x in the figure, as shown by the double-headed arrows.
  • the illumination device 17 first illuminates the object surface with light in the infrared range. Infrared rays emitted from the infrared light source 17 a enter the half mirror 19 as a collimated light flux via the diffusing element 17 b and the collimating lens 17 c . After being reflected by the half mirror 19 and transmitted through the half mirror 18 , the infrared light illuminates the object surface on the stage 20 via the objective lens 15 . The infrared light that is reflected by the object surface enters the half mirror 18 as a collimated light flux via the objective lens 15 .
  • the infrared light that is transmitted through the half mirrors 18 and 19 is separated into zero-order light and −1st-order light and imaged on the light receiving element 14 via the diffraction element 13 and imaging lens 12 . In this manner, an infrared image of the object surface is obtained in the imaging area of the zero-order light on the light receiving surface of the light receiving element 14 .
  • the illumination device 16 illuminates a relatively narrow area of the object surface with white light.
  • White light emitted from the light source 16 a enters the diffusing element 16 c as a collimated light flux via the collimating lens 16 b .
  • the white light is diffused only in the direction y.
  • the white light passes through the objective lens 15 and illuminates a linear region of the object surface that is oriented in the direction y in the figure. Light reflected by the object surface enters the half mirror 18 as a collimated light flux via the objective lens 15 .
  • After being transmitted through the half mirrors 18 and 19 , the light is separated into zero-order light and −1st-order light and imaged onto the light receiving element 14 via the spectroscopic element 13 and imaging lens 12 .
  • a color image of the illuminated region of the object surface is obtained in the imaging area of the zero-order light on the light receiving surface of the light receiving element 14 and the spectrum of the object surface image is obtained in the imaging area of the −1st-order light on the light receiving surface of the light receiving element 14 .
  • the stage 20 on which an object surface is placed can be moved to shift the linearly illuminated region on the object surface, thereby obtaining and storing in the memory of an image processing device (not shown) reflection spectral images for the entire object surface.
  • An image processing device 54 and a display device 56 may be used to display infrared images obtained using the illumination device 17 and the spectral images obtained using the illumination device 16 side-by-side on one or more display screen(s).
  • the coordinates of the linearly illuminated region on the object surface can be calculated based on the color image obtained using the illumination device 16 and a mark that indicates the illuminated region can be merged with, and appear on, the infrared image.
  • a display may be divided vertically into two parts as shown in FIG. 15( a ), so that a first display screen displays an infrared image of the object surface and so that a second display screen displays the spectral content of the visible image.
  • the region of the object that is linearly illuminated with light is marked on the infrared image that is displayed on the first display screen so that it is known from which part of the object surface the spectral images that are displayed on the second display screen are taken. Furthermore, a spectral intensity analysis that focuses on the reflectance spectrum Q 1 for a small region P 1 within the marked region on the object surface can be provided.
  • FIG. 15( b ) shows a spectral intensity profile for the small region P 1 that may be displayed on the second display screen. In this way, a specific small region P 1 on the object surface is selected and the spectral intensity profile for the region is analyzed.
  • FIG. 16 is an illustration that schematically shows the construction of an illuminant observation system in which the image pickup apparatus according to Embodiment 7 of the present invention is mounted.
  • FIG. 17 is an illustration that shows information that is obtained by a light receiving element 14 of the illuminant observation system of FIG. 16 after the information has been processed by an image processing device (not shown), and displayed on a display device that includes, for example, a first display screen and a second display screen.
  • The illuminant observation system of FIG. 16 is formed of an objective lens 15 , a spectroscopic element 13 , an imaging lens 12 , a light receiving element 14 , an illumination device 17 ′, and a half mirror 19 .
  • An object is placed on a stage 20 ′ and moved by a stage driving mechanism (not shown) in the y direction.
  • The illumination device 17 ′ may be formed of a light source 17 a ′ that emits white light, a diffusing element 17 b , and a collimating lens 17 c .
  • The illumination device 17 ′ has the same construction as the illumination device 17 of Embodiment 6, except that the light source 17 a ′ emits visible light rather than infrared light.
  • The other optical elements in Embodiment 7 have the same construction as the like-numbered optical elements in the earlier embodiments.
  • The observation system of the present embodiment is for inspecting illumination light sources, such as LEDs.
  • The stage 20 ′ is elongated in the direction y in the figure. Multiple substrates having red, green, and blue LEDs are arranged on the stage 20 ′ in the direction y.
  • The LEDs are energized for inspection.
  • The illumination device 17 ′ intermittently emits white light.
  • The illumination device 17 ′ can be a flash lamp. While the illumination device 17 ′ is energized, light from the illumination device 17 ′ enters the half mirror 19 as a collimated light flux via the diffusing element 17 b and the collimating lens 17 c . After being reflected by the half mirror 19 , the light extensively illuminates the object surface on the stage 20 ′ via the objective lens 15 .
  • Light reflected by the object surface enters the half mirror 19 as a collimated light flux via the objective lens 15 .
  • The light transmitted through the half mirror 19 is separated into zero-order light and −1st-order light and is imaged on the light receiving element 14 via the spectroscopic element 13 and the imaging lens 12 . In this manner, a color object surface image is obtained in the imaging area of the zero-order light on the light receiving surface of the light receiving element 14 .
  • While the illumination device 17 ′ is not energized, light emitted by the energized LEDs enters the half mirror 19 as a collimated light flux via the objective lens 15 , and the light transmitted through the half mirror 19 is likewise separated into zero-order light and −1st-order light and is imaged onto the light receiving element 14 via the spectroscopic element 13 and the imaging lens 12 .
  • In this manner, spectral images for the LEDs are obtained in the imaging areas of the −1st-order light on the light receiving surface of the light receiving element 14 .
  • The stage 20 ′ may be moved in accordance with the illumination cycle of the illumination device 17 ′ so that the inspection of LEDs may be continuously performed. Images that are obtained may be stored in an image processing device (not shown in FIG. 16 ).
  • The image processing device and the display device may display, side-by-side on a display screen or screens: color object surface images obtained while the illumination device 17 ′ is energized; and spectral images obtained while the illumination device 17 ′ is not energized.
  • The display may be divided vertically into two parts.
  • A first display screen displays color object surface images and a second display screen displays spectral images of the LEDs.
  • The substrate on which the LEDs are mounted, as well as the LEDs themselves, may be inspected by viewing color images on the first display screen.
  • The LEDs can be checked for quality by viewing spectral images of the LEDs on the second display screen and comparing each LED output with reference LED spectral data.
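The comparison with reference LED spectral data described above can be pictured as a simple numerical check. The following Python/NumPy sketch is illustrative only: the function name, the peak normalization, and the tolerance criterion are assumptions and are not taken from the patent.

```python
import numpy as np

def led_passes(measured: np.ndarray, reference: np.ndarray,
               tolerance: float = 0.1) -> bool:
    """Compare one LED's measured spectral profile with reference LED spectral
    data, both sampled at the same wavelengths; the peak normalization and the
    tolerance are illustrative assumptions, not values from the patent."""
    m = measured / measured.max()
    r = reference / reference.max()
    return bool(np.max(np.abs(m - r)) <= tolerance)
```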

Abstract

An image pickup apparatus has a construction in which a diffraction element is provided in an observation optical system. Zero-order light that is transmitted straight through the diffraction element and one of the +1st-order diffracted light and the −1st-order diffracted light that is diffracted by the diffraction element are imaged onto an image pickup surface of an image pickup apparatus. The imaging areas of the zero-order light and one of the +1st-order diffracted light and the −1st-order diffracted light that is diffracted by the diffraction element do not overlap on the image pickup surface of the image pickup apparatus. With this construction, a small image pickup apparatus that provides a high-resolution spectral image and a color image of an object can be obtained.

Description

  • This application claims benefit under 35 U.S.C. §119 of JP 2004-154659, filed May 25, 2004, the contents of which are hereby incorporated by reference.
  • BACKGROUND OF THE INVENTION
  • The present invention relates to an image pickup apparatus for obtaining spectral images of an object. More particularly, it relates to an image pickup unit in a medical endoscope for obtaining spectral images of fluorescence emitted by living tissue, together with an analysis device for analyzing the obtained spectral images, for example by measuring fluorescent wavelengths for living tissue diagnosis. The invention is also concerned with an image pickup unit in an industrial inspection device for obtaining spectral images of luminous surfaces, such as those of LEDs, together with an analysis device for analyzing the obtained spectral images, for example by measuring spectroscopic properties of an object surface for quality control on a production line.
  • Devices in the prior art for measuring spectroscopic properties include those described in Japanese Laid Open Patent Application No. H02-104332 (which corresponds in subject matter to U.S. Pat. No. 5,078,150), in Japanese Laid Open Patent Application No. S63-271308, in U.S. Pat. No. 5,782,770, and in a document entitled “Spectral Camera” that was published on the Internet by DHT Corporation (found at: http://www.dht.cojp/products/spectral camera/spectral camera.html).
  • Japanese Laid Open Patent Application No. H02-104332 discloses a spectroscopic endoscope in which illumination light is separated into multiple wave bands in a time-division manner via a rotating filter provided in the light source while continuously illuminating an object, thereby obtaining spectral images separated in a time-division manner.
  • Japanese Laid Open Patent Application No. S63-271308 discloses an endoscope having a variable transmittance element in the observation optical system wherein the wavelengths transmitted by the variable transmittance element may be successively changed, thereby enabling spectral images to be obtained for different wave bands.
  • U.S. Pat. No. 5,782,770 discloses a detection device in which a ribbon-shaped beam is used to illuminate an object, and spectral content of the image of the object is then detected via a dispersion element.
  • The Internet publication entitled “Spectral Camera” mentioned above discloses a spectral camera having a slit opening, a spectroscope, and a two-dimensional CCD camera in which light from an object is received via the slit opening and the light is then separated according to wavelength, thereby enabling the spectrum of an object to be obtained.
  • In the endoscopes for obtaining spectral images of an object described in Japanese Laid Open Patent Application Nos. H02-104332 and S63-271308 mentioned above, only object information carried by light of specific wavelength components from the object is imaged. For example, in a medical endoscope, only light of blue components reflected by the living tissue can be imaged to clearly depict the capillary blood vessels in the surface layer of the living tissue. In the endoscope described in Japanese Laid Open Patent Application No. S63-271308, a variable transmittance element is provided in the observation optical system. Thus, fluorescent images emitted by the object can be selectively obtained. However, it is difficult to obtain narrowly separated wave bands using a rotating filter or a variable transmittance element. Therefore, the spectrum of the object cannot be extracted with high resolution from the obtained spectral images.
  • On the other hand, with the devices for detecting the spectrum of an object described in U.S. Pat. No. 5,782,770, and in the document entitled “Spectral Camera” mentioned above, high resolution spectra can be obtained using a spectroscopic element. However, these devices do not provide color object images, making it impossible to examine the object while obtaining spectral information or to identify from which part of the object the spectral information is obtained. In order to obtain color images of an object using these devices, an image pickup optical system for obtaining color images must be additionally provided, or a mechanism for retracting the spectroscopic element from the optical path of the observation optical system must be provided; neither alternative is desirable because it would increase the size of the device.
  • BRIEF SUMMARY OF THE INVENTION
  • The present invention provides a small image pickup apparatus that simultaneously allows for high-resolution spectral images and color images of an object to be observed.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The present invention will become more fully understood from the detailed description given below and the accompanying drawings, which are given by way of illustration only and thus are not limitative of the present invention, wherein:
  • FIGS. 1( a) and 1(b) show the construction of the image pickup apparatus of Embodiment 1, with FIG. 1( a) being a cross-section containing the optical axis of the observation optical system that shows the basic construction of an image pickup apparatus 5, and with FIG. 1( b) being a view along the optical axis that shows the imaging areas of minus first order light (hereinafter referred to as −1st-order light) and zero-order light on the image pickup surface of a solid-state image pickup element;
  • FIGS. 2( a) and 2(b) are illustrations to explain the relationship between the imaging position of zero-order light and the spectral position of −1st-order light of a diffraction grating used in the spectroscopic element of the present invention, with FIG. 2( a) showing the diffraction of the principal ray of an imaging light flux when the diffraction element is a transmission-type diffraction grating, and with FIG. 2( b) showing the different imaging positions of different wavelengths due to dispersion of the diffraction grating;
  • FIGS. 3( a)-3(c) are illustrations to schematically show the relationship between the imaging positions of the zero-order light and the −1st-order light on the image pickup surface of a solid-state image pickup element, with the incident light having an angle of incidence of +θI degrees in FIG. 3( a), 0 degrees in FIG. 3( b), and −θI degrees in FIG. 3( c);
  • FIGS. 4( a) and 4(b) are illustrations to explain the construction of the image pickup apparatus of Embodiment 2 of the present invention, with FIG. 4( a) being a cross-section containing the optical axis of the observation optical system that shows the basic construction of an image pickup apparatus 5, and with FIG. 4( b) being a view in the direction of the optical axis that shows the imaging areas of −1st-order light and zero-order light on the image pickup surface of a solid-state image pickup element 4;
  • FIG. 5 is an illustration showing the construction of an endoscope system in which the image pickup apparatus 5 is mounted;
  • FIGS. 6( a) and 6(b) are illustrations showing an exemplary construction of a first illumination device 112, with FIG. 6( a) being a cross-section containing the optical axis and showing a construction of the first illumination device 112, and with FIG. 6( b) being a perspective view showing an exemplary construction of a slit glass 112 b;
  • FIGS. 7( a) and 7(b) are illustrations showing an exemplary construction of the optical system of the light source (item 55, shown in FIG. 5), with FIG. 7( a) being a cross-section containing the center lines of the light guides 113 a and 113 b of the optical system of the light source as seen, for example, in a top view, and FIG. 7( b) being a composite cross-sectional view that includes the center line of one light guide of the optical system of the light source as seen, for example, from one side;
  • FIGS. 8( a) and 8(b) are illustrations to explain the construction of the image pickup apparatus of Embodiment 3 of the present invention, with FIG. 8( a) being a cross-section containing the optical axis of the observation optical system that shows the basic construction of an image pickup apparatus 5, and FIG. 8( b) being a view in the direction of the optical axis that shows the imaging areas of −1st-order light and zero-order light on the image pickup surface of a solid-state image pickup element 4;
  • FIGS. 9( a) and 9(b) are illustrations to explain the construction of the image pickup apparatus of Embodiment 4 of the present invention, with FIG. 9( a) being a cross-section containing the optical axis of the observation optical system that shows the basic construction of an image pickup apparatus 5, and FIG. 9( b) being a view in the direction of the optical axis that shows the imaging areas of plus first-order light (hereinafter +1st-order light) and zero-order light on the image pickup surfaces of the solid-state image pickup elements 4 b and 4 a, respectively;
  • FIGS. 10( a)-10(c) are illustrations to explain the construction of the observation system 201 of Embodiment 5 of the present invention, with FIG. 10( a) being a cross-section containing the optical axis of the observation optical system 2, with FIG. 10( b) being a view in the direction of the optical axis that shows the imaging areas of −1st-order light and zero-order light on the image pickup surface of a solid-state image pickup element 4 when a second illumination device 7 extensively illuminates the object surface, and with FIG. 10( c) being a view in the direction of the optical axis that shows the imaging areas of −1st-order light and the zero-order light on the image pickup surface of a solid-state image pickup element 4 when a first illumination device 6 illuminates a narrow region of the object surface;
  • FIGS. 11( a) and 11( b) are illustrations showing another possible construction of the illumination device 6, with FIG. 11( a) being a partial cross-section containing the optical axis of the illumination device 6, and FIG. 11( b) being a view along the optical axis showing the arrangement of a set of LEDs 6 e;
  • FIGS. 12( a)-12(c) are illustrations to schematically show an image display obtained using the observation system for measuring spectral reflectance according to Embodiment 5 of the present invention;
  • FIG. 13 is an illustration to schematically show another image display obtained using the observation system for measuring spectral reflectance according to Embodiment 5 of the present invention;
  • FIGS. 14( a)-14(c) are illustrations to explain the construction of an observation system for measuring spectral reflectance according to Embodiment 6 of the present invention, with FIG. 14( a) being a cross-section containing the optical axis of the objective lens 15 showing the basic construction of an observation system for measuring spectral reflectance, with FIG. 14( b) showing an illumination light flux of an illumination device 16 as viewed in the direction z, and with FIG. 14( c) showing an illumination light flux of the illumination device 16 as viewed in the direction x;
  • FIGS. 15( a) and 15(b) are illustrations to schematically show an image display obtained using the observation system for measuring spectral reflectance according to Embodiment 6 of the present invention;
  • FIG. 16 is an illustration showing the construction of an observation system for inspecting the quality of LEDs according to Embodiment 7 of the present invention; and
  • FIG. 17 is an illustration to schematically show an image display obtained using the observation system for inspecting the quality of LEDs according to Embodiment 7 of the present invention.
  • DETAILED DESCRIPTION
  • The image pickup apparatus of the present invention includes a diffraction element in the optical path of an observation optical system whereby zero-order light transmitted through the diffraction element and ±1st-order light diffracted by the diffraction element are imaged on the image pickup surface of an image pickup element, and the imaging areas of the zero-order light and ±1st-order light are not overlapped on the image pickup surface of the image pickup element. Therefore, a color image of an object can be observed within the area that receives the zero-order light and the spectrum of the object can be observed within the respective areas that receive the +1st-order light and the −1st-order light while preventing flare from occurring in these light-receiving areas.
  • As described above, a diffraction element is provided in the optical path of an observation optical system and both color image observation and spectrum detection of an object can be achieved. Therefore, for example, a small image pickup apparatus can be realized that can be provided at the insertion end of an endoscope. Further, morphological information of an object obtained from a color image, and spectral information of each part of the object can be analyzed and associated with each other.
  • Several embodiments of the present invention will now be described with reference to the drawings.
  • Embodiment 1
  • The construction of an image pickup apparatus according to Embodiment 1 of the present invention will now be described with reference to FIGS. 1( a) and 1(b). FIG. 1( a) is a cross-section containing the optical axis of the observation optical system that shows the basic construction of an image pickup apparatus 5, and FIG. 1( b) is a view along the optical axis that shows the imaging areas of −1st-order light and zero-order light on the image pickup surface of a solid-state image pickup element. The image pickup apparatus 5 of the present invention is formed of an observation optical system 2 that includes a diffraction element 3 located in the optical path, and a solid-state image pickup element 4. The observation optical system 2 includes a lens 2 a provided on the object surface side, and a collimating lens 2 b for collimating a light flux from the lens 2 a. In addition, the observation optical system 2 includes a diaphragm S and the diffraction element 3, each of which is located in the collimated light flux from the collimating lens 2 b, and a lens 2 c for imaging zero-order light that passes straight through the diffraction element 3 and −1st-order light that is diffracted by the diffraction element 3 onto off-axis positions on the image pickup surface of the solid-state image pickup element 4.
  • The diffraction element 3 is a transmission-type DOE (Diffractive Optical Element) such as a diffraction grating, or an HOE (Holographic Optical Element) such as a holographic film. Upon entering the diffraction element 3, a light flux is separated into zero-order light, +1st-order light, and −1st-order light. The 1st-order light emerges from the diffraction element 3 at equal (plus and minus) angles to the optical axis, with the specific angle amount depending upon the wavelength of the incident light.
  • The solid-state image pickup element 4 is provided in a manner such that the center of the image pickup surface is not aligned with the optical axis of the observation optical system 2. The zero-order light that passes straight through the diffraction element 3 and the −1st-order light that is diffracted by the diffraction element 3 are imaged on the image pickup surface by the lens 2 c. As shown in FIG. 1( b), the imaging areas of the zero-order light and −1st-order light do not overlap on the image pickup surface.
  • The spectrum of the −1st-order light from an object can be obtained by using the positional relationship between the imaging position of the −1st-order light and that of the zero-order light on the image pickup surface of the solid-state image pickup element 4. Here, it is assumed that a light beam emitted from a point on the object surface of an object is imaged on the image pickup surface of the solid-state image pickup element 4 via the observation optical system. The relationship between the imaging position of the zero-order light and the spectrum of the −1st-order light on the image pickup surface of the solid-state image pickup element 4 will now be explained with reference to FIGS. 2( a) through 3(c).
  • FIG. 2( a) is an illustration showing the diffraction of the principal ray of a light flux when the diffraction element 3 is a transmission-type diffraction grating. Assuming the incident angle of a wavelength λ relative to the optical axis of the diffraction grating is θI and the diffraction angle is θI′ as shown in FIG. 2( a), the following Equation (1) is obtained:

  • sin θI−sin θI′=N·λ/d  Equation (1)
  • where
  • N is the order of diffraction (here, N=−1), and
  • d is the diffraction grating pitch.
  • It is assumed that the diffraction element 3 is provided in the collimated light flux and the imaging lens 2 c is provided behind the diffraction element 3, as in this embodiment. FIG. 2( b) schematically shows the different imaging positions for different wavelengths as a result of the dispersion of the diffraction grating. Provided that the imaging lens 2 c is a lens having distortion of a sin θ type, the following Equations (2) and (3) are obtained for a diffraction angle θλ for each wavelength of the principal ray entering the diffraction grating at a right angle:

  • −sin θλ=N·λ/d  Equation (2)

  • Hλ = −F·sin θλ = F·N·λ/d  Equation (3)
  • where
  • Hλ is the image height on the imaging surface of the diffracted light having a wavelength λ,
  • F is the focal length of the imaging lens 2 c, and
  • N and d are as previously defined.
  • As the incident angle θI of the principal ray of an imaging light flux entering the diffraction grating is changed, the imaging positions of the zero-order light and the −1st-order light are accordingly shifted. Here, it is assumed for convenience that the diffraction grating causes dispersion, but does not cause refraction or imaging. Such a diffraction grating allows the zero-order light to pass straight through the diffraction grating so that the exit angle is the same as the incident angle.
  • FIGS. 3( a)-3(c) show the positional relationship between the imaging positions of the zero-order light and the −1st-order diffracted light on the image pickup surface of the solid-state image pickup element 4 for three different incidence angles of incident light onto the diffraction grating. In FIGS. 3( a)-3(c), the position on the image pickup surface that corresponds to the optical axis of the imaging optical system is marked with an “x” in each figure. In FIG. 3( a), the image position on the image pickup surface labeled “zero-order light” is the image position of light from an object point located off the optical axis such that a principal ray enters the diffraction grating at the angle +θI (see FIG. 2( a)), and this position is at a distance H0 to the right of the optical axis “x” position of the imaging optical system. In this case, the position of the −1st-order diffracted light is at a distance Hλ from the optical axis that is relatively far from the optical axis. FIG. 3( b) illustrates the situation where the angle of incidence of the incident light is such that the zero-order diffracted light travels along the optical axis and thus the image of the zero-order light lies at the position “x”. Zero-order light from the point on the object at zero image height, i.e., the point on the object that corresponds to the optical axis of the imaging optical system, passes straight through the diffraction grating and thus is imaged at the point “S” on the image pickup surface. In this case, the position of the −1st-order diffracted light is at a distance Hλ from the optical axis that is a medium distance from the optical axis. FIG. 3( c) illustrates the situation where the angle of incidence of the incident light is such that a principal ray enters the diffraction grating at the angle −θI, and this light is imaged onto the image pickup surface at a position to the left of the optical axis at a distance H0. In this case, the position of the −1st-order diffracted light is at a distance Hλ from the optical axis that is relatively near the optical axis.
  • The relationship between the imaging position of the zero-order light and the position of the spectrum of the −1st-order light is given by the following Equation (4):

  • sin θI − sin θI′λ = N·λ/d  Equation (4)
  • where
  • θI is the incident angle of light having a wavelength λ relative to the optical axis of the diffraction element,
  • θI′λ is the diffraction angle of light having a wavelength λ relative to the optical axis of the diffraction element, and
  • N and d are as previously defined.
  • When the imaging lens 2 c behind the diffraction grating is a lens having distortion of sin θ type, the image height Hλ on the image pickup surface of the −1st-order light having a wavelength λ is obtained using the following Equation (5):

  • Hλ = −F·sin θI′λ  Equation (5)
  • where
  • Hλ, F, and θI′λ are as previously defined.
  • On the other hand, the image height H0 on the image pickup surface of the zero-order light is obtained using the following Equation (6):

  • H0 = −F·sin θI  Equation (6)
  • where
  • F and θI are as previously defined.
  • The distance between the imaging positions of the zero-order light and the −1st-order light having a wavelength λ is given by the following Equation (7):
  • Hλ − H0 = −F·sin θI′λ − (−F·sin θI) = F·(sin θI − sin θI′λ) = F·N·λ/d  Equation (7)
  • It is understood from Equation (7) that the distance between the imaging positions of the zero-order light and the −1st-order light having a wavelength λ is constant regardless of the incident angle of the principal ray to the diffraction grating. Therefore, detecting the imaging position of the zero-order light leads to identifying the imaging position of the spectrum of the −1st-order light. In this embodiment, the imaging lens 2 c behind the diffraction grating is a lens having distortion that varies according to sin θ. Practically speaking, a similar calculation may be applied to a lens having any type of distortion provided that its aberration properties are previously measured and known.
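As a numerical illustration of Equations (4)-(7), the short Python/NumPy sketch below evaluates the zero-order and −1st-order image heights for the three incidence angles of FIGS. 3(a)-3(c), assuming the sin θ-distortion imaging lens of this embodiment; the values of F, d, and λ are illustrative only.

```python
import numpy as np

# Illustrative values only (not taken from the patent).
F = 10.0e-3      # focal length of imaging lens 2c [m]
d = 1.0e-6       # diffraction grating pitch [m]
N = -1           # order of diffraction used for the spectrum
lam = 550e-9     # wavelength [m]

for theta_deg in (+10.0, 0.0, -10.0):                  # cases of FIGS. 3(a)-3(c)
    theta_I = np.radians(theta_deg)                    # incident angle onto the grating
    H0 = -F * np.sin(theta_I)                          # Equation (6)
    sin_out = np.sin(theta_I) - N * lam / d            # Equation (4) solved for sin(theta_I'_lambda)
    H_lam = -F * sin_out                               # Equation (5)
    print(f"theta_I = {theta_deg:+5.1f} deg:  H_lam - H0 = {H_lam - H0:+.4e} m")

print("F*N*lam/d =", F * N * lam / d)                  # same value for every incidence angle
```

Running the loop prints the same offset (here −5.5 mm) for all three angles, which is the content of Equation (7).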
  • The imaging position of the spectrum of the −1st-order light is identified as follows.
  • (1) The image height H0 of the zero-order light (i.e., the imaging position of the zero-order light) is measured based on an image acquired by the solid-state image pickup element 4;
  • (2) The incident angle θI onto the diffraction grating of the principal ray is calculated using the image height H0 of the zero-order light, the previously measured focal length F, and distortion property of the imaging lens 2 c;
  • (3) The diffraction angle θI′λ of the −1st-order light having a wavelength λ is calculated using the incident angle θI (as determined in step (2) above), the diffraction grating pitch d, the order of diffraction N, and the wavelength λ; and
  • (4) The image height Hλ of the −1st-order light having a wavelength λ is calculated using the diffraction angle θI′λ (as determined in step (3) above), the previously measured focal length F, and the distortion properties of the imaging lens 2 c.
  • Based on the imaging position of the spectrum of the −1st-order light as described above and the intensity of image pickup signals at each position, analysis of the intensity property of the spectrum of the obtained image may be performed.
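Steps (1) through (4) amount to a short calculation. The sketch below (Python/NumPy) is one hedged way of writing it for the sin θ-distortion lens assumed in this embodiment; the function name and the numeric values are illustrative, and step (1), the measurement of H0 from the zero-order image, is assumed to have been done already.

```python
import numpy as np

def spectrum_image_height(H0: float, wavelength: float,
                          F: float, d: float, N: int = -1) -> float:
    """Image height of the -1st-order light of the given wavelength, computed
    from the measured zero-order image height H0 (step (1) assumed done)."""
    theta_I = np.arcsin(-H0 / F)                            # step (2): invert Equation (6)
    sin_theta_out = np.sin(theta_I) - N * wavelength / d    # step (3): Equation (4)
    return -F * sin_theta_out                               # step (4): Equation (5)

# Example with illustrative numbers: the 550 nm component of the -1st-order
# light sits at a fixed offset F*N*lambda/d from the zero-order image.
H0 = 1.5e-3
H_lam = spectrum_image_height(H0, wavelength=550e-9, F=10e-3, d=1e-6)
print(H_lam - H0)    # about -5.5e-3 m, independent of H0, as in Equation (7)
```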
  • In practice, a coordinate system is assumed on the image pickup surface of the solid-state image pickup element 4 with its origin at the intersection with the optical axis of the observation optical system. Referring again to FIG. 1( a), the image pickup apparatus 5 is provided with an image analysis circuit 101 for analyzing the position of an obtained image and an arithmetic operation circuit 102 for executing the calculations above to acquire the spectral properties of the obtained image on a real time basis.
  • When holographic techniques are applied to the diffraction element, a diffraction grating that performs imaging in addition to providing a diffraction function can be provided. With the use of such a diffraction grating, the imaging lens 2 c behind the diffraction grating can be eliminated. The focal length F and the distortion property of the diffraction grating may be measured and stored in a memory 103 as a calculation parameter. An arithmetic operation circuit may perform the calculations above by using the stored parameter, thereby enabling the arithmetic operations to be repeatedly executed.
  • The image pickup apparatus 5 of the present invention is provided with an illumination device. It is desired that the illumination device have a function to extensively illuminate an object and also functions to spotlight a small part of the object.
  • When the illumination device extensively illuminates an object, the entire image of the object is obtained in the imaging area of the zero-order light on the image pickup surface of the solid-state image pickup element 4. Concurrently, spectral images of the entire object are obtained in the imaging area of the −1st-order light on the image pickup surface. When the illumination device emits white light, a color object image and respective object images for different wavelengths are obtained in a successively overlapping manner. In the latter images, the spectral information for each point of the object is superimposed. Therefore, it is difficult to analyze the spectral information based on the calculations described above.
  • Hence, the illumination device in this embodiment has the function of spotlighting a small portion of the object, thereby enabling the spectral information for each point of the object to be separately obtained in the imaging area of the −1st-order light on the image pickup surface of the image pickup element 4.
  • For example, an object may be extensively illuminated in one instance and only a small part of the object may be illuminated (i.e., spotlighted) in another instance. Then, a color image obtained by extensively illuminating the object and a spectral image obtained by spotlighting a small part of the object may be merged and displayed by a display. In this manner, the spectrum at a specific point of the object can be observed while examining the appearance of the object. The spot image obtained in the imaging area of the zero-order light on the image pickup surface of the solid-state image pickup element 4 during the spotlighting can be analyzed according to its coordinates. Then, the color object image obtained by extensively illuminating the object can be marked in the position where the spot image is taken, thereby enabling the positions on the object where the spectral information of the object comes from to be known.
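Marking the spotlighted position on the color image obtained under extensive illumination can be sketched as ordinary image processing on the two stored frames. The fragment below (Python/NumPy) is an illustration only; the array layout (8-bit RGB), the thresholding used to locate the spot centroid, and the rectangular marker are assumptions, not details given in the patent.

```python
import numpy as np

def mark_spot(color_image: np.ndarray, spot_image: np.ndarray,
              threshold: float = 0.5, half_size: int = 5) -> np.ndarray:
    """Estimate the spotlighted position from the zero-order spot image and
    draw a red rectangular marker at the same coordinates on the color image
    (assumed to be an 8-bit H x W x 3 array of the same imaging area)."""
    mask = spot_image > threshold * spot_image.max()
    ys, xs = np.nonzero(mask)
    if len(xs) == 0:
        return color_image                        # no spot detected
    cy, cx = int(ys.mean()), int(xs.mean())       # centroid of the spot image
    marked = color_image.copy()
    y0, y1 = max(cy - half_size, 0), min(cy + half_size, marked.shape[0] - 1)
    x0, x1 = max(cx - half_size, 0), min(cx + half_size, marked.shape[1] - 1)
    marked[y0:y1 + 1, [x0, x1]] = [255, 0, 0]     # vertical edges of the marker
    marked[[y0, y1], x0:x1 + 1] = [255, 0, 0]     # horizontal edges of the marker
    return marked
```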
  • In this embodiment, the image pickup surface of the solid-state image pickup element 4 receives the zero-order light that is transmitted straight through the diffraction element 3 and the −1st-order light that is diffracted by the diffraction element 3. The +1st-order light diffracted by the diffraction element 3 and higher-order diffractive components are unnecessary for image pickup. Therefore, it is desirable to eliminate those light beams by providing optical members, in combination, that reflect or absorb the light beams between the diffraction element 3 and the image pickup surface. For example, a light absorbing member 104 can be provided at the imaging area of +1st-order light along with the solid-state image pickup element 4, thereby eliminating the +1st-order light from being detected by the solid-state image pickup element 4.
  • Embodiment 2
  • The construction of an image pickup apparatus according to Embodiment 2 of the present invention will now be described with reference to FIGS. 4( a) and 4(b). FIG. 4( a) is a cross-section containing the optical axis of the observation optical system showing the basic construction of the image pickup apparatus 5. FIG. 4( b) is an illustration showing the imaging areas of the −1st-order light and zero-order light on the image pickup surface of a solid-state image pickup element 4, as viewed from a position on the optical axis facing the image pickup surface.
  • The image pickup apparatus 5 of Embodiment 2 of the present invention is formed of an observation optical system 2 which may include a diffraction element 3 located in the optical path, and a solid-state image pickup element 4. The observation optical system 2 is formed of a lens 2 a located on the object surface side of the observation optical system 2, a collimating lens 2 b for collimating a light beam from the lens 2 a, and a diaphragm S and the diffraction element 3 located within the collimated light flux.
  • The diffraction element 3 is a transmission-type diffraction grating that provides an imaging function in addition to providing a diffraction function. Upon entering the diffraction element 3, a light flux is separated into zero-order light and 1st-order light due to diffraction. The ±1st-order light emerges from the diffraction element 3 at equal but opposite angles to the optical axis, with the angle amount depending on the wavelength of the light, and the zero-order light and the −1st-order light are then incident onto the image pickup surface of the image pickup element 4.
  • The solid-state image pickup element 4 is positioned in a manner such that the center of the image pickup surface is not aligned with the optical axis of the observation optical system 2. It thereby receives the light fluxes of the zero-order light and −1st-order light from the diffraction element 3. As shown in FIG. 4( b), the imaging areas of the zero-order light and −1st-order light do not overlap on the image pickup surface.
  • With this construction, the image pickup apparatus 5 can be downsized so that it can be mounted at, for example, the insertion end of an endoscope. FIG. 5 shows an example of an endoscope in which the image pickup apparatus 5 is mounted at the insertion end of the endoscope. The image pickup apparatus 5 is provided at the leading end 51 with a first illumination device 112 for illuminating a relatively narrow area of an object via a slit opening, and a second illumination device 111 for extensively illuminating an object. Image signals obtained by the image pickup apparatus 5 are transferred to an image processing device 54 via a universal cable 53 extended from an operation part 52 of the endoscope. The image processing device 54 includes an image analysis circuit 101 (not shown in FIG. 5), an arithmetic operation circuit 102 (not shown in FIG. 5), and a memory as described above to analyze and merge obtained images. Images that are processed by the image processing device 54 may be displayed by a display device 56.
  • FIGS. 6( a) and 6(b) show an exemplary construction of the first illumination device 112. FIG. 6( a) is a cross-section showing the construction of the first illumination device 112. Illumination light supplied by a light source 55 is transferred to the first illumination device 112 via a light guide 113 a provided in the universal cable 53. Light emerging from the exit end of the light guide 113 a enters a single fiber rod lens 112 a. The single fiber rod lens 112 a allows the light to emerge from the exit end thereof with a nearly uniform intensity. Light passing through the linear opening in a slit glass 112 b illuminates an object surface via a projection lens 112 c. FIG. 6( b) shows an exemplary construction of the slit glass 112 b. The slit glass 112 b has a linear slit printed on one surface of a parallel flat glass and an anti-reflection coating deposited on the other surface. A super-thin metal plate with a linear slit can be used in place of the slit glass 112 b. The second illumination device 111 can have any construction that allows extensive illumination of an object. For example, a plano-convex lens with the flat surface on the object side can be provided immediately after the exit end of a light guide 113 b (FIG. 7( a)), thereby allowing light emerging from the light guide 113 b to illuminate the object in a diffuse manner.
  • FIGS. 7( a) and 7(b) show an exemplary construction of the light source 55. FIG. 7( a) is a cross-section containing the center lines of the light guides 113 a and 113 b of the optical system of the light source 55 shown in FIG. 5 as seen, for example, in a top view, and FIG. 7( b) is a composite cross-sectional view that includes the center line of one light guide of the optical system of the light source 55 as seen, for example, from one side. Light that emerges from a discharge lamp 117 is collected on the entrance end of the light guide 113 a, 113 b by a relay optical system 116 and collection lenses 114 a, 114 b. Collection lens 114 a supplies light to the first illumination device 112 and collection lens 114 b supplies light to the second illumination device 111. An optical path-switching mirror 115 is provided in the optical path of the relay optical system 116. The mirror is rotated about a line including the intersection with the relay optical system 116 in order to switch between the optical paths of the collection lenses 114 a and 114 b. When the optical path of the collection lens 114 a is selected, light from the light source 117 is directed to the light guide 113 a so that the first illumination device 112 illuminates a relatively narrow area of an object. When the optical path of the collection lens 114 b is selected, light from the light source 117 is directed to the light guide 113 b so that the second illumination device 111 extensively illuminates an object. A glass rod is provided at the entrance end of each of the light guides 113 a, 113 b, thereby preventing the entrance ends of the light guides 113 a, 113 b from being damaged by the thermal energy of the collected light.
  • The image processing device 54 is electrically connected to the light source 55 and controls the switching between extensive illumination of an object and linear illumination of a small part of the object. The optical path switching mirror 115 provided in the optical system of the light source 55 is controlled based on control signals transmitted from a control circuit built in the image processing device 54.
  • When the second illumination device 111 extensively illuminates an object, the entire image of the object is obtained in the imaging area of the zero-order light on the image pickup surface of the solid-state image pickup element 4. Concurrently, spectral images of the entire object are obtained in the imaging area of the −1st-order light on the image pickup surface. When the illumination device emits white light, a color object image and respective object images for different wavelengths are obtained in a successively overlapping manner. It is difficult to analyze the spectral information using the latter images because the spectral information at each point of the object is overlapped. Therefore, in this embodiment, the first illumination device 112 serves to illuminate a relatively narrow area of the object. Thus, the object surface is divided into segments in the imaging area of the −1st-order light on the image pickup surface of the solid-state image pickup element 4, yielding separate spectral information.
  • Embodiment 3
  • The construction of an image pickup apparatus according to Embodiment 3 of the present invention will now be described with reference to FIGS. 8( a) and 8(b). FIG. 8( a) is a cross-section containing the optical axis of the observation optical system that shows the basic construction of an image pickup apparatus 5, and FIG. 8( b) is a view in the direction of the optical axis that shows the imaging areas of −1st-order light and zero-order light on the image pickup surface of a solid-state image pickup element 4.
  • The image pickup apparatus 5 of this embodiment is formed of an observation optical system 2 which includes a diffraction element 3 located in the optical path, and a solid-state image pickup element 4. The observation optical system 2 is formed of a lens 2 a on the object surface side of the observation optical system, a collimating lens 2 b for collimating a light flux from the lens 2 a, a diaphragm S and a diffraction element 3 provided in the collimated light flux, and an imaging lens 2 c for imaging the zero-order light that is transmitted straight through the diffraction element 3 and the −1st-order light that is diffracted by the diffraction element 3. The zero-order light and the −1st-order light are received onto the image pickup surface of the solid-state image pickup element 4. The diffraction element 3 is a transmission-type diffraction grating that provides a refraction function in addition to performing a diffraction function. A light flux that enters the diffraction element 3 is separated into zero-order light, +1st-order light, and −1st-order light due to diffraction. The zero-order light, the +1st-order light, and the −1st-order light emerge at different angles due to refraction. The refraction directions can be controlled to adjust the imaging positions of the zero-order light, the +1st-order light, and the −1st-order light. For example, the solid-state image pickup element 4 may be positioned in a manner such that the center of the image pickup surface coincides with the optical axis of the observation optical system 2 and such that the image pickup surface receives the imaging light fluxes of the zero-order light and the −1st-order light that emerge from the diffraction element 3.
  • As shown in FIG. 8( b), the imaging areas of the zero-order light and −1st-order light do not overlap on the image pickup surface. In this case, the refraction angle and direction of the diffraction element 3 can be pre-measured and stored as parameters to be used in calculations (in addition to the focal length F and distortion property of the imaging lens 2 c) in order to identify the imaging positions of the zero-order light and the spectrum of the −1st-order light.
  • The image pickup apparatus 5 having the construction above in which the center of the image pickup surface coincides with the optical axis of the observation optical system 2 is preferably provided at the insertion end of an endoscope because it allows the insertion end of the endoscope to have a smaller outer diameter. This embodiment is nearly the same as that of Embodiment 1 in terms of its construction and its efficacy.
  • Embodiment 4
  • The construction of an image pickup apparatus according to Embodiment 4 of the present invention will now be described with reference to FIGS. 9( a) and 9(b). FIG. 9( a) is a cross-section containing the optical axis of the observation optical system that shows the basic construction of an image pickup apparatus 5, and FIG. 9( b) is a view in the direction of the optical axis that shows the imaging areas of plus first-order light (hereinafter +1st-order light) and zero-order light on the image pickup surfaces of the solid-state image pickup elements 4 b and 4 a, respectively.
  • The image pickup apparatus 5 of this embodiment is formed of an observation optical system 2, which includes a diffraction element 3 located in the optical path of the observation optical system 2, and two solid-state image pickup elements 4 a and 4 b. The observation optical system 2 is formed of a lens 2 a on the object surface side of the observation optical system 2, a lens 2 b for imaging a light flux from the lens 2 a on the image pickup surfaces of the solid-state image pickup elements 4 a and 4 b, and a diffraction element 3 that is provided at a specific angle in the imaging light flux. The diffraction element 3 is a reflection-type diffraction grating that has a reflecting function in addition to having a diffraction function. Upon being incident onto the diffraction element 3, a light flux is separated into zero-order light, +1st-order light, and −1st-order light upon being diffracted. The zero-order light, the +1st-order light, and the −1st-order light emerge at different angles due to diffraction. The solid-state image pickup elements 4 a and 4 b are provided at the image plane of the zero-order light and the +1st-order light, respectively, in such a manner that the centers of their image pickup surfaces each coincide with the optical axis of the observation optical system.
  • As before, coordinates having an origin on the image pickup surfaces of the solid-state image pickup elements 4 a and 4 b at the intersections with the optical axis of the observation optical system are assumed in the image pickup apparatus 5. An image analysis circuit 101 (not shown in FIG. 9( a)) for analyzing the position of an obtained image and an arithmetic operation circuit 102 (not shown in FIG. 9( a)) for executing calculations required to analyze spectral information and to create an image may be provided. In this manner, the spectral properties of the obtained image may be acquired on a real time basis. The focal length F and distortion properties of the lenses 2 a and 2 b may be previously measured and stored in a memory 103 (not shown in FIG. 9( a)). An arithmetic operation circuit may use these as parameters in calculations in order to identify the imaging positions of the zero-order light and the spectrum of the +1st-order light on the image pickup surface. These circuits have the same construction as those in Embodiment 1 and, therefore, are not further illustrated.
  • The image pickup apparatus of this embodiment, wherein solid-state image pickup elements 4 a and 4 b are provided at the image plane of the zero-order light and the +1st-order light, respectively, allows for larger imaging areas on the image pickup surface as compared with the case where a single solid-state image pickup element is used to pick up images. Therefore, spectral information can be analyzed with higher resolution for the same object image range.
  • Embodiment 5
  • The construction of an observation system 201 according to Embodiment 5 of the present invention will now be described with reference to FIGS. 10( a)-10(c). FIG. 10( a) is a cross-section containing the optical axis of the observation optical system 2 showing the basic construction of the observation system 201 of this embodiment. FIG. 10( b) is an illustration showing the imaging areas of −1st-order light and zero-order light on the image pickup surface of a solid-state image pickup element 4 when the object surface is illuminated by the second illumination device 7. FIG. 10( c) is an illustration showing the imaging areas of −1st-order light and zero-order light on the image pickup surface of a solid-state image pickup element 4 when the object surface is illuminated by the first illumination device 6. In FIG. 10( a), the object surface is shown in a perspective view for greater clarity of explanation.
  • The observation system 201 of this embodiment is formed of an image pickup apparatus 5 that has the same construction as in Embodiment 1, a first illumination device 6 for illuminating a narrow linear region of an object surface, and a second illumination device 7 for extensively illuminating the object surface beyond the field of view of the image pickup apparatus 5. The first illumination device 6 is formed of a set of LEDs that includes multiple LEDs 6 a-n (n=1, 2 . . . ) aligned in the direction y, a collimating lens 6 b for collimating light from the LEDs, a diffusing element 6 c located in the collimated light flux for diffusing light in the direction x, and a cylindrical lens 6 d for collecting in one direction the light diffused by the diffusing element 6 c on the surface of the object.
  • Among the LEDs 6 a-n (n=1, 2 . . . ) forming the LED set 6 a, an LED corresponding to a point to be measured on the object surface is energized. Light from the LED passes through the collimating lens 6 b, is diffused in the direction x by the diffusing element 6 c, and forms a light beam that illuminates a linear region L for spectroscopic measurement on the object surface via the cylindrical lens 6 d. Each LED of the LED set 6 a is individually energized.
  • FIGS. 11( a) and 11(b) are illustrations showing another possible construction of the first illumination device 6, with FIG. 11( a) being a partial cross-section containing the optical axis of the first illumination device 6, and FIG. 11( b) being a view along the optical axis showing the arrangement of a set of LEDs 6 e. In FIG. 11( a), the object surface is shown in a perspective view for greater clarity of explanation. In the figure, the first illumination device 6 comprises a set of LEDs 6 e formed of multiple LEDs arranged in a two dimensional plane x-y, a collimating lens 6 f, and a cylindrical lens 6 g. The LED set 6 e includes a group of multiple LEDs that are linearly arranged in the direction x and multiple LED groups are arranged in the direction y. Each group of LEDs 6 e-n (n=1, 2 . . . ) in the direction x is individually energized. Light emitted from the LED set 6 e is collimated by the collimating lens 6 f and enlarged and projected onto the object surface with the linear shape maintained by the cylindrical lens 6 g. Thus, the target region L for spectroscopic measurement on the object surface, which is a linear region, is illuminated by linearly shaped illumination light.
  • With the above construction, the object surface can be scanned in the direction y by illuminating a linear region with the optical axis of the first illumination device 6 being fixed relative to the object surface. It is preferred that a diffusion agent for mixing color lights emitted from the LEDs, an agent that produces fluorescence using the light emitted from the LEDs as an excitation source to produce white light, and an intensity correction filter for homogenizing the light intensity among wavelengths in a desired range be provided in front of the light exit surface of each LED.
  • The second illumination device 7 is formed of a light guide 7 a that guides light from a light source (not illustrated) to the insertion end of an endoscope, and a plano-concave lens 7 b that diffuses light emitted from the light guide 7 a and illuminates the object surface so that the second illumination device 7 extensively illuminates the object surface beyond the field of view of the image pickup apparatus 5.
  • The construction of the illumination devices is not limited to those specifically described above. Any illumination device can be used in combination with the image pickup apparatus 5 provided that it illuminates a linear region of the object surface and/or that it extensively illuminates the object surface.
  • As shown in FIG. 10( b), while the second illumination device 7 extensively illuminates the object, an image of the entire object is obtained in the imaging area of the zero-order light on the image pickup surface of the solid-state image pickup element 4 and, concurrently, spectral images of the entire object are obtained in the imaging area of the −1st-order light on the image pickup surface. On the other hand, as shown in FIG. 10( c), while the first illumination device 6 illuminates a relatively narrow, linear region of the object, an image of the illuminated object is obtained in the imaging area of the zero-order light on the image pickup surface of the solid-state image pickup element 4 and, concurrently, spectral images of the illuminated object part are obtained in the imaging area of the −1st-order light on the image pickup surface.
  • An image processing device and a display device can be used to merge and display images obtained using different illumination devices, as shown in FIGS. 12( a)-12(c) by way of example. In FIG. 12( a), a zero-order light image obtained using the second illumination device 7 is displayed along with lines indicating where the illuminated region L (shown in FIG. 11( a)) is on the object surface, calculated on the basis of the zero-order light image obtained using the second illumination device 7. Above the image formed by the zero-order light, spectral images of the illuminated regions obtained using the first illumination device 6 are shown. The arrangement of these images allows the observer to acquire the morphological information of the object surface and the positional information of the linear illumination region from the images displayed in the lower area, and to acquire the spectral information of the object surface from the images displayed in the upper area. The first illumination device 6 scans the illuminated region L over the object surface and individually obtained spectral images are stored in an image processing device. The image of a specific illuminated region L can be retrieved and displayed. With this construction, the spectral images over the entire object surface can be individually obtained for different wavelength bands as shown in FIGS. 12( a)-12(c). The observer can screen out unnecessary spectral images while viewing these images. Furthermore, the image processing device 54 can serve to extract image information for a specific wavelength among the spectral images of the illuminated regions L and merge extracted image information in order to display an image of the entire object surface.
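The merging step described above, in which image information for one specific wavelength is extracted from the stored line-by-line spectral images and reassembled into an image of the entire object surface, could look roughly like the following Python/NumPy sketch; the array layout and the names are assumptions for illustration, not taken from the patent.

```python
import numpy as np

def monochromatic_image(line_spectra: np.ndarray, wavelengths: np.ndarray,
                        target_wavelength: float) -> np.ndarray:
    """Merge stored line-scan spectra into one image at a single wavelength.

    `line_spectra` is assumed to have shape
    (num_scan_lines, points_per_line, num_wavelength_samples), one entry per
    position of the illuminated region L, and `wavelengths` gives the
    wavelength of each spectral sample."""
    k = int(np.argmin(np.abs(wavelengths - target_wavelength)))
    return line_spectra[:, :, k]      # one intensity value per object point
```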
  • FIG. 13 shows object surface images obtained based on individual image information that is displayed on a display screen divided into four sections. The display screen A displays a zero-order light image obtained using the illumination device 7. When the illumination device 7 emits white light, a color image is displayed on the display screen A. The display screen B displays an image based on image information for a specific wavelength. The display screen C displays an image based on image information for another specific wavelength. The display screen D displays a differential image of the images displayed on the display screens B and C.
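A differential image such as the one shown on display screen D can be computed directly from the two wavelength-specific images. The sketch below (Python/NumPy) simply subtracts them and rescales the result for display; this is one plausible reading of "differential image", not a definition taken from the patent.

```python
import numpy as np

def differential_image(image_b: np.ndarray, image_c: np.ndarray) -> np.ndarray:
    """Difference of the images shown on screens B and C (same shape assumed),
    rescaled to the 0-255 range purely for display on screen D."""
    diff = image_b.astype(np.float64) - image_c.astype(np.float64)
    span = diff.max() - diff.min()
    if span == 0:
        return np.zeros(diff.shape, dtype=np.uint8)
    return ((diff - diff.min()) / span * 255).astype(np.uint8)
```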
  • In this way, images carrying different information can be compared, so that a specific phenomenon that appears at a specific point on the object surface can be precisely observed and its spectrum viewed.
  • Embodiment 6
  • FIGS. 14( a)-14(c) are illustrations to explain the construction of an observation system according to Embodiment 6 for measuring spectral reflectance in which the image pickup apparatus of the present invention is mounted, with FIG. 14( a) schematically showing the various components of the observation system and its construction, with FIG. 14( b) showing an illumination light flux in the illumination device 16 as seen in the direction z, and with FIG. 14( c) showing an illumination light flux in the illumination device 16 as seen in the direction x.
  • FIGS. 15( a) and 15(b) are illustrations showing information obtained by a light receiving element 14 of the observation system after the information has been processed by an image processing device (not shown), and displayed on display screens.
  • As shown in FIGS. 14( a)-14(c), the observation system for measuring spectral reflectance in this embodiment is formed of an objective lens 15, a diffraction element 13, an imaging lens 12, a light receiving element 14, an illumination device 16 for illuminating a linear region of an object surface with white light, an illumination device 17 for illuminating the object surface with infrared light, and half mirrors 18 and 19.
  • The illumination device 16 is formed of a white light source 16 a, a collimating lens 16 b, and a diffusing element 16 c. As shown in FIGS. 14( b) and 14(c), the diffusing element 16 c simply transmits light in the direction z and diffuses light in the direction y. The illumination device 17 is formed of an infrared light source 17 a, a diffusing element 17 b, and a collimating lens 17 c. The diffusing element 17 b diffuses incident light evenly in the directions x and y.
  • The observation system for measuring spectral reflectance of this embodiment further includes a stage 20 on which an object is placed and a stage driving mechanism (not shown) for moving the stage in the direction x in the figure, as shown by the double-headed arrows.
  • In the observation system for measuring spectral reflectance of this embodiment, the illumination device 17 first illuminates the object surface with light in the infrared range. Infrared rays emitted from the infrared light source 17 a enter the half mirror 19 as a collimated light flux via the diffusing element 17 b and the collimating lens 17 c. After being reflected by the half mirror 19 and transmitted through the half mirror 18, the infrared light illuminates the object surface on the stage 20 via the objective lens 15. The infrared light that is reflected by the object surface enters the half mirror 18 as a collimated light flux via the objective lens 15. The infrared light that is transmitted through the half mirrors 18 and 19 is separated into zero-order light and −1st-order light and imaged on the light receiving element 14 via the diffraction element 13 and imaging lens 12. In this manner, an infrared image of the object surface is obtained in the imaging area of the zero-order light on the light receiving surface of the light receiving element 14.
  • Subsequently, the illumination device 16 illuminates a relatively narrow area of the object surface with white light. White light emitted from the light source 16 a enters the diffusing element 16 c as a collimated light flux via the collimating lens 16 b. Upon entering the diffusing element 16 c, the white light is diffused only in the direction y. After being reflected by the half mirror 18, the white light passes through the objective lens 15 and illuminates a linear region of the object surface that is oriented in the direction y in the figure. Light reflected by the object surface enters the half mirror 18 as a collimated light flux via the objective lens 15. After being transmitted through the half mirrors 18 and 19, the light is separated into zero-order light and −1st-order light and imaged onto the light receiving element 14 via the diffraction element 13 and the imaging lens 12. In this manner, a color image of the illuminated region of the object surface is obtained in the imaging area of the zero-order light on the light receiving surface of the light receiving element 14, and the spectrum of that region is obtained in the imaging area of the −1st-order light. The stage 20 on which the object is placed can be moved to shift the linearly illuminated region over the object surface, so that reflection spectral images for the entire object surface are obtained and stored in the memory of an image processing device (not shown).
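  • One way the scan-and-store sequence described above could look in code is sketched below; `move_stage` and `capture_frame` are hypothetical stand-ins for the stage driving mechanism and the readout of the light receiving element 14, and the split of the sensor into zero-order and −1st-order areas is shown only schematically.

```python
import numpy as np

SENSOR_H, SENSOR_W, NUM_STEPS = 960, 1280, 480
ZERO_ORDER_ROWS = slice(0, 480)       # imaging area of the zero-order light
MINUS_FIRST_ROWS = slice(480, 960)    # imaging area of the -1st-order light

def move_stage(step):
    pass                              # placeholder for the stage driving mechanism

def capture_frame():
    return np.random.rand(SENSOR_H, SENSOR_W)   # placeholder sensor readout

line_images, line_spectra = [], []
for step in range(NUM_STEPS):
    move_stage(step)                  # shift the linearly illuminated region in x
    frame = capture_frame()
    line_images.append(frame[ZERO_ORDER_ROWS])    # color image of the lit line
    line_spectra.append(frame[MINUS_FIRST_ROWS])  # dispersed spectrum of that line

# Reflection spectral data for the entire object surface, one entry per stage position.
reflection_cube = np.stack(line_spectra)
```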
  • An image processing device 54 and a display device 56 (shown in block diagram form in FIG. 5) may be used to display the infrared images obtained using the illumination device 17 and the spectral images obtained using the illumination device 16 side-by-side on one or more display screens. In addition, the coordinates of the linearly illuminated region on the object surface can be calculated from the color image obtained using the illumination device 16, and a mark indicating the illuminated region can be merged with, and displayed on, the infrared image. For example, a display may be divided vertically into two parts as shown in FIG. 15(a), so that a first display screen displays an infrared image of the object surface and a second display screen displays the spectral content of the visible image. The linearly illuminated region of the object is marked on the infrared image on the first display screen so that it is clear from which part of the object surface the spectral images on the second display screen are taken. Furthermore, a spectral intensity analysis can focus on the reflectance spectrum Q1 of a small region P1 within the marked region on the object surface. FIG. 15(b) shows a spectral intensity profile for the small region P1 that may be displayed on the second display screen. In this way, a specific small region P1 on the object surface is selected and its spectral intensity profile is analyzed.
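  • The marking and profile-extraction steps might be sketched as follows; locating the line by the column of maximum summed intensity and averaging a few rows for the small region P1 are assumptions chosen for illustration, and the array names are hypothetical.

```python
import numpy as np

zero_order_image = np.random.rand(480, 640, 3)   # color image showing the lit line
spectral_image = np.random.rand(480, 64)         # rows: position along the line, cols: bands

def illuminated_column(color_img):
    """Estimate which column of the zero-order image contains the linearly
    illuminated region: the column with the largest summed intensity."""
    return int(np.argmax(color_img.sum(axis=(0, 2))))

def region_spectrum(spec_img, row_start, row_stop):
    """Average reflectance spectrum Q1 over a small region P1 (a few rows)."""
    return spec_img[row_start:row_stop].mean(axis=0)

mark_x = illuminated_column(zero_order_image)    # x coordinate to mark on the infrared image
q1 = region_spectrum(spectral_image, 200, 210)   # spectral intensity profile for P1
```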
  • Embodiment 7
  • FIG. 16 is an illustration that schematically shows the construction of an illuminant observation system in which the image pickup apparatus according to Embodiment 7 of the present invention is mounted. FIG. 17 is an illustration that shows information that is obtained by a light receiving element 14 of the illuminant observation system of FIG. 16 after the information has been processed by an image processing device (not shown), and displayed on a display device that includes, for example, a first display screen and a second display screen.
  • The illuminant observation system of FIG. 16 is formed of an objective lens 15, a spectroscopic element 13, an imaging lens 12, a light receiving element 14, an illumination device 17′, and a half mirror 19. An object is placed on a stage 20′ and moved by a stage driving mechanism (not shown) in the direction y. As shown in FIG. 16, the illumination device 17′ may be formed of a light source 17 a′ that emits white light, a diffusing element 17 b, and a collimating lens 17 c. The illumination device 17′ has the same construction as the illumination device 17 of Embodiment 6 except that its light source 17 a′ emits white (visible) light rather than infrared light. The other optical elements have the same construction as the like-numbered elements in the earlier embodiments.
  • The observation system of the present embodiment is for inspecting illumination light sources, such as LEDs. The stage 20′ is elongated in the direction y in the figure. Multiple substrates having red, green, and blue LEDs are arranged on the stage 20′ in the direction y. The LEDs are energized for inspection. The illumination device 17′ intermittently emits white light. For example, the illumination device 17′ can be a flash lamp. While the illumination device 17′ is energized, light from the illumination device 17′ enters the half mirror 19 as a collimated light flux via the diffusing element 17 b and the collimating lens 17 c. After being reflected by the half mirror 19, the light extensively illuminates the object surface on the stage 20′ via the objective lens 15. Light reflected by the object surface enters the half mirror 19 as a collimated light flux via the objective lens 15. The light transmitted through the half mirror 19 is separated into zero-order light and −1st-order light and is imaged on the light receiving element 14 via the spectroscopic element 13 and the imaging lens 12. In this manner, a color object surface image is obtained in the imaging area of the zero-order light on the light receiving surface of the light receiving element 14.
  • While the illumination device 17′ is off, light from the LEDs (not shown in FIG. 16) enters the half mirror 19 as collimated light fluxes via the objective lens 15. The light transmitted through the half mirror 19 is separated into zero-order light and −1st-order light and is imaged onto the light receiving element 14 via the spectroscopic element 13 and the imaging lens 12. Then, spectral images for the LEDs are obtained in the imaging areas of the −1st-order light on the light receiving surface of the light receiving element 14.
  • The stage 20′ may be moved in accordance with the illumination cycle of the illumination device 17′ so that the inspection of the LEDs can be performed continuously. Images that are obtained may be stored in an image processing device (not shown in FIG. 16).
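  • A minimal sketch of this alternating acquisition cycle, assuming hypothetical driver functions `set_flash`, `move_stage`, and `capture_frame` and the same schematic split of the sensor into zero-order and −1st-order areas as before:

```python
import numpy as np

def set_flash(on):
    pass                                   # placeholder for the flash lamp driver

def move_stage(step):
    pass                                   # placeholder for the stage driving mechanism

def capture_frame():
    return np.random.rand(960, 1280)       # placeholder readout of light receiving element 14

color_frames, led_spectra = [], []
for step in range(100):
    set_flash(True)                        # flash on: color object surface image
    color_frames.append(capture_frame()[:480, :])    # zero-order imaging area
    set_flash(False)                       # flash off: only the energized LEDs emit
    led_spectra.append(capture_frame()[480:, :])     # -1st-order imaging area
    move_stage(step)                       # advance to the next substrate position
```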
  • The image processing device and the display device may display, side-by-side on one or more display screens, the color object surface images obtained while the illumination device 17′ is energized and the spectral images obtained while the illumination device 17′ is not energized. For example, as illustrated in FIG. 17, the display may be divided vertically into two parts, with a first display screen displaying the color object surface images and a second display screen displaying the spectral images of the LEDs. The LEDs and the substrates on which they are mounted may be inspected by viewing the color images on the first display screen. Furthermore, the LEDs can be checked for quality by viewing their spectral images on the second display screen and comparing each LED output with reference LED spectral data.
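  • The quality check against reference LED spectral data might, for example, be implemented as a normalized RMS comparison; the threshold value and the Gaussian stand-in reference below are illustrative assumptions only.

```python
import numpy as np

def normalize(spectrum):
    s = np.asarray(spectrum, dtype=float)
    return s / max(np.linalg.norm(s), 1e-12)

def led_passes(measured, reference, max_rms=0.05):
    """Pass/fail: RMS deviation between normalized measured and reference spectra."""
    diff = normalize(measured) - normalize(reference)
    return float(np.sqrt(np.mean(diff ** 2))) <= max_rms

bands = np.arange(64)
reference_red = np.exp(-0.5 * ((bands - 50) / 4.0) ** 2)      # stand-in reference spectrum
measured_red = reference_red + 0.01 * np.random.randn(64)     # simulated measurement
print(led_passes(measured_red, reference_red))
```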
  • The invention being thus described, it will be obvious that the same may be varied in many ways. Such variations are not to be regarded as a departure from the spirit and scope of the invention. Rather, the scope of the invention shall be defined as set forth in the following claims and their legal equivalents. All such modifications as would be obvious to one skilled in the art are intended to be included within the scope of the following claims.

Claims (7)

1-8. (canceled)
9. An observation system comprising:
an image pickup apparatus which includes:
an observation optical system that forms an image of an object at an image surface;
a diffraction optical element that is arranged in the observation optical system and generates, from light incident onto the diffraction optical element, zero-order diffracted light and at least one of −1st-order diffracted light and +1st-order diffracted light; and
two solid-state image pickup devices, both of which are arranged at the image surface, one of which captures an image formed by the observation optical system using the zero-order diffracted light and the other of which captures an image formed by the observation optical system using one of the −1st-order diffracted light and the +1st-order diffracted light;
an image processing apparatus that processes information of images captured by the two solid-state image pickup devices; and
a display apparatus that displays images supplied by the image processing apparatus; wherein
the image processing apparatus forms an image of the object using information of the zero-order diffracted light captured by one of said two solid-state image pickup devices, and also forms a spectral image of the object by using information from one of the −1st-order diffracted light and the +1st-order diffracted light captured by the other of said two solid-state image pickup devices.
10. The observation system according to claim 9, and further comprising:
a first illumination device for illuminating a relatively narrow area of the object; and
a second illumination device for illuminating a relatively broad area of the object; wherein
the image processing apparatus extracts spectral information of the object using one of the −1st-order diffracted light and the +1st-order diffracted light captured by one of said two solid-state image pickup devices when the object is illuminated by the first illumination device, and extracts morphological information of the object from the zero-order diffracted light captured by the other of said two solid-state image pickup devices when the object is illuminated by the second illumination device.
11. The observation system according to claim 10, wherein the image processing apparatus further comprises a memory for storing the image data and the spectral image data, and supplies a composite image to the display apparatus using the stored image data and spectral image data.
12. The observation system according to claim 10, wherein the image processing apparatus forms one image by extracting specific wavelength information from a plurality of images that include spectral information, said images being captured when the object is illuminated by the first illumination device.
13. The observation system according to claim 10, wherein the first illumination device is adapted to scan an area on the object.
14. The observation system according to claim 10, wherein the first illumination device includes an LED.
US12/471,045 2004-05-25 2009-05-22 Image pickup apparatus for capturing spectral images of an object and observation system including the same Abandoned US20090231983A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US12/471,045 US20090231983A1 (en) 2004-05-25 2009-05-22 Image pickup apparatus for capturing spectral images of an object and observation system including the same

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2004154659A JP4499476B2 (en) 2004-05-25 2004-05-25 Spectral image input apparatus and optical apparatus provided with the same
JP2004-154659 2004-05-25
US11/135,391 US7554572B2 (en) 2004-05-25 2005-05-24 Image pickup apparatus for capturing spectral images of an object and observation system including the same
US12/471,045 US20090231983A1 (en) 2004-05-25 2009-05-22 Image pickup apparatus for capturing spectral images of an object and observation system including the same

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US11/135,391 Division US7554572B2 (en) 2004-05-25 2005-05-24 Image pickup apparatus for capturing spectral images of an object and observation system including the same

Publications (1)

Publication Number Publication Date
US20090231983A1 true US20090231983A1 (en) 2009-09-17

Family

ID=35424742

Family Applications (2)

Application Number Title Priority Date Filing Date
US11/135,391 Active 2027-05-25 US7554572B2 (en) 2004-05-25 2005-05-24 Image pickup apparatus for capturing spectral images of an object and observation system including the same
US12/471,045 Abandoned US20090231983A1 (en) 2004-05-25 2009-05-22 Image pickup apparatus for capturing spectral images of an object and observation system including the same

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US11/135,391 Active 2027-05-25 US7554572B2 (en) 2004-05-25 2005-05-24 Image pickup apparatus for capturing spectral images of an object and observation system including the same

Country Status (2)

Country Link
US (2) US7554572B2 (en)
JP (1) JP4499476B2 (en)

Families Citing this family (26)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009527953A (en) * 2006-02-22 2009-07-30 アイトレス リサーチ リミテッド Optical multiplexed imaging system and operation method
US20080024871A1 (en) * 2006-02-22 2008-01-31 Itres Research Limited Optically multiplexed imaging systems and methods of operation
JP2007319442A (en) * 2006-06-01 2007-12-13 Fujifilm Corp Capsule endoscope system and image processing unit
JP5028048B2 (en) * 2006-08-10 2012-09-19 キヤノン株式会社 Imaging device
US8248466B2 (en) * 2006-10-11 2012-08-21 International Business Machines Corporation Image processing using multiple image devices
KR100885537B1 (en) * 2008-09-03 2009-02-26 주식회사 나노베이스 Wavelength tunable spectrometer and wavelength tuning method therof
JP5467756B2 (en) * 2008-11-14 2014-04-09 Hoya株式会社 Endoscope device
JP5645445B2 (en) * 2009-05-22 2014-12-24 キヤノン株式会社 Imaging apparatus and imaging method
US10498939B2 (en) * 2009-06-16 2019-12-03 Nri R&D Patent Licensing, Llc Small-profile lensless optical microscopy imaging and tomography instruments and elements for low cost and integrated microscopy
JP2011039247A (en) * 2009-08-10 2011-02-24 Nikon Corp Microscope, image processor and image processing method
JPWO2011055405A1 (en) 2009-11-04 2013-03-21 株式会社ニレコ Spectroscopic information reader
WO2011144964A1 (en) * 2010-05-17 2011-11-24 Ford Espana S.L. Inspection system and method of defect detection on specular surfaces
US8189179B2 (en) * 2010-07-09 2012-05-29 Raytheon Company System and method for hyperspectral and polarimetric imaging
JP5538194B2 (en) * 2010-11-30 2014-07-02 ソニー株式会社 Optical apparatus and electronic apparatus
JP5717052B2 (en) * 2011-04-25 2015-05-13 株式会社リコー Spectroscopic measurement apparatus, image evaluation apparatus, and image forming apparatus
US9030660B2 (en) 2012-09-19 2015-05-12 Raytheon Company Multi-band imaging spectrometer
JP5701837B2 (en) 2012-10-12 2015-04-15 横河電機株式会社 Displacement sensor, displacement measurement method
JP6128897B2 (en) * 2013-03-06 2017-05-17 キヤノン株式会社 Illumination device and image reading device
JP6377768B2 (en) * 2015-01-07 2018-08-22 オリンパス株式会社 Spectral image acquisition device
JP5907364B2 (en) * 2015-02-17 2016-04-26 横河電機株式会社 Spectral characteristic measuring device, spectral characteristic measuring method, surface measurement object quality monitoring device
DE102015102595B4 (en) * 2015-02-24 2021-01-28 Karl Storz Se & Co. Kg Optical observation arrangement, camera, endoscope or exoscope and endoscope or exoscope system
JP6767753B2 (en) * 2015-03-02 2020-10-14 株式会社ミツトヨ Chromatic confocal sensor and measurement method
JP6025130B2 (en) * 2015-03-23 2016-11-16 パナソニックIpマネジメント株式会社 Endoscope and endoscope system
WO2018119043A1 (en) * 2016-12-20 2018-06-28 The Board Of Trustees Of The Leland Stanford Junior University Micro-screening apparatus, process, and products
CN108917927B (en) * 2018-07-27 2020-08-25 京东方科技集团股份有限公司 Dispersion device and spectrometer
US11256012B2 (en) 2019-02-27 2022-02-22 Boe Technology Group Co., Ltd. Color dispersion apparatus and spectrometer

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2802061B2 (en) 1987-04-30 1998-09-21 オリンパス光学工業株式会社 Endoscope optical system
DE3843876A1 (en) * 1988-12-24 1990-07-12 Leitz Wild Gmbh SPECTRAL MICROSCOPE WITH A PHOTOMETER
DE4039070A1 (en) * 1990-12-07 1992-06-11 Philips Patentverwaltung MULTI-CHANNEL SPECTROMETER
JPH06129907A (en) * 1992-10-16 1994-05-13 Hitachi Ltd Two-dimensional image spectroscopic apparatus
JPH06331446A (en) * 1993-05-27 1994-12-02 Fuji Photo Film Co Ltd Apparatus for measuring spectral image
US5905571A (en) * 1995-08-30 1999-05-18 Sandia Corporation Optical apparatus for forming correlation spectrometers and optical processors
US5828451A (en) * 1997-09-30 1998-10-27 Northrop Grumman Corporation Spectral imaging system and method employing an acousto-optic tunable filter for wavelength selection with increased field of view brightness
US5998796A (en) * 1997-12-22 1999-12-07 Spectrumedix Corporation Detector having a transmission grating beam splitter for multi-wavelength sample analysis
WO2001090748A2 (en) * 2000-05-19 2001-11-29 Iowa State University Research Foundation, Inc. High-throughput methods of distinguishing at least one molecule individually in a sample comprising multiple molecules and systems for use therein
US6646264B1 (en) * 2000-10-30 2003-11-11 Monsanto Technology Llc Methods and devices for analyzing agricultural products
KR100452293B1 (en) * 2002-01-07 2004-10-08 삼성전기주식회사 Optical pickup device

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5078150A (en) * 1988-05-02 1992-01-07 Olympus Optical Co., Ltd. Spectral diagnosing apparatus with endoscope
US5782770A (en) * 1994-05-12 1998-07-21 Science Applications International Corporation Hyperspectral imaging methods and apparatus for non-invasive diagnosis of tissue for cancer
US20060125921A1 (en) * 1999-08-09 2006-06-15 Fuji Xerox Co., Ltd. Method and system for compensating for parallax in multiple camera systems
US20060221331A1 (en) * 2002-08-08 2006-10-05 Emanuel Elyasaf High Throughput Inspection System and a Method for Generating Transmitted and/or Reflected Images
US20060082731A1 (en) * 2002-12-04 2006-04-20 Valter Drazic High contrast stereoscopic projection system
US20050259158A1 (en) * 2004-05-01 2005-11-24 Eliezer Jacob Digital camera with non-uniform image resolution

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140210975A1 (en) * 2012-09-03 2014-07-31 Olympus Medical Systems Corp. Scanning endoscope system
US8994804B2 (en) * 2012-09-03 2015-03-31 Olympus Medical Systems Corp. Scanning endoscope system
US9408527B2 (en) * 2012-11-01 2016-08-09 Karl Storz Imaging, Inc. Solid state variable direction of view endoscope with rotatable wide-angle field for maximal image performance

Also Published As

Publication number Publication date
US20050264672A1 (en) 2005-12-01
JP4499476B2 (en) 2010-07-07
JP2005337793A (en) 2005-12-08
US7554572B2 (en) 2009-06-30

Legal Events

Date Code Title Description
STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION