US20220038610A1 - 3 mos camera - Google Patents
- Publication number
- US20220038610A1 (application US 17/132,554)
- Authority
- US
- United States
- Prior art keywords
- light
- visible light
- video signal
- image sensor
- visible
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- G02B27/145—Beam splitting or combining systems operating by reflection only having sequential partially reflecting surfaces
- G02B27/1013—Beam splitting or combining systems for splitting or combining different wavelengths for colour or multispectral image sensors, e.g. splitting an image into monochromatic image components on respective sensors
- G02B5/04—Prisms
- G02B5/208—Filters for use with infrared or ultraviolet radiation, e.g. for separating visible light from infrared and/or ultraviolet radiation
- G02B5/26—Reflecting filters
- H04N23/11—Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths for generating image signals from visible and infrared light wavelengths
- H04N23/16—Optical arrangements associated therewith, e.g. for beam-splitting or for colour correction
- H04N23/20—Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from infrared radiation only
- H04N23/45—Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from two or more image sensors being of different type or operating in different modes, e.g. with a CMOS sensor for moving images in combination with a charge-coupled device [CCD] for still images
- H04N23/73—Circuitry for compensating brightness variation in the scene by influencing the exposure time
- H04N23/741—Circuitry for compensating brightness variation in the scene by increasing the dynamic range of the image compared to the dynamic range of the electronic image sensors
- H04N23/815—Camera processing pipelines; Components thereof for controlling the resolution by using a single image
- H04N23/84—Camera processing pipelines; Components thereof for processing colour signals
- H04N23/951—Computational photography systems, e.g. light-field imaging systems by using two or more images to influence resolution, frame rate or aspect ratio
- H04N25/00—Circuitry of solid-state image sensors [SSIS]; Control thereof
- H04N25/131—Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements including elements passing infrared wavelengths
- H04N25/134—Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements based on three different wavelength filter elements
- H04N5/265—Mixing
- H04N5/33—Transforming infrared radiation
- H04N5/2258
- H04N5/23235
- H04N9/0451
- H04N9/04557
- H04N9/097
- A61B5/0071—Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence by measuring fluorescence emission
Definitions
- The present disclosure relates to a 3 MOS camera.
- ICG: indocyanine green, a fluorescent reagent.
- JP-A-2016-75825 discloses an imaging device having a blue separation prism that reflects a part of blue component light of incident light and near-infrared light in a specific wavelength region and transmits light other than the above light, a red separation prism that reflects a part of red component light of incident light and near-infrared light in a specific wavelength region and transmits light other than the above light, and a green separation prism into which the light transmitted through the red separation prism is incident.
- The present disclosure has been devised in view of the above-mentioned circumstances, and its purpose is to provide a 3 MOS camera that achieves both generation of a clearer fluorescence video of an observation part to which a fluorescent reagent is administered and resolution enhancement of a color image of the observation part, so as to assist a doctor or the like in easily grasping a diseased part.
- The present disclosure provides a 3 MOS camera including a first prism that causes a first image sensor to receive IR light of light from an observation part, a second prism that causes a second image sensor to receive visible light of A% (A: a predetermined real number) of the light from the observation part, a third prism that causes a third image sensor to receive remaining visible light of (100−A)% of the light from the observation part, and a video signal processor that combines a color video signal based on imaging outputs of the second image sensor and the third image sensor with an IR video signal based on an imaging output of the first image sensor and outputs the combined signal to a monitor, the second image sensor and the third image sensor being respectively bonded at positions optically shifted by substantially one pixel.
- According to the present disclosure, it is possible to achieve both the generation of the clearer fluorescence video of the observation part to which the fluorescent reagent is administered and the resolution enhancement of the color image of the observation part, and thus to assist the doctor or the like in easily grasping the diseased part.
- FIG. 1A is a block diagram showing an internal configuration example of a 3 MOS camera according to a first embodiment.
- FIG. 1B is a block diagram showing another internal configuration example of the 3 MOS camera 1 according to the first embodiment.
- FIG. 2 is a diagram showing a structural example of a spectral prism shown in FIG. 1 .
- FIG. 3A is a diagram showing an arrangement example of color filters of imaging elements 151 and 152 .
- FIG. 3B is an explanatory diagram of a problem in a case where the color filters of the imaging elements 151 and 152 are configured in a Bayer array and are disposed with half pixel shifting.
- FIG. 4A is a graph showing an example of spectral characteristics of a dichroic mirror.
- FIG. 4B is a graph showing an example of spectral characteristics of a beam splitter.
- FIG. 5 is a graph showing an example of a relationship between visible light division ratio and sensitivity, dynamic range, and resolution in a case where exposure times of second visible light and first visible light are the same.
- FIG. 6 is a graph showing an example of the relationship between visible light division ratio and sensitivity, dynamic range, and resolution in a case where a ratio of the exposure times of the second visible light and the first visible light is 10:1.
- FIG. 7 is a graph showing an example of the relationship between visible light division ratio and sensitivity, dynamic range, and resolution in a case where the ratio of the exposure times of the second visible light and the first visible light is 100:1.
- FIG. 8 is a graph showing an example of the relationship between visible light division ratio and sensitivity, dynamic range, and resolution in a case where the ratio of the exposure times of the second visible light and the first visible light is 1:10.
- FIG. 9 is a diagram showing a display example of a visible/IR combined video signal generated by the 3 MOS camera according to the first embodiment on a monitor.
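The trade-offs charted in FIGS. 5 to 8 can be summarized with a rough numerical sketch. The model below is illustrative only and is not taken from the patent: it assumes each sensor's signal is proportional to its share of the visible light times its exposure time, and that the extra dynamic range obtainable by combining the two channels grows with the ratio between the brighter and darker channels.

```python
def visible_split_model(a_percent, exposure_ratio):
    """Rough, illustrative model of the trade-off in FIGS. 5 to 8.

    a_percent: share of visible light sent to the second sensor (A%).
    exposure_ratio: exposure time of the second sensor relative to the
    first (1 means equal exposure, as in FIG. 5).
    Each sensor's signal is assumed proportional to
    (light share) x (exposure time), normalised so that the darker
    channel has unit exposure.
    """
    s2 = (a_percent / 100.0) * exposure_ratio   # second-sensor signal
    s3 = (100.0 - a_percent) / 100.0            # third-sensor signal
    sensitivity = s2 + s3                       # total light usage
    dr_gain = max(s2, s3) / min(s2, s3)         # extra dynamic range factor
    return sensitivity, dr_gain

# Equal 50/50 split, equal exposure: balanced channels, no extra dynamic range.
print(visible_split_model(50, 1))    # -> (1.0, 1.0)
# Equal split but a 10:1 exposure ratio: channels differ by 10x, widening
# the combined dynamic range at the cost of matched brightness.
print(visible_split_model(50, 10))   # -> (5.5, 10.0)
```

The sketch reflects the qualitative point of the figures: matched channels favour resolution enhancement by pixel shifting, while deliberately mismatched channels favour dynamic range expansion.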
- FIG. 1A is a block diagram showing an internal configuration example of a 3 MOS camera 1 according to a first embodiment.
- FIG. 1B is a block diagram showing another internal configuration example of the 3 MOS camera 1 according to the first embodiment.
- The 3 MOS camera 1 includes a lens 11 , a spectral prism 13 , imaging elements 151 , 152 , and 153 , and a video signal processing unit 17 .
- The video signal processing unit 17 includes camera signal processing units 191 , 192 , and 193 , a pixel shifting combination/resolution enhancement processing unit 21 , and a visible/IR combination processing unit 23 .
- The 3 MOS camera 1 may include a video signal processing unit 17 A (refer to FIG. 1B ).
- The 3 MOS camera 1 may include both the video signal processing unit 17 (refer to FIG. 1A ) and the video signal processing unit 17 A (refer to FIG. 1B ). Each configuration will be described in detail.
- The 3 MOS camera 1 is used, for example at the time of surgery or examination, in a medical observation system in which excitation light in a predetermined wavelength band (for example, 760 nm to 800 nm) is emitted to a fluorescent reagent (for example, indocyanine green; hereinafter referred to as “ICG”) administered in advance to an observation part (for example, diseased part) in a subject such as a patient, and the observation part that emits fluorescent light on a longer wavelength side (for example, 820 nm to 860 nm) based on the excitation light is imaged.
- An image (for example, video of the observation part) captured by the 3 MOS camera 1 is displayed on a monitor MN 1 .
- The spectral prism 13 will be described as an example used in the medical observation system described above. However, its use is not limited to medical usage, and the prism may also be used for industrial usage.
- A part of an objective side (in other words, tip side) of the 3 MOS camera 1 with respect to the lens 11 is configured by a scope that is inserted into the observation part (for example, diseased part; the same applies hereinafter).
- This scope is, for example, a main portion of a medical instrument such as a rigid endoscope inserted into the observation part, and is a light guide member capable of guiding light L 1 from the observation part to the lens 11 .
- The lens 11 is attached to the objective side (in other words, tip side) of the spectral prism 13 and collects the light L 1 from the observation part (for example, reflected light at the observation part). Collected light L 2 is incident on the spectral prism 13 .
- The spectral prism 13 receives the light L 2 from the observation part and splits the light into first visible light V 1 , second visible light V 2 , and IR light N 1 .
- The spectral prism 13 has a configuration including an IR prism 31 and visible prisms 32 and 33 (refer to FIG. 2 ).
- The first visible light V 1 is incident on the imaging element 151 disposed so as to face the visible prism 32 .
- The second visible light V 2 is incident on the imaging element 152 disposed so as to face the visible prism 33 .
- The IR light N 1 is incident on the imaging element 153 disposed so as to face the IR prism 31 .
- A detailed structural example of the spectral prism 13 will be described below with reference to FIG. 2 .
- The imaging element 151 as an example of a second image sensor includes, for example, a charge coupled device (CCD) or a complementary metal-oxide semiconductor (CMOS) in which a plurality of pixels suitable for imaging visible light are arranged, and an exposure control circuit (not shown) using an electronic shutter.
- In the imaging element 151 , at least a microlens, a color filter, and a light receiving element are disposed, for example, corresponding to each pixel.
- The microlens collects incident light (visible light).
- The color filter transmits visible light of a specific color component (wavelength) transmitted through the microlens.
- The color filter of the imaging element 151 is disposed in a Bayer array (refer to FIG. 3A ) of red (R), green (G), green (G), and blue (B).
- The specific color component indicates, for example, red (R), green (G), and blue (B).
- The light receiving element receives light of the specific color component (wavelength) transmitted through the color filter.
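As an illustration of the RGGB Bayer array described above (a sketch, not part of the patent), the filter colour at any pixel of the mosaic follows directly from the pixel coordinates:

```python
def bayer_pattern(rows, cols):
    """Return the filter colour ('R', 'G', or 'B') at each pixel
    of an RGGB Bayer mosaic of the given size: each 2x2 cell
    carries one red, two green, and one blue filter."""
    cell = [['R', 'G'],
            ['G', 'B']]
    return [[cell[r % 2][c % 2] for c in range(cols)] for r in range(rows)]

for row in bayer_pattern(4, 4):
    print(' '.join(row))
# R G R G
# G B G B
# R G R G
# G B G B
```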
- The imaging element 151 is disposed so as to face the visible prism 32 (refer to FIG. 2 ).
- The imaging element 151 captures an image based on the first visible light V 1 that is incident, for a first exposure time determined by the exposure control circuit based on an exposure control signal CSH 1 from the camera signal processing unit 191 .
- The imaging element 151 generates a video signal V 1 V of the observation part by imaging and outputs the signal to the video signal processing unit 17 .
- The imaging element 152 as an example of a third image sensor includes, for example, a CCD or a CMOS in which a plurality of pixels suitable for imaging visible light are arranged, and an exposure control circuit (not shown) using an electronic shutter.
- In the CCD or CMOS constituting the imaging element 152 , at least a microlens, a color filter, and a light receiving element are arranged, for example, corresponding to each pixel.
- The microlens collects incident light (visible light).
- The color filter transmits visible light of a specific color component (wavelength) transmitted through the microlens.
- The color filter of the imaging element 152 is disposed in the Bayer array (refer to FIG. 3A ).
- The specific color component indicates, for example, red (R), green (G), and blue (B).
- The light receiving element receives light of the specific color component (wavelength) transmitted through the color filter.
- The imaging element 152 is disposed so as to face the visible prism 33 (refer to FIG. 2 ). The imaging element 152 captures an image based on the second visible light V 2 that is incident, for a second exposure time determined by the exposure control circuit based on an exposure control signal CSH 2 from the camera signal processing unit 192 .
- The imaging element 152 generates a video signal V 2 V of the observation part by imaging and outputs the signal to the video signal processing unit 17 .
- The imaging element 153 as an example of a first image sensor includes, for example, a CCD or a CMOS in which a plurality of pixels suitable for imaging IR light are arranged.
- The imaging element 153 is disposed so as to face the IR prism 31 (refer to FIG. 2 ).
- The imaging element 153 captures an image based on the incident IR light N 1 .
- The imaging element 153 generates a video signal N 1 V of the observation part by imaging and outputs the signal to the video signal processing unit 17 .
- The video signal processing unit 17 is configured using a processor such as a digital signal processor (DSP) or a field programmable gate array (FPGA).
- The camera signal processing unit 191 performs various types of camera signal processing using the video signal V 1 V from the imaging element 151 to generate a first visible video signal V 1 VD of the observation part, and outputs the signal to the pixel shifting combination/resolution enhancement processing unit 21 or the long/short exposure combination/wide dynamic range processing unit 21 A.
- The camera signal processing unit 191 generates the exposure control signal CSH 1 for determining the first exposure time of the imaging element 151 and outputs the signal to the imaging element 151 .
- The imaging element 151 controls the exposure time of the first visible light V 1 based on the exposure control signal CSH 1 .
- The camera signal processing unit 192 performs various types of camera signal processing using the video signal V 2 V from the imaging element 152 to generate a second visible video signal V 2 VD of the observation part, and outputs the signal to the pixel shifting combination/resolution enhancement processing unit 21 or the long/short exposure combination/wide dynamic range processing unit 21 A.
- The brightness (sensitivity) of the first visible video signal V 1 VD and the brightness of the second visible video signal V 2 VD may be substantially the same (including the same) or may be different. In particular, the closer the brightness (sensitivity) of the first visible video signal V 1 VD and the brightness of the second visible video signal V 2 VD are to substantially the same (including the same), the higher the effect of resolution enhancement is.
- The camera signal processing unit 192 generates the exposure control signal CSH 2 for determining the second exposure time of the imaging element 152 and outputs the signal to the imaging element 152 .
- The imaging element 152 controls the exposure time of the second visible light V 2 based on the exposure control signal CSH 2 .
- The first exposure time and the second exposure time may be the same (refer to FIG. 5 ) or may be different (refer to FIGS. 6 to 8 ), and the same applies hereinafter.
- The pixel shifting combination/resolution enhancement processing unit 21 receives two video signals (specifically, the first visible video signal V 1 VD from the camera signal processing unit 191 and the second visible video signal V 2 VD from the camera signal processing unit 192 ). The closer the brightness of the first visible video signal V 1 VD and the brightness of the second visible video signal V 2 VD are to the same, the higher the effect of resolution enhancement by the pixel shifting combination/resolution enhancement processing unit 21 is.
- Combination/pixel interpolation processing is performed in consideration of a spatial positional relationship between the first visible video signal V 1 VD and the second visible video signal V 2 VD, and thus it is possible to generate a high-resolution video signal VVD.
- The pixel shifting combination/resolution enhancement processing unit 21 performs combination processing on the received two input video signals (that is, combination of the first visible video signal V 1 VD generated by the camera signal processing unit 191 based on the imaging of the imaging element 151 bonded to the visible prism 32 and the second visible video signal V 2 VD generated by the camera signal processing unit 192 based on the imaging of the imaging element 152 bonded to the visible prism 33 ) to generate the high-resolution video signal VVD.
- The pixel shifting combination/resolution enhancement processing unit 21 can generate the high-resolution video signal VVD having higher resolution than the first visible video signal V 1 VD or the second visible video signal V 2 VD.
- The pixel shifting combination/resolution enhancement processing unit 21 outputs the high-resolution video signal VVD to the visible/IR combination processing unit 23 .
- The generation of the high-resolution video signal VVD by the pixel shifting combination/resolution enhancement processing unit 21 will be described below with reference to FIG. 3A .
- The video signal processing unit 17 generates the high-resolution video signal VVD by pixel shifting. Therefore, in the spectral prism 13 (refer to FIG. 2 ), when the imaging element 151 on which the first visible light V 1 is incident and the imaging element 152 on which the second visible light V 2 is incident are respectively bonded to the corresponding visible prisms 32 and 33 , it is necessary to optically shift the positions of the imaging element 151 and the imaging element 152 by substantially one pixel (for example, in the horizontal or vertical direction, or in both directions) to perform the bonding (refer to FIG. 3A ).
- The pixel shifting combination/resolution enhancement processing unit 21 can generate the high-resolution video signal VVD by the pixel shifting based on the imaging of the imaging elements 151 and 152 , which are disposed in an optically shifted manner by substantially one pixel (refer to the above).
- Here, “substantially one pixel” includes exactly one pixel; the shift need not be exactly one pixel and may include, for example, a deviation of one pixel plus or minus 0.25 pixels. The closer the amount of pixel shifting is to one pixel, the higher the effect of resolution enhancement by the pixel shifting combination/resolution enhancement processing unit 21 is.
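The benefit of the substantially one-pixel shift can be illustrated with a toy sketch (hypothetical, not the patented combination/pixel interpolation algorithm): with RGGB mosaics on both sensors and a one-pixel horizontal shift, every pixel site receives a green sample from one sensor and a red or blue sample from the other, so fewer colour values have to be interpolated.

```python
def combined_samples(rows, cols):
    """For each pixel site, list the filter colours measured directly
    by two RGGB Bayer sensors shifted one pixel horizontally
    (illustrative sketch only)."""
    cell = [['R', 'G'], ['G', 'B']]
    merged = []
    for r in range(rows):
        line = []
        for c in range(cols):
            a = cell[r % 2][c % 2]         # sensor bonded to the first visible prism
            b = cell[r % 2][(c + 1) % 2]   # sensor shifted by one pixel horizontally
            line.append(a + b)             # colours measured directly at this site
        merged.append(line)
    return merged

for line in combined_samples(2, 4):
    print(' '.join(line))
# RG GR RG GR
# GB BG GB BG
```

Every site now carries a G sample plus an R or B sample, whereas a single Bayer sensor measures only one colour per site.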
- The long/short exposure combination/wide dynamic range processing unit 21 A receives and superimposes the two video signals having different brightness (sensitivity) (specifically, the first visible video signal V 1 VD from the camera signal processing unit 191 and the second visible video signal V 2 VD from the camera signal processing unit 192 ) to combine the signals and generate a wide dynamic range video signal VVDA.
- The long/short exposure combination/wide dynamic range processing unit 21 A superimposes and combines the two video signals having different brightness (sensitivity), and thus can generate the wide dynamic range video signal VVDA with an apparently wider dynamic range than the first visible video signal V 1 VD or the second visible video signal V 2 VD.
- The long/short exposure combination/wide dynamic range processing unit 21 A outputs the wide dynamic range video signal VVDA to the visible/IR combination processing unit 23 .
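A minimal per-pixel sketch of the long/short exposure combination idea (illustrative only; the actual processing in the unit 21 A is not specified at this level of detail):

```python
def merge_exposures(long_px, short_px, exposure_ratio, saturation=255):
    """Toy long/short-exposure merge for one pixel value.

    Uses the long-exposure sample where it is not clipped; otherwise
    falls back to the short-exposure sample scaled by the exposure
    ratio, so bright regions keep detail that the long exposure
    would blow out.
    """
    if long_px < saturation:
        return float(long_px)
    return float(short_px) * exposure_ratio

print(merge_exposures(120, 12, 10))   # dark region: long exposure used -> 120.0
print(merge_exposures(255, 80, 10))   # clipped highlight: scaled short -> 800.0
```

The merged values span a wider range than either input alone, which is the sense in which the combined signal has an apparently wider dynamic range.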
- The visible/IR combination processing unit 23 receives and superimposes the high-resolution video signal VVD from the pixel shifting combination/resolution enhancement processing unit 21 and the IR video signal N 1 VD from the camera signal processing unit 193 to combine the signals and generate a visible/IR combined video signal IMVVD.
- With the visible/IR combined video signal IMVVD, the resolution is enhanced by the combination processing after the pixel shifting. Therefore, the state around the observation part (for example, surgical field) becomes visually clear, and the state of the diseased part can be clarified in detail by the fluorescent light emission of the fluorescent reagent such as ICG (refer to FIG. 9 ).
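A toy per-pixel sketch of such visible/IR combination (illustrative; rendering the fluorescence as a green highlight is an assumption, a common display convention for ICG imaging rather than anything mandated here):

```python
def overlay_ir(rgb, ir, gain=1.0):
    """Superimpose an IR fluorescence intensity on a visible pixel.

    rgb: (r, g, b) sample from the high-resolution visible signal, 0-255.
    ir: fluorescence intensity from the IR sensor, 0-255.
    The fluorescence is rendered as a green highlight, clipped at 255.
    """
    r, g, b = rgb
    g = min(255, int(g + gain * ir))   # boost green where fluorescence is strong
    return (r, g, b)

print(overlay_ir((90, 60, 70), 0))     # no fluorescence -> (90, 60, 70)
print(overlay_ir((90, 60, 70), 200))   # strong fluorescence -> (90, 255, 70)
```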
- The visible/IR combination processing unit 23 may output the visible/IR combined video signal IMVVD to the monitor MN 1 or send the signal to a recording device (not shown) for accumulation.
- The monitor MN 1 constitutes, for example, an image console (not shown) disposed in a surgery room at the time of surgery or examination, and displays the visible/IR combined video signal IMVVD of the observation part generated by the 3 MOS camera 1 . Accordingly, a user such as a doctor can visually recognize the visible/IR combined video signal IMVVD displayed on the monitor MN 1 to grasp in detail the part that emits fluorescent light in the observation part.
- The recording device is, for example, a recorder capable of recording data of the visible/IR combined video signal IMVVD generated by the 3 MOS camera 1 .
- FIG. 2 is a diagram showing a structural example of the spectral prism 13 shown in FIG. 1 .
- the spectral prism 13 includes the IR prism 31 (an example of a first prism), the visible prism 32 (an example of a second prism), and the visible prism 33 (an example of a third prism).
- the IR prism 31 , the visible prism 32 , and the visible prism 33 are sequentially assembled in an optical axis direction of the light L 2 collected by the lens 11 .
- the IR prism 31 as an example of the first prism includes an incident surface 31 a on which the light L 2 is incident, a reflection surface 31 b on which a dichroic mirror DYM 1 that reflects the IR light of the light L 2 is formed, and an emission surface 31 c from which the IR light is emitted.
- the dichroic mirror DYM 1 (an example of first reflection film) is formed on the reflection surface 31 b by vapor deposition or the like, reflects the IR light (for example, IR light in the wavelength band of 800 nm or more) of the light L 2 , and transmits light (for example, light of about 400 nm to 800 nm) other than the IR light of the light L 2 (refer to FIG. 4A ).
- the IR light (refer to above) of the light L 2 incident on the incident surface 31 a of the IR prism 31 is reflected by the reflection surface 31 b .
- This IR light is reflected by the reflection surface 31 b , is then totally reflected by the incident surface 31 a of the IR prism 31 , and is incident on the imaging element 153 through the emission surface 31 c.
- FIG. 4A is a graph showing an example of spectral characteristics of the dichroic mirror DYM 1 .
- the horizontal axis of FIG. 4A indicates wavelength [nm: nanometer (the same applies hereinafter)], and the vertical axis indicates reflectance or transmittance.
- a characteristic TP 1 indicates the transmittance of the dichroic mirror DYM 1 . According to the characteristic TP 1 , the dichroic mirror DYM 1 can transmit the light of about 400 nm to 800 nm.
- a characteristic RF 1 indicates the reflectance of the dichroic mirror DYM 1 . According to the characteristic RF 1 , the dichroic mirror DYM 1 can reflect the IR light of 800 nm or more. Therefore, all the IR light having a light amount indicated by an area AR 1 (in other words, the IR light of the light L 2 ) can be incident on the imaging element 153 .
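The routing behavior of the dichroic mirror DYM 1 can be summarized as a tiny lookup. The wavelength thresholds are the example values quoted in this description (about 400 nm and 800 nm), not exact specifications:

```python
def dym1_path(wavelength_nm):
    """Route light at the dichroic mirror DYM1 per the example
    spectral characteristics (characteristics TP1 and RF1)."""
    if wavelength_nm >= 800:
        return "reflected -> imaging element 153 (IR)"
    if 400 <= wavelength_nm:
        return "transmitted -> beam splitter BSP1 (visible)"
    return "not passed (outside the stated bands)"
```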
- the visible prism 32 as an example of the second prism includes an incident surface 32 a on which the light (an example of first transmitted light) transmitted through the dichroic mirror DYM 1 is incident, a reflection surface 32 b on which a beam splitter BSP 1 for reflecting a partial light amount of the transmitted light (specifically, visible light) is formed, and an emission surface 32 c from which reflected visible light of the partial light amount is emitted.
- the beam splitter BSP 1 (an example of second reflection film) is formed on the reflection surface 32 b by vapor deposition or the like, reflects visible light having a partial (for example, around A % of the light incident on the incident surface 32 a ; A is a predetermined real number, for example, 50) light amount of the visible light incident on the incident surface 32 a , and transmits visible light having a remaining (100-A)% (for example, around 50% of the light incident on the incident surface 32 a ) light amount thereof (refer to FIG. 4B ). Specifically, the visible light having the partial (for example, 50%) light amount of the visible light incident on the incident surface 32 a of the visible prism 32 is reflected by the reflection surface 32 b .
- a ratio of visible light reflected by the beam splitter BSP 1 is not limited to 50% and may be in a range of 30% to 50%, for example.
- the visible prism 33 as an example of the third prism has an incident surface 33 a on which the visible light having the remaining light amount transmitted through the beam splitter BSP 1 is incident and an emission surface 33 c from which the visible light having the remaining light amount is emitted. Specifically, the visible light having the remaining light amount transmitted through the beam splitter BSP 1 is incident on the visible prism 33 , is emitted as it is, and is incident on the imaging element 152 (refer to FIG. 4B ).
- FIG. 4B is a graph showing an example of spectral characteristics of the beam splitter BSP 1 .
- the horizontal axis of FIG. 4B indicates wavelength [nm], and the vertical axis indicates reflectance or transmittance.
- a characteristic TP 2 indicates transmittance and reflectance (about 50% at 400 nm to 800 nm) of the beam splitter BSP 1 in the spectral prism 13 shown in FIG. 2 .
- the beam splitter BSP 1 as an example of the second reflection film can reflect light having a light amount of about 50% (mainly visible light) of the light of about 400 nm to 800 nm and can transmit light having a remaining light amount of about 50% (mainly visible light) thereof.
- visible light having a light amount indicated by an area AR 2 (for example, visible light having light amount of about 50%) can be incident on the imaging element 151 .
- the visible light having the light amount indicated by the area AR 2 (for example, visible light having light amount of about 50%) can be incident on the imaging element 152 .
- FIG. 3A is a diagram showing an arrangement example of the color filters BYR 1 and BYR 2 of the imaging elements 151 and 152 .
- the color filter BYR 1 is a color filter constituting the imaging element 151 and is disposed in the Bayer array consisting of the color filters of red (R), green (G), green (G), and blue (B) in any four adjacent pixels in the horizontal and vertical directions, for example. In the Bayer array, more green (G) is disposed than red (R) and blue (B) in any four pixels.
- the color filter BYR 2 is a color filter constituting the imaging element 152 and is disposed in the Bayer array consisting of the color filters of red (R), green (G), green (G), and blue (B) in any four adjacent pixels in the horizontal and vertical directions, for example.
- the imaging elements 151 and 152 are disposed with an offset of one pixel, and thus the color filters BYR 1 and BYR 2 are disposed so as to be offset by one pixel.
- Although FIG. 3A shows an example in which an offset of exactly one pixel is added, the color filters BYR 1 and BYR 2 may be disposed with an offset of substantially one pixel (refer to above). With the pixel shifting by this offset of substantially one pixel, the green (G) pixel of one Bayer array (for example, the color filter BYR 1 ) is disposed on the blue (B) pixel or the red (R) pixel of the other Bayer array (for example, the color filter BYR 2 ).
- As a result, the green (G) color filter is located at every pixel position when the two layers are considered together. Accordingly, the pixel shifting combination/resolution enhancement processing unit 21 that receives the first visible video signal V 1 VD and the second visible video signal V 2 VD can generate the high-resolution video signal VVD, which has higher resolution than a video signal in a case where the pixel shifting by substantially one pixel is not performed, by selectively using the light transmitted through the green (G) color filter, which has the highest ratio of contributing to the resolution of the luminance signal in each pixel, of the color filters BYR 1 and BYR 2 of the Bayer array stacked in two layers (refer to FIG. 3A ).
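The green-coverage effect of the one-pixel offset can be checked directly. Assuming a standard RGGB Bayer pattern (one common convention; the patent does not name the exact layout), shifting the second layer by one pixel horizontally places a green (G) filter over every red (R) or blue (B) pixel of the first layer:

```python
BAYER = [["R", "G"], ["G", "B"]]  # RGGB pattern, repeated in both directions

def cfa(y, x):
    """Filter color of the first layer (color filter BYR1) at pixel (y, x)."""
    return BAYER[y % 2][x % 2]

def cfa_shifted(y, x):
    """Second layer (color filter BYR2), optically offset by one pixel."""
    return cfa(y, x + 1)

# Every pixel position sees green on at least one of the two layers,
# so a G sample is available everywhere for the luminance signal.
green_everywhere = all(cfa(y, x) == "G" or cfa_shifted(y, x) == "G"
                       for y in range(8) for x in range(8))
```

A half-pixel shift has no such complementary relationship between the two filter grids, which is one way to see why it produces the false color and moire described for FIG. 3B.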
- FIG. 3B is an explanatory diagram of a problem in a case where the color filters BYR 1 and BYR 2 of the imaging elements 151 and 152 are configured in the Bayer array and disposed with the half pixel shifting.
- the horizontal axis and the vertical axis of FIG. 3B are both frequencies, where fs indicates sampling frequency and fs/2 indicates Nyquist frequency.
- When the color filters of the imaging elements 151 and 152 are stacked and disposed with the pixel shifting offset by the half pixel, false color or moire, which is not present in the subject, is detected near the Nyquist frequency (fs/2), as shown in FIG. 3B . When such false color or moire is detected, the image quality of the color video signal deteriorates.
- For this reason, the color filters BYR 1 and BYR 2 are disposed with the optical offset of one pixel as shown in FIG. 3A .
- FIG. 5 is a graph showing an example of a relationship between visible light division ratio and sensitivity GAN 1 , dynamic range DRG 1 , and resolution RSO 1 in a case where exposure times of the second visible light V 2 and the first visible light V 1 are the same.
- the horizontal axis of FIG. 5 is the visible light division ratio.
- the visible light division ratio is a ratio at which the beam splitter BSP 1 reflects the visible light transmitted through the dichroic mirror DYM 1 .
- For example, when the visible light division ratio is 10%, the beam splitter BSP 1 reflects 10% of the visible light transmitted through the dichroic mirror DYM 1 and transmits the remaining 90% thereof. That is, the ratio of the light amount of the second visible light V 2 to the light amount of the first visible light V 1 is 90:10.
- Another visible light division ratio can be considered in the same manner as the specific example described above.
- the vertical axis of FIG. 5 shows the sensitivity GAN 1 , the dynamic range DRG 1 , and the resolution RSO 1 of the high-resolution video signal VVD generated by the video signal processing unit 17 .
- FIG. 5 shows an example in which the exposure times for the imaging elements 152 and 151 by the electronic shutter are controlled to be the same. Therefore, it is considered that the sensitivity GAN 1 transitions according to a characteristic (for example, a linear function) in which the sensitivity increases as the visible light division ratio decreases (for example, the sensitivity is the maximum (100%) and the brightest when the ratio is 0%) and the sensitivity is the minimum (the darkest) when the ratio is 50%.
- The sensitivity is determined by the brightness of the brighter of the two signals, that is, the second visible video signal V 2 VD based on the second visible light V 2 rather than the first visible video signal V 1 VD based on the first visible light V 1 .
- the dynamic range DRG 1 transitions according to a characteristic that the dynamic range increases similarly as the visible light division ratio is smaller in a range larger than zero (for example, about +80 dB when the ratio is 0.01%) and the dynamic range is the minimum (for example, 0 dB) when the ratio is 50%. This is because a difference between a dark portion and a bright portion tends to widen as the visible light division ratio is smaller in the high-resolution video signal VVD.
- Contrary to the sensitivity, the resolution RSO 1 transitions according to a characteristic in which the resolution is the minimum (for example, 1 time) when the visible light division ratio is 0% and the maximum (for example, 1.1 times) when the ratio is 50%. This is because the difference in pixel value between adjacent pixels becomes smaller as the visible light division ratio becomes larger, which makes it easier to realize high resolution by the pixel shifting.
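The dynamic-range values quoted for FIGS. 5 to 8 are consistent with modeling the dynamic range as 20·log10 of the brightness ratio between the two visible video signals — an interpretation inferred from the stated figures, not an explicit formula in this description:

```python
import math

def dynamic_range_db(division_ratio_pct, exposure_ratio=1.0):
    """Dynamic range (dB) of the combined signal, modeled as 20*log10 of
    the brightness ratio: the light split (100-A):A multiplied by the
    exposure-time ratio of the imaging elements 152 and 151."""
    light_ratio = (100.0 - division_ratio_pct) / division_ratio_pct
    return abs(20.0 * math.log10(light_ratio * exposure_ratio))
```

With equal exposures (FIG. 5), a 0.01% division ratio gives about +80 dB and a 50% ratio gives 0 dB; with a 10:1 exposure ratio (FIG. 6), even the 50% point retains +20 dB, matching the values in the text.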
- FIG. 6 is a graph showing an example of a relationship between visible light division ratio and sensitivity GAN 2 , dynamic range DRG 2 , and resolution RSO 2 in a case where a ratio of the exposure times of the second visible light V 2 and the first visible light V 1 is 10:1.
- the horizontal axis of FIG. 6 is the visible light division ratio, and description thereof will be omitted since the description is the same as that in FIG. 5 .
- the vertical axis of FIG. 6 shows the sensitivity GAN 2 , the dynamic range DRG 2 , and the resolution RSO 2 of the high-resolution video signal VVD generated by the video signal processing unit 17 .
- FIG. 6 shows an example in which a difference is provided such that the ratio of the exposure times for the imaging elements 152 and 151 by the electronic shutter is 10:1. It is considered, as in the case of the sensitivity GAN 1 shown in FIG. 5 , that the sensitivity GAN 2 transitions according to a characteristic (for example, a linear function) in which the sensitivity increases as the visible light division ratio decreases (for example, the sensitivity is the maximum (100%) and the brightest when the ratio is 0%) and the sensitivity is the minimum (the darkest) when the ratio is 50%.
- a brightness ratio of the second visible video signal V 2 VD and the first visible video signal V 1 VD is obtained by multiplying the ratio of the exposure times for the imaging elements 152 and 151 of 10:1 by a light amount ratio of the second visible light V 2 and the first visible light V 1 , and the sensitivity is determined by the brightness of the brighter second visible video signal V 2 VD of the second visible video signal V 2 VD and the first visible video signal V 1 VD.
- the dynamic range DRG 2 transitions according to a characteristic that the dynamic range increases similarly as the visible light division ratio is smaller in a range larger than zero (for example, about +80 dB when the ratio is 0.1%) and the dynamic range is the minimum (for example, +20 dB) when the ratio is 50%. That is, it is possible to gain +20 dB even with a minimum value in the example of FIG. 6 .
- FIG. 7 is a graph showing an example of a relationship between visible light division ratio and sensitivity GAN 2 , dynamic range DRG 3 , and resolution RSO 3 in a case where the ratio of the exposure times of the second visible light V 2 and the first visible light V 1 is 100:1.
- the horizontal axis of FIG. 7 is the visible light division ratio, and description thereof will be omitted since the description is the same as that in FIG. 5 .
- the vertical axis of FIG. 7 shows the sensitivity GAN 2 , the dynamic range DRG 3 , and the resolution RSO 3 of the high-resolution video signal VVD generated by the video signal processing unit 17 .
- FIG. 7 shows an example in which a considerable difference is provided such that the ratio of the exposure times for the imaging elements 152 and 151 by the electronic shutter is 100:1. It is considered, as in the case of the sensitivity GAN 2 shown in FIG. 6 , that the sensitivity GAN 2 transitions according to a characteristic (for example, a linear function) in which the sensitivity increases as the visible light division ratio decreases (for example, the sensitivity is the maximum (100%) and the brightest when the ratio is 0%) and the sensitivity is the minimum (the darkest) when the ratio is 50%.
- a brightness ratio of the second visible video signal V 2 VD and the first visible video signal V 1 VD is obtained by multiplying the ratio of the exposure times for the imaging elements 152 and 151 of 100:1 by a light amount ratio of the second visible light V 2 and the first visible light V 1 , and the sensitivity is determined by the brightness of the brighter second visible video signal V 2 VD of the second visible video signal V 2 VD and the first visible video signal V 1 VD.
- the dynamic range DRG 3 transitions according to a characteristic that the dynamic range increases similarly as the visible light division ratio is smaller in a range larger than zero (for example, about +80 dB when the ratio is 1%) and the dynamic range is the minimum (for example, +40 dB) when the ratio is 50%. That is, it is possible to gain +40 dB even with a minimum value in the example of FIG. 7 .
- FIG. 8 is a graph showing an example of a relationship between visible light division ratio and sensitivity GAN 3 , dynamic range DRG 4 , and resolution RSO 4 in a case where the ratio of the exposure times of the second visible light V 2 and the first visible light V 1 is 1:10.
- the horizontal axis of FIG. 8 is the visible light division ratio, and description thereof will be omitted since the description is the same as that in FIG. 5 .
- the vertical axis of FIG. 8 shows the sensitivity GAN 3 , the dynamic range DRG 4 , and the resolution RSO 4 of the high-resolution video signal VVD generated by the video signal processing unit 17 .
- FIG. 8 shows an example in which a difference is provided such that the ratio of the exposure times for the imaging elements 152 and 151 by the electronic shutter is 1:10.
- the sensitivity GAN 3 transitions according to a characteristic in which the sensitivity is substantially constant at the minimum while the visible light division ratio is from 0% to 10% (in other words, in a case where the light amounts incident on the imaging elements 152 and 151 do not change much) and increases monotonically as a linear function as the visible light division ratio increases from 10% to 50%.
- The brightness is the maximum (50%, that is, −6 dB) when the visible light division ratio is 50%.
- a brightness ratio of the second visible video signal V 2 VD and the first visible video signal V 1 VD is obtained by multiplying the ratio of the exposure times for the imaging elements 152 and 151 of 1:10 by a light amount ratio of the second visible light V 2 and the first visible light V 1 , and the sensitivity is determined by the brightness of a brighter video signal of the second visible video signal V 2 VD and the first visible video signal V 1 VD.
- the dynamic range DRG 4 increases as the visible light division ratio is smaller in a range larger than 0% (for example, about +80 dB at 0.001%).
- When the visible light division ratio is 10%, the brightness of the second visible video signal V 2 VD and the brightness of the first visible video signal V 1 VD are substantially equal, since the visible light division ratio and the ratio of the exposure times for the imaging elements 152 and 151 of 1:10 cancel each other, and the dynamic range DRG 4 is the minimum.
- When the visible light division ratio exceeds 10%, the brightness of the second visible video signal V 2 VD again differs from the brightness of the first visible video signal V 1 VD, and the dynamic range DRG 4 increases.
- When the visible light division ratio is 50%, the ratio of the brightness of the second visible video signal V 2 VD to the brightness of the first visible video signal V 1 VD becomes 1:10 by the multiplication by the ratio of the exposure times for the imaging elements 152 and 151 of 1:10, and the dynamic range is +20 dB.
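The cancellation described for FIG. 8 can be checked numerically. The brightness model below (relative units; an illustration consistent with the quoted figures, not a formula given in the description) multiplies each sensor's share of the light by its exposure time:

```python
def signal_brightness(division_ratio_pct, exposure_v2=1.0, exposure_v1=10.0):
    """Relative brightness of the two visible video signals for FIG. 8's
    1:10 exposure-time ratio (V2:V1). Units are arbitrary."""
    v2 = (100.0 - division_ratio_pct) * exposure_v2  # second visible light V2
    v1 = division_ratio_pct * exposure_v1            # first visible light V1
    return v2, v1

# Around a 10% division ratio, the 9:1 light split and the 1:10 exposure
# ratio nearly cancel (90 vs 100), so the dynamic range bottoms out; at a
# 50% ratio the brightness ratio is 1:10 again, i.e. +20 dB.
```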
- FIG. 9 is a diagram showing a display example of the visible/IR combined video signal IMVVD generated by the 3 MOS camera 1 according to the first embodiment on the monitor MN 1 .
- the visible/IR combined video signal IMVVD shown in FIG. 9 is generated based on imaging at the observation part (for example, around liver and pancreas) of the patient who is the subject and is displayed on the monitor MN 1 .
- the fluorescent reagent of ICG which is administered in advance to the diseased part in a body of the patient before surgery or examination, emits light, and a place that emits the light (for example, diseased part FL 1 ) is shown so as to be known in the visible/IR combined video signal IMVVD.
- the high-resolution video signal VVD having the high resolution is generated by the pixel shifting combination/resolution enhancement processing unit 21 . Therefore, a clear video of the surgical field such as an observation target can be obtained with the visible/IR combined video signal IMVVD.
- the 3 MOS camera 1 can generate the visible/IR combined video signal IMVVD, which allows the user such as doctor to grasp the details of the observation part with high image quality and to easily specify a position of the diseased part, and display the signal on the monitor MN 1 , at the time of surgery or examination, for example.
- the 3 MOS camera 1 is provided with the first prism (for example, IR prism 31 ) that causes the imaging element 153 to receive the IR light of the light L 2 from the observation part (for example, diseased part in the subject), the second prism (for example, visible prism 32 ) that reflects the visible light of A % of the light L 2 from the observation part (for example, diseased part in the subject) and causes the imaging element 151 to receive the remaining (100-A)% thereof, and the third prism (for example, visible prism 33 ) that causes the imaging element 152 to receive the remaining visible light of (100-A)% thereof.
- the 3 MOS camera 1 is provided with the video signal processing unit 17 that combines the color video signal based on the imaging outputs of the imaging element 151 and the imaging element 152 , which are respectively bonded to the positions optically shifted by substantially one pixel, and the IR video signal based on the imaging output of the imaging element 153 , and outputs the combined signal to the monitor MN 1 .
- the 3 MOS camera 1 can separate (split), by the spectral prism 13 , the IR light specialized in a fluorescent region of the fluorescent reagent of the light from the observation part (for example, diseased part) to which the fluorescent reagent (for example, ICG) is administered in advance in the subject such as patient at the time of surgery or examination, for example.
- the 3 MOS camera 1 can generate an RGB color video signal having high resolution based on the imaging outputs of the imaging elements 151 and 152 , which are optically shifted by substantially one pixel, obtained by reflecting the part of the visible light of the light from the observation part and transmitting the remaining visible light thereof on the beam splitter BSP 1 .
- the 3 MOS camera 1 can generate an RGB color video signal with an expanded dynamic range by combining the imaging outputs of the imaging elements 151 and 152 .
- Accordingly, the 3 MOS camera 1 can generate and output a clearer video in both the IR light and the visible light, and thus can achieve both the generation of a clearer fluorescence video of the observation part to which the fluorescent reagent is administered and the resolution enhancement of the color image of the observation part, to assist the doctor or the like in easily grasping the diseased part.
- the first reflection film (for example, dichroic mirror DYM 1 ) that reflects the IR light is formed on the first prism.
- the second reflection film (for example, beam splitter BSP 1 ) that reflects the visible light of A % of the visible light transmitted through the first reflection film and transmits the visible light of (100-A)% thereof is formed on the second prism.
- the visible light of (100-A)% that transmits through the second reflection film is incident on the third prism.
- the dichroic mirror DYM 1 first splits the IR light of the light from the observation part (for example, diseased part), and the visible light transmitted through the dichroic mirror DYM 1 is split by the beam splitter BSP 1 . Therefore, it is possible to improve the efficiency of the splitting in the dichroic mirror DYM 1 and the beam splitter BSP 1 .
- a value of A % and a value of the remaining (100-A)% are substantially equal.
- In this case, the value of A becomes substantially 50, and light having equal brightness is incident on each of the color filters BYR 1 and BYR 2 , which are optically shifted by substantially one pixel. Therefore, the 3 MOS camera 1 can effectively generate the RGB color video signal with the highest resolution.
- the color filter BYR 1 having red (R), green (G), and blue (B) of the imaging element 151 and the color filter BYR 2 having red (R), green (G), and blue (B) of the imaging element 152 are disposed such that the green (G) color filter is located in each pixel.
- the video signal processing unit 17 selects a pixel value based on the green (G) color filter disposed so as to be located in each pixel and mainly uses the selected pixel value to generate the luminance signal among the color video signals.
- the video signal processing unit 17 can generate the high-resolution video signal VVD having high resolution as compared with the video signal in a case where the pixel shifting by substantially one pixel is not performed, by selectively using light transmitted through the green (G) color filter, which has the highest ratio of contributing to resolution of a luminance signal in each pixel, of the color filters BYR 1 and BYR 2 of the Bayer array stacked in two layers (refer to FIG. 3A ).
- the imaging element 152 is disposed so as to be optically shifted by substantially one pixel in at least one of the horizontal direction or the vertical direction with respect to the imaging element 151 . Accordingly, the video signal processing unit 17 can generate the high-resolution video signal VVD by the pixel shifting based on the imaging of the imaging elements 151 and 152 , which are disposed in an optically shifted manner by substantially one pixel (refer to above).
- the 3 MOS camera 1 controls the ratio of the exposure times of the imaging elements 151 and 152 to be the same or different. Accordingly, the 3 MOS camera 1 can generate high-quality video signals that adaptively realize sensitivity, dynamic range, and resolution fitted to the preference of the user according to the ratio of the exposure times of the imaging elements 151 and 152 and the reflectance of the visible light by the beam splitter BSP 1 (refer to FIGS. 5 to 8 ).
- The IR prism 31 is illustrated as an example of the first prism in the first embodiment described above, but the first prism is not limited to the IR prism 31 .
- the first prism may be a prism that reflects the IR light and light in another wavelength band (for example, wavelength band of ultraviolet ray) other than the visible light of the light L 2 .
- a video obtained by combining, for example, a video signal based on imaging of the ultraviolet ray and an RGB color video signal with enhanced resolution and expanded dynamic range can be output to the monitor MN 1 or the like.
- Further, the IR prism 31 need not be disposed closest to the object side.
- the IR prism 31 may be disposed at any of the positions of the visible prisms 32 and 33 .
- By bonding the imaging elements 151 and 152 to the visible prisms 32 and 33 with the optical shift of substantially one pixel, it is possible to obtain the same effect as that of the 3 MOS camera 1 according to the first embodiment described above regardless of the position of the IR prism 31 on the spectral prism 13 .
- the present disclosure is useful as the 3 MOS camera that achieves both the generation of the clearer fluorescence video of the observation part to which the fluorescent reagent is administered and the resolution enhancement of the color image of the observation part to assist the doctor or the like in easily grasping the diseased part.
Description
- The present disclosure relates to a 3 MOS camera.
- In recent years, attention has been paid to a diagnosis method in which, at the time of surgery or examination, ICG (indocyanine green) is administered as a fluorescent reagent into a subject, and the ICG is excited by emission of excitation light or the like to capture and observe a near-infrared fluorescence image emitted by the ICG together with a subject image. For example, JP-A-2016-75825 discloses an imaging device having a blue separation prism that reflects a part of blue component light of incident light and near-infrared light in a specific wavelength region and transmits light other than the above light, a red separation prism that reflects a part of red component light of incident light and near-infrared light in a specific wavelength region and transmits light other than the above light, and a green separation prism into which the light transmitted through the red separation prism is incident.
- In a configuration in JP-A-2016-75825, a partial light amount of the near-infrared light of light from a diseased part or the like is incident on each of the plurality of color separation prisms in a shared manner and imaged. For this reason, for example, there is a problem in that light specialized in the wavelength region of the near-infrared light cannot be received by a corresponding imaging element. Therefore, it is difficult to output a clearer fluorescence image of an observation part to which the fluorescent reagent is administered at the time of surgery or examination described above, and there is room for improvement so that a doctor or the like can more easily grasp the diseased part. In addition, each of the blue, red, and green lights is imaged separately, so there is room for improvement in enhancing the resolution of a video obtained by imaging visible light.
- The present disclosure has been devised in view of the above-mentioned circumstances, and a purpose thereof is to provide a 3 MOS camera that achieves both generation of a clearer fluorescence video of an observation part to which a fluorescent reagent is administered and resolution enhancement of a color image of the observation part to assist a doctor or the like in easily grasping a diseased part.
- The present disclosure provides a 3 MOS camera including a first prism that causes a first image sensor to receive IR light of light from an observation part, a second prism that causes a second image sensor to receive visible light of A % (A: a predetermined real number) of the light from the observation part, a third prism that causes a third image sensor to receive remaining visible light of (100-A)% of the light from the observation part, and a video signal processor that combines a color video signal based on imaging outputs of the second image sensor and the third image sensor and an IR video signal based on an imaging output of the first image sensor and outputs the combined signal to a monitor, the second image sensor and the third image sensor being respectively bonded to positions optically shifted by substantially one pixel.
- According to the present disclosure, it is possible to achieve both the generation of the clearer fluorescence video of the observation part to which the fluorescent reagent is administered and the resolution enhancement of the color image of the observation part, and thus to assist the doctor or the like in easily grasping the diseased part.
-
FIG. 1A is a block diagram showing an internal configuration example of a 3 MOS camera according to a first embodiment. -
FIG. 1B is a block diagram showing another internal configuration example of the 3MOS camera 1 according to the first embodiment. -
FIG. 2 is a diagram showing a structural example of a spectral prism shown inFIG. 1 . -
FIG. 3A is a diagram showing an arrangement example of color filters ofimaging elements -
FIG. 3B is an explanatory diagram of a problem in a case where the color filters of theimaging elements -
FIG. 4A is a graph showing an example of spectral characteristics of a dichroic mirror. -
FIG. 4B is a graph showing an example of spectral characteristics of a beam splitter. -
FIG. 5 is a graph showing an example of a relationship between visible light division ratio and sensitivity, dynamic range, and resolution in a case where exposure times of second visible light and first visible light are the same. -
FIG. 6 is a graph showing an example of the relationship between visible light division ratio and sensitivity, dynamic range, and resolution in a case where a ratio of the exposure times of the second visible light and the first visible light is 10:1. -
FIG. 7 is a graph showing an example of the relationship between visible light division ratio and sensitivity, dynamic range, and resolution in a case where the ratio of the exposure times of the second visible light and the first visible light is 100:1. -
FIG. 8 is a graph showing an example of the relationship between visible light division ratio and sensitivity, dynamic range, and resolution in a case where the ratio of the exposure times of the second visible light and the first visible light is 1:10. -
FIG. 9 is a diagram showing a display example, on a monitor, of a visible/IR combined video signal generated by the 3 MOS camera according to the first embodiment. - Hereinafter, embodiments that specifically disclose a 3 MOS camera according to the present disclosure will be described in detail with reference to the drawings as appropriate.
- However, more detailed description than necessary may be omitted. For example, detailed description of a well-known matter and redundant description of substantially the same configuration may be omitted. This is to prevent the following description from becoming unnecessarily redundant and to facilitate understanding by those skilled in the art. The accompanying drawings and the following description are provided so that those skilled in the art may fully understand the present disclosure, and are not intended to limit the subject matter described in the claims.
-
FIG. 1A is a block diagram showing an internal configuration example of a 3 MOS camera 1 according to a first embodiment. -
FIG. 1B is a block diagram showing another internal configuration example of the 3 MOS camera 1 according to the first embodiment. The 3 MOS camera 1 includes a lens 11, a spectral prism 13, imaging elements 151, 152, and 153, and a video signal processing unit 17. The video signal processing unit 17 includes camera signal processing units 191, 192, and 193, a pixel shifting combination/resolution enhancement processing unit 21, and a visible/IR combination processing unit 23. As shown in FIG. 1B, the 3 MOS camera 1 may include a video signal processing unit 17A (refer to FIG. 1B) having a long/short exposure combination/wide dynamic range processing unit 21A, instead of the video signal processing unit 17 (refer to FIG. 1A). Although not shown, the 3 MOS camera 1 may include both the video signal processing unit 17 (refer to FIG. 1A) and the video signal processing unit 17A (refer to FIG. 1B). Each configuration will be described in detail. - The 3
MOS camera 1 is used for a medical observation system in which excitation light in a predetermined wavelength band (for example, 760 nm to 800 nm) is emitted to a fluorescent reagent (for example, indocyanine green; hereinafter referred to as "ICG") administered in advance to an observation part (for example, a diseased part) in a subject such as a patient, and the observation part that emits fluorescent light on a longer wavelength side (for example, 820 nm to 860 nm) based on the excitation light is imaged, for example, at the time of surgery or examination. An image (for example, a video of the observation part) captured by the 3 MOS camera 1 is displayed on a monitor MN1 (refer to FIG. 9) and assists a user such as a doctor in executing a medical procedure. The spectral prism 13 will be described as an example used in the medical observation system described above. However, its use is not limited to medical usage, and the prism may also be used for industrial usage. - Although not shown in
FIG. 1, a part of the objective side (in other words, the tip side) of the 3 MOS camera 1 with respect to the lens 11 is configured by a scope that is inserted into the observation part (for example, a diseased part; the same applies hereinafter). This scope is, for example, a main portion of a medical instrument such as a rigid endoscope inserted into the observation part, and is a light guide member capable of guiding light L1 from the observation part to the lens 11. - The
lens 11 is attached to the objective side (in other words, the tip side) of the spectral prism 13 and collects the light L1 from the observation part (for example, light reflected at the observation part). Collected light L2 is incident on the spectral prism 13. - The
spectral prism 13 receives the light L2 from the observation part and splits the light into first visible light V1, second visible light V2, and IR light N1. The spectral prism 13 includes an IR prism 31 and visible prisms 32 and 33 (refer to FIG. 2). The first visible light V1 is incident on the imaging element 151 disposed so as to face the visible prism 32. The second visible light V2 is incident on the imaging element 152 disposed so as to face the visible prism 33. The IR light N1 is incident on the imaging element 153 disposed so as to face the IR prism 31. A detailed structural example of the spectral prism 13 will be described below with reference to FIG. 2. - The
imaging element 151 as an example of a second image sensor includes, for example, a charge coupled device (CCD) or a complementary metal oxide semiconductor (CMOS) in which a plurality of pixels suitable for imaging visible light are arranged, and an exposure control circuit (not shown) using an electronic shutter. In the CCD or CMOS constituting the imaging element 151, at least a microlens, a color filter, and a light receiving element are disposed, for example, corresponding to each pixel. The microlens collects incident light (visible light). The color filter transmits visible light of a specific color component (wavelength) transmitted through the microlens. The color filter of the imaging element 151 is disposed in a Bayer array (refer to FIG. 3A) of red (R), green (G), green (G), and blue (B). The specific color component indicates, for example, red (R), green (G), or blue (B). The light receiving element receives light of the specific color component (wavelength) transmitted through the color filter. The imaging element 151 is disposed so as to face the visible prism 32 (refer to FIG. 2). The imaging element 151 captures an image based on the first visible light V1 that is incident for a first exposure time determined by the exposure control circuit based on an exposure control signal CSH1 from the camera signal processing unit 191. The imaging element 151 generates a video signal V1V of the observation part by imaging and outputs the signal to the video signal processing unit 17. - The
imaging element 152 as an example of a third image sensor includes, for example, a CCD or a CMOS in which a plurality of pixels suitable for imaging visible light are arranged, and an exposure control circuit (not shown) using an electronic shutter. In the CCD or CMOS constituting the imaging element 152, at least a microlens, a color filter, and a light receiving element are arranged, for example, corresponding to each pixel. The microlens collects incident light (visible light). The color filter transmits visible light of a specific color component (wavelength) transmitted through the microlens. The color filter of the imaging element 152 is disposed in the Bayer array (refer to FIG. 3A) of red (R), green (G), green (G), and blue (B). The specific color component indicates, for example, red (R), green (G), or blue (B). The light receiving element receives light of the specific color component (wavelength) transmitted through the color filter. The imaging element 152 is disposed so as to face the visible prism 33 (refer to FIG. 2). The imaging element 152 captures an image based on the second visible light V2 that is incident for a second exposure time determined by the exposure control circuit based on an exposure control signal CSH2 from the camera signal processing unit 192. - The
imaging element 152 generates a video signal V2V of the observation part by imaging and outputs the signal to the video signal processing unit 17. - The
imaging element 153 as an example of a first image sensor includes, for example, a CCD or a CMOS in which a plurality of pixels suitable for imaging IR light are arranged. The imaging element 153 is disposed so as to face the IR prism 31 (refer to FIG. 2). The imaging element 153 captures an image based on the incident IR light N1. The imaging element 153 generates a video signal N1V of the observation part by imaging and outputs the signal to the video signal processing unit 17. - The video
signal processing unit 17 includes a processor such as a digital signal processor (DSP) or a field programmable gate array (FPGA). The functions of the camera signal processing units 191 to 193, the pixel shifting combination/resolution enhancement processing unit 21, and the visible/IR combination processing unit 23 are executed by the processor described above. - The camera
signal processing unit 191 performs various types of camera signal processing using the video signal V1V from the imaging element 151 to generate a first visible video signal V1VD of the observation part, and outputs the signal to the pixel shifting combination/resolution enhancement processing unit 21 or the long/short exposure combination/wide dynamic range processing unit 21A. The camera signal processing unit 191 generates the exposure control signal CSH1 for determining the first exposure time of the imaging element 151 and outputs the signal to the imaging element 151. The imaging element 151 controls the exposure time of the first visible light V1 based on the exposure control signal CSH1. - The camera
signal processing unit 192 performs various types of camera signal processing using the video signal V2V from the imaging element 152 to generate a second visible video signal V2VD of the observation part, and outputs the signal to the pixel shifting combination/resolution enhancement processing unit 21 or the long/short exposure combination/wide dynamic range processing unit 21A. Although the details will be described below, the brightness (sensitivity) of the first visible video signal V1VD and the brightness of the second visible video signal V2VD may be substantially the same (including the same) or may be different. In particular, the closer the brightness (sensitivity) of the first visible video signal V1VD and the brightness of the second visible video signal V2VD are to each other, the higher the effect of resolution enhancement is. The camera signal processing unit 192 generates the exposure control signal CSH2 for determining the exposure time of the imaging element 152 and outputs the signal to the imaging element 152. - The
imaging element 152 controls the second exposure time of the second visible light V2 based on the exposure control signal CSH2. Although the details will be described below, the first exposure time and the second exposure time may be the same (refer to FIG. 5) or may be different (refer to FIGS. 6 to 8), and the same applies hereinafter. - The camera
signal processing unit 193 performs various types of camera signal processing using the video signal N1V from the imaging element 153 to generate an IR video signal N1VD of the observation part, and outputs the signal to the visible/IR combination processing unit 23. - The pixel shifting combination/resolution
enhancement processing unit 21 receives two video signals (specifically, the first visible video signal V1VD from the camera signal processing unit 191 and the second visible video signal V2VD from the camera signal processing unit 192). The closer the brightness of the first visible video signal V1VD and the brightness of the second visible video signal V2VD are to each other, the higher the effect of resolution enhancement by the pixel shifting combination/resolution enhancement processing unit 21 is. Combination/pixel interpolation processing is performed in consideration of a spatial positional relationship between the first visible video signal V1VD and the second visible video signal V2VD, and thus it is possible to generate a high-resolution video signal VVD. - The pixel shifting combination/resolution
enhancement processing unit 21 performs combination processing on the received two input video signals (that is, combination of the first visible video signal V1VD generated by the camera signal processing unit 191 based on the imaging of the imaging element 151 bonded to the visible prism 32 and the second visible video signal V2VD generated by the camera signal processing unit 192 based on the imaging of the imaging element 152 bonded to the visible prism 33) to generate the high-resolution video signal VVD. With the combination processing (refer to above) on the received two input video signals, the pixel shifting combination/resolution enhancement processing unit 21 can generate the high-resolution video signal VVD having higher resolution than the first visible video signal V1VD or the second visible video signal V2VD. The pixel shifting combination/resolution enhancement processing unit 21 outputs the high-resolution video signal VVD to the visible/IR combination processing unit 23. The generation of the high-resolution video signal VVD by the pixel shifting combination/resolution enhancement processing unit 21 will be described below with reference to FIG. 3A. - In the 3
MOS camera 1, the videosignal processing unit 17 generates the high-resolution video signal VVD by pixel shifting. Therefore, in the spectral prism 13 (refer toFIG. 2 ), when theimaging element 151 on which the first visible light V1 is incident and theimaging element 152 on which the second visible light V2 is incident are respectively bonded to the correspondingvisible prisms imaging element 151 and theimaging element 152 by substantially one pixel (for example, in the horizontal or vertical direction, or in both directions) to perform the bonding (refer toFIG. 3A ). Accordingly, the pixel shifting combination/resolutionenhancement processing unit 21 can generate the high-resolution video signal VVD by the pixel shifting based on the imaging of theimaging elements enhancement processing unit 21 is. - The long/short exposure combination/wide dynamic
range processing unit 21A receives and superimposes the two video signals having different brightness (sensitivity) (specifically, the first visible video signal V1VD from the camera signal processing unit 191 and the second visible video signal V2VD from the camera signal processing unit 192) for combining the signals to generate a wide dynamic range video signal VVDA. By superimposing and combining the two video signals having different brightness (sensitivity), the long/short exposure combination/wide dynamic range processing unit 21A can generate the wide dynamic range video signal VVDA with an apparently wider dynamic range than the first visible video signal V1VD or the second visible video signal V2VD. The long/short exposure combination/wide dynamic range processing unit 21A outputs the wide dynamic range video signal VVDA to the visible/IR combination processing unit 23. - The visible/IR
combination processing unit 23 receives and superimposes the high-resolution video signal VVD from the pixel shifting combination/resolution enhancement processing unit 21 and the IR video signal N1VD from the camera signal processing unit 193 for combining the signals to generate a visible/IR combined video signal IMVVD. In the visible/IR combined video signal IMVVD, the resolution is enhanced by the combination processing after the pixel shifting. Therefore, a state around the observation part (for example, a surgical field) becomes visually clear, and a state of the diseased part can be clarified in detail by the fluorescent light emission of the fluorescent reagent such as ICG (refer to FIG. 9). The visible/IR combination processing unit 23 may output the visible/IR combined video signal IMVVD to the monitor MN1 or send the signal to a recording device (not shown) for accumulation. - The monitor MN1 constitutes, for example, an image console (not shown) disposed in a surgery room at the time of surgery or examination, and displays the visible/IR combined video signal IMVVD of the observation part generated by the 3
MOS camera 1. Accordingly, the user such as doctor can visually recognize the visible/IR combined video signal IMVVD displayed on the monitor MN1 to grasp in detail the part that emits fluorescent light in the observation part. The recording device is a recorder capable of recording data of the visible/IR combined video signal IMVVD generated by the 3MOS camera 1, for example. -
FIG. 2 is a diagram showing a structural example of the spectral prism 13 shown in FIG. 1. Hereinafter, the structural example of the spectral prism 13 shown in FIG. 1 will be mainly described with reference to FIG. 2. The spectral prism 13 includes the IR prism 31 (an example of a first prism), the visible prism 32 (an example of a second prism), and the visible prism 33 (an example of a third prism). The IR prism 31, the visible prism 32, and the visible prism 33 are sequentially assembled in an optical axis direction of the light L2 collected by the lens 11. - The
IR prism 31 as an example of the first prism includes an incident surface 31a on which the light L2 is incident, a reflection surface 31b on which a dichroic mirror DYM1 that reflects the IR light of the light L2 is formed, and an emission surface 31c from which the IR light is emitted. The dichroic mirror DYM1 (an example of a first reflection film) is formed on the reflection surface 31b by vapor deposition or the like, reflects the IR light (for example, IR light in the wavelength band of 800 nm or more) of the light L2, and transmits light (for example, light of about 400 nm to 800 nm) other than the IR light of the light L2 (refer to FIG. 4A). Specifically, the IR light (refer to above) of the light L2 incident on the incident surface 31a of the IR prism 31 is reflected by the reflection surface 31b. This IR light is reflected by the reflection surface 31b, is then totally reflected by the incident surface 31a of the IR prism 31, and is incident on the imaging element 153 through the emission surface 31c. -
FIG. 4A is a graph showing an example of spectral characteristics of the dichroic mirror DYM1. The horizontal axis of FIG. 4A indicates wavelength [nm: nanometer (the same applies hereinafter)], and the vertical axis indicates reflectance or transmittance. A characteristic TP1 indicates the transmittance of the dichroic mirror DYM1. According to the characteristic TP1, the dichroic mirror DYM1 can transmit the light of about 400 nm to 800 nm. A characteristic RF1 indicates the reflectance of the dichroic mirror DYM1. According to the characteristic RF1, the dichroic mirror DYM1 can reflect the IR light of 800 nm or more. Therefore, all the IR light having a light amount indicated by an area AR1 (in other words, the IR light of the light L2) can be incident on the imaging element 153. - The
visible prism 32 as an example of the second prism includes an incident surface 32a on which the light (an example of first transmitted light) transmitted through the dichroic mirror DYM1 is incident, a reflection surface 32b on which a beam splitter BSP1 for reflecting a partial light amount of the transmitted light (specifically, visible light) is formed, and an emission surface 32c from which the reflected visible light of the partial light amount is emitted. The beam splitter BSP1 (an example of a second reflection film) is formed on the reflection surface 32b by vapor deposition or the like, reflects visible light having a partial light amount (for example, around A% of the light incident on the incident surface 32a; A is a predetermined real number, for example, 50) of the visible light incident on the incident surface 32a, and transmits visible light having a remaining light amount of (100-A)% (for example, around 50% of the light incident on the incident surface 32a) thereof (refer to FIG. 4B). Specifically, the visible light having the partial (for example, 50%) light amount of the visible light incident on the incident surface 32a of the visible prism 32 is reflected by the reflection surface 32b. This part of the visible light is reflected by the reflection surface 32b, is then totally reflected by the incident surface 32a of the visible prism 32, and is incident on the imaging element 151 through the emission surface 32c. In the spectral prism 13 shown in FIG. 1, the ratio of visible light reflected by the beam splitter BSP1 is not limited to 50% and may be in a range of 30% to 50%, for example. - The
visible prism 33 as an example of the third prism has an incident surface 33a on which the visible light having the remaining light amount transmitted through the beam splitter BSP1 is incident and an emission surface 33c from which the visible light having the remaining light amount is emitted. Specifically, the visible light having the remaining light amount transmitted through the beam splitter BSP1 is incident on the visible prism 33, is emitted as it is, and is incident on the imaging element 152 (refer to FIG. 4B). -
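The light budget through the spectral prism 13 described above can be sketched as a simple model. The function below is an idealized assumption (step-function cutoffs at 400 nm and 800 nm, lossless surfaces), not the actual spectral curves of FIG. 4A and FIG. 4B.

```python
def split_light(wavelength_nm: float, power: float, a_percent: float = 50.0):
    """Return the power arriving at (imaging element 151, 152, 153) for a
    monochromatic component, following the behavior described for the
    dichroic mirror DYM1 and the beam splitter BSP1.

    a_percent is the visible division ratio A: the share reflected by BSP1
    toward element 151 (first visible light V1); the remaining (100 - A)%
    is transmitted toward element 152 (second visible light V2).
    """
    if wavelength_nm >= 800.0:           # IR: reflected by DYM1 to element 153
        return (0.0, 0.0, power)
    if wavelength_nm < 400.0:            # outside the modeled band: discarded
        return (0.0, 0.0, 0.0)
    to_151 = power * a_percent / 100.0            # reflected by BSP1 (V1)
    to_152 = power * (100.0 - a_percent) / 100.0  # transmitted by BSP1 (V2)
    return (to_151, to_152, 0.0)

print(split_light(550.0, 1.0))   # visible light: split between 151 and 152
print(split_light(850.0, 1.0))   # ICG fluorescence band: all to element 153
```

Changing `a_percent` to a value in the 30% to 50% range mentioned above shifts the visible split without affecting the IR path.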
FIG. 4B is a graph showing an example of spectral characteristics of the beam splitter BSP1. The horizontal axis of FIG. 4B indicates wavelength [nm], and the vertical axis indicates reflectance or transmittance. A characteristic TP2 indicates the transmittance and reflectance (about 50% at 400 nm to 800 nm) of the beam splitter BSP1 in the spectral prism 13 shown in FIG. 2. With the characteristic TP2, the beam splitter BSP1 as an example of the second reflection film can reflect light having a light amount of about 50% (mainly visible light) of the light of about 400 nm to 800 nm and can transmit light having a remaining light amount of about 50% (mainly visible light) thereof. Therefore, visible light having a light amount indicated by an area AR2 (for example, visible light having a light amount of about 50%) can be incident on the imaging element 151. The visible light having the light amount indicated by the area AR2 (for example, visible light having a light amount of about 50%) can be incident on the imaging element 152. - Next, the arrangement of color filters BYR1 and BYR2 of the
imaging elements 151 and 152 will be described with reference to FIG. 3A. FIG. 3A is a diagram showing an arrangement example of the color filters BYR1 and BYR2 of the imaging elements 151 and 152. The color filter BYR1 is a color filter constituting the imaging element 151 and is disposed in the Bayer array consisting of the color filters of red (R), green (G), green (G), and blue (B) in any four adjacent pixels in the horizontal and vertical directions, for example. In the Bayer array, more green (G) is disposed than red (R) and blue (B) in any four pixels. -
imaging element 152 and is disposed in the Bayer array consisting of the color filters of red (R), green (G), green (G), and blue (B) in any four adjacent pixels in the horizontal and vertical directions, for example. - As shown in
FIG. 3A, the imaging elements 151 and 152 are bonded at positions optically shifted from each other by one pixel. Although FIG. 3A shows an example in which the offset of one pixel is added, the color filters BYR1 and BYR2 may be disposed with an offset of substantially one pixel (refer to above). Therefore, with the pixel shifting of the offset of substantially one pixel (refer to above), the green (G) pixel of one Bayer array (for example, the color filter BYR1) is disposed on the blue (B) pixel or the red (R) pixel of the other Bayer array (for example, the color filter BYR2). In other words, the green (G) color filter is disposed for all pixels. Accordingly, the pixel shifting combination/resolution enhancement processing unit 21 that receives the first visible video signal V1VD and the second visible video signal V2VD can generate the high-resolution video signal VVD having high resolution as compared with a video signal in a case where the pixel shifting by substantially one pixel is not performed, by selectively using light transmitted through the green (G) color filter, which has the highest ratio of contributing to the resolution of a luminance signal in each pixel, of the color filters BYR1 and BYR2 of the Bayer array stacked in two layers (refer to FIG. 3A). - A problem in a case where the color filters of the
imaging elements 151 and 152 are disposed without the offset of one pixel will be described with reference to FIG. 3B. FIG. 3B is an explanatory diagram of the problem in the case where the color filters BYR1 and BYR2 of the imaging elements 151 and 152 are disposed without the optical offset of one pixel. The horizontal axis and the vertical axis of FIG. 3B are both frequencies, where fs indicates a sampling frequency and fs/2 indicates the Nyquist frequency. -
imaging elements FIG. 3B . When such false color or moire is detected, the image quality of the color video signal deteriorates. On the other hand, in order to solve such a problem, in the first embodiment, the color filters BYR1 and BYR2 are disposed with the optical offset of one pixel as shown inFIG. 3A . Accordingly, in the high-resolution video signal VVD generated by the pixel shifting combination/resolutionenhancement processing unit 21, there is no detection of the false color or moire as shown inFIG. 3B near the Nyquist frequency (fs/2) and the image quality is accurately enhanced. -
FIG. 5 is a graph showing an example of a relationship between the visible light division ratio and sensitivity GAN1, dynamic range DRG1, and resolution RSO1 in a case where the exposure times of the second visible light V2 and the first visible light V1 are the same. The horizontal axis of FIG. 5 is the visible light division ratio. In other words, the visible light division ratio is a ratio at which the beam splitter BSP1 reflects the visible light transmitted through the dichroic mirror DYM1. For example, in a case where the visible light division ratio is 10% (that is, 90:10), the beam splitter BSP1 reflects the visible light of 10% of the visible light transmitted through the dichroic mirror DYM1 and transmits the visible light of 90% thereof. That is, the ratio of the light amount of the second visible light V2 to the light amount of the first visible light V1 is 90:10. Another visible light division ratio can be considered in the same manner as the specific example described above. The vertical axis of FIG. 5 shows the sensitivity GAN1, the dynamic range DRG1, and the resolution RSO1 of the high-resolution video signal VVD generated by the video signal processing unit 17. -
FIG. 5 shows an example in which the exposure times for the imaging elements 151 and 152 are the same. It is considered that the sensitivity GAN1 transitions according to a characteristic (for example, a linear function) that the sensitivity is the maximum as the visible light division ratio is smaller (for example, the maximum (100%) and the brightest when the ratio is 0%) and the sensitivity is the minimum (for example, the darkest at 50%) when the ratio is 50%.
- It is considered that the dynamic range DRG1 transitions according to a characteristic that the dynamic range increases as the visible light division ratio becomes smaller in a range larger than zero (for example, about +80 dB when the ratio is 0.01%) and the dynamic range is the minimum (for example, 0 dB) when the ratio is 50%. This is because, in the high-resolution video signal VVD, a difference between a dark portion and a bright portion tends to widen as the visible light division ratio becomes smaller.
- It is considered that the resolution RSO1 transitions according to a characteristic that the resolution is the minimum as the visible light division ratio is smaller (for example, 1 time when the ratio is 0%) and the resolution is the maximum (for example, 1.1 times) when the ratio is 50%. This is because a difference in pixel value between adjacent pixels becomes smaller as the visible light division ratio is larger, and thus it is easier to realize high resolution by pixel shifting. -
FIG. 6 is a graph showing an example of a relationship between the visible light division ratio and sensitivity GAN2, dynamic range DRG2, and resolution RSO2 in a case where a ratio of the exposure times of the second visible light V2 and the first visible light V1 is 10:1. The horizontal axis of FIG. 6 is the visible light division ratio, and description thereof will be omitted since the description is the same as that in FIG. 5. The vertical axis of FIG. 6 shows the sensitivity GAN2, the dynamic range DRG2, and the resolution RSO2 of the high-resolution video signal VVD generated by the video signal processing unit 17. -
FIG. 6 shows an example in which a difference is provided such that the ratio of the exposure times for the imaging elements 152 and 151 is 10:1. It is considered, as in the case of FIG. 5, that the sensitivity GAN2 transitions according to a characteristic (for example, a linear function) that the sensitivity is the maximum as the visible light division ratio is smaller (for example, the maximum (100%) and the brightest when the ratio is 0%) and the sensitivity is the minimum (for example, the darkest at 50%) when the ratio is 50%. This is because a brightness ratio of the second visible video signal V2VD and the first visible video signal V1VD is obtained by multiplying the visible light division ratio by the ratio of the exposure times for the imaging elements 152 and 151. -
imaging elements FIG. 6 . - When the difference is provided such that the ratio of the exposure times for the
imaging elements imaging element 151 =100:1 in a case where the visible light division ratio is 10% (the ratio second visible light V2:first visible light V1 =90:10). That is, the dark portion is hardly projected by the first visible light V1 and the bright portion is hardly projected by the second visible light V2, and thus it can be considered that it is almost difficult to gain a resolution when two video signals are superimposed. Therefore, it is considered that the resolution RSO2 transitions over small values (for example, the minimum of 1 time at 0% and about 1.02 times at 50%) regardless of the visible light division ratio. -
FIG. 7 is a graph showing an example of a relationship between the visible light division ratio and sensitivity GAN2, dynamic range DRG3, and resolution RSO3 in a case where the ratio of the exposure times of the second visible light V2 and the first visible light V1 is 100:1. The horizontal axis of FIG. 7 is the visible light division ratio, and description thereof will be omitted since the description is the same as that in FIG. 5. The vertical axis of FIG. 7 shows the sensitivity GAN2, the dynamic range DRG3, and the resolution RSO3 of the high-resolution video signal VVD generated by the video signal processing unit 17. -
FIG. 7 shows an example in which a considerable difference is provided such that the ratio of the exposure times for the imaging elements 152 and 151 is 100:1. It is considered, as in the case of FIG. 6, that the sensitivity GAN2 transitions according to a characteristic (for example, a linear function) that the sensitivity is the maximum as the visible light division ratio is smaller (for example, the maximum (100%) and the brightest when the ratio is 0%) and the sensitivity is the minimum (for example, the darkest at 50%) when the ratio is 50%. This is because a brightness ratio of the second visible video signal V2VD and the first visible video signal V1VD is obtained by multiplying the visible light division ratio by the ratio of the exposure times for the imaging elements 152 and 151. -
imaging elements FIG. 7 . - When the difference is provided such that the ratio of the exposure times for the
imaging elements imaging element 151 =1000:1 in the case where the visible light division ratio is 10% (the ratio second visible light V2:first visible light V1 =90:10). That is, the dark portion is hardly projected since the second visible light V2 is too bright and the bright portion is hardly projected since the first visible light V1 is too dark, and thus it can be considered that it is almost difficult to gain a resolution when two video signals are superimposed as compared with the example ofFIG. 6 . Therefore, it is considered that the resolution RSO3 transitions over small values (for example, the minimum of 1 time at 0% and about 1.001 times at 50%) regardless of the visible light division ratio. -
FIG. 8 is a graph showing an example of a relationship between the visible light division ratio and sensitivity GAN3, dynamic range DRG4, and resolution RSO4 in a case where the ratio of the exposure times of the second visible light V2 and the first visible light V1 is 1:10. The horizontal axis of FIG. 8 is the visible light division ratio, and description thereof will be omitted since the description is the same as that in FIG. 5. The vertical axis of FIG. 8 shows the sensitivity GAN3, the dynamic range DRG4, and the resolution RSO4 of the high-resolution video signal VVD generated by the video signal processing unit 17. -
FIG. 8 shows an example in which a difference is provided such that the ratio of the exposure times for the imaging elements 152 and 151 is 1:10.
- Contrary to the example of FIG. 6, when the difference is provided such that the ratio of the exposure times for the imaging elements 152 and 151 is 1:10, the light amount of light incident on the imaging element 152 and the light amount of light incident on the imaging element 151 are substantially equal, due to cancellation of the visible light division ratio and the exposure time ratio, in the case where the visible light division ratio is 10% (second visible light V2:first visible light V1 = 90:10), for example. Therefore, it is considered that the sensitivity GAN3 transitions according to a characteristic in which the sensitivity is substantially constant at the minimum while the visible light division ratio is from 0% to 10% (in other words, in the range where the light amounts of light incident on the imaging elements 151 and 152 are substantially equal) and increases as the visible light division ratio becomes larger.
- When a difference is provided such that the ratio of the exposure times for the imaging elements 152 and 151 is 1:10, the dynamic range DRG4 transitions as shown in FIG. 8.
- When the difference is provided such that the ratio of the exposure times for the imaging elements 152 and 151 is 1:10, the light amount of light incident on the imaging element 152 and the light amount of light incident on the imaging element 151 are substantially equal in the case where the visible light division ratio is 10% (the ratio of the second visible light V2 to the first visible light V1 is 90:10), for example (refer to above). That is, when the cancellation of the visible light division ratio and the exposure time ratio (1:10) occurs (for example, when the visible light division ratio is 10%), the first visible video signal V1VD based on the first visible light V1 and the second visible video signal V2VD based on the second visible light V2 have the same brightness. Therefore, it is considered that the resolution RSO4 transitions according to a characteristic in which the resolution is the maximum when the cancellation occurs and decreases from the maximum value at visible light division ratios at which the cancellation is less likely to occur.
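The interplay of the visible light division ratio and the exposure-time ratio described above can be checked numerically. The sketch below is illustrative (the function and variable names are mine, not the patent's): the relative brightness of the second visible video signal V2VD against the first visible video signal V1VD is the product of the light split and the exposure-time ratio.

```python
def v2_to_v1_brightness(division_pct, t2, t1):
    """Relative brightness of V2VD vs V1VD.

    division_pct: visible light division ratio in percent (share of the
                  light that becomes the first visible light V1); the
                  second visible light V2 receives the remainder.
    t2, t1: exposure times of the imaging elements receiving V2 and V1.
    """
    v2_light = 100 - division_pct   # light amount toward V2 (e.g. 90)
    v1_light = division_pct         # light amount toward V1 (e.g. 10)
    return (v2_light * t2) / (v1_light * t1)

# Exposure-time ratio 1:10 at a 10% division ratio: the two factors cancel,
# so the two visible video signals have nearly the same brightness.
balanced = v2_to_v1_brightness(10, t2=1, t1=10)     # (90*1)/(10*10) = 0.9

# A considerable exposure-time difference (100:1 here, assumed purely for
# illustration) instead drives the two signals orders of magnitude apart.
imbalanced = v2_to_v1_brightness(10, t2=100, t1=1)  # (90*100)/(10*1) = 900.0
```

Near-unity values correspond to the cancellation at which the resolution peaks; values in the hundreds correspond to the regime where one signal is far too bright and the other far too dark, so superimposing them gains little resolution.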
FIG. 9 is a diagram showing a display example of the visible/IR combined video signal IMVVD generated by the 3 MOS camera 1 according to the first embodiment on the monitor MN1. The visible/IR combined video signal IMVVD shown in FIG. 9 is generated based on imaging of the observation part (for example, around the liver and pancreas) of the patient who is the subject and is displayed on the monitor MN1. In FIG. 9, the fluorescent reagent of ICG, which is administered in advance to the diseased part in the body of the patient before surgery or examination, emits light, and the place that emits the light (for example, the diseased part FL1) is shown so as to be identifiable in the visible/IR combined video signal IMVVD. The high-resolution video signal VVD is generated by the pixel shifting combination/resolution enhancement processing unit 21, so a clear video of the surgical field, such as an observation target, can be obtained with the visible/IR combined video signal IMVVD. In this manner, the 3 MOS camera 1 can generate the visible/IR combined video signal IMVVD, which allows a user such as a doctor to grasp the details of the observation part with high image quality and to easily specify the position of the diseased part, and display the signal on the monitor MN1 at the time of surgery or examination, for example. - As described above, the 3
MOS camera 1 according to the first embodiment is provided with the first prism (for example, the IR prism 31) that causes the imaging element 153 to receive the IR light of the light L2 from the observation part (for example, the diseased part in the subject), the second prism (for example, the visible prism 32) that reflects the visible light of A % of the light L2 from the observation part toward the imaging element 151, and the third prism (for example, the visible prism 33) that causes the imaging element 152 to receive the remaining visible light of (100-A)%. The 3 MOS camera 1 is further provided with the video signal processing unit 17 that combines the color video signal, based on the imaging outputs of the imaging element 151 and the imaging element 152, which are respectively bonded at positions optically shifted by substantially one pixel, with the IR video signal based on the imaging output of the imaging element 153, and outputs the combined signal to the monitor MN1. - Accordingly, the 3
MOS camera 1 can separate (split), by the spectral prism 13, the IR light in the fluorescent region of the fluorescent reagent from the light arriving from the observation part (for example, the diseased part) to which the fluorescent reagent (for example, ICG) is administered in advance in a subject such as a patient, at the time of surgery or examination, for example. - The 3
MOS camera 1 can generate an RGB color video signal having high resolution based on the imaging outputs of the imaging elements 151 and 152. The 3 MOS camera 1 can also generate an RGB color video signal with an expanded dynamic range by combining the imaging outputs of the imaging elements 151 and 152. Accordingly, the 3 MOS camera 1 can generate and output clearer images in both the IR light and the visible light, and thus achieve both the generation of a clearer fluorescence video of the observation part to which the fluorescent reagent is administered and the resolution enhancement of the color image of the observation part, to assist the doctor or the like in easily grasping the diseased part.
- The first reflection film (for example, the dichroic mirror DYM1) that reflects the IR light is formed on the first prism. The second reflection film (for example, the beam splitter BSP1) that reflects the visible light of A % of the visible light transmitted through the first reflection film and transmits the visible light of (100-A)% thereof is formed on the second prism. The visible light of (100-A)% that passes through the second reflection film is incident on the third prism. The dichroic mirror DYM1 first splits off the IR light of the light from the observation part (for example, the diseased part), and the visible light transmitted through the dichroic mirror DYM1 is then split by the beam splitter BSP1. Therefore, it is possible to improve the efficiency of the splitting in the dichroic mirror DYM1 and the beam splitter BSP1.
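To make the dynamic-range expansion mentioned above concrete, here is a minimal sketch (my own illustration, not the patent's actual processing) of merging two frames of the same scene taken with different exposure times: dark regions keep the long-exposure values, while regions that clip in the long exposure fall back to the short exposure, rescaled.

```python
import numpy as np

def merge_exposures(short, long_, t_short, t_long, clip=255):
    """Naive merge of two differently exposed frames of the same scene.

    Where the long exposure saturates, substitute the short-exposure
    sample scaled up to the long exposure's brightness scale.
    """
    scale = t_long / t_short                  # gain of long vs short frame
    merged = long_.astype(float)
    saturated = long_ >= clip                 # long exposure clipped here
    merged[saturated] = short.astype(float)[saturated] * scale
    return merged

# A bright pixel clipped in the long frame is recovered from the short
# frame; a dark pixel keeps the long frame's better-exposed value.
short = np.array([30, 2], dtype=np.uint8)     # exposure time 1
long_ = np.array([255, 20], dtype=np.uint8)   # exposure time 10
out = merge_exposures(short, long_, t_short=1, t_long=10)  # [300.0, 20.0]
```

The merged values exceed the 8-bit range of either input frame, which is exactly the sense in which the combined signal has an expanded dynamic range.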
- The value of A % and the value of the remaining (100-A)% are substantially equal. That is, A becomes substantially 50, and light having equal brightness is incident on each of the color filters BYR1 and BYR2, which are optically shifted by substantially one pixel. Therefore, the 3
MOS camera 1 can effectively generate the highest resolution RGB color video signal. - The color filter BYR1 having red (R), green (G), and blue (B) of the
imaging element 151 and the color filter BYR2 having red (R), green (G), and blue (B) of the imaging element 152 are disposed such that a green (G) color filter is located at each pixel position. The video signal processing unit 17 selects the pixel value based on the green (G) color filter located at each pixel position and mainly uses the selected pixel value to generate the luminance signal among the color video signals. Accordingly, the video signal processing unit 17 can generate the high-resolution video signal VVD having higher resolution than a video signal obtained without the pixel shifting by substantially one pixel, by selectively using light transmitted through the green (G) color filter, which has the highest ratio of contribution to the resolution of the luminance signal at each pixel, of the color filters BYR1 and BYR2 of the Bayer array stacked in two layers (refer to FIG. 3A).
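The green-first luminance generation can be pictured as follows: because the two Bayer filters are offset by one pixel, every pixel position is covered by a green (G) filter on one sensor or the other. The sketch below is a simplified illustration (the checkerboard phase and all names are assumptions, not taken from the patent):

```python
import numpy as np

def green_luminance(frame1, frame2):
    """Build a full-resolution green plane from two pixel-shifted Bayer frames.

    frame1: sensor behind color filter BYR1, with G assumed on the
            even-parity checkerboard (row + col even), as in a GRBG layout.
    frame2: sensor shifted by one pixel, so its G sites fall exactly on
            frame1's non-G sites.
    """
    rows, cols = np.indices(frame1.shape)
    g1_sites = (rows + cols) % 2 == 0        # green sites of frame1
    # At every pixel, keep whichever sensor saw the scene through green.
    return np.where(g1_sites, frame1, frame2)

f1 = np.arange(16).reshape(4, 4)             # stand-in for BYR1 samples
f2 = 100 + np.arange(16).reshape(4, 4)       # stand-in for BYR2 samples
luma = green_luminance(f1, f2)               # first row: [0, 101, 2, 103]
```

In the actual camera, the selected green samples would mainly drive the luminance signal, while the red and blue samples would still be demosaiced for the chrominance components.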
- The
imaging element 152 is disposed so as to be optically shifted by one pixel in at least one of the horizontal direction or the vertical direction with respect to the imaging element 151. Accordingly, the video signal processing unit 17 can generate the high-resolution video signal VVD by the pixel shifting based on the imaging of the imaging elements 151 and 152. - The 3
MOS camera 1 controls the ratio of the exposure times of the imaging elements 151 and 152. Accordingly, the 3 MOS camera 1 can generate high-quality video signals that adaptively realize the sensitivity, dynamic range, and resolution fitted to the preference of the user according to the ratio of the exposure times of the imaging elements 151 and 152 (refer to FIGS. 5 to 8).
- Although various embodiments have been described with reference to the drawings, it goes without saying that the present disclosure is not limited to such examples. It is obvious to those skilled in the art that various modifications, changes, substitutions, additions, deletions, and equivalents can be conceived within the scope of the claims, and it is understood that they also belong to the technical scope of the present disclosure. Further, the constituent elements in the various embodiments described above may be combined in any manner within a scope not departing from the spirit of the invention.
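The user-preference control of the exposure-time ratio described above can be pictured as a small policy table. Everything below is an assumption for illustration; the mode names and values merely follow the tendencies of FIGS. 5 to 8, where matched brightness of the two Bayer frames favors resolution and strongly different brightness favors dynamic range.

```python
def settings_for(preference):
    """Illustrative (division_ratio_pct, (t2, t1)) choices per preference.

    resolution:    an equal split with equal exposures keeps the two Bayer
                   frames at the same brightness, the condition the text
                   associates with the highest resolution.
    dynamic_range: an equal split with a 1:10 exposure difference keeps the
                   frames about 10x apart in brightness, widening the range.
    """
    table = {
        "resolution": (50, (1, 1)),
        "dynamic_range": (50, (1, 10)),
    }
    return table[preference]

division, (t2, t1) = settings_for("dynamic_range")
```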
- For example, the
IR prism 31 is illustrated as an example of the first prism in the first embodiment described above, but the first prism is not limited to the IR prism 31. For example, as long as it is not a visible prism that reflects the visible light, the first prism may be a prism that reflects, instead of the IR light, light in another wavelength band (for example, the ultraviolet wavelength band) other than the visible light of the light L2. Accordingly, instead of the IR video signal, a video obtained by combining, for example, a video signal based on imaging of the ultraviolet rays with the RGB color video signal having enhanced resolution and expanded dynamic range can be output to the monitor MN1 or the like. - In the
spectral prism 13 shown in FIG. 2, an example in which the IR prism 31 is disposed closest to the objective side has been described, but the IR prism 31 need not be disposed closest to the objective side. For example, the IR prism 31 may be disposed at either of the positions of the visible prisms 32 and 33, with the corresponding imaging elements rearranged accordingly. Even in this case, the same effect as that of the 3 MOS camera 1 according to the first embodiment described above can be obtained regardless of the position of the IR prism 31 on the spectral prism 13. - The present disclosure is useful as the 3 MOS camera that achieves both the generation of the clearer fluorescence video of the observation part to which the fluorescent reagent is administered and the resolution enhancement of the color image of the observation part, to assist the doctor or the like in easily grasping the diseased part.
- The present application is based upon Japanese Patent Application (Patent Application No. 2020-131042 filed on Jul. 31, 2020), the content of which is incorporated herein by reference.
Claims (5)
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2020-131042 | 2020-07-31 | ||
JP2020131042A JP7477158B2 (en) | 2020-07-31 | 2020-07-31 | 3-chip camera |
Publications (2)
Publication Number | Publication Date |
---|---|
US20220038610A1 true US20220038610A1 (en) | 2022-02-03 |
US11252382B1 US11252382B1 (en) | 2022-02-15 |
Family
ID=79300701
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/132,554 Active US11252382B1 (en) | 2020-07-31 | 2020-12-23 | 3 MOS camera |
Country Status (3)
Country | Link |
---|---|
US (1) | US11252382B1 (en) |
JP (1) | JP7477158B2 (en) |
DE (1) | DE102021119417A1 (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP6765494B1 (en) * | 2019-10-31 | 2020-10-07 | パナソニックi−PROセンシングソリューションズ株式会社 | 3-plate camera |
Family Cites Families (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6046772A (en) * | 1997-07-24 | 2000-04-04 | Howell; Paul | Digital photography device and method |
JP5086535B2 (en) * | 2005-11-21 | 2012-11-28 | オリンパスメディカルシステムズ株式会社 | Two-plate imaging device |
US20110292258A1 (en) * | 2010-05-28 | 2011-12-01 | C2Cure, Inc. | Two sensor imaging systems |
WO2014083489A1 (en) * | 2012-11-28 | 2014-06-05 | Corephotonics Ltd. | High-resolution thin multi-aperture imaging systems |
JP2015033032A (en) * | 2013-08-05 | 2015-02-16 | キヤノン株式会社 | Optical equipment and pixel deviated image acquisition method |
JP2016075825A (en) | 2014-10-07 | 2016-05-12 | パナソニックIpマネジメント株式会社 | Color separation prism and imaging apparatus |
JP2016096430A (en) * | 2014-11-13 | 2016-05-26 | パナソニックIpマネジメント株式会社 | Imaging device and imaging method |
US20160292506A1 (en) * | 2015-04-06 | 2016-10-06 | Heptagon Micro Optics Pte. Ltd. | Cameras having an optical channel that includes spatially separated sensors for sensing different parts of the optical spectrum |
JP2018027272A (en) * | 2016-08-19 | 2018-02-22 | ソニー株式会社 | Imaging System |
JP7095693B2 (en) * | 2017-05-30 | 2022-07-05 | ソニーグループ株式会社 | Medical observation system |
WO2018235225A1 (en) * | 2017-06-22 | 2018-12-27 | オリンパス株式会社 | Image capturing device, image capturing method, and program |
JP7025177B2 (en) * | 2017-10-20 | 2022-02-24 | キヤノン株式会社 | Imaging device |
JP6765494B1 (en) * | 2019-10-31 | 2020-10-07 | パナソニックi−PROセンシングソリューションズ株式会社 | 3-plate camera |
2020
- 2020-07-31 JP JP2020131042A patent/JP7477158B2/en active Active
- 2020-12-23 US US17/132,554 patent/US11252382B1/en active Active
2021
- 2021-07-27 DE DE102021119417.2A patent/DE102021119417A1/en active Pending
Also Published As
Publication number | Publication date |
---|---|
JP2022027195A (en) | 2022-02-10 |
DE102021119417A1 (en) | 2022-02-03 |
US11252382B1 (en) | 2022-02-15 |
JP7477158B2 (en) | 2024-05-01 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11089271B2 (en) | 3 MOS camera | |
US9872610B2 (en) | Image processing device, imaging device, computer-readable storage medium, and image processing method | |
JP6939000B2 (en) | Imaging device and imaging method | |
CN103068300B (en) | Endoscopic system, control method and camera head | |
JP6072374B2 (en) | Observation device | |
WO2016056157A1 (en) | Color separation prism and imaging device | |
US20130076879A1 (en) | Endoscopic image processing device, endoscope apparatus, and image processing method | |
US20190387964A1 (en) | Endoscope apparatus | |
JP7513318B2 (en) | 3-chip and 4-chip cameras | |
JP2010057547A (en) | Fundus camera | |
JP6121058B2 (en) | Endoscope system and operation method of endoscope system | |
JP6430880B2 (en) | Endoscope system and method for operating endoscope system | |
US11252382B1 (en) | 3 MOS camera | |
US8596784B2 (en) | Opthalmology photographing apparatus | |
WO2017042980A1 (en) | Fluoroscopic apparatus and fluoroscopic endoscope apparatus | |
JP4448320B2 (en) | Electronic endoscope device | |
JP4169957B2 (en) | Electronic endoscope device | |
JPH03159376A (en) | External fitting television camera for endoscope |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: PANASONIC I-PRO SENSING SOLUTIONS CO., LTD., JAPAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:KINIWA, YUJI;HASHIMOTO, YOTA;TAKENAGA, YUUICHI;SIGNING DATES FROM 20201217 TO 20201222;REEL/FRAME:054740/0413 |
|
FEPP | Fee payment procedure |
Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
AS | Assignment |
Owner name: I-PRO CO., LTD., JAPAN Free format text: CHANGE OF NAME;ASSIGNOR:PANASONIC I-PRO SENSING SOLUTIONS CO., LTD.;REEL/FRAME:061599/0874 Effective date: 20220401 |
|
AS | Assignment |
Owner name: I-PRO CO., LTD., JAPAN Free format text: ADDRESS CHANGE;ASSIGNOR:I-PRO CO., LTD.;REEL/FRAME:061828/0323 Effective date: 20221004 |