WO2015145814A1 - Living Body Observation System - Google Patents
- Publication number
- WO2015145814A1 (PCT/JP2014/074083)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- light
- enhancement
- observation
- unit
Classifications
- A—HUMAN NECESSITIES; A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE; A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/0638—Endoscopes with illuminating arrangements providing two or more wavelengths
- A61B1/00006—Operational features of endoscopes characterised by electronic signal processing of control signals
- A61B1/000094—Operational features of endoscopes characterised by electronic signal processing of image signals during use of the endoscope, extracting biological structures
- A61B1/00045—Operational features of endoscopes provided with output arrangements; Display arrangement
- A61B1/00057—Operational features of endoscopes provided with means for testing or calibration
- A61B1/044—Endoscopes combined with photographic or television appliances for absorption imaging
- A61B1/045—Control of endoscopes combined with photographic or television appliances
- A61B1/05—Endoscopes characterised by the image sensor, e.g. camera, being in the distal end portion
- A61B1/0655—Control of illuminating arrangements
- A61B1/0669—Endoscope light sources at proximal end of an endoscope
- A61B1/0676—Endoscope light sources at distal tip of an endoscope
- A61B1/07—Endoscopes with illuminating arrangements using light-conductive means, e.g. optical fibres
- A61B5/14551—Measuring characteristics of blood in vivo using optical sensors, e.g. spectral photometrical oximeters, for measuring blood gases
- A61B5/1459—Measuring characteristics of blood in vivo using optical sensors, invasive, e.g. introduced into the body by a catheter
- G—PHYSICS; G02—OPTICS; G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B23/2484—Instruments for viewing the inside of hollow bodies; non-optical details; arrangements in relation to a camera or imaging device
- G02B23/26—Instruments or systems for viewing the inside of hollow bodies, e.g. fibrescopes, using light guides
Definitions
- The present invention relates to a living body observation system, and more particularly to a living body observation system used for observing living tissue.
- ESD (endoscopic submucosal dissection) is one example of a treatment performed under such observation.
- Japanese Patent No. 5355820 discloses an endoscope system that can be used for ESD.
- Specifically, Japanese Patent No. 5355820 discloses an endoscope system that, within a feature space formed from the wavelength bands or spatial frequency bands of two or more band images selected from a plurality of band images, applies an enhancement process to the selected band images based on an enhancement amount set for the region where the observation target (deep blood vessels under the mucosal surface, the pit pattern of the mucosal surface layer, or blood vessels of the mucosal surface layer) is distributed, and then performs a color conversion process that adjusts the color tone of the plurality of band images, including the two or more enhanced band images.
- In ESD, a technique is also used in which a pigment such as indigo carmine is administered so that the layer boundary between the submucosa and the muscle layer becomes visible.
- The present invention has been made in view of the above circumstances, and an object thereof is to provide a living body observation system capable of reducing the burden on an operator when performing treatment on a lesion site such as cancer.
- A living body observation system of one aspect of the present invention includes: an illumination light generator configured to generate, as light for illuminating a subject including a region to which a pigment having an absorption peak in the red region has been administered, a first light having a peak wavelength in the blue region and a second light having a peak wavelength in the red region; an imaging unit configured to image the return light from the subject illuminated by the light emitted from the illumination light generator; a calculation unit configured to perform processing for calculating, from a first image obtained by imaging the return light of the first light and a second image obtained by imaging the return light of the second light, a degree of difference, which is a value indicating how much the luminance values of the same pixel differ between the two images; an enhancement coefficient calculation unit configured to perform processing for calculating a first enhancement coefficient corresponding to each pixel, among the images obtained by imaging the return light, whose degree of difference calculated by the calculation unit exceeds a predetermined threshold; an enhancement processing unit configured to perform, based on the first enhancement coefficient calculated by the enhancement coefficient calculation unit, processing for enhancing the contrast of those pixels whose degree of difference exceeds the predetermined threshold; and an observation image generation unit configured to generate an observation image using the first image and the second image processed by the enhancement processing unit, and to output the generated observation image to a display device.
- A living body observation system of another aspect of the present invention includes: an illumination light generator configured to generate, as light for illuminating a subject including a region to which a pigment having an absorption peak in the red region has been administered, a first light having a peak wavelength in the blue region and a second light having a peak wavelength in the red region; an imaging unit configured to image the return light from the subject illuminated by the light emitted from the illumination light generator; a calculation unit configured to perform processing for calculating for each pixel, based on a second image obtained by imaging the return light of the second light, a variation amount, which is a value indicating the variation of the luminance value relative to the average luminance value of the second image; an enhancement coefficient calculation unit configured to perform processing for calculating an enhancement coefficient corresponding to each pixel, among the images obtained by imaging the return light, whose variation amount calculated by the calculation unit exceeds a predetermined threshold; an enhancement processing unit configured to perform, based on the enhancement coefficient calculated by the enhancement coefficient calculation unit, processing for enhancing the contrast of those pixels of the first image whose variation amount exceeds the predetermined threshold; and an observation image generation unit configured to generate an observation image using the first image processed by the enhancement processing unit and the second image, and to output the generated observation image to a display device.
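The enhancement flow described in the aspects above can be illustrated with a minimal NumPy sketch using the second aspect's variation amount. The threshold, gain, and the form of the enhancement coefficient are illustrative assumptions; the patent fixes none of these values.

```python
import numpy as np

def enhance_first_image(first_img, second_img, threshold=0.1, gain=2.0):
    """Hypothetical sketch of the claimed pipeline: compute a per-pixel
    variation amount from the second image, then boost the contrast of
    the first image's pixels whose variation exceeds a threshold."""
    avg = second_img.mean()
    # variation amount: deviation of each luminance value from the average
    variation = np.abs(second_img - avg) / avg
    mask = variation > threshold                         # pixels exceeding the threshold
    coeff = np.where(mask, 1.0 + gain * variation, 1.0)  # enhancement coefficient
    enhanced = first_img.copy()
    mean = first_img.mean()
    # stretch masked pixels away from the first image's mean level
    enhanced[mask] = mean + coeff[mask] * (first_img[mask] - mean)
    return np.clip(enhanced, 0.0, 1.0)
```

The observation image generation unit would then combine the enhanced first image with the second image into the displayed image.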
- A flowchart for explaining the enhancement processing according to the first embodiment, and a diagram showing an example of an image to be processed by that enhancement processing.
- A flowchart for explaining the enhancement processing according to the second embodiment, and a diagram showing an example of an image to be processed by that enhancement processing.
- FIG. 1 is a diagram illustrating a configuration of a main part of the living body observation system according to the embodiment.
- As shown in FIG. 1, the living body observation system 101 includes: an endoscope 1 having an elongated insertion portion that can be inserted into a subject that is a living body, which images an object such as living tissue in the subject and outputs an imaging signal; a light source device 2 that supplies illumination light; a processor 3 that generates and outputs an observation image corresponding to the imaging signal; a display device 4 that displays the observation image and the like output from the processor 3; and an input device 5 having switches and/or buttons capable of giving the processor 3 instructions corresponding to user input operations.
- The endoscope 1 has, at the distal end of the insertion portion, an illumination optical system 11 that irradiates the subject with the light transmitted by the light guide 6, and an imaging unit 12 that images the reflected light (return light) emitted from the subject in response to the light from the illumination optical system 11 and outputs the resulting imaging signal.
- the endoscope 1 includes a scope switch 13 that can give various instructions to the processor 3 in accordance with user operations.
- The imaging unit 12 is configured to capture the reflected light (return light) from the subject illuminated by the illumination light emitted from the light source device 2 and to output an imaging signal. Specifically, the imaging unit 12 includes an objective optical system 12a that forms an image of the reflected light (return light) emitted from the subject, and an image pickup device 12b, fitted with a primary color filter 121, whose imaging surface is aligned with the image-forming position of the objective optical system 12a.
- The image pickup device 12b includes, for example, a CCD; it is driven according to the image pickup device drive signal output from the processor 3, images the reflected light (return light) from the subject formed on its imaging surface, and outputs the resulting imaging signal.
- The scope switch 13 includes an observation mode changeover switch 13a and an enhancement display changeover switch 13b.
- The observation mode changeover switch 13a is configured to be able to instruct the processor 3, in accordance with a user operation, to set (switch) the observation mode of the living body observation system 101 to either the white light observation mode or the special light observation mode.
- the light source device 2 includes a light source control unit 21 and a light emitting unit 22 including a white light generation unit 22a and a special light generation unit 22b.
- The light source control unit 21 includes, for example, a control circuit. When the light source control unit 21 detects, based on the system control signal output from the processor 3, that the observation mode of the living body observation system 101 is set to the white light observation mode, it generates a light emission control signal for causing the white light generation unit 22a to generate white light while quenching the special light generation unit 22b, and outputs the generated light emission control signal to the light emitting unit 22.
- When the light source control unit 21 detects, based on the system control signal output from the processor 3, that the observation mode of the living body observation system 101 is set to the special light observation mode, it generates a light emission control signal for quenching the white light generation unit 22a and causing the special light generation unit 22b to sequentially generate the NB1 light, NR1 light, and NR2 light described later, and outputs the generated light emission control signal to the light emitting unit 22.
- The special light generation unit 22b includes, for example, a plurality of LEDs, and is configured so that, based on the light emission control signal output from the light source control unit 21, it can individually or simultaneously generate NB1 light (narrow-band blue light), NR1 light (narrow-band red light), and NR2 light (narrow-band red light set to a wavelength band different from that of the NR1 light).
- FIG. 2 is a diagram for explaining an example of the wavelength of light emitted from the light source device when the observation mode of the biological observation system according to the embodiment is set to the special light observation mode.
- The NB1 light is blue light whose wavelength band is set so that the absorption coefficient of indigo carmine, a coloring agent having an absorption peak near 610 nm in the red region, is sufficiently smaller than the absorption coefficient of oxyhemoglobin. Specifically, the wavelength band of the NB1 light is set to have a peak wavelength of 460 nm, a wavelength at which oxyhemoglobin shows an absorption coefficient smaller than at its absorption peak in the blue region.
- The NR1 light is red light set to have a peak wavelength of 600 nm, a wavelength at which the absorption coefficient of oxyhemoglobin in the red region is larger than that of the NR2 light, and to have a bandwidth that does not overlap with the NR2 light.
- The NR2 light is red light set to have a peak wavelength of 630 nm, a wavelength at which the absorption coefficient of oxyhemoglobin is sufficiently smaller than the absorption coefficient of indigo carmine and smaller than at the peak of the NR1 light, and at which the absorption coefficient of indigo carmine is larger than that of the NB1 light; its bandwidth is further set so as not to overlap with the NR1 light.
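The three narrow bands described above can be summarized in a small configuration sketch. The field names are chosen for illustration only; the peak wavelengths and the absorption relationships are those stated in the text.

```python
# Nominal peak wavelengths (nm) of the three narrow-band illuminations.
# Bandwidths are set so that NR1 and NR2 do not overlap.
NARROW_BANDS = {
    "NB1": {"peak_nm": 460, "color": "blue",
            "note": "oxyhemoglobin absorption below its blue-region peak; "
                    "indigo carmine absorption sufficiently small"},
    "NR1": {"peak_nm": 600, "color": "red",
            "note": "oxyhemoglobin absorption larger than at the NR2 peak"},
    "NR2": {"peak_nm": 630, "color": "red",
            "note": "indigo carmine absorption larger than for NB1 light"},
}
```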
- the processor 3 includes a preprocessing unit 31, an image signal processing unit 32, an observation image generation unit 33, and a control unit 34.
- The preprocessing unit 31 includes, for example, a noise reduction circuit and an A/D conversion circuit; it generates a digital image signal by performing processing such as noise removal and A/D conversion on the imaging signal output from the endoscope 1, and outputs the generated image signal to the image signal processing unit 32.
- The image signal processing unit 32 includes, for example, a signal processing circuit. When it detects, based on the system control signal output from the control unit 34, that the observation mode of the living body observation system 101 is set to the white light observation mode, it separates the image signal output from the preprocessing unit 31, obtained by imaging the reflected light (return light) of white light, into its R (red), G (green), and B (blue) color components and outputs them to the observation image generation unit 33.
- When the image signal processing unit 32 detects, based on the system control signal output from the control unit 34, that the observation mode of the living body observation system 101 is set to the special light observation mode with the predetermined enhancement processing set to OFF, it separates the image signal output from the preprocessing unit 31 into the NB1 component obtained by imaging the reflected light (return light) of the NB1 light, the NR1 component obtained by imaging the reflected light (return light) of the NR1 light, and the NR2 component obtained by imaging the reflected light (return light) of the NR2 light, and outputs these color components to the observation image generation unit 33.
- When the image signal processing unit 32 detects, based on the system control signal output from the control unit 34, that the observation mode of the living body observation system 101 is set to the special light observation mode with the predetermined enhancement processing set to ON, it separates the image signal output from the preprocessing unit 31 into the NB1, NR1, and NR2 components, performs the predetermined enhancement processing based on the separated color components, and outputs the result to the observation image generation unit 33.
- The observation image generation unit 33 includes, for example, an image processing circuit. When it detects, based on the system control signal output from the control unit 34, that the observation mode of the living body observation system 101 is set to the white light observation mode, it generates an observation image by assigning the luminance value of the R component output from the image signal processing unit 32 to the R channel corresponding to red on the display device 4, the luminance value of the G component to the G channel corresponding to green, and the luminance value of the B component to the B channel corresponding to blue, and outputs the generated observation image to the display device 4.
- When the observation mode of the living body observation system 101 is set to the special light observation mode, the observation image generation unit 33 generates an observation image by assigning the luminance value of the NR2 component output from the image signal processing unit 32 to the R channel, the luminance value of the NR1 component to the G channel, and the luminance value of the NB1 component to the B channel, and outputs the generated observation image to the display device 4.
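The special-light channel assignment can be sketched as follows. Note that the source text states the NR1-to-G and NB1-to-B assignments explicitly; mapping the NR2 component to the R channel is inferred from the remaining component and channel, so treat it as an assumption.

```python
import numpy as np

def compose_special_light_image(nb1, nr1, nr2):
    """Stack the separated components into an RGB observation image.
    Assumed mapping: NR2 -> R channel, NR1 -> G channel, NB1 -> B channel."""
    return np.stack([nr2, nr1, nb1], axis=-1)
```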
- the control unit 34 is configured to generate and output an image sensor drive signal for driving the image sensor 12b.
- The control unit 34 includes, for example, a CPU or a control circuit, and is configured to generate a system control signal for performing the operation corresponding to the observation mode set by the observation mode changeover switch 13a and to output it to the light source control unit 21, the image signal processing unit 32, and the observation image generation unit 33.
- The control unit 34 is also configured to generate a system control signal for performing the operation corresponding to the on/off state of the predetermined enhancement processing set by the enhancement display changeover switch 13b and to output it to the image signal processing unit 32.
- The user turns on the power of each part of the living body observation system 101, sets the observation mode of the living body observation system 101 to the white light observation mode, and then inserts the insertion portion of the endoscope 1 into the subject while checking the observation image (white light image) displayed on the display device 4.
- The user then operates the observation mode changeover switch 13a to set the observation mode of the living body observation system 101 to the special light observation mode, and operates the enhancement display changeover switch 13b to set the predetermined enhancement processing to ON.
- When the control unit 34 detects that the observation mode of the living body observation system 101 is set to the special light observation mode, it generates a system control signal specifying, for example, a timing for generating the NB1 light and NR2 light simultaneously and a timing for generating the NR1 light alone, and outputs it to the light source control unit 21, the image signal processing unit 32, and the observation image generation unit 33.
- Based on the system control signal output from the control unit 34, the light source control unit 21 repeatedly performs, on the special light generation unit 22b, control for generating the NB1 light and NR2 light simultaneously and control for generating the NR1 light alone. Under this control, the NB1 light, NR1 light, and NR2 light illuminating the subject are sequentially irradiated through the illumination optical system 11; an imaging signal obtained by imaging the reflected light (return light) from the subject is output from the image pickup device 12b, and an image signal generated based on that imaging signal is output from the preprocessing unit 31.
- Alternatively, control for generating the NB1 light alone, control for generating the NR1 light alone, and control for generating the NR2 light alone may be sequentially repeated.
- the image signal processing unit 32 separates the image signal output from the preprocessing unit 31 for each color component of the NB1 component, the NR1 component, and the NR2 component, and performs predetermined enhancement processing on the separated NB1 component.
- FIG. 3 is a flowchart for explaining the enhancement processing according to the first embodiment.
- FIG. 4 is a diagram illustrating an example of an image to be processed in the enhancement process according to the first embodiment.
- The image signal processing unit 32 performs processing for acquiring line profiles (luminance information), each indicating the distribution of luminance values of the pixels located on a line segment parallel to the horizontal direction of the image, in a number equal to the number of pixels in the vertical direction of the image (step S1 in FIG. 3).
- In the following, the case where a line profile as shown in FIG. 6 is acquired as the information indicating the distribution of luminance values of the pixels located on the line segment LS1 shown in FIG. 5, in the image shown in FIG. 4, is described as an example.
- FIG. 5 is a diagram showing the line segment LS1 used when acquiring a line profile from the image of FIG. 4.
- FIG. 6 is a diagram illustrating an example of the line profiles acquired from the pixels located on the line segment LS1 of FIG. 5. Note that, for simplicity, FIG. 6 does not accurately depict the magnitude relationship among the luminance values of the NB1 image, the NR1 image, and the NR2 image.
- The image signal processing unit 32 performs a calculation process for aligning the luminance level of the line profile LP1 of the NB1 image acquired in step S1 of FIG. 3 with the luminance level of the line profile LP2 of the NR2 image acquired in the same step (step S2 in FIG. 3).
- Specifically, in step S2 of FIG. 3, the image signal processing unit 32 calculates, for example, the average value AV1 of the luminance values of the pixels included in a predetermined region (for example, the entire image) of the NB1 image and the average value AV2 of the luminance values of the pixels included in the corresponding predetermined region of the NR2 image, and then multiplies each luminance value included in the line profile LP1 by the value (AV2/AV1) obtained by dividing the average value AV2 by the average value AV1.
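Step S2 can be sketched as follows, assuming the predetermined region is the entire image, as in the example above:

```python
import numpy as np

def align_brightness(lp1, nb1_img, nr2_img):
    """Step S2: scale the NB1 line profile LP1 by AV2/AV1 so its
    brightness level matches that of the NR2 image."""
    av1 = nb1_img.mean()   # average luminance of the NB1 image
    av2 = nr2_img.mean()   # average luminance of the NR2 image
    return lp1 * (av2 / av1)
```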
- Next, the image signal processing unit 32 uses the line profile LP2 to calibrate the line profile LB1 of the luminance-level-adjusted NB1 image obtained as a result of the processing of step S2 in FIG. 3 (that is, to set the reference value of the luminance values in the line profile LB1 to 0) (step S3 in FIG. 3).
- Specifically, in step S3 of FIG. 3, the image signal processing unit 32 performs processing for dividing the luminance value of each pixel included in the line profile LB1 by the luminance value of the corresponding pixel included in the line profile LP2, and then further performs calculation processing for subtracting 1 from each luminance value obtained by that processing.
- When such calculation processing is performed, a line profile LC1 of the NB1 image is acquired in which the luminance values of the pixels in the area AR1 are indicated by values larger than 0 and the luminance values of the pixels in the area AR2 are indicated by values of 0 or less (see FIG. 7).
- FIG. 7 is a diagram illustrating an example of the line profile LC1 acquired in the enhancement processing according to the first embodiment.
- Note that the image signal processing unit 32 of the present embodiment is not limited to acquiring the line profile LC1 by performing the above-described processing in step S3 of FIG. 3; for example, the line profile LC1 may instead be acquired by performing processing for subtracting the luminance value of the corresponding pixel included in the line profile LP2 from the luminance value of each pixel.
- In other words, in steps S2 and S3 of FIG. 3, the image signal processing unit 32, which functions as a calculation unit, performs processing for calculating the line profile LC1 as the degree of difference, that is, a value indicating the degree of difference between the luminance values of the NB1 image and the NR2 image at the same pixel.
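The degree-of-difference calculation of steps S2 and S3 (divide the level-adjusted NB1 value by the NR2 value, then subtract 1) can be sketched as follows; the function name is hypothetical:

```python
def dissimilarity(lb1, lp2):
    # Degree of difference per pixel: the level-adjusted NB1 value
    # divided by the NR2 value, minus 1 (step S3).  Pixels where the
    # two images agree come out as 0; AR1-like pixels, where the NB1
    # value exceeds the NR2 value, come out positive.
    return [b / r - 1.0 for b, r in zip(lb1, lp2)]

lc1 = dissimilarity([40.0, 20.0, 10.0], [20.0, 20.0, 20.0])
```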
- Note that, when the maximum luminance value MBL of the NB1 image is less than the maximum luminance value MRL of the NR2 image and the difference between the maximum luminance value MRL and the maximum luminance value MBL is small, the image signal processing unit 32 may skip the processing of step S2 in FIG. 3 and acquire the line profile LC1 by performing the following processing in step S3 of FIG. 3.
- Specifically, after skipping the processing of step S2 in FIG. 3, the image signal processing unit 32 may, in step S3 of FIG. 3, acquire the line profile LC1 by performing processing for dividing the luminance value of each pixel included in the line profile LP1 by the luminance value of the corresponding pixel included in the line profile LP2, and further performing calculation processing for subtracting 1 from each luminance value obtained by that processing.
- Alternatively, after skipping the processing of step S2 in FIG. 3, the image signal processing unit 32 may, in step S3 of FIG. 3, acquire the line profile LC1 by performing processing for subtracting the luminance value of the corresponding pixel included in the line profile LP2 from the luminance value of each pixel included in the line profile LP1.
- Next, based on the line profile LC1 of the NB1 image obtained as the processing result of step S3 in FIG. 3, the image signal processing unit 32 performs processing for acquiring a line profile LE1 of enhancement coefficients for the pixels located on the line segment LS1 (step S4 in FIG. 3).
- Specifically, in step S4 of FIG. 3, the image signal processing unit 32 maintains luminance values larger than 0 in the line profile LC1 as they are while uniformly converting luminance values of 0 or less to 0, and further performs processing for multiplying each luminance value obtained by this processing by a constant RA (RA > 1), thereby calculating an enhancement coefficient for each pixel located on the line segment LS1.
- FIG. 8 is a diagram illustrating an example of the line profile LE1 acquired in the enhancement processing according to the first embodiment.
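The coefficient calculation of step S4 (keep positive dissimilarities, clamp the rest to 0, then scale by RA > 1) might look like the following sketch; the function name is hypothetical:

```python
def enhancement_coefficients(lc1, ra):
    # Keep luminance values larger than 0 as they are, convert values
    # of 0 or less to 0, then multiply by the constant RA (RA > 1) to
    # obtain the enhancement-coefficient profile LE1 (step S4).
    assert ra > 1
    return [max(v, 0.0) * ra for v in lc1]

le1 = enhancement_coefficients([1.0, 0.0, -0.5], ra=2.0)
```

Only the AR1-like pixels (positive dissimilarity) receive a non-zero coefficient.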
- In other words, in step S4 of FIG. 3, the image signal processing unit 32, which functions as an enhancement coefficient calculation unit, calculates an enhancement coefficient for each pixel located on the line segment LS1 based on the line profile LC1, and performs processing for acquiring the calculated enhancement coefficients as the line profile LE1.
- That is, in step S4 of FIG. 3, based on the line profile LC1, the image signal processing unit 32 performs processing for calculating enhancement coefficients that increase the luminance values of the pixels in the area AR1 among the pixels located on the line segment LS1 while maintaining the luminance values of the pixels outside the area AR1.
- In other words, in step S4 of FIG. 3, the image signal processing unit 32 performs processing for calculating an enhancement coefficient corresponding to each pixel whose degree of difference, calculated by the processing of steps S2 and S3 of FIG. 3, exceeds a predetermined threshold.
- Then, based on each enhancement coefficient included in the enhancement line profile LE1 obtained as a result of the processing in step S4 of FIG. 3, the image signal processing unit 32, which functions as an enhancement processing unit, applies to the line profile LP1 enhancement processing that enhances the contrast of the area AR1 while maintaining the contrast of the area AR2 (step S5 in FIG. 3).
- Specifically, in step S5 of FIG. 3, the image signal processing unit 32 applies, to each pixel included in the line profile LP1, processing (La + Ea × La) for adding, to the luminance value La of the pixel in the line profile LP1, the value (Ea × La) obtained by multiplying the luminance value La by the enhancement coefficient Ea corresponding to that pixel in the line profile LE1.
- In other words, in step S5 of FIG. 3, based on the enhancement coefficients calculated in step S4 of FIG. 3, the image signal processing unit 32 performs processing for enhancing the contrast of those pixels included in the NB1 image whose degree of difference, calculated by the processing of steps S2 and S3 of FIG. 3, exceeds a predetermined threshold.
- Then, the image signal processing unit 32 outputs, to the observation image generation unit 33, the NB1 image generated by performing the enhancement processing of FIG. 3 (for each line profile), together with the NR1 image and the NR2 image obtained by separating the image signal output from the preprocessing unit 31.
- The observation image generation unit 33 generates an observation image by assigning the luminance value of the NR2 image output from the image signal processing unit 32 to the R channel, assigning the luminance value of the NR1 image output from the image signal processing unit 32 to the G channel, and assigning the luminance value of the enhancement-processed NB1 image output from the image signal processing unit 32 to the B channel, and outputs the generated observation image to the display device 4.
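The channel assignment can be sketched as composing one RGB tuple per pixel (hypothetical function name; images are 2-D lists of equal shape):

```python
def compose_observation_image(nr2, nr1, nb1_enhanced):
    # NR2 -> R channel, NR1 -> G channel, enhanced NB1 -> B channel.
    rows, cols = len(nr2), len(nr2[0])
    return [[(nr2[r][c], nr1[r][c], nb1_enhanced[r][c])
             for c in range(cols)] for r in range(rows)]

rgb = compose_observation_image([[100]], [[50]], [[80]])
```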
- As a result, an observation image is displayed on the display device 4 in which the blue intensity of the area AR1 is higher than when the enhancement processing is not performed, while the red intensity of the area AR2 is substantially the same as when the enhancement processing is not performed.
- FIG. 9 is a flowchart for explaining an enhancement process according to a modification of the first embodiment.
- In this modification, after performing the processing of step S1 in FIG. 3, the image signal processing unit 32 performs processing for calibrating the line profile LP2 of the NR2 image acquired by that processing (that is, setting the reference value of the luminance values in the line profile LP2 to 0) (step S11 in FIG. 9).
- Specifically, in step S11 of FIG. 9, the image signal processing unit 32 calculates the average value AV2 of the luminance values of the pixels included in a predetermined region of the NR2 image, and further performs calculation processing for subtracting the average value AV2 from each luminance value included in the line profile LP2.
- In other words, in step S11 of FIG. 9, the image signal processing unit 32, which functions as a calculation unit, performs processing for calculating, for each pixel, a fluctuation amount, that is, a value indicating the variation of the luminance value relative to the average luminance value of the NR2 image.
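The fluctuation calculation of step S11 can be sketched as follows; as an assumption of this sketch, the profile mean stands in for the mean of the predetermined region:

```python
def fluctuation(lp2):
    # Deviation of each NR2 luminance value from the mean luminance
    # (step S11 of the modification).
    av2 = sum(lp2) / len(lp2)
    return [v - av2 for v in lp2]

lc2 = fluctuation([10.0, 20.0, 30.0])
```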
- Next, based on the line profile LC2 of the NR2 image obtained as the processing result of step S11 in FIG. 9, the image signal processing unit 32 performs processing for acquiring a line profile LE2 of enhancement coefficients for the pixels located on the line segment LS1 (step S12 in FIG. 9).
- Specifically, in step S12 of FIG. 9, the image signal processing unit 32 acquires the absolute value of each luminance value included in the line profile LC2, and performs processing such as multiplying each acquired absolute value larger than 0 by a constant RB (RB > 1), thereby calculating an enhancement coefficient for each pixel located on the line segment LS1.
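A possible sketch of this coefficient calculation; note that restricting the non-zero coefficients to pixels whose fluctuation exceeds a threshold mirrors the surrounding text, but its exact placement in the processing is an assumption of this sketch:

```python
def coefficients_from_fluctuation(lc2, rb, threshold):
    # Absolute fluctuation scaled by RB (RB > 1) for pixels whose
    # fluctuation magnitude exceeds the threshold; 0 elsewhere
    # (step S12 of the modification).
    assert rb > 1
    return [abs(v) * rb if abs(v) > threshold else 0.0 for v in lc2]

le2 = coefficients_from_fluctuation([-0.5, 0.1, 0.8], rb=2.0, threshold=0.2)
```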
- In other words, in step S12 of FIG. 9, the image signal processing unit 32, which functions as an enhancement coefficient calculation unit, calculates an enhancement coefficient for each pixel located on the line segment LS1 based on the line profile LC2, and performs processing for acquiring the calculated enhancement coefficients as the line profile LE2.
- That is, in step S12 of FIG. 9, based on the line profile LC2, the image signal processing unit 32 performs processing for calculating enhancement coefficients that increase the luminance values of the pixels in the area AR1 among the pixels located on the line segment LS1 while maintaining the luminance values of the pixels outside the area AR1.
- In other words, in step S12 of FIG. 9, the image signal processing unit 32 performs processing for calculating an enhancement coefficient corresponding to each pixel whose fluctuation amount, calculated by the processing of step S11 of FIG. 9, exceeds a predetermined threshold.
- Then, based on each enhancement coefficient included in the enhancement line profile LE2 obtained as a result of the processing in step S12 of FIG. 9, the image signal processing unit 32, which functions as an enhancement processing unit, applies to the line profile LP1 enhancement processing that enhances the contrast of the area AR1 while maintaining the contrast of the area AR2 (step S13 in FIG. 9).
- Specifically, in step S13 of FIG. 9, the image signal processing unit 32 applies, to each pixel included in the line profile LP1, processing (Lb + Eb × Lb) for adding, to the luminance value Lb of the pixel in the line profile LP1, the value (Eb × Lb) obtained by multiplying the luminance value Lb by the enhancement coefficient Eb corresponding to that pixel in the line profile LE2.
- In other words, in step S13 of FIG. 9, based on the enhancement coefficients calculated by the processing of step S12 of FIG. 9, the image signal processing unit 32 performs processing for enhancing the contrast of those pixels included in the NB1 image whose fluctuation amount, calculated by the processing of step S11 of FIG. 9, exceeds a predetermined threshold.
- Then, the image signal processing unit 32 outputs, to the observation image generation unit 33, the NB1 image generated by performing the enhancement processing of FIG. 9 (for each line profile), together with the NR1 image and the NR2 image obtained by separating the image signal output from the preprocessing unit 31.
- The observation image generation unit 33 generates an observation image by assigning the luminance value of the NR2 image output from the image signal processing unit 32 to the R channel, assigning the luminance value of the NR1 image output from the image signal processing unit 32 to the G channel, and assigning the luminance value of the enhancement-processed NB1 image output from the image signal processing unit 32 to the B channel, and outputs the generated observation image to the display device 4.
- As a result, an observation image is displayed on the display device 4 in which the blue intensity of the area AR1 is higher than when the enhancement processing is not performed, while the red intensity of the area AR2 is substantially the same as when the enhancement processing is not performed.
- As described above, according to the present embodiment, an observation image subjected to enhancement processing that enhances the contrast of the area AR1 while maintaining the contrast of the area AR2 (that is, without affecting the contrast of the area AR2) can be displayed on the display device 4.
- Further, according to the present embodiment, even when the concentration of indigo carmine in the area AR1 is low, an observation image in which the area AR1 is visible can be displayed on the display device 4. Therefore, according to the present embodiment, the visibility of the layer boundary between the submucosa and the muscle layer can be improved without deteriorating the visibility of a bleeding site caused by blood vessel damage or the like; that is, the burden on the operator when performing treatment on a lesion site such as a cancer can be reduced.
- Note that the special light generator 22b of the present embodiment may be configured to generate, instead of the NB1 light, NB2 light, which is narrow-band blue light having a peak wavelength of, for example, 415 nm.
- Further, when the observation mode of the living body observation system 101 is set to the special light observation mode, the observation image generation unit 33 of the present embodiment may generate the observation image by assigning the luminance value of the NR1 image output from the image signal processing unit 32 to the R channel and the G channel, and assigning the luminance value of the enhancement-processed NB1 image output from the image signal processing unit 32 to the B channel.
- Note that the processing for calculating the enhancement coefficients is not limited to being performed for each line profile; it may be performed, for example, for each pixel, for each region of interest including a plurality of pixels, or collectively for all pixels in the image.
- (Second embodiment) FIGS. 10 to 16 relate to a second embodiment of the present invention.
- The living body observation system of the present embodiment has substantially the same configuration as the living body observation system 101 described in the first embodiment, but the image signal processing unit 32 is configured to perform enhancement processing different from that of the first embodiment.
- When the observation mode of the living body observation system 101 is set to the special light observation mode and the predetermined enhancement processing is set to ON, the image signal processing unit 32 of the present embodiment separates the image signal output from the preprocessing unit 31 into the NB1 component, the NR1 component, and the NR2 component, applies the predetermined enhancement processing based on the separated color components to the NB1 component and the NR1 component, and outputs the result to the observation image generation unit 33.
- FIG. 10 is a flowchart for explaining the enhancement processing according to the second embodiment.
- FIG. 11 is a diagram illustrating an example of an image to be processed in the enhancement process according to the second embodiment.
- First, the image signal processing unit 32 performs processing for acquiring, from each of the NB1 image, the NR1 image, and the NR2 image, line profiles, which are information indicating the distribution of the luminance values of the pixels located on a line segment parallel to the horizontal direction of the image, acquiring as many line profiles as there are pixels in the vertical direction of the image (step S21 in FIG. 10).
- In the following, a case where line profiles as shown in FIG. 13 are acquired as information indicating the distribution of the luminance values of the pixels located on the line segment LS2 shown in FIG. 12 will be described as an example.
- FIG. 12 is a diagram showing the line segment LS2 used when acquiring a line profile from the image of FIG. 11.
- FIG. 13 is a diagram illustrating an example of the line profiles acquired from the pixels located on the line segment LS2 of FIG. 12. Note that, for the sake of simplicity, FIG. 13 does not accurately show the magnitude relationship among the luminance values of the NB1 image, the NR1 image, and the NR2 image.
- Next, the image signal processing unit 32 performs calculation processing for aligning the luminance level of the line profile LP3 of the NB1 image acquired by the processing of step S21 of FIG. 10 with the luminance level of the line profile LP4 of the NR2 image acquired by the same processing (step S22 in FIG. 10).
- Specifically, in step S22 of FIG. 10, the image signal processing unit 32 calculates, for example, the average value AV3 of the luminance values of the pixels included in a predetermined region (for example, the entire image) of the NB1 image and the average value AV4 of the luminance values of the pixels included in the corresponding predetermined region of the NR2 image, and then performs processing for multiplying each luminance value included in the line profile LP3 by the value (AV4 / AV3) obtained by dividing the average value AV4 by the average value AV3.
- Next, the image signal processing unit 32 uses the line profile LP4 to calibrate the line profile LB2 of the luminance-level-adjusted NB1 image obtained as a result of the processing of step S22 in FIG. 10 (that is, to set the reference value of the luminance values in the line profile LB2 to 0) (step S23 in FIG. 10).
- Specifically, in step S23 of FIG. 10, the image signal processing unit 32 performs processing for dividing the luminance value of each pixel included in the line profile LB2 by the luminance value of the corresponding pixel included in the line profile LP4, and then further performs calculation processing for subtracting 1 from each luminance value obtained by that processing. When such calculation processing is performed in step S23 of FIG. 10, a line profile LC2 of the NB1 image is acquired in which the luminance values of the pixels in the areas AR3 and AR5 are indicated by values larger than 0 and the luminance values of the pixels in the area AR4 are indicated by values of 0 or less (see FIG. 14).
- FIG. 14 is a diagram illustrating an example of the line profile LC2 acquired in the enhancement processing according to the second embodiment.
- In other words, in steps S22 and S23 of FIG. 10, the image signal processing unit 32, which functions as a calculation unit, performs processing for calculating the line profile LC2 as the degree of difference, that is, a value indicating the degree of difference between the luminance values of the NB1 image and the NR2 image at the same pixel.
- Next, based on the line profile LC2 of the NB1 image obtained as the processing result of step S23 in FIG. 10, the image signal processing unit 32 performs processing for acquiring line profiles LE3 and LE4 of enhancement coefficients for the pixels located on the line segment LS2 (step S24 in FIG. 10).
- Specifically, in step S24 of FIG. 10, the image signal processing unit 32 maintains luminance values larger than 0 in the line profile LC2 as they are while uniformly converting luminance values of 0 or less to 0, and then performs processing such as multiplying each luminance value obtained by this processing by a constant RC (RC > 1), thereby calculating an enhancement coefficient for each pixel located on the line segment LS2 and acquiring an enhancement line profile LE3 as illustrated in FIG. 15.
- Further, in step S24 of FIG. 10, the image signal processing unit 32 maintains luminance values of 0 or less in the line profile LC2 as they are while uniformly converting luminance values larger than 0 to 0, and then performs processing such as multiplying each luminance value obtained by this processing by a constant RD (RD > 1), thereby calculating an enhancement coefficient for each pixel located on the line segment LS2 and acquiring an enhancement line profile LE4 as illustrated in FIG. 16.
- FIG. 15 is a diagram illustrating an example of the line profile LE3 acquired in the enhancement processing according to the second embodiment.
- FIG. 16 is a diagram illustrating an example of the line profile LE4 acquired in the enhancement processing according to the second embodiment.
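The two-profile split of step S24 can be sketched as follows (hypothetical function name); LE3 carries the positive dissimilarities scaled by RC, and LE4 carries the non-positive ones scaled by RD:

```python
def split_coefficients(lc2, rc, rd):
    # LE3: keep positive dissimilarities, clamp the rest to 0, scale
    # by RC (RC > 1).  LE4: keep non-positive values, clamp positive
    # ones to 0, scale by RD (RD > 1).  (Step S24.)
    assert rc > 1 and rd > 1
    le3 = [max(v, 0.0) * rc for v in lc2]
    le4 = [min(v, 0.0) * rd for v in lc2]
    return le3, le4

le3, le4 = split_coefficients([1.0, 0.0, -0.5], rc=2.0, rd=2.0)
```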
- In other words, in step S24 of FIG. 10, the image signal processing unit 32, which functions as an enhancement coefficient calculation unit, calculates an enhancement coefficient for each pixel located on the line segment LS2 based on the line profile LC2, and performs processing for acquiring the calculated enhancement coefficients as the line profile LE3 and the line profile LE4. That is, in step S24 of FIG. 10, based on the line profile LC2, the image signal processing unit 32 calculates enhancement coefficients that act on the luminance values of the pixels in the area AR3 and the area AR5 among the pixels located on the line segment LS2.
- In other words, in step S24 of FIG. 10, the image signal processing unit 32 performs processing for calculating an enhancement coefficient corresponding to each pixel whose degree of difference, calculated by the processing of steps S22 and S23 of FIG. 10, exceeds a predetermined threshold.
- Then, based on the enhancement coefficients included in the enhancement line profiles LE3 and LE4 obtained as a result of the processing in step S24 of FIG. 10, the image signal processing unit 32, which functions as an enhancement processing unit, applies enhancement processing to the line profile LP3 and to the line profile LP5 of the NR1 image acquired by the processing of step S21 in FIG. 10 (step S25 in FIG. 10).
- Specifically, in step S25 of FIG. 10, the image signal processing unit 32 applies, to each pixel included in the line profile LP3, processing (Lc + Ec × Lc) for adding, to the luminance value Lc of the pixel in the line profile LP3, the value (Ec × Lc) obtained by multiplying the luminance value Lc by the enhancement coefficient Ec corresponding to that pixel in the line profile LE3.
- Further, the image signal processing unit 32 applies, to each pixel included in the line profile LP5, processing (Ld + Ed × Ld) for adding, to the luminance value Ld of the pixel in the line profile LP5, the value (Ed × Ld) obtained by multiplying the luminance value Ld by the enhancement coefficient Ed corresponding to that pixel in the line profile LE4.
- In other words, in step S25 of FIG. 10, based on the enhancement coefficients calculated in step S24 of FIG. 10, the image signal processing unit 32 performs processing for enhancing the contrast of those pixels included in the NB1 image and the NR1 image whose degree of difference, calculated by the processing of steps S22 and S23 of FIG. 10, exceeds a predetermined threshold.
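The application step S25 uses the same L + E × L update as the first embodiment, once per profile pair (hypothetical function name):

```python
def apply_coefficients(lp, le):
    # L + E * L per pixel (step S25): used with (LP3, LE3) for the
    # NB1 profile and with (LP5, LE4) for the NR1 profile.  Negative
    # coefficients in LE4 lower the NR1 luminance in area AR4.
    return [l + e * l for l, e in zip(lp, le)]

out = apply_coefficients([10.0, 20.0], [1.0, -0.5])
```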
- Then, the image signal processing unit 32 outputs, to the observation image generation unit 33, the NB1 image and the NR1 image generated by performing the enhancement processing of FIG. 10 (for each line profile), together with the NR2 image obtained by separating the image signal output from the preprocessing unit 31.
- The observation image generation unit 33 generates an observation image by assigning the luminance value of the NR2 image output from the image signal processing unit 32 to the R channel, assigning the luminance value of the enhancement-processed NR1 image output from the image signal processing unit 32 to the G channel, and assigning the luminance value of the enhancement-processed NB1 image output from the image signal processing unit 32 to the B channel, and outputs the generated observation image to the display device 4.
- As a result, an observation image is displayed on the display device 4 in which the blue intensity of the area AR3 is higher than when the enhancement processing is not performed, the red intensity of the area AR4 is higher than when the enhancement processing is not performed, and the red intensity of the area AR5 is substantially the same as when the enhancement processing is not performed.
- Note that processing for further increasing the red intensity of the area AR4 may be included in the enhancement processing of FIG. 10.
- Specifically, the enhancement processing of FIG. 10 may include processing in which a line profile LE5 corresponding to the absolute values of the luminance values included in the line profile LE4 shown in FIG. 16 is acquired, and processing (Le + Ee × Le) for adding, to the luminance value Le of each pixel included in the line profile LP4, the value (Ee × Le) obtained by multiplying the luminance value Le by the enhancement coefficient Ee corresponding to that pixel in the line profile LE5 is applied to each pixel included in the line profile LP4.
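This optional variant can be sketched by deriving LE5 from LE4 (hypothetical function name):

```python
def le5_from_le4(le4):
    # LE5 holds the absolute values of LE4; adding Ee * Le to the NR2
    # profile LP4 with these coefficients further raises the red
    # intensity of area AR4 (an optional variant of FIG. 10).
    return [abs(e) for e in le4]

le5 = le5_from_le4([0.0, 0.0, -1.0])
```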
- Then, by assigning the luminance value of the NR2 image generated through such processing to the R channel, assigning the luminance value of the NR1 image generated through the enhancement processing of FIG. 10 to the G channel, and assigning the luminance value of the NB1 image generated through the enhancement processing of FIG. 10 to the B channel, an observation image can be displayed on the display device 4 in which the blue intensity of the area AR3 is higher than when the enhancement processing is not performed, the red intensity of the area AR4 is further higher than when only the enhancement processing of FIG. 10 is performed, and the red intensity of the area AR5 is substantially the same as when the enhancement processing is not performed.
- Alternatively, the red intensity of the area AR4 may be further increased by setting the value of the constant RD in the enhancement processing of FIG. 10 to a sufficiently large value.
- Note that, in step S25 of FIG. 10, processing for enhancing the contrast of only one of the area AR3 and the area AR4 may be performed.
- As described above, according to the present embodiment, an observation image subjected to enhancement processing that enhances the contrast of the areas AR3 and AR4 while maintaining the contrast of the area AR5 (that is, without affecting the contrast of the area AR5) can be displayed on the display device 4. Further, according to the present embodiment, even when the concentration of indigo carmine in the area AR3 is low, an observation image in which the area AR3 is visible can be displayed on the display device 4.
- Therefore, according to the present embodiment, the visibility of the layer boundary between the submucosa and the muscle layer and the visibility of a bleeding site caused by blood vessel damage or the like can be improved without deteriorating the visibility of the deep blood vessels (large blood vessels) running in the submucosa; that is, the burden on the operator when performing treatment on a lesion site such as a cancer can be reduced.
Description
FIGS. 1 to 9 relate to a first embodiment of the present invention. FIG. 1 is a diagram showing the configuration of the main part of a living body observation system according to the embodiment.
FIGS. 10 to 16 relate to a second embodiment of the present invention.
Claims (7)
- A living body observation system comprising: an illumination light generation unit configured to be capable of generating, as light for illuminating a subject including a region to which a dye having an absorption peak in the red band has been administered, first light having a peak wavelength in the blue band and second light having a peak wavelength in the red band; an imaging unit configured to image return light from the subject illuminated by light emitted from the illumination light generation unit; a calculation unit configured to perform processing for calculating a degree of difference, which is a value indicating the degree of difference between the luminance values, at the same pixel, of a first image obtained by imaging the return light of the first light and a second image obtained by imaging the return light of the second light; an enhancement coefficient calculation unit configured to perform processing for calculating a first enhancement coefficient corresponding to pixels, in the images obtained by imaging the return light, for which the degree of difference calculated by the calculation unit exceeds a predetermined threshold; an enhancement processing unit configured to perform processing for enhancing, based on the first enhancement coefficient calculated by the enhancement coefficient calculation unit, the contrast of those pixels included in the first image for which the degree of difference calculated by the calculation unit exceeds the predetermined threshold; and an observation image generation unit configured to generate an observation image using the first image processed by the enhancement processing unit and the second image, and to output the generated observation image to a display device.
- The living body observation system according to claim 1, wherein the calculation unit calculates the degree of difference based on the second image and on the first image after luminance level adjustment, obtained through processing for aligning the luminance level of the first image with the luminance level of the second image.
- The living body observation system according to claim 1, wherein the illumination light generation unit is configured to generate the first light, the second light, and third light having a peak wavelength at which the absorption coefficient of oxyhemoglobin in the red band is larger than that of the second light.
- The living body observation system according to claim 3, wherein: the enhancement coefficient calculation unit further performs processing for calculating a second enhancement coefficient corresponding to pixels for which the degree of difference calculated by the calculation unit is equal to or less than the predetermined threshold; the enhancement processing unit further performs processing for enhancing, based on the second enhancement coefficient calculated by the enhancement coefficient calculation unit, the contrast of those pixels included in a third image, obtained by imaging the return light of the third light, for which the degree of difference calculated by the calculation unit exceeds a predetermined threshold; and the observation image generation unit generates the observation image using the first image processed by the enhancement processing unit, the second image, and the third image processed by the enhancement processing unit.
- The living body observation system according to claim 3, wherein the observation image generation unit generates the observation image by assigning the second image to a red channel of the display device, assigning a third image obtained by imaging the return light of the third light to a green channel of the display device, and assigning the first image processed by the enhancement processing unit to a blue channel of the display device.
- The living body observation system according to claim 4, wherein the observation image generation unit generates the observation image by assigning the second image to a red channel of the display device, assigning the third image processed by the enhancement processing unit to a green channel of the display device, and assigning the first image processed by the enhancement processing unit to a blue channel of the display device.
- A living body observation system comprising: an illumination light generation unit configured to be capable of generating, as light for illuminating a subject including a region to which a dye having an absorption peak in the red band has been administered, first light having a peak wavelength in the blue band and second light having a peak wavelength in the red band; an imaging unit configured to image return light from the subject illuminated by light emitted from the illumination light generation unit; a calculation unit configured to perform processing for calculating, for each pixel, based on a second image obtained by imaging the return light of the second light, a fluctuation amount, which is a value indicating the variation of the luminance value relative to the average luminance value of the second image; an enhancement coefficient calculation unit configured to perform processing for calculating an enhancement coefficient corresponding to pixels, in the images obtained by imaging the return light, for which the degree of difference calculated by the calculation unit exceeds a predetermined threshold; an enhancement processing unit configured to perform processing for enhancing, based on the enhancement coefficient calculated by the enhancement coefficient calculation unit, the contrast of those pixels included in the first image for which the fluctuation amount calculated by the calculation unit exceeds a predetermined threshold; and an observation image generation unit configured to generate an observation image using the first image processed by the enhancement processing unit and the second image, and to output the generated observation image to a display device.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2015538791A JP5905170B2 (ja) | 2014-03-28 | 2014-09-11 | Living body observation system |
EP14887381.3A EP3066974B1 (en) | 2014-03-28 | 2014-09-11 | In vivo observation system |
CN201480067794.1A CN105813538B (zh) | 2014-03-28 | 2014-09-11 | 活体观察系统 |
US15/177,388 US9949624B2 (en) | 2014-03-28 | 2016-06-09 | Living body observation system with contrast enhancement processing |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2014069674 | 2014-03-28 | ||
JP2014-069674 | 2014-03-28 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/177,388 Continuation US9949624B2 (en) | 2014-03-28 | 2016-06-09 | Living body observation system with contrast enhancement processing |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2015145814A1 true WO2015145814A1 (ja) | 2015-10-01 |
Family
ID=54194392
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2014/074083 WO2015145814A1 (ja) | 2014-03-28 | 2014-09-11 | 生体観察システム |
Country Status (5)
Country | Link |
---|---|
US (1) | US9949624B2 (ja) |
EP (1) | EP3066974B1 (ja) |
JP (1) | JP5905170B2 (ja) |
CN (1) | CN105813538B (ja) |
WO (1) | WO2015145814A1 (ja) |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2017216883A1 (ja) * | 2016-06-14 | 2017-12-21 | オリンパス株式会社 | 内視鏡装置 |
WO2017216878A1 (ja) * | 2016-06-14 | 2017-12-21 | オリンパス株式会社 | 内視鏡装置 |
WO2018105020A1 (ja) * | 2016-12-05 | 2018-06-14 | オリンパス株式会社 | 内視鏡装置 |
WO2018193465A1 (en) * | 2017-04-20 | 2018-10-25 | Irillic Pvt Ltd | System for extending dynamic range and contrast of a video under fluorescence imaging |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11076106B2 (en) * | 2017-05-22 | 2021-07-27 | Sony Corporation | Observation system and light source control apparatus |
WO2019171703A1 (ja) * | 2018-03-05 | 2019-09-12 | Olympus Corporation | Endoscope system |
CN113518574B (zh) * | 2019-03-05 | 2024-06-18 | Olympus Corporation | Endoscope apparatus and image processing method |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2007039981A1 (ja) * | 2005-09-30 | 2007-04-12 | Olympus Medical Systems Corp. | Endoscope apparatus |
JP2010131265A (ja) * | 2008-12-05 | 2010-06-17 | Fujifilm Corp | Imaging apparatus, method, and program |
WO2012098798A1 (ja) * | 2011-01-17 | 2012-07-26 | Olympus Medical Systems Corp. | In-vivo observation device and capsule endoscope device |
WO2012107884A1 (en) * | 2011-02-09 | 2012-08-16 | Tel Hashomer Medical Research Infrastructure And Services Ltd. | Methods and devices suitable for imaging blood-containing tissue |
Family Cites Families (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7303528B2 (en) * | 2004-05-18 | 2007-12-04 | Scimed Life Systems, Inc. | Serialization of single use endoscopes |
JP5159904B2 (ja) * | 2011-01-11 | 2013-03-13 | Fujifilm Corporation | Endoscope diagnosis apparatus |
WO2012140970A1 (ja) * | 2011-04-11 | 2012-10-18 | Olympus Medical Systems Corp. | Endoscope apparatus |
JP5865606B2 (ja) * | 2011-05-27 | 2016-02-17 | Olympus Corporation | Endoscope apparatus and operating method of endoscope apparatus |
JP5426620B2 (ja) * | 2011-07-25 | 2014-02-26 | Fujifilm Corporation | Endoscope system and method for operating endoscope system |
CN103501681B (zh) | 2011-09-20 | 2015-11-25 | Olympus Medical Systems Corp. | Image processing apparatus and endoscope system |
WO2014014956A1 (en) * | 2012-07-17 | 2014-01-23 | The Arizona Board Of Regents On Behalf Of The University Of Arizona | Formulaic imaging for tissue diagnosis |
- 2014
  - 2014-09-11 JP JP2015538791A patent/JP5905170B2/ja active Active
  - 2014-09-11 WO PCT/JP2014/074083 patent/WO2015145814A1/ja active Application Filing
  - 2014-09-11 EP EP14887381.3A patent/EP3066974B1/en active Active
  - 2014-09-11 CN CN201480067794.1A patent/CN105813538B/zh active Active
- 2016
  - 2016-06-09 US US15/177,388 patent/US9949624B2/en active Active
Non-Patent Citations (1)
Title |
---|
See also references of EP3066974A4 * |
Cited By (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2017216883A1 (ja) * | 2016-06-14 | 2017-12-21 | Olympus Corporation | Endoscope apparatus |
WO2017216878A1 (ja) * | 2016-06-14 | 2017-12-21 | Olympus Corporation | Endoscope apparatus |
JPWO2017216883A1 (ja) * | 2016-06-14 | 2019-04-04 | Olympus Corporation | Endoscope apparatus |
JPWO2017216878A1 (ja) * | 2016-06-14 | 2019-04-11 | Olympus Corporation | Endoscope apparatus |
US11399706B2 (en) | 2016-06-14 | 2022-08-02 | Olympus Corporation | Endoscope apparatus for switching between one-substance observation mode and two-substance observation mode based on input of selection of desired observation mode |
WO2018105020A1 (ja) * | 2016-12-05 | 2018-06-14 | Olympus Corporation | Endoscope apparatus |
US10863936B2 (en) | 2016-12-05 | 2020-12-15 | Olympus Corporation | Endoscope apparatus |
WO2018193465A1 (en) * | 2017-04-20 | 2018-10-25 | Irillic Pvt Ltd | System for extending dynamic range and contrast of a video under fluorescence imaging |
US11650157B2 (en) | 2017-04-20 | 2023-05-16 | Irillic Pvt Ltd | System for extending dynamic range and contrast of a video under fluorescence imaging |
Also Published As
Publication number | Publication date |
---|---|
EP3066974B1 (en) | 2019-04-10 |
JP5905170B2 (ja) | 2016-04-20 |
EP3066974A1 (en) | 2016-09-14 |
JPWO2015145814A1 (ja) | 2017-04-13 |
CN105813538A (zh) | 2016-07-27 |
CN105813538B (zh) | 2018-04-20 |
EP3066974A4 (en) | 2017-09-06 |
US20160278621A1 (en) | 2016-09-29 |
US9949624B2 (en) | 2018-04-24 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5905170B2 (ja) | Living body observation system | |
JP6285383B2 (ja) | Image processing device, endoscope system, method for operating image processing device, and method for operating endoscope system | |
US9629527B2 (en) | Endoscope system, processor device of endoscope system, and image processing method | |
JP5670264B2 (ja) | Endoscope system and method for operating endoscope system | |
US20200337540A1 (en) | Endoscope system | |
JP6039606B2 (ja) | Endoscope system, light source device, method for operating endoscope system, and method for operating light source device | |
JP5808031B2 (ja) | Endoscope system | |
US9072453B2 (en) | Endoscope apparatus | |
US20180033142A1 (en) | Image-processing apparatus, biological observation apparatus, and image-processing method | |
WO2016079831A1 (ja) | Image processing device, image processing method, image processing program, and endoscope device | |
US9962070B2 (en) | Endoscope system, processor device, and method for operating endoscope system | |
JP6203088B2 (ja) | Living body observation system | |
WO2017022324A1 (ja) | Image signal processing method, image signal processing device, and image signal processing program | |
JP6979510B2 (ja) | Endoscope system and method for operating same | |
WO2014156604A1 (ja) | Endoscope system, method for operating same, and processor device | |
JP6210923B2 (ja) | Living body observation system | |
JP6203092B2 (ja) | Living body observation system | |
JP7163386B2 (ja) | Endoscope device, method for operating endoscope device, and operating program of endoscope device | |
US10321103B2 (en) | Image pickup system and image processing apparatus | |
JP7116254B2 (ja) | Image processing device and method for operating same | |
WO2015025620A1 (ja) | Endoscope system, processor device, and operating method | |
JP6196599B2 (ja) | Processor device for endoscope and method for operating processor device for endoscope | |
JP6378140B2 (ja) | Endoscope system and method for operating same | |
JP5970054B2 (ja) | Endoscope system, light source device of endoscope system, and method for operating endoscope system | |
JP2017209343A (ja) | Control device | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
ENP | Entry into the national phase | Ref document number: 2015538791; Country of ref document: JP; Kind code of ref document: A |
121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 14887381; Country of ref document: EP; Kind code of ref document: A1 |
REEP | Request for entry into the european phase | Ref document number: 2014887381; Country of ref document: EP |
WWE | Wipo information: entry into national phase | Ref document number: 2014887381; Country of ref document: EP |
NENP | Non-entry into the national phase | Ref country code: DE |