WO2016047191A1 - Endoscope System - Google Patents
Endoscope System
- Publication number
- WO2016047191A1 (PCT/JP2015/063305; JP2015063305W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- unit
- illumination
- convex
- size
- imaging
- Prior art date
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/06—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor with illuminating arrangements
- A61B1/0615—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor with illuminating arrangements for radial illumination
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00002—Operational features of endoscopes
- A61B1/00004—Operational features of endoscopes characterised by electronic signal processing
- A61B1/00009—Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope
- A61B1/000094—Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope extracting biological structures
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00064—Constructional details of the endoscope body
- A61B1/00071—Insertion part of the endoscope body
- A61B1/0008—Insertion part of the endoscope body characterised by distal tip features
- A61B1/00089—Hoods
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00064—Constructional details of the endoscope body
- A61B1/00071—Insertion part of the endoscope body
- A61B1/0008—Insertion part of the endoscope body characterised by distal tip features
- A61B1/00096—Optical elements
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/012—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor characterised by internal passages or accessories therefor
- A61B1/0125—Endoscope within endoscope
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/04—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances
- A61B1/05—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances characterised by the image sensor, e.g. camera, being in the distal end portion
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/06—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor with illuminating arrangements
- A61B1/0605—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor with illuminating arrangements for spatially modulated illumination
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/06—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor with illuminating arrangements
- A61B1/0623—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor with illuminating arrangements for off-axis illumination
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/06—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor with illuminating arrangements
- A61B1/0625—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor with illuminating arrangements for multiple fixed illumination angles
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/06—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor with illuminating arrangements
- A61B1/0627—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor with illuminating arrangements for variable illumination angles
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/06—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor with illuminating arrangements
- A61B1/0638—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor with illuminating arrangements providing two or more wavelengths
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/06—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor with illuminating arrangements
- A61B1/0655—Control therefor
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/06—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor with illuminating arrangements
- A61B1/0661—Endoscope light sources
- A61B1/0676—Endoscope light sources at distal tip of an endoscope
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/0059—Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence
- A61B5/0082—Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence adapted for particular medical purposes
- A61B5/0084—Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence adapted for particular medical purposes for introduction into the body, e.g. by catheters
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/107—Measuring physical dimensions, e.g. size of the entire body or parts thereof
- A61B5/1076—Measuring physical dimensions, e.g. size of the entire body or parts thereof for measuring dimensions inside body cavities, e.g. using catheters
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B23/00—Telescopes, e.g. binoculars; Periscopes; Instruments for viewing the inside of hollow bodies; Viewfinders; Optical aiming or sighting devices
- G02B23/24—Instruments or systems for viewing the inside of hollow bodies, e.g. fibrescopes
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B23/00—Telescopes, e.g. binoculars; Periscopes; Instruments for viewing the inside of hollow bodies; Viewfinders; Optical aiming or sighting devices
- G02B23/24—Instruments or systems for viewing the inside of hollow bodies, e.g. fibrescopes
- G02B23/2407—Optical details
- G02B23/2461—Illumination
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B23/00—Telescopes, e.g. binoculars; Periscopes; Instruments for viewing the inside of hollow bodies; Viewfinders; Optical aiming or sighting devices
- G02B23/24—Instruments or systems for viewing the inside of hollow bodies, e.g. fibrescopes
- G02B23/2476—Non-optical details, e.g. housings, mountings, supports
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00131—Accessories for endoscopes
- A61B1/00133—Drive units for endoscopic tools inserted through or with the endoscope
Definitions
- The present invention relates to an endoscope system capable of detecting a convex portion in a body cavity.
- Laparoscopic examination, in which a laparoscope is used as an endoscope to observe a treatment tool and a treatment site, enables therapeutic treatment without performing a laparotomy.
- Laparoscopy has the advantage of reduced invasiveness to the patient. However, it is difficult to confirm minute convex lesions in the body cavity by laparoscopic examination.
- For example, peritoneal lesions of endometriosis have a minute, transparent, blister-like convex shape, and their sites of occurrence spread extensively over the abdominal wall, the organs in the abdominal cavity, the surface layers of the body, and so on, so they are easy to overlook in laparoscopic examination.
- Document 1 (Japanese Patent Application Laid-Open No. 2010-82271) proposes an apparatus for detecting minute irregularities in a body cavity.
- Document 2 (Japanese Unexamined Patent Application Publication No. 2009-273655) proposes an image processing system for specifying the shape of an object surface.
- The size of the convex lesion to be detected is often limited to a predetermined range: a convex portion too small to constitute a lesion need not be detected, and neither does a convex portion large enough to be reliably confirmed by ordinary laparoscopic examination. For example, it may not be necessary to detect a convex portion larger than a predetermined maximum size (for example, 5 mm) or smaller than a predetermined minimum size (for example, 1 mm).
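As a minimal sketch of this size gating, the range check might look as follows; the 1 mm and 5 mm bounds are the example values given above, and the function name is a hypothetical illustration, not from the patent:

```python
def filter_convex_candidates(sizes_mm, min_mm=1.0, max_mm=5.0):
    """Keep only convex portions whose size lies within the range of
    interest; candidates outside [min_mm, max_mm] need not be detected."""
    return [size for size in sizes_mm if min_mm <= size <= max_mm]
```

With the example bounds, a 0.5 mm blister and an 8 mm bulge would both be excluded, while a 3 mm convex portion would remain a candidate lesion.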
- The apparatus of Document 1 can detect unevenness based on luminance information, but cannot detect the size of the unevenness.
- In Document 2, the shape is likewise detected using luminance information, but the size cannot be detected accurately.
- An endoscope system according to the present invention includes: an illumination unit that emits illumination light to illuminate a predetermined illumination range; an imaging unit that captures an image of a predetermined imaging range of a subject illuminated by the illumination unit; a low-luminance part detection unit that detects a low-luminance part in the captured image of the subject captured by the imaging unit; and a convex part size calculation unit that detects a convex portion within a predetermined size range based on information on the low-luminance part detected by the low-luminance part detection unit.
- FIG. 1 is a block diagram showing an endoscope system according to a first embodiment of the present invention.
- FIG. 2 is an explanatory view showing a state in which the insertion portion of the endoscope is inserted into a body cavity.
- A perspective view showing the distal end portion of the insertion portion.
- A sectional view showing the structure of the distal end portion.
- A flowchart for explaining the operation of the first embodiment.
- An explanatory view for explaining a method of detecting a convex lesion by illumination within the cap.
- A sectional view showing a modification of the insertion portion and a cap that can be employed.
- An explanatory view showing a modification.
- A block diagram showing a third embodiment of the present invention.
- FIG. 1 is a block diagram showing an endoscope system according to a first embodiment of the present invention.
- FIG. 2 is an explanatory view showing a state in which the insertion portion of the endoscope is inserted into the body cavity.
- The endoscope system 1 includes an endoscope 10, a probe 20, and a processor device 40.
- The endoscope 10 has an elongated, flexible insertion portion 10a that is inserted into the body cavity.
- FIG. 2 shows a state in which the insertion portion 10a is inserted into the body cavity through the abdominal wall 61 of the patient, who is the subject.
- Note that FIG. 2 shows an example having a configuration different from that of FIG. 1, as will be described later; the following description relates to FIG. 1 unless otherwise specified.
- An operation unit 17 provided with various operation devices is provided on the proximal end side of the insertion portion 10a.
- A cable extends from the operation unit 17, and the processor device 40 is detachably connected to the endoscope 10 via this cable. Although omitted from FIG. 1 for simplicity, the electrical connections between the units in the insertion portion 10a and the processor device 40 are made via the operation unit 17.
- The endoscope 10 is provided with a forceps port 16, and a channel (not shown) running from the forceps port 16 to the distal end opening 10c (see FIG. 2) of the insertion portion 10a is provided in the insertion portion 10a.
- The probe 20 is arranged in the channel so as to be able to advance and retract.
- The operation unit 17 receives operations by the user 36 on the various operation devices and controls the driving of each unit.
- The insertion portion 10a is provided with a bending portion 10b (see FIG. 2).
- The bending portion 10b is configured to bend in the vertical and horizontal directions in response to the user 36 operating a bending operation knob (not shown) provided on the operation unit 17.
- A probe driving unit 12 is provided in the insertion portion 10a.
- The probe driving unit 12 is configured by a motor or the like (not shown) and can advance and retract the probe 20 in the channel to change the amount by which the probe 20 protrudes from the distal end opening 10c.
- The endoscope system 1 has a light source 32.
- The light source 32 generates illumination light and supplies it to the first illumination unit 15 disposed in the insertion portion 10a via a light guide 31.
- The first illumination unit 15 is configured by, for example, a lens or the like (not shown) disposed at the distal end of the insertion portion 10a, and can illuminate the subject 35 with the illumination light.
- The light source 32 also supplies the generated illumination light to the second illumination unit 21, which is arranged, for example, on the side surface of the probe 20.
- The second illumination unit 21 has its optical axis in a direction substantially perpendicular to the advancing/retreating direction of the probe 20, for example, and can irradiate the subject 35 with illumination light.
- Alternatively, the first and second illumination units 15 and 21 may each be provided with their own light source, or illumination light may be supplied to the second illumination unit 21 via a light guide.
- For example, the first illumination unit 15 and the second illumination unit 21 may be configured by LEDs or the like.
- In this case, the light source 32 supplies power to the LED-based first and second illumination units 15 and 21 to control their lighting.
- An imaging unit 13 is provided in the insertion portion 10a of the endoscope 10.
- The imaging unit 13 is disposed, for example, on the side surface of the insertion portion 10a so that its field of view overlaps the illumination ranges of the first and second illumination units 15 and 21.
- Reflected light (return light) of the illumination light irradiated onto the subject 35 from the first and second illumination units 15 and 21 is incident on the imaging surface of the imaging unit 13.
- The imaging unit 13 converts the subject optical image incident on the imaging surface into an electrical signal and outputs a captured image.
- The probe 20 is advanced and retracted by the probe driving unit 12, changing the amount by which it protrudes from the distal end opening 10c. That is, the relative position of the imaging unit 13 and the second illumination unit 21 changes as the probe 20 advances and retracts.
- In the present embodiment, imaging is performed while relatively changing the illumination direction of the illumination light with respect to the subject 35 and the viewing direction of the imaging. Accordingly, as shown in FIG. 1, when the imaging unit 13 is provided in the insertion portion 10a, illumination is performed by the second illumination unit 21 provided on the probe 20, and imaging is performed while the probe is advanced and retracted. Thereby, the position of the second illumination unit 21 and the position of the imaging unit 13 can be changed relative to each other, and images can be captured while the illumination direction and the viewing direction are relatively changed.
- The arrangement may also be reversed; FIG. 2 shows an example of this case, in which the probe 20 is provided with an imaging unit 13a and the insertion portion 10a is provided with a second illumination unit 21a.
- The endoscope 10 is provided with an index light irradiation unit 14.
- The index light irradiation unit 14 can emit parallel (collimated) light whose beam size is a specified value. Note that the index light irradiation unit 14 only needs to be provided at a position from which it can irradiate the subject 35.
- FIG. 1 shows an example in which the index light irradiation unit 14 is provided in the insertion portion 10a, while FIG. 2 shows an example in which the index light irradiation unit 14 is provided on the probe 20.
- The processor device 40 includes a processor such as a CPU (not shown), and each unit in the processor device 40 can be controlled by the processor.
- The processor device 40 is provided with a light source control unit 50.
- The light source control unit 50 controls the light source 32 to irradiate the subject 35 with illumination light from the first or second illumination unit 15 or 21.
- The processor device 40 includes a drive circuit (not shown) that drives the imaging unit 13 of the endoscope 10, and an image processing unit 42 to which the captured image from the imaging unit 13 is input.
- The image processing unit 42 performs predetermined image signal processing on the input captured image and outputs the processed image signal to the display unit 51.
- The display unit 51 displays the captured image given from the image processing unit 42.
- The processor device 40 is provided with a probe control unit 41.
- The probe control unit 41 can control the driving of the probe driving unit 12 provided in the endoscope 10 to advance and retract the probe 20 by a set distance.
- In the present embodiment, a convex portion having a size within a predetermined range is determined to be a convex lesion.
- The surgeon inserts the insertion portion 10a into the body cavity via the abdominal wall 61 and arranges it in the vicinity of the site to be observed.
- The example of FIG. 2 shows the convex portion 62a being detected among the plurality of convex portions 62a to 62c existing in the body cavity.
- The distal end side of the insertion portion 10a and the axial direction of the probe 20 are arranged substantially parallel to the observation site 65, so that the convex portion 62a in the observation site 65 falls within the illumination range of the second illumination unit 21a (the second illumination unit 21 in FIG. 1) and within the field of view of the imaging unit 13a (the imaging unit 13 in FIG. 1).
- In the present embodiment, the presence of the convex portion 62a in the observation site 65 is detected, the size of the detected convex portion 62a is obtained, and the convex portion 62a is determined to be a convex lesion when its size is within the predetermined range.
- The processor device 40 is provided with a convex part specifying unit 43.
- The convex part specifying unit 43 includes a low-luminance part detection unit 44, an image recording unit 45, and a shadow area specifying unit 46.
- The low-luminance part detection unit 44 receives, from the imaging unit 13, captured images taken under illumination by the second illumination unit 21 before and after the relative position between the imaging unit 13 and the illumination unit 21 is changed.
- The low-luminance part detection unit 44 detects a low-luminance part in each captured image.
- For example, the low-luminance part detection unit 44 may treat pixels whose luminance is lower than a predetermined threshold as the low-luminance part of the captured image.
- The low-luminance part detection unit 44 gives the image recording unit 45 the detected position, shape, and size of the low-luminance part for each captured image.
- The image recording unit 45 records the detection results of the low-luminance part detection unit 44.
- The low-luminance part detection unit 44 may obtain the size of the low-luminance part based on the number of pixels detected as the low-luminance part.
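A minimal sketch of this threshold-based detection, assuming an 8-bit grayscale image; the threshold value of 40 is an illustrative assumption, since the patent does not specify one:

```python
import numpy as np

def detect_low_luminance(image, threshold=40):
    """Return a boolean mask marking pixels darker than `threshold`.
    `image` is a 2-D array of luminance values (0-255)."""
    return image < threshold

def low_luminance_size(mask):
    """Approximate the low-luminance part's size by its pixel count."""
    return int(mask.sum())
```

The position and shape recorded for each image could likewise be derived from the mask, for example as its bounding box or contour.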
- The probe control unit 41 advances and retracts the probe 20 between detections by the low-luminance part detection unit 44.
- The low-luminance part detection unit 44 thus detects the low-luminance part before and after the probe 20 moves, that is, before and after the relative position between the imaging unit 13 and the illumination unit 21 changes, and the image recording unit 45 records the detection results obtained before and after the movement.
- The shadow area specifying unit 46 reads the detection results of the low-luminance part detection unit 44 from the image recording unit 45. If the detection results indicate that the shape of a low-luminance part changed before and after the change of the relative position between the imaging unit 13 and the illumination unit 21, the shadow area specifying unit 46 determines that the low-luminance part is a shadow of a convex portion, and outputs the determined size of the low-luminance part to the size comparison unit 49 of the convex lesion size calculation unit 47.
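The shape-change test can be sketched as follows. Comparing mask areas is only one simple proxy for "the shape changed", and the 10% tolerance is an assumed parameter, not from the patent:

```python
def is_shadow_of_convex(mask_before, mask_after, tolerance=0.1):
    """Judge whether a low-luminance part is the shadow of a convex portion.

    A true shadow changes shape when the illumination position changes,
    while a merely dark surface patch stays the same.  The change is
    approximated here by the relative difference in pixel area between
    the masks recorded before and after the probe moved.
    """
    area_before = sum(sum(row) for row in mask_before)
    area_after = sum(sum(row) for row in mask_after)
    largest = max(area_before, area_after)
    if largest == 0:
        return False  # nothing detected in either image
    return abs(area_after - area_before) / largest > tolerance
```

A fuller implementation might compare contours or moments rather than raw area, since a shadow can change shape while keeping a similar area.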
- The convex lesion size calculation unit 47 includes an index light irradiation control unit 48 and a size comparison unit 49.
- The index light irradiation control unit 48 controls the light source 32 so that the index light irradiation unit 14 irradiates the subject 35.
- The size comparison unit 49 receives, from the imaging unit 13, a captured image taken under illumination by the index light irradiation unit 14.
- The size comparison unit 49 calculates the size of the convex portion by obtaining the size of the index light in the input captured image and comparing it with the size of the low-luminance part given from the image recording unit 45. When the calculated size of the convex portion is within the specified range, the size comparison unit 49 determines that the convex portion is a lesion and outputs information on the convex portion to an output destination (not shown).
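Because the index light is collimated with a specified beam size, its projected spot has a known physical width, which yields the image scale. A sketch under that assumption follows; the function names are illustrative, and the 1-5 mm range repeats the example sizes given earlier in the text:

```python
def pixels_per_mm(index_light_px, index_beam_mm):
    """Image scale derived from the index light: the projected index spot
    spans `index_light_px` pixels and is known to be `index_beam_mm` wide."""
    return index_light_px / index_beam_mm

def convex_size_mm(convex_px, scale_px_per_mm):
    """Convert the low-luminance part's measured extent to millimetres."""
    return convex_px / scale_px_per_mm

def is_convex_lesion(size_mm, min_mm=1.0, max_mm=5.0):
    """Decide whether the convex portion falls within the specified range."""
    return min_mm <= size_mm <= max_mm
```

For instance, if a 2 mm index beam spans 40 pixels, the scale is 20 px/mm, so a 60-pixel convex extent corresponds to 3 mm and would be judged a lesion.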
- FIG. 5 is an explanatory diagram for explaining the operation of the first embodiment; the upper part shows the arrangement of the insertion portion 10a over the observation site 65, and the lower part shows the captured images obtained with the second illumination unit 21 at positions P1, P2, and P3.
- First, the insertion portion 10a is placed over the observation site. That is, the surgeon inserts the insertion portion 10a into the body cavity and places it near the observation site 65 as shown in FIG. 2.
- The imaging unit 13 and the second illumination unit 21 are arranged on the side surface of the insertion portion 10a or the probe 20, and each optical axis points in a direction substantially perpendicular to the advancing/retreating direction of the probe 20.
- The insertion portion 10a is arranged so that the advancing/retreating direction of the probe 20 is substantially parallel to the surface of the observation site 65.
- In step S1, the processor device 40 sets a measurement count n (n is 2 or more) for detecting a convex lesion.
- FIG. 5 shows an example in which n is 3, and steps S2-2 to S2-6 in FIG. 3 are repeated three times.
- In step S2-1, the light source control unit 50 controls the light source 32 to turn on the second illumination unit 21.
- Next, the probe control unit 41 controls the probe driving unit 12 to advance or retract the probe 20 in step S2-2, thereby moving the second illumination unit 21 to a specified position. For example, the probe control unit 41 moves the probe 20 so that the second illumination unit 21 is at the initial position P1.
- The imaging unit 13 performs imaging under the control of the processor device 40.
- The imaging unit 13 acquires a surface image of the observation site 65 within the field of view indicated by the broken line in the upper part of FIG. 5.
- The convex portion 62a in the observation site 65 is located within the illumination range A1 of the second illumination unit 21, and the captured image from the imaging unit 13 includes the image portion of the convex portion 62a.
- The captured image from the imaging unit 13 is supplied to the image processing unit 42 and the low-luminance part detection unit 44 of the processor device 40 (step S2-3).
- The image processing unit 42 performs appropriate image signal processing on the captured image and then provides it to the display unit 51. Thereby, the surface image of the observation site 65 can be displayed on the screen of the display unit 51.
- The low-luminance part detection unit 44 takes the captured image from the imaging unit 13 as first image information and detects the position, shape, and size of the low-luminance part in the first image information (step S2-4).
- At P1 in FIG. 5, the captured image 70 obtained under illumination in the illumination range A1, with the second illumination unit 21 at the specified position P1, is illustrated.
- The captured image 70 includes an image 71 of the convex portion 62a.
- The illumination also casts a shadow of the convex portion 62a, and this shadow image 72a is likewise included in the captured image 70.
- The image portion of the shadow image 72a is detected as a low-luminance part by the low-luminance part detection unit 44.
- The low-luminance part detection unit 44 gives the detection result to the image recording unit 45 for recording as first image luminance information (step S2-5).
- The processor device 40 determines in step S2-6 whether the measurement count n has been reached. If not, the processor device 40 returns the process to step S2-2 and repeats steps S2-2 to S2-6.
- In this example, n is 3, so the probe control unit 41 controls the probe driving unit 12 to advance or retract the probe 20, moving the second illumination unit 21 to the specified position P2.
- The imaging unit 13 then captures an image, and the low-luminance part detection unit 44 takes the captured image as second image information, detects the low-luminance part, and records the detection result in the image recording unit 45 as second image luminance information.
- Similarly, the probe control unit 41 moves the second illumination unit 21 to the specified position P3, and imaging is performed.
- The low-luminance part detection unit 44 takes the captured image as third image information, detects the low-luminance part, and records the detection result in the image recording unit 45 as third image luminance information.
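The measurement loop of steps S2-2 to S2-5 can be sketched with hypothetical callables standing in for the probe control, imaging, and detection units:

```python
def run_measurements(move_illumination, capture, detect, positions):
    """For each specified illumination position: move the probe (S2-2),
    capture an image (S2-3), detect the low-luminance part (S2-4), and
    record the result (S2-5).  Returns the recorded detection results,
    one per position (e.g. P1, P2, P3)."""
    records = []
    for position in positions:
        move_illumination(position)   # probe control unit moves the probe
        image = capture()             # imaging unit captures an image
        records.append(detect(image)) # low-luminance part detection
    return records
```

The returned list corresponds to the first, second, and third image luminance information held by the image recording unit.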
- the shadow area specifying unit 46 specifies a shadow area in step S3 of FIG. That is, the shadow area specifying unit 46 reads out all image luminance information from the image recording unit 45 and compares them with each other to detect a low luminance area having a shape change.
- as the second illumination unit 21 moves through the specified positions, its illumination range changes from A1 to A3.
- the convex portion 62a is included in each of the illumination ranges A1 to A3, so the captured image from the imaging unit 13 always includes the image portion of the convex portion 62a.
- FIG. 5 shows the captured image 70 obtained by illumination in the illumination range A2 when the second illumination unit 21 is at the specified position P2. Even when the second illumination unit 21 moves from the specified position P1 to the specified position P2, the positional relationship between the imaging unit 13 and the convex part 62a does not change, so the image 71 of the convex part 62a appears in the captured image 70 at the same position with the same shape and size. On the other hand, the shape and size of the shadow of the convex portion 62a produced by the illumination of the second illumination unit 21 change as the second illumination unit 21 moves: at P1 the shadow image 72a of the convex portion 62a is captured on the right side of the convex portion 62a, whereas at P2 the shadow image 72b is captured on its left side.
- a captured image 70 is shown when the second illumination unit 21 has moved to the specified position P3.
- the captured image includes the image 71 of the convex part 62a and its shadow image 72c produced by illumination in the illumination range A3 of the second illumination part 21.
- the size and shape of the shadow (images 72b and 72c) change with the change from the illumination range A2 to the illumination range A3.
- the shadow area specifying unit 46 determines that the low luminance part having such a shape change is a shadow part due to the convex part, and detects this part as a low luminance area (step S3-1).
- the shadow of the convex portion 62a changes in length along the advance/retreat direction of the probe 20, while its length in the direction perpendicular to the advance/retreat direction of the probe 20 is considered not to change. That is, the portion where the length of the end of the low luminance region does not change can be regarded as corresponding to the image portion of the convex portion 62a adjacent to the low luminance region. Further, assuming that the planar shape of the convex portion 62a is substantially circular, the length of this unchanging end portion may be regarded as the size (diameter) of the convex portion 62a.
- in step S3-2, the shadow area specifying unit 46 detects the part of the low luminance area whose size does not change, and takes the length of this part (see FIG. 5) as the diameter d of the convex part 62a.
- the diameter d is expressed, for example, as a number of pixels of the image.
- the shadow region specifying unit 46 outputs information regarding the number of pixels corresponding to the low luminance region and the diameter d to the size comparing unit 49.
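Steps S3-1 and S3-2 can be illustrated with a small sketch. It assumes the low-luminance detections are given as sets of (row, col) pixel coordinates and that the probe's advance/retreat direction runs along the columns, so the shadow's row extent is the unchanging perpendicular length taken as the diameter d; these representations are assumptions for illustration, not part of the disclosure.

```python
def shadow_diameter_pixels(masks):
    """Given low-luminance masks (sets of (row, col) pixels) captured at
    different illumination positions, identify the shadow region as the
    pixels whose low-luminance status changes between frames (step S3-1),
    and estimate the diameter d as the shadow's extent along the axis
    perpendicular to the probe's advance/retreat direction (step S3-2)."""
    union = set().union(*masks)
    common = set.intersection(*masks)
    changing = union - common          # shadow pixels that move with the light
    if not changing:
        return 0
    rows = [r for r, _ in changing]
    return max(rows) - min(rows) + 1   # unchanging perpendicular extent

# shadow to the right of the convex part, then to its left
m1 = {(0, 2), (1, 2), (2, 2)}
m2 = {(0, 0), (1, 0), (2, 0)}
```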
- in step S4, the processor device 40 determines whether the size of the convex portion 62a falls within the specified range, based on the diameter d.
- the index light irradiation control unit 48 of the convex lesion size calculation unit 47 controls the light source 32 so that the index light irradiation unit 14 irradiates the surface of the observation region 65 with the index light (step S4-1).
- the index light is irradiated on the surface of the observation region 65 with a specified size.
- the index light irradiation unit 14 may limit the band of the index light to a predetermined band so that the index light can be reliably observed on the surface of the observation site 65.
- the band of the index light is different from the band of the illumination light from the second illumination unit 21. When illumination light of the same band as that of the second illumination unit 21 (white light) is used as the index light, the second illumination unit 21 is extinguished before the index light is irradiated.
- the imaging unit 13 performs imaging under the control of the processor device 40 (step S4-2).
- FIG. 5 shows that an image 73 of the index light region (index light irradiation region) irradiated on the surface of the observation region 65 is captured.
- the figure shows the image 70 when the second illumination unit 21 is located at the specified position P1, but the index light may be irradiated at any timing. That is, the processing of steps S4-1 to S4-4 in FIG. 4 may be performed at other timings.
- the captured image from the imaging unit 13 is supplied to the size comparison unit 49.
- the captured image from the imaging unit 13 includes an image portion illuminated by the index light, and the size comparison unit 49 obtains the number of pixels of the image portion (step S4-3).
- the size comparison unit 49 may extract information on the wavelength band of the index light to obtain an image portion of the index light irradiation area.
- the index light is parallel light, and the size of the index light irradiation area irradiated on the surface of the observation site 65 is known.
- the size comparing unit 49 determines whether the convex portion is a convex lesion to be detected according to whether its size d is within the specified range.
- the size comparison unit 49 outputs the determination result as to whether or not the portion is a convex lesion. For example, this determination result is supplied to the display unit 51, where a display indicating whether or not the convex part 62a is a convex lesion part to be detected, or a display indicating that the convex part 62a is a convex lesion part, is presented.
- the size comparison unit 49 may instead provide the display unit 51 with the information on the size d of the convex lesion part as it is, and a display indicating the size of the convex lesion part may be shown on the screen of the display unit 51.
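The pixel-to-actual-size comparison of steps S4-3 and S4-4 reduces to a simple proportion against the index-light region of known size. The following is an illustrative sketch only; the function and parameter names are hypothetical and not part of the disclosure.

```python
def classify_convex_part(d_pixels, index_pixels, index_mm, size_range_mm):
    """Convert the convex portion's diameter from pixels to millimetres
    using the index-light region of known actual size (steps S4-3/S4-4),
    then judge whether it falls in the specified lesion size range.
    All names here are illustrative assumptions."""
    mm_per_pixel = index_mm / index_pixels      # index light spans a known width
    d_mm = d_pixels * mm_per_pixel
    return d_mm, size_range_mm[0] <= d_mm <= size_range_mm[1]

# example: an index region 5 mm wide spanning 50 pixels gives 0.1 mm/pixel,
# so a 60-pixel diameter corresponds to 6.0 mm, inside a 5-20 mm range
d_mm, is_lesion = classify_convex_part(60, 50, 5.0, (5.0, 20.0))
```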
- an image 74 corresponding to the minute convex portion 64 is also captured.
- since the size of the convex portion 64 is smaller than the specified range, it is determined that the convex portion 64 need not be detected. Further, there may be no detectable low-luminance portion for the minute convex portion 64, or the change in its low luminance portion may be sufficiently small. In these cases, the convex portion 64 is not treated as a low luminance region.
- the convex portion can be reliably detected by performing imaging while controlling the relative positional relationship between the imaging unit and the illumination unit.
- whether the detected convex part falls within the predetermined size range is determined by index light irradiation, so that only convex parts to be identified as lesions can be detected and reported to the operator.
- FIG. 6 is a block diagram showing a second embodiment of the present invention.
- the same components as those in FIG. 1 are denoted by the same reference numerals. In the first embodiment, an example was described in which imaging is performed while relatively moving the imaging unit and the illumination unit, and the size of the convex part is obtained by detecting the convex part from its shadow in the captured image.
- in some observation geometries, a shadow due to the convex portion does not occur. Therefore, in the present embodiment, a cap-shaped lighting fixture is used to generate a shadow due to the convex portion, so that the convex portion and its size are reliably detected.
- FIG. 7 and FIG. 8 are explanatory diagrams for explaining a usage pattern of the endoscope system in the present embodiment.
- FIG. 7 shows a state where the observation site 151 is observed by the insertion portion 110a of the endoscope 110 to which the cap 130 that is a lighting device is attached.
- when the surface of the observation part 151 is inclined with respect to the direction of the distal end of the insertion part 110a, even if the observation part 151 is irradiated with illumination light and a shadow that is a low-luminance part is observed, its size cannot always be detected reliably.
- the orientation of the insertion portion 110a is therefore set at a predetermined angle, for example parallel (or perpendicular) to the surface of the observation site 151, so that the projection is reliably shaded by the illumination light.
- the cap 130 is configured, for example, in a cylindrical shape with an opening on the distal end side, and the distal end of the insertion portion 110a is fitted on the proximal end side. As shown in FIG. 8, the tip of the cap 130 is brought into contact with the surface of the observation site 151 so that the convex portion 152 is positioned in the opening of the tip of the cap 130, and is pushed in the direction indicated by the dashed arrow. As a result, the surface of the observation region 151 is deformed, and the orientation of the surface of the observation region 151 is substantially perpendicular to the axial direction of the insertion portion 110a.
- An illumination part 131 (first to third illumination parts 131a to 131c to be described later) made of, for example, LEDs is provided on the inner peripheral surface on the tip side of the cap 130.
- as indicated by the solid line arrows in the figure, the illumination part 131 irradiates illumination light toward the center side of the cap 130.
- the observation site 151 is imaged by the endoscope 110, and the convex portion 152 and the size thereof are detected according to the state of the shadow in the captured image. Illumination light is irradiated from the side surface side of the convex portion 152, so that a shadow by the convex portion 152 can be reliably generated, and the convex portion 152 can be detected.
- the endoscope 110 is provided with an imaging unit 113 and a fourth illumination unit 114 in the insertion unit 110a.
- the distal end of the insertion portion 110 a of the endoscope 110 is fitted into the proximal end side opening of the cap 130.
- the cap 130 is provided with three first to third illumination units 131a to 131c (hereinafter referred to as the illumination unit 131 as described above when it is not necessary to distinguish each of them).
- an example is shown in which three illumination units, the first to third illumination units 131a to 131c, are provided as the illumination unit 131, but two or more illumination units may be arranged.
- the cap 130 is configured, for example, in a cylindrical shape, and any of the illumination units 131 can irradiate illumination light toward the center of the cap 130.
- the illumination units 131a to 131c irradiate illumination lights having different wavelength bands, for example, R (red) light, G (green) light, and B (blue) light, respectively.
- FIG. 9 is a perspective view showing the distal end of the insertion portion 110a to which the cap 130 is attached.
- FIG. 10 is a cross-sectional view showing the configuration of the distal end of the insertion portion 110a and the cap 130.
- the distal end of the insertion portion 110a is fitted into the proximal end side opening of the cylindrical cap 130 having both ends opened.
- a positioning convex portion 134 is provided on the inner peripheral surface of the cap 130, and the insertion position of the insertion portion 110 a into the cap 130 is determined by the convex portion 134.
- An imaging element 113b such as a CCD that constitutes the imaging unit 113 is disposed at the distal end of the insertion unit 110a.
- a lens group 113c composed of a plurality of lenses is provided between the image sensor 113b and the distal end surface of the insertion portion 110a, and return light from the subject is imaged on the image sensor surface of the image sensor 113b via the lens group 113c.
- a cable 113a inserted through the insertion unit 110a from the processor device 140 is connected to the image sensor 113b, and various signals for driving the image sensor 113b and the captured image obtained by the image sensor 113b are transmitted over the cable 113a.
- the fourth illumination part 114, composed of an illumination lens, is arranged at the distal end of the insertion portion 110a.
- the light from the light source 32 guided by the fiber cable 114a inserted through the insertion section 110a is irradiated to the subject through the fourth illumination section 114 facing the emission end face of the fiber cable 114a.
- the fourth illumination unit 114 is used for observing the subject.
- an electrical contact 115b is provided on the tip side surface.
- the electrical contact 115b is connected to the signal line 115a inserted into the insertion portion 110a and exposed to the outer peripheral surface of the insertion portion 110a.
- an electrical contact 132 a is provided on the inner peripheral surface of the cap 130, and the electrical contacts 115 b and 132 a come into contact with each other by inserting the insertion portion 110 a into the cap 130.
- the electrical contact 132a is connected to the illumination unit 131 through a signal line 132b such as a conductive wire or a flexible substrate.
- the signal line 115 a is connected to the light source 32.
- the light source 32 can control the lighting of the illumination unit 131 by supplying power to the illumination unit 131 via the signal line 115a, the electrical contacts 115b and 132a, and the signal line 132b.
- the illumination unit 131 can irradiate the illumination light 133 toward the center of the cap 130.
- FIG. 10 shows only one illumination unit 131 for simplification of the drawing, but in practice two or more illumination units 131 are provided, together with a power supply path configured to drive them.
- the processor device 140 includes a processor such as a CPU (not shown), and each unit in the processor device 140 can be controlled by this processor.
- the light source control unit 50 provided in the processor device 140 can control lighting of the first to third illumination units 131a to 131c provided in the cap 130 by controlling the light source 32.
- the processor device 140 has an image processing unit 141.
- the image processing unit 141 has the same configuration as the image processing unit 42 in FIG. 1, performs predetermined image signal processing on the captured image from the imaging unit 113, and then gives the image to the display unit 51 for display.
- the processor device 140 is provided with a low-luminance portion detection unit 142, a convex lesion region specifying unit 143, and a convex lesion size calculation unit 144.
- the low luminance part detection unit 142 is provided with a captured image from the imaging unit 113 and detects a low luminance part in the captured image for each band of each illumination unit 131.
- the convex lesion area specifying unit 143 specifies the convex lesion area based on the detection result of the low luminance part detecting unit 142.
- the convex lesion size calculator 144 calculates the actual size of the convex lesion based on the identified convex lesion region.
- the convex lesion size calculation unit 144 outputs the calculation result to the image processing unit 141. Based on the calculation result of the convex lesion size calculation unit 144, the image processing unit 141 can display information about the convex lesion existing in the observation site on the display unit 51.
- FIG. 11 is a flowchart for explaining the operation of the second embodiment.
- FIGS. 12 to 14 are explanatory views for explaining a method of detecting a convex lesion by illumination in the cap 130, and show the state of illumination at the tip of the cap 130 as viewed from the axial direction of the insertion portion 110a.
- first, the insertion part 110a is placed at the site to be observed. That is, the surgeon inserts the insertion portion 110a, with the cap 130 attached to its distal end, into the body cavity and pushes the distal end surface of the cap 130 against the observation site 151. Thereby, the front end surface of the cap 130 and the surface of the observation site 151 become substantially parallel, and when there is a convex portion at the observation site 151, the illumination unit 131 disposed on the inner peripheral surface near the front end of the cap 130 illuminates the convex portion from a direction perpendicular to its protruding direction.
- the processor device 140 detects a low luminance part in the observation region 151 in step S11.
- the light source control unit 50 of the processor device 140 controls the light source 32 in step S11-1 to irradiate illumination light from the first to third illumination units 131a to 131c in the cap 130.
- the first to third illumination units 131a to 131c emit R, G, and B light, respectively.
- FIG. 12 shows the state of this illumination: the three illumination units 131a to 131c, arranged on the inner peripheral surface of the cap 130 at a predetermined angular interval (for example, equal intervals of 120 degrees), emit R light, G light, and B light with the directions of the arrows as their optical axes.
- the imaging unit 113 performs imaging under the control of the processor device 140.
- the imaging unit 113 acquires an image of the surface of the observation site 151 surrounded by the cap 130 (step S11-2).
- the captured image from the imaging unit 113 is supplied to the image processing unit 141 and the low luminance unit detection unit 142 of the processor device 140.
- the image processing unit 141 performs appropriate image signal processing on the captured image and then provides the image to the display unit 51. Thereby, the surface image of the observation site 151 can be displayed on the screen of the display unit 51.
- the low luminance part detection unit 142 detects the R component from the captured image from the imaging unit 113 and extracts a low luminance region by binarizing it using a predetermined threshold (step S11-3-1). Similarly, in steps S11-3-2 and S11-3-3, the low luminance part detection unit 142 detects the G component and the B component from the captured image and extracts a low luminance region for each by binarizing with a predetermined threshold. The extraction results for the R, G, and B components are supplied to the convex lesion region specifying unit 143. In step S12, the convex lesion region specifying unit 143 specifies the convex lesion region based on the output of the low luminance portion detecting unit 142.
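The per-channel binarization of steps S11-3-1 to S11-3-3 can be sketched as follows. This is an illustrative sketch: the nested-list image representation, the single shared threshold, and the set-of-coordinates masks are assumptions, not the disclosed implementation.

```python
def low_luminance_masks(image, threshold):
    """Binarize each colour component of an RGB image: a pixel belongs to
    a channel's low-luminance region when that component falls below the
    threshold.  `image` is a nested list of (r, g, b) tuples; returns one
    mask (a set of (row, col) coordinates) per channel."""
    masks = {"R": set(), "G": set(), "B": set()}
    for r, row in enumerate(image):
        for c, (red, green, blue) in enumerate(row):
            if red < threshold:
                masks["R"].add((r, c))
            if green < threshold:
                masks["G"].add((r, c))
            if blue < threshold:
                masks["B"].add((r, c))
    return masks

# toy image: first pixel dark in R only, second pixel dark in G only
m = low_luminance_masks([[(10, 200, 200), (200, 10, 200)]], threshold=50)
```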
- the illumination unit 131 irradiates light in a direction substantially parallel to the surface of the observation site 151 from a position near the tip of the cap 130 in a space 135 surrounded by the inner peripheral surface of the cap 130. Therefore, when there is a convex portion protruding from the surface of the observation site 151 due to the illumination of the illuminating unit 131, it is considered that a shadow caused by the convex portion is surely generated. In other words, it can be determined that the low luminance region in the surface image is due to the shadow of the convex portion.
- FIG. 13 shows a low luminance region by hatching when a convex portion exists at the center of the cap 130.
- the low luminance region 161a is due to the shadow of the convex portion due to the illumination of the first illumination unit 131a.
- the low luminance region 161b is due to the shadow of the convex portion due to the illumination of the second illumination unit 131b
- the low luminance region 161c is due to the shadow of the convex portion due to the illumination of the third illumination unit 131c. Since the illumination lights of the first to third illumination units 131a to 131c are R, G, and B light, respectively, the low-luminance region 161a is considered to be based on a cyan shadow lacking the R light from the first illumination unit 131a out of the R, G, and B light.
- the low luminance region 161b is considered to be based on a magenta shadow excluding the G light from the second illumination unit 131b among the R, G, and B light
- the low luminance region 161c is considered to be based on a yellow shadow lacking the B light from the third illumination unit 131c among the R, G, and B light.
- based on the extraction result of the low-luminance portion detection unit 142, the convex lesion region specifying unit 143 extracts the intersections 162 between the low-luminance regions 161a to 161c, that is, the points where the R, G, and B components are all reduced (step S12-1).
- the convex lesion region specifying unit 143 assumes that a circle 163 passing through the intersections 162 of the low luminance regions of the R, G, and B components is the outer shape of the convex lesion region.
- the convex lesion region specifying unit 143 outputs information on the circle 163 assumed to be the outer shape of the convex lesion region to the convex lesion size calculating unit 144.
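A circle through the three intersections 162 can be computed with the standard circumcircle formula. The following sketch works in pixel coordinates and is illustrative only, not part of the disclosure.

```python
import math

def circumcircle(p1, p2, p3):
    """Circle through three points (here, the shadow-intersection points
    162), assumed to trace the outer shape of the convex lesion region
    (circle 163).  Returns (centre_x, centre_y, diameter)."""
    ax, ay = p1
    bx, by = p2
    cx, cy = p3
    # standard circumcenter formula; d = 0 means the points are collinear
    d = 2 * (ax * (by - cy) + bx * (cy - ay) + cx * (ay - by))
    ux = ((ax**2 + ay**2) * (by - cy) + (bx**2 + by**2) * (cy - ay)
          + (cx**2 + cy**2) * (ay - by)) / d
    uy = ((ax**2 + ay**2) * (cx - bx) + (bx**2 + by**2) * (ax - cx)
          + (cx**2 + cy**2) * (bx - ax)) / d
    radius = math.hypot(ax - ux, ay - uy)
    return ux, uy, 2 * radius

# three points on the unit circle give centre (0, 0) and diameter 2
cx_, cy_, dia = circumcircle((0, 1), (1, 0), (-1, 0))
```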
- the convex lesion size calculation unit 144 calculates the size of the convex lesion part in step S13. That is, the convex lesion size calculation unit 144 calculates the number of pixels of the diameter d of the circle 163 assumed to be a convex lesion part in step S13-1.
- the convex lesion size calculation unit 144 calculates the actual length of one pixel of the captured image in step S13-2. Since the cap 130 is attached to the distal end of the insertion unit 110a to perform imaging, the distance from the imaging unit 113 to the subject is known. In addition, since the viewing angle of the imaging unit 113 is known and the characteristics of the lens group 113c are also known, the convex lesion size calculation unit 144 can calculate the actual length of one pixel of the captured image. Note that since the inner diameter of the cap 130 is also known, the convex lesion size calculation unit 144 may calculate the actual length of one pixel from the number of pixels of the inner diameter of the cap 130. Thus, in this embodiment, the actual length of one pixel of the captured image can be obtained without irradiating the index light.
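The pixel-scale calculation of step S13-2 follows from simple geometry: with the subject distance fixed by the cap and the viewing angle known, the imaged field spans 2·distance·tan(angle/2). Both variants described above are sketched below; the parameter names are hypothetical assumptions, not the disclosed interface.

```python
import math

def mm_per_pixel_from_geometry(distance_mm, view_angle_deg, image_width_px):
    """Actual length of one pixel when the subject distance (fixed by the
    cap length) and the imaging unit's viewing angle are known:
    the field of view spans 2*distance*tan(angle/2) over image_width_px."""
    field_mm = 2 * distance_mm * math.tan(math.radians(view_angle_deg) / 2)
    return field_mm / image_width_px

def mm_per_pixel_from_cap(inner_diameter_mm, inner_diameter_px):
    """Alternative: scale from the cap's known inner diameter as imaged."""
    return inner_diameter_mm / inner_diameter_px

# e.g. a 90-degree viewing angle at 10 mm covers ~20 mm across 400 pixels
scale_a = mm_per_pixel_from_geometry(10.0, 90.0, 400)
scale_b = mm_per_pixel_from_cap(12.0, 240)
```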
- the convex lesion size calculation unit 144 determines whether the portion is a convex lesion to be detected based on whether the size d of the convex lesion portion is within the specified range.
- the convex lesion size calculation unit 144 outputs the determination result as to whether or not the portion is a convex lesion part. For example, the determination result is supplied to the display unit 51 via the image processing unit 141, and a display indicating whether or not the convex portion observed in the cap 130 is a convex lesion portion, or a display indicating that a convex lesion portion is present, is performed.
- thus, in this embodiment, a cap is attached to the front end of the insertion portion, so that a shadow due to the convex portion is reliably generated and the convex portion and its size can be detected.
- (Modification) FIGS. 15 to 17 are cross-sectional views showing modifications of the insertion portion and the cap that can be employed in the second embodiment.
- the imaging unit 113 and the fourth illumination unit 114 are not shown.
- the insertion portion 170 shown in FIG. 15 has the same configuration as the insertion portion 110a of FIG. 10 except that the electrical contact 115b is omitted and the signal line 115a connected to the electrical contact 115b is not inserted.
- the cap 175 is different from the cap 130 of FIG. 10 in the power supply path to the illumination unit 131.
- the cap 175 is provided with an electrical contact 176 on the outer peripheral surface.
- the electrical contact 176 and the illumination unit 131 are connected by a signal line 177.
- a bipolar forceps 179, such as that of a high-frequency cautery device, is brought into contact with the electrical contact 176 and energized.
- power can be supplied from the bipolar forceps 179 of the high-frequency cautery to the illumination unit 131 via the electrical contact 176 and the signal line 177, and the illumination unit 131 can be turned on.
- FIG. 16 shows an example in which the illumination unit 131 is configured by an illumination lens.
- the insertion unit 181 in FIG. 16 includes a cable 113a, an image sensor 113b, and a lens group 113c, similarly to the insertion unit 110a.
- the insertion portion 181 is provided with a lens 183 constituting the fourth illumination portion 114 at the distal end surface.
- a fiber cable 182 that guides light from the light source 32 is disposed in the insertion portion 181.
- the exit end face of the fiber cable 182 faces the lens 183, and the light guided by the fiber cable 182 is emitted from the lens 183.
- a lens 193 having a light exit surface exposed on the inner peripheral surface is disposed on the inner peripheral surface near the tip of the cap 191.
- the cap 191 has a thick side wall, and a light guide 194 is formed in the side wall.
- the opening at one end of the light guide 194 faces a part of the exit surface of the lens 183, and the opening at the other end faces the lens 193.
- a fiber cable 195 is disposed in the light guide 194, and a part of the light emitted from the lens 183 is emitted from the lens 193 via the fiber cable 195 in the light guide 194, as indicated by the arrow.
- in FIG. 17, the illustration of the imaging unit 113 is omitted.
- the insertion portion 170 shown in FIG. 17 has the same configuration as the insertion portion 110a of FIG. 10 except that the electrical contact 115b is omitted and the signal line 115a connected to the electrical contact 115b is not inserted.
- the cap 201 uses a phosphorescent material or a phosphor as the illumination unit 202.
- the illumination unit 202 using the phosphorescent body accumulates light from the fourth illumination unit 114 and emits it toward the center of the cap 201 as indicated by an arrow.
- the surface of the observation site is not always flat.
- by projecting a lattice pattern from the illumination unit (for example, the first illumination unit 15 in FIG. 1 or the fourth illumination unit 114 in FIG. 6) onto the observation site, the detection accuracy of the convex portion and its size is improved.
- FIGS. 18A to 18E are explanatory views for explaining the shape of the lattice projected on the observation site.
- FIG. 18A shows an example in which the surface of the observation site is flat.
- FIGS. 18B to 18E show cases where the surface of the observation site is convex, concave, inclined, or partially convex, respectively.
- the processor devices 40 and 140 estimate the surface shape of the observation region based on the lattice shape in the captured image, and can improve the detection accuracy of the convex shape by using a correction value based on the estimation result.
- FIGS. 19A to 19C are explanatory diagrams showing an example of the configuration in this case.
- FIG. 19A shows an example of an insertion portion 221 provided with pressing members for the observation site.
- a plurality of claw members 223 that are extendable in the axial direction of the insertion portion 221 are disposed on the side surface of the insertion portion 221.
- Each claw member 223 can be individually expanded and contracted in an expansion / contraction direction indicated by an arrow in FIG. 19A by a driving mechanism (not shown).
- the observation part 225 is inclined with respect to the axial direction of the insertion part 221, and the captured image of the lattice projected from the insertion part 221 onto the surface of the observation part 225 is distorted as shown in FIG. 19B.
- the processor device can obtain the lattice image 229 shown in FIG. 19C by extending, for example, one of the claw members 223 toward the observation site 225 based on the lattice shape of the lattice image 228. Thereby, it is possible to improve the detection accuracy of a convex shape.
- the size of each grid of the grid image 229 shown in FIG. 19C is known. Therefore, by using the lattice image 229, it is possible to calculate the actual size of the detected convex lesion.
- the grid can be used as an index for detecting the size.
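Using the lattice as a size index reduces to scaling the lesion's pixel extent by the known actual size of one grid cell. A minimal illustrative sketch (all names are hypothetical):

```python
def lesion_size_from_grid(lesion_px, grid_cell_px, grid_cell_mm):
    """Use the projected lattice as a size index: each grid cell covers a
    known actual length, so a length measured in pixels scales by
    grid_cell_mm / grid_cell_px."""
    return lesion_px * grid_cell_mm / grid_cell_px

# e.g. a 2 mm grid cell spanning 40 pixels: a 100-pixel lesion is 5 mm
size_mm = lesion_size_from_grid(100, 40, 2.0)
```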
- FIG. 20 is a block diagram showing a third embodiment of the present invention.
- in the above embodiments, an example has been described in which the subject is illuminated from different directions at the same time or at different times, the convex portion is detected from the state of the shadow, and its size is obtained.
- in the present embodiment, the size of the convex portion is obtained by illuminating the subject from only one direction.
- that is, this embodiment shows a method for calculating the size of the convex portion.
- the insertion unit 210 a of the endoscope 210 is provided with an imaging unit 213, a second illumination unit 214, and a probe raising device 215.
- the probe raising device 215 is provided on the side surface of the insertion portion 210a, and the probe 230 can be arranged to protrude from the probe raising device 215 in a state inclined at a predetermined angle with respect to the axial direction of the insertion portion 210a.
- a probe insertion port is provided in an operation unit (not shown) attached to the proximal end side of the insertion unit 210a, and the probe 230 is inserted into a channel that penetrates the insertion unit 210a from the probe insertion port to the probe lifting device 215.
- the probe 230 is advanced and retracted from the probe insertion port side, and the probe raising device 215 is driven by operating an operation lever (not shown) provided on the operation unit, whereby the protrusion amount and the inclination angle of the probe 230 from the insertion unit 210a can be freely controlled.
- FIG. 21 is an explanatory diagram for explaining a measurement method by the endoscope system in the present embodiment.
- the imaging unit 213 and the second illumination unit 214 are provided at the distal end of the insertion unit 210a, and the endoscope 210 can illuminate and image in the axial direction of the insertion unit 210a.
- the probe 230 protrudes from a probe raising device 215 (not shown in FIG. 21) provided on the side surface of the insertion portion 210a at an angle with respect to the axial direction of the insertion portion 210a.
- the protrusion amount and inclination angle of the probe 230 are known.
- a first illumination unit 231 is disposed on the side surface on the distal end side of the probe 230.
- the first illumination unit 231 can emit light from the light source 32.
- the first illumination unit 231 may also be configured by an LED; in this case, the first illumination unit 231 is supplied with electric power from the light source 32, for example, and its light emission is controlled.
- the first illumination unit 231 can emit parallel light, and may further be able to adjust the size of the light beam.
- the first illuminating unit 231 may be capable of emitting two types of light beams, a beam-like light with a sufficiently narrow light beam and a light beam with a predetermined width.
- FIG. 21 shows an example of measuring the size of the convex portion 252 formed on the surface of the observation site 251.
- by appropriately setting the protrusion amount and the inclination angle of the probe 230, the first illumination unit 231 can illuminate the convex part 252 obliquely with respect to the protruding direction of the convex part 252.
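The disclosure does not state the formula used in this embodiment, but with a single oblique illumination at a known angle, the extent of a convex part can commonly be recovered by shadow trigonometry. The following is a hypothetical reconstruction under that assumption, not the disclosed method: the probe's protrusion amount and inclination angle fix the illumination's elevation angle above the surface.

```python
import math

def height_from_shadow(shadow_len_mm, elevation_deg):
    """Estimate convex-part height from the shadow it casts under oblique
    illumination at a known elevation angle above the surface:
        shadow = height / tan(elevation)  =>  height = shadow * tan(elevation)
    Hypothetical reconstruction: the known probe geometry fixes the angle,
    but the document does not state this formula."""
    return shadow_len_mm * math.tan(math.radians(elevation_deg))

# at 45 degrees elevation, a 3 mm shadow implies a height of about 3 mm
h = height_from_shadow(3.0, 45.0)
```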
- the processor device 240 includes a processor such as a CPU (not shown), and each unit in the processor device 240 can be controlled by this processor.
- the light source control unit 50 provided in the processor device 240 controls the light source 32 to control the lighting of the first illumination unit 231 and the second illumination unit 214.
- the processor device 240 has an image processing unit 241.
- the image processing unit 241 has the same configuration as the image processing unit 42 in FIG. 1; it performs predetermined image signal processing on the captured image from the imaging unit 213 and then provides the result to the display unit 51 for display.
- the processor device 240 is provided with a probe drive control unit 242, a convex lesion region specifying unit 243, and a convex lesion size calculating unit 244.
- the probe drive control unit 242 can change the tilt angle of the probe 230 by driving the probe raising device 215 based on the operation of the operator. Note that the protruding amount of the probe 230 can be changed by the operation of the operator.
- the convex lesion area specifying unit 243 specifies a convex lesion area.
- the convex lesion size calculator 244 calculates the actual size of the convex lesion based on the identified convex lesion region.
- the convex lesion size calculation unit 244 outputs the calculation result to the image processing unit 241. Based on the calculation result of the convex lesion size calculation unit 244, the image processing unit 241 can display information about the convex lesion existing in the observation site on the display unit 51.
- FIG. 22 is a flowchart for explaining the operation of the third embodiment.
- FIG. 23 is an explanatory diagram showing the shadow of a convex portion in the third embodiment.
- the insertion portion 210a is placed at a site to be observed.
- illumination is performed by the second illumination unit 214 provided at the distal end of the insertion unit 210a.
- imaging is performed by the imaging unit 213.
- the captured image is displayed on the display unit 51.
- the operator observes the display on the display unit 51.
- An observation site may be specified.
- a convex portion may also be detected by a method similar to those of the above embodiments. In this case, since the convex lesion can be specified, a display indicating the outer shape of the convex lesion may be superimposed on the captured image displayed on the display unit 51.
- step S21 of FIG. 22 shows an example in which the convex lesion part is specified from the low luminance region by the same method as in the second embodiment (step S12).
- the process of step S21 can be executed by using three probes and first illumination units similar to the probe 230 and the first illumination unit 231.
- FIG. 21 shows a state in which the insertion portion 210a is fixedly arranged in a state where the axial direction of the insertion portion 210a faces the convex portion 252 on the surface of the observation site 251.
- the probe drive control unit 242 changes the protruding amount and the protruding angle of the probe 230 based on the operator's operation, and enables the first illumination unit 231 to illuminate the convex portion 252.
- the first illumination unit 231 is turned on by the light source control unit 50.
- FIG. 23 shows the convex portion 252 and the shadow 261 generated on the convex portion 252 in this case.
- the convex portion 252 and the shadow 261 are captured by the imaging unit 213, and the captured image is given to the convex lesion region specifying unit 243.
- based on the captured image, the convex lesion region specifying unit 243 obtains the arc of the outline of the shadow 261 (thick line in FIG. 23) and estimates the circle passing through the arc as the outer shape 252a of the convex lesion (broken line in FIG. 23).
- the convex lesion region specifying unit 243 calculates the center point O of the estimated outer shape 252a.
- the convex lesion region specifying unit 243 gives the information on the obtained center point O to the image processing unit 241, which displays an indication of the position in the captured image.
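The estimation step above can be made concrete: given points sampled on the arc of the shadow's outline, the circle through them determines the estimated outer shape 252a and its center point O. A minimal sketch, assuming three sampled arc points (the function name and the three-point sampling strategy are illustrative, not taken from the patent):

```python
def circle_through(p1, p2, p3):
    """Estimate the circle passing through three points sampled on the
    shadow's outline arc, returning (center_x, center_y, radius).

    This is one simple way to realize the outer-shape estimation;
    a robust implementation would least-squares-fit many arc points.
    """
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    # Perpendicular-bisector equations: the center (cx, cy) satisfies
    # a*cx + b*cy = e and c*cx + d*cy = f.
    a = x2 - x1; b = y2 - y1
    c = x3 - x1; d = y3 - y1
    e = a * (x1 + x2) / 2 + b * (y1 + y2) / 2
    f = c * (x1 + x3) / 2 + d * (y1 + y3) / 2
    det = a * d - b * c
    if abs(det) < 1e-12:
        raise ValueError("arc points are collinear")
    cx = (e * d - b * f) / det
    cy = (a * f - e * c) / det
    r = ((x1 - cx) ** 2 + (y1 - cy) ** 2) ** 0.5
    return cx, cy, r
```

For example, three points on a circle of radius 5 centered at the origin recover that center and radius, which would correspond to the outer shape 252a and center point O in pixel coordinates.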
- in step S22, the convex lesion size calculation unit 244 obtains the size of the convex lesion. That is, in step S22-1, the convex lesion size calculation unit 244 displays an indication of the position of the center point Q of the visual field range of the imaging unit 213 on the captured image displayed on the display unit 51.
- the operator performs fine adjustment of the position and orientation of the insertion portion 210a, the protruding amount of the probe 230, and the inclination angle in order to calculate the size of the convex lesion.
- the positional relationship between the imaging unit 213 and the convex portion 252, the visual field direction and range of the imaging unit 213, and the illumination direction of the first illumination unit 231 are defined.
- the convex lesion size calculation unit 244 controls the light source control unit 50 to narrow the luminous flux of the illumination light of the first illumination unit 231 into a sufficiently thin beam (step S22-2).
- in step S23-3, while viewing the endoscopic image displayed on the display unit 51, the surgeon finely adjusts the position of the insertion portion 210a so that the display of the convex lesion center point O coincides with the display of the imaging unit center point Q. Next, the surgeon finely adjusts the protrusion amount and the inclination angle of the probe 230 so that the center of the illumination light, which corresponds to the center point P of the first illumination unit, coincides with the point on the surface of the convex portion 252 corresponding to the convex lesion center point O.
- a light beam 260a in FIG. 21 indicates a beam-shaped light beam used for this adjustment.
- to generate the shadow 261 of FIG. 23, the convex lesion size calculation unit 244 widens the irradiation range of the light from the first illumination unit 231 (step S23-4). The first illumination unit 231 thereby illuminates the convex portion 252 with the light beam 260 shown by hatching in FIG. 21. In step S23-5, the convex lesion size calculation unit 244 obtains the length l of the shadow 261 from the number of pixels in the low-luminance region of the captured image corresponding to the shadow 261, and likewise obtains the visual field range W in pixels.
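Measuring l by counting low-luminance pixels can be sketched as below. The luminance threshold is an assumed 8-bit value, and a real implementation would extract the 2-D low-luminance region rather than scan a single row:

```python
def shadow_length_px(gray_row, threshold=30):
    """Return the longest run of low-luminance pixels along one image row,
    a simple stand-in for measuring the shadow length l in pixels.

    gray_row  : iterable of 8-bit luminance values along a row
    threshold : assumed luminance cutoff for "shadow" (not from the patent)
    """
    best = run = 0
    for v in gray_row:
        run = run + 1 if v < threshold else 0  # extend or reset the dark run
        best = max(best, run)
    return best
```

The same counting applied to the full image width (or diagonal) yields W in pixels, so the ratio K = l/W used later is obtained without knowing the physical pixel pitch.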
- the convex lesion size calculation unit 144 calculates the height h of the convex lesion part as follows using a known value assuming that the shape of the convex lesion part is a hemisphere (step S23-6). ). As shown in FIG. 21, the surface of the observation region 251 is flat, and the direction perpendicular to the surface coincides with the axial direction of the insertion portion 210a. A direction parallel to the surface of the part 251 will be described as a horizontal direction. In FIG. 21, the meanings of the symbols are as follows.
- h: height of the convex lesion
- H: height from the surface of the observation site 251 to the imaging unit 213
- h′: difference between the height of the imaging unit 213 and the height of the center point P of the first illumination unit 231
- d: diameter of the convex portion 252
- l: length of the shadow 261 of the convex portion 252
- W: visual field range (diagonal) of the imaging unit 213
- y: horizontal distance between the center point P of the first illumination unit 231 and the center Q of the imaging unit 213
- θ1: inclination angle of the probe 230 (angle formed by the central axis of the insertion portion 210a and the central axis of the probe 230)
- θ2: viewing angle of the imaging unit 213
- φ: angle formed by the optical axis of the imaging unit 213 and the illumination direction of the first illumination unit 231
- Let x be the distance from the center of the convex portion 252 (the center point O of the convex lesion) to the tip of the shadow 261; then, as shown in FIG. 21, the illumination light from the first illumination unit 231 makes the angle θ1 with the horizontal direction, so x = h/tanθ1 holds.
- the convex lesion size calculation unit 244 determines whether the convex portion is a convex lesion to be detected based on whether its size h is within a specified range.
- the convex lesion size calculation unit 244 outputs the result of this determination. For example, the determination result is supplied to the display unit 51 via the image processing unit 241, and a display indicating whether or not the convex portion being observed is a convex lesion, or a display indicating that it is a convex lesion, is presented.
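The determination described above reduces to a simple interval test on the calculated height. A sketch with an assumed size range (the bounds are illustrative, not values from the patent):

```python
def is_target_lesion(h_mm, size_range=(1.0, 5.0)):
    """Judge whether a measured convex-portion height h (in mm) falls
    within the specified lesion size range.

    size_range : assumed (min, max) bounds for a lesion of interest
    """
    lo, hi = size_range
    return lo <= h_mm <= hi
```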
- each unit in the processor devices 40, 140, and 240 in the above embodiments may be configured by at least one processor that realizes each function according to a program, or each unit may be configured by hardware that realizes each function.
- the present invention is not limited to the above-described embodiments as they are, and can be embodied by modifying constituent elements without departing from the scope of the invention in the implementation stage.
- various inventions can be formed by appropriately combining the plurality of constituent elements disclosed in the above embodiments. For example, some of the constituent elements shown in an embodiment may be deleted.
- constituent elements over different embodiments may be appropriately combined.
- as described above, the size of a convex portion can be detected from the captured image with high accuracy, and a lesioned portion can be specified.
Abstract
Description
FIG. 1 is a block diagram showing an endoscope system according to a first embodiment of the present invention. FIG. 2 is an explanatory diagram showing a state in which the insertion portion of the endoscope is inserted into a body cavity.
FIG. 6 is a block diagram showing a second embodiment of the present invention. In FIG. 6, the same components as in FIG. 1 are denoted by the same reference numerals, and their description is omitted. In the first embodiment, an example was described in which imaging is performed while the imaging unit and the illumination unit are moved relative to each other, and a convex portion is detected from its shadow in the captured image to obtain its size. Depending on the state of the surface of the living tissue, the irradiation position of the illumination light, and so on, a shadow of the convex portion may not be produced. Therefore, in the present embodiment, a cap-shaped illumination fixture is used to cast a shadow of the convex portion, so that the convex portion and its size are reliably detected.
FIGS. 15 to 17 are cross-sectional views showing modifications of the insertion portion and the cap that can be employed in the second embodiment.
Incidentally, the surface of the observation site is not always flat. If the surface of the observation site is not flat, the accuracy of detecting a convex portion from the captured image and of calculating its size may decrease. Therefore, in this modification, a lattice is projected onto the surface of the observation site by an illumination unit (for example, the first illumination unit 15 in FIG. 1 or the fourth illumination unit 114 in FIG. 6), and the captured image is corrected based on the shape of the projected lattice, thereby improving the accuracy of detecting the convex portion and calculating its size.
FIG. 20 is a block diagram showing a third embodiment of the present invention. In FIG. 20, the same components as in FIG. 1 are denoted by the same reference numerals, and their description is omitted. In each of the above embodiments, an example was described in which the subject is illuminated from mutually different directions at the same time or at different times, and a convex portion is detected from the state of the shadows while its size is obtained. The present embodiment obtains the size of a convex portion using illumination from only one direction. Although the present embodiment shows a technique for obtaining the accurate size of a convex portion whose position is known, the position of a convex portion can also be detected by adopting a technique similar to those of the above embodiments.
Let x be the distance from the center of the convex portion 252 (the center point O of the convex lesion) to the tip of the shadow 261. As shown in FIG. 21, the illumination light from the first illumination unit 231 makes the angle θ1 with the horizontal direction, so x = h/tanθ1 holds. Therefore, l = x − (d/2) = (h/tanθ1) − (d/2). Since the convex portion 252 is assumed to be sufficiently small, for example about 2 mm or less, approximating d = 0 gives the following equation (1). h = l·tanθ1 …(1)
The angle φ is φ = (π/2) − θ1 and is a known value. Using this angle φ and the unknown values h and h′, the height H is given by the following equation (2). H = h + y/tanφ + h′ …(2)
Substituting equation (1) into equation (2) gives the following equation (3). H = l·tanθ1 + (y/tanφ) + h′ …(3)
Since the viewing angle θ2 of the imaging unit 213 is a known value, the relationship between the visual field range W and the height H is also known. The visual field range W is therefore expressed as a function of the height H, as shown in equation (4). W = f(H) …(4)
Since the viewing angle of the imaging unit 213 is θ2, H·tan(θ2/2) = W/2 holds, and the following equation (5) is obtained. W = 2H·tan(θ2/2) …(5)
Here, let l/W = K. Since the pixel counts of l and W have already been obtained, K is a known value. Rewriting this as W = l/K and using equations (3) and (4) gives the following equation (6). W = l/K = f(l·tanθ1 + (y/tanφ) + h′) …(6)
In equation (6), K, θ1, y, φ, and h′ are known values, so l can be calculated from equation (6). Substituting the calculated l into equation (1) gives the height h of the convex portion 252.
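Because f(H) is given explicitly by equation (5), equation (6) is linear in l and can be solved in closed form rather than iteratively. The following is a minimal numeric sketch of that solution; the function name, argument names, and test values are illustrative, not taken from the patent:

```python
import math

def convex_height(K, theta1, theta2, y, h_prime):
    """Solve equations (1)-(6) for the height h of the convex lesion.

    K       : measured pixel ratio l/W (shadow length / field-of-view range)
    theta1  : inclination angle of the probe [rad]
    theta2  : viewing angle of the imaging unit [rad]
    y       : horizontal distance between points P and Q
    h_prime : height difference between the imaging unit and point P
    """
    # Eq. (5): W = 2*H*tan(theta2/2), so f(H) = 2*H*tan(theta2/2).
    t = 2.0 * K * math.tan(theta2 / 2.0)
    # phi = pi/2 - theta1, hence 1/tan(phi) = tan(theta1).
    tan1 = math.tan(theta1)
    # Eq. (6): l = K*W = t*(l*tan(theta1) + y*tan(theta1) + h'); solve for l.
    l = t * (y * tan1 + h_prime) / (1.0 - t * tan1)
    # Eq. (1): h = l*tan(theta1), under the hemisphere and d ≈ 0 approximation.
    return l * tan1
```

For example, with K = 0.2, θ1 = 45°, tan(θ2/2) = 0.5, y = 10, and h′ = 2 (in arbitrary consistent units), the sketch returns h = 3; substituting back gives H = 15, W = 15, and l/W = 0.2 = K, so the solution is self-consistent with equations (3) and (5).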
Claims (13)
- An endoscope system comprising: an illumination unit that emits illumination light to illuminate a predetermined illumination range; an imaging unit that images a predetermined imaging range of a subject illuminated by the illumination unit; a low-luminance portion detection unit that detects a low-luminance portion in a captured image of the subject captured by the imaging unit; and a convex portion size calculation unit that detects a convex portion within a predetermined size range based on information on the low-luminance portion detected by the low-luminance portion detection unit.
- The endoscope system according to claim 1, wherein the convex portion size calculation unit calculates the size of the convex portion based on a comparison with an image portion of known size in the captured image.
- The endoscope system according to claim 2, further comprising an index light irradiation unit that irradiates illumination light of a known size within the imaging range so that the image portion of known size is included in the captured image.
- The endoscope system according to any one of claims 1 to 3, wherein the subject can be imaged by the imaging unit while the imaging unit and the illumination unit move relative to each other.
- The endoscope system according to any one of claims 1 to 4, wherein the convex portion size calculation unit detects the convex portion on the surface of the subject from a change in the shape of the low-luminance portion accompanying the movement.
- The endoscope system according to any one of claims 1 to 5, wherein one of the illumination unit and the imaging unit is provided in an insertion portion of an endoscope, and the other is provided on a probe that extends from the insertion portion so as to be able to advance and retract.
- The endoscope system according to any one of claims 1 to 3, wherein the illumination unit comprises a plurality of illumination light emitting units that illuminate the subject from mutually different directions with light of mutually different wavelength bands.
- The endoscope system according to claim 7, wherein the plurality of illumination light emitting units illuminate the subject from directions substantially orthogonal to the optical axis of the imaging unit.
- The endoscope system according to claim 7 or 8, wherein the imaging unit is attached to the distal end of an insertion portion of an endoscope and has its optical axis in the axial direction of the insertion portion, and the plurality of illumination light emitting units are provided on the distal-side inner peripheral surface of a cylindrical cap that is attached to the distal end of the insertion portion and defines the imaging range of the imaging unit.
- The endoscope system according to claim 9, wherein the convex portion size calculation unit calculates the size of the convex portion based on the size of the imaging range.
- The endoscope system according to claim 7, wherein the convex portion size calculation unit detects a convex portion within a predetermined size range based on information on the low-luminance portion, detected by the low-luminance portion detection unit, that is generated by illumination light from the plurality of illumination light emitting units.
- The endoscope system according to claim 2, wherein the illumination unit projects a lattice of known size within the imaging range so that the image portion of known size is included in the captured image.
- The endoscope system according to claim 12, wherein the convex portion size calculation unit corrects the calculation result of the size of the convex portion based on the image of the lattice projected within the imaging range.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201580035266.2A CN106535734B (zh) | 2014-09-25 | 2015-05-08 | 内窥镜系统 |
EP15843203.9A EP3150103A4 (en) | 2014-09-25 | 2015-05-08 | Endoscope system |
JP2016501262A JP6022106B2 (ja) | 2014-09-25 | 2015-05-08 | 内視鏡システム |
US15/392,312 US10264956B2 (en) | 2014-09-25 | 2016-12-28 | Endoscope system having processor for calculating a size of a convex portion |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2014-195589 | 2014-09-25 | ||
JP2014195589 | 2014-09-25 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/392,312 Continuation US10264956B2 (en) | 2014-09-25 | 2016-12-28 | Endoscope system having processor for calculating a size of a convex portion |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2016047191A1 true WO2016047191A1 (ja) | 2016-03-31 |
Family
ID=55580732
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2015/063305 WO2016047191A1 (ja) | 2014-09-25 | 2015-05-08 | 内視鏡システム |
Country Status (5)
Country | Link |
---|---|
US (1) | US10264956B2 (ja) |
EP (1) | EP3150103A4 (ja) |
JP (1) | JP6022106B2 (ja) |
CN (1) | CN106535734B (ja) |
WO (1) | WO2016047191A1 (ja) |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2018091645A1 (de) * | 2016-11-17 | 2018-05-24 | Westfälische Wilhelms-Universität Münster | Schutzkappe für eine bildgebende vorrichtung |
WO2018159292A1 (ja) * | 2017-03-03 | 2018-09-07 | 富士フイルム株式会社 | 計測支援装置、内視鏡システム、及び内視鏡システムのプロセッサ |
JP2020025882A (ja) * | 2018-08-17 | 2020-02-20 | アクラレント インコーポレイテッドAcclarent, Inc. | 解剖学的上昇アセンブリを備えた内視鏡 |
WO2021132153A1 (ja) * | 2019-12-26 | 2021-07-01 | 富士フイルム株式会社 | 内視鏡及び内視鏡システム |
US11419694B2 (en) | 2017-03-28 | 2022-08-23 | Fujifilm Corporation | Endoscope system measuring size of subject using measurement auxiliary light |
WO2022225218A1 (ko) * | 2021-04-22 | 2022-10-27 | 재단법인 아산사회복지재단 | Led 내시경 캡을 이용한 저준위 레이저 치료기 |
US11490785B2 (en) | 2017-03-28 | 2022-11-08 | Fujifilm Corporation | Measurement support device, endoscope system, and processor measuring size of subject using measurement auxiliary light |
WO2024185468A1 (ja) * | 2023-03-08 | 2024-09-12 | 富士フイルム株式会社 | 医療支援装置、内視鏡システム、医療支援方法、及びプログラム |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10117563B2 (en) * | 2014-01-09 | 2018-11-06 | Gyrus Acmi, Inc. | Polyp detection from an image |
CN109961448A (zh) * | 2019-04-10 | 2019-07-02 | 杭州智团信息技术有限公司 | 组织病变区域勾勒方法及系统 |
WO2021079402A1 (ja) * | 2019-10-21 | 2021-04-29 | 日本電信電話株式会社 | 映像処理装置、表示システム、映像処理方法、およびプログラム |
EP4301203A1 (en) * | 2021-03-03 | 2024-01-10 | Boston Scientific Scimed, Inc. | Scope modifications to enhance scene depth inference |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS5844030A (ja) * | 1981-09-10 | 1983-03-14 | オリンパス光学工業株式会社 | 内視鏡斜照明用シ−ス |
JP2003164419A (ja) * | 2001-12-04 | 2003-06-10 | Pentax Corp | 内視鏡スコープ |
JP2009273655A (ja) * | 2008-05-14 | 2009-11-26 | Fujifilm Corp | 画像処理システム |
JP2010082271A (ja) * | 2008-09-30 | 2010-04-15 | Fujifilm Corp | 凹凸検出装置、プログラム、及び方法 |
JP2011183000A (ja) * | 2010-03-09 | 2011-09-22 | Olympus Medical Systems Corp | 内視鏡装置 |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6563105B2 (en) * | 1999-06-08 | 2003-05-13 | University Of Washington | Image acquisition with depth enhancement |
JP5185505B2 (ja) * | 2006-03-07 | 2013-04-17 | オリンパスメディカルシステムズ株式会社 | 内視鏡システム及びこの内視鏡システムに適用されるアダプタ |
JP4585048B2 (ja) * | 2009-01-15 | 2010-11-24 | オリンパスメディカルシステムズ株式会社 | 内視鏡システム |
-
2015
- 2015-05-08 JP JP2016501262A patent/JP6022106B2/ja active Active
- 2015-05-08 EP EP15843203.9A patent/EP3150103A4/en not_active Withdrawn
- 2015-05-08 CN CN201580035266.2A patent/CN106535734B/zh active Active
- 2015-05-08 WO PCT/JP2015/063305 patent/WO2016047191A1/ja active Application Filing
-
2016
- 2016-12-28 US US15/392,312 patent/US10264956B2/en active Active
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS5844030A (ja) * | 1981-09-10 | 1983-03-14 | オリンパス光学工業株式会社 | 内視鏡斜照明用シ−ス |
JP2003164419A (ja) * | 2001-12-04 | 2003-06-10 | Pentax Corp | 内視鏡スコープ |
JP2009273655A (ja) * | 2008-05-14 | 2009-11-26 | Fujifilm Corp | 画像処理システム |
JP2010082271A (ja) * | 2008-09-30 | 2010-04-15 | Fujifilm Corp | 凹凸検出装置、プログラム、及び方法 |
JP2011183000A (ja) * | 2010-03-09 | 2011-09-22 | Olympus Medical Systems Corp | 内視鏡装置 |
Non-Patent Citations (1)
Title |
---|
See also references of EP3150103A4 * |
Cited By (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2018091645A1 (de) * | 2016-11-17 | 2018-05-24 | Westfälische Wilhelms-Universität Münster | Schutzkappe für eine bildgebende vorrichtung |
US10708553B2 (en) | 2017-03-03 | 2020-07-07 | Fujifilm Corporation | Measurement support device, endoscope system, processor for endoscope system |
WO2018159292A1 (ja) * | 2017-03-03 | 2018-09-07 | 富士フイルム株式会社 | 計測支援装置、内視鏡システム、及び内視鏡システムのプロセッサ |
US11419694B2 (en) | 2017-03-28 | 2022-08-23 | Fujifilm Corporation | Endoscope system measuring size of subject using measurement auxiliary light |
US11490785B2 (en) | 2017-03-28 | 2022-11-08 | Fujifilm Corporation | Measurement support device, endoscope system, and processor measuring size of subject using measurement auxiliary light |
JP2020025882A (ja) * | 2018-08-17 | 2020-02-20 | アクラレント インコーポレイテッドAcclarent, Inc. | 解剖学的上昇アセンブリを備えた内視鏡 |
WO2021132153A1 (ja) * | 2019-12-26 | 2021-07-01 | 富士フイルム株式会社 | 内視鏡及び内視鏡システム |
JPWO2021132153A1 (ja) * | 2019-12-26 | 2021-07-01 | ||
JP7320620B2 (ja) | 2019-12-26 | 2023-08-03 | 富士フイルム株式会社 | 内視鏡及び内視鏡システム |
WO2022225218A1 (ko) * | 2021-04-22 | 2022-10-27 | 재단법인 아산사회복지재단 | Led 내시경 캡을 이용한 저준위 레이저 치료기 |
KR20220145685A (ko) * | 2021-04-22 | 2022-10-31 | 재단법인 아산사회복지재단 | Led 내시경 캡을 이용한 저준위 레이저 치료기 |
KR102567070B1 (ko) | 2021-04-22 | 2023-08-16 | 재단법인 아산사회복지재단 | Led 내시경 캡을 이용한 저준위 레이저 치료기 |
KR20230136085A (ko) * | 2021-04-22 | 2023-09-26 | 재단법인 아산사회복지재단 | Led 내시경 캡을 이용한 저준위 레이저 치료기 |
KR102644509B1 (ko) | 2021-04-22 | 2024-03-08 | 재단법인 아산사회복지재단 | Led 내시경 캡을 이용한 저준위 레이저 치료기 |
WO2024185468A1 (ja) * | 2023-03-08 | 2024-09-12 | 富士フイルム株式会社 | 医療支援装置、内視鏡システム、医療支援方法、及びプログラム |
Also Published As
Publication number | Publication date |
---|---|
JP6022106B2 (ja) | 2016-11-09 |
EP3150103A4 (en) | 2018-02-21 |
CN106535734B (zh) | 2018-02-13 |
US20170105613A1 (en) | 2017-04-20 |
CN106535734A (zh) | 2017-03-22 |
US10264956B2 (en) | 2019-04-23 |
EP3150103A1 (en) | 2017-04-05 |
JPWO2016047191A1 (ja) | 2017-04-27 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6022106B2 (ja) | 内視鏡システム | |
JP5450527B2 (ja) | 内視鏡装置 | |
US20170112356A1 (en) | Image processing apparatus, image processing method, computer-readable recording medium, and endoscope system | |
WO2011055613A1 (ja) | 内視鏡システム | |
JP6442344B2 (ja) | 内視鏡診断装置、プログラムおよび記録媒体 | |
WO2017159335A1 (ja) | 医療用画像処理装置、医療用画像処理方法、プログラム | |
US20160077008A1 (en) | Cancer diagnostic device, diagnostic system, and diagnostic device | |
US20230000330A1 (en) | Medical observation system, medical imaging device and imaging method | |
WO2017115442A1 (ja) | 画像処理装置、画像処理方法および画像処理プログラム | |
EP3108798A1 (en) | Endoscopic system | |
US20230248209A1 (en) | Assistant device, endoscopic system, assistant method, and computer-readable recording medium | |
WO2017126531A1 (ja) | 内視鏡用プロセッサ | |
JP5927077B2 (ja) | 内視鏡システム | |
US12121219B2 (en) | Medical image processing device, medical imaging device, medical observation system, image processing method, and computer-readable recording medium | |
WO2019176253A1 (ja) | 医療用観察システム | |
JP6266559B2 (ja) | 内視鏡診断装置、画像処理方法、プログラムおよび記録媒体 | |
JP7505120B2 (ja) | 光治療装置、光治療装置の作動方法および光治療プログラム | |
US20230371817A1 (en) | Endoscope system | |
US20230347169A1 (en) | Phototherapy device, phototherapy method, and computer-readable recording medium | |
US20170215710A1 (en) | Endoscope system | |
US12133027B2 (en) | Medical control device and medical observation controlling projected illumination | |
WO2024166307A1 (ja) | 医療用装置、医療システム、医療用装置の作動方法、および、医療用装置の作動プログラム | |
WO2024166304A1 (ja) | 画像処理装置、医療システム、画像処理装置の作動方法、及び学習装置 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
ENP | Entry into the national phase |
Ref document number: 2016501262 Country of ref document: JP Kind code of ref document: A |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 15843203 Country of ref document: EP Kind code of ref document: A1 |
|
REEP | Request for entry into the european phase |
Ref document number: 2015843203 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2015843203 Country of ref document: EP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |