WO2015146836A1 - Endoscope system - Google Patents

Endoscope system

Info

Publication number
WO2015146836A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
unit
subject image
acceleration
displayed
Prior art date
Application number
PCT/JP2015/058508
Other languages
English (en)
Japanese (ja)
Inventor
嵩 伊藤
倉 康人
高橋 毅
幹生 猪股
本田 一樹
Original Assignee
オリンパス株式会社 (Olympus Corporation)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by オリンパス株式会社 (Olympus Corporation)
Publication of WO2015146836A1

Classifications

    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/0006 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 with means to keep optical surfaces clean, e.g. by preventing or removing dirt, stains, contamination, condensation
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/00163 Optical arrangements
    • A61B1/00174 Optical arrangements characterised by the viewing angles
    • A61B1/00181 Optical arrangements characterised by the viewing angles for multiple fixed viewing angles
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/04 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B23/00 Telescopes, e.g. binoculars; Periscopes; Instruments for viewing the inside of hollow bodies; Viewfinders; Optical aiming or sighting devices
    • G02B23/24 Instruments or systems for viewing the inside of hollow bodies, e.g. fibrescopes

Definitions

  • The present invention relates to an endoscope system, and more particularly to an endoscope system capable of simultaneously observing a direct viewing direction and a side viewing direction.
  • Endoscope systems that include an endoscope for imaging a subject inside a lumen such as a body cavity, and an image processing device that generates an observation image of the subject captured by the endoscope, are widely used in the medical and industrial fields.
  • For example, an endoscope system is known that includes an endoscope in which a protrusion protruding from the distal end surface of the insertion portion is provided, a front observation lens for observing a subject located in front of the distal end surface is provided at the protrusion, and a surrounding observation lens for observing a subject located around the protrusion is provided on the periphery of the protrusion.
  • In such an endoscope, the subject observed through the front observation lens is imaged in a circular area at the center of the imaging element, and the subject observed through the surrounding observation lens is imaged in an annular area on the outer periphery of that circular area of the same imaging element.
  • On the monitor, the front image is displayed in the center as a circular direct-view visual field image, and the image formed on its outer periphery is displayed as an annular side-view visual field image.
  • Japanese Patent Laid-Open No. 2012-245157 discloses an endoscope apparatus that performs scaling processing on the display areas of the direct-view visual field image and the side-view visual field image in order to display a region of interest at an appropriate size.
  • In such an endoscope system, the amount of information in the endoscopic image is larger than in a normal endoscope system that mainly displays only the direct-view visual field image.
  • However, the amount of information required in an endoscopic image differs depending on the use situation, such as insertion of the insertion portion of the endoscope, normal observation, detailed observation of a lesioned portion, or treatment using a treatment instrument; conventionally, no attempt has been made to control this amount of information.
  • Accordingly, an object of the present invention is to provide an endoscope system capable of displaying an observation image controlled to the optimum amount of information required according to the use state of the endoscope.
  • An endoscope system according to one aspect of the present invention includes: an insertion portion that is inserted into a subject; a first subject image acquisition unit, provided in the insertion portion, that acquires a first subject image as a main image from a first region of the subject; a second subject image acquisition unit, provided in the insertion portion, that acquires a second subject image as a sub image from a second region of the subject different from the first region; and an image processing unit that changes the amount of information of the second subject image while the first subject image is displayed on a display unit capable of displaying the first subject image and the second subject image.
  • FIG. 1 is a diagram illustrating the configuration of the endoscope system according to the first embodiment.
  • FIG. 2 is a perspective view illustrating the configuration of the distal end portion of the insertion portion of the endoscope.
  • FIG. 3 is a front view showing the configuration of the distal end portion of the insertion portion of the endoscope.
  • FIG. 4 is a diagram showing the configuration of the main part in the first embodiment.
  • FIGS. 5A to 5C are diagrams showing examples of the observation image displayed on the monitor as a result of image processing by the image processing unit.
  • As shown in FIG. 1, an endoscope system 1 includes: an endoscope 2 that images an observation target and outputs an imaging signal; a light source device 31 that supplies illumination light for illuminating the observation target; a video processor 32 having the function of an image signal generation unit that generates and outputs a video signal (image signal) corresponding to the imaging signal; and a monitor 35 that displays an observation image corresponding to the video signal (image signal).
  • The endoscope 2 includes an operation unit 3 that the operator holds and operates, an elongated insertion portion 4 that is formed on the distal end side of the operation unit 3 and is inserted into a body cavity or the like, and a universal cord 5, one end of which is provided so as to extend from a side portion of the operation unit 3.
  • the endoscope 2 is a wide-angle endoscope capable of observing a field of view of 180 degrees or more by displaying a plurality of field images.
  • This makes it possible to prevent oversight of a lesion in a location that is difficult to see by observation in the direct viewing direction alone, such as the back of a fold or the boundary of an organ.
  • During insertion, operations such as temporary fixing by twisting, reciprocating movement, and hooking of the intestinal wall occur in the insertion portion 4, as with a normal large intestine endoscope.
  • The insertion portion 4 includes a rigid distal end portion 6 provided at the most distal end, a bendable bending portion 7 provided at the rear end of the distal end portion 6, and a long flexible tube portion 8 provided at the rear end of the bending portion 7. The bending portion 7 performs a bending operation according to the operation of a bending operation lever 9 provided on the operation unit 3.
  • A columnar cylindrical portion 10 is formed at the distal end portion 6 of the insertion portion 4 so as to protrude from a position eccentric toward the upper side from the center of the distal end surface of the distal end portion 6.
  • An objective optical system (not shown) serving both direct view and side view is provided at the tip of the cylindrical portion 10. The distal end portion of the cylindrical portion 10 includes a direct-view observation window 12 disposed at a position corresponding to the direct viewing direction of the objective optical system, and a side-view observation window 13 disposed at a position corresponding to the side viewing direction of the objective optical system. Furthermore, a side-view illumination unit 14 that emits light for illuminating the side viewing direction is formed near the proximal end of the cylindrical portion 10.
  • The side-view observation window 13 is provided with a side-view mirror lens 15 that makes it possible to acquire a side-view visual field image by capturing, within the side-view visual field, the return light (reflected light) from the observation target that enters the cylindrical portion 10 from the circumferential direction.
  • The imaging element 40 (its imaging surface), shown in FIG. 4, is arranged so that, at the imaging position of the objective optical system (not shown), the image of the observation target within the field of view of the direct-view observation window 12 is formed at the center as a circular direct-view visual field image, and the image of the observation target within the field of view of the side-view observation window 13 is formed on the outer periphery of the direct-view visual field image as an annular side-view visual field image.
  • Such an image is realized by using a two-reflection optical system that reflects the return light twice with the side-view mirror lens 15. Alternatively, the image may be formed by reflecting the return light once with a one-reflection optical system, and the video processor 32 may then process the image so that the orientations of the side-view visual field image and the direct-view visual field image match.
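To make the layout described above concrete, the following sketch separates a single sensor frame into the circular direct-view region and the annular side-view region using radius masks. This is an illustration only, not part of the patent; the function name and the pixel radii are hypothetical.

```python
import numpy as np

def split_fields(frame, r_direct, r_outer):
    """Split a sensor frame into a circular direct-view region (center)
    and an annular side-view region (outer periphery).
    r_direct and r_outer are hypothetical radii in pixels."""
    h, w = frame.shape[:2]
    cy, cx = h / 2.0, w / 2.0
    yy, xx = np.mgrid[0:h, 0:w]
    r = np.hypot(yy - cy, xx - cx)               # pixel distance from center
    direct_mask = r <= r_direct                  # circular direct-view visual field image
    side_mask = (r > r_direct) & (r <= r_outer)  # annular side-view visual field image
    return direct_mask, side_mask

# Tiny 8x8 frame: the center pixel falls in the direct-view circle,
# while a pixel at radius 4 falls in the side-view annulus.
frame = np.zeros((8, 8))
direct_mask, side_mask = split_fields(frame, r_direct=2, r_outer=4)
```

The two masks are disjoint by construction, mirroring the fact that the direct-view and side-view images occupy separate areas of the same imaging element.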
  • On the distal end surface of the distal end portion 6, at positions adjacent to the cylindrical portion 10, there are provided a direct-view illumination window 16 that emits illumination light in the range of the direct-view field of the direct-view observation window 12, and a distal end opening 17 that communicates with a treatment instrument channel (not shown), formed of a tube or the like disposed in the insertion portion 4, and through which a treatment instrument (its distal end portion) inserted into the channel can be projected.
  • The distal end portion 6 of the insertion portion 4 has a support portion 18 provided so as to protrude from the distal end surface of the distal end portion 6 and positioned adjacent to the lower side of the cylindrical portion 10.
  • the support portion 18 is configured to be able to support (or hold) each protruding member disposed so as to protrude from the distal end surface of the distal end portion 6.
  • Specifically, the support portion 18 can support (or hold), as the above-described protruding members: a direct-view observation window nozzle portion 19 that ejects gas or liquid for cleaning the direct-view observation window 12; a direct-view illumination window 21 that emits light for illuminating the direct viewing direction; and a side-view observation window nozzle portion 22 that ejects gas or liquid for cleaning the side-view observation window 13.
  • The support portion 18 is formed with a shielding portion 18a, an optical shielding member, so that the protruding members, which are objects different from the original observation target, do not appear in the side-view visual field and interfere with acquisition of the side-view visual field image. That is, by providing the shielding portion 18a on the support portion 18, a side-view visual field image that does not include the direct-view observation window nozzle portion 19, the direct-view illumination window 21, or the side-view observation window nozzle portion 22 can be obtained.
  • The side-view observation window nozzle portion 22 is provided at two locations on the support portion 18 and is arranged so that its tips protrude from the side surface of the support portion 18.
  • The operation unit 3 includes an air/liquid feeding operation button 24a with which an operation instruction can be given to eject gas or liquid for cleaning the direct-view observation window 12 from the direct-view observation window nozzle portion 19, and an air/liquid feeding operation button 24b with which an operation instruction can be given to eject gas or liquid for cleaning the side-view observation window 13 from the side-view observation window nozzle portion 22. Air feeding and liquid feeding can be switched by pressing the buttons 24a and 24b.
  • Here, a plurality of air/liquid feeding operation buttons are provided so as to correspond to the respective nozzle portions; however, gas or liquid may instead be ejected from both the direct-view observation window nozzle portion 19 and the side-view observation window nozzle portion 22 by operating a single air/liquid feeding operation button.
  • A plurality of scope switches 25 are provided at the top of the operation unit 3, and a function can be assigned to each switch so that it outputs a signal corresponding to one of the various on/off operations usable in the endoscope 2. Specifically, functions such as starting and stopping forward water feeding, executing and releasing a freeze, and notifying the use state of the treatment instrument can be assigned to the scope switches 25.
  • At least one of the functions of the air / liquid feeding operation buttons 24a and 24b may be assigned to one of the scope switches 25.
  • The operation unit 3 is also provided with a suction operation button 26 with which a suction unit or the like (not shown) can be instructed to suck and collect mucus and the like in the body cavity through the distal end opening 17.
  • Mucus and the like in the body cavity, sucked in response to the operation of the suction unit, pass through the distal end opening 17, the treatment instrument channel (not shown) in the insertion portion 4, and the treatment instrument insertion port 27 provided near the front end of the operation unit 3, and are then collected in a suction bottle or the like of the suction unit (not shown).
  • The treatment instrument insertion port 27 communicates with the treatment instrument channel (not shown) in the insertion portion 4 and is formed as an opening into which a treatment instrument (not shown) can be inserted. That is, the surgeon can perform treatment using a treatment instrument by inserting it through the treatment instrument insertion port 27 and projecting its distal end from the distal end opening 17.
  • a connector 29 that can be connected to the light source device 31 is provided at the other end of the universal cord 5.
  • The tip of the connector 29 is provided with a base (not shown) serving as the connection end of a fluid conduit and a light guide base (not shown) serving as the supply end of illumination light. An electrical contact portion (not shown), to which one end of a connection cable 33 can be connected, is provided on the side surface of the connector 29. Furthermore, a connector for electrically connecting the endoscope 2 and the video processor 32 is provided at the other end of the connection cable 33.
  • the universal cord 5 includes a plurality of signal lines for transmitting various electrical signals and a light guide for transmitting illumination light supplied from the light source device 31 in a bundled state.
  • The light guide built in from the insertion portion 4 to the universal cord 5 has its light-emission-side end branched in at least two directions near the insertion portion 4, with one emission end face disposed at the direct-view illumination windows 16 and 21 and the other emission end face disposed at the side-view illumination unit 14.
  • The light guide also has its light-incident-side end disposed at the light guide base of the connector 29.
  • Light emitting elements such as light emitting diodes (LEDs) positioned at the direct-view illumination windows 16 and 21 and the side-view illumination unit 14 may be used in place of the light guide.
  • The video processor 32 outputs a drive signal for driving the imaging element provided at the distal end portion 6 of the endoscope 2. Then, as will be described later, the video processor 32 generates an image signal based on the imaging signal output from the imaging element in accordance with the use state of the endoscope 2, performs predetermined signal processing on the image signal to generate a video signal, and outputs it to the monitor 35.
  • Peripheral devices such as the light source device 31, the video processor 32, and the monitor 35 are arranged on a gantry 36 together with a keyboard 34 for inputting patient information.
  • the endoscope 2 includes at least an image sensor 40, acceleration sensors 41a and 41b, and acceleration detectors 42a and 42b.
  • the video processor 32 includes at least an image processing unit 32a, a usage state detection unit 32b, an image output unit 32c, and an image signal generation unit 32d.
  • The direct-view observation window 12, which constitutes the first subject image acquisition unit, acquires a first subject image as the main image from the direct viewing direction (first direction) including the front substantially parallel to the longitudinal direction of the insertion portion 4, that is, from the first region of the subject. The side-view observation window 13, which constitutes the second subject image acquisition unit, acquires a second subject image as the sub image from the side viewing direction (second direction) including directions intersecting the longitudinal direction of the insertion portion, at least partially different from the direct viewing direction, that is, from the second region of the subject.
  • Here, as will be described later, the main subject image is a subject image that is always displayed, and the sub subject image is a subject image whose display mode is changed as necessary.
  • the imaging element 40 is provided at the distal end portion 6 of the insertion portion 4 and photoelectrically converts the subject image in the direct viewing direction and the subject image in the side viewing direction on the same plane.
  • the imaging element 40 is electrically connected to the image signal generation unit 32d, and outputs the subject image acquired in the direct-view observation window 12 and the side-view observation window 13 to the image signal generation unit 32d.
  • the image signal generation unit 32d generates an image signal from the imaging signal output from the imaging element 40, and outputs the image signal to the image processing unit 32a.
  • The image processing unit 32a outputs a drive signal for driving the imaging element 40 provided at the distal end portion 6 of the endoscope 2.
  • The image processing unit 32a of the video processor 32 generates a video signal by performing predetermined signal processing on the image signal generated by the image signal generation unit 32d from the imaging signal output by the imaging element 40. More specifically, the image processing unit 32a generates an observation image in which a circular direct-view visual field image and an annular side-view visual field image on its outer periphery are arranged, adjacent to each other, so that the side-view visual field image surrounds the direct-view visual field image.
  • the boundary region between the direct-view visual field image and the side-view visual field image may or may not overlap.
  • The direct-view observation window 12 and the side-view observation window 13 may acquire subject images that partially overlap each other, and in that case the image processing unit 32a may perform processing to remove part of the overlapping region.
  • The endoscope system 1 includes an insertion portion use state detection unit that detects the use state of the insertion portion and outputs a use state detection result signal based on the detection result. In the present embodiment, an acceleration value is detected based on the signal from an acceleration sensor, and the use state detection unit detects the use state of the insertion portion 4 based on that acceleration value.
  • the acceleration sensor 41a is provided at the distal end portion 6 of the insertion portion 4, detects the acceleration in the advancing / retreating direction of the insertion portion 4, and outputs the detection result to the acceleration detector 42a.
  • the acceleration sensor 41b is provided at the distal end portion 6 of the insertion portion 4, detects the acceleration in the rotational direction of the insertion portion 4, and outputs the detection result to the acceleration detector 42b.
  • the acceleration detector 42a detects the value of the acceleration sensor 41a in the forward / backward direction and outputs it to the use state detection unit 32b. Moreover, the acceleration detector 42b detects the value of the acceleration sensor 41b in the rotation direction and outputs it to the use state detection unit 32b.
  • the acceleration detectors 42a and 42b may be provided in the video processor 32.
  • Based on the value of the acceleration sensor 41a in the forward/backward direction from the acceleration detector 42a and the value of the acceleration sensor 41b in the rotational direction from the acceleration detector 42b, the use state detection unit 32b, serving as the insertion portion use state detection unit, detects the use state of the endoscope 2 and outputs the detection result (use state detection result signal) to the image processing unit 32a.
  • When the value of the acceleration sensor 41a in the forward/backward direction or the value of the acceleration sensor 41b in the rotational direction is equal to or greater than a preset value, the use state detection unit 32b detects the use state of the endoscope 2 as “inserting”; when both values are less than the preset values, it detects the use state of the endoscope 2 as “observing”.
  • When the use state detection unit 32b detects whether the use state of the endoscope 2 is “inserting” or “observing”, it may use not only the acceleration values detected by the acceleration detectors 42a and 42b; the sign of the acceleration (+ or -), that is, whether the acceleration of the insertion portion 4 is applied in the positive or negative direction with respect to the insertion direction or with respect to a predetermined rotation direction, and the frequency with which that sign changes within a predetermined fixed time, may also be added to the use state detection result signal.
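The threshold rule described above can be sketched as follows. The function name and the numeric thresholds are hypothetical; the patent only states that preset values are compared against the forward/backward and rotational accelerations.

```python
# Assumed preset values; the patent does not specify concrete numbers.
ADVANCE_THRESHOLD = 0.5   # forward/backward acceleration threshold
ROTATION_THRESHOLD = 1.0  # rotational acceleration threshold

def detect_use_state(accel_advance, accel_rotation):
    """Return "inserting" when either acceleration magnitude is at or above
    its preset value, and "observing" when both are below, following the
    rule attributed to the use state detection unit 32b."""
    if abs(accel_advance) >= ADVANCE_THRESHOLD or abs(accel_rotation) >= ROTATION_THRESHOLD:
        return "inserting"
    return "observing"

# During insertion, twisting and advancing produce large accelerations;
# during screening, both values stay small.
state_insert = detect_use_state(0.8, 0.2)   # vigorous advance
state_observe = detect_use_state(0.1, 0.3)  # slow withdrawal
```

The sign and sign-change frequency mentioned in the text could be added as extra inputs, but the core mode decision is this simple comparison.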
  • As shown in FIG. 5A, the image processing unit 32a acquires the circular direct-view visual field image 50a obtained through the direct-view observation window 12 and, on the outer periphery of the direct-view visual field image 50a, the annular side-view visual field image 50b obtained through the side-view observation window 13.
  • the side-view visual field image 50b includes a shielding region 50c that is optically shielded by the shielding portion 18a of the support portion 18.
  • The image processing unit 32a generates an image signal obtained by cutting out predetermined regions of the acquired direct-view visual field image 50a and side-view visual field image 50b according to the use state of the endoscope 2 detected by the use state detection unit 32b. That is, the image processing unit 32a generates the image signal by switching among a plurality of modes having different display regions of the side-view visual field image 50b, based on the detection result (use state detection result signal) from the use state detection unit 32b.
  • The image output unit 32c generates a signal to be displayed on the monitor 35 from the image signal generated by the image processing unit 32a and outputs the signal to the monitor 35.
  • When the use state is “inserting”, the image processing unit 32a displays on the monitor 35 an observation image 51a cut out in a range inscribed in the direct-view visual field image 50a, as shown in FIG. 5B. When the use state is “observing”, the image processing unit 32a displays on the monitor 35 an observation image 51b cut out in a range in which the display areas of the direct-view visual field image 50a and the side-view visual field image 50b are at their maximum, as shown in FIG. 5C.
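The two cut-out modes can be illustrated with a small sketch. The geometry (image center, radii) and the function name are hypothetical; the point is that one mode keeps only a region inscribed in the direct-view circle, while the other keeps the largest region containing both visual field images.

```python
import numpy as np

def cut_out(frame, cx, cy, r_direct, r_outer, state):
    """Cut out the display region for the detected use state.
    "inserting": square inscribed in the circular direct-view image (51a).
    "observing": bounding square of the annular side-view image (51b)."""
    if state == "inserting":
        half = int(r_direct / np.sqrt(2))  # half-width of the inscribed square
    else:
        half = int(r_outer)                # covers direct-view and side-view images
    return frame[cy - half:cy + half, cx - half:cx + half]

# 100x100 frame, direct-view radius 20 px, side-view outer radius 40 px (assumed)
frame = np.arange(100 * 100).reshape(100, 100)
image_51a = cut_out(frame, 50, 50, 20, 40, "inserting")
image_51b = cut_out(frame, 50, 50, 20, 40, "observing")
```

The "observing" crop is strictly larger, which is the sense in which the amount of information of the side-view visual field image is changed between the two modes.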
  • FIG. 6 is a diagram for explaining the relationship between each acceleration sensor and the observation image to be displayed.
  • While the insertion portion 4 is being inserted into a lumen or the like, the surgeon performs twisting operations to search for the lumen and frequently repeats advancing and retreating operations to insert into the deep part while shortening the intestinal tract. That is, while the insertion portion 4 is being inserted, either the rotational or the forward/backward acceleration of the insertion portion 4 is always large. Therefore, when the rotational acceleration or the forward/backward acceleration of the insertion portion 4 is equal to or greater than the preset value, the use state detection unit 32b determines that the insertion portion 4 is moving vigorously and detects that the use state of the endoscope 2 is “inserting”.
  • On the other hand, in the use state in which the entire lumen is observed (screening), the surgeon slowly withdraws the insertion portion 4 to prevent oversights. That is, the rotational and forward/backward acceleration values of the insertion portion 4 are always small. Therefore, when the rotational acceleration and the forward/backward acceleration of the insertion portion 4 are less than the preset values, the use state detection unit 32b determines that the insertion portion 4 is hardly moving, or is moving at a constant rate, and detects that the use state of the endoscope 2 is “observing”.
  • The use state detection unit 32b thus detects whether the use state of the endoscope 2 is “inserting” or “observing” based on the detection results of the acceleration detectors 42a and 42b, and outputs the detection result to the image processing unit 32a.
  • When the use state is “inserting”, the image processing unit 32a causes the monitor 35 to display the observation image 51a shown in FIG. 5B, in which a predetermined region of the direct-view visual field image 50a is cut out. When the use state is “observing”, the image processing unit 32a causes the monitor 35 to display the observation image 51b shown in FIG. 5C, in which predetermined regions of the direct-view visual field image 50a and the side-view visual field image 50b are cut out.
  • In other words, the image processing unit 32a changes the amount of information of the side-view visual field image 50b according to the use state of the endoscope (inserting or observing), while always displaying at least the direct-view visual field image 50a on the monitor 35.
  • In this way, the endoscope system 1 displays on the monitor 35 the observation image 51a, showing only a predetermined region of the direct-view visual field image 50a, while the operator is inserting the insertion portion 4 into the lumen or the like.
  • As a result, the surgeon can insert the insertion portion 4 with the same feeling as with a conventional endoscope.
  • Further, the endoscope system 1 displays on the monitor 35 the observation image 51b, showing predetermined regions of both the direct-view visual field image 50a and the side-view visual field image 50b, while the operator observes the lumen or the like.
  • As a result, the surgeon can make full use of the performance of the wide-angle endoscope 2 during observation.
  • Thus, according to the endoscope system of the present embodiment, it is possible to display an observation image controlled to the optimum amount of information required according to the use state of the endoscope (inserting or observing).
  • FIG. 7 is a perspective view illustrating the configuration of the distal end portion of the insertion portion of the endoscope according to the first modification of the first embodiment
  • FIG. 8 is a diagram illustrating the configuration of the main part in the first modification of the first embodiment.
  • FIGS. 9A to 9C are diagrams illustrating an example of an observation image displayed on a monitor by image processing by an image processing unit.
  • a direct-view observation window 60a for observing the direct-view direction (first direction), that is, the first region of the subject, is disposed on the distal end surface of the distal end portion 6a of the endoscope 2.
  • On the side surface of the distal end portion 6a of the endoscope 2, side-view observation windows 60b and 60c are arranged for observing the side viewing direction (second direction) including directions intersecting the longitudinal direction of the insertion portion, at least partially different from the direct viewing direction (first direction), that is, the second region of the subject.
  • the side-view observation windows 60b and 60c are arranged at equal intervals in the circumferential direction of the tip end portion 6a, for example, at intervals of 180 degrees.
  • As in the first embodiment, the main subject image is a subject image that is always displayed, and the sub subject image is a subject image whose display mode is changed as necessary.
  • The number of side-view observation windows arranged at equal intervals in the circumferential direction of the distal end portion 6a is not limited to two; for example, a single side-view observation window may be arranged. Alternatively, side-view observation windows may be arranged every 120 degrees in the circumferential direction (that is, three side-view visual field images are acquired) or every 90 degrees (that is, four side-view visual field images are acquired).
  • a direct-view illumination window 61a that emits illumination light in the range of the direct-view field of view of the direct-view observation window 60a is disposed on the distal end surface of the distal end portion 6a of the endoscope 2 at a position adjacent to the direct-view observation window 60a. Further, on the side surface of the distal end portion 6a of the endoscope 2, a side-view illumination window 61b that emits illumination light in a range of the side view field of the side-view observation window 60b is disposed at a position adjacent to the side-view observation window 60b.
  • a side-view illumination window 61c that emits illumination light in the range of the side-view visual field of the side-view observation window 60c is disposed at a position adjacent to the side-view observation window 60c.
  • As the light emitting portions disposed in the direct-view illumination window 61a and the side-view illumination windows 61b and 61c, a light emitting element such as a light emitting diode (LED) may be disposed, or a method of guiding illumination light from a light source with a light guide can be considered.
  • On the distal end surface of the distal end portion 6a of the endoscope 2, a distal end opening 62 through which the treatment instrument (its distal end portion) inserted through the treatment instrument channel can protrude, and a direct-view observation window nozzle portion 63 that injects gas or liquid for cleaning the direct-view observation window 60a are provided. On the side surface of the distal end portion 6a of the endoscope 2, side-view observation window nozzle portions (not shown) that inject gas or liquid for cleaning the side-view observation windows 60b and 60c are provided adjacent to the side-view observation windows 60b and 60c, respectively.
  • an imaging element 64a is arranged at the imaging position of the direct-viewing observation window 60a and an objective optical system (not shown).
  • An imaging element 64b is arranged at the imaging position of the side-view observation window 60b and an objective optical system (not shown), and an imaging element 64c is arranged at the imaging position of the side-view observation window 60c and the objective optical system (not shown).
  • The imaging elements 64a to 64c are each electrically connected to the image signal generation unit 32d. The direct-view visual field image captured by the imaging element 64a, that is, the first subject image, and the side-view visual field images captured by the imaging elements 64b and 64c, that is, the second subject images serving as sub images, are output to the image signal generation unit 32d.
  • the image signal generation unit 32d generates an image signal from the imaging signals output from the imaging elements 64a to 64c, and outputs the image signal to the image processing unit 32a.
  • The image processing unit 32a arranges the direct-view visual field image 65a captured by the imaging element 64a in the center, and generates an image signal such that the side-view visual field image 65b captured by the imaging element 64b and the side-view visual field image 65c captured by the imaging element 64c are arranged adjacent to the direct-view visual field image 65a.
  • The image processing unit 32a then causes the monitor 35 to display an observation image obtained by cutting out a predetermined region from the direct-view visual field image 65a and the side-view visual field images 65b and 65c according to the detection result of the use state detection unit 32b.
  • Specifically, when the use state of the endoscope 2 is “being inserted”, the image processing unit 32a displays on the monitor 35 an observation image 66a obtained by cutting out a predetermined region of only the direct-view visual field image 65a, as illustrated in FIG. 9B.
  • On the other hand, when the use state of the endoscope 2 is “observing”, the image processing unit 32a displays on the monitor 35 an observation image 66c obtained by cutting out the direct-view visual field image 65a and the side-view visual field images 65b and 65c in a range in which their display areas become the maximum, as shown in FIG. 9C.
  • In this manner, the image processing unit 32a changes the amount of information of the side-view visual field images 65b and 65c while displaying at least the direct-view visual field image 65a on the monitor 35, according to the use state of the endoscope (during insertion or during observation).
  • As a result, the endoscope system can display an observation image controlled to the optimum amount of necessary information according to the use state of the endoscope (during insertion or during observation).
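  The state-dependent switching described above can be sketched as follows; the image representation (lists of pixel rows) and the state labels are illustrative assumptions, not the system's actual implementation.

```python
def compose_observation(direct, side_left, side_right, state):
    """Build the observation image for a given use state.

    Images are lists of equal-length pixel rows; `state` takes the
    illustrative labels "inserting" or "observing".
    """
    if state == "inserting":
        # During insertion, only the direct-view image 65a is displayed.
        return [row[:] for row in direct]
    # During observation, the side-view images 65b and 65c are placed
    # adjacent to the direct-view image so the display area is maximal.
    return [l + d + r for l, d, r in zip(side_left, direct, side_right)]

direct = [[1] * 4 for _ in range(3)]   # 3x4 direct-view stand-in
side_l = [[2] * 4 for _ in range(3)]   # 3x4 side-view stand-ins
side_r = [[3] * 4 for _ in range(3)]
```

  During insertion the output keeps only the 3x4 direct-view image; during observation the three images are concatenated side by side into a 3x12 image.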
  • FIGS. 10A to 10C are diagrams illustrating an example of an observation image displayed on the monitor by the image processing by the image processing unit.
  • The image processing unit 32a divides each of the acquired direct-view visual field image 65a and side-view visual field images 65b and 65c into a plurality of regions (nine regions in the example of FIG. 10A), selects (or excludes) arbitrary regions from the divided regions according to the use state of the endoscope 2, and generates an observation image.
  • Specifically, when the use state of the endoscope 2 is “being inserted”, the image processing unit 32a displays on the monitor 35 an observation image 67a in which only the regions E11 to E19 of the direct-view visual field image 65a are selected, as illustrated in FIG. 10B.
  • On the other hand, when the use state of the endoscope 2 is “observing”, the image processing unit 32a displays on the monitor 35 an observation image 67b in which the regions E11 to E19 of the direct-view visual field image 65a, the regions E21, E22, E24, E25, E27, and E28 of the side-view visual field image 65b, and the regions E32, E33, E35, E36, E38, and E39 of the side-view visual field image 65c are selected, as shown in FIG. 10C.
  • In this manner, the image processing unit 32a changes the amount of information of the side-view visual field image 65b and the side-view visual field image 65c while displaying at least the direct-view visual field image 65a on the monitor 35, according to the use state of the endoscope (during insertion or during observation).
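  The region selection of FIGS. 10A to 10C can be sketched as a grid mask; the grid size and pixel representation are assumptions made for illustration.

```python
def select_regions(img, keep, rows=3, cols=3):
    """Zero out every grid region whose 1-based index is not in `keep`.

    `img` is a list of equal-length pixel rows whose height and width
    divide evenly by `rows` and `cols` (assumed for brevity).
    """
    h, w = len(img), len(img[0])
    rh, cw = h // rows, w // cols
    out = [[0] * w for _ in range(h)]
    for idx in keep:
        r, c = divmod(idx - 1, cols)
        for y in range(r * rh, (r + 1) * rh):
            out[y][c * cw:(c + 1) * cw] = img[y][c * cw:(c + 1) * cw]
    return out

side_view = [[1] * 9 for _ in range(9)]
# Keep the two columns of regions nearest the direct-view image,
# analogous to selecting E21, E22, E24, E25, E27, E28.
masked = select_regions(side_view, keep=[1, 2, 4, 5, 7, 8])
```

  Excluded regions could equally be cropped away instead of zeroed; blanking keeps the output geometry fixed, which simplifies compositing onto the monitor.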
  • As described above, in the endoscope system 1 according to the second modification of the first embodiment, as in the first embodiment, it is possible to display an observation image controlled to the optimum amount of necessary information according to the use state of the endoscope (during insertion or during observation).
  • Next, an endoscope system that switches the observation image displayed on the monitor 35 according to the use state of a treatment instrument as the use state of the endoscope will be described.
  • FIG. 11 is a diagram showing the configuration of the distal end portion of the insertion portion of the endoscope
  • FIG. 12 is a diagram showing the configuration of the main part in the second embodiment
  • FIG. 13 is an enlarged view of the operation unit.
  • FIGS. 14A to 14D are diagrams illustrating an example of an observation image displayed on the monitor by the image processing by the image processing unit.
  • The configuration of the endoscope system 1 according to the present embodiment is the same as that of the endoscope system 1 according to the first embodiment; however, as shown in FIG. 11, it is assumed that the optical design is made so that the distal end opening 17 from which the treatment instrument protrudes is imaged.
  • the endoscope 2 is configured to include at least a sensor 70, a speed conversion unit 71, an acceleration conversion unit 72, and an edge detection unit 73.
  • the speed conversion unit 71, the acceleration conversion unit 72, and the edge detection unit 73 may be provided in the video processor 32.
  • The sensor 70 is provided adjacent to the treatment instrument channel 74 communicating with the treatment instrument insertion port 27, as shown in FIG. 13.
  • the treatment instrument 75 is inserted into the treatment instrument channel 74, and the treatment instrument 75 protrudes from the distal end opening portion 17.
  • the treatment instrument 75 is provided with a plurality of markers 76 at equal intervals.
  • the sensor 70 detects the markers 76 provided at equal intervals on the treatment instrument 75 and outputs the detection result to the speed conversion unit 71.
  • the speed conversion unit 71 calculates the speed of the treatment instrument 75 from the detection interval (detection time) of the marker 76 detected by the sensor 70, and outputs the calculation result to the acceleration conversion unit 72.
  • the acceleration conversion unit 72 calculates the time change of the insertion speed, that is, the absolute value of the acceleration, from the insertion speed of the treatment instrument 75 calculated by the speed conversion unit 71, and outputs the calculation result to the use state detection unit 32b.
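  The speed and acceleration conversion described for the marker-based sensing can be sketched as below; the marker pitch and the timestamps are assumed example values.

```python
def marker_speeds(times, pitch):
    """Instrument speed (mm/s) between successive marker detections.

    `times` are detection timestamps in seconds; `pitch` is the marker
    spacing in mm (an assumed constant here).
    """
    return [pitch / (t1 - t0) for t0, t1 in zip(times, times[1:])]

def marker_accel(times, pitch):
    """Absolute value of the acceleration between speed samples."""
    v = marker_speeds(times, pitch)
    dt = [(t2 - t0) / 2 for t0, t2 in zip(times, times[2:])]
    return [abs(v1 - v0) / d for v0, v1, d in zip(v, v[1:], dt)]

# Jerky insertion: the detection intervals shrink quickly, so the
# computed acceleration is large.
fast = marker_accel([0.0, 0.20, 0.30, 0.35], pitch=10.0)
# Careful advance: nearly constant intervals, so acceleration stays small.
slow = marker_accel([0.0, 0.20, 0.41, 0.61], pitch=10.0)
```

  Comparing the computed acceleration against a preset threshold reproduces the criterion used by the use state detection unit 32b.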
  • the imaging element 40 is electrically connected to the image signal generation unit 32 d via the edge detection unit 73, and the subject image acquired through the direct viewing observation window 12 and the side viewing observation window 13 is transmitted via the edge detection unit 73. Output to the image signal generator 32d.
  • the image signal generation unit 32d generates an image signal from the imaging signal output from the imaging element 40, and outputs the image signal to the image processing unit 32a.
  • The edge detection unit 73 receives the direct-view visual field image 50a acquired through the direct-view observation window 12 and the side-view visual field image 50b acquired through the side-view observation window 13.
  • the edge detection unit 73 detects whether or not the edge of the treatment instrument 75 exists in the direct-view visual field image 50a or the side-view visual field image 50b, and outputs the detection result to the use state detection unit 32b.
  • The use state detection unit 32b detects the use state of the treatment instrument 75 according to the acceleration of the treatment instrument 75 calculated by the acceleration conversion unit 72 and the presence or absence of the edge of the treatment instrument 75 detected by the edge detection unit 73, and outputs the detection result to the image processing unit 32a.
  • Specifically, when the acceleration of the treatment instrument 75 calculated by the acceleration conversion unit 72 is equal to or greater than a preset value, the use state detection unit 32b detects the use state of the endoscope 2 as “treatment instrument being inserted”. In addition, when the acceleration of the treatment instrument 75 calculated by the acceleration conversion unit 72 is less than the preset value, or when the edge detection unit 73 detects the edge of the treatment instrument 75 only in the side-view visual field image 50b, the use state detection unit 32b detects the use state of the endoscope 2 as “before and after treatment instrument protrusion”.
  • Furthermore, when the edge detection unit 73 detects the edge of the treatment instrument 75 in the direct-view visual field image 50a, the use state detection unit 32b detects the use state of the endoscope 2 as “during treatment”.
  • Note that, in detecting whether or not the use state of the treatment instrument 75 is “treatment instrument being inserted”, the use state detection unit 32b may use not only the acceleration value calculated by the acceleration conversion unit 72 but also the frequency, over a predetermined time, of the sign (+ and -) of the acceleration, that is, whether the acceleration of the treatment instrument 75 is applied in the positive or the negative direction with respect to the insertion direction.
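  The detection rules above can be collected into a single classifier; the labels and threshold are illustrative stand-ins for the patent's state names and preset value.

```python
def detect_use_state(accel, edge_in_direct, edge_in_side, threshold):
    """Classify the treatment-instrument use state from the rules above.

    An edge in the direct-view image means treatment is under way;
    otherwise, low acceleration (or an edge only in the side-view image)
    means the instrument is about to protrude; high acceleration means
    it is still being inserted.
    """
    if edge_in_direct:
        return "during treatment"
    if accel < threshold or edge_in_side:
        return "before and after protrusion"
    return "treatment instrument being inserted"
```

  A production version would also incorporate the sign-frequency criterion mentioned above; it is omitted here to keep the sketch minimal.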
  • The image processing unit 32a causes the monitor 35 to display an observation image in which the cut-out ranges of the direct-view visual field image 50a and the side-view visual field image 50b are changed according to the use state of the endoscope 2 detected by the use state detection unit 32b.
  • Specifically, when the use state of the endoscope 2 is “during treatment”, the image processing unit 32a displays on the monitor 35 the observation image 51a cut out in a range inscribed in the direct-view visual field image 50a, as illustrated in FIG. 14B.
  • When the use state of the endoscope 2 is “treatment instrument being inserted”, the image processing unit 32a displays on the monitor 35 the observation image 51b cut out in a range in which the display areas of the direct-view visual field image 50a and the side-view visual field image 50b become the maximum, as illustrated in FIG. 14C.
  • When the use state of the endoscope 2 is “before and after treatment instrument protrusion”, the image processing unit 32a displays on the monitor 35 the observation image 51c including a part of the direct-view visual field image 50a and the portion of the side-view visual field image 50b in which the distal end opening 17 appears, as shown in FIG. 14D.
  • FIG. 15 is a diagram for explaining the relationship between the acceleration of the treatment tool and the observation image to be displayed
  • FIG. 16 is a diagram for explaining the relationship between edge detection of the treatment tool and the observation image to be displayed
  • FIG. 17 is a flowchart for explaining the usage state detection processing by the usage state detection unit.
  • First, the surgeon inserts the treatment instrument 75 in a state where the treatment target is captured in the direct-view visual field image.
  • At this time, the surgeon repeatedly operates the treatment instrument quickly until the treatment instrument 75 reaches the vicinity of the distal end portion 6 of the insertion portion 4. That is, the speed at which the treatment instrument 75 is inserted into the treatment instrument channel 74 changes greatly, and the absolute value of the acceleration calculated by the acceleration conversion unit 72 becomes equal to or greater than a preset value. Therefore, when the acceleration of the treatment instrument 75 is equal to or greater than the preset value, the use state detection unit 32b detects the use state of the treatment instrument 75 as “treatment instrument being inserted”.
  • Next, when the treatment instrument 75 is about to be protruded from the distal end opening 17 (or the distal end portion 6), that is, when the surgeon determines that the treatment instrument 75 has been inserted to the vicinity of the distal end opening 17, the surgeon operates the treatment instrument 75 carefully so that it does not suddenly protrude from the distal end opening 17. In other words, the change in the speed of the treatment instrument 75 is small, and the absolute value of the acceleration calculated by the acceleration conversion unit 72 is less than the preset value. Therefore, when the acceleration of the treatment instrument 75 is less than the preset value, the use state detection unit 32b detects the use state of the treatment instrument 75 as “before and after treatment instrument protrusion”.
  • Also while the treatment instrument 75 is protruding from the distal end opening 17, the operator operates the treatment instrument 75 carefully so as not to protrude it too far and damage the mucous membrane or the like. Accordingly, the change in the speed of the treatment instrument 75 remains small, and the absolute value of the acceleration calculated by the acceleration conversion unit 72 is less than the preset value. The use state detection unit 32b therefore detects the use state of the endoscope 2 as “before and after treatment instrument protrusion”.
  • When the treatment instrument 75 protrudes from the distal end opening 17 and treatment is performed, the distal end portion of the treatment instrument 75 is captured in the direct-view visual field image 50a, so the edge of the treatment instrument 75 is detected from the direct-view visual field image 50a by the edge detection unit 73. Therefore, when the edge of the treatment instrument 75 is detected in the direct-view visual field image 50a, the use state detection unit 32b detects the use state of the treatment instrument 75 as “during treatment”.
  • In the use state detection processing, it is first determined whether or not the edge of the treatment instrument 75 is within the direct-view visual field image 50a (step S1). If the edge is within the direct-view visual field image 50a (YES), the use state of the treatment instrument 75 is detected as “during treatment” (step S2), and the process ends. If the edge is not within the direct-view visual field image 50a (NO), it is determined whether or not the acceleration of the treatment instrument 75 is less than a preset value (step S3). If the acceleration of the treatment instrument 75 is less than the preset value (YES), the use state of the treatment instrument 75 is detected as “before and after treatment instrument protrusion” (step S4), and the process ends. On the other hand, if the acceleration of the treatment instrument 75 is not less than the preset value (that is, the acceleration of the treatment instrument 75 is equal to or greater than the preset value, NO), the use state of the treatment instrument 75 is detected as “treatment instrument being inserted” (step S5), and the process ends.
  • The use state detection unit 32b outputs the detection result (“during treatment”, “before and after treatment instrument protrusion”, or “treatment instrument being inserted”) to the image processing unit 32a.
  • When the use state of the treatment instrument 75 is “during treatment”, the image processing unit 32a displays on the monitor 35 the observation image 51a shown in FIG. 14B, which is a predetermined region of the direct-view visual field image 50a.
  • When the use state of the treatment instrument 75 is “treatment instrument being inserted”, the image processing unit 32a displays on the monitor 35 the observation image 51b shown in FIG. 14C, obtained by cutting out predetermined regions of the direct-view visual field image 50a and the side-view visual field image 50b.
  • When the use state of the treatment instrument 75 is “before and after treatment instrument protrusion”, the image processing unit 32a displays on the monitor 35 the observation image 51c shown in FIG. 14D.
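  The pairing of detected state and displayed image can be summarised as a lookup; the string descriptions merely paraphrase FIGS. 14B to 14D and are not the system's internal identifiers.

```python
OBSERVATION_VIEWS = {
    "treatment instrument being inserted":
        "51b: direct-view and side-view images at maximum area (FIG. 14C)",
    "before and after protrusion":
        "51c: view including the distal end opening 17 (FIG. 14D)",
    "during treatment":
        "51a: direct-view image only (FIG. 14B)",
}

def choose_view(state):
    """Select the observation image for the detected use state."""
    return OBSERVATION_VIEWS[state]
```

  Keeping the state-to-view mapping in one table makes it easy to add states (for example, the bending-lever states of the fourth embodiment) without touching the detection logic.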
  • Thus, while the treatment instrument 75 is being inserted, the endoscope system 1 displays the observation image 51b of FIG. 14C on the monitor 35, so that the treatment instrument 75 can be inserted with the treatment target kept captured.
  • Before and after the treatment instrument protrudes, the endoscope system 1 displays the observation image 51c of FIG. 14D on the monitor 35, so that the treatment instrument 75 can be safely protruded while the distal end opening 17 is checked.
  • In this manner, the image processing unit 32a changes the amount of information of the side-view visual field image 50b while displaying at least the direct-view visual field image 50a on the monitor 35, according to the use status of the treatment instrument 75 (during treatment instrument insertion, before and after treatment instrument protrusion, or during treatment).
  • Furthermore, the endoscope system 1 displays the observation image 51a of FIG. 14B on the monitor 35 when the surgeon is about to perform or is performing treatment, so that the surgeon can pay attention only to the region necessary for the treatment.
  • As described above, in the endoscope system 1 according to the present embodiment, an observation image controlled to the optimum amount of necessary information can be displayed according to the use status of the endoscope, here the use status of the treatment instrument (during treatment instrument insertion, before and after treatment instrument protrusion, or during treatment).
  • Modification 1 of the second embodiment will be described.
  • In Modification 1, the configuration that obtains one direct-view visual field image and a plurality of side-view visual field images by using the plurality of imaging elements 64a to 64c, described for the first embodiment, is applied to the second embodiment described above.
  • FIG. 18 is a diagram illustrating the configuration of the distal end portion of the insertion portion of the endoscope
  • FIGS. 19A to 19D are diagrams illustrating examples of observation images displayed on the monitor by image processing by the image processing unit. .
  • As shown in FIG. 18, the distal end portion 6b of the endoscope system 1 is provided with a distal end opening 62a on the inclined surface of the distal end surface of the distal end portion 6b, and it is assumed that the optical design is made so that the distal end opening 62a from which the treatment instrument 75 (its distal end) protrudes is imaged.
  • a direct-view visual field image 65a including the tip opening 62a is acquired by the direct-view observation window 60a, and side-view visual field images 65b and 65c are acquired by the side-view observation windows 60b and 60c.
  • predetermined regions of the direct-view visual field image 65 a and the side-view visual field images 65 b and 65 c are cut out and displayed on the monitor 35 according to the usage state of the treatment instrument 75.
  • Specifically, when the use state is “during treatment”, an observation image 66a shown in FIG. 19B, in which only the region of the direct-view visual field image 65a is cut out, is displayed on the monitor 35.
  • When the use state is “treatment instrument being inserted”, the observation image 66b shown in FIG. 19C, cut out in a range in which the display areas of the direct-view visual field image 65a and the side-view visual field images 65b and 65c become the maximum, is displayed on the monitor 35.
  • When the use state is “before and after treatment instrument protrusion”, an observation image 66c shown in FIG. 19D, cut out in a range in which the display area of the distal end opening 62a is substantially at the center, is displayed on the monitor 35.
  • In this manner, the image processing unit 32a changes the amount of information of the side-view visual field images 65b and 65c while displaying at least the direct-view visual field image 65a on the monitor 35, according to the use status of the treatment instrument 75 (during treatment instrument insertion, before and after treatment instrument protrusion, or during treatment).
  • As a result, in the endoscope system 1 according to Modification 1 of the second embodiment, it is possible to display an observation image controlled to the optimum amount of necessary information according to the use status of the endoscope, here the use status of the treatment instrument (during treatment instrument insertion, before and after treatment instrument protrusion, or during treatment).
  • Modification 2 of the second embodiment will be described.
  • The configuration of Modification 2 of the second embodiment is substantially the same as that of Modification 1 of the second embodiment, and only the differences will be described.
  • FIG. 20 is a diagram illustrating a configuration of the distal end portion of the insertion portion of the endoscope
  • FIGS. 21A to 21D are diagrams illustrating examples of observation images displayed on the monitor by image processing by the image processing unit. .
  • As shown in FIG. 20, the distal end portion 6c of the endoscope system 1 is provided with a distal end opening 62b on an inclined surface on the side surface of the distal end portion 6c, on the rear end side of the side-view observation window 60b, and the optical design is made so that the distal end opening 62b from which the treatment instrument 75 (its distal end) protrudes is imaged through the side-view observation window 60b.
  • In the present modification, the direct-view visual field image 65a is acquired through the direct-view observation window 60a, the side-view visual field image 65b including the distal end opening 62b is acquired through the side-view observation window 60b, and the side-view visual field image 65c is acquired through the side-view observation window 60c.
  • predetermined regions of the direct-view visual field image 65 a and the side-view visual field images 65 b and 65 c are cut out and displayed on the monitor 35 according to the usage state of the treatment instrument 75.
  • Specifically, when the use state is “during treatment”, an observation image 66a shown in FIG. 21B, in which only the region of the direct-view visual field image 65a is cut out, is displayed on the monitor 35.
  • When the use state is “treatment instrument being inserted”, the observation image 66b shown in FIG. 21C, cut out in a range in which the display areas of the direct-view visual field image 65a and the side-view visual field images 65b and 65c become the maximum, is displayed on the monitor 35.
  • When the use state is “before and after treatment instrument protrusion”, an observation image 66d shown in FIG. 21D, cut out so as to include a region in which the distal end opening 62b can be visually recognized, is displayed on the monitor 35.
  • In this manner, the image processing unit 32a changes the amount of information of the side-view visual field images 65b and 65c while displaying at least the direct-view visual field image 65a on the monitor 35, according to the use status of the treatment instrument 75 (during treatment instrument insertion, before and after treatment instrument protrusion, or during treatment).
  • As described above, in the endoscope system 1 according to Modification 2 of the second embodiment, similarly to the second embodiment, it is possible to display an observation image controlled to the optimum amount of necessary information according to the use status of the endoscope, here the use status of the treatment instrument (during treatment instrument insertion, before and after treatment instrument protrusion, or during treatment).
  • FIG. 22 is an enlarged view of the operation unit.
  • In the treatment instrument 75a of the present embodiment, the interval between the markers 76 is varied.
  • The markers 76 are provided at narrow intervals on the distal end side of the treatment instrument 75a and at wide intervals on the proximal end side of the treatment instrument 75a. More specifically, the interval between the markers 76 is widened at the positions corresponding to the state in which the treatment instrument 75a has reached the vicinity of the distal end opening 17.
  • The sensor 70 detects the markers 76 of varied interval provided on the treatment instrument 75a, and outputs the detection result to the speed conversion unit 71.
  • The acceleration conversion unit 72 then calculates a pseudo acceleration of the treatment instrument 75a based on the detection intervals (times) of the markers 76 of varied interval. That is, when the widely spaced markers 76 pass the sensor 70, the change in the detected speed becomes small, so the acceleration calculated by the acceleration conversion unit 72 decreases in a pseudo manner.
  • the use state detection unit 32b detects the use state of the treatment instrument 75a according to the pseudo acceleration calculated by the acceleration conversion unit 72. Then, the image processing unit 32a switches the observation image displayed on the monitor 35 based on the usage state of the treatment instrument 75a detected by the usage state detection unit 32b.
  • the observation image switching process is the same as that in the second embodiment.
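  The effect of the widened marker pitch can be sketched as follows: the converter assumes a constant nominal pitch, so detections from the widely spaced proximal markers yield smaller apparent speeds and accelerations. The pitches, speeds, and timestamps are assumed example values.

```python
def pseudo_accel(times, nominal_pitch):
    """Pseudo acceleration computed as if markers were equally spaced."""
    v = [nominal_pitch / (t1 - t0) for t0, t1 in zip(times, times[1:])]
    dt = [(t2 - t0) / 2 for t0, t2 in zip(times, times[2:])]
    return [abs(b - a) / d for a, b, d in zip(v, v[1:], dt)]

# The same jerky motion (alternating 50 and 200 mm/s) sampled by
# narrowly spaced markers (5 mm pitch)...
narrow = pseudo_accel([0.0, 0.1, 0.125, 0.225, 0.25], nominal_pitch=5.0)
# ...and by widely spaced markers (20 mm pitch): the longer detection
# intervals smooth the apparent speed, shrinking the pseudo acceleration.
wide = pseudo_accel([0.0, 0.4, 0.5, 0.9], nominal_pitch=5.0)
```

  Because the pseudo acceleration over the wide-pitch region stays under the preset value even for abrupt hand motion, the use state is still reported as “before and after treatment instrument protrusion”.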
  • FIG. 23 is a diagram for explaining the relationship between the acceleration of the treatment instrument and the observation image to be displayed
  • FIG. 24 is a flowchart for explaining the use state detection processing by the use state detection unit.
  • When the treatment instrument 75a is to be protruded from the distal end opening 17, an operator (or an assistant) who is not accustomed to inserting the treatment instrument 75a may continue to insert the treatment instrument 75a without reducing the insertion speed, even after determining that the treatment instrument 75a has been inserted to the vicinity of the distal end opening 17.
  • In this case, the acceleration of the treatment instrument becomes equal to or greater than the preset value, so the use state is not detected as “before and after treatment instrument protrusion”, and the observation image 51c of FIG. 14D, in which the distal end opening 17 is shown, cannot be displayed on the monitor 35.
  • In contrast, in the present embodiment, even when an operator (or an assistant) who is not accustomed to inserting the treatment instrument 75a operates the treatment instrument 75a quickly when protruding it from the distal end opening 17, the change in the pseudo speed of the treatment instrument 75a is small because the interval between the markers 76 is wide there. For this reason, the pseudo acceleration of the treatment instrument 75a calculated by the acceleration conversion unit 72 is less than the preset value, and the use state detection unit 32b can detect the use state as “before and after treatment instrument protrusion” even when such an operator is about to protrude the treatment instrument 75a from the distal end opening 17.
  • In the use state detection processing of the present embodiment, when it is determined in step S1 that the edge of the treatment instrument 75a is not in the direct-view visual field image 50a, it is determined whether or not the pseudo acceleration of the treatment instrument 75a is less than a preset value (step S11). If the pseudo acceleration of the treatment instrument 75a is less than the preset value (YES), the use state of the treatment instrument 75a is detected as “before and after treatment instrument protrusion” in step S4, and the process ends. On the other hand, if the pseudo acceleration of the treatment instrument 75a is not less than the preset value (that is, the pseudo acceleration is equal to or greater than the preset value, NO), the use state of the treatment instrument 75a is detected as “treatment instrument being inserted” in step S5, and the process ends.
  • According to the use state of the treatment instrument 75a detected by the use state detection unit 32b, the image processing unit 32a displays on the monitor 35 the observation image 51a of FIG. 14B, the observation image 51b of FIG. 14C, or the observation image 51c of FIG. 14D.
  • In this manner, the image processing unit 32a changes the amount of information of the side-view visual field image 50b while displaying at least the direct-view visual field image 50a on the monitor 35, according to the use status of the treatment instrument 75a (during treatment instrument insertion, before and after treatment instrument protrusion, or during treatment).
  • As described above, in the endoscope system 1 according to the present embodiment, even when the insertion speed of the treatment instrument is not slowed down, it is possible, as in the second embodiment, to display an observation image controlled to the optimum amount of necessary information according to the use status of the treatment instrument (during treatment instrument insertion, before and after treatment instrument protrusion, or during treatment).
  • an endoscope system that switches an observation image to be displayed on the monitor 35 in accordance with the use state of the bending operation lever as the use state of the endoscope will be described.
  • FIG. 25 is a diagram showing a configuration of a main part in the fourth embodiment
  • FIG. 26 is a diagram for explaining a relationship between a bending angle and an observation image to be cut out
  • FIGS. 27A to 27C are diagrams illustrating examples of observation images displayed on the monitor by the image processing by the image processing unit.
  • the endoscope 2 is configured to include at least a sensor 80, a rotation angle detection unit 81, a rotation acceleration conversion unit 82, and a bending angle conversion unit 83.
  • the rotation angle detection unit 81, the rotation acceleration conversion unit 82, and the bending angle conversion unit 83 may be provided in the video processor 32.
  • the sensor 80 is provided adjacent to the bending operation lever 9, detects the operation state of the bending operation lever 9, and outputs it to the rotation angle detection unit 81.
  • the rotation angle detection unit 81 detects the rotation angle of the bending operation lever 9 from the operation state of the bending operation lever 9 from the sensor 80, and outputs it to the rotation acceleration conversion unit 82 and the bending angle conversion unit 83.
  • the rotation acceleration conversion unit 82 calculates the acceleration (rotation acceleration) of the bending operation lever 9 from the change in the rotation angle of the bending operation lever 9 from the rotation angle detection unit 81, and outputs it to the use state detection unit 32b.
  • the bending angle conversion unit 83 calculates the bending angle of the bending unit 7 from the rotation angle of the bending operation lever 9 from the rotation angle detection unit 81, and outputs it to the use state detection unit 32b.
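  The conversion chain from lever-angle samples to rotational acceleration can be sketched with finite differences; the sampling period and the angle values are assumptions for illustration.

```python
def rotational_accel(angles, dt):
    """Absolute rotational acceleration (deg/s^2) of the lever.

    `angles` are rotation-angle samples taken at a fixed period `dt`
    (seconds), as might come from the sensor 80 via the rotation angle
    detection unit 81.
    """
    vel = [(a1 - a0) / dt for a0, a1 in zip(angles, angles[1:])]
    return [abs(v1 - v0) / dt for v0, v1 in zip(vel, vel[1:])]

# A quick flick of the bending operation lever 9...
quick = rotational_accel([0.0, 1.0, 8.0, 20.0], dt=0.05)
# ...versus a slow, steady rotation at a constant rate.
steady = rotational_accel([0.0, 1.0, 2.0, 3.0], dt=0.05)
```

  Comparing the peak value against a preset threshold reproduces the rotational-acceleration criterion used by the use state detection unit 32b; the bending angle itself would be obtained by scaling the rotation angle in the bending angle conversion unit 83.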
  • The use state detection unit 32b detects the use state of the endoscope 2 as “under overall observation” when the rotational acceleration of the bending operation lever 9 is equal to or greater than a preset value. On the other hand, when the rotational acceleration of the bending operation lever 9 is less than the preset value, the use state detection unit 32b detects the use state of the endoscope 2 from the bending angle of the bending unit 7. Specifically, when the bending angle (absolute value) of the bending unit 7 is larger than 0 degrees and equal to or less than 45 degrees, the use state detection unit 32b detects the use state of the endoscope 2 as “partial observation”, and when the bending angle of the bending unit 7 is 0 degrees, it detects the use state of the endoscope 2 as “in front view”.
  • the usage state detection unit 32b outputs the usage state of the endoscope 2 thus detected to the image processing unit 32a.
  • the imaging element 40 is electrically connected to the image signal generation unit 32d, and outputs the subject image acquired through the direct viewing observation window 12 and the side viewing observation window 13 to the image signal generation unit 32d.
  • the image signal generation unit 32d generates an image signal from the imaging signal output from the imaging element 40, and outputs the image signal to the image processing unit 32a.
  • the image processing unit 32a cuts out predetermined regions of the direct-view visual field image 50a and the side-view visual field image 50b based on the detection result detected by the use state detection unit 32b, and displays the cut-out observation image on the monitor 35.
  • For example, when "partial observation" is detected and the bending angle of the bending unit 7 is −45 degrees toward the L side (L side Max) as shown in FIG. 26, the image processing unit 32a displays on the monitor 35, as shown in FIG. 27A, an observation image 84a obtained by cutting out the direct-view visual field image 50a and the side-view visual field image 50b at position P1.
  • When the bending angle of the bending unit 7 is 0 degrees, the image processing unit 32a displays on the monitor 35, as shown in FIG. 27B, an observation image 84b obtained by cutting out the direct-view visual field image 50a and the side-view visual field image 50b at position P2.
  • Similarly, the image processing unit 32a displays on the monitor 35, as shown in FIG. 27C, an observation image 84c obtained by cutting out the direct-view visual field image 50a and the side-view visual field image 50b at position P3.
  • the image processing unit 32a displays the observation image 84b cut out at the position P2 on the monitor 35 as shown in FIG. 27B.
  • The cut-out position can be changed in the same way for bending in the up-down direction and for simultaneous up-down and left-right operations.
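The relationship between the bending angle and the cut-out positions P1 to P3 described above can be sketched as a linear mapping. The image width, window width, and ±45-degree range below are assumptions for illustration; only the correspondence P1/P2/P3 to −45/0/+45 degrees follows the text.

```python
# Illustrative sketch: sliding the cut-out window across the combined
# direct-view / side-view image in proportion to the bending angle.
# IMAGE_WIDTH, WINDOW_WIDTH and MAX_ANGLE are assumed values.

IMAGE_WIDTH = 1600   # assumed width of the stitched field image [px]
WINDOW_WIDTH = 800   # assumed width of the displayed observation image [px]
MAX_ANGLE = 45.0     # assumed L/R bending limit [deg]

def cutout_left_edge(bending_angle_deg):
    """Left edge of the cut-out window for a given bending angle:
    -45 deg -> left-most (P1), 0 deg -> centred (P2), +45 deg -> right-most (P3)."""
    angle = max(-MAX_ANGLE, min(MAX_ANGLE, bending_angle_deg))  # clamp to range
    travel = IMAGE_WIDTH - WINDOW_WIDTH
    return round((angle + MAX_ANGLE) / (2 * MAX_ANGLE) * travel)

# cutout_left_edge(-45) -> 0, cutout_left_edge(0) -> 400, cutout_left_edge(45) -> 800
```

The same mapping could be applied per axis for up-down bending or combined up-down-left-right operation.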
  • When observing the entire lumen, the operator positions the distal end portion 6 at the center of the lumen and slowly withdraws the insertion portion without bending the bending portion 7, so that the entire lumen can be observed efficiently and thoroughly.
  • The use state detection unit 32b detects the use state of the endoscope as "overall observation" by detecting that the rotational acceleration of the bending operation lever 9 is less than the preset value and that the bending angle of the bending unit 7, calculated from the rotation angle of the bending operation lever 9, is 0 degrees.
  • When partially observing a site of interest, the surgeon bends the bending portion 7 slightly to make the site easier to observe. The bending angle in this case is not large, generally 45 degrees or less. That is, either the rotational acceleration of the bending operation lever 9 is equal to or greater than the preset value, or the absolute value of the bending angle of the bending portion 7 is greater than 0 degrees and equal to or less than 45 degrees. The use state detection unit 32b therefore detects the use state of the endoscope as "partial observation" when it detects either of these conditions.
  • When viewing a target of interest from the front, the surgeon bends the bending portion 7 largely in order to capture the target at the center of the direct-view field. The bending angle in this case is generally greater than 45 degrees.
  • Once the distal end portion 6 is directed at the target, no further bending operation is performed from the bent state. That is, the bending angle of the bending portion 7 is greater than 45 degrees and the rotational acceleration of the bending operation lever 9 is almost zero. By detecting that the rotational acceleration of the bending operation lever 9 is less than the preset value and that the absolute value of the bending angle of the bending unit 7 is greater than 45 degrees, the use state detection unit 32b detects the use state of the endoscope as "front view".
  • FIG. 28 is a flowchart for explaining the usage state detection processing by the usage state detection unit.
  • In step S21, it is determined whether or not the rotational acceleration of the bending operation lever 9 is equal to or greater than the preset value. If it is, the determination is YES, the use state of the endoscope 2 is detected as "partial observation" (step S22), and the process ends. If it is not, the determination is NO, and the bending angle (absolute value) of the bending portion 7 is evaluated (step S23).
  • When the bending angle (absolute value) of the bending portion 7 is 0 degrees, the use state of the endoscope 2 is detected as "overall observation" (step S24) and the process ends. When it is greater than 0 degrees and equal to or less than 45 degrees, the use state is detected as "partial observation" (step S25) and the process ends. When it is greater than 45 degrees, the use state is detected as "front view" (step S26) and the process ends.
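The decision flow of FIG. 28 (steps S21 to S26) can be sketched as follows. The acceleration threshold is an assumed preset value; the 0-degree and 45-degree boundaries follow the text.

```python
# A minimal sketch of the decision flow of FIG. 28 (steps S21-S26).
# ACCEL_THRESHOLD is an assumed "preset value" for the lever acceleration.

ACCEL_THRESHOLD = 100.0  # assumed preset value [deg/s^2]

def detect_use_state(lever_acceleration, bending_angle_deg):
    """Return the use state of the endoscope as described in FIG. 28."""
    # Step S21: a fast lever operation always means partial observation.
    if abs(lever_acceleration) >= ACCEL_THRESHOLD:
        return "partial observation"        # step S22
    # Step S23: otherwise classify by the absolute bending angle.
    angle = abs(bending_angle_deg)
    if angle == 0.0:
        return "overall observation"        # step S24
    if angle <= 45.0:
        return "partial observation"        # step S25
    return "front view"                     # step S26
```

For example, a stationary lever at a 60-degree bend classifies as "front view", while a fast lever movement classifies as "partial observation" regardless of the current angle.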
  • the image processing unit 32a displays FIG. 27A, FIG. 27B, or FIG. 27C on the monitor 35 in accordance with the use state of the endoscope 2 detected in this way.
  • When observing the entire lumen, the endoscope system 1 can display the observation image 84b of FIG. 27B on the monitor 35 so that the entire lumen can be observed thoroughly. Further, when the operator wants to examine a place that is difficult to observe, such as the back of a fold, the endoscope system 1 can emphasize the region of interest by displaying the observation images 84a to 84c of FIGS. 27A to 27C on the monitor 35 according to the bending angle of the bending portion 7.
  • When viewing a target of interest such as a lesion from the front, the endoscope system 1 displays the observation image 84b of FIG. 27B on the monitor 35, so that the target can be observed from the front in the same manner as with a normal, so-called direct-view endoscope.
  • The image processing unit 32a changes the information amount of the side-view visual field image 50b while displaying at least the direct-view visual field image 50a on the monitor 35, according to the bending angle of the bending unit 7 set by the operation of the bending operation lever 9.
  • Thus, the endoscope system of the present embodiment can display an observation image controlled to the optimum amount of information required for the use state of the endoscope, here the use state of the bending operation lever (overall observation, partial observation, or front view).
  • Modification: Next, a modification of the fourth embodiment will be described. This modification applies the configuration of Modification 1 of the first embodiment, in which one direct-view visual field image and a plurality of side-view visual field images are acquired by the plurality of imaging elements 64a to 64c.
  • FIGS. 29A to 29D are diagrams illustrating an example of an observation image displayed on the monitor by the image processing by the image processing unit.
  • the usage state detection process by the usage state detection unit 32b is the same as that in the fourth embodiment described above.
  • The image processing unit 32a changes the cut-out position of the direct-view visual field image 65a and the side-view visual field images 65b and 65c, acquired through the direct-view observation window 60a and the side-view observation windows 60b and 60c, according to the use state, and displays the result on the monitor 35.
  • The image processing unit 32a displays on the monitor 35, as shown in FIG. 29B, an observation image 68a obtained by cutting out a part of the direct-view visual field image 65a at position P4.
  • The image processing unit 32a displays on the monitor 35, as shown in FIG. 29C, an observation image 68b obtained by cutting out a part of the direct-view visual field image 65a and the side-view visual field images 65b and 65c at position P5.
  • The image processing unit 32a displays on the monitor 35, as shown in FIG. 29D, an observation image 68c obtained by cutting out a part of the side-view visual field image 65b and the direct-view visual field image 65a at position P6.
  • The image processing unit 32a causes the monitor 35 to display the observation image 68b shown in FIG. 29C.
  • The image processing unit 32a changes the information amount of the side-view visual field images 65b and 65c while displaying at least the direct-view visual field image 65a on the monitor 35, according to the bending angle of the bending unit 7 set by the operation of the bending operation lever 9.
  • Thus, even the endoscope system 1 of this modification, in which the endoscope 2 includes the plurality of imaging elements 64a to 64c, can display an observation image controlled to the optimum amount of information required for the use state of the endoscope, here the use state of the bending operation lever (overall observation, partial observation, or front view).
  • Next, an endoscope system will be described in which the observation image displayed on the monitor 35 is switched according to the state of the suction operation, as the use state of the endoscope.
  • FIG. 30 is a diagram showing a configuration of a main part in the fifth embodiment.
  • the endoscope 2 includes at least a sensor 90 and a suction operation detection unit 91.
  • the suction operation detection unit 91 may be provided in the video processor 32.
  • the sensor 90 detects the operation state of the suction operation button 26 and outputs the detection result to the suction operation detection unit 91.
  • the suction operation detection unit 91 detects whether or not the suction operation button 26 is operated based on the detection result of the sensor 90, and outputs the detection result to the use state detection unit 32b.
  • The suction operation detection unit 91 detects that the suction operation button 26 is not operated when the button is OFF, and detects that it is operated when the button is half-pressed or ON.
  • When the suction operation detection unit 91 detects that the suction operation button 26 is operated, the use state detection unit 32b detects the use state of the endoscope 2 as "suction in progress".
  • When the suction operation button 26 is not operated, the use state detection unit 32b detects the use state of the endoscope 2 from the acceleration in the advancing/retreating direction and the acceleration in the rotation direction of the insertion unit 4, obtained from the acceleration detectors 42a and 42b.
  • This use state detection by acceleration is the same as in the first embodiment: the use state detection unit 32b detects the use state of the endoscope 2 as "inserting" when the acceleration in the advancing/retreating direction or in the rotation direction of the insertion unit 4 is equal to or greater than a preset value, and as "observing" when both accelerations are less than the preset values.
  • When detecting whether the use state of the endoscope 2 is "inserting" or "observing", the use state detection unit 32b may use not only the acceleration values detected by the acceleration detectors 42a and 42b but also the sign of the acceleration (whether the acceleration of the insertion portion 4 is applied in the positive or negative direction with respect to the insertion direction, or in the positive or negative direction with respect to a predetermined rotation direction) and the frequency with which that sign changes within a predetermined fixed time, and may reflect these in the use state detection result signal.
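The supplementary feature described above, the frequency with which the acceleration sign flips inside a fixed time window, can be sketched as follows. This is a hedged illustration: a high flip count suggests the small back-and-forth motions of insertion rather than a steady observing posture, but the window length and any threshold on the count are assumptions.

```python
# Sketch: counting +/- sign changes of the insertion-direction acceleration
# within one fixed time window. The window contents are example data.

def sign_change_count(samples):
    """Number of +/- sign changes in a window of acceleration samples;
    zero samples are ignored so they do not count as a flip."""
    signs = [1 if s > 0 else -1 for s in samples if s != 0]
    return sum(1 for a, b in zip(signs, signs[1:]) if a != b)

window = [0.8, 0.5, -0.6, -0.2, 0.7, -0.4]  # accelerations in one window
flips = sign_change_count(window)           # -> 3 sign changes
```

The resulting count could then be added to the use state detection result signal alongside the raw acceleration magnitudes.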
  • the imaging element 40 is electrically connected to the image signal generation unit 32d, and outputs the subject image acquired through the direct viewing observation window 12 and the side viewing observation window 13 to the image signal generation unit 32d.
  • the image signal generation unit 32d generates an image signal from the imaging signal output from the imaging element 40, and outputs the image signal to the image processing unit 32a.
  • The image processing unit 32a causes the monitor 35 to display the observation image 51c, in which the distal end opening 17 shown in FIG. 14D can be visually recognized, when the use state detection unit 32b detects "suction in progress".
  • When the use state detection unit 32b detects "inserting", the image processing unit 32a displays on the monitor 35 the observation image 51a illustrated in FIG. 14B, in which only a predetermined region of the direct-view visual field image 50a is cut out. When it detects "observing", the image processing unit 32a displays the observation image 51b shown in FIG. 14C, cut out so that the display areas of the direct-view visual field image 50a and the side-view visual field image 50b are maximized.
  • When about to perform or while performing a suction operation, the surgeon operates the suction operation button 26 with a finger, half-pressing it or turning it ON. By detecting with the sensor 90 and the suction operation detection unit 91 that the position of the suction operation button 26 has changed, the use state detection unit 32b can therefore detect the use state of the endoscope 2 as "suction in progress".
  • FIG. 31 is a flowchart for explaining the usage state detection processing by the usage state detection unit.
  • In step S31, it is determined whether or not the suction operation button has been operated. If it has, the determination is YES, the use state of the endoscope 2 is detected as "suction in progress" (step S32), and the process ends. If it has not, the determination is NO, and it is determined whether the acceleration in the advancing/retreating direction or in the rotation direction is equal to or greater than a preset value (step S33).
  • If the acceleration in the advancing/retreating direction or in the rotation direction is equal to or greater than the preset value, the use state of the endoscope 2 is detected as "inserting" (step S34) and the process ends. Otherwise, the use state of the endoscope 2 is detected as "observing" (step S35) and the process ends.
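The decision flow of FIG. 31 (steps S31 to S35) can be sketched as follows; the acceleration threshold is an assumed preset value.

```python
# A minimal sketch of the decision flow of FIG. 31 (steps S31-S35).
# ACCEL_PRESET is an assumed preset value for insertion/rotation acceleration.

ACCEL_PRESET = 0.5  # assumed preset value

def detect_suction_use_state(suction_button_operated, advance_accel, rotation_accel):
    """Return the use state of the endoscope as described in FIG. 31."""
    # Step S31: the suction button takes priority over motion detection.
    if suction_button_operated:
        return "suction in progress"   # step S32
    # Step S33: otherwise classify by insertion/rotation acceleration.
    if abs(advance_accel) >= ACCEL_PRESET or abs(rotation_accel) >= ACCEL_PRESET:
        return "inserting"             # step S34
    return "observing"                 # step S35
```

Note that the button check comes first, so a suction operation performed while the insertion portion is still moving is classified as "suction in progress".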
  • The image processing unit 32a displays FIG. 14B, FIG. 14C, or FIG. 14D on the monitor 35 according to the use state of the endoscope 2 detected in this way.
  • By displaying the observation images 51a and 51b of FIGS. 14B and 14C on the monitor 35 according to the acceleration in the advancing/retreating and rotation directions while the operator is not performing a suction operation, the endoscope system 1 obtains the same effects as the first embodiment. Further, while the surgeon is performing a suction operation, the endoscope system 1 makes the operation easier by displaying the observation image 51c of FIG. 14D, in which the distal end opening 17 can be visually recognized.
  • the image processing unit 32a changes the information amount of the side view visual field image 50b while displaying at least the direct view visual field image 50a on the monitor 35 according to the state of the suction operation.
  • Thus, the endoscope system of the present embodiment can display an observation image controlled to the optimum amount of information required for each use state of the endoscope, here normal operation and suction.
  • Although the suction operation has been described in the present embodiment, the observation screen can be switched in the same manner for the air/water feeding operation, for example by detecting whether or not the air/water feeding operation button 24b is pressed.
  • Modification: Next, a modification of the fifth embodiment will be described. This modification applies the configuration of Modification 1 of the second embodiment, in which one direct-view visual field image and a plurality of side-view visual field images are acquired by the plurality of imaging elements 64a to 64c. That is, it is assumed that the endoscope system 1 is optically designed so that the distal end opening 62a is imaged through the direct-view observation window 60a.
  • When the suction operation button 26 is operated, the use state detection unit 32b detects the use state of the endoscope 2 as "suction in progress".
  • the image processing unit 32a causes the monitor 35 to display an observation image 66c shown in FIG.
  • When the suction operation button 26 is not operated, the use state of the endoscope 2 is detected from the acceleration in the advancing/retreating direction and the acceleration in the rotation direction of the insertion unit 4, as in the fifth embodiment.
  • When "inserting" is detected, the image processing unit 32a displays the observation image 66a on the monitor 35. When "observing" is detected, it displays the observation image 66b shown in FIG. 19C, which is cut out so that the display areas of the direct-view visual field image 65a and the side-view visual field images 65b and 65c are maximized.
  • The image processing unit 32a changes the information amount of the side-view visual field images 65b and 65c while displaying at least the direct-view visual field image 65a on the monitor 35, according to the state of the suction operation.
  • Thus, even the endoscope system 1 of this modification, in which the endoscope 2 includes the plurality of image pickup devices 64a to 64c, can display an observation image controlled to the optimum amount of information required for each use state of the endoscope, here normal operation and suction.
  • FIG. 32 is a diagram showing an example of an observation image displayed on the monitor by the image processing by the image processing unit.
  • the image processing unit 32a generates a plurality of observation images 92a to 92f having different cutout ranges in advance from the direct view visual field image 50a and the side view visual field image 50b.
  • the surgeon arbitrarily sets an observation image to be displayed on the monitor 35 from the plurality of observation images 92a to 92f according to the usage state of the endoscope 2.
  • For example, the surgeon sets the observation image 92f for when the use state of the endoscope 2 is "inserting", the observation image 92d for "observing", the observation image 92d for "before and after protrusion of the treatment tool", and the observation image 92a for "during treatment".
  • The image processing unit 32a selects the observation image set by each operator based on the use state of the endoscope 2 from the use state detection unit 32b, and displays the selected observation image on the monitor 35. For example, when the use state detection unit 32b detects "during treatment", the image processing unit 32a causes the monitor 35 to display the observation image 92a set by the surgeon.
  • the image processing unit 32a changes the information amount of the side-view visual field image 50b while displaying at least the direct-view visual field image 50a on the monitor 35 according to the use state of the endoscope.
  • Thus, according to the present embodiment, the observation image to be displayed can be set according to each operator's preference, and an observation image controlled to the optimum amount of information required by each operator can be displayed according to the use state of the endoscope.
  • Note that the image processing unit 32a may divide the direct-view visual field image 50a and the side-view visual field image 50b into a plurality of regions and generate an observation image that excludes regions inappropriate for display. For example, the image processing unit 32a may detect the luminance values of the direct-view visual field image 50a and the side-view visual field image 50b, generate an observation image excluding regions where overexposure occurs, and display it on the monitor 35. At this time, the image processing unit 32a may also generate an observation image in which only the image of a selected region is dimmed.
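The luminance-based exclusion described above can be sketched as follows. The 3×3 region grid and the threshold of 240 are assumptions for illustration; the specification does not state how the regions are divided or where the overexposure threshold lies.

```python
# Sketch: marking overexposed regions of a luminance image so they can be
# excluded from the observation image. Grid size and threshold are assumed.

import numpy as np

OVEREXPOSED = 240  # assumed 8-bit luminance threshold

def overexposed_regions(luma, rows=3, cols=3):
    """Return a (rows, cols) boolean grid; True marks a region whose mean
    luminance reaches the overexposure threshold."""
    h, w = luma.shape
    grid = np.zeros((rows, cols), dtype=bool)
    for r in range(rows):
        for c in range(cols):
            block = luma[r * h // rows:(r + 1) * h // rows,
                         c * w // cols:(c + 1) * w // cols]
            grid[r, c] = block.mean() >= OVEREXPOSED
    return grid

frame = np.full((90, 90), 120, dtype=np.uint8)
frame[:30, :30] = 255                      # top-left region blown out
mask = overexposed_regions(frame)          # only mask[0, 0] is True
```

Regions flagged in `mask` would then be dropped from, or dimmed within, the generated observation image.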
  • In the embodiments described above, the image processing unit 32a automatically switches the observation image to be displayed according to the use state of the endoscope 2 detected by the use state detection unit 32b. However, a mode in which the observation image is switched automatically and a mode in which at least part of the observation image can be switched manually by the operator may be made selectable as necessary.
  • The mechanism that realizes the function of illuminating and observing the side may be a structure separate from the mechanism that realizes the function of illuminating and observing the front, and may be detachable from the insertion portion 4.
  • FIG. 33 is a perspective view of the distal end portion 6 of the insertion portion 4 to which a side observation unit is attached.
  • the distal end portion 6 of the insertion portion 4 has a front vision unit 100.
  • the side viewing unit 110 has a structure that can be attached to and detached from the front viewing unit 100 by a clip portion 111.
  • the side visual field unit 110 has two observation windows 112 for acquiring an image in the left-right direction and two illumination windows 113 for illuminating the left-right direction.
  • By providing a use state detection unit such as an acceleration sensor on the insertion unit 4 or on the side-view unit 110 having such a configuration, an observation image controlled to the optimum amount of information required by each operator can be displayed according to the use state of the endoscope, as in the embodiments described above.
  • The display is not limited to an observation image in which the side-view visual field images are arranged on both sides of the direct-view visual field image, nor to a form in which the plurality of images are displayed on a single monitor 35.
  • FIG. 34 is a diagram for explaining an example in which a plurality of field-of-view images are displayed separately on a plurality of monitors.
  • For a plurality of visual fields such as the front and the sides, the observation images may be displayed separately on a plurality of monitors, for example a monitor 35a displaying the direct-view visual field image 65a, a monitor 35b displaying the side-view visual field image 65b, and another monitor displaying the side-view visual field image 65c, as shown in FIG. 34.
  • The steps in the flowcharts in this specification may be executed in a different order, or a plurality of steps may be executed simultaneously, as long as this does not contradict their nature.
  • In any case, the main visual field is always displayed, even if only partially, so that safety can be confirmed during use.
  • In the embodiments described above, an endoscope displaying a wide-angle visual field has been described as an example; however, the gist of the present invention may also be applied to a side-view endoscope.
  • In that case, the main subject image is the subject image of the visual field in the lateral region, and the sub subject image is, for example, the subject image of the visual field in the front region used to confirm the insertion direction when insertion is necessary.
  • When displaying an observation image controlled to the optimum amount of information according to the use state of the endoscope 2 detected by the use state detection unit 32b, the image processing unit 32a may perform mask processing that covers part of the direct-view visual field image or the side-view visual field image and display the result on the monitor 35.

Landscapes

  • Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Surgery (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Medical Informatics (AREA)
  • Radiology & Medical Imaging (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Biophysics (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Pathology (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Endoscopes (AREA)
  • Instruments For Viewing The Inside Of Hollow Bodies (AREA)

Abstract

The invention relates to an endoscope system (1) comprising: an insertion portion (4) that is inserted into a subject; a first subject-image acquisition unit (12) that is provided on the insertion portion (4) and acquires, from a first region of the subject, a first subject image serving as a main image; a second subject-image acquisition unit (13) that is provided on the insertion portion (4) and acquires, from a second region of the subject different from the first region, a second subject image serving as a secondary image; an insertion-portion use state detection unit (32b) that detects the use state of the insertion portion and outputs a use state detection result signal based on the detection result; an image signal generation unit (32d) that generates an image signal based on at least the first subject image, of the first subject image and the second subject image; and an image processing unit (32a) that displays the first subject image and the second subject image on a display unit in accordance with the use state detection result signal output by the insertion-portion use state detection unit (32b), and that changes the amount of information of the second subject image.
PCT/JP2015/058508 2014-03-27 2015-03-20 Système d'endoscope WO2015146836A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2014066590 2014-03-27
JP2014-066590 2014-03-27

Publications (1)

Publication Number Publication Date
WO2015146836A1 true WO2015146836A1 (fr) 2015-10-01

Family

ID=54195354

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2015/058508 WO2015146836A1 (fr) 2014-03-27 2015-03-20 Système d'endoscope

Country Status (1)

Country Link
WO (1) WO2015146836A1 (fr)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017104197A1 (fr) * 2015-12-17 2017-06-22 オリンパス株式会社 Appareil de traitement d'image
JP2018057605A (ja) * 2016-10-05 2018-04-12 富士フイルム株式会社 内視鏡システム及び内視鏡システムの駆動方法
WO2018105063A1 (fr) * 2016-12-07 2018-06-14 オリンパス株式会社 Dispositif de traitement d'image
JP2020037003A (ja) * 2016-09-29 2020-03-12 富士フイルム株式会社 内視鏡システム及び内視鏡システムの駆動方法
WO2021144838A1 (fr) * 2020-01-14 2021-07-22 オリンパス株式会社 Dispositif de commande d'affichage, procédé de commande d'affichage, programme de commande d'affichage et système d'endoscope

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH04341232A (ja) * 1991-03-11 1992-11-27 Olympus Optical Co Ltd 電子内視鏡システム
JPH10276974A (ja) * 1997-04-03 1998-10-20 Olympus Optical Co Ltd 内視鏡装置
JP2010117665A (ja) * 2008-11-14 2010-05-27 Olympus Medical Systems Corp 光学系
WO2011055614A1 (fr) * 2009-11-06 2011-05-12 オリンパスメディカルシステムズ株式会社 Système endoscopique


Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017104197A1 (fr) * 2015-12-17 2017-06-22 オリンパス株式会社 Appareil de traitement d'image
JPWO2017104197A1 (ja) * 2015-12-17 2017-12-14 オリンパス株式会社 ビデオプロセッサ
US10512393B2 (en) 2015-12-17 2019-12-24 Olympus Corporation Video processor
JP2020037003A (ja) * 2016-09-29 2020-03-12 富士フイルム株式会社 内視鏡システム及び内視鏡システムの駆動方法
JP2018057605A (ja) * 2016-10-05 2018-04-12 富士フイルム株式会社 内視鏡システム及び内視鏡システムの駆動方法
US10820786B2 (en) 2016-10-05 2020-11-03 Fujifilm Corporation Endoscope system and method of driving endoscope system
WO2018105063A1 (fr) * 2016-12-07 2018-06-14 オリンパス株式会社 Dispositif de traitement d'image
JPWO2018105063A1 (ja) * 2016-12-07 2019-07-04 オリンパス株式会社 画像処理装置
CN110049709A (zh) * 2016-12-07 2019-07-23 奥林巴斯株式会社 图像处理装置
US11145053B2 (en) 2016-12-07 2021-10-12 Olympus Corporation Image processing apparatus and computer-readable storage medium storing instructions for specifying lesion portion and performing differentiation classification in response to judging that differentiation classification operation is engaged based on signal from endoscope
CN110049709B (zh) * 2016-12-07 2022-01-11 奥林巴斯株式会社 图像处理装置
WO2021144838A1 (fr) * 2020-01-14 2021-07-22 オリンパス株式会社 Dispositif de commande d'affichage, procédé de commande d'affichage, programme de commande d'affichage et système d'endoscope

Similar Documents

Publication Publication Date Title
JP4856286B2 (ja) 内視鏡システム
JP4884567B2 (ja) 内視鏡システム
JP5942044B2 (ja) 内視鏡システム
JP4500096B2 (ja) 内視鏡及び内視鏡システム
JP2018519860A (ja) 動的視野内視鏡
JP5974188B2 (ja) 内視鏡システム
WO2018069992A1 (fr) Système d'insertion
WO2015146836A1 (fr) Système d'endoscope
US20160345808A1 (en) Endoscope system
JP5889495B2 (ja) 内視鏡システム
WO2015198981A1 (fr) Système d'endoscopie
JP5608580B2 (ja) 内視鏡
JP6062112B2 (ja) 内視鏡システム
US20170215710A1 (en) Endoscope system
JP4789490B2 (ja) 内視鏡装置
JP6523062B2 (ja) 内視鏡装置
WO2020004259A1 (fr) Système d'affichage d'image endoscopique et dispositif d'affichage d'image endoscopique
JP2011235114A (ja) 画像信号処理装置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15768916

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15768916

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP