WO2022209156A1 - Medical observation device, information processing device, medical observation method, and endoscopic surgery system

Medical observation device, information processing device, medical observation method, and endoscopic surgery system

Info

Publication number
WO2022209156A1
Authority
WO
WIPO (PCT)
Prior art keywords
light
change
light source
sensor
irradiation
Prior art date
Application number
PCT/JP2022/001496
Other languages
French (fr)
Japanese (ja)
Inventor
景 戸松
健治 高橋
容平 黒田
大輔 長尾
誠之 梅島
Original Assignee
Sony Group Corporation (ソニーグループ株式会社)
Priority date
Filing date
Publication date
Application filed by Sony Group Corporation
Publication of WO2022209156A1

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00: Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/06: Instruments as above with illuminating arrangements
    • G: PHYSICS
    • G02: OPTICS
    • G02B: OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B23/00: Telescopes, e.g. binoculars; Periscopes; Instruments for viewing the inside of hollow bodies; Viewfinders; Optical aiming or sighting devices
    • G02B23/24: Instruments or systems for viewing the inside of hollow bodies, e.g. fibrescopes
    • G02B23/26: Instruments or systems for viewing the inside of hollow bodies, e.g. fibrescopes, using light guides

Definitions

  • The present disclosure relates to a medical observation device, an information processing device, a medical observation method, and an endoscopic surgery system.
  • It is known to control an arm that supports an endoscope based on an image captured by the endoscope (see, for example, Patent Document 1).
  • One aspect of the present disclosure aims to provide a distance measurement technique suitable for surgery.
  • A medical observation device according to one aspect of the present disclosure includes a light source device that irradiates an object with light; a sensor that detects that the amount of change in luminance due to light incident on each of a plurality of pixels exceeds a predetermined threshold; and a control unit that changes the light irradiation profile of the light source device and measures the distance to the object based on the detection result of the sensor generated by the change in the irradiation profile.
  • An information processing device according to one aspect includes a control unit that changes the light irradiation profile of a light source device that irradiates an object with light and measures the distance to the object based on the detection result of a sensor generated by the change in the irradiation profile, the sensor detecting that the amount of change in luminance due to light incident on each of a plurality of pixels exceeds a predetermined threshold.
  • A medical observation method according to one aspect includes: changing, by an information processing device, the light irradiation profile of a light source device that irradiates an object with light; and measuring the distance to the object based on the detection result of a sensor generated by the change in the irradiation profile, the sensor detecting that the amount of change in luminance due to light incident on each of a plurality of pixels exceeds a predetermined threshold.
  • An endoscopic surgery system according to one aspect includes: an endoscope including a sensor that detects that the amount of change in luminance due to light incident on each of a plurality of pixels exceeds a predetermined threshold; an arm portion that supports the endoscope; a light source device that irradiates an object with light; and a control device that controls the endoscope, the arm portion, and the light source device. The control device changes the light irradiation profile of the light source device and measures the distance to the object based on the detection result of the sensor generated by the change in the irradiation profile.
  • FIG. 1 is a diagram showing an example of a schematic configuration of a medical observation device according to an embodiment.
  • FIG. 2 is a diagram showing an example of a schematic configuration of an endoscope.
  • FIG. 3 is a diagram showing an example of a schematic configuration of a sensor (EVS).
  • FIG. 4 is a diagram showing an example of a schematic configuration of a pixel.
  • FIG. 5 is a flowchart showing an example of processing executed in the medical observation device.
  • FIG. 6 is a diagram schematically showing an example of changes in the irradiation range.
  • FIG. 7 is a diagram schematically showing an example of the relationship between the incident angle and the irradiation range.
  • FIG. 8 is a diagram schematically showing a combination of multiple irradiations.
  • FIG. 9 is a flowchart showing an example of processing including temporary suspension of irradiation (turning off the light).
  • FIG. 10 is a diagram showing an example of a schematic configuration of a light source device.
  • FIG. 11 is a diagram showing an example of a schematic configuration of an endoscopic surgery system.
  • FIGS. 12 and 13 are flowcharts showing examples of processing executed in the endoscopic surgery system.
  • The description proceeds in the following order: 1. Embodiment of the medical observation device; 2. Examples of the irradiation profile; 3. …
  • FIG. 1 is a diagram showing a schematic configuration of a medical observation device according to an embodiment.
  • As shown in FIG. 1, the medical observation device 100 includes an endoscope 1, a light source device 4, and a CCU 5.
  • The endoscope 1 images the operative field (operative site) within the body cavity of a patient.
  • An object within the operative field is referred to and illustrated as the object 9. Examples of the object 9 include organs, surgical tools, and the like.
  • The endoscope 1 may be a forward-viewing scope or an oblique-viewing scope.
  • FIG. 1 illustrates a lens barrel 2 and a camera head 3 as components of the endoscope 1; details will be described later with reference to FIG. 2.
  • In this description, "imaging" may include "shooting", and "imaging" may be read as "shooting" where appropriate; unless otherwise specified, both are simply referred to as "imaging".
  • Likewise, "image" may include "video", and "image" may be read as "video" where appropriate; unless otherwise specified, "image" and/or "video" are simply referred to as "image".
  • The light source device 4 irradiates the object 9 with light. More specifically, the light source device 4 supplies the endoscope 1 with light for irradiating the surgical field.
  • The light source device 4 may include a light source such as an LED (Light Emitting Diode) light source or a laser light source.
  • The light source device 4 includes a light exit port 43 through which light from the light source is emitted. The light exit port 43 is connected to the endoscope 1 via a cable C14, which includes a light guide (for example, an optical waveguide such as an optical fiber) that propagates the light.
  • The light exit port 43 can also be said to be a light guide insertion port into which the light guide is inserted.
  • An example of the light that the light source device 4 irradiates onto the object 9 is illumination light for illuminating the surgical field. An example of the illumination light is white light, but the illumination light is not particularly limited to this.
  • Another example of the light with which the light source device 4 irradiates the object 9 is light in a predetermined wavelength band corresponding to special light observation.
  • Special light observation may be so-called narrow band imaging. In that case, the object 9 is irradiated with light in a band narrower than that of the irradiation light used during normal observation (for example, white light); as a result, predetermined tissues such as blood vessels on the mucosal surface are imaged with high contrast.
  • Special light observation may instead be fluorescence observation. In that case, an image is obtained from fluorescence generated by irradiating the object 9 with excitation light.
  • For example, body tissue may be irradiated with excitation light and fluorescence from the body tissue observed (autofluorescence observation), or a reagent such as indocyanine green (ICG) may be locally injected into the body tissue and the body tissue irradiated with excitation light corresponding to the fluorescence wavelength of the reagent to obtain a fluorescence image.
  • The light source device 4 may be configured to be capable of supplying narrow-band light and/or excitation light corresponding to such special light observation.
  • Another example of the light that the light source device 4 irradiates onto the object 9 is light for measuring the distance to the object 9 (light for distance measurement).
  • Any of the various lights described above, including the illumination light, may be used as the light for distance measurement. The light for distance measurement is not limited to visible light, and may be infrared light or ultraviolet light.
  • The light source device 4 may be configured to change the wavelength of the light supplied to the endoscope 1, which will be explained later with reference to FIG. 10.
  • The CCU 5 can function as a control unit of the medical observation device 100.
  • The CCU 5 is a camera control unit that controls the endoscope 1. The CCU 5 is connected to the endoscope 1 via a cable C15, which includes signal lines and the like and enables control of the endoscope 1 by the CCU 5.
  • The CCU 5 includes, for example, a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), and the like.
  • The CCU 5 performs overall control of the medical observation device 100, including the operation of a display device (corresponding to the display device 5041 in FIG. 11 described later) that displays an image of the surgical site in the patient's body cavity captured by the endoscope 1.
  • The CCU 5 can thus be said to be a control device that performs overall control of the medical observation device 100, and can also be said to be an information processing device that processes various kinds of information handled by the medical observation device 100.
  • The CCU 5 controls the irradiation of light onto the object 9 by controlling the light source device 4. The CCU 5 is connected to the light source device 4 via a cable C45, which includes signal lines and the like and enables control of the light source device 4 by the CCU 5.
  • For example, the CCU 5 transmits to the light source device 4 an irradiation profile that defines the mode of irradiation; the irradiation profile will be explained again later. The light source device 4 supplies light to the endoscope 1 as defined by the received irradiation profile.
  • FIG. 2 is a diagram showing an example of a schematic configuration of an endoscope.
  • The endoscope 1 includes a lens barrel 2 and a camera head 3.
  • The illustrated lens barrel 2 is rigid, and the endoscope 1 is configured as a so-called rigid scope. The lens barrel 2 extends with the camera head 3 as its base end.
  • An example of the size of the lens barrel 2 is several millimeters to somewhat over ten millimeters in diameter and several hundred millimeters in length. Note that the lens barrel 2 may instead be flexible, in which case the endoscope 1 is configured as a flexible endoscope.
  • The lens barrel 2 is inserted into the patient's body cavity for a predetermined length from its tip.
  • As components in the lens barrel 2, a mounting portion 21, a light guide 22, a mounting portion 23, lenses 24a to 24g, and an opening 2a are exemplified; the opening at the tip of the lens barrel 2 on the side opposite to the camera head 3 (on the side of the object 9) is referred to as the opening 2a.
  • The mounting portion 21 is a portion to which the cable C14 from the light source device 4 is attached. Light from the light source device 4 is supplied to the endoscope 1 via the cable C14 and the mounting portion 21.
  • The light guide 22 extends inside the lens barrel 2 and guides the light from the light source device 4 to the tip of the lens barrel 2 so that the light irradiates the object 9 from a predetermined position and in a predetermined direction.
  • The tip of the light guide 22 at the tip of the lens barrel 2 is referred to and illustrated as the emission end 22a. The light from the light source device 4 is emitted from the emission end 22a to irradiate the object 9. The position of the emission end 22a corresponds to the predetermined position, and the direction of light emitted from the light guide 22 corresponds to the predetermined direction.
  • The predetermined position and predetermined direction are known to the medical observation device 100, more specifically to the CCU 5, and enable the distance measurement described later. At least part of the light applied to the object 9 is reflected by the object 9, and at least part of the reflected light enters the opening 2a. The light incident on the opening 2a from the object 9 in this way may be simply referred to as "light from the object 9" below.
  • The mounting portion 23 is a portion (for example, an eyepiece) that connects the base end of the lens barrel 2 to the camera head 3.
  • The lenses 24a to 24g guide light from the object 9 to the camera head 3. The lenses 24a to 24g are provided inside the lens barrel 2 in order from the tip (the end on the opening 2a side) of the lens barrel 2 toward the base end. The lens 24a, positioned closest to the object 9, can function as an objective lens.
  • The number of lenses shown in the lens barrel 2 is merely an example; more lenses than shown, for example several tens or even 100 or more, may be provided inside the lens barrel 2. Light from the object 9 reaches the camera head 3 through the many lenses provided in such a lens barrel 2.
  • The camera head 3 detects light from the object 9.
  • As components of the camera head 3, a mounting portion 31, a lens 32a, an optical element 32b, sensors 33 to 35, and a vibration blur correction mechanism 36 are exemplified.
  • The mounting portion 31 is a portion to which the base end portion of the lens barrel 2 is attached; the mounting portion 23 of the lens barrel 2 described above is connected to it. Light from the object 9 enters the camera head 3 through the mounting portion 31.
  • The lens 32a and the optical element 32b direct light from the object 9 onto the sensors 33 to 35. The lens 32a guides the observation light to the optical element 32b, and the optical element 32b guides the light to be detected by the sensor 33 to the sensor 33, the light to be detected by the sensor 34 to the sensor 34, and the light to be detected by the sensor 35 to the sensor 35.
  • The optical element 32b includes various known optical elements such as mirrors and prisms.
  • The sensor 33 is an imaging device (first imaging sensor) that captures an image of the object 9. In this example, the sensor 33 is an imaging element for visible light observation and includes RGB pixels and the like. The optical element 32b described above guides the light from the object 9, in particular the visible light, to the sensor 33. The sensor 33 generates an electrical signal (image signal) through photoelectric conversion, and the generated image signal is transmitted to the CCU 5.
  • The sensor 34 is an imaging device (second imaging sensor) that captures an image of the object 9. In this example, the sensor 34 is an imaging element for special light observation and includes infrared light pixels and the like. The optical element 32b described above guides light from the object 9, in particular infrared light, to the sensor 34. The sensor 34 generates an electrical signal (image signal) through photoelectric conversion, and the generated image signal is transmitted to the CCU 5.
  • The camera head 3 may be provided with a drive mechanism or the like for appropriately driving the optical system such as the lens 32a and the optical element 32b, making adjustment of magnification, focal length, and the like possible. Also, for example, in order to support stereoscopic vision (3D display), a plurality of sensors 33 or a plurality of sensors 34 may be provided; in this case, a plurality of relay optical systems may be provided inside the lens barrel 2 in order to guide the observation light to each of the plurality of imaging elements.
  • The sensor 35 is a sensor (luminance change detection sensor) that detects a luminance change due to light from the object 9. Such a sensor is also called an EVS (Event Vision Sensor) or the like. The sensor 35 will be described with reference to FIGS. 3 and 4.
  • FIG. 3 is a diagram showing an example of the schematic configuration of the sensor (EVS).
  • The sensor 35 includes a pixel array section 351 including a plurality of pixels 352, a drive circuit 353, a signal processing section 354, an arbiter section 355 (arbitration section), and a column processing section 356. The drive circuit 353, signal processing section 354, arbiter section 355, and column processing section 356 are peripheral circuit sections of the pixel array section 351.
  • The sensor 35 is a sensor that detects that the amount of change in luminance due to light incident on each of the plurality of pixels 352 exceeds a predetermined threshold.
  • The plurality of pixels 352 are arranged in a matrix in this example. Each pixel 352 can generate, as a pixel signal, a voltage corresponding to a photocurrent generated by photoelectric conversion.
  • Each pixel 352 can detect the presence or absence of an event by comparing a change in photocurrent corresponding to a change in luminance of incident light with a predetermined threshold. In other words, each pixel 352 can detect an event based on a luminance change exceeding a predetermined threshold.
  • When detecting an event, each pixel 352 can output to the arbiter section 355 a request for permission to output event data representing the occurrence of the event. Each pixel 352 outputs the event data to the drive circuit 353 and the signal processing section 354 when receiving a response indicating permission from the arbiter section 355. The pixel 352 that has detected the event also outputs a pixel signal generated by photoelectric conversion to the column processing section 356.
  • The drive circuit 353 can drive each pixel 352 of the pixel array section 351. For example, the drive circuit 353 drives a pixel 352 that has detected an event and output event data, causing the pixel signal of that pixel 352 to be output to the column processing section 356.
  • The arbiter section 355 arbitrates the requests for event data output supplied from the pixels 352, responds based on the arbitration result (permission or non-permission of event data output), and can send a reset signal to the pixel 352 to reset event detection.
  • The column processing section 356 can perform, for each column of the pixel array section 351, processing for converting the analog pixel signals output from the pixels 352 of the corresponding column into digital signals. The column processing section 356 can also perform CDS (Correlated Double Sampling) processing on the digitized pixel signals.
  • The signal processing section 354 performs predetermined signal processing on the digitized pixel signals supplied from the column processing section 356 and on the event data output from the pixel array section 351, and can output the processed event data (including time stamp information and the like) and pixel signals. These event data, pixel signals, and the like are transmitted to the CCU 5.
  • A change in the photocurrent generated by a pixel 352 can be regarded as a change in the amount of light (luminance change) incident on that pixel 352; an event can therefore be said to be a luminance change of a pixel 352 exceeding a predetermined threshold. The event data representing the occurrence of an event can include at least positional information, such as coordinates, representing the position of the pixel 352 where the change in the amount of light occurred as an event. Event data including such position information is transmitted to the CCU 5.
  • FIG. 4 is a diagram showing an example of a schematic configuration of a pixel.
  • The pixel 352 includes a light receiving section 3521, a pixel signal generation section 3522, and an event detection section 3523.
  • The light receiving section 3521 can photoelectrically convert incident light to generate a photocurrent. Under the control of the drive circuit 353, the light receiving section 3521 can supply a voltage signal corresponding to the photocurrent to either the pixel signal generation section 3522 or the event detection section 3523.
  • The pixel signal generation section 3522 can generate the signal supplied from the light receiving section 3521 as a pixel signal, and can supply the generated analog pixel signal to the column processing section 356 via a vertical signal line VSL (not shown) corresponding to the column of the pixel array section 351.
  • The event detection section 3523 can detect whether an event has occurred based on whether the amount of change in photocurrent from the light receiving section 3521 has exceeded a predetermined threshold. The events can include, for example, an ON event indicating that the amount of change in photocurrent has exceeded an upper threshold, and an OFF event indicating that the amount of change has fallen below a lower threshold; the event detection section 3523 may detect only ON events.
  • When an event occurs, the event detection section 3523 can output to the arbiter section 355 a request for permission to output event data representing the occurrence of the event. When the event detection section 3523 receives a response to the request from the arbiter section 355, it can output the event data to the drive circuit 353 and the signal processing section 354, as modeled in the sketch below.
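  • The following is a minimal, illustrative model of this per-pixel event generation, not the actual circuit: it compares the change in log intensity of each pixel against ON/OFF thresholds, in the manner the event detection section 3523 is described as doing. The function name, array representation, and threshold values are assumptions for illustration.

```python
import numpy as np

def detect_events(prev_log_i, curr_log_i, on_thresh=0.2, off_thresh=0.2):
    """Emit (row, col, polarity) tuples where the per-pixel change in log
    intensity exceeds a threshold, mimicking the event detection section 3523."""
    diff = curr_log_i - prev_log_i
    on_r, on_c = np.where(diff > on_thresh)       # ON events: luminance rose
    off_r, off_c = np.where(diff < -off_thresh)   # OFF events: luminance fell
    return ([(r, c, +1) for r, c in zip(on_r, on_c)] +
            [(r, c, -1) for r, c in zip(off_r, off_c)])
```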
  • With such a configuration, the sensor 35 achieves a high dynamic range, high robustness in detecting fast-moving subjects, high temporal resolution, and the like. Accordingly, light from the object 9 can be detected with high sensitivity.
  • The vibration blur correction mechanism 36 is provided for the sensor 35 and suppresses blur that may occur in light detection by the sensor 35 due to vibration of the endoscope 1 or the like. Various known mechanisms may be adopted as the vibration blur correction mechanism 36.
  • The CCU 5 can function as a distance measurement unit that measures the distance to the object 9 using the endoscope 1 and the light source device 4 described above.
  • Specifically, the CCU 5 changes the light irradiation profile of the light source device 4 and measures the distance to the object 9 based on the detection result of the sensor 35 generated by the change in the irradiation profile.
  • A command instructing a change of the irradiation profile may be transmitted from the CCU 5 to the light source device 4, or the changed irradiation profile itself may be transmitted from the CCU 5 to the light source device 4.
  • The light source device 4 irradiates the object 9 with light as defined by the changed irradiation profile. When the irradiation changes, the light incident on the sensor 35 from the object 9 also changes, a corresponding luminance change occurs, and it is detected as an event. Event data including the position information of the pixel 352 where the event occurred is sent to the CCU 5.
  • The CCU 5 calculates the distance to the object 9 based on the resulting detection result of the sensor 35, that is, the event data transmitted from the sensor 35 to the CCU 5.
  • The calculated distance to the object 9 may be the distance (depth) from an arbitrary position known to the CCU 5 to the object 9. For example, the CCU 5 calculates the distance from the tip of the lens barrel 2 of the endoscope 1 to the object 9, the distance from the sensor 35 to the object 9, or the like.
  • As described above, the light from the light source device 4 irradiates the object 9 from a predetermined position and in a predetermined direction known to the CCU 5, and the event data transmitted from the sensor 35 of the camera head 3 of the endoscope 1 to the CCU 5 includes the position information of the pixel 352 where the event was detected, that is, information on where the light from the object 9 was received.
  • From these pieces of information, the CCU 5 calculates the distance to the object 9 using various known methods. For example, triangulation may be used, in which case the light source device 4 acts as the laser light source and the sensor 35 acts as the position sensor. A scanning rangefinder method may also be used, in which case the light source 41 may output, for example, a stripe of laser light that can be scanned. A sketch of the triangulation relation follows.
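  • As an illustration only, the basic active-triangulation relation that such a calculation can rest on is shown below; the function name and the way the angles are parameterized are assumptions, and a real implementation would also need calibration between the emission end 22a and the sensor 35.

```python
import math

def triangulate_depth(baseline_mm, emit_angle_rad, view_angle_rad):
    """Depth of the point where the emitted ray and the viewing ray meet.
    Both angles are measured from the baseline joining the (known) emission
    position and the sensor; depth = b*tan(a1)*tan(a2) / (tan(a1)+tan(a2))."""
    t1, t2 = math.tan(emit_angle_rad), math.tan(view_angle_rad)
    return baseline_mm * t1 * t2 / (t1 + t2)

# e.g. a 5 mm baseline with rays at 60 and 70 degrees from the baseline
depth_mm = triangulate_depth(5.0, math.radians(60), math.radians(70))
```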
  • In this way, the CCU 5 measures the distance to the object 9.
  • The distance measurement results can be used in various ways. For example, based on the distance measurement results, three-dimensional distance measurement data of the object 9 may be constructed, and a three-dimensional map containing (the three-dimensional distance measurement data of) the object 9 may be generated. If the object 9 is an organ, a three-dimensional map of the patient's body cavity including the position of the organ can be generated, and autonomous control (robot control) of the endoscope 1 based on the three-dimensional map also becomes possible; a back-projection sketch follows.
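  • A minimal sketch of how event pixel positions plus measured depths could be turned into such three-dimensional distance measurement data, assuming a pinhole camera model with hypothetical intrinsics (fx, fy, cx, cy); the patent does not prescribe this particular construction.

```python
import numpy as np

def events_to_points(event_pixels, depths_mm, fx, fy, cx, cy):
    """Back-project event pixel coordinates (u, v) with measured depths z
    into camera-frame 3D points (the raw material of a 3D map)."""
    pts = []
    for (u, v), z in zip(event_pixels, depths_mm):
        x = (u - cx) * z / fx   # lateral offset from the optical axis
        y = (v - cy) * z / fy
        pts.append((x, y, z))
    return np.asarray(pts)
```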
  • The medical observation device 100 may measure the distance to the object 9 at a designated timing. For example, the distance to the object 9 may be measured at a timing designated by a user operation.
  • Examples of the user operation include pressing a button (distance measurement button) for instructing distance measurement and giving a voice instruction. The user operation is not limited to these and may include various other operations that can instruct distance measurement.
  • FIG. 5 is a flowchart showing an example of processing (a medical observation method) executed in the medical observation device. Processing related to distance measurement is exemplified; explanations that overlap with the above are omitted as appropriate.
  • In step S1, it is determined whether or not to measure the distance. The CCU 5 determines that distance measurement should be performed, for example, when distance measurement is instructed by the user operation described above. If distance measurement is to be performed (step S1: Yes), the process proceeds to step S2; otherwise (step S1: No), the process of step S1 is repeated.
  • In step S2, the irradiation profile is changed. The CCU 5 changes the irradiation profile, so that the light from the object 9 also changes, causing a luminance change at the sensor 35.
  • In step S3, the luminance change is detected. The sensor 35 detects the luminance change, and event data including the position information of the pixel 352 where the luminance change was detected (where the event occurred) is sent to the CCU 5.
  • In step S4, distance measurement is performed. The CCU 5 calculates the distance to the object 9 based on the detection result of the preceding step S3, more specifically, the event data transmitted from the sensor 35 to the CCU 5. As described above, the results of distance measurement are used to construct three-dimensional distance measurement data of the object 9, to generate a three-dimensional map including the data, and the like. The whole loop is sketched below.
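  • For orientation, the S1-S4 loop can be written out as in the sketch below. The ui, light_source, evs, and estimate_depth collaborators are hypothetical stand-ins for the actual CCU 5, light source device 4, and sensor 35 interfaces, which the patent does not specify at this level of detail.

```python
def distance_measurement_loop(ui, light_source, evs, estimate_depth, on_result):
    """FIG. 5 as code: wait for a request (S1), change the irradiation
    profile (S2), gather the events the change provokes (S3), and compute
    per-event distances (S4)."""
    while True:
        if not ui.measurement_requested():                       # S1
            continue
        light_source.set_profile(light_source.next_profile())    # S2
        events = evs.collect_events(window_ms=10)                 # S3
        on_result([estimate_depth(ev) for ev in events])          # S4
```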
  • As described above, in the medical observation device 100, detection of an event by the sensor 35 is (forcibly) caused by changing the light irradiation profile of the light source device 4, and the distance to the object 9 is measured using the detection result. Advantages of this approach are as follows.
  • The lens barrel 2 of the endoscope 1 has a rather small optical system diameter and includes many lenses (lenses 24a to 24g, etc.), so the light from the object 9 is greatly attenuated within the lens barrel 2.
  • Incorporating a TOF sensor into an endoscope is conceivable, but many TOF sensors operate at infrared wavelengths, so the attenuation problem described above becomes pronounced. The TOF sensor then cannot sufficiently detect the light from the object 9, and accurate distance measurement is difficult.
  • Distance measurement by a stereo optical system (3D) is also conceivable, but it requires two photographic optical systems in parallel within the lens barrel 2, and the diameter of each photographic optical system becomes correspondingly narrower.
  • By using the high-sensitivity sensor 35 as the distance measurement sensor, as in the medical observation device 100 described above, the light from the object 9 can be detected with high sensitivity, improving the accuracy of distance measurement; sufficient accuracy can be obtained even with the endoscope 1.
  • Furthermore, the distance to the object 9 can be measured merely by changing the light irradiation profile of the light source device 4, so, for example, the SfM robot operation (mechanical motion of the sensor) disclosed in Patent Document 1 is not required. In this sense as well, the approach is suitable for distance measurement with the endoscope 1. Since the irradiation change can be performed more quickly (for example, instantaneously) than a mechanical operation, the distance can be measured in a correspondingly short time.
  • The irradiation profile defines the mode of light irradiation by the light source device 4. The irradiation profile may define any irradiation mode whose change can change the light from the object 9. Some examples of irradiation profiles are described below.
  • The irradiation profile may include an irradiation range. The CCU 5 may cause detection of an event by the sensor 35, and thus measure the distance to the object 9, by changing the irradiation range.
  • Examples of the irradiation range are the entire operative field (equivalent to the screen on which the captured image is presented), the central part of the operative field, and the part surrounding the central part. Note that irradiation of the central part or the surrounding part means that the irradiation is concentrated there; irradiation of other parts is not excluded.
  • The irradiation range may have various shapes. For example, the shape of the irradiation range may be circular, elliptical, polygonal, or the like, or may be an annular (ring-shaped) version of a circular, elliptical, or polygonal shape.
  • When the irradiation range is annular, the CCU 5 may change the size of the annular shape. For example, the CCU 5 may change the irradiation range such that the annular shape gradually becomes larger or smaller. An example including such changes will be described with reference to FIG. 6.
  • FIG. 6 is a diagram schematically showing an example of changes in the irradiation range.
  • The irradiation range R1 shown in (A) of FIG. 6 is an irradiation range in the central part of the screen and has a circular shape in this example. The irradiation ranges R2 to R4 shown in (B) to (D) of FIG. 6 are irradiation ranges in the peripheral part of the screen and have annular shapes in this example. The annular shape of the irradiation range R3 is larger than that of the irradiation range R2, and the annular shape of the irradiation range R4 is larger than that of the irradiation range R3.
  • The CCU 5 gradually changes the irradiation range so as to step through the irradiation ranges R1 to R4 (see the sketch below). In this way, the operative field can be efficiently scanned and the distance measured.
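  • A small sketch of how the R1-R4 sequence could be represented as masks over the presented screen; the resolution and radii are illustrative assumptions, not values from the patent.

```python
import numpy as np

def irradiation_masks(h, w, radii):
    """Boolean masks over an h x w screen: a central disc (R1) followed by
    annuli of growing size (R2, R3, ...), as in FIG. 6 (A)-(D)."""
    yy, xx = np.mgrid[0:h, 0:w]
    r = np.hypot(yy - h / 2, xx - w / 2)        # distance from screen center
    masks = [r <= radii[0]]                      # R1: central circular range
    for r_in, r_out in zip(radii, radii[1:]):
        masks.append((r > r_in) & (r <= r_out))  # R2...: annular ranges
    return masks

masks = irradiation_masks(480, 640, [60, 120, 180, 240])  # R1 to R4
```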
  • A change in the irradiation range, such as from the irradiation range R1 to the irradiation range R4 described above, is realized, for example, by changing the incident angle of the light entering the optical waveguide from the light source device 4. This will be described with reference to FIG. 7.
  • FIG. 7 is a diagram schematically showing an example of the relationship between the incident angle and the irradiation range.
  • Here, the optical waveguide corresponds to the cable C14 and the light guide 22 described earlier with reference to FIGS. 1 and 2. The optical waveguide may be, for example, a bundle fiber in which a plurality of index-guided multimode optical fibers having a core diameter of about 10 μm to 100 μm are bundled, or a liquid light guide.
  • Such an optical waveguide has the characteristic of radiating light rays from the exit end face while preserving the angle at which the rays entered the entrance end face. When the angle of incidence of light from the light source device 4 on the optical waveguide is reduced, the angle of the light emitted from the optical waveguide is also reduced; when the angle of incidence is increased, the angle of the emitted light also increases.
  • Although the incident angle of a light ray is preserved, its incident position is not. Therefore, light rays that enter at a certain incident angle are emitted from the exit end face as a ring-shaped bundle of rays while maintaining that angle; the geometry is sketched below.
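  • Under this angle-preserving behavior, the ring radius on the object grows with the incidence angle roughly as follows (a small geometric sketch; the working distance and angle values are illustrative assumptions).

```python
import math

def ring_radius_mm(incident_angle_deg, working_distance_mm):
    """Light coupled in at angle theta exits as a cone of half-angle theta,
    so at working distance d it illuminates a ring of radius ~ d*tan(theta)."""
    return working_distance_mm * math.tan(math.radians(incident_angle_deg))

# Stepping the incidence angle up widens the ring, as in FIG. 6 (B)-(D):
radii = [ring_radius_mm(a, 50.0) for a in (5, 10, 15, 20)]
```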
  • The light source device 4 may include various optical elements for controlling the angle of incidence of light on the optical waveguide described above. Examples of such optical elements include a collimator lens for obtaining parallel light, a mechanism for adjusting the incident angle (incident angle adjusting mechanism), and an optical system for coupling the angle-adjusted light into the optical waveguide (coupling optical system). See, for example, Patent Document 2 for a more specific configuration.
  • The irradiation profile may include a combination of irradiations of the object 9 from different positions by the light of the light source device 4. The CCU 5 may cause detection of an event by the sensor 35, and thus measure the distance to the object 9, by changing that combination.
  • FIG. 8 is a diagram schematically showing a combination of multiple irradiations. FIG. 8 schematically shows the tip of the lens barrel 2 (the opening 2a side) viewed from the front.
  • Emission ends 22a of a plurality of light guides 22 are positioned around the opening 2a. Here, the emission ends are referred to as emission ends 22a-1 to 22a-12 so that they can be distinguished from one another.
  • The light source device 4 is configured to supply light to each of the emission ends 22a-1 to 22a-12 and to control each light individually. An example of such control is control of the lighting of each light.
  • The CCU 5 may change the lighting pattern of the light emitted from the emission ends 22a-1 to 22a-12. For example, the CCU 5 may change the lighting pattern so that the emission ends 22a-1 to 22a-12 emit light (light up) in this order (sequentially), as sketched below.
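  • The sequential pattern could be driven as follows; set_lit is a hypothetical per-channel driver function, and the dwell time is an illustrative assumption.

```python
import time

def step_lighting_pattern(set_lit, n_ends=12, dwell_s=0.01):
    """Light the emission ends 22a-1 .. 22a-12 one at a time; each switch of
    the lit position changes the light from the object 9 and so provokes
    events at the sensor 35."""
    for i in range(n_ends):
        set_lit(i, True)      # turn this emission end on
        time.sleep(dwell_s)   # hold the pattern briefly
        set_lit(i, False)     # then move on to the next one
```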
  • The CCU 5 may temporarily stop the irradiation (turn off the light) when changing the light irradiation profile of the light source device 4. This makes the change in the illumination of the object 9 more conspicuous, so an event can be detected more reliably by the sensor 35.
  • FIG. 9 is a flowchart showing an example of processing including temporary suspension of irradiation (turning off the light).
  • In step S11, it is determined whether or not to change the irradiation profile. For example, the CCU 5 determines that the irradiation profile should be changed when distance measurement is instructed by a user operation. If the irradiation profile is to be changed (step S11: Yes), the process proceeds to step S12; otherwise (step S11: No), the process of step S11 is repeated.
  • In step S12, the light is turned off. The CCU 5 transmits to the light source device 4, for example, a command instructing it to stop light irradiation, and the light source device 4 stops irradiating the object 9 with light.
  • In step S13, the irradiation profile is changed. The CCU 5 changes the irradiation profile, and the light source device 4 becomes ready to irradiate the object 9 with light as defined by the changed irradiation profile.
  • In step S14, the light is turned on. The CCU 5 transmits to the light source device 4, for example, a command instructing it to start light irradiation, and the light source device 4 starts irradiating the object 9 with light as defined by the irradiation profile changed in the preceding step S13. The sequence is sketched below.
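  • The S12-S14 sequence in miniature; light_source is a hypothetical driver object whose turn_off/set_profile/turn_on methods stand in for the commands the CCU 5 sends.

```python
def change_profile_with_blanking(light_source, new_profile):
    """Turn off (S12), swap the irradiation profile while dark (S13), and
    turn back on (S14), so that the on-transition itself is one large,
    clean luminance step for the sensor 35 to detect."""
    light_source.turn_off()                 # S12: suspend irradiation
    light_source.set_profile(new_profile)   # S13: change the profile
    light_source.turn_on()                  # S14: resume irradiation
```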
  • The irradiation profile may include the wavelength of the light from the light source device 4. The CCU 5 may cause the sensor 35 to detect an event, and thus measure the distance to the object 9, by changing the wavelength of the light from the light source device 4.
  • Some objects 9 (for example, internal organs) have reflection characteristics that depend on the wavelength of the light. Changing the wavelength of the light illuminating such an object 9 therefore also changes the light from the object 9 and can cause detection of an event by the sensor 35.
  • For this purpose, the light source device 4 is configured to be able to change the wavelength of the light with which the object 9 is irradiated. An example of the configuration of such a light source device 4 will be described with reference to FIG. 10.
  • FIG. 10 is a diagram showing an example of a schematic configuration of a light source device.
  • The light source device 4 includes light sources 41a to 41f, mirrors 42a to 42f, and a lens 42g, in addition to the light exit port 43 described above.
  • The light sources 41a to 41f output lights of different wavelengths (lights of different colors). In this example, the light source 41a outputs infrared light, the light source 41b red light, the light source 41c yellow light, the light source 41d green light, the light source 41e blue light, and the light source 41f violet light. The light sources 41a to 41f are individually controllable.
  • The mirrors 42a to 42f guide the light from the light sources 41a to 41f to the lens 42g.
  • The mirror 42a reflects the light from the light source 41a toward the lens 42g.
  • The mirror 42b reflects the light from the light source 41b toward the lens 42g and transmits the light of the light source 41a reflected by the mirror 42a.
  • The mirror 42c reflects the light from the light source 41c toward the lens 42g and transmits the lights of the light sources 41a and 41b reflected by the mirrors 42a and 42b.
  • The mirror 42d reflects the light from the light source 41d toward the lens 42g and transmits the lights of the light sources 41a to 41c reflected by the mirrors 42a to 42c.
  • The mirror 42e reflects the light from the light source 41e toward the lens 42g and transmits the lights of the light sources 41a to 41d reflected by the mirrors 42a to 42d.
  • The mirror 42f reflects the light from the light source 41f toward the lens 42g and transmits the lights of the light sources 41a to 41e reflected by the mirrors 42a to 42e.
  • The mirror 42a may be a total reflection mirror, and the mirrors 42b to 42f may be dichroic mirrors.
  • The lens 42g is a condensing lens that collects the lights from the light sources 41a to 41f reflected by the mirrors 42a to 42f and guides them to the light exit port 43.
  • With this configuration, the light source device 4 can emit arbitrarily selected lights from the light sources 41a to 41f (lights of different wavelengths) through the light exit port 43 and irradiate the object 9 with them, as sketched below.
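  • A toy model of that selection logic; the enable driver, source names, and wavelength values are illustrative assumptions (the patent fixes only the colors, not exact wavelengths).

```python
# Hypothetical source table for 41a-41f; the wavelengths in nm are illustrative.
SOURCES = {"41a_ir": 850, "41b_red": 640, "41c_yellow": 580,
           "41d_green": 530, "41e_blue": 460, "41f_violet": 410}

def select_wavelengths(enable, selected):
    """Enable an arbitrary subset of the individually controllable sources;
    the dichroic mirrors 42b-42f merge whatever is lit onto the lens 42g
    and out through the light exit port 43."""
    for name in SOURCES:
        enable(name, name in selected)

# e.g. switch to red-only irradiation to force a luminance change:
select_wavelengths(lambda name, on: print(name, on), {"41b_red"})
```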
  • The CCU 5 may change the wavelength of the light from the light source device 4 according to the type of the object 9, such as the type or composition of an organ. For example, a wavelength change is selected that causes a change in the light from the object 9.
  • The type of the object 9 may be identified by image recognition or the like performed on a captured image (for example, an image based on an image signal generated by the sensor 33 or the sensor 34). Various known image recognition algorithms, image recognition models (trained models), and the like may be used for the image recognition.
  • The medical observation device 100 (the endoscope 1, the light source device 4, and the CCU 5) described above may be incorporated into and used in an endoscopic surgery system including an endoscope robot, for example. Such an embodiment is described below, with the endoscope robot described as an arm or the like that supports an endoscope.
  • FIG. 11 is a diagram showing an example of a schematic configuration of an endoscopic surgery system.
  • The illustrated endoscope 5001 (lens barrel 5003, camera head 5005), light source device 5043, and CCU 5039 have the configurations and functions of the endoscope 1 (lens barrel 2, camera head 3), light source device 4, and CCU 5 of the medical observation device 100 described so far; duplicate descriptions are omitted as appropriate.
  • FIG. 11 shows a surgeon 5067 performing surgery on a patient 5071 on a patient bed 5069 using the endoscopic surgery system 5000.
  • The endoscopic surgery system 5000 includes the endoscope 5001, other surgical tools (medical instruments) 5017, a support arm device (medical arm) 5027 that supports the endoscope 5001, and a cart 5037 loaded with various devices for endoscopic surgery. Details of the endoscopic surgery system 5000 are described below in order.
  • In endoscopic surgery, instead of cutting the abdominal wall to open the abdomen, a plurality of tubular opening instruments called trocars 5025a to 5025d are punctured into the abdominal wall. Then, the lens barrel 5003 of the endoscope 5001 and the other surgical tools 5017 are inserted into the body cavity (abdominal cavity) of the patient 5071 through the trocars 5025a to 5025d. In the example shown in FIG. 11, a pneumoperitoneum tube 5019, an energy treatment tool 5021, and forceps 5023 are inserted into the body cavity of the patient 5071 as the other surgical tools 5017.
  • The energy treatment tool 5021 is a treatment tool that performs tissue incision and ablation, blood vessel sealing, or the like using high-frequency current or ultrasonic vibration. The surgical tools 5017 shown in FIG. 11 are merely examples; the surgical tools 5017 may include various surgical tools generally used in endoscopic surgery, such as tweezers and retractors.
  • The support arm device 5027 has an arm portion 5031 extending from a base portion 5029. The arm portion 5031 is composed of joint portions 5033a, 5033b, and 5033c and links 5035a and 5035b, and is driven under the control of an arm control device 5045.
  • The arm portion 5031 supports the endoscope 5001 and controls its position and attitude. As a result, stable fixation of the position of the endoscope 5001 can be realized.
  • The display device 5041 displays an image based on an image signal subjected to image processing by the CCU 5039, under the control of the CCU 5039.
  • If the endoscope 5001 supports high-resolution imaging such as 4K (3840 horizontal pixels × 2160 vertical pixels) or 8K (7680 horizontal pixels × 4320 vertical pixels), and/or supports 3D display, a display device 5041 capable of high-resolution display and/or 3D display is used accordingly. A plurality of display devices 5041 with different resolutions and sizes may also be provided depending on the application.
  • An image of the surgical site within the body cavity of the patient 5071 captured by the endoscope 5001 is displayed on the display device 5041. The surgeon 5067 can use the energy treatment tool 5021 and the forceps 5023 to perform treatment such as excision of the affected area while viewing the image of the surgical site displayed on the display device 5041 in real time. The pneumoperitoneum tube 5019, the energy treatment tool 5021, and the forceps 5023 may be supported by the surgeon 5067, an assistant, or the like during surgery.
  • The CCU 5039 can comprehensively control the operations of the endoscope 5001 and the display device 5041. Specifically, the CCU 5039 subjects the image signal received from the camera head 5005 to various kinds of image processing, such as development processing (demosaicing), for displaying an image based on the image signal, and provides the processed image signal to the display device 5041. The CCU 5039 also transmits a control signal to the camera head 5005 to control its driving; the control signal can include information about imaging conditions such as magnification and focal length. As described above, the CCU 5039 also measures the distance to the object 9.
  • The arm control device 5045 is composed of a processor such as a CPU, for example, and operates according to a predetermined program to control the driving of the arm portion 5031 of the support arm device 5027 according to a predetermined control method.
  • The input device 5047 is an input interface for the endoscopic surgery system 5000. The surgeon 5067 can input various kinds of information and instructions to the endoscopic surgery system 5000 via the input device 5047.
  • For example, the surgeon 5067 inputs various kinds of information about the surgery, such as the patient's physical information and information about the surgical technique, via the input device 5047. The surgeon 5067 also inputs, via the input device 5047, an instruction to drive the arm portion 5031, an instruction to change the imaging conditions of the endoscope 5001 (type of irradiation light, magnification, focal length, etc.), an instruction to drive the energy treatment tool 5021, and the like.
  • The type of the input device 5047 is not limited, and the input device 5047 may be any of various known input devices. For example, a mouse, a keyboard, a touch panel, a switch, a foot switch 5057, and/or a lever can be applied as the input device 5047.
  • When a touch panel is used, it may be provided on the display surface of the display device 5041.
  • Alternatively, the input device 5047 may be a device worn on part of the body of the surgeon 5067, such as a glasses-type wearable device or an HMD (Head Mounted Display). In this case, various inputs are performed according to the gestures and line of sight of the surgeon 5067 detected by these devices.
  • The input device 5047 can also include a camera capable of detecting the movement of the surgeon 5067, and various inputs may be performed according to the gestures and line of sight of the surgeon 5067 detected from the image captured by the camera. Furthermore, the input device 5047 can include a microphone capable of picking up the voice of the surgeon 5067, and various inputs may be made by voice through the microphone.
  • Because the input device 5047 is configured to allow non-contact input of various kinds of information, a user belonging to a clean area in particular (for example, the surgeon 5067) can operate devices belonging to an unclean area without contact. In addition, since the surgeon 5067 can operate devices without taking his or her hands off the surgical tools, convenience for the surgeon 5067 is improved.
  • The treatment tool control device 5049 controls the driving of the energy treatment tool 5021 for tissue cauterization, incision, blood vessel sealing, or the like. The pneumoperitoneum device 5051 sends gas into the body cavity of the patient 5071 through the pneumoperitoneum tube 5019 in order to inflate the body cavity for the purpose of securing the field of view of the endoscope 5001 and the working space of the surgeon 5067.
  • The recorder 5053 is a device capable of recording various kinds of information about the surgery. The printer 5055 is a device capable of printing various kinds of information about the surgery in various formats such as text, images, and graphs.
  • The support arm device 5027 has the base portion 5029 as a base and the arm portion 5031 extending from the base portion 5029. In the illustrated example, the arm portion 5031 is composed of the plurality of joint portions 5033a, 5033b, and 5033c and the plurality of links 5035a and 5035b connected by the joint portion 5033b; in FIG. 11, the configuration of the arm portion 5031 is simplified for illustration.
  • In practice, the shapes, numbers, and arrangement of the joint portions 5033a to 5033c and the links 5035a and 5035b, the directions of the rotation axes of the joint portions 5033a to 5033c, and the like can be set as appropriate so that the arm portion 5031 has the desired degree of freedom. For example, the arm portion 5031 may preferably be configured to have six or more degrees of freedom. This allows the endoscope 5001 to be moved freely within the movable range of the arm portion 5031, so that the lens barrel 5003 of the endoscope 5001 can be inserted into the body cavity of the patient 5071 from a desired direction.
  • The joint portions 5033a to 5033c are provided with actuators, and the joint portions 5033a to 5033c are configured to be rotatable around predetermined rotation axes by driving the actuators. By controlling the driving of the actuators with the arm control device 5045, the rotation angles of the joint portions 5033a to 5033c are controlled and the driving of the arm portion 5031 is controlled. Control of the position and attitude of the endoscope 5001 can thereby be realized.
  • The arm control device 5045 can control the driving of the arm portion 5031 by various known control methods such as force control or position control. For example, when the surgeon 5067 makes an appropriate operation input, the arm control device 5045 may control the driving of the arm portion 5031 according to the operation input so as to control the position and attitude of the endoscope 5001.
  • The arm portion 5031 may also be operated by a so-called master-slave method. In this case, the arm portion 5031 (slave) can be remotely controlled by the surgeon 5067 via the input device 5047 (master console) installed at a location remote from the operating room or within it.
  • Conventionally, the endoscope 5001 was supported by a doctor called a scopist. Using the support arm device 5027, by contrast, makes it possible to fix the position of the endoscope 5001 more reliably without manual intervention, so that an image of the surgical site can be obtained stably and the surgery can be performed smoothly.
  • The arm control device 5045 does not necessarily have to be provided on the cart 5037, and does not necessarily have to be a single device. For example, an arm control device 5045 may be provided at each of the joint portions 5033a to 5033c of the arm portion 5031 of the support arm device 5027, and the driving of the arm portion 5031 may be controlled by the plurality of arm control devices 5045 cooperating with one another.
  • By incorporating the distance measurement technique of the medical observation device 100 described above into the endoscopic surgery system 5000 described above, distance measurement of objects 9 such as organs in the body cavity of the patient 5071 and the surgical tools 5017 becomes possible. The distance measurement results can be utilized for autonomous control (autonomous driving) of the arm portion 5031 that supports the endoscope 5001, and the like.
  • For example, a three-dimensional map may be generated in advance by CT scanning or the like and held in the endoscopic surgery system 5000. Suppose the tip of the endoscope 5001 reaches an area not covered by the previously generated three-dimensional map. In that case, an object 9 not shown on the three-dimensional map, that is, an object 9 for which distance information (such as three-dimensional distance measurement data) is not held, may appear around the distal end of the endoscope 5001.
  • The CCU 5039 may measure the distance to the object 9 when it does not hold the distance information of an object 9 existing around the distal end of the endoscope 5001. The measurement method is as described above and is not repeated here.
  • The CCU 5039 may construct three-dimensional distance measurement data of the object 9 based on the distance measurement results and generate a three-dimensional map containing (the three-dimensional distance measurement data of) the object 9. The generation of this three-dimensional map may be complementary to the previously generated three-dimensional map.
  • FIG. 12 is a flowchart showing an example of processing executed in the endoscopic surgery system.
  • In step S21, it is determined whether or not surrounding scan information is held. For example, when an object 9 existing around the distal end of the endoscope 5001 is included in the pre-generated three-dimensional map, the CCU 5039 determines that the surrounding scan information is held. If the surrounding scan information is held (step S21: Yes), the process of step S21 is repeated; otherwise (step S21: No), the process proceeds to step S22.
  • In step S22, the system transitions to the distance measurement mode. The CCU 5039 initiates distance measurement according to the techniques described so far. Specifically, the processes of steps S23 to S25 are executed; these are the same as the processes of steps S2 to S4 in FIG. 5 described above. That is, the CCU 5039 changes the light irradiation profile of the light source device 5043 and measures the distance to the object 9 based on the detection result of the sensor 35 generated by the change in the irradiation profile.
  • In step S26, three-dimensional distance measurement data is constructed. The CCU 5039 constructs three-dimensional distance measurement data of the object 9 based on the result of the distance measurement to the object 9 in the preceding step S25, and complements the pre-generated three-dimensional map so as to include the constructed three-dimensional distance measurement data of the object 9 (see the sketch below).
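  • One plausible shape for that complement step, assuming both the map and the new measurement are 3D point sets; the voxel-grid de-duplication is a stand-in for whatever map-fusion step the real system would use.

```python
import numpy as np

def complement_map(map_pts, new_pts, voxel_mm=1.0):
    """Append newly measured 3D points (an object 9 the pre-generated map
    did not cover) to the map, de-duplicated on a coarse voxel grid."""
    merged = np.vstack([map_pts, new_pts])              # (N, 3) arrays in mm
    keys = np.round(merged / voxel_mm).astype(np.int64)
    _, idx = np.unique(keys, axis=0, return_index=True)
    return merged[np.sort(idx)]                         # one point per voxel
```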
  • The CCU 5039 may also measure the distance to an object 9 when the position of the object 9 existing around the distal end of the endoscope 5001 changes. A change in the position of the object 9 may be detected by image recognition or the like performed on a captured image (for example, an image based on an image signal generated by the sensor 33 or the sensor 34). For example, when an assistant or the like lifts and moves the liver during surgery, the change in the position of the liver is detected.
  • In this case, the CCU 5039 may construct three-dimensional distance measurement data of the object 9 after the position change and generate a three-dimensional map containing the object 9. The generation of this three-dimensional map may update the previously generated three-dimensional map.
  • FIG. 13 is a flowchart showing another example of processing executed in the endoscopic surgery system.
  • In step S31, it is determined whether or not the surrounding situation has changed. For example, the CCU 5039 determines that the surrounding situation has changed when it detects a change in the position of an object 9 included in the captured image. If the surrounding situation has changed (step S31: Yes), the process proceeds to step S32; otherwise (step S31: No), the process of step S31 is repeated.
  • The processing of steps S32 to S35 is the same as the processing of steps S22 to S25 in FIG. 12 described above. That is, the CCU 5039 transitions to the distance measurement mode, changes the light irradiation profile of the light source device 5043, and measures the distance to the object 9 based on the detection result of the sensor 35 generated by the change in the irradiation profile.
  • In step S36, three-dimensional distance measurement data is constructed. The CCU 5039 constructs three-dimensional distance measurement data of the object 9 based on the result of the distance measurement in the preceding step S35, and updates the pre-generated three-dimensional map so as to include the constructed three-dimensional distance measurement data of the object 9.
  • the distance measurement technique by the medical observation device 100 may be applied to other surgical systems.
  • Another example of a surgical system is a surgical system that includes an operating microscope.
  • FIG. 14 is a diagram showing an example of the hardware configuration of the apparatus.
  • the CCU 5, the CCU 5039, and the like described so far are implemented by, for example, the computer 1000 shown in FIG. 14.
  • the computer 1000 has a CPU 1100, a RAM 1200, a ROM (Read Only Memory) 1300, an HDD (Hard Disk Drive) 1400, a communication interface 1500, and an input/output interface 1600. Each part of the computer 1000 is connected by a bus 1050.
  • the CPU 1100 operates based on programs stored in the ROM 1300 or HDD 1400 and controls each section. For example, the CPU 1100 loads programs stored in the ROM 1300 or HDD 1400 into the RAM 1200 and executes processes corresponding to various programs.
  • the ROM 1300 stores boot programs such as the BIOS (Basic Input Output System) executed by the CPU 1100 when the computer 1000 is started, programs dependent on the hardware of the computer 1000, and the like.
  • the HDD 1400 is a computer-readable recording medium that non-transitorily records programs executed by the CPU 1100 and data used by such programs.
  • specifically, the HDD 1400 is a recording medium that records programs for the medical observation method and the information processing method according to the present disclosure, which are an example of the program data 1450.
  • a communication interface 1500 is an interface for connecting the computer 1000 to an external network 1550 (for example, the Internet).
  • CPU 1100 receives data from another device via communication interface 1500, and transmits data generated by CPU 1100 to another device.
  • the input/output interface 1600 is an interface for connecting the input/output device 1650 and the computer 1000 .
  • the CPU 1100 receives data from input devices such as a keyboard and mouse via the input/output interface 1600 .
  • the CPU 1100 also transmits data to an output device such as a display, speaker, or printer via the input/output interface 1600 .
  • the input/output interface 1600 may function as a media interface for reading a program or the like recorded on a predetermined computer-readable recording medium.
  • examples of such media include optical recording media such as a DVD (Digital Versatile Disc) and a PD (Phase change rewritable Disk), magneto-optical recording media such as an MO (Magneto-Optical disk), tape media, magnetic recording media, and semiconductor memories.
  • the CPU 1100 of the computer 1000 implements the functions of the CCU 5 and the CCU 5039 by executing a program for depth estimation loaded into the RAM 1200.
  • the HDD 1400 may store programs for executing the processes of the CCU 5 and the CCU 5039. Note that while the CPU 1100 reads the program data 1450 from the HDD 1400 and executes it, as another example the program may be obtained from another device via the external network 1550.
  • the CCU 5 and the CCU 5039 may be applied to a system made up of multiple devices that assumes connection to a network (or communication between devices), such as cloud computing.
  • Each of the above components may be configured using general-purpose members, or may be configured with hardware specialized for the function of each component. Such a configuration can be changed as appropriate according to the technical level of implementation.
  • the medical observation device 100 includes the light source device 4, the sensor 35, and the control unit (CCU 5).
  • the light source device 4 irradiates the object 9 with light.
  • the sensor 35 detects that the amount of luminance change due to the light incident on each of the plurality of pixels 352 exceeds a predetermined threshold.
  • the control unit (CCU 5) changes the light irradiation profile of the light source device 4 and measures the distance to the object 9 based on the detection result of the sensor 35 generated by the change in the light irradiation profile.
  • in this way, the light irradiation profile of the light source device 4 is changed and the distance to the object 9 is measured based on the detection result of the sensor 35 generated by that change; as a result, distance measurement accuracy in surgery such as endoscopic surgery can be improved.
  • changes in the irradiation profile may include changes in the irradiation range.
  • the irradiation range may have an annular shape, and the change in the irradiation range may include a change in the size of the annular shape. For example, by varying the irradiation profile in this way, the distance to the object 9 can be measured.
  • changes in the irradiation profile may include changes in the combination of irradiating the object 9 with the light of the light source device 4 from a plurality of different positions.
  • Changes in illumination combinations may include changes in lighting patterns. For example, by varying the illumination profile in this way, the distance to the object 9 can be measured.
  • the control unit may temporarily stop the irradiation when changing the irradiation profile. As a result, the change in the illumination of the object 9 becomes more conspicuous, and events can be detected by the sensor 35 more reliably.
  • changes in the irradiation profile may include changes in the wavelength of light. For example, by varying the illumination profile in this way, the distance to the object 9 can be measured.
  • the CCU 5 described with reference to FIGS. 1 to 10 and the like can also be called an information processing device that performs processing related to distance measurement, and such an information processing device is also one of the embodiments.
  • the information processing device changes the light irradiation profile of the light source device 4 that irradiates the object 9 with light, and calculates the distance to the object 9 based on the detection result of the sensor 35 generated by the change in the light irradiation profile. Measure.
  • the sensor 35 detects that the amount of luminance change due to the light incident on each of the plurality of pixels exceeds a predetermined threshold.
  • Such an information processing apparatus can also improve the accuracy of distance measurement in surgery, as described above.
  • the medical observation method described with reference to FIG. 5 etc. is also one of the embodiments.
  • in this method, the information processing device changes the light irradiation profile of the light source device 4 that irradiates the object 9 with light, and measures the distance to the object 9 based on the detection result of the sensor 35 generated by the change in the light irradiation profile (steps S2 to S4).
  • the sensor 35 detects that the amount of luminance change due to the light incident on each of the plurality of pixels 352 exceeds a predetermined threshold.
  • Such a medical observation method can also improve the accuracy of distance measurement in surgery, as described above.
  • the endoscopic surgery system 5000 described with reference to FIGS. 11 to 13 and the like is also one of the embodiments.
  • the endoscopic surgery system 5000 includes an endoscope 5001 (corresponding to the endoscope 1) including a sensor 35 that detects that the amount of luminance change due to light incident on each of a plurality of pixels 352 exceeds a predetermined threshold, an arm portion 5031 that supports the endoscope 5001, a light source device 5043 (corresponding to the light source device 4) that irradiates the object 9 with light, and a control device (CCU 5039) that controls the endoscope 5001, the arm portion 5031, and the light source device 5043.
  • the control device changes the light irradiation profile of the light source device 5043, and measures the distance to the object 9 based on the detection result of the sensor 35 generated by the change in the light irradiation profile.
  • Such an endoscopic surgery system 5000 can also improve the accuracy of distance measurement in surgery as described above.
  • the control device (CCU 5039) may measure the distance when it does not hold the distance information of the object 9 existing around the tip of the endoscope 5001.
  • the control device (CCU 5039) may measure the distance when the position of the object 9 existing around the distal end of the endoscope 5001 changes.
  • the control device (CCU 5039) may generate a three-dimensional map including the object 9 based on the distance measurement results. As a result, distance information of the object 9 that had not been grasped so far can be obtained (construction of three-dimensional distance measurement data, etc.), and a three-dimensional map including it can be generated, for example as sketched below.
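One plausible way to picture the 3D map generation is to back-project the measured per-pixel depths into camera-frame points. The sketch below assumes a pinhole camera model with an intrinsic matrix K; the text does not prescribe a specific reconstruction, so both the model and the function name are assumptions for illustration.

```python
import numpy as np

# Hedged sketch: back-project a dense depth map into 3D points, assuming a
# pinhole model. point = depth * K^{-1} [u, v, 1]^T for each pixel (u, v).

def depths_to_points(depths: np.ndarray, K: np.ndarray) -> np.ndarray:
    h, w = depths.shape
    us, vs = np.meshgrid(np.arange(w), np.arange(h))
    pix = np.stack([us, vs, np.ones_like(us)], axis=-1).reshape(-1, 3)
    rays = pix @ np.linalg.inv(K).T       # camera-frame viewing rays
    return rays * depths.reshape(-1, 1)   # scale each ray by its depth
```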
  • the control device (CCU 5039) may control the driving of the arm portion 5031 based on the generated three-dimensional map including the object 9. This enables more appropriate control of the arm portion 5031 based on the latest three-dimensional map.
  • the present technology can also take the following configurations.
  • (1) A medical observation device comprising: a light source device that irradiates an object with light; a sensor that detects that the amount of luminance change due to light incident on each of a plurality of pixels exceeds a predetermined threshold; and a control unit that changes the light irradiation profile of the light source device and measures the distance to the object based on the detection result of the sensor generated by the change in the light irradiation profile.
  • (2) The change in the irradiation profile includes a change in the irradiation range.
  • (3) The irradiation range includes an annular shape, and the change in the irradiation range includes a change in the size of the annular shape.
  • (4) The change in the irradiation profile includes a change in the combination of irradiating the object with the light of the light source device from a plurality of different positions.
  • (5) The change in the irradiation combination includes a change in the lighting pattern. The medical observation device according to (4).
  • (6) The control unit temporarily stops the irradiation when changing the irradiation profile.
  • (7) The change in the irradiation profile includes a change in the wavelength of the light.
  • (8) An information processing device comprising a control unit that changes the light irradiation profile of a light source device that irradiates an object with light and measures the distance to the object based on the detection result of a sensor generated by the change in the light irradiation profile, wherein the sensor detects that the amount of luminance change due to light incident on each of a plurality of pixels exceeds a predetermined threshold.
  • (9) A medical observation method in which an information processing device changes the light irradiation profile of a light source device that irradiates an object with light and measures the distance to the object based on the detection result of a sensor generated by the change in the light irradiation profile, wherein the sensor detects that the amount of luminance change due to light incident on each of a plurality of pixels exceeds a predetermined threshold.
  • (10) An endoscopic surgery system comprising: an endoscope including a sensor that detects that the amount of luminance change due to light incident on each of a plurality of pixels exceeds a predetermined threshold; an arm portion that supports the endoscope; a light source device that irradiates an object with light; and a control device that controls the endoscope, the arm portion, and the light source device, wherein the control device changes the light irradiation profile of the light source device and measures the distance to the object based on the detection result of the sensor generated by the change in the light irradiation profile.
  • (11) The control device measures the distance when distance information of an object existing around the tip of the endoscope is not held.
  • (12) The control device measures the distance when the position of an object existing around the tip of the endoscope changes.
  • (13) The control device generates a three-dimensional map including the object based on the distance measurement result.
  • (14) The control device controls the driving of the arm portion based on the generated three-dimensional map including the object.
  • (15) The light source device irradiates the object with light from a predetermined position and in a predetermined direction. The medical observation device according to any one of (1) to (7).
  • (16) The light emitted from the light source device is applied to the object from a predetermined position and in a predetermined direction. The information processing device according to (8).
  • (17) The light emitted from the light source device is applied to the object from a predetermined position and in a predetermined direction. The medical observation method according to (9).
  • (18) The light source device irradiates the object with light from a predetermined position and in a predetermined direction. The endoscopic surgery system according to any one of (10) to (14).


Abstract

A medical observation device (100) is equipped with: a light source device (4) for illuminating an object (9) with light; a sensor (35) for detecting that the amount of change in the brightness of light incident to each of a plurality of pixels (352) has exceeded a threshold value; and a control unit (5) that changes a light illumination profile of the light source device (4), and measures the distance to the object (9) on the basis of the detection result which is acquired from the sensor (35) and is generated as a result of the change of the light illumination profile.

Description

Medical observation device, information processing device, medical observation method, and endoscopic surgery system
 The present disclosure relates to a medical observation device, an information processing device, a medical observation method, and an endoscopic surgery system.
 It is known to control an arm that supports an endoscope based on an image captured by the endoscope (see, for example, Patent Document 1).
 Patent Document 1: JP 2015-123201 A. Patent Document 2: WO 2018/123456.
 There is a demand for performing distance measurement in endoscopic surgery and microscopic surgery, and distance measurement techniques for use in such surgery still leave room for study.
 One aspect of the present disclosure aims to provide a distance measurement technique suitable for surgery.
 A medical observation device according to one aspect includes: a light source device that irradiates an object with light; a sensor that detects that the amount of luminance change due to light incident on each of a plurality of pixels exceeds a predetermined threshold; and a control unit that changes the light irradiation profile of the light source device and measures the distance to the object based on the detection result of the sensor generated by the change in the light irradiation profile.
 An information processing device according to one aspect includes a control unit that changes the light irradiation profile of a light source device that irradiates an object with light and measures the distance to the object based on the detection result of a sensor generated by the change in the light irradiation profile, the sensor detecting that the amount of luminance change due to light incident on each of a plurality of pixels exceeds a predetermined threshold.
 A medical observation method according to one aspect includes an information processing device changing the light irradiation profile of a light source device that irradiates an object with light and measuring the distance to the object based on the detection result of a sensor generated by the change in the light irradiation profile, the sensor detecting that the amount of luminance change due to light incident on each of a plurality of pixels exceeds a predetermined threshold.
 An endoscopic surgery system according to one aspect includes: an endoscope including a sensor that detects that the amount of luminance change due to light incident on each of a plurality of pixels exceeds a predetermined threshold; an arm portion that supports the endoscope; a light source device that irradiates an object with light; and a control device that controls the endoscope, the arm portion, and the light source device, the control device changing the light irradiation profile of the light source device and measuring the distance to the object based on the detection result of the sensor generated by the change in the light irradiation profile.
 FIG. 1 is a diagram showing an example of a schematic configuration of a medical observation device according to an embodiment.
 FIG. 2 is a diagram showing an example of a schematic configuration of an endoscope.
 FIG. 3 is a diagram showing an example of a schematic configuration of a sensor (EVS).
 FIG. 4 is a diagram showing an example of a schematic configuration of a pixel.
 FIG. 5 is a flowchart showing an example of processing executed in the medical observation device.
 FIG. 6 is a diagram schematically showing an example of changes in the irradiation range.
 FIG. 7 is a diagram schematically showing an example of the relationship between the incidence angle and the irradiation range.
 FIG. 8 is a diagram schematically showing a combination of a plurality of irradiations.
 FIG. 9 is a flowchart showing an example of processing including temporary suspension of irradiation (turning off the light).
 FIG. 10 is a diagram showing an example of a schematic configuration of a light source device.
 FIG. 11 is a diagram showing an example of a schematic configuration of an endoscopic surgery system.
 FIG. 12 is a flowchart showing an example of processing executed in the endoscopic surgery system.
 FIG. 13 is a flowchart showing an example of processing executed in the endoscopic surgery system.
 FIG. 14 is a diagram showing an example of the hardware configuration of the apparatus.
 Embodiments of the present disclosure will be described in detail below with reference to the drawings. In each of the following embodiments, the same elements are denoted by the same reference numerals, and redundant description is omitted.
 The present disclosure will be described in the following order of items.
 1. Embodiment of the medical observation device
 2. Examples of the irradiation profile
 3. Embodiment of the endoscopic surgery system
 4. Example of the hardware configuration
 5. Examples of effects
1. Embodiment of the Medical Observation Device
 FIG. 1 is a diagram showing a schematic configuration of a medical observation device according to an embodiment. The medical observation device 100 includes an endoscope 1, a light source device 4, and a CCU 5.
 The endoscope 1 images the operative field (operative site) within the body cavity of the patient. An object within the operative field is illustrated as an object 9. Examples of the object 9 include organs and surgical tools. The endoscope 1 may be a forward-viewing endoscope or an oblique-viewing endoscope. FIG. 1 illustrates a lens barrel 2 and a camera head 3 as components of the endoscope 1; their details will be described later with reference to FIG. 2.
 In the present disclosure, "imaging" may be meant to include "shooting". Hereinafter, unless otherwise specified, both are simply referred to as "imaging", and to the extent that no contradiction arises, "imaging" may be read as "shooting". Similarly, "image" may be meant to include "video"; hereinafter, both are simply referred to as "image", and to the extent that no contradiction arises, "image" may be read as "video".
 The light source device 4 irradiates the object 9 with light. The light source device 4 supplies the endoscope 1 with light for irradiating the surgical field. The light source device 4 may include a light source such as an LED (Light Emitting Diode) light source or a laser light source. The light source device 4 includes a light exit port 43 through which light from the light source is emitted. The light exit port 43 is connected to the endoscope 1 via a cable C14. The cable C14 includes a light guide (for example, an optical waveguide such as an optical fiber) that propagates the light. The light exit port 43 can also be said to be a light guide insertion port into which the light guide is inserted.
 An example of the light that the light source device 4 irradiates onto the object 9 is illumination light for illuminating the surgical field. An example of the illumination light is white light, but it is not particularly limited to this.
 Another example of the light with which the light source device 4 irradiates the object 9 is light in a predetermined wavelength band corresponding to special light observation. The special light observation may be so-called narrow band imaging. In that case, for example, by utilizing the wavelength dependence of light absorption in body tissue, the object 9 is irradiated with light in a band narrower than that of the irradiation light (for example, white light) used during normal observation; as a result, predetermined tissue such as blood vessels on the mucosal surface is imaged with high contrast. The special light observation may be fluorescence observation. In that case, an image is obtained from fluorescence generated by irradiating the object 9 with excitation light. For example, when the object 9 is body tissue, the body tissue is irradiated with excitation light and fluorescence from the body tissue is observed (autofluorescence observation). A reagent such as indocyanine green (ICG) may be locally injected into the body tissue, and the body tissue may be irradiated with excitation light corresponding to the fluorescence wavelength of the reagent to obtain a fluorescence image. The light source device 4 may be configured to be capable of supplying narrow-band light and/or excitation light corresponding to such special light observation.
 Another example of the light with which the light source device 4 irradiates the object 9 is light for measuring the distance to the object 9 (light for distance measurement). Various kinds of light described above, including the illumination light, may be used as the light for distance measurement. The light for distance measurement is not limited to visible light, and may be infrared light or ultraviolet light. The light source device 4 may be configured to change the wavelength of the light supplied to the endoscope 1, which will be described later with reference to FIG. 10.
 The CCU 5 can function as a control unit in the medical observation device 100. The CCU 5 is a camera control unit that controls the endoscope 1. The CCU 5 is connected to the endoscope 1 via a cable C15. The cable C15 includes signal lines and the like, and enables control of the endoscope 1 by the CCU 5. The CCU 5 includes, for example, a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), and the like.
 The CCU 5 performs overall control of the medical observation device 100, including the operation of a display device (corresponding to the display device 5041 in FIG. 11 described later) that displays an image of the surgical site in the patient's body cavity captured by the endoscope 1. The CCU 5 can also be said to be a control device that performs overall control of the medical observation device 100, or an information processing device that processes various kinds of information handled in the medical observation device 100.
 The CCU 5 controls the irradiation of the object 9 with light by controlling the light source device 4. The CCU 5 is connected to the light source device 4 via a cable C45. The cable C45 includes signal lines and the like, and enables control of the light source device 4 by the CCU 5. For example, the CCU 5 transmits to the light source device 4 an irradiation profile that defines the mode of irradiation. The irradiation profile will be described again later. The light source device 4 supplies light to the endoscope 1 as defined by the received irradiation profile.
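As one way to picture the irradiation profile exchanged between the CCU 5 and the light source device 4, the following sketch models it as a small data structure. Every field name here is an assumption for illustration, since a concrete format is not specified.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

# Hypothetical shape of the irradiation profile sent from the CCU 5 to the
# light source device 4; every field name is an assumption for illustration.

@dataclass
class IrradiationProfile:
    shape: str = "full"                    # "full", "center", or "ring"
    ring_radius: Optional[float] = None    # ring size when shape == "ring"
    wavelength_nm: Optional[float] = None  # None = broadband (white) light
    lighting_pattern: Optional[Tuple[bool, ...]] = None  # on/off per emitter

# A profile change could then be commanded as, for example:
# light_source.apply(IrradiationProfile(shape="ring", ring_radius=2.0))
```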
 FIG. 2 is a diagram showing an example of a schematic configuration of the endoscope. As mentioned earlier, the endoscope 1 includes a lens barrel 2 and a camera head 3. The illustrated lens barrel 2 is rigid, and the endoscope 1 is configured as a so-called rigid endoscope. The lens barrel 2 extends with the camera head 3 as its base end. An example of the size of the lens barrel 2 is a diameter of several millimeters to ten-odd millimeters and a length of several hundred millimeters. Note that the lens barrel 2 may be flexible, in which case the endoscope 1 is configured as a flexible endoscope.
 A region of the lens barrel 2 of a predetermined length from its tip is inserted into the body cavity of the patient. As components of the lens barrel 2, a mounting portion 21, a light guide 22, a mounting portion 23, and lenses 24a to 24g are exemplified. The opening at the tip of the lens barrel 2 on the side opposite to the camera head 3 (the object 9 side) is referred to as an opening 2a.
 The mounting portion 21 is a portion to which the cable C14 from the light source device 4 is attached. The light of the light source device 4 is supplied to the endoscope 1 via the cable C14 and the mounting portion 21.
 The light guide 22 extends inside the lens barrel 2 and guides the light of the light source device 4 to the tip of the lens barrel 2 so that the light irradiates the object 9 from a predetermined position and in a predetermined direction. The tip of the light guide 22 at the tip of the lens barrel 2 is referred to as an exit end 22a. The light of the light source device 4 is emitted from the exit end 22a and irradiates the object 9. The position of the exit end 22a corresponds to the predetermined position, and the direction of the light emitted from the light guide 22 corresponds to the predetermined direction. The predetermined position and the predetermined direction are known to the medical observation device 100, more specifically to the CCU 5, and enable the distance measurement described later. At least part of the light applied to the object 9 is reflected by the object 9, and at least part of the reflected light enters the opening 2a. The light that enters the opening 2a from the object 9 in this way may hereinafter be referred to simply as the "light from the object 9" or the like.
 The mounting portion 23 is a portion (for example, an eyepiece) that connects the base end of the lens barrel 2 to the camera head 3.
 The lenses 24a to 24g guide the light from the object 9 to the camera head 3. In this example, the lenses 24a to 24g are provided inside the lens barrel 2 in order from the tip of the lens barrel 2 (the end on the opening 2a side) toward the base end. The lens 24a, positioned closest to the object 9, can function as an objective lens. Note that the number of lenses provided in the lens barrel 2 is merely an example; more lenses than shown, for example several tens or even 100 or more, may be provided inside the lens barrel 2. The light from the object 9 reaches the camera head 3 through the many lenses provided in the lens barrel 2.
 The camera head 3 detects the light from the object 9. As components of the camera head 3, a mounting portion 31, a lens 32a, an optical element 32b, sensors 33 to 35, and a vibration blur correction mechanism 36 are exemplified.
 The mounting portion 31 is a portion to which the base end portion of the lens barrel 2 is attached, and the mounting portion 23 of the lens barrel 2 described above is connected to it. The light from the object 9 enters the camera head 3 via the mounting portion 31.
 The lens 32a and the optical element 32b condense the light from the object 9 onto each of the sensors 33 to 35. The lens 32a guides the observation light to the optical element 32b. The optical element 32b guides the light to be detected by the sensor 33 to the sensor 33, the light to be detected by the sensor 34 to the sensor 34, and the light to be detected by the sensor 35 to the sensor 35. The optical element 32b includes various known optical elements such as mirrors and prisms.
 The sensor 33 is an imaging element (first imaging sensor) that images the object 9. For example, the sensor 33 is an imaging element for visible light observation and includes RGB pixels and the like. In that case, the optical element 32b described above guides the light from the object 9, in particular the visible light, to the sensor 33. The sensor 33 generates an electrical signal (image signal) by photoelectric conversion. The generated image signal is transmitted to the CCU 5.
 The sensor 34 is an imaging element (second imaging sensor) that images the object 9. For example, the sensor 34 is an imaging element for special light observation and includes infrared light pixels and the like. In that case, the optical element 32b described above guides the light from the object 9, in particular the infrared light, to the sensor 34. The sensor 34 generates an electrical signal (image signal) by photoelectric conversion. The generated image signal is transmitted to the CCU 5.
 Note that the camera head 3 may be provided with a drive mechanism or the like for appropriately driving the optical system such as the lens 32a and the optical element 32b, making it possible to adjust the magnification, the focal length, and the like. Also, for example, in order to support stereoscopic vision (3D display) or the like, a plurality of sensors 33 or a plurality of sensors 34 may be provided. In this case, a plurality of relay optical systems may be provided inside the lens barrel 2 in order to guide the observation light to each of the plurality of imaging elements.
 The sensor 35 is a sensor that detects a luminance change due to the light from the object 9 (a luminance change detection sensor). The sensor 35 is also called an EVS (Event Vision Sensor) or the like. The sensor 35 will be described with reference also to FIGS. 3 and 4.
 FIG. 3 is a diagram showing an example of a schematic configuration of the sensor (EVS). The sensor 35 includes a pixel array section 351 including a plurality of pixels 352, a drive circuit 353, a signal processing section 354, an arbiter section 355 (arbitration section), and a column processing section 356. The drive circuit 353, the signal processing section 354, the arbiter section 355, and the column processing section 356 are peripheral circuit sections of the pixel array section 351. The sensor 35 detects that the amount of luminance change due to light incident on each of the plurality of pixels 352 exceeds a predetermined threshold.
 The plurality of pixels 352 are arranged in a matrix in this example. Each pixel 352 can generate, as a pixel signal, a voltage corresponding to a photocurrent generated by photoelectric conversion. Each pixel 352 can detect the presence or absence of an event by comparing a change in photocurrent corresponding to a change in the luminance of the incident light with a predetermined threshold. In other words, each pixel 352 can detect an event based on the luminance change exceeding the predetermined threshold.
 When detecting an event, each pixel 352 can output to the arbiter section 355 a request for outputting event data representing the occurrence of the event. Each pixel 352 then outputs the event data to the drive circuit 353 and the signal processing section 354 when receiving from the arbiter section 355 a response indicating permission to output the event data. The pixel 352 that has detected the event also outputs a pixel signal generated by photoelectric conversion to the column processing section 356.
 The drive circuit 353 can drive each pixel 352 of the pixel array section 351. For example, the drive circuit 353 drives a pixel 352 that has detected an event and output event data, and causes the pixel signal of that pixel 352 to be output to the column processing section 356.
 The arbiter section 355 can arbitrate the requests for outputting event data supplied from the individual pixels 352, transmit a response based on the arbitration result (permission/non-permission of event data output), and transmit to the pixels 352 a reset signal for resetting the event detection.
 For each column of the pixel array section 351, the column processing section 356 can perform processing for converting the analog pixel signals output from the pixels 352 of that column into digital signals. The column processing section 356 can also perform CDS (Correlated Double Sampling) processing on the digitized pixel signals.
 The signal processing section 354 can execute predetermined signal processing on the digitized pixel signals supplied from the column processing section 356 and on the event data output from the pixel array section 351, and can output the event data after the signal processing (time stamp information and the like) and the pixel signals. The event data, pixel signals, and the like are transmitted to the CCU 5.
 A change in the photocurrent generated in a pixel 352 can be regarded as a change in the amount of light (a luminance change) incident on the pixel 352. An event can therefore be said to be a luminance change of a pixel 352 exceeding a predetermined threshold. Furthermore, the event data representing the occurrence of an event can include at least position information, such as coordinates, representing the position of the pixel 352 in which the change in the amount of light occurred as the event. Event data including such position information is transmitted to the CCU 5.
 FIG. 4 is a diagram showing an example of a schematic configuration of a pixel. The pixel 352 includes a light receiving section 3521, a pixel signal generation section 3522, and an event detection section 3523.
 The light receiving section 3521 can photoelectrically convert incident light to generate a photocurrent. Under the control of the drive circuit 353, the light receiving section 3521 can then supply a voltage signal corresponding to the photocurrent to either the pixel signal generation section 3522 or the event detection section 3523.
 The pixel signal generation section 3522 can generate, as a pixel signal, the signal supplied from the light receiving section 3521. The pixel signal generation section 3522 can then supply the generated analog pixel signal to the column processing section 356 via a vertical signal line VSL (not shown) corresponding to the column of the pixel array section 351.
 The event detection section 3523 can detect the presence or absence of an event based on whether the amount of change in the photocurrent from the light receiving section 3521 exceeds a predetermined threshold. The events can include, for example, an ON event indicating that the amount of change in the photocurrent exceeded an upper threshold, and an OFF event indicating that the amount of change fell below a lower threshold. Note that the event detection section 3523 may detect only ON events.
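The ON/OFF decision can be summarized in a few lines. The sketch below assumes the usual event-camera convention of comparing a (typically log-domain) intensity level against the last level at which the pixel fired; the function name and threshold handling are illustrative, not taken from the text above.

```python
# Illustrative ON/OFF event decision of the event detection section 3523;
# levels are assumed to be (log-domain) intensity values, and the pixel's
# reference level is typically reset to the current level when an event fires.

def detect_event(last_level: float, current_level: float,
                 on_threshold: float, off_threshold: float):
    delta = current_level - last_level
    if delta > on_threshold:
        return "on"     # luminance rose past the upper threshold
    if delta < -off_threshold:
        return "off"    # luminance fell past the lower threshold
    return None         # no event: the change stayed within the thresholds
```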
 When an event occurs, the event detection section 3523 can output to the arbiter section 355 a request for outputting event data representing the occurrence of the event. When receiving a response to the request from the arbiter section 355, the event detection section 3523 can output the event data to the drive circuit 353 and the signal processing section 354.
 With the configuration described above, the sensor 35 achieves a high dynamic range, high robustness in detecting fast-moving subjects, high temporal resolution, and the like. In particular, having a high dynamic range allows the light from the object 9 to be detected with high sensitivity.
 Returning to FIG. 2, the vibration blur correction mechanism 36 is provided for the sensor 35 and suppresses blur that vibration of the endoscope 1 or the like may cause in the light detection by the sensor 35. Various known mechanisms may be adopted as the vibration blur correction mechanism 36.
 The CCU 5 can function as a distance measurement section that measures the distance to the object 9 using the endoscope 1 and the light source device 4 described above. The CCU 5 changes the light irradiation profile of the light source device 4 and measures the distance to the object 9 based on the detection result of the sensor 35 generated by the change in the light irradiation profile. A command instructing a change of the irradiation profile may be transmitted from the CCU 5 to the light source device 4, or the changed irradiation profile itself may be transmitted from the CCU 5 to the light source device 4. In either case, the light source device 4 irradiates the object 9 with light as defined by the changed irradiation profile. The light that enters the sensor 35 from the object 9 also changes. In the pixel array section 351 of the sensor 35, a luminance change caused by the change in the light occurs and is detected as an event. As described above, event data including the position information of the pixels 352 in which events occurred is transmitted to the CCU 5.
 The CCU 5 calculates the distance to the object 9 based on the generated detection results of the sensor 35, that is, the event data transmitted from the sensor 35 to the CCU 5. The calculated distance of the object 9 may be the distance (depth) from any position known to the CCU 5 to the object 9. For example, the CCU 5 calculates the distance from the tip of the lens barrel 2 of the endoscope 1 to the object 9, the distance from the sensor 35 to the object 9, or the like. As described above, the light of the light source device 4 irradiates the object 9 from a predetermined position and in a predetermined direction known to the CCU 5. In addition, the event data transmitted from the sensor 35 of the camera head 3 of the endoscope 1 to the CCU 5 includes the position information of the pixels 352 in which events were detected, that is, information on the positions at which the light from the object 9 was received. The CCU 5 calculates the distance of the object 9 using various known methods that utilize this information. For example, triangulation may be used, in which case the light source device 4 functions as a laser light source and the sensor 35 functions as a position sensor. A scanning distance measurement method may also be used, in which case the light source 41 may output, for example, a scannable stripe laser beam.
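As an illustration of the triangulation case, the following sketch intersects the known illumination ray (the exit position and direction of the light guide) with the viewing ray reconstructed from the event pixel, using the standard closest-point-between-two-rays formula. The pinhole-style viewing ray and all names are assumptions for illustration; the text above only says "various known methods" are usable.

```python
import numpy as np

# Illustrative triangulation via the closest point between two 3D rays:
# the illumination ray o1 + t*d1 and the viewing ray o2 + s*d2.

def triangulate(light_origin, light_dir, cam_origin, cam_dir):
    d1 = np.asarray(light_dir, float); d1 /= np.linalg.norm(d1)
    d2 = np.asarray(cam_dir, float);   d2 /= np.linalg.norm(d2)
    o1 = np.asarray(light_origin, float)
    o2 = np.asarray(cam_origin, float)
    w0 = o1 - o2
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    d, e = d1 @ w0, d2 @ w0
    denom = a * c - b * b
    if abs(denom) < 1e-12:
        raise ValueError("rays are (nearly) parallel")
    t = (b * e - c * d) / denom   # parameter along the illumination ray
    s = (a * e - b * d) / denom   # parameter along the viewing ray
    point = (o1 + t * d1 + o2 + s * d2) / 2  # midpoint of shortest segment
    return point, float(np.linalg.norm(point - o2))  # 3D point and depth
```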
 As described above, the CCU 5 measures the distance to the object 9. The distance measurement results can be used in various ways. For example, three-dimensional distance measurement data of the object 9 may be constructed based on the distance measurement results. A three-dimensional map including (the three-dimensional distance measurement data of) the object 9 can also be generated. When the object 9 is an organ, a three-dimensional map of the patient's body cavity including the position of the organ can be generated. Autonomous control (robot control) of the endoscope 1 based on the three-dimensional map and the like also become possible.
 The medical observation device 100 may measure the distance to the object 9 at a designated timing. For example, the distance to the object 9 may be measured at a timing designated by a user operation. Examples of the user operation include pressing a button for instructing distance measurement (a distance measurement button), a voice instruction, and the like. The user operation is not limited to these and may include various operations capable of instructing distance measurement.
 FIG. 5 is a flowchart showing an example of processing (a medical observation method) executed in the medical observation device. Processing related to distance measurement is illustrated. Descriptions overlapping those given so far are omitted as appropriate.
 In step S1, it is determined whether or not to perform distance measurement. The CCU 5 determines that distance measurement should be performed, for example, when distance measurement is instructed by the user operation described above. When distance measurement is to be performed (step S1: Yes), the process proceeds to step S2. Otherwise (step S1: No), the process of step S1 is repeated.
 In step S2, the irradiation profile is changed. The CCU 5 changes the irradiation profile. The light from the object 9 also changes, and a luminance change occurs in the sensor 35.
 In step S3, the luminance change is detected. The sensor 35 detects the luminance change. Event data including the position information of the pixels 352 in which the luminance change was detected (in which events occurred) is transmitted to the CCU 5.
 In step S4, distance measurement is performed. The CCU 5 calculates the distance to the object 9 based on the detection result in the preceding step S3, more specifically, the event data transmitted from the sensor 35 to the CCU 5. As described above, the distance measurement result is used for constructing three-dimensional distance measurement data of the object 9, generating a three-dimensional map including it, and the like.
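Steps S1 to S4 can be condensed into a single loop. In the sketch below, ccu, light_source, and evs are hypothetical objects standing in for the CCU 5, the light source device 4, and the sensor 35; all method names are illustrative.

```python
import time

# Condensed sketch of FIG. 5 (steps S1 to S4).

def measurement_loop(ccu, light_source, evs):
    while True:
        if not ccu.distance_measurement_requested():        # step S1
            time.sleep(0.01)                                # poll again
            continue
        light_source.apply(ccu.next_irradiation_profile())  # step S2
        events = evs.collect_events()                       # step S3
        depths = ccu.compute_distances(events)              # step S4
        ccu.publish(depths)  # e.g. build 3D data / update the 3D map
```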
 According to the medical observation device 100 described above, changing the light irradiation profile of the light source device 4 (forcibly) causes the sensor 35 to detect events, and the distance to the object 9 is measured using the detection results.
 As described above, the lens barrel 2 of the endoscope 1 has an optical system of fairly small diameter and includes many lenses (the lenses 24a to 24g and so on). The light from the object 9 is therefore greatly attenuated in the lens barrel 2. It is conceivable, for example, to incorporate a TOF sensor into an endoscope, but many TOF sensors use infrared wavelengths, so the attenuation problem described above becomes pronounced; a TOF sensor cannot sufficiently detect the light from the object 9, and distance measurement accuracy is difficult to obtain. Distance measurement by a stereo optical system (3D) is also conceivable, but two imaging optical systems must be provided in parallel inside the lens barrel 2, each imaging optical system becomes even narrower in diameter, and sufficient sensitivity is again difficult to obtain. By using the high-sensitivity sensor 35 as the distance measurement sensor, as in the medical observation device 100 described above, the light from the object 9 is detected with high sensitivity and the accuracy of distance measurement can be improved. Sufficient accuracy can be obtained even with the endoscope 1.
 According to the medical observation device 100, the distance to the object 9 can be measured by changing the light irradiation profile of the light source device 4, so a robot motion for SfM (a mechanical motion of the sensor), as disclosed for example in Patent Document 1, is unnecessary. In this sense as well, the device is suitable for distance measurement with the endoscope 1. Since an irradiation change can be performed more quickly (for example, instantaneously) than a mechanical motion, distance measurement in a correspondingly shorter time also becomes possible.
2. Examples of the Irradiation Profile
 The irradiation profile defines the mode of light irradiation by the light source device 4. The irradiation profile may define any irradiation mode whose change can change the light from the object 9. Several examples of the irradiation profile will be described below.
 一実施形態において、照射プロファイルは、照射範囲を含んでよい。CCU5は、照射範囲を変化させることにより、センサ35によるイベントの検出を発生させ、対象物9までの距離を測定してよい。 In one embodiment, the illumination profile may include an illumination range. CCU 5 may cause detection of an event by sensor 35 to measure the distance to object 9 by changing the illumination range.
 照射範囲の例は、術野(撮像画像が提示される画面に相当)の全体、術野の中央部分、術野の中央部分の周囲部分等である。なお、中央部分及び周囲部分の照射は、その部分を重点的に照射するものであってよく、他の部分の照射の存在を排除するものではない。 Examples of the irradiation range are the entire operative field (equivalent to the screen on which the captured image is presented), the central part of the operative field, and the surrounding part of the central part of the operative field. It should be noted that the irradiation of the central portion and the peripheral portion may be performed intensively, and the existence of the irradiation of other portions is not excluded.
The irradiation range may have various shapes. For example, when the irradiation range is the central portion of the screen, its shape may be circular, elliptical, polygonal, or the like. When the irradiation range is the peripheral portion of the screen, its shape may be an annular (ring-shaped) circle, ellipse, polygon, or the like.

When the irradiation range has an annular shape, the CCU 5 may change the size of that annular shape. For example, the CCU 5 may change the irradiation range so that the annular shape gradually grows or shrinks. An example including such a change is described with reference to FIG. 6.

FIG. 6 schematically shows an example of changes in the irradiation range. The irradiation range R1 shown in FIG. 6(A) covers the central portion of the screen and, in this example, is circular. The irradiation ranges R2 to R4 shown in FIG. 6(B) to (D) cover the peripheral portion of the screen and, in this example, are annular. The annulus of the irradiation range R3 is larger than that of the irradiation range R2, and the annulus of the irradiation range R4 is larger than that of the irradiation range R3. The CCU 5 gradually changes the irradiation range so as to pass through, for example, the irradiation ranges R1 to R4. By sweeping the irradiation range in the radial direction of the circle in this way, the operative field can be scanned efficiently and the distance can be measured, as in the sketch below.
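As a minimal sketch of how a controller might step through such a radial scan, the following Python fragment generates a sequence of irradiation-range descriptors like R1 to R4. The IrradiationRange type and the angle values are illustrative assumptions, not part of the disclosed device:

    from dataclasses import dataclass
    from typing import Iterator

    @dataclass
    class IrradiationRange:
        # Illustrative descriptor for one irradiation range (R1 to R4 in FIG. 6):
        # inner_deg == 0 means a filled central circle; otherwise an annulus.
        inner_deg: float
        outer_deg: float

    def radial_scan(start_deg: float, stop_deg: float,
                    step_deg: float) -> Iterator[IrradiationRange]:
        # Yield a central spot (R1) followed by progressively larger rings.
        yield IrradiationRange(0.0, start_deg)
        inner = start_deg
        while inner + step_deg <= stop_deg:
            yield IrradiationRange(inner, inner + step_deg)
            inner += step_deg

    for r in radial_scan(start_deg=5.0, stop_deg=20.0, step_deg=5.0):
        print(r)  # R1 (filled circle), then rings R2, R3, R4

Each successive descriptor would be handed to the light source driver, and each transition between ranges produces the luminance changes that the sensor 35 detects as events.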
A change in the irradiation range such as from R1 to R4 above can be realized, for example, by changing the angle at which light from the light source device 4 enters the optical waveguide. This is described with reference to FIG. 7.

FIG. 7 schematically shows an example of the relationship between the incident angle and the irradiation range. The optical waveguide here consists of the cable C14 and the light guide 22 described earlier with reference to FIG. 2. The optical waveguide may be, for example, a bundle fiber in which a number of index-guiding multimode optical fibers with core diameters of roughly 10 μm to 100 μm are bundled, or a liquid light guide. Such a waveguide has the property of emitting rays from its exit end face while preserving the angle at which they entered the entrance end face.

As shown in the upper part of FIG. 7, when the angle at which light from the light source device 4 enters the optical waveguide is made smaller, the angle at which the light exits the waveguide also becomes smaller. As shown in the lower part of FIG. 7, when the incident angle is made larger, the exit angle also becomes larger. The waveguide preserves the incident angle of a ray but not its incident position, so a ray entering at a given angle exits from the end face as a ring-shaped beam that maintains that angle.

For example, by controlling the incident angle of light into the waveguide as described above, the irradiation range can be changed. The light source device 4 may include various optical elements for controlling this incident angle. Examples of such elements are a collimator lens for obtaining parallel light, a mechanism for adjusting the incident angle (incident angle adjustment mechanism), and an optical system for coupling the angle-adjusted light into the waveguide (coupling optics). See, for example, Patent Document 2 for a more specific configuration.
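Because the exit angle equals the entrance angle, the radius of the illuminated ring at a given working distance follows directly from trigonometry. A small sketch under that assumption (the 50 mm working distance is an arbitrary example value):

    import math

    def ring_radius_mm(incident_angle_deg: float, working_distance_mm: float) -> float:
        # The waveguide preserves the ray angle (angle in == angle out), so at
        # working distance d the ring radius is approximately d * tan(theta).
        return working_distance_mm * math.tan(math.radians(incident_angle_deg))

    for theta in (5.0, 10.0, 20.0):
        print(f"{theta:4.1f} deg -> ring radius {ring_radius_mm(theta, 50.0):5.1f} mm")

A larger incident angle thus yields a larger ring, matching the behavior shown in FIG. 7.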
In one embodiment, the irradiation profile may include a combination of irradiations of the object 9 from a plurality of different positions of the light from the light source device 4. By changing that combination, the CCU 5 may cause the sensor 35 to detect events and thereby measure the distance to the object 9.

FIG. 8 schematically shows a combination of multiple irradiations. FIG. 8 shows the appearance of the distal end of the lens barrel 2 (on the opening 2a side) viewed from the front. The emission ends 22a of a plurality of light guides 22 are positioned around the opening 2a; to distinguish them, they are labeled emission end 22a-1 to emission end 22a-12. The light source device 4 is configured to supply light to each of the emission ends 22a-1 to 22a-12 and to control each light individually. One example of such control is switching the light on and off, and the CCU 5 may change the lighting pattern of the light emitted from the emission ends 22a-1 to 22a-12. For example, the CCU 5 may change the lighting pattern so that the emission ends 22a-1 to 22a-12 emit light (light up) sequentially in that order, as in the sketch below.
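A minimal sketch of such a sequential lighting pattern follows; the callback-based driver interface is an assumption for illustration, not an API of the actual light source device 4:

    import time
    from typing import Callable

    NUM_EXITS = 12  # emission ends 22a-1 to 22a-12 around the opening 2a

    def sequential_pattern(light_on: Callable[[int], None],
                           light_off: Callable[[int], None],
                           dwell_s: float = 0.01) -> None:
        # Light each emission end in turn; every on/off transition is a
        # luminance change that the event sensor 35 can detect.
        for i in range(NUM_EXITS):
            light_on(i)
            time.sleep(dwell_s)
            light_off(i)

    # Stub callbacks standing in for the real device interface:
    sequential_pattern(lambda i: print(f"on  22a-{i + 1}"),
                       lambda i: print(f"off 22a-{i + 1}"))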
In one embodiment, the CCU 5 may temporarily stop the irradiation (turn the light off) when changing the irradiation profile of the light source device 4. The change in the irradiation of the object 9 then becomes more pronounced, and event detection by the sensor 35 can be triggered more reliably.
FIG. 9 is a flowchart showing an example of processing that includes a temporary suspension of irradiation (turning the light off). In step S11, it is determined whether to change the irradiation profile. For example, the CCU 5 determines that the irradiation profile should be changed when a distance measurement is instructed by a user operation. If the irradiation profile is to be changed (step S11: Yes), the processing proceeds to step S12; otherwise (step S11: No), step S11 is repeated.

In step S12, the light is turned off. The CCU 5 transmits to the light source device 4, for example, a command instructing it to stop the light irradiation, and the light source device 4 stops irradiating the object 9.

In step S13, the irradiation profile is changed. The CCU 5 changes the irradiation profile, and the light source device 4 becomes ready to irradiate the object 9 as defined by the changed profile.

In step S14, the light is turned on. The CCU 5 transmits to the light source device 4, for example, a command instructing it to start the light irradiation, and the light source device 4 starts irradiating the object 9 as defined by the profile changed in step S13.
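The sequence of steps S12 to S14 can be summarized as a small off-change-on routine. The following Python sketch assumes a hypothetical driver object with stop(), set_profile(), and start() methods; these names are illustrative, not part of the disclosure:

    class StubLightSource:
        # Stand-in for a driver of the light source device 4.
        def stop(self) -> None:
            print("irradiation stopped")

        def set_profile(self, profile: str) -> None:
            print("profile ->", profile)

        def start(self) -> None:
            print("irradiation started")

    def change_profile_with_blanking(light_source, new_profile) -> None:
        light_source.stop()                    # S12: turn off so the change is abrupt
        light_source.set_profile(new_profile)  # S13: reconfigure while dark
        light_source.start()                   # S14: re-illuminate with the new profile

    change_profile_with_blanking(StubLightSource(), "annular range R2")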
In one embodiment, the irradiation profile may include the wavelength of the light from the light source device 4. By changing that wavelength, the CCU 5 may cause the sensor 35 to detect events and thereby measure the distance to the object 9. Some objects 9 (for example, organs) exhibit different reflection characteristics depending on the wavelength of the irradiating light. Changing the wavelength of the light irradiating such an object 9 also changes the light returning from it, which can trigger event detection by the sensor 35.

The light source device 4 is configured so that the wavelength of the light irradiating the object 9 can be changed. An example of such a configuration is described with reference to FIG. 10.
FIG. 10 shows an example of a schematic configuration of the light source device. In this example, the light source device 4 includes, in addition to the light exit port 43 described earlier, light sources 41a to 41f, mirrors 42a to 42f, and a lens 42g.

The light sources 41a to 41f each output light of a different wavelength (a different color). In this example, the light source 41a outputs infrared light, the light source 41b red light, the light source 41c yellow light, the light source 41d green light, the light source 41e blue light, and the light source 41f violet light. The light sources 41a to 41f are individually controllable.

The mirrors 42a to 42f guide the light of the light sources 41a to 41f to the lens 42g. The mirror 42a reflects the light of the light source 41a toward the lens 42g. The mirror 42b reflects the light of the light source 41b toward the lens 42g and transmits the light of the light source 41a reflected by the mirror 42a. The mirror 42c reflects the light of the light source 41c toward the lens 42g and transmits the light of the light sources 41a and 41b reflected by the mirrors 42a and 42b.

The mirror 42d reflects the light of the light source 41d toward the lens 42g and transmits the light of the light sources 41a to 41c reflected by the mirrors 42a to 42c. The mirror 42e reflects the light of the light source 41e toward the lens 42g and transmits the light of the light sources 41a to 41d reflected by the mirrors 42a to 42d. The mirror 42f reflects the light of the light source 41f toward the lens 42g and transmits the light of the light sources 41a to 41e reflected by the mirrors 42a to 42e.

For example, the mirror 42a may be a total reflection mirror, and the mirrors 42b to 42f may be dichroic mirrors.

The lens 42g is a condenser lens that collects the light of the light sources 41a to 41f arriving via the mirrors 42a to 42f and guides it to the light exit port 43.

Because the light sources 41a to 41f are individually controllable as described above, the light source device 4 can emit any selected one of their lights (lights of different wavelengths) from the light exit port 43 and irradiate the object 9 with it.
The CCU 5 may change the wavelength of the light from the light source device 4 according to the type of the object 9, for example the type or composition of an organ. A wavelength change is selected that produces a change in the light from the object 9. The type of the object 9 may be identified by, for example, image recognition applied to a captured image (for example, an image based on an image signal generated by the sensor 33 or the sensor 34). Various known image recognition algorithms, image recognition models (trained models), and the like may be used for the image recognition.
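A minimal sketch of such type-dependent wavelength selection follows. The organ names and source assignments are illustrative assumptions; only the source labels 41a to 41f come from the FIG. 10 configuration:

    # Hypothetical mapping from a recognized object type to the pair of light
    # sources to switch between (41a: infrared ... 41f: violet).
    WAVELENGTH_SWITCH = {
        "liver": ("41b", "41e"),   # e.g. red -> blue
        "vessel": ("41d", "41a"),  # e.g. green -> infrared
    }

    def pick_wavelength_switch(object_type: str) -> tuple:
        # Fall back to a default switch when the recognized type is unknown.
        return WAVELENGTH_SWITCH.get(object_type, ("41d", "41b"))

    print(pick_wavelength_switch("liver"))    # ('41b', '41e')
    print(pick_wavelength_switch("unknown"))  # ('41d', '41b')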
The medical observation apparatus 100 described above (the endoscope 1, the light source device 4, and the CCU 5) may be incorporated into and used in, for example, an endoscopic surgery system that includes an endoscope robot. Such an embodiment is described below; in the following, the endoscope robot is described as an arm unit or the like that supports an endoscope.
3. Embodiment of an Endoscopic Surgery System

FIG. 11 shows an example of a schematic configuration of an endoscopic surgery system. The illustrated endoscope 5001 (lens barrel 5003, camera head 5005), light source device 5043, and CCU 5039 may be configured to have the configurations and functions of the endoscope 1 (lens barrel 2, camera head 3), light source device 4, and CCU 5 of the medical observation apparatus 100 described so far. Duplicate descriptions are omitted as appropriate.
FIG. 11 shows a surgeon 5067 performing surgery on a patient 5071 on a patient bed 5069 using the endoscopic surgery system 5000. As shown in FIG. 11, the endoscopic surgery system 5000 includes an endoscope 5001, other surgical tools (medical instruments) 5017, a support arm device (medical arm) 5027 that supports the endoscope 5001, and a cart 5037 on which various devices for endoscopic surgery are mounted. The details of the endoscopic surgery system 5000 are described below in turn.
[Surgical tools]

In endoscopic surgery, instead of cutting the abdominal wall to open the abdomen, a plurality of tubular opening instruments called trocars 5025a to 5025d, for example, are punctured into the abdominal wall. The lens barrel 5003 of the endoscope 5001 and the other surgical tools 5017 are then inserted into the body cavity (abdominal cavity) of the patient 5071 through the trocars 5025a to 5025d. In the example shown in FIG. 11, a pneumoperitoneum tube 5019, an energy treatment tool 5021, and forceps 5023 are inserted into the body cavity of the patient 5071 as the other surgical tools 5017. The energy treatment tool 5021 is a treatment tool that performs incision and ablation of tissue, sealing of blood vessels, or the like, using a high-frequency current or ultrasonic vibration. The surgical tools 5017 shown in FIG. 11 are merely examples, however, and various surgical tools generally used in endoscopic surgery, such as tweezers and retractors, may be used as the surgical tools 5017.
[Support arm device]

The support arm device 5027 has an arm unit 5031 extending from a base unit 5029. In the example shown in FIG. 11, the arm unit 5031 consists of joint units 5033a, 5033b, and 5033c and links 5035a and 5035b, and is driven under the control of an arm control device 5045. The arm unit 5031 supports the endoscope 5001 and controls its position and posture, which makes it possible to fix the position of the endoscope 5001 stably.
[Various devices mounted on the cart]

First, the display device 5041 displays, under the control of the CCU 5039, an image based on an image signal that has undergone image processing by the CCU 5039. When the endoscope 5001 supports high-resolution imaging such as 4K (3840 horizontal pixels × 2160 vertical pixels) or 8K (7680 horizontal pixels × 4320 vertical pixels), and/or supports 3D display, a display device 5041 capable of the corresponding high-resolution display and/or 3D display is used. A plurality of display devices 5041 with different resolutions and sizes may also be provided depending on the application.
An image of the surgical site in the body cavity of the patient 5071 captured by the endoscope 5001 is displayed on the display device 5041. While viewing that image in real time, the surgeon 5067 can use the energy treatment tool 5021 and the forceps 5023 to perform treatment such as resecting the affected area. Although not illustrated, the pneumoperitoneum tube 5019, the energy treatment tool 5021, and the forceps 5023 may be supported by the surgeon 5067, an assistant, or the like during surgery.
The CCU 5039 can comprehensively control the operation of the endoscope 5001 and the display device 5041. Specifically, the CCU 5039 applies to the image signal received from the camera head 5005 various kinds of image processing for displaying an image based on that signal, such as development processing (demosaicing), and provides the processed image signal to the display device 5041. The CCU 5039 also transmits a control signal to the camera head 5005 to control its driving; this control signal can include information on imaging conditions such as magnification and focal length. As already explained, the CCU 5039 also measures the distance to the object 9.
The arm control device 5045 is constituted by a processor such as a CPU and, by operating according to a predetermined program, controls the driving of the arm unit 5031 of the support arm device 5027 according to a predetermined control method.
The input device 5047 is an input interface to the endoscopic surgery system 5000. Through the input device 5047, the surgeon 5067 can input various kinds of information and instructions to the endoscopic surgery system 5000. For example, the surgeon 5067 inputs various kinds of information about the surgery, such as the patient's physical information and information about the surgical procedure. The surgeon 5067 can also input, for example, an instruction to drive the arm unit 5031, an instruction to change the imaging conditions of the endoscope 5001 (type of irradiation light, magnification, focal length, and so on), and an instruction to drive the energy treatment tool 5021. The type of the input device 5047 is not limited; it may be any of various known input devices, such as a mouse, a keyboard, a touch panel, a switch, a foot switch 5057, and/or a lever. When a touch panel is used as the input device 5047, for example, it may be provided on the display surface of the display device 5041.
Alternatively, the input device 5047 may be a device worn on part of the body of the surgeon 5067, such as a glasses-type wearable device or an HMD (Head Mounted Display). In that case, various inputs are made according to the gestures and line of sight of the surgeon 5067 detected by these devices. The input device 5047 may also include a camera capable of detecting the movement of the surgeon 5067, with various inputs made according to the gestures and line of sight detected from the images captured by that camera. Furthermore, the input device 5047 may include a microphone capable of picking up the voice of the surgeon 5067, so that various inputs are made by voice through the microphone. Configuring the input device 5047 so that various kinds of information can be input without contact makes it possible, in particular, for a user in the clean area (for example, the surgeon 5067) to operate equipment in the unclean area without touching it. It also allows the surgeon 5067 to operate equipment without letting go of the surgical tool in hand, which improves convenience for the surgeon 5067.
The treatment tool control device 5049 controls the driving of the energy treatment tool 5021 for cauterizing or incising tissue, sealing blood vessels, or the like. The pneumoperitoneum device 5051 sends gas into the body cavity of the patient 5071 through the pneumoperitoneum tube 5019 to inflate the cavity, in order to secure the field of view of the endoscope 5001 and the working space of the surgeon 5067. The recorder 5053 is a device capable of recording various kinds of information about the surgery. The printer 5055 is a device capable of printing various kinds of information about the surgery in various formats such as text, images, and graphs.
[Support arm device (detailed configuration)]

An example of the detailed configuration of the support arm device 5027 is now described. The support arm device 5027 has the base unit 5029 serving as its base and the arm unit 5031 extending from the base unit 5029. In the example shown in FIG. 11, the arm unit 5031 consists of the joint units 5033a, 5033b, and 5033c and the links 5035a and 5035b connected by the joint unit 5033b, but FIG. 11 shows the configuration of the arm unit 5031 in simplified form for clarity. Specifically, the shapes, number, and arrangement of the joint units 5033a to 5033c and the links 5035a and 5035b, the directions of the rotation axes of the joint units 5033a to 5033c, and so on can be set as appropriate so that the arm unit 5031 has the desired degrees of freedom. For example, the arm unit 5031 may suitably be configured to have six or more degrees of freedom. This allows the endoscope 5001 to be moved freely within the movable range of the arm unit 5031, so that the lens barrel 5003 of the endoscope 5001 can be inserted into the body cavity of the patient 5071 from the desired direction.
The joint units 5033a to 5033c are provided with actuators and are configured to be rotatable about predetermined rotation axes when those actuators are driven. The arm control device 5045 controls the driving of the actuators, thereby controlling the rotation angle of each of the joint units 5033a to 5033c and hence the driving of the arm unit 5031. In this way, control of the position and posture of the endoscope 5001 can be realized. The arm control device 5045 can control the driving of the arm unit 5031 by various known control methods such as force control or position control.
For example, when the surgeon 5067 makes an appropriate operation input through the input device 5047 (including the foot switch 5057), the arm control device 5045 controls the driving of the arm unit 5031 according to that input, and the position and posture of the endoscope 5001 may be controlled accordingly. The arm unit 5031 may also be operated in a so-called master-slave manner; in that case, the arm unit 5031 (slave) can be remotely operated by the surgeon 5067 through an input device 5047 (master console) installed at a location away from the operating room or inside it.
In general, in endoscopic surgery the endoscope 5001 has been supported by a doctor called a scopist. In contrast, in the embodiment of the present disclosure, using the support arm device 5027 makes it possible to fix the position of the endoscope 5001 more reliably without relying on human hands, so that an image of the surgical site can be obtained stably and the surgery can proceed smoothly.
Note that the arm control device 5045 does not necessarily have to be provided on the cart 5037, and it does not necessarily have to be a single device. For example, an arm control device 5045 may be provided at each of the joint units 5033a to 5033c of the arm unit 5031 of the support arm device 5027, and the driving of the arm unit 5031 may be realized by a plurality of arm control devices 5045 cooperating with one another.
By incorporating the distance measurement technique of the medical observation apparatus 100 described earlier into an endoscopic surgery system 5000 such as the one described above, it becomes possible to measure the distance to objects such as organs in the body cavity of the patient 5071 and the surgical tools 5017. The distance measurement results can be used, for example, for autonomous control (autonomous driving) of the arm unit 5031 that supports the endoscope 5001.
Constructing a three-dimensional map of the inside of the body cavity of the patient 5071 is important. By referring to the three-dimensional map, the environment around the tip of the endoscope 5001 (the surrounding environment) can be grasped, and autonomous control can be performed so as to avoid collision of the endoscope 5001 with organs in the abdominal cavity, excessive proximity to them, and the like. The three-dimensional map may be generated in advance by a CT scan or the like and held in the endoscopic surgery system 5000.
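As one illustration of how such a map could support proximity avoidance, the following sketch performs a naive nearest-neighbour distance check between the endoscope tip and the mapped points; the 10 mm threshold and the point-cloud representation are assumptions for illustration, not details from the disclosure:

    import numpy as np

    def too_close(tip_xyz, map_points_mm, threshold_mm: float = 10.0) -> bool:
        # Compare the tip position against an (N, 3) array of mapped points.
        # A real system would use a spatial index instead of a full scan.
        d = np.linalg.norm(map_points_mm - np.asarray(tip_xyz), axis=1)
        return bool(d.min() < threshold_mm)

    points = np.array([[0.0, 0.0, 30.0], [5.0, 5.0, 80.0]])
    print(too_close([0.0, 0.0, 25.0], points))  # True: 5 mm from the first point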
During surgery, the tip of the endoscope 5001 may reach an area not covered by the previously generated three-dimensional map. In that case, an object 9 not shown on the three-dimensional map, that is, an object 9 for which no distance information (three-dimensional distance measurement data or the like) is held, may appear around the tip of the endoscope 5001. When the CCU 5039 does not hold distance information for an object 9 present around the tip of the endoscope 5001, it may measure the distance to that object 9. The measurement technique is as described above, so the description is not repeated. Based on the distance measurement results, the CCU 5039 may construct three-dimensional distance measurement data for the object 9 and generate a three-dimensional map that includes (the three-dimensional distance measurement data of) the object 9. Generating this three-dimensional map may complement the previously generated map.
FIG. 12 is a flowchart showing an example of processing executed in the endoscopic surgery system. In step S21, it is determined whether scan information about the surroundings is held. For example, the CCU 5039 determines that it holds scan information about the surroundings when the object 9 present around the tip of the endoscope 5001 is included in the previously generated three-dimensional map. If scan information about the surroundings is held (step S21: Yes), step S21 is repeated; otherwise (step S21: No), the processing proceeds to step S22.
In step S22, the system transitions to the distance measurement mode, and the CCU 5039 starts distance measurement by the technique described so far. Specifically, the processing of steps S23 to S25 is executed; this processing is the same as that of steps S2 to S4 in FIG. 5 described earlier. The CCU 5039 changes the irradiation profile of the light from the light source device 5043 and measures the distance to the object 9 based on the detection results of the sensor 35 produced by the change in the irradiation profile.
In step S26, three-dimensional distance measurement data is constructed. The CCU 5039 constructs three-dimensional distance measurement data for the object 9 based on the distance measurement results of step S25, and complements the previously generated three-dimensional map so that it includes the constructed data.
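A compact sketch of one pass through this flow (steps S21 to S26) follows; the ccu and map3d objects are assumed interfaces for illustration, not an API from the disclosure:

    def complement_map_once(ccu, map3d) -> bool:
        # One pass of FIG. 12: the ccu reports its surroundings and runs the
        # event-based measurement; map3d answers coverage queries and merges
        # new distance data. Returns True when the map was complemented.
        area = ccu.current_surrounding_area()
        if map3d.contains(area):             # S21: scan information already held
            return False
        distances = ccu.measure_distances()  # S22-S25: change profile, read events
        map3d.merge(area, distances)         # S26: complement the prior map
        return True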
In this way, for example, three-dimensional distance measurement data can be constructed for an object 9 in an area that had not previously been grasped, and a three-dimensional map including that data can be obtained.
The CCU 5039 may also measure the distance to an object 9 present around the tip of the endoscope 5001 when the position of that object 9 changes. A change in the position of the object 9 may be detected by, for example, image recognition applied to a captured image (for example, an image based on an image signal generated by the sensor 33 or the sensor 34). For example, when an assistant lifts and moves the liver during surgery, the change in the position of the liver is detected. Based on the distance measurement results, the CCU 5039 may construct three-dimensional distance measurement data for the object 9 after the position change and generate a three-dimensional map including the object 9. Generating this three-dimensional map may update the previously generated map.
FIG. 13 is a flowchart showing an example of processing executed in the endoscopic surgery system. In step S31, it is determined whether the surrounding situation has changed. For example, the CCU 5039 determines that the surrounding situation has changed when it detects a change in the position of the object 9 included in the captured image. If the surrounding situation has changed (step S31: Yes), the processing proceeds to step S32; otherwise (step S31: No), step S31 is repeated.
The processing of steps S32 to S35 is the same as that of steps S22 to S25 in FIG. 12 described earlier. The CCU 5039 transitions to the distance measurement mode, changes the irradiation profile of the light from the light source device 5043, and measures the distance to the object 9 based on the detection results of the sensor 35 produced by the change in the irradiation profile.
In step S36, three-dimensional distance measurement data is constructed. The CCU 5039 constructs three-dimensional distance measurement data for the object 9 based on the distance measurement results of step S35 and updates the previously generated three-dimensional map so that it includes the constructed data, as in the sketch below.
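The change-triggered variant of FIG. 13 can be sketched in the same style; here the image-recognition step is replaced by a crude pixel-difference predicate over two NumPy image arrays, which is an illustrative stand-in only:

    def update_map_on_change(prev_frame, curr_frame, ccu, map3d) -> bool:
        # Treat the scene as changed when more than 5% of pixels differ (S31);
        # the fraction is an arbitrary example value, not a disclosed parameter.
        changed = (prev_frame != curr_frame).mean() > 0.05
        if changed:
            distances = ccu.measure_distances()  # S32-S35
            map3d.update(distances)              # S36: refresh the stored data
        return changed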
In this way, for example, three-dimensional distance measurement data can be constructed for the object 9 after its position has changed, and a three-dimensional map including that data can be obtained.
The above has described an example in which the distance measurement technique of the medical observation apparatus 100 is applied to the endoscopic surgery system 5000. The technique may, however, be applied to other surgical systems as well; one example is a surgical system including a surgical microscope.
4. Example of Hardware Configuration

FIG. 14 shows an example of the hardware configuration of a device. The CCU 5, the CCU 5039, and the like described so far are realized by, for example, the computer 1000 shown in FIG. 14.
The computer 1000 has a CPU 1100, a RAM 1200, a ROM (Read Only Memory) 1300, an HDD (Hard Disk Drive) 1400, a communication interface 1500, and an input/output interface 1600. The parts of the computer 1000 are connected by a bus 1050.
The CPU 1100 operates based on programs stored in the ROM 1300 or the HDD 1400 and controls each part. For example, the CPU 1100 loads programs stored in the ROM 1300 or the HDD 1400 into the RAM 1200 and executes the processing corresponding to each program.
The ROM 1300 stores a boot program such as a BIOS (Basic Input Output System) executed by the CPU 1100 when the computer 1000 starts up, programs that depend on the hardware of the computer 1000, and the like.
The HDD 1400 is a computer-readable recording medium that non-temporarily records programs executed by the CPU 1100, data used by those programs, and the like. Specifically, the HDD 1400 is a recording medium that records a program for the medical observation method and the information processing method according to the present disclosure, which is an example of the program data 1450.
The communication interface 1500 is an interface for connecting the computer 1000 to an external network 1550 (for example, the Internet). For example, the CPU 1100 receives data from other devices and transmits data generated by the CPU 1100 to other devices via the communication interface 1500.
The input/output interface 1600 is an interface for connecting an input/output device 1650 to the computer 1000. For example, the CPU 1100 receives data from an input device such as a keyboard or a mouse via the input/output interface 1600, and transmits data to an output device such as a display, a speaker, or a printer. The input/output interface 1600 may also function as a media interface that reads a program or the like recorded on a predetermined computer-readable recording medium (media). Such media are, for example, optical recording media such as a DVD (Digital Versatile Disc) or a PD (Phase change rewritable Disk), magneto-optical recording media such as an MO (Magneto-Optical disk), tape media, magnetic recording media, or semiconductor memories.
For example, when the computer 1000 functions as the CCU 5 or the CCU 5039 described so far, the CPU 1100 of the computer 1000 realizes the functions of the CCU 5 or the CCU 5039 by executing a program for estimating depth loaded onto the RAM 1200. The HDD 1400 may store the programs for executing the processing of the CCU 5 and the CCU 5039. The CPU 1100 reads the program data 1450 from the HDD 1400 and executes it, but as another example, these programs may be acquired from another device via the external network 1550.
The CCU 5 and the CCU 5039 may also be applied to a system consisting of a plurality of devices that presupposes connection to a network (or communication between devices), such as cloud computing.
Each of the components described above may be configured using general-purpose members or may be configured with hardware specialized for the function of each component. Such a configuration can be changed as appropriate according to the technical level at the time of implementation.
5. Examples of Effects

The technique described above is specified, for example, as follows. As described with reference to FIGS. 1 to 10 and elsewhere, the medical observation apparatus 100 includes the light source device 4, the sensor 35, and a control unit (CCU 5). The light source device 4 irradiates the object 9 with light. The sensor 35 detects that the amount of luminance change due to the light incident on each of a plurality of pixels 352 has exceeded a predetermined threshold. The control unit changes the irradiation profile of the light from the light source device 4 and measures the distance to the object 9 based on the detection results of the sensor 35 produced by the change in the irradiation profile.
With the medical observation apparatus 100 described above, the irradiation profile of the light from the light source device 4 is changed, and the distance to the object 9 is measured based on the detection results of the sensor 35 produced by that change. This can improve the accuracy of distance measurement in surgery such as endoscopic surgery.
As described with reference to FIGS. 6 and 7 and elsewhere, the change in the irradiation profile may include a change in the irradiation range. The irradiation range may include an annular shape, and the change in the irradiation range may include a change in the size of that annular shape. The distance to the object 9 can be measured by changing the irradiation profile in this way, for example.
As described with reference to FIG. 8 and elsewhere, the change in the irradiation profile may include a change in the combination of irradiations of the object 9 from a plurality of different positions of the light from the light source device 4. The change in the combination of irradiations may include a change in the lighting pattern. The distance to the object 9 can be measured by changing the irradiation profile in this way, for example.
As described with reference to FIG. 9 and elsewhere, the control unit may temporarily stop the irradiation when changing the irradiation profile. This makes the change in the irradiation of the object 9 more pronounced, so that detection by the sensor 35 can be triggered more reliably.
As described with reference to FIG. 10 and elsewhere, the change in the irradiation profile may include a change in the wavelength of the light. The distance to the object 9 can be measured by changing the irradiation profile in this way, for example.
The CCU 5 described with reference to FIGS. 1 to 10 and elsewhere can also be called an information processing device that performs processing related to distance measurement, and such an information processing device is also one embodiment. The information processing device changes the irradiation profile of the light from the light source device 4 that irradiates the object 9 with light, and measures the distance to the object 9 based on the detection results of the sensor 35 produced by the change in the irradiation profile. The sensor 35 detects that the amount of luminance change due to the light incident on each of a plurality of pixels has exceeded a predetermined threshold. Such an information processing device can also improve the accuracy of distance measurement in surgery, as described above.
The medical observation method described with reference to FIG. 5 and elsewhere is also one embodiment. In the medical observation method, an information processing device (CCU 5) changes the irradiation profile of the light from the light source device 4 that irradiates the object 9 with light, and measures the distance to the object 9 based on the detection results of the sensor 35 produced by the change in the irradiation profile (steps S2 to S4). The sensor 35 detects that the amount of luminance change due to the light incident on each of a plurality of pixels 352 has exceeded a predetermined threshold. Such a medical observation method can also improve the accuracy of distance measurement in surgery, as described above.
The endoscopic surgery system 5000 described with reference to FIGS. 11 to 13 and elsewhere is also one embodiment. The endoscopic surgery system 5000 includes the endoscope 5001 (corresponding to the endoscope 1) including the sensor 35 that detects that the amount of luminance change due to the light incident on each of a plurality of pixels 352 has exceeded a predetermined threshold, the arm unit 5031 that supports the endoscope 5001, the light source device 5043 (corresponding to the light source device 4) that irradiates the object 9 with light, and a control device (CCU 5039) that controls the endoscope 5001, the arm unit 5031, and the light source device 5043. The control device (CCU 5039) changes the irradiation profile of the light from the light source device 5043 and measures the distance to the object 9 based on the detection results of the sensor 35 produced by the change in the irradiation profile. Such an endoscopic surgery system 5000 can also improve the accuracy of distance measurement in surgery, as described above.
The control device (CCU 5039) may measure the distance when it does not hold distance information for an object 9 present around the tip of the endoscope 5001. The control device (CCU 5039) may also measure the distance when the position of an object 9 present around the tip of the endoscope 5001 changes. The control device (CCU 5039) may generate a three-dimensional map including the object 9 based on the distance measurement results. This makes it possible to grasp distance information for an object 9 that had not previously been grasped (for example, by constructing three-dimensional distance measurement data) and to obtain a three-dimensional map including it. The control device (CCU 5039) may control the driving of the arm unit 5031 based on the generated three-dimensional map including the object 9, enabling more appropriate control of the arm unit 5031 based on the latest map.
Note that the effects described in the present disclosure are merely examples and are not limiting; other effects may also be obtained.
Although the embodiments of the present disclosure have been described above, the technical scope of the present disclosure is not limited to the embodiments as they are, and various modifications are possible without departing from the gist of the present disclosure. Components of different embodiments and modifications may also be combined as appropriate.
 なお、本技術は以下のような構成も取ることができる。
(1)
 対象物に光を照射する光源装置と、
 複数の画素それぞれに入射した光による輝度変化量が所定の閾値を超えたことを検出するセンサと、
 前記光源装置による光の照射プロファイルを変化させ、前記光の照射プロファイルの変化により発生した前記センサの検出結果に基づいて、前記対象物までの距離を測定する制御部と、
 を具備する、
 医療用観察装置。
(2)
 前記照射プロファイルの変化は、照射範囲の変化を含む、
 (1)に記載の医療用観察装置。
(3)
 前記照射範囲は、環状形状を含み、
 前記照射範囲の変化は、前記環状形状の大きさの変化を含む、
 (2)に記載の医療用観察装置。
(4)
 前記照射プロファイルの変化は、前記光源装置の光の複数の異なる位置からの前記対象物への照射の組み合わせの変化を含む、
 (1)~(3)のいずれかに記載の医療用観察装置。
(5)
 前記照射の組み合わせの変化は、点灯パターンを含む、
 (4)に記載の医療用観察装置。
(6)
 前記制御部は、前記照射プロファイルを変化させる際に、前記照射を一時的に停止させる、
 (1)~(5)のいずれかに記載の医療用観察装置。
(7)
 前記照射プロファイルの変化は、前記光の波長の変化を含む、
 (1)~(6)のいずれかに記載の医療用観察装置。
(8)
 対象物に光を照射する光源装置による光の照射プロファイルを変化させ、前記光の照射プロファイルの変化により発生したセンサの検出結果に基づいて、前記対象物までの距離を測定する、
 情報処理装置であって、
 前記センサは、複数の画素それぞれに入射した光による輝度変化量が所定の閾値を超えたことを検出する、
 情報処理装置。
(9)
 情報処理装置が、対象物に光を照射する光源装置による光の照射プロファイルを変化させ、前記光の照射プロファイルの変化により発生したセンサの検出結果に基づいて、前記対象物までの距離を測定すること、
 を含み、
 前記センサは、複数の画素それぞれに入射した光による輝度変化量が所定の閾値を超えたことを検出する、
 医療用観察方法。
(10)
 複数の画素それぞれに入射した光による輝度変化量が所定の閾値を超えたことを検出するセンサを含む内視鏡と、
 前記内視鏡を支持するアーム部と、
 対象物に光を照射する光源装置と、
 前記内視鏡、前記アーム部及び前記光源装置を制御する制御装置と、
 を備え、
 前記制御装置は、前記光源装置による光の照射プロファイルを変化させ、前記光の照射プロファイルの変化により発生した前記センサの検出結果に基づいて、前記対象物までの距離を測定する、
 内視鏡手術システム。
(11)
 前記制御装置は、前記内視鏡の先端の周囲に存在する前記対象物の距離情報を保持していない場合に、前記距離を測定する、
 (10)に記載の内視鏡手術システム。
(12)
 前記制御装置は、前記内視鏡の先端の周囲に存在する前記対象物の位置が変化した場合に、前記距離を測定する、
 (10)又は(11)に記載の内視鏡手術システム。
(13)
 前記制御装置は、前記距離の測定結果に基づいて、前記対象物を含む3次元マップを生成する、
 (11)又は(12)に記載の内視鏡手術システム。
(14)
 前記制御装置は、生成した前記対象物を含む3次元マップに基づいて前記アーム部の駆動を制御する、(13)に記載の内視鏡手術システム。
(15)
 前記光源装置は、所定位置及び所定方向から前記対象物に光を照射する、
 (1)~(7)のいずれかに記載の医療用観察装置。
(16)
 前記光源装置による光は、所定位置及び所定方向から前記対象物に照射される、
 (8)に記載の情報処理装置。
(17)
 前記光源装置による光は、所定位置及び所定方向から前記対象物に照射される、
 (9)に記載の医療用観察方法。
(18)
 前記光源装置は、所定位置及び所定方向から前記対象物に光を照射する、
 (10)~(14)のいずれかに記載の内視鏡手術システム。
Note that the present technology can also adopt the following configurations.
(1)
A medical observation device comprising:
a light source device that irradiates an object with light;
a sensor that detects that an amount of luminance change caused by light incident on each of a plurality of pixels exceeds a predetermined threshold; and
a control unit that changes an irradiation profile of light from the light source device and measures a distance to the object based on a detection result of the sensor generated by the change in the irradiation profile.
(2)
The medical observation device according to (1), wherein the change in the irradiation profile includes a change in an irradiation range.
(3)
The medical observation device according to (2), wherein the irradiation range includes an annular shape, and the change in the irradiation range includes a change in a size of the annular shape.
(4)
The medical observation device according to any one of (1) to (3), wherein the change in the irradiation profile includes a change in a combination of irradiations of the object with light from a plurality of different positions of the light source device.
(5)
The medical observation device according to (4), wherein the change in the combination of irradiations includes a lighting pattern.
(6)
The medical observation device according to any one of (1) to (5), wherein the control unit temporarily stops the irradiation when changing the irradiation profile.
(7)
The medical observation device according to any one of (1) to (6), wherein the change in the irradiation profile includes a change in a wavelength of the light.
(8)
An information processing device that changes an irradiation profile of light from a light source device that irradiates an object with light, and measures a distance to the object based on a detection result of a sensor generated by the change in the irradiation profile, wherein the sensor detects that an amount of luminance change caused by light incident on each of a plurality of pixels exceeds a predetermined threshold.
(9)
A medical observation method including: changing, by an information processing device, an irradiation profile of light from a light source device that irradiates an object with light; and measuring a distance to the object based on a detection result of a sensor generated by the change in the irradiation profile, wherein the sensor detects that an amount of luminance change caused by light incident on each of a plurality of pixels exceeds a predetermined threshold.
(10)
An endoscopic surgery system comprising:
an endoscope including a sensor that detects that an amount of luminance change caused by light incident on each of a plurality of pixels exceeds a predetermined threshold;
an arm unit that supports the endoscope;
a light source device that irradiates an object with light; and
a control device that controls the endoscope, the arm unit, and the light source device,
wherein the control device changes an irradiation profile of light from the light source device and measures a distance to the object based on a detection result of the sensor generated by the change in the irradiation profile.
(11)
The endoscopic surgery system according to (10), wherein the control device measures the distance when it holds no distance information on the object present around a tip of the endoscope.
(12)
The endoscopic surgery system according to (10) or (11), wherein the control device measures the distance when a position of the object present around the tip of the endoscope changes.
(13)
The endoscopic surgery system according to (11) or (12), wherein the control device generates a three-dimensional map including the object based on a result of the distance measurement.
(14)
The endoscopic surgery system according to (13), wherein the control device controls driving of the arm unit based on the generated three-dimensional map including the object.
(15)
The medical observation device according to any one of (1) to (7), wherein the light source device irradiates the object with light from a predetermined position and in a predetermined direction.
(16)
The information processing device according to (8), wherein the light emitted from the light source device is applied to the object from a predetermined position and in a predetermined direction.
(17)
The medical observation method according to (9), wherein the light emitted from the light source device is applied to the object from a predetermined position and in a predetermined direction.
(18)
The endoscopic surgery system according to any one of (10) to (14), wherein the light source device irradiates the object with light from a predetermined position and in a predetermined direction.
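As a concrete reading of configurations (1), (4), and (6), the sketch below steps a hypothetical light source through a sequence of irradiation profiles, collects the luminance-change events each step produces, and triangulates a distance from every event. It is a minimal illustration under assumed geometry (an offset projection ray and a pinhole camera); the class names, baseline, and focal length are invented for the example and are not taken from this publication.

```python
# Minimal sketch only -- not the patent's implementation. The event model,
# the geometry, and all parameter values below are illustrative assumptions.
import math
from dataclasses import dataclass

@dataclass
class Event:
    u: float       # event pixel column, measured from the optical axis
    polarity: int  # +1 = brightness increased, -1 = decreased

BASELINE_MM = 5.0  # assumed lateral offset of the light source from the camera
FOCAL_PX = 800.0   # assumed focal length in pixels

def simulate_events(true_depth_mm: float, proj_angle_rad: float) -> list[Event]:
    """Stand-in for a luminance-change sensor: switching the light on yields
    one positive event where the projected ray meets the surface."""
    x_hit = BASELINE_MM - true_depth_mm * math.tan(proj_angle_rad)
    u = FOCAL_PX * x_hit / true_depth_mm  # pinhole projection of the lit spot
    return [Event(u=u, polarity=+1)]

def depth_from_event(ev: Event, proj_angle_rad: float) -> float:
    """Triangulate depth from one event pixel and the known projection ray:
    z = baseline / (tan(alpha_projector) + tan(beta_camera))."""
    return BASELINE_MM / (math.tan(proj_angle_rad) + ev.u / FOCAL_PX)

def measure(true_depth_mm: float) -> list[float]:
    depths = []
    # Step the irradiation profile (here: the beam direction). In the spirit of
    # configuration (6), irradiation would pause between profiles so that each
    # switch-on produces a clean burst of change events.
    for proj_angle_deg in (2.0, 4.0, 6.0):
        angle = math.radians(proj_angle_deg)
        for ev in simulate_events(true_depth_mm, angle):
            if ev.polarity > 0:
                depths.append(depth_from_event(ev, angle))
    return depths

if __name__ == "__main__":
    print(measure(true_depth_mm=50.0))  # recovers ~50.0 at every profile step
```

Run as a script, this recovers the simulated 50 mm depth at each profile step; the point it illustrates is that depth is derived from where and when change events fire as the profile switches, not from a full intensity image.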
1 endoscope
2 lens barrel
3 camera head
33 sensor (imaging sensor)
34 sensor (imaging sensor)
35 sensor (luminance change detection sensor)
4 light source device
41 light source
5 CCU (control unit, information processing device)
100 medical observation device
5000 endoscopic surgery system
5001 endoscope
5031 arm unit
5039 CCU (control unit, information processing device)
5043 light source device

Claims (14)

1. A medical observation device comprising:
   a light source device that irradiates an object with light;
   a sensor that detects that an amount of luminance change caused by light incident on each of a plurality of pixels exceeds a predetermined threshold; and
   a control unit that changes an irradiation profile of light from the light source device and measures a distance to the object based on a detection result of the sensor generated by the change in the irradiation profile.
2. The medical observation device according to claim 1, wherein the change in the irradiation profile includes a change in an irradiation range.
3. The medical observation device according to claim 2, wherein the irradiation range includes an annular shape, and the change in the irradiation range includes a change in a size of the annular shape.
4. The medical observation device according to claim 1, wherein the change in the irradiation profile includes a change in a combination of irradiations of the object with light from a plurality of different positions of the light source device.
5. The medical observation device according to claim 4, wherein the change in the combination of irradiations includes a lighting pattern.
6. The medical observation device according to claim 1, wherein the control unit temporarily stops the irradiation when changing the irradiation profile.
7. The medical observation device according to claim 1, wherein the change in the irradiation profile includes a change in a wavelength of the light.
8. An information processing device that changes an irradiation profile of light from a light source device that irradiates an object with light, and measures a distance to the object based on a detection result of a sensor generated by the change in the irradiation profile, wherein the sensor detects that an amount of luminance change caused by light incident on each of a plurality of pixels exceeds a predetermined threshold.
9. A medical observation method including: changing, by an information processing device, an irradiation profile of light from a light source device that irradiates an object with light; and measuring a distance to the object based on a detection result of a sensor generated by the change in the irradiation profile, wherein the sensor detects that an amount of luminance change caused by light incident on each of a plurality of pixels exceeds a predetermined threshold.
10. An endoscopic surgery system comprising:
    an endoscope including a sensor that detects that an amount of luminance change caused by light incident on each of a plurality of pixels exceeds a predetermined threshold;
    an arm unit that supports the endoscope;
    a light source device that irradiates an object with light; and
    a control device that controls the endoscope, the arm unit, and the light source device,
    wherein the control device changes an irradiation profile of light from the light source device and measures a distance to the object based on a detection result of the sensor generated by the change in the irradiation profile.
11. The endoscopic surgery system according to claim 10, wherein the control device measures the distance when it holds no distance information on the object present around a tip of the endoscope.
12. The endoscopic surgery system according to claim 10, wherein the control device measures the distance when a position of the object present around the tip of the endoscope changes.
13. The endoscopic surgery system according to claim 11, wherein the control device generates a three-dimensional map including the object based on a result of the distance measurement.
14. The endoscopic surgery system according to claim 13, wherein the control device controls driving of the arm unit based on the generated three-dimensional map including the object.
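Claims 10 to 14 together describe when the endoscopic surgery system re-measures and how the result drives the arm. The following is a hedged sketch of that decision logic; the class, the method names, and the motion threshold are invented for illustration, and the claims prescribe none of them.

```python
# Illustrative sketch of the gating in claims 11-14; all identifiers and the
# threshold value are hypothetical, not from the publication.
from typing import Optional

Point3D = tuple[float, float, float]

class ControlDevice:
    MOTION_THRESHOLD_MM = 2.0  # assumed trigger for "the object has moved"

    def __init__(self) -> None:
        self.point_cloud: Optional[list[Point3D]] = None  # the 3-D map

    def needs_measurement(self, observed_shift_mm: float) -> bool:
        # Claim 11: measure when no distance information is held yet.
        if self.point_cloud is None:
            return True
        # Claim 12: measure when the object around the endoscope tip has moved.
        return observed_shift_mm > self.MOTION_THRESHOLD_MM

    def update_map(self, measured_points: list[Point3D]) -> None:
        # Claim 13: rebuild the three-dimensional map from measured distances.
        self.point_cloud = measured_points

    def plan_arm_motion(self, target: Point3D) -> str:
        # Claim 14: drive the arm only against a valid map of the scene.
        if self.point_cloud is None:
            return "hold position: no 3-D map available"
        return f"move toward {target}, avoiding {len(self.point_cloud)} mapped points"
```

One natural reading of this gating is economy: ranging, and the illumination changes it requires, runs only when the map is missing or stale, so the observation image is not continually disturbed.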
PCT/JP2022/001496 2021-03-30 2022-01-18 Medical observation device, information processing device, medical observation method, and endoscopic surgery system WO2022209156A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2021-057040 2021-03-30
JP2021057040 2021-03-30

Publications (1)

Publication Number Publication Date
WO2022209156A1

Family

ID=83458582

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/001496 WO2022209156A1 (en) 2021-03-30 2022-01-18 Medical observation device, information processing device, medical observation method, and endoscopic surgery system

Country Status (1)

Country Link
WO (1) WO2022209156A1 (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2014067193A (en) * 2012-09-25 2014-04-17 Canon Inc Image processing apparatus and image processing method
JP2017517298A (en) * 2014-04-25 2017-06-29 コーニンクレッカ フィリップス エヌ ヴェKoninklijke Philips N.V. Catheter with two optical sensors
JP2017217215A (en) * 2016-06-07 2017-12-14 公立大学法人広島市立大学 Three-dimensional shape measuring apparatus and three-dimensional shape measuring method
US20170366773A1 (en) * 2016-06-21 2017-12-21 Siemens Aktiengesellschaft Projection in endoscopic medical imaging
JP2018108274A (en) * 2017-01-04 2018-07-12 ソニー株式会社 Endoscope apparatus and image generation method for endoscope apparatus
WO2018171851A1 (en) * 2017-03-20 2018-09-27 3Dintegrated Aps A 3d reconstruction system
WO2019137946A1 (en) * 2018-01-10 2019-07-18 Universite Libre De Bruxelles Endoscopic non-contact measurement device
JP2021032762A (en) * 2019-08-27 2021-03-01 ソニーセミコンダクタソリューションズ株式会社 Range-finding system and electronic instrument

Similar Documents

Publication Publication Date Title
JP5774596B2 (en) Visual tracking / annotation of clinically important anatomical landmarks for surgical intervention
WO2020045015A1 (en) Medical system, information processing device and information processing method
WO2020095987A2 (en) Medical observation system, signal processing apparatus, and medical observation method
JP2018075218A (en) Medical support arm and medical system
WO2019239942A1 (en) Surgical observation device, surgical observation method, surgical light source device, and light irradiation method for surgery
JP2021003531A (en) Surgery support system, control device, and control method
WO2021049438A1 (en) Medical support arm and medical system
WO2020262262A1 (en) Medical observation system, control device, and control method
WO2021049220A1 (en) Medical support arm and medical system
US11699215B2 (en) Imaging device, method and program for producing images of a scene having an extended depth of field with good contrast
WO2022209156A1 (en) Medical observation device, information processing device, medical observation method, and endoscopic surgery system
WO2019181242A1 (en) Endoscope and arm system
WO2018180068A1 (en) Medical imaging device and endoscope
JP4436495B2 (en) Surgical observation system
WO2021140923A1 (en) Medical image generation device, medical image generation method, and medical image generation program
US20220022728A1 (en) Medical system, information processing device, and information processing method
WO2020045014A1 (en) Medical system, information processing device and information processing method
WO2022201933A1 (en) Intravital observation system, observation system, intravital observation method, and intravital observation device
WO2022172733A1 (en) Observation device for medical treatment, observation device, observation method and adapter
WO2022239339A1 (en) Medical information processing device, medical observation system, and medical information processing method
WO2022249572A1 (en) Image processing device, image processing method, and recording medium
US20240016364A1 (en) Surgery system, surgery control device, control method, and program
WO2023017651A1 (en) Medical observation system, information processing device, and information processing method
JP2001299695A (en) Endoscopic device and microscope for operation
EP4312711A1 (en) An image capture device, an endoscope system, an image capture method and a computer program product

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22779388

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 22779388

Country of ref document: EP

Kind code of ref document: A1