WO2020084917A1 - Medical system and information processing method - Google Patents

Medical system and information processing method

Info

Publication number
WO2020084917A1
WO2020084917A1 (application PCT/JP2019/034884)
Authority
WO
WIPO (PCT)
Prior art keywords
image
unit
light
speckle
subject
Prior art date
Application number
PCT/JP2019/034884
Other languages
French (fr)
Japanese (ja)
Inventor
哲朗 桑山
健太郎 深沢
Original Assignee
ソニー株式会社 (Sony Corporation)
Priority date
Filing date
Publication date
Application filed by ソニー株式会社 (Sony Corporation)
Publication of WO2020084917A1 publication Critical patent/WO2020084917A1/en

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 1/00: Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B 5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B 5/02: Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for; Heart catheters for measuring blood pressure
    • A61B 5/026: Measuring blood flow
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01N: INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N 21/00: Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N 21/17: Systems in which incident light is modified in accordance with the properties of the material investigated
    • G01N 21/25: Colour; Spectral properties, i.e. comparison of effect of material on the light at two or more different wavelengths or wavelength bands
    • G01N 21/27: Colour; Spectral properties, using photo-electric detection; circuits for computing concentration
    • G01N 21/31: Investigating relative effect of material at wavelengths characteristic of specific elements or molecules, e.g. atomic absorption spectrometry
    • G01N 21/35: Investigating relative effect of material at characteristic wavelengths using infrared light
    • G01N 21/3563: Investigating relative effect of material using infrared light, for analysing solids; Preparation of samples therefor
    • G01N 21/62: Systems in which the material investigated is excited whereby it emits light or causes a change in wavelength of the incident light
    • G01N 21/63: Systems in which the material investigated is optically excited
    • G01N 21/64: Fluorescence; Phosphorescence
    • G02: OPTICS
    • G02B: OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 21/00: Microscopes
    • G02B 21/06: Means for illuminating specimens
    • G02B 23/00: Telescopes, e.g. binoculars; Periscopes; Instruments for viewing the inside of hollow bodies; Viewfinders; Optical aiming or sighting devices
    • G02B 23/24: Instruments or systems for viewing the inside of hollow bodies, e.g. fibrescopes
    • G02B 23/26: Instruments or systems for viewing the inside of hollow bodies using light guides

Definitions

  • The present disclosure relates to a medical system and an information processing method.
  • Speckle is a phenomenon in which a mottled pattern is produced when emitted coherent light is reflected and made to interfere by minute irregularities on the surface of a subject (object). Based on this phenomenon, for example, a blood flow part and a non-blood flow part in a living body serving as the subject can be distinguished.
  • Specifically, in the blood flow part the speckle contrast value decreases because the red blood cells reflecting the coherent light are moving, whereas in the non-blood flow part everything is stationary and the speckle contrast value is high. Therefore, the blood flow part and the non-blood flow part can be distinguished based on a speckle contrast image generated from the speckle contrast value of each pixel.
  • However, in the related art, the size of the calculation area for statistically processing speckle luminance values (for example, 5 × 5 pixels) is constant; when the calculation area includes both a blood flow part and a non-blood flow part, it cannot be said to be optimal in terms of the accuracy of the calculated value (for example, the speckle contrast value).
  • Given that the predetermined image is generated based on an image formed by predetermined light from the subject and on the calculation area, there is room for improvement in how the calculation area is set.
  • The present disclosure therefore proposes a medical system and an information processing method capable of generating a highly accurate predetermined image based on an image formed by predetermined light from a subject and a more appropriate calculation area.
  • To solve the above problem, a medical system according to the present disclosure includes: first irradiation means for irradiating a subject with incoherent visible light; second irradiation means for irradiating the subject with coherent near-infrared light; first imaging means for imaging the reflected light of the visible light from the subject; second imaging means for imaging the reflected light of the near-infrared light from the subject; acquisition means for acquiring a visible light image from the first imaging means and a speckle image from the second imaging means; setting means for setting, based on the visible light image, a calculation area for statistically processing speckle luminance values; and generating means for generating a predetermined image based on the speckle image and the calculation area.
  • FIG. 6 is an explanatory diagram of a calculation example using a calculation area according to the first embodiment of the present disclosure. FIG. 7 is a flowchart showing SC image generation processing by the information processing apparatus according to the first embodiment of the present disclosure.
  • FIG. 20 is a diagram showing an example of a schematic configuration of a microscopic surgery system according to Application Example 2 of the present disclosure, and the subsequent figure shows the state of surgery using the microscopic surgery system shown in FIG. 20.
  • In brain surgery, blood flow evaluation is often important. For example, in bypass surgery in neurosurgery, revascularization (blood flow) is confirmed after the blood vessels are connected. In aneurysm clipping, it is confirmed after clipping that blood no longer flows into the aneurysm. For these applications, blood flow has so far been evaluated with an ultrasonic Doppler blood flow meter or by angiography using an ICG (Indocyanine Green) agent.
  • However, since an ultrasonic Doppler blood flow meter measures blood flow only at the single point where the probe makes contact, the distribution of blood flow trends across the entire surgical field remains unknown. There is also the risk entailed by having to bring the probe into contact with the cerebral blood vessels for the evaluation.
  • Angiography using an ICG agent exploits the fact that ICG binds to plasma proteins in the living body and emits fluorescence under near-infrared excitation light, and it is an invasive form of observation because the agent must be administered. Furthermore, in terms of blood flow evaluation, the flow has to be judged from the change immediately after administration of the ICG agent, so there are timing restrictions on its use.
  • Meanwhile, Japanese Patent Laid-Open No. 2017-170064 discloses an optical device for perfusion evaluation based on speckle imaging technology.
  • In speckle imaging, the principle of detecting motion (blood flow) using speckle generated by a laser is used.
  • As an index for motion detection, speckle contrast is used, for example.
  • Speckle contrast is the value given by (standard deviation) / (mean) of the light intensity distribution.
  • When there is no motion, the standard deviation of the intensity distribution is large and the speckle contrast (degree of glare) is high.
  • When there is motion, on the other hand, the speckle pattern changes with the motion. If the speckle pattern is captured by an observation system with a finite exposure time, the pattern changes during the exposure, so the captured speckle patterns are averaged and the speckle contrast (degree of glare) becomes lower.
  • The larger the motion, the stronger this averaging, and thus the lower the speckle contrast.
  • The speckle contrast method statistically evaluates the light intensities of a pixel of interest and multiple surrounding pixels.
  • To obtain a stable statistical value and reduce the variation in the analyzed pixel values, it is desirable to enlarge the calculation area (analysis area) and thereby increase the number of pixels used.
  • However, if the calculation area is enlarged, resolution decreases because information from peripheral pixels far from the pixel of interest is also included. The calculation area could be made smaller to avoid this loss of resolution, but a small calculation area contains few pixels, so the statistically processed values vary widely and the image becomes noisy.
  • Conventionally, the size of the calculation area for computing the speckle contrast value of each pixel of interest has been set to a constant value (for example, 5 × 5 pixels), and such a fixed calculation area is not optimal in terms of the accuracy of the speckle contrast value when it includes both a blood flow part and a non-blood flow part.
  • FIG. 1 is a diagram showing a configuration example of a medical system 1 according to the first embodiment of the present disclosure.
  • As shown in FIG. 1, the medical system 1 according to the first embodiment includes a structure observation light source 2 (first irradiation means), a narrow band light source 3 (second irradiation means), a wavelength separation device 4, a color camera 5 (first imaging means), an IR camera 6 (second imaging means), and an information processing device 7.
  • The structure observation light source 2 irradiates the subject with incoherent light (for example, incoherent visible light; hereinafter also simply referred to as "visible light").
  • The narrow band light source 3 irradiates the subject with coherent light (for example, coherent near-infrared light; hereinafter also simply referred to as "near-infrared light").
  • Coherent light is light in which the phase relationship of the light waves at any two points within the light flux is constant in time, and which exhibits complete coherence even after the flux is split by an arbitrary method, given a large optical path difference, and superposed again.
  • Incoherent light is light that does not have the above properties of coherent light.
  • The wavelength of the coherent light output from the narrow band light source 3 according to the present disclosure is preferably, for example, about 800 to 900 nm.
  • For example, if the wavelength is 830 nm, ICG observation and speckle observation can share an optical system: since ICG observation uses near-infrared light with a wavelength of 830 nm, using near-infrared light of the same wavelength for speckle observation allows speckle observation without modifying the optical system of an ICG observation microscope.
  • However, the wavelength of the coherent light emitted by the narrow band light source 3 is not limited to this, and various other wavelengths may be used.
  • For example, if coherent light with a wavelength of 400 to 450 nm (ultraviolet to blue) or 700 to 900 nm (infrared) is used, wavelength-separated observation combined with visible light illumination is easy.
  • If visible coherent light with a wavelength of 450 to 700 nm is used, lasers of the kind used in projectors and the like are readily available.
  • The type of the narrow band light source 3 that emits coherent light is not particularly limited as long as the effect of the present technology is not impaired.
  • Examples of the narrow band light source 3 emitting laser light include an argon ion (Ar) laser, a helium-neon (He-Ne) laser, a dye laser, a krypton (Kr) laser, a semiconductor laser, and a solid-state laser combining a semiconductor laser with a wavelength conversion optical element; these can be used alone or in combination.
  • In the first embodiment, the visible light from the structure observation light source 2 and the near-infrared light from the narrow band light source 3 are emitted onto the subject simultaneously.
  • The type of the structure observation light source 2 is not particularly limited as long as the effect of the present technology is not impaired.
  • One example is a light emitting diode.
  • Other examples include a xenon lamp, a metal halide lamp, and a high pressure mercury lamp.
  • The subject may be any of various things; for example, a subject containing fluid is suitable. Owing to the nature of speckle, speckle is less likely to be generated from a fluid part. Therefore, when the medical system 1 according to the present disclosure is used to image a subject containing fluid, the boundary between the fluid part and the non-fluid part, the flow velocity of the fluid part, and so on can be determined.
  • More specifically, the subject may be a living body whose fluid is blood.
  • For example, by using the medical system 1 according to the present disclosure in microscopic surgery or endoscopic surgery, it becomes possible to operate while confirming the positions of blood vessels. Therefore, safer and more precise surgery can be performed, which can contribute to the further development of medical technology.
  • The color camera 5 images the reflected light (scattered light) of the visible light from the subject.
  • The color camera 5 is, for example, an RGB (Red Green Blue) imager for visible light observation.
  • The IR camera 6 images the reflected light (scattered light) of the near-infrared light from the subject.
  • The IR camera 6 is, for example, an IR (Infrared) imager for speckle observation.
  • The wavelength separation device 4 is, for example, a dichroic mirror.
  • The wavelength separation device 4 separates the received light into near-infrared light (reflected light) and visible light (reflected light).
  • The color camera 5 captures the visible light image (first image) formed from the visible light separated by the wavelength separation device 4.
  • The IR camera 6 captures the speckle image (second image) formed from the near-infrared light separated by the wavelength separation device 4.
  • FIG. 2 is a diagram illustrating a configuration example of the information processing device 7 according to the first embodiment of the present disclosure.
  • The information processing device 7 is an image processing device and includes, as main components, a processing unit 71, a storage unit 72, an input unit 73, and a display unit 74.
  • SC means speckle contrast (value).
  • The processing unit 71 is realized by, for example, a CPU (Central Processing Unit) and includes, as main components, an acquisition unit 711, a setting unit 712, a generation unit 713, and a display control unit 714.
  • The acquisition unit 711 acquires the visible light image from the color camera 5 and the speckle image from the IR camera 6. It is assumed here that the visible light image and the speckle image are aligned with respect to the position of the subject within the image. That is, if the two cameras have the same angle of view, the images may be used as they are; if not, at least one of the images is corrected so that the positions of the subject in the two images coincide.
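  • If the angles of view differ, the correction mentioned above could, for instance, be a fixed projective warp. Below is a minimal sketch, assuming a homography estimated once from calibration; the function name, the matrix `H_ir_to_rgb`, and the use of OpenCV are illustrative assumptions, not part of the patent.

```python
import cv2
import numpy as np

def align_speckle_to_visible(speckle_img: np.ndarray,
                             H_ir_to_rgb: np.ndarray,
                             out_shape: tuple) -> np.ndarray:
    """Resample the IR speckle image onto the visible camera's pixel grid.

    H_ir_to_rgb is a 3x3 homography assumed to come from a one-time
    calibration (e.g. imaging a checkerboard target with both cameras).
    """
    h, w = out_shape
    # warpPerspective expects the output size as (width, height)
    return cv2.warpPerspective(speckle_img, H_ir_to_rgb, (w, h))
```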
  • The setting unit 712 sets, based on the visible light image, the calculation area for statistically processing speckle luminance values. For example, the setting unit 712 sets the calculation area used to compute the speckle contrast value based on the visible light image. More specifically, the setting unit 712 calculates, from the visible light image, a degree of association between the pixel of interest and each of a plurality of peripheral pixels based on at least one of their distance, difference in luminance, and difference in color, and, based on the degrees of association, sets the calculation area for the pixel of interest within at least a part of the plurality of peripheral pixels (details are described later).
  • The generation unit 713 calculates the speckle contrast value for the pixel of interest based on the speckle image and the calculation area set by the setting unit 712, and generates a speckle contrast image (SC image; the predetermined image) (details are described later).
  • Speckle contrast value of the i-th pixel = (standard deviation of the intensities of the i-th pixel and its surrounding pixels) / (mean of the intensities of the i-th pixel and its surrounding pixels) ... Equation (1)
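  • To make Equation (1) concrete, the following is a minimal sketch (not the patent's implementation) that computes an SC image over a fixed square window with NumPy/SciPy; the window size of 5 mirrors the 5 × 5 example mentioned earlier.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def speckle_contrast_image(speckle: np.ndarray, size: int = 5) -> np.ndarray:
    """Per-pixel SC = (local std) / (local mean), cf. Equation (1)."""
    img = speckle.astype(np.float64)
    mean = uniform_filter(img, size=size)            # local mean
    mean_sq = uniform_filter(img * img, size=size)   # local mean of squares
    var = np.maximum(mean_sq - mean * mean, 0.0)     # clamp tiny negatives
    return np.sqrt(var) / np.maximum(mean, 1e-12)    # avoid division by zero
```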
  • FIG. 3 is a diagram showing an example of an SC image of a pseudo blood vessel. As shown in the SC image example of FIG. 3, many speckles are observed in the non-blood flow portion, and almost no speckles are observed in the blood flow portion.
  • The generation unit 713 identifies the fluid part (for example, the blood flow part) and the non-fluid part (for example, the non-blood flow part) based on the SC image. More specifically, for example, by determining from the SC image whether the speckle contrast value (SC value) of each pixel is greater than or equal to a predetermined SC threshold, the generation unit 713 can distinguish the blood flow part from the non-blood flow part and recognize the degree of blood flow in the blood flow part.
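  • As an illustration of such threshold-based identification, the sketch below labels pixels by comparing SC values with a threshold; the value 0.3 is an arbitrary placeholder, not a threshold given in the patent.

```python
import numpy as np

def classify_blood_flow(sc_image: np.ndarray, sc_threshold: float = 0.3):
    """Low SC -> moving (blood flow); high SC -> stationary (non-blood flow)."""
    blood_flow = sc_image < sc_threshold   # motion averages the speckle out
    return blood_flow, ~blood_flow
```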
  • The display control unit 714 causes the display unit 74 to display the SC image generated by the generation unit 713. In doing so, it is preferable to display the SC image so that the blood flow part and the non-blood flow part can be distinguished.
  • The display control unit 714 can also superimpose the SC image on the visible light image and display it on the display unit 74.
  • The storage unit 72 stores the visible light image and speckle image acquired by the acquisition unit 711, the calculation results from each unit of the processing unit 71, and various information such as various thresholds.
  • Note that a storage device external to the medical system 1 may be used instead of the storage unit 72.
  • The input unit 73 is a means by which the user inputs information, for example a keyboard and a mouse.
  • Under the control of the display control unit 714, the display unit 74 displays the visible light image and speckle image acquired by the acquisition unit 711, the calculation results from each unit of the processing unit 71, and various information such as various thresholds.
  • Note that a display device external to the medical system 1 may be used instead of the display unit 74.
  • FIG. 4 is a diagram showing an example of an SC image having a processing size of 3 ⁇ 3 (pixels) and an example of an SC image having a processing size of 11 ⁇ 11 (pixels). Note that both SC image examples are calculated with a constant calculation area.
  • FIG. 5 is a diagram showing an example of a calculation area in a comparative example (prior art).
  • In the comparative example, the calculation area (for example, 3 × 3 pixels) is set without regard to the boundary between the region RM with motion (flow) and the region RS without motion (flow) (for example, calculation areas R1 and R2). Therefore, when the calculation area includes both the moving region RM and the stationary region RS, the accuracy of the speckle contrast value decreases.
  • FIG. 6 is an explanatory diagram of a calculation example using the calculation area according to the first embodiment of the present disclosure.
  • In the first embodiment, the setting unit 712 calculates, from the visible light image, a degree of association between the pixel of interest (for example, pixel R12) and each of a plurality of peripheral pixels based on at least one of their distance, difference in luminance, and difference in color, and, based on the degrees of association, sets a calculation area (for example, region R11) for the pixel of interest within at least a part of the peripheral pixels.
  • The degree of association is determined for each peripheral pixel. For example, the shorter the distance to the pixel of interest, the higher the degree of association; likewise, the smaller the difference in luminance and the smaller the difference in color, the higher the degree of association.
  • For the difference in color, the total difference over the RGB channels may be used, or the difference in one or two of the RGB channels may be used.
  • For example, since blood is red, only the difference in the R channel of RGB may be used.
  • The setting unit 712 then, for example, sets as the calculation area only those peripheral pixels whose degree of association (a value from 0 to 1) is 0.5 or more.
  • The generation unit 713 calculates the speckle contrast value for the pixel of interest (for example, pixel R12) based on the speckle image and the calculation area (for example, region R11) set by the setting unit 712, and generates the SC image. In doing so, the generation unit 713 may, for example, weight each peripheral pixel by its degree of association when calculating the speckle contrast value (see the sketch below).
  • The relationship between the pixel R14 and the region R13 is the same as that between the pixel R12 and the region R11.
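  • The sketch below illustrates one plausible reading of this scheme: degrees of association derived from distance, luminance difference, and color difference in the visible light image, the 0.5 cutoff mentioned above, and the association values reused as statistical weights. The weighting coefficients, the normalizations, and the assumption of 8-bit RGB input are illustrative, not taken from the patent.

```python
import numpy as np

def association_degrees(visible: np.ndarray, y: int, x: int, radius: int = 5,
                        w_dist: float = 0.3, w_lum: float = 0.4,
                        w_col: float = 0.3):
    """Degree of association (0..1) between the pixel of interest (y, x) and
    each pixel of its neighborhood, from an 8-bit RGB visible image."""
    h, w, _ = visible.shape
    y0, y1 = max(0, y - radius), min(h, y + radius + 1)
    x0, x1 = max(0, x - radius), min(w, x + radius + 1)
    patch = visible[y0:y1, x0:x1].astype(np.float64)
    ref = visible[y, x].astype(np.float64)

    yy, xx = np.mgrid[y0:y1, x0:x1]
    dist = np.hypot(yy - y, xx - x) / np.hypot(radius, radius)  # 0..1
    lum = np.abs(patch.mean(axis=2) - ref.mean()) / 255.0       # 0..1
    col = np.abs(patch - ref).sum(axis=2) / (3 * 255.0)         # 0..1

    assoc = 1.0 - (w_dist * dist + w_lum * lum + w_col * col)
    return np.clip(assoc, 0.0, 1.0), (y0, y1, x0, x1)

def weighted_sc(speckle: np.ndarray, visible: np.ndarray, y: int, x: int,
                radius: int = 5, threshold: float = 0.5) -> float:
    """SC for one pixel of interest, using only neighbors whose degree of
    association is >= threshold and weighting them by that degree."""
    assoc, (y0, y1, x0, x1) = association_degrees(visible, y, x, radius)
    w = np.where(assoc >= threshold, assoc, 0.0)  # the center pixel has w = 1
    vals = speckle[y0:y1, x0:x1].astype(np.float64)
    wsum = w.sum()
    mean = (w * vals).sum() / wsum
    var = (w * (vals - mean) ** 2).sum() / wsum
    return float(np.sqrt(var) / max(mean, 1e-12))
```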
  • FIG. 7 is a flowchart showing SC image generation processing by the information processing device 7 according to the first embodiment of the present disclosure.
  • In step S1, the acquisition unit 711 acquires a visible light image from the color camera 5.
  • In step S2, the acquisition unit 711 acquires a speckle image from the IR camera 6.
  • In step S3, the setting unit 712 calculates the degree of association between the pixel of interest and each of a plurality of peripheral pixels based on the visible light image.
  • In step S4, the setting unit 712 sets the calculation area for the pixel of interest within at least a part of the plurality of peripheral pixels based on the degrees of association calculated in step S3.
  • In step S5, the generation unit 713 calculates the speckle contrast value for each pixel of interest based on the speckle image and the calculation area set in step S4, and generates the SC image.
  • In step S6, the display control unit 714 causes the display unit 74 to display the SC image generated in step S5. After step S6, the process ends.
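  • Tying steps S1 to S6 together, a hypothetical end-to-end sketch is shown below. It reuses the weighted_sc helper sketched earlier; steps S1, S2, and S6 involve camera and display hardware and are only indicated in comments, and the per-pixel loop is written for clarity rather than speed.

```python
import numpy as np

def generate_sc_image(visible: np.ndarray, speckle: np.ndarray,
                      radius: int = 5, threshold: float = 0.5) -> np.ndarray:
    """Steps S3 to S5: adaptive calculation areas, then per-pixel SC values."""
    h, w = speckle.shape
    sc = np.zeros((h, w), dtype=np.float64)
    for y in range(h):
        for x in range(w):
            # weighted_sc performs S3 (association degrees), S4 (area
            # selection by threshold), and S5 (SC calculation) per pixel.
            sc[y, x] = weighted_sc(speckle, visible, y, x, radius, threshold)
    return sc

# S1/S2: `visible` and `speckle` would come from the color camera 5 and the
# IR camera 6; S6: the resulting SC image would be handed to the display unit.
```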
  • As described above, according to the first embodiment, a highly accurate predetermined image can be generated based on an image formed by predetermined light from the subject (a speckle image) and a more appropriate calculation area.
  • Specifically, when the SC value is calculated for a pixel of interest (for example, pixel R12) belonging to the stationary region RS, only the peripheral pixels within the stationary region RS (the pixels in region R11) can be used. Likewise, when the SC value is calculated for a pixel of interest (pixel R14) belonging to the moving region RM, only the peripheral pixels within the moving region RM (the pixels in region R13) can be used. The accuracy of the SC value can therefore be improved.
  • FIG. 8 is a diagram showing a schematic image example in the first embodiment of the present disclosure.
  • FIG. 8A is a schematic diagram of a visible light image.
  • FIG. 8B is a schematic diagram of the speckle image.
  • FIG. 8C is a schematic diagram of the processed image created by the processing of this embodiment.
  • FIG. 8D is a schematic diagram of a comparative image created by conventional processing in which the size of the calculation area is constant.
  • The blood vessel BT1 is a thick blood vessel with blood flow.
  • The blood vessel BT2 is a thin blood vessel with blood flow.
  • The blood vessel BT3 is a thin blood vessel with no blood flow.
  • The processed image of FIG. 8C and the comparative image of FIG. 8D are alike in that the blood vessel BT3, which has no blood flow, does not appear in either.
  • In the processed image of FIG. 8C, however, both blood vessels BT1 and BT2 are displayed clearly compared with the comparative image of FIG. 8D. That is, in the comparative image of FIG. 8D, the edges of the blood flow part of the blood vessel BT1 are blurred, and the blood vessel BT2 is blurred and hard to see (or invisible).
  • Clipping an aneurysm prevents blood from flowing into it by pinching the aneurysm with a clip, thereby preventing future rupture. A thin blood vessel called a perforator may run near the aneurysm.
  • A perforator is thin, but it is a very important blood vessel that supplies nutrients to part of the brain. If a clip stops its blood flow, nutrients may no longer reach that part of the brain, which can result in dysfunction. Displaying the blood flow in thin blood vessels in real time so that accidental clipping can be avoided is therefore of great utility.
  • In addition, application to cerebral artery bypass surgery is also possible.
  • In cerebral artery bypass surgery, an artery with poor cerebral blood flow is connected to another artery to increase the flow of arterial blood. In this operation, blood flow is increased by opening a bypass of the cerebral artery; if the blood flow in thin blood vessels can be seen, the increase in blood flow in the capillaries that feed the brain tissue can be confirmed, and thus the effect of the surgery can be confirmed.
  • Conversely, if a thrombus develops in a thin blood vessel and obstructs the blood flow, serious sequelae can result.
  • FIG. 9 is a diagram showing a configuration example of the medical system 1 according to the second embodiment of the present disclosure.
  • The medical system 1 according to the second embodiment differs from the medical system 1 according to the first embodiment (FIG. 1) in that a control device 8 and a camera 9 are provided in place of the wavelength separation device 4, the color camera 5, and the IR camera 6.
  • The control device 8 switches between irradiation of visible light from the structure observation light source 2 and irradiation of near-infrared light from the narrow band light source 3 at predetermined time intervals, in accordance with instructions from the camera 9.
  • As an alternative to switching the illumination light source, there is, for example, a method of switching a filter on the observation camera.
  • The camera 9 is an imager combining the functions of an RGB imager for visible light observation and an IR imager for speckle observation. The camera 9 also outputs to the control device 8 a control signal for switching between irradiation of visible light by the structure observation light source 2 and irradiation of near-infrared light by the narrow band light source 3.
  • The information processing device 7 is the same as in the first embodiment.
  • In the second embodiment, the single camera 9 captures visible light images and speckle images alternately in time, and the information processing device 7 can use them to perform the same processing as in the first embodiment.
  • This simplifies the device configuration, and because visible light images and speckle images with the same angle of view are easily acquired, the processing is simplified as well.
  • FIG. 10 is a diagram showing a configuration example of the medical system 1 according to the third embodiment of the present disclosure.
  • The medical system 1 according to the third embodiment differs from the medical system 1 according to the first embodiment (FIG. 1) in that a fluorescence excitation light source 10 and a fluorescence observation camera 11 (third imaging means) are provided in place of the structure observation light source 2 and the color camera 5, respectively.
  • The fluorescence excitation light source 10 irradiates the subject with excitation light for exciting fluorescence.
  • In the third embodiment, a fluorescent agent such as an ICG agent or fluorescein is administered to the living body serving as the subject.
  • Whereas the calculation area is set based on the visible light image in the first embodiment, in the third embodiment it is set based on the fluorescence observation image. Since ICG and fluorescein are agents used for angiography, the fluorescence observation image reflects the course of the blood vessels and is useful for setting the calculation area.
  • The acquisition unit 711 acquires a fluorescence observation image (third image) from the fluorescence observation camera 11, which images the fluorescence from the subject.
  • The setting unit 712 sets the calculation area based on the fluorescence observation image.
  • The generation unit 713 generates the SC image based on the speckle image and the calculation area set by the setting unit 712.
  • In this way, the SC value of the blood flow part can be calculated using only the region with a high fluorescence luminance value (the blood flow region) as the calculation area, so a highly accurate SC image can be generated.
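  • A minimal sketch of this idea follows, assuming the fluorescence observation image is registered to the speckle image; the brightness threshold used to define the blood flow region is an arbitrary placeholder.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def sc_with_fluorescence_mask(speckle: np.ndarray, fluorescence: np.ndarray,
                              size: int = 5, fluo_threshold: float = 100.0):
    """SC computed only over pixels whose fluorescence brightness exceeds the
    threshold, i.e. the presumed blood flow region."""
    mask = (fluorescence > fluo_threshold).astype(np.float64)
    img = speckle.astype(np.float64) * mask
    n = uniform_filter(mask, size=size)  # in-mask fraction of each window
    mean = uniform_filter(img, size=size) / np.maximum(n, 1e-12)
    mean_sq = uniform_filter(img * img, size=size) / np.maximum(n, 1e-12)
    var = np.maximum(mean_sq - mean * mean, 0.0)
    sc = np.sqrt(var) / np.maximum(mean, 1e-12)
    return np.where(mask > 0, sc, 0.0)   # SC only in the blood flow region
```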
  • Note that although fluorescence observation using an ICG agent or fluorescein has been described here, fluorescence observation using another agent may also be used.
  • FIG. 11 is a schematic diagram showing a mixed image according to the fourth embodiment of the present disclosure.
  • This mixed image is obtained, for example, by a method using so-called space division two-step exposure.
  • This mixed image includes visible light pixels and IR pixels alternately in both the vertical and horizontal directions in one frame.
  • In FIG. 11, the lighter-colored pixels are visible light pixels and the darker-colored pixels are IR pixels.
  • The processing unit 71 of the information processing device 7 acquires such a mixed image from an imaging device (not shown). The processing unit 71 can then extract the visible light pixels and the IR pixels from the mixed image and perform the same processing as when a visible light image and a speckle image are acquired separately (a sketch follows).
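  • A hedged sketch of extracting the two images from such a checkerboard mosaic is given below. The parity convention (visible pixels where row + column is even) and the nearest-neighbor hole filling are assumptions for illustration, since the pixel layout is only shown schematically in FIG. 11.

```python
import numpy as np

def split_mixed_image(mixed: np.ndarray):
    """Split a checkerboard mixed frame into a visible image and an IR image.

    Missing pixels of each output are filled from the horizontally adjacent
    pixel, which has the opposite parity, as a crude stand-in for proper
    interpolation.
    """
    h, w = mixed.shape[:2]
    yy, xx = np.mgrid[0:h, 0:w]
    vis_mask = (yy + xx) % 2 == 0          # assumed visible-pixel positions
    shifted = np.roll(mixed, 1, axis=1)    # value of the left-hand neighbor
    visible = mixed.copy()
    ir = mixed.copy()
    visible[~vis_mask] = shifted[~vis_mask]  # fill IR holes in visible image
    ir[vis_mask] = shifted[vis_mask]         # fill visible holes in IR image
    return visible, ir
```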
  • With the medical system 1 of the fourth embodiment, by using this single type of mixed image, the same processing as in the first embodiment can be performed with a single imaging device, without switching between irradiation by the structure observation light source 2 and irradiation by the narrow band light source 3 each time.
  • The technology according to the present disclosure can be applied to various products.
  • For example, the technology according to the present disclosure may be applied to an endoscopic surgery system.
  • FIG. 12 is a diagram showing an example of a schematic configuration of an endoscopic surgery system 5000 to which the technology according to the present disclosure can be applied.
  • FIG. 12 shows a surgeon (doctor) 5067 performing an operation on a patient 5071 on a patient bed 5069 using the endoscopic surgery system 5000.
  • As shown, the endoscopic surgery system 5000 includes an endoscope 5001, other surgical tools 5017, a support arm device 5027 that supports the endoscope 5001, and a cart 5037 on which various devices for endoscopic surgery are mounted.
  • In endoscopic surgery, instead of cutting open the abdominal wall, tubular puncture instruments called trocars 5025a to 5025d are punctured into the abdominal wall, and the lens barrel 5003 of the endoscope 5001 and the other surgical tools 5017 are inserted into the body cavity of the patient 5071 through the trocars 5025a to 5025d.
  • In the illustrated example, a pneumoperitoneum tube 5019, an energy treatment tool 5021, and forceps 5023 are inserted into the body cavity of the patient 5071 as the other surgical tools 5017.
  • The energy treatment tool 5021 is a treatment tool that performs incision and separation of tissue, sealing of blood vessels, and the like using high-frequency current or ultrasonic vibration.
  • Note that the illustrated surgical tools 5017 are merely an example; various surgical tools generally used in endoscopic surgery, such as tweezers and retractors, may be used as the surgical tools 5017.
  • An image of the surgical site in the body cavity of the patient 5071 taken by the endoscope 5001 is displayed on the display device 5041.
  • The surgeon 5067 performs procedures such as excising the affected site with the energy treatment tool 5021 and the forceps 5023 while viewing, in real time, the image of the surgical site displayed on the display device 5041.
  • Although not shown, the pneumoperitoneum tube 5019, the energy treatment tool 5021, and the forceps 5023 are supported by the surgeon 5067, an assistant, or the like during surgery.
  • The support arm device 5027 includes an arm portion 5031 extending from a base portion 5029.
  • In the illustrated example, the arm portion 5031 is composed of joint portions 5033a, 5033b, 5033c and links 5035a, 5035b, and is driven under control of the arm control device 5045.
  • The endoscope 5001 is supported by the arm portion 5031, and its position and posture are controlled. In this way, stable fixing of the position of the endoscope 5001 can be realized.
  • The endoscope 5001 includes the lens barrel 5003, a region of which extending a predetermined length from the distal end is inserted into the body cavity of the patient 5071, and the camera head 5005 connected to the proximal end of the lens barrel 5003.
  • In the illustrated example, the endoscope 5001 is configured as a so-called rigid endoscope having a rigid lens barrel 5003, but the endoscope 5001 may instead be configured as a so-called flexible endoscope having a flexible lens barrel.
  • An opening in which an objective lens is fitted is provided at the tip of the lens barrel 5003.
  • The light source device 5043 is connected to the endoscope 5001; the light generated by the light source device 5043 is guided to the tip of the lens barrel by a light guide extending inside the lens barrel 5003, and is emitted through the objective lens toward the observation target (subject) in the body cavity of the patient 5071.
  • Note that the endoscope 5001 may be a forward-viewing endoscope, an oblique-viewing endoscope, or a side-viewing endoscope.
  • An optical system and an image pickup device are provided inside the camera head 5005, and the reflected light (observation light) from the observation target is focused on the image pickup device by the optical system.
  • The observation light is photoelectrically converted by the imaging element, and an electric signal corresponding to the observation light, that is, an image signal corresponding to the observation image, is generated.
  • The image signal is transmitted to the camera control unit (CCU) 5039 as RAW data.
  • Note that the camera head 5005 has a function of adjusting the magnification and focal length by appropriately driving its optical system.
  • Note that the camera head 5005 may be provided with a plurality of imaging elements in order to support, for example, stereoscopic vision (3D display).
  • In that case, a plurality of relay optical systems are provided inside the lens barrel 5003 to guide the observation light to each of the plurality of imaging elements.
  • The CCU 5039 includes a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), and the like, and centrally controls the operations of the endoscope 5001 and the display device 5041. Specifically, the CCU 5039 applies various kinds of image processing for displaying an image, such as development processing (demosaic processing), to the image signal received from the camera head 5005. The CCU 5039 provides the processed image signal to the display device 5041, and also transmits a control signal to the camera head 5005 to control its driving.
  • The control signal may include information about imaging conditions such as the magnification and focal length.
  • The display device 5041 displays an image based on the image signal processed by the CCU 5039, under the control of the CCU 5039.
  • When the endoscope 5001 supports high-resolution imaging such as 4K (3840 horizontal pixels × 2160 vertical pixels) or 8K (7680 horizontal pixels × 4320 vertical pixels), and/or supports 3D display, a display device capable of high-resolution display and/or 3D display can be used correspondingly as the display device 5041.
  • When the endoscope supports high-resolution imaging such as 4K or 8K, a more immersive feeling can be obtained by using a display device 5041 with a size of 55 inches or more. Further, a plurality of display devices 5041 with different resolutions and sizes may be provided depending on the application.
  • The light source device 5043 is composed of a light source such as an LED (light emitting diode), and supplies the endoscope 5001 with irradiation light for imaging the surgical site.
  • The arm control device 5045 is configured by a processor such as a CPU, for example, and operates according to a predetermined program to control the driving of the arm portion 5031 of the support arm device 5027 in accordance with a predetermined control method.
  • The input device 5047 is an input interface to the endoscopic surgery system 5000.
  • Through the input device 5047, the user can input various kinds of information and instructions to the endoscopic surgery system 5000.
  • For example, the user inputs various kinds of information about the surgery, such as the patient's physical information and information about the surgical procedure, through the input device 5047.
  • The user may also input, via the input device 5047, an instruction to drive the arm portion 5031, an instruction to change the imaging conditions of the endoscope 5001 (type of irradiation light, magnification, focal length, and so on), an instruction to drive the energy treatment tool 5021, and the like.
  • The type of the input device 5047 is not limited, and the input device 5047 may be any of various known input devices.
  • As the input device 5047, for example, a mouse, a keyboard, a touch panel, a switch, a foot switch 5057, and/or a lever can be used.
  • When a touch panel is used as the input device 5047, it may be provided on the display surface of the display device 5041.
  • Alternatively, the input device 5047 may be a device worn by the user, such as a glasses-type wearable device or an HMD (Head Mounted Display), in which case various inputs are made according to the user's gestures or line of sight detected by the device. The input device 5047 may also include a camera capable of detecting the user's movement, with various inputs made according to gestures or line of sight detected from the captured video, or a microphone capable of picking up the user's voice, with various inputs made by voice through the microphone.
  • Because the input device 5047 can be configured to accept various inputs contactlessly in this way, a user who belongs to the clean area (for example, the surgeon 5067) can operate devices belonging to the unclean area without contact. In addition, since the user can operate devices without taking a hand off the surgical tools, user convenience is improved.
  • The treatment tool control device 5049 controls the driving of the energy treatment tool 5021 for cauterizing or incising tissue, sealing blood vessels, and the like.
  • The pneumoperitoneum device 5051 sends gas into the body cavity of the patient 5071 through the pneumoperitoneum tube 5019 to inflate the body cavity, in order to secure the field of view of the endoscope 5001 and working space for the surgeon.
  • The recorder 5053 is a device capable of recording various kinds of information about the surgery.
  • The printer 5055 is a device capable of printing various kinds of information about the surgery in various formats such as text, images, or graphs.
  • The support arm device 5027 includes a base portion 5029, which serves as the base, and an arm portion 5031 extending from the base portion 5029.
  • In the illustrated example, the arm portion 5031 is composed of a plurality of joint portions 5033a, 5033b, 5033c and a plurality of links 5035a, 5035b connected by the joint portion 5033b, but in FIG. 12 its configuration is illustrated in simplified form. In practice, the shapes, numbers, and arrangements of the joint portions 5033a to 5033c and the links 5035a and 5035b, the directions of the rotation axes of the joint portions 5033a to 5033c, and so on can be set appropriately so that the arm portion 5031 has the desired degrees of freedom.
  • For example, the arm portion 5031 is preferably configured to have six or more degrees of freedom. This allows the endoscope 5001 to be moved freely within the movable range of the arm portion 5031, so the lens barrel 5003 of the endoscope 5001 can be inserted into the body cavity of the patient 5071 from a desired direction.
  • An actuator is provided in each of the joints 5033a to 5033c, and the joints 5033a to 5033c are configured to be rotatable about a predetermined rotation axis by driving the actuator.
  • The driving of the actuators is controlled by the arm control device 5045, whereby the rotation angles of the joint portions 5033a to 5033c are controlled and the driving of the arm portion 5031 is controlled. In this way, control of the position and posture of the endoscope 5001 can be realized.
  • The arm control device 5045 can control the driving of the arm portion 5031 by various known control methods such as force control or position control.
  • For example, when the surgeon 5067 performs an appropriate operation input via the input device 5047 (including the foot switch 5057), the arm control device 5045 appropriately controls the driving of the arm portion 5031 in accordance with the operation input, and the position and posture of the endoscope 5001 may be controlled accordingly.
  • With this control, the endoscope 5001 at the distal end of the arm portion 5031 can be moved from one arbitrary position to another and then fixedly supported at the position after the movement.
  • Note that the arm portion 5031 may be operated by a so-called master-slave method.
  • In that case, the arm portion 5031 can be remotely operated by the user via the input device 5047 installed at a location away from the operating room.
  • When force control is applied, the arm control device 5045 may perform so-called power assist control, driving the actuators of the joint portions 5033a to 5033c so that the arm portion 5031 moves smoothly in response to the external force received from the user.
  • Here, in general endoscopic surgery, the endoscope 5001 has been supported by a doctor called a scopist.
  • In contrast, using the support arm device 5027 makes it possible to fix the position of the endoscope 5001 more reliably without relying on human hands, so an image of the surgical site can be obtained stably and surgery can be performed smoothly.
  • Note that the arm control device 5045 does not necessarily have to be provided on the cart 5037. Furthermore, the arm control device 5045 does not necessarily have to be a single device. For example, an arm control device 5045 may be provided at each of the joint portions 5033a to 5033c of the arm portion 5031 of the support arm device 5027, and the plurality of arm control devices 5045 may cooperate with each other to control the driving of the arm portion 5031.
  • The light source device 5043 supplies the endoscope 5001 with irradiation light for imaging the surgical site.
  • The light source device 5043 is composed of, for example, an LED, a laser light source, or a white light source configured by a combination thereof.
  • When the white light source is formed by a combination of RGB laser light sources, the output intensity and output timing of each color (each wavelength) can be controlled with high accuracy, so the white balance of the captured image can be adjusted within the light source device 5043.
  • In this case, by irradiating the observation target with laser light from each of the RGB laser light sources in a time-division manner and controlling the driving of the imaging element of the camera head 5005 in synchronization with the irradiation timing, images corresponding to R, G, and B can be captured in a time-division manner. With this method, a color image can be obtained without providing color filters on the imaging element.
  • Further, the driving of the light source device 5043 may be controlled so as to change the intensity of the output light at predetermined time intervals. By controlling the driving of the imaging element of the camera head 5005 in synchronization with the timing of the intensity changes, acquiring images in a time-division manner, and combining them, an image with a high dynamic range free of so-called crushed blacks and blown-out highlights can be generated.
  • The light source device 5043 may also be configured to be able to supply light in a predetermined wavelength band corresponding to special light observation.
  • In special light observation, for example, so-called narrow band imaging is performed: by exploiting the wavelength dependence of light absorption in body tissue, light in a narrower band than the irradiation light used in normal observation (that is, white light) is irradiated, and predetermined tissue such as blood vessels in the mucosal surface layer is imaged with high contrast.
  • Alternatively, in special light observation, fluorescence observation may be performed, in which an image is obtained from the fluorescence generated by irradiating excitation light.
  • In fluorescence observation, the body tissue may be irradiated with excitation light and the fluorescence from the body tissue observed (autofluorescence observation), or a reagent such as indocyanine green (ICG) may be locally injected into the body tissue and the tissue irradiated with excitation light corresponding to the fluorescence wavelength of the reagent to obtain a fluorescence image.
  • The light source device 5043 can be configured to be able to supply narrow band light and/or excitation light corresponding to such special light observation.
  • FIG. 13 is a block diagram showing an example of the functional configuration of the camera head 5005 and the CCU 5039 shown in FIG. 12.
  • Referring to FIG. 13, the camera head 5005 has, as its functions, a lens unit 5007, an imaging unit 5009, a drive unit 5011, a communication unit 5013, and a camera head control unit 5015.
  • The CCU 5039 has, as its functions, a communication unit 5059, an image processing unit 5061, and a control unit 5063.
  • The camera head 5005 and the CCU 5039 are connected by a transmission cable 5065 so as to be able to communicate bidirectionally.
  • The lens unit 5007 is an optical system provided at the connection with the lens barrel 5003.
  • Observation light taken in from the tip of the lens barrel 5003 is guided to the camera head 5005 and enters the lens unit 5007.
  • The lens unit 5007 is configured by combining a plurality of lenses including a zoom lens and a focus lens.
  • The optical characteristics of the lens unit 5007 are adjusted so that the observation light is condensed on the light-receiving surface of the imaging element of the imaging unit 5009.
  • The zoom lens and the focus lens are configured so that their positions on the optical axis can be moved in order to adjust the magnification and focus of the captured image.
  • The imaging unit 5009 is composed of an imaging element and is arranged behind the lens unit 5007.
  • The observation light that has passed through the lens unit 5007 is condensed on the light-receiving surface of the imaging element, and an image signal corresponding to the observation image is generated by photoelectric conversion.
  • The image signal generated by the imaging unit 5009 is provided to the communication unit 5013.
  • As the imaging element constituting the imaging unit 5009, for example, a CMOS (Complementary Metal Oxide Semiconductor) image sensor having a Bayer array and capable of color imaging is used. An element capable of capturing high-resolution images of 4K or more may be used, for example. Obtaining the image of the surgical site at high resolution allows the surgeon 5067 to grasp the state of the surgical site in more detail, so the surgery can proceed more smoothly.
  • Further, the imaging element constituting the imaging unit 5009 may be configured to have a pair of imaging elements for obtaining image signals for the right eye and the left eye corresponding to 3D display. 3D display enables the surgeon 5067 to grasp the depth of the living tissue in the surgical site more accurately.
  • Note that when the imaging unit 5009 is of a multi-plate type, a plurality of lens units 5007 are provided corresponding to the respective imaging elements.
  • Also, the imaging unit 5009 does not necessarily have to be provided in the camera head 5005.
  • For example, the imaging unit 5009 may be provided inside the lens barrel 5003, immediately behind the objective lens.
  • The drive unit 5011 is composed of an actuator and, under control of the camera head control unit 5015, moves the zoom lens and the focus lens of the lens unit 5007 by predetermined distances along the optical axis. In this way, the magnification and focus of the image captured by the imaging unit 5009 can be adjusted appropriately.
  • The communication unit 5013 is composed of a communication device for transmitting and receiving various kinds of information to and from the CCU 5039.
  • The communication unit 5013 transmits the image signal obtained from the imaging unit 5009 as RAW data to the CCU 5039 via the transmission cable 5065.
  • The image signal is preferably transmitted by optical communication in order to display the captured image of the surgical site with low latency.
  • During surgery, the surgeon 5067 operates while observing the state of the affected area through the captured images, so for safer and more reliable surgery, the moving image of the surgical site is required to be displayed in as close to real time as possible.
  • When optical communication is performed, the communication unit 5013 is provided with a photoelectric conversion module that converts an electric signal into an optical signal.
  • The image signal is converted into an optical signal by the photoelectric conversion module and then transmitted to the CCU 5039 via the transmission cable 5065.
  • The communication unit 5013 also receives from the CCU 5039 a control signal for controlling the driving of the camera head 5005.
  • The control signal includes information about imaging conditions, such as information specifying the frame rate of the captured image, information specifying the exposure value at the time of imaging, and/or information specifying the magnification and focus of the captured image.
  • The communication unit 5013 provides the received control signal to the camera head control unit 5015.
  • Note that the control signal from the CCU 5039 may also be transmitted by optical communication.
  • In that case, the communication unit 5013 is provided with a photoelectric conversion module that converts an optical signal into an electric signal; the control signal is converted into an electric signal by this module and then provided to the camera head control unit 5015.
  • Note that the imaging conditions described above, such as the frame rate, exposure value, magnification, and focus, are automatically set by the control unit 5063 of the CCU 5039 based on the acquired image signal. That is, the endoscope 5001 is equipped with so-called AE (Auto Exposure), AF (Auto Focus), and AWB (Auto White Balance) functions.
  • The camera head control unit 5015 controls the driving of the camera head 5005 based on the control signal received from the CCU 5039 via the communication unit 5013.
  • For example, the camera head control unit 5015 controls the driving of the imaging element of the imaging unit 5009 based on information specifying the frame rate of the captured image and/or information specifying the exposure at the time of imaging.
  • Further, the camera head control unit 5015 appropriately moves the zoom lens and the focus lens of the lens unit 5007 via the drive unit 5011, based on information specifying the magnification and focus of the captured image.
  • The camera head control unit 5015 may further have a function of storing information for identifying the lens barrel 5003 and the camera head 5005.
  • Note that by arranging components such as the lens unit 5007 and the imaging unit 5009 in a sealed structure with high airtightness and waterproofness, the camera head 5005 can be made resistant to autoclave sterilization.
  • The communication unit 5059 is composed of a communication device for transmitting and receiving various kinds of information to and from the camera head 5005.
  • The communication unit 5059 receives the image signal transmitted from the camera head 5005 via the transmission cable 5065.
  • As described above, the image signal can preferably be transmitted by optical communication.
  • In that case, corresponding to the optical communication, the communication unit 5059 is provided with a photoelectric conversion module that converts an optical signal into an electric signal.
  • The communication unit 5059 provides the image signal converted into an electric signal to the image processing unit 5061.
  • The communication unit 5059 also transmits to the camera head 5005 a control signal for controlling the driving of the camera head 5005.
  • The control signal may also be transmitted by optical communication.
  • The image processing unit 5061 performs various kinds of image processing on the image signal, which is RAW data transmitted from the camera head 5005.
  • The image processing includes various known signal processing such as development processing, image quality enhancement processing (band emphasis processing, super-resolution processing, NR (noise reduction) processing, and/or camera shake correction processing), and/or enlargement processing (electronic zoom processing).
  • the image processing unit 5061 also performs detection processing on the image signal for performing AE, AF, and AWB.
  • the image processing unit 5061 includes a processor such as a CPU and a GPU, and the image processing and the detection processing described above can be performed by the processor operating according to a predetermined program.
  • when the image processing unit 5061 is composed of a plurality of GPUs, it appropriately divides the information related to the image signal, and the plurality of GPUs perform image processing in parallel (a simplified sketch of this divide-and-process idea follows below).
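As an illustration of this kind of divide-and-process parallelism, the following sketch (not from the patent; the strip split, worker count, and placeholder filter are all assumptions) splits an image into strips and processes them concurrently. A real CCU would dispatch the strips to multiple GPUs rather than to CPU worker processes.

```python
# Hypothetical sketch of tile-parallel image processing (assumption, not the patent's design).
import numpy as np
from concurrent.futures import ProcessPoolExecutor

def denoise_strip(strip: np.ndarray) -> np.ndarray:
    return strip  # placeholder for NR / enhancement applied to one strip

def process_parallel(image: np.ndarray, workers: int = 4) -> np.ndarray:
    strips = np.array_split(image, workers, axis=0)   # divide the image into horizontal strips
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return np.vstack(list(pool.map(denoise_strip, strips)))  # process strips in parallel, reassemble
```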
  • the control unit 5063 performs various controls regarding imaging of the surgical site by the endoscope 5001 and display of the captured image. For example, the control unit 5063 generates a control signal for controlling the driving of the camera head 5005. At this time, when imaging conditions are input by the user, the control unit 5063 generates the control signal based on that input. Alternatively, when the endoscope 5001 is equipped with the AE function, the AF function, and the AWB function, the control unit 5063 appropriately calculates the optimum exposure value, focal length, and white balance according to the result of the detection processing by the image processing unit 5061 and generates a control signal.
  • control unit 5063 causes the display device 5041 to display the image of the surgical site based on the image signal subjected to the image processing by the image processing unit 5061.
  • the control unit 5063 recognizes various objects in the operation region image using various image recognition techniques.
  • the control unit 5063 can recognize surgical instruments such as forceps, specific body parts, bleeding, mist generated when the energy treatment instrument 5021 is used, and the like, by detecting the shapes and colors of the edges of objects included in the surgical image.
  • the control unit 5063 uses the recognition result to superimpose and display various types of surgical support information on the image of the surgical site. By displaying the surgery support information in a superimposed manner and presenting it to the operator 5067, it becomes possible to proceed with the surgery more safely and reliably.
  • a transmission cable 5065 connecting the camera head 5005 and the CCU 5039 is an electric signal cable compatible with electric signal communication, an optical fiber compatible with optical communication, or a composite cable of these.
  • here, wired communication is performed using the transmission cable 5065, but communication between the camera head 5005 and the CCU 5039 may be performed wirelessly.
  • when the communication between the two is performed wirelessly, the transmission cable 5065 does not need to be laid in the operating room, which eliminates situations in which the cable impedes the movement of medical staff in the operating room.
  • the example of the endoscopic surgery system 5000 to which the technology according to the present disclosure can be applied has been described above.
  • although the endoscopic surgery system 5000 has been described here as an example, the system to which the technology according to the present disclosure can be applied is not limited to this example.
  • for example, the technique according to the present disclosure may be applied to a flexible endoscope system for examination or to the microscopic surgery system described in Application Example 2 below.
  • the technology according to the present disclosure can be suitably applied to the endoscope 5001 among the configurations described above. Specifically, it can be applied when the blood flow part and the non-blood flow part in the image of the surgical site in the body cavity of the patient 5071 captured by the endoscope 5001 are displayed on the display device 5041 in an easily visible manner.
  • by applying the technique according to the present disclosure to the endoscope 5001, the calculation region for calculating the SC value in the speckle imaging technique can be set more appropriately, and a good SC image can be generated.
  • as a result, the operator 5067 can view, in real time on the display device 5041, an image of the surgical site in which the blood flow portion and the non-blood flow portion are accurately identified, and can perform the surgery more safely.
  • the technology according to the present disclosure may be applied to a microscopic surgery system used for so-called microsurgery, which is performed while observing a microscopic part of a patient in an enlarged manner.
  • FIG. 14 is a diagram showing an example of a schematic configuration of a microscopic surgery system 5300 to which the technology according to the present disclosure can be applied.
  • the microscopic surgery system 5300 includes a microscope device 5301, a control device 5317, and a display device 5319.
  • the “user” refers to an operator, an assistant, or any medical staff who uses the microscopic surgery system 5300.
  • the microscope device 5301 includes a microscope unit 5303 for magnifying and observing an observation target (operated part of a patient), an arm unit 5309 that supports the microscope unit 5303 at its tip, and a base unit 5315 that supports the base end of the arm unit 5309.
  • the microscope unit 5303 is composed of a substantially cylindrical tubular portion 5305, an imaging unit (not shown) provided inside the tubular portion 5305, and an operation unit 5307 provided in a partial area on the outer periphery of the tubular portion 5305.
  • the microscope unit 5303 is an electronic imaging type microscope unit (so-called video type microscope unit) that electronically captures a captured image by the imaging unit.
  • a cover glass that protects the internal image pickup unit is provided on the opening surface at the lower end of the tubular portion 5305.
  • Light from the observation target (hereinafter, also referred to as observation light) passes through the cover glass and enters the imaging unit inside the tubular portion 5305.
  • a light source such as an LED (Light Emitting Diode) may be provided inside the tubular portion 5305, and light may be emitted from the light source to the observation target through the cover glass during imaging.
  • the imaging unit is composed of an optical system that collects the observation light and an imaging device that receives the observation light that is collected by the optical system.
  • the optical system is configured by combining a plurality of lenses including a zoom lens and a focus lens, and its optical characteristics are adjusted so as to focus the observation light onto the light-receiving surface of the image pickup device.
  • the imaging device receives the observation light and photoelectrically converts the observation light to generate a signal corresponding to the observation light, that is, an image signal corresponding to the observation image.
  • as the image pickup device, for example, a device having a Bayer array and capable of color imaging is used.
  • the image pickup device may be various known image pickup devices such as a CMOS (Complementary Metal Oxide Semiconductor) image sensor or a CCD (Charge Coupled Device) image sensor.
  • the image signal generated by the image sensor is transmitted to the control device 5317 as RAW data.
  • the transmission of the image signal is preferably performed by optical communication.
  • this is because the surgeon performs surgery while observing the state of the affected area through the captured image, so for safer and more reliable surgery it is required that the moving image of the surgical site be displayed in real time as much as possible.
  • By transmitting the image signal by optical communication it is possible to display the captured image with low latency.
  • the image pickup unit may have a drive mechanism that moves the zoom lens and the focus lens of the optical system along the optical axis. By appropriately moving the zoom lens and the focus lens by the drive mechanism, the magnification of the captured image and the focal length at the time of capturing can be adjusted. Further, the image pickup unit may be equipped with various functions such as an AE (Auto Exposure) function and an AF (Auto Focus) function, which are generally provided in an electronic image pickup type microscope unit.
  • the image pickup unit may be configured as a so-called single-plate type image pickup unit having one image pickup device, or may be configured as a so-called multi-plate type image pickup unit having a plurality of image pickup devices.
  • in the case of a multi-plate type configuration, image signals corresponding to RGB may be generated by the respective image pickup devices and combined to obtain a color image.
  • the image capturing unit may be configured to have a pair of image capturing elements for respectively acquiring image signals for the right eye and the left eye corresponding to stereoscopic vision (3D display).
  • 3D display enables the operator to more accurately understand the depth of the living tissue in the operation site.
  • in that case, a plurality of optical systems may be provided corresponding to the respective image pickup devices.
  • the operation unit 5307 is an input unit that is configured by, for example, a cross lever or a switch, and that accepts a user operation input.
  • the user can input, via the operation unit 5307, an instruction to change the magnification of the observation image and the focal length to the observation target.
  • the drive mechanism of the imaging unit appropriately moves the zoom lens and the focus lens, so that the enlargement magnification and the focal length can be adjusted.
  • the user can input an instruction to switch the operation mode (all-free mode and fixed mode described later) of the arm unit 5309 via the operation unit 5307.
  • the operation unit 5307 is preferably provided at a position where the user can easily operate it with a finger while gripping the tubular portion 5305, so that it can be operated even while the user is moving the tubular portion 5305.
  • the arm portion 5309 is configured by rotatably connecting a plurality of links (first link 5313a to sixth link 5313f) to each other via a plurality of joint portions (first joint portion 5311a to sixth joint portion 5311f).
  • the first joint portion 5311a has a substantially columnar shape, and at its tip (lower end) supports the upper end of the tubular portion 5305 of the microscope unit 5303 so as to be rotatable around a rotation axis (first axis O1) parallel to the central axis of the tubular portion 5305.
  • the first joint portion 5311a may be configured such that the first axis O1 coincides with the optical axis of the imaging unit of the microscope unit 5303. Accordingly, by rotating the microscope unit 5303 around the first axis O1, it is possible to change the field of view so as to rotate the captured image.
  • the first link 5313a fixedly supports the first joint portion 5311a at the tip.
  • the first link 5313a is a rod-shaped member having a substantially L shape; one side extends in a direction orthogonal to the first axis O1, and the end of that side is connected to the first joint portion 5311a so as to contact the upper end of the outer periphery of the first joint portion 5311a.
  • the second joint 5311b is connected to the end of the other side of the first link 5313a on the base end side of the substantially L shape.
  • the second joint portion 5311b has a substantially columnar shape, and its tip supports the base end of the first link 5313a so as to be rotatable about a rotation axis (second axis O2) orthogonal to the first axis O1.
  • the tip end of the second link 5313b is fixedly connected to the base end of the second joint portion 5311b.
  • the second link 5313b is a rod-shaped member having a substantially L shape; one side extends in a direction orthogonal to the second axis O2, and the end of that side is fixedly connected to the base end of the second joint portion 5311b.
  • the third joint portion 5311c is connected to the other side of the second link 5313b on the base end side of the substantially L shape.
  • the third joint portion 5311c has a substantially cylindrical shape, and its tip supports the base end of the second link 5313b so as to be rotatable about a rotation axis (third axis O3) orthogonal to the first axis O1 and the second axis O2.
  • the tip end of the third link 5313c is fixedly connected to the base end of the third joint portion 5311c. By rotating the configuration on the tip side including the microscope unit 5303 around the second axis O2 and the third axis O3, the microscope unit 5303 can be moved so as to change its position in the horizontal plane. That is, by controlling the rotation around the second axis O2 and the third axis O3, the field of view of the captured image can be moved within a plane.
  • the third link 5313c has a substantially columnar shape on its tip side, and the base end of the third joint portion 5311c is fixedly connected to the tip of that column so that the two share substantially the same central axis.
  • the base end side of the third link 5313c has a prismatic shape, and the fourth joint 5311d is connected to the end thereof.
  • the fourth joint portion 5311d has a substantially columnar shape, and its tip supports the base end of the third link 5313c so as to be rotatable about a rotation axis (fourth axis O4) orthogonal to the third axis O3.
  • the tip end of the fourth link 5313d is fixedly connected to the base end of the fourth joint portion 5311d.
  • the fourth link 5313d is a rod-shaped member that extends in a substantially straight line.
  • the fourth link 5313d extends so as to be orthogonal to the fourth axis O4, and is fixedly connected to the fourth joint portion 5311d so that the end of its tip abuts the substantially columnar side surface of the fourth joint portion 5311d.
  • the fifth joint 5311e is connected to the base end of the fourth link 5313d.
  • the fifth joint portion 5311e has a substantially columnar shape, and its tip side supports the base end of the fourth link 5313d so as to be rotatable about a rotation axis (fifth axis O5) parallel to the fourth axis O4.
  • the tip end of the fifth link 5313e is fixedly connected to the base end of the fifth joint portion 5311e.
  • the fourth axis O4 and the fifth axis O5 are rotation axes that allow the microscope unit 5303 to be moved in the vertical direction. By rotating the configuration on the tip side including the microscope unit 5303 around the fourth axis O4 and the fifth axis O5, the height of the microscope unit 5303, that is, the distance between the microscope unit 5303 and the observation target, can be adjusted.
  • the fifth link 5313e is configured by combining a first member having a substantially L shape, one side of which extends in the vertical direction and the other side of which extends in the horizontal direction, and a rod-shaped second member that extends vertically downward from the horizontally extending portion of the first member.
  • the base end of the fifth joint portion 5311e is fixedly connected near the upper end of the vertically extending portion of the first member of the fifth link 5313e.
  • the sixth joint 5311f is connected to the base end (lower end) of the second member of the fifth link 5313e.
  • the sixth joint 5311f has a substantially columnar shape, and supports the base end of the fifth link 5313e on the tip side thereof so as to be rotatable about a rotation axis (sixth axis O6) parallel to the vertical direction.
  • the tip end of the sixth link 5313f is fixedly connected to the base end of the sixth joint portion 5311f.
  • the sixth link 5313f is a rod-shaped member extending in the vertical direction, and its base end is fixedly connected to the upper surface of the base portion 5315.
  • the rotatable range of the first joint portion 5311a to the sixth joint portion 5311f is appropriately set so that the microscope unit 5303 can move as desired.
  • with the arm portion 5309 configured as described above, a total of 6 degrees of freedom, namely 3 translational and 3 rotational degrees of freedom, can be realized for the movement of the microscope unit 5303.
  • by configuring the arm unit 5309 to realize 6 degrees of freedom for the movement of the microscope unit 5303, the position and orientation of the microscope unit 5303 can be freely controlled within the movable range of the arm unit 5309. Therefore, the surgical site can be observed from any angle, and the surgery can be performed more smoothly.
  • the illustrated configuration of the arm portion 5309 is merely an example; the number and shape (length) of the links constituting the arm portion 5309, the number of joint portions, their arrangement positions, the directions of the rotation axes, and the like may be appropriately designed so that the desired degrees of freedom can be realized.
  • in order to move the microscope unit 5303 freely, the arm unit 5309 is preferably configured to have 6 degrees of freedom, but it may also be configured to have more degrees of freedom (that is, redundant degrees of freedom).
  • when redundant degrees of freedom exist, the posture of the arm unit 5309 can be changed while the position and posture of the microscope unit 5303 are fixed. Therefore, for example, control that is more convenient for the operator can be realized by controlling the posture of the arm unit 5309 so that it does not block the view of the operator looking at the display device 5319.
  • each of the first joint portion 5311a to the sixth joint portion 5311f may be provided with an actuator equipped with a drive mechanism such as a motor and an encoder that detects the rotation angle of the joint. By appropriately controlling the driving of each actuator by the control device 5317, the posture of the arm portion 5309, that is, the position and posture of the microscope unit 5303, can be controlled. Specifically, the control device 5317 can grasp the current posture of the arm unit 5309 and the current position and posture of the microscope unit 5303 based on the information about the rotation angle of each joint detected by the encoders.
  • using the grasped information, the control device 5317 calculates the control value (for example, the rotation angle or the generated torque) for each joint portion that realizes the movement of the microscope unit 5303 commanded by the user's operation input, and drives the drive mechanism of each joint according to the control value (a simplified sketch follows below). At this time, the control method of the arm portion 5309 by the control device 5317 is not limited, and various known control methods such as force control or position control may be applied.
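The sketch below is a hypothetical, minimal position-control loop in the spirit of the paragraph above: read the encoder angle of each joint, compare it with a target angle derived from the operation input, and command the actuator. The Joint interface, gain value, and velocity command are illustrative assumptions, not the patent's controller.

```python
# Minimal per-joint proportional position control (assumption, for illustration only).
from dataclasses import dataclass

@dataclass
class Joint:
    angle: float       # current angle read from the encoder [rad]
    target: float      # target angle derived from the user's operation input [rad]
    gain: float = 2.0  # proportional gain (illustrative value)

def control_step(joints: list[Joint], dt: float) -> list[float]:
    """Return an angular-velocity command for each joint of the arm."""
    commands = []
    for j in joints:
        error = j.target - j.angle      # encoder feedback vs. commanded motion
        velocity = j.gain * error       # P control toward the target angle
        j.angle += velocity * dt        # simulated actuator response for this step
        commands.append(velocity)
    return commands
```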
  • when there is an operation input from the user, the control device 5317 appropriately controls the driving of the arm portion 5309 according to that input, whereby the position and posture of the microscope unit 5303 are controlled.
  • with this control, the microscope unit 5303 can be moved from an arbitrary position to another arbitrary position and then fixedly supported at the new position.
  • in consideration of the convenience of the operator, the input device is preferably one that can be operated even while the operator holds a surgical tool, such as a foot switch.
  • the operation input may also be performed in a non-contact manner based on gesture detection or line-of-sight detection using a wearable device or a camera provided in the operating room.
  • the arm portion 5309 may also be operated by a so-called master-slave method.
  • in that case, the arm unit 5309 can be remotely operated by the user via an input device installed at a place apart from the operating room.
  • alternatively, the actuators of the first joint portion 5311a to the sixth joint portion 5311f may be driven so that the arm portion 5309 receives an external force from the user and moves smoothly in accordance with that force, that is, so-called power assist control may be performed.
  • This allows the microscope unit 5303 to be moved with a comparatively light force when the user grasps the microscope unit 5303 and directly moves its position. Therefore, the microscope unit 5303 can be moved more intuitively and with a simpler operation, and the convenience of the user can be improved.
  • the drive of the arm portion 5309 may be controlled so as to perform a pivot operation.
  • the pivot operation is an operation of moving the microscope unit 5303 so that the optical axis of the microscope unit 5303 always faces a predetermined point in space (hereinafter referred to as a pivot point). According to the pivot operation, it is possible to observe the same observation position from various directions, and thus it is possible to observe the affected area in more detail.
  • when the microscope unit 5303 is configured such that its focal length cannot be adjusted, the pivot operation is preferably performed with the distance between the microscope unit 5303 and the pivot point fixed. In this case, the distance between the microscope unit 5303 and the pivot point may be adjusted to the fixed focal length of the microscope unit 5303.
  • with this arrangement, the microscope unit 5303 moves on a hemispherical surface (schematically shown in FIG. 14) whose radius corresponds to the focal length and whose center is the pivot point, so a clear captured image is obtained even when the observation direction is changed (a geometric sketch follows below).
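The geometry just described can be made concrete with a small sketch: candidate microscope positions lie on a hemisphere of radius equal to the focal length, centered on the pivot point, with the optical axis always aimed at the pivot. The parameterization by azimuth and elevation is an illustrative assumption.

```python
# Hypothetical sketch of a pivot-constrained microscope pose (assumption, for illustration).
import numpy as np

def microscope_pose(pivot: np.ndarray, focal_length: float,
                    azimuth: float, elevation: float):
    """Position on the hemisphere and unit optical-axis vector toward the pivot."""
    direction = np.array([
        np.cos(elevation) * np.cos(azimuth),
        np.cos(elevation) * np.sin(azimuth),
        np.sin(elevation),                 # elevation > 0 keeps the unit above the pivot
    ])
    position = pivot + focal_length * direction  # point on the hemisphere of radius = focal length
    optical_axis = -direction                    # the optical axis always looks back at the pivot
    return position, optical_axis
```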
  • the pivot operation may be performed in a state where the distance between the microscope unit 5303 and the pivot point is variable.
  • in that case, the control device 5317 may calculate the distance between the microscope unit 5303 and the pivot point based on the information about the rotation angle of each joint detected by the encoders, and automatically adjust the focal length of the microscope unit 5303 based on the calculation result.
  • alternatively, when the microscope unit 5303 is provided with an AF function, the focal length may be automatically adjusted by the AF function each time the distance between the microscope unit 5303 and the pivot point changes due to the pivot operation.
  • each of the first joint portion 5311a to the sixth joint portion 5311f may be provided with a brake that restrains its rotation.
  • the operation of the brake can be controlled by the controller 5317.
  • when fixing the position and posture of the microscope unit 5303, the control device 5317 operates the brakes of the joints. Accordingly, the posture of the arm portion 5309, that is, the position and posture of the microscope unit 5303, can be fixed without driving the actuators, so power consumption can be reduced.
  • when moving the microscope unit 5303, the control device 5317 may release the brakes of the joints and drive the actuators according to a predetermined control method.
  • such brake operation can be performed in response to the user's operation input via the operation unit 5307 described above.
  • when the user wants to move the position and posture of the microscope unit 5303, the user operates the operation unit 5307 to release the brakes of the joints. As a result, the operation mode of the arm portion 5309 shifts to a mode in which each joint can rotate freely (all-free mode).
  • when the user wants to fix the position and posture of the microscope unit 5303, the user operates the operation unit 5307 to activate the brakes of the joints. As a result, the operation mode of the arm portion 5309 shifts to a mode in which the rotation of each joint is restricted (fixed mode).
  • the control device 5317 controls the operation of the microscopic surgery system 5300 by controlling the operations of the microscope device 5301 and the display device 5319.
  • the control device 5317 controls the drive of the arm portion 5309 by operating the actuators of the first joint portion 5311a to the sixth joint portion 5311f according to a predetermined control method.
  • the control device 5317 changes the operation mode of the arm portion 5309 by controlling the operation of the brakes of the first joint portion 5311a to the sixth joint portion 5311f.
  • the control device 5317 performs various kinds of signal processing on the image signal acquired by the imaging unit of the microscope unit 5303 of the microscope device 5301 to generate image data for display, and causes the display device 5319 to display that image data.
  • as the signal processing, various known signal processes may be performed, such as development processing (demosaic processing), image quality enhancement processing (band enhancement processing, super-resolution processing, NR (noise reduction) processing, and/or camera shake correction processing), and/or enlargement processing (that is, electronic zoom processing).
  • communication between the control device 5317 and the microscope unit 5303 and communication between the control device 5317 and the first joint portion 5311a to the sixth joint portion 5311f may be wired communication or wireless communication.
  • in the case of wired communication, electric signal communication or optical communication may be performed.
  • the transmission cable used for wired communication may be configured as an electric signal cable, an optical fiber, or a composite cable of these, depending on the communication system.
  • in the case of wireless communication, since it is not necessary to lay a transmission cable in the operating room, it is possible to eliminate situations where the transmission cable hinders the movement of medical staff in the operating room.
  • the control device 5317 may be a processor such as a CPU (Central Processing Unit) or a GPU (Graphics Processing Unit), or a microcomputer or a control board on which a processor and a storage element such as a memory are mounted.
  • the various functions described above can be realized by the processor of the control device 5317 operating according to a predetermined program.
  • in the illustrated example, the control device 5317 is provided as a device separate from the microscope device 5301, but it may instead be installed inside the base portion 5315 of the microscope device 5301 and configured integrally with the microscope device 5301.
  • the control device 5317 may be composed of a plurality of devices.
  • for example, the microscope unit 5303 and the first joint portion 5311a to the sixth joint portion 5311f of the arm unit 5309 may each be provided with a microcomputer, a control board, or the like, and these may be communicably connected to each other to realize similar functions.
  • the display device 5319 is provided in the operating room, and displays an image corresponding to the image data generated by the control device 5317 under the control of the control device 5317. That is, the display device 5319 displays the image of the operation portion photographed by the microscope portion 5303.
  • the display device 5319 may display various kinds of information related to the surgery, such as the physical information of the patient and the information about the surgical procedure, instead of the image of the surgery part or together with the image of the surgery part. In this case, the display on the display device 5319 may be appropriately switched by the operation of the user.
  • a plurality of display devices 5319 may be provided, and each of the plurality of display devices 5319 may display an image of the operation portion and various kinds of information regarding surgery.
  • various known display devices such as a liquid crystal display device or an EL (Electro Luminescence) display device may be applied.
  • FIG. 15 is a diagram showing a state of surgery using the microscopic surgery system 5300 shown in FIG.
  • FIG. 15 schematically shows a surgeon 5321 performing an operation on a patient 5325 on a patient bed 5323 using the microscopic surgery system 5300.
  • in FIG. 15, the control device 5317 is omitted from the illustrated configuration of the microscopic surgery system 5300, and the microscope device 5301 is shown in a simplified manner.
  • an image of the surgical site taken by the microscope device 5301 is enlarged and displayed on the display device 5319 installed on the wall of the operating room by using the microscopic surgery system 5300.
  • the display device 5319 is installed at a position facing the surgeon 5321.
  • the surgeon 5321 performs various treatments on the surgical site, such as resection of the affected area, while observing the state of the surgical site through the image displayed on the display device 5319.
  • the microscopic surgery system 5300 to which the technology according to the present disclosure can be applied has been described above.
  • the microscope device 5301 can also function as a support arm device that supports another observation device or another surgical tool at its tip instead of the microscope portion 5303.
  • An endoscope may be applied as the other observation device.
  • as the other surgical tool, forceps, tweezers, a pneumoperitoneum tube for pneumoperitoneum, or an energy treatment tool for incising tissue or sealing blood vessels by cauterization can be applied.
  • the technique according to the present disclosure may be applied to a support arm device that supports a configuration other than such a microscope unit.
  • the technology according to the present disclosure can be preferably applied to the control device 5317 among the configurations described above.
  • specifically, the technology according to the present disclosure can be applied when the blood flow portion and the non-blood flow portion in the image of the surgical site of the patient 5325 captured by the imaging unit of the microscope unit 5303 are displayed on the display device 5319 in a readily visible manner.
  • by applying the technology according to the present disclosure to the control device 5317, it is possible to more appropriately set the calculation region for calculating the SC value in the speckle imaging technology and to generate a good SC image. Accordingly, the operator 5321 can view, in real time on the display device 5319, an image of the surgical site in which the blood flow portion and the non-blood flow portion are accurately identified, and can perform the surgery more safely.
Additionally, the present technology may also be configured as below.
(1) A medical system comprising:
 a first irradiation means for irradiating a subject with incoherent visible light;
 a second irradiation means for irradiating the subject with coherent near-infrared light;
 a first imaging means for imaging the reflected light of the visible light from the subject;
 a second imaging means for imaging the reflected light of the near-infrared light from the subject;
 an acquisition means for acquiring a visible light image from the first imaging means and a speckle image from the second imaging means;
 a setting means for setting, based on the visible light image, a calculation region for statistically processing speckle luminance values; and
 a generation means for generating a predetermined image based on the speckle image and the calculation region.
(2) The medical system according to (1), wherein the medical system is a microscopic surgery system or an endoscopic surgery system.
(3) An information processing method comprising:
 an acquisition step of acquiring a first image from a first imaging unit that captures the reflected light of incoherent light irradiated onto a subject, and acquiring a second image from a second imaging unit that captures the reflected light of coherent light irradiated onto the subject;
 a setting step of setting a calculation region based on the first image; and
 a generation step of generating a predetermined image based on the second image and the calculation region.
(4) The information processing method according to (3), wherein the incoherent light is incoherent visible light, the coherent light is coherent near-infrared light, the first image is a visible light image, the second image is a speckle image, and the predetermined image is a speckle contrast image.
(5) The information processing method according to (4), wherein a degree of association is calculated, based on the visible light image, between the pixel of interest and each of a plurality of peripheral pixels from at least one of distance, difference in luminance, and difference in color; the calculation region for the pixel of interest is set in at least a part of the plurality of peripheral pixels based on the degree of association; and the speckle contrast image is generated by calculating a speckle contrast value for the pixel of interest based on the speckle image and the calculation region.
(6) The information processing method according to (5), wherein, when the speckle contrast value of the pixel of interest is calculated based on the speckle image and the calculation region, the speckle contrast value is calculated by weighting each of the peripheral pixels with its degree of association.
(7) An information processing method comprising:
 an acquisition step of acquiring a third image from a third imaging unit that captures fluorescence from a subject, and acquiring a second image from a second imaging unit that captures the reflected light of coherent light irradiated onto the subject;
 a setting step of setting a calculation region based on the third image; and
 a generation step of generating a predetermined image based on the second image and the calculation region.

In the above, the speckle contrast value has been described as an example of a value calculated by performing statistical processing on speckle luminance values, but the invention is not limited to this; BR (Blur Rate), SBR (Square BR), MBR (Mean BR), and the like may also be used.

Abstract

A medical system (1) comprises a first irradiation means (2) for radiating incoherent visible light to a subject, a second irradiation means (3) for radiating coherent near-infrared light to the subject, a first imaging means (5) for capturing an image of the visible light reflected from the subject, a second imaging means (6) for capturing an image of the near-infrared light reflected from the subject, an acquisition means (711) for acquiring a visible-light image from the first imaging means (5) and acquiring a speckle image from the second imaging means (6), a setting means (712) for setting a calculation region for performing statistical processing of speckle luminance values on the basis of the visible-light image, and a generating means (712) for generating a predetermined image on the basis of the speckle image and the calculation region.

Description

Medical system and information processing method

The present disclosure relates to a medical system and an information processing method.

For example, in the field of medical systems, development of speckle imaging technology that enables constant observation of blood flow and lymph flow is underway. Here, speckle is a phenomenon in which a granular pattern arises because irradiated coherent light is reflected and interferes due to minute irregularities on the surface of a subject (object). Based on this speckle phenomenon, for example, a blood flow part and a non-blood flow part in a living body serving as the subject can be identified.

Specifically, in a blood flow part the speckle contrast value becomes small because red blood cells and the like that reflect the coherent light are moving, whereas in a non-blood flow part the whole region is stationary and the speckle contrast value becomes large. Therefore, the blood flow part and the non-blood flow part can be distinguished based on a speckle contrast image generated using the speckle contrast value of each pixel.

Besides the speckle contrast value, values calculated by performing statistical processing on speckle luminance values include BR (Blur Rate), SBR (Square BR), MBR (Mean BR), and the like.
JP 2017-170064 A
However, in conventional speckle imaging technology, the size of the calculation region for statistically processing speckle luminance values (for example, 5 × 5 pixels) is fixed. If, for example, the calculation region contains both a blood flow part and a non-blood flow part, this is not optimal in terms of the accuracy of the calculated value (for example, the speckle contrast value). Thus, when a predetermined image is generated based on an image formed by predetermined light from a subject and a calculation region, there is room for improvement regarding the calculation region.

Therefore, the present disclosure proposes a medical system and an information processing method capable of generating a highly accurate predetermined image based on an image formed by predetermined light from a subject and a more appropriate calculation region.

To solve the above problem, a medical system according to one aspect of the present disclosure includes: a first irradiation means for irradiating a subject with incoherent visible light; a second irradiation means for irradiating the subject with coherent near-infrared light; a first imaging means for imaging the reflected light of the visible light from the subject; a second imaging means for imaging the reflected light of the near-infrared light from the subject; an acquisition means for acquiring a visible light image from the first imaging means and a speckle image from the second imaging means; a setting means for setting, based on the visible light image, a calculation region for statistically processing speckle luminance values; and a generation means for generating a predetermined image based on the speckle image and the calculation region.
FIG. 1 is a diagram showing a configuration example of a medical system according to a first embodiment of the present disclosure. FIG. 2 is a diagram showing a configuration example of an information processing device according to the first embodiment of the present disclosure. FIG. 3 is a diagram showing an example of an SC image of a pseudo blood vessel. FIG. 4 is a diagram showing an example of an SC image with a processing size of 3 × 3 and an example of an SC image with a processing size of 11 × 11. FIG. 5 is a diagram showing an example of a calculation region in a comparative example. FIG. 6 is an explanatory diagram of a calculation example using a calculation region in the first embodiment of the present disclosure. FIG. 7 is a flowchart showing SC image generation processing by the information processing device according to the first embodiment of the present disclosure. FIG. 8 is a diagram showing a schematic image example in the first embodiment of the present disclosure. FIG. 9 is a diagram showing a configuration example of a medical system according to a second embodiment of the present disclosure. FIG. 10 is a diagram showing a configuration example of a medical system according to a third embodiment of the present disclosure. FIG. 11 is a schematic diagram showing a mixed image in a fourth embodiment of the present disclosure. FIG. 12 is a diagram showing an example of a schematic configuration of an endoscopic surgery system according to Application Example 1 of the present disclosure. FIG. 13 is a block diagram showing an example of the functional configuration of the camera head and the CCU shown in FIG. 12. FIG. 14 is a diagram showing an example of a schematic configuration of a microscopic surgery system according to Application Example 2 of the present disclosure. FIG. 15 is a diagram showing a state of surgery using the microscopic surgery system shown in FIG. 14.
Hereinafter, embodiments of the present disclosure will be described in detail with reference to the drawings. In each of the following embodiments, the same components are denoted by the same reference numerals, and redundant description is omitted as appropriate.
First, the significance of the present invention will be explained. In the medical field, evaluation of blood flow is often important. For example, in bypass surgery in neurosurgery, patency (blood flow) is confirmed after connecting blood vessels, and in aneurysm clipping, the inflow of blood into the aneurysm is checked after clipping. For these applications, blood flow evaluation has so far been performed with an ultrasonic Doppler blood flow meter or by angiography using an ICG (Indocyanine Green) drug.

However, because the ultrasonic Doppler blood flow meter measures blood flow only at the single point touched by the probe, the distribution of blood flow over the entire surgical field cannot be known. There is also the risk of having to contact the cerebral blood vessels for the evaluation.

Angiography using an ICG drug exploits the characteristic that the ICG drug binds to plasma proteins in the living body and fluoresces under near-infrared excitation light, and it is an invasive observation requiring drug administration. In terms of blood flow evaluation, the flow must also be judged from the change immediately after administration of the ICG drug, so its use is restricted in terms of timing.

Under such circumstances, speckle imaging technology exists as a blood flow evaluation method that visualizes blood flow without drug administration. For example, Japanese Patent Laid-Open No. 2017-170064 discloses an optical device for perfusion evaluation using the speckle imaging technique, in which the principle of detecting movement (blood flow) from speckle generated by a laser is used. In the following, the case where speckle contrast is used as the index for motion detection will be described.
Speckle contrast is a value given by (standard deviation)/(mean) of the light intensity distribution. In a motionless portion, the speckle pattern is distributed locally from bright to dark, so the standard deviation of the intensity distribution is large and the speckle contrast (degree of glare) is high. In a moving portion, on the other hand, the speckle pattern changes with the motion. Considering that the speckle pattern is imaged by an observation system with a certain exposure time, the speckle pattern changes within the exposure time, so the captured speckle pattern is averaged and the speckle contrast (degree of glare) becomes low. In particular, the larger the motion, the more the averaging progresses and the lower the speckle contrast becomes. By evaluating the speckle contrast in this way, the magnitude of the amount of motion can be known.
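As a concrete illustration of this definition, the following sketch computes a per-pixel speckle contrast map over a fixed square window, that is, the conventional fixed-size calculation region discussed below. The window size and the uniform-filter-based mean/variance computation are illustrative choices, not the patent's implementation.

```python
# Minimal sketch of a speckle contrast map with a fixed window (assumption, for illustration).
import numpy as np
from scipy.ndimage import uniform_filter

def speckle_contrast_map(speckle: np.ndarray, window: int = 5) -> np.ndarray:
    """Per-pixel speckle contrast K = sigma / mean over a window x window region."""
    img = speckle.astype(np.float64)
    mean = uniform_filter(img, size=window)            # local mean <I>
    mean_sq = uniform_filter(img * img, size=window)   # local mean of I^2
    var = np.clip(mean_sq - mean * mean, 0.0, None)    # local variance, clipped for safety
    eps = 1e-12                                        # avoid division by zero
    return np.sqrt(var) / (mean + eps)

# Low contrast suggests motion (blood flow); high contrast suggests a static region.
```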
This method performs a statistical evaluation using the light intensities of the pixel of interest and multiple surrounding pixels. To obtain a statistically stable value, it is desirable to enlarge the calculation region (analysis region) and use a large number of pixels; this stabilizes the statistics and reduces the variation of the analyzed pixel values. However, a large calculation region also includes information from peripheral pixels far from the pixel of interest, which lowers the resolution. Shrinking the calculation region to avoid the loss of resolution is conceivable, but a small calculation region contains few pixels, so the statistically processed values vary widely and the image becomes noisy.

Moreover, as described above, the size of the calculation region for calculating the speckle contrast value of each pixel of interest has conventionally been set constant (for example, 5 × 5 pixels); if the calculation region contains both a blood flow part and a non-blood flow part, this is not optimal in terms of the accuracy of the speckle contrast value.

Therefore, a medical system and an information processing method that generate a highly accurate predetermined image based on an image formed by predetermined light from a subject and a more appropriate calculation region are described below.
(First embodiment)
[Configuration of the medical system according to the first embodiment]
FIG. 1 is a diagram showing a configuration example of the medical system 1 according to the first embodiment of the present disclosure. The medical system 1 according to the first embodiment includes a structure observation light source 2 (first irradiation means), a narrow band light source 3 (second irradiation means), a wavelength separation device 4, a color camera 5 (first imaging means), an IR camera 6 (second imaging means), and an information processing device 7. Each component will be described in detail below.
(1) Light Source
The structure observation light source 2 irradiates a subject with incoherent light (for example, incoherent visible light; hereinafter also simply called "visible light"). The narrow band light source 3 irradiates the subject with coherent light (for example, coherent near-infrared light; hereinafter also simply called "near-infrared light"). Coherent light is light in which the phase relationship of the light waves at any two points in the light flux is temporally invariant and constant, and which exhibits complete coherence even when the flux is split by an arbitrary method and then superimposed again with a large optical path difference. Incoherent light is light that does not have these properties of coherent light.
The wavelength of the coherent light output from the narrow band light source 3 according to the present disclosure is preferably about 800 to 900 nm, for example. With a wavelength of 830 nm, ICG observation and speckle observation can share the same optical system: since near-infrared light with a wavelength of 830 nm is generally used for ICG observation, using near-infrared light of the same wavelength for speckle observation makes speckle observation possible without changing the optical system of a microscope capable of ICG observation.
The wavelength of the coherent light emitted by the narrow band light source 3 is not limited to this, and various other wavelengths may be used. For example, when coherent light with a wavelength of 400 to 450 nm (ultraviolet to blue) or 700 to 900 nm (infrared) is used, wavelength separation from the visible light illumination is easy. When visible coherent light with a wavelength of 450 to 700 nm is used, lasers used in projectors and the like are readily available. When imagers other than Si are considered, coherent light with a wavelength of 900 nm or more can also be used. In the following, the case where near-infrared light with a wavelength of 830 nm is used as the coherent light is taken as an example.
The type of the narrow band light source 3 that emits the coherent light is not particularly limited as long as the effect of the present technology is not impaired. As the narrow band light source 3 that emits laser light, for example, an argon ion (Ar) laser, a helium-neon (He-Ne) laser, a dye laser, a krypton (Kr) laser, a semiconductor laser, or a solid-state laser combining a semiconductor laser and a wavelength conversion optical element can be used alone or in combination.
The subject is irradiated simultaneously with the visible light from the structure observation light source 2 and the near-infrared light from the narrow band light source 3. The type of the structure observation light source 2 is also not particularly limited as long as the effect of the present technology is not impaired; one example is a light emitting diode. Other light sources include a xenon lamp, a metal halide lamp, a high-pressure mercury lamp, and the like.
(2) Subject
The subject can be various things; for example, one containing fluid is suitable. Due to the nature of speckle, speckle is unlikely to be generated from a fluid. Therefore, when a subject containing fluid is imaged using the medical system 1 according to the present disclosure, the boundary between the fluid part and the non-fluid part, the flow velocity of the fluid part, and the like can be obtained.
More specifically, for example, the subject can be a living body whose fluid is blood. For example, by using the medical system 1 according to the present disclosure in microscopic surgery or endoscopic surgery, it is possible to perform surgery while confirming the positions of blood vessels. Therefore, safer and more accurate surgery can be performed, which can contribute to the further development of medical technology.
(3) Imaging Device
The color camera 5 images the reflected light (scattered light) of the visible light from the subject. The color camera 5 is, for example, an RGB (Red Green Blue) imager for visible light observation.

The IR camera 6 images the reflected light (scattered light) of the near-infrared light from the subject. The IR camera 6 is, for example, an IR (Infrared) imager for speckle observation.

The wavelength separation device 4 is, for example, a dichroic mirror. The wavelength separation device 4 separates the received near-infrared light (reflected light) and visible light (reflected light). The color camera 5 captures a visible light image (first image) obtained from the visible light separated by the wavelength separation device 4. The IR camera 6 captures a speckle image (second image) obtained from the near-infrared light separated by the wavelength separation device 4.
(4) Information Processing Device
Next, the information processing device 7 will be described with reference to FIG. 2. FIG. 2 is a diagram showing a configuration example of the information processing device 7 according to the first embodiment of the present disclosure. The information processing device 7 is an image processing device and includes, as its main components, a processing unit 71, a storage unit 72, an input unit 73, and a display unit 74. In the following, "SC" means speckle contrast (value).
The processing unit 71 is realized by, for example, a CPU (Central Processing Unit) and includes, as its main components, an acquisition unit 711, a setting unit 712, a generation unit 713, and a display control unit 714.
The acquisition unit 711 acquires a visible light image from the color camera 5 and a speckle image from the IR camera 6. It is assumed that correspondence has been established between the positions of the subject appearing in the visible light image and in the speckle image. That is, when the angles of view of the two images match, the images can be used as they are; when they do not match, at least one of the images is corrected so that the positions of the subject in the two images correspond.
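A hypothetical sketch of this angle-of-view correction is shown below; it assumes the two cameras have been calibrated offline so that a homography mapping the speckle image onto the visible light image is available. The function name and the calibration source are assumptions, not part of the patent.

```python
# Minimal sketch of registering the speckle image to the visible light image (assumption).
import cv2
import numpy as np

def align_to_visible(speckle: np.ndarray, homography: np.ndarray,
                     visible_shape: tuple) -> np.ndarray:
    h, w = visible_shape[:2]
    # homography is assumed to come from an offline calibration of the two cameras
    return cv2.warpPerspective(speckle, homography, (w, h))
```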
The setting unit 712 sets, based on the visible light image, a calculation region for statistically processing speckle luminance values. For example, the setting unit 712 sets a calculation region for calculating a speckle contrast value based on the visible light image. More specifically, for example, the setting unit 712 calculates, based on the visible light image, a degree of association between the pixel of interest and each of a plurality of peripheral pixels from at least one of distance, difference in luminance, and difference in color, and sets the calculation region for the pixel of interest in at least a part of the plurality of peripheral pixels based on the degree of association (details will be described later).
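The following sketch illustrates the idea of an association-weighted calculation region in a bilateral-filter style: each peripheral pixel is weighted by its spatial distance and its luminance difference in the visible light image, and a weighted speckle contrast is computed from the speckle image. The Gaussian weight forms and parameter values are illustrative assumptions, since the exact formula is described later in the patent, and the visible image is assumed to be a grayscale (luminance) image.

```python
# Minimal sketch of an association-weighted speckle contrast for one pixel of interest
# (assumption; the patent's exact association formula may differ).
import numpy as np

def weighted_sc(speckle, visible, y, x, radius=5, sigma_d=3.0, sigma_c=20.0):
    h, w = speckle.shape
    y0, y1 = max(0, y - radius), min(h, y + radius + 1)
    x0, x1 = max(0, x - radius), min(w, x + radius + 1)
    patch = speckle[y0:y1, x0:x1].astype(np.float64)
    vis = visible[y0:y1, x0:x1].astype(np.float64)

    yy, xx = np.mgrid[y0:y1, x0:x1]
    dist2 = (yy - y) ** 2 + (xx - x) ** 2               # spatial distance to the pixel of interest
    lum2 = (vis - float(visible[y, x])) ** 2            # luminance difference in the visible image
    weight = np.exp(-dist2 / (2 * sigma_d**2)) * np.exp(-lum2 / (2 * sigma_c**2))

    wsum = weight.sum()
    mean = (weight * patch).sum() / wsum                # association-weighted mean
    var = (weight * (patch - mean) ** 2).sum() / wsum   # association-weighted variance
    return np.sqrt(var) / (mean + 1e-12)
```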
 生成部713は、スペックル画像と、設定部712によって設定された計算領域に基いて、注目画素に関するスペックルコントラスト値を算出して、スペックルコントラスト画像(SC画像。所定画像)を生成する(詳細は後述)。 The generation unit 713 calculates the speckle contrast value for the pixel of interest based on the speckle image and the calculation area set by the setting unit 712, and generates the speckle contrast image (SC image. Predetermined image) ( Details will be described later).
The speckle contrast value of the i-th pixel (the pixel of interest) can be expressed by the following equation (1):

Speckle contrast value of the i-th pixel
  = (standard deviation of the intensities of the i-th pixel and its surrounding pixels)
  / (mean of the intensities of the i-th pixel and its surrounding pixels)   ... (1)
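As a rough illustration of equation (1), the following Python sketch computes an SC image over a fixed square window; the window size and the use of NumPy are illustrative choices, not a definitive implementation of the disclosed processing.

    import numpy as np

    def sc_image(speckle, size=3):
        """Speckle contrast per pixel: std / mean over a size x size window."""
        pad = size // 2
        padded = np.pad(speckle.astype(np.float64), pad, mode="reflect")
        h, w = speckle.shape
        sc = np.zeros((h, w))
        for y in range(h):
            for x in range(w):
                win = padded[y:y + size, x:x + size]
                mean = win.mean()
                sc[y, x] = win.std() / mean if mean > 0 else 0.0
        return sc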
Here, an example of an SC image will be described with reference to FIG. 3. FIG. 3 is a diagram showing an example SC image of a pseudo blood vessel. As shown in the SC image example of FIG. 3, many speckles are observed in the non-blood-flow portion, whereas almost no speckles are observed in the blood flow portion.
Returning to FIG. 2, the generation unit 713 identifies a fluid portion (for example, a blood flow portion) and a non-fluid portion (for example, a non-blood-flow portion) based on the SC image. More specifically, for example, the generation unit 713 can identify the blood flow portion and the non-blood-flow portion by determining, based on the SC image, whether the speckle contrast value (SC value) is equal to or greater than a predetermined SC threshold, and can also recognize the degree of blood flow in the blood flow portion.
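A minimal sketch of this identification step, reusing sc_image from the sketch above and assuming, consistent with FIG. 3, that flowing regions show low speckle contrast; the threshold value is an illustrative assumption.

    SC_THRESHOLD = 0.5                    # illustrative value; application-dependent

    sc = sc_image(speckle_aligned, size=3)
    static_mask = sc >= SC_THRESHOLD      # non-fluid (non-blood-flow) portion
    flow_mask = ~static_mask              # fluid (blood flow) portion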
The display control unit 714 causes the display unit 74 to display the SC image generated by the generation unit 713. In that case, it is preferable to display the SC image so that the blood flow portion and the non-blood-flow portion are distinguishable. The display control unit 714 can also superimpose the SC image on the visible light image and display the result on the display unit 74.
The storage unit 72 stores various information such as the visible light image and the speckle image acquired by the acquisition unit 711, the calculation results of the respective units of the processing unit 71, and various thresholds. A storage device external to the medical system 1 may be used instead of the storage unit 72.
The input unit 73 is a means for the user to input information, and is, for example, a keyboard or a mouse.
Under the control of the display control unit 714, the display unit 74 displays various information such as the visible light image and the speckle image acquired by the acquisition unit 711, the calculation results of the respective units of the processing unit 71, and various thresholds. A display device external to the medical system 1 may be used instead of the display unit 74.
Next, the relationship between the processing size (the size of the calculation region) and the degree of glare (graininess) in the SC image will be described with reference to FIG. 4. FIG. 4 shows an example SC image with a processing size of 3 × 3 (pixels) and an example SC image with a processing size of 11 × 11 (pixels). Both example SC images were computed with a fixed calculation region.
As can be seen from FIG. 4, the 3 × 3 SC image tends to preserve detail (per-pixel information), but because the processing size is small, the SC values are unstable and the image is noisy and grainy. The 11 × 11 SC image, on the other hand, has stable SC values and little noise because the processing size is large, but detail is easily lost; for example, edge portions (the boundaries between the blood flow portion and the non-blood-flow portion) become blurred.
Next, an example of calculation regions in a comparative example (prior art) will be described with reference to FIG. 5. FIG. 5 is a diagram showing example calculation regions in the comparative example. In the comparative example, calculation regions (for example, 3 × 3 pixels; for example, calculation regions R1 and R2) are set without regard to the boundary between the region RM with motion (flow) and the region RS without motion (flow). Consequently, when a calculation region contains both the moving region RM and the static region RS, the accuracy of the speckle contrast value deteriorates.
Next, a calculation example using the calculation region according to the first embodiment of the present disclosure will be described with reference to FIG. 6. FIG. 6 is an explanatory diagram of a calculation example using the calculation region according to the first embodiment of the present disclosure.
The setting unit 712 calculates, based on the visible light image, a degree of relevance between the pixel of interest (for example, pixel R12) and each of a plurality of surrounding pixels from at least one of their distance, luminance difference, and color difference, and sets, based on the degrees of relevance, a calculation region for the pixel of interest (for example, region R11) in at least some of the surrounding pixels. In the visible light image of FIG. 6, the degree of relevance is indicated for each surrounding pixel. For example, the shorter the distance, the higher the relevance is judged to be. Likewise, the smaller the luminance difference, the higher the relevance.
Similarly, the smaller the color difference, the higher the relevance is judged to be. In that case, for example, the overall difference across RGB may be used, or the difference in one or two of the RGB channels may be used. For example, when the subject is a living body having a blood flow portion and a non-blood-flow portion, blood is red, so only the difference in the R channel of RGB may be used.
The generation unit 713, for example, uses as the calculation region only those surrounding pixels whose degree of relevance (between 0 and 1 inclusive) is 0.5 or greater.
Then, in the IR image (speckle image), the generation unit 713 calculates the speckle contrast value for the pixel of interest (for example, pixel R12) based on the calculation region (for example, region R11) set by the setting unit 712, and generates the SC image. At that time, for example, the generation unit 713 may calculate the speckle contrast value by weighting each surrounding pixel by its degree of relevance (for example, multiplying by the relevance). The same applies to pixel R14 and region R13 as to pixel R12 and region R11.
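A minimal sketch of this relevance-based region selection and weighted SC calculation, assuming for illustration that relevance is a product of Gaussian kernels over distance and luminance difference in a guide (visible light) image; the kernel widths, the 0.5 cutoff from above, and the weighting scheme are illustrative choices, and color difference could be folded in as a third factor in the same way.

    import numpy as np

    def weighted_sc(speckle, guide, y, x, size=11, sigma_d=3.0, sigma_l=20.0):
        """SC for pixel (y, x), using only surrounding pixels whose relevance
        (from distance and guide-image luminance difference) is >= 0.5."""
        s = speckle.astype(np.float64)
        g = guide.astype(np.float64)
        pad = size // 2
        h, w = s.shape
        vals, weights = [], []
        for dy in range(-pad, pad + 1):
            for dx in range(-pad, pad + 1):
                yy, xx = y + dy, x + dx
                if not (0 <= yy < h and 0 <= xx < w):
                    continue
                rel = (np.exp(-(dy * dy + dx * dx) / (2 * sigma_d ** 2)) *
                       np.exp(-((g[yy, xx] - g[y, x]) ** 2) / (2 * sigma_l ** 2)))
                if rel >= 0.5:                 # keep only highly relevant pixels
                    vals.append(s[yy, xx])
                    weights.append(rel)
        vals = np.array(vals)
        weights = np.array(weights)
        mean = np.average(vals, weights=weights)
        std = np.sqrt(np.average((vals - mean) ** 2, weights=weights))
        return std / mean if mean > 0 else 0.0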
[SC Image Generation Processing According to the First Embodiment]
Next, the SC image generation processing performed by the information processing device 7 according to the first embodiment of the present disclosure will be described with reference to FIG. 7. FIG. 7 is a flowchart showing the SC image generation processing performed by the information processing device 7 according to the first embodiment of the present disclosure.
First, in step S1, the acquisition unit 711 acquires a visible light image from the color camera 5. Next, in step S2, the acquisition unit 711 acquires a speckle image from the IR camera 6.
Next, in step S3, the setting unit 712 calculates, based on the visible light image, the degree of relevance between the pixel of interest and each of a plurality of surrounding pixels. Next, in step S4, the setting unit 712 sets a calculation region for the pixel of interest in at least some of the surrounding pixels based on the degrees of relevance calculated in step S3.
Next, in step S5, the generation unit 713 calculates the speckle contrast value for the pixel of interest based on the speckle image and the calculation region set in step S4, and generates the SC image.
Next, in step S6, the display control unit 714 causes the display unit 74 to display the SC image generated in step S5. After step S6, the processing ends.
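Putting the sketches above together, the flow of steps S1 to S6 might look as follows; this reuses the hypothetical weighted_sc helper defined earlier, treats the images as already acquired and registered, and stubs out the display step, so it is an end-to-end sketch rather than the actual implementation.

    import numpy as np

    def sc_generation_pipeline(visible, speckle, size=11):
        """Steps S1-S6 of FIG. 7, as a rough end-to-end sketch."""
        # S1/S2: visible and speckle images assumed acquired and registered.
        guide = visible.mean(axis=2) if visible.ndim == 3 else visible  # luminance guide
        h, w = speckle.shape
        sc = np.zeros((h, w))
        # S3/S4/S5: relevance, region setting, and SC value for each pixel of interest.
        for y in range(h):
            for x in range(w):
                sc[y, x] = weighted_sc(speckle, guide, y, x, size=size)
        # S6: hand the SC image to the display side (stubbed out here).
        return sc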
As described above, according to the information processing device 7 of the first embodiment, a highly accurate predetermined image (SC image) can be generated based on an image formed by predetermined light from the subject (the speckle image) and a more appropriate calculation region.
Specifically, for example, as shown in FIG. 6, when calculating the SC value for a pixel of interest in the static region RS (for example, pixel R12), only the surrounding pixels within the static region RS (the pixels in region R11) can be used. Likewise, when calculating the SC value for a pixel of interest in the moving region RM (pixel R14), only the surrounding pixels within the moving region RM (the pixels in region R13) can be used. The accuracy of the SC value can therefore be improved.
Here, FIG. 8 is a diagram showing schematic example images in the first embodiment of the present disclosure. FIG. 8(a) is a schematic diagram of a visible light image. FIG. 8(b) is a schematic diagram of a speckle image. FIG. 8(c) is a schematic diagram of an image created by the processing of the present embodiment. FIG. 8(d) is a schematic diagram of a comparative image created by conventional processing with a fixed calculation region size.
In FIGS. 8(a) and 8(b), the blood vessel BT1 is a thick blood vessel with blood flow, the blood vessel BT2 is a thin blood vessel with blood flow, and the blood vessel BT3 is a thin blood vessel without blood flow.
The processed image of FIG. 8(c) and the comparative image of FIG. 8(d) are alike in that the blood vessel BT3 does not appear in either. In the processed image of FIG. 8(c), however, both blood vessels BT1 and BT2 are displayed clearly compared with the comparative image of FIG. 8(d). That is, in the comparative image of FIG. 8(d), the edges of the flowing portion of blood vessel BT1 are blurred, and blood vessel BT2 is blurred and hard to see (or invisible).
In other words, in the processed image (FIG. 8(c)), when calculating the SC value for a pixel of interest in the static region RS, only the surrounding pixels within the static region RS are used, and when calculating the SC value for a pixel of interest in the moving region RM, only the surrounding pixels within the moving region RM are used. This improves the accuracy of the SC value and allows, for example, even the thin blood vessel BT2 to be displayed clearly.
The ability to display thin blood vessels clearly is useful in various fields of blood flow confirmation. As one example, consider an application to aneurysm clipping. In aneurysm clipping, the aneurysm is pinched with a clip to block blood flow into the aneurysm and prevent its future rupture. Thin blood vessels called perforators sometimes run near the aneurysm. Although a perforator is a thin vessel, it is a very important one that supplies nutrients to part of the brain. If a clip stops its blood flow, that part of the brain is deprived of nutrients and functional impairment can result. Displaying the blood flow of thin vessels in real time and thereby avoiding clipping them by mistake is therefore of great utility.
Another possible application is cerebral artery bypass surgery. In cerebral artery bypass surgery, an artery from another site is connected to an artery with poor cerebral blood flow to increase the flow of arterial blood. In this operation, blood flow is increased by opening the bypass of the cerebral artery; being able to see the blood flow in thin vessels makes it possible to confirm the increase in blood flow in the capillaries that supply nutrients to the brain tissue, and thus to confirm the effect of the surgery.
As yet another application, grasping blood flow obstruction caused by a thrombus or the like in more detail may make it possible to respond to it. If a thrombus forms in a thin blood vessel and obstructs blood flow, serious sequelae can result. Being able to locate where the thrombus has formed, down to thin vessels, enables countermeasures such as removing it. Beyond these, there are many other situations in which clearly displaying the blood flow of thin vessels is useful.
On the other hand, there is, for example, a conventional method that evaluates the luminance change of the pixel of interest in the time direction. This method can yield a high-resolution image because it does not use information from surrounding pixels, but it has drawbacks such as an increased amount of computation and processing delay caused by operating over many frames, and blurring of moving subjects. The method of the present embodiment has no such drawbacks.
(Second Embodiment)
Next, the medical system 1 of the second embodiment will be described. Duplicate descriptions of matters in common with the first embodiment are omitted as appropriate. FIG. 9 is a diagram showing a configuration example of the medical system 1 according to the second embodiment of the present disclosure. Compared with the medical system 1 according to the first embodiment (FIG. 1), the medical system 1 according to the second embodiment includes a control device 8 and a camera 9 in place of the wavelength separation device 4, the color camera 5, and the IR camera 6.
In accordance with instructions from the camera 9, the control device 8 switches between irradiation with visible light from the structure observation light source 2 and irradiation with near-infrared light from the narrow band light source 3 at predetermined time intervals. Besides switching the illumination light sources, there is also, for example, a method of switching filters on the observation camera, and a method that combines switching of the illumination light sources with switching of the camera filters.
The camera 9 is an imager that combines the functions of an RGB imager for visible light observation and an IR imager for speckle observation. The camera 9 also outputs, to the control device 8, a control signal for switching between irradiation with visible light from the structure observation light source 2 and irradiation with near-infrared light from the narrow band light source 3. The information processing device 7 is the same as in the first embodiment.
Thus, according to the medical system 1 of the second embodiment, the single camera 9 captures visible light images and speckle images alternately in time, and the information processing device 7 can use those images to perform the same processing as in the first embodiment. The device configuration becomes simpler, and since visible light images and speckle images with the same angle of view are easily obtained, the processing also becomes simpler.
(Third Embodiment)
Next, the medical system 1 of the third embodiment will be described. Duplicate descriptions of matters in common with the first embodiment are omitted as appropriate. FIG. 10 is a diagram showing a configuration example of the medical system 1 according to the third embodiment of the present disclosure. Compared with the medical system 1 according to the first embodiment (FIG. 1), the medical system 1 according to the third embodiment includes a fluorescence excitation light source 10 and a fluorescence observation camera 11 (third imaging means) in place of the structure observation light source 2 and the color camera 5, respectively.
The fluorescence excitation light source 10 irradiates the subject with excitation light for exciting fluorescence. Prior to this, a fluorescent agent such as an ICG agent or fluorescein is administered to the living body that is the subject. Whereas the first and second embodiments set the calculation region based on the visible light image, the third embodiment sets the calculation region based on the fluorescence observation image. Since ICG agents and fluorescein are agents used for angiography, the fluorescence observation image reflects the course of the blood vessels and is useful for setting the calculation region.
In the information processing device 7, the acquisition unit 711 acquires a fluorescence observation image (third image) from the fluorescence observation camera 11, which images the fluorescence from the subject. The setting unit 712 sets the calculation region based on the fluorescence observation image. The generation unit 713 then generates the SC image based on the speckle image and the calculation region set by the setting unit 712.
Thus, according to the medical system 1 of the third embodiment, the SC values of the blood flow portion can be calculated using, for example, only the regions with high fluorescence luminance values (blood flow regions) as the calculation region, so a highly accurate SC image can be generated. Although fluorescence observation using an ICG agent or fluorescein has been described as an example, fluorescence observation using another agent may be used.
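A minimal sketch of constraining the calculation region with a fluorescence observation image, assuming a simple luminance threshold; the threshold value and the array names are illustrative assumptions.

    import numpy as np

    FLUO_THRESHOLD = 100  # illustrative luminance cutoff for the blood flow region

    def masked_sc(speckle, fluorescence, y, x, size=11):
        """SC for pixel (y, x), using only surrounding pixels whose
        fluorescence luminance marks them as blood flow region."""
        pad = size // 2
        y0, y1 = max(0, y - pad), min(speckle.shape[0], y + pad + 1)
        x0, x1 = max(0, x - pad), min(speckle.shape[1], x + pad + 1)
        win = speckle[y0:y1, x0:x1].astype(np.float64)
        mask = fluorescence[y0:y1, x0:x1] >= FLUO_THRESHOLD
        vals = win[mask]
        if vals.size == 0 or vals.mean() == 0:
            return 0.0
        return vals.std() / vals.mean()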
(Fourth Embodiment)
Next, the medical system 1 of the fourth embodiment will be described. Duplicate descriptions of matters in common with the first embodiment are omitted as appropriate. FIG. 11 is a schematic diagram showing a mixed image according to the fourth embodiment of the present disclosure. This mixed image is obtained, for example, by a technique using so-called space-division two-step exposure. Within a single frame, the mixed image contains visible light pixels and IR pixels alternately in both the vertical and horizontal directions. In the mixed image of FIG. 11, the lighter pixels are visible light pixels and the darker pixels are IR pixels.
The processing unit 71 of the information processing device 7 acquires such a mixed image from an imaging device (not shown). The processing unit 71 of the information processing device 7 can then extract the visible light pixels and the IR pixels from the mixed image and perform the same processing as when the visible light image and the speckle image are acquired separately.
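A minimal sketch of this extraction, assuming a checkerboard layout in which pixels whose coordinate parity (y + x) is even are visible light pixels and the rest are IR pixels; the parity convention is an illustrative assumption, since FIG. 11 only specifies that the two kinds of pixels alternate in both directions.

    import numpy as np

    def split_mixed(mixed):
        """Split a single mixed frame into sparse visible and IR images."""
        h, w = mixed.shape
        yy, xx = np.mgrid[0:h, 0:w]
        visible_mask = (yy + xx) % 2 == 0     # assumed parity of visible pixels
        visible = np.where(visible_mask, mixed, 0)
        ir = np.where(~visible_mask, mixed, 0)
        # In practice the missing samples would be interpolated (demosaiced)
        # before applying the processing of the first embodiment.
        return visible, ir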
Thus, according to the medical system 1 of the fourth embodiment, using a single type of mixed image makes it possible to perform the same processing as in the first embodiment with a single imaging device and without switching the irradiation of the structure observation light source 2 and the narrow band light source 3 at predetermined time intervals.
(Application Example 1)
The technology according to the present disclosure can be applied to various products. For example, the technology according to the present disclosure may be applied to an endoscopic surgery system.
FIG. 12 is a diagram showing an example of a schematic configuration of an endoscopic surgery system 5000 to which the technology according to the present disclosure can be applied. FIG. 12 illustrates a surgeon (doctor) 5067 performing surgery on a patient 5071 on a patient bed 5069 using the endoscopic surgery system 5000. As shown in the figure, the endoscopic surgery system 5000 is composed of an endoscope 5001, other surgical tools 5017, a support arm device 5027 that supports the endoscope 5001, and a cart 5037 on which various devices for endoscopic surgery are mounted.
In endoscopic surgery, instead of cutting the abdominal wall to open the abdomen, a plurality of tubular opening instruments called trocars 5025a to 5025d are punctured into the abdominal wall. Then, the lens barrel 5003 of the endoscope 5001 and the other surgical tools 5017 are inserted into the body cavity of the patient 5071 through the trocars 5025a to 5025d. In the illustrated example, a pneumoperitoneum tube 5019, an energy treatment tool 5021, and forceps 5023 are inserted into the body cavity of the patient 5071 as the other surgical tools 5017. The energy treatment tool 5021 is a treatment tool that incises and dissects tissue, seals blood vessels, and so on using high-frequency current or ultrasonic vibration. The illustrated surgical tools 5017 are merely examples, and various surgical tools generally used in endoscopic surgery, such as tweezers and retractors, may be used as the surgical tools 5017.
An image of the surgical site in the body cavity of the patient 5071 captured by the endoscope 5001 is displayed on the display device 5041. While viewing the image of the surgical site displayed on the display device 5041 in real time, the surgeon 5067 performs procedures such as excising the affected area using the energy treatment tool 5021 and the forceps 5023. Although not illustrated, the pneumoperitoneum tube 5019, the energy treatment tool 5021, and the forceps 5023 are supported by the surgeon 5067, an assistant, or the like during surgery.
(Support Arm Device)
The support arm device 5027 includes an arm portion 5031 extending from a base portion 5029. In the illustrated example, the arm portion 5031 is composed of joint portions 5033a, 5033b, and 5033c and links 5035a and 5035b, and is driven under the control of an arm control device 5045. The endoscope 5001 is supported by the arm portion 5031, and its position and posture are controlled. Stable fixing of the position of the endoscope 5001 can thereby be realized.
(Endoscope)
The endoscope 5001 is composed of a lens barrel 5003, a region of predetermined length from the distal end of which is inserted into the body cavity of the patient 5071, and a camera head 5005 connected to the base end of the lens barrel 5003. The illustrated example shows the endoscope 5001 configured as a so-called rigid endoscope having a rigid lens barrel 5003, but the endoscope 5001 may instead be configured as a so-called flexible endoscope having a flexible lens barrel 5003.
An opening into which an objective lens is fitted is provided at the distal end of the lens barrel 5003. A light source device 5043 is connected to the endoscope 5001; the light generated by the light source device 5043 is guided to the distal end of the lens barrel by a light guide extending inside the lens barrel 5003 and is irradiated through the objective lens toward the observation target (subject) in the body cavity of the patient 5071. The endoscope 5001 may be a forward-viewing endoscope, an oblique-viewing endoscope, or a side-viewing endoscope.
An optical system and an imaging element are provided inside the camera head 5005, and reflected light (observation light) from the observation target is focused onto the imaging element by the optical system. The observation light is photoelectrically converted by the imaging element, and an electric signal corresponding to the observation light, that is, an image signal corresponding to the observation image, is generated. The image signal is transmitted as RAW data to a camera control unit (CCU) 5039. The camera head 5005 has a function of adjusting the magnification and the focal length by appropriately driving its optical system.
Note that the camera head 5005 may be provided with a plurality of imaging elements in order to support, for example, stereoscopic vision (3D display). In this case, a plurality of relay optical systems are provided inside the lens barrel 5003 to guide the observation light to each of the plurality of imaging elements.
(Various Devices Mounted on the Cart)
The CCU 5039 is composed of a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), and the like, and comprehensively controls the operations of the endoscope 5001 and the display device 5041. Specifically, the CCU 5039 applies to the image signal received from the camera head 5005 various kinds of image processing, such as development processing (demosaic processing), for displaying an image based on that image signal. The CCU 5039 provides the display device 5041 with the image signal that has undergone the image processing. The CCU 5039 also transmits a control signal to the camera head 5005 to control its driving. The control signal may include information on imaging conditions such as magnification and focal length.
Under the control of the CCU 5039, the display device 5041 displays an image based on the image signal processed by the CCU 5039. When the endoscope 5001 supports high-resolution imaging such as 4K (3840 horizontal pixels × 2160 vertical pixels) or 8K (7680 horizontal pixels × 4320 vertical pixels), and/or supports 3D display, a display device capable of high-resolution display and/or 3D display can be used as the display device 5041 accordingly. When the endoscope supports high-resolution imaging such as 4K or 8K, using a display device 5041 with a size of 55 inches or more gives a greater sense of immersion. A plurality of display devices 5041 with different resolutions and sizes may also be provided depending on the application.
The light source device 5043 is composed of a light source such as an LED (light emitting diode), and supplies the endoscope 5001 with irradiation light for imaging the surgical site.
The arm control device 5045 is composed of a processor such as a CPU, and operates according to a predetermined program to control the driving of the arm portion 5031 of the support arm device 5027 according to a predetermined control method.
The input device 5047 is an input interface to the endoscopic surgery system 5000. Through the input device 5047, the user can input various information and instructions to the endoscopic surgery system 5000. For example, the user inputs, via the input device 5047, various information about the surgery, such as the patient's physical information and information about the surgical procedure. The user also inputs, via the input device 5047, for example, an instruction to drive the arm portion 5031, an instruction to change the imaging conditions of the endoscope 5001 (type of irradiation light, magnification, focal length, and the like), and an instruction to drive the energy treatment tool 5021.
The type of the input device 5047 is not limited, and the input device 5047 may be any of various known input devices. As the input device 5047, for example, a mouse, a keyboard, a touch panel, a switch, a foot switch 5057, and/or a lever can be applied. When a touch panel is used as the input device 5047, the touch panel may be provided on the display surface of the display device 5041.
Alternatively, the input device 5047 is a device worn by the user, such as a glasses-type wearable device or an HMD (Head Mounted Display), and various inputs are performed according to the user's gestures or line of sight detected by these devices. The input device 5047 also includes a camera capable of detecting the user's movement, and various inputs are performed according to the user's gestures or line of sight detected from video captured by the camera. Further, the input device 5047 includes a microphone capable of picking up the user's voice, and various inputs are performed by voice through the microphone. Because the input device 5047 is thus configured to accept various information without contact, a user belonging to the clean area (for example, the surgeon 5067) can operate equipment belonging to the unclean area without contact. In addition, since the user can operate equipment without taking a hand off the surgical tool being held, the user's convenience is improved.
The treatment tool control device 5049 controls the driving of the energy treatment tool 5021 for tissue cauterization, incision, blood vessel sealing, or the like. For the purpose of securing the field of view of the endoscope 5001 and securing the surgeon's working space, the pneumoperitoneum device 5051 sends gas into the body cavity of the patient 5071 through the pneumoperitoneum tube 5019 to inflate the body cavity. The recorder 5053 is a device capable of recording various information about the surgery. The printer 5055 is a device capable of printing various information about the surgery in various formats such as text, images, or graphs.
Hereinafter, particularly characteristic configurations of the endoscopic surgery system 5000 will be described in more detail.
(Support Arm Device)
The support arm device 5027 includes the base portion 5029, which serves as a base, and the arm portion 5031 extending from the base portion 5029. In the illustrated example, the arm portion 5031 is composed of the plurality of joint portions 5033a, 5033b, and 5033c and the plurality of links 5035a and 5035b connected by the joint portion 5033b, but in FIG. 12 the configuration of the arm portion 5031 is illustrated in simplified form for simplicity. In practice, the shapes, numbers, and arrangement of the joint portions 5033a to 5033c and the links 5035a and 5035b, the directions of the rotation axes of the joint portions 5033a to 5033c, and the like can be set appropriately so that the arm portion 5031 has the desired degrees of freedom. For example, the arm portion 5031 can suitably be configured to have six or more degrees of freedom. This makes it possible to move the endoscope 5001 freely within the movable range of the arm portion 5031, so that the lens barrel 5003 of the endoscope 5001 can be inserted into the body cavity of the patient 5071 from a desired direction.
Actuators are provided in the joint portions 5033a to 5033c, and the joint portions 5033a to 5033c are configured to be rotatable about predetermined rotation axes by driving the actuators. The driving of the actuators is controlled by the arm control device 5045, whereby the rotation angles of the joint portions 5033a to 5033c are controlled and the driving of the arm portion 5031 is controlled. Control of the position and posture of the endoscope 5001 can thereby be realized. At this time, the arm control device 5045 can control the driving of the arm portion 5031 by various known control methods such as force control or position control.
For example, when the surgeon 5067 performs appropriate operation inputs via the input device 5047 (including the foot switch 5057), the driving of the arm portion 5031 may be appropriately controlled by the arm control device 5045 in accordance with those operation inputs, and the position and posture of the endoscope 5001 may be controlled. With this control, the endoscope 5001 at the distal end of the arm portion 5031 can be moved from an arbitrary position to another arbitrary position and then fixedly supported at the position after the movement. The arm portion 5031 may be operated by a so-called master-slave method. In this case, the arm portion 5031 can be remotely operated by the user via the input device 5047 installed at a location away from the operating room.
When force control is applied, the arm control device 5045 may perform so-called power assist control, in which it receives an external force from the user and drives the actuators of the joint portions 5033a to 5033c so that the arm portion 5031 moves smoothly in accordance with that external force. Thus, when the user moves the arm portion 5031 while directly touching it, the arm portion 5031 can be moved with a relatively light force. The endoscope 5001 can therefore be moved more intuitively and with simpler operations, improving the user's convenience.
In general, in endoscopic surgery, the endoscope 5001 has been supported by a doctor called a scopist. In contrast, using the support arm device 5027 makes it possible to fix the position of the endoscope 5001 more reliably without relying on human hands, so that an image of the surgical site can be obtained stably and the surgery can be performed smoothly.
The arm control device 5045 does not necessarily have to be provided on the cart 5037. The arm control device 5045 also does not necessarily have to be a single device. For example, an arm control device 5045 may be provided in each of the joint portions 5033a to 5033c of the arm portion 5031 of the support arm device 5027, and the driving of the arm portion 5031 may be controlled by a plurality of arm control devices 5045 cooperating with one another.
(Light Source Device)
The light source device 5043 supplies the endoscope 5001 with irradiation light for imaging the surgical site. The light source device 5043 is composed of a white light source constituted by, for example, an LED, a laser light source, or a combination thereof. When the white light source is constituted by a combination of RGB laser light sources, the output intensity and output timing of each color (each wavelength) can be controlled with high accuracy, so the white balance of the captured image can be adjusted in the light source device 5043. In this case, it is also possible to irradiate the observation target with laser light from each of the RGB laser light sources in a time-division manner and control the driving of the imaging element of the camera head 5005 in synchronization with the irradiation timing, thereby capturing images corresponding to each of R, G, and B in a time-division manner. According to this method, a color image can be obtained without providing a color filter on the imaging element.
The driving of the light source device 5043 may also be controlled so as to change the intensity of the output light at predetermined time intervals. By controlling the driving of the imaging element of the camera head 5005 in synchronization with the timing of the change in light intensity to acquire images in a time-division manner and combining those images, an image with a high dynamic range free of so-called crushed blacks and blown-out whites can be generated.
The light source device 5043 may also be configured to be able to supply light in a predetermined wavelength band corresponding to special light observation. In special light observation, for example, so-called narrow band imaging is performed, in which the wavelength dependence of light absorption in body tissue is exploited by irradiating light in a narrower band than the irradiation light during ordinary observation (that is, white light), thereby imaging predetermined tissue such as blood vessels in the mucosal surface layer with high contrast. Alternatively, in special light observation, fluorescence observation may be performed, in which an image is obtained from the fluorescence generated by irradiation with excitation light. In fluorescence observation, body tissue may be irradiated with excitation light and the fluorescence from that body tissue observed (autofluorescence observation), or a reagent such as indocyanine green (ICG) may be locally injected into body tissue and the body tissue irradiated with excitation light corresponding to the fluorescence wavelength of that reagent to obtain a fluorescence image, among other possibilities. The light source device 5043 can be configured to be able to supply narrow band light and/or excitation light corresponding to such special light observation.
(Camera Head and CCU)
The functions of the camera head 5005 and the CCU 5039 of the endoscope 5001 will be described in more detail with reference to FIG. 13. FIG. 13 is a block diagram showing an example of the functional configuration of the camera head 5005 and the CCU 5039 shown in FIG. 12.
Referring to FIG. 13, the camera head 5005 has, as its functions, a lens unit 5007, an imaging unit 5009, a drive unit 5011, a communication unit 5013, and a camera head control unit 5015. The CCU 5039 has, as its functions, a communication unit 5059, an image processing unit 5061, and a control unit 5063. The camera head 5005 and the CCU 5039 are connected by a transmission cable 5065 so as to be able to communicate bidirectionally.
First, the functional configuration of the camera head 5005 will be described. The lens unit 5007 is an optical system provided at the connection with the lens barrel 5003. Observation light taken in from the distal end of the lens barrel 5003 is guided to the camera head 5005 and enters the lens unit 5007. The lens unit 5007 is configured by combining a plurality of lenses including a zoom lens and a focus lens. The optical characteristics of the lens unit 5007 are adjusted so that the observation light is focused on the light receiving surface of the imaging element of the imaging unit 5009. The zoom lens and the focus lens are configured so that their positions on the optical axis can be moved in order to adjust the magnification and focus of the captured image.
The imaging unit 5009 is composed of an imaging element and is arranged downstream of the lens unit 5007. Observation light that has passed through the lens unit 5007 is focused on the light receiving surface of the imaging element, and an image signal corresponding to the observation image is generated by photoelectric conversion. The image signal generated by the imaging unit 5009 is provided to the communication unit 5013.
As the imaging element constituting the imaging unit 5009, for example, a CMOS (Complementary Metal Oxide Semiconductor) type image sensor having a Bayer array and capable of color imaging is used. As the imaging element, one capable of capturing high-resolution images of 4K or higher, for example, may be used. Obtaining an image of the surgical site at high resolution allows the surgeon 5067 to grasp the state of the surgical site in more detail, and the surgery can proceed more smoothly.
The imaging element constituting the imaging unit 5009 is configured to have a pair of imaging elements for acquiring image signals for the right eye and the left eye corresponding to 3D display. The 3D display allows the surgeon 5067 to grasp the depth of living tissue in the surgical site more accurately. When the imaging unit 5009 is of a multi-plate type, a plurality of lens units 5007 are provided corresponding to the respective imaging elements.
The imaging unit 5009 does not necessarily have to be provided in the camera head 5005. For example, the imaging unit 5009 may be provided inside the lens barrel 5003, immediately behind the objective lens.
The drive unit 5011 is composed of actuators and, under the control of the camera head control unit 5015, moves the zoom lens and the focus lens of the lens unit 5007 by predetermined distances along the optical axis. The magnification and focus of the image captured by the imaging unit 5009 can thereby be adjusted appropriately.
The communication unit 5013 is composed of a communication device for transmitting and receiving various information to and from the CCU 5039. The communication unit 5013 transmits the image signal obtained from the imaging unit 5009 as RAW data to the CCU 5039 via the transmission cable 5065. At this time, the image signal is preferably transmitted by optical communication in order to display the captured image of the surgical site with low latency. This is because, during surgery, the surgeon 5067 operates while observing the state of the affected area through the captured image, and for safer and more reliable surgery the moving image of the surgical site is required to be displayed in as close to real time as possible. When optical communication is performed, the communication unit 5013 is provided with a photoelectric conversion module that converts an electric signal into an optical signal. The image signal is converted into an optical signal by the photoelectric conversion module and then transmitted to the CCU 5039 via the transmission cable 5065.
The communication unit 5013 also receives, from the CCU 5039, a control signal for controlling the driving of the camera head 5005. The control signal includes information on imaging conditions, for example, information specifying the frame rate of the captured image, information specifying the exposure value at the time of imaging, and/or information specifying the magnification and focus of the captured image. The communication unit 5013 provides the received control signal to the camera head control unit 5015. The control signal from the CCU 5039 may also be transmitted by optical communication. In this case, the communication unit 5013 is provided with a photoelectric conversion module that converts an optical signal into an electric signal; the control signal is converted into an electric signal by the photoelectric conversion module and then provided to the camera head control unit 5015.
Note that the imaging conditions described above, such as the frame rate, exposure value, magnification, and focus, are automatically set by the control unit 5063 of the CCU 5039 based on the acquired image signal. That is, so-called AE (Auto Exposure), AF (Auto Focus), and AWB (Auto White Balance) functions are mounted on the endoscope 5001.
The camera head control unit 5015 controls the driving of the camera head 5005 based on the control signal from the CCU 5039 received via the communication unit 5013. For example, the camera head control unit 5015 controls the driving of the imaging element of the imaging unit 5009 based on the information specifying the frame rate of the captured image and/or the information specifying the exposure at the time of imaging. Also, for example, the camera head control unit 5015 appropriately moves the zoom lens and the focus lens of the lens unit 5007 via the drive unit 5011 based on the information specifying the magnification and focus of the captured image. The camera head control unit 5015 may further have a function of storing information for identifying the lens barrel 5003 and the camera head 5005.
By arranging the lens unit 5007, the imaging unit 5009, and other components within a sealed structure with high airtightness and waterproofness, the camera head 5005 can be made resistant to autoclave sterilization.
 Next, the functional configuration of the CCU 5039 will be described. The communication unit 5059 is a communication device for exchanging various kinds of information with the camera head 5005. It receives the image signal transmitted from the camera head 5005 via the transmission cable 5065. As described above, this image signal can suitably be transmitted by optical communication; in that case the communication unit 5059 is provided with a photoelectric conversion module that converts an optical signal into an electric signal. The communication unit 5059 provides the image signal converted into an electric signal to the image processing unit 5061.

 The communication unit 5059 also transmits to the camera head 5005 a control signal for controlling its driving. This control signal may likewise be transmitted by optical communication.
 The image processing unit 5061 applies various kinds of image processing to the image signal, which is RAW data, transmitted from the camera head 5005. This includes various known signal processes, for example development processing, image-quality enhancement (band enhancement, super-resolution, NR (noise reduction), and/or camera-shake correction), and/or enlargement (electronic zoom). The image processing unit 5061 also performs detection processing on the image signal for AE, AF, and AWB.

 The image processing unit 5061 is composed of a processor such as a CPU or GPU, and the image processing and detection processing described above are performed by the processor operating according to a predetermined program. When the image processing unit 5061 is composed of multiple GPUs, it divides the information of the image signal appropriately and performs image processing on those GPUs in parallel.
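 The division-and-parallel-processing step can be pictured as in the minimal sketch below, which splits a frame into horizontal strips and processes them concurrently. CPU worker processes stand in for the multiple GPUs, and the per-strip operation is a placeholder; neither detail is specified by the publication.

    import numpy as np
    from concurrent.futures import ProcessPoolExecutor

    def process_strip(strip: np.ndarray) -> np.ndarray:
        # Placeholder per-strip processing (e.g., one stage of NR).
        return np.clip(strip, 0.0, 1.0)

    def process_frame_parallel(frame: np.ndarray, workers: int = 4) -> np.ndarray:
        strips = np.array_split(frame, workers, axis=0)   # divide the image data
        with ProcessPoolExecutor(max_workers=workers) as pool:
            done = list(pool.map(process_strip, strips))  # process in parallel
        return np.vstack(done)                            # reassemble the frame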
 The control unit 5063 performs various kinds of control relating to imaging of the surgical site by the endoscope 5001 and to display of the captured image. For example, it generates a control signal for controlling the driving of the camera head 5005. If imaging conditions have been entered by the user, the control unit 5063 generates the control signal on the basis of that input. Alternatively, when the endoscope 5001 is equipped with the AE, AF, and AWB functions, the control unit 5063 calculates the optimum exposure value, focal length, and white balance as appropriate from the result of the detection processing by the image processing unit 5061, and generates the control signal accordingly.

 The control unit 5063 also causes the display device 5041 to display the image of the surgical site on the basis of the image signal processed by the image processing unit 5061. In doing so, the control unit 5063 recognizes various objects in the surgical-site image using various image recognition techniques. For example, by detecting the edge shapes, colors, and so on of objects in the image, it can recognize surgical tools such as forceps, specific body sites, bleeding, mist produced when the energy treatment tool 5021 is used, and the like. When displaying the surgical-site image on the display device 5041, the control unit 5063 uses these recognition results to superimpose various kinds of surgery support information on the image. Presenting the superimposed support information to the operator 5067 makes it possible to proceed with the surgery more safely and reliably.
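 The publication does not specify the recognition algorithm, but the edge-and-contour style of recognition described here can be sketched as below, which flags long outlines as candidate instrument edges and draws them over the image. The Canny thresholds and the contour-length criterion are illustrative assumptions.

    import cv2
    import numpy as np

    def overlay_candidate_instruments(bgr: np.ndarray) -> np.ndarray:
        """Draw long edge contours (candidate tool outlines) over the image."""
        gray = cv2.cvtColor(bgr, cv2.COLOR_BGR2GRAY)
        edges = cv2.Canny(gray, 80, 160)            # illustrative thresholds
        contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)  # OpenCV 4 signature
        out = bgr.copy()
        for c in contours:
            if cv2.arcLength(c, False) > 200:       # keep long outlines only
                cv2.drawContours(out, [c], -1, (0, 255, 0), 2)
        return out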
 The transmission cable 5065 connecting the camera head 5005 and the CCU 5039 is an electric signal cable for electric signal communication, an optical fiber for optical communication, or a composite cable of these.

 In the illustrated example, communication is wired using the transmission cable 5065, but communication between the camera head 5005 and the CCU 5039 may instead be wireless. If the two communicate wirelessly, there is no need to lay the transmission cable 5065 in the operating room, which eliminates situations in which the cable obstructs the movement of medical staff in the operating room.
 An example of the endoscopic surgery system 5000 to which the technology according to the present disclosure can be applied has been described above. Although the endoscopic surgery system 5000 is described here as an example, systems to which the technology according to the present disclosure can be applied are not limited to this example. For instance, the technology may be applied to a flexible endoscope system for examination, or to the microscopic surgery system described in Application Example 2 below.

 Among the configurations described above, the technology according to the present disclosure can suitably be applied to the endoscope 5001. Specifically, it can be applied when the blood-flow and non-blood-flow portions in the image of the surgical site inside the body cavity of the patient 5071 captured by the endoscope 5001 are displayed on the display device 5041 so as to be easily distinguishable. Applying the technology to the endoscope 5001 makes it possible to set the calculation region for computing SC values in speckle imaging more appropriately and to generate a good SC image. The operator 5067 can then view, in real time on the display device 5041, an image of the surgical site in which blood-flow and non-blood-flow portions are accurately distinguished, and can perform the surgery more safely.
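 For reference, the conventional SC computation over a fixed square window looks like the sketch below, where SC = standard deviation / mean of the speckle luminance within the calculation region. This fixed-window form is the baseline that the present disclosure improves on by choosing the region adaptively; the window size here is an assumption.

    import numpy as np
    from scipy.ndimage import uniform_filter

    def speckle_contrast_map(speckle: np.ndarray, size: int = 5) -> np.ndarray:
        """Per-pixel SC over a fixed size x size window: SC = std / mean."""
        img = speckle.astype(np.float64)
        mean = uniform_filter(img, size)
        mean_sq = uniform_filter(img * img, size)
        var = np.maximum(mean_sq - mean * mean, 0.0)   # E[x^2] - E[x]^2
        return np.sqrt(var) / np.maximum(mean, 1e-8)   # avoid division by zero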
(Application Example 2)
 The technology according to the present disclosure may also be applied to a microscopic surgery system used for so-called microsurgery, which is performed while magnifying and observing a fine part of a patient.
 FIG. 14 shows an example of the schematic configuration of a microscopic surgery system 5300 to which the technology according to the present disclosure can be applied. Referring to FIG. 14, the microscopic surgery system 5300 comprises a microscope device 5301, a control device 5317, and a display device 5319. In the following description of the microscopic surgery system 5300, "user" means any medical staff who uses the system, such as the operator or an assistant.
 The microscope device 5301 has a microscope unit 5303 for magnified observation of the observation target (the surgical site of the patient), an arm unit 5309 that supports the microscope unit 5303 at its distal end, and a base unit 5315 that supports the proximal end of the arm unit 5309.

 The microscope unit 5303 consists of a roughly cylindrical tubular portion 5305, an imaging unit (not shown) provided inside the tubular portion 5305, and an operation unit 5307 provided in a partial region of the outer periphery of the tubular portion 5305. The microscope unit 5303 is an electronic-imaging microscope unit (a so-called video microscope unit) that captures images electronically with the imaging unit.
 A cover glass protecting the internal imaging unit is provided on the opening surface at the lower end of the tubular portion 5305. Light from the observation target (hereinafter also called observation light) passes through the cover glass and enters the imaging unit inside the tubular portion 5305. A light source such as an LED (Light Emitting Diode) may be provided inside the tubular portion 5305, and during imaging, light may be emitted from this light source onto the observation target through the cover glass.

 The imaging unit consists of an optical system that condenses the observation light and an image sensor that receives the condensed light. The optical system combines multiple lenses including a zoom lens and a focus lens, and its optical characteristics are adjusted so that the observation light forms an image on the light-receiving surface of the image sensor. The image sensor receives the observation light and photoelectrically converts it to generate a signal corresponding to the observation light, that is, an image signal corresponding to the observation image. As the image sensor, one capable of color imaging with, for example, a Bayer array is used; it may be any of various known image sensors, such as a CMOS (Complementary Metal Oxide Semiconductor) image sensor or a CCD (Charge Coupled Device) image sensor. The image signal generated by the image sensor is transmitted to the control device 5317 as RAW data. This transmission may suitably be performed by optical communication: at the surgical site the operator performs surgery while observing the state of the affected area in the captured image, so for safer and more reliable surgery the moving image of the surgical site should be displayed as close to real time as possible, and transmitting the image signal by optical communication makes low-latency display possible.
 The imaging unit may have a drive mechanism that moves the zoom lens and focus lens of its optical system along the optical axis. By moving these lenses appropriately with the drive mechanism, the magnification of the captured image and the focal length at the time of imaging can be adjusted. The imaging unit may also be equipped with various functions that an electronic-imaging microscope unit can generally have, such as an AE (Auto Exposure) function and an AF (Auto Focus) function.

 The imaging unit may be configured as a so-called single-chip imaging unit having one image sensor, or as a so-called multi-chip imaging unit having multiple image sensors. In the multi-chip case, for example, image signals corresponding to R, G, and B may be generated by the respective sensors and combined to obtain a color image. Alternatively, the imaging unit may have a pair of image sensors for acquiring right-eye and left-eye image signals for stereoscopic vision (3D display); 3D display enables the operator to grasp the depth of living tissue in the surgical site more accurately. When the imaging unit is of the multi-chip type, multiple optical systems may be provided, one for each image sensor.
 The operation unit 5307 is an input means composed of, for example, a cross lever or switches, which accepts the user's operation input. For example, the user can input, via the operation unit 5307, an instruction to change the magnification of the observation image and the focal length to the observation target; the drive mechanism of the imaging unit then moves the zoom lens and focus lens appropriately in accordance with the instruction, so that the magnification and focal length are adjusted. The user can also input, via the operation unit 5307, an instruction to switch the operation mode of the arm unit 5309 (the all-free mode and fixed mode described later). When the user wants to move the microscope unit 5303, it is expected that the user moves it while gripping the tubular portion 5305. The operation unit 5307 is therefore preferably provided at a position where the user can easily operate it with a finger while gripping the tubular portion 5305, so that it can be operated even while the tubular portion 5305 is being moved.
 The arm unit 5309 is configured by a plurality of links (first link 5313a to sixth link 5313f) connected rotatably to one another by a plurality of joints (first joint 5311a to sixth joint 5311f).

 The first joint 5311a has a roughly columnar shape, and at its distal end (lower end) supports the upper end of the tubular portion 5305 of the microscope unit 5303 so that it can rotate about a rotation axis (first axis O1) parallel to the central axis of the tubular portion 5305. The first joint 5311a can be configured so that the first axis O1 coincides with the optical axis of the imaging unit of the microscope unit 5303; rotating the microscope unit 5303 about the first axis O1 then changes the field of view as if rotating the captured image.

 The first link 5313a fixedly supports the first joint 5311a at its distal end. Specifically, the first link 5313a is a rod-shaped member with a roughly L shape; one side on its distal end extends in a direction orthogonal to the first axis O1, and the end of that side is connected to the first joint 5311a so as to abut the upper end of its outer periphery. The second joint 5311b is connected to the end of the other side on the proximal side of the roughly L-shaped first link 5313a.

 The second joint 5311b has a roughly columnar shape, and at its distal end supports the proximal end of the first link 5313a so that it can rotate about a rotation axis (second axis O2) orthogonal to the first axis O1. The distal end of the second link 5313b is fixedly connected to the proximal end of the second joint 5311b.

 The second link 5313b is a rod-shaped member with a roughly L shape; one side on its distal end extends in a direction orthogonal to the second axis O2, and the end of that side is fixedly connected to the proximal end of the second joint 5311b. The third joint 5311c is connected to the other side on the proximal side of the roughly L-shaped second link 5313b.

 The third joint 5311c has a roughly columnar shape, and at its distal end supports the proximal end of the second link 5313b so that it can rotate about a rotation axis (third axis O3) orthogonal to both the first axis O1 and the second axis O2. The distal end of the third link 5313c is fixedly connected to the proximal end of the third joint 5311c. By rotating the distal-side structure including the microscope unit 5303 about the second axis O2 and the third axis O3, the microscope unit 5303 can be moved so as to change its position in the horizontal plane. In other words, controlling rotation about the second axis O2 and the third axis O3 makes it possible to move the field of view of the captured image within a plane.
 The third link 5313c is configured so that its distal side has a roughly columnar shape, and the proximal end of the third joint 5311c is fixedly connected to the distal end of that column so that the two share substantially the same central axis. The proximal side of the third link 5313c has a prismatic shape, and the fourth joint 5311d is connected to its end.

 The fourth joint 5311d has a roughly columnar shape, and at its distal end supports the proximal end of the third link 5313c so that it can rotate about a rotation axis (fourth axis O4) orthogonal to the third axis O3. The distal end of the fourth link 5313d is fixedly connected to the proximal end of the fourth joint 5311d.

 The fourth link 5313d is a rod-shaped member extending roughly in a straight line; it extends orthogonally to the fourth axis O4 and is fixedly connected to the fourth joint 5311d so that its distal end abuts the roughly columnar side face of the fourth joint 5311d. The fifth joint 5311e is connected to the proximal end of the fourth link 5313d.

 The fifth joint 5311e has a roughly columnar shape, and on its distal side supports the proximal end of the fourth link 5313d so that it can rotate about a rotation axis (fifth axis O5) parallel to the fourth axis O4. The distal end of the fifth link 5313e is fixedly connected to the proximal end of the fifth joint 5311e. The fourth axis O4 and fifth axis O5 are rotation axes that can move the microscope unit 5303 vertically: rotating the distal-side structure including the microscope unit 5303 about them adjusts the height of the microscope unit 5303, that is, the distance between the microscope unit 5303 and the observation target.

 The fifth link 5313e is composed of a first member having a roughly L shape, with one side extending vertically and the other extending horizontally, combined with a rod-shaped second member extending vertically downward from the horizontally extending portion of the first member. The proximal end of the fifth joint 5311e is fixedly connected near the upper end of the vertically extending portion of the first member of the fifth link 5313e. The sixth joint 5311f is connected to the proximal end (lower end) of the second member of the fifth link 5313e.

 The sixth joint 5311f has a roughly columnar shape, and on its distal side supports the proximal end of the fifth link 5313e so that it can rotate about a rotation axis (sixth axis O6) parallel to the vertical direction. The distal end of the sixth link 5313f is fixedly connected to the proximal end of the sixth joint 5311f.

 The sixth link 5313f is a rod-shaped member extending vertically, and its proximal end is fixedly connected to the upper surface of the base unit 5315.
 The rotatable ranges of the first joint 5311a to sixth joint 5311f are set appropriately so that the microscope unit 5303 can move as desired. As a result, the arm unit 5309 configured as described above can realize movement of the microscope unit 5303 with a total of six degrees of freedom, three translational and three rotational. Configuring the arm unit 5309 to provide six degrees of freedom for the movement of the microscope unit 5303 makes it possible to control the position and orientation of the microscope unit 5303 freely within the movable range of the arm unit 5309. The surgical site can therefore be observed from any angle, and the surgery can be performed more smoothly.

 The configuration of the arm unit 5309 shown in the figure is only an example; the number and shape (length) of the links constituting the arm unit 5309, and the number, arrangement, and rotation-axis directions of the joints, may be designed as appropriate so that the desired degrees of freedom are realized. For example, as noted above, to move the microscope unit 5303 freely the arm unit 5309 is preferably configured with six degrees of freedom, but it may also be configured with more (that is, redundant) degrees of freedom. When redundant degrees of freedom exist, the posture of the arm unit 5309 can be changed while the position and orientation of the microscope unit 5303 are held fixed. This allows control that is more convenient for the operator, for example controlling the posture of the arm unit 5309 so that it does not interfere with the view of the operator watching the display device 5319.
 Each of the first joint 5311a to sixth joint 5311f may be provided with an actuator carrying a drive mechanism such as a motor and an encoder that detects the rotation angle of that joint. The posture of the arm unit 5309, that is, the position and orientation of the microscope unit 5303, can then be controlled by having the control device 5317 appropriately control the driving of each actuator. Specifically, the control device 5317 can grasp the current posture of the arm unit 5309 and the current position and orientation of the microscope unit 5303 from the joint rotation angles detected by the encoders. Using this information, the control device 5317 calculates a control value for each joint (for example, a rotation angle or generated torque) that realizes the movement of the microscope unit 5303 requested by the user's operation input, and drives the drive mechanism of each joint according to that value. The control method of the arm unit 5309 by the control device 5317 is not limited; various known control methods such as force control or position control may be applied.
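 Grasping the current position and orientation from the encoder readings amounts to forward kinematics along the joint chain. The sketch below composes a rotation about each joint axis (via Rodrigues' formula) with the fixed link offset that follows it; the axis and offset values would come from the arm's actual geometry and are assumptions here.

    import numpy as np

    def forward_kinematics(angles, axes, offsets):
        """Position p and orientation R of the microscope frame from joint angles.

        angles[i]: encoder reading of joint i (rad);
        axes[i]: unit rotation axis of joint i in its parent frame;
        offsets[i]: fixed link translation following joint i.
        """
        R = np.eye(3)
        p = np.zeros(3)
        for theta, a, d in zip(angles, axes, offsets):
            a = np.asarray(a, dtype=float)
            a = a / np.linalg.norm(a)
            K = np.array([[0.0, -a[2], a[1]],
                          [a[2], 0.0, -a[0]],
                          [-a[1], a[0], 0.0]])      # cross-product matrix of a
            Ri = np.eye(3) + np.sin(theta) * K + (1 - np.cos(theta)) * (K @ K)
            R = R @ Ri                               # rotate about this joint
            p = p + R @ np.asarray(d, dtype=float)   # then walk along the link
        return R, p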
 For example, when the operator performs an operation input via an input device (not shown), the control device 5317 controls the driving of the arm unit 5309 according to that input, and the position and orientation of the microscope unit 5303 are controlled accordingly. With this control, the microscope unit 5303 can be moved from one arbitrary position to another and then fixedly supported at the new position. Considering the operator's convenience, the input device is preferably one that can be operated even while the operator holds a surgical tool, such as a foot switch. Operation input may also be performed contactlessly, based on gesture or gaze detection using a wearable device or a camera installed in the operating room; this allows even a user in the clean area to operate equipment in the unclean area with greater freedom. Alternatively, the arm unit 5309 may be operated in a so-called master-slave manner, in which case it can be operated remotely by the user via an input device installed away from the operating room.

 When force control is applied, so-called power-assist control may be performed, in which the actuators of the first joint 5311a to sixth joint 5311f are driven so that the arm unit 5309 receives an external force from the user and moves smoothly following that force. With this, when the user grips the microscope unit 5303 and tries to move it directly, it can be moved with a relatively light force. The microscope unit 5303 can thus be moved more intuitively with simpler operation, improving user convenience.
 The driving of the arm unit 5309 may also be controlled so that it performs a pivot operation. Here, a pivot operation is an operation that moves the microscope unit 5303 so that its optical axis always points at a predetermined point in space (hereinafter called the pivot point). A pivot operation makes it possible to observe the same observation position from various directions, allowing more detailed observation of the affected area. When the microscope unit 5303 is configured so that its focal length cannot be adjusted, the pivot operation is preferably performed with the distance between the microscope unit 5303 and the pivot point fixed; in that case this distance may be set to the fixed focal length of the microscope unit 5303. The microscope unit 5303 then moves on a hemisphere (shown schematically in FIG. 14), centered on the pivot point, whose radius corresponds to the focal length, and a sharp captured image is obtained even when the observation direction is changed. When the microscope unit 5303 is configured so that its focal length is adjustable, the pivot operation may be performed with the distance between the microscope unit 5303 and the pivot point variable. In that case the control device 5317 may, for example, calculate the distance between the microscope unit 5303 and the pivot point from the joint rotation angles detected by the encoders, and automatically adjust the focal length of the microscope unit 5303 on the basis of the result. Alternatively, if the microscope unit 5303 has an AF function, the focal length may be adjusted automatically by that function each time the distance between the microscope unit 5303 and the pivot point changes due to the pivot operation.
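 The adjustable-focal-length case reduces to recomputing the working distance whenever the pose changes. A minimal sketch, assuming the forward_kinematics() helper above and a hypothetical set_focus_distance() callback on the AF side:

    import numpy as np

    def update_pivot_focus(angles, axes, offsets, pivot_point, set_focus_distance):
        """Refocus to the current microscope-to-pivot distance during a pivot move."""
        _, microscope_pos = forward_kinematics(angles, axes, offsets)
        distance = float(np.linalg.norm(np.asarray(pivot_point) - microscope_pos))
        set_focus_distance(distance)  # hypothetical AF hook, not from the source
        return distance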
 The first joint 5311a to sixth joint 5311f may also be provided with brakes that restrain their rotation, whose operation can be controlled by the control device 5317. For example, when the position and orientation of the microscope unit 5303 are to be fixed, the control device 5317 engages the brakes of the joints. The posture of the arm unit 5309, and hence the position and orientation of the microscope unit 5303, can then be fixed without driving the actuators, which reduces power consumption. To move the position and orientation of the microscope unit 5303, the control device 5317 releases the brakes of the joints and drives the actuators according to a predetermined control method.

 Such brake operation can be performed in response to the user's operation input via the operation unit 5307 described above. To move the position and orientation of the microscope unit 5303, the user operates the operation unit 5307 to release the brakes of the joints; the operation mode of the arm unit 5309 then shifts to a mode in which each joint can rotate freely (all-free mode). To fix the position and orientation of the microscope unit 5303, the user operates the operation unit 5307 to engage the brakes of the joints; the operation mode of the arm unit 5309 then shifts to a mode in which rotation at each joint is restrained (fixed mode).
 The control device 5317 comprehensively controls the operation of the microscopic surgery system 5300 by controlling the operation of the microscope device 5301 and the display device 5319. For example, it controls the driving of the arm unit 5309 by operating the actuators of the first joint 5311a to sixth joint 5311f according to a predetermined control method, and changes the operation mode of the arm unit 5309 by controlling the operation of their brakes. The control device 5317 also applies various kinds of signal processing to the image signal acquired by the imaging unit of the microscope unit 5303 of the microscope device 5301 to generate image data for display, and causes the display device 5319 to display that image data. The signal processing may include various known processes, for example development (demosaic) processing, image-quality enhancement (band enhancement, super-resolution, NR (noise reduction), and/or camera-shake correction), and/or enlargement (electronic zoom).

 Communication between the control device 5317 and the microscope unit 5303, and between the control device 5317 and the first joint 5311a to sixth joint 5311f, may be wired or wireless. In the wired case, communication by electric signals or optical communication may be used, and the transmission cable can be an electric signal cable, an optical fiber, or a composite cable of these, depending on the communication method. In the wireless case, there is no need to lay a transmission cable in the operating room, which eliminates situations in which such a cable obstructs the movement of medical staff.

 The control device 5317 may be a processor such as a CPU (Central Processing Unit) or GPU (Graphics Processing Unit), or a microcomputer or control board on which a processor and storage elements such as memory are mounted together. The various functions described above can be realized by the processor of the control device 5317 operating according to a predetermined program. In the illustrated example the control device 5317 is provided as a device separate from the microscope device 5301, but it may instead be installed inside the base unit 5315 of the microscope device 5301 and configured integrally with it. Alternatively, the control device 5317 may be composed of multiple devices: for example, microcomputers, control boards, and the like may be arranged in the microscope unit 5303 and in the first joint 5311a to sixth joint 5311f of the arm unit 5309 and connected so as to communicate with one another, realizing functions equivalent to those of the control device 5317.

 The display device 5319 is provided in the operating room and, under control from the control device 5317, displays the image corresponding to the image data generated by the control device 5317. That is, the display device 5319 displays the image of the surgical site captured by the microscope unit 5303. Instead of, or together with, the surgical-site image, the display device 5319 may display various kinds of information about the surgery, such as the patient's physical information or information about the surgical procedure; in that case the display may be switched as appropriate by user operation. Multiple display devices 5319 may also be provided, each displaying the surgical-site image or various kinds of information about the surgery. As the display device 5319, various known display devices such as a liquid crystal display device or an EL (Electro Luminescence) display device may be used.
 FIG. 15 shows surgery being performed using the microscopic surgery system 5300 shown in FIG. 14. It schematically depicts an operator 5321 performing surgery on a patient 5325 on a patient bed 5323 using the microscopic surgery system 5300. For simplicity, FIG. 15 omits the control device 5317 from the configuration of the microscopic surgery system 5300 and shows the microscope device 5301 in simplified form.

 As shown in FIG. 15, during surgery the image of the surgical site captured by the microscope device 5301 of the microscopic surgery system 5300 is displayed, enlarged, on the display device 5319 installed on the wall of the operating room. The display device 5319 is installed at a position facing the operator 5321, who observes the state of the surgical site in the video shown on the display device 5319 while performing various treatments on it, for example resection of the affected area.
 An example of the microscopic surgery system 5300 to which the technology according to the present disclosure can be applied has been described above. Although the microscopic surgery system 5300 is described here as an example, systems to which the technology according to the present disclosure can be applied are not limited to this example. For instance, the microscope device 5301 can also function as a support arm device that supports, at its distal end, another observation device or another surgical tool instead of the microscope unit 5303. An endoscope, for example, may be applied as the other observation device. Forceps, tweezers, a pneumoperitoneum tube for pneumoperitoneum, or an energy treatment tool that incises tissue or seals blood vessels by cauterization may be applied as the other surgical tool. Supporting such observation devices and surgical tools with the support arm device allows their position to be fixed more stably than when medical staff hold them by hand, and reduces the burden on the medical staff. The technology according to the present disclosure may be applied to a support arm device supporting such a configuration other than the microscope unit.

 Among the configurations described above, the technology according to the present disclosure can suitably be applied to the control device 5317. Specifically, it can be applied when the blood-flow and non-blood-flow portions in the image of the surgical site of the patient 5325 captured by the imaging unit of the microscope unit 5303 are displayed on the display device 5319 so as to be easily distinguishable. Applying the technology to the control device 5317 makes it possible to set the calculation region for computing SC values in speckle imaging more appropriately and to generate a good SC image. The operator 5321 can then view, in real time on the display device 5319, an image of the surgical site in which blood-flow and non-blood-flow portions are accurately distinguished, and can perform the surgery more safely.
 Note that the present technology may also be configured as below.
(1)
 A medical system comprising:
 a first irradiation means for irradiating a subject with incoherent visible light;
 a second irradiation means for irradiating the subject with coherent near-infrared light;
 a first imaging means for imaging the reflected light of the visible light from the subject;
 a second imaging means for imaging the reflected light of the near-infrared light from the subject;
 an acquisition means for acquiring a visible-light image from the first imaging means and a speckle image from the second imaging means;
 a setting means for setting, on the basis of the visible-light image, a calculation region for statistical processing of speckle luminance values; and
 a generation means for generating a predetermined image on the basis of the speckle image and the calculation region.
(2)
 The medical system according to (1), wherein the medical system is a microscopic surgery system or an endoscopic surgery system.
(3)
 An information processing method comprising:
 an acquisition step of acquiring a first image from a first imaging means that images the reflected light of incoherent light irradiating a subject, and a second image from a second imaging means that images the reflected light of coherent light irradiating the subject;
 a setting step of setting a calculation region on the basis of the first image; and
 a generation step of generating a predetermined image on the basis of the second image and the calculation region.
(4)
 The information processing method according to (3), wherein the incoherent light is incoherent visible light, the coherent light is coherent near-infrared light, the first image is a visible-light image, the second image is a speckle image, and the predetermined image is a speckle contrast image.
(5)
 The information processing method according to (4), wherein the setting step calculates, on the basis of the visible-light image, a degree of relatedness between a pixel of interest and each of a plurality of peripheral pixels from at least one of distance, luminance difference, and color difference, and sets the calculation region for the pixel of interest on at least some of the peripheral pixels on the basis of the relatedness; and the generation step generates the speckle contrast image by calculating a speckle contrast value for the pixel of interest on the basis of the speckle image and the calculation region.
(6)
 The information processing method according to (5), wherein the generation step, when calculating the speckle contrast value for the pixel of interest on the basis of the speckle image and the calculation region, calculates the speckle contrast value by weighting each peripheral pixel by its relatedness (a minimal sketch of this computation follows this list).
(7)
 An information processing method comprising:
 an acquisition step of acquiring a third image from a third imaging means that images fluorescence from a subject, and a second image from a second imaging means that images the reflected light of coherent light irradiating the subject;
 a setting step of setting a calculation region on the basis of the third image; and
 a generation step of generating a predetermined image on the basis of the second image and the calculation region.
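 Configurations (5) and (6) above describe choosing the calculation region and weighting each peripheral pixel by its relatedness to the pixel of interest. The sketch below shows one way this could be realized, computed per pixel with Gaussian relatedness over distance and guide-image luminance difference (a color-difference term could be added the same way). The Gaussian form and the sigma values are assumptions; the publication only requires that relatedness be derived from distance, luminance difference, and/or color difference.

    import numpy as np

    def relatedness(guide: np.ndarray, y: int, x: int, r: int,
                    sigma_d: float = 2.0, sigma_i: float = 0.1) -> np.ndarray:
        """Relatedness of each neighbor of (y, x) from distance and luminance
        difference in the visible-light guide image (normalized to [0, 1])."""
        ys, xs = np.mgrid[y - r:y + r + 1, x - r:x + r + 1]
        dist2 = (ys - y) ** 2 + (xs - x) ** 2
        dI2 = (guide[y - r:y + r + 1, x - r:x + r + 1] - guide[y, x]) ** 2
        return np.exp(-dist2 / (2 * sigma_d ** 2) - dI2 / (2 * sigma_i ** 2))

    def weighted_speckle_contrast(speckle: np.ndarray, guide: np.ndarray,
                                  y: int, x: int, r: int = 3) -> float:
        """SC at (y, x) with each peripheral pixel weighted by its relatedness.

        Assumes r <= y < H - r and r <= x < W - r (no border handling).
        """
        w = relatedness(guide, y, x, r)
        w = w / w.sum()                                   # normalize the weights
        patch = speckle[y - r:y + r + 1, x - r:x + r + 1].astype(np.float64)
        mean = float((w * patch).sum())                   # weighted mean
        var = float((w * (patch - mean) ** 2).sum())      # weighted variance
        return np.sqrt(var) / max(mean, 1e-8)

 In effect this keeps the statistics within regions that the visible-light image indicates belong together, so the SC value at a blood-vessel edge is not contaminated by neighboring non-flow tissue.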
 The embodiments and modifications of the present disclosure have been described above, but the technical scope of the present disclosure is not limited to these embodiments and modifications as they are; various changes are possible without departing from the gist of the present disclosure. Components of different embodiments and modifications may also be combined as appropriate.

 The effects described in this specification for each embodiment and modification are merely examples and are not limiting; other effects may also be obtained.

 In each of the embodiments described above, the speckle contrast value was used as the example of a value calculated by statistical processing of speckle luminance values, but the value is not limited to this and may be, for example, BR (Blur Rate), SBR (Square BR), or MBR (Mean BR).
 1   Medical system
 2   Light source for structure observation
 3   Narrow-band light source
 4   Wavelength separation device
 5   Color camera
 6   IR camera
 7   Information processing device
 8   Control device
 9   Camera
 10  Light source for fluorescence excitation
 11  Fluorescence observation camera
 71  Processing unit
 72  Storage unit
 73  Input unit
 74  Display unit
 711 Acquisition unit
 712 Setting unit
 713 Generation unit
 714 Display control unit

Claims (7)

  1.  A medical system comprising:
     a first irradiation means for irradiating a subject with incoherent visible light;
     a second irradiation means for irradiating the subject with coherent near-infrared light;
     a first imaging means for imaging the reflected light of the visible light from the subject;
     a second imaging means for imaging the reflected light of the near-infrared light from the subject;
     an acquisition means for acquiring a visible-light image from the first imaging means and a speckle image from the second imaging means;
     a setting means for setting, on the basis of the visible-light image, a calculation region for statistical processing of speckle luminance values; and
     a generation means for generating a predetermined image on the basis of the speckle image and the calculation region.
  2.  The medical system according to claim 1, wherein the medical system is a microscopic surgery system or an endoscopic surgery system.
  3.  An information processing method comprising:
     an acquisition step of acquiring a first image from a first imaging means that images the reflected light of incoherent light irradiating a subject, and a second image from a second imaging means that images the reflected light of coherent light irradiating the subject;
     a setting step of setting a calculation region on the basis of the first image; and
     a generation step of generating a predetermined image on the basis of the second image and the calculation region.
  4.  The information processing method according to claim 3, wherein
     the incoherent light is incoherent visible light,
     the coherent light is coherent near-infrared light,
     the first image is a visible-light image,
     the second image is a speckle image, and
     the predetermined image is a speckle contrast image.
  5.  The information processing method according to claim 4, wherein
     the setting step calculates, on the basis of the visible-light image, a degree of relatedness between a pixel of interest and each of a plurality of peripheral pixels from at least one of distance, luminance difference, and color difference, and sets the calculation region for the pixel of interest on at least some of the peripheral pixels on the basis of the relatedness, and
     the generation step generates the speckle contrast image by calculating a speckle contrast value for the pixel of interest on the basis of the speckle image and the calculation region.
  6.  The information processing method according to claim 5, wherein, when calculating the speckle contrast value for the pixel of interest based on the speckle image and the calculation region, the generation step calculates the speckle contrast value by weighting each of the peripheral pixels according to its degree of association.
  7.  An information processing method comprising:
     an acquisition step of acquiring a third image from a third imaging unit that captures fluorescence from a subject, and acquiring a second image from a second imaging unit that captures reflected light of coherent light irradiating the subject;
     a setting step of setting a calculation region based on the third image; and
     a generation step of generating a predetermined image based on the second image and the calculation region.
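
Claims 3 and 4 describe generating a speckle contrast image: for each pixel, speckle contrast is the ratio of the standard deviation to the mean of the speckle luminance values inside the calculation region (K = σ/μ); motion such as blood flow blurs the speckle pattern and lowers the contrast. The sketch below is a minimal illustration of that computation, assuming a fixed square window as the calculation region and NumPy/SciPy as the toolchain; the function name and window size are illustrative and not taken from the publication.

    import numpy as np
    from scipy.ndimage import uniform_filter

    def speckle_contrast_image(speckle: np.ndarray, window: int = 7) -> np.ndarray:
        """Per-pixel speckle contrast K = sigma / mean over a square window."""
        speckle = speckle.astype(np.float64)
        mean = uniform_filter(speckle, size=window)
        mean_sq = uniform_filter(speckle ** 2, size=window)
        var = np.maximum(mean_sq - mean ** 2, 0.0)     # clamp tiny negatives from rounding
        return np.sqrt(var) / np.maximum(mean, 1e-12)  # guard against division by zero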
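
Claims 5 and 6 refine this by shaping the calculation region around each pixel of interest: a degree of association with each peripheral pixel is computed from the visible light image (distance, luminance difference, and/or color difference); claim 5 uses it to select which peripheral pixels enter the region, and claim 6 uses it to weight them, in the spirit of a joint bilateral filter guided by the visible image. The sketch below implements the weighted variant and makes assumptions the publication does not specify: Gaussian weighting terms, the sigma parameters, and the neighborhood radius are all illustrative. Claim 7 swaps the visible guide for a fluorescence image, so the same sketch applies with the `guide` argument set to the fluorescence frame.

    import numpy as np

    def weighted_speckle_contrast(speckle: np.ndarray,
                                  guide: np.ndarray,
                                  radius: int = 5,
                                  sigma_d: float = 3.0,
                                  sigma_g: float = 10.0) -> np.ndarray:
        """Speckle contrast with each peripheral pixel weighted by its degree of
        association (spatial distance and guide-image luminance difference)."""
        speckle = speckle.astype(np.float64)
        guide = guide.astype(np.float64)
        h, w = speckle.shape
        out = np.zeros_like(speckle)
        # Spatial (distance) part of the degree of association, fixed per offset.
        yy, xx = np.mgrid[-radius:radius + 1, -radius:radius + 1]
        w_dist = np.exp(-(yy ** 2 + xx ** 2) / (2.0 * sigma_d ** 2))
        # Border pixels (within `radius` of the edge) are left at zero for brevity.
        for y in range(radius, h - radius):
            for x in range(radius, w - radius):
                patch = speckle[y - radius:y + radius + 1, x - radius:x + radius + 1]
                g_patch = guide[y - radius:y + radius + 1, x - radius:x + radius + 1]
                # Luminance-difference part, taken from the guide image.
                w_lum = np.exp(-((g_patch - guide[y, x]) ** 2) / (2.0 * sigma_g ** 2))
                wgt = w_dist * w_lum
                wsum = wgt.sum()
                mean = (wgt * patch).sum() / wsum
                var = (wgt * (patch - mean) ** 2).sum() / wsum
                out[y, x] = np.sqrt(var) / max(mean, 1e-12)
        return out

Weighting by guide-image similarity keeps the statistics within a single structure (e.g. one vessel), so the contrast estimate is not diluted by pixels belonging to different tissue, which is a plausible motivation for the guided calculation region of claims 5 and 6.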
PCT/JP2019/034884 2018-10-24 2019-09-05 Medical system and information processing method WO2020084917A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018-200399 2018-10-24
JP2018200399 2018-10-24

Publications (1)

Publication Number Publication Date
WO2020084917A1 (en) 2020-04-30

Family

ID=70330697

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/034884 WO2020084917A1 (en) 2018-10-24 2019-09-05 Medical system and information processing method

Country Status (1)

Country Link
WO (1) WO2020084917A1 (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017160643A1 (en) * 2016-03-14 2017-09-21 Massachusetts Institute Of Technology Device and method for imaging shortwave infrared fluorescence
JP2018514335A (en) * 2015-05-07 2018-06-07 Novadaq Technologies Inc. Method and system for laser speckle imaging of tissue using a color image sensor


Similar Documents

Publication Publication Date Title
US11221296B2 (en) Imaging system
WO2020045015A1 (en) Medical system, information processing device and information processing method
US10904437B2 (en) Control apparatus and control method
WO2018230066A1 (en) Medical system, medical apparatus, and control method
JP2019084334A (en) Medical holding apparatus, medical arm system, and drape mounting mechanism
US11653824B2 (en) Medical observation system and medical observation device
WO2018088113A1 (en) Joint driving actuator and medical system
JP2020074926A (en) Medical observation system, signal processing device and medical observation method
WO2021049438A1 (en) Medical support arm and medical system
JPWO2019239942A1 (en) Surgical observation device, surgical observation method, surgical light source device, and surgical light irradiation method
JPWO2019092950A1 (en) Image processing equipment, image processing method and image processing system
WO2020203225A1 (en) Medical system, information processing device, and information processing method
WO2019181242A1 (en) Endoscope and arm system
WO2017221491A1 (en) Control device, control system, and control method
WO2020116067A1 (en) Medical system, information processing device, and information processing method
WO2020203164A1 (en) Medical system, information processing device, and information processing method
JP2021164494A (en) Medical observation system, medical observation apparatus, and method for driving medical observation apparatus
WO2020045014A1 (en) Medical system, information processing device and information processing method
WO2021140923A1 (en) Medical image generation device, medical image generation method, and medical image generation program
WO2020084917A1 (en) Medical system and information processing method
WO2020009127A1 (en) Medical observation system, medical observation device, and medical observation device driving method
JP7140139B2 (en) medical imaging system and computer program
WO2022004250A1 (en) Medical system, information processing device, and information processing method
WO2020050187A1 (en) Medical system, information processing device, and information processing method
WO2019202860A1 (en) Medical system, connection structure, and connection method

Legal Events

Date Code Title Description
121  Ep: the epo has been informed by wipo that ep was designated in this application
     Ref document number: 19875971
     Country of ref document: EP
     Kind code of ref document: A1
NENP Non-entry into the national phase
     Ref country code: DE
122  Ep: pct application non-entry in european phase
     Ref document number: 19875971
     Country of ref document: EP
     Kind code of ref document: A1
NENP Non-entry into the national phase
     Ref country code: JP