WO2020084917A1 - Medical system and information processing method

Medical system and information processing method

Info

Publication number
WO2020084917A1
Authority
WO
WIPO (PCT)
Prior art keywords: image, unit, light, speckle, subject
Application number: PCT/JP2019/034884
Other languages: English (en), Japanese (ja)
Inventors: Tetsuro Kuwayama, Kentaro Fukazawa
Original assignee: Sony Corporation
Application filed by Sony Corporation
Publication of WO2020084917A1


Classifications

    • A61B 1/00: Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; illuminating arrangements therefor
    • A61B 5/026: Measuring blood flow
    • G01N 21/27: Colour; spectral properties, i.e. comparison of effect of material on the light at two or more different wavelengths or wavelength bands, using photo-electric detection; circuits for computing concentration
    • G01N 21/3563: Investigating relative effect of material at wavelengths characteristic of specific elements or molecules using infrared light, for analysing solids
    • G01N 21/64: Fluorescence; phosphorescence
    • G02B 21/06: Microscopes; means for illuminating specimens
    • G02B 23/26: Instruments or systems for viewing the inside of hollow bodies, e.g. fibrescopes, using light guides

Definitions

  • The present disclosure relates to a medical system and an information processing method.
  • Speckle is a phenomenon in which a mottled pattern arises when emitted coherent light is reflected and made to interfere by minute irregularities on the surface of a subject (object). Based on this phenomenon, for example, a blood flow part and a non-blood flow part in a living body serving as the subject can be distinguished.
  • In the blood flow part, the speckle contrast value decreases because the red blood cells that reflect the coherent light are moving, whereas in the non-blood flow part everything is stationary and the speckle contrast value increases. Therefore, the blood flow part and the non-blood flow part can be distinguished based on a speckle contrast image generated from the speckle contrast value of each pixel.
  • BR: Blur Rate
  • SBR: Square BR
  • MBR: Mean BR
  • However, in the prior art, the size of the calculation area used to statistically process the speckle brightness values is fixed (for example, 5 × 5 pixels). When the calculation area includes both a blood flow part and a non-blood flow part, this cannot be said to be best in terms of the accuracy of the calculated value (for example, the speckle contrast value).
  • That is, since the predetermined image is generated based on an image formed by predetermined light from the subject and on the calculation area, there is room for improvement in how the calculation area is set.
  • The present disclosure therefore proposes a medical system and an information processing method capable of generating a highly accurate predetermined image based on an image formed by predetermined light from a subject and a more appropriate calculation area.
  • To solve the above problem, a medical system according to one aspect of the present disclosure includes: first irradiation means for irradiating a subject with incoherent visible light; second irradiation means for irradiating the subject with coherent near-infrared light; first imaging means for imaging reflected light of the visible light from the subject; second imaging means for imaging reflected light of the near-infrared light from the subject; acquisition means for acquiring a visible light image from the first imaging means and a speckle image from the second imaging means; setting means for setting, based on the visible light image, a calculation area for statistically processing speckle luminance values; and generation means for generating a predetermined image based on the speckle image and the calculation area.
  • FIG. 6 is an explanatory diagram of a calculation example using a calculation area according to the first embodiment of the present disclosure. FIG. 7 is a flowchart showing SC image generation processing by the information processing apparatus according to the first embodiment of the present disclosure.
  • FIG. 20 is a diagram showing an example of the schematic configuration of a microscopic surgery system according to Application Example 2 of the present disclosure. The subsequent figure shows the state of surgery using the microscopic surgery system shown in FIG. 20.
  • In surgery, blood flow evaluation is often important. For example, in bypass surgery in neurosurgery, the passage of blood (blood flow) is confirmed after the blood vessels are connected, and in aneurysm clipping it is confirmed after clipping that blood no longer flows into the aneurysm. For these purposes, blood flow has so far been evaluated with an ultrasonic Doppler blood flow meter or by angiography using an ICG (Indocyanine Green) agent.
  • However, since the ultrasonic Doppler blood flow meter measures blood flow only at the single point touched by the probe, the distribution of blood flow across the entire surgical field remains unknown. There is also the risk that the probe must be brought into contact with the cerebral blood vessel for the evaluation.
  • Angiography using an ICG agent exploits the fact that the agent binds to plasma protein in the living body and emits fluorescence under near-infrared excitation light; it is an invasive form of observation in that the agent must be administered. Moreover, for blood flow evaluation the flow has to be judged from the change immediately after administration of the ICG agent, so there is a limitation on the timing of use.
  • Japanese Patent Laid-Open No. 2017-170064 discloses an optical device for perfusion evaluation in the speckle imaging technique.
  • There, the principle of detecting motion (blood flow) using speckle generated by a laser is used, and speckle contrast is used as the index for motion detection.
  • Speckle contrast is a value indicated by (standard deviation) / (average value) of the light intensity distribution.
  • While the subject is stationary, the standard deviation of the intensity distribution is large and the speckle contrast (degree of glare) is high.
  • When the subject moves, the speckle pattern changes with the motion. If a speckle pattern is captured by an observation system with a finite exposure time, the pattern changes during the exposure, so the captured speckle patterns are averaged and the speckle contrast (degree of glare) becomes lower.
  • The larger the motion, the more averaging occurs and the lower the speckle contrast becomes.
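  • For illustration only (this sketch is not part of the patent), the following Python code computes the speckle contrast of a single calculation area and reproduces the behavior described above on synthetic data: a static, fully developed speckle pattern gives a high value, while exposure-time averaging of a moving pattern lowers it.

    import numpy as np

    def speckle_contrast(patch):
        """Speckle contrast of one calculation area:
        (standard deviation) / (average value) of the light intensity."""
        patch = patch.astype(np.float64)
        mean = patch.mean()
        return patch.std() / mean if mean > 0 else 0.0

    rng = np.random.default_rng(0)
    static = rng.exponential(1.0, (5, 5))  # fully developed speckle: contrast near 1
    moving = 0.2 * static + 0.8            # exposure-time averaging flattens the pattern
    print(speckle_contrast(static))        # high: no motion
    print(speckle_contrast(moving))        # low: motion during the exposure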
  • This method performs a statistical evaluation using the light intensities of the pixel of interest and multiple surrounding pixels.
  • For stable statistics, it is desirable to enlarge the calculation area (analysis area) and increase the number of pixels; this stabilizes the statistic and reduces variation in the analyzed pixel values.
  • However, if the calculation area is made large, resolution drops because information from peripheral pixels far from the pixel of interest is also included. The calculation area could instead be made small to avoid resolution degradation, but a small calculation area contains few pixels, so the statistically processed values vary widely and the image becomes noisy.
  • Conventionally, the size of the calculation area used to calculate the speckle contrast value of each pixel of interest has been fixed (for example, 5 × 5 pixels); this is not best in terms of the accuracy of the speckle contrast value when the calculation area includes both a blood flow part and a non-blood flow part.
  • FIG. 1 is a diagram showing a configuration example of a medical system 1 according to the first embodiment of the present disclosure.
  • The medical system 1 according to the first embodiment includes a structure observation light source 2 (first irradiation means), a narrow band light source 3 (second irradiation means), a wavelength separation device 4, a color camera 5 (first imaging means), an IR camera 6 (second imaging means), and an information processing device 7.
  • The structure observation light source 2 irradiates the subject with incoherent light (for example, incoherent visible light; hereinafter also simply referred to as "visible light").
  • The narrow band light source 3 irradiates the subject with coherent light (for example, coherent near-infrared light; hereinafter also simply referred to as "near-infrared light").
  • Coherent light is light in which the phase relationship of the light waves at any two points within the light flux is constant in time, and which still exhibits complete coherence when the flux is split by an arbitrary method, given a large optical path difference, and superposed again.
  • Incoherent light is light that does not have the above-mentioned properties of coherent light.
  • The wavelength of the coherent light output from the narrow band light source 3 according to the present disclosure is preferably about 800 to 900 nm, for example 830 nm.
  • If the wavelength is 830 nm, the optical system used for ICG observation can be shared: ICG observation uses near-infrared light with a wavelength of 830 nm, so when near-infrared light of the same wavelength is also used for speckle observation, speckle observation is possible without changing the optical system of the microscope used for ICG observation.
  • However, the wavelength of the coherent light emitted by the narrow band light source 3 is not limited to this, and various other wavelengths may be used.
  • For example, when coherent light with a wavelength of 400 to 450 nm (ultraviolet to blue) or 700 to 900 nm (infrared) is used, wavelength-separated observation combined with visible light illumination is easy.
  • When visible coherent light with a wavelength of 450 to 700 nm is used, it is easy to select a laser of the kind used in projectors and the like.
  • The type of the narrow band light source 3 that emits the coherent light is not particularly limited as long as the effect of the present technology is not impaired.
  • Examples of the narrow band light source 3 emitting laser light include an argon ion (Ar) laser, a helium-neon (He-Ne) laser, a dye laser, a krypton (Kr) laser, a semiconductor laser, and a solid-state laser combining a semiconductor laser with a wavelength conversion optical element; these can be used alone or in combination.
  • the visible light from the structure observation light source 2 and the near-infrared light from the narrow band light source 3 are simultaneously emitted to the subject.
  • The type of the structure observation light source 2 is likewise not particularly limited as long as the effect of the present technology is not impaired.
  • One example is a light emitting diode. Other light sources include a xenon lamp, a metal halide lamp, a high pressure mercury lamp, and the like.
  • The subject can be various things; a subject containing fluid is particularly suitable. Owing to the nature of speckle, speckle is less likely to arise from a fluid part. Therefore, when the medical system 1 according to the present disclosure images a subject containing a fluid, the boundary between the fluid part and the non-fluid part, the flow velocity of the fluid part, and the like can be obtained.
  • More specifically, the subject can be a living body in which the fluid is blood.
  • By using the medical system 1 according to the present disclosure for microscopic surgery or endoscopic surgery, it becomes possible to operate while confirming the positions of blood vessels, so safer and more accurate surgery can be performed, which can contribute to the further development of medical technology.
  • the color camera 5 images reflected light (scattered light) of visible light from a subject.
  • the color camera 5 is, for example, an RGB (Red Green Blue) imager for observing visible light.
  • IR camera 6 images the reflected light (scattered light) of near infrared light from the subject.
  • the IR camera 6 is, for example, an IR (Infrared) imager for speckle observation.
  • the wavelength separation device 4 is, for example, a dichroic mirror.
  • the wavelength separation device 4 separates the received near infrared light (reflected light) and visible light (reflected light).
  • the color camera 5 captures a visible light image (first image) obtained from the visible light separated by the wavelength separation device 4.
  • the IR camera 6 captures a speckle image (second image) obtained from the near infrared light separated by the wavelength separation device 4.
  • FIG. 2 is a diagram illustrating a configuration example of the information processing device 7 according to the first embodiment of the present disclosure.
  • the information processing device 7 is an image processing device, and includes a processing unit 71, a storage unit 72, an input unit 73, and a display unit 74 as main components.
  • SC means speckle contrast (value).
  • the processing unit 71 is realized by a CPU (Central Processing Unit), for example, and includes an acquisition unit 711, a setting unit 712, a generation unit 713, and a display control unit 714 as main components.
  • The acquisition unit 711 acquires the visible light image from the color camera 5 and the speckle image from the IR camera 6. The visible light image and the speckle image are assumed to be associated with each other with respect to the position of the subject in the image; in other words, if the two angles of view are the same, the images may be used as they are, and if not, at least one of the images is corrected so that the position of the subject matches between them.
  • The setting unit 712 sets, based on the visible light image, a calculation area for statistically processing the speckle luminance values. For example, the setting unit 712 sets, based on the visible light image, the calculation area used to calculate the speckle contrast value. More specifically, for the pixel of interest and each of a plurality of peripheral pixels, the setting unit 712 calculates a degree of association from the visible light image based on at least one of distance, difference in luminance, and difference in color, and, based on the degree of association, sets the calculation area for the pixel of interest in at least a part of the plurality of peripheral pixels (details will be described later).
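  • The patent does not fix a formula for the degree of association, so the following is only one plausible sketch: Gaussian falloffs over distance and luminance difference taken from the visible light image, with sigma_d and sigma_l as illustrative parameters not given in the text.

    import numpy as np

    def degree_of_association(visible, y, x, j, i, sigma_d=2.0, sigma_l=25.0):
        """Association (0..1) between the pixel of interest (y, x) and a
        peripheral pixel (j, i): a shorter distance and a smaller luminance
        difference both raise the degree of association. A color-difference
        term (e.g. the R channel only, since blood is red) could be added
        in the same multiplicative way."""
        lum = visible.astype(np.float64).mean(axis=-1)  # simple RGB -> luminance
        dist2 = (y - j) ** 2 + (x - i) ** 2
        dlum2 = (lum[y, x] - lum[j, i]) ** 2
        return float(np.exp(-dist2 / (2 * sigma_d ** 2))
                     * np.exp(-dlum2 / (2 * sigma_l ** 2)))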
  • The generation unit 713 calculates the speckle contrast value of each pixel of interest based on the speckle image and the calculation area set by the setting unit 712, and generates a speckle contrast image (SC image; the predetermined image) (details will be described later).
  • Speckle contrast value of the i-th pixel = (standard deviation of the intensities of the i-th pixel and its surrounding pixels) / (average of the intensities of the i-th pixel and its surrounding pixels) ... Equation (1)
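  • For reference, Equation (1) evaluated at every pixel with a conventional fixed-size calculation area can be sketched as follows (illustrative only; `size` corresponds to, for example, 5 × 5 pixels). Enlarging `size` stabilizes the statistic but lowers resolution, which is exactly the trade-off noted earlier.

    import numpy as np
    from scipy.ndimage import uniform_filter

    def sc_image_fixed(speckle, size=5):
        """Equation (1) per pixel over a fixed size x size calculation area:
        local standard deviation divided by local mean of the intensity."""
        img = speckle.astype(np.float64)
        mean = uniform_filter(img, size)
        mean_sq = uniform_filter(img * img, size)
        var = np.clip(mean_sq - mean * mean, 0.0, None)  # guard against rounding
        return np.sqrt(var) / np.maximum(mean, 1e-12)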
  • FIG. 3 is a diagram showing an example of an SC image of a pseudo blood vessel. As shown in the SC image example of FIG. 3, many speckles are observed in the non-blood flow portion, and almost no speckles are observed in the blood flow portion.
  • The generation unit 713 identifies the fluid part (for example, the blood flow part) and the non-fluid part (for example, the non-blood flow part) based on the SC image. More specifically, by judging from the SC image whether the speckle contrast value (SC value) of each pixel is at or above a predetermined SC threshold, the generation unit 713 can distinguish the blood flow part from the non-blood flow part and recognize the degree of blood flow in the blood flow part.
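  • A minimal sketch of this identification step follows; the threshold value here is hypothetical, since the patent does not specify one.

    import numpy as np

    SC_THRESHOLD = 0.5  # hypothetical value; in practice device and exposure dependent

    def classify_flow(sc_image):
        """True where the SC value is below the threshold, i.e. where motion
        (blood flow) has lowered the speckle contrast; False elsewhere."""
        return np.asarray(sc_image) < SC_THRESHOLD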
  • the display control unit 714 causes the display unit 74 to display the SC image based on the SC image generated by the generation unit 713. In that case, it is preferable to display the SC image so that the blood flow part and the non-blood flow part can be distinguished.
  • the display control unit 714 can also superimpose the SC image on the visible light image and display it on the display unit 74.
  • The storage unit 72 stores the visible light image and the speckle image acquired by the acquisition unit 711, the calculation results of each unit of the processing unit 71, and various information such as thresholds.
  • a storage device external to the medical system 1 may be used instead of the storage unit 72.
  • the input unit 73 is a means for the user to input information, and is, for example, a keyboard or a mouse.
  • Under the control of the display control unit 714, the display unit 74 displays the visible light image and the speckle image acquired by the acquisition unit 711, the calculation results of each unit of the processing unit 71, and various information such as thresholds.
  • a display device outside the medical system 1 may be used instead of the display unit 74.
  • FIG. 4 is a diagram showing an example of an SC image with a processing size of 3 × 3 pixels and an example of an SC image with a processing size of 11 × 11 pixels. Note that both SC image examples were calculated with a constant calculation area.
  • FIG. 5 is a diagram showing an example of the calculation area in a comparative example (prior art).
  • In the comparative example, the calculation area (for example, 3 × 3 pixels) is set without regard to the boundary between the region RM with motion (flow) and the region RS without motion (for example, the calculation areas R1 and R2). Therefore, when the calculation area straddles both the moving region RM and the non-moving region RS, the accuracy of the speckle contrast value decreases.
  • FIG. 6 is an explanatory diagram of a calculation example using the calculation area according to the first embodiment of the present disclosure.
  • In contrast, in the first embodiment, the setting unit 712 calculates, for the pixel of interest (for example, the pixel R12) and each of the plurality of peripheral pixels, the degree of association from the visible light image based on at least one of distance, difference in luminance, and difference in color, and based on the degree of association sets a calculation area (for example, the region R11).
  • The degree of association is determined for each peripheral pixel. For example, the shorter the distance, the higher the degree of association is judged to be; likewise, the smaller the difference in luminance, the higher the degree of association.
  • For the difference in color, the total difference over R, G, and B may be used, or the difference in only one or two of the channels. Since blood is red, for example, only the difference in the R channel of RGB may be used.
  • The setting unit 712 then, for example, takes as the calculation area only those peripheral pixels whose degree of association (a value from 0 to 1) is 0.5 or more.
  • The generation unit 713 calculates the speckle contrast value of the pixel of interest (for example, the pixel R12) based on the calculation area (for example, the region R11) set by the setting unit 712, and generates the SC image. At this time, the generation unit 713 may, for example, calculate the speckle contrast value with each peripheral pixel weighted by its degree of association (for example, multiplying by the degree of association).
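  • The weighted variant described here might look as follows, under the stated assumptions (degrees of association between 0 and 1, pixels below 0.5 excluded, remaining pixels weighted by their degree of association); this is a sketch, not the patent's definitive implementation.

    import numpy as np

    def weighted_sc(speckle_patch, assoc):
        """Weighted SC for one pixel of interest. `speckle_patch` holds the
        intensities of the pixel of interest and its peripheral pixels, and
        `assoc` their degrees of association; the pixel of interest itself is
        assumed to carry association 1.0, so the weights never sum to zero."""
        w = np.where(assoc >= 0.5, assoc, 0.0).ravel()
        v = speckle_patch.astype(np.float64).ravel()
        mean = np.average(v, weights=w)
        var = np.average((v - mean) ** 2, weights=w)
        return np.sqrt(var) / max(mean, 1e-12)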
  • The same applies to the pixel R14 and the region R13 as to the pixel R12 and the region R11.
  • FIG. 7 is a flowchart showing SC image generation processing by the information processing device 7 according to the first embodiment of the present disclosure.
  • In step S1, the acquisition unit 711 acquires a visible light image from the color camera 5.
  • In step S2, the acquisition unit 711 acquires a speckle image from the IR camera 6.
  • In step S3, the setting unit 712 calculates the degree of association between the pixel of interest and each of a plurality of peripheral pixels based on the visible light image.
  • In step S4, the setting unit 712 sets the calculation area for the pixel of interest in at least a part of the plurality of peripheral pixels based on the degrees of association calculated in step S3.
  • In step S5, the generation unit 713 calculates the speckle contrast value of the pixel of interest based on the speckle image and the calculation area set in step S4, and generates the SC image.
  • In step S6, the display control unit 714 causes the display unit 74 to display the SC image generated in step S5. After step S6, the process ends.
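  • Putting steps S3 to S5 together, a deliberately unoptimized per-pixel pass could look like the sketch below. It uses only the luminance difference, one of the options named above, to derive the degrees of association; the window radius and sigma are illustrative values, not taken from the patent.

    import numpy as np

    def generate_sc_image(visible, speckle, radius=2, sigma_l=25.0):
        """Steps S3-S5 of FIG. 7 in one pass (sketch only): derive a per-pixel
        calculation area from the visible light image, then compute the SC
        value of each pixel of interest from the speckle image."""
        lum = visible.astype(np.float64).mean(axis=-1)
        spk = speckle.astype(np.float64)
        h, w = spk.shape
        out = np.zeros((h, w))
        for y in range(radius, h - radius):
            for x in range(radius, w - radius):
                ys = slice(y - radius, y + radius + 1)
                xs = slice(x - radius, x + radius + 1)
                # S3: degree of association from the luminance difference
                assoc = np.exp(-(lum[ys, xs] - lum[y, x]) ** 2 / (2 * sigma_l ** 2))
                # S4: calculation area = peripheral pixels with association >= 0.5
                wgt = np.where(assoc >= 0.5, assoc, 0.0)
                # S5: weighted SC value for this pixel of interest
                patch = spk[ys, xs]
                mean = np.average(patch, weights=wgt)
                var = np.average((patch - mean) ** 2, weights=wgt)
                out[y, x] = np.sqrt(var) / max(mean, 1e-12)
        return out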
  • As described above, with the medical system 1 according to the first embodiment, a highly accurate predetermined image (SC image) can be generated based on an image formed by predetermined light from the subject (the speckle image) and a more appropriate calculation area.
  • Specifically, when the SC value is calculated for a pixel of interest (for example, the pixel R12) belonging to the non-moving region RS, only the peripheral pixels within the non-moving region RS (the pixels in the region R11) are used. Likewise, when the SC value is calculated for a pixel of interest (the pixel R14) belonging to the moving region RM, only the peripheral pixels within the moving region RM (the pixels in the region R13) are used. The accuracy of the SC value can therefore be improved.
  • FIG. 8 is a diagram showing a schematic image example in the first embodiment of the present disclosure.
  • FIG. 8A is a schematic diagram of a visible light image.
  • FIG. 8B is a schematic diagram of the speckle image.
  • FIG. 8C is a schematic diagram of the processed image created by the processing of this embodiment.
  • FIG. 8D is a schematic diagram of a comparative image created by conventional processing in which the size of the calculation area is constant.
  • the blood vessel BT1 is a thick blood vessel having a blood flow.
  • the blood vessel BT2 is a thin blood vessel having a blood flow.
  • the blood vessel BT3 is a thin blood vessel with no blood flow.
  • The processed image of FIG. 8C and the comparative image of FIG. 8D are alike in that neither shows the blood vessel BT3.
  • In the processed image of FIG. 8C, however, both blood vessels BT1 and BT2 are displayed clearly compared with the comparative image of FIG. 8D. That is, in the comparative image of FIG. 8D, the edges of the blood flow part of the blood vessel BT1 are blurred, and the blood vessel BT2 is blurred and difficult to see (or invisible).
  • In aneurysm clipping, blood is prevented from flowing into the aneurysm by pinching the aneurysm with a clip, which prevents future rupture. A thin blood vessel called a perforator may run near the aneurysm.
  • The perforator is thin, but it is a very important blood vessel that supplies part of the brain with nutrients. If a clip stops its blood flow, nutrients may no longer reach that part of the brain, which can result in dysfunction. Therefore, displaying the blood flow in thin blood vessels in real time so that accidental clipping can be avoided is of great utility.
  • In addition to this, application to cerebral artery bypass surgery is also possible.
  • In cerebral artery bypass surgery, an artery with poor cerebral blood flow is connected to another artery to increase the flow of arterial blood. In this operation, opening the bypass increases the blood flow; by making the blood flow in thin vessels visible, the increase in blood flow in the capillaries that nourish the brain tissue can be confirmed, and thus the effect of the surgery can be confirmed.
  • This matters because a thrombus that develops in a small blood vessel and obstructs the blood flow can leave serious sequelae.
  • FIG. 9 is a diagram showing a configuration example of the medical system 1 according to the second embodiment of the present disclosure.
  • The medical system 1 according to the second embodiment differs from the medical system 1 according to the first embodiment (FIG. 1) in that the wavelength separation device 4, the color camera 5, and the IR camera 6 are replaced by a control device 8 and a camera 9.
  • The control device 8 switches between irradiation with visible light from the structure observation light source 2 and irradiation with near-infrared light from the narrow band light source 3 at predetermined time intervals in accordance with instructions from the camera 9.
  • Besides switching the illumination light source, there is also, for example, a method of switching a filter on the observation camera.
  • The camera 9 is an imager that combines the functions of an RGB imager for visible light observation and an IR imager for speckle observation. The camera 9 also outputs to the control device 8 a control signal for switching between irradiation with visible light by the structure observation light source 2 and irradiation with near-infrared light by the narrow band light source 3.
  • the information processing device 7 is similar to that of the first embodiment.
  • In the second embodiment, the visible light image and the speckle image are captured alternately in time by the single camera 9, and the information processing device 7 uses these images to perform the same processing as in the first embodiment.
  • This simplifies the device configuration, and since visible light images and speckle images with the same angle of view are obtained easily, the processing is simplified as well.
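  • The time-division acquisition of the second embodiment can be sketched as follows; `MockLights` and `MockCamera` are stand-ins for the control device 8 and the camera 9, not interfaces defined by the patent.

    import numpy as np

    class MockLights:
        """Stand-in for the control device 8 switching the two light sources."""
        def select(self, mode):
            self.mode = mode  # "visible" or "nir"

    class MockCamera:
        """Stand-in for the single camera 9."""
        def __init__(self):
            self.rng = np.random.default_rng(0)

        def capture(self):
            return self.rng.exponential(1.0, (8, 8))  # synthetic frame

    lights, camera = MockLights(), MockCamera()
    streams = {"visible": [], "nir": []}
    for frame in range(4):
        mode = "visible" if frame % 2 == 0 else "nir"
        lights.select(mode)                     # switch illumination per frame
        streams[mode].append(camera.capture())  # same angle of view for both
    # streams["visible"] and streams["nir"] now play the roles of the visible
    # light image and the speckle image of the first embodiment.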
  • FIG. 10 is a diagram showing a configuration example of the medical system 1 according to the third embodiment of the present disclosure.
  • The medical system 1 according to the third embodiment differs from the medical system 1 according to the first embodiment (FIG. 1) in that the structure observation light source 2 and the color camera 5 are replaced by a fluorescence excitation light source 10 and a fluorescence observation camera 11 (third imaging means), respectively.
  • The fluorescence excitation light source 10 irradiates the subject with excitation light for exciting fluorescence.
  • An agent that emits fluorescence, such as an ICG agent or fluorescein, is administered to the living body serving as the subject.
  • In the first embodiment, the calculation area is set based on the visible light image; in the third embodiment, it is set based on the fluorescence observation image. Since the ICG agent and fluorescein are agents used for angiography, the fluorescence observation image reflects the course of the blood vessels and is useful for setting the calculation area.
  • the acquisition unit 711 acquires the fluorescence observation image (third image) from the fluorescence observation camera 11 that images the fluorescence from the subject.
  • the setting unit 712 sets the calculation area based on the fluorescence observation image.
  • the generation unit 713 generates the SC image based on the speckle image and the calculation area set by the setting unit 712.
  • In this way, the SC value of the blood flow part can be calculated using, as the calculation area, only the region with a high fluorescence brightness value (the blood flow region), so a highly accurate SC image can be generated.
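  • A minimal sketch of this region selection, assuming a grayscale fluorescence observation image; the brightness cutoff is hypothetical, as the patent does not give a value.

    import numpy as np

    FLUOR_THRESHOLD = 128.0  # hypothetical brightness cutoff

    def calc_area_mask(fluorescence):
        """Restrict the calculation area to pixels with a high fluorescence
        brightness value (the blood flow region visualized by ICG/fluorescein)."""
        return np.asarray(fluorescence, dtype=np.float64) >= FLUOR_THRESHOLD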
  • Although fluorescence observation using an ICG agent or fluorescein has been described here, fluorescence observation using another agent may also be used.
  • FIG. 11 is a schematic diagram showing a mixed image according to the fourth embodiment of the present disclosure.
  • This mixed image is obtained, for example, by a method using so-called space division two-step exposure.
  • This mixed image includes visible light pixels and IR pixels alternately in both the vertical and horizontal directions in one frame.
  • In FIG. 11, the lighter pixels are visible light pixels and the darker pixels are IR pixels.
  • The processing unit 71 of the information processing device 7 acquires such a mixed image from an imaging device (not shown). The processing unit 71 then extracts the visible light pixels and the IR pixels from the mixed image and can perform the same processing as when the visible light image and the speckle image are acquired separately.
  • In this way, with the medical system 1 according to the fourth embodiment, by using a single type of mixed image, the same processing as in the first embodiment can be performed with a single imaging device, without switching the irradiation of the structure observation light source 2 and the narrow band light source 3 each time.
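  • Separating the two pixel populations of the mixed image can be sketched as below; which checkerboard parity holds the visible pixels is an assumption. The empty positions of each half-image would then be filled by interpolation from neighboring pixels before the first-embodiment processing is applied; the patent does not prescribe an interpolation method.

    import numpy as np

    def split_mixed_image(mixed):
        """One frame alternates visible and IR pixels in both the vertical and
        horizontal directions (a checkerboard). Returns the two sparse images
        plus the visible-pixel mask."""
        rows, cols = np.indices(mixed.shape)
        vis_mask = (rows + cols) % 2 == 0  # parity choice is an assumption
        visible = np.where(vis_mask, mixed, 0.0)
        ir = np.where(vis_mask, 0.0, mixed)
        return visible, ir, vis_mask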
  • the technology according to the present disclosure can be applied to various products.
  • the technology according to the present disclosure may be applied to an endoscopic surgery system.
  • FIG. 12 is a diagram showing an example of a schematic configuration of an endoscopic surgery system 5000 to which the technology according to the present disclosure can be applied.
  • In FIG. 12, a surgeon (doctor) 5067 is shown performing an operation on a patient 5071 on a patient bed 5069 using the endoscopic surgery system 5000.
  • As illustrated, the endoscopic surgery system 5000 includes an endoscope 5001, other surgical tools 5017, a support arm device 5027 that supports the endoscope 5001, and a cart 5037 on which various devices for endoscopic surgery are mounted.
  • In endoscopic surgery, instead of cutting open the abdominal wall, tubular opening instruments called trocars 5025a to 5025d are punctured into the abdominal wall. The lens barrel 5003 of the endoscope 5001 and the other surgical tools 5017 are then inserted into the body cavity of the patient 5071 through the trocars 5025a to 5025d.
  • a pneumoperitoneum tube 5019, an energy treatment tool 5021, and forceps 5023 are inserted into the body cavity of the patient 5071 as other surgical tools 5017.
  • the energy treatment tool 5021 is a treatment tool that performs incision and separation of tissue, sealing of blood vessels, or the like by high-frequency current or ultrasonic vibration.
  • The illustrated surgical tools 5017 are merely an example; various surgical tools generally used in endoscopic surgery, such as tweezers and retractors, may be used as the surgical tools 5017.
  • An image of the surgical site in the body cavity of the patient 5071 taken by the endoscope 5001 is displayed on the display device 5041.
  • The surgeon 5067 performs procedures such as excising the affected part using the energy treatment tool 5021 and the forceps 5023 while viewing in real time the image of the surgical site displayed on the display device 5041.
  • Although not shown, the pneumoperitoneum tube 5019, the energy treatment tool 5021, and the forceps 5023 are supported by the surgeon 5067, an assistant, or the like during surgery.
  • the support arm device 5027 includes an arm portion 5031 extending from the base portion 5029.
  • the arm portion 5031 includes joint portions 5033a, 5033b, 5033c, and links 5035a, 5035b, and is driven by control from the arm control device 5045.
  • the endoscope 5001 is supported by the arm portion 5031, and its position and posture are controlled. Thereby, stable fixing of the position of the endoscope 5001 can be realized.
  • The endoscope 5001 includes the lens barrel 5003, a region of predetermined length from the distal end of which is inserted into the body cavity of the patient 5071, and the camera head 5005 connected to the proximal end of the lens barrel 5003.
  • In the illustrated example, the endoscope 5001 is configured as a so-called rigid endoscope having a rigid lens barrel 5003, but the endoscope 5001 may instead be configured as a so-called flexible endoscope having a flexible lens barrel.
  • An opening in which an objective lens is fitted is provided at the tip of the lens barrel 5003.
  • A light source device 5043 is connected to the endoscope 5001; light generated by the light source device 5043 is guided to the tip of the lens barrel by a light guide extending inside the lens barrel 5003, and is emitted through the objective lens toward the observation target (subject) in the body cavity of the patient 5071.
  • The endoscope 5001 may be a forward-viewing endoscope, an oblique-viewing endoscope, or a side-viewing endoscope.
  • An optical system and an image pickup device are provided inside the camera head 5005, and the reflected light (observation light) from the observation target is focused on the image pickup device by the optical system.
  • the observation light is photoelectrically converted by the imaging element, and an electric signal corresponding to the observation light, that is, an image signal corresponding to the observation image is generated.
  • the image signal is transmitted to the camera control unit (CCU) 5039 as RAW data.
  • the camera head 5005 has a function of adjusting the magnification and the focal length by appropriately driving the optical system.
  • the camera head 5005 may be provided with a plurality of image pickup elements in order to support, for example, stereoscopic vision (3D display).
  • a plurality of relay optical systems are provided inside the lens barrel 5003 to guide the observation light to each of the plurality of image pickup devices.
  • the CCU 5039 includes a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), and the like, and controls the operations of the endoscope 5001 and the display device 5041 in a centralized manner. Specifically, the CCU 5039 performs various image processing such as development processing (demosaic processing) on the image signal received from the camera head 5005 to display an image based on the image signal. The CCU 5039 provides the display device 5041 with the image signal subjected to the image processing. The CCU 5039 also transmits a control signal to the camera head 5005 to control the driving thereof.
  • the control signal may include information regarding imaging conditions such as magnification and focal length.
  • the display device 5041 displays an image based on an image signal subjected to image processing by the CCU 5039 under the control of the CCU 5039.
  • When the endoscope 5001 supports high-resolution imaging such as 4K (3840 horizontal pixels × 2160 vertical pixels) or 8K (7680 horizontal pixels × 4320 vertical pixels), and/or 3D display, a display device 5041 capable of the corresponding high-resolution display and/or 3D display is used. For high-resolution imaging such as 4K or 8K, a display device 5041 with a screen size of 55 inches or more gives a more immersive experience. A plurality of display devices 5041 with different resolutions and sizes may also be provided depending on the application.
  • the light source device 5043 is composed of a light source such as an LED (light emitting diode), and supplies irradiation light to the endoscope 5001 when the surgical site is imaged.
  • the arm control device 5045 is configured by a processor such as a CPU, for example, and operates according to a predetermined program to control the drive of the arm portion 5031 of the support arm device 5027 according to a predetermined control method.
  • the input device 5047 is an input interface for the endoscopic surgery system 5000.
  • the user can input various kinds of information and instructions to the endoscopic surgery system 5000 via the input device 5047.
  • the user inputs various kinds of information regarding the surgery, such as the physical information of the patient and the information regarding the surgical procedure, through the input device 5047.
  • For example, the user inputs, via the input device 5047, an instruction to drive the arm portion 5031, an instruction to change the imaging conditions of the endoscope 5001 (type of irradiation light, magnification, focal length, and the like), an instruction to drive the energy treatment tool 5021, and so on.
  • the type of the input device 5047 is not limited, and the input device 5047 may be various known input devices.
  • the input device 5047 for example, a mouse, a keyboard, a touch panel, a switch, a foot switch 5057 and / or a lever can be applied.
  • the touch panel may be provided on the display surface of the display device 5041.
  • Alternatively, the input device 5047 may be a device worn by the user, such as a glasses-type wearable device or an HMD (Head Mounted Display), in which case various inputs are made according to the user's gestures or line of sight detected by the device. The input device 5047 may also include a camera capable of detecting the user's movement, with various inputs made according to gestures or line of sight detected from the captured video, or a microphone capable of picking up the user's voice, with various inputs made by voice through the microphone.
  • Because the input device 5047 can accept various kinds of information without contact, a user belonging to the clean area (for example, the surgeon 5067) can operate devices belonging to the unclean area without contact. In addition, since the user can operate devices without taking a hand off the surgical tools, user convenience is improved.
  • the treatment instrument control device 5049 controls driving of the energy treatment instrument 5021 for cauterization of tissue, incision, sealing of blood vessel, or the like.
  • The pneumoperitoneum device 5051 sends gas into the body cavity of the patient 5071 through the pneumoperitoneum tube 5019 in order to inflate the body cavity and thereby secure the field of view of the endoscope 5001 and working space for the surgeon.
  • the recorder 5053 is a device capable of recording various information regarding surgery.
  • the printer 5055 is a device that can print various types of information regarding surgery in various formats such as text, images, or graphs.
  • the support arm device 5027 includes a base portion 5029 that is a base and an arm portion 5031 that extends from the base portion 5029.
  • The arm portion 5031 is composed of a plurality of joint portions 5033a, 5033b, 5033c and a plurality of links 5035a, 5035b connected by the joint portion 5033b; in FIG. 12, however, the configuration of the arm portion 5031 is illustrated in simplified form. In practice, the shapes, numbers, and arrangement of the joint portions 5033a to 5033c and the links 5035a and 5035b, the directions of the rotation axes of the joint portions 5033a to 5033c, and the like can be set appropriately so that the arm portion 5031 has the desired degrees of freedom.
  • For example, the arm portion 5031 can suitably be configured to have six or more degrees of freedom. The endoscope 5001 can then be moved freely within the movable range of the arm portion 5031, so the lens barrel 5003 of the endoscope 5001 can be inserted into the body cavity of the patient 5071 from a desired direction.
  • An actuator is provided in each of the joints 5033a to 5033c, and the joints 5033a to 5033c are configured to be rotatable about a predetermined rotation axis by driving the actuator.
  • the drive of the actuator is controlled by the arm controller 5045, so that the rotation angles of the joints 5033a to 5033c are controlled and the drive of the arm 5031 is controlled. Thereby, control of the position and orientation of the endoscope 5001 can be realized.
  • the arm control device 5045 can control the driving of the arm unit 5031 by various known control methods such as force control or position control.
  • For example, when the surgeon 5067 performs an appropriate operation input via the input device 5047 (including the foot switch 5057), the arm control device 5045 controls the drive of the arm portion 5031 in accordance with the operation input, and the position and posture of the endoscope 5001 are controlled accordingly. With this control, the endoscope 5001 at the tip of the arm portion 5031 can be moved from an arbitrary position to another arbitrary position and then fixedly supported at the new position.
  • the arm portion 5031 may be operated by a so-called master slave method.
  • the arm unit 5031 can be remotely operated by the user via the input device 5047 installed at a place apart from the operating room.
  • When force control is applied, the arm control device 5045 may receive an external force from the user and drive the actuators of the joint portions 5033a to 5033c so that the arm portion 5031 moves smoothly following that force.
  • In general endoscopic surgery, the endoscope 5001 has so far been supported by a doctor called a scopist. Using the support arm device 5027, by contrast, the position of the endoscope 5001 can be fixed more reliably without human hands, so an image of the surgical site is obtained stably and surgery can be performed smoothly.
  • the arm control device 5045 does not necessarily have to be provided on the cart 5037. Also, the arm control device 5045 does not necessarily have to be one device. For example, the arm control device 5045 may be provided in each of the joint parts 5033a to 5033c of the arm part 5031 of the support arm device 5027, and the plurality of arm control devices 5045 cooperate with each other to drive the arm part 5031. Control may be implemented.
  • the light source device 5043 supplies the endoscope 5001 with irradiation light for imaging the surgical site.
  • the light source device 5043 is composed of, for example, an LED, a laser light source, or a white light source configured by a combination thereof.
  • When the white light source is formed by a combination of RGB laser light sources, the output intensity and output timing of each color (each wavelength) can be controlled with high accuracy, so the white balance of the captured image can be adjusted in the light source device 5043.
  • In this case, the observation target is irradiated with the laser light from each of the RGB laser light sources in a time-division manner, and the drive of the image pickup element of the camera head 5005 is controlled in synchronization with the irradiation timing, so that images corresponding to each of R, G, and B can be captured in a time-division manner. With this method, a color image can be obtained without providing a color filter on the image pickup element.
  • The drive of the light source device 5043 may also be controlled so as to change the intensity of the output light at predetermined time intervals.
  • By controlling the drive of the image sensor of the camera head 5005 in synchronization with the timing of those intensity changes, acquiring images in a time-division manner, and combining them, a high-dynamic-range image free of so-called blocked-up shadows and blown-out highlights can be generated.
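  • As an illustration only (the actual synthesis in the CCU is not specified here), frames captured under alternating light intensities could be merged as follows, normalizing each frame by its known light gain and favoring well-exposed pixels.

    import numpy as np

    def merge_hdr(frames, gains, white=255.0):
        """Combine time-division frames taken at different light intensities:
        divide each frame by its gain to bring all frames to a common scale,
        then average with weights that favor mid-tone (well-exposed) pixels."""
        acc = np.zeros_like(np.asarray(frames[0], dtype=np.float64))
        wacc = np.zeros_like(acc)
        for f, g in zip(frames, gains):
            f = np.asarray(f, dtype=np.float64)
            w = np.clip(1.0 - np.abs(f / white - 0.5) * 2.0, 0.01, 1.0)
            acc += w * (f / g)
            wacc += w
        return acc / wacc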
  • the light source device 5043 may be configured to be able to supply light in a predetermined wavelength band corresponding to special light observation.
  • In special light observation, for example, so-called narrow band imaging is performed: exploiting the wavelength dependence of light absorption in body tissue, light of a narrower band than the irradiation light used in normal observation (that is, white light) is emitted, and predetermined tissue such as blood vessels in the mucosal surface layer is imaged with high contrast.
  • Alternatively, in special light observation, fluorescence observation may be performed, in which an image is obtained from the fluorescence generated by irradiation with excitation light.
  • In fluorescence observation, body tissue may be irradiated with excitation light and the fluorescence from the body tissue observed (autofluorescence observation), or a reagent such as indocyanine green (ICG) may be locally injected into body tissue and the tissue irradiated with excitation light corresponding to the fluorescence wavelength of the reagent to obtain a fluorescence image.
  • the light source device 5043 may be configured to be capable of supplying narrowband light and / or excitation light compatible with such special light observation.
  • FIG. 13 is a block diagram showing an example of the functional configuration of the camera head 5005 and CCU 5039 shown in FIG.
  • the camera head 5005 has a lens unit 5007, an imaging unit 5009, a drive unit 5011, a communication unit 5013, and a camera head control unit 5015 as its functions.
  • the CCU 5039 also has a communication unit 5059, an image processing unit 5061, and a control unit 5063 as its functions.
  • the camera head 5005 and the CCU 5039 are bidirectionally connected by a transmission cable 5065.
  • the lens unit 5007 is an optical system provided in a connection portion with the lens barrel 5003.
  • the observation light taken in from the tip of the lens barrel 5003 is guided to the camera head 5005 and enters the lens unit 5007.
  • the lens unit 5007 is configured by combining a plurality of lenses including a zoom lens and a focus lens.
  • the optical characteristics of the lens unit 5007 are adjusted so that the observation light is condensed on the light receiving surface of the image pickup element of the image pickup section 5009.
  • the zoom lens and the focus lens are configured so that their positions on the optical axis can be moved in order to adjust the magnification and focus of the captured image.
  • the image pickup unit 5009 is composed of an image pickup element, and is arranged in the latter stage of the lens unit 5007.
  • the observation light that has passed through the lens unit 5007 is condensed on the light receiving surface of the image pickup element, and an image signal corresponding to the observation image is generated by photoelectric conversion.
  • the image signal generated by the imaging unit 5009 is provided to the communication unit 5013.
  • As the image pickup element constituting the imaging unit 5009, for example, a CMOS (Complementary Metal Oxide Semiconductor) image sensor with a Bayer array capable of color imaging is used. An element capable of capturing high-resolution images of 4K or higher may also be used; obtaining the image of the surgical site at high resolution lets the surgeon 5067 grasp the state of the surgical site in more detail, so the surgery can proceed more smoothly.
  • the image pickup device constituting the image pickup unit 5009 is configured to have a pair of image pickup devices for respectively obtaining the image signals for the right eye and the left eye corresponding to 3D display. By performing the 3D display, the operator 5067 can more accurately grasp the depth of the living tissue in the operation site.
  • When the imaging unit 5009 is of a multi-plate type, a plurality of lens unit 5007 systems are provided, one for each image pickup element.
  • the image pickup unit 5009 does not necessarily have to be provided on the camera head 5005.
  • the imaging unit 5009 may be provided inside the lens barrel 5003 immediately after the objective lens.
  • the drive unit 5011 is composed of an actuator, and moves the zoom lens and the focus lens of the lens unit 5007 by a predetermined distance along the optical axis under the control of the camera head control unit 5015. As a result, the magnification and focus of the image captured by the image capturing unit 5009 can be adjusted appropriately.
  • the communication unit 5013 is composed of a communication device for transmitting and receiving various information to and from the CCU 5039.
  • the communication unit 5013 transmits the image signal obtained from the image capturing unit 5009 as RAW data to the CCU 5039 via the transmission cable 5065.
  • the image signal is preferably transmitted by optical communication in order to display the captured image of the surgical site with low latency.
  • This is because the surgeon 5067 performs the operation while observing the state of the affected area through the captured image, and for safer and more reliable surgery the moving image of the surgical site must be displayed in as close to real time as possible.
  • the communication unit 5013 is provided with a photoelectric conversion module that converts an electric signal into an optical signal.
  • the image signal is converted into an optical signal by the photoelectric conversion module and then transmitted to the CCU 5039 via the transmission cable 5065.
  • the communication unit 5013 also receives a control signal for controlling the driving of the camera head 5005 from the CCU 5039.
  • The control signal includes information on imaging conditions, such as information specifying the frame rate of the captured image, information specifying the exposure value at the time of imaging, and/or information specifying the magnification and focus of the captured image.
  • the communication unit 5013 provides the received control signal to the camera head control unit 5015.
  • the control signal from the CCU 5039 may also be transmitted by optical communication.
  • the communication unit 5013 is provided with a photoelectric conversion module that converts an optical signal into an electric signal, and the control signal is converted into an electric signal by the photoelectric conversion module and then provided to the camera head control unit 5015.
  • the imaging conditions such as the frame rate, the exposure value, the magnification, and the focus described above are automatically set by the control unit 5063 of the CCU 5039 based on the acquired image signal. That is, a so-called AE (Auto Exposure) function, AF (Auto Focus) function, and AWB (Auto White Balance) function are mounted on the endoscope 5001.
  • the camera head controller 5015 controls driving of the camera head 5005 based on a control signal from the CCU 5039 received via the communication unit 5013.
  • the camera head control unit 5015 controls the driving of the image pickup device of the image pickup unit 5009 based on the information indicating the frame rate of the captured image and / or the information indicating the exposure at the time of image capturing.
  • For example, the camera head control unit 5015 appropriately moves the zoom lens and the focus lens of the lens unit 5007 via the drive unit 5011 based on information designating the magnification and focus of the captured image.
  • the camera head controller 5015 may further have a function of storing information for identifying the lens barrel 5003 and the camera head 5005.
  • By arranging the lens unit 5007, the imaging unit 5009, and the like in a sealed structure with high airtightness and waterproofness, the camera head 5005 can be made resistant to autoclave sterilization.
  • the communication unit 5059 is composed of a communication device for transmitting and receiving various information with the camera head 5005.
  • the communication unit 5059 receives the image signal transmitted from the camera head 5005 via the transmission cable 5065.
  • the image signal can be preferably transmitted by optical communication.
  • the communication unit 5059 is provided with a photoelectric conversion module that converts an optical signal into an electrical signal in response to optical communication.
  • the communication unit 5059 provides the image signal converted into the electric signal to the image processing unit 5061.
  • the communication unit 5059 transmits a control signal for controlling the driving of the camera head 5005 to the camera head 5005.
  • the control signal may also be transmitted by optical communication.
  • the image processing unit 5061 performs various kinds of image processing on the image signal that is the RAW data transmitted from the camera head 5005.
  • The image processing includes various known signal processing, for example development processing, image quality enhancement processing (band emphasis processing, super-resolution processing, NR (noise reduction) processing, and/or camera shake correction processing, and the like), and/or enlargement processing (electronic zoom processing).
  • the image processing unit 5061 also performs detection processing on the image signal for performing AE, AF, and AWB.
  • The image processing unit 5061 includes a processor such as a CPU or GPU, and the image processing and detection processing described above are performed by the processor operating according to a predetermined program.
  • When the image processing unit 5061 includes a plurality of GPUs, it divides the information related to the image signal appropriately, and the GPUs perform image processing in parallel.
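  • As a rough sketch of such tile-parallel processing, the following splits a frame into horizontal strips and filters them concurrently; the process pool stands in for the GPUs, the box-blur filter is a hypothetical placeholder, and overlap handling at strip boundaries is omitted for brevity.

```python
import numpy as np
from concurrent.futures import ProcessPoolExecutor

def filter_strip(strip: np.ndarray) -> np.ndarray:
    """Stand-in per-strip operation (a simple 3x3 box blur)."""
    padded = np.pad(strip, 1, mode="edge")
    out = np.zeros(strip.shape, dtype=np.float64)
    for dy in (0, 1, 2):
        for dx in (0, 1, 2):
            out += padded[dy:dy + strip.shape[0], dx:dx + strip.shape[1]]
    return out / 9.0

def process_in_parallel(image: np.ndarray, workers: int = 4) -> np.ndarray:
    """Divide the frame into horizontal strips and filter them concurrently."""
    strips = np.array_split(image.astype(np.float64), workers, axis=0)
    with ProcessPoolExecutor(max_workers=workers) as pool:
        results = list(pool.map(filter_strip, strips))
    return np.concatenate(results, axis=0)
```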
  • The control unit 5063 performs various controls regarding imaging of the surgical site by the endoscope 5001 and display of the captured image. For example, the control unit 5063 generates a control signal for controlling the driving of the camera head 5005. If imaging conditions have been input by the user, the control unit 5063 generates the control signal based on that input. Alternatively, when the endoscope 5001 is equipped with the AE, AF, and AWB functions, the control unit 5063 appropriately calculates the optimum exposure value, focal length, and white balance according to the result of the detection processing by the image processing unit 5061, and generates a control signal.
  • control unit 5063 causes the display device 5041 to display the image of the surgical site based on the image signal subjected to the image processing by the image processing unit 5061.
  • the control unit 5063 recognizes various objects in the operation region image using various image recognition techniques.
  • For example, the control unit 5063 can recognize a surgical instrument such as forceps, a specific body part, bleeding, mist during use of the energy treatment instrument 5021, and the like by detecting the shapes, colors, and edges of objects included in the surgical site image.
  • the control unit 5063 uses the recognition result to superimpose and display various types of surgical support information on the image of the surgical site. By displaying the surgery support information in a superimposed manner and presenting it to the operator 5067, it becomes possible to proceed with the surgery more safely and reliably.
  • a transmission cable 5065 connecting the camera head 5005 and the CCU 5039 is an electric signal cable compatible with electric signal communication, an optical fiber compatible with optical communication, or a composite cable of these.
  • Here, wired communication is performed using the transmission cable 5065, but communication between the camera head 5005 and the CCU 5039 may be performed wirelessly.
  • When the communication between the two is performed wirelessly, the transmission cable 5065 need not be laid in the operating room, which eliminates situations in which the cable hinders the movement of medical staff in the operating room.
  • An example of the endoscopic surgery system 5000 to which the technology according to the present disclosure can be applied has been described above.
  • Although the endoscopic surgery system 5000 has been described here as an example, systems to which the technology according to the present disclosure can be applied are not limited to this example.
  • For example, the technology according to the present disclosure may be applied to a flexible endoscope system for examination or to the microscopic surgery system described in Application Example 2 below.
  • The technology according to the present disclosure can be suitably applied to the endoscope 5001 among the configurations described above. Specifically, it can be applied when a blood-flow portion and a non-blood-flow portion in the image of the surgical site in the body cavity of the patient 5071 captured by the endoscope 5001 are displayed on the display device 5041 in an easily visible manner.
  • By applying the technology according to the present disclosure to the endoscope 5001, the calculation region for calculating the SC value in the speckle imaging technique can be set more appropriately, and a good SC image can be generated.
  • As a result, the operator 5067 can view an image of the surgical site in which the blood-flow portion and the non-blood-flow portion are accurately identified on the display device 5041 in real time, and can perform the surgery more safely.
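  • For reference, the SC (speckle contrast) value itself is the standard ratio of the local standard deviation to the local mean of the speckle luminance. The following is a minimal sketch of a plain, unweighted SC image computed over a fixed square window; the window size is illustrative, and the adaptive calculation region of the present disclosure is not reproduced here.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def speckle_contrast_image(speckle: np.ndarray, window: int = 5) -> np.ndarray:
    """Per-pixel speckle contrast SC = sigma / mean over a window x window
    neighborhood of the speckle image.

    In a blood-flow region the moving scatterers blur the speckle pattern,
    lowering SC; a static (non-blood-flow) region keeps a high SC.
    """
    img = speckle.astype(np.float64)
    mean = uniform_filter(img, size=window)
    mean_sq = uniform_filter(img * img, size=window)
    variance = np.maximum(mean_sq - mean * mean, 0.0)  # guard against round-off
    return np.sqrt(variance) / np.maximum(mean, 1e-9)
```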
  • the technology according to the present disclosure may be applied to a microscopic surgery system used for so-called microsurgery, which is performed while observing a microscopic part of a patient in an enlarged manner.
  • FIG. 14 is a diagram showing an example of a schematic configuration of a microscopic surgery system 5300 to which the technology according to the present disclosure can be applied.
  • the microscopic surgery system 5300 includes a microscope device 5301, a control device 5317, and a display device 5319.
  • In the following description, the “user” refers to the operator, an assistant, or any other medical staff member who uses the microscopic surgery system 5300.
  • The microscope device 5301 includes a microscope unit 5303 for magnifying and observing an observation target (the surgical site of a patient), an arm unit 5309 that supports the microscope unit 5303 at its tip, and a base unit 5315 that supports the base end of the arm unit 5309.
  • The microscope unit 5303 includes a substantially cylindrical tubular portion 5305, an imaging unit (not shown) provided inside the tubular portion 5305, and an operation unit 5307 provided in a partial area on the outer periphery of the tubular portion 5305.
  • the microscope unit 5303 is an electronic imaging type microscope unit (so-called video type microscope unit) that electronically captures a captured image by the imaging unit.
  • a cover glass that protects the internal image pickup unit is provided on the opening surface at the lower end of the tubular portion 5305.
  • Light from the observation target (hereinafter, also referred to as observation light) passes through the cover glass and enters the imaging unit inside the tubular portion 5305.
  • A light source such as an LED (Light Emitting Diode) may be provided inside the tubular portion 5305, and the observation target may be illuminated by light emitted from this light source through the cover glass during imaging.
  • the imaging unit is composed of an optical system that collects the observation light and an imaging device that receives the observation light that is collected by the optical system.
  • The optical system is configured by combining a plurality of lenses including a zoom lens and a focus lens, and its optical characteristics are adjusted so that the observation light forms an image on the light-receiving surface of the image pickup device.
  • the imaging device receives the observation light and photoelectrically converts the observation light to generate a signal corresponding to the observation light, that is, an image signal corresponding to the observation image.
  • As the image pickup device, for example, a device having a Bayer array and capable of color imaging is used.
  • the image pickup device may be various known image pickup devices such as a CMOS (Complementary Metal Oxide Semiconductor) image sensor or a CCD (Charge Coupled Device) image sensor.
  • the image signal generated by the image sensor is transmitted to the control device 5317 as RAW data.
  • The transmission of the image signal is preferably performed by optical communication.
  • This is because the surgeon performs surgery while observing the state of the affected area through the captured image, so for safer and more reliable surgery the moving image of the surgical site should be displayed in real time to the extent possible.
  • Transmitting the image signal by optical communication makes it possible to display the captured image with low latency.
  • the image pickup unit may have a drive mechanism that moves the zoom lens and the focus lens of the optical system along the optical axis. By appropriately moving the zoom lens and the focus lens by the drive mechanism, the magnification of the captured image and the focal length at the time of capturing can be adjusted. Further, the image pickup unit may be equipped with various functions such as an AE (Auto Exposure) function and an AF (Auto Focus) function, which are generally provided in an electronic image pickup type microscope unit.
  • the image pickup unit may be configured as a so-called single-plate type image pickup unit having one image pickup device, or may be configured as a so-called multi-plate type image pickup unit having a plurality of image pickup devices.
  • In the case of a multi-plate type, image signals corresponding to RGB are generated by the respective image pickup elements, and a color image may be obtained by combining them.
  • the image capturing unit may be configured to have a pair of image capturing elements for respectively acquiring image signals for the right eye and the left eye corresponding to stereoscopic vision (3D display).
  • 3D display enables the operator to more accurately understand the depth of the living tissue in the operation site.
  • In the case of a multi-plate type, a plurality of optical systems may be provided corresponding to the respective image pickup elements.
  • the operation unit 5307 is an input unit that is configured by, for example, a cross lever or a switch, and that accepts a user operation input.
  • the user can input, via the operation unit 5307, an instruction to change the magnification of the observation image and the focal length to the observation target.
  • When such an instruction is input, the drive mechanism of the imaging unit appropriately moves the zoom lens and the focus lens, so that the magnification and the focal length are adjusted.
  • the user can input an instruction to switch the operation mode (all-free mode and fixed mode described later) of the arm unit 5309 via the operation unit 5307.
  • The operation unit 5307 is preferably provided at a position where the user can easily operate it with a finger while gripping the tubular portion 5305, so that it can be operated even while the user is moving the tubular portion 5305.
  • The arm unit 5309 is configured by a plurality of links (first link 5313a to sixth link 5313f) rotatably connected to each other by a plurality of joint portions (first joint portion 5311a to sixth joint portion 5311f).
  • The first joint portion 5311a has a substantially columnar shape, and at its tip (lower end) supports the upper end of the tubular portion 5305 of the microscope unit 5303 so as to be rotatable about a rotation axis (first axis O1) parallel to the central axis of the tubular portion 5305.
  • the first joint portion 5311a may be configured such that the first axis O1 coincides with the optical axis of the imaging unit of the microscope unit 5303. Accordingly, by rotating the microscope unit 5303 around the first axis O1, it is possible to change the field of view so as to rotate the captured image.
  • the first link 5313a fixedly supports the first joint portion 5311a at the tip.
  • The first link 5313a is a substantially L-shaped rod-shaped member; one side of it extends in a direction orthogonal to the first axis O1, and the end of that side is connected to the first joint portion 5311a so as to abut the upper end of the outer periphery of the first joint portion 5311a.
  • the second joint 5311b is connected to the end of the other side of the first link 5313a on the base end side of the substantially L shape.
  • The second joint portion 5311b has a substantially columnar shape, and its tip supports the base end of the first link 5313a so as to be rotatable about a rotation axis (second axis O2) orthogonal to the first axis O1.
  • the tip end of the second link 5313b is fixedly connected to the base end of the second joint portion 5311b.
  • The second link 5313b is a substantially L-shaped rod-shaped member; one side of it extends in a direction orthogonal to the second axis O2, and the end of that side is fixedly connected to the base end of the second joint portion 5311b.
  • the third joint portion 5311c is connected to the other side of the second link 5313b on the base end side of the substantially L shape.
  • The third joint portion 5311c has a substantially cylindrical shape, and at its tip supports the base end of the second link 5313b so as to be rotatable about a rotation axis (third axis O3) orthogonal to both the first axis O1 and the second axis O2.
  • The tip end of the third link 5313c is fixedly connected to the base end of the third joint portion 5311c. By rotating the configuration on the tip side including the microscope unit 5303 about the second axis O2 and the third axis O3, the microscope unit 5303 can be moved so that its position within the horizontal plane changes. That is, by controlling the rotation about the second axis O2 and the third axis O3, the field of view of the captured image can be moved within a plane.
  • The third link 5313c is configured such that its tip end side has a substantially columnar shape, and the base end of the third joint portion 5311c is fixedly connected to the tip end of that columnar shape so that the two have substantially the same central axis.
  • the base end side of the third link 5313c has a prismatic shape, and the fourth joint 5311d is connected to the end thereof.
  • The fourth joint portion 5311d has a substantially columnar shape, and its tip supports the base end of the third link 5313c so as to be rotatable about a rotation axis (fourth axis O4) orthogonal to the third axis O3.
  • the tip end of the fourth link 5313d is fixedly connected to the base end of the fourth joint portion 5311d.
  • the fourth link 5313d is a rod-shaped member that extends in a substantially straight line.
  • The fourth link 5313d extends so as to be orthogonal to the fourth axis O4, and the end of its tip is fixedly connected to the fourth joint portion 5311d so as to abut the substantially columnar side surface of the fourth joint portion 5311d.
  • the fifth joint 5311e is connected to the base end of the fourth link 5313d.
  • The fifth joint portion 5311e has a substantially columnar shape, and on its tip side supports the base end of the fourth link 5313d so as to be rotatable about a rotation axis (fifth axis O5) parallel to the fourth axis O4.
  • the tip end of the fifth link 5313e is fixedly connected to the base end of the fifth joint portion 5311e.
  • The fourth axis O4 and the fifth axis O5 are rotation axes that allow the microscope unit 5303 to be moved in the vertical direction. By rotating the configuration on the tip side including the microscope unit 5303 about the fourth axis O4 and the fifth axis O5, the height of the microscope unit 5303, that is, the distance between the microscope unit 5303 and the observation target, can be adjusted.
  • The fifth link 5313e is configured by combining a first member having a substantially L shape, one side of which extends in the vertical direction and the other side of which extends in the horizontal direction, with a rod-shaped second member that extends vertically downward from the horizontally extending portion of the first member.
  • The base end of the fifth joint portion 5311e is fixedly connected to the vicinity of the upper end of the vertically extending portion of the first member of the fifth link 5313e.
  • the sixth joint 5311f is connected to the base end (lower end) of the second member of the fifth link 5313e.
  • the sixth joint 5311f has a substantially columnar shape, and supports the base end of the fifth link 5313e on the tip side thereof so as to be rotatable about a rotation axis (sixth axis O6) parallel to the vertical direction.
  • the tip end of the sixth link 5313f is fixedly connected to the base end of the sixth joint portion 5311f.
  • the sixth link 5313f is a rod-shaped member extending in the vertical direction, and its base end is fixedly connected to the upper surface of the base portion 5315.
  • the rotatable range of the first joint portion 5311a to the sixth joint portion 5311f is appropriately set so that the microscope unit 5303 can move as desired.
  • With the arm unit 5309 configured as described above, three translational degrees of freedom and three rotational degrees of freedom, that is, a total of six degrees of freedom, can be realized for the movement of the microscope unit 5303.
  • By configuring the arm unit 5309 to realize six degrees of freedom for the movement of the microscope unit 5303, the position and orientation of the microscope unit 5303 can be controlled freely within the movable range of the arm unit 5309. The surgical site can therefore be observed from any angle, and the surgery can be performed more smoothly.
  • The illustrated configuration of the arm unit 5309 is merely an example; the number and shape (length) of the links constituting the arm unit 5309, as well as the number, arrangement, and rotation-axis directions of the joint portions, may be appropriately designed so that the desired degrees of freedom are realized.
  • In order to move the microscope unit 5303 freely, the arm unit 5309 is preferably configured with six degrees of freedom, but it may also be configured with more degrees of freedom (that is, redundant degrees of freedom).
  • When redundant degrees of freedom exist, the posture of the arm unit 5309 can be changed while the position and posture of the microscope unit 5303 are kept fixed. For example, a control that is more convenient for the operator can be realized, such as controlling the posture of the arm unit 5309 so that it does not block the view of an operator looking at the display device 5319.
  • Each of the first joint portion 5311a to the sixth joint portion 5311f may be provided with an actuator that includes a drive mechanism such as a motor and an encoder that detects the rotation angle of the joint. By appropriately controlling the driving of each actuator with the control device 5317, the posture of the arm unit 5309, that is, the position and posture of the microscope unit 5303, can be controlled. Specifically, the control device 5317 can grasp the current posture of the arm unit 5309 and the current position and posture of the microscope unit 5303 based on the rotation angles of the joints detected by the encoders.
  • Using the grasped information, the control device 5317 calculates a control value (for example, a rotation angle or a generated torque) for each joint portion that realizes the movement of the microscope unit 5303 requested by the user's operation input, and drives the drive mechanism of each joint according to the control values. The control method of the arm unit 5309 by the control device 5317 is not limited, and various known control methods such as force control or position control may be applied.
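  • The step of grasping the pose from encoder readings is, in essence, forward kinematics: one rigid transform per joint, chained together. The sketch below illustrates this with hypothetical joint axes and link offsets; the real axis directions and offsets would come from the mechanical design of the arm unit 5309 and are not taken from this disclosure.

```python
import numpy as np

def axis_angle(axis: np.ndarray, theta: float) -> np.ndarray:
    """Rotation matrix for a rotation of theta radians about a unit axis
    (Rodrigues' formula)."""
    axis = axis / np.linalg.norm(axis)
    K = np.array([[0.0, -axis[2], axis[1]],
                  [axis[2], 0.0, -axis[0]],
                  [-axis[1], axis[0], 0.0]])
    return np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * (K @ K)

# Hypothetical joint axes and inter-joint offsets (metres) for a 6-joint chain.
JOINT_AXES = [np.array(a, dtype=float) for a in
              [(0, 0, 1), (1, 0, 0), (0, 1, 0), (1, 0, 0), (1, 0, 0), (0, 0, 1)]]
LINK_OFFSETS = [np.array((0.0, 0.0, 0.12)) for _ in range(6)]

def forward_kinematics(encoder_angles) -> np.ndarray:
    """Compose the joint transforms into one 4x4 pose of the microscope unit."""
    T = np.eye(4)
    for axis, offset, theta in zip(JOINT_AXES, LINK_OFFSETS, encoder_angles):
        step = np.eye(4)
        step[:3, :3] = axis_angle(axis, theta)
        step[:3, 3] = offset
        T = T @ step
    return T
```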
  • For example, when the user performs an appropriate operation input, the control device 5317 may control the driving of the arm unit 5309 according to that operation input, and thereby control the position and posture of the microscope unit 5303.
  • With this control, the microscope unit 5303 can be moved from an arbitrary position to another arbitrary position and then fixedly supported at the new position.
  • For the operation input, a device that can be operated even while the operator holds a surgical tool, such as a foot switch, may be used.
  • The operation input may also be performed in a non-contact manner, based on gesture detection or line-of-sight detection using a wearable device or a camera provided in the operating room.
  • the arm portion 5309 may be operated by a so-called master slave method.
  • the arm unit 5309 can be remotely operated by the user via an input device installed at a place apart from the operating room.
  • Alternatively, the actuators of the first joint portion 5311a to the sixth joint portion 5311f may be driven so that the arm unit 5309 receives an external force from the user and moves smoothly in accordance with that force; that is, so-called power assist control may be performed.
  • This allows the microscope unit 5303 to be moved with a comparatively light force when the user grasps the microscope unit 5303 and directly moves its position. Therefore, the microscope unit 5303 can be moved more intuitively and with a simpler operation, and the convenience of the user can be improved.
  • the drive of the arm portion 5309 may be controlled so as to perform a pivot operation.
  • the pivot operation is an operation of moving the microscope unit 5303 so that the optical axis of the microscope unit 5303 always faces a predetermined point in space (hereinafter referred to as a pivot point). According to the pivot operation, it is possible to observe the same observation position from various directions, and thus it is possible to observe the affected area in more detail.
  • When the microscope unit 5303 is configured such that its focal length cannot be adjusted, the pivot operation is preferably performed with the distance between the microscope unit 5303 and the pivot point kept fixed. In this case, the distance between the microscope unit 5303 and the pivot point may be adjusted to the fixed focal length of the microscope unit 5303.
  • With this arrangement, the microscope unit 5303 moves on a hemispherical surface (shown schematically in FIG. 14) whose radius corresponds to the focal length and whose center is the pivot point, so a clear captured image is obtained even when the observation direction is changed.
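  • The geometry behind this is compact enough to state in code. The following is a minimal sketch that places the microscope on a hemisphere of radius equal to the focal length and aims its optical axis at the pivot point; the azimuth/elevation parameterization is an assumption made for illustration.

```python
import numpy as np

def pivot_pose(pivot: np.ndarray, focal_length: float,
               azimuth: float, elevation: float):
    """Pose for a pivot operation: the microscope sits on a hemisphere of
    radius focal_length centered on the pivot point, with its optical axis
    aimed at the pivot.

    Returns (position, unit optical-axis direction). elevation should lie in
    (0, pi/2] so that the position stays on the upper hemisphere.
    """
    direction = np.array([np.cos(elevation) * np.cos(azimuth),
                          np.cos(elevation) * np.sin(azimuth),
                          np.sin(elevation)])         # unit vector by construction
    position = pivot + focal_length * direction
    optical_axis = (pivot - position) / focal_length  # unit vector toward pivot
    return position, optical_axis
```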
  • the pivot operation may be performed in a state where the distance between the microscope unit 5303 and the pivot point is variable.
  • In this case, for example, the control device 5317 may calculate the distance between the microscope unit 5303 and the pivot point based on the rotation angle of each joint detected by the encoders, and automatically adjust the focal length of the microscope unit 5303 based on the calculation result.
  • Alternatively, when the microscope unit 5303 is provided with an AF function, the focal length may be automatically adjusted by the AF function each time the distance between the microscope unit 5303 and the pivot point changes due to the pivot operation.
  • Each of the first joint portion 5311a to the sixth joint portion 5311f may be provided with a brake that restrains its rotation.
  • The operation of the brakes can be controlled by the control device 5317.
  • For example, when the position and posture of the microscope unit 5303 are to be fixed, the control device 5317 operates the brakes of the joint portions. The posture of the arm unit 5309, that is, the position and posture of the microscope unit 5303, can thus be fixed without driving the actuators, so power consumption can be reduced.
  • When the position and posture of the microscope unit 5303 are to be moved, the control device 5317 may release the brakes of the joint portions and drive the actuators according to a predetermined control method.
  • Such brake operation can be performed in response to a user operation input via the operation unit 5307 described above.
  • When the user wants to move the position and posture of the microscope unit 5303, the user operates the operation unit 5307 to release the brakes of the joint portions.
  • The operation mode of the arm unit 5309 then shifts to a mode in which each joint portion can rotate freely (all-free mode).
  • When the user wants to fix the position and posture of the microscope unit 5303, the user operates the operation unit 5307 to activate the brakes of the joint portions.
  • The operation mode of the arm unit 5309 then shifts to a mode in which the rotation of each joint portion is restrained (fixed mode).
  • The control device 5317 controls the operation of the microscopic surgery system 5300 as a whole by controlling the operations of the microscope device 5301 and the display device 5319.
  • the control device 5317 controls the drive of the arm portion 5309 by operating the actuators of the first joint portion 5311a to the sixth joint portion 5311f according to a predetermined control method.
  • the control device 5317 changes the operation mode of the arm portion 5309 by controlling the operation of the brakes of the first joint portion 5311a to the sixth joint portion 5311f.
  • The control device 5317 also performs various kinds of signal processing on the image signal acquired by the imaging unit of the microscope unit 5303 of the microscope device 5301 to generate image data for display, and causes the display device 5319 to display that image data.
  • As the signal processing, various known signal processes may be performed, for example, development processing (demosaic processing), image quality enhancement processing (band emphasis, super-resolution, NR (noise reduction), and/or camera shake correction), and/or enlargement processing (that is, electronic zoom).
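  • As a concrete illustration of the development (demosaic) step, the following is a minimal bilinear sketch for a Bayer-array sensor; the RGGB layout and the normalized-convolution interpolation are assumptions of this sketch, and a production pipeline would use a more sophisticated algorithm.

```python
import numpy as np
from scipy.ndimage import convolve

def demosaic_bilinear(raw: np.ndarray) -> np.ndarray:
    """Minimal bilinear demosaic of an RGGB Bayer mosaic (H x W raw frame)
    into an H x W x 3 RGB image."""
    h, w = raw.shape
    masks = np.zeros((3, h, w), dtype=bool)
    masks[0, 0::2, 0::2] = True   # R sites
    masks[1, 0::2, 1::2] = True   # G sites on R rows
    masks[1, 1::2, 0::2] = True   # G sites on B rows
    masks[2, 1::2, 1::2] = True   # B sites
    kernel = np.array([[0.25, 0.5, 0.25],
                       [0.5,  1.0, 0.5],
                       [0.25, 0.5, 0.25]])
    out = np.zeros((h, w, 3), dtype=np.float64)
    for c in range(3):
        # Normalized convolution: interpolate each channel from its own sites
        plane = np.where(masks[c], raw, 0.0).astype(np.float64)
        weight = convolve(masks[c].astype(np.float64), kernel, mode="mirror")
        out[..., c] = convolve(plane, kernel, mode="mirror") / np.maximum(weight, 1e-9)
    return out
```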
  • communication between the control device 5317 and the microscope unit 5303 and communication between the control device 5317 and the first joint portion 5311a to the sixth joint portion 5311f may be wired communication or wireless communication.
  • In the case of wired communication, electric signal communication or optical communication may be performed.
  • the transmission cable used for wired communication may be configured as an electric signal cable, an optical fiber, or a composite cable of these, depending on the communication system.
  • In the case of wireless communication, there is no need to lay a transmission cable in the operating room, which eliminates situations in which the transmission cable hinders the movement of medical staff in the operating room.
  • The control device 5317 may be a processor such as a CPU (Central Processing Unit) or GPU (Graphics Processing Unit), or a microcomputer or control board on which a processor and a storage element such as a memory are mounted together.
  • the various functions described above can be realized by the processor of the control device 5317 operating according to a predetermined program.
  • In the illustrated example, the control device 5317 is provided as a device separate from the microscope device 5301, but it may instead be installed inside the base unit 5315 of the microscope device 5301 and configured integrally with the microscope device 5301.
  • the control device 5317 may be composed of a plurality of devices.
  • For example, functions similar to those described above may be realized by providing the microscope unit 5303 and the first joint portion 5311a to the sixth joint portion 5311f of the arm unit 5309 with microcomputers, control boards, and the like, and connecting them so that they can communicate with each other.
  • The display device 5319 is provided in the operating room and, under the control of the control device 5317, displays an image corresponding to the image data generated by the control device 5317. That is, the display device 5319 displays the image of the surgical site captured by the microscope unit 5303.
  • the display device 5319 may display various kinds of information related to the surgery, such as the physical information of the patient and the information about the surgical procedure, instead of the image of the surgery part or together with the image of the surgery part. In this case, the display on the display device 5319 may be appropriately switched by the operation of the user.
  • a plurality of display devices 5319 may be provided, and each of the plurality of display devices 5319 may display an image of the operation portion and various kinds of information regarding surgery.
  • As the display device 5319, various known display devices, such as a liquid crystal display device or an EL (Electro Luminescence) display device, may be applied.
  • FIG. 15 is a diagram showing a state of surgery using the microscopic surgery system 5300 shown in FIG.
  • FIG. 15 schematically shows a surgeon 5321 performing an operation on a patient 5325 on a patient bed 5323 using the microscopic surgery system 5300.
  • For simplicity, FIG. 15 omits the control device 5317 from the configuration of the microscopic surgery system 5300 and illustrates the microscope device 5301 in a simplified manner.
  • As shown in FIG. 15, during surgery using the microscopic surgery system 5300, the image of the surgical site captured by the microscope device 5301 is displayed in enlarged form on the display device 5319 installed on the wall of the operating room.
  • The display device 5319 is installed at a position facing the surgeon 5321.
  • The surgeon 5321 performs various treatments on the surgical site, such as resection of the affected area, while observing the state of the surgical site through the image displayed on the display device 5319.
  • An example of the microscopic surgery system 5300 to which the technology according to the present disclosure can be applied has been described above.
  • the microscope device 5301 can also function as a support arm device that supports another observation device or another surgical tool at its tip instead of the microscope portion 5303.
  • An endoscope may be applied as the other observation device.
  • As the other surgical tool, forceps, tweezers, a pneumoperitoneum tube for pneumoperitoneum, or an energy treatment tool that incises tissue or seals a blood vessel by cauterization can be applied.
  • the technique according to the present disclosure may be applied to a support arm device that supports a configuration other than such a microscope unit.
  • the technology according to the present disclosure can be preferably applied to the control device 5317 among the configurations described above.
  • Specifically, the technology according to the present disclosure can be applied when a blood-flow portion and a non-blood-flow portion in the image of the surgical site of the patient 5325 captured by the imaging unit of the microscope unit 5303 are displayed on the display device 5319 in an easily visible manner.
  • By applying the technology according to the present disclosure to the control device 5317, the calculation region for calculating the SC value in the speckle imaging technique can be set more appropriately, and a good SC image can be generated. As a result, the surgeon 5321 can view an image of the surgical site in which the blood-flow portion and the non-blood-flow portion are accurately identified on the display device 5319 in real time, and can perform the surgery more safely.
  • (1) A medical system comprising: a first irradiation means for irradiating a subject with incoherent visible light; a second irradiation means for irradiating the subject with coherent near-infrared light; a first imaging means for imaging the reflected light of the visible light from the subject; a second imaging means for imaging the reflected light of the near-infrared light from the subject; an acquisition means for acquiring a visible light image from the first imaging means and a speckle image from the second imaging means; a setting means for setting, based on the visible light image, a calculation region for statistical processing of speckle luminance values; and a generation means for generating a predetermined image based on the speckle image and the calculation region.
  • (2) The medical system according to (1), wherein the medical system is a microscopic surgery system or an endoscopic surgery system.
  • (3) An information processing method comprising: an acquisition step of acquiring a first image from a first imaging unit that images the reflected light of incoherent light irradiated on a subject, and a second image from a second imaging unit that images the reflected light of coherent light irradiated on the subject; a setting step of setting a calculation region based on the first image; and a generation step of generating a predetermined image based on the second image and the calculation region.
  • (4) The information processing method according to (3), wherein the incoherent light is incoherent visible light, the coherent light is coherent near-infrared light, the first image is a visible light image, the second image is a speckle image, and the predetermined image is a speckle contrast image.
  • (5) The information processing method according to (4), wherein a degree of association with the pixel of interest is calculated for each of a plurality of peripheral pixels based on the visible light image, using at least one of distance, luminance difference, and color difference; the calculation region for the pixel of interest is set in at least a part of the plurality of peripheral pixels based on the degree of association; and the speckle contrast image is generated by calculating a speckle contrast value for the pixel of interest based on the speckle image and the calculation region.
  • (6) The information processing method according to (5), wherein, when the speckle contrast value of the pixel of interest is calculated based on the speckle image and the calculation region, each of the peripheral pixels is weighted by its degree of association (see the sketch following this list).
  • (7) An information processing method comprising: an acquisition step of acquiring a third image from a third imaging unit that images fluorescence from a subject, and a second image from a second imaging unit that images the reflected light of coherent light irradiated on the subject; a setting step of setting a calculation region based on the third image; and a generation step of generating a predetermined image based on the second image and the calculation region.
  • In the embodiments above, the speckle contrast value has been described as an example of a value calculated by statistical processing of speckle luminance values, but the invention is not limited to this; BR (Blur Rate), SBR (Square BR), MBR (Mean BR), and the like may also be used.
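  • To make items (5) and (6) above concrete, the following is a minimal sketch of a weighted speckle contrast value for one pixel of interest, with the degree of association derived from spatial distance and luminance difference in the visible light image. The Gaussian form of the weights and all parameter values are assumptions of this sketch, and the BR/SBR/MBR statistics mentioned above could be substituted for the sigma/mean ratio.

```python
import numpy as np

def weighted_sc(speckle: np.ndarray, visible_luma: np.ndarray,
                y: int, x: int, radius: int = 3,
                sigma_d: float = 2.0, sigma_l: float = 0.1) -> float:
    """Weighted speckle contrast at one pixel of interest (y, x).

    The degree of association of each peripheral pixel is derived from the
    visible-light image (spatial distance and luminance difference), and the
    SC statistics over the calculation region are weighted by it.
    """
    h, w = speckle.shape
    y0, y1 = max(0, y - radius), min(h, y + radius + 1)
    x0, x1 = max(0, x - radius), min(w, x + radius + 1)
    patch = speckle[y0:y1, x0:x1].astype(np.float64)
    luma = visible_luma[y0:y1, x0:x1].astype(np.float64)

    # Degree of association: Gaussian in distance and in luminance difference
    yy, xx = np.mgrid[y0:y1, x0:x1]
    dist2 = (yy - y) ** 2 + (xx - x) ** 2
    dluma = luma - float(visible_luma[y, x])
    weight = np.exp(-dist2 / (2.0 * sigma_d ** 2)) * \
             np.exp(-(dluma ** 2) / (2.0 * sigma_l ** 2))

    # Weighted mean and standard deviation of the speckle luminance
    wsum = weight.sum()
    mean = (weight * patch).sum() / wsum
    var = (weight * (patch - mean) ** 2).sum() / wsum
    return float(np.sqrt(var) / max(mean, 1e-9))
```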

Abstract

The invention relates to a medical system (1) comprising: a first irradiation means (2) for emitting incoherent visible light toward a subject; a second irradiation means (3) for emitting coherent near-infrared light toward the subject; a first imaging means (5) for capturing an image of the visible light reflected by the subject; a second imaging means (6) for capturing an image of the near-infrared light reflected by the subject; an acquisition means (711) for acquiring a visible light image from the first imaging means (5) and a speckle image from the second imaging means (6); a setting means (712) for setting, on the basis of the visible light image, a calculation region for statistical processing of speckle luminance values; and a generation means (712) for generating a predetermined image on the basis of the speckle image and the calculation region.
PCT/JP2019/034884 2018-10-24 2019-09-05 Système médical et procédé de traitement d'informations WO2020084917A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018-200399 2018-10-24
JP2018200399 2018-10-24

Publications (1)

Publication Number Publication Date
WO2020084917A1 true WO2020084917A1 (fr) 2020-04-30

Family

ID=70330697

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/034884 WO2020084917A1 (fr) 2018-10-24 2019-09-05 Système médical et procédé de traitement d'informations

Country Status (1)

Country Link
WO (1) WO2020084917A1 (fr)

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2018514335A (ja) * 2015-05-07 2018-06-07 ノバダック テクノロジーズ インコーポレイテッド カラーイメージセンサを用いた組織のレーザスペックルイメージングのための方法及びシステム
WO2017160643A1 (fr) * 2016-03-14 2017-09-21 Massachusetts Institute Of Technology Dispositif et procédé d'imagerie de fluorescence proche infrarouge

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19875971

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19875971

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP