WO2023273013A1 - Surgical navigation device and system - Google Patents

Surgical navigation device and system

Info

Publication number: WO2023273013A1
Application number: PCT/CN2021/123830
Authority: WIPO (PCT)
Prior art keywords: organ, surgical navigation, light source, image, imaging unit
Other languages: English (en), French (fr)
Inventors: 汪远, 赵可为, 周丰茂
Original assignee: 南京微纳科技研究院有限公司
Priority claimed from Chinese patent application CN202110725203.7A (published as CN115530972A) and Chinese patent application CN202121465017.6U (published as CN215606241U)
Application filed by 南京微纳科技研究院有限公司
Publication of WO2023273013A1

Classifications

    • A — HUMAN NECESSITIES
    • A61 — MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B — DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 — Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/10 — Computer-aided planning, simulation or modelling of surgical operations

Definitions

  • This application relates to medical equipment technology, in particular to surgical navigation devices and systems.
  • after a fluorescent marker is injected into the human body, it accumulates near the tumor of the lesion organ, allowing a fluorescence molecular imaging surgical navigation system to locate the tumor and acquire images of the lesion organ through fluorescence imaging; by fusing the tumor fluorescence image with the lesion organ image and displaying the result on a monitor, the system helps the doctor perform tumor resection.
  • when using an existing surgical navigation system, however, the doctor must observe the lesion tissue or organ on a monitor and then compare it with the patient's real tissue or organ to complete the operation.
  • this requires the doctor's field of view to switch between the monitor and the real lesion organ, which increases the doctor's fatigue and prolongs the operation.
  • the present application provides a surgical navigation device and system to better assist doctors in performing operations, relieve doctor fatigue, and reduce operation time.
  • the present application provides a surgical navigation device, including:
  • the industrial computer is used to superimpose and fuse the tumor fluorescence image and the lesion organ image to obtain a position image, and the position image is used to indicate the position of the tumor on the lesion organ;
  • An imaging unit connected with the industrial computer includes a projection light source, and the projection light source is used to project the position image on the lesion organ.
  • the imaging unit further includes: a near-infrared camera, configured to acquire tumor fluorescence images, and transmit the tumor fluorescence images to the industrial computer.
  • the imaging unit further includes a beam splitter, used to transmit the fluorescence emitted by the tumor to the near-infrared camera and to project the projection light emitted by the projection light source onto the lesion organ; the optical path of the fluorescence is the fluorescence light path, the optical path of the projection light is the projection light path, and between the beam splitter and the lesion organ the projection light path and the fluorescence light path form a shared light path.
  • the imaging unit also includes a ranging module, used to determine the distance between the imaging unit and the lesion organ and transmit the distance to the near-infrared camera; the near-infrared camera is further configured to focus its near-infrared lens according to the distance.
  • the imaging unit further includes a near-infrared filter element, and the near-infrared filter element is used to filter out light outside a preset wavelength range.
  • the imaging unit further includes: a visible light camera, configured to acquire images of the lesioned organs and transmit the images of the lesioned organs to the industrial computer.
  • the imaging unit further includes: a compensating light source, configured to project visible light onto the lesion organ, and provide ambient light for the visible light camera when the ambient light is lower than a threshold.
  • the imaging unit further includes an excitation light source, which is used to project excitation light onto the diseased organ.
  • the imaging unit further includes an indicating light source for emitting indicating light, and the indicating light is used for indicating the position where the excitation light emitted by the excitation light source is projected on the lesion organ.
  • the indicating light source includes a diffractive element, and the diffractive element is used to shape the light emitted by the indicating light source into an indicating light in the form of a contour; the range in which the contour-form indicating light is projected on the lesion organ is the same as the range in which the excitation light source projects on the lesion organ.
  • the excitation light source includes a uniform light module, and the uniform light module is used to uniformly process the excitation light emitted by the excitation light source.
  • the present application provides a surgical navigation system, including the surgical navigation device described in the first aspect of the present application.
  • the surgical navigation system further includes a mobile platform, and the surgical navigation device is set on the mobile platform.
  • the surgical navigation system further includes a mechanical arm, an imaging unit is installed at one end of the mechanical arm, and the other end of the mechanical arm is set on the mobile platform.
  • the surgical navigation system further includes a display, which is set on the mobile platform and used for displaying position images.
  • the present application provides a surgical navigation method, including:
  • the tumor fluorescence image and the lesion organ image are superimposed and fused to obtain a position image, which is used to indicate the position of the tumor on the lesion organ;
  • the surgical navigation device and system provided by the present application superimpose and fuse the tumor fluorescence image and the lesion organ image through the industrial computer to obtain the position image, and the projection light source included in the imaging unit connected to the industrial computer projects the position image on the lesion organ.
  • by superimposing and fusing the tumor fluorescence image and the lesion organ image and projecting the position image directly on the lesion organ, this application frees the doctor from having to observe the lesion organ through a display; the result is more intuitive, reduces the doctor's operation time, and improves operation efficiency. Moreover, because a shared optical path is introduced, moving the imaging unit within the working distance does not require the projection light source to readjust the projection area.
  • FIG. 1 is a schematic diagram of a surgical navigation device provided by an embodiment of the present application.
  • FIG. 2 is a schematic diagram of an imaging unit 130 provided by an embodiment of the present application.
  • FIG. 3 is a schematic diagram of a surgical navigation system provided by an embodiment of the present application.
  • Fig. 4 is a flowchart of a surgical navigation method provided by an embodiment of the present application.
  • indocyanine green can be injected into the human body so that it gathers at the tumor of the lesion organ. Indocyanine green fluorescence imaging technology (indocyanine green is excited to fluoresce in the near-infrared band most strongly under laser irradiation at 785 nm to 808 nm) can then be used to acquire the tumor's location and morphology as well as images of the lesion organ; fusing the tumor image with the lesion organ image and displaying the result on a monitor helps surgeons perform tumor resection. At present, however, doctors still have to observe the lesion tissue or organ on a screen and then compare it with the patient's real tissue and organs to complete the surgical operation.
  • in addition, none of the above methods can image lymph vessels, blood vessels, and related tissues or monitor their perfusion.
  • in view of this, the present application provides a surgical navigation device and system that can obtain real-time images of the distribution of tumors in the lesion organ and accurately project them on the surface of the lesion organ. The doctor can thus see the tumor distribution directly on the surface of the lesion organ, which is more intuitive, reduces the doctor's operation time, and improves operation efficiency.
  • Fig. 1 is a schematic diagram of a surgical navigation device provided by an embodiment of the present application.
  • the surgical navigation device 100 of the embodiment of the present application includes: an industrial computer 110 , and an imaging unit 130 connected to the industrial computer.
  • the imaging unit 130 includes a projection light source 120, where:
  • the industrial computer 110 is used to superimpose and fuse the fluorescence image of the tumor and the image of the lesion organ to obtain a position image, and the position image is used to indicate the position of the tumor on the lesion organ.
  • the projection light source 120 is used to project the location image on the focal organ.
  • the fluorescence image of the tumor can be obtained through the development technology of the fluorescent marker.
  • the fluorescent marker is indocyanine green, and the present application is not limited thereto.
  • indocyanine green accumulates in tumors of the lesion organ; under excitation by light of a specific wavelength, it emits fluorescence in the near-infrared band, from which tumor fluorescence images can be obtained.
  • the industrial computer 110 is a microcomputer composed of large-scale integrated circuits.
  • the industrial computer 110 includes an image processing module and a system control module; the image processing module superimposes and fuses the tumor fluorescence image and the lesion organ image.
  • a specific way of superimposing and fusing the tumor fluorescence image and the lesion organ image is to process the tumor fluorescence image directly and superimpose it on the lesion organ image in a specific color, obtaining a position image that shows the location of the tumor on the lesion organ, that is, the distribution image of the tumor in the lesion tissue.
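The fusion just described can be sketched in code. The following is a minimal, illustrative Python/NumPy sketch rather than the patent's actual implementation; the threshold value, overlay color, and blending factor are all assumptions, and the camera registration step a real system would need is omitted.

```python
import numpy as np

def fuse_position_image(nir_image, organ_image, threshold=0.3,
                        color=(0, 255, 0), alpha=0.6):
    """Superimpose a tumor fluorescence image onto a lesion organ image.

    nir_image:   (H, W) float array in [0, 1], near-infrared fluorescence.
    organ_image: (H, W, 3) uint8 visible-light color image of the organ.
    Returns a position image: the organ image with fluorescent regions
    highlighted in a specific color (green here, an assumed choice).
    """
    mask = nir_image > threshold                  # region judged to be tumor
    position = organ_image.astype(np.float32)
    overlay = np.array(color, dtype=np.float32)
    # Alpha-blend the highlight color only where fluorescence is detected.
    position[mask] = (1 - alpha) * position[mask] + alpha * overlay
    return position.astype(np.uint8)
```

In practice the near-infrared and visible cameras view the scene through different lenses, so the two images would first have to be registered (aligned) before this pixelwise blend is meaningful.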
  • the system control module of the industrial computer 110 feeds back the position image to the connected projection light source 120, and the projection light source 120 receives the position image transmitted from the industrial computer 110, and projects the position image onto the surface of the observed lesion organ through the projection lens.
  • the projection light source 120 may be a projector, or a spatial light modulator, which is not limited in the present application.
  • the surgical navigation device provided in the embodiment of the present application superimposes and fuses the tumor fluorescence image and the lesion organ image through the industrial computer to obtain a position image, and the projection light source included in the imaging unit connected to the industrial computer projects the position image on the lesion organ.
  • the embodiment of the present application superimposes and fuses the tumor fluorescence image and the lesion organ image to directly project the position image on the lesion organ, and the doctor does not need to observe the lesion organ through the display, so it is more intuitive and can Reduce the doctor's operation time and improve the operation efficiency.
  • FIG. 2 is a schematic diagram of an imaging unit 130 provided by an embodiment of the present application.
  • the embodiment of the present application further describes the imaging unit 130 of the surgical navigation device 100 .
  • the imaging unit 130 of the embodiment of the present application further includes a near-infrared camera 201 on the basis of the projection light source 120 .
  • the near-infrared camera 201 is used to acquire tumor fluorescence images, and transmit the tumor fluorescence images to the industrial computer.
  • the near-infrared camera 201 is a digital imaging device sensitive to electromagnetic waves with a wavelength in the range of 780-3000nm.
  • after indocyanine green has been injected into the human body so that it gathers at the tumor of the lesion organ 1, the near-infrared camera 201 captures, through the near-infrared lens 202, the near-infrared fluorescence emitted by the fluorescent probe (for example, indocyanine green), thereby obtaining a tumor fluorescence image, and transmits the tumor fluorescence image to the industrial computer 110.
  • the industrial computer 110 receives the tumor fluorescence image.
  • the imaging unit 130 of the embodiment of the present application further includes a near-infrared filter element 203 for filtering out light outside a predetermined wavelength range.
  • the near-infrared filter element 203 can block light other than infrared light, such as visible light, and only allow infrared light to pass through.
  • the near-infrared filter element 203 allows infrared light with a wavelength in the range of 700-1700 nm to pass through, so as to filter out the reflected excitation light used to stimulate tumor fluorescence as well as visible light reflected by the lesion organ 1.
  • the near-infrared camera 201 obtains a tumor fluorescence image using indocyanine green fluorescence imaging technology through a near-infrared lens 202 and a near-infrared filter element 203 .
  • because the near-infrared filter element 203 passes light in the 700-1700 nm range, where tissue penetration depth is large, a tumor fluorescence image with a high signal-to-noise ratio can be obtained. In practical applications this can be used to image human lymph vessels, blood vessels, and the like, and to monitor the perfusion of related tissues.
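As a trivial illustration, the filter element's effect can be modeled as a passband check. Only the 700-1700 nm range comes from the text; the sample wavelengths used below are assumptions for illustration.

```python
def nir_filter_passes(wavelength_nm, passband=(700.0, 1700.0)):
    """Model of the near-infrared filter element 203: only light whose
    wavelength falls inside the stated 700-1700 nm passband reaches the
    near-infrared camera; reflected visible light is blocked."""
    lo, hi = passband
    return lo <= wavelength_nm <= hi
```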
  • the imaging unit 130 of the embodiment of the present application further includes a beam splitter 206 .
  • the beam splitter 206 is used to transmit the fluorescence emitted by the tumor to the near-infrared camera and to project the projection light emitted by the projection light source onto the lesion organ 1; the optical path of the fluorescence is the fluorescence light path, the optical path of the projection light is the projection light path, and between the beam splitter and the lesion organ 1 the projection light path and the fluorescence light path form a shared light path.
  • the beam splitter 206 is a passive device that requires no external energy; it operates as long as there is input light.
  • the beam splitter 206 is a dichroic mirror, which can separate the light source into a specific spectrum and change the light path direction of a part of the spectrum, and can almost completely transmit light of certain wavelengths and almost completely reflect light of other wavelengths.
  • the beam splitter 206 can transmit the fluorescence emitted by the tumor to the near-infrared camera 201, and the light path is the fluorescence light path; the beam splitter 206 can also project the projection light emitted by the projection light source 120 onto the lesion organ 1 , the optical path is the projection optical path.
  • the projection light path and the fluorescence light path realize a common light path through the beam splitter 206 , and accurately project the tumor fluorescence image on the surface of the lesion organ 1 .
  • because the shared-light-path arrangement of the projection light path and the fluorescence light path allows the tumor fluorescence image to be projected accurately on the surface of the lesion organ 1, it solves the problem in the prior art that the lesion organ 1 must be marked manually.
  • if a shared light path were not used, adding the projection light source 120 alone would not achieve a good projection effect.
  • the imaging unit 130 of the embodiment of the present application further includes a ranging module 208 .
  • the ranging module 208 is used to determine the distance between the imaging unit 130 and the lesion organ 1 and transmit the distance to the near-infrared camera 201; the near-infrared camera 201 is further configured to focus the near-infrared lens 202 according to the distance.
  • the ranging module 208 uses a laser as a light source, and uses the laser to accurately measure the distance of the target.
  • the ranging module 208 can measure the distance between the imaging unit 130 and the lesion organ 1, and transmit the distance to the near-infrared camera 201.
  • the near-infrared camera 201 receives the distance between the imaging unit 130 and the lesion organ 1 sent by the ranging module 208 and focuses the near-infrared lens 202 according to this distance, so that the lens is adjusted to the sharpest position and the clearest tumor fluorescence image is obtained.
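The distance-based focusing step can be illustrated with the thin-lens equation. This Python sketch is an assumption about how a motorized lens might convert the measured distance into a focus setting: the 35 mm focal length is an illustrative value, and only the 100 mm-1000 mm working range comes from the text.

```python
def lens_image_distance(object_distance_mm, focal_length_mm=35.0):
    """Thin-lens focus: the lens-to-sensor distance that brings an object
    at `object_distance_mm` into focus, from 1/f = 1/d_o + 1/d_i.
    The 35 mm focal length is an assumed, illustrative value."""
    if object_distance_mm <= focal_length_mm:
        raise ValueError("object must be beyond the focal length")
    return (focal_length_mm * object_distance_mm /
            (object_distance_mm - focal_length_mm))

def focus_from_rangefinder(distance_mm, min_d=100.0, max_d=1000.0):
    """Clamp the laser-ranging result to the imaging unit's stated
    working distance (100 mm - 1000 mm) and return the sensor offset."""
    d = min(max(distance_mm, min_d), max_d)
    return lens_image_distance(d)
```

A real lens driver would map this image distance onto motor steps via a calibration table, which is omitted here.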
  • the adjustment range of the working distance of the imaging unit 130 is 100mm-1000mm, which is not limited in the present application.
  • the imaging unit 130 of the embodiment of the present application further includes a visible light camera 210 .
  • the visible light camera 210 is used to acquire images of the lesion organ 1 and transmit the image of the lesion organ 1 to the industrial computer.
  • the visible light camera 210 only needs to be a camera capable of imaging. As shown in FIG. 2 , the visible light camera 210 acquires the color image of the diseased organ 1 through the visible light lens 211 , and transmits the color image of the diseased organ 1 to the industrial computer 110 . Correspondingly, the industrial computer 110 receives the color image of the diseased organ 1 .
  • the imaging unit 130 of the embodiment of the present application further includes a compensating light source 212 .
  • the compensation light source 212 is used to project visible light onto the diseased organ 1, and provide ambient light for the visible light camera when the ambient light is lower than a threshold.
  • the compensating light source 212 is used to supplement the ambient light to the visible light camera 210 when the ambient light is insufficient, so that the visible light camera 210 can obtain a color image of the lesion organ 1 even when the ambient light is insufficient.
  • the compensation light source 212 is a light emitting diode (Light Emitting Diode, LED), and the light emitted by the LED can supplement the ambient light for the visible light camera 210.
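The ambient-light check behind the compensation light source can be sketched as follows. This is an illustrative Python sketch: the patent only states that a threshold exists, so the brightness measure (mean of an 8-bit frame) and the threshold value are assumptions.

```python
import numpy as np

def compensation_light_needed(frame, threshold=60.0):
    """Decide whether the LED compensation light source should be turned
    on: if the mean brightness of the visible camera's 8-bit frame falls
    below the threshold, ambient light is judged insufficient.
    The threshold of 60 is an assumed, illustrative value."""
    return float(np.mean(frame)) < threshold
```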
  • the imaging unit 130 needs to be moved above the tumor site to observe tumor images. However, at this time, the imaging unit 130 may block the light in the operating room, so that the visible light camera 210 cannot clearly obtain the image of the lesion organ 1.
  • in this case the compensation light source 212 is turned on, so that the visible light camera 210 can clearly obtain the color image of the lesion organ 1. The tumor fluorescence image, obtained by the near-infrared camera 201 through the near-infrared lens 202 and the near-infrared filter element 203 using indocyanine green fluorescence imaging, is superimposed and fused with it in the industrial computer 110 to obtain a position image. Through the projection light source 120, the projection lens 205, and the beam splitter 206, the position image is projected onto the surface of the observed lesion organ 1, intuitively guiding the doctor in the tumor resection operation.
  • the imaging unit 130 of the embodiment of the present application further includes an excitation light source 209 .
  • the excitation light source 209 is used to project excitation light onto the diseased organ 1 .
  • the excitation light source 209 projects a uniform excitation light spot onto the surface of the lesion organ 1 to realize tumor fluorescence imaging.
  • the excitation light source 209 is a laser light source with a center wavelength of 785 nm ± 5 nm, but this application is not limited thereto.
  • the power adjustment range of the excitation light source 209 is 10 mW-3000 mW; the higher luminous power of the laser light source helps the system detect tiny tumors, but the application is not limited to this adjustment range.
  • the excitation light source 209 includes a uniform light module 214, and the uniform light module 214 is used to uniformly process the excitation light emitted by the excitation light source.
  • the homogenization module 214 of the excitation light source 209 uniformly processes the excitation light emitted by the excitation light source 209 so that the intensity of the light spot irradiated on the surface of the lesion organ 1 is uniformly distributed.
  • the excitation light source 209 is composed of a power-adjustable semiconductor laser and a homogenization module 214, which realizes spot emission of uniform excitation light with adjustable optical power, and can help the system realize the detection of tiny tumors under high luminous power.
  • the imaging unit 130 of the embodiment of the present application further includes an indicator light source 207 .
  • the indicating light source 207 is used to emit indicating light, and the indicating light is used to indicate the position where the excitation light emitted by the excitation light source 209 is projected on the lesion organ 1 .
  • the indicating light source 207 is a laser light source with a central wavelength of 520 nm, but this application is not limited thereto.
  • the indicating light source 207 emits indicating light to indicate the position where the excitation light emitted by the excitation light source 209 is projected on the lesion organ 1, thereby providing the doctor with an indication of the spot area projected by the excitation light source 209, which is convenient for operation.
  • the indicating light source 207 includes a diffractive element 215, and the diffractive element 215 is used to shape the light emitted by the indicating light source 207 into an indicating light in the form of a contour; the contour-form indicating light has the same projection range on the lesion organ 1 as the excitation light source 209.
  • the diffraction element 215 shapes the indication light emitted by the indication light source 207, and the light emitted after shaping is an indication light in the form of an outline, also called an outline light.
  • the contour of the contour light is consistent with the contour of the light spot that the excitation light source 209 irradiates on the surface of the lesion organ 1, thereby indicating the irradiation range of the excitation light source 209.
  • the excitation light source 209 projects a uniform light spot onto the surface of the lesion organ 1 to realize tumor fluorescence imaging
  • the near-infrared camera 201 obtains the tumor fluorescence of the indocyanine green fluorescence imaging technology through the near-infrared lens 202 and the near-infrared filter element 203 image
  • the visible light camera 210 acquires the color image of the lesion organ 1 through the visible light lens 211
  • the tumor fluorescence image and the color image of the lesion organ 1 are both transmitted to the industrial computer 110 for superimposition and fusion, and an image containing both the lesion organ 1 and the tumor is obtained.
  • through the projection light source 120, its projection lens 205, and the beam splitter 206, the position image is projected onto the surface of the observed lesion organ 1, which intuitively guides the doctor in the tumor resection operation, overcomes the defect of existing medical projection technology that the lesion organ 1 must be marked manually, reduces the doctor's operation time, and improves operation efficiency.
  • Fig. 3 is a schematic diagram of a surgical navigation system provided by an embodiment of the present application.
  • the surgical navigation system 300 of the embodiment of the present application includes the surgical navigation device 100 in the above embodiments.
  • for the specific implementation process of the surgical navigation device 100, reference may be made to the related description of the embodiment shown in FIG. 2, which will not be repeated here.
  • the surgical navigation system 300 of the embodiment of the present application further includes a mobile platform 310 on which the surgical navigation device 100 is set.
  • the mobile platform 310 is mounted on wheels and can be moved or fixed as required.
  • the industrial computer 110 in the surgical navigation device 100 and the imaging unit 130 connected to the industrial computer 110 can be set on the mobile platform 310 .
  • the surgical navigation system 300 of the embodiment of the present application further includes a robotic arm 320 , an imaging unit 130 is installed at one end of the robotic arm 320 , and the other end of the robotic arm is arranged on a mobile platform 310 .
  • the robotic arm 320 is respectively connected to the imaging unit 130 and the mobile platform 310 .
  • the robotic arm 320 is a six-degree-of-freedom robotic arm, which can adjust the working distance and working angle of the imaging unit 130, making the entire surgical navigation system easy to move and facilitate the operation of the doctor.
  • the surgical navigation system 300 of the embodiment of the present application further includes a display 330 , which is arranged on the mobile platform 310 and is used for displaying position images.
  • the display 330 can be placed directly on the mobile platform 310 .
  • the display 330 is used to display the distribution image of the tumor in the lesion organ 1 sent by the image processing module in the industrial computer 110, the near-infrared fluorescence image of the perfusion of human lymph vessels, blood vessels, and related tissues, or the position image obtained by superimposing and fusing the tumor fluorescence image and the lesion organ image.
  • the surgical navigation system can obtain real-time images of the distribution of tumors in the lesion organ and accurately project them on the surface of the lesion organ; it can also image the perfusion of lymph vessels, blood vessels, and related tissues in the human body in real time; the working distance and working angle of the imaging unit can be adjusted; at different working distances it can quickly and automatically focus, realizing automatic real-time focusing for both near-infrared imaging and visible-light color imaging to obtain clear images; it can indicate the location of the excitation laser for the doctor, facilitating the operation; and the shared-light-path technology of the projection light source and the near-infrared camera overcomes the defect of manually marking the lesion organ in existing medical projection technology, providing an intuitive surgical navigation system that reduces the doctor's operation time and improves operation efficiency.
  • FIG. 4 is a flowchart of a surgical navigation method provided by an embodiment of the present application.
  • the method of the embodiment of the present application can be applied to the surgical navigation device shown in FIG. 1 .
  • the surgical navigation method of the embodiment of the present application includes:
  • the surgical navigation method in the embodiment of the present application specifically includes the following six steps:
  • the first step, a light source emitting step: excitation light is emitted through the turned-on excitation light source 209;
  • the light source emitting step may also include a homogenization step: the excitation light emitted by the excitation light source 209 is homogenized by the homogenization module 214 to obtain uniform excitation light;
  • the second step, an excitation step: the excitation light irradiates the lesion organ, so that the tumor site where markers such as indocyanine green have accumulated is excited to emit a fluorescence signal, wherein the wavelength of the fluorescence signal is in the range of 700-1700 nm;
  • the excitation step may also include an indication step: the indicating light source 207 emits indicating light, which indicates the position where the excitation light emitted by the excitation light source 209 is projected on the lesion organ 1; this step further includes shaping the indicating light emitted by the indicating light source 207 into contour-form indicating light by means of the diffractive element 215, the range in which the contour-form indicating light is projected on the lesion organ 1 being the same as the range projected by the excitation light source 209;
  • the third step, a signal receiving step: the near-infrared camera 201 receives the fluorescence signal to acquire the tumor fluorescence image, and the visible light camera 210 acquires the lesion organ image;
  • the signal receiving step may also include a ranging step: the ranging module 208 measures the distance between the imaging unit 130 and the lesion organ 1 and transmits the distance to the near-infrared camera 201 through the industrial computer 110, so that the near-infrared lens 202 in front of the near-infrared camera 201 is focused; a clear tumor fluorescence image is then acquired through the focused near-infrared camera;
  • the third step may also include an ambient light compensation step: compensation light is projected onto the lesion organ 1 through the turned-on compensation light source 212;
  • the fourth step, an image transmission step: the tumor fluorescence image and the lesion organ image are transmitted to the industrial computer 110;
  • the fifth step, an image fusion step: the industrial computer 110 receives the tumor fluorescence image and the lesion organ image and superimposes and fuses them to obtain the position image;
  • the sixth step, a projection step: the industrial computer 110 controls the projection light source 120 so that the projection light source 120 projects the position image on the lesion organ through the projection lens 205.
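Taken together, the steps above amount to a control loop like the following Python sketch. Every object and method name here is a hypothetical stand-in for a hardware module, since the patent describes hardware rather than a software interface; only the control flow mirrors the described method.

```python
def navigation_cycle(excitation, indicator, rangefinder, nir_camera,
                     visible_camera, industrial_pc, projector):
    """One cycle of the six-step surgical navigation method.
    All arguments are hypothetical driver objects for the hardware
    modules; the call sequence follows the steps in the text."""
    excitation.emit()                      # step 1: emit (homogenized) excitation light
    indicator.outline_spot()               # indication step: outline the excitation area
    distance = rangefinder.measure()       # ranging step used for autofocus
    nir_camera.focus(distance)
    fluorescence = nir_camera.capture()    # steps 2-3: excite tumor, receive signal
    organ = visible_camera.capture()       # lesion organ image (visible light)
    position = industrial_pc.fuse(fluorescence, organ)  # steps 4-5: transmit and fuse
    projector.project(position)            # step 6: project position image on the organ
    return position
```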
  • the propagation directions of the fluorescence signal and the projected position image signal are adjusted so that the two beams become parallel light (although they propagate in different directions), so that the tumor fluorescence image can be accurately projected on the surface of the lesion organ.
  • the method in the embodiment of the present application can be used to execute the technical solution of any one of the above-mentioned embodiments of the surgical navigation device, and its implementation principle and technical effect are similar, and will not be repeated here.


Abstract

A surgical navigation device (100) and system. The device (100) comprises an industrial computer (110) that superimposes and fuses a tumor fluorescence image with a lesion-organ image to obtain a position image; a projection light source (120), contained in an imaging unit (130) connected to the industrial computer (110), projects the position image onto the lesion organ. The surgical navigation device (100) is more intuitive, reduces the surgeon's operating time, and improves surgical efficiency.

Description

Surgical Navigation Device and System
This application claims priority to Chinese patent application No. 202110725203.7, entitled "Surgical Navigation Device and System", filed with the China National Intellectual Property Administration on June 29, 2021, and to Chinese patent application No. 202121465017.6, entitled "Surgical Navigation Device and System", filed with the China National Intellectual Property Administration on June 29, 2021, the entire contents of which are incorporated herein by reference.
Technical Field
This application relates to medical device technology, and in particular to a surgical navigation device and system.
Background
In modern surgery, after a fluorescent marker is injected into the human body it accumulates near the tumor of the lesion organ, allowing a fluorescence molecular imaging surgical navigation system to obtain the tumor's location and morphology, as well as an image of the lesion organ, through fluorescence imaging. By fusing the tumor fluorescence image with the lesion-organ image and displaying the result on a monitor, the system helps the surgeon perform tumor resection.
However, with existing surgical navigation systems the surgeon must observe the lesion tissue or organ on a monitor and then compare it with the patient's real tissue or organ to complete the operation. This forces the surgeon's gaze to switch between the monitor and the real lesion organ, increasing the surgeon's fatigue and prolonging the operation.
Summary
This application provides a surgical navigation device and system to better assist surgeons during operations, relieve surgeon fatigue, and reduce operating time.
In a first aspect, this application provides a surgical navigation device, comprising:
an industrial computer, configured to superimpose and fuse a tumor fluorescence image with a lesion-organ image to obtain a position image, the position image being used to indicate the position of the tumor on the lesion organ;
an imaging unit connected to the industrial computer, the imaging unit comprising a projection light source configured to project the position image onto the lesion organ.
Optionally, the imaging unit further comprises a near-infrared camera, configured to acquire the tumor fluorescence image and transmit it to the industrial computer.
Optionally, the imaging unit further comprises a beam splitter, configured to transmit the fluorescence emitted by the tumor to the near-infrared camera, and to project the projection light emitted by the projection light source onto the lesion organ; the optical path of the fluorescence is a fluorescence path, the optical path of the projection light is a projection path, and between the beam splitter and the lesion organ the projection path and the fluorescence path share a common optical path.
Optionally, the imaging unit further comprises a ranging module, configured to determine the distance between the imaging unit and the lesion organ and transmit the distance to the near-infrared camera; the near-infrared camera is further configured to focus its near-infrared lens according to the distance.
Optionally, the imaging unit further comprises a near-infrared filter element, configured to filter out light outside a preset wavelength range.
Optionally, the imaging unit further comprises a visible-light camera, configured to acquire the lesion-organ image and transmit it to the industrial computer.
Optionally, the imaging unit further comprises a compensation light source, configured to project visible light onto the lesion organ, providing ambient light for the visible-light camera when the ambient light falls below a threshold.
Optionally, the imaging unit further comprises an excitation light source, configured to project excitation light onto the lesion organ.
Optionally, the imaging unit further comprises an indication light source, configured to emit indication light that indicates where the excitation light emitted by the excitation light source is projected on the lesion organ.
Optionally, the indication light source comprises a diffractive element, configured to shape the light emitted by the indication light source into contour-form indication light whose projected extent on the lesion organ is the same as the extent illuminated by the excitation light source.
Optionally, the excitation light source comprises a homogenization module, configured to homogenize the excitation light emitted by the excitation light source.
In a second aspect, this application provides a surgical navigation system comprising the surgical navigation device of the first aspect.
Optionally, the surgical navigation system further comprises a mobile platform on which the surgical navigation device is mounted.
Optionally, the surgical navigation system further comprises a robotic arm; the imaging unit is mounted at one end of the robotic arm, and the other end of the robotic arm is mounted on the mobile platform.
Optionally, the surgical navigation system further comprises a display mounted on the mobile platform and configured to display the position image.
In a third aspect, this application provides a surgical navigation method, comprising:
superimposing and fusing a tumor fluorescence image with a lesion-organ image to obtain a position image, the position image being used to indicate the position of the tumor on the lesion organ; and
projecting the position image onto the lesion organ.
The surgical navigation device and system provided by this application superimpose and fuse the tumor fluorescence image with the lesion-organ image in the industrial computer to obtain a position image, and a projection light source contained in the imaging unit connected to the industrial computer projects the position image onto the lesion organ. Compared with the prior art, the position image obtained by fusing the two images is projected directly onto the lesion organ, so the surgeon does not need to observe the organ on a monitor; the result is more intuitive, reduces operating time, and improves surgical efficiency. Moreover, thanks to the shared optical path, moving the imaging unit within the working distance does not require the projection light source to re-adjust the projection region.
Brief Description of the Drawings
To explain the technical solutions in the embodiments of this application or in the prior art more clearly, the drawings needed in their description are briefly introduced below. Evidently, the drawings described below show only some embodiments of this application; those of ordinary skill in the art can derive other drawings from them without creative effort.
Fig. 1 is a schematic diagram of a surgical navigation device provided by an embodiment of this application;
Fig. 2 is a schematic diagram of the imaging unit 130 provided by an embodiment of this application;
Fig. 3 is a schematic diagram of a surgical navigation system provided by an embodiment of this application;
Fig. 4 is a flowchart of a surgical navigation method provided by an embodiment of this application.
Detailed Description
Exemplary embodiments are described in detail here, with examples shown in the accompanying drawings. Where the following description refers to the drawings, the same numerals in different drawings denote the same or similar elements unless otherwise indicated. The implementations described in the following exemplary embodiments do not represent all implementations consistent with this application; they are merely examples of devices and methods consistent with some aspects of this application as detailed in the appended claims.
In modern surgery, indocyanine green can be injected into the human body so that it accumulates at the tumor of the lesion organ. Indocyanine green fluorescence imaging (under laser irradiation at 785 nm-808 nm, indocyanine green is maximally excited to fluoresce in the near-infrared band) is used to locate the tumor, acquire its morphology, and acquire an image of the lesion organ; the tumor image is fused with the lesion-organ image and shown on a display to help the surgeon resect the tumor. At present, however, the surgeon must observe the lesion tissue or organ on a screen and compare it with the patient's real tissue and organs to complete the operation. This method is not intuitive and requires the surgeon to switch between different fields of view, which directly prolongs the operation and increases fatigue. Among other prior approaches, some show the tumor's position within the lesion organ on a display, requiring the surgeon to operate while consulting the screen, which is again not intuitive; others manually mark the organ or the organ image before projection, which lengthens preoperative preparation and thus the operation, hindering a quick procedure.
Moreover, none of the above approaches can image or monitor the perfusion of lymph, blood vessels, and related tissue.
To address these problems, this application provides a surgical navigation device and system that can obtain the distribution image of a tumor within the lesion organ in real time and project it accurately onto the organ's surface, so that the surgeon sees the tumor's distribution directly on the organ. This is more intuitive, reduces operating time, and improves surgical efficiency.
Fig. 1 is a schematic diagram of a surgical navigation device provided by an embodiment of this application. As shown in Fig. 1, the surgical navigation device 100 of this embodiment comprises an industrial computer 110 and an imaging unit 130 connected to it; the imaging unit 130 comprises a projection light source 120. Specifically:
The industrial computer 110 superimposes and fuses the tumor fluorescence image with the lesion-organ image to obtain a position image, which indicates the position of the tumor on the lesion organ.
The projection light source 120 projects the position image onto the lesion organ.
In this embodiment the tumor fluorescence image can be obtained through fluorescent-marker imaging. Illustratively, the marker is indocyanine green, although this application is not limited to it: after injection into the human body, indocyanine green concentrates at the tumor of the lesion organ and, when excited by light of a specific wavelength, emits near-infrared fluorescence that a photosensitive device captures to form the tumor fluorescence image. Illustratively, the industrial computer 110 is a microcomputer built from large-scale integrated circuits and comprises an image-processing module and a system-control module. The image-processing module receives the tumor fluorescence image and the lesion-organ image and superimposes and fuses them; concretely, the processed fluorescence image may be overlaid in a specific color directly onto the organ image to obtain the position image, which shows the tumor's position on the organ, i.e. the tumor's distribution within the lesion tissue. The system-control module feeds the position image to the connected projection light source 120, which receives it and projects it through a projection lens onto the surface of the observed lesion organ. Illustratively, the projection light source 120 may be a projector or a spatial light modulator; this application is not limited thereto.
The surgical navigation device provided by this embodiment fuses the tumor fluorescence image with the lesion-organ image in the industrial computer to obtain a position image, and the projection light source in the connected imaging unit projects that image onto the lesion organ. Compared with the prior art, the fused position image is projected directly onto the organ, so the surgeon does not need to observe the organ on a monitor; the result is more intuitive, reduces operating time, and improves surgical efficiency.
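The superposition and fusion performed by the industrial computer can be sketched as a pseudo-color overlay. The following is a minimal illustration only, assuming an 8-bit grayscale fluorescence image and an RGB organ image; the green tint, threshold, and blending weight are arbitrary choices for the sketch, not the patent's actual algorithm:

```python
import numpy as np

def fuse_position_image(organ_rgb: np.ndarray, fluor_gray: np.ndarray,
                        threshold: int = 50, alpha: float = 0.6) -> np.ndarray:
    """Overlay a fluorescence image onto a color organ image in green.

    organ_rgb:  HxWx3 uint8 color image of the lesion organ
    fluor_gray: HxW   uint8 near-infrared fluorescence image
    Pixels whose fluorescence exceeds `threshold` are tinted green,
    weighted by `alpha`; all other pixels are left unchanged.
    """
    fused = organ_rgb.astype(np.float32)
    mask = fluor_gray > threshold                 # assumed tumor region
    green = np.zeros_like(fused)
    green[..., 1] = fluor_gray                    # map intensity to the green channel
    fused[mask] = (1 - alpha) * fused[mask] + alpha * green[mask]
    return fused.clip(0, 255).astype(np.uint8)

# toy example: a uniform 4x4 organ image with one bright fluorescent pixel
organ = np.full((4, 4, 3), 120, dtype=np.uint8)
fluor = np.zeros((4, 4), dtype=np.uint8)
fluor[1, 2] = 200
position_image = fuse_position_image(organ, fluor)
```

In a real system the two images would first have to be registered onto the same pixel grid; the common optical path introduced with the beam splitter is what keeps that registration stable while the imaging unit moves.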
Fig. 2 is a schematic diagram of the imaging unit 130 provided by an embodiment of this application. Building on the previous embodiment, this embodiment further describes the imaging unit 130 of the surgical navigation device 100. As shown in Fig. 2, in addition to the projection light source 120, the imaging unit 130 comprises a near-infrared camera 201, which acquires the tumor fluorescence image and transmits it to the industrial computer.
The near-infrared camera 201 is a digital imaging device sensitive to electromagnetic waves in the 780-3000 nm range. Illustratively, with indocyanine green already injected into the human body and accumulated at the tumor of the lesion organ 1, the near-infrared camera 201 obtains, through the near-infrared lens 202, the near-infrared fluorescence image emitted by the fluorescent probe (e.g. indocyanine green), thereby acquiring the tumor fluorescence image and transmitting it to the industrial computer 110, which receives it.
Building on the above, referring to Fig. 2, the imaging unit 130 of this embodiment further comprises a near-infrared filter element 203, which filters out light outside a preset wavelength range.
In this embodiment the near-infrared filter element 203 blocks light other than infrared, such as visible light, and passes only infrared. Illustratively, it passes infrared in the 700-1700 nm range, thereby removing the tumor-exciting excitation light and the visible light reflected by the lesion organ 1. The near-infrared camera 201 obtains the indocyanine-green fluorescence image of the tumor through the near-infrared lens 202 and the filter element 203. Because the passed wavelength range of 700-1700 nm penetrates tissue deeply, a tumor fluorescence image with a high signal-to-noise ratio can be obtained. In practice this can be used to image human lymph and blood vessels and to monitor the perfusion of related tissue.
Building on the above, referring to Fig. 2, the imaging unit 130 of this embodiment further comprises a beam splitter 206, which transmits the fluorescence emitted by the tumor to the near-infrared camera and projects the projection light from the projection light source onto the lesion organ 1; the optical path of the fluorescence is the fluorescence path, the optical path of the projection light is the projection path, and between the beam splitter and the lesion organ 1 the two paths share a common optical path.
In this embodiment the beam splitter 206 is a passive component that needs no external power, only input light. Illustratively, it is a dichroic mirror that separates specific spectra from the light source and redirects part of the light: it transmits light of certain wavelengths almost completely while reflecting light of other wavelengths almost completely. As shown in Fig. 2, the beam splitter 206 transmits the tumor's fluorescence to the near-infrared camera 201 (the fluorescence path) and projects the light from the projection light source 120 onto the lesion organ 1 (the projection path). Through the beam splitter 206 the projection path and the fluorescence path share a common optical path, so the tumor fluorescence image is projected accurately onto the surface of the lesion organ 1. This common-path arrangement makes accurate projection possible and thus removes the prior-art need to mark the lesion organ 1 manually. Moreover, prior art that did not use a common path could not project well even with a projection light source 120 added: without a common path there is an angle between the projection light source 120 and the near-infrared camera 201, so when the surgeon adjusts the distance between the imaging unit 130 and the operating table during surgery, the camera's imaging region and the projector's projection region shift independently of each other, and regions that were originally aligned fall out of alignment.
Building on the above, referring to Fig. 2, the imaging unit 130 of this embodiment further comprises a ranging module 208, which determines the distance between the imaging unit 130 and the lesion organ 1 and transmits it to the near-infrared camera 201; the camera then focuses its near-infrared lens 202 according to that distance.
Illustratively, the ranging module 208 uses a laser as its light source and measures the distance to the target accurately by laser. As shown in Fig. 2, the ranging module 208 measures the distance between the imaging unit 130 and the lesion organ 1 and transmits it toward the near-infrared lens 202. The near-infrared camera 201 receives this distance from the ranging module 208 and focuses the lens 202 accordingly to its sharpest position, yielding the sharpest tumor fluorescence image. Illustratively, the working-distance adjustment range of the imaging unit 130 is 100 mm-1000 mm, although this application is not limited thereto.
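The distance-based focusing can be illustrated with the thin-lens equation. The 50 mm focal length below is an assumption for the sketch; the patent does not specify the lens parameters or how the drive actually maps distance to lens position:

```python
def lens_extension_mm(object_distance_mm: float, focal_length_mm: float = 50.0) -> float:
    """Return the image distance (lens-to-sensor spacing) that brings an
    object at `object_distance_mm` into focus, from the thin-lens equation:
    1/f = 1/u + 1/v  =>  v = f*u / (u - f)
    """
    u, f = object_distance_mm, focal_length_mm
    if u <= f:
        raise ValueError("object closer than the focal length cannot be focused")
    return f * u / (u - f)

# working-distance range of the imaging unit: 100 mm - 1000 mm
near = lens_extension_mm(100.0)    # sensor farthest from the lens
far = lens_extension_mm(1000.0)    # spacing approaches the focal length
```

A focus drive calibrated against such a curve is what lets the camera refocus automatically as the ranging module reports each new working distance.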
Building on the above, referring to Fig. 2, the imaging unit 130 of this embodiment further comprises a visible-light camera 210, which acquires the image of the lesion organ 1 and transmits it to the industrial computer.
Illustratively, the visible-light camera 210 may be any camera capable of imaging. As shown in Fig. 2, it acquires a color image of the lesion organ 1 through the visible-light lens 211 and transmits it to the industrial computer 110, which receives it.
Building on the above, referring to Fig. 2, the imaging unit 130 of this embodiment further comprises a compensation light source 212, which projects visible light onto the lesion organ 1, providing ambient light for the visible-light camera when the ambient light falls below a threshold.
In this embodiment the compensation light source 212 supplements the ambient light for the visible-light camera 210 when illumination is insufficient, so that the camera can still obtain a color image of the lesion organ 1. Illustratively, the compensation light source 212 is a light-emitting diode (LED) whose output supplements the ambient light for the visible-light camera 210. Illustratively, during tumor resection the imaging unit 130 must be moved above the tumor site to observe the tumor image; the unit may then block the light in the operating room, so that the visible-light camera 210 cannot image the lesion organ 1 clearly. Switching on the compensation light source 212 lets the camera obtain a clear color image, which is then superimposed and fused in the industrial computer 110 with the indocyanine-green tumor fluorescence image acquired by the near-infrared camera 201 through the near-infrared lens 202 and filter element 203, yielding the position image. The position image is projected through the projection light source 120, projection lens 205, and beam splitter 206 onto the surface of the observed lesion organ 1, intuitively guiding the surgeon through the resection.
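The compensation logic described above amounts to a threshold check on measured brightness. The mean-intensity metric and the threshold value here are illustrative assumptions; the patent only states that compensation is provided when ambient light falls below a threshold:

```python
import numpy as np

def compensation_needed(frame: np.ndarray, threshold: float = 40.0) -> bool:
    """Return True when the visible-light frame is too dark, i.e. the
    compensation light source should be switched on.  Brightness is taken
    here as the mean pixel intensity of a grayscale frame."""
    return float(frame.mean()) < threshold

dark = np.full((8, 8), 10, dtype=np.uint8)     # imaging unit shadows the organ
bright = np.full((8, 8), 150, dtype=np.uint8)  # normal operating-room lighting
```

A real controller would likely add hysteresis around the threshold so the LED does not flicker on and off near the boundary.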
Building on the above, referring to Fig. 2, the imaging unit 130 of this embodiment further comprises an excitation light source 209, which projects excitation light onto the lesion organ 1.
In this embodiment the excitation light source 209 projects a uniform spot of excitation light onto the surface of the lesion organ 1 to produce tumor fluorescence. Illustratively, the excitation light source 209 is a laser source with a center wavelength of 785 nm ± 5 nm, though this application is not limited thereto. Illustratively, its power is adjustable over 10 mW-3000 mW; a higher laser output power helps the system detect very small tumors, though this application is not limited to that adjustment range.
Building on the above, the excitation light source 209 comprises a homogenization module 214, which homogenizes the excitation light emitted by the excitation light source.
In this embodiment the homogenization module 214 of the excitation light source 209 evens out the emitted excitation light so that the spot on the surface of the lesion organ 1 has a uniform intensity distribution. Illustratively, the excitation light source 209 consists of a power-adjustable semiconductor laser and the homogenization module 214, producing a uniform, power-adjustable excitation spot; at high output power this helps the system detect very small tumors.
Building on the above, referring to Fig. 2, the imaging unit 130 of this embodiment further comprises an indication light source 207, which emits indication light marking where the excitation light from the excitation light source 209 falls on the lesion organ 1.
Illustratively, the indication light source 207 is a laser source with a center wavelength of 520 nm, though this application is not limited thereto. By emitting indication light it marks where the excitation light from the excitation light source 209 lands on the lesion organ 1, showing the surgeon the region of the excitation spot and easing the operation.
Building on the above, the indication light source 207 comprises a diffractive element 215, which shapes its output into contour-form indication light whose projected extent on the lesion organ 1 matches the extent illuminated by the excitation light source 209.
In this embodiment the diffractive element 215 shapes the indication light emitted by the indication light source 207 into a contour, also called contour light, whose outline coincides with the outline of the excitation spot on the surface of the lesion organ 1, thereby indicating the illuminated range of the excitation light source 209.
In this embodiment the excitation light source 209 projects a uniform spot onto the surface of the lesion organ 1 to produce tumor fluorescence; the near-infrared camera 201 obtains the indocyanine-green tumor fluorescence image through the near-infrared lens 202 and filter element 203; the visible-light camera 210 acquires a color image of the lesion organ 1 through the visible-light lens 211; both images are transmitted to the industrial computer 110 and superimposed and fused into an image containing both the lesion organ 1 and the tumor, which is projected through the projection light source 120, its projection lens 205, and the beam splitter 206 onto the surface of the observed lesion organ 1. This guides the surgeon intuitively through tumor resection, overcomes the defect of existing medical projection technology that the lesion organ 1 must be marked manually, reduces operating time, and improves surgical efficiency.
Fig. 3 is a schematic diagram of a surgical navigation system provided by an embodiment of this application. Building on the above, as shown in Fig. 3, the surgical navigation system 300 of this embodiment comprises the surgical navigation device 100 of the preceding embodiments. For the specific implementation of the surgical navigation device 100, see the description of the embodiment shown in Fig. 2, not repeated here.
Building on the above, referring to Fig. 3, the surgical navigation system 300 of this embodiment further comprises a mobile platform 310 on which the surgical navigation device 100 is mounted.
Illustratively, the mobile platform 310 is fitted with wheels and can be moved or fixed as needed. As shown in Fig. 3, the industrial computer 110 of the surgical navigation device 100 and the imaging unit 130 connected to it can be mounted on the mobile platform 310.
Building on the above, referring to Fig. 3, the surgical navigation system 300 of this embodiment further comprises a robotic arm 320; the imaging unit 130 is mounted at one end of the arm and the other end is mounted on the mobile platform 310.
Illustratively, as shown in Fig. 3, the robotic arm 320 connects the imaging unit 130 and the mobile platform 310. It is a six-degree-of-freedom arm that can adjust the working distance and working angle of the imaging unit 130, making the whole surgical navigation system easy to move and convenient for the surgeon to operate.
Building on the above, referring to Fig. 3, the surgical navigation system 300 of this embodiment further comprises a display 330 mounted on the mobile platform 310 and used to show the position image.
Illustratively, the display 330 can be placed directly on the mobile platform 310. It shows the tumor-distribution image within the lesion organ 1 sent by the image-processing module of the industrial computer 110, or near-infrared fluorescence images of human lymph, blood vessels, and related tissue perfusion, or the position image obtained by superimposing and fusing the tumor fluorescence image with the lesion-organ image.
The surgical navigation system provided by this embodiment can obtain the distribution image of a tumor within the lesion organ in real time and project it accurately onto the organ surface; image the perfusion of human lymph, blood vessels, and related tissue in real time; adjust the working distance and angle of the imaging unit; autofocus quickly at different working distances so that near-infrared and visible-light color imaging stay in focus automatically in real time and yield clear images; and indicate the position of the excitation laser for the surgeon, easing the operation. Moreover, by adopting a common optical path for the projection light source and the near-infrared camera, it overcomes the defect of existing medical projection technology that the lesion organ must be marked manually, providing an intuitive surgical navigation system that reduces operating time and improves surgical efficiency.
Fig. 4 is a flowchart of a surgical navigation method provided by an embodiment of this application; the method can be applied to the surgical navigation device shown in Fig. 1. As shown in Fig. 4, the method comprises:
S401: superimpose and fuse the tumor fluorescence image with the lesion-organ image to obtain a position image, which indicates the position of the tumor on the lesion organ.
S402: project the position image onto the lesion organ.
In one possible implementation, based on the surgical navigation device of Fig. 1 and the imaging unit 130 of Fig. 2, the surgical navigation method of this embodiment comprises the following six steps:
Step 1, light-emission step: the switched-on excitation light source 209 emits excitation light;
optionally, the light-emission step may also include a homogenization step: the excitation light from the excitation light source 209 passes through the homogenization module 214 to yield uniform excitation light;
Step 2, excitation step: the excitation light irradiates the lesion organ, so that the tumor site where markers such as indocyanine green have accumulated is excited to emit a fluorescence signal with a wavelength in the 700-1700 nm range;
optionally, the excitation step may also include an indication step: the indication light source 207 emits indication light marking where the excitation light from the excitation light source 209 falls on the lesion organ 1; in this step the indication light is shaped by the diffractive element 215 into contour-form indication light whose projected extent on the lesion organ 1 matches the extent illuminated by the excitation light source 209;
Step 3, signal-reception step: after being split by the beam splitter 206, the fluorescence signal passes through the near-infrared filter element 203, which removes light outside the 700-1700 nm wavelength range, enters the near-infrared lens 202, and is received by the near-infrared camera 201, which forms the near-infrared fluorescence image and thus the tumor fluorescence image; at the same time, visible light reflected by the lesion organ 1 enters the visible-light lens 211 and is received by the visible-light camera 210, yielding the lesion-organ image;
further optionally, the signal-reception step may also include a ranging step: the ranging module 208 measures the distance between the imaging unit 130 and the lesion organ 1 and transmits it via the industrial computer 110 to the near-infrared camera 201, so that the near-infrared lens 202 in front of the camera can be focused; the focused near-infrared camera then acquires a clear tumor fluorescence image;
optionally, Step 3 may also include an ambient-light compensation step: the switched-on compensation light source 212 projects compensation light onto the lesion organ 1;
Step 4, image-transmission step: transmit the tumor fluorescence image and the lesion-organ image to the industrial computer 110;
Step 5, image-fusion step: the industrial computer 110 receives the tumor fluorescence image and the lesion-organ image and superimposes and fuses them to obtain the position image;
Step 6, projection step: the industrial computer 110 controls the projection light source 120 so that it projects the position image onto the lesion organ through the projection lens 205.
Optionally, the propagation directions of the fluorescence signal and the projected position-image signal are adjusted so that both beams become parallel light (although their propagation directions differ), so that the tumor fluorescence image can be projected accurately onto the surface of the lesion organ.
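The six steps above can be strung together as a single control cycle. Every function name below is a hypothetical stand-in for the corresponding hardware action, since the patent does not specify a software interface; the callables are injected so the sketch stays hardware-agnostic:

```python
import numpy as np

def navigation_cycle(excite, measure_distance, focus_nir, capture_nir,
                     capture_visible, fuse, project):
    """One pass of the surgical-navigation method: excite fluorescence,
    range and focus, acquire both images, fuse them, project the result."""
    excite()                              # steps 1-2: excitation light on
    focus_nir(measure_distance())         # optional ranging + lens focusing
    fluor = capture_nir()                 # step 3: tumor fluorescence image
    organ = capture_visible()             # step 3: lesion-organ color image
    position_image = fuse(fluor, organ)   # steps 4-5: transmission + fusion
    project(position_image)               # step 6: project onto the organ
    return position_image

# dry run with stub hardware, logging the side-effecting calls
log = []
img = navigation_cycle(
    excite=lambda: log.append("excite"),
    measure_distance=lambda: 500.0,             # mm, from the ranging module
    focus_nir=lambda d: log.append(f"focus@{d}"),
    capture_nir=lambda: np.full((2, 2), 200),
    capture_visible=lambda: np.full((2, 2), 100),
    fuse=lambda f, o: (f + o) // 2,             # placeholder fusion rule
    project=lambda im: log.append("project"),
)
```

In operation the cycle would repeat continuously so the projected position image tracks the organ in real time.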
The method of this embodiment can be used to execute the technical solution of any of the surgical-navigation-device embodiments shown above; its implementation principle and technical effects are similar and are not repeated here.
It should be understood that the various numerals involved in the embodiments of this application serve only to distinguish items for ease of description and are not intended to limit the scope of the embodiments. The magnitudes of the sequence numbers of the processes above do not imply an order of execution; the execution order should be determined by their functions and internal logic and does not constitute any limitation on the implementation of the embodiments of this application.
Finally, it should be noted that the embodiments above merely illustrate, rather than limit, the technical solutions of this application. Although this application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that they may still modify the technical solutions recorded in the foregoing embodiments or replace some or all of their technical features with equivalents, and such modifications or replacements do not take the essence of the corresponding technical solutions outside the scope of the technical solutions of the embodiments of this application.

Claims (16)

  1. A surgical navigation device, characterized by comprising:
    an industrial computer, configured to superimpose and fuse a tumor fluorescence image with a lesion-organ image to obtain a position image, the position image being used to indicate the position of the tumor on the lesion organ;
    an imaging unit connected to the industrial computer, the imaging unit comprising a projection light source, the projection light source being configured to project the position image onto the lesion organ.
  2. The surgical navigation device according to claim 1, characterized in that the imaging unit further comprises:
    a near-infrared camera, configured to acquire the tumor fluorescence image and transmit the tumor fluorescence image to the industrial computer.
  3. The surgical navigation device according to claim 2, characterized in that the imaging unit further comprises:
    a beam splitter, configured to transmit the fluorescence emitted by the tumor to the near-infrared camera, and to project the projection light emitted by the projection light source onto the lesion organ;
    wherein the optical path of the fluorescence is a fluorescence path, the optical path of the projection light is a projection path, and the projection path and the fluorescence path between the beam splitter and the lesion organ are a common optical path.
  4. The surgical navigation device according to claim 2, characterized in that the imaging unit further comprises:
    a ranging module, configured to determine the distance between the imaging unit and the lesion organ and transmit the distance to the near-infrared camera;
    the near-infrared camera being further configured to focus the near-infrared lens of the near-infrared camera according to the distance.
  5. The surgical navigation device according to claim 2, characterized in that the imaging unit further comprises a near-infrared filter element, the near-infrared filter element being configured to filter out light outside a preset wavelength range.
  6. The surgical navigation device according to claim 1, characterized in that the imaging unit further comprises:
    a visible-light camera, configured to acquire the lesion-organ image and transmit the lesion-organ image to the industrial computer.
  7. The surgical navigation device according to claim 6, characterized in that the imaging unit further comprises:
    a compensation light source, configured to project visible light onto the lesion organ, providing ambient light for the visible-light camera when the ambient light falls below a threshold.
  8. The surgical navigation device according to any one of claims 1 to 7, characterized in that the imaging unit further comprises an excitation light source, the excitation light source being configured to project excitation light onto the lesion organ.
  9. The surgical navigation device according to claim 8, characterized in that the imaging unit further comprises an indication light source, the indication light source being configured to emit indication light, the indication light being used to indicate the position where the excitation light emitted by the excitation light source is projected on the lesion organ.
  10. The surgical navigation device according to claim 9, characterized in that the indication light source comprises a diffractive element, the diffractive element being configured to shape the light emitted by the indication light source into contour-form indication light, the extent of the contour-form indication light projected on the lesion organ being the same as the extent projected on the lesion organ by the excitation light source.
  11. The surgical navigation device according to claim 10, characterized in that the excitation light source comprises a homogenization module, the homogenization module being configured to homogenize the excitation light emitted by the excitation light source.
  12. A surgical navigation system, characterized by comprising the surgical navigation device according to any one of claims 1 to 11.
  13. The surgical navigation system according to claim 12, characterized by further comprising a mobile platform, the surgical navigation device being arranged on the mobile platform.
  14. The surgical navigation system according to claim 13, characterized by further comprising a robotic arm, one end of the robotic arm being fitted with the imaging unit, and the other end of the robotic arm being arranged on the mobile platform.
  15. The surgical navigation system according to claim 13, characterized by further comprising a display, the display being arranged on the mobile platform and configured to display the position image.
  16. A surgical navigation method, characterized by comprising:
    superimposing and fusing a tumor fluorescence image with a lesion-organ image to obtain a position image, the position image being used to indicate the position of the tumor on the lesion organ; and
    projecting the position image onto the lesion organ.
PCT/CN2021/123830 2021-06-29 2021-10-14 Surgical navigation device and system WO2023273013A1 (zh)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
CN202110725203.7A CN115530972A (zh) 2021-06-29 2021-06-29 Surgical navigation device and system
CN202121465017.6U CN215606241U (zh) 2021-06-29 2021-06-29 Surgical navigation device and system
CN202110725203.7 2021-06-29
CN202121465017.6 2021-06-29

Publications (1)

Publication Number Publication Date
WO2023273013A1 true WO2023273013A1 (zh) 2023-01-05

Family

ID=84692437

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/123830 WO2023273013A1 (zh) 2021-06-29 2021-10-14 手术导航装置及系统

Country Status (1)

Country Link
WO (1) WO2023273013A1 (zh)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080004533A1 (en) * 2006-06-30 2008-01-03 General Electric Company Optical imaging systems and methods
CN203915087U (zh) * 2014-06-24 2014-11-05 中国科学技术大学 一种乳腺癌显像投射导航系统
CN104605824A (zh) * 2015-02-10 2015-05-13 安徽信美医学工程科技有限公司 一种病变部位显像投影导航装置
CN106943193A (zh) * 2017-04-27 2017-07-14 华中科技大学 共定位手术导航系统和相机头
CN107049491A (zh) * 2017-05-16 2017-08-18 江苏信美医学工程科技有限公司 一种共光轴式病变部位显像投影导航装置与方法
CN207837634U (zh) * 2017-05-16 2018-09-11 合肥新迈美克医疗科技有限公司 一种共光轴式病变部位显像投影导航装置
CN110584783A (zh) * 2019-10-14 2019-12-20 中国科学技术大学 外科手术导航系统
CN111616799A (zh) * 2020-06-08 2020-09-04 广东欧谱曼迪科技有限公司 一种增强现实近红外荧光导航系统及方法



Legal Events

Date Code Title Description
121: Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 21947940; Country of ref document: EP; Kind code of ref document: A1)
NENP: Non-entry into the national phase (Ref country code: DE)