CN215606241U - Surgical navigation device and system - Google Patents

Surgical navigation device and system

Info

Publication number
CN215606241U
CN215606241U (application CN202121465017.6U)
Authority
CN
China
Prior art keywords
organ
image
light source
surgical navigation
light
Prior art date
Legal status
Active
Application number
CN202121465017.6U
Other languages
Chinese (zh)
Inventor
汪远
赵可为
周丰茂
Current Assignee
Nanjing Weina Shijie Medical Technology Co ltd
Original Assignee
Nanjing Weina Shijie Medical Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Nanjing Weina Shijie Medical Technology Co ltd
Priority to CN202121465017.6U
Priority to PCT/CN2021/123830 (WO2023273013A1)
Application granted
Publication of CN215606241U

Landscapes

  • Investigating, Analyzing Materials By Fluorescence Or Luminescence (AREA)

Abstract

The application provides a surgical navigation device and system. The device includes an industrial personal computer, which superimposes and fuses a tumor fluorescence image with a lesion organ image to obtain a position image, and an imaging unit connected to the industrial personal computer, whose projection light source projects the position image onto the lesion organ. The surgical navigation device provided by the application is more intuitive, can reduce the doctor's operating time, and improves surgical efficiency.

Description

Surgical navigation device and system
Technical Field
The application relates to medical device technology, and in particular to a surgical navigation device and a surgical navigation system.
Background
In modern surgery, after a fluorescent marker is injected into the human body, it accumulates near the tumor of a lesion organ. A fluorescence molecular imaging surgical navigation system can therefore use fluorescence imaging to acquire the tumor's position and shape together with an image of the lesion organ; the tumor fluorescence image and the lesion organ image are fused and shown on a display to help the doctor excise the tumor.
However, when using a conventional surgical navigation system, the doctor must observe the lesion tissue or organ on a display and then map it onto the patient's real tissue or organ to complete the operation. This requires the doctor to switch attention between the display and the real lesion organ, which increases fatigue and prolongs the operation.
SUMMARY OF THE UTILITY MODEL
The application provides a surgical navigation device and system to help the doctor operate more effectively, alleviate the doctor's fatigue, and reduce operation time.
In a first aspect, the present application provides a surgical navigation device comprising:
an industrial personal computer, used to superimpose and fuse a tumor fluorescence image and a lesion organ image to obtain a position image, where the position image represents the position of the tumor on the lesion organ;
and an imaging unit connected to the industrial personal computer, the imaging unit comprising a projection light source used to project the position image onto the lesion organ.
Optionally, the imaging unit further comprises: a near-infrared camera, used to acquire the tumor fluorescence image and transmit it to the industrial personal computer.
Optionally, the imaging unit further comprises: a beam splitter, used to transmit the fluorescence emitted by the tumor to the near-infrared camera and to project the projection light emitted by the projection light source onto the lesion organ. The path of the fluorescence is a fluorescence light path, the path of the projection light is a projection light path, and between the beam splitter and the lesion organ the projection light path and the fluorescence light path share a common light path.
Optionally, the imaging unit further comprises: a ranging module, used to determine the distance between the imaging unit and the lesion organ and transmit the distance to the near-infrared camera; the near-infrared camera is further used to focus its near-infrared lens according to the distance.
Optionally, the imaging unit further includes a near-infrared filter element, configured to filter out light outside a preset wavelength range.
Optionally, the imaging unit further comprises: a visible light camera, used to acquire the lesion organ image and transmit it to the industrial personal computer.
Optionally, the imaging unit further comprises: a compensation light source, used to project visible light onto the lesion organ to supplement the ambient light for the visible light camera when the ambient light is below a threshold.
Optionally, the imaging unit further comprises an excitation light source for projecting excitation light onto the lesion organ.
Optionally, the imaging unit further comprises an indicating light source for emitting indicating light that marks the position on the lesion organ where the excitation light emitted by the excitation light source is projected.
Optionally, the indicating light source includes a diffraction element, used to shape the light emitted by the indicating light source into outline-shaped indicating light; the range this outline-shaped indicating light covers on the lesion organ is the same as the range the excitation light source projects on the lesion organ.
Optionally, the excitation light source includes a light homogenizing module, used to homogenize the excitation light emitted by the excitation light source.
In a second aspect, the present application provides a surgical navigation system comprising a surgical navigation device as described in the first aspect of the present application.
Optionally, the surgical navigation system further comprises a mobile platform, and the surgical navigation device is arranged on the mobile platform.
Optionally, the surgical navigation system further comprises a mechanical arm, wherein an imaging unit is installed at one end of the mechanical arm, and the other end of the mechanical arm is arranged on the mobile platform.
Optionally, the surgical navigation system further includes a display, the display is disposed on the mobile platform, and the display is used for displaying the position image.
The application provides a surgical navigation device and system. An industrial personal computer superimposes and fuses a tumor fluorescence image and a lesion organ image to obtain a position image, and a projection light source contained in an imaging unit connected to the industrial personal computer projects this position image onto the lesion organ. Compared with the prior art, the position image obtained by superimposing and fusing the tumor fluorescence image and the lesion organ image is projected directly onto the lesion organ, so the doctor does not need to observe the lesion organ through a display. The result is more intuitive, reduces the doctor's operating time, and improves surgical efficiency. Moreover, because a common light path is used, moving the imaging unit within the working distance does not force the projection light source to readjust its projection area.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings needed to be used in the description of the embodiments or the prior art will be briefly introduced below, and it is obvious that the drawings in the following description are some embodiments of the present application, and for those skilled in the art, other drawings can be obtained according to these drawings without inventive exercise.
Fig. 1 is a schematic view of a surgical navigation device according to an embodiment of the present application;
fig. 2 is a schematic diagram of an imaging unit 130 according to an embodiment of the present application;
FIG. 3 is a schematic view of a surgical navigation system provided in an embodiment of the present application;
fig. 4 is a flowchart of a surgical navigation method according to an embodiment of the present application.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The embodiments described in the following exemplary embodiments do not represent all embodiments consistent with the present application. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present application, as detailed in the appended claims.
In modern surgery, indocyanine green can be injected into the human body, where it accumulates at the tumor of the lesion organ. Using the indocyanine green fluorescence imaging technique (under laser irradiation at a wavelength of 785 nm-808 nm, indocyanine green is maximally excited to emit fluorescence in the near-infrared band), the tumor's position and shape and an image of the lesion organ can be acquired; the tumor image and the lesion organ image are then fused and shown on a display to help the surgeon excise the tumor. However, at present the doctor must observe the lesion tissue or organ on a screen and then map it onto the patient's real tissue or organ to complete the operation. This approach is not intuitive and requires the surgeon to switch between different views, which directly prolongs the operation and increases the surgeon's fatigue. In other prior art, some systems show the position of the tumor in the lesion organ on a display screen and require the doctor to operate while consulting the screen image, which is likewise not intuitive; other methods manually mark the organ or organ image before projection, which lengthens preoperative preparation and therefore the operation itself, making it harder to complete the surgery quickly.
Moreover, none of the above methods can image lymph, blood vessels, and related tissues or monitor their perfusion.
To address these problems, the application provides a surgical navigation device and system that can obtain a real-time image of the tumor's distribution in the lesion organ and project it accurately onto the surface of the lesion organ, so the doctor can see the tumor's distribution directly on the organ surface. This is more intuitive, reduces the doctor's operating time, and improves surgical efficiency.
Fig. 1 is a schematic view of a surgical navigation device according to an embodiment of the present application. As shown in fig. 1, a surgical navigation device 100 according to an embodiment of the present application includes: the industrial personal computer 110 and the imaging unit 130 connected with the industrial personal computer, wherein the imaging unit 130 comprises a projection light source 120. Wherein:
The industrial personal computer 110 is used to superimpose and fuse the tumor fluorescence image and the lesion organ image to obtain a position image, where the position image represents the position of the tumor on the lesion organ.
The projection light source 120 is used to project the position image onto the lesion organ.
In the embodiment of the application, the tumor fluorescence image can be obtained by imaging a fluorescent marker. Illustratively, the fluorescent marker is indocyanine green, although the application is not limited to this. After indocyanine green is injected into the human body, it accumulates at the tumor of the lesion organ; under excitation light of a specific wavelength, it emits fluorescence in the near-infrared band, and a tumor fluorescence image is obtained after reception and processing by a photosensitive device. Illustratively, the industrial personal computer 110 is a microcomputer built from large-scale integrated circuits and includes an image processing module and a system control module, where the image processing module receives the tumor fluorescence image and the lesion organ image and superimposes and fuses them. Illustratively, one specific way to superimpose and fuse the two images is to render the processed tumor fluorescence image in a distinct color and overlay it directly on the lesion organ image, obtaining a position image that shows the position of the tumor on the lesion organ, that is, the distribution of the tumor in the lesion tissue. The system control module of the industrial personal computer 110 feeds the position image to the connected projection light source 120, which receives it and projects it through a projection lens onto the surface of the observed lesion organ. Illustratively, the projection light source 120 may be a projector or a spatial light modulator, which the application does not limit.
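The superimpose-and-fuse step described above can be sketched in a few lines of NumPy. This is a minimal illustration, not the patent's implementation; the intensity threshold, the green pseudo-color, and the blending weight are assumptions chosen for the example:

```python
import numpy as np

def fuse_position_image(organ_bgr: np.ndarray,
                        fluor_gray: np.ndarray,
                        threshold: int = 50,
                        alpha: float = 0.6) -> np.ndarray:
    """Overlay a grayscale tumor fluorescence image on a color organ image.

    Pixels whose fluorescence intensity exceeds `threshold` are rendered in
    a distinct pseudo-color and alpha-blended onto the organ image, giving a
    "position image" that marks where the tumor sits on the lesion organ.
    """
    mask = fluor_gray > threshold       # binary tumor mask

    overlay = organ_bgr.copy()          # pseudo-color layer
    overlay[mask] = (0, 255, 0)         # green wherever fluorescence is seen

    # Alpha-blend the pseudo-color layer with the original organ image
    fused = (alpha * overlay.astype(np.float32)
             + (1.0 - alpha) * organ_bgr.astype(np.float32))
    return fused.astype(np.uint8)

# Synthetic example: gray 100x100 organ image with one fluorescent blob
organ = np.full((100, 100, 3), 128, dtype=np.uint8)
fluor = np.zeros((100, 100), dtype=np.uint8)
fluor[40:60, 40:60] = 200

position_image = fuse_position_image(organ, fluor)
```

Outside the tumor mask the organ image is unchanged; inside it, the green channel dominates, which mirrors the "overlay with a specific color" fusion described above.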
The surgical navigation device provided by the embodiment of the application uses the industrial personal computer to superimpose and fuse the tumor fluorescence image and the lesion organ image into a position image, and the projection light source contained in the imaging unit connected to the industrial personal computer projects that image onto the lesion organ. Compared with the prior art, the position image is projected directly onto the lesion organ, so the doctor does not need to observe the lesion organ through a display; the result is more intuitive, reduces the doctor's operating time, and improves surgical efficiency.
Fig. 2 is a schematic diagram of the imaging unit 130 according to an embodiment of the present application. On the basis of the above embodiments, the embodiments of the present application further describe the imaging unit 130 of the surgical navigation device 100. As shown in fig. 2, the imaging unit 130 of the embodiment of the present application further includes, in addition to the projection light source 120, a near-infrared camera 201. The near-infrared camera 201 is used to acquire the tumor fluorescence image and transmit it to the industrial personal computer.
The near-infrared camera 201 is a digital imaging device sensitive to electromagnetic radiation in the 780-3000 nm wavelength range. Illustratively, after indocyanine green has been injected into the human body and has accumulated at the tumor of the lesion organ 1, the near-infrared camera 201 obtains, through the near-infrared lens 202, the near-infrared fluorescence emitted by the fluorescent probe (e.g., indocyanine green), thereby obtaining the tumor fluorescence image, which it transmits to the industrial personal computer 110. Accordingly, the industrial personal computer 110 receives the tumor fluorescence image.
On the basis of the above embodiment, referring to fig. 2, the imaging unit 130 of the embodiment of the present application further includes a near infrared filter element 203, where the near infrared filter element 203 is configured to filter light outside the preset wavelength range.
In the embodiment of the present application, the near-infrared filter element 203 blocks light other than infrared, such as visible light, and lets only infrared light pass. Illustratively, the near-infrared filter element 203 passes infrared light in the 700-1700 nm range, so the excitation light reflected by the lesion organ 1 and the visible light can be filtered out. The near-infrared camera 201 obtains the tumor fluorescence image produced by the indocyanine green fluorescence imaging technique through the near-infrared lens 202 and the near-infrared filter element 203. Because the light passed by the near-infrared filter element 203 lies in the 700-1700 nm range, its penetration depth is large, and a tumor fluorescence image with a high signal-to-noise ratio can be obtained. In practice, the system can also be used to image human lymph, blood vessels, and the like, and to monitor the perfusion of related tissues.
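The filter's pass band can be modeled as a simple predicate. This is an idealized band-pass sketch using the 700-1700 nm range stated above; a real filter would also need to reject the reflected 785-808 nm excitation line, which an ideal band-pass alone does not capture:

```python
def passes_nir_filter(wavelength_nm: float,
                      band_nm: tuple = (700.0, 1700.0)) -> bool:
    """Idealized model of the near-infrared filter element 203.

    Returns True when the wavelength lies inside the pass band, i.e. the
    light is allowed through to the near-infrared camera.
    """
    lo, hi = band_nm
    return lo <= wavelength_nm <= hi

# Visible light is rejected; indocyanine green's near-infrared
# emission (around 830 nm) passes.
visible_green = passes_nir_filter(550)
icg_emission = passes_nir_filter(830)
```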
On the basis of the above embodiment, referring to fig. 2, the imaging unit 130 of the embodiment of the present application further includes a beam splitter 206. The beam splitter 206 is used to transmit the fluorescence emitted by the tumor to the near-infrared camera, and to project the projection light emitted by the projection light source onto the lesion organ 1. The path of the fluorescence is the fluorescence light path, the path of the projection light is the projection light path, and between the beam splitter and the lesion organ 1 the projection light path and the fluorescence light path share a common light path.
In the embodiment of the present application, the beam splitter 206 is a passive device; it requires no external energy, only input light. Illustratively, the beam splitter 206 is a dichroic mirror, which separates incident light into specific spectral bands and redirects part of the spectrum: it almost completely transmits light of certain wavelengths and almost completely reflects light of other wavelengths. Illustratively, as shown in fig. 2, the beam splitter 206 transmits the fluorescence emitted by the tumor to the near-infrared camera 201 (the fluorescence light path) and projects the projection light emitted by the projection light source 120 onto the lesion organ 1 (the projection light path). The projection light path and the fluorescence light path share a common path through the beam splitter 206, so the tumor fluorescence image is projected accurately onto the surface of the lesion organ 1. This common-path arrangement solves the problem that, in the prior art, the lesion organ 1 had to be marked manually.
In addition, in the prior art, because a common light path is not used, simply adding a projection light source 120 cannot achieve a good projection effect: without a common light path there is an angle between the projection light source 120 and the near-infrared camera 201, so when the doctor adjusts the distance from the imaging unit 130 to the operating table during surgery, the imaging area of the near-infrared camera 201 and the projection area of the projection light source 120 change independently, and an initially aligned imaging area and projection area fall out of alignment.
On the basis of the above embodiment, referring to fig. 2, the imaging unit 130 of the embodiment of the present application further includes a ranging module 208. The distance measuring module 208 is used for determining the distance between the imaging unit 130 and the lesion organ 1 and transmitting the distance to the near-infrared camera 201; and the near-infrared camera 201 is also used for focusing the near-infrared lens 202 of the near-infrared camera 201 according to the distance.
Illustratively, the ranging module 208 uses a laser as its light source to measure the distance to the target accurately. For example, as shown in fig. 2, the ranging module 208 can measure the distance between the imaging unit 130 and the lesion organ 1 and transmit it to the near-infrared camera 201. The near-infrared camera 201 receives this distance and focuses its near-infrared lens 202 accordingly, so that the lens reaches its sharpest focus for that distance and the clearest tumor fluorescence image is obtained. Illustratively, the working distance of the imaging unit 130 is adjustable from 100 mm to 1000 mm, although the application is not limited to this range.
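Focusing the lens from a measured distance can be sketched as interpolation over a calibration table. The table values below are invented for illustration; only the 100 mm-1000 mm working range comes from the text, and a real system would calibrate its own lens:

```python
import bisect

# Hypothetical calibration table: measured working distance (mm) mapped to
# a lens focus position (motor steps). Values are invented for illustration.
CALIBRATION = [(100, 0), (250, 180), (500, 320), (750, 400), (1000, 450)]

def focus_position(distance_mm: float) -> float:
    """Linearly interpolate the focus motor position for a measured distance.

    Distances outside the 100-1000 mm working range are clamped to the
    range limits, mirroring the working-distance span mentioned in the text.
    """
    distances = [d for d, _ in CALIBRATION]
    positions = [p for _, p in CALIBRATION]
    if distance_mm <= distances[0]:
        return positions[0]
    if distance_mm >= distances[-1]:
        return positions[-1]
    # Find the calibration segment containing the distance and interpolate
    i = bisect.bisect_right(distances, distance_mm)
    d0, d1 = distances[i - 1], distances[i]
    p0, p1 = positions[i - 1], positions[i]
    t = (distance_mm - d0) / (d1 - d0)
    return p0 + t * (p1 - p0)
```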
On the basis of the above embodiment, referring to fig. 2, the imaging unit 130 of the embodiment of the present application further includes a visible light camera 210. The visible light camera 210 is used to acquire an image of the lesion organ 1 and transmit it to the industrial personal computer.
Illustratively, the visible light camera 210 may be a color camera. As shown in fig. 2, the visible light camera 210 acquires a color image of the lesion organ 1 through the visible light lens 211 and transmits it to the industrial personal computer 110. Accordingly, the industrial personal computer 110 receives the color image of the lesion organ 1.
On the basis of the above embodiment, referring to fig. 2, the imaging unit 130 of the embodiment of the present application further includes a compensation light source 212. The compensation light source 212 is used to project visible light onto the lesion organ 1, supplementing the ambient light for the visible light camera when the ambient light falls below a threshold.
In the embodiment of the present application, the compensation light source 212 supplements the ambient light for the visible light camera 210 when ambient light is insufficient, so that the visible light camera 210 can still obtain a color image of the lesion organ 1. Illustratively, the compensation light source 212 is a light-emitting diode (LED), which emits light to supplement the ambient light for the visible light camera 210. Illustratively, during a tumor resection the imaging unit 130 must be moved above the tumor site to observe the tumor image; in doing so it may block the operating-room lighting so that the visible light camera 210 cannot acquire a clear image of the lesion organ 1. At that point the compensation light source 212 is switched on, letting the visible light camera 210 acquire a clear color image of the lesion organ 1. This color image and the tumor fluorescence image, acquired by the near-infrared camera 201 through the near-infrared lens 202 and the near-infrared filter element 203 using the indocyanine green fluorescence imaging technique, are superimposed and fused in the industrial personal computer 110 to obtain the position image. The position image is projected through the projection light source 120, the projection lens 205, and the beam splitter 206 onto the surface of the observed lesion organ 1, guiding the doctor's tumor resection intuitively.
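The threshold behavior of the compensation light source can be sketched as a small controller. The 300-lux default is an assumed value for illustration; the text only states that the source switches on when ambient light falls below some threshold:

```python
class CompensationLight:
    """On/off controller for the compensation light source 212 (a sketch).

    Turns the LED on when the measured ambient light drops below the
    threshold, so the visible light camera can still capture a clear color
    image of the lesion organ.
    """

    def __init__(self, threshold_lux: float = 300.0):
        self.threshold_lux = threshold_lux
        self.on = False

    def update(self, ambient_lux: float) -> bool:
        """Feed in an ambient-light reading; returns the new LED state."""
        self.on = ambient_lux < self.threshold_lux
        return self.on

lamp = CompensationLight()
dim = lamp.update(120.0)     # imaging unit shadows the surgical field: LED on
bright = lamp.update(800.0)  # normal operating-room lighting: LED off
```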
On the basis of the above-described embodiment, referring to fig. 2, the imaging unit 130 of the embodiment of the present application further includes an excitation light source 209. The excitation light source 209 projects excitation light onto the lesion organ 1.
In the embodiment of the present application, the excitation light source 209 projects a spot of uniform excitation light onto the surface of the lesion organ 1 to enable tumor fluorescence imaging. Illustratively, the excitation light source 209 is a laser source with a center wavelength of 785 nm ± 5 nm, although the application is not limited to this. Illustratively, the power of the excitation light source 209 is adjustable from 10 mW to 3000 mW; the higher output power of the laser source helps the system detect micro-tumors, and the application is not limited to this adjustment range.
On the basis of the above embodiment, the excitation light source 209 includes a light homogenizing module 214, used to homogenize the excitation light emitted by the excitation light source.
In the embodiment of the present application, the light homogenizing module 214 of the excitation light source 209 homogenizes the excitation light emitted by the excitation light source 209 so that the intensity of the light spot irradiating the surface of the lesion organ 1 is uniformly distributed. Illustratively, the excitation light source 209 consists of a power-adjustable semiconductor laser and the light homogenizing module 214, producing a uniform excitation spot of adjustable optical power; the high output power helps the system detect micro-tumors.
On the basis of the above-described embodiment, referring to fig. 2, the imaging unit 130 of the embodiment of the present application further includes an indication light source 207. The indicating light source 207 is configured to emit indicating light for indicating a position where the excitation light emitted from the excitation light source 209 is projected on the diseased organ 1.
Illustratively, the indicating light source 207 is a laser source with a center wavelength of 520 nm, although the application is not limited to this. By emitting indicating light, the indicating light source 207 marks the position where the excitation light emitted by the excitation light source 209 is projected on the lesion organ 1, showing the doctor the spot area covered by the excitation light source 209 and facilitating the operation.
On the basis of the above-described embodiment, the indicating light source 207 includes a diffraction element 215, used to shape the light emitted by the indicating light source 207 into outline-shaped indicating light whose range on the lesion organ 1 is the same as the range the excitation light source 209 projects on the lesion organ 1.
In the embodiment of the present application, the diffraction element 215 shapes the indicating light emitted by the indicating light source 207; the light emitted after shaping is outline-shaped indicating light, also called outline light. The outline matches the outline of the spot the excitation light source 209 irradiates on the surface of the lesion organ 1, thereby indicating the extent of the region irradiated by the excitation light source 209.
In the embodiment of the present application, the excitation light source 209 projects a uniform light spot onto the surface of the lesion organ 1 to excite tumor fluorescence; the near-infrared camera 201 obtains the tumor fluorescence image produced by the indocyanine green fluorescence imaging technique through the near-infrared lens 202 and the near-infrared filter element 203; the visible light camera 210 acquires a color image of the lesion organ 1 through the visible light lens 211; and both images are transmitted to the industrial personal computer 110, where they are superimposed and fused into an image containing both the lesion organ 1 and the tumor. This image is projected by the projection light source 120, through its projection lens 205 and the beam splitter 206, onto the surface of the observed lesion organ 1. The tumor resection can thus be guided for the doctor intuitively, overcoming the need in existing medical projection technology to mark the lesion organ 1 manually, shortening the doctor's operating time, and improving surgical efficiency.
Fig. 3 is a schematic view of a surgical navigation system according to an embodiment of the present application. On the basis of the above embodiments, as shown in fig. 3, the surgical navigation system 300 of the embodiment of the present application includes the surgical navigation device 100 in the above embodiments. In the embodiment of the present application, reference may be made to the related description of the embodiment shown in fig. 2 for a specific implementation process of the surgical navigation device 100, and details are not repeated here.
On the basis of the above embodiment, referring to fig. 3, the surgical navigation system 300 of the embodiment of the present application further includes a mobile platform 310, and the surgical navigation device 100 is disposed on the mobile platform 310.
Illustratively, the mobile platform 310 is wheeled and may be moved or fixed as desired. For example, as shown in fig. 3, an industrial personal computer 110 in the surgical navigation device 100 and an imaging unit 130 connected to the industrial personal computer 110 may be disposed on the mobile platform 310.
On the basis of the above embodiment, referring to fig. 3, the surgical navigation system 300 of the embodiment of the present application further includes a robot arm 320, one end of the robot arm 320 is provided with the imaging unit 130, and the other end of the robot arm is disposed on the mobile platform 310.
Illustratively, as shown in FIG. 3, the robotic arm 320 is connected to the imaging unit 130 and the mobile platform 310, respectively. The robotic arm 320 is a six-degree-of-freedom arm, so the working distance and working angle of the imaging unit 130 can be adjusted, making the whole surgical navigation system easy to move and convenient for the doctor's operation.
On the basis of the above embodiment, referring to fig. 3, the surgical navigation system 300 of the embodiment of the present application further includes a display 330, the display 330 is disposed on the mobile platform 310, and the display 330 is used for displaying the position image.
Illustratively, the display 330 may be placed directly on the mobile platform 310. The display 330 shows the distribution image of the tumor in the lesion organ 1 sent by the image processing module of the industrial personal computer 110, a near-infrared fluorescence image of the perfusion of human lymph, blood vessels, and related tissues, or the position image obtained by superimposing and fusing the tumor fluorescence image and the lesion organ image.
The surgical navigation system provided by the embodiment of the present application can obtain the distribution image of the tumor in the lesion organ in real time and accurately project it onto the surface of the lesion organ; it can visualize the perfusion of lymph, blood vessels, and related tissues in real time; the working distance and working angle of the imaging unit can be adjusted; fast automatic focusing is achieved at different working distances, so that both near-infrared imaging and visible-light color imaging remain in focus and produce clear images; and the laser excitation position can be indicated to the doctor, facilitating the operation. Moreover, by adopting a common light path for the projection light source and the near-infrared camera, the system overcomes the drawback of existing medical projection techniques that require manual marking of the lesion organ, providing an intuitive surgical navigation system that reduces operating time and improves surgical efficiency.
Fig. 4 is a flowchart of a surgical navigation method according to an embodiment of the present application, and the method according to the embodiment of the present application may be applied to the surgical navigation device shown in fig. 1. As shown in fig. 4, the surgical navigation method according to the embodiment of the present application includes:
S401, superimposing and fusing the tumor fluorescence image and the lesion organ image to obtain a position image, wherein the position image is used for representing the position of the tumor on the lesion organ.
S402, projecting the position image on the lesion organ.
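As an illustration of how the fusion in S401 might look in software, the following is a minimal sketch using NumPy, assuming the grayscale fluorescence image and the RGB organ image are already co-registered; the function name, the green pseudocolor, and the threshold/alpha values are illustrative choices, not parameters from the patent.

```python
import numpy as np

def fuse_position_image(organ_rgb: np.ndarray,
                        fluor_gray: np.ndarray,
                        threshold: float = 0.2,
                        alpha: float = 0.6) -> np.ndarray:
    """Overlay a normalized fluorescence image onto a visible-light
    organ image, tinting fluorescent (tumor) regions green."""
    # Normalize fluorescence intensity to [0, 1].
    f = fluor_gray.astype(np.float32)
    f = (f - f.min()) / max(f.max() - f.min(), 1e-6)
    # Mask of pixels considered tumor fluorescence.
    mask = f > threshold
    fused = organ_rgb.astype(np.float32).copy()
    # Blend a green pseudocolor where fluorescence is present.
    green = np.array([0.0, 255.0, 0.0], dtype=np.float32)
    fused[mask] = (1.0 - alpha) * fused[mask] + alpha * green
    return np.clip(np.rint(fused), 0, 255).astype(np.uint8)
```

In use, pixels below the threshold keep the visible-light appearance, while fluorescent pixels are blended toward the pseudocolor, producing the position image that is then projected in S402.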
In a possible implementation manner, based on the surgical navigation apparatus shown in fig. 1 and the imaging unit 130 shown in fig. 2, the surgical navigation method of the embodiment of the present application specifically includes the following six steps:
The first step, the light source emission step: the excitation light source 209 is turned on to emit excitation light;
Optionally, the light source emission step may further include a light homogenization step: the excitation light emitted by the excitation light source 209 is homogenized by the light uniformizing module 214 to obtain uniform excitation light;
The second step, the excitation step: the excitation light irradiates the lesion organ, so that tumor tissue in which a marker such as indocyanine green has accumulated is excited to emit a fluorescence signal, wherein the wavelength of the fluorescence signal is within the range of 700-1700 nm;
Optionally, the excitation step may further include an indication step: the indication light source 207 emits indication light that indicates the position where the excitation light emitted by the excitation light source 209 is projected on the lesion organ 1. This step may further include shaping the indication light emitted by the indication light source 207 into contour-form indication light through the diffraction element 215, the range of the contour-form indication light projected on the lesion organ 1 being the same as the range over which the excitation light source 209 projects on the lesion organ 1;
The third step, the signal receiving step: after the fluorescence signal passes through the beam splitter 206, light outside the 700-1700 nm wavelength range is filtered out by the near-infrared filter element 203; the remaining light enters the near-infrared lens 202 and is received by the near-infrared camera 201, which acquires a near-infrared fluorescence image and thereby the tumor fluorescence image. Meanwhile, visible light reflected by the lesion organ 1 enters the visible light lens 211 and is received by the visible light camera 210 to obtain the lesion organ image;
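The band-pass behavior of the near-infrared filter element in this step (passing only 700-1700 nm, a range that covers indocyanine green's emission peak near 830 nm) can be stated as a simple predicate; this is a hypothetical helper for illustration, not part of the patent's implementation.

```python
def passes_nir_filter(wavelength_nm: float,
                      band: tuple = (700.0, 1700.0)) -> bool:
    """Return True if the wavelength lies inside the near-infrared
    pass band (700-1700 nm, as described in the method)."""
    low, high = band
    return low <= wavelength_nm <= high
```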
Further optionally, the signal receiving step may include a ranging step: the distance measurement module 208 measures the distance between the imaging unit 130 and the lesion organ 1 and transmits it to the near-infrared camera 201 through the industrial personal computer 110, so that the near-infrared lens 202 in front of the near-infrared camera 201 is focused accordingly; the focused near-infrared camera then acquires a clear tumor fluorescence image.
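The relation between the measured working distance and the focus adjustment can be illustrated with the thin-lens equation 1/f = 1/u + 1/v: given the object distance u from the ranging step and the lens focal length f, the required lens-to-sensor distance v follows. This sketch and the 25 mm focal length in the example are assumptions for illustration, not values from the patent.

```python
def image_distance_mm(focal_length_mm: float, object_distance_mm: float) -> float:
    """Solve the thin-lens equation 1/f = 1/u + 1/v for the
    lens-to-sensor distance v, given the object distance u."""
    if object_distance_mm <= focal_length_mm:
        raise ValueError("object must lie beyond one focal length")
    return 1.0 / (1.0 / focal_length_mm - 1.0 / object_distance_mm)

# e.g. a 25 mm lens focused on an organ 500 mm away needs the
# sensor about 26.3 mm behind the lens.
```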
Optionally, the third step may further include an ambient light compensation step: the turned-on compensation light source 212 projects compensation light onto the lesion organ 1;
The fourth step, the image transmission step: transmitting the tumor fluorescence image and the lesion organ image to the industrial personal computer 110;
The fifth step, the image fusion step: the industrial personal computer 110 receives the tumor fluorescence image and the lesion organ image, and superimposes and fuses them to obtain the position image;
The sixth step, the projection step: the industrial personal computer 110 controls the projection light source 120 so that the projection light source 120 projects the position image onto the lesion organ through the projection lens 205.
Optionally, the propagation directions of the fluorescence signal and the projected position image are adjusted so that the two beams are parallel, traveling in opposite directions along the shared light path; this ensures that the tumor fluorescence image is accurately projected onto the surface of the lesion organ.
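Because the projection light path and the fluorescence light path share a common path between the beam splitter and the organ, camera pixels and projector pixels stay registered by construction. In systems without a common path, a calibrated planar homography is typically used to map camera pixels to projector pixels instead; the sketch below shows that general technique with NumPy (the function and the example matrices are illustrative; in practice the 3x3 matrix is estimated from calibration point pairs).

```python
import numpy as np

def map_camera_to_projector(points_xy: np.ndarray, H: np.ndarray) -> np.ndarray:
    """Apply a 3x3 planar homography H to N camera pixels (shape (N, 2)),
    returning the corresponding projector pixels (shape (N, 2))."""
    n = points_xy.shape[0]
    homog = np.hstack([points_xy, np.ones((n, 1))])  # to homogeneous coords
    mapped = homog @ H.T                             # (N, 3)
    return mapped[:, :2] / mapped[:, 2:3]            # back to Cartesian
```

With the identity matrix the mapping is a no-op, which is effectively what the common-path design achieves optically.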
The method according to the embodiment of the present application can be used for implementing any of the above-described technical solutions of the embodiments of the surgical navigation device, and the implementation principle and technical effects thereof are similar and will not be described herein again.
It is to be understood that the various numerical references referred to in the embodiments of the present application are merely for descriptive convenience and are not intended to limit the scope of the embodiments of the present application. In the embodiment of the present application, the sequence numbers of the above-mentioned processes do not mean the execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiment of the present application.
Finally, it should be noted that: the above embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be equivalently replaced; and the modifications or the substitutions do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present application.

Claims (15)

1. A surgical navigation device, comprising:
the industrial personal computer is used for superimposing and fusing the tumor fluorescence image and the lesion organ image to obtain a position image, and the position image is used for representing the position of the tumor on the lesion organ;
and the imaging unit is connected with the industrial personal computer and comprises a projection light source, and the projection light source is used for projecting the position image onto the lesion organ.
2. The surgical navigation device of claim 1, wherein the imaging unit further includes:
and the near-infrared camera is used for acquiring the tumor fluorescence image and transmitting the tumor fluorescence image to the industrial personal computer.
3. The surgical navigation device of claim 2, wherein the imaging unit further includes:
a beam splitter for transmitting fluorescence emitted by the tumor to the near-infrared camera, and for projecting the projection light emitted by the projection light source onto the lesion organ;
wherein the light path of the fluorescence is a fluorescence light path, the light path of the projection light is a projection light path, and the projection light path and the fluorescence light path between the beam splitter and the lesion organ share a common light path.
4. The surgical navigation device of claim 2, wherein the imaging unit further includes:
the distance measurement module is used for determining the distance between the imaging unit and the lesion organ and transmitting the distance to the near-infrared camera;
and the near-infrared camera is also used for focusing the near-infrared lens of the near-infrared camera according to the distance.
5. The surgical navigation device of claim 2, wherein the imaging unit further includes a near-infrared filter element for filtering light outside of a predetermined wavelength range.
6. The surgical navigation device of claim 1, wherein the imaging unit further includes:
and the visible light camera is used for acquiring the lesion organ image and transmitting the lesion organ image to the industrial personal computer.
7. The surgical navigation device of claim 6, wherein the imaging unit further includes:
and the compensation light source is used for projecting visible light onto the lesion organ when ambient light is below a threshold, providing illumination for the visible light camera.
8. The surgical navigation device according to any one of claims 1 to 7, wherein the imaging unit further includes an excitation light source for projecting excitation light onto the lesion organ.
9. The surgical navigation device according to claim 8, wherein the imaging unit further includes an indication light source for emitting indication light that indicates the position on the lesion organ where the excitation light emitted by the excitation light source is projected.
10. The surgical navigation device according to claim 9, wherein the indication light source includes a diffraction element for shaping the light emitted by the indication light source into contour-form indication light, the contour-form indication light being projected on the lesion organ over the same range as the excitation light source projects on the lesion organ.
11. The surgical navigation device of claim 10, wherein the excitation light source includes a light uniformizing module configured to uniformize excitation light emitted from the excitation light source.
12. A surgical navigation system comprising a surgical navigation device according to any one of claims 1 to 11.
13. The surgical navigation system of claim 12, further comprising a mobile platform on which the surgical navigation device is disposed.
14. The surgical navigation system of claim 13, further comprising a robotic arm having one end mounted with the imaging unit and another end disposed on the mobile platform.
15. The surgical navigation system of claim 13, further comprising a display disposed on the mobile platform, the display for displaying the position image.
CN202121465017.6U 2021-06-29 2021-06-29 Operation navigation device and system Active CN215606241U (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202121465017.6U CN215606241U (en) 2021-06-29 2021-06-29 Operation navigation device and system
PCT/CN2021/123830 WO2023273013A1 (en) 2021-06-29 2021-10-14 Surgical navigation device and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202121465017.6U CN215606241U (en) 2021-06-29 2021-06-29 Operation navigation device and system

Publications (1)

Publication Number Publication Date
CN215606241U true CN215606241U (en) 2022-01-25

Family

ID=79946684

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202121465017.6U Active CN215606241U (en) 2021-06-29 2021-06-29 Operation navigation device and system

Country Status (1)

Country Link
CN (1) CN215606241U (en)

Similar Documents

Publication Publication Date Title
JP6931705B2 (en) Open Field Handheld Fluorescence Imaging Systems and Methods
JP5867719B2 (en) Optical image measuring device
US10102646B2 (en) Optical image measuring apparatus
CN109938919B (en) Intelligent fundus laser surgery treatment device, system and implementation method thereof
US20110261175A1 (en) Multiple channel imaging system and method for fluorescence guided surgery
JP2004163413A (en) Microscope system and microscope inspection method
JP2017164007A (en) Medical image processing device, medical image processing method, and program
CN210009227U (en) Intelligent fundus laser surgery treatment device and treatment system
WO2016039000A1 (en) Imaging device
US20180360299A1 (en) Imaging apparatus, imaging method, and medical observation equipment
CN110720985A (en) Multi-mode guided surgical navigation method and system
CN215606241U (en) Operation navigation device and system
WO2022179117A1 (en) Navigation method and apparatus based on fluorescence molecular imaging, and storage medium
EP4016162A1 (en) Endoscope system and method for operating same
WO2020014999A1 (en) Invisible light display device and optical guidance system for operations
CN115530972A (en) Operation navigation device and system
WO2023273013A1 (en) Surgical navigation device and system
JP2006034452A (en) X-ray television receiver
JP3504677B2 (en) Laser irradiation device
CN209154017U (en) A kind of black light for operation shows that equipment and optics instruct system
CN214908029U (en) Navigation device based on fluorescent molecular imaging
JP2013027672A (en) Fundus photography device
JP2012223428A (en) Ophthalmic apparatus
CN114098611B (en) Endoscope system and imaging adjusting method thereof
WO2023273014A1 (en) Medical imaging device

Legal Events

Date Code Title Description
GR01 Patent grant