CN110720985A - Multi-mode guided surgical navigation method and system - Google Patents
- Publication number
- CN110720985A
- Authority
- CN
- China
- Prior art keywords
- dimensional
- infrared
- image
- patient
- surgical
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/10—Computer-aided planning, simulation or modelling of surgical operations
- A61B2034/101—Computer-aided simulation of surgical operations
- A61B2034/105—Modelling of the patient, e.g. for ligaments or bones
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2046—Tracking techniques
- A61B2034/2055—Optical tracking systems
- A61B2034/2065—Tracking using image or pattern recognition
Abstract
The invention discloses a multi-modal guided surgical navigation method and system. The method comprises: S1, positioning the patient on the operating bed and establishing a correspondence between the operating-room coordinate system and the imaging-examination coordinate system; S2, reconstructing a three-dimensional digital human model of the patient in operating-room space from the medical image data and the current coordinate system; S3, illuminating the surgical site with a white light source and capturing a color image with a visible-light high-definition camera; S4, illuminating the surgical site with an infrared light source, capturing the excited fluorescence image with an infrared camera, and marking the patient's surgical target area; S5, after preprocessing, registration, and fusion, displaying the color and infrared images either superimposed on the three-dimensional digital human model or individually on the surgeon's head-mounted display to guide resection in real time. The method overcomes the limitations of conventional navigation, which merely superimposes a two-dimensional infrared image on a two-dimensional visible-light image and presents the result on a bedside monitor — yielding a single viewing angle, angular misalignment, and no multi-angle three-dimensional guidance.
Description
Technical Field
The invention relates to the medical field, and in particular to a multi-modal guided surgical navigation method and system.
Background
During surgical tumor resection, the chief surgeon currently removes the tumor mainly according to the color, texture, and shape of the tissue as seen by eye; yet the appearance of the operative field during surgery is entirely different from the CT/MRI/PET image information viewed on a computer screen during diagnosis and surgical planning. If surgeons could obtain real-time anatomical images of the operative field and characteristic signal images of the tumor target area during surgery, the success rate would improve greatly, damage to surrounding critical organs and normal tissue would be reduced, and surgical accidents would be avoided. Conventional imaging equipment (CT/MRI/PET, etc.), however, cannot be moved into the operating room because of its size and cost.
Surgical navigation systems currently on the market or in development generally fall into two categories. The first is based on an infrared-reflective frame: infrared light illuminates the frame, the positions of several irregularly arranged reflective spheres on it are tracked, and from these the position of the probe at the frame's tip is derived, so position is determined one point at a time. The problems with this approach are that, before an incision is made, only points on the surface of the region can be localized, and deeper positions can be reached only through a surgical incision or a body cavity; and because only a single point is localized at a time, no tumor boundary information can be recovered. The second category is based on fluorescence excitation of molecular imaging agents: after a molecular contrast agent is injected into the body, it is excited by infrared light of a specific wavelength, and the emitted fluorescence is collected for imaging. Although this solves tumor boundary identification, the infrared camera is fixed, so only a two-dimensional image from a single position can be acquired, no three-dimensional image is available, and a monitor beside the operating table must be used for observation, forcing the surgeon to repeatedly switch between the monitor's viewing angle and the real patient's viewing angle; the operative field cannot be observed from the true surgical viewpoint, and the operation cannot be guided in real time.
The invention therefore discloses a surgical navigation system that identifies the boundary contour of the tumor target area, superimposes it on color image data, displays it alone or overlaid on a three-dimensional digital patient model, and finally presents it on the surgeon's head-mounted display from the surgical viewpoint. It can guide the surgeon to operate precisely, greatly improves surgical accuracy, and reduces damage to surrounding critical organs and normal tissue; it thus serves an important function and meets an urgent clinical need.
Disclosure of Invention
The invention aims to provide a multi-modal guided surgical navigation method and system that overcome the inability of infrared-reflection guidance to identify tumor-region boundaries, the single observation angle of infrared-excitation methods, and the mismatch between the monitor's viewing angle and the surgical viewing angle. The method and system fully identify the tumor surgical target area while unifying the observation and surgical viewpoints, largely solving the chief surgeon's problem of precisely guided resection.
To achieve this technical purpose and effect, the invention is realized by the following technical scheme:
a method of multi-modal guided surgical navigation, the method comprising:
S1, positioning the patient on an operating bed, and establishing a correspondence between the operating-room coordinate system and the imaging-examination coordinate system;
S2, reconstructing a three-dimensional static or four-dimensional dynamic digital human model of the patient in operating-room space according to the medical image data and the operating-room coordinate system;
S3, illuminating the surgical site with a white light source, and collecting color image data with a camera;
S4, irradiating the surgical site with an infrared light source, acquiring infrared image data with a camera, and identifying the patient's surgical target area;
S5, after preprocessing, registering, and fusing the color and infrared images, displaying them superimposed on the three-dimensional digital human model or separately, guiding the surgical resection in real time.
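The patent does not spell out step S5's fusion stage. A minimal sketch of one plausible implementation — assuming the infrared and color images are already co-registered (captured from the same viewpoint, as in steps S3/S4) and using only NumPy — might look like:

```python
import numpy as np

def fuse_fluorescence_overlay(color_rgb, ir_intensity, threshold=0.4, alpha=0.6):
    """Overlay a normalized infrared fluorescence map onto a color image.

    color_rgb    : HxWx3 uint8 visible-light image
    ir_intensity : HxW float infrared fluorescence image (assumed co-registered)
    threshold    : fluorescence level above which a pixel counts as target
    alpha        : blend weight of the fluorescence pseudocolor
    """
    ir = ir_intensity.astype(np.float64)
    ir = (ir - ir.min()) / (np.ptp(ir) + 1e-12)   # normalize to [0, 1]
    mask = ir > threshold                          # candidate tumor target area
    overlay = color_rgb.astype(np.float64).copy()
    # paint the target area green, weighted by fluorescence strength
    green = np.zeros_like(overlay)
    green[..., 1] = 255.0 * ir
    overlay[mask] = (1 - alpha) * overlay[mask] + alpha * green[mask]
    return overlay.astype(np.uint8), mask
```

The function names, the green pseudocolor, and the fixed threshold are illustrative choices, not from the patent; a clinical system would use validated registration and calibrated fluorescence thresholds.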
Further, in step S1, a multi-dimensional adjustable flat surgical bed can restore the patient to the coordinate system used during image acquisition; in step S2, a three-dimensional static or four-dimensional dynamic digital model of the patient is reconstructed from the patient's medical image data.
Further, in step S3, the operative area is illuminated with a white light source and its image data are acquired with a color CCD camera; in step S4, a molecular contrast agent is injected, the surgical field is irradiated with an infrared light source, and the excited fluorescence is received by an infrared CCD camera for imaging, thereby identifying the patient's surgical target area.
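The patent does not specify how the fluorescence image is turned into a target-area boundary contour. A minimal stand-in, assuming a simple intensity threshold on the infrared image (the function and threshold are illustrative assumptions):

```python
import numpy as np

def target_boundary(ir_image, threshold):
    """Return a binary mask of the tumor target and its boundary pixels.

    A boundary pixel is a target pixel with at least one non-target
    4-neighbor -- a minimal stand-in for contour extraction.
    """
    mask = ir_image > threshold
    padded = np.pad(mask, 1, mode="constant")
    # a pixel is interior if all four 4-neighbors are also in the mask
    interior = (padded[:-2, 1:-1] & padded[2:, 1:-1] &
                padded[1:-1, :-2] & padded[1:-1, 2:])
    boundary = mask & ~interior
    return mask, boundary
```

In practice the threshold would be calibrated per contrast agent and camera, and morphological smoothing would precede contour extraction.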
Further, in step S5, according to the surgeon's needs, infrared and visible-light images can be collected from multiple angles, registered by the computer, and displayed superimposed on the patient's three-dimensional image.
A multi-modal guided surgical navigation system, the system comprising:
(1) a multi-dimensional adjustable operating bed: configured to precisely adjust the patient's position and establish a correspondence with the coordinate system used during image acquisition;
(2) an infrared excitation light source: configured to emit infrared excitation light onto the patient's surgical field so as to excite the molecular contrast agent to fluoresce; an infrared CCD camera: configured to receive the excited fluorescence and image the tumor region, identifying the surgical target area;
(3) a halogen light source: configured to emit white light to illuminate the patient's surgical field; a high-definition CCD camera: configured to acquire color image data of the patient;
(4) an image acquisition and processing unit: configured to acquire, preprocess, register, and fuse the infrared fluorescence and color image data and to superimpose the fluorescence and color images; the superimposed images can be displayed overlaid on the three-dimensional digital human model or on their own;
(5) an image display: a head-mounted mask display configured to present the two- and three-dimensional image information produced by the image processor, superimposed on the environment image captured by the high-definition CCD camera, thereby guiding the operation precisely;
(6) a controller: configured to control the above components;
(7) a power supply system: configured to power the above components;
(8) a signal transmission system: configured to send control signals to the components and collect their signals;
(9) a support system: configured to carry the components of the whole system and realize its functions.
Furthermore, the operating bed is multi-dimensionally adjustable; the correspondence between coordinate systems is established with a cone-beam CT scanner or a bi-orthogonal X-ray imaging device, registering the patient by bone or organ; the image data acquired during diagnosis and positioning are reconstructed in three or four dimensions to build a three- or four-dimensional digital human model of the patient; and molecular image information is obtained by infrared excitation of the molecular contrast agent to mark the surgical target area.
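For rigid (bone-based) registration, the coordinate-system correspondence described above reduces to a least-squares rotation-plus-translation fit between matched landmark points in image space and operating-room space. A sketch using the standard Kabsch/SVD solution (the details are assumed, not taken from the patent):

```python
import numpy as np

def rigid_register(image_pts, room_pts):
    """Least-squares rigid transform (Kabsch) mapping image-space landmark
    points onto their operating-room counterparts: room ≈ R @ image + t."""
    ci, cr = image_pts.mean(axis=0), room_pts.mean(axis=0)
    H = (image_pts - ci).T @ (room_pts - cr)   # cross-covariance of centered sets
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    D = np.diag([1.0, 1.0, d])                 # guard against a reflection
    R = Vt.T @ D @ U.T
    t = cr - R @ ci
    return R, t
```

With three or more non-collinear bone landmarks this yields the optimal rigid alignment; organ (deformable) registration would need a non-rigid model instead.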
Further, the infrared excitation light source, halogen white light source, infrared CCD camera, and color CCD camera are mounted on a bracket at the front of the head-mounted mask display, so that the imaging angle of the two cameras coincides with the surgeon's viewing angle: imaging is "what you see is what you get".
Further, a head-mounted mask display is used; the mask display can directly present the processed image information.
Further, the infrared and color CCD cameras use plastic lenses to reduce weight, and the head-mounted components use carbon fiber or other lightweight materials.
Further, the infrared excitation light source, halogen white light source, infrared CCD camera, and color CCD camera are powered by wire, while control and acquisition signals are transmitted wirelessly.
The beneficial effects of the invention: the multi-modal guided surgical navigation method and system overcome the inability of traditional infrared-reflection guidance to identify tumor-region boundaries, the single observation angle of infrared-excitation methods, and the mismatch between the monitor's viewing angle and the surgical viewing angle; they fully identify the tumor surgical target area while unifying the observation and surgical viewpoints, largely solving the chief surgeon's problem of precisely guided resection.
Drawings
To illustrate the technical solutions of the embodiments more clearly, the drawings used in their description are briefly introduced below. The drawings show only some embodiments of the invention; those skilled in the art can derive other drawings from them without creative effort.
FIG. 1 is a flow chart illustrating the present invention.
Detailed Description
To make the technical means, inventive features, objectives, and effects of the invention easy to understand, the technical solutions in the embodiments are described below clearly and completely with reference to the drawings. The described embodiments are only some, not all, of the embodiments of the invention; all other embodiments obtained by those skilled in the art without creative effort fall within the scope of protection.
The invention provides a precise surgical navigation method comprising the following steps:
S1, positioning the patient on an operating bed, and establishing a correspondence between the operating-room coordinate system and the imaging-examination coordinate system;
S2, reconstructing a three-dimensional static or four-dimensional dynamic digital human model of the patient in operating-room space according to the medical image data and the operating-room coordinate system;
S3, illuminating the surgical site with a white light source, and collecting color image data with a camera;
S4, irradiating the surgical site with an infrared light source, acquiring infrared image data with a camera, and identifying the patient's surgical target area;
S5, after preprocessing, registering, and fusing the color and infrared images, displaying them superimposed on the three-dimensional digital human model or separately, guiding the surgical resection in real time.
Through this embodiment, the method positions the patient with the multi-dimensional operating bed and the CBCT/bi-orthogonal positioning system while building a multi-dimensional digital image model of the patient; molecular imaging, visible-light imaging, and the digital model are displayed singly or superimposed in real time on the head-mounted display, precisely guiding the whole procedure.
The following describes in detail, with reference to the multi-modal guided surgical navigation method and system of FIG. 1, how tumor target localization and guided resection are performed: the invention combines excitation fluorescence imaging, visible-light imaging, and the digital patient model in sequence, and displays them on a head-mounted display to guide the procedure precisely.
In these embodiments, to establish an accurate digital model of the patient, the patient is positioned using a multi-dimensional flat surgical bed, CBCT, and/or a bi-orthogonal positioning system;
in this embodiment, to accurately identify the boundary contour of the tumor target area, excitation imaging is performed using a molecular contrast agent and an infrared laser, while visible-light color image data are collected at the same angle;
in this embodiment, to overcome the inability of a two-dimensional image to localize the tumor region precisely, data are acquired from multiple angles and fused with the three-dimensional patient model for precise three-dimensional localization;
the invention also provides a multi-mode guided precise surgery navigation system, which comprises:
multidimensional adjustable operating table: is configured to accurately adjust the body position of the patient and establish a corresponding relation with a coordinate system during image acquisition;
infrared excitation light source: configured to emit infrared excitation light to illuminate a surgical field of a patient to excite the molecular contrast agent to emit fluorescence; infrared CCD camera: configured to receive the excited fluorescence for imaging the tumor region to identify the surgical target area;
halogen lamp light source: configured to emit white light to illuminate a surgical field of a patient; high definition CCD camera: configured to receive acquired patient color image data;
image acquisition and processor: the system is configured to collect, preprocess, register, fuse and the like infrared fluorescence image data and color image data, and superpose fluorescence images and color images, and the superposed images can be superposed with a human body three-dimensional digital model for display or can be independently displayed;
image display: the head-mounted mask type display is configured to display two-dimensional and three-dimensional image information processed by the image processor by the mask display and simultaneously display the two-dimensional and three-dimensional image information on an environment image acquired by the high-definition CCD in an overlapping manner; accurate guiding of the operation is realized;
a controller: configured to control the above components;
a power supply system: configured to supply power to the above-mentioned components;
a signal transmission system: the control device is configured to send control signals to the components and carry out signal acquisition;
a bracket system: is configured to support various components of the whole system and realize the functions of the system;
through the system, the tumor target area of the patient can be accurately positioned according to the digital human body model of the patient, the fluorescence image data and the visible light color data, the surgical excision is accurately guided, and the damage to important organs and functions is greatly reduced.
In one embodiment of the invention, the multi-modal guidance and the head-mounted data acquisition and display system are the key features of the multi-modal guided surgical navigation system.
In this embodiment, the operating bed is flat and multi-dimensionally adjustable;
in this embodiment, a molecular contrast agent is used for infrared excitation imaging;
in this embodiment, the infrared light source, visible light source, infrared lens, and visible-light lens are all mounted at the front of the head-mounted display screen;
in this way, multi-modal video information is displayed on the head-mounted screen, realizing "what you see is what you get" guidance for the physician.
The preferred embodiments of the invention disclosed above are intended to be illustrative only. The preferred embodiments are not intended to be exhaustive or to limit the invention to the precise embodiments disclosed. Obviously, many modifications and variations are possible in light of the above teaching. The embodiments were chosen and described in order to best explain the principles of the invention and the practical application, to thereby enable others skilled in the art to best utilize the invention. The invention is limited only by the claims and their full scope and equivalents.
Claims (10)
1. A multi-modal guided surgical navigation method, the method comprising:
S1, positioning the patient on an operating bed, and establishing a correspondence between the operating-room coordinate system and the imaging-examination coordinate system;
S2, reconstructing a three-dimensional static or four-dimensional dynamic digital human model of the patient in operating-room space according to the medical image data and the operating-room coordinate system;
S3, illuminating the surgical site with a white light source, and collecting color image data with a camera;
S4, irradiating the surgical site with an infrared light source, acquiring infrared image data with a camera, and identifying the patient's surgical target area;
S5, after preprocessing, registering, and fusing the color and infrared images, displaying them superimposed on the three-dimensional digital human model or separately, guiding the surgical resection in real time.
2. The multi-modal guided surgical navigation method according to claim 1, wherein in step S1 a multi-dimensional adjustable flat surgical bed restores the patient to the coordinate system used during image acquisition, and in step S2 a three-dimensional static or four-dimensional dynamic digital model of the patient is reconstructed from the patient's medical image data.
3. The multi-modal guided surgical navigation method according to claim 1, wherein in step S3 the operative area is illuminated by a white light source and its image data are collected by a color CCD camera, and in step S4 a molecular contrast agent is injected, the surgical field is irradiated with an infrared light source, and the excited fluorescence is received by an infrared CCD camera for imaging, thereby identifying the patient's surgical target area.
4. The multi-modal guided surgical navigation method according to claims 2 and 3, wherein in step S5, according to the surgeon's needs, infrared and visible-light images can be acquired from multiple angles, registered by the computer, and displayed superimposed on the patient's three-dimensional image.
5. A multi-modal guided surgical navigation system, the system comprising:
(1) a multi-dimensional adjustable operating bed: configured to precisely adjust the patient's position and establish a correspondence with the coordinate system used during image acquisition;
(2) an infrared excitation light source: configured to emit infrared excitation light onto the patient's surgical field so as to excite the molecular contrast agent to fluoresce; an infrared CCD camera: configured to receive the excited fluorescence and image the tumor region, identifying the surgical target area;
(3) a halogen light source: configured to emit white light to illuminate the patient's surgical field; a high-definition CCD camera: configured to acquire color image data of the patient;
(4) an image acquisition and processing unit: configured to acquire, preprocess, register, and fuse the infrared fluorescence and color image data and to superimpose the fluorescence and color images; the superimposed images can be displayed overlaid on the three-dimensional digital human model or on their own;
(5) an image display: a head-mounted mask display configured to present the two- and three-dimensional image information produced by the image processor, superimposed on the environment image captured by the high-definition CCD camera, thereby guiding the operation precisely;
(6) a controller: configured to control the above components;
(7) a power supply system: configured to power the above components;
(8) a signal transmission system: configured to send control signals to the components and collect their signals;
(9) a support system: configured to carry the components of the whole system and realize its functions.
6. The multi-modal guided surgical navigation system according to claim 5, wherein the operating bed is multi-dimensionally adjustable; the correspondence between coordinate systems is established with a cone-beam CT scanner or a bi-orthogonal X-ray imaging device, registering the patient by bone or organ; the image data acquired during diagnosis and positioning are reconstructed in three or four dimensions to build a three- or four-dimensional digital human model of the patient; and molecular image information is obtained by infrared excitation of the molecular contrast agent to mark the surgical target area.
7. The multi-modal guided surgical navigation system according to claim 5, wherein the infrared excitation light source, halogen white light source, infrared CCD camera, and color CCD camera are mounted on a bracket at the front of the head-mounted mask display, the imaging angle of the two cameras coinciding with the surgeon's viewing angle: "what you see is what you get".
8. The multi-modal guided surgical navigation system according to claim 5, wherein a head-mounted mask display is used and the mask display directly presents the processed image information.
9. The multi-modal guided surgical navigation system according to claim 5, wherein the infrared and color CCD cameras use plastic lenses to reduce weight, and the head-mounted components use carbon fiber or other lightweight materials.
10. The multi-modal guided surgical navigation system according to claim 5, wherein the infrared excitation light source, halogen white light source, infrared CCD camera, and color CCD camera are powered by wire, while control and acquisition signals are transmitted wirelessly.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201911103584.4A CN110720985A (en) | 2019-11-13 | 2019-11-13 | Multi-mode guided surgical navigation method and system |
Publications (1)
Publication Number | Publication Date |
---|---|
CN110720985A true CN110720985A (en) | 2020-01-24 |
Family
ID=69224047
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201911103584.4A Pending CN110720985A (en) | 2019-11-13 | 2019-11-13 | Multi-mode guided surgical navigation method and system |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110720985A (en) |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113436129A (en) * | 2021-08-24 | 2021-09-24 | 南京微纳科技研究院有限公司 | Image fusion system, method, device, equipment and storage medium |
CN114052903A (en) * | 2021-10-09 | 2022-02-18 | 山东大学 | Near-infrared imaging surgical navigation system and method |
CN114565517A (en) * | 2021-12-29 | 2022-05-31 | 骨圣元化机器人(深圳)有限公司 | Image denoising method and device for infrared camera and computer equipment |
WO2023115707A1 (en) * | 2021-12-21 | 2023-06-29 | 广东欧谱曼迪科技有限公司 | Double-source endoscopic surgery navigation system and method |
Application Events
- 2019-11-13: Application CN201911103584.4A filed (CN); published as CN110720985A. Status: Pending.
Patent Citations (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103796566A (en) * | 2011-09-05 | 2014-05-14 | 富士胶片株式会社 | Endoscope system and image display method |
CN102397106A (en) * | 2011-10-03 | 2012-04-04 | 杨晓峰 | Multispectral beam-splitting fusion surgical guidance system |
CN202313279U (en) * | 2011-10-28 | 2012-07-11 | 北京天助基业科技发展有限公司 | Guidance device for near-infrared fluorescence angiography surgery |
CN204445836U (en) * | 2015-02-10 | 2015-07-08 | 安徽信美医学工程科技有限公司 | Lesion-region imaging projection guidance device |
CN107709968A (en) * | 2015-06-26 | 2018-02-16 | 利康公司 | Fluorescence biopsy sample imager and method |
US20170239491A1 (en) * | 2015-12-01 | 2017-08-24 | University Of Iowa Research Foundation | Real-time application position monitoring system |
CN106420057A (en) * | 2016-11-23 | 2017-02-22 | 北京锐视康科技发展有限公司 | PET (positron emission computed tomography)-fluorescence dual-mode intraoperative navigation imaging system and imaging method implemented by same |
CN107440669A (en) * | 2017-08-25 | 2017-12-08 | 北京数字精准医疗科技有限公司 | Dual-channel endoscopic imaging system |
CN107744382A (en) * | 2017-11-20 | 2018-03-02 | 北京数字精准医疗科技有限公司 | Optical molecular image navigation system |
CN109925058A (en) * | 2017-12-18 | 2019-06-25 | 吕海 | Minimally invasive spinal surgery guidance system |
CN109549689A (en) * | 2018-08-21 | 2019-04-02 | 池嘉昌 | Puncture-assisting guide device, system and method |
CN109363625A (en) * | 2018-12-17 | 2019-02-22 | 温州医科大学 | Augmented reality system for online marking of the astigmatism axis |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113436129A (en) * | 2021-08-24 | 2021-09-24 | 南京微纳科技研究院有限公司 | Image fusion system, method, device, equipment and storage medium |
CN114052903A (en) * | 2021-10-09 | 2022-02-18 | 山东大学 | Near-infrared imaging surgical navigation system and method |
WO2023115707A1 (en) * | 2021-12-21 | 2023-06-29 | 广东欧谱曼迪科技有限公司 | Double-source endoscopic surgery navigation system and method |
CN114565517A (en) * | 2021-12-29 | 2022-05-31 | 骨圣元化机器人(深圳)有限公司 | Image denoising method and device for infrared camera, and computer equipment |
CN114565517B (en) * | 2021-12-29 | 2023-09-29 | 骨圣元化机器人(深圳)有限公司 | Image denoising method and device for infrared camera, and computer equipment |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109419524B (en) | Control of medical imaging system | |
CN110720985A (en) | Multi-mode guided surgical navigation method and system | |
US7203277B2 (en) | Visualization device and method for combined patient and object image data | |
EP2099377B1 (en) | A method and system for integrating image information from an external source | |
US7050845B2 (en) | Projecting patient image data from radioscopic imaging methods and/or tomographic imaging methods onto video images | |
JP4822634B2 (en) | A method for obtaining coordinate transformation for guidance of an object | |
CN109646089B (en) | Intelligent positioning system and method for spinal cord body surface puncture access point based on multi-mode medical fusion image | |
CN109549689A (en) | Puncture-assisting guide device, system and method | |
CN101474075B (en) | Navigation system of minimal invasive surgery | |
CN111356395A (en) | System and method for facilitating visualization during a procedure | |
KR20190058528A (en) | Systems for Guided Procedures | |
WO2019037606A1 (en) | Surgical navigation system and method based on ar technology | |
JP2003159247A (en) | System and method to visualize the inside region of an anatomical object | |
US20200315734A1 (en) | Surgical Enhanced Visualization System and Method of Use | |
CN107847278A (en) | Optics targets and track visualization | |
EP2438880A1 (en) | Image projection system for projecting image on the surface of an object | |
CN110584783A (en) | Surgical navigation system | |
US20210196404A1 (en) | Implementation method for operating a surgical instrument using smart surgical glasses | |
US20040152975A1 (en) | Image registration | |
JPH09173352A (en) | Medical navigation system | |
JP2014131552A (en) | Medical support device | |
CN111728695B (en) | Light beam auxiliary positioning system for craniotomy | |
US11406346B2 (en) | Surgical position calibration method | |
CN108143501B (en) | Anatomical projection method based on body surface vein features | |
Richey et al. | Soft tissue monitoring of the surgical field: detection and tracking of breast surface deformations |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication | ||
Application publication date: 2020-01-24 |