CN110584783A - Surgical navigation system - Google Patents


Info

Publication number: CN110584783A (granted as CN110584783B)
Application number: CN201910972990.8A
Authority: CN (China)
Prior art keywords: image, focus (lesion), navigation system, projection, component
Legal status: Granted; Active (the legal status is an assumption, not a legal conclusion; Google has not performed a legal analysis)
Other languages: Chinese (zh)
Inventors: 邵鹏飞, 吴柄萱, 刘鹏
Current and original assignee: University of Science and Technology of China (USTC) (listed assignees may be inaccurate)
Legal events: application filed by University of Science and Technology of China USTC; priority to CN201910972990.8A; publication of CN110584783A; application granted; publication of CN110584783B; anticipated expiration

Classifications

    • A: Human Necessities
    • A61: Medical or Veterinary Science; Hygiene
    • A61B: Diagnosis; Surgery; Identification
    • A61B 34/00: Computer-aided surgery; manipulators or robots specially adapted for use in surgery
    • A61B 34/20: Surgical navigation systems; devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B 2034/2046: Tracking techniques
    • A61B 2034/2065: Tracking using image or pattern recognition

Abstract

The invention discloses a surgical navigation system that integrates a camera assembly and a projector assembly into a single information acquisition and projection device with a coaxial optical path. The device acquires image information of a position calibration assembly and projects a lesion image onto the patient's surgical site; an upper computer processes the image information to determine the position of the lesion inside the patient's body, determines the projection position, and controls the projector assembly to project the lesion image at the set position of the surgical site. The system can therefore project the lesion image directly onto the surgical site during the operation, so the surgeon no longer needs to compare the site against an image on a display; the system is more convenient to use, and the lesion can be located more accurately.

Description

Surgical navigation system
Technical Field
The invention relates to the technical fields of medical imaging and medical instruments, and in particular to a surgical navigation system.
Background
Malignant craniocerebral tumors, such as brain gliomas, are primary tumors with a high fatality rate, and surgical resection is one of the most effective treatments. In the past, limited by computer hardware, imaging equipment, and image-processing technology, a surgeon could only judge the size and position of a tumor in a patient's brain from tomographic slices (CT, MRI, etc.) and then perform the resection. Locating and resecting a brain tumor by viewing tomographic slices has several disadvantages. First, because tumor tissue differs little from normal brain tissue in color and feel, locating the tumor depends heavily on the surgeon's experience in neuroanatomy and clinical operation. Second, the boundary between tumor and normal brain tissue is hard to determine: incomplete resection markedly increases the recurrence rate, while over-resection damages normal brain tissue. As a result, the success rate and prognosis of such operations are not ideal.
In the 1990s, with advances in computer hardware, medical image processing, and three-dimensional reconstruction from tomographic data, the practice of brain surgery changed greatly, and neurosurgical navigation systems based on modern computers and displays began to appear. These systems display a reconstructed three-dimensional tumor model as a two-dimensional image on a monitor, and an assistant can change the viewing angle of the model with keyboard and mouse during the operation, which greatly improves the intuitiveness of navigation and positioning. The surgeon no longer needs to judge the size and position of the tumor from slices and clinical experience alone, but can judge the resection range from the reconstructed three-dimensional tumor model.
However, a conventional surgical navigation system can only transmit information to the surgeon one way, through the display. During the operation the surgeon must still repeatedly shift attention to the display, and the image on the display cannot be visually superimposed on the patient's actual surgical site, which makes such systems inconvenient to use.
Disclosure of Invention
In view of this, the present application provides a surgical navigation system that projects a lesion image directly onto the patient's surgical site, so that the surgeon does not need to look away to compare an image on a display during the operation; the system is more convenient to use and locates the lesion more accurately.
A surgical navigation system, comprising:
a position calibration assembly;
an information acquisition and projection device comprising a camera assembly and a projector assembly integrated as one unit, the camera assembly acquiring image information of the position calibration assembly through a lens assembly, and the projector assembly projecting a lesion image onto the patient's surgical site through the same lens assembly; and
an upper computer, configured to acquire the image information, determine from it the position of the lesion in the patient's body in the current posture, determine a projection position, and control the projector assembly to project the lesion image at the set position of the surgical site based on the projection position.
Preferably, in the above surgical navigation system, the camera assembly includes at least a camera CMOS chip;
the projector assembly includes at least a projector DMD chip;
the lens assembly comprises a plurality of optical lenses sharing a common optical axis;
and the optical paths of the camera CMOS chip and the projector DMD chip are coupled onto the optical axis of the same lens assembly through a beam splitter.
Preferably, in the above surgical navigation system, the beam splitter forms a preset included angle with the optical axis of the lens assembly;
light collected by the lens assembly and carrying the image information passes through the beam splitter and reaches the camera CMOS chip without changing its propagation direction;
light emitted by the projector DMD chip and carrying the lesion image is reflected by the beam splitter, changes propagation direction, and travels along the optical axis of the lens assembly to the set position of the surgical site.
Preferably, in an alternative arrangement, the beam splitter forms a preset included angle with the optical axis of the lens assembly;
light collected by the lens assembly and carrying the image information is reflected by the beam splitter, changes propagation direction, and reaches the camera CMOS chip;
light emitted by the projector DMD chip and carrying the lesion image passes through the beam splitter without changing direction and travels along the optical axis of the lens assembly to the set position of the surgical site.
Preferably, in the above surgical navigation system, the position calibration assembly includes:
a plurality of reflective stickers to be attached to the surgical site and arranged around the lesion; and
a plurality of reflective balls to be placed on the operating table via a support;
the upper computer determines the projection position from the spatial relationship between the images of the reflective stickers and the images of the reflective balls in the image information, renders a two-dimensional view of the preoperatively acquired three-dimensional lesion image at the current projection-position viewing angle, and projects that two-dimensional view onto the set position of the surgical site as the lesion image.
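The registration step described above, relating the sticker markers around the lesion to the reflective balls on the table, amounts to estimating a rigid transform between two sets of corresponding 3-D marker coordinates. The patent does not specify its algorithm; a minimal sketch using the classic Kabsch/SVD solution would be:

```python
import numpy as np

def rigid_transform(src, dst):
    """Best-fit rotation R and translation t such that R @ src_i + t ~= dst_i.

    src, dst: (N, 3) arrays of corresponding marker coordinates
    (e.g. sticker positions in the lesion frame vs. the ball frame).
    Classic Kabsch/Umeyama least-squares solution via SVD.
    """
    src = np.asarray(src, float)
    dst = np.asarray(dst, float)
    src_c = src - src.mean(axis=0)
    dst_c = dst - dst.mean(axis=0)
    H = src_c.T @ dst_c                       # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))    # guard against a reflection
    D = np.diag([1.0, 1.0, d])
    R = Vt.T @ D @ U.T
    t = dst.mean(axis=0) - R @ src.mean(axis=0)
    return R, t
```

With the four sticker points and four ball points identified in the image, this yields the translation and rotation vectors the upper computer needs.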
Preferably, in the above surgical navigation system, each reflective sticker is attached to the surgical site through a reflective-sticker base whose surface carries cod-liver-oil particles visible in nuclear magnetic resonance imaging;
the three-dimensional lesion image is acquired before the operation with a nuclear magnetic resonance scanner.
Preferably, the surgical navigation system further includes:
an adjustable mounting platform on which the information acquisition and projection device is mounted;
the upper computer is further configured to calculate a translation vector and a rotation vector from the projection position, and to translate and rotate the mounting platform accordingly, so that the lens assembly is aimed directly at the projection position.
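The translation and rotation vectors for the platform can be illustrated with a small aiming computation. This is a hypothetical helper under assumed conventions (the device's optical axis as a unit vector, a fixed standoff distance), not the patent's actual control law:

```python
import numpy as np

def aim_platform(device_pos, target_pos, current_axis, standoff):
    """Translation and axis-angle rotation vector that aim the device's
    optical axis at the projection target and place it `standoff` away.
    All quantities are in a common world frame; conventions are assumed."""
    to_target = np.asarray(target_pos, float) - np.asarray(device_pos, float)
    desired = to_target / np.linalg.norm(to_target)
    a = np.asarray(current_axis, float)
    a = a / np.linalg.norm(a)
    # Axis-angle rotation carrying the current axis onto the desired axis.
    axis = np.cross(a, desired)
    s = np.linalg.norm(axis)
    c = float(np.dot(a, desired))
    angle = np.arctan2(s, c)
    rot_vec = np.zeros(3) if s < 1e-12 else axis / s * angle
    # Translate so the device sits `standoff` from the target on the new axis.
    translation = to_target - standoff * desired
    return translation, rot_vec
```

A platform controller would then convert these two vectors into joint or stage commands.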
Preferably, in the above surgical navigation system, the upper computer includes a memory storing a three-dimensional image of the patient's lesion, and the upper computer determines the lesion image from that three-dimensional image.
Preferably, in the above surgical navigation system, the upper computer is further configured to control the projector to project the lesion image with alternating bright and dark light, so as to show the lesion and its edge on the patient's body.
As can be seen from the above description, in the surgical navigation system provided by the technical solution of the present invention, the camera assembly and the projector assembly are integrated into an information acquisition and projection device with a coaxial optical path, which acquires image information of the position calibration assembly and projects a lesion image onto the patient's surgical site; the upper computer processes the image information to determine the position of the lesion in the patient's body, determines the projection position, and controls the projector assembly to project the lesion image at the set position of the surgical site. The system can therefore project the lesion image directly onto the surgical site during the operation, the surgeon no longer needs to compare an image on a display, the system is more convenient to use, and the lesion can be located more accurately.
Drawings
To explain the embodiments of the present application more clearly, the drawings needed in the description of the embodiments are briefly introduced below. The drawings described below show only embodiments of the present application; those skilled in the art can obtain other drawings from them without creative effort.
FIG. 1 shows a surgical navigation system according to an embodiment of the present invention;
FIG. 2 is a schematic structural diagram of an information acquisition and projection device according to an embodiment of the present invention;
FIG. 3 is a schematic structural diagram of another information acquisition and projection device according to an embodiment of the present invention;
FIG. 4 is a schematic structural diagram of yet another information acquisition and projection device according to an embodiment of the present invention.
Detailed Description
The embodiments of the present application are described below clearly and completely with reference to the accompanying drawings. The described embodiments are only some, not all, of the embodiments of the present application. All other embodiments obtained by a person skilled in the art from these embodiments without creative effort fall within the protection scope of the present application.
Although the prior art includes frameless stereotactic positioning techniques for brain tumors, surgical navigation systems based on those techniques cannot use a navigation wand (a positioning tool held by the surgeon). Image information is still transmitted to the surgeon one way, from the display; the surgeon has no tool for locating the positional relationship between a given point on the brain (such as the incision margin) and the tumor, and there is no interaction.
With the continued development of computer vision, surgical navigation systems were improved by adding a navigation wand. The improved systems register the positional relationship between the patient model and the navigation reflective balls by scanning the patient's facial information or a set of marker points on the head; by capturing the positions of the navigation wand and the reflective balls, an intraoperative camera can indirectly calculate the positional relationship between the wand and the patient and display it in real time. This improves the interactivity between the surgeon and the navigation system: the surgeon can actively query the positional relationship between any point and the tumor, which improves the accuracy and practicality of neurosurgical navigation. Navigation systems based on this principle are now commercialized, with mature systems built on a binocular camera, reflective balls, a navigation wand, and a display.
In recent years, with the development of camera-projector systems and the continuous improvement of open-source computer graphics libraries (such as the Visualization Toolkit, VTK) and computer vision libraries (such as OpenCV), a new surgical navigation mode has emerged: a projector projects important pathological and physiological information onto the patient's body surface to guide the surgeon. Navigation based on projection is more intuitive and convenient than navigation based on a wand and display. The surgeon does not need to switch attention repeatedly between the surgical site and a display, and the information projected on the body surface is superimposed, along the projector's optical axis, on the lesion or physiological structure beneath it, so the surgeon is guided continuously without interrupting the operation to consult a display. There are three main types of projection-based surgical navigation systems.
The first is a navigation system in which the camera is separate from the projector. In operation, the camera captures the spatial positions of the patient and the projection device simultaneously; a host computer calculates the projector's position and attitude relative to the patient and renders a two-dimensional image of a three-dimensional model (built preoperatively and containing the important pathophysiological information) in that attitude. The projector then projects the image onto the patient's body surface to assist the surgeon. The innovation of this system is that it replaces display-based interaction with a more intuitive and clearer projection, but the design still has shortcomings. First, because the camera is separate from the projection device, it must keep a certain distance from both the projection device and the patient to locate them simultaneously, which increases the system's footprint. Second, the host computer, camera, and projection device communicate over wired cables, making the operating-room environment more crowded and cluttered. Moreover, spatial localization through three reflective balls carries a certain error, and since this system localizes the patient and the projection device separately, the two localization errors add up, greatly reducing the navigation accuracy of this separated projection design.
The second type is a non-coaxial projection system, in which the camera and the projector are packaged together but their optical paths are not coaxial, so navigation works only at a certain working distance. If the distance is too far or too close, the projected pathophysiological information is biased because of the non-coaxial optical path.
The third type is a coaxial projection navigation system, whose optical design makes the optical axes of the camera and the projector coincide, overcoming the shortcomings of both the separated and the non-coaxial systems. Its camera, projection device, host computer, and display are packaged as one unit with rollers at the bottom, so the system is easy to move, occupies little operating-room space, and is easy to maintain. However, the function of current systems of this type is limited to in-situ projection navigation based on fluorescence images; they cannot calculate the three-dimensional relative position of the patient and the camera and therefore cannot be used for three-dimensional navigation.
To solve the above problems, embodiments of the present invention provide a real-time three-dimensional coaxial projection surgical navigation system, which can be used not only for navigation in neurosurgical tumor resection but also for operations on other rigid physiological structures, such as orthopedic, maxillofacial, and oral surgery.
In the surgical navigation system provided by the embodiments of the invention, the camera assembly and the projector assembly are integrated into an information acquisition and projection device with a coaxial optical path. The device acquires image information of the position calibration assembly and projects a lesion image onto the patient's surgical site; the upper computer processes the image information to determine the position of the lesion in the patient's body, determines the projection position, and controls the projector assembly to project the lesion image at the set position of the surgical site. The system can therefore project the lesion image directly onto the surgical site during the operation, the surgeon no longer needs to compare an image on a display, the system is more convenient to use, and the lesion can be located more accurately.
In order to make the aforementioned objects, features and advantages of the present application more comprehensible, the present application is described in further detail with reference to the accompanying drawings and the detailed description.
Referring to fig. 1 and fig. 2: fig. 1 shows a surgical navigation system according to an embodiment of the present invention, and fig. 2 is a schematic structural diagram of an information acquisition and projection device according to an embodiment of the present invention. The surgical navigation system includes: a position calibration assembly 11; an information acquisition and projection device 12, which comprises an integrated camera assembly and projector assembly, the camera assembly acquiring image information of the position calibration assembly 11 through a lens assembly and the projector assembly projecting a lesion image 15 onto the patient's surgical site through the same lens assembly; and an upper computer 13, configured to acquire the image information, determine from it the position of the patient's lesion 14 in the current posture, determine a projection position, and control the projector assembly to project the lesion image 15 at the set position of the surgical site based on the projection position.
The information acquisition and projection device 12 and the upper computer 13 can be connected by wireless or wired communication, so that the camera assembly and the projector assembly can exchange data with the upper computer 13.
The information acquisition and projection device 12 is shown in fig. 2. The camera assembly includes at least a camera CMOS chip 21; the projector assembly includes at least a projector DMD chip 22; the lens assembly includes a plurality of optical lenses 24 sharing a common optical axis; and the optical paths of the camera CMOS chip 21 and the projector DMD chip 22 are coupled onto the optical axis L of the same lens assembly through a beam splitter 23. Because the light collected by the camera assembly and the light projected by the projector assembly share the same lens assembly, both optical paths between the patient, the lens assembly, and the beam splitter 23 lie on the same optical axis.
The beam splitter forms a preset included angle β with the optical axis of the lens assembly; preferably, β is 45°. In the arrangement shown in fig. 2, light collected by the lens assembly and carrying the image information is transmitted through the beam splitter 23 and reaches the camera CMOS chip 21 without changing its propagation direction, while light emitted by the projector DMD chip 22 and carrying the lesion image 15 is reflected by the beam splitter 23, changes propagation direction, and travels along the optical axis L of the lens assembly to the set position of the surgical site.
The information acquisition and projection device 12 may also take the form shown in fig. 3, a schematic structural diagram of another information acquisition and projection device according to an embodiment of the present invention. Here too the beam splitter forms a preset included angle β with the optical axis of the lens assembly, preferably 45°. The arrangement of fig. 3 differs from that of fig. 2 in that light collected by the lens assembly and carrying the image information is reflected by the beam splitter 23, changes propagation direction, and reaches the camera CMOS chip 21, while light emitted by the projector DMD chip 22 and carrying the lesion image 15 is transmitted through the beam splitter 23 and travels along the optical axis L of the lens assembly, without changing direction, to the set position of the surgical site.
As shown in fig. 1, the position calibration assembly 11 includes: a plurality of reflective stickers 113 attached to the surgical site and arranged around the lesion 14, and reflective balls 111 placed on the operating table via a support.
The upper computer 13 is configured to determine the projection position from the spatial relationship between the images of the reflective stickers 113 and the images of the reflective balls 111 in the image information, render a two-dimensional view of the preoperatively acquired three-dimensional lesion image at the current projection-position viewing angle a, and project that view as the lesion image 15 onto the set position of the surgical site.
Each reflective sticker 113 is attached to the surgical site through a reflective-sticker base 112 whose surface carries cod-liver-oil particles visible in nuclear magnetic resonance imaging; the three-dimensional lesion image is acquired before the operation with a nuclear magnetic resonance scanner.
Fig. 4 shows yet another form of the information acquisition and projection device 12, which adds, to the arrangement of fig. 3, an infrared laser module 25 that emits infrared light to assist in locating the reflective balls 111. The infrared laser module 25 is mounted beside the coaxial optical path of the device 12; the infrared light it emits illuminates the reflective balls 111, and the light they return is collected by the device 12. This greatly increases the apparent brightness of the reflective balls 111, makes them easy to distinguish from the background, and improves real-time tracking of the balls. Fig. 4 adds the infrared laser module 25 to the arrangement of fig. 3 only by way of example; the module can equally be added to the arrangement of fig. 2, with the same principle, which is not repeated here.
Before the operation, a three-dimensional image of the patient's lesion is produced with a nuclear magnetic resonance scanner, and the relative position of the lesion and the surgical site can be determined from this image. The image positions of the reflective stickers 113 attached to the surgical site represent the patient's position in the current posture. The navigation system acquires the relative positions of the images of the stickers 113 and of the reflective balls 111 in the current posture and, using the known relative position of lesion and surgical site, determines the projection position for that posture. From the projection position and the three-dimensional lesion image, the system then flattens the three-dimensional image into the required two-dimensional image at the current viewing angle.
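The "flattening" of the three-dimensional lesion image at the current viewing angle is, at its core, a perspective projection. A minimal pinhole-model sketch (the real system renders a full reconstructed 3-D model, e.g. with a graphics library such as VTK, and the parameter names here are illustrative):

```python
import numpy as np

def project_points(points_world, R, t, f, c):
    """Project 3-D lesion points into the device's image plane with a
    simple pinhole model (no lens distortion).

    R, t: pose of the world frame in the device frame (rotation matrix
    and translation); f: focal length in pixels; c: (cx, cy) principal
    point. Returns an (N, 2) array of pixel coordinates (u, v)."""
    p = (np.asarray(R, float) @ np.asarray(points_world, float).T).T + t
    z = p[:, 2]                       # depth along the optical axis
    u = f * p[:, 0] / z + c[0]
    v = f * p[:, 1] / z + c[1]
    return np.stack([u, v], axis=1)
```

Rendering the lesion's outline at the projector's resolution with this projection yields the two-dimensional image that is projected in situ.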
When used for resection of a head tumor, the surgical navigation system of the embodiment observes the relative positions of the reflective stickers 113 on the patient's head and the reflective balls 111 through the camera assembly, calculates the spatial relationship between the patient and the balls 111, and completes preoperative registration. During intraoperative navigation, the camera assembly observes the reflective balls 111 in real time and calculates the spatial position of the information acquisition and projection device 12 relative to them; the system then renders a two-dimensional view, at the current viewpoint, of the three-dimensional tumor model reconstructed from the preoperative nuclear magnetic scan and projects it in situ through the coaxial projector assembly, helping the surgeon locate the tumor's position, size, and edge from various angles.
The surgical navigation system further includes an adjustable mounting platform (not shown in fig. 1) on which the information acquisition and projection device is mounted. The upper computer 13 is further configured to calculate a translation vector and a rotation vector from the projection position and to translate and rotate the mounting platform accordingly, so that the lens assembly is aimed at the projection position for projection.
The upper computer 13 includes a memory storing a three-dimensional image of the patient's lesion; the upper computer 13 determines the lesion image from that three-dimensional image.
The upper computer 13 is also configured to control the projector to project the lesion image 15 with alternating bright and dark light, so as to show the lesion and its edge on the patient's body.
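The alternating bright/dark projection can be sketched as a sequence of frames that toggle the lesion region on and off. The frame layout and intensity levels here are illustrative assumptions; the real system modulates the DMD output:

```python
import numpy as np

def blink_frames(lesion_mask, n_frames):
    """Alternating bright/dark projector frames for the lesion region, so
    its outline flickers against the body surface.

    lesion_mask: boolean 2-D array marking lesion pixels in projector
    coordinates. Returns a list of uint8 frames (even frames lit)."""
    frames = []
    for i in range(n_frames):
        level = 255 if i % 2 == 0 else 0    # toggle every frame
        frames.append(lesion_mask.astype(np.uint8) * level)
    return frames
```

Feeding such a sequence to the projector at a modest rate makes the lesion edge stand out against the tissue.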
The surgical navigation system provided by the embodiments of the invention is equipped with an image recognition system and a spatial positioning system for image recognition and spatial localization. Navigation is carried out using the identification and localization method for the reflective stickers 113 and reflective balls 111, a patient-to-reflective-ball 111 registration method based on multi-point identification and matching, and a method for rendering and projecting three-dimensional structural information based on in-situ projection.
The information acquisition and projection device acquires image information within its field of view and projects the required information (such as preoperatively reconstructed tumor morphology or important physiological structures) through the projector assembly. The identification and localization method for the position calibration assembly extracts one or more high-contrast circular patterns from a two-dimensional image and returns the pixel coordinates of their centers. The registration method based on multi-point identification and matching calculates the relative position of the lesion and the reflective balls, as a translation vector and a rotation vector, from the pixel coordinates and three-dimensional coordinates of the identified reflective points (the four reflective stickers 113 on the patient's head and the four reflective balls 111). The spatial-localization technique for the information acquisition and projection device, likewise based on multi-point matching, uses the pixel coordinates and three-dimensional coordinates of the four reflective balls to calculate the camera assembly's spatial position relative to them in real time, again as a translation vector and a rotation vector. The rendering and projection method based on in-situ projection draws a two-dimensional image of the three-dimensional model, from any spatial position, according to the current position and attitude of the information acquisition and projection device, and projects it in situ onto the model's spatial position.
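The marker-identification step, extracting a high-contrast circular pattern and returning the pixel coordinates of its center, can be sketched for a single marker as a threshold-and-centroid computation. This is a minimal sketch; a practical implementation would label multiple blobs (e.g. with OpenCV's blob detection) and return each center:

```python
import numpy as np

def bright_centroid(img, thresh):
    """Intensity-weighted centroid (row, col) of pixels brighter than
    `thresh` -- the high-contrast disk of one retro-reflective marker.

    img: 2-D grayscale array. Returns a (row, col) tuple of floats,
    or None if no pixel exceeds the threshold."""
    mask = img > thresh
    if not mask.any():
        return None
    rows, cols = np.nonzero(mask)
    w = img[rows, cols].astype(float)          # weight by brightness
    return (float((rows * w).sum() / w.sum()),
            float((cols * w).sum() / w.sum()))
```

Running this per detected blob yields the pixel coordinates that the multi-point matching step pairs with the known 3-D marker coordinates.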
Compared with a traditional display-based neurosurgical navigation system, the navigation system provided by the embodiment of the invention offers more intuitive and convenient interaction: the projection mode frees the doctor from repeatedly switching gaze between a display and the patient's surgical site. The surgical navigation system also breaks through the limitations of traditional surgery, increases the information available to the doctor during the operation, extends the concepts of surgical operation and surgical instruments, and is of great significance for improving surgical precision and reducing surgical trauma.
The surgical navigation system can project, from any position, two-dimensional image information of the position and outline of a lesion (such as a head tumor) at the current viewing angle, allowing a doctor to observe the projected tumor information on the site to be operated (such as the scalp) from multiple viewing angles; this aids preoperative planning of the surgical path and can also be used for surgical teaching. The system may likewise be used to determine the location of the surgical incision: wherever the information acquisition and projection device 12 is positioned, the position and shape of the preoperatively planned incision can be accurately projected onto the surface of the site to be operated (such as the scalp), assisting the doctor in determining the position, size and shape of the incision.
When the surgical navigation system is used for head tumor resection, it can determine the direction of the craniotomy drilling. When the incision is made during the operation, the projected tumor image and incision image can be superimposed by moving the device; the direction of the current system optical axis is then the doctor's craniotomy drilling direction. During intraoperative navigation, the system can project two-dimensional tumor morphology at any angle onto the brain tissue, enhancing contrast through an alternating bright-and-dark projection mode; this helps the doctor determine the tumor's position and edges intraoperatively and increases both the speed of the operation and the precision of tumor resection.
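The alternating bright-and-dark projection mode mentioned above can be pictured as flashing the lesion silhouette positive and negative in successive frames so that its edge flickers against the tissue. A toy sketch of such a frame sequence follows; the frame size, frame count and modulation scheme are illustrative assumptions, as the patent does not specify them:

```python
import numpy as np

def alternating_frames(mask, n_frames=6):
    """Return frames that flash the lesion mask bright-on-dark,
    then dark-on-bright, so its edge flickers against the tissue."""
    frames = []
    for i in range(n_frames):
        frame = mask if i % 2 == 0 else 1 - mask  # invert every other frame
        frames.append(frame.astype(np.uint8) * 255)
    return frames

mask = np.zeros((4, 4), dtype=np.uint8)
mask[1:3, 1:3] = 1                      # toy 2x2 "tumor" silhouette
frames = alternating_frames(mask, 4)
```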
As mentioned above, the device can be used for navigation in neurosurgical tumor resection, and also in operations on other rigid physiological structures, such as orthopedic, maxillofacial and oral surgery.
Through the information acquisition and projection device, the surgical navigation system can repeatedly acquire, in the same image, the pixel coordinates of the reflective stickers on the patient's head and of the reflective balls, calculate the spatial relationship between the patient and the reflective balls, and output a reprojection error for evaluating the accuracy of the patient-to-reflective-ball registration.
During navigation, the information acquisition and projection device also acquires the pixel coordinates of the reflective balls, calculates in real time its own spatial relationship to the reflective balls, and outputs a reprojection error for evaluating the positioning accuracy of the device.
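The reprojection error referred to in both passages is a standard accuracy metric: the registered three-dimensional marker positions are projected back through the camera model and compared with the detected pixel coordinates. A minimal numpy sketch under an assumed pinhole camera follows; the intrinsic matrix, pose and marker positions are illustrative values, not the device's calibration:

```python
import numpy as np

def reprojection_error(pts3d, pix, K, R, t):
    """RMS distance (pixels) between detected centers `pix` and the
    projections of the registered 3-D points `pts3d`."""
    cam = pts3d @ R.T + t                 # world -> camera frame
    uv = cam @ K.T                        # pinhole projection
    uv = uv[:, :2] / uv[:, 2:3]           # perspective divide
    return np.sqrt(np.mean(np.sum((uv - pix) ** 2, axis=1)))

K = np.array([[800.0, 0, 320], [0, 800.0, 240], [0, 0, 1]])  # assumed intrinsics
R = np.eye(3)
t = np.array([0.0, 0.0, 500.0])          # markers ~0.5 m from the camera
pts3d = np.array([[0, 0, 0], [50, 0, 0], [0, 50, 0], [50, 50, 0]], float)

# synthesize "detected" pixels from the model, then perturb each by 0.5 px
cam = pts3d @ R.T + t
uv = cam @ K.T
truth = uv[:, :2] / uv[:, 2:3]
pix = truth + 0.5
err = reprojection_error(pts3d, pix, K, R, t)   # ~0.707 px for this perturbation
```

A small reprojection error indicates that the estimated pose and the registration are mutually consistent with the observed images.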
One key application of the surgical navigation system is the resection of a neurosurgical brain glioma or dural tumor. After preoperative registration using the reflective stickers on the patient's head and the reflective balls, the camera assembly localizes the system in real time from the reflective balls, renders a two-dimensional image of the patient's tumor from the current position and the preoperatively built three-dimensional tumor model, and projects it onto the patient's scalp, skull, brain tissue or other surfaces to assist the surgeon in performing the operation.
The following describes in detail how the surgical navigation system of the embodiment of the present invention is used for resection of a brain glioma or dural tumor:
1. three hours before the operation, four reflective sticker bases are attached to the patient's scalp, roughly distributed around the tumor;
2. cod-liver oil particles are attached to the surface of each reflective sticker base for magnetic resonance imaging;
3. the patient undergoes magnetic resonance scanning, and a doctor marks the tumor extent and the cod-liver-oil particle positions on the scanned slices;
4. the scan data are reconstructed in three dimensions to obtain a three-dimensional model of the patient's tumor and the spatial relationship between the model and the four reflective stickers on the head;
5. the doctor plans the surgical path and incision in the three-dimensional model, and the incision information is stored in the model;
6. five minutes before the operation, four reflective stickers are attached to the surfaces of the bases, and a reference frame carrying the reflective balls is placed beside the patient's head on the operating table;
7. the system is started and automatically searches for the reflective balls and the reflective stickers on the patient's head within the camera assembly's field of view;
8. the system automatically completes registration of the patient's position from the positions of the reflective balls and the reflective stickers on the patient's head, and displays the registration reprojection error;
9. the system starts navigation: from the current position of the information acquisition and projection device and the preoperatively built three-dimensional tumor model, it renders a two-dimensional tumor image and a two-dimensional surgical incision image and projects them onto the surface of the patient's head;
10. the doctor begins the operation, determining the surgical path from the projected tumor and incision;
11. after tumor resection, tumor remnants can be checked against the projected tumor image;
while the doctor operates, the information acquisition and projection device can be moved, and the projected image moves accordingly with the system's position and angle. The doctor can judge the tumor's position and size from the projected image and perform the corresponding resection or other operations. When the distance between the information acquisition and projection device and the head is adjusted, the projected tumor image still coincides with the tumor's real outline in real time.
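The distance-invariance just described follows from the coaxial design: because the projector shares the lens with the camera, a world point rendered at its projected pixel lies on the projector ray through that pixel at any working distance, so the image stays anchored to the anatomy. A small numpy sketch of this pinhole-model property follows; the intrinsics, poses and the tumor-contour vertex are illustrative assumptions, not the device's calibration:

```python
import numpy as np

def project(K, R, t, X):
    """Projector pixel for world point X under pose (R, t)."""
    cam = R @ X + t
    uv = K @ cam
    return uv[:2] / uv[2]

def ray_hits(K, R, t, pixel, X, tol=1e-6):
    """True if the projector ray through `pixel` passes through world point X."""
    d_cam = np.linalg.inv(K) @ np.array([pixel[0], pixel[1], 1.0])
    d_world = R.T @ d_cam                  # ray direction in world frame
    o_world = -R.T @ t                     # projector center in world frame
    v = X - o_world
    # distance-scaled sine of the angle between the ray and the point
    return np.linalg.norm(np.cross(v, d_world)) / np.linalg.norm(v) < tol

K = np.array([[800.0, 0, 320], [0, 800.0, 240], [0, 0, 1]])
X = np.array([10.0, -5.0, 0.0])           # a tumor-contour vertex (mm, illustrative)

for dist in (400.0, 700.0):               # move the device closer / farther
    R = np.eye(3)
    t = np.array([0.0, 0.0, dist])
    px = project(K, R, t, X)
    assert ray_hits(K, R, t, px, X)       # the ray still passes through the vertex
```

The pixel coordinates change as the device moves, but the ray through each rendered pixel keeps intersecting the same anatomical point, which is why the projection stays superimposed on the real outline.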
In some implementations, if the information acquisition and projection device moves too fast during navigation and the camera assembly cannot recognize the reflective balls because the image is blurred, the system automatically searches for the reflective balls again and continues navigation.
In some implementations, if the power is interrupted or the computer crashes, navigation can continue after the software restarts using the most recently registered patient-to-reflective-ball relationship, without re-registration.
The embodiments in this description are described in a progressive manner, a parallel manner, or a combination of both; each embodiment focuses on its differences from the other embodiments, and identical or similar parts among the embodiments may be referred to one another.
It is further noted that, herein, relational terms such as first and second may be used solely to distinguish one entity or action from another, without necessarily requiring or implying any actual such relationship or order between those entities or actions. Likewise, the terms "comprises," "comprising," and any variations thereof are intended to cover a non-exclusive inclusion, such that an article or apparatus comprising a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such an article or apparatus. Without further limitation, an element preceded by the phrase "comprising a …" does not exclude the presence of additional identical elements in the article or device comprising that element.
The previous description of the disclosed embodiments is provided to enable any person skilled in the art to make or use the present application. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the application. Thus, the present application is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (9)

1. A surgical navigation system, comprising:
a position calibration component;
an information acquisition and projection device, comprising a camera assembly and a projector assembly integrated as a whole, wherein the camera assembly is configured to acquire image information of the position calibration component through a lens assembly, and the projector assembly is configured to project a lesion image onto a site to be operated of a patient through the lens assembly; and
an upper computer, configured to acquire the image information, determine the position of the patient's lesion in the current posture based on the image information, determine a projection position, and, based on the projection position, control the projector assembly to project the lesion image at a set position of the site to be operated.
2. The surgical navigation system of claim 1, wherein the camera assembly includes at least a camera CMOS chip;
the projector component at least comprises a projector DMD chip;
the lens assembly comprises a plurality of optical lenses with a common optical axis;
and the optical path of the camera CMOS chip and the optical path of the projector DMD chip are coupled onto the optical axis of the same lens assembly through a beam splitter.
3. The surgical navigation system of claim 2, wherein the beam splitter has a preset included angle with an optical axis of the lens assembly;
light rays carrying the image information and collected by the lens assembly pass through the beam splitter and are incident on the camera CMOS chip along an unchanged transmission direction;
and light rays emitted by the projector DMD chip and carrying the lesion image are reflected by the beam splitter and, with their propagation direction changed, irradiate the set position of the site to be operated along the optical axis of the lens assembly.
4. The surgical navigation system of claim 2, wherein the beam splitter has a preset included angle with an optical axis of the lens assembly;
light rays carrying the image information and collected by the lens assembly are reflected by the beam splitter and, with their propagation direction changed, are incident on the camera CMOS chip;
and light rays emitted by the projector DMD chip and carrying the lesion image pass through the beam splitter and irradiate the set position of the site to be operated along the optical axis of the lens assembly in an unchanged transmission direction.
5. The surgical navigation system of claim 1, wherein the position calibration assembly includes:
a plurality of reflective stickers for attachment to the site to be operated, the plurality of reflective stickers being arranged around the lesion; and
a plurality of reflective balls arranged on an operating table via a support;
wherein the upper computer is configured to determine the projection position based on the spatial relationship between the images of the reflective stickers and the images of the reflective balls in the image information, render a two-dimensional image of the preoperatively acquired three-dimensional lesion image at the viewing angle of the current projection position, and project the two-dimensional image, as the lesion image, onto the set position of the site to be operated.
6. The surgical navigation system of claim 5, wherein the reflective stickers are configured to be attached to the site to be operated via reflective sticker bases, and cod-liver oil particles for magnetic resonance imaging are attached to the surfaces of the reflective sticker bases;
and the three-dimensional image of the lesion is acquired before the operation by a magnetic resonance scanning device.
7. The surgical navigation system of claim 1, further comprising:
an adjustable mounting platform for mounting the information acquisition and projection device;
wherein the upper computer is further configured to calculate a translation vector and a rotation vector based on the projection position, and to translate and rotate the mounting platform according to the translation vector and the rotation vector, so that the lens assembly is aligned with the projection position for projection.
8. The surgical navigation system of claim 1, wherein the upper computer includes a memory storing a three-dimensional image of the patient's lesion; and the upper computer is configured to determine the lesion image based on the three-dimensional image.
9. The surgical navigation system of claim 1, wherein the upper computer is further configured to control the projector assembly to project the lesion image with alternating bright and dark light, so as to display the lesion and its edges.
CN201910972990.8A 2019-10-14 2019-10-14 Surgical navigation system Active CN110584783B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910972990.8A CN110584783B (en) 2019-10-14 2019-10-14 Surgical navigation system

Publications (2)

Publication Number Publication Date
CN110584783A (en) 2019-12-20
CN110584783B (en) 2021-03-09

Family

ID=68867084

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910972990.8A Active CN110584783B (en) 2019-10-14 2019-10-14 Surgical navigation system

Country Status (1)

Country Link
CN (1) CN110584783B (en)

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111588999A (en) * 2020-05-25 2020-08-28 李硕 Operation guide model, head-wearing wearable device-assisted operation navigation system and method
CN111603140A (en) * 2020-06-04 2020-09-01 北京瀚丞医用设备有限公司 In-situ visual positioning method and system for biological target
CN111839730A (en) * 2020-07-07 2020-10-30 厦门大学附属翔安医院 Photoacoustic imaging surgical navigation platform for guiding tumor resection
CN112168392A (en) * 2020-10-21 2021-01-05 雅客智慧(北京)科技有限公司 Dental navigation surgery registration method and system
CN112991457A (en) * 2021-02-22 2021-06-18 北京理工大学 Method and device for calibrating spatial position and internal and external parameters of projector in surgical navigation
CN113349927A (en) * 2021-05-19 2021-09-07 上海交通大学 Navigation system for face autologous fat transplantation operation and virtual scale prompting method
WO2022067608A1 (en) * 2020-09-30 2022-04-07 中国科学技术大学 Remote surgical guidance and teaching apparatus and method, and medical device
WO2023273013A1 (en) * 2021-06-29 2023-01-05 南京微纳科技研究院有限公司 Surgical navigation device and system
WO2024067753A1 (en) * 2022-09-29 2024-04-04 武汉联影智融医疗科技有限公司 Registration method, registration system, navigation information determination method, and navigation system

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103908345A (en) * 2012-12-31 2014-07-09 复旦大学 Volume data visualization method for surgical navigation based on PPC (Panel Personal Computer)
US20150145966A1 (en) * 2013-11-27 2015-05-28 Children's National Medical Center 3d corrected imaging
CN106344151A (en) * 2016-08-31 2017-01-25 北京市计算中心 Operation positioning method and system
US20170258528A1 (en) * 2015-08-24 2017-09-14 Synaptive Medical (Barbados) Inc. Medical imaging system for illuminating tissue samples using three-dimensional structured illumination microscopy
US20170312035A1 (en) * 2016-04-27 2017-11-02 Biomet Manufacturing, Llc Surgical system having assisted navigation
CN107468337A (en) * 2017-08-03 2017-12-15 苏州医视医疗科技有限公司 Surgical navigational platform based on intelligent glasses
CN108186117A (en) * 2018-02-28 2018-06-22 安徽大中润科技有限公司 A kind of distribution optical alignment tracking system and method
CN207837634U (en) * 2017-05-16 2018-09-11 合肥新迈美克医疗科技有限公司 A kind of common optical axis formula diseased region imaging projection navigation device
CN109674532A (en) * 2019-01-25 2019-04-26 上海交通大学医学院附属第九人民医院 Operation guiding system and its equipment, method and storage medium based on MR
CN109758231A (en) * 2019-03-05 2019-05-17 钟文昭 Operation piloting method and system in thoracic cavity based on mixed reality


Also Published As

Publication number Publication date
CN110584783B (en) 2021-03-09


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant