WO2021217713A1 - Surgical navigation system and computer and storage medium for executing surgical navigation method - Google Patents

Surgical navigation system and computer and storage medium for executing surgical navigation method

Info

Publication number
WO2021217713A1
WO2021217713A1 PCT/CN2020/089607 CN2020089607W WO2021217713A1 WO 2021217713 A1 WO2021217713 A1 WO 2021217713A1 CN 2020089607 W CN2020089607 W CN 2020089607W WO 2021217713 A1 WO2021217713 A1 WO 2021217713A1
Authority
WO
WIPO (PCT)
Prior art keywords
surgical
tracer
navigation
computer
information
Prior art date
Application number
PCT/CN2020/089607
Other languages
English (en)
French (fr)
Inventor
刘衍志
孙东辉
黄伟
朱圣晓
Original Assignee
深圳市鑫君特智能医疗器械有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳市鑫君特智能医疗器械有限公司 filed Critical 深圳市鑫君特智能医疗器械有限公司
Priority to EP20933979.5A priority Critical patent/EP4159149A4/en
Priority to US17/624,320 priority patent/US20220354580A1/en
Publication of WO2021217713A1 publication Critical patent/WO2021217713A1/zh

Links

Classifications

    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/10Computer-aided planning, simulation or modelling of surgical operations
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/90Identification means for patients or instruments, e.g. tags
    • A61B90/94Identification means for patients or instruments, e.g. tags coded with symbols, e.g. text
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B33ADDITIVE MANUFACTURING TECHNOLOGY
    • B33YADDITIVE MANUFACTURING, i.e. MANUFACTURING OF THREE-DIMENSIONAL [3-D] OBJECTS BY ADDITIVE DEPOSITION, ADDITIVE AGGLOMERATION OR ADDITIVE LAYERING, e.g. BY 3-D PRINTING, STEREOLITHOGRAPHY OR SELECTIVE LASER SINTERING
    • B33Y40/00Auxiliary operations or equipment, e.g. for material handling
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B17/00Surgical instruments, devices or methods, e.g. tourniquets
    • A61B17/34Trocars; Puncturing needles
    • A61B17/3403Needle locating or guiding means
    • A61B2017/3405Needle locating or guiding means using mechanical guide means
    • A61B2017/3407Needle locating or guiding means using mechanical guide means including a base for support on the body
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B17/00Surgical instruments, devices or methods, e.g. tourniquets
    • A61B17/56Surgical instruments or methods for treatment of bones or joints; Devices specially adapted therefor
    • A61B2017/568Surgical instruments or methods for treatment of bones or joints; Devices specially adapted therefor produced with shape and dimensions specific for an individual patient
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/10Computer-aided planning, simulation or modelling of surgical operations
    • A61B2034/101Computer-aided simulation of surgical operations
    • A61B2034/105Modelling of the patient, e.g. for ligaments or bones
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/10Computer-aided planning, simulation or modelling of surgical operations
    • A61B2034/107Visualisation of planned trajectories or target regions
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2046Tracking techniques
    • A61B2034/2055Optical tracking systems
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2046Tracking techniques
    • A61B2034/2065Tracking using image or pattern recognition
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2068Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis using pointers, e.g. pointers having reference marks for determining coordinates of body points
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/39Markers, e.g. radio-opaque or breast lesions markers
    • A61B2090/3904Markers, e.g. radio-opaque or breast lesions markers specially adapted for marking specified tissue
    • A61B2090/3916Bone tissue
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/39Markers, e.g. radio-opaque or breast lesions markers
    • A61B2090/3983Reference marker arrangements for use with image guided surgery
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/10Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges for stereotaxic surgery, e.g. frame-based stereotaxis
    • A61B90/11Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges for stereotaxic surgery, e.g. frame-based stereotaxis with guides for needles or instruments, e.g. arcuate slides or ball joints
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B33ADDITIVE MANUFACTURING TECHNOLOGY
    • B33YADDITIVE MANUFACTURING, i.e. MANUFACTURING OF THREE-DIMENSIONAL [3-D] OBJECTS BY ADDITIVE DEPOSITION, ADDITIVE AGGLOMERATION OR ADDITIVE LAYERING, e.g. BY 3-D PRINTING, STEREOLITHOGRAPHY OR SELECTIVE LASER SINTERING
    • B33Y80/00Products made by additive manufacturing

Definitions

  • the present invention relates to the technical field of medical devices, in particular to an orthopedic surgery navigation system usable during orthopedic surgery, and more particularly to an orthopedic surgery navigation system that combines a 3D-printed tracer with optical positioning technology, as well as a computer and storage medium for executing the navigation method.
  • the surgical navigation system, that is, a surgical system using computer-assisted technology, has now been used in spine, pelvic and thoracic surgery, joint surgery, trauma surgery, bone tumor surgery and orthopedics. It constructs a virtual operating environment from digital medical images and provides surgeons with visual support, making surgical operations more accurate, less invasive and safer. The technology tracks the surgical site and the surgical instruments in real time, much like navigation for aircraft and ships, and is therefore called a navigation system.
  • the current surgical navigation system generally adopts the following principle: a surgeon holds a modified surgical instrument with trackable markers to perform an operation on a patient's surgical target site.
  • the three-dimensional spatial positioning and aiming of the surgical instruments are monitored by a tracker connected to a computer.
  • the tracker must, through registration, accurately provide the relative relationship between the patient's anatomical position and the preoperative or intraoperative multimodal images, so as to guide the doctor in operating the surgical instruments and performing the corresponding surgical operations.
  • the surgical process generally follows this workflow: obtain preoperative patient images, such as CT or X-ray, and import them into the computer system for necessary processing, such as noise reduction and 3D reconstruction; preoperative planning, in which the doctor makes a surgical plan according to the patient's condition, such as screw placement position, angle and depth; intraoperative registration, in which intraoperative images and the positioning tracker are spatially matched (registered) with the preoperative images to obtain the spatial relationship between the surgical instruments and the patient's anatomy, and a simulation model is built in the monitoring computer to display the position of the surgical instruments in real time; and performing the operation, tracking the surgical instruments and the surgical site, and guiding the operation according to the preoperative plan.
  • a surgical navigation system disclosed in Chinese patent application No. 201810430822.1 includes an angle and orientation positioning device installed on the surgical instruments, two or more laser projection boards, two or more laser projection point collectors, and a computer; the angle and orientation positioning device can measure the current angle of the surgical instrument, and the laser projection point collectors can detect the change in position of the laser beam on the laser projection boards, from which the vertical displacement of the surgical instrument and hence its penetration depth are obtained.
  • the angular orientation positioning device installed on the surgical instrument includes a gyroscope, two or more lasers and other components, so both the equipment itself and its working principle are quite complicated, and registration is difficult.
  • the registration technology is the key technology of the navigation system; its purpose is to integrate the patient's preoperative medical images, the position information of the patient's anatomical structure obtained through the intraoperative positioning tracer, and the position information of the surgical instruments into the same spatial coordinate system.
  • a navigation tracer fixed on the patient is used for this, such as in Chinese patent application No. 201710970763.2, "A tracer for orthopedic surgery and its connecting structure", which includes a groove formed on the top for fixing the tracer, a connecting piece, and a fixing piece for fixing on the bone structure of the patient.
  • the tracers currently used are all pre-designed structures. On the one hand, they need to be fixed near the bone structure of the surgical site and require additional space, which enlarges the open area and increases the patient's pain and the difficulty of the operation; on the other hand, they cannot adapt well to different bone structures and are difficult to fix, which may reduce navigation accuracy and cause secondary intraoperative injury.
  • the current orthopedic surgery navigation systems have the disadvantages of navigation tracers that are inconvenient to fix, unstable and difficult to make minimally invasive, or suffer from complicated positioning of the surgical instruments.
  • a fluoroscopic registration step is generally required during surgery, which demands a larger operating space; the registration process also prolongs the operation time, and medical staff and patients are exposed to additional radiation.
  • the purpose of the present invention is to provide a surgical navigation system with simple and stable registration, high accuracy, and simple operation, as well as a computer device and a storage medium for performing surgical navigation.
  • a surgical navigation system is provided, which includes a spatial positioning marking element for installation on a surgical instrument, a tracer for fixing on a bone structure to be operated on, a binocular camera for binocular-vision spatial positioning, and a graphics workstation computer serving as a navigation terminal; the tracer is provided with at least one navigation tracking surface for navigation and positioning, the spatial positioning marking element includes at least one navigation surface for tracking surgical instruments, and the binocular camera is connected to the computer and transmits the collected information of the tracer and the spatial positioning marking element to the computer.
  • the tracer is provided with at least one bone fitting surface for fitting and fixing to the bone to be operated on; the bone fitting surface conforms closely to the bone structure to be operated on with very small error, and after the tracer and the bone structure are fixed together, the relative spatial position is unique, which is equivalent to an extension of the bone structure.
  • the spatial position of the navigation tracer surface in the tracer is known. Therefore, there is no need for fluoroscopic image support and registration during the operation, and direct tracking and positioning can be performed.
  • the tracer includes a surgical guide constructed by 3D printing, and the navigation tracking surface is provided on the surgical guide.
  • the tracer manufactured by 3D printing can easily and quickly achieve a perfect fit with the bone structure to be operated.
  • the surgical guide includes a guide body.
  • the navigation tracking surface is formed directly on the guide body, or the navigation tracking surface is arranged on a navigation tracking carrier, which is arranged on the guide main body; the navigation tracking surface is a plane bearing a visible light visual recognition tracking pattern, or a developing film with one or more feature points, or a tracking surface formed by multiple feature points.
  • the spatial positioning marking element is a polyhedron, and at least two navigation surfaces are provided.
  • the navigation surface can be completely photographed by binocular cameras during the operation, and spatial positioning can be performed by a computer.
  • the computer includes the following modules:
  • the data receiving and storing module is used to receive and store the information transmitted by the binocular camera
  • the image reconstruction module is used to import and reconstruct the three-dimensional model of the bone structure to be operated and the three-dimensional model of the surgical instrument, and use the information collected by the binocular camera to reconstruct the three-dimensional image and posture of the surgical instrument during the operation to realize visual navigation.
  • the present invention also provides a computer device, including a memory and a processor, with a computer program stored in the memory.
  • when the computer program is executed by the processor, the processor is caused to perform a surgical navigation method, and the method includes:
  • using a binocular vision positioning algorithm to calculate the spatial position information of the bones to be operated on and the surgical instruments, and fusing this information with the three-dimensional models in real time.
  • the tracer is provided with at least one navigation tracking surface that can cooperate with the optical positioning of the binocular camera
  • the spatial positioning marking element includes at least one navigation surface that can cooperate with the optical positioning of the binocular camera
  • the method further includes: before the operation, designing the three-dimensional model of the tracer in 3D from the three-dimensional model of the bone to be operated on, printing and manufacturing the tracer, and then importing the three-dimensional model of the tracer into the computer memory.
  • the tracer includes a surgical guide constructed by 3D printing, the navigation tracking surface is provided on the surgical guide, and the tracer is provided with at least one bone fitting surface for fitting and fixing to the bone to be operated on.
  • the method further includes: before the operation, performing three-dimensional scanning of the surgical instrument to obtain three-dimensional image information, calibrating the spatial positioning marker element, and importing it into the computer memory.
  • the spatial positioning marking element is a polyhedron, and at least two navigation surfaces are provided.
  • the computer includes a monitor for real-time display of the calculated dynamic image of the positional relationship between the bone and the surgical instrument.
  • the present invention also provides a computer-readable storage medium having a computer program stored on the computer-readable storage medium, and when the computer program is executed by a processor, the processor executes the above-mentioned surgical navigation method.
  • the present invention has the following advantages:
  • the technical scheme of the present invention is based on the digital navigation technology of visual positioning.
  • the binocular camera cooperates with the tracer on the bone and the spatial positioning marking element on the surgical instrument to collect real-time information of the tracer and the spatial positioning marking element; a binocular vision positioning algorithm calculates the spatial position information of the bone to be operated on and the surgical instrument and fuses it with the three-dimensional models in real time, so that a dynamic image of the real-time positional relationship between the bone and the surgical instrument is obtained and surgical navigation is realized. This simplifies the operation process, improves surgical accuracy, reduces surgical risk, and meets the need for personalized treatment.
  • the tracer uses 3D printing technology to construct a surgical guide that fits the bone surface; because the 3D-printed guide is customized according to the patient's bone structure, the fitting surface conforms closely to the patient's bone with minimal error and can lock stably onto complex bone structures. Once fixed to the bone structure, the relative spatial position of the navigation tracking surface is unique, so no registration and no intraoperative image support are needed and the operation can be tracked directly, which reduces intraoperative imaging steps, simplifies the operation process, and solves the problem of navigating and positioning the patient's anatomy and the surgical instruments in orthopedic surgery. At the same time, it optimizes the operation process and reduces the use of intraoperative images; visual navigation shortens the operation time, reduces the radiation received by the medical staff and the patient during the operation, and lowers surgical risk, making the system suitable for widespread use in small and medium-sized hospitals. In addition, the surgical guide generally overlaps with the area to be operated on, requiring no additional position fixation and making minimally invasive surgery possible.
  • Fig. 1 is a schematic diagram of the application of an embodiment of the surgical navigation system of the present invention
  • Fig. 2 is a first perspective view of one of the embodiments of the tracer in the surgical navigation system of the present invention
  • Figure 3 is a second perspective view of one of the embodiments of the tracer in the surgical navigation system of the present invention.
  • Figure 4 is a reverse perspective view of one of the embodiments of the tracer in the surgical navigation system of the present invention.
  • Figure 5 is a first application schematic diagram of the first embodiment of the tracer in the surgical navigation system of the present invention.
  • FIG. 6 is a second application schematic diagram of the first embodiment of the tracer in the surgical navigation system of the present invention.
  • FIG. 7 is a schematic diagram of the application of the second embodiment of the tracer in the surgical navigation system of the present invention.
  • FIG. 8 is a schematic diagram of the first embodiment of the spatial positioning marking element in the surgical navigation system of the present invention.
  • FIG. 9 is a schematic diagram of the second embodiment of the spatial positioning marking element in the surgical navigation system of the present invention.
  • Fig. 10 is a schematic flow chart of a surgical navigation method executed by a computer program in the present invention.
  • Fig. 11 is a schematic diagram of the principle of obtaining image information of the measured object by binocular stereo vision.
  • the surgical navigation system of the embodiment of the present invention includes a spatial positioning marking element (not labeled) for installation on the surgical instrument, a tracer (not shown) for fixing on the bone structure to be operated on, a binocular camera 501 for binocular-vision spatial positioning, and a graphics workstation computer 502 serving as a navigation terminal, into which a preoperative bone image 503 is input.
  • the binocular camera is connected to the computer, the collected information of the tracer and the spatial positioning marking element is transmitted to the computer, and visual principles are used to track the tracer and the surgical instrument for spatial positioning.
  • Y is the patient's surgical position and the reference coordinate of the surgical instrument.
  • the principle of binocular stereo vision to obtain the image information of the measured object is shown in Figure 11.
  • the main function of the graphics workstation computer 502 is image reconstruction: importing and reconstructing the patient's 3D bone structure model and the 3D models of the surgical instruments, and storing the three-dimensional images (and positioning information) of the various surgical instruments, which can be conveniently switched during the operation.
  • the computer also stores the preoperative CT 3D reconstruction images and the registration information of the 3D-printed tracer, and provides real-time image tracking and image fusion functions, covering the preoperative CT 3D images and the intraoperative 3D images of the surgical instruments, while receiving real-time images from the binocular camera.
  • the binocular camera 501 for binocular-vision spatial positioning is used to collect, in real time, the tracer information on the patient and the spatial positioning marking element information of the surgical instrument, and a binocular vision positioning algorithm is used to calculate the position and pose of the 3D-printed tracer and of the spatial positioning marking element of the surgical instrument.
  • the computer includes the following modules:
  • the data receiving and storing module is used to receive and store the information transmitted by the binocular camera
  • the image reconstruction module is used to import and reconstruct the three-dimensional model of the bone structure to be operated and the three-dimensional model of the surgical instrument, and use the information collected by the binocular camera to reconstruct the three-dimensional image and posture of the surgical instrument during the operation to realize visual navigation.
  • the tracer is provided with at least one navigation tracking surface used for navigation and positioning
  • the spatial positioning marking element includes at least one navigation surface used for tracking surgical instruments.
  • the tracer is 3D printed.
  • the first embodiment includes a guide plate body 100.
  • a bone fitting surface 101 is provided on the underside of the guide plate body; the surgical guide is produced from a 3D reconstruction of the preoperative bone images, and the bone fitting surface 101 fits completely against the bone structure surface 302 of the vertebra 300 to be operated on.
  • a navigation tracking surface 200 is also provided above the surgical guide, and the navigation tracking surface 200 is arranged directly on the surface of the guide plate body 100.
  • This embodiment is provided with a surgical guide pin hole 103 and a fixing hole 104.
  • the fixing hole 104 is used with screws and other fixing pins to strengthen the fixation on the vertebra 300 and make it more stable.
  • the surgical guide pin hole 103 is used for guiding surgical needles and other instruments.
  • the guide plate body 100 includes a main body 110 and a base 120; the fixing hole 104 is provided on the base 120, and the bone fitting surface 101 is formed on the bottom surface of the base 120, or, as required, the bone fitting surface 101 may be provided on any surface of the body 100 corresponding to the surface of the bone structure to be operated on.
  • the base 120 is provided with a guide needle tube 130 and a reinforcing beam 105 connecting the guide needle tube 130 and the main body 110, and the surgical guide needle hole 103 is provided in the guide needle tube 130.
  • the main body 110 is also provided with an arch 102 adapted to the protruding bone structure 301 on the vertebrae.
  • the navigation tracking surface 200 is a plane set on the top of the guide plate body 100; the navigation tracking surface 200 bears a visible light visual recognition tracking pattern 201, and the plane is formed directly on the guide plate body 100.
  • the tracer is produced by performing a 3D reconstruction of the bone structure from the patient's preoperative CT images and then designing, in 3D editing software, a reverse template consistent with the anatomical shape.
  • in addition to the bone fitting surface and auxiliary structures, a navigation tracking surface 200 bearing an optical visual recognition tracking pattern 201 is designed on the top face of the guide.
  • feature points, reflective points, etc. can also be set for intraoperative registration or navigation.
  • the technical scheme of the present invention enables the guide plate body and the navigation tracking surface it carries to be stably fixed to complex bone structures; it can be adapted to the bone structures of different parts of different patients, is not prone to deviation, has high navigation accuracy, reduces intraoperative imaging steps, and simplifies the operation process.
  • the registration and tracking of spatial positions within the surgical area can be realized through the pattern or characteristic marker points of the navigation surface, as sketched in the example below.
  • the navigation tracking surface is designed according to the navigation requirements; for example, the navigation tracking surface is a plane of at least 10 mm x 10 mm bearing a visible light visual recognition tracking pattern. There are also various technical methods for registration and tracking, such as X-ray and infrared. In other embodiments, there may be no fixing holes or surgical guide pin holes.
  • the surgical guide pin hole is used to guide surgical instruments during drilling or screw placement; when digital navigation or surgical robots are used, the position and angle of the drilling or screw placement have already been determined through preoperative or intraoperative planning, and the surgical guide pin hole may be omitted.
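The patent above does not name a specific algorithm for recovering the pose of the navigation tracking surface from its visible-light pattern. One common way to do this, shown here only as a hedged sketch, is to detect the pattern's corner points in a camera image and solve a Perspective-n-Point problem with OpenCV; the pattern geometry, calibration values, and point coordinates below are illustrative assumptions, not values from the patent.

```python
import numpy as np
import cv2

# Assumed geometry of the tracking pattern on the navigation tracking surface:
# four corners of a 10 mm x 10 mm planar pattern, expressed in the tracer's own frame.
object_points = np.array([[0, 0, 0],
                          [10, 0, 0],
                          [10, 10, 0],
                          [0, 10, 0]], dtype=np.float64)  # millimeters

# Corresponding corner locations detected in one camera image (hypothetical pixels).
image_points = np.array([[512.3, 340.1],
                         [548.7, 342.0],
                         [546.9, 378.4],
                         [510.8, 376.2]], dtype=np.float64)

camera_matrix = np.array([[1200.0, 0.0, 640.0],
                          [0.0, 1200.0, 360.0],
                          [0.0, 0.0, 1.0]])   # illustrative intrinsic calibration
dist_coeffs = np.zeros(5)                      # assume an undistorted image

# Solve the Perspective-n-Point problem: pose of the tracking pattern in the camera frame.
ok, rvec, tvec = cv2.solvePnP(object_points, image_points, camera_matrix, dist_coeffs)
R, _ = cv2.Rodrigues(rvec)   # 3x3 rotation of the navigation tracking surface
print(R, tvec)               # tracer orientation and position relative to the camera
```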
  • the second embodiment of the tracer includes a guide plate body 401 and a navigation tracking surface on it; the guide plate body 401 is fixed directly on the bone, with the bone fitting surface on the guide plate body fitting completely against the bone structure surface, so that the tracer can be clamped at the spinous process of the spine.
  • a platform 402 is provided on the guide plate body 401 as a navigation trace carrier, and a visible light visual recognition tracking pattern 403 is provided on the platform 402.
  • the bone fitting surface fits perfectly with the bone structure to be operated, and the error is small. After the tracer and the bone structure are fixed, the spatial relative position is unique, which is equivalent to the extension of the bone structure.
  • the spatial position of the navigation tracer surface in the tracer is known. Therefore, there is no need for fluoroscopic image support and registration during the operation, and direct tracking and positioning can be performed.
  • the spatial positioning marking element 600 installed on a specific part of the surgical instrument is a polyhedron and is provided with at least two navigation surfaces 601 and 602.
  • the navigation surface can be completely photographed by binocular cameras during the operation, and spatial positioning can be performed by a computer.
  • compared with the first embodiment of the spatial positioning marking element in FIG. 8, the second embodiment in FIG. 9 additionally has a positioning post 603 for fixed positioning.
  • the spatial positioning marking element 600 can also be manufactured using a 3D printing method.
  • a 3D structure of the spatial positioning marking element matching the 3D model of the surgical instrument is designed in the 3D editing software, and then manufactured and fixed on the surgical instrument by 3D printing.
  • the number of navigation surfaces depends on the operating requirements of the surgical instrument; the spatial positioning marking element may also have more than two navigation surfaces, for example three or four, and visual tracking and positioning requires that at least one navigation surface can be imaged completely.
  • the method of applying the surgical navigation system of the present invention for surgical navigation mainly includes:
  • the binocular vision positioning algorithm is used to calculate the spatial position information of the bone to be operated on and the surgical instrument and to fuse it with the three-dimensional models in real time; after fusion, a dynamic image of the real-time positional relationship between the bone and the surgical instrument is obtained to achieve surgical navigation.
  • the tracer and the spatial positioning marking elements of the surgical instruments adopt the tracer and polyhedral spatial positioning marking elements of the above-mentioned embodiments, and the cooperation between the binocular camera and the spatial positioning marking elements of the tracer and surgical instruments uses an optical positioning method.
  • the equipment occupies a small space and has high accuracy.
  • Binocular stereo vision is based on the parallax principle: imaging devices acquire two images of the measured object from different positions, and the three-dimensional geometric information of the object is obtained by calculating the positional deviation between corresponding points of the images, as shown in Figure 11 and in the sketch below.
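As a concrete illustration of the parallax principle just described, the following minimal sketch (Python with NumPy; the calibration values and coordinates are hypothetical) recovers the 3D position of one matched feature point from a rectified stereo pair using the depth-from-disparity relation Z = f * B / d.

```python
import numpy as np

# Hypothetical calibration of a rectified binocular camera (values are illustrative only).
f = 1200.0              # focal length in pixels
baseline = 0.12         # distance between the two camera centers, in meters
cx, cy = 640.0, 360.0   # principal point of the left camera, in pixels

def triangulate_point(u_left, v_left, u_right):
    """Recover the 3D position (in the left-camera frame) of one matched point
    from a rectified stereo pair, using the parallax (disparity) principle."""
    disparity = u_left - u_right            # horizontal position deviation between the two images
    if disparity <= 0:
        raise ValueError("non-positive disparity: point cannot be triangulated")
    z = f * baseline / disparity            # depth from disparity
    x = (u_left - cx) * z / f               # back-project the pixel to 3D
    y = (v_left - cy) * z / f
    return np.array([x, y, z])

# Example: a marker feature seen at (700, 400) in the left image and (652, 400) in the right image.
p = triangulate_point(700.0, 400.0, 652.0)
print(p)  # approximately [0.15, 0.10, 3.0] meters in front of the left camera
```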
  • taking spinal surgery navigation as an example, the patient is first diagnosed; bone image information is obtained through a CT scan before surgery, and a 3D reconstruction of the bone is performed with computer equipment. The 3D-printed tracer is produced by performing 3D reconstruction and editing of the bone structure from the preoperative images and then fabricating it with a 3D printer.
  • during the operation, the doctor makes an incision at the surgical site of the patient, peels off the soft tissue, and fixes the tracer on the bone structure of the patient.
  • the tracer thus becomes an extension of the bone structure, and the pattern on the navigation tracking surface can be positioned optically; the surgical instruments are scanned in three dimensions to obtain three-dimensional image information, and the spatial positioning marking elements are installed and calibrated. A surgical instrument fitted with a spatial positioning marking element, which contains 2 to 4 navigation surfaces, can be positioned optically in the same way as the 3D-printed tracer; the computer imports the 3D models of the bones, the tracers and the surgical instruments.
  • during the operation, the computer initializes the 3D model images, obtains image information through the binocular camera, and tracks the spatial position information of the tracer and the surgical instruments; the binocular camera connected to the computer transmits the collected video data to the computer in real time.
  • from the real-time images, the binocular vision positioning algorithm calculates the relative positional relationship between the patient's bones and the surgical instruments.
  • the position information of the surgical instruments and the patient's bone structure is fused with the bone 3D model and the surgical-instrument 3D models obtained from the preoperative images to obtain a dynamic image of the real-time positional relationship between the bones and the surgical instruments, which is displayed in real time on the monitor connected to the computer, so that the doctor can operate synchronously while observing the monitor, thereby realizing visual navigation of the operation.
  • before the operation, a CT scan and 3D reconstruction of the patient's surgical site are performed, and the 3D-printed tracer (surgical guide) is then designed and produced in the 3D editing software.
  • the bone model and the tracer model are imported into the graphics workstation computer of the surgical system before the operation.
  • the surgical instruments to be used in the operation are scanned in three dimensions and imported into the computer (or a surgical instrument library is prepared), and the navigation markers are calibrated so that the pose and position errors are within an acceptable range.
  • the computer mainly realizes real-time image fusion and quantitative monitoring of the position information of the surgical instruments: it first accepts an initial instruction to start the 3D image reconstruction of the patient and the 3D image fusion of the surgical instruments; during the operation, it receives the real-time images returned by the binocular camera, analyzes them to obtain the position information of the surgical site and the surgical instruments, fuses the 3D models, and displays the result on the navigation display.
  • the present invention also provides a computer device, including a memory and a processor, with a computer program stored in the memory; when the computer program is executed by the processor, the processor is caused to execute the above-mentioned surgical navigation method.
  • the present invention also provides a computer-readable storage medium having a computer program stored on the computer-readable storage medium, and when the computer program is executed by a processor, the processor executes the above-mentioned surgical navigation method.
  • the surgical navigation system of the present invention can be applied to spinal surgery, pelvic and thoracic spine surgery, joint surgery, trauma surgery, bone tumor and orthopedic surgery.

Landscapes

  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Surgery (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Veterinary Medicine (AREA)
  • Public Health (AREA)
  • Robotics (AREA)
  • Chemical & Material Sciences (AREA)
  • Manufacturing & Machinery (AREA)
  • Materials Engineering (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Pathology (AREA)
  • Apparatus For Radiation Diagnosis (AREA)

Abstract

The surgical navigation system includes a spatial positioning marking element (600) on a surgical instrument, a tracer on a bone structure, a binocular camera (501) and a computer (502). The tracer is provided with a navigation tracking surface (200), the spatial positioning marking element (600) includes a navigation surface (601), and the binocular camera (501) is connected to the computer (502) and transmits the collected information of the tracer and the spatial positioning marking element (600) to the computer (502). The computer (502) and a computer-readable storage medium each store a computer program that executes a surgical navigation method: generating three-dimensional models of the bone and the surgical instrument, and obtaining registration information of the tracer and calibration information of the spatial positioning marking element (600); receiving real-time information of the tracer and the spatial positioning marking element (600) collected by the binocular camera (501); and computing the spatial position information of the bone and the surgical instrument and fusing it with the three-dimensional models in real time to obtain a dynamic image of their real-time positional relationship. Surgical navigation is realized without intraoperative registration, which simplifies the surgical procedure and improves surgical accuracy.

Description

Surgical navigation system and computer and storage medium for executing surgical navigation method
Technical Field
The present invention relates to the technical field of medical devices, in particular to an orthopedic surgery navigation system usable during orthopedic surgery, and more particularly to an orthopedic surgery navigation system that combines a 3D-printed tracer with optical positioning technology, as well as a computer and storage medium for executing the navigation method.
Background Art
With the continuous development of modern medicine and computer technology, medical imaging and computer image processing technologies are gradually being applied in the medical field, and computer-assisted surgery has become a major direction of development in surgery. This technology extends the surgeon's limited visual range, breaks through the boundaries of traditional surgery, and redefines the concepts of surgery and surgical instruments. It is of great significance for improving surgical positioning accuracy, reducing surgical trauma, shortening operation time and increasing the success rate of surgery.
A surgical navigation system, that is, a surgical system employing computer-assisted technology, has now been applied in spine, pelvic and thoracic surgery, joint surgery, trauma surgery, bone tumor surgery and orthopedics. It constructs a virtual surgical environment from digital medical images and provides the surgeon with visual support, making operations more accurate, less invasive and safer. Because the technology tracks the surgical site and the surgical instruments in real time, much like navigation for aircraft and ships, it is called a navigation system.
Current surgical navigation systems generally operate on the following principle: the surgeon holds a modified surgical instrument bearing trackable markers and operates on the patient's surgical target site. The three-dimensional spatial positioning and aiming of the surgical instrument are monitored by a tracker connected to a computer, and the tracker must, through registration, accurately provide the relative relationship between the patient's anatomical position and the preoperative or intraoperative multimodal images, thereby guiding the doctor in operating the surgical instrument and performing the corresponding surgical operation.
The surgical procedure generally follows this workflow: acquire preoperative patient images, such as CT or X-ray, import them into the computer system and perform the necessary processing, such as denoising and 3D reconstruction; preoperative planning, in which the doctor formulates a surgical plan according to the patient's condition, such as screw placement position, angle and depth; intraoperative registration, in which intraoperative images and the positioning tracker are spatially matched (registered) with the preoperative images to obtain the spatial relationship between the surgical instruments and the patient's anatomy, and a simulation model is built in the monitoring computer to display the position of the surgical instruments in real time; and performing the operation, tracking the surgical instruments and the surgical site, and guiding the operation according to the preoperative plan.
Orthopedic surgical navigation systems mostly use electromagnetic positioning, ultrasonic positioning or optical positioning, and some use a gyroscope-based composite positioning method. For example, Chinese patent application No. 201810430822.1 discloses a surgical navigation system that includes an angular orientation positioning device mounted on the surgical instrument, two or more laser projection boards, two or more laser projection point collectors and a computer. The angular orientation positioning device can measure the current angle of the surgical instrument, and the laser projection point collectors can detect changes in the position of the laser beam on the laser projection boards, from which the vertical displacement of the surgical instrument and hence its penetration depth are obtained. However, this solution requires substantial modification of the surgical instruments, and the positioning equipment is very complex: the angular orientation positioning device mounted on the instrument includes a gyroscope, two or more lasers and other components, so both the equipment itself and its working principle are quite complicated, and registration is difficult.
Registration is the key technology of a navigation system. Its purpose is to integrate the patient's preoperative medical images, the position information of the patient's anatomy obtained through the intraoperative positioning tracer, and the position information of the surgical instruments into the same spatial coordinate system. A navigation tracer fixed to the patient is required for this, such as Chinese patent application No. 201710970763.2, "A tracer for orthopedic surgery and its connecting structure", which includes a groove formed at the top for fixing the tracer, a connecting piece and a fixing piece for attachment to the patient's bony structure. The tracers currently used are all pre-designed structures. On the one hand, they must be fixed near the bone structure of the surgical site and require additional space, which enlarges the open area and increases the patient's pain and the difficulty of the operation; on the other hand, they cannot adapt well to different bone structures, are difficult to fix, and may reduce navigation accuracy or cause secondary intraoperative injury. With this technique, fluoroscopic image registration is required during the operation, which increases the operation time and exposes medical staff and the patient to additional radiation; the fluoroscopy requires a C-arm X-ray machine, which occupies a large surgical space and hinders adoption in small hospitals.
Current orthopedic surgical navigation systems therefore suffer from navigation tracers that are inconvenient to fix, unstable and difficult to make minimally invasive, or from complex positioning of the surgical instruments; in addition, a fluoroscopic registration step is generally required during surgery, which demands a larger operating space, prolongs the operation, and exposes medical staff and patients to additional radiation.
There is therefore a need for a surgical navigation system, together with corresponding computer equipment and storage media, that offers simple and stable registration, high accuracy, ease of operation, minimally invasive surgery, flexible use of the operating space and low radiation exposure.
Technical Problem
The purpose of the present invention is to provide a surgical navigation system with simple and stable registration, high accuracy and simple operation, as well as a computer device and a storage medium for executing surgical navigation.
Technical Solution
To achieve the purpose of the present invention, the following technical solutions are provided:
A surgical navigation system is provided, which includes a spatial positioning marking element for installation on a surgical instrument, a tracer for fixing on the bone structure to be operated on, a binocular camera for binocular-vision spatial positioning, and a graphics workstation computer serving as a navigation terminal. The tracer is provided with at least one navigation tracking surface for navigation and positioning, the spatial positioning marking element includes at least one navigation surface for tracking the surgical instrument, and the binocular camera is connected to the computer and transmits the collected information of the tracer and the spatial positioning marking element to the computer.
In some embodiments, the tracer is provided with at least one bone fitting surface for fitting and fixing to the bone to be operated on. The bone fitting surface conforms closely to the bone structure to be operated on with very small error, so that once the tracer is fixed to the bone structure its relative spatial position is unique, making it effectively an extension of the bone structure. Since the spatial position of the navigation tracking surface on the tracer is known, no fluoroscopic image support or registration is needed during the operation, and tracking and positioning can be performed directly.
In some embodiments, the tracer includes a surgical guide constructed by 3D printing, and the navigation tracking surface is provided on the surgical guide. A tracer manufactured by 3D printing can easily and quickly achieve a close fit with the bone structure to be operated on.
The surgical guide includes a guide plate body. In some embodiments, the navigation tracking surface is formed directly on the guide plate body, or the navigation tracking surface is arranged on a navigation tracking carrier which is arranged on the guide plate body; the navigation tracking surface is a plane bearing a visible light visual recognition tracking pattern, or a developing film with one or more feature points, or a tracking surface formed by multiple feature points.
In some embodiments, the spatial positioning marking element is a polyhedron provided with at least two of the navigation surfaces. The navigation surfaces can be fully imaged by the binocular camera during the operation and spatially positioned by the computer.
In some embodiments, the computer includes the following modules:
a data receiving and storage module, configured to receive and store the information transmitted by the binocular camera;
an image reconstruction module, configured to import and reconstruct the three-dimensional model of the bone structure to be operated on and the three-dimensional model of the surgical instrument, and to use the information collected by the binocular camera to reconstruct the three-dimensional image and pose of the surgical instrument during the operation, so as to realize visual navigation.
The present invention also provides a computer device, including a memory and a processor, the memory storing a computer program which, when executed by the processor, causes the processor to execute a surgical navigation method, the method including:
receiving an image of the bone to be operated on and an image of the surgical instrument and generating three-dimensional models, and obtaining registration information of the tracer fixed on the bone and calibration information of the spatial positioning marking element on the surgical instrument;
receiving real-time information of the tracer and the spatial positioning marking element collected by the binocular camera;
according to the real-time information of the tracer and the spatial positioning marking element, computing the spatial position information of the bone to be operated on and the surgical instrument using a binocular vision positioning algorithm, and fusing it with the three-dimensional models in real time,
so that after fusion a dynamic image of the real-time positional relationship between the bone and the surgical instrument is obtained, thereby realizing surgical navigation.
In some embodiments, the tracer is provided with at least one navigation tracking surface that can cooperate with the optical positioning of the binocular camera, and the spatial positioning marking element includes at least one navigation surface that can cooperate with the optical positioning of the binocular camera.
In some embodiments, the method further includes: before the operation, designing a three-dimensional model of the tracer in 3D from the three-dimensional model of the bone to be operated on, manufacturing the tracer by 3D printing, and importing the three-dimensional model of the tracer into the computer memory.
In some embodiments, the tracer includes a surgical guide constructed by 3D printing, the navigation tracking surface is provided on the surgical guide, and the tracer is provided with at least one bone fitting surface for fitting and fixing to the bone to be operated on.
In some embodiments, the method further includes: before the operation, performing a three-dimensional scan of the surgical instrument to obtain three-dimensional image information, calibrating the spatial positioning marking element, and importing the results into the computer memory.
In some embodiments, the spatial positioning marking element is a polyhedron provided with at least two of the navigation surfaces.
In some embodiments, the computer includes a monitor for displaying in real time the computed dynamic image of the positional relationship between the bone and the surgical instrument.
The present invention also provides a computer-readable storage medium on which a computer program is stored; when the computer program is executed by a processor, the processor is caused to execute the above surgical navigation method.
Beneficial Effects
Compared with the prior art, the present invention has the following advantages:
The technical solution of the present invention is a digital navigation technology based on visual positioning. The binocular camera cooperates with the tracer on the bone and the spatial positioning marking element on the surgical instrument to collect real-time information of the tracer and the spatial positioning marking element; a binocular vision positioning algorithm computes the spatial position information of the bone to be operated on and the surgical instrument and fuses it with the three-dimensional models in real time, so that a dynamic image of the real-time positional relationship between the bone and the surgical instrument is obtained and surgical navigation is realized. This simplifies the surgical procedure, improves surgical accuracy, reduces surgical risk and meets the need for personalized treatment.
Further, the tracer uses 3D printing technology to construct a surgical guide that fits the bone surface. Because the 3D-printed guide is customized to the patient's bone structure, the fitting surface conforms closely to the patient's bone with minimal error and can lock stably onto complex bone structures. Once the navigation tracking surface is fixed to the bone structure its relative spatial position is unique, so no registration and no intraoperative image support are needed and the operation can be tracked directly, which reduces intraoperative imaging steps, simplifies the surgical procedure, and solves the problem of navigating and positioning the patient's anatomy and the surgical instruments in orthopedic surgery, while optimizing the surgical workflow and reducing the use of intraoperative images. Visual navigation shortens the operation time, reduces the radiation received by medical staff and the patient during surgery, and lowers surgical risk, making the system suitable for widespread use in small and medium-sized hospitals. In addition, the surgical guide generally coincides spatially with the area to be operated on and requires no additional fixation site, making minimally invasive surgery possible.
Brief Description of the Drawings
Fig. 1 is a schematic diagram of an application of an embodiment of the surgical navigation system of the present invention;
Fig. 2 is a first perspective view of a first embodiment of the tracer in the surgical navigation system of the present invention;
Fig. 3 is a second perspective view of the first embodiment of the tracer in the surgical navigation system of the present invention;
Fig. 4 is a reverse perspective view of the first embodiment of the tracer in the surgical navigation system of the present invention;
Fig. 5 is a first application schematic diagram of the first embodiment of the tracer in the surgical navigation system of the present invention;
Fig. 6 is a second application schematic diagram of the first embodiment of the tracer in the surgical navigation system of the present invention;
Fig. 7 is an application schematic diagram of a second embodiment of the tracer in the surgical navigation system of the present invention;
Fig. 8 is a schematic diagram of a first embodiment of the spatial positioning marking element in the surgical navigation system of the present invention;
Fig. 9 is a schematic diagram of a second embodiment of the spatial positioning marking element in the surgical navigation system of the present invention;
Fig. 10 is a schematic flowchart of the surgical navigation method executed by the computer program in the present invention;
Fig. 11 is a schematic diagram of the principle by which binocular stereo vision obtains image information of a measured object.
Embodiments of the Present Invention
Referring to Fig. 1, the surgical navigation system of an embodiment of the present invention includes a spatial positioning marking element (not labeled) for installation on a surgical instrument, a tracer (not shown) for fixing on the bone structure to be operated on, a binocular camera 501 for binocular-vision spatial positioning, and a graphics workstation computer 502 serving as a navigation terminal. A preoperative bone image 503 is input into the computer 502; the binocular camera is connected to the computer and transmits the collected information of the tracer and the spatial positioning marking element to the computer, and visual principles are used to track the tracer and the surgical instrument for spatial positioning. In the figure, Y denotes the reference coordinates of the patient's surgical position and the surgical instrument. The principle by which binocular stereo vision obtains image information of the measured object is shown in Fig. 11.
The main functions of the graphics workstation computer 502 are image reconstruction, importing and reconstructing the patient's 3D bone structure model and the 3D models of the surgical instruments, and storing the three-dimensional images (and positioning information) of the various surgical instruments so that they can be conveniently switched during the operation, for example storing the patient's preoperative CT 3D reconstruction and the registration information of the 3D-printed tracer. The computer also performs real-time image tracking and image fusion, covering the preoperative CT 3D images and the intraoperative 3D images of the surgical instruments, and receives real-time images from the binocular camera. The binocular camera 501 for binocular-vision spatial positioning collects in real time the information of the tracer on the patient and of the spatial positioning marking element on the surgical instrument; a binocular vision positioning algorithm computes the position and pose of the 3D-printed tracer and of the spatial positioning marking element of the surgical instrument, and the position information is fused with the three-dimensional images in real time to reconstruct the 3D image and pose of the surgical instrument in the computer, thereby realizing visual navigation during the operation. In critical operations the computer provides boundary recognition or quantified-deviation functions to support safe surgical operation.
The computer includes the following modules:
a data receiving and storage module, configured to receive and store the information transmitted by the binocular camera;
an image reconstruction module, configured to import and reconstruct the three-dimensional model of the bone structure to be operated on and the three-dimensional model of the surgical instrument, and to use the information collected by the binocular camera to reconstruct the three-dimensional image and pose of the surgical instrument during the operation, so as to realize visual navigation.
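The two modules listed above can be read as a small software interface. The following Python sketch illustrates one possible shape for the data receiving and storage module and the image reconstruction module; the class and method names are assumptions for illustration and are not taken from the patent.

```python
from dataclasses import dataclass, field
from typing import Dict, List, Tuple
import numpy as np

@dataclass
class DataReceivingStorageModule:
    """Receives frames transmitted by the binocular camera and stores them."""
    frames: List[Tuple[float, np.ndarray, np.ndarray]] = field(default_factory=list)

    def receive(self, left_image: np.ndarray, right_image: np.ndarray, timestamp: float) -> None:
        # store the stereo pair together with its acquisition time
        self.frames.append((timestamp, left_image, right_image))

@dataclass
class ImageReconstructionModule:
    """Imports the bone and instrument 3D models and updates the instrument pose for display."""
    bone_model: np.ndarray = None                                   # vertices of the bone surface
    instrument_models: Dict[str, np.ndarray] = field(default_factory=dict)

    def import_models(self, bone_model: np.ndarray,
                      instrument_models: Dict[str, np.ndarray]) -> None:
        self.bone_model = bone_model
        self.instrument_models = instrument_models

    def update_instrument_pose(self, name: str, R: np.ndarray, t: np.ndarray) -> np.ndarray:
        # place the instrument model into the navigation scene using the pose
        # computed from the binocular camera information (R: 3x3 rotation, t: translation)
        return self.instrument_models[name] @ R.T + t
```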
Referring to Figs. 2 to 9, which show embodiments of the tracer and of the spatial positioning marking element, the tracer is provided with at least one navigation tracking surface for navigation and positioning, and the spatial positioning marking element includes at least one navigation surface for tracking the surgical instrument.
Referring to Figs. 2 to 7, the tracer is 3D printed. The first embodiment includes a guide plate body 100 with a bone fitting surface 101 on its underside; the surgical guide is fabricated from a 3D reconstruction of the preoperative bone images, so that the bone fitting surface 101 fits completely against the bone structure surface 302 of the vertebra 300 to be operated on. A navigation tracking surface 200 is also provided on top of the surgical guide, arranged directly on the surface of the guide plate body 100.
This embodiment is provided with a surgical guide pin hole 103 and a fixing hole 104. The fixing hole 104 is used with screws or other fixing pins to reinforce the fixation on the vertebra 300 and make it more stable, and the surgical guide pin hole 103 is used to guide surgical needles and other instruments. Specifically, in this embodiment the guide plate body 100 includes a main body 110 and a base 120; the fixing hole 104 is provided on the base 120, and the bone fitting surface 101 is formed on the bottom surface of the base 120, or, as required, the bone fitting surface 101 may be provided on any surface of the body 100 corresponding to the surface of the bone structure to be operated on. The base 120 is provided with a guide needle tube 130 and a reinforcing beam 105 connecting the guide needle tube 130 to the main body 110, and the surgical guide pin hole 103 is located in the guide needle tube 130. The main body 110 is also provided with an arch 102 adapted to the protruding bone structure 301 on the vertebra.
In this embodiment, the navigation tracking surface 200 is a plane arranged on the top of the guide plate body 100; a visible light visual recognition tracking pattern 201 is affixed to the navigation tracking surface 200, and the plane is formed directly on the guide plate body 100.
The tracer is produced by performing a 3D reconstruction of the bone structure from the patient's preoperative CT images and then designing, in 3D editing software, a reverse template matching the anatomical shape; in addition to the bone fitting surface and auxiliary structures, a navigation tracking surface 200 bearing an optical visual recognition tracking pattern 201 is designed on the top face of the guide. In other embodiments, feature points, reflective points and the like may also be provided for intraoperative registration or navigation.
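As an illustration of the preoperative step just described (3D reconstruction of the bone from CT, followed by design of the printed guide), the sketch below extracts a bone surface mesh from a CT volume and exports it for the 3D editing and printing stage. scikit-image and trimesh are used here only as example tools, and the threshold value, file name and synthetic volume are assumptions rather than the patent's actual workflow.

```python
import numpy as np
from skimage import measure
import trimesh

def reconstruct_bone_surface(ct_volume: np.ndarray, voxel_spacing=(1.0, 1.0, 1.0),
                             bone_threshold_hu: float = 300.0) -> trimesh.Trimesh:
    """Extract a triangulated bone surface from a CT volume (values in Hounsfield units)."""
    # marching cubes gives the isosurface at the chosen bone threshold
    verts, faces, normals, _ = measure.marching_cubes(ct_volume, level=bone_threshold_hu,
                                                      spacing=voxel_spacing)
    return trimesh.Trimesh(vertices=verts, faces=faces, vertex_normals=normals)

# Example with a synthetic volume standing in for the patient CT.
ct = np.random.normal(loc=0.0, scale=50.0, size=(64, 64, 64))
ct[20:44, 20:44, 20:44] = 800.0          # a block of "bone" density
mesh = reconstruct_bone_surface(ct, voxel_spacing=(0.5, 0.5, 0.5))

# The exported mesh would then be loaded into 3D editing software to design the
# reverse template (bone fitting surface) and the navigation tracking surface.
mesh.export("bone_surface.stl")
```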
The technical solution of the present invention allows the guide plate body and the navigation tracking surface it carries to be fixed stably to complex bone structures and adapted to the bone structures of different sites in different patients; it is not prone to drift, offers high navigation accuracy, reduces intraoperative imaging steps and simplifies the surgical procedure. Registration and tracking of spatial positions within the surgical field can be achieved through the pattern or characteristic marker points on the navigation surface. The navigation tracking surface is designed according to the navigation requirements; for example, it may be a plane of at least 10 mm x 10 mm bearing a visible light visual recognition tracking pattern. Various technical methods, such as X-ray or infrared, may also be used for registration and tracking. In other embodiments there may be no fixing hole or surgical guide pin hole. The surgical guide pin hole serves to guide surgical instruments during drilling or screw placement; when digital navigation or a surgical robot is used, the drilling or screw-placement position and angle have already been determined by preoperative or intraoperative planning, and the guide pin hole may be omitted.
As shown in Fig. 7, the second embodiment of the tracer includes a guide plate body 401 and a navigation tracking surface on it. The guide plate body 401 is fixed directly on the bone: the bone fitting surface on the guide plate body fits completely against the bone structure surface and is thereby fixed, so that the tracer can be clamped at the spinous process of the spine. A platform 402 is provided on the guide plate body 401 as the navigation tracking carrier, and a visible light visual recognition tracking pattern 403 is provided on the platform 402.
The bone fitting surface conforms closely to the bone structure to be operated on with very small error; once the tracer is fixed to the bone structure its relative spatial position is unique, making it effectively an extension of the bone structure. Since the spatial position of the navigation tracking surface on the tracer is known, no fluoroscopic image support or registration is needed during the operation, and tracking and positioning can be performed directly.
Referring to Figs. 8 and 9, the spatial positioning marking element 600 installed on a specific part of the surgical instrument is a polyhedron provided with at least two navigation surfaces 601, 602. The navigation surfaces can be fully imaged by the binocular camera during the operation and spatially positioned by the computer. Compared with the first embodiment of the spatial positioning marking element in Fig. 8, the second embodiment in Fig. 9 additionally has a positioning post 603 for fixed positioning.
The spatial positioning marking element 600 may also be manufactured by 3D printing: a 3D structure of the spatial positioning marking element matching the 3D model of the surgical instrument is designed in 3D editing software and then manufactured by 3D printing and fixed on the surgical instrument. The spatial positioning marking element may also have more than two navigation surfaces, for example three or four; the number of navigation surfaces depends on the operating requirements of the surgical instrument, and visual tracking and positioning requires that at least one navigation surface can be imaged completely.
Referring to Figs. 1 to 10, the method of performing surgical navigation with the surgical navigation system of the present invention mainly includes:
receiving an image of the bone to be operated on and an image of the surgical instrument and generating three-dimensional models, and obtaining registration information of the tracer fixed on the bone and calibration information of the spatial positioning marking element on the surgical instrument;
receiving real-time information of the tracer and the spatial positioning marking element collected by the binocular camera;
according to the real-time information of the tracer and the spatial positioning marking element, computing the spatial position information of the bone to be operated on and the surgical instrument using a binocular vision positioning algorithm, and fusing it with the three-dimensional models in real time to obtain a dynamic image of the real-time positional relationship between the bone and the surgical instrument, thereby realizing surgical navigation.
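The fusion step described above amounts to expressing the tracked instrument pose in the coordinate frame of the preoperative bone model. The sketch below shows one way that transform composition can be written, assuming the camera-frame poses of the tracer and of the instrument marker have already been estimated (for example by pattern-based pose estimation as sketched earlier) and that the tracer-to-bone transform is known from the 3D-printed guide design; all function and variable names are illustrative.

```python
import numpy as np

def to_homogeneous(R: np.ndarray, t: np.ndarray) -> np.ndarray:
    """Pack a 3x3 rotation matrix and a translation vector into a 4x4 rigid transform."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t.ravel()
    return T

def instrument_in_bone_frame(T_cam_tracer: np.ndarray,
                             T_bone_tracer: np.ndarray,
                             T_cam_instrument: np.ndarray) -> np.ndarray:
    """Pose of the instrument expressed in the preoperative bone-model frame.

    T_cam_tracer:      tracer pose in the camera frame (from the navigation tracking surface)
    T_bone_tracer:     tracer pose in the bone-model frame (known from the printed guide design)
    T_cam_instrument:  instrument marker pose in the camera frame
    """
    # camera frame expressed in the bone-model frame
    T_bone_cam = T_bone_tracer @ np.linalg.inv(T_cam_tracer)
    # instrument pose re-expressed in the bone-model frame, ready to be fused
    # with the preoperative bone 3D model for display
    return T_bone_cam @ T_cam_instrument
```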
The tracer and the spatial positioning marking elements of the surgical instruments are the tracer and polyhedral spatial positioning marking elements of the embodiments described above, and the cooperation between the binocular camera and these elements uses an optical positioning method: the target is observed by two or more cameras and its spatial position is reconstructed using visual principles; the equipment occupies little space and offers high accuracy. Binocular stereo vision is based on the parallax principle: two images of the measured object are acquired from different positions with imaging devices, and the three-dimensional geometric information of the object is obtained by computing the positional deviation between corresponding points in the images, as shown in Fig. 11.
Specifically, as shown in Figs. 1 and 10, taking spinal surgery navigation as an example, the patient is first diagnosed; bone image information is obtained by CT scan before the operation, and a 3D reconstruction of the bone is performed with computer equipment. The 3D-printed tracer is produced by reconstructing and editing the bone structure in 3D from the preoperative images and then fabricating it with a 3D printer. During the operation the doctor makes an incision at the patient's surgical site, strips away the soft tissue and fixes the tracer on the patient's bone structure, so that the tracer becomes an extension of the bone structure and the pattern on the navigation tracking surface can be positioned optically. The surgical instruments are scanned in three dimensions to obtain three-dimensional image information, and the spatial positioning marking elements are installed and calibrated; a surgical instrument fitted with a spatial positioning marking element, which contains 2 to 4 navigation surfaces, can be positioned optically in the same way as the 3D-printed tracer. The computer imports the 3D models of the bone, the tracer and the surgical instruments.
During the operation, the computer initializes the 3D model images and acquires image information through the binocular camera to track the spatial position of the tracer and the surgical instruments; the binocular camera connected to the computer transmits the captured video data to the computer in real time. From the real-time images the computer uses the binocular vision positioning algorithm to compute the relative positional relationship between the patient's bones and the surgical instruments, fuses the position information of the surgical instruments and the patient's bone structure with the bone 3D model and the surgical-instrument 3D models obtained from the preoperative images to obtain a dynamic image of the real-time positional relationship between the bones and the surgical instruments, and displays this dynamic image in real time on the monitor connected to the computer, so that the doctor can operate synchronously while observing the monitor, thereby realizing visual navigation of the operation.
Before the operation, a CT scan and 3D reconstruction of the patient's surgical site are performed, and the 3D-printed tracer (surgical guide) is then designed and fabricated in 3D editing software; the bone model and the tracer model are imported into the graphics workstation computer of the surgical system before the operation. Also before the operation, the surgical instruments to be used are scanned in three dimensions and imported into the computer (or a library of surgical instruments is prepared), and the navigation markers are calibrated so that their pose and position errors are within an acceptable range.
The computer mainly performs real-time image fusion and quantitative monitoring of the position information of the surgical instruments: it first accepts an initial instruction to start the 3D image reconstruction of the patient and the 3D image fusion of the surgical instruments; during the operation it receives the real-time images returned by the binocular camera, analyzes them to obtain the position information of the surgical site and the surgical instruments, fuses this with the 3D models, and displays the result on the navigation display.
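One simple form of the quantitative monitoring mentioned above is measuring how far the tracked instrument tip deviates from a planned trajectory, such as a planned screw axis from preoperative planning. The following sketch is an assumed illustration of such a check, not the patent's specific boundary-recognition function; the threshold and coordinates are made up for the example.

```python
import numpy as np

def deviation_from_planned_axis(tip_bone: np.ndarray,
                                entry_point: np.ndarray,
                                axis_direction: np.ndarray) -> float:
    """Perpendicular distance (in the model's units, e.g. mm) of the tracked
    instrument tip from a planned trajectory given by an entry point and a direction."""
    d = axis_direction / np.linalg.norm(axis_direction)
    v = tip_bone - entry_point
    # component of v perpendicular to the planned axis
    return float(np.linalg.norm(v - np.dot(v, d) * d))

# Example: tip expressed in the bone-model frame, planned axis from preoperative planning.
tip = np.array([12.0, 3.5, 40.0])
entry = np.array([10.0, 4.0, 38.0])
direction = np.array([0.0, 0.0, 1.0])
offset = deviation_from_planned_axis(tip, entry, direction)
if offset > 2.0:   # illustrative safety threshold in millimeters
    print(f"warning: instrument is {offset:.1f} mm off the planned trajectory")
```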
The present invention also provides a computer device, including a memory and a processor, the memory storing a computer program which, when executed by the processor, causes the processor to execute the above surgical navigation method.
The present invention also provides a computer-readable storage medium on which a computer program is stored; when the computer program is executed by a processor, the processor is caused to execute the above surgical navigation method.
The surgical navigation system of the present invention can be applied to spinal surgery, pelvic and thoracic spine surgery, joint surgery, trauma surgery, bone tumor surgery and orthopedic surgery.
The above are only preferred embodiments of the present invention; the scope of protection of the present invention is not limited thereto, and any equivalent transformation based on the technical solution of the present invention falls within the scope of protection of the present invention.

Claims (10)

  1. A surgical navigation system, characterized in that it includes a spatial positioning marking element for installation on a surgical instrument, a tracer for fixing on the bone structure to be operated on, a binocular camera for binocular-vision spatial positioning, and a graphics workstation computer serving as a navigation terminal; the tracer is provided with at least one navigation tracking surface for navigation and positioning, the spatial positioning marking element includes at least one navigation surface for tracking the surgical instrument, and the binocular camera is connected to the computer and transmits the collected information of the tracer and the spatial positioning marking element to the computer.
  2. The surgical navigation system according to claim 1, characterized in that the tracer is provided with at least one bone fitting surface for fitting and fixing to the bone to be operated on.
  3. The surgical navigation system according to claim 2, characterized in that the tracer includes a surgical guide constructed by 3D printing, and the navigation tracking surface is provided on the surgical guide.
  4. The surgical navigation system according to claim 3, characterized in that the surgical guide includes a guide plate body; the navigation tracking surface is formed directly on the guide plate body, or the navigation tracking surface is arranged on a navigation tracking carrier which is arranged on the guide plate body; and the navigation tracking surface is a plane bearing a visible light visual recognition tracking pattern, or a developing film with one or more feature points, or a tracking surface formed by multiple feature points.
  5. The surgical navigation system according to claim 1, characterized in that the spatial positioning marking element is a polyhedron provided with at least two of the navigation surfaces.
  6. The surgical navigation system according to claim 1, characterized in that the computer includes the following modules:
    a data receiving and storage module, configured to receive and store the information transmitted by the binocular camera;
    an image reconstruction module, configured to import and reconstruct the three-dimensional model of the bone structure to be operated on and the three-dimensional model of the surgical instrument, and to use the information collected by the binocular camera to reconstruct the three-dimensional image and pose of the surgical instrument during the operation, so as to realize visual navigation.
  7. A computer device for executing a surgical navigation method, including a memory and a processor, the memory storing a computer program, characterized in that the computer device is used in the surgical navigation system according to any one of claims 1 to 6, and when the computer program is executed by the processor, the processor is caused to execute a surgical navigation method, the method including:
    receiving an image of the bone to be operated on and an image of the surgical instrument and generating three-dimensional models, and obtaining registration information of the tracer fixed on the bone and calibration information of the spatial positioning marking element on the surgical instrument;
    receiving real-time information of the tracer and the spatial positioning marking element collected by the binocular camera;
    according to the real-time information of the tracer and the spatial positioning marking element, computing the spatial position information of the bone to be operated on and the surgical instrument using a binocular vision positioning algorithm, and fusing it with the three-dimensional models in real time,
    so that after fusion a dynamic image of the real-time positional relationship between the bone and the surgical instrument is obtained, thereby realizing surgical navigation.
  8. The computer device according to claim 7, characterized in that the tracer includes a surgical guide constructed by 3D printing, the surgical guide being provided with at least one navigation tracking surface that can cooperate with the optical positioning of the binocular camera, and the tracer being provided with at least one bone fitting surface for fitting and fixing to the bone to be operated on; and the spatial positioning marking element is a polyhedron provided with at least two navigation surfaces that can cooperate with the optical positioning of the binocular camera.
  9. The computer device according to claim 8, characterized in that the method further includes: before the operation, designing a three-dimensional model of the tracer in 3D from the three-dimensional model of the bone to be operated on, manufacturing the tracer by 3D printing, and importing the three-dimensional model of the tracer into the computer memory; and, before the operation, performing a three-dimensional scan of the surgical instrument to obtain three-dimensional image information, calibrating the spatial positioning marking element, and importing the results into the computer memory.
  10. A computer-readable storage medium on which a computer program is stored, characterized in that the computer-readable storage medium is used in the computer device of the surgical navigation system according to any one of claims 1 to 6, and when the computer program is executed by a processor, the processor is caused to execute a surgical navigation method:
    receiving an image of the bone to be operated on and an image of the surgical instrument and generating three-dimensional models, and obtaining registration information of the tracer fixed on the bone and calibration information of the spatial positioning marking element on the surgical instrument;
    receiving real-time information of the tracer and the spatial positioning marking element collected by the binocular camera;
    according to the real-time information of the tracer and the spatial positioning marking element, computing the spatial position information of the bone to be operated on and the surgical instrument using a binocular vision positioning algorithm, and fusing it with the three-dimensional models in real time,
    so that after fusion a dynamic image of the real-time positional relationship between the bone and the surgical instrument is obtained, thereby realizing surgical navigation.
PCT/CN2020/089607 2020-04-26 2020-05-11 Surgical navigation system and computer and storage medium for executing surgical navigation method WO2021217713A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP20933979.5A EP4159149A4 (en) 2020-04-26 2020-05-11 SURGICAL NAVIGATION SYSTEM, COMPUTER FOR CARRYING OUT A SURGICAL NAVIGATION METHOD AND STORAGE MEDIUM
US17/624,320 US20220354580A1 (en) 2020-04-26 2020-05-11 Surgical navigation system, computer for performing surgical navigation method, and storage medium

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010339289.5A CN111388087A (zh) 2020-04-26 2020-04-26 手术导航系统及执行手术导航方法的计算机与存储介质
CN202010339289.5 2020-04-26

Publications (1)

Publication Number Publication Date
WO2021217713A1 true WO2021217713A1 (zh) 2021-11-04

Family

ID=71411643

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/089607 WO2021217713A1 (zh) 2020-04-26 2020-05-11 手术导航系统及执行手术导航方法的计算机与存储介质

Country Status (4)

Country Link
US (1) US20220354580A1 (zh)
EP (1) EP4159149A4 (zh)
CN (1) CN111388087A (zh)
WO (1) WO2021217713A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114224428A (zh) * 2021-12-31 2022-03-25 杭州三坛医疗科技有限公司 一种截骨平面定位方法、系统及装置

Families Citing this family (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111658148A (zh) * 2020-07-14 2020-09-15 山东威高医疗科技有限公司 一种与电磁导航系统和c臂配合使用的空间定位器
CN112002018A (zh) * 2020-08-18 2020-11-27 云南省第一人民医院 一种基于混合现实的术中位置导航系统、装置及方法
CN112155727A (zh) * 2020-08-31 2021-01-01 上海市第一人民医院 基于三维模型的手术导航系统、方法、设备和介质
CN112006780B (zh) * 2020-09-07 2022-04-08 哈尔滨工业大学 一种微创手术机器人系统及人工耳蜗微创植入手术装置
CN112288887B (zh) * 2020-10-15 2023-12-01 雅客智慧(北京)科技有限公司 磨削模拟方法、装置、电子设备和存储介质
CN113509263B (zh) * 2021-04-01 2024-06-14 上海复拓知达医疗科技有限公司 一种物体空间校准定位方法
CN113674849A (zh) * 2021-06-22 2021-11-19 广州诺曼数字化医疗科技有限公司 一种三维扫描辅助手术导板定位系统及定位方法
CN113545848B (zh) * 2021-07-15 2022-11-15 北京长木谷医疗科技有限公司 导航导板的配准方法及配准装置
CN113520619A (zh) * 2021-08-26 2021-10-22 重庆市妇幼保健院 用于三维医疗影像系统与双目视觉系统配准的标记元件及其装配方法
CN113855236B (zh) * 2021-09-03 2022-05-31 北京长木谷医疗科技有限公司 手术机器人追踪和移动的方法及系统
CN113842213B (zh) * 2021-09-03 2022-10-11 北京长木谷医疗科技有限公司 手术机器人导航定位方法及系统
CN113768628B (zh) * 2021-09-27 2023-12-12 武汉联影智融医疗科技有限公司 磨骨量安全范围确定方法和电子装置
CN118055738A (zh) * 2021-10-19 2024-05-17 史密夫和内修有限公司 用于在医疗程序中跟踪对象的标记
CN114159157A (zh) * 2021-12-06 2022-03-11 北京诺亦腾科技有限公司 辅助移动器械的方法、装置、设备及存储介质
CN114224489B (zh) * 2021-12-12 2024-02-13 浙江德尚韵兴医疗科技有限公司 用于手术机器人的轨迹跟踪系统及利用该系统的跟踪方法
CN114027828A (zh) * 2021-12-15 2022-02-11 杭州柳叶刀机器人有限公司 膝关节间隙测量方法、装置、终端设备和可读存储介质
CN115721417B (zh) * 2022-09-09 2024-01-30 苏州铸正机器人有限公司 一种手术机器人末端位姿全视野测量装置及方法
CN115568914B (zh) * 2022-10-08 2024-05-24 上海宇度医学科技股份有限公司 一种女性盆底重建术定位系统
CN115624384B (zh) * 2022-10-18 2024-03-22 方田医创(成都)科技有限公司 基于混合现实技术的手术辅助导航系统、方法和存储介质

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105411678A (zh) * 2014-09-16 2016-03-23 X-Nav技术有限责任公司 医疗程序期间用于确定和追踪运动的系统
US20160191887A1 (en) * 2014-12-30 2016-06-30 Carlos Quiles Casas Image-guided surgery with surface reconstruction and augmented reality visualization
CN108742876A (zh) * 2018-08-02 2018-11-06 雅客智慧(北京)科技有限公司 一种手术导航装置
CN109152615A (zh) * 2016-05-23 2019-01-04 马科外科公司 在机器人手术过程期间识别和跟踪物理对象的系统和方法
CN109925055A (zh) * 2019-03-04 2019-06-25 北京和华瑞博科技有限公司 全数字化全膝关节置换手术机器人系统及其模拟手术方法
CN110248618A (zh) * 2016-09-09 2019-09-17 Gys科技有限责任公司(经营名称为卡丹机器人) 用于在计算机辅助手术中显示患者数据的方法及系统
CN110946654A (zh) * 2019-12-23 2020-04-03 中国科学院合肥物质科学研究院 一种基于多模影像融合的骨科手术导航系统

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1051123A1 (en) * 1998-01-28 2000-11-15 Eric Richard Cosman Optical object tracking system
US20060195111A1 (en) * 2005-01-25 2006-08-31 Orthosoft Inc. Universal positioning block assembly
US20080319491A1 (en) * 2007-06-19 2008-12-25 Ryan Schoenefeld Patient-matched surgical component and methods of use
US11633254B2 (en) * 2018-06-04 2023-04-25 Mighty Oak Medical, Inc. Patient-matched apparatus for use in augmented reality assisted surgical procedures and methods for using the same
US20130261783A1 (en) * 2012-03-28 2013-10-03 Navident Technologies, Inc. Method for creating of dynamic patient-specific surgical monitoring system
EP2887873A4 (en) * 2012-08-27 2016-05-25 Target Tape Inc HELP TO LOCALIZE A MEDICAL INTERVENTION
DE102014107481A1 (de) * 2014-05-27 2015-12-03 Aesculap Ag Medizinisches System
FR3107449B1 (fr) * 2020-02-20 2022-01-21 One Ortho Système de guidage en réalité augmentée d’une opération chirurgicale d’une partie d’articulation d’un os

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105411678A (zh) * 2014-09-16 2016-03-23 X-Nav技术有限责任公司 医疗程序期间用于确定和追踪运动的系统
US20160191887A1 (en) * 2014-12-30 2016-06-30 Carlos Quiles Casas Image-guided surgery with surface reconstruction and augmented reality visualization
CN109152615A (zh) * 2016-05-23 2019-01-04 马科外科公司 在机器人手术过程期间识别和跟踪物理对象的系统和方法
CN110248618A (zh) * 2016-09-09 2019-09-17 Gys科技有限责任公司(经营名称为卡丹机器人) 用于在计算机辅助手术中显示患者数据的方法及系统
CN108742876A (zh) * 2018-08-02 2018-11-06 雅客智慧(北京)科技有限公司 一种手术导航装置
CN109925055A (zh) * 2019-03-04 2019-06-25 北京和华瑞博科技有限公司 全数字化全膝关节置换手术机器人系统及其模拟手术方法
CN110946654A (zh) * 2019-12-23 2020-04-03 中国科学院合肥物质科学研究院 一种基于多模影像融合的骨科手术导航系统

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP4159149A4

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114224428A (zh) * 2021-12-31 2022-03-25 杭州三坛医疗科技有限公司 一种截骨平面定位方法、系统及装置
CN114224428B (zh) * 2021-12-31 2023-08-18 杭州三坛医疗科技有限公司 一种截骨平面定位方法、系统及装置

Also Published As

Publication number Publication date
EP4159149A4 (en) 2024-01-03
EP4159149A1 (en) 2023-04-05
US20220354580A1 (en) 2022-11-10
CN111388087A (zh) 2020-07-10

Similar Documents

Publication Publication Date Title
WO2021217713A1 (zh) 手术导航系统及执行手术导航方法的计算机与存储介质
CN110946654B (zh) 一种基于多模影像融合的骨科手术导航系统
US10603133B2 (en) Image guided augmented reality method and a surgical navigation of wearable glasses using the same
CN102784003B (zh) 一种基于结构光扫描的椎弓根内固定手术导航系统
US20200390503A1 (en) Systems and methods for surgical navigation and orthopaedic fixation
CN103211655B (zh) 一种骨科手术导航系统及导航方法
US6285902B1 (en) Computer assisted targeting device for use in orthopaedic surgery
US7885441B2 (en) Systems and methods for implant virtual review
US9320569B2 (en) Systems and methods for implant distance measurement
CN202751447U (zh) 一种基于结构光扫描的椎弓根内固定手术导航系统
JP5328137B2 (ja) 用具又は埋植物の表現を表示するユーザ・インタフェイス・システム
JP5702861B2 (ja) 解剖学的表面の支援型自動データ収集方法
US20080119725A1 (en) Systems and Methods for Visual Verification of CT Registration and Feedback
US20080154120A1 (en) Systems and methods for intraoperative measurements on navigated placements of implants
KR102105974B1 (ko) 의료 영상 시스템
CN106420054A (zh) 一种融合术前3d规划信息的前交叉韧带止点定位和韧带隧道定位装置
CN114727847A (zh) 用于计算坐标系变换的系统和方法
CN113648061B (zh) 一种基于混合现实的头戴式导航系统及导航配准方法
CN113229937A (zh) 一种利用实时结构光技术实现手术导航的方法和系统
CN213098281U (zh) 手术导航系统
Harders et al. Multimodal augmented reality in medicine
US11406346B2 (en) Surgical position calibration method
Nakajima et al. Surgical tool alignment guidance by drawing two cross-sectional laser-beam planes
CN112107366B (zh) 一种混合现实超声导航系统
CN219048814U (zh) 手术导航系统

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20933979

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2020933979

Country of ref document: EP

Effective date: 20221128