WO2017185540A1 - Neurosurgical robot navigation positioning system and method - Google Patents

Neurosurgical robot navigation positioning system and method

Info

Publication number
WO2017185540A1
WO2017185540A1 (PCT/CN2016/090789)
Authority
WO
WIPO (PCT)
Prior art keywords
motion
lesion
targeted
computer device
coordinate systems
Prior art date
Application number
PCT/CN2016/090789
Other languages
French (fr)
Inventor
Rongjun Wang
Depeng ZHAO
Lilong ZHANG
Original Assignee
Beijing Baihui Wei Kang Technology Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Baihui Wei Kang Technology Co., Ltd. filed Critical Beijing Baihui Wei Kang Technology Co., Ltd.
Priority to EP16741841.7A priority Critical patent/EP3253320A4/en
Publication of WO2017185540A1 publication Critical patent/WO2017185540A1/en

Classifications

    • A61B 34/32 — Surgical robots operating autonomously
    • A61B 34/30 — Surgical robots
    • A61B 34/20 — Surgical navigation systems; devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B 90/10 — Instruments, implements or accessories for stereotaxic surgery, e.g. frame-based stereotaxis
    • A61B 90/39 — Markers, e.g. radio-opaque or breast lesion markers
    • B25J 9/1664 — Programme controls characterised by programming or planning systems for manipulators; motion, path and trajectory planning
    • G16H 20/40 — ICT specially adapted for therapies or health-improving plans relating to mechanical, radiation or invasive therapies, e.g. surgery, laser therapy, dialysis or acupuncture
    • G16H 40/67 — ICT specially adapted for the remote operation of medical equipment or devices
    • G16H 50/50 — ICT specially adapted for simulation or modelling of medical disorders
    • A61B 2034/101 — Computer-aided simulation of surgical operations
    • A61B 2034/105 — Modelling of the patient, e.g. for ligaments or bones
    • A61B 2034/2065 — Tracking using image or pattern recognition
    • A61B 2090/364 — Correlation of different images or relation of image positions in respect to the body
    • G05B 2219/40519 — Motion, trajectory planning
    • G05B 2219/45117 — Medical, radio surgery manipulator

Definitions

  • FIG. 1 is a schematic diagram of a neurosurgical robot navigation and positioning system according to an embodiment of the present invention
  • FIG. 2 is a schematic diagram of a relationship between a robotic arm and a surgical instrument according to an embodiment of the present invention
  • FIG. 3 is a schematic diagram of a lesion position marker unit arranged on the head according to an embodiment of the present invention
  • FIG. 4 is another schematic diagram of a lesion position marker unit arranged on the head according to an embodiment of the present invention.
  • FIG. 5 is a schematic flowchart of a neurosurgical robot navigation and positioning method according to an embodiment of the present invention.
  • in the embodiments of the present invention, a surgical plan is created by the computer device on a digital graphic image, wherein the surgical plan comprises a lesion position which is precisely self-targeted and a motion path thereof; the spatial position sensor captures the equipped position marker unit such that the computer device implements position mapping between different spatial coordinate systems; and the motion executing device generates a specific motion scheme according to the lesion position which is precisely self-targeted, the motion path thereof and the position mapping between the different spatial coordinate systems.
  • in this way, the lesion position is precisely self-targeted, and the surgical instrument may be securely locked to support a surgical operation.
  • FIG. 1 is a schematic diagram of a neurosurgical robot navigation and positioning system according to an embodiment of the present invention. As illustrated in FIG. 1, the system comprises: a motion executing device 101, a spatial position sensor 102, an equipped position marker unit 103 and a computer device 104.
  • the computer device 104 is connected to the motion executing device 101 and the spatial position sensor 102, and configured to create a surgery plan on a digital graphic image, the surgery plan comprising a lesion position which is precisely self-targeted and a motion path thereof.
  • the computer device 104 may be fixed on a trolley 106, wherein the trolley 106 is connected to a couch via a fixed connection structure 105.
  • the digital graphic image comprises one or a combination of a craniocerebral axial plane, a coronal plane, a sagittal plane, a three-dimensional model, a vessel model and a standard spectrum.
  • the surgical plan is created and practiced by the principal doctor on the digital graphic image, such that abstract surgical intention and experience are digitized, and thus can be computed, stored and accurately transferred.
  • the motion executing device 101 comprises a high-precision drive robotic arm.
  • the spatial position sensor 102 comprises one or a combination of infrared, electromagnetic, ultrasonic and visible-light sensors.
  • the computer device 104 constructs a craniocerebral three-dimensional model of a patient 100 and calculates a three-dimensional volume of the lesion, delineates a three-dimensional configuration and position in the craniocerebral three-dimensional model, and creates a surgical plan on a digital graphic image comprising the three-dimensional configuration of the lesion, wherein the surgical plan comprises multiple targets and multiple cranial paths that are predetermined.
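The plan described above — a delineated lesion with a computed volume plus multiple predetermined targets and cranial paths — can be modelled with a small data structure. A minimal sketch; the class and field names (and millimetre units) are illustrative, not taken from the patent:

```python
import math
from dataclasses import dataclass, field
from typing import List, Tuple

Point3 = Tuple[float, float, float]   # millimetres, in image space

@dataclass
class CranialPath:
    entry: Point3    # predetermined entry point on the skull surface
    target: Point3   # predetermined target inside the delineated lesion

    def length(self) -> float:
        """Straight-line length of the cranial path."""
        return math.dist(self.entry, self.target)

@dataclass
class SurgicalPlan:
    lesion_volume_mm3: float                       # from the 3-D lesion model
    paths: List[CranialPath] = field(default_factory=list)

    def add_path(self, entry: Point3, target: Point3) -> None:
        self.paths.append(CranialPath(entry, target))
```

Because the plan is plain data, it can be "computed, stored and accurately transferred" as the surrounding text requires.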
  • the two-dimensional medical image data may be any medical image files complying with the DICOM protocol, including CT images, MRI images or a combination thereof.
  • a surgical record of a patient, including the name, age and the like of the patient, may be created according to the read craniocerebral two-dimensional medical image data of the patient.
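Reading the two-dimensional slices and assembling such a record might look as follows. In practice the slice objects would come from a DICOM library (e.g. pydicom's `dcmread`); to keep the sketch self-contained, any objects exposing the standard DICOM attributes used below will do, and `PatientRecord` is an assumed name:

```python
from dataclasses import dataclass, field

@dataclass
class PatientRecord:
    """Minimal surgical record built from craniocerebral DICOM metadata.
    Field sources follow DICOM attributes (PatientName, PatientAge, Modality)."""
    name: str
    age: str
    modality: str                 # e.g. "CT" or "MR"
    series: list = field(default_factory=list)

def build_record(datasets):
    """datasets: DICOM slice objects. Slices are ordered along the scan axis
    by the z component of ImagePositionPatient before stacking, so the
    resulting series forms a consistent 3-D volume."""
    slices = sorted(datasets, key=lambda ds: ds.ImagePositionPatient[2])
    first = slices[0]
    rec = PatientRecord(name=str(first.PatientName),
                        age=str(first.PatientAge),
                        modality=first.Modality)
    rec.series = [ds.pixel_array for ds in slices]   # 2-D slices -> 3-D stack
    return rec
```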
  • the spatial position sensor 102 is configured to capture the equipped position marker unit 103, such that the computer device 104 implements position mapping between different spatial coordinate systems, thereby achieving matching between the digital graphic image and the patient.
  • the equipped position marker unit comprises one or a combination of a marker on the motion executing device 101, a marker attached to the head of a patient and a marker on a handheld probe.
  • the computer device 104 is configured to collect and process spatial information of the motion executing device 101 and spatial information of the marker unit captured by the spatial position sensor 102, and establish a transformation relationship between the different coordinate systems.
  • the computer device 104 is configured to solve the motion scheme of the motion executing device 101 according to the lesion position which is precisely self-targeted and the motion path thereof and the position mapping between the different spatial coordinate systems, simulate and rehearse the motion scheme in the craniocerebral three-dimensional model, and make an adjustment to the motion scheme.
  • the computer device 104 is further configured to synchronously navigate a relative position of the surgical instrument in the two-dimensional medical image data and the craniocerebral three-dimensional model.
  • a bridge may be built for registration of the various spaces based on the same marker; that is, the coordinates of the same marker are first acquired in each of the different spaces, and a mapping matrix relating the same coordinate point across the spaces is computed. As such, a transformation relationship from one space to another is acquired.
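The mapping-matrix computation can be sketched as a least-squares rigid registration over paired marker coordinates; the Kabsch/SVD method shown here is one standard choice (the patent does not name a specific algorithm), assuming at least three non-collinear markers observed in the same order in both spaces:

```python
import numpy as np

def register_rigid(src, dst):
    """Least-squares rigid transform mapping marker coordinates in one
    space (src) onto the same markers observed in another space (dst).
    src, dst: (N, 3) arrays, N >= 3, rows paired by marker identity."""
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)          # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T                           # proper rotation, det = +1
    t = c_dst - R @ c_src
    T = np.eye(4)                                # 4x4 homogeneous mapping matrix
    T[:3, :3], T[:3, 3] = R, t
    return T

def apply_transform(T, pts):
    """Map (N, 3) points from the source space into the destination space."""
    pts = np.asarray(pts, float)
    return (T[:3, :3] @ pts.T).T + T[:3, 3]
```

Chaining such matrices (image-to-patient, patient-to-robot, ...) yields the transformation relationship "from one space to another" described above.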
  • the motion executing device 101 is mounted with a surgical instrument, and configured to generate a specific motion scheme according to the lesion position which is precisely self-targeted and the motion path thereof and the position mapping between the different spatial coordinate systems, precisely self-target the lesion position, and securely lock the surgical instrument to support a surgical operation. As such, the motion executing device finally guides the surgical instrument to navigate to and position at the planned anatomic structure.
  • motion of the robotic arm is controlled over a data link established according to the specific motion scheme.
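One plausible shape for the "specific motion scheme" sent over the data link is a discretised sequence of waypoints along the planned entry-to-target line, each serialised as a command. This is a hypothetical format for illustration, not the patent's actual protocol; a real controller would add approach/retract phases, speed limits and safety interlocks:

```python
import json
import math

def motion_scheme(entry, target, step_mm=1.0):
    """Discretise the planned cranial path into waypoints the arm controller
    can consume: advance along the entry->target line in step_mm increments."""
    d = math.dist(entry, target)
    n = max(1, int(math.ceil(d / step_mm)))
    return [tuple(e + (t - e) * i / n for e, t in zip(entry, target))
            for i in range(n + 1)]

def encode_command(waypoint, lock=False):
    """Serialise one waypoint for the data link; lock=True asks the arm to
    securely lock the instrument once the waypoint is reached."""
    return json.dumps({"xyz": list(waypoint), "lock_instrument": lock})
```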
  • the surgical instrument comprises a minimally invasive puncture needle, a neuroendoscope and the like.
  • a schematic diagram of the motion executing device 101 mounted with a surgical instrument is given in FIG. 2.
  • the position marker unit 103, for example a black-and-white positioning pattern, is arranged on a tail end of the robotic arm of the motion executing device 101, on the surgical instrument, or on a probe 111.
  • the equipped position marker unit 103 may be an in vivo characteristic or an in vitro characteristic.
  • the in vitro characteristic comprises a marker arranged on the head of a patient, and the in vivo characteristic comprises an anatomic characteristic of the body of the patient, for example, the spine, the scapula and the like, which is not described herein any further.
  • the marker may be specifically a patch adhered to a surgical target position.
  • the patch is provided with a pattern identifiable by an optical positioning unit, for example, a black-and-white block pattern.
  • the pattern is defined on the surface of a panel, wherein the panel may be a soft base.
  • a plurality of markers are needed, which are respectively arranged at different positions and in different directions on the patient; that is, a set of markers is used in the surgery. Typically there are three or more markers; any three markers should not be arranged on the same line, and any four markers should not be arranged within the same plane.
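The placement rules just stated — no three markers collinear, no four markers coplanar — can be verified numerically with a rank test on the centred point sets. A sketch (the function name is illustrative):

```python
import itertools
import numpy as np

def validate_marker_layout(markers, tol=1e-6):
    """Check the placement rules: no three markers on one line, no four
    markers in one plane. markers: sequence of (x, y, z) positions."""
    m = np.asarray(markers, float)
    if len(m) < 3:
        return False
    for trio in itertools.combinations(range(len(m)), 3):
        p = m[list(trio)]
        # three points are collinear iff the centred point set has rank < 2
        if np.linalg.matrix_rank(p - p.mean(axis=0), tol=tol) < 2:
            return False
    for quad in itertools.combinations(range(len(m)), 4):
        p = m[list(quad)]
        # four points are coplanar iff the centred point set has rank < 3
        if np.linalg.matrix_rank(p - p.mean(axis=0), tol=tol) < 3:
            return False
    return True
```

Degenerate layouts (collinear or coplanar marker sets) make the registration described later ill-conditioned, which is why the rule exists.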
  • identifiers such as C1, C2 and C3 are assigned to these markers, and a mating relationship such as C1C2C3 is defined between them; the identifiers and the mating relationship are stored in an electronic tag during the production process, and the electronic tag is integrated with the panel, for example, embedded into the panel.
  • the electronic tag may also store any one or any combination of production information, sales channel information, inspection information and geometric dimensions of the markers.
  • the production information may comprise manufacturer and production date;
  • the sales channel information comprises sales information of hospitals having sales qualifications, and information of hospitals legally using the markers;
  • the inspection information comprises the calibrated precision grade of the markers; and
  • the sales hospital information may comprise regional information such as postal codes of the sales hospitals.
  • relevant information regarding the patient and the surgery may be acquired, and the information of the patient may be stored in the electronic tag and bound to other information such as production information, sales channel information, inspection information and geometric dimensions of the markers.
  • the electronic tag may also store any one or any combination of information of the patient (name, age and disease category), surgery time information, information of the hospital where the surgery is carried out, and the doctor carrying out the surgery.
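The tag record — mandatory marker identifiers and their mating relationship, plus the optional production, sales, inspection, geometry and patient groups listed above — could be assembled as follows. The field names and JSON encoding are assumptions for illustration, not a defined tag format:

```python
import json

def make_tag_payload(marker_ids, mating, production=None, sales=None,
                     inspection=None, geometry=None, patient=None):
    """Assemble the record written to a marker set's electronic tag.
    Only the identifiers and mating relationship are mandatory; each
    optional group is included only when provided."""
    payload = {"markers": list(marker_ids), "mating": mating}
    for key, value in (("production", production), ("sales", sales),
                       ("inspection", inspection), ("geometry", geometry),
                       ("patient", patient)):
        if value is not None:
            payload[key] = value
    return json.dumps(payload)
```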
  • the above position marker unit may further comprise a black-and-white positioning pattern arranged on the probe, which is not described herein any further.
  • the space of the motion executing device, the space of the spatial position sensor and the surgery space of the patient are known, and only the image space needs to be established.
  • Three first markers may be arranged on a periphery of the surgical target position of the patient as the lesion position marker units, as illustrated in FIG. 3. It should be noted that the number of first markers is not limited to 3, but may be 1.
  • Four second markers are arranged at the target position of the patient with reference to the first marker 101. It should be noted that the number of second markers is not limited to 4, but may be 3.
  • the transformation matrix is a rigid-body transformation matrix, wherein the position of the coordinate system of the image space is established by means of rotation and translation of the coordinate by using the rigid-body transformation matrix.
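Establishing the image-space coordinate system "by means of rotation and translation" amounts to building and composing 4×4 homogeneous rigid-body transforms. A minimal sketch (helper names are illustrative):

```python
import numpy as np

def make_transform(R, t):
    """Pack a 3x3 rotation R and translation t into a 4x4 homogeneous
    rigid-body transformation matrix."""
    T = np.eye(4)
    T[:3, :3] = np.asarray(R, float)
    T[:3, 3] = np.asarray(t, float)
    return T

def compose(*transforms):
    """Chain transforms: compose(T_bc, T_ab) maps space a -> space c."""
    out = np.eye(4)
    for T in transforms:
        out = out @ T
    return out

def transform_point(T, p):
    """Apply a homogeneous transform to a single 3-D point."""
    ph = np.append(np.asarray(p, float), 1.0)
    return (T @ ph)[:3]
```

With these, an unknown space is pinned down from known ones, e.g. an image-to-robot mapping as the composition of patient-to-robot and image-to-patient transforms.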
  • the position marker unit 103 comprises a marker arranged on the head of a patient and a marker arranged on a probe, that is, black-and-white pattern blocks.
  • the marker arranged on the surgical instrument is not described any further.
  • FIG. 5 is a schematic flowchart of a neurosurgical robot navigation and positioning method according to an embodiment of the present invention. As illustrated in FIG. 5, the method comprises the following steps:
  • S601: A surgery plan is created on a digital graphic image, wherein the surgery plan comprises a lesion position which is precisely self-targeted and a motion path thereof.
  • the method further comprises: constructing a craniocerebral three-dimensional model based on two-dimensional medical data, displaying a three-dimensional configuration and position delineating the lesion, and creating the surgical plan on the three-dimensional configuration, the surgical plan comprising multiple targets and multiple cranial paths that are predetermined.
  • the method further comprises: solving the motion scheme of the motion executing device 101 according to the lesion position which is precisely self-targeted and the motion path thereof and the position mapping between the different spatial coordinate systems, simulating and rehearsing the motion scheme in the craniocerebral three-dimensional model, and making an adjustment to the motion scheme.
  • the method further comprises: synchronously navigating a relative position of the surgical instrument in the two-dimensional medical image data and the craniocerebral three-dimensional model.
  • S602: Captured information of the equipped position marker unit is acquired, and the position mapping between the different spatial coordinate systems is calculated.
  • S603: A specific motion scheme is generated according to the lesion position which is precisely self-targeted and the motion path thereof and the position mapping between the different spatial coordinate systems, the lesion position is precisely self-targeted, and the surgical instrument is securely locked to support a surgical operation.
  • Steps S601-S603 in this embodiment may be performed with reference to the disclosure in FIG. 1, and are not described herein any further.
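The three steps can be sketched as a single flow, with each stage injected as a callable so the sketch stays independent of any particular planner, sensor or arm (all names here are illustrative, not from the patent):

```python
def navigate_and_position(create_plan, capture_markers, register, drive_arm):
    """Hypothetical end-to-end flow of steps S601-S603:
      S601  create_plan()               -> plan with targets/paths in image space
      S602  register(capture_markers()) -> mapping between coordinate systems
      S603  drive_arm(plan, mapping)    -> self-target the lesion, lock the tool
    """
    plan = create_plan()                   # S601: surgical plan on the image
    mapping = register(capture_markers())  # S602: spatial position mapping
    return drive_arm(plan, mapping)        # S603: execute the motion scheme
```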
  • the above described apparatus embodiments are merely for illustration purposes.
  • the units described as separate components may or may not be physically separated, and the components illustrated as units may or may not be physical units; that is, they may be located in one position or distributed across a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the objectives of the technical solutions of the embodiments. Persons of ordinary skill in the art may understand and implement the present application without creative effort.
  • the embodiments of the present invention may be implemented by means of hardware or by means of software plus a necessary general hardware platform.
  • the portions of the technical solutions of the present application that essentially contribute to the related art may be embodied in the form of a software product; the computer software product may be stored in a storage medium, such as a ROM/RAM, a magnetic disk or a CD-ROM, and includes several instructions for causing a computer device (a personal computer, a server, or a network device) to perform the various embodiments of the present application, or certain portions of the methods of the embodiments.


Abstract

A neurosurgical robot navigation and positioning system and method are disclosed. The system comprises: a motion executing device (101), a spatial position sensor (102), an equipped position marker unit (103) and a computer device (104); wherein the computer device (104) is connected to the motion executing device (101) and the spatial position sensor (102), and configured to create a surgery plan on a digital graphic image, the surgery plan comprising a lesion position which is precisely self-targeted and a motion path thereof; the spatial position sensor (102) is configured to capture the equipped position marker unit (103) such that the computer device (104) implements position mapping between different spatial coordinate systems; and the motion executing device (101) is mounted with a surgical instrument (111), and configured to generate a specific motion scheme according to the lesion position which is precisely self-targeted and the motion path thereof and the position mapping between the different spatial coordinate systems, precisely self-target a lesion position, and securely lock the surgical instrument (111) to support a surgical operation.

Description

NEUROSURGICAL ROBOT NAVIGATION POSITIONING SYSTEM AND METHOD
CROSS REFERENCE TO RELATED APPLICATION
This application claims priority to Chinese Patent Application No. 201610285561.X, filed on April 29, 2016, the entire contents of which are incorporated herein by reference.
TECHNICAL FIELD
Embodiments of the present invention relate to the technical field of medical surgical robots, and in particular, relate to a neurosurgical robot navigation and positioning system and method.
BACKGROUND
Automation equipment, such as robots, has been widely applied in the industrial field, and has exhibited prominent advantages in operation flexibility, stability, accuracy and the like. To solve problems of insufficient precision, high radiation exposure, large incisions, operation fatigue and the like in surgeries, people are discussing how to introduce robots into surgeries, taking advantage of high and new technologies such as robots and sensors to provide new treatment methods and systems for surgeons and improve surgical outcomes. That is, surgeries, including surgeries on the head, are practiced by means of medical robots.
Using a brain surgery as an example, frame brain three-dimensional navigation and frameless brain three-dimensional navigation or image guided neurosurgery phases are successively experienced. One difference between frame brain three-dimensional navigation and frameless brain three-dimensional navigation lies in whether to sleeve a position frame on the head of a patent to  achieve a series of technologies such as oriented registration. Specifically, in the frame brain three-dimensional navigation, a frame is purposely mounted outside the skull of the patient, which forms a three-dimensional spatial coordinate system, such that the brain structure is included in the coordinate system. In this case, if the patient with the frame is subjected to scanning by CT or MRI, an CT or MRI image of the skull of the patient carrying the coordinate parameter marker of the frame is obtained, and anatomic structures of various images in the skull of the patient have a corresponding coordinate value in the coordinate system. Then, the coordinate point is reached according to mechanical data of a brain stereotaxic apparatus, thereby implementing brain stereotaxic orientation. However, the frameless brain three-dimensional navigation is not suitable for the above positioning frame, and is mainly implemented based on a joint arm system and a digital instrument system. The digital instrument system comprises infrared ray, sound wave, electromagnetic wave and the like digital instruments, and the joint arm system comprises a robotic arm having multiple freedom degrees.
With respect to the frameless brain three-dimensional navigation technology based on the robotic arm, in the related art, the robotic arm is only capable of implementing three-dimensional surgical planning and real-time virtual display of the skull position, but cannot proactively participate in targeting the lesion position.
SUMMARY
Embodiments of the present invention are intended to provide a neurosurgical robot navigation and positioning system and method, to solve the technical problems in the related art.
The present invention employs technical solutions as follows:
An embodiment of the present invention provides a neurosurgical robot navigation and positioning system, comprising: a motion executing device, a spatial position sensor, an equipped position marker unit and a computer device; wherein
the computer device is connected to the motion executing device and the spatial position sensor, and configured to create a surgery plan on a digital graphic image, the surgery plan comprising a lesion position which is precisely self-targeted and a motion path thereof;
the spatial position sensor is configured to capture the equipped position marker unit such that the computer device implements position mapping between different spatial coordinate systems; and
the motion executing device is mounted with a surgical instrument, and configured to generate a specific motion scheme according to the lesion position which is precisely self-targeted and the motion path thereof and the position mapping between the different spatial coordinate systems, precisely self-target a lesion position, and securely lock the surgical instrument to support a surgical operation.
Preferably, in an embodiment of the present invention, the equipped position marker unit comprises one or a combination of a marker on the motion executing device, a marker attached on the head of a patient and a marker on a handheld probe.
Preferably, in an embodiment of the present invention, the computer device is configured to construct a craniocerebral three-dimensional model based on two-dimensional medical data, display the three-dimensional configuration and position of the delineated lesion, and create the surgical plan on the three-dimensional configuration, the surgical plan comprising multiple targets and multiple cranial paths that are predetermined.
Preferably, in an embodiment of the present invention, the computer device is configured to collect and process spatial information of the motion executing device and spatial information of the marker unit captured by the spatial position sensor, and establish a transformation relationship between the different coordinate systems.
Preferably, in an embodiment of the present invention, the computer device is  configured to solve the motion scheme of the motion executing device according to the lesion position which is precisely self-targeted and the motion path thereof and the position mapping between the different spatial coordinate systems, simulate and rehearse the motion scheme in the craniocerebral three-dimensional model, and make an adjustment to the motion scheme.
Preferably, in an embodiment of the present invention, the computer device is further configured to synchronously navigate a relative position of the surgical instrument in the two-dimensional medical image data and the craniocerebral three-dimensional model.
An embodiment of the present invention further provides a neurosurgical robot navigation and positioning method, comprising:
creating a surgery plan on a digital graphic image, the surgery plan comprising a lesion position which is precisely self-targeted and a motion path thereof;
acquiring capture information of an equipped marker unit, and calculating a position mapping between different spatial coordinate systems; and
generating a specific motion scheme according to the lesion position which is precisely self-targeted and the motion path thereof and the position mapping between the different spatial coordinate systems, precisely self-targeting a lesion position, and securely locking a surgical instrument to support a surgical operation.
Preferably, in an embodiment of the present invention, the method further comprises: constructing a craniocerebral three-dimensional model based on two-dimensional medical data, displaying a three-dimensional configuration and position delineating the lesion, and creating the surgical plan on the three-dimensional configuration, the surgical plan comprising multiple targets and multiple cranial paths that are predetermined.
Preferably, in an embodiment of the present invention, the method further comprises: solving the motion scheme of the motion executing device according to  the lesion position which is precisely self-targeted and the motion path thereof and the position mapping between the different spatial coordinate systems, simulating and rehearsing the motion scheme in the craniocerebral three-dimensional model, and making an adjustment to the motion scheme.
Preferably, in an embodiment of the present invention, the method further comprises: synchronously navigating a relative position of the surgical instrument in the two-dimensional medical image data and the craniocerebral three-dimensional model.
In the embodiments of the present invention, a surgical plan is created by using the computer device on a digital graphic image, wherein the surgical plan comprises a lesion position which is precisely self-targeted and a motion path thereof; the spatial position sensor captures the equipped position marker unit such that the computer device implements position mapping between different spatial coordinate systems; and the motion executing device generates a specific motion scheme according to the lesion position which is precisely self-targeted, the motion path thereof, and the position mapping between the different spatial coordinate systems. As such, the surgical instrument may be securely locked to support a surgical operation, and the lesion position is precisely self-targeted.
BRIEF DESCRIPTION OF THE DRAWINGS
To describe the embodiments of the present invention or the technical solutions in the related art, the drawings to be referred to in the description of the embodiments or the related art are briefly introduced hereinafter. Apparently, the drawings described hereinafter merely illustrate some embodiments of the present invention. Persons of ordinary skill in the art may also derive other drawings from these drawings without any creative effort.
FIG. 1 is a schematic diagram of a neurosurgical robot navigation and  positioning system according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of a relationship between a robotic arm and a surgical instrument according to an embodiment of the present invention;
FIG. 3 is a schematic diagram of a lesion position marker unit arranged on the head according to an embodiment of the present invention;
FIG. 4 is another schematic diagram of a lesion position marker unit arranged on the head according to an embodiment of the present invention; and
FIG. 5 is a schematic flowchart of a neurosurgical robot navigation and positioning method according to an embodiment of the present invention.
DETAILED DESCRIPTION
To make the objectives, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions according to the embodiments of the present invention are clearly and thoroughly described with reference to the accompanying drawings of the embodiments of the present invention. The described embodiments are merely some, rather than all, of the embodiments of the present invention. Based on the embodiments of the present invention, all other embodiments derived by persons of ordinary skill in the art without any creative efforts shall fall within the protection scope of the present invention.
In the embodiments of the present invention hereinafter, a surgical plan is created by using the computer device on a digital graphic image, wherein the surgical plan comprises a lesion position which is precisely self-targeted and a motion path thereof; the spatial position sensor captures the equipped position marker unit such that the computer device implements position mapping between different spatial coordinate systems; and the motion executing device generates a specific motion scheme according to the lesion position which is precisely self-targeted, the motion path thereof, and the position mapping between the different spatial coordinate systems. As such, the surgical instrument may be securely locked to support a surgical operation, and the lesion position is precisely self-targeted.
FIG. 1 is a schematic diagram of a neurosurgical robot navigation and positioning system according to an embodiment of the present invention. As illustrated in FIG. 1, the system comprises: a motion executing device 101, a spatial position sensor 102, an equipped position marker unit 103 and a computer device 104.
(1) The computer device 104 is connected to the motion executing device 101 and the spatial position sensor 102, and configured to create a surgery plan on a digital graphic image, the surgery plan comprising a lesion position which is precisely self-targeted and a motion path thereof. The computer device 104 may be fixed on a trolley 106, wherein the trolley 106 is connected to a couch via a fixed connection structure 105.
Specifically, in this embodiment or any other embodiment of the present invention, the digital graphic image comprises one or a combination of a craniocerebral axial plane, a coronal plane, a sagittal plane, a three-dimensional model, a vessel model and a standard spectrum.
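For intuition only, the craniocerebral axial, coronal and sagittal planes listed above can be viewed as orthogonal slices through a voxel volume. The following Python/NumPy sketch is not part of the disclosure; the array shape and slice indices are arbitrary stand-ins for real scan data.

```python
import numpy as np

# Stand-in scan volume indexed as (z, y, x); real data would come from a
# CT/MRI series, but a random array suffices to show the slicing.
volume = np.random.rand(128, 256, 256)

axial    = volume[64, :, :]   # transverse slice, perpendicular to z
coronal  = volume[:, 128, :]  # frontal slice, perpendicular to y
sagittal = volume[:, :, 128]  # lateral slice, perpendicular to x

print(axial.shape, coronal.shape, sagittal.shape)  # (256, 256) (128, 256) (128, 256)
```

A rendering pipeline would additionally apply the voxel spacing along each axis so the three views share a common physical scale.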
Preferably, the surgical plan is operated or practiced by a principal doctor on a digital graphic image, such that the abstract surgical intention and experience are digitized, and thus can be computed, stored and accurately transferred.
Preferably, in this embodiment or any other embodiment of the present invention, the motion executing device 101 comprises a robotic arm with a high-precision drive.
Preferably, in this embodiment or any other embodiment of the present invention, the spatial position sensor 102 comprises one or a combination of infrared, electromagnetic-wave, ultrasonic and visible-light sensors.
Preferably, in this embodiment or any other embodiment of the present invention, the computer device 104 constructs a craniocerebral three-dimensional model of a patient 100, calculates a three-dimensional volume of the lesion, delineates the three-dimensional configuration and position of the lesion in the craniocerebral three-dimensional model, and creates a surgical plan on a digital graphic image comprising the three-dimensional configuration of the lesion, wherein the surgical plan comprises multiple targets and multiple cranial paths that are predetermined.
Specifically, in this embodiment or any other embodiment of the present invention, the two-dimensional medical image data may be any medical image files complying with the DICOM protocol, including one or a combination of CT and MRI.
In addition, a surgical record of a patient, including the name, age and the like of the patient, may be created according to the read craniocerebral two-dimensional medical image data of the patient.
(2) The spatial position sensor 102 is configured to capture the equipped position marker unit 103, such that the computer device 104 implements position mapping between different spatial coordinate systems, thereby achieving matching between the digital graphic image and the patient.
Specifically, in this embodiment or any other embodiment of the present invention, the equipped position marker unit comprises one or a combination of a marker on the motion executing device 101, a marker attached on the head of a patient and a marker on a handheld probe.
Preferably, in this embodiment or any other embodiment of the present invention, the computer device 104 is configured to collect and process spatial information of the motion executing device 101 and spatial information of the marker unit captured by the spatial position sensor 102, and establish a transformation relationship between the different coordinate systems.
Preferably, in this embodiment or any other embodiment of the present invention, the computer device 104 is configured to solve the motion scheme of the motion executing device 101 according to the lesion position which is precisely self-targeted and the motion path thereof and the position mapping between the different spatial coordinate systems, simulate and rehearse the motion scheme in the craniocerebral three-dimensional model, and make an adjustment to  the motion scheme.
Preferably, in this embodiment or any other embodiment of the present invention, the computer device 104 is further configured to synchronously navigate a relative position of the surgical instrument in the two-dimensional medical image data and the craniocerebral three-dimensional model.
During a tracking process, a bridge may be built for registration of the various spaces based on the same markers; that is, the coordinates of the same marker in different spaces are firstly acquired, and a mapping matrix relating the coordinates of the same point across the different spaces is computed. As such, a transformation relationship from one space to another is acquired.
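The mapping-matrix step described above is a classical rigid point-set registration problem. One standard way to solve it (an illustrative Python/NumPy sketch using the Kabsch/SVD method; the marker coordinates and function name are hypothetical, and the patent does not prescribe this particular algorithm):

```python
import numpy as np

def rigid_registration(src, dst):
    """Least-squares rigid transform (R, t) with dst_i ~= R @ src_i + t,
    where the rows of src and dst are the SAME physical markers measured
    in two different coordinate systems (Kabsch/SVD method)."""
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)        # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))     # guard against a reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = c_dst - R @ c_src
    return R, t

# Hypothetical marker coordinates in "image space" ...
src = np.array([[0., 0., 0.], [10., 0., 0.], [0., 10., 0.], [0., 0., 10.]])
# ... and the same markers in "robot space" (rotated 30 deg about z, shifted).
a = np.deg2rad(30.0)
R_true = np.array([[np.cos(a), -np.sin(a), 0.],
                   [np.sin(a),  np.cos(a), 0.],
                   [0.,         0.,        1.]])
t_true = np.array([5., -2., 3.])
dst = src @ R_true.T + t_true

R, t = rigid_registration(src, dst)
print(np.allclose(R, R_true), np.allclose(t, t_true))  # True True
```

With noisy measurements the same routine returns the least-squares best fit, which is why three or more well-spread markers are used rather than the bare minimum.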
(3) The motion executing device 101 is mounted with a surgical instrument, and configured to generate a specific motion scheme according to the lesion position which is precisely self-targeted and the motion path thereof and the position mapping between the different spatial coordinate systems, precisely self-target a lesion position, and securely lock the surgical instrument to support a surgical operation. As such, the motion executing device finally guides the surgical instrument to navigate to and be positioned at the actual anatomic structure.
In this embodiment or any other embodiment of the present invention, when the lesion position is precisely self-targeted, motion of the robotic arm is controlled according to a data link established based on a specific motion scheme.
Specifically, in this embodiment or any other embodiment of the present invention, the surgical instrument comprises a minimally invasive puncture needle, a neuroendoscope and the like.
A schematic diagram of the motion executing device 101 mounted with a surgical instrument is given in FIG. 2. As illustrated in FIG. 2, the lesion position marker unit 103 is arranged on a tail end of the robotic arm of the motion executing device 101, on the surgical instrument, or on a probe 111, for example, as a black-white positioning pattern.
In the above embodiment, the equipped position marker unit 103 may be an in vivo characteristic or an in vitro characteristic. The in vitro characteristic comprises a marker arranged on the head of a patient, and the in vivo characteristic comprises an anatomic characteristic of the body of the patient, wherein the anatomic characteristic comprises, for example, the spine, the scapula and the like, which are not described herein any further.
Specifically, the marker may be a patch adhered to a surgical target position. The patch is provided with a pattern identifiable by an optical positioning unit, wherein the pattern is a black-white block. To be specific, the pattern is defined on the surface of a panel, wherein the panel may be a soft base. In a common surgery, a plurality of markers is generally needed, which are respectively arranged at different positions and in different directions on the patient; that is, a set of markers is used in the surgery. There are typically three or more markers; any three markers should not be arranged on the same line, and any four markers should not be arranged within the same plane.
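The placement rules above (no three markers collinear, no four coplanar) ensure the registration problem is well conditioned, and both conditions can be checked numerically as matrix-rank tests. A hypothetical Python/NumPy sketch (the function name, tolerance and example coordinates are illustrative assumptions):

```python
from itertools import combinations

import numpy as np

def markers_well_posed(points, tol=1e-6):
    """Check the placement rules for a marker set given as an (N, 3) array:
    no three markers on one line, no four markers in one plane. Each rule
    is a rank test on difference vectors within the subset."""
    pts = np.asarray(points, dtype=float)
    for trio in combinations(range(len(pts)), 3):
        if np.linalg.matrix_rank(pts[list(trio)][1:] - pts[trio[0]], tol=tol) < 2:
            return False  # collinear triple
    for quad in combinations(range(len(pts)), 4):
        if np.linalg.matrix_rank(pts[list(quad)][1:] - pts[quad[0]], tol=tol) < 3:
            return False  # coplanar quadruple
    return True

good = [[0, 0, 0], [10, 0, 0], [0, 10, 0], [0, 0, 10]]
flat = [[0, 0, 0], [10, 0, 0], [0, 10, 0], [10, 10, 0]]  # all in the z=0 plane
print(markers_well_posed(good), markers_well_posed(flat))  # True False
```

A degenerate (coplanar) set would still admit two mirror-image rigid transforms, which is exactly what the rule is meant to exclude.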
To differentiate these markers and the mating relationship between them, and to further prevent a mixed use of these markers with non-mated markers, identifiers such as C1, C2 and C3, and C1C2C3, are assigned to the markers and their mating relationship; the identifiers and the mating relationship are stored in an electronic tag during the production process, and the electronic tag is integrated with the panel, for example, embedded into the panel.
The electronic tag may also store any one or any combination of production information, sales channel information, inspection information and geometric dimensions of the markers. The production information may comprise the manufacturer and production date; the sales channel information comprises sales information of hospitals having the sales qualifications and information of hospitals legally using the markers; the inspection information comprises the scaled precision grade of the markers; and the sales hospital information may comprise regional information such as postal codes of the sales hospitals. Such different categories of additional information ensure the quality of the markers and meanwhile prevent the markers from being counterfeited.
During the surgery, relevant information regarding the patient and the surgery may be acquired, and the information of the patient may be stored in the electronic tag and bound to other information such as the production information, sales channel information, inspection information and geometric dimensions of the markers. The electronic tag may also store any one or any combination of information of the patient (name, age and disease category), surgery time information, information of the hospital carrying out the surgery, and the doctor carrying out the surgery.
The above position marker may further comprise a black-white positioning pattern arranged on the probe, which is not described herein any further.
In the above embodiment, the space of the motion executing device, the space of the spatial position sensor and the space of the patient surgery are known, and only the image space needs to be established. Three first markers may be arranged on a periphery of the surgical target position of the patient as the lesion position marker units, as illustrated in FIG. 3. It should be noted that the number of first markers is not limited to 3, but may also be 1. Four second markers are arranged at the target position of the patient with reference to the first markers. It should be noted that the number of second markers is not limited to 4, but may also be 3.
In this embodiment, the transformation matrix is a rigid-body transformation matrix, wherein the coordinate system of the image space is established by means of rotation and translation of the coordinates using the rigid-body transformation matrix.
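A rigid-body transformation combining rotation and translation, as described above, is conventionally packed into a single 4x4 homogeneous matrix so that chains of transforms compose by matrix multiplication. An illustrative Python/NumPy sketch (the particular rotation, translation and function names are hypothetical examples, not values from the patent):

```python
import numpy as np

def homogeneous(R, t):
    """Pack a 3x3 rotation R and a translation t into one 4x4 rigid-body
    transformation matrix."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def transform_point(T, p):
    """Map a 3-D point through a 4x4 rigid-body transform."""
    return (T @ np.append(p, 1.0))[:3]

# Hypothetical image-to-patient transform: 90 deg about z plus a shift.
R = np.array([[0., -1., 0.],
              [1.,  0., 0.],
              [0.,  0., 1.]])
T = homogeneous(R, np.array([100., 0., 50.]))
p_img = np.array([10., 0., 0.])       # a target in image coordinates
print(transform_point(T, p_img))      # maps to (100, 10, 50)
```

The inverse mapping is the rigid transform with rotation `R.T` and translation `-R.T @ t`, so points can be moved back and forth between spaces without recomputing the registration.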
Referring to FIG. 4, the lesion position marker unit 103 comprises a marker arranged on the head of a patient and a marker arranged on a probe, that is, a black-white pattern block. The marker arranged on the surgical instrument is not described any further.
FIG. 5 is a schematic flowchart of a neurosurgical robot navigation and positioning method according to an embodiment of the present invention. As  illustrated in FIG. 5, the method comprises the following steps:
S601: A surgery plan is created on a digital graphic image, wherein the surgery plan comprises a lesion position which is precisely self-targeted and a motion path thereof.
Preferably, in an embodiment of the present invention, the method further comprises: constructing a craniocerebral three-dimensional model based on two-dimensional medical data, displaying a three-dimensional configuration and position delineating the lesion, and creating the surgical plan on the three-dimensional configuration, the surgical plan comprising multiple targets and multiple cranial paths that are predetermined.
Preferably, in an embodiment of the present invention, the method further comprises: solving the motion scheme of the motion executing device 101 according to the lesion position which is precisely self-targeted and the motion path thereof and the position mapping between the different spatial coordinate systems, simulating and rehearsing the motion scheme in the craniocerebral three-dimensional model, and making an adjustment to the motion scheme.
Preferably, in an embodiment of the present invention, the method further comprises: synchronously navigating a relative position of the surgical instrument in the two-dimensional medical image data and the craniocerebral three-dimensional model.
S602: Capture information of an equipped marker unit is acquired, and a position mapping between different spatial coordinate systems is calculated.
S603: A specific motion scheme is generated according to the lesion position which is precisely self-targeted and the motion path thereof and the position mapping between the different spatial coordinate systems, a lesion position is precisely self-targeted, and the surgical instrument is securely locked to support a surgical operation.
Steps S601 to S603 in this embodiment may preferably or specifically be performed with reference to the disclosure regarding FIG. 1, and are not described herein any further.
The above-described apparatus embodiments are merely for illustration purposes. The units described as separate components may or may not be physically separate, and the components illustrated as units may or may not be physical units; that is, the components may be located in one position or may be distributed over a plurality of network units. Part or all of the modules may be selected according to actual needs to achieve the objectives of the technical solutions of the embodiments. Persons of ordinary skill in the art may understand and implement the present application without paying any creative effort.
According to the above embodiments of the present invention, a person skilled in the art may clearly understand that the embodiments of the present invention may be implemented by means of hardware or by means of software plus a necessary general hardware platform. Based on such understanding, portions of the technical solutions of the present application that essentially contribute to the related art may be embodied in the form of a software product, the computer software product may be stored in a storage medium, such as a ROM/RAM, a magnetic disk, a CD-ROM and the like, including several instructions for causing a computer device (a personal computer, a server, or a network device) to perform the various embodiments of the present application, or certain portions of the method of the embodiments.
It should be finally noted that the above-described embodiments are merely intended to illustrate the present invention, but not to limit it. Although the present invention is described in detail with reference to these embodiments, a person skilled in the art may still make various modifications to the technical solutions disclosed in the embodiments, or make equivalent replacements of some of the technical features therein. Such modifications or replacements, made without departing from the principles of the present invention, shall fall within the scope of the present invention.

Claims (10)

  1. A neurosurgical robot navigation and positioning system, characterized by comprising: a motion executing device, a spatial position sensor, an equipped position marker unit and a computer device; wherein
    the computer device is connected to the motion executing device and the spatial position sensor, and configured to create a surgery plan on a digital graphic image, the surgery plan comprising a lesion position which is precisely self-targeted and a motion path thereof;
    the spatial position sensor is configured to capture the equipped position marker unit such that the computer device implements position mapping between different spatial coordinate systems; and
    the motion executing device is mounted with a surgical instrument, and configured to generate a specific motion scheme according to the lesion position which is precisely self-targeted and the motion path thereof and the position mapping between the different spatial coordinate systems, precisely self-target a lesion position, and securely lock the surgical instrument to support a surgical operation.
  2. The system according to claim 1, characterized in that the equipped position marker unit comprises one or a combination of a marker on the motion executing device, a marker attached on the head of a patient and a marker on a handheld probe.
  3. The system according to claim 1, characterized in that the computer device is configured to construct a craniocerebral three-dimensional model based on two-dimensional medical data, display the three-dimensional configuration and position of the delineated lesion, and create the surgical plan on the three-dimensional configuration, the surgical plan comprising multiple targets and multiple cranial paths that are predetermined.
  4. The system according to claim 1, characterized in that the computer device is configured to collect and process spatial information of the motion executing device and spatial information of the marker unit captured by the spatial position sensor, and establish a transformation relationship between the different coordinate systems.
  5. The system according to claim 3, characterized in that the computer device is configured to solve the motion scheme of the motion executing device according to the lesion position which is precisely self-targeted and the motion path thereof and the position mapping between the different spatial coordinate systems, simulate and rehearse the motion scheme in the craniocerebral three-dimensional model, and make an adjustment to the motion scheme.
  6. The system according to claim 5, characterized in that the computer device is further configured to synchronously navigate a relative position of the surgical instrument in the two-dimensional medical image data and the craniocerebral three-dimensional model.
  7. A neurosurgical robot navigation and positioning method, characterized by comprising:
    creating a surgery plan on a digital graphic image, the surgery plan comprising a lesion position which is precisely self-targeted and a motion path thereof;
    acquiring capture information of an equipped marker unit, and calculating a position mapping between different spatial coordinate systems; and
    generating a specific motion scheme according to the lesion position which is precisely self-targeted and the motion path thereof and the position mapping between the different spatial coordinate systems, precisely self-targeting a lesion position, and securely locking the surgical instrument to support a surgical operation.
  8. The method according to claim 7, characterized by further comprising: constructing a craniocerebral three-dimensional model based on two-dimensional medical data, displaying a three-dimensional configuration and position  delineating the lesion, and creating the surgical plan on the three-dimensional configuration, the surgical plan comprising multiple targets and multiple cranial paths that are predetermined.
  9. The method according to claim 8, characterized by further comprising: solving the motion scheme of the motion executing device according to the lesion position which is precisely self-targeted and the motion path thereof and the position mapping between the different spatial coordinate systems, simulating and rehearsing the motion scheme in the craniocerebral three-dimensional model, and making an adjustment to the motion scheme.
  10. The method according to claim 9, characterized by further comprising: synchronously navigating a relative position of the surgical instrument in the two-dimensional medical image data and the craniocerebral three-dimensional model.
Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201610285561.XA CN105852970B (en) 2016-04-29 2016-04-29 Neurosurgical Robot navigation positioning system and method
CN201610285561.X 2016-04-29



Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6167296A (en) * 1996-06-28 2000-12-26 The Board Of Trustees Of The Leland Stanford Junior University Method for volumetric image navigation
CN1554315A (en) * 2003-12-26 2004-12-15 北京航空航天大学 Vision registering method for medical robot
CN104083219A (en) * 2014-07-11 2014-10-08 山东大学 Force-sensor-based coupling method for extracranial and intracranial coordinate systems in brain stereotactic surgery of neurosurgery
CN104146767A (en) * 2014-04-24 2014-11-19 薛青 Intraoperative navigation method and system for assisting in surgery

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6235038B1 (en) * 1999-10-28 2001-05-22 Medtronic Surgical Navigation Technologies System for translation of electromagnetic and optical localization systems
EP1531749A2 (en) * 2002-08-13 2005-05-25 Microbotics Corporation Microsurgical robot system
JP2008526422A (en) * 2005-01-13 2008-07-24 メイザー サージカル テクノロジーズ リミテッド Image-guided robot system for keyhole neurosurgery
FR2917598B1 (en) * 2007-06-19 2010-04-02 Medtech Multi-application robotic platform for neurosurgery and registration method
EP2468207A1 (en) * 2010-12-21 2012-06-27 Renishaw (Ireland) Limited Method and apparatus for analysing images
US20120226145A1 (en) * 2011-03-03 2012-09-06 National University Of Singapore Transcutaneous robot-assisted ablation-device insertion navigation system
US20130218005A1 (en) * 2012-02-08 2013-08-22 University Of Maryland, Baltimore Minimally invasive neurosurgical intracranial robot system and method
CN105286988A (en) * 2015-10-12 2016-02-03 北京工业大学 CT image-guided liver tumor thermal ablation needle location and navigation system
CN105496556B (en) * 2015-12-03 2019-03-01 中南民族大学 High-precision optical positioning system for surgical navigation

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110000793A (en) * 2019-04-29 2019-07-12 武汉库柏特科技有限公司 Robot motion control method and apparatus, storage medium, and robot
EP4074275A4 (en) * 2019-12-09 2023-05-10 Microport Navibot (Suzhou) Co., Ltd. Navigation surgery system and registration method therefor, electronic device, and support apparatus
CN112807084A (en) * 2020-06-01 2021-05-18 上海库欣医疗科技有限公司 Craniocerebral puncture path establishment and navigation method for brainstem hemorrhage surgical navigation
CN112587232A (en) * 2020-12-10 2021-04-02 中国人民解放军空军军医大学 VR simulation traction device and system for neurosurgery

Also Published As

Publication number Publication date
EP3253320A1 (en) 2017-12-13
EP3253320A4 (en) 2017-12-13
CN105852970A (en) 2016-08-17
CN105852970B (en) 2019-06-14

Similar Documents

Publication Publication Date Title
WO2017185540A1 (en) Neurosurgical robot navigation positioning system and method
US6259943B1 (en) Frameless to frame-based registration system
EP2153794B1 (en) System for and method of visualizing an interior of a body
US8682413B2 (en) Systems and methods for automated tracker-driven image selection
US8131031B2 (en) Systems and methods for inferred patient annotation
EP2583244B1 (en) Method of determination of access areas from 3d patient images
US20080119725A1 (en) Systems and Methods for Visual Verification of CT Registration and Feedback
US20160000518A1 (en) Tracking apparatus for tracking an object with respect to a body
CN109416841A (en) Surgical guide of the method and application this method of Imaging enhanced validity in wearable glasses
CA2681275A1 (en) Recognizing a real world fiducial in patient image data
Burgner et al. A study on the theoretical and practical accuracy of conoscopic holography‐based surface measurements: toward image registration in minimally invasive surgery
Mewes et al. Projector‐based augmented reality system for interventional visualization inside MRI scanners
WO2008035271A2 (en) Device for registering a 3d model
US9818175B2 (en) Removing image distortions based on movement of an imaging device
Alam et al. A review on extrinsic registration methods for medical images
Hamming et al. Automatic image‐to‐world registration based on x‐ray projections in cone‐beam CT‐guided interventions
Citardi et al. Image-guided sinus surgery: current concepts and technology
Xie et al. Image-guided navigation system for minimally invasive total hip arthroplasty (MITHA) using an improved position-sensing marker
US20180153622A1 (en) Method for Registering Articulated Anatomical Structures
US20220354579A1 (en) Systems and methods for planning and simulation of minimally invasive therapy
US20070274589A1 (en) Tracking rigid body structures which are difficult or impossible to register
EP4041114B1 (en) Patterned incision foil and method for determining a geometry of an anatomical surface
EP3024408B1 (en) Wrong level surgery prevention
Ahmadian et al. Fundamentals of navigation surgery
Li et al. C-arm based image-guided percutaneous puncture of minimally invasive spine surgery

Legal Events

Date Code Title Description
REEP Request for entry into the european phase
Ref document number: 2016741841
Country of ref document: EP
WWE Wipo information: entry into national phase
Ref document number: 2016741841
Country of ref document: EP
NENP Non-entry into the national phase
Ref country code: DE