CN116602766A - Orthopaedics operation system and control method thereof - Google Patents


Info

Publication number
CN116602766A
CN116602766A (Application CN202310666030.5A)
Authority
CN
China
Prior art date
Legal status
Pending
Application number
CN202310666030.5A
Other languages
Chinese (zh)
Inventor
陈刚
李自汉
陈雨杰
Current Assignee
Wuhan United Imaging Zhirong Medical Technology Co Ltd
Original Assignee
Wuhan United Imaging Zhirong Medical Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Wuhan United Imaging Zhirong Medical Technology Co Ltd
Priority to CN202310666030.5A
Publication of CN116602766A
Legal status: Pending

Classifications

    • A61B34/20 Surgical navigation systems; devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B17/16 Bone cutting, breaking or removal means other than saws, e.g. osteoclasts; drills or chisels for bones; trepans
    • A61B17/3472 Trocars; puncturing needles for bones, e.g. intraosseous injections
    • A61B34/10 Computer-aided planning, simulation or modelling of surgical operations
    • A61B34/30 Surgical robots
    • A61B34/70 Manipulators specially adapted for use in surgery
    • G16H20/40 ICT specially adapted for therapies or health-improving plans relating to mechanical, radiation or invasive therapies, e.g. surgery, laser therapy, dialysis or acupuncture
    • G16H30/20 ICT specially adapted for handling medical images, e.g. DICOM, HL7 or PACS
    • G16H40/63 ICT specially adapted for the operation of medical equipment or devices for local operation
    • A61B2034/101 Computer-aided simulation of surgical operations
    • A61B2034/105 Modelling of the patient, e.g. for ligaments or bones
    • A61B2034/107 Visualisation of planned trajectories or target regions
    • A61B2034/2046 Tracking techniques
    • A61B2034/2065 Tracking using image or pattern recognition

Landscapes

  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Surgery (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Medical Informatics (AREA)
  • Public Health (AREA)
  • General Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • Veterinary Medicine (AREA)
  • Robotics (AREA)
  • Primary Health Care (AREA)
  • Epidemiology (AREA)
  • Orthopedic Medicine & Surgery (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Pathology (AREA)
  • Urology & Nephrology (AREA)
  • Dentistry (AREA)
  • Radiology & Medical Imaging (AREA)
  • Business, Economics & Management (AREA)
  • General Business, Economics & Management (AREA)
  • Manipulator (AREA)
  • Magnetic Resonance Imaging Apparatus (AREA)
  • Apparatus For Radiation Diagnosis (AREA)

Abstract

The invention relates to an orthopedic operation system and a control method thereof. The orthopedic operation system comprises: a medical imaging device for acquiring in-vivo three-dimensional image data and for acquiring in-vivo two-dimensional image data in real time; a 3D visual imaging apparatus for acquiring in-vitro spatial coordinate data; an orthopedic operation robot; a workstation for fusing the in-vivo three-dimensional image data with the in-vitro spatial coordinate data to generate a three-dimensional virtual image, fusing the three-dimensional virtual image with the real-time in-vivo two-dimensional image, and establishing a spatial position mapping relation between the three-dimensional virtual image and the in-vivo two-dimensional image data so as to obtain a real-time navigation image; and a display device for displaying the navigation image. The orthopedic operation system and its control method provide real-time 3D image navigation, so the doctor no longer needs to rely on experience to mentally reconstruct a 3D model of the patient's fracture region.

Description

Orthopaedics operation system and control method thereof
Technical Field
The invention relates to the field of medical instruments, in particular to an orthopedic operation system and a control method thereof.
Background
When bone surgery is performed clinically for angulation, dislocation, or similar problems of a long-bone fracture of a patient's limb, corresponding equipment and techniques must be used for fixation, traction, and reduction: the broken ends are fixed, and then an intramedullary nail or a steel plate is used for internal fixation (or a device such as an external fixation frame is used for external fixation). During this process, because the muscles of the fractured limb are under tension and the reduction force is difficult to gauge, the broken ends must be pulled and rotated repeatedly, which may cause secondary injury to the patient during the operation (for example, the fracture surfaces cutting blood vessels or nerves).
Generally, orthopedic reduction is performed either by manual traction with several people working in coordination or by traction on a traction bed. This demands substantial experience and coordination from the doctors, and the operation takes a long time, which is extremely painful for the patient. During the operation, the doctor must navigate with the aid of a C-arm system to observe the fracture; however, the C-arm system used as the navigation equipment can only provide planar images and therefore supplies little spatial information. The doctor has to mentally reconstruct a 3D model of the fracture and plan the reduction and fixation operation, and the whole process requires extremely rich clinical experience.
Disclosure of Invention
Based on this, in order to provide real-time image navigation for orthopedic operations, it is necessary to provide an orthopedic operation system and a control method thereof.
An orthopedic operating system for assisting a doctor in performing an orthopedic operation on a patient, the orthopedic operating system comprising:
the medical imaging device is used for acquiring in-vivo three-dimensional image data of a patient and for acquiring, in real time, in-vivo two-dimensional image data of the patient, wherein the in-vivo three-dimensional image data is internal three-dimensional anatomical image data of the fracture region of the patient, and the in-vivo two-dimensional image data is internal planar anatomical image data of the fracture region of the patient;
the 3D visual imaging device is used for acquiring in-vitro spatial coordinate data of the patient, the in-vitro spatial coordinate data being a three-dimensional model of the patient's fractured limb output in the form of point cloud coordinates;
the orthopedics operation robot is used for performing orthopedics operation on a patient;
the workstation is in communication connection with the medical imaging device to receive the in-vivo three-dimensional image data and the in-vivo two-dimensional image data; the workstation is in communication connection with the 3D visual imaging device to acquire the in-vitro space coordinate data; the workstation is in communication connection with the orthopedics operation robot;
the workstation is used for performing image fusion on the in-vivo three-dimensional image data and the in-vitro spatial coordinate data to generate a three-dimensional virtual image, the three-dimensional virtual image reflecting the three-dimensional anatomical information of the fracture region of the patient; the workstation then fuses the three-dimensional virtual image with the real-time in-vivo two-dimensional image and establishes a spatial position mapping relation between the three-dimensional virtual image and the in-vivo two-dimensional image data, so that each position in the in-vivo two-dimensional image data can be located in the three-dimensional virtual image, thereby obtaining a real-time navigation image; and,
and the display device is in communication connection with the workstation and is used for displaying the navigation image.
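The spatial position mapping relation described above can be illustrated with a minimal sketch. The patent does not specify an algorithm, so the function name, the detector-plane convention, and the transform below are all hypothetical: the idea is simply that registration yields a homogeneous transform carrying a pixel of the live 2D image into the frame of the 3D virtual image.

```python
import numpy as np

def map_2d_to_virtual(pixel_uv, image_to_virtual):
    """Map a 2D in-vivo image pixel (u, v) into the 3D virtual-image frame
    using a precomputed 4x4 homogeneous transform (assumed to come from
    the fusion/registration step; this is an illustrative convention)."""
    u, v = pixel_uv
    # Treat the 2D pixel as a point on the detector plane (z = 0).
    p = np.array([u, v, 0.0, 1.0])
    q = image_to_virtual @ p
    return q[:3] / q[3]

# Hypothetical transform: translate the detector origin to (10, 20, 30) mm.
T = np.eye(4)
T[:3, 3] = [10.0, 20.0, 30.0]
print(map_2d_to_virtual((5.0, 7.0), T))  # a point at (15, 27, 30) mm
```

In a real system the transform would be re-estimated continuously as new 2D frames arrive, which is what makes the navigation image "real-time".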
In one embodiment, the workstation comprises:
the medical imaging equipment workstation is in communication connection with the medical imaging equipment to acquire the in-vivo three-dimensional image data and the in-vivo two-dimensional image data;
the navigation workstation is in communication connection with the 3D visual imaging device and is used for receiving and fusing the in-vivo three-dimensional image data, the in-vivo two-dimensional image data and the in-vitro space coordinate data, generating the navigation image and planning an orthopedics operation path at least partially according to the navigation image;
the robot workstation, wherein the orthopedic operation robot and the navigation workstation are both in communication connection with the robot workstation, so that the robot workstation can acquire the orthopedic operation path from the navigation workstation and control the orthopedic operation robot based on the orthopedic operation path.
In one embodiment, the orthopedic operation robot is a reduction robot, the orthopedic operation path is a bone reduction operation path, and the reduction robot is used for performing bidirectional traction and/or angle adjustment on a fracture area of a patient so as to perform orthopedic reduction operation.
In one embodiment, the reduction robot comprises a first fixing part and a second fixing part capable of fixing the fractured limb. The first fixing part is configured to be sleeved on the proximal end of the fractured limb, and the second fixing part on the distal end. The heights and angles of the first and second fixing parts can be adjusted to adjust the angle of the fractured limb, and the two fixing parts can be moved toward or away from each other to apply bidirectional traction to the fractured limb.
In one embodiment, the reduction robot further comprises a load cell for detecting and feeding back the traction force of the reduction robot.
In one embodiment, the orthopedic operation robot is a surgical robot for performing an orthopedic operation, and the orthopedic operation path is a bone operation path.
In one embodiment, the surgical robot is a drilling robot and the orthopedic surgical operation is a drilling operation performed by the drilling robot; the drilling robot comprises a mechanical arm and a bone drill, and the bone drill is connected with the mechanical arm;
alternatively, the surgical robot is a puncture robot, and the orthopedic operation is a puncture operation performed by the puncture robot; the puncture robot includes a puncture needle.
In one embodiment, the surgical robot comprises a mechanical arm and a surgical element connected with one end of the mechanical arm, the surgical element comprises a puncture needle and/or a fixing needle and/or a bone drill, the surgical robot is used for conducting drilling and/or puncture operation on a fracture area of a patient through the surgical element, and the display device is further used for displaying the position of the surgical element in the navigation image in real time.
In one embodiment, the system further comprises a laser indicating device communicatively connected to the navigation workstation, the laser indicating device being used for acquiring the bone surgery operation path from the navigation workstation and indicating a puncture target point according to that path.
In one embodiment, the system further comprises a manual console in communication connection with the orthopedic operation robot, the manual console being configured to allow manual control of the orthopedic operation robot.
In one embodiment, the medical imaging device includes a CT imaging device and a C-arm imaging device, where the CT imaging device is configured to acquire the in-vivo three-dimensional image data and send the acquired in-vivo three-dimensional image data to the workstation, and the C-arm imaging device is configured to acquire the in-vivo two-dimensional image data in real time and send the acquired in-vivo two-dimensional image data to the workstation;
alternatively, the medical imaging device is a CBCT imaging device.
The orthopedic operation system comprises the medical imaging device, the 3D visual imaging apparatus, the orthopedic operation robot, and the workstation. The workstation fuses the in-vivo three-dimensional image data of the patient acquired by the medical imaging device with the in-vitro spatial coordinate data of the patient acquired by the 3D visual imaging apparatus to generate a three-dimensional virtual image, and then fuses the three-dimensional virtual image with the real-time in-vivo two-dimensional image data acquired by the medical imaging device, thereby establishing a spatial position mapping relation between the three-dimensional virtual image and the in-vivo two-dimensional image data. A navigation image is generated from this mapping relation, so that real-time 3D image navigation is provided throughout the operation. Under this navigation, the doctor can observe the anatomical information of the patient's fracture region in real time and no longer needs to rely on experience to mentally reconstruct a 3D model of the fracture region, which effectively lowers the clinical-experience requirements on the operator. At the same time, because the navigation image and the operation robot assist the doctor, the automation and precision of the operation are improved, secondary injuries that might occur in a purely manual operation are avoided, the operation time is shortened, and the patient's pain during the operation is reduced.
The control method of the orthopaedics operation system comprises the following steps:
acquiring, by the medical imaging device, the in-vivo three-dimensional image data and sending it to the workstation;
acquiring the in-vitro space coordinate data through the 3D visual imaging device;
acquiring the in-vivo two-dimensional image data in real time through the medical imaging equipment and sending the in-vivo two-dimensional image data to the workstation;
performing, by the workstation, image fusion on the in-vivo three-dimensional image data and the in-vitro spatial coordinate data to generate the three-dimensional virtual image; fusing the three-dimensional virtual image with the real-time in-vivo two-dimensional image; and establishing a spatial position mapping relation between the three-dimensional virtual image and the in-vivo two-dimensional image data, so that each position in the in-vivo two-dimensional image data can be located in the three-dimensional virtual image, thereby obtaining the real-time navigation image;
and displaying the navigation image through the display device.
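The control method above can be sketched as a single display-update cycle. This is a skeleton under stated assumptions: every function name is hypothetical, and the fusion and registration steps are placeholders standing in for algorithms the patent does not specify.

```python
def fuse_volume_with_point_cloud(volume, cloud):
    # Placeholder for step 1+2: a real system would register the in-vivo
    # 3D image to the laser point cloud; here we only pair the inputs.
    return {"volume": volume, "cloud": cloud}

def register_2d_to_3d(virtual, frame):
    # Placeholder for step 3+4: a real system would compute a spatial
    # mapping between the live 2D frame and the 3D virtual image.
    return {"virtual": virtual, "frame": frame, "transform": "identity"}

def run_navigation_cycle(ct_volume, point_cloud, fluoro_frame):
    """One cycle of the claimed control method: fuse the 3D data with the
    extracorporeal coordinates, fuse the result with the live 2D frame,
    and return what the display device would show."""
    virtual = fuse_volume_with_point_cloud(ct_volume, point_cloud)
    mapping = register_2d_to_3d(virtual, fluoro_frame)
    return {"navigation_image": mapping}
```

In practice `run_navigation_cycle` would be invoked once per incoming C-arm frame, so the navigation image tracks the operation in real time.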
According to the control method of the orthopedic operation system, the three-dimensional virtual image is generated by fusing the patient's in-vivo three-dimensional image data with the in-vitro spatial coordinate data, and the three-dimensional virtual image is then fused with the real-time in-vivo two-dimensional image data to obtain the spatial position mapping relation between the three-dimensional virtual image and the in-vivo two-dimensional image data. A navigation image is generated from this relation so that real-time 3D image navigation accompanies the whole operation. The doctor can observe the anatomical information of the fracture region in real time under the navigation provided by the workstation and no longer needs to mentally reconstruct a 3D model of the fracture region from experience, which effectively lowers the clinical-experience requirements. At the same time, the navigation image and the operation robot assist the doctor, improving operation accuracy, avoiding secondary injuries that might occur in a purely manual operation, shortening the operation time through increased automation, and reducing the patient's pain during the operation.
Drawings
FIG. 1 is a block diagram of an orthopedic operating system according to one embodiment;
FIG. 2 is a schematic diagram of an orthopedic operating system according to an embodiment;
FIG. 3 is a flowchart of an orthopedic operating system control method according to an embodiment.
Detailed Description
In order that the above objects, features, and advantages of the invention may be readily understood, a more particular description of the invention is given below with reference to the appended drawings. In the following description, numerous specific details are set forth in order to provide a thorough understanding of the invention. The invention may, however, be embodied in many other forms than those described herein, and those skilled in the art can make similar modifications without departing from its spirit; the invention is therefore not limited to the specific embodiments disclosed below.
It will be understood that when an element is referred to as being "fixed to" another element, it can be directly on the other element or intervening elements may also be present. When an element is referred to as being "connected" to another element, it can be directly connected to the other element or intervening elements may also be present.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. The terminology used herein in the description of the invention is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. The term "and/or" as used herein includes any and all combinations of one or more of the associated listed items.
Referring to fig. 1, an orthopedic operation system 100 of an embodiment includes a medical imaging device 110, a workstation 120, an orthopedic operation robot 130, and a display device 140. The medical imaging device 110, the orthopedic operation robot 130, and the display device 140 are all communicatively connected to the workstation 120. The medical imaging device 110 is used for acquiring in-vivo three-dimensional image data of the fracture region of a patient, acquiring in-vivo two-dimensional image data of the patient in real time during the orthopedic operation, and transmitting both to the workstation 120 in real time. The in-vivo three-dimensional image data is the internal stereoscopic anatomical image data of the patient's fracture region scanned by the medical imaging device 110. The real-time in-vivo two-dimensional image data is the internal planar anatomical image data of the fracture region scanned by the medical imaging device 110 during the operation. The orthopedic operation robot 130 is configured to perform orthopedic operations on the patient: in one embodiment it applies bidirectional traction and/or angle adjustment to the patient's fractured limb, and in another embodiment it performs positioning, puncture, and/or drilling on the fractured limb.
The workstation 120 serves as the control and processing center of the whole orthopedic operation system. It acquires the in-vivo three-dimensional image data of the patient's fracture region acquired in advance by the medical imaging device 110 and the in-vivo two-dimensional image data acquired in real time, and then fuses the two. In one embodiment, the fusion process includes data-type conversion and unification of the image coordinate system (e.g., unifying the origin of coordinates). First, the in-vivo three-dimensional image data is converted into a data type consistent with the in-vivo two-dimensional image data. Then the coordinate origins of the two data sets are unified, so that the spatial position of every pixel of the in-vivo three-dimensional image data relative to the coordinate origin is determined, and the spatial position of the in-vivo two-dimensional image data relative to the same origin is also determined. A spatial position mapping relation between the in-vivo three-dimensional image data and the in-vivo two-dimensional image data is thus established, a navigation image is generated in real time, and the display device 140 outputs the navigation image.
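The origin-unification step can be sketched in a few lines. The patent does not say which origin is chosen, so using the centroid of the 3D data is an assumption made for illustration, as is the function name:

```python
import numpy as np

def unify_origin(points_3d, points_2d):
    """Express two point sets relative to one shared coordinate origin.
    Convention assumed here (not specified in the source): the centroid
    of the 3D data becomes the common origin for both sets."""
    origin = points_3d.mean(axis=0)
    return points_3d - origin, points_2d - origin

vol_pts = np.array([[0., 0., 0.], [2., 2., 2.]])   # 3D image pixels (mm)
img_pts = np.array([[1., 1., 1.]])                 # a 2D image point, already lifted to 3D (mm)
a, b = unify_origin(vol_pts, img_pts)              # centroid (1, 1, 1) becomes the origin
```

Once both sets share one origin, the spatial position of every pixel relative to that origin is fixed, which is exactly what the mapping relation in the paragraph above requires.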
The workstation 120 fuses the patient's in-vivo three-dimensional image data and real-time in-vivo two-dimensional image data acquired by the medical imaging device 110, establishes the spatial position mapping relation between them, and generates a navigation image according to that relation, thereby providing real-time 3D image navigation throughout the orthopedic operation. With the assistance of the workstation 120, the doctor can plan an orthopedic operation path according to the real-time 3D navigation image, and the orthopedic operation robot 130 assists the doctor in performing the operation along the planned path. Throughout the operation, the doctor can observe the anatomical information of the fracture region in real time under the navigation provided by the workstation 120, without needing to mentally reconstruct a 3D model of the fracture region from experience, which effectively lowers the clinical-experience requirements on the doctor. At the same time, assisting the operation with the navigation image and the orthopedic operation robot improves the automation and accuracy of the operation, avoids secondary injuries that might occur in purely manual operation, shortens the operation time, and reduces the patient's pain. The orthopedic operations referred to herein generally include orthopedic positioning operations, including orthopedic traction and fixation, as well as orthopedic surgical operations.
Specifically, referring to fig. 2, an orthopedic operation system 200 of an embodiment includes a medical imaging device, a 3D visual imaging apparatus 220, an orthopedic operation robot 230, a workstation 240, and a display device. The medical imaging device comprises a C-arm imaging device 211 and a CT imaging device 212. The C-arm imaging device 211, the CT imaging device 212, the 3D visual imaging apparatus 220, and the orthopedic operation robot 230 are all communicatively coupled to the workstation 240. The communication connection may be wired, such as a data transmission line, or wireless, such as WiFi or Bluetooth.
Specifically, the CT imaging device 212 is configured to acquire the in-vivo three-dimensional image data of the patient's fracture region before the orthopedic operation and transmit it to the workstation 240. The C-arm imaging device 211 is used to acquire the in-vivo two-dimensional image data of the fracture region in real time during the operation. Using the CT imaging device 212 and the C-arm imaging device 211 together for image navigation combines the in-vivo three-dimensional imaging capability of CT with the continuous monitoring capability of the C-arm, so that the medical imaging device can provide real-time three-dimensional navigation images.
It should be noted that the medical imaging device is not limited to the combination of the C-arm imaging device 211 and the CT imaging device 212 of the present embodiment. In other embodiments, the medical imaging device may be a CBCT (cone-beam computed tomography) device. CBCT offers both the continuous monitoring of C-arm imaging and the three-dimensional imaging of CT, and therefore meets the image requirements of orthopedic operation. In yet another embodiment, only a C-arm imaging device is present in the orthopedic operating room; it continuously acquires in-vivo two-dimensional image data of the fracture region to provide image navigation, while the in-vivo three-dimensional image data required for data fusion is provided before the operation by a third-party device or by another department of the hospital, such as radiology.
The 3D visual imaging apparatus 220 is configured to acquire the in-vitro spatial coordinate data of the patient; when the workstation 240 fuses the image data, adding the in-vitro spatial coordinate data allows the spatial coordinates of the in-vivo three-dimensional image data and the in-vivo two-dimensional image data to be fused more accurately. Specifically, the 3D visual imaging apparatus 220 is a laser visual imaging device. It images the external space of the patient's fractured limb, builds a three-dimensional model of the limb, and outputs the model in the form of point cloud coordinates. It also provides external three-dimensional image navigation for the orthopedic operation robot 230 by imaging the space around the patient and reflecting obstacle information in real time, preventing the robot from interfering with the limb clamping device or the external fixation frame.
Specifically, in one embodiment, workstation 240 comprises: a medical imaging device workstation 241 and a navigation workstation 242. Wherein the medical imaging device workstation 241 is communicatively coupled to the medical imaging device 210 and the medical imaging device workstation 241 is communicatively coupled to the navigation workstation. The medical imaging device workstation 241 is used for controlling the medical imaging device 210 and acquiring in-vivo three-dimensional image data and in-vivo two-dimensional image data of a fracture region of a patient from the medical imaging device 210.
In one embodiment, the navigation workstation 242 is communicatively coupled to the medical imaging device 210. The navigation workstation 242 receives and fuses the in-vivo three-dimensional image data and the in-vivo two-dimensional image data, generates a navigation image according to the fused image data, and assists the doctor in planning an orthopedics operation path according to the navigation image.
In another embodiment, the navigation workstation 242 is communicatively coupled to the medical imaging device 210 and the 3D visual imaging device 220. The navigation workstation 242 receives and fuses the in-vivo three-dimensional image data, the in-vivo two-dimensional image data, and the in-vitro spatial coordinate data, generates a navigation image from the fused data, and assists the doctor in planning a surgical path according to the navigation image. Specifically, in one embodiment, the in-vivo three-dimensional image data and the in-vitro spatial coordinate data may first be fused into a three-dimensional virtual image, and the three-dimensional virtual image is then fused with the real-time in-vivo two-dimensional image. Image fusion first requires a unified data type: the in-vitro spatial coordinate data acquired by the 3D visual imaging device 220 is point cloud coordinate data in STL format, while the in-vivo three-dimensional image data scanned by the medical imaging device is in DICOM format. The navigation workstation 242 can convert the DICOM data into STL data, thereby unifying the data types of the two. Image fusion then requires a unified coordinate origin: for example, the center point of the in-vivo three-dimensional image data is taken as the origin, the spatial position of each pixel of the in-vivo three-dimensional image data relative to this origin is determined, and the spatial position of the in-vitro spatial coordinate data acquired by the 3D visual imaging device 220 relative to the same origin is determined as well, so that fusion can be carried out.
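The origin-unification step described above can be sketched in a few lines of numpy. This is an illustrative sketch only: the patent does not specify an algorithm, and the function names, the voxel-index-to-millimeter conversion, and the use of the volume center as the shared origin are assumptions made for demonstration.

```python
import numpy as np

def recenter_volume_coords(shape, spacing_mm):
    # Physical (mm) coordinates of every voxel, expressed relative to
    # the volume's center point, which the text takes as the origin.
    idx = np.indices(shape).reshape(3, -1).T          # (N, 3) voxel indices
    phys = idx * np.asarray(spacing_mm, float)        # index -> millimeters
    center = (np.asarray(shape) - 1) * np.asarray(spacing_mm, float) / 2.0
    return phys - center

def recenter_point_cloud(points_mm, origin_mm):
    # Express camera point-cloud coordinates relative to the same
    # origin so both data sets live in one shared frame.
    return np.asarray(points_mm, float) - np.asarray(origin_mm, float)
```

For a 3x3x3 volume with 1 mm spacing, the middle voxel lands exactly on the shared origin, and any point-cloud sample can be shifted into the same frame with the second function.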
After fusion, a three-dimensional virtual image is reconstructed, which reflects the three-dimensional anatomical information of the fracture region of the patient. The three-dimensional virtual image is then fused with the real-time in-vivo two-dimensional image, and a spatial position mapping relationship is established between the three-dimensional virtual image and the in-vivo two-dimensional image data, so that the in-vivo two-dimensional image data can be located at its corresponding position in the three-dimensional virtual image, yielding a real-time 3D navigation image. The navigation workstation can also plan a reasonable orthopedic operation path based at least in part on the real-time 3D navigation image.
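How a 3D point finds its corresponding position in the 2D image is not detailed in the patent; one common way to build such a spatial mapping is to project points of the three-dimensional virtual image through a calibrated projection matrix for the C-arm. The pinhole model and the 3x4 matrix below are assumptions for illustration, not the patent's stated method.

```python
import numpy as np

def project_points(points_3d, projection):
    # Project 3D points (N, 3) to 2D image coordinates with a 3x4
    # projection matrix; an assumed pinhole model for the C-arm.
    pts_h = np.hstack([points_3d, np.ones((len(points_3d), 1))])
    uvw = pts_h @ projection.T
    return uvw[:, :2] / uvw[:, 2:3]                   # perspective divide
```

With such a mapping, each fluoroscopic pixel can be associated with the ray of 3D model points that projects onto it, which is the essence of the spatial position mapping relationship described above.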
In some embodiments, workstation 240 may also include robotic workstation 243. Robot workstation 243 is communicatively coupled to orthopedic operating robot 230, and robot workstation 243 is communicatively coupled to navigation workstation 242. The robot workstation 243 obtains the planned orthopedic operation path from the navigation workstation 242, and uses the orthopedic operation path as navigation to control the orthopedic operation robot 230 to automatically perform or assist the doctor in orthopedic operation.
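The patent does not describe how the robot workstation turns an operation path into robot motion. Purely as a hypothetical illustration, a controller might discretize each straight segment of the path into evenly spaced waypoints before streaming them to the robot; the function name and the maximum-step parameter below are invented for this sketch.

```python
import numpy as np

def interpolate_path(start, goal, max_step_mm):
    # Split the straight segment from start to goal into waypoints no
    # farther apart than max_step_mm (hypothetical streaming scheme).
    start = np.asarray(start, float)
    goal = np.asarray(goal, float)
    n = max(int(np.ceil(np.linalg.norm(goal - start) / max_step_mm)), 1)
    return [tuple(start + (goal - start) * i / n) for i in range(n + 1)]
```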
The medical imaging equipment and the orthopedic operation robot can be remotely controlled through the workstation, and the workstation can generate navigation images automatically. This reduces the contact time between doctors and the medical imaging equipment, protecting doctors from a large amount of radiation, and also enables remote assisted orthopedic operations in remote areas or under special conditions.
The orthopedic operating system 200 also includes a display device 280, the display device 280 being communicatively coupled to the workstation 240. The display device 280 is used to display the navigation image after image fusion and may be disposed within the control room and/or the orthopedic operating room. In the control room, a doctor can monitor the acquisition of the medical images of the patient through the display device 280 and, with the aid of the workstation and the navigation image displayed on the display device 280, plan the orthopedic operation path. Further, a suspended display device 281 may be disposed in the orthopedic operating room, through which the doctor can observe the image data during the orthopedic operation.
Specifically, in one embodiment, the display device 280 is further configured to display the position of the surgical element 2322 in the navigation image in real time, so that the doctor can perform the orthopedic operation with reference to the display device 280.
The orthopedic operation robot may include a reduction robot and/or a surgical robot. Referring to fig. 2, in one exemplary embodiment, the orthopedic operation robot 230 includes a reduction robot 231 and a surgical robot 232. The reduction robot 231 can perform bi-directional traction and angle-adjustment reduction operations on the fractured limb of a patient, thereby achieving accurate anastomosis of the broken ends of the bone. The surgical robot 232 can position the puncture target point in the fracture region of the patient according to the orthopedic operation path (surgical path) planned by the workstation 240, thereby performing a drilling operation or a puncture operation automatically or assisting the doctor in doing so.
Specifically, in one of the embodiments, the reduction robot 231 includes a first fixing portion 2311 and a second fixing portion 2312. The first fixing portion 2311 is sleeved on the proximal end of the fractured limb of the patient, and the second fixing portion 2312 is sleeved on the distal end of the fractured limb; together they preliminarily fix the fractured limb. Both fixing portions are connected with an electric push rod that can adjust their heights and angles, thereby adjusting the angle of the fractured limb. The first fixing portion 2311 and the second fixing portion 2312 are further connected with a traction motor, which can drive them to move toward or away from each other, thereby applying bi-directional traction to the fractured limb and achieving accurate anastomosis of the broken ends of the bone.
Further, in one embodiment, the reduction robot 231 further includes a load cell that provides feedback on the traction force of the reduction robot, preventing secondary damage to the fracture region caused by improper traction while the fractured limb is under reduction traction.
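The load-cell safeguard can be pictured as a traction step that is vetoed whenever the measured force reaches a limit. This is a toy sketch rather than the patent's control law; the function name, units, and threshold are assumptions.

```python
def safe_traction_step(position_mm, step_mm, force_n, force_limit_n):
    # One traction increment, vetoed when the load-cell reading meets
    # or exceeds the configured force limit (illustrative only).
    if force_n >= force_limit_n:
        return position_mm, False   # hold position, report the veto
    return position_mm + step_mm, True
```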
Further, in one embodiment, the orthopedic operating system 200 further includes a carrying device 250 configured to carry the patient. The reduction robot 231 is disposed on the carrying device 250 and is detachably connected with it.
Further, for the reduction robot 231, the navigation workstation 242 can automatically plan, or assist a doctor in planning, a bone reduction operation path according to the navigation image, and the robot workstation 243 acquires the bone reduction operation path and controls the reduction robot based on it. The other working steps of the workstation 240 are the same as those described above and are not repeated here.
Specifically, in one embodiment, the surgical robot 232 may include a mechanical arm 2321 and a surgical element 2322 connected to one end of the mechanical arm 2321. The surgical robot 232 is used to perform orthopedic surgical operations on the patient. Specifically, in one embodiment, the surgical element 2322 includes a puncture needle and/or a fixation needle and/or a bone drill. Further, the surgical element 2322 is hinged to the mechanical arm 2321, and the mechanical arm 2321 includes a plurality of connection arms hinged to one another, enabling multi-angle rotation of the surgical robot 232. Meanwhile, in one embodiment, the other end of the mechanical arm 2321 is connected to a moving device that can drive the surgical robot 232 to different fracture sites of the patient. The surgical robot 232 drills and/or punctures the fractured bone through the surgical element 2322 under the navigation positioning of the planned surgical path, and then performs implantation and fixation of the fractured bone by intramedullary nail fixation.
More specifically, the surgical robot 232 may be a drilling robot or a puncture robot. The drilling robot includes a mechanical arm and a bone drill coupled to the mechanical arm, and is used to perform the drilling operation; the puncture robot includes a puncture needle and is used to perform the puncture operation.
Further, for the surgical robot, the navigation workstation 242 can automatically plan, or assist a doctor in planning, a bone surgery operation path according to the navigation image, and the robot workstation 243 obtains the bone surgery operation path. The other working steps of the workstation 240 are the same as those described above and are not repeated here.
In particular, in the present embodiment, the orthopedic operating system 200 can be divided into an orthopedic operating room (e.g., a room for reduction, fixation, and surgery of limbs) and a control room. The C-arm imaging device 211, the CT imaging device 212, the 3D visual imaging device 220, and the orthopedic operation robot 230 are arranged in the orthopedic operating room, while the workstation 240 is disposed in the control room. Before an operation, when planning the operation path, the doctor only needs to remotely monitor the whole image acquisition process from the control room. This keeps the doctor away from a great deal of radiation and makes remote assisted operation possible in remote areas or special situations.
Further, in one embodiment, the orthopedic operating system 200 further includes a laser indicating device 260 communicatively connected to the navigation workstation 242. The laser indicating device 260 may be integrated with the 3D visual imaging device 220 and disposed on the mechanical arm 2321 of the surgical robot 232. The laser indicating device 260 obtains the planned surgical path from the navigation workstation 242 and indicates the puncture target point according to that path, assisting the doctor in performing drilling and puncture operations on the fracture region of the patient at the indicated target point.
Further, in one embodiment, the orthopedic operating system 200 further includes a manual console 270 communicatively coupled to the orthopedic operation robot 230, through which the doctor may manually control the orthopedic operation robot 230. In particular, the manual console may be coupled to the carrying device 250, which is convenient for the doctor during operations. It should be noted that, in addition to controlling the orthopedic operation robot 230, the manual console 270 may have additional function keys on its interface and corresponding program couplings in software, so as to implement manual control of the carrying device 250, the 3D visual imaging device 220, the medical imaging device, and the like.
The control method of the orthopaedics operation system comprises the following steps:
S302, acquiring in-vivo three-dimensional image data of a patient through the medical imaging equipment and sending the data to the workstation.
Specifically, the medical imaging device includes a CT imaging apparatus 212, which scans and obtains in-vivo three-dimensional image data of the fractured limb of the patient prior to surgery and transmits it to the medical imaging device workstation 241 and the navigation workstation 242.
S304, acquiring in-vivo two-dimensional image data of the patient in real time through the medical imaging equipment and sending the in-vivo two-dimensional image data to the workstation.
Specifically, in one embodiment, the medical imaging device further comprises a C-arm imaging device 211, which scans and acquires in-vivo two-dimensional image data of the fractured limb of the patient in real time during the surgical procedure and transmits it to the medical imaging device workstation 241 and the navigation workstation 242.
S306, fusing the in-vivo three-dimensional image data and the in-vivo two-dimensional image data through a workstation; and establishing a spatial position mapping relation between the in-vivo three-dimensional image data and the in-vivo two-dimensional image data, and generating a navigation image.
Preferably, in-vitro spatial coordinate data may also be obtained by the 3D visual imaging device 220. In this case, the in-vivo three-dimensional image data and the in-vitro spatial coordinate data need to be fused. Image fusion first requires a unified data type: the in-vitro spatial coordinate data acquired by the 3D visual imaging device 220 is point cloud coordinate data in STL format, while the in-vivo three-dimensional image data scanned by the medical imaging device is in DICOM format; the navigation workstation 242 can convert the DICOM data into STL data, thereby unifying the data types of the two. Next, the image coordinate origin is unified: for example, the center point of the in-vivo three-dimensional image data is taken as the origin, the spatial position of each pixel of the in-vivo three-dimensional image data relative to this origin is determined, and the spatial position of the in-vitro spatial coordinate data acquired by the 3D visual imaging device 220 relative to the same origin is determined as well, so that fusion can be carried out. After fusion, a three-dimensional virtual image that reflects the three-dimensional anatomical information of the fracture region is reconstructed. The three-dimensional virtual image is then fused with the real-time in-vivo two-dimensional image, and a spatial position mapping relationship is established between the three-dimensional virtual image and the in-vivo two-dimensional image data, so that the in-vivo two-dimensional image data can be located at its corresponding position in the three-dimensional virtual image, yielding a real-time 3D navigation image.
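Once the DICOM-derived model and the camera point cloud share a data type and unit system, registering one to the other amounts to estimating a rigid transform between the two frames. The patent does not name a registration algorithm; the Kabsch least-squares solution below, applied to corresponding point pairs, is one standard possibility offered only as an illustration.

```python
import numpy as np

def rigid_align(src, dst):
    # Least-squares rigid transform (Kabsch) with R @ src + t ~= dst,
    # for registering corresponding point pairs from two frames.
    src = np.asarray(src, float)
    dst = np.asarray(dst, float)
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)                 # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))        # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cd - R @ cs
    return R, t
```

In practice, point correspondences are rarely known in advance, so iterative schemes such as ICP repeat this closed-form step with re-estimated correspondences.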
S307: and displaying the navigation image through the display device.
Specifically, the display device 280 can display the navigation image in real time. For an orthopedic reduction operation, such as reduction and fixation, the display device 280 can display the fracture site in real time; for an orthopedic surgical operation, the display device 280 can display the position of the surgical element 2322 in the navigation image in real time.
S308: and planning a surgical path according to the navigation image.
Specifically, with the aid of the workstation 240, the doctor observes the real-time 3D anatomical image information of the patient's fracture region from the navigation image, thereby planning a reasonable surgical path and formulating a bone reduction and/or surgical plan.
S310, controlling the orthopaedics operation robot to operate according to the orthopaedics operation path.
Specifically, in one embodiment, the robot workstation 243 controls the surgical robot 232, under the navigation of the surgical path, to perform the puncturing and drilling actions automatically or with the doctor's assistance and to implant fixation needles that fix the fractured limb of the patient to the first fixing portion 2311 and the second fixing portion 2312 of the reduction robot 231. The robot workstation 243 then controls the reduction robot 231 to apply bi-directional traction to the fractured bones and adjust their angle so as to achieve precise anastomosis of the broken ends. A clamping and fixing device for the broken bone is then implanted, after which the fixation needles on the reduction robot are gradually withdrawn. The surgical robot then drills the bone marrow cavity under the guidance of the C-arm imaging device 211 and assists the doctor in implanting the intramedullary nail and the intramedullary nail fixation needle. Finally, the doctor completes the incision suturing and fits the fractured limb with a clamping device, an external fixing frame, and the like.
According to the control method of the orthopedic operating system, the in-vivo image data and the in-vitro spatial coordinate data of the patient are fused to provide image navigation for the operation process, and the workstation extracts the operation path so as to control the robot to perform the orthopedic operation. This replaces, or partially replaces, the manual operations that doctors would otherwise have to perform in an orthopedic operation according to clinical experience, improves the automation of the orthopedic operation, and is safer and faster than traditional manual traction reduction. It can effectively reduce the patient's pain during the operation, while remote control effectively protects the doctor from a large amount of radiation.
The technical features of the above-described embodiments may be combined arbitrarily. For brevity of description, not all possible combinations of these technical features are described; however, as long as a combination of technical features contains no contradiction, it should be considered to be within the scope of this specification.
The above examples illustrate only a few embodiments of the invention, which are described in detail but are not to be construed as limiting the scope of the invention. It should be noted that those skilled in the art can make several variations and modifications without departing from the spirit of the invention, all of which fall within the scope of the invention. Accordingly, the scope of protection of the invention is determined by the appended claims.

Claims (12)

1. An orthopedic operating system, comprising:
the medical imaging device is used for acquiring in-vivo three-dimensional image data of a patient and in-vivo two-dimensional image data of the patient in real time, wherein the in-vivo three-dimensional image data is internal three-dimensional anatomical image data of a fracture area of the patient, and the in-vivo two-dimensional image data is internal plane anatomical image data of the fracture area of the patient;
the 3D visual imaging device is used for acquiring in-vitro space coordinate data of a patient; the in-vitro space coordinate data is a three-dimensional model of a fractured limb of a patient, which is output in a point cloud coordinate form;
the orthopedics operation robot is used for performing orthopedics operation on a patient;
the workstation is in communication connection with the medical imaging device to receive the in-vivo three-dimensional image data and the in-vivo two-dimensional image data; the workstation is in communication connection with the 3D visual imaging device to acquire the in-vitro space coordinate data; the workstation is in communication connection with the orthopedics operation robot;
the workstation is used for carrying out image fusion on the in-vivo three-dimensional image data and the in-vitro space coordinate data to generate a three-dimensional virtual image, the three-dimensional virtual image is used for reflecting the three-dimensional anatomical information of a fracture area of a patient, then the three-dimensional virtual image is fused with the in-vivo two-dimensional image in real time, and a space position mapping relation between the three-dimensional virtual image and the in-vivo two-dimensional image data is established, so that the in-vivo two-dimensional image data can find a corresponding position in the three-dimensional virtual image, and a real-time navigation image is obtained; the method comprises the steps of,
and the display device is in communication connection with the workstation and is used for displaying the navigation image.
2. The orthopedic operating system of claim 1 wherein said workstation comprises:
the medical imaging equipment workstation is in communication connection with the medical imaging equipment to acquire the in-vivo three-dimensional image data and the in-vivo two-dimensional image data;
the navigation workstation is in communication connection with the 3D visual imaging device and is used for receiving and fusing the in-vivo three-dimensional image data, the in-vivo two-dimensional image data and the in-vitro space coordinate data, generating the navigation image and planning an orthopedics operation path at least partially according to the navigation image;
and the orthopedics operation robot and the navigation workstation are in communication connection with the robot workstation, so that the robot workstation can acquire the orthopedics operation path from the navigation workstation and control the orthopedics operation robot based on the orthopedics operation path.
3. The orthopedic operating system of claim 2 wherein the orthopedic operating robot is a reduction robot, the orthopedic operating path being a bone reduction operating path, the reduction robot being configured to perform bi-directional traction and/or angle adjustment on a fractured region of a patient for an orthopedic reduction operation.
4. The orthopedic operating system of claim 3 wherein the reduction robot comprises a first fixation portion and a second fixation portion capable of fixing a fractured limb, the first fixation portion configured to be sleeved on a proximal body end of the fractured limb, the second fixation portion configured to be sleeved on a distal body end of the fractured limb, the height and angle of the first fixation portion and the second fixation portion being adjustable for angular adjustment of the fractured limb, and the first fixation portion and the second fixation portion being capable of approaching or moving away from each other for bi-directional distraction of the fractured limb.
5. The orthopedic operating system of claim 4 wherein said reduction robot further comprises a load cell for detecting and feeding back the traction force of said reduction robot.
6. The orthopedic operating system of claim 2 wherein the orthopedic operating robot is a surgical robot for performing orthopedic surgical procedures, the orthopedic operating path being a bone surgical operating path.
7. The orthopedic operating system of claim 6 wherein said surgical robot is a drilling robot and said orthopedic surgical operation is a drilling operation by said drilling robot; the drilling robot comprises a mechanical arm and a bone drill, and the bone drill is connected with the mechanical arm;
alternatively, the surgical robot is a puncture robot, and the orthopedic surgical operation is a puncture operation performed by the puncture robot; the puncture robot includes a puncture needle.
8. The orthopedic operating system according to claim 6, characterized in that the surgical robot comprises a mechanical arm and a surgical element connected with one end of the mechanical arm, the surgical element comprises a puncture needle and/or a fixation needle and/or a bone drill, the surgical robot performs drilling and/or puncture operation on a fracture area of a patient through the surgical element, and the display device is further used for displaying the position of the surgical element in the navigation image in real time.
9. The orthopedic operating system of claim 6 further comprising a laser indicating device communicatively coupled to the navigation workstation, the laser indicating device configured to obtain the bone surgery operating path from the navigation workstation and indicate a puncture target point based on the bone surgery operating path.
10. The orthopedic operating system of claim 1, further comprising a manual console in communicative connection with the orthopedic operating robot, the manual console being configured to allow manual control of the orthopedic operating robot.
11. The orthopedic operating system of claim 1 wherein said medical imaging device comprises a CT imaging device for acquiring said in vivo three-dimensional image data and transmitting to said workstation and a C-arm imaging device for acquiring said in vivo two-dimensional image data in real time and transmitting to said workstation;
alternatively, the medical imaging device is a CBCT imaging device.
12. A control method using the orthopedic operating system according to any one of claims 1 to 11, characterized by comprising the steps of:
acquiring the in-vivo three-dimensional image data and sending the data to the workstation through the medical imaging equipment;
acquiring the in-vitro space coordinate data through the 3D visual imaging device;
acquiring the in-vivo two-dimensional image data in real time through the medical imaging equipment and sending the in-vivo two-dimensional image data to the workstation;
the in-vivo three-dimensional image data and the in-vitro space coordinate data are subjected to image fusion through the workstation to generate the three-dimensional virtual image, the three-dimensional virtual image is fused with the real-time in-vivo two-dimensional image, and a space position mapping relation between the three-dimensional virtual image and the in-vivo two-dimensional image data is established, so that the in-vivo two-dimensional image data can find a corresponding position in the three-dimensional virtual image, and the real-time navigation image is obtained;
and displaying the navigation image through the display device.
CN202310666030.5A 2018-01-31 2018-01-31 Orthopaedics operation system and control method thereof Pending CN116602766A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310666030.5A CN116602766A (en) 2018-01-31 2018-01-31 Orthopaedics operation system and control method thereof

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201810097899.1A CN108392271A (en) 2018-01-31 2018-01-31 Orthopaedics operating system and its control method
CN202310666030.5A CN116602766A (en) 2018-01-31 2018-01-31 Orthopaedics operation system and control method thereof

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CN201810097899.1A Division CN108392271A (en) 2018-01-19 2018-01-31 Orthopaedics operating system and its control method

Publications (1)

Publication Number Publication Date
CN116602766A true CN116602766A (en) 2023-08-18

Family

ID=63095941

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202310666030.5A Pending CN116602766A (en) 2018-01-31 2018-01-31 Orthopaedics operation system and control method thereof
CN201810097899.1A Pending CN108392271A (en) 2018-01-19 2018-01-31 Orthopaedics operating system and its control method

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN201810097899.1A Pending CN108392271A (en) 2018-01-19 2018-01-31 Orthopaedics operating system and its control method

Country Status (1)

Country Link
CN (2) CN116602766A (en)

Families Citing this family (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019141262A1 (en) * 2018-01-19 2019-07-25 Shenzhen United Imaging Healthcare Co., Ltd. Bone fracture reduction device and system
WO2019228530A1 (en) 2018-05-31 2019-12-05 Shanghai United Imaging Healthcare Co., Ltd. Systems and methods for controllinig an x-ray imaging device
CN109620274B (en) * 2018-12-12 2021-05-14 上海联影医疗科技股份有限公司 Mechanical arm navigation method and system of C-arm machine and computer readable storage medium
CN109316235B (en) * 2018-10-10 2021-04-30 江西欧德医疗器材有限公司 Intelligent spicule control system and using method
CN109464194A (en) * 2018-12-29 2019-03-15 上海联影医疗科技有限公司 Display methods, device, medical supply and the computer storage medium of medical image
CN109498106B (en) * 2018-12-26 2021-11-19 哈尔滨工程大学 Intramedullary nail hole positioning and navigation method based on three-dimensional image
CN109717957B (en) * 2018-12-27 2021-05-11 北京维卓致远医疗科技发展有限责任公司 Control system based on mixed reality
CN114711969B (en) * 2019-01-21 2023-10-31 华科精准(北京)医疗科技有限公司 Surgical robot system and application method thereof
CN110141361A (en) * 2019-05-13 2019-08-20 王军强 A kind of laser surgey scalpel system for orthopaedics drilling
CN112869856B (en) * 2021-02-08 2022-04-01 清华大学 Two-dimensional image guided intramedullary needle distal locking robot system and locking method thereof
CN111297435B (en) * 2020-01-21 2021-09-24 惟精医疗器械(天津)有限公司 Medical drilling method, system, apparatus and computer readable storage medium
CN111297463B (en) * 2020-02-21 2022-08-26 京东方科技集团股份有限公司 Skeleton reduction system and skeleton reduction experiment system
CN113538572A (en) * 2020-04-17 2021-10-22 杭州三坛医疗科技有限公司 Method, device and equipment for determining coordinates of target object
CN111613318A (en) * 2020-05-12 2020-09-01 上海上实龙创智慧能源科技股份有限公司 System and method for remote surgery
CN111590584B (en) * 2020-05-27 2021-12-10 京东方科技集团股份有限公司 Determination method and device of safety limit area, reset method and medical robot
CN112155737B (en) * 2020-10-16 2022-04-22 惟精医疗器械(天津)有限公司 System and method for implanting detection device into cranium
CN112402000B (en) * 2020-11-13 2022-03-11 山东中医药大学附属医院 Steel plate insertion auxiliary device, steel plate automatic insertion system and control method
CN112370170B (en) * 2020-11-13 2022-06-17 毕建平 Robot combined traction and bone fracture reduction cooperation system and control method thereof
CN112370153B (en) * 2020-11-13 2022-03-11 山东中医药大学 Integrated operation system for limb fracture and control method
CN113231751B (en) * 2021-05-19 2022-09-23 北京航空航天大学杭州创新研究院 Laser equipment for orthopedic surgery and use method
CN113558765B (en) * 2021-07-09 2023-03-21 北京罗森博特科技有限公司 Navigation and reset operation control system and method
CN113729941B (en) * 2021-09-23 2024-01-30 上海卓昕医疗科技有限公司 VR-based operation auxiliary positioning system and control method thereof
CN114052903A (en) * 2021-10-09 2022-02-18 山东大学 Near-infrared imaging surgical navigation system and method
CN114788734A (en) * 2022-06-23 2022-07-26 康达洲际医疗器械有限公司 Intraoperative three-dimensional navigation method and system based on double-C-arm imaging system

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN100435735C (en) * 2006-09-30 2008-11-26 南方医科大学 Human body orthopedic navigation system
CN101862205A (en) * 2010-05-25 2010-10-20 中国人民解放军第四军医大学 Intraoperative tissue tracking method combined with preoperative image
CN103211655B (en) * 2013-04-11 2016-03-09 深圳先进技术研究院 Orthopedic surgery navigation system and navigation method
CN103519895A (en) * 2013-10-18 2014-01-22 江苏艾迪尔医疗科技股份有限公司 Orthopedic operation auxiliary guide method
CN105434048B (en) * 2016-01-25 2017-09-08 杭州三坛医疗科技有限公司 Orthopedic surgical robot based on noninvasive real-time surgical positioning and navigation equipment
CN105852970B (en) * 2016-04-29 2019-06-14 北京柏惠维康科技有限公司 Neurosurgical Robot navigation positioning system and method

Also Published As

Publication number Publication date
CN108392271A (en) 2018-08-14

Similar Documents

Publication Publication Date Title
CN116602766A (en) Orthopaedics operation system and control method thereof
CN112641510B (en) Joint replacement surgical robot navigation positioning system and method
WO2022126828A1 (en) Navigation system and method for joint replacement surgery
US11974761B2 (en) Surgical system for cutting an anatomical structure according to at least one target plane
EP3551099B1 (en) Surgical system for cutting an anatomical structure according to at least one target plane
CN109925058B (en) Minimally invasive spinal surgery navigation system
US20210068845A1 (en) Surgical system for cutting an anatomical structure according to at least one target plane
Haberland et al. Incorporation of intraoperative computerized tomography in a newly developed spinal navigation technique
KR20210158877A (en) Methods for conducting guided oral and maxillofacial procedures, and associated system
JP2016503319A (en) System and method for guidance and control of an implant placement device
CN112842533B (en) Flexible surgical tool and vascular intervention surgical robot system
Bouazza-Marouf et al. Robotic-assisted internal fixation of femoral fractures
KR20160136330A (en) Surgical robot system for integrated surgical planning and implant preparation, and associated method
CN115624385B (en) Preoperative space registration method and device, computer equipment and storage medium
US20140324182A1 (en) Control system, method and computer program for positioning an endoprosthesis
WO2022267838A1 (en) Spinal surgery robot system for screw placement operation
CN113729941B (en) VR-based surgical auxiliary positioning system and control method thereof
CN117064557B (en) Surgical robot for orthopedic surgery
CN107898499B (en) Orthopedic three-dimensional region positioning system and method
CN219021534U (en) Master-slave teleoperation orthopedics robot system
CN113729940B (en) Surgical auxiliary positioning system and control method thereof
CN114711961A (en) Virtual reality navigation method and system for spinal endoscopic surgery
Zixiang et al. Robot-assisted orthopedic surgery
Portaccio et al. Design of a positioning system for orienting surgical cannulae during Minimally Invasive Spine Surgery
CN117462253A (en) Medical navigation equipment and method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination