CN113171173A - VR (virtual reality) preoperative planning method and system for surgical robot - Google Patents


Info

Publication number
CN113171173A
Authority
CN
China
Prior art keywords
virtual
surgical robot
beam cone
range
virtual reality
Prior art date
Legal status
Granted
Application number
CN202110469212.4A
Other languages
Chinese (zh)
Other versions
CN113171173B (en)
Inventor
毕航
徐欣
Current Assignee
Shanghai Electric Group Corp
Original Assignee
Shanghai Electric Group Corp
Priority date
Filing date
Publication date
Application filed by Shanghai Electric Group Corp filed Critical Shanghai Electric Group Corp
Priority to CN202110469212.4A priority Critical patent/CN113171173B/en
Publication of CN113171173A publication Critical patent/CN113171173A/en
Application granted granted Critical
Publication of CN113171173B publication Critical patent/CN113171173B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/10 Computer-aided planning, simulation or modelling of surgical operations
    • A61B 34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B 34/30 Surgical robots
    • A61B 2034/101 Computer-aided simulation of surgical operations
    • A61B 2034/105 Modelling of the patient, e.g. for ligaments or bones
    • A61B 2034/107 Visualisation of planned trajectories or target regions
    • A61B 2034/2046 Tracking techniques
    • A61B 2034/2055 Optical tracking systems

Landscapes

  • Health & Medical Sciences (AREA)
  • Surgery (AREA)
  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Biomedical Technology (AREA)
  • Robotics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Manipulator (AREA)

Abstract

The invention provides a VR preoperative planning method and system for a surgical robot. The method comprises the following steps: step S1, constructing a virtual reality scene; step S2, constructing a virtual beam cone in the virtual reality scene according to the field of view of the optical positioner in the surgical robot; and step S3, generating a planned path according to the contact condition between objects in the virtual reality scene and the virtual beam cone. The beneficial effects of the technical scheme are as follows: because the virtual beam cone is constructed according to the field of view of the optical positioner and the planned path is generated from the contact condition between objects and the cone, the path indicates how the operator and the surgical instruments should move during the operation. They therefore no longer interfere with the infrared beams between the optical positioner and the photosensitive balls, the optical positioning system can work normally, and its positioning results are more accurate.

Description

VR (virtual reality) preoperative planning method and system for surgical robot
Technical Field
The invention relates to the field of surgical robots, in particular to a VR preoperative planning method and system for a surgical robot.
Background
With the large-scale popularization and application of virtual reality (VR) technology, VR is increasingly integrated with the medical industry, which brings great convenience to doctors and patients. A surgical robot comprises an optical navigation system, an optical positioning system and a mechanical arm system. It not only provides surgical instruments with multiple degrees of freedom, making operations more flexible, but also presents the doctor with magnified three-dimensional images, making operations more precise.
However, during operation of the surgical robot, the optical positioning system requires the infrared beams emitted by the optical positioner to irradiate at all times the photosensitive balls mounted on the pelvic reference frame and the mechanical arm reference frame. If these infrared beams are blocked, the normal operation of the optical positioning system and its real-time position calculation are disturbed, system accuracy and the surgical outcome are seriously affected, and severe consequences may follow.
In practice, the operator and the surgical instruments move as the operation proceeds, which can interfere with the infrared beams between the optical positioner and the photosensitive balls and make the positioning results inaccurate. If the operator continues the operation based on inaccurate positioning results, the surgical outcome may be affected and the patient's life may be put at risk.
Disclosure of Invention
Aiming at the above problems in the prior art, the invention provides a VR preoperative planning method and system for a surgical robot.
The VR preoperative planning method for the surgical robot is applied to the surgical robot and comprises the following steps:
step S1, constructing a virtual reality scene;
step S2, constructing a virtual beam cone according to the visual field range of the optical positioner in the surgical robot in the virtual reality scene;
and step S3, generating a planned path according to the contact condition between objects in the virtual reality scene and the virtual beam cone.
Preferably, the step S1 includes:
step S11, constructing a virtual reality scene in the three-dimensional virtual environment;
step S12, acquiring real-time state data of the object in the surgical scene, and converting the real-time state data into rendering data;
and step S13, rendering the virtual reality scene in real time according to the rendering data.
Preferably, the step S2 includes:
step S21, measuring and calculating the visual field range of the light beam emitted by the optical positioner;
and step S22, performing three-dimensional modeling according to the measured visual field range to generate the virtual beam cone.
Preferably, the step S3 includes:
step S31, predefining a first movable range of the object and a second movable range of the virtual beam cone;
step S32, acquiring the contact condition between the object and the virtual beam cone during the operation;
step S33, generating the planned path according to the contact condition, the first movable range and the second movable range.
Preferably, in step S31, the first movable range is a movement range and a rotation range of the object in six degrees of freedom.
Preferably, the step S31 includes:
step S311, obtaining the component movement range of the components of the optical positioner;
step S312, defining the second movable range of the virtual beam cone according to the component movement range.
Preferably, the component movement range comprises the rotational degrees of freedom and the rotation angles of the components.
Preferably, the second movable range of the virtual beam cone is the angle through which the virtual beam cone can rotate around the fixed axis of the optical positioner.
Preferably, the method further comprises the following steps:
controlling the display and hiding of the virtual beam cone during the operation.
A surgical robot VR preoperative planning system, applied to a surgical robot, comprising:
the first construction module is used for constructing a virtual reality scene;
the second construction module is connected with the first construction module and used for constructing a virtual beam cone according to the visual field range of an optical positioner in the surgical robot in the virtual reality scene;
and the planning module is connected with the second construction module and is used for generating a planned path according to the contact condition between objects in the virtual reality scene and the virtual beam cone.
The beneficial effects of the technical scheme of the invention are as follows: a virtual beam cone is constructed in a virtual reality scene according to the field of view of the optical positioner, and a planned path is generated according to the contact condition between objects in the virtual reality scene and the virtual beam cone, so as to indicate the movement paths of the operator and the surgical instruments during the operation. The infrared beams between the optical positioner and the photosensitive balls are thus no longer interfered with, the optical positioning system always works normally, and its positioning results are more accurate.
Drawings
Fig. 1 is a schematic flow chart of a VR preoperative planning method for a surgical robot in a preferred embodiment of the invention;
FIG. 2 is a schematic flow chart of step S1 according to the preferred embodiment of the present invention;
FIG. 3 is a schematic flow chart of step S2 according to the preferred embodiment of the present invention;
FIG. 4 is a schematic flow chart of step S3 according to the preferred embodiment of the present invention;
FIG. 5 is a schematic flow chart of step S31 according to the preferred embodiment of the present invention;
fig. 6 is a schematic structural diagram of a VR preoperative planning system of a surgical robot according to a preferred embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that the embodiments and features of the embodiments may be combined with each other without conflict.
The invention is further described with reference to the following drawings and specific examples, which are not intended to be limiting.
The invention provides a VR preoperative planning method and system for a surgical robot. The VR preoperative planning method is applied to a surgical robot and, as shown in fig. 1, comprises the following steps:
step S1, constructing a virtual reality scene;
step S2, constructing a virtual beam cone according to the visual field range of the optical positioner in the surgical robot in the virtual reality scene;
and step S3, generating a planned path according to the contact condition between objects in the virtual reality scene and the virtual beam cone.
Specifically, during an operation the movement of the operator may obstruct the infrared beams between the optical positioner and the photosensitive balls, which affects the normal operation of the optical positioning system, causes inaccurate positioning, and in turn seriously interferes with surgical accuracy and the surgical outcome. To address this problem, the invention provides a VR preoperative planning method for a surgical robot. Through steps S1 to S3, a virtual reality scene is constructed based on virtual reality technology, a corresponding virtual beam cone is constructed in the scene according to the field of view of the optical positioner, and a planned surgical path is then generated based on the contact condition between the virtual beam cone and the movement paths of objects such as the operator and the surgical instruments. This planned path guides the movement of the operator and the surgical instruments during the operation so that they do not interfere with the infrared beams between the optical positioner and the photosensitive balls; the optical positioning system therefore always works normally, and the positioning results are more accurate.
It should be noted that accurate positioning is possible only when the infrared transmission and reception between the optical positioner and the photosensitive balls is unobstructed. All surgical instruments other than the optical positioner and the photosensitive balls, as well as the surgical operators, are therefore regarded as "objects", and the interference of an object with the infrared transmission and reception can be obtained by acquiring the contact condition between the object and the virtual beam cone.
In a preferred embodiment of the present invention, as shown in fig. 2, step S1 includes:
step S11, constructing a virtual reality scene in the three-dimensional virtual environment;
step S12, acquiring real-time state data of objects in the surgical scene, and converting the real-time state data into rendering data;
and step S13, rendering the virtual reality scene in real time according to the rendering data.
Specifically, since the operator and the surgical instruments may move during the operation, the real-time state data of the objects in the surgical scene are acquired through steps S11 to S13 and converted into rendering data, so that the virtual reality scene is rendered in real time.
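As an illustrative sketch only (not part of the patent), the conversion from real-time state data to rendering data in step S12 can be reduced to turning each tracked pose into a scene-graph transform. The function name and the (w, x, y, z) quaternion convention are assumptions for illustration:

```python
import numpy as np

def pose_to_matrix(position, quaternion):
    """Convert a tracked pose (position + unit quaternion, order w, x, y, z)
    into a 4x4 homogeneous transform for placing an object's model in the
    virtual reality scene."""
    w, x, y, z = quaternion
    # Standard quaternion -> rotation matrix expansion
    rot = np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
    ])
    mat = np.eye(4)
    mat[:3, :3] = rot
    mat[:3, 3] = position
    return mat
```

A rendering engine would consume one such matrix per tracked object per frame; the specific engine interface is left open here.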
In a preferred embodiment of the present invention, as shown in fig. 3, step S2 includes:
step S21, measuring and calculating the visual field range of the light beam emitted by the optical positioner;
and step S22, performing three-dimensional modeling according to the measured visual field range to generate a virtual beam cone.
Specifically, the field of view of the light beam emitted by the optical positioner is measured through steps S21 to S22, and three-dimensional modeling is performed on the measured field of view to obtain the virtual beam cone.
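To make step S22 concrete, the following sketch (an assumption for illustration, not the patent's implementation; the half-angle and working-range parameters are placeholders for measured values) approximates the beam cone as an apex plus a circular base ring, the kind of vertex data a 3D engine could triangulate into a cone mesh:

```python
import numpy as np

def beam_cone_vertices(apex, axis, half_angle_deg, length, segments=32):
    """Generate vertices for a cone with its apex at the optical positioner's
    emitter: the apex point plus a ring of `segments` points on the circular
    base at the far end of the measured working range."""
    apex = np.asarray(apex, dtype=float)
    axis = np.asarray(axis, dtype=float)
    axis = axis / np.linalg.norm(axis)
    # Orthonormal basis (u, v) spanning the plane perpendicular to the axis
    helper = np.array([1.0, 0.0, 0.0])
    if abs(np.dot(helper, axis)) > 0.9:
        helper = np.array([0.0, 1.0, 0.0])
    u = np.cross(axis, helper)
    u = u / np.linalg.norm(u)
    v = np.cross(axis, u)
    radius = length * np.tan(np.radians(half_angle_deg))
    center = apex + axis * length
    angles = np.linspace(0.0, 2.0 * np.pi, segments, endpoint=False)
    ring = np.array([center + radius * (np.cos(a) * u + np.sin(a) * v)
                     for a in angles])
    return apex, ring
```

Real optical trackers have pyramidal or irregular working volumes rather than perfect cones; a circular cone is the simplest stand-in consistent with the patent's wording.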
In a preferred embodiment of the present invention, as shown in fig. 4, step S3 includes:
step S31, predefining a first movable range of the object and a second movable range of the virtual beam cone;
step S32, acquiring the contact condition between the object and the virtual beam cone during the operation;
and step S33, generating the planned path according to the contact condition, the first movable range and the second movable range.
In a preferred embodiment of the present invention, in step S31, the first movable range is a movement range and a rotation range of the object in six degrees of freedom.
In a preferred embodiment of the present invention, as shown in fig. 5, step S31 includes:
step S311, obtaining the component movement range of the components of the optical positioner;
step S312, defining the second movable range of the virtual beam cone according to the component movement range.
In a preferred embodiment of the invention, the component movement range comprises the rotational degrees of freedom and the rotation angles of the components.
In a preferred embodiment of the present invention, the second movable range of the virtual beam cone is the angle through which the virtual beam cone can rotate around the fixed axis of the optical positioner.
Specifically, a virtual reality scene corresponding to the whole surgical process is constructed, and the first movable range of the object and the second movable range of the virtual beam cone during the operation are defined. The contact condition between the object and the virtual beam cone is obtained from the movement of the object during the operation, and a preoperative planned path is generated according to the contact condition, the first movable range and the second movable range, so as to indicate the movement paths of the operator, the surgical instruments and the optical positioner during the operation.
Further, the movement range and the rotation range of the object in six degrees of freedom may be obtained in advance to define the first movable range of the object.
Further, the rotational degrees of freedom and the rotation angles of the components of the optical positioner may be measured in advance through steps S311 to S312, and the second movable range of the virtual beam cone may then be defined accordingly. The resulting second movable range can be regarded as the angle through which the virtual beam cone can rotate around the fixed axis of the optical positioner.
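As a hedged sketch of how the contact condition in step S32 might be computed (the patent does not specify a test, and every name and parameter here is an illustrative assumption), an object sample point can be checked against the virtual beam cone analytically, and candidate waypoints that would obstruct the beam can be rejected when generating the planned path:

```python
import numpy as np

def in_beam_cone(point, apex, axis, half_angle_deg, length):
    """Return True if `point` lies inside the virtual beam cone, i.e. an
    object at that position would obstruct the positioner's line of sight."""
    axis = np.asarray(axis, dtype=float)
    axis = axis / np.linalg.norm(axis)
    d = np.asarray(point, dtype=float) - np.asarray(apex, dtype=float)
    along = np.dot(d, axis)              # projection onto the cone axis
    if along < 0.0 or along > length:    # behind the emitter or beyond range
        return False
    dist = np.linalg.norm(d)
    if dist == 0.0:
        return True                      # the apex itself
    angle = np.degrees(np.arccos(np.clip(along / dist, -1.0, 1.0)))
    return angle <= half_angle_deg

def safe_waypoints(waypoints, apex, axis, half_angle_deg, length):
    """Filter a candidate path, keeping only waypoints that do not
    contact the beam cone."""
    return [p for p in waypoints
            if not in_beam_cone(p, apex, axis, half_angle_deg, length)]
```

A full planner would also have to test the line segments between waypoints and the swept volume of the instruments, not just point samples; this sketch shows only the core geometric predicate.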
In a preferred embodiment of the present invention, the method further comprises controlling the display and hiding of the virtual beam cone during the operation. Specifically, the virtual beam cone can be hidden in time to meet actual surgical requirements.
The VR preoperative planning system for a surgical robot is applied to the surgical robot and, as shown in fig. 6, comprises:
the first construction module 1 is used for constructing a virtual reality scene;
the second construction module 2 is connected with the first construction module 1 and used for constructing a virtual beam cone according to the visual field range of an optical positioner in the surgical robot in the virtual reality scene;
and the planning module 3 is connected with the second construction module 2 and is used for generating a planned path according to the contact condition between objects in the virtual reality scene and the virtual beam cone.
Specifically, to prevent the movement of the operator and the surgical instruments from obstructing the infrared beams between the optical positioner and the photosensitive balls during the operation and thereby disturbing the normal operation of the optical positioning system, the invention further provides a VR preoperative planning system for a surgical robot. The first construction module 1 first constructs a virtual reality scene; the second construction module 2 then constructs a virtual beam cone in the virtual reality scene according to the field of view of the optical positioner in the surgical robot; finally, the planning module 3 generates a planned path according to the contact condition between objects in the virtual reality scene and the virtual beam cone.
The beneficial effects of the technical scheme of the invention are as follows: a virtual beam cone is constructed in a virtual reality scene according to the field of view of the optical positioner, and a planned path is generated according to the contact condition between objects in the virtual reality scene and the virtual beam cone, so as to indicate the movement paths of the operator and the surgical instruments during the operation. The infrared beams between the optical positioner and the photosensitive balls are thus no longer interfered with, the optical positioning system always works normally, and its positioning results are more accurate.
While the invention has been described with reference to a preferred embodiment, it will be understood by those skilled in the art that various changes in form and detail may be made therein without departing from the spirit and scope of the invention.

Claims (10)

1. A VR preoperative planning method for a surgical robot, characterized in that an optical positioning system in the surgical robot comprises an optical positioner and a photosensitive ball;
the VR preoperative planning method for the surgical robot comprises the following steps:
step S1, constructing a virtual reality scene;
step S2, constructing a virtual beam cone according to the visual field range of the optical positioner in the surgical robot in the virtual reality scene;
and step S3, generating a planned path according to the contact condition between objects in the virtual reality scene and the virtual beam cone.
2. The surgical robot VR preoperative planning method of claim 1, wherein the step S1 includes:
step S11, constructing a virtual reality scene in the three-dimensional virtual environment;
step S12, acquiring real-time state data of the object in the surgical scene, and converting the real-time state data into rendering data;
and step S13, rendering the virtual reality scene in real time according to the rendering data.
3. The surgical robot VR preoperative planning method of claim 1, wherein the step S2 includes:
step S21, measuring and calculating the visual field range of the light beam emitted by the optical positioner;
and step S22, performing three-dimensional modeling according to the measured visual field range to generate the virtual beam cone.
4. The surgical robot VR preoperative planning method of claim 1, wherein the step S3 includes:
step S31, predefining a first movable range of the object and a second movable range of the virtual beam cone;
step S32, acquiring the contact condition between the object and the virtual beam cone during the operation;
step S33, generating the planned path according to the contact condition, the first movable range and the second movable range.
5. The surgical robot VR pre-operative planning method of claim 4, wherein in the step S31, the first movable range is a movement range and a rotation range of the object in six degrees of freedom.
6. The surgical robot VR pre-operative planning method of claim 4, wherein the step S31 includes:
step S311, obtaining the component motion range of the component of the optical positioner;
step S312, defining the second movable range of the virtual beam cone according to the component movement range.
7. The surgical robot VR pre-operative planning method of claim 6, wherein the component movement range comprises the rotational degrees of freedom and the rotation angles of the components of the optical positioner.
8. The surgical robot VR pre-operative planning method of claim 4, wherein the second movable range of the virtual beam cone is the angle through which the virtual beam cone can rotate around the fixed axis of the optical positioner.
9. The surgical robot VR pre-operative planning method of claim 1, further comprising:
controlling the display and hiding of the virtual beam cone during the operation.
10. A surgical robot VR pre-operative planning system, comprising:
the first construction module is used for constructing a virtual reality scene;
the second construction module is connected with the first construction module and used for constructing a virtual beam cone according to the visual field range of an optical positioner in the surgical robot in the virtual reality scene;
and the planning module is connected with the second construction module and is used for generating a planned path according to the contact condition between objects in the virtual reality scene and the virtual beam cone.
CN202110469212.4A 2021-04-28 2021-04-28 VR preoperative planning method and system for surgical robot Active CN113171173B (en)

Priority Applications (1)

Application Number: CN202110469212.4A (granted as CN113171173B)
Priority Date: 2021-04-28
Filing Date: 2021-04-28
Title: VR preoperative planning method and system for surgical robot


Publications (2)

Publication Number Publication Date
CN113171173A (en) 2021-07-27
CN113171173B (en) 2024-09-17

Family

ID=76925228

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110469212.4A Active CN113171173B (en) 2021-04-28 2021-04-28 VR preoperative planning method and system for surgical robot

Country Status (1)

Country Link
CN (1) CN113171173B (en)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0948377A1 (en) * 1996-08-27 1999-10-13 David E. E. Carmein Omni-directional treadmill
CN101036938A (en) * 2006-03-16 2007-09-19 上海电气自动化有限公司 Automatization device of high-precision copper strap horizontal casting machine assembly
CN104969029A (en) * 2012-12-19 2015-10-07 巴斯夫欧洲公司 Detector for optically detecting at least one object
CN106901834A (en) * 2016-12-29 2017-06-30 陕西联邦义齿有限公司 The preoperative planning of minimally invasive cardiac surgery and operation virtual reality simulation method
AU2018102036A4 (en) * 2018-12-09 2019-01-17 Fang, Yide Mr A search-and-rescue hexapod robot with a tele-operable mechanical arm
CN109925057A (en) * 2019-04-29 2019-06-25 苏州大学 A kind of minimally invasive spine surgical navigation methods and systems based on augmented reality
CN109964148A (en) * 2016-11-17 2019-07-02 TrinamiX GmbH Detector for optically detecting at least one object
CN110430809A (en) * 2017-01-16 2019-11-08 P. K. Lang Optical guidance for surgical, medical and dental procedures
CN112634342A (en) * 2019-09-24 2021-04-09 福特全球技术公司 Method for computer-implemented simulation of optical sensors in a virtual environment


Also Published As

Publication number Publication date
CN113171173B (en) 2024-09-17

Similar Documents

Publication Publication Date Title
US11179219B2 (en) Surgical robot system for stereotactic surgery and method for controlling stereotactic surgery robot
CN110946653B (en) Operation navigation system
KR100467111B1 (en) apparatus for automatically positioning a patient for treatment/diagnoses
CN107753106B (en) Surgical robot for positioning operation and control method thereof
CN110876643B (en) Medical operation navigation system and method
CN113876425B (en) Surgical system and navigation method
CN112603538A (en) Orthopedic navigation positioning system and method
CN113855286B (en) Implant robot navigation system and method
CN108778179A (en) Method and system for instructing user positioning robot
CN113476141B (en) Pose control method, optical navigation system applicable to pose control method and surgical robot system
JP2011502672A (en) Method for determining location of detection device in navigation system and method for positioning detection device
Shirinzadeh Laser‐interferometry‐based tracking for dynamic measurements
CN113693723B (en) Cross-modal navigation positioning system and method for oral and throat surgery
JP2023530652A (en) Spatial Perception Display for Computer-Assisted Interventions
Tauscher et al. High-accuracy drilling with an image guided light weight robot: autonomous versus intuitive feed control
CN113171173B (en) VR preoperative planning method and system for surgical robot
CN115424701B (en) Bone surface follow-up technology for optimal path planning
CN114366330B (en) Mixed reality system and marker registration method based on mixed reality
US20230346484A1 (en) Robotic surgery system with user interfacing
CN214857401U (en) Integrated system structure device
Mönnich et al. A Hand Guided Robotic Planning System for Laser Osteotomy in Surgery
CN116965928A (en) Positioning navigation method, positioning navigation device, electronic equipment and readable storage medium
WO2024147830A1 (en) Calibration for surgical navigation
CN117340891A (en) Motion precision calibration method for operation auxiliary robot
WO2019236540A1 (en) Articulated apparatus for surgery

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant