CN115462901A - Surgical real-time navigation method, device, system, equipment and medium - Google Patents

Surgical real-time navigation method, device, system, equipment and medium

Info

Publication number
CN115462901A
Authority
CN
China
Prior art keywords
ultrasonic
real-time
rectum
surgical
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211287443.4A
Other languages
Chinese (zh)
Inventor
崔亮
苏衍宇
战梦雪
袁宝武
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Suzhou Kangduo Robot Co ltd
Original Assignee
Harbin Sagebot Intelligent Medical Equipment Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Harbin Sagebot Intelligent Medical Equipment Co Ltd
Priority to CN202211287443.4A
Publication of CN115462901A

Classifications

    • A61B34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B34/30 Surgical robots
    • A61B34/70 Manipulators specially adapted for use in surgery
    • G16H20/40 ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance, relating to mechanical, radiation or invasive therapies, e.g. surgery, laser therapy, dialysis or acupuncture
    • G16H40/60 ICT specially adapted for the management or administration of healthcare resources or facilities, for the operation of medical equipment or devices
    • A61B2034/2046 Tracking techniques
    • A61B2034/2063 Acoustic tracking systems, e.g. using ultrasound
    • A61B2034/303 Surgical robots specifically adapted for manipulations within body lumens, e.g. within lumen of gut, spine, or blood vessels

Landscapes

  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Surgery (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Public Health (AREA)
  • Biomedical Technology (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Animal Behavior & Ethology (AREA)
  • Molecular Biology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Robotics (AREA)
  • Veterinary Medicine (AREA)
  • Epidemiology (AREA)
  • Primary Health Care (AREA)
  • Urology & Nephrology (AREA)
  • Business, Economics & Management (AREA)
  • General Business, Economics & Management (AREA)
  • Surgical Instruments (AREA)

Abstract

Embodiments of the invention disclose a surgical real-time navigation method, device, system, equipment and medium. The method comprises: establishing a position mapping relation between an instrument reference target point of the surgical robot and a rectal B-ultrasound coordinate system according to positioning information of the instrument reference target point input by a user in an initial ultrasound image; analyzing the real-time relative pose relation between the instrument reference target point and the rectal B-ultrasound coordinate system according to the position mapping relation and the real-time motion information generated by a control arm of the surgical robot under the user's surgical operation; and determining a target rotation angle of the rectal B-ultrasound probe based on the real-time relative pose relation, driving an ultrasonic follow-up subsystem to rotate the probe by the target rotation angle, and updating the ultrasound imaging plane, thereby achieving surgical real-time navigation. The technical scheme of the embodiments solves the problem that the surgical field of view cannot be provided continuously during an operation, acquires more accurate structural morphology and position information of the tissue, and improves the safety and accuracy of surgical positioning.

Description

Surgical real-time navigation method, device, system, equipment and medium
Technical Field
Embodiments of the invention relate to the technical field of medical image processing, and in particular to a surgical real-time navigation method, device, system, equipment and medium.
Background
At present, during prostate surgery, most prostate surgical robot systems locate and measure the position of the puncture needle through ultrasound images in order to guide prostate puncture procedures. However, such systems cannot continuously provide a surgical field of view while the surgical robot performs prostate tumor resection or radical prostatectomy.
Disclosure of Invention
Embodiments of the invention provide a surgical real-time navigation method, device, system, equipment and medium, which solve the problem that the surgical field of view cannot be provided continuously during an operation, acquire more accurate structural morphology and position information of the tissue, and improve the safety and accuracy of surgical positioning.
In a first aspect, an embodiment of the present invention provides a surgical real-time navigation method, the method comprising:
establishing a position mapping relation between an instrument reference target point of the surgical robot and a rectal B-ultrasound coordinate system according to positioning information of the instrument reference target point input by a user in an initial ultrasound image;
analyzing a real-time relative pose relation between the instrument reference target point and the rectal B-ultrasound coordinate system according to the position mapping relation and the real-time motion information generated by a control arm of the surgical robot under the user's surgical operation;
and determining a target rotation angle of the rectal B-ultrasound probe based on the real-time relative pose relation, driving an ultrasonic follow-up subsystem to rotate the probe by the target rotation angle, and updating the ultrasound imaging plane, thereby achieving surgical real-time navigation.
In a second aspect, an embodiment of the present invention further provides a surgical real-time navigation device, the device comprising:
a first position relation determining module, configured to establish a position mapping relation between an instrument reference target point of the surgical robot and a rectal B-ultrasound coordinate system according to positioning information of the instrument reference target point input by a user in an initial ultrasound image;
a second position relation determining module, configured to analyze the real-time relative pose relation between the instrument reference target point and the rectal B-ultrasound coordinate system according to the position mapping relation and the real-time motion information generated by a control arm of the surgical robot under the user's surgical operation;
and a navigation driving module, configured to determine a target rotation angle of the rectal B-ultrasound probe based on the real-time relative pose relation, drive the ultrasonic follow-up subsystem to rotate the probe by the target rotation angle, and update the ultrasound imaging plane, thereby achieving surgical real-time navigation.
In a third aspect, an embodiment of the present invention further provides a surgical real-time navigation system, the system comprising: a surgical robot subsystem, an ultrasonic follow-up subsystem and a navigation control subsystem;
wherein the surgical robot subsystem is configured to move a mechanical arm of the surgical robot under the operation of a user so as to execute a surgical operation;
the ultrasonic follow-up subsystem comprises a rectal B-ultrasound probe and a B-ultrasound moving device, and is configured to control the movement of the probe so that it can acquire B-ultrasound images throughout its range of motion;
and the navigation control subsystem is configured to determine the motion parameters of the ultrasonic follow-up subsystem according to the mechanical-arm motion information of the surgical robot subsystem and drive the ultrasonic follow-up subsystem to move the rectal B-ultrasound probe, implementing the surgical real-time navigation method provided by any embodiment of the invention.
In a fourth aspect, an embodiment of the present invention further provides a computer device, where the computer device includes:
one or more processors;
a memory for storing one or more programs;
wherein, when the one or more programs are executed by the one or more processors, the one or more processors are caused to implement the surgical real-time navigation method provided by any embodiment of the present invention.
In a fifth aspect, an embodiment of the present invention further provides a computer-readable storage medium on which a computer program is stored; when executed by a processor, the program implements the surgical real-time navigation method provided by any embodiment of the present invention.
According to the embodiments of the invention, a position mapping relation between an instrument reference target point of the surgical robot and a rectal B-ultrasound coordinate system is established according to positioning information of the instrument reference target point input by a user in an initial ultrasound image; the real-time relative pose relation between the instrument reference target point and the rectal B-ultrasound coordinate system is analyzed according to the position mapping relation and the real-time motion information generated by the control arm of the surgical robot under the user's surgical operation; and a target rotation angle of the rectal B-ultrasound probe is determined based on the real-time relative pose relation, the ultrasonic follow-up subsystem is driven to rotate the probe by the target rotation angle, and the ultrasound imaging plane is updated, thereby achieving surgical real-time navigation. This technical scheme solves the problem that the surgical field of view cannot be provided continuously during an operation, acquires more accurate structural morphology and position information of the tissue, and improves the safety and accuracy of surgical positioning.
Drawings
FIG. 1 is a flow chart of a method for real-time surgical navigation according to an embodiment of the present invention;
FIG. 2 is a flow chart of a method for real-time surgical navigation according to an embodiment of the present invention;
FIG. 3 is a block diagram of a surgical real-time navigation device according to an embodiment of the present invention;
FIG. 4 is a block diagram of a surgical real-time navigation system according to an embodiment of the present invention;
FIG. 5 is a schematic structural diagram of an ultrasound follow-up subsystem provided by an embodiment of the present invention;
FIG. 6 is a schematic view of a rectal B-ultrasound virtual scene in radical prostatectomy;
FIG. 7 is an interactive interface of a prostate surgical real-time navigation system;
fig. 8 is a block diagram of a computer device according to an embodiment of the present invention.
Detailed Description
The present invention will be described in further detail with reference to the accompanying drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the invention and are not limiting of the invention. It should be further noted that, for the convenience of description, only some of the structures related to the present invention are shown in the drawings, not all of the structures.
It is to be understood that the terms "first" and "second," and the like in the description and claims of the present invention and in the above-described drawings are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the invention described herein are capable of operation in sequences other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
FIG. 1 is a flow chart of a surgical real-time navigation method according to an embodiment of the present invention. This embodiment is applicable to scenarios requiring real-time navigation during surgery, and is particularly suitable for real-time ultrasound navigation in prostate surgery. The method may be executed by a surgical real-time navigation device, which may be implemented in software and/or hardware and integrated in a computer device with application development functions.
As shown in fig. 1, the real-time surgical navigation method of the present embodiment includes:
s110, according to positioning information of the instrument reference target point of the surgical robot input by a user in the initial ultrasonic image, establishing a position mapping relation between the instrument reference target point and a rectum B ultrasonic coordinate system.
The user may be an operator performing laparoscopic surgery, and may be a doctor, for example, who needs to observe the surgical field in real time during the surgical operation.
The instrument reference target may be a position of an instrument tip of the surgical robot.
The positioning information is used to describe the position of the instrument reference target in the initial ultrasound image, i.e. the coordinate information in the ultrasound system coordinate system.
The initial ultrasonic image is a transrectal ultrasonic image shot before an operation, when the image is shot, the ultrasonic probe enters the rectal cavity through the anus, and the prostate, seminal vesicle and bladder waiting for operation are shot through the rectal B-ultrasonic image so as to present an operation visual field. Firstly, a mapping relation between a surgical robot coordinate system and an ultrasonic system coordinate system is established by acquiring positioning information of an instrument tip of the surgical robot in an initial ultrasonic image.
Specifically, the position information of the tip of the instrument of the surgical robot is determined in an ultrasonic image shot before the operation by a user needing real-time navigation of the operation, the position information of the tip of the instrument of the surgical robot in the ultrasonic image input by the user establishes a corresponding relation between the position of the tip of the instrument of the surgical robot and a rectum B ultrasonic coordinate system, and further, the mapping relation from the surgical robot coordinate system to the rectum B ultrasonic coordinate system can be realized by combining the coordinates of the tip of the instrument under the surgical robot coordinate system.
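The patent does not publish the registration math behind this mapping; the following Python sketch is a minimal, hypothetical illustration that assumes the probe's orientation relative to the robot base is already known (for example from the fixture geometry), so the single user-selected tip point only has to fix the remaining translation. All function and variable names are illustrative, not from the patent.

```python
import numpy as np

def build_robot_to_ultrasound_transform(R_us_from_robot: np.ndarray,
                                        tip_in_us: np.ndarray,
                                        tip_in_robot: np.ndarray) -> np.ndarray:
    """Return a 4x4 homogeneous transform T such that p_us = T @ p_robot."""
    T = np.eye(4)
    T[:3, :3] = R_us_from_robot
    # Choose the translation so the tip point picked in the initial ultrasound
    # image coincides with the same tip expressed in the robot frame.
    T[:3, 3] = tip_in_us - R_us_from_robot @ tip_in_robot
    return T
```

In practice a full six-degree-of-freedom registration would need several non-collinear correspondences or a tracked calibration target; the single-point form above is only the simplest case consistent with the text.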
S120, analyzing the real-time relative pose relation between the instrument reference target point and the rectal B-ultrasound coordinate system according to the position mapping relation and the real-time motion information generated by the control arm of the surgical robot under the user's surgical operation.
The real-time motion information is the information generated as the user moves the control arm of the surgical robot, for example the change in the control arm's position.
The pose relation refers to the position and orientation data of the reference target point of the surgical robot in the rectal B-ultrasound coordinate system.
Further, analyzing the real-time relative pose relation between the instrument reference target point and the rectal B-ultrasound coordinate system according to the position mapping relation and the real-time motion information comprises:
First, performing kinematic analysis on the real-time motion information to obtain a first real-time pose relation of the instrument reference target point in the surgical robot coordinate system.
The first real-time pose relation captures how, under the user's operation, the real-time motion of the control arm changes its pose, and how the real-time position and orientation of the corresponding instrument reference target point in the surgical robot coordinate system change with it.
Specifically, under the user's operation the master manipulator is moved; its real-time motion is transmitted to the mechanical arm, the real-time pose of each joint of the mechanical arm changes, and the pose of the surgical instrument tip changes accordingly. The real-time pose change of the control arm can therefore be calculated, and the real-time position and orientation of the corresponding instrument reference target point in the surgical robot coordinate system determined from the correspondence between the two pose changes, yielding the first real-time pose relation.
Then, mapping the first real-time pose relation into the rectal B-ultrasound coordinate system based on the position mapping relation to obtain the real-time relative pose relation of the instrument reference target point in the rectal B-ultrasound coordinate system.
Specifically, based on the position mapping relation between the instrument reference target point and the rectal B-ultrasound coordinate system, the real-time pose relation of the instrument reference target point in the surgical robot coordinate system is mapped into the rectal B-ultrasound coordinate system, giving its real-time relative pose relation in that coordinate system.
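As a minimal sketch (reusing the transform built in the earlier sketch; all names are assumptions), this mapping step of S120 reduces to a single homogeneous-matrix product:

```python
import numpy as np

def map_pose_to_ultrasound(T_us_from_robot: np.ndarray,
                           pose_in_robot: np.ndarray) -> np.ndarray:
    """Real-time relative pose of the instrument tip in the probe frame."""
    # Both arguments are 4x4 homogeneous matrices; once the S110 transform is
    # known, carrying the first real-time pose into the rectal B-ultrasound
    # frame is one matrix product.
    return T_us_from_robot @ pose_in_robot
```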
S130, determining a target rotation angle of the rectal B-ultrasound probe based on the real-time relative pose relation, driving the ultrasonic follow-up subsystem to rotate the probe by the target rotation angle, and updating the ultrasound imaging plane, thereby achieving surgical real-time navigation.
The current pose of the rectal B-ultrasound probe is determined from the real-time relative pose relation of the instrument reference target point in the rectal B-ultrasound coordinate system, and the corresponding rotation angle is derived. The probe is then rotated by the target rotation angle to image the patient's operative site, and the ultrasound imaging plane is updated with the post-rotation ultrasound image. Surgical real-time navigation of the operative site is thus achieved: the ultrasound imaging plane moves with the instrument reference target point, and the ultrasound image of the surgical field of view is updated in real time.
Further, determining the target rotation angle of the rectal B-ultrasound probe based on the real-time relative pose relation comprises the following steps:
First, inputting the real-time relative pose relation into a kinematic inverse solution model of the ultrasonic follow-up subsystem to obtain a preliminary adjustment rotation angle.
The kinematic inverse solution model is built into the ultrasonic follow-up subsystem. From the real-time relative pose relation of the instrument reference target point in the rectal B-ultrasound coordinate system it determines the current pose and current rotation angle of the rectal B-ultrasound probe, and compares the current rotation angle with the angle required by the new pose to obtain the preliminary adjustment rotation angle.
Then, correcting the preliminary adjustment rotation angle to obtain the target rotation angle.
Specifically, the actual contact force is measured from the sensing signal of a force sensor arranged on the ultrasonic follow-up subsystem; impedance control is then realized by a preset impedance control algorithm, and the preliminary adjustment rotation angle is corrected to obtain the target rotation angle.
The preset impedance control algorithm mitigates the influence of external forces on the rotation angle. The algorithm is built into the navigation control subsystem: from the actual contact force measured by the force sensor, it establishes a dynamic relation between the force on the mechanical arm and its position and velocity, indirectly controls the force by controlling position and velocity, and corrects the preliminary adjustment rotation angle to obtain the target rotation angle.
It can be understood that, in the ultrasonic follow-up subsystem, the current pose and rotation angle of the rectal B-ultrasound probe are determined from the input real-time relative pose relation, giving the preliminary adjustment rotation angle; the actual contact force is then measured through the force sensor on the subsystem, the dynamic relation between force, position and velocity of the mechanical arm is established, the force is indirectly controlled by controlling position and velocity, and the preliminary adjustment rotation angle is corrected to obtain the target rotation angle.
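The patent names a force sensor and a "preset impedance control algorithm" but gives no formula. The sketch below shows one common stiffness-only impedance law purely as an illustration; the gain, the desired contact force and the function name are assumptions, and a full impedance controller would also include damping and inertia terms.

```python
def correct_rotation_angle(theta_preliminary: float,
                           f_measured: float,
                           f_desired: float = 0.0,
                           stiffness: float = 50.0) -> float:
    """Shift the preliminary probe angle so the contact force tracks f_desired."""
    # Stiffness-only impedance relation f_err = k * dtheta: excess contact
    # force backs the probe off, easing the load on the rectal wall.
    dtheta = (f_measured - f_desired) / stiffness
    return theta_preliminary - dtheta
```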
According to the technical scheme of this embodiment, a position mapping relation between an instrument reference target point of the surgical robot and a rectal B-ultrasound coordinate system is established according to positioning information of the instrument reference target point input by a user in an initial ultrasound image; the real-time relative pose relation between the instrument reference target point and the rectal B-ultrasound coordinate system is analyzed according to the position mapping relation and the real-time motion information generated by the control arm of the surgical robot under the user's surgical operation; and a target rotation angle of the rectal B-ultrasound probe is determined based on the real-time relative pose relation, the ultrasonic follow-up subsystem is driven to rotate the probe by the target rotation angle, and the ultrasound imaging plane is updated, thereby achieving surgical real-time navigation. This solves the problem that the surgical field of view cannot be provided continuously during an operation, acquires more accurate structural morphology and position information of the tissue, and improves the safety and accuracy of surgical positioning.
Fig. 2 is a flowchart of a surgical real-time navigation method according to an embodiment of the present invention. This embodiment belongs to the same inventive concept as the method described above and, on that basis, further describes how to perform kinematic analysis on the real-time motion information to obtain the first real-time pose relation of the instrument reference target point in the surgical robot coordinate system. The method may be executed by a surgical real-time navigation device, which may be implemented in software and/or hardware and integrated in a computer device with application development functions.
As shown in fig. 2, the real-time surgical navigation method of the present embodiment includes:
s210, according to positioning information of the instrument reference target point of the surgical robot input by a user in the initial ultrasonic image, establishing a position mapping relation between the instrument reference target point and a rectum B ultrasonic coordinate system.
And S220, performing kinematics forward solution operation on the real-time motion information, and inputting a forward solution operation result into a kinematics inverse solution model of a mechanical arm of the surgical robot to obtain each joint angle of the mechanical arm.
The kinematics forward solution operation refers to that a user moves a control arm of the surgical robot, the control arm is brought into a kinematics forward solution model of an operation main hand according to operation information of the user, then a forward solution operation result of the previous step is input into a kinematics inverse solution model of a mechanical arm of the surgical robot, and the position and the angle of each joint of the mechanical arm of the surgical robot are determined. And sending the obtained position and angle information of each joint of the mechanical arm to the mechanical arm body to control the motion of the tail end of the instrument.
And S230, inputting each joint angle of the mechanical arm into a kinematics forward solution operation model of the mechanical arm, and determining a first real-time pose relation of the instrument reference target point in a coordinate system of the surgical robot.
Because different parts or devices of the surgical robot are provided with different driving structures, each device has a corresponding operation science forward solution model and a kinematics inverse solution model. The positive kinematics solution model of the mechanical arm can further resolve that the instrument tip of the mechanical arm is a pose matrix relative to the robot coordinate system origin.
S240, mapping the first real-time pose relation to a rectum B ultrasonic coordinate system based on the position mapping relation to obtain a real-time relative pose relation of the instrument reference target point in the rectum B ultrasonic coordinate system.
And further, combining the position mapping relation between the instrument reference target point and the rectum B ultrasonic coordinate system determined in the step S210, performing matrix inverse operation, and determining a pose matrix of the instrument tip (reference target point) relative to the rectum B ultrasonic coordinate system to serve as a real-time relative pose relation of the instrument reference target point in the rectum B ultrasonic coordinate system.
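Under the same caveat, steps S220 to S240 can be read as the following chain; fk_master, ik_slave_arm and fk_slave_arm are assumed stand-ins for the kinematics models the patent names but does not publish.

```python
def track_instrument_tip(master_motion, T_us_from_robot,
                         fk_master, ik_slave_arm, fk_slave_arm):
    """Return the 4x4 pose of the instrument tip in the probe frame."""
    target_pose = fk_master(master_motion)      # S220: forward solution of the master motion
    joint_angles = ik_slave_arm(target_pose)    # S220: inverse solution -> arm joint angles
    tip_in_robot = fk_slave_arm(joint_angles)   # S230: first real-time pose (robot frame)
    # S240: the "matrix inverse operation" of the text, i.e. expressing the
    # tip relative to the probe; with T_us_from_robot precomputed this
    # reduces to one product (equivalently inv(T_robot_from_us) @ tip_in_robot).
    return T_us_from_robot @ tip_in_robot
```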
S250, determining the target rotation angle of the rectal B-ultrasound probe based on the real-time relative pose relation, driving the ultrasonic follow-up subsystem to rotate the probe by the target rotation angle, and updating the ultrasound imaging plane, thereby achieving surgical real-time navigation.
That is, when the user operates the master manipulator of the surgical robot, the tip of the mechanical arm is displaced, the rectal B-ultrasound probe moves correspondingly, the ultrasound imaging plane is adjusted and the surgical field of view is updated, so that the user obtains an ultrasound image of the surgical field of view in real time during the operation.
According to the technical scheme of this embodiment, each joint angle of the mechanical arm is obtained by performing a forward kinematics solution on the real-time motion information and inputting the result into the inverse kinematics model of the mechanical arm of the surgical robot; each joint angle is then input into the forward kinematics model of the mechanical arm to determine the first real-time pose relation of the instrument reference target point in the surgical robot coordinate system, which improves the accuracy of surgical positioning.
Fig. 3 is a structural block diagram of a surgical real-time navigation device according to an embodiment of the present invention. The device is applicable to real-time navigation scenarios during surgery, and in particular to real-time ultrasound navigation in prostate surgery. It may be implemented in software and/or hardware and integrated in a computer device with application development functions.
As shown in fig. 3, the surgical real-time navigation device includes: a first position relation determining module 310, a second position relation determining module 320 and a navigation driving module 330.
The first position relation determining module 310 is configured to establish a position mapping relation between an instrument reference target point of the surgical robot and a rectal B-ultrasound coordinate system according to positioning information of the instrument reference target point input by a user in an initial ultrasound image; the second position relation determining module 320 is configured to analyze the real-time relative pose relation between the instrument reference target point and the rectal B-ultrasound coordinate system according to the position mapping relation and the real-time motion information generated by the control arm of the surgical robot under the user's surgical operation; and the navigation driving module 330 is configured to determine a target rotation angle of the rectal B-ultrasound probe based on the real-time relative pose relation, drive the ultrasonic follow-up subsystem to rotate the probe by the target rotation angle, and update the ultrasound imaging plane, thereby achieving surgical real-time navigation.
According to the technical scheme of this embodiment, a position mapping relation between an instrument reference target point of the surgical robot and a rectal B-ultrasound coordinate system is established according to positioning information of the instrument reference target point input by a user in an initial ultrasound image; the real-time relative pose relation between the instrument reference target point and the rectal B-ultrasound coordinate system is analyzed according to the position mapping relation and the real-time motion information generated by the control arm of the surgical robot under the user's surgical operation; and a target rotation angle of the rectal B-ultrasound probe is determined based on the real-time relative pose relation, the ultrasonic follow-up subsystem is driven to rotate the probe by the target rotation angle, and the ultrasound imaging plane is updated, thereby achieving surgical real-time navigation. This solves the problem that the surgical field of view cannot be provided continuously during an operation, obtains more accurate structural morphology and position information of the tissue, and improves the safety and accuracy of surgical positioning.
Optionally, the second position relation determining module 320 is further configured to:
performing kinematic analysis on the real-time motion information to obtain a first real-time pose relation of the instrument reference target point in the surgical robot coordinate system;
and mapping the first real-time pose relation into the rectal B-ultrasound coordinate system based on the position mapping relation to obtain the real-time relative pose relation of the instrument reference target point in the rectal B-ultrasound coordinate system.
Optionally, the second position relation determining module 320 is further configured to:
performing a forward kinematics solution on the real-time motion information, and inputting the result into an inverse kinematics model of the mechanical arm of the surgical robot to obtain each joint angle of the mechanical arm;
and inputting each joint angle of the mechanical arm into the forward kinematics model of the mechanical arm, and determining the first real-time pose relation of the instrument reference target point in the surgical robot coordinate system.
Optionally, the navigation driving module 330 is further configured to:
inputting the real-time relative pose relation into a kinematic inverse solution model of the ultrasonic follow-up subsystem to obtain a preliminary adjustment rotation angle;
and correcting the preliminary adjustment rotation angle to obtain the target rotation angle.
Optionally, the navigation driving module 330 is further configured to: correct the preliminary adjustment rotation angle through a preset impedance control algorithm according to the sensing signal of a force sensor arranged on the ultrasonic follow-up subsystem.
The surgical real-time navigation device provided by this embodiment can execute the surgical real-time navigation method provided by any embodiment of the present invention, and has the functional modules and beneficial effects corresponding to the executed method.
Fig. 4 is a structural block diagram of a surgical real-time navigation system according to an embodiment of the present invention. As shown in fig. 4, the surgical real-time navigation system includes: a surgical robot subsystem 410, an ultrasound follow-up subsystem 420 and a navigation control subsystem 430.
The surgical robot subsystem 410 is configured to move the mechanical arm of the surgical robot under the operation of the user to execute a surgical operation; the ultrasound follow-up subsystem 420 comprises a rectal B-ultrasound probe and a B-ultrasound moving device, and is configured to control the movement of the probe so that it can acquire B-ultrasound images throughout its range of motion; and the navigation control subsystem 430 is configured to determine the motion parameters of the ultrasound follow-up subsystem 420 according to the mechanical-arm motion information of the surgical robot subsystem 410 and drive the ultrasound follow-up subsystem 420 to move the rectal B-ultrasound probe, implementing the surgical real-time navigation method of any embodiment of the invention.
Optionally, the navigation control subsystem 430 is configured to:
performing kinematic analysis on the real-time motion information to obtain a first real-time pose relation of the instrument reference target point in the surgical robot coordinate system;
and mapping the first real-time pose relation into the rectal B-ultrasound coordinate system based on the position mapping relation to obtain the real-time relative pose relation of the instrument reference target point in the rectal B-ultrasound coordinate system.
Optionally, the navigation control subsystem 430 is configured to:
performing a forward kinematics solution on the real-time motion information, and inputting the result into an inverse kinematics model of the mechanical arm of the surgical robot to obtain each joint angle of the mechanical arm;
and inputting each joint angle of the mechanical arm into the forward kinematics model of the mechanical arm, and determining the first real-time pose relation of the instrument reference target point in the surgical robot coordinate system.
Optionally, the navigation control subsystem 430 is configured to:
inputting the real-time relative pose relation into a kinematic inverse solution model of the ultrasonic follow-up subsystem to obtain a preliminary adjustment rotation angle;
and correcting the preliminary adjustment rotation angle to obtain the target rotation angle.
Optionally, the navigation control subsystem 430 is configured to:
and correcting the preliminarily adjusted rotating angle through a preset impedance control algorithm according to the sensing signal of a force sensor arranged on the ultrasonic follow-up subsystem.
Optionally, the navigation control subsystem 430 is configured to:
one or more ultrasound images are fused with a virtual scene associated therewith to create and display a new virtual scene or model of the scene. Not only can realize real-time operation navigation, but also can fuse more images and prompt messages according to different requirements, can acquire more accurate structural morphology information and position information of tissues, and improve the safety and accuracy of operation positioning.
In a specific embodiment, as shown in fig. 5, which is a schematic structural diagram of the ultrasound follow-up subsystem provided by an embodiment of the present invention, the ultrasound follow-up subsystem 420 is built as a structure in which a rectal B-ultrasound probe 4201 is embedded in a multi-degree-of-freedom flexible guide arm, and comprises the rectal B-ultrasound probe 4201, a rotating motor 4202, a stage 4203, a displacement slide 4204 and a force sensor 4205. The rotating motor 4202 controls the rotation of the rectal B-ultrasound probe 4201; the displacement slide 4204 controls its back-and-forth movement; the stage 4203 fixes the rectal B-ultrasound probe 4201, the rotating motor 4202 and the displacement slide 4204; and the force sensor 4205 is mounted on the rotating motor 4202.
Optionally, the force sensor 4205 measures the actual contact force from its sensing signal and sends the measured force to the navigation control subsystem 430.
Optionally, the rectal B-ultrasound probe 4201 is used to observe the periphery and interior of the rectum and the prostate to check whether lesions are present.
The rectal B-ultrasound probe 4201 is also used to acquire the initial ultrasound image and to acquire ultrasound images in real time during the operation.
The technical scheme of this embodiment provides a surgical real-time navigation system comprising a surgical robot subsystem, an ultrasonic follow-up subsystem and a navigation control subsystem. The surgical robot subsystem is configured to move the mechanical arm of the surgical robot under the operation of a user so as to execute a surgical operation; the ultrasonic follow-up subsystem comprises a rectal B-ultrasound probe and a B-ultrasound moving device, and is configured to control the movement of the probe so that it can acquire B-ultrasound images throughout its range of motion; and the navigation control subsystem is configured to determine the motion parameters of the ultrasonic follow-up subsystem according to the mechanical-arm motion information of the surgical robot subsystem and drive the ultrasonic follow-up subsystem to move the rectal B-ultrasound probe, implementing the surgical real-time navigation method of any embodiment of the invention. This technical scheme solves the problem that the surgical field of view cannot be provided continuously during an operation, acquires more accurate structural morphology and position information of the tissue, and improves the safety and accuracy of surgical positioning.
Fig. 6 is a schematic diagram of a rectal B-ultrasound virtual scene in radical prostatectomy, including the rectal B-ultrasound probe 4201, a rectal B-ultrasound coordinate system 601, a robot coordinate system 602, a bladder 603, a prostate 604 and a rectal B-ultrasound two-dimensional imaging plane 605.
The rectal B-ultrasound coordinate system 601 is established with the central axis of the horizontally placed ultrasound probe as the reference.
The robot coordinate system 602 is established with the instrument reference target point perpendicular to the ultrasound probe as the reference: when the angle of the instrument reference target point relative to the ultrasound probe is greater than 90 degrees the rotation angle is negative, and when it is less than 90 degrees the rotation angle is positive.
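A tiny illustrative helper (assumed, not from the patent text) that reproduces this sign convention, taking the perpendicular position as the zero reference; only the sign behaviour comes from the text, the linear magnitude mapping is an assumption.

```python
def signed_rotation_angle(angle_to_probe_deg: float) -> float:
    """Zero at the perpendicular, positive below 90 degrees, negative above."""
    return 90.0 - angle_to_probe_deg
```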
Fig. 7 shows an interactive interface of the prostate surgical real-time navigation system, in which the physician selects the instrument reference target point in the initial ultrasound image.
Surgical real-time navigation in prostate surgery specifically comprises the following steps (a consolidated sketch of the whole loop follows the numbered steps):
(1) Through the interactive interface of the prostate surgical real-time navigation system, the doctor selects the instrument reference target point in the initial ultrasound image and moves the control arm of the laparoscopic surgical robot subsystem 410, causing each joint of the control arm to move in real time.
(2) The navigation control subsystem 430 performs a forward kinematics computation over each joint of the control arm to obtain the pose matrix of the instrument reference target point in the robot coordinate system, which is converted into the pose matrix of the reference target point in the B-ultrasound coordinate system through the position mapping relation between the instrument reference target point and the rectal B-ultrasound coordinate system.
(3) The pose matrix of the reference target point in the B-ultrasound coordinate system is input into the inverse kinematics model of the mechanical arm of the surgical robot to obtain each joint angle of the mechanical arm.
(4) Each joint angle of the mechanical arm is input into the forward kinematics model of the mechanical arm to determine the first real-time pose relation of the instrument reference target point in the surgical robot coordinate system.
(5) The first real-time pose relation is mapped into the rectal B-ultrasound coordinate system based on the position mapping relation, yielding the real-time relative pose relation of the instrument reference target point in the rectal B-ultrasound coordinate system.
(6) The target rotation angle of the rectal B-ultrasound probe is determined based on the real-time relative pose relation and the sensing signal of the force sensor 4205, and the ultrasonic follow-up subsystem is driven to rotate the probe by the target rotation angle, updating the ultrasound imaging plane and achieving surgical real-time navigation.
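Putting steps (1) to (6) together, a hedged sketch of the whole servo loop might look as follows. It reuses the helpers sketched earlier (track_instrument_tip, correct_rotation_angle), and every object and method name is a placeholder for subsystem interfaces the patent describes only functionally.

```python
def navigation_loop(master, follow_up, probe, T_us_from_robot):
    """Illustrative navigation loop for steps (1)-(6); placeholder interfaces only."""
    while master.in_surgery():
        motion = master.read_joint_motion()              # (1) surgeon moves the control arm
        tip_in_us = track_instrument_tip(                # (2)-(5) FK/IK chain plus frame mapping
            motion, T_us_from_robot,
            master.fk, master.arm_ik, master.arm_fk)
        theta = follow_up.inverse_solution(tip_in_us)    # (6) preliminary probe rotation angle
        theta = correct_rotation_angle(
            theta, follow_up.read_contact_force())       # (6) force-sensor impedance correction
        probe.rotate_to(theta)                           # update the ultrasound imaging plane
```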
Fig. 8 is a block diagram of a computer device according to an embodiment of the present invention, showing an exemplary computer device 12 suitable for implementing embodiments of the present invention. The computer device 12 shown in fig. 8 is only an example and should not impose any limitation on the scope of use or functionality of embodiments of the invention. The computer device 12 may be any terminal device with computing capability, and may be configured in the surgical real-time navigation device and the surgical real-time navigation system described above.
As shown in FIG. 8, computer device 12 is in the form of a general purpose computing device. The components of computer device 12 may include, but are not limited to: one or more processors or processing units 16, a memory 28, and a bus 18 that couples various system components including the memory 28 and the processing unit 16.
Bus 18 represents one or more of any of several types of bus structures, including a memory bus or memory controller, a peripheral bus, an accelerated graphics port, and a processor or local bus using any of a variety of bus architectures. By way of example, such architectures include, but are not limited to, Industry Standard Architecture (ISA) bus, Micro Channel Architecture (MCA) bus, Enhanced ISA (EISA) bus, Video Electronics Standards Association (VESA) local bus, and Peripheral Component Interconnect (PCI) bus.
Computer device 12 typically includes a variety of computer system readable media. Such media may be any available media that is accessible by computer device 12 and includes both volatile and nonvolatile media, removable and non-removable media.
The memory 28 may include computer system readable media in the form of volatile memory, such as Random Access Memory (RAM) 30 and/or cache memory 32. Computer device 12 may further include other removable/non-removable, volatile/nonvolatile computer system storage media. By way of example only, storage system 34 may be used to read from and write to non-removable, nonvolatile magnetic media (not shown in FIG. 8, and commonly referred to as a "hard drive"). Although not shown in FIG. 8, a magnetic disk drive for reading from and writing to a removable, nonvolatile magnetic disk (e.g., a "floppy disk") and an optical disk drive for reading from or writing to a removable, nonvolatile optical disk (e.g., a CD-ROM, DVD-ROM, or other optical media) may be provided. In these cases, each drive may be connected to bus 18 by one or more data media interfaces. Memory 28 may include at least one program product having a set (e.g., at least one) of program modules that are configured to carry out the functions of embodiments of the invention.
A program/utility 40 having a set (at least one) of program modules 42 may be stored, for example, in memory 28, such program modules 42 including, but not limited to, an operating system, one or more application programs, other program modules, and program data, each of which examples or some combination thereof may comprise an implementation of a network environment. Program modules 42 generally carry out the functions and/or methodologies of the described embodiments of the invention.
Computer device 12 may also communicate with one or more external devices 14 (e.g., keyboard, pointing device, display 24, etc.), with one or more devices that enable a user to interact with computer device 12, and/or with any devices (e.g., network card, modem, etc.) that enable computer device 12 to communicate with one or more other computing devices. Such communication may be through an input/output (I/O) interface 22. Also, computer device 12 may communicate with one or more networks (e.g., a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network such as the Internet) via network adapter 20. As shown, network adapter 20 communicates with the other modules of computer device 12 via bus 18. It should be appreciated that although not shown in FIG. 8, other hardware and/or software modules may be used in conjunction with computer device 12, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data backup storage systems, among others.
The processing unit 16 executes various functional applications and data processing by running the programs stored in the memory 28, for example implementing the surgical real-time navigation method provided by embodiments of the present invention, which includes:
establishing a position mapping relation between an instrument reference target point of the surgical robot and a rectal B-ultrasound coordinate system according to positioning information of the instrument reference target point input by a user in an initial ultrasound image;
analyzing the real-time relative pose relation between the instrument reference target point and the rectal B-ultrasound coordinate system according to the position mapping relation and the real-time motion information generated by the control arm of the surgical robot under the user's surgical operation;
and determining a target rotation angle of the rectal B-ultrasound probe based on the real-time relative pose relation, driving the ultrasonic follow-up subsystem to rotate the probe by the target rotation angle, and updating the ultrasound imaging plane, thereby achieving surgical real-time navigation.
This embodiment provides a computer-readable storage medium on which a computer program is stored; when executed by a processor, the program implements the surgical real-time navigation method provided by any embodiment of the present invention, the method including:
establishing a position mapping relation between an instrument reference target point of the surgical robot and a rectal B-ultrasound coordinate system according to positioning information of the instrument reference target point input by a user in an initial ultrasound image;
analyzing the real-time relative pose relation between the instrument reference target point and the rectal B-ultrasound coordinate system according to the position mapping relation and the real-time motion information generated by the control arm of the surgical robot under the user's surgical operation;
and determining a target rotation angle of the rectal B-ultrasound probe based on the real-time relative pose relation, driving the ultrasonic follow-up subsystem to rotate the probe by the target rotation angle, and updating the ultrasound imaging plane, thereby achieving surgical real-time navigation.
Computer storage media for embodiments of the present invention may take the form of any combination of one or more computer-readable media. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. The computer-readable storage medium may be, for example but not limited to: an electrical, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination thereof. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, RF, etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations of the present invention may be written in any combination of one or more programming languages, including object-oriented programming languages such as Java, Smalltalk and C++, as well as conventional procedural programming languages such as the "C" programming language or similar languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In the latter case, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
It will be understood by those skilled in the art that the modules or steps of the invention described above may be implemented by a general purpose computing device, they may be centralized on a single computing device or distributed across a network of computing devices, and optionally they may be implemented by program code executable by a computing device, such that it may be stored in a memory device and executed by a computing device, or it may be separately fabricated into various integrated circuit modules, or it may be fabricated by fabricating a plurality of modules or steps thereof into a single integrated circuit module. Thus, the present invention is not limited to any specific combination of hardware and software.
It is to be noted that the foregoing is only illustrative of the preferred embodiments of the present invention and the technical principles employed. It will be understood by those skilled in the art that the present invention is not limited to the particular embodiments illustrated herein, but is capable of various obvious changes, rearrangements and substitutions as will now become apparent to those skilled in the art without departing from the scope of the invention. Therefore, although the present invention has been described in greater detail by the above embodiments, the present invention is not limited to the above embodiments, and may include other equivalent embodiments without departing from the spirit of the present invention, and the scope of the present invention is determined by the scope of the appended claims.

Claims (10)

1. A surgical real-time navigation method, comprising:
establishing a position mapping relation between an instrument reference target point of a surgical robot and a rectal B-ultrasound coordinate system according to positioning information of the instrument reference target point input by a user in an initial ultrasound image;
analyzing a real-time relative pose relation between the instrument reference target point and the rectal B-ultrasound coordinate system according to the position mapping relation and real-time motion information generated by a control arm of the surgical robot under a surgical operation of the user;
and determining a target rotation angle of a rectal B-ultrasound probe based on the real-time relative pose relation, and driving an ultrasonic follow-up subsystem to rotate the rectal B-ultrasound probe by the target rotation angle, thereby updating the ultrasound imaging plane and realizing surgical real-time navigation.
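The claims leave open how the position mapping relation of claim 1 is computed. A minimal sketch follows, assuming a paired-point rigid registration (Kabsch algorithm) between reference-target positions known in the surgical robot frame and the corresponding positions the user marks in the initial ultrasound image; the function name and the (N, 3) point-array interface are hypothetical, not the applicant's implementation.

    import numpy as np

    def estimate_robot_to_ultrasound(points_robot, points_us):
        """Fit a rigid transform mapping robot-frame points onto their
        ultrasound-frame counterparts (Kabsch / least-squares fit).

        points_robot, points_us: (N, 3) arrays of corresponding points, N >= 3.
        Returns a 4x4 homogeneous transform T with
        points_us[i] ~= (T @ [*points_robot[i], 1])[:3].
        """
        c_r = points_robot.mean(axis=0)                  # robot-frame centroid
        c_u = points_us.mean(axis=0)                     # ultrasound-frame centroid
        H = (points_robot - c_r).T @ (points_us - c_u)   # 3x3 cross-covariance
        U, _, Vt = np.linalg.svd(H)
        d = np.sign(np.linalg.det(Vt.T @ U.T))           # guard against a reflection
        R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
        T = np.eye(4)
        T[:3, :3] = R
        T[:3, 3] = c_u - R @ c_r
        return T

Once estimated, T plays the role of the position mapping relation: it carries any pose expressed in the surgical robot coordinate system into the rectal B-ultrasound coordinate system.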
2. The method according to claim 1, wherein the analyzing the real-time relative pose relation between the instrument reference target point and the rectal B-ultrasound coordinate system according to the position mapping relation and the real-time motion information generated by the control arm of the surgical robot under the surgical operation of the user comprises:
performing kinematic analysis on the real-time motion information to obtain a first real-time pose relation of the instrument reference target point in a surgical robot coordinate system;
and mapping the first real-time pose relation into the rectal B-ultrasound coordinate system based on the position mapping relation to obtain the real-time relative pose relation of the instrument reference target point in the rectal B-ultrasound coordinate system.
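In homogeneous coordinates, the mapping step of claim 2 reduces to a single matrix composition. A minimal sketch, assuming 4x4 homogeneous transforms such as the one returned by the registration sketch above:

    import numpy as np

    def pose_in_ultrasound_frame(T_us_robot, T_robot_target):
        """Map the first real-time pose relation (target pose in the robot
        frame) into the rectal B-ultrasound frame:
        T_us_target = T_us_robot @ T_robot_target."""
        return T_us_robot @ T_robot_target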
3. The method according to claim 2, wherein the performing kinematic analysis on the real-time motion information to obtain the first real-time pose relation of the instrument reference target point in the surgical robot coordinate system comprises:
performing a forward kinematics operation on the real-time motion information, and inputting the forward kinematics result into an inverse kinematics model of a mechanical arm of the surgical robot to obtain joint angles of the mechanical arm;
and inputting the joint angles of the mechanical arm into a forward kinematics model of the mechanical arm to determine the first real-time pose relation of the instrument reference target point in the surgical robot coordinate system.
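Claim 3 chains three computations: a forward solution on the control-arm motion, an inverse solution for the surgical arm's joint angles, and a forward solution of the surgical arm to recover the instrument pose. The sketch below runs that chain on a two-link planar arm purely for concreteness; the actual arms, their kinematic models and the solver are not specified in the patent, so the link lengths and every function here are stand-ins.

    import numpy as np

    L1, L2 = 0.30, 0.25  # assumed link lengths (m) of a toy planar arm

    def forward_kinematics(q):
        """Tip position (x, y) for joint angles q = (q1, q2)."""
        x = L1 * np.cos(q[0]) + L2 * np.cos(q[0] + q[1])
        y = L1 * np.sin(q[0]) + L2 * np.sin(q[0] + q[1])
        return np.array([x, y])

    def inverse_kinematics(tip):
        """Closed-form IK for the same planar arm (one elbow branch)."""
        x, y = tip
        c2 = (x**2 + y**2 - L1**2 - L2**2) / (2 * L1 * L2)
        q2 = np.arccos(np.clip(c2, -1.0, 1.0))
        q1 = np.arctan2(y, x) - np.arctan2(L2 * np.sin(q2), L1 + L2 * np.cos(q2))
        return np.array([q1, q2])

    # The chain recited in claim 3:
    q_master = np.array([0.4, 0.8])             # real-time motion information
    tip_desired = forward_kinematics(q_master)  # forward solution, control side
    q_arm = inverse_kinematics(tip_desired)     # inverse solution -> joint angles
    first_pose = forward_kinematics(q_arm)      # first real-time pose relation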
4. The method according to claim 1, wherein the determining the target rotation angle of the rectal B-ultrasound probe based on the real-time relative pose relation comprises:
inputting the real-time relative pose relation into an inverse kinematics model of the ultrasonic follow-up subsystem to obtain a preliminary adjustment rotation angle;
and correcting the preliminary adjustment rotation angle to obtain the target rotation angle.
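The inverse kinematics model of the ultrasonic follow-up subsystem is not detailed in the claims. For a side-firing rectal probe whose imaging plane contains the probe axis, one plausible reduction is a single rotational degree of freedom: rotate the probe until the imaging plane contains the instrument tip. Under that assumption (probe axis along z of the ultrasound frame, plane angle measured about z), the preliminary adjustment rotation angle is a single atan2; the sketch below is illustrative only.

    import numpy as np

    def preliminary_rotation_angle(tip_us, current_plane_angle):
        """Rotation (rad) bringing the imaging plane onto the instrument tip.

        tip_us: (x, y, z) position of the instrument reference target point
        in the rectal B-ultrasound frame, probe axis assumed along z.
        """
        target_angle = np.arctan2(tip_us[1], tip_us[0])  # azimuth of the tip
        delta = target_angle - current_plane_angle
        return (delta + np.pi) % (2 * np.pi) - np.pi     # wrap to [-pi, pi)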
5. The method according to claim 4, wherein the correcting the preliminary adjustment rotation angle to obtain the target rotation angle comprises:
correcting the preliminary adjustment rotation angle through a preset impedance control algorithm according to a sensing signal of a force sensor arranged on the ultrasonic follow-up subsystem.
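The "preset impedance control algorithm" of claim 5 is likewise unspecified. A common single-axis choice is a pure-stiffness impedance law that backs the rotation off in proportion to excess contact force; the gains and reference force below are assumed values, not the patented controller.

    def impedance_correct(angle_cmd, force_measured, force_ref=2.0, stiffness=50.0):
        """Soften the commanded probe rotation when contact force exceeds
        the reference: delta = (force_ref - force_measured) / stiffness.

        angle_cmd:      preliminary adjustment rotation angle (rad)
        force_measured: force-sensor reading (N), sensor on the rotating motor
        force_ref:      desired contact force (N) -- assumed
        stiffness:      virtual stiffness (N/rad) -- assumed
        """
        delta = (force_ref - force_measured) / stiffness
        return angle_cmd + delta  # target rotation angle after correction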
6. A surgical real-time navigation device, comprising:
a first position relation determining module, used for establishing a position mapping relation between an instrument reference target point of a surgical robot and a rectal B-ultrasound coordinate system according to positioning information of the instrument reference target point input by a user in an initial ultrasound image;
a second position relation determining module, used for analyzing a real-time relative pose relation between the instrument reference target point and the rectal B-ultrasound coordinate system according to the position mapping relation and real-time motion information generated by a control arm of the surgical robot under a surgical operation of the user;
and a navigation driving module, used for determining a target rotation angle of a rectal B-ultrasound probe based on the real-time relative pose relation, and driving an ultrasonic follow-up subsystem to rotate the rectal B-ultrasound probe by the target rotation angle, thereby updating the ultrasound imaging plane and realizing surgical real-time navigation.
7. A surgical real-time navigation system, comprising:
a surgical robot subsystem, an ultrasonic follow-up subsystem, and a navigation control subsystem;
wherein the surgical robot subsystem is used for enabling a mechanical arm of the surgical robot to move under the operation of a user so as to execute a surgical operation;
the ultrasonic follow-up subsystem comprises a rectal B-ultrasound probe and a probe moving device, and is used for controlling movement of the rectal B-ultrasound probe so that the probe acquires B-ultrasound images within its movement range;
and the navigation control subsystem is used for determining motion parameters of the ultrasonic follow-up subsystem according to mechanical arm motion information from the surgical robot subsystem and driving the ultrasonic follow-up subsystem to move the rectal B-ultrasound probe, thereby realizing the surgical real-time navigation method according to any one of claims 1 to 5.
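Tying the pieces together, the navigation control subsystem of claim 7 can be pictured as one control cycle per tick. The sketch below reuses the hypothetical helpers from the claim 2, claim 4 and claim 5 sketches above and is, again, an illustration rather than the claimed implementation.

    import numpy as np

    def navigation_tick(T_us_robot, instrument_pose_robot,
                        current_plane_angle, force_reading):
        """One illustrative cycle of the navigation control subsystem.
        Returns the corrected rotation command for the follow-up subsystem."""
        T_us_target = pose_in_ultrasound_frame(T_us_robot, instrument_pose_robot)
        tip_us = T_us_target[:3, 3]                     # target position, US frame
        angle = preliminary_rotation_angle(tip_us, current_plane_angle)
        return impedance_correct(angle, force_reading)  # claim-5 correction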
8. The system according to claim 7, wherein the ultrasonic follow-up subsystem is configured as a structure in which the rectal B-ultrasound probe is embedded in a multi-degree-of-freedom flexible guide arm, and comprises the rectal B-ultrasound probe, a rotating motor, a displacement sliding table, a pedestal and a force sensor;
wherein the rotating motor is used for controlling the rectal B-ultrasound probe to rotate;
the displacement sliding table is used for controlling the rectal B-ultrasound probe to move back and forth;
the pedestal is used for fixing the rectal B-ultrasound probe, the rotating motor and the displacement sliding table;
and the force sensor is arranged on the rotating motor.
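The hardware decomposition of claim 8 maps naturally onto a thin software abstraction: one actuator for rotation, one for axial travel, and a force input. The interface below is hypothetical (the claim defines hardware, not an API) and only shows how the recited parts compose.

    from dataclasses import dataclass

    @dataclass
    class UltrasoundFollowUpSubsystem:
        """Software stand-in for the claim-8 hardware: a rectal B-ultrasound
        probe driven by a rotating motor, a displacement sliding table for
        back-and-forth travel, and a force sensor on the motor."""
        plane_angle: float = 0.0  # current probe rotation (rad)
        axial_pos: float = 0.0    # current sliding-table position (m)

        def rotate(self, angle):
            self.plane_angle += angle   # rotating-motor command

        def translate(self, distance):
            self.axial_pos += distance  # sliding-table command

        def read_force(self):
            return 0.0                  # force-sensor stub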
9. A computer device, comprising:
one or more processors; and
a memory for storing one or more programs,
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the surgical real-time navigation method according to any one of claims 1 to 5.
10. A computer-readable storage medium on which a computer program is stored, wherein the program, when executed by a processor, implements the surgical real-time navigation method according to any one of claims 1 to 5.
CN202211287443.4A 2022-10-20 2022-10-20 Operation real-time navigation method, device, system, equipment and medium Pending CN115462901A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211287443.4A CN115462901A (en) 2022-10-20 2022-10-20 Operation real-time navigation method, device, system, equipment and medium


Publications (1)

Publication Number Publication Date
CN115462901A 2022-12-13

Family

ID=84336796

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211287443.4A Pending CN115462901A (en) 2022-10-20 2022-10-20 Operation real-time navigation method, device, system, equipment and medium

Country Status (1)

Country Link
CN (1) CN115462901A (en)

Similar Documents

Publication Title
US9526473B2 (en) Apparatus and method for medical image searching
EP3076875B1 (en) An ultrasound system with stereo image guidance or tracking
CN104739519B (en) Force feedback surgical robot control system based on augmented reality
US10284762B2 (en) System and method for targeting feedback
WO2022141153A1 (en) Ultrasonic positioning puncture system and storage medium
CN112754616B (en) Ultrasonic positioning puncture system and storage medium
CN116077155B (en) Surgical navigation method based on optical tracking equipment and mechanical arm and related device
CN114391953A (en) Navigation positioning system for orthopedics department
WO2022159229A1 (en) Service life management for an instrument of a robotic surgery system
CN115462901A (en) Operation real-time navigation method, device, system, equipment and medium
KR20160129311A (en) Robot system of intervention treatment of needle insert type
Black et al. Mixed reality human teleoperation
CN112043359A (en) Mammary gland puncture method, device, equipment and storage medium
WO2023116185A1 (en) Path determination method, electronic apparatus and computer-readable storage medium
KR20200096125A (en) Prescriptive guidance for ultrasound diagnostics
US20140024938A1 (en) Enhanced ultrasound imaging apparatus and associated methods of work flow
CN116585032A (en) Minimally invasive puncture system based on navigation system
CN113397708B (en) Particle puncture surgical robot navigation system
CN114668460A (en) Method and system for unifying spatial poses of puncture needles
CN110176300B (en) Puncture needle selection method, puncture needle selection device, server and storage medium
Trinh et al. Preliminary design and evaluation of an interfacing mechanism for maneuvering virtual minimally invasive surgical instruments
CN111223575A (en) Radiotherapy auxiliary display method and system based on virtual intelligent medical platform
CN215192074U (en) Puncture navigator and puncture navigation system
US11711596B2 (en) System and methods for determining proximity relative to an anatomical structure
US20230147826A1 (en) Interactive augmented reality system for laparoscopic and video assisted surgeries

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20230322

Address after: 215163 No. 2 Standard Workshop of Industrial Village, No. 300 Qingchengshan Road, Suzhou High-tech Zone, Suzhou City, Jiangsu Province

Applicant after: SUZHOU KANGDUO ROBOT CO.,LTD.

Address before: Room 08, 15/F, No. 368, Changjiang Road, Nangang District, Economic Development Zone, Harbin City, Heilongjiang Province

Applicant before: Harbin sizherui intelligent medical equipment Co.,Ltd.