CN115317097A - Robot navigation control method and device, electronic equipment and storage medium - Google Patents

Robot navigation control method and device, electronic equipment and storage medium

Info

Publication number
CN115317097A
CN115317097A (application CN202211014116.1A)
Authority
CN
China
Prior art keywords
image
puncture
coordinate
robot
medical
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211014116.1A
Other languages
Chinese (zh)
Inventor
王宁
谷野
王枫
王苑铮
张颖
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenyang Aijian Network Technology Co ltd
Original Assignee
Shenyang Aijian Network Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenyang Aijian Network Technology Co ltd filed Critical Shenyang Aijian Network Technology Co ltd
Priority to CN202211014116.1A priority Critical patent/CN115317097A/en
Publication of CN115317097A publication Critical patent/CN115317097A/en
Pending legal-status Critical Current

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 17/00 Surgical instruments, devices or methods, e.g. tourniquets
    • A61B 17/34 Trocars; Puncturing needles
    • A61B 17/3403 Needle locating or guiding means
    • A61B 2017/3405 Needle locating or guiding means using mechanical guide means
    • A61B 2017/3409 Needle locating or guiding means using mechanical guide means including needle or instrument drives
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/10 Computer-aided planning, simulation or modelling of surgical operations
    • A61B 2034/101 Computer-aided simulation of surgical operations
    • A61B 2034/102 Modelling of surgical devices, implants or prosthesis
    • A61B 2034/104 Modelling the effect of the tool, e.g. the effect of an implanted prosthesis or for predicting the effect of ablation or burring
    • A61B 2034/105 Modelling of the patient, e.g. for ligaments or bones
    • A61B 2034/107 Visualisation of planned trajectories or target regions
    • A61B 2034/108 Computer aided selection or customisation of medical implants or cutting guides
    • A61B 34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B 2034/2046 Tracking techniques
    • A61B 2034/2065 Tracking using image or pattern recognition
    • A61B 34/30 Surgical robots
    • A61B 34/70 Manipulators specially adapted for use in surgery

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Surgery (AREA)
  • Engineering & Computer Science (AREA)
  • Medical Informatics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Robotics (AREA)
  • Pathology (AREA)
  • Manipulator (AREA)

Abstract

The application discloses a robot navigation control method and apparatus, an electronic device, and a storage medium. The method performs image recognition on an image of a registration plate placed on the puncture plane to obtain perspective transformation parameters; applies a perspective transformation to the three-dimensional live-action image based on those parameters to obtain a simulated image; marks the initial position of the puncture needle held by the robot in the simulated image; acquires a medical image of the patient on the examination couch; registers the medical image with the simulated image to obtain the coordinate transformation relation between the robot coordinate system and the medical image; converts, through that relation, the target point, needle entry point, and puncture path marked on the medical image into target point, entry point, and puncture path coordinates in the robot coordinate system; and performs navigation control based on these parameters. Because the scheme registers the medical image directly against the three-dimensional real scene, positioning accuracy is improved.

Description

Robot navigation control method and device, electronic equipment and storage medium
Technical Field
The present application relates to the field of medical device technologies, and in particular, to a robot navigation control method and apparatus, an electronic device, and a storage medium.
Background
Robot-assisted puncture is a medical technology in which a computer processes medical image information and plans an operation scheme based on it; a positioning and navigation device then completes registration between the different coordinate systems and tracks the spatial positions of the interventional instruments and the patient, after which operation instructions are sent to a medical robot to adjust the angle and posture of its end effector, achieving accurate positioning and assisting manual puncture. Compared with manual operation, the robot offers high positioning accuracy, high motion stability, and strong repeatability, making interventional surgery more precise and reducing dependence on the doctor's experience. At present, magnetic positioning navigation is usually adopted to provide path navigation for a medical robot during percutaneous puncture.
In a magnetic positioning navigation system, a sensor coil is fixed to the tracked instrument (such as an ultrasonic probe or a puncture instrument). As the sensor coil moves relative to the magnetic field transmitter it generates current signals of varying intensity, from which the system locates the instrument. This scheme suffers from relatively high positioning latency, poor real-time performance, and low positioning accuracy, which reduces the operator's surgical precision.
Disclosure of Invention
In view of the above, the present application provides a robot navigation control method, apparatus, electronic device, and storage medium that provide navigation control for a medical robot during percutaneous puncture so as to improve positioning accuracy.
In order to achieve the above object, the following solutions are proposed:
a robot navigation control method of puncture path is applied to electronic equipment and is used for navigation control of a medical robot in an interventional operating room, the interventional operating room is provided with medical imaging equipment and image acquisition equipment, a puncture needle of the medical robot is positioned on a puncture plane perpendicular to an examining table of the medical imaging equipment, and the robot navigation control method comprises the following steps:
performing image recognition on a registration plate image of the registration plate on the puncture plane, acquired by the image acquisition device, to obtain perspective transformation parameters;
performing a perspective transformation on the three-dimensional live-action image acquired by the image acquisition device, based on the perspective transformation parameters, to obtain a simulated image of the three-dimensional live-action image;
marking the initial position of the puncture needle held by the medical robot in the simulated image to obtain the current position of the puncture needle;
acquiring a medical image of the patient on the examination couch, on which at least a target point, a needle entry point, and a puncture path are marked;
registering the medical image with the simulated image to obtain the coordinate transformation relation between the robot coordinate system of the medical robot and the medical image;
performing coordinate transformation on the current position, the target point, the needle entry point, and the puncture path based on the coordinate transformation relation, to obtain current position, target point, entry point, and puncture path coordinates in the robot coordinate system;
and, while the medical robot punctures the patient, performing navigation control on the medical robot based on the current position, target point, entry point, and puncture path coordinates.
Optionally, the medical image comprises one or more of a CT image, an MRI image, and a PET-CT image, or a fusion thereof.
Optionally, the navigation control of the medical robot based on the target point, entry point, and puncture path coordinates includes:
solving, from the target point, entry point, and puncture path coordinates, the insertion path by which the puncture needle reaches the entry point, the needle-holding angle posture, and the maximum puncture depth;
and controlling the medical robot based on the current position of the puncture needle, the insertion path, the needle-holding angle posture, and the maximum puncture depth.
A robot navigation control device is applied to an electronic device and used for navigation control of a medical robot in an interventional operating room. The interventional operating room is provided with a medical imaging device and an image acquisition device, and the puncture needle of the medical robot lies in a puncture plane perpendicular to the examination couch of the medical imaging device. The device comprises:
an image recognition module configured to perform image recognition on a registration plate image of the registration plate on the puncture plane, acquired by the image acquisition device, to obtain perspective transformation parameters;
an image transformation module configured to perform a perspective transformation on the three-dimensional live-action image acquired by the image acquisition device, based on the perspective transformation parameters, to obtain a simulated image of the three-dimensional live-action image;
a position marking module configured to mark the initial position of the puncture needle held by the medical robot in the simulated image to obtain the current position of the puncture needle;
an image acquisition module configured to acquire a medical image of the patient on the examination couch, on which at least a target point, a needle entry point, and a puncture path are marked;
an image registration module configured to register the medical image with the simulated image to obtain the coordinate transformation relation between the robot coordinate system of the medical robot and the medical image;
a coordinate transformation module configured to perform coordinate transformation on the current position, the target point, the needle entry point, and the puncture path based on the coordinate transformation relation, to obtain current position, target point, entry point, and puncture path coordinates in the robot coordinate system;
and a navigation execution module configured to perform navigation control on the medical robot based on the current position, target point, entry point, and puncture path coordinates while the medical robot punctures the patient.
Optionally, the medical image comprises one or more of a CT image, an MRI image, and a PET-CT image, or a fusion thereof.
Optionally, the navigation execution module includes:
a data calculation unit configured to solve, from the target point, entry point, and puncture path coordinates, the insertion path by which the puncture needle reaches the entry point, the needle-holding angle posture, and the maximum puncture depth;
and a puncture control unit configured to control the medical robot based on the current position of the puncture needle, the insertion path, the needle-holding angle posture, and the maximum puncture depth.
An electronic device comprising at least one processor and a memory coupled to the processor, wherein:
the memory is for storing a computer program or instructions;
the processor is configured to execute the computer program or instructions to enable the electronic device to implement the robot navigation control method as described above.
A storage medium for an electronic device, the storage medium carrying one or more computer programs which, when executed by the electronic device, cause the electronic device to implement the robot navigation control method described above.
As can be seen from the above technical scheme, the application discloses a robot navigation control method and apparatus, an electronic device, and a storage medium. The method performs image recognition on a registration plate image on the puncture plane to obtain perspective transformation parameters; applies a perspective transformation to the three-dimensional live-action image based on those parameters to obtain a simulated image; marks the initial position of the puncture needle held by the robot in the simulated image; acquires a medical image of the patient on the examination couch; registers the medical image with the simulated image to obtain the coordinate transformation relation between the robot coordinate system and the medical image; converts the target point, needle entry point, and puncture path on the medical image into target point, entry point, and puncture path coordinates in the robot coordinate system; and performs navigation control on the medical robot during the puncture based on these parameters. Practical experiments show that the scheme improves positioning accuracy because it is based on real-time image transformation.
In addition, the scheme requires no customized surgical instruments, is convenient to operate, easy to install and deploy, and low in acquisition and operating cost, making it a cost-effective intelligent medical aid well suited for deployment in primary hospitals. Since no marker device is placed in the operative field, the risk of intraoperative infection is reduced; and the navigation effect can be checked in real time, giving good intraoperative feedback.
Drawings
In order to illustrate the embodiments of the present application or the technical solutions in the prior art more clearly, the drawings used in describing the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description are only some embodiments of the present application, and other drawings can be obtained from them by those skilled in the art without creative effort.
FIG. 1 is a schematic layout of an interventional operating room according to an embodiment of the present application;
FIG. 2 is a flowchart of a robot navigation control method according to an embodiment of the present application;
FIG. 3 is a real-scene conceptual view of a registration plate image according to an embodiment of the present application;
FIG. 4 is a simulated image according to an embodiment of the present application;
FIG. 5 is a block diagram of a robot navigation control device according to an embodiment of the present application;
FIG. 6 is a block diagram of an electronic device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments in the present application without making any creative effort belong to the protection scope of the present application.
The scheme disclosed in the application is applied in an interventional operating room to implement navigation control of a medical robot while it punctures a patient. As shown in FIG. 1, the interventional operating room is configured with a medical robot E, a medical imaging device A, an image acquisition device G, and a planar laser emitter B. The planar laser emitter assists the operator in visually establishing the puncture plane F. The interventional operating room is also provided with a computer, workstation, or server, which has physical connections to the medical robot, the medical imaging device, the image acquisition device, and the planar laser emitter to exchange signals, data, and information.
The medical imaging device A is preferably a CT scanner whose examination couch C is perpendicular to the puncture plane, and the puncture plane passes through the patient D on the couch. The image acquisition device can be an ordinary industrial camera fitted with a fixed-focus optical lens. Based on this configuration, the present application discloses an embodiment of navigation control of the puncture path of the medical robot in the interventional operating room during a puncture operation on a patient.
Example one
Fig. 2 is a flowchart of a robot navigation control method according to an embodiment of the present application.
As shown in fig. 2, the robot navigation control method of the embodiment is applied to an electronic device, which is a computer, a workstation, or a server configured in the interventional operating room and connected to the above devices. The navigation control method specifically comprises the following steps:
s1, image recognition processing is carried out on the basis of the registration plate image.
That is, image recognition is performed on the registration plate image obtained by photographing, with the image acquisition device, the registration plate on the puncture plane. The purpose of this calibration step is to obtain the perspective transformation parameters of the perspective transformation matrix, which serve as the basis for establishing the transformation relationship between the robot coordinate system and the CT scan image coordinate system.
And S2, carrying out perspective transformation on the three-dimensional live-action image based on the perspective transformation parameters.
Namely, the three-dimensional live-action image collected by the image collecting device is subjected to perspective transformation based on the perspective transformation parameters to obtain a simulated image of the three-dimensional live-action image.
As shown in FIG. 3, the puncture plane lies over the examination couch of the CT scanner, and the light plane emitted by the planar laser emitter is coplanar with the CT slice plane. The image acquisition device is placed at a suitable position in the CT room (for example, about 2.4 m to the side of the puncture plane, about 80 cm from the centre line of the examination couch, and about 2.2 m high) and connected to the workstation through a network cable.
the registration plate H is placed at the proposed puncture location on the examination couch. And identifying the square blocks in the registration plate by using software, and acquiring the parameters of the perspective transformation matrix through software calculation. The image capturing device G in fig. 3 captures a realistic conceptual view of the puncture plane from a top view, i.e., the registration plate image. The puncture needle registration plate is arranged on the examination bed, is coplanar with the puncture plane, and is provided with a white square at the center; when the image acquisition device is shot from a top view angle, the square in the display image is not a standard square but a rhombus;
FIG. 4 is the simulated image obtained by applying the perspective transformation to FIG. 3. After the transformation, the squares of the registration plate are restored from rhombuses to true squares, i.e. a simulated image of the three-dimensional live-action image corresponding to the registration plate image is obtained.
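The patent does not give the implementation of this calibration step. As a minimal sketch, assuming the four corners of the registration-plate square can already be detected in the camera image, the perspective transformation matrix can be estimated from four point correspondences by the direct linear transform and then applied to any pixel on the puncture plane. All corner coordinates below are made-up illustrative values.

```python
import numpy as np

def perspective_matrix(src, dst):
    """Solve the 3x3 homography H mapping src[i] -> dst[i] (4 point pairs, DLT)."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    # Homogeneous least squares: h is the right singular vector of A
    # belonging to the smallest singular value.
    _, _, vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = vt[-1].reshape(3, 3)
    return H / H[2, 2]

def warp_point(H, pt):
    """Apply homography H to a single 2-D point (homogeneous divide)."""
    x, y, w = H @ np.array([pt[0], pt[1], 1.0])
    return np.array([x / w, y / w])

# Rhombus corners of the white square as seen by the camera (illustrative),
# and the true square they should map to in the simulated (rectified) image.
seen = [(120, 80), (260, 95), (275, 230), (110, 215)]
true = [(100, 100), (300, 100), (300, 300), (100, 300)]
H = perspective_matrix(seen, true)

# Any pixel of the live-action image on the puncture plane can now be rectified:
print(warp_point(H, (120, 80)))  # maps onto the first true corner
```

In OpenCV the same step would be `cv2.getPerspectiveTransform` followed by `cv2.warpPerspective` over the whole image; the pure-NumPy version above only shows the geometry.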
And S3, marking the initial position of the puncture needle in the simulated image.
Namely, the initial position of the puncture needle is marked in the simulated image, so that the current position of the puncture needle is obtained.
And S4, acquiring a medical image of the patient.
Before each puncture operation, a medical image obtained by some medical imaging means is acquired; on it, a doctor or other professional has marked the target point, the needle entry point, and the planned puncture path between them. The medical image includes, but is not limited to, one or more of a CT image, an MRI image, and a PET-CT image.
The medical image is acquired by scanning the patient on the examination couch in situ. During the scan, the laser positioning line provided by the CT device is used to attach a positioning marker on the puncture plane, so that the resulting CT image contains corresponding feature points. The positioning marker does not need to be attached to the patient's body.
And S5, carrying out image registration processing on the medical image of the patient and the simulated image of the three-dimensional real scene.
This step obtains the transformation relation of the robot coordinate system of the medical robot relative to the coordinate system of the medical image. The specific scheme is as follows:
First, two feature points (A0 and B0) are selected in the medical image of the patient; then two feature points (A1 and B1) corresponding to the actual positions of A0 and B0 are selected in the simulated image of the three-dimensional real scene. By scaling, rotating, and translating the images, the medical image is registered with the simulated image, i.e. a preliminary correspondence between the two is established;
then, the transformation relation of the robot coordinate system relative to the medical image coordinate system is obtained from the marked position of the puncture needle held by the medical robot in the simulated image of the three-dimensional real scene. The medical image coordinate system is the coordinate system of the medical image obtained on the puncture plane.
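The two-point registration described above determines a similarity transform (uniform scale, rotation, translation) between the two image planes. A sketch of how it could be computed, assuming the feature-point coordinates are already known; the coordinate values below are illustrative:

```python
import numpy as np

def similarity_from_two_points(a0, b0, a1, b1):
    """Scale s, rotation R, translation t with p1 = s * R @ p0 + t,
    fixed by the correspondences a0 -> a1 and b0 -> b1."""
    a0, b0 = np.asarray(a0, float), np.asarray(b0, float)
    a1, b1 = np.asarray(a1, float), np.asarray(b1, float)
    d0, d1 = b0 - a0, b1 - a1
    s = np.linalg.norm(d1) / np.linalg.norm(d0)                # uniform scale
    ang = np.arctan2(d1[1], d1[0]) - np.arctan2(d0[1], d0[0])  # rotation angle
    R = np.array([[np.cos(ang), -np.sin(ang)],
                  [np.sin(ang),  np.cos(ang)]])
    t = a1 - s * R @ a0                                        # translation
    return s, R, t

def apply_similarity(s, R, t, p):
    """Map a point from the first frame into the second."""
    return s * R @ np.asarray(p, float) + t

# A0/B0 in the medical image, A1/B1 in the simulated image (illustrative values).
A0, B0 = (10.0, 20.0), (50.0, 20.0)
A1, B1 = (105.0, 210.0), (185.0, 210.0)
s, R, t = similarity_from_two_points(A0, B0, A1, B1)
print(apply_similarity(s, R, t, B0))  # lands on B1
```

Once this transform is chained with the marked needle position, any point in the medical image can be expressed in the robot coordinate system.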
S6, carrying out position transformation on the target point, the needle inserting point and the puncture path.
Since the transformation relation links the medical image coordinate system to the robot coordinate system, once the medical image is acquired, the positions of the target point, the needle entry point, and the puncture path on it can be transformed through this relation into target point, entry point, and puncture path coordinates in the robot coordinate system.
And S7, performing navigation control on the medical robot in the puncture process.
Before puncture, the examination couch is moved so that the selected puncture slice, i.e. the puncture position, lies on the puncture plane (the laser plane F) and the laser line coincides with the marker point of the selected slice. At this moment, the position of the puncture needle held by the medical robot in the live-action image of the working plane is the initial position of the puncture needle.
During the puncture procedure, the current position at which the robot holds the puncture needle is first marked in the CT scan image of the selected slice; then, from the target point, entry point, and puncture path coordinates, the motion path by which the puncture needle reaches the entry point, the needle-holding angle posture, and the maximum puncture depth are solved.
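The patent leaves this solving step abstract. For a straight path within the 2-D puncture plane it reduces to elementary geometry: the needle-holding angle is the direction from the entry point to the target, and the maximum puncture depth is the entry-to-target distance. A sketch with made-up coordinates, all assumed to be in the robot coordinate system:

```python
import math

def solve_needle_plan(entry, target):
    """Return (angle_deg, max_depth) for a straight path in the puncture plane.

    angle_deg: needle direction from the entry point towards the target,
               measured from the robot's x-axis.
    max_depth: straight-line entry-to-target distance; the needle must
               never be driven deeper than this.
    """
    dx = target[0] - entry[0]
    dy = target[1] - entry[1]
    angle_deg = math.degrees(math.atan2(dy, dx))
    max_depth = math.hypot(dx, dy)
    return angle_deg, max_depth

# Illustrative robot-frame coordinates in millimetres.
entry = (120.0, 40.0)
target = (150.0, 80.0)
angle, depth = solve_needle_plan(entry, target)
print(round(angle, 1), round(depth, 1))  # 53.1 50.0
```

The motion path to the entry point itself would additionally depend on the robot's kinematics, which the patent does not specify.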
Then, according to the needs of the operation, the operator selects the insertion plan to execute and sends an operation instruction to the robot, which moves the puncture needle to the corresponding entry position, automatically aligns it to the planned angle, and carries out the puncture on the patient along the motion path, up to the maximum puncture depth.
When the puncture needle has reached 1/2 or 1/3 of the planned depth, the CT scan is repeated for verification; if the angle and the entry point meet the required puncture accuracy, the puncture continues to the planned depth; otherwise, an appropriate adjustment is made.
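This mid-course check can be expressed as a simple tolerance test. Whether the pause comes at 1/2 or 1/3 of the planned depth, and the tolerance values themselves, are choices the patent leaves open; the thresholds and coordinates below are illustrative:

```python
def verify_midcourse(planned_angle, measured_angle,
                     planned_entry, measured_entry,
                     angle_tol_deg=1.0, entry_tol_mm=1.0):
    """True if the re-scanned needle pose is within tolerance of the plan."""
    angle_ok = abs(measured_angle - planned_angle) <= angle_tol_deg
    dx = measured_entry[0] - planned_entry[0]
    dy = measured_entry[1] - planned_entry[1]
    entry_ok = (dx * dx + dy * dy) ** 0.5 <= entry_tol_mm
    return angle_ok and entry_ok

# After puncturing to half the planned depth, repeat the CT scan and check:
print(verify_midcourse(53.1, 53.4, (120.0, 40.0), (120.3, 40.2)))  # True
```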
As can be seen from the above technical scheme, this embodiment provides a robot navigation control method applied to an electronic device. The method performs image recognition on a registration plate image on the puncture plane to obtain perspective transformation parameters; applies a perspective transformation to the three-dimensional live-action image to obtain a simulated image; marks the initial position of the puncture needle held by the robot in the simulated image; acquires a medical image of the patient on the examination couch; registers the medical image with the simulated image to obtain the coordinate transformation relation between the robot coordinate system and the medical image; converts the target point, needle entry point, and puncture path on the medical image into coordinates in the robot coordinate system; and performs navigation control on the medical robot during the puncture based on these parameters. Practical experiments show that the scheme improves positioning accuracy because it is based on real-time image transformation.
Practical experiments show that the scheme improves positioning accuracy because registration is performed directly between the medical image and the three-dimensional real scene. No marker device is placed in the operative field, which reduces the risk of intraoperative infection; and the navigation effect can be checked in real time, giving good intraoperative feedback.
In addition, the scheme requires no customized surgical instruments, is convenient to operate, easy to install and deploy, and low in purchase and operating cost, making it a cost-effective intelligent medical aid well suited for deployment in primary hospitals.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems that perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
Although the operations are depicted in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order. Under certain circumstances, multitasking and parallel processing may be advantageous.
It should be understood that the various steps recited in the method embodiments of the present disclosure may be performed in a different order, and/or performed in parallel. Moreover, method embodiments may include additional steps and/or omit performing the illustrated steps. The scope of the present disclosure is not limited in this respect.
Computer program code for carrying out operations for the present disclosure may be written in any combination of one or more programming languages, including but not limited to object-oriented programming languages such as Java, Smalltalk, and C++, and conventional procedural programming languages such as the C language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer.
Example two
Fig. 5 is a block diagram of a navigation control device for a puncture path according to an embodiment of the present application.
As shown in fig. 5, the navigation control device of the present embodiment is applied to an electronic device, such as a computer, workstation, or server configured for the interventional operating room and connected to the devices described above. The navigation control device may be understood as the electronic device itself or a functional module thereof, and specifically includes an image recognition module 10, an image transformation module 20, a position marking module 30, an image acquisition module 40, an image registration module 50, a coordinate transformation module 60, and a navigation execution module 70.
The image recognition module is used for carrying out image recognition processing based on the registration plate image.
That is, image recognition is performed on the registration plate image obtained by photographing, with the image acquisition device, the registration plate on the puncture plane. The purpose of this calibration step is to obtain the perspective transformation parameters of the perspective transformation matrix, which serve as the basis for establishing the transformation relationship between the robot coordinate system and the CT scan image coordinate system.
The image transformation module is used for carrying out perspective transformation on the three-dimensional live-action image based on the perspective transformation parameters.
Namely, the three-dimensional live-action image acquired by the image acquisition equipment is subjected to perspective transformation based on the perspective transformation parameters to obtain a simulated image of the three-dimensional live-action image.
As shown in fig. 3, a puncture plane is arranged on the examination bed of the CT scanning device, and the laser plane emitted by the planar laser emitter is coplanar with the CT slice plane. The image acquisition device is placed at a suitable position in the CT operating room (for example, to one side, about 2.4 m from the puncture plane, about 80 cm from the center line of the examination bed, and about 2.2 m high) and is connected to the workstation through a network cable interface;
and a registration plate H is placed at the intended puncture position on the examination bed. The software identifies the squares in the registration plate and computes the parameters of the perspective transformation matrix. The image acquisition device G in fig. 3 captures the live-action view of the puncture plane from overhead, i.e., the registration plate image. The registration plate is placed on the examination bed, coplanar with the puncture plane, with a white square at its center; when photographed from the overhead viewpoint of the image acquisition device, this square appears in the image not as a true square but as a rhombus.
Fig. 4 shows the simulated image of the three-dimensional live-action image obtained by applying the perspective transformation to fig. 3. After the perspective transformation, the squares of the registration plate in fig. 4 are restored from rhombuses to true squares; that is, a simulated image of the three-dimensional live-action image corresponding to the registration plate image is obtained.
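The calibration described above — recovering a perspective transformation matrix from the four detected corners of the registration-plate square — can be sketched with the standard direct linear transform (the approach behind, e.g., OpenCV's getPerspectiveTransform). The corner coordinates below are hypothetical stand-ins for the detected rhombus, not values from the patent:

```python
import numpy as np

def perspective_matrix(src, dst):
    """Solve for the 3x3 homography H mapping src[i] -> dst[i]
    (four point pairs) -- the 'perspective transformation parameters'
    produced by the calibration step."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    # The homography is the null vector of A (smallest singular value).
    _, _, Vt = np.linalg.svd(np.array(A, dtype=float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]

def apply_h(H, pt):
    """Apply a homography to a 2D point (homogeneous coordinates)."""
    x, y, w = H @ np.array([pt[0], pt[1], 1.0])
    return np.array([x / w, y / w])

# Hypothetical corners: the registration-plate square photographed
# obliquely appears as a rhombus; we map it back to a true square.
rhombus = [(410, 120), (560, 180), (410, 240), (260, 180)]
square  = [(360, 130), (460, 130), (460, 230), (360, 230)]
H = perspective_matrix(rhombus, square)
print(np.round(apply_h(H, (410, 120))))  # -> [360. 130.]
```

Warping every pixel of the live-action frame with the same matrix H yields the simulated (top-down) image in which the rhombus becomes a square.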
The position marking module is used for marking the initial position of the puncture needle in the simulated image.
Namely, the initial position of the puncture needle is marked in the simulated image, so that the current position of the puncture needle is obtained.
The image acquisition module is used for acquiring medical images of a patient.
Before each puncture operation, a medical image obtained by any of various medical imaging means is acquired; the medical image carries the target point, the needle insertion point, and the planned puncture path between them, marked by a doctor or other professional. The medical image includes, but is not limited to, one or more of a CT image, an MRI image, and a PET-CT image.
The medical image is acquired by examining the patient in place on the examination bed. During the examination, the laser positioning line of the CT imaging device is used to attach a corresponding positioning mark on the puncture plane, so that the resulting CT image contains corresponding feature points. The positioning mark does not need to be attached to the patient's body.
The image registration module is used for performing image registration between the medical image of the patient and the simulated image of the three-dimensional live action.
That is, it obtains the transformation relation of the robot coordinate system of the medical robot relative to the coordinate system of the medical image. The specific scheme is as follows:
first, two first feature points (A0, B0) are selected in the patient's medical image, and two second feature points (A1, B1) corresponding to the actual positions of A0 and B0 are selected in the simulated image of the three-dimensional live action. Image scaling, rotation, and translation operations are then applied to register the patient's medical image with the simulated image, establishing a preliminary correspondence between the two;
then, the transformation relation of the robot coordinate system relative to the medical image coordinate system is obtained from the marked position, in the simulated image of the three-dimensional live action, of the puncture needle clamped by the medical robot. The medical image coordinate system is the coordinate system of the medical image obtained on the puncture plane.
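The two-feature-point registration above admits a closed-form solution: two point correspondences fully determine a 2D similarity transform (scale, rotation, translation). A minimal sketch, with hypothetical feature-point coordinates:

```python
import math

def similarity_from_two_points(a0, b0, a1, b1):
    """Similarity transform mapping medical-image points a0, b0 onto
    simulated-image points a1, b1, as in the two-feature-point
    registration step. Returns scale s, rotation angle (rad), and
    translation t."""
    vsx, vsy = b0[0] - a0[0], b0[1] - a0[1]
    vdx, vdy = b1[0] - a1[0], b1[1] - a1[1]
    s = math.hypot(vdx, vdy) / math.hypot(vsx, vsy)      # scale
    ang = math.atan2(vdy, vdx) - math.atan2(vsy, vsx)    # rotation
    c, sn = math.cos(ang), math.sin(ang)
    tx = a1[0] - s * (c * a0[0] - sn * a0[1])            # translation
    ty = a1[1] - s * (sn * a0[0] + c * a0[1])
    return s, ang, (tx, ty)

def transform(pt, s, ang, t):
    """Apply scale, rotation, and translation to a 2D point."""
    c, sn = math.cos(ang), math.sin(ang)
    return (s * (c * pt[0] - sn * pt[1]) + t[0],
            s * (sn * pt[0] + c * pt[1]) + t[1])

# Hypothetical feature points: A0, B0 in the CT image; A1, B1 their
# corresponding positions in the simulated live-action image.
s, ang, t = similarity_from_two_points((100, 100), (200, 100),
                                       (50, 60), (250, 60))
print(transform((200, 100), s, ang, t))  # (250.0, 60.0)
```

Because two correspondences determine the transform exactly, B0 maps precisely onto B1; any additional feature points would let one estimate the transform in a least-squares sense instead.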
The coordinate transformation module is used for carrying out position transformation on the target point, the needle inserting point and the puncture path.
Because the transformation relation links the coordinate system of the medical image and that of the robot, once the medical image is acquired, the positions of the target point, the needle insertion point, and the puncture path on the medical image can be transformed accordingly, yielding the coordinates of the target point, the needle insertion point, and the puncture path in the robot coordinate system.
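With the transformation relation in hand, mapping the marked points is one operation per point. A minimal sketch, assuming the registration result is expressed as a scale, a rotation angle, and a translation; all numeric values below are hypothetical placeholders:

```python
import math

def image_to_robot(pt, s, ang_deg, t):
    """Map a medical-image point into the robot coordinate system using
    a registration result: scale s, rotation ang_deg, translation t."""
    a = math.radians(ang_deg)
    x = s * (math.cos(a) * pt[0] - math.sin(a) * pt[1]) + t[0]
    y = s * (math.sin(a) * pt[0] + math.cos(a) * pt[1]) + t[1]
    return (x, y)

# Hypothetical registration result and marked points (image pixels).
s, ang_deg, t = 0.5, 0.0, (10.0, -5.0)
target_img, entry_img = (300, 200), (260, 120)
path_img = [(260, 120), (280, 160), (300, 200)]  # sampled path points

target_rb = image_to_robot(target_img, s, ang_deg, t)
entry_rb = image_to_robot(entry_img, s, ang_deg, t)
path_rb = [image_to_robot(p, s, ang_deg, t) for p in path_img]
print(target_rb)  # (160.0, 95.0)
```

The puncture path is handled simply as a sequence of points, each transformed the same way as the target and entry points.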
The navigation execution module is used for performing navigation control on the medical robot in the puncture process.
When the puncture is performed, the examination bed is moved first so that the selected puncture slice, i.e., the puncture position, is brought onto the puncture plane (i.e., the laser plane F), with the laser line of the puncture plane coinciding with the mark point of the selected puncture slice. At this moment, the position of the puncture needle clamped by the medical robot in the live-action image of the working plane is the initial position of the puncture needle. The module comprises a data calculating unit and a puncture control unit.
During the puncture, the data calculating unit marks the current position of the puncture needle clamped by the robot in the CT scan image of the selected puncture slice, and solves, based on the target point coordinates, the needle insertion point coordinates, and the puncture path coordinates, parameters such as the motion path by which the puncture needle reaches the insertion point, the needle-holding angle posture, and the maximum puncture depth.
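The data calculating unit's solving step reduces to vector geometry in the puncture plane. The coordinates, units, and the angle convention below are assumptions for illustration, not values from the patent:

```python
import math

def plan_puncture(current, entry, target):
    """From the current needle-tip position, the needle-insertion point
    and the target point (2D robot-frame coordinates, hypothetical
    units of mm), derive the motion path to the entry point, the
    needle-holding angle and the maximum puncture depth."""
    approach = (entry[0] - current[0], entry[1] - current[1])
    needle = (target[0] - entry[0], target[1] - entry[1])
    max_depth = math.hypot(*needle)  # never insert past the target
    # Needle-holding angle measured from the straight-down direction
    # (an assumed convention, not specified in the text).
    angle = math.degrees(math.atan2(needle[0], needle[1]))
    return approach, angle, max_depth

approach, angle, depth = plan_puncture(
    current=(0, 0), entry=(120, 40), target=(150, 80))
print(round(angle, 2))  # 36.87  (3-4-5 triangle)
print(depth)            # 50.0
```

The maximum puncture depth equals the entry-to-target distance, so the control unit can cap insertion even if intermediate feedback is lost.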
The operator can then select the insertion plan to execute according to surgical requirements and send an operation instruction to the robot, which delivers the puncture needle to the corresponding insertion position; the puncture control unit automatically aligns the needle to the planned angle and carries out the puncture on the patient along the motion path up to the maximum puncture depth.
When the puncture needle has reached 1/2 or 1/3 of the preset depth, the CT scan verification is repeated; if the angle and the insertion point meet the puncture precision requirements, the puncture continues to the preset depth; otherwise, appropriate adjustments are made.
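The mid-puncture verification can be sketched as a simple tolerance check; the position and angle tolerances below are hypothetical example values, not specified by the patent:

```python
import math

def within_tolerance(planned_entry, observed_entry, planned_angle,
                     observed_angle, pos_tol_mm=2.0, ang_tol_deg=1.0):
    """After the repeat CT scan at 1/2 or 1/3 of the preset depth,
    decide whether the puncture may continue to full depth.
    Tolerance defaults are illustrative assumptions."""
    pos_err = math.dist(planned_entry, observed_entry)  # entry-point drift
    ang_err = abs(planned_angle - observed_angle)       # angular drift
    return pos_err <= pos_tol_mm and ang_err <= ang_tol_deg

print(within_tolerance((120, 40), (121, 40.5), 36.9, 37.2))  # True
print(within_tolerance((120, 40), (125, 44), 36.9, 39.0))    # False
```

In the first case the needle continues to the preset depth; in the second, the plan is adjusted before resuming.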
From the above technical solution, the present embodiment provides a robot navigation control device applied to an electronic device. The device specifically performs image recognition based on the registration plate image on the puncture plane to obtain perspective transformation parameters; performs perspective transformation on the three-dimensional live-action image based on the perspective transformation parameters to obtain a simulated image; marks the initial position of the puncture needle clamped by the robot in the simulated image; acquires a medical image of the patient on the examination bed; performs image registration between the medical image and the simulated image to obtain the coordinate transformation relation between the robot coordinate system and the medical image; converts, based on the coordinate transformation relation, the target point, the needle insertion point, and the puncture path on the medical image into target point coordinates, needle insertion point coordinates, and puncture path coordinates in the robot coordinate system; and performs navigation control of the medical robot based on these parameters during the puncture process. Practical experiments show that the scheme improves positioning precision because it is based on real-time image transformation.
The units described in the embodiments of the present disclosure may be implemented by software or hardware. Where the name of a unit does not in some cases constitute a limitation of the unit itself, for example, the first retrieving unit may also be described as a "unit for retrieving at least two internet protocol addresses".
The functions described herein above may be performed, at least in part, by one or more hardware logic components. For example, without limitation, exemplary types of hardware logic components that may be used include: field-programmable gate arrays (FPGAs), application-specific integrated circuits (ASICs), application-specific standard products (ASSPs), systems on a chip (SOCs), complex programmable logic devices (CPLDs), and the like.
Example three
Fig. 6 is a block diagram of an electronic device according to an embodiment of the present application.
Referring now to FIG. 6, a schematic diagram of an electronic device suitable for use in implementing embodiments of the present disclosure is shown. The terminal device in the embodiments of the present disclosure may include, but is not limited to, a mobile terminal such as a mobile phone, a notebook computer, a digital broadcast receiver, a PDA (personal digital assistant), a PAD (tablet computer), a PMP (portable multimedia player), a vehicle terminal (e.g., a car navigation terminal), and the like, and a fixed terminal such as a digital TV, a desktop computer, and the like. The electronic device shown in fig. 6 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present disclosure.
As shown in fig. 6, the electronic device may include a processing device (e.g., central processing unit, graphics processor, etc.) 601 that may perform various appropriate actions and processes according to a program stored in a read-only memory (ROM) 602 or a program loaded from a storage device 608 into a random access memory (RAM) 603. In the RAM 603, various programs and data necessary for the operation of the electronic device are also stored. The processing device 601, the ROM 602, and the RAM 603 are connected to each other via a bus 604. An input/output (I/O) interface 605 is also connected to bus 604.
Generally, the following devices may be connected to the I/O interface 605: input devices 606 including, for example, a touch screen, touch pad, keyboard, mouse, camera, microphone, accelerometer, gyroscope, etc.; output devices 607 including, for example, a liquid crystal display (LCD), a speaker, a vibrator, and the like; storage devices 608 including, for example, magnetic tape, hard disk, etc.; and a communication device 609. The communication device 609 may allow the electronic device to communicate with other devices wirelessly or by wire to exchange data. While fig. 6 illustrates an electronic device having various means, it is to be understood that not all illustrated means are required to be implemented or provided. More or fewer devices may alternatively be implemented or provided.
When the program of this embodiment is executed, the electronic device can implement the navigation control method for a puncture path provided in embodiment one. The method specifically performs image calibration based on the image of the registration plate on the puncture plane to obtain a simulated image; performs image registration based on the registration plate image and the simulated image to obtain the transformation relation of the robot coordinate system of the medical robot relative to the medical image coordinate system; acquires a medical image of the patient on the examination bed; transforms the positions of the target point, the needle insertion point, and the puncture path on the medical image based on the transformation relation to obtain target point coordinates, needle insertion point coordinates, and puncture path coordinates in the robot coordinate system; and performs navigation control of the medical robot based on these parameters during the puncture. Practical experiments show that the scheme improves positioning precision because it is based on real-time image transformation.
Example four
This embodiment provides a computer-readable storage medium carrying one or more programs which, when executed by an electronic device, enable the electronic device to implement the robot navigation control method for a puncture path disclosed in the present application. The method specifically performs image recognition based on the registration plate image on the puncture plane to obtain perspective transformation parameters; performs perspective transformation on the three-dimensional live-action image based on the perspective transformation parameters to obtain a simulated image; marks the initial position of the puncture needle clamped by the robot in the simulated image; acquires a medical image of the patient on the examination bed; performs image registration between the medical image and the simulated image to obtain the coordinate transformation relation between the robot coordinate system and the medical image; converts, based on the coordinate transformation relation, the target point, the needle insertion point, and the puncture path on the medical image into target point coordinates, needle insertion point coordinates, and puncture path coordinates in the robot coordinate system; and performs navigation control of the medical robot based on these parameters during the puncture process. Practical experiments show that the scheme improves positioning precision because it is based on real-time image transformation.
It should be noted that the computer readable storage medium of the present disclosure may be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
In the present disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In contrast, in the present disclosure, a computer readable signal medium may comprise a propagated data signal with computer readable program code embodied therein, either in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, optical cables, RF (radio frequency), etc., or any suitable combination of the foregoing.
The embodiments in the present specification are all described in a progressive manner, and each embodiment focuses on differences from other embodiments, and portions that are the same and similar between the embodiments may be referred to each other.
While preferred embodiments of the present invention have been described, additional variations and modifications of these embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. Therefore, it is intended that the appended claims be interpreted as including the preferred embodiment and all changes and modifications that fall within the true scope of the embodiments of the present invention.
Finally, it should also be noted that, in this document, relational terms such as first and second are used solely to distinguish one entity or action from another, without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "include", "including", or any other variations thereof are intended to cover non-exclusive inclusion, so that a process, method, article, or terminal device including a series of elements includes not only those elements but also other elements not explicitly listed or inherent to such process, method, article, or terminal device. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of additional like elements in a process, method, article, or terminal device that comprises the element.
The technical solutions provided by the present invention are described in detail above. Specific examples have been used herein to explain the principle and implementation of the present invention, and the descriptions of these examples are only intended to help in understanding the method and its core idea. Meanwhile, for a person skilled in the art, there may be variations in the specific embodiments and application scope according to the idea of the present invention. In summary, the content of this specification should not be construed as a limitation of the present invention.

Claims (8)

1. A robot navigation control method for a puncture path, applied to an electronic device and used for navigation control of a medical robot in an interventional operating room, the interventional operating room being provided with a medical imaging device and an image acquisition device, and a puncture needle of the medical robot being located on a puncture plane perpendicular to an examination bed of the medical imaging device, characterized in that the robot navigation control method comprises the following steps:
performing image recognition based on the registration plate image of the registration plate on the puncture plane, which is acquired by the image acquisition equipment, to obtain a perspective transformation parameter;
carrying out perspective transformation on the three-dimensional live-action image acquired by the image acquisition equipment based on the perspective transformation parameters to obtain a simulated image of the three-dimensional live-action image;
marking the initial position of the puncture needle clamped by the medical robot in the simulated image to obtain the current position of the puncture needle;
acquiring a medical image of a patient on the examining table, wherein at least a target point, a needle inserting point and a puncture path are marked on the medical image;
registering the medical image and the simulation image to obtain a coordinate transformation relation between a robot coordinate system of the medical robot and the medical image;
performing coordinate transformation on the current position, the target point, the needle feeding point and the puncture path based on the coordinate transformation relation to obtain a current position coordinate, a target point coordinate, a needle feeding point coordinate and a puncture path coordinate based on the robot coordinate system;
and in the process that the medical robot punctures a patient, performing navigation control on the medical robot based on the current position coordinate, the target point coordinate, the needle feeding point coordinate and the puncture path coordinate.
2. The robot navigation control method of claim 1, wherein the medical image comprises one or more of a CT image, an MRI image, and a PET-CT image, or a fused image thereof.
3. The navigation control method according to claim 1, wherein the navigation control of the medical robot based on the target point coordinates, the needle insertion point coordinates, and the puncture path coordinates includes the steps of:
resolving the target point coordinate, the needle feeding point coordinate and the puncture path coordinate to obtain a needle feeding path, a needle holding angle posture and a maximum puncture depth of the puncture needle reaching the needle feeding point;
and controlling the medical robot based on the current position of the puncture needle, the needle inserting path, the needle holding angle posture and the maximum puncture depth.
4. A robot navigation control device, applied to an electronic device and used for navigation control of a medical robot in an interventional operating room, the interventional operating room being provided with a medical imaging device and an image acquisition device, and a puncture needle of the medical robot being located on a puncture plane perpendicular to an examination bed of the medical imaging device, characterized in that the robot navigation control device comprises:
the image identification module is configured to perform image identification on the basis of a registration plate image of a registration plate on the puncture plane, which is acquired by the image acquisition equipment, so as to obtain a perspective transformation parameter;
the image transformation module is configured to perform perspective transformation on the three-dimensional live-action image acquired by the image acquisition equipment based on the perspective transformation parameters to obtain a simulated image of the three-dimensional live-action image;
the position marking module is configured to mark the initial position of the puncture needle clamped by the medical robot in the simulated image to obtain the current position of the puncture needle;
the system comprises an image acquisition module, a medical image acquisition module and a control module, wherein the image acquisition module is configured to acquire a medical image of a patient on the examination bed, and at least a target point, a needle inlet point and a puncture path are marked on the medical image;
an image registration module configured to perform registration processing on the medical image and the simulation image to obtain a coordinate transformation relationship between a robot coordinate system of the medical robot and the medical image;
the coordinate transformation module is configured to perform coordinate transformation on the current position, the target point, the needle feeding point and the puncture path based on the coordinate transformation relation to obtain a current position coordinate, a target point coordinate, a needle feeding point coordinate and a puncture path coordinate based on the robot coordinate system;
and the navigation execution module is configured to perform navigation control on the medical robot based on the current position coordinate, the target point coordinate, the needle feeding point coordinate and the puncture path coordinate in the process that the medical robot punctures the patient.
5. The navigation control device of claim 4, wherein the medical image comprises one or more of a CT image, an MRI image, and a PET-CT image, or a fused image thereof.
6. The navigation control device of claim 4, wherein the navigation execution module comprises:
the data calculating unit is used for calculating the target point coordinate, the needle feeding point coordinate and the puncture path coordinate to obtain a needle feeding path, a needle holding angle posture and a maximum puncture depth of the puncture needle reaching the needle feeding point;
and the puncture control unit is used for controlling the medical robot based on the current position of the puncture needle, the needle inserting path, the needle holding angle posture and the maximum puncture depth.
7. An electronic device comprising at least one processor and a memory coupled to the processor, wherein:
the memory is used for storing computer programs or instructions;
the processor is configured to execute the computer program or instructions to enable the electronic device to implement the robot navigation control method according to any one of claims 1 to 3.
8. A storage medium applied to an electronic device, wherein the storage medium is configured to carry one or more computer programs which, when executed by the electronic device, enable the electronic device to implement the robot navigation control method according to any one of claims 1 to 3.
CN202211014116.1A 2022-08-23 2022-08-23 Robot navigation control method and device, electronic equipment and storage medium Pending CN115317097A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211014116.1A CN115317097A (en) 2022-08-23 2022-08-23 Robot navigation control method and device, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211014116.1A CN115317097A (en) 2022-08-23 2022-08-23 Robot navigation control method and device, electronic equipment and storage medium

Publications (1)

Publication Number Publication Date
CN115317097A true CN115317097A (en) 2022-11-11

Family

ID=83926248

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211014116.1A Pending CN115317097A (en) 2022-08-23 2022-08-23 Robot navigation control method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN115317097A (en)


Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115998425A (en) * 2022-12-27 2023-04-25 东莞市人民医院 Puncture navigation method and device under CT guidance and terminal equipment
CN115998425B (en) * 2022-12-27 2023-10-03 东莞市人民医院 Puncture navigation method and device under CT guidance and terminal equipment
CN117598787A (en) * 2024-01-08 2024-02-27 上海卓昕医疗科技有限公司 Medical instrument navigation method, device, equipment and medium based on medical image

Similar Documents

Publication Publication Date Title
US11025889B2 (en) Systems and methods for determining three dimensional measurements in telemedicine application
CN115317097A (en) Robot navigation control method and device, electronic equipment and storage medium
US9978141B2 (en) System and method for fused image based navigation with late marker placement
CN107049488B (en) Single-plane operation positioning method and model
US20100249595A1 (en) System and method for automatic calibration of tracked ultrasound
CN110420050B (en) CT-guided puncture method and related device
CN107105972A (en) Model register system and method
CN106063726A (en) Puncture navigation system and air navigation aid thereof in real time
CN101099673A (en) Surgical instrument positioning method using infrared reflecting ball as symbolic point
CN105863674A (en) Segment erection pose detection device and detection method
CN110742631A (en) Imaging method and device for medical image
Morgan et al. Hand-eye calibration for surgical cameras: a procrustean perspective-n-point solution
Chan et al. A needle tracking device for ultrasound guided percutaneous procedures
CN111493878A (en) Optical three-dimensional scanning device for orthopedic surgery and method for measuring bone surface
CN115153855A (en) Positioning alignment method and device of micro mechanical arm and electronic equipment
US20220327735A1 (en) Ultrasound probe position registration method, ultrasound imaging system, ultrasound probe position registration system, ultrasound probe position registration phantom, and ultrasound probe position registration program
CN112183657B (en) Method and device for acquiring annotation information, electronic equipment and computer readable medium
CN115317098A (en) Method and device for controlling implantation of radioactive particles, electronic apparatus, and storage medium
EP3655919A1 (en) Systems and methods for determining three dimensional measurements in telemedicine application
JP2022047374A (en) Surgical navigation system, medical imaging system with surgical navigation function, and registration method of medical images for surgical navigation
CN112107366B (en) Mixed reality ultrasonic navigation system
US20220313363A1 (en) Optical System And Apparatus For Instrument Projection And Tracking
CN113855238B (en) Registration method, device, medium and electronic equipment for two-dimensional image
Khosravi et al. One-step needle pose estimation for ultrasound guided biopsies
US20230200775A1 (en) Ultrasonic imaging system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination