CN110916799A - Puncture robot navigation system based on 5G network - Google Patents


Info

Publication number: CN110916799A
Application number: CN201911156410.4A
Authority: CN (China)
Other languages: Chinese (zh)
Prior art keywords: coordinate system, robot, point, puncture, image
Legal status: Pending (the status is an assumption, not a legal conclusion)
Inventors: 张华东, 骆敏舟, 王永, 辛艳峰
Current and Original Assignee: Institute of Intelligent Manufacturing Technology JITRI (the listed assignees may be inaccurate)
Application filed by Institute of Intelligent Manufacturing Technology JITRI
Priority to CN201911156410.4A
Publication of CN110916799A


Classifications

    • A — HUMAN NECESSITIES
    • A61 — MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B — DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 — Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/20 — Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B 2034/2046 — Tracking techniques
    • A61B 2034/2065 — Tracking using image or pattern recognition


Abstract

The invention provides a puncture robot navigation system based on a 5G network. The system comprises a local puncture navigation system and a remote puncture server that communicate over the 5G network. The local puncture navigation system uses marker points to establish a spatial mapping among the CT coordinate system, the robot base coordinate system, and the binocular-camera world coordinate system: the CT coordinate system is mapped into the robot base coordinate system via the fixed binocular-camera world coordinate system. After the mapping, the current marker position is sent to the robot, a target-point path is determined for the doctor, and the path is sent over the 5G network to the remote puncture server, through which the doctor controls the robot to complete the operation. The invention enables operation from a different site: thanks to the low latency and high efficiency of 5G, a doctor can operate the system remotely while video images and manipulator data are transmitted in real time, providing an accurate puncture navigation path for the doctors in the operating room.

Description

Puncture robot navigation system based on 5G network
Technical Field
The invention relates to the technical field of medical equipment, in particular to a puncture robot navigation system based on a 5G network.
Background
With the development of 5G technology, high transmission rates and low latency have become synonymous with 5G. In medicine, a puncture robot navigation system with a binocular-vision dynamic-compensation function is the key to the whole operation; its role is to obtain the spatial position of the target point in the robot frame in real time. First, pre-operative and intra-operative medical images are registered, as are the intra-operative images, the patient, and the surgical instruments. After the surgical instruments are driven to the required positions through the corresponding coordinate conversions, specific marker points are tracked to obtain their motion trails, and the trail coordinates are fed back to the robot in real time through intelligent analysis, so that the robot tracks the target point within a certain range.
By exploiting the low latency of 5G, a doctor at a remote site can operate the robot to determine the path, which spares doctors from traveling back and forth, shortens the operation time, and saves public resources.
Research on a puncture robot navigation system combining 5G with a binocular-vision dynamic-compensation function is therefore of great significance.
Disclosure of Invention
The technical problem to be solved by the invention is as follows: to overcome the defects in the prior art, the invention provides a puncture robot navigation system based on a 5G network. The system greatly improves the doctor's puncture precision during the operation, relieves the patient's pain, greatly shortens the doctor's operation time, and improves efficiency; at the same time, the 5G network gives patients easy access to the resources of large hospitals, improves patients' morale during recovery, and allocates medical resources more reasonably.
The technical scheme adopted for solving the technical problems is as follows: a puncture robot navigation system based on a 5G network comprises a local puncture navigation system and a remote puncture server, wherein the local puncture navigation system and the remote puncture server are communicated through the 5G network;
the local puncture navigation system uses marker points to establish a spatial mapping among the CT coordinate system, the robot base coordinate system, and the binocular-camera world coordinate system; the CT coordinate system is mapped into the robot base coordinate system via the fixed binocular-camera world coordinate system. After the mapping, the current marker position is sent to the robot, a target-point path is determined for the doctor, and the path is sent over the 5G network to the remote puncture server, through which the doctor controls the robot to complete the operation.
Further, a binocular camera of the local puncture navigation system is connected to a controller of the robot through a USB3.0 data transmission line, the controller maps data frame images of the binocular camera into channels through a 5G network, and a remote puncture server receives data in real time; and the algorithm program of the controller is also mapped to the remote puncture server through the 5G channel.
Further, the space mapping process of the mark points comprises four steps:
1) locating the lesion target point, the puncture needle-insertion point, and the puncture-path pose in the medical image space;
2) locating the marker points on the patient's skin surface in the binocular-camera image space;
3) locating the spatial coordinates of the robot in the binocular-camera image space;
4) calculating the positions of the lesion target point, the puncture needle-insertion point, and the puncture-path pose in the robot space.
Before the operation, the puncture site of the patient is first roughly determined, and then N (N ≥ 4) circular marker points that are visible under CT scanning are attached to the body surface near the puncture site, such that no 3 marker points are collinear and the 4 marker points are not coplanar. In this study, a 3D model of a human skeleton is selected for the experiment, with the marker points attached as shown in the figure. After CT or MRI scanning, the imaging of the circular markers can be obtained clearly and accurately on the medical image through three-dimensional reconstruction, yielding the coordinates of the markers in the medical image space.
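The placement constraints above (no 3 markers collinear, 4 markers not coplanar) can be checked numerically. The following is a minimal sketch, not part of the patent; the function name and tolerance are our own choices:

```python
import numpy as np
from itertools import combinations

def markers_well_placed(points, tol=1e-6):
    """Check the constraints from the text: with N >= 4 markers,
    no 3 are collinear and at least one 4-tuple is non-coplanar."""
    pts = np.asarray(points, dtype=float)
    n = len(pts)
    if n < 4:
        return False
    # No 3 collinear: the cross product of the two edge vectors must be nonzero.
    for i, j, k in combinations(range(n), 3):
        if np.linalg.norm(np.cross(pts[j] - pts[i], pts[k] - pts[i])) < tol:
            return False
    # At least one 4-tuple non-coplanar: nonzero scalar triple product.
    for i, j, k, l in combinations(range(n), 4):
        v = np.stack([pts[j] - pts[i], pts[k] - pts[i], pts[l] - pts[i]])
        if abs(np.linalg.det(v)) > tol:
            return True
    return False
```

A degenerate placement (all markers in one plane, or three on a line) would make the affine basis used later singular, which is why the check matters before registration.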
The binocular camera identifies the image coordinates of the circular marker points in the binocular image plane and accurately deduces the coordinates of the marker centers in the visual-space coordinate system; at the same time, by identifying the circular marker points on the robot body, the binocular camera deduces the robot's coordinates in the visual-space coordinate system. In this way the coordinate positions of the marker points in the image space and in the robot space are determined, with a one-to-one correspondence between them.
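The patent does not specify the triangulation method. As an illustration only, a marker center seen in a rectified stereo pair can be lifted to 3D with the standard pinhole/disparity model (the parameter names and values below are assumptions, not the patent's calibration):

```python
import numpy as np

def triangulate_marker(uv_left, uv_right, f, cx, cy, baseline):
    """Recover the 3D centre of a marker in the camera (world) frame from its
    pixel coordinates in a rectified stereo pair (ideal pinhole model).
    f: focal length in pixels; (cx, cy): principal point; baseline in metres."""
    xl, yl = uv_left
    xr, _ = uv_right          # in a rectified pair the rows coincide
    disparity = xl - xr
    if disparity <= 0:
        raise ValueError("marker must have positive disparity")
    Z = f * baseline / disparity
    X = (xl - cx) * Z / f
    Y = (yl - cy) * Z / f
    return np.array([X, Y, Z])
```

With f = 500 px, principal point (320, 240), and a 0.1 m baseline, a marker seen at (370, 265) in the left image and (320, 265) in the right lies at (0.1, 0.05, 1.0) m in the camera frame.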
The spatial mapping of the auxiliary precise puncture robot system based on the mark points mainly comprises four coordinate systems of a computer image coordinate system { V }, a patient coordinate system { P }, an optical coordinate system { M } and a robot coordinate system { R }, and the mapping relation is described as follows.
Specifically, the conversion relationship between the image coordinate system and the 3D model affine coordinate system is as follows:
No fewer than 4 marker points are attached to the 3D model and used to establish a 3D model affine coordinate system: a reference coordinate system is built from 4 markers that do not lie in the same plane. One marker M0(x_{m0}, y_{m0}, z_{m0}) is chosen as the origin of the reference coordinate system, with its three coordinate axes parallel to those of the CT image coordinate system; the 3D model affine coordinate system is thus translated relative to the CT image coordinate system by x_{m0}, y_{m0}, z_{m0} along the x, y, and z axes, with zero rotation about each axis. The remaining three markers M1(x_{m1}, y_{m1}, z_{m1}), M2(x_{m2}, y_{m2}, z_{m2}), M3(x_{m3}, y_{m3}, z_{m3}) establish the affine relation, and the mapping matrix T_1 from the 3D model affine coordinate system to the image coordinate system can be expressed as follows:

T_1 = \begin{bmatrix} x_{m1}-x_{m0} & x_{m2}-x_{m0} & x_{m3}-x_{m0} & x_{m0} \\ y_{m1}-y_{m0} & y_{m2}-y_{m0} & y_{m3}-y_{m0} & y_{m0} \\ z_{m1}-z_{m0} & z_{m2}-z_{m0} & z_{m3}-z_{m0} & z_{m0} \\ 0 & 0 & 0 & 1 \end{bmatrix}   (1)
Thus the 3D model affine coordinate system and the image coordinate system are in one-to-one correspondence: any point P(x_{mp}, y_{mp}, z_{mp}) in the image coordinate system has a unique corresponding point on the 3D model, and the conversion formula (in homogeneous coordinates) can be expressed as:

P_{3Daffine} = T_1^{-1} P_{image}   (2)

where P_{3Daffine} represents the corresponding point obtained from a point in the image coordinate system via formula (2), and P_{image} represents the point in the image coordinate system.
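The affine mapping of formulas (1) and (2) can be sketched in Python with NumPy. This is an illustration under the assumption of ideal, noise-free marker coordinates; the function names are ours, not the patent's:

```python
import numpy as np

def affine_basis_matrix(m0, m1, m2, m3):
    """4x4 homogeneous matrix whose columns are the affine basis vectors
    (Mi - M0) and whose last column is the origin M0 — formula (1)."""
    m0, m1, m2, m3 = (np.asarray(m, dtype=float) for m in (m0, m1, m2, m3))
    T = np.eye(4)
    T[:3, 0] = m1 - m0
    T[:3, 1] = m2 - m0
    T[:3, 2] = m3 - m0
    T[:3, 3] = m0
    return T

def to_affine(T1, p_image):
    """Formula (2): P_3Daffine = T1^{-1} * P_image, in homogeneous coordinates."""
    ph = np.append(np.asarray(p_image, dtype=float), 1.0)
    return (np.linalg.inv(T1) @ ph)[:3]
```

For markers M0 = (1, 2, 3), M1 = (2, 2, 3), M2 = (1, 3, 3), M3 = (1, 2, 5), the image-space point (1.5, 2, 3) — halfway from M0 to M1 — has affine coordinates (0.5, 0, 0).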
Specifically, the conversion relationship between the 3D model affine coordinate system and the binocular camera optical coordinate system is as follows:
the coordinate positions of the four marking points under the optical coordinate system are obtained by performing optical positioning calculation through the binocular camera, and a transformation matrix from the camera optical coordinate system to a 3D model affine coordinate system can be established as well:
Figure BDA0002284924140000031
wherein (x)cr0,ycr0,zcr0)、(xcr1,ycr1,zcr1)、(xcr2,ycr2,zcr2) And (x)cr3,ycr3,zcr3) Respectively representing point coordinates under a world coordinate system of the binocular camera;
Through the transformation matrix T_2, the binocular-camera optical coordinate system and the 3D model affine coordinate system are in one-to-one correspondence, so that any point on the 3D model has a unique corresponding point in the binocular-camera optical coordinate system; the conversion formula can be expressed as:

P_{optical} = T_2 P_{3Daffine}   (4)

where P_{optical} represents the spatial point in the binocular-camera world coordinate system and P_{3Daffine} the corresponding point obtained via formula (2). From the relations (2) and (4), the mapping of points in the image coordinate system to the camera optical coordinate system is obtained:

P_{optical} = T_2 T_1^{-1} P_{image} = T_3 P_{image}   (5)
where

T_3 = T_2 T_1^{-1}

is the mapping matrix from the image coordinate system to the camera optical coordinate system.
Specifically, the conversion relationship between the robot coordinate system and the robot affine coordinate system is as follows:
four marking points are selected at the tail end of the robot and used as marking points of the robot, and a transformation matrix T from a robot coordinate system to a robot affine coordinate system can be established4
Figure BDA0002284924140000035
By transforming the matrix T4The robot coordinate system and the robot affine coordinate system establish a one-to-one correspondence relationship, so that any point of the robot has a unique point corresponding to the robot affine coordinate system, and the conversion formula can be expressed as follows:
Figure BDA0002284924140000036
wherein, P in the formulaRobotRepresenting any point, P, reachable in the robot coordinate systemRobot affineThe point obtained after the change of the formula (7).
Specifically, the robot affine coordinate system-binocular camera optical coordinate system conversion relationship is as follows:
A correspondence between the robot affine coordinate system and the binocular-camera optical coordinate system is established, so that every point on the robot can be uniquely mapped into the binocular-camera optical coordinate system. The transformation matrix T_5 is:

T_5 = \begin{bmatrix} x_{cr1}-x_{cr0} & x_{cr2}-x_{cr0} & x_{cr3}-x_{cr0} & x_{cr0} \\ y_{cr1}-y_{cr0} & y_{cr2}-y_{cr0} & y_{cr3}-y_{cr0} & y_{cr0} \\ z_{cr1}-z_{cr0} & z_{cr2}-z_{cr0} & z_{cr3}-z_{cr0} & z_{cr0} \\ 0 & 0 & 0 & 1 \end{bmatrix}   (8)

where (x_{cr0}, y_{cr0}, z_{cr0}), (x_{cr1}, y_{cr1}, z_{cr1}), (x_{cr2}, y_{cr2}, z_{cr2}), and (x_{cr3}, y_{cr3}, z_{cr3}) here are the robot markers' coordinates in the binocular-camera world coordinate system; the conversion formula can be expressed as:

P_{optical} = T_5 P_{robotaffine}   (9)

where P_{optical} represents the spatial point in the binocular-camera world coordinate system and P_{robotaffine} the point obtained via formula (7).
From these relations, the mapping of points from the camera optical coordinate system to the robot coordinate system is obtained:

P_{robot} = T_4 T_5^{-1} P_{optical} = T_6 P_{optical}   (10)

where P_{optical} represents the spatial point in the binocular-camera world coordinate system, P_{robot} represents any reachable point in the robot coordinate system, and

T_6 = T_4 T_5^{-1}

is the mapping matrix from the camera optical coordinate system to the robot coordinate system.
Specifically, the conversion relationship between the image coordinate system and the robot coordinate system is as follows:
and (3) according to a transformation matrix T3 from the image coordinate system to the camera optical coordinate system and a transformation matrix T6 from the optical coordinate system to the robot coordinate system, obtaining the transformation relation from the points on the image coordinate system to the robot coordinate system:
Probot=T6T3PImage of a person=T7PImage of a person(11)
Wherein, PRobotRepresents an arbitrary point, P, that can be reached under the robot coordinate systemImage of a personRepresenting a point in the image coordinate system, T7=T6T3The method is a conversion relation matrix from an image coordinate system to a robot coordinate system, and realizes that the position coordinates of any point on the 3D model are mapped to the robot coordinate system.
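The full chain T_7 = T_6·T_3 can be exercised end to end. This is a self-contained sketch with invented marker values; it assumes the compositions T_3 = T_2·T_1^{-1} and T_6 = T_4·T_5^{-1} implied by the derivation, not any particular implementation from the patent:

```python
import numpy as np

def basis(m0, m1, m2, m3):
    # Homogeneous matrix: columns are the affine basis (Mi - M0), origin M0.
    T = np.eye(4)
    for c, m in enumerate((m1, m2, m3)):
        T[:3, c] = np.subtract(m, m0)
    T[:3, 3] = m0
    return T

def compose_T7(model_img, model_cam, robot_rob, robot_cam):
    """Formula (11): T7 = T6 * T3 maps image-space points into the robot frame,
    with T3 = T2 * T1^{-1} (image -> camera) and T6 = T4 * T5^{-1} (camera -> robot)."""
    T1 = basis(*model_img)   # model-marker affine -> image frame
    T2 = basis(*model_cam)   # model-marker affine -> camera frame
    T4 = basis(*robot_rob)   # robot-marker affine -> robot frame
    T5 = basis(*robot_cam)   # robot-marker affine -> camera frame
    T3 = T2 @ np.linalg.inv(T1)
    T6 = T4 @ np.linalg.inv(T5)
    return T6 @ T3

# Toy data: camera frame = image frame shifted by (1,2,3);
# robot frame = camera frame shifted by (10,0,0).
model_img = [(0, 0, 0), (1, 0, 0), (0, 1, 0), (0, 0, 1)]
model_cam = [(1, 2, 3), (2, 2, 3), (1, 3, 3), (1, 2, 4)]
robot_cam = [(0, 0, 0), (1, 0, 0), (0, 1, 0), (0, 0, 1)]
robot_rob = [(10, 0, 0), (11, 0, 0), (10, 1, 0), (10, 0, 1)]

T7 = compose_T7(model_img, model_cam, robot_rob, robot_cam)
```

With these toy frames a lesion target at (2, 2, 2) in the image space lands at (13, 4, 5) in the robot frame, i.e. shifted by both translations.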
The invention has the following beneficial effects. The invention provides a puncture robot navigation system based on a 5G network. For patients at remote sites, advanced robotic puncture technology over a 5G network provides doctors with a precise needle-insertion route, so that operations can be completed remotely with the resources of large hospitals, greatly shortening operation time and improving the diagnostic success rate for patients. A puncture robot navigation system with a binocular-vision dynamic-compensation function is also designed: the binocular camera identifies the circular marker points on the patient's skin surface and the markers at the end of the surgical robot to obtain their spatial coordinates in the visual coordinate system, maps the spatial coordinate system of the patient's medical image into the spatial coordinate system of the surgical robot, and moves the robot to the initial puncture point. The camera then enters continuous-working mode, tracks the markers, and obtains their motion trail (the movement of the patient's skin) over a certain number of frames; the puncture target point is corrected by the mean of this trail, which reduces errors caused by manual puncture. With the robot's high positioning accuracy, stable operation, safety, and reliability, the system solves the problem of assisted precise positioning in minimally invasive medical puncture surgery. In addition, the manipulator controller program is mapped over 5G for the user's use, and the operation is carried out over 5G, saving resources and shortening operation time.
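The mean-based correction described above — averaging a tracked skin marker's trail over recent frames and shifting the planned target accordingly — can be sketched as follows. This is an illustration of the stated idea only; the function and argument names are assumptions:

```python
import numpy as np

def compensated_target(planned_target, marker_ref, marker_trail):
    """Shift the planned puncture target by the mean displacement of a tracked
    skin marker over the last frames, averaging out breathing/skin motion.

    planned_target: target in the robot frame, planned against marker_ref.
    marker_ref:     marker position at planning time.
    marker_trail:   marker positions observed over the last N frames."""
    trail = np.asarray(marker_trail, dtype=float)
    mean_offset = trail.mean(axis=0) - np.asarray(marker_ref, dtype=float)
    return np.asarray(planned_target, dtype=float) + mean_offset
```

For a marker planned at the origin that oscillates between z = 1 and z = 3 over the tracked frames, the target is shifted by the mean displacement (0, 0, 2) rather than by any single noisy frame.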
Drawings
The invention is further illustrated by the following figures and examples.
FIG. 1 is a computer image coordinate system to robot coordinate system map;
FIG. 2 is a puncture flow diagram;
FIG. 3 is a flow chart of the present invention for puncturing using a 5G network;
FIG. 4 is a robot remote control interface;
FIG. 5 is a coordinate system mapping relationship diagram based on the marked points;
FIG. 6 is a comparison graph of the marking points of the three-dimensional reconstruction model of the 3D model and the CT image;
FIG. 7 is a comparison graph of the marking points of the three-dimensional reconstruction model of the 3D model and the CT image;
fig. 8 is a schematic diagram of a spatial mapping process.
Detailed Description
The present invention will now be described in detail with reference to the accompanying drawings. The drawings are simplified schematics that illustrate the basic structure of the invention, and therefore show only the components related to the invention.
As shown in fig. 1-5, the puncture robot navigation system based on 5G network of the present invention includes a local puncture navigation system and a remote puncture server, where the local puncture navigation system and the remote puncture server communicate via the 5G network; in this embodiment, a site is local, a site is remote, and the robot employs a robot arm.
The method specifically comprises the following steps:
as shown in fig. 2 and 3, firstly, the binocular camera is connected to the mechanical arm controller through the USB data transmission line at the a site, the camera data frame image is mapped into the channel through the 5G network, and the puncture server at the B site can receive the data in real time. The algorithm program of the most main mechanical arm controller is also mapped to the B ground through a 5G channel, and the robot can be remotely controlled through the interface shown in the figure 4. With 5G, application distribution as shown in fig. 3, the puncture server is placed on the B site through 5G network communication, and the patient on the a site can be actually operated by the robot arm.
As shown in fig. 4, joint-space motions of the robot arm are executed through the control area, and the values in the information-prompt area reflect the robot's current spatial pose in real time. This mode of operation is completely detached from the robot controller body, which is more convenient for operators as well as safer.
As shown in fig. 5, the spatial-mapping process for the marker points comprises four steps: 1) locating the lesion target point, the puncture needle-insertion point, and the puncture-path pose in the medical image space; 2) locating the 3D model marker points in the binocular-camera image space; 3) locating the spatial coordinates of the robot in the binocular-camera image space; 4) calculating the positions of the lesion target point, the puncture needle-insertion point, and the puncture-path pose in the robot space.
Before the operation, the puncture site of the patient is first roughly determined, and then N (N ≥ 4) circular marker points that are visible under CT scanning are attached to the body surface near the puncture site, such that no 3 marker points are collinear and the 4 marker points are not coplanar. In this study, a 3D model of a human skeleton is selected for the experiment, with the marker points attached as shown in the figure. After CT or MRI scanning, three-dimensional reconstruction yields clear and accurate imaging of the circular markers on the medical image, as shown in figures 6 and 7, from which the coordinates of the markers in the medical image space are obtained. The shape of the marker points is not limited to circles and can be chosen according to the specific situation and needs.
The binocular camera identifies the image coordinates of the circular marker points in the binocular image plane and accurately deduces the coordinates of the marker centers in the visual-space coordinate system; at the same time, by identifying the circular marker points on the robot body, the binocular camera deduces the robot's coordinates in the visual-space coordinate system. In this way the coordinate positions of the marker points in the image space and in the robot space are determined, with a one-to-one correspondence between them.
The spatial mapping of the auxiliary precise puncture robot system based on the mark points mainly comprises four coordinate systems of a computer image coordinate system { V }, a patient coordinate system { P }, an optical coordinate system { M } and a robot coordinate system { R }, and the spatial mapping relationship is shown in FIG. 8:
In Table 1, (X_w, Y_w, Z_w) denotes the spatial coordinates of a marker point P in the binocular-camera world coordinate system.

Table 1: Model marker points located in the binocular-camera image space [table rendered as an image in the original]
Table 2: Lesion target point, puncture needle-insertion point, and puncture-path pose located in the medical image space [table rendered as an image in the original]
Table 3: Robot marker points located with the binocular camera [table rendered as an image in the original]
(1) Image coordinate system-3D model affine coordinate system
No fewer than 4 marker points are attached to the 3D model to establish a 3D model affine coordinate system: a reference coordinate system is built from 4 markers that are not in the same plane, and one marker M0(x_{m0}, y_{m0}, z_{m0}) is chosen as its origin, with its axes parallel to the three coordinate axes of the CT image coordinate system. The 3D model affine coordinate system is thus translated relative to the CT image coordinate system by x_{m0}, y_{m0}, z_{m0} along the x, y, and z axes, with zero rotation about each axis. The other three markers M1(x_{m1}, y_{m1}, z_{m1}), M2(x_{m2}, y_{m2}, z_{m2}), M3(x_{m3}, y_{m3}, z_{m3}) establish the affine relation, and the mapping matrix from the 3D model affine coordinate system to the image coordinate system can be expressed as follows:

T_1 = \begin{bmatrix} x_{m1}-x_{m0} & x_{m2}-x_{m0} & x_{m3}-x_{m0} & x_{m0} \\ y_{m1}-y_{m0} & y_{m2}-y_{m0} & y_{m3}-y_{m0} & y_{m0} \\ z_{m1}-z_{m0} & z_{m2}-z_{m0} & z_{m3}-z_{m0} & z_{m0} \\ 0 & 0 & 0 & 1 \end{bmatrix}   (1)

Thus the 3D model affine coordinate system and the image coordinate system are in one-to-one correspondence: any point P(x_{mp}, y_{mp}, z_{mp}) in the image coordinate system has a unique corresponding point on the 3D model, and the conversion formula can be expressed as:

P_{3Daffine} = T_1^{-1} P_{image}   (2)
(2)3D model affine coordinate system-binocular camera optical coordinate system
The binocular camera performs optical-positioning calculation to obtain the coordinate positions of the four marker points in the optical coordinate system, and a transformation matrix from the 3D model affine coordinate system to the camera optical coordinate system can likewise be established:

T_2 = \begin{bmatrix} x_{cr1}-x_{cr0} & x_{cr2}-x_{cr0} & x_{cr3}-x_{cr0} & x_{cr0} \\ y_{cr1}-y_{cr0} & y_{cr2}-y_{cr0} & y_{cr3}-y_{cr0} & y_{cr0} \\ z_{cr1}-z_{cr0} & z_{cr2}-z_{cr0} & z_{cr3}-z_{cr0} & z_{cr0} \\ 0 & 0 & 0 & 1 \end{bmatrix}   (3)

Thus the binocular-camera optical coordinate system and the 3D model affine coordinate system establish a one-to-one correspondence, and any point on the 3D model has a unique corresponding point in the binocular-camera optical coordinate system; the conversion formula can be expressed as:

P_{optical} = T_2 P_{3Daffine}   (4)

From the relations (2) and (4), the mapping of points in the image coordinate system to the camera optical coordinate system is obtained:

P_{optical} = T_2 T_1^{-1} P_{image} = T_3 P_{image}   (5)

where T_3 = T_2 T_1^{-1} is the mapping matrix from the image coordinate system to the camera optical coordinate system.
(3) Robot coordinate system-robot affine coordinate system
Following the same conversion approach, four marker points are selected at the end of the robot as the robot's markers, and a transformation matrix from the robot affine coordinate system to the robot coordinate system can be established:

T_4 = \begin{bmatrix} x_{r1}-x_{r0} & x_{r2}-x_{r0} & x_{r3}-x_{r0} & x_{r0} \\ y_{r1}-y_{r0} & y_{r2}-y_{r0} & y_{r3}-y_{r0} & y_{r0} \\ z_{r1}-z_{r0} & z_{r2}-z_{r0} & z_{r3}-z_{r0} & z_{r0} \\ 0 & 0 & 0 & 1 \end{bmatrix}   (6)

Thus the robot coordinate system and the robot affine coordinate system establish a one-to-one correspondence, and any point of the robot has a unique corresponding point in the robot affine coordinate system; the conversion formula can be expressed as:

P_{robotaffine} = T_4^{-1} P_{robot}   (7)
(4) robot affine coordinate system-binocular camera optical coordinate system
Similarly, a correspondence between the robot affine coordinate system and the binocular-camera optical coordinate system can be established, so that every point on the robot is uniquely mapped into the binocular-camera optical coordinate system; the transformation matrix is:

T_5 = \begin{bmatrix} x_{cr1}-x_{cr0} & x_{cr2}-x_{cr0} & x_{cr3}-x_{cr0} & x_{cr0} \\ y_{cr1}-y_{cr0} & y_{cr2}-y_{cr0} & y_{cr3}-y_{cr0} & y_{cr0} \\ z_{cr1}-z_{cr0} & z_{cr2}-z_{cr0} & z_{cr3}-z_{cr0} & z_{cr0} \\ 0 & 0 & 0 & 1 \end{bmatrix}   (8)

The conversion formula can be expressed as:

P_{optical} = T_5 P_{robotaffine}   (9)
From these relations, the mapping of points from the camera optical coordinate system to the robot coordinate system is obtained:

P_{robot} = T_4 T_5^{-1} P_{optical} = T_6 P_{optical}   (10)

where

T_6 = T_4 T_5^{-1}

is the mapping matrix from the camera optical coordinate system to the robot coordinate system.
(5) Image coordinate system-robot coordinate system
In summary, the transformation relation T3 from the image coordinate system to the camera optical coordinate system and the transformation relation T6 from the optical coordinate system to the robot coordinate system have been obtained, so that the transformation relation from the point on the image coordinate system to the robot coordinate system can be found:
P_{robot} = T_6 T_3 P_{image} = T_7 P_{image}   (11)

where T_7 = T_6 T_3 is the transformation relation matrix from the image coordinate system to the robot coordinate system, which maps the position coordinates of any point on the 3D model into the robot coordinate system.
In light of the foregoing description of preferred embodiments in accordance with the invention, it is to be understood that numerous changes and modifications may be made by those skilled in the art without departing from the scope of the invention. The technical scope of the present invention is not limited to the contents of the specification, and must be determined according to the scope of the claims.

Claims (8)

1. A puncture robot navigation system based on a 5G network, characterized in that: it comprises a local puncture navigation system and a remote puncture server, and the local puncture navigation system and the remote puncture server communicate through the 5G network;
the local puncture navigation system uses marker points to establish a spatial mapping among the CT coordinate system, the robot base coordinate system, and the binocular-camera world coordinate system; the CT coordinate system is mapped into the robot base coordinate system via the fixed binocular-camera world coordinate system; after the mapping, the current marker position is sent to the robot, a target-point path is then determined for the doctor and sent over the 5G network to the remote puncture server, through which the doctor controls the robot to complete the operation.
2. The puncture robot navigation system based on 5G network according to claim 1, characterized in that: a binocular camera of the local puncture navigation system is connected to a controller of the robot, the controller maps data frame images of the binocular camera into channels through a 5G network, and a remote puncture server receives data in real time; and the algorithm program of the controller is also mapped to the remote puncture server through the 5G channel.
3. The puncture robot navigation system based on a 5G network according to claim 1, characterized in that the spatial mapping process of the marker points comprises the following steps:
first, the puncture site of the patient is roughly determined, and N (N ≥ 4) marker points that can be imaged under CT scanning are pasted on the body surface near the puncture site, such that no 3 marker points are collinear and no 4 marker points are coplanar;
then, after CT or MRI scanning, the imaging of the marker points on the medical image is obtained through three-dimensional reconstruction, yielding the coordinates of the marker points in the medical image space;
next, the image coordinates of the marker points in the binocular image planes are identified by the binocular camera, and the coordinates of the marker-point centers in the visual space coordinate system are derived; at the same time, the binocular camera derives the coordinates of the robot in the visual space coordinate system by identifying the marker points on the robot body.
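The placement constraints in this claim (no 3 of the reference markers collinear, no 4 coplanar) can be verified numerically: the four points span 3-D space exactly when the three difference vectors from the first point are linearly independent. A minimal NumPy sketch, not part of the patent; the function name and sample coordinates are illustrative:

```python
import numpy as np

def markers_valid(points, tol=1e-9):
    """Check four marker centers: no 3 collinear and not all 4 coplanar.

    If the difference vectors P0P1, P0P2, P0P3 have a nonzero
    determinant they are linearly independent, which simultaneously
    rules out coplanarity and any 3 of the points being collinear.
    """
    p = np.asarray(list(points), dtype=float)
    assert p.shape == (4, 3)
    d = p[1:] - p[0]                      # three 3-D difference vectors
    return abs(np.linalg.det(d)) > tol

print(markers_valid([(0, 0, 0), (10, 0, 0), (0, 10, 0), (0, 0, 10)]))  # valid placement
print(markers_valid([(0, 0, 0), (1, 0, 0), (2, 0, 0), (3, 0, 0)]))     # collinear, invalid
```

The same test doubles as a guard before inverting the affine-basis matrices built in the later claims, since a coplanar marker set makes those matrices singular.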
4. The puncture robot navigation system based on a 5G network according to claim 3, wherein the transformation between the image coordinate system and the 3D model affine coordinate system is established as follows:
at least 4 marker points are pasted on the 3D model and a 3D model affine coordinate system is established: any 4 pasted marker points that do not lie in the same plane form a reference coordinate system, and an arbitrary one of them, M_0, is selected as the origin of the reference coordinate system; the 3D model affine coordinate system is translated and rotated relative to the CT image coordinate system, its translation distances along the x, y and z axes are x_{m0}, y_{m0} and z_{m0} respectively, and its rotation angles about the x, y and z axes are assumed to be 0; the remaining three marker points M_1(x_{m1}, y_{m1}, z_{m1}), M_2(x_{m2}, y_{m2}, z_{m2}) and M_3(x_{m3}, y_{m3}, z_{m3}) establish the affine relation, and the mapping matrix T_1 from the 3D model affine coordinate system to the image coordinate system can be expressed as:

$$T_1 = \begin{bmatrix} x_{m1}-x_{m0} & x_{m2}-x_{m0} & x_{m3}-x_{m0} & x_{m0} \\ y_{m1}-y_{m0} & y_{m2}-y_{m0} & y_{m3}-y_{m0} & y_{m0} \\ z_{m1}-z_{m0} & z_{m2}-z_{m0} & z_{m3}-z_{m0} & z_{m0} \\ 0 & 0 & 0 & 1 \end{bmatrix} \tag{1}$$

through the mapping matrix T_1, the 3D model affine coordinate system and the image coordinate system are placed in one-to-one correspondence, so that any point P(x_{mp}, y_{mp}, z_{mp}) in the image coordinate system has a unique corresponding point on the 3D model; the conversion formula can be expressed as:

$$P_{\text{3D affine}} = T_1^{-1} P_{\text{image}}, \qquad P_{\text{image}} = T_1 P_{\text{3D affine}} \tag{2}$$

wherein P_{3D affine} represents the point obtained from a point in the image coordinate system by formula (2), and P_{image} represents the point in the image coordinate system.
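The basis construction of formulas (1)–(2) can be sketched in NumPy: the columns of T_1 are the three vectors M0M1, M0M2, M0M3 and the origin M_0, and a homogeneous point converts between the image frame and the affine frame by multiplying by T_1 or its inverse. The function name and marker coordinates below are illustrative, not from the patent:

```python
import numpy as np

def affine_basis_matrix(m0, m1, m2, m3):
    """Homogeneous 4x4 matrix in the form of formula (1): columns are
    the basis vectors M0M1, M0M2, M0M3 and the origin M0."""
    m0, m1, m2, m3 = (np.asarray(m, dtype=float) for m in (m0, m1, m2, m3))
    T = np.eye(4)
    T[:3, 0] = m1 - m0
    T[:3, 1] = m2 - m0
    T[:3, 2] = m3 - m0
    T[:3, 3] = m0
    return T

# Marker centers measured in the CT image coordinate system (illustrative values):
T1 = affine_basis_matrix([0, 0, 0], [10, 0, 0], [0, 10, 0], [0, 0, 10])

# Formula (2): convert a homogeneous image-frame point to affine
# coordinates and back.
p_image = np.array([5.0, 2.0, 8.0, 1.0])
p_affine = np.linalg.inv(T1) @ p_image
assert np.allclose(T1 @ p_affine, p_image)  # round trip: P_image = T1 * P_3Daffine
```

Because T_1 is affine rather than rigid, this correspondence holds even when the image axes are scaled anisotropically, which is why the same construction is reused for every frame pair in the following claims.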
5. The puncture robot navigation system based on a 5G network according to claim 4, wherein the transformation between the 3D model affine coordinate system and the binocular camera optical coordinate system is established as follows:
optical positioning is carried out with the binocular camera to solve the coordinate positions of the four marker points in the binocular camera world coordinate system, and the conversion matrix T_2 between the camera optical coordinate system and the 3D model affine coordinate system is established:

$$T_2 = \begin{bmatrix} x_{c1}-x_{c0} & x_{c2}-x_{c0} & x_{c3}-x_{c0} & x_{c0} \\ y_{c1}-y_{c0} & y_{c2}-y_{c0} & y_{c3}-y_{c0} & y_{c0} \\ z_{c1}-z_{c0} & z_{c2}-z_{c0} & z_{c3}-z_{c0} & z_{c0} \\ 0 & 0 & 0 & 1 \end{bmatrix} \tag{3}$$

wherein (x_{c0}, y_{c0}, z_{c0}), (x_{c1}, y_{c1}, z_{c1}), (x_{c2}, y_{c2}, z_{c2}) and (x_{c3}, y_{c3}, z_{c3}) represent the marker-point coordinates in the binocular camera world coordinate system;
through the conversion matrix T_2, the binocular camera optical coordinate system and the 3D model affine coordinate system are placed in one-to-one correspondence, so that any point on the 3D model has a unique corresponding point in the binocular camera optical coordinate system; the conversion formula can be expressed as:

$$P_{\text{3D affine}} = T_2^{-1} P_{\text{optical}}, \qquad P_{\text{optical}} = T_2 P_{\text{3D affine}} \tag{4}$$

wherein P_{3D affine} represents an arbitrary point on the 3D model and P_{optical} is the corresponding point obtained by formula (4);
from the relations of formula (2) and formula (4), the mapping of a point in the image coordinate system to the binocular camera world coordinate system is obtained:

$$P_{\text{optical}} = T_2 T_1^{-1} P_{\text{image}} = T_3 P_{\text{image}} \tag{5}$$

wherein P_{image} represents the coordinates of the point in the image coordinate system, and T_3 = T_2 T_1^{-1} is the mapping matrix from the image coordinate system to the camera optical coordinate system.
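The key idea of this claim is that the same four markers carry the same affine coordinates in both frames, so composing the two basis matrices (here assumed to compose as T_3 = T_2 · T_1⁻¹) carries image-frame points directly into the camera frame. A NumPy sketch in which the camera view is simulated by an assumed rigid motion; all names and coordinates are illustrative:

```python
import numpy as np

def affine_basis_matrix(m0, m1, m2, m3):
    """Homogeneous 4x4 matrix: columns M0M1, M0M2, M0M3 and origin M0."""
    m0, m1, m2, m3 = (np.asarray(m, dtype=float) for m in (m0, m1, m2, m3))
    T = np.eye(4)
    T[:3, 0] = m1 - m0
    T[:3, 1] = m2 - m0
    T[:3, 2] = m3 - m0
    T[:3, 3] = m0
    return T

# Four marker centers in the CT image frame (illustrative):
img = [np.array(p, dtype=float) for p in ([0, 0, 0], [10, 0, 0], [0, 10, 0], [0, 0, 10])]

# The same markers as the camera would see them: simulated here by an
# assumed rigid motion (90 degree rotation about z, then a translation).
R = np.array([[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
t = np.array([5.0, -3.0, 2.0])
cam = [R @ p + t for p in img]

T1 = affine_basis_matrix(*img)   # image frame basis
T2 = affine_basis_matrix(*cam)   # camera frame basis
T3 = T2 @ np.linalg.inv(T1)      # image frame -> camera frame

# Any image-frame point now maps into the camera frame:
p_img = np.array([3.0, 4.0, 1.0])
p_cam = (T3 @ np.append(p_img, 1.0))[:3]
assert np.allclose(p_cam, R @ p_img + t)  # recovers the simulated motion
```

No explicit camera calibration enters the composition: T_3 absorbs whatever affine relation holds between the two frames, which is exactly what the marker-based registration exploits.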
6. The puncture robot navigation system based on a 5G network according to claim 5, wherein the transformation between the robot coordinate system and the robot affine coordinate system is established as follows:
four marker points at the end of the robot are selected as the robot marker points, and the transformation matrix T_4 between the robot coordinate system and the robot affine coordinate system is established:

$$T_4 = \begin{bmatrix} x_{r1}-x_{r0} & x_{r2}-x_{r0} & x_{r3}-x_{r0} & x_{r0} \\ y_{r1}-y_{r0} & y_{r2}-y_{r0} & y_{r3}-y_{r0} & y_{r0} \\ z_{r1}-z_{r0} & z_{r2}-z_{r0} & z_{r3}-z_{r0} & z_{r0} \\ 0 & 0 & 0 & 1 \end{bmatrix} \tag{6}$$

wherein (x_{r0}, y_{r0}, z_{r0}) to (x_{r3}, y_{r3}, z_{r3}) represent the robot marker-point coordinates in the robot coordinate system;
through the transformation matrix T_4, the robot coordinate system and the robot affine coordinate system are placed in one-to-one correspondence, so that any point of the robot has a unique corresponding point in the robot affine coordinate system; the conversion formula can be expressed as:

$$P_{\text{robot affine}} = T_4^{-1} P_{\text{robot}}, \qquad P_{\text{robot}} = T_4 P_{\text{robot affine}} \tag{7}$$

wherein P_{robot} represents any reachable point of the robot in the robot coordinate system, and P_{robot affine} is the point obtained by formula (7).
7. The puncture robot navigation system based on a 5G network according to claim 6, wherein the transformation between the robot affine coordinate system and the binocular camera optical coordinate system is established as follows:
a correspondence is established between the robot affine coordinate system and the binocular camera world coordinate system, so that each point on the robot can be uniquely mapped into the binocular camera optical coordinate system; the transformation matrix T_5 between the coordinate systems is:

$$T_5 = \begin{bmatrix} x_{cr1}-x_{cr0} & x_{cr2}-x_{cr0} & x_{cr3}-x_{cr0} & x_{cr0} \\ y_{cr1}-y_{cr0} & y_{cr2}-y_{cr0} & y_{cr3}-y_{cr0} & y_{cr0} \\ z_{cr1}-z_{cr0} & z_{cr2}-z_{cr0} & z_{cr3}-z_{cr0} & z_{cr0} \\ 0 & 0 & 0 & 1 \end{bmatrix} \tag{8}$$

wherein (x_{cr0}, y_{cr0}, z_{cr0}), (x_{cr1}, y_{cr1}, z_{cr1}), (x_{cr2}, y_{cr2}, z_{cr2}) and (x_{cr3}, y_{cr3}, z_{cr3}) respectively represent the robot marker-point coordinates in the binocular camera world coordinate system;
the conversion formula can be expressed as:

$$P_{\text{robot affine}} = T_5^{-1} P_{\text{optical}}, \qquad P_{\text{optical}} = T_5 P_{\text{robot affine}} \tag{9}$$

wherein P_{optical} represents the spatial point coordinates in the binocular camera world coordinate system, and P_{robot affine} represents the point obtained by formula (9);
from the above relations, the mapping of a point in the camera optical coordinate system to the robot coordinate system is obtained:

$$P_{\text{robot}} = T_4 T_5^{-1} P_{\text{optical}} = T_6 P_{\text{optical}} \tag{10}$$

wherein P_{optical} represents the spatial point coordinates in the binocular camera world coordinate system, P_{robot} represents the coordinates of the point in the robot coordinate system, and T_6 = T_4 T_5^{-1} is the mapping matrix from the camera optical coordinate system to the robot coordinate system.
8. The puncture robot navigation system based on a 5G network according to claim 7, wherein the transformation between the image coordinate system and the robot coordinate system is established as follows:
from the transformation matrix T_3 from the image coordinate system to the camera optical coordinate system and the transformation matrix T_6 from the camera optical coordinate system to the robot coordinate system, the transformation of points from the image coordinate system to the robot coordinate system is obtained:

$$P_{\text{robot}} = T_6 T_3 P_{\text{image}} = T_7 P_{\text{image}} \tag{11}$$

wherein P_{robot} represents the coordinates of the point in the robot coordinate system, P_{image} represents the coordinates of the point in the image coordinate system, and T_7 = T_6 T_3 is the conversion matrix from the image coordinate system to the robot coordinate system, whereby the position coordinates of any point on the 3D model are mapped into the robot coordinate system.
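The claims chain together: T_3 (image to camera) and T_6 (camera to robot) compose into the single matrix T_7 that sends a target picked on the CT image straight into robot coordinates. A numerical sketch in which assumed rigid transforms stand in for the calibrated matrices (all values illustrative):

```python
import numpy as np

def rigid(angle_deg, t):
    """Homogeneous 4x4 rigid transform: rotation about z by angle_deg,
    followed by translation t. Used here only to fabricate example
    stand-ins for the calibrated matrices."""
    a = np.radians(angle_deg)
    T = np.eye(4)
    T[:3, :3] = [[np.cos(a), -np.sin(a), 0.0],
                 [np.sin(a),  np.cos(a), 0.0],
                 [0.0,        0.0,       1.0]]
    T[:3, 3] = t
    return T

# Stand-ins for the calibrated matrices of the claims:
T3 = rigid(90, [5, -3, 2])    # image frame  -> camera frame
T6 = rigid(-90, [1, 1, 0])    # camera frame -> robot frame

T7 = T6 @ T3                  # formula (11): image frame -> robot frame

p_image = np.array([3.0, 4.0, 1.0, 1.0])   # homogeneous target point on the CT image
p_robot = T7 @ p_image

# Composing the two maps step by step gives the same answer:
assert np.allclose(p_robot, T6 @ (T3 @ p_image))
print(np.round(p_robot[:3], 6))
```

Precomputing T_7 once means the remote server only ships a single 4x4 matrix and target coordinates over the 5G link, rather than re-registering the frames for every point.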
CN201911156410.4A 2019-11-22 2019-11-22 Puncture robot navigation system based on 5G network Pending CN110916799A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911156410.4A CN110916799A (en) 2019-11-22 2019-11-22 Puncture robot navigation system based on 5G network


Publications (1)

Publication Number Publication Date
CN110916799A true CN110916799A (en) 2020-03-27

Family

ID=69850726

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911156410.4A Pending CN110916799A (en) 2019-11-22 2019-11-22 Puncture robot navigation system based on 5G network

Country Status (1)

Country Link
CN (1) CN110916799A (en)


Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109864806A (en) * 2018-12-19 2019-06-11 江苏集萃智能制造技术研究所有限公司 The Needle-driven Robot navigation system of dynamic compensation function based on binocular vision
CN110062608A (en) * 2016-11-11 2019-07-26 直观外科手术操作公司 Remote operation surgery systems with the positioning based on scanning
CN110432989A (en) * 2019-06-20 2019-11-12 江苏省人民医院(南京医科大学第一附属医院) 5G remote orthopedic surgery robot combining virtual technology and 3D printing
CN110448378A (en) * 2019-08-13 2019-11-15 北京唯迈医疗设备有限公司 A kind of immersion intervention operation overall-in-one control schema platform


Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111481268A (en) * 2020-04-17 2020-08-04 吉林大学第一医院 Automatic positioning and guiding system for basicranial foramen ovale
CN111481268B (en) * 2020-04-17 2021-06-29 吉林大学第一医院 Automatic positioning and guiding system for basicranial foramen ovale
CN111839741A (en) * 2020-07-02 2020-10-30 Ndr医疗科技有限公司 Control system and method for operating robot
CN111743628A (en) * 2020-07-18 2020-10-09 纽智医疗科技(苏州)有限公司 Automatic puncture mechanical arm path planning method based on computer vision
CN112641512A (en) * 2020-12-08 2021-04-13 北京信息科技大学 Spatial registration method applied to surgical robot front planning
CN112641512B (en) * 2020-12-08 2023-11-10 北京信息科技大学 Spatial registration method applied to preoperative robot planning
CN113598948A (en) * 2021-08-25 2021-11-05 上海导向医疗系统有限公司 Method for accurately positioning navigation target point
CN115869072A (en) * 2023-02-27 2023-03-31 武汉市第四医院 Fracture reduction robot system and control method thereof

Similar Documents

Publication Publication Date Title
CN110916799A (en) Puncture robot navigation system based on 5G network
EP3254621B1 (en) 3d image special calibrator, surgical localizing system and method
CN113288429A (en) Space registration and real-time navigation method of breast minimally invasive interventional operation robot
WO2021217713A1 (en) Surgical navigation system, computer for performing surgical navigation method, and storage medium
US20070018975A1 (en) Methods and systems for mapping a virtual model of an object to the object
US20110306873A1 (en) System for performing highly accurate surgery
CN110101452A (en) A kind of optomagnetic integrated positioning navigation method for surgical operation
CN114711969A (en) Surgical robot system and using method thereof
CN109864806A (en) The Needle-driven Robot navigation system of dynamic compensation function based on binocular vision
CN110215284A (en) A kind of visualization system and method
CN101375805A (en) Method and system for guiding operation of electronic endoscope by auxiliary computer
CN113940755A (en) Surgical operation planning and navigation method integrating operation and image
CN103519895A (en) Orthopedic operation auxiliary guide method
CN103479431A (en) Non-intrusive minimally invasive operation navigation system
CN112043382A (en) Surgical navigation system and use method thereof
CN108056819A (en) A kind of operation track and localization air navigation aid for medical robot
CN115105207A (en) Operation holographic navigation method and system based on mixed reality
CN112190328A (en) Holographic perspective positioning system and positioning method
CN106251284A (en) Medical image registration method based on facing
CN113648061B (en) Head-mounted navigation system based on mixed reality and navigation registration method
CN113456220B (en) Alignment method, surgical robot, and computer storage medium
Gu et al. A calibration-free workflow for image-based mixed reality navigation of total shoulder arthroplasty
CN110584780A (en) Cerebral hemorrhage puncture operation navigation system
CN114022587A (en) Marker sharing method, device, system, apparatus and medium for surgical robot
CN112168197B (en) Positioning method and navigation system for elbow joint external fixation rotating shaft

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20200327