CN115737121A - Surgical robot navigation method, device and system - Google Patents
Surgical robot navigation method, device and system
- Publication number: CN115737121A
- Application number: CN202211252780.XA
- Authority: CN (China)
- Prior art keywords: registration, image, perspective image, surgical, tracer
- Legal status: Pending (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abstract
Embodiments of the invention provide a surgical robot navigation method, device and system, relating to the technical field of data processing. The method comprises the following steps: obtaining a first fluoroscopic image acquired by a fluoroscopic image acquisition device; obtaining first spatial coordinates of a marker in three-dimensional space and first image coordinates of the marker in the first fluoroscopic image; obtaining a first registration relation between the first fluoroscopic image and the three-dimensional space based on the first spatial coordinates and the first image coordinates; during surgery, obtaining a second fluoroscopic image acquired by the fluoroscopic image acquisition device, and obtaining pose information of a tracer mounted on a registration assembly relative to the surgical navigation device at the acquisition moment; and obtaining a second registration relation between the second fluoroscopic image and the surgical navigation device based on the first registration relation and the pose information, and navigating the surgical robot based on the second registration relation. Applying the scheme provided by the embodiments of the invention improves the accuracy of surgical robot navigation during surgery.
Description
Technical Field
The invention relates to the technical field of data processing, in particular to a navigation method, a navigation device and a navigation system for a surgical robot.
Background
As the research and development of medical instruments advances, surgical robots are finding ever wider application. Surgical robots can perform more delicate and precise operations, and are often used for surgical tasks that are difficult for a surgeon to perform directly.
During surgery, after the fluoroscopic image acquisition device acquires a fluoroscopic image of the surgical subject, the surgical robot needs to register the fluoroscopic image with the surgical navigation device, and then navigate based on the registration result so as to move smoothly to the subject's lesion area. In the prior art, registration is generally realized by means of a registration plate carrying active light-emitting points, which is arranged within the field of view of the fluoroscopic image acquisition device. The surgical robot first obtains the positions of the active light-emitting points in the fluoroscopic image acquired by the device, then obtains the pose information of the registration plate determined by the surgical navigation device through recognizing the light-emitting points, and performs registration from these positions and the pose information.
However, in an actual surgical procedure the entire fluoroscopic image acquisition device is wrapped in a sterile cover, so the registration plate placed within its field of view is covered as well. The sterile cover can interfere with the surgical navigation device's recognition of the light-emitting points on the registration plate, reducing the accuracy of the pose information it obtains, and in turn the accuracy with which the surgical robot registers the surgical navigation device to the fluoroscopic image based on that pose information. As a result, the accuracy of navigation based on the registration result is low.
Disclosure of Invention
The embodiment of the invention aims to provide a surgical robot navigation method, device and system so as to improve the accuracy of the surgical robot navigation in the surgical process. The specific technical scheme is as follows:
in a first aspect, an embodiment of the present invention provides a surgical robot navigation method, where the method includes:
obtaining a first fluoroscopic image acquired by a fluoroscopic image acquisition device, wherein a registration assembly is arranged within the field of view of the fluoroscopic image acquisition device, and a marker that can be imaged in the fluoroscopic image is arranged on the registration assembly;
obtaining first spatial coordinates of the marker in three-dimensional space and first image coordinates of the marker in the first fluoroscopic image;
obtaining a first registration relation between the first perspective image and the three-dimensional space based on the first space coordinate and the first image coordinate;
during surgery, obtaining a second fluoroscopic image acquired by the fluoroscopic image acquisition device, and obtaining pose information of a tracer mounted on the registration assembly relative to the surgical navigation device at the acquisition moment, wherein the acquisition moment is the moment at which the fluoroscopic image acquisition device acquires the second fluoroscopic image;
and acquiring a second registration relation between the second perspective image and the surgical navigation equipment based on the first registration relation and the pose information, and navigating the surgical robot based on the second registration relation.
In one embodiment of the invention, the tracer comprises a reflective ball capable of being identified by the surgical navigation equipment, and the tracer is mounted on the registration assembly and positioned within the field of view of the surgical navigation equipment;
the obtaining pose information of a tracer mounted on the registration assembly relative to a surgical navigation device at an acquisition time comprises:
acquiring a pose image acquired by surgical navigation equipment;
determining second image coordinates of the light reflecting ball in the pose image;
obtaining, based on second spatial coordinates of the reflective ball and the second image coordinates, pose information of the tracer relative to the surgical navigation device at the acquisition moment, wherein the second spatial coordinates are the coordinates of the reflective ball in the three-dimensional space.
In an embodiment of the present invention, the obtaining a second registration relationship between the second perspective image and the surgical navigation device based on the first registration relationship and the pose information includes:
obtaining a first matrix representing the first registration relation and a second matrix representing the pose information;
and calculating the product of the first matrix and the second matrix to obtain a second registration relation between the second perspective image and the surgical navigation equipment.
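When both relations are expressed as 4x4 homogeneous transformation matrices, the computation above reduces to a single matrix product. The following is a minimal Python sketch; the matrix values are illustrative only, not taken from the patent:

```python
import numpy as np

def compose_registration(T_image_to_space: np.ndarray,
                         T_space_to_nav: np.ndarray) -> np.ndarray:
    """Chain the first registration relation (image <-> 3-D space) with the
    tracer pose (3-D space <-> navigation device) into the second
    registration relation (image <-> navigation device)."""
    assert T_image_to_space.shape == (4, 4) and T_space_to_nav.shape == (4, 4)
    return T_space_to_nav @ T_image_to_space

# Toy example: the first registration is a pure 10 mm translation along x,
# the tracer pose a 90-degree rotation about z.
T1 = np.eye(4)
T1[:3, 3] = [10.0, 0.0, 0.0]
T2 = np.eye(4)
T2[:3, :3] = np.array([[0.0, -1.0, 0.0],
                       [1.0,  0.0, 0.0],
                       [0.0,  0.0, 1.0]])
T_second = compose_registration(T1, T2)
```

Whether the product is taken as T2 @ T1 or T1 @ T2 depends on the convention chosen for the two matrices; the order above assumes both map column vectors from the image frame toward the navigation frame.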
In one embodiment of the invention, a plurality of markers are disposed on the registration assembly;
the obtaining first spatial coordinates of the marker in three-dimensional space and first image coordinates of the marker in the first fluoroscopic image includes:
obtaining first spatial coordinates of each marker in three-dimensional space;
based on the distribution information of the markers in the registration assembly, determining, for the first spatial coordinates of each marker, the corresponding first image coordinates in the first fluoroscopic image.
In one embodiment of the invention, the registration component is detachably arranged at the front side of the light receiver of the perspective image acquisition device; or
The registration assembly comprises a main body plate and markers, the main body plate is of a double-layer plate-shaped structure, and the markers are embedded in each plate surface of the main body plate respectively; or
The tracer is detachably mounted at a preset position of the registration assembly.
In a second aspect, an embodiment of the present invention provides a surgical robot navigation device, including:
the system comprises a first perspective image acquisition module, a first image acquisition module and a second perspective image acquisition module, wherein the first perspective image acquisition module is used for acquiring a first perspective image acquired by a perspective image acquisition device, a registration component is arranged in a field range of the perspective image acquisition device, and a marker capable of being developed in the perspective image is arranged on the registration component;
a marker coordinate obtaining module for obtaining a first space coordinate of the marker in a three-dimensional space and a first image coordinate of the marker in the first perspective image;
a registration relation obtaining module, configured to obtain a first registration relation between the first perspective image and the three-dimensional space based on the first space coordinate and the first image coordinate;
a pose information obtaining module, configured to, in an operation process, obtain a second perspective image acquired by the perspective image acquisition device, and obtain pose information of a tracer installed on the registration assembly with respect to an operation navigation device at an acquisition time, where the acquisition time is: the moment when the fluoroscopic image acquisition apparatus acquires the second fluoroscopic image;
and the navigation module is used for acquiring a second registration relation between the second perspective image and the surgical navigation equipment based on the first registration relation and the pose information, and navigating the surgical robot based on the second registration relation.
In one embodiment of the invention, the tracer comprises a light-reflecting ball capable of being identified by the surgical navigation equipment, and the tracer is mounted on the registration assembly and positioned in the field of view of the surgical navigation equipment;
the pose information acquisition module is specifically used for acquiring a second perspective image acquired by the perspective image acquisition equipment and acquiring a pose image acquired by the surgical navigation equipment in the surgical process; determining second image coordinates of the reflective ball in the pose image; based on the second space coordinate of the reflective ball and the second image coordinate, obtaining pose information of the tracer relative to the surgical navigation equipment at the acquisition time, wherein the second space coordinate is as follows: coordinates of the light-reflecting sphere in the three-dimensional space.
In an embodiment of the present invention, the navigation module is specifically configured to obtain a first matrix representing the first registration relationship and a second matrix representing the pose information; and calculating the product of the first matrix and the second matrix to obtain a second registration relation between the second perspective image and the surgical navigation equipment, and navigating the surgical robot based on the second registration relation.
In one embodiment of the invention, a plurality of markers are disposed on the registration assembly;
the marker coordinate obtaining module is specifically used for obtaining a first space coordinate of each marker in a three-dimensional space; based on the distribution information of the respective markers in the registration component, corresponding first image coordinates of the first spatial coordinates of the respective markers in the first fluoroscopic image are determined.
In one embodiment of the invention, the registration component is detachably arranged on the front side of the light receiver of the perspective image acquisition device; or
The registration assembly comprises a main body plate and markers, the main body plate is of a double-layer plate-shaped structure, and the markers are embedded in each plate surface of the main body plate respectively; or
The tracer is detachably arranged at a preset position of the registration assembly.
In a third aspect, an embodiment of the present invention provides a surgical robot navigation system, including: the system comprises a surgical robot, a perspective image acquisition device, a surgical navigation device, a registration assembly and a tracer;
the registration assembly comprises a main body plate and markers, the main body plate is of a double-layer plate-shaped structure, the markers are embedded in each plate surface of the main body plate respectively, and the registration assembly is detachably mounted on the front side of a light receiver of the perspective image acquisition equipment;
the tracer comprises a light reflecting ball which can be identified by the surgical navigation equipment, and the tracer is detachably arranged at a preset position of the registration assembly and is positioned in the field of view of the surgical navigation equipment;
the perspective image acquisition equipment is used for acquiring a perspective image and sending the acquired perspective image to the surgical robot;
the surgical navigation equipment is used for acquiring a pose image and sending the acquired pose image to the surgical robot;
the surgical robot is configured to perform the surgical robot navigation method according to the first aspect.
In a fourth aspect, an embodiment of the present invention provides a surgical robot, including a processor, a communication interface, a memory and a communication bus, wherein the processor, the communication interface and the memory communicate with one another via the communication bus;
a memory for storing a computer program;
and a processor for implementing the surgical robot navigation method of the first aspect when executing the program stored in the memory.
In a fifth aspect, the embodiments of the present invention provide a computer-readable storage medium, in which a computer program is stored, and the computer program, when executed by a processor, implements the surgical robot navigation method according to the first aspect.
In a sixth aspect, embodiments of the present invention provide a computer program product containing instructions which, when run on a computer, cause the computer to perform the surgical robot navigation method of the first aspect.
As can be seen from the above, when surgical navigation is performed with the scheme provided by the embodiments of the invention, the first registration relation between the first fluoroscopic image and the three-dimensional space is obtained based on the first spatial coordinates of the markers of the registration assembly in the three-dimensional space and their first image coordinates in the fluoroscopic image. When the fluoroscopic image acquisition device acquires a second fluoroscopic image during surgery, the pose information of the tracer mounted on the registration assembly relative to the surgical navigation device at the moment of acquisition can be obtained, and from this pose information together with the first registration relation, the second registration relation between the second fluoroscopic image and the surgical navigation device follows. The lesion can then be located based on the second registration relation, and the surgical robot navigated successfully.
In addition, the first image coordinates and the pose information are obtained independently of each other; that is, obtaining the pose information does not depend on the marker balls arranged on the registration assembly. Even if the registration assembly is wrapped in a sterile cover during surgery, the pose information can still be obtained by recognizing a tracer arranged outside the sterile cover, so the recognition is not disturbed by the cover. This improves the accuracy of the pose information, and thus the accuracy with which the surgical robot registers the surgical navigation device to the fluoroscopic image and navigates according to the registration result. The system is therefore applicable to a wider range of surgical procedures and operates more reliably.
Of course, not all of the advantages described above need to be achieved at the same time in the practice of any one product or method of the invention.
Drawings
In order to illustrate the embodiments of the present invention or the technical solutions in the prior art more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present invention, and those skilled in the art can obtain other embodiments from these drawings.
Fig. 1 is a schematic flow chart of a first surgical robot navigation method according to an embodiment of the present invention;
fig. 2 is a schematic diagram of an arrangement of a registration assembly according to an embodiment of the present invention;
fig. 3 is a schematic diagram of a positioning manner of the registration assembly after the marker is removed according to an embodiment of the present invention;
FIG. 4 is a schematic diagram of a tracer installation provided in an embodiment of the invention;
fig. 5 is a flowchart illustrating a second surgical robot navigation method according to an embodiment of the present invention;
fig. 6 is a schematic structural diagram of a surgical robot navigation device according to an embodiment of the present invention;
fig. 7 is a schematic structural diagram of a surgical robot navigation system according to an embodiment of the present invention;
fig. 8 is a schematic structural diagram of a surgical robot according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived from the embodiments given herein by one of ordinary skill in the art, are within the scope of the invention.
In order to improve the accuracy of the surgical robot in navigation in the surgical process, the embodiment of the invention provides a surgical robot navigation method, a surgical robot navigation device and a surgical robot navigation system.
First, an execution subject of the surgical robot navigation method according to the embodiment of the present invention will be described.
The surgical robot navigation method provided by the embodiment of the invention may be executed by a surgical robot, which has data processing and communication functions.
The following describes the surgical robot navigation method provided in the embodiment of the present invention in detail.
Referring to fig. 1, a flowchart of a first surgical robot navigation method according to an embodiment of the present invention is shown, where the method includes the following steps S101 to S105.
Step S101: a first fluoroscopic image acquired by a fluoroscopic image acquisition apparatus is obtained.
A registration assembly is arranged within the field of view of the fluoroscopic image acquisition device, and a marker that can be imaged in the fluoroscopic image is arranged on the registration assembly.
The fluoroscopic image acquiring apparatus may be any type of apparatus capable of acquiring a fluoroscopic image, and may be, for example, an X-ray machine, a CT (Computed Tomography) machine, a nuclear magnetic resonance apparatus, or the like. The X-ray machine may be a C-arm X-ray machine.
Accordingly, the type of the first fluoroscopic image is determined by the type of the fluoroscopic image acquiring apparatus, and may be, for example, an X-ray image, a CT image, an MR (Magnetic Resonance) image, or the like.
Because the registration assembly is arranged within the field of view of the fluoroscopic image acquisition device, the first fluoroscopic image acquired by the device contains the area where the registration assembly is located; and because the registration assembly carries markers that can be imaged in the fluoroscopic image, the first fluoroscopic image includes the images of those markers.
The above-described registration assembly is explained below.
Specifically, the registration assembly may include a main body plate and markers, the main body plate having a double-layer plate structure with markers embedded in each plate surface. The two plate layers may or may not be parallel to each other.
The invention does not limit the specific shape and material of the marker. For example, the marker may have any shape whose geometric center is easy to determine, such as a sphere, and may be made of any material that images in the first fluoroscopic image, such as steel.
Because the markers are embedded in the plate surfaces of the main body plate, they can be conveniently detached when the marker balls are no longer needed, so that their images no longer interfere with the images subsequently captured by the fluoroscopic image acquisition device.
In an embodiment of the present invention, the material of the main body plate and other components in the registration assembly may be a material that is not developed in the first perspective image or is difficult to develop in the first perspective image.
In this way, the main body plate and the other components of the registration assembly image faintly or not at all in the first fluoroscopic image, which avoids or reduces overlap between their images and those of the markers, and facilitates the subsequent identification of the markers' first image coordinates from their images in the first fluoroscopic image.
The above-mentioned registration component can be disposed within the field of view of the fluoroscopic image acquiring apparatus in various ways, and the detailed description is given in the following embodiments, which will not be detailed here.
Step S102: first spatial coordinates of the marker in three-dimensional space and first image coordinates of the marker in the first fluoroscopic image are obtained.
The first spatial coordinates of the marker in the three-dimensional space can be measured and stored in advance, and read directly when this step is executed. Specific ways of obtaining the first spatial coordinates are described below.
In one embodiment, the first spatial coordinates may be obtained based on a 3D depth sensor. For example, the distance between the marker in the registration assembly and the 3D depth sensor is measured by the 3D depth sensor, and the first space coordinate of the marker in the real three-dimensional space can be calculated by the space coordinate of the 3D depth sensor in the three-dimensional space and the pose information of the 3D depth sensor in the three-dimensional space which are obtained in advance.
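Assuming the sensor's rotation and translation in the three-dimensional space are known, the calculation described above is a single rigid-body transform of the measured point. A minimal sketch with illustrative values:

```python
import numpy as np

def sensor_point_to_world(p_sensor, R_world_sensor, t_world_sensor):
    """Map a point measured in the depth-sensor frame into the shared
    three-dimensional (world) coordinate system, given the sensor's
    pre-calibrated pose (rotation R and translation t)."""
    return R_world_sensor @ np.asarray(p_sensor, dtype=float) + t_world_sensor

# Illustrative pose: the sensor sits 100 mm along the world x-axis with its
# axes aligned to the world axes; the marker is 250 mm in front of it.
R = np.eye(3)
t = np.array([100.0, 0.0, 0.0])
marker_world = sensor_point_to_world([0.0, 0.0, 250.0], R, t)
```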
In this case, the three-dimensional space is a real space where the 3D depth sensor is located, and the first spatial coordinate is a spatial coordinate in a spatial coordinate system established based on the real space where the 3D depth sensor is located.
In another embodiment, the first spatial coordinates may be obtained based on a binocular camera. For example, images containing the markers are acquired by using a binocular camera, pixel coordinates of the markers in the images are determined, and first space coordinates of the markers in a three-dimensional space can be calculated according to a binocular vision distance measuring principle based on the pixel coordinates, internal reference and external reference of the binocular camera calibrated in advance and the distance between the binocular cameras.
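For the common special case of a rectified stereo pair (matching points lie on the same image row), the binocular ranging principle reduces to a disparity calculation. A minimal sketch, with assumed intrinsics that are illustrative only:

```python
import numpy as np

def triangulate_rectified(u_left, v_left, u_right, f, cx, cy, baseline):
    """Binocular ranging for a rectified pair: disparity d = u_left - u_right,
    depth Z = f * baseline / d, then back-projection to (X, Y, Z)."""
    d = u_left - u_right
    Z = f * baseline / d
    X = (u_left - cx) * Z / f
    Y = (v_left - cy) * Z / f
    return np.array([X, Y, Z])

# Illustrative camera: focal length 800 px, principal point (320, 240),
# 60 mm baseline; the marker appears 20 px further left in the right image.
p = triangulate_rectified(u_left=400.0, v_left=240.0, u_right=380.0,
                          f=800.0, cx=320.0, cy=240.0, baseline=60.0)
```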
In this case, the real three-dimensional space is a real space where the binocular camera is located, and the first spatial coordinate is a spatial coordinate in a spatial coordinate system established based on the real space where the binocular camera is located.
The manner of obtaining the first image coordinates will be described below.
Specifically, a feature point developed by the marker in the registration component may be extracted from the first fluoroscopic image, a pixel coordinate of the extracted feature point in the first fluoroscopic image may be obtained, and the pixel coordinate may be used as a first image coordinate of the marker in the first fluoroscopic image.
For example, the marker is spherical, the center of a circle developed by the marker in the first perspective image can be extracted as a feature point, and the pixel coordinate of the center of the circle in the perspective image is acquired as the first image coordinate; for another example, the marker is a cube, and a geometric center of the marker in the first fluoroscopic image may be extracted as a feature point, and a pixel coordinate of the geometric center in the fluoroscopic image may be acquired as the first image coordinate.
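For a roughly spherical marker, one simple way to realize this feature-point extraction is an intensity-weighted centroid over thresholded pixels. A minimal sketch on a synthetic image patch (a real pipeline would of course operate on the fluoroscopic image itself):

```python
import numpy as np

def marker_image_coordinates(image, threshold):
    """Return (u, v): the intensity-weighted centroid of all pixels brighter
    than `threshold`, used as the marker's first image coordinates."""
    ys, xs = np.nonzero(image > threshold)
    w = image[ys, xs].astype(float)
    return np.array([(xs * w).sum() / w.sum(), (ys * w).sum() / w.sum()])

# Synthetic patch: a uniform 3x3 bright blob whose centre is at (u=5, v=4).
img = np.zeros((10, 10))
img[3:6, 4:7] = 1.0
uv = marker_image_coordinates(img, threshold=0.5)
```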
Step S103: and obtaining a first registration relation between the first perspective image and the three-dimensional space based on the first space coordinate and the first image coordinate.
In this step, based on the first spatial coordinate of the marker and the first image coordinate, a first registration relationship between the first fluoroscopic image and the three-dimensional space may be obtained.
Specifically, after the first spatial coordinates and the first image coordinates are obtained, the first registration relation may be computed from the first spatial coordinates, the first image coordinates and the focal-screen distance of the fluoroscopic image acquisition device through the following formula:

$$
z_c \begin{bmatrix} u \\ v \\ 1 \end{bmatrix}
= \begin{bmatrix} f_x/s & 0 & c_x \\ 0 & f_y/s & c_y \\ 0 & 0 & 1 \end{bmatrix}
\begin{bmatrix} r_{11} & r_{12} & r_{13} & t_1 \\ r_{21} & r_{22} & r_{23} & t_2 \\ r_{31} & r_{32} & r_{33} & t_3 \end{bmatrix}
\begin{bmatrix} X \\ Y \\ Z \\ 1 \end{bmatrix}
$$

In the formula, $s$ denotes the pixel spacing of the first fluoroscopic image; $(u, v)$ denotes the first image coordinates of the marker; $f_x$ and $f_y$ are equal and represent the focal-screen distance of the fluoroscopic image acquisition device; $(c_x, c_y)$ represents the point at which the central ray of the X-ray source meets the first fluoroscopic image; $(X, Y, Z)$ represents the first spatial coordinates of the marker; and $z_c$ is a projective scale factor.

The focal-screen distance of the fluoroscopic image acquisition device may be provided by its manufacturer, or obtained by calibrating the device with a focal-screen distance calibration method; the pixel spacing and the central-ray incidence point are both set in advance.

In the formula, the matrix formed by $r_{11}$ to $r_{33}$ is a rotation matrix, and $t_1$ to $t_3$ form a translation vector; together, the matrix they form may be called the rigid-body transformation matrix, or the registration matrix.

It can be seen that all parameters in the formula other than the registration matrix are known, so the registration matrix can be obtained by substituting the known parameters into the formula and solving. The registration matrix represents the coordinate transformation between the image coordinate system of the first fluoroscopic image and the spatial coordinate system of the three-dimensional space, that is, the first registration relation between the first fluoroscopic image and the three-dimensional space.
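One standard way to carry out such a solve is the Direct Linear Transform (DLT), which recovers the full 3x4 projection matrix (the intrinsic parameters folded together with the registration matrix) from six or more marker correspondences. The sketch below runs on synthetic data; the camera parameters are illustrative, not taken from the patent:

```python
import numpy as np

def solve_projection_dlt(pts3d, pts2d):
    """DLT: estimate the 3x4 projection matrix P from >= 6 correspondences
    (X, Y, Z) <-> (u, v) by taking the null vector of the stacked system."""
    rows = []
    for (X, Y, Z), (u, v) in zip(pts3d, pts2d):
        rows.append([X, Y, Z, 1, 0, 0, 0, 0, -u * X, -u * Y, -u * Z, -u])
        rows.append([0, 0, 0, 0, X, Y, Z, 1, -v * X, -v * Y, -v * Z, -v])
    _, _, Vt = np.linalg.svd(np.asarray(rows, dtype=float))
    return Vt[-1].reshape(3, 4)  # defined up to scale

def project(P, p3d):
    x = P @ np.append(np.asarray(p3d, dtype=float), 1.0)
    return x[:2] / x[2]

# Synthetic check: project known 3-D marker positions with an assumed
# camera, then recover the projection matrix from the correspondences alone.
K = np.array([[800.0, 0.0, 320.0], [0.0, 800.0, 240.0], [0.0, 0.0, 1.0]])
Rt = np.hstack([np.eye(3), np.array([[0.1], [-0.2], [0.3]])])
P_true = K @ Rt
pts3d = [[0, 0, 4], [1, 0, 5], [0, 1, 6],
         [1, 1, 4.5], [-1, 0.5, 5.5], [0.5, -1, 6.5]]
pts2d = [project(P_true, p) for p in pts3d]
P_est = solve_projection_dlt(pts3d, pts2d)
```

With the intrinsic matrix known, the registration matrix itself follows (up to scale) by left-multiplying the recovered projection matrix with the inverse of the intrinsic matrix.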
It should be noted that, the steps S101 to S103 may be performed in a preparation stage before an operation, and after the first registration relationship is obtained, the subsequent steps S104 and S105 may be performed during the operation based on the obtained first registration relationship.
Step S104: in the operation process, a second perspective image acquired by the perspective image acquisition equipment is obtained, and the pose information of the tracer arranged on the registration assembly relative to the operation navigation equipment at the acquisition moment is obtained.
Wherein, the above-mentioned collection moment is: the moment when the fluoroscopic image acquisition apparatus acquires the second fluoroscopic image.
The second fluoroscopic image is an image of the patient's lesion acquired by the fluoroscopic image acquisition device during surgery. Since the registration assembly is also within the field of view of the device, the second fluoroscopic image includes both the image of the patient's lesion area and the images of the markers in the registration assembly.
In one embodiment of the invention, after steps S101 to S103 have been completed and the first registration relation obtained, the markers in the registration assembly may be removed, so that the second fluoroscopic image acquired during surgery contains no marker images. This prevents the marker images from overlapping the image of the lesion area in the second fluoroscopic image and interfering with the surgical robot's or the surgeon's recognition of the lesion area. Of course, the main body plate with the embedded markers may also be removed from the registration assembly directly, as described in further detail in the embodiments below.
The surgical navigation device may be a binocular navigation device, and in this case, the surgical navigation device includes two image capturing devices, which may be a visible light image capturing device or an infrared image capturing device.
The tracer includes a plurality of positioning markers, which may be active light emitting dots, reflective spheres, etc., and the tracer may be mounted on the registration assembly in a variety of ways, as described in more detail in the following examples, which are not detailed herein.
Because the tracer is mounted on the registration assembly, the pose information of the tracer relative to the surgical navigation device at the acquisition time can also be understood as the pose information of the registration assembly relative to the surgical navigation device at the acquisition time.
The above pose information can be understood as the coordinate transformation relation between the spatial coordinate system corresponding to the three-dimensional space and the spatial coordinate system established in the real space where the surgical navigation device is located.
Specifically, the tracer may be disposed within a field of view of the surgical navigation device, and then the pose information may be obtained in the following manner.
In one embodiment, the relative position relationship between the positioning markers in the tracer may be first determined, then the pose image collected by the surgical navigation device and including the positioning markers is obtained, the second image coordinates of the positioning markers in the image are determined, and then the pose information of the tracer relative to the surgical navigation device is obtained according to the second image coordinates and the relative position relationship.
For example, from the relative positional relationship between the positioning markers in the tracer and the second image coordinates of those markers in the image captured by the surgical navigation device, the rotation angle and translation distance of the plane containing the positioning markers relative to the imaging plane of the surgical navigation device can be calculated; this rotation and translation information serves as the pose information of the tracer relative to the surgical navigation device.
In another embodiment, the second spatial coordinates of each positioning marker in the tracer may be obtained, then the second image coordinates of each positioning marker in the pose image acquired by the surgical navigation device may be obtained, and the pose information of the tracer relative to the surgical navigation device may be computed from the second image coordinates and the second spatial coordinates. This is described in detail in steps S504 to S506 of the embodiment shown in fig. 5 and is not repeated here.
Step S105: and acquiring a second registration relation between the second perspective image and the surgical navigation equipment based on the first registration relation and the pose information, and navigating the surgical robot based on the second registration relation.
As can be seen from the foregoing embodiment, the first registration relationship represents the coordinate transformation between the image coordinate system of the first perspective image and the spatial coordinate system of the three-dimensional space, while the pose information represents the coordinate transformation between that spatial coordinate system and the coordinate system established based on the real space where the surgical navigation device is located. From the first registration relationship and the pose information, therefore, the coordinate transformation between the image coordinate system of the first perspective image and the coordinate system of the surgical navigation device can be obtained; this is the second registration relationship between the second perspective image and the surgical navigation device.
Specifically, a first matrix representing the first registration relationship and a second matrix representing pose information can be obtained, and the product of the first matrix and the second matrix is calculated to obtain a second registration relationship between the second perspective image and the surgical navigation device.
For example, if the first matrix is denoted as T1 and the second matrix is denoted as T2, the second registration relationship is T1 × T2.
In this way, by calculating the product of the matrices representing the two coordinate transformation relations, the coordinate transformation relation between the image coordinate system corresponding to the first perspective image and the coordinate system established based on the real space where the surgical navigation device is located, that is, the second registration relation between the second perspective image and the surgical navigation device, can be accurately obtained.
After the second registration relationship is obtained, the target image coordinate of the lesion in the second perspective image can be obtained, and applying the second registration relationship to this target image coordinate yields the target spatial coordinate of the lesion in the coordinate system established based on the surgical navigation device. The surgical robot can then move to the lesion position in three-dimensional space based on the target spatial coordinate, thereby realizing navigation of the surgical robot.
For example, if the first matrix is T1 and the second matrix is T2, the second registration relationship is T1 × T2; if the image coordinate of the lesion region in the second perspective image is P1, then the spatial coordinate P of the lesion region in three-dimensional space is P = T1 × T2 × P1. The surgical robot can therefore move to the spatial position P, that is, to the position of the lesion in three-dimensional space, thereby realizing navigation of the surgical robot.
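The composition T1 × T2 and its application to a lesion coordinate can be sketched with homogeneous 4×4 matrices. All numeric values below are illustrative stand-ins, not from the patent:

```python
import numpy as np

def rot_z(deg):
    """Homogeneous 4x4 rotation about the z axis."""
    a = np.radians(deg)
    T = np.eye(4)
    T[:2, :2] = [[np.cos(a), -np.sin(a)], [np.sin(a), np.cos(a)]]
    return T

def translate(x, y, z):
    """Homogeneous 4x4 translation."""
    T = np.eye(4)
    T[:3, 3] = [x, y, z]
    return T

# Hypothetical first registration (T1) and tracer pose (T2),
# each a rigid transform.
T1 = translate(10.0, 0.0, 5.0) @ rot_z(30)
T2 = translate(0.0, -20.0, 0.0) @ rot_z(-10)

# Second registration relationship, as in the patent: T1 * T2.
T_reg = T1 @ T2

# Lesion coordinate from the second perspective image, homogeneous.
P1 = np.array([12.0, 7.5, 0.0, 1.0])

# Target spatial coordinate of the lesion: T1 * T2 * P1.
P = T_reg @ P1
```

The sketch only demonstrates the mechanics of chaining the two transforms; in practice T1 comes from the registration step and T2 from the tracer pose estimation.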
As can be seen from the above, when surgical navigation is performed with the scheme provided by the embodiment of the present invention, the first registration relationship between the first perspective image and the three-dimensional space is obtained from the first spatial coordinates of the markers in the registration assembly in three-dimensional space and their first image coordinates in the perspective image. When the perspective image acquisition device acquires the second perspective image during surgery, the pose information of the tracer mounted on the registration assembly relative to the surgical navigation device at the acquisition time can be obtained; from this pose information and the first registration relationship, the second registration relationship between the second perspective image and the surgical navigation device can be obtained. The lesion can then be located based on the second registration relationship, and the surgical robot can be navigated successfully.
In addition, the first image coordinates and the pose information are acquired independently of each other; that is, acquisition of the pose information does not depend on the marker balls arranged on the registration assembly. Even if the registration assembly is wrapped in a sterile cover during the operation, the pose information can still be acquired by recognizing the tracer mounted outside the sterile cover, and this recognition is not disturbed by the sterile cover. This improves the accuracy of the pose information obtained by the surgical robot from the recognition result and the accuracy of the registration between the surgical navigation device and the perspective image based on that pose information, which in turn improves the accuracy with which the surgical robot navigates according to the registration result. The system is thus applicable to a wider range of surgical procedures, and its operation is more reliable.
The following describes the arrangement of the aforementioned registration component within the field of view of the fluoroscopic image acquisition apparatus.
In one embodiment, the registration assembly may be fixed within the field of view of the fluoroscopic image acquisition apparatus by using a mechanical arm, a bracket, or other fixed support.
In another embodiment, the registration assembly is detachably mounted to the front side of the light receiver of the fluoroscopic image acquiring apparatus.
With the registration assembly detachably mounted on the front side of the light receiver of the perspective image acquisition device, X-rays emitted by the X-ray source of the perspective image acquisition device pass through the registration assembly before entering the light receiver, so the perspective image acquired by the device contains the registration assembly region; that is, the registration assembly is located within the field of view of the perspective image acquisition device.
In particular, the registration assembly may include a buckle that can be fastened to the front side of the light receiver of the fluoroscopic image capturing apparatus, so that the registration assembly can be detachably fixed to the front side of the light receiver by the buckle.
For ease of understanding, the above-described registration assembly arrangement is described below in conjunction with fig. 2.
Referring to fig. 2, a schematic diagram of an arrangement manner of a registration assembly according to an embodiment of the present invention is provided.
As can be seen from fig. 2, the registration assembly includes a main body plate with a double-layer plate structure and a buckle; markers are embedded in each plate surface of the main body plate, and the registration assembly can be fixed to the front side of the light receiver by the buckle so that it lies within the field of view of the perspective image acquisition device. In this way, the registration assembly does not need to be supported by fixtures such as a bracket or a mechanical arm, which simplifies the operation of placing the registration assembly within the field of view of the perspective image acquisition device.
In addition, when the registration assembly is placed within the field of view of the perspective image acquisition device using fixtures such as a bracket or a mechanical arm, those fixtures may obstruct the surgical robot or the surgeon during the operation. When the registration assembly is instead fixed to the front side of the light receiver by the buckle, this situation is effectively avoided, which facilitates smooth execution of the operation.
One way of arranging the registration assembly within the field of view of the fluoroscopic image acquisition apparatus when the marker is removed is described below.
In one embodiment, the body plate with the marker embedded therein may be removed from the registration assembly, and then the registration assembly may be mounted to the front side of the light receiver of the fluoroscopic image capturing apparatus by means of a snap.
Referring to fig. 3, a schematic diagram of a setting manner of the registration assembly after removing the marker according to an embodiment of the present invention is provided.
As can be seen from fig. 3, the main body plates with embedded markers have been removed from the registration assembly, and only the buckle remains mounted on the front side of the light receiver of the perspective image acquisition device. In this case, the registration assembly can be considered to consist only of the buckle.
The way the tracer is arranged on the registration assembly will be described below.
In one embodiment, the tracer is removably mounted to the registration assembly at a predetermined location.
Specifically, the tracer may be detachably attached to the preset position of the registration assembly by magnetic attraction; alternatively, the tracer may include a pin that inserts into a slot at the preset position of the registration assembly, so that the tracer can be detachably mounted there.
For ease of understanding, the placement of the tracer on the registration assembly is described below in conjunction with fig. 4.
Referring to fig. 4, a schematic diagram of a tracer setting mode provided by the present invention is shown.
As shown in fig. 4, the registration assembly is the one shown in fig. 2: it includes a main body plate with a double-layer plate structure and a buckle, with a marker embedded in each plate surface of the main body plate, and the registration assembly is fixed to the front side of the light receiver by the buckle. The preset position is on the upper right side of the buckle in the registration assembly, and the tracer is detachably mounted there in the manner described above.
Because the tracer is detachably mounted at the preset position of the registration assembly, the registration assembly can be placed within the field of view of the perspective image acquisition device while separated from the tracer. Even if the perspective image acquisition device and the registration assembly within its field of view are wrapped together in a sterile cover during the operation, the tracer can still be mounted on the registration assembly through the sterile cover. The surgical navigation device is therefore not affected by the sterile cover when capturing the pose image and recognizing the tracer in it, which improves the accuracy of recognizing the tracer in the image and of the pose information of the tracer relative to the surgical navigation device obtained from the recognition result.
In an embodiment of the present invention, the registration assembly may be provided with a plurality of markers, in which case, the foregoing step S102 may be implemented by:
first space coordinates of each marker in three-dimensional space are obtained, and corresponding first image coordinates of the first space coordinates of each marker in the first perspective image are determined based on distribution information of each marker in the registration assembly.
For example, suppose the first spatial coordinate of marker A in three-dimensional space is P2, and the distribution information indicates that marker A is located at the center of the registration assembly. The first image coordinate corresponding to P2 is then the first image coordinate p2 at which the marker located at the center of the registration assembly is visualized in the first perspective image.
For another example, after the first spatial coordinates of the markers are obtained, the first image coordinates of the markers in the first perspective image may be determined in turn, according to a predetermined ordering of the markers, as the first image coordinates corresponding to the obtained first spatial coordinates.
When a plurality of markers are arranged in the registration assembly, the first space coordinate and the first image coordinate corresponding to each marker can be conveniently and accurately determined based on the distribution information of each marker in the registration assembly.
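One simple way to exploit such distribution information is to put both the known marker layout and the detected image points into the same deterministic order and pair them positionally. This is an illustrative scheme, not necessarily the patent's exact method; the grid layout and pixel values below are hypothetical:

```python
import numpy as np

# Known layout of markers in the registration assembly (model plane, mm).
# A hypothetical 2x3 grid; the patent only says "distribution information".
model_xy = np.array([[x, y] for y in (0.0, 30.0) for x in (0.0, 30.0, 60.0)])

# Marker centers detected in the first perspective image (pixels),
# given in scrambled order to show the matching step.
detected = np.array([[310., 250.], [110., 250.], [210., 150.],
                     [110., 150.], [310., 150.], [210., 250.]])

def raster_order(pts, row_tol=20.0):
    """Sort points top-to-bottom, then left-to-right within each row."""
    pts = pts[np.argsort(pts[:, 1])]          # group by image row (y)
    rows, current = [], [pts[0]]
    for p in pts[1:]:
        if abs(p[1] - current[-1][1]) <= row_tol:
            current.append(p)
        else:
            rows.append(current)
            current = [p]
    rows.append(current)
    return np.vstack([sorted(r, key=lambda q: q[0]) for r in rows])

# Pair each model marker with its image observation by shared ordering.
pairs = list(zip(model_xy, raster_order(detected)))
```

This positional pairing assumes the projection preserves the row/column ordering of the grid, which holds for the near-frontal mounting on the light receiver described above.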
In an embodiment of the present invention, when the foregoing step S104 is executed, if it is detected that the relative position between the registration assembly and the perspective image acquisition device has changed by the time the perspective image acquisition device captures the second perspective image, steps S101 to S103 may be executed again to obtain an updated first registration relationship, and the subsequent step S105 is executed based on the updated first registration relationship.
Since the first registration relationship represents the coordinate transformation between the image coordinate system of the first perspective image and the spatial coordinate system of the three-dimensional space, the first registration relationship changes whenever the relative position of the registration assembly and the perspective image acquisition device changes. Steps S101 to S103 may therefore be re-executed to obtain an updated first registration relationship, and the subsequent step S105 is executed based on the updated first registration relationship.
The difference between the surgical robot navigation solution provided by the embodiment of the present invention and the prior art will be described in detail below.
In the prior art, registration is generally realized by means of a registration plate with active light-emitting points. The registration plate is arranged within the field of view of the perspective image acquisition device; the surgical robot first obtains the positions of the active light-emitting points in the perspective image acquired by the device, then obtains the pose information of the registration plate determined by the surgical navigation device through recognition of the active light-emitting points, and performs registration from the positions and the pose information.
In the prior art, therefore, the markers serve both to obtain the image positions and to obtain the pose information of the registration plate. During an actual operation, the perspective image acquisition device is entirely covered by a sterile cover, so the registration plate within its field of view is covered as well. The sterile cover can interfere with the surgical navigation device's recognition of the light-emitting points on the registration plate, which reduces the accuracy of the pose information obtained by the surgical navigation device, in turn reduces the accuracy of the registration between the surgical navigation device and the perspective image based on that pose information, and leaves the surgical robot with lower accuracy when navigating according to the registration result.
When the scheme provided by the embodiment of the present invention is applied, the acquisition of the marker positions in the perspective image and the acquisition of the pose information are independent of each other: the pose information is acquired from the tracer mounted on the registration assembly and does not depend on the positions of the markers in the registration assembly. Even if the registration assembly within the field of view of the perspective image acquisition device is wrapped in a sterile cover during the operation, the tracer only needs to be mounted on the registration assembly through the sterile cover; the surgical robot can then acquire the pose information of the registration assembly by recognizing the tracer in the pose image captured by the surgical navigation device, and the recognition process is not disturbed by the sterile cover. This improves the accuracy of the pose information obtained by the surgical robot from the recognition result and the accuracy of the registration between the surgical navigation device and the perspective image based on that pose information, and thus the accuracy with which the surgical robot navigates according to the registration result. The system is applicable to a wider range of surgical procedures, and its operation is more reliable.
On the basis of the embodiment shown in fig. 1, the tracer includes a light-reflecting ball capable of being recognized by the surgical navigation device, the tracer is mounted on the registration assembly and is located within the field of view of the surgical navigation device, so that the second spatial coordinate of the light-reflecting ball and the second image coordinate of the light-reflecting ball in the pose image acquired by the surgical navigation device can be obtained, and the pose information of the tracer relative to the surgical navigation device is determined based on the second spatial coordinate and the second image coordinate. In view of the above, the embodiment of the present invention provides a second surgical robot navigation method.
Referring to fig. 5, a flowchart of a second surgical robot navigation method according to an embodiment of the present invention is shown, where the method includes the following steps S501 to S507:
step S501: a first fluoroscopic image acquired by a fluoroscopic image acquisition apparatus is obtained.
Step S502: first spatial coordinates of the marker in three-dimensional space and first image coordinates of the marker in the first fluoroscopic image are obtained.
Step S503: and obtaining a first registration relation between the first perspective image and the three-dimensional space based on the first space coordinate and the first image coordinate.
The steps S501 to S503 are the same as the steps S101 to S103 in the embodiment shown in fig. 1, and are not described again here.
Step S504: and in the operation process, obtaining a second perspective image collected by the perspective image collecting equipment and obtaining a pose image collected by the operation navigation equipment.
As described in the foregoing embodiment, the surgical navigation device may be a visible-light image acquisition device or an infrared image acquisition device; correspondingly, the pose image acquired by the surgical navigation device may be a visible-light image or an infrared image.
Because the tracer is located in the field range of the surgical navigation equipment and the reflective ball in the tracer can be identified by the surgical navigation equipment, the pose image acquired by the surgical navigation equipment contains the reflective ball in the tracer.
Step S505: and determining a second image coordinate of the reflective ball in the pose image.
Specifically, the feature points of the light reflecting ball can be extracted from the pose image, the pixel coordinates of the extracted feature points in the pose image are obtained, and the pixel coordinates are used as the second image coordinates of the light reflecting ball. The detailed obtaining manner of the second image coordinate can be obtained on the basis of the obtaining manner of the first image coordinate of the marker described in the foregoing embodiment shown in fig. 1, and will not be described herein again.
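As a rough illustration of extracting such feature points, the sketch below thresholds a synthetic grayscale pose image, flood-fills each bright blob, and returns intensity-weighted centroids as second image coordinates. A production system would use a tested blob detector; the image and threshold here are hypothetical:

```python
import numpy as np

def bright_centroids(img, thresh=200):
    """Intensity-weighted centroid (row, col) of each bright blob in a
    grayscale image, found by thresholding and 4-connected flood fill."""
    mask = img >= thresh
    seen = np.zeros_like(mask)
    centroids = []
    for y0, x0 in zip(*np.nonzero(mask)):
        if seen[y0, x0]:
            continue
        stack, pix = [(y0, x0)], []
        seen[y0, x0] = True
        while stack:                       # flood-fill one blob
            y, x = stack.pop()
            pix.append((y, x))
            for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ny, nx = y + dy, x + dx
                if (0 <= ny < img.shape[0] and 0 <= nx < img.shape[1]
                        and mask[ny, nx] and not seen[ny, nx]):
                    seen[ny, nx] = True
                    stack.append((ny, nx))
        pix = np.array(pix, dtype=float)
        w = img[pix[:, 0].astype(int), pix[:, 1].astype(int)].astype(float)
        centroids.append((pix * w[:, None]).sum(0) / w.sum())
    return centroids

# Synthetic 20x20 pose image with one bright 3x3 "sphere" at rows 5-7,
# columns 10-12.
img = np.zeros((20, 20))
img[5:8, 10:13] = 255.0
c = bright_centroids(img)
```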
Step S506: and obtaining the position and pose information of the tracer relative to the surgical navigation equipment at the acquisition time based on the second space coordinate and the second image coordinate of the reflective ball.
Here, the second spatial coordinates are the coordinates of the light-reflecting balls in three-dimensional space.
The second spatial coordinates of the light-reflecting balls can be measured and stored in advance, so that in this step they can be read directly. The detailed manner of obtaining the second spatial coordinates follows the manner of obtaining the first spatial coordinates of the markers described in the embodiment shown in fig. 1, and is not repeated here.
Specifically, based on the second image coordinates and second spatial coordinates of the light-reflecting balls, together with calibration parameters such as the intrinsic and extrinsic parameters of the surgical navigation device, pose calculation can be performed with a method such as the PnP (Perspective-n-Point) algorithm to determine the rotation angle and translation distance of the surgical navigation device relative to the plane containing the light-reflecting balls; this rotation and translation information is the pose information of the tracer relative to the surgical navigation device at the acquisition time.
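The patent names the PnP algorithm for this step. A full PnP solver is too long to sketch here, so the example below instead shows the closely related Kabsch rigid alignment, which recovers the same rotation and translation when the binocular device triangulates the spheres' 3D positions directly; the tracer geometry and pose below are made up for illustration:

```python
import numpy as np

def rigid_align(src, dst):
    """Kabsch algorithm: find rotation R and translation t such that
    dst ≈ R @ src_i + t in the least-squares sense."""
    cs, cd = src.mean(0), dst.mean(0)
    H = (src - cs).T @ (dst - cd)          # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T)) # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cd - R @ cs
    return R, t

# Pre-measured second spatial coordinates of the reflective spheres
# (hypothetical tracer geometry, mm).
model = np.array([[0., 0., 0.], [60., 0., 0.], [0., 40., 0.], [0., 0., 25.]])

# Simulated sphere positions as seen by the navigation device:
# the tracer rotated 20 degrees about z and translated.
a = np.radians(20.0)
R_true = np.array([[np.cos(a), -np.sin(a), 0.],
                   [np.sin(a),  np.cos(a), 0.],
                   [0., 0., 1.]])
t_true = np.array([100., -50., 400.])
measured = model @ R_true.T + t_true

R_est, t_est = rigid_align(model, measured)
```

When only 2D image coordinates are available, an off-the-shelf PnP solver such as OpenCV's `solvePnP` would be the usual choice instead.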
Step S507: and acquiring a second registration relation between the second perspective image and the surgical navigation equipment based on the first registration relation and the pose information, and navigating the surgical robot based on the second registration relation.
Step S507 is the same as step S105 in the embodiment shown in fig. 1, and is not repeated herein.
Therefore, the second space coordinate of the reflective ball can be measured in advance, so that when the pose information of the tracer relative to the surgical navigation equipment is obtained, the second space coordinate can be directly read, the pose information is solved by combining the second image coordinate of the reflective ball in the pose image shot by the surgical navigation equipment, and the obtaining efficiency of the pose information is improved.
Corresponding to the surgical robot navigation method, the embodiment of the invention also provides a surgical robot navigation device.
Referring to fig. 6, a schematic structural diagram of a surgical robot navigation apparatus provided in an embodiment of the present invention includes the following modules:
a first perspective image obtaining module 601, configured to obtain a first perspective image collected by a perspective image collecting device, where a registration component is arranged in a field range of the perspective image collecting device, and a marker capable of being developed in the perspective image is arranged on the registration component;
a marker coordinate obtaining module 602, configured to obtain a first spatial coordinate of the marker in a three-dimensional space and a first image coordinate of the marker in the first fluoroscopic image;
a registration relation obtaining module 603, configured to obtain a first registration relation between the first perspective image and the three-dimensional space based on the first space coordinate and the first image coordinate;
a pose information obtaining module 604, configured to obtain a second perspective image acquired by the perspective image acquiring device and obtain pose information of a tracer installed on the registration assembly relative to a surgical navigation device at an acquisition time during a surgical procedure, where the acquisition time is: the moment when the fluoroscopic image acquisition apparatus acquires the second fluoroscopic image;
a navigation module 605, configured to obtain a second registration relationship between the second perspective image and the surgical navigation device based on the first registration relationship and the pose information, and to navigate the surgical robot based on the second registration relationship.
As can be seen from the above, when surgical navigation is performed with the scheme provided by the embodiment of the present invention, the first registration relationship between the first perspective image and the three-dimensional space is obtained from the first spatial coordinates of the markers in the registration assembly in three-dimensional space and their first image coordinates in the perspective image. When the perspective image acquisition device acquires the second perspective image during surgery, the pose information of the tracer mounted on the registration assembly relative to the surgical navigation device at the acquisition time can be obtained; from this pose information and the first registration relationship, the second registration relationship between the second perspective image and the surgical navigation device can be obtained. The lesion can then be located based on the second registration relationship, and the surgical robot can be navigated successfully.
In addition, the first image coordinates and the pose information are acquired independently of each other; that is, acquisition of the pose information does not depend on the marker balls arranged on the registration assembly. Even if the registration assembly is wrapped in a sterile cover during the operation, the pose information can still be acquired by recognizing the tracer mounted outside the sterile cover, and this recognition is not disturbed by the sterile cover. This improves the accuracy of the pose information obtained by the surgical robot from the recognition result and the accuracy of the registration between the surgical navigation device and the perspective image based on that pose information, which in turn improves the accuracy with which the surgical robot navigates according to the registration result. The system is thus applicable to a wider range of surgical procedures, and its operation is more reliable.
In one embodiment of the invention, the tracer comprises a light-reflecting ball capable of being identified by the surgical navigation equipment, and the tracer is mounted on the registration assembly and positioned in the field of view of the surgical navigation equipment;
the pose information obtaining module 604 is specifically configured to: obtain, during the surgical procedure, a second perspective image acquired by the perspective image acquisition device and a pose image acquired by the surgical navigation device; determine the second image coordinates of the light-reflecting balls in the pose image; and obtain the pose information of the tracer relative to the surgical navigation device at the acquisition time based on the second spatial coordinates and second image coordinates of the light-reflecting balls, where the second spatial coordinates are the coordinates of the light-reflecting balls in the three-dimensional space.
Therefore, the second space coordinate of the reflective ball can be measured in advance, so that when the pose information of the tracer relative to the surgical navigation equipment is obtained, the second space coordinate can be directly read, the pose information is solved by combining the second image coordinate of the reflective ball in the pose image shot by the surgical navigation equipment, and the obtaining efficiency of the pose information is improved.
In an embodiment of the present invention, the navigation module 605 is specifically configured to obtain a first matrix representing the first registration relationship and a second matrix representing the pose information; and calculating the product of the first matrix and the second matrix to obtain a second registration relation between the second perspective image and the surgical navigation equipment, and navigating the surgical robot based on the second registration relation.
In this way, by calculating the product of the matrices representing the two coordinate transformation relations, the coordinate transformation relation between the image coordinate system corresponding to the first perspective image and the coordinate system established based on the real space where the surgical navigation device is located, that is, the second registration relation between the second perspective image and the surgical navigation device, can be accurately obtained.
In one embodiment of the invention, a plurality of markers are disposed on the registration assembly;
the marker coordinate obtaining module 602 is specifically configured to obtain first space coordinates of each marker in a three-dimensional space; based on the distribution information of the respective markers in the registration assembly, corresponding first image coordinates of the first spatial coordinates of the respective markers in the first fluoroscopic image are determined.
When a plurality of markers are arranged in the registration assembly, the first space coordinates and the first image coordinates corresponding to each marker can be conveniently and accurately determined based on the distribution information of each marker in the registration assembly.
In one embodiment of the invention, the registration component is detachably arranged at the front side of the light receiver of the perspective image acquisition device;
the registration assembly is detachably mounted on the front side of the light receiver of the perspective image acquisition device, so that an X-ray emission source in the perspective image acquisition device can penetrate through the registration assembly to enter the light receiver, and a perspective image acquired by the perspective image acquisition device can contain a registration assembly area, namely the registration assembly is located in the field range of the perspective image acquisition device.
Or
The registration assembly comprises a main body plate and markers, the main body plate is of a double-layer plate-shaped structure, and the markers are embedded in each plate surface of the main body plate respectively;
the marker is arranged on the plate surface of the main body plate in an embedded mode, and can be conveniently detached when the marker ball is not needed subsequently, so that the interference of the development of the marker can be removed from the image shot by the perspective image acquisition equipment under the condition that the marker ball is not needed.
Or
The tracer is detachably mounted at a preset position of the registration assembly.
Because the tracer is detachably mounted at a preset position of the registration assembly, the registration assembly can be placed within the field of view of the perspective image acquisition device while separated from the tracer. Even if the perspective image acquisition device, together with the registration assembly in its field of view, is wrapped as a whole in a sterile cover during surgery, the tracer can still be mounted on the registration assembly over the sterile cover. The surgical navigation device is therefore not affected by the sterile cover when capturing pose images and identifying the tracer in them, which improves both the accuracy of identifying the tracer in the images and the accuracy of the pose information of the tracer relative to the surgical navigation device obtained from the identification result.
Corresponding to the surgical robot navigation method, the embodiment of the invention also provides a surgical robot navigation system.
Referring to fig. 7, a schematic structural diagram of a surgical robot navigation system provided in an embodiment of the present invention includes: a surgical robot 701, a fluoroscopic image acquisition device 702, a surgical navigation device 703, a registration component 704, and a tracer 705;
the registration component 704 comprises a main body plate and markers, the main body plate is of a double-layer plate-shaped structure, the markers are embedded in each plate surface of the main body plate respectively, and the registration component 704 is detachably mounted on the front side of a light receiver of the perspective image acquisition device 702;
the tracer 705 comprises a light-reflecting ball which can be identified by the surgical navigation equipment 703, and the tracer 705 is detachably mounted at a preset position of the registration component 704 and is positioned in the field of view of the surgical navigation equipment 703;
the perspective image collecting device 702 is configured to collect a perspective image and send the collected perspective image to the surgical robot 701;
the surgical navigation device 703 is configured to collect a pose image and send the collected pose image to the surgical robot 701;
the surgical robot 701 is configured to perform any one of the embodiments of the surgical robot navigation method.
It should be noted that the embodiments of the present invention do not limit the shape or type of the surgical robot or the surgical navigation device; those shown in the system diagram are drawn only for ease of understanding, and it is not implied that the embodiments apply only to surgical robots and surgical navigation devices of the shapes or types shown.
Fig. 8 is a schematic structural diagram of a surgical robot according to an embodiment of the present invention. The surgical robot includes a processor 801, a communication interface 802, a memory 803, and a communication bus 804, where the processor 801, the communication interface 802, and the memory 803 communicate with each other via the communication bus 804;
a memory 803 for storing a computer program;
the processor 801 is configured to implement the surgical robot navigation method according to the embodiment of the present invention when executing the program stored in the memory 803.
The communication bus of the surgical robot may be a Peripheral Component Interconnect (PCI) bus, an Extended Industry Standard Architecture (EISA) bus, or the like. The communication bus may be divided into an address bus, a data bus, a control bus, and so on. For ease of illustration, only one thick line is shown in the figure, but this does not mean that there is only one bus or one type of bus.
The communication interface is used for communication between the surgical robot and other equipment.
The memory may include Random Access Memory (RAM) or Non-Volatile Memory (NVM), such as at least one magnetic disk memory. Optionally, the memory may also be at least one storage device located remotely from the aforementioned processor.
The processor may be a general-purpose processor, including a Central Processing Unit (CPU), a Network Processor (NP), and the like; it may also be a Digital Signal Processor (DSP), an Application-Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component.
In still another embodiment of the present invention, a computer-readable storage medium is further provided, in which a computer program is stored, and the computer program, when executed by a processor, implements the surgical robot navigation method provided by the embodiment of the present invention.
In yet another embodiment provided by the present invention, there is also provided a computer program product containing instructions which, when run on a computer, cause the computer to perform the surgical robot navigation method provided by the embodiments of the present invention.
In the above embodiments, the implementation may be realized wholly or partially by software, hardware, firmware, or any combination thereof. When software is used, the implementation may take the form, in whole or in part, of a computer program product. The computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the processes or functions described in the embodiments of the invention are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable device. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another, for example from one website, computer, server, or data center to another via a wired link (e.g., coaxial cable, optical fiber, Digital Subscriber Line (DSL)) or a wireless link (e.g., infrared, radio, microwave). The computer-readable storage medium may be any available medium accessible to a computer, or a data storage device such as a server or data center integrating one or more available media. The available medium may be a magnetic medium (e.g., floppy disk, hard disk, magnetic tape), an optical medium (e.g., DVD), or a semiconductor medium (e.g., Solid State Disk (SSD)), among others.
It is noted that, herein, relational terms such as first and second are used solely to distinguish one entity or action from another, and do not necessarily require or imply any actual relationship or order between such entities or actions. Moreover, the terms "comprises," "comprising," and any variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed, or elements inherent to such a process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a/an ..." does not exclude the presence of additional identical elements in the process, method, article, or apparatus that comprises the element.
All the embodiments in this specification are described in a related manner; identical or similar parts among the embodiments may be referred to one another, and each embodiment focuses on its differences from the others. In particular, the embodiments of the device, the system, the surgical robot, and the storage medium are described relatively briefly because they are substantially similar to the method embodiments; for the relevant points, refer to the corresponding parts of the description of the method embodiments.
The above description is only for the preferred embodiment of the present invention, and is not intended to limit the scope of the present invention. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention shall fall within the protection scope of the present invention.
Claims (13)
1. A surgical robot navigation method, the method comprising:
obtaining a first perspective image acquired by perspective image acquisition equipment, wherein a registration component is arranged in a field range of the perspective image acquisition equipment, and a marker capable of being developed in the perspective image is arranged on the registration component;
obtaining first spatial coordinates of the marker in a three-dimensional space and first image coordinates of the marker in the first perspective image;
obtaining a first registration relation between the first perspective image and the three-dimensional space based on the first space coordinate and the first image coordinate;
in the operation process, obtaining a second perspective image acquired by the perspective image acquisition device, and obtaining pose information, relative to a surgical navigation device, of a tracer installed on the registration assembly at an acquisition time, wherein the acquisition time is the time at which the perspective image acquisition device acquires the second perspective image;
and acquiring a second registration relation between the second perspective image and the surgical navigation equipment based on the first registration relation and the pose information, and navigating the surgical robot based on the second registration relation.
2. The method of claim 1, wherein the tracer comprises a light-reflective ball recognizable by a navigation device, the tracer being mounted to the registration assembly and located within a field of view of the surgical navigation device;
the obtaining pose information of a tracer installed on the registration assembly relative to a surgical navigation device at an acquisition time comprises:
obtaining a pose image acquired by surgical navigation equipment;
determining second image coordinates of the light reflecting ball in the pose image;
based on second space coordinates of the light-reflecting ball and the second image coordinates, obtaining pose information of the tracer relative to the surgical navigation device at the acquisition time, wherein the second space coordinates are the coordinates of the light-reflecting ball in the three-dimensional space.
3. The method of claim 1, wherein the obtaining a second registration relationship of the second fluoroscopic image with the surgical navigation device based on the first registration relationship and the pose information comprises:
obtaining a first matrix representing the first registration relation and a second matrix representing the pose information;
and calculating the product of the first matrix and the second matrix to obtain a second registration relation between the second perspective image and the surgical navigation equipment.
4. The method of any of claims 1-3, wherein a plurality of markers are disposed on the registration assembly;
said obtaining first spatial coordinates of said marker in three-dimensional space and first image coordinates of said marker in said first perspective image comprises:
obtaining first spatial coordinates of each marker in three-dimensional space;
determining, based on the distribution information of the respective markers in the registration component, corresponding first image coordinates of the first spatial coordinates of the respective markers in the first perspective image.
5. The method according to any one of claims 1 to 3,
the registration assembly is detachably arranged on the front side of a light receiver of the perspective image acquisition equipment; or
The registration component comprises a main body plate and markers, the main body plate is of a double-layer plate-shaped structure, and the markers are embedded in each plate surface of the main body plate respectively; or
The tracer is detachably arranged at a preset position of the registration assembly.
6. A surgical robotic navigation device, the device comprising:
the device comprises a first perspective image acquisition module, a second perspective image acquisition module and a display module, wherein the first perspective image acquisition module is used for acquiring a first perspective image acquired by a perspective image acquisition device, a registration assembly is arranged in a field range of the perspective image acquisition device, and a marker capable of being developed in the perspective image is arranged on the registration assembly;
a marker coordinate obtaining module for obtaining a first space coordinate of the marker in a three-dimensional space and a first image coordinate of the marker in the first perspective image;
a registration relation obtaining module, configured to obtain a first registration relation between the first perspective image and the three-dimensional space based on the first space coordinate and the first image coordinate;
a pose information obtaining module, configured to, during a surgical procedure, obtain a second perspective image acquired by the perspective image acquisition device and obtain pose information, relative to a surgical navigation device, of a tracer installed on the registration assembly at an acquisition time, wherein the acquisition time is the moment at which the perspective image acquisition device acquires the second perspective image;
and the navigation module is used for acquiring a second registration relation between the second perspective image and the surgical navigation equipment based on the first registration relation and the pose information, and navigating the surgical robot based on the second registration relation.
7. The apparatus of claim 6,
the tracer comprises a light-reflecting ball which can be identified by the navigation equipment, and the tracer is mounted on the registration assembly and is positioned in the field of view of the surgical navigation equipment;
the pose information obtaining module is specifically configured to, during the surgical procedure, obtain a second perspective image acquired by the perspective image acquisition device and obtain a pose image acquired by the surgical navigation device; determine second image coordinates of the light-reflecting ball in the pose image; and obtain, based on second space coordinates of the light-reflecting ball and the second image coordinates, pose information of the tracer relative to the surgical navigation device at the acquisition time, wherein the second space coordinates are the coordinates of the light-reflecting ball in the three-dimensional space.
8. The apparatus of claim 6,
the navigation module is specifically configured to obtain a first matrix representing the first registration relationship and a second matrix representing the pose information; and calculating the product of the first matrix and the second matrix to obtain a second registration relation between the second perspective image and the surgical navigation equipment, and navigating the surgical robot based on the second registration relation.
9. The apparatus according to any one of claims 6 to 8,
a plurality of markers are arranged on the registration assembly;
the marker coordinate obtaining module is specifically configured to obtain first space coordinates of each marker in a three-dimensional space, and to determine, based on the distribution information of the markers in the registration component, the first image coordinates in the first perspective image that correspond to the first space coordinates of each marker.
10. The apparatus according to any one of claims 6-8,
the registration component is detachably arranged on the front side of a light receiver of the perspective image acquisition equipment; or
The registration component comprises a main body plate and markers, the main body plate is of a double-layer plate-shaped structure, and the markers are embedded in each plate surface of the main body plate respectively; or
The tracer is detachably mounted at a preset position of the registration assembly.
11. A surgical robotic navigation system, comprising: the system comprises a surgical robot, a perspective image acquisition device, a surgical navigation device, a registration assembly and a tracer;
the registration component comprises a main body plate and markers, the main body plate is of a double-layer plate-shaped structure, the markers are embedded in each plate surface of the main body plate respectively, and the registration component is detachably mounted on the front side of a light receiver of the perspective image acquisition equipment;
the tracer comprises a light-reflecting ball which can be identified by the surgical navigation equipment, and the tracer is detachably arranged at a preset position of the registration assembly and is positioned in the field of view of the surgical navigation equipment;
the perspective image acquisition equipment is used for acquiring a perspective image and sending the acquired perspective image to the surgical robot;
the surgical navigation equipment is used for acquiring a pose image and sending the acquired pose image to the surgical robot;
the surgical robot for performing the surgical robot navigation method of any one of claims 1-5.
12. A surgical robot, characterized by comprising a processor, a communication interface, a memory, and a communication bus, wherein the processor, the communication interface, and the memory communicate with each other via the communication bus;
a memory for storing a computer program;
a processor for implementing the method steps of any one of claims 1 to 5 when executing a program stored in the memory.
13. A computer-readable storage medium, characterized in that a computer program is stored in the computer-readable storage medium, which computer program, when being executed by a processor, carries out the method steps of any one of the claims 1-5.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202211252780.XA CN115737121A (en) | 2022-10-13 | 2022-10-13 | Surgical robot navigation method, device and system |
Publications (1)
Publication Number | Publication Date |
---|---|
CN115737121A true CN115737121A (en) | 2023-03-07 |
Family
ID=85351325
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202211252780.XA Pending CN115737121A (en) | 2022-10-13 | 2022-10-13 | Surgical robot navigation method, device and system |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN115737121A (en) |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11801097B2 (en) | Robotic fluoroscopic navigation | |
US7760909B2 (en) | Video tracking and registering | |
CN112006779B (en) | Precision detection method of surgical navigation system | |
JP4599184B2 (en) | Index placement measurement method, index placement measurement device | |
US7831082B2 (en) | System and method for image based sensor calibration | |
US20070270690A1 (en) | Non-contact medical registration with distance measuring | |
US7634122B2 (en) | Registering intraoperative scans | |
US9693748B2 (en) | System and method for automatically determining calibration parameters of a fluoroscope | |
US20100063387A1 (en) | Pointing device for medical imaging | |
EP3238649B1 (en) | Self-localizing medical device | |
US7789562B2 (en) | Calibration of a multi-plane X-ray unit | |
CN104000654A (en) | Computer-implemented technique for calculating a position of a surgical device | |
JP7538812B2 (en) | System and method for aiming and positioning procedure tools in an X-ray or ultrasound environment | |
CN115661214A (en) | Registration precision verification method and device | |
Meng et al. | An automatic markerless registration method for neurosurgical robotics based on an optical camera | |
JP7463625B2 (en) | Navigation Support | |
CN115737121A (en) | Surgical robot navigation method, device and system | |
CN115619836A (en) | Focal screen distance calibration method and device | |
CN118078438A (en) | Calibration method, positioning method and calibration device of surgical navigation system | |
CN117100397A (en) | Registration method and system of surgical navigation positioning system | |
CN114176779A (en) | Surgical robot navigation positioning method and device | |
JPH09166410A (en) | Position measuring apparatus | |
CN116350359A (en) | Method for guiding position adjustment of image equipment, storage medium and medical system | |
CN115227397B (en) | Registration plate automatic alignment method and device | |
US20230252681A1 (en) | Method of medical calibration |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||