CN114129262A - Method, device and apparatus for tracking a patient's surgical position

Info

Publication number
CN114129262A
Authority
CN
China
Prior art keywords
patient
pose
surgical
optical calibration
optical
Prior art date
Legal status (assumed; not a legal conclusion)
Granted
Application number
CN202111334581.9A
Other languages
Chinese (zh)
Other versions
CN114129262B (en)
Inventor
眭菁
Current Assignee (listing may be inaccurate)
Beijing Gerui Technology Co., Ltd.
Original Assignee
Beijing Gerui Technology Co., Ltd.
Priority date (assumed; not a legal conclusion)
Filing date
Publication date
Application filed by Beijing Gerui Technology Co., Ltd.
Priority to CN202111334581.9A (granted as CN114129262B)
Priority to CN202311464902.6A (published as CN117323007A)
Publication of CN114129262A
Application granted
Publication of CN114129262B
Legal status: Active

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00: Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/20: Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B 2034/2046: Tracking techniques
    • A61B 2034/2055: Optical tracking systems

Abstract

This specification discloses a method, device and apparatus for tracking a patient's surgical position. The surgical position of the patient and the relative pose of an optical calibration tag disposed on the patient are determined in advance by an optical tracking device. During surgery, the pose of the optical calibration tag and the extent of the patient's surgical area are determined from depth images of the surgical area acquired by a depth device, and serve as an auxiliary check on whether the relative pose between the tag and the surgical position has changed. If the relative pose is unchanged, the coordinates of the surgical position are determined from the tag pose acquired by the optical tracking device and the relative pose. Because the depth images reveal changes in the relative pose promptly, the coordinates of the surgical position can be determined accurately and safety hazards are reduced.

Description

Method, device and apparatus for tracking a patient's surgical position
Technical Field
This specification relates to the technical field of artificial intelligence, and in particular to a method, device and apparatus for tracking a patient's surgical position.
Background
With the development of artificial intelligence, medical robots have become one of the research hotspots in the robotics field. Medical robots are applied in rescue, surgical treatment, rehabilitation training and other medical processes, greatly advancing medical science. During the operation of a medical robot, however, the coordinates of the patient's surgical position may change, so those coordinates need to be tracked.
A common method of tracking a patient's surgical position is based on an infrared camera. Specifically, an optical calibration tag is first fixed at the patient's surgical position, and the pose of the tag and the relative pose between the tag and the surgical position are determined. During surgery, the pose of the tag is acquired by the infrared camera, and the coordinates of the surgical position are then determined from the change in the tag's position and the predetermined relative pose.
This prior art, however, rests on the assumption that the relative pose between the tag fixed at the surgical position and the surgical position itself never changes. In practice, that relative pose may change during surgery, so the coordinates determined by the prior art can carry a large error, creating a safety hazard.
Disclosure of Invention
This specification provides a method and apparatus for tracking a patient's surgical position, which partially solve the above problems of the prior art.
The technical scheme adopted by the specification is as follows:
the present specification provides a method of tracking a surgical site of a patient, comprising:
determining in advance, by an optical tracking device, the surgical position of a patient and the relative pose of an optical calibration tag disposed on the patient;
during the patient's surgery, acquiring a depth image of the patient's surgical area at the current moment by a depth device, and determining the extent of the surgical area and the pose of the optical calibration tag at the current moment;
determining a first position difference from the historically determined extent of the surgical area and the extent at the current moment, and a second position difference from the historically determined pose of the optical calibration tag and the pose at the current moment;
judging whether the difference between the first position difference and the second position difference is smaller than a preset first difference threshold;
if so, determining the coordinates of the surgical position from the relative pose and the pose of the optical calibration tag acquired by the optical tracking device at the current moment;
if not, re-determining, by the optical tracking device, the relative pose of the surgical position and the optical calibration tag.
Optionally, determining in advance, by an optical tracking device, the surgical position of the patient and the relative pose of an optical calibration tag disposed on the patient specifically includes:
determining, by the optical tracking device, the pose of an optical calibration tag disposed on the patient and the position of an optical calibration tag disposed at the patient's surgical position;
taking the position of the optical calibration tag disposed at the surgical position as the coordinates of the surgical position;
and determining the relative pose of the surgical position and the optical calibration tag disposed on the patient from that tag's pose and the coordinates of the surgical position, the relative pose comprising a relative position and a relative orientation.
Optionally, acquiring a depth image of the patient's surgical area at the current moment by a depth device, and determining the extent of the surgical area and the pose of the optical calibration tag at the current moment, specifically includes:
performing target object recognition on the depth image of the surgical area acquired by the depth device at the current moment, and determining the point cloud data of the surgical area and the point cloud data of the optical calibration tag;
and determining the extent of the surgical area and the pose of the optical calibration tag from the point cloud data of the surgical area and the point cloud data of the optical calibration tag, respectively.
Optionally, determining the first position difference from the historically determined extent of the surgical area and the extent at the current moment includes:
determining the center of the historical surgical area and the center of the current surgical area from their respective extents;
and determining the first position difference from the historical and current center positions.
Optionally, before determining the coordinates of the surgical position from the relative pose and the pose of the optical calibration tag acquired by the optical tracking device at the current moment, the method further includes:
determining, from the depth images at historical moments acquired by the depth device, a depth image in which the surgical area is undeformed, as a reference depth image;
taking the point cloud data of the surgical area in the depth image at the current moment as a first point cloud, and the point cloud data of the surgical area in the reference depth image as a second point cloud;
judging, from the similarity between the first point cloud and the second point cloud, whether the surgical area is undeformed;
if so, continuing to determine the coordinates of the surgical position;
if not, re-determining the relative pose of the surgical position and the optical calibration tag.
Optionally, before determining the coordinates of the surgical position from the relative pose and the pose of the optical calibration tag acquired by the optical tracking device at the current moment, the method further includes:
determining the pose of the optical calibration tag acquired by the optical tracking device at the current moment;
determining a third position difference from the pose of the optical calibration tag at the current moment and the historically determined pose of the optical calibration tag;
judging whether the third position difference is smaller than a preset second difference threshold;
if so, continuing to determine the coordinates of the surgical position;
if not, re-determining the relative pose of the surgical position and the optical calibration tag.
Optionally, re-determining, by the optical tracking device, the relative pose of the surgical position and the optical calibration tag specifically includes:
when the first position difference is smaller than a preset third difference threshold and the second position difference is not smaller than that threshold, determining, from the depth image at the current moment, the position of the anchor point connecting the optical calibration tag to the patient's surgical area at the current moment;
determining the historical position of the anchor point from historically acquired depth images;
determining a fourth position difference from the position of the anchor point at the current moment and its historical position;
judging whether the fourth position difference is smaller than a preset fourth difference threshold;
if so, updating the relative pose from the pose of the optical calibration tag acquired by the optical tracking device at the current moment, the tag's pose at the moment immediately before the current moment, and the existing relative pose;
if not, re-determining the relative pose of the surgical position and the optical calibration tag.
This specification provides a patient surgical position tracking device comprising an optical tracking sensor, a depth sensor and a processing unit, wherein:
the optical tracking sensor is configured to acquire the pose of the optical calibration tag at the current moment and send it to the processing unit;
the depth sensor is configured to acquire, during the patient's surgery, a depth image of the surgical area at the current moment, determine the extent of the surgical area and the pose of the optical calibration tag at the current moment, and send them to the processing unit;
the processing unit is configured to determine in advance, from the surgical position and the tag pose sent by the optical tracking sensor, the relative pose of the surgical position and the optical calibration tag disposed on the patient; to determine a first position difference from the historically determined extent of the surgical area and the extent at the current moment received from the depth sensor, and a second position difference from the historically determined tag pose and the tag pose at the current moment; to judge whether the difference between the first and second position differences is smaller than a preset first difference threshold; if so, to determine the coordinates of the surgical position from the relative pose and the tag pose at the current moment sent by the optical tracking sensor; and if not, to re-acquire the coordinates of the surgical position from the optical tracking device and re-determine the relative pose of the surgical position and the optical calibration tag.
This specification provides a tracking apparatus for a patient's surgical position, comprising:
a first determination module, configured to determine in advance, by an optical tracking device, the surgical position of the patient and the relative pose of an optical calibration tag disposed on the patient;
a second determination module, configured to acquire, during the patient's surgery, a depth image of the surgical area at the current moment by a depth device, and determine the extent of the surgical area and the pose of the optical calibration tag at the current moment;
a third determination module, configured to determine a first position difference from the historically determined extent of the surgical area and the extent at the current moment, and a second position difference from the historically determined pose of the optical calibration tag and the pose at the current moment;
a judgment module, configured to judge whether the difference between the first and second position differences is smaller than a preset first difference threshold;
a first execution module, configured to, if so, determine the coordinates of the surgical position from the relative pose and the pose of the optical calibration tag acquired by the optical tracking device at the current moment;
and a second execution module, configured to, if not, re-determine, by the optical tracking device, the relative pose of the surgical position and the optical calibration tag.
This specification provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the above method for tracking a patient's surgical position.
This specification provides an electronic device comprising a memory, a processor and a computer program stored in the memory and executable on the processor, the processor implementing the above method for tracking a patient's surgical position when executing the program.
The technical scheme adopted by this specification can achieve the following beneficial effects:
In the method for tracking a patient's surgical position provided by this specification, the surgical position and the relative pose of an optical calibration tag disposed on the patient are determined in advance by an optical tracking device. During surgery, the pose of the optical calibration tag and the extent of the surgical area are determined from depth images of the surgical area acquired by a depth device and serve as an auxiliary check on whether the relative pose between the tag and the surgical position has changed. If it has not, the coordinates of the surgical position are determined from the tag pose acquired by the optical tracking device and the relative pose.
By determining the tag pose and the extent of the surgical area from depth images and using them to check the relative pose, the method detects changes in the relative pose promptly, determines the coordinates of the surgical position accurately, and reduces safety hazards.
Drawings
The accompanying drawings, which are included to provide a further understanding of the specification and are incorporated in and constitute a part of it, illustrate embodiments of the specification and together with the description serve to explain it; they do not limit the specification. In the drawings:
FIG. 1 is a schematic flow diagram of a method of tracking a surgical site of a patient provided herein;
FIG. 2 is a schematic view of an optical tracking device provided herein;
FIG. 3 is a flow chart of a method of tracking a surgical site of a patient;
FIG. 4 is a schematic diagram of a patient surgical position tracking device provided herein;
FIG. 5 is a schematic diagram of a tracking apparatus for a patient's surgical position provided herein;
fig. 6 is a schematic diagram of an electronic device corresponding to fig. 1 provided in the present specification.
Detailed Description
To make the objects, technical solutions and advantages of this specification clearer, the technical solutions of this specification are described clearly and completely below with reference to specific embodiments and the accompanying drawings. It should be understood that the described embodiments are only some, not all, of the embodiments of this specification. All other embodiments obtained by a person of ordinary skill in the art from the embodiments in this specification without creative effort fall within its scope of protection.
The technical solutions provided by the embodiments of the present description are described in detail below with reference to the accompanying drawings.
Fig. 1 is a schematic flowchart of the method for tracking a patient's surgical position provided in this specification, which specifically includes the following steps:
s100: the relative poses of the patient's surgical site and the optical calibration labels disposed on the patient are predetermined by the optical tracking device.
The method for tracking a patient's surgical position in this specification can be applied to monitoring and tracking movement of the surgical position before or during surgery, mitigating the safety hazard caused by changes in the relative pose between the surgical area and the optical calibration tag during the operation.
The method is applied to scenarios in which the patient undergoes open surgery. Because medical surgery places high demands on the accuracy of the coordinates of the surgical position, those coordinates can be located with a high-precision optical tracking device. And since this specification locates the coordinates of the surgical position based on the relative pose between the surgical position and the optical calibration tag, that relative pose can be determined first.
Specifically, optical calibration tags may first be placed on the patient and at the patient's surgical position, respectively.
Next, light can be emitted by the optical tracking device, and the pose of the tag disposed on the patient and the position of the tag disposed at the surgical position can be determined from the reflections of the two tags. Here, a pose comprises position coordinates and an orientation.
The processing unit may then take the position of the tag disposed at the surgical position as the coordinates of the surgical position.
Finally, the processing unit can determine the relative pose of the surgical position and the tag disposed on the patient from the coordinates of the surgical position and the pose of that tag.
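This last computation can be illustrated with a short sketch. The following Python fragment is a minimal, hypothetical illustration rather than the patent's implementation: it assumes the tag pose is available as a 4x4 homogeneous transform in the tracker frame and the surgical position as a 3-D point, and all function names are invented for illustration.

```python
import numpy as np

def make_pose(rotation: np.ndarray, position: np.ndarray) -> np.ndarray:
    """Assemble a 4x4 homogeneous transform from a 3x3 rotation and a 3-vector."""
    pose = np.eye(4)
    pose[:3, :3] = rotation
    pose[:3, 3] = position
    return pose

def relative_pose(tag_pose: np.ndarray, site_coords: np.ndarray) -> np.ndarray:
    """Relative pose of the surgical position expressed in the tag's frame.

    tag_pose:    4x4 transform of the optical calibration tag in the tracker frame.
    site_coords: coordinates of the surgical position in the tracker frame
                 (its orientation is not observed here, so identity is assumed).
    """
    site_pose = make_pose(np.eye(3), site_coords)
    return np.linalg.inv(tag_pose) @ site_pose   # T_rel = T_tag^-1 . T_site
```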
Of course, the relative pose of the surgical position and the optical calibration tag may also be preset; that is, the tag is placed on the patient according to the preset relative pose with the aid of the optical tracking device. When the relative pose of the surgical position and the tag disposed on the patient needs to be determined, the processing unit may simply read the pre-stored value.
Further, the optical tracking device in this specification may be a binocular infrared camera, with the optical calibration tags made of an infrared-reflective material. Infrared light is then emitted by the camera, and the patient's position is determined from the infrared reflections.
Fig. 2 is a schematic diagram of the optical tracking setup provided in this specification. In the figure, the black balls represent optical calibration tags placed on the patient and at the surgical position. The patient lies on the operating table; the gray filled area A on the patient represents the surgical area, and optical calibration tags are placed near the surgical area and at the surgical position for positioning. The infrared camera is placed at a fixed position near the bed from which it can observe the entire operating room. It emits infrared light, receives the light reflected by the tag on the patient and the tag at the surgical position, and accurately determines the relative pose of the surgical position and the tags from the reflections. The surgical position lies within the surgical area.
S102: in the process of a patient operation, a depth image of the patient operation area at the current moment is acquired through a depth device, and the range of the patient operation area at the current moment and the pose of an optical calibration label are respectively determined.
The method is different from the method that the coordinates of the surgical position of the patient are determined only by the optical tracking equipment in the prior art, and the coordinates of the surgical position of the patient cannot be accurately determined when the relative pose between the surgical position of the patient and the optical calibration label is changed. The present specification provides a method for tracking a surgical position of a patient, which determines whether a relative pose between a surgical position of the patient and an optical calibration tag has changed through a depth image of a surgical area acquired by a depth device, and determines a coordinate of the surgical position of the patient according to the relative pose in step S100 and the pose of the optical calibration tag without changing the pose. Based thereon, a depth image of the surgical field of the patient acquired by the depth device may first be acquired.
Specifically, the depth device may first acquire environmental data of the surrounding environment. The depth device is a depth sensor capable of sensing the environment; it may be a depth camera, a binocular camera, a lidar or the like, as long as it can produce a depth image of the patient's surgical area.
The processing unit can then identify, within the environmental data collected by the depth device, the data describing the surgical area and the data describing the optical calibration tag disposed on the patient.
Finally, the processing unit can determine the extent of the surgical area and the pose of the optical calibration tag from those two sets of data, respectively.
Further, to determine the extent of the surgical area and the pose of the optical calibration tag more accurately, the processing unit may also obtain them through target object recognition.
Specifically, the processing unit may perform target object recognition on the depth image at the current moment and extract the point cloud data of the surgical area and the point cloud data of the optical calibration tag.
The processing unit can then determine the extent of the surgical area and the pose of the tag from the two point clouds, respectively.
Alternatively, since object recognition may require training a recognition model on a large amount of data, the processing unit may instead determine in advance the approximate shape and area of the surgical area and the shape and approximate volume of the optical calibration tag, and perform feature point matching on the current depth image to determine the extent of the surgical area and the pose of the tag.
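As one plausible way to obtain the two point clouds from a depth image, the sketch below back-projects depth pixels through a pinhole camera model and applies segmentation masks assumed to come from a target-object recognizer; the intrinsics and the mask source are assumptions for illustration, not details given by the patent.

```python
import numpy as np

def depth_to_point_cloud(depth: np.ndarray, fx: float, fy: float,
                         cx: float, cy: float) -> np.ndarray:
    """Back-project a depth image (meters) into an N x 3 point cloud
    using a pinhole model with known intrinsics fx, fy, cx, cy."""
    v, u = np.indices(depth.shape)
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    cloud = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return cloud[cloud[:, 2] > 0]          # drop pixels with no depth reading

def masked_cloud(depth: np.ndarray, mask: np.ndarray,
                 fx: float, fy: float, cx: float, cy: float) -> np.ndarray:
    """Point cloud of one recognized region (surgical area or calibration tag);
    `mask` is a boolean image from the hypothetical recognizer."""
    return depth_to_point_cloud(np.where(mask, depth, 0.0), fx, fy, cx, cy)
```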
S104: and determining a first bit difference according to the range of the operation area of the patient determined historically and the range of the operation area of the patient at the current moment, and determining a second bit difference according to the position and the attitude of the optical calibration label determined historically and the position and the attitude of the optical calibration label at the current moment.
In one or more embodiments provided by the present specification, since the accuracy of the coordinates of the surgical site of the patient in the medical operation is high, the range of the surgical site of the patient determined by the depth image and the pose of the optical calibration tag are low. And whether the position relation between the patient and the optical calibration label is accurate or not can not be determined only according to the position of the optical calibration label. Therefore, the processing unit can determine whether the relative pose determined in step S100 is changed or not by the change of the surgical area of the patient and the change amount of the optical calibration label, and if not, the processing unit can determine the coordinates of the surgical position of the patient based on the relative pose and the pose of the optical calibration label.
Based on this, the processing unit may first determine the change in the surgical area and the change in the optical calibration tag.
Specifically, the processing unit may determine the center of the current surgical area from its extent at the current moment, and the historical center from the historically determined extent.
The processing unit may then take the change between the historical center and the current center of the surgical area as the first position difference.
Meanwhile, the processing unit may take the change between the historically determined pose of the optical calibration tag and its pose at the current moment as the second position difference.
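A minimal sketch of these two quantities follows, assuming the surgical area is represented by its point cloud and the tag by its position vector; using the point-cloud centroid as the area's center is one reasonable reading of "center position".

```python
import numpy as np

def first_position_difference(hist_area_cloud: np.ndarray,
                              curr_area_cloud: np.ndarray) -> np.ndarray:
    """Change of the surgical area's center (here: point-cloud centroid)."""
    return curr_area_cloud.mean(axis=0) - hist_area_cloud.mean(axis=0)

def second_position_difference(hist_tag_pos: np.ndarray,
                               curr_tag_pos: np.ndarray) -> np.ndarray:
    """Change of the optical calibration tag's position."""
    return np.asarray(curr_tag_pos) - np.asarray(hist_tag_pos)
```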
In addition, the surgical area is generally a regular shape, so determining the first position difference only from the change of its center can be inaccurate: the center may stay fixed while the vertices of the area move. The first position difference may therefore also be determined from the vertex positions of the surgical area.
Specifically, the processing unit may determine the positions of the vertices of the current surgical area from its extent at the current moment, and the historical vertex positions from the historically determined extent.
Then, for each vertex, the processing unit may determine the displacement between its historical position and its position at the current moment, average the displacements over all vertices, and take the average as the first position difference.
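The vertex-based variant can be sketched as follows, under the assumption that the vertices of the surgical area are extracted in a consistent order so that corresponding rows refer to the same vertex.

```python
import numpy as np

def vertex_average_difference(hist_vertices: np.ndarray,
                              curr_vertices: np.ndarray) -> np.ndarray:
    """First position difference as the mean per-vertex displacement;
    both inputs are K x 3 arrays with matching vertex order."""
    return (curr_vertices - hist_vertices).mean(axis=0)
```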
S106: and judging whether the difference value of the first difference and the second difference is smaller than a preset first difference threshold value. If so, the subsidy is executed S108. If not, go to step S110.
In one or more embodiments provided in this specification, as described above, the first and second position differences are used to assist in judging whether the relative pose has changed, so the processing unit can compute the difference between them.
Specifically, the processing unit may take the first and second position differences determined in step S104 and compute their difference.
For example, if the first position difference is (1, 1, 0) and the second is (1, 0, 0), their difference is (0, 1, 0).
The processing unit may then judge whether this difference is smaller than the preset first difference threshold.
Taking a first difference threshold of (1, 1, 0) as an example: with a difference of (0, 1, 0), subtracting the difference from the threshold component-wise leaves no negative component, so the difference is smaller than the threshold.
Of course, the difference and the first difference threshold can also be expressed as distances; for example, when the difference is (0, 1, 0), the corresponding distance is √(0² + 1² + 0²) = 1. The specific form and content of the difference and of the first difference threshold may be set as required; this specification does not limit them.
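Both comparison forms described above, component-wise and distance-based, can be sketched as follows; the "no negative component" test mirrors the example in the text, and the distance variant is one possible reading.

```python
import numpy as np

def below_threshold(first_diff, second_diff, threshold,
                    use_distance: bool = False) -> bool:
    """Check the gap between the two position differences against the
    first difference threshold, component-wise or as a Euclidean distance."""
    gap = np.asarray(first_diff, float) - np.asarray(second_diff, float)
    if use_distance:
        return float(np.linalg.norm(gap)) < float(np.linalg.norm(threshold))
    return bool(np.all(np.asarray(threshold, float) - np.abs(gap) >= 0))

# Example from the text: differences (1, 1, 0) and (1, 0, 0), threshold (1, 1, 0).
assert below_threshold((1, 1, 0), (1, 0, 0), (1, 1, 0))
```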
S108: and determining the coordinates of the surgical position of the patient according to the relative pose and the pose of the optical calibration label acquired by the optical tracking equipment at the current moment.
In one or more embodiments provided herein, the processing unit may determine that the relative pose of the patient's surgical site and the optical calibration label disposed on the patient has not changed when it is determined that the difference between the first difference and the second difference is less than the first difference threshold.
The processing unit may then determine the coordinates of the surgical site of the patient from the relative pose determined in step S100 and the pose of the optical calibration tag currently acquired by the optical tracking device.
Specifically, the processing unit can emit light outwards through the optical tracking device, and determine the pose of the optical calibration label arranged on the patient according to the reflection result of the optical calibration label on the patient.
The processing unit may then determine the coordinates of the surgical site of the patient based on the pose of the optical calibration label and the relative poses of the surgical site of the patient and the optical calibration label.
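Continuing the earlier sketch: once the tag pose at the current moment is known, applying the stored relative pose yields the coordinates. Again a hypothetical illustration using 4x4 transforms, not the patent's stated formula.

```python
import numpy as np

def surgical_site_coords(tag_pose_now: np.ndarray,
                         rel_pose: np.ndarray) -> np.ndarray:
    """Coordinates of the surgical position in the tracker frame:
    T_site = T_tag . T_rel, then take the translation part."""
    return (tag_pose_now @ rel_pose)[:3, 3]
```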
S110: re-determining, by the optical tracking device, the relative pose of the patient surgical site and the optical calibration label.
In one or more embodiments provided herein, upon determining that the relative pose between the surgical site of the patient and the optical calibration label has changed, the processing unit may re-determine the relative pose between the surgical site of the patient and the optical calibration label via the optical tracking device to perform the subsequent steps based on the re-determined relative pose.
The specific steps of determining the relative pose by the optical tracking device can refer to the contents described in the above step S100.
Further, if the surgical site of the patient is changed and the position of the optical calibration label is changed, the processing unit may determine the reason for the change of the position of the optical calibration label. If the optical calibration label is changed due to movement of bed sheets and other external forces of a sickbed where the patient is located, the processing unit can update the relative pose according to the second position difference. If the position of the cursor anchor point connected with the optical calibration label and the surgical area of the patient changes, namely the position of the component for fixing the cursor anchor point in the surgical area of the patient changes, the processing unit needs to determine the relative pose again through the optical tracking device.
Specifically, the processing unit may determine, according to a preset third difference threshold, a position of a cursor anchor point connecting the optical calibration label and the surgical region of the patient at the current time according to the depth image acquired by the depth device at the current time when the first difference is smaller than the third difference threshold and the second difference is larger than the third difference threshold.
The processing unit may then determine the historical location of the cursor anchor point based on the historically collected depth images. The historical depth image at least comprises a depth image at the previous moment of the current moment.
Finally, the processing unit may determine a fourth difference according to the position of the cursor anchor point at the current time and the position of the cursor anchor point in history, and update the relative pose according to the second difference determined in step S102 when the fourth difference is smaller than a preset fourth difference threshold.
Of course, since the determined second position difference has low precision, the processing unit may update the relative pose according to the pose of the optical calibration tag acquired by the optical tracking apparatus at the current time, the pose of the optical calibration tag at the previous time at the current time, and the relative pose.
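Under the assumption that the surgical position itself did not move while the tag did, this update can be written in transform algebra as below; a sketch of one consistent interpretation, not the patent's stated formula.

```python
import numpy as np

def update_relative_pose(rel_pose: np.ndarray,
                         tag_pose_prev: np.ndarray,
                         tag_pose_now: np.ndarray) -> np.ndarray:
    """Keep the surgical position fixed while the tag moves:
    T_tag_prev . T_rel_old = T_tag_now . T_rel_new
    =>  T_rel_new = T_tag_now^-1 . T_tag_prev . T_rel_old."""
    return np.linalg.inv(tag_pose_now) @ tag_pose_prev @ rel_pose
```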
Based on the tracking method shown in fig. 1, the surgical position of the patient and the relative pose of an optical calibration tag disposed on the patient are determined in advance by an optical tracking device; during surgery, the pose of the tag and the extent of the surgical area are determined from depth images of the surgical area acquired by a depth device and serve as an auxiliary check on whether the relative pose between the tag and the surgical position has changed; and when it has not, the coordinates of the surgical position are determined from the tag pose acquired by the optical tracking device and the relative pose. By detecting changes in the relative pose promptly, the method determines the coordinates of the surgical position accurately and reduces safety hazards.
In addition, during surgery, the shape of the surgical area may change with the patient's posture. In spinal surgery, for example, pressure from surgical instruments can deform the spine downward, changing the surgical position. During the operation, the processing unit can therefore also judge whether the surgical area has deformed, using the point cloud data of the surgical area in the depth images acquired by the depth device.
Specifically, the processing unit may select, from the depth images at historical moments acquired by the depth device, a depth image in which the surgical area is undeformed, as a reference depth image. The reference may also be preset, e.g., a depth image of the surgical area collected before the operation begins.
The processing unit may then take the point cloud data of the surgical area in the current depth image as a first point cloud and the point cloud data of the surgical area in the reference depth image as a second point cloud.
Finally, from the similarity between the first and second point clouds, the processing unit can either continue determining the coordinates of the surgical position, when the surgical area is undeformed, or prompt the user, a staff member such as a doctor, to stop the operation, when it is deformed.
Further, since a deformation of the surgical area may be temporary, before prompting the user the processing unit may apply a preset time threshold: if within that period the similarity between the surgical area's point cloud and the second point cloud stays below a preset similarity threshold, the deformation is not temporary and the operation should be stopped. If the similarity rises back above the threshold, the deformation can be deemed temporary; once the similarity between the first and second point clouds again exceeds the threshold, the operation may continue, and the surgical position and the relative pose of the optical calibration tag may be re-determined.
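One plausible similarity measure and persistence check are sketched below; the nearest-neighbour tolerance, the similarity threshold and the window handling are illustrative assumptions, as the patent does not specify the similarity metric.

```python
import numpy as np
from scipy.spatial import cKDTree

def cloud_similarity(reference: np.ndarray, current: np.ndarray,
                     tolerance: float = 0.002) -> float:
    """Fraction of current points within `tolerance` meters of the reference
    cloud; a simple stand-in for the patent's unspecified similarity."""
    distances, _ = cKDTree(reference).query(current)
    return float(np.mean(distances < tolerance))

def deformation_is_persistent(similarities: list,
                              threshold: float = 0.9) -> bool:
    """Deformation counts as persistent only if similarity stayed below the
    threshold for every frame in the observation window."""
    return all(s < threshold for s in similarities)
```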
Further, if the pose of the optical calibration tag changes too much, the relative pose may also have changed. The processing unit may therefore also decide, based on the change in the tag's pose, whether to re-determine the relative pose.
Specifically, the processing unit may determine the tag pose acquired by the optical tracking device at the current moment and compute a third position difference from that pose and the historically determined tag pose.
The processing unit may then judge whether the third position difference is smaller than a preset second difference threshold.
If it is, the processing unit may continue determining the coordinates of the surgical position; if it is not, the processing unit may re-determine the relative pose of the surgical position and the optical calibration tag.
It should be noted that the forms, contents and numerical values of the first, second, third and fourth difference thresholds may all be set as needed; this specification does not limit them.
Based on the above method, this specification further provides the flowchart shown in fig. 3. In the flow, the processing unit first determines the relative pose of the surgical position and the tag disposed on the patient, using the optical calibration tag disposed on the patient and the optical calibration tag disposed at the surgical position.
Next, the processing unit can judge, from the depth images of the surgical area, whether the relative pose between the surgical position and the tag disposed on the patient has changed.
Specifically, the processing unit determines the extent of the surgical area and the tag pose at the current moment from the current depth image of the surgical area, then computes a first position difference between the historically determined surgical area and the current one, and a second position difference between the historically determined tag pose and the current one, and judges whether the difference between the first and second position differences is smaller than the preset first difference threshold.
If so, the relative pose is deemed unchanged, and the flow continues to check whether it can be used to determine the coordinates of the surgical position.
If not, the relative pose is deemed changed and must be re-determined.
When the first and second position differences indicate that the relative pose has not changed, the processing unit can additionally judge whether the surgical area has deformed.
Specifically, the processing unit may select from the historically acquired depth images a reference depth image in which the surgical area is undeformed, take the point cloud of the surgical area in the current depth image as a first point cloud and the point cloud of the surgical area in the reference image as a second point cloud, and judge whether the similarity between them is not less than a preset similarity threshold.
If so, the current surgical area is deemed undeformed.
If not, the current surgical area is deemed deformed; the operation must be stopped, and the relative pose between the surgical position and the optical calibration tag re-determined.
When the relative pose is unchanged and the surgical area undeformed, the processing unit may determine the coordinates of the surgical position from the tag pose at the current moment acquired by the optical tracking device. However, when the tag's pose has changed too much between the previous moment and the current one, the coordinates derived from the current tag pose may not be accurate enough. Whether the coordinates can be determined may therefore also depend on the amount of change in the tag's pose.
Specifically, the processing unit may compute a third position difference from the current tag pose and the historical tag poses acquired by the optical tracking device, and judge whether it is smaller than a preset second difference threshold.
If so, the processing unit may determine the coordinates of the surgical position from the current tag pose and the relative pose.
If not, the processing unit must re-determine the relative pose.
In addition, if the surgical area has not moved but the pose of the optical calibration tag has changed, the processing unit may update the relative pose solely from the tag's change.
Specifically, the processing unit may first judge whether the first position difference is smaller than the preset third difference threshold while the second position difference is not.
If so, the processing unit may conclude that the surgical area has not moved while the tag's pose has changed, and may go on to check whether the position of the anchor point connecting the tag to the patient has changed.
If not, the processing unit must re-determine the relative pose through the optical tracking device.
The processing unit may determine the current position of the anchor point, compute a fourth position difference from the current and historical anchor positions, and judge whether it is smaller than the preset fourth difference threshold.
If so, the processing unit may update the relative pose from the tag pose acquired at the current moment and the tag pose at the moment immediately before.
If not, the processing unit must re-determine the relative pose through the optical tracking device.
Of course, for further details, reference may be made to the above description of the method for tracking a patient's surgical position, which is not repeated here.
Based on the same idea, this specification further provides a patient surgical position tracking device, as shown in fig. 4.
Fig. 4 shows the patient surgical position tracking device provided by this specification, which comprises an optical tracking sensor, a depth sensor and a processing unit, wherein:
A is the optical tracking sensor, B the depth sensor and C the processing unit. The optical tracking sensor determines the pose of the optical calibration tag and sends it to the processing unit; the depth sensor determines the depth image of the surgical area and sends it to the processing unit. The processing unit tracks the patient's surgical position from the received tag pose and the depth image of the surgical area.
Specifically, the optical tracking sensor is configured to acquire the pose of the optical calibration tag at the current moment and send it to the processing unit.
The depth sensor is configured to acquire, during the patient's surgery, a depth image of the surgical area at the current moment, determine the extent of the surgical area and the pose of the optical calibration tag at the current moment, and send them to the processing unit.
The processing unit is configured to determine in advance, from the surgical position and the tag pose sent by the optical tracking sensor, the relative pose of the surgical position and the tag disposed on the patient; to determine a first position difference from the historically determined extent of the surgical area and the extent at the current moment sent by the depth sensor, and a second position difference from the historically determined tag pose and the tag pose at the current moment; to judge whether the difference between the first and second position differences is smaller than the preset first difference threshold; if so, to determine the coordinates of the surgical position from the relative pose and the current tag pose sent by the optical tracking sensor; and if not, to re-acquire the coordinates of the surgical position from the optical tracking device and re-determine the relative pose of the surgical position and the tag.
For detailed descriptions of the optical tracking sensor, the depth sensor and the processing unit, reference may be made to the above method for tracking a patient's surgical position and its description of the optical tracking device, the depth device and the processing unit, which is not repeated here.
Based on the same idea as the method for tracking a patient's surgical position provided by one or more embodiments of this specification, this specification further provides a corresponding tracking apparatus, as shown in fig. 5.
Fig. 5 shows the tracking apparatus for a patient's surgical position provided by this specification, comprising:
the first determination module 200 is used for determining the relative poses of the surgical position of the patient and the optical calibration label arranged on the patient in advance through the optical tracking device.
The second determining module 202 is configured to acquire a depth image of the surgical area of the patient at the current time through a depth device during the surgical procedure of the patient, and determine a range of the surgical area of the patient and a pose of the optical calibration tag at the current time respectively.
The third determining module 204 is configured to determine the first bit difference according to the historically determined range of the surgical area of the patient and the range of the surgical area of the patient at the current time, and determine the second bit difference according to the historically determined pose of the optical calibration tag and the pose of the optical calibration tag at the current time.
The determining module 206 is configured to determine whether a difference between the first difference and the second difference is smaller than a preset first difference threshold.
And the first executing module 208 is configured to, if yes, determine the coordinates of the surgical position of the patient according to the relative pose and the pose of the optical calibration tag acquired by the optical tracking device at the current time.
And a second executing module 210, configured to, if not, re-determine, by the optical tracking device, the relative poses of the surgical position of the patient and the optical calibration label.
Optionally, the first determining module 200 is specifically configured to determine, by an optical tracking device, a pose of an optical calibration label disposed on a patient and a position of an optical calibration label disposed on a surgical site of the patient, respectively, use the position of the optical calibration label disposed on the surgical site of the patient as a coordinate of the surgical site of the patient, and determine a relative pose of the surgical site of the patient and a relative pose of the optical calibration label disposed on the patient according to the pose of the optical calibration label disposed on the patient and the coordinate of the surgical site of the patient, where the relative pose includes a relative position and a relative pose.
Optionally, the second determining module 202 is specifically configured to perform target object identification on the depth image of the patient operation area at the current time according to the depth image of the patient operation area at the current time acquired by the depth device, determine point cloud data of the patient operation area and point cloud data of the optical calibration tag respectively, and determine the range of the patient operation area and the pose of the optical calibration tag respectively according to the point cloud data of the patient operation area and the point cloud data of the optical calibration tag.
Optionally, the third determining module 204 is specifically configured to determine, according to the historically determined range of the patient surgical area and the range of the patient surgical area at the current time, the historical center position of the patient surgical area and its center position at the current time, respectively, and to determine the first difference according to the historical center position and the center position at the current time.
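With the surgical area represented as a point cloud, one natural choice of center position is the centroid, and the first difference is then the distance between the historical and current centroids; this concrete choice is only one reading of the description:

```python
import numpy as np

def first_difference(cloud_hist: np.ndarray, cloud_now: np.ndarray) -> float:
    """Distance between the historical and current centers of the surgical
    area, each taken as the centroid of the corresponding point cloud."""
    return float(np.linalg.norm(cloud_now.mean(axis=0) - cloud_hist.mean(axis=0)))
```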
Optionally, the first executing module 208 is specifically configured to select, from the depth images corresponding to historical moments acquired by the depth device, a depth image in which the patient surgical area is not deformed as a reference depth image; to take the point cloud data of the patient surgical area in the depth image at the current time as a first point cloud and the point cloud data of the patient surgical area in the reference depth image as a second point cloud; and to judge, according to the similarity between the first point cloud and the second point cloud, whether the patient surgical area is undeformed. If so, the module continues to determine the coordinates of the surgical position of the patient; if not, the relative pose between the surgical position of the patient and the optical calibration label is re-determined.
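The specification leaves the similarity measure open; below is a sketch using the mean nearest-neighbour distance between the two clouds (SciPy's cKDTree), assuming both clouds are expressed in a common frame and the tolerance is in the tracker's length units:

```python
import numpy as np
from scipy.spatial import cKDTree

def area_is_undeformed(reference_cloud: np.ndarray, current_cloud: np.ndarray,
                       tolerance: float = 2.0) -> bool:
    """Treat the surgical area as undeformed when the mean distance from each
    current point to its nearest reference point stays below `tolerance`."""
    distances, _ = cKDTree(reference_cloud).query(current_cloud)
    return float(distances.mean()) < tolerance
```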
Optionally, the first executing module 208 is further configured to determine the pose of the optical calibration label acquired by the optical tracking device at the current time, determine a third difference according to that pose and the historically determined pose of the optical calibration label, and judge whether the third difference is smaller than a preset second difference threshold. If so, the module continues to determine the coordinates of the surgical position of the patient; if not, the relative pose between the surgical position of the patient and the optical calibration label is re-determined.
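The third difference compares two label poses; with the 4x4 representation assumed earlier, it can be split into a translation part and a rotation angle recovered from the relative rotation matrix via angle = arccos((trace(R) - 1) / 2). Thresholding either or both parts would be consistent with the description:

```python
import numpy as np

def pose_difference(pose_hist: np.ndarray, pose_now: np.ndarray) -> tuple[float, float]:
    """Translation distance and rotation angle (radians) between two label poses."""
    translation = float(np.linalg.norm(pose_now[:3, 3] - pose_hist[:3, 3]))
    R_rel = pose_hist[:3, :3].T @ pose_now[:3, :3]     # relative rotation
    cos_angle = np.clip((np.trace(R_rel) - 1.0) / 2.0, -1.0, 1.0)
    return translation, float(np.arccos(cos_angle))
```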
Optionally, the second executing module 210 is specifically configured to, when the first difference is smaller than a preset third difference threshold and the second difference is not smaller than that threshold, determine from the depth image at the current time the position of the cursor anchor point connecting the optical calibration label to the patient surgical area; determine the historical position of the cursor anchor point from historically acquired depth images; determine a fourth difference according to the current and historical anchor-point positions; and judge whether the fourth difference is smaller than a preset fourth difference threshold. If so, the relative pose is updated according to the pose of the optical calibration label acquired by the optical tracking device at the current time, the pose of the label at the moment immediately preceding the current time, and the stored relative pose; if not, the relative pose between the surgical position of the patient and the optical calibration label is re-determined.
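The update branch can be read as follows: the patient stayed put (small first difference) while the label shifted, so the surgical position is momentarily fixed in the tracker frame and can be re-expressed in the label's new frame. A sketch under the same 4x4 representation assumed above; this is one plausible reading of the update rule, not the only one:

```python
import numpy as np

def update_relative_position(tag_pose_prev: np.ndarray, tag_pose_now: np.ndarray,
                             rel: np.ndarray) -> np.ndarray:
    """Re-express the stored relative position after the label (but not the
    patient) has moved: the previous label pose maps `rel` out to the tracker
    frame, and the inverse of the current pose maps it back."""
    site_h = tag_pose_prev @ np.append(rel, 1.0)       # site in tracker frame
    return (np.linalg.inv(tag_pose_now) @ site_h)[:3]  # site in new label frame
```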
The present specification also provides a computer readable storage medium having stored thereon a computer program operable to execute the method of tracking a surgical site of a patient as provided in fig. 1 above.
The present specification also provides a schematic structural diagram of an electronic device, shown in fig. 6. As shown in fig. 6, at the hardware level the electronic device includes a processor, an internal bus, a network interface, a memory, and a non-volatile memory, and may of course also include hardware required by other services. The processor reads the corresponding computer program from the non-volatile memory into the memory and then runs it, so as to implement the method of tracking the surgical position of a patient described in fig. 1 above. Of course, besides a software implementation, the present specification does not exclude other implementations, such as logic devices or a combination of software and hardware; that is, the execution subject of the processing flow is not limited to logic units and may also be hardware or logic devices.
In the 1990s, an improvement to a technology could clearly be distinguished as either an improvement in hardware (for example, an improvement to circuit structures such as diodes, transistors, or switches) or an improvement in software (an improvement to a method flow). As technology has developed, however, many of today's method-flow improvements can be regarded as direct improvements to hardware circuit structures: designers almost always obtain the corresponding hardware circuit structure by programming the improved method flow into a hardware circuit. It therefore cannot be said that an improvement to a method flow cannot be realized by a hardware entity module. For example, a Programmable Logic Device (PLD), such as a Field Programmable Gate Array (FPGA), is an integrated circuit whose logic functions are determined by the user's programming of the device. A designer "integrates" a digital system onto a single PLD by programming it, without asking a chip manufacturer to design and fabricate an application-specific integrated circuit chip. Moreover, this programming is now mostly implemented with "logic compiler" software rather than by manually making integrated circuit chips; such software is similar to the compilers used in program development, and the source code to be compiled must be written in a specific programming language called a Hardware Description Language (HDL). There is not just one HDL but many, such as ABEL (Advanced Boolean Expression Language), AHDL (Altera Hardware Description Language), Confluence, CUPL (Cornell University Programming Language), HDCal, JHDL (Java Hardware Description Language), Lava, Lola, MyHDL, PALASM, and RHDL (Ruby Hardware Description Language), of which VHDL (Very-High-Speed Integrated Circuit Hardware Description Language) and Verilog are currently the most commonly used. It should also be clear to those skilled in the art that a hardware circuit implementing a logical method flow can easily be obtained merely by slightly logic-programming the method flow in one of the above hardware description languages and programming it into an integrated circuit.
The controller may be implemented in any suitable manner. For example, the controller may take the form of a microprocessor or processor together with a computer-readable medium storing computer-readable program code (e.g., software or firmware) executable by the (micro)processor, logic gates, switches, an Application Specific Integrated Circuit (ASIC), a programmable logic controller, or an embedded microcontroller; examples of controllers include, but are not limited to, the ARC 625D, Atmel AT91SAM, Microchip PIC18F26K20, and Silicon Labs C8051F320 microcontrollers, and a memory controller may also be implemented as part of the control logic of the memory. Those skilled in the art also know that, in addition to implementing a controller purely as computer-readable program code, the same functionality can be implemented entirely by logic-programming the method steps so that the controller takes the form of logic gates, switches, application-specific integrated circuits, programmable logic controllers, embedded microcontrollers, and the like. Such a controller may therefore be regarded as a hardware component, and the means included within it for performing the various functions may also be regarded as structures within the hardware component; or the means for performing the various functions may even be regarded both as software modules implementing the method and as structures within the hardware component.
The systems, devices, modules or units illustrated in the above embodiments may be implemented by a computer chip or an entity, or by a product with certain functions. One typical implementation device is a computer. In particular, the computer may be, for example, a personal computer, a laptop computer, a cellular telephone, a camera phone, a smartphone, a personal digital assistant, a media player, a navigation device, an email device, a game console, a tablet computer, a wearable device, or a combination of any of these devices.
For convenience of description, the above device is described with its functions divided into various units. Of course, when implementing the present specification, the functions of the units may be realized in one or more pieces of software and/or hardware.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, system, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In a typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include volatile memory in a computer-readable medium, such as Random Access Memory (RAM), and/or non-volatile memory, such as Read-Only Memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media include permanent and non-permanent, removable and non-removable media, and may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, program modules, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technologies, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory computer-readable media, such as modulated data signals and carrier waves.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but also other elements not expressly listed, or elements inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
As will be appreciated by one skilled in the art, embodiments of the present description may be provided as a method, system, or computer program product. Accordingly, the description may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the description may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
This description may be described in the general context of computer-executable instructions, such as program modules, being executed by a computer. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. The specification may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules may be located in both local and remote computer storage media including memory storage devices.
The embodiments in the present specification are described in a progressive manner, and the same and similar parts among the embodiments are referred to each other, and each embodiment focuses on the differences from the other embodiments. In particular, for the system embodiment, since it is substantially similar to the method embodiment, the description is simple, and for the relevant points, reference may be made to the partial description of the method embodiment.
The above description is only an example of the present specification, and is not intended to limit the present specification. Various modifications and alterations to this description will become apparent to those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present specification should be included in the scope of the claims of the present specification.

Claims (11)

1. A method of tracking a surgical site of a patient, comprising:
predetermining, by an optical tracking device, the relative pose between a surgical position of a patient and an optical calibration label disposed on the patient;
in the patient operation process, acquiring a depth image of a patient operation area at the current moment through a depth device, and respectively determining the range of the patient operation area at the current moment and the pose of an optical calibration label;
determining a first difference according to the historically determined range of the patient surgical area and the range of the patient surgical area at the current moment, and determining a second difference according to the historically determined pose of the optical calibration label and the pose of the optical calibration label at the current moment;
judging whether the difference between the first difference and the second difference is smaller than a preset first difference threshold or not;
if so, determining the coordinates of the surgical position of the patient according to the relative pose and the pose of the optical calibration label acquired by the optical tracking device at the current moment;
if not, re-determining, through the optical tracking device, the relative pose between the surgical position of the patient and the optical calibration label.
2. The method of claim 1, wherein predetermining, by the optical tracking device, the relative pose between the surgical position of the patient and the optical calibration label disposed on the patient specifically comprises:
respectively determining the pose of an optical calibration label arranged on a patient and the position of the optical calibration label arranged at the operation position of the patient through an optical tracking device;
taking the position of the optical calibration label arranged at the surgical position of the patient as the coordinate of the surgical position of the patient;
and determining the relative pose between the surgical position of the patient and the optical calibration label arranged on the patient according to the pose of the optical calibration label arranged on the patient and the coordinates of the surgical position of the patient, wherein the relative pose comprises a relative position and a relative attitude.
3. The method as claimed in claim 1, wherein the acquiring of the depth image of the surgical area of the patient at the current time by the depth device, and the determining of the range of the surgical area of the patient and the pose of the optical calibration tag at the current time respectively, specifically comprises:
according to a depth image of a patient operation area at the current moment acquired by a depth device, carrying out target object identification on the depth image at the current moment, and respectively determining point cloud data of the patient operation area and point cloud data of the optical calibration label;
and respectively determining the range of the surgical area of the patient and the pose of the optical calibration label according to the point cloud data of the surgical area of the patient and the point cloud data of the optical calibration label.
4. The method of claim 1, wherein determining the first difference according to the historically determined range of the patient surgical area and the range of the patient surgical area at the current moment specifically comprises:
respectively determining the historical center position of the patient surgical area and its center position at the current moment according to the historically determined range of the patient surgical area and its range at the current moment;
and determining the first difference according to the historical center position and the center position at the current moment.
5. The method of claim 1, wherein prior to determining the coordinates of the surgical site of the patient from the relative pose and the pose of the optical calibration tag acquired by the optical tracking device at the current time, the method further comprises:
determining, from the depth images corresponding to historical moments acquired by the depth device, a depth image in which the patient surgical area is not deformed, as a reference depth image;
determining point cloud data of a patient operation area of the depth image at the current moment as first point cloud, and determining point cloud data of the patient operation area in the reference depth image as second point cloud;
judging, according to the similarity between the first point cloud and the second point cloud, whether the patient surgical area is undeformed;
if so, continuing to determine the coordinates of the surgical position of the patient;
if not, re-determining the relative pose between the surgical position of the patient and the optical calibration label.
6. The method of claim 1, wherein prior to determining the coordinates of the surgical site of the patient from the relative pose and the pose of the optical calibration tag acquired by the optical tracking device at the current time, the method further comprises:
determining the pose of the optical calibration label acquired by the optical tracking device at the current moment;
determining a third difference according to the pose of the optical calibration label at the current moment and the historically determined pose of the optical calibration label;
judging whether the third difference is smaller than a preset second difference threshold value or not;
if so, continuing to determine the coordinates of the surgical position of the patient;
if not, re-determining the relative pose between the surgical position of the patient and the optical calibration label.
7. The method of claim 1, wherein re-determining, by the optical tracking device, the relative pose of the patient surgical site and the optical calibration label comprises:
when the first difference is smaller than a preset third difference threshold value and the second difference is not smaller than the third difference threshold value, determining the position of a cursor anchor point connecting the optical calibration label and the surgical area of the patient at the current moment according to the depth image at the current moment;
determining the historical position of the cursor anchor point from historically acquired depth images;
determining a fourth difference according to the position of the cursor anchor point at the current moment and the position of the cursor anchor point at the historical moment;
judging whether the fourth difference is smaller than a preset fourth difference threshold value or not;
if so, updating the relative pose according to the pose of the optical calibration label acquired by the optical tracking device at the current moment, the pose of the optical calibration label at the moment immediately preceding the current moment, and the relative pose;
if not, re-determining the relative pose between the surgical position of the patient and the optical calibration label.
8. A patient surgical position tracking apparatus, characterized in that the apparatus comprises an optical tracking sensor, a depth sensor and a processing unit, wherein:
the optical tracking sensor is used for acquiring the pose of the optical calibration label at the current moment and sending the pose of the optical calibration label at the current moment to the processing unit;
the depth sensor is used for acquiring a depth image of a patient operation area at the current moment in the patient operation process, respectively determining the range of the patient operation area at the current moment and the pose of the optical calibration label, and sending the range of the patient operation area at the current moment and the pose of the optical calibration label to the processing unit;
the processing unit is used for determining the relative pose between the surgical position of the patient and the optical calibration label arranged on the patient according to the surgical position of the patient and the pose of the optical calibration label sent by the optical tracking sensor; determining a first difference according to the received historically determined range of the patient surgical area and the received range of the patient surgical area at the current moment; determining a second difference according to the historically determined pose of the optical calibration label and the received pose of the optical calibration label at the current moment; judging whether the difference between the first difference and the second difference is smaller than a preset first difference threshold or not; if so, determining the coordinates of the surgical position of the patient according to the relative pose and the received pose of the optical calibration label sent by the optical tracking sensor at the current moment; and if not, acquiring the coordinates of the surgical position of the patient from the optical tracking sensor and re-determining the relative pose between the surgical position of the patient and the optical calibration label.
9. An apparatus for tracking a surgical site of a patient, the apparatus comprising:
the first determining module is used for determining in advance, through an optical tracking device, the relative pose between the surgical position of the patient and an optical calibration label arranged on the patient;
the second determination module is used for acquiring a depth image of the patient operation area at the current moment through depth equipment in the patient operation process, and respectively determining the range of the patient operation area at the current moment and the pose of the optical calibration label;
the third determining module is used for determining the first difference according to the historically determined range of the patient surgical area and the range of the patient surgical area at the current moment, and determining the second difference according to the historically determined pose of the optical calibration label and the pose of the optical calibration label at the current moment;
the judging module is used for judging whether the difference between the first difference and the second difference is smaller than a preset first difference threshold or not;
the first execution module is used for, if so, determining the coordinates of the surgical position of the patient according to the relative pose and the pose of the optical calibration label acquired by the optical tracking device at the current moment;
and the second execution module is used for, if not, re-determining, through the optical tracking device, the relative pose between the surgical position of the patient and the optical calibration label.
10. A computer-readable storage medium, characterized in that the storage medium stores a computer program which, when executed by a processor, implements the method of any of the preceding claims 1 to 7.
11. An electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the processor implements the method of any of claims 1 to 7 when executing the program.
CN202111334581.9A 2021-11-11 2021-11-11 Patient operation position tracking method, equipment and device Active CN114129262B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202111334581.9A CN114129262B (en) 2021-11-11 2021-11-11 Patient operation position tracking method, equipment and device
CN202311464902.6A CN117323007A (en) 2021-11-11 2021-11-11 Patient operation position tracking method, equipment and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111334581.9A CN114129262B (en) 2021-11-11 2021-11-11 Patient operation position tracking method, equipment and device

Related Child Applications (1)

Application Number Title Priority Date Filing Date
CN202311464902.6A Division CN117323007A (en) 2021-11-11 2021-11-11 Patient operation position tracking method, equipment and device

Publications (2)

Publication Number Publication Date
CN114129262A true CN114129262A (en) 2022-03-04
CN114129262B CN114129262B (en) 2023-12-22

Family

ID=80392946

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202111334581.9A Active CN114129262B (en) 2021-11-11 2021-11-11 Patient operation position tracking method, equipment and device
CN202311464902.6A Pending CN117323007A (en) 2021-11-11 2021-11-11 Patient operation position tracking method, equipment and device

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN202311464902.6A Pending CN117323007A (en) 2021-11-11 2021-11-11 Patient operation position tracking method, equipment and device

Country Status (1)

Country Link
CN (2) CN114129262B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115089293A (en) * 2022-07-04 2022-09-23 山东大学 Calibration method for spinal endoscopic surgical robot

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103142313A (en) * 2013-03-19 2013-06-12 张巍 Surgical operation tool position-pose real-time detection method and system based on monocular vision
US20160367321A1 (en) * 2015-03-02 2016-12-22 Navigate Surgical Technologies, Inc. Surgical location monitoring system and method with surgical instrument guidance and graphic user interface
US20170273595A1 (en) * 2014-09-19 2017-09-28 Koh Young Technology Inc. Optical tracking system and coordinate registration method for optical tracking system
US20180235714A1 (en) * 2017-02-21 2018-08-23 Synaptive Medical (Barbados) Inc. Method, system and apparatus for maintaining patient registration in a surgical navigation system
US20190000556A1 (en) * 2017-06-30 2019-01-03 Gal Sela Medical electronic device with multi-tracking cameras
WO2020027377A1 (en) * 2018-07-31 2020-02-06 서울대학교산학협력단 Device for providing 3d image registration and method therefor
CN111202583A (en) * 2020-01-20 2020-05-29 上海奥朋医疗科技有限公司 Method, system and medium for tracking movement of surgical bed
US20210104062A1 (en) * 2019-10-08 2021-04-08 Samsung Electronics Co., Ltd. Method and apparatus with pose tracking
WO2021194803A1 (en) * 2020-03-24 2021-09-30 Intuitive Surgical Operations, Inc. Systems and methods for registering an instrument to an image using point cloud data and endoscopic image data


Also Published As

Publication number Publication date
CN117323007A (en) 2024-01-02
CN114129262B (en) 2023-12-22

Similar Documents

Publication Publication Date Title
KR20180008221A (en) Method and device for acquiring image and recordimg medium thereof
WO2021169420A1 (en) Visual positioning on basis of multiple image frames
CN113589473A (en) Focusing method, device and equipment of lens module
CN111288971B (en) Visual positioning method and device
CN114129262B (en) Patient operation position tracking method, equipment and device
CN115600157B (en) Data processing method and device, storage medium and electronic equipment
CN116740361B (en) Point cloud segmentation method and device, storage medium and electronic equipment
CN111458030B (en) Infrared human body temperature measurement calibration method and device
CN112308113A (en) Target identification method, device and medium based on semi-supervision
CN116188971A (en) Robot character recognition method, device and storage medium
CN116309823A (en) Pose determining method, pose determining device, pose determining equipment and storage medium
CN112861831A (en) Target object identification method and device, storage medium and electronic equipment
CN113674424A (en) Method and device for drawing electronic map
CN113486775A (en) Target tracking method, system, electronic equipment and storage medium
CN112362084A (en) Data calibration method, device and system
CN116342888B (en) Method and device for training segmentation model based on sparse labeling
CN113885513A (en) Medical equipment position placing method, system and device
CN114332189A (en) High-precision map construction method and device, storage medium and electronic equipment
CN114187355A (en) Image calibration method and device
CN116740114B (en) Object boundary fitting method and device based on convex hull detection
CN116740197B (en) External parameter calibration method and device, storage medium and electronic equipment
CN117911528A (en) Depth camera calibration method, device, storage medium and equipment
CN117726907B (en) Training method of modeling model, three-dimensional human modeling method and device
CN115862668B (en) Method and system for judging interactive object based on sound source positioning by robot
CN116152246B (en) Image recognition method, device, equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant