WO2023274100A1 - Pose control method, and optical navigation system and surgical robot system using the same

Pose control method, and optical navigation system and surgical robot system using the same

Info

Publication number: WO2023274100A1
Application number: PCT/CN2022/101378
Authority: WO (WIPO, PCT)
Prior art keywords: optical, tracking system, monitoring, optical tracking, area
Other languages: English (en), French (fr)
Inventors: 刘赫, 何锐, 邵辉
Original assignee: 苏州微创畅行机器人有限公司
Application filed by 苏州微创畅行机器人有限公司
Publication of WO2023274100A1

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B 34/30 Surgical robots
    • A61B 34/32 Surgical robots operating autonomously
    • A61B 34/70 Manipulators specially adapted for use in surgery
    • A61B 2034/2046 Tracking techniques
    • A61B 2034/2055 Optical tracking systems
    • A61B 2034/2057 Details of tracking cameras
    • A61B 2034/2065 Tracking using image or pattern recognition
    • A61B 2034/2068 Surgical navigation using pointers, e.g. pointers having reference marks for determining coordinates of body points

Definitions

  • the present application relates to the technical field of medical equipment, in particular to a pose control method and its applicable optical navigation system and surgical robot system.
  • the surgical navigation system accurately corresponds the patient's preoperative or intraoperative image data to the patient's anatomical structure on the operating bed.
  • the surgical navigation system also tracks the surgical instruments during the operation and updates and displays their positions in real time, as virtual instruments overlaid on the patient's images, so that the doctor can see at a glance where the surgical instruments are relative to the patient's anatomical structure, making the surgical operation faster, more precise, and safer.
  • the surgical navigation system provides richer reference information and more accurate guidance for surgical operations through the analysis of patients' medical images and the application of various sensors during surgery, and becomes a powerful tool to assist doctors in completing operations.
  • Optical surgical navigation has high precision, is easy to use, has no radiation, and has little impact on the surgical process, so it has been widely used in surgical navigation systems.
  • the surgical navigation system uses an optical tracking system to track, in real time, optical markers that have a fixed shape and are easy to identify. An optical marker is placed on the target anatomical structure, the optical tracking system tracks the marker, and the real-time position and posture of the target anatomical structure are thereby obtained indirectly.
  • limited by these optical characteristics, tracking a marker requires that no other object lie between the optical tracking system and the marker, and that the marker remain within the field of view of the optical tracking system. This places high demands on the movement of people in the operating room and on the placement of equipment.
  • the purpose of this application is to provide a pose control method and an optical navigation system, surgical robot system, computer equipment, support device, and computer-readable storage medium using the method, so as to solve the problem in prior-art optical surgical navigation systems that the markers are easily blocked during tracking.
  • the present application provides a pose control method of an optical tracking system, including the following steps: acquiring at least one monitoring image of the monitoring area formed between the optical tracking system and the operation area, where the operation area is determined based on the position of at least one optical marker; and, when it is determined according to the monitoring image that there is an occluding object in the monitoring area, adjusting the position and/or posture of the optical tracking system so that the occluding object lies outside the new monitoring area formed between the adjusted optical tracking system and the operation area.
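  • as an illustration only, the method can be read as a simple sense-decide-act loop; the Python sketch below assumes hypothetical helper functions (get_monitoring_image, find_occluder, move_support_device) standing in for the monitoring device, the occlusion detection, and the support-device command interface:

```python
import time

def pose_control_loop(get_monitoring_image, find_occluder, move_support_device,
                      period_s=0.1):
    """Sense-decide-act loop of the pose control method (names illustrative):
    acquire a monitoring image of the area between the optical tracking system
    and the operation area; if an occluding object is detected, command the
    support device so the occluder ends up outside the new monitoring area."""
    while True:
        image = get_monitoring_image()
        occluder = find_occluder(image)      # None when the monitoring area is clear
        if occluder is not None:
            move_support_device(occluder)    # adjust position and/or posture
        time.sleep(period_s)                 # re-check at a fixed refresh rate
```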
  • the operation area is greater than or equal to the spatial range bounded by the position of the at least one optical marker.
  • the monitoring area includes: a space range surrounded by the boundary of the operation area and the viewing angle range of the optical tracking system.
  • the step of determining that there is an occluding object in the monitoring area according to the monitoring image includes at least one of the following:
  • by detecting a foreground image in the image area corresponding to the monitoring area in the monitoring image, it is determined that there is an occluding object in the monitoring area; by detecting changes between the images in the image areas corresponding to the monitoring area in at least two monitoring images, it is determined that there is an occluding object in the monitoring area; when the monitoring image is a depth image, by detecting an entity position in the image area corresponding to the monitoring area in the monitoring image, it is determined that there is an occluding object in the monitoring area; and by detecting a parallax between image data pairs in the image areas corresponding to the monitoring area in at least two of the monitoring images, it is determined that there is an occluding object in the monitoring area.
  • the step of adjusting the position and/or attitude of the optical tracking system includes: adjusting the position and/or attitude of the optical tracking system successively until it is determined, according to the acquired monitoring images, that there is no occluding object in the new monitoring area formed between the adjusted optical tracking system and the operation area.
  • the step of adjusting the position and/or posture of the optical tracking system includes any of the following: adjusting the position and posture of the optical tracking system centering on the operation area; or translating the position of the optical tracking system according to the pose relationship or position relationship between the occluding object and the optical tracking system determined by analyzing the monitoring image.
  • the present application also provides a computer device, including: at least one memory storing at least one computer program; at least one processor executing the computer program to implement the pose control method of the optical tracking system as described above.
  • the present application also provides a supporting device for supporting an optical tracking system, where the optical tracking system is used to acquire position information of an optical marker during a surgical operation; the supporting device includes: at least one joint, where each joint provides movement in at least one degree of freedom; and a controller electrically connected to each joint and used to control the movement of the at least one joint according to received control instructions, where the control instructions come from the computer equipment described above.
  • the computer equipment is built in the supporting device.
  • the present application also provides an optical navigation system, including an image acquisition device that includes a first camera module and a second camera module; the first camera module is used to acquire a positioning image containing at least one optical marker, where the position of the at least one optical marker identifies an operation area; the second camera module is used to acquire the monitoring image corresponding to the monitoring area between the operation area and the first camera module; the optical navigation system further includes the support device described above, which is connected with the image acquisition device.
  • the first camera module and the second camera module have overlapping viewing angle ranges.
  • the present application also provides a surgical robot system, including: an optical navigation system configured to determine the position information of the at least one optical marker according to a captured positioning image containing the at least one optical marker, where the position of the at least one optical marker identifies an operation area; a support device used to assemble the optical navigation system; a monitoring device used to obtain the monitoring image corresponding to the monitoring area between the operation area and the first camera module; a surgical robotic arm used to hold the surgical instrument; and the aforementioned computer equipment, communicatively connected to the supporting device, the optical navigation system, the monitoring device, and the surgical robotic arm. The computer equipment executes the aforementioned pose control method and sends a control instruction to the support device, so that the support device adjusts the position and/or posture of the optical navigation system; the computer equipment is also used to send a control command to the surgical robotic arm according to the position information, so that the surgical robotic arm adjusts the position and/or posture of the assembled surgical instrument.
  • the monitoring device is configured in an optical navigation system.
  • the present application also provides a computer-readable storage medium for storing a computer program, and when the computer program is executed (such as by a processor), the above-mentioned control method is implemented.
  • the pose control method of the optical tracking system and its applicable optical navigation system, surgical robot system, computer equipment, support device and computer-readable storage medium provided by the present application have the following advantages:
  • the pose control method of the optical tracking system provided by this application can determine whether there is an obstacle blocking the optical marker in the monitoring area, identify obstacles of arbitrary shape, and adjust the optical tracking system so that the monitoring area remains unblocked; no additional manual markers are needed, so the method has a wide range of application scenarios.
  • the pose control method of the optical tracking system provided in this application can further plan the obstacle avoidance motion of the optical tracking system according to the position information of the obstacle, and move the optical tracking system according to the planned motion track to avoid the optical marker being blocked.
  • the pose control method of the present application can always ensure that the optical marker to be tracked is within the central range of the field of view of the optical tracking system during the obstacle avoidance movement of the optical tracking system, so as to avoid interruption of the surgical navigation process.
  • the support device provided by this application has a controllable movement function, and automatically moves the optical tracking system to avoid obstacles according to the planned obstacle avoidance movement. Therefore, the support device provided by the present application does not need manual adjustment by the doctor, and can avoid the interruption of the navigation process caused by the marker being blocked.
  • the optical navigation system provided by this application is highly integrated and has an obstacle avoidance function. Therefore, the optical navigation system can solve the problem of occluded objects in the monitoring space by implementing the control method of the optical tracking system of the present application.
  • the surgical robot system of the present application combines stereo vision technology with robot technology to solve the common problem in optical surgical navigation systems of the monitoring area being blocked; the entire navigation adjustment system does not contact patients or medical staff, avoids additional sterilization, and reduces the possibility of infection. Moreover, the obstacle avoidance movement of the surgical robot system of the present application does not change the workflow of the original surgical navigation system, so doctors need no additional software or hardware operations, and the functions of the original surgical navigation system are unaffected. This reduces the learning curve for physicians and improves operating room utilization.
  • FIG. 1 is a schematic flow chart of a pose control method of an optical tracking system in an embodiment of the present application
  • FIG. 2 is a schematic diagram of an orthopedic surgery navigation system for knee replacement in an embodiment of the present application
  • FIG. 3 is a schematic diagram of components of the surgical robot system in Embodiment 1 of the present application.
  • FIG. 4 is a schematic diagram of determining the monitoring area in the pose control method of the optical tracking system in Embodiment 1 of the present application;
  • FIG. 5 is a schematic diagram of the transformation of the coordinate systems of the three (occluded objects, the operating area, and the optical tracking system) in the pose control method of the optical tracking system in Embodiment 1 of the present application;
  • FIG. 6 is a schematic diagram of obstacles blocking optical markers in the pose control method of the optical tracking system in Embodiment 1 of the present application;
  • FIG. 7 is a schematic diagram of the obstacle avoidance movement of the optical tracking system in the pose control method of the optical tracking system in Embodiment 1 of the present application;
  • FIG. 8 is a schematic diagram of the coordinate system transformation of the optical tracking system obstacle avoidance movement in the pose control method of the optical tracking system in Embodiment 1 of the present application;
  • FIG. 9 is a schematic diagram of the movement of the mechanical arm during the obstacle avoidance movement of the optical tracking system in the pose control method of the optical tracking system in Embodiment 1 of the present application;
  • FIG. 10 is a schematic diagram of the installation of binocular cameras in the pose control method of the optical tracking system in Embodiment 2 of the present application;
  • FIG. 11 is a schematic diagram of imaging of binocular cameras in the pose control method of the optical tracking system in Embodiment 2 of the present application;
  • FIG. 12 is a schematic diagram of the imaging of the monitoring area on the binocular camera in the pose control method of the optical tracking system in Embodiment 2 of the present application;
  • FIG. 13 is a schematic diagram of the principle of judging whether there is an obstacle in the monitoring area in the pose control method of the optical tracking system in Embodiment 2 of the present application;
  • FIG. 14 is a schematic diagram of the obstacle avoidance movement of the optical tracking system in the imaging space when there is a single obstacle in the monitoring area in the pose control method of the optical tracking system in Embodiment 2 of the present application;
  • FIG. 15 is a schematic diagram of the obstacle avoidance movement of the optical tracking system in the imaging space when there are multiple obstacles in the monitoring area in the pose control method of the optical tracking system in Embodiment 2 of the present application;
  • FIG. 16 is a schematic diagram of the motion parameter transformation of the optical tracking system in each coordinate system in the pose control method of the optical tracking system in Embodiment 2 of the present application.
  • the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that includes a set of elements includes not only those elements but also other elements not expressly listed, or elements inherent in such a process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of additional identical elements in the process, method, article, or apparatus comprising said element.
  • in the prior art, the optical tracking system can easily become unable to track markers because of obstructions, interrupting the entire navigation process; an obstacle can be any person or object between the optical tracking system and the marker, and it is difficult to identify it and obtain its position in real time; and when the optical tracking system is moved, the marker can easily move out of the field of view, causing marker tracking to fail.
  • One of the core ideas of the present application is to provide a pose control method of an optical tracking system to solve the problem that the optical tracking system of a prior-art optical surgical navigation system is blocked when tracking markers, which affects the efficiency and fluency of the operation.
  • the present application provides a pose control method of an optical tracking system, as shown in FIG. 1 , including the following steps:
  • the monitoring image of the monitoring space is obtained in real time, and whether there is an occluding object (that is, an obstacle) in the monitoring space is judged according to the monitoring image.
  • the pose of the optical tracking system can be adjusted to control the movement of the optical tracking system to avoid the obstacle and prevent the optical marker from being blocked. In this way, the problem that the markers are blocked and affect the operation efficiency and fluency is overcome.
  • the operation region of the present application is greater than or equal to the spatial range bounded by the position of the at least one optical marker.
  • a minimum enclosing sphere may be obtained according to the position of the at least one optical marker, and all optical markers are located within the range of the minimum enclosing sphere.
  • the operation area may be the area of the minimum enclosing sphere, or an area slightly larger than the area of the minimum enclosing sphere.
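  • one way to obtain such a minimum enclosing sphere is Ritter's approximate bounding-sphere algorithm; the sketch below (an illustration, not code from the patent) computes a center and radius enclosing all marker positions:

```python
import numpy as np

def bounding_sphere(points):
    """Approximate minimum enclosing sphere (Ritter's algorithm).
    points: (N, 3) array of optical-marker positions; returns (center, radius)."""
    pts = np.asarray(points, dtype=float)
    p0 = pts[0]
    p1 = pts[np.argmax(np.linalg.norm(pts - p0, axis=1))]   # farthest from p0
    p2 = pts[np.argmax(np.linalg.norm(pts - p1, axis=1))]   # farthest from p1
    center = (p1 + p2) / 2.0
    radius = np.linalg.norm(p2 - p1) / 2.0
    for p in pts:                    # grow the sphere until every marker fits
        d = np.linalg.norm(p - center)
        if d > radius:
            radius = (radius + d) / 2.0
            center += (1.0 - radius / d) * (p - center)
    return center, radius
```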
  • the monitoring area may be determined according to the operation area, wherein the monitoring area may be a spatial range enclosed by the boundary of the operation area and the viewing angle range of the optical tracking system.
  • the monitoring area may be a cylindrical space enclosed by the viewing angle range of the optical tracking system and the minimum enclosing sphere mentioned above.
  • for safety reasons, the monitoring area may also be set as a spatial range slightly larger than this cylindrical spatial range.
  • the step of determining that there is an occluding object in the monitoring area according to the monitoring image may include at least one of the following:
  • the position of each surgical instrument in the entire monitoring area can be determined before the operation, and the standard image corresponding to the monitoring area when there is no occlusion can be obtained, and the standard image can be used as the background image.
  • monitoring images are acquired in real time. According to the background image, if the foreground image can be extracted from the image area corresponding to the monitoring area in the monitoring image, it is determined that there is an obstacle in the monitoring area; otherwise, there is no obstacle.
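  • a minimal sketch of this background-subtraction test, assuming OpenCV and a region-of-interest mask for the image area corresponding to the monitoring area (the threshold and minimum foreground area are illustrative tuning values):

```python
import cv2
import numpy as np

def has_obstacle(standard_img, monitoring_img, roi_mask, thresh=30, min_area=500):
    """Return True if a foreground object appears inside the monitoring-area ROI.
    standard_img: unoccluded reference image captured before the operation."""
    diff = cv2.absdiff(cv2.cvtColor(standard_img, cv2.COLOR_BGR2GRAY),
                       cv2.cvtColor(monitoring_img, cv2.COLOR_BGR2GRAY))
    _, fg = cv2.threshold(diff, thresh, 255, cv2.THRESH_BINARY)
    fg = cv2.bitwise_and(fg, fg, mask=roi_mask)         # keep only the monitoring area
    fg = cv2.morphologyEx(fg, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    return cv2.countNonZero(fg) > min_area              # foreground found -> obstacle
```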
  • changes between successive monitoring images can also indicate that an occluding object exists in the monitoring area. For example, two monitoring images with a certain time difference are acquired and compared, and the image changes in the image areas corresponding to the monitoring area are observed. If an occluding object has entered the monitoring area, the later image will contain occlusions that the earlier image does not, and it can thereby be determined whether there is an obstacle in the monitoring area.
  • when the monitoring image is a depth image, it is determined that there is an occluding object in the monitoring area by detecting an entity position in the image area corresponding to the monitoring area in the monitoring image.
  • the entity position refers to the position of the blocking object in the monitoring area.
  • a monitoring image of the monitoring area may be collected. The monitoring image at this time can directly display the position of each object in the monitoring area, and can intuitively judge whether there is an occluded object in the monitoring area.
  • the image data pair represents matching image data of the same object in the monitoring area described by different monitoring images. For example, two monitoring images with parallax are taken by a binocular camera device, and the two monitoring images are then compared to determine whether there are obstacles in the monitoring area.
  • the step of adjusting the position and/or posture of the optical tracking system may include: according to the posture relationship or relative positional relationship between the corresponding occluding object and the optical tracking system in the monitoring image , adjusting the position and/or attitude of the optical tracking system.
  • the trajectory of the optical tracking system can be planned according to the position information of obstacles in the monitoring space, and then the optical tracking system can be quantitatively adjusted to avoid obstacles.
  • in this way, the movement of the optical tracking system is no longer an unplanned random movement; its movement track is precisely controlled, ensuring that the optical markers to be tracked remain in the field of view of the optical tracking system throughout the movement, preventing interruption of the surgery.
  • besides the above-mentioned solutions, the optical tracking system can also be adjusted successively, for example according to a preset adjustment unit, until it is determined from the newly acquired monitoring image that there is no occluding object in the monitoring area.
  • the preset adjustment unit includes but not limited to a set length and/or a set angle.
  • the optical tracking system can be adjusted in translation according to the set length, or the optical tracking system can be adjusted around the operation area according to the set angle.
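  • such successive adjustment amounts to a bounded search; the sketch below assumes a hypothetical support-device command orbit_around_operation_area and an occlusion test occluded:

```python
def adjust_until_clear(support_device, get_monitoring_image, occluded,
                       step_angle_deg=2.0, max_steps=45):
    """Adjust by a preset unit (here a set angle) until no occluding object
    remains in the new monitoring area; give up after a bounded number of steps."""
    for _ in range(max_steps):
        if not occluded(get_monitoring_image()):
            return True                                   # monitoring area is clear
        support_device.orbit_around_operation_area(step_angle_deg)
    return False
```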
  • An example of a manner of adjusting the position and/or posture of the optical tracking system includes: adjusting the position and posture of the optical tracking system centering on the operation area. For example, with the distance between the optical tracking system and the operating area as the radius and the center of the operating area as the center, plan an arc-shaped route, and adjust the position and posture of the optical tracking system accordingly.
  • the way of adjusting the position and/or posture of the optical tracking system also includes, for example: translating the position of the optical tracking system according to the pose relationship or position relationship between the occluding object and the optical tracking system determined by analyzing the monitoring image. For example, when analysis of the monitoring image determines that an occluding object blocks the edge of the operation area, translating the optical tracking system changes the positional relationship among the occluding object, the operation area, and the optical tracking system, so that no occluding object remains in the monitoring area of the optical tracking system.
  • each of the above adjustment methods can be selected and used based on analysis of the monitoring image, or the two adjustment methods can be combined to plan the adjustment route: for example, a path for the optical tracking system that includes both translation and arc segments, or a non-circular arc optimized from translation and arc motions.
  • the control method of the optical tracking system provided by the present application will be further described in detail below in conjunction with a specific operation example.
  • the operation takes an orthopedic surgery navigation system for knee joint replacement as an example.
  • the obstacle avoidance method and system provided in the present application are not limited to the orthopedic surgery navigation system for knee joint replacement, and can also be applied to other optical surgery navigation systems.
  • the surgical navigation system may specifically include: a surgical trolley 1; a surgical robotic arm 2 and the various surgical instruments installed on it, such as the osteotomy guide tool 4 and the oscillating saw 5 shown in FIG. 2; the operating table 16; the patient 17 on the operating table 16 (the site of the patient 17 to be operated on, e.g., in orthopedic surgery for knee replacement, includes the femur 12 and tibia 14); and various optical markers.
  • Optical markers are mainly divided into two categories. One category is set on the patient's site to be operated on, such as the femur marker 11 and tibia marker 13 set on the femur 12 and tibia 14 respectively; these markers identify the position of the surgical site so that it can be tracked conveniently. The other category is set on the surgical equipment, such as the base marker 15 on the operating trolley 1; these markers identify the position of the surgical robot to facilitate tracking of surgical instruments and the like.
  • the surgical navigation system also includes a navigation trolley 9 and an optical tracking system 6 installed on the navigation trolley 9 .
  • the optical tracking system 6 is used to track the real-time positions of the aforementioned optical markers.
  • a computer system is also provided on the navigation trolley 9 for overall control, which may include a main display 8 disposed on the navigation trolley 9 , a keyboard 10 and a controller located in the navigation trolley 9 .
  • an auxiliary display 7 can also be added to facilitate multi-person operation.
  • the main steps of using the knee replacement navigation robot system are as follows:
  • the operating trolley 1 and the navigation trolley 9 are placed at appropriate positions beside the operating table 16.
  • a femoral marker 11 and a tibial marker 13 are installed on the site to be operated on of the patient 17 such as the femur 12 and the tibia 14 .
  • Surgical instruments such as the surgical robotic arm 2, the osteotomy guide tool 4, and the oscillating saw 5 are installed on the operating trolley 1, and the base marker 15 and tool marker 3 are installed at the corresponding positions on the operating trolley 1 and its accessories; other necessary surgical utensils, such as sterile bags, can also be placed on the operating trolley 1 at the same time.
  • Preoperative planning mainly includes the coordinates of the osteotomy plane, the model of the prosthesis, and the installation orientation of the prosthesis.
  • the physician then uses the optical tracking probe to identify feature points on the femur 12 and tibia 14 of the patient 17.
  • the optical tracking system 6 uses the femoral marker 11 and the tibial marker 13 as references to respectively record the positions of the patient's bone feature points and send the positions of the bone feature points to the computer.
  • the computer obtains the correspondence between the actual orientations of the femur 12 and tibia 14 and their orientations in the CT image through a feature-matching algorithm, and links the actual orientations of the femur 12 and tibia 14 with the corresponding markers installed on them, so that the femoral marker 11 and the tibial marker 13 can track the actual position of the bones in real time (during the operation, as long as the relative position between a marker and the bone is fixed, movement of the bone will not affect the surgical result).
  • the surgical robotic arm 2 locates the osteotomy plane through the tool marker 3 and moves to the predetermined position, and the doctor can use the oscillating saw 5 or an electric drill to perform osteotomy and drilling operations through the osteotomy guide groove and guide hole of the osteotomy guide tool 4. After the osteotomy and drilling operations are completed, the doctor can install the prosthesis and perform other operations.
  • the optical tracking system 6 is required to always be able to obtain the pose information of the relevant markers, otherwise the entire navigation process will be terminated.
  • this application proposes a pose control method of the optical tracking system, which can effectively solve the problem that the operation is interrupted due to the blocked optical markers. Moreover, according to different positioning devices used to collect monitoring images of the monitoring area, the present application provides the following two specific embodiment solutions to solve the problem of the optical marker being blocked.
  • a positioning device is used to obtain at least one monitoring image of the monitoring area.
  • the positioning device is described in detail as a depth camera.
  • the positioning device uses a depth camera 19 to detect the monitoring area, and the specific pose control method can be as follows:
  • S1 Set and fix the optical markers, where the optical markers may include the femoral marker 11, the tibial marker 13, the tool marker 3, and the base marker 15 set at the corresponding positions as described above; and install the optical tracking system 6 on a supporting device, where the supporting device may be a mechanical arm 18 or another movable platform. This embodiment takes the mechanical arm 18 as an example of the supporting device;
  • S2 The optical tracking system 6 tracks the position information of the optical markers in real time, and a monitoring area is determined according to the positions of the optical tracking system 6 and the optical markers;
  • specifically, a minimum enclosing sphere 20 is obtained according to the position information of the multiple optical markers, as shown in FIG. 4, with all of the optical markers inside the spatial range of the minimum enclosing sphere 20; the mechanical arm 18 drives the optical tracking system 6 to move so that the optical axis 21 of the optical tracking system 6 passes through the center O of the minimum enclosing sphere 20, where the optical axis 21 of the optical tracking system 6 generally refers to the central axis of symmetry of its imaging.
  • the operation area can be the space formed by the minimum enclosing sphere 20, or a larger spatial area containing the minimum enclosing sphere 20; this Embodiment 1 takes the operation area as the space enclosed by the minimum enclosing sphere 20 as an example.
  • the step of determining the monitoring area specifically includes: taking the optical axis 21 of the optical tracking system 6 as the central axis, the farthest distance from the minimum enclosing sphere 20 to the optical tracking system 6 along the direction of the central axis as the height H, and a distance threshold L from the optical axis 21 of the optical tracking system 6 as the radius, the cylindrical space 22 so determined is the monitoring area. Objects in this space will be treated as occluding objects (i.e. obstacles); the radius L of the cylindrical space 22 is not less than the radius of the minimum enclosing sphere 20, and the actual size of L can be set according to monitoring needs.
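  • a point-in-cylinder test of this kind can be written directly from the definition; in the sketch below (an illustration, not the patent's code), c is the optical-axis center position of the optical tracking system 6, the axis direction points from c toward the sphere center O, H is the height, and L the radius threshold:

```python
import numpy as np

def in_monitoring_space(p, c, axis_dir, height_h, radius_l):
    """True if point p lies inside the cylindrical monitoring space 22."""
    v = np.asarray(p, float) - np.asarray(c, float)
    t = float(np.dot(v, axis_dir))              # coordinate along the optical axis 21
    if t < 0.0 or t > height_h:
        return False                            # outside the cylinder's height H
    radial = np.linalg.norm(v - t * np.asarray(axis_dir))
    return radial <= radius_l                   # within the distance threshold L
```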
  • S3 Install a positioning device, that is, install a depth camera 19 at a suitable position, as shown in FIGS. 3 and 4, so that the field of view of the depth camera 19 covers the entire monitoring area.
  • the image captured by the depth camera 19 contains information of the entire monitoring area, and also includes the optical tracking system 6 and the above-mentioned minimum enclosing sphere 20, and their position information.
  • the image captured by the depth camera 19 also includes the obstacle and its position information.
  • FIG. 5 is a diagram of the relationship between the spatial coordinate systems of various parts of the system in the solution using the depth camera 19 .
  • S4 Establish the transformation relationships among the coordinate systems: the transformation relationship between the coordinate system A of the optical tracking system 6 and the coordinate system R of the mechanical arm 18 can be calculated through calibration and the kinematic information of the mechanical arm 18.
  • the coordinate system B of the depth camera 19 can also be linked with the coordinate system A of the optical tracking system 6 through calibration. In this way, the coordinate system of the whole system can be unified.
  • the position of the optical marker is measured by the optical tracking system 6
  • the position of the obstacle is measured by the depth camera 19 .
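  • with homogeneous 4x4 transforms, this unification is a matter of chaining matrices; the sketch below uses placeholder calibration values and illustrative names:

```python
import numpy as np

T_R_A = np.eye(4)   # pose of tracking-system frame A in robot frame R (calibration + kinematics)
T_A_B = np.eye(4)   # pose of depth-camera frame B in frame A (calibration); placeholders here

def points_B_to_A(points_B):
    """Map an (N, 3) point cloud from depth-camera frame B into frame A."""
    pts = np.hstack([np.asarray(points_B, float), np.ones((len(points_B), 1))])
    return (T_A_B @ pts.T).T[:, :3]

T_R_B = T_R_A @ T_A_B   # chaining gives the depth camera's pose in the robot frame
```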
  • S5 The depth camera 19 acquires monitoring images within its field of view in real time, and judges whether there is an obstacle in the monitoring area according to the monitoring images; if there is an obstacle, the position information of the obstacle in the monitoring space is obtained.
  • the step of judging whether there is an obstacle in the monitoring space according to the monitoring image may specifically include: transforming the point cloud of the monitoring image acquired by the depth camera 19 into the coordinate system A of the optical tracking system 6; then, in the coordinate system A, comparing the positional relationship between each point of the point cloud and the monitoring space. When part of the point cloud lies inside the monitoring space, it is determined that there is an obstacle in the monitoring space; as shown in FIG. 6, the partial point cloud 23 in the figure represents an obstacle falling within the monitoring space. When no part of the point cloud lies inside the monitoring space, it is determined that there is no obstacle in the monitoring space;
  • Obtaining the position information of the obstacle in the monitoring space specifically includes: taking the position of the point P in the partial point cloud 23 that is closest to the optical axis 21 of the optical tracking system 6 as the position information of the obstacle in the monitoring space.
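  • finding the point P is a nearest-point query against the optical axis; a sketch under the same assumptions as above (the point cloud is already filtered to the monitoring space and expressed in frame A):

```python
import numpy as np

def obstacle_point(cloud_a, c, axis_dir):
    """Return the point P of the in-space point cloud closest to the optical axis 21.
    c: a point on the axis; axis_dir: unit direction of the optical axis."""
    v = np.asarray(cloud_a, float) - np.asarray(c, float)
    t = v @ axis_dir                                     # axial coordinates
    radial = np.linalg.norm(v - np.outer(t, axis_dir), axis=1)
    return np.asarray(cloud_a)[np.argmin(radial)]        # the point P
```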
  • S6 Obtain the movement parameters of the optical tracking system 6 for obstacle avoidance according to the position information of the obstacle, so as to ensure that the optical marker does not move out of the imaging field of view of the optical tracking system 6 while avoiding the obstruction of the obstacle;
  • specifically, according to the sphere center O, the optical-axis center position C of the optical tracking system 6, and the obstacle position P, the vectors $\vec{CO}$ and $\vec{CP}$ and the angle $\theta$ between the two vectors are determined; the motion parameters of the optical tracking system 6 for obstacle avoidance are then obtained from $\vec{CO}$, $\vec{CP}$, and the included angle $\theta$.
  • the movement of the optical tracking system 6 for obstacle avoidance is a rotation around the center position O in the first plane, with rotation direction 24 (that is, counterclockwise, away from the obstacle), and the motion parameters include the rotational angular velocity and the radius of rotation. The magnitude of the angular velocity is determined by the angle $\theta$: the closer the obstacle is to the line-of-sight optical axis 21 of the optical tracking system 6, the greater the probability of the marker being blocked, so the optical tracking system 6 should move away from the obstacle as quickly as possible; that is, the speed of the obstacle avoidance movement is negatively correlated with the size of $\theta$. However, to ensure safety, the speed should be kept within a certain range.
  • the radius of rotation is $r = |\vec{CO}|$, and the first plane is the plane determined by $\vec{CO}$ and $\vec{CP}$.
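  • a sketch of this computation (the gain k, the speed cap w_max, and the sign convention of the rotation sense are assumptions of the sketch, not values from the patent):

```python
import numpy as np

def avoidance_rotation(o, c, p, k=0.1, w_max=0.5):
    """Rotation about the sphere center O in the plane of CO and CP.
    Returns (omega_vec, radius): angular-velocity vector and radius r = |CO|."""
    co, cp = o - c, p - c
    cos_theta = co @ cp / (np.linalg.norm(co) * np.linalg.norm(cp))
    theta = np.arccos(np.clip(cos_theta, -1.0, 1.0))
    w = min(k / max(theta, 1e-6), w_max)     # speed negatively correlated with theta, capped
    n = np.cross(cp, co)                     # normal of the "first plane"; the sign
    n /= np.linalg.norm(n)                   # (rotation sense) is a sketch assumption
    return w * n, float(np.linalg.norm(co))
```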
  • when there are multiple obstacles, the above-mentioned method can be used to first calculate the motion parameters for bypassing each single obstacle, and vector synthesis can then be used to calculate the overall motion parameters.
  • S7 The mechanical arm 18 drives the optical tracking system 6 to avoid obstacles. Specifically, according to the motion parameters of the optical tracking system 6 and the transformation relationship between the coordinate system A of the optical tracking system 6 and the coordinate system R of the mechanical arm 18, the motion parameters of each joint of the mechanical arm 18 are obtained through inverse kinematics calculation, and the motion of each joint of the mechanical arm 18 drives the optical tracking system 6 to avoid obstacles. The above process of S2-S7 is then repeated; for example, the positioning device obtains refreshed monitoring images at a fixed frequency, and the above process of S2-S7 is executed in real time for each monitoring image obtained.
  • each refresh can correspond to new motion parameters; that is, the rotational angular velocity of the optical tracking system 6 is refreshed according to the real-time position of the obstacle, and the system rotates at that angular velocity over the refresh interval, until there are no obstacles in the new monitoring image. In this way, real-time monitoring of obstacles in the monitoring space is realized.
  • Fig. 8 and Fig. 9 show the planning method of the obstacle avoidance motion of the manipulator.
  • during the obstacle avoidance movement, the optical tracking system 6 should keep the line-of-sight optical axis 21 always facing the minimum enclosing sphere 20 of the optical markers (that is, the optical axis 21 passes through the center O of the minimum enclosing sphere 20). Therefore, a sub-coordinate system As, fixed relative to the coordinate system A, is added to the coordinate system A of the optical tracking system 6.
  • the origin of the sub-coordinate system coincides with the center O of the smallest enclosing sphere 20 .
  • the obstacle avoidance motion of the optical tracking system 6 can be expressed as a pure rotation $\vec{\omega}_s$ of the sub-coordinate system As. According to the relative positional relationship between the sub-coordinate system As and the coordinate system A of the optical tracking system 6, the relationship between the motion velocities of the two coordinate systems can be calculated as $\vec{v}_A = \vec{\omega}_s \times \vec{p}$, where $\vec{p}$ is the position vector from the origin of As to the origin of A and $p_x$, $p_y$, $p_z$ are its components in the x, y, and z directions; the forward kinematic equation of the mechanical arm 18 with As as the reference coordinate system is then obtained.
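  • the velocity relation is the standard rigid-body formula; a quick numeric check:

```python
import numpy as np

def frame_a_velocity(omega_s, p):
    """v_A = omega_s x p: linear velocity of frame A induced by a pure rotation
    of the sub-coordinate system As about its origin (the sphere center O)."""
    return np.cross(omega_s, p)

# 0.1 rad/s about z, frame A displaced 0.8 m along x -> v = [0, 0.08, 0] m/s
v = frame_a_velocity(np.array([0.0, 0.0, 0.1]), np.array([0.8, 0.0, 0.0]))
```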
  • Embodiment 2: the positioning device uses a binocular camera to monitor obstacles, and the specific pose control method is as follows:
  • Y1 Similar to S1 in Embodiment 1, first set and fix the optical markers, where the optical markers can include the femoral marker 11, tibial marker 13, tool marker 3, and base marker 15 set at the corresponding positions as described above; and install the optical tracking system 6 on the mechanical arm 18;
  • Y2 Similar to step S2 of Embodiment 1, the optical tracking system 6 tracks the position information of the optical markers in real time, and the monitoring area is determined according to the positions of the optical tracking system 6 and the optical markers;
  • specifically, a minimum enclosing sphere 20 can be obtained according to the position information of the multiple optical markers, with all of the optical markers inside the spatial range of the minimum enclosing sphere 20; the mechanical arm 18 drives the optical tracking system 6 to move so that the optical axis 21 of the optical tracking system 6 passes through the center O of the minimum enclosing sphere; the operation area can be the space formed by the minimum enclosing sphere 20, or a larger spatial area containing the minimum enclosing sphere 20. This embodiment takes the operation area as the space enclosed by the minimum enclosing sphere 20 as an example.
  • the step of determining the monitoring area specifically includes: taking the optical axis 21 of the optical tracking system 6 as the central axis, the farthest distance from the minimum enclosing sphere 20 to the optical tracking system 6 along the direction of the central axis as the height H, and a distance threshold L from the optical axis 21 of the optical tracking system 6 as the radius, the cylindrical space 22 so determined is the monitoring area. Objects in this space will be treated as obstacles; the radius L of the cylindrical space 22 is not less than the radius of the minimum enclosing sphere 20, and the actual size of the distance threshold L can be set according to monitoring needs.
  • Y3 Install a positioning device, that is, install a binocular camera.
  • the two cameras 25 and 26 of the binocular camera are respectively installed on the optical tracking system 6 .
  • the two cameras 25 and 26 are distributed symmetrically with respect to the optical axis 21 of the optical tracking system 6 , and are respectively arranged close to the two sensors of the optical tracking system 6 .
  • the optical axes of the two cameras are parallel. Install the two cameras near the two sensors of the optical tracking system 6 respectively, so that the field of view of the binocular camera can be as close as possible to the field of view of the optical tracking system 6 .
  • due to the special design of the optical tracking system 6, its sensors are mainly used to identify and track specific optical markers, and it is difficult for it to identify other objects, such as obstacles, within the field of view. Therefore, after the binocular camera is installed, the binocular camera can accurately track and identify objects such as obstacles that are within the field of view of the optical tracking system 6 but cannot be accurately identified by it.
  • Y4 Establish the transformation relationships among the coordinate system A of the optical tracking system 6, the coordinate system B of the binocular camera, and the coordinate system R of the mechanical arm 18: the transformation relationship between the coordinate system A of the optical tracking system 6 and the coordinate system R of the mechanical arm 18 can be calculated through calibration and the kinematic information of the mechanical arm 18, and the coordinate system B of the binocular camera can likewise be linked with the coordinate system A of the optical tracking system 6 through calibration. In this way, the coordinate systems of the whole system can be unified.
  • the position of the optical marker is measured by the optical tracking system 6, and the position of the obstacle is measured by the binocular camera.
  • Y5 The binocular camera acquires monitoring images within its field of view in real time, and judges whether there are obstacles in the monitoring space according to the monitoring images. If there is an obstacle, obtain the position information of the obstacle in the monitoring space;
  • the difference from Embodiment 1 is that the monitoring images captured by the binocular camera are two two-dimensional images with parallax. Therefore, when judging whether there is an obstacle in the monitoring space, it is first necessary to determine, in the two monitoring images of the binocular camera, the monitoring projection areas corresponding to the monitoring space.
  • the minimum enclosing sphere 20 is projected onto the imaging planes 27 and 29 of the two cameras respectively, so there are corresponding projection areas 28 and 30 on the two two-dimensional monitoring images. When the projection of another object covers these two areas, the optical markers may be blocked.
  • here a is the difference between the radius L of the above-mentioned monitoring space and the radius of the minimum enclosing sphere 20; FIG. 12 shows the monitoring projection areas corresponding to the monitoring space on the two-dimensional monitoring images.
  • when $d < D$, that is, when the disparity d of the other object in the binocular image pair is smaller than the disparity D corresponding to the minimum enclosing sphere 20 (a smaller disparity corresponds to a greater depth), the other object is farther away from the binocular camera than the minimum enclosing sphere 20 and does not block the tracking of the optical markers by the optical tracking system 6, so it can be judged that there is no obstacle in the monitoring space. At this point, the other object can be ignored.
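  • assuming d and D are disparities of a rectified stereo pair (depth Z = f·b/d, so a smaller disparity means a greater depth), the gate reduces to one comparison:

```python
def can_ignore(disparity_obj, disparity_sphere):
    """Depth gate for an object whose projection overlaps the monitoring
    projection area: a smaller disparity means the object is farther from the
    binocular camera than the minimum enclosing sphere, so it cannot occlude."""
    return disparity_obj < disparity_sphere
```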
  • the acquisition of the position information of obstacles in the monitoring space also differs from Embodiment 1, and the steps may specifically include: in the coordinate system B of the binocular camera, as shown in FIG. 14, the projection of the obstacle in the monitoring image is the obstacle projection area 32, and the projection of the center O of the minimum enclosing sphere 20 in the monitoring image is O'; the position information of the point P' in the obstacle projection area 32 that is nearest to the projection center point O' is used as the position information of the obstacle in the monitoring space.
  • Y6 The obstacle-avoidance motion parameters of the optical tracking system 6 can be obtained according to the position information of the obstacle, so as to ensure that the optical marker does not move out of the field of view while the obstruction by the obstacle is avoided. The specific operation can be as follows: in the coordinate system B of the binocular camera, determine the vector $\vec{O'P'}$ according to the position O' of the projection center point and the position P' of the obstacle; according to the vector $\vec{O'P'}$, the motion parameters of the optical tracking system 6 for obstacle avoidance can be obtained, including the motion speed $\vec{v}$: in the coordinate system B, the motion direction 33 of the optical tracking system 6 follows the direction of the vector $\vec{O'P'}$, and the magnitude of the speed is negatively correlated with $|\vec{O'P'}|$. The smaller $|\vec{O'P'}|$ is, the closer the obstacle is to the line-of-sight optical axis 21 of the optical tracking system 6 and the more severe the occlusion; at this time, the movement speed of the optical tracking system 6 should be higher.
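  • a sketch of this image-space speed law (the gain k and cap v_max are illustrative; the direction convention follows the text above):

```python
import numpy as np

def avoidance_velocity(o_proj, p_proj, k=0.05, v_max=0.2):
    """Speed in camera frame B from projections O' and P': direction along
    O'P', magnitude negatively correlated with |O'P'| and capped for safety."""
    d = np.asarray(p_proj, float) - np.asarray(o_proj, float)   # vector O'P'
    dist = max(np.linalg.norm(d), 1e-6)
    return min(k / dist, v_max) * (d / dist)
```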
  • Embodiment 2 also provides a solution for determining the motion parameters of the optical tracking system 6 in the case of multiple obstacles. For example, when there are two different obstacles in the monitoring space, as shown in FIG. 15, the points in the two obstacle projection areas 32 and 34 closest to the projection center point O' are $P_1$ and $P_2$.
  • the position information of $P_1$ and $P_2$ represents the position information of the two obstacles in the monitoring space respectively.
  • determine the vectors $\vec{O'P_1}$ and $\vec{O'P_2}$; next determine the moving speeds $\vec{v}_1$ and $\vec{v}_2$ with which the optical tracking system 6 would avoid each of the two obstacles separately; then, according to the speed synthesis method, obtain the moving speed $\vec{v}$ of the optical tracking system 6 while avoiding both obstacles: $\vec{v}$ is the vector sum of the two motion speeds, $\vec{v} = \vec{v}_1 + \vec{v}_2$.
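  • the same speed law extends to several obstacles by summing the per-obstacle speeds; a sketch reusing the conventions above:

```python
import numpy as np

def combined_velocity(o_proj, obstacle_projs, k=0.05, v_max=0.2):
    """Vector synthesis: v = v1 + v2 + ... over all obstacle projections."""
    v = np.zeros(2)
    for p_proj in obstacle_projs:
        d = np.asarray(p_proj, float) - np.asarray(o_proj, float)
        dist = max(np.linalg.norm(d), 1e-6)
        v += min(k / dist, v_max) * (d / dist)
    return v
```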
  • the obstacle avoidance motion of the optical tracking system 6 in the coordinate system R is a rotation around the center position O of the minimum enclosing sphere 20 .
  • the motion parameters in the coordinate system R include the rotational angular velocity ⁇ r and the rotational radius r.
  • the steps to obtain these motion parameters include:
  • the obstacle avoidance method further includes Y7: the mechanical arm 18 drives the optical tracking system 6 to move to avoid obstacles.
  • the motion parameters of the optical tracking system 6 in the coordinate system R of the robotic arm 18 can be obtained through robot inverse kinematics calculation.
  • each joint of the mechanical arm 18 moves to drive the optical tracking system 6 to avoid obstacles. Similar to the above-mentioned embodiment 1, the above-mentioned process of Y2-Y7 is repeated to monitor the obstacles in the monitoring space in real time.
  • the method for calculating and obtaining the motion parameters of each joint of the robotic arm 18 according to the inverse kinematics of the robot may refer to the method in Embodiment 1, which will not be repeated here.
  • the present application also provides a computer device, including: at least one memory storing at least one computer program; and at least one processor executing the computer program to realize the pose control method of the optical tracking system mentioned in the above embodiments of the present application.
  • the computer equipment of the present application can be integrated in the navigation trolley 9 or the operation trolley 1, and besides the memory and processor mentioned above, the computer equipment can also include peripherals such as the auxiliary display 7, the main display 8, and the keyboard 10.
  • the present application also protects a supporting device for supporting the optical tracking system 6, wherein the optical tracking system 6 is used to obtain position information of optical markers during surgery.
  • the support device includes: at least one joint, where each joint provides at least one degree of freedom of movement; and a controller electrically connected to each joint and used to control the movement of the at least one joint according to received control instructions, where the control instructions come from the above-mentioned computer equipment.
  • the supporting device of the present application can be, for example, a mechanical arm 18 .
  • the robotic arm 18 includes at least one joint and has a built-in controller.
  • the control instruction is information generated by the computer device according to the operations performed by the supporting device and used for communicating with the supporting device.
  • the manner in which the computer equipment controls the support device to adjust the posture of the optical tracking system includes: the computer equipment generates a control instruction for adjusting the posture of the support device by analyzing the monitoring image and sends it to the support device; the support device converts the instructed joint attitudes into driving data, such as torque and angular velocity, for controlling the at least one joint, and sends the data to the driver of each joint, so that the drivers perform the adjustment operation.
  • the present application also protects an optical navigation system, including an image acquisition device that includes a first camera module and a second camera module; the first camera module is used to acquire a positioning image containing at least one optical marker, where the position of the at least one optical marker identifies an operation area; the second camera module is used to acquire a monitoring image corresponding to the monitoring area between the operation area and the first camera module.
  • the optical navigation system also includes: the above-mentioned support device, assembled with the image acquisition device; and a processing device, electrically connected with the image acquisition device and the support device, for executing the above-mentioned control method of the optical tracking system and for determining the position information of the at least one optical marker using the first image.
  • the function of the first camera module is similar to that of the optical tracking system 6 mentioned above, and is used to obtain the positioning image of the optical marker, and realize the tracking of the optical marker.
  • the function of the second camera module is similar to that of the positioning device mentioned above, and it is used to acquire detection images of the monitoring area.
  • the two camera modules can be integrated on one image acquisition device, and the first camera module and the second camera module can have overlapping viewing angle ranges.
  • the present application also provides a surgical robot system, including: an optical navigation system configured to determine the position information of the at least one optical marker according to a captured positioning image containing the at least one optical marker, where the position of the at least one optical marker identifies an operation area; a supporting device used to assemble the optical navigation system; a monitoring device used to obtain the monitoring image corresponding to the monitoring area between the operation area and the first camera module; a surgical robotic arm used to hold surgical instruments; and the above-mentioned computer equipment, connected respectively to the supporting device, the optical navigation system, the monitoring device, and the surgical robotic arm. The computer equipment sends a control command to the support device so that the support device adjusts the position and/or posture of the optical navigation system; the computer equipment also sends a control command to the surgical robotic arm so that the surgical robotic arm adjusts the position and/or posture of the assembled surgical instrument.
  • the monitoring device can be configured in an optical navigation system.
  • the present application also protects a computer-readable storage medium for storing computer programs.
  • the computer program When the computer program is executed by the processor, the pose control method of the optical tracking system mentioned in the embodiment of the present application is realized.
  • the readable storage medium in the embodiments of the present application may use any combination of one or more computer-readable media.
  • the readable medium may be a computer readable signal medium or a computer readable storage medium.
  • a computer-readable storage medium may be, for example, but not limited to, an electrical, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination thereof.
  • a computer-readable storage medium may be any tangible medium that contains or stores a program that can be used by or in combination with an instruction execution system, apparatus, or device.
  • a computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, carrying computer-readable program code. Such a propagated data signal may take many forms, including but not limited to an electromagnetic signal, an optical signal, or any suitable combination of the foregoing.
  • a computer-readable signal medium may also be any computer-readable medium other than a computer-readable storage medium that can send, propagate, or transmit a program for use by or in combination with an instruction execution system, apparatus, or device.
  • the computer program code for carrying out the operations of the present application may be written in one or more programming languages or a combination thereof.
  • such programming languages include object-oriented programming languages, such as Java, Smalltalk, and C++, as well as conventional procedural programming languages, such as the "C" language or similar languages.
  • the program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server.
  • where a remote computer is involved, it may be connected to the user's computer through any kind of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, via the Internet using an Internet service provider).
  • compared with the prior art, the pose control method of the optical tracking system and the applicable optical navigation system, surgical robot system, computer equipment, support device, and computer-readable storage medium provided by the present application have the following advantages:
  • by acquiring monitoring images of the monitoring area in real time during surgery, the pose control method provided by this application can determine whether an obstacle is blocking the optical markers in the monitoring area, identify obstacles of any shape, and adjust the optical tracking system so that the monitoring area remains unblocked; no artificial markers need to be added, so the method applies to a wide range of scenarios.
  • the pose control method provided in this application can further plan an obstacle-avoidance motion of the optical tracking system according to the position information of the obstacle and move the optical tracking system along the planned trajectory, so the optical markers are not occluded.
  • throughout the obstacle-avoidance motion, the method keeps the tracked optical markers within the central field of view of the optical tracking system, so the surgical navigation process is not interrupted.
  • the support device provided by this application has a controllable motion function and automatically moves the optical tracking system clear of obstacles according to the planned obstacle-avoidance motion. It therefore needs no manual adjustment by the doctor and prevents the navigation process from being interrupted by occluded markers.
  • the optical navigation system provided by this application is highly integrated and has an obstacle-avoidance function; by executing the control method of the optical tracking system of the present application, it solves the problem of occluding objects in the monitored space.
  • the surgical robot system of the present application combines stereo vision with robotics to solve the common problem of the monitoring area being blocked in optical surgical navigation systems. The entire navigation adjustment system does not contact patients or medical staff, requires no sterilization, and reduces the possibility of infection. Moreover, its obstacle-avoidance motion does not change the workflow of the original surgical navigation system: the doctor needs no additional software or hardware operations, and the original functions are unaffected. This flattens the physician's learning curve and improves operating-room utilization.

Landscapes

  • Health & Medical Sciences (AREA)
  • Surgery (AREA)
  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Medical Informatics (AREA)
  • Robotics (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Manipulator (AREA)
  • Image Analysis (AREA)

Abstract

A pose control method and an optical navigation system and surgical robot system to which it applies. The pose control method includes: acquiring at least one monitoring image of the monitoring area formed between an optical tracking system (6) and a surgical area, where the surgical area is determined based on the positions of at least one optical marker (3, 11, 13, 15); and, when it is determined from the monitoring image that an occluding object exists in the monitoring area, adjusting the position and/or posture of the optical tracking system (6) so that the occluding object lies outside the new monitoring area formed between the adjusted optical tracking system (6) and the surgical area. The pose control method solves the prior-art problem that the markers (3, 11, 13, 15) of an optical surgical navigation system are occluded, impairing surgical efficiency and fluency.

Description

Pose control method and applicable optical navigation system and surgical robot system. Technical Field
The present application relates to the technical field of medical devices, and in particular to a pose control method and an optical navigation system and surgical robot system to which it applies.
Background Art
The emergence of surgical navigation systems fits the trend toward precision surgery. A surgical navigation system accurately registers the patient's preoperative or intraoperative image data to the anatomy of the patient on the operating table. It also tracks the surgical instruments during the operation and updates their positions in real time on the patient images as virtual instruments, so that the surgeon can see at a glance where the instruments are relative to the patient's anatomy, making surgery faster, more accurate, and safer. By analyzing the patient's medical images and applying various intraoperative sensors, surgical navigation systems provide richer reference information and more precise guidance, and have become a powerful tool for assisting surgeons. Optical surgical navigation offers high accuracy, simple use, no radiation, and little impact on the surgical workflow, and has therefore been widely adopted in surgical navigation systems.
A navigation approach commonly used in optical surgical navigation systems is to use an optical tracking system to track, in real time, optical markers that have a fixed shape and are easy to recognize. The optical markers are placed on the target anatomy, and by tracking the markers the optical tracking system indirectly obtains the real-time position and posture of the target anatomy. However, because of the optical properties involved, tracking a marker requires that no other object lie between the optical tracking system and the marker, and that the marker stay within the field of view of the optical tracking system. This places high demands on the movement of people and the placement of equipment in the operating room.
Summary of the Invention
The purpose of the present application is to provide a pose control method and an optical navigation system, surgical robot system, computer equipment, support device, and computer-readable storage medium to which it applies, so as to solve the prior-art technical problem that markers in optical surgical navigation systems are occluded, impairing surgical efficiency and fluency.
To solve the above technical problem, the present application provides a pose control method of an optical tracking system, including the following steps: acquiring at least one monitoring image of the monitoring area formed between the optical tracking system and a surgical area, where the surgical area is determined based on the position of at least one optical marker; and, when it is determined from the monitoring image that an occluding object exists in the monitoring area, adjusting the position and/or posture of the optical tracking system so that the occluding object lies outside the new monitoring area formed between the adjusted optical tracking system and the surgical area.
Further, the surgical area is greater than or equal to the spatial range bounded by the position of the at least one optical marker.
Further, the monitoring area includes the spatial range enclosed by the boundary of the surgical area and the viewing angle of the optical tracking system.
Further, the step of determining from the monitoring image that an occluding object exists in the monitoring area includes at least one of the following:
determining that an occluding object exists in the monitoring area by extracting a foreground image from the image region of the monitoring image corresponding to the monitoring area; determining that an occluding object exists by detecting changes between the image regions corresponding to the monitoring area in at least two monitoring images; when the monitoring image is a depth image, determining that an occluding object exists by detecting an entity position within the image region of the monitoring image corresponding to the monitoring area; and determining that an occluding object exists by detecting the disparity between pairs of image data within the image regions corresponding to the monitoring area in at least two monitoring images.
Further, the step of adjusting the position and/or posture of the optical tracking system includes:
adjusting the position and/or posture of the optical tracking system according to the posture relationship, or the relative position relationship, between the corresponding occluding object in the monitoring image and the optical tracking system.
Further, the step of adjusting the position and/or posture of the optical tracking system includes:
adjusting the position and/or posture of the optical tracking system step by step, in preset adjustment units, until it is determined from the acquired monitoring images that no occluding object exists in the new monitoring area formed between the adjusted optical tracking system and the surgical area.
Further, the step of adjusting the position and/or posture of the optical tracking system includes either of the following: adjusting the position and posture of the optical tracking system around the surgical area as a center; or translating the position of the optical tracking system according to the posture relationship, or position relationship, between the occluding object and the optical tracking system determined by analyzing the monitoring image.
The present application also provides computer equipment, including: at least one memory, storing at least one computer program; and at least one processor, executing the computer program to implement the pose control method of the optical tracking system described above.
The present application also provides a support device for supporting an optical tracking system, where the optical tracking system is used to acquire position information of optical markers during surgery. The support device includes: at least one joint, where the joint provides motion in at least one degree of freedom; and a controller, electrically connected to each joint, for controlling the motion of the at least one joint according to received control instructions, where the control instructions come from the computer equipment described above.
Further, the computer equipment is built into the support device.
The present application also provides an optical navigation system, including an image acquisition device that comprises a first camera module and a second camera module, where the first camera module is used to acquire a positioning image containing at least one optical marker, the position of the at least one optical marker identifying a surgical area, and the second camera module is used to acquire a monitoring image corresponding to the monitoring area between the surgical area and the first camera module; the optical navigation system also includes the support device described above, connected with the image acquisition device.
Further, the first camera module and the second camera module have overlapping fields of view.
The present application also provides a surgical robot system, including: an optical navigation system, for determining the position information of at least one optical marker from a captured positioning image containing the at least one optical marker, where the position of the at least one optical marker identifies a surgical area; a support device, for mounting the optical navigation system; a monitoring device, for acquiring a monitoring image corresponding to the monitoring area between the surgical area and the first camera module; a surgical robotic arm, for holding surgical instruments; and the computer equipment described above, communicatively connected to the support device, the optical navigation system, the monitoring device, and the surgical robotic arm. The computer equipment sends control instructions to the support device by executing the pose control method described above, so that the support device adjusts the position and/or posture of the optical navigation system; the computer equipment is also used to issue control instructions to the surgical robotic arm according to the position information of the at least one optical marker, so that the surgical robotic arm adjusts the position and/or posture of the mounted surgical instrument.
Further, the monitoring device is configured in the optical navigation system.
The present application also provides a computer-readable storage medium for storing a computer program which, when executed (for example, by a processor), implements the control method described above.
In summary, compared with the prior art, the pose control method of the optical tracking system and the applicable optical navigation system, surgical robot system, computer equipment, support device, and computer-readable storage medium provided by the present application have the following advantages:
By acquiring monitoring images of the monitoring area in real time during surgery, the pose control method provided by this application can determine whether an obstacle is occluding the optical markers in the monitoring area, identify obstacles of any shape, and adjust the optical tracking system so that the monitoring area remains unblocked; no artificial markers need to be added, so the method applies to a wide range of scenarios.
The pose control method provided in this application can further plan an obstacle-avoidance motion of the optical tracking system according to the position information of the obstacle and move the optical tracking system along the planned trajectory, so the optical markers are not occluded. Moreover, throughout the avoidance motion the method keeps the tracked optical markers within the central field of view of the optical tracking system, so the surgical navigation process is not interrupted.
The support device provided by this application has a controllable motion function and automatically moves the optical tracking system clear of obstacles according to the planned avoidance motion; it therefore needs no manual adjustment by the doctor and prevents the navigation process from being interrupted by occluded markers.
The optical navigation system provided by this application is highly integrated and has an obstacle-avoidance function; by executing the control method of the optical tracking system of the present application, it solves the problem of occluding objects in the monitored space.
The surgical robot system of the present application combines stereo vision with robotics to solve the common problem of the monitoring area being blocked in optical surgical navigation systems. The entire navigation adjustment system does not contact patients or medical staff, requires no sterilization, and reduces the possibility of infection. Moreover, its avoidance motion does not change the workflow of the original surgical navigation system: the doctor needs no additional software or hardware operations, and the original functions are unaffected. This flattens the physician's learning curve and improves operating-room utilization.
Brief Description of the Drawings
FIG. 1 is a schematic flowchart of a pose control method of an optical tracking system in an embodiment of the present application;
FIG. 2 is a schematic diagram of an orthopedic surgical navigation system for knee replacement in an embodiment of the present application;
FIG. 3 is a schematic diagram of the components of the surgical robot system in Embodiment 1 of the present application;
FIG. 4 is a schematic diagram of determining the monitoring area in the pose control method of Embodiment 1;
FIG. 5 is a schematic diagram of the transformation among the coordinate systems of the three parties (occluding object, surgical area, and optical tracking system) in the pose control method of Embodiment 1;
FIG. 6 is a schematic diagram of an obstacle occluding an optical marker in the pose control method of Embodiment 1;
FIG. 7 is a schematic diagram of the obstacle-avoidance motion of the optical tracking system in Embodiment 1;
FIG. 8 is a schematic diagram of the coordinate-system transformation for the obstacle-avoidance motion of the optical tracking system in Embodiment 1;
FIG. 9 is a schematic diagram of the motion of the robotic arm during the obstacle-avoidance motion of the optical tracking system in Embodiment 1;
FIG. 10 is a schematic diagram of the installation of the binocular camera in the pose control method of Embodiment 2;
FIG. 11 is a schematic diagram of imaging by the binocular camera in Embodiment 2;
FIG. 12 is a schematic diagram of the imaging of the monitoring area on the binocular camera in Embodiment 2;
FIG. 13 is a schematic diagram of the principle for judging whether an obstacle exists in the monitoring area in Embodiment 2;
FIG. 14 is a schematic diagram of the obstacle-avoidance motion of the optical tracking system in imaging space when a single obstacle exists in the monitoring area in Embodiment 2;
FIG. 15 is a schematic diagram of the obstacle-avoidance motion of the optical tracking system in imaging space when multiple obstacles exist in the monitoring area in Embodiment 2;
FIG. 16 is a schematic diagram of the transformation of the motion parameters of the optical tracking system among the coordinate systems in Embodiment 2.
The reference numerals are as follows:
1 - surgical trolley; 2 - surgical robotic arm; 3 - tool marker; 4 - osteotomy guide tool; 5 - oscillating saw; 6 - optical tracking system; 7 - auxiliary display; 8 - main display; 9 - navigation trolley; 10 - keyboard; 11 - femoral marker; 12 - femur; 13 - tibial marker; 14 - tibia; 15 - base marker; 16 - operating table; 17 - patient; 18 - robotic arm; 19 - depth camera; 20 - minimal enclosing sphere; 21 - optical axis of the optical tracking system; 22 - cylindrical space; 23 - point cloud; 24 - rotation direction of the optical tracking system; 25, 26 - monocular cameras; 27, 29 - imaging planes; 28, 30 - projection regions of the minimal enclosing sphere; 31 - annular region; 32, 34 - obstacle projection regions; 33 - motion direction of the optical tracking system in imaging space.
Detailed Description of the Embodiments
The control method of an optical tracking system, computer equipment, support device, optical navigation system, surgical robot system, and computer-readable storage medium proposed by the present application are described in further detail below with reference to the drawings and specific embodiments. The advantages and features of the present application will become clearer from the following description.
It should be noted that the drawings are in a highly simplified form and use imprecise scales; they serve only to assist in conveniently and clearly explaining the embodiments of the present application. To make the purposes, features, and advantages of the present application more apparent and understandable, please refer to the drawings. The structures, proportions, sizes, and the like depicted in the drawings of this specification are intended only to accompany the content disclosed in the specification for the understanding and reading of those familiar with this technology; they do not limit the conditions under which the present application can be implemented and therefore have no substantive technical significance. Any structural modification, change of proportion, or adjustment of size that does not affect the effects and purposes achievable by the present application shall still fall within the scope covered by the technical content disclosed herein.
As used herein, the terms "include", "comprise", or any other variant thereof are intended to cover non-exclusive inclusion, so that a process, method, article, or device that includes a list of elements includes not only those elements but also other elements not expressly listed, or elements inherent to such a process, method, article, or device. Without further limitation, an element defined by the phrase "including a ..." does not exclude the existence of additional identical elements in the process, method, article, or device that includes it.
In an operating room, where space is limited and the positions of people and equipment are complex, the application of optical surgical navigation systems faces the following problems: the optical tracking system is easily prevented from tracking the markers by occluding obstacles, which interrupts the entire navigation workflow; an obstacle can be any person or object between the optical tracking system and the markers, making it difficult to identify in real time and obtain its position; and moving the optical tracking system easily moves the markers out of the field of view, causing marker tracking to fail.
These problems degrade the user experience of optical surgical navigation systems, add extra operations for the surgeon, reduce the efficiency and fluency of surgery, and may even create risks to surgical safety.
One of the core ideas of the present application is to provide a pose control method of an optical tracking system, so as to solve the prior-art problem that the optical tracking system of an optical surgical navigation system is occluded while tracking markers, impairing surgical efficiency and fluency.
To realize this idea, the present application provides a pose control method of an optical tracking system which, as shown in FIG. 1, includes the following steps:
acquiring at least one monitoring image of the monitoring area formed between the optical tracking system and a surgical area, where the surgical area is determined based on the position of at least one optical marker; and, when it is determined from the monitoring image that an occluding object exists in the monitoring area, adjusting the position and/or posture of the optical tracking system so that the occluding object lies outside the new monitoring area formed between the adjusted optical tracking system and the surgical area.
In the above pose control method provided by the present application, monitoring images of the monitored space are acquired in real time, and whether an occluding object (that is, an obstacle) exists in the monitored space is determined from the monitoring images. Once an obstacle is found in the monitored space, the pose of the optical tracking system can be adjusted so that the optical tracking system moves around the obstacle and the optical markers are not occluded. This overcomes the problem that occluded markers impair surgical efficiency and fluency.
Here, the surgical area of the present application is greater than or equal to the spatial range bounded by the positions of the at least one optical marker. For example, a minimal enclosing sphere can be obtained from the positions of the at least one optical marker, with all optical markers lying within it. The surgical area may be the region of this minimal enclosing sphere, or a region slightly larger than it. Once the surgical area is determined, the monitoring area can be determined from it, where the monitoring area may be the spatial range enclosed by the boundary of the surgical area and the viewing angle of the optical tracking system. For example, the monitoring area may be a cylindrical space enclosed by the viewing angle of the optical tracking system and the aforementioned minimal enclosing sphere; or, for safety, a space slightly larger than that cylindrical space.
In the pose control method of the present application, the step of determining from the monitoring image that an occluding object exists in the monitoring area may include at least one of the following:
Determining that an occluding object exists by extracting a foreground image from the image region of the monitoring image corresponding to the monitoring area. For example, the positions of all surgery-related instruments in the monitoring area can be determined before surgery, and a standard, occlusion-free image of the monitoring area can be obtained and used as the background image. During surgery, monitoring images are acquired in real time. If, given the background image, a foreground image can be extracted from the image region of the monitoring image corresponding to the monitoring area, an obstacle is determined to exist in the monitoring area; otherwise, no obstacle exists.
Or, determining that an occluding object exists by detecting changes between the image regions corresponding to the monitoring area in at least two monitoring images. For example, two monitoring images taken a certain time apart are compared, and the change within the image region corresponding to the monitoring area is observed. If an occluding object exists in the monitoring area, the later image will contain, within the same region, the occluding obstacle that the earlier image lacks, and the presence of an obstacle can be determined accordingly.
Or, when the monitoring image is a depth image, determining that an occluding object exists by detecting an entity position within the image region corresponding to the monitoring area, where the entity position refers to the position of the occluding object in the monitoring area. For example, when a depth camera observes the monitoring area, the captured monitoring image directly shows the positions of the objects within it, so the presence of an occluding object can be judged intuitively.
Or, determining that an occluding object exists by detecting the disparity between pairs of image data within the image regions corresponding to the monitoring area in at least two monitoring images, where a pair of image data means matched image data describing the same object in the monitoring area in different monitoring images. For example, a binocular camera captures two monitoring images with disparity, the two images are compared, and when a pair of image data describing the same obstacle in the monitoring area is identified from the disparity, an obstacle is determined to exist in the monitoring area.
Those skilled in the art will understand that, besides the schemes listed above, other schemes that use the monitoring image to determine whether an obstacle exists may be adopted; any scheme that can identify an occluding object in the monitoring area from the monitoring image falls within the scope of protection of the present application.
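As an illustration only, the following minimal sketch in Python with OpenCV shows the second scheme above (change detection between two monitoring images). The region-of-interest mask, threshold, and minimum pixel count are hypothetical parameters for the sketch, not values specified by this application.

```python
import cv2

def obstacle_in_monitoring_area(img_prev, img_curr, roi_mask,
                                diff_thresh=25, min_pixels=200):
    """Detect an occluding object by comparing two monitoring images.

    img_prev, img_curr : BGR frames taken a short time apart.
    roi_mask           : uint8 mask (255 inside the image region that
                         corresponds to the monitoring area).
    """
    g1 = cv2.cvtColor(img_prev, cv2.COLOR_BGR2GRAY)
    g2 = cv2.cvtColor(img_curr, cv2.COLOR_BGR2GRAY)
    diff = cv2.absdiff(g1, g2)                    # per-pixel change
    _, changed = cv2.threshold(diff, diff_thresh, 255, cv2.THRESH_BINARY)
    changed = cv2.bitwise_and(changed, roi_mask)  # restrict to monitoring area
    return cv2.countNonZero(changed) >= min_pixels
```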
After it is determined that an occluding object exists in the monitored space, the step of adjusting the position and/or posture of the optical tracking system may include: adjusting the position and/or posture of the optical tracking system according to the posture relationship, or the relative position relationship, between the corresponding occluding object in the monitoring image and the optical tracking system. The motion trajectory of the optical tracking system can be planned from the position information of the obstacle in the monitored space, and the optical tracking system can then be adjusted quantitatively to go around the obstacle. The motion of the optical tracking system is then no longer an unplanned, arbitrary movement: its trajectory is precisely controlled, which guarantees that the tracked optical markers remain in the field of view of the optical tracking system throughout the movement and prevents the surgery from being interrupted.
In addition to the above scheme, the optical tracking system can also be adjusted step by step, for example adjusting its position and/or posture in preset adjustment units until it is determined from the newly acquired monitoring images that no occluding object exists in the monitoring area. The preset adjustment unit includes, but is not limited to, a set length and/or a set angle, for example translating the optical tracking system by a set length, or rotating it around the surgical area by a set angle.
One example of adjusting the position and/or posture of the optical tracking system is to adjust its position and posture around the surgical area as a center: for example, an arc-shaped path can be planned with the distance between the optical tracking system and the surgical area as the radius and the center of the surgical area as the center of the circle, and the position and posture of the optical tracking system adjusted accordingly.
Another example is to translate the position of the optical tracking system according to the posture relationship, or position relationship, between the occluding object and the optical tracking system determined by analyzing the monitoring image: for example, when analysis of the monitoring image shows that the occluding object occludes the edge of the surgical area, translating the optical tracking system changes the positional relationship among the occluding object, the surgical area, and the optical tracking system so that no occluding object remains in the monitoring area of the optical tracking system.
It should be noted that the above adjustment modes can be selected by analyzing the monitoring image, or the two modes can be combined to plan the adjustment route, for example a route comprising a translation and an arc, or a non-circular curve optimized from a translation and an arc. The control method of the optical tracking system provided by the present application is described in further detail below with a specific surgical example, taking an orthopedic surgical navigation system for knee replacement as the example. Those skilled in the art will understand that the obstacle-avoidance method and system provided by the present application are not limited to orthopedic navigation for knee replacement and can also be applied to other optical surgical navigation systems.
As shown in FIG. 2, the present application provides an orthopedic surgical navigation system for knee replacement and its surgical application scenario. In this implementation, the surgical navigation system may include: a surgical trolley 1; a surgical robotic arm 2 mounted on the surgical trolley 1; various surgical instruments mounted on the surgical robotic arm 2, such as the osteotomy guide tool 4 and oscillating saw 5 in FIG. 1; an operating table 16; the patient 17 on the operating table 16 (the patient's surgical site, which in knee-replacement orthopedic surgery includes the femur 12 and tibia 14); and various optical markers. The optical markers fall into two main categories. One category is placed on the patient's surgical site, for example the femoral marker 11 and tibial marker 13 placed on the femur 12 and tibia 14 respectively; these markers identify the position of the surgical site so that it can be tracked. The other category comprises the markers placed on the surgical trolley 1 and its accessories, for example the tool marker 3 on the osteotomy guide tool 4 and the base marker 15 on the surgical trolley 1; these markers identify the position of the surgical robot so that the surgical instruments can be tracked. The surgical navigation system also includes a navigation trolley 9 and an optical tracking system 6 mounted on it; the optical tracking system 6 tracks the real-time positions of the above optical markers. In addition, a computer system for global control is arranged on the navigation trolley 9, which may include the main display 8 and keyboard 10 on the navigation trolley 9 and a controller inside it. An auxiliary display 7 may also be added for multi-person operation.
The main steps in using the knee-replacement navigation robot system are as follows:
First, the surgical trolley 1 and navigation trolley 9 are placed at suitable positions beside the operating table 16. The femoral marker 11 and tibial marker 13 are mounted on the patient 17's surgical sites, i.e., the femur 12 and tibia 14. The surgical robotic arm system 2, osteotomy guide tool 4, oscillating saw 5, and other surgical instruments are mounted on the surgical trolley 1, and the base marker 15, tool marker 3, and so on are mounted at the corresponding positions on the surgical trolley 1 and its accessories; other instruments necessary for surgery, such as sterile bags, may also be placed on the surgical trolley 1.
Second, the doctor imports the patient's preoperative plan into the computer. The preoperative plan mainly includes the osteotomy plane coordinates, the prosthesis model, and the prosthesis installation orientation.
Then, the doctor uses an optical tracking probe to identify feature points of the patient 17's femur 12 and tibia 14. Taking the femoral marker 11 and tibial marker 13 as references, the optical tracking system 6 records the positions of the bone feature points and sends them to the computer. The computer then obtains, via a feature-matching algorithm, the correspondence between the actual orientations of the femur 12 and tibia 14 and their orientations in the CT images, and associates the actual orientations of the femur 12 and tibia 14 with the corresponding markers mounted on them, so that the femoral marker 11 and tibial marker 13 can track the actual position of the bones in real time (during surgery, as long as the relative position between marker and bone is fixed, bone movement does not affect the surgical result).
Next, the preoperatively planned osteotomy plane coordinates are sent to the surgical robotic arm 2. The surgical robotic arm 2 locates the osteotomy plane through the tool marker 3 and moves to the predetermined position, after which the doctor can use the oscillating saw 5 or an electric drill to cut and drill through the osteotomy guide slots and guide holes of the osteotomy guide tool 4. Once cutting and drilling are completed, the doctor can install the prosthesis and perform other surgical operations.
During the above knee-replacement navigation process, the optical tracking system 6 must be able to obtain the pose information of the relevant markers at all times; otherwise the entire navigation workflow is interrupted.
To solve the problem of markers being occluded during navigation, the present application proposes a pose control method of an optical tracking system that effectively solves the problem of surgery being interrupted by occluded optical markers. Depending on the positioning device used to capture the monitoring images of the monitoring area, the present application provides the following two specific embodiments.
Embodiment 1
In the pose control method of Embodiment 1, a positioning device acquires at least one monitoring image of the monitoring area. This embodiment is described in detail with the positioning device being a depth camera. As shown in FIG. 3, a depth camera 19 is used as the positioning device to observe the monitoring area, and the pose control method may be as follows:
S1: First, set up and fix the optical markers, which may include the femoral marker 11, tibial marker 13, tool marker 3, and base marker 15 placed on the corresponding parts as described above. Mount the optical tracking system 6 on a support device, which may be a robotic arm 18 or another movable platform; this embodiment takes the robotic arm 18 as the support device.
S2: The optical tracking system 6 tracks the position information of the optical markers in real time, and the monitoring area is determined from the positions of the optical tracking system 6 and the optical markers.
Specifically, the real-time tracking of the optical markers by the optical tracking system 6 includes: obtaining, from the position information of the multiple optical markers, a minimal enclosing sphere 20, as shown in FIG. 4, within which all the optical markers lie; the robotic arm 18 then moves the optical tracking system 6 so that its optical axis 21 passes through the center O of the minimal enclosing sphere, the optical axis 21 generally being the symmetric central axis of the imaging of the optical tracking system 6. The surgical area may be the space enclosed by the minimal enclosing sphere 20 or a larger region containing it; in the following description, Embodiment 1 takes the surgical area to be the space enclosed by the minimal enclosing sphere 20.
The step of determining the monitoring area from the positions of the optical tracking system 6 and the optical markers specifically includes: taking the optical axis 21 of the optical tracking system 6 as the central axis, the farthest distance from the minimal enclosing sphere 20 to the optical tracking system 6 along the central axis as the height H, and the distance threshold L from the optical axis 21 as the radius, the resulting cylindrical space 22 is the monitoring area. Objects within this spatial region are treated as occluding objects (that is, obstacles). The radius L of the cylindrical space 22 is not smaller than the radius of the minimal enclosing sphere 20, and the actual value of L can be set according to monitoring needs.
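A minimal sketch of these two geometric steps, assuming the markers are given as 3-D points in the tracking system's frame; Ritter's approximation stands in for an exact minimal-enclosing-sphere solver, and the function names are illustrative only.

```python
import numpy as np

def ritter_bounding_sphere(points):
    """Approximate minimal enclosing sphere (Ritter, 1990) of marker points."""
    pts = np.asarray(points, dtype=float)
    p0 = pts[0]
    p1 = pts[np.argmax(np.linalg.norm(pts - p0, axis=1))]   # farthest from p0
    p2 = pts[np.argmax(np.linalg.norm(pts - p1, axis=1))]   # farthest from p1
    center, radius = (p1 + p2) / 2.0, np.linalg.norm(p2 - p1) / 2.0
    for p in pts:                       # grow the sphere to cover outliers
        d = np.linalg.norm(p - center)
        if d > radius:
            radius = (radius + d) / 2.0
            center += (1.0 - radius / d) * (p - center)
    return center, radius

def in_monitoring_cylinder(p, cam_pos, axis_dir, height_H, radius_L):
    """Is point p inside the cylinder of height H and radius L around the
    optical axis (unit vector axis_dir, starting at the camera position)?"""
    v = p - cam_pos
    t = float(np.dot(v, axis_dir))          # coordinate along the optical axis
    d = np.linalg.norm(v - t * axis_dir)    # distance to the optical axis
    return 0.0 <= t <= height_H and d <= radius_L
```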
S3: Install the positioning device, i.e., mount the depth camera 19 at a suitable position, as shown in FIGS. 3 and 4, so that its field of view covers the entire monitoring area, which can then be monitored in real time. The image captured by the depth camera 19 contains the information of the entire monitoring area, including the optical tracking system 6, the aforementioned minimal enclosing sphere 20, and their position information. When an obstacle exists in the monitoring area, the image captured by the depth camera 19 likewise contains the obstacle and its position information.
S4: Establish the transformation relationships among the coordinate system A of the optical tracking system 6, the coordinate system B of the depth camera 19, and the coordinate system R of the robotic arm 18. FIG. 5 shows the relationships among the spatial coordinate systems of the parts of the system in the depth-camera scheme. The transformation between coordinate system A and coordinate system R can be computed from calibration and the kinematic information of the robotic arm 18; coordinate system B can likewise be related to coordinate system A by calibration. In this way, the coordinate systems of the whole system are unified: the positions of the optical markers are measured by the optical tracking system 6, and the positions of obstacles are measured by the depth camera 19.
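A minimal sketch of how such calibrated transforms can be represented and applied; the 4x4 matrices are placeholders that would come from the calibration and arm kinematics mentioned above, and the names are illustrative.

```python
import numpy as np

def make_transform(R, t):
    """Build a 4x4 homogeneous transform from a rotation matrix and translation."""
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, t
    return T

# T_A_B: pose of depth-camera frame B expressed in tracking-system frame A
# T_R_A: pose of tracking-system frame A expressed in robot-base frame R
# Both would come from calibration / arm kinematics; here they are inputs.
def to_frame_A(points_B, T_A_B):
    """Map an (N,3) point cloud from depth-camera frame B into frame A."""
    pts = np.hstack([points_B, np.ones((len(points_B), 1))])
    return (T_A_B @ pts.T).T[:, :3]
```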
S5: The depth camera 19 acquires monitoring images of its field of view in real time, and whether an obstacle exists in the monitoring area is judged from them. If an obstacle exists, its position information in the monitored space is obtained.
Here, the step of judging from the monitoring image whether an obstacle exists in the monitored space may specifically include: transforming the point cloud of the monitoring image acquired by the depth camera 19 into the coordinate system A of the optical tracking system 6 according to the transformation relationships among the three coordinate systems; then, in coordinate system A, comparing the position of each point of the cloud with the monitored space. When part of the point cloud lies within the monitored space, an obstacle is judged to exist, as shown in FIG. 6, where the partial point cloud 23 represents an obstacle falling within the monitored space; when no points lie within the monitored space, no obstacle is judged to exist.
Obtaining the position information of the obstacle in the monitored space specifically includes: taking the position of the point P of that partial point cloud 23 that is closest to the optical axis 21 of the optical tracking system 6 as the position information of the obstacle in the monitored space.
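A minimal sketch of this step, reusing to_frame_A() and the cylinder geometry from the sketches above (all names illustrative); cam_pos and axis_dir describe the optical axis in frame A.

```python
import numpy as np

def obstacle_position(points_B, T_A_B, cam_pos, axis_dir, height_H, radius_L):
    """Return the in-cylinder point closest to the optical axis, or None."""
    pts_A = to_frame_A(points_B, T_A_B)
    v = pts_A - cam_pos
    t = v @ axis_dir                                        # along the axis
    d = np.linalg.norm(v - np.outer(t, axis_dir), axis=1)   # distance to axis
    inside = (t >= 0) & (t <= height_H) & (d <= radius_L)
    if not inside.any():
        return None                                # monitored space is clear
    idx = np.argmin(np.where(inside, d, np.inf))   # closest to the optical axis
    return pts_A[idx]                              # this is the point P
```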
S6: The motion parameters for obstacle avoidance of the optical tracking system 6 are obtained from the position information of the obstacle, so that the optical markers do not move out of the imaging field of view of the optical tracking system 6 while the occlusion by the obstacle is avoided.
Specifically, as shown in FIG. 7, in the coordinate system A of the optical tracking system 6, the vectors $\overrightarrow{CO}$ and $\overrightarrow{CP}$, and the angle $\alpha$ between the two vectors, are determined from the center position O of the minimal enclosing sphere 20, the optical-axis center position C of the optical tracking system 6, and the obstacle position P. The motion parameters for obstacle avoidance of the optical tracking system 6 are obtained from $\overrightarrow{CO}$, $\overrightarrow{CP}$, and the angle $\alpha$. The avoidance motion of the optical tracking system 6 is a rotation about the sphere center O within a first plane, with rotation direction 24 (that is, counterclockwise); the motion parameters include the rotational angular velocity and the rotation radius. The magnitude of the angular velocity is determined from the angle $\alpha$, and its direction is perpendicular to the first plane, oriented so that the optical tracking system 6 rotates away from the obstacle. The closer the obstacle is to the line of sight 21 of the optical tracking system 6, the greater the probability that a marker is occluded, so the faster the optical tracking system 6 should move away from the obstacle; that is, the speed of the avoidance motion is negatively correlated with the magnitude of $\alpha$. For safety, however, the speed should be kept within a certain range. The rotation radius is $r = |\overrightarrow{CO}|$, and the first plane is the plane determined by $\overrightarrow{CO}$ and $\overrightarrow{CP}$. Furthermore, when multiple obstacles occlude the monitored space, the above method can be used to first compute the motion parameters for avoiding each single obstacle, after which the motion parameters for avoiding all obstacles simultaneously are computed by vector composition.
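A minimal numeric sketch of this step, assuming C, O, and P are given in frame A. The gain k and the speed cap are hypothetical, and taking the rotation axis as the normal of the plane of CO and CP, signed via CP x CO so that the camera swings away from the obstacle, is an assumption of the sketch, not a formula stated in this application.

```python
import numpy as np

def avoidance_rotation(C, O, P, k=0.5, w_max=0.2):
    """Angular-velocity vector (rad/s) for rotating about the sphere center O,
    in the plane of CO and CP, away from the obstacle at P; also returns r."""
    CO, CP = O - C, P - C
    cos_a = np.dot(CO, CP) / (np.linalg.norm(CO) * np.linalg.norm(CP))
    alpha = np.arccos(np.clip(cos_a, -1.0, 1.0))
    axis = np.cross(CP, CO)                  # normal of the first plane
    n = np.linalg.norm(axis)
    if n < 1e-9:                             # obstacle on the line of sight
        axis = np.array([0.0, 0.0, 1.0])     # arbitrary fallback axis
    else:
        axis = axis / n
    w_mag = min(max(k * (np.pi / 2 - alpha), 0.0), w_max)  # small alpha -> fast
    return w_mag * axis, np.linalg.norm(CO)  # (omega vector, radius r = |CO|)
```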
S7: The robotic arm 18 drives the optical tracking system 6 to move clear of the obstacle. Specifically, from the motion parameters of the optical tracking system 6 and the transformation between coordinate system A of the optical tracking system 6 and coordinate system R of the robotic arm 18, the motion parameters of each joint of the robotic arm 18 are obtained via robot inverse kinematics, and the joint motions drive the optical tracking system 6 away from the obstacle. The process S2-S7 is then repeated. For example, the positioning device refreshes the monitoring image at a fixed frequency, and the process S2-S7 is executed in real time for every monitoring image obtained. If the obstacle persists, a new set of motion parameters is obtained at each refresh, i.e., the rotational angular velocity of the optical tracking system 6 is refreshed according to the real-time position of the obstacle and applied for one refresh period, until no obstacle remains in the new monitoring image. Real-time monitoring of obstacles in the monitored space is thus achieved.
FIGS. 8 and 9 show the planning of the obstacle-avoidance motion of the robotic arm. To ensure that no marker leaves the tracking range of the optical tracking system 6 during avoidance, the optical axis 21 of the optical tracking system 6 should remain pointed at the minimal enclosing sphere 20 of the optical markers throughout the motion (that is, the optical axis 21 passes through the sphere center O). Therefore, a sub-coordinate system As, fixed relative to coordinate system A, is added to the coordinate system A of the optical tracking system 6; the origin of As coincides with the sphere center O of the minimal enclosing sphere 20. The avoidance motion of the optical tracking system 6 can then be expressed as a pure rotation $\omega_s$ of the sub-coordinate system As. From the relative position of As and A, the relation between the velocities of the two frames is obtained:

$$v_s = v_A - [p]_\times\, \omega_A,$$

where $v_s$ is the linear velocity of the sub-coordinate system As, $v_A$ the linear velocity of coordinate system A, $\omega_A$ the angular velocity of coordinate system A, $p$ the position vector of As in the coordinate system A of the optical tracking system, and $[p]_\times$ defined as

$$[p]_\times = \begin{bmatrix} 0 & -p_z & p_y \\ p_z & 0 & -p_x \\ -p_y & p_x & 0 \end{bmatrix},$$

where $p_x$, $p_y$, $p_z$ are the components of $p$ in the x, y, z directions. This yields the forward kinematics of the robotic arm 18 with As as the reference frame:

$$\begin{bmatrix} v_s \\ \omega_s \end{bmatrix} = J_m\,\dot{q}, \qquad J_m = \begin{bmatrix} I & -[p]_\times \\ 0 & I \end{bmatrix} J,$$

where $\dot{q}$ is the velocity in the joint space of the robotic arm, $J$ is the Jacobian of the robotic arm 18 with respect to coordinate system A, and $J_m$ is the Jacobian of the robotic arm 18 with respect to the sub-coordinate system As. From the nature of the avoidance motion, i.e., the sub-coordinate system As has only angular velocity and no linear velocity, the inverse kinematics of the avoidance motion of the robotic arm follows:

$$\dot{q} = J_m^{+}\begin{bmatrix} 0 \\ \omega_s \end{bmatrix}, \qquad J_m^{+} = J_m^{T}\left(J_m J_m^{T}\right)^{-1},$$

where $J_m^{T}$ is the transpose of the Jacobian $J_m$ and $J_m^{+}$ its pseudo-inverse. Given the angular velocity $\omega_s$ of the avoidance motion, the angular velocity $\dot{q}$ of each joint of the robotic arm 18 is obtained from the above equations. In this way, the markers remain within the monitoring range of the optical tracking system 6 while the occluding obstacle is avoided.
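A minimal sketch of the inverse-kinematics step just derived, assuming the arm's Jacobian J (6xN, in frame A, linear rows first) is supplied by the robot's own kinematics; numpy's pinv stands in for the damped pseudo-inverse a real controller would use.

```python
import numpy as np

def joint_velocities(J, p, omega_s):
    """Joint velocities realizing a pure rotation omega_s of sub-frame As.

    J       : 6xN arm Jacobian in frame A (rows: linear, then angular).
    p       : position of As (the sphere center O) expressed in frame A.
    omega_s : desired angular velocity of As (3-vector).
    """
    px = np.array([[0, -p[2], p[1]],
                   [p[2], 0, -p[0]],
                   [-p[1], p[0], 0]])               # skew matrix [p]x
    shift = np.block([[np.eye(3), -px],
                      [np.zeros((3, 3)), np.eye(3)]])
    J_m = shift @ J                                  # Jacobian w.r.t. frame As
    twist = np.concatenate([np.zeros(3), omega_s])   # v_s = 0, pure rotation
    return np.linalg.pinv(J_m) @ twist               # joint-space velocities
```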
Embodiment 2
In Embodiment 2, the positioning device uses a binocular camera to monitor obstacles; the specific pose control method is as follows:
Y1: Similar to S1 of Embodiment 1, first set up and fix the optical markers, which may include the femoral marker 11, tibial marker 13, tool marker 3, and base marker 15 placed on the corresponding parts, and mount the optical tracking system 6 on the robotic arm 18.
Y2: Similar to step S2 of Embodiment 1, the optical tracking system 6 tracks the position information of the optical markers in real time, and the monitoring area is determined from the positions of the optical tracking system 6 and the optical markers.
Likewise, the real-time tracking of the optical markers by the optical tracking system 6 includes: obtaining a minimal enclosing sphere 20 from the position information of the multiple optical markers, with all optical markers lying within it; the robotic arm 18 moves the optical tracking system 6 so that its optical axis 21 passes through the center O of the minimal enclosing sphere. The surgical area may be the space enclosed by the minimal enclosing sphere 20 or a larger region containing it; in the following description, this embodiment takes the surgical area to be the space enclosed by the minimal enclosing sphere 20.
The step of determining the monitoring area from the positions of the optical tracking system 6 and the optical markers specifically includes: taking the optical axis 21 of the optical tracking system 6 as the central axis, the farthest distance from the minimal enclosing sphere 20 to the optical tracking system 6 along the central axis as the height H, and the distance threshold L from the optical axis 21 as the radius, the resulting cylindrical space 22 is the monitoring area. Objects within this spatial region are treated as obstacles. The radius L of the cylindrical space 22 is not smaller than the radius of the minimal enclosing sphere 20, and the actual value of the distance threshold L from the optical axis 21 can be set according to monitoring needs.
Y3: Install the positioning device, i.e., install the binocular camera. As shown in FIG. 10, the two cameras 25 and 26 of the binocular camera are mounted on the optical tracking system 6, distributed symmetrically about the optical axis 21 of the optical tracking system 6 and each placed near one of the two sensors of the optical tracking system 6, with their optical axes parallel. Mounting the two cameras near the two sensors brings the field of view of the binocular camera as close as possible to that of the optical tracking system 6. Owing to the special design of the optical tracking system 6, its sensors are mainly used to recognize and track specific optical markers and have difficulty recognizing other objects, such as obstacles, within the field of view. After the binocular camera is installed, it can precisely track and recognize objects, such as obstacles, that lie within the field of view of the optical tracking system 6 but cannot be accurately recognized by it.
Y4: Establish the transformation relationships among the coordinate system A of the optical tracking system 6, the coordinate system B of the binocular camera, and the coordinate system R of the robotic arm 18. The transformation between coordinate system A and coordinate system R can be computed from calibration and the kinematic information of the robotic arm 18, and coordinate system B can likewise be related to coordinate system A by calibration, so that the coordinate systems of the whole system are unified: the positions of the optical markers are measured by the optical tracking system 6, and the positions of obstacles by the binocular camera.
Y5: The binocular camera acquires monitoring images of its field of view in real time, and whether an obstacle exists in the monitored space is judged from them; if an obstacle exists, its position information in the monitored space is obtained.
Unlike Embodiment 1, the monitoring images captured by the binocular camera are two two-dimensional images with disparity. Therefore, when judging whether an obstacle exists in the monitored space, the monitoring projection regions corresponding to the monitored space must first be determined on the two 2-D monitoring images of the binocular camera. In general, as shown in FIG. 11, the minimal enclosing sphere 20 projects onto the imaging planes 27 and 29 of the monocular cameras, producing the corresponding projection regions 28 and 30 on the two 2-D monitoring images. When the projection of another object covers these two regions, the optical markers may be occluded. These two circular projection regions 28 and 30, together with the surrounding annular region 31 of width a (a being the difference between the radius L of the monitored space and the radius of the minimal enclosing sphere 20), as shown in FIG. 12, constitute the monitoring projection regions corresponding to the monitored space on the 2-D monitoring images. By comparing the projection regions corresponding to the minimal enclosing sphere 20 on the two 2-D monitoring images, as shown in FIG. 13, the disparity D of the minimal enclosing sphere 20 in the binocular camera is obtained. To judge whether an obstacle exists in the monitored space, it is first checked whether the projection of any other object falls within the monitoring projection regions on the monitoring images. If not, no obstacle exists in the monitored space. If it does, the disparity d of that object in the binocular camera is further determined from the two 2-D monitoring images. If d > D, the object is closer to the binocular camera than the minimal enclosing sphere 20, and an obstacle is judged to exist in the monitored space; if d < D, the object is farther from the binocular camera than the minimal enclosing sphere 20 and will not occlude the tracking of the optical markers by the optical tracking system 6, so no obstacle is judged to exist in the monitored space, and the object can be ignored.
In Embodiment 2, obtaining the position information of the obstacle in the monitored space also differs from Embodiment 1. The step may specifically include: in the coordinate system B of the binocular camera, as shown in FIG. 14, the projection of the obstacle in the monitoring image is the obstacle projection region 32, and the projection of the center O of the minimal enclosing sphere 20 in the monitoring image is O'; the position, within the obstacle projection region 32, of the point P' of that region closest to the projected center O' is taken as the position information of the obstacle in the monitored space.
Y6: Likewise, the motion parameters for obstacle avoidance of the optical tracking system 6 can be obtained from the position information of the obstacle, so that the optical markers do not leave the field of view while the occlusion by the obstacle is avoided. The specific operation may be as follows: in the coordinate system B of the binocular camera, the vector $\overrightarrow{P'O'}$ is determined from the position O' of the projected center and the position P' of the obstacle; from this vector, the motion parameters for obstacle avoidance of the optical tracking system 6, including the motion velocity $v$, are obtained. In coordinate system B, the motion direction 33 of the optical tracking system 6 is the direction of the vector $\overrightarrow{P'O'}$, and the magnitude of the velocity $v$ is negatively correlated with $|\overrightarrow{O'P'}|$: the smaller $|\overrightarrow{O'P'}|$, the closer the obstacle is to the line of sight 21 of the optical tracking system 6 and the more severe the occlusion, and the greater the motion speed of the optical tracking system 6 should accordingly be.
In addition, Embodiment 2 also provides a scheme for determining the motion parameters of the optical tracking system 6 when multiple obstacles occlude the monitored space. For example, when two different obstacles occlude the monitored space, as shown in FIG. 15, the obstacle projection regions 32 and 34 of the two obstacles on the monitoring image are determined by the above method, and the points P1 and P2 of the obstacle projection regions 32 and 34 closest to the projected center O' are further determined; the positions of P1 and P2 represent the position information of the two obstacles in the monitored space. The vectors $\overrightarrow{P_1 O'}$ and $\overrightarrow{P_2 O'}$ are then determined, and the velocities $v_1$ and $v_2$ with which the optical tracking system 6 would avoid each of the two obstacles individually are computed. Finally, by the method of velocity composition, the velocity $v$ with which the optical tracking system 6 avoids both obstacles simultaneously is obtained as the vector sum of the two velocities $v_1$ and $v_2$.
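A minimal sketch of this image-space velocity composition, assuming 2-D pixel coordinates for O' and each Pi; the gain k and the speed cap are hypothetical parameters of the sketch.

```python
import numpy as np

def avoidance_velocity_image_space(O_proj, obstacle_pts, k=50.0, v_max=0.05):
    """Compose an avoidance velocity in image space (frame B).

    O_proj       : 2-D projection O' of the sphere center.
    obstacle_pts : list of 2-D points P_i, one per obstacle projection region.
    Each per-obstacle speed grows as |O'P_i| shrinks; the result is the
    vector sum of the individual avoidance velocities.
    """
    v = np.zeros(2)
    for P in obstacle_pts:
        PO = O_proj - np.asarray(P, dtype=float)   # direction P_i -> O'
        dist = max(np.linalg.norm(PO), 1e-6)
        speed = min(k / dist, v_max)               # negative correlation
        v += speed * PO / dist
    return v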
After the motion trajectory of the optical tracking system 6 in coordinate system B is determined, it must still be transformed into the final coordinate system R of the robotic arm 18, from which the motion parameters of each joint of the robotic arm 18 are derived by inverse computation.
In the scheme of Embodiment 2, as shown in FIG. 16, the avoidance motion of the optical tracking system 6 in coordinate system R is a rotation about the center position O of the minimal enclosing sphere 20. The motion parameters in coordinate system R include the rotational angular velocity $\omega_r$ and the rotation radius r. The steps to obtain these motion parameters include:
obtaining, from the vector $\overrightarrow{P'O'}$, the motion parameter $v_B$ of the optical tracking system 6 in coordinate system B; obtaining, from the transformation between coordinate systems B and A and from $v_B$, the motion parameter $v_A$ of the optical tracking system in coordinate system A; obtaining, from the transformation between coordinate systems A and R and from $v_A$, the motion parameter $v_R$ of the optical tracking system in coordinate system R; and obtaining the rotational angular velocity $\omega_r$ from $v_R$ and the vector $\overrightarrow{OC}$ from the center O of the minimal enclosing sphere to the optical-axis center C of the optical tracking system, the rotation radius being $r = |\overrightarrow{OC}|$.
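A minimal sketch of this chain of transformations, assuming 3x3 rotation matrices R_A_B (B to A) and R_R_A (A to R) from the calibration above, and assuming v_B has already been lifted from the 2-D image-space direction into a 3-D camera-frame velocity. Recovering omega_r via the rigid-body relation v = omega x OC is an assumption of the sketch, consistent with the radius r = |OC| stated above but not an explicit formula of this application.

```python
import numpy as np

def angular_velocity_in_R(v_B, R_A_B, R_R_A, OC_in_R):
    """Chain the avoidance velocity B -> A -> R and convert it into a
    rotation about the sphere center O.

    v_B     : avoidance velocity expressed in binocular-camera frame B.
    R_A_B   : rotation taking frame-B vectors into frame A (calibration).
    R_R_A   : rotation taking frame-A vectors into frame R (arm kinematics).
    OC_in_R : vector from sphere center O to optical-axis center C, in R.
    """
    v_A = R_A_B @ v_B
    v_R = R_R_A @ v_A
    r = np.linalg.norm(OC_in_R)
    omega_r = np.cross(OC_in_R, v_R) / r**2    # from v = omega x OC
    return omega_r, r
```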
After the motion trajectory of the optical tracking system 6 in coordinate system R is obtained, the avoidance method further includes Y7: the robotic arm 18 drives the optical tracking system 6 to move clear of the obstacle. Specifically, the motion parameters of each joint of the robotic arm 18 can be obtained via robot inverse kinematics from the motion parameters of the optical tracking system 6 in the coordinate system R of the robotic arm 18. The joints of the robotic arm 18 then move to drive the optical tracking system 6 away from the obstacle. Similar to Embodiment 1, the process Y2-Y7 is repeated to monitor obstacles in the monitored space in real time.
For the method of obtaining the motion parameters of each joint of the robotic arm 18 via robot inverse kinematics, reference may be made to the method of Embodiment 1, which is not repeated here.
The present application also provides computer equipment, including: at least one memory storing at least one computer program, and at least one processor executing the computer program to implement the control method of the optical tracking system mentioned in the above embodiments of the present application. The computer equipment of the present application may be integrated in the navigation trolley 9 or the surgical trolley 1, and besides the memory and processor mentioned above may also include peripherals such as the auxiliary display 7, main display 8, and keyboard 10.
The present application also protects a support device for supporting the optical tracking system 6, where the optical tracking system 6 is used to acquire position information of optical markers during surgery. The support device includes: at least one joint, the joint providing motion in at least one degree of freedom; and a controller, electrically connected to each joint, for controlling the motion of the at least one joint according to received control instructions, the control instructions coming from the computer equipment mentioned above. The support device of the present application may be, for example, the robotic arm 18, which includes at least one joint and a built-in controller. Here, a control instruction is information generated by the computer equipment, for communication with the support device, according to the operation to be executed by the support device. For example, the computer equipment controls the support device to adjust the posture of the optical tracking system as follows: the computer equipment generates, by analyzing the monitoring image, a control instruction for adjusting the posture of the support device and sends it to the support device; using a kinematic model, the support device converts the posture in the control instruction into driving data, such as torque and angular velocity, for controlling the at least one joint, and sends the data to the driver of each joint so that the drivers perform the adjustment.
For higher integration, the computer equipment may be built directly into the support device.
The present application also protects an optical navigation system, including an image acquisition device comprising a first camera module and a second camera module, where the first camera module is used to acquire a positioning image containing at least one optical marker, the position of the at least one optical marker identifying a surgical area, and the second camera module is used to acquire a monitoring image corresponding to the monitoring area between the surgical area and the first camera module. The optical navigation system also includes: the support device described above, assembled with the image acquisition device; and a processing device, electrically connected with the image acquisition device and the support device, for executing the control method of the optical tracking system described above and for determining the position information of the at least one optical marker from the positioning image. The function of the first camera module is similar to that of the optical tracking system 6 mentioned above: it acquires positioning images of the optical markers and thereby tracks them. The function of the second camera module is similar to that of the positioning device mentioned above: it acquires monitoring images of the monitoring area. The two camera modules can be integrated on one image acquisition device, and the first and second camera modules can have overlapping fields of view.
The present application also provides a surgical robot system, including: an optical navigation system, for determining the position information of at least one optical marker from a captured positioning image containing the at least one optical marker, where the position of the at least one optical marker identifies a surgical area; a support device, for mounting the optical navigation system; a monitoring device, for acquiring a monitoring image corresponding to the monitoring area between the surgical area and the first camera module; a surgical robotic arm, for holding surgical instruments; and the computer equipment described above, communicatively connected to the support device, the optical navigation system, the monitoring device, and the surgical robotic arm. The computer equipment sends control instructions to the support device by executing the control method, so that the support device adjusts the position and/or posture of the optical navigation system; the computer equipment is also used to issue control instructions to the surgical robotic arm according to the position information of the at least one optical marker, so that the surgical robotic arm adjusts the position and/or posture of the mounted surgical instrument. The monitoring device may be configured in the optical navigation system.
The present application also protects a computer-readable storage medium for storing a computer program which, when executed by a processor, implements the pose control method of the optical tracking system mentioned in the embodiments of the present application. The readable storage medium of the embodiments of the present application may use any combination of one or more computer-readable media. A readable medium may be a computer-readable signal medium or a computer-readable storage medium. A computer-readable storage medium may be, for example, but not limited to, an electrical, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination thereof. More specific examples (a non-exhaustive list) of computer-readable storage media include: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random-access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the above. Herein, a computer-readable storage medium may be any tangible medium containing or storing a program that can be used by or in combination with an instruction execution system, apparatus, or device.
A computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, carrying computer-readable program code. Such a propagated data signal may take many forms, including but not limited to an electromagnetic signal, an optical signal, or any suitable combination of the above. A computer-readable signal medium may also be any computer-readable medium other than a computer-readable storage medium that can send, propagate, or transmit a program for use by or in combination with an instruction execution system, apparatus, or device.
It should be noted that the computer program code for carrying out the operations of the present application may be written in one or more programming languages or a combination thereof, including object-oriented programming languages such as Java, Smalltalk, and C++, as well as conventional procedural programming languages such as the "C" language or similar languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. Where a remote computer is involved, it may be connected to the user's computer through any kind of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, via the Internet using an Internet service provider).
In summary, compared with the prior art, the pose control method of the optical tracking system and the applicable optical navigation system, surgical robot system, computer equipment, support device, and computer-readable storage medium provided by the present application have the following advantages:
By acquiring monitoring images of the monitoring area in real time during surgery, the pose control method provided by this application can determine whether an obstacle is occluding the optical markers in the monitoring area, identify obstacles of any shape, and adjust the optical tracking system so that the monitoring area remains unblocked; no artificial markers need to be added, so the method applies to a wide range of scenarios.
The pose control method provided in this application can further plan an obstacle-avoidance motion of the optical tracking system according to the position information of the obstacle and move the optical tracking system along the planned trajectory, so the optical markers are not occluded. Moreover, throughout the avoidance motion the method keeps the tracked optical markers within the central field of view of the optical tracking system, so the surgical navigation process is not interrupted.
The support device provided by this application has a controllable motion function and automatically moves the optical tracking system clear of obstacles according to the planned avoidance motion; it therefore needs no manual adjustment by the doctor and prevents the navigation process from being interrupted by occluded markers.
The optical navigation system provided by this application is highly integrated and has an obstacle-avoidance function; by executing the control method of the optical tracking system of the present application, it solves the problem of occluding objects in the monitored space.
The surgical robot system of the present application combines stereo vision with robotics to solve the common problem of the monitoring area being blocked in optical surgical navigation systems. The entire navigation adjustment system does not contact patients or medical staff, requires no sterilization, and reduces the possibility of infection. Moreover, its avoidance motion does not change the workflow of the original surgical navigation system: the doctor needs no additional software or hardware operations, and the original functions are unaffected. This flattens the physician's learning curve and improves operating-room utilization.
The above description is only a description of the preferred embodiments of the present application and in no way limits its scope; any change or modification made by a person of ordinary skill in the art in light of the above disclosure falls within the scope of protection of the claims. Obviously, those skilled in the art can make various changes and variations to the invention without departing from the spirit and scope of the present application; if these modifications and variations fall within the scope of the claims of the present application and their technical equivalents, the present application is also intended to include them.

Claims (15)

  1. A pose control method of an optical tracking system, characterized by including the following steps:
    acquiring at least one monitoring image of the monitoring area formed between the optical tracking system and a surgical area, where the surgical area is determined based on the position of at least one optical marker;
    when it is determined from the monitoring image that an occluding object exists in the monitoring area, adjusting the position and/or posture of the optical tracking system so that the occluding object lies outside the new monitoring area formed between the adjusted optical tracking system and the surgical area.
  2. The pose control method according to claim 1, characterized in that the surgical area is greater than or equal to the spatial range bounded by the position of the at least one optical marker.
  3. The pose control method according to claim 2, characterized in that the monitoring area includes the spatial range enclosed by the boundary of the surgical area and the viewing angle of the optical tracking system.
  4. The pose control method according to claim 1, characterized in that the step of determining from the monitoring image that an occluding object exists in the monitoring area includes at least one of the following:
    determining that an occluding object exists in the monitoring area by extracting a foreground image from the image region of the monitoring image corresponding to the monitoring area;
    determining that an occluding object exists by detecting changes between the image regions corresponding to the monitoring area in at least two monitoring images;
    when the monitoring image is a depth image, determining that an occluding object exists by detecting an entity position within the image region of the monitoring image corresponding to the monitoring area; and
    determining that an occluding object exists by detecting the disparity between pairs of image data within the image regions corresponding to the monitoring area in at least two monitoring images.
  5. The pose control method according to claim 1, characterized in that the step of adjusting the position and/or posture of the optical tracking system includes:
    adjusting the position and/or posture of the optical tracking system according to the posture relationship, or the relative position relationship, between the corresponding occluding object in the monitoring image and the optical tracking system.
  6. The pose control method of an optical tracking system according to claim 1, characterized in that the step of adjusting the position and/or posture of the optical tracking system includes:
    adjusting the position and/or posture of the optical tracking system step by step, in preset adjustment units, until it is determined from the acquired monitoring images that no occluding object exists in the new monitoring area formed between the adjusted optical tracking system and the surgical area.
  7. The pose control method according to claim 1, characterized in that the step of adjusting the position and/or posture of the optical tracking system includes either of the following:
    adjusting the position and posture of the optical tracking system around the surgical area as a center; and
    translating the position of the optical tracking system according to the posture relationship, or position relationship, between the occluding object and the optical tracking system determined by analyzing the monitoring image.
  8. Computer equipment, characterized by including:
    at least one memory, storing at least one computer program;
    at least one processor, executing the computer program to implement the pose control method of the optical tracking system according to any one of claims 1-7.
  9. A support device for supporting an optical tracking system, where the optical tracking system is used to acquire position information of optical markers during surgery, characterized in that the support device includes:
    at least one joint, where the joint provides motion in at least one degree of freedom;
    a controller, electrically connected to each joint, for controlling the motion of the at least one joint according to received control instructions;
    where the control instructions come from the computer equipment according to claim 8.
  10. The support device according to claim 9, characterized in that the computer equipment is built into the support device.
  11. An optical navigation system, characterized by including:
    an image acquisition device, comprising a first camera module and a second camera module;
    where the first camera module is used to acquire a positioning image containing at least one optical marker, the position of the at least one optical marker identifying a surgical area;
    the second camera module is used to acquire a monitoring image corresponding to the monitoring area between the surgical area and the first camera module;
    the optical navigation system also includes:
    the support device according to claim 9 or 10, the support device being connected with the image acquisition device.
  12. The optical navigation system according to claim 11, characterized in that the first camera module and the second camera module have overlapping fields of view.
  13. A surgical robot system, characterized by including:
    an optical navigation system, for determining the position information of at least one optical marker from a captured positioning image containing the at least one optical marker, where the position of the at least one optical marker identifies a surgical area;
    a support device, for mounting the optical navigation system;
    a monitoring device, for acquiring a monitoring image corresponding to the monitoring area between the surgical area and the first camera module;
    a surgical robotic arm, for holding surgical instruments; and
    the computer equipment according to claim 8, communicatively connected to the support device, the optical navigation system, the monitoring device, and the surgical robotic arm; where the computer equipment sends control instructions to the support device by executing the pose control method, so that the support device adjusts the position and/or posture of the optical navigation system; and the computer equipment is also used to issue control instructions to the surgical robotic arm according to the position information of the at least one optical marker, so that the surgical robotic arm adjusts the position and/or posture of the mounted surgical instrument.
  14. The surgical robot system according to claim 13, characterized in that the monitoring device is configured in the optical navigation system.
  15. A computer-readable storage medium, characterized in that it stores a computer program which, when executed, implements the pose control method according to any one of claims 1-7.
PCT/CN2022/101378 2021-06-30 2022-06-27 Pose control method and applicable optical navigation system and surgical robot system WO2023274100A1 (zh)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
CN202110735945.8 2021-06-30
CN202110735945 2021-06-30
CN202110785262.3A CN113476141B (zh) 2021-06-30 2021-07-12 Pose control method and applicable optical navigation system and surgical robot system
CN202110785262.3 2021-07-12

Publications (1)

Publication Number Publication Date
WO2023274100A1 true WO2023274100A1 (zh) 2023-01-05

Family

ID=77938724

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/101378 WO2023274100A1 (zh) 2021-06-30 2022-06-27 Pose control method and applicable optical navigation system and surgical robot system

Country Status (2)

Country Link
CN (1) CN113476141B (zh)
WO (1) WO2023274100A1 (zh)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112914755A (zh) * 2021-01-25 2021-06-08 深圳市奥昇医疗科技有限责任公司 Surgical tracking system and control method thereof
CN113476141B (zh) 2021-06-30 2023-02-10 苏州微创畅行机器人有限公司 Pose control method and applicable optical navigation system and surgical robot system
CN113954082B (zh) * 2021-12-23 2022-03-08 真健康(北京)医疗科技有限公司 Control method, control device, and auxiliary system for a puncture-surgery robotic arm
CN115381554B (zh) * 2022-08-02 2023-11-21 北京长木谷医疗科技股份有限公司 Intelligent position adjustment system and method for an orthopedic surgical robot
CN116849727B (zh) * 2023-06-19 2024-05-14 北京纳通医用机器人科技有限公司 State monitoring system, method, device, and storage medium for a surgical robot

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104688351A (zh) * 2015-02-28 2015-06-10 华南理工大学 Occlusion-free positioning method for surgical instruments based on two binocular vision systems
CN108472096A (zh) * 2015-12-31 2018-08-31 史赛克公司 System and method for performing surgery on a patient at a target site defined by a virtual object
US20200015909A1 (en) * 2018-07-16 2020-01-16 Mako Surgical Corp System and method for image based registration and calibration
CN110897717A (zh) * 2019-12-09 2020-03-24 苏州微创畅行机器人有限公司 Navigation surgical system, registration method thereof, and electronic device
CN111417352A (zh) * 2016-10-21 2020-07-14 Gys科技有限责任公司(经营名称为卡丹机器人) Method and system for setting trajectory and target location for image-guided surgery
WO2021003401A1 (en) * 2019-07-03 2021-01-07 Stryker Corporation Obstacle avoidance techniques for surgical navigation
CN113476141A (zh) * 2021-06-30 2021-10-08 苏州微创畅行机器人有限公司 Pose control method and applicable optical navigation system and surgical robot system

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102657531B (zh) * 2012-04-28 2015-07-15 深圳泰山在线科技有限公司 Computer-vision-based method and device for measuring human torso girth
ES2647226T3 (es) * 2014-11-26 2017-12-20 Masmec S.P.A. Computer-assisted system for guiding a surgical/diagnostic instrument in the body of a patient
CN105030331A (zh) * 2015-04-24 2015-11-11 长春理工大学 Device and method for calibrating a position sensor and a three-dimensional laparoscopic camera
CN105496519B (zh) * 2015-12-31 2018-10-30 精微视达医疗科技(武汉)有限公司 B-ultrasound-guided puncture navigation system
US11684425B2 (en) * 2018-03-16 2023-06-27 Shimadzu Corporation X-ray fluoroscopic imaging apparatus
CN112472297B (zh) * 2020-11-26 2022-03-29 上海微创医疗机器人(集团)股份有限公司 Pose monitoring system and method, surgical robot system, and storage medium


Also Published As

Publication number Publication date
CN113476141A (zh) 2021-10-08
CN113476141B (zh) 2023-02-10

Similar Documents

Publication Publication Date Title
WO2023274100A1 (zh) Pose control method and applicable optical navigation system and surgical robot system
US11844574B2 (en) Patient-specific preoperative planning simulation techniques
JP7233841B2 (ja) ロボット外科手術システムのロボットナビゲーション
AU2022203687B2 (en) Method and system for guiding user positioning of a robot
EP3212109B1 (en) Determining a configuration of a medical robotic arm
EP2967348B1 (en) Intelligent positioning system
US11717351B2 (en) Navigation surgical system, registration method thereof and electronic device
US20110306873A1 (en) System for performing highly accurate surgery
JP7367990B2 (ja) 手術室の遠隔監視
WO2022237538A1 (zh) Surgical robot system, adjustment system, and storage medium
JP2023530652A (ja) コンピュータ支援インターベンション用の空間認識ディスプレイ
EP3200719B1 (en) Determining a configuration of a medical robotic arm
WO2023047395A1 (en) Systems and methods for work volume mapping to facilitate dynamic collision avoidance
CN115429432A (zh) Readable storage medium, surgical robot system, and adjustment system
Zheng et al. Automatic Tracking Motion Based on Flexible Forbidden Virtual Fixtures Design in Robot Assisted Nasal Surgery
WO2023286052A1 (en) Path planning based on work volume mapping

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22831914

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE