WO2023280326A1 - An active navigation system for surgical operations and a control method thereof - Google Patents

An active navigation system for surgical operations and a control method thereof

Info

Publication number
WO2023280326A1
Authority
WO
WIPO (PCT)
Prior art keywords
positioning
pose
optimal
sensor
tools
Prior art date
Application number
PCT/CN2022/109446
Other languages
English (en)
French (fr)
Inventor
秦岩丁
韩建达
王鸿鹏
游煜根
宋志超
蒙一扬
Original Assignee
南开大学深圳研究院
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 南开大学深圳研究院 (Shenzhen Research Institute of Nankai University)
Priority to US18/268,316 priority Critical patent/US20240050161A1/en
Publication of WO2023280326A1 publication Critical patent/WO2023280326A1/zh

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00: Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/20: Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B 34/30: Surgical robots
    • A61B 34/70: Manipulators specially adapted for use in surgery
    • A61B 34/10: Computer-aided planning, simulation or modelling of surgical operations
    • A61B 2034/107: Visualisation of planned trajectories or target regions
    • A61B 2034/2046: Tracking techniques
    • A61B 2034/2055: Optical tracking systems
    • A61B 2034/2057: Details of tracking cameras
    • A61B 2034/2059: Mechanical position encoders
    • A61B 2034/2065: Tracking using image or pattern recognition
    • A61B 2034/2068: Surgical navigation systems using pointers, e.g. pointers having reference marks for determining coordinates of body points
    • A61B 2034/2072: Reference field transducer attached to an instrument or patient
    • A61B 90/00: Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B 90/36: Image-producing devices or illumination devices not otherwise provided for
    • A61B 90/361: Image-producing devices, e.g. surgical cameras
    • A61B 90/90: Identification means for patients or instruments, e.g. tags
    • A61B 90/94: Identification means coded with symbols, e.g. text
    • A61B 90/96: Identification means coded with symbols using barcodes

Definitions

  • The invention relates to the technical field of medical equipment, in particular surgical robots, and specifically to an active navigation system for surgical operations and a control method thereof.
  • An assisted surgery system can accurately locate the surgical site and operating tools to assist doctors in minimally invasive surgery, remote surgery, or robot-assisted surgery.
  • Surgical navigation relies on optical navigation equipment to detect and identify optical positioning tools and performs image processing and pose calculation to realize the positioning of surgical sites or surgical tools.
  • Conventionally, the surgical navigation equipment is adjusted manually by the assisting doctor according to the needs of the operation.
  • The optical navigation device is moved to a suitable observation position by dragging the handle of the device.
  • This interaction is inconvenient during actual operations; for some special surgical-site configurations it is difficult to reach a suitable measurement position by hand alone, and positioning accuracy cannot be guaranteed.
  • the present invention provides an active navigation system for surgical operations and a control method thereof.
  • The technical solution of the present invention obtains the optimal observation pose of the robot for surgical navigation and positioning, actively adjusts the pose in real time, prevents the navigation target locator from being occluded, and improves positioning accuracy during the navigation process.
  • a control method for the active navigation system of the above-mentioned surgical operation includes the following steps:
  • Step 1 Multi-objective optimization of the measurement view angle: input the position parameters of the positioning tools, set the other relevant parameters, and solve for the set of optimal measurement view angles through multi-objective optimization;
  • Step 2 Multi-objective decision-making of the manipulator pose: according to the set of optimal measurement view angles, use a multi-objective decision-making algorithm to recommend to the user the optimal manipulator pose for each stage of the operation, or let the user choose the optimal manipulator pose scheme for each stage according to preference;
  • Step 3 Robotic arm path planning and execution: according to the selected optimal pose scheme for each stage of the operation, plan the path of the robotic arm from its current pose to the optimal pose.
  • said step 1 includes the following steps:
  • Step 1.1 Obtain the information of all positioning tools and their locations in each link in the operation process, and establish a multi-objective minimization problem based on the decision variable x:
  • x = (q_1, q_2, q_3, ..., q_N), where q_1, ..., q_N are joint variables and N is the number of joint variables; the decision variable x is the vector composed of the manipulator's N joint variables, and its value range is the realizable joint range Q of the manipulator, that is, x ∈ Q;
  • Step 1.2 define at least two objective functions f 1 and f 2 for minimization optimization, specifically as follows:
  • f_1 represents the maximum distance between the coordinate origins of all positioning tools and the coordinate origin of the positioning sensor;
  • O_min(j, k) represents, for a given pair of positioning tools j and k, the smaller of the unoccluded margin function values in the two camera coordinate frames of the positioning sensor;
  • min_{j,k∈S} O_min(j, k) represents, under the manipulator pose determined by x, the smallest unoccluded margin function value among the pairwise combinations of all positioning tools measured by all cameras of the positioning sensor; f_2 is obtained by taking its opposite;
  • G is the coordinate origin of the left or right camera of the positioning sensor;
  • L and R are the coordinate origins of the left and right cameras of the positioning sensor, respectively;
  • M_j and M_k are the centers of the smallest circumscribed spheres of any two positioning tools j and k, that is, the coordinate origins of positioning tools j and k; l_j and l_k are the radii of those smallest circumscribed spheres;
  • r_j and r_k are the expansion (margin) radii of positioning tools j and k, respectively;
  • the margin coefficient λ is a constant greater than 1; the vector lengths |GM_j| and |GM_k| are obtained from positioning-sensor measurements;
  • · denotes the vector dot product;
  • Step 1.3 Set the following constraints; while ensuring the constraints below are satisfied, minimize the at least two objective functions f_1 and f_2 simultaneously:
  • Constraint 1 means that any positioning tool must be within the detectable range of both the positioning sensor and the environment perception sensor;
  • Constraint 2 means that the angle between the line from the camera on either side of the positioning sensor to any positioning tool and the z-axis direction of that positioning tool cannot be greater than the predetermined threshold, i.e. θ_{G,i} ≤ Th; θ_{G,i} represents the angle between the vector from the coordinate origin of the i-th positioning tool to the coordinate origin of the left or right camera of the positioning sensor and the z-axis direction vector of the i-th positioning tool; Th is the preset threshold;
  • Constraint 3 means that any two positioning tools do not occlude each other, that is, the minimum value of the non-occlusion margin function O(j, k, G) between any two positioning tools is non-negative.
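The closed form of O(j, k, G) is not reproduced in this extract. As a minimal sketch only, assuming the margin is the angular separation of the two sight lines G→M_j and G→M_k minus the angular half-widths of the two margin spheres of radii r_j and r_k (an assumption consistent with the geometry of Fig. 5, not the patent's verbatim formula), constraint 3 could be checked as:

```python
import math

def unoccluded_margin(G, Mj, Mk, rj, rk):
    """Sketch of an unoccluded margin O(j, k, G): angular separation of the
    sight lines G->Mj and G->Mk minus the angular half-widths of the two
    margin spheres. Non-negative => tools j and k do not occlude each other
    as seen from camera origin G. (Assumed form, for illustration only.)"""
    vj = [m - g for m, g in zip(Mj, G)]
    vk = [m - g for m, g in zip(Mk, G)]
    nj = math.sqrt(sum(c * c for c in vj))
    nk = math.sqrt(sum(c * c for c in vk))
    cos_a = sum(a * b for a, b in zip(vj, vk)) / (nj * nk)
    separation = math.acos(max(-1.0, min(1.0, cos_a)))
    return separation - math.asin(rj / nj) - math.asin(rk / nk)
```

With well-separated tools the margin is positive; when one margin sphere drifts toward the sight line of another, the value goes negative, which is what constraint 3 forbids.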
  • According to the set of optimal measurement view angles, the multi-objective decision-making algorithm recommends to the user the optimal manipulator pose scheme for each stage of the operation, through the following steps:
  • Step 2.1 Find the optimal solution for each single objective within the set of optimal measurement view angles, and compute the equation of the straight line through the two endpoints of the curve corresponding to that set:
  • Step 2.2 Compute the vertical distance d from each point on the curve corresponding to the set of optimal measurement view angles to the above line, substituting each point's objective values into the formula d = |A·f_1 + B·f_2 + C| / sqrt(A² + B²);
  • Step 2.3 Take the solution of the optimal measurement view angle whose vertical distance d is maximal as the recommended multi-objective decision value for the manipulator joint values;
  • A, B, and C are obtained by solving the line equation from the objective values of the single-objective optimal solutions.
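Steps 2.1-2.3 describe a classic knee-point rule on a two-objective Pareto front. A minimal sketch (function name illustrative, not from the patent):

```python
import math

def knee_point(front):
    """Steps 2.1-2.3: pick the front point with maximum perpendicular
    distance d to the line A*f1 + B*f2 + C = 0 through the two
    single-objective optima (the endpoints of the front)."""
    p = min(front, key=lambda s: s[0])   # optimum of f1
    q = min(front, key=lambda s: s[1])   # optimum of f2
    A, B = q[1] - p[1], p[0] - q[0]      # line through p and q
    C = -(A * p[0] + B * p[1])
    norm = math.hypot(A, B)
    return max(front, key=lambda s: abs(A * s[0] + B * s[1] + C) / norm)
```

For the front [(0, 3), (1, 1), (3, 0)], the endpoints give the line f_1 + f_2 = 3, and (1, 1) is recommended as the knee.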
  • said step 3 includes the following steps:
  • Step 3.1 During the operation, after entering the designated operation stage, obtain the target pose of the current stage from the preoperative optimal manipulator pose solution obtained by multi-objective decision-making and the optimal pose scheme of the manipulator during the operation;
  • Step 3.2 The environment perception sensor obtains three-dimensional information about the surgical robot's surroundings and generates a point cloud image C_B of the environment; the point-cloud position information C_N in the positioning-sensor coordinate frame is obtained as C_N = T · C_B, with T the calibrated transformation from the environment-sensor frame C to the positioning-sensor frame N;
  • Step 3.3 Randomly generate a candidate path point;
  • Step 3.4 Determine whether the candidate path point would encounter an obstacle; if so, return to Step 3.3; otherwise, continue to the next step;
  • Step 3.5 Determine whether all positioning tools can be detected from this pose; if not, return to Step 3.3; otherwise, continue to the next step;
  • the positioning tools must satisfy constraint conditions 1-3 above;
  • Step 3.6 Add the current candidate path point to the path directory to build a feasible path plan;
  • Step 3.7 Determine whether the target pose has been reached; if not, return to Step 3.3; otherwise, take the shortest path in the current path directory as the robotic arm's motion path;
  • Step 3.8 Execute the above path so that the robotic arm of the surgical robot reaches the target pose.
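Steps 3.3-3.8 amount to a sampling-based planner that rejects waypoints which collide or break tool visibility. A simplified 2-D sketch (greedy acceptance stands in for the patent's path directory and shortest-path search; all names are illustrative):

```python
import math
import random

def plan_path(start, goal, collides, visible, step=0.5, tries=5000, seed=0):
    """Randomly sample candidate waypoints (step 3.3), reject those that hit
    an obstacle (3.4) or from which some positioning tool is undetectable
    (3.5), keep accepted waypoints (3.6), and stop at the target (3.7)."""
    rng = random.Random(seed)
    path = [start]
    for _ in range(tries):
        cur = path[-1]
        if math.dist(cur, goal) <= step:       # step 3.7: target reached
            path.append(goal)
            return path
        ang = rng.uniform(0.0, 2.0 * math.pi)  # step 3.3: random candidate
        cand = (cur[0] + step * math.cos(ang),
                cur[1] + step * math.sin(ang))
        if collides(cand) or not visible(cand):
            continue                           # steps 3.4-3.5: reject
        if math.dist(cand, goal) < math.dist(cur, goal):
            path.append(cand)                  # step 3.6: accept waypoint
    return None                                # no path found in budget
```

The `collides` and `visible` callbacks stand in for the obstacle check against the environment point cloud and for constraints 1-3 on the positioning tools.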
  • An active navigation system for surgical operations that can implement the control method described above. The system includes: a control host, a serial mechanical arm with any number of degrees of freedom, a positioning sensor with one or more adapted positioning tools, and an environment perception sensor; the overlap of the measurement areas of the environment perception sensor and the positioning sensor is the measurable area of the active navigation system for surgical operations.
  • The number of positioning tools is one or more. Each positioning tool has K positioning components distributed according to a certain positional relationship. The positioning components are specific markers that can reflect or emit light, and/or parts formed by arranging several specific patterns in a certain positional relationship. The reflective markers include at least small balls covered with a highly reflective coating; the light-emitting markers include at least LED lights; the specific patterns are specially encoded designs, including at least QR codes and Gray codes.
  • The positions and/or numbers of the positioning components differ between positioning tools so that the tools can be distinguished; the centroids of the K positioning components of the same positioning tool all lie in one plane.
  • A specific shape feature is designed at the center of each positioning tool, and the intersection of the feature's axis with the plane containing the positioning-component centroids is taken as the coordinate origin.
  • The shape feature can be at least a circular hole, a hemisphere, a boss, or a cone.
  • With the coordinate origin as the sphere center, construct for each positioning tool the smallest circumscribed sphere that envelops its K positioning components; the radius of this minimum circumscribed sphere is l_i.
  • The normal direction of the plane containing the centroids of the K positioning components is the z-axis direction, and the positive z-axis points toward the side on which the K positioning components are attached.
  • The direction perpendicular to the z-axis and pointing toward the positioning component farthest from the coordinate origin is the positive x-axis; thus a three-dimensional Cartesian coordinate system is established.
  • The set of all positioning tools is denoted S; for the i-th positioning tool, the center of its coordinate system is M_i, that is, M_i ∈ S.
  • A certain margin is added on the basis of l_i, i.e., the sphere is set slightly larger than l_i; for example, multiplying l_i by a margin coefficient λ greater than 1 gives r_i, so that small discrepancies in actual operation do not cause the method to fail.
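The enveloping-sphere construction above can be sketched as follows (assuming, per the description, that the sphere is centred at the tool's coordinate origin; function name and the margin value 1.2 are illustrative):

```python
import math

def tool_radii(origin, markers, margin=1.2):
    """l: radius of the smallest sphere centred at the tool's coordinate
    origin that envelops all marker centroids; r = margin * l adds the
    safety margin (margin > 1; 1.2 is an illustrative value)."""
    l = max(math.dist(origin, m) for m in markers)
    return l, margin * l
```

The returned r is the expanded radius used by the unoccluded margin function, absorbing small real-world deviations from the nominal geometry.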
  • the invention provides an active navigation system for surgical operations and a control method thereof.
  • The technical solution of the present invention obtains the optimal observation pose of the robot for surgical navigation and positioning, actively adjusts the pose in real time, prevents the navigation target locator from being occluded, and improves positioning accuracy during the navigation process.
  • Fig. 1 is the overall structural diagram of the active navigation system for surgical operations of the present invention;
  • Fig. 2 is an implementation diagram of the active navigation system for surgical operations of the present invention;
  • Fig. 3 is a schematic diagram of the establishment of the coordinate systems in the active navigation system for surgical operations of the present invention;
  • Fig. 4 is a diagram of the positioning tool and the establishment of its coordinate system in the present invention;
  • Fig. 5 is a schematic diagram of the design of the unoccluded margin function O(j, k, G) of the present invention;
  • Fig. 6 is a schematic diagram of the observation angle θ_{G,i} of the present invention;
  • Fig. 7 is a diagram of the optimal solutions of the multi-objective optimization of the measurement view angle of the present invention;
  • Fig. 8 is a diagram of the optimal-solution recommendation method provided by the multi-objective decision-making algorithm of the present invention.
  • the invention provides an active navigation system for surgical operations and a control method thereof.
  • Fig. 1 is an overall structural diagram of the active navigation system for surgical operations of the present invention.
  • The system includes: a surgical operation planning system, a control host for data processing and robot control, a robotic arm, a positioning sensor with its adapted positioning tools, and an environment perception sensor; the environment perception sensor perceives the surgical environment, e.g. potential occlusions and/or obstacles.
  • The manipulator is a serial manipulator with 7 degrees of freedom; the positioning sensor and/or the environment perception sensor is connected to the end flange of the manipulator.
  • The positioning sensor can adopt a variety of modes, such as a visible-light binocular depth camera or a near-infrared binocular positioning camera.
  • The corresponding positioning tool is an optical two-dimensional code or other coded pattern matching the positioning sensor, or a positioning tool composed of optical balls covered with a specific coating.
  • Environment perception sensors can likewise take multiple forms, such as visible-light binocular depth cameras, lidar, or ultrasonic sensors.
  • The environment perception sensor and the positioning sensor can be two different device types, for example a near-infrared binocular positioning camera plus lidar; or they can be the same device, for example a visible-light binocular depth camera used both for positioning and for perceiving the surgical environment.
  • the spatial areas measured by the environment perception sensor and the positioning sensor must overlap with each other, and the overlapped area is the measurable area of the system.
  • Fig. 2 is an embodiment diagram of the active navigation system for surgical operation of the present invention.
  • The implementation is as follows: the system consists of a 7-DOF manipulator; a near-infrared optical positioning system (as the "positioning sensor") and a binocular camera (as the "environment perception sensor") connected to the flange at the end of the manipulator; a computer for data processing and robot control; and positioning tools adapted to the near-infrared optical positioning system.
  • The near-infrared optical positioning system here includes two infrared emitting lamps and an infrared camera for detecting reflected infrared light. Its working principle is as follows: the left and right infrared lamps emit specific infrared light, which is projected onto the surfaces of the reflective balls on the positioning tools.
  • The reflective balls reflect the infrared light, which is detected by the infrared camera; the system calculates the relative position between the near-infrared optical positioning system and each small ball from the received reflections, and then, using a pre-calibrated positioning relationship model, obtains the relative pose of each positioning tool with respect to the near-infrared optical positioning system.
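The patent relies on a pre-calibrated positioning relationship model; as background only, the generic pinhole-stereo relation below shows how a binocular rig recovers depth from the pixel disparity of a detected marker (this is textbook stereo geometry, not the patent's calibration model):

```python
def stereo_depth(focal_px, baseline_m, disparity_px):
    """Generic pinhole-stereo relation: depth z = f * b / d, with focal
    length f in pixels, baseline b in metres, and disparity d in pixels."""
    return focal_px * baseline_m / disparity_px
```

For example, a rig with an 800-pixel focal length and 0.1 m baseline seeing a marker with 8 pixels of disparity places it at 10 m.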
  • the base coordinate of the manipulator is O
  • the joint angle of the kth joint is q k
  • the origin of the coordinate system of the end flange is E.
  • the center coordinate of the near-infrared optical positioning system is N
  • the coordinates of the cameras on the left and right sides are R and L respectively.
  • the measurable area space of the near-infrared optical positioning system is A(p).
  • the coordinate system of the binocular camera is C.
  • The reference signs in Fig. 2 have the following meanings: 1—seven-degree-of-freedom mechanical arm, 2—near-infrared optical positioning system, 3—binocular camera, 4—positioning tool, 5—computer.
  • Fig. 3 is a schematic diagram of establishment of a coordinate system in the active navigation system for surgery according to the present invention.
  • the set of all positioning tools is S, and for the i-th tool, the center of its coordinate system is M i , that is, M i ⁇ S.
  • the center coordinate of the optical positioning system is N, and the coordinates of the cameras on the left and right sides are R and L respectively.
  • A(p) is the measurable area space where the fields of view of the optical positioning system and the environment perception sensor coincide; that is, the set of locations where, in the absence of occlusion, a positioning tool can be measured normally.
  • the coordinate system of the binocular camera is C.
  • Fig. 4 is a diagram of the positioning tool and its coordinate system establishment in the present invention.
  • the positioning tool is a positioning tool that matches the near-infrared optical positioning system (ie, "positioning sensor"), as shown in Figure 4.
  • Each positioning tool has 4 small balls covered with a highly reflective coating, arranged according to a certain positional relationship.
  • The centers of the four small balls of the same positioning tool all lie in one plane; the normal direction of the plane containing the centroids of the positioning components is the z-axis direction, and the positive z-axis points toward the side where the components are attached.
  • the positions and/or numbers of the small balls of each positioning tool are different, so as to distinguish the positioning tools.
  • Each positioning tool takes as its coordinate origin the intersection of the central axis of the small hole at the center of the tool's connecting rods (an example of a shape feature) with the plane containing the ball centers; the direction from this intersection point toward the ball farthest from the origin is the x-axis direction.
  • With the intersection point as the center, the smallest circumscribed sphere enveloping all the small balls is established; the radius of this circumscribed sphere is l_i.
  • the set of all positioning tools is S, and for the i-th tool, the center of its coordinate system is M i , that is, M i ⁇ S.
  • the invention provides a control method for active navigation of a surgical robot.
  • The realization of this control method includes three parts: "multi-objective optimization of the measurement view angle", "multi-objective decision-making of the manipulator pose", and "path planning and execution of the manipulator". Details are as follows:
  • Multi-objective optimization of the measurement view angle: the situation and positions of the positioning tools are input into the program; after the relevant parameters are set, the set of optimal measurement view angles is solved through multi-objective optimization.
  • Multi-objective decision-making of the manipulator pose: based on the optimal solution set obtained in the previous step, a multi-objective decision-making algorithm recommends solutions to the user, or the user selects, according to preference, a suitable pose scheme for the surgical-navigation manipulator in each stage of the operation.
  • Robotic arm path planning and execution: based on the optimal pose scheme for each stage obtained in the previous step, the algorithm plans a path for the robotic arm from its current pose to the optimal pose. During the motion it must be ensured that the positioning sensor can always normally locate all the positioning tools required for the operation and that unexpected obstacles encountered along the way are avoided, so that the arm finally reaches the optimal pose.
  • Multi-objective optimization of the measurement view angle: obtain, through the operation planning system, the information and locations of all positioning tools in each stage of the operation, and establish the following multi-objective minimization problem:
  • x = (q_1, q_2, q_3, ..., q_N), where q_1, ..., q_N are joint variables and N is the number of joint variables; the decision variable x is the vector composed of the manipulator's N joint variables, and its value range is the realizable joint range Q of the manipulator, that is, x ∈ Q;
  • the optimization objectives are as follows (simultaneous minimization of at least two objective functions f_1 and f_2):
  • Optimization objective 1: minimize the maximum distance between the positioning tools and the near-infrared optical positioning system:
  • Optimization objective 2: min_{j,k∈S} O_min(j, k) is the minimum unoccluded margin function value between positioning tools; taking its opposite transforms the problem into minimization:
  • O_min(j, k) represents, for a given pair of positioning tools j and k, the smaller of the unoccluded margin function values in the two camera coordinate frames of the positioning sensor;
  • min_{j,k∈S} O_min(j, k) represents the minimum unoccluded margin function value among the pairwise combinations of all positioning tools measured by all cameras of the positioning sensor, under the manipulator pose determined by q;
  • Fig. 5 is a schematic diagram of the design of the unoccluded margin function O(j, k, G) of the present invention and gives its definition.
  • Fig. 5 depicts the geometric relationship between any two positioning tools and the camera on the left or right side of the positioning sensor. If the number of positioning tools is greater than 2, each pair of tools together with each camera side yields a specific O(j, k, G) value; for example, 3 positioning tools yield 6 values: O(1,2,L), O(1,3,L), O(2,3,L), O(1,2,R), O(1,3,R), O(2,3,R).
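The enumeration above (each unordered tool pair times each camera side) can be sketched as:

```python
from itertools import combinations

def occlusion_terms(tool_ids, cameras=("L", "R")):
    """One O(j, k, G) term per unordered pair of tools per camera side;
    3 tools therefore yield 6 terms, matching the example for Fig. 5."""
    return [(j, k, g)
            for j, k in combinations(sorted(tool_ids), 2)
            for g in cameras]
```

In general, n tools give n·(n−1)/2 pairs and thus n·(n−1) margin values across both cameras.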
  • G refers to the origin of the camera coordinate system on the left or right side of the positioning sensor.
  • M_j and M_k are the sphere centers after any two positioning tools are abstracted into spheres; they are also the coordinate-system origins of the positioning tools.
  • r_j and r_k are the radii of the spheres into which the positioning tools are abstracted.
  • Each positioning tool takes as its coordinate origin the intersection of the central axis of the small hole at the center of the tool's connecting rods (an example of a shape feature) with the plane containing the ball centers.
  • The radius of the minimum circumscribed sphere centered at the coordinate origin is l_i.
  • Extending l_i by the margin factor λ gives the abstracted sphere radii r_j and r_k.
  • The positioning tool here consists of 4 or more coplanar connecting rods extending from a center, with small balls arranged at the rod ends; within one set of navigation equipment, the relative positions among the small balls of each positioning tool are unique. The margin coefficient λ satisfies λ > 1.
  • ⁇ G, j and ⁇ G, k can be obtained by the following relationship:
  • represents vector dot product
  • Constraint 1 means that any positioning tool must be within the detectable range of both the positioning sensor and the environment perception sensor;
  • Constraint 2 means that the angle between the line from the camera on either side of the positioning sensor to any positioning tool and the z-axis direction of that positioning tool cannot be greater than the predetermined threshold;
  • θ_{G,i} represents the angle between the vector from the coordinate origin of the i-th positioning tool to the coordinate origin of the left or right camera of the positioning sensor and the z-axis direction vector of the i-th positioning tool;
  • Constraint 3 means that any two positioning tools do not occlude each other, that is, the minimum value of the non-occlusion margin function O(j, k, G) between any two positioning tools is non-negative.
  • Fig. 6 is a schematic diagram of the observation angle ⁇ G,i of the present invention.
  • The observation angle refers to the angle between the line to the origin of the left or right camera and the Z-axis of any positioning tool (the upward-pointing normal direction of the positioning tool is fixed as the Z-axis of the tool's coordinate frame).
  • G refers to the origin of the camera coordinate frame on the left or right side of the positioning sensor; the Z-axis unit vector of the positioning tool, expressed in the G coordinate frame, is obtained from the positioning sensor and substituted into the formula. Note that the camera on either side has an observation angle value for every positioning tool.
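A minimal sketch of computing θ_{G,i} from the quantities named above (the vector from the tool origin M_i to the camera origin G, and the tool's z-axis unit vector; the function name is illustrative):

```python
import math

def observation_angle(G, Mi, z_axis):
    """theta_{G,i}: angle between the vector Mi->G and the tool's z-axis
    unit vector, via the dot product; constraint 2 requires this angle to
    stay below the preset threshold Th."""
    v = [g - m for g, m in zip(G, Mi)]
    n = math.sqrt(sum(c * c for c in v))
    cos_t = sum(a * b for a, b in zip(v, z_axis)) / n  # z_axis assumed unit
    return math.acos(max(-1.0, min(1.0, cos_t)))
```

A camera directly along the tool's z-axis sees an observation angle of 0; a camera in the tool's plane sees π/2, which would normally violate constraint 2.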
  • the above optimization problem can be solved by a constrained multi-objective optimization algorithm.
  • in this embodiment, the MOEA/D-CDP algorithm is used to obtain the Pareto-optimal solutions of the above optimization problem.
  • Fig. 7 is an optimal solution diagram of the multi-objective optimization of the measurement perspective in the present invention.
  • each point in the figure corresponds to an optimal pose solution; these solutions do not dominate one another, and all are optimal.
  • the user can directly choose any of the above optimal solutions according to preference, or make a selection after the multi-objective decision-making algorithm provided by the system has produced a recommendation.
  • Fig. 8 is a diagram of the optimal solution recommendation method provided by the multi-objective decision-making algorithm of the present invention.
  • Step 1 Find the solutions in the optimal-solution set that are best on each single objective, and compute the equation of the straight line through these two endpoints:
  • Step 2 Compute the perpendicular distance d from each point to this line by substituting the objective values of each point into the formula.
  • Step 3 According to the user's needs, either recommend the optimal solution with the largest d value for direct use, or present several optimal solutions for the user to choose from.
  • Step 1 During the actual operation, after entering a specified surgical stage, obtain the target pose of the current stage (i.e., the optimal target selected in the multi-objective decision-making step for the robotic-arm pose) from the pre-operative pose optimization and multi-objective decision-making results, together with the optimal arm pose for the current state of the surgery.
  • Step 2 The binocular camera acquires three-dimensional information of the robot's surroundings, generates a point-cloud image C B of the environment, and applies the following formula:
  • Step 3 The algorithm randomly generates candidate waypoints.
  • Step 4 Determine whether the path point will encounter an obstacle, if so, return to step 3; otherwise, proceed to the next step of judgment;
  • Step 5 Determine whether all positioning tools can be detected in this pose, if not, return to step 3; otherwise, continue to the next step;
  • the positioning tool needs to meet the following constraints:
  • Step 6 Add the current candidate waypoint to the path directory used to generate the final path plan;
  • Step 7 Check whether the target pose has been reached; if not, return to Step 3; otherwise select the shortest path in the current directory as the motion path of the robotic arm.
  • Step 8 Execute the above path pose to make the robot reach the target pose.

Abstract

An active navigation system for surgery and a control method thereof. The system comprises: a control host, a serial robotic arm (7) with an arbitrary number of degrees of freedom, a positioning sensor and one or more positioning tools (4) adapted to it, and an environment-perception sensor. The control method comprises: Step 1, multi-objective optimization of the measurement viewing angle: inputting the position parameters of the positioning tools (4), setting other relevant parameters, and solving for the set of optimal measurement viewing angles by multi-objective optimization; Step 2, multi-objective decision-making on the robotic-arm pose: based on the set of optimal measurement viewing angles, recommending to the user, by means of a multi-objective decision-making algorithm, the optimal pose scheme of the robotic arm in each surgical stage, or letting the user select the optimal pose scheme of the robotic arm (7) in each stage according to preference; Step 3, robotic-arm path planning and execution: planning, according to the selected optimal pose scheme, the path of the robotic arm (7) from its current pose to the optimal pose. The system and method solve the problems of obtaining the optimal observation pose of a robot for surgical navigation and positioning, actively adjusting the position in real time, avoiding occlusion of the positioning tools, and improving positioning accuracy during navigation.

Description

An active navigation system for surgery and a control method thereof

Technical Field

The present invention relates to the technical field of medical devices, in particular to the field of surgical robots, and more particularly to an active navigation system for surgery and a control method thereof.
Background Art

Relying on image-guided navigation technology, surgical assistance systems can precisely locate the surgical site and the operating tools, helping doctors perform minimally invasive surgery, remote surgery, or robot-assisted surgery. At present, surgical navigation relies on optical navigation equipment that detects and identifies optical positioning tools and resolves images and positions to locate the surgical site or the surgical tools. In actual operation, the navigation equipment is adjusted by hand by an assisting doctor according to the needs of the operation: by dragging the handle of the device, the optical navigation equipment is moved to a suitable observation position. This mode of interaction is inconvenient during actual surgery, and for some special surgical positions it is difficult to reach a suitable measurement position by hand alone, so positional accuracy cannot be guaranteed.

Giving optical navigation equipment the ability to move has become a new trend. Active optical navigation requires the robot not only to carry an optical navigation sensor for positioning, but also to carry sensors with other environment-perception functions, so as to perceive events in the operating room such as human activity or changes in equipment position and trigger the corresponding active motion; a specific hardware system is therefore required. At the same time, the target pose to which the robot actively adjusts must take multiple factors into account, including but not limited to measurement accuracy, the measurability conditions of the targets, and the reachability of the robot, and no optical positioning tool may be lost while the pose is adjusted intraoperatively; specific control algorithms for robot pose optimization and path planning are therefore required.
Summary of the Invention

In view of the above factors, the present invention provides an active navigation system for surgery and a control method thereof. The technical solution of the present invention solves the problems of obtaining the optimal observation pose of the robot for surgical navigation and positioning, actively adjusting the position in real time, avoiding occlusion of the navigation targets, and improving the positioning accuracy of the navigation process.

A control method for the above active navigation system for surgery comprises the following steps:

Step 1, multi-objective optimization of the measurement viewing angle: input the position parameters of the positioning tools, set the other relevant parameters, and solve for the set of optimal measurement viewing angles by multi-objective optimization;

Step 2, multi-objective decision-making on the robotic-arm pose: based on the set of optimal measurement viewing angles, recommend to the user, by means of a multi-objective decision-making algorithm, the optimal pose scheme of the robotic arm in each surgical stage; or let the user select the optimal pose scheme of the robotic arm in each stage according to preference;

Step 3, robotic-arm path planning and execution: according to the selected optimal pose scheme of the robotic arm in each surgical stage, plan the path of the robotic arm from its current pose to the optimal pose.
Optionally, said Step 1 comprises the following steps:

Step 1.1: Obtain the information and positions of all positioning tools in each stage of the operation, and establish a multi-objective minimization problem based on the decision variable x:

x = [q_1, q_2, q_3, ..., q_N]     (Formula 1)

where q_1, q_2, q_3, ..., q_N are the joint variables and N is their number; the decision variable x is the vector formed by the N joint variables of the robotic arm, and its value range is the set Q of joint values achievable by the joints of the arm, i.e. x ∈ Q.
Step 1.2: Define at least two objective functions f_1 and f_2 to be minimized, as follows:

f_1 = max_{m∈S} |NM_m|     (Formula 2)

f_2 = −min_{j,k∈S} O_min(j,k)     (Formula 3)

where |NM_m| denotes the distance between the coordinate origin of the m-th positioning tool and the coordinate origin of the positioning sensor; f_1 is the maximum distance between the coordinate origins of all positioning tools and the coordinate origin of the positioning sensor; O_min(j,k) denotes, for a given pair of positioning tools j and k, the smaller non-occlusion margin function over the camera frames of the positioning sensor; and min_{j,k∈S} O_min(j,k) is the smallest non-occlusion margin value over all pairs of positioning tools measured by all cameras of the positioning sensor, under the arm pose determined by the decision variable x.

The smaller non-occlusion margin function O_min(j,k) is computed by the following formulas:

sin β_{G,j} = r_j / |GM_j|     (Formula 4)

β_{G,j} = arcsin(r_j / |GM_j|)     (Formula 5)

r_j = ω·l_j, with ω > 1      (Formula 6)

β_{G,k} = arcsin(r_k / |GM_k|)     (Formula 7)

r_k = ω·l_k, with ω > 1     (Formula 8)

cos α_{G,j,k} = (GM_j · GM_k) / (|GM_j|·|GM_k|)     (Formula 9)

α_{G,j,k} = arccos[(GM_j · GM_k) / (|GM_j|·|GM_k|)]     (Formula 10)

O(j,k,G) = α_{G,j,k} − β_{G,j} − β_{G,k}     (Formula 11)

O_min(j,k) = min(O(j,k,L), O(j,k,R))    (Formula 12)

In the above formulas: G is the coordinate origin of the left or right camera of the positioning sensor; L and R are the coordinate origins of the left and right cameras, respectively; M_j and M_k are the centers of the minimum enclosing spheres, of radii l_j and l_k, of any two positioning tools j and k, i.e. the coordinate origins of tools j and k; r_j and r_k are the expanded radii of tools j and k; the margin coefficient ω is a constant greater than 1; the vector lengths |GM_j| and |GM_k| are obtained by measurement with the positioning sensor; and · denotes the vector dot product.
Step 1.3: Set the following constraints, and minimize the at least two objective functions f_1 and f_2 simultaneously while ensuring that the constraints are satisfied:

Constraint 1:

∀ M_i ∈ S: M_i ∈ A(p)

Constraint 2:

∀ M_i ∈ S, G ∈ {L, R}: α_{G,i} ≤ Th

Constraint 3:

min_{j,k∈S, G∈{L,R}} O(j,k,G) ≥ 0

where

Constraint 1 means that every positioning tool must lie within the range jointly detectable by the positioning sensor and the environment-perception sensor;

Constraint 2 means that the angle between the line from either camera of the positioning sensor to any positioning tool and the z-axis of that tool must not exceed a predetermined threshold; α_{G,i} denotes the angle between the vector from the coordinate origin of the i-th positioning tool to the coordinate origin of the left or right camera of the positioning sensor and the z-axis direction vector of the i-th positioning tool; Th is the preset threshold;

Constraint 3 means that no two positioning tools occlude each other, i.e. the minimum of the non-occlusion margin function O(j,k,G) over any two positioning tools is non-negative.
Optionally, in said Step 2, recommending to the user the optimal pose scheme of the robotic arm in each surgical stage by means of a multi-objective decision-making algorithm, based on the set of optimal measurement viewing angles, comprises the following steps:

Step 2.1: Find the solutions in the set of optimal measurement viewing angles that are best on each single objective, and compute the equation of the straight line through the two endpoints of the curve corresponding to the set:

A·f_1 + B·f_2 + C = 0    (Formula 13)

Step 2.2: Compute the perpendicular distance d from each point of the curve to the above line by substituting the objective values of each point into the following formula:

d = |A·f_1 + B·f_2 + C| / √(A² + B²)    (Formula 14)

Step 2.3: Take the optimal-measurement-viewing-angle solution corresponding to the maximum perpendicular distance d as the recommended value of the multi-objective decision on the joint values of the robotic arm;

where A, B and C are obtained by solving the linear equation from the objective values of the single-objective optimal solutions.
Optionally, said Step 3 comprises the following steps:

Step 3.1: During the operation, after entering a specified surgical stage, obtain the target pose of the current stage from the optimal pose scheme produced by the pre-operative optimization of the arm pose and multi-objective decision-making, together with the optimal arm pose for the current state of the surgery;

Step 3.2: The environment-perception sensor acquires three-dimensional information of the surroundings of the surgical robot and generates a point-cloud image C_B of the environment; the point-cloud position information C_N of the environment in the coordinates of the positioning sensor is obtained by the following formula:

C_N = T · C_B    (Formula 15)

where T is a constant 4×4 transformation matrix;

Step 3.3: Randomly generate candidate waypoints;

Step 3.4: Judge whether the waypoint would hit an obstacle; if so, return to Step 3.3; otherwise continue;

Step 3.5: Judge whether all positioning tools can be detected at this pose; if not, return to Step 3.3; otherwise continue;

wherein, in the step of judging whether all positioning tools can be detected at this pose, the positioning tools must satisfy Constraints 1 to 3 above;

Step 3.6: Add the current candidate waypoint to the path directory used to generate a reasonable path plan;

Step 3.7: Judge whether the target pose has been reached; if not, return to Step 3.3; otherwise find the shortest path in the current path directory as the motion path of the robotic arm;

Step 3.8: Execute the above path so that the robotic arm of the surgical robot reaches the target pose.
An active navigation system for surgery, capable of executing the control method described above, comprises: a control host, a serial robotic arm with an arbitrary number of degrees of freedom, a positioning sensor and one or more positioning tools adapted to it, and an environment-perception sensor; the overlapping measurement region of the environment-perception sensor and the positioning sensor is the measurable region of the active navigation system.

There are one or more positioning tools; each positioning tool carries K positioning components distributed in a fixed spatial relationship. The positioning components are specific markers that reflect or emit light, and/or components formed by arranging several specific patterns in a fixed spatial relationship. Reflective markers include at least small balls coated with a highly reflective layer; light-emitting markers include at least LED lamps; the specific patterns are specially encoded designs, including at least QR codes and Gray codes.

The positions and/or number of the positioning components differ from tool to tool, so that the tools can be distinguished; the centroids of the K positioning components of one tool all lie in the same plane.

A distinctive shape feature is designed at the center of each positioning tool, and the intersection of the feature axis with the plane containing the centroids of the positioning components is taken as the coordinate origin; the shape feature may be at least a circular hole, a hemisphere, a boss, or a cone. With this origin as the center, a minimum enclosing sphere enveloping the K positioning components of the tool is constructed for each tool, with radius l_i. The normal direction of the plane containing the centroids of the K positioning components is taken as the z-axis, with the positive z direction pointing toward the side on which the components are attached; the direction perpendicular to the z-axis and pointing toward the positioning component farthest from the origin is taken as the positive x-axis, establishing a three-dimensional rectangular coordinate system.

The set of all positioning tools is denoted S; for the i-th positioning tool, the origin of its coordinate system is M_i, i.e. M_i ∈ S.
In practical applications, a certain margin is added to l_i, i.e. the sphere is estimated slightly larger than l_i, for example by multiplying l_i by a margin coefficient ω greater than 1 to obtain r_i, so that small deviations in actual operation do not invalidate the method.
Beneficial Effects:

The present invention provides an active navigation system for surgery and a control method thereof. The technical solution of the present invention solves the problems of obtaining the optimal observation pose of the robot for surgical navigation and positioning, actively adjusting the position in real time, avoiding occlusion of the navigation targets, and improving the positioning accuracy of the navigation process.
Brief Description of the Drawings

Fig. 1 is an overall structural diagram of the active navigation system for surgery of the present invention;

Fig. 2 is an embodiment diagram of the active navigation system for surgery of the present invention;

Fig. 3 is a schematic diagram of the establishment of coordinate systems in the active navigation system of the present invention;

Fig. 4 shows a positioning tool of the present invention and the establishment of its coordinate system;

Fig. 5 is a schematic diagram of the design of the non-occlusion margin function O(j,k,G) of the present invention;

Fig. 6 is a schematic diagram of the observation angle α_{G,i} of the present invention;

Fig. 7 shows the optimal solutions of the multi-objective optimization of the measurement viewing angle of the present invention;

Fig. 8 illustrates the optimal-solution recommendation method provided by the multi-objective decision-making algorithm of the present invention.
Detailed Description of the Embodiments

The technical solutions in the embodiments of the present invention are described clearly and completely below with reference to the accompanying drawings.

The present invention provides an active navigation system for surgery and a control method thereof.
Fig. 1 is an overall structural diagram of the active navigation system for surgery of the present invention. As shown in Fig. 1, the system comprises: a surgical operation planning system, a control host for data processing and robot control, a robotic arm, a positioning sensor and its adapted positioning tools, and an environment-perception sensor; the environment-perception sensor perceives the surgical environment, for example potential occluders and/or obstacles. The robotic arm is a 7-degree-of-freedom serial arm; the positioning sensor and/or the environment-perception sensor are mounted on the end flange of the arm.

The positioning sensor may take various modalities, such as a visible-light binocular depth camera or a near-infrared binocular positioning camera; the corresponding positioning tools are optical QR codes or other coded patterns matched to the sensor, or positioning tools composed of optical balls coated with a specific material.

The environment-perception sensor may likewise take various modalities, such as a visible-light binocular depth camera, a lidar, or ultrasonic sensors.

The environment-perception sensor and the positioning sensor may be a combination of two separate devices, e.g. a near-infrared binocular positioning camera plus a lidar; or they may be the same sensor, e.g. a visible-light binocular depth camera used both for positioning and for perceiving the surgical environment. In either case, the spatial regions measured by the two sensors must overlap, and the overlapping region is the measurable region of the system.
Fig. 2 is an embodiment diagram of the active navigation system for surgery of the present invention. As shown in Fig. 2, the embodiment is as follows: the system consists of a 7-degree-of-freedom robotic arm, a near-infrared optical positioning system (serving as the "positioning sensor") and a binocular camera (serving as the "environment-perception sensor") mounted on the end flange of the arm, a computer for data processing and robot control, and positioning tools adapted to the near-infrared optical positioning system.

The near-infrared optical positioning system comprises two infrared emitters and infrared cameras for detecting reflected infrared light. Its working principle is as follows: the left and right infrared emitters emit specific infrared light, which is projected onto the surfaces of the reflective balls on the positioning tools. The balls reflect the infrared light, which is detected by the infrared cameras; from the received reflections the system infers the relative position between itself and each ball, and computes, from a pre-calibrated positioning model, the relative position of each positioning tool with respect to the near-infrared optical positioning system.

The base coordinate frame of the robotic arm is O, the joint angle of the k-th joint is q_k, and the origin of the end-flange frame is E. The center frame of the near-infrared optical positioning system is N, and the frames of its left and right cameras are L and R, respectively. When the arm is at position p, the measurable region of the near-infrared optical positioning system is A(p). The frame of the binocular camera is C.

The reference numerals in Fig. 2 are as follows: 1 - seven-degree-of-freedom robotic arm; 2 - near-infrared optical positioning system; 3 - binocular camera; 4 - positioning tool; 5 - computer.
Fig. 3 is a schematic diagram of the establishment of coordinate systems in the active navigation system of the present invention. The set of all positioning tools is S; for the i-th tool, the origin of its coordinate system is M_i, i.e. M_i ∈ S. The center frame of the optical positioning system is N, and the frames of its left and right cameras are L and R. When the robotic arm is at position p, the overlapping measurable region of the optical positioning system and the environment-perception sensor is A(p), i.e. the set of all positions at which a positioning tool can be measured normally, without occlusion, when the arm is at position p. The frame of the binocular camera is C.
Fig. 4 shows a positioning tool of the present invention and the establishment of its coordinate system. A positioning tool matched to the near-infrared optical positioning system (the "positioning sensor") is selected, as shown in Fig. 4. Each positioning tool carries four balls coated with a highly reflective layer, distributed in a fixed spatial relationship. The centers of the four balls of one tool all lie in the same plane; the normal of the plane containing the centroids of the K positioning components is the z-axis, with the positive z direction pointing toward the side on which the components are attached. The positions and/or number of balls differ from tool to tool, so that the tools can be distinguished. For each tool, the intersection of the plane containing the ball centers with the central axis of the small hole at the center of the tool's connecting rods (an instance of the shape feature) is taken as the coordinate origin, and the direction from this point toward the ball farthest from the origin is taken as the x-axis. With the intersection as center, a minimum enclosing sphere enveloping all the balls is constructed, with radius l_i. The set of all positioning tools is S; for the i-th tool, the origin of its coordinate system is M_i, i.e. M_i ∈ S.
The present invention provides a control method for active navigation of a surgical robot. The method consists of three parts: "multi-objective optimization of the measurement viewing angle", "multi-objective decision-making on the robotic-arm pose", and "robotic-arm path planning and execution". Specifically:

· Multi-objective optimization of the measurement viewing angle: the positions and configuration of the positioning tools are input to the program, the relevant parameters are set, and the set of optimal measurement viewing angles is solved by multi-objective optimization.

· Multi-objective decision-making on the robotic-arm pose: based on the optimal-solution set obtained in the previous step, a multi-objective decision-making algorithm recommends a scheme to the user, or the user selects, according to preference, a suitable pose scheme of the navigation arm for each surgical stage.

· Robotic-arm path planning and execution: based on the optimal pose schemes for each surgical stage obtained in the previous step, the arm plans a path from its current pose to the optimal pose. During this process it must be ensured that the positioning sensor can continuously locate all positioning tools required by the current surgical stage and that unexpected obstacles are handled, so that the arm finally reaches the suitable optimal pose.

The three parts are described in detail as follows:
(1) Multi-objective optimization of the measurement viewing angle: the information and positions of all positioning tools in each stage of the operation are obtained through the surgical operation planning system, and the following multi-objective minimization problem is established:

Decision variable: x = [q_1, q_2, q_3, ..., q_N],

where q_1, q_2, q_3, ..., q_N are the joint variables and N is their number; the decision variable x is the vector formed by the N joint variables of the robotic arm, and its value range is the set Q of joint values achievable by the joints of the arm, i.e. x ∈ Q.

The optimization objectives are as follows (at least the two objective functions f_1 and f_2 are minimized simultaneously):

Objective 1: minimize the maximum distance between the positioning tools and the near-infrared optical positioning system:

f_1 = max_{m∈S} |NM_m|

where |NM_m| denotes the distance between the coordinate origin of the m-th positioning tool and the coordinate origin of the near-infrared optical positioning system.

Objective 2: min_{j,k∈S} O_min(j,k) is the smallest non-occlusion margin value between positioning tools. Taking its negative converts it into a minimization problem:

f_2 = −min_{j,k∈S} O_min(j,k)

where O_min(j,k) denotes, for a given pair of positioning tools j and k, the smaller non-occlusion margin function over the camera frames of the positioning sensor; min_{j,k∈S} O_min(j,k) is the smallest non-occlusion margin value over all pairs of tools measured by all cameras of the positioning sensor, under the arm pose determined by q.

The non-occlusion margin function O(j,k,G) between tools j and k is defined as shown in Fig. 5:
定位工具j与k之间的无遮挡裕度函数O(j,k,G)定义如图5所示:
图5为本发明的无遮挡裕度函数O(j,k,G)设计的示意图,其描述了无遮挡裕度函数O(j,k,G)的定义。具体地,图5中描述的是任意两个定位工具以及定位传感器左或者右边其中一侧摄像头的几何关系。因此如果对于定位工具数量大于2时,任意两个定位工具和任意一侧摄像头就会产生一个特定的O(j,k,G)值,例如:3个定位工具可以产生6个O(j,k,G)值,即:O(1,2,L),O(1,3,L),O(2,3,L),O(1,2,R),O(1,3,R),O(2,3,R)。
其中,G是指定位传感器左或右其中一侧摄像头坐标系原点。M j和M k分别指任意两个定位 工具抽象成球体以后的球心,同时也是定位工具坐标系原点。r j和r k是指定位工具抽象成球体的半径。选择每个定位工具采用小球球心所在平面与定位工具连杆中心小孔(即外形特征的一种实例)的中心轴相交点作为坐标原点。以坐标原点为球心的最小外接球半径为l i。考虑到实际操作时的误差影响,在l i的基础上进行裕度ω倍的扩展,获得定位工具抽象成球体的半径r j和r k。(这里定位工具的特点是,以一个中心延伸出4条或以上共平面的连杆,连杆末端设置有小球。在一套导航设备内,每一个定位工具的小球之间相对位置是唯一的)。其中ω>1。
因此,r j和r k大小已知。向量长度
Figure PCTCN2022109446-appb-000021
Figure PCTCN2022109446-appb-000022
可以通过定位传感器测量获得。β G,j和β G,k可以通过以下关系求得:
Figure PCTCN2022109446-appb-000023
Figure PCTCN2022109446-appb-000024
而α G,j,k可以通过向量计算:
Figure PCTCN2022109446-appb-000025
其中,■表示向量点乘。
最后,计算
Figure PCTCN2022109446-appb-000026
其中,r i=ωl i,表示把定位工具抽象简化后的球面的半径;其中,ω>1。
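The chain of relations above (expanded radii, half-angles β, separation angle α, and margin O) can be sketched numerically. This is a minimal illustration assuming Euclidean 3-D coordinates and an example margin factor ω = 1.2, not the patented implementation:

```python
import math

def occlusion_margin(G, Mj, Mk, lj, lk, omega=1.2):
    """Non-occlusion margin O(j, k, G) between two positioning tools, seen
    from camera origin G. A positive value means the two bounding spheres do
    not overlap in the camera's view; omega (> 1) is the margin factor."""
    rj, rk = omega * lj, omega * lk                 # expanded radii r_j, r_k
    GMj = [m - g for m, g in zip(Mj, G)]            # vector G -> M_j
    GMk = [m - g for m, g in zip(Mk, G)]            # vector G -> M_k
    dj, dk = math.dist(G, Mj), math.dist(G, Mk)     # |GM_j|, |GM_k|
    beta_j = math.asin(min(1.0, rj / dj))           # half-angle of sphere j
    beta_k = math.asin(min(1.0, rk / dk))           # half-angle of sphere k
    cos_a = sum(a * b for a, b in zip(GMj, GMk)) / (dj * dk)
    alpha = math.acos(max(-1.0, min(1.0, cos_a)))   # angle between sight lines
    return alpha - beta_j - beta_k                  # O(j, k, G)

def occlusion_margin_min(L, R, Mj, Mk, lj, lk, omega=1.2):
    """O_min(j, k) = min(O(j, k, L), O(j, k, R)) over both cameras."""
    return min(occlusion_margin(L, Mj, Mk, lj, lk, omega),
               occlusion_margin(R, Mj, Mk, lj, lk, omega))
```

Two tools at roughly 90 degrees from the camera give a large positive margin, while two tools nearly behind one another give a negative margin, which is exactly what Constraint 3 forbids.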
The constraints are as follows:

Constraint 1:

∀ M_i ∈ S: M_i ∈ A(p)

Constraint 2:

∀ M_i ∈ S, G ∈ {L, R}: α_{G,i} ≤ Th

Constraint 3:

min_{j,k∈S, G∈{L,R}} O(j,k,G) ≥ 0

where

Constraint 1 means that every positioning tool must lie within the range jointly detectable by the positioning sensor and the environment-perception sensor;

Constraint 2 means that the angle between the line from either camera of the positioning sensor to any positioning tool and the z-axis of that tool must not exceed a predetermined threshold; α_{G,i} denotes the angle between the vector from the coordinate origin of the i-th positioning tool to the coordinate origin of the left or right camera and the z-axis direction vector of the i-th tool; Th is a preset threshold, e.g. Th = π/2;

Constraint 3 means that no two positioning tools occlude each other, i.e. the minimum of the non-occlusion margin function O(j,k,G) over any two tools is non-negative.
Fig. 6 is a schematic diagram of the observation angle α_{G,i} of the present invention. The observation angle is the angle between the line from the origin of the left or right camera and the Z-axis of a positioning tool (the upward-pointing normal of the tool is fixed as the Z-axis of the tool coordinate frame):

cos α_{G,i} = (M_iG · z_i) / |M_iG|

As shown in Fig. 6, G is the origin of the coordinate system of the left or right camera of the positioning sensor, and z_i is the Z-axis unit vector of the positioning tool in frame G. The vectors are obtained through the positioning sensor and substituted into the formula. Note that either camera yields one observation-angle value for each positioning tool.
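As a sketch of the check behind Constraint 2, the observation angle can be computed from the tool-to-camera vector and the tool's z-axis; this assumes z_i is already expressed as a unit vector in the camera frame G, which is the convention described above:

```python
import math

def observation_angle(G, M_i, z_i):
    """alpha_{G,i}: angle between the vector from the tool origin M_i to the
    camera origin G and the tool's z-axis unit vector z_i (same frame)."""
    MG = [g - m for g, m in zip(G, M_i)]              # vector M_i -> G
    norm = math.sqrt(sum(c * c for c in MG))
    cos_a = sum(a * b for a, b in zip(MG, z_i)) / norm  # z_i assumed unit-length
    return math.acos(max(-1.0, min(1.0, cos_a)))

def satisfies_constraint_2(G, M_i, z_i, Th=math.pi / 2):
    """Constraint 2: the tool counts as observable only if alpha_{G,i} <= Th."""
    return observation_angle(G, M_i, z_i) <= Th
```

A camera directly above the tool plane gives α = 0; a camera in the tool plane gives α = π/2, the example threshold.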
In summary, the following optimization problem must be solved:

Decision variable: x = [q_1, q_2, q_3, ..., q_N]

Minimize simultaneously:

f_1 = max_{m∈S} |NM_m|

f_2 = −min_{j,k∈S} O_min(j,k)

subject to:

∀ M_i ∈ S: M_i ∈ A(p)

∀ M_i ∈ S, G ∈ {L, R}: α_{G,i} ≤ Th

min_{j,k∈S, G∈{L,R}} O(j,k,G) ≥ 0
This problem can be solved with a constrained multi-objective optimization algorithm. In this embodiment the MOEA/D-CDP algorithm is used to obtain the Pareto-optimal solutions of the above optimization problem.

Fig. 7 shows the optimal solutions of the multi-objective optimization of the measurement viewing angle.

As shown in Fig. 7, every point in the figure corresponds to an optimal pose scheme. These schemes do not dominate one another; all of them are optimal solutions.
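The mutual non-dominance of the solutions in Fig. 7 can be checked with a simple filter. This is a minimal sketch for two minimization objectives (f_1, f_2); it is independent of the MOEA/D-CDP internals, which the patent does not detail:

```python
def pareto_front(points):
    """Return the non-dominated subset of (f1, f2) minimization solutions.
    A point p is dominated if some other point q is no worse on both
    objectives (weak dominance is used here for simplicity)."""
    front = []
    for p in points:
        dominated = any(q[0] <= p[0] and q[1] <= p[1] and q != p
                        for q in points)
        if not dominated:
            front.append(p)
    return front
```

For example, in {(1, 3), (2, 2), (3, 1), (2, 3)} the point (2, 3) is dominated by (2, 2) and is removed, while the other three form the front.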
(2) Multi-objective decision-making on the robotic-arm pose:

After the optimal solutions of the measurement-viewing-angle optimization shown in Fig. 7 have been obtained, the user can directly choose any of them according to preference, or make a selection after the multi-objective decision-making algorithm provided by the system has produced a recommendation.

Fig. 8 illustrates the optimal-solution recommendation method provided by the multi-objective decision-making algorithm.

The steps of the recommendation method are as follows:

Step 1: Find the solutions in the optimal-solution set that are best on each single objective, and compute the equation of the straight line through these two endpoints:

A·f_1 + B·f_2 + C = 0

Step 2: Compute the perpendicular distance d from each point to this line by substituting the objective values of each point into the formula:

d = |A·f_1 + B·f_2 + C| / √(A² + B²)

Step 3: According to the user's needs, either recommend the optimal solution with the largest d value for direct use, or present several optimal solutions for the user to choose from.
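Steps 1 to 3 amount to a knee-point rule: fit the line through the two single-objective-best endpoints and pick the front point farthest from it. A minimal sketch, assuming the Pareto front is given as a list of (f_1, f_2) pairs with distinct endpoints:

```python
import math

def recommend(front):
    """Recommend the solution with the largest perpendicular distance d to
    the line A*f1 + B*f2 + C = 0 through the two single-objective optima."""
    p1 = min(front, key=lambda p: p[0])          # best on f1
    p2 = min(front, key=lambda p: p[1])          # best on f2
    A = p2[1] - p1[1]                            # line through p1 and p2
    B = p1[0] - p2[0]
    C = p2[0] * p1[1] - p1[0] * p2[1]
    norm = math.hypot(A, B)
    return max(front, key=lambda p: abs(A * p[0] + B * p[1] + C) / norm)
```

On the front {(0, 3), (1, 1), (3, 0)} the endpoints lie on the line itself (d = 0), so the "knee" (1, 1) is recommended.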
(3) Robotic-arm path planning and execution:

Step 1: During the operation, after entering a specified surgical stage, obtain the target pose of the current stage (i.e., the optimal target selected in the multi-objective decision-making step for the robotic-arm pose) from the pre-operative pose optimization and multi-objective decision-making results, together with the optimal arm pose for the current state of the surgery.

Step 2: The binocular camera acquires three-dimensional information of the robot's surroundings and generates a point-cloud image C_B of the environment; the formula

C_N = T · C_B

yields the point-cloud position information C_N of the environment in the coordinates of the optical positioning system, where T is a constant 4×4 transformation matrix whose value depends on the relative position of the binocular camera and the optical positioning system.
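Applying the constant homogeneous transform to the camera point cloud might look like the sketch below; the matrix T here is a made-up pure translation, not the calibrated value from the real camera-to-locator extrinsics:

```python
import numpy as np

def transform_cloud(T_NC, cloud_C):
    """Map an (n, 3) point cloud from the binocular-camera frame C into the
    optical positioning system frame N via the constant 4x4 homogeneous
    matrix T_NC (C_N = T_NC * C_B)."""
    n = cloud_C.shape[0]
    homo = np.hstack([cloud_C, np.ones((n, 1))])   # (n, 4) homogeneous points
    return (T_NC @ homo.T).T[:, :3]                # back to (n, 3) Cartesian

# Example extrinsics: pure translation by (1, 2, 3), purely illustrative
T = np.eye(4)
T[:3, 3] = [1.0, 2.0, 3.0]
```

With this T, every point of the cloud is shifted by (1, 2, 3), which is what the obstacle check of the next steps consumes in locator coordinates.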
Step 3: The algorithm randomly generates candidate waypoints.

Step 4: Judge whether the waypoint would hit an obstacle; if so, return to Step 3; otherwise continue to the next judgment.

Step 5: Judge whether all positioning tools can be detected at this pose; if not, return to Step 3; otherwise continue.

For a positioning tool to be judged detectable, it must satisfy the following constraints:

∀ M_i ∈ S: M_i ∈ A(p)

∀ M_i ∈ S, G ∈ {L, R}: α_{G,i} ≤ Th

min_{j,k∈S, G∈{L,R}} O(j,k,G) ≥ 0

Step 6: Add the current candidate waypoint to the path directory used to generate the final path plan.

Step 7: Judge whether the target pose has been reached; if not, return to Step 3; otherwise find the shortest path in the current directory as the motion path of the robotic arm.

Step 8: Execute the above path so that the robot reaches the target pose.
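Steps 3 to 8 describe a sampling-based planning loop. The toy sketch below keeps only their control flow; the `collides` and `all_tools_visible` predicates are hypothetical stand-ins for the obstacle check of Step 4 and the Constraint 1-3 visibility check of Step 5, and the random-step sampler is a deliberate simplification:

```python
import random

def plan_path(start, goal, collides, all_tools_visible, tries=5000, step=0.05):
    """Randomly sample candidate waypoints near the current one, keep only
    those that are collision-free AND keep every positioning tool detectable,
    and stop once the goal pose is reached (Steps 3-8, heavily simplified)."""
    path = [start]                                # the "path directory"
    for _ in range(tries):
        cand = tuple(c + random.uniform(-step, step) for c in path[-1])
        if collides(cand) or not all_tools_visible(cand):
            continue                              # Steps 4-5: reject, resample
        path.append(cand)                         # Step 6: keep the waypoint
        if all(abs(a - b) < step for a, b in zip(cand, goal)):
            return path                           # Step 7: goal reached
    return None                                   # no path found in budget
```

A real implementation would also keep alternative branches and return the shortest one, as Step 7 requires; this sketch returns the first feasible path.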
The above are only specific embodiments of the present invention, but the scope of protection of the present invention is not limited thereto. Any person skilled in the art can easily conceive of various equivalent modifications or substitutions within the technical scope disclosed by the present invention, and such modifications or substitutions shall all fall within the scope of protection of the present invention. Therefore, the scope of protection of the present invention shall be that of the claims.

Claims (5)

  1. A control method for an active navigation system for surgery, characterized in that the control method comprises the following steps:
    Step 1, multi-objective optimization of the measurement viewing angle: inputting the position parameters of the positioning tools and setting other relevant parameters, and solving for the set of optimal measurement viewing angles by multi-objective optimization;
    Step 2, multi-objective decision-making on the robotic-arm pose: based on the set of optimal measurement viewing angles, recommending to the user, by means of a multi-objective decision-making algorithm, the optimal pose scheme of the robotic arm in each surgical stage; or letting the user select the optimal pose scheme of the robotic arm in each stage according to preference;
    Step 3, robotic-arm path planning and execution: according to the selected optimal pose scheme of the robotic arm in each surgical stage, planning the path of the robotic arm from its current pose to the optimal pose;
    wherein said Step 1 comprises the following steps:
    Step 1.1: obtaining the information and positions of all positioning tools in each stage of the operation, and establishing a multi-objective minimization problem based on the decision variable x:
    x = [q_1, q_2, q_3, ..., q_N]       (Formula 1)
    where q_1, q_2, q_3, ..., q_N are the joint variables and N is their number; the decision variable x is the vector formed by the N joint variables of the robotic arm, and its value range is the set Q of joint values achievable by the joints of the arm, i.e. x ∈ Q;
    Step 1.2: defining at least two objective functions f_1 and f_2 to be minimized, as follows:
    f_1 = max_{m∈S} |NM_m|            (Formula 2)
    f_2 = −min_{j,k∈S} O_min(j,k)            (Formula 3)
    where |NM_m| denotes the distance between the coordinate origin of the m-th positioning tool and the coordinate origin of the positioning sensor; f_1 is the maximum distance between the coordinate origins of all positioning tools and the coordinate origin of the positioning sensor; O_min(j,k) denotes, for a given pair of positioning tools j and k, the smaller non-occlusion margin function over the camera frames of the positioning sensor;
    min_{j,k∈S} O_min(j,k) is the smallest non-occlusion margin value over all pairs of positioning tools measured by all cameras of the positioning sensor, under the arm pose determined by the decision variable x;
    the smaller non-occlusion margin function O_min(j,k) is computed by the following formulas:
    sin β_{G,j} = r_j / |GM_j|     (Formula 4)
    β_{G,j} = arcsin(r_j / |GM_j|)     (Formula 5)
    r_j = ω·l_j, with ω > 1     (Formula 6)
    β_{G,k} = arcsin(r_k / |GM_k|)     (Formula 7)
    r_k = ω·l_k, with ω > 1     (Formula 8)
    cos α_{G,j,k} = (GM_j · GM_k) / (|GM_j|·|GM_k|)     (Formula 9)
    α_{G,j,k} = arccos[(GM_j · GM_k) / (|GM_j|·|GM_k|)]     (Formula 10)
    O(j,k,G) = α_{G,j,k} − β_{G,j} − β_{G,k}     (Formula 11)
    O_min(j,k) = min(O(j,k,L), O(j,k,R))    (Formula 12)
    in the above formulas: G is the coordinate origin of the left or right camera of the positioning sensor; L and R are the coordinate origins of the left and right cameras, respectively; l_j and l_k are the radii of the minimum enclosing spheres of any two positioning tools j and k, and M_j and M_k are the centers of those spheres, i.e. the coordinate origins of tools j and k; r_j and r_k are the expanded radii of tools j and k; the margin coefficient ω is a constant greater than 1; the vector lengths |GM_j| and |GM_k| are obtained by measurement with the positioning sensor; · denotes the vector dot product;
    Step 1.3: setting the following constraints, and minimizing the at least two objective functions f_1 and f_2 simultaneously while ensuring that the constraints are satisfied:
    Constraint 1:
    ∀ M_i ∈ S: M_i ∈ A(p)
    Constraint 2:
    ∀ M_i ∈ S, G ∈ {L, R}: α_{G,i} ≤ Th
    Constraint 3:
    min_{j,k∈S, G∈{L,R}} O(j,k,G) ≥ 0
    where
    Constraint 1 means that every positioning tool must lie within the range jointly detectable by the positioning sensor and the environment-perception sensor;
    Constraint 2 means that the angle between the line from either camera of the positioning sensor to any positioning tool and the z-axis of that tool must not exceed a predetermined threshold; α_{G,i} denotes the angle between the vector from the coordinate origin of the i-th positioning tool to the coordinate origin of the left or right camera of the positioning sensor and the z-axis direction vector of the i-th positioning tool; Th is a preset threshold;
    Constraint 3 means that no two positioning tools occlude each other, i.e. the minimum of the non-occlusion margin function O(j,k,G) over any two positioning tools is non-negative.
  2. The control method according to claim 1, characterized in that, in said Step 2, recommending to the user the optimal pose scheme of the robotic arm in each surgical stage by means of a multi-objective decision-making algorithm, based on the set of optimal measurement viewing angles, comprises the following steps:
    Step 2.1: finding the solutions in the set of optimal measurement viewing angles that are best on each single objective, and computing the equation of the straight line through the two endpoints of the curve corresponding to the set:
    A·f_1 + B·f_2 + C = 0   (Formula 13)
    Step 2.2: computing the perpendicular distance d from each point of the curve to the above line by substituting the objective values of each point into the following formula:
    d = |A·f_1 + B·f_2 + C| / √(A² + B²)   (Formula 14)
    Step 2.3: taking the optimal-measurement-viewing-angle solution corresponding to the maximum perpendicular distance d as the recommended value of the multi-objective decision on the joint values of the robotic arm;
    wherein A, B and C are obtained by solving the linear equation from the objective values of the single-objective optimal solutions.
  3. The control method according to claim 2, characterized in that said Step 3 comprises the following steps:
    Step 3.1: during the operation, after entering a specified surgical stage, obtaining the target pose of the current stage from the optimal pose scheme produced by the pre-operative optimization of the arm pose and multi-objective decision-making, together with the optimal arm pose for the current state of the surgery;
    Step 3.2: the environment-perception sensor acquiring three-dimensional information of the surroundings of the surgical robot and generating a point-cloud image C_B of the environment, and obtaining the point-cloud position information C_N of the environment in the coordinates of the positioning sensor by the following formula:
    C_N = T · C_B   (Formula 15)
    where T is a constant 4×4 transformation matrix;
    Step 3.3: randomly generating candidate waypoints;
    Step 3.4: judging whether the waypoint would hit an obstacle; if so, returning to Step 3.3; otherwise continuing;
    Step 3.5: judging whether all positioning tools can be detected at this pose; if not, returning to Step 3.3; otherwise continuing;
    wherein, in the step of judging whether all positioning tools can be detected at this pose, the positioning tools must satisfy Constraints 1 to 3 above;
    Step 3.6: adding the current candidate waypoint to the path directory used to generate a reasonable path plan;
    Step 3.7: judging whether the target pose has been reached; if not, returning to Step 3.3; otherwise finding the shortest path in the current path directory as the motion path of the robotic arm;
    Step 3.8: executing the above path so that the robotic arm of the surgical robot reaches the target pose.
  4. An active navigation system for surgery, capable of executing the control method of any one of claims 1 to 3, characterized in that the system comprises: a control host, a multi-degree-of-freedom serial robotic arm, a positioning sensor and one or more positioning tools adapted to it, and an environment-perception sensor; the overlapping measurement region of the environment-perception sensor and the positioning sensor is the measurable region of the active navigation system;
    there are one or more positioning tools; each positioning tool carries K positioning components distributed in a fixed spatial relationship; the positioning components are specific markers that reflect or emit light, and/or components formed by arranging several specific patterns in a fixed spatial relationship; reflective markers include at least small balls coated with a highly reflective layer; light-emitting markers include at least LED lamps; the specific patterns are specially encoded designs, which may be at least QR codes or Gray codes;
    the positions and/or number of the positioning components differ from tool to tool, so that the tools can be distinguished; the centroids of the K positioning components of one tool all lie in the same plane;
    a distinctive shape feature is designed at the center of each positioning tool, and the intersection of the feature axis with the plane containing the centroids of the positioning components is taken as the coordinate origin; with this origin as the center, a minimum enclosing sphere enveloping the K positioning components of the tool is constructed for each tool, with radius l_i; the normal direction of the plane containing the centroids of the K positioning components is the z-axis, with the positive z direction pointing toward the side on which the components are attached; the direction perpendicular to the z-axis and pointing toward the positioning component farthest from the origin is the positive x-axis, establishing a three-dimensional rectangular coordinate system;
    the set of all positioning tools is denoted S; for the i-th positioning tool, the origin of its coordinate system is M_i, i.e. M_i ∈ S.
  5. The active navigation system according to claim 4, characterized in that the shape feature is a circular hole, a hemisphere, a boss, or a cone.
PCT/CN2022/109446 2021-07-07 2022-08-01 一种外科手术的主动导航系统及其控制方法 WO2023280326A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US18/268,316 US20240050161A1 (en) 2021-07-07 2022-08-01 Active navigation system of surgery and control method thereof

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110764801.5 2021-07-07
CN202110764801.5A CN113499138B (zh) 2021-07-07 2021-07-07 一种外科手术的主动导航系统及其控制方法

Publications (1)

Publication Number Publication Date
WO2023280326A1 true WO2023280326A1 (zh) 2023-01-12

Family

ID=78011775

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/109446 WO2023280326A1 (zh) 2021-07-07 2022-08-01 一种外科手术的主动导航系统及其控制方法

Country Status (3)

Country Link
US (1) US20240050161A1 (zh)
CN (1) CN113499138B (zh)
WO (1) WO2023280326A1 (zh)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116277007A (zh) * 2023-03-28 2023-06-23 北京维卓致远医疗科技发展有限责任公司 位姿控制方法、装置、存储介质及控制器
CN117061876A (zh) * 2023-10-11 2023-11-14 常州微亿智造科技有限公司 基于飞拍机器人的飞拍控制方法和系统
CN117084790A (zh) * 2023-10-19 2023-11-21 苏州恒瑞宏远医疗科技有限公司 一种穿刺方位控制方法、装置、计算机设备、存储介质

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113499138B (zh) * 2021-07-07 2022-08-09 南开大学 一种外科手术的主动导航系统及其控制方法
CN113499137B (zh) * 2021-07-07 2022-07-12 南开大学 一种手术机器人导航定位系统及测量视角多目标优化方法
CN113954082B (zh) * 2021-12-23 2022-03-08 真健康(北京)医疗科技有限公司 适用于穿刺手术机械臂的控制方法、控制设备和辅助系统
CN114952806B (zh) * 2022-06-16 2023-10-03 法奥意威(苏州)机器人系统有限公司 约束运动控制方法、装置、系统和电子设备
CN116370082B (zh) * 2022-07-01 2024-03-12 北京和华瑞博医疗科技有限公司 机械臂系统及外科手术系统
CN115381554B (zh) * 2022-08-02 2023-11-21 北京长木谷医疗科技股份有限公司 一种骨科手术机器人智能位置调整系统及方法
CN115919472B (zh) * 2023-01-09 2023-05-05 北京云力境安科技有限公司 一种机械臂定位方法及相关系统、装置、设备及介质

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104739514A (zh) * 2015-03-13 2015-07-01 华南理工大学 大视场下手术器械的自动跟踪定位方法
WO2015188393A1 (zh) * 2014-06-11 2015-12-17 清华大学 人体器官运动监测方法、手术导航系统和计算机可读介质
US20160113720A1 (en) * 2013-06-11 2016-04-28 Minmaxmedical System for the treatment of a planned volume of a body part
WO2017147596A1 (en) * 2016-02-26 2017-08-31 Think Surgical, Inc. Method and system for guiding user positioning of a robot
CN107862129A (zh) * 2017-11-03 2018-03-30 哈尔滨工业大学 一种基于moead的偏差区间偏好引导多目标决策优化方法
CN110116410A (zh) * 2019-05-28 2019-08-13 中国科学院自动化研究所 基于视觉伺服的机械臂目标导引系统、方法
CN112223288A (zh) * 2020-10-09 2021-01-15 南开大学 一种视觉融合的服务机器人控制方法
CN112451096A (zh) * 2020-11-24 2021-03-09 广州艾目易科技有限公司 一种示踪器识别信息的生成方法及装置
CN113499137A (zh) * 2021-07-07 2021-10-15 南开大学 一种手术机器人导航定位系统及测量视角多目标优化方法
CN113499138A (zh) * 2021-07-07 2021-10-15 南开大学 一种外科手术的主动导航系统及其控制方法

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5235652B2 (ja) * 2008-12-26 2013-07-10 ヤマハ発動機株式会社 多目的最適化装置、多目的最適化方法および多目的最適化プログラム
CN110051436B (zh) * 2018-01-18 2020-04-17 上海舍成医疗器械有限公司 自动化协同工作组件及其在手术器械中的应用
CN111227935A (zh) * 2020-02-20 2020-06-05 中国科学院长春光学精密机械与物理研究所 一种手术机器人导航定位系统
CN111360826B (zh) * 2020-02-29 2023-01-06 华南理工大学 一种可实时显示抓取位姿的系统


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
MENG, XIANGFENG, ZHANG CHAO, TANG QIAOHONG, WANG HAO, WANG CHENXI, LI JIAGE: "Discussion on the Evaluation Method of Surgical Robot Performance", CHINA MEDICAL DEVICES, vol. 09, 1 January 2020 (2020-01-01), pages 18 - 21, XP093022948, ISSN: 1674-1633, DOI: 10.3969/j.issn.1674-1633.2020.09.004 *


Also Published As

Publication number Publication date
CN113499138B (zh) 2022-08-09
CN113499138A (zh) 2021-10-15
US20240050161A1 (en) 2024-02-15

Similar Documents

Publication Publication Date Title
WO2023280326A1 (zh) 一种外科手术的主动导航系统及其控制方法
WO2023279874A1 (zh) 一种手术机器人导航定位系统及测量视角多目标优化方法
US11872005B2 (en) Method and system for guiding user positioning of a robot
US20240058086A1 (en) Hand controller for robotic surgery system
US11653983B2 (en) Methods for locating and tracking a tool axis
CN109567942B (zh) 采用人工智能技术的颅颌面外科手术机器人辅助系统
JP5702797B2 (ja) 遠隔操作される低侵襲スレーブ手術器具の手による制御のための方法およびシステム
CN102665588B (zh) 用于微创手术系统中的手存在性探测的方法和系统
US20220175464A1 (en) Tracker-Based Surgical Navigation
Deacon et al. The Pathfinder image-guided surgical robot
JP2004522220A (ja) ジェスチャーに基づいた入力及びターゲット指示のための単一カメラシステム
JP2015107377A (ja) 低侵襲外科システムにおいて使用するマスターフィンガー追跡デバイスおよびその方法
JP6147360B2 (ja) トラッキングシステム及びこれを用いたトラッキング方法
JP2016515837A (ja) トラッキングシステム及びこれを用いたトラッキング方法
Janabi-Sharifi et al. Automatic grasp planning for visual-servo controlled robotic manipulators
JP2022142773A (ja) オブジェクトのカメラ画像からオブジェクトの場所を位置特定するための装置及び方法
US20240130806A1 (en) Surgical robot navigation and positioning system and measurement viewing angle multi-objective optimization method
US20200205911A1 (en) Determining Relative Robot Base Positions Using Computer Vision
US20230092980A1 (en) Surgical robotic system setup
Huang Evaluation of Haptic Virtual Fixtures with Real-Time Sensors

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22837071

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 18268316

Country of ref document: US

NENP Non-entry into the national phase

Ref country code: DE