CN114515193A - Parallel robot, system, device and storage medium - Google Patents


Info

Publication number
CN114515193A
CN114515193A (application CN202210113699.7A)
Authority
CN
China
Prior art keywords
target
puncture
point
initial
robot
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210113699.7A
Other languages
Chinese (zh)
Inventor
何超 (He Chao)
Other inventors have requested that their names not be disclosed
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Shuhang Robot Co., Ltd.
Original Assignee
Shanghai Shuhang Robot Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Shuhang Robot Co., Ltd.
Priority to CN202210113699.7A
Publication of CN114515193A

Classifications

    • A61B 34/30: Surgical robots (under A61B 34/00 Computer-aided surgery; manipulators or robots specially adapted for use in surgery)
    • A61B 17/3403: Needle locating or guiding means (under A61B 17/34 Trocars; puncturing needles)
    • A61B 34/70: Manipulators specially adapted for use in surgery
    • B25J 18/00: Arms
    • B25J 9/003: Programme-controlled manipulators having parallel kinematics
    • B25J 9/0045: Programme-controlled manipulators having parallel kinematics with kinematics chains having a rotary joint at the base
    • B25J 9/0048: Programme-controlled manipulators having parallel kinematics with kinematics chains of the type rotary-rotary-rotary
    • B25J 9/1602: Programme controls characterised by the control system, structure, architecture
    • B25J 9/161: Hardware, e.g. neural networks, fuzzy logic, interfaces, processor
    • B25J 9/1656: Programme controls characterised by programming, planning systems for manipulators
    • B25J 9/1661: Programme controls characterised by task planning, object-oriented languages
    • A61B 2017/3405: Needle locating or guiding means using mechanical guide means
    • A61B 2034/302: Surgical robots specifically adapted for manipulations within body cavities, e.g. within abdominal or thoracic cavities
    • A61B 2034/305: Details of wrist mechanisms at distal ends of robotic arms

Abstract

The invention relates to a parallel robot, a system, a device and a storage medium. The parallel robot comprises an end effector, and a first plane moving mechanism and a second plane moving mechanism whose movement planes are parallel to each other; the first plane moving mechanism and the second plane moving mechanism each have at least two degrees of freedom of planar movement; the moving distal end of the first plane moving mechanism is connected with a first rotary kinematic pair structure of the end effector; the moving distal end of the second plane moving mechanism is connected with a second rotary kinematic pair structure of the end effector; and the pose of the end effector is changed by controlling the first plane moving mechanism to act in cooperation with the second plane moving mechanism. The present application reduces the size of the robot while improving its rigidity and positioning accuracy.

Description

Parallel robot, system, device and storage medium
Technical Field
The present invention relates to the field of robotics, and in particular, to a parallel robot, system, device, and storage medium.
Background
Robots offer a short learning curve, short training time, the ability to work under ionizing radiation, low actuation latency, high accuracy and other advantages, and are therefore widely used in industry, medicine, agriculture, the service industry, construction, the military and other fields.
However, conventional robots used for medical operations generally adopt a Cartesian (rectangular coordinate) structure or a multi-degree-of-freedom rotary arm. Such robots are bulky, restrict the clinician's working space, and suffer from weak rigidity and low positioning accuracy.
Disclosure of Invention
Therefore, it is necessary to provide a parallel robot, a system, a device and a storage medium for solving the technical problems of large size, weak rigidity and low positioning accuracy of the conventional robot, so as to reduce the size of the robot and improve the rigidity and the positioning accuracy of the robot.
In order to achieve the above and other objects, an aspect of the embodiments of the present application provides a parallel robot, including an end effector, and a first plane moving mechanism and a second plane moving mechanism whose movement planes are parallel to each other; the first plane moving mechanism and the second plane moving mechanism each have at least two degrees of freedom of planar movement; the moving distal end of the first plane moving mechanism is connected with a first rotary kinematic pair structure of the end effector; the moving distal end of the second plane moving mechanism is connected with a second rotary kinematic pair structure of the end effector; and the pose of the end effector is changed by controlling the first plane moving mechanism to act in cooperation with the second plane moving mechanism.
In the parallel robot of the above embodiment, the first plane moving mechanism and the second plane moving mechanism, whose movement planes are parallel, each have at least two degrees of freedom of planar movement. The moving distal end of the first plane moving mechanism is connected to the first rotary kinematic pair structure of the end effector, so the first rotary kinematic pair structure can be driven to move within the movement plane of the first plane moving mechanism by controlling the first plane moving mechanism; the moving distal end of the second plane moving mechanism is connected to the second rotary kinematic pair structure of the end effector, so the second rotary kinematic pair structure can be driven to move within the movement plane of the second plane moving mechanism by controlling the second plane moving mechanism. The pose of the end effector is changed by controlling the first plane moving mechanism to act in cooperation with the second plane moving mechanism, so that the object carried by the end effector can be accurately positioned. Because the first plane moving mechanism and the second plane moving mechanism are controlled independently and in parallel, the rigidity of the robot is effectively improved and its size is reduced compared with a serial robot whose moving mechanisms are connected in series across multiple degrees of freedom. Because the two plane moving mechanisms with parallel movement planes act in coordination, system and structural errors introduced by multi-axis transmission are avoided compared with the traditional approach of positioning the object carried by the end effector through multi-axis linkage, which effectively improves the positioning accuracy of the parallel robot.
In some embodiments, the first plane moving mechanism comprises a first driving device and a second driving device connected via a first five-bar linkage; the second plane moving mechanism comprises a third driving device and a fourth driving device connected via a second five-bar linkage; and the first driving device, the second driving device, the third driving device and the fourth driving device are all arranged on the frame, so that the overlapping area of the orthographic projections of the movement planes of the first and second plane moving mechanisms onto a plane parallel to them is as large as possible, which effectively reduces the size of the robot while ensuring that the movement space of the end effector remains large enough.
In some embodiments, the parallel robot further comprises a needle holder disposed on a surface of the end effector remote from the first planar moving mechanism and the second planar moving mechanism for fixing a puncture needle; the first plane moving mechanism is controlled to cooperate with the second plane moving mechanism to act, so that the position and the extending direction of the puncture needle are changed, and the puncture needle is controlled to puncture a focus accurately.
In some embodiments, the parallel robot further comprises a first motor, a second motor, a third motor, a fourth motor and a controller. The first driving device is connected with the first five-bar linkage via the first motor; the second driving device is connected with the first five-bar linkage via the second motor; the third driving device is connected with the second five-bar linkage via the third motor; and the fourth driving device is connected with the second five-bar linkage via the fourth motor. The controller is connected with the first, second, third and fourth driving devices and is configured to: acquire a target pose control command indicating the target position and target direction of the puncture needle; and parse the target pose control command according to a first preset algorithm to obtain a first, second, third and fourth angle drive command, so that the first driving device drives the first motor to rotate by a first angle based on the first angle drive command, the second driving device drives the second motor to rotate by a second angle based on the second angle drive command, the third driving device drives the third motor to rotate by a third angle based on the third angle drive command, and the fourth driving device drives the fourth motor to rotate by a fourth angle based on the fourth angle drive command, thereby driving the distal puncture needle to the target position and target direction. The puncture needle can thus be controlled to puncture the lesion accurately, avoiding the long learning curve, long operation time, susceptibility to erroneous operation and heavy dependence on the operator's experience and ability that characterize manual operation.
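As an illustrative sketch only (the class and parameter names below are assumptions, not terms defined in this application), the controller behaviour described above amounts to parsing one target pose control command into four angle drive commands and handing one command to each driving device; a concrete way of obtaining the four angles from the needle pose is sketched later in the detailed description of the five-bar linkages.

```python
from dataclasses import dataclass
from typing import Callable, Sequence, Tuple

Vec3 = Tuple[float, float, float]

@dataclass
class TargetPoseCommand:
    position: Vec3    # target position of the puncture needle (e.g. the needle tip)
    direction: Vec3   # target direction of the needle axis (unit vector)

class ParallelRobotController:
    """Parses a target pose control command into four angle drive commands and
    hands one command to each driving device; the first preset algorithm is
    injected as a callable because its geometry is robot-specific."""

    def __init__(self,
                 inverse_kinematics: Callable[[TargetPoseCommand], Sequence[float]],
                 drives: Sequence[Callable[[float], None]]):
        self.inverse_kinematics = inverse_kinematics  # maps a pose to four joint angles
        self.drives = drives                          # one callback per driving device/motor

    def execute(self, command: TargetPoseCommand) -> None:
        angles = self.inverse_kinematics(command)     # first..fourth angle drive commands
        for drive, angle in zip(self.drives, angles):
            drive(angle)                              # each driving device rotates its motor by its angle
```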
A second aspect of the embodiments of the present application provides a parallel robot system, including any one of the parallel robots and the control processing device described in the embodiments of the present application, wherein a plurality of positioning markers made of a first preset material are disposed on an outer surface of the end effector; a control processing device is communicatively interconnected with the parallel robots and configured to: receiving a medical image of a target object and the positioning marker disposed adjacent to the target object; reconstructing a three-dimensional model of the target object from the medical image; identifying a lesion, the localization marker, and other tissues from the three-dimensional model; determining an initial target pose of the end effector according to the identified lesion; calculating the target pose of the end effector in a base coordinate system according to the initial target pose, the transformation relation of an image recognition coordinate system and the base coordinate system of the parallel robot; and generating the target pose control command according to the target pose.
In the parallel robot system in the above embodiment, a medical image of a target object and the positioning marker disposed adjacent to the target object is received, a three-dimensional model of the target object is reconstructed according to the medical image, a lesion, the positioning marker and other tissues are identified according to the three-dimensional model, so as to determine an initial target pose of the end effector according to the identified lesion, and after a target pose of the end effector in a base coordinate system is calculated according to a transformation relation between the initial target pose, an image recognition coordinate system and the base coordinate system of the parallel robot, a target pose control command is generated according to the target pose, so as to control the parallel robot to move and drive the end effector to move to the target pose indicated by the target pose control command.
In some embodiments, the parallel robotic system further comprises a needle holder disposed at the end effector, the needle holder for securing a puncture needle; the control processing apparatus is further configured to: acquiring an initial puncture needle inserting point, and determining an initial puncture target point according to the identified focus; calculating the puncture needle inserting point of the base coordinate system according to the initial puncture needle inserting point and the transformation relation, and calculating the puncture target point of the base coordinate system according to the initial puncture target point and the transformation relation; and generating the target pose control command according to the puncture needle inserting point and the puncture target point so as to control the puncture needle to move to a target position and a target direction.
In the parallel robot system of the above embodiment, the needle holder for fixing the puncture needle is provided on the end effector, and the control processing device is configured to: acquire an initial puncture needle inserting point and determine an initial puncture target point according to the identified lesion; calculate the puncture needle inserting point in the base coordinate system according to the initial puncture needle inserting point and the transformation relationship between the image recognition coordinate system and the base coordinate system of the parallel robot, and calculate the puncture target point in the base coordinate system according to the initial puncture target point and the same transformation relationship; and generate a target pose control command according to the puncture needle inserting point and the puncture target point so as to control the puncture needle to move to the target position and target direction indicated by the command. An accurate puncture operation is thus achieved, avoiding the long learning curve, long operation time, susceptibility to erroneous operation and heavy dependence on the operator's experience and ability associated with manual operation.
In some embodiments, the other tissue comprises blood vessels and/or bone; and the control processing apparatus is further configured to: after determining the initial puncture target point and before calculating the puncture needle inserting point and the puncture target point, judge whether the initial puncture path passing through the initial puncture needle inserting point and the initial puncture target point passes through a blood vessel and/or a bone; if it does not, calculate the puncture needle inserting point and the puncture target point; otherwise, reacquire the initial puncture needle inserting point. This embodiment effectively prevents the determined initial puncture path from passing through a blood vessel and/or a bone, thereby effectively reducing the injury caused to the target object by the puncture.
In some embodiments, the transformation relationship comprises a rotation matrix R and a translation matrix T, and the step of calculating the transformation relationship comprises: determining first coordinates of the center points of the identified positioning markers in the image recognition coordinate system; and calculating the rotation matrix R and the translation matrix T by rigid registration according to the first coordinates and pre-acquired second coordinates of the same center points in the base coordinate system. The rotation matrix R and the translation matrix T establish the correspondence between the initial puncture needle inserting point and the puncture needle inserting point of the parallel robot, and between the initial puncture target point and the puncture target point of the parallel robot, so that the parallel robot can be controlled and its end effector driven to move accurately for precise positioning/operation.
In some embodiments, the puncture needle inserting point P_in^Robot and the puncture target point P_d^Robot are calculated according to the following formulas: P_in^Robot = R * P_in^Image + T; P_d^Robot = R * P_d^Image + T, where P_in^Image is the initial puncture needle inserting point in the image recognition coordinate system and P_d^Image is the initial puncture target point determined based on the identified lesion. The initial puncture needle inserting point P_in^Image and the initial puncture target point P_d^Image in the image recognition coordinate system are thereby converted into the puncture needle inserting point P_in^Robot and the puncture target point P_d^Robot in the base coordinate system of the parallel robot, so that the parallel robot can be intelligently controlled according to the image recognition result and its end effector driven to move accurately for precise positioning/operation.
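For illustration, a minimal numerical sketch of this registration and conversion step is given below, assuming the marker center points are available as corresponding point sets in both coordinate systems; the SVD-based (Kabsch) solver is one common way to perform the rigid registration and is used here only as an example, not as the method prescribed by this application.

```python
import numpy as np

def rigid_registration(markers_image: np.ndarray, markers_base: np.ndarray):
    """Estimate R and T such that markers_base ≈ R @ markers_image + T.
    markers_image / markers_base: (N, 3) arrays of corresponding marker-center
    coordinates (first coordinates in the image recognition frame, second
    coordinates in the robot base frame)."""
    c_img = markers_image.mean(axis=0)
    c_base = markers_base.mean(axis=0)
    H = (markers_image - c_img).T @ (markers_base - c_base)   # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))                    # guard against a reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    T = c_base - R @ c_img
    return R, T

def to_base_frame(R: np.ndarray, T: np.ndarray, point_image) -> np.ndarray:
    """P_Robot = R * P_Image + T, applied to the initial puncture needle
    inserting point or to the initial puncture target point."""
    return R @ np.asarray(point_image, dtype=float) + T
```

Applying to_base_frame to P_in^Image and P_d^Image yields P_in^Robot and P_d^Robot exactly as in the formulas above.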
In some embodiments, the step of determining whether the initial puncture path via the initial puncture needle point and the initial puncture target point passes through a blood vessel and/or a bone comprises: and performing simulated collision detection on the linear initial puncture path, and judging whether the linear initial puncture path passes through a blood vessel and/or a bone. The determined initial puncture path is prevented from passing through a blood vessel and/or a bone through the simulation collision detection, so that the damage of the puncture to a target object can be effectively reduced, and the working intelligence and safety of the parallel robot system are improved.
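One straightforward way to realize such a simulated collision check, assuming the segmented three-dimensional model is available as a labelled voxel volume with known voxel spacing and its origin at index (0, 0, 0) (the sampling-based scheme and the label values below are assumptions made for this sketch only):

```python
import numpy as np

VESSEL_LABEL, BONE_LABEL = 2, 3   # assumed label values in the segmented volume

def path_hits_vessel_or_bone(label_volume: np.ndarray, voxel_spacing_mm: np.ndarray,
                             entry_mm: np.ndarray, target_mm: np.ndarray,
                             step_mm: float = 0.5) -> bool:
    """Sample the straight line from the initial puncture needle inserting point
    to the initial puncture target point (both in image millimetre coordinates)
    and report whether any sample lands on a voxel labelled as vessel or bone."""
    span = target_mm - entry_mm
    n_steps = max(int(np.linalg.norm(span) / step_mm), 1)
    for t in np.linspace(0.0, 1.0, n_steps + 1):
        idx = np.round((entry_mm + t * span) / voxel_spacing_mm).astype(int)  # mm -> voxel index
        if np.any(idx < 0) or np.any(idx >= label_volume.shape):
            continue                                   # sample lies outside the scanned volume
        if label_volume[tuple(idx)] in (VESSEL_LABEL, BONE_LABEL):
            return True                                # reacquire the initial inserting point
    return False                                       # safe: proceed to compute the base-frame points
```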
In some embodiments, identifying the lesion, the positioning markers and other tissues from the three-dimensional model comprises at least one of: segmenting the lesion from the three-dimensional model by a region growing method; segmenting the positioning markers from the three-dimensional model by a threshold segmentation method; and segmenting other tissues from the three-dimensional model by an image interaction method. This embodiment can effectively identify at least one of the lesion, the positioning markers and other tissues in the three-dimensional model reconstructed from the medical image of the target object, so that the parallel robot can be controlled to treat the target object intelligently and accurately, improving the intelligence of the medical operation and avoiding the long learning curve, long operation time, susceptibility to erroneous operation and heavy dependence on the operator's experience and ability associated with manual operation.
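A compact sketch of the first two segmentation strategies named above follows; the intensity thresholds, the tolerance parameter and the 6-connectivity rule are illustrative assumptions rather than values specified in this application.

```python
import numpy as np
from collections import deque

def threshold_segment(volume: np.ndarray, low: float, high: float) -> np.ndarray:
    """Threshold segmentation: e.g. the positioning markers, whose preset material
    gives them a distinctive, narrow intensity range in the medical image."""
    return (volume >= low) & (volume <= high)

def region_grow(volume: np.ndarray, seed: tuple, tol: float) -> np.ndarray:
    """Region growing from a seed voxel inside the lesion: flood-fill the
    6-connected neighbourhood while the intensity stays within +/- tol of the seed."""
    mask = np.zeros(volume.shape, dtype=bool)
    seed_val = volume[seed]
    queue = deque([seed])
    mask[seed] = True
    offsets = [(1, 0, 0), (-1, 0, 0), (0, 1, 0), (0, -1, 0), (0, 0, 1), (0, 0, -1)]
    while queue:
        z, y, x = queue.popleft()
        for dz, dy, dx in offsets:
            n = (z + dz, y + dy, x + dx)
            if all(0 <= n[i] < volume.shape[i] for i in range(3)) and not mask[n]:
                if abs(volume[n] - seed_val) <= tol:
                    mask[n] = True
                    queue.append(n)
    return mask
```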
In some embodiments, the parallel robot system further includes a bed board made of a second preset material, which is used to bear the target object and to fix the parallel robot. The bed board is fixed on the scanning bed surface of the imaging device so that it moves together with the scanning bed, allowing a medical image containing the target object, the parallel robot fixed on the bed board and the positioning markers to be acquired at the same time, for subsequent image recognition and for controlling the parallel robot according to the image recognition result to drive its end effector accurately for precise positioning/operation.
A third aspect of the embodiments of the present application provides a control processing apparatus, including an image processing unit and a path planning unit, where the image processing unit is configured to receive a medical image of a target object and a positioning marker disposed adjacent to the target object, reconstruct a three-dimensional model of the target object according to the medical image, and identify a lesion, the positioning marker, and other tissues according to the three-dimensional model; the path planning unit is used for determining an initial target pose of an end effector arranged at the tail end of the robot according to the identified focus, calculating a target pose of the end effector in a base coordinate system according to the initial target pose, a transformation relation of an image identification coordinate system and the base coordinate system of the robot, and generating a target pose control command according to the target pose.
In the control processing apparatus in the above embodiment, an image processing unit is arranged to receive a medical image of a target object and a positioning marker arranged adjacent to the target object, reconstruct a three-dimensional model of the target object according to the medical image, and identify a lesion, the positioning marker and other tissues according to the three-dimensional model; and a path planning unit is arranged to determine an initial target pose of an end effector arranged at the tail end of the robot according to the identified focus, calculate a target pose of the end effector in a base coordinate system according to a transformation relation among the initial target pose, the image identification coordinate system and the base coordinate system of the robot, and generate a target pose control command according to the target pose so as to control the action of the robot and drive the end effector to move to the target pose indicated by the target pose control command, so that accurate positioning/operation is realized.
In some embodiments, the control and processing device is used for controlling the puncture needle to move to a target position and a target direction; the path planning unit is further used for acquiring an initial puncture needle inserting point and determining an initial puncture target point according to the identified focus; the device also comprises a positioning device, wherein the positioning device is used for calculating the puncture needle inserting point of the base coordinate system according to the initial puncture needle inserting point and the transformation relation and calculating the puncture target point of the base coordinate system according to the initial puncture target point and the transformation relation; and generating the target pose control command according to the puncture needle inserting point and the puncture target point so as to control the puncture needle to move to a target position and a target direction.
In the control processing device of the above embodiment, the positioning device is configured to calculate the puncture needle inserting point in the base coordinate system according to the initial puncture needle inserting point and the transformation relationship between the image recognition coordinate system and the base coordinate system of the parallel robot, and to calculate the puncture target point in the base coordinate system according to the initial puncture target point and the same transformation relationship; and to generate the target pose control command according to the puncture needle inserting point and the puncture target point so as to control the puncture needle to move to the target position and target direction. An accurate puncture operation is thus achieved, avoiding the long learning curve, long operation time, susceptibility to erroneous operation and heavy dependence on the operator's experience and ability associated with manual operation.
In some embodiments, the other tissue comprises blood vessels and/or bone; and the path planning unit is further configured to: after determining the initial puncture target point and before calculating the puncture needle inserting point and the puncture target point, judge whether the initial puncture path passing through the initial puncture needle inserting point and the initial puncture target point passes through a blood vessel and/or a bone; if it does not, calculate the puncture needle inserting point and the puncture target point; otherwise, reacquire the initial puncture needle inserting point. This embodiment effectively prevents the determined initial puncture path from passing through a blood vessel and/or a bone, thereby effectively reducing the injury caused to the target object by the puncture.
In some embodiments, the image processing unit is further configured to perform at least one of: segmenting the lesion from the three-dimensional model by a region growing method; segmenting the positioning markers from the three-dimensional model by a threshold segmentation method; and segmenting other tissues from the three-dimensional model by an image interaction method. This embodiment can effectively identify at least one of the lesion, the positioning markers and other tissues in the three-dimensional model reconstructed from the medical image of the target object, so that the parallel robot can be controlled to treat the target object intelligently and accurately, improving the intelligence of the medical operation and avoiding the long learning curve, long operation time, susceptibility to erroneous operation and heavy dependence on the operator's experience and ability associated with manual operation.
A fourth aspect of the embodiments of the present application provides a motion detection apparatus, including a plurality of sensors and a position tracking device. The sensors are configured to be disposed at detection positions of a target object to detect position data of those detection positions; the position tracking device is connected with the sensors and is used for tracking the position change of each detection position according to the received position data and generating action prompt information according to the position change, so as to instruct a preset action to be executed on the target object. Inaccurate or failed positioning or operation caused by movement of the detection positions of the target object is thereby avoided. Compared with the traditional puncture operation assisted by respiratory gating equipment, the present application requires no additional auxiliary equipment and can effectively address lesion displacement caused by the target object's respiration.
In some embodiments, the position tracking device is configured to: filter and/or amplify the time series of the position data to obtain preprocessed position data; fit the preprocessed position data to obtain a detection position change curve; acquire the variation over a preset time on the detection position change curve; and, if the variation is smaller than or equal to a preset displacement threshold, generate the action prompt information to prompt the operator to execute the preset operation. Inaccurate or failed positioning or operation caused by movement of the detection position of the target object is thereby avoided, and the intelligence and safety of the operation are improved.
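A minimal sketch of this decision logic, assuming a one-dimensional displacement time series per detection position; the moving-average filter (standing in for the filtering/fitting steps), the window length and the displacement threshold are assumptions chosen for illustration.

```python
import numpy as np

def ready_to_prompt(position_mm: np.ndarray, sample_rate_hz: float,
                    window_s: float = 2.0, threshold_mm: float = 1.0,
                    smooth_n: int = 5) -> bool:
    """Return True when the tracked detection position has been stable enough to
    generate the action prompt: smooth the position time series, take the range
    of the last `window_s` seconds and compare it with the preset displacement
    threshold."""
    kernel = np.ones(smooth_n) / smooth_n
    smoothed = np.convolve(position_mm, kernel, mode="valid")   # filtering step
    n_recent = max(int(window_s * sample_rate_hz), 1)
    recent = smoothed[-n_recent:]                               # the preset time window
    variation = recent.max() - recent.min()                     # variation over the window
    return variation <= threshold_mm                            # small enough -> prompt the action
```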
In some embodiments, the preset action comprises a puncturing action; the detection position is located in at least one of: a skin region directly above the upper right lung lobe of the target subject; a skin region directly above the lower right lobe of the lung of the target subject; a region of skin directly above the middle of the left lung of the target subject; the location tracking device is further configured to: and generating action prompt information according to the position change so as to instruct to execute a puncture action on the lung of the target object. According to the embodiment, the accurate detection of the lung movement condition of the target object is realized, the puncture action is performed during the breath holding period of the target object, the inaccurate/failed positioning or operation of an operator caused by the lung breathing movement of the target object is avoided, and the intelligence and the safety of the puncture operation are improved.
A fifth aspect of an embodiment of the present application provides a medical system, including a robot, an imaging device, and a control processing device, where a plurality of positioning markers made of a first preset material are disposed on an end effector of the robot; the imaging device is used for scanning and imaging a target object and the positioning marker arranged close to the target object to acquire a medical image; a control processing device is communicatively interconnected with both the imaging device and the robot, and is configured to: receiving a medical image of a target object and the positioning marker disposed adjacent to the target object; reconstructing a three-dimensional model of the target object from the medical image; identifying a lesion, the localization marker, and other tissues from the three-dimensional model; determining an initial target pose of the end effector according to the identified lesion; calculating the target pose of the end effector in a base coordinate system according to the initial target pose, the transformation relation of an image recognition coordinate system and the base coordinate system of the robot; and generating a target pose control command according to the target pose so as to control the robot to act and drive the end effector to move to the target pose indicated by the target pose control command.
In the medical system of the above embodiment, the imaging device is configured to acquire a medical image of the target object and of the positioning markers disposed adjacent to the target object, the positioning markers being disposed on the end effector of the robot; and the control processing device is configured to receive the medical image of the target object and the adjacent positioning markers, reconstruct a three-dimensional model of the target object from the medical image, identify the lesion, the positioning markers and other tissues from the three-dimensional model, determine an initial target pose of the end effector according to the identified lesion, calculate the target pose of the end effector in the base coordinate system according to the initial target pose and the transformation relationship between the image recognition coordinate system and the base coordinate system of the robot, and generate a target pose control command according to the target pose, so that the robot can be controlled intelligently according to the medical image acquired by the imaging device and the end effector driven to the target pose indicated by the command. Compared with traditional medical-operation navigation based on electromagnetic or optical navigation, which requires more complicated operations such as spatial transformation and registration, the present embodiment is simple and convenient, has low implementation cost, occupies little space, and avoids the uncertainties introduced by manual involvement.
In some embodiments, the medical system further comprises a needle holder disposed at the end effector, the needle holder being used for fixing a puncture needle; and the control processing apparatus is further configured to: acquire an initial puncture needle inserting point and determine an initial puncture target point according to the identified lesion; calculate the puncture needle inserting point in the base coordinate system according to the initial puncture needle inserting point and the transformation relationship, and calculate the puncture target point in the base coordinate system according to the initial puncture target point and the transformation relationship; and generate the target pose control command according to the puncture needle inserting point and the puncture target point, so as to control the robot to act and drive the puncture needle to move to the target position and target direction indicated by the command. An accurate puncture operation is thus achieved, avoiding the long learning curve, long operation time, susceptibility to erroneous operation and heavy dependence on the operator's experience and ability associated with manual puncture. Compared with traditional puncture navigation based on electromagnetic or optical navigation, which requires more complicated operations such as spatial transformation and registration, the present embodiment is simple and convenient, has low implementation cost, occupies little space, and avoids the uncertainties introduced by manual involvement.
In some embodiments, the imaging device comprises at least one of a positron emission tomography device, a computed tomography device, a magnetic resonance device, an X-ray device, an ultrasound image device, and a multi-modality fusion imaging device.
A sixth aspect of an embodiment of the present application provides a medical system comprising an imaging device, a control processing device, a robot and the motion detection apparatus of any embodiment of the present application. The control processing device is communicatively interconnected with the imaging device, the motion detection apparatus and the robot, and is configured to: receive a medical image of a target object and the positioning markers disposed adjacent to the target object; reconstruct a three-dimensional model of the target object from the medical image; identify the lesion, the positioning markers and other tissues from the three-dimensional model; determine an initial target pose of the end effector according to the identified lesion; calculate the target pose of the end effector in the base coordinate system of the robot according to the initial target pose and the transformation relationship between the image recognition coordinate system and the base coordinate system; generate a target pose control command according to the target pose; and control the robot to act according to the target pose control command and the action prompt information, driving the end effector to the target pose indicated by the target pose control command.
A seventh aspect of embodiments of the present application provides a storage medium having a computer program stored thereon; the computer program, when executed by a processor, performs the steps of: receiving a medical image of a target object and a positioning marker disposed adjacent to the target object; reconstructing a three-dimensional model of the target object from the medical image; identifying a lesion, the localization marker, and other tissues from the three-dimensional model; determining an initial target pose of an end effector of the robot according to the identified focus; calculating the target pose of the end effector in a base coordinate system according to the initial target pose, the transformation relation of an image recognition coordinate system and the base coordinate system of the robot; and generating a target pose control command according to the target pose so as to control the robot to drive the end effector to move to the target pose indicated by the target pose control command, and intelligently controlling the robot to be accurately positioned/operated according to an image recognition result.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present disclosure, the drawings required to be used in the description of the embodiments will be briefly introduced below, and it is apparent that the drawings in the following description are only some embodiments of the present disclosure, and it is obvious for those skilled in the art that other drawings may be obtained according to the drawings without creative efforts.
FIG. 1 is a schematic diagram of a parallel robot in one embodiment;
FIG. 2 is an equivalent schematic view of the first five-bar linkage of the embodiment of FIG. 1;
FIG. 3 is a schematic diagram of an end effector of an embodiment;
FIG. 4 is a schematic diagram of the control circuitry of the parallel robot in one embodiment;
FIG. 5 is a schematic diagram of a parallel robotic system according to an embodiment;
FIG. 6 is a schematic diagram illustrating initial puncture path validation in accordance with an exemplary embodiment;
FIG. 7 is a schematic diagram illustrating a transformation principle between an image recognition coordinate system and a base coordinate system of a parallel robot according to an embodiment;
fig. 8 is a schematic structural view of a bed plate in an embodiment;
FIG. 9 is a schematic diagram of a control processing apparatus according to an embodiment;
FIG. 10 is a schematic structural diagram of a control processing device according to another embodiment;
FIG. 11 is a schematic diagram of an embodiment of a motion detection apparatus;
FIGS. 12-13 are schematic diagrams illustrating an application scenario of the motion detection apparatus according to an embodiment;
FIG. 14 is a schematic diagram of the configuration of the medical system in one embodiment;
fig. 15 is a schematic view of an application scenario of the medical system in an embodiment.
Description of reference numerals:
401. a first plane moving mechanism; 402. a second plane moving mechanism; 403. an end effector; 404. an axis of the puncture needle; 405. a first plane intersection; 406. a second plane intersection; 407. a first rotary kinematic pair structure; 408. a second rotary kinematic pair structure; 409. a needle holder; 4010. a positioning marker; 4011. a first driving device; 4012. a second driving device; 4013. a first five-bar linkage; 4023. a second five-bar linkage; 4021. a third driving device; 4022. a fourth driving device; 4031. a first motor; 4032. a second motor; 4033. a third motor; 4034. a fourth motor; 4035. a controller; 4014. a first moving plane; 4024. a second moving plane; 100. a robot; 101. a parallel robot; 102. a control processing device; 103. a motion detection device; 200. an imaging device; 601. a position tracking device; 104. a support arm; 105. a bed board; 106. a CT bed; 602. a first sensor; 603. a second sensor; 604. a third sensor; 1074. an initial puncture needle inserting point; 1075. an initial puncture target point; 1076. a linear initial puncture path; 1. a first positioning marker ball; 2. a second positioning marker ball; 3. a third positioning marker ball; 4. a fourth positioning marker ball; 211. an image processing unit; 212. a path planning unit; 213. a positioning device.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
To facilitate an understanding of the present application, the present application will now be described more fully with reference to the accompanying drawings. Preferred embodiments of the present application are shown in the drawings. This application may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs. The terminology used in the description of the present application herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used herein, the term "and/or" includes any and all combinations of one or more of the associated listed items.
Where the terms "comprising," "having," and "including" are used herein, another element may be added unless an explicit limitation is used, such as "only," "consisting of … …," etc. Unless mentioned to the contrary, terms in the singular may include the plural and are not to be construed as being one in number.
It will be understood that, although the terms first, second, etc. may be used herein to describe various elements, these elements should not be limited by these terms. These terms are only used to distinguish one element from another. For example, a first element could be termed a second element, and, similarly, a second element could be termed a first element, without departing from the scope of the present application.
In this application, unless otherwise expressly stated or limited, the terms "connected" and "connecting" are used broadly and encompass, for example, direct connection, indirect connection via an intermediary, communication between two elements, or interaction between two elements. The specific meaning of the above terms in the present application can be understood by those of ordinary skill in the art as appropriate.
It is noted that the robot referred to in the present application comprises a terminal device comprising a processor and control information applied by the processor to the drive motor for controlling operation of the surgical instrument.
Referring to fig. 1, an embodiment of the present application provides a parallel robot 101, which includes an end effector 403, and a first plane moving mechanism 401 and a second plane moving mechanism 402 that move planes in parallel; the first plane moving mechanism 401 and the second plane moving mechanism 402 each have at least two degrees of freedom of plane movement; the moving distal end of the first planar moving mechanism 401 is connected to the first rotary kinematic pair structure 407 of the end effector 403; the moving distal end of the second planar moving mechanism 402 is connected with the second rotary kinematic pair structure 408 of the end effector 403; wherein, the pose of the end effector 403 is changed by controlling the first plane moving mechanism 401 to act in cooperation with the second plane moving mechanism 402.
As an example, continuing to refer to fig. 1, the movement plane of the first plane moving mechanism 401 may be set to a first movement plane 4014, the movement plane of the second plane moving mechanism 402 may be set to a second movement plane 4024, and the first movement plane 4014 and the second movement plane 4024 are parallel to each other; the first plane moving mechanism 401 and the second plane moving mechanism 402 both have at least two degrees of freedom of plane movement, the moving far end of the first plane moving mechanism 401 is connected with the first rotating kinematic pair structure 407 of the end effector 403, and the first rotating kinematic pair structure 407 can be driven to move in the first moving plane 4014 by controlling the first plane moving mechanism 401; the far end of the second planar moving mechanism 402 is connected to the second rotary kinematic pair structure 408 of the end effector 403, and the second planar moving mechanism 402 can be controlled to drive the second rotary kinematic pair structure 408 to move in the second moving plane 4024; the first plane moving mechanism 401 is controlled to cooperate with the second plane moving mechanism 402 to change the pose of the end effector 403, so that the end effector 403 can be accurately positioned to bear the object. Because the first plane moving mechanism 401 and the second plane moving mechanism 402 are independently controlled in parallel, compared with a tandem robot comprising moving mechanisms with a plurality of degrees of freedom in tandem, the robot has the advantages that the rigidity of the robot is effectively improved, and the size of the robot is reduced. Because the first plane moving mechanism 401 and the second plane moving mechanism 402 which are parallel to the moving plane act cooperatively, compared with the traditional serial robot which uses multi-axis linkage to position an object carried by the end effector 403, the method avoids errors such as system errors or structural errors caused by multi-axis transmission, and effectively improves the positioning accuracy of the parallel robot 101; and the parallel robot has stronger structural rigidity compared with the series robot.
As an example, continuing to refer to fig. 1, the first rotary kinematic pair structure 407, the second rotary kinematic pair structure 408, the first planar moving mechanism 401 and the second planar moving mechanism 402 are all located on the same side of the end effector 403, so as to further effectively reduce the volume of the robot.
By way of example, with continued reference to fig. 1, the moving distal end a of the first planar movement mechanism 401 is articulated with the first rotary kinematic pair structure 407 of the end effector 403; the distal moving end b of the second planar moving mechanism 402 is hinged to the second rotary motion pair structure 408 of the end effector 403 to ensure the degrees of freedom of the first planar moving mechanism 401 and the second planar moving mechanism 402 in the respective moving planes.
As an example, continuing to refer to fig. 1, the first plane moving mechanism 401 includes a first driving device 4011, a second driving device 4012 connected via a first five-link mechanism; the second plane movement mechanism 402 includes a third drive device 4021 and a fourth drive device 4022 connected via a second five-link mechanism; the first driving device 4011, the second driving device 4012, the third driving device 4021 and the fourth driving device 4022 are all disposed on the frame, so that the overlapping area of the orthographic projections of the moving plane of the first planar moving mechanism 401 and the moving plane of the second planar moving mechanism 402 on a plane parallel to the moving plane is increased, and the size of the robot is effectively reduced on the premise that the moving space of the end effector 403 is ensured to be large enough.
As an example, referring to fig. 2, the first five-bar linkage 4013 of the first plane moving mechanism in fig. 1 is taken as an example to illustrate the implementation principle of the five-bar linkage in the present application. The first five-link mechanism comprises a connecting rod L1, a connecting rod L2, a connecting rod L3, a connecting rod L4 and a connecting rod L5 which are sequentially connected end to end, wherein a first driving device 4011 is arranged between the connecting rod L1 and the connecting rod L3, a second driving device 4012 is arranged between the connecting rod L1 and the connecting rod L2, and a first rotating kinematic pair structure 407 is arranged between the connecting rod L4 and the connecting rod L5.
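As a hedged illustration of how each planar five-bar mechanism can be driven to a commanded point in its moving plane, the sketch below solves the inverse kinematics of a conventional planar five-bar with two actuated joints on the fixed link; it is a simplified stand-in for the exact link arrangement of fig. 2, and the base positions and link lengths are assumed parameters, not values defined in this application.

```python
import math

def chain_angle(base_xy, target_xy, l_proximal, l_distal, elbow_sign=1.0):
    """Actuated joint angle of one RR chain of a planar five-bar so that the
    chain's distal joint reaches target_xy (all quantities in the moving plane)."""
    dx, dy = target_xy[0] - base_xy[0], target_xy[1] - base_xy[1]
    reach = math.hypot(dx, dy)
    if not abs(l_proximal - l_distal) <= reach <= l_proximal + l_distal:
        raise ValueError("target point is outside the workspace of this chain")
    cos_a = (l_proximal ** 2 + reach ** 2 - l_distal ** 2) / (2.0 * l_proximal * reach)
    alpha = math.acos(max(-1.0, min(1.0, cos_a)))      # law of cosines at the actuated joint
    return math.atan2(dy, dx) + elbow_sign * alpha

def five_bar_ik(target_xy, base_left_xy, base_right_xy, l_proximal, l_distal):
    """Both actuated joint angles of a planar five-bar whose common distal joint
    (the point carrying the rotary kinematic pair structure) must reach target_xy."""
    theta_left = chain_angle(base_left_xy, target_xy, l_proximal, l_distal, elbow_sign=+1.0)
    theta_right = chain_angle(base_right_xy, target_xy, l_proximal, l_distal, elbow_sign=-1.0)
    return theta_left, theta_right
```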
As an example, referring to fig. 3, the parallel robot further includes a needle holder 409, the needle holder 409 is disposed on a surface of the end effector 403 away from the first planar moving mechanism 401 and the second planar moving mechanism 402, and is used for fixing a puncture needle (not shown); the first plane moving mechanism 401 is controlled to cooperate with the second plane moving mechanism 402 to move, so that the position and the extending direction of the puncture needle are changed, and the puncture needle is controlled to perform accurate puncture on a focus. The needles may have a variety of different length gauges, for example needles of different lengths such as 90mm, 100mm, 150mm, 200mm, etc. are common. The doctor or the operator can select the puncture needle with the corresponding length according to the puncture requirement of the puncture position of the target object.
Specifically, with reference to fig. 1, the intersection point of the puncture needle axis 404 of the needle holder with the first moving plane 4014 is the first plane intersection point 405, and the intersection point of the puncture needle axis 404 with the second moving plane 4024 is the second plane intersection point 406. The first driving device 4011 and the second driving device 4012 drive the first five-bar linkage 4013 so as to drive the first plane intersection point 405 to move within the first moving plane 4014, and the third driving device 4021 and the fourth driving device 4022 drive the second five-bar linkage 4023 so as to drive the second plane intersection point 406 to move within the second moving plane 4024. By controlling the first plane moving mechanism 401 to act in cooperation with the second plane moving mechanism 402, the position and extension direction of the puncture needle are changed, so that the puncture needle can be controlled to puncture the lesion precisely. Taking a parallel robot applied to a pulmonary nodule puncture operation as an example, the parallel robot adopts an upper and a lower planar five-bar mechanism, each of which has two degrees of freedom of planar movement; together the two planar five-bar mechanisms realize four degrees of freedom of spatial movement through four motors and cover both the left and the right lung, and the device is compact and easy to install and remove.
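For illustration, the geometric relationship just described can be sketched as follows: given the puncture needle inserting point and the puncture target point in the robot base frame, the needle axis is intersected with the two moving planes, assumed here, for the sketch only, to be horizontal planes of the base frame at known heights; each intersection then becomes the in-plane target that the corresponding five-bar mechanism, e.g. via the five_bar_ik sketch above, must reach.

```python
import numpy as np

def needle_plane_targets(p_insert, p_target, z_plane_first, z_plane_second):
    """Intersect the puncture needle axis (the line through the puncture needle
    inserting point and the puncture target point, both in the robot base frame)
    with the two moving planes, assumed to be horizontal planes at heights
    z_plane_first and z_plane_second.  Each returned (x, y) pair is the in-plane
    target for the corresponding five-bar mechanism."""
    p_insert = np.asarray(p_insert, dtype=float)
    p_target = np.asarray(p_target, dtype=float)
    axis = p_target - p_insert
    if abs(axis[2]) < 1e-9:
        raise ValueError("needle axis is parallel to the moving planes")
    targets = []
    for z in (z_plane_first, z_plane_second):
        t = (z - p_insert[2]) / axis[2]
        x, y, _ = p_insert + t * axis          # first / second plane intersection point
        targets.append((x, y))
    return targets                              # [(x1, y1), (x2, y2)]
```

Feeding each pair into the five-bar inverse kinematics sketched above yields the four joint angles that the controller dispatches as the four angle drive commands, as described next.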
As an example, referring to fig. 4, the parallel robot further includes a first motor 4031, a second motor 4032, a third motor 4033, a fourth motor 4034 and a controller 4035. The first driving device 4011 is connected with the first five-bar linkage via the first motor 4031; the second driving device 4012 is connected with the first five-bar linkage via the second motor 4032; the third driving device 4021 is connected with the second five-bar linkage via the third motor 4033; and the fourth driving device 4022 is connected with the second five-bar linkage via the fourth motor 4034. The controller 4035 is connected with the first driving device 4011, the second driving device 4012, the third driving device 4021 and the fourth driving device 4022, and is configured to: acquire a target pose control command indicating the target position and target direction of the puncture needle; and parse the target pose control command according to a first preset algorithm, such as a robot inverse kinematics algorithm, to obtain a first, second, third and fourth angle drive command, so that the first driving device 4011 drives the first motor 4031 to rotate by a first angle based on the first angle drive command, the second driving device 4012 drives the second motor 4032 to rotate by a second angle based on the second angle drive command, the third driving device 4021 drives the third motor 4033 to rotate by a third angle based on the third angle drive command, and the fourth driving device 4022 drives the fourth motor 4034 to rotate by a fourth angle based on the fourth angle drive command, thereby driving the distal puncture needle to the target position and target direction. The puncture needle is thus controlled to puncture the lesion accurately, avoiding the long learning curve, long operation time, susceptibility to erroneous operation and heavy dependence on the operator's experience and ability associated with manual operation.
As an example, a motor of the parallel robot may be placed at the rear end of the base, and the front end link and the end effector of the parallel robot are both made of a non-metal material, so as to reduce artifacts in CT images.
Referring to fig. 1 to 5, an embodiment of the present application provides a parallel robot system, including a parallel robot 101 and a control processing device 102 in any embodiment of the present application, wherein a plurality of positioning markers 4010 made of a first predetermined material are disposed on an outer surface of an end effector 403 of the parallel robot 101; the control processing device 102 is communicatively interconnected with the parallel robot 101 and is configured to: receiving a medical image of a target object and a positioning marker 4010 arranged adjacent to the target object; reconstructing a three-dimensional model of the target object from the medical image; identifying the focus, the positioning marker 4010 and other tissues according to the three-dimensional model; determining an initial target pose of the end effector 403 according to the identified lesion; calculating the target pose of the end effector 403 in the base coordinate system according to the transformation relation between the initial target pose and the image recognition coordinate system and the base coordinate system of the parallel robot 101; and generating a target pose control command according to the target pose.
Referring to fig. 5, the control processing device 102 receives a medical image of the target object and of the positioning markers 4010 disposed adjacent to the target object, reconstructs a three-dimensional model of the target object from the medical image, identifies the lesion, the positioning markers 4010 and other tissues from the three-dimensional model, determines an initial target pose of the end effector 403 according to the identified lesion, calculates the target pose of the end effector 403 in the base coordinate system according to the initial target pose and the transformation relationship between the image recognition coordinate system and the base coordinate system of the parallel robot 101, and then generates a target pose control command according to the target pose, so as to control the parallel robot 101 to move and drive the end effector 403 to the target pose indicated by the target pose control command.
With continued reference to fig. 1-5, the parallel robot system further includes a needle holder 409 disposed on the end effector 403, the needle holder 409 being used for fixing the puncture needle. The control processing device 102 is further configured to: acquire an initial puncture needle insertion point, and determine an initial puncture target point according to the identified lesion; calculate the puncture needle insertion point in the base coordinate system according to the initial puncture needle insertion point and the transformation relationship between the image recognition coordinate system and the base coordinate system of the parallel robot, and calculate the puncture target point in the base coordinate system according to the initial puncture target point and the same transformation relationship; and generate a target pose control command according to the puncture needle insertion point and the puncture target point, so as to control the puncture needle to move to the target position and the target direction indicated by the target pose control command. Because registration by electromagnetic or optical navigation is not needed and image-based positioning is used directly, accurate puncture is achieved while avoiding the long learning curve, long operation time, susceptibility to misoperation and heavy dependence on the operator's experience and skill that characterize manual operation.
With continued reference to fig. 1-5, the other tissues may include blood vessels and/or bone. The control processing device 102 is further configured to: after the initial puncture target point is determined and before the puncture needle insertion point and the puncture target point are calculated, judge whether the initial puncture path passing through the initial puncture needle insertion point and the initial puncture target point passes through a blood vessel and/or bone; if it does not, calculate the puncture needle insertion point and the puncture target point; otherwise, acquire a new initial puncture needle insertion point. This embodiment effectively prevents the determined initial puncture path from passing through a blood vessel and/or bone, thereby effectively reducing the damage caused by the puncture to the target object.
As an example, identifying the lesion, the positioning markers and other tissues from the three-dimensional model may include at least one of: segmenting the lesion from the three-dimensional model by a region growing method; segmenting the positioning markers from the three-dimensional model by a threshold segmentation method; and segmenting other tissues from the three-dimensional model by an image interaction method. This embodiment effectively identifies at least one of the lesion, the positioning markers and other tissues in the three-dimensional model reconstructed from the medical image of the target object, so that the parallel robot can be controlled to treat the target object intelligently and accurately, improving the intelligence of the medical operation and avoiding the long learning curve, long operation time, susceptibility to misoperation and heavy dependence on the operator's experience and skill that characterize manual operation.
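As an illustrative sketch of the two segmentation strategies named above (region growing and threshold segmentation), the following Python example operates on a CT volume in Hounsfield units; the seed position, tolerance and threshold values are assumptions of the example, not values specified in this disclosure.

```python
import numpy as np
from collections import deque

def region_grow(volume, seed, tolerance=150):
    """Simple 6-connected region growing around a seed voxel.

    Voxels are added while their intensity stays within `tolerance`
    of the seed intensity (an illustrative criterion).
    """
    seed = tuple(seed)
    ref = float(volume[seed])
    mask = np.zeros(volume.shape, dtype=bool)
    mask[seed] = True
    queue = deque([seed])
    offsets = [(1, 0, 0), (-1, 0, 0), (0, 1, 0), (0, -1, 0), (0, 0, 1), (0, 0, -1)]
    while queue:
        z, y, x = queue.popleft()
        for dz, dy, dx in offsets:
            nz, ny, nx = z + dz, y + dy, x + dx
            if (0 <= nz < volume.shape[0] and 0 <= ny < volume.shape[1]
                    and 0 <= nx < volume.shape[2] and not mask[nz, ny, nx]
                    and abs(float(volume[nz, ny, nx]) - ref) <= tolerance):
                mask[nz, ny, nx] = True
                queue.append((nz, ny, nx))
    return mask

def threshold_segment(volume, lower, upper):
    """Binary threshold segmentation, e.g. for high-density marker balls."""
    return (volume >= lower) & (volume <= upper)

# Illustrative use on a CT volume `ct_hu` in Hounsfield units:
# lesion_mask = region_grow(ct_hu, seed=(120, 256, 300), tolerance=150)
# marker_mask = threshold_segment(ct_hu, 6000, 8000)   # titanium markers per the text
```

An interactive segmentation of other tissues would typically wrap similar primitives behind user-supplied seed points or strokes.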
As an example, referring to fig. 6, the step of judging whether the initial puncture path via the initial puncture needle insertion point 1074 and the initial puncture target point 1075 passes through a blood vessel and/or bone includes: performing simulated collision detection on the linear initial puncture path 1076 to judge whether the linear initial puncture path 1076 passes through a blood vessel and/or bone. The simulated collision detection prevents the determined initial puncture path from passing through a blood vessel and/or bone, which effectively reduces the damage caused by the puncture to the target object and improves the intelligence and safety of the parallel robot system.
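A minimal sketch of such a simulated collision check is given below, under the assumption that the segmented tissues are available as an integer label volume and that the path end points are given in voxel coordinates; the label values and sampling density are illustrative, not specified by the text.

```python
import numpy as np

def path_collides(label_volume, entry_vox, target_vox,
                  forbidden_labels=(2, 3), samples_per_voxel=2):
    """Check whether the straight entry->target path crosses forbidden tissue.

    label_volume          -- integer segmentation volume (e.g. 2 = vessel, 3 = bone; assumed labels)
    entry_vox, target_vox -- path end points in voxel coordinates (z, y, x)
    Returns True if any sampled voxel along the segment carries a forbidden label.
    """
    a = np.asarray(entry_vox, float)
    b = np.asarray(target_vox, float)
    n = max(2, int(np.ceil(np.linalg.norm(b - a) * samples_per_voxel)))
    for t in np.linspace(0.0, 1.0, n):
        z, y, x = np.rint(a + t * (b - a)).astype(int)
        if label_volume[z, y, x] in forbidden_labels:
            return True
    return False
```

If the check returns True, a new initial puncture needle insertion point would be acquired and the check repeated, as described above.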
By way of example, referring to fig. 7, the transformation relationship includes a rotation matrix R and a translation matrix T. The positioning markers include a first positioning marker ball 1, a second positioning marker ball 2, a third positioning marker ball 3 and a fourth positioning marker ball 4, and the step of calculating the transformation relationship includes: determining first coordinates of the center points of the identified positioning markers in the image recognition coordinate system o-mn, for example the coordinates (or the average of the coordinates) of the center points of the first positioning marker ball 1, the second positioning marker ball 2, the third positioning marker ball 3 and the fourth positioning marker ball 4 in o-mn; obtaining in advance the corresponding second coordinates of the center points in the base coordinate system o-xyz of the parallel robot, for example the coordinates (or the average of the coordinates) of the same center points in o-xyz; and calculating the rotation matrix R and the translation matrix T by rigid body registration of the first and second coordinates. Through R and T, the correspondence between the initial puncture needle insertion point and the puncture needle insertion point of the parallel robot, and between the initial puncture target point and the puncture target point of the parallel robot, is established, so that the parallel robot 101 can be controlled to move and its end effector 403 driven precisely for accurate positioning/operation.
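The disclosure does not name a specific rigid body registration algorithm; as an assumed illustration, the widely used SVD-based (Kabsch) least-squares solution computes R and T from corresponding marker-ball centers as follows. The function name and array conventions are choices of the example.

```python
import numpy as np

def rigid_registration(image_points, robot_points):
    """Least-squares rigid registration: robot_point = R @ image_point + T.

    image_points, robot_points -- (N, 3) arrays of corresponding marker-ball
    centers in the image recognition frame and the robot base frame.
    Returns the 3x3 rotation matrix R and the translation vector T.
    """
    P = np.asarray(image_points, float)
    Q = np.asarray(robot_points, float)
    cp, cq = P.mean(axis=0), Q.mean(axis=0)      # centroids of the two point sets
    H = (P - cp).T @ (Q - cq)                    # 3x3 cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # guard against reflections
    R = Vt.T @ D @ U.T
    T = cq - R @ cp
    return R, T
```

With four marker balls, the four center-point pairs are sufficient to determine R and T; additional markers would simply be further rows in the two arrays.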
Specifically, the puncture needle insertion point P_inRobot and the puncture target point P_dRobot can be calculated according to the following formulas: P_inRobot = R * P_inImage + T; P_dRobot = R * P_dImage + T; where P_inImage is the initial puncture needle insertion point in the image recognition coordinate system and P_dImage is the initial puncture target point determined from the identified lesion. By converting the initial puncture needle insertion point P_inImage and the initial puncture target point P_dImage in the image recognition coordinate system into the puncture needle insertion point P_inRobot and the puncture target point P_dRobot in the base coordinate system of the parallel robot, the parallel robot can be controlled intelligently according to the image recognition result and the end effector driven precisely for accurate positioning/operation.
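Applying these formulas is a single matrix-vector operation; a short sketch reusing the R and T returned by the registration example above (point names follow the text, actual coordinate values are not given here):

```python
import numpy as np

def to_robot_frame(point_image, R, T):
    """P_robot = R * P_image + T, as in the formulas above."""
    return R @ np.asarray(point_image, float) + T

# p_in_robot = to_robot_frame(p_in_image, R, T)   # puncture needle insertion point
# p_d_robot  = to_robot_frame(p_d_image,  R, T)   # puncture target point
```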
As an example, the material of the positioning markers may include titanium, with a gray value in the DICOM-format image of 6000 HU to 8000 HU. For example, the positioning markers may be titanium spheres whose gray values in the DICOM-format image are approximately 6000 HU, 7000 HU or 8000 HU.
As an example, referring to fig. 8, the parallel robot system further includes a bed board 105 made of a second predetermined material. The bed board 105 carries the target object and fixes the parallel robot 101; it is fixed on the scanning bed surface of the imaging device so that the bed board 105 moves together with the scanning bed, and a medical image containing the target object, the parallel robot 101 fixed on the bed board 105 and the positioning markers can be acquired in a single scan. This image is then used for image recognition, and the parallel robot 101 is controlled according to the recognition result to move and drive the end effector precisely for accurate positioning/operation. In an embodiment of the present application, the bed board may be fixed on the CT scanning bed 106 of the CT apparatus, so that the CT apparatus simultaneously scans the target object together with the parallel robot 101 and the positioning markers fixed on the bed board 105, obtaining a medical image that contains all of them.
In some embodiments, the second predetermined material may include carbon fiber, polyester fiber, plastic or the like, so that the weight of the bed board is minimized while its structural strength is ensured, making it easier to move the bed board and the target object carried on it.
As an example, referring to fig. 9, an embodiment of the present application provides a control processing device 102, which includes an image processing unit 211 and a path planning unit 212. The image processing unit 211 is configured to receive the medical image of the target object and of the positioning markers arranged adjacent to the target object, reconstruct a three-dimensional model of the target object from the medical image, and identify the lesion, the positioning markers and other tissues from the three-dimensional model. The path planning unit 212 is configured to determine an initial target pose of the end effector disposed at the end of the robot according to the identified lesion, calculate the target pose of the end effector 403 in the base coordinate system according to the initial target pose and the transformation relationship between the image recognition coordinate system and the base coordinate system of the robot, and generate a target pose control command according to the target pose, so as to control the robot to move and drive the end effector 403 to the target pose indicated by the target pose control command for precise positioning/operation.
With continued reference to fig. 10, in some embodiments, the control processing device 102 is configured to control the puncture needle to move to a target position and a target direction. The path planning unit 212 is further configured to acquire an initial puncture needle insertion point and determine an initial puncture target point according to the identified lesion. The control processing device 102 further includes a positioning device 213, which is configured to calculate the puncture needle insertion point in the base coordinate system according to the initial puncture needle insertion point and the transformation relationship between the image recognition coordinate system and the base coordinate system of the parallel robot, to calculate the puncture target point in the base coordinate system according to the initial puncture target point and the same transformation relationship, and to generate a target pose control command according to the puncture needle insertion point and the puncture target point, so as to control the puncture needle to move to the target position and the target direction. In this way accurate puncture is achieved while avoiding the long learning curve, long operation time, susceptibility to misoperation and heavy dependence on the operator's experience and skill that characterize manual operation.
With continued reference to fig. 10, in some embodiments, the other tissues include blood vessels and/or bone. The path planning unit 212 is further configured to: after the initial puncture target point is determined and before the puncture needle insertion point and the puncture target point are calculated, judge whether the initial puncture path passing through the initial puncture needle insertion point and the initial puncture target point passes through a blood vessel and/or bone; if it does not, calculate the puncture needle insertion point and the puncture target point; otherwise, acquire a new initial puncture needle insertion point. This embodiment effectively prevents the determined initial puncture path from passing through a blood vessel and/or bone, thereby effectively reducing the damage caused by the puncture to the target object.
With continued reference to fig. 10, in some embodiments, the image processing unit 211 is further configured to perform at least one of the following steps: segmenting the lesion from the three-dimensional model by a region growing method; segmenting the positioning markers 4010 from the three-dimensional model by a threshold segmentation method; and segmenting other tissues from the three-dimensional model by an image interaction method. This embodiment effectively identifies at least one of the lesion, the positioning markers 4010 and other tissues in the three-dimensional model reconstructed from the medical image of the target object, so that the parallel robot 101 can be controlled to treat the target object intelligently and accurately, improving the intelligence of the medical operation and avoiding the long learning curve, long operation time, susceptibility to misoperation and heavy dependence on the operator's experience and skill that characterize manual operation.
Specifically, the image processing unit 211 may import DICOM-format CT images from a Picture Archiving and Communication System (PACS) over a network, acquire CT images of the patient and of the end effector of the parallel robot via the imaging device, parse the DICOM-format CT images to obtain at least one of the direction, position, layer thickness, layer spacing, pixel spacing and pixel values of each image, reconstruct the axial, sagittal and coronal views of each image, and then reconstruct a three-dimensional model of the patient and of the robot end effector by volume rendering. The intercostal blood vessels are segmented by a seed-point region growing method, the positioning marker balls are segmented by threshold segmentation, the bones and lung parenchyma are rendered by volume rendering, and the nodules are segmented by an image interaction method.
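The disclosure does not specify a DICOM toolkit; as an assumed illustration using the open-source pydicom library, the per-image attributes named above (position, layer thickness, pixel spacing, pixel values) can be read and converted to Hounsfield units roughly as follows. The folder layout and the assumption of a single axial series with standard rescale tags are simplifications of the example.

```python
import glob
import numpy as np
import pydicom

def load_ct_series(folder):
    """Load a DICOM CT series into a Hounsfield-unit volume plus basic geometry.

    Returns (volume_hu, pixel_spacing, slice_thickness).
    """
    slices = [pydicom.dcmread(f) for f in glob.glob(folder + "/*.dcm")]
    slices.sort(key=lambda s: float(s.ImagePositionPatient[2]))  # order along the patient axis
    volume = np.stack([s.pixel_array.astype(np.int16) for s in slices])
    slope = float(slices[0].RescaleSlope)
    intercept = float(slices[0].RescaleIntercept)
    volume_hu = volume * slope + intercept                        # convert raw values to HU
    spacing = [float(v) for v in slices[0].PixelSpacing]          # in-plane pixel spacing (mm)
    thickness = float(slices[0].SliceThickness)                   # layer thickness (mm)
    return volume_hu, spacing, thickness
```

The resulting Hounsfield-unit volume is what the segmentation sketch above would operate on.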
As an example, referring to fig. 11, an embodiment of the present application provides a motion detection apparatus 103, which includes a plurality of sensors and a position tracking device 601. The sensors are configured to be disposed at detection positions on the target object to detect position data of those detection positions; the position tracking device 601 is connected to the sensors and is configured to track the position change of the detection positions according to the received position data and to generate action prompt information according to the position change, so as to indicate when a preset action may be performed on the target object. This avoids inaccurate or failed positioning or operation caused by movement of the detection positions of the target object. Compared with traditional puncture surgery assisted by respiratory gating equipment, no additional auxiliary equipment is needed, and lesion displacement caused by the respiration of the target object can be handled effectively.
By way of example, with continued reference to fig. 11, the position tracking device 601 is configured to: filter and/or amplify the time sequence of the position data to obtain preprocessed position data; fit the preprocessed position data to obtain a detection position change curve; acquire the amount of change over a preset time on the detection position change curve; and, if the amount of change is smaller than or equal to a preset displacement threshold, generate action prompt information to prompt the operator to perform the preset operation. This avoids inaccurate or failed positioning or operation caused by movement of the detection positions of the target object and improves the intelligence and safety of the operation.
As an example, referring to fig. 12-13, the preset action may include a puncture action, and the detection positions may be located in at least one of: the skin region directly above the upper lobe of the right lung of the target object; the skin region directly above the lower lobe of the right lung of the target object; and the skin region directly above the middle of the left lung of the target object. With continued reference to fig. 11, the sensors may include a first sensor 602, a second sensor 603 and a third sensor 604, all connected to the position tracking device 601. The first sensor 602 is disposed in the skin region directly above the upper lobe of the right lung, the second sensor 603 in the skin region directly above the lower lobe of the right lung, and the third sensor 604 in the skin region directly above the middle of the left lung. The position tracking device 601 is further configured to generate, according to the position changes of the first sensor 602, the second sensor 603 and the third sensor 604, action prompt information indicating that the puncture action may be performed on the lung of the target object. The first sensor 602, the second sensor 603 and the third sensor 604 rise and fall with the breathing motion of the target object; the position tracking device 601 collects their position changes in real time and, after simple denoising, filtering and amplification, fits a respiratory motion curve. During lung CT scanning, CT images of the patient in the breath-hold state are collected; after positioning is completed, the respiratory motion curve is checked and a breath-hold moment is selected for puncture. For example, the average value P0 of the position data of the first sensor 602, the second sensor 603 and the third sensor 604 is recorded while the patient holds his breath during CT imaging; during the puncture operation, the average value Pt of the position data of the three sensors at time t is recorded; a preset displacement threshold m is set and S = Pt - P0 is computed; if S < m, the moment is determined to be suitable for needle insertion. This embodiment accurately detects the lung motion of the target object and performs the puncture action during breath hold, avoiding inaccurate or failed positioning or operation caused by respiratory motion of the lung and improving the intelligence and safety of the puncture operation.
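A minimal sketch of this breath-hold gating rule is given below, assuming the sensor readings are available as a time series; the moving-average smoothing and the use of an absolute difference are assumptions of the example rather than requirements of the text.

```python
import numpy as np

def is_needle_insertion_time(sensor_positions, p0, threshold_m, window=5):
    """Decide whether the current moment is suitable for needle insertion.

    sensor_positions -- recent samples, shape (T, n_sensors), one column per sensor
    p0               -- per-sensor reference values recorded at breath-hold CT imaging
    threshold_m      -- preset displacement threshold m
    A simple moving average over the last `window` samples stands in for the
    denoising/filtering step described in the text.
    """
    recent = np.asarray(sensor_positions, float)[-window:]
    pt = recent.mean(axis=0)                    # filtered current reading of each sensor
    s = np.abs(np.mean(pt) - np.mean(p0))       # S compares the sensor-averaged readings
    return s <= threshold_m
```

In use, the position tracking device would evaluate this check continuously during the puncture operation and generate the action prompt information when it returns True.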
Referring to fig. 14, an embodiment of the present application provides a medical system, which includes a robot 100, an imaging device 200 and a control processing device 102. A plurality of positioning markers made of a first predetermined material are disposed on the end effector of the robot 100; the imaging device is used for scanning and imaging the target object and the positioning markers arranged adjacent to the target object to acquire a medical image; and the control processing device 102 is communicatively interconnected with the imaging device 200 and the robot 100 and is configured to: receive the medical image of the target object and of the positioning markers arranged adjacent to the target object; reconstruct a three-dimensional model of the target object from the medical image; identify the lesion, the positioning markers and other tissues from the three-dimensional model; determine an initial target pose of the end effector according to the identified lesion; calculate the target pose of the end effector in the base coordinate system according to the initial target pose and the transformation relationship between the image recognition coordinate system and the base coordinate system of the robot; and generate a target pose control command according to the target pose, so as to control the robot 100 to move and drive the end effector to the target pose indicated by the target pose control command. Compared with traditional medical navigation based on electromagnetic or optical tracking, which requires more complicated operations such as spatial transformation and registration, this embodiment is simple and convenient, low in implementation cost, occupies little space, and avoids the uncertainty introduced by manual participation.
By way of example, referring to fig. 3 and 15, the medical system further includes a needle holder 409 disposed on the end effector 403, the needle holder 409 being used for fixing the puncture needle. The control processing device 102 is further configured to: acquire an initial puncture needle insertion point, and determine an initial puncture target point according to the identified lesion; calculate the puncture needle insertion point in the base coordinate system according to the initial puncture needle insertion point and the transformation relationship, and calculate the puncture target point in the base coordinate system according to the initial puncture target point and the transformation relationship; and generate a target pose control command according to the puncture needle insertion point and the puncture target point, so as to control the robot to move and drive the puncture needle to the target position and the target direction indicated by the target pose control command. Accurate puncture is thus achieved while avoiding the long learning curve, long operation time, susceptibility to misoperation and heavy dependence on the operator's experience and skill that characterize manual puncture. Compared with traditional puncture navigation based on electromagnetic or optical tracking, which requires more complicated operations such as spatial transformation and registration, this embodiment is simple and convenient, low in implementation cost, occupies little space, and avoids the uncertainty introduced by manual participation.
As an example, the imaging device may specifically include a Computed Tomography (CT) device, an X-ray device, a Magnetic Resonance (MR) device, a Positron Emission Tomography (PET) device, an ultrasound imaging device, a multi-modality fusion imaging device, and the like.
With continued reference to figs. 3, 5 and 15, an embodiment of the present application provides a medical system comprising the imaging device, the control processing device 102, the robot and the motion detection device 103 of any embodiment of the present application. The control processing device 102 may be placed on a trolley that includes a display and a computer host. The robot exchanges data with the control processing device 102, and the control processing device 102 exchanges data with the motion detection device 103; that is, the control processing device 102 is communicatively interconnected with the imaging device, the robot and the motion detection device 103, and is configured to: receive the medical image of the target object and of the positioning markers 4010 arranged adjacent to the target object; reconstruct a three-dimensional model of the target object from the medical image; identify the lesion, the positioning markers 4010 and other tissues from the three-dimensional model; determine an initial target pose of the end effector 403 according to the identified lesion; calculate the target pose of the end effector 403 in the base coordinate system according to the initial target pose and the transformation relationship between the image recognition coordinate system and the base coordinate system of the robot; generate a target pose control command according to the target pose; and control the robot to move according to the target pose control command and the action prompt information, driving the end effector 403 to the target pose indicated by the target pose control command.
In an embodiment of the present application, a storage medium having a computer program stored thereon is provided; when executed by a processor, the computer program performs the following steps: receiving a medical image of a target object and of a positioning marker arranged adjacent to the target object; reconstructing a three-dimensional model of the target object from the medical image; identifying the lesion, the positioning marker and other tissues from the three-dimensional model; determining an initial target pose of the end effector of the robot according to the identified lesion; calculating the target pose of the end effector in the base coordinate system according to the initial target pose and the transformation relationship between the image recognition coordinate system and the base coordinate system of the robot; and generating a target pose control command according to the target pose, so as to control the robot to drive the end effector to the target pose indicated by the target pose control command, thereby intelligently controlling the robot for accurate positioning/operation according to the image recognition result.
The technical solutions in the embodiments of the present application can be applied to preoperative, intraoperative and postoperative positioning, as well as to procedures such as lung nodule puncture, spinal puncture and craniocerebral puncture.
It should be noted that the above-mentioned embodiments are for illustrative purposes only and are not meant to limit the present invention. The embodiments herein may be referred to and combined with one another.
The technical features of the embodiments described above may be combined arbitrarily. For the sake of brevity, not all possible combinations of these technical features are described; nevertheless, as long as a combination of technical features contains no contradiction, it should be considered within the scope of this specification.
The above-mentioned embodiments express only several implementations of the present invention, and their description is relatively specific and detailed, but they should not be construed as limiting the scope of the invention. It should be noted that a person skilled in the art can make several variations and modifications without departing from the inventive concept, and these fall within the protection scope of the present invention. Therefore, the protection scope of this patent shall be subject to the appended claims.

Claims (21)

1. A parallel robot is characterized by comprising an end effector, a first plane moving mechanism and a second plane moving mechanism, wherein moving planes of the first plane moving mechanism and the second plane moving mechanism are parallel to each other;
the first plane moving mechanism and the second plane moving mechanism are provided with at least two degrees of freedom of plane movement;
the moving far end of the first plane moving mechanism is connected with a first rotary motion pair structure of the end effector;
the moving far end of the second plane moving mechanism is connected with a second rotary motion pair structure of the end effector;
and the pose of the end effector is changed by controlling the first plane moving mechanism to act in cooperation with the second plane moving mechanism.
2. The parallel robot of claim 1, wherein:
the first plane moving mechanism comprises a first driving device and a second driving device which are connected through a first five-link mechanism;
the second plane moving mechanism comprises a third driving device and a fourth driving device which are connected through a second five-bar linkage;
the first driving device, the second driving device, the third driving device and the fourth driving device are all arranged on a rack.
3. The parallel robot of claim 2, further comprising:
the needle holder is arranged on the surface of the end effector, which is far away from the first plane moving mechanism and the second plane moving mechanism, and is used for fixing the puncture needle;
and the position and the extending direction of the puncture needle are changed by controlling the first plane moving mechanism to cooperate with the second plane moving mechanism to act.
4. The parallel robot of claim 3, further comprising:
the first driving device is connected with the first five-bar linkage through the first motor;
the second driving device is connected with the first five-bar linkage through the second motor;
a third motor, via which the third driving device is connected with the second five-bar linkage;
a fourth motor, the fourth drive device being connected to the second five-bar linkage via the fourth motor;
a controller, coupled to the first, second, third, and fourth driving devices, configured to:
acquiring a target pose control command, wherein the target pose control command is used for indicating the target position and the target direction of the puncture needle;
analyzing the target pose control command according to a first preset algorithm to obtain a first angle driving command, a second angle driving command, a third angle driving command and a fourth angle driving command, so that the first driving device drives the first motor to rotate by a first angle based on the first angle driving command, the second driving device drives the second motor to rotate by a second angle based on the second angle driving command, the third driving device drives the third motor to rotate by a third angle based on the third angle driving command, and the fourth driving device drives the fourth motor to rotate by a fourth angle based on the fourth angle driving command to drive the end puncture needle to move to the target position and the target direction.
5. A parallel robotic system, comprising:
the parallel robot of any one of claims 1-4, wherein the outer surface of the end effector is provided with a plurality of positioning markers of a first predetermined material; and
a control processing device communicatively interconnected with the parallel robot and configured to:
receiving a medical image of a target object and the positioning marker disposed adjacent to the target object;
reconstructing a three-dimensional model of the target object from the medical image;
identifying a lesion, the localization marker, and other tissues from the three-dimensional model;
determining an initial target pose of the end effector according to the identified lesion;
calculating the target pose of the end effector in a base coordinate system according to the initial target pose, the transformation relation of an image recognition coordinate system and the base coordinate system of the parallel robot;
and generating the target pose control command according to the target pose.
6. The parallel robotic system of claim 5, further comprising a needle holder disposed at the end effector, the needle holder configured to secure a puncture needle; the control processing apparatus is further configured to:
acquiring an initial puncture needle inserting point, and determining an initial puncture target point according to the identified focus;
calculating the puncture needle point of the base coordinate system according to the initial puncture needle point and the transformation relation, and calculating the puncture target point of the base coordinate system according to the initial puncture target point and the transformation relation;
and generating the target pose control command according to the puncture needle inserting point and the puncture target point so as to control the puncture needle to move to a target position and a target direction.
7. The parallel robotic system as claimed in claim 6 wherein the other tissue includes blood vessels and/or bones; the control processing apparatus is further configured to:
after determining the initial puncture target point and before calculating the puncture needle point and the puncture target point, judging whether an initial puncture path passing through the initial puncture needle point and the initial puncture target point passes through a blood vessel and/or a bone;
if it does not pass through a blood vessel and/or bone, calculating the puncture needle point and the puncture target point;
otherwise, the initial puncture needle inserting point is obtained again.
8. The parallel robotic system as claimed in claim 7 wherein the transformation relationship comprises a rotation matrix R and a translation matrix T; the step of calculating the transformation relationship comprises:
determining a first coordinate of the central point of the identified positioning marker in an image identification coordinate system;
and calculating the rotation matrix R and the translation matrix T through rigid body registration according to the first coordinate and a pre-acquired second coordinate of the central point in the base coordinate system.
9. The parallel robotic system of claim 8, wherein said puncture needle point P_inRobot and said puncture target point P_dRobot are calculated according to the following formulas:
P_inRobot = R*P_inImage + T;
P_dRobot = R*P_dImage + T;
wherein P_inImage is the initial puncture needle point in said image recognition coordinate system, and P_dImage is an initial puncture target point determined based on the identified lesion.
10. The parallel robotic system of any one of claims 7-9, wherein said step of judging whether the initial puncture path via said initial puncture needle point and said initial puncture target point passes through a blood vessel and/or bone comprises:
and performing simulated collision detection on the linear initial puncture path to judge whether the linear initial puncture path passes through a blood vessel and/or a bone.
11. The parallel robotic system of any one of claims 5-9, wherein said identifying a lesion, said localization marker, and other tissues from said three-dimensional model comprises at least one of:
segmenting a focus from the three-dimensional model by a region growing method;
segmenting the positioning markers from the three-dimensional model by a threshold segmentation method;
and segmenting other tissues from the three-dimensional model by an image interaction method.
12. The parallel robot system of any of claims 5-9, further comprising:
and the bed plate made of a second preset material is used for bearing the target object and fixing the parallel robot.
13. A control processing apparatus characterized by comprising:
the image processing unit is used for receiving a target object and a medical image of a positioning marker arranged close to the target object, reconstructing a three-dimensional model of the target object according to the medical image, and identifying a focus, the positioning marker and other tissues according to the three-dimensional model;
and the path planning unit is used for determining an initial target pose of an end effector arranged at the tail end of the robot according to the identified focus, calculating a target pose of the end effector in a base coordinate system according to the initial target pose, a transformation relation of an image identification coordinate system and the base coordinate system of the robot, and generating the target pose control command according to the target pose.
14. The control processing device of claim 13, configured to control the movement of the puncture needle to a target position and a target direction; the path planning unit is also used for acquiring an initial puncture needle inserting point and determining an initial puncture target point according to the identified focus; the device further comprises:
the positioning device is used for calculating the puncture needle inserting point of the base coordinate system according to the initial puncture needle inserting point and the transformation relation and calculating the puncture target point of the base coordinate system according to the initial puncture target point and the transformation relation; and generating the target pose control command according to the puncture needle inserting point and the puncture target point so as to control the puncture needle to move to a target position and a target direction.
15. The control processing device according to claim 14, wherein the other tissue comprises blood vessels and/or bone; the path planning unit is further configured to:
after determining the initial puncture target point and before calculating the puncture needle point and the puncture target point, judging whether an initial puncture path passing through the initial puncture needle point and the initial puncture target point passes through a blood vessel and/or a bone;
if it does not pass through a blood vessel and/or bone, calculating the puncture needle point and the puncture target point;
otherwise, the initial puncture needle inserting point is obtained again.
16. The control processing apparatus according to any one of claims 13 to 15, wherein the image processing unit is further configured to perform at least one of:
segmenting a focus from the three-dimensional model by a region growing method;
segmenting the positioning markers from the three-dimensional model by a threshold segmentation method;
and segmenting other tissues from the three-dimensional model by an image interaction method.
17. A motion detection device, comprising:
a plurality of sensors, configured to be disposed at detection positions of a target object to detect position data of the detection positions;
and the position tracking equipment is connected with the sensor and used for tracking the position change of the detection position according to the received position data and generating action prompt information according to the position change so as to indicate to execute a preset action on a target object.
18. The motion detection apparatus of claim 17, wherein the location tracking device is configured to:
filtering and/or amplifying the time sequence of the position data to obtain preprocessed position data;
fitting the preprocessed position data to obtain a detection position change curve;
acquiring the variation of preset time on the detection position variation curve;
and if the variable quantity is smaller than or equal to a preset displacement threshold, generating the action prompt message.
19. The motion detection apparatus according to claim 17 or 18, wherein the preset action comprises a puncturing action; the detection position is located in at least one of:
a skin region directly above the upper right lung lobe of the target subject;
a skin region directly above the lower right lobe of the lung of the target subject;
a region of skin directly above the middle of the left lung of the target subject;
the location tracking device is further configured to: and generating action prompt information according to the position change so as to instruct to execute a puncture action on the lung of the target object.
20. A medical system, comprising:
the robot comprises a robot, wherein a plurality of positioning markers made of a first preset material are arranged on an end effector of the robot;
the imaging device is used for scanning and imaging a target object and the positioning marker arranged close to the target object so as to acquire a medical image;
the motion detection device of any one of claims 17 to 19, configured to detect a position change of a portion of the target object to be operated on, and to generate action prompt information according to the position change;
a control processing device communicatively interconnected with the imaging device, the motion detection device, and the robot, configured to:
receiving a medical image of a target object and the positioning marker disposed adjacent to the target object;
reconstructing a three-dimensional model of the target object from the medical image;
identifying a lesion, the localization marker, and other tissues from the three-dimensional model;
determining an initial target pose of the end effector according to the identified lesion;
calculating the target pose of the end effector in a base coordinate system according to the initial target pose, the transformation relation of an image recognition coordinate system and the base coordinate system of the robot;
generating a target pose control command according to the target pose;
and controlling the robot to act according to the target pose control command and the action prompt information, and driving the end effector to move to the target pose indicated by the target pose control command.
21. A storage medium having a computer program stored thereon; wherein the computer program when executed by the processor performs the steps of:
receiving a medical image of a target object and a positioning marker disposed adjacent to the target object;
reconstructing a three-dimensional model of the target object from the medical image;
identifying a lesion, the localization marker, and other tissues from the three-dimensional model;
determining an initial target pose of an end effector of the robot according to the identified focus;
calculating the target pose of the end effector in a base coordinate system according to the initial target pose, the transformation relation of an image recognition coordinate system and the base coordinate system of the robot;
and generating a target pose control command according to the target pose so as to control the robot to drive the end effector to move to the target pose indicated by the target pose control command.
CN202210113699.7A 2022-01-30 2022-01-30 Parallel robot, system, device and storage medium Pending CN114515193A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210113699.7A CN114515193A (en) 2022-01-30 2022-01-30 Parallel robot, system, device and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210113699.7A CN114515193A (en) 2022-01-30 2022-01-30 Parallel robot, system, device and storage medium

Publications (1)

Publication Number Publication Date
CN114515193A true CN114515193A (en) 2022-05-20

Family

ID=81597664

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210113699.7A Pending CN114515193A (en) 2022-01-30 2022-01-30 Parallel robot, system, device and storage medium

Country Status (1)

Country Link
CN (1) CN114515193A (en)

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106859768A (en) * 2015-12-11 2017-06-20 上海工程技术大学 For the decoupling four-degree-of-freedom telecentricity mechanism of abdominal-cavity minimal-invasion surgery
CN105796161A (en) * 2016-03-02 2016-07-27 赛诺威盛科技(北京)有限公司 Method for conducting puncture navigation in CT interventional therapy and puncture navigation device
CN105662587A (en) * 2016-04-18 2016-06-15 山东科技大学 Orthopaedics operation robot
CN107126258A (en) * 2017-06-29 2017-09-05 哈尔滨理工大学 A kind of paths planning method of Minimally Invasive Surgery sleeve pipe flexible needle
CN107274428A (en) * 2017-08-03 2017-10-20 汕头市超声仪器研究所有限公司 Multi-target three-dimensional ultrasonic image partition method based on emulation and measured data
CN110946653A (en) * 2018-12-29 2020-04-03 华科精准(北京)医疗科技有限公司 Operation navigation system
CN112568996A (en) * 2019-09-30 2021-03-30 格罗伯斯医疗有限公司 Surgical system
CN110755136A (en) * 2019-10-10 2020-02-07 中国科学院合肥肿瘤医院 Puncture method

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115607297A (en) * 2022-10-19 2023-01-17 山东大学 Tremor-suppression master-slave surgical robot control system and method
CN115607297B (en) * 2022-10-19 2024-04-30 山东大学 Master-slave operation robot control system and method for tremor suppression
CN116269783A (en) * 2023-03-28 2023-06-23 北京维卓致远医疗科技发展有限责任公司 Guide frame and surgical robot
CN116269783B (en) * 2023-03-28 2023-12-19 北京维卓致远医疗科技发展有限责任公司 Guide frame and surgical robot

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination