CN112618024A - Multi-arm cooperative surgical robot - Google Patents

Multi-arm cooperative surgical robot

Info

Publication number
CN112618024A
CN112618024A (application CN202110008685.4A)
Authority
CN
China
Prior art keywords
arm
position information
robot
information
instrument head
Prior art date
Legal status
Pending
Application number
CN202110008685.4A
Other languages
Chinese (zh)
Inventor
吴皓
贾欢
Current Assignee
Ninth People's Hospital, Shanghai Jiao Tong University School of Medicine
Original Assignee
Ninth People's Hospital, Shanghai Jiao Tong University School of Medicine
Priority date
Filing date
Publication date
Application filed by Ninth People's Hospital, Shanghai Jiao Tong University School of Medicine
Priority to CN202110008685.4A
Publication of CN112618024A

Classifications

    All classifications fall under A (Human Necessities) > A61 (Medical or Veterinary Science; Hygiene) > A61B (Diagnosis; Surgery; Identification):
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
        • A61B 34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
            • A61B 2034/2046 Tracking techniques
            • A61B 2034/2055 Optical tracking systems
        • A61B 34/30 Surgical robots
        • A61B 34/70 Manipulators specially adapted for use in surgery
            • A61B 34/76 Manipulators having means for providing feel, e.g. force or tactile feedback
    • A61B 90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
        • A61B 90/36 Image-producing devices or illumination devices not otherwise provided for
            • A61B 90/361 Image-producing devices, e.g. surgical cameras
            • A61B 90/37 Surgical systems with images on a monitor during operation
                • A61B 2090/371 Surgical systems with images on a monitor during operation with simultaneous use of two cameras

Abstract

The present application provides a multi-arm cooperative surgical robot comprising a control module and a multi-arm coordination module. The multi-arm coordination module, connected to the control module, comprises a plurality of mechanical arms and a spatial position acquisition module connected to the mechanical arms. The spatial position acquisition module acquires one or more of: position information for each part of the mechanical arms, spatial position and/or type information of the instrument heads mounted on the arms, and touch sensing information for each arm, and feeds this information back to the control module in real time so that the control module can calculate multi-arm motion paths and/or adjacency relations and thereby prevent the arms from colliding or jamming. The application addresses shortcomings of the prior art: it enables robot-assisted surgery under multi-arm cooperation, better matches actual surgical requirements, and can identify the type and position of the instrument head at the tip of each arm so that the instrument arms avoid collisions and conflicts, improving surgical efficiency and safety.

Description

Multi-arm cooperative surgical robot
Technical Field
The application relates to the technical field of medical instruments, in particular to a multi-arm cooperative surgical robot.
Background
Owing to their accuracy and safety, surgical robots offer a new option for many types of surgery. As the related technology has matured, the frequency of robot use in clinical operations has risen year by year. At present, surgical robots designed for otolaryngological (ear, nose, and throat) procedures are uncommon on the market, and most existing designs are single-armed, which cannot meet the high-precision, high-difficulty requirements of such procedures. When multiple arms are used instead, the instrument arms are prone to colliding and interfering with one another during operation, which disrupts the surgical workflow and reduces both efficiency and safety.
Summary of the Application
In view of the above shortcomings of the prior art, the present application aims to provide a multi-arm cooperative surgical robot that solves the prior-art problem that, when multiple arms are used, the instrument arms are prone to collision and conflict during operation, disrupting the surgical workflow and reducing both efficiency and safety.
To achieve the above and other related objects, the present application provides a multi-arm cooperative surgical robot comprising a control module and a multi-arm coordination module. The multi-arm coordination module, connected to the control module, comprises a plurality of mechanical arms and a spatial position acquisition module connected to the mechanical arms. The spatial position acquisition module acquires one or more of: position information for each part of the mechanical arms, spatial position and/or type information of the instrument heads mounted on the arms, and touch sensing information for each arm, and feeds this information back to the control module in real time so that the control module can calculate multi-arm motion paths and/or adjacency relations and prevent the arms from colliding or jamming.
In an embodiment of the present application, the spatial position acquisition module includes one or more of: a mechanical arm position identification unit, an instrument head identification unit, and a touch sensing unit. The mechanical arm position identification unit monitors and calculates position information for each part of the mechanical arms; the instrument head identification unit obtains type and/or position information for the instrument heads; and the touch sensing unit obtains touch sensing information for each mechanical arm so as to sense its contact state.
In an embodiment of the present application, the instrument head includes: an interface for mounting on the robotic arm; wherein each instrument head corresponds to one type of interface.
In an embodiment of the application, the instrument head recognition unit obtains type information and/or position information of an instrument head mounted on the robot arm through the interface.
In an embodiment of the present application, the mechanical arm position identification unit includes a micro-electromechanical (MEMS) gyroscope for calculating the position information of each part of the mechanical arm.
In an embodiment of the present application, the mechanical arm position identification unit includes an optical navigation probe and/or a pressure sensor for monitoring the positions of all parts of the mechanical arm.
In an embodiment of the present application, the spatial position acquiring module further includes: one or more communication interfaces for communicating with external devices and/or the control module.
In an embodiment of the present application, the communication interface includes: one or more of a USB interface, an HDMI interface, a VGA interface and a Bluetooth interface.
In an embodiment of the present application, the touch sensing unit includes a touch sensing device.
In one embodiment of the present application, the robot arm includes: a lens arm and one or more robot arms.
As described above, the multi-arm cooperative surgical robot of the present application has the following advantageous effects: it overcomes the shortcomings of the prior art and enables robot-assisted surgery under multi-arm cooperation. Compared with a traditional single-arm ear-nose-throat surgical robot, it is more flexible and better matches actual surgical requirements; it can identify the type and position of the instrument head at each arm's executing tip and calculate multi-arm motion paths and/or adjacency relations so that the instrument arms avoid collisions and conflicts, improving surgical efficiency and safety.
Drawings
Fig. 1 is a schematic structural diagram of a multi-arm cooperative surgical robot according to an embodiment of the present application.
Fig. 2 is a schematic structural diagram of a spatial position acquisition module according to an embodiment of the present application.
Fig. 3 is a schematic structural diagram of a multi-arm cooperative surgical robot according to an embodiment of the present application.
Detailed Description
The following description of the embodiments of the present application is provided by way of specific examples, and other advantages and effects of the present application will be readily apparent to those skilled in the art from the disclosure herein. The present application is capable of other and different embodiments and its several details are capable of modifications and/or changes in various respects, all without departing from the spirit of the present application. It is to be noted that the features in the following embodiments and examples may be combined with each other without conflict.
It is noted that in the following description, reference is made to the accompanying drawings, which illustrate several embodiments of the present application. It is to be understood that other embodiments may be utilized and that mechanical, structural, electrical, and operational changes may be made without departing from the spirit and scope of the present application. The following detailed description is not to be taken in a limiting sense, and the scope of embodiments of the present application is defined only by the claims of the issued patent. The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. Spatially relative terms, such as "upper," "lower," "left," "right," "below," "over," and the like, may be used herein to facilitate describing one element or feature's relationship to another element or feature as illustrated in the figures.
Throughout the specification, when a part is referred to as being "connected" to another part, this includes not only a case of being "directly connected" but also a case of being "indirectly connected" with another element interposed therebetween. In addition, when a certain part is referred to as "including" a certain component, unless otherwise stated, other components are not excluded, but it means that other components may be included.
The terms first, second, third, etc. are used herein to describe various elements, components, regions, layers and/or sections, but are not limited thereto. These terms are only used to distinguish one element, component, region, layer or section from another element, component, region, layer or section. Thus, a first element, component, region, layer or section discussed below could be termed a second element, component, region, layer or section without departing from the scope of the present application.
Also, as used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context indicates otherwise. It will be further understood that the terms "comprises" and/or "comprising", when used in this specification, specify the presence of stated features, operations, elements, components, items, species, and/or groups, but do not preclude the presence or addition of one or more other features, operations, elements, components, items, species, and/or groups thereof. The terms "or" and "and/or" as used herein are to be construed as inclusive, meaning any one or any combination. Thus, "A, B or C" or "A, B and/or C" means any of the following: A; B; C; A and B; A and C; B and C; A, B and C. An exception to this definition occurs only when a combination of elements, functions, or operations is inherently mutually exclusive in some way.
The present application provides a multi-arm cooperative surgical robot that solves the prior-art problem that, when multiple arms are used, the instrument arms easily collide and conflict during operation, disrupting the surgical workflow and reducing efficiency and safety. It enables robot-assisted surgery under multi-arm cooperation; compared with a traditional single-arm ear-nose-throat surgical robot it is more flexible and better matches actual surgical requirements; and it identifies the type and position of the instrument head at each arm's executing tip and calculates multi-arm motion paths and/or adjacency relations so that the instrument arms avoid collisions and conflicts, improving surgical efficiency and safety.
The following detailed description of the embodiments of the present application is made with reference to fig. 1 so that those skilled in the art can easily implement the embodiments described herein. The present application may be embodied in many different forms and is not limited to the embodiments described herein.
As shown in fig. 1, there is shown a schematic structural diagram of a multi-arm cooperative surgical robot in one embodiment, the surgical robot comprising:
a control module 11 and a multi-arm coordination module 12;
The multi-arm coordination module 12, connected to the control module 11, comprises a plurality of mechanical arms 121 and a spatial position acquisition module 122 connected to the mechanical arms 121. The spatial position acquisition module 122 acquires one or more of: position information for each part of the mechanical arms 121, spatial position and/or type information of the instrument heads 123 mounted on the arms, and touch sensing information for each arm, and feeds this back to the control module 11 in real time so that the control module 11 can calculate multi-arm motion paths and/or adjacency relations to prevent the arms from colliding or jamming.
Optionally, the control module 11 is a mobile integrated host, and is controlled by a doctor.
Optionally, the control module 11 includes a calculation unit for computing the multi-arm motion path and/or the adjacency relation from one or more of the position information of each part of the mechanical arms 121, the spatial position and/or type information of the instrument heads 123, and the touch sensing information of each arm, so as to prevent the arms from colliding or jamming.
Optionally, the multi-arm motion path constrains each mechanical arm to move along a planned path, so that collisions and jamming between the mechanical arms are avoided.
Optionally, the adjacency relation includes the relative positions of the mechanical arms and the distances between them.
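As a rough illustration of the adjacency relation and collision check described above, the sketch below models each arm as a list of straight line segments and flags arm pairs whose clearance falls below a threshold. The segment representation, the 5 cm threshold, and the sampled distance approximation are illustrative assumptions, not the patent's actual algorithm:

```python
import numpy as np

def segment_distance(p1, p2, q1, q2):
    """Approximate minimum distance between 3-D line segments p1-p2 and q1-q2."""
    # Sample-based approximation: evaluate 50 points along each segment.
    t = np.linspace(0.0, 1.0, 50)
    a = p1[None, :] + t[:, None] * (p2 - p1)   # points along first segment
    b = q1[None, :] + t[:, None] * (q2 - q1)   # points along second segment
    d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=-1)
    return float(d.min())

def adjacency(arms, clearance=0.05):
    """Return (i, j, distance, too_close) for every pair of arms.

    arms: list of arms, each arm a list of (start, end) link segments.
    clearance: assumed safety margin in metres (hypothetical value).
    """
    report = []
    for i in range(len(arms)):
        for j in range(i + 1, len(arms)):
            d = min(segment_distance(*link_a, *link_b)
                    for link_a in arms[i] for link_b in arms[j])
            report.append((i, j, d, bool(d < clearance)))
    return report
```

In a real system the control module would run such a check on every position update from the spatial position acquisition module and re-plan paths for flagged pairs.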
Optionally, the spatial position acquiring module 122 includes: one or more of a mechanical arm position identification unit, an instrument head identification unit and a touch sensing unit;
the mechanical arm position identification unit is used for monitoring and calculating position information of each part of the mechanical arm; the instrument head identification unit is used for obtaining type information and/or position information of the instrument head; the touch sensing unit is used for obtaining touch sensing information of each mechanical arm so as to sense the touch condition of each mechanical arm.
Specifically, the mechanical arm position identification unit monitors the position of each mechanical arm in real time and calculates the position information of each of its parts; the instrument head identification unit detects the type and/or position information of the instrument head mounted on each mechanical arm; and the touch sensing unit acquires collision information for each part of each mechanical arm, such as whether a collision has occurred and how strong it was.
It should be noted that the mechanical arm position identification unit may be any device that monitors the position of each mechanical arm in real time, the instrument head identification unit may be any device that detects the type and/or position information of an instrument head mounted on a mechanical arm, and the touch sensing unit may be any device that acquires such collision information; the present application does not limit the choice of these devices.
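As a minimal sketch of the touch sensing unit's role, the function below flags arm parts whose contact pressure exceeds a threshold. The part names, units, and threshold value are hypothetical examples, not values from the patent:

```python
def collision_report(pressures, threshold=2.0):
    """Classify per-part contact readings into collision flags.

    pressures: {part_name: pressure reading in N} (hypothetical units).
    threshold: assumed example value above which contact counts as a collision.
    Returns {part_name: (collided, strength)}.
    """
    return {part: (p > threshold, p) for part, p in pressures.items()}
```

The control module could merge such a report with the position data when deciding whether an arm must stop or retract.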
Optionally, as shown in fig. 2, the spatial position acquiring module includes: one or more of a robot arm position recognition unit 21, an instrument head recognition unit 22, and a touch sensing unit 23; the mechanical arm position recognition unit 21 is connected to the instrument head recognition unit 22, and the instrument head recognition unit 22 is connected to the touch sensing unit 23.
Optionally, the mechanical arm position recognition unit 21 includes a micro-electromechanical (MEMS) gyroscope for calculating the position information of each part of the mechanical arm. The MEMS gyroscope may be a single-axis or a dual-axis device.
Optionally, the mechanical arm position recognition unit 21 includes an optical navigation probe and/or a pressure sensor: the optical navigation probe optically measures the position of each part of the mechanical arm, while the pressure sensor derives those positions from pressure data acquired at each joint.
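One way joint measurements such as these could be turned into per-part positions is ordinary forward kinematics. The planar sketch below, with relative joint angles and fixed link lengths, is an illustrative simplification; the patent does not specify the kinematic model:

```python
import math

def forward_kinematics(joint_angles, link_lengths):
    """Planar forward kinematics: position of each joint of one arm.

    joint_angles: relative joint angles in radians, one per link.
    link_lengths: length of each link in metres.
    Returns a list of (x, y) joint positions, with the base at the origin.
    """
    x = y = theta = 0.0
    positions = [(0.0, 0.0)]
    for angle, length in zip(joint_angles, link_lengths):
        theta += angle                      # accumulate relative joint angles
        x += length * math.cos(theta)       # advance along the current link
        y += length * math.sin(theta)
        positions.append((x, y))
    return positions
```

A gyroscope or joint pressure sensor would supply the angle estimates; the same accumulation generalizes to 3-D with rotation matrices.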
Optionally, each mechanical arm 121 carries one or more interchangeable instrument heads 123. The instrument head 123 includes an interface for mounting it on the mechanical arm, and each instrument head type corresponds to one type of interface, so the identity of a head can be determined by identifying its interface type.
Optionally, each instrument head is thus provided with a different type of interface. When an instrument head 123 is replaced, the instrument head recognition unit determines the head type from its interface, obtains the type and/or position information of the head, and feeds it back to the control module 11 in real time, allowing the doctor to adjust or exchange instruments as the operation requires.
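The interface-based identification described above can be sketched as a lookup from an interface code to a head type. The codes and head names below are hypothetical examples for illustration, not values from the patent:

```python
# Hypothetical interface-code table: each instrument head type has its own
# mechanical/electrical interface, so reading the interface code identifies the head.
INTERFACE_TYPES = {
    0x01: "forceps",
    0x02: "scissors",
    0x03: "electrocautery",
    0x04: "endoscope lens",
}

def identify_instrument_head(interface_code):
    """Map the interface code read from the arm's mount to a head type."""
    head = INTERFACE_TYPES.get(interface_code)
    if head is None:
        raise ValueError(f"unknown instrument interface 0x{interface_code:02x}")
    return head
```

On a head swap, the recognition unit would read the new code, resolve the type, and report it to the control module together with the head's position.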
Optionally, the instrument head recognition unit includes an instrument head type identification subunit and/or an instrument head position identification subunit; the former identifies the type information of the instrument head and the latter its position information. The two subunits may be implemented on separate devices or on a single device.
Optionally, the spatial position acquiring module 122 further includes: one or more communication interfaces for communicating with external devices and/or the control module.
Optionally, the communication interface includes: one or more of a USB interface, an HDMI interface, a VGA interface and a Bluetooth interface.
Optionally, the communication interface includes an output interface for communicating with an external device, comprising one or more of a USB interface, an HDMI interface, and a VGA interface. For example, when the communication interface is a USB interface, the spatial position acquisition module 122 outputs the mechanical arm position information to the external device through the USB interface, helping the system keep the arms from colliding or jamming.
When the communication interface is an HDMI interface, the spatial position acquisition module 122 transmits the acquired real-time position information of the mechanical arms 121 to a computer system in the operating room through the HDMI interface.
Optionally, the communication interface includes a Bluetooth interface for communicating with the control module, transmitting the position information of the mechanical arms 121 to the control module 11 over Bluetooth.
Alternatively, the robot arm 30 shown in fig. 3 includes: a lens arm 31 and one or more robot arms 32.
The spatial position acquisition module acquires position information of each part of the lens arm 31 and the mechanical arm 32, and feeds the position information back to the control module in real time.
Optionally, the lens arm 31 includes a camera unit for collecting the video signal.
Optionally, the camera unit is mounted on the lens arm and may be any device capable of collecting video signals; preferably, it includes one or more cameras.
Optionally, the video signal comprises images captured during a surgical procedure.
In summary, the multi-arm cooperative surgical robot solves the prior-art problem that, when multiple arms are used, the instrument arms easily collide and conflict during operation, disrupting the surgical workflow and reducing efficiency and safety. It enables robot-assisted surgery under multi-arm cooperation; compared with a traditional single-arm ear-nose-throat surgical robot it is more flexible and better matches actual surgical requirements; and it identifies the type and position of each arm's instrument head and calculates multi-arm motion paths and/or adjacency relations so that the instrument arms avoid collisions and conflicts, improving surgical efficiency and safety. The application therefore effectively overcomes various defects of the prior art and has high industrial utilization value.
The above embodiments are merely illustrative of the principles and utilities of the present application and are not intended to limit the application. Any person skilled in the art can modify or change the above-described embodiments without departing from the spirit and scope of the present application. Accordingly, it is intended that all equivalent modifications or changes which can be made by those skilled in the art without departing from the spirit and technical concepts disclosed in the present application shall be covered by the claims of the present application.

Claims (10)

1. A multi-arm cooperative surgical robot, comprising: a control module and a multi-arm coordination module;
wherein the multi-arm coordination module, connected to the control module, comprises: a plurality of mechanical arms and a spatial position acquisition module connected to the mechanical arms, the spatial position acquisition module being configured to acquire one or more of position information of each part of the mechanical arms, spatial position information and/or type information of an instrument head mounted on the mechanical arms, and touch sensing information of each mechanical arm, and to feed this information back to the control module in real time, so that the control module calculates multi-arm motion paths and/or adjacency relations and collision and jamming of the mechanical arms are avoided.
2. The multi-arm cooperative surgical robot of claim 1, wherein the spatial position acquisition module comprises:
One or more of a mechanical arm position identification unit, an instrument head identification unit and a touch sensing unit;
the mechanical arm position identification unit is used for monitoring and calculating position information of each part of the mechanical arm;
the instrument head identification unit is used for obtaining type information and/or position information of the instrument head;
the touch sensing unit is used for obtaining touch sensing information of each mechanical arm so as to sense the touch condition of each mechanical arm.
3. The multi-arm cooperative surgical robot of claim 1, wherein the instrument head comprises: an interface for mounting on the robotic arm; wherein each instrument head corresponds to one type of interface.
4. A multi-arm cooperative surgical robot as claimed in claim 3, wherein the instrument head recognition unit obtains type information and/or position information of an instrument head mounted on the mechanical arm through the interface.
5. The multi-arm cooperative surgical robot of claim 2, wherein the mechanical arm position identification unit comprises: a micro-electromechanical gyroscope for calculating the position information of each part of the mechanical arm.
6. The multi-arm cooperative surgical robot of claim 2, wherein the mechanical arm position identification unit comprises: an optical navigation probe and/or a pressure sensor for monitoring the positions of all parts of the mechanical arm.
7. The multi-arm cooperative surgical robot of claim 1, wherein said spatial position acquisition module further comprises: one or more communication interfaces for communicating with external devices and/or the control module.
8. The multi-arm cooperative surgical robot of claim 7, wherein the communication interface comprises: one or more of a USB interface, an HDMI interface, a VGA interface and a Bluetooth interface.
9. The multi-arm cooperative surgical robot of claim 2, wherein the touch sensing unit comprises: a touch sensing device.
10. The multi-arm cooperative surgical robot of claim 1, wherein the robotic arm comprises: a lens arm and one or more robot arms.
CN202110008685.4A 2021-01-05 2021-01-05 Multi-arm cooperative surgical robot Pending CN112618024A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110008685.4A CN112618024A (en) 2021-01-05 2021-01-05 Multi-arm cooperative surgical robot

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110008685.4A CN112618024A (en) 2021-01-05 2021-01-05 Multi-arm cooperative surgical robot

Publications (1)

Publication Number Publication Date
CN112618024A true CN112618024A (en) 2021-04-09

Family

ID=75290666

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110008685.4A Pending CN112618024A (en) 2021-01-05 2021-01-05 Multi-arm cooperative surgical robot

Country Status (1)

Country Link
CN (1) CN112618024A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022160877A1 (en) * 2021-01-28 2022-08-04 哈尔滨思哲睿智能医疗设备有限公司 Voice prompt control method and system for laparoscopic surgery robot
CN115229806A (en) * 2022-09-21 2022-10-25 杭州三坛医疗科技有限公司 Mechanical arm control method, device, system, equipment and storage medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140163736A1 (en) * 2012-12-10 2014-06-12 Intuitive Surgical Operations, Inc. Collision avoidance during controlled movement of image capturing device and manipulatable device movable arms
CN109288592A (en) * 2018-10-09 2019-02-01 成都博恩思医学机器人有限公司 The method of operating robot and detection mechanical arm collision with mechanical arm
CN109620410A (en) * 2018-12-04 2019-04-16 微创(上海)医疗机器人有限公司 The method and system of mechanical arm anticollision, medical robot
CN111542271A (en) * 2017-12-28 2020-08-14 爱惜康有限责任公司 Collaborative surgical operation of robotically-assisted surgical platforms

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
刘英 (Liu Ying): "Clinical Operating Room Nursing Practice Guide" (《临床手术室护理实践指南》), 30 September 2018, Tianjin Science and Technology Press *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022160877A1 (en) * 2021-01-28 2022-08-04 哈尔滨思哲睿智能医疗设备有限公司 Voice prompt control method and system for laparoscopic surgery robot
CN115229806A (en) * 2022-09-21 2022-10-25 杭州三坛医疗科技有限公司 Mechanical arm control method, device, system, equipment and storage medium
CN115229806B (en) * 2022-09-21 2023-03-03 杭州三坛医疗科技有限公司 Mechanical arm control method, device, system, equipment and storage medium

Similar Documents

Publication Publication Date Title
US11408728B2 (en) Registration of three-dimensional coordinates measured on interior and exterior portions of an object
US20220155060A1 (en) Triangulation scanner with blue-light projector
US11009941B2 (en) Calibration of measurement units in alignment with a skeleton model to control a computer system
US9964398B2 (en) Three-dimensional measuring device removably coupled to robotic arm on motorized mobile platform
US9232980B2 (en) Operation input device and method of initializing operation input device
US9901411B2 (en) Control device and method for controlling a robot with a system by means of gesture control
AU2020399817B2 (en) Navigation surgery system and registration method therefor, electronic device, and support apparatus
US20160128783A1 (en) Surgical navigation system with one or more body borne components and method therefor
US7668584B2 (en) Interface apparatus for passive tracking systems and method of use thereof
US20150223725A1 (en) Mobile maneuverable device for working on or observing a body
CN112618024A (en) Multi-arm cooperative surgical robot
US20050131582A1 (en) Process and device for determining the position and the orientation of an image reception means
US20170042625A1 (en) Robotic interface positioning determination systems and methods
WO2018032083A1 (en) Methods and systems for registration of virtual space with real space in an augmented reality system
RU2009115691A (en) METHODS AND SYSTEMS OF MEDICAL SCAN WITH TACTICAL FEEDBACK
JP2013034835A (en) Operation support device and method for controlling the same
CN109297413A (en) A kind of large-size cylinder body Structural visual measurement method
JPWO2018043525A1 (en) Robot system, robot system control apparatus, and robot system control method
CN115363762A (en) Positioning method and device of surgical robot and computer equipment
JP2011200997A (en) Teaching device and method for robot
CN112109069A (en) Robot teaching device and robot system
CN114536399B (en) Error detection method based on multiple pose identifications and robot system
US20220000571A1 (en) System and method for assisting tool exchange
WO2020159978A1 (en) Camera control systems and methods for a computer-assisted surgical system
CN110547874B (en) Method for determining a movement path, component for the method, and use in an automation device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20210409