CN116423547A - Surgical robot pedal control system, method, readable medium and surgical robot - Google Patents

Surgical robot pedal control system, method, readable medium and surgical robot

Info

Publication number
CN116423547A
CN116423547A (application number CN202310160630.4A)
Authority
CN
China
Prior art keywords
pedal
surgical robot
target object
real
control device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310160630.4A
Other languages
Chinese (zh)
Inventor
(Name withheld at the inventor's request)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Microport Medbot Group Co Ltd
Original Assignee
Shanghai Microport Medbot Group Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Microport Medbot Group Co Ltd filed Critical Shanghai Microport Medbot Group Co Ltd
Priority to CN202310160630.4A priority Critical patent/CN116423547A/en
Publication of CN116423547A publication Critical patent/CN116423547A/en
Pending legal-status Critical Current

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/30 Surgical robots
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/70 Manipulators specially adapted for use in surgery
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J 13/00 Controls for manipulators
    • B25J 13/04 Foot-operated control means
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J 19/00 Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
    • B25J 19/02 Sensing devices
    • B25J 19/021 Optical sensing devices
    • B25J 19/023 Optical sensing devices including video camera means
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J 9/00 Programme-controlled manipulators
    • B25J 9/16 Programme controls
    • B25J 9/1602 Programme controls characterised by the control system, structure, architecture
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J 9/00 Programme-controlled manipulators
    • B25J 9/16 Programme controls
    • B25J 9/1679 Programme controls characterised by the tasks executed
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J 9/00 Programme-controlled manipulators
    • B25J 9/16 Programme controls
    • B25J 9/1694 Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • B25J 9/1697 Vision controlled systems
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05G CONTROL DEVICES OR SYSTEMS INSOFAR AS CHARACTERISED BY MECHANICAL FEATURES ONLY
    • G05G 1/00 Controlling members, e.g. knobs or handles; Assemblies or arrangements thereof; Indicating position of controlling members
    • G05G 1/30 Controlling members actuated by foot
    • G05G 1/46 Means, e.g. links, for connecting the pedal to the controlled unit

Abstract

The invention provides a surgical robot pedal control system and method, a readable medium, and a surgical robot. The pedal control system comprises an image acquisition device and a control device. The image acquisition device acquires real-time three-dimensional images of a pedal of the surgical robot system and the area adjacent to the pedal, and transmits the images to the control device. The control device identifies the pedal and a target object in the acquired real-time three-dimensional images and, from the identified pedal and target object, obtains real-time relative position information of the pedal and the target object in space. The control device further evaluates the real-time relative position information: if the target object is within the pre-trigger area of the pedal and its displacement along a preset direction reaches a threshold, the mapping relationship between the pedal and the slave-end device of the surgical robot system is switched. The same pedal can thus perform different functions, which reduces the complexity and difficulty of pedal operation.

Description

Surgical robot pedal control system, method, readable medium and surgical robot
Technical Field
The invention relates to the technical field of medical instruments, and in particular to a surgical robot pedal control system and method, a readable medium, and a surgical robot.
Background
Surgical robot systems address the clinical demand for minimally invasive, precise surgery. A surgical robot system generally includes a master-end device (e.g., a surgeon console) and a slave-end device (e.g., a patient-side surgical platform). During an operation, the surgeon observes tissue features inside the patient on a two-dimensional or three-dimensional display at the surgeon console and, in master-slave teleoperation, manipulates the master manipulator and pedals on the console; through master-slave mapping, these inputs drive the robotic arms and surgical instruments on the patient-side platform to carry out the operation. Because the surgeon works in a manner and with a feel similar to conventional surgery, the difficulty of operating is greatly reduced, efficiency and safety are improved, and remote surgery becomes feasible. Surgery performed with a robot system leaves smaller wounds, causes less bleeding, and allows faster recovery, markedly shortening postoperative hospitalization and improving postoperative survival and recovery rates. Surgical robot systems are therefore favored by surgeons and patients alike, and as high-end medical devices they are widely applied in many clinical procedures.
However, the pedal system of the surgeon console in a typical surgical robot system includes multiple foot switches (e.g., six or more), each controlling a different function of the slave-end patient-side platform, such as electro-cutting, electro-coagulation, switching operating arms, and the clutch. So many foot switches make operation cumbersome, and the surgeon must also memorize the function of each switch, which increases the difficulty of operating. Moreover, with existing pedal systems the surgeon cannot directly see the positions of the foot switches during surgery, so a switch may be stepped on by mistake, seriously compromising surgical safety. Having to repeatedly check the pedal positions also undermines the surgeon's confidence and degrades the operating experience.
Disclosure of Invention
The invention aims to provide a surgical robot pedal control system and method, a readable medium, and a surgical robot, in order to solve the operating difficulty caused by the large number of pedal switches in existing surgical robot systems.
In order to solve the above technical problems, the present invention provides a surgical robot pedal control system, which includes: an image acquisition device and a control device;
the image acquisition device is used for acquiring real-time three-dimensional images of a pedal of the surgical robot system and an adjacent area of the pedal and transmitting the real-time three-dimensional images to the control device;
the control device is configured to identify the pedal and the target object in the acquired real-time three-dimensional image, and to obtain, from the identified pedal and target object, real-time relative position information of the pedal and the target object in space;
the control device is further configured to evaluate the real-time relative position information: if the target object is within the pre-trigger area of the pedal and the displacement of the target object along a preset direction reaches a threshold, the mapping relationship between the pedal and the slave-end device of the surgical robot system is switched.
Optionally, in the surgical robot pedal control system, the pre-trigger area is a three-dimensional spatial region enclosed by a horizontal boundary, a first height boundary, and a second height boundary. The horizontal boundary is formed by extending in the horizontal direction, centered on the pedal; the first height boundary is higher than the second height boundary in the vertical direction, and both are formed by extending in the vertical direction from the vertical height of the pedal tread.
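The enclosed region described above amounts to a simple containment test. The following sketch is illustrative only: the class and field names, the circular shape of the horizontal boundary, and any numeric values are assumptions, not details from the patent, which leaves the exact shape of the horizontal boundary open.

```python
from dataclasses import dataclass


@dataclass
class PreTriggerArea:
    """Hypothetical model of the pre-trigger region: a horizontal
    boundary extended outward from the pedal centre, capped by two
    height boundaries referenced to the pedal tread surface."""
    center_x: float           # pedal centre (metres)
    center_y: float
    tread_z: float            # vertical height of the pedal tread
    horizontal_radius: float  # extent of the horizontal boundary
    upper_offset: float       # first (higher) height boundary, above the tread
    lower_offset: float       # second (lower) height boundary, below the tread

    def contains(self, x: float, y: float, z: float) -> bool:
        # inside the horizontal boundary centred on the pedal?
        within_horizontal = ((x - self.center_x) ** 2 +
                             (y - self.center_y) ** 2) <= self.horizontal_radius ** 2
        # between the second (lower) and first (upper) height boundaries?
        within_height = (self.tread_z - self.lower_offset
                         <= z <= self.tread_z + self.upper_offset)
        return within_horizontal and within_height
```

A point cloud of the recognised foot could then be tested against `contains` frame by frame.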
Optionally, in the surgical robot pedal control system, the control device is further configured to send an excitation signal to the master-end device when the pedal is depressed, so that the master-end device drives the corresponding instrument of the slave-end device to perform the corresponding operation according to the current mapping relationship.
Optionally, in the surgical robot pedal control system, the control device includes an image synthesis module configured to output a menu image corresponding to the current mapping relationship and/or excitation prompt information corresponding to the current mapping relationship to a display device of the master-end device.
Optionally, in the surgical robot pedal control system, if the target object is outside the pre-trigger area of the pedal, the image synthesis module is configured to hide the menu image.
Optionally, in the surgical robot pedal control system, the control device is further configured to store a current mapping relationship when the target object leaves a pre-trigger area of the pedal; and when the target object reenters the pre-trigger area of the pedal, taking the stored mapping relation as an initial mapping relation.
Optionally, in the surgical robot pedal control system, the control device is further configured, when connected to a master-end device of the surgical robot system, to obtain the current mapping relationship of the master-end device and use it as the initial mapping relationship.
In order to solve the technical problems, the invention also provides a surgical robot pedal control method which is applied to the surgical robot pedal control system; the surgical robot pedal control method comprises the following steps:
acquiring a real-time three-dimensional image of the pedal and an adjacent area of the pedal;
identifying the pedal and the target object in the acquired real-time three-dimensional image, and obtaining, from the identified pedal and target object, real-time relative position information of the pedal and the target object in space; and
evaluating the real-time relative position information: if the target object is within the pre-trigger area of the pedal and the displacement of the target object along a preset direction reaches a threshold, switching the mapping relationship with the slave-end device of the surgical robot system.
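The steps above can be sketched as a per-frame judgment routine. This is an illustrative reading, not the patent's implementation: the foot position is assumed to have already been recognised from the real-time three-dimensional image, the preset direction is taken here as the lateral x-axis, `area` is any object with a `contains(x, y, z)` predicate for the pre-trigger region, and all names are hypothetical.

```python
def pedal_control_step(foot_pos, area, state, threshold, mapping_cycle):
    """Evaluate one recognised foot position (x, y, z) against the
    pre-trigger area and switch the pedal mapping when the lateral
    displacement reaches the threshold. Illustrative sketch only."""
    if area.contains(*foot_pos):
        if state.get("entry_pos") is None:
            # foot has just entered the pre-trigger area: record a reference
            state["entry_pos"] = foot_pos
        # displacement along the preset direction (lateral x, by assumption)
        displacement = abs(foot_pos[0] - state["entry_pos"][0])
        if displacement >= threshold:
            # threshold reached: switch to the next mapping relationship
            i = mapping_cycle.index(state["mapping"])
            state["mapping"] = mapping_cycle[(i + 1) % len(mapping_cycle)]
            state["entry_pos"] = foot_pos  # re-arm for a further switch
    else:
        state["entry_pos"] = None          # foot left the pre-trigger area
    return state
```

Called once per camera frame, this yields exactly the claimed behaviour: no switching outside the pre-trigger area, and one switch per threshold-sized lateral move inside it.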
In order to solve the above technical problem, the present invention also provides a readable storage medium having stored thereon a program which, when executed, implements the steps of the surgical robot pedal control method as described above.
To solve the above technical problem, the present invention further provides a surgical robot system that includes the surgical robot pedal control system described above and a master-end device including a pedal; the preset direction is perpendicular to the trigger direction of the pedal and perpendicular to the facing direction of the master-end device.
Optionally, in the surgical robot system, the main end device includes at most two of the pedals.
In summary, in the surgical robot pedal control system and method, the readable storage medium, and the surgical robot system provided by the present invention, the pedal control system includes an image acquisition device and a control device. The image acquisition device acquires real-time three-dimensional images of a pedal of the surgical robot system and the area adjacent to the pedal, and transmits them to the control device. The control device identifies the pedal and the target object in the acquired images and obtains their real-time relative position information in space. The control device further evaluates this information: if the target object is within the pre-trigger area of the pedal and its displacement along the preset direction reaches a threshold, the mapping relationship between the pedal and the slave-end device of the surgical robot system is switched.
With this configuration, an operator can switch the mapping relationship between the pedal and the slave-end device by moving the foot along the preset direction. The same pedal can thus perform different functions, the number of physical pedals can be reduced, and the complexity and difficulty of pedal operation are lowered; at the same time, accidental stepping during surgery is avoided, improving surgical safety.
Drawings
Those of ordinary skill in the art will appreciate that the figures are provided for a better understanding of the present invention and do not constitute any limitation on the scope of the present invention. Wherein:
FIG. 1 is a schematic view of a surgical robotic system of an embodiment of the present invention;
FIG. 2 is a schematic diagram of a physician console according to an embodiment of the present invention;
FIG. 3 is a block diagram of the hardware architecture of a surgical robot foot pedal control system according to an embodiment of the present invention;
FIG. 4 is a block diagram of the software architecture of a surgical robot foot pedal control system according to an embodiment of the present invention;
FIG. 5 is a schematic diagram of a mapping relationship between a pedal and a slave device according to an embodiment of the present invention;
FIG. 6 is a schematic view of a surgical robot foot pedal control system according to an embodiment of the present invention;
FIGS. 7a and 7b are schematic views of the foot being moved side-to-side in accordance with an embodiment of the present invention;
FIG. 8 is a top view of a pre-trigger area of an embodiment of the present invention;
FIGS. 9a to 9c are side views of the foot relative to the pre-trigger area in accordance with embodiments of the present invention;
FIG. 10 is a schematic diagram of a display on the display device of an embodiment of the present invention;
FIG. 11 is a schematic diagram of switching of menu images according to an embodiment of the present invention.
Detailed Description
The invention will be described in further detail with reference to the drawings and specific embodiments, to make its objects, advantages, and features clearer. It should be noted that the drawings are highly simplified and not drawn to scale; they serve only to aid a convenient and clear description of the embodiments. Furthermore, the structures shown in the drawings are often partial views of actual structures; in particular, different drawings emphasize different aspects and therefore adopt different scales.
As used in this disclosure, the singular forms "a," "an," and "the" include plural referents; the term "or" is generally used in the sense of "and/or"; "several" generally means "at least one"; and "at least two" generally means "two or more." The terms "first," "second," and "third" are used for descriptive purposes only and should not be construed as indicating or implying relative importance or an implicit number of features; a feature qualified by "first," "second," or "third" may explicitly or implicitly include one or at least two such features. "One end" and "another end," like "proximal end" and "distal end," generally refer to two corresponding portions rather than only their endpoints. Furthermore, terms such as "mounted," "connected," and "disposed" should be construed broadly: they indicate only that a connection, coupling, mating, or transmitting relationship exists between two elements, which may be direct or indirect through intervening elements, and they imply no particular spatial relationship; unless the context clearly dictates otherwise, an element may be inside, outside, above, below, or to one side of another element. The specific meanings of these terms can be understood by those of ordinary skill in the art according to the circumstances. Directional terms such as above, below, upper, lower, upward, downward, left, and right are used with respect to the exemplary embodiments as shown in the drawings: upward means toward the top of the corresponding drawing, downward toward its bottom.
The invention aims to provide a surgical robot pedal control system and method, a readable medium, and a surgical robot, in order to solve the operating difficulty caused by the large number of pedal switches in existing surgical robot systems. The following description refers to the accompanying drawings.
An embodiment of the present invention provides a surgical robot system. Figs. 1 and 2 show an application scenario: a master-slave teleoperated surgical robot, in which the surgical robot system includes a master-end device 100 (e.g., a surgeon console), a slave-end device 200 (e.g., a patient-side surgical platform), and a support apparatus 400 (e.g., an operating table) for supporting the operation object (e.g., a patient) during surgery. The master-end device 100 includes a doctor-side host 111 and a vision host 112 (neither shown in fig. 1; see figs. 3 and 4). It should be noted that in some embodiments the support apparatus 400 may be replaced by another surgical operation platform; the invention is not limited in this respect.
As shown in fig. 2, the master-end device 100 is the operating terminal of the teleoperated surgical robot and includes a master manipulator 101 and a plurality of pedals 500 mounted on it. The master manipulator 101 receives hand-motion information from the operator (e.g., a surgeon), and the pedals 500 receive foot-motion information, completing the input of operating instructions such as clutch, electro-cutting, electro-coagulation, and endoscope motion control. Preferably, the master-end device 100 further includes a display device 102 connected to the vision host 112; the vision host 112 is in turn connected to the endoscope of the slave-end device 200 to obtain the operative-field image captured by the endoscope inside the patient's body cavity. The vision host 112 also performs imaging processing on the image acquired by the endoscope and transmits the processed operative-field image to the display device 102, so that the operator can observe the surgical field. The operative-field image shows, for example, the types, number, and poses of the surgical instruments in the abdomen, and the morphology and arrangement of the diseased organ tissue, surrounding organ tissue, and blood vessels. The image displayed by the display device 102 may be two-dimensional or three-dimensional.
The slave-end device 200 is the execution platform of the teleoperated surgical robot and includes a base 201 and a surgical execution assembly mounted on it. The surgical execution assembly includes an instrument arm 210 and an instrument 221 mounted on or attached to the distal end of the instrument arm 210. The instruments 221 include surgical instruments for performing surgical operations, endoscopes for viewing assistance, and the like. In one embodiment, the surgical instrument performs specific surgical actions such as clamping, cutting, and shearing.
The doctor-side host 111 is communicatively connected to the slave-end device 200 and controls the movement of the surgical execution assembly according to the movements of the master manipulator 101 and the pedal 500. Specifically, the doctor-side host 111 includes a master-slave mapping module 113, which obtains the end pose information of the master manipulator 101, derives the desired end pose of the surgical execution assembly according to a predetermined mapping relationship, and controls the instrument arm 210 to drive the instrument to that desired end pose. The master-slave mapping module 113 also receives depression information from the pedal 500, derives an instrument-function operation instruction (such as an electro-cutting or electro-coagulation instruction) according to a predetermined mapping relationship, and controls the energy driver of the surgical instrument 221 to release energy for operations such as electro-cutting and electro-coagulation.
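The pose branch of the master-slave mapping module can be pictured as a motion-scaling map. The patent says only that a predetermined mapping relationship converts the master manipulator's end pose into a desired end pose; the uniform 0.5 scale factor and the function name below are assumed purely for illustration.

```python
def map_master_to_desired_pose(master_tip, scale=0.5):
    """Map the master manipulator's end position to a desired end
    position for the surgical execution assembly. Illustrative
    motion-scaling sketch; the scale factor is an assumption, and a
    full implementation would also map orientation."""
    x, y, z = master_tip
    # uniform down-scaling: large hand motions become fine instrument motions
    return (scale * x, scale * y, scale * z)
```

With `scale=0.5`, a 2 cm hand motion at the console would command a 1 cm instrument motion at the slave end.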
Further, the surgical robot system includes an image trolley 300. Optionally, the image trolley 300 includes an auxiliary display screen 302, which is communicatively coupled to the vision host 112 to provide a real-time display of the operative-field image or other auxiliary information to an auxiliary operator (e.g., a nurse).
Optionally, in some surgical application scenarios, the surgical robot system further includes auxiliary equipment such as a ventilator-and-anesthesia machine 410 and an instrument table 420. Those skilled in the art can select and configure these auxiliary components according to the prior art, so they are not described here.
It should be noted that the surgical robot system disclosed above is only one illustrative application scenario and does not limit the applicable scenarios. The surgical robot system is not limited to a master-slave teleoperated robot; it may also be a single-ended surgical robot system in which the operator directly operates the robot to perform surgery.
To address the excessive number of pedals 500 in the prior-art master-end device 100, and referring to figs. 3 to 11, an embodiment of the present invention provides a surgical robot pedal control system comprising an image acquisition device 600 and a control device 700. The image acquisition device 600 acquires real-time three-dimensional images of the pedal 500 and its adjacent area and transmits them to the control device 700. The control device 700 identifies the pedal 500 and a target object in the acquired images and, from the identified pedal 500 and target object, obtains their real-time relative position information in space. The control device 700 further evaluates this information: if the target object is within the pre-trigger area of the pedal 500 and its displacement along a preset direction reaches a threshold, the mapping relationship between the pedal 500 and the slave-end device 200 of the surgical robot system is switched. The target object here may be the operator's foot, or an object such as a foot prosthesis model used for calibration and testing; this embodiment is not limited in this respect. For convenience, the following description takes the operator's foot as the target object.
The pre-trigger area is a spatial region of a certain extent referenced to the periphery of the pedal 500. While the foot moves within the pre-trigger area it is captured by the image acquisition device 600 and monitored by the control device 700; if the foot's displacement within the area along the preset direction reaches the threshold, the control device 700 outputs pedal-function switching information to the master-slave mapping module 113, switching the mapping relationship between the pedal 500 and the slave-end device 200. It will be appreciated that parameters such as the pre-trigger area, the preset direction, and the displacement threshold can be set in practice according to the size and layout of the pedal 500, and that the pre-trigger area should lie entirely within the field of view of the image acquisition device 600.
With this configuration, the operator can switch the mapping relationship between the pedal 500 and the slave-end device 200 by moving the foot along the preset direction, so the same pedal 500 can perform different functions. The number of physical pedals can be reduced, the complexity and difficulty of operating the pedal 500 are lowered, accidental stepping during surgery is avoided, and surgical safety is improved.
Figs. 3 and 4 show, by way of example, the hardware and software structures of the surgical robot pedal control system of this embodiment. Referring also to fig. 6, the image acquisition device 600 is, for example, a structured-light depth camera that captures real-time three-dimensional images of the area around the base of the master-end device 100; it will be understood that this area contains the pedal 500 and its adjacent region. Optionally, the image acquisition device 600 is connected to the control device 700 by a USB cable. The pedal 500 is preferably detachably connected to the control device 700, for example also by a USB cable, and transmits a depression signal to the control device 700 when stepped on. Optionally, in the example of figs. 3 and 4, the control device 700 is an integrated hardware unit independent of the master-end device 100 and is detachably connected to the doctor-side host 111, for example by a network cable, and to the vision host 112 and the display device 102, for example by HDMI cables. The control device 700 processes the real-time images from the vision host 112 and the image acquisition device 600, synthesizes a new image, and outputs it to the display device 102 for display.
Referring to fig. 4, the software structure of the surgical robot pedal control system includes a foot position-and-posture recognition module 701, a pedal configuration module 702, a pedal instruction excitation module 703, and an image synthesis module 704, all of which may be integrated in the control device 700 in hardware. The real-time three-dimensional image acquired by the image acquisition device 600 is input to the foot position-and-posture recognition module 701, which recognizes the foot, the position of the pedal 500, and the motion posture of the foot, and outputs a signal to the pedal instruction excitation module 703 together with the threshold configured by the pedal configuration module 702. Based on the signal from the recognition module 701, the pedal instruction excitation module 703 determines whether the foot is in the pre-trigger area; if so, and if the foot's displacement along the preset direction reaches the threshold, it generates pedal-function switching information.
Further, the pedal instruction excitation module 703 sends the pedal-function switching information to the master-slave mapping module 113, and sends the menu image 810 corresponding to the current mapping relationship and/or the excitation prompt information 820 corresponding to the current mapping relationship to the image synthesis module 704. The master-slave mapping module 113 mainly updates and switches, according to the received pedal-function switching information, the association between the original pedal processing logic and the functions in the instruction-module menu.
In one aspect, the control device 700 is further configured to send an excitation signal to the master-end device 100 when the pedal 500 is depressed, so that the master-end device 100 drives the corresponding instrument of the slave-end device 200 to perform the corresponding operation according to the current mapping relationship. Specifically, when the pedal 500 is depressed it sends a depression signal to the control device 700, and the control device 700 in turn sends an excitation signal to the master-slave mapping module 113 of the master-end device 100, which drives the corresponding instrument of the slave-end device 200 to execute the operation selected by the current mapping relationship.
In another aspect, the image synthesis module 704 is configured to output the menu image 810 corresponding to the current mapping relation and/or the excitation prompt information 820 corresponding to the current mapping relation to the display device 102 of the master device 100. Optionally, the image synthesis module 704 is further configured to obtain the surgical field image from the vision host 112, synthesize the menu image 810 and the excitation prompt information 820 with the surgical field image, and output the synthesized image to the display device 102 for display. Optionally, the pedal configuration module 702 may also be presented on the display device 102; for example, the threshold may be set in a setup page on the doctor's side.
Referring to fig. 5, an example of the mapping relation between the pedal 500 and the slave device 200 is shown. The solid line 114 indicates that, when the current pedal 500 is depressed, the first arm of the slave device 200 is driven to perform electro-cutting through the control device 700 and the master-slave mapping module 113. The broken lines 115 indicate the remaining selectable functions of the slave device 200 to which the current pedal 500 may correspond. It will be appreciated that the function mapped when the current pedal 500 is depressed may be selected among the selectable functions of the slave device 200 according to the pedal function switching information.
Optionally, the control device 700 is further configured to store the current mapping relation when the target object leaves the pre-trigger area of the pedal 500, and to use the stored mapping relation as the initial mapping relation when the target object re-enters the pre-trigger area. In other words, for convenience of operation, when the operator's foot leaves the pre-trigger area, the mapping relation selected at that moment is stored and recorded; when the foot enters the pre-trigger area again, the stored mapping relation is directly used as the current initial mapping relation. In some embodiments, the mapping relation at the moment the foot leaves the pre-trigger area may be stored by the master device 100, and when the operator's foot enters the pre-trigger area again, the control device 700 obtains the stored mapping relation from the master device 100. Of course, in other embodiments, the mapping relation may instead be stored by the control device 700 itself; this embodiment is not limited in this respect.
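The store-and-restore behavior above can be sketched as a small stateful object (class and attribute names are illustrative, not from the patent):

```python
# Sketch of storing the selected mapping when the foot leaves the pre-trigger
# area and restoring it on re-entry; naming is an assumption for illustration.

class MappingStore:
    def __init__(self, default_mapping="function_1"):
        self.current = default_mapping  # mapping currently in effect
        self._saved = None              # mapping recorded on leaving

    def on_leave_pre_trigger(self):
        # Store the mapping selected at the moment the foot leaves.
        self._saved = self.current

    def on_enter_pre_trigger(self):
        # Use the stored mapping as the initial mapping on re-entry.
        if self._saved is not None:
            self.current = self._saved
```

Whether this state lives in the control device 700 or the master device 100 is an implementation choice, as the passage notes.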
In addition, in some embodiments, the control device 700 is detachably connected to the master device 100, and the control device 700 is further configured to obtain, when connected to the master device 100 of the surgical robot system, the current mapping relation of the master device 100 and use it as the initial mapping relation. Since the control device 700 is detachably connected to the master device 100, the two devices need to be matched and synchronized upon connection. In one example, the control device 700 may initiate a connection request to the doctor-end host 111 of the master device 100; the doctor-end host 111 monitors whether a control device 700 is connected, and if so, sends the current mapping relation of the master device 100 to the control device 700, which, upon receipt, uses it as the current initial mapping relation. Further, the image synthesis module 704 may obtain the corresponding menu image 810 and/or excitation prompt information 820 according to the initial mapping relation, re-render it, and output it to the display device 102 for display.
Referring now to figs. 6 to 7b, which illustrate one example of the surgical robot pedal control system, fig. 6 shows an example in which the master device 100 includes two pedals 500, preferably corresponding to the operator's two feet respectively, each pedal 500 corresponding to a portion of the selectable functions of the slave device 200.
Preferably, the preset direction is perpendicular to the trigger direction of the pedal 500, and is also perpendicular to the facing direction of the master device 100. Referring to figs. 1 and 2 in combination, in one example the master device 100 is a doctor console, and the operator preferably faces it in a sitting position when using it; that is, the facing direction of the master device 100 faces the operator. The trigger direction of the pedal 500 refers to the direction in which the pedal 500 is depressed and activated; in the example of figs. 1 and 2, this is the vertical direction or a direction at a slight angle to the vertical (e.g., tilted forward-backward or left-right). The preset direction therefore extends in the left-right direction of the master device 100. Referring to figs. 7a and 7b in combination, since the operator preferably operates in a sitting posture, the foot can conveniently move left and right, and the foot position and posture recognition module 701 is unlikely to misrecognize the motion. Of course, in other embodiments, the preset direction may instead be set to the vertical direction or to the facing direction (i.e., the front-rear direction) of the master device 100; this embodiment does not limit the choice, which may be made by those skilled in the art according to the operator's actual needs.
Further, to improve the robustness of the surgical robot pedal control system, the foot position and posture recognition module 701 may integrate a foot recognition unit to avoid false triggering. The foot recognition unit recognizes features contained in the real-time three-dimensional image to confirm whether certain features belong to the operator's foot, avoiding false triggering caused by similar objects entering the adjacent area of the pedal 500. The foot recognition unit may employ existing image recognition algorithms, such as the SURF algorithm, as would be understood by one skilled in the art in view of the present disclosure. Alternatively, the foot recognition unit may, for example, comprise a trained foot model library, with recognition steps comprising: recognition step 1: acquiring the real-time three-dimensional image captured by the image acquisition device 600; recognition step 2: extracting features of the real-time three-dimensional image; recognition step 3: performing feature matching against the trained foot model library and outputting the recognition result. The foot model library can be trained in advance on a large amount of foot data; the specific training algorithm may follow the prior art, for example a neural network algorithm, and is not described in this embodiment.
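The three recognition steps can be sketched as follows. This is a deliberately simplified toy: the histogram feature, the library format, and the 0.9 similarity threshold are all assumptions for illustration; a real system would use a learned model or a feature detector such as SURF/ORB as the text suggests.

```python
# Toy sketch of recognition steps 2-3: extract a feature vector from an image
# and match it against a "foot model library" by cosine similarity.
# Feature choice, library format, and threshold are illustrative assumptions.
import numpy as np

def extract_features(image):
    # Toy feature: a normalized intensity histogram (step 2).
    hist, _ = np.histogram(image, bins=16, range=(0.0, 1.0))
    hist = hist.astype(float)
    total = hist.sum()
    return hist / total if total > 0 else hist

def is_foot(image, model_library, threshold=0.9):
    # Step 3: feature matching against the trained model library.
    feat = extract_features(image)
    for model_feat in model_library:
        denom = np.linalg.norm(feat) * np.linalg.norm(model_feat)
        if denom > 0 and float(feat @ model_feat) / denom >= threshold:
            return True
    return False
```

Only regions recognized as a foot would then be passed on to the pre-trigger and displacement checks, filtering out similar objects near the pedal 500.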
For ease of understanding, in the following examples the target object (the operator's foot) is abstracted as a recognition point A located at the front end of the foot. It will be appreciated that, in practice, any point or area of the foot, or the whole foot, may serve as the recognition point or area for comparison and calculation; this embodiment is not limited thereto.
Referring to fig. 6, in one example, the image acquisition device 600 is mounted on a post 104 of the master device 100. Optionally, the image acquisition device 600 is a structured-light depth camera comprising an infrared projector 610, an infrared camera 620, and an ordinary camera 630 arranged sequentially at intervals. The infrared projector 610 projects infrared structured light toward objects (such as the pedal 500 and the foot), the infrared camera 620 captures a first image carrying the reflected infrared structured-light information, and the ordinary camera 630 captures a second image carrying visible-light information. A three-dimensional spatial coordinate system is established with the intersection of the left inner bracket and the ground as the origin O (please refer to figs. 2 and 6 in combination), the left-right direction (length direction) of the master device 100 as the X axis, the front-rear direction (width direction) as the Y axis, and the vertical direction (height direction) as the Z axis. Since the distance between the infrared camera 620 and the ordinary camera 630 is known, the first and second images can be superimposed, and from the two-dimensional coordinates of a spatial point in the two images, its depth can be calculated according to the triangulation principle, yielding the point's coordinates in the three-dimensional coordinate system. For the specific measurement principle, reference is made to the prior art; it is not repeated here.
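The triangulation step can be illustrated with the standard rectified-stereo relation: for two cameras separated by a known baseline B with focal length f (in pixels), a point seen at horizontal pixel positions x1 and x2 has depth Z = f·B/(x1 − x2). All numeric values below are illustrative, not taken from the patent.

```python
# Simplified pinhole-stereo triangulation sketch. The patent's structured-light
# camera is more involved, but the depth-from-disparity relation is the core
# idea; focal length and baseline values here are illustrative.

def depth_from_disparity(x1, x2, focal_px, baseline_mm):
    """Depth Z = f * B / disparity, for a rectified camera pair."""
    disparity = x1 - x2
    if disparity <= 0:
        raise ValueError("point must have positive disparity")
    return focal_px * baseline_mm / disparity
```

With the depth known, the point's (X, Y, Z) coordinates in the console's coordinate system follow from the camera's pose on the post 104.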
Optionally, referring to figs. 8 to 9c, the pre-trigger area is a three-dimensional spatial region bounded by a horizontal boundary 710, a first height boundary 721, and a second height boundary 722. The horizontal boundary 710 is formed by extending in the horizontal direction with the pedal 500 as the center; the first height boundary 721 is higher than the second height boundary 722 in the vertical direction, and both are formed by extending in the vertical direction with the vertical height of the tread surface of the pedal 500 as the reference.
In the example shown in figs. 8 to 9c, the left pedal 500 is taken as an example. The coordinate ranges of the left pedal 500 are (X15, X25), (Y40, Y55), Z20; the horizontal boundary 710 spans (X0, X40), (Y35, Y70); the first height boundary 721 is at Z50 and the second height boundary 722 at Z20. Taking the recognition point A at the front end of the foot as an example, the foot position and posture recognition module 701 detects the movement of point A, and when its three-dimensional coordinates lie within the pre-trigger area, the pedal instruction excitation module 703 determines that the foot is in the pre-trigger area. Referring to figs. 7a and 7b in combination, if point A moves along the preset direction (e.g., the left-right direction of the master device 100) and its displacement W reaches the threshold, the pedal instruction excitation module 703 determines that the operator intends to switch the mapping relation, obtains the pedal function switching information, and sends it to the master-slave mapping module 113. Optionally, the specific ranges of the horizontal boundary 710, the first height boundary 721, and the second height boundary 722 may be configured through the pedal configuration module 702.
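Encoding this example's concrete boundary values directly (reading a label such as "X15" as the coordinate 15, an interpretive assumption), the in-area test for recognition point A becomes:

```python
# The concrete pre-trigger boundaries from this example, as a containment
# check for recognition point A. Reading "X15" etc. as plain coordinates is
# an assumption about the figures' labeling.

PRE_TRIGGER = (
    (0, 40),   # horizontal boundary 710, X range (X0..X40)
    (35, 70),  # horizontal boundary 710, Y range (Y35..Y70)
    (20, 50),  # second height boundary 722 (Z20) .. first height boundary 721 (Z50)
)

def point_in_pre_trigger(point):
    return all(lo <= c <= hi for c, (lo, hi) in zip(point, PRE_TRIGGER))
```

A point hovering over the pedal at mid-height is inside; raising the foot above Z50 or stepping outside the X/Y range leaves the area.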
Further, in order to reflect the mapping relation switched by the foot movement more clearly and intuitively, fig. 10 exemplarily illustrates a menu image 810 and excitation prompt information 820 corresponding to the mapping relation. The selectable functions of the slave device 200 corresponding to the current pedal 500 are function one to function four; the menu image 810 shows function one to function four, with function one currently selected, which may be indicated, for example, by a dark font or a frame. The unselected functions two to four may be displayed in a light font.
Further, the excitation prompt information 820 includes four prompt areas (1) to (4) corresponding to functions one to four, respectively. When the pedal 500 is depressed, the corresponding instrument of the slave device 200 is excited and performs the corresponding operation according to the currently selected mapping relation (function one in fig. 10), while the prompt area (1) corresponding to function one gives an excitation prompt; for example, prompt area (1) may flash or be displayed in red, prompting the operator that the instrument corresponding to function one is currently being excited and performing the corresponding operation.
In an example of changing the mapping relation, when the foot moves leftward and the displacement reaches the threshold, the selected mapping switches from function one to function two; correspondingly, the menu content shifts leftward so that function two moves into the selected position and is displayed in a dark font or with a frame, as shown in fig. 11. It will be appreciated that, corresponding to the mapping of fig. 11, when the pedal 500 is depressed, prompt area (2) corresponding to function two is activated to give the prompt. Of course, the menu image 810 and the excitation prompt information 820 shown in figs. 10 and 11 are merely exemplary and not limiting.
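The selection update can be sketched as a simple index step over the function list; treating a leftward move as advancing the selection (and wrapping at the ends) is an assumption for illustration, since the passage only describes the one-to-two transition:

```python
# Sketch of advancing the selected mapping when the left-right displacement
# reaches the threshold; the direction convention and wrap-around behavior
# are illustrative assumptions.

FUNCTIONS = ["function one", "function two", "function three", "function four"]

def next_selection(current_index, displacement, threshold):
    """Return the new selected index; negative displacement = leftward move."""
    if abs(displacement) < threshold:
        return current_index
    step = 1 if displacement < 0 else -1
    return (current_index + step) % len(FUNCTIONS)
```

The menu image 810 would then be re-rendered with the new index highlighted, and the matching prompt area would be the one excited on the next pedal press.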
Preferably, if the target object is outside the pre-trigger area of the pedal 500, the image synthesis module 704 is configured to hide the menu image 810. To avoid distracting the operator, the menu image 810 may be hidden when the foot is outside the pre-trigger area and displayed only when the foot is within it. Taking the examples shown in figs. 9a to 9c: in fig. 9a, the Z coordinate of recognition point A is Z60, above the coordinate Z50 of the first height boundary 721, indicating that the operator's foot is raised above the pre-trigger area; the menu image 810 is then hidden, and the mapping relation does not change even if the operator moves the foot left and right. In fig. 9b, the Z coordinate of point A lies between Z50 of the first height boundary 721 and Z20 of the second height boundary 722; the menu image 810 is displayed on the display device 102, and the mapping relation changes if the foot's displacement along the preset direction reaches the threshold. In fig. 9c, the operator's foot is pressed down and the Z coordinate of point A is Z15, below Z20 of the second height boundary 722; since the pedal is being depressed, left-right movement of the foot is not expected, and the menu image 810 is hidden.
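The visibility rule across figs. 9a to 9c reduces to a band test on the Z coordinate (boundary values taken from this example; the helper name is illustrative):

```python
# Menu visibility sketch matching figs. 9a-9c: the menu image 810 is shown
# only while the recognition point's Z coordinate lies between the second
# height boundary (Z20) and the first height boundary (Z50).

Z_FIRST_BOUNDARY = 50   # first height boundary 721 (upper limit)
Z_SECOND_BOUNDARY = 20  # second height boundary 722 (lower limit)

def menu_visible(z):
    return Z_SECOND_BOUNDARY <= z <= Z_FIRST_BOUNDARY
```

Above the band the foot is raised too high (fig. 9a); below it the pedal is being depressed (fig. 9c); only within the band (fig. 9b) can left-right motion switch the mapping.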
Preferably, the master device 100 comprises at most two of the pedals 500. To simplify the structure of the surgical robot pedal control system, at most two pedals 500 are provided, corresponding to the operator's two feet, each pedal 500 mapping a portion of the functions. This effectively solves the operational difficulty caused by the many foot switches of existing surgical robot systems. Compared with schemes that eliminate the pedal entirely and integrate the original pedal functions into the operating handle, retaining the pedal 500 reduces the operator's hand workload without affecting the operating experience. Of course, in some embodiments, the surgical robot pedal control system may include only one pedal 500, which may map all the selectable functions of the slave device 200.
The embodiment of the invention also provides a pedal control method of the surgical robot, which is applied to the pedal control system of the surgical robot; the surgical robot pedal control method comprises the following steps:
step S1: acquiring a real-time three-dimensional image of the pedal 500 and an adjacent area of the pedal 500;
step S2: identifying the pedal 500 and a target object according to the acquired real-time three-dimensional image; based on the identified pedal 500 and the target object, obtaining real-time relative position information of the pedal 500 and the target object in space;
step S3: and judging according to the real-time relative position information, if the target object is in the pre-trigger area of the pedal 500, switching the mapping relation between the pedal 500 and the slave end equipment 200 of the surgical robot system when the displacement of the target object along the preset direction reaches a threshold value.
The specific principles and steps of the method may be described with reference to the above description of the surgical robot pedal control system, and will not be repeated here.
Further, an embodiment of the present invention also provides a readable storage medium having stored thereon a program which, when executed, implements the steps of the surgical robot pedal control method described above. Still further, an embodiment of the present invention also provides a computer apparatus including a processor and the above readable storage medium, the processor being configured to execute the program stored on the readable storage medium. The readable storage medium may be provided independently or integrated in the surgical robot system; this is not limited here. Still further, an embodiment of the present invention further provides a surgical robot system including the surgical robot pedal control system described above; the structure and principle of the other components of the surgical robot system may refer to the prior art and are not described in detail.
In summary, in the surgical robot pedal control system and method, the readable storage medium, and the surgical robot system provided by the present invention, the surgical robot pedal control system includes an image acquisition device and a control device. The image acquisition device acquires real-time three-dimensional images of a pedal of the surgical robot system and an adjacent area of the pedal and transmits them to the control device. The control device is configured to recognize the pedal and the target object from the acquired real-time three-dimensional image and, based on the identified pedal and target object, obtain real-time relative position information of the pedal and the target object in space. The control device is further configured to judge according to the real-time relative position information: if the target object is in the pre-trigger area of the pedal, the mapping relation between the pedal and the slave-end equipment of the surgical robot system is switched when the displacement of the target object along the preset direction reaches a threshold value.
So configured, the operator can switch the mapping relation between a pedal and the slave-end equipment by moving the foot along the preset direction, so that the same pedal can realize different functions. This reduces the number of physical pedals, simplifies the complexity and difficulty of pedal operation, avoids mistaken depression during surgery, and improves operational safety.
It should be noted that the above embodiments may be combined with each other. The above description is only illustrative of the preferred embodiments of the present invention and is not intended to limit the scope of the present invention, and any alterations and modifications made by those skilled in the art based on the above disclosure shall fall within the scope of the appended claims.

Claims (11)

1. A surgical robot pedal control system, comprising: an image acquisition device and a control device;
the image acquisition device is used for acquiring real-time three-dimensional images of a pedal of the surgical robot system and an adjacent area of the pedal and transmitting the real-time three-dimensional images to the control device;
the control device is configured to recognize and obtain the pedal and a target object according to the acquired real-time three-dimensional image; based on the identified pedal and the target object, obtaining real-time relative position information of the pedal and the target object in space;
the control device is further configured to judge according to the real-time relative position information, and if the target object is in the pre-trigger area of the pedal, when the displacement of the target object along the preset direction reaches a threshold value, the mapping relation between the pedal and the slave-end equipment of the surgical robot system is switched.
2. The surgical robot foot pedal control system according to claim 1, wherein the pre-trigger area is a three-dimensional space area formed by surrounding a horizontal boundary, which is a boundary formed by extending in a horizontal direction centering on the foot pedal, a first height boundary, and a second height boundary; the first height boundary is higher than the second height boundary in the vertical direction, and the first height boundary and the second height boundary are boundaries formed by taking the vertical height of the tread of the pedal as a reference and extending in the vertical direction.
3. The surgical robot foot pedal control system of claim 1, wherein the control device is further configured to send an excitation signal to the master device when the foot pedal is depressed to cause the master device to drive the respective instrument of the slave device to perform the respective operation according to the current mapping.
4. The surgical robot pedal control system according to claim 1, wherein the control device includes an image composition module for outputting a menu image corresponding to a current mapping relation and/or excitation prompt information corresponding to the current mapping relation to a display device of the main terminal device.
5. The surgical robot foot pedal control system of claim 4, wherein the image composition module is configured to hide the menu image if the target object is outside a pre-trigger area of the foot pedal.
6. The surgical robot foot pedal control system of claim 1, wherein the control device is further configured to store a current mapping when the target object leaves a pre-trigger area of the foot pedal; and when the target object reenters the pre-trigger area of the pedal, taking the stored mapping relation as an initial mapping relation.
7. The surgical robot pedal control system according to claim 1, wherein the control device is further configured to acquire a current mapping relation of a master device of the surgical robot system when connected to the master device, and to take the acquired mapping relation as an initial mapping relation.
8. A surgical robot pedal control method, characterized by being applied to the surgical robot pedal control system according to any one of claims 1 to 7; the surgical robot pedal control method comprises the following steps:
acquiring a real-time three-dimensional image of the pedal and an adjacent area of the pedal;
identifying and obtaining the pedal and the target object according to the acquired real-time three-dimensional image; based on the identified pedal and the target object, obtaining real-time relative position information of the pedal and the target object in space;
and judging according to the real-time relative position information, and switching the mapping relation between the pedal and the slave-end equipment of the surgical robot system when the displacement of the target object along the preset direction reaches a threshold value if the target object is in the pre-trigger area of the pedal.
9. A readable storage medium having a program stored thereon, characterized in that the program, when executed, implements the steps of the surgical robot pedal control method according to claim 8.
10. A surgical robot system comprising the surgical robot pedal control system according to any one of claims 1 to 7, further comprising a main end device including a pedal, the preset direction being perpendicular to a trigger direction of the pedal, and the preset direction being perpendicular to a facing direction of the main end device.
11. A surgical robotic system as claimed in claim 10, wherein the master end device includes at most two of the pedals.
CN202310160630.4A 2023-02-23 2023-02-23 Surgical robot pedal control system, method, readable medium and surgical robot Pending CN116423547A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310160630.4A CN116423547A (en) 2023-02-23 2023-02-23 Surgical robot pedal control system, method, readable medium and surgical robot


Publications (1)

Publication Number Publication Date
CN116423547A true CN116423547A (en) 2023-07-14

Family

ID=87087980

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310160630.4A Pending CN116423547A (en) 2023-02-23 2023-02-23 Surgical robot pedal control system, method, readable medium and surgical robot

Country Status (1)

Country Link
CN (1) CN116423547A (en)


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117017507A (en) * 2023-10-09 2023-11-10 华中科技大学同济医学院附属协和医院 Precise master-slave control system and method for puncture operation robot
CN117017507B (en) * 2023-10-09 2023-12-19 华中科技大学同济医学院附属协和医院 Precise master-slave control system of puncture operation robot

Similar Documents

Publication Publication Date Title
AU2019352792B2 (en) Indicator system
US11918299B2 (en) Systems and methods for detection of objects within a field of view of an image capture device
KR101705921B1 (en) Synthetic representation of a surgical robot
US20220095903A1 (en) Augmented medical vision systems and methods
JP5417609B2 (en) Medical diagnostic imaging equipment
US20210015343A1 (en) Surgical assistance apparatus, surgical method, non-transitory computer readable medium and surgical assistance system
KR20140112207A (en) Augmented reality imaging display system and surgical robot system comprising the same
CN106456148A (en) Medical devices, systems, and methods using eye gaze tracking
KR101598774B1 (en) Apparatus and Method for processing surgical image
CN112672709A (en) System and method for tracking the position of a robotically-manipulated surgical instrument
WO2013141155A1 (en) Image completion system for in-image cutoff region, image processing device, and program therefor
KR20140139840A (en) Display apparatus and control method thereof
JP2007007041A (en) Surgery support
JP2012223363A (en) Surgical imaging system and surgical robot
CN116423547A (en) Surgical robot pedal control system, method, readable medium and surgical robot
CN115500950A (en) Endoscope pose adjusting method, surgical robot, and storage medium
CN111770735A (en) Operation simulation information generation method and program
KR20190080706A (en) Program and method for displaying surgical assist image
WO2020243425A1 (en) Composite medical imaging systems and methods
US20210298848A1 (en) Robotically-assisted surgical device, surgical robot, robotically-assisted surgical method, and system
KR101864411B1 (en) Program and method for displaying surgical assist image
KR20120052574A (en) Surgical robitc system and method of driving endoscope of the same
US20210298830A1 (en) Robotic surgical system and methods of use thereof
CN113081273B (en) Punching auxiliary system and surgical robot system
US20220409300A1 (en) Systems and methods for providing surgical assistance based on operational context

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination