CN219629776U - Surgical robot display system and surgical robot system - Google Patents


Info

Publication number: CN219629776U
Application number: CN202321142116.XU
Authority: CN (China)
Prior art keywords: display, pose, surgical, surgical robot, simulation image
Legal status: Active (granted)
Other languages: Chinese (zh)
Inventors: 徐凯, 张冰
Current assignee: Beijing Surgerii Robot Co Ltd
Original assignee: Beijing Surgerii Robot Co Ltd
Application filed by Beijing Surgerii Robot Co Ltd
Priority to CN202321142116.XU
Application granted
Publication of CN219629776U

Landscapes

  • Manipulator (AREA)

Abstract

The present disclosure relates to the field of medical devices, and discloses a display system for a surgical robot and a surgical robot system. The display system includes at least one main operator, a display device, and a control device. The at least one main operator is configured to receive user input. The display device is configured to display at least one display interface, which includes a first display interface configured to display a 3D simulated image of at least a portion of the surgical robot. The control device is communicatively connected to the at least one main operator and the display device and is configured, in a 3D simulated image adjustment mode, to control display adjustment of the 3D simulated image of the at least a portion of the surgical robot based on the user input. By performing display adjustment of the 3D simulation image through the main operator, the display angle or zoom state of the image can be changed so that the user can observe the 3D simulation image more clearly.

Description

Surgical robot display system and surgical robot system
Technical Field
The present disclosure relates to the field of medical devices, and more particularly, to a display system for a surgical robot and a surgical robot system.
Background
Minimally invasive surgery plays an important role in modern surgery because it causes less trauma to the patient and allows faster post-operative recovery. Existing robot-assisted minimally invasive surgery systems mainly adopt a master-slave teleoperation mode: for example, a user sends motion commands through two master operators on the master console to the slave operation equipment on the patient side, while the display device shows the surgical scene in real time so that the user can control the slave operation equipment to perform the surgical treatment according to the actual surgical field.
During preoperative positioning and during the operation, the display device can display simulation images of the slave operation equipment while it is stationary or moving, so that the user can observe its state more intuitively. However, conventional display devices cannot adjust the viewing angle of the displayed simulation image, which limits the viewing field and degrades the user experience.
Disclosure of Invention
An object of the present disclosure is to provide a display system of a surgical robot, including:
at least one primary operator configured to receive user input;
a display device configured to display at least one display interface, the at least one display interface comprising a first display interface configured to display a 3D simulated image of at least a portion of the surgical robot; and
a control device in communication with the at least one primary operator and the display device, the control device configured to control display adjustment of a 3D simulated image of at least a portion of the surgical robot based on the user input in a 3D simulated image adjustment mode.
In some embodiments, the display adjustment includes at least one of: yaw, roll, pitch, left-right movement, up-down movement, and zoom in/out.
In some embodiments, the display system further includes a trigger key configured to trigger the 3D simulation image adjustment mode.
In some embodiments, the main manipulator includes a robot arm and a handle disposed at a distal end of the robot arm, and the control device is configured to obtain an initial pose and a current pose of the handle of the at least one main manipulator and control display adjustment of the 3D simulation image based on the initial pose and the current pose in the 3D simulation image adjustment mode.
In some embodiments, the control device is configured to determine a pose adjustment amount of the 3D simulation image based on a pose difference between the initial pose and the current pose.
In some embodiments, the control device is configured to determine a yaw adjustment amount of the 3D simulation image based on a yaw difference in the pose differences, a roll adjustment amount based on a roll difference, a pitch adjustment amount based on a pitch difference, a movement adjustment amount based on a position difference, or a zoom adjustment amount based on a front-back position difference in the pose differences.
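As a concrete illustration of this per-component mapping, the sketch below (a Python example written for this text, not part of the patent; the field names and the scale factor are assumptions) pairs each component of the handle's pose difference with exactly one display adjustment:

```python
# Illustrative sketch: mapping each component of the handle pose
# difference to one display-adjustment amount for the 3D image.
from dataclasses import dataclass

@dataclass
class PoseDifference:
    yaw: float    # rad, left/right deflection difference of the handle
    roll: float   # rad
    pitch: float  # rad
    dx: float     # left/right position difference
    dy: float     # up/down position difference
    dz: float     # front/back position difference

def display_adjustment(diff: PoseDifference, scale: float = 1.0) -> dict:
    """Each pose-difference component drives exactly one adjustment."""
    return {
        "yaw": scale * diff.yaw,
        "roll": scale * diff.roll,
        "pitch": scale * diff.pitch,
        "move_lr": scale * diff.dx,
        "move_ud": scale * diff.dy,
        "zoom": scale * diff.dz,  # forward/back handle motion zooms in/out
    }
```

The `scale` parameter anticipates the proportional relationship between handle motion and image adjustment described later in the text; a value of 1 gives a one-to-one mapping.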
In some embodiments, the at least one main operator includes a left main operator and a right main operator configured to receive user input together in the 3D simulated image adjustment mode.
In some embodiments, the surgical robot includes at least one surgical tool including an arm and an end effector disposed at a distal end of the arm, the first display interface configured to display a 3D simulated image of the at least one surgical tool, the control device configured to disconnect a mapping relationship between the at least one primary manipulator and the at least one surgical tool in response to a 3D simulated image adjustment mode being triggered.
In some embodiments, the at least one display interface includes a second display interface configured to display an actual surgical image, and the first display interface is superimposed on the second display interface.
In some embodiments, the surgical robot includes a mobile station including a base and at least one motion arm disposed on the base, the first display interface configured to display a 3D simulated image of the mobile station or the at least one motion arm.
The present disclosure also provides a surgical robotic system comprising:
a mobile station comprising a base and at least one motion arm disposed on the base;
at least one surgical tool removably disposed at a distal end of the at least one motion arm; and
a display system as described in any of the embodiments of the present disclosure.
Some embodiments of the present disclosure have one or more of the following benefits: by operating the main operator, the user controls display adjustment of the 3D simulation image of the surgical tool or the mobile station and changes the display angle or zoom state of the image through different adjustments, so that the 3D simulation image can be observed more clearly.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present disclosure, the following will briefly describe the drawings that are required to be used in the description of the embodiments of the present disclosure. The drawings in the following description illustrate only some embodiments of the disclosure and other embodiments may be obtained by those of ordinary skill in the art from the disclosure's contents and drawings without inventive effort.
FIG. 1 illustrates a schematic block diagram of a display system of a surgical robot according to some embodiments of the present disclosure;
FIG. 2A illustrates a schematic view of a 3D simulated image of a display interface of a display device in one state according to some embodiments of the present disclosure;
FIG. 2B illustrates a schematic view of a 3D simulated image of a display interface of a display device in another state according to some embodiments of the present disclosure;
FIG. 2C illustrates a schematic view of a 3D simulated image of a display interface of a display device in another state according to some embodiments of the present disclosure;
FIG. 3 illustrates a schematic block diagram of a control device according to some embodiments of the present disclosure;
FIG. 4 illustrates a schematic structural view of a surgical tool according to some embodiments of the present disclosure;
FIG. 5 illustrates a schematic diagram of a primary operator according to some embodiments of the present disclosure;
FIG. 6 illustrates a coordinate system schematic of a master manipulator according to some embodiments of the present disclosure;
fig. 7 illustrates a schematic structural view of a master control station of a surgical robot according to some embodiments of the present disclosure;
fig. 8 illustrates a schematic structural view of a mobile station of a surgical robot according to some embodiments of the present disclosure;
FIG. 9A illustrates a 3D simulated image of a state of a mobile station in a display interface according to some embodiments of the present disclosure;
FIG. 9B illustrates a 3D simulated image of another state of a mobile station in a display interface according to some embodiments of the present disclosure;
FIG. 9C illustrates a 3D simulated image of another state of a mobile station in a display interface according to some embodiments of the present disclosure;
fig. 10 illustrates a structural schematic of a surgical robotic system according to some embodiments of the present disclosure.
Detailed Description
In order to make the technical problems solved by the present disclosure, the technical solutions adopted and the technical effects achieved more clear, the technical solutions of the embodiments of the present disclosure will be described in further detail below with reference to the accompanying drawings, and it is obvious that the described embodiments are merely exemplary embodiments of the present disclosure, and not all embodiments.
In the description of the present disclosure, it should be noted that the directions or positional relationships indicated by the terms "center", "upper", "lower", "left", "right", "vertical", "horizontal", "inner", "outer", etc. are based on the directions or positional relationships shown in the drawings, are merely for convenience of describing the present disclosure and simplifying the description, and do not indicate or imply that the devices or elements referred to must have a specific orientation, be configured and operated in a specific orientation, and thus should not be construed as limiting the present disclosure. Furthermore, the terms "first," "second," and the like, are used for descriptive purposes only and are not to be construed as indicating or implying relative importance.
In the description of the present disclosure, it should be noted that, unless explicitly specified and limited otherwise, the terms "mounted," "connected," and "coupled" are to be construed broadly: a connection may, for example, be fixed or removable; mechanical or electrical; direct, or indirect through an intermediate medium; or an internal communication between two elements. The specific meaning of these terms in this disclosure will be understood by those of ordinary skill in the art as the case may be. In this disclosure, the end close to the operator (e.g., the physician) is defined as the proximal end or rear end, and the end close to the surgical patient is defined as the distal end or front end. Those skilled in the art will appreciate that embodiments of the present disclosure may be used with medical instruments or surgical robots, as well as with other non-medical devices.
Fig. 1 illustrates a schematic block diagram of a display system 1000 of a surgical robot according to some embodiments of the present disclosure. As shown in fig. 1, the display system 1000 of the surgical robot includes at least one main operator 100, a display device 200, and a control device 300.
In some embodiments, as shown in fig. 1, at least one main operator 100 is communicatively coupled to the control device 300. In some embodiments, the main operators 100 comprise a left main operator (e.g., for controlling a first surgical tool) and a right main operator (e.g., for controlling a second surgical tool), corresponding to the user's left and right hands, respectively. In a practical scenario, the at least one main operator 100 may be configured to receive user input so that, through teleoperation of the main operator 100, the user controls the movement of a surgical tool or imaging tool in the operating region to carry out a medical operation. Alternatively, the user may operate the main operator 100 to control display adjustment of the 3D simulation image of the surgical tool, e.g., adjustment of its display angle or zoom state, so that the user can view the 3D simulation image more clearly.
In some embodiments, the display device 200 is configured to display at least one display interface. The at least one display interface includes a first display interface configured to display a 3D simulated image of at least a portion of the surgical robot. For example, the first display interface may display a 3D simulated image of at least one surgical tool of the surgical robot, as shown in fig. 2A-2C.
In some embodiments, a display device (e.g., the display device 200 shown in fig. 1) may include a liquid crystal display, a field emission display, an organic light emitting diode display, or the like. For example, the display device 200 may include a stereoscopic display (e.g., the stereoscopic display 520 shown in fig. 7) or a two-dimensional display (e.g., the master external display 530 shown in fig. 7, the system status display 650 shown in fig. 6). Figs. 2A, 2B, and 2C illustrate 3D simulated images on a display interface 210a of the display device 200 for different operating states and display angles of a surgical tool, according to some embodiments of the present disclosure. In some embodiments, as shown in figs. 2A-2C, the display window, screen, or virtual image display area of the display device 200 (e.g., the display window 201) may include one or more display interfaces 210, for example the display interfaces 210a and 210b. The display interface 210a is configured to display a 3D simulated image of at least one surgical tool, the display interface 210b is configured to display an actual surgical image, and the display interface 210a is superimposed on the display interface 210b. In some embodiments, the actual surgical image includes an image taken by an imaging tool (e.g., an endoscope).
Figs. 2A, 2B, and 2C show 3D simulation images of a plurality of surgical tools (e.g., the surgical tool 400 shown in fig. 4) in different states and at different display angles; the display interface 210a, which displays these 3D simulated images, may be overlaid on the display interface 210b, which displays the actual surgical images. It should be appreciated that when the 3D simulated image adjustment mode is not triggered, the display may show one interface (e.g., the display interface 210b) or both interfaces 210a and 210b, and the 3D simulated images in the display interface 210a change as the configuration of the surgical tools changes. After the user triggers the 3D simulated image adjustment mode, the display may show only the display interface 210a, or the display interface 210a superimposed on the display interface 210b.
It should be appreciated that the 3D simulation image of each surgical tool 400 may be display-adjusted individually. For example, a surgical tool matched to the main operator 100 may be designated as the selected surgical tool, and the user may adjust the display of its 3D simulated image. The 3D simulation image corresponding to the selected surgical tool may be shown in a different or highlighted color, so that the user can quickly identify which surgical tool the main operator 100 currently controls.
In this way, the 3D simulation images of one or more surgical tools can be adjusted by rotation, movement, rolling, and the like, so that they are displayed at different angles and poses on the display interface 210a and the user can conveniently view each surgical tool. The above is merely an example; it should be appreciated that the user may also make other display adjustments to the surgical tool.
The control device 300 is communicatively connected to the at least one main operator 100 and the display device 200, the control device 300 being configured to control the display adjustment of the 3D simulation image of at least a part of the surgical robot based on user input in the 3D simulation image adjustment mode.
It should be appreciated that the display adjustment of the 3D simulated image of at least a portion of the surgical robot by the control device 300 may be performed iteratively through a plurality of control cycles. For example, the control device 300 may be configured to periodically receive user input from the at least one main operator 100 and control display adjustment of the 3D simulation image of at least a portion of the surgical robot in the first display interface (e.g., the display interface 210a) based on that input. In some embodiments, the control device 300 may include at least one integrated circuit unit configured to implement the predetermined functions of the control device 300, for example receiving information, performing calculations, and sending instructions. In some embodiments, an integrated circuit unit may be, for example, a circuit board, or a circuit unit integrated on a circuit board, that implements a predetermined function. Fig. 3 illustrates a schematic block diagram of a control device 300 according to some embodiments of the present disclosure. In some embodiments, as shown in fig. 3, the control device 300 may include a signal receiving integrated circuit unit 310 and a display control integrated circuit unit 320 communicatively coupled to it. For example, the signal receiving integrated circuit unit 310 may be configured to periodically receive user input from the main operator 100, and the display control integrated circuit unit 320 may be configured to control display adjustment of the 3D simulation image of at least a portion of the surgical robot based on user input from a main operator (e.g., the left or right main operator).
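The periodic control cycle described above might be sketched as follows. The callables `read_user_input`, `apply_adjustment`, and `mode_active` are hypothetical stand-ins for the signal-receiving and display-control units; they are not names from the patent:

```python
# Hypothetical control loop: poll the main operator and apply the
# resulting adjustment to the 3D simulated image each cycle.
def run_adjustment_loop(read_user_input, apply_adjustment, mode_active,
                        cycles=None):
    """Poll input and update the 3D simulation image while the
    adjustment mode stays active (or for a fixed number of cycles).
    Returns the number of cycles executed."""
    n = 0
    while mode_active():
        user_input = read_user_input()   # e.g., the current handle pose
        apply_adjustment(user_input)     # update the first display interface
        n += 1
        if cycles is not None and n >= cycles:
            break
    return n
```

In a real system the loop period would be tied to the sampling rate of the main operator sensors; the `cycles` argument exists only to make the sketch testable.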
Fig. 4 illustrates a schematic structural view of a surgical tool 400 according to some embodiments of the present disclosure. In some embodiments, as shown in fig. 4, the surgical robot includes at least one surgical tool 400, the surgical tool 400 including an arm body 410 and an end effector 420 disposed at the distal end of the arm body 410. The first display interface (e.g., the display interface 210a) is configured to display a 3D simulated image of at least one surgical tool 400. For example, the arm body 410 may be a flexible arm. In some embodiments, the arm body 410 may include one or more distal flexible segments and may have multiple degrees of freedom, for example six degrees of freedom of motion. The flexible segments may be implemented by a variety of suitable structures, such as a continuum structure, a serpentine structure, a rod-plus-multi-joint structure, and the like. In some embodiments, the end effector 420 may include, but is not limited to, bipolar dissecting forceps, a bipolar curved grasper, monopolar curved scissors, a monopolar electric hook, a bipolar grasper, a needle holder, a tissue grasper, and the like. As shown in fig. 4, the surgical tool 400 may also include a drive transmission 430. It should be understood that the 3D simulated image of at least a portion of the surgical robot may also include 3D simulated images of other portions, such as a mobile station or a motion arm of the surgical robot.
In some embodiments, the at least one main operator 100 includes a left main operator and a right main operator configured to receive user input together in the 3D simulated image adjustment mode. For example, the user may simultaneously operate the left and right main operators to make user inputs simultaneously, and the control apparatus 300 is configured to control the adjustment display of the 3D simulation image based on the user inputs of the left and right main operators.
In some embodiments, the display adjustment may include at least one of: yaw, roll, pitch, left-right movement, up-down movement, and zoom in/out. For example, the 3D simulation image in the display interface may be moved left and right by moving the handle of the main manipulator 100 left and right, and zoomed in and out by moving the handle back and forth. It will be appreciated that the display angle or zoom state of the 3D simulated image can be changed through different adjustments to allow the user to view it more clearly. The center of rotation of the 3D simulation image may be located at the center of the image, or at another position of the image.
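The configurable rotation center mentioned above can be illustrated with a small numpy sketch (an assumption for this text, not the patent's implementation). It yaws an N x 3 point set representing the simulated image about a vertical axis through a chosen center, defaulting to the centroid:

```python
import numpy as np

def yaw_about_center(points: np.ndarray, angle: float,
                     center=None) -> np.ndarray:
    """Rotate Nx3 points by `angle` (rad) about the vertical (y) axis
    through `center`; defaults to the centroid of the points.
    The y-up axis convention is an assumption of this sketch."""
    if center is None:
        center = points.mean(axis=0)
    c, s = np.cos(angle), np.sin(angle)
    R = np.array([[c,   0.0, s],
                  [0.0, 1.0, 0.0],
                  [-s,  0.0, c]])
    # Translate to the rotation center, rotate, translate back.
    return (points - center) @ R.T + center
```

Passing an explicit `center` reproduces the case where the rotation center is placed at a position other than the image center.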
In some embodiments, the surgical robot display system 1000 further includes a trigger key configured to trigger the 3D simulated image adjustment mode. It should be appreciated that the trigger key may be provided on the main operator 100. For example, the trigger key may be a push-button switch, a human-presence sensing switch, a toggle switch, a pedal, a touch key, or the like, actuated by touching, pressing, or toggling. The above is merely an example; the trigger key may also be disposed at another position in the system and may be triggered by keyboard input, touch input on the interface, or voice control.
Fig. 5 illustrates a schematic structural view of a main operator 100 according to some embodiments of the present disclosure. In some embodiments, as shown in fig. 5, the main operator 100 may include a multi-degree-of-freedom robotic arm 110 (e.g., comprising arm segments 111-116) connected by a plurality of joints (e.g., joints 1111-1117). In some embodiments, the multi-degree-of-freedom robotic arm 110 has six degrees of freedom. A main operator sensor is provided at each joint of the robotic arm 110 and generates joint information (e.g., joint angle data). In some embodiments, the sensors at the joints are potentiometers and/or encoders. In some embodiments, a controller may be disposed in the main operator 100; the controller may calculate pose data of the main operator (e.g., the pose of the handle 120) from the joint information obtained by the sensors at the respective joints and transmit the calculated pose data to the control device. In other embodiments, the control device 300 may itself calculate the pose data of the main operator 100 from the joint information sent by the joint sensors.
In some embodiments, the main operator 100 may include a handle 120 disposed at the end of the multi-degree-of-freedom robotic arm 110. In operation, the user drives the plurality of joints and arm segments (e.g., arm segments 111-116) to adjust the configuration of the robotic arm 110 and move the handle 120 to a suitable pose (e.g., the current pose), thereby controlling the display adjustment of the 3D simulated image of the slave operation device (e.g., the surgical tool 400).
In some embodiments, the control device 300 may be configured to obtain an initial pose and a current pose of the handle 120 of the at least one main manipulator 100 and control display adjustment of the 3D simulated image based on the initial pose and the current pose in the 3D simulated image adjustment mode. In some embodiments, the control device 300 may be configured to determine the pose adjustment amount of the 3D simulation image based on the pose difference between the initial pose and the current pose.
In some embodiments, the pose difference may include position differences and attitude differences, and the control device may adjust the 3D simulation image based on each of them or a combination thereof. For example, the control device may be configured to determine: a left-right yaw adjustment amount of the 3D simulation image based on the yaw difference; a roll adjustment amount based on the roll difference; an up-down pitch adjustment amount based on the pitch difference; a left-right movement adjustment amount based on the left-right position difference; an up-down movement adjustment amount based on the up-down position difference; or a zoom adjustment amount based on the front-back position difference. To provide an intuitive adjustment experience, the direction of the pose adjustment of the 3D simulation image may be kept consistent with the direction of the pose difference of the handle 120 of the main operator 100. For example, a leftward deflection of the handle yields a leftward deflection adjustment of the image, and an upward displacement of the handle yields an upward movement adjustment of the image.
In some embodiments, the initial pose of the handle 120 of the main operator 100 includes an initial position and an initial attitude, and the current pose includes a current position and a current attitude. In some embodiments, the initial or current pose of the handle 120 is a pose relative to a reference coordinate system; for example, it is the pose, relative to the reference coordinate system, of a coordinate system defined by the handle of the main operator 100 or a portion thereof. In some embodiments, determining the initial (or current) position of the handle 120 includes determining its position relative to the base coordinate system of the main operator, and determining the initial (or current) attitude of the handle 120 includes determining its attitude relative to that base coordinate system.
Fig. 6 illustrates a coordinate system schematic of a main operator according to some embodiments of the present disclosure. As shown in fig. 6, {CombX} is the base coordinate system of the main operator and {H} is the coordinate system of its handle, with the coordinate axis directions as shown. In some embodiments, the initial (or current) pose of the main operator may be determined through coordinate transformations. For example, the initial (or current) pose of the handle may be determined from the transformation between the handle coordinate system {H} and the base coordinate system {CombX}, together with the transformation between {CombX} and the reference coordinate system {W}. Typically, the base coordinate system is fixed to the stand or base on which the main operator is mounted and remains unchanged during teleoperation.
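The chained transformation {H} to {CombX} to {W} can be written as a product of 4x4 homogeneous matrices. The sketch below uses illustrative numeric values chosen for this text, not values from the patent:

```python
import numpy as np

def compose(T_W_CombX: np.ndarray, T_CombX_H: np.ndarray) -> np.ndarray:
    """Return T_W_H: the handle frame {H} expressed in the reference
    frame {W}, by composing {H}->{CombX} with {CombX}->{W}."""
    return T_W_CombX @ T_CombX_H

# Example (illustrative values): the base frame {CombX} is translated by
# (1, 0, 0) in {W}; the handle frame {H} is translated by (0, 2, 0) in
# {CombX}. The handle origin therefore sits at (1, 2, 0) in {W}.
T_W_CombX = np.eye(4); T_W_CombX[:3, 3] = [1.0, 0.0, 0.0]
T_CombX_H = np.eye(4); T_CombX_H[:3, 3] = [0.0, 2.0, 0.0]
T_W_H = compose(T_W_CombX, T_CombX_H)
```

Because {CombX} stays fixed during teleoperation, only the {H}-to-{CombX} factor changes from cycle to cycle.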
In some embodiments, the initial or current pose of the main operator may be determined based on the main operator sensors. For example, initial joint information of at least one joint (e.g., joints 1111-1117) of the main operator may be received, and the initial pose of the handle determined from it; likewise, current joint information of the at least one joint may be received, and the current pose of the handle determined from it. In other words, the initial and current poses of the handle are determined from the joint information read by the main operator sensors at a previous time and at the current time. The position change of the handle is determined from its initial and current positions relative to the reference coordinate system {W}, and the attitude change from its initial and current attitudes relative to {W}. Based on the joint information (position or angle) acquired by the main operator sensors at the corresponding joints, the initial or current position and attitude of the handle may be calculated using a forward kinematics algorithm.
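Forward kinematics, as used here, maps the sensed joint readings to a handle pose. As a hedged stand-in for the six-degree-of-freedom arm (whose link geometry the patent does not give), a planar two-joint example illustrates the idea; the link lengths are arbitrary assumptions:

```python
import math

def forward_kinematics(theta1: float, theta2: float,
                       l1: float = 0.3, l2: float = 0.2):
    """Planar 2-link forward kinematics: return the (x, y) position of
    the end point for joint angles theta1, theta2 in radians and link
    lengths l1, l2. A simplified stand-in for the 6-DOF computation."""
    x = l1 * math.cos(theta1) + l2 * math.cos(theta1 + theta2)
    y = l1 * math.sin(theta1) + l2 * math.sin(theta1 + theta2)
    return x, y
```

Evaluating this at the sensor readings of a previous time and of the current time gives the initial and current handle positions, whose difference drives the display adjustment.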
In some embodiments, the main operator 100 includes at least one attitude joint for controlling the attitude of the handle. Determining the current attitude of the handle 120 includes obtaining joint information of the at least one attitude joint and determining the attitude of the handle from it. As shown in fig. 5, the plurality of joints includes position joints (e.g., joints 1111-1114) and attitude joints (e.g., joints 1111, 1112, 1115-1117); the attitude joints adjust the attitude of the handle 120 of the main operator 100, and the position joints adjust its position. It should be appreciated that joints 1111 and 1112 are composite joints that affect both position and attitude. Main operator sensors are arranged at the attitude joints and the position joints to acquire the corresponding joint information (position or angle). From this joint information, the initial or current pose of the handle relative to the base coordinate system {CombX} can be determined, and in turn its initial or current pose relative to the reference coordinate system {W}. For example, the initial or current attitude of the handle is calculated from the joint information (e.g., angles) of the attitude joints using a forward kinematics algorithm, and the initial or current position from the joint information (e.g., positions) of the position joints.
In some embodiments, the control device 300 may be configured to determine the pose adjustment amount of the 3D simulation image based on the pose difference between the initial pose and the current pose. In some embodiments, the pose change amount of the handle 120 of the main operator 100 may be determined based on the initial pose and the current pose of the handle 120. The pose adjustment amount of the 3D simulation image may be determined based on this pose change amount and the pose relationship between the main operator 100 and the 3D simulation image of the surgical tool, and the target pose of the 3D simulation image may be determined based on the current pose of the 3D simulation image and the pose adjustment amount. In some embodiments, there may be a proportional relationship between the pose change amount of the handle 120 of the main operator 100 and the pose adjustment amount of the 3D simulation image of the surgical tool; the proportional value may be 1, greater than 1, or less than 1. The pose adjustment amount of the 3D simulation image may be expressed relative to a 3D simulation image coordinate system or relative to the reference coordinate system. The 3D simulation image coordinate system may be defined according to actual needs; for example, its origin may be located at the center of the 3D simulation image, with the x-axis, y-axis, and z-axis defined similarly to those of the reference coordinate system so as to give an intuitive adjustment experience.
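The proportional mapping from handle pose change to image pose adjustment can be sketched as below. This is a simplified illustration (function names, the scale value, and the componentwise small-angle treatment of posture are all assumptions, not the patent's implementation):

```python
import numpy as np

SCALE = 0.5  # illustrative proportional value; may be 1, greater than 1, or less than 1

def pose_adjustment(handle_dpos, handle_dang, scale=SCALE):
    """Pose adjustment amount of the 3D simulation image: the handle's pose
    change scaled by a fixed proportional value (small-angle approximation:
    Euler-angle posture components are scaled componentwise)."""
    return scale * np.asarray(handle_dpos, float), scale * np.asarray(handle_dang, float)

def target_pose(img_pos, img_ang, handle_dpos, handle_dang, scale=SCALE):
    """Target pose of the image = current image pose + pose adjustment amount."""
    dp, da = pose_adjustment(handle_dpos, handle_dang, scale)
    return np.asarray(img_pos, float) + dp, np.asarray(img_ang, float) + da

pos, ang = target_pose([0, 0, 0], [0, 0, 0],
                       handle_dpos=[10.0, 0.0, 0.0],  # handle moved 10 mm along x
                       handle_dang=[0.0, 0.0, 0.2])   # and yawed 0.2 rad
```

With a scale of 0.5, a 10 mm handle translation and 0.2 rad yaw produce a 5 mm, 0.1 rad adjustment of the image.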
It should be appreciated that the pose relationship may include both a position relationship and a posture relationship. The position relationship between the handle of the main operator and the 3D simulation image of the surgical tool may include a relationship between the position change amount of the handle and the position adjustment amount of the 3D simulation image, and the posture relationship between the handle of the main operator and the 3D simulation image of the surgical tool may include a relationship between the posture change amount of the handle and the posture adjustment amount of the 3D simulation image.
In some embodiments, the posture change amount of the main operator may be determined based on the initial posture of the handle of the main operator relative to the reference coordinate system at time t0 and the current posture at time t1. It should be understood that, as shown in fig. 6, the reference coordinate system { W } may be the coordinate system of the space in which the main operator is located, or the world coordinate system, and may be defined according to the user's body feeling when seated in front of the console, with one axis along the somatosensory upward direction and another axis along the somatosensory forward direction. In this way, the target pose of the 3D simulation image of the surgical tool in the display device can be determined based on the posture change amount and the initial pose of the 3D simulation image in the display device relative to the 3D simulation image coordinate system or the reference coordinate system at time t0.
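The posture change amount between the two times can be computed as a relative rotation. A minimal sketch follows; the choice of the "upward" axis as the third axis and all names are illustrative assumptions:

```python
import numpy as np

def rot_about_up(angle):
    """Rotation matrix about the somatosensory upward axis of {W}
    (taken as the third axis here purely for illustration)."""
    c, s = np.cos(angle), np.sin(angle)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def posture_change(R_t0, R_t1):
    """Posture change amount between t0 and t1, expressed in {W}:
    the rotation R_delta satisfying R_t1 = R_delta @ R_t0."""
    return R_t1 @ R_t0.T

R_t0 = rot_about_up(0.2)              # initial posture of the handle at t0
R_t1 = rot_about_up(0.2 + np.pi / 6)  # current posture of the handle at t1
R_delta = posture_change(R_t0, R_t1)  # a 30-degree rotation about the up axis
```

Applying `R_delta` (optionally scaled, as discussed above) to the image's initial posture at t0 yields the target posture of the 3D simulation image.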
In some embodiments, the surgical robot may further include a master control station 500. Fig. 7 illustrates a schematic structural diagram of the master control station 500 of a surgical robot according to some embodiments of the present disclosure. As shown in fig. 7, the master control station 500 includes a master control station body 510 and a main operator (e.g., the main operator 100) of any embodiment of the present disclosure. The distal ends of the multi-degree-of-freedom robotic arms 110 of the main operator 100 are disposed on the master control station body 510, which supports the arms. For example, the master control station body 510 may include a master trolley body.
In some embodiments, as shown in fig. 7, a display device (e.g., the displays 520-540 shown in fig. 7) may be provided on the master control station body 510 for displaying an image of the operation region, a 3D simulation image of at least one surgical tool, or a 3D simulation image of other parts of the surgical robot. An image capturing device (e.g., an endoscope) of the surgical robot may capture an actual surgical image of the operation region; after processing by a video processing module, the captured actual surgical image is displayed on a display of the master control station 500. The control device may superimpose a 3D simulation image of at least a part of the surgical robot, for example a 3D simulation image of the surgical tool, on the display interface.
In the teleoperation mode, the user views the pose of the end effector 420 of the surgical tool 400 in real time through the actual surgical image in the display. The pose change the user imparts to the main operator 100 has a determinable pose relationship with the pose change of the end effector 420 of the surgical tool 400 that the user perceives in the display. In this way, by teleoperating the main operator 100, the pose transformation of the main operator 100 is converted into a pose transformation of the end effector of the surgical tool based on a preset pose relationship, thereby realizing pose control of the end effector. Thus, when the user holds the handle 120 of the main operator 100 and moves it to operate the surgical tool, the posture change amount of the end effector 420 perceived by the user remains identical to the posture change amount of the main operator 100 perceived by the user, while the position change amounts are proportional, which helps improve the user's teleoperation feel and teleoperation precision.
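This identical-posture, proportional-position mapping can be sketched in a few lines. The scale factor and names are illustrative assumptions, not the preset pose relationship actually used by the system:

```python
import numpy as np

POSITION_SCALE = 0.25  # illustrative: hand translation scaled down at the tool tip

def end_effector_change(handle_dpos, handle_dR):
    """Teleoperation mapping sketch: the end effector's posture change equals
    the handle's posture change, while its position change is proportional."""
    return POSITION_SCALE * np.asarray(handle_dpos, float), handle_dR

# A 40 mm hand translation with no rotation maps to a 10 mm tool-tip translation.
dpos, dR = end_effector_change([40.0, 0.0, 0.0], np.eye(3))
```

Scaling position down (a scale below 1) trades workspace for precision, which is one common motivation for such master-slave mappings.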
During the operation, after entering the 3D simulation image adjustment mode through a trigger key of the system, the user adjusts the display of the 3D simulation image by holding and moving the handle 120 of the main operator 100, improving the user's viewing angle and operating experience.
In some embodiments, as shown in fig. 7, the display device of the surgical robot (e.g., the display device 200) may include a stereoscopic display 520, a master external display 530, and a master touch display 540. The stereoscopic display 520 and the master external display 530 display the surgical image and system status prompts, and the touch display 540 displays the software user interface of the master control station 500. It should be understood that a display interface of the present disclosure may refer to an interface displayed on the display screen of the stereoscopic display 520 or of the master external display 530. In some embodiments, the image displayed by the stereoscopic display 520 or the master external display 530 may be determined based on the image acquired by the image acquisition device. In some embodiments, the master control station 500 may also include foot pedals (e.g., foot pedals 550-570) for gathering input from the healthcare worker's feet. For example, the foot pedals may include an electro-cutting pedal 550, an electro-coagulation pedal 560, a clutch pedal 570, and the like; a pedal may also serve as a trigger key. In some embodiments, the master control station 500 further comprises a controller. The controller may be communicatively connected to the main operator 100, the master control station displays, and the foot pedals, respectively, for signal interaction with them and for generating corresponding control instructions based on the collected control information.
Fig. 8 illustrates a schematic structural diagram of a mobile station 600 of a surgical robot according to some embodiments of the present disclosure. As shown in fig. 8, the surgical robot includes a mobile station 600. For example, the mobile station 600 may include a surgical trolley. The mobile station 600 includes a base and at least one motion arm 610 disposed on the base, and a first display interface (e.g., the display interface 210a) may be configured to display a 3D simulation image of the mobile station 600 or of the at least one motion arm 610. Figs. 9A, 9B, and 9C illustrate 3D simulation images of the mobile station 600 in different states in the display interface according to some embodiments of the present disclosure. Fig. 9A shows a 3D simulation image of the plurality of motion arms 610 of the mobile station 600 in a partially unfolded state, and fig. 9B shows the 3D simulation image of fig. 9A after being rotated 180° about a vertical axis, so that by rotating the image to different angles the user can easily view the different motion arms 610. Fig. 9C shows a 3D simulation image of the mobile station 600 from a front viewing angle, which may help the user observe the display states of the mobile station 600 and the motion arms 610 as a whole. In some embodiments, as shown in figs. 9A-9C, the display interface 210a may also include a current adjustment mode status display box 202 for displaying the current adjustment mode of the 3D simulation image; for example, the status display box 202 displaying "rotate about a vertical axis" indicates that the user is performing a left-right rotational operation of the main operator 100 about the vertical axis to adjust the display of the 3D simulation image of the mobile station 600 or the motion arm 610.
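The mode-dependent adjustment described above can be sketched as a simple dispatch: the current adjustment mode selects which component of the handle's motion drives the image. Mode names, state fields, and the zoom rule are illustrative assumptions, not the system's actual interface:

```python
def apply_adjustment(mode, image_state, handle_delta, scale=1.0):
    """Dispatch sketch: the current adjustment mode (as shown in the status
    display box) selects which component of the handle's motion updates
    the display state of the 3D simulation image."""
    state = dict(image_state)
    if mode == "rotate about a vertical axis":
        state["yaw_deg"] = (state["yaw_deg"] + scale * handle_delta.get("yaw_deg", 0.0)) % 360
    elif mode == "zoom":
        # pushing the handle forward/backward zooms the image in/out
        state["zoom"] = max(0.1, state["zoom"] * (1.0 + scale * handle_delta.get("push", 0.0)))
    return state

# Rotating the handle 180 degrees about the vertical axis turns the view of
# fig. 9A into that of fig. 9B.
state = apply_adjustment("rotate about a vertical axis",
                         {"yaw_deg": 0.0, "zoom": 1.0},
                         {"yaw_deg": 180.0})
```

Keeping each mode's update rule isolated in one branch makes it straightforward to add the other adjustments (roll, pitch, left-right or up-down movement) as further branches.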
The above is by way of example only; it should be appreciated that the user may make other display adjustments to the mobile station 600 or the motion arm 610. In some embodiments, the display interface 210a may also display menu bar options 203, which may include, but are not limited to, operation buttons such as setup, system, and user, for the user to perform different option operations.
In some embodiments, as shown in fig. 8, the surgical trolley 600 includes a control device (which may be implemented on a computer device disposed inside the surgical trolley 600), motion arms 610, a surgical trolley chassis 620, a surgical trolley case 630, a system status display 650, a main upright 660, a main beam 670, drive modules 690, and the like. The surgical trolley chassis 620 performs the movement and positioning functions of the surgical trolley 600. The surgical trolley case 630 houses the surgical trolley electronics. The display device 200 may be the system status display 650, which displays a user interface of the surgical trolley system and receives user input. For example, the display interface of the system status display 650 may be configured to display 3D simulation images of the surgical trolley 600 or of the at least one motion arm 610. The main upright 660 is liftable, and its top end is fixed to the main beam 670. The end of the main beam 670 is provided with a beam holder, the lower end of which is connected with the plurality of motion arms 610. Each motion arm 610 carries a drive module 690, which in turn carries the surgical tool 400 or an imaging tool 700 (the imaging tool 700 is, for example, a 3D electronic endoscope).
Fig. 10 illustrates a schematic structural view of a surgical robotic system 1 according to some embodiments of the present disclosure. As shown in fig. 10, the surgical robotic system 1 includes a mobile station (e.g., the mobile station 600), at least one surgical tool (e.g., the surgical tool 400), and a display system (e.g., the display system 1000) of any embodiment of the present disclosure. In some embodiments, the surgical robotic system 1 further includes a master control station 500 communicatively coupled to the mobile station 600. The display system 1000 may be provided on the master control station 500. As shown in figs. 8 and 10, the mobile station may be a surgical trolley 600 comprising a base and at least one motion arm 610 disposed on the base, with at least one surgical tool 400 detachably disposed at a distal end of the at least one motion arm 610. It should be appreciated that the surgical trolley 600 may include components such as the surgical trolley chassis 620, the surgical trolley case 630, the system status display 650, the main upright 660, the main beam 670, the drive modules 690, and the like.
In some embodiments, as shown in figs. 8 and 10, the at least one motion arm 610 may include a plurality of motion arms, with one or more surgical tools 400 or imaging tools 700 disposed at the distal ends of one or more of the motion arms 610. The surgical trolley 600 may be integrated with a plurality of surgical tools 400 and imaging tools 700; the arms 410 and end effectors 420 of the surgical tools 400 and the arm 710 and imaging module 720 of the imaging tool 700 may enter the workspace via the sheath assembly 800. In some embodiments, the surgical robotic system 1 may include a single motion arm 610, which may include a multi-stage arm and a plurality of joints connecting its stages; the distal end of the motion arm 610 may include a plurality of tool mounts, with a surgical tool 400 detachably disposed at one of the tool mounts and one or more surgical tools 400 or imaging tools 700 disposed at the remaining one or more tool mounts.
Note that the above is merely exemplary embodiments of the present disclosure and the technical principles applied. Those skilled in the art will appreciate that the present disclosure is not limited to the particular embodiments described herein, and that various obvious changes, rearrangements and substitutions can be made by those skilled in the art without departing from the scope of the disclosure. Therefore, while the present disclosure has been described in connection with the above embodiments, the present disclosure is not limited to the above embodiments, but may include many other equivalent embodiments without departing from the spirit of the present disclosure, the scope of which is determined by the scope of the appended claims.

Claims (11)

1. A display system for a surgical robot, comprising:
at least one primary operator configured to receive user input;
a display device configured to display at least one display interface, the at least one display interface comprising a first display interface configured to display a 3D simulated image of at least a portion of the surgical robot; and
a control device in communication with the at least one primary operator and the display device, the control device configured to control display adjustment of a 3D simulated image of at least a portion of the surgical robot based on the user input in a 3D simulated image adjustment mode.
2. The surgical robot display system of claim 1, wherein the display adjustment comprises at least one of: yawing left and right, rolling left and right, pitching, moving left and right, moving up and down, and zooming in and out.
3. The surgical robot display system of claim 1, further comprising:
a trigger key configured to trigger the 3D simulation image adjustment mode.
4. A display system of a surgical robot according to claim 3, wherein the main manipulator comprises a robotic arm and a handle provided at a distal end of the robotic arm, the control device being configured to obtain an initial pose and a current pose of the handle of the at least one main manipulator and to control display adjustment of the 3D simulation image based on the initial pose and the current pose in the 3D simulation image adjustment mode.
5. The display system of a surgical robot according to claim 4, wherein the control device is configured to determine a pose adjustment amount of the 3D simulation image based on a pose difference between the initial pose and the current pose.
6. The display system of a surgical robot according to claim 5, wherein the control device is configured to determine a left-right yaw adjustment amount of the 3D simulation image based on a yaw difference in the pose differences, or a left-right roll adjustment amount of the 3D simulation image based on a roll difference in the pose differences, or a pitch adjustment amount of the 3D simulation image based on a pitch difference in the pose differences, or a left-right movement adjustment amount of the 3D simulation image based on a left-right position difference in the pose differences, or a up-down movement adjustment amount of the 3D simulation image based on a up-down position difference in the pose differences, or a zoom-in/out adjustment amount of the 3D simulation image based on a front-back position difference in the pose differences.
7. The surgical robot display system of claim 3, wherein the at least one primary operator comprises a left primary operator and a right primary operator configured to receive user input together in the 3D simulated image adjustment mode.
8. The surgical robot display system of any of claims 1-7, wherein the surgical robot comprises at least one surgical tool comprising an arm and an end effector disposed at a distal end of the arm, the first display interface configured to display a 3D simulated image of the at least one surgical tool, the control device configured to disconnect a mapping relationship between the at least one primary manipulator and the at least one surgical tool in response to a 3D simulated image adjustment mode being triggered.
9. The surgical robot display system of claim 8, wherein the at least one display interface comprises a second display interface configured to display an actual surgical image, and the first display interface is superimposed on the second display interface.
10. The surgical robot display system of any one of claims 1-7, wherein the surgical robot comprises a mobile station including a base and at least one motion arm disposed on the base, the first display interface configured to display a 3D simulated image of the mobile station or the at least one motion arm.
11. A surgical robotic system, comprising:
a mobile station comprising a base and at least one motion arm disposed on the base;
at least one surgical tool removably disposed at a distal end of the at least one motion arm; and
the display system of any one of claims 1-10.