WO2024028467A1 - Système de guidage de robot médical à écran tactile intégré, et méthode de fonctionnement - Google Patents


Info

Publication number
WO2024028467A1
WO2024028467A1 (PCT/EP2023/071615)
Authority
WO
WIPO (PCT)
Prior art keywords
robot
touch display
guidance system
head
control
Application number
PCT/EP2023/071615
Other languages
German (de)
English (en)
Inventor
Sebastian Zepf
Ann-Kathrin Huber
Alisa POST
Amir Sarvestani
Original Assignee
B. Braun New Ventures GmbH
Application filed by B. Braun New Ventures GmbH
Publication of WO2024028467A1

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00: Computer-aided surgery; manipulators or robots specially adapted for use in surgery
    • A61B 34/30: Surgical robots
    • A61B 34/70: Manipulators specially adapted for use in surgery
    • A61B 34/74: Manipulators with manual electric input means
    • A61B 90/00: Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B 1/00 - A61B 50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B 90/20: Surgical microscopes characterised by non-optical aspects
    • A61B 90/25: Supports therefor
    • A61B 90/36: Image-producing devices or illumination devices not otherwise provided for
    • A61B 90/361: Image-producing devices, e.g. surgical cameras
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from the processing unit to the output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487: Interaction techniques based on GUIs using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or by the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488: Interaction techniques based on GUIs using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04886: Interaction techniques based on GUIs using a touch-screen or digitiser, by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus

Definitions

  • the present disclosure relates to a medical, in particular surgical, robot guidance system for a medical procedure, in particular a surgical procedure, on a patient.
  • the robot guidance system has a robot with a robot arm that is movably connected to a robot base and a robot head that is connected at the end to the robot arm.
  • The robot head carries an end effector, or the end effector itself forms the robot head.
  • the robot guidance system has a (central) control unit (in particular with a processor and a memory unit) which is adapted to control and move at least the robot, in particular the robot arm and the robot head (and thus in particular the end effector).
  • the robot guidance system has at least one touch display/touch screen/touch-sensitive screen, which is adapted to visually output at least one operating menu (for control), and to detect a touch-sensitive input as an operating input and to send this to the control unit, in particular to control the robot.
  • the disclosure relates to a (robot) operating method and a computer-readable storage medium according to the preambles of the independent claims.
  • Surgical guidance systems are increasingly being used during an operation, particularly a minimally invasive procedure. Due to technological development and the increasing specialization of various subsystems with corresponding integration, the number of functions of medical systems is growing significantly. This progressive expansion of guidance systems to include such a variety of different functionalities makes it increasingly difficult to provide the user with a central operating modality that allows him to maintain an overview of the functions and their control.
  • Touch displays are currently provided on, for example, a medical cart, a medical tower or on the base of a medical (surgical) microscope, via which a medical specialist can make an operating input for a corresponding control by means of an operating menu.
  • Such a touch display is not located centrally in the area of the procedure, but at a distance from it. This hampers accessibility on the one hand and raises hygiene issues on the other, since the touch display is not designed to be sterile. Due to the distance and the sterility requirement, the surgeon conducting the procedure is usually unable to operate the touch display himself.
  • US 2005/0041282 A1 discloses, for example, a surgical microscope as a robot guidance system with a touch display that is rigidly attached to a base/carriage of the surgical microscope.
  • Using the touch-sensitive display, different functions can be controlled via different areas of the display.
  • Due to this configuration, the operator must give instructions to another medical specialist in order to control a function, so that this specialist then carries out the control.
  • the operation should bring together different operating modalities centrally and should preferably be provided in an area that is easily accessible to the surgeon.
  • a basic idea of the present disclosure therefore envisages providing an operating modality, such as controlling the robot, in the area of a robot head or an end effector.
  • The touch display is thus not provided as an input and output unit at a separate, remote location, such as on a medical tower or similar, but directly on a movable part of the robot, namely on the terminal robot head and thus in the area of the end effector.
  • The at least one touch display is rigidly attached/fixed to the robot head, in particular to the end effector, and moves with it.
  • the touch display on the robot head is moved dynamically.
  • Since the end effector is arranged on the robot head, or the end effector itself forms the robot head, the operating modality in the form of the touch display is located directly in the area of the end effector and can be reached easily and safely by a surgeon.
  • the surgeon can then control various functions of the operating modalities centrally, so to speak, or is provided with the option of controlling at least one movement of the robot and thus the robot head centrally in the area of the end effector.
  • the surgeon can control different functions via the touch display, for example by displaying an associated operating menu for the corresponding function.
  • the disclosure describes a touch display/touchscreen/touch-sensitive screen embedded in a robot-assisted guidance system/robotic guidance system for medical procedures.
  • the touch display is data-connected to a central control unit (as an execution unit for software) and enables flexible control of various functions, for example in non-sterile and sterile environments, as well as visualization of information.
  • the robot guidance system can, for example, include both visualization-based guidance (e.g. a surgical microscope) and guidance of instruments (e.g. a trocar).
  • An embedded touch display on a robotic end effector (on or as a robot head) is proposed here, which enables a user to be flexibly provided with information in both sterile and non-sterile environments (e.g. also via an external monitor) and offers various control options.
  • The touch display is rigidly connected to the robot end effector and is in particular aligned so that the surgeon has a good viewing angle in the most common surgical positions of the guidance system.
  • the touch display can show both interactive content (such as the operating menu) and non-interactive content (such as an annotation).
  • The visualized content can, for example, be the same content as on a larger main monitor of the surgeon (not located on the robot arm) or on other control displays that are not attached to the robot arm.
  • Independent, in particular situation-dependent, content can preferably also be displayed by the touch display.
  • position means a geometric position in three-dimensional space, which is specified in particular using coordinates of a Cartesian coordinate system.
  • the position can be specified by the three coordinates X, Y and Z.
  • Orientation indicates an alignment in space.
  • In particular, orientation indicates an alignment with a direction or rotation specification in three-dimensional space.
  • the orientation can be specified using three angles.
  • A pose (location) includes both a position and an orientation.
  • The pose can therefore be specified using six coordinates: three position coordinates X, Y and Z and three angular coordinates for the orientation.
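The six-coordinate pose convention above can be sketched as a small data structure. This is an illustrative sketch only, not part of the patent; the field names and angle convention are assumptions.

```python
from dataclasses import dataclass

@dataclass
class Pose:
    """A pose: position (X, Y, Z) plus orientation (three angles), six coordinates in total."""
    x: float   # position coordinates in a Cartesian coordinate system
    y: float
    z: float
    rx: float  # orientation as three angles (e.g. rotations about X, Y, Z)
    ry: float
    rz: float

    def coordinates(self):
        """Return the six coordinates that fully specify the pose."""
        return (self.x, self.y, self.z, self.rx, self.ry, self.rz)
```

Any pose of the robot head can then be described by one such six-tuple, and a pure position or a pure orientation is simply a projection onto three of the six coordinates.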
  • An operating input can, for example, be a control command associated with a selection of the operating menu, which is sent to the control unit so that it carries out the corresponding function directly or indirectly via, for example, a control unit of a subsystem.
  • For example, a central control unit can perform a control indirectly via a sub-control unit of a visualization system, or indirectly via a sub-control unit of a navigation system and, for example, have a waypoint set.
  • the robot guidance system can be in the form of a (surgical) operating microscope with a microscope head connected to the robot arm as a robot head or in the form of a navigation system with a camera system connected to the robot arm, in particular with a laser system.
  • The microscope head can be actively controlled and moved, and an operator can make operating inputs directly on the microscope head via the touch display, for example with regard to a zoom, an alignment and/or an illumination and/or a movement of the microscope head (change of pose).
  • the view of the touch display and/or the view on the navigation monitor can be changed and adjusted in order to obtain a better overview or to be able to track instruments even better.
  • The robot arm of the robot can be configured in such a way that the robot head is adjustable both in its position and in its orientation, i.e. in its pose (six degrees of freedom/6DOF).
  • the robot arm can have at least a first and a second robot arm segment, which are connected to one another via a joint and the robot head can be connected to the robot arm via a further joint and the robot arm can be connected to the robot base via an additional joint.
  • The joint between the first and second robot arm segments can have a rotational degree of freedom for rotation about an axis of rotation, wherein this axis of rotation can be arranged, in a kinematic position of the robot arm, in particular in a horizontal direction (i.e. perpendicular to a top-bottom direction), in order to provide a kind of boom (similar to an excavator arm).
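The boom arrangement described above (two segments joined by a horizontal-axis rotational joint, similar to an excavator arm) can be illustrated with planar forward kinematics. The segment lengths and the relative-angle convention are assumptions for illustration, not taken from the patent.

```python
import math

def boom_endpoint(l1, l2, theta1_deg, theta2_deg):
    """Planar forward kinematics of a two-segment boom whose joints rotate
    about parallel horizontal axes: returns the tip position (reach x,
    height z) of the second segment in the vertical plane."""
    t1 = math.radians(theta1_deg)
    # The second joint angle is measured relative to the first segment.
    t2 = math.radians(theta1_deg + theta2_deg)
    x = l1 * math.cos(t1) + l2 * math.cos(t2)
    z = l1 * math.sin(t1) + l2 * math.sin(t2)
    return (x, z)
```

A real articulated-arm robot adds further joints at the robot base and the robot head to reach the full six degrees of freedom.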
  • the robot can be designed in the form of an articulated arm robot.
  • the robot head is designed to be rigid and connected to the robot arm via a bearing or joint.
  • the robot arm can have at least three robot arm segments, each of which is connected to one another via a joint.
  • the robot (or the robot's control unit) can control the position and orientation of the robot head via multi-unit actuated kinematics.
  • the surgical microscope is preferably a digital microscope, which creates and digitally provides a digital microscope image using a sensor, such as a CMOS sensor.
  • the touch display can be arranged on a lateral side of the robot head (i.e. not a rear side in an extension of a longitudinal axis of the robot head).
  • the touch display can be arranged on the robot head in such a way that a normal to the display surface is essentially perpendicular to a longitudinal axis of the robot head, such as a visual axis of a surgical microscope.
  • the touch display can be arranged laterally (i.e. not on a rear side) in an area opposite the optical output.
  • the touch display of a microscope head can be arranged on a side facing away from the optical output, i.e. virtually opposite (one end face is adapted for the optical output, in particular has a lens of an optical system, the opposite end face has the touch display).
  • the at least one touch display is arranged, so to speak, “at the top” of the microscope (camera) head on a lateral and/or upper side, while the optical output is provided “at the bottom”.
  • An optional control button or actuation button, or an input means in the form of a joystick or a 3D mouse, can be provided at an axial position between the optical output (bottom) and the touch display (top), in particular directly below the touch display, as seen in the longitudinal axis direction of the robot head.
  • the robot arm can also preferably be connected to the robot head via a joint between the optical output (bottom) and the touch display (top), viewed in the longitudinal axis direction of the robot head.
  • the touch display represents the “top” or most terminal element or component, which is provided laterally and/or at the front of the robot head.
  • the touch display can have a round outer contour or shape, in particular a circular outer contour or shape.
  • The robot head can have a cylindrical base body in the form of a microscope head, with one round end face forming the optical output for the digital microscope or the microscope camera and the opposite round end face carrying the circular touch display.
  • An additional input means, for example in the form of a joystick or a 3D mouse, can be provided in extension of, or at the same height as, the connection of the robot arm, on the opposite side. If the robot head is connected to the robot arm on one lateral side via a joint, in particular a swivel joint, then the input means is provided on the opposite lateral side, starting from the connection, along an axis perpendicular to an optical axis (or, equivalently, perpendicular to a longitudinal axis) of the robot head.
  • the robot guidance system can be adapted to display a menu structure for accessing various functionalities via the touch display.
  • Exemplary applications of the touch display are listed below. Not only a single operating menu or a single display can be shown on the touch display; rather, the surgeon is provided with a large number of operating menus to control various functions. In particular, it is possible to switch from a top-level menu structure to several submenu structures and back.
  • The robot guidance system can be adapted to output a robot control menu as an operating menu via the touch display in order to control the robot via an operating input (a movement), in particular to control a robot movement in six degrees of freedom, for example in six translational directions (the two opposite directions along each axis of a Cartesian coordinate system: +X/-X, +Y/-Y, +Z/-Z) and/or six rotational directions (clockwise or counterclockwise rotations about the respective axis).
  • The touch display can be adapted to display an (operating) menu structure for accessing this function of controlling robot movements, in particular in six degrees of freedom.
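A robot control menu of this kind could map its twelve directional buttons to signed increments along the six pose coordinates. The button labels, step sizes and pose representation below are illustrative assumptions, not taken from the patent.

```python
# Hypothetical mapping from robot-control-menu buttons to jog increments
# along the six translational and six rotational directions.
JOG_STEP_MM = 1.0   # translation step per button tap (assumption)
JOG_STEP_DEG = 1.0  # rotation step per button tap (assumption)

JOG_COMMANDS = {
    "+X": (0, +JOG_STEP_MM),  "-X": (0, -JOG_STEP_MM),
    "+Y": (1, +JOG_STEP_MM),  "-Y": (1, -JOG_STEP_MM),
    "+Z": (2, +JOG_STEP_MM),  "-Z": (2, -JOG_STEP_MM),
    "+RX": (3, +JOG_STEP_DEG), "-RX": (3, -JOG_STEP_DEG),
    "+RY": (4, +JOG_STEP_DEG), "-RY": (4, -JOG_STEP_DEG),
    "+RZ": (5, +JOG_STEP_DEG), "-RZ": (5, -JOG_STEP_DEG),
}

def apply_jog(pose, button):
    """Return the pose (a list of six coordinates: X, Y, Z, RX, RY, RZ)
    after one tap on the given jog button."""
    axis, delta = JOG_COMMANDS[button]
    new_pose = list(pose)
    new_pose[axis] += delta
    return new_pose
```

The control unit would then translate each incremented target pose into actuator commands for the robot arm.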
  • the robot guidance system can be adapted to output a visualization control menu as an operating menu via the touch display in order to control/change settings of a visualization system, in particular a zoom, a focus and/or an illumination intensity (as settings).
  • The touch display can be adapted to display a menu structure for accessing the function of adjusting settings of a visualization system (e.g. a surgical microscope), such as zoom, focus and light intensity, in particular via a touch bar/slider.
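A touch bar/slider of this kind typically reports a normalized position that must be mapped onto the parameter range of the visualization setting. A minimal sketch, with the parameter ranges as assumptions:

```python
def slider_to_setting(slider_pos, min_val, max_val):
    """Map a normalized touch-slider position (0.0 .. 1.0) onto a
    visualization parameter range, clamping out-of-range input."""
    slider_pos = max(0.0, min(1.0, slider_pos))
    return min_val + slider_pos * (max_val - min_val)

# Example ranges (assumptions): light intensity 0..100 %, zoom factor 1x..10x.
```

The same mapping works for zoom, focus and illumination intensity; only the range endpoints differ per setting.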
  • The robot guidance system can be adapted to switch between at least two different control menus, in particular between at least one robot control menu and one visualization control menu, in order to control at least two different functions of the robot guidance system, in particular a movement of the robot as a first function and a visualization as a second function.
  • The robot guidance system can be adapted to output a visualization control menu as an operating menu via the touch display, which allows control of various light/imaging modes of the visualization system, in particular control of fluorescence imaging, for example for ICG (indocyanine green) or 5-ALA (5-aminolevulinic acid).
  • The robot guidance system can also be adapted to output a navigation control menu as an operating menu via the touch display, which allows waypoints and/or robot configurations and/or navigation positions in relation to the patient (or a patient position) to be saved, and in particular provides a so-called "Hold and Drive" function, in which the robot moves to the saved points or predefined positions upon a long press on one of the saved and displayed entries.
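The "Hold and Drive" behaviour (motion toward a saved point only while a long press is held) could be implemented along the following lines. The long-press threshold and the interfaces are assumptions for illustration, not from the patent.

```python
LONG_PRESS_S = 1.0  # threshold for a "long press", in seconds (assumption)

class HoldAndDrive:
    """Sketch of the 'Hold and Drive' idea: the robot drives toward a saved
    waypoint only while a long press on that waypoint's entry is held."""

    def __init__(self, waypoints):
        self.waypoints = waypoints  # name -> saved pose
        self._pressed_at = None
        self._target = None

    def press(self, name, t):
        """Record that the entry `name` was touched at time t (seconds)."""
        self._pressed_at = t
        self._target = name

    def target_if_driving(self, t):
        """Return the waypoint pose to drive toward at time t, or None while
        the press is still shorter than the long-press threshold."""
        if self._pressed_at is None:
            return None
        if t - self._pressed_at >= LONG_PRESS_S:
            return self.waypoints[self._target]
        return None

    def release(self):
        """Releasing the finger stops the motion immediately."""
        self._pressed_at = None
```

The safety-relevant property is that motion requires a sustained press: lifting the finger calls `release()` and the robot stops.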
  • the robot guidance system can also be adapted to output a navigation control menu as an operating menu via the touch display, which allows or provides control of navigation processes, in particular point digitization and verification for patient registration and calibration/activation of tools.
  • The robot guidance system can also be adapted to output an instrument control menu as an operating menu via the touch display, which provides control of instrument guidance functions, such as driving the robot or the instrument as end effector along a target trajectory, or switching an instrument function on or off.
  • the robot guidance system can also be adapted to output a media control menu as an operating menu via the touch display, which allows control of the recording of image recordings (e.g. as current snapshots) and/or video data of a visualization system.
  • the surgeon can, for example, create a recording at an initial point in time using an operator input and have this recording displayed on the touch display or an external monitor at a later point in time, for example to make a before-and-after comparison or to call up information about the surgical site.
  • The robot guidance system can also be adapted to output a media control menu as an operating menu via the touch display, which allows control of the playback and management of the recorded media data, with the reproduced video being shown in particular on other displays (in addition or as an alternative to the touch display).
  • The robot guidance system can also be adapted to output a navigation control menu as an operating menu via the touch display, which provides control of settings for the information displayed on the visualization monitor, in particular showing or hiding planned trajectories, operation targets or other navigation information.
  • the robot guidance system can also be adapted to output a navigation control menu as an operating menu via the touch display, which provides control for switching between different monitor layouts of the main visualization monitor.
  • Situation-dependent content can be shown based on information from the control unit/control device in order to save the user time when navigating through the operating menu.
  • The situation-dependent content can depend, for example, on a robot configuration or on a current status of an operation plan:
  • - Settings of a visualization system can be highlighted after the guidance system has been repositioned to allow quick adjustment of visualization parameters to the new position;
  • the instrument guidance settings can be highlighted;
  • A display mode of the touch display can be switched to a so-called "trackpad" mode, which: o activates a mouse pointer on the main visualization screen, in particular a surgical monitor, and thus allows the user to operate the visualization screen like a computer screen, with the touch display acting as a laptop-style trackpad; or o allows the user to scroll through the navigation views/slices (segmentations) on the main visualization screen, in particular the surgical monitor.
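In such a trackpad mode, finger movements on the small robot-head display are translated into cursor movements on the main visualization screen. A minimal sketch; the gain factor and coordinate conventions are assumptions:

```python
def trackpad_move(cursor, delta, screen_size, gain=2.0):
    """Translate a finger movement `delta` on the robot-head touch display
    into a cursor movement on the main visualization screen, scaling by a
    gain factor (assumption) and clamping to the screen bounds."""
    x = min(max(cursor[0] + gain * delta[0], 0), screen_size[0] - 1)
    y = min(max(cursor[1] + gain * delta[1], 0), screen_size[1] - 1)
    return (x, y)
```

The same delta stream could alternatively be interpreted as a scroll offset through navigation slices instead of a cursor position.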
  • the touch display fixed to the robot head can have a sterile casing, which is preferably designed to be interchangeable in order to provide a sterile barrier to a sterile entry point.
  • the touch display can be covered with a medical (surgical) cloth with transparency, with connection points/coupling points in particular being provided for the surgical cloth.
  • An inertial measurement unit (IMU) can also be provided on or in the robot head, in particular on or in the touch display, or attached to the end effector, in order to detect a position and/or orientation of the touch display or the robot head; the robot guidance system, in particular the control unit, can be adapted to adapt an orientation of a visual output of the touch display based on the detected position and/or orientation, in particular to adapt an output (e.g. of the operating menu) so that it always remains in a consistent horizontal orientation.
  • an inertial measurement unit can optionally be connected to the touch display or the robot head or the end effector in order to be able to change the orientation of the visualized content in different positions and/or orientations, in particular extreme positions, of the guidance system.
  • A position and/or orientation of the touch display or of the robot head can be recorded.
  • a position and/or orientation of the touch display or the robot head can preferably be recorded via the navigation camera of the navigation system and corresponding data processing.
  • The control unit can be adapted to adapt a display based on the detected position of the touch display so that it is displayed optimally relative to a predetermined position in space, namely the position at which the head of the operator is located (this position can be detected, for example, by a navigation camera).
  • This can in particular include rotating the view of the touch display so that it is displayed approximately horizontally relative to the floor of the operating room, and thus relative to the surgeon, as well as stretching or compressing the view in order to provide the most neutral, natural view possible for the operator's recorded position (similar to arrows or labels on a road marking, which are stretched for the driver so that the driver can recognize the information as well as possible).
  • The control unit or the touch display can be adapted to output via the touch display such a projection onto a virtual surface, the surface being perpendicular to a visual axis between the operator and the touch display, so that the operator does not see a distorted representation (when the touch display is at an angle to the line of sight), but rather a representation similar to looking perpendicularly at the touch display.
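Keeping the operating menu horizontal from IMU data, as described above, amounts to counter-rotating the rendered content by the detected roll of the robot head. A minimal sketch; the angle convention (roll in degrees, positive counterclockwise) is an assumption:

```python
def display_rotation_deg(imu_roll_deg):
    """Rotation to apply to the rendered content so that it stays
    horizontal: counter-rotate by the roll detected for the robot head,
    normalized into the interval (-180, 180]."""
    rot = -imu_roll_deg % 360.0
    if rot > 180.0:
        rot -= 360.0
    return rot
```

The projection onto a surface perpendicular to the visual axis would additionally apply a perspective (stretch/compress) transform on top of this rotation.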
  • the touch display can preferably have a radio connection module in order to (independently) establish a wireless data connection to the control unit, in particular via WLAN or Bluetooth.
  • the touch display can also be connected to the control unit via a data cable.
  • In order to control various functionalities of the guidance system and to display content situation-dependently, the touch display can be connected to the control unit (as a computer system) via a touch display control device, by cable or wirelessly (with a radio connection module).
  • The touch display itself can have an independent sub-control unit as a computer system, forming a self-contained control component that can be integrated into the overall system and linked to the (central) control unit in terms of data technology.
  • the touch display can have a display diagonal or a display diameter of at least 4 cm and/or a maximum of 20 cm.
  • the size of the touch display can be between 4 cm and 20 cm.
  • the touch display can have a square, rectangular or round form factor. This size and shape can be based, for example, on the number of functions and the size of the connected robot effector. The size and shape help to advantageously integrate the touch display into the area of the end effector.
  • the robot head can have at least one (physical) actuation button (hardware button), preferably three actuation buttons, and the touch display can be arranged directly adjacent to the button and adapted to display the current functional status (/the current functionality) of the button.
  • the touch display has a color display to display color operating menus and/or information.
  • a touch display in addition to the touch display on the robot head, can be provided on the robot arm and/or on a robot base.
  • the robot guidance system can be designed as a robot guidance unit and all components can be integrated into this unit or a single module.
  • the robot guidance system can be designed to be mobile and be provided and moved independently in an operating room, for example in the form of a mobile surgical microscope, on whose microscope head (as a robot head and end effector) the touch display is arranged.
  • With regard to a medical robot operating method, in particular for a robot guidance system according to the present disclosure, the tasks are solved by the steps: outputting at least one visual operating menu via a touch display that is rigidly attached to a robot head of a robot; capturing a touch-sensitive input as an operating input via the touch display; sending the operating input to a control unit adapted to control at least the robot; and controlling, based on the operating input, a function, in particular a movement of the robot, by the control unit.
  • These steps can provide a flexible and central operating method by means of which the operator can carry out different functions, for example a movement of the robot or a setting of a visualization system.
  • the (operating) method can further comprise the steps:
  • Detecting a (change in) position and orientation of the robot head, in particular by an inertial measurement unit (IMU), by detected robot kinematics, or by a navigation camera of a navigation system; sending the captured position and orientation to the control unit; calculating an orientation of the visual operating menu adapted to the position and orientation; and outputting the adapted visual representation of the operating menu.
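The basic operating-method steps (output menu, capture input, send to control unit, execute) can be sketched as one cycle of a control loop. The display and control-unit interfaces below are hypothetical placeholders, not from the patent.

```python
def operating_method_step(touch_display, control_unit):
    """One cycle of the operating method: output the operating menu,
    capture a touch-sensitive input, and forward it to the control unit,
    which then carries out the corresponding function."""
    touch_display.show_menu()                     # step 1: output visual operating menu
    operating_input = touch_display.read_touch()  # step 2: capture touch input
    if operating_input is not None:
        control_unit.execute(operating_input)     # steps 3 + 4: send and control
```

In a real system this loop would run continuously, with `execute` dispatching to sub-control units (robot, visualization, navigation, media) depending on the active menu.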
  • In this way, the surgeon can be provided with the best possible or best-adapted visual output; in particular, he does not have to turn his head to read a label or piece of information, or reposition himself, in order to recognize, for example, a text well when looking at the touch display from an oblique angle.
  • With regard to the computer-readable storage medium, the tasks are solved in that it comprises instructions which, when executed by a computer, cause the computer to carry out the steps of the operating method according to the present disclosure.
  • the control unit of the robot guidance system can include such a computer-readable storage medium.
  • FIG. 1 is a schematic front view of a robot guidance system according to a preferred embodiment of the present disclosure
  • FIG. 2 shows a schematic view of a functional relationship between the touch display, control unit and IMU of a robot guidance system according to a further preferred embodiment
  • Figs. 3a and 3b show an exemplary view of an operating menu in a table-like arrangement and in a circular arrangement
  • FIG. 4 shows a schematic view of an exemplary submenu of a visualization control menu for setting a light intensity
  • FIG. 5 is a schematic view of an exemplary submenu of a media control menu for a recorded video management and playback function
  • FIG. 6 shows a schematic view of an operating menu with only a stop button as an interactive input
  • FIG. 7 shows a schematic front view of a robot head or end effector with three buttons and a touch display
  • FIG. 8 is a schematic front view of a robot guidance system according to another preferred embodiment of the present disclosure.
  • FIGS. 9 to 11 show different views of a robot guidance system with a robot head with a touch display, which is arranged at the top of the microscope head on a side facing away from the optical output;
  • FIG. 12 shows a flowchart of an operating method according to a preferred embodiment.
  • Fig. 1 shows a schematic front view of a medical robot guidance system 1 (hereinafter referred to as guidance system) for a surgical procedure on a patient P.
  • The guidance system 1 has a robot 2 with a movable robot arm 4 and a robot head 6 connected at the end to the robot arm 4.
  • the guidance system 1 is designed in the form of a robot-guided surgical microscope.
  • the microscope head is provided as the robot head on the robot arm 4 with several robot arm segments. Therefore, the robot head or microscope head as a whole can also be referred to as the end effector 8 of the surgical microscope.
  • the guidance system 1 has a control unit 10.
  • the control unit 10 can be designed as a central control unit, for example as a computer system that can process and control different functions, or it can have a subsystem of a robot control unit in order to control the robot accordingly.
  • the guidance system 1 has a touch display 12 as an input and output unit. This is adapted to visually output different operating menus 14 and displays for different functions, and to record a touch-sensitive input as an operating input and to send this to the control unit 10 in order to both control the robot 2 and make settings on the visualization system.
  • the touch display for control is not attached to a static base of the guidance system 1; instead, the touch display 12 is rigidly fixed directly to the robot head 6 or to the end effector 8 and moves with it.
  • the touch display 12 moves in the area of the procedure and the surgeon has a central visualization of different operating menus 14 and can flexibly control the different functions of different sub-systems via a single touch display.
  • the operator can select from at least one robot control menu and visualization control menu or switch between the individual control menus as operating menus 14.
  • the surgeon can, for example, guide the instrument or move the robot head 6 so that the end effector 8 is in a better position for the procedure.
  • the microscope head as an end effector can be moved manually or automatically via the operating input of the touch display 12 into a position that allows a better view.
  • Additional recordings can be output via an external (surgical) monitor 15, with media output being controllable via a further media control menu via the touch display 12.
  • a current image of the microscope can be output via the surgical monitor 15, with brightness or color contrast being adjustable via a media control menu.
  • the operator is provided with at least three control menus: a robot control menu, a visualization control menu and a media control menu, which he can operate centrally in order to control various functions flexibly and safely.
  • Fig. 2 shows a schematic functional view of an interaction between individual modules of a guidance system 1 according to a further, second preferred embodiment.
  • an inertial measurement unit (IMU) 16 is arranged directly on the touch display 12 in order to detect a movement (or a change in movement via an acceleration) and, starting from an initial position, to determine a new position and orientation of the touch display 12 after a movement of the robot.
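The IMU-based pose update described above can be illustrated with a minimal planar dead-reckoning sketch; the function and its simple Euler integration are illustrative assumptions (a real implementation would fuse full 3D gyroscope and accelerometer data and correct for drift):

```python
import math

def update_pose(pose, gyro_z, accel, dt):
    """Dead-reckon a planar pose of the touch display from one IMU sample.

    pose:   (x, y, theta, vx, vy), theta in radians
    gyro_z: angular rate about the display normal [rad/s]
    accel:  (ax, ay) acceleration in the display frame [m/s^2]
    dt:     sample period [s]
    """
    x, y, theta, vx, vy = pose
    theta += gyro_z * dt  # integrate angular rate into orientation
    # rotate the body-frame acceleration into the world frame
    ax = accel[0] * math.cos(theta) - accel[1] * math.sin(theta)
    ay = accel[0] * math.sin(theta) + accel[1] * math.cos(theta)
    vx += ax * dt  # acceleration -> velocity
    vy += ay * dt
    x += vx * dt   # velocity -> position
    y += vy * dt
    return (x, y, theta, vx, vy)
```

Starting from a known initial pose, repeated calls with successive IMU samples yield the new position and orientation after a robot movement.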
  • the IMU data is sent directly to the central control unit 10.
  • the central control unit 10 receives data from an end effector control unit 18 (as a sub-control unit).
  • the central control unit 10 has a mutual data connection with a touch display control unit (as a sub-control unit). The central control unit 10 then calculates a visual output based on this data, which is then output on the touch display 12.
  • since the touch display is no longer arranged statically but moves dynamically with the end effector 8, the orientation of the display changes relative to the operator. For example, if in Fig. 1 the robot arm 4 is pivoted upwards by 90°, the visual view of the touch display 12 is also rotated 90° counterclockwise relative to the surgeon.
  • the control unit 10 calculates such an orientation, in the above example a rotation of the displayed content relative to the touch display by 90° clockwise, so that the rotation and the counter-rotation cancel each other out and intuitive reading is possible for the surgeon.
  • the control unit calculates an orientation of the displayed content that appears as consistent as possible to the operator and thus, to a certain extent, maintains a constant relationship between the display of the touch display and the operator.
  • when changing from a front view of the touch display 12 (i.e. a perpendicular view) to an oblique view, the visual output can also be displayed correspondingly distorted (similar to a road marking in the form of lettering, which is also displayed elongated in order to provide the driver with the best possible view).
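The counter-rotation described above can be sketched as a small helper; the function name and the optional snapping to 90-degree steps are assumptions for illustration, not part of the original disclosure:

```python
def ui_counter_rotation(display_rotation_deg, snap=True):
    """Rotation to apply to the rendered UI so that the physical rotation
    of the touch display and the counter-rotation of the content cancel
    each other out and the menu stays upright for the operator.

    display_rotation_deg: physical rotation of the touch display about its
        normal, e.g. derived from the IMU orientation (assumed input).
    snap: optionally snap to 90-degree steps, a common UI simplification.
    """
    comp = (-display_rotation_deg) % 360.0
    if snap:
        comp = (round(comp / 90.0) * 90) % 360
    return comp
```

For the example in the text, a display rotated 90° counterclockwise relative to the surgeon yields a 270° (i.e. 90° clockwise) compensation of the content.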
  • Figs. 3a and 3b are exemplary views of an operating menu shown by the touch display, with a table-shaped arrangement of rectangular, symbol-labeled (touch) buttons shown in Fig. 3a and a circular arrangement of symbol-labeled (touch) buttons shown in Fig. 3b.
  • the user can, for example, select a sub-menu, which is shown in FIGS. 4 and 5 and explained below.
  • Fig. 4 shows a sub-menu of a lighting control menu when the light-bulb symbol button is selected in the main menu.
  • the user can adjust a brightness of white light as well as an intensity of UV radiation and IR radiation.
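The state behind such a lighting sub-menu can be sketched as follows; the class and method names are hypothetical, only the three adjustable channels (white light brightness, UV intensity, IR intensity) come from the description:

```python
from dataclasses import dataclass

@dataclass
class LightingState:
    """Illustrative lighting sub-menu state: each channel in percent (0-100)."""
    white: int = 50  # white light brightness
    uv: int = 0      # UV radiation intensity
    ir: int = 0      # IR radiation intensity

    def set_channel(self, name, value):
        # clamp slider input from the touch display to the valid range
        setattr(self, name, max(0, min(100, int(value))))
```

A touch input on a slider would then simply call `set_channel`, and the control unit forwards the clamped value to the illumination hardware.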
  • the menu jumps to the sub-menu of the video function shown in FIG. 5, in which video data can be managed and playback functions can be selected.
  • the video can be played on an external surgical monitor (not shown here).
  • FIG. 6 shows a schematic view of a (digital) stop button on the touch display 12 in order to stop a movement of the robot 2 or of the robot arm 4 while it is in motion (a kind of emergency stop button).
  • Fig. 7 shows a front view of an end effector of a robot guidance system 1 of a further preferred embodiment.
  • three physical pushbuttons/buttons/actuation buttons 22 are provided on the end effector 8 on a straight line.
  • a touch display 12 is attached to the end effector, which displays the function status above the corresponding actuation button 22. In this way, several functions can be assigned to the three actuation buttons 22 and visualized accordingly by the touch display.
  • the control unit 10 can assign different functions to the actuation buttons for different steps in the operation plan and have them output via the touch display.
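This step-dependent assignment of functions to the three physical buttons can be sketched as a simple lookup; the step names and button labels are invented placeholders, only the count of three buttons comes from the description:

```python
# hypothetical mapping of operation-plan steps to the three physical buttons
BUTTON_MAPS = {
    "positioning": ["move", "stop", "home"],
    "recording":   ["record", "pause", "snapshot"],
}

def assign_buttons(step, maps=BUTTON_MAPS):
    """Return the three labels the touch display shows above the
    actuation buttons for the current step of the operation plan."""
    labels = maps[step]
    assert len(labels) == 3, "end effector has exactly three buttons"
    return labels
```

On a step change, the control unit would re-render the labels and dispatch each button press to the function currently assigned to it.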
  • the robot 2 has a stationary robot base 24 to which the robot arm 4 is movably connected.
  • the robot head 6 is in turn connected to the end of the robot arm 4 in the form of a (digital) microscope head as end effector 8, so that the robot arm 4 can adjust both the position and the orientation (i.e. the spatial pose) of the microscope head in order to assume a suitable recording position and create a digital microscope image.
  • the optical axis of the microscope head is shown as a (vertical) dashed line, which passes through the optical system (not shown) and, in its extension, a CMOS sensor for a digital recording.
  • the robot arm 4 has several robot arm segments 26, each of which is connected to one another via a joint 28.
  • the robot head 6 with the optical system is also connected to the robot arm 4 via a further joint 28 and the robot arm 4 is connected to the robot base 24 via a joint 28.
  • the joint of the first and second robot arm segments 26 can have a rotational degree of freedom for rotation about a rotation axis, wherein this rotation axis can be arranged, in the kinematic position of the robot arm 4 shown in FIG. 8, in particular in a horizontal direction (i.e. perpendicular to a top-bottom direction), in order to provide a type of boom.
  • the robot head 6 is rigid (or has a rigid robot head housing in which the optical system, the downstream CMOS sensor and other electronics are housed).
  • the robot 2 (or the control unit of the robot) can therefore control the position and orientation of the robot head 6 via multi-unit actuated kinematics and adjust the optical axis accordingly.
  • the touch display 12 is arranged on a lateral side of the robot head 6 (i.e. not on a rear side in an extension of a longitudinal axis of the robot head).
  • the touch display can be arranged on the robot head in such a way that a normal to the display surface is essentially perpendicular to a longitudinal axis of the robot head, such as a visual axis of a surgical microscope.
  • the touch display 12 is arranged so that it is at the top of the robot head 6 as seen in FIG. 8. While in Fig. 8 a lower side forms the optical opening for the optical system of the digital microscope, the touch display is arranged in an upper area.
  • the optical output is provided in a first, lower region of the robot head 6, while the touch display is provided on a second upper region of the robot head, the first and second regions forming terminal regions facing away from one another.
  • the touch display 12 is therefore arranged in such a way that during a usual operation, in which the surgical microscope or the microscope head looks down on the patient from above, a recording is created and the laterally arranged touch display is particularly easy to see and operate for a medical specialist.
  • the actuation buttons 22 are arranged below the touch display 12 as seen in FIG. 8, so that the touch display forms, so to speak, the topmost element. Seen along a longitudinal axis of the robot head 6, the following are provided one after the other in this axial order: touch display 12, actuation buttons 22, optical output.
  • the robot arm 4 is connected laterally to the robot head 6 via the joint 28 and the three dashed rotation indications are intended to show that the robot head can adjust an orientation around three axes or has three degrees of freedom of rotation.
  • Figs. 9 to 11 show a medical robot guidance system 1 according to a further preferred embodiment in a side view, in a perspective top view and in a perspective isometric view.
  • the robot 2 is designed in the form of a robot-guided surgical microscope, the robot head 6 of which forms the end effector 8 as the microscope head and a position and orientation of the microscope head can be adjusted via the robot arm 4.
  • the optical output is provided on a lower side as seen in FIG. 9, the optical axis being shown in dashed lines.
  • the touch display 12 is arranged on the side opposite the optical output.
  • the touch display 12 is designed to be circular, with a center point of the circular touch display 12 lying essentially in the extension of the optical axis, concentric with it, so to speak. In other words, the normal to the display is parallel to the optical axis, in particular lies on the optical axis (the optical system and the touch display are, so to speak, symmetrical to each other).
  • the touch display 12 forms a front, upper, flat surface and is offset from the rest of the microscope head (and even from the robot arm 4), i.e. forms a projection, in order to enable good operation. Seen in the direction of the longitudinal axis of the robot head or the optical axis, from top to bottom, first the touch display 12 is provided, then an input means 30 in the form of a 3D mouse or 3D space mouse, and then the optical output (with, for instance, the last lens of the optical system). The input means 30 is also arranged in extension of a longitudinal axis of a cylindrical arm with a swivel joint 28 to a robot arm segment 26, on a side opposite the connection to the robot arm 4. A zero axis of the 3D mouse is concentric with an axis of the robot arm segment 26.
  • further functions can be controlled using further actuation buttons 22.
  • the robot head 6 is connected to the robot arm 4 (or to the robot arm segment 26) by means of a swivel joint as a joint 28.
  • a normal to the touch display 12 is also perpendicular to an axis of rotation, whereby here the normal to the touch display 12 is parallel to the optical axis.
  • FIG. 12 shows a flowchart of an operating method according to a preferred embodiment.
  • an operating menu 14 is output by the touch display 12.
  • a position and an orientation (i.e. the pose) of the touch display 12 are recorded.
  • the position can also be recorded via recorded (mechanical) robot kinematics or by a tracking camera/navigation camera of a navigation system.
  • the pose of the touch display 12 is then passed on to the control unit 10 in substep S1.2. The control unit then calculates an adapted orientation for the output in substep S1.3 and outputs this adapted representation through the touch display 12 in substep S1.4.
  • in a third step S3, the recorded operating input is then sent to the control unit 10.
  • in a step S4, a corresponding function, as displayed and selected on the touch display 12, is controlled.
  • this can be a control of the robot 2.
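The steps S1 to S4 above can be sketched as one control cycle; all callables are hypothetical stand-ins for the sub-systems named in the description:

```python
def operating_cycle(pose_sensor, render, read_touch, control):
    """One cycle of the operating method (S1 to S4) as a minimal sketch.

    pose_sensor(): returns the current display pose (S1.1, assumed callable)
    render(pose):  outputs the menu adapted to the detected pose (S1.3/S1.4)
    read_touch():  returns the recorded operating input from the touch display
    control(cmd):  controls the selected function, e.g. moves the robot (S4)
    """
    pose = pose_sensor()   # S1: detect position and orientation of the display
    render(pose)           # S1: adapted visual output of the operating menu
    cmd = read_touch()     # operating input recorded by the touch display
    control(cmd)           # S3/S4: input sent to the control unit and executed
    return cmd
```

In a real system this cycle would run continuously so that the menu orientation tracks every robot movement.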
  • IMU Inertial Measurement Unit


Abstract

The invention relates to a medical robot guidance system (1) for a surgical intervention on a patient (P), comprising: a robot (2) with a movable robot arm (4) and a robot head (6) attached to the end of the robot arm (4), in particular comprising an end effector (8) on the robot head (6) or as the robot head (6); a control unit (10) adapted to control and move at least the robot (2); at least one touch display (12) designed to visually output at least one operating menu (14), to detect a touch input as a user input and to transmit this user input to the control unit (10) in order, in particular, to control the robot (2); said at least one touch display (12) being rigidly fixed to the robot head (6) and moving with it. The invention also relates to a robot operating method and a computer-readable storage medium according to further independent claims.
PCT/EP2023/071615 2022-08-04 2023-08-03 Système de guidage de robot médical à écran tactile intégré, et méthode de fonctionnement WO2024028467A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102022119613.5A DE102022119613A1 (de) 2022-08-04 2022-08-04 Medizinisches Roboter-Führungssystem mit integriertem Touchdisplay und Bedienungsverfahren
DE102022119613.5 2022-08-04

Publications (1)

Publication Number Publication Date
WO2024028467A1 true WO2024028467A1 (fr) 2024-02-08

Family

ID=87571920

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2023/071615 WO2024028467A1 (fr) 2022-08-04 2023-08-03 Système de guidage de robot médical à écran tactile intégré, et méthode de fonctionnement

Country Status (2)

Country Link
DE (1) DE102022119613A1 (fr)
WO (1) WO2024028467A1 (fr)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050041282A1 (en) 2003-08-21 2005-02-24 Frank Rudolph Operating menu for a surgical microscope
US20160081753A1 (en) * 2014-09-18 2016-03-24 KB Medical SA Robot-Mounted User Interface For Interacting With Operation Room Equipment
US20210085424A1 (en) * 2017-07-27 2021-03-25 Intuitive Surgical Operations, Inc. Light displays in a medical device
US20210369366A1 (en) * 2020-05-29 2021-12-02 Canon U.S.A., Inc. Robotic endoscope controller with detachable monitor

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102005018431A1 (de) 2005-04-21 2006-10-26 Leica Microsystems (Schweiz) Ag Operationsmikroskop
EP3989236A1 (fr) 2020-10-23 2022-04-27 Leica Instruments (Singapore) Pte. Ltd. Système pour un système de microscope et procédé et programme informatique correspondants


Also Published As

Publication number Publication date
DE102022119613A1 (de) 2024-02-15

Similar Documents

Publication Publication Date Title
DE102008016146B4 (de) Operations-Assistenz-System zur Führung eines chirurgischen Hilfsinstrumentes
DE102018206406B3 (de) Mikroskopiesystem und Verfahren zum Betrieb eines Mikroskopiesystems
EP3175770B1 (fr) Dispositif d'observation, en particulier dispositif d'observation médical, comprenant une unité de commande et utilisation d'un module d'entrée
EP2328501B1 (fr) Poste de travail médical et dispositif de commande pour déplacer manuellement un bras robotisé d un poste de travail médical
EP3305232B1 (fr) Dispositif de commande et procede de commande pour l'opération d'un dispositif médical
EP1240418A1 (fr) Procede de poursuite automatique fiable d'un endoscope et pistage (tracking) d'un instrument chirurgical avec un systeme de guidage d'endoscope (efs) a entrainement et a commande electriques, en chirurgie a effraction minimale
EP3753520A1 (fr) Dispositif de manipulation médical de commande d'un dispositif de manipulation
EP3639782A1 (fr) Dispositif de commande d'un mouvement d'un bras robotique et dispositif de traitement doté d'un dispositif de commande
EP3363358A2 (fr) Dispositif de détermination et recouvrement d'un point de référence lors d'une intervention chirurgicale
DE102013108115A1 (de) Verfahren und Vorrichtung zum Festlegen eines Arbeitsbereichs eines Roboters
WO2019149400A1 (fr) Procédé de planification de la position d'un système d'enregistrement d'un appareil d'imagerie médicale, et appareil d'imagerie médicale
DE102014210046A1 (de) Operationsmikroskopsystem
DE102019114817B4 (de) Bildgebungssystem und Verfahren zur Beobachtung
WO2024028467A1 (fr) Système de guidage de robot médical à écran tactile intégré, et méthode de fonctionnement
DE102015216573A1 (de) Digitales Operationsmikroskopiesystem
EP4284290A1 (fr) Système d'assistance chirurgical à microscope opératoire et caméra et procédé de visualisation
DE102018206405B3 (de) Mikroskopiesystem sowie Verfahren zum Betrieb eines Mikroskopiesystems
EP3753521A1 (fr) Dispositif de manipulation médical d'un dispositif de manipulation
DE4204601B4 (de) Vorrichtung zum Erfassen von Lageinformationen mit einer optischen Beobachtungseinheit und Verfahren zur Ermittlung von Lageinformationen
DE102004052753A1 (de) Verfahren und Operations-Assistenz-System zur Steuerung der Nachführung zumindest eines Hilfsinstrumentes bei einem medizinisch minimal-invasiven Eingriff
WO2024018011A1 (fr) Dispositif de commande et système, et système équipé d'un instrument d'intervention chirurgicale médical, d'un dispositif d'acquisition de données et d'un dispositif de traitement de données
EP1537830B1 (fr) Procédé et appareil pour l'observation des objets avec un microscope
DE102016213050A1 (de) Bewegungssteuerung eines Röntgengerätes
DE102022118328A1 (de) Steuervorrichtung und System
WO2022049069A1 (fr) Procédé de fonctionnement d'un système de microscopie et système de microscopie

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23754715

Country of ref document: EP

Kind code of ref document: A1