WO2024018011A1 - Control device and system as well as system with a medical surgical instrument, a data acquisition device and a data processing device - Google Patents

Control device and system as well as system with a medical surgical instrument, a data acquisition device and a data processing device Download PDF

Info

Publication number
WO2024018011A1
Authority
WO
WIPO (PCT)
Prior art keywords
instrument
control device
designed
axis
information
Prior art date
Application number
PCT/EP2023/070175
Other languages
German (de)
English (en)
Inventor
Florian Huber
Chunman Fan
Mirko KUNZE
Kirsten Klein
Lena Felber
Thorsten Ahrens
Hans-Georg Mathé
Original Assignee
Karl Storz Se & Co. Kg
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from DE102022118328.9A (external priority, DE102022118328A1)
Priority claimed from DE102022118330.0A (external priority, DE102022118330A1)
Application filed by Karl Storz Se & Co. Kg
Publication of WO2024018011A1

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/70 Manipulators specially adapted for use in surgery
    • A61B34/74 Manipulators with manual electric input means
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/30 Surgical robots
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/30 Surgical robots
    • A61B34/35 Surgical robots for telesurgery
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2046 Tracking techniques
    • A61B2034/2048 Tracking techniques using an accelerometer or inertia sensor
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2046 Tracking techniques
    • A61B2034/2055 Optical tracking systems
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/30 Surgical robots
    • A61B2034/302 Surgical robots specifically adapted for manipulations within body cavities, e.g. within abdominal or thoracic cavities
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/70 Manipulators specially adapted for use in surgery
    • A61B34/74 Manipulators with manual electric input means
    • A61B2034/742 Joysticks

Definitions

  • Control device and system as well as system with a medical surgical instrument, a data acquisition device and a data processing device
  • the present invention relates to a control device and a system with a control device, a first instrument, a second instrument and a motor control unit for positioning the second instrument.
  • voice control as a control mechanism: this technology is very popular in certain markets (e.g. the USA), but is used less frequently in other regions. Voice control has a certain inaccuracy and delay. In addition, only one degree of freedom can be controlled at any given time. Furthermore, communication in the operating room must be subordinated to the voice-controlled system while the robot arm is being repositioned.
  • gesture tracking: for hands-free control of the robot arm, head movements are recorded, for example, with a headband that has an infrared receiver. When the surgeon nods his head, the receiver detects the movement and commands the arm to reposition the endoscope accordingly.
  • the object is achieved by a control device with a first, optical sensor and a second sensor comprising an acceleration sensor and/or a gyroscope, the optical sensor being designed to detect first information about a first, translational and rotational component of a displacement of a first, manually displaceable instrument relative to the optical sensor, and the second sensor being designed to detect second information about a second component of the displacement of the first instrument, the second component being a roll, pitch and yaw of the first instrument, wherein the control device is designed to send the first and second information for positioning a second instrument, so that the second instrument is displaced in correspondence with the first instrument.
  • Such a control device enables the user, with an instrument that he is already operating, the first instrument, to displace the second instrument as desired.
  • the reaction of the second instrument takes place immediately and in accordance with the displacement of the first instrument, so that the user achieves an intuitive and direct displacement of the second instrument by displacing the first instrument.
  • the first and second information is sent, in particular directly or indirectly, to a motor control unit which is designed to position the second instrument.
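  • As a purely illustrative sketch of this data flow (not part of the disclosure), the two pieces of information could be packaged and forwarded to the motor control unit as follows; all type, field and method names are assumptions made for this example.
```python
from dataclasses import dataclass

@dataclass
class FirstInformation:
    """Optical-sensor reading: translational and rotational component of the
    displacement of the first instrument relative to the sensor (assumed units)."""
    translation_mm: float  # displacement of the shaft along its axis
    rotation_deg: float    # rotation of the shaft about its axis

@dataclass
class SecondInformation:
    """Accelerometer/gyroscope reading: roll, pitch and yaw of the first instrument."""
    roll_deg: float
    pitch_deg: float
    yaw_deg: float

def send_for_positioning(first: FirstInformation, second: SecondInformation, motor_control_unit) -> None:
    """Send both pieces of information, directly or indirectly, to a motor control
    unit so that the second instrument is displaced in correspondence with the
    first instrument (the receiving interface is hypothetical)."""
    motor_control_unit.position_second_instrument(first, second)
```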
  • control device has a continuous passage which is designed to accommodate a shaft of the first instrument.
  • This configuration enables a defined interaction between the control device and the first instrument.
  • the optical sensor is arranged such that light reflected from the first instrument reaches the optical sensor.
  • This configuration makes it possible to reliably detect the translational and rotational components of the displacement of the first instrument.
  • the passage is in the field of vision of the optical sensor.
  • control device has a light source which is designed to emit light to illuminate a shaft of the first instrument.
  • This configuration makes it possible to reliably detect the translational and rotational components of the displacement of the first instrument.
  • the light is emitted in particular in the direction of the passage and is at least partially reflected there by a shaft of the first instrument in the direction of the optical sensor.
  • the light is emitted with a structured pattern.
  • This configuration makes it possible to reliably detect the translational and rotational components of the displacement of the first instrument.
  • the optical sensor is arranged in a hermetically sealed housing with a translucent pane.
  • This configuration makes it possible for the optical sensor to be protected from external influences, but still be able to detect or observe the translational and rotational components of the displacement of the first instrument.
  • This pane is preferably either arranged at a certain angle to the surface of the optical sensor or provided with an anti-reflective coating for 800 to 900 nm in order to prevent reflections from the pane from disturbing the optical sensor.
  • control device has an actuating device, the control device being designed to cause a displacement of the second instrument upon a displacement of the first instrument when the actuating device is in an activated state, and to cause no displacement of the second instrument upon a displacement of the first instrument when the actuating device is in a deactivated state.
  • This embodiment allows the user to easily switch to a first operating mode in order to reposition the second instrument using the first instrument, or to a second operating mode in order to work with the first instrument without displacing the second instrument.
  • the user can first position an endoscope with a camera using the first instrument in the first operating mode, then work with the first instrument in the second operating mode, then reposition the endoscope again in the first operating mode, and so on.
  • control device has a configuration device which is designed to apply a factor to the first and/or the second information, so that a displacement of the first instrument with respect to one or more degrees of freedom results in a displacement of the second instrument with respect to this or these degrees of freedom to which a factor is applied.
  • This embodiment makes it possible to configure a translation of the first displacement into the second displacement. For example, if an insertion of the first instrument should only lead to a smaller insertion of the second instrument, a positive factor smaller than 1 can be set for this degree of freedom. If a displacement of the first instrument is generally intended to lead to smaller displacements of the second instrument, a factor smaller than 1 is chosen for all degrees of freedom. If a displacement of the first instrument is generally intended to lead to larger displacements of the second instrument, a factor greater than 1 is chosen for all degrees of freedom. Using a negative factor, a mirrored translation can be chosen for one or more degrees of freedom if desired by the user.
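  • A minimal sketch of such a factor-based translation, assuming a simple dictionary of per-degree-of-freedom factors; the names and values below are illustrative only and not taken from the disclosure.
```python
# Hypothetical factors: 1.0 = identical motion, <1.0 = smaller motion of the
# second instrument, >1.0 = larger motion, negative = mirrored motion.
factors = {
    "insertion": 0.5,   # insertion of the first instrument causes a smaller insertion
    "rotation":  1.0,
    "pitch":     1.0,
    "yaw":      -1.0,   # mirrored left/right mapping, if desired by the user
}

def scale_displacement(displacement: dict) -> dict:
    """Apply the configured factor to each degree of freedom of the first
    instrument's displacement before it is sent to the motor control unit."""
    return {dof: factors.get(dof, 1.0) * value for dof, value in displacement.items()}

# Example: 10 mm insertion and 5 degrees of yaw of the first instrument
print(scale_displacement({"insertion": 10.0, "yaw": 5.0}))
# -> {'insertion': 5.0, 'yaw': -5.0}
```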
  • the control device further has an instrument trocar to which the control device is fixed.
  • This configuration enables a defined interaction between the control device and the first instrument.
  • a first reference coordinate system of the first instrument is formed from three axes perpendicular to one another in pairs, with a z-axis of the three axes being directed from a distal end of the first instrument towards a proximal end of the first instrument, an x-axis of the three axes following the earth's gravitational field, and a y-axis of the three axes being perpendicular to the x-axis and the z-axis.
  • This configuration enables a practical and computationally easy to implement definition of the first reference coordinate system of the first instrument.
  • a second reference coordinate system of the second instrument is formed from three axes perpendicular to one another in pairs, with a z-axis of the three axes being directed from a distal end of the second instrument towards a proximal end of the second instrument, an x-axis of the three axes following the earth's gravitational field, and a y-axis of the three axes being perpendicular to the x-axis and the z-axis.
  • This configuration enables a practical and computationally easy to implement definition of the second reference coordinate system of the second instrument.
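  • The following sketch shows one possible way to construct such a reference coordinate system from the instrument axis and the gravity vector. Since gravity is generally not exactly perpendicular to the instrument axis, the gravity direction is orthogonalized against the z-axis here; this step, like every name used in the sketch, is an assumption and not taken from the disclosure.
```python
import numpy as np

def instrument_frame(distal: np.ndarray, proximal: np.ndarray,
                     gravity: np.ndarray = np.array([0.0, 0.0, -9.81])) -> np.ndarray:
    """Return a 3x3 matrix whose rows are the x-, y- and z-axis (unit vectors)
    of the reference coordinate system described above."""
    z = proximal - distal                      # from the distal end towards the proximal end
    z = z / np.linalg.norm(z)
    g = gravity / np.linalg.norm(gravity)
    x = g - np.dot(g, z) * z                   # gravity direction, made perpendicular to z
    x = x / np.linalg.norm(x)
    y = np.cross(z, x)                         # completes a right-handed, pairwise perpendicular frame
    return np.vstack([x, y, z])
```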
  • Embodiments of a preferred specific embodiment are described below, and individual elements of this embodiment can also be selectively combined with the previously generally described control device.
  • a sensor unit is positioned in the vicinity of an instrument trocar and is kinetically coupled to it, the instrument trocar being designed to accommodate the first instrument.
  • the sensor unit has an optical sensor, in particular a tracking sensor, and a 3-axis accelerometer and/or a 3-axis gyroscope.
  • the optical tracking sensor is arranged to provide a direct optical access path to the instrument shaft when the first instrument is inserted into the instrument trocar while the sensor is enclosed in a hermetic housing.
  • the accelerometer and/or the gyroscope record the absolute roll and pitch movement as well as the relative yaw movement of the sensor unit and thus of the instrument trocar.
  • the optical tracking sensor records the translation and rotation of the first instrument.
  • This sensor has a laser-based light source that projects a pattern onto the surface of the instrument shaft of the first instrument.
  • the hermetic housing of the optical tracking sensor has a glass window that provides the optical access path to the instrument shaft. This window is preferably either arranged at a certain angle to the surface of the sensor or provided with an anti-reflective coating for 800 to 900 nm to prevent reflections from the glass from interfering with the sensor.
  • control device or system has an activation method that signals when the user starts the endoscope control mode.
  • This activation method can be, for example, a foot switch, a button on the instruments or voice control.
  • the second instrument here a robot arm with an endoscope
  • the first instrument which precisely defines how the endoscope and the camera are positioned relative to the endoscope positioner, here the robot arm.
  • the inclination of the first instrument can be easily translated into the inclination of the second instrument or the endoscope, and the translation of the instrument shaft can be mapped to a translation of the second instrument or the endoscope.
  • this sensor arrangement has no information about the absolute orientation of the first instrument in the horizontal plane, about how the first instrument is positioned in relation to the second instrument or the endoscope, or about how the user or surgeon is positioned relative to the first instrument or the second instrument, here the endoscope. Therefore, a direct association between the movement of the first instrument and the second instrument is not possible. Therefore, the first and second information mentioned are recorded and serve as the basis for calculating which movement of the first instrument leads to an endoscope movement to the right or to the left.
  • the proposed mapping method which is applicable to all embodiments described in this disclosure, does not require user calibration or additional sensors. Instead, the gravity vector and instrument axis are used to calculate the left-right direction according to the right-hand rule.
  • the axis of gravity can be represented using the thumb and the direction of insertion of the instrument using the index finger. The direction in which the middle finger then points is considered to be right.
  • the camera moves to the right side of the image. Accordingly, the direction of rotation can also be determined using the right-hand rule.
  • the thumb can point in the direction of insertion. Then the curved fingers point in the direction that causes the image on the screen to rotate clockwise.
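  • Expressed as vectors, the left/right mapping described above corresponds to a cross product of the gravity direction with the insertion direction. The following sketch and its example frame are illustrative assumptions, not part of the disclosure.
```python
import numpy as np

def right_direction(gravity: np.ndarray, insertion_dir: np.ndarray) -> np.ndarray:
    """Right-hand rule: thumb along the axis of gravity, index finger along the
    insertion direction; the middle finger (here the cross product) points in
    the direction that is treated as 'right'."""
    g = gravity / np.linalg.norm(gravity)
    d = insertion_dir / np.linalg.norm(insertion_dir)
    r = np.cross(g, d)
    return r / np.linalg.norm(r)

# Example in an arbitrary world frame: gravity along -z, instrument inserted along +x
print(right_direction(np.array([0.0, 0.0, -1.0]), np.array([1.0, 0.0, 0.0])))
# -> [ 0. -1.  0.]  (the direction treated as 'right' in this example frame)
```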
  • Such observations can be carried out in both a Cartesian and a spherical coordinate system.
  • the sensor unit forms a spherical coordinate system, in the center of which the sensor unit is located (position or horizontal orientation of the sensor unit are irrelevant).
  • the tip of the first instrument can be viewed as a point in the coordinate system, the instrument shaft as its vector. Moving the tip in the positive direction of the azimuth unit vector causes the image to move leftward; moving it in the direction of the positive polar-angle unit vector causes the image to move downward. Moving the tip in the positive direction of the radial unit vector causes a zoom movement.
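  • A small sketch of this spherical mapping, with the sensor unit at the origin; the function and variable names, the sign conventions and the returned dictionary are assumptions made for illustration.
```python
import numpy as np

def tip_to_spherical(tip: np.ndarray):
    """Convert the tip position (sensor unit at the origin) into spherical
    coordinates: radius r, polar angle theta, azimuth phi."""
    x, y, z = tip
    r = np.linalg.norm(tip)
    theta = np.arccos(z / r)   # polar angle
    phi = np.arctan2(y, x)     # azimuth
    return r, theta, phi

def image_motion(tip_before: np.ndarray, tip_after: np.ndarray) -> dict:
    """Map a change of the tip position to the image motion described above:
    positive azimuth change -> image moves left, positive polar-angle change ->
    image moves down, positive radius change -> zoom movement."""
    r0, th0, ph0 = tip_to_spherical(tip_before)
    r1, th1, ph1 = tip_to_spherical(tip_after)
    return {"left": ph1 - ph0, "down": th1 - th0, "zoom": r1 - r0}
```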
  • mapping method was not chosen arbitrarily, but was identified as preferred because it is based on the surgeon's daily experience. If the first instrument were actually an endoscope with a camera, it would move the image on the screen in the same way that this mapping method does when the surgeon uses the first instrument to control the image rather than the endoscope directly. This will simplify the learning process for trained surgeons.
  • the mapping method is based on the instrument itself. Every person in the operating room can understand which instrument movement on the first instrument causes which endoscope movement, i.e. movement of the second instrument. This allows everyone in the operating room to easily control the system, regardless of their positioning in relation to the monitor, the first instrument and second instrument or the endoscope or the camera.
  • the object is achieved by a system with a previously described control device, a first instrument, a second instrument and a motor control unit for positioning the second instrument.
  • the first instrument is designed to be guided in an instrument trocar.
  • the second instrument is an endoscope, the endoscope in particular having a camera.
  • the present invention also relates to a system with a medical surgical instrument, a data acquisition device and a data processing device.
  • a system is proposed with a medical surgical instrument which is designed to be hand-held by the surgeon at least temporarily during an operation, a data acquisition device and a data processing device, wherein the data acquisition device has a pose detector which is designed to repeatedly capture pose information for the surgical instrument, the pose information comprising at least position or orientation information, and to send it to the data processing device, and wherein the data processing device is designed to process the pose information and to support the implementation or documentation of the operation.
  • the pose detector can be designed as a separate device that can be detachably fixed to the surgical instrument.
  • the pose detector can also be an integrated part of a motor-operated device. This device can already have one or more detectors that determine and, if necessary, monitor a pose of the device and thus of the surgical instrument in space.
  • the pose detector can also be formed inherently by the motor-operated device.
  • a device, in particular a motorized robot arm, can assume specific positions in space through specific positioning positions of at least one motor. If the position of the device is now changed manually, the position of the at least one motor also changes, which in turn allows conclusions to be drawn about the current pose of the surgical instrument.
  • the known causality, in which a desired, specific position is specified, the positioning of the at least one motor is controlled and the specific position of the motor-operated device in space is thus established, is reversed.
  • the position in space now becomes a selectable parameter that specifies the position of the at least one motor, from which the position in space can be determined.
  • One aspect of the present invention addresses a topic that the inventors have identified in connection with the present invention.
  • This topic will be referred to below as a data gap, especially as a data gap for manual operation steps.
  • this concerns one or more elements from the group comprising the type of instrument used, the instrument position, the change in the instrument position, the instrument condition, the instrument use, the useful life of an instrument, the camera position, the change in the camera position, the relative position between camera(s) and instrument(s), the relative movement between camera(s) and instrument(s), the relative acceleration between camera(s) and instrument(s), and collisions between camera(s) and/or instrument(s).
  • procedure-related data collected on the robot can be used to generate medically and economically relevant added value.
  • the invention can advantageously reduce or close these data gaps during the manual phases at the operating table.
  • a sensor unit is attached to the trocar and kinetically coupled to record the instrument position in the operating room. This can be done with any number of trocars.
  • a digital image of reality can be created, as when used on a surgical robot, and captured and saved for further processing.
  • Possible applications include the following aspects: automated documentation, recognition of phases during the operation, detection of emergency situations, for example through an instrument movement, monitoring of registered pivot points, evaluations for training purposes, integration of expert systems and retrieval or control of exact positions, e.g. when changing from robotic guidance to manual guidance or vice versa.
  • a security feature can be implemented, which allows, for example, restricted areas to be defined.
  • a partially robotically manipulated instrument cannot be moved into these impermissible areas, even when operated manually, or can only be moved against increased resistance. This is achieved by appropriately controlling a motorized robot arm, which is part of a surgical robot or telemanipulator.
  • Another aspect of the present invention addresses another topic that the inventors have recognized in connection with the present invention.
  • This topic will be referred to below as an input unit, in particular as an input unit for controlling an endoscope on the operating table.
  • a table-based input system for camera control should temporarily replace the input on the console, so that the full range of the robot's camera-control functions, including automatic tracking, remains available at the operating table.
  • a device and a method are therefore proposed in which a robotically guided endoscope can be controlled in parallel via the console of the telemanipulator as well as via the evaluation of the input on the laparoscopic instrument, e.g. controlling the endoscope on the robot using a manual instrument.
  • Another aspect of the present invention addresses another topic that the inventors have recognized in connection with the present invention.
  • This topic will be referred to below as a security system, in particular as a redundant security system.
  • a requirement for the approval of a medical device is an adequate safety concept. For critical functions, this is usually based on a redundant design. Arm and instrument positions are currently primarily recorded and calculated via the sensors in the robot arm joints.
  • a sensor system, in particular a sensor system that is arranged near a trocar or near a puncture site, enables an independent additional measurement modality, so that the sensors can be used for monitoring in a safety system of a surgical robot.
  • shear forces can be recorded in the registered trocar point, which can occur when the position or registration of the pivot point or the puncture site changes. Such a situation can occur, for example, due to a movement of the patient.
  • a sensor system, in particular a sensor system that is arranged near a trocar or near a puncture site, makes it possible to carry out a comparison of the actual and target values of the kinematics of the surgical robot, in particular of at least one arm joint position of the surgical robot.
  • this further data collection does not take place in the instrument itself.
  • the detection is essentially limited to the tip of the instrument, so that, for example, an absolute position cannot easily be detected; only immediate variables such as forces, temperature, etc. can be detected. Instead, the data collection here is carried out in particular at a pivot point or through a passive holder.
  • One or more of the following values are preferably determined: absolute position, relative position in combination with the current image from the camera, movement curves, kinematics, dynamics and forces.
  • the latter can be calculated in particular from a combination of image and absolute position, e.g. how much an instrument bends under the weight of the liver.
  • the data processing device is designed to determine the operator's movement patterns.
  • the surgeon, i.e. the person who operates the instrument that is monitored with regard to its pose, is observed or accompanied as he works with the surgical instrument.
  • conclusions can be drawn about the current condition of the surgeon, the so-called “surgeon mood”.
  • the surgical instrument can be freely hand-held or coupled to a motor device in which the surgeon can guide the surgical instrument essentially without resistance.
  • the data processing device is designed to detect a tremor in the guidance of the instrument.
  • Intentional rhythmic movements can be identified, e.g. based on their frequency or deflection, and can be differentiated from any tremor caused by the surgeon.
  • the identification of a tremor can, for example, indicate a strenuous position of the instrument or tension on the part of the surgeon and can be signaled to the surgeon or another person.
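  • A very simplified sketch of how such a tremor could be detected from sampled pose data by spectral analysis; the assumed tremor band of 4 to 12 Hz and the power threshold are illustrative values and not taken from the disclosure.
```python
import numpy as np

def tremor_detected(positions: np.ndarray, sample_rate_hz: float,
                    band=(4.0, 12.0), power_threshold=0.25) -> bool:
    """Compare the spectral power of one pose coordinate, sampled over time,
    inside an assumed tremor band against the total power of the signal."""
    signal = positions - positions.mean()            # remove the static offset
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate_hz)
    total = spectrum[1:].sum()                       # ignore the DC component
    if total == 0.0:
        return False
    in_band = spectrum[(freqs >= band[0]) & (freqs <= band[1])].sum()
    return bool((in_band / total) > power_threshold)
```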
  • the data processing device is designed to detect signs of stress and/or hectic behavior in the surgeon.
  • Both absolute limit values, e.g. too high a movement speed, excessive deflection, moving into impermissible areas or spaces, etc., and relative changes compared to an empirically determined movement pattern of the surgeon or a group of surgeons can be determined. For example, increased speed, excessive movements or deviation from usual paths or spaces, etc., can be a sign of stress. This can be signaled to the surgeon or another person.
  • the data processing device is designed to detect signs of tiredness in the surgeon.
  • Movement dynamics that are too slow or reaction times that are too long can indicate that the surgeon has become tired.
  • Both absolute limit values, e.g. a movement speed that is too low or remaining in a position for too long without any activity, and relative changes compared to an empirically determined movement pattern of the surgeon or a group of surgeons can be determined.
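  • A rough sketch combining absolute limit values and relative deviations from an empirically determined movement pattern, covering the stress and tiredness indicators described above; all limits and the classification labels are assumptions of this example.
```python
def classify_movement(speed_mm_s: float, baseline_mean: float, baseline_std: float,
                      max_speed: float = 200.0, min_speed: float = 1.0) -> str:
    """Check one movement-speed sample against absolute limits and against the
    surgeon's (or a group's) empirically determined baseline pattern."""
    if speed_mm_s > max_speed:
        return "possible stress or hectic behavior (absolute limit exceeded)"
    if speed_mm_s < min_speed:
        return "possible tiredness (movement dynamics too slow)"
    if abs(speed_mm_s - baseline_mean) > 3.0 * baseline_std:
        return "deviation from the usual movement pattern"
    return "inconspicuous"
```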
  • the data processing device is designed to determine a three-dimensional camera position even if only a two-dimensional image is available.
  • the additional information recorded means that measurements, e.g. distance or area measurements, can be made even in a two-dimensional image.
  • warnings can also be given that critical structures may be affected or, if the movement continues, will be affected. In this way, warnings of possible collisions or ruptures can be given.
  • resistance to a potentially dangerous movement can also be built up or the movement can be blocked by appropriately controlling a robot arm.
  • the data processing device is designed to control the image tracking of a camera.
  • the image tracking can be carried out in particular by a combination of image recognition and the pose information, in particular position data, so that the surgeon has the areas relevant to him in view. It can also be ensured that all instruments are displayed in the visible image section. Alternatively, the position of instruments that are not visible can be shown in an expanded display.
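  • One conceivable way to combine the pose information with the camera view is sketched below: a camera target is chosen so that all known instrument tips remain in the visible image section. The bounding-box heuristic and all names are assumptions of this sketch, not the claimed image-tracking method.
```python
import numpy as np

def camera_target(tip_positions) -> np.ndarray:
    """Return a camera target point that keeps all known instrument tips in view:
    here simply the centre of the bounding box of the tip positions, which a
    robot-arm controller could then be commanded to look at."""
    tips = np.asarray(tip_positions, dtype=float)
    lower, upper = tips.min(axis=0), tips.max(axis=0)
    return (lower + upper) / 2.0

print(camera_target([[0.0, 0.0, 100.0], [20.0, -10.0, 120.0]]))
# -> [ 10.  -5. 110.]
```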
  • the data processing device is designed to determine a usage model.
  • the determined usage model or operator model can include a technical load measurement of the instruments, which allows predictions to be made about the service life or remaining service life of one or more instruments. Typical movement data of the surgeon can also be determined and/or a cleaning intensity can be determined.
  • the data processing device is designed to create a learning curve analysis from movement data.
  • the data processing device is designed to support a non-surgeon.
  • a non-surgeon should be understood as a person who does not lead or significantly shape the medical procedure.
  • This can in particular be an assisting person, e.g. assistant at the table.
  • the data processing device can thus control an image display in which the position of his instrument is shown to the assistant, especially if the instrument cannot be seen in the image representation currently desired by the surgeon. A direction towards an instrument that cannot be seen in the image can also be displayed.
  • the assistant can be supported in capturing or picking up an instrument with the camera at the trocar exit and accompanying the instrument in returning it to the target structure.
  • the data processing device is designed to automatically create reports and/or carry out a workflow analysis
  • the movement data is used to automatically divide the intervention into different phases. On the one hand, this can occur after the completion of an operation (post-op). In particular, an automated operation report can be created and/or it can be documented, by stitching several images, that the entire site has been inspected. On the other hand, workflows can also be optimized during the operation (intra-operatively), such as determining the time for ordering the next patient. Delays in the operation can be determined in comparison to the hospital's average values and, based on such a determination, a possible automated adjustment of the operation plan can be made.
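  • As an illustration of the intra-operative part of such a workflow analysis, the sketch below flags phases whose duration exceeds the hospital's average by an assumed tolerance factor; the phase names, durations and the factor are invented for this example.
```python
def detect_delays(phase_durations_min: dict, hospital_averages_min: dict,
                  tolerance: float = 1.25) -> list:
    """Return the operation phases whose duration exceeds the hospital's
    average value by more than the assumed tolerance factor."""
    delayed = []
    for phase, duration in phase_durations_min.items():
        average = hospital_averages_min.get(phase)
        if average is not None and duration > tolerance * average:
            delayed.append(phase)
    return delayed

print(detect_delays({"access": 18, "resection": 75}, {"access": 12, "resection": 70}))
# -> ['access']
```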
  • the data processing device is designed to control a desired pose of the surgical instrument.
  • an instrument position can be controlled again when changing from a hand-held guidance to a robotic guidance.
  • a camera position can also be assumed again in a similar way.
  • Fig. 1 shows an embodiment of a control device as part of an embodiment of a system.
  • FIG. 1 shows an embodiment of a control device 10, which is part of a system 60, which also has a first instrument 16, a second instrument 24 and a motor control unit 22 for positioning the second instrument 24.
  • the first instrument 16 is designed to be guided in an instrument trocar 40.
  • the second instrument 24 here is an endoscope 54, the endoscope 54 in particular having a camera 56.
  • the control device 10 is detachably fixed to the instrument trocar 40.
  • the control device 10 has a first, optical sensor 12 and a second sensor 14 with an acceleration sensor and/or a gyroscope, wherein the optical sensor 12 is designed to record first information about a first, translational and rotational component 18 of a displacement of a first, manually displaceable instrument 16 relative to the optical sensor 12.
  • the second sensor 14 is designed to detect second information about a second component 20 of the displacement of the first instrument 16, the second component 20 being a roll, pitch and yaw of the first instrument 16.
  • the control device 10 is designed to send the first and second information to the motor control unit 22 for positioning a second instrument 24, so that the second instrument 24 is displaced corresponding to the first instrument 16.
  • the second instrument 24 is detachably fixed to the motor control unit 22.
  • the control device 10 has a continuous passage 26, which is designed to accommodate a shaft 28 of the first instrument 16, in particular in a form-fitting manner with respect to the circumference of the shaft 28.
  • the optical sensor 12 is arranged such that light 58 reflected by the first instrument 16 reaches the optical sensor 12.
  • the optical sensor 12 is arranged in a hermetically sealed housing 32 with a translucent disk 34.
  • the light 56 comes in particular from a light source 30, which is designed to emit light 56 for illuminating a shaft 28 of the first instrument 16.
  • the light 56 is emitted in particular with a structured pattern.
  • the control device 10 has an actuating device 36, wherein the control device 10 is designed to cause a displacement of the second instrument 24 upon a displacement of the first instrument 16 when the actuating device 36 is in an activated state, and to cause no displacement of the second instrument 24 upon a displacement of the first instrument 16 when the actuating device 36 is in a deactivated state.
  • the control device 10 also has a configuration device 38, which is designed to apply a factor to the first and/or the second information, so that a displacement of the first instrument 16 with respect to one or more degrees of freedom results in a displacement of the second instrument 24 with respect to this or these degrees of freedom, to which a factor is applied.
  • a first reference coordinate system 42 of the first instrument 16 is formed from three axes perpendicular to one another in pairs, with a z-axis of the three axes being directed from a distal end 44 of the first instrument 16 towards a proximal end 46 of the first instrument 16, an x-axis of the three axes following the earth's gravitational field G, and a y-axis of the three axes being perpendicular to the x-axis and the z-axis.
  • a second reference coordinate system 48 of the second instrument is formed from three axes perpendicular to one another in pairs, with a z-axis of the three axes being directed from a distal end 50 of the second instrument 24 towards a proximal end 52 of the second instrument 24, an x-axis of the three axes following the earth's gravitational field G, and a y-axis of the three axes being perpendicular to the x-axis and the z-axis.
  • the motor control unit 22 has a processing unit 62, here a microprocessor, and a memory 64.
  • the memory 64 has instructions with which, when these instructions are executed by the processing unit 62, control commands for the motor control unit 22 are determined from the first and second information.
  • A preferred programmatic implementation of the control by the control device is described below. It is assumed here that a camera of an endoscope, which represents the second instrument, is controlled by means of the first instrument, which here functions as a joystick. When this type of control is active, the system is in a camera control mode (KSM). Variables that are assigned to the camera have the identifier “cam”. Variables that are assigned to the joystick have the identifier “joy”.
  • KSM camera control mode
  • the z-axis is defined by the instrument and points from the tip of the instrument to the handle. In principle, this axis can be mathematically assigned a negative sign, which results in a reversal of direction.
  • the orientation suggested here is standard in camera technology; see in particular the so-called LookAt function.
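  • A minimal sketch of one degree of freedom of such a camera control mode, using the “cam”/“joy” naming convention and a LookAt-style orientation; the proportional mapping, the gain value and all function names are assumptions of this sketch and not the claimed implementation.
```python
import numpy as np

def normalize(v: np.ndarray) -> np.ndarray:
    return v / np.linalg.norm(v)

def look_at(cam_pos: np.ndarray, cam_target: np.ndarray,
            world_up: np.ndarray = np.array([0.0, 0.0, 1.0])) -> np.ndarray:
    """LookAt-style orientation as used in camera technology: a rotation matrix
    whose rows are the camera's right, up and viewing directions."""
    cam_forward = normalize(cam_target - cam_pos)
    cam_right = normalize(np.cross(cam_forward, world_up))
    cam_up = np.cross(cam_right, cam_forward)
    return np.vstack([cam_right, cam_up, cam_forward])

def ksm_zoom_step(cam_pos: np.ndarray, cam_target: np.ndarray,
                  joy_z_displacement_mm: float, gain: float = 0.5):
    """One camera-control-mode (KSM) step for the zoom degree of freedom: the
    joystick's z-axis points from the tip to the handle, so an insertion of the
    first instrument yields a negative displacement and moves the camera forward
    along its viewing direction."""
    cam_forward = normalize(cam_target - cam_pos)
    cam_pos_new = cam_pos - gain * joy_z_displacement_mm * cam_forward
    return cam_pos_new, look_at(cam_pos_new, cam_target)
```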

Landscapes

  • Health & Medical Sciences (AREA)
  • Surgery (AREA)
  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Biomedical Technology (AREA)
  • Robotics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Manipulator (AREA)

Abstract

The invention relates to a control device (10) comprising a first, optical sensor (12) and a second sensor (14) with an acceleration sensor and/or a gyroscope, the optical sensor (12) being designed to record first information about a first, translational and rotational component (18) of a displacement of a first, manually displaceable instrument (16) relative to the optical sensor (12), and the second sensor (14) being designed to record second information about a second component (20) of the displacement of the first instrument (16), the control device (10) being designed to send the first and second information in order to position a second instrument (24), so that the second instrument (24) is displaced in correspondence with the first instrument (16). The invention further relates to a system (60) comprising a control device (10), a first instrument (16), a second instrument (24) and a motor control unit (22) for positioning the second instrument (24), as well as a system with a medical surgical instrument.
PCT/EP2023/070175 2022-07-21 2023-07-20 Control device and system as well as system with a medical surgical instrument, a data acquisition device and a data processing device WO2024018011A1 (fr)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
DE102022118328.9A DE102022118328A1 (de) 2022-07-21 2022-07-21 Steuervorrichtung und System
DE102022118328.9 2022-07-21
DE102022118330.0 2022-07-21
DE102022118330.0A DE102022118330A1 (de) 2022-07-21 2022-07-21 System mit einem medizinischen Operationsinstrument, einer Datenerfassungsvorrichtung und einer Datenverarbeitungseinrichtung

Publications (1)

Publication Number Publication Date
WO2024018011A1 true WO2024018011A1 (fr) 2024-01-25

Family

ID=87429587

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2023/070175 WO2024018011A1 (fr) Control device and system as well as system with a medical surgical instrument, a data acquisition device and a data processing device

Country Status (1)

Country Link
WO (1) WO2024018011A1 (fr)

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160213436A1 (en) * 2013-07-26 2016-07-28 Olympus Corporation Medical system and method of controlling medical treatment tools

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
MICROSOFT HOLOLENS: "Windows Mixed Reality: Motion Controller Tracking", 16 January 2018 (2018-01-16), XP093089230, Retrieved from the Internet <URL:https://www.youtube.com/watch?v=rkDpRllbLII> [retrieved on 20231006] *

Similar Documents

Publication Publication Date Title
EP2449997B1 (fr) Poste de travail médical
EP2575662B1 (fr) Procédé de déplacement du bras porte-instruments d'un robot de laparoscopie dans une position relative prédéfinissable par rapport à un trocart
DE102012110190B4 (de) Manuell betätigte Robotersteuerung und Verfahren zum Steuern eines Robotersystems
EP3363358B1 (fr) Dispositif de détermination et recouvrement d'un point de référence lors d'une intervention chirurgicale
WO2008058520A2 (fr) Dispositif de génération d'images pour un opérateur
EP3412242A1 (fr) Émission de données de position d'un instrument technique médical
WO2015049095A1 (fr) Dispositif de commande et procédé permettant de commander un système de robot par commande gestuelle
WO2014114551A1 (fr) Système robotisé et procédé de commande d'un système robotisé pour la chirurgie mini-invasive
DE10130278A1 (de) Verfahren und Vorrichtung zur Darstellung eines Operationsgebietes bei Laseroperationen
DE19961971A1 (de) Verfahren zur sicheren automatischen Nachführung eines Endoskops und Verfolgung (Tracking) eines chirurgischen Instrumentes mit einem Endoskopführungssystem (EFS) für die minimal invasive Chirurgie
DE102013109677A1 (de) Assistenzeinrichtung zur bildgebenden Unterstützung eines Operateurs während eines chirurgischen Eingriffs
EP3753520A1 (fr) Dispositif de manipulation médical de commande d'un dispositif de manipulation
EP3639782A1 (fr) Dispositif de commande d'un mouvement d'un bras robotique et dispositif de traitement doté d'un dispositif de commande
DE102008055918A1 (de) Verfahren zum Betreiben eines medizinischen Navigationssystems und medizinisches Navigationssystem
WO2017186414A1 (fr) Système d'assistance opératoire et procédé pour produire des signaux de commande pour assurer la commande d'une cinématique de robot à déplacement commandé par moteur, d'un système d'assistance opératoire de ce type
EP3753519A1 (fr) Dispositif de manipulation médical
DE10335369B4 (de) Verfahren zum Bereitstellen einer berührungslosen Gerätefunktionssteuerung und Vorrichtung zum Durchführen des Verfahrens
WO2024018011A1 (fr) Dispositif de commande et système, et système équipé d'un instrument d'intervention chirurgicale médical, d'un dispositif d'acquisition de données et d'un dispositif de traitement de données
WO2015014952A1 (fr) Système d'assistance pour l'aide par imagerie d'un opérateur pendant une intervention chirurgicale
DE102014210056A1 (de) Verfahren zur Ansteuerung eines chirurgischen Geräts sowie chirurgisches Gerät
WO2022162217A1 (fr) Système d'assistance chirurgical à microscope opératoire et caméra et procédé de visualisation
EP3753521A1 (fr) Dispositif de manipulation médical d'un dispositif de manipulation
DE102022118330A1 (de) System mit einem medizinischen Operationsinstrument, einer Datenerfassungsvorrichtung und einer Datenverarbeitungseinrichtung
DE102022118328A1 (de) Steuervorrichtung und System
DE102004052753A1 (de) Verfahren und Operations-Assistenz-System zur Steuerung der Nachführung zumindest eines Hilfsinstrumentes bei einem medizinisch minimal-invasiven Eingriff

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23744777

Country of ref document: EP

Kind code of ref document: A1