EP2502558A1 - Medical workstation - Google Patents

Medical workstation

Info

Publication number
EP2502558A1
Authority
EP
European Patent Office
Prior art keywords
medical
robot arm
medical instrument
robot
imaging device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
EP12159485A
Other languages
German (de)
French (fr)
Other versions
EP2502558B1 (en)
Inventor
Ralph Berke
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Kuka Deutschland GmbH
Original Assignee
Kuka Deutschland GmbH
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority to DE102011005917A (DE102011005917A1)
Application filed by Kuka Deutschland GmbH
Publication of EP2502558A1
Application granted
Publication of EP2502558B1
Application status: Active
Anticipated expiration

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Detecting, measuring or recording for diagnostic purposes; Identification of persons
    • A61B5/70 Means for positioning the patient in relation to the detecting, measuring or recording means
    • A61B5/704 Tables
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/00147 Holding or positioning arrangements
    • A61B1/00149 Holding or positioning arrangements using articulated arms
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/30 Surgical robots
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/30 Surgical robots
    • A61B34/32 Surgical robots operating autonomously
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/70 Manipulators specially adapted for use in surgery
    • A61B34/76 Manipulators having means for providing feel, e.g. force or tactile feedback
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Detecting, measuring or recording for diagnostic purposes; Identification of persons
    • A61B5/06 Devices, other than using radiation, for detecting or locating foreign bodies; determining position of probes within or on the body of the patient
    • A61B5/065 Determining position of the probe employing exclusively positioning means located on or in the probe, e.g. using position sensors arranged on the probe
    • A61B5/066 Superposing sensor position on an image of the patient, e.g. obtained by ultrasound or x-ray imaging
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B17/00 Surgical instruments, devices or methods, e.g. tourniquets
    • A61B2017/00017 Electrical control of surgical instruments
    • A61B2017/00115 Electrical control of surgical instruments with audible or visual output
    • A61B2017/00119 Electrical control of surgical instruments with audible or visual output alarm; indicating an abnormal situation
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36 Image-producing devices or illumination devices not otherwise provided for
    • A61B90/37 Surgical systems with images on a monitor during operation
    • A61B2090/378 Surgical systems with images on a monitor during operation using ultrasound
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis

Abstract

The invention relates to a medical workstation comprising a medical instrument (K) intended to be introduced at least partially into the interior of a living being (P) for its treatment, an imaging device (2) set up to record image data sets from the interior of the living being (P) during the treatment, and a robot (R). The robot (R) comprises a robot arm (M) with a plurality of successively arranged links, on which the imaging device (2) or the medical instrument (K) can be arranged, and a control device (S) provided for moving the robot arm (M), which is set up to move the robot arm (M) such that the imaging device (2) attached to the robot arm (M) follows a movement of the medical instrument (K), or the medical instrument (K) attached to the robot arm (M) follows a movement of the imaging device (2).

Description

  • The invention relates to a medical workstation with a robot.
  • WO 2009/065827 A1 discloses a medical workstation with a robot. The robot includes a robot arm and a control device for moving the robot arm. An endoscope, by means of which a living being is to be treated, is attached to a fastening device of the robot arm, for example. In order to determine the positions and orientations of the living being relative to the robot arm, a navigation system is provided.
  • The object of the invention is to provide an improved medical workstation with a robot.
  • The object of the invention is achieved by a medical workstation comprising
    • a medical instrument intended to be introduced at least partially into the interior of a living being for the treatment of the living being,
    • an imaging device set up to record image data sets from the interior of the living being during the treatment, and
    • a robot having a robot arm with a plurality of successively arranged links, on which the imaging device or the medical instrument can be arranged, and a control device provided for moving the robot arm, which is set up to move the robot arm such that the imaging device attached to the robot arm follows a movement of the medical instrument, or the medical instrument attached to the robot arm follows a movement of the imaging device.
  • The medical workstation according to the invention accordingly comprises the medical instrument and the imaging device. The medical instrument is intended to be introduced at least partially into the interior of the living being for its treatment. The medical instrument is e.g. a cannula designed for aspirating fatty tissue of the living being.
  • By means of the imaging device, image data sets from the interior of the living being are recorded during the treatment, i.e. while the medical instrument is at least partially introduced into the interior of the living being. The recorded image data sets may e.g. be used to display the images associated with them by means of a display device.
  • In order to always, or at least occasionally, obtain current image data sets, in particular from the region of the interior of the living being into which the medical instrument is at least partially introduced, the medical workstation according to the invention has the robot, whose control device is set up to move the robot arm such that the imaging device attached to the robot arm follows the movement of the medical instrument, or the medical instrument attached to the robot arm follows the movement of the imaging device. If, for example, a doctor manually introduces the medical instrument at least partially into the interior of the living being, the imaging device, moved by the robot arm, always follows the movement of the manually guided medical instrument.
  • Alternatively, it may be provided that if e.g. the doctor manually guides the imaging device, the robot automatically moves the medical instrument so that the medical instrument automatically follows the movement of the imaging device.
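  • Both variants amount to the same follow behaviour: one device is tracked, the other is servoed to a point at a predetermined offset from it. The following is a purely illustrative sketch of such a proportional follow controller; the offset, gain and coordinate conventions are assumptions, not part of the disclosure.

```python
# Illustrative sketch (not from the patent): one device follows the
# tracked other device at a predetermined offset via proportional control.
import numpy as np

def follow_target(instrument_pos, offset=np.array([0.0, 0.0, 0.05])):
    """Point the following device should reach: a fixed offset (here an
    assumed 5 cm) from the position reported by the tracking."""
    return instrument_pos + offset

def control_step(device_pos, instrument_pos, gain=0.5):
    """One control cycle: move the follower a fraction of the remaining
    distance toward its follow target."""
    return device_pos + gain * (follow_target(instrument_pos) - device_pos)

device = np.zeros(3)                       # follower start position
instrument = np.array([0.10, 0.20, 0.00])  # tracked device position
for _ in range(50):                        # repeated control cycles
    device = control_step(device, instrument)
print(np.allclose(device, follow_target(instrument)))  # True
```

In a real workstation the target would be refreshed every cycle from the navigation system, so the follower continuously tracks a moving instrument rather than a fixed point.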
  • The imaging device is e.g. set up to record image data sets associated with ultrasound images from the interior of the living being. An ultrasound device can be made relatively small and can thus, if necessary, be moved relatively easily by means of the robot. In addition, an ultrasound device does not generate harmful radiation, unlike e.g. an X-ray device.
  • According to one embodiment of the medical workstation according to the invention, the latter has a navigation system connected to the control device of the robot, which is set up to determine the positions and/or orientations of the robot arm, the medical instrument and/or the imaging device, on the basis of which the control device moves the robot arm so that the imaging device attached to the robot arm follows the movement of the medical instrument, or the medical instrument attached to the robot arm follows the movement of the imaging device.
  • Navigation systems are well known in medical technology, for example from WO 2009/065827 A1. Navigation systems comprise a detection device, which may include, for example, an optical detection device, in particular a camera, a laser tracking system, structured-light projectors or line projectors. The detection device is set up to detect, in a generally known manner, markers arranged on the object, in particular on its surface, or prominent points of the object's surface. On the basis of the detected markers or prominent points, a computing device of the navigation system can determine, in a generally well-known manner, the positions of the robot arm, the medical instrument and/or the imaging device and, if applicable, their orientations. The medical workstation according to the invention may thus be enabled to determine, for example, the current position and/or pose (= position and orientation) of the manually guided medical instrument, for example in order to track the imaging device attached to the robot arm accordingly.
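  • Determining a pose from detected markers can be illustrated with the well-known Kabsch algorithm, which recovers the rigid rotation and translation between the known marker layout on an object and the marker positions observed by the detection device. The marker layout and the simulated observation below are assumptions for illustration only.

```python
# Illustrative sketch: marker-based pose estimation via the Kabsch
# algorithm (least-squares rigid fit). Not taken from the disclosure.
import numpy as np

def estimate_pose(model_pts, observed_pts):
    """Least-squares rigid transform R, t with observed ~= R @ model + t."""
    cm, co = model_pts.mean(0), observed_pts.mean(0)
    H = (model_pts - cm).T @ (observed_pts - co)   # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))         # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = co - R @ cm
    return R, t

# Assumed marker layout on the instrument (object frame), and a simulated
# observation after a 30 degree rotation about z plus a translation.
model = np.array([[0, 0, 0], [0.05, 0, 0], [0, 0.05, 0], [0, 0, 0.05]])
theta = np.deg2rad(30)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0],
                   [np.sin(theta),  np.cos(theta), 0],
                   [0, 0, 1]])
t_true = np.array([0.2, 0.1, 0.3])
observed = model @ R_true.T + t_true
R, t = estimate_pose(model, observed)
print(np.allclose(R, R_true), np.allclose(t, t_true))  # True True
```

With noisy camera measurements the same fit gives the best rigid pose in the least-squares sense, which is why at least three non-collinear markers per object are typically used.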
  • The medical workstation according to the invention may have a display device which is provided to visually display an image associated with the image data set. Thus, the person treating the living being can observe in a relatively simple manner the medical instrument introduced at least partially into the interior of the living being.
  • According to a variant of the medical workstation according to the invention, the latter is set up to analyze the image data set for areas that are undesirable and/or desirable for the treatment of the living being with the medical instrument.
  • It may then advantageously be provided that the medical workstation according to the invention is set up to mark desired and/or undesired areas in the image displayed by means of the display device. As a result, the person treating the living being with the medical instrument can better recognize when the medical instrument introduced at least partially into the interior leaves, or at least threatens to leave, the desired area. The corresponding areas can, for example, be marked in color.
  • In order to warn the person treating the living being if necessary, according to a variant of the medical workstation according to the invention, the latter can be set up to generate a warning signal, in particular an acoustic warning signal, on the basis of the analyzed image data set if the medical instrument is outside the desired area or threatens to leave it.
  • According to a variant of the medical workstation according to the invention, the medical instrument may have a vibration generator, and the medical workstation according to the invention may be set up to activate the vibration generator on the basis of the analyzed image data set if the medical instrument is outside the desired area or threatens to leave it.
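  • The warning variants described above reduce to a range check of the tracked instrument tip against the desired region determined from the image data. A minimal sketch, in which the desired region is an assumed fat layer between two depths and the warning margin is likewise an assumed value:

```python
# Illustrative sketch (not from the patent): classify the cannula tip
# depth relative to an assumed fat layer and derive the warning action.
def tissue_status(tip_depth, fat_top=0.01, fat_bottom=0.04, margin=0.005):
    """'ok' inside the layer, 'warning' within the margin of a boundary
    (e.g. activate the vibration generator), 'outside' beyond the layer
    (e.g. acoustic warning or stop of the robot arm). Depths in metres."""
    if tip_depth < fat_top or tip_depth > fat_bottom:
        return "outside"
    if tip_depth > fat_bottom - margin or tip_depth < fat_top + margin:
        return "warning"
    return "ok"

print(tissue_status(0.02))   # ok
print(tissue_status(0.037))  # warning
print(tissue_status(0.05))   # outside
```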
  • According to a further embodiment of the medical workstation according to the invention, the latter is set up to stop or prevent a movement of the robot arm provided with the medical instrument on the basis of the analyzed image data set if the medical instrument is outside the desired area or threatens to leave it. Thus, if, for example, the person treating the living being manually moves the imaging device and the robot automatically guides the medical instrument to follow the movement of the imaging device, according to this variant the robot automatically stops the movement of the medical instrument when it threatens to enter an area it should not enter.
  • According to a further embodiment of the medical workstation according to the invention, the latter has a further robot with a further robot arm comprising a plurality of successively arranged links, and a further control device provided for moving the further robot arm, wherein one of the robot arms is provided to move the imaging device and the other robot arm is provided to move the medical instrument. The two control devices can also be combined into a single control device. By means of this variant of the medical workstation according to the invention, a fully automatic intervention on the living being can, for example, be carried out, with both robot arms carrying out their movements automatically.
  • According to an embodiment of the medical workstation according to the invention, the two control devices are coupled to one another such that, upon a movement of one of the robot arms, the other robot arm automatically performs a movement such that the medical instrument follows a movement of the imaging device, or the imaging device follows a movement of the medical instrument. To achieve this, the two control devices can e.g. be designed as a master-slave system.
  • According to one embodiment of the medical workstation according to the invention, one of the robot arms is manually movable, and the other robot arm, controlled by its control device, automatically follows the movement of the manually moved robot arm. The robot arm in question may e.g. be moved manually by hand guidance or by means of a hand-held control device.
  • If the manually movable robot arm is provided with the medical instrument, according to a further embodiment of the medical workstation according to the invention, the latter can be set up to prevent, or at least impede, a movement of the robot arm provided with the medical instrument when the medical instrument is outside the desired area or threatens to leave it. The desired area or the undesired area can be detected in particular by analyzing the image data set.
  • In particular, when the medical workstation according to the invention is to be used for aspirating fatty tissue of the living being, the safety and the quality of the liposuction can be increased by the medical workstation according to the invention. With the imaging device, e.g. an ultrasound device, fatty tissue in particular can be imaged relatively well and efficiently. As a result, the fat layer intended to be aspirated can be visualized, for example, so that the medical instrument designed as a cannula can aspirate the relevant fat layer relatively accurately on the basis of the image data sets.
  • Embodiments of the invention are illustrated by way of example in the accompanying schematic drawings. They show:
  • Fig. 1
    a medical workstation with a robot,
    Fig. 2
    an ultrasound image,
    Fig. 3
    a medical workstation with two robots, and
    Fig. 4
    another medical workstation with a robot.
  • Fig. 1 shows a medical workstation with a patient couch 1, on which a living being to be treated, for example a person P, lies. The medical workstation further has a cannula K, by means of which a liposuction is to be performed on the person P. The cannula K is guided manually by a doctor, not shown. The cannula K is an example of a medical instrument that can be introduced into the living being.
  • The medical workstation furthermore has a robot R with a robot arm M and a control device S. The robot arm M comprises a plurality of successive links, which are connected by means of joints and are movable relative to one another about axes. At one end of the robot arm M, an ultrasound transducer 2 is attached, or it is integrated into the robot arm M. The ultrasound transducer 2 is connected to a computer 3 of the medical workstation and is set up to generate ultrasound images B from the inside of the person P, or image data sets associated with the ultrasound images B. The ultrasound images B can be displayed by means of a monitor 4 connected to the computer 3. One of the ultrasound images B is shown in Fig. 2. It shows, for example, imaged fatty tissue B1 and the imaged skin surface B2 of the person P.
  • The robot arm M has drives, in particular electric drives, which are connected to the control device S. By means of the drives, the robot arm M or its links can be moved relative to one another, controlled by the control device S or by a computer program running on the control device S. In particular, it is thereby possible for the ultrasound transducer 2 to assume a predetermined position and orientation in space; to this end, the drives are controlled by the control device S.
  • In the case of the present exemplary embodiment, the medical workstation comprises a navigation system N. Navigation systems as such are known to the person skilled in the art, among others from WO 2009/065827 A1. Navigation systems can be, for example, magnetic or optical navigation systems or based on RFID, and are used, for example, to determine the position and, if applicable, the orientation of an object, for example the ultrasound transducer 2, the cannula K or the robot arm M.
  • In the case of the present embodiment, the cannula K is provided with markers M1, the ultrasound transducer 2 with markers M2 and the robot arm M with markers M3, by means of which the navigation system N can determine the poses, i.e. the positions and orientations, of the cannula K, the ultrasound transducer 2 and the robot arm M in space.
  • In the case of the present exemplary embodiment, the navigation system N has a detection device 5 which comprises, for example, a stereo camera 6. The stereo camera 6 is set up to take pictures of the markers M1, M2, M3.
  • In the case of the present exemplary embodiment, the detection device 5 is connected to the control device S of the robot R, on which a computer program runs which evaluates the images of the markers M1, M2, M3 recorded by the stereo camera 6 in a generally known manner and, on the basis of the evaluation, determines the positions of the markers M1, M2, M3 and thus the positions of the cannula K, the ultrasound transducer 2 and the robot arm M in space. This evaluation can also be carried out by the detection device 5, which then transmits the result of the evaluation to the control device S.
  • Thus, it is possible for the control device S, or a computer program running on the control device S, to control the drives of the robot arm M so that it moves or tracks the ultrasound transducer 2 such that the ultrasound transducer 2 and the cannula K maintain a predetermined distance. It is thus possible that, during the treatment of the person P by means of the cannula K, the ultrasound transducer 2 generates image data sets associated with ultrasound images B, in particular from the area of the person P in front of the cannula K, so that these ultrasound images B are presented by means of the monitor 4 to the doctor operating the cannula K. The ultrasound images B on the monitor 4 accordingly represent the tissue layer of the person P lying in front of the cannula K. The doctor can then decide, for example, how and where he wants to move the cannula K for the liposuction of the fatty tissue. In particular, it can be provided that the control device S always controls the robot arm M or its drives in such a way that the ultrasound transducer 2 always follows the individual movement of the cannula K or its tip. It may thereby be possible for the doctor to see relatively accurately on the monitor 4 in which tissue layer of the person P the cannula K is currently located.
  • An image processing program which processes the image data sets generated by means of the ultrasound transducer 2 in such a way that, for example, fatty tissue B1 imaged in the ultrasound image B is color-coded differently from other regions of the person P can run on the computer 3. Correspondingly imaged tissue can be detected, e.g., by means of ultrasound elastography. It is also possible that areas of the person P which should not be treated by means of the cannula K are colored differently in the ultrasound image B or are specially marked.
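  • The color-coding of fat tissue can be illustrated with a simple intensity threshold as a stand-in for the elastography-based detection mentioned above; the thresholds and the toy image are assumptions, not taken from the disclosure.

```python
# Illustrative sketch (not from the patent): mark fat vs. other tissue in
# an ultrasound image by intensity thresholding.
import numpy as np

def mark_regions(image, fat_lo=60, fat_hi=120):
    """Return a label map: 1 = fat (permitted area), 0 = other tissue.
    The intensity band [fat_lo, fat_hi] is an assumed stand-in for a
    real elastography-based classifier."""
    return ((image >= fat_lo) & (image <= fat_hi)).astype(np.uint8)

img = np.array([[50, 70, 130],
                [90, 110, 40]])   # toy grey-level image
print(mark_regions(img))
# [[0 1 0]
#  [1 1 0]]
```

The resulting label map can drive both the colored overlay on the monitor and the permitted/impermissible-area checks described in the following paragraphs.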
  • Thus, it can also be provided that the image processing program running on the computer 3 recognizes when, in particular, the tip of the cannula K approaches an impermissible region, that is, a region outside the fatty tissue. It is then possible for the computer 3 to generate a signal on the basis of which the doctor is warned, e.g. acoustically. It is also possible for the cannula K to have a vibration generator in order to warn the doctor of critical areas during manual guidance. In this case, the computer 3 is, e.g., connected to the cannula K or its vibration generator.
  • It is also possible that a sensor 7 is provided, in particular at the tip of the cannula K, which is connected in particular to the navigation system N. The sensor 7 may, e.g., allow an improved determination of the position and, if applicable, the orientation of the cannula K, in particular of its tip. The sensor 7 may, e.g., be based on RFID.
  • However, the sensor 7 can also be designed such that it recognizes the current type of tissue, and this information is transmitted, for example, to the computer 3. The transmission may be, e.g., wireless, e.g. via radio, or wired. The information originating from the sensor 7 can be used, for example, for the display of the ultrasound image B.
  • The medical workstation could also be configured such that the intervention is documented, e.g. for quality assurance, for example by means of the computer 3.
  • It is also possible that a preoperative image data set of the person P is recorded before the procedure. The preoperative image data set is in particular a three-dimensional image data set and in particular depicts the area of the intervention. The preoperative image data set is recorded in particular with a medical device, for example a magnetic resonance device, and can be merged or superimposed with the image data set recorded by the ultrasound transducer 2 during the procedure to produce a modified image, which is displayed on the monitor 4 instead of the ultrasound image B.
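  • Merging the preoperative data set with the intraoperative ultrasound data presupposes a registration between the two coordinate frames. A minimal sketch, assuming the rigid registration transform is already known (the transform and the sample point below are illustrative assumptions):

```python
# Illustrative sketch (not from the patent): map preoperative image
# coordinates into the ultrasound frame via a known rigid registration.
import numpy as np

def to_ultrasound_frame(points_pre, R, t):
    """Apply the registration: p_us = R @ p_pre + t, for row-vector points."""
    return points_pre @ R.T + t

# Assumed registration: 90 degree rotation about z plus a translation.
R = np.array([[0., -1., 0.],
              [1.,  0., 0.],
              [0.,  0., 1.]])
t = np.array([0.1, 0.0, 0.02])
pre_point = np.array([[0.05, 0.0, 0.0]])   # a landmark in the MR volume
mapped = to_ultrasound_frame(pre_point, R, t)
print(np.allclose(mapped, [[0.1, 0.05, 0.02]]))  # True
```

Once every preoperative voxel or landmark can be mapped this way, the two images can be overlaid pixel by pixel to produce the modified image shown on the monitor 4.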
  • Fig. 3 shows a further medical workstation. Unless otherwise stated, components of the medical workstation shown in Fig. 3 that are essentially identical in construction and function to components of the medical workstation of Fig. 1 are provided with the same reference numerals.
  • The medical workstation shown in Fig. 3 differs from the medical workstation shown in Fig. 1 in that it comprises a second robot R'. The further robot R' has a further robot arm M' and a further control device S'. The further robot arm M' comprises a plurality of successive links, which are connected by means of joints and are movable relative to one another about axes. The cannula K is attached to one end of the further robot arm M'. For this purpose, the further robot arm M' comprises a suitable fastening device, for example in the form of a flange.
  • The further robot arm M 'has drives, in particular electric drives, which are connected to the further control device S'. By means of its drives, the further robot arm M 'or its members can be controlled relative to one another by the further control device S' or by a computer program running on the further control device S ' to be moved. In particular, it is possible that the cannula K occupies a predetermined position and orientation in space. Optionally, the drives are controlled by the further control device S '. Thus, the cannula K is guided by the further robot R '.
  • In the medical workstation of Fig. 3, the robot R thus guides the ultrasound transducer 2 and the further robot R' guides the cannula K. In the case of the present embodiment, the two robots R, R' and their control devices S, S' are coupled in a control loop, so that the robot R' moving the cannula K moves it only within the permitted area of fatty tissue. By means of image processing, permitted and impermissible areas are automatically identified in the ultrasound image B and displayed on the monitor 4. The movement of the further robot R' can also be automatically limited by virtual walls, so that guidance of the cannula K is possible only within the permitted fatty-tissue area. It is also possible that a common control device, for example the control device S, controls both robot arms M, M'.
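  • A virtual wall can be sketched as clamping each commanded position to the permitted region before it is sent to the drives. The box-shaped region below is an assumption for illustration; in practice the permitted area would be derived from the segmented ultrasound image.

```python
# Illustrative sketch (not from the patent): limit commanded cannula
# positions to a permitted axis-aligned box ("virtual walls").
import numpy as np

def clamp_to_permitted(target, box_min, box_max):
    """Project a commanded position into the permitted region, so the
    robot never drives the cannula beyond the virtual walls."""
    return np.minimum(np.maximum(target, box_min), box_max)

# Assumed permitted region (metres): the fat-layer working volume.
box_min = np.array([0.0, 0.0, 0.01])
box_max = np.array([0.3, 0.2, 0.04])

commanded = np.array([0.35, 0.1, 0.005])   # would leave the region
safe = clamp_to_permitted(commanded, box_min, box_max)
print(np.allclose(safe, [0.3, 0.1, 0.01]))  # True
```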
  • In order to couple the two robots R, R' together, their control devices S, S' can be designed as a master-slave system, so that, for example, for an automated intervention, the further control device S' moves the further robot arm M' in such a way that the cannula K performs a predetermined movement. Owing to the coupling of the two control devices S, S', the control device S then moves the robot arm M and thus the ultrasound transducer 2 such that it maintains the predetermined distance from the cannula K. In this case, the medical workstation of Fig. 3 can, if necessary, do without the navigation system N.
  • Instead of the fully automated medical workstation of Fig. 3, it can also be provided that, although two robots R, R' are provided, the doctor manually moves one of the two robot arms M, M' during the procedure, preferably the further robot arm M', to which the cannula K is attached. In particular, provision may be made for the doctor to guide the robot arm M, M' in question by hand, for example by pulling or pressing on the structure of the corresponding robot arm M, M'. It is also possible that he moves the robot arm M, M' in question manually by means of a hand-held control device, not illustrated but well known to the person skilled in the art, which is connected to the corresponding control device S, S'.
  • Owing to the coupling of the two control devices S, S', the robot R or its robot arm M then moves the ultrasound transducer 2 automatically at the predetermined distance from the tip of the cannula K. In the case of the present embodiment, it can then be provided that the doctor cannot move the further robot arm M', or can move it only with greater force, as soon as the cannula K penetrates into an impermissible area or leaves the fatty-tissue area. This is detected by means of image processing by the computer 3 on the basis of the image data sets recorded by the ultrasound transducer 2. In this case, the navigation system N can be dispensed with if necessary. It is also possible that the robot arm M, to which the ultrasound transducer 2 is attached, is the one moved manually.
  • Fig. 4 shows another medical workstation. Unless otherwise stated, components of the medical workstation shown in Fig. 4 that are essentially identical in construction and function to components of the medical workstation of Fig. 1 are provided with the same reference numerals.
  • The medical workstation shown in Fig. 4 differs from the medical workstation of Fig. 1 essentially in that the robot R or its robot arm M does not guide the ultrasound transducer 2 but rather the cannula K. The ultrasound transducer 2, by contrast, is guided manually by the doctor. On the basis of the signals originating from the navigation system N, the position and, if applicable, the orientation of the ultrasound transducer 2 in space are detected, thereby allowing the control device S to move the robot arm M such that the cannula K automatically follows, in particular always follows, the manual movement of the ultrasound transducer 2.
  • In the case of the present exemplary embodiment, it is provided that when the cannula K leaves the permitted area, i.e. the fatty tissue of the person P, the movement of the robot R or of its robot arm M is stopped by the control device S. It can thus be at least largely ensured that the cannula K or its tip is located exclusively within the fatty tissue, whereby the risk of aspirating healthy tissue is at least reduced.

Claims (13)

  1. Medical workstation, having
    - a medical instrument (K), which is intended to be introduced at least partly into the interior of a living being for treatment of the living being,
    - an imaging device (2), which is adapted to create, during the treatment, image data sets from the interior of the living being (P), and
    - a robot (R) having a robot arm (M) with a plurality of successively arranged links, on which the imaging device (2) or the medical instrument (K) can be arranged, and a control device (S) provided for moving the robot arm (M), which is arranged to move the robot arm (M) such that the imaging device (2) attached to the robot arm (M) follows a movement of the medical instrument (K), or the medical instrument (K) attached to the robot arm (M) follows a movement of the imaging device (2).
  2. Medical workstation according to claim 1, wherein the imaging device (2) is adapted to create image data sets associated with ultrasound images (B) from the interior of the living being (P), and/or the medical instrument is designed as a cannula (K) for aspirating fatty tissue of the living being (P).
  3. Medical workstation according to claim 1 or 2, comprising a navigation system (N) connected to the control device (S) of the robot (R), which is set up to determine the positions and/or orientations of the robot arm (M), the medical instrument (K) and/or the imaging device (2), on the basis of which the control device (S) moves the robot arm (M) such that the imaging device (2) attached to the robot arm (M) follows a movement of the medical instrument (K), or the medical instrument (K) attached to the robot arm (M) follows a movement of the imaging device (2).
  4. Medical workstation according to one of claims 1 to 3, which is adapted to analyze the image data set for areas that are undesirable and/or desirable for the treatment of the living being with the medical instrument (K).
  5. Medical workstation according to one of claims 1 to 4, comprising a display device (4) which is provided to visually represent an image (B) associated with the image data set.
  6. Medical workstation according to claims 4 and 5, which is arranged to mark the desired and/or undesired areas (B1, B2) in the image (B) displayed by means of the display device (4).
  7. Medical workstation according to one of claims 4 to 6, which is set up to stop or prevent a movement of the robot arm (M, M') provided with the medical instrument (K) on the basis of the analyzed image data set when the medical instrument (K) is outside the desired area or threatens to leave it.
  8. Medical workstation according to one of Claims 4 to 7, which is set up to generate a warning signal, in particular an acoustic warning signal, on the basis of the analyzed image data record if the medical instrument is outside the desired range or threatens to leave it, and / or its medical Instrument (K) has a vibration generator and the medical workstation is set up to activate the vibration generator based on the analyzed image data set when the medical instrument is outside the desired area or threatens to leave.
  9. Medical workstation according to one of claims 1 to 8, comprising a further robot (R ') having a, a plurality of successively arranged members having robotic arm (M') and with a for moving the further robot arm (M ') provided further control device (S' ), wherein one of the robot arms (M) is provided to move the imaging device (2), and the other robot arm (M ') is provided to move the medical instrument (2).
  10. The medical workstation according to claim 9, wherein the two control devices (S, S ') are coupled to each other such that upon movement of one of the robot arms (M) the other robot arm (M') automatically performs a movement such that the medical instrument ( K) follows a movement of the imaging device (2) or the imaging device (2) a movement of the medical instrument (K).
  11. Medical workstation according to claim 9 or 10, in which both robot arms (M, M ') automatically perform their movements.
  12. A medical workstation according to claim 9 or 10, wherein one of the robot arms (M, M ') is manually movable, and the other robot arm (M'), controlled by its control device (S '), automatically follows the movement of the manually moved robot arm (M ) follows.
  13. The medical workstation according to claim 12, wherein the manually movable robot arm (M ') is provided with the medical instrument (K) and the medical work station is configured to move the robotic arm (M') provided with the medical instrument (K) prevent or at least aggravate if the medical device (K) is outside the desired range or threatens to leave it.
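The coupled-arm behaviour in the claims above can be illustrated with a minimal sketch: one arm's target pose is derived from the other's so that the imaging device follows the instrument (claims 1 and 10), and motion is gated on whether the instrument remains inside the desired area identified in the analysed image data (claims 4, 7 and 13). This is not the patented implementation; all names (`Pose`, `follow_target`, `motion_allowed`) are hypothetical, and poses are simplified to 2-D positions rather than full 6-DOF robot frames.

```python
# Illustrative sketch only -- hypothetical names, 2-D positions instead of
# full robot poses, and a rectangular "desired area" standing in for the
# regions derived from the analysed ultrasound image data set.
from dataclasses import dataclass


@dataclass
class Pose:
    x: float
    y: float


def follow_target(instrument: Pose, offset: Pose) -> Pose:
    """Target pose for the imaging-device arm: maintain a fixed offset to
    the instrument so the ultrasound head follows the cannula."""
    return Pose(instrument.x + offset.x, instrument.y + offset.y)


def motion_allowed(instrument: Pose, desired_area) -> bool:
    """Gate arm motion on the analysed image data: permit movement only
    while the instrument stays inside the desired area."""
    xmin, ymin, xmax, ymax = desired_area
    return xmin <= instrument.x <= xmax and ymin <= instrument.y <= ymax


# Example: the instrument arm moves; the imaging arm follows at a fixed
# offset, and motion is blocked once the instrument leaves the desired area.
desired = (0.0, 0.0, 10.0, 10.0)
offset = Pose(1.0, 0.0)
inside = Pose(4.0, 5.0)
outside = Pose(12.0, 5.0)
print(follow_target(inside, offset))
print(motion_allowed(inside, desired))   # True
print(motion_allowed(outside, desired))  # False
```

A real workstation would compute the follower pose from navigation-system measurements (claim 3) and trigger warning or vibration signals (claim 8) from the same gating check, rather than simply blocking motion.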
EP12159485.7A 2011-03-22 2012-03-14 Medical workstation Active EP2502558B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
DE102011005917A DE102011005917A1 (en) 2011-03-22 2011-03-22 Medical workplace

Publications (2)

Publication Number Publication Date
EP2502558A1 true EP2502558A1 (en) 2012-09-26
EP2502558B1 EP2502558B1 (en) 2018-09-26

Family

ID=45887960

Family Applications (1)

Application Number Title Priority Date Filing Date
EP12159485.7A Active EP2502558B1 (en) 2011-03-22 2012-03-14 Medical workstation

Country Status (3)

Country Link
US (1) US9872651B2 (en)
EP (1) EP2502558B1 (en)
DE (1) DE102011005917A1 (en)

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8390291B2 (en) * 2008-05-19 2013-03-05 The Board Of Regents, The University Of Texas System Apparatus and method for tracking movement of a target
US10292887B2 (en) * 2012-12-31 2019-05-21 Mako Surgical Corp. Motorized joint positioner
CA2896381C (en) 2013-03-15 2017-01-10 Synaptive Medical (Barbados) Inc. Intelligent positioning system and methods therefore
CA2929702A1 (en) 2013-03-15 2014-09-18 Synaptive Medical (Barbados) Inc. Systems and methods for navigation and simulation of minimally invasive therapy
JP2015131375A (en) * 2014-01-15 2015-07-23 セイコーエプソン株式会社 Robot, robot system, robot control device, and robot control method
EP3136973B1 (en) * 2014-04-28 2019-08-14 Mazor Robotics Ltd. Ultrasound guided hand held robot
US20150335316A1 (en) * 2014-05-23 2015-11-26 General Electric Company Mri system for robotically assisted breast biopsy
CA2968879A1 (en) * 2014-11-25 2016-06-02 Synaptive Medical (Barbados) Inc. Hand guided automated positioning device controller
DE102015210218A1 (en) * 2015-06-02 2016-12-08 Kuka Roboter Gmbh Method for operating a robot, associated robot with a vibration device and robot workstation
CN105943161A (en) * 2016-06-04 2016-09-21 深圳市前海康启源科技有限公司 Surgical navigation system and method based on medical robot
US20190307519A1 (en) * 2016-12-07 2019-10-10 Koninklijke Philips N.V. Automatic motion control of a dependent surgical robotic arm
WO2019050821A1 (en) * 2017-09-05 2019-03-14 Covidien Lp Camera control for surgical robotic systems

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020120188A1 (en) * 2000-12-21 2002-08-29 Brock David L. Medical mapping system
US6626832B1 (en) * 1999-04-15 2003-09-30 Ultraguide Ltd. Apparatus and method for detecting the bending of medical invasive tools in medical interventions
US20050033161A1 (en) * 2000-12-21 2005-02-10 Rainer Birkenbach Cable-free medical detection and treatment system
WO2005039391A2 (en) * 2003-10-21 2005-05-06 The Board Of Trustees Of The Leland Stanford Junior University Systems and methods for intraoperative targetting
US20050107808A1 (en) * 1998-11-20 2005-05-19 Intuitive Surgical, Inc. Performing cardiac surgery without cardioplegia
WO2009045885A2 (en) * 2007-10-02 2009-04-09 Board Of Regents, The University Of Texas System Real-time ultrasound monitoring of heat-induced tissue interactions
WO2009045827A2 (en) * 2007-09-30 2009-04-09 Intuitive Surgical, Inc. Methods and systems for tool locating and tool tracking robotic instruments in robotic surgical systems
WO2009065827A1 (en) 2007-11-19 2009-05-28 Kuka Roboter Gmbh Device comprising a robot, medical work station, and method for registering an object

Family Cites Families (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4979949A (en) * 1988-04-26 1990-12-25 The Board Of Regents Of The University Of Washington Robot-aided system for surgery
US6459926B1 (en) * 1998-11-20 2002-10-01 Intuitive Surgical, Inc. Repositioning and reorientation of master/slave relationship in minimally invasive telesurgery
EP1355765B1 (en) * 2001-01-29 2008-05-07 The Acrobot Company Limited Active-constraint robots
TW200304608A (en) * 2002-03-06 2003-10-01 Z Kat Inc System and method for using a haptic device in combination with a computer-assisted surgery system
US7819859B2 (en) * 2005-12-20 2010-10-26 Intuitive Surgical Operations, Inc. Control system for reducing internally generated frictional and inertial resistance to manual positioning of a surgical manipulator
US8262591B2 (en) * 2006-09-07 2012-09-11 Nivasonix, Llc External ultrasound lipoplasty
WO2008058520A2 (en) * 2006-11-13 2008-05-22 Eberhard-Karls-Universität Universitätsklinikum Tübingen Apparatus for supplying images to an operator
DE102006061178A1 (en) * 2006-12-22 2008-06-26 Siemens Ag Medical system for carrying out and monitoring a minimal invasive intrusion, especially for treating electro-physiological diseases, has X-ray equipment and a control/evaluation unit
FR2917598B1 (en) * 2007-06-19 2010-04-02 Medtech Multi-applicative robotic platform for neurosurgery and method of recaling
DE102007045075B4 (en) * 2007-09-21 2010-05-12 Siemens Ag Interventional medical diagnosis and / or therapy system
US8108072B2 (en) * 2007-09-30 2012-01-31 Intuitive Surgical Operations, Inc. Methods and systems for robotic instrument tool tracking with adaptive fusion of kinematics information and image information
DE102008016414B4 (en) * 2008-03-31 2018-01-04 Kuka Roboter Gmbh X-ray device and medical workstation
DE102008022924A1 (en) * 2008-05-09 2009-11-12 Siemens Aktiengesellschaft Device for medical intervention, has medical instrument which is inserted in moving body area of patient, and robot with multiple free moving space grades
DE102008041260A1 (en) * 2008-08-14 2010-02-25 Deutsches Zentrum für Luft- und Raumfahrt e.V. Method for operating a medical robot, medical robot and medical workstation

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2016096366A1 (en) * 2014-12-17 2016-06-23 Kuka Roboter Gmbh System for robot-assisted medical treatment
CN106999250A (en) * 2014-12-17 2017-08-01 库卡罗伯特有限公司 System for the medical treatment of robot assisted
EP3229161A3 (en) * 2016-09-21 2018-03-21 Siemens Healthcare GmbH System with a mobile control apparatus and method for issuing a control signal to a component of a medical imaging device

Also Published As

Publication number Publication date
US9872651B2 (en) 2018-01-23
EP2502558B1 (en) 2018-09-26
US20120265071A1 (en) 2012-10-18
DE102011005917A1 (en) 2012-09-27

Similar Documents

Publication Publication Date Title
US8828023B2 (en) Medical workstation
JP2007534351A (en) Guidance system and method for surgical procedures with improved feedback
JP2011502686A (en) Method and apparatus for position tracking of therapeutic ultrasound transducers
US6185445B1 (en) MR tomograph comprising a positioning system for the exact determination of the position of a manually guided manipulator
US9913605B2 (en) Systems and methods for autonomous intravenous needle insertion
US10383765B2 (en) Apparatus and method for a global coordinate system for use in robotic surgery
US6314312B1 (en) Method and system for determining movement of an organ or therapy region of a patient
KR20120101040A (en) Visual tracking and annotation of clinically important anatomical landmarks for surgical interventions
KR20150120944A (en) Collision avoidance during controlled movement of image capturing device and manipulatable device movable arms
EP1080695A1 (en) Medical treatment apparatus and method for supporting or controlling medical treatment
JP4220780B2 (en) Surgery system
US20190167362A1 (en) Surgical robot platform
JP6529962B2 (en) System and method for integrating a robotic medical system with external imaging
US7076286B2 (en) Surgical microscope
JP4836122B2 (en) Surgery support apparatus, method and program
US7590442B2 (en) Method for determining the position of an instrument with an x-ray system
KR20160133469A (en) Signal connector for sterile barrier between surgical instrument and teleoperated actuator
EP1284673A1 (en) Fully-automatic, robot-assisted camera guidance using position sensors for laparoscopic interventions
US20190209043A1 (en) Systems and methods for registration of multiple vision systems
EP2938284B1 (en) Methods for interventional procedure planning
DE102005044033A1 (en) Positioning system for percutaneous interventions
JP6620191B2 (en) Shape sensor system for tracking interventional instruments and method of using the system
JP4152402B2 (en) Surgery support device
JP6251304B2 (en) Motion compensated surgical instrument system and surgical instrument
US8535230B2 (en) Ultrasound device

Legal Events

Date Code Title Description
AX Request for extension of the european patent to:

Extension state: BA ME

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

17P Request for examination filed

Effective date: 20121009

17Q First examination report despatched

Effective date: 20131104

RAP1 Rights of an application transferred

Owner name: KUKA ROBOTER GMBH

RIC1 Information provided on ipc code assigned before grant

Ipc: A61B 5/06 20060101AFI20180220BHEP

Ipc: A61B 34/30 20160101ALI20180220BHEP

INTG Intention to grant announced

Effective date: 20180420

RAP1 Rights of an application transferred

Owner name: KUKA DEUTSCHLAND GMBH

AK Designated contracting states

Kind code of ref document: B1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

RIN1 Information on inventor provided before grant (corrected)

Inventor name: BERKE, RALPH

REG Reference to a national code

Ref country code: GB

Ref legal event code: FG4D

Free format text: NOT ENGLISH

REG Reference to a national code

Ref country code: CH

Ref legal event code: EP

REG Reference to a national code

Ref country code: AT

Ref legal event code: REF

Ref document number: 1045054

Country of ref document: AT

Kind code of ref document: T

Effective date: 20181015

REG Reference to a national code

Ref country code: IE

Ref legal event code: FG4D

Free format text: LANGUAGE OF EP DOCUMENT: GERMAN

REG Reference to a national code

Ref country code: DE

Ref legal event code: R096

Ref document number: 502012013492

Country of ref document: DE

REG Reference to a national code

Ref country code: NL

Ref legal event code: MP

Effective date: 20180926

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: LT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20180926

Ref country code: RS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20180926

Ref country code: NO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20181226

Ref country code: SE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20180926

Ref country code: GR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20181227

Ref country code: FI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20180926

Ref country code: BG

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20181226

REG Reference to a national code

Ref country code: LT

Ref legal event code: MG4D

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: HR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20180926

Ref country code: AL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20180926

Ref country code: LV

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20180926

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: CZ

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20180926

Ref country code: IT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20180926

Ref country code: RO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20180926

Ref country code: ES

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20180926

Ref country code: NL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20180926

Ref country code: PL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20180926

Ref country code: EE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20180926

Ref country code: IS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190126

PGFP Annual fee paid to national office [announced from national office to epo]

Ref country code: GB

Payment date: 20190313

Year of fee payment: 8

Ref country code: DE

Payment date: 20190226

Year of fee payment: 8

Ref country code: CH

Payment date: 20190314

Year of fee payment: 8

PGFP Annual fee paid to national office [announced from national office to epo]

Ref country code: FR

Payment date: 20190213

Year of fee payment: 8

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: PT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190126

Ref country code: SK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20180926

Ref country code: SM

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20180926

REG Reference to a national code

Ref country code: DE

Ref legal event code: R097

Ref document number: 502012013492

Country of ref document: DE

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: DK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20180926

26N No opposition filed

Effective date: 20190627

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20180926

Ref country code: MC

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20180926

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: LU

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20190314

REG Reference to a national code

Ref country code: BE

Ref legal event code: MM

Effective date: 20190331